[
  {
    "path": ".github/ISSUE_TEMPLATE/bug.md",
    "content": "---\nname: Bug\nabout: Bug report\ntitle: ''\nlabels: status/triage, type/bug\nassignees: ''\n\n---\n### Summary\n<!--- Please provide a general summary of the issue. -->\n\n\n---\n\n### Reproduction\n\n##### Steps\n<!--- What steps should be taken to reproduce the issue? -->\n\n1.\n2.\n3.\n\n\n##### Current behavior\n<!--- What actually happened? -->\n\n\n##### Expected behavior\n<!--- What did you expect to happen? -->\n\n\n---\n\n### Environment\n\n##### pack info\n<!--- Run `pack report` and copy output here. -->\n\n##### docker info\n<!--- Run `docker info` and copy output here. -->\n"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/chore.md",
    "content": "---\nname: Chore\nabout: Suggest a chore that will help contributors and doesn't affect end users.\ntitle: ''\nlabels: type/chore, status/triage\nassignees: ''\n\n---\n\n### Description\n<!-- A concise description of why this chore matters, who will enjoy it and how. -->\n\n### Proposed solution\n<!-- A clear and concise description of how do you think the chore should be implemented. -->\n\n### Additional context\n<!-- Add any other context or screenshots about the chore that may help. -->\n"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/config.yml",
    "content": "contact_links:\n  - name: Questions\n    url: https://github.com/buildpacks/community/discussions\n    about: Have a general question?\n"
  },
  {
    "path": ".github/ISSUE_TEMPLATE/feature.md",
    "content": "---\nname: Feature request\nabout: Suggest a new feature or an improvement to existing functionality\ntitle: ''\nlabels: type/enhancement, status/triage\nassignees: ''\n\n---\n\n### Description\n<!-- A concise description of what problem the feature solves and why solving it matters.\nEx. My shoelaces won't stay tied and I keep tripping... -->\n\n### Proposed solution\n<!-- A clear and concise description of what you want to happen.\nEx. We could have velcro on the shoes instead of laces...-->\n\n### Describe alternatives you've considered\n<!-- A clear and concise description of any alternative solutions or features you've considered. -->\n\n### Additional context\n- [ ] This feature should be documented somewhere\n\n<!-- Add any other context or screenshots about the feature request here. -->\n"
  },
  {
    "path": ".github/dependabot.yml",
    "content": "version: 2\nupdates:\n  # Set update schedule for gomod\n  - package-ecosystem: \"gomod\"\n    directory: \"/\"\n    schedule:\n      interval: \"weekly\"\n    groups:\n      # Group all minor/patch go dependencies into a single PR.\n      go-dependencies:\n        update-types:\n          - \"minor\"\n          - \"patch\"\n    labels:\n      - \"dependencies\"\n      - \"go\"\n      - \"type/chore\"\n\n  # Set update schedule for GitHub Actions\n  - package-ecosystem: \"github-actions\"\n    directory: \"/\"\n    schedule:\n      interval: \"weekly\"\n    labels:\n      - \"dependencies\"\n      - \"github_actions\"\n      - \"type/chore\"\n"
  },
  {
    "path": ".github/labeler.yml",
    "content": "# Rules defined here: https://github.com/actions/labeler\ntype/chore:\n  - '*.md'\n  - '**/*.yml'\n  - 'acceptance/**/*'\n  - '.github/**/*'\n  - 'go.mod'\n  - 'go.sum'\n\ntype/enhancement:\n  - '*.go'\n  - '**/*.go'\n"
  },
  {
    "path": ".github/pull_request_template.md",
    "content": "## Summary\n<!-- Provide a high-level summary of the change. -->\n\n## Output\n<!-- If applicable, please provide examples of the output changes. -->\n\n#### Before\n\n#### After\n\n## Documentation\n<!-- If this change should be documented, please create an issue or PR on https://github.com/buildpacks/docs and link below. -->\n<!-- NOTE: This can be added (by editing the issue) after the PR is opened. -->\n\n- Should this change be documented?\n    - [ ] Yes, see #___\n    - [ ] No\n\n## Related\n<!-- If this PR addresses an issue, please provide issue number below. -->\n\nResolves #___\n"
  },
  {
    "path": ".github/release-notes.yml",
    "content": "labels:\n  breaking-change:\n    title: Breaking Changes\n    description: Changes that may require a little bit of thought before upgrading.\n    weight: 8\n  experimental:\n    title: Experimental\n    description: |\n      _Experimental features that may change in the future. Use them at your discretion._\n      \n      _To enable these features, run `pack config experimental true`, or add `experimental = true` to your `~/.pack/config.toml`._\n    weight: 9\n  type/enhancement:\n    title: Features\n    weight: 1\n  type/bug:\n    title: Bugs\n    weight: 2\n\nsections:\n  contributors:\n    title: Contributors\n    description: |\n      We'd like to acknowledge that this release wouldn't be as good without the help of the following amazing contributors:"
  },
  {
    "path": ".github/workflows/actions/release-notes/.gitignore",
    "content": "node_modules/\nchangelog.md"
  },
  {
    "path": ".github/workflows/actions/release-notes/README.md",
    "content": "## Changelog\n\nA simple script that generates the changelog for pack based on a pack version (aka milestone).\n\n### Usage\n\n#### Config\n\nThis script takes a configuration file in the following format:\n\n```yaml\nlabels:\n  # labels are grouped based on order but displayed based on weight\n  <label>:\n    # title for the group of issues\n    title: <string>\n    # description for the group of issues\n    description: <string>\n    # description for the group of issues\n    weight: <number>\n\nsections:\n  contributors:\n    # title for the contributors section, hidden if empty\n    title: <string>\n    # description for the contributors section\n    description: <string>\n```\n\n#### Github Action\n\n```yaml\n- name: Generate changelog\n  uses: ./.github/workflows/actions/release-notes\n  id: changelog\n  with:\n    github-token: ${{ secrets.GITHUB_TOKEN }}\n    milestone: <milestone>\n```\n\n#### Local\n\nTo run/test locally:\n\n```shell script\n# install deps\nnpm install\n\n# set required info\nexport GITHUB_TOKEN=\"<GITHUB_PAT_TOKEN>\"\n\n# run locally\nnpm run local -- <milestone> <config-path>\n```\n\nNotice that a file `changelog.md` is created as well for further inspection.\n\n### Updating\n\nThis action is packaged for distribution without vendoring `npm_modules` with use of [ncc](https://github.com/vercel/ncc).\n\nWhen making changes to the action, compile it and commit the changes.\n\n```shell script\nnpm run-script build\n```"
  },
  {
    "path": ".github/workflows/actions/release-notes/action.js",
    "content": "/*\n * This file is the main entrypoint for GitHub Actions (see action.yml)\n */\n\nconst core = require('@actions/core');\nconst github = require('@actions/github');\nconst releaseNotes = require('./release-notes.js');\n\ntry {\n  const defaultConfigFile = \"./.github/release-notes.yml\";\n\n  releaseNotes(\n    github.getOctokit(core.getInput(\"github-token\", {required: true})),\n    `${github.context.repo.owner}/${github.context.repo.repo}`,\n    core.getInput('milestone', {required: true}),\n    core.getInput('configFile') || defaultConfigFile,\n  )\n    .then(contents => {\n      console.log(\"GENERATED CHANGELOG\\n=========================\\n\", contents);\n      core.setOutput(\"contents\", contents)\n    })\n    .catch(error => core.setFailed(error.message))\n} catch (error) {\n  core.setFailed(error.message);\n}"
  },
  {
    "path": ".github/workflows/actions/release-notes/action.yml",
    "content": "name: 'Release Notes'\ndescription: 'Generate release notes based on pull requests in a milestone.'\ninputs:\n  github-token:\n    description: GitHub token used to search for pull requests.\n    required: true\n  milestone:\n    description: The milestone used to look for pull requests.\n    required: true\n  config-file:\n    description: Path to the configuration (yaml) file.\n    required: false\n    default: \"./.github/release-notes.yml\"\noutputs:\n  contents:\n    description: The contents of the release notes.\nruns:\n  using: 'node16'\n  main: 'dist/index.js'"
  },
  {
    "path": ".github/workflows/actions/release-notes/dist/index.js",
    "content": "module.exports =\n/******/ (() => { // webpackBootstrap\n/******/ \tvar __webpack_modules__ = ({\n\n/***/ 4582:\n/***/ ((__unused_webpack_module, __unused_webpack_exports, __webpack_require__) => {\n\n/*\n * This file is the main entrypoint for GitHub Actions (see action.yml)\n */\n\nconst core = __webpack_require__(2186);\nconst github = __webpack_require__(5438);\nconst releaseNotes = __webpack_require__(8571);\n\ntry {\n  const defaultConfigFile = \"./.github/release-notes.yml\";\n\n  releaseNotes(\n    github.getOctokit(core.getInput(\"github-token\", {required: true})),\n    `${github.context.repo.owner}/${github.context.repo.repo}`,\n    core.getInput('milestone', {required: true}),\n    core.getInput('configFile') || defaultConfigFile,\n  )\n    .then(contents => {\n      console.log(\"GENERATED CHANGELOG\\n=========================\\n\", contents);\n      core.setOutput(\"contents\", contents)\n    })\n    .catch(error => core.setFailed(error.message))\n} catch (error) {\n  core.setFailed(error.message);\n}\n\n/***/ }),\n\n/***/ 7351:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\nvar __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\n}) : (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    o[k2] = m[k];\n}));\nvar __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? 
(function(o, v) {\n    Object.defineProperty(o, \"default\", { enumerable: true, value: v });\n}) : function(o, v) {\n    o[\"default\"] = v;\n});\nvar __importStar = (this && this.__importStar) || function (mod) {\n    if (mod && mod.__esModule) return mod;\n    var result = {};\n    if (mod != null) for (var k in mod) if (k !== \"default\" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\n    __setModuleDefault(result, mod);\n    return result;\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.issue = exports.issueCommand = void 0;\nconst os = __importStar(__webpack_require__(2087));\nconst utils_1 = __webpack_require__(5278);\n/**\n * Commands\n *\n * Command Format:\n *   ::name key=value,key=value::message\n *\n * Examples:\n *   ::warning::This is the message\n *   ::set-env name=MY_VAR::some value\n */\nfunction issueCommand(command, properties, message) {\n    const cmd = new Command(command, properties, message);\n    process.stdout.write(cmd.toString() + os.EOL);\n}\nexports.issueCommand = issueCommand;\nfunction issue(name, message = '') {\n    issueCommand(name, {}, message);\n}\nexports.issue = issue;\nconst CMD_STRING = '::';\nclass Command {\n    constructor(command, properties, message) {\n        if (!command) {\n            command = 'missing.command';\n        }\n        this.command = command;\n        this.properties = properties;\n        this.message = message;\n    }\n    toString() {\n        let cmdStr = CMD_STRING + this.command;\n        if (this.properties && Object.keys(this.properties).length > 0) {\n            cmdStr += ' ';\n            let first = true;\n            for (const key in this.properties) {\n                if (this.properties.hasOwnProperty(key)) {\n                    const val = this.properties[key];\n                    if (val) {\n                        if (first) {\n                            first = false;\n                        }\n                        
else {\n                            cmdStr += ',';\n                        }\n                        cmdStr += `${key}=${escapeProperty(val)}`;\n                    }\n                }\n            }\n        }\n        cmdStr += `${CMD_STRING}${escapeData(this.message)}`;\n        return cmdStr;\n    }\n}\nfunction escapeData(s) {\n    return utils_1.toCommandValue(s)\n        .replace(/%/g, '%25')\n        .replace(/\\r/g, '%0D')\n        .replace(/\\n/g, '%0A');\n}\nfunction escapeProperty(s) {\n    return utils_1.toCommandValue(s)\n        .replace(/%/g, '%25')\n        .replace(/\\r/g, '%0D')\n        .replace(/\\n/g, '%0A')\n        .replace(/:/g, '%3A')\n        .replace(/,/g, '%2C');\n}\n//# sourceMappingURL=command.js.map\n\n/***/ }),\n\n/***/ 2186:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\nvar __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\n}) : (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    o[k2] = m[k];\n}));\nvar __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {\n    Object.defineProperty(o, \"default\", { enumerable: true, value: v });\n}) : function(o, v) {\n    o[\"default\"] = v;\n});\nvar __importStar = (this && this.__importStar) || function (mod) {\n    if (mod && mod.__esModule) return mod;\n    var result = {};\n    if (mod != null) for (var k in mod) if (k !== \"default\" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\n    __setModuleDefault(result, mod);\n    return result;\n};\nvar __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {\n    function adopt(value) { return value instanceof P ? 
value : new P(function (resolve) { resolve(value); }); }\n    return new (P || (P = Promise))(function (resolve, reject) {\n        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\n        function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\n        function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\n        step((generator = generator.apply(thisArg, _arguments || [])).next());\n    });\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.getIDToken = exports.getState = exports.saveState = exports.group = exports.endGroup = exports.startGroup = exports.info = exports.notice = exports.warning = exports.error = exports.debug = exports.isDebug = exports.setFailed = exports.setCommandEcho = exports.setOutput = exports.getBooleanInput = exports.getMultilineInput = exports.getInput = exports.addPath = exports.setSecret = exports.exportVariable = exports.ExitCode = void 0;\nconst command_1 = __webpack_require__(7351);\nconst file_command_1 = __webpack_require__(717);\nconst utils_1 = __webpack_require__(5278);\nconst os = __importStar(__webpack_require__(2087));\nconst path = __importStar(__webpack_require__(5622));\nconst oidc_utils_1 = __webpack_require__(8041);\n/**\n * The code to exit an action\n */\nvar ExitCode;\n(function (ExitCode) {\n    /**\n     * A code indicating that the action was successful\n     */\n    ExitCode[ExitCode[\"Success\"] = 0] = \"Success\";\n    /**\n     * A code indicating that the action was a failure\n     */\n    ExitCode[ExitCode[\"Failure\"] = 1] = \"Failure\";\n})(ExitCode = exports.ExitCode || (exports.ExitCode = {}));\n//-----------------------------------------------------------------------\n// Variables\n//-----------------------------------------------------------------------\n/**\n * Sets env variable for this action and future actions in the job\n * 
@param name the name of the variable to set\n * @param val the value of the variable. Non-string values will be converted to a string via JSON.stringify\n */\n// eslint-disable-next-line @typescript-eslint/no-explicit-any\nfunction exportVariable(name, val) {\n    const convertedVal = utils_1.toCommandValue(val);\n    process.env[name] = convertedVal;\n    const filePath = process.env['GITHUB_ENV'] || '';\n    if (filePath) {\n        return file_command_1.issueFileCommand('ENV', file_command_1.prepareKeyValueMessage(name, val));\n    }\n    command_1.issueCommand('set-env', { name }, convertedVal);\n}\nexports.exportVariable = exportVariable;\n/**\n * Registers a secret which will get masked from logs\n * @param secret value of the secret\n */\nfunction setSecret(secret) {\n    command_1.issueCommand('add-mask', {}, secret);\n}\nexports.setSecret = setSecret;\n/**\n * Prepends inputPath to the PATH (for this action and future actions)\n * @param inputPath\n */\nfunction addPath(inputPath) {\n    const filePath = process.env['GITHUB_PATH'] || '';\n    if (filePath) {\n        file_command_1.issueFileCommand('PATH', inputPath);\n    }\n    else {\n        command_1.issueCommand('add-path', {}, inputPath);\n    }\n    process.env['PATH'] = `${inputPath}${path.delimiter}${process.env['PATH']}`;\n}\nexports.addPath = addPath;\n/**\n * Gets the value of an input.\n * Unless trimWhitespace is set to false in InputOptions, the value is also trimmed.\n * Returns an empty string if the value is not defined.\n *\n * @param     name     name of the input to get\n * @param     options  optional. 
See InputOptions.\n * @returns   string\n */\nfunction getInput(name, options) {\n    const val = process.env[`INPUT_${name.replace(/ /g, '_').toUpperCase()}`] || '';\n    if (options && options.required && !val) {\n        throw new Error(`Input required and not supplied: ${name}`);\n    }\n    if (options && options.trimWhitespace === false) {\n        return val;\n    }\n    return val.trim();\n}\nexports.getInput = getInput;\n/**\n * Gets the values of an multiline input.  Each value is also trimmed.\n *\n * @param     name     name of the input to get\n * @param     options  optional. See InputOptions.\n * @returns   string[]\n *\n */\nfunction getMultilineInput(name, options) {\n    const inputs = getInput(name, options)\n        .split('\\n')\n        .filter(x => x !== '');\n    if (options && options.trimWhitespace === false) {\n        return inputs;\n    }\n    return inputs.map(input => input.trim());\n}\nexports.getMultilineInput = getMultilineInput;\n/**\n * Gets the input value of the boolean type in the YAML 1.2 \"core schema\" specification.\n * Support boolean input list: `true | True | TRUE | false | False | FALSE` .\n * The return value is also in boolean type.\n * ref: https://yaml.org/spec/1.2/spec.html#id2804923\n *\n * @param     name     name of the input to get\n * @param     options  optional. 
See InputOptions.\n * @returns   boolean\n */\nfunction getBooleanInput(name, options) {\n    const trueValue = ['true', 'True', 'TRUE'];\n    const falseValue = ['false', 'False', 'FALSE'];\n    const val = getInput(name, options);\n    if (trueValue.includes(val))\n        return true;\n    if (falseValue.includes(val))\n        return false;\n    throw new TypeError(`Input does not meet YAML 1.2 \"Core Schema\" specification: ${name}\\n` +\n        `Support boolean input list: \\`true | True | TRUE | false | False | FALSE\\``);\n}\nexports.getBooleanInput = getBooleanInput;\n/**\n * Sets the value of an output.\n *\n * @param     name     name of the output to set\n * @param     value    value to store. Non-string values will be converted to a string via JSON.stringify\n */\n// eslint-disable-next-line @typescript-eslint/no-explicit-any\nfunction setOutput(name, value) {\n    const filePath = process.env['GITHUB_OUTPUT'] || '';\n    if (filePath) {\n        return file_command_1.issueFileCommand('OUTPUT', file_command_1.prepareKeyValueMessage(name, value));\n    }\n    process.stdout.write(os.EOL);\n    command_1.issueCommand('set-output', { name }, utils_1.toCommandValue(value));\n}\nexports.setOutput = setOutput;\n/**\n * Enables or disables the echoing of commands into stdout for the rest of the step.\n * Echoing is disabled by default if ACTIONS_STEP_DEBUG is not set.\n *\n */\nfunction setCommandEcho(enabled) {\n    command_1.issue('echo', enabled ? 
'on' : 'off');\n}\nexports.setCommandEcho = setCommandEcho;\n//-----------------------------------------------------------------------\n// Results\n//-----------------------------------------------------------------------\n/**\n * Sets the action status to failed.\n * When the action exits it will be with an exit code of 1\n * @param message add error issue message\n */\nfunction setFailed(message) {\n    process.exitCode = ExitCode.Failure;\n    error(message);\n}\nexports.setFailed = setFailed;\n//-----------------------------------------------------------------------\n// Logging Commands\n//-----------------------------------------------------------------------\n/**\n * Gets whether Actions Step Debug is on or not\n */\nfunction isDebug() {\n    return process.env['RUNNER_DEBUG'] === '1';\n}\nexports.isDebug = isDebug;\n/**\n * Writes debug message to user log\n * @param message debug message\n */\nfunction debug(message) {\n    command_1.issueCommand('debug', {}, message);\n}\nexports.debug = debug;\n/**\n * Adds an error issue\n * @param message error issue message. Errors will be converted to string via toString()\n * @param properties optional properties to add to the annotation.\n */\nfunction error(message, properties = {}) {\n    command_1.issueCommand('error', utils_1.toCommandProperties(properties), message instanceof Error ? message.toString() : message);\n}\nexports.error = error;\n/**\n * Adds a warning issue\n * @param message warning issue message. Errors will be converted to string via toString()\n * @param properties optional properties to add to the annotation.\n */\nfunction warning(message, properties = {}) {\n    command_1.issueCommand('warning', utils_1.toCommandProperties(properties), message instanceof Error ? message.toString() : message);\n}\nexports.warning = warning;\n/**\n * Adds a notice issue\n * @param message notice issue message. 
Errors will be converted to string via toString()\n * @param properties optional properties to add to the annotation.\n */\nfunction notice(message, properties = {}) {\n    command_1.issueCommand('notice', utils_1.toCommandProperties(properties), message instanceof Error ? message.toString() : message);\n}\nexports.notice = notice;\n/**\n * Writes info to log with console.log.\n * @param message info message\n */\nfunction info(message) {\n    process.stdout.write(message + os.EOL);\n}\nexports.info = info;\n/**\n * Begin an output group.\n *\n * Output until the next `groupEnd` will be foldable in this group\n *\n * @param name The name of the output group\n */\nfunction startGroup(name) {\n    command_1.issue('group', name);\n}\nexports.startGroup = startGroup;\n/**\n * End an output group.\n */\nfunction endGroup() {\n    command_1.issue('endgroup');\n}\nexports.endGroup = endGroup;\n/**\n * Wrap an asynchronous function call in a group.\n *\n * Returns the same type as the function itself.\n *\n * @param name The name of the group\n * @param fn The function to wrap in the group\n */\nfunction group(name, fn) {\n    return __awaiter(this, void 0, void 0, function* () {\n        startGroup(name);\n        let result;\n        try {\n            result = yield fn();\n        }\n        finally {\n            endGroup();\n        }\n        return result;\n    });\n}\nexports.group = group;\n//-----------------------------------------------------------------------\n// Wrapper action state\n//-----------------------------------------------------------------------\n/**\n * Saves state for current action, the state can only be retrieved by this action's post job execution.\n *\n * @param     name     name of the state to store\n * @param     value    value to store. 
Non-string values will be converted to a string via JSON.stringify\n */\n// eslint-disable-next-line @typescript-eslint/no-explicit-any\nfunction saveState(name, value) {\n    const filePath = process.env['GITHUB_STATE'] || '';\n    if (filePath) {\n        return file_command_1.issueFileCommand('STATE', file_command_1.prepareKeyValueMessage(name, value));\n    }\n    command_1.issueCommand('save-state', { name }, utils_1.toCommandValue(value));\n}\nexports.saveState = saveState;\n/**\n * Gets the value of an state set by this action's main execution.\n *\n * @param     name     name of the state to get\n * @returns   string\n */\nfunction getState(name) {\n    return process.env[`STATE_${name}`] || '';\n}\nexports.getState = getState;\nfunction getIDToken(aud) {\n    return __awaiter(this, void 0, void 0, function* () {\n        return yield oidc_utils_1.OidcClient.getIDToken(aud);\n    });\n}\nexports.getIDToken = getIDToken;\n/**\n * Summary exports\n */\nvar summary_1 = __webpack_require__(1327);\nObject.defineProperty(exports, \"summary\", ({ enumerable: true, get: function () { return summary_1.summary; } }));\n/**\n * @deprecated use core.summary\n */\nvar summary_2 = __webpack_require__(1327);\nObject.defineProperty(exports, \"markdownSummary\", ({ enumerable: true, get: function () { return summary_2.markdownSummary; } }));\n/**\n * Path exports\n */\nvar path_utils_1 = __webpack_require__(2981);\nObject.defineProperty(exports, \"toPosixPath\", ({ enumerable: true, get: function () { return path_utils_1.toPosixPath; } }));\nObject.defineProperty(exports, \"toWin32Path\", ({ enumerable: true, get: function () { return path_utils_1.toWin32Path; } }));\nObject.defineProperty(exports, \"toPlatformPath\", ({ enumerable: true, get: function () { return path_utils_1.toPlatformPath; } }));\n//# sourceMappingURL=core.js.map\n\n/***/ }),\n\n/***/ 717:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\n// For internal use, 
subject to change.\nvar __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\n}) : (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    o[k2] = m[k];\n}));\nvar __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {\n    Object.defineProperty(o, \"default\", { enumerable: true, value: v });\n}) : function(o, v) {\n    o[\"default\"] = v;\n});\nvar __importStar = (this && this.__importStar) || function (mod) {\n    if (mod && mod.__esModule) return mod;\n    var result = {};\n    if (mod != null) for (var k in mod) if (k !== \"default\" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\n    __setModuleDefault(result, mod);\n    return result;\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.prepareKeyValueMessage = exports.issueFileCommand = void 0;\n// We use any as a valid input type\n/* eslint-disable @typescript-eslint/no-explicit-any */\nconst fs = __importStar(__webpack_require__(5747));\nconst os = __importStar(__webpack_require__(2087));\nconst uuid_1 = __webpack_require__(4552);\nconst utils_1 = __webpack_require__(5278);\nfunction issueFileCommand(command, message) {\n    const filePath = process.env[`GITHUB_${command}`];\n    if (!filePath) {\n        throw new Error(`Unable to find environment variable for file command ${command}`);\n    }\n    if (!fs.existsSync(filePath)) {\n        throw new Error(`Missing file at path: ${filePath}`);\n    }\n    fs.appendFileSync(filePath, `${utils_1.toCommandValue(message)}${os.EOL}`, {\n        encoding: 'utf8'\n    });\n}\nexports.issueFileCommand = issueFileCommand;\nfunction prepareKeyValueMessage(key, value) {\n    const delimiter = `ghadelimiter_${uuid_1.v4()}`;\n    const convertedValue = utils_1.toCommandValue(value);\n    // These should 
realistically never happen, but just in case someone finds a\n    // way to exploit uuid generation let's not allow keys or values that contain\n    // the delimiter.\n    if (key.includes(delimiter)) {\n        throw new Error(`Unexpected input: name should not contain the delimiter \"${delimiter}\"`);\n    }\n    if (convertedValue.includes(delimiter)) {\n        throw new Error(`Unexpected input: value should not contain the delimiter \"${delimiter}\"`);\n    }\n    return `${key}<<${delimiter}${os.EOL}${convertedValue}${os.EOL}${delimiter}`;\n}\nexports.prepareKeyValueMessage = prepareKeyValueMessage;\n//# sourceMappingURL=file-command.js.map\n\n/***/ }),\n\n/***/ 8041:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\nvar __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {\n    function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }\n    return new (P || (P = Promise))(function (resolve, reject) {\n        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\n        function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\n        function step(result) { result.done ? 
resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\n        step((generator = generator.apply(thisArg, _arguments || [])).next());\n    });\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.OidcClient = void 0;\nconst http_client_1 = __webpack_require__(1404);\nconst auth_1 = __webpack_require__(6758);\nconst core_1 = __webpack_require__(2186);\nclass OidcClient {\n    static createHttpClient(allowRetry = true, maxRetry = 10) {\n        const requestOptions = {\n            allowRetries: allowRetry,\n            maxRetries: maxRetry\n        };\n        return new http_client_1.HttpClient('actions/oidc-client', [new auth_1.BearerCredentialHandler(OidcClient.getRequestToken())], requestOptions);\n    }\n    static getRequestToken() {\n        const token = process.env['ACTIONS_ID_TOKEN_REQUEST_TOKEN'];\n        if (!token) {\n            throw new Error('Unable to get ACTIONS_ID_TOKEN_REQUEST_TOKEN env variable');\n        }\n        return token;\n    }\n    static getIDTokenUrl() {\n        const runtimeUrl = process.env['ACTIONS_ID_TOKEN_REQUEST_URL'];\n        if (!runtimeUrl) {\n            throw new Error('Unable to get ACTIONS_ID_TOKEN_REQUEST_URL env variable');\n        }\n        return runtimeUrl;\n    }\n    static getCall(id_token_url) {\n        var _a;\n        return __awaiter(this, void 0, void 0, function* () {\n            const httpclient = OidcClient.createHttpClient();\n            const res = yield httpclient\n                .getJson(id_token_url)\n                .catch(error => {\n                throw new Error(`Failed to get ID Token. \\n \n        Error Code : ${error.statusCode}\\n \n        Error Message: ${error.result.message}`);\n            });\n            const id_token = (_a = res.result) === null || _a === void 0 ? 
void 0 : _a.value;\n            if (!id_token) {\n                throw new Error('Response json body do not have ID Token field');\n            }\n            return id_token;\n        });\n    }\n    static getIDToken(audience) {\n        return __awaiter(this, void 0, void 0, function* () {\n            try {\n                // New ID Token is requested from action service\n                let id_token_url = OidcClient.getIDTokenUrl();\n                if (audience) {\n                    const encodedAudience = encodeURIComponent(audience);\n                    id_token_url = `${id_token_url}&audience=${encodedAudience}`;\n                }\n                core_1.debug(`ID token url is ${id_token_url}`);\n                const id_token = yield OidcClient.getCall(id_token_url);\n                core_1.setSecret(id_token);\n                return id_token;\n            }\n            catch (error) {\n                throw new Error(`Error message: ${error.message}`);\n            }\n        });\n    }\n}\nexports.OidcClient = OidcClient;\n//# sourceMappingURL=oidc-utils.js.map\n\n/***/ }),\n\n/***/ 2981:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\nvar __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\n}) : (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    o[k2] = m[k];\n}));\nvar __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? 
(function(o, v) {\n    Object.defineProperty(o, \"default\", { enumerable: true, value: v });\n}) : function(o, v) {\n    o[\"default\"] = v;\n});\nvar __importStar = (this && this.__importStar) || function (mod) {\n    if (mod && mod.__esModule) return mod;\n    var result = {};\n    if (mod != null) for (var k in mod) if (k !== \"default\" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\n    __setModuleDefault(result, mod);\n    return result;\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.toPlatformPath = exports.toWin32Path = exports.toPosixPath = void 0;\nconst path = __importStar(__webpack_require__(5622));\n/**\n * toPosixPath converts the given path to the posix form. On Windows, \\\\ will be\n * replaced with /.\n *\n * @param pth. Path to transform.\n * @return string Posix path.\n */\nfunction toPosixPath(pth) {\n    return pth.replace(/[\\\\]/g, '/');\n}\nexports.toPosixPath = toPosixPath;\n/**\n * toWin32Path converts the given path to the win32 form. On Linux, / will be\n * replaced with \\\\.\n *\n * @param pth. Path to transform.\n * @return string Win32 path.\n */\nfunction toWin32Path(pth) {\n    return pth.replace(/[/]/g, '\\\\');\n}\nexports.toWin32Path = toWin32Path;\n/**\n * toPlatformPath converts the given path to a platform-specific path. It does\n * this by replacing instances of / and \\ with the platform-specific path\n * separator.\n *\n * @param pth The path to platformize.\n * @return string The platform-specific path.\n */\nfunction toPlatformPath(pth) {\n    return pth.replace(/[/\\\\]/g, path.sep);\n}\nexports.toPlatformPath = toPlatformPath;\n//# sourceMappingURL=path-utils.js.map\n\n/***/ }),\n\n/***/ 1327:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\nvar __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {\n    function adopt(value) { return value instanceof P ? 
value : new P(function (resolve) { resolve(value); }); }\n    return new (P || (P = Promise))(function (resolve, reject) {\n        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\n        function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\n        function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\n        step((generator = generator.apply(thisArg, _arguments || [])).next());\n    });\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.summary = exports.markdownSummary = exports.SUMMARY_DOCS_URL = exports.SUMMARY_ENV_VAR = void 0;\nconst os_1 = __webpack_require__(2087);\nconst fs_1 = __webpack_require__(5747);\nconst { access, appendFile, writeFile } = fs_1.promises;\nexports.SUMMARY_ENV_VAR = 'GITHUB_STEP_SUMMARY';\nexports.SUMMARY_DOCS_URL = 'https://docs.github.com/actions/using-workflows/workflow-commands-for-github-actions#adding-a-job-summary';\nclass Summary {\n    constructor() {\n        this._buffer = '';\n    }\n    /**\n     * Finds the summary file path from the environment, rejects if env var is not found or file does not exist\n     * Also checks r/w permissions.\n     *\n     * @returns step summary file path\n     */\n    filePath() {\n        return __awaiter(this, void 0, void 0, function* () {\n            if (this._filePath) {\n                return this._filePath;\n            }\n            const pathFromEnv = process.env[exports.SUMMARY_ENV_VAR];\n            if (!pathFromEnv) {\n                throw new Error(`Unable to find environment variable for $${exports.SUMMARY_ENV_VAR}. 
Check if your runtime environment supports job summaries.`);\n            }\n            try {\n                yield access(pathFromEnv, fs_1.constants.R_OK | fs_1.constants.W_OK);\n            }\n            catch (_a) {\n                throw new Error(`Unable to access summary file: '${pathFromEnv}'. Check if the file has correct read/write permissions.`);\n            }\n            this._filePath = pathFromEnv;\n            return this._filePath;\n        });\n    }\n    /**\n     * Wraps content in an HTML tag, adding any HTML attributes\n     *\n     * @param {string} tag HTML tag to wrap\n     * @param {string | null} content content within the tag\n     * @param {[attribute: string]: string} attrs key-value list of HTML attributes to add\n     *\n     * @returns {string} content wrapped in HTML element\n     */\n    wrap(tag, content, attrs = {}) {\n        const htmlAttrs = Object.entries(attrs)\n            .map(([key, value]) => ` ${key}=\"${value}\"`)\n            .join('');\n        if (!content) {\n            return `<${tag}${htmlAttrs}>`;\n        }\n        return `<${tag}${htmlAttrs}>${content}</${tag}>`;\n    }\n    /**\n     * Writes text in the buffer to the summary buffer file and empties buffer. Will append by default.\n     *\n     * @param {SummaryWriteOptions} [options] (optional) options for write operation\n     *\n     * @returns {Promise<Summary>} summary instance\n     */\n    write(options) {\n        return __awaiter(this, void 0, void 0, function* () {\n            const overwrite = !!(options === null || options === void 0 ? void 0 : options.overwrite);\n            const filePath = yield this.filePath();\n            const writeFunc = overwrite ? 
writeFile : appendFile;\n            yield writeFunc(filePath, this._buffer, { encoding: 'utf8' });\n            return this.emptyBuffer();\n        });\n    }\n    /**\n     * Clears the summary buffer and wipes the summary file\n     *\n     * @returns {Summary} summary instance\n     */\n    clear() {\n        return __awaiter(this, void 0, void 0, function* () {\n            return this.emptyBuffer().write({ overwrite: true });\n        });\n    }\n    /**\n     * Returns the current summary buffer as a string\n     *\n     * @returns {string} string of summary buffer\n     */\n    stringify() {\n        return this._buffer;\n    }\n    /**\n     * If the summary buffer is empty\n     *\n     * @returns {boolen} true if the buffer is empty\n     */\n    isEmptyBuffer() {\n        return this._buffer.length === 0;\n    }\n    /**\n     * Resets the summary buffer without writing to summary file\n     *\n     * @returns {Summary} summary instance\n     */\n    emptyBuffer() {\n        this._buffer = '';\n        return this;\n    }\n    /**\n     * Adds raw text to the summary buffer\n     *\n     * @param {string} text content to add\n     * @param {boolean} [addEOL=false] (optional) append an EOL to the raw text (default: false)\n     *\n     * @returns {Summary} summary instance\n     */\n    addRaw(text, addEOL = false) {\n        this._buffer += text;\n        return addEOL ? 
this.addEOL() : this;\n    }\n    /**\n     * Adds the operating system-specific end-of-line marker to the buffer\n     *\n     * @returns {Summary} summary instance\n     */\n    addEOL() {\n        return this.addRaw(os_1.EOL);\n    }\n    /**\n     * Adds an HTML codeblock to the summary buffer\n     *\n     * @param {string} code content to render within fenced code block\n     * @param {string} lang (optional) language to syntax highlight code\n     *\n     * @returns {Summary} summary instance\n     */\n    addCodeBlock(code, lang) {\n        const attrs = Object.assign({}, (lang && { lang }));\n        const element = this.wrap('pre', this.wrap('code', code), attrs);\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds an HTML list to the summary buffer\n     *\n     * @param {string[]} items list of items to render\n     * @param {boolean} [ordered=false] (optional) if the rendered list should be ordered or not (default: false)\n     *\n     * @returns {Summary} summary instance\n     */\n    addList(items, ordered = false) {\n        const tag = ordered ? 'ol' : 'ul';\n        const listItems = items.map(item => this.wrap('li', item)).join('');\n        const element = this.wrap(tag, listItems);\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds an HTML table to the summary buffer\n     *\n     * @param {SummaryTableCell[]} rows table rows\n     *\n     * @returns {Summary} summary instance\n     */\n    addTable(rows) {\n        const tableBody = rows\n            .map(row => {\n            const cells = row\n                .map(cell => {\n                if (typeof cell === 'string') {\n                    return this.wrap('td', cell);\n                }\n                const { header, data, colspan, rowspan } = cell;\n                const tag = header ? 
'th' : 'td';\n                const attrs = Object.assign(Object.assign({}, (colspan && { colspan })), (rowspan && { rowspan }));\n                return this.wrap(tag, data, attrs);\n            })\n                .join('');\n            return this.wrap('tr', cells);\n        })\n            .join('');\n        const element = this.wrap('table', tableBody);\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds a collapsable HTML details element to the summary buffer\n     *\n     * @param {string} label text for the closed state\n     * @param {string} content collapsable content\n     *\n     * @returns {Summary} summary instance\n     */\n    addDetails(label, content) {\n        const element = this.wrap('details', this.wrap('summary', label) + content);\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds an HTML image tag to the summary buffer\n     *\n     * @param {string} src path to the image you to embed\n     * @param {string} alt text description of the image\n     * @param {SummaryImageOptions} options (optional) addition image attributes\n     *\n     * @returns {Summary} summary instance\n     */\n    addImage(src, alt, options) {\n        const { width, height } = options || {};\n        const attrs = Object.assign(Object.assign({}, (width && { width })), (height && { height }));\n        const element = this.wrap('img', null, Object.assign({ src, alt }, attrs));\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds an HTML section heading element\n     *\n     * @param {string} text heading text\n     * @param {number | string} [level=1] (optional) the heading level, default: 1\n     *\n     * @returns {Summary} summary instance\n     */\n    addHeading(text, level) {\n        const tag = `h${level}`;\n        const allowedTag = ['h1', 'h2', 'h3', 'h4', 'h5', 'h6'].includes(tag)\n            ? 
tag\n            : 'h1';\n        const element = this.wrap(allowedTag, text);\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds an HTML thematic break (<hr>) to the summary buffer\n     *\n     * @returns {Summary} summary instance\n     */\n    addSeparator() {\n        const element = this.wrap('hr', null);\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds an HTML line break (<br>) to the summary buffer\n     *\n     * @returns {Summary} summary instance\n     */\n    addBreak() {\n        const element = this.wrap('br', null);\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds an HTML blockquote to the summary buffer\n     *\n     * @param {string} text quote text\n     * @param {string} cite (optional) citation url\n     *\n     * @returns {Summary} summary instance\n     */\n    addQuote(text, cite) {\n        const attrs = Object.assign({}, (cite && { cite }));\n        const element = this.wrap('blockquote', text, attrs);\n        return this.addRaw(element).addEOL();\n    }\n    /**\n     * Adds an HTML anchor tag to the summary buffer\n     *\n     * @param {string} text link text/content\n     * @param {string} href hyperlink\n     *\n     * @returns {Summary} summary instance\n     */\n    addLink(text, href) {\n        const element = this.wrap('a', text, { href });\n        return this.addRaw(element).addEOL();\n    }\n}\nconst _summary = new Summary();\n/**\n * @deprecated use `core.summary`\n */\nexports.markdownSummary = _summary;\nexports.summary = _summary;\n//# sourceMappingURL=summary.js.map\n\n/***/ }),\n\n/***/ 5278:\n/***/ ((__unused_webpack_module, exports) => {\n\n\"use strict\";\n\n// We use any as a valid input type\n/* eslint-disable @typescript-eslint/no-explicit-any */\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.toCommandProperties = exports.toCommandValue = void 0;\n/**\n * Sanitizes an input into a string so it can be passed 
into issueCommand safely\n * @param input input to sanitize into a string\n */\nfunction toCommandValue(input) {\n    if (input === null || input === undefined) {\n        return '';\n    }\n    else if (typeof input === 'string' || input instanceof String) {\n        return input;\n    }\n    return JSON.stringify(input);\n}\nexports.toCommandValue = toCommandValue;\n/**\n *\n * @param annotationProperties\n * @returns The command properties to send with the actual annotation command\n * See IssueCommandProperties: https://github.com/actions/runner/blob/main/src/Runner.Worker/ActionCommandManager.cs#L646\n */\nfunction toCommandProperties(annotationProperties) {\n    if (!Object.keys(annotationProperties).length) {\n        return {};\n    }\n    return {\n        title: annotationProperties.title,\n        file: annotationProperties.file,\n        line: annotationProperties.startLine,\n        endLine: annotationProperties.endLine,\n        col: annotationProperties.startColumn,\n        endColumn: annotationProperties.endColumn\n    };\n}\nexports.toCommandProperties = toCommandProperties;\n//# sourceMappingURL=utils.js.map\n\n/***/ }),\n\n/***/ 6758:\n/***/ (function(__unused_webpack_module, exports) {\n\n\"use strict\";\n\nvar __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {\n    function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }\n    return new (P || (P = Promise))(function (resolve, reject) {\n        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\n        function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\n        function step(result) { result.done ? 
resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\n        step((generator = generator.apply(thisArg, _arguments || [])).next());\n    });\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.PersonalAccessTokenCredentialHandler = exports.BearerCredentialHandler = exports.BasicCredentialHandler = void 0;\nclass BasicCredentialHandler {\n    constructor(username, password) {\n        this.username = username;\n        this.password = password;\n    }\n    prepareRequest(options) {\n        if (!options.headers) {\n            throw Error('The request has no headers');\n        }\n        options.headers['Authorization'] = `Basic ${Buffer.from(`${this.username}:${this.password}`).toString('base64')}`;\n    }\n    // This handler cannot handle 401\n    canHandleAuthentication() {\n        return false;\n    }\n    handleAuthentication() {\n        return __awaiter(this, void 0, void 0, function* () {\n            throw new Error('not implemented');\n        });\n    }\n}\nexports.BasicCredentialHandler = BasicCredentialHandler;\nclass BearerCredentialHandler {\n    constructor(token) {\n        this.token = token;\n    }\n    // currently implements pre-authorization\n    // TODO: support preAuth = false where it hooks on 401\n    prepareRequest(options) {\n        if (!options.headers) {\n            throw Error('The request has no headers');\n        }\n        options.headers['Authorization'] = `Bearer ${this.token}`;\n    }\n    // This handler cannot handle 401\n    canHandleAuthentication() {\n        return false;\n    }\n    handleAuthentication() {\n        return __awaiter(this, void 0, void 0, function* () {\n            throw new Error('not implemented');\n        });\n    }\n}\nexports.BearerCredentialHandler = BearerCredentialHandler;\nclass PersonalAccessTokenCredentialHandler {\n    constructor(token) {\n        this.token = token;\n    }\n    // currently implements pre-authorization\n    // TODO: 
support preAuth = false where it hooks on 401\n    prepareRequest(options) {\n        if (!options.headers) {\n            throw Error('The request has no headers');\n        }\n        options.headers['Authorization'] = `Basic ${Buffer.from(`PAT:${this.token}`).toString('base64')}`;\n    }\n    // This handler cannot handle 401\n    canHandleAuthentication() {\n        return false;\n    }\n    handleAuthentication() {\n        return __awaiter(this, void 0, void 0, function* () {\n            throw new Error('not implemented');\n        });\n    }\n}\nexports.PersonalAccessTokenCredentialHandler = PersonalAccessTokenCredentialHandler;\n//# sourceMappingURL=auth.js.map\n\n/***/ }),\n\n/***/ 1404:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\n/* eslint-disable @typescript-eslint/no-explicit-any */\nvar __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\n}) : (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    o[k2] = m[k];\n}));\nvar __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {\n    Object.defineProperty(o, \"default\", { enumerable: true, value: v });\n}) : function(o, v) {\n    o[\"default\"] = v;\n});\nvar __importStar = (this && this.__importStar) || function (mod) {\n    if (mod && mod.__esModule) return mod;\n    var result = {};\n    if (mod != null) for (var k in mod) if (k !== \"default\" && Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\n    __setModuleDefault(result, mod);\n    return result;\n};\nvar __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {\n    function adopt(value) { return value instanceof P ? 
value : new P(function (resolve) { resolve(value); }); }\n    return new (P || (P = Promise))(function (resolve, reject) {\n        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\n        function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\n        function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\n        step((generator = generator.apply(thisArg, _arguments || [])).next());\n    });\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.HttpClient = exports.isHttps = exports.HttpClientResponse = exports.HttpClientError = exports.getProxyUrl = exports.MediaTypes = exports.Headers = exports.HttpCodes = void 0;\nconst http = __importStar(__webpack_require__(8605));\nconst https = __importStar(__webpack_require__(7211));\nconst pm = __importStar(__webpack_require__(2843));\nconst tunnel = __importStar(__webpack_require__(4294));\nvar HttpCodes;\n(function (HttpCodes) {\n    HttpCodes[HttpCodes[\"OK\"] = 200] = \"OK\";\n    HttpCodes[HttpCodes[\"MultipleChoices\"] = 300] = \"MultipleChoices\";\n    HttpCodes[HttpCodes[\"MovedPermanently\"] = 301] = \"MovedPermanently\";\n    HttpCodes[HttpCodes[\"ResourceMoved\"] = 302] = \"ResourceMoved\";\n    HttpCodes[HttpCodes[\"SeeOther\"] = 303] = \"SeeOther\";\n    HttpCodes[HttpCodes[\"NotModified\"] = 304] = \"NotModified\";\n    HttpCodes[HttpCodes[\"UseProxy\"] = 305] = \"UseProxy\";\n    HttpCodes[HttpCodes[\"SwitchProxy\"] = 306] = \"SwitchProxy\";\n    HttpCodes[HttpCodes[\"TemporaryRedirect\"] = 307] = \"TemporaryRedirect\";\n    HttpCodes[HttpCodes[\"PermanentRedirect\"] = 308] = \"PermanentRedirect\";\n    HttpCodes[HttpCodes[\"BadRequest\"] = 400] = \"BadRequest\";\n    HttpCodes[HttpCodes[\"Unauthorized\"] = 401] = \"Unauthorized\";\n    HttpCodes[HttpCodes[\"PaymentRequired\"] = 402] = \"PaymentRequired\";\n    
HttpCodes[HttpCodes[\"Forbidden\"] = 403] = \"Forbidden\";\n    HttpCodes[HttpCodes[\"NotFound\"] = 404] = \"NotFound\";\n    HttpCodes[HttpCodes[\"MethodNotAllowed\"] = 405] = \"MethodNotAllowed\";\n    HttpCodes[HttpCodes[\"NotAcceptable\"] = 406] = \"NotAcceptable\";\n    HttpCodes[HttpCodes[\"ProxyAuthenticationRequired\"] = 407] = \"ProxyAuthenticationRequired\";\n    HttpCodes[HttpCodes[\"RequestTimeout\"] = 408] = \"RequestTimeout\";\n    HttpCodes[HttpCodes[\"Conflict\"] = 409] = \"Conflict\";\n    HttpCodes[HttpCodes[\"Gone\"] = 410] = \"Gone\";\n    HttpCodes[HttpCodes[\"TooManyRequests\"] = 429] = \"TooManyRequests\";\n    HttpCodes[HttpCodes[\"InternalServerError\"] = 500] = \"InternalServerError\";\n    HttpCodes[HttpCodes[\"NotImplemented\"] = 501] = \"NotImplemented\";\n    HttpCodes[HttpCodes[\"BadGateway\"] = 502] = \"BadGateway\";\n    HttpCodes[HttpCodes[\"ServiceUnavailable\"] = 503] = \"ServiceUnavailable\";\n    HttpCodes[HttpCodes[\"GatewayTimeout\"] = 504] = \"GatewayTimeout\";\n})(HttpCodes = exports.HttpCodes || (exports.HttpCodes = {}));\nvar Headers;\n(function (Headers) {\n    Headers[\"Accept\"] = \"accept\";\n    Headers[\"ContentType\"] = \"content-type\";\n})(Headers = exports.Headers || (exports.Headers = {}));\nvar MediaTypes;\n(function (MediaTypes) {\n    MediaTypes[\"ApplicationJson\"] = \"application/json\";\n})(MediaTypes = exports.MediaTypes || (exports.MediaTypes = {}));\n/**\n * Returns the proxy URL, depending upon the supplied url and proxy environment variables.\n * @param serverUrl  The server URL where the request will be sent. For example, https://api.github.com\n */\nfunction getProxyUrl(serverUrl) {\n    const proxyUrl = pm.getProxyUrl(new URL(serverUrl));\n    return proxyUrl ? 
proxyUrl.href : '';\n}\nexports.getProxyUrl = getProxyUrl;\nconst HttpRedirectCodes = [\n    HttpCodes.MovedPermanently,\n    HttpCodes.ResourceMoved,\n    HttpCodes.SeeOther,\n    HttpCodes.TemporaryRedirect,\n    HttpCodes.PermanentRedirect\n];\nconst HttpResponseRetryCodes = [\n    HttpCodes.BadGateway,\n    HttpCodes.ServiceUnavailable,\n    HttpCodes.GatewayTimeout\n];\nconst RetryableHttpVerbs = ['OPTIONS', 'GET', 'DELETE', 'HEAD'];\nconst ExponentialBackoffCeiling = 10;\nconst ExponentialBackoffTimeSlice = 5;\nclass HttpClientError extends Error {\n    constructor(message, statusCode) {\n        super(message);\n        this.name = 'HttpClientError';\n        this.statusCode = statusCode;\n        Object.setPrototypeOf(this, HttpClientError.prototype);\n    }\n}\nexports.HttpClientError = HttpClientError;\nclass HttpClientResponse {\n    constructor(message) {\n        this.message = message;\n    }\n    readBody() {\n        return __awaiter(this, void 0, void 0, function* () {\n            return new Promise((resolve) => __awaiter(this, void 0, void 0, function* () {\n                let output = Buffer.alloc(0);\n                this.message.on('data', (chunk) => {\n                    output = Buffer.concat([output, chunk]);\n                });\n                this.message.on('end', () => {\n                    resolve(output.toString());\n                });\n            }));\n        });\n    }\n}\nexports.HttpClientResponse = HttpClientResponse;\nfunction isHttps(requestUrl) {\n    const parsedUrl = new URL(requestUrl);\n    return parsedUrl.protocol === 'https:';\n}\nexports.isHttps = isHttps;\nclass HttpClient {\n    constructor(userAgent, handlers, requestOptions) {\n        this._ignoreSslError = false;\n        this._allowRedirects = true;\n        this._allowRedirectDowngrade = false;\n        this._maxRedirects = 50;\n        this._allowRetries = false;\n        this._maxRetries = 1;\n        this._keepAlive = false;\n        this._disposed = 
false;\n        this.userAgent = userAgent;\n        this.handlers = handlers || [];\n        this.requestOptions = requestOptions;\n        if (requestOptions) {\n            if (requestOptions.ignoreSslError != null) {\n                this._ignoreSslError = requestOptions.ignoreSslError;\n            }\n            this._socketTimeout = requestOptions.socketTimeout;\n            if (requestOptions.allowRedirects != null) {\n                this._allowRedirects = requestOptions.allowRedirects;\n            }\n            if (requestOptions.allowRedirectDowngrade != null) {\n                this._allowRedirectDowngrade = requestOptions.allowRedirectDowngrade;\n            }\n            if (requestOptions.maxRedirects != null) {\n                this._maxRedirects = Math.max(requestOptions.maxRedirects, 0);\n            }\n            if (requestOptions.keepAlive != null) {\n                this._keepAlive = requestOptions.keepAlive;\n            }\n            if (requestOptions.allowRetries != null) {\n                this._allowRetries = requestOptions.allowRetries;\n            }\n            if (requestOptions.maxRetries != null) {\n                this._maxRetries = requestOptions.maxRetries;\n            }\n        }\n    }\n    options(requestUrl, additionalHeaders) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return this.request('OPTIONS', requestUrl, null, additionalHeaders || {});\n        });\n    }\n    get(requestUrl, additionalHeaders) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return this.request('GET', requestUrl, null, additionalHeaders || {});\n        });\n    }\n    del(requestUrl, additionalHeaders) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return this.request('DELETE', requestUrl, null, additionalHeaders || {});\n        });\n    }\n    post(requestUrl, data, additionalHeaders) {\n        return __awaiter(this, void 0, void 0, function* () 
{\n            return this.request('POST', requestUrl, data, additionalHeaders || {});\n        });\n    }\n    patch(requestUrl, data, additionalHeaders) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return this.request('PATCH', requestUrl, data, additionalHeaders || {});\n        });\n    }\n    put(requestUrl, data, additionalHeaders) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return this.request('PUT', requestUrl, data, additionalHeaders || {});\n        });\n    }\n    head(requestUrl, additionalHeaders) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return this.request('HEAD', requestUrl, null, additionalHeaders || {});\n        });\n    }\n    sendStream(verb, requestUrl, stream, additionalHeaders) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return this.request(verb, requestUrl, stream, additionalHeaders);\n        });\n    }\n    /**\n     * Gets a typed object from an endpoint\n     * Be aware that not found returns a null.  
Other errors (4xx, 5xx) reject the promise\n     */\n    getJson(requestUrl, additionalHeaders = {}) {\n        return __awaiter(this, void 0, void 0, function* () {\n            additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);\n            const res = yield this.get(requestUrl, additionalHeaders);\n            return this._processResponse(res, this.requestOptions);\n        });\n    }\n    postJson(requestUrl, obj, additionalHeaders = {}) {\n        return __awaiter(this, void 0, void 0, function* () {\n            const data = JSON.stringify(obj, null, 2);\n            additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);\n            additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);\n            const res = yield this.post(requestUrl, data, additionalHeaders);\n            return this._processResponse(res, this.requestOptions);\n        });\n    }\n    putJson(requestUrl, obj, additionalHeaders = {}) {\n        return __awaiter(this, void 0, void 0, function* () {\n            const data = JSON.stringify(obj, null, 2);\n            additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);\n            additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);\n            const res = yield this.put(requestUrl, data, additionalHeaders);\n            return this._processResponse(res, this.requestOptions);\n        });\n    }\n    patchJson(requestUrl, obj, additionalHeaders = {}) {\n        return __awaiter(this, void 0, void 0, function* () {\n            const data = JSON.stringify(obj, null, 2);\n            additionalHeaders[Headers.Accept] = 
this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);\n            additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);\n            const res = yield this.patch(requestUrl, data, additionalHeaders);\n            return this._processResponse(res, this.requestOptions);\n        });\n    }\n    /**\n     * Makes a raw http request.\n     * All other methods such as get, post, patch, and request ultimately call this.\n     * Prefer get, del, post and patch\n     */\n    request(verb, requestUrl, data, headers) {\n        return __awaiter(this, void 0, void 0, function* () {\n            if (this._disposed) {\n                throw new Error('Client has already been disposed.');\n            }\n            const parsedUrl = new URL(requestUrl);\n            let info = this._prepareRequest(verb, parsedUrl, headers);\n            // Only perform retries on reads since writes may not be idempotent.\n            const maxTries = this._allowRetries && RetryableHttpVerbs.includes(verb)\n                ? 
this._maxRetries + 1\n                : 1;\n            let numTries = 0;\n            let response;\n            do {\n                response = yield this.requestRaw(info, data);\n                // Check if it's an authentication challenge\n                if (response &&\n                    response.message &&\n                    response.message.statusCode === HttpCodes.Unauthorized) {\n                    let authenticationHandler;\n                    for (const handler of this.handlers) {\n                        if (handler.canHandleAuthentication(response)) {\n                            authenticationHandler = handler;\n                            break;\n                        }\n                    }\n                    if (authenticationHandler) {\n                        return authenticationHandler.handleAuthentication(this, info, data);\n                    }\n                    else {\n                        // We have received an unauthorized response but have no handlers to handle it.\n                        // Let the response return to the caller.\n                        return response;\n                    }\n                }\n                let redirectsRemaining = this._maxRedirects;\n                while (response.message.statusCode &&\n                    HttpRedirectCodes.includes(response.message.statusCode) &&\n                    this._allowRedirects &&\n                    redirectsRemaining > 0) {\n                    const redirectUrl = response.message.headers['location'];\n                    if (!redirectUrl) {\n                        // if there's no location to redirect to, we won't redirect\n                        break;\n                    }\n                    const parsedRedirectUrl = new URL(redirectUrl);\n                    if (parsedUrl.protocol === 'https:' &&\n                        parsedUrl.protocol !== parsedRedirectUrl.protocol &&\n                        !this._allowRedirectDowngrade) {\n                
        throw new Error('Redirect from HTTPS to HTTP protocol. This downgrade is not allowed for security reasons. If you want to allow this behavior, set the allowRedirectDowngrade option to true.');\n                    }\n                    // we need to finish reading the response before reassigning response,\n                    // which would otherwise leak the open socket.\n                    yield response.readBody();\n                    // strip authorization header if redirected to a different hostname\n                    if (parsedRedirectUrl.hostname !== parsedUrl.hostname) {\n                        for (const header in headers) {\n                            // header names are case insensitive\n                            if (header.toLowerCase() === 'authorization') {\n                                delete headers[header];\n                            }\n                        }\n                    }\n                    // let's make the request with the new redirectUrl\n                    info = this._prepareRequest(verb, parsedRedirectUrl, headers);\n                    response = yield this.requestRaw(info, data);\n                    redirectsRemaining--;\n                }\n                if (!response.message.statusCode ||\n                    !HttpResponseRetryCodes.includes(response.message.statusCode)) {\n                    // If not a retry code, return immediately instead of retrying\n                    return response;\n                }\n                numTries += 1;\n                if (numTries < maxTries) {\n                    yield response.readBody();\n                    yield this._performExponentialBackoff(numTries);\n                }\n            } while (numTries < maxTries);\n            return response;\n        });\n    }\n    /**\n     * Needs to be called if keepAlive is set to true in request options.\n     */\n    dispose() {\n        if (this._agent) {\n            this._agent.destroy();\n        }\n        
this._disposed = true;\n    }\n    /**\n     * Raw request.\n     * @param info\n     * @param data\n     */\n    requestRaw(info, data) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return new Promise((resolve, reject) => {\n                function callbackForResult(err, res) {\n                    if (err) {\n                        reject(err);\n                    }\n                    else if (!res) {\n                        // If `err` is not passed, then `res` must be passed.\n                        reject(new Error('Unknown error'));\n                    }\n                    else {\n                        resolve(res);\n                    }\n                }\n                this.requestRawWithCallback(info, data, callbackForResult);\n            });\n        });\n    }\n    /**\n     * Raw request with callback.\n     * @param info\n     * @param data\n     * @param onResult\n     */\n    requestRawWithCallback(info, data, onResult) {\n        if (typeof data === 'string') {\n            if (!info.options.headers) {\n                info.options.headers = {};\n            }\n            info.options.headers['Content-Length'] = Buffer.byteLength(data, 'utf8');\n        }\n        let callbackCalled = false;\n        function handleResult(err, res) {\n            if (!callbackCalled) {\n                callbackCalled = true;\n                onResult(err, res);\n            }\n        }\n        const req = info.httpModule.request(info.options, (msg) => {\n            const res = new HttpClientResponse(msg);\n            handleResult(undefined, res);\n        });\n        let socket;\n        req.on('socket', sock => {\n            socket = sock;\n        });\n        // If we ever get disconnected, we want the socket to timeout eventually\n        req.setTimeout(this._socketTimeout || 3 * 60000, () => {\n            if (socket) {\n                socket.end();\n            }\n            handleResult(new 
Error(`Request timeout: ${info.options.path}`));\n        });\n        req.on('error', function (err) {\n            // err has statusCode property\n            // res should have headers\n            handleResult(err);\n        });\n        if (data && typeof data === 'string') {\n            req.write(data, 'utf8');\n        }\n        if (data && typeof data !== 'string') {\n            data.on('close', function () {\n                req.end();\n            });\n            data.pipe(req);\n        }\n        else {\n            req.end();\n        }\n    }\n    /**\n     * Gets an http agent. This function is useful when you need an http agent that handles\n     * routing through a proxy server - depending upon the url and proxy environment variables.\n     * @param serverUrl  The server URL where the request will be sent. For example, https://api.github.com\n     */\n    getAgent(serverUrl) {\n        const parsedUrl = new URL(serverUrl);\n        return this._getAgent(parsedUrl);\n    }\n    _prepareRequest(method, requestUrl, headers) {\n        const info = {};\n        info.parsedUrl = requestUrl;\n        const usingSsl = info.parsedUrl.protocol === 'https:';\n        info.httpModule = usingSsl ? https : http;\n        const defaultPort = usingSsl ? 443 : 80;\n        info.options = {};\n        info.options.host = info.parsedUrl.hostname;\n        info.options.port = info.parsedUrl.port\n            ? 
parseInt(info.parsedUrl.port)\n            : defaultPort;\n        info.options.path =\n            (info.parsedUrl.pathname || '') + (info.parsedUrl.search || '');\n        info.options.method = method;\n        info.options.headers = this._mergeHeaders(headers);\n        if (this.userAgent != null) {\n            info.options.headers['user-agent'] = this.userAgent;\n        }\n        info.options.agent = this._getAgent(info.parsedUrl);\n        // gives handlers an opportunity to participate\n        if (this.handlers) {\n            for (const handler of this.handlers) {\n                handler.prepareRequest(info.options);\n            }\n        }\n        return info;\n    }\n    _mergeHeaders(headers) {\n        if (this.requestOptions && this.requestOptions.headers) {\n            return Object.assign({}, lowercaseKeys(this.requestOptions.headers), lowercaseKeys(headers || {}));\n        }\n        return lowercaseKeys(headers || {});\n    }\n    _getExistingOrDefaultHeader(additionalHeaders, header, _default) {\n        let clientHeader;\n        if (this.requestOptions && this.requestOptions.headers) {\n            clientHeader = lowercaseKeys(this.requestOptions.headers)[header];\n        }\n        return additionalHeaders[header] || clientHeader || _default;\n    }\n    _getAgent(parsedUrl) {\n        let agent;\n        const proxyUrl = pm.getProxyUrl(parsedUrl);\n        const useProxy = proxyUrl && proxyUrl.hostname;\n        if (this._keepAlive && useProxy) {\n            agent = this._proxyAgent;\n        }\n        if (this._keepAlive && !useProxy) {\n            agent = this._agent;\n        }\n        // if agent is already assigned use that agent.\n        if (agent) {\n            return agent;\n        }\n        const usingSsl = parsedUrl.protocol === 'https:';\n        let maxSockets = 100;\n        if (this.requestOptions) {\n            maxSockets = this.requestOptions.maxSockets || http.globalAgent.maxSockets;\n        }\n        // 
This is `useProxy` again, but we need to check `proxyUrl` directly for TypeScript's flow analysis.\n        if (proxyUrl && proxyUrl.hostname) {\n            const agentOptions = {\n                maxSockets,\n                keepAlive: this._keepAlive,\n                proxy: Object.assign(Object.assign({}, ((proxyUrl.username || proxyUrl.password) && {\n                    proxyAuth: `${proxyUrl.username}:${proxyUrl.password}`\n                })), { host: proxyUrl.hostname, port: proxyUrl.port })\n            };\n            let tunnelAgent;\n            const overHttps = proxyUrl.protocol === 'https:';\n            if (usingSsl) {\n                tunnelAgent = overHttps ? tunnel.httpsOverHttps : tunnel.httpsOverHttp;\n            }\n            else {\n                tunnelAgent = overHttps ? tunnel.httpOverHttps : tunnel.httpOverHttp;\n            }\n            agent = tunnelAgent(agentOptions);\n            this._proxyAgent = agent;\n        }\n        // if reusing agent across request and tunneling agent isn't assigned create a new agent\n        if (this._keepAlive && !agent) {\n            const options = { keepAlive: this._keepAlive, maxSockets };\n            agent = usingSsl ? new https.Agent(options) : new http.Agent(options);\n            this._agent = agent;\n        }\n        // if not using private agent and tunnel agent isn't setup then use global agent\n        if (!agent) {\n            agent = usingSsl ? 
https.globalAgent : http.globalAgent;\n        }\n        if (usingSsl && this._ignoreSslError) {\n            // we don't want to set NODE_TLS_REJECT_UNAUTHORIZED=0 since that will affect request for entire process\n            // http.RequestOptions doesn't expose a way to modify RequestOptions.agent.options\n            // we have to cast it to any and change it directly\n            agent.options = Object.assign(agent.options || {}, {\n                rejectUnauthorized: false\n            });\n        }\n        return agent;\n    }\n    _performExponentialBackoff(retryNumber) {\n        return __awaiter(this, void 0, void 0, function* () {\n            retryNumber = Math.min(ExponentialBackoffCeiling, retryNumber);\n            const ms = ExponentialBackoffTimeSlice * Math.pow(2, retryNumber);\n            return new Promise(resolve => setTimeout(() => resolve(), ms));\n        });\n    }\n    _processResponse(res, options) {\n        return __awaiter(this, void 0, void 0, function* () {\n            return new Promise((resolve, reject) => __awaiter(this, void 0, void 0, function* () {\n                const statusCode = res.message.statusCode || 0;\n                const response = {\n                    statusCode,\n                    result: null,\n                    headers: {}\n                };\n                // not found leads to null obj returned\n                if (statusCode === HttpCodes.NotFound) {\n                    resolve(response);\n                }\n                // get the result from the body\n                function dateTimeDeserializer(key, value) {\n                    if (typeof value === 'string') {\n                        const a = new Date(value);\n                        if (!isNaN(a.valueOf())) {\n                            return a;\n                        }\n                    }\n                    return value;\n                }\n                let obj;\n                let contents;\n                try {\n   
                 contents = yield res.readBody();\n                    if (contents && contents.length > 0) {\n                        if (options && options.deserializeDates) {\n                            obj = JSON.parse(contents, dateTimeDeserializer);\n                        }\n                        else {\n                            obj = JSON.parse(contents);\n                        }\n                        response.result = obj;\n                    }\n                    response.headers = res.message.headers;\n                }\n                catch (err) {\n                    // Invalid resource (contents not json);  leaving result obj null\n                }\n                // note that 3xx redirects are handled by the http layer.\n                if (statusCode > 299) {\n                    let msg;\n                    // if exception/error in body, attempt to get better error\n                    if (obj && obj.message) {\n                        msg = obj.message;\n                    }\n                    else if (contents && contents.length > 0) {\n                        // it may be the case that the exception is in the body message as string\n                        msg = contents;\n                    }\n                    else {\n                        msg = `Failed request: (${statusCode})`;\n                    }\n                    const err = new HttpClientError(msg, statusCode);\n                    err.result = response.result;\n                    reject(err);\n                }\n                else {\n                    resolve(response);\n                }\n            }));\n        });\n    }\n}\nexports.HttpClient = HttpClient;\nconst lowercaseKeys = (obj) => Object.keys(obj).reduce((c, k) => ((c[k.toLowerCase()] = obj[k]), c), {});\n//# sourceMappingURL=index.js.map\n\n/***/ }),\n\n/***/ 2843:\n/***/ ((__unused_webpack_module, exports) => {\n\n\"use strict\";\n\nObject.defineProperty(exports, \"__esModule\", ({ 
value: true }));\nexports.checkBypass = exports.getProxyUrl = void 0;\nfunction getProxyUrl(reqUrl) {\n    const usingSsl = reqUrl.protocol === 'https:';\n    if (checkBypass(reqUrl)) {\n        return undefined;\n    }\n    const proxyVar = (() => {\n        if (usingSsl) {\n            return process.env['https_proxy'] || process.env['HTTPS_PROXY'];\n        }\n        else {\n            return process.env['http_proxy'] || process.env['HTTP_PROXY'];\n        }\n    })();\n    if (proxyVar) {\n        return new URL(proxyVar);\n    }\n    else {\n        return undefined;\n    }\n}\nexports.getProxyUrl = getProxyUrl;\nfunction checkBypass(reqUrl) {\n    if (!reqUrl.hostname) {\n        return false;\n    }\n    const noProxy = process.env['no_proxy'] || process.env['NO_PROXY'] || '';\n    if (!noProxy) {\n        return false;\n    }\n    // Determine the request port\n    let reqPort;\n    if (reqUrl.port) {\n        reqPort = Number(reqUrl.port);\n    }\n    else if (reqUrl.protocol === 'http:') {\n        reqPort = 80;\n    }\n    else if (reqUrl.protocol === 'https:') {\n        reqPort = 443;\n    }\n    // Format the request hostname and hostname with port\n    const upperReqHosts = [reqUrl.hostname.toUpperCase()];\n    if (typeof reqPort === 'number') {\n        upperReqHosts.push(`${upperReqHosts[0]}:${reqPort}`);\n    }\n    // Compare request host against noproxy\n    for (const upperNoProxyItem of noProxy\n        .split(',')\n        .map(x => x.trim().toUpperCase())\n        .filter(x => x)) {\n        if (upperReqHosts.some(x => x === upperNoProxyItem)) {\n            return true;\n        }\n    }\n    return false;\n}\nexports.checkBypass = checkBypass;\n//# sourceMappingURL=proxy.js.map\n\n/***/ }),\n\n/***/ 4087:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.Context = void 0;\nconst fs_1 = __webpack_require__(5747);\nconst 
os_1 = __webpack_require__(2087);\nclass Context {\n    /**\n     * Hydrate the context from the environment\n     */\n    constructor() {\n        this.payload = {};\n        if (process.env.GITHUB_EVENT_PATH) {\n            if (fs_1.existsSync(process.env.GITHUB_EVENT_PATH)) {\n                this.payload = JSON.parse(fs_1.readFileSync(process.env.GITHUB_EVENT_PATH, { encoding: 'utf8' }));\n            }\n            else {\n                const path = process.env.GITHUB_EVENT_PATH;\n                process.stdout.write(`GITHUB_EVENT_PATH ${path} does not exist${os_1.EOL}`);\n            }\n        }\n        this.eventName = process.env.GITHUB_EVENT_NAME;\n        this.sha = process.env.GITHUB_SHA;\n        this.ref = process.env.GITHUB_REF;\n        this.workflow = process.env.GITHUB_WORKFLOW;\n        this.action = process.env.GITHUB_ACTION;\n        this.actor = process.env.GITHUB_ACTOR;\n        this.job = process.env.GITHUB_JOB;\n        this.runNumber = parseInt(process.env.GITHUB_RUN_NUMBER, 10);\n        this.runId = parseInt(process.env.GITHUB_RUN_ID, 10);\n    }\n    get issue() {\n        const payload = this.payload;\n        return Object.assign(Object.assign({}, this.repo), { number: (payload.issue || payload.pull_request || payload).number });\n    }\n    get repo() {\n        if (process.env.GITHUB_REPOSITORY) {\n            const [owner, repo] = process.env.GITHUB_REPOSITORY.split('/');\n            return { owner, repo };\n        }\n        if (this.payload.repository) {\n            return {\n                owner: this.payload.repository.owner.login,\n                repo: this.payload.repository.name\n            };\n        }\n        throw new Error(\"context.repo requires a GITHUB_REPOSITORY environment variable like 'owner/repo'\");\n    }\n}\nexports.Context = Context;\n//# sourceMappingURL=context.js.map\n\n/***/ }),\n\n/***/ 5438:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\nvar 
__createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\n}) : (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    o[k2] = m[k];\n}));\nvar __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {\n    Object.defineProperty(o, \"default\", { enumerable: true, value: v });\n}) : function(o, v) {\n    o[\"default\"] = v;\n});\nvar __importStar = (this && this.__importStar) || function (mod) {\n    if (mod && mod.__esModule) return mod;\n    var result = {};\n    if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\n    __setModuleDefault(result, mod);\n    return result;\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.getOctokit = exports.context = void 0;\nconst Context = __importStar(__webpack_require__(4087));\nconst utils_1 = __webpack_require__(3030);\nexports.context = new Context.Context();\n/**\n * Returns a hydrated octokit ready to use for GitHub Actions\n *\n * @param     token    the repo PAT or GITHUB_TOKEN\n * @param     options  other options to set\n */\nfunction getOctokit(token, options) {\n    return new utils_1.GitHub(utils_1.getOctokitOptions(token, options));\n}\nexports.getOctokit = getOctokit;\n//# sourceMappingURL=github.js.map\n\n/***/ }),\n\n/***/ 7914:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\nvar __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\n}) : (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    o[k2] = m[k];\n}));\nvar __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? 
(function(o, v) {\n    Object.defineProperty(o, \"default\", { enumerable: true, value: v });\n}) : function(o, v) {\n    o[\"default\"] = v;\n});\nvar __importStar = (this && this.__importStar) || function (mod) {\n    if (mod && mod.__esModule) return mod;\n    var result = {};\n    if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\n    __setModuleDefault(result, mod);\n    return result;\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.getApiBaseUrl = exports.getProxyAgent = exports.getAuthString = void 0;\nconst httpClient = __importStar(__webpack_require__(9925));\nfunction getAuthString(token, options) {\n    if (!token && !options.auth) {\n        throw new Error('Parameter token or opts.auth is required');\n    }\n    else if (token && options.auth) {\n        throw new Error('Parameters token and opts.auth may not both be specified');\n    }\n    return typeof options.auth === 'string' ? options.auth : `token ${token}`;\n}\nexports.getAuthString = getAuthString;\nfunction getProxyAgent(destinationUrl) {\n    const hc = new httpClient.HttpClient();\n    return hc.getAgent(destinationUrl);\n}\nexports.getProxyAgent = getProxyAgent;\nfunction getApiBaseUrl() {\n    return process.env['GITHUB_API_URL'] || 'https://api.github.com';\n}\nexports.getApiBaseUrl = getApiBaseUrl;\n//# sourceMappingURL=utils.js.map\n\n/***/ }),\n\n/***/ 3030:\n/***/ (function(__unused_webpack_module, exports, __webpack_require__) {\n\n\"use strict\";\n\nvar __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\n}) : (function(o, m, k, k2) {\n    if (k2 === undefined) k2 = k;\n    o[k2] = m[k];\n}));\nvar __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? 
(function(o, v) {\n    Object.defineProperty(o, \"default\", { enumerable: true, value: v });\n}) : function(o, v) {\n    o[\"default\"] = v;\n});\nvar __importStar = (this && this.__importStar) || function (mod) {\n    if (mod && mod.__esModule) return mod;\n    var result = {};\n    if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\n    __setModuleDefault(result, mod);\n    return result;\n};\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.getOctokitOptions = exports.GitHub = exports.context = void 0;\nconst Context = __importStar(__webpack_require__(4087));\nconst Utils = __importStar(__webpack_require__(7914));\n// octokit + plugins\nconst core_1 = __webpack_require__(6762);\nconst plugin_rest_endpoint_methods_1 = __webpack_require__(3044);\nconst plugin_paginate_rest_1 = __webpack_require__(4193);\nexports.context = new Context.Context();\nconst baseUrl = Utils.getApiBaseUrl();\nconst defaults = {\n    baseUrl,\n    request: {\n        agent: Utils.getProxyAgent(baseUrl)\n    }\n};\nexports.GitHub = core_1.Octokit.plugin(plugin_rest_endpoint_methods_1.restEndpointMethods, plugin_paginate_rest_1.paginateRest).defaults(defaults);\n/**\n * Convenience function to correctly format Octokit Options to pass into the constructor.\n *\n * @param     token    the repo PAT or GITHUB_TOKEN\n * @param     options  other options to set\n */\nfunction getOctokitOptions(token, options) {\n    const opts = Object.assign({}, options || {}); // Shallow clone - don't mutate the object provided by the caller\n    // Auth\n    const auth = Utils.getAuthString(token, opts);\n    if (auth) {\n        opts.auth = auth;\n    }\n    return opts;\n}\nexports.getOctokitOptions = getOctokitOptions;\n//# sourceMappingURL=utils.js.map\n\n/***/ }),\n\n/***/ 9925:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\nObject.defineProperty(exports, \"__esModule\", ({ value: 
true }));\nconst url = __webpack_require__(8835);\nconst http = __webpack_require__(8605);\nconst https = __webpack_require__(7211);\nconst pm = __webpack_require__(6443);\nlet tunnel;\nvar HttpCodes;\n(function (HttpCodes) {\n    HttpCodes[HttpCodes[\"OK\"] = 200] = \"OK\";\n    HttpCodes[HttpCodes[\"MultipleChoices\"] = 300] = \"MultipleChoices\";\n    HttpCodes[HttpCodes[\"MovedPermanently\"] = 301] = \"MovedPermanently\";\n    HttpCodes[HttpCodes[\"ResourceMoved\"] = 302] = \"ResourceMoved\";\n    HttpCodes[HttpCodes[\"SeeOther\"] = 303] = \"SeeOther\";\n    HttpCodes[HttpCodes[\"NotModified\"] = 304] = \"NotModified\";\n    HttpCodes[HttpCodes[\"UseProxy\"] = 305] = \"UseProxy\";\n    HttpCodes[HttpCodes[\"SwitchProxy\"] = 306] = \"SwitchProxy\";\n    HttpCodes[HttpCodes[\"TemporaryRedirect\"] = 307] = \"TemporaryRedirect\";\n    HttpCodes[HttpCodes[\"PermanentRedirect\"] = 308] = \"PermanentRedirect\";\n    HttpCodes[HttpCodes[\"BadRequest\"] = 400] = \"BadRequest\";\n    HttpCodes[HttpCodes[\"Unauthorized\"] = 401] = \"Unauthorized\";\n    HttpCodes[HttpCodes[\"PaymentRequired\"] = 402] = \"PaymentRequired\";\n    HttpCodes[HttpCodes[\"Forbidden\"] = 403] = \"Forbidden\";\n    HttpCodes[HttpCodes[\"NotFound\"] = 404] = \"NotFound\";\n    HttpCodes[HttpCodes[\"MethodNotAllowed\"] = 405] = \"MethodNotAllowed\";\n    HttpCodes[HttpCodes[\"NotAcceptable\"] = 406] = \"NotAcceptable\";\n    HttpCodes[HttpCodes[\"ProxyAuthenticationRequired\"] = 407] = \"ProxyAuthenticationRequired\";\n    HttpCodes[HttpCodes[\"RequestTimeout\"] = 408] = \"RequestTimeout\";\n    HttpCodes[HttpCodes[\"Conflict\"] = 409] = \"Conflict\";\n    HttpCodes[HttpCodes[\"Gone\"] = 410] = \"Gone\";\n    HttpCodes[HttpCodes[\"TooManyRequests\"] = 429] = \"TooManyRequests\";\n    HttpCodes[HttpCodes[\"InternalServerError\"] = 500] = \"InternalServerError\";\n    HttpCodes[HttpCodes[\"NotImplemented\"] = 501] = \"NotImplemented\";\n    HttpCodes[HttpCodes[\"BadGateway\"] = 502] = 
\"BadGateway\";\n    HttpCodes[HttpCodes[\"ServiceUnavailable\"] = 503] = \"ServiceUnavailable\";\n    HttpCodes[HttpCodes[\"GatewayTimeout\"] = 504] = \"GatewayTimeout\";\n})(HttpCodes = exports.HttpCodes || (exports.HttpCodes = {}));\nvar Headers;\n(function (Headers) {\n    Headers[\"Accept\"] = \"accept\";\n    Headers[\"ContentType\"] = \"content-type\";\n})(Headers = exports.Headers || (exports.Headers = {}));\nvar MediaTypes;\n(function (MediaTypes) {\n    MediaTypes[\"ApplicationJson\"] = \"application/json\";\n})(MediaTypes = exports.MediaTypes || (exports.MediaTypes = {}));\n/**\n * Returns the proxy URL, depending upon the supplied url and proxy environment variables.\n * @param serverUrl  The server URL where the request will be sent. For example, https://api.github.com\n */\nfunction getProxyUrl(serverUrl) {\n    let proxyUrl = pm.getProxyUrl(url.parse(serverUrl));\n    return proxyUrl ? proxyUrl.href : '';\n}\nexports.getProxyUrl = getProxyUrl;\nconst HttpRedirectCodes = [\n    HttpCodes.MovedPermanently,\n    HttpCodes.ResourceMoved,\n    HttpCodes.SeeOther,\n    HttpCodes.TemporaryRedirect,\n    HttpCodes.PermanentRedirect\n];\nconst HttpResponseRetryCodes = [\n    HttpCodes.BadGateway,\n    HttpCodes.ServiceUnavailable,\n    HttpCodes.GatewayTimeout\n];\nconst RetryableHttpVerbs = ['OPTIONS', 'GET', 'DELETE', 'HEAD'];\nconst ExponentialBackoffCeiling = 10;\nconst ExponentialBackoffTimeSlice = 5;\nclass HttpClientResponse {\n    constructor(message) {\n        this.message = message;\n    }\n    readBody() {\n        return new Promise(async (resolve, reject) => {\n            let output = Buffer.alloc(0);\n            this.message.on('data', (chunk) => {\n                output = Buffer.concat([output, chunk]);\n            });\n            this.message.on('end', () => {\n                resolve(output.toString());\n            });\n        });\n    }\n}\nexports.HttpClientResponse = HttpClientResponse;\nfunction isHttps(requestUrl) {\n    let 
parsedUrl = url.parse(requestUrl);\n    return parsedUrl.protocol === 'https:';\n}\nexports.isHttps = isHttps;\nclass HttpClient {\n    constructor(userAgent, handlers, requestOptions) {\n        this._ignoreSslError = false;\n        this._allowRedirects = true;\n        this._allowRedirectDowngrade = false;\n        this._maxRedirects = 50;\n        this._allowRetries = false;\n        this._maxRetries = 1;\n        this._keepAlive = false;\n        this._disposed = false;\n        this.userAgent = userAgent;\n        this.handlers = handlers || [];\n        this.requestOptions = requestOptions;\n        if (requestOptions) {\n            if (requestOptions.ignoreSslError != null) {\n                this._ignoreSslError = requestOptions.ignoreSslError;\n            }\n            this._socketTimeout = requestOptions.socketTimeout;\n            if (requestOptions.allowRedirects != null) {\n                this._allowRedirects = requestOptions.allowRedirects;\n            }\n            if (requestOptions.allowRedirectDowngrade != null) {\n                this._allowRedirectDowngrade = requestOptions.allowRedirectDowngrade;\n            }\n            if (requestOptions.maxRedirects != null) {\n                this._maxRedirects = Math.max(requestOptions.maxRedirects, 0);\n            }\n            if (requestOptions.keepAlive != null) {\n                this._keepAlive = requestOptions.keepAlive;\n            }\n            if (requestOptions.allowRetries != null) {\n                this._allowRetries = requestOptions.allowRetries;\n            }\n            if (requestOptions.maxRetries != null) {\n                this._maxRetries = requestOptions.maxRetries;\n            }\n        }\n    }\n    options(requestUrl, additionalHeaders) {\n        return this.request('OPTIONS', requestUrl, null, additionalHeaders || {});\n    }\n    get(requestUrl, additionalHeaders) {\n        return this.request('GET', requestUrl, null, additionalHeaders || {});\n    }\n    
del(requestUrl, additionalHeaders) {\n        return this.request('DELETE', requestUrl, null, additionalHeaders || {});\n    }\n    post(requestUrl, data, additionalHeaders) {\n        return this.request('POST', requestUrl, data, additionalHeaders || {});\n    }\n    patch(requestUrl, data, additionalHeaders) {\n        return this.request('PATCH', requestUrl, data, additionalHeaders || {});\n    }\n    put(requestUrl, data, additionalHeaders) {\n        return this.request('PUT', requestUrl, data, additionalHeaders || {});\n    }\n    head(requestUrl, additionalHeaders) {\n        return this.request('HEAD', requestUrl, null, additionalHeaders || {});\n    }\n    sendStream(verb, requestUrl, stream, additionalHeaders) {\n        return this.request(verb, requestUrl, stream, additionalHeaders);\n    }\n    /**\n     * Gets a typed object from an endpoint\n     * Be aware that not found returns a null.  Other errors (4xx, 5xx) reject the promise\n     */\n    async getJson(requestUrl, additionalHeaders = {}) {\n        additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);\n        let res = await this.get(requestUrl, additionalHeaders);\n        return this._processResponse(res, this.requestOptions);\n    }\n    async postJson(requestUrl, obj, additionalHeaders = {}) {\n        let data = JSON.stringify(obj, null, 2);\n        additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);\n        additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);\n        let res = await this.post(requestUrl, data, additionalHeaders);\n        return this._processResponse(res, this.requestOptions);\n    }\n    async putJson(requestUrl, obj, additionalHeaders = {}) {\n        let data = JSON.stringify(obj, null, 2);\n        
additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);\n        additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);\n        let res = await this.put(requestUrl, data, additionalHeaders);\n        return this._processResponse(res, this.requestOptions);\n    }\n    async patchJson(requestUrl, obj, additionalHeaders = {}) {\n        let data = JSON.stringify(obj, null, 2);\n        additionalHeaders[Headers.Accept] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.Accept, MediaTypes.ApplicationJson);\n        additionalHeaders[Headers.ContentType] = this._getExistingOrDefaultHeader(additionalHeaders, Headers.ContentType, MediaTypes.ApplicationJson);\n        let res = await this.patch(requestUrl, data, additionalHeaders);\n        return this._processResponse(res, this.requestOptions);\n    }\n    /**\n     * Makes a raw http request.\n     * All other methods such as get, post, patch, and request ultimately call this.\n     * Prefer get, del, post and patch\n     */\n    async request(verb, requestUrl, data, headers) {\n        if (this._disposed) {\n            throw new Error('Client has already been disposed.');\n        }\n        let parsedUrl = url.parse(requestUrl);\n        let info = this._prepareRequest(verb, parsedUrl, headers);\n        // Only perform retries on reads since writes may not be idempotent.\n        let maxTries = this._allowRetries && RetryableHttpVerbs.indexOf(verb) != -1\n            ? 
this._maxRetries + 1\n            : 1;\n        let numTries = 0;\n        let response;\n        while (numTries < maxTries) {\n            response = await this.requestRaw(info, data);\n            // Check if it's an authentication challenge\n            if (response &&\n                response.message &&\n                response.message.statusCode === HttpCodes.Unauthorized) {\n                let authenticationHandler;\n                for (let i = 0; i < this.handlers.length; i++) {\n                    if (this.handlers[i].canHandleAuthentication(response)) {\n                        authenticationHandler = this.handlers[i];\n                        break;\n                    }\n                }\n                if (authenticationHandler) {\n                    return authenticationHandler.handleAuthentication(this, info, data);\n                }\n                else {\n                    // We have received an unauthorized response but have no handlers to handle it.\n                    // Let the response return to the caller.\n                    return response;\n                }\n            }\n            let redirectsRemaining = this._maxRedirects;\n            while (HttpRedirectCodes.indexOf(response.message.statusCode) != -1 &&\n                this._allowRedirects &&\n                redirectsRemaining > 0) {\n                const redirectUrl = response.message.headers['location'];\n                if (!redirectUrl) {\n                    // if there's no location to redirect to, we won't\n                    break;\n                }\n                let parsedRedirectUrl = url.parse(redirectUrl);\n                if (parsedUrl.protocol == 'https:' &&\n                    parsedUrl.protocol != parsedRedirectUrl.protocol &&\n                    !this._allowRedirectDowngrade) {\n                    throw new Error('Redirect from HTTPS to HTTP protocol. This downgrade is not allowed for security reasons. 
If you want to allow this behavior, set the allowRedirectDowngrade option to true.');\n                }\n                // we need to finish reading the response before reassigning response\n                // which will leak the open socket.\n                await response.readBody();\n                // strip authorization header if redirected to a different hostname\n                if (parsedRedirectUrl.hostname !== parsedUrl.hostname) {\n                    for (let header in headers) {\n                        // header names are case insensitive\n                        if (header.toLowerCase() === 'authorization') {\n                            delete headers[header];\n                        }\n                    }\n                }\n                // let's make the request with the new redirectUrl\n                info = this._prepareRequest(verb, parsedRedirectUrl, headers);\n                response = await this.requestRaw(info, data);\n                redirectsRemaining--;\n            }\n            if (HttpResponseRetryCodes.indexOf(response.message.statusCode) == -1) {\n                // If not a retry code, return immediately instead of retrying\n                return response;\n            }\n            numTries += 1;\n            if (numTries < maxTries) {\n                await response.readBody();\n                await this._performExponentialBackoff(numTries);\n            }\n        }\n        return response;\n    }\n    /**\n     * Needs to be called if keepAlive is set to true in request options.\n     */\n    dispose() {\n        if (this._agent) {\n            this._agent.destroy();\n        }\n        this._disposed = true;\n    }\n    /**\n     * Raw request.\n     * @param info\n     * @param data\n     */\n    requestRaw(info, data) {\n        return new Promise((resolve, reject) => {\n            let callbackForResult = function (err, res) {\n                if (err) {\n                    reject(err);\n                }\n  
              resolve(res);\n            };\n            this.requestRawWithCallback(info, data, callbackForResult);\n        });\n    }\n    /**\n     * Raw request with callback.\n     * @param info\n     * @param data\n     * @param onResult\n     */\n    requestRawWithCallback(info, data, onResult) {\n        let socket;\n        if (typeof data === 'string') {\n            info.options.headers['Content-Length'] = Buffer.byteLength(data, 'utf8');\n        }\n        let callbackCalled = false;\n        let handleResult = (err, res) => {\n            if (!callbackCalled) {\n                callbackCalled = true;\n                onResult(err, res);\n            }\n        };\n        let req = info.httpModule.request(info.options, (msg) => {\n            let res = new HttpClientResponse(msg);\n            handleResult(null, res);\n        });\n        req.on('socket', sock => {\n            socket = sock;\n        });\n        // If we ever get disconnected, we want the socket to timeout eventually\n        req.setTimeout(this._socketTimeout || 3 * 60000, () => {\n            if (socket) {\n                socket.end();\n            }\n            handleResult(new Error('Request timeout: ' + info.options.path), null);\n        });\n        req.on('error', function (err) {\n            // err has statusCode property\n            // res should have headers\n            handleResult(err, null);\n        });\n        if (data && typeof data === 'string') {\n            req.write(data, 'utf8');\n        }\n        if (data && typeof data !== 'string') {\n            data.on('close', function () {\n                req.end();\n            });\n            data.pipe(req);\n        }\n        else {\n            req.end();\n        }\n    }\n    /**\n     * Gets an http agent. 
This function is useful when you need an http agent that handles\n     * routing through a proxy server - depending upon the url and proxy environment variables.\n     * @param serverUrl  The server URL where the request will be sent. For example, https://api.github.com\n     */\n    getAgent(serverUrl) {\n        let parsedUrl = url.parse(serverUrl);\n        return this._getAgent(parsedUrl);\n    }\n    _prepareRequest(method, requestUrl, headers) {\n        const info = {};\n        info.parsedUrl = requestUrl;\n        const usingSsl = info.parsedUrl.protocol === 'https:';\n        info.httpModule = usingSsl ? https : http;\n        const defaultPort = usingSsl ? 443 : 80;\n        info.options = {};\n        info.options.host = info.parsedUrl.hostname;\n        info.options.port = info.parsedUrl.port\n            ? parseInt(info.parsedUrl.port)\n            : defaultPort;\n        info.options.path =\n            (info.parsedUrl.pathname || '') + (info.parsedUrl.search || '');\n        info.options.method = method;\n        info.options.headers = this._mergeHeaders(headers);\n        if (this.userAgent != null) {\n            info.options.headers['user-agent'] = this.userAgent;\n        }\n        info.options.agent = this._getAgent(info.parsedUrl);\n        // gives handlers an opportunity to participate\n        if (this.handlers) {\n            this.handlers.forEach(handler => {\n                handler.prepareRequest(info.options);\n            });\n        }\n        return info;\n    }\n    _mergeHeaders(headers) {\n        const lowercaseKeys = obj => Object.keys(obj).reduce((c, k) => ((c[k.toLowerCase()] = obj[k]), c), {});\n        if (this.requestOptions && this.requestOptions.headers) {\n            return Object.assign({}, lowercaseKeys(this.requestOptions.headers), lowercaseKeys(headers));\n        }\n        return lowercaseKeys(headers || {});\n    }\n    _getExistingOrDefaultHeader(additionalHeaders, header, _default) {\n        const 
lowercaseKeys = obj => Object.keys(obj).reduce((c, k) => ((c[k.toLowerCase()] = obj[k]), c), {});\n        let clientHeader;\n        if (this.requestOptions && this.requestOptions.headers) {\n            clientHeader = lowercaseKeys(this.requestOptions.headers)[header];\n        }\n        return additionalHeaders[header] || clientHeader || _default;\n    }\n    _getAgent(parsedUrl) {\n        let agent;\n        let proxyUrl = pm.getProxyUrl(parsedUrl);\n        let useProxy = proxyUrl && proxyUrl.hostname;\n        if (this._keepAlive && useProxy) {\n            agent = this._proxyAgent;\n        }\n        if (this._keepAlive && !useProxy) {\n            agent = this._agent;\n        }\n        // if agent is already assigned use that agent.\n        if (!!agent) {\n            return agent;\n        }\n        const usingSsl = parsedUrl.protocol === 'https:';\n        let maxSockets = 100;\n        if (!!this.requestOptions) {\n            maxSockets = this.requestOptions.maxSockets || http.globalAgent.maxSockets;\n        }\n        if (useProxy) {\n            // If using proxy, need tunnel\n            if (!tunnel) {\n                tunnel = __webpack_require__(4294);\n            }\n            const agentOptions = {\n                maxSockets: maxSockets,\n                keepAlive: this._keepAlive,\n                proxy: {\n                    proxyAuth: proxyUrl.auth,\n                    host: proxyUrl.hostname,\n                    port: proxyUrl.port\n                }\n            };\n            let tunnelAgent;\n            const overHttps = proxyUrl.protocol === 'https:';\n            if (usingSsl) {\n                tunnelAgent = overHttps ? tunnel.httpsOverHttps : tunnel.httpsOverHttp;\n            }\n            else {\n                tunnelAgent = overHttps ? 
tunnel.httpOverHttps : tunnel.httpOverHttp;\n            }\n            agent = tunnelAgent(agentOptions);\n            this._proxyAgent = agent;\n        }\n        // if reusing agent across request and tunneling agent isn't assigned create a new agent\n        if (this._keepAlive && !agent) {\n            const options = { keepAlive: this._keepAlive, maxSockets: maxSockets };\n            agent = usingSsl ? new https.Agent(options) : new http.Agent(options);\n            this._agent = agent;\n        }\n        // if not using private agent and tunnel agent isn't setup then use global agent\n        if (!agent) {\n            agent = usingSsl ? https.globalAgent : http.globalAgent;\n        }\n        if (usingSsl && this._ignoreSslError) {\n            // we don't want to set NODE_TLS_REJECT_UNAUTHORIZED=0 since that will affect request for entire process\n            // http.RequestOptions doesn't expose a way to modify RequestOptions.agent.options\n            // we have to cast it to any and change it directly\n            agent.options = Object.assign(agent.options || {}, {\n                rejectUnauthorized: false\n            });\n        }\n        return agent;\n    }\n    _performExponentialBackoff(retryNumber) {\n        retryNumber = Math.min(ExponentialBackoffCeiling, retryNumber);\n        const ms = ExponentialBackoffTimeSlice * Math.pow(2, retryNumber);\n        return new Promise(resolve => setTimeout(() => resolve(), ms));\n    }\n    static dateTimeDeserializer(key, value) {\n        if (typeof value === 'string') {\n            let a = new Date(value);\n            if (!isNaN(a.valueOf())) {\n                return a;\n            }\n        }\n        return value;\n    }\n    async _processResponse(res, options) {\n        return new Promise(async (resolve, reject) => {\n            const statusCode = res.message.statusCode;\n            const response = {\n                statusCode: statusCode,\n                result: null,\n            
    headers: {}\n            };\n            // not found leads to null obj returned\n            if (statusCode == HttpCodes.NotFound) {\n                resolve(response);\n            }\n            let obj;\n            let contents;\n            // get the result from the body\n            try {\n                contents = await res.readBody();\n                if (contents && contents.length > 0) {\n                    if (options && options.deserializeDates) {\n                        obj = JSON.parse(contents, HttpClient.dateTimeDeserializer);\n                    }\n                    else {\n                        obj = JSON.parse(contents);\n                    }\n                    response.result = obj;\n                }\n                response.headers = res.message.headers;\n            }\n            catch (err) {\n                // Invalid resource (contents not json);  leaving result obj null\n            }\n            // note that 3xx redirects are handled by the http layer.\n            if (statusCode > 299) {\n                let msg;\n                // if exception/error in body, attempt to get better error\n                if (obj && obj.message) {\n                    msg = obj.message;\n                }\n                else if (contents && contents.length > 0) {\n                    // it may be the case that the exception is in the body message as string\n                    msg = contents;\n                }\n                else {\n                    msg = 'Failed request: (' + statusCode + ')';\n                }\n                let err = new Error(msg);\n                // attach statusCode and body obj (if available) to the error object\n                err['statusCode'] = statusCode;\n                if (response.result) {\n                    err['result'] = response.result;\n                }\n                reject(err);\n            }\n            else {\n                resolve(response);\n            }\n        
});\n    }\n}\nexports.HttpClient = HttpClient;\n\n\n/***/ }),\n\n/***/ 6443:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nconst url = __webpack_require__(8835);\nfunction getProxyUrl(reqUrl) {\n    let usingSsl = reqUrl.protocol === 'https:';\n    let proxyUrl;\n    if (checkBypass(reqUrl)) {\n        return proxyUrl;\n    }\n    let proxyVar;\n    if (usingSsl) {\n        proxyVar = process.env['https_proxy'] || process.env['HTTPS_PROXY'];\n    }\n    else {\n        proxyVar = process.env['http_proxy'] || process.env['HTTP_PROXY'];\n    }\n    if (proxyVar) {\n        proxyUrl = url.parse(proxyVar);\n    }\n    return proxyUrl;\n}\nexports.getProxyUrl = getProxyUrl;\nfunction checkBypass(reqUrl) {\n    if (!reqUrl.hostname) {\n        return false;\n    }\n    let noProxy = process.env['no_proxy'] || process.env['NO_PROXY'] || '';\n    if (!noProxy) {\n        return false;\n    }\n    // Determine the request port\n    let reqPort;\n    if (reqUrl.port) {\n        reqPort = Number(reqUrl.port);\n    }\n    else if (reqUrl.protocol === 'http:') {\n        reqPort = 80;\n    }\n    else if (reqUrl.protocol === 'https:') {\n        reqPort = 443;\n    }\n    // Format the request hostname and hostname with port\n    let upperReqHosts = [reqUrl.hostname.toUpperCase()];\n    if (typeof reqPort === 'number') {\n        upperReqHosts.push(`${upperReqHosts[0]}:${reqPort}`);\n    }\n    // Compare request host against noproxy\n    for (let upperNoProxyItem of noProxy\n        .split(',')\n        .map(x => x.trim().toUpperCase())\n        .filter(x => x)) {\n        if (upperReqHosts.some(x => x === upperNoProxyItem)) {\n            return true;\n        }\n    }\n    return false;\n}\nexports.checkBypass = checkBypass;\n\n\n/***/ }),\n\n/***/ 334:\n/***/ ((__unused_webpack_module, exports) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, 
\"__esModule\", ({ value: true }));\n\nasync function auth(token) {\n  const tokenType = token.split(/\\./).length === 3 ? \"app\" : /^v\\d+\\./.test(token) ? \"installation\" : \"oauth\";\n  return {\n    type: \"token\",\n    token: token,\n    tokenType\n  };\n}\n\n/**\n * Prefix token for usage in the Authorization header\n *\n * @param token OAuth token or JSON Web Token\n */\nfunction withAuthorizationPrefix(token) {\n  if (token.split(/\\./).length === 3) {\n    return `bearer ${token}`;\n  }\n\n  return `token ${token}`;\n}\n\nasync function hook(token, request, route, parameters) {\n  const endpoint = request.endpoint.merge(route, parameters);\n  endpoint.headers.authorization = withAuthorizationPrefix(token);\n  return request(endpoint);\n}\n\nconst createTokenAuth = function createTokenAuth(token) {\n  if (!token) {\n    throw new Error(\"[@octokit/auth-token] No token passed to createTokenAuth\");\n  }\n\n  if (typeof token !== \"string\") {\n    throw new Error(\"[@octokit/auth-token] Token passed to createTokenAuth is not a string\");\n  }\n\n  token = token.replace(/^(token|bearer) +/i, \"\");\n  return Object.assign(auth.bind(null, token), {\n    hook: hook.bind(null, token)\n  });\n};\n\nexports.createTokenAuth = createTokenAuth;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 6762:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nvar universalUserAgent = __webpack_require__(5030);\nvar beforeAfterHook = __webpack_require__(3682);\nvar request = __webpack_require__(6234);\nvar graphql = __webpack_require__(8467);\nvar authToken = __webpack_require__(334);\n\nfunction _defineProperty(obj, key, value) {\n  if (key in obj) {\n    Object.defineProperty(obj, key, {\n      value: value,\n      enumerable: true,\n      configurable: true,\n      writable: true\n    });\n  } else {\n    obj[key] = value;\n  }\n\n  return 
obj;\n}\n\nfunction ownKeys(object, enumerableOnly) {\n  var keys = Object.keys(object);\n\n  if (Object.getOwnPropertySymbols) {\n    var symbols = Object.getOwnPropertySymbols(object);\n    if (enumerableOnly) symbols = symbols.filter(function (sym) {\n      return Object.getOwnPropertyDescriptor(object, sym).enumerable;\n    });\n    keys.push.apply(keys, symbols);\n  }\n\n  return keys;\n}\n\nfunction _objectSpread2(target) {\n  for (var i = 1; i < arguments.length; i++) {\n    var source = arguments[i] != null ? arguments[i] : {};\n\n    if (i % 2) {\n      ownKeys(Object(source), true).forEach(function (key) {\n        _defineProperty(target, key, source[key]);\n      });\n    } else if (Object.getOwnPropertyDescriptors) {\n      Object.defineProperties(target, Object.getOwnPropertyDescriptors(source));\n    } else {\n      ownKeys(Object(source)).forEach(function (key) {\n        Object.defineProperty(target, key, Object.getOwnPropertyDescriptor(source, key));\n      });\n    }\n  }\n\n  return target;\n}\n\nconst VERSION = \"3.1.0\";\n\nclass Octokit {\n  constructor(options = {}) {\n    const hook = new beforeAfterHook.Collection();\n    const requestDefaults = {\n      baseUrl: request.request.endpoint.DEFAULTS.baseUrl,\n      headers: {},\n      request: Object.assign({}, options.request, {\n        hook: hook.bind(null, \"request\")\n      }),\n      mediaType: {\n        previews: [],\n        format: \"\"\n      }\n    }; // prepend default user agent with `options.userAgent` if set\n\n    requestDefaults.headers[\"user-agent\"] = [options.userAgent, `octokit-core.js/${VERSION} ${universalUserAgent.getUserAgent()}`].filter(Boolean).join(\" \");\n\n    if (options.baseUrl) {\n      requestDefaults.baseUrl = options.baseUrl;\n    }\n\n    if (options.previews) {\n      requestDefaults.mediaType.previews = options.previews;\n    }\n\n    if (options.timeZone) {\n      requestDefaults.headers[\"time-zone\"] = options.timeZone;\n    }\n\n    this.request = 
request.request.defaults(requestDefaults);\n    this.graphql = graphql.withCustomRequest(this.request).defaults(_objectSpread2(_objectSpread2({}, requestDefaults), {}, {\n      baseUrl: requestDefaults.baseUrl.replace(/\\/api\\/v3$/, \"/api\")\n    }));\n    this.log = Object.assign({\n      debug: () => {},\n      info: () => {},\n      warn: console.warn.bind(console),\n      error: console.error.bind(console)\n    }, options.log);\n    this.hook = hook; // (1) If neither `options.authStrategy` nor `options.auth` is set, the `octokit` instance\n    //     is unauthenticated. The `this.auth()` method is a no-op and no request hook is registered.\n    // (2) If only `options.auth` is set, use the default token authentication strategy.\n    // (3) If `options.authStrategy` is set then use it and pass in `options.auth`. Always pass own request as many strategies accept a custom request instance.\n    // TODO: type `options.auth` based on `options.authStrategy`.\n\n    if (!options.authStrategy) {\n      if (!options.auth) {\n        // (1)\n        this.auth = async () => ({\n          type: \"unauthenticated\"\n        });\n      } else {\n        // (2)\n        const auth = authToken.createTokenAuth(options.auth); // @ts-ignore  ¯\\_(ツ)_/¯\n\n        hook.wrap(\"request\", auth.hook);\n        this.auth = auth;\n      }\n    } else {\n      const auth = options.authStrategy(Object.assign({\n        request: this.request\n      }, options.auth)); // @ts-ignore  ¯\\_(ツ)_/¯\n\n      hook.wrap(\"request\", auth.hook);\n      this.auth = auth;\n    } // apply plugins\n    // https://stackoverflow.com/a/16345172\n\n\n    const classConstructor = this.constructor;\n    classConstructor.plugins.forEach(plugin => {\n      Object.assign(this, plugin(this, options));\n    });\n  }\n\n  static defaults(defaults) {\n    const OctokitWithDefaults = class extends this {\n      constructor(...args) {\n        const options = args[0] || {};\n\n        if (typeof defaults === 
\"function\") {\n          super(defaults(options));\n          return;\n        }\n\n        super(Object.assign({}, defaults, options, options.userAgent && defaults.userAgent ? {\n          userAgent: `${options.userAgent} ${defaults.userAgent}`\n        } : null));\n      }\n\n    };\n    return OctokitWithDefaults;\n  }\n  /**\n   * Attach a plugin (or many) to your Octokit instance.\n   *\n   * @example\n   * const API = Octokit.plugin(plugin1, plugin2, plugin3, ...)\n   */\n\n\n  static plugin(...newPlugins) {\n    var _a;\n\n    const currentPlugins = this.plugins;\n    const NewOctokit = (_a = class extends this {}, _a.plugins = currentPlugins.concat(newPlugins.filter(plugin => !currentPlugins.includes(plugin))), _a);\n    return NewOctokit;\n  }\n\n}\nOctokit.VERSION = VERSION;\nOctokit.plugins = [];\n\nexports.Octokit = Octokit;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 9440:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nfunction _interopDefault (ex) { return (ex && (typeof ex === 'object') && 'default' in ex) ? 
ex['default'] : ex; }\n\nvar isPlainObject = _interopDefault(__webpack_require__(8840));\nvar universalUserAgent = __webpack_require__(5030);\n\nfunction lowercaseKeys(object) {\n  if (!object) {\n    return {};\n  }\n\n  return Object.keys(object).reduce((newObj, key) => {\n    newObj[key.toLowerCase()] = object[key];\n    return newObj;\n  }, {});\n}\n\nfunction mergeDeep(defaults, options) {\n  const result = Object.assign({}, defaults);\n  Object.keys(options).forEach(key => {\n    if (isPlainObject(options[key])) {\n      if (!(key in defaults)) Object.assign(result, {\n        [key]: options[key]\n      });else result[key] = mergeDeep(defaults[key], options[key]);\n    } else {\n      Object.assign(result, {\n        [key]: options[key]\n      });\n    }\n  });\n  return result;\n}\n\nfunction merge(defaults, route, options) {\n  if (typeof route === \"string\") {\n    let [method, url] = route.split(\" \");\n    options = Object.assign(url ? {\n      method,\n      url\n    } : {\n      url: method\n    }, options);\n  } else {\n    options = Object.assign({}, route);\n  } // lowercase header names before merging with defaults to avoid duplicates\n\n\n  options.headers = lowercaseKeys(options.headers);\n  const mergedOptions = mergeDeep(defaults || {}, options); // mediaType.previews arrays are merged, instead of overwritten\n\n  if (defaults && defaults.mediaType.previews.length) {\n    mergedOptions.mediaType.previews = defaults.mediaType.previews.filter(preview => !mergedOptions.mediaType.previews.includes(preview)).concat(mergedOptions.mediaType.previews);\n  }\n\n  mergedOptions.mediaType.previews = mergedOptions.mediaType.previews.map(preview => preview.replace(/-preview/, \"\"));\n  return mergedOptions;\n}\n\nfunction addQueryParameters(url, parameters) {\n  const separator = /\\?/.test(url) ? 
\"&\" : \"?\";\n  const names = Object.keys(parameters);\n\n  if (names.length === 0) {\n    return url;\n  }\n\n  return url + separator + names.map(name => {\n    if (name === \"q\") {\n      return \"q=\" + parameters.q.split(\"+\").map(encodeURIComponent).join(\"+\");\n    }\n\n    return `${name}=${encodeURIComponent(parameters[name])}`;\n  }).join(\"&\");\n}\n\nconst urlVariableRegex = /\\{[^}]+\\}/g;\n\nfunction removeNonChars(variableName) {\n  return variableName.replace(/^\\W+|\\W+$/g, \"\").split(/,/);\n}\n\nfunction extractUrlVariableNames(url) {\n  const matches = url.match(urlVariableRegex);\n\n  if (!matches) {\n    return [];\n  }\n\n  return matches.map(removeNonChars).reduce((a, b) => a.concat(b), []);\n}\n\nfunction omit(object, keysToOmit) {\n  return Object.keys(object).filter(option => !keysToOmit.includes(option)).reduce((obj, key) => {\n    obj[key] = object[key];\n    return obj;\n  }, {});\n}\n\n// Based on https://github.com/bramstein/url-template, licensed under BSD\n// TODO: create separate package.\n//\n// Copyright (c) 2012-2014, Bram Stein\n// All rights reserved.\n// Redistribution and use in source and binary forms, with or without\n// modification, are permitted provided that the following conditions\n// are met:\n//  1. Redistributions of source code must retain the above copyright\n//     notice, this list of conditions and the following disclaimer.\n//  2. Redistributions in binary form must reproduce the above copyright\n//     notice, this list of conditions and the following disclaimer in the\n//     documentation and/or other materials provided with the distribution.\n//  3. 
The name of the author may not be used to endorse or promote products\n//     derived from this software without specific prior written permission.\n// THIS SOFTWARE IS PROVIDED BY THE AUTHOR \"AS IS\" AND ANY EXPRESS OR IMPLIED\n// WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF\n// MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO\n// EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,\n// INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,\n// BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\n// DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY\n// OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING\n// NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,\n// EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\n/* istanbul ignore file */\nfunction encodeReserved(str) {\n  return str.split(/(%[0-9A-Fa-f]{2})/g).map(function (part) {\n    if (!/%[0-9A-Fa-f]/.test(part)) {\n      part = encodeURI(part).replace(/%5B/g, \"[\").replace(/%5D/g, \"]\");\n    }\n\n    return part;\n  }).join(\"\");\n}\n\nfunction encodeUnreserved(str) {\n  return encodeURIComponent(str).replace(/[!'()*]/g, function (c) {\n    return \"%\" + c.charCodeAt(0).toString(16).toUpperCase();\n  });\n}\n\nfunction encodeValue(operator, value, key) {\n  value = operator === \"+\" || operator === \"#\" ? 
encodeReserved(value) : encodeUnreserved(value);\n\n  if (key) {\n    return encodeUnreserved(key) + \"=\" + value;\n  } else {\n    return value;\n  }\n}\n\nfunction isDefined(value) {\n  return value !== undefined && value !== null;\n}\n\nfunction isKeyOperator(operator) {\n  return operator === \";\" || operator === \"&\" || operator === \"?\";\n}\n\nfunction getValues(context, operator, key, modifier) {\n  var value = context[key],\n      result = [];\n\n  if (isDefined(value) && value !== \"\") {\n    if (typeof value === \"string\" || typeof value === \"number\" || typeof value === \"boolean\") {\n      value = value.toString();\n\n      if (modifier && modifier !== \"*\") {\n        value = value.substring(0, parseInt(modifier, 10));\n      }\n\n      result.push(encodeValue(operator, value, isKeyOperator(operator) ? key : \"\"));\n    } else {\n      if (modifier === \"*\") {\n        if (Array.isArray(value)) {\n          value.filter(isDefined).forEach(function (value) {\n            result.push(encodeValue(operator, value, isKeyOperator(operator) ? 
key : \"\"));\n          });\n        } else {\n          Object.keys(value).forEach(function (k) {\n            if (isDefined(value[k])) {\n              result.push(encodeValue(operator, value[k], k));\n            }\n          });\n        }\n      } else {\n        const tmp = [];\n\n        if (Array.isArray(value)) {\n          value.filter(isDefined).forEach(function (value) {\n            tmp.push(encodeValue(operator, value));\n          });\n        } else {\n          Object.keys(value).forEach(function (k) {\n            if (isDefined(value[k])) {\n              tmp.push(encodeUnreserved(k));\n              tmp.push(encodeValue(operator, value[k].toString()));\n            }\n          });\n        }\n\n        if (isKeyOperator(operator)) {\n          result.push(encodeUnreserved(key) + \"=\" + tmp.join(\",\"));\n        } else if (tmp.length !== 0) {\n          result.push(tmp.join(\",\"));\n        }\n      }\n    }\n  } else {\n    if (operator === \";\") {\n      if (isDefined(value)) {\n        result.push(encodeUnreserved(key));\n      }\n    } else if (value === \"\" && (operator === \"&\" || operator === \"?\")) {\n      result.push(encodeUnreserved(key) + \"=\");\n    } else if (value === \"\") {\n      result.push(\"\");\n    }\n  }\n\n  return result;\n}\n\nfunction parseUrl(template) {\n  return {\n    expand: expand.bind(null, template)\n  };\n}\n\nfunction expand(template, context) {\n  var operators = [\"+\", \"#\", \".\", \"/\", \";\", \"?\", \"&\"];\n  return template.replace(/\\{([^\\{\\}]+)\\}|([^\\{\\}]+)/g, function (_, expression, literal) {\n    if (expression) {\n      let operator = \"\";\n      const values = [];\n\n      if (operators.indexOf(expression.charAt(0)) !== -1) {\n        operator = expression.charAt(0);\n        expression = expression.substr(1);\n      }\n\n      expression.split(/,/g).forEach(function (variable) {\n        var tmp = /([^:\\*]*)(?::(\\d+)|(\\*))?/.exec(variable);\n        
values.push(getValues(context, operator, tmp[1], tmp[2] || tmp[3]));\n      });\n\n      if (operator && operator !== \"+\") {\n        var separator = \",\";\n\n        if (operator === \"?\") {\n          separator = \"&\";\n        } else if (operator !== \"#\") {\n          separator = operator;\n        }\n\n        return (values.length !== 0 ? operator : \"\") + values.join(separator);\n      } else {\n        return values.join(\",\");\n      }\n    } else {\n      return encodeReserved(literal);\n    }\n  });\n}\n\nfunction parse(options) {\n  // https://fetch.spec.whatwg.org/#methods\n  let method = options.method.toUpperCase(); // replace :varname with {varname} to make it RFC 6570 compatible\n\n  let url = (options.url || \"/\").replace(/:([a-z]\\w+)/g, \"{+$1}\");\n  let headers = Object.assign({}, options.headers);\n  let body;\n  let parameters = omit(options, [\"method\", \"baseUrl\", \"url\", \"headers\", \"request\", \"mediaType\"]); // extract variable names from URL to calculate remaining variables later\n\n  const urlVariableNames = extractUrlVariableNames(url);\n  url = parseUrl(url).expand(parameters);\n\n  if (!/^http/.test(url)) {\n    url = options.baseUrl + url;\n  }\n\n  const omittedParameters = Object.keys(options).filter(option => urlVariableNames.includes(option)).concat(\"baseUrl\");\n  const remainingParameters = omit(parameters, omittedParameters);\n  const isBinaryRequest = /application\\/octet-stream/i.test(headers.accept);\n\n  if (!isBinaryRequest) {\n    if (options.mediaType.format) {\n      // e.g. 
application/vnd.github.v3+json => application/vnd.github.v3.raw\n      headers.accept = headers.accept.split(/,/).map(preview => preview.replace(/application\/vnd(\.\w+)(\.v3)?(\.\w+)?(\+json)?$/, `application/vnd$1$2.${options.mediaType.format}`)).join(\",\");\n    }\n\n    if (options.mediaType.previews.length) {\n      const previewsFromAcceptHeader = headers.accept.match(/[\w-]+(?=-preview)/g) || [];\n      headers.accept = previewsFromAcceptHeader.concat(options.mediaType.previews).map(preview => {\n        const format = options.mediaType.format ? `.${options.mediaType.format}` : \"+json\";\n        return `application/vnd.github.${preview}-preview${format}`;\n      }).join(\",\");\n    }\n  } // for GET/HEAD requests, set URL query parameters from remaining parameters\n  // for PATCH/POST/PUT/DELETE requests, set request body from remaining parameters\n\n\n  if ([\"GET\", \"HEAD\"].includes(method)) {\n    url = addQueryParameters(url, remainingParameters);\n  } else {\n    if (\"data\" in remainingParameters) {\n      body = remainingParameters.data;\n    } else {\n      if (Object.keys(remainingParameters).length) {\n        body = remainingParameters;\n      } else {\n        headers[\"content-length\"] = 0;\n      }\n    }\n  } // default content-type for JSON if body is set\n\n\n  if (!headers[\"content-type\"] && typeof body !== \"undefined\") {\n    headers[\"content-type\"] = \"application/json; charset=utf-8\";\n  } // GitHub expects 'content-length: 0' header for PUT/PATCH requests without body.\n  // fetch does not allow setting the `content-length` header, but we can set body to an empty string\n\n\n  if ([\"PATCH\", \"PUT\"].includes(method) && typeof body === \"undefined\") {\n    body = \"\";\n  } // Only return body/request keys if present\n\n\n  return Object.assign({\n    method,\n    url,\n    headers\n  }, typeof body !== \"undefined\" ? {\n    body\n  } : null, options.request ? 
{\n    request: options.request\n  } : null);\n}\n\nfunction endpointWithDefaults(defaults, route, options) {\n  return parse(merge(defaults, route, options));\n}\n\nfunction withDefaults(oldDefaults, newDefaults) {\n  const DEFAULTS = merge(oldDefaults, newDefaults);\n  const endpoint = endpointWithDefaults.bind(null, DEFAULTS);\n  return Object.assign(endpoint, {\n    DEFAULTS,\n    defaults: withDefaults.bind(null, DEFAULTS),\n    merge: merge.bind(null, DEFAULTS),\n    parse\n  });\n}\n\nconst VERSION = \"6.0.3\";\n\nconst userAgent = `octokit-endpoint.js/${VERSION} ${universalUserAgent.getUserAgent()}`; // DEFAULTS has all properties set that EndpointOptions has, except url.\n// So we use RequestParameters and add method as additional required property.\n\nconst DEFAULTS = {\n  method: \"GET\",\n  baseUrl: \"https://api.github.com\",\n  headers: {\n    accept: \"application/vnd.github.v3+json\",\n    \"user-agent\": userAgent\n  },\n  mediaType: {\n    format: \"\",\n    previews: []\n  }\n};\n\nconst endpoint = withDefaults(null, DEFAULTS);\n\nexports.endpoint = endpoint;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 8467:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nvar request = __webpack_require__(6234);\nvar universalUserAgent = __webpack_require__(5030);\n\nconst VERSION = \"4.5.1\";\n\nclass GraphqlError extends Error {\n  constructor(request, response) {\n    const message = response.data.errors[0].message;\n    super(message);\n    Object.assign(this, response.data);\n    this.name = \"GraphqlError\";\n    this.request = request; // Maintains proper stack trace (only available on V8)\n\n    /* istanbul ignore next */\n\n    if (Error.captureStackTrace) {\n      Error.captureStackTrace(this, this.constructor);\n    }\n  }\n\n}\n\nconst NON_VARIABLE_OPTIONS = [\"method\", \"baseUrl\", \"url\", \"headers\", \"request\", 
\"query\", \"mediaType\"];\nfunction graphql(request, query, options) {\n  options = typeof query === \"string\" ? options = Object.assign({\n    query\n  }, options) : options = query;\n  const requestOptions = Object.keys(options).reduce((result, key) => {\n    if (NON_VARIABLE_OPTIONS.includes(key)) {\n      result[key] = options[key];\n      return result;\n    }\n\n    if (!result.variables) {\n      result.variables = {};\n    }\n\n    result.variables[key] = options[key];\n    return result;\n  }, {});\n  return request(requestOptions).then(response => {\n    if (response.data.errors) {\n      throw new GraphqlError(requestOptions, {\n        data: response.data\n      });\n    }\n\n    return response.data.data;\n  });\n}\n\nfunction withDefaults(request$1, newDefaults) {\n  const newRequest = request$1.defaults(newDefaults);\n\n  const newApi = (query, options) => {\n    return graphql(newRequest, query, options);\n  };\n\n  return Object.assign(newApi, {\n    defaults: withDefaults.bind(null, newRequest),\n    endpoint: request.request.endpoint\n  });\n}\n\nconst graphql$1 = withDefaults(request.request, {\n  headers: {\n    \"user-agent\": `octokit-graphql.js/${VERSION} ${universalUserAgent.getUserAgent()}`\n  },\n  method: \"POST\",\n  url: \"/graphql\"\n});\nfunction withCustomRequest(customRequest) {\n  return withDefaults(customRequest, {\n    method: \"POST\",\n    url: \"/graphql\"\n  });\n}\n\nexports.graphql = graphql$1;\nexports.withCustomRequest = withCustomRequest;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 4193:\n/***/ ((__unused_webpack_module, exports) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nconst VERSION = \"2.2.3\";\n\n/**\n * Some “list” response that can be paginated have a different response structure\n *\n * They have a `total_count` key in the response (search also has `incomplete_results`,\n * /installation/repositories also has `repository_selection`), as 
well as a key with\n * the list of the items whose name varies from endpoint to endpoint.\n *\n * Octokit normalizes these responses so that paginated results are always returned following\n * the same structure. One challenge is that if the list response has only one page, no Link\n * header is provided, so this header alone is not sufficient to check whether a response is\n * paginated or not.\n *\n * We check if a \"total_count\" key is present in the response data, but also make sure that\n * a \"url\" property is not, as the \"Get the combined status for a specific ref\" endpoint would\n * otherwise match: https://developer.github.com/v3/repos/statuses/#get-the-combined-status-for-a-specific-ref\n */\nfunction normalizePaginatedListResponse(response) {\n  const responseNeedsNormalization = \"total_count\" in response.data && !(\"url\" in response.data);\n  if (!responseNeedsNormalization) return response; // keep the additional properties intact as there is currently no other way\n  // to retrieve the same information.\n\n  const incompleteResults = response.data.incomplete_results;\n  const repositorySelection = response.data.repository_selection;\n  const totalCount = response.data.total_count;\n  delete response.data.incomplete_results;\n  delete response.data.repository_selection;\n  delete response.data.total_count;\n  const namespaceKey = Object.keys(response.data)[0];\n  const data = response.data[namespaceKey];\n  response.data = data;\n\n  if (typeof incompleteResults !== \"undefined\") {\n    response.data.incomplete_results = incompleteResults;\n  }\n\n  if (typeof repositorySelection !== \"undefined\") {\n    response.data.repository_selection = repositorySelection;\n  }\n\n  response.data.total_count = totalCount;\n  return response;\n}\n\nfunction iterator(octokit, route, parameters) {\n  const options = typeof route === \"function\" ? 
route.endpoint(parameters) : octokit.request.endpoint(route, parameters);\n  const requestMethod = typeof route === \"function\" ? route : octokit.request;\n  const method = options.method;\n  const headers = options.headers;\n  let url = options.url;\n  return {\n    [Symbol.asyncIterator]: () => ({\n      next() {\n        if (!url) {\n          return Promise.resolve({\n            done: true\n          });\n        }\n\n        return requestMethod({\n          method,\n          url,\n          headers\n        }).then(normalizePaginatedListResponse).then(response => {\n          // `response.headers.link` format:\n          // '<https://api.github.com/users/aseemk/followers?page=2>; rel=\"next\", <https://api.github.com/users/aseemk/followers?page=2>; rel=\"last\"'\n          // sets `url` to undefined if \"next\" URL is not present or `link` header is not set\n          url = ((response.headers.link || \"\").match(/<([^>]+)>;\\s*rel=\"next\"/) || [])[1];\n          return {\n            value: response\n          };\n        });\n      }\n\n    })\n  };\n}\n\nfunction paginate(octokit, route, parameters, mapFn) {\n  if (typeof parameters === \"function\") {\n    mapFn = parameters;\n    parameters = undefined;\n  }\n\n  return gather(octokit, [], iterator(octokit, route, parameters)[Symbol.asyncIterator](), mapFn);\n}\n\nfunction gather(octokit, results, iterator, mapFn) {\n  return iterator.next().then(result => {\n    if (result.done) {\n      return results;\n    }\n\n    let earlyExit = false;\n\n    function done() {\n      earlyExit = true;\n    }\n\n    results = results.concat(mapFn ? 
mapFn(result.value, done) : result.value.data);\n\n    if (earlyExit) {\n      return results;\n    }\n\n    return gather(octokit, results, iterator, mapFn);\n  });\n}\n\n/**\n * @param octokit Octokit instance\n * @param options Options passed to Octokit constructor\n */\n\nfunction paginateRest(octokit) {\n  return {\n    paginate: Object.assign(paginate.bind(null, octokit), {\n      iterator: iterator.bind(null, octokit)\n    })\n  };\n}\npaginateRest.VERSION = VERSION;\n\nexports.paginateRest = paginateRest;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 3044:\n/***/ ((__unused_webpack_module, exports) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nconst Endpoints = {\n  actions: {\n    addSelectedRepoToOrgSecret: [\"PUT /orgs/{org}/actions/secrets/{secret_name}/repositories/{repository_id}\"],\n    cancelWorkflowRun: [\"POST /repos/{owner}/{repo}/actions/runs/{run_id}/cancel\"],\n    createOrUpdateOrgSecret: [\"PUT /orgs/{org}/actions/secrets/{secret_name}\"],\n    createOrUpdateRepoSecret: [\"PUT /repos/{owner}/{repo}/actions/secrets/{secret_name}\"],\n    createRegistrationTokenForOrg: [\"POST /orgs/{org}/actions/runners/registration-token\"],\n    createRegistrationTokenForRepo: [\"POST /repos/{owner}/{repo}/actions/runners/registration-token\"],\n    createRemoveTokenForOrg: [\"POST /orgs/{org}/actions/runners/remove-token\"],\n    createRemoveTokenForRepo: [\"POST /repos/{owner}/{repo}/actions/runners/remove-token\"],\n    deleteArtifact: [\"DELETE /repos/{owner}/{repo}/actions/artifacts/{artifact_id}\"],\n    deleteOrgSecret: [\"DELETE /orgs/{org}/actions/secrets/{secret_name}\"],\n    deleteRepoSecret: [\"DELETE /repos/{owner}/{repo}/actions/secrets/{secret_name}\"],\n    deleteSelfHostedRunnerFromOrg: [\"DELETE /orgs/{org}/actions/runners/{runner_id}\"],\n    deleteSelfHostedRunnerFromRepo: [\"DELETE /repos/{owner}/{repo}/actions/runners/{runner_id}\"],\n    deleteWorkflowRunLogs: 
[\"DELETE /repos/{owner}/{repo}/actions/runs/{run_id}/logs\"],\n    downloadArtifact: [\"GET /repos/{owner}/{repo}/actions/artifacts/{artifact_id}/{archive_format}\"],\n    downloadJobLogsForWorkflowRun: [\"GET /repos/{owner}/{repo}/actions/jobs/{job_id}/logs\"],\n    downloadWorkflowRunLogs: [\"GET /repos/{owner}/{repo}/actions/runs/{run_id}/logs\"],\n    getArtifact: [\"GET /repos/{owner}/{repo}/actions/artifacts/{artifact_id}\"],\n    getJobForWorkflowRun: [\"GET /repos/{owner}/{repo}/actions/jobs/{job_id}\"],\n    getOrgPublicKey: [\"GET /orgs/{org}/actions/secrets/public-key\"],\n    getOrgSecret: [\"GET /orgs/{org}/actions/secrets/{secret_name}\"],\n    getRepoPublicKey: [\"GET /repos/{owner}/{repo}/actions/secrets/public-key\"],\n    getRepoSecret: [\"GET /repos/{owner}/{repo}/actions/secrets/{secret_name}\"],\n    getSelfHostedRunnerForOrg: [\"GET /orgs/{org}/actions/runners/{runner_id}\"],\n    getSelfHostedRunnerForRepo: [\"GET /repos/{owner}/{repo}/actions/runners/{runner_id}\"],\n    getWorkflow: [\"GET /repos/{owner}/{repo}/actions/workflows/{workflow_id}\"],\n    getWorkflowRun: [\"GET /repos/{owner}/{repo}/actions/runs/{run_id}\"],\n    getWorkflowRunUsage: [\"GET /repos/{owner}/{repo}/actions/runs/{run_id}/timing\"],\n    getWorkflowUsage: [\"GET /repos/{owner}/{repo}/actions/workflows/{workflow_id}/timing\"],\n    listArtifactsForRepo: [\"GET /repos/{owner}/{repo}/actions/artifacts\"],\n    listJobsForWorkflowRun: [\"GET /repos/{owner}/{repo}/actions/runs/{run_id}/jobs\"],\n    listOrgSecrets: [\"GET /orgs/{org}/actions/secrets\"],\n    listRepoSecrets: [\"GET /repos/{owner}/{repo}/actions/secrets\"],\n    listRepoWorkflows: [\"GET /repos/{owner}/{repo}/actions/workflows\"],\n    listRunnerApplicationsForOrg: [\"GET /orgs/{org}/actions/runners/downloads\"],\n    listRunnerApplicationsForRepo: [\"GET /repos/{owner}/{repo}/actions/runners/downloads\"],\n    listSelectedReposForOrgSecret: [\"GET 
/orgs/{org}/actions/secrets/{secret_name}/repositories\"],\n    listSelfHostedRunnersForOrg: [\"GET /orgs/{org}/actions/runners\"],\n    listSelfHostedRunnersForRepo: [\"GET /repos/{owner}/{repo}/actions/runners\"],\n    listWorkflowRunArtifacts: [\"GET /repos/{owner}/{repo}/actions/runs/{run_id}/artifacts\"],\n    listWorkflowRuns: [\"GET /repos/{owner}/{repo}/actions/workflows/{workflow_id}/runs\"],\n    listWorkflowRunsForRepo: [\"GET /repos/{owner}/{repo}/actions/runs\"],\n    reRunWorkflow: [\"POST /repos/{owner}/{repo}/actions/runs/{run_id}/rerun\"],\n    removeSelectedRepoFromOrgSecret: [\"DELETE /orgs/{org}/actions/secrets/{secret_name}/repositories/{repository_id}\"],\n    setSelectedReposForOrgSecret: [\"PUT /orgs/{org}/actions/secrets/{secret_name}/repositories\"]\n  },\n  activity: {\n    checkRepoIsStarredByAuthenticatedUser: [\"GET /user/starred/{owner}/{repo}\"],\n    deleteRepoSubscription: [\"DELETE /repos/{owner}/{repo}/subscription\"],\n    deleteThreadSubscription: [\"DELETE /notifications/threads/{thread_id}/subscription\"],\n    getFeeds: [\"GET /feeds\"],\n    getRepoSubscription: [\"GET /repos/{owner}/{repo}/subscription\"],\n    getThread: [\"GET /notifications/threads/{thread_id}\"],\n    getThreadSubscriptionForAuthenticatedUser: [\"GET /notifications/threads/{thread_id}/subscription\"],\n    listEventsForAuthenticatedUser: [\"GET /users/{username}/events\"],\n    listNotificationsForAuthenticatedUser: [\"GET /notifications\"],\n    listOrgEventsForAuthenticatedUser: [\"GET /users/{username}/events/orgs/{org}\"],\n    listPublicEvents: [\"GET /events\"],\n    listPublicEventsForRepoNetwork: [\"GET /networks/{owner}/{repo}/events\"],\n    listPublicEventsForUser: [\"GET /users/{username}/events/public\"],\n    listPublicOrgEvents: [\"GET /orgs/{org}/events\"],\n    listReceivedEventsForUser: [\"GET /users/{username}/received_events\"],\n    listReceivedPublicEventsForUser: [\"GET /users/{username}/received_events/public\"],\n    
listRepoEvents: [\"GET /repos/{owner}/{repo}/events\"],\n    listRepoNotificationsForAuthenticatedUser: [\"GET /repos/{owner}/{repo}/notifications\"],\n    listReposStarredByAuthenticatedUser: [\"GET /user/starred\"],\n    listReposStarredByUser: [\"GET /users/{username}/starred\"],\n    listReposWatchedByUser: [\"GET /users/{username}/subscriptions\"],\n    listStargazersForRepo: [\"GET /repos/{owner}/{repo}/stargazers\"],\n    listWatchedReposForAuthenticatedUser: [\"GET /user/subscriptions\"],\n    listWatchersForRepo: [\"GET /repos/{owner}/{repo}/subscribers\"],\n    markNotificationsAsRead: [\"PUT /notifications\"],\n    markRepoNotificationsAsRead: [\"PUT /repos/{owner}/{repo}/notifications\"],\n    markThreadAsRead: [\"PATCH /notifications/threads/{thread_id}\"],\n    setRepoSubscription: [\"PUT /repos/{owner}/{repo}/subscription\"],\n    setThreadSubscription: [\"PUT /notifications/threads/{thread_id}/subscription\"],\n    starRepoForAuthenticatedUser: [\"PUT /user/starred/{owner}/{repo}\"],\n    unstarRepoForAuthenticatedUser: [\"DELETE /user/starred/{owner}/{repo}\"]\n  },\n  apps: {\n    addRepoToInstallation: [\"PUT /user/installations/{installation_id}/repositories/{repository_id}\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    checkToken: [\"POST /applications/{client_id}/token\"],\n    createContentAttachment: [\"POST /content_references/{content_reference_id}/attachments\", {\n      mediaType: {\n        previews: [\"corsair\"]\n      }\n    }],\n    createFromManifest: [\"POST /app-manifests/{code}/conversions\"],\n    createInstallationAccessToken: [\"POST /app/installations/{installation_id}/access_tokens\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    deleteAuthorization: [\"DELETE /applications/{client_id}/grant\"],\n    deleteInstallation: [\"DELETE /app/installations/{installation_id}\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    
deleteToken: [\"DELETE /applications/{client_id}/token\"],\n    getAuthenticated: [\"GET /app\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    getBySlug: [\"GET /apps/{app_slug}\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    getInstallation: [\"GET /app/installations/{installation_id}\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    getOrgInstallation: [\"GET /orgs/{org}/installation\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    getRepoInstallation: [\"GET /repos/{owner}/{repo}/installation\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    getSubscriptionPlanForAccount: [\"GET /marketplace_listing/accounts/{account_id}\"],\n    getSubscriptionPlanForAccountStubbed: [\"GET /marketplace_listing/stubbed/accounts/{account_id}\"],\n    getUserInstallation: [\"GET /users/{username}/installation\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    listAccountsForPlan: [\"GET /marketplace_listing/plans/{plan_id}/accounts\"],\n    listAccountsForPlanStubbed: [\"GET /marketplace_listing/stubbed/plans/{plan_id}/accounts\"],\n    listInstallationReposForAuthenticatedUser: [\"GET /user/installations/{installation_id}/repositories\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    listInstallations: [\"GET /app/installations\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    listInstallationsForAuthenticatedUser: [\"GET /user/installations\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    listPlans: [\"GET /marketplace_listing/plans\"],\n    listPlansStubbed: [\"GET /marketplace_listing/stubbed/plans\"],\n    listReposAccessibleToInstallation: [\"GET /installation/repositories\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    
listSubscriptionsForAuthenticatedUser: [\"GET /user/marketplace_purchases\"],\n    listSubscriptionsForAuthenticatedUserStubbed: [\"GET /user/marketplace_purchases/stubbed\"],\n    removeRepoFromInstallation: [\"DELETE /user/installations/{installation_id}/repositories/{repository_id}\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    resetToken: [\"PATCH /applications/{client_id}/token\"],\n    revokeInstallationAccessToken: [\"DELETE /installation/token\"],\n    suspendInstallation: [\"PUT /app/installations/{installation_id}/suspended\"],\n    unsuspendInstallation: [\"DELETE /app/installations/{installation_id}/suspended\"]\n  },\n  checks: {\n    create: [\"POST /repos/{owner}/{repo}/check-runs\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    createSuite: [\"POST /repos/{owner}/{repo}/check-suites\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    get: [\"GET /repos/{owner}/{repo}/check-runs/{check_run_id}\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    getSuite: [\"GET /repos/{owner}/{repo}/check-suites/{check_suite_id}\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    listAnnotations: [\"GET /repos/{owner}/{repo}/check-runs/{check_run_id}/annotations\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    listForRef: [\"GET /repos/{owner}/{repo}/commits/{ref}/check-runs\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    listForSuite: [\"GET /repos/{owner}/{repo}/check-suites/{check_suite_id}/check-runs\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    listSuitesForRef: [\"GET /repos/{owner}/{repo}/commits/{ref}/check-suites\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    rerequestSuite: [\"POST /repos/{owner}/{repo}/check-suites/{check_suite_id}/rerequest\", {\n      mediaType: {\n    
    previews: [\"antiope\"]\n      }\n    }],\n    setSuitesPreferences: [\"PATCH /repos/{owner}/{repo}/check-suites/preferences\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }],\n    update: [\"PATCH /repos/{owner}/{repo}/check-runs/{check_run_id}\", {\n      mediaType: {\n        previews: [\"antiope\"]\n      }\n    }]\n  },\n  codeScanning: {\n    getAlert: [\"GET /repos/{owner}/{repo}/code-scanning/alerts/{alert_id}\"],\n    listAlertsForRepo: [\"GET /repos/{owner}/{repo}/code-scanning/alerts\"]\n  },\n  codesOfConduct: {\n    getAllCodesOfConduct: [\"GET /codes_of_conduct\", {\n      mediaType: {\n        previews: [\"scarlet-witch\"]\n      }\n    }],\n    getConductCode: [\"GET /codes_of_conduct/{key}\", {\n      mediaType: {\n        previews: [\"scarlet-witch\"]\n      }\n    }],\n    getForRepo: [\"GET /repos/{owner}/{repo}/community/code_of_conduct\", {\n      mediaType: {\n        previews: [\"scarlet-witch\"]\n      }\n    }]\n  },\n  emojis: {\n    get: [\"GET /emojis\"]\n  },\n  gists: {\n    checkIsStarred: [\"GET /gists/{gist_id}/star\"],\n    create: [\"POST /gists\"],\n    createComment: [\"POST /gists/{gist_id}/comments\"],\n    delete: [\"DELETE /gists/{gist_id}\"],\n    deleteComment: [\"DELETE /gists/{gist_id}/comments/{comment_id}\"],\n    fork: [\"POST /gists/{gist_id}/forks\"],\n    get: [\"GET /gists/{gist_id}\"],\n    getComment: [\"GET /gists/{gist_id}/comments/{comment_id}\"],\n    getRevision: [\"GET /gists/{gist_id}/{sha}\"],\n    list: [\"GET /gists\"],\n    listComments: [\"GET /gists/{gist_id}/comments\"],\n    listCommits: [\"GET /gists/{gist_id}/commits\"],\n    listForUser: [\"GET /users/{username}/gists\"],\n    listForks: [\"GET /gists/{gist_id}/forks\"],\n    listPublic: [\"GET /gists/public\"],\n    listStarred: [\"GET /gists/starred\"],\n    star: [\"PUT /gists/{gist_id}/star\"],\n    unstar: [\"DELETE /gists/{gist_id}/star\"],\n    update: [\"PATCH /gists/{gist_id}\"],\n    updateComment: 
[\"PATCH /gists/{gist_id}/comments/{comment_id}\"]\n  },\n  git: {\n    createBlob: [\"POST /repos/{owner}/{repo}/git/blobs\"],\n    createCommit: [\"POST /repos/{owner}/{repo}/git/commits\"],\n    createRef: [\"POST /repos/{owner}/{repo}/git/refs\"],\n    createTag: [\"POST /repos/{owner}/{repo}/git/tags\"],\n    createTree: [\"POST /repos/{owner}/{repo}/git/trees\"],\n    deleteRef: [\"DELETE /repos/{owner}/{repo}/git/refs/{ref}\"],\n    getBlob: [\"GET /repos/{owner}/{repo}/git/blobs/{file_sha}\"],\n    getCommit: [\"GET /repos/{owner}/{repo}/git/commits/{commit_sha}\"],\n    getRef: [\"GET /repos/{owner}/{repo}/git/ref/{ref}\"],\n    getTag: [\"GET /repos/{owner}/{repo}/git/tags/{tag_sha}\"],\n    getTree: [\"GET /repos/{owner}/{repo}/git/trees/{tree_sha}\"],\n    listMatchingRefs: [\"GET /repos/{owner}/{repo}/git/matching-refs/{ref}\"],\n    updateRef: [\"PATCH /repos/{owner}/{repo}/git/refs/{ref}\"]\n  },\n  gitignore: {\n    getAllTemplates: [\"GET /gitignore/templates\"],\n    getTemplate: [\"GET /gitignore/templates/{name}\"]\n  },\n  interactions: {\n    getRestrictionsForOrg: [\"GET /orgs/{org}/interaction-limits\", {\n      mediaType: {\n        previews: [\"sombra\"]\n      }\n    }],\n    getRestrictionsForRepo: [\"GET /repos/{owner}/{repo}/interaction-limits\", {\n      mediaType: {\n        previews: [\"sombra\"]\n      }\n    }],\n    removeRestrictionsForOrg: [\"DELETE /orgs/{org}/interaction-limits\", {\n      mediaType: {\n        previews: [\"sombra\"]\n      }\n    }],\n    removeRestrictionsForRepo: [\"DELETE /repos/{owner}/{repo}/interaction-limits\", {\n      mediaType: {\n        previews: [\"sombra\"]\n      }\n    }],\n    setRestrictionsForOrg: [\"PUT /orgs/{org}/interaction-limits\", {\n      mediaType: {\n        previews: [\"sombra\"]\n      }\n    }],\n    setRestrictionsForRepo: [\"PUT /repos/{owner}/{repo}/interaction-limits\", {\n      mediaType: {\n        previews: [\"sombra\"]\n      }\n    }]\n  },\n  issues: {\n    
addAssignees: [\"POST /repos/{owner}/{repo}/issues/{issue_number}/assignees\"],\n    addLabels: [\"POST /repos/{owner}/{repo}/issues/{issue_number}/labels\"],\n    checkUserCanBeAssigned: [\"GET /repos/{owner}/{repo}/assignees/{assignee}\"],\n    create: [\"POST /repos/{owner}/{repo}/issues\"],\n    createComment: [\"POST /repos/{owner}/{repo}/issues/{issue_number}/comments\"],\n    createLabel: [\"POST /repos/{owner}/{repo}/labels\"],\n    createMilestone: [\"POST /repos/{owner}/{repo}/milestones\"],\n    deleteComment: [\"DELETE /repos/{owner}/{repo}/issues/comments/{comment_id}\"],\n    deleteLabel: [\"DELETE /repos/{owner}/{repo}/labels/{name}\"],\n    deleteMilestone: [\"DELETE /repos/{owner}/{repo}/milestones/{milestone_number}\"],\n    get: [\"GET /repos/{owner}/{repo}/issues/{issue_number}\"],\n    getComment: [\"GET /repos/{owner}/{repo}/issues/comments/{comment_id}\"],\n    getEvent: [\"GET /repos/{owner}/{repo}/issues/events/{event_id}\"],\n    getLabel: [\"GET /repos/{owner}/{repo}/labels/{name}\"],\n    getMilestone: [\"GET /repos/{owner}/{repo}/milestones/{milestone_number}\"],\n    list: [\"GET /issues\"],\n    listAssignees: [\"GET /repos/{owner}/{repo}/assignees\"],\n    listComments: [\"GET /repos/{owner}/{repo}/issues/{issue_number}/comments\"],\n    listCommentsForRepo: [\"GET /repos/{owner}/{repo}/issues/comments\"],\n    listEvents: [\"GET /repos/{owner}/{repo}/issues/{issue_number}/events\"],\n    listEventsForRepo: [\"GET /repos/{owner}/{repo}/issues/events\"],\n    listEventsForTimeline: [\"GET /repos/{owner}/{repo}/issues/{issue_number}/timeline\", {\n      mediaType: {\n        previews: [\"mockingbird\"]\n      }\n    }],\n    listForAuthenticatedUser: [\"GET /user/issues\"],\n    listForOrg: [\"GET /orgs/{org}/issues\"],\n    listForRepo: [\"GET /repos/{owner}/{repo}/issues\"],\n    listLabelsForMilestone: [\"GET /repos/{owner}/{repo}/milestones/{milestone_number}/labels\"],\n    listLabelsForRepo: [\"GET 
/repos/{owner}/{repo}/labels\"],\n    listLabelsOnIssue: [\"GET /repos/{owner}/{repo}/issues/{issue_number}/labels\"],\n    listMilestones: [\"GET /repos/{owner}/{repo}/milestones\"],\n    lock: [\"PUT /repos/{owner}/{repo}/issues/{issue_number}/lock\"],\n    removeAllLabels: [\"DELETE /repos/{owner}/{repo}/issues/{issue_number}/labels\"],\n    removeAssignees: [\"DELETE /repos/{owner}/{repo}/issues/{issue_number}/assignees\"],\n    removeLabel: [\"DELETE /repos/{owner}/{repo}/issues/{issue_number}/labels/{name}\"],\n    setLabels: [\"PUT /repos/{owner}/{repo}/issues/{issue_number}/labels\"],\n    unlock: [\"DELETE /repos/{owner}/{repo}/issues/{issue_number}/lock\"],\n    update: [\"PATCH /repos/{owner}/{repo}/issues/{issue_number}\"],\n    updateComment: [\"PATCH /repos/{owner}/{repo}/issues/comments/{comment_id}\"],\n    updateLabel: [\"PATCH /repos/{owner}/{repo}/labels/{name}\"],\n    updateMilestone: [\"PATCH /repos/{owner}/{repo}/milestones/{milestone_number}\"]\n  },\n  licenses: {\n    get: [\"GET /licenses/{license}\"],\n    getAllCommonlyUsed: [\"GET /licenses\"],\n    getForRepo: [\"GET /repos/{owner}/{repo}/license\"]\n  },\n  markdown: {\n    render: [\"POST /markdown\"],\n    renderRaw: [\"POST /markdown/raw\", {\n      headers: {\n        \"content-type\": \"text/plain; charset=utf-8\"\n      }\n    }]\n  },\n  meta: {\n    get: [\"GET /meta\"]\n  },\n  migrations: {\n    cancelImport: [\"DELETE /repos/{owner}/{repo}/import\"],\n    deleteArchiveForAuthenticatedUser: [\"DELETE /user/migrations/{migration_id}/archive\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    deleteArchiveForOrg: [\"DELETE /orgs/{org}/migrations/{migration_id}/archive\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    downloadArchiveForOrg: [\"GET /orgs/{org}/migrations/{migration_id}/archive\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    getArchiveForAuthenticatedUser: [\"GET 
/user/migrations/{migration_id}/archive\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    getCommitAuthors: [\"GET /repos/{owner}/{repo}/import/authors\"],\n    getImportStatus: [\"GET /repos/{owner}/{repo}/import\"],\n    getLargeFiles: [\"GET /repos/{owner}/{repo}/import/large_files\"],\n    getStatusForAuthenticatedUser: [\"GET /user/migrations/{migration_id}\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    getStatusForOrg: [\"GET /orgs/{org}/migrations/{migration_id}\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    listForAuthenticatedUser: [\"GET /user/migrations\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    listForOrg: [\"GET /orgs/{org}/migrations\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    listReposForOrg: [\"GET /orgs/{org}/migrations/{migration_id}/repositories\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    listReposForUser: [\"GET /user/{migration_id}/repositories\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    mapCommitAuthor: [\"PATCH /repos/{owner}/{repo}/import/authors/{author_id}\"],\n    setLfsPreference: [\"PATCH /repos/{owner}/{repo}/import/lfs\"],\n    startForAuthenticatedUser: [\"POST /user/migrations\"],\n    startForOrg: [\"POST /orgs/{org}/migrations\"],\n    startImport: [\"PUT /repos/{owner}/{repo}/import\"],\n    unlockRepoForAuthenticatedUser: [\"DELETE /user/migrations/{migration_id}/repos/{repo_name}/lock\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    unlockRepoForOrg: [\"DELETE /orgs/{org}/migrations/{migration_id}/repos/{repo_name}/lock\", {\n      mediaType: {\n        previews: [\"wyandotte\"]\n      }\n    }],\n    updateImport: [\"PATCH /repos/{owner}/{repo}/import\"]\n  },\n  orgs: {\n    blockUser: [\"PUT /orgs/{org}/blocks/{username}\"],\n    
checkBlockedUser: [\"GET /orgs/{org}/blocks/{username}\"],\n    checkMembershipForUser: [\"GET /orgs/{org}/members/{username}\"],\n    checkPublicMembershipForUser: [\"GET /orgs/{org}/public_members/{username}\"],\n    convertMemberToOutsideCollaborator: [\"PUT /orgs/{org}/outside_collaborators/{username}\"],\n    createInvitation: [\"POST /orgs/{org}/invitations\"],\n    createWebhook: [\"POST /orgs/{org}/hooks\"],\n    deleteWebhook: [\"DELETE /orgs/{org}/hooks/{hook_id}\"],\n    get: [\"GET /orgs/{org}\"],\n    getMembershipForAuthenticatedUser: [\"GET /user/memberships/orgs/{org}\"],\n    getMembershipForUser: [\"GET /orgs/{org}/memberships/{username}\"],\n    getWebhook: [\"GET /orgs/{org}/hooks/{hook_id}\"],\n    list: [\"GET /organizations\"],\n    listAppInstallations: [\"GET /orgs/{org}/installations\", {\n      mediaType: {\n        previews: [\"machine-man\"]\n      }\n    }],\n    listBlockedUsers: [\"GET /orgs/{org}/blocks\"],\n    listForAuthenticatedUser: [\"GET /user/orgs\"],\n    listForUser: [\"GET /users/{username}/orgs\"],\n    listInvitationTeams: [\"GET /orgs/{org}/invitations/{invitation_id}/teams\"],\n    listMembers: [\"GET /orgs/{org}/members\"],\n    listMembershipsForAuthenticatedUser: [\"GET /user/memberships/orgs\"],\n    listOutsideCollaborators: [\"GET /orgs/{org}/outside_collaborators\"],\n    listPendingInvitations: [\"GET /orgs/{org}/invitations\"],\n    listPublicMembers: [\"GET /orgs/{org}/public_members\"],\n    listWebhooks: [\"GET /orgs/{org}/hooks\"],\n    pingWebhook: [\"POST /orgs/{org}/hooks/{hook_id}/pings\"],\n    removeMember: [\"DELETE /orgs/{org}/members/{username}\"],\n    removeMembershipForUser: [\"DELETE /orgs/{org}/memberships/{username}\"],\n    removeOutsideCollaborator: [\"DELETE /orgs/{org}/outside_collaborators/{username}\"],\n    removePublicMembershipForAuthenticatedUser: [\"DELETE /orgs/{org}/public_members/{username}\"],\n    setMembershipForUser: [\"PUT /orgs/{org}/memberships/{username}\"],\n    
setPublicMembershipForAuthenticatedUser: [\"PUT /orgs/{org}/public_members/{username}\"],\n    unblockUser: [\"DELETE /orgs/{org}/blocks/{username}\"],\n    update: [\"PATCH /orgs/{org}\"],\n    updateMembershipForAuthenticatedUser: [\"PATCH /user/memberships/orgs/{org}\"],\n    updateWebhook: [\"PATCH /orgs/{org}/hooks/{hook_id}\"]\n  },\n  projects: {\n    addCollaborator: [\"PUT /projects/{project_id}/collaborators/{username}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    createCard: [\"POST /projects/columns/{column_id}/cards\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    createColumn: [\"POST /projects/{project_id}/columns\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    createForAuthenticatedUser: [\"POST /user/projects\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    createForOrg: [\"POST /orgs/{org}/projects\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    createForRepo: [\"POST /repos/{owner}/{repo}/projects\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    delete: [\"DELETE /projects/{project_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    deleteCard: [\"DELETE /projects/columns/cards/{card_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    deleteColumn: [\"DELETE /projects/columns/{column_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    get: [\"GET /projects/{project_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    getCard: [\"GET /projects/columns/cards/{card_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    getColumn: [\"GET /projects/columns/{column_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    getPermissionForUser: [\"GET 
/projects/{project_id}/collaborators/{username}/permission\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    listCards: [\"GET /projects/columns/{column_id}/cards\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    listCollaborators: [\"GET /projects/{project_id}/collaborators\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    listColumns: [\"GET /projects/{project_id}/columns\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    listForOrg: [\"GET /orgs/{org}/projects\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    listForRepo: [\"GET /repos/{owner}/{repo}/projects\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    listForUser: [\"GET /users/{username}/projects\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    moveCard: [\"POST /projects/columns/cards/{card_id}/moves\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    moveColumn: [\"POST /projects/columns/{column_id}/moves\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    removeCollaborator: [\"DELETE /projects/{project_id}/collaborators/{username}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    update: [\"PATCH /projects/{project_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    updateCard: [\"PATCH /projects/columns/cards/{card_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    updateColumn: [\"PATCH /projects/columns/{column_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }]\n  },\n  pulls: {\n    checkIfMerged: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}/merge\"],\n    create: [\"POST /repos/{owner}/{repo}/pulls\"],\n    createReplyForReviewComment: [\"POST 
/repos/{owner}/{repo}/pulls/{pull_number}/comments/{comment_id}/replies\"],\n    createReview: [\"POST /repos/{owner}/{repo}/pulls/{pull_number}/reviews\"],\n    createReviewComment: [\"POST /repos/{owner}/{repo}/pulls/{pull_number}/comments\"],\n    deletePendingReview: [\"DELETE /repos/{owner}/{repo}/pulls/{pull_number}/reviews/{review_id}\"],\n    deleteReviewComment: [\"DELETE /repos/{owner}/{repo}/pulls/comments/{comment_id}\"],\n    dismissReview: [\"PUT /repos/{owner}/{repo}/pulls/{pull_number}/reviews/{review_id}/dismissals\"],\n    get: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}\"],\n    getReview: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}/reviews/{review_id}\"],\n    getReviewComment: [\"GET /repos/{owner}/{repo}/pulls/comments/{comment_id}\"],\n    list: [\"GET /repos/{owner}/{repo}/pulls\"],\n    listCommentsForReview: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}/reviews/{review_id}/comments\"],\n    listCommits: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}/commits\"],\n    listFiles: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}/files\"],\n    listRequestedReviewers: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}/requested_reviewers\"],\n    listReviewComments: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}/comments\"],\n    listReviewCommentsForRepo: [\"GET /repos/{owner}/{repo}/pulls/comments\"],\n    listReviews: [\"GET /repos/{owner}/{repo}/pulls/{pull_number}/reviews\"],\n    merge: [\"PUT /repos/{owner}/{repo}/pulls/{pull_number}/merge\"],\n    removeRequestedReviewers: [\"DELETE /repos/{owner}/{repo}/pulls/{pull_number}/requested_reviewers\"],\n    requestReviewers: [\"POST /repos/{owner}/{repo}/pulls/{pull_number}/requested_reviewers\"],\n    submitReview: [\"POST /repos/{owner}/{repo}/pulls/{pull_number}/reviews/{review_id}/events\"],\n    update: [\"PATCH /repos/{owner}/{repo}/pulls/{pull_number}\"],\n    updateBranch: [\"PUT /repos/{owner}/{repo}/pulls/{pull_number}/update-branch\", {\n      mediaType: {\n        
previews: [\"lydian\"]\n      }\n    }],\n    updateReview: [\"PUT /repos/{owner}/{repo}/pulls/{pull_number}/reviews/{review_id}\"],\n    updateReviewComment: [\"PATCH /repos/{owner}/{repo}/pulls/comments/{comment_id}\"]\n  },\n  rateLimit: {\n    get: [\"GET /rate_limit\"]\n  },\n  reactions: {\n    createForCommitComment: [\"POST /repos/{owner}/{repo}/comments/{comment_id}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    createForIssue: [\"POST /repos/{owner}/{repo}/issues/{issue_number}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    createForIssueComment: [\"POST /repos/{owner}/{repo}/issues/comments/{comment_id}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    createForPullRequestReviewComment: [\"POST /repos/{owner}/{repo}/pulls/comments/{comment_id}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    createForTeamDiscussionCommentInOrg: [\"POST /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/comments/{comment_number}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    createForTeamDiscussionInOrg: [\"POST /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    deleteForCommitComment: [\"DELETE /repos/{owner}/{repo}/comments/{comment_id}/reactions/{reaction_id}\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    deleteForIssue: [\"DELETE /repos/{owner}/{repo}/issues/{issue_number}/reactions/{reaction_id}\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    deleteForIssueComment: [\"DELETE /repos/{owner}/{repo}/issues/comments/{comment_id}/reactions/{reaction_id}\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n   
 deleteForPullRequestComment: [\"DELETE /repos/{owner}/{repo}/pulls/comments/{comment_id}/reactions/{reaction_id}\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    deleteForTeamDiscussion: [\"DELETE /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/reactions/{reaction_id}\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    deleteForTeamDiscussionComment: [\"DELETE /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/comments/{comment_number}/reactions/{reaction_id}\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    listForCommitComment: [\"GET /repos/{owner}/{repo}/comments/{comment_id}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    listForIssue: [\"GET /repos/{owner}/{repo}/issues/{issue_number}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    listForIssueComment: [\"GET /repos/{owner}/{repo}/issues/comments/{comment_id}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    listForPullRequestReviewComment: [\"GET /repos/{owner}/{repo}/pulls/comments/{comment_id}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    listForTeamDiscussionCommentInOrg: [\"GET /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/comments/{comment_number}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }],\n    listForTeamDiscussionInOrg: [\"GET /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/reactions\", {\n      mediaType: {\n        previews: [\"squirrel-girl\"]\n      }\n    }]\n  },\n  repos: {\n    acceptInvitation: [\"PATCH /user/repository_invitations/{invitation_id}\"],\n    addAppAccessRestrictions: [\"POST /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/apps\", {}, {\n      mapToData: 
\"apps\"\n    }],\n    addCollaborator: [\"PUT /repos/{owner}/{repo}/collaborators/{username}\"],\n    addStatusCheckContexts: [\"POST /repos/{owner}/{repo}/branches/{branch}/protection/required_status_checks/contexts\", {}, {\n      mapToData: \"contexts\"\n    }],\n    addTeamAccessRestrictions: [\"POST /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/teams\", {}, {\n      mapToData: \"teams\"\n    }],\n    addUserAccessRestrictions: [\"POST /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/users\", {}, {\n      mapToData: \"users\"\n    }],\n    checkCollaborator: [\"GET /repos/{owner}/{repo}/collaborators/{username}\"],\n    checkVulnerabilityAlerts: [\"GET /repos/{owner}/{repo}/vulnerability-alerts\", {\n      mediaType: {\n        previews: [\"dorian\"]\n      }\n    }],\n    compareCommits: [\"GET /repos/{owner}/{repo}/compare/{base}...{head}\"],\n    createCommitComment: [\"POST /repos/{owner}/{repo}/commits/{commit_sha}/comments\"],\n    createCommitSignatureProtection: [\"POST /repos/{owner}/{repo}/branches/{branch}/protection/required_signatures\", {\n      mediaType: {\n        previews: [\"zzzax\"]\n      }\n    }],\n    createCommitStatus: [\"POST /repos/{owner}/{repo}/statuses/{sha}\"],\n    createDeployKey: [\"POST /repos/{owner}/{repo}/keys\"],\n    createDeployment: [\"POST /repos/{owner}/{repo}/deployments\"],\n    createDeploymentStatus: [\"POST /repos/{owner}/{repo}/deployments/{deployment_id}/statuses\"],\n    createDispatchEvent: [\"POST /repos/{owner}/{repo}/dispatches\"],\n    createForAuthenticatedUser: [\"POST /user/repos\"],\n    createFork: [\"POST /repos/{owner}/{repo}/forks\"],\n    createInOrg: [\"POST /orgs/{org}/repos\"],\n    createOrUpdateFileContents: [\"PUT /repos/{owner}/{repo}/contents/{path}\"],\n    createPagesSite: [\"POST /repos/{owner}/{repo}/pages\", {\n      mediaType: {\n        previews: [\"switcheroo\"]\n      }\n    }],\n    createRelease: [\"POST /repos/{owner}/{repo}/releases\"],\n    
createUsingTemplate: [\"POST /repos/{template_owner}/{template_repo}/generate\", {\n      mediaType: {\n        previews: [\"baptiste\"]\n      }\n    }],\n    createWebhook: [\"POST /repos/{owner}/{repo}/hooks\"],\n    declineInvitation: [\"DELETE /user/repository_invitations/{invitation_id}\"],\n    delete: [\"DELETE /repos/{owner}/{repo}\"],\n    deleteAccessRestrictions: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/restrictions\"],\n    deleteAdminBranchProtection: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/enforce_admins\"],\n    deleteBranchProtection: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection\"],\n    deleteCommitComment: [\"DELETE /repos/{owner}/{repo}/comments/{comment_id}\"],\n    deleteCommitSignatureProtection: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/required_signatures\", {\n      mediaType: {\n        previews: [\"zzzax\"]\n      }\n    }],\n    deleteDeployKey: [\"DELETE /repos/{owner}/{repo}/keys/{key_id}\"],\n    deleteDeployment: [\"DELETE /repos/{owner}/{repo}/deployments/{deployment_id}\"],\n    deleteFile: [\"DELETE /repos/{owner}/{repo}/contents/{path}\"],\n    deleteInvitation: [\"DELETE /repos/{owner}/{repo}/invitations/{invitation_id}\"],\n    deletePagesSite: [\"DELETE /repos/{owner}/{repo}/pages\", {\n      mediaType: {\n        previews: [\"switcheroo\"]\n      }\n    }],\n    deletePullRequestReviewProtection: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/required_pull_request_reviews\"],\n    deleteRelease: [\"DELETE /repos/{owner}/{repo}/releases/{release_id}\"],\n    deleteReleaseAsset: [\"DELETE /repos/{owner}/{repo}/releases/assets/{asset_id}\"],\n    deleteWebhook: [\"DELETE /repos/{owner}/{repo}/hooks/{hook_id}\"],\n    disableAutomatedSecurityFixes: [\"DELETE /repos/{owner}/{repo}/automated-security-fixes\", {\n      mediaType: {\n        previews: [\"london\"]\n      }\n    }],\n    disableVulnerabilityAlerts: [\"DELETE 
/repos/{owner}/{repo}/vulnerability-alerts\", {\n      mediaType: {\n        previews: [\"dorian\"]\n      }\n    }],\n    downloadArchive: [\"GET /repos/{owner}/{repo}/{archive_format}/{ref}\"],\n    enableAutomatedSecurityFixes: [\"PUT /repos/{owner}/{repo}/automated-security-fixes\", {\n      mediaType: {\n        previews: [\"london\"]\n      }\n    }],\n    enableVulnerabilityAlerts: [\"PUT /repos/{owner}/{repo}/vulnerability-alerts\", {\n      mediaType: {\n        previews: [\"dorian\"]\n      }\n    }],\n    get: [\"GET /repos/{owner}/{repo}\"],\n    getAccessRestrictions: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/restrictions\"],\n    getAdminBranchProtection: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/enforce_admins\"],\n    getAllStatusCheckContexts: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/required_status_checks/contexts\"],\n    getAllTopics: [\"GET /repos/{owner}/{repo}/topics\", {\n      mediaType: {\n        previews: [\"mercy\"]\n      }\n    }],\n    getAppsWithAccessToProtectedBranch: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/apps\"],\n    getBranch: [\"GET /repos/{owner}/{repo}/branches/{branch}\"],\n    getBranchProtection: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection\"],\n    getClones: [\"GET /repos/{owner}/{repo}/traffic/clones\"],\n    getCodeFrequencyStats: [\"GET /repos/{owner}/{repo}/stats/code_frequency\"],\n    getCollaboratorPermissionLevel: [\"GET /repos/{owner}/{repo}/collaborators/{username}/permission\"],\n    getCombinedStatusForRef: [\"GET /repos/{owner}/{repo}/commits/{ref}/status\"],\n    getCommit: [\"GET /repos/{owner}/{repo}/commits/{ref}\"],\n    getCommitActivityStats: [\"GET /repos/{owner}/{repo}/stats/commit_activity\"],\n    getCommitComment: [\"GET /repos/{owner}/{repo}/comments/{comment_id}\"],\n    getCommitSignatureProtection: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/required_signatures\", {\n      mediaType: 
{\n        previews: [\"zzzax\"]\n      }\n    }],\n    getCommunityProfileMetrics: [\"GET /repos/{owner}/{repo}/community/profile\"],\n    getContent: [\"GET /repos/{owner}/{repo}/contents/{path}\"],\n    getContributorsStats: [\"GET /repos/{owner}/{repo}/stats/contributors\"],\n    getDeployKey: [\"GET /repos/{owner}/{repo}/keys/{key_id}\"],\n    getDeployment: [\"GET /repos/{owner}/{repo}/deployments/{deployment_id}\"],\n    getDeploymentStatus: [\"GET /repos/{owner}/{repo}/deployments/{deployment_id}/statuses/{status_id}\"],\n    getLatestPagesBuild: [\"GET /repos/{owner}/{repo}/pages/builds/latest\"],\n    getLatestRelease: [\"GET /repos/{owner}/{repo}/releases/latest\"],\n    getPages: [\"GET /repos/{owner}/{repo}/pages\"],\n    getPagesBuild: [\"GET /repos/{owner}/{repo}/pages/builds/{build_id}\"],\n    getParticipationStats: [\"GET /repos/{owner}/{repo}/stats/participation\"],\n    getPullRequestReviewProtection: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/required_pull_request_reviews\"],\n    getPunchCardStats: [\"GET /repos/{owner}/{repo}/stats/punch_card\"],\n    getReadme: [\"GET /repos/{owner}/{repo}/readme\"],\n    getRelease: [\"GET /repos/{owner}/{repo}/releases/{release_id}\"],\n    getReleaseAsset: [\"GET /repos/{owner}/{repo}/releases/assets/{asset_id}\"],\n    getReleaseByTag: [\"GET /repos/{owner}/{repo}/releases/tags/{tag}\"],\n    getStatusChecksProtection: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/required_status_checks\"],\n    getTeamsWithAccessToProtectedBranch: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/teams\"],\n    getTopPaths: [\"GET /repos/{owner}/{repo}/traffic/popular/paths\"],\n    getTopReferrers: [\"GET /repos/{owner}/{repo}/traffic/popular/referrers\"],\n    getUsersWithAccessToProtectedBranch: [\"GET /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/users\"],\n    getViews: [\"GET /repos/{owner}/{repo}/traffic/views\"],\n    getWebhook: [\"GET 
/repos/{owner}/{repo}/hooks/{hook_id}\"],\n    listBranches: [\"GET /repos/{owner}/{repo}/branches\"],\n    listBranchesForHeadCommit: [\"GET /repos/{owner}/{repo}/commits/{commit_sha}/branches-where-head\", {\n      mediaType: {\n        previews: [\"groot\"]\n      }\n    }],\n    listCollaborators: [\"GET /repos/{owner}/{repo}/collaborators\"],\n    listCommentsForCommit: [\"GET /repos/{owner}/{repo}/commits/{commit_sha}/comments\"],\n    listCommitCommentsForRepo: [\"GET /repos/{owner}/{repo}/comments\"],\n    listCommitStatusesForRef: [\"GET /repos/{owner}/{repo}/commits/{ref}/statuses\"],\n    listCommits: [\"GET /repos/{owner}/{repo}/commits\"],\n    listContributors: [\"GET /repos/{owner}/{repo}/contributors\"],\n    listDeployKeys: [\"GET /repos/{owner}/{repo}/keys\"],\n    listDeploymentStatuses: [\"GET /repos/{owner}/{repo}/deployments/{deployment_id}/statuses\"],\n    listDeployments: [\"GET /repos/{owner}/{repo}/deployments\"],\n    listForAuthenticatedUser: [\"GET /user/repos\"],\n    listForOrg: [\"GET /orgs/{org}/repos\"],\n    listForUser: [\"GET /users/{username}/repos\"],\n    listForks: [\"GET /repos/{owner}/{repo}/forks\"],\n    listInvitations: [\"GET /repos/{owner}/{repo}/invitations\"],\n    listInvitationsForAuthenticatedUser: [\"GET /user/repository_invitations\"],\n    listLanguages: [\"GET /repos/{owner}/{repo}/languages\"],\n    listPagesBuilds: [\"GET /repos/{owner}/{repo}/pages/builds\"],\n    listPublic: [\"GET /repositories\"],\n    listPullRequestsAssociatedWithCommit: [\"GET /repos/{owner}/{repo}/commits/{commit_sha}/pulls\", {\n      mediaType: {\n        previews: [\"groot\"]\n      }\n    }],\n    listReleaseAssets: [\"GET /repos/{owner}/{repo}/releases/{release_id}/assets\"],\n    listReleases: [\"GET /repos/{owner}/{repo}/releases\"],\n    listTags: [\"GET /repos/{owner}/{repo}/tags\"],\n    listTeams: [\"GET /repos/{owner}/{repo}/teams\"],\n    listWebhooks: [\"GET /repos/{owner}/{repo}/hooks\"],\n    merge: [\"POST 
/repos/{owner}/{repo}/merges\"],\n    pingWebhook: [\"POST /repos/{owner}/{repo}/hooks/{hook_id}/pings\"],\n    removeAppAccessRestrictions: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/apps\", {}, {\n      mapToData: \"apps\"\n    }],\n    removeCollaborator: [\"DELETE /repos/{owner}/{repo}/collaborators/{username}\"],\n    removeStatusCheckContexts: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/required_status_checks/contexts\", {}, {\n      mapToData: \"contexts\"\n    }],\n    removeStatusCheckProtection: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/required_status_checks\"],\n    removeTeamAccessRestrictions: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/teams\", {}, {\n      mapToData: \"teams\"\n    }],\n    removeUserAccessRestrictions: [\"DELETE /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/users\", {}, {\n      mapToData: \"users\"\n    }],\n    replaceAllTopics: [\"PUT /repos/{owner}/{repo}/topics\", {\n      mediaType: {\n        previews: [\"mercy\"]\n      }\n    }],\n    requestPagesBuild: [\"POST /repos/{owner}/{repo}/pages/builds\"],\n    setAdminBranchProtection: [\"POST /repos/{owner}/{repo}/branches/{branch}/protection/enforce_admins\"],\n    setAppAccessRestrictions: [\"PUT /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/apps\", {}, {\n      mapToData: \"apps\"\n    }],\n    setStatusCheckContexts: [\"PUT /repos/{owner}/{repo}/branches/{branch}/protection/required_status_checks/contexts\", {}, {\n      mapToData: \"contexts\"\n    }],\n    setTeamAccessRestrictions: [\"PUT /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/teams\", {}, {\n      mapToData: \"teams\"\n    }],\n    setUserAccessRestrictions: [\"PUT /repos/{owner}/{repo}/branches/{branch}/protection/restrictions/users\", {}, {\n      mapToData: \"users\"\n    }],\n    testPushWebhook: [\"POST /repos/{owner}/{repo}/hooks/{hook_id}/tests\"],\n    
transfer: [\"POST /repos/{owner}/{repo}/transfer\"],\n    update: [\"PATCH /repos/{owner}/{repo}\"],\n    updateBranchProtection: [\"PUT /repos/{owner}/{repo}/branches/{branch}/protection\"],\n    updateCommitComment: [\"PATCH /repos/{owner}/{repo}/comments/{comment_id}\"],\n    updateInformationAboutPagesSite: [\"PUT /repos/{owner}/{repo}/pages\"],\n    updateInvitation: [\"PATCH /repos/{owner}/{repo}/invitations/{invitation_id}\"],\n    updatePullRequestReviewProtection: [\"PATCH /repos/{owner}/{repo}/branches/{branch}/protection/required_pull_request_reviews\"],\n    updateRelease: [\"PATCH /repos/{owner}/{repo}/releases/{release_id}\"],\n    updateReleaseAsset: [\"PATCH /repos/{owner}/{repo}/releases/assets/{asset_id}\"],\n    updateStatusCheckProtection: [\"PATCH /repos/{owner}/{repo}/branches/{branch}/protection/required_status_checks\"],\n    updateWebhook: [\"PATCH /repos/{owner}/{repo}/hooks/{hook_id}\"],\n    uploadReleaseAsset: [\"POST /repos/{owner}/{repo}/releases/{release_id}/assets{?name,label}\", {\n      baseUrl: \"https://uploads.github.com\"\n    }]\n  },\n  search: {\n    code: [\"GET /search/code\"],\n    commits: [\"GET /search/commits\", {\n      mediaType: {\n        previews: [\"cloak\"]\n      }\n    }],\n    issuesAndPullRequests: [\"GET /search/issues\"],\n    labels: [\"GET /search/labels\"],\n    repos: [\"GET /search/repositories\"],\n    topics: [\"GET /search/topics\"],\n    users: [\"GET /search/users\"]\n  },\n  teams: {\n    addOrUpdateMembershipForUserInOrg: [\"PUT /orgs/{org}/teams/{team_slug}/memberships/{username}\"],\n    addOrUpdateProjectPermissionsInOrg: [\"PUT /orgs/{org}/teams/{team_slug}/projects/{project_id}\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    addOrUpdateRepoPermissionsInOrg: [\"PUT /orgs/{org}/teams/{team_slug}/repos/{owner}/{repo}\"],\n    checkPermissionsForProjectInOrg: [\"GET /orgs/{org}/teams/{team_slug}/projects/{project_id}\", {\n      mediaType: {\n        previews: 
[\"inertia\"]\n      }\n    }],\n    checkPermissionsForRepoInOrg: [\"GET /orgs/{org}/teams/{team_slug}/repos/{owner}/{repo}\"],\n    create: [\"POST /orgs/{org}/teams\"],\n    createDiscussionCommentInOrg: [\"POST /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/comments\"],\n    createDiscussionInOrg: [\"POST /orgs/{org}/teams/{team_slug}/discussions\"],\n    deleteDiscussionCommentInOrg: [\"DELETE /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/comments/{comment_number}\"],\n    deleteDiscussionInOrg: [\"DELETE /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}\"],\n    deleteInOrg: [\"DELETE /orgs/{org}/teams/{team_slug}\"],\n    getByName: [\"GET /orgs/{org}/teams/{team_slug}\"],\n    getDiscussionCommentInOrg: [\"GET /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/comments/{comment_number}\"],\n    getDiscussionInOrg: [\"GET /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}\"],\n    getMembershipForUserInOrg: [\"GET /orgs/{org}/teams/{team_slug}/memberships/{username}\"],\n    list: [\"GET /orgs/{org}/teams\"],\n    listChildInOrg: [\"GET /orgs/{org}/teams/{team_slug}/teams\"],\n    listDiscussionCommentsInOrg: [\"GET /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/comments\"],\n    listDiscussionsInOrg: [\"GET /orgs/{org}/teams/{team_slug}/discussions\"],\n    listForAuthenticatedUser: [\"GET /user/teams\"],\n    listMembersInOrg: [\"GET /orgs/{org}/teams/{team_slug}/members\"],\n    listPendingInvitationsInOrg: [\"GET /orgs/{org}/teams/{team_slug}/invitations\"],\n    listProjectsInOrg: [\"GET /orgs/{org}/teams/{team_slug}/projects\", {\n      mediaType: {\n        previews: [\"inertia\"]\n      }\n    }],\n    listReposInOrg: [\"GET /orgs/{org}/teams/{team_slug}/repos\"],\n    removeMembershipForUserInOrg: [\"DELETE /orgs/{org}/teams/{team_slug}/memberships/{username}\"],\n    removeProjectInOrg: [\"DELETE /orgs/{org}/teams/{team_slug}/projects/{project_id}\"],\n    
removeRepoInOrg: [\"DELETE /orgs/{org}/teams/{team_slug}/repos/{owner}/{repo}\"],\n    updateDiscussionCommentInOrg: [\"PATCH /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}/comments/{comment_number}\"],\n    updateDiscussionInOrg: [\"PATCH /orgs/{org}/teams/{team_slug}/discussions/{discussion_number}\"],\n    updateInOrg: [\"PATCH /orgs/{org}/teams/{team_slug}\"]\n  },\n  users: {\n    addEmailForAuthenticated: [\"POST /user/emails\"],\n    block: [\"PUT /user/blocks/{username}\"],\n    checkBlocked: [\"GET /user/blocks/{username}\"],\n    checkFollowingForUser: [\"GET /users/{username}/following/{target_user}\"],\n    checkPersonIsFollowedByAuthenticated: [\"GET /user/following/{username}\"],\n    createGpgKeyForAuthenticated: [\"POST /user/gpg_keys\"],\n    createPublicSshKeyForAuthenticated: [\"POST /user/keys\"],\n    deleteEmailForAuthenticated: [\"DELETE /user/emails\"],\n    deleteGpgKeyForAuthenticated: [\"DELETE /user/gpg_keys/{gpg_key_id}\"],\n    deletePublicSshKeyForAuthenticated: [\"DELETE /user/keys/{key_id}\"],\n    follow: [\"PUT /user/following/{username}\"],\n    getAuthenticated: [\"GET /user\"],\n    getByUsername: [\"GET /users/{username}\"],\n    getContextForUser: [\"GET /users/{username}/hovercard\"],\n    getGpgKeyForAuthenticated: [\"GET /user/gpg_keys/{gpg_key_id}\"],\n    getPublicSshKeyForAuthenticated: [\"GET /user/keys/{key_id}\"],\n    list: [\"GET /users\"],\n    listBlockedByAuthenticated: [\"GET /user/blocks\"],\n    listEmailsForAuthenticated: [\"GET /user/emails\"],\n    listFollowedByAuthenticated: [\"GET /user/following\"],\n    listFollowersForAuthenticatedUser: [\"GET /user/followers\"],\n    listFollowersForUser: [\"GET /users/{username}/followers\"],\n    listFollowingForUser: [\"GET /users/{username}/following\"],\n    listGpgKeysForAuthenticated: [\"GET /user/gpg_keys\"],\n    listGpgKeysForUser: [\"GET /users/{username}/gpg_keys\"],\n    listPublicEmailsForAuthenticated: [\"GET /user/public_emails\"],\n   
 listPublicKeysForUser: [\"GET /users/{username}/keys\"],\n    listPublicSshKeysForAuthenticated: [\"GET /user/keys\"],\n    setPrimaryEmailVisibilityForAuthenticated: [\"PATCH /user/email/visibility\"],\n    unblock: [\"DELETE /user/blocks/{username}\"],\n    unfollow: [\"DELETE /user/following/{username}\"],\n    updateAuthenticated: [\"PATCH /user\"]\n  }\n};\n\nconst VERSION = \"4.0.0\";\n\nfunction endpointsToMethods(octokit, endpointsMap) {\n  const newMethods = {};\n\n  for (const [scope, endpoints] of Object.entries(endpointsMap)) {\n    for (const [methodName, endpoint] of Object.entries(endpoints)) {\n      const [route, defaults, decorations] = endpoint;\n      const [method, url] = route.split(/ /);\n      const endpointDefaults = Object.assign({\n        method,\n        url\n      }, defaults);\n\n      if (!newMethods[scope]) {\n        newMethods[scope] = {};\n      }\n\n      const scopeMethods = newMethods[scope];\n\n      if (decorations) {\n        scopeMethods[methodName] = decorate(octokit, scope, methodName, endpointDefaults, decorations);\n        continue;\n      }\n\n      scopeMethods[methodName] = octokit.request.defaults(endpointDefaults);\n    }\n  }\n\n  return newMethods;\n}\n\nfunction decorate(octokit, scope, methodName, defaults, decorations) {\n  const requestWithDefaults = octokit.request.defaults(defaults);\n  /* istanbul ignore next */\n\n  function withDecorations(...args) {\n    // @ts-ignore https://github.com/microsoft/TypeScript/issues/25488\n    let options = requestWithDefaults.endpoint.merge(...args); // There are currently no other decorations than `.mapToData`\n\n    if (decorations.mapToData) {\n      options = Object.assign({}, options, {\n        data: options[decorations.mapToData],\n        [decorations.mapToData]: undefined\n      });\n      return requestWithDefaults(options);\n    }\n\n    if (decorations.renamed) {\n      const [newScope, newMethodName] = decorations.renamed;\n      
octokit.log.warn(`octokit.${scope}.${methodName}() has been renamed to octokit.${newScope}.${newMethodName}()`);\n    }\n\n    if (decorations.deprecated) {\n      octokit.log.warn(decorations.deprecated);\n    }\n\n    if (decorations.renamedParameters) {\n      // @ts-ignore https://github.com/microsoft/TypeScript/issues/25488\n      const options = requestWithDefaults.endpoint.merge(...args);\n\n      for (const [name, alias] of Object.entries(decorations.renamedParameters)) {\n        if (name in options) {\n          octokit.log.warn(`\"${name}\" parameter is deprecated for \"octokit.${scope}.${methodName}()\". Use \"${alias}\" instead`);\n\n          if (!(alias in options)) {\n            options[alias] = options[name];\n          }\n\n          delete options[name];\n        }\n      }\n\n      return requestWithDefaults(options);\n    } // @ts-ignore https://github.com/microsoft/TypeScript/issues/25488\n\n\n    return requestWithDefaults(...args);\n  }\n\n  return Object.assign(withDecorations, requestWithDefaults);\n}\n\n/**\n * This plugin is a 1:1 copy of internal @octokit/rest plugins. The primary\n * goal is to rebuild @octokit/rest on top of @octokit/core. Once that is\n * done, we will remove the registerEndpoints methods and return the methods\n * directly as with the other plugins. 
At that point we will also remove the\n * legacy workarounds and deprecations.\n *\n * See the plan at\n * https://github.com/octokit/plugin-rest-endpoint-methods.js/pull/1\n */\n\nfunction restEndpointMethods(octokit) {\n  return endpointsToMethods(octokit, Endpoints);\n}\nrestEndpointMethods.VERSION = VERSION;\n\nexports.restEndpointMethods = restEndpointMethods;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 537:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nfunction _interopDefault (ex) { return (ex && (typeof ex === 'object') && 'default' in ex) ? ex['default'] : ex; }\n\nvar deprecation = __webpack_require__(8932);\nvar once = _interopDefault(__webpack_require__(1223));\n\nconst logOnce = once(deprecation => console.warn(deprecation));\n/**\n * Error with extra properties to help with debugging\n */\n\nclass RequestError extends Error {\n  constructor(message, statusCode, options) {\n    super(message); // Maintains proper stack trace (only available on V8)\n\n    /* istanbul ignore next */\n\n    if (Error.captureStackTrace) {\n      Error.captureStackTrace(this, this.constructor);\n    }\n\n    this.name = \"HttpError\";\n    this.status = statusCode;\n    Object.defineProperty(this, \"code\", {\n      get() {\n        logOnce(new deprecation.Deprecation(\"[@octokit/request-error] `error.code` is deprecated, use `error.status`.\"));\n        return statusCode;\n      }\n\n    });\n    this.headers = options.headers || {}; // redact request credentials without mutating original request options\n\n    const requestCopy = Object.assign({}, options.request);\n\n    if (options.request.headers.authorization) {\n      requestCopy.headers = Object.assign({}, options.request.headers, {\n        authorization: options.request.headers.authorization.replace(/ .*$/, \" [REDACTED]\")\n      });\n    }\n\n    requestCopy.url = requestCopy.url 
// client_id & client_secret can be passed as URL query parameters to increase rate limit\n    // see https://developer.github.com/v3/#increasing-the-unauthenticated-rate-limit-for-oauth-applications\n    .replace(/\\bclient_secret=\\w+/g, \"client_secret=[REDACTED]\") // OAuth tokens can be passed as URL query parameters, although it is not recommended\n    // see https://developer.github.com/v3/#oauth2-token-sent-in-a-header\n    .replace(/\\baccess_token=\\w+/g, \"access_token=[REDACTED]\");\n    this.request = requestCopy;\n  }\n\n}\n\nexports.RequestError = RequestError;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 6234:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nfunction _interopDefault (ex) { return (ex && (typeof ex === 'object') && 'default' in ex) ? ex['default'] : ex; }\n\nvar endpoint = __webpack_require__(9440);\nvar universalUserAgent = __webpack_require__(5030);\nvar isPlainObject = _interopDefault(__webpack_require__(8840));\nvar nodeFetch = _interopDefault(__webpack_require__(467));\nvar requestError = __webpack_require__(537);\n\nconst VERSION = \"5.4.5\";\n\nfunction getBufferResponse(response) {\n  return response.arrayBuffer();\n}\n\nfunction fetchWrapper(requestOptions) {\n  if (isPlainObject(requestOptions.body) || Array.isArray(requestOptions.body)) {\n    requestOptions.body = JSON.stringify(requestOptions.body);\n  }\n\n  let headers = {};\n  let status;\n  let url;\n  const fetch = requestOptions.request && requestOptions.request.fetch || nodeFetch;\n  return fetch(requestOptions.url, Object.assign({\n    method: requestOptions.method,\n    body: requestOptions.body,\n    headers: requestOptions.headers,\n    redirect: requestOptions.redirect\n  }, requestOptions.request)).then(response => {\n    url = response.url;\n    status = response.status;\n\n    for (const keyAndValue of response.headers) {\n      
headers[keyAndValue[0]] = keyAndValue[1];\n    }\n\n    if (status === 204 || status === 205) {\n      return;\n    } // GitHub API returns 200 for HEAD requests\n\n\n    if (requestOptions.method === \"HEAD\") {\n      if (status < 400) {\n        return;\n      }\n\n      throw new requestError.RequestError(response.statusText, status, {\n        headers,\n        request: requestOptions\n      });\n    }\n\n    if (status === 304) {\n      throw new requestError.RequestError(\"Not modified\", status, {\n        headers,\n        request: requestOptions\n      });\n    }\n\n    if (status >= 400) {\n      return response.text().then(message => {\n        const error = new requestError.RequestError(message, status, {\n          headers,\n          request: requestOptions\n        });\n\n        try {\n          let responseBody = JSON.parse(error.message);\n          Object.assign(error, responseBody);\n          let errors = responseBody.errors; // Assumption `errors` would always be in Array format\n\n          error.message = error.message + \": \" + errors.map(JSON.stringify).join(\", \");\n        } catch (e) {// ignore, see octokit/rest.js#684\n        }\n\n        throw error;\n      });\n    }\n\n    const contentType = response.headers.get(\"content-type\");\n\n    if (/application\\/json/.test(contentType)) {\n      return response.json();\n    }\n\n    if (!contentType || /^text\\/|charset=utf-8$/.test(contentType)) {\n      return response.text();\n    }\n\n    return getBufferResponse(response);\n  }).then(data => {\n    return {\n      status,\n      url,\n      headers,\n      data\n    };\n  }).catch(error => {\n    if (error instanceof requestError.RequestError) {\n      throw error;\n    }\n\n    throw new requestError.RequestError(error.message, 500, {\n      headers,\n      request: requestOptions\n    });\n  });\n}\n\nfunction withDefaults(oldEndpoint, newDefaults) {\n  const endpoint = oldEndpoint.defaults(newDefaults);\n\n  const newApi = 
function (route, parameters) {\n    const endpointOptions = endpoint.merge(route, parameters);\n\n    if (!endpointOptions.request || !endpointOptions.request.hook) {\n      return fetchWrapper(endpoint.parse(endpointOptions));\n    }\n\n    const request = (route, parameters) => {\n      return fetchWrapper(endpoint.parse(endpoint.merge(route, parameters)));\n    };\n\n    Object.assign(request, {\n      endpoint,\n      defaults: withDefaults.bind(null, endpoint)\n    });\n    return endpointOptions.request.hook(request, endpointOptions);\n  };\n\n  return Object.assign(newApi, {\n    endpoint,\n    defaults: withDefaults.bind(null, endpoint)\n  });\n}\n\nconst request = withDefaults(endpoint.endpoint, {\n  headers: {\n    \"user-agent\": `octokit-request.js/${VERSION} ${universalUserAgent.getUserAgent()}`\n  }\n});\n\nexports.request = request;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 3682:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nvar register = __webpack_require__(4670)\nvar addHook = __webpack_require__(5549)\nvar removeHook = __webpack_require__(6819)\n\n// bind with array of arguments: https://stackoverflow.com/a/21792913\nvar bind = Function.bind\nvar bindable = bind.bind(bind)\n\nfunction bindApi (hook, state, name) {\n  var removeHookRef = bindable(removeHook, null).apply(null, name ? [state, name] : [state])\n  hook.api = { remove: removeHookRef }\n  hook.remove = removeHookRef\n\n  ;['before', 'error', 'after', 'wrap'].forEach(function (kind) {\n    var args = name ? 
[state, kind, name] : [state, kind]\n    hook[kind] = hook.api[kind] = bindable(addHook, null).apply(null, args)\n  })\n}\n\nfunction HookSingular () {\n  var singularHookName = 'h'\n  var singularHookState = {\n    registry: {}\n  }\n  var singularHook = register.bind(null, singularHookState, singularHookName)\n  bindApi(singularHook, singularHookState, singularHookName)\n  return singularHook\n}\n\nfunction HookCollection () {\n  var state = {\n    registry: {}\n  }\n\n  var hook = register.bind(null, state)\n  bindApi(hook, state)\n\n  return hook\n}\n\nvar collectionHookDeprecationMessageDisplayed = false\nfunction Hook () {\n  if (!collectionHookDeprecationMessageDisplayed) {\n    console.warn('[before-after-hook]: \"Hook()\" repurposing warning, use \"Hook.Collection()\". Read more: https://git.io/upgrade-before-after-hook-to-1.4')\n    collectionHookDeprecationMessageDisplayed = true\n  }\n  return HookCollection()\n}\n\nHook.Singular = HookSingular.bind()\nHook.Collection = HookCollection.bind()\n\nmodule.exports = Hook\n// expose constructors as a named property for TypeScript\nmodule.exports.Hook = Hook\nmodule.exports.Singular = Hook.Singular\nmodule.exports.Collection = Hook.Collection\n\n\n/***/ }),\n\n/***/ 5549:\n/***/ ((module) => {\n\nmodule.exports = addHook\n\nfunction addHook (state, kind, name, hook) {\n  var orig = hook\n  if (!state.registry[name]) {\n    state.registry[name] = []\n  }\n\n  if (kind === 'before') {\n    hook = function (method, options) {\n      return Promise.resolve()\n        .then(orig.bind(null, options))\n        .then(method.bind(null, options))\n    }\n  }\n\n  if (kind === 'after') {\n    hook = function (method, options) {\n      var result\n      return Promise.resolve()\n        .then(method.bind(null, options))\n        .then(function (result_) {\n          result = result_\n          return orig(result, options)\n        })\n        .then(function () {\n          return result\n        })\n    }\n  }\n\n  if 
(kind === 'error') {\n    hook = function (method, options) {\n      return Promise.resolve()\n        .then(method.bind(null, options))\n        .catch(function (error) {\n          return orig(error, options)\n        })\n    }\n  }\n\n  state.registry[name].push({\n    hook: hook,\n    orig: orig\n  })\n}\n\n\n/***/ }),\n\n/***/ 4670:\n/***/ ((module) => {\n\nmodule.exports = register\n\nfunction register (state, name, method, options) {\n  if (typeof method !== 'function') {\n    throw new Error('method for before hook must be a function')\n  }\n\n  if (!options) {\n    options = {}\n  }\n\n  if (Array.isArray(name)) {\n    return name.reverse().reduce(function (callback, name) {\n      return register.bind(null, state, name, callback, options)\n    }, method)()\n  }\n\n  return Promise.resolve()\n    .then(function () {\n      if (!state.registry[name]) {\n        return method(options)\n      }\n\n      return (state.registry[name]).reduce(function (method, registered) {\n        return registered.hook.bind(null, method, options)\n      }, method)()\n    })\n}\n\n\n/***/ }),\n\n/***/ 6819:\n/***/ ((module) => {\n\nmodule.exports = removeHook\n\nfunction removeHook (state, name, method) {\n  if (!state.registry[name]) {\n    return\n  }\n\n  var index = state.registry[name]\n    .map(function (registered) { return registered.orig })\n    .indexOf(method)\n\n  if (index === -1) {\n    return\n  }\n\n  state.registry[name].splice(index, 1)\n}\n\n\n/***/ }),\n\n/***/ 2746:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nconst cp = __webpack_require__(3129);\nconst parse = __webpack_require__(6855);\nconst enoent = __webpack_require__(4101);\n\nfunction spawn(command, args, options) {\n    // Parse the arguments\n    const parsed = parse(command, args, options);\n\n    // Spawn the child process\n    const spawned = cp.spawn(parsed.command, parsed.args, parsed.options);\n\n    // Hook into child process \"exit\" event 
to emit an error if the command\n    // does not exists, see: https://github.com/IndigoUnited/node-cross-spawn/issues/16\n    enoent.hookChildProcess(spawned, parsed);\n\n    return spawned;\n}\n\nfunction spawnSync(command, args, options) {\n    // Parse the arguments\n    const parsed = parse(command, args, options);\n\n    // Spawn the child process\n    const result = cp.spawnSync(parsed.command, parsed.args, parsed.options);\n\n    // Analyze if the command does not exist, see: https://github.com/IndigoUnited/node-cross-spawn/issues/16\n    result.error = result.error || enoent.verifyENOENTSync(result.status, parsed);\n\n    return result;\n}\n\nmodule.exports = spawn;\nmodule.exports.spawn = spawn;\nmodule.exports.sync = spawnSync;\n\nmodule.exports._parse = parse;\nmodule.exports._enoent = enoent;\n\n\n/***/ }),\n\n/***/ 4101:\n/***/ ((module) => {\n\n\"use strict\";\n\n\nconst isWin = process.platform === 'win32';\n\nfunction notFoundError(original, syscall) {\n    return Object.assign(new Error(`${syscall} ${original.command} ENOENT`), {\n        code: 'ENOENT',\n        errno: 'ENOENT',\n        syscall: `${syscall} ${original.command}`,\n        path: original.command,\n        spawnargs: original.args,\n    });\n}\n\nfunction hookChildProcess(cp, parsed) {\n    if (!isWin) {\n        return;\n    }\n\n    const originalEmit = cp.emit;\n\n    cp.emit = function (name, arg1) {\n        // If emitting \"exit\" event and exit code is 1, we need to check if\n        // the command exists and emit an \"error\" instead\n        // See https://github.com/IndigoUnited/node-cross-spawn/issues/16\n        if (name === 'exit') {\n            const err = verifyENOENT(arg1, parsed, 'spawn');\n\n            if (err) {\n                return originalEmit.call(cp, 'error', err);\n            }\n        }\n\n        return originalEmit.apply(cp, arguments); // eslint-disable-line prefer-rest-params\n    };\n}\n\nfunction verifyENOENT(status, parsed) {\n    if (isWin && 
status === 1 && !parsed.file) {\n        return notFoundError(parsed.original, 'spawn');\n    }\n\n    return null;\n}\n\nfunction verifyENOENTSync(status, parsed) {\n    if (isWin && status === 1 && !parsed.file) {\n        return notFoundError(parsed.original, 'spawnSync');\n    }\n\n    return null;\n}\n\nmodule.exports = {\n    hookChildProcess,\n    verifyENOENT,\n    verifyENOENTSync,\n    notFoundError,\n};\n\n\n/***/ }),\n\n/***/ 6855:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nconst path = __webpack_require__(5622);\nconst niceTry = __webpack_require__(8560);\nconst resolveCommand = __webpack_require__(7274);\nconst escape = __webpack_require__(4274);\nconst readShebang = __webpack_require__(1252);\nconst semver = __webpack_require__(5911);\n\nconst isWin = process.platform === 'win32';\nconst isExecutableRegExp = /\\.(?:com|exe)$/i;\nconst isCmdShimRegExp = /node_modules[\\\\/].bin[\\\\/][^\\\\/]+\\.cmd$/i;\n\n// `options.shell` is supported in Node ^4.8.0, ^5.7.0 and >= 6.0.0\nconst supportsShellOption = niceTry(() => semver.satisfies(process.version, '^4.8.0 || ^5.7.0 || >= 6.0.0', true)) || false;\n\nfunction detectShebang(parsed) {\n    parsed.file = resolveCommand(parsed);\n\n    const shebang = parsed.file && readShebang(parsed.file);\n\n    if (shebang) {\n        parsed.args.unshift(parsed.file);\n        parsed.command = shebang;\n\n        return resolveCommand(parsed);\n    }\n\n    return parsed.file;\n}\n\nfunction parseNonShell(parsed) {\n    if (!isWin) {\n        return parsed;\n    }\n\n    // Detect & add support for shebangs\n    const commandFile = detectShebang(parsed);\n\n    // We don't need a shell if the command filename is an executable\n    const needsShell = !isExecutableRegExp.test(commandFile);\n\n    // If a shell is required, use cmd.exe and take care of escaping everything correctly\n    // Note that `forceShell` is an hidden option used only in tests\n    if 
(parsed.options.forceShell || needsShell) {\n        // Need to double escape meta chars if the command is a cmd-shim located in `node_modules/.bin/`\n        // The cmd-shim simply calls execute the package bin file with NodeJS, proxying any argument\n        // Because the escape of metachars with ^ gets interpreted when the cmd.exe is first called,\n        // we need to double escape them\n        const needsDoubleEscapeMetaChars = isCmdShimRegExp.test(commandFile);\n\n        // Normalize posix paths into OS compatible paths (e.g.: foo/bar -> foo\\bar)\n        // This is necessary otherwise it will always fail with ENOENT in those cases\n        parsed.command = path.normalize(parsed.command);\n\n        // Escape command & arguments\n        parsed.command = escape.command(parsed.command);\n        parsed.args = parsed.args.map((arg) => escape.argument(arg, needsDoubleEscapeMetaChars));\n\n        const shellCommand = [parsed.command].concat(parsed.args).join(' ');\n\n        parsed.args = ['/d', '/s', '/c', `\"${shellCommand}\"`];\n        parsed.command = process.env.comspec || 'cmd.exe';\n        parsed.options.windowsVerbatimArguments = true; // Tell node's spawn that the arguments are already escaped\n    }\n\n    return parsed;\n}\n\nfunction parseShell(parsed) {\n    // If node supports the shell option, there's no need to mimic its behavior\n    if (supportsShellOption) {\n        return parsed;\n    }\n\n    // Mimic node shell option\n    // See https://github.com/nodejs/node/blob/b9f6a2dc059a1062776133f3d4fd848c4da7d150/lib/child_process.js#L335\n    const shellCommand = [parsed.command].concat(parsed.args).join(' ');\n\n    if (isWin) {\n        parsed.command = typeof parsed.options.shell === 'string' ? 
parsed.options.shell : process.env.comspec || 'cmd.exe';\n        parsed.args = ['/d', '/s', '/c', `\"${shellCommand}\"`];\n        parsed.options.windowsVerbatimArguments = true; // Tell node's spawn that the arguments are already escaped\n    } else {\n        if (typeof parsed.options.shell === 'string') {\n            parsed.command = parsed.options.shell;\n        } else if (process.platform === 'android') {\n            parsed.command = '/system/bin/sh';\n        } else {\n            parsed.command = '/bin/sh';\n        }\n\n        parsed.args = ['-c', shellCommand];\n    }\n\n    return parsed;\n}\n\nfunction parse(command, args, options) {\n    // Normalize arguments, similar to nodejs\n    if (args && !Array.isArray(args)) {\n        options = args;\n        args = null;\n    }\n\n    args = args ? args.slice(0) : []; // Clone array to avoid changing the original\n    options = Object.assign({}, options); // Clone object to avoid changing the original\n\n    // Build our parsed object\n    const parsed = {\n        command,\n        args,\n        options,\n        file: undefined,\n        original: {\n            command,\n            args,\n        },\n    };\n\n    // Delegate further parsing to shell or non-shell\n    return options.shell ? 
parseShell(parsed) : parseNonShell(parsed);\n}\n\nmodule.exports = parse;\n\n\n/***/ }),\n\n/***/ 4274:\n/***/ ((module) => {\n\n\"use strict\";\n\n\n// See http://www.robvanderwoude.com/escapechars.php\nconst metaCharsRegExp = /([()\\][%!^\"`<>&|;, *?])/g;\n\nfunction escapeCommand(arg) {\n    // Escape meta chars\n    arg = arg.replace(metaCharsRegExp, '^$1');\n\n    return arg;\n}\n\nfunction escapeArgument(arg, doubleEscapeMetaChars) {\n    // Convert to string\n    arg = `${arg}`;\n\n    // Algorithm below is based on https://qntm.org/cmd\n\n    // Sequence of backslashes followed by a double quote:\n    // double up all the backslashes and escape the double quote\n    arg = arg.replace(/(\\\\*)\"/g, '$1$1\\\\\"');\n\n    // Sequence of backslashes followed by the end of the string\n    // (which will become a double quote later):\n    // double up all the backslashes\n    arg = arg.replace(/(\\\\*)$/, '$1$1');\n\n    // All other backslashes occur literally\n\n    // Quote the whole thing:\n    arg = `\"${arg}\"`;\n\n    // Escape meta chars\n    arg = arg.replace(metaCharsRegExp, '^$1');\n\n    // Double escape meta chars if necessary\n    if (doubleEscapeMetaChars) {\n        arg = arg.replace(metaCharsRegExp, '^$1');\n    }\n\n    return arg;\n}\n\nmodule.exports.command = escapeCommand;\nmodule.exports.argument = escapeArgument;\n\n\n/***/ }),\n\n/***/ 1252:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nconst fs = __webpack_require__(5747);\nconst shebangCommand = __webpack_require__(7032);\n\nfunction readShebang(command) {\n    // Read the first 150 bytes from the file\n    const size = 150;\n    let buffer;\n\n    if (Buffer.alloc) {\n        // Node.js v4.5+ / v5.10+\n        buffer = Buffer.alloc(size);\n    } else {\n        // Old Node.js API\n        buffer = new Buffer(size);\n        buffer.fill(0); // zero-fill\n    }\n\n    let fd;\n\n    try {\n        fd = fs.openSync(command, 'r');\n        
fs.readSync(fd, buffer, 0, size, 0);\n        fs.closeSync(fd);\n    } catch (e) { /* Empty */ }\n\n    // Attempt to extract shebang (null is returned if not a shebang)\n    return shebangCommand(buffer.toString());\n}\n\nmodule.exports = readShebang;\n\n\n/***/ }),\n\n/***/ 7274:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nconst path = __webpack_require__(5622);\nconst which = __webpack_require__(4207);\nconst pathKey = __webpack_require__(539)();\n\nfunction resolveCommandAttempt(parsed, withoutPathExt) {\n    const cwd = process.cwd();\n    const hasCustomCwd = parsed.options.cwd != null;\n\n    // If a custom `cwd` was specified, we need to change the process cwd\n    // because `which` will do stat calls but does not support a custom cwd\n    if (hasCustomCwd) {\n        try {\n            process.chdir(parsed.options.cwd);\n        } catch (err) {\n            /* Empty */\n        }\n    }\n\n    let resolved;\n\n    try {\n        resolved = which.sync(parsed.command, {\n            path: (parsed.options.env || process.env)[pathKey],\n            pathExt: withoutPathExt ? path.delimiter : undefined,\n        });\n    } catch (e) {\n        /* Empty */\n    } finally {\n        process.chdir(cwd);\n    }\n\n    // If we successfully resolved, ensure that an absolute path is returned\n    // Note that when a custom `cwd` was used, we need to resolve to an absolute path based on it\n    if (resolved) {\n        resolved = path.resolve(hasCustomCwd ? 
parsed.options.cwd : '', resolved);\n    }\n\n    return resolved;\n}\n\nfunction resolveCommand(parsed) {\n    return resolveCommandAttempt(parsed) || resolveCommandAttempt(parsed, true);\n}\n\nmodule.exports = resolveCommand;\n\n\n/***/ }),\n\n/***/ 8932:\n/***/ ((__unused_webpack_module, exports) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nclass Deprecation extends Error {\n  constructor(message) {\n    super(message); // Maintains proper stack trace (only available on V8)\n\n    /* istanbul ignore next */\n\n    if (Error.captureStackTrace) {\n      Error.captureStackTrace(this, this.constructor);\n    }\n\n    this.name = 'Deprecation';\n  }\n\n}\n\nexports.Deprecation = Deprecation;\n\n\n/***/ }),\n\n/***/ 1205:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nvar once = __webpack_require__(1223);\n\nvar noop = function() {};\n\nvar isRequest = function(stream) {\n\treturn stream.setHeader && typeof stream.abort === 'function';\n};\n\nvar isChildProcess = function(stream) {\n\treturn stream.stdio && Array.isArray(stream.stdio) && stream.stdio.length === 3\n};\n\nvar eos = function(stream, opts, callback) {\n\tif (typeof opts === 'function') return eos(stream, null, opts);\n\tif (!opts) opts = {};\n\n\tcallback = once(callback || noop);\n\n\tvar ws = stream._writableState;\n\tvar rs = stream._readableState;\n\tvar readable = opts.readable || (opts.readable !== false && stream.readable);\n\tvar writable = opts.writable || (opts.writable !== false && stream.writable);\n\tvar cancelled = false;\n\n\tvar onlegacyfinish = function() {\n\t\tif (!stream.writable) onfinish();\n\t};\n\n\tvar onfinish = function() {\n\t\twritable = false;\n\t\tif (!readable) callback.call(stream);\n\t};\n\n\tvar onend = function() {\n\t\treadable = false;\n\t\tif (!writable) callback.call(stream);\n\t};\n\n\tvar onexit = function(exitCode) {\n\t\tcallback.call(stream, exitCode ? 
new Error('exited with error code: ' + exitCode) : null);\n\t};\n\n\tvar onerror = function(err) {\n\t\tcallback.call(stream, err);\n\t};\n\n\tvar onclose = function() {\n\t\tprocess.nextTick(onclosenexttick);\n\t};\n\n\tvar onclosenexttick = function() {\n\t\tif (cancelled) return;\n\t\tif (readable && !(rs && (rs.ended && !rs.destroyed))) return callback.call(stream, new Error('premature close'));\n\t\tif (writable && !(ws && (ws.ended && !ws.destroyed))) return callback.call(stream, new Error('premature close'));\n\t};\n\n\tvar onrequest = function() {\n\t\tstream.req.on('finish', onfinish);\n\t};\n\n\tif (isRequest(stream)) {\n\t\tstream.on('complete', onfinish);\n\t\tstream.on('abort', onclose);\n\t\tif (stream.req) onrequest();\n\t\telse stream.on('request', onrequest);\n\t} else if (writable && !ws) { // legacy streams\n\t\tstream.on('end', onlegacyfinish);\n\t\tstream.on('close', onlegacyfinish);\n\t}\n\n\tif (isChildProcess(stream)) stream.on('exit', onexit);\n\n\tstream.on('end', onend);\n\tstream.on('finish', onfinish);\n\tif (opts.error !== false) stream.on('error', onerror);\n\tstream.on('close', onclose);\n\n\treturn function() {\n\t\tcancelled = true;\n\t\tstream.removeListener('complete', onfinish);\n\t\tstream.removeListener('abort', onclose);\n\t\tstream.removeListener('request', onrequest);\n\t\tif (stream.req) stream.req.removeListener('finish', onfinish);\n\t\tstream.removeListener('end', onlegacyfinish);\n\t\tstream.removeListener('close', onlegacyfinish);\n\t\tstream.removeListener('finish', onfinish);\n\t\tstream.removeListener('exit', onexit);\n\t\tstream.removeListener('end', onend);\n\t\tstream.removeListener('error', onerror);\n\t\tstream.removeListener('close', onclose);\n\t};\n};\n\nmodule.exports = eos;\n\n\n/***/ }),\n\n/***/ 5447:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\nconst path = __webpack_require__(5622);\nconst childProcess = __webpack_require__(3129);\nconst crossSpawn = 
__webpack_require__(2746);\nconst stripEof = __webpack_require__(5515);\nconst npmRunPath = __webpack_require__(502);\nconst isStream = __webpack_require__(1554);\nconst _getStream = __webpack_require__(1766);\nconst pFinally = __webpack_require__(1330);\nconst onExit = __webpack_require__(4931);\nconst errname = __webpack_require__(4689);\nconst stdio = __webpack_require__(166);\n\nconst TEN_MEGABYTES = 1000 * 1000 * 10;\n\nfunction handleArgs(cmd, args, opts) {\n\tlet parsed;\n\n\topts = Object.assign({\n\t\textendEnv: true,\n\t\tenv: {}\n\t}, opts);\n\n\tif (opts.extendEnv) {\n\t\topts.env = Object.assign({}, process.env, opts.env);\n\t}\n\n\tif (opts.__winShell === true) {\n\t\tdelete opts.__winShell;\n\t\tparsed = {\n\t\t\tcommand: cmd,\n\t\t\targs,\n\t\t\toptions: opts,\n\t\t\tfile: cmd,\n\t\t\toriginal: {\n\t\t\t\tcmd,\n\t\t\t\targs\n\t\t\t}\n\t\t};\n\t} else {\n\t\tparsed = crossSpawn._parse(cmd, args, opts);\n\t}\n\n\topts = Object.assign({\n\t\tmaxBuffer: TEN_MEGABYTES,\n\t\tbuffer: true,\n\t\tstripEof: true,\n\t\tpreferLocal: true,\n\t\tlocalDir: parsed.options.cwd || process.cwd(),\n\t\tencoding: 'utf8',\n\t\treject: true,\n\t\tcleanup: true\n\t}, parsed.options);\n\n\topts.stdio = stdio(opts);\n\n\tif (opts.preferLocal) {\n\t\topts.env = npmRunPath.env(Object.assign({}, opts, {cwd: opts.localDir}));\n\t}\n\n\tif (opts.detached) {\n\t\t// #115\n\t\topts.cleanup = false;\n\t}\n\n\tif (process.platform === 'win32' && path.basename(parsed.command) === 'cmd.exe') {\n\t\t// #116\n\t\tparsed.args.unshift('/q');\n\t}\n\n\treturn {\n\t\tcmd: parsed.command,\n\t\targs: parsed.args,\n\t\topts,\n\t\tparsed\n\t};\n}\n\nfunction handleInput(spawned, input) {\n\tif (input === null || input === undefined) {\n\t\treturn;\n\t}\n\n\tif (isStream(input)) {\n\t\tinput.pipe(spawned.stdin);\n\t} else {\n\t\tspawned.stdin.end(input);\n\t}\n}\n\nfunction handleOutput(opts, val) {\n\tif (val && opts.stripEof) {\n\t\tval = stripEof(val);\n\t}\n\n\treturn val;\n}\n\nfunction 
handleShell(fn, cmd, opts) {\n\tlet file = '/bin/sh';\n\tlet args = ['-c', cmd];\n\n\topts = Object.assign({}, opts);\n\n\tif (process.platform === 'win32') {\n\t\topts.__winShell = true;\n\t\tfile = process.env.comspec || 'cmd.exe';\n\t\targs = ['/s', '/c', `\"${cmd}\"`];\n\t\topts.windowsVerbatimArguments = true;\n\t}\n\n\tif (opts.shell) {\n\t\tfile = opts.shell;\n\t\tdelete opts.shell;\n\t}\n\n\treturn fn(file, args, opts);\n}\n\nfunction getStream(process, stream, {encoding, buffer, maxBuffer}) {\n\tif (!process[stream]) {\n\t\treturn null;\n\t}\n\n\tlet ret;\n\n\tif (!buffer) {\n\t\t// TODO: Use `ret = util.promisify(stream.finished)(process[stream]);` when targeting Node.js 10\n\t\tret = new Promise((resolve, reject) => {\n\t\t\tprocess[stream]\n\t\t\t\t.once('end', resolve)\n\t\t\t\t.once('error', reject);\n\t\t});\n\t} else if (encoding) {\n\t\tret = _getStream(process[stream], {\n\t\t\tencoding,\n\t\t\tmaxBuffer\n\t\t});\n\t} else {\n\t\tret = _getStream.buffer(process[stream], {maxBuffer});\n\t}\n\n\treturn ret.catch(err => {\n\t\terr.stream = stream;\n\t\terr.message = `${stream} ${err.message}`;\n\t\tthrow err;\n\t});\n}\n\nfunction makeError(result, options) {\n\tconst {stdout, stderr} = result;\n\n\tlet err = result.error;\n\tconst {code, signal} = result;\n\n\tconst {parsed, joinedCmd} = options;\n\tconst timedOut = options.timedOut || false;\n\n\tif (!err) {\n\t\tlet output = '';\n\n\t\tif (Array.isArray(parsed.opts.stdio)) {\n\t\t\tif (parsed.opts.stdio[2] !== 'inherit') {\n\t\t\t\toutput += output.length > 0 ? stderr : `\\n${stderr}`;\n\t\t\t}\n\n\t\t\tif (parsed.opts.stdio[1] !== 'inherit') {\n\t\t\t\toutput += `\\n${stdout}`;\n\t\t\t}\n\t\t} else if (parsed.opts.stdio !== 'inherit') {\n\t\t\toutput = `\\n${stderr}${stdout}`;\n\t\t}\n\n\t\terr = new Error(`Command failed: ${joinedCmd}${output}`);\n\t\terr.code = code < 0 ? 
errname(code) : code;\n\t}\n\n\terr.stdout = stdout;\n\terr.stderr = stderr;\n\terr.failed = true;\n\terr.signal = signal || null;\n\terr.cmd = joinedCmd;\n\terr.timedOut = timedOut;\n\n\treturn err;\n}\n\nfunction joinCmd(cmd, args) {\n\tlet joinedCmd = cmd;\n\n\tif (Array.isArray(args) && args.length > 0) {\n\t\tjoinedCmd += ' ' + args.join(' ');\n\t}\n\n\treturn joinedCmd;\n}\n\nmodule.exports = (cmd, args, opts) => {\n\tconst parsed = handleArgs(cmd, args, opts);\n\tconst {encoding, buffer, maxBuffer} = parsed.opts;\n\tconst joinedCmd = joinCmd(cmd, args);\n\n\tlet spawned;\n\ttry {\n\t\tspawned = childProcess.spawn(parsed.cmd, parsed.args, parsed.opts);\n\t} catch (err) {\n\t\treturn Promise.reject(err);\n\t}\n\n\tlet removeExitHandler;\n\tif (parsed.opts.cleanup) {\n\t\tremoveExitHandler = onExit(() => {\n\t\t\tspawned.kill();\n\t\t});\n\t}\n\n\tlet timeoutId = null;\n\tlet timedOut = false;\n\n\tconst cleanup = () => {\n\t\tif (timeoutId) {\n\t\t\tclearTimeout(timeoutId);\n\t\t\ttimeoutId = null;\n\t\t}\n\n\t\tif (removeExitHandler) {\n\t\t\tremoveExitHandler();\n\t\t}\n\t};\n\n\tif (parsed.opts.timeout > 0) {\n\t\ttimeoutId = setTimeout(() => {\n\t\t\ttimeoutId = null;\n\t\t\ttimedOut = true;\n\t\t\tspawned.kill(parsed.opts.killSignal);\n\t\t}, parsed.opts.timeout);\n\t}\n\n\tconst processDone = new Promise(resolve => {\n\t\tspawned.on('exit', (code, signal) => {\n\t\t\tcleanup();\n\t\t\tresolve({code, signal});\n\t\t});\n\n\t\tspawned.on('error', err => {\n\t\t\tcleanup();\n\t\t\tresolve({error: err});\n\t\t});\n\n\t\tif (spawned.stdin) {\n\t\t\tspawned.stdin.on('error', err => {\n\t\t\t\tcleanup();\n\t\t\t\tresolve({error: err});\n\t\t\t});\n\t\t}\n\t});\n\n\tfunction destroy() {\n\t\tif (spawned.stdout) {\n\t\t\tspawned.stdout.destroy();\n\t\t}\n\n\t\tif (spawned.stderr) {\n\t\t\tspawned.stderr.destroy();\n\t\t}\n\t}\n\n\tconst handlePromise = () => pFinally(Promise.all([\n\t\tprocessDone,\n\t\tgetStream(spawned, 'stdout', {encoding, buffer, 
maxBuffer}),\n\t\tgetStream(spawned, 'stderr', {encoding, buffer, maxBuffer})\n\t]).then(arr => {\n\t\tconst result = arr[0];\n\t\tresult.stdout = arr[1];\n\t\tresult.stderr = arr[2];\n\n\t\tif (result.error || result.code !== 0 || result.signal !== null) {\n\t\t\tconst err = makeError(result, {\n\t\t\t\tjoinedCmd,\n\t\t\t\tparsed,\n\t\t\t\ttimedOut\n\t\t\t});\n\n\t\t\t// TODO: missing some timeout logic for killed\n\t\t\t// https://github.com/nodejs/node/blob/master/lib/child_process.js#L203\n\t\t\t// err.killed = spawned.killed || killed;\n\t\t\terr.killed = err.killed || spawned.killed;\n\n\t\t\tif (!parsed.opts.reject) {\n\t\t\t\treturn err;\n\t\t\t}\n\n\t\t\tthrow err;\n\t\t}\n\n\t\treturn {\n\t\t\tstdout: handleOutput(parsed.opts, result.stdout),\n\t\t\tstderr: handleOutput(parsed.opts, result.stderr),\n\t\t\tcode: 0,\n\t\t\tfailed: false,\n\t\t\tkilled: false,\n\t\t\tsignal: null,\n\t\t\tcmd: joinedCmd,\n\t\t\ttimedOut: false\n\t\t};\n\t}), destroy);\n\n\tcrossSpawn._enoent.hookChildProcess(spawned, parsed.parsed);\n\n\thandleInput(spawned, parsed.opts.input);\n\n\tspawned.then = (onfulfilled, onrejected) => handlePromise().then(onfulfilled, onrejected);\n\tspawned.catch = onrejected => handlePromise().catch(onrejected);\n\n\treturn spawned;\n};\n\n// TODO: set `stderr: 'ignore'` when that option is implemented\nmodule.exports.stdout = (...args) => module.exports(...args).then(x => x.stdout);\n\n// TODO: set `stdout: 'ignore'` when that option is implemented\nmodule.exports.stderr = (...args) => module.exports(...args).then(x => x.stderr);\n\nmodule.exports.shell = (cmd, opts) => handleShell(module.exports, cmd, opts);\n\nmodule.exports.sync = (cmd, args, opts) => {\n\tconst parsed = handleArgs(cmd, args, opts);\n\tconst joinedCmd = joinCmd(cmd, args);\n\n\tif (isStream(parsed.opts.input)) {\n\t\tthrow new TypeError('The `input` option cannot be a stream in sync mode');\n\t}\n\n\tconst result = childProcess.spawnSync(parsed.cmd, parsed.args, 
parsed.opts);\n\tresult.code = result.status;\n\n\tif (result.error || result.status !== 0 || result.signal !== null) {\n\t\tconst err = makeError(result, {\n\t\t\tjoinedCmd,\n\t\t\tparsed\n\t\t});\n\n\t\tif (!parsed.opts.reject) {\n\t\t\treturn err;\n\t\t}\n\n\t\tthrow err;\n\t}\n\n\treturn {\n\t\tstdout: handleOutput(parsed.opts, result.stdout),\n\t\tstderr: handleOutput(parsed.opts, result.stderr),\n\t\tcode: 0,\n\t\tfailed: false,\n\t\tsignal: null,\n\t\tcmd: joinedCmd,\n\t\ttimedOut: false\n\t};\n};\n\nmodule.exports.shellSync = (cmd, opts) => handleShell(module.exports.sync, cmd, opts);\n\n\n/***/ }),\n\n/***/ 4689:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\n// Older verions of Node.js might not have `util.getSystemErrorName()`.\n// In that case, fall back to a deprecated internal.\nconst util = __webpack_require__(1669);\n\nlet uv;\n\nif (typeof util.getSystemErrorName === 'function') {\n\tmodule.exports = util.getSystemErrorName;\n} else {\n\ttry {\n\t\tuv = process.binding('uv');\n\n\t\tif (typeof uv.errname !== 'function') {\n\t\t\tthrow new TypeError('uv.errname is not a function');\n\t\t}\n\t} catch (err) {\n\t\tconsole.error('execa/lib/errname: unable to establish process.binding(\\'uv\\')', err);\n\t\tuv = null;\n\t}\n\n\tmodule.exports = code => errname(uv, code);\n}\n\n// Used for testing the fallback behavior\nmodule.exports.__test__ = errname;\n\nfunction errname(uv, code) {\n\tif (uv) {\n\t\treturn uv.errname(code);\n\t}\n\n\tif (!(code < 0)) {\n\t\tthrow new Error('err >= 0');\n\t}\n\n\treturn `Unknown system error ${code}`;\n}\n\n\n\n/***/ }),\n\n/***/ 166:\n/***/ ((module) => {\n\n\"use strict\";\n\nconst alias = ['stdin', 'stdout', 'stderr'];\n\nconst hasAlias = opts => alias.some(x => Boolean(opts[x]));\n\nmodule.exports = opts => {\n\tif (!opts) {\n\t\treturn null;\n\t}\n\n\tif (opts.stdio && hasAlias(opts)) {\n\t\tthrow new Error(`It's not possible to provide \\`stdio\\` in combination with 
one of ${alias.map(x => `\\`${x}\\``).join(', ')}`);\n\t}\n\n\tif (typeof opts.stdio === 'string') {\n\t\treturn opts.stdio;\n\t}\n\n\tconst stdio = opts.stdio || [];\n\n\tif (!Array.isArray(stdio)) {\n\t\tthrow new TypeError(`Expected \\`stdio\\` to be of type \\`string\\` or \\`Array\\`, got \\`${typeof stdio}\\``);\n\t}\n\n\tconst result = [];\n\tconst len = Math.max(stdio.length, alias.length);\n\n\tfor (let i = 0; i < len; i++) {\n\t\tlet value = null;\n\n\t\tif (stdio[i] !== undefined) {\n\t\t\tvalue = stdio[i];\n\t\t} else if (opts[alias[i]] !== undefined) {\n\t\t\tvalue = opts[alias[i]];\n\t\t}\n\n\t\tresult[i] = value;\n\t}\n\n\treturn result;\n};\n\n\n/***/ }),\n\n/***/ 1585:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\nconst {PassThrough} = __webpack_require__(2413);\n\nmodule.exports = options => {\n\toptions = Object.assign({}, options);\n\n\tconst {array} = options;\n\tlet {encoding} = options;\n\tconst buffer = encoding === 'buffer';\n\tlet objectMode = false;\n\n\tif (array) {\n\t\tobjectMode = !(encoding || buffer);\n\t} else {\n\t\tencoding = encoding || 'utf8';\n\t}\n\n\tif (buffer) {\n\t\tencoding = null;\n\t}\n\n\tlet len = 0;\n\tconst ret = [];\n\tconst stream = new PassThrough({objectMode});\n\n\tif (encoding) {\n\t\tstream.setEncoding(encoding);\n\t}\n\n\tstream.on('data', chunk => {\n\t\tret.push(chunk);\n\n\t\tif (objectMode) {\n\t\t\tlen = ret.length;\n\t\t} else {\n\t\t\tlen += chunk.length;\n\t\t}\n\t});\n\n\tstream.getBufferedValue = () => {\n\t\tif (array) {\n\t\t\treturn ret;\n\t\t}\n\n\t\treturn buffer ? 
Buffer.concat(ret, len) : ret.join('');\n\t};\n\n\tstream.getBufferedLength = () => len;\n\n\treturn stream;\n};\n\n\n/***/ }),\n\n/***/ 1766:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\nconst pump = __webpack_require__(8341);\nconst bufferStream = __webpack_require__(1585);\n\nclass MaxBufferError extends Error {\n\tconstructor() {\n\t\tsuper('maxBuffer exceeded');\n\t\tthis.name = 'MaxBufferError';\n\t}\n}\n\nfunction getStream(inputStream, options) {\n\tif (!inputStream) {\n\t\treturn Promise.reject(new Error('Expected a stream'));\n\t}\n\n\toptions = Object.assign({maxBuffer: Infinity}, options);\n\n\tconst {maxBuffer} = options;\n\n\tlet stream;\n\treturn new Promise((resolve, reject) => {\n\t\tconst rejectPromise = error => {\n\t\t\tif (error) { // A null check\n\t\t\t\terror.bufferedData = stream.getBufferedValue();\n\t\t\t}\n\t\t\treject(error);\n\t\t};\n\n\t\tstream = pump(inputStream, bufferStream(options), error => {\n\t\t\tif (error) {\n\t\t\t\trejectPromise(error);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tresolve();\n\t\t});\n\n\t\tstream.on('data', () => {\n\t\t\tif (stream.getBufferedLength() > maxBuffer) {\n\t\t\t\trejectPromise(new MaxBufferError());\n\t\t\t}\n\t\t});\n\t}).then(() => stream.getBufferedValue());\n}\n\nmodule.exports = getStream;\nmodule.exports.buffer = (stream, options) => getStream(stream, Object.assign({}, options, {encoding: 'buffer'}));\nmodule.exports.array = (stream, options) => getStream(stream, Object.assign({}, options, {array: true}));\nmodule.exports.MaxBufferError = MaxBufferError;\n\n\n/***/ }),\n\n/***/ 8840:\n/***/ ((module) => {\n\n\"use strict\";\n\n\n/*!\n * isobject <https://github.com/jonschlinkert/isobject>\n *\n * Copyright (c) 2014-2017, Jon Schlinkert.\n * Released under the MIT License.\n */\n\nfunction isObject(val) {\n  return val != null && typeof val === 'object' && Array.isArray(val) === false;\n}\n\n/*!\n * is-plain-object 
<https://github.com/jonschlinkert/is-plain-object>\n *\n * Copyright (c) 2014-2017, Jon Schlinkert.\n * Released under the MIT License.\n */\n\nfunction isObjectObject(o) {\n  return isObject(o) === true\n    && Object.prototype.toString.call(o) === '[object Object]';\n}\n\nfunction isPlainObject(o) {\n  var ctor,prot;\n\n  if (isObjectObject(o) === false) return false;\n\n  // If has modified constructor\n  ctor = o.constructor;\n  if (typeof ctor !== 'function') return false;\n\n  // If has modified prototype\n  prot = ctor.prototype;\n  if (isObjectObject(prot) === false) return false;\n\n  // If constructor does not have an Object-specific method\n  if (prot.hasOwnProperty('isPrototypeOf') === false) {\n    return false;\n  }\n\n  // Most likely a plain Object\n  return true;\n}\n\nmodule.exports = isPlainObject;\n\n\n/***/ }),\n\n/***/ 1554:\n/***/ ((module) => {\n\n\"use strict\";\n\n\nvar isStream = module.exports = function (stream) {\n\treturn stream !== null && typeof stream === 'object' && typeof stream.pipe === 'function';\n};\n\nisStream.writable = function (stream) {\n\treturn isStream(stream) && stream.writable !== false && typeof stream._write === 'function' && typeof stream._writableState === 'object';\n};\n\nisStream.readable = function (stream) {\n\treturn isStream(stream) && stream.readable !== false && typeof stream._read === 'function' && typeof stream._readableState === 'object';\n};\n\nisStream.duplex = function (stream) {\n\treturn isStream.writable(stream) && isStream.readable(stream);\n};\n\nisStream.transform = function (stream) {\n\treturn isStream.duplex(stream) && typeof stream._transform === 'function' && typeof stream._transformState === 'object';\n};\n\n\n/***/ }),\n\n/***/ 7126:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nvar fs = __webpack_require__(5747)\nvar core\nif (process.platform === 'win32' || global.TESTING_WINDOWS) {\n  core = __webpack_require__(2001)\n} else {\n  core = 
__webpack_require__(9728)\n}\n\nmodule.exports = isexe\nisexe.sync = sync\n\nfunction isexe (path, options, cb) {\n  if (typeof options === 'function') {\n    cb = options\n    options = {}\n  }\n\n  if (!cb) {\n    if (typeof Promise !== 'function') {\n      throw new TypeError('callback not provided')\n    }\n\n    return new Promise(function (resolve, reject) {\n      isexe(path, options || {}, function (er, is) {\n        if (er) {\n          reject(er)\n        } else {\n          resolve(is)\n        }\n      })\n    })\n  }\n\n  core(path, options || {}, function (er, is) {\n    // ignore EACCES because that just means we aren't allowed to run it\n    if (er) {\n      if (er.code === 'EACCES' || options && options.ignoreErrors) {\n        er = null\n        is = false\n      }\n    }\n    cb(er, is)\n  })\n}\n\nfunction sync (path, options) {\n  // my kingdom for a filtered catch\n  try {\n    return core.sync(path, options || {})\n  } catch (er) {\n    if (options && options.ignoreErrors || er.code === 'EACCES') {\n      return false\n    } else {\n      throw er\n    }\n  }\n}\n\n\n/***/ }),\n\n/***/ 9728:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nmodule.exports = isexe\nisexe.sync = sync\n\nvar fs = __webpack_require__(5747)\n\nfunction isexe (path, options, cb) {\n  fs.stat(path, function (er, stat) {\n    cb(er, er ? 
false : checkStat(stat, options))\n  })\n}\n\nfunction sync (path, options) {\n  return checkStat(fs.statSync(path), options)\n}\n\nfunction checkStat (stat, options) {\n  return stat.isFile() && checkMode(stat, options)\n}\n\nfunction checkMode (stat, options) {\n  var mod = stat.mode\n  var uid = stat.uid\n  var gid = stat.gid\n\n  var myUid = options.uid !== undefined ?\n    options.uid : process.getuid && process.getuid()\n  var myGid = options.gid !== undefined ?\n    options.gid : process.getgid && process.getgid()\n\n  var u = parseInt('100', 8)\n  var g = parseInt('010', 8)\n  var o = parseInt('001', 8)\n  var ug = u | g\n\n  var ret = (mod & o) ||\n    (mod & g) && gid === myGid ||\n    (mod & u) && uid === myUid ||\n    (mod & ug) && myUid === 0\n\n  return ret\n}\n\n\n/***/ }),\n\n/***/ 2001:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nmodule.exports = isexe\nisexe.sync = sync\n\nvar fs = __webpack_require__(5747)\n\nfunction checkPathExt (path, options) {\n  var pathext = options.pathExt !== undefined ?\n    options.pathExt : process.env.PATHEXT\n\n  if (!pathext) {\n    return true\n  }\n\n  pathext = pathext.split(';')\n  if (pathext.indexOf('') !== -1) {\n    return true\n  }\n  for (var i = 0; i < pathext.length; i++) {\n    var p = pathext[i].toLowerCase()\n    if (p && path.substr(-p.length).toLowerCase() === p) {\n      return true\n    }\n  }\n  return false\n}\n\nfunction checkStat (stat, path, options) {\n  if (!stat.isSymbolicLink() && !stat.isFile()) {\n    return false\n  }\n  return checkPathExt(path, options)\n}\n\nfunction isexe (path, options, cb) {\n  fs.stat(path, function (er, stat) {\n    cb(er, er ? 
false : checkStat(stat, path, options))\n  })\n}\n\nfunction sync (path, options) {\n  return checkStat(fs.statSync(path), path, options)\n}\n\n\n/***/ }),\n\n/***/ 7493:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\nconst os = __webpack_require__(2087);\n\nconst nameMap = new Map([\n\t[20, ['Big Sur', '11']],\n\t[19, ['Catalina', '10.15']],\n\t[18, ['Mojave', '10.14']],\n\t[17, ['High Sierra', '10.13']],\n\t[16, ['Sierra', '10.12']],\n\t[15, ['El Capitan', '10.11']],\n\t[14, ['Yosemite', '10.10']],\n\t[13, ['Mavericks', '10.9']],\n\t[12, ['Mountain Lion', '10.8']],\n\t[11, ['Lion', '10.7']],\n\t[10, ['Snow Leopard', '10.6']],\n\t[9, ['Leopard', '10.5']],\n\t[8, ['Tiger', '10.4']],\n\t[7, ['Panther', '10.3']],\n\t[6, ['Jaguar', '10.2']],\n\t[5, ['Puma', '10.1']]\n]);\n\nconst macosRelease = release => {\n\trelease = Number((release || os.release()).split('.')[0]);\n\n\tconst [name, version] = nameMap.get(release);\n\n\treturn {\n\t\tname,\n\t\tversion\n\t};\n};\n\nmodule.exports = macosRelease;\n// TODO: remove this in the next major version\nmodule.exports.default = macosRelease;\n\n\n/***/ }),\n\n/***/ 8560:\n/***/ ((module) => {\n\n\"use strict\";\n\n\n/**\n * Tries to execute a function and discards any error that occurs.\n * @param {Function} fn - Function that might or might not throw an error.\n * @returns {?*} Return-value of the function when no error occurred.\n */\nmodule.exports = function(fn) {\n\n\ttry { return fn() } catch (e) {}\n\n}\n\n/***/ }),\n\n/***/ 467:\n/***/ ((module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nfunction _interopDefault (ex) { return (ex && (typeof ex === 'object') && 'default' in ex) ? 
ex['default'] : ex; }\n\nvar Stream = _interopDefault(__webpack_require__(2413));\nvar http = _interopDefault(__webpack_require__(8605));\nvar Url = _interopDefault(__webpack_require__(8835));\nvar whatwgUrl = _interopDefault(__webpack_require__(8665));\nvar https = _interopDefault(__webpack_require__(7211));\nvar zlib = _interopDefault(__webpack_require__(8761));\n\n// Based on https://github.com/tmpvar/jsdom/blob/aa85b2abf07766ff7bf5c1f6daafb3726f2f2db5/lib/jsdom/living/blob.js\n\n// fix for \"Readable\" isn't a named export issue\nconst Readable = Stream.Readable;\n\nconst BUFFER = Symbol('buffer');\nconst TYPE = Symbol('type');\n\nclass Blob {\n\tconstructor() {\n\t\tthis[TYPE] = '';\n\n\t\tconst blobParts = arguments[0];\n\t\tconst options = arguments[1];\n\n\t\tconst buffers = [];\n\t\tlet size = 0;\n\n\t\tif (blobParts) {\n\t\t\tconst a = blobParts;\n\t\t\tconst length = Number(a.length);\n\t\t\tfor (let i = 0; i < length; i++) {\n\t\t\t\tconst element = a[i];\n\t\t\t\tlet buffer;\n\t\t\t\tif (element instanceof Buffer) {\n\t\t\t\t\tbuffer = element;\n\t\t\t\t} else if (ArrayBuffer.isView(element)) {\n\t\t\t\t\tbuffer = Buffer.from(element.buffer, element.byteOffset, element.byteLength);\n\t\t\t\t} else if (element instanceof ArrayBuffer) {\n\t\t\t\t\tbuffer = Buffer.from(element);\n\t\t\t\t} else if (element instanceof Blob) {\n\t\t\t\t\tbuffer = element[BUFFER];\n\t\t\t\t} else {\n\t\t\t\t\tbuffer = Buffer.from(typeof element === 'string' ? 
element : String(element));\n\t\t\t\t}\n\t\t\t\tsize += buffer.length;\n\t\t\t\tbuffers.push(buffer);\n\t\t\t}\n\t\t}\n\n\t\tthis[BUFFER] = Buffer.concat(buffers);\n\n\t\tlet type = options && options.type !== undefined && String(options.type).toLowerCase();\n\t\tif (type && !/[^\\u0020-\\u007E]/.test(type)) {\n\t\t\tthis[TYPE] = type;\n\t\t}\n\t}\n\tget size() {\n\t\treturn this[BUFFER].length;\n\t}\n\tget type() {\n\t\treturn this[TYPE];\n\t}\n\ttext() {\n\t\treturn Promise.resolve(this[BUFFER].toString());\n\t}\n\tarrayBuffer() {\n\t\tconst buf = this[BUFFER];\n\t\tconst ab = buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);\n\t\treturn Promise.resolve(ab);\n\t}\n\tstream() {\n\t\tconst readable = new Readable();\n\t\treadable._read = function () {};\n\t\treadable.push(this[BUFFER]);\n\t\treadable.push(null);\n\t\treturn readable;\n\t}\n\ttoString() {\n\t\treturn '[object Blob]';\n\t}\n\tslice() {\n\t\tconst size = this.size;\n\n\t\tconst start = arguments[0];\n\t\tconst end = arguments[1];\n\t\tlet relativeStart, relativeEnd;\n\t\tif (start === undefined) {\n\t\t\trelativeStart = 0;\n\t\t} else if (start < 0) {\n\t\t\trelativeStart = Math.max(size + start, 0);\n\t\t} else {\n\t\t\trelativeStart = Math.min(start, size);\n\t\t}\n\t\tif (end === undefined) {\n\t\t\trelativeEnd = size;\n\t\t} else if (end < 0) {\n\t\t\trelativeEnd = Math.max(size + end, 0);\n\t\t} else {\n\t\t\trelativeEnd = Math.min(end, size);\n\t\t}\n\t\tconst span = Math.max(relativeEnd - relativeStart, 0);\n\n\t\tconst buffer = this[BUFFER];\n\t\tconst slicedBuffer = buffer.slice(relativeStart, relativeStart + span);\n\t\tconst blob = new Blob([], { type: arguments[2] });\n\t\tblob[BUFFER] = slicedBuffer;\n\t\treturn blob;\n\t}\n}\n\nObject.defineProperties(Blob.prototype, {\n\tsize: { enumerable: true },\n\ttype: { enumerable: true },\n\tslice: { enumerable: true }\n});\n\nObject.defineProperty(Blob.prototype, Symbol.toStringTag, {\n\tvalue: 'Blob',\n\twritable: 
false,\n\tenumerable: false,\n\tconfigurable: true\n});\n\n/**\n * fetch-error.js\n *\n * FetchError interface for operational errors\n */\n\n/**\n * Create FetchError instance\n *\n * @param   String      message      Error message for human\n * @param   String      type         Error type for machine\n * @param   String      systemError  For Node.js system error\n * @return  FetchError\n */\nfunction FetchError(message, type, systemError) {\n  Error.call(this, message);\n\n  this.message = message;\n  this.type = type;\n\n  // when err.type is `system`, err.code contains system error code\n  if (systemError) {\n    this.code = this.errno = systemError.code;\n  }\n\n  // hide custom error implementation details from end-users\n  Error.captureStackTrace(this, this.constructor);\n}\n\nFetchError.prototype = Object.create(Error.prototype);\nFetchError.prototype.constructor = FetchError;\nFetchError.prototype.name = 'FetchError';\n\nlet convert;\ntry {\n\tconvert = __webpack_require__(2877).convert;\n} catch (e) {}\n\nconst INTERNALS = Symbol('Body internals');\n\n// fix an issue where \"PassThrough\" isn't a named export for node <10\nconst PassThrough = Stream.PassThrough;\n\n/**\n * Body mixin\n *\n * Ref: https://fetch.spec.whatwg.org/#body\n *\n * @param   Stream  body  Readable stream\n * @param   Object  opts  Response options\n * @return  Void\n */\nfunction Body(body) {\n\tvar _this = this;\n\n\tvar _ref = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : {},\n\t    _ref$size = _ref.size;\n\n\tlet size = _ref$size === undefined ? 0 : _ref$size;\n\tvar _ref$timeout = _ref.timeout;\n\tlet timeout = _ref$timeout === undefined ? 
0 : _ref$timeout;\n\n\tif (body == null) {\n\t\t// body is undefined or null\n\t\tbody = null;\n\t} else if (isURLSearchParams(body)) {\n\t\t// body is a URLSearchParams\n\t\tbody = Buffer.from(body.toString());\n\t} else if (isBlob(body)) ; else if (Buffer.isBuffer(body)) ; else if (Object.prototype.toString.call(body) === '[object ArrayBuffer]') {\n\t\t// body is ArrayBuffer\n\t\tbody = Buffer.from(body);\n\t} else if (ArrayBuffer.isView(body)) {\n\t\t// body is ArrayBufferView\n\t\tbody = Buffer.from(body.buffer, body.byteOffset, body.byteLength);\n\t} else if (body instanceof Stream) ; else {\n\t\t// none of the above\n\t\t// coerce to string then buffer\n\t\tbody = Buffer.from(String(body));\n\t}\n\tthis[INTERNALS] = {\n\t\tbody,\n\t\tdisturbed: false,\n\t\terror: null\n\t};\n\tthis.size = size;\n\tthis.timeout = timeout;\n\n\tif (body instanceof Stream) {\n\t\tbody.on('error', function (err) {\n\t\t\tconst error = err.name === 'AbortError' ? err : new FetchError(`Invalid response body while trying to fetch ${_this.url}: ${err.message}`, 'system', err);\n\t\t\t_this[INTERNALS].error = error;\n\t\t});\n\t}\n}\n\nBody.prototype = {\n\tget body() {\n\t\treturn this[INTERNALS].body;\n\t},\n\n\tget bodyUsed() {\n\t\treturn this[INTERNALS].disturbed;\n\t},\n\n\t/**\n  * Decode response as ArrayBuffer\n  *\n  * @return  Promise\n  */\n\tarrayBuffer() {\n\t\treturn consumeBody.call(this).then(function (buf) {\n\t\t\treturn buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);\n\t\t});\n\t},\n\n\t/**\n  * Return raw response as Blob\n  *\n  * @return Promise\n  */\n\tblob() {\n\t\tlet ct = this.headers && this.headers.get('content-type') || '';\n\t\treturn consumeBody.call(this).then(function (buf) {\n\t\t\treturn Object.assign(\n\t\t\t// Prevent copying\n\t\t\tnew Blob([], {\n\t\t\t\ttype: ct.toLowerCase()\n\t\t\t}), {\n\t\t\t\t[BUFFER]: buf\n\t\t\t});\n\t\t});\n\t},\n\n\t/**\n  * Decode response as json\n  *\n  * @return  Promise\n  */\n\tjson() 
{\n\t\tvar _this2 = this;\n\n\t\treturn consumeBody.call(this).then(function (buffer) {\n\t\t\ttry {\n\t\t\t\treturn JSON.parse(buffer.toString());\n\t\t\t} catch (err) {\n\t\t\t\treturn Body.Promise.reject(new FetchError(`invalid json response body at ${_this2.url} reason: ${err.message}`, 'invalid-json'));\n\t\t\t}\n\t\t});\n\t},\n\n\t/**\n  * Decode response as text\n  *\n  * @return  Promise\n  */\n\ttext() {\n\t\treturn consumeBody.call(this).then(function (buffer) {\n\t\t\treturn buffer.toString();\n\t\t});\n\t},\n\n\t/**\n  * Decode response as buffer (non-spec api)\n  *\n  * @return  Promise\n  */\n\tbuffer() {\n\t\treturn consumeBody.call(this);\n\t},\n\n\t/**\n  * Decode response as text, while automatically detecting the encoding and\n  * trying to decode to UTF-8 (non-spec api)\n  *\n  * @return  Promise\n  */\n\ttextConverted() {\n\t\tvar _this3 = this;\n\n\t\treturn consumeBody.call(this).then(function (buffer) {\n\t\t\treturn convertBody(buffer, _this3.headers);\n\t\t});\n\t}\n};\n\n// In browsers, all properties are enumerable.\nObject.defineProperties(Body.prototype, {\n\tbody: { enumerable: true },\n\tbodyUsed: { enumerable: true },\n\tarrayBuffer: { enumerable: true },\n\tblob: { enumerable: true },\n\tjson: { enumerable: true },\n\ttext: { enumerable: true }\n});\n\nBody.mixIn = function (proto) {\n\tfor (const name of Object.getOwnPropertyNames(Body.prototype)) {\n\t\t// istanbul ignore else: future proof\n\t\tif (!(name in proto)) {\n\t\t\tconst desc = Object.getOwnPropertyDescriptor(Body.prototype, name);\n\t\t\tObject.defineProperty(proto, name, desc);\n\t\t}\n\t}\n};\n\n/**\n * Consume and convert an entire Body to a Buffer.\n *\n * Ref: https://fetch.spec.whatwg.org/#concept-body-consume-body\n *\n * @return  Promise\n */\nfunction consumeBody() {\n\tvar _this4 = this;\n\n\tif (this[INTERNALS].disturbed) {\n\t\treturn Body.Promise.reject(new TypeError(`body used already for: ${this.url}`));\n\t}\n\n\tthis[INTERNALS].disturbed = 
true;\n\n\tif (this[INTERNALS].error) {\n\t\treturn Body.Promise.reject(this[INTERNALS].error);\n\t}\n\n\tlet body = this.body;\n\n\t// body is null\n\tif (body === null) {\n\t\treturn Body.Promise.resolve(Buffer.alloc(0));\n\t}\n\n\t// body is blob\n\tif (isBlob(body)) {\n\t\tbody = body.stream();\n\t}\n\n\t// body is buffer\n\tif (Buffer.isBuffer(body)) {\n\t\treturn Body.Promise.resolve(body);\n\t}\n\n\t// istanbul ignore if: should never happen\n\tif (!(body instanceof Stream)) {\n\t\treturn Body.Promise.resolve(Buffer.alloc(0));\n\t}\n\n\t// body is stream\n\t// get ready to actually consume the body\n\tlet accum = [];\n\tlet accumBytes = 0;\n\tlet abort = false;\n\n\treturn new Body.Promise(function (resolve, reject) {\n\t\tlet resTimeout;\n\n\t\t// allow timeout on slow response body\n\t\tif (_this4.timeout) {\n\t\t\tresTimeout = setTimeout(function () {\n\t\t\t\tabort = true;\n\t\t\t\treject(new FetchError(`Response timeout while trying to fetch ${_this4.url} (over ${_this4.timeout}ms)`, 'body-timeout'));\n\t\t\t}, _this4.timeout);\n\t\t}\n\n\t\t// handle stream errors\n\t\tbody.on('error', function (err) {\n\t\t\tif (err.name === 'AbortError') {\n\t\t\t\t// if the request was aborted, reject with this Error\n\t\t\t\tabort = true;\n\t\t\t\treject(err);\n\t\t\t} else {\n\t\t\t\t// other errors, such as incorrect content-encoding\n\t\t\t\treject(new FetchError(`Invalid response body while trying to fetch ${_this4.url}: ${err.message}`, 'system', err));\n\t\t\t}\n\t\t});\n\n\t\tbody.on('data', function (chunk) {\n\t\t\tif (abort || chunk === null) {\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tif (_this4.size && accumBytes + chunk.length > _this4.size) {\n\t\t\t\tabort = true;\n\t\t\t\treject(new FetchError(`content size at ${_this4.url} over limit: ${_this4.size}`, 'max-size'));\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\taccumBytes += chunk.length;\n\t\t\taccum.push(chunk);\n\t\t});\n\n\t\tbody.on('end', function () {\n\t\t\tif (abort) 
{\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\tclearTimeout(resTimeout);\n\n\t\t\ttry {\n\t\t\t\tresolve(Buffer.concat(accum, accumBytes));\n\t\t\t} catch (err) {\n\t\t\t\t// handle streams that have accumulated too much data (issue #414)\n\t\t\t\treject(new FetchError(`Could not create Buffer from response body for ${_this4.url}: ${err.message}`, 'system', err));\n\t\t\t}\n\t\t});\n\t});\n}\n\n/**\n * Detect buffer encoding and convert to target encoding\n * ref: http://www.w3.org/TR/2011/WD-html5-20110113/parsing.html#determining-the-character-encoding\n *\n * @param   Buffer  buffer    Incoming buffer\n * @param   String  encoding  Target encoding\n * @return  String\n */\nfunction convertBody(buffer, headers) {\n\tif (typeof convert !== 'function') {\n\t\tthrow new Error('The package `encoding` must be installed to use the textConverted() function');\n\t}\n\n\tconst ct = headers.get('content-type');\n\tlet charset = 'utf-8';\n\tlet res, str;\n\n\t// header\n\tif (ct) {\n\t\tres = /charset=([^;]*)/i.exec(ct);\n\t}\n\n\t// no charset in content type, peek at response body for at most 1024 bytes\n\tstr = buffer.slice(0, 1024).toString();\n\n\t// html5\n\tif (!res && str) {\n\t\tres = /<meta.+?charset=(['\"])(.+?)\\1/i.exec(str);\n\t}\n\n\t// html4\n\tif (!res && str) {\n\t\tres = /<meta[\\s]+?http-equiv=(['\"])content-type\\1[\\s]+?content=(['\"])(.+?)\\2/i.exec(str);\n\t\tif (!res) {\n\t\t\tres = /<meta[\\s]+?content=(['\"])(.+?)\\1[\\s]+?http-equiv=(['\"])content-type\\3/i.exec(str);\n\t\t\tif (res) {\n\t\t\t\tres.pop(); // drop last quote\n\t\t\t}\n\t\t}\n\n\t\tif (res) {\n\t\t\tres = /charset=(.*)/i.exec(res.pop());\n\t\t}\n\t}\n\n\t// xml\n\tif (!res && str) {\n\t\tres = /<\\?xml.+?encoding=(['\"])(.+?)\\1/i.exec(str);\n\t}\n\n\t// found charset\n\tif (res) {\n\t\tcharset = res.pop();\n\n\t\t// prevent decode issues when sites use incorrect encoding\n\t\t// ref: https://hsivonen.fi/encoding-menu/\n\t\tif (charset === 'gb2312' || charset === 'gbk') {\n\t\t\tcharset = 
'gb18030';\n\t\t}\n\t}\n\n\t// turn raw buffers into a single utf-8 buffer\n\treturn convert(buffer, 'UTF-8', charset).toString();\n}\n\n/**\n * Detect a URLSearchParams object\n * ref: https://github.com/bitinn/node-fetch/issues/296#issuecomment-307598143\n *\n * @param   Object  obj     Object to detect by type or brand\n * @return  String\n */\nfunction isURLSearchParams(obj) {\n\t// Duck-typing as a necessary condition.\n\tif (typeof obj !== 'object' || typeof obj.append !== 'function' || typeof obj.delete !== 'function' || typeof obj.get !== 'function' || typeof obj.getAll !== 'function' || typeof obj.has !== 'function' || typeof obj.set !== 'function') {\n\t\treturn false;\n\t}\n\n\t// Brand-checking and more duck-typing as optional condition.\n\treturn obj.constructor.name === 'URLSearchParams' || Object.prototype.toString.call(obj) === '[object URLSearchParams]' || typeof obj.sort === 'function';\n}\n\n/**\n * Check if `obj` is a W3C `Blob` object (which `File` inherits from)\n * @param  {*} obj\n * @return {boolean}\n */\nfunction isBlob(obj) {\n\treturn typeof obj === 'object' && typeof obj.arrayBuffer === 'function' && typeof obj.type === 'string' && typeof obj.stream === 'function' && typeof obj.constructor === 'function' && typeof obj.constructor.name === 'string' && /^(Blob|File)$/.test(obj.constructor.name) && /^(Blob|File)$/.test(obj[Symbol.toStringTag]);\n}\n\n/**\n * Clone body given Res/Req instance\n *\n * @param   Mixed  instance  Response or Request instance\n * @return  Mixed\n */\nfunction clone(instance) {\n\tlet p1, p2;\n\tlet body = instance.body;\n\n\t// don't allow cloning a used body\n\tif (instance.bodyUsed) {\n\t\tthrow new Error('cannot clone body after it is used');\n\t}\n\n\t// check that body is a stream and not form-data object\n\t// note: we can't clone the form-data object without having it as a dependency\n\tif (body instanceof Stream && typeof body.getBoundary !== 'function') {\n\t\t// tee instance body\n\t\tp1 = new 
PassThrough();\n\t\tp2 = new PassThrough();\n\t\tbody.pipe(p1);\n\t\tbody.pipe(p2);\n\t\t// set instance body to teed body and return the other teed body\n\t\tinstance[INTERNALS].body = p1;\n\t\tbody = p2;\n\t}\n\n\treturn body;\n}\n\n/**\n * Performs the operation \"extract a `Content-Type` value from |object|\" as\n * specified in the specification:\n * https://fetch.spec.whatwg.org/#concept-bodyinit-extract\n *\n * This function assumes that instance.body is present.\n *\n * @param   Mixed  instance  Any options.body input\n */\nfunction extractContentType(body) {\n\tif (body === null) {\n\t\t// body is null\n\t\treturn null;\n\t} else if (typeof body === 'string') {\n\t\t// body is string\n\t\treturn 'text/plain;charset=UTF-8';\n\t} else if (isURLSearchParams(body)) {\n\t\t// body is a URLSearchParams\n\t\treturn 'application/x-www-form-urlencoded;charset=UTF-8';\n\t} else if (isBlob(body)) {\n\t\t// body is blob\n\t\treturn body.type || null;\n\t} else if (Buffer.isBuffer(body)) {\n\t\t// body is buffer\n\t\treturn null;\n\t} else if (Object.prototype.toString.call(body) === '[object ArrayBuffer]') {\n\t\t// body is ArrayBuffer\n\t\treturn null;\n\t} else if (ArrayBuffer.isView(body)) {\n\t\t// body is ArrayBufferView\n\t\treturn null;\n\t} else if (typeof body.getBoundary === 'function') {\n\t\t// detect form data input from form-data module\n\t\treturn `multipart/form-data;boundary=${body.getBoundary()}`;\n\t} else if (body instanceof Stream) {\n\t\t// body is stream\n\t\t// can't really do much about this\n\t\treturn null;\n\t} else {\n\t\t// Body constructor defaults other things to string\n\t\treturn 'text/plain;charset=UTF-8';\n\t}\n}\n\n/**\n * The Fetch Standard treats this as if \"total bytes\" is a property on the body.\n * For us, we have to explicitly get it with a function.\n *\n * ref: https://fetch.spec.whatwg.org/#concept-body-total-bytes\n *\n * @param   Body    instance   Instance of Body\n * @return  Number?            
Number of bytes, or null if not possible\n */\nfunction getTotalBytes(instance) {\n\tconst body = instance.body;\n\n\n\tif (body === null) {\n\t\t// body is null\n\t\treturn 0;\n\t} else if (isBlob(body)) {\n\t\treturn body.size;\n\t} else if (Buffer.isBuffer(body)) {\n\t\t// body is buffer\n\t\treturn body.length;\n\t} else if (body && typeof body.getLengthSync === 'function') {\n\t\t// detect form data input from form-data module\n\t\tif (body._lengthRetrievers && body._lengthRetrievers.length == 0 || // 1.x\n\t\tbody.hasKnownLength && body.hasKnownLength()) {\n\t\t\t// 2.x\n\t\t\treturn body.getLengthSync();\n\t\t}\n\t\treturn null;\n\t} else {\n\t\t// body is stream\n\t\treturn null;\n\t}\n}\n\n/**\n * Write a Body to a Node.js WritableStream (e.g. http.Request) object.\n *\n * @param   Body    instance   Instance of Body\n * @return  Void\n */\nfunction writeToStream(dest, instance) {\n\tconst body = instance.body;\n\n\n\tif (body === null) {\n\t\t// body is null\n\t\tdest.end();\n\t} else if (isBlob(body)) {\n\t\tbody.stream().pipe(dest);\n\t} else if (Buffer.isBuffer(body)) {\n\t\t// body is buffer\n\t\tdest.write(body);\n\t\tdest.end();\n\t} else {\n\t\t// body is stream\n\t\tbody.pipe(dest);\n\t}\n}\n\n// expose Promise\nBody.Promise = global.Promise;\n\n/**\n * headers.js\n *\n * Headers class offers convenient helpers\n */\n\nconst invalidTokenRegex = /[^\\^_`a-zA-Z\\-0-9!#$%&'*+.|~]/;\nconst invalidHeaderCharRegex = /[^\\t\\x20-\\x7e\\x80-\\xff]/;\n\nfunction validateName(name) {\n\tname = `${name}`;\n\tif (invalidTokenRegex.test(name) || name === '') {\n\t\tthrow new TypeError(`${name} is not a legal HTTP header name`);\n\t}\n}\n\nfunction validateValue(value) {\n\tvalue = `${value}`;\n\tif (invalidHeaderCharRegex.test(value)) {\n\t\tthrow new TypeError(`${value} is not a legal HTTP header value`);\n\t}\n}\n\n/**\n * Find the key in the map object given a header name.\n *\n * Returns undefined if not found.\n *\n * @param   String  name  Header name\n 
* @return  String|Undefined\n */\nfunction find(map, name) {\n\tname = name.toLowerCase();\n\tfor (const key in map) {\n\t\tif (key.toLowerCase() === name) {\n\t\t\treturn key;\n\t\t}\n\t}\n\treturn undefined;\n}\n\nconst MAP = Symbol('map');\nclass Headers {\n\t/**\n  * Headers class\n  *\n  * @param   Object  headers  Response headers\n  * @return  Void\n  */\n\tconstructor() {\n\t\tlet init = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : undefined;\n\n\t\tthis[MAP] = Object.create(null);\n\n\t\tif (init instanceof Headers) {\n\t\t\tconst rawHeaders = init.raw();\n\t\t\tconst headerNames = Object.keys(rawHeaders);\n\n\t\t\tfor (const headerName of headerNames) {\n\t\t\t\tfor (const value of rawHeaders[headerName]) {\n\t\t\t\t\tthis.append(headerName, value);\n\t\t\t\t}\n\t\t\t}\n\n\t\t\treturn;\n\t\t}\n\n\t\t// We don't worry about converting prop to ByteString here as append()\n\t\t// will handle it.\n\t\tif (init == null) ; else if (typeof init === 'object') {\n\t\t\tconst method = init[Symbol.iterator];\n\t\t\tif (method != null) {\n\t\t\t\tif (typeof method !== 'function') {\n\t\t\t\t\tthrow new TypeError('Header pairs must be iterable');\n\t\t\t\t}\n\n\t\t\t\t// sequence<sequence<ByteString>>\n\t\t\t\t// Note: per spec we have to first exhaust the lists then process them\n\t\t\t\tconst pairs = [];\n\t\t\t\tfor (const pair of init) {\n\t\t\t\t\tif (typeof pair !== 'object' || typeof pair[Symbol.iterator] !== 'function') {\n\t\t\t\t\t\tthrow new TypeError('Each header pair must be iterable');\n\t\t\t\t\t}\n\t\t\t\t\tpairs.push(Array.from(pair));\n\t\t\t\t}\n\n\t\t\t\tfor (const pair of pairs) {\n\t\t\t\t\tif (pair.length !== 2) {\n\t\t\t\t\t\tthrow new TypeError('Each header pair must be a name/value tuple');\n\t\t\t\t\t}\n\t\t\t\t\tthis.append(pair[0], pair[1]);\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\t// record<ByteString, ByteString>\n\t\t\t\tfor (const key of Object.keys(init)) {\n\t\t\t\t\tconst value = 
init[key];\n\t\t\t\t\tthis.append(key, value);\n\t\t\t\t}\n\t\t\t}\n\t\t} else {\n\t\t\tthrow new TypeError('Provided initializer must be an object');\n\t\t}\n\t}\n\n\t/**\n  * Return combined header value given name\n  *\n  * @param   String  name  Header name\n  * @return  Mixed\n  */\n\tget(name) {\n\t\tname = `${name}`;\n\t\tvalidateName(name);\n\t\tconst key = find(this[MAP], name);\n\t\tif (key === undefined) {\n\t\t\treturn null;\n\t\t}\n\n\t\treturn this[MAP][key].join(', ');\n\t}\n\n\t/**\n  * Iterate over all headers\n  *\n  * @param   Function  callback  Executed for each item with parameters (value, name, thisArg)\n  * @param   Boolean   thisArg   `this` context for callback function\n  * @return  Void\n  */\n\tforEach(callback) {\n\t\tlet thisArg = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : undefined;\n\n\t\tlet pairs = getHeaders(this);\n\t\tlet i = 0;\n\t\twhile (i < pairs.length) {\n\t\t\tvar _pairs$i = pairs[i];\n\t\t\tconst name = _pairs$i[0],\n\t\t\t      value = _pairs$i[1];\n\n\t\t\tcallback.call(thisArg, value, name, this);\n\t\t\tpairs = getHeaders(this);\n\t\t\ti++;\n\t\t}\n\t}\n\n\t/**\n  * Overwrite header values given name\n  *\n  * @param   String  name   Header name\n  * @param   String  value  Header value\n  * @return  Void\n  */\n\tset(name, value) {\n\t\tname = `${name}`;\n\t\tvalue = `${value}`;\n\t\tvalidateName(name);\n\t\tvalidateValue(value);\n\t\tconst key = find(this[MAP], name);\n\t\tthis[MAP][key !== undefined ? 
key : name] = [value];\n\t}\n\n\t/**\n  * Append a value onto existing header\n  *\n  * @param   String  name   Header name\n  * @param   String  value  Header value\n  * @return  Void\n  */\n\tappend(name, value) {\n\t\tname = `${name}`;\n\t\tvalue = `${value}`;\n\t\tvalidateName(name);\n\t\tvalidateValue(value);\n\t\tconst key = find(this[MAP], name);\n\t\tif (key !== undefined) {\n\t\t\tthis[MAP][key].push(value);\n\t\t} else {\n\t\t\tthis[MAP][name] = [value];\n\t\t}\n\t}\n\n\t/**\n  * Check for header name existence\n  *\n  * @param   String   name  Header name\n  * @return  Boolean\n  */\n\thas(name) {\n\t\tname = `${name}`;\n\t\tvalidateName(name);\n\t\treturn find(this[MAP], name) !== undefined;\n\t}\n\n\t/**\n  * Delete all header values given name\n  *\n  * @param   String  name  Header name\n  * @return  Void\n  */\n\tdelete(name) {\n\t\tname = `${name}`;\n\t\tvalidateName(name);\n\t\tconst key = find(this[MAP], name);\n\t\tif (key !== undefined) {\n\t\t\tdelete this[MAP][key];\n\t\t}\n\t}\n\n\t/**\n  * Return raw headers (non-spec api)\n  *\n  * @return  Object\n  */\n\traw() {\n\t\treturn this[MAP];\n\t}\n\n\t/**\n  * Get an iterator on keys.\n  *\n  * @return  Iterator\n  */\n\tkeys() {\n\t\treturn createHeadersIterator(this, 'key');\n\t}\n\n\t/**\n  * Get an iterator on values.\n  *\n  * @return  Iterator\n  */\n\tvalues() {\n\t\treturn createHeadersIterator(this, 'value');\n\t}\n\n\t/**\n  * Get an iterator on entries.\n  *\n  * This is the default iterator of the Headers object.\n  *\n  * @return  Iterator\n  */\n\t[Symbol.iterator]() {\n\t\treturn createHeadersIterator(this, 'key+value');\n\t}\n}\nHeaders.prototype.entries = Headers.prototype[Symbol.iterator];\n\nObject.defineProperty(Headers.prototype, Symbol.toStringTag, {\n\tvalue: 'Headers',\n\twritable: false,\n\tenumerable: false,\n\tconfigurable: true\n});\n\nObject.defineProperties(Headers.prototype, {\n\tget: { enumerable: true },\n\tforEach: { enumerable: true },\n\tset: { enumerable: 
true },\n\tappend: { enumerable: true },\n\thas: { enumerable: true },\n\tdelete: { enumerable: true },\n\tkeys: { enumerable: true },\n\tvalues: { enumerable: true },\n\tentries: { enumerable: true }\n});\n\nfunction getHeaders(headers) {\n\tlet kind = arguments.length > 1 && arguments[1] !== undefined ? arguments[1] : 'key+value';\n\n\tconst keys = Object.keys(headers[MAP]).sort();\n\treturn keys.map(kind === 'key' ? function (k) {\n\t\treturn k.toLowerCase();\n\t} : kind === 'value' ? function (k) {\n\t\treturn headers[MAP][k].join(', ');\n\t} : function (k) {\n\t\treturn [k.toLowerCase(), headers[MAP][k].join(', ')];\n\t});\n}\n\nconst INTERNAL = Symbol('internal');\n\nfunction createHeadersIterator(target, kind) {\n\tconst iterator = Object.create(HeadersIteratorPrototype);\n\titerator[INTERNAL] = {\n\t\ttarget,\n\t\tkind,\n\t\tindex: 0\n\t};\n\treturn iterator;\n}\n\nconst HeadersIteratorPrototype = Object.setPrototypeOf({\n\tnext() {\n\t\t// istanbul ignore if\n\t\tif (!this || Object.getPrototypeOf(this) !== HeadersIteratorPrototype) {\n\t\t\tthrow new TypeError('Value of `this` is not a HeadersIterator');\n\t\t}\n\n\t\tvar _INTERNAL = this[INTERNAL];\n\t\tconst target = _INTERNAL.target,\n\t\t      kind = _INTERNAL.kind,\n\t\t      index = _INTERNAL.index;\n\n\t\tconst values = getHeaders(target, kind);\n\t\tconst len = values.length;\n\t\tif (index >= len) {\n\t\t\treturn {\n\t\t\t\tvalue: undefined,\n\t\t\t\tdone: true\n\t\t\t};\n\t\t}\n\n\t\tthis[INTERNAL].index = index + 1;\n\n\t\treturn {\n\t\t\tvalue: values[index],\n\t\t\tdone: false\n\t\t};\n\t}\n}, Object.getPrototypeOf(Object.getPrototypeOf([][Symbol.iterator]())));\n\nObject.defineProperty(HeadersIteratorPrototype, Symbol.toStringTag, {\n\tvalue: 'HeadersIterator',\n\twritable: false,\n\tenumerable: false,\n\tconfigurable: true\n});\n\n/**\n * Export the Headers object in a form that Node.js can consume.\n *\n * @param   Headers  headers\n * @return  Object\n */\nfunction 
exportNodeCompatibleHeaders(headers) {\n\tconst obj = Object.assign({ __proto__: null }, headers[MAP]);\n\n\t// http.request() only supports string as Host header. This hack makes\n\t// specifying custom Host header possible.\n\tconst hostHeaderKey = find(headers[MAP], 'Host');\n\tif (hostHeaderKey !== undefined) {\n\t\tobj[hostHeaderKey] = obj[hostHeaderKey][0];\n\t}\n\n\treturn obj;\n}\n\n/**\n * Create a Headers object from an object of headers, ignoring those that do\n * not conform to HTTP grammar productions.\n *\n * @param   Object  obj  Object of headers\n * @return  Headers\n */\nfunction createHeadersLenient(obj) {\n\tconst headers = new Headers();\n\tfor (const name of Object.keys(obj)) {\n\t\tif (invalidTokenRegex.test(name)) {\n\t\t\tcontinue;\n\t\t}\n\t\tif (Array.isArray(obj[name])) {\n\t\t\tfor (const val of obj[name]) {\n\t\t\t\tif (invalidHeaderCharRegex.test(val)) {\n\t\t\t\t\tcontinue;\n\t\t\t\t}\n\t\t\t\tif (headers[MAP][name] === undefined) {\n\t\t\t\t\theaders[MAP][name] = [val];\n\t\t\t\t} else {\n\t\t\t\t\theaders[MAP][name].push(val);\n\t\t\t\t}\n\t\t\t}\n\t\t} else if (!invalidHeaderCharRegex.test(obj[name])) {\n\t\t\theaders[MAP][name] = [obj[name]];\n\t\t}\n\t}\n\treturn headers;\n}\n\nconst INTERNALS$1 = Symbol('Response internals');\n\n// fix an issue where \"STATUS_CODES\" aren't a named export for node <10\nconst STATUS_CODES = http.STATUS_CODES;\n\n/**\n * Response class\n *\n * @param   Stream  body  Readable stream\n * @param   Object  opts  Response options\n * @return  Void\n */\nclass Response {\n\tconstructor() {\n\t\tlet body = arguments.length > 0 && arguments[0] !== undefined ? arguments[0] : null;\n\t\tlet opts = arguments.length > 1 && arguments[1] !== undefined ? 
arguments[1] : {};\n\n\t\tBody.call(this, body, opts);\n\n\t\tconst status = opts.status || 200;\n\t\tconst headers = new Headers(opts.headers);\n\n\t\tif (body != null && !headers.has('Content-Type')) {\n\t\t\tconst contentType = extractContentType(body);\n\t\t\tif (contentType) {\n\t\t\t\theaders.append('Content-Type', contentType);\n\t\t\t}\n\t\t}\n\n\t\tthis[INTERNALS$1] = {\n\t\t\turl: opts.url,\n\t\t\tstatus,\n\t\t\tstatusText: opts.statusText || STATUS_CODES[status],\n\t\t\theaders,\n\t\t\tcounter: opts.counter\n\t\t};\n\t}\n\n\tget url() {\n\t\treturn this[INTERNALS$1].url || '';\n\t}\n\n\tget status() {\n\t\treturn this[INTERNALS$1].status;\n\t}\n\n\t/**\n  * Convenience property representing if the request ended normally\n  */\n\tget ok() {\n\t\treturn this[INTERNALS$1].status >= 200 && this[INTERNALS$1].status < 300;\n\t}\n\n\tget redirected() {\n\t\treturn this[INTERNALS$1].counter > 0;\n\t}\n\n\tget statusText() {\n\t\treturn this[INTERNALS$1].statusText;\n\t}\n\n\tget headers() {\n\t\treturn this[INTERNALS$1].headers;\n\t}\n\n\t/**\n  * Clone this response\n  *\n  * @return  Response\n  */\n\tclone() {\n\t\treturn new Response(clone(this), {\n\t\t\turl: this.url,\n\t\t\tstatus: this.status,\n\t\t\tstatusText: this.statusText,\n\t\t\theaders: this.headers,\n\t\t\tok: this.ok,\n\t\t\tredirected: this.redirected\n\t\t});\n\t}\n}\n\nBody.mixIn(Response.prototype);\n\nObject.defineProperties(Response.prototype, {\n\turl: { enumerable: true },\n\tstatus: { enumerable: true },\n\tok: { enumerable: true },\n\tredirected: { enumerable: true },\n\tstatusText: { enumerable: true },\n\theaders: { enumerable: true },\n\tclone: { enumerable: true }\n});\n\nObject.defineProperty(Response.prototype, Symbol.toStringTag, {\n\tvalue: 'Response',\n\twritable: false,\n\tenumerable: false,\n\tconfigurable: true\n});\n\nconst INTERNALS$2 = Symbol('Request internals');\nconst URL = Url.URL || whatwgUrl.URL;\n\n// fix an issue where \"format\", \"parse\" aren't a named export 
for node <10\nconst parse_url = Url.parse;\nconst format_url = Url.format;\n\n/**\n * Wrapper around `new URL` to handle arbitrary URLs\n *\n * @param  {string} urlStr\n * @return {void}\n */\nfunction parseURL(urlStr) {\n\t/*\n \tCheck whether the URL is absolute or not\n \t\tScheme: https://tools.ietf.org/html/rfc3986#section-3.1\n \tAbsolute URL: https://tools.ietf.org/html/rfc3986#section-4.3\n */\n\tif (/^[a-zA-Z][a-zA-Z\\d+\\-.]*:/.exec(urlStr)) {\n\t\turlStr = new URL(urlStr).toString();\n\t}\n\n\t// Fallback to old implementation for arbitrary URLs\n\treturn parse_url(urlStr);\n}\n\nconst streamDestructionSupported = 'destroy' in Stream.Readable.prototype;\n\n/**\n * Check if a value is an instance of Request.\n *\n * @param   Mixed   input\n * @return  Boolean\n */\nfunction isRequest(input) {\n\treturn typeof input === 'object' && typeof input[INTERNALS$2] === 'object';\n}\n\nfunction isAbortSignal(signal) {\n\tconst proto = signal && typeof signal === 'object' && Object.getPrototypeOf(signal);\n\treturn !!(proto && proto.constructor.name === 'AbortSignal');\n}\n\n/**\n * Request class\n *\n * @param   Mixed   input  Url or Request instance\n * @param   Object  init   Custom options\n * @return  Void\n */\nclass Request {\n\tconstructor(input) {\n\t\tlet init = arguments.length > 1 && arguments[1] !== undefined ? 
arguments[1] : {};\n\n\t\tlet parsedURL;\n\n\t\t// normalize input\n\t\tif (!isRequest(input)) {\n\t\t\tif (input && input.href) {\n\t\t\t\t// in order to support Node.js' Url objects; though WHATWG's URL objects\n\t\t\t\t// will fall into this branch also (since their `toString()` will return\n\t\t\t\t// `href` property anyway)\n\t\t\t\tparsedURL = parseURL(input.href);\n\t\t\t} else {\n\t\t\t\t// coerce input to a string before attempting to parse\n\t\t\t\tparsedURL = parseURL(`${input}`);\n\t\t\t}\n\t\t\tinput = {};\n\t\t} else {\n\t\t\tparsedURL = parseURL(input.url);\n\t\t}\n\n\t\tlet method = init.method || input.method || 'GET';\n\t\tmethod = method.toUpperCase();\n\n\t\tif ((init.body != null || isRequest(input) && input.body !== null) && (method === 'GET' || method === 'HEAD')) {\n\t\t\tthrow new TypeError('Request with GET/HEAD method cannot have body');\n\t\t}\n\n\t\tlet inputBody = init.body != null ? init.body : isRequest(input) && input.body !== null ? clone(input) : null;\n\n\t\tBody.call(this, inputBody, {\n\t\t\ttimeout: init.timeout || input.timeout || 0,\n\t\t\tsize: init.size || input.size || 0\n\t\t});\n\n\t\tconst headers = new Headers(init.headers || input.headers || {});\n\n\t\tif (inputBody != null && !headers.has('Content-Type')) {\n\t\t\tconst contentType = extractContentType(inputBody);\n\t\t\tif (contentType) {\n\t\t\t\theaders.append('Content-Type', contentType);\n\t\t\t}\n\t\t}\n\n\t\tlet signal = isRequest(input) ? input.signal : null;\n\t\tif ('signal' in init) signal = init.signal;\n\n\t\tif (signal != null && !isAbortSignal(signal)) {\n\t\t\tthrow new TypeError('Expected signal to be an instanceof AbortSignal');\n\t\t}\n\n\t\tthis[INTERNALS$2] = {\n\t\t\tmethod,\n\t\t\tredirect: init.redirect || input.redirect || 'follow',\n\t\t\theaders,\n\t\t\tparsedURL,\n\t\t\tsignal\n\t\t};\n\n\t\t// node-fetch-only options\n\t\tthis.follow = init.follow !== undefined ? init.follow : input.follow !== undefined ? 
input.follow : 20;\n\t\tthis.compress = init.compress !== undefined ? init.compress : input.compress !== undefined ? input.compress : true;\n\t\tthis.counter = init.counter || input.counter || 0;\n\t\tthis.agent = init.agent || input.agent;\n\t}\n\n\tget method() {\n\t\treturn this[INTERNALS$2].method;\n\t}\n\n\tget url() {\n\t\treturn format_url(this[INTERNALS$2].parsedURL);\n\t}\n\n\tget headers() {\n\t\treturn this[INTERNALS$2].headers;\n\t}\n\n\tget redirect() {\n\t\treturn this[INTERNALS$2].redirect;\n\t}\n\n\tget signal() {\n\t\treturn this[INTERNALS$2].signal;\n\t}\n\n\t/**\n  * Clone this request\n  *\n  * @return  Request\n  */\n\tclone() {\n\t\treturn new Request(this);\n\t}\n}\n\nBody.mixIn(Request.prototype);\n\nObject.defineProperty(Request.prototype, Symbol.toStringTag, {\n\tvalue: 'Request',\n\twritable: false,\n\tenumerable: false,\n\tconfigurable: true\n});\n\nObject.defineProperties(Request.prototype, {\n\tmethod: { enumerable: true },\n\turl: { enumerable: true },\n\theaders: { enumerable: true },\n\tredirect: { enumerable: true },\n\tclone: { enumerable: true },\n\tsignal: { enumerable: true }\n});\n\n/**\n * Convert a Request to Node.js http request options.\n *\n * @param   Request  A Request instance\n * @return  Object   The options object to be passed to http.request\n */\nfunction getNodeRequestOptions(request) {\n\tconst parsedURL = request[INTERNALS$2].parsedURL;\n\tconst headers = new Headers(request[INTERNALS$2].headers);\n\n\t// fetch step 1.3\n\tif (!headers.has('Accept')) {\n\t\theaders.set('Accept', '*/*');\n\t}\n\n\t// Basic fetch\n\tif (!parsedURL.protocol || !parsedURL.hostname) {\n\t\tthrow new TypeError('Only absolute URLs are supported');\n\t}\n\n\tif (!/^https?:$/.test(parsedURL.protocol)) {\n\t\tthrow new TypeError('Only HTTP(S) protocols are supported');\n\t}\n\n\tif (request.signal && request.body instanceof Stream.Readable && !streamDestructionSupported) {\n\t\tthrow new Error('Cancellation of streamed requests with 
AbortSignal is not supported in node < 8');\n\t}\n\n\t// HTTP-network-or-cache fetch steps 2.4-2.7\n\tlet contentLengthValue = null;\n\tif (request.body == null && /^(POST|PUT)$/i.test(request.method)) {\n\t\tcontentLengthValue = '0';\n\t}\n\tif (request.body != null) {\n\t\tconst totalBytes = getTotalBytes(request);\n\t\tif (typeof totalBytes === 'number') {\n\t\t\tcontentLengthValue = String(totalBytes);\n\t\t}\n\t}\n\tif (contentLengthValue) {\n\t\theaders.set('Content-Length', contentLengthValue);\n\t}\n\n\t// HTTP-network-or-cache fetch step 2.11\n\tif (!headers.has('User-Agent')) {\n\t\theaders.set('User-Agent', 'node-fetch/1.0 (+https://github.com/bitinn/node-fetch)');\n\t}\n\n\t// HTTP-network-or-cache fetch step 2.15\n\tif (request.compress && !headers.has('Accept-Encoding')) {\n\t\theaders.set('Accept-Encoding', 'gzip,deflate');\n\t}\n\n\tlet agent = request.agent;\n\tif (typeof agent === 'function') {\n\t\tagent = agent(parsedURL);\n\t}\n\n\tif (!headers.has('Connection') && !agent) {\n\t\theaders.set('Connection', 'close');\n\t}\n\n\t// HTTP-network fetch step 4.2\n\t// chunked encoding is handled by Node.js\n\n\treturn Object.assign({}, parsedURL, {\n\t\tmethod: request.method,\n\t\theaders: exportNodeCompatibleHeaders(headers),\n\t\tagent\n\t});\n}\n\n/**\n * abort-error.js\n *\n * AbortError interface for cancelled requests\n */\n\n/**\n * Create AbortError instance\n *\n * @param   String      message      Error message for human\n * @return  AbortError\n */\nfunction AbortError(message) {\n  Error.call(this, message);\n\n  this.type = 'aborted';\n  this.message = message;\n\n  // hide custom error implementation details from end-users\n  Error.captureStackTrace(this, this.constructor);\n}\n\nAbortError.prototype = Object.create(Error.prototype);\nAbortError.prototype.constructor = AbortError;\nAbortError.prototype.name = 'AbortError';\n\nconst URL$1 = Url.URL || whatwgUrl.URL;\n\n// fix an issue where \"PassThrough\", \"resolve\" aren't a named 
export for node <10\nconst PassThrough$1 = Stream.PassThrough;\n\nconst isDomainOrSubdomain = function isDomainOrSubdomain(destination, original) {\n\tconst orig = new URL$1(original).hostname;\n\tconst dest = new URL$1(destination).hostname;\n\n\treturn orig === dest || orig[orig.length - dest.length - 1] === '.' && orig.endsWith(dest);\n};\n\n/**\n * Fetch function\n *\n * @param   Mixed    url   Absolute url or Request instance\n * @param   Object   opts  Fetch options\n * @return  Promise\n */\nfunction fetch(url, opts) {\n\n\t// allow custom promise\n\tif (!fetch.Promise) {\n\t\tthrow new Error('native promise missing, set fetch.Promise to your favorite alternative');\n\t}\n\n\tBody.Promise = fetch.Promise;\n\n\t// wrap http.request into fetch\n\treturn new fetch.Promise(function (resolve, reject) {\n\t\t// build request object\n\t\tconst request = new Request(url, opts);\n\t\tconst options = getNodeRequestOptions(request);\n\n\t\tconst send = (options.protocol === 'https:' ? https : http).request;\n\t\tconst signal = request.signal;\n\n\t\tlet response = null;\n\n\t\tconst abort = function abort() {\n\t\t\tlet error = new AbortError('The user aborted a request.');\n\t\t\treject(error);\n\t\t\tif (request.body && request.body instanceof Stream.Readable) {\n\t\t\t\trequest.body.destroy(error);\n\t\t\t}\n\t\t\tif (!response || !response.body) return;\n\t\t\tresponse.body.emit('error', error);\n\t\t};\n\n\t\tif (signal && signal.aborted) {\n\t\t\tabort();\n\t\t\treturn;\n\t\t}\n\n\t\tconst abortAndFinalize = function abortAndFinalize() {\n\t\t\tabort();\n\t\t\tfinalize();\n\t\t};\n\n\t\t// send request\n\t\tconst req = send(options);\n\t\tlet reqTimeout;\n\n\t\tif (signal) {\n\t\t\tsignal.addEventListener('abort', abortAndFinalize);\n\t\t}\n\n\t\tfunction finalize() {\n\t\t\treq.abort();\n\t\t\tif (signal) signal.removeEventListener('abort', abortAndFinalize);\n\t\t\tclearTimeout(reqTimeout);\n\t\t}\n\n\t\tif (request.timeout) {\n\t\t\treq.once('socket', function 
(socket) {\n\t\t\t\treqTimeout = setTimeout(function () {\n\t\t\t\t\treject(new FetchError(`network timeout at: ${request.url}`, 'request-timeout'));\n\t\t\t\t\tfinalize();\n\t\t\t\t}, request.timeout);\n\t\t\t});\n\t\t}\n\n\t\treq.on('error', function (err) {\n\t\t\treject(new FetchError(`request to ${request.url} failed, reason: ${err.message}`, 'system', err));\n\t\t\tfinalize();\n\t\t});\n\n\t\treq.on('response', function (res) {\n\t\t\tclearTimeout(reqTimeout);\n\n\t\t\tconst headers = createHeadersLenient(res.headers);\n\n\t\t\t// HTTP fetch step 5\n\t\t\tif (fetch.isRedirect(res.statusCode)) {\n\t\t\t\t// HTTP fetch step 5.2\n\t\t\t\tconst location = headers.get('Location');\n\n\t\t\t\t// HTTP fetch step 5.3\n\t\t\t\tlet locationURL = null;\n\t\t\t\ttry {\n\t\t\t\t\tlocationURL = location === null ? null : new URL$1(location, request.url).toString();\n\t\t\t\t} catch (err) {\n\t\t\t\t\t// error here can only be invalid URL in Location: header\n\t\t\t\t\t// do not throw when options.redirect == manual\n\t\t\t\t\t// let the user extract the erroneous redirect URL\n\t\t\t\t\tif (request.redirect !== 'manual') {\n\t\t\t\t\t\treject(new FetchError(`uri requested responds with an invalid redirect URL: ${location}`, 'invalid-redirect'));\n\t\t\t\t\t\tfinalize();\n\t\t\t\t\t\treturn;\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\t// HTTP fetch step 5.5\n\t\t\t\tswitch (request.redirect) {\n\t\t\t\t\tcase 'error':\n\t\t\t\t\t\treject(new FetchError(`uri requested responds with a redirect, redirect mode is set to error: ${request.url}`, 'no-redirect'));\n\t\t\t\t\t\tfinalize();\n\t\t\t\t\t\treturn;\n\t\t\t\t\tcase 'manual':\n\t\t\t\t\t\t// node-fetch-specific step: make manual redirect a bit easier to use by setting the Location header value to the resolved URL.\n\t\t\t\t\t\tif (locationURL !== null) {\n\t\t\t\t\t\t\t// handle corrupted header\n\t\t\t\t\t\t\ttry {\n\t\t\t\t\t\t\t\theaders.set('Location', locationURL);\n\t\t\t\t\t\t\t} catch (err) {\n\t\t\t\t\t\t\t\t// istanbul 
ignore next: nodejs server prevent invalid response headers, we can't test this through normal request\n\t\t\t\t\t\t\t\treject(err);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t\tbreak;\n\t\t\t\t\tcase 'follow':\n\t\t\t\t\t\t// HTTP-redirect fetch step 2\n\t\t\t\t\t\tif (locationURL === null) {\n\t\t\t\t\t\t\tbreak;\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\t// HTTP-redirect fetch step 5\n\t\t\t\t\t\tif (request.counter >= request.follow) {\n\t\t\t\t\t\t\treject(new FetchError(`maximum redirect reached at: ${request.url}`, 'max-redirect'));\n\t\t\t\t\t\t\tfinalize();\n\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\t// HTTP-redirect fetch step 6 (counter increment)\n\t\t\t\t\t\t// Create a new Request object.\n\t\t\t\t\t\tconst requestOpts = {\n\t\t\t\t\t\t\theaders: new Headers(request.headers),\n\t\t\t\t\t\t\tfollow: request.follow,\n\t\t\t\t\t\t\tcounter: request.counter + 1,\n\t\t\t\t\t\t\tagent: request.agent,\n\t\t\t\t\t\t\tcompress: request.compress,\n\t\t\t\t\t\t\tmethod: request.method,\n\t\t\t\t\t\t\tbody: request.body,\n\t\t\t\t\t\t\tsignal: request.signal,\n\t\t\t\t\t\t\ttimeout: request.timeout,\n\t\t\t\t\t\t\tsize: request.size\n\t\t\t\t\t\t};\n\n\t\t\t\t\t\tif (!isDomainOrSubdomain(request.url, locationURL)) {\n\t\t\t\t\t\t\tfor (const name of ['authorization', 'www-authenticate', 'cookie', 'cookie2']) {\n\t\t\t\t\t\t\t\trequestOpts.headers.delete(name);\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\t// HTTP-redirect fetch step 9\n\t\t\t\t\t\tif (res.statusCode !== 303 && request.body && getTotalBytes(request) === null) {\n\t\t\t\t\t\t\treject(new FetchError('Cannot follow redirect with body being a readable stream', 'unsupported-redirect'));\n\t\t\t\t\t\t\tfinalize();\n\t\t\t\t\t\t\treturn;\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\t// HTTP-redirect fetch step 11\n\t\t\t\t\t\tif (res.statusCode === 303 || (res.statusCode === 301 || res.statusCode === 302) && request.method === 'POST') {\n\t\t\t\t\t\t\trequestOpts.method = 'GET';\n\t\t\t\t\t\t\trequestOpts.body = 
undefined;\n\t\t\t\t\t\t\trequestOpts.headers.delete('content-length');\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\t// HTTP-redirect fetch step 15\n\t\t\t\t\t\tresolve(fetch(new Request(locationURL, requestOpts)));\n\t\t\t\t\t\tfinalize();\n\t\t\t\t\t\treturn;\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// prepare response\n\t\t\tres.once('end', function () {\n\t\t\t\tif (signal) signal.removeEventListener('abort', abortAndFinalize);\n\t\t\t});\n\t\t\tlet body = res.pipe(new PassThrough$1());\n\n\t\t\tconst response_options = {\n\t\t\t\turl: request.url,\n\t\t\t\tstatus: res.statusCode,\n\t\t\t\tstatusText: res.statusMessage,\n\t\t\t\theaders: headers,\n\t\t\t\tsize: request.size,\n\t\t\t\ttimeout: request.timeout,\n\t\t\t\tcounter: request.counter\n\t\t\t};\n\n\t\t\t// HTTP-network fetch step 12.1.1.3\n\t\t\tconst codings = headers.get('Content-Encoding');\n\n\t\t\t// HTTP-network fetch step 12.1.1.4: handle content codings\n\n\t\t\t// in following scenarios we ignore compression support\n\t\t\t// 1. compression support is disabled\n\t\t\t// 2. HEAD request\n\t\t\t// 3. no Content-Encoding header\n\t\t\t// 4. no content response (204)\n\t\t\t// 5. 
content not modified response (304)\n\t\t\tif (!request.compress || request.method === 'HEAD' || codings === null || res.statusCode === 204 || res.statusCode === 304) {\n\t\t\t\tresponse = new Response(body, response_options);\n\t\t\t\tresolve(response);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// For Node v6+\n\t\t\t// Be less strict when decoding compressed responses, since sometimes\n\t\t\t// servers send slightly invalid responses that are still accepted\n\t\t\t// by common browsers.\n\t\t\t// Always using Z_SYNC_FLUSH is what cURL does.\n\t\t\tconst zlibOptions = {\n\t\t\t\tflush: zlib.Z_SYNC_FLUSH,\n\t\t\t\tfinishFlush: zlib.Z_SYNC_FLUSH\n\t\t\t};\n\n\t\t\t// for gzip\n\t\t\tif (codings == 'gzip' || codings == 'x-gzip') {\n\t\t\t\tbody = body.pipe(zlib.createGunzip(zlibOptions));\n\t\t\t\tresponse = new Response(body, response_options);\n\t\t\t\tresolve(response);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// for deflate\n\t\t\tif (codings == 'deflate' || codings == 'x-deflate') {\n\t\t\t\t// handle the infamous raw deflate response from old servers\n\t\t\t\t// a hack for old IIS and Apache servers\n\t\t\t\tconst raw = res.pipe(new PassThrough$1());\n\t\t\t\traw.once('data', function (chunk) {\n\t\t\t\t\t// see http://stackoverflow.com/questions/37519828\n\t\t\t\t\tif ((chunk[0] & 0x0F) === 0x08) {\n\t\t\t\t\t\tbody = body.pipe(zlib.createInflate());\n\t\t\t\t\t} else {\n\t\t\t\t\t\tbody = body.pipe(zlib.createInflateRaw());\n\t\t\t\t\t}\n\t\t\t\t\tresponse = new Response(body, response_options);\n\t\t\t\t\tresolve(response);\n\t\t\t\t});\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// for br\n\t\t\tif (codings == 'br' && typeof zlib.createBrotliDecompress === 'function') {\n\t\t\t\tbody = body.pipe(zlib.createBrotliDecompress());\n\t\t\t\tresponse = new Response(body, response_options);\n\t\t\t\tresolve(response);\n\t\t\t\treturn;\n\t\t\t}\n\n\t\t\t// otherwise, use response as-is\n\t\t\tresponse = new Response(body, 
response_options);\n\t\t\tresolve(response);\n\t\t});\n\n\t\twriteToStream(req, request);\n\t});\n}\n/**\n * Redirect code matching\n *\n * @param   Number   code  Status code\n * @return  Boolean\n */\nfetch.isRedirect = function (code) {\n\treturn code === 301 || code === 302 || code === 303 || code === 307 || code === 308;\n};\n\n// expose Promise\nfetch.Promise = global.Promise;\n\nmodule.exports = exports = fetch;\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\nexports.default = exports;\nexports.Headers = Headers;\nexports.Request = Request;\nexports.Response = Response;\nexports.FetchError = FetchError;\n\n\n/***/ }),\n\n/***/ 502:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\nconst path = __webpack_require__(5622);\nconst pathKey = __webpack_require__(539);\n\nmodule.exports = opts => {\n\topts = Object.assign({\n\t\tcwd: process.cwd(),\n\t\tpath: process.env[pathKey()]\n\t}, opts);\n\n\tlet prev;\n\tlet pth = path.resolve(opts.cwd);\n\tconst ret = [];\n\n\twhile (prev !== pth) {\n\t\tret.push(path.join(pth, 'node_modules/.bin'));\n\t\tprev = pth;\n\t\tpth = path.resolve(pth, '..');\n\t}\n\n\t// ensure the running `node` binary is used\n\tret.push(path.dirname(process.execPath));\n\n\treturn ret.concat(opts.path).join(path.delimiter);\n};\n\nmodule.exports.env = opts => {\n\topts = Object.assign({\n\t\tenv: process.env\n\t}, opts);\n\n\tconst env = Object.assign({}, opts.env);\n\tconst path = pathKey({env});\n\n\topts.path = env[path];\n\tenv[path] = module.exports(opts);\n\n\treturn env;\n};\n\n\n/***/ }),\n\n/***/ 1223:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nvar wrappy = __webpack_require__(2940)\nmodule.exports = wrappy(once)\nmodule.exports.strict = wrappy(onceStrict)\n\nonce.proto = once(function () {\n  Object.defineProperty(Function.prototype, 'once', {\n    value: function () {\n      return once(this)\n    },\n    configurable: true\n  })\n\n  
Object.defineProperty(Function.prototype, 'onceStrict', {\n    value: function () {\n      return onceStrict(this)\n    },\n    configurable: true\n  })\n})\n\nfunction once (fn) {\n  var f = function () {\n    if (f.called) return f.value\n    f.called = true\n    return f.value = fn.apply(this, arguments)\n  }\n  f.called = false\n  return f\n}\n\nfunction onceStrict (fn) {\n  var f = function () {\n    if (f.called)\n      throw new Error(f.onceError)\n    f.called = true\n    return f.value = fn.apply(this, arguments)\n  }\n  var name = fn.name || 'Function wrapped with `once`'\n  f.onceError = name + \" shouldn't be called more than once\"\n  f.called = false\n  return f\n}\n\n\n/***/ }),\n\n/***/ 4824:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\nconst os = __webpack_require__(2087);\nconst macosRelease = __webpack_require__(7493);\nconst winRelease = __webpack_require__(3515);\n\nconst osName = (platform, release) => {\n\tif (!platform && release) {\n\t\tthrow new Error('You can\\'t specify a `release` without specifying `platform`');\n\t}\n\n\tplatform = platform || os.platform();\n\n\tlet id;\n\n\tif (platform === 'darwin') {\n\t\tif (!release && os.platform() === 'darwin') {\n\t\t\trelease = os.release();\n\t\t}\n\n\t\tconst prefix = release ? (Number(release.split('.')[0]) > 15 ? 'macOS' : 'OS X') : 'macOS';\n\t\tid = release ? macosRelease(release).name : '';\n\t\treturn prefix + (id ? ' ' + id : '');\n\t}\n\n\tif (platform === 'linux') {\n\t\tif (!release && os.platform() === 'linux') {\n\t\t\trelease = os.release();\n\t\t}\n\n\t\tid = release ? release.replace(/^(\\d+\\.\\d+).*/, '$1') : '';\n\t\treturn 'Linux' + (id ? ' ' + id : '');\n\t}\n\n\tif (platform === 'win32') {\n\t\tif (!release && os.platform() === 'win32') {\n\t\t\trelease = os.release();\n\t\t}\n\n\t\tid = release ? winRelease(release) : '';\n\t\treturn 'Windows' + (id ? 
' ' + id : '');\n\t}\n\n\treturn platform;\n};\n\nmodule.exports = osName;\n\n\n/***/ }),\n\n/***/ 1330:\n/***/ ((module) => {\n\n\"use strict\";\n\nmodule.exports = (promise, onFinally) => {\n\tonFinally = onFinally || (() => {});\n\n\treturn promise.then(\n\t\tval => new Promise(resolve => {\n\t\t\tresolve(onFinally());\n\t\t}).then(() => val),\n\t\terr => new Promise(resolve => {\n\t\t\tresolve(onFinally());\n\t\t}).then(() => {\n\t\t\tthrow err;\n\t\t})\n\t);\n};\n\n\n/***/ }),\n\n/***/ 539:\n/***/ ((module) => {\n\n\"use strict\";\n\nmodule.exports = opts => {\n\topts = opts || {};\n\n\tconst env = opts.env || process.env;\n\tconst platform = opts.platform || process.platform;\n\n\tif (platform !== 'win32') {\n\t\treturn 'PATH';\n\t}\n\n\treturn Object.keys(env).find(x => x.toUpperCase() === 'PATH') || 'Path';\n};\n\n\n/***/ }),\n\n/***/ 8341:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nvar once = __webpack_require__(1223)\nvar eos = __webpack_require__(1205)\nvar fs = __webpack_require__(5747) // we only need fs to get the ReadStream and WriteStream prototypes\n\nvar noop = function () {}\nvar ancient = /^v?\\.0/.test(process.version)\n\nvar isFn = function (fn) {\n  return typeof fn === 'function'\n}\n\nvar isFS = function (stream) {\n  if (!ancient) return false // newer node version do not need to care about fs is a special way\n  if (!fs) return false // browser\n  return (stream instanceof (fs.ReadStream || noop) || stream instanceof (fs.WriteStream || noop)) && isFn(stream.close)\n}\n\nvar isRequest = function (stream) {\n  return stream.setHeader && isFn(stream.abort)\n}\n\nvar destroyer = function (stream, reading, writing, callback) {\n  callback = once(callback)\n\n  var closed = false\n  stream.on('close', function () {\n    closed = true\n  })\n\n  eos(stream, {readable: reading, writable: writing}, function (err) {\n    if (err) return callback(err)\n    closed = true\n    callback()\n  })\n\n  var destroyed = false\n  
return function (err) {\n    if (closed) return\n    if (destroyed) return\n    destroyed = true\n\n    if (isFS(stream)) return stream.close(noop) // use close for fs streams to avoid fd leaks\n    if (isRequest(stream)) return stream.abort() // request.destroy just do .end - .abort is what we want\n\n    if (isFn(stream.destroy)) return stream.destroy()\n\n    callback(err || new Error('stream was destroyed'))\n  }\n}\n\nvar call = function (fn) {\n  fn()\n}\n\nvar pipe = function (from, to) {\n  return from.pipe(to)\n}\n\nvar pump = function () {\n  var streams = Array.prototype.slice.call(arguments)\n  var callback = isFn(streams[streams.length - 1] || noop) && streams.pop() || noop\n\n  if (Array.isArray(streams[0])) streams = streams[0]\n  if (streams.length < 2) throw new Error('pump requires two streams per minimum')\n\n  var error\n  var destroys = streams.map(function (stream, i) {\n    var reading = i < streams.length - 1\n    var writing = i > 0\n    return destroyer(stream, reading, writing, function (err) {\n      if (!error) error = err\n      if (err) destroys.forEach(call)\n      if (reading) return\n      destroys.forEach(call)\n      callback(error)\n    })\n  })\n\n  return streams.reduce(pipe)\n}\n\nmodule.exports = pump\n\n\n/***/ }),\n\n/***/ 5911:\n/***/ ((module, exports) => {\n\nexports = module.exports = SemVer\n\nvar debug\n/* istanbul ignore next */\nif (typeof process === 'object' &&\n    process.env &&\n    process.env.NODE_DEBUG &&\n    /\\bsemver\\b/i.test(process.env.NODE_DEBUG)) {\n  debug = function () {\n    var args = Array.prototype.slice.call(arguments, 0)\n    args.unshift('SEMVER')\n    console.log.apply(console, args)\n  }\n} else {\n  debug = function () {}\n}\n\n// Note: this is the semver.org version of the spec that it implements\n// Not necessarily the package version of this code.\nexports.SEMVER_SPEC_VERSION = '2.0.0'\n\nvar MAX_LENGTH = 256\nvar MAX_SAFE_INTEGER = Number.MAX_SAFE_INTEGER ||\n  /* istanbul ignore 
next */ 9007199254740991\n\n// Max safe segment length for coercion.\nvar MAX_SAFE_COMPONENT_LENGTH = 16\n\n// The actual regexps go on exports.re\nvar re = exports.re = []\nvar src = exports.src = []\nvar R = 0\n\n// The following Regular Expressions can be used for tokenizing,\n// validating, and parsing SemVer version strings.\n\n// ## Numeric Identifier\n// A single `0`, or a non-zero digit followed by zero or more digits.\n\nvar NUMERICIDENTIFIER = R++\nsrc[NUMERICIDENTIFIER] = '0|[1-9]\\\\d*'\nvar NUMERICIDENTIFIERLOOSE = R++\nsrc[NUMERICIDENTIFIERLOOSE] = '[0-9]+'\n\n// ## Non-numeric Identifier\n// Zero or more digits, followed by a letter or hyphen, and then zero or\n// more letters, digits, or hyphens.\n\nvar NONNUMERICIDENTIFIER = R++\nsrc[NONNUMERICIDENTIFIER] = '\\\\d*[a-zA-Z-][a-zA-Z0-9-]*'\n\n// ## Main Version\n// Three dot-separated numeric identifiers.\n\nvar MAINVERSION = R++\nsrc[MAINVERSION] = '(' + src[NUMERICIDENTIFIER] + ')\\\\.' +\n                   '(' + src[NUMERICIDENTIFIER] + ')\\\\.' +\n                   '(' + src[NUMERICIDENTIFIER] + ')'\n\nvar MAINVERSIONLOOSE = R++\nsrc[MAINVERSIONLOOSE] = '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\\\.' +\n                        '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\\\.' +\n                        '(' + src[NUMERICIDENTIFIERLOOSE] + ')'\n\n// ## Pre-release Version Identifier\n// A numeric identifier, or a non-numeric identifier.\n\nvar PRERELEASEIDENTIFIER = R++\nsrc[PRERELEASEIDENTIFIER] = '(?:' + src[NUMERICIDENTIFIER] +\n                            '|' + src[NONNUMERICIDENTIFIER] + ')'\n\nvar PRERELEASEIDENTIFIERLOOSE = R++\nsrc[PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[NUMERICIDENTIFIERLOOSE] +\n                                 '|' + src[NONNUMERICIDENTIFIER] + ')'\n\n// ## Pre-release Version\n// Hyphen, followed by one or more dot-separated pre-release version\n// identifiers.\n\nvar PRERELEASE = R++\nsrc[PRERELEASE] = '(?:-(' + src[PRERELEASEIDENTIFIER] +\n                  '(?:\\\\.' 
+ src[PRERELEASEIDENTIFIER] + ')*))'\n\nvar PRERELEASELOOSE = R++\nsrc[PRERELEASELOOSE] = '(?:-?(' + src[PRERELEASEIDENTIFIERLOOSE] +\n                       '(?:\\\\.' + src[PRERELEASEIDENTIFIERLOOSE] + ')*))'\n\n// ## Build Metadata Identifier\n// Any combination of digits, letters, or hyphens.\n\nvar BUILDIDENTIFIER = R++\nsrc[BUILDIDENTIFIER] = '[0-9A-Za-z-]+'\n\n// ## Build Metadata\n// Plus sign, followed by one or more period-separated build metadata\n// identifiers.\n\nvar BUILD = R++\nsrc[BUILD] = '(?:\\\\+(' + src[BUILDIDENTIFIER] +\n             '(?:\\\\.' + src[BUILDIDENTIFIER] + ')*))'\n\n// ## Full Version String\n// A main version, followed optionally by a pre-release version and\n// build metadata.\n\n// Note that the only major, minor, patch, and pre-release sections of\n// the version string are capturing groups.  The build metadata is not a\n// capturing group, because it should not ever be used in version\n// comparison.\n\nvar FULL = R++\nvar FULLPLAIN = 'v?' + src[MAINVERSION] +\n                src[PRERELEASE] + '?' +\n                src[BUILD] + '?'\n\nsrc[FULL] = '^' + FULLPLAIN + '$'\n\n// like full, but allows v1.2.3 and =1.2.3, which people do sometimes.\n// also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty\n// common in the npm registry.\nvar LOOSEPLAIN = '[v=\\\\s]*' + src[MAINVERSIONLOOSE] +\n                 src[PRERELEASELOOSE] + '?' 
+\n                 src[BUILD] + '?'\n\nvar LOOSE = R++\nsrc[LOOSE] = '^' + LOOSEPLAIN + '$'\n\nvar GTLT = R++\nsrc[GTLT] = '((?:<|>)?=?)'\n\n// Something like \"2.*\" or \"1.2.x\".\n// Note that \"x.x\" is a valid xRange identifer, meaning \"any version\"\n// Only the first item is strictly required.\nvar XRANGEIDENTIFIERLOOSE = R++\nsrc[XRANGEIDENTIFIERLOOSE] = src[NUMERICIDENTIFIERLOOSE] + '|x|X|\\\\*'\nvar XRANGEIDENTIFIER = R++\nsrc[XRANGEIDENTIFIER] = src[NUMERICIDENTIFIER] + '|x|X|\\\\*'\n\nvar XRANGEPLAIN = R++\nsrc[XRANGEPLAIN] = '[v=\\\\s]*(' + src[XRANGEIDENTIFIER] + ')' +\n                   '(?:\\\\.(' + src[XRANGEIDENTIFIER] + ')' +\n                   '(?:\\\\.(' + src[XRANGEIDENTIFIER] + ')' +\n                   '(?:' + src[PRERELEASE] + ')?' +\n                   src[BUILD] + '?' +\n                   ')?)?'\n\nvar XRANGEPLAINLOOSE = R++\nsrc[XRANGEPLAINLOOSE] = '[v=\\\\s]*(' + src[XRANGEIDENTIFIERLOOSE] + ')' +\n                        '(?:\\\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' +\n                        '(?:\\\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' +\n                        '(?:' + src[PRERELEASELOOSE] + ')?' +\n                        src[BUILD] + '?' +\n                        ')?)?'\n\nvar XRANGE = R++\nsrc[XRANGE] = '^' + src[GTLT] + '\\\\s*' + src[XRANGEPLAIN] + '$'\nvar XRANGELOOSE = R++\nsrc[XRANGELOOSE] = '^' + src[GTLT] + '\\\\s*' + src[XRANGEPLAINLOOSE] + '$'\n\n// Coercion.\n// Extract anything that could conceivably be a part of a valid semver\nvar COERCE = R++\nsrc[COERCE] = '(?:^|[^\\\\d])' +\n              '(\\\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '})' +\n              '(?:\\\\.(\\\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' +\n              '(?:\\\\.(\\\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' 
+\n              '(?:$|[^\\\\d])'\n\n// Tilde ranges.\n// Meaning is \"reasonably at or greater than\"\nvar LONETILDE = R++\nsrc[LONETILDE] = '(?:~>?)'\n\nvar TILDETRIM = R++\nsrc[TILDETRIM] = '(\\\\s*)' + src[LONETILDE] + '\\\\s+'\nre[TILDETRIM] = new RegExp(src[TILDETRIM], 'g')\nvar tildeTrimReplace = '$1~'\n\nvar TILDE = R++\nsrc[TILDE] = '^' + src[LONETILDE] + src[XRANGEPLAIN] + '$'\nvar TILDELOOSE = R++\nsrc[TILDELOOSE] = '^' + src[LONETILDE] + src[XRANGEPLAINLOOSE] + '$'\n\n// Caret ranges.\n// Meaning is \"at least and backwards compatible with\"\nvar LONECARET = R++\nsrc[LONECARET] = '(?:\\\\^)'\n\nvar CARETTRIM = R++\nsrc[CARETTRIM] = '(\\\\s*)' + src[LONECARET] + '\\\\s+'\nre[CARETTRIM] = new RegExp(src[CARETTRIM], 'g')\nvar caretTrimReplace = '$1^'\n\nvar CARET = R++\nsrc[CARET] = '^' + src[LONECARET] + src[XRANGEPLAIN] + '$'\nvar CARETLOOSE = R++\nsrc[CARETLOOSE] = '^' + src[LONECARET] + src[XRANGEPLAINLOOSE] + '$'\n\n// A simple gt/lt/eq thing, or just \"\" to indicate \"any version\"\nvar COMPARATORLOOSE = R++\nsrc[COMPARATORLOOSE] = '^' + src[GTLT] + '\\\\s*(' + LOOSEPLAIN + ')$|^$'\nvar COMPARATOR = R++\nsrc[COMPARATOR] = '^' + src[GTLT] + '\\\\s*(' + FULLPLAIN + ')$|^$'\n\n// An expression to strip any whitespace between the gtlt and the thing\n// it modifies, so that `> 1.2.3` ==> `>1.2.3`\nvar COMPARATORTRIM = R++\nsrc[COMPARATORTRIM] = '(\\\\s*)' + src[GTLT] +\n                      '\\\\s*(' + LOOSEPLAIN + '|' + src[XRANGEPLAIN] + ')'\n\n// this one has to use the /g flag\nre[COMPARATORTRIM] = new RegExp(src[COMPARATORTRIM], 'g')\nvar comparatorTrimReplace = '$1$2$3'\n\n// Something like `1.2.3 - 1.2.4`\n// Note that these all use the loose form, because they'll be\n// checked against either the strict or loose comparator form\n// later.\nvar HYPHENRANGE = R++\nsrc[HYPHENRANGE] = '^\\\\s*(' + src[XRANGEPLAIN] + ')' +\n                   '\\\\s+-\\\\s+' +\n                   '(' + src[XRANGEPLAIN] + ')' +\n                   '\\\\s*$'\n\nvar 
HYPHENRANGELOOSE = R++\nsrc[HYPHENRANGELOOSE] = '^\\\\s*(' + src[XRANGEPLAINLOOSE] + ')' +\n                        '\\\\s+-\\\\s+' +\n                        '(' + src[XRANGEPLAINLOOSE] + ')' +\n                        '\\\\s*$'\n\n// Star ranges basically just allow anything at all.\nvar STAR = R++\nsrc[STAR] = '(<|>)?=?\\\\s*\\\\*'\n\n// Compile to actual regexp objects.\n// All are flag-free, unless they were created above with a flag.\nfor (var i = 0; i < R; i++) {\n  debug(i, src[i])\n  if (!re[i]) {\n    re[i] = new RegExp(src[i])\n  }\n}\n\nexports.parse = parse\nfunction parse (version, options) {\n  if (!options || typeof options !== 'object') {\n    options = {\n      loose: !!options,\n      includePrerelease: false\n    }\n  }\n\n  if (version instanceof SemVer) {\n    return version\n  }\n\n  if (typeof version !== 'string') {\n    return null\n  }\n\n  if (version.length > MAX_LENGTH) {\n    return null\n  }\n\n  var r = options.loose ? re[LOOSE] : re[FULL]\n  if (!r.test(version)) {\n    return null\n  }\n\n  try {\n    return new SemVer(version, options)\n  } catch (er) {\n    return null\n  }\n}\n\nexports.valid = valid\nfunction valid (version, options) {\n  var v = parse(version, options)\n  return v ? v.version : null\n}\n\nexports.clean = clean\nfunction clean (version, options) {\n  var s = parse(version.trim().replace(/^[=v]+/, ''), options)\n  return s ? 
s.version : null\n}\n\nexports.SemVer = SemVer\n\nfunction SemVer (version, options) {\n  if (!options || typeof options !== 'object') {\n    options = {\n      loose: !!options,\n      includePrerelease: false\n    }\n  }\n  if (version instanceof SemVer) {\n    if (version.loose === options.loose) {\n      return version\n    } else {\n      version = version.version\n    }\n  } else if (typeof version !== 'string') {\n    throw new TypeError('Invalid Version: ' + version)\n  }\n\n  if (version.length > MAX_LENGTH) {\n    throw new TypeError('version is longer than ' + MAX_LENGTH + ' characters')\n  }\n\n  if (!(this instanceof SemVer)) {\n    return new SemVer(version, options)\n  }\n\n  debug('SemVer', version, options)\n  this.options = options\n  this.loose = !!options.loose\n\n  var m = version.trim().match(options.loose ? re[LOOSE] : re[FULL])\n\n  if (!m) {\n    throw new TypeError('Invalid Version: ' + version)\n  }\n\n  this.raw = version\n\n  // these are actually numbers\n  this.major = +m[1]\n  this.minor = +m[2]\n  this.patch = +m[3]\n\n  if (this.major > MAX_SAFE_INTEGER || this.major < 0) {\n    throw new TypeError('Invalid major version')\n  }\n\n  if (this.minor > MAX_SAFE_INTEGER || this.minor < 0) {\n    throw new TypeError('Invalid minor version')\n  }\n\n  if (this.patch > MAX_SAFE_INTEGER || this.patch < 0) {\n    throw new TypeError('Invalid patch version')\n  }\n\n  // numberify any prerelease numeric ids\n  if (!m[4]) {\n    this.prerelease = []\n  } else {\n    this.prerelease = m[4].split('.').map(function (id) {\n      if (/^[0-9]+$/.test(id)) {\n        var num = +id\n        if (num >= 0 && num < MAX_SAFE_INTEGER) {\n          return num\n        }\n      }\n      return id\n    })\n  }\n\n  this.build = m[5] ? m[5].split('.') : []\n  this.format()\n}\n\nSemVer.prototype.format = function () {\n  this.version = this.major + '.' + this.minor + '.' 
+ this.patch\n  if (this.prerelease.length) {\n    this.version += '-' + this.prerelease.join('.')\n  }\n  return this.version\n}\n\nSemVer.prototype.toString = function () {\n  return this.version\n}\n\nSemVer.prototype.compare = function (other) {\n  debug('SemVer.compare', this.version, this.options, other)\n  if (!(other instanceof SemVer)) {\n    other = new SemVer(other, this.options)\n  }\n\n  return this.compareMain(other) || this.comparePre(other)\n}\n\nSemVer.prototype.compareMain = function (other) {\n  if (!(other instanceof SemVer)) {\n    other = new SemVer(other, this.options)\n  }\n\n  return compareIdentifiers(this.major, other.major) ||\n         compareIdentifiers(this.minor, other.minor) ||\n         compareIdentifiers(this.patch, other.patch)\n}\n\nSemVer.prototype.comparePre = function (other) {\n  if (!(other instanceof SemVer)) {\n    other = new SemVer(other, this.options)\n  }\n\n  // NOT having a prerelease is > having one\n  if (this.prerelease.length && !other.prerelease.length) {\n    return -1\n  } else if (!this.prerelease.length && other.prerelease.length) {\n    return 1\n  } else if (!this.prerelease.length && !other.prerelease.length) {\n    return 0\n  }\n\n  var i = 0\n  do {\n    var a = this.prerelease[i]\n    var b = other.prerelease[i]\n    debug('prerelease compare', i, a, b)\n    if (a === undefined && b === undefined) {\n      return 0\n    } else if (b === undefined) {\n      return 1\n    } else if (a === undefined) {\n      return -1\n    } else if (a === b) {\n      continue\n    } else {\n      return compareIdentifiers(a, b)\n    }\n  } while (++i)\n}\n\n// preminor will bump the version up to the next minor release, and immediately\n// down to pre-release. 
premajor and prepatch work the same way.\nSemVer.prototype.inc = function (release, identifier) {\n  switch (release) {\n    case 'premajor':\n      this.prerelease.length = 0\n      this.patch = 0\n      this.minor = 0\n      this.major++\n      this.inc('pre', identifier)\n      break\n    case 'preminor':\n      this.prerelease.length = 0\n      this.patch = 0\n      this.minor++\n      this.inc('pre', identifier)\n      break\n    case 'prepatch':\n      // If this is already a prerelease, it will bump to the next version\n      // drop any prereleases that might already exist, since they are not\n      // relevant at this point.\n      this.prerelease.length = 0\n      this.inc('patch', identifier)\n      this.inc('pre', identifier)\n      break\n    // If the input is a non-prerelease version, this acts the same as\n    // prepatch.\n    case 'prerelease':\n      if (this.prerelease.length === 0) {\n        this.inc('patch', identifier)\n      }\n      this.inc('pre', identifier)\n      break\n\n    case 'major':\n      // If this is a pre-major version, bump up to the same major version.\n      // Otherwise increment major.\n      // 1.0.0-5 bumps to 1.0.0\n      // 1.1.0 bumps to 2.0.0\n      if (this.minor !== 0 ||\n          this.patch !== 0 ||\n          this.prerelease.length === 0) {\n        this.major++\n      }\n      this.minor = 0\n      this.patch = 0\n      this.prerelease = []\n      break\n    case 'minor':\n      // If this is a pre-minor version, bump up to the same minor version.\n      // Otherwise increment minor.\n      // 1.2.0-5 bumps to 1.2.0\n      // 1.2.1 bumps to 1.3.0\n      if (this.patch !== 0 || this.prerelease.length === 0) {\n        this.minor++\n      }\n      this.patch = 0\n      this.prerelease = []\n      break\n    case 'patch':\n      // If this is not a pre-release version, it will increment the patch.\n      // If it is a pre-release it will bump up to the same patch version.\n      // 1.2.0-5 patches to 1.2.0\n    
  // 1.2.0 patches to 1.2.1\n      if (this.prerelease.length === 0) {\n        this.patch++\n      }\n      this.prerelease = []\n      break\n    // This probably shouldn't be used publicly.\n    // 1.0.0 \"pre\" would become 1.0.0-0 which is the wrong direction.\n    case 'pre':\n      if (this.prerelease.length === 0) {\n        this.prerelease = [0]\n      } else {\n        var i = this.prerelease.length\n        while (--i >= 0) {\n          if (typeof this.prerelease[i] === 'number') {\n            this.prerelease[i]++\n            i = -2\n          }\n        }\n        if (i === -1) {\n          // didn't increment anything\n          this.prerelease.push(0)\n        }\n      }\n      if (identifier) {\n        // 1.2.0-beta.1 bumps to 1.2.0-beta.2,\n        // 1.2.0-beta.fooblz or 1.2.0-beta bumps to 1.2.0-beta.0\n        if (this.prerelease[0] === identifier) {\n          if (isNaN(this.prerelease[1])) {\n            this.prerelease = [identifier, 0]\n          }\n        } else {\n          this.prerelease = [identifier, 0]\n        }\n      }\n      break\n\n    default:\n      throw new Error('invalid increment argument: ' + release)\n  }\n  this.format()\n  this.raw = this.version\n  return this\n}\n\nexports.inc = inc\nfunction inc (version, release, loose, identifier) {\n  if (typeof (loose) === 'string') {\n    identifier = loose\n    loose = undefined\n  }\n\n  try {\n    return new SemVer(version, loose).inc(release, identifier).version\n  } catch (er) {\n    return null\n  }\n}\n\nexports.diff = diff\nfunction diff (version1, version2) {\n  if (eq(version1, version2)) {\n    return null\n  } else {\n    var v1 = parse(version1)\n    var v2 = parse(version2)\n    var prefix = ''\n    if (v1.prerelease.length || v2.prerelease.length) {\n      prefix = 'pre'\n      var defaultResult = 'prerelease'\n    }\n    for (var key in v1) {\n      if (key === 'major' || key === 'minor' || key === 'patch') {\n        if (v1[key] !== v2[key]) {\n          
return prefix + key\n        }\n      }\n    }\n    return defaultResult // may be undefined\n  }\n}\n\nexports.compareIdentifiers = compareIdentifiers\n\nvar numeric = /^[0-9]+$/\nfunction compareIdentifiers (a, b) {\n  var anum = numeric.test(a)\n  var bnum = numeric.test(b)\n\n  if (anum && bnum) {\n    a = +a\n    b = +b\n  }\n\n  return a === b ? 0\n    : (anum && !bnum) ? -1\n    : (bnum && !anum) ? 1\n    : a < b ? -1\n    : 1\n}\n\nexports.rcompareIdentifiers = rcompareIdentifiers\nfunction rcompareIdentifiers (a, b) {\n  return compareIdentifiers(b, a)\n}\n\nexports.major = major\nfunction major (a, loose) {\n  return new SemVer(a, loose).major\n}\n\nexports.minor = minor\nfunction minor (a, loose) {\n  return new SemVer(a, loose).minor\n}\n\nexports.patch = patch\nfunction patch (a, loose) {\n  return new SemVer(a, loose).patch\n}\n\nexports.compare = compare\nfunction compare (a, b, loose) {\n  return new SemVer(a, loose).compare(new SemVer(b, loose))\n}\n\nexports.compareLoose = compareLoose\nfunction compareLoose (a, b) {\n  return compare(a, b, true)\n}\n\nexports.rcompare = rcompare\nfunction rcompare (a, b, loose) {\n  return compare(b, a, loose)\n}\n\nexports.sort = sort\nfunction sort (list, loose) {\n  return list.sort(function (a, b) {\n    return exports.compare(a, b, loose)\n  })\n}\n\nexports.rsort = rsort\nfunction rsort (list, loose) {\n  return list.sort(function (a, b) {\n    return exports.rcompare(a, b, loose)\n  })\n}\n\nexports.gt = gt\nfunction gt (a, b, loose) {\n  return compare(a, b, loose) > 0\n}\n\nexports.lt = lt\nfunction lt (a, b, loose) {\n  return compare(a, b, loose) < 0\n}\n\nexports.eq = eq\nfunction eq (a, b, loose) {\n  return compare(a, b, loose) === 0\n}\n\nexports.neq = neq\nfunction neq (a, b, loose) {\n  return compare(a, b, loose) !== 0\n}\n\nexports.gte = gte\nfunction gte (a, b, loose) {\n  return compare(a, b, loose) >= 0\n}\n\nexports.lte = lte\nfunction lte (a, b, loose) {\n  return compare(a, b, loose) <= 
0\n}\n\nexports.cmp = cmp\nfunction cmp (a, op, b, loose) {\n  switch (op) {\n    case '===':\n      if (typeof a === 'object')\n        a = a.version\n      if (typeof b === 'object')\n        b = b.version\n      return a === b\n\n    case '!==':\n      if (typeof a === 'object')\n        a = a.version\n      if (typeof b === 'object')\n        b = b.version\n      return a !== b\n\n    case '':\n    case '=':\n    case '==':\n      return eq(a, b, loose)\n\n    case '!=':\n      return neq(a, b, loose)\n\n    case '>':\n      return gt(a, b, loose)\n\n    case '>=':\n      return gte(a, b, loose)\n\n    case '<':\n      return lt(a, b, loose)\n\n    case '<=':\n      return lte(a, b, loose)\n\n    default:\n      throw new TypeError('Invalid operator: ' + op)\n  }\n}\n\nexports.Comparator = Comparator\nfunction Comparator (comp, options) {\n  if (!options || typeof options !== 'object') {\n    options = {\n      loose: !!options,\n      includePrerelease: false\n    }\n  }\n\n  if (comp instanceof Comparator) {\n    if (comp.loose === !!options.loose) {\n      return comp\n    } else {\n      comp = comp.value\n    }\n  }\n\n  if (!(this instanceof Comparator)) {\n    return new Comparator(comp, options)\n  }\n\n  debug('comparator', comp, options)\n  this.options = options\n  this.loose = !!options.loose\n  this.parse(comp)\n\n  if (this.semver === ANY) {\n    this.value = ''\n  } else {\n    this.value = this.operator + this.semver.version\n  }\n\n  debug('comp', this)\n}\n\nvar ANY = {}\nComparator.prototype.parse = function (comp) {\n  var r = this.options.loose ? 
re[COMPARATORLOOSE] : re[COMPARATOR]\n  var m = comp.match(r)\n\n  if (!m) {\n    throw new TypeError('Invalid comparator: ' + comp)\n  }\n\n  this.operator = m[1]\n  if (this.operator === '=') {\n    this.operator = ''\n  }\n\n  // if it literally is just '>' or '' then allow anything.\n  if (!m[2]) {\n    this.semver = ANY\n  } else {\n    this.semver = new SemVer(m[2], this.options.loose)\n  }\n}\n\nComparator.prototype.toString = function () {\n  return this.value\n}\n\nComparator.prototype.test = function (version) {\n  debug('Comparator.test', version, this.options.loose)\n\n  if (this.semver === ANY) {\n    return true\n  }\n\n  if (typeof version === 'string') {\n    version = new SemVer(version, this.options)\n  }\n\n  return cmp(version, this.operator, this.semver, this.options)\n}\n\nComparator.prototype.intersects = function (comp, options) {\n  if (!(comp instanceof Comparator)) {\n    throw new TypeError('a Comparator is required')\n  }\n\n  if (!options || typeof options !== 'object') {\n    options = {\n      loose: !!options,\n      includePrerelease: false\n    }\n  }\n\n  var rangeTmp\n\n  if (this.operator === '') {\n    rangeTmp = new Range(comp.value, options)\n    return satisfies(this.value, rangeTmp, options)\n  } else if (comp.operator === '') {\n    rangeTmp = new Range(this.value, options)\n    return satisfies(comp.semver, rangeTmp, options)\n  }\n\n  var sameDirectionIncreasing =\n    (this.operator === '>=' || this.operator === '>') &&\n    (comp.operator === '>=' || comp.operator === '>')\n  var sameDirectionDecreasing =\n    (this.operator === '<=' || this.operator === '<') &&\n    (comp.operator === '<=' || comp.operator === '<')\n  var sameSemVer = this.semver.version === comp.semver.version\n  var differentDirectionsInclusive =\n    (this.operator === '>=' || this.operator === '<=') &&\n    (comp.operator === '>=' || comp.operator === '<=')\n  var oppositeDirectionsLessThan =\n    cmp(this.semver, '<', comp.semver, options) &&\n  
  ((this.operator === '>=' || this.operator === '>') &&\n    (comp.operator === '<=' || comp.operator === '<'))\n  var oppositeDirectionsGreaterThan =\n    cmp(this.semver, '>', comp.semver, options) &&\n    ((this.operator === '<=' || this.operator === '<') &&\n    (comp.operator === '>=' || comp.operator === '>'))\n\n  return sameDirectionIncreasing || sameDirectionDecreasing ||\n    (sameSemVer && differentDirectionsInclusive) ||\n    oppositeDirectionsLessThan || oppositeDirectionsGreaterThan\n}\n\nexports.Range = Range\nfunction Range (range, options) {\n  if (!options || typeof options !== 'object') {\n    options = {\n      loose: !!options,\n      includePrerelease: false\n    }\n  }\n\n  if (range instanceof Range) {\n    if (range.loose === !!options.loose &&\n        range.includePrerelease === !!options.includePrerelease) {\n      return range\n    } else {\n      return new Range(range.raw, options)\n    }\n  }\n\n  if (range instanceof Comparator) {\n    return new Range(range.value, options)\n  }\n\n  if (!(this instanceof Range)) {\n    return new Range(range, options)\n  }\n\n  this.options = options\n  this.loose = !!options.loose\n  this.includePrerelease = !!options.includePrerelease\n\n  // First, split based on boolean or ||\n  this.raw = range\n  this.set = range.split(/\\s*\\|\\|\\s*/).map(function (range) {\n    return this.parseRange(range.trim())\n  }, this).filter(function (c) {\n    // throw out any that are not relevant for whatever reason\n    return c.length\n  })\n\n  if (!this.set.length) {\n    throw new TypeError('Invalid SemVer Range: ' + range)\n  }\n\n  this.format()\n}\n\nRange.prototype.format = function () {\n  this.range = this.set.map(function (comps) {\n    return comps.join(' ').trim()\n  }).join('||').trim()\n  return this.range\n}\n\nRange.prototype.toString = function () {\n  return this.range\n}\n\nRange.prototype.parseRange = function (range) {\n  var loose = this.options.loose\n  range = range.trim()\n  // `1.2.3 
- 1.2.4` => `>=1.2.3 <=1.2.4`\n  var hr = loose ? re[HYPHENRANGELOOSE] : re[HYPHENRANGE]\n  range = range.replace(hr, hyphenReplace)\n  debug('hyphen replace', range)\n  // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5`\n  range = range.replace(re[COMPARATORTRIM], comparatorTrimReplace)\n  debug('comparator trim', range, re[COMPARATORTRIM])\n\n  // `~ 1.2.3` => `~1.2.3`\n  range = range.replace(re[TILDETRIM], tildeTrimReplace)\n\n  // `^ 1.2.3` => `^1.2.3`\n  range = range.replace(re[CARETTRIM], caretTrimReplace)\n\n  // normalize spaces\n  range = range.split(/\\s+/).join(' ')\n\n  // At this point, the range is completely trimmed and\n  // ready to be split into comparators.\n\n  var compRe = loose ? re[COMPARATORLOOSE] : re[COMPARATOR]\n  var set = range.split(' ').map(function (comp) {\n    return parseComparator(comp, this.options)\n  }, this).join(' ').split(/\\s+/)\n  if (this.options.loose) {\n    // in loose mode, throw out any that are not valid comparators\n    set = set.filter(function (comp) {\n      return !!comp.match(compRe)\n    })\n  }\n  set = set.map(function (comp) {\n    return new Comparator(comp, this.options)\n  }, this)\n\n  return set\n}\n\nRange.prototype.intersects = function (range, options) {\n  if (!(range instanceof Range)) {\n    throw new TypeError('a Range is required')\n  }\n\n  return this.set.some(function (thisComparators) {\n    return thisComparators.every(function (thisComparator) {\n      return range.set.some(function (rangeComparators) {\n        return rangeComparators.every(function (rangeComparator) {\n          return thisComparator.intersects(rangeComparator, options)\n        })\n      })\n    })\n  })\n}\n\n// Mostly just for testing and legacy API reasons\nexports.toComparators = toComparators\nfunction toComparators (range, options) {\n  return new Range(range, options).set.map(function (comp) {\n    return comp.map(function (c) {\n      return c.value\n    }).join(' ').trim().split(' ')\n  })\n}\n\n// comprised of 
xranges, tildes, stars, and gtlt's at this point.\n// already replaced the hyphen ranges\n// turn into a set of JUST comparators.\nfunction parseComparator (comp, options) {\n  debug('comp', comp, options)\n  comp = replaceCarets(comp, options)\n  debug('caret', comp)\n  comp = replaceTildes(comp, options)\n  debug('tildes', comp)\n  comp = replaceXRanges(comp, options)\n  debug('xrange', comp)\n  comp = replaceStars(comp, options)\n  debug('stars', comp)\n  return comp\n}\n\nfunction isX (id) {\n  return !id || id.toLowerCase() === 'x' || id === '*'\n}\n\n// ~, ~> --> * (any, kinda silly)\n// ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0\n// ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0\n// ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0\n// ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0\n// ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0\nfunction replaceTildes (comp, options) {\n  return comp.trim().split(/\\s+/).map(function (comp) {\n    return replaceTilde(comp, options)\n  }).join(' ')\n}\n\nfunction replaceTilde (comp, options) {\n  var r = options.loose ? re[TILDELOOSE] : re[TILDE]\n  return comp.replace(r, function (_, M, m, p, pr) {\n    debug('tilde', comp, _, M, m, p, pr)\n    var ret\n\n    if (isX(M)) {\n      ret = ''\n    } else if (isX(m)) {\n      ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'\n    } else if (isX(p)) {\n      // ~1.2 == >=1.2.0 <1.3.0\n      ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'\n    } else if (pr) {\n      debug('replaceTilde pr', pr)\n      ret = '>=' + M + '.' + m + '.' + p + '-' + pr +\n            ' <' + M + '.' + (+m + 1) + '.0'\n    } else {\n      // ~1.2.3 == >=1.2.3 <1.3.0\n      ret = '>=' + M + '.' + m + '.' + p +\n            ' <' + M + '.' 
+ (+m + 1) + '.0'\n    }\n\n    debug('tilde return', ret)\n    return ret\n  })\n}\n\n// ^ --> * (any, kinda silly)\n// ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0\n// ^2.0, ^2.0.x --> >=2.0.0 <3.0.0\n// ^1.2, ^1.2.x --> >=1.2.0 <2.0.0\n// ^1.2.3 --> >=1.2.3 <2.0.0\n// ^1.2.0 --> >=1.2.0 <2.0.0\nfunction replaceCarets (comp, options) {\n  return comp.trim().split(/\\s+/).map(function (comp) {\n    return replaceCaret(comp, options)\n  }).join(' ')\n}\n\nfunction replaceCaret (comp, options) {\n  debug('caret', comp, options)\n  var r = options.loose ? re[CARETLOOSE] : re[CARET]\n  return comp.replace(r, function (_, M, m, p, pr) {\n    debug('caret', comp, _, M, m, p, pr)\n    var ret\n\n    if (isX(M)) {\n      ret = ''\n    } else if (isX(m)) {\n      ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'\n    } else if (isX(p)) {\n      if (M === '0') {\n        ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'\n      } else {\n        ret = '>=' + M + '.' + m + '.0 <' + (+M + 1) + '.0.0'\n      }\n    } else if (pr) {\n      debug('replaceCaret pr', pr)\n      if (M === '0') {\n        if (m === '0') {\n          ret = '>=' + M + '.' + m + '.' + p + '-' + pr +\n                ' <' + M + '.' + m + '.' + (+p + 1)\n        } else {\n          ret = '>=' + M + '.' + m + '.' + p + '-' + pr +\n                ' <' + M + '.' + (+m + 1) + '.0'\n        }\n      } else {\n        ret = '>=' + M + '.' + m + '.' + p + '-' + pr +\n              ' <' + (+M + 1) + '.0.0'\n      }\n    } else {\n      debug('no pr')\n      if (M === '0') {\n        if (m === '0') {\n          ret = '>=' + M + '.' + m + '.' + p +\n                ' <' + M + '.' + m + '.' + (+p + 1)\n        } else {\n          ret = '>=' + M + '.' + m + '.' + p +\n                ' <' + M + '.' + (+m + 1) + '.0'\n        }\n      } else {\n        ret = '>=' + M + '.' + m + '.' 
+ p +\n              ' <' + (+M + 1) + '.0.0'\n      }\n    }\n\n    debug('caret return', ret)\n    return ret\n  })\n}\n\nfunction replaceXRanges (comp, options) {\n  debug('replaceXRanges', comp, options)\n  return comp.split(/\\s+/).map(function (comp) {\n    return replaceXRange(comp, options)\n  }).join(' ')\n}\n\nfunction replaceXRange (comp, options) {\n  comp = comp.trim()\n  var r = options.loose ? re[XRANGELOOSE] : re[XRANGE]\n  return comp.replace(r, function (ret, gtlt, M, m, p, pr) {\n    debug('xRange', comp, ret, gtlt, M, m, p, pr)\n    var xM = isX(M)\n    var xm = xM || isX(m)\n    var xp = xm || isX(p)\n    var anyX = xp\n\n    if (gtlt === '=' && anyX) {\n      gtlt = ''\n    }\n\n    if (xM) {\n      if (gtlt === '>' || gtlt === '<') {\n        // nothing is allowed\n        ret = '<0.0.0'\n      } else {\n        // nothing is forbidden\n        ret = '*'\n      }\n    } else if (gtlt && anyX) {\n      // we know patch is an x, because we have any x at all.\n      // replace X with 0\n      if (xm) {\n        m = 0\n      }\n      p = 0\n\n      if (gtlt === '>') {\n        // >1 => >=2.0.0\n        // >1.2 => >=1.3.0\n        // >1.2.3 => >= 1.2.4\n        gtlt = '>='\n        if (xm) {\n          M = +M + 1\n          m = 0\n          p = 0\n        } else {\n          m = +m + 1\n          p = 0\n        }\n      } else if (gtlt === '<=') {\n        // <=0.7.x is actually <0.8.0, since any 0.7.x should\n        // pass.  Similarly, <=7.x is actually <8.0.0, etc.\n        gtlt = '<'\n        if (xm) {\n          M = +M + 1\n        } else {\n          m = +m + 1\n        }\n      }\n\n      ret = gtlt + M + '.' + m + '.' + p\n    } else if (xm) {\n      ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'\n    } else if (xp) {\n      ret = '>=' + M + '.' + m + '.0 <' + M + '.' 
+ (+m + 1) + '.0'\n    }\n\n    debug('xRange return', ret)\n\n    return ret\n  })\n}\n\n// Because * is AND-ed with everything else in the comparator,\n// and '' means \"any version\", just remove the *s entirely.\nfunction replaceStars (comp, options) {\n  debug('replaceStars', comp, options)\n  // Looseness is ignored here.  star is always as loose as it gets!\n  return comp.trim().replace(re[STAR], '')\n}\n\n// This function is passed to string.replace(re[HYPHENRANGE])\n// M, m, patch, prerelease, build\n// 1.2 - 3.4.5 => >=1.2.0 <=3.4.5\n// 1.2.3 - 3.4 => >=1.2.0 <3.5.0 Any 3.4.x will do\n// 1.2 - 3.4 => >=1.2.0 <3.5.0\nfunction hyphenReplace ($0,\n  from, fM, fm, fp, fpr, fb,\n  to, tM, tm, tp, tpr, tb) {\n  if (isX(fM)) {\n    from = ''\n  } else if (isX(fm)) {\n    from = '>=' + fM + '.0.0'\n  } else if (isX(fp)) {\n    from = '>=' + fM + '.' + fm + '.0'\n  } else {\n    from = '>=' + from\n  }\n\n  if (isX(tM)) {\n    to = ''\n  } else if (isX(tm)) {\n    to = '<' + (+tM + 1) + '.0.0'\n  } else if (isX(tp)) {\n    to = '<' + tM + '.' + (+tm + 1) + '.0'\n  } else if (tpr) {\n    to = '<=' + tM + '.' + tm + '.' 
+ tp + '-' + tpr\n  } else {\n    to = '<=' + to\n  }\n\n  return (from + ' ' + to).trim()\n}\n\n// if ANY of the sets match ALL of its comparators, then pass\nRange.prototype.test = function (version) {\n  if (!version) {\n    return false\n  }\n\n  if (typeof version === 'string') {\n    version = new SemVer(version, this.options)\n  }\n\n  for (var i = 0; i < this.set.length; i++) {\n    if (testSet(this.set[i], version, this.options)) {\n      return true\n    }\n  }\n  return false\n}\n\nfunction testSet (set, version, options) {\n  for (var i = 0; i < set.length; i++) {\n    if (!set[i].test(version)) {\n      return false\n    }\n  }\n\n  if (version.prerelease.length && !options.includePrerelease) {\n    // Find the set of versions that are allowed to have prereleases\n    // For example, ^1.2.3-pr.1 desugars to >=1.2.3-pr.1 <2.0.0\n    // That should allow `1.2.3-pr.2` to pass.\n    // However, `1.2.4-alpha.notready` should NOT be allowed,\n    // even though it's within the range set by the comparators.\n    for (i = 0; i < set.length; i++) {\n      debug(set[i].semver)\n      if (set[i].semver === ANY) {\n        continue\n      }\n\n      if (set[i].semver.prerelease.length > 0) {\n        var allowed = set[i].semver\n        if (allowed.major === version.major &&\n            allowed.minor === version.minor &&\n            allowed.patch === version.patch) {\n          return true\n        }\n      }\n    }\n\n    // Version has a -pre, but it's not one of the ones we like.\n    return false\n  }\n\n  return true\n}\n\nexports.satisfies = satisfies\nfunction satisfies (version, range, options) {\n  try {\n    range = new Range(range, options)\n  } catch (er) {\n    return false\n  }\n  return range.test(version)\n}\n\nexports.maxSatisfying = maxSatisfying\nfunction maxSatisfying (versions, range, options) {\n  var max = null\n  var maxSV = null\n  try {\n    var rangeObj = new Range(range, options)\n  } catch (er) {\n    return null\n  }\n  
versions.forEach(function (v) {\n    if (rangeObj.test(v)) {\n      // satisfies(v, range, options)\n      if (!max || maxSV.compare(v) === -1) {\n        // compare(max, v, true)\n        max = v\n        maxSV = new SemVer(max, options)\n      }\n    }\n  })\n  return max\n}\n\nexports.minSatisfying = minSatisfying\nfunction minSatisfying (versions, range, options) {\n  var min = null\n  var minSV = null\n  try {\n    var rangeObj = new Range(range, options)\n  } catch (er) {\n    return null\n  }\n  versions.forEach(function (v) {\n    if (rangeObj.test(v)) {\n      // satisfies(v, range, options)\n      if (!min || minSV.compare(v) === 1) {\n        // compare(min, v, true)\n        min = v\n        minSV = new SemVer(min, options)\n      }\n    }\n  })\n  return min\n}\n\nexports.minVersion = minVersion\nfunction minVersion (range, loose) {\n  range = new Range(range, loose)\n\n  var minver = new SemVer('0.0.0')\n  if (range.test(minver)) {\n    return minver\n  }\n\n  minver = new SemVer('0.0.0-0')\n  if (range.test(minver)) {\n    return minver\n  }\n\n  minver = null\n  for (var i = 0; i < range.set.length; ++i) {\n    var comparators = range.set[i]\n\n    comparators.forEach(function (comparator) {\n      // Clone to avoid manipulating the comparator's semver object.\n      var compver = new SemVer(comparator.semver.version)\n      switch (comparator.operator) {\n        case '>':\n          if (compver.prerelease.length === 0) {\n            compver.patch++\n          } else {\n            compver.prerelease.push(0)\n          }\n          compver.raw = compver.format()\n          /* fallthrough */\n        case '':\n        case '>=':\n          if (!minver || gt(minver, compver)) {\n            minver = compver\n          }\n          break\n        case '<':\n        case '<=':\n          /* Ignore maximum versions */\n          break\n        /* istanbul ignore next */\n        default:\n          throw new Error('Unexpected operation: ' + 
comparator.operator)\n      }\n    })\n  }\n\n  if (minver && range.test(minver)) {\n    return minver\n  }\n\n  return null\n}\n\nexports.validRange = validRange\nfunction validRange (range, options) {\n  try {\n    // Return '*' instead of '' so that truthiness works.\n    // This will throw if it's invalid anyway\n    return new Range(range, options).range || '*'\n  } catch (er) {\n    return null\n  }\n}\n\n// Determine if version is less than all the versions possible in the range\nexports.ltr = ltr\nfunction ltr (version, range, options) {\n  return outside(version, range, '<', options)\n}\n\n// Determine if version is greater than all the versions possible in the range.\nexports.gtr = gtr\nfunction gtr (version, range, options) {\n  return outside(version, range, '>', options)\n}\n\nexports.outside = outside\nfunction outside (version, range, hilo, options) {\n  version = new SemVer(version, options)\n  range = new Range(range, options)\n\n  var gtfn, ltefn, ltfn, comp, ecomp\n  switch (hilo) {\n    case '>':\n      gtfn = gt\n      ltefn = lte\n      ltfn = lt\n      comp = '>'\n      ecomp = '>='\n      break\n    case '<':\n      gtfn = lt\n      ltefn = gte\n      ltfn = gt\n      comp = '<'\n      ecomp = '<='\n      break\n    default:\n      throw new TypeError('Must provide a hilo val of \"<\" or \">\"')\n  }\n\n  // If it satisfies the range it is not outside\n  if (satisfies(version, range, options)) {\n    return false\n  }\n\n  // From now on, variable terms are as if we're in \"gtr\" mode.\n  // but note that everything is flipped for the \"ltr\" function.\n\n  for (var i = 0; i < range.set.length; ++i) {\n    var comparators = range.set[i]\n\n    var high = null\n    var low = null\n\n    comparators.forEach(function (comparator) {\n      if (comparator.semver === ANY) {\n        comparator = new Comparator('>=0.0.0')\n      }\n      high = high || comparator\n      low = low || comparator\n      if (gtfn(comparator.semver, high.semver, 
options)) {\n        high = comparator\n      } else if (ltfn(comparator.semver, low.semver, options)) {\n        low = comparator\n      }\n    })\n\n    // If the edge version comparator has an operator then our version\n    // isn't outside it\n    if (high.operator === comp || high.operator === ecomp) {\n      return false\n    }\n\n    // If the lowest version comparator has an operator and our version\n    // is less than it then it isn't higher than the range\n    if ((!low.operator || low.operator === comp) &&\n        ltefn(version, low.semver)) {\n      return false\n    } else if (low.operator === ecomp && ltfn(version, low.semver)) {\n      return false\n    }\n  }\n  return true\n}\n\nexports.prerelease = prerelease\nfunction prerelease (version, options) {\n  var parsed = parse(version, options)\n  return (parsed && parsed.prerelease.length) ? parsed.prerelease : null\n}\n\nexports.intersects = intersects\nfunction intersects (r1, r2, options) {\n  r1 = new Range(r1, options)\n  r2 = new Range(r2, options)\n  return r1.intersects(r2)\n}\n\nexports.coerce = coerce\nfunction coerce (version) {\n  if (version instanceof SemVer) {\n    return version\n  }\n\n  if (typeof version !== 'string') {\n    return null\n  }\n\n  var match = version.match(re[COERCE])\n\n  if (match == null) {\n    return null\n  }\n\n  return parse(match[1] +\n    '.' + (match[2] || '0') +\n    '.' + (match[3] || '0'))\n}\n\n\n/***/ }),\n\n/***/ 7032:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\nvar shebangRegex = __webpack_require__(2638);\n\nmodule.exports = function (str) {\n\tvar match = str.match(shebangRegex);\n\n\tif (!match) {\n\t\treturn null;\n\t}\n\n\tvar arr = match[0].replace(/#! ?/, '').split(' ');\n\tvar bin = arr[0].split('/').pop();\n\tvar arg = arr[1];\n\n\treturn (bin === 'env' ?\n\t\targ :\n\t\tbin + (arg ? 
' ' + arg : '')\n\t);\n};\n\n\n/***/ }),\n\n/***/ 2638:\n/***/ ((module) => {\n\n\"use strict\";\n\nmodule.exports = /^#!.*/;\n\n\n/***/ }),\n\n/***/ 4931:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n// Note: since nyc uses this module to output coverage, any lines\n// that are in the direct sync flow of nyc's outputCoverage are\n// ignored, since we can never get coverage for them.\nvar assert = __webpack_require__(2357)\nvar signals = __webpack_require__(3710)\nvar isWin = /^win/i.test(process.platform)\n\nvar EE = __webpack_require__(8614)\n/* istanbul ignore if */\nif (typeof EE !== 'function') {\n  EE = EE.EventEmitter\n}\n\nvar emitter\nif (process.__signal_exit_emitter__) {\n  emitter = process.__signal_exit_emitter__\n} else {\n  emitter = process.__signal_exit_emitter__ = new EE()\n  emitter.count = 0\n  emitter.emitted = {}\n}\n\n// Because this emitter is a global, we have to check to see if a\n// previous version of this library failed to enable infinite listeners.\n// I know what you're about to say.  But literally everything about\n// signal-exit is a compromise with evil.  
Get used to it.\nif (!emitter.infinite) {\n  emitter.setMaxListeners(Infinity)\n  emitter.infinite = true\n}\n\nmodule.exports = function (cb, opts) {\n  assert.equal(typeof cb, 'function', 'a callback must be provided for exit handler')\n\n  if (loaded === false) {\n    load()\n  }\n\n  var ev = 'exit'\n  if (opts && opts.alwaysLast) {\n    ev = 'afterexit'\n  }\n\n  var remove = function () {\n    emitter.removeListener(ev, cb)\n    if (emitter.listeners('exit').length === 0 &&\n        emitter.listeners('afterexit').length === 0) {\n      unload()\n    }\n  }\n  emitter.on(ev, cb)\n\n  return remove\n}\n\nmodule.exports.unload = unload\nfunction unload () {\n  if (!loaded) {\n    return\n  }\n  loaded = false\n\n  signals.forEach(function (sig) {\n    try {\n      process.removeListener(sig, sigListeners[sig])\n    } catch (er) {}\n  })\n  process.emit = originalProcessEmit\n  process.reallyExit = originalProcessReallyExit\n  emitter.count -= 1\n}\n\nfunction emit (event, code, signal) {\n  if (emitter.emitted[event]) {\n    return\n  }\n  emitter.emitted[event] = true\n  emitter.emit(event, code, signal)\n}\n\n// { <signal>: <listener fn>, ... 
}\nvar sigListeners = {}\nsignals.forEach(function (sig) {\n  sigListeners[sig] = function listener () {\n    // If there are no other listeners, an exit is coming!\n    // Simplest way: remove us and then re-send the signal.\n    // We know that this will kill the process, so we can\n    // safely emit now.\n    var listeners = process.listeners(sig)\n    if (listeners.length === emitter.count) {\n      unload()\n      emit('exit', null, sig)\n      /* istanbul ignore next */\n      emit('afterexit', null, sig)\n      /* istanbul ignore next */\n      if (isWin && sig === 'SIGHUP') {\n        // \"SIGHUP\" throws an `ENOSYS` error on Windows,\n        // so use a supported signal instead\n        sig = 'SIGINT'\n      }\n      process.kill(process.pid, sig)\n    }\n  }\n})\n\nmodule.exports.signals = function () {\n  return signals\n}\n\nmodule.exports.load = load\n\nvar loaded = false\n\nfunction load () {\n  if (loaded) {\n    return\n  }\n  loaded = true\n\n  // This is the number of onSignalExit's that are in play.\n  // It's important so that we can count the correct number of\n  // listeners on signals, and don't wait for the other one to\n  // handle it instead of us.\n  emitter.count += 1\n\n  signals = signals.filter(function (sig) {\n    try {\n      process.on(sig, sigListeners[sig])\n      return true\n    } catch (er) {\n      return false\n    }\n  })\n\n  process.emit = processEmit\n  process.reallyExit = processReallyExit\n}\n\nvar originalProcessReallyExit = process.reallyExit\nfunction processReallyExit (code) {\n  process.exitCode = code || 0\n  emit('exit', process.exitCode, null)\n  /* istanbul ignore next */\n  emit('afterexit', process.exitCode, null)\n  /* istanbul ignore next */\n  originalProcessReallyExit.call(process, process.exitCode)\n}\n\nvar originalProcessEmit = process.emit\nfunction processEmit (ev, arg) {\n  if (ev === 'exit') {\n    if (arg !== undefined) {\n      process.exitCode = arg\n    }\n    var ret = 
originalProcessEmit.apply(this, arguments)\n    emit('exit', process.exitCode, null)\n    /* istanbul ignore next */\n    emit('afterexit', process.exitCode, null)\n    return ret\n  } else {\n    return originalProcessEmit.apply(this, arguments)\n  }\n}\n\n\n/***/ }),\n\n/***/ 3710:\n/***/ ((module) => {\n\n// This is not the set of all possible signals.\n//\n// It IS, however, the set of all signals that trigger\n// an exit on either Linux or BSD systems.  Linux is a\n// superset of the signal names supported on BSD, and\n// the unknown signals just fail to register, so we can\n// catch that easily enough.\n//\n// Don't bother with SIGKILL.  It's uncatchable, which\n// means that we can't fire any callbacks anyway.\n//\n// If a user does happen to register a handler on a non-\n// fatal signal like SIGWINCH or something, and then\n// exit, it'll end up firing `process.emit('exit')`, so\n// the handler will be fired anyway.\n//\n// SIGBUS, SIGFPE, SIGSEGV and SIGILL, when not raised\n// artificially, inherently leave the process in a\n// state from which it is not safe to try and enter JS\n// listeners.\nmodule.exports = [\n  'SIGABRT',\n  'SIGALRM',\n  'SIGHUP',\n  'SIGINT',\n  'SIGTERM'\n]\n\nif (process.platform !== 'win32') {\n  module.exports.push(\n    'SIGVTALRM',\n    'SIGXCPU',\n    'SIGXFSZ',\n    'SIGUSR2',\n    'SIGTRAP',\n    'SIGSYS',\n    'SIGQUIT',\n    'SIGIOT'\n    // should detect profiler and enable/disable accordingly.\n    // see #21\n    // 'SIGPROF'\n  )\n}\n\nif (process.platform === 'linux') {\n  module.exports.push(\n    'SIGIO',\n    'SIGPOLL',\n    'SIGPWR',\n    'SIGSTKFLT',\n    'SIGUNUSED'\n  )\n}\n\n\n/***/ }),\n\n/***/ 5515:\n/***/ ((module) => {\n\n\"use strict\";\n\nmodule.exports = function (x) {\n\tvar lf = typeof x === 'string' ? '\\n' : '\\n'.charCodeAt();\n\tvar cr = typeof x === 'string' ? 
'\\r' : '\\r'.charCodeAt();\n\n\tif (x[x.length - 1] === lf) {\n\t\tx = x.slice(0, x.length - 1);\n\t}\n\n\tif (x[x.length - 1] === cr) {\n\t\tx = x.slice(0, x.length - 1);\n\t}\n\n\treturn x;\n};\n\n\n/***/ }),\n\n/***/ 4256:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nvar punycode = __webpack_require__(4213);\nvar mappingTable = __webpack_require__(68);\n\nvar PROCESSING_OPTIONS = {\n  TRANSITIONAL: 0,\n  NONTRANSITIONAL: 1\n};\n\nfunction normalize(str) { // fix bug in v8\n  return str.split('\\u0000').map(function (s) { return s.normalize('NFC'); }).join('\\u0000');\n}\n\nfunction findStatus(val) {\n  var start = 0;\n  var end = mappingTable.length - 1;\n\n  while (start <= end) {\n    var mid = Math.floor((start + end) / 2);\n\n    var target = mappingTable[mid];\n    if (target[0][0] <= val && target[0][1] >= val) {\n      return target;\n    } else if (target[0][0] > val) {\n      end = mid - 1;\n    } else {\n      start = mid + 1;\n    }\n  }\n\n  return null;\n}\n\nvar regexAstralSymbols = /[\\uD800-\\uDBFF][\\uDC00-\\uDFFF]/g;\n\nfunction countSymbols(string) {\n  return string\n    // replace every surrogate pair with a BMP symbol\n    .replace(regexAstralSymbols, '_')\n    // then get the length\n    .length;\n}\n\nfunction mapChars(domain_name, useSTD3, processing_option) {\n  var hasError = false;\n  var processed = \"\";\n\n  var len = countSymbols(domain_name);\n  for (var i = 0; i < len; ++i) {\n    var codePoint = domain_name.codePointAt(i);\n    var status = findStatus(codePoint);\n\n    switch (status[1]) {\n      case \"disallowed\":\n        hasError = true;\n        processed += String.fromCodePoint(codePoint);\n        break;\n      case \"ignored\":\n        break;\n      case \"mapped\":\n        processed += String.fromCodePoint.apply(String, status[2]);\n        break;\n      case \"deviation\":\n        if (processing_option === PROCESSING_OPTIONS.TRANSITIONAL) {\n          processed += 
String.fromCodePoint.apply(String, status[2]);\n        } else {\n          processed += String.fromCodePoint(codePoint);\n        }\n        break;\n      case \"valid\":\n        processed += String.fromCodePoint(codePoint);\n        break;\n      case \"disallowed_STD3_mapped\":\n        if (useSTD3) {\n          hasError = true;\n          processed += String.fromCodePoint(codePoint);\n        } else {\n          processed += String.fromCodePoint.apply(String, status[2]);\n        }\n        break;\n      case \"disallowed_STD3_valid\":\n        if (useSTD3) {\n          hasError = true;\n        }\n\n        processed += String.fromCodePoint(codePoint);\n        break;\n    }\n  }\n\n  return {\n    string: processed,\n    error: hasError\n  };\n}\n\nvar combiningMarksRegex = /[\\u0300-\\u036F\\u0483-\\u0489\\u0591-\\u05BD\\u05BF\\u05C1\\u05C2\\u05C4\\u05C5\\u05C7\\u0610-\\u061A\\u064B-\\u065F\\u0670\\u06D6-\\u06DC\\u06DF-\\u06E4\\u06E7\\u06E8\\u06EA-\\u06ED\\u0711\\u0730-\\u074A\\u07A6-\\u07B0\\u07EB-\\u07F3\\u0816-\\u0819\\u081B-\\u0823\\u0825-\\u0827\\u0829-\\u082D\\u0859-\\u085B\\u08E4-\\u0903\\u093A-\\u093C\\u093E-\\u094F\\u0951-\\u0957\\u0962\\u0963\\u0981-\\u0983\\u09BC\\u09BE-\\u09C4\\u09C7\\u09C8\\u09CB-\\u09CD\\u09D7\\u09E2\\u09E3\\u0A01-\\u0A03\\u0A3C\\u0A3E-\\u0A42\\u0A47\\u0A48\\u0A4B-\\u0A4D\\u0A51\\u0A70\\u0A71\\u0A75\\u0A81-\\u0A83\\u0ABC\\u0ABE-\\u0AC5\\u0AC7-\\u0AC9\\u0ACB-\\u0ACD\\u0AE2\\u0AE3\\u0B01-\\u0B03\\u0B3C\\u0B3E-\\u0B44\\u0B47\\u0B48\\u0B4B-\\u0B4D\\u0B56\\u0B57\\u0B62\\u0B63\\u0B82\\u0BBE-\\u0BC2\\u0BC6-\\u0BC8\\u0BCA-\\u0BCD\\u0BD7\\u0C00-\\u0C03\\u0C3E-\\u0C44\\u0C46-\\u0C48\\u0C4A-\\u0C4D\\u0C55\\u0C56\\u0C62\\u0C63\\u0C81-\\u0C83\\u0CBC\\u0CBE-\\u0CC4\\u0CC6-\\u0CC8\\u0CCA-\\u0CCD\\u0CD5\\u0CD6\\u0CE2\\u0CE3\\u0D01-\\u0D03\\u0D3E-\\u0D44\\u0D46-\\u0D48\\u0D4A-\\u0D4D\\u0D57\\u0D62\\u0D63\\u0D82\\u0D83\\u0DCA\\u0DCF-\\u0DD4\\u0DD6\\u0DD8-\\u0DDF\\u0DF2\\u0DF3\\u0E31\\u0E34-\\u0E3A\\u0E47-\\u0E4E\\u0EB1\\u0EB4-\\u0EB9\\u0EBB\\u0E
BC\\u0EC8-\\u0ECD\\u0F18\\u0F19\\u0F35\\u0F37\\u0F39\\u0F3E\\u0F3F\\u0F71-\\u0F84\\u0F86\\u0F87\\u0F8D-\\u0F97\\u0F99-\\u0FBC\\u0FC6\\u102B-\\u103E\\u1056-\\u1059\\u105E-\\u1060\\u1062-\\u1064\\u1067-\\u106D\\u1071-\\u1074\\u1082-\\u108D\\u108F\\u109A-\\u109D\\u135D-\\u135F\\u1712-\\u1714\\u1732-\\u1734\\u1752\\u1753\\u1772\\u1773\\u17B4-\\u17D3\\u17DD\\u180B-\\u180D\\u18A9\\u1920-\\u192B\\u1930-\\u193B\\u19B0-\\u19C0\\u19C8\\u19C9\\u1A17-\\u1A1B\\u1A55-\\u1A5E\\u1A60-\\u1A7C\\u1A7F\\u1AB0-\\u1ABE\\u1B00-\\u1B04\\u1B34-\\u1B44\\u1B6B-\\u1B73\\u1B80-\\u1B82\\u1BA1-\\u1BAD\\u1BE6-\\u1BF3\\u1C24-\\u1C37\\u1CD0-\\u1CD2\\u1CD4-\\u1CE8\\u1CED\\u1CF2-\\u1CF4\\u1CF8\\u1CF9\\u1DC0-\\u1DF5\\u1DFC-\\u1DFF\\u20D0-\\u20F0\\u2CEF-\\u2CF1\\u2D7F\\u2DE0-\\u2DFF\\u302A-\\u302F\\u3099\\u309A\\uA66F-\\uA672\\uA674-\\uA67D\\uA69F\\uA6F0\\uA6F1\\uA802\\uA806\\uA80B\\uA823-\\uA827\\uA880\\uA881\\uA8B4-\\uA8C4\\uA8E0-\\uA8F1\\uA926-\\uA92D\\uA947-\\uA953\\uA980-\\uA983\\uA9B3-\\uA9C0\\uA9E5\\uAA29-\\uAA36\\uAA43\\uAA4C\\uAA4D\\uAA7B-\\uAA7D\\uAAB0\\uAAB2-\\uAAB4\\uAAB7\\uAAB8\\uAABE\\uAABF\\uAAC1\\uAAEB-\\uAAEF\\uAAF5\\uAAF6\\uABE3-\\uABEA\\uABEC\\uABED\\uFB1E\\uFE00-\\uFE0F\\uFE20-\\uFE2D]|\\uD800[\\uDDFD\\uDEE0\\uDF76-\\uDF7A]|\\uD802[\\uDE01-\\uDE03\\uDE05\\uDE06\\uDE0C-\\uDE0F\\uDE38-\\uDE3A\\uDE3F\\uDEE5\\uDEE6]|\\uD804[\\uDC00-\\uDC02\\uDC38-\\uDC46\\uDC7F-\\uDC82\\uDCB0-\\uDCBA\\uDD00-\\uDD02\\uDD27-\\uDD34\\uDD73\\uDD80-\\uDD82\\uDDB3-\\uDDC0\\uDE2C-\\uDE37\\uDEDF-\\uDEEA\\uDF01-\\uDF03\\uDF3C\\uDF3E-\\uDF44\\uDF47\\uDF48\\uDF4B-\\uDF4D\\uDF57\\uDF62\\uDF63\\uDF66-\\uDF6C\\uDF70-\\uDF74]|\\uD805[\\uDCB0-\\uDCC3\\uDDAF-\\uDDB5\\uDDB8-\\uDDC0\\uDE30-\\uDE40\\uDEAB-\\uDEB7]|\\uD81A[\\uDEF0-\\uDEF4\\uDF30-\\uDF36]|\\uD81B[\\uDF51-\\uDF7E\\uDF8F-\\uDF92]|\\uD82F[\\uDC9D\\uDC9E]|\\uD834[\\uDD65-\\uDD69\\uDD6D-\\uDD72\\uDD7B-\\uDD82\\uDD85-\\uDD8B\\uDDAA-\\uDDAD\\uDE42-\\uDE44]|\\uD83A[\\uDCD0-\\uDCD6]|\\uDB40[\\uDD00-\\uDDEF]/;\n\nfunction validateLabel(label, processing_option) {\n  
if (label.substr(0, 4) === \"xn--\") {\n    label = punycode.toUnicode(label);\n    processing_option = PROCESSING_OPTIONS.NONTRANSITIONAL;\n  }\n\n  var error = false;\n\n  if (normalize(label) !== label ||\n      (label[3] === \"-\" && label[4] === \"-\") ||\n      label[0] === \"-\" || label[label.length - 1] === \"-\" ||\n      label.indexOf(\".\") !== -1 ||\n      label.search(combiningMarksRegex) === 0) {\n    error = true;\n  }\n\n  var len = countSymbols(label);\n  for (var i = 0; i < len; ++i) {\n    var status = findStatus(label.codePointAt(i));\n    if ((processing_option === PROCESSING_OPTIONS.TRANSITIONAL && status[1] !== \"valid\") ||\n        (processing_option === PROCESSING_OPTIONS.NONTRANSITIONAL &&\n         status[1] !== \"valid\" && status[1] !== \"deviation\")) {\n      error = true;\n      break;\n    }\n  }\n\n  return {\n    label: label,\n    error: error\n  };\n}\n\nfunction processing(domain_name, useSTD3, processing_option) {\n  var result = mapChars(domain_name, useSTD3, processing_option);\n  result.string = normalize(result.string);\n\n  var labels = result.string.split(\".\");\n  for (var i = 0; i < labels.length; ++i) {\n    try {\n      var validation = validateLabel(labels[i], processing_option);\n      labels[i] = validation.label;\n      result.error = result.error || validation.error;\n    } catch(e) {\n      result.error = true;\n    }\n  }\n\n  return {\n    string: labels.join(\".\"),\n    error: result.error\n  };\n}\n\nmodule.exports.toASCII = function(domain_name, useSTD3, processing_option, verifyDnsLength) {\n  var result = processing(domain_name, useSTD3, processing_option);\n  var labels = result.string.split(\".\");\n  labels = labels.map(function(l) {\n    try {\n      return punycode.toASCII(l);\n    } catch(e) {\n      result.error = true;\n      return l;\n    }\n  });\n\n  if (verifyDnsLength) {\n    var total = labels.slice(0, labels.length - 1).join(\".\").length;\n    if (total > 253 || total === 0) {\n      result.error = 
true;\n    }\n\n    for (var i=0; i < labels.length; ++i) {\n      if (labels[i].length > 63 || labels[i].length === 0) {\n        result.error = true;\n        break;\n      }\n    }\n  }\n\n  if (result.error) return null;\n  return labels.join(\".\");\n};\n\nmodule.exports.toUnicode = function(domain_name, useSTD3) {\n  var result = processing(domain_name, useSTD3, PROCESSING_OPTIONS.NONTRANSITIONAL);\n\n  return {\n    domain: result.string,\n    error: result.error\n  };\n};\n\nmodule.exports.PROCESSING_OPTIONS = PROCESSING_OPTIONS;\n\n\n/***/ }),\n\n/***/ 4294:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nmodule.exports = __webpack_require__(4219);\n\n\n/***/ }),\n\n/***/ 4219:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nvar net = __webpack_require__(1631);\nvar tls = __webpack_require__(4016);\nvar http = __webpack_require__(8605);\nvar https = __webpack_require__(7211);\nvar events = __webpack_require__(8614);\nvar assert = __webpack_require__(2357);\nvar util = __webpack_require__(1669);\n\n\nexports.httpOverHttp = httpOverHttp;\nexports.httpsOverHttp = httpsOverHttp;\nexports.httpOverHttps = httpOverHttps;\nexports.httpsOverHttps = httpsOverHttps;\n\n\nfunction httpOverHttp(options) {\n  var agent = new TunnelingAgent(options);\n  agent.request = http.request;\n  return agent;\n}\n\nfunction httpsOverHttp(options) {\n  var agent = new TunnelingAgent(options);\n  agent.request = http.request;\n  agent.createSocket = createSecureSocket;\n  agent.defaultPort = 443;\n  return agent;\n}\n\nfunction httpOverHttps(options) {\n  var agent = new TunnelingAgent(options);\n  agent.request = https.request;\n  return agent;\n}\n\nfunction httpsOverHttps(options) {\n  var agent = new TunnelingAgent(options);\n  agent.request = https.request;\n  agent.createSocket = createSecureSocket;\n  agent.defaultPort = 443;\n  return agent;\n}\n\n\nfunction TunnelingAgent(options) {\n  var self = this;\n  
self.options = options || {};\n  self.proxyOptions = self.options.proxy || {};\n  self.maxSockets = self.options.maxSockets || http.Agent.defaultMaxSockets;\n  self.requests = [];\n  self.sockets = [];\n\n  self.on('free', function onFree(socket, host, port, localAddress) {\n    var options = toOptions(host, port, localAddress);\n    for (var i = 0, len = self.requests.length; i < len; ++i) {\n      var pending = self.requests[i];\n      if (pending.host === options.host && pending.port === options.port) {\n        // Detect the request to connect same origin server,\n        // reuse the connection.\n        self.requests.splice(i, 1);\n        pending.request.onSocket(socket);\n        return;\n      }\n    }\n    socket.destroy();\n    self.removeSocket(socket);\n  });\n}\nutil.inherits(TunnelingAgent, events.EventEmitter);\n\nTunnelingAgent.prototype.addRequest = function addRequest(req, host, port, localAddress) {\n  var self = this;\n  var options = mergeOptions({request: req}, self.options, toOptions(host, port, localAddress));\n\n  if (self.sockets.length >= this.maxSockets) {\n    // We are over limit so we'll add it to the queue.\n    self.requests.push(options);\n    return;\n  }\n\n  // If we are under maxSockets create a new one.\n  self.createSocket(options, function(socket) {\n    socket.on('free', onFree);\n    socket.on('close', onCloseOrRemove);\n    socket.on('agentRemove', onCloseOrRemove);\n    req.onSocket(socket);\n\n    function onFree() {\n      self.emit('free', socket, options);\n    }\n\n    function onCloseOrRemove(err) {\n      self.removeSocket(socket);\n      socket.removeListener('free', onFree);\n      socket.removeListener('close', onCloseOrRemove);\n      socket.removeListener('agentRemove', onCloseOrRemove);\n    }\n  });\n};\n\nTunnelingAgent.prototype.createSocket = function createSocket(options, cb) {\n  var self = this;\n  var placeholder = {};\n  self.sockets.push(placeholder);\n\n  var connectOptions = mergeOptions({}, 
self.proxyOptions, {\n    method: 'CONNECT',\n    path: options.host + ':' + options.port,\n    agent: false,\n    headers: {\n      host: options.host + ':' + options.port\n    }\n  });\n  if (options.localAddress) {\n    connectOptions.localAddress = options.localAddress;\n  }\n  if (connectOptions.proxyAuth) {\n    connectOptions.headers = connectOptions.headers || {};\n    connectOptions.headers['Proxy-Authorization'] = 'Basic ' +\n        new Buffer(connectOptions.proxyAuth).toString('base64');\n  }\n\n  debug('making CONNECT request');\n  var connectReq = self.request(connectOptions);\n  connectReq.useChunkedEncodingByDefault = false; // for v0.6\n  connectReq.once('response', onResponse); // for v0.6\n  connectReq.once('upgrade', onUpgrade);   // for v0.6\n  connectReq.once('connect', onConnect);   // for v0.7 or later\n  connectReq.once('error', onError);\n  connectReq.end();\n\n  function onResponse(res) {\n    // Very hacky. This is necessary to avoid http-parser leaks.\n    res.upgrade = true;\n  }\n\n  function onUpgrade(res, socket, head) {\n    // Hacky.\n    process.nextTick(function() {\n      onConnect(res, socket, head);\n    });\n  }\n\n  function onConnect(res, socket, head) {\n    connectReq.removeAllListeners();\n    socket.removeAllListeners();\n\n    if (res.statusCode !== 200) {\n      debug('tunneling socket could not be established, statusCode=%d',\n        res.statusCode);\n      socket.destroy();\n      var error = new Error('tunneling socket could not be established, ' +\n        'statusCode=' + res.statusCode);\n      error.code = 'ECONNRESET';\n      options.request.emit('error', error);\n      self.removeSocket(placeholder);\n      return;\n    }\n    if (head.length > 0) {\n      debug('got illegal response body from proxy');\n      socket.destroy();\n      var error = new Error('got illegal response body from proxy');\n      error.code = 'ECONNRESET';\n      options.request.emit('error', error);\n      
self.removeSocket(placeholder);\n      return;\n    }\n    debug('tunneling connection has established');\n    self.sockets[self.sockets.indexOf(placeholder)] = socket;\n    return cb(socket);\n  }\n\n  function onError(cause) {\n    connectReq.removeAllListeners();\n\n    debug('tunneling socket could not be established, cause=%s\\n',\n          cause.message, cause.stack);\n    var error = new Error('tunneling socket could not be established, ' +\n                          'cause=' + cause.message);\n    error.code = 'ECONNRESET';\n    options.request.emit('error', error);\n    self.removeSocket(placeholder);\n  }\n};\n\nTunnelingAgent.prototype.removeSocket = function removeSocket(socket) {\n  var pos = this.sockets.indexOf(socket)\n  if (pos === -1) {\n    return;\n  }\n  this.sockets.splice(pos, 1);\n\n  var pending = this.requests.shift();\n  if (pending) {\n    // If we have pending requests and a socket gets closed a new one\n    // needs to be created to take over in the pool for the one that closed.\n    this.createSocket(pending, function(socket) {\n      pending.request.onSocket(socket);\n    });\n  }\n};\n\nfunction createSecureSocket(options, cb) {\n  var self = this;\n  TunnelingAgent.prototype.createSocket.call(self, options, function(socket) {\n    var hostHeader = options.request.getHeader('host');\n    var tlsOptions = mergeOptions({}, self.options, {\n      socket: socket,\n      servername: hostHeader ? 
hostHeader.replace(/:.*$/, '') : options.host\n    });\n\n    // 0 is dummy port for v0.6\n    var secureSocket = tls.connect(0, tlsOptions);\n    self.sockets[self.sockets.indexOf(socket)] = secureSocket;\n    cb(secureSocket);\n  });\n}\n\n\nfunction toOptions(host, port, localAddress) {\n  if (typeof host === 'string') { // since v0.10\n    return {\n      host: host,\n      port: port,\n      localAddress: localAddress\n    };\n  }\n  return host; // for v0.11 or later\n}\n\nfunction mergeOptions(target) {\n  for (var i = 1, len = arguments.length; i < len; ++i) {\n    var overrides = arguments[i];\n    if (typeof overrides === 'object') {\n      var keys = Object.keys(overrides);\n      for (var j = 0, keyLen = keys.length; j < keyLen; ++j) {\n        var k = keys[j];\n        if (overrides[k] !== undefined) {\n          target[k] = overrides[k];\n        }\n      }\n    }\n  }\n  return target;\n}\n\n\nvar debug;\nif (process.env.NODE_DEBUG && /\\btunnel\\b/.test(process.env.NODE_DEBUG)) {\n  debug = function() {\n    var args = Array.prototype.slice.call(arguments);\n    if (typeof args[0] === 'string') {\n      args[0] = 'TUNNEL: ' + args[0];\n    } else {\n      args.unshift('TUNNEL:');\n    }\n    console.error.apply(console, args);\n  }\n} else {\n  debug = function() {};\n}\nexports.debug = debug; // for test\n\n\n/***/ }),\n\n/***/ 5030:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nObject.defineProperty(exports, \"__esModule\", ({ value: true }));\n\nfunction _interopDefault (ex) { return (ex && (typeof ex === 'object') && 'default' in ex) ? 
ex['default'] : ex; }\n\nvar osName = _interopDefault(__webpack_require__(4824));\n\nfunction getUserAgent() {\n  try {\n    return `Node.js/${process.version.substr(1)} (${osName()}; ${process.arch})`;\n  } catch (error) {\n    if (/wmic os get Caption/.test(error.message)) {\n      return \"Windows <version undetectable>\";\n    }\n\n    return \"<environment undetectable>\";\n  }\n}\n\nexports.getUserAgent = getUserAgent;\n//# sourceMappingURL=index.js.map\n\n\n/***/ }),\n\n/***/ 4552:\n/***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => {\n\n\"use strict\";\n// ESM COMPAT FLAG\n__webpack_require__.r(__webpack_exports__);\n\n// EXPORTS\n__webpack_require__.d(__webpack_exports__, {\n  \"v1\": () => /* reexport */ esm_node_v1,\n  \"v3\": () => /* reexport */ esm_node_v3,\n  \"v4\": () => /* reexport */ esm_node_v4,\n  \"v5\": () => /* reexport */ esm_node_v5,\n  \"NIL\": () => /* reexport */ nil,\n  \"version\": () => /* reexport */ esm_node_version,\n  \"validate\": () => /* reexport */ esm_node_validate,\n  \"stringify\": () => /* reexport */ esm_node_stringify,\n  \"parse\": () => /* reexport */ esm_node_parse\n});\n\n// EXTERNAL MODULE: external \"crypto\"\nvar external_crypto_ = __webpack_require__(6417);\nvar external_crypto_default = /*#__PURE__*/__webpack_require__.n(external_crypto_);\n\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/rng.js\n\nconst rnds8Pool = new Uint8Array(256); // # of random values to pre-allocate\n\nlet poolPtr = rnds8Pool.length;\nfunction rng() {\n  if (poolPtr > rnds8Pool.length - 16) {\n    external_crypto_default().randomFillSync(rnds8Pool);\n    poolPtr = 0;\n  }\n\n  return rnds8Pool.slice(poolPtr, poolPtr += 16);\n}\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/regex.js\n/* harmony default export */ const regex = (/^(?:[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}|00000000-0000-0000-0000-000000000000)$/i);\n// CONCATENATED MODULE: 
./node_modules/uuid/dist/esm-node/validate.js\n\n\nfunction validate(uuid) {\n  return typeof uuid === 'string' && regex.test(uuid);\n}\n\n/* harmony default export */ const esm_node_validate = (validate);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/stringify.js\n\n/**\n * Convert array of 16 byte values to UUID string format of the form:\n * XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX\n */\n\nconst byteToHex = [];\n\nfor (let i = 0; i < 256; ++i) {\n  byteToHex.push((i + 0x100).toString(16).substr(1));\n}\n\nfunction stringify(arr, offset = 0) {\n  // Note: Be careful editing this code!  It's been tuned for performance\n  // and works in ways you may not expect. See https://github.com/uuidjs/uuid/pull/434\n  const uuid = (byteToHex[arr[offset + 0]] + byteToHex[arr[offset + 1]] + byteToHex[arr[offset + 2]] + byteToHex[arr[offset + 3]] + '-' + byteToHex[arr[offset + 4]] + byteToHex[arr[offset + 5]] + '-' + byteToHex[arr[offset + 6]] + byteToHex[arr[offset + 7]] + '-' + byteToHex[arr[offset + 8]] + byteToHex[arr[offset + 9]] + '-' + byteToHex[arr[offset + 10]] + byteToHex[arr[offset + 11]] + byteToHex[arr[offset + 12]] + byteToHex[arr[offset + 13]] + byteToHex[arr[offset + 14]] + byteToHex[arr[offset + 15]]).toLowerCase(); // Consistency check for valid UUID.  
If this throws, it's likely due to one\n  // of the following:\n  // - One or more input array values don't map to a hex octet (leading to\n  // \"undefined\" in the uuid)\n  // - Invalid input values for the RFC `version` or `variant` fields\n\n  if (!esm_node_validate(uuid)) {\n    throw TypeError('Stringified UUID is invalid');\n  }\n\n  return uuid;\n}\n\n/* harmony default export */ const esm_node_stringify = (stringify);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/v1.js\n\n // **`v1()` - Generate time-based UUID**\n//\n// Inspired by https://github.com/LiosK/UUID.js\n// and http://docs.python.org/library/uuid.html\n\nlet _nodeId;\n\nlet _clockseq; // Previous uuid creation time\n\n\nlet _lastMSecs = 0;\nlet _lastNSecs = 0; // See https://github.com/uuidjs/uuid for API details\n\nfunction v1(options, buf, offset) {\n  let i = buf && offset || 0;\n  const b = buf || new Array(16);\n  options = options || {};\n  let node = options.node || _nodeId;\n  let clockseq = options.clockseq !== undefined ? options.clockseq : _clockseq; // node and clockseq need to be initialized to random values if they're not\n  // specified.  We do this lazily to minimize issues related to insufficient\n  // system entropy.  See #189\n\n  if (node == null || clockseq == null) {\n    const seedBytes = options.random || (options.rng || rng)();\n\n    if (node == null) {\n      // Per 4.5, create and 48-bit node id, (47 random bits + multicast bit = 1)\n      node = _nodeId = [seedBytes[0] | 0x01, seedBytes[1], seedBytes[2], seedBytes[3], seedBytes[4], seedBytes[5]];\n    }\n\n    if (clockseq == null) {\n      // Per 4.2.2, randomize (14 bit) clockseq\n      clockseq = _clockseq = (seedBytes[6] << 8 | seedBytes[7]) & 0x3fff;\n    }\n  } // UUID timestamps are 100 nano-second units since the Gregorian epoch,\n  // (1582-10-15 00:00).  
JSNumbers aren't precise enough for this, so\n  // time is handled internally as 'msecs' (integer milliseconds) and 'nsecs'\n  // (100-nanoseconds offset from msecs) since unix epoch, 1970-01-01 00:00.\n\n\n  let msecs = options.msecs !== undefined ? options.msecs : Date.now(); // Per 4.2.1.2, use count of uuid's generated during the current clock\n  // cycle to simulate higher resolution clock\n\n  let nsecs = options.nsecs !== undefined ? options.nsecs : _lastNSecs + 1; // Time since last uuid creation (in msecs)\n\n  const dt = msecs - _lastMSecs + (nsecs - _lastNSecs) / 10000; // Per 4.2.1.2, Bump clockseq on clock regression\n\n  if (dt < 0 && options.clockseq === undefined) {\n    clockseq = clockseq + 1 & 0x3fff;\n  } // Reset nsecs if clock regresses (new clockseq) or we've moved onto a new\n  // time interval\n\n\n  if ((dt < 0 || msecs > _lastMSecs) && options.nsecs === undefined) {\n    nsecs = 0;\n  } // Per 4.2.1.2 Throw error if too many uuids are requested\n\n\n  if (nsecs >= 10000) {\n    throw new Error(\"uuid.v1(): Can't create more than 10M uuids/sec\");\n  }\n\n  _lastMSecs = msecs;\n  _lastNSecs = nsecs;\n  _clockseq = clockseq; // Per 4.1.4 - Convert from unix epoch to Gregorian epoch\n\n  msecs += 12219292800000; // `time_low`\n\n  const tl = ((msecs & 0xfffffff) * 10000 + nsecs) % 0x100000000;\n  b[i++] = tl >>> 24 & 0xff;\n  b[i++] = tl >>> 16 & 0xff;\n  b[i++] = tl >>> 8 & 0xff;\n  b[i++] = tl & 0xff; // `time_mid`\n\n  const tmh = msecs / 0x100000000 * 10000 & 0xfffffff;\n  b[i++] = tmh >>> 8 & 0xff;\n  b[i++] = tmh & 0xff; // `time_high_and_version`\n\n  b[i++] = tmh >>> 24 & 0xf | 0x10; // include version\n\n  b[i++] = tmh >>> 16 & 0xff; // `clock_seq_hi_and_reserved` (Per 4.2.2 - include variant)\n\n  b[i++] = clockseq >>> 8 | 0x80; // `clock_seq_low`\n\n  b[i++] = clockseq & 0xff; // `node`\n\n  for (let n = 0; n < 6; ++n) {\n    b[i + n] = node[n];\n  }\n\n  return buf || esm_node_stringify(b);\n}\n\n/* harmony default export */ 
const esm_node_v1 = (v1);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/parse.js\n\n\nfunction parse(uuid) {\n  if (!esm_node_validate(uuid)) {\n    throw TypeError('Invalid UUID');\n  }\n\n  let v;\n  const arr = new Uint8Array(16); // Parse ########-....-....-....-............\n\n  arr[0] = (v = parseInt(uuid.slice(0, 8), 16)) >>> 24;\n  arr[1] = v >>> 16 & 0xff;\n  arr[2] = v >>> 8 & 0xff;\n  arr[3] = v & 0xff; // Parse ........-####-....-....-............\n\n  arr[4] = (v = parseInt(uuid.slice(9, 13), 16)) >>> 8;\n  arr[5] = v & 0xff; // Parse ........-....-####-....-............\n\n  arr[6] = (v = parseInt(uuid.slice(14, 18), 16)) >>> 8;\n  arr[7] = v & 0xff; // Parse ........-....-....-####-............\n\n  arr[8] = (v = parseInt(uuid.slice(19, 23), 16)) >>> 8;\n  arr[9] = v & 0xff; // Parse ........-....-....-....-############\n  // (Use \"/\" to avoid 32-bit truncation when bit-shifting high-order bytes)\n\n  arr[10] = (v = parseInt(uuid.slice(24, 36), 16)) / 0x10000000000 & 0xff;\n  arr[11] = v / 0x100000000 & 0xff;\n  arr[12] = v >>> 24 & 0xff;\n  arr[13] = v >>> 16 & 0xff;\n  arr[14] = v >>> 8 & 0xff;\n  arr[15] = v & 0xff;\n  return arr;\n}\n\n/* harmony default export */ const esm_node_parse = (parse);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/v35.js\n\n\n\nfunction stringToBytes(str) {\n  str = unescape(encodeURIComponent(str)); // UTF8 escape\n\n  const bytes = [];\n\n  for (let i = 0; i < str.length; ++i) {\n    bytes.push(str.charCodeAt(i));\n  }\n\n  return bytes;\n}\n\nconst DNS = '6ba7b810-9dad-11d1-80b4-00c04fd430c8';\nconst URL = '6ba7b811-9dad-11d1-80b4-00c04fd430c8';\n/* harmony default export */ function v35(name, version, hashfunc) {\n  function generateUUID(value, namespace, buf, offset) {\n    if (typeof value === 'string') {\n      value = stringToBytes(value);\n    }\n\n    if (typeof namespace === 'string') {\n      namespace = esm_node_parse(namespace);\n    }\n\n    if (namespace.length !== 16) {\n     
 throw TypeError('Namespace must be array-like (16 iterable integer values, 0-255)');\n    } // Compute hash of namespace and value, Per 4.3\n    // Future: Use spread syntax when supported on all platforms, e.g. `bytes =\n    // hashfunc([...namespace, ... value])`\n\n\n    let bytes = new Uint8Array(16 + value.length);\n    bytes.set(namespace);\n    bytes.set(value, namespace.length);\n    bytes = hashfunc(bytes);\n    bytes[6] = bytes[6] & 0x0f | version;\n    bytes[8] = bytes[8] & 0x3f | 0x80;\n\n    if (buf) {\n      offset = offset || 0;\n\n      for (let i = 0; i < 16; ++i) {\n        buf[offset + i] = bytes[i];\n      }\n\n      return buf;\n    }\n\n    return esm_node_stringify(bytes);\n  } // Function#name is not settable on some platforms (#270)\n\n\n  try {\n    generateUUID.name = name; // eslint-disable-next-line no-empty\n  } catch (err) {} // For CommonJS default export support\n\n\n  generateUUID.DNS = DNS;\n  generateUUID.URL = URL;\n  return generateUUID;\n}\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/md5.js\n\n\nfunction md5(bytes) {\n  if (Array.isArray(bytes)) {\n    bytes = Buffer.from(bytes);\n  } else if (typeof bytes === 'string') {\n    bytes = Buffer.from(bytes, 'utf8');\n  }\n\n  return external_crypto_default().createHash('md5').update(bytes).digest();\n}\n\n/* harmony default export */ const esm_node_md5 = (md5);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/v3.js\n\n\nconst v3 = v35('v3', 0x30, esm_node_md5);\n/* harmony default export */ const esm_node_v3 = (v3);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/v4.js\n\n\n\nfunction v4(options, buf, offset) {\n  options = options || {};\n  const rnds = options.random || (options.rng || rng)(); // Per 4.4, set bits for version and `clock_seq_hi_and_reserved`\n\n  rnds[6] = rnds[6] & 0x0f | 0x40;\n  rnds[8] = rnds[8] & 0x3f | 0x80; // Copy bytes to buffer, if provided\n\n  if (buf) {\n    offset = offset || 0;\n\n    for (let i = 0; i < 16; ++i) 
{\n      buf[offset + i] = rnds[i];\n    }\n\n    return buf;\n  }\n\n  return esm_node_stringify(rnds);\n}\n\n/* harmony default export */ const esm_node_v4 = (v4);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/sha1.js\n\n\nfunction sha1(bytes) {\n  if (Array.isArray(bytes)) {\n    bytes = Buffer.from(bytes);\n  } else if (typeof bytes === 'string') {\n    bytes = Buffer.from(bytes, 'utf8');\n  }\n\n  return external_crypto_default().createHash('sha1').update(bytes).digest();\n}\n\n/* harmony default export */ const esm_node_sha1 = (sha1);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/v5.js\n\n\nconst v5 = v35('v5', 0x50, esm_node_sha1);\n/* harmony default export */ const esm_node_v5 = (v5);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/nil.js\n/* harmony default export */ const nil = ('00000000-0000-0000-0000-000000000000');\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/version.js\n\n\nfunction version(uuid) {\n  if (!esm_node_validate(uuid)) {\n    throw TypeError('Invalid UUID');\n  }\n\n  return parseInt(uuid.substr(14, 1), 16);\n}\n\n/* harmony default export */ const esm_node_version = (version);\n// CONCATENATED MODULE: ./node_modules/uuid/dist/esm-node/index.js\n\n\n\n\n\n\n\n\n\n\n/***/ }),\n\n/***/ 4886:\n/***/ ((module) => {\n\n\"use strict\";\n\n\nvar conversions = {};\nmodule.exports = conversions;\n\nfunction sign(x) {\n    return x < 0 ? -1 : 1;\n}\n\nfunction evenRound(x) {\n    // Round x to the nearest integer, choosing the even integer if it lies halfway between two.\n    if ((x % 1) === 0.5 && (x & 1) === 0) { // [even number].5; round down (i.e. floor)\n        return Math.floor(x);\n    } else {\n        return Math.round(x);\n    }\n}\n\nfunction createNumberConversion(bitLength, typeOpts) {\n    if (!typeOpts.unsigned) {\n        --bitLength;\n    }\n    const lowerBound = typeOpts.unsigned ? 
0 : -Math.pow(2, bitLength);\n    const upperBound = Math.pow(2, bitLength) - 1;\n\n    const moduloVal = typeOpts.moduloBitLength ? Math.pow(2, typeOpts.moduloBitLength) : Math.pow(2, bitLength);\n    const moduloBound = typeOpts.moduloBitLength ? Math.pow(2, typeOpts.moduloBitLength - 1) : Math.pow(2, bitLength - 1);\n\n    return function(V, opts) {\n        if (!opts) opts = {};\n\n        let x = +V;\n\n        if (opts.enforceRange) {\n            if (!Number.isFinite(x)) {\n                throw new TypeError(\"Argument is not a finite number\");\n            }\n\n            x = sign(x) * Math.floor(Math.abs(x));\n            if (x < lowerBound || x > upperBound) {\n                throw new TypeError(\"Argument is not in byte range\");\n            }\n\n            return x;\n        }\n\n        if (!isNaN(x) && opts.clamp) {\n            x = evenRound(x);\n\n            if (x < lowerBound) x = lowerBound;\n            if (x > upperBound) x = upperBound;\n            return x;\n        }\n\n        if (!Number.isFinite(x) || x === 0) {\n            return 0;\n        }\n\n        x = sign(x) * Math.floor(Math.abs(x));\n        x = x % moduloVal;\n\n        if (!typeOpts.unsigned && x >= moduloBound) {\n            return x - moduloVal;\n        } else if (typeOpts.unsigned) {\n            if (x < 0) {\n              x += moduloVal;\n            } else if (x === -0) { // don't return negative zero\n              return 0;\n            }\n        }\n\n        return x;\n    }\n}\n\nconversions[\"void\"] = function () {\n    return undefined;\n};\n\nconversions[\"boolean\"] = function (val) {\n    return !!val;\n};\n\nconversions[\"byte\"] = createNumberConversion(8, { unsigned: false });\nconversions[\"octet\"] = createNumberConversion(8, { unsigned: true });\n\nconversions[\"short\"] = createNumberConversion(16, { unsigned: false });\nconversions[\"unsigned short\"] = createNumberConversion(16, { unsigned: true });\n\nconversions[\"long\"] = 
createNumberConversion(32, { unsigned: false });\nconversions[\"unsigned long\"] = createNumberConversion(32, { unsigned: true });\n\nconversions[\"long long\"] = createNumberConversion(32, { unsigned: false, moduloBitLength: 64 });\nconversions[\"unsigned long long\"] = createNumberConversion(32, { unsigned: true, moduloBitLength: 64 });\n\nconversions[\"double\"] = function (V) {\n    const x = +V;\n\n    if (!Number.isFinite(x)) {\n        throw new TypeError(\"Argument is not a finite floating-point value\");\n    }\n\n    return x;\n};\n\nconversions[\"unrestricted double\"] = function (V) {\n    const x = +V;\n\n    if (isNaN(x)) {\n        throw new TypeError(\"Argument is NaN\");\n    }\n\n    return x;\n};\n\n// not quite valid, but good enough for JS\nconversions[\"float\"] = conversions[\"double\"];\nconversions[\"unrestricted float\"] = conversions[\"unrestricted double\"];\n\nconversions[\"DOMString\"] = function (V, opts) {\n    if (!opts) opts = {};\n\n    if (opts.treatNullAsEmptyString && V === null) {\n        return \"\";\n    }\n\n    return String(V);\n};\n\nconversions[\"ByteString\"] = function (V, opts) {\n    const x = String(V);\n    let c = undefined;\n    for (let i = 0; (c = x.codePointAt(i)) !== undefined; ++i) {\n        if (c > 255) {\n            throw new TypeError(\"Argument is not a valid bytestring\");\n        }\n    }\n\n    return x;\n};\n\nconversions[\"USVString\"] = function (V) {\n    const S = String(V);\n    const n = S.length;\n    const U = [];\n    for (let i = 0; i < n; ++i) {\n        const c = S.charCodeAt(i);\n        if (c < 0xD800 || c > 0xDFFF) {\n            U.push(String.fromCodePoint(c));\n        } else if (0xDC00 <= c && c <= 0xDFFF) {\n            U.push(String.fromCodePoint(0xFFFD));\n        } else {\n            if (i === n - 1) {\n                U.push(String.fromCodePoint(0xFFFD));\n            } else {\n                const d = S.charCodeAt(i + 1);\n                if (0xDC00 <= d && d <= 0xDFFF) 
{\n                    const a = c & 0x3FF;\n                    const b = d & 0x3FF;\n                    U.push(String.fromCodePoint((2 << 15) + (2 << 9) * a + b));\n                    ++i;\n                } else {\n                    U.push(String.fromCodePoint(0xFFFD));\n                }\n            }\n        }\n    }\n\n    return U.join('');\n};\n\nconversions[\"Date\"] = function (V, opts) {\n    if (!(V instanceof Date)) {\n        throw new TypeError(\"Argument is not a Date object\");\n    }\n    if (isNaN(V)) {\n        return undefined;\n    }\n\n    return V;\n};\n\nconversions[\"RegExp\"] = function (V, opts) {\n    if (!(V instanceof RegExp)) {\n        V = new RegExp(V);\n    }\n\n    return V;\n};\n\n\n/***/ }),\n\n/***/ 7537:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\nconst usm = __webpack_require__(2158);\n\nexports.implementation = class URLImpl {\n  constructor(constructorArgs) {\n    const url = constructorArgs[0];\n    const base = constructorArgs[1];\n\n    let parsedBase = null;\n    if (base !== undefined) {\n      parsedBase = usm.basicURLParse(base);\n      if (parsedBase === \"failure\") {\n        throw new TypeError(\"Invalid base URL\");\n      }\n    }\n\n    const parsedURL = usm.basicURLParse(url, { baseURL: parsedBase });\n    if (parsedURL === \"failure\") {\n      throw new TypeError(\"Invalid URL\");\n    }\n\n    this._url = parsedURL;\n\n    // TODO: query stuff\n  }\n\n  get href() {\n    return usm.serializeURL(this._url);\n  }\n\n  set href(v) {\n    const parsedURL = usm.basicURLParse(v);\n    if (parsedURL === \"failure\") {\n      throw new TypeError(\"Invalid URL\");\n    }\n\n    this._url = parsedURL;\n  }\n\n  get origin() {\n    return usm.serializeURLOrigin(this._url);\n  }\n\n  get protocol() {\n    return this._url.scheme + \":\";\n  }\n\n  set protocol(v) {\n    usm.basicURLParse(v + \":\", { url: this._url, stateOverride: \"scheme start\" });\n  }\n\n  get 
username() {\n    return this._url.username;\n  }\n\n  set username(v) {\n    if (usm.cannotHaveAUsernamePasswordPort(this._url)) {\n      return;\n    }\n\n    usm.setTheUsername(this._url, v);\n  }\n\n  get password() {\n    return this._url.password;\n  }\n\n  set password(v) {\n    if (usm.cannotHaveAUsernamePasswordPort(this._url)) {\n      return;\n    }\n\n    usm.setThePassword(this._url, v);\n  }\n\n  get host() {\n    const url = this._url;\n\n    if (url.host === null) {\n      return \"\";\n    }\n\n    if (url.port === null) {\n      return usm.serializeHost(url.host);\n    }\n\n    return usm.serializeHost(url.host) + \":\" + usm.serializeInteger(url.port);\n  }\n\n  set host(v) {\n    if (this._url.cannotBeABaseURL) {\n      return;\n    }\n\n    usm.basicURLParse(v, { url: this._url, stateOverride: \"host\" });\n  }\n\n  get hostname() {\n    if (this._url.host === null) {\n      return \"\";\n    }\n\n    return usm.serializeHost(this._url.host);\n  }\n\n  set hostname(v) {\n    if (this._url.cannotBeABaseURL) {\n      return;\n    }\n\n    usm.basicURLParse(v, { url: this._url, stateOverride: \"hostname\" });\n  }\n\n  get port() {\n    if (this._url.port === null) {\n      return \"\";\n    }\n\n    return usm.serializeInteger(this._url.port);\n  }\n\n  set port(v) {\n    if (usm.cannotHaveAUsernamePasswordPort(this._url)) {\n      return;\n    }\n\n    if (v === \"\") {\n      this._url.port = null;\n    } else {\n      usm.basicURLParse(v, { url: this._url, stateOverride: \"port\" });\n    }\n  }\n\n  get pathname() {\n    if (this._url.cannotBeABaseURL) {\n      return this._url.path[0];\n    }\n\n    if (this._url.path.length === 0) {\n      return \"\";\n    }\n\n    return \"/\" + this._url.path.join(\"/\");\n  }\n\n  set pathname(v) {\n    if (this._url.cannotBeABaseURL) {\n      return;\n    }\n\n    this._url.path = [];\n    usm.basicURLParse(v, { url: this._url, stateOverride: \"path start\" });\n  }\n\n  get search() {\n    if 
(this._url.query === null || this._url.query === \"\") {\n      return \"\";\n    }\n\n    return \"?\" + this._url.query;\n  }\n\n  set search(v) {\n    // TODO: query stuff\n\n    const url = this._url;\n\n    if (v === \"\") {\n      url.query = null;\n      return;\n    }\n\n    const input = v[0] === \"?\" ? v.substring(1) : v;\n    url.query = \"\";\n    usm.basicURLParse(input, { url, stateOverride: \"query\" });\n  }\n\n  get hash() {\n    if (this._url.fragment === null || this._url.fragment === \"\") {\n      return \"\";\n    }\n\n    return \"#\" + this._url.fragment;\n  }\n\n  set hash(v) {\n    if (v === \"\") {\n      this._url.fragment = null;\n      return;\n    }\n\n    const input = v[0] === \"#\" ? v.substring(1) : v;\n    this._url.fragment = \"\";\n    usm.basicURLParse(input, { url: this._url, stateOverride: \"fragment\" });\n  }\n\n  toJSON() {\n    return this.href;\n  }\n};\n\n\n/***/ }),\n\n/***/ 3394:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nconst conversions = __webpack_require__(4886);\nconst utils = __webpack_require__(3185);\nconst Impl = __webpack_require__(7537);\n\nconst impl = utils.implSymbol;\n\nfunction URL(url) {\n  if (!this || this[impl] || !(this instanceof URL)) {\n    throw new TypeError(\"Failed to construct 'URL': Please use the 'new' operator, this DOM object constructor cannot be called as a function.\");\n  }\n  if (arguments.length < 1) {\n    throw new TypeError(\"Failed to construct 'URL': 1 argument required, but only \" + arguments.length + \" present.\");\n  }\n  const args = [];\n  for (let i = 0; i < arguments.length && i < 2; ++i) {\n    args[i] = arguments[i];\n  }\n  args[0] = conversions[\"USVString\"](args[0]);\n  if (args[1] !== undefined) {\n  args[1] = conversions[\"USVString\"](args[1]);\n  }\n\n  module.exports.setup(this, args);\n}\n\nURL.prototype.toJSON = function toJSON() {\n  if (!this || !module.exports.is(this)) {\n    throw new 
TypeError(\"Illegal invocation\");\n  }\n  const args = [];\n  for (let i = 0; i < arguments.length && i < 0; ++i) {\n    args[i] = arguments[i];\n  }\n  return this[impl].toJSON.apply(this[impl], args);\n};\nObject.defineProperty(URL.prototype, \"href\", {\n  get() {\n    return this[impl].href;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].href = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nURL.prototype.toString = function () {\n  if (!this || !module.exports.is(this)) {\n    throw new TypeError(\"Illegal invocation\");\n  }\n  return this.href;\n};\n\nObject.defineProperty(URL.prototype, \"origin\", {\n  get() {\n    return this[impl].origin;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"protocol\", {\n  get() {\n    return this[impl].protocol;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].protocol = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"username\", {\n  get() {\n    return this[impl].username;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].username = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"password\", {\n  get() {\n    return this[impl].password;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].password = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"host\", {\n  get() {\n    return this[impl].host;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].host = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"hostname\", {\n  get() {\n    return this[impl].hostname;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].hostname = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"port\", 
{\n  get() {\n    return this[impl].port;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].port = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"pathname\", {\n  get() {\n    return this[impl].pathname;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].pathname = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"search\", {\n  get() {\n    return this[impl].search;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].search = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\nObject.defineProperty(URL.prototype, \"hash\", {\n  get() {\n    return this[impl].hash;\n  },\n  set(V) {\n    V = conversions[\"USVString\"](V);\n    this[impl].hash = V;\n  },\n  enumerable: true,\n  configurable: true\n});\n\n\nmodule.exports = {\n  is(obj) {\n    return !!obj && obj[impl] instanceof Impl.implementation;\n  },\n  create(constructorArgs, privateData) {\n    let obj = Object.create(URL.prototype);\n    this.setup(obj, constructorArgs, privateData);\n    return obj;\n  },\n  setup(obj, constructorArgs, privateData) {\n    if (!privateData) privateData = {};\n    privateData.wrapper = obj;\n\n    obj[impl] = new Impl.implementation(constructorArgs, privateData);\n    obj[impl][utils.wrapperSymbol] = obj;\n  },\n  interface: URL,\n  expose: {\n    Window: { URL: URL },\n    Worker: { URL: URL }\n  }\n};\n\n\n\n/***/ }),\n\n/***/ 8665:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nexports.URL = __webpack_require__(3394).interface;\nexports.serializeURL = __webpack_require__(2158).serializeURL;\nexports.serializeURLOrigin = __webpack_require__(2158).serializeURLOrigin;\nexports.basicURLParse = __webpack_require__(2158).basicURLParse;\nexports.setTheUsername = __webpack_require__(2158).setTheUsername;\nexports.setThePassword = 
__webpack_require__(2158).setThePassword;\nexports.serializeHost = __webpack_require__(2158).serializeHost;\nexports.serializeInteger = __webpack_require__(2158).serializeInteger;\nexports.parseURL = __webpack_require__(2158).parseURL;\n\n\n/***/ }),\n\n/***/ 2158:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\r\nconst punycode = __webpack_require__(4213);\r\nconst tr46 = __webpack_require__(4256);\r\n\r\nconst specialSchemes = {\r\n  ftp: 21,\r\n  file: null,\r\n  gopher: 70,\r\n  http: 80,\r\n  https: 443,\r\n  ws: 80,\r\n  wss: 443\r\n};\r\n\r\nconst failure = Symbol(\"failure\");\r\n\r\nfunction countSymbols(str) {\r\n  return punycode.ucs2.decode(str).length;\r\n}\r\n\r\nfunction at(input, idx) {\r\n  const c = input[idx];\r\n  return isNaN(c) ? undefined : String.fromCodePoint(c);\r\n}\r\n\r\nfunction isASCIIDigit(c) {\r\n  return c >= 0x30 && c <= 0x39;\r\n}\r\n\r\nfunction isASCIIAlpha(c) {\r\n  return (c >= 0x41 && c <= 0x5A) || (c >= 0x61 && c <= 0x7A);\r\n}\r\n\r\nfunction isASCIIAlphanumeric(c) {\r\n  return isASCIIAlpha(c) || isASCIIDigit(c);\r\n}\r\n\r\nfunction isASCIIHex(c) {\r\n  return isASCIIDigit(c) || (c >= 0x41 && c <= 0x46) || (c >= 0x61 && c <= 0x66);\r\n}\r\n\r\nfunction isSingleDot(buffer) {\r\n  return buffer === \".\" || buffer.toLowerCase() === \"%2e\";\r\n}\r\n\r\nfunction isDoubleDot(buffer) {\r\n  buffer = buffer.toLowerCase();\r\n  return buffer === \"..\" || buffer === \"%2e.\" || buffer === \".%2e\" || buffer === \"%2e%2e\";\r\n}\r\n\r\nfunction isWindowsDriveLetterCodePoints(cp1, cp2) {\r\n  return isASCIIAlpha(cp1) && (cp2 === 58 || cp2 === 124);\r\n}\r\n\r\nfunction isWindowsDriveLetterString(string) {\r\n  return string.length === 2 && isASCIIAlpha(string.codePointAt(0)) && (string[1] === \":\" || string[1] === \"|\");\r\n}\r\n\r\nfunction isNormalizedWindowsDriveLetterString(string) {\r\n  return string.length === 2 && isASCIIAlpha(string.codePointAt(0)) && string[1] === 
\":\";\r\n}\r\n\r\nfunction containsForbiddenHostCodePoint(string) {\r\n  return string.search(/\\u0000|\\u0009|\\u000A|\\u000D|\\u0020|#|%|\\/|:|\\?|@|\\[|\\\\|\\]/) !== -1;\r\n}\r\n\r\nfunction containsForbiddenHostCodePointExcludingPercent(string) {\r\n  return string.search(/\\u0000|\\u0009|\\u000A|\\u000D|\\u0020|#|\\/|:|\\?|@|\\[|\\\\|\\]/) !== -1;\r\n}\r\n\r\nfunction isSpecialScheme(scheme) {\r\n  return specialSchemes[scheme] !== undefined;\r\n}\r\n\r\nfunction isSpecial(url) {\r\n  return isSpecialScheme(url.scheme);\r\n}\r\n\r\nfunction defaultPort(scheme) {\r\n  return specialSchemes[scheme];\r\n}\r\n\r\nfunction percentEncode(c) {\r\n  let hex = c.toString(16).toUpperCase();\r\n  if (hex.length === 1) {\r\n    hex = \"0\" + hex;\r\n  }\r\n\r\n  return \"%\" + hex;\r\n}\r\n\r\nfunction utf8PercentEncode(c) {\r\n  const buf = new Buffer(c);\r\n\r\n  let str = \"\";\r\n\r\n  for (let i = 0; i < buf.length; ++i) {\r\n    str += percentEncode(buf[i]);\r\n  }\r\n\r\n  return str;\r\n}\r\n\r\nfunction utf8PercentDecode(str) {\r\n  const input = new Buffer(str);\r\n  const output = [];\r\n  for (let i = 0; i < input.length; ++i) {\r\n    if (input[i] !== 37) {\r\n      output.push(input[i]);\r\n    } else if (input[i] === 37 && isASCIIHex(input[i + 1]) && isASCIIHex(input[i + 2])) {\r\n      output.push(parseInt(input.slice(i + 1, i + 3).toString(), 16));\r\n      i += 2;\r\n    } else {\r\n      output.push(input[i]);\r\n    }\r\n  }\r\n  return new Buffer(output).toString();\r\n}\r\n\r\nfunction isC0ControlPercentEncode(c) {\r\n  return c <= 0x1F || c > 0x7E;\r\n}\r\n\r\nconst extraPathPercentEncodeSet = new Set([32, 34, 35, 60, 62, 63, 96, 123, 125]);\r\nfunction isPathPercentEncode(c) {\r\n  return isC0ControlPercentEncode(c) || extraPathPercentEncodeSet.has(c);\r\n}\r\n\r\nconst extraUserinfoPercentEncodeSet =\r\n  new Set([47, 58, 59, 61, 64, 91, 92, 93, 94, 124]);\r\nfunction isUserinfoPercentEncode(c) {\r\n  return isPathPercentEncode(c) || 
extraUserinfoPercentEncodeSet.has(c);\r\n}\r\n\r\nfunction percentEncodeChar(c, encodeSetPredicate) {\r\n  const cStr = String.fromCodePoint(c);\r\n\r\n  if (encodeSetPredicate(c)) {\r\n    return utf8PercentEncode(cStr);\r\n  }\r\n\r\n  return cStr;\r\n}\r\n\r\nfunction parseIPv4Number(input) {\r\n  let R = 10;\r\n\r\n  if (input.length >= 2 && input.charAt(0) === \"0\" && input.charAt(1).toLowerCase() === \"x\") {\r\n    input = input.substring(2);\r\n    R = 16;\r\n  } else if (input.length >= 2 && input.charAt(0) === \"0\") {\r\n    input = input.substring(1);\r\n    R = 8;\r\n  }\r\n\r\n  if (input === \"\") {\r\n    return 0;\r\n  }\r\n\r\n  const regex = R === 10 ? /[^0-9]/ : (R === 16 ? /[^0-9A-Fa-f]/ : /[^0-7]/);\r\n  if (regex.test(input)) {\r\n    return failure;\r\n  }\r\n\r\n  return parseInt(input, R);\r\n}\r\n\r\nfunction parseIPv4(input) {\r\n  const parts = input.split(\".\");\r\n  if (parts[parts.length - 1] === \"\") {\r\n    if (parts.length > 1) {\r\n      parts.pop();\r\n    }\r\n  }\r\n\r\n  if (parts.length > 4) {\r\n    return input;\r\n  }\r\n\r\n  const numbers = [];\r\n  for (const part of parts) {\r\n    if (part === \"\") {\r\n      return input;\r\n    }\r\n    const n = parseIPv4Number(part);\r\n    if (n === failure) {\r\n      return input;\r\n    }\r\n\r\n    numbers.push(n);\r\n  }\r\n\r\n  for (let i = 0; i < numbers.length - 1; ++i) {\r\n    if (numbers[i] > 255) {\r\n      return failure;\r\n    }\r\n  }\r\n  if (numbers[numbers.length - 1] >= Math.pow(256, 5 - numbers.length)) {\r\n    return failure;\r\n  }\r\n\r\n  let ipv4 = numbers.pop();\r\n  let counter = 0;\r\n\r\n  for (const n of numbers) {\r\n    ipv4 += n * Math.pow(256, 3 - counter);\r\n    ++counter;\r\n  }\r\n\r\n  return ipv4;\r\n}\r\n\r\nfunction serializeIPv4(address) {\r\n  let output = \"\";\r\n  let n = address;\r\n\r\n  for (let i = 1; i <= 4; ++i) {\r\n    output = String(n % 256) + output;\r\n    if (i !== 4) {\r\n      output = \".\" + output;\r\n    
}\r\n    n = Math.floor(n / 256);\r\n  }\r\n\r\n  return output;\r\n}\r\n\r\nfunction parseIPv6(input) {\r\n  const address = [0, 0, 0, 0, 0, 0, 0, 0];\r\n  let pieceIndex = 0;\r\n  let compress = null;\r\n  let pointer = 0;\r\n\r\n  input = punycode.ucs2.decode(input);\r\n\r\n  if (input[pointer] === 58) {\r\n    if (input[pointer + 1] !== 58) {\r\n      return failure;\r\n    }\r\n\r\n    pointer += 2;\r\n    ++pieceIndex;\r\n    compress = pieceIndex;\r\n  }\r\n\r\n  while (pointer < input.length) {\r\n    if (pieceIndex === 8) {\r\n      return failure;\r\n    }\r\n\r\n    if (input[pointer] === 58) {\r\n      if (compress !== null) {\r\n        return failure;\r\n      }\r\n      ++pointer;\r\n      ++pieceIndex;\r\n      compress = pieceIndex;\r\n      continue;\r\n    }\r\n\r\n    let value = 0;\r\n    let length = 0;\r\n\r\n    while (length < 4 && isASCIIHex(input[pointer])) {\r\n      value = value * 0x10 + parseInt(at(input, pointer), 16);\r\n      ++pointer;\r\n      ++length;\r\n    }\r\n\r\n    if (input[pointer] === 46) {\r\n      if (length === 0) {\r\n        return failure;\r\n      }\r\n\r\n      pointer -= length;\r\n\r\n      if (pieceIndex > 6) {\r\n        return failure;\r\n      }\r\n\r\n      let numbersSeen = 0;\r\n\r\n      while (input[pointer] !== undefined) {\r\n        let ipv4Piece = null;\r\n\r\n        if (numbersSeen > 0) {\r\n          if (input[pointer] === 46 && numbersSeen < 4) {\r\n            ++pointer;\r\n          } else {\r\n            return failure;\r\n          }\r\n        }\r\n\r\n        if (!isASCIIDigit(input[pointer])) {\r\n          return failure;\r\n        }\r\n\r\n        while (isASCIIDigit(input[pointer])) {\r\n          const number = parseInt(at(input, pointer));\r\n          if (ipv4Piece === null) {\r\n            ipv4Piece = number;\r\n          } else if (ipv4Piece === 0) {\r\n            return failure;\r\n          } else {\r\n            ipv4Piece = ipv4Piece * 10 + number;\r\n          }\r\n    
      if (ipv4Piece > 255) {\r\n            return failure;\r\n          }\r\n          ++pointer;\r\n        }\r\n\r\n        address[pieceIndex] = address[pieceIndex] * 0x100 + ipv4Piece;\r\n\r\n        ++numbersSeen;\r\n\r\n        if (numbersSeen === 2 || numbersSeen === 4) {\r\n          ++pieceIndex;\r\n        }\r\n      }\r\n\r\n      if (numbersSeen !== 4) {\r\n        return failure;\r\n      }\r\n\r\n      break;\r\n    } else if (input[pointer] === 58) {\r\n      ++pointer;\r\n      if (input[pointer] === undefined) {\r\n        return failure;\r\n      }\r\n    } else if (input[pointer] !== undefined) {\r\n      return failure;\r\n    }\r\n\r\n    address[pieceIndex] = value;\r\n    ++pieceIndex;\r\n  }\r\n\r\n  if (compress !== null) {\r\n    let swaps = pieceIndex - compress;\r\n    pieceIndex = 7;\r\n    while (pieceIndex !== 0 && swaps > 0) {\r\n      const temp = address[compress + swaps - 1];\r\n      address[compress + swaps - 1] = address[pieceIndex];\r\n      address[pieceIndex] = temp;\r\n      --pieceIndex;\r\n      --swaps;\r\n    }\r\n  } else if (compress === null && pieceIndex !== 8) {\r\n    return failure;\r\n  }\r\n\r\n  return address;\r\n}\r\n\r\nfunction serializeIPv6(address) {\r\n  let output = \"\";\r\n  const seqResult = findLongestZeroSequence(address);\r\n  const compress = seqResult.idx;\r\n  let ignore0 = false;\r\n\r\n  for (let pieceIndex = 0; pieceIndex <= 7; ++pieceIndex) {\r\n    if (ignore0 && address[pieceIndex] === 0) {\r\n      continue;\r\n    } else if (ignore0) {\r\n      ignore0 = false;\r\n    }\r\n\r\n    if (compress === pieceIndex) {\r\n      const separator = pieceIndex === 0 ? 
\"::\" : \":\";\r\n      output += separator;\r\n      ignore0 = true;\r\n      continue;\r\n    }\r\n\r\n    output += address[pieceIndex].toString(16);\r\n\r\n    if (pieceIndex !== 7) {\r\n      output += \":\";\r\n    }\r\n  }\r\n\r\n  return output;\r\n}\r\n\r\nfunction parseHost(input, isSpecialArg) {\r\n  if (input[0] === \"[\") {\r\n    if (input[input.length - 1] !== \"]\") {\r\n      return failure;\r\n    }\r\n\r\n    return parseIPv6(input.substring(1, input.length - 1));\r\n  }\r\n\r\n  if (!isSpecialArg) {\r\n    return parseOpaqueHost(input);\r\n  }\r\n\r\n  const domain = utf8PercentDecode(input);\r\n  const asciiDomain = tr46.toASCII(domain, false, tr46.PROCESSING_OPTIONS.NONTRANSITIONAL, false);\r\n  if (asciiDomain === null) {\r\n    return failure;\r\n  }\r\n\r\n  if (containsForbiddenHostCodePoint(asciiDomain)) {\r\n    return failure;\r\n  }\r\n\r\n  const ipv4Host = parseIPv4(asciiDomain);\r\n  if (typeof ipv4Host === \"number\" || ipv4Host === failure) {\r\n    return ipv4Host;\r\n  }\r\n\r\n  return asciiDomain;\r\n}\r\n\r\nfunction parseOpaqueHost(input) {\r\n  if (containsForbiddenHostCodePointExcludingPercent(input)) {\r\n    return failure;\r\n  }\r\n\r\n  let output = \"\";\r\n  const decoded = punycode.ucs2.decode(input);\r\n  for (let i = 0; i < decoded.length; ++i) {\r\n    output += percentEncodeChar(decoded[i], isC0ControlPercentEncode);\r\n  }\r\n  return output;\r\n}\r\n\r\nfunction findLongestZeroSequence(arr) {\r\n  let maxIdx = null;\r\n  let maxLen = 1; // only find elements > 1\r\n  let currStart = null;\r\n  let currLen = 0;\r\n\r\n  for (let i = 0; i < arr.length; ++i) {\r\n    if (arr[i] !== 0) {\r\n      if (currLen > maxLen) {\r\n        maxIdx = currStart;\r\n        maxLen = currLen;\r\n      }\r\n\r\n      currStart = null;\r\n      currLen = 0;\r\n    } else {\r\n      if (currStart === null) {\r\n        currStart = i;\r\n      }\r\n      ++currLen;\r\n    }\r\n  }\r\n\r\n  // if trailing zeros\r\n  if (currLen > 
maxLen) {\r\n    maxIdx = currStart;\r\n    maxLen = currLen;\r\n  }\r\n\r\n  return {\r\n    idx: maxIdx,\r\n    len: maxLen\r\n  };\r\n}\r\n\r\nfunction serializeHost(host) {\r\n  if (typeof host === \"number\") {\r\n    return serializeIPv4(host);\r\n  }\r\n\r\n  // IPv6 serializer\r\n  if (host instanceof Array) {\r\n    return \"[\" + serializeIPv6(host) + \"]\";\r\n  }\r\n\r\n  return host;\r\n}\r\n\r\nfunction trimControlChars(url) {\r\n  return url.replace(/^[\\u0000-\\u001F\\u0020]+|[\\u0000-\\u001F\\u0020]+$/g, \"\");\r\n}\r\n\r\nfunction trimTabAndNewline(url) {\r\n  return url.replace(/\\u0009|\\u000A|\\u000D/g, \"\");\r\n}\r\n\r\nfunction shortenPath(url) {\r\n  const path = url.path;\r\n  if (path.length === 0) {\r\n    return;\r\n  }\r\n  if (url.scheme === \"file\" && path.length === 1 && isNormalizedWindowsDriveLetter(path[0])) {\r\n    return;\r\n  }\r\n\r\n  path.pop();\r\n}\r\n\r\nfunction includesCredentials(url) {\r\n  return url.username !== \"\" || url.password !== \"\";\r\n}\r\n\r\nfunction cannotHaveAUsernamePasswordPort(url) {\r\n  return url.host === null || url.host === \"\" || url.cannotBeABaseURL || url.scheme === \"file\";\r\n}\r\n\r\nfunction isNormalizedWindowsDriveLetter(string) {\r\n  return /^[A-Za-z]:$/.test(string);\r\n}\r\n\r\nfunction URLStateMachine(input, base, encodingOverride, url, stateOverride) {\r\n  this.pointer = 0;\r\n  this.input = input;\r\n  this.base = base || null;\r\n  this.encodingOverride = encodingOverride || \"utf-8\";\r\n  this.stateOverride = stateOverride;\r\n  this.url = url;\r\n  this.failure = false;\r\n  this.parseError = false;\r\n\r\n  if (!this.url) {\r\n    this.url = {\r\n      scheme: \"\",\r\n      username: \"\",\r\n      password: \"\",\r\n      host: null,\r\n      port: null,\r\n      path: [],\r\n      query: null,\r\n      fragment: null,\r\n\r\n      cannotBeABaseURL: false\r\n    };\r\n\r\n    const res = trimControlChars(this.input);\r\n    if (res !== this.input) {\r\n      
this.parseError = true;\r\n    }\r\n    this.input = res;\r\n  }\r\n\r\n  const res = trimTabAndNewline(this.input);\r\n  if (res !== this.input) {\r\n    this.parseError = true;\r\n  }\r\n  this.input = res;\r\n\r\n  this.state = stateOverride || \"scheme start\";\r\n\r\n  this.buffer = \"\";\r\n  this.atFlag = false;\r\n  this.arrFlag = false;\r\n  this.passwordTokenSeenFlag = false;\r\n\r\n  this.input = punycode.ucs2.decode(this.input);\r\n\r\n  for (; this.pointer <= this.input.length; ++this.pointer) {\r\n    const c = this.input[this.pointer];\r\n    const cStr = isNaN(c) ? undefined : String.fromCodePoint(c);\r\n\r\n    // exec state machine\r\n    const ret = this[\"parse \" + this.state](c, cStr);\r\n    if (!ret) {\r\n      break; // terminate algorithm\r\n    } else if (ret === failure) {\r\n      this.failure = true;\r\n      break;\r\n    }\r\n  }\r\n}\r\n\r\nURLStateMachine.prototype[\"parse scheme start\"] = function parseSchemeStart(c, cStr) {\r\n  if (isASCIIAlpha(c)) {\r\n    this.buffer += cStr.toLowerCase();\r\n    this.state = \"scheme\";\r\n  } else if (!this.stateOverride) {\r\n    this.state = \"no scheme\";\r\n    --this.pointer;\r\n  } else {\r\n    this.parseError = true;\r\n    return failure;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse scheme\"] = function parseScheme(c, cStr) {\r\n  if (isASCIIAlphanumeric(c) || c === 43 || c === 45 || c === 46) {\r\n    this.buffer += cStr.toLowerCase();\r\n  } else if (c === 58) {\r\n    if (this.stateOverride) {\r\n      if (isSpecial(this.url) && !isSpecialScheme(this.buffer)) {\r\n        return false;\r\n      }\r\n\r\n      if (!isSpecial(this.url) && isSpecialScheme(this.buffer)) {\r\n        return false;\r\n      }\r\n\r\n      if ((includesCredentials(this.url) || this.url.port !== null) && this.buffer === \"file\") {\r\n        return false;\r\n      }\r\n\r\n      if (this.url.scheme === \"file\" && (this.url.host === \"\" || this.url.host === null)) {\r\n  
      return false;\r\n      }\r\n    }\r\n    this.url.scheme = this.buffer;\r\n    this.buffer = \"\";\r\n    if (this.stateOverride) {\r\n      return false;\r\n    }\r\n    if (this.url.scheme === \"file\") {\r\n      if (this.input[this.pointer + 1] !== 47 || this.input[this.pointer + 2] !== 47) {\r\n        this.parseError = true;\r\n      }\r\n      this.state = \"file\";\r\n    } else if (isSpecial(this.url) && this.base !== null && this.base.scheme === this.url.scheme) {\r\n      this.state = \"special relative or authority\";\r\n    } else if (isSpecial(this.url)) {\r\n      this.state = \"special authority slashes\";\r\n    } else if (this.input[this.pointer + 1] === 47) {\r\n      this.state = \"path or authority\";\r\n      ++this.pointer;\r\n    } else {\r\n      this.url.cannotBeABaseURL = true;\r\n      this.url.path.push(\"\");\r\n      this.state = \"cannot-be-a-base-URL path\";\r\n    }\r\n  } else if (!this.stateOverride) {\r\n    this.buffer = \"\";\r\n    this.state = \"no scheme\";\r\n    this.pointer = -1;\r\n  } else {\r\n    this.parseError = true;\r\n    return failure;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse no scheme\"] = function parseNoScheme(c) {\r\n  if (this.base === null || (this.base.cannotBeABaseURL && c !== 35)) {\r\n    return failure;\r\n  } else if (this.base.cannotBeABaseURL && c === 35) {\r\n    this.url.scheme = this.base.scheme;\r\n    this.url.path = this.base.path.slice();\r\n    this.url.query = this.base.query;\r\n    this.url.fragment = \"\";\r\n    this.url.cannotBeABaseURL = true;\r\n    this.state = \"fragment\";\r\n  } else if (this.base.scheme === \"file\") {\r\n    this.state = \"file\";\r\n    --this.pointer;\r\n  } else {\r\n    this.state = \"relative\";\r\n    --this.pointer;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse special relative or authority\"] = function parseSpecialRelativeOrAuthority(c) {\r\n  if (c === 47 && 
this.input[this.pointer + 1] === 47) {\r\n    this.state = \"special authority ignore slashes\";\r\n    ++this.pointer;\r\n  } else {\r\n    this.parseError = true;\r\n    this.state = \"relative\";\r\n    --this.pointer;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse path or authority\"] = function parsePathOrAuthority(c) {\r\n  if (c === 47) {\r\n    this.state = \"authority\";\r\n  } else {\r\n    this.state = \"path\";\r\n    --this.pointer;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse relative\"] = function parseRelative(c) {\r\n  this.url.scheme = this.base.scheme;\r\n  if (isNaN(c)) {\r\n    this.url.username = this.base.username;\r\n    this.url.password = this.base.password;\r\n    this.url.host = this.base.host;\r\n    this.url.port = this.base.port;\r\n    this.url.path = this.base.path.slice();\r\n    this.url.query = this.base.query;\r\n  } else if (c === 47) {\r\n    this.state = \"relative slash\";\r\n  } else if (c === 63) {\r\n    this.url.username = this.base.username;\r\n    this.url.password = this.base.password;\r\n    this.url.host = this.base.host;\r\n    this.url.port = this.base.port;\r\n    this.url.path = this.base.path.slice();\r\n    this.url.query = \"\";\r\n    this.state = \"query\";\r\n  } else if (c === 35) {\r\n    this.url.username = this.base.username;\r\n    this.url.password = this.base.password;\r\n    this.url.host = this.base.host;\r\n    this.url.port = this.base.port;\r\n    this.url.path = this.base.path.slice();\r\n    this.url.query = this.base.query;\r\n    this.url.fragment = \"\";\r\n    this.state = \"fragment\";\r\n  } else if (isSpecial(this.url) && c === 92) {\r\n    this.parseError = true;\r\n    this.state = \"relative slash\";\r\n  } else {\r\n    this.url.username = this.base.username;\r\n    this.url.password = this.base.password;\r\n    this.url.host = this.base.host;\r\n    this.url.port = this.base.port;\r\n    this.url.path = 
this.base.path.slice(0, this.base.path.length - 1);\r\n\r\n    this.state = \"path\";\r\n    --this.pointer;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse relative slash\"] = function parseRelativeSlash(c) {\r\n  if (isSpecial(this.url) && (c === 47 || c === 92)) {\r\n    if (c === 92) {\r\n      this.parseError = true;\r\n    }\r\n    this.state = \"special authority ignore slashes\";\r\n  } else if (c === 47) {\r\n    this.state = \"authority\";\r\n  } else {\r\n    this.url.username = this.base.username;\r\n    this.url.password = this.base.password;\r\n    this.url.host = this.base.host;\r\n    this.url.port = this.base.port;\r\n    this.state = \"path\";\r\n    --this.pointer;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse special authority slashes\"] = function parseSpecialAuthoritySlashes(c) {\r\n  if (c === 47 && this.input[this.pointer + 1] === 47) {\r\n    this.state = \"special authority ignore slashes\";\r\n    ++this.pointer;\r\n  } else {\r\n    this.parseError = true;\r\n    this.state = \"special authority ignore slashes\";\r\n    --this.pointer;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse special authority ignore slashes\"] = function parseSpecialAuthorityIgnoreSlashes(c) {\r\n  if (c !== 47 && c !== 92) {\r\n    this.state = \"authority\";\r\n    --this.pointer;\r\n  } else {\r\n    this.parseError = true;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse authority\"] = function parseAuthority(c, cStr) {\r\n  if (c === 64) {\r\n    this.parseError = true;\r\n    if (this.atFlag) {\r\n      this.buffer = \"%40\" + this.buffer;\r\n    }\r\n    this.atFlag = true;\r\n\r\n    // careful, this is based on buffer and has its own pointer (this.pointer != pointer) and inner chars\r\n    const len = countSymbols(this.buffer);\r\n    for (let pointer = 0; pointer < len; ++pointer) {\r\n      const codePoint = 
this.buffer.codePointAt(pointer);\r\n\r\n      if (codePoint === 58 && !this.passwordTokenSeenFlag) {\r\n        this.passwordTokenSeenFlag = true;\r\n        continue;\r\n      }\r\n      const encodedCodePoints = percentEncodeChar(codePoint, isUserinfoPercentEncode);\r\n      if (this.passwordTokenSeenFlag) {\r\n        this.url.password += encodedCodePoints;\r\n      } else {\r\n        this.url.username += encodedCodePoints;\r\n      }\r\n    }\r\n    this.buffer = \"\";\r\n  } else if (isNaN(c) || c === 47 || c === 63 || c === 35 ||\r\n             (isSpecial(this.url) && c === 92)) {\r\n    if (this.atFlag && this.buffer === \"\") {\r\n      this.parseError = true;\r\n      return failure;\r\n    }\r\n    this.pointer -= countSymbols(this.buffer) + 1;\r\n    this.buffer = \"\";\r\n    this.state = \"host\";\r\n  } else {\r\n    this.buffer += cStr;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse hostname\"] =\r\nURLStateMachine.prototype[\"parse host\"] = function parseHostName(c, cStr) {\r\n  if (this.stateOverride && this.url.scheme === \"file\") {\r\n    --this.pointer;\r\n    this.state = \"file host\";\r\n  } else if (c === 58 && !this.arrFlag) {\r\n    if (this.buffer === \"\") {\r\n      this.parseError = true;\r\n      return failure;\r\n    }\r\n\r\n    const host = parseHost(this.buffer, isSpecial(this.url));\r\n    if (host === failure) {\r\n      return failure;\r\n    }\r\n\r\n    this.url.host = host;\r\n    this.buffer = \"\";\r\n    this.state = \"port\";\r\n    if (this.stateOverride === \"hostname\") {\r\n      return false;\r\n    }\r\n  } else if (isNaN(c) || c === 47 || c === 63 || c === 35 ||\r\n             (isSpecial(this.url) && c === 92)) {\r\n    --this.pointer;\r\n    if (isSpecial(this.url) && this.buffer === \"\") {\r\n      this.parseError = true;\r\n      return failure;\r\n    } else if (this.stateOverride && this.buffer === \"\" &&\r\n               (includesCredentials(this.url) || this.url.port 
!== null)) {\r\n      this.parseError = true;\r\n      return false;\r\n    }\r\n\r\n    const host = parseHost(this.buffer, isSpecial(this.url));\r\n    if (host === failure) {\r\n      return failure;\r\n    }\r\n\r\n    this.url.host = host;\r\n    this.buffer = \"\";\r\n    this.state = \"path start\";\r\n    if (this.stateOverride) {\r\n      return false;\r\n    }\r\n  } else {\r\n    if (c === 91) {\r\n      this.arrFlag = true;\r\n    } else if (c === 93) {\r\n      this.arrFlag = false;\r\n    }\r\n    this.buffer += cStr;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse port\"] = function parsePort(c, cStr) {\r\n  if (isASCIIDigit(c)) {\r\n    this.buffer += cStr;\r\n  } else if (isNaN(c) || c === 47 || c === 63 || c === 35 ||\r\n             (isSpecial(this.url) && c === 92) ||\r\n             this.stateOverride) {\r\n    if (this.buffer !== \"\") {\r\n      const port = parseInt(this.buffer);\r\n      if (port > Math.pow(2, 16) - 1) {\r\n        this.parseError = true;\r\n        return failure;\r\n      }\r\n      this.url.port = port === defaultPort(this.url.scheme) ? 
null : port;\r\n      this.buffer = \"\";\r\n    }\r\n    if (this.stateOverride) {\r\n      return false;\r\n    }\r\n    this.state = \"path start\";\r\n    --this.pointer;\r\n  } else {\r\n    this.parseError = true;\r\n    return failure;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nconst fileOtherwiseCodePoints = new Set([47, 92, 63, 35]);\r\n\r\nURLStateMachine.prototype[\"parse file\"] = function parseFile(c) {\r\n  this.url.scheme = \"file\";\r\n\r\n  if (c === 47 || c === 92) {\r\n    if (c === 92) {\r\n      this.parseError = true;\r\n    }\r\n    this.state = \"file slash\";\r\n  } else if (this.base !== null && this.base.scheme === \"file\") {\r\n    if (isNaN(c)) {\r\n      this.url.host = this.base.host;\r\n      this.url.path = this.base.path.slice();\r\n      this.url.query = this.base.query;\r\n    } else if (c === 63) {\r\n      this.url.host = this.base.host;\r\n      this.url.path = this.base.path.slice();\r\n      this.url.query = \"\";\r\n      this.state = \"query\";\r\n    } else if (c === 35) {\r\n      this.url.host = this.base.host;\r\n      this.url.path = this.base.path.slice();\r\n      this.url.query = this.base.query;\r\n      this.url.fragment = \"\";\r\n      this.state = \"fragment\";\r\n    } else {\r\n      if (this.input.length - this.pointer - 1 === 0 || // remaining consists of 0 code points\r\n          !isWindowsDriveLetterCodePoints(c, this.input[this.pointer + 1]) ||\r\n          (this.input.length - this.pointer - 1 >= 2 && // remaining has at least 2 code points\r\n           !fileOtherwiseCodePoints.has(this.input[this.pointer + 2]))) {\r\n        this.url.host = this.base.host;\r\n        this.url.path = this.base.path.slice();\r\n        shortenPath(this.url);\r\n      } else {\r\n        this.parseError = true;\r\n      }\r\n\r\n      this.state = \"path\";\r\n      --this.pointer;\r\n    }\r\n  } else {\r\n    this.state = \"path\";\r\n    --this.pointer;\r\n  }\r\n\r\n  return 
true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse file slash\"] = function parseFileSlash(c) {\r\n  if (c === 47 || c === 92) {\r\n    if (c === 92) {\r\n      this.parseError = true;\r\n    }\r\n    this.state = \"file host\";\r\n  } else {\r\n    if (this.base !== null && this.base.scheme === \"file\") {\r\n      if (isNormalizedWindowsDriveLetterString(this.base.path[0])) {\r\n        this.url.path.push(this.base.path[0]);\r\n      } else {\r\n        this.url.host = this.base.host;\r\n      }\r\n    }\r\n    this.state = \"path\";\r\n    --this.pointer;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse file host\"] = function parseFileHost(c, cStr) {\r\n  if (isNaN(c) || c === 47 || c === 92 || c === 63 || c === 35) {\r\n    --this.pointer;\r\n    if (!this.stateOverride && isWindowsDriveLetterString(this.buffer)) {\r\n      this.parseError = true;\r\n      this.state = \"path\";\r\n    } else if (this.buffer === \"\") {\r\n      this.url.host = \"\";\r\n      if (this.stateOverride) {\r\n        return false;\r\n      }\r\n      this.state = \"path start\";\r\n    } else {\r\n      let host = parseHost(this.buffer, isSpecial(this.url));\r\n      if (host === failure) {\r\n        return failure;\r\n      }\r\n      if (host === \"localhost\") {\r\n        host = \"\";\r\n      }\r\n      this.url.host = host;\r\n\r\n      if (this.stateOverride) {\r\n        return false;\r\n      }\r\n\r\n      this.buffer = \"\";\r\n      this.state = \"path start\";\r\n    }\r\n  } else {\r\n    this.buffer += cStr;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse path start\"] = function parsePathStart(c) {\r\n  if (isSpecial(this.url)) {\r\n    if (c === 92) {\r\n      this.parseError = true;\r\n    }\r\n    this.state = \"path\";\r\n\r\n    if (c !== 47 && c !== 92) {\r\n      --this.pointer;\r\n    }\r\n  } else if (!this.stateOverride && c === 63) {\r\n    this.url.query = \"\";\r\n    this.state = \"query\";\r\n  
} else if (!this.stateOverride && c === 35) {\r\n    this.url.fragment = \"\";\r\n    this.state = \"fragment\";\r\n  } else if (c !== undefined) {\r\n    this.state = \"path\";\r\n    if (c !== 47) {\r\n      --this.pointer;\r\n    }\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse path\"] = function parsePath(c) {\r\n  if (isNaN(c) || c === 47 || (isSpecial(this.url) && c === 92) ||\r\n      (!this.stateOverride && (c === 63 || c === 35))) {\r\n    if (isSpecial(this.url) && c === 92) {\r\n      this.parseError = true;\r\n    }\r\n\r\n    if (isDoubleDot(this.buffer)) {\r\n      shortenPath(this.url);\r\n      if (c !== 47 && !(isSpecial(this.url) && c === 92)) {\r\n        this.url.path.push(\"\");\r\n      }\r\n    } else if (isSingleDot(this.buffer) && c !== 47 &&\r\n               !(isSpecial(this.url) && c === 92)) {\r\n      this.url.path.push(\"\");\r\n    } else if (!isSingleDot(this.buffer)) {\r\n      if (this.url.scheme === \"file\" && this.url.path.length === 0 && isWindowsDriveLetterString(this.buffer)) {\r\n        if (this.url.host !== \"\" && this.url.host !== null) {\r\n          this.parseError = true;\r\n          this.url.host = \"\";\r\n        }\r\n        this.buffer = this.buffer[0] + \":\";\r\n      }\r\n      this.url.path.push(this.buffer);\r\n    }\r\n    this.buffer = \"\";\r\n    if (this.url.scheme === \"file\" && (c === undefined || c === 63 || c === 35)) {\r\n      while (this.url.path.length > 1 && this.url.path[0] === \"\") {\r\n        this.parseError = true;\r\n        this.url.path.shift();\r\n      }\r\n    }\r\n    if (c === 63) {\r\n      this.url.query = \"\";\r\n      this.state = \"query\";\r\n    }\r\n    if (c === 35) {\r\n      this.url.fragment = \"\";\r\n      this.state = \"fragment\";\r\n    }\r\n  } else {\r\n    // TODO: If c is not a URL code point and not \"%\", parse error.\r\n\r\n    if (c === 37 &&\r\n      (!isASCIIHex(this.input[this.pointer + 1]) ||\r\n        
!isASCIIHex(this.input[this.pointer + 2]))) {\r\n      this.parseError = true;\r\n    }\r\n\r\n    this.buffer += percentEncodeChar(c, isPathPercentEncode);\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse cannot-be-a-base-URL path\"] = function parseCannotBeABaseURLPath(c) {\r\n  if (c === 63) {\r\n    this.url.query = \"\";\r\n    this.state = \"query\";\r\n  } else if (c === 35) {\r\n    this.url.fragment = \"\";\r\n    this.state = \"fragment\";\r\n  } else {\r\n    // TODO: Add: not a URL code point\r\n    if (!isNaN(c) && c !== 37) {\r\n      this.parseError = true;\r\n    }\r\n\r\n    if (c === 37 &&\r\n        (!isASCIIHex(this.input[this.pointer + 1]) ||\r\n         !isASCIIHex(this.input[this.pointer + 2]))) {\r\n      this.parseError = true;\r\n    }\r\n\r\n    if (!isNaN(c)) {\r\n      this.url.path[0] = this.url.path[0] + percentEncodeChar(c, isC0ControlPercentEncode);\r\n    }\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse query\"] = function parseQuery(c, cStr) {\r\n  if (isNaN(c) || (!this.stateOverride && c === 35)) {\r\n    if (!isSpecial(this.url) || this.url.scheme === \"ws\" || this.url.scheme === \"wss\") {\r\n      this.encodingOverride = \"utf-8\";\r\n    }\r\n\r\n    const buffer = new Buffer(this.buffer); // TODO: Use encoding override instead\r\n    for (let i = 0; i < buffer.length; ++i) {\r\n      if (buffer[i] < 0x21 || buffer[i] > 0x7E || buffer[i] === 0x22 || buffer[i] === 0x23 ||\r\n          buffer[i] === 0x3C || buffer[i] === 0x3E) {\r\n        this.url.query += percentEncode(buffer[i]);\r\n      } else {\r\n        this.url.query += String.fromCodePoint(buffer[i]);\r\n      }\r\n    }\r\n\r\n    this.buffer = \"\";\r\n    if (c === 35) {\r\n      this.url.fragment = \"\";\r\n      this.state = \"fragment\";\r\n    }\r\n  } else {\r\n    // TODO: If c is not a URL code point and not \"%\", parse error.\r\n    if (c === 37 &&\r\n      (!isASCIIHex(this.input[this.pointer + 1]) 
||\r\n        !isASCIIHex(this.input[this.pointer + 2]))) {\r\n      this.parseError = true;\r\n    }\r\n\r\n    this.buffer += cStr;\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nURLStateMachine.prototype[\"parse fragment\"] = function parseFragment(c) {\r\n  if (isNaN(c)) { // do nothing\r\n  } else if (c === 0x0) {\r\n    this.parseError = true;\r\n  } else {\r\n    // TODO: If c is not a URL code point and not \"%\", parse error.\r\n    if (c === 37 &&\r\n      (!isASCIIHex(this.input[this.pointer + 1]) ||\r\n        !isASCIIHex(this.input[this.pointer + 2]))) {\r\n      this.parseError = true;\r\n    }\r\n\r\n    this.url.fragment += percentEncodeChar(c, isC0ControlPercentEncode);\r\n  }\r\n\r\n  return true;\r\n};\r\n\r\nfunction serializeURL(url, excludeFragment) {\r\n  let output = url.scheme + \":\";\r\n  if (url.host !== null) {\r\n    output += \"//\";\r\n\r\n    if (url.username !== \"\" || url.password !== \"\") {\r\n      output += url.username;\r\n      if (url.password !== \"\") {\r\n        output += \":\" + url.password;\r\n      }\r\n      output += \"@\";\r\n    }\r\n\r\n    output += serializeHost(url.host);\r\n\r\n    if (url.port !== null) {\r\n      output += \":\" + url.port;\r\n    }\r\n  } else if (url.host === null && url.scheme === \"file\") {\r\n    output += \"//\";\r\n  }\r\n\r\n  if (url.cannotBeABaseURL) {\r\n    output += url.path[0];\r\n  } else {\r\n    for (const string of url.path) {\r\n      output += \"/\" + string;\r\n    }\r\n  }\r\n\r\n  if (url.query !== null) {\r\n    output += \"?\" + url.query;\r\n  }\r\n\r\n  if (!excludeFragment && url.fragment !== null) {\r\n    output += \"#\" + url.fragment;\r\n  }\r\n\r\n  return output;\r\n}\r\n\r\nfunction serializeOrigin(tuple) {\r\n  let result = tuple.scheme + \"://\";\r\n  result += serializeHost(tuple.host);\r\n\r\n  if (tuple.port !== null) {\r\n    result += \":\" + tuple.port;\r\n  }\r\n\r\n  return result;\r\n}\r\n\r\nmodule.exports.serializeURL = 
serializeURL;\r\n\r\nmodule.exports.serializeURLOrigin = function (url) {\r\n  // https://url.spec.whatwg.org/#concept-url-origin\r\n  switch (url.scheme) {\r\n    case \"blob\":\r\n      try {\r\n        return module.exports.serializeURLOrigin(module.exports.parseURL(url.path[0]));\r\n      } catch (e) {\r\n        // serializing an opaque origin returns \"null\"\r\n        return \"null\";\r\n      }\r\n    case \"ftp\":\r\n    case \"gopher\":\r\n    case \"http\":\r\n    case \"https\":\r\n    case \"ws\":\r\n    case \"wss\":\r\n      return serializeOrigin({\r\n        scheme: url.scheme,\r\n        host: url.host,\r\n        port: url.port\r\n      });\r\n    case \"file\":\r\n      // spec says \"exercise to the reader\", chrome says \"file://\"\r\n      return \"file://\";\r\n    default:\r\n      // serializing an opaque origin returns \"null\"\r\n      return \"null\";\r\n  }\r\n};\r\n\r\nmodule.exports.basicURLParse = function (input, options) {\r\n  if (options === undefined) {\r\n    options = {};\r\n  }\r\n\r\n  const usm = new URLStateMachine(input, options.baseURL, options.encodingOverride, options.url, options.stateOverride);\r\n  if (usm.failure) {\r\n    return \"failure\";\r\n  }\r\n\r\n  return usm.url;\r\n};\r\n\r\nmodule.exports.setTheUsername = function (url, username) {\r\n  url.username = \"\";\r\n  const decoded = punycode.ucs2.decode(username);\r\n  for (let i = 0; i < decoded.length; ++i) {\r\n    url.username += percentEncodeChar(decoded[i], isUserinfoPercentEncode);\r\n  }\r\n};\r\n\r\nmodule.exports.setThePassword = function (url, password) {\r\n  url.password = \"\";\r\n  const decoded = punycode.ucs2.decode(password);\r\n  for (let i = 0; i < decoded.length; ++i) {\r\n    url.password += percentEncodeChar(decoded[i], isUserinfoPercentEncode);\r\n  }\r\n};\r\n\r\nmodule.exports.serializeHost = serializeHost;\r\n\r\nmodule.exports.cannotHaveAUsernamePasswordPort = 
cannotHaveAUsernamePasswordPort;\r\n\r\nmodule.exports.serializeInteger = function (integer) {\r\n  return String(integer);\r\n};\r\n\r\nmodule.exports.parseURL = function (input, options) {\r\n  if (options === undefined) {\r\n    options = {};\r\n  }\r\n\r\n  // We don't handle blobs, so this just delegates:\r\n  return module.exports.basicURLParse(input, { baseURL: options.baseURL, encodingOverride: options.encodingOverride });\r\n};\r\n\n\n/***/ }),\n\n/***/ 3185:\n/***/ ((module) => {\n\n\"use strict\";\n\n\nmodule.exports.mixin = function mixin(target, source) {\n  const keys = Object.getOwnPropertyNames(source);\n  for (let i = 0; i < keys.length; ++i) {\n    Object.defineProperty(target, keys[i], Object.getOwnPropertyDescriptor(source, keys[i]));\n  }\n};\n\nmodule.exports.wrapperSymbol = Symbol(\"wrapper\");\nmodule.exports.implSymbol = Symbol(\"impl\");\n\nmodule.exports.wrapperForImpl = function (impl) {\n  return impl[module.exports.wrapperSymbol];\n};\n\nmodule.exports.implForWrapper = function (wrapper) {\n  return wrapper[module.exports.implSymbol];\n};\n\n\n\n/***/ }),\n\n/***/ 4207:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nmodule.exports = which\nwhich.sync = whichSync\n\nvar isWindows = process.platform === 'win32' ||\n    process.env.OSTYPE === 'cygwin' ||\n    process.env.OSTYPE === 'msys'\n\nvar path = __webpack_require__(5622)\nvar COLON = isWindows ? 
';' : ':'\nvar isexe = __webpack_require__(7126)\n\nfunction getNotFoundError (cmd) {\n  var er = new Error('not found: ' + cmd)\n  er.code = 'ENOENT'\n\n  return er\n}\n\nfunction getPathInfo (cmd, opt) {\n  var colon = opt.colon || COLON\n  var pathEnv = opt.path || process.env.PATH || ''\n  var pathExt = ['']\n\n  pathEnv = pathEnv.split(colon)\n\n  var pathExtExe = ''\n  if (isWindows) {\n    pathEnv.unshift(process.cwd())\n    pathExtExe = (opt.pathExt || process.env.PATHEXT || '.EXE;.CMD;.BAT;.COM')\n    pathExt = pathExtExe.split(colon)\n\n\n    // Always test the cmd itself first.  isexe will check to make sure\n    // it's found in the pathExt set.\n    if (cmd.indexOf('.') !== -1 && pathExt[0] !== '')\n      pathExt.unshift('')\n  }\n\n  // If it has a slash, then we don't bother searching the pathenv.\n  // just check the file itself, and that's it.\n  if (cmd.match(/\\//) || isWindows && cmd.match(/\\\\/))\n    pathEnv = ['']\n\n  return {\n    env: pathEnv,\n    ext: pathExt,\n    extExe: pathExtExe\n  }\n}\n\nfunction which (cmd, opt, cb) {\n  if (typeof opt === 'function') {\n    cb = opt\n    opt = {}\n  }\n\n  var info = getPathInfo(cmd, opt)\n  var pathEnv = info.env\n  var pathExt = info.ext\n  var pathExtExe = info.extExe\n  var found = []\n\n  ;(function F (i, l) {\n    if (i === l) {\n      if (opt.all && found.length)\n        return cb(null, found)\n      else\n        return cb(getNotFoundError(cmd))\n    }\n\n    var pathPart = pathEnv[i]\n    if (pathPart.charAt(0) === '\"' && pathPart.slice(-1) === '\"')\n      pathPart = pathPart.slice(1, -1)\n\n    var p = path.join(pathPart, cmd)\n    if (!pathPart && (/^\\.[\\\\\\/]/).test(cmd)) {\n      p = cmd.slice(0, 2) + p\n    }\n    ;(function E (ii, ll) {\n      if (ii === ll) return F(i + 1, l)\n      var ext = pathExt[ii]\n      isexe(p + ext, { pathExt: pathExtExe }, function (er, is) {\n        if (!er && is) {\n          if (opt.all)\n            found.push(p + ext)\n          else\n     
       return cb(null, p + ext)\n        }\n        return E(ii + 1, ll)\n      })\n    })(0, pathExt.length)\n  })(0, pathEnv.length)\n}\n\nfunction whichSync (cmd, opt) {\n  opt = opt || {}\n\n  var info = getPathInfo(cmd, opt)\n  var pathEnv = info.env\n  var pathExt = info.ext\n  var pathExtExe = info.extExe\n  var found = []\n\n  for (var i = 0, l = pathEnv.length; i < l; i ++) {\n    var pathPart = pathEnv[i]\n    if (pathPart.charAt(0) === '\"' && pathPart.slice(-1) === '\"')\n      pathPart = pathPart.slice(1, -1)\n\n    var p = path.join(pathPart, cmd)\n    if (!pathPart && /^\\.[\\\\\\/]/.test(cmd)) {\n      p = cmd.slice(0, 2) + p\n    }\n    for (var j = 0, ll = pathExt.length; j < ll; j ++) {\n      var cur = p + pathExt[j]\n      var is\n      try {\n        is = isexe.sync(cur, { pathExt: pathExtExe })\n        if (is) {\n          if (opt.all)\n            found.push(cur)\n          else\n            return cur\n        }\n      } catch (ex) {}\n    }\n  }\n\n  if (opt.all && found.length)\n    return found\n\n  if (opt.nothrow)\n    return null\n\n  throw getNotFoundError(cmd)\n}\n\n\n/***/ }),\n\n/***/ 3515:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\n\"use strict\";\n\nconst os = __webpack_require__(2087);\nconst execa = __webpack_require__(5447);\n\n// Reference: https://www.gaijin.at/en/lstwinver.php\nconst names = new Map([\n\t['10.0', '10'],\n\t['6.3', '8.1'],\n\t['6.2', '8'],\n\t['6.1', '7'],\n\t['6.0', 'Vista'],\n\t['5.2', 'Server 2003'],\n\t['5.1', 'XP'],\n\t['5.0', '2000'],\n\t['4.9', 'ME'],\n\t['4.1', '98'],\n\t['4.0', '95']\n]);\n\nconst windowsRelease = release => {\n\tconst version = /\\d+\\.\\d/.exec(release || os.release());\n\n\tif (release && !version) {\n\t\tthrow new Error('`release` argument doesn\\'t match `n.n`');\n\t}\n\n\tconst ver = (version || [])[0];\n\n\t// Server 2008, 2012, 2016, and 2019 versions are ambiguous with desktop versions and must be detected at runtime.\n\t// If `release` is 
omitted or we're on a Windows system, and the version number is an ambiguous version\n\t// then use `wmic` to get the OS caption: https://msdn.microsoft.com/en-us/library/aa394531(v=vs.85).aspx\n\t// If `wmic` is obsolete (later versions of Windows 10), use PowerShell instead.\n\t// If the resulting caption contains the year 2008, 2012, 2016 or 2019, it is a server version, so return a server OS name.\n\tif ((!release || release === os.release()) && ['6.1', '6.2', '6.3', '10.0'].includes(ver)) {\n\t\tlet stdout;\n\t\ttry {\n\t\t\tstdout = execa.sync('wmic', ['os', 'get', 'Caption']).stdout || '';\n\t\t} catch (_) {\n\t\t\tstdout = execa.sync('powershell', ['(Get-CimInstance -ClassName Win32_OperatingSystem).caption']).stdout || '';\n\t\t}\n\n\t\tconst year = (stdout.match(/2008|2012|2016|2019/) || [])[0];\n\n\t\tif (year) {\n\t\t\treturn `Server ${year}`;\n\t\t}\n\t}\n\n\treturn names.get(ver);\n};\n\nmodule.exports = windowsRelease;\n\n\n/***/ }),\n\n/***/ 2940:\n/***/ ((module) => {\n\n// Returns a wrapper function that returns a wrapped callback\n// The wrapper function should do some stuff, and return a\n// presumably different callback function.\n// This makes sure that own properties are retained, so that\n// decorations and such are not lost along the way.\nmodule.exports = wrappy\nfunction wrappy (fn, cb) {\n  if (fn && cb) return wrappy(fn)(cb)\n\n  if (typeof fn !== 'function')\n    throw new TypeError('need wrapper function')\n\n  Object.keys(fn).forEach(function (k) {\n    wrapper[k] = fn[k]\n  })\n\n  return wrapper\n\n  function wrapper() {\n    var args = new Array(arguments.length)\n    for (var i = 0; i < args.length; i++) {\n      args[i] = arguments[i]\n    }\n    var ret = fn.apply(this, args)\n    var cb = args[args.length-1]\n    if (typeof ret === 'function' && ret !== cb) {\n      Object.keys(cb).forEach(function (k) {\n        ret[k] = cb[k]\n      })\n    }\n    return ret\n  }\n}\n\n\n/***/ }),\n\n/***/ 1983:\n/***/ 
((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nvar PlainValue = __webpack_require__(5215);\nvar resolveSeq = __webpack_require__(6140);\nvar Schema = __webpack_require__(3656);\n\nconst defaultOptions = {\n  anchorPrefix: 'a',\n  customTags: null,\n  indent: 2,\n  indentSeq: true,\n  keepCstNodes: false,\n  keepNodeTypes: true,\n  keepBlobsInJSON: true,\n  mapAsMap: false,\n  maxAliasCount: 100,\n  prettyErrors: false,\n  // TODO Set true in v2\n  simpleKeys: false,\n  version: '1.2'\n};\nconst scalarOptions = {\n  get binary() {\n    return resolveSeq.binaryOptions;\n  },\n\n  set binary(opt) {\n    Object.assign(resolveSeq.binaryOptions, opt);\n  },\n\n  get bool() {\n    return resolveSeq.boolOptions;\n  },\n\n  set bool(opt) {\n    Object.assign(resolveSeq.boolOptions, opt);\n  },\n\n  get int() {\n    return resolveSeq.intOptions;\n  },\n\n  set int(opt) {\n    Object.assign(resolveSeq.intOptions, opt);\n  },\n\n  get null() {\n    return resolveSeq.nullOptions;\n  },\n\n  set null(opt) {\n    Object.assign(resolveSeq.nullOptions, opt);\n  },\n\n  get str() {\n    return resolveSeq.strOptions;\n  },\n\n  set str(opt) {\n    Object.assign(resolveSeq.strOptions, opt);\n  }\n\n};\nconst documentOptions = {\n  '1.0': {\n    schema: 'yaml-1.1',\n    merge: true,\n    tagPrefixes: [{\n      handle: '!',\n      prefix: PlainValue.defaultTagPrefix\n    }, {\n      handle: '!!',\n      prefix: 'tag:private.yaml.org,2002:'\n    }]\n  },\n  '1.1': {\n    schema: 'yaml-1.1',\n    merge: true,\n    tagPrefixes: [{\n      handle: '!',\n      prefix: '!'\n    }, {\n      handle: '!!',\n      prefix: PlainValue.defaultTagPrefix\n    }]\n  },\n  '1.2': {\n    schema: 'core',\n    merge: false,\n    tagPrefixes: [{\n      handle: '!',\n      prefix: '!'\n    }, {\n      handle: '!!',\n      prefix: PlainValue.defaultTagPrefix\n    }]\n  }\n};\n\nfunction stringifyTag(doc, tag) {\n  if ((doc.version || doc.options.version) === '1.0') {\n    
const priv = tag.match(/^tag:private\\.yaml\\.org,2002:([^:/]+)$/);\n    if (priv) return '!' + priv[1];\n    const vocab = tag.match(/^tag:([a-zA-Z0-9-]+)\\.yaml\\.org,2002:(.*)/);\n    return vocab ? `!${vocab[1]}/${vocab[2]}` : `!${tag.replace(/^tag:/, '')}`;\n  }\n\n  let p = doc.tagPrefixes.find(p => tag.indexOf(p.prefix) === 0);\n\n  if (!p) {\n    const dtp = doc.getDefaults().tagPrefixes;\n    p = dtp && dtp.find(p => tag.indexOf(p.prefix) === 0);\n  }\n\n  if (!p) return tag[0] === '!' ? tag : `!<${tag}>`;\n  const suffix = tag.substr(p.prefix.length).replace(/[!,[\\]{}]/g, ch => ({\n    '!': '%21',\n    ',': '%2C',\n    '[': '%5B',\n    ']': '%5D',\n    '{': '%7B',\n    '}': '%7D'\n  })[ch]);\n  return p.handle + suffix;\n}\n\nfunction getTagObject(tags, item) {\n  if (item instanceof resolveSeq.Alias) return resolveSeq.Alias;\n\n  if (item.tag) {\n    const match = tags.filter(t => t.tag === item.tag);\n    if (match.length > 0) return match.find(t => t.format === item.format) || match[0];\n  }\n\n  let tagObj, obj;\n\n  if (item instanceof resolveSeq.Scalar) {\n    obj = item.value; // TODO: deprecate/remove class check\n\n    const match = tags.filter(t => t.identify && t.identify(obj) || t.class && obj instanceof t.class);\n    tagObj = match.find(t => t.format === item.format) || match.find(t => !t.format);\n  } else {\n    obj = item;\n    tagObj = tags.find(t => t.nodeClass && obj instanceof t.nodeClass);\n  }\n\n  if (!tagObj) {\n    const name = obj && obj.constructor ? 
obj.constructor.name : typeof obj;\n    throw new Error(`Tag not resolved for ${name} value`);\n  }\n\n  return tagObj;\n} // needs to be called before value stringifier to allow for circular anchor refs\n\n\nfunction stringifyProps(node, tagObj, {\n  anchors,\n  doc\n}) {\n  const props = [];\n  const anchor = doc.anchors.getName(node);\n\n  if (anchor) {\n    anchors[anchor] = node;\n    props.push(`&${anchor}`);\n  }\n\n  if (node.tag) {\n    props.push(stringifyTag(doc, node.tag));\n  } else if (!tagObj.default) {\n    props.push(stringifyTag(doc, tagObj.tag));\n  }\n\n  return props.join(' ');\n}\n\nfunction stringify(item, ctx, onComment, onChompKeep) {\n  const {\n    anchors,\n    schema\n  } = ctx.doc;\n  let tagObj;\n\n  if (!(item instanceof resolveSeq.Node)) {\n    const createCtx = {\n      aliasNodes: [],\n      onTagObj: o => tagObj = o,\n      prevObjects: new Map()\n    };\n    item = schema.createNode(item, true, null, createCtx);\n\n    for (const alias of createCtx.aliasNodes) {\n      alias.source = alias.source.node;\n      let name = anchors.getName(alias.source);\n\n      if (!name) {\n        name = anchors.newName();\n        anchors.map[name] = alias.source;\n      }\n    }\n  }\n\n  if (item instanceof resolveSeq.Pair) return item.toString(ctx, onComment, onChompKeep);\n  if (!tagObj) tagObj = getTagObject(schema.tags, item);\n  const props = stringifyProps(item, tagObj, ctx);\n  if (props.length > 0) ctx.indentAtStart = (ctx.indentAtStart || 0) + props.length + 1;\n  const str = typeof tagObj.stringify === 'function' ? tagObj.stringify(item, ctx, onComment, onChompKeep) : item instanceof resolveSeq.Scalar ? resolveSeq.stringifyString(item, ctx, onComment, onChompKeep) : item.toString(ctx, onComment, onChompKeep);\n  if (!props) return str;\n  return item instanceof resolveSeq.Scalar || str[0] === '{' || str[0] === '[' ? 
`${props} ${str}` : `${props}\\n${ctx.indent}${str}`;\n}\n\nclass Anchors {\n  static validAnchorNode(node) {\n    return node instanceof resolveSeq.Scalar || node instanceof resolveSeq.YAMLSeq || node instanceof resolveSeq.YAMLMap;\n  }\n\n  constructor(prefix) {\n    PlainValue._defineProperty(this, \"map\", {});\n\n    this.prefix = prefix;\n  }\n\n  createAlias(node, name) {\n    this.setAnchor(node, name);\n    return new resolveSeq.Alias(node);\n  }\n\n  createMergePair(...sources) {\n    const merge = new resolveSeq.Merge();\n    merge.value.items = sources.map(s => {\n      if (s instanceof resolveSeq.Alias) {\n        if (s.source instanceof resolveSeq.YAMLMap) return s;\n      } else if (s instanceof resolveSeq.YAMLMap) {\n        return this.createAlias(s);\n      }\n\n      throw new Error('Merge sources must be Map nodes or their Aliases');\n    });\n    return merge;\n  }\n\n  getName(node) {\n    const {\n      map\n    } = this;\n    return Object.keys(map).find(a => map[a] === node);\n  }\n\n  getNames() {\n    return Object.keys(this.map);\n  }\n\n  getNode(name) {\n    return this.map[name];\n  }\n\n  newName(prefix) {\n    if (!prefix) prefix = this.prefix;\n    const names = Object.keys(this.map);\n\n    for (let i = 1; true; ++i) {\n      const name = `${prefix}${i}`;\n      if (!names.includes(name)) return name;\n    }\n  } // During parsing, map & aliases contain CST nodes\n\n\n  resolveNodes() {\n    const {\n      map,\n      _cstAliases\n    } = this;\n    Object.keys(map).forEach(a => {\n      map[a] = map[a].resolved;\n    });\n\n    _cstAliases.forEach(a => {\n      a.source = a.source.resolved;\n    });\n\n    delete this._cstAliases;\n  }\n\n  setAnchor(node, name) {\n    if (node != null && !Anchors.validAnchorNode(node)) {\n      throw new Error('Anchors may only be set for Scalar, Seq and Map nodes');\n    }\n\n    if (name && /[\\x00-\\x19\\s,[\\]{}]/.test(name)) {\n      throw new Error('Anchor names must not contain whitespace 
or control characters');\n    }\n\n    const {\n      map\n    } = this;\n    const prev = node && Object.keys(map).find(a => map[a] === node);\n\n    if (prev) {\n      if (!name) {\n        return prev;\n      } else if (prev !== name) {\n        delete map[prev];\n        map[name] = node;\n      }\n    } else {\n      if (!name) {\n        if (!node) return null;\n        name = this.newName();\n      }\n\n      map[name] = node;\n    }\n\n    return name;\n  }\n\n}\n\nconst visit = (node, tags) => {\n  if (node && typeof node === 'object') {\n    const {\n      tag\n    } = node;\n\n    if (node instanceof resolveSeq.Collection) {\n      if (tag) tags[tag] = true;\n      node.items.forEach(n => visit(n, tags));\n    } else if (node instanceof resolveSeq.Pair) {\n      visit(node.key, tags);\n      visit(node.value, tags);\n    } else if (node instanceof resolveSeq.Scalar) {\n      if (tag) tags[tag] = true;\n    }\n  }\n\n  return tags;\n};\n\nconst listTagNames = node => Object.keys(visit(node, {}));\n\nfunction parseContents(doc, contents) {\n  const comments = {\n    before: [],\n    after: []\n  };\n  let body = undefined;\n  let spaceBefore = false;\n\n  for (const node of contents) {\n    if (node.valueRange) {\n      if (body !== undefined) {\n        const msg = 'Document contains trailing content not separated by a ... or --- line';\n        doc.errors.push(new PlainValue.YAMLSyntaxError(node, msg));\n        break;\n      }\n\n      const res = resolveSeq.resolveNode(doc, node);\n\n      if (spaceBefore) {\n        res.spaceBefore = true;\n        spaceBefore = false;\n      }\n\n      body = res;\n    } else if (node.comment !== null) {\n      const cc = body === undefined ? 
comments.before : comments.after;\n      cc.push(node.comment);\n    } else if (node.type === PlainValue.Type.BLANK_LINE) {\n      spaceBefore = true;\n\n      if (body === undefined && comments.before.length > 0 && !doc.commentBefore) {\n        // space-separated comments at start are parsed as document comments\n        doc.commentBefore = comments.before.join('\\n');\n        comments.before = [];\n      }\n    }\n  }\n\n  doc.contents = body || null;\n\n  if (!body) {\n    doc.comment = comments.before.concat(comments.after).join('\\n') || null;\n  } else {\n    const cb = comments.before.join('\\n');\n\n    if (cb) {\n      const cbNode = body instanceof resolveSeq.Collection && body.items[0] ? body.items[0] : body;\n      cbNode.commentBefore = cbNode.commentBefore ? `${cb}\\n${cbNode.commentBefore}` : cb;\n    }\n\n    doc.comment = comments.after.join('\\n') || null;\n  }\n}\n\nfunction resolveTagDirective({\n  tagPrefixes\n}, directive) {\n  const [handle, prefix] = directive.parameters;\n\n  if (!handle || !prefix) {\n    const msg = 'Insufficient parameters given for %TAG directive';\n    throw new PlainValue.YAMLSemanticError(directive, msg);\n  }\n\n  if (tagPrefixes.some(p => p.handle === handle)) {\n    const msg = 'The %TAG directive must only be given at most once per handle in the same document.';\n    throw new PlainValue.YAMLSemanticError(directive, msg);\n  }\n\n  return {\n    handle,\n    prefix\n  };\n}\n\nfunction resolveYamlDirective(doc, directive) {\n  let [version] = directive.parameters;\n  if (directive.name === 'YAML:1.0') version = '1.0';\n\n  if (!version) {\n    const msg = 'Insufficient parameters given for %YAML directive';\n    throw new PlainValue.YAMLSemanticError(directive, msg);\n  }\n\n  if (!documentOptions[version]) {\n    const v0 = doc.version || doc.options.version;\n    const msg = `Document will be parsed as YAML ${v0} rather than YAML ${version}`;\n    doc.warnings.push(new PlainValue.YAMLWarning(directive, 
msg));\n  }\n\n  return version;\n}\n\nfunction parseDirectives(doc, directives, prevDoc) {\n  const directiveComments = [];\n  let hasDirectives = false;\n\n  for (const directive of directives) {\n    const {\n      comment,\n      name\n    } = directive;\n\n    switch (name) {\n      case 'TAG':\n        try {\n          doc.tagPrefixes.push(resolveTagDirective(doc, directive));\n        } catch (error) {\n          doc.errors.push(error);\n        }\n\n        hasDirectives = true;\n        break;\n\n      case 'YAML':\n      case 'YAML:1.0':\n        if (doc.version) {\n          const msg = 'The %YAML directive must only be given at most once per document.';\n          doc.errors.push(new PlainValue.YAMLSemanticError(directive, msg));\n        }\n\n        try {\n          doc.version = resolveYamlDirective(doc, directive);\n        } catch (error) {\n          doc.errors.push(error);\n        }\n\n        hasDirectives = true;\n        break;\n\n      default:\n        if (name) {\n          const msg = `YAML only supports %TAG and %YAML directives, and not %${name}`;\n          doc.warnings.push(new PlainValue.YAMLWarning(directive, msg));\n        }\n\n    }\n\n    if (comment) directiveComments.push(comment);\n  }\n\n  if (prevDoc && !hasDirectives && '1.1' === (doc.version || prevDoc.version || doc.options.version)) {\n    const copyTagPrefix = ({\n      handle,\n      prefix\n    }) => ({\n      handle,\n      prefix\n    });\n\n    doc.tagPrefixes = prevDoc.tagPrefixes.map(copyTagPrefix);\n    doc.version = prevDoc.version;\n  }\n\n  doc.commentBefore = directiveComments.join('\\n') || null;\n}\n\nfunction assertCollection(contents) {\n  if (contents instanceof resolveSeq.Collection) return true;\n  throw new Error('Expected a YAML collection as document contents');\n}\n\nclass Document {\n  constructor(options) {\n    this.anchors = new Anchors(options.anchorPrefix);\n    this.commentBefore = null;\n    this.comment = null;\n    this.contents = 
null;\n    this.directivesEndMarker = null;\n    this.errors = [];\n    this.options = options;\n    this.schema = null;\n    this.tagPrefixes = [];\n    this.version = null;\n    this.warnings = [];\n  }\n\n  add(value) {\n    assertCollection(this.contents);\n    return this.contents.add(value);\n  }\n\n  addIn(path, value) {\n    assertCollection(this.contents);\n    this.contents.addIn(path, value);\n  }\n\n  delete(key) {\n    assertCollection(this.contents);\n    return this.contents.delete(key);\n  }\n\n  deleteIn(path) {\n    if (resolveSeq.isEmptyPath(path)) {\n      if (this.contents == null) return false;\n      this.contents = null;\n      return true;\n    }\n\n    assertCollection(this.contents);\n    return this.contents.deleteIn(path);\n  }\n\n  getDefaults() {\n    return Document.defaults[this.version] || Document.defaults[this.options.version] || {};\n  }\n\n  get(key, keepScalar) {\n    return this.contents instanceof resolveSeq.Collection ? this.contents.get(key, keepScalar) : undefined;\n  }\n\n  getIn(path, keepScalar) {\n    if (resolveSeq.isEmptyPath(path)) return !keepScalar && this.contents instanceof resolveSeq.Scalar ? this.contents.value : this.contents;\n    return this.contents instanceof resolveSeq.Collection ? this.contents.getIn(path, keepScalar) : undefined;\n  }\n\n  has(key) {\n    return this.contents instanceof resolveSeq.Collection ? this.contents.has(key) : false;\n  }\n\n  hasIn(path) {\n    if (resolveSeq.isEmptyPath(path)) return this.contents !== undefined;\n    return this.contents instanceof resolveSeq.Collection ? 
this.contents.hasIn(path) : false;\n  }\n\n  set(key, value) {\n    assertCollection(this.contents);\n    this.contents.set(key, value);\n  }\n\n  setIn(path, value) {\n    if (resolveSeq.isEmptyPath(path)) this.contents = value;else {\n      assertCollection(this.contents);\n      this.contents.setIn(path, value);\n    }\n  }\n\n  setSchema(id, customTags) {\n    if (!id && !customTags && this.schema) return;\n    if (typeof id === 'number') id = id.toFixed(1);\n\n    if (id === '1.0' || id === '1.1' || id === '1.2') {\n      if (this.version) this.version = id;else this.options.version = id;\n      delete this.options.schema;\n    } else if (id && typeof id === 'string') {\n      this.options.schema = id;\n    }\n\n    if (Array.isArray(customTags)) this.options.customTags = customTags;\n    const opt = Object.assign({}, this.getDefaults(), this.options);\n    this.schema = new Schema.Schema(opt);\n  }\n\n  parse(node, prevDoc) {\n    if (this.options.keepCstNodes) this.cstNode = node;\n    if (this.options.keepNodeTypes) this.type = 'DOCUMENT';\n    const {\n      directives = [],\n      contents = [],\n      directivesEndMarker,\n      error,\n      valueRange\n    } = node;\n\n    if (error) {\n      if (!error.source) error.source = this;\n      this.errors.push(error);\n    }\n\n    parseDirectives(this, directives, prevDoc);\n    if (directivesEndMarker) this.directivesEndMarker = true;\n    this.range = valueRange ? 
[valueRange.start, valueRange.end] : null;\n    this.setSchema();\n    this.anchors._cstAliases = [];\n    parseContents(this, contents);\n    this.anchors.resolveNodes();\n\n    if (this.options.prettyErrors) {\n      for (const error of this.errors) if (error instanceof PlainValue.YAMLError) error.makePretty();\n\n      for (const warn of this.warnings) if (warn instanceof PlainValue.YAMLError) warn.makePretty();\n    }\n\n    return this;\n  }\n\n  listNonDefaultTags() {\n    return listTagNames(this.contents).filter(t => t.indexOf(Schema.Schema.defaultPrefix) !== 0);\n  }\n\n  setTagPrefix(handle, prefix) {\n    if (handle[0] !== '!' || handle[handle.length - 1] !== '!') throw new Error('Handle must start and end with !');\n\n    if (prefix) {\n      const prev = this.tagPrefixes.find(p => p.handle === handle);\n      if (prev) prev.prefix = prefix;else this.tagPrefixes.push({\n        handle,\n        prefix\n      });\n    } else {\n      this.tagPrefixes = this.tagPrefixes.filter(p => p.handle !== handle);\n    }\n  }\n\n  toJSON(arg, onAnchor) {\n    const {\n      keepBlobsInJSON,\n      mapAsMap,\n      maxAliasCount\n    } = this.options;\n    const keep = keepBlobsInJSON && (typeof arg !== 'string' || !(this.contents instanceof resolveSeq.Scalar));\n    const ctx = {\n      doc: this,\n      indentStep: '  ',\n      keep,\n      mapAsMap: keep && !!mapAsMap,\n      maxAliasCount,\n      stringify // Requiring directly in Pair would create circular dependencies\n\n    };\n    const anchorNames = Object.keys(this.anchors.map);\n    if (anchorNames.length > 0) ctx.anchors = new Map(anchorNames.map(name => [this.anchors.map[name], {\n      alias: [],\n      aliasCount: 0,\n      count: 1\n    }]));\n    const res = resolveSeq.toJSON(this.contents, arg, ctx);\n    if (typeof onAnchor === 'function' && ctx.anchors) for (const {\n      count,\n      res\n    } of ctx.anchors.values()) onAnchor(res, count);\n    return res;\n  }\n\n  toString() {\n    if 
(this.errors.length > 0) throw new Error('Document with errors cannot be stringified');\n    const indentSize = this.options.indent;\n\n    if (!Number.isInteger(indentSize) || indentSize <= 0) {\n      const s = JSON.stringify(indentSize);\n      throw new Error(`\"indent\" option must be a positive integer, not ${s}`);\n    }\n\n    this.setSchema();\n    const lines = [];\n    let hasDirectives = false;\n\n    if (this.version) {\n      let vd = '%YAML 1.2';\n\n      if (this.schema.name === 'yaml-1.1') {\n        if (this.version === '1.0') vd = '%YAML:1.0';else if (this.version === '1.1') vd = '%YAML 1.1';\n      }\n\n      lines.push(vd);\n      hasDirectives = true;\n    }\n\n    const tagNames = this.listNonDefaultTags();\n    this.tagPrefixes.forEach(({\n      handle,\n      prefix\n    }) => {\n      if (tagNames.some(t => t.indexOf(prefix) === 0)) {\n        lines.push(`%TAG ${handle} ${prefix}`);\n        hasDirectives = true;\n      }\n    });\n    if (hasDirectives || this.directivesEndMarker) lines.push('---');\n\n    if (this.commentBefore) {\n      if (hasDirectives || !this.directivesEndMarker) lines.unshift('');\n      lines.unshift(this.commentBefore.replace(/^/gm, '#'));\n    }\n\n    const ctx = {\n      anchors: {},\n      doc: this,\n      indent: '',\n      indentStep: ' '.repeat(indentSize),\n      stringify // Requiring directly in nodes would create circular dependencies\n\n    };\n    let chompKeep = false;\n    let contentComment = null;\n\n    if (this.contents) {\n      if (this.contents instanceof resolveSeq.Node) {\n        if (this.contents.spaceBefore && (hasDirectives || this.directivesEndMarker)) lines.push('');\n        if (this.contents.commentBefore) lines.push(this.contents.commentBefore.replace(/^/gm, '#')); // top-level block scalars need to be indented if followed by a comment\n\n        ctx.forceBlockIndent = !!this.comment;\n        contentComment = this.contents.comment;\n      }\n\n      const onChompKeep = 
contentComment ? null : () => chompKeep = true;\n      const body = stringify(this.contents, ctx, () => contentComment = null, onChompKeep);\n      lines.push(resolveSeq.addComment(body, '', contentComment));\n    } else if (this.contents !== undefined) {\n      lines.push(stringify(this.contents, ctx));\n    }\n\n    if (this.comment) {\n      if ((!chompKeep || contentComment) && lines[lines.length - 1] !== '') lines.push('');\n      lines.push(this.comment.replace(/^/gm, '#'));\n    }\n\n    return lines.join('\\n') + '\\n';\n  }\n\n}\n\nPlainValue._defineProperty(Document, \"defaults\", documentOptions);\n\nexports.Document = Document;\nexports.defaultOptions = defaultOptions;\nexports.scalarOptions = scalarOptions;\n\n\n/***/ }),\n\n/***/ 5215:\n/***/ ((__unused_webpack_module, exports) => {\n\n\"use strict\";\n\n\nconst Char = {\n  ANCHOR: '&',\n  COMMENT: '#',\n  TAG: '!',\n  DIRECTIVES_END: '-',\n  DOCUMENT_END: '.'\n};\nconst Type = {\n  ALIAS: 'ALIAS',\n  BLANK_LINE: 'BLANK_LINE',\n  BLOCK_FOLDED: 'BLOCK_FOLDED',\n  BLOCK_LITERAL: 'BLOCK_LITERAL',\n  COMMENT: 'COMMENT',\n  DIRECTIVE: 'DIRECTIVE',\n  DOCUMENT: 'DOCUMENT',\n  FLOW_MAP: 'FLOW_MAP',\n  FLOW_SEQ: 'FLOW_SEQ',\n  MAP: 'MAP',\n  MAP_KEY: 'MAP_KEY',\n  MAP_VALUE: 'MAP_VALUE',\n  PLAIN: 'PLAIN',\n  QUOTE_DOUBLE: 'QUOTE_DOUBLE',\n  QUOTE_SINGLE: 'QUOTE_SINGLE',\n  SEQ: 'SEQ',\n  SEQ_ITEM: 'SEQ_ITEM'\n};\nconst defaultTagPrefix = 'tag:yaml.org,2002:';\nconst defaultTags = {\n  MAP: 'tag:yaml.org,2002:map',\n  SEQ: 'tag:yaml.org,2002:seq',\n  STR: 'tag:yaml.org,2002:str'\n};\n\nfunction findLineStarts(src) {\n  const ls = [0];\n  let offset = src.indexOf('\\n');\n\n  while (offset !== -1) {\n    offset += 1;\n    ls.push(offset);\n    offset = src.indexOf('\\n', offset);\n  }\n\n  return ls;\n}\n\nfunction getSrcInfo(cst) {\n  let lineStarts, src;\n\n  if (typeof cst === 'string') {\n    lineStarts = findLineStarts(cst);\n    src = cst;\n  } else {\n    if (Array.isArray(cst)) cst = cst[0];\n\n    if 
(cst && cst.context) {\n      if (!cst.lineStarts) cst.lineStarts = findLineStarts(cst.context.src);\n      lineStarts = cst.lineStarts;\n      src = cst.context.src;\n    }\n  }\n\n  return {\n    lineStarts,\n    src\n  };\n}\n/**\n * @typedef {Object} LinePos - One-indexed position in the source\n * @property {number} line\n * @property {number} col\n */\n\n/**\n * Determine the line/col position matching a character offset.\n *\n * Accepts a source string or a CST document as the second parameter. With\n * the latter, starting indices for lines are cached in the document as\n * `lineStarts: number[]`.\n *\n * Returns a one-indexed `{ line, col }` location if found, or\n * `undefined` otherwise.\n *\n * @param {number} offset\n * @param {string|Document|Document[]} cst\n * @returns {?LinePos}\n */\n\n\nfunction getLinePos(offset, cst) {\n  if (typeof offset !== 'number' || offset < 0) return null;\n  const {\n    lineStarts,\n    src\n  } = getSrcInfo(cst);\n  if (!lineStarts || !src || offset > src.length) return null;\n\n  for (let i = 0; i < lineStarts.length; ++i) {\n    const start = lineStarts[i];\n\n    if (offset < start) {\n      return {\n        line: i,\n        col: offset - lineStarts[i - 1] + 1\n      };\n    }\n\n    if (offset === start) return {\n      line: i + 1,\n      col: 1\n    };\n  }\n\n  const line = lineStarts.length;\n  return {\n    line,\n    col: offset - lineStarts[line - 1] + 1\n  };\n}\n/**\n * Get a specified line from the source.\n *\n * Accepts a source string or a CST document as the second parameter. 
With\n * the latter, starting indices for lines are cached in the document as\n * `lineStarts: number[]`.\n *\n * Returns the line as a string if found, or `null` otherwise.\n *\n * @param {number} line One-indexed line number\n * @param {string|Document|Document[]} cst\n * @returns {?string}\n */\n\nfunction getLine(line, cst) {\n  const {\n    lineStarts,\n    src\n  } = getSrcInfo(cst);\n  if (!lineStarts || !(line >= 1) || line > lineStarts.length) return null;\n  const start = lineStarts[line - 1];\n  let end = lineStarts[line]; // undefined for last line; that's ok for slice()\n\n  while (end && end > start && src[end - 1] === '\\n') --end;\n\n  return src.slice(start, end);\n}\n/**\n * Pretty-print the starting line from the source indicated by the range `pos`\n *\n * Trims output to `maxWidth` chars while keeping the starting column visible,\n * using `…` at either end to indicate dropped characters.\n *\n * Returns a two-line string (or `null`) with `\\n` as separator; the second line\n * will hold appropriately indented `^` marks indicating the column range.\n *\n * @param {Object} pos\n * @param {LinePos} pos.start\n * @param {LinePos} [pos.end]\n * @param {string|Document|Document[]} cst\n * @param {number} [maxWidth=80]\n * @returns {?string}\n */\n\nfunction getPrettyContext({\n  start,\n  end\n}, cst, maxWidth = 80) {\n  let src = getLine(start.line, cst);\n  if (!src) return null;\n  let {\n    col\n  } = start;\n\n  if (src.length > maxWidth) {\n    if (col <= maxWidth - 10) {\n      src = src.substr(0, maxWidth - 1) + '…';\n    } else {\n      const halfWidth = Math.round(maxWidth / 2);\n      if (src.length > col + halfWidth) src = src.substr(0, col + halfWidth - 1) + '…';\n      col -= src.length - maxWidth;\n      src = '…' + src.substr(1 - maxWidth);\n    }\n  }\n\n  let errLen = 1;\n  let errEnd = '';\n\n  if (end) {\n    if (end.line === start.line && col + (end.col - start.col) <= maxWidth + 1) {\n      errLen = end.col - start.col;\n    } 
else {\n      errLen = Math.min(src.length + 1, maxWidth) - col;\n      errEnd = '…';\n    }\n  }\n\n  const offset = col > 1 ? ' '.repeat(col - 1) : '';\n  const err = '^'.repeat(errLen);\n  return `${src}\\n${offset}${err}${errEnd}`;\n}\n\nclass Range {\n  static copy(orig) {\n    return new Range(orig.start, orig.end);\n  }\n\n  constructor(start, end) {\n    this.start = start;\n    this.end = end || start;\n  }\n\n  isEmpty() {\n    return typeof this.start !== 'number' || !this.end || this.end <= this.start;\n  }\n  /**\n   * Set `origStart` and `origEnd` to point to the original source range for\n   * this node, which may differ due to dropped CR characters.\n   *\n   * @param {number[]} cr - Positions of dropped CR characters\n   * @param {number} offset - Starting index of `cr` from the last call\n   * @returns {number} - The next offset, matching the one found for `origStart`\n   */\n\n\n  setOrigRange(cr, offset) {\n    const {\n      start,\n      end\n    } = this;\n\n    if (cr.length === 0 || end <= cr[0]) {\n      this.origStart = start;\n      this.origEnd = end;\n      return offset;\n    }\n\n    let i = offset;\n\n    while (i < cr.length) {\n      if (cr[i] > start) break;else ++i;\n    }\n\n    this.origStart = start + i;\n    const nextOffset = i;\n\n    while (i < cr.length) {\n      // if end was at \\n, it should now be at \\r\n      if (cr[i] >= end) break;else ++i;\n    }\n\n    this.origEnd = end + i;\n    return nextOffset;\n  }\n\n}\n\n/** Root class of all nodes */\n\nclass Node {\n  static addStringTerminator(src, offset, str) {\n    if (str[str.length - 1] === '\\n') return str;\n    const next = Node.endOfWhiteSpace(src, offset);\n    return next >= src.length || src[next] === '\\n' ? 
str + '\\n' : str;\n  } // ^(---|...)\n\n\n  static atDocumentBoundary(src, offset, sep) {\n    const ch0 = src[offset];\n    if (!ch0) return true;\n    const prev = src[offset - 1];\n    if (prev && prev !== '\\n') return false;\n\n    if (sep) {\n      if (ch0 !== sep) return false;\n    } else {\n      if (ch0 !== Char.DIRECTIVES_END && ch0 !== Char.DOCUMENT_END) return false;\n    }\n\n    const ch1 = src[offset + 1];\n    const ch2 = src[offset + 2];\n    if (ch1 !== ch0 || ch2 !== ch0) return false;\n    const ch3 = src[offset + 3];\n    return !ch3 || ch3 === '\\n' || ch3 === '\\t' || ch3 === ' ';\n  }\n\n  static endOfIdentifier(src, offset) {\n    let ch = src[offset];\n    const isVerbatim = ch === '<';\n    const notOk = isVerbatim ? ['\\n', '\\t', ' ', '>'] : ['\\n', '\\t', ' ', '[', ']', '{', '}', ','];\n\n    while (ch && notOk.indexOf(ch) === -1) ch = src[offset += 1];\n\n    if (isVerbatim && ch === '>') offset += 1;\n    return offset;\n  }\n\n  static endOfIndent(src, offset) {\n    let ch = src[offset];\n\n    while (ch === ' ') ch = src[offset += 1];\n\n    return offset;\n  }\n\n  static endOfLine(src, offset) {\n    let ch = src[offset];\n\n    while (ch && ch !== '\\n') ch = src[offset += 1];\n\n    return offset;\n  }\n\n  static endOfWhiteSpace(src, offset) {\n    let ch = src[offset];\n\n    while (ch === '\\t' || ch === ' ') ch = src[offset += 1];\n\n    return offset;\n  }\n\n  static startOfLine(src, offset) {\n    let ch = src[offset - 1];\n    if (ch === '\\n') return offset;\n\n    while (ch && ch !== '\\n') ch = src[offset -= 1];\n\n    return offset + 1;\n  }\n  /**\n   * End of indentation, or null if the line's indent level is not more\n   * than `indent`\n   *\n   * @param {string} src\n   * @param {number} indent\n   * @param {number} lineStart\n   * @returns {?number}\n   */\n\n\n  static endOfBlockIndent(src, indent, lineStart) {\n    const inEnd = Node.endOfIndent(src, lineStart);\n\n    if (inEnd > lineStart + indent) {\n  
    return inEnd;\n    } else {\n      const wsEnd = Node.endOfWhiteSpace(src, inEnd);\n      const ch = src[wsEnd];\n      if (!ch || ch === '\\n') return wsEnd;\n    }\n\n    return null;\n  }\n\n  static atBlank(src, offset, endAsBlank) {\n    const ch = src[offset];\n    return ch === '\\n' || ch === '\\t' || ch === ' ' || endAsBlank && !ch;\n  }\n\n  static nextNodeIsIndented(ch, indentDiff, indicatorAsIndent) {\n    if (!ch || indentDiff < 0) return false;\n    if (indentDiff > 0) return true;\n    return indicatorAsIndent && ch === '-';\n  } // should be at line or string end, or at next non-whitespace char\n\n\n  static normalizeOffset(src, offset) {\n    const ch = src[offset];\n    return !ch ? offset : ch !== '\\n' && src[offset - 1] === '\\n' ? offset - 1 : Node.endOfWhiteSpace(src, offset);\n  } // fold single newline into space, multiple newlines to N - 1 newlines\n  // presumes src[offset] === '\\n'\n\n\n  static foldNewline(src, offset, indent) {\n    let inCount = 0;\n    let error = false;\n    let fold = '';\n    let ch = src[offset + 1];\n\n    while (ch === ' ' || ch === '\\t' || ch === '\\n') {\n      switch (ch) {\n        case '\\n':\n          inCount = 0;\n          offset += 1;\n          fold += '\\n';\n          break;\n\n        case '\\t':\n          if (inCount <= indent) error = true;\n          offset = Node.endOfWhiteSpace(src, offset + 2) - 1;\n          break;\n\n        case ' ':\n          inCount += 1;\n          offset += 1;\n          break;\n      }\n\n      ch = src[offset + 1];\n    }\n\n    if (!fold) fold = ' ';\n    if (ch && inCount <= indent) error = true;\n    return {\n      fold,\n      offset,\n      error\n    };\n  }\n\n  constructor(type, props, context) {\n    Object.defineProperty(this, 'context', {\n      value: context || null,\n      writable: true\n    });\n    this.error = null;\n    this.range = null;\n    this.valueRange = null;\n    this.props = props || [];\n    this.type = type;\n    this.value = 
null;\n  }\n\n  getPropValue(idx, key, skipKey) {\n    if (!this.context) return null;\n    const {\n      src\n    } = this.context;\n    const prop = this.props[idx];\n    return prop && src[prop.start] === key ? src.slice(prop.start + (skipKey ? 1 : 0), prop.end) : null;\n  }\n\n  get anchor() {\n    for (let i = 0; i < this.props.length; ++i) {\n      const anchor = this.getPropValue(i, Char.ANCHOR, true);\n      if (anchor != null) return anchor;\n    }\n\n    return null;\n  }\n\n  get comment() {\n    const comments = [];\n\n    for (let i = 0; i < this.props.length; ++i) {\n      const comment = this.getPropValue(i, Char.COMMENT, true);\n      if (comment != null) comments.push(comment);\n    }\n\n    return comments.length > 0 ? comments.join('\\n') : null;\n  }\n\n  commentHasRequiredWhitespace(start) {\n    const {\n      src\n    } = this.context;\n    if (this.header && start === this.header.end) return false;\n    if (!this.valueRange) return false;\n    const {\n      end\n    } = this.valueRange;\n    return start !== end || Node.atBlank(src, end - 1);\n  }\n\n  get hasComment() {\n    if (this.context) {\n      const {\n        src\n      } = this.context;\n\n      for (let i = 0; i < this.props.length; ++i) {\n        if (src[this.props[i].start] === Char.COMMENT) return true;\n      }\n    }\n\n    return false;\n  }\n\n  get hasProps() {\n    if (this.context) {\n      const {\n        src\n      } = this.context;\n\n      for (let i = 0; i < this.props.length; ++i) {\n        if (src[this.props[i].start] !== Char.COMMENT) return true;\n      }\n    }\n\n    return false;\n  }\n\n  get includesTrailingLines() {\n    return false;\n  }\n\n  get jsonLike() {\n    const jsonLikeTypes = [Type.FLOW_MAP, Type.FLOW_SEQ, Type.QUOTE_DOUBLE, Type.QUOTE_SINGLE];\n    return jsonLikeTypes.indexOf(this.type) !== -1;\n  }\n\n  get rangeAsLinePos() {\n    if (!this.range || !this.context) return undefined;\n    const start = getLinePos(this.range.start, 
this.context.root);\n    if (!start) return undefined;\n    const end = getLinePos(this.range.end, this.context.root);\n    return {\n      start,\n      end\n    };\n  }\n\n  get rawValue() {\n    if (!this.valueRange || !this.context) return null;\n    const {\n      start,\n      end\n    } = this.valueRange;\n    return this.context.src.slice(start, end);\n  }\n\n  get tag() {\n    for (let i = 0; i < this.props.length; ++i) {\n      const tag = this.getPropValue(i, Char.TAG, false);\n\n      if (tag != null) {\n        if (tag[1] === '<') {\n          return {\n            verbatim: tag.slice(2, -1)\n          };\n        } else {\n          // eslint-disable-next-line no-unused-vars\n          const [_, handle, suffix] = tag.match(/^(.*!)([^!]*)$/);\n          return {\n            handle,\n            suffix\n          };\n        }\n      }\n    }\n\n    return null;\n  }\n\n  get valueRangeContainsNewline() {\n    if (!this.valueRange || !this.context) return false;\n    const {\n      start,\n      end\n    } = this.valueRange;\n    const {\n      src\n    } = this.context;\n\n    for (let i = start; i < end; ++i) {\n      if (src[i] === '\\n') return true;\n    }\n\n    return false;\n  }\n\n  parseComment(start) {\n    const {\n      src\n    } = this.context;\n\n    if (src[start] === Char.COMMENT) {\n      const end = Node.endOfLine(src, start + 1);\n      const commentRange = new Range(start, end);\n      this.props.push(commentRange);\n      return end;\n    }\n\n    return start;\n  }\n  /**\n   * Populates the `origStart` and `origEnd` values of all ranges for this\n   * node. 
Extended by child classes to handle descendant nodes.\n   *\n   * @param {number[]} cr - Positions of dropped CR characters\n   * @param {number} offset - Starting index of `cr` from the last call\n   * @returns {number} - The next offset, matching the one found for `origStart`\n   */\n\n\n  setOrigRanges(cr, offset) {\n    if (this.range) offset = this.range.setOrigRange(cr, offset);\n    if (this.valueRange) this.valueRange.setOrigRange(cr, offset);\n    this.props.forEach(prop => prop.setOrigRange(cr, offset));\n    return offset;\n  }\n\n  toString() {\n    const {\n      context: {\n        src\n      },\n      range,\n      value\n    } = this;\n    if (value != null) return value;\n    const str = src.slice(range.start, range.end);\n    return Node.addStringTerminator(src, range.end, str);\n  }\n\n}\n\nclass YAMLError extends Error {\n  constructor(name, source, message) {\n    if (!message || !(source instanceof Node)) throw new Error(`Invalid arguments for new ${name}`);\n    super();\n    this.name = name;\n    this.message = message;\n    this.source = source;\n  }\n\n  makePretty() {\n    if (!this.source) return;\n    this.nodeType = this.source.type;\n    const cst = this.source.context && this.source.context.root;\n\n    if (typeof this.offset === 'number') {\n      this.range = new Range(this.offset, this.offset + 1);\n      const start = cst && getLinePos(this.offset, cst);\n\n      if (start) {\n        const end = {\n          line: start.line,\n          col: start.col + 1\n        };\n        this.linePos = {\n          start,\n          end\n        };\n      }\n\n      delete this.offset;\n    } else {\n      this.range = this.source.range;\n      this.linePos = this.source.rangeAsLinePos;\n    }\n\n    if (this.linePos) {\n      const {\n        line,\n        col\n      } = this.linePos.start;\n      this.message += ` at line ${line}, column ${col}`;\n      const ctx = cst && getPrettyContext(this.linePos, cst);\n      if (ctx) this.message 
+= `:\\n\\n${ctx}\\n`;\n    }\n\n    delete this.source;\n  }\n\n}\nclass YAMLReferenceError extends YAMLError {\n  constructor(source, message) {\n    super('YAMLReferenceError', source, message);\n  }\n\n}\nclass YAMLSemanticError extends YAMLError {\n  constructor(source, message) {\n    super('YAMLSemanticError', source, message);\n  }\n\n}\nclass YAMLSyntaxError extends YAMLError {\n  constructor(source, message) {\n    super('YAMLSyntaxError', source, message);\n  }\n\n}\nclass YAMLWarning extends YAMLError {\n  constructor(source, message) {\n    super('YAMLWarning', source, message);\n  }\n\n}\n\nfunction _defineProperty(obj, key, value) {\n  if (key in obj) {\n    Object.defineProperty(obj, key, {\n      value: value,\n      enumerable: true,\n      configurable: true,\n      writable: true\n    });\n  } else {\n    obj[key] = value;\n  }\n\n  return obj;\n}\n\nclass PlainValue extends Node {\n  static endOfLine(src, start, inFlow) {\n    let ch = src[start];\n    let offset = start;\n\n    while (ch && ch !== '\\n') {\n      if (inFlow && (ch === '[' || ch === ']' || ch === '{' || ch === '}' || ch === ',')) break;\n      const next = src[offset + 1];\n      if (ch === ':' && (!next || next === '\\n' || next === '\\t' || next === ' ' || inFlow && next === ',')) break;\n      if ((ch === ' ' || ch === '\\t') && next === '#') break;\n      offset += 1;\n      ch = next;\n    }\n\n    return offset;\n  }\n\n  get strValue() {\n    if (!this.valueRange || !this.context) return null;\n    let {\n      start,\n      end\n    } = this.valueRange;\n    const {\n      src\n    } = this.context;\n    let ch = src[end - 1];\n\n    while (start < end && (ch === '\\n' || ch === '\\t' || ch === ' ')) ch = src[--end - 1];\n\n    let str = '';\n\n    for (let i = start; i < end; ++i) {\n      const ch = src[i];\n\n      if (ch === '\\n') {\n        const {\n          fold,\n          offset\n        } = Node.foldNewline(src, i, -1);\n        str += fold;\n        i = 
offset;\n      } else if (ch === ' ' || ch === '\\t') {\n        // trim trailing whitespace\n        const wsStart = i;\n        let next = src[i + 1];\n\n        while (i < end && (next === ' ' || next === '\\t')) {\n          i += 1;\n          next = src[i + 1];\n        }\n\n        if (next !== '\\n') str += i > wsStart ? src.slice(wsStart, i + 1) : ch;\n      } else {\n        str += ch;\n      }\n    }\n\n    const ch0 = src[start];\n\n    switch (ch0) {\n      case '\\t':\n        {\n          const msg = 'Plain value cannot start with a tab character';\n          const errors = [new YAMLSemanticError(this, msg)];\n          return {\n            errors,\n            str\n          };\n        }\n\n      case '@':\n      case '`':\n        {\n          const msg = `Plain value cannot start with reserved character ${ch0}`;\n          const errors = [new YAMLSemanticError(this, msg)];\n          return {\n            errors,\n            str\n          };\n        }\n\n      default:\n        return str;\n    }\n  }\n\n  parseBlockValue(start) {\n    const {\n      indent,\n      inFlow,\n      src\n    } = this.context;\n    let offset = start;\n    let valueEnd = start;\n\n    for (let ch = src[offset]; ch === '\\n'; ch = src[offset]) {\n      if (Node.atDocumentBoundary(src, offset + 1)) break;\n      const end = Node.endOfBlockIndent(src, indent, offset + 1);\n      if (end === null || src[end] === '#') break;\n\n      if (src[end] === '\\n') {\n        offset = end;\n      } else {\n        valueEnd = PlainValue.endOfLine(src, end, inFlow);\n        offset = valueEnd;\n      }\n    }\n\n    if (this.valueRange.isEmpty()) this.valueRange.start = start;\n    this.valueRange.end = valueEnd;\n    return valueEnd;\n  }\n  /**\n   * Parses a plain value from the source\n   *\n   * Accepted forms are:\n   * ```\n   * #comment\n   *\n   * first line\n   *\n   * first line #comment\n   *\n   * first line\n   * block\n   * lines\n   *\n   * #comment\n   * block\n 
  * lines\n   * ```\n   * where block lines are empty or have an indent level greater than `indent`.\n   *\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this scalar, may be `\\n`\n   */\n\n\n  parse(context, start) {\n    this.context = context;\n    const {\n      inFlow,\n      src\n    } = context;\n    let offset = start;\n    const ch = src[offset];\n\n    if (ch && ch !== '#' && ch !== '\\n') {\n      offset = PlainValue.endOfLine(src, start, inFlow);\n    }\n\n    this.valueRange = new Range(start, offset);\n    offset = Node.endOfWhiteSpace(src, offset);\n    offset = this.parseComment(offset);\n\n    if (!this.hasComment || this.valueRange.isEmpty()) {\n      offset = this.parseBlockValue(offset);\n    }\n\n    return offset;\n  }\n\n}\n\nexports.Char = Char;\nexports.Node = Node;\nexports.PlainValue = PlainValue;\nexports.Range = Range;\nexports.Type = Type;\nexports.YAMLError = YAMLError;\nexports.YAMLReferenceError = YAMLReferenceError;\nexports.YAMLSemanticError = YAMLSemanticError;\nexports.YAMLSyntaxError = YAMLSyntaxError;\nexports.YAMLWarning = YAMLWarning;\nexports._defineProperty = _defineProperty;\nexports.defaultTagPrefix = defaultTagPrefix;\nexports.defaultTags = defaultTags;\n\n\n/***/ }),\n\n/***/ 3656:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nvar PlainValue = __webpack_require__(5215);\nvar resolveSeq = __webpack_require__(6140);\nvar warnings = __webpack_require__(7383);\n\nfunction createMap(schema, obj, ctx) {\n  const map = new resolveSeq.YAMLMap(schema);\n\n  if (obj instanceof Map) {\n    for (const [key, value] of obj) map.items.push(schema.createPair(key, value, ctx));\n  } else if (obj && typeof obj === 'object') {\n    for (const key of Object.keys(obj)) map.items.push(schema.createPair(key, obj[key], ctx));\n  }\n\n  if (typeof schema.sortMapEntries === 'function') {\n    
map.items.sort(schema.sortMapEntries);\n  }\n\n  return map;\n}\n\nconst map = {\n  createNode: createMap,\n  default: true,\n  nodeClass: resolveSeq.YAMLMap,\n  tag: 'tag:yaml.org,2002:map',\n  resolve: resolveSeq.resolveMap\n};\n\nfunction createSeq(schema, obj, ctx) {\n  const seq = new resolveSeq.YAMLSeq(schema);\n\n  if (obj && obj[Symbol.iterator]) {\n    for (const it of obj) {\n      const v = schema.createNode(it, ctx.wrapScalars, null, ctx);\n      seq.items.push(v);\n    }\n  }\n\n  return seq;\n}\n\nconst seq = {\n  createNode: createSeq,\n  default: true,\n  nodeClass: resolveSeq.YAMLSeq,\n  tag: 'tag:yaml.org,2002:seq',\n  resolve: resolveSeq.resolveSeq\n};\n\nconst string = {\n  identify: value => typeof value === 'string',\n  default: true,\n  tag: 'tag:yaml.org,2002:str',\n  resolve: resolveSeq.resolveString,\n\n  stringify(item, ctx, onComment, onChompKeep) {\n    ctx = Object.assign({\n      actualString: true\n    }, ctx);\n    return resolveSeq.stringifyString(item, ctx, onComment, onChompKeep);\n  },\n\n  options: resolveSeq.strOptions\n};\n\nconst failsafe = [map, seq, string];\n\n/* global BigInt */\n\nconst intIdentify = value => typeof value === 'bigint' || Number.isInteger(value);\n\nconst intResolve = (src, part, radix) => resolveSeq.intOptions.asBigInt ? BigInt(src) : parseInt(part, radix);\n\nfunction intStringify(node, radix, prefix) {\n  const {\n    value\n  } = node;\n  if (intIdentify(value) && value >= 0) return prefix + value.toString(radix);\n  return resolveSeq.stringifyNumber(node);\n}\n\nconst nullObj = {\n  identify: value => value == null,\n  createNode: (schema, value, ctx) => ctx.wrapScalars ? 
new resolveSeq.Scalar(null) : null,\n  default: true,\n  tag: 'tag:yaml.org,2002:null',\n  test: /^(?:~|[Nn]ull|NULL)?$/,\n  resolve: () => null,\n  options: resolveSeq.nullOptions,\n  stringify: () => resolveSeq.nullOptions.nullStr\n};\nconst boolObj = {\n  identify: value => typeof value === 'boolean',\n  default: true,\n  tag: 'tag:yaml.org,2002:bool',\n  test: /^(?:[Tt]rue|TRUE|[Ff]alse|FALSE)$/,\n  resolve: str => str[0] === 't' || str[0] === 'T',\n  options: resolveSeq.boolOptions,\n  stringify: ({\n    value\n  }) => value ? resolveSeq.boolOptions.trueStr : resolveSeq.boolOptions.falseStr\n};\nconst octObj = {\n  identify: value => intIdentify(value) && value >= 0,\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  format: 'OCT',\n  test: /^0o([0-7]+)$/,\n  resolve: (str, oct) => intResolve(str, oct, 8),\n  options: resolveSeq.intOptions,\n  stringify: node => intStringify(node, 8, '0o')\n};\nconst intObj = {\n  identify: intIdentify,\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  test: /^[-+]?[0-9]+$/,\n  resolve: str => intResolve(str, str, 10),\n  options: resolveSeq.intOptions,\n  stringify: resolveSeq.stringifyNumber\n};\nconst hexObj = {\n  identify: value => intIdentify(value) && value >= 0,\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  format: 'HEX',\n  test: /^0x([0-9a-fA-F]+)$/,\n  resolve: (str, hex) => intResolve(str, hex, 16),\n  options: resolveSeq.intOptions,\n  stringify: node => intStringify(node, 16, '0x')\n};\nconst nanObj = {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:float',\n  test: /^(?:[-+]?\\.inf|(\\.nan))$/i,\n  resolve: (str, nan) => nan ? NaN : str[0] === '-' ? 
Number.NEGATIVE_INFINITY : Number.POSITIVE_INFINITY,\n  stringify: resolveSeq.stringifyNumber\n};\nconst expObj = {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:float',\n  format: 'EXP',\n  test: /^[-+]?(?:\\.[0-9]+|[0-9]+(?:\\.[0-9]*)?)[eE][-+]?[0-9]+$/,\n  resolve: str => parseFloat(str),\n  stringify: ({\n    value\n  }) => Number(value).toExponential()\n};\nconst floatObj = {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:float',\n  test: /^[-+]?(?:\\.([0-9]+)|[0-9]+\\.([0-9]*))$/,\n\n  resolve(str, frac1, frac2) {\n    const frac = frac1 || frac2;\n    const node = new resolveSeq.Scalar(parseFloat(str));\n    if (frac && frac[frac.length - 1] === '0') node.minFractionDigits = frac.length;\n    return node;\n  },\n\n  stringify: resolveSeq.stringifyNumber\n};\nconst core = failsafe.concat([nullObj, boolObj, octObj, intObj, hexObj, nanObj, expObj, floatObj]);\n\n/* global BigInt */\n\nconst intIdentify$1 = value => typeof value === 'bigint' || Number.isInteger(value);\n\nconst stringifyJSON = ({\n  value\n}) => JSON.stringify(value);\n\nconst json = [map, seq, {\n  identify: value => typeof value === 'string',\n  default: true,\n  tag: 'tag:yaml.org,2002:str',\n  resolve: resolveSeq.resolveString,\n  stringify: stringifyJSON\n}, {\n  identify: value => value == null,\n  createNode: (schema, value, ctx) => ctx.wrapScalars ? new resolveSeq.Scalar(null) : null,\n  default: true,\n  tag: 'tag:yaml.org,2002:null',\n  test: /^null$/,\n  resolve: () => null,\n  stringify: stringifyJSON\n}, {\n  identify: value => typeof value === 'boolean',\n  default: true,\n  tag: 'tag:yaml.org,2002:bool',\n  test: /^true|false$/,\n  resolve: str => str === 'true',\n  stringify: stringifyJSON\n}, {\n  identify: intIdentify$1,\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  test: /^-?(?:0|[1-9][0-9]*)$/,\n  resolve: str => resolveSeq.intOptions.asBigInt ? 
BigInt(str) : parseInt(str, 10),\n  stringify: ({\n    value\n  }) => intIdentify$1(value) ? value.toString() : JSON.stringify(value)\n}, {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:float',\n  test: /^-?(?:0|[1-9][0-9]*)(?:\\.[0-9]*)?(?:[eE][-+]?[0-9]+)?$/,\n  resolve: str => parseFloat(str),\n  stringify: stringifyJSON\n}];\n\njson.scalarFallback = str => {\n  throw new SyntaxError(`Unresolved plain scalar ${JSON.stringify(str)}`);\n};\n\n/* global BigInt */\n\nconst boolStringify = ({\n  value\n}) => value ? resolveSeq.boolOptions.trueStr : resolveSeq.boolOptions.falseStr;\n\nconst intIdentify$2 = value => typeof value === 'bigint' || Number.isInteger(value);\n\nfunction intResolve$1(sign, src, radix) {\n  let str = src.replace(/_/g, '');\n\n  if (resolveSeq.intOptions.asBigInt) {\n    switch (radix) {\n      case 2:\n        str = `0b${str}`;\n        break;\n\n      case 8:\n        str = `0o${str}`;\n        break;\n\n      case 16:\n        str = `0x${str}`;\n        break;\n    }\n\n    const n = BigInt(str);\n    return sign === '-' ? BigInt(-1) * n : n;\n  }\n\n  const n = parseInt(str, radix);\n  return sign === '-' ? -1 * n : n;\n}\n\nfunction intStringify$1(node, radix, prefix) {\n  const {\n    value\n  } = node;\n\n  if (intIdentify$2(value)) {\n    const str = value.toString(radix);\n    return value < 0 ? '-' + prefix + str.substr(1) : prefix + str;\n  }\n\n  return resolveSeq.stringifyNumber(node);\n}\n\nconst yaml11 = failsafe.concat([{\n  identify: value => value == null,\n  createNode: (schema, value, ctx) => ctx.wrapScalars ? 
new resolveSeq.Scalar(null) : null,\n  default: true,\n  tag: 'tag:yaml.org,2002:null',\n  test: /^(?:~|[Nn]ull|NULL)?$/,\n  resolve: () => null,\n  options: resolveSeq.nullOptions,\n  stringify: () => resolveSeq.nullOptions.nullStr\n}, {\n  identify: value => typeof value === 'boolean',\n  default: true,\n  tag: 'tag:yaml.org,2002:bool',\n  test: /^(?:Y|y|[Yy]es|YES|[Tt]rue|TRUE|[Oo]n|ON)$/,\n  resolve: () => true,\n  options: resolveSeq.boolOptions,\n  stringify: boolStringify\n}, {\n  identify: value => typeof value === 'boolean',\n  default: true,\n  tag: 'tag:yaml.org,2002:bool',\n  test: /^(?:N|n|[Nn]o|NO|[Ff]alse|FALSE|[Oo]ff|OFF)$/i,\n  resolve: () => false,\n  options: resolveSeq.boolOptions,\n  stringify: boolStringify\n}, {\n  identify: intIdentify$2,\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  format: 'BIN',\n  test: /^([-+]?)0b([0-1_]+)$/,\n  resolve: (str, sign, bin) => intResolve$1(sign, bin, 2),\n  stringify: node => intStringify$1(node, 2, '0b')\n}, {\n  identify: intIdentify$2,\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  format: 'OCT',\n  test: /^([-+]?)0([0-7_]+)$/,\n  resolve: (str, sign, oct) => intResolve$1(sign, oct, 8),\n  stringify: node => intStringify$1(node, 8, '0')\n}, {\n  identify: intIdentify$2,\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  test: /^([-+]?)([0-9][0-9_]*)$/,\n  resolve: (str, sign, abs) => intResolve$1(sign, abs, 10),\n  stringify: resolveSeq.stringifyNumber\n}, {\n  identify: intIdentify$2,\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  format: 'HEX',\n  test: /^([-+]?)0x([0-9a-fA-F_]+)$/,\n  resolve: (str, sign, hex) => intResolve$1(sign, hex, 16),\n  stringify: node => intStringify$1(node, 16, '0x')\n}, {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:float',\n  test: /^(?:[-+]?\\.inf|(\\.nan))$/i,\n  resolve: (str, nan) => nan ? NaN : str[0] === '-' ? 
Number.NEGATIVE_INFINITY : Number.POSITIVE_INFINITY,\n  stringify: resolveSeq.stringifyNumber\n}, {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:float',\n  format: 'EXP',\n  test: /^[-+]?([0-9][0-9_]*)?(\\.[0-9_]*)?[eE][-+]?[0-9]+$/,\n  resolve: str => parseFloat(str.replace(/_/g, '')),\n  stringify: ({\n    value\n  }) => Number(value).toExponential()\n}, {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:float',\n  test: /^[-+]?(?:[0-9][0-9_]*)?\\.([0-9_]*)$/,\n\n  resolve(str, frac) {\n    const node = new resolveSeq.Scalar(parseFloat(str.replace(/_/g, '')));\n\n    if (frac) {\n      const f = frac.replace(/_/g, '');\n      if (f[f.length - 1] === '0') node.minFractionDigits = f.length;\n    }\n\n    return node;\n  },\n\n  stringify: resolveSeq.stringifyNumber\n}], warnings.binary, warnings.omap, warnings.pairs, warnings.set, warnings.intTime, warnings.floatTime, warnings.timestamp);\n\nconst schemas = {\n  core,\n  failsafe,\n  json,\n  yaml11\n};\nconst tags = {\n  binary: warnings.binary,\n  bool: boolObj,\n  float: floatObj,\n  floatExp: expObj,\n  floatNaN: nanObj,\n  floatTime: warnings.floatTime,\n  int: intObj,\n  intHex: hexObj,\n  intOct: octObj,\n  intTime: warnings.intTime,\n  map,\n  null: nullObj,\n  omap: warnings.omap,\n  pairs: warnings.pairs,\n  seq,\n  set: warnings.set,\n  timestamp: warnings.timestamp\n};\n\nfunction findTagObject(value, tagName, tags) {\n  if (tagName) {\n    const match = tags.filter(t => t.tag === tagName);\n    const tagObj = match.find(t => !t.format) || match[0];\n    if (!tagObj) throw new Error(`Tag ${tagName} not found`);\n    return tagObj;\n  } // TODO: deprecate/remove class check\n\n\n  return tags.find(t => (t.identify && t.identify(value) || t.class && value instanceof t.class) && !t.format);\n}\n\nfunction createNode(value, tagName, ctx) {\n  if (value instanceof resolveSeq.Node) return value;\n  const {\n    
defaultPrefix,\n    onTagObj,\n    prevObjects,\n    schema,\n    wrapScalars\n  } = ctx;\n  if (tagName && tagName.startsWith('!!')) tagName = defaultPrefix + tagName.slice(2);\n  let tagObj = findTagObject(value, tagName, schema.tags);\n\n  if (!tagObj) {\n    if (typeof value.toJSON === 'function') value = value.toJSON();\n    if (typeof value !== 'object') return wrapScalars ? new resolveSeq.Scalar(value) : value;\n    tagObj = value instanceof Map ? map : value[Symbol.iterator] ? seq : map;\n  }\n\n  if (onTagObj) {\n    onTagObj(tagObj);\n    delete ctx.onTagObj;\n  } // Detect duplicate references to the same object & use Alias nodes for all\n  // after first. The `obj` wrapper allows for circular references to resolve.\n\n\n  const obj = {};\n\n  if (value && typeof value === 'object' && prevObjects) {\n    const prev = prevObjects.get(value);\n\n    if (prev) {\n      const alias = new resolveSeq.Alias(prev); // leaves source dirty; must be cleaned by caller\n\n      ctx.aliasNodes.push(alias); // defined along with prevObjects\n\n      return alias;\n    }\n\n    obj.value = value;\n    prevObjects.set(value, obj);\n  }\n\n  obj.node = tagObj.createNode ? tagObj.createNode(ctx.schema, value, ctx) : wrapScalars ? 
new resolveSeq.Scalar(value) : value;\n  if (tagName && obj.node instanceof resolveSeq.Node) obj.node.tag = tagName;\n  return obj.node;\n}\n\nfunction getSchemaTags(schemas, knownTags, customTags, schemaId) {\n  let tags = schemas[schemaId.replace(/\\W/g, '')]; // 'yaml-1.1' -> 'yaml11'\n\n  if (!tags) {\n    const keys = Object.keys(schemas).map(key => JSON.stringify(key)).join(', ');\n    throw new Error(`Unknown schema \"${schemaId}\"; use one of ${keys}`);\n  }\n\n  if (Array.isArray(customTags)) {\n    for (const tag of customTags) tags = tags.concat(tag);\n  } else if (typeof customTags === 'function') {\n    tags = customTags(tags.slice());\n  }\n\n  for (let i = 0; i < tags.length; ++i) {\n    const tag = tags[i];\n\n    if (typeof tag === 'string') {\n      const tagObj = knownTags[tag];\n\n      if (!tagObj) {\n        const keys = Object.keys(knownTags).map(key => JSON.stringify(key)).join(', ');\n        throw new Error(`Unknown custom tag \"${tag}\"; use one of ${keys}`);\n      }\n\n      tags[i] = tagObj;\n    }\n  }\n\n  return tags;\n}\n\nconst sortMapEntriesByKey = (a, b) => a.key < b.key ? -1 : a.key > b.key ? 1 : 0;\n\nclass Schema {\n  // TODO: remove in v2\n  // TODO: remove in v2\n  constructor({\n    customTags,\n    merge,\n    schema,\n    sortMapEntries,\n    tags: deprecatedCustomTags\n  }) {\n    this.merge = !!merge;\n    this.name = schema;\n    this.sortMapEntries = sortMapEntries === true ? sortMapEntriesByKey : sortMapEntries || null;\n    if (!customTags && deprecatedCustomTags) warnings.warnOptionDeprecation('tags', 'customTags');\n    this.tags = getSchemaTags(schemas, tags, customTags || deprecatedCustomTags, schema);\n  }\n\n  createNode(value, wrapScalars, tagName, ctx) {\n    const baseCtx = {\n      defaultPrefix: Schema.defaultPrefix,\n      schema: this,\n      wrapScalars\n    };\n    const createCtx = ctx ? 
Object.assign(ctx, baseCtx) : baseCtx;\n    return createNode(value, tagName, createCtx);\n  }\n\n  createPair(key, value, ctx) {\n    if (!ctx) ctx = {\n      wrapScalars: true\n    };\n    const k = this.createNode(key, ctx.wrapScalars, null, ctx);\n    const v = this.createNode(value, ctx.wrapScalars, null, ctx);\n    return new resolveSeq.Pair(k, v);\n  }\n\n}\n\nPlainValue._defineProperty(Schema, \"defaultPrefix\", PlainValue.defaultTagPrefix);\n\nPlainValue._defineProperty(Schema, \"defaultTags\", PlainValue.defaultTags);\n\nexports.Schema = Schema;\n\n\n/***/ }),\n\n/***/ 5065:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nvar PlainValue = __webpack_require__(5215);\nvar parseCst = __webpack_require__(445);\n__webpack_require__(6140);\nvar Document$1 = __webpack_require__(1983);\nvar Schema = __webpack_require__(3656);\nvar warnings = __webpack_require__(7383);\n\nfunction createNode(value, wrapScalars = true, tag) {\n  if (tag === undefined && typeof wrapScalars === 'string') {\n    tag = wrapScalars;\n    wrapScalars = true;\n  }\n\n  const options = Object.assign({}, Document$1.Document.defaults[Document$1.defaultOptions.version], Document$1.defaultOptions);\n  const schema = new Schema.Schema(options);\n  return schema.createNode(value, wrapScalars, tag);\n}\n\nclass Document extends Document$1.Document {\n  constructor(options) {\n    super(Object.assign({}, Document$1.defaultOptions, options));\n  }\n\n}\n\nfunction parseAllDocuments(src, options) {\n  const stream = [];\n  let prev;\n\n  for (const cstDoc of parseCst.parse(src)) {\n    const doc = new Document(options);\n    doc.parse(cstDoc, prev);\n    stream.push(doc);\n    prev = doc;\n  }\n\n  return stream;\n}\n\nfunction parseDocument(src, options) {\n  const cst = parseCst.parse(src);\n  const doc = new Document(options).parse(cst[0]);\n\n  if (cst.length > 1) {\n    const errMsg = 'Source contains multiple documents; please use 
YAML.parseAllDocuments()';\n    doc.errors.unshift(new PlainValue.YAMLSemanticError(cst[1], errMsg));\n  }\n\n  return doc;\n}\n\nfunction parse(src, options) {\n  const doc = parseDocument(src, options);\n  doc.warnings.forEach(warning => warnings.warn(warning));\n  if (doc.errors.length > 0) throw doc.errors[0];\n  return doc.toJSON();\n}\n\nfunction stringify(value, options) {\n  const doc = new Document(options);\n  doc.contents = value;\n  return String(doc);\n}\n\nconst YAML = {\n  createNode,\n  defaultOptions: Document$1.defaultOptions,\n  Document,\n  parse,\n  parseAllDocuments,\n  parseCST: parseCst.parse,\n  parseDocument,\n  scalarOptions: Document$1.scalarOptions,\n  stringify\n};\n\nexports.YAML = YAML;\n\n\n/***/ }),\n\n/***/ 445:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nvar PlainValue = __webpack_require__(5215);\n\nclass BlankLine extends PlainValue.Node {\n  constructor() {\n    super(PlainValue.Type.BLANK_LINE);\n  }\n  /* istanbul ignore next */\n\n\n  get includesTrailingLines() {\n    // This is never called from anywhere, but if it were,\n    // this is the value it should return.\n    return true;\n  }\n  /**\n   * Parses a blank line from the source\n   *\n   * @param {ParseContext} context\n   * @param {number} start - Index of first \\n character\n   * @returns {number} - Index of the character after this\n   */\n\n\n  parse(context, start) {\n    this.context = context;\n    this.range = new PlainValue.Range(start, start + 1);\n    return start + 1;\n  }\n\n}\n\nclass CollectionItem extends PlainValue.Node {\n  constructor(type, props) {\n    super(type, props);\n    this.node = null;\n  }\n\n  get includesTrailingLines() {\n    return !!this.node && this.node.includesTrailingLines;\n  }\n  /**\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this\n   */\n\n\n  parse(context, start) {\n    
this.context = context;\n    const {\n      parseNode,\n      src\n    } = context;\n    let {\n      atLineStart,\n      lineStart\n    } = context;\n    if (!atLineStart && this.type === PlainValue.Type.SEQ_ITEM) this.error = new PlainValue.YAMLSemanticError(this, 'Sequence items must not have preceding content on the same line');\n    const indent = atLineStart ? start - lineStart : context.indent;\n    let offset = PlainValue.Node.endOfWhiteSpace(src, start + 1);\n    let ch = src[offset];\n    const inlineComment = ch === '#';\n    const comments = [];\n    let blankLine = null;\n\n    while (ch === '\\n' || ch === '#') {\n      if (ch === '#') {\n        const end = PlainValue.Node.endOfLine(src, offset + 1);\n        comments.push(new PlainValue.Range(offset, end));\n        offset = end;\n      } else {\n        atLineStart = true;\n        lineStart = offset + 1;\n        const wsEnd = PlainValue.Node.endOfWhiteSpace(src, lineStart);\n\n        if (src[wsEnd] === '\\n' && comments.length === 0) {\n          blankLine = new BlankLine();\n          lineStart = blankLine.parse({\n            src\n          }, lineStart);\n        }\n\n        offset = PlainValue.Node.endOfIndent(src, lineStart);\n      }\n\n      ch = src[offset];\n    }\n\n    if (PlainValue.Node.nextNodeIsIndented(ch, offset - (lineStart + indent), this.type !== PlainValue.Type.SEQ_ITEM)) {\n      this.node = parseNode({\n        atLineStart,\n        inCollection: false,\n        indent,\n        lineStart,\n        parent: this\n      }, offset);\n    } else if (ch && lineStart > start + 1) {\n      offset = lineStart - 1;\n    }\n\n    if (this.node) {\n      if (blankLine) {\n        // Only blank lines preceding non-empty nodes are captured. Note that\n        // this means that collection item range start indices do not always\n        // increase monotonically. 
-- eemeli/yaml#126\n        const items = context.parent.items || context.parent.contents;\n        if (items) items.push(blankLine);\n      }\n\n      if (comments.length) Array.prototype.push.apply(this.props, comments);\n      offset = this.node.range.end;\n    } else {\n      if (inlineComment) {\n        const c = comments[0];\n        this.props.push(c);\n        offset = c.end;\n      } else {\n        offset = PlainValue.Node.endOfLine(src, start + 1);\n      }\n    }\n\n    const end = this.node ? this.node.valueRange.end : offset;\n    this.valueRange = new PlainValue.Range(start, end);\n    return offset;\n  }\n\n  setOrigRanges(cr, offset) {\n    offset = super.setOrigRanges(cr, offset);\n    return this.node ? this.node.setOrigRanges(cr, offset) : offset;\n  }\n\n  toString() {\n    const {\n      context: {\n        src\n      },\n      node,\n      range,\n      value\n    } = this;\n    if (value != null) return value;\n    const str = node ? src.slice(range.start, node.range.start) + String(node) : src.slice(range.start, range.end);\n    return PlainValue.Node.addStringTerminator(src, range.end, str);\n  }\n\n}\n\nclass Comment extends PlainValue.Node {\n  constructor() {\n    super(PlainValue.Type.COMMENT);\n  }\n  /**\n   * Parses a comment line from the source\n   *\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this scalar\n   */\n\n\n  parse(context, start) {\n    this.context = context;\n    const offset = this.parseComment(start);\n    this.range = new PlainValue.Range(start, offset);\n    return offset;\n  }\n\n}\n\nfunction grabCollectionEndComments(node) {\n  let cnode = node;\n\n  while (cnode instanceof CollectionItem) cnode = cnode.node;\n\n  if (!(cnode instanceof Collection)) return null;\n  const len = cnode.items.length;\n  let ci = -1;\n\n  for (let i = len - 1; i >= 0; --i) {\n    const n = cnode.items[i];\n\n    if (n.type === 
PlainValue.Type.COMMENT) {\n      // Keep sufficiently indented comments with preceding node\n      const {\n        indent,\n        lineStart\n      } = n.context;\n      if (indent > 0 && n.range.start >= lineStart + indent) break;\n      ci = i;\n    } else if (n.type === PlainValue.Type.BLANK_LINE) ci = i;else break;\n  }\n\n  if (ci === -1) return null;\n  const ca = cnode.items.splice(ci, len - ci);\n  const prevEnd = ca[0].range.start;\n\n  while (true) {\n    cnode.range.end = prevEnd;\n    if (cnode.valueRange && cnode.valueRange.end > prevEnd) cnode.valueRange.end = prevEnd;\n    if (cnode === node) break;\n    cnode = cnode.context.parent;\n  }\n\n  return ca;\n}\nclass Collection extends PlainValue.Node {\n  static nextContentHasIndent(src, offset, indent) {\n    const lineStart = PlainValue.Node.endOfLine(src, offset) + 1;\n    offset = PlainValue.Node.endOfWhiteSpace(src, lineStart);\n    const ch = src[offset];\n    if (!ch) return false;\n    if (offset >= lineStart + indent) return true;\n    if (ch !== '#' && ch !== '\\n') return false;\n    return Collection.nextContentHasIndent(src, offset, indent);\n  }\n\n  constructor(firstItem) {\n    super(firstItem.type === PlainValue.Type.SEQ_ITEM ? 
PlainValue.Type.SEQ : PlainValue.Type.MAP);\n\n    for (let i = firstItem.props.length - 1; i >= 0; --i) {\n      if (firstItem.props[i].start < firstItem.context.lineStart) {\n        // props on previous line are assumed by the collection\n        this.props = firstItem.props.slice(0, i + 1);\n        firstItem.props = firstItem.props.slice(i + 1);\n        const itemRange = firstItem.props[0] || firstItem.valueRange;\n        firstItem.range.start = itemRange.start;\n        break;\n      }\n    }\n\n    this.items = [firstItem];\n    const ec = grabCollectionEndComments(firstItem);\n    if (ec) Array.prototype.push.apply(this.items, ec);\n  }\n\n  get includesTrailingLines() {\n    return this.items.length > 0;\n  }\n  /**\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this\n   */\n\n\n  parse(context, start) {\n    this.context = context;\n    const {\n      parseNode,\n      src\n    } = context; // It's easier to recalculate lineStart here rather than tracking down the\n    // last context from which to read it -- eemeli/yaml#2\n\n    let lineStart = PlainValue.Node.startOfLine(src, start);\n    const firstItem = this.items[0]; // First-item context needs to be correct for later comment handling\n    // -- eemeli/yaml#17\n\n    firstItem.context.parent = this;\n    this.valueRange = PlainValue.Range.copy(firstItem.valueRange);\n    const indent = firstItem.range.start - firstItem.context.lineStart;\n    let offset = start;\n    offset = PlainValue.Node.normalizeOffset(src, offset);\n    let ch = src[offset];\n    let atLineStart = PlainValue.Node.endOfWhiteSpace(src, lineStart) === offset;\n    let prevIncludesTrailingLines = false;\n\n    while (ch) {\n      while (ch === '\\n' || ch === '#') {\n        if (atLineStart && ch === '\\n' && !prevIncludesTrailingLines) {\n          const blankLine = new BlankLine();\n          offset = blankLine.parse({\n          
  src\n          }, offset);\n          this.valueRange.end = offset;\n\n          if (offset >= src.length) {\n            ch = null;\n            break;\n          }\n\n          this.items.push(blankLine);\n          offset -= 1; // blankLine.parse() consumes terminal newline\n        } else if (ch === '#') {\n          if (offset < lineStart + indent && !Collection.nextContentHasIndent(src, offset, indent)) {\n            return offset;\n          }\n\n          const comment = new Comment();\n          offset = comment.parse({\n            indent,\n            lineStart,\n            src\n          }, offset);\n          this.items.push(comment);\n          this.valueRange.end = offset;\n\n          if (offset >= src.length) {\n            ch = null;\n            break;\n          }\n        }\n\n        lineStart = offset + 1;\n        offset = PlainValue.Node.endOfIndent(src, lineStart);\n\n        if (PlainValue.Node.atBlank(src, offset)) {\n          const wsEnd = PlainValue.Node.endOfWhiteSpace(src, offset);\n          const next = src[wsEnd];\n\n          if (!next || next === '\\n' || next === '#') {\n            offset = wsEnd;\n          }\n        }\n\n        ch = src[offset];\n        atLineStart = true;\n      }\n\n      if (!ch) {\n        break;\n      }\n\n      if (offset !== lineStart + indent && (atLineStart || ch !== ':')) {\n        if (offset < lineStart + indent) {\n          if (lineStart > start) offset = lineStart;\n          break;\n        } else if (!this.error) {\n          const msg = 'All collection items must start at the same column';\n          this.error = new PlainValue.YAMLSyntaxError(this, msg);\n        }\n      }\n\n      if (firstItem.type === PlainValue.Type.SEQ_ITEM) {\n        if (ch !== '-') {\n          if (lineStart > start) offset = lineStart;\n          break;\n        }\n      } else if (ch === '-' && !this.error) {\n        // map key may start with -, as long as it's followed by a non-whitespace char\n       
 const next = src[offset + 1];\n\n        if (!next || next === '\\n' || next === '\\t' || next === ' ') {\n          const msg = 'A collection cannot be both a mapping and a sequence';\n          this.error = new PlainValue.YAMLSyntaxError(this, msg);\n        }\n      }\n\n      const node = parseNode({\n        atLineStart,\n        inCollection: true,\n        indent,\n        lineStart,\n        parent: this\n      }, offset);\n      if (!node) return offset; // at next document start\n\n      this.items.push(node);\n      this.valueRange.end = node.valueRange.end;\n      offset = PlainValue.Node.normalizeOffset(src, node.range.end);\n      ch = src[offset];\n      atLineStart = false;\n      prevIncludesTrailingLines = node.includesTrailingLines; // Need to reset lineStart and atLineStart here if preceding node's range\n      // has advanced to check the current line's indentation level\n      // -- eemeli/yaml#10 & eemeli/yaml#38\n\n      if (ch) {\n        let ls = offset - 1;\n        let prev = src[ls];\n\n        while (prev === ' ' || prev === '\\t') prev = src[--ls];\n\n        if (prev === '\\n') {\n          lineStart = ls + 1;\n          atLineStart = true;\n        }\n      }\n\n      const ec = grabCollectionEndComments(node);\n      if (ec) Array.prototype.push.apply(this.items, ec);\n    }\n\n    return offset;\n  }\n\n  setOrigRanges(cr, offset) {\n    offset = super.setOrigRanges(cr, offset);\n    this.items.forEach(node => {\n      offset = node.setOrigRanges(cr, offset);\n    });\n    return offset;\n  }\n\n  toString() {\n    const {\n      context: {\n        src\n      },\n      items,\n      range,\n      value\n    } = this;\n    if (value != null) return value;\n    let str = src.slice(range.start, items[0].range.start) + String(items[0]);\n\n    for (let i = 1; i < items.length; ++i) {\n      const item = items[i];\n      const {\n        atLineStart,\n        indent\n      } = item.context;\n      if (atLineStart) for (let i = 0; i < 
indent; ++i) str += ' ';\n      str += String(item);\n    }\n\n    return PlainValue.Node.addStringTerminator(src, range.end, str);\n  }\n\n}\n\nclass Directive extends PlainValue.Node {\n  constructor() {\n    super(PlainValue.Type.DIRECTIVE);\n    this.name = null;\n  }\n\n  get parameters() {\n    const raw = this.rawValue;\n    return raw ? raw.trim().split(/[ \\t]+/) : [];\n  }\n\n  parseName(start) {\n    const {\n      src\n    } = this.context;\n    let offset = start;\n    let ch = src[offset];\n\n    while (ch && ch !== '\\n' && ch !== '\\t' && ch !== ' ') ch = src[offset += 1];\n\n    this.name = src.slice(start, offset);\n    return offset;\n  }\n\n  parseParameters(start) {\n    const {\n      src\n    } = this.context;\n    let offset = start;\n    let ch = src[offset];\n\n    while (ch && ch !== '\\n' && ch !== '#') ch = src[offset += 1];\n\n    this.valueRange = new PlainValue.Range(start, offset);\n    return offset;\n  }\n\n  parse(context, start) {\n    this.context = context;\n    let offset = this.parseName(start + 1);\n    offset = this.parseParameters(offset);\n    offset = this.parseComment(offset);\n    this.range = new PlainValue.Range(start, offset);\n    return offset;\n  }\n\n}\n\nclass Document extends PlainValue.Node {\n  static startCommentOrEndBlankLine(src, start) {\n    const offset = PlainValue.Node.endOfWhiteSpace(src, start);\n    const ch = src[offset];\n    return ch === '#' || ch === '\\n' ? 
offset : start;\n  }\n\n  constructor() {\n    super(PlainValue.Type.DOCUMENT);\n    this.directives = null;\n    this.contents = null;\n    this.directivesEndMarker = null;\n    this.documentEndMarker = null;\n  }\n\n  parseDirectives(start) {\n    const {\n      src\n    } = this.context;\n    this.directives = [];\n    let atLineStart = true;\n    let hasDirectives = false;\n    let offset = start;\n\n    while (!PlainValue.Node.atDocumentBoundary(src, offset, PlainValue.Char.DIRECTIVES_END)) {\n      offset = Document.startCommentOrEndBlankLine(src, offset);\n\n      switch (src[offset]) {\n        case '\\n':\n          if (atLineStart) {\n            const blankLine = new BlankLine();\n            offset = blankLine.parse({\n              src\n            }, offset);\n\n            if (offset < src.length) {\n              this.directives.push(blankLine);\n            }\n          } else {\n            offset += 1;\n            atLineStart = true;\n          }\n\n          break;\n\n        case '#':\n          {\n            const comment = new Comment();\n            offset = comment.parse({\n              src\n            }, offset);\n            this.directives.push(comment);\n            atLineStart = false;\n          }\n          break;\n\n        case '%':\n          {\n            const directive = new Directive();\n            offset = directive.parse({\n              parent: this,\n              src\n            }, offset);\n            this.directives.push(directive);\n            hasDirectives = true;\n            atLineStart = false;\n          }\n          break;\n\n        default:\n          if (hasDirectives) {\n            this.error = new PlainValue.YAMLSemanticError(this, 'Missing directives-end indicator line');\n          } else if (this.directives.length > 0) {\n            this.contents = this.directives;\n            this.directives = [];\n          }\n\n          return offset;\n      }\n    }\n\n    if (src[offset]) {\n      
this.directivesEndMarker = new PlainValue.Range(offset, offset + 3);\n      return offset + 3;\n    }\n\n    if (hasDirectives) {\n      this.error = new PlainValue.YAMLSemanticError(this, 'Missing directives-end indicator line');\n    } else if (this.directives.length > 0) {\n      this.contents = this.directives;\n      this.directives = [];\n    }\n\n    return offset;\n  }\n\n  parseContents(start) {\n    const {\n      parseNode,\n      src\n    } = this.context;\n    if (!this.contents) this.contents = [];\n    let lineStart = start;\n\n    while (src[lineStart - 1] === '-') lineStart -= 1;\n\n    let offset = PlainValue.Node.endOfWhiteSpace(src, start);\n    let atLineStart = lineStart === start;\n    this.valueRange = new PlainValue.Range(offset);\n\n    while (!PlainValue.Node.atDocumentBoundary(src, offset, PlainValue.Char.DOCUMENT_END)) {\n      switch (src[offset]) {\n        case '\\n':\n          if (atLineStart) {\n            const blankLine = new BlankLine();\n            offset = blankLine.parse({\n              src\n            }, offset);\n\n            if (offset < src.length) {\n              this.contents.push(blankLine);\n            }\n          } else {\n            offset += 1;\n            atLineStart = true;\n          }\n\n          lineStart = offset;\n          break;\n\n        case '#':\n          {\n            const comment = new Comment();\n            offset = comment.parse({\n              src\n            }, offset);\n            this.contents.push(comment);\n            atLineStart = false;\n          }\n          break;\n\n        default:\n          {\n            const iEnd = PlainValue.Node.endOfIndent(src, offset);\n            const context = {\n              atLineStart,\n              indent: -1,\n              inFlow: false,\n              inCollection: false,\n              lineStart,\n              parent: this\n            };\n            const node = parseNode(context, iEnd);\n            if (!node) return 
this.valueRange.end = iEnd; // at next document start\n\n            this.contents.push(node);\n            offset = node.range.end;\n            atLineStart = false;\n            const ec = grabCollectionEndComments(node);\n            if (ec) Array.prototype.push.apply(this.contents, ec);\n          }\n      }\n\n      offset = Document.startCommentOrEndBlankLine(src, offset);\n    }\n\n    this.valueRange.end = offset;\n\n    if (src[offset]) {\n      this.documentEndMarker = new PlainValue.Range(offset, offset + 3);\n      offset += 3;\n\n      if (src[offset]) {\n        offset = PlainValue.Node.endOfWhiteSpace(src, offset);\n\n        if (src[offset] === '#') {\n          const comment = new Comment();\n          offset = comment.parse({\n            src\n          }, offset);\n          this.contents.push(comment);\n        }\n\n        switch (src[offset]) {\n          case '\\n':\n            offset += 1;\n            break;\n\n          case undefined:\n            break;\n\n          default:\n            this.error = new PlainValue.YAMLSyntaxError(this, 'Document end marker line cannot have a non-comment suffix');\n        }\n      }\n    }\n\n    return offset;\n  }\n  /**\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this\n   */\n\n\n  parse(context, start) {\n    context.root = this;\n    this.context = context;\n    const {\n      src\n    } = context;\n    let offset = src.charCodeAt(start) === 0xfeff ? 
start + 1 : start; // skip BOM\n\n    offset = this.parseDirectives(offset);\n    offset = this.parseContents(offset);\n    return offset;\n  }\n\n  setOrigRanges(cr, offset) {\n    offset = super.setOrigRanges(cr, offset);\n    this.directives.forEach(node => {\n      offset = node.setOrigRanges(cr, offset);\n    });\n    if (this.directivesEndMarker) offset = this.directivesEndMarker.setOrigRange(cr, offset);\n    this.contents.forEach(node => {\n      offset = node.setOrigRanges(cr, offset);\n    });\n    if (this.documentEndMarker) offset = this.documentEndMarker.setOrigRange(cr, offset);\n    return offset;\n  }\n\n  toString() {\n    const {\n      contents,\n      directives,\n      value\n    } = this;\n    if (value != null) return value;\n    let str = directives.join('');\n\n    if (contents.length > 0) {\n      if (directives.length > 0 || contents[0].type === PlainValue.Type.COMMENT) str += '---\\n';\n      str += contents.join('');\n    }\n\n    if (str[str.length - 1] !== '\\n') str += '\\n';\n    return str;\n  }\n\n}\n\nclass Alias extends PlainValue.Node {\n  /**\n   * Parses an *alias from the source\n   *\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this scalar\n   */\n  parse(context, start) {\n    this.context = context;\n    const {\n      src\n    } = context;\n    let offset = PlainValue.Node.endOfIdentifier(src, start + 1);\n    this.valueRange = new PlainValue.Range(start + 1, offset);\n    offset = PlainValue.Node.endOfWhiteSpace(src, offset);\n    offset = this.parseComment(offset);\n    return offset;\n  }\n\n}\n\nconst Chomp = {\n  CLIP: 'CLIP',\n  KEEP: 'KEEP',\n  STRIP: 'STRIP'\n};\nclass BlockValue extends PlainValue.Node {\n  constructor(type, props) {\n    super(type, props);\n    this.blockIndent = null;\n    this.chomping = Chomp.CLIP;\n    this.header = null;\n  }\n\n  get includesTrailingLines() {\n    return this.chomping === 
Chomp.KEEP;\n  }\n\n  get strValue() {\n    if (!this.valueRange || !this.context) return null;\n    let {\n      start,\n      end\n    } = this.valueRange;\n    const {\n      indent,\n      src\n    } = this.context;\n    if (this.valueRange.isEmpty()) return '';\n    let lastNewLine = null;\n    let ch = src[end - 1];\n\n    while (ch === '\\n' || ch === '\\t' || ch === ' ') {\n      end -= 1;\n\n      if (end <= start) {\n        if (this.chomping === Chomp.KEEP) break;else return ''; // probably never happens\n      }\n\n      if (ch === '\\n') lastNewLine = end;\n      ch = src[end - 1];\n    }\n\n    let keepStart = end + 1;\n\n    if (lastNewLine) {\n      if (this.chomping === Chomp.KEEP) {\n        keepStart = lastNewLine;\n        end = this.valueRange.end;\n      } else {\n        end = lastNewLine;\n      }\n    }\n\n    const bi = indent + this.blockIndent;\n    const folded = this.type === PlainValue.Type.BLOCK_FOLDED;\n    let atStart = true;\n    let str = '';\n    let sep = '';\n    let prevMoreIndented = false;\n\n    for (let i = start; i < end; ++i) {\n      for (let j = 0; j < bi; ++j) {\n        if (src[i] !== ' ') break;\n        i += 1;\n      }\n\n      const ch = src[i];\n\n      if (ch === '\\n') {\n        if (sep === '\\n') str += '\\n';else sep = '\\n';\n      } else {\n        const lineEnd = PlainValue.Node.endOfLine(src, i);\n        const line = src.slice(i, lineEnd);\n        i = lineEnd;\n\n        if (folded && (ch === ' ' || ch === '\\t') && i < keepStart) {\n          if (sep === ' ') sep = '\\n';else if (!prevMoreIndented && !atStart && sep === '\\n') sep = '\\n\\n';\n          str += sep + line; //+ ((lineEnd < end && src[lineEnd]) || '')\n\n          sep = lineEnd < end && src[lineEnd] || '';\n          prevMoreIndented = true;\n        } else {\n          str += sep + line;\n          sep = folded && i < keepStart ? 
' ' : '\\n';\n          prevMoreIndented = false;\n        }\n\n        if (atStart && line !== '') atStart = false;\n      }\n    }\n\n    return this.chomping === Chomp.STRIP ? str : str + '\\n';\n  }\n\n  parseBlockHeader(start) {\n    const {\n      src\n    } = this.context;\n    let offset = start + 1;\n    let bi = '';\n\n    while (true) {\n      const ch = src[offset];\n\n      switch (ch) {\n        case '-':\n          this.chomping = Chomp.STRIP;\n          break;\n\n        case '+':\n          this.chomping = Chomp.KEEP;\n          break;\n\n        case '0':\n        case '1':\n        case '2':\n        case '3':\n        case '4':\n        case '5':\n        case '6':\n        case '7':\n        case '8':\n        case '9':\n          bi += ch;\n          break;\n\n        default:\n          this.blockIndent = Number(bi) || null;\n          this.header = new PlainValue.Range(start, offset);\n          return offset;\n      }\n\n      offset += 1;\n    }\n  }\n\n  parseBlockValue(start) {\n    const {\n      indent,\n      src\n    } = this.context;\n    const explicit = !!this.blockIndent;\n    let offset = start;\n    let valueEnd = start;\n    let minBlockIndent = 1;\n\n    for (let ch = src[offset]; ch === '\\n'; ch = src[offset]) {\n      offset += 1;\n      if (PlainValue.Node.atDocumentBoundary(src, offset)) break;\n      const end = PlainValue.Node.endOfBlockIndent(src, indent, offset); // should not include tab?\n\n      if (end === null) break;\n      const ch = src[end];\n      const lineIndent = end - (offset + indent);\n\n      if (!this.blockIndent) {\n        // no explicit block indent, none yet detected\n        if (src[end] !== '\\n') {\n          // first line with non-whitespace content\n          if (lineIndent < minBlockIndent) {\n            const msg = 'Block scalars with more-indented leading empty lines must use an explicit indentation indicator';\n            this.error = new PlainValue.YAMLSemanticError(this, msg);\n     
     }\n\n          this.blockIndent = lineIndent;\n        } else if (lineIndent > minBlockIndent) {\n          // empty line with more whitespace\n          minBlockIndent = lineIndent;\n        }\n      } else if (ch && ch !== '\\n' && lineIndent < this.blockIndent) {\n        if (src[end] === '#') break;\n\n        if (!this.error) {\n          const src = explicit ? 'explicit indentation indicator' : 'first line';\n          const msg = `Block scalars must not be less indented than their ${src}`;\n          this.error = new PlainValue.YAMLSemanticError(this, msg);\n        }\n      }\n\n      if (src[end] === '\\n') {\n        offset = end;\n      } else {\n        offset = valueEnd = PlainValue.Node.endOfLine(src, end);\n      }\n    }\n\n    if (this.chomping !== Chomp.KEEP) {\n      offset = src[valueEnd] ? valueEnd + 1 : valueEnd;\n    }\n\n    this.valueRange = new PlainValue.Range(start + 1, offset);\n    return offset;\n  }\n  /**\n   * Parses a block value from the source\n   *\n   * Accepted forms are:\n   * ```\n   * BS\n   * block\n   * lines\n   *\n   * BS #comment\n   * block\n   * lines\n   * ```\n   * where the block style BS matches the regexp `[|>][-+1-9]*` and block lines\n   * are empty or have an indent level greater than `indent`.\n   *\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this block\n   */\n\n\n  parse(context, start) {\n    this.context = context;\n    const {\n      src\n    } = context;\n    let offset = this.parseBlockHeader(start);\n    offset = PlainValue.Node.endOfWhiteSpace(src, offset);\n    offset = this.parseComment(offset);\n    offset = this.parseBlockValue(offset);\n    return offset;\n  }\n\n  setOrigRanges(cr, offset) {\n    offset = super.setOrigRanges(cr, offset);\n    return this.header ? 
this.header.setOrigRange(cr, offset) : offset;\n  }\n\n}\n\nclass FlowCollection extends PlainValue.Node {\n  constructor(type, props) {\n    super(type, props);\n    this.items = null;\n  }\n\n  prevNodeIsJsonLike(idx = this.items.length) {\n    const node = this.items[idx - 1];\n    return !!node && (node.jsonLike || node.type === PlainValue.Type.COMMENT && this.prevNodeIsJsonLike(idx - 1));\n  }\n  /**\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this\n   */\n\n\n  parse(context, start) {\n    this.context = context;\n    const {\n      parseNode,\n      src\n    } = context;\n    let {\n      indent,\n      lineStart\n    } = context;\n    let char = src[start]; // { or [\n\n    this.items = [{\n      char,\n      offset: start\n    }];\n    let offset = PlainValue.Node.endOfWhiteSpace(src, start + 1);\n    char = src[offset];\n\n    while (char && char !== ']' && char !== '}') {\n      switch (char) {\n        case '\\n':\n          {\n            lineStart = offset + 1;\n            const wsEnd = PlainValue.Node.endOfWhiteSpace(src, lineStart);\n\n            if (src[wsEnd] === '\\n') {\n              const blankLine = new BlankLine();\n              lineStart = blankLine.parse({\n                src\n              }, lineStart);\n              this.items.push(blankLine);\n            }\n\n            offset = PlainValue.Node.endOfIndent(src, lineStart);\n\n            if (offset <= lineStart + indent) {\n              char = src[offset];\n\n              if (offset < lineStart + indent || char !== ']' && char !== '}') {\n                const msg = 'Insufficient indentation in flow collection';\n                this.error = new PlainValue.YAMLSemanticError(this, msg);\n              }\n            }\n          }\n          break;\n\n        case ',':\n          {\n            this.items.push({\n              char,\n              offset\n            });\n      
      offset += 1;\n          }\n          break;\n\n        case '#':\n          {\n            const comment = new Comment();\n            offset = comment.parse({\n              src\n            }, offset);\n            this.items.push(comment);\n          }\n          break;\n\n        case '?':\n        case ':':\n          {\n            const next = src[offset + 1];\n\n            if (next === '\\n' || next === '\\t' || next === ' ' || next === ',' || // in-flow : after JSON-like key does not need to be followed by whitespace\n            char === ':' && this.prevNodeIsJsonLike()) {\n              this.items.push({\n                char,\n                offset\n              });\n              offset += 1;\n              break;\n            }\n          }\n        // fallthrough\n\n        default:\n          {\n            const node = parseNode({\n              atLineStart: false,\n              inCollection: false,\n              inFlow: true,\n              indent: -1,\n              lineStart,\n              parent: this\n            }, offset);\n\n            if (!node) {\n              // at next document start\n              this.valueRange = new PlainValue.Range(start, offset);\n              return offset;\n            }\n\n            this.items.push(node);\n            offset = PlainValue.Node.normalizeOffset(src, node.range.end);\n          }\n      }\n\n      offset = PlainValue.Node.endOfWhiteSpace(src, offset);\n      char = src[offset];\n    }\n\n    this.valueRange = new PlainValue.Range(start, offset + 1);\n\n    if (char) {\n      this.items.push({\n        char,\n        offset\n      });\n      offset = PlainValue.Node.endOfWhiteSpace(src, offset + 1);\n      offset = this.parseComment(offset);\n    }\n\n    return offset;\n  }\n\n  setOrigRanges(cr, offset) {\n    offset = super.setOrigRanges(cr, offset);\n    this.items.forEach(node => {\n      if (node instanceof PlainValue.Node) {\n        offset = node.setOrigRanges(cr, offset);\n 
     } else if (cr.length === 0) {\n        node.origOffset = node.offset;\n      } else {\n        let i = offset;\n\n        while (i < cr.length) {\n          if (cr[i] > node.offset) break;else ++i;\n        }\n\n        node.origOffset = node.offset + i;\n        offset = i;\n      }\n    });\n    return offset;\n  }\n\n  toString() {\n    const {\n      context: {\n        src\n      },\n      items,\n      range,\n      value\n    } = this;\n    if (value != null) return value;\n    const nodes = items.filter(item => item instanceof PlainValue.Node);\n    let str = '';\n    let prevEnd = range.start;\n    nodes.forEach(node => {\n      const prefix = src.slice(prevEnd, node.range.start);\n      prevEnd = node.range.end;\n      str += prefix + String(node);\n\n      if (str[str.length - 1] === '\\n' && src[prevEnd - 1] !== '\\n' && src[prevEnd] === '\\n') {\n        // Comment range does not include the terminal newline, but its\n        // stringified value does. Without this fix, newlines at comment ends\n        // get duplicated.\n        prevEnd += 1;\n      }\n    });\n    str += src.slice(prevEnd, range.end);\n    return PlainValue.Node.addStringTerminator(src, range.end, str);\n  }\n\n}\n\nclass QuoteDouble extends PlainValue.Node {\n  static endOfQuote(src, offset) {\n    let ch = src[offset];\n\n    while (ch && ch !== '\"') {\n      offset += ch === '\\\\' ? 
2 : 1;\n      ch = src[offset];\n    }\n\n    return offset + 1;\n  }\n  /**\n   * @returns {string | { str: string, errors: YAMLSyntaxError[] }}\n   */\n\n\n  get strValue() {\n    if (!this.valueRange || !this.context) return null;\n    const errors = [];\n    const {\n      start,\n      end\n    } = this.valueRange;\n    const {\n      indent,\n      src\n    } = this.context;\n    if (src[end - 1] !== '\"') errors.push(new PlainValue.YAMLSyntaxError(this, 'Missing closing \"quote')); // Using String#replace is too painful with escaped newlines preceded by\n    // escaped backslashes; also, this should be faster.\n\n    let str = '';\n\n    for (let i = start + 1; i < end - 1; ++i) {\n      const ch = src[i];\n\n      if (ch === '\\n') {\n        if (PlainValue.Node.atDocumentBoundary(src, i + 1)) errors.push(new PlainValue.YAMLSemanticError(this, 'Document boundary indicators are not allowed within string values'));\n        const {\n          fold,\n          offset,\n          error\n        } = PlainValue.Node.foldNewline(src, i, indent);\n        str += fold;\n        i = offset;\n        if (error) errors.push(new PlainValue.YAMLSemanticError(this, 'Multi-line double-quoted string needs to be sufficiently indented'));\n      } else if (ch === '\\\\') {\n        i += 1;\n\n        switch (src[i]) {\n          case '0':\n            str += '\\0';\n            break;\n          // null character\n\n          case 'a':\n            str += '\\x07';\n            break;\n          // bell character\n\n          case 'b':\n            str += '\\b';\n            break;\n          // backspace\n\n          case 'e':\n            str += '\\x1b';\n            break;\n          // escape character\n\n          case 'f':\n            str += '\\f';\n            break;\n          // form feed\n\n          case 'n':\n            str += '\\n';\n            break;\n          // line feed\n\n          case 'r':\n            str += '\\r';\n            break;\n          // 
carriage return\n\n          case 't':\n            str += '\\t';\n            break;\n          // horizontal tab\n\n          case 'v':\n            str += '\\v';\n            break;\n          // vertical tab\n\n          case 'N':\n            str += '\\u0085';\n            break;\n          // Unicode next line\n\n          case '_':\n            str += '\\u00a0';\n            break;\n          // Unicode non-breaking space\n\n          case 'L':\n            str += '\\u2028';\n            break;\n          // Unicode line separator\n\n          case 'P':\n            str += '\\u2029';\n            break;\n          // Unicode paragraph separator\n\n          case ' ':\n            str += ' ';\n            break;\n\n          case '\"':\n            str += '\"';\n            break;\n\n          case '/':\n            str += '/';\n            break;\n\n          case '\\\\':\n            str += '\\\\';\n            break;\n\n          case '\\t':\n            str += '\\t';\n            break;\n\n          case 'x':\n            str += this.parseCharCode(i + 1, 2, errors);\n            i += 2;\n            break;\n\n          case 'u':\n            str += this.parseCharCode(i + 1, 4, errors);\n            i += 4;\n            break;\n\n          case 'U':\n            str += this.parseCharCode(i + 1, 8, errors);\n            i += 8;\n            break;\n\n          case '\\n':\n            // skip escaped newlines, but still trim the following line\n            while (src[i + 1] === ' ' || src[i + 1] === '\\t') i += 1;\n\n            break;\n\n          default:\n            errors.push(new PlainValue.YAMLSyntaxError(this, `Invalid escape sequence ${src.substr(i - 1, 2)}`));\n            str += '\\\\' + src[i];\n        }\n      } else if (ch === ' ' || ch === '\\t') {\n        // trim trailing whitespace\n        const wsStart = i;\n        let next = src[i + 1];\n\n        while (next === ' ' || next === '\\t') {\n          i += 1;\n          next = src[i + 
1];\n        }\n\n        if (next !== '\\n') str += i > wsStart ? src.slice(wsStart, i + 1) : ch;\n      } else {\n        str += ch;\n      }\n    }\n\n    return errors.length > 0 ? {\n      errors,\n      str\n    } : str;\n  }\n\n  parseCharCode(offset, length, errors) {\n    const {\n      src\n    } = this.context;\n    const cc = src.substr(offset, length);\n    const ok = cc.length === length && /^[0-9a-fA-F]+$/.test(cc);\n    const code = ok ? parseInt(cc, 16) : NaN;\n\n    if (isNaN(code)) {\n      errors.push(new PlainValue.YAMLSyntaxError(this, `Invalid escape sequence ${src.substr(offset - 2, length + 2)}`));\n      return src.substr(offset - 2, length + 2);\n    }\n\n    return String.fromCodePoint(code);\n  }\n  /**\n   * Parses a \"double quoted\" value from the source\n   *\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this scalar\n   */\n\n\n  parse(context, start) {\n    this.context = context;\n    const {\n      src\n    } = context;\n    let offset = QuoteDouble.endOfQuote(src, start + 1);\n    this.valueRange = new PlainValue.Range(start, offset);\n    offset = PlainValue.Node.endOfWhiteSpace(src, offset);\n    offset = this.parseComment(offset);\n    return offset;\n  }\n\n}\n\nclass QuoteSingle extends PlainValue.Node {\n  static endOfQuote(src, offset) {\n    let ch = src[offset];\n\n    while (ch) {\n      if (ch === \"'\") {\n        if (src[offset + 1] !== \"'\") break;\n        ch = src[offset += 2];\n      } else {\n        ch = src[offset += 1];\n      }\n    }\n\n    return offset + 1;\n  }\n  /**\n   * @returns {string | { str: string, errors: YAMLSyntaxError[] }}\n   */\n\n\n  get strValue() {\n    if (!this.valueRange || !this.context) return null;\n    const errors = [];\n    const {\n      start,\n      end\n    } = this.valueRange;\n    const {\n      indent,\n      src\n    } = this.context;\n    if (src[end - 1] !== \"'\") 
errors.push(new PlainValue.YAMLSyntaxError(this, \"Missing closing 'quote\"));\n    let str = '';\n\n    for (let i = start + 1; i < end - 1; ++i) {\n      const ch = src[i];\n\n      if (ch === '\\n') {\n        if (PlainValue.Node.atDocumentBoundary(src, i + 1)) errors.push(new PlainValue.YAMLSemanticError(this, 'Document boundary indicators are not allowed within string values'));\n        const {\n          fold,\n          offset,\n          error\n        } = PlainValue.Node.foldNewline(src, i, indent);\n        str += fold;\n        i = offset;\n        if (error) errors.push(new PlainValue.YAMLSemanticError(this, 'Multi-line single-quoted string needs to be sufficiently indented'));\n      } else if (ch === \"'\") {\n        str += ch;\n        i += 1;\n        if (src[i] !== \"'\") errors.push(new PlainValue.YAMLSyntaxError(this, 'Unescaped single quote? This should not happen.'));\n      } else if (ch === ' ' || ch === '\\t') {\n        // trim trailing whitespace\n        const wsStart = i;\n        let next = src[i + 1];\n\n        while (next === ' ' || next === '\\t') {\n          i += 1;\n          next = src[i + 1];\n        }\n\n        if (next !== '\\n') str += i > wsStart ? src.slice(wsStart, i + 1) : ch;\n      } else {\n        str += ch;\n      }\n    }\n\n    return errors.length > 0 ? 
{\n      errors,\n      str\n    } : str;\n  }\n  /**\n   * Parses a 'single quoted' value from the source\n   *\n   * @param {ParseContext} context\n   * @param {number} start - Index of first character\n   * @returns {number} - Index of the character after this scalar\n   */\n\n\n  parse(context, start) {\n    this.context = context;\n    const {\n      src\n    } = context;\n    let offset = QuoteSingle.endOfQuote(src, start + 1);\n    this.valueRange = new PlainValue.Range(start, offset);\n    offset = PlainValue.Node.endOfWhiteSpace(src, offset);\n    offset = this.parseComment(offset);\n    return offset;\n  }\n\n}\n\nfunction createNewNode(type, props) {\n  switch (type) {\n    case PlainValue.Type.ALIAS:\n      return new Alias(type, props);\n\n    case PlainValue.Type.BLOCK_FOLDED:\n    case PlainValue.Type.BLOCK_LITERAL:\n      return new BlockValue(type, props);\n\n    case PlainValue.Type.FLOW_MAP:\n    case PlainValue.Type.FLOW_SEQ:\n      return new FlowCollection(type, props);\n\n    case PlainValue.Type.MAP_KEY:\n    case PlainValue.Type.MAP_VALUE:\n    case PlainValue.Type.SEQ_ITEM:\n      return new CollectionItem(type, props);\n\n    case PlainValue.Type.COMMENT:\n    case PlainValue.Type.PLAIN:\n      return new PlainValue.PlainValue(type, props);\n\n    case PlainValue.Type.QUOTE_DOUBLE:\n      return new QuoteDouble(type, props);\n\n    case PlainValue.Type.QUOTE_SINGLE:\n      return new QuoteSingle(type, props);\n\n    /* istanbul ignore next */\n\n    default:\n      return null;\n    // should never happen\n  }\n}\n/**\n * @param {boolean} atLineStart - Node starts at beginning of line\n * @param {boolean} inFlow - true if currently in a flow context\n * @param {boolean} inCollection - true if currently in a collection context\n * @param {number} indent - Current level of indentation\n * @param {number} lineStart - Start of the current line\n * @param {Node} parent - The parent of the node\n * @param {string} src - Source of the YAML 
document\n */\n\n\nclass ParseContext {\n  static parseType(src, offset, inFlow) {\n    switch (src[offset]) {\n      case '*':\n        return PlainValue.Type.ALIAS;\n\n      case '>':\n        return PlainValue.Type.BLOCK_FOLDED;\n\n      case '|':\n        return PlainValue.Type.BLOCK_LITERAL;\n\n      case '{':\n        return PlainValue.Type.FLOW_MAP;\n\n      case '[':\n        return PlainValue.Type.FLOW_SEQ;\n\n      case '?':\n        return !inFlow && PlainValue.Node.atBlank(src, offset + 1, true) ? PlainValue.Type.MAP_KEY : PlainValue.Type.PLAIN;\n\n      case ':':\n        return !inFlow && PlainValue.Node.atBlank(src, offset + 1, true) ? PlainValue.Type.MAP_VALUE : PlainValue.Type.PLAIN;\n\n      case '-':\n        return !inFlow && PlainValue.Node.atBlank(src, offset + 1, true) ? PlainValue.Type.SEQ_ITEM : PlainValue.Type.PLAIN;\n\n      case '\"':\n        return PlainValue.Type.QUOTE_DOUBLE;\n\n      case \"'\":\n        return PlainValue.Type.QUOTE_SINGLE;\n\n      default:\n        return PlainValue.Type.PLAIN;\n    }\n  }\n\n  constructor(orig = {}, {\n    atLineStart,\n    inCollection,\n    inFlow,\n    indent,\n    lineStart,\n    parent\n  } = {}) {\n    PlainValue._defineProperty(this, \"parseNode\", (overlay, start) => {\n      if (PlainValue.Node.atDocumentBoundary(this.src, start)) return null;\n      const context = new ParseContext(this, overlay);\n      const {\n        props,\n        type,\n        valueStart\n      } = context.parseProps(start);\n      const node = createNewNode(type, props);\n      let offset = node.parse(context, valueStart);\n      node.range = new PlainValue.Range(start, offset);\n      /* istanbul ignore if */\n\n      if (offset <= start) {\n        // This should never happen, but if it does, let's make sure to at least\n        // step one character forward to avoid a busy loop.\n        node.error = new Error(`Node#parse consumed no characters`);\n        node.error.parseEnd = offset;\n        
node.error.source = node;\n        node.range.end = start + 1;\n      }\n\n      if (context.nodeStartsCollection(node)) {\n        if (!node.error && !context.atLineStart && context.parent.type === PlainValue.Type.DOCUMENT) {\n          node.error = new PlainValue.YAMLSyntaxError(node, 'Block collection must not have preceding content here (e.g. directives-end indicator)');\n        }\n\n        const collection = new Collection(node);\n        offset = collection.parse(new ParseContext(context), offset);\n        collection.range = new PlainValue.Range(start, offset);\n        return collection;\n      }\n\n      return node;\n    });\n\n    this.atLineStart = atLineStart != null ? atLineStart : orig.atLineStart || false;\n    this.inCollection = inCollection != null ? inCollection : orig.inCollection || false;\n    this.inFlow = inFlow != null ? inFlow : orig.inFlow || false;\n    this.indent = indent != null ? indent : orig.indent;\n    this.lineStart = lineStart != null ? lineStart : orig.lineStart;\n    this.parent = parent != null ? parent : orig.parent || {};\n    this.root = orig.root;\n    this.src = orig.src;\n  }\n\n  nodeStartsCollection(node) {\n    const {\n      inCollection,\n      inFlow,\n      src\n    } = this;\n    if (inCollection || inFlow) return false;\n    if (node instanceof CollectionItem) return true; // check for implicit key\n\n    let offset = node.range.end;\n    if (src[offset] === '\\n' || src[offset - 1] === '\\n') return false;\n    offset = PlainValue.Node.endOfWhiteSpace(src, offset);\n    return src[offset] === ':';\n  } // Anchor and tag are before type, which determines the node implementation\n  // class; hence this intermediate step.\n\n\n  parseProps(offset) {\n    const {\n      inFlow,\n      parent,\n      src\n    } = this;\n    const props = [];\n    let lineHasProps = false;\n    offset = this.atLineStart ? 
PlainValue.Node.endOfIndent(src, offset) : PlainValue.Node.endOfWhiteSpace(src, offset);\n    let ch = src[offset];\n\n    while (ch === PlainValue.Char.ANCHOR || ch === PlainValue.Char.COMMENT || ch === PlainValue.Char.TAG || ch === '\\n') {\n      if (ch === '\\n') {\n        const lineStart = offset + 1;\n        const inEnd = PlainValue.Node.endOfIndent(src, lineStart);\n        const indentDiff = inEnd - (lineStart + this.indent);\n        const noIndicatorAsIndent = parent.type === PlainValue.Type.SEQ_ITEM && parent.context.atLineStart;\n        if (!PlainValue.Node.nextNodeIsIndented(src[inEnd], indentDiff, !noIndicatorAsIndent)) break;\n        this.atLineStart = true;\n        this.lineStart = lineStart;\n        lineHasProps = false;\n        offset = inEnd;\n      } else if (ch === PlainValue.Char.COMMENT) {\n        const end = PlainValue.Node.endOfLine(src, offset + 1);\n        props.push(new PlainValue.Range(offset, end));\n        offset = end;\n      } else {\n        let end = PlainValue.Node.endOfIdentifier(src, offset + 1);\n\n        if (ch === PlainValue.Char.TAG && src[end] === ',' && /^[a-zA-Z0-9-]+\\.[a-zA-Z0-9-]+,\\d\\d\\d\\d(-\\d\\d){0,2}\\/\\S/.test(src.slice(offset + 1, end + 13))) {\n          // Let's presume we're dealing with a YAML 1.0 domain tag here, rather\n          // than an empty but 'foo.bar' private-tagged node in a flow collection\n          // followed without whitespace by a plain string starting with a year\n          // or date divided by something.\n          end = PlainValue.Node.endOfIdentifier(src, end + 5);\n        }\n\n        props.push(new PlainValue.Range(offset, end));\n        lineHasProps = true;\n        offset = PlainValue.Node.endOfWhiteSpace(src, end);\n      }\n\n      ch = src[offset];\n    } // '- &a : b' has an anchor on an empty node\n\n\n    if (lineHasProps && ch === ':' && PlainValue.Node.atBlank(src, offset + 1, true)) offset -= 1;\n    const type = ParseContext.parseType(src, offset, 
inFlow);\n    return {\n      props,\n      type,\n      valueStart: offset\n    };\n  }\n  /**\n   * Parses a node from the source\n   * @param {ParseContext} overlay\n   * @param {number} start - Index of first non-whitespace character for the node\n   * @returns {?Node} - null if at a document boundary\n   */\n\n\n}\n\n// Published as 'yaml/parse-cst'\nfunction parse(src) {\n  const cr = [];\n\n  if (src.indexOf('\\r') !== -1) {\n    src = src.replace(/\\r\\n?/g, (match, offset) => {\n      if (match.length > 1) cr.push(offset);\n      return '\\n';\n    });\n  }\n\n  const documents = [];\n  let offset = 0;\n\n  do {\n    const doc = new Document();\n    const context = new ParseContext({\n      src\n    });\n    offset = doc.parse(context, offset);\n    documents.push(doc);\n  } while (offset < src.length);\n\n  documents.setOrigRanges = () => {\n    if (cr.length === 0) return false;\n\n    for (let i = 1; i < cr.length; ++i) cr[i] -= i;\n\n    let crOffset = 0;\n\n    for (let i = 0; i < documents.length; ++i) {\n      crOffset = documents[i].setOrigRanges(cr, crOffset);\n    }\n\n    cr.splice(0, cr.length);\n    return true;\n  };\n\n  documents.toString = () => documents.join('...\\n');\n\n  return documents;\n}\n\nexports.parse = parse;\n\n\n/***/ }),\n\n/***/ 6140:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nvar PlainValue = __webpack_require__(5215);\n\nfunction addCommentBefore(str, indent, comment) {\n  if (!comment) return str;\n  const cc = comment.replace(/[\\s\\S]^/gm, `$&${indent}#`);\n  return `#${cc}\\n${indent}${str}`;\n}\nfunction addComment(str, indent, comment) {\n  return !comment ? str : comment.indexOf('\\n') === -1 ? 
`${str} #${comment}` : `${str}\\n` + comment.replace(/^/gm, `${indent || ''}#`);\n}\n\nclass Node {}\n\nfunction toJSON(value, arg, ctx) {\n  if (Array.isArray(value)) return value.map((v, i) => toJSON(v, String(i), ctx));\n\n  if (value && typeof value.toJSON === 'function') {\n    const anchor = ctx && ctx.anchors && ctx.anchors.get(value);\n    if (anchor) ctx.onCreate = res => {\n      anchor.res = res;\n      delete ctx.onCreate;\n    };\n    const res = value.toJSON(arg, ctx);\n    if (anchor && ctx.onCreate) ctx.onCreate(res);\n    return res;\n  }\n\n  if ((!ctx || !ctx.keep) && typeof value === 'bigint') return Number(value);\n  return value;\n}\n\nclass Scalar extends Node {\n  constructor(value) {\n    super();\n    this.value = value;\n  }\n\n  toJSON(arg, ctx) {\n    return ctx && ctx.keep ? this.value : toJSON(this.value, arg, ctx);\n  }\n\n  toString() {\n    return String(this.value);\n  }\n\n}\n\nfunction collectionFromPath(schema, path, value) {\n  let v = value;\n\n  for (let i = path.length - 1; i >= 0; --i) {\n    const k = path[i];\n    const o = Number.isInteger(k) && k >= 0 ? [] : {};\n    o[k] = v;\n    v = o;\n  }\n\n  return schema.createNode(v, false);\n} // null, undefined, or an empty non-string iterable (e.g. [])\n\n\nconst isEmptyPath = path => path == null || typeof path === 'object' && path[Symbol.iterator]().next().done;\nclass Collection extends Node {\n  constructor(schema) {\n    super();\n\n    PlainValue._defineProperty(this, \"items\", []);\n\n    this.schema = schema;\n  }\n\n  addIn(path, value) {\n    if (isEmptyPath(path)) this.add(value);else {\n      const [key, ...rest] = path;\n      const node = this.get(key, true);\n      if (node instanceof Collection) node.addIn(rest, value);else if (node === undefined && this.schema) this.set(key, collectionFromPath(this.schema, rest, value));else throw new Error(`Expected YAML collection at ${key}. 
Remaining path: ${rest}`);\n    }\n  }\n\n  deleteIn([key, ...rest]) {\n    if (rest.length === 0) return this.delete(key);\n    const node = this.get(key, true);\n    if (node instanceof Collection) return node.deleteIn(rest);else throw new Error(`Expected YAML collection at ${key}. Remaining path: ${rest}`);\n  }\n\n  getIn([key, ...rest], keepScalar) {\n    const node = this.get(key, true);\n    if (rest.length === 0) return !keepScalar && node instanceof Scalar ? node.value : node;else return node instanceof Collection ? node.getIn(rest, keepScalar) : undefined;\n  }\n\n  hasAllNullValues() {\n    return this.items.every(node => {\n      if (!node || node.type !== 'PAIR') return false;\n      const n = node.value;\n      return n == null || n instanceof Scalar && n.value == null && !n.commentBefore && !n.comment && !n.tag;\n    });\n  }\n\n  hasIn([key, ...rest]) {\n    if (rest.length === 0) return this.has(key);\n    const node = this.get(key, true);\n    return node instanceof Collection ? node.hasIn(rest) : false;\n  }\n\n  setIn([key, ...rest], value) {\n    if (rest.length === 0) {\n      this.set(key, value);\n    } else {\n      const node = this.get(key, true);\n      if (node instanceof Collection) node.setIn(rest, value);else if (node === undefined && this.schema) this.set(key, collectionFromPath(this.schema, rest, value));else throw new Error(`Expected YAML collection at ${key}. 
Remaining path: ${rest}`);\n    }\n  } // overridden in implementations\n\n  /* istanbul ignore next */\n\n\n  toJSON() {\n    return null;\n  }\n\n  toString(ctx, {\n    blockItem,\n    flowChars,\n    isMap,\n    itemIndent\n  }, onComment, onChompKeep) {\n    const {\n      indent,\n      indentStep,\n      stringify\n    } = ctx;\n    const inFlow = this.type === PlainValue.Type.FLOW_MAP || this.type === PlainValue.Type.FLOW_SEQ || ctx.inFlow;\n    if (inFlow) itemIndent += indentStep;\n    const allNullValues = isMap && this.hasAllNullValues();\n    ctx = Object.assign({}, ctx, {\n      allNullValues,\n      indent: itemIndent,\n      inFlow,\n      type: null\n    });\n    let chompKeep = false;\n    let hasItemWithNewLine = false;\n    const nodes = this.items.reduce((nodes, item, i) => {\n      let comment;\n\n      if (item) {\n        if (!chompKeep && item.spaceBefore) nodes.push({\n          type: 'comment',\n          str: ''\n        });\n        if (item.commentBefore) item.commentBefore.match(/^.*$/gm).forEach(line => {\n          nodes.push({\n            type: 'comment',\n            str: `#${line}`\n          });\n        });\n        if (item.comment) comment = item.comment;\n        if (inFlow && (!chompKeep && item.spaceBefore || item.commentBefore || item.comment || item.key && (item.key.commentBefore || item.key.comment) || item.value && (item.value.commentBefore || item.value.comment))) hasItemWithNewLine = true;\n      }\n\n      chompKeep = false;\n      let str = stringify(item, ctx, () => comment = null, () => chompKeep = true);\n      if (inFlow && !hasItemWithNewLine && str.includes('\\n')) hasItemWithNewLine = true;\n      if (inFlow && i < this.items.length - 1) str += ',';\n      str = addComment(str, itemIndent, comment);\n      if (chompKeep && (comment || inFlow)) chompKeep = false;\n      nodes.push({\n        type: 'item',\n        str\n      });\n      return nodes;\n    }, []);\n    let str;\n\n    if (nodes.length === 0) 
{\n      str = flowChars.start + flowChars.end;\n    } else if (inFlow) {\n      const {\n        start,\n        end\n      } = flowChars;\n      const strings = nodes.map(n => n.str);\n\n      if (hasItemWithNewLine || strings.reduce((sum, str) => sum + str.length + 2, 2) > Collection.maxFlowStringSingleLineLength) {\n        str = start;\n\n        for (const s of strings) {\n          str += s ? `\\n${indentStep}${indent}${s}` : '\\n';\n        }\n\n        str += `\\n${indent}${end}`;\n      } else {\n        str = `${start} ${strings.join(' ')} ${end}`;\n      }\n    } else {\n      const strings = nodes.map(blockItem);\n      str = strings.shift();\n\n      for (const s of strings) str += s ? `\\n${indent}${s}` : '\\n';\n    }\n\n    if (this.comment) {\n      str += '\\n' + this.comment.replace(/^/gm, `${indent}#`);\n      if (onComment) onComment();\n    } else if (chompKeep && onChompKeep) onChompKeep();\n\n    return str;\n  }\n\n}\n\nPlainValue._defineProperty(Collection, \"maxFlowStringSingleLineLength\", 60);\n\nfunction asItemIndex(key) {\n  let idx = key instanceof Scalar ? key.value : key;\n  if (idx && typeof idx === 'string') idx = Number(idx);\n  return Number.isInteger(idx) && idx >= 0 ? idx : null;\n}\n\nclass YAMLSeq extends Collection {\n  add(value) {\n    this.items.push(value);\n  }\n\n  delete(key) {\n    const idx = asItemIndex(key);\n    if (typeof idx !== 'number') return false;\n    const del = this.items.splice(idx, 1);\n    return del.length > 0;\n  }\n\n  get(key, keepScalar) {\n    const idx = asItemIndex(key);\n    if (typeof idx !== 'number') return undefined;\n    const it = this.items[idx];\n    return !keepScalar && it instanceof Scalar ? 
it.value : it;\n  }\n\n  has(key) {\n    const idx = asItemIndex(key);\n    return typeof idx === 'number' && idx < this.items.length;\n  }\n\n  set(key, value) {\n    const idx = asItemIndex(key);\n    if (typeof idx !== 'number') throw new Error(`Expected a valid index, not ${key}.`);\n    this.items[idx] = value;\n  }\n\n  toJSON(_, ctx) {\n    const seq = [];\n    if (ctx && ctx.onCreate) ctx.onCreate(seq);\n    let i = 0;\n\n    for (const item of this.items) seq.push(toJSON(item, String(i++), ctx));\n\n    return seq;\n  }\n\n  toString(ctx, onComment, onChompKeep) {\n    if (!ctx) return JSON.stringify(this);\n    return super.toString(ctx, {\n      blockItem: n => n.type === 'comment' ? n.str : `- ${n.str}`,\n      flowChars: {\n        start: '[',\n        end: ']'\n      },\n      isMap: false,\n      itemIndent: (ctx.indent || '') + '  '\n    }, onComment, onChompKeep);\n  }\n\n}\n\nconst stringifyKey = (key, jsKey, ctx) => {\n  if (jsKey === null) return '';\n  if (typeof jsKey !== 'object') return String(jsKey);\n  if (key instanceof Node && ctx && ctx.doc) return key.toString({\n    anchors: {},\n    doc: ctx.doc,\n    indent: '',\n    indentStep: ctx.indentStep,\n    inFlow: true,\n    inStringifyKey: true,\n    stringify: ctx.stringify\n  });\n  return JSON.stringify(jsKey);\n};\n\nclass Pair extends Node {\n  constructor(key, value = null) {\n    super();\n    this.key = key;\n    this.value = value;\n    this.type = Pair.Type.PAIR;\n  }\n\n  get commentBefore() {\n    return this.key instanceof Node ? this.key.commentBefore : undefined;\n  }\n\n  set commentBefore(cb) {\n    if (this.key == null) this.key = new Scalar(null);\n    if (this.key instanceof Node) this.key.commentBefore = cb;else {\n      const msg = 'Pair.commentBefore is an alias for Pair.key.commentBefore. 
To set it, the key must be a Node.';\n      throw new Error(msg);\n    }\n  }\n\n  addToJSMap(ctx, map) {\n    const key = toJSON(this.key, '', ctx);\n\n    if (map instanceof Map) {\n      const value = toJSON(this.value, key, ctx);\n      map.set(key, value);\n    } else if (map instanceof Set) {\n      map.add(key);\n    } else {\n      const stringKey = stringifyKey(this.key, key, ctx);\n      map[stringKey] = toJSON(this.value, stringKey, ctx);\n    }\n\n    return map;\n  }\n\n  toJSON(_, ctx) {\n    const pair = ctx && ctx.mapAsMap ? new Map() : {};\n    return this.addToJSMap(ctx, pair);\n  }\n\n  toString(ctx, onComment, onChompKeep) {\n    if (!ctx || !ctx.doc) return JSON.stringify(this);\n    const {\n      indent: indentSize,\n      indentSeq,\n      simpleKeys\n    } = ctx.doc.options;\n    let {\n      key,\n      value\n    } = this;\n    let keyComment = key instanceof Node && key.comment;\n\n    if (simpleKeys) {\n      if (keyComment) {\n        throw new Error('With simple keys, key nodes cannot have comments');\n      }\n\n      if (key instanceof Collection) {\n        const msg = 'With simple keys, collection cannot be used as a key value';\n        throw new Error(msg);\n      }\n    }\n\n    const explicitKey = !simpleKeys && (!key || keyComment || key instanceof Collection || key.type === PlainValue.Type.BLOCK_FOLDED || key.type === PlainValue.Type.BLOCK_LITERAL);\n    const {\n      doc,\n      indent,\n      indentStep,\n      stringify\n    } = ctx;\n    ctx = Object.assign({}, ctx, {\n      implicitKey: !explicitKey,\n      indent: indent + indentStep\n    });\n    let chompKeep = false;\n    let str = stringify(key, ctx, () => keyComment = null, () => chompKeep = true);\n    str = addComment(str, ctx.indent, keyComment);\n\n    if (ctx.allNullValues && !simpleKeys) {\n      if (this.comment) {\n        str = addComment(str, ctx.indent, this.comment);\n        if (onComment) onComment();\n      } else if (chompKeep && !keyComment && 
onChompKeep) onChompKeep();\n\n      return ctx.inFlow ? str : `? ${str}`;\n    }\n\n    str = explicitKey ? `? ${str}\\n${indent}:` : `${str}:`;\n\n    if (this.comment) {\n      // expected (but not strictly required) to be a single-line comment\n      str = addComment(str, ctx.indent, this.comment);\n      if (onComment) onComment();\n    }\n\n    let vcb = '';\n    let valueComment = null;\n\n    if (value instanceof Node) {\n      if (value.spaceBefore) vcb = '\\n';\n\n      if (value.commentBefore) {\n        const cs = value.commentBefore.replace(/^/gm, `${ctx.indent}#`);\n        vcb += `\\n${cs}`;\n      }\n\n      valueComment = value.comment;\n    } else if (value && typeof value === 'object') {\n      value = doc.schema.createNode(value, true);\n    }\n\n    ctx.implicitKey = false;\n    if (!explicitKey && !this.comment && value instanceof Scalar) ctx.indentAtStart = str.length + 1;\n    chompKeep = false;\n\n    if (!indentSeq && indentSize >= 2 && !ctx.inFlow && !explicitKey && value instanceof YAMLSeq && value.type !== PlainValue.Type.FLOW_SEQ && !value.tag && !doc.anchors.getName(value)) {\n      // If indentSeq === false, consider '- ' as part of indentation where possible\n      ctx.indent = ctx.indent.substr(2);\n    }\n\n    const valueStr = stringify(value, ctx, () => valueComment = null, () => chompKeep = true);\n    let ws = ' ';\n\n    if (vcb || this.comment) {\n      ws = `${vcb}\\n${ctx.indent}`;\n    } else if (!explicitKey && value instanceof Collection) {\n      const flow = valueStr[0] === '[' || valueStr[0] === '{';\n      if (!flow || valueStr.includes('\\n')) ws = `\\n${ctx.indent}`;\n    }\n\n    if (chompKeep && !valueComment && onChompKeep) onChompKeep();\n    return addComment(str + ws + valueStr, ctx.indent, valueComment);\n  }\n\n}\n\nPlainValue._defineProperty(Pair, \"Type\", {\n  PAIR: 'PAIR',\n  MERGE_PAIR: 'MERGE_PAIR'\n});\n\nconst getAliasCount = (node, anchors) => {\n  if (node instanceof Alias) {\n    const anchor = 
anchors.get(node.source);\n    return anchor.count * anchor.aliasCount;\n  } else if (node instanceof Collection) {\n    let count = 0;\n\n    for (const item of node.items) {\n      const c = getAliasCount(item, anchors);\n      if (c > count) count = c;\n    }\n\n    return count;\n  } else if (node instanceof Pair) {\n    const kc = getAliasCount(node.key, anchors);\n    const vc = getAliasCount(node.value, anchors);\n    return Math.max(kc, vc);\n  }\n\n  return 1;\n};\n\nclass Alias extends Node {\n  static stringify({\n    range,\n    source\n  }, {\n    anchors,\n    doc,\n    implicitKey,\n    inStringifyKey\n  }) {\n    let anchor = Object.keys(anchors).find(a => anchors[a] === source);\n    if (!anchor && inStringifyKey) anchor = doc.anchors.getName(source) || doc.anchors.newName();\n    if (anchor) return `*${anchor}${implicitKey ? ' ' : ''}`;\n    const msg = doc.anchors.getName(source) ? 'Alias node must be after source node' : 'Source node not found for alias node';\n    throw new Error(`${msg} [${range}]`);\n  }\n\n  constructor(source) {\n    super();\n    this.source = source;\n    this.type = PlainValue.Type.ALIAS;\n  }\n\n  set tag(t) {\n    throw new Error('Alias nodes cannot have tags');\n  }\n\n  toJSON(arg, ctx) {\n    if (!ctx) return toJSON(this.source, arg, ctx);\n    const {\n      anchors,\n      maxAliasCount\n    } = ctx;\n    const anchor = anchors.get(this.source);\n    /* istanbul ignore if */\n\n    if (!anchor || anchor.res === undefined) {\n      const msg = 'This should not happen: Alias anchor was not resolved?';\n      if (this.cstNode) throw new PlainValue.YAMLReferenceError(this.cstNode, msg);else throw new ReferenceError(msg);\n    }\n\n    if (maxAliasCount >= 0) {\n      anchor.count += 1;\n      if (anchor.aliasCount === 0) anchor.aliasCount = getAliasCount(this.source, anchors);\n\n      if (anchor.count * anchor.aliasCount > maxAliasCount) {\n        const msg = 'Excessive alias count indicates a resource exhaustion 
attack';\n        if (this.cstNode) throw new PlainValue.YAMLReferenceError(this.cstNode, msg);else throw new ReferenceError(msg);\n      }\n    }\n\n    return anchor.res;\n  } // Only called when stringifying an alias mapping key while constructing\n  // Object output.\n\n\n  toString(ctx) {\n    return Alias.stringify(this, ctx);\n  }\n\n}\n\nPlainValue._defineProperty(Alias, \"default\", true);\n\nfunction findPair(items, key) {\n  const k = key instanceof Scalar ? key.value : key;\n\n  for (const it of items) {\n    if (it instanceof Pair) {\n      if (it.key === key || it.key === k) return it;\n      if (it.key && it.key.value === k) return it;\n    }\n  }\n\n  return undefined;\n}\nclass YAMLMap extends Collection {\n  add(pair, overwrite) {\n    if (!pair) pair = new Pair(pair);else if (!(pair instanceof Pair)) pair = new Pair(pair.key || pair, pair.value);\n    const prev = findPair(this.items, pair.key);\n    const sortEntries = this.schema && this.schema.sortMapEntries;\n\n    if (prev) {\n      if (overwrite) prev.value = pair.value;else throw new Error(`Key ${pair.key} already set`);\n    } else if (sortEntries) {\n      const i = this.items.findIndex(item => sortEntries(pair, item) < 0);\n      if (i === -1) this.items.push(pair);else this.items.splice(i, 0, pair);\n    } else {\n      this.items.push(pair);\n    }\n  }\n\n  delete(key) {\n    const it = findPair(this.items, key);\n    if (!it) return false;\n    const del = this.items.splice(this.items.indexOf(it), 1);\n    return del.length > 0;\n  }\n\n  get(key, keepScalar) {\n    const it = findPair(this.items, key);\n    const node = it && it.value;\n    return !keepScalar && node instanceof Scalar ? 
node.value : node;\n  }\n\n  has(key) {\n    return !!findPair(this.items, key);\n  }\n\n  set(key, value) {\n    this.add(new Pair(key, value), true);\n  }\n  /**\n   * @param {*} arg ignored\n   * @param {*} ctx Conversion context, originally set in Document#toJSON()\n   * @param {Class} Type If set, forces the returned collection type\n   * @returns {*} Instance of Type, Map, or Object\n   */\n\n\n  toJSON(_, ctx, Type) {\n    const map = Type ? new Type() : ctx && ctx.mapAsMap ? new Map() : {};\n    if (ctx && ctx.onCreate) ctx.onCreate(map);\n\n    for (const item of this.items) item.addToJSMap(ctx, map);\n\n    return map;\n  }\n\n  toString(ctx, onComment, onChompKeep) {\n    if (!ctx) return JSON.stringify(this);\n\n    for (const item of this.items) {\n      if (!(item instanceof Pair)) throw new Error(`Map items must all be pairs; found ${JSON.stringify(item)} instead`);\n    }\n\n    return super.toString(ctx, {\n      blockItem: n => n.str,\n      flowChars: {\n        start: '{',\n        end: '}'\n      },\n      isMap: true,\n      itemIndent: ctx.indent || ''\n    }, onComment, onChompKeep);\n  }\n\n}\n\nconst MERGE_KEY = '<<';\nclass Merge extends Pair {\n  constructor(pair) {\n    if (pair instanceof Pair) {\n      let seq = pair.value;\n\n      if (!(seq instanceof YAMLSeq)) {\n        seq = new YAMLSeq();\n        seq.items.push(pair.value);\n        seq.range = pair.value.range;\n      }\n\n      super(pair.key, seq);\n      this.range = pair.range;\n    } else {\n      super(new Scalar(MERGE_KEY), new YAMLSeq());\n    }\n\n    this.type = Pair.Type.MERGE_PAIR;\n  } // If the value associated with a merge key is a single mapping node, each of\n  // its key/value pairs is inserted into the current mapping, unless the key\n  // already exists in it. 
If the value associated with the merge key is a\n  // sequence, then this sequence is expected to contain mapping nodes and each\n  // of these nodes is merged in turn according to its order in the sequence.\n  // Keys in mapping nodes earlier in the sequence override keys specified in\n  // later mapping nodes. -- http://yaml.org/type/merge.html\n\n\n  addToJSMap(ctx, map) {\n    for (const {\n      source\n    } of this.value.items) {\n      if (!(source instanceof YAMLMap)) throw new Error('Merge sources must be maps');\n      const srcMap = source.toJSON(null, ctx, Map);\n\n      for (const [key, value] of srcMap) {\n        if (map instanceof Map) {\n          if (!map.has(key)) map.set(key, value);\n        } else if (map instanceof Set) {\n          map.add(key);\n        } else {\n          if (!Object.prototype.hasOwnProperty.call(map, key)) map[key] = value;\n        }\n      }\n    }\n\n    return map;\n  }\n\n  toString(ctx, onComment) {\n    const seq = this.value;\n    if (seq.items.length > 1) return super.toString(ctx, onComment);\n    this.value = seq.items[0];\n    const str = super.toString(ctx, onComment);\n    this.value = seq;\n    return str;\n  }\n\n}\n\nconst binaryOptions = {\n  defaultType: PlainValue.Type.BLOCK_LITERAL,\n  lineWidth: 76\n};\nconst boolOptions = {\n  trueStr: 'true',\n  falseStr: 'false'\n};\nconst intOptions = {\n  asBigInt: false\n};\nconst nullOptions = {\n  nullStr: 'null'\n};\nconst strOptions = {\n  defaultType: PlainValue.Type.PLAIN,\n  doubleQuoted: {\n    jsonEncoding: false,\n    minMultiLineLength: 40\n  },\n  fold: {\n    lineWidth: 80,\n    minContentWidth: 20\n  }\n};\n\nfunction resolveScalar(str, tags, scalarFallback) {\n  for (const {\n    format,\n    test,\n    resolve\n  } of tags) {\n    if (test) {\n      const match = str.match(test);\n\n      if (match) {\n        let res = resolve.apply(null, match);\n        if (!(res instanceof Scalar)) res = new Scalar(res);\n        if (format) res.format = 
format;\n        return res;\n      }\n    }\n  }\n\n  if (scalarFallback) str = scalarFallback(str);\n  return new Scalar(str);\n}\n\nconst FOLD_FLOW = 'flow';\nconst FOLD_BLOCK = 'block';\nconst FOLD_QUOTED = 'quoted'; // presumes i+1 is at the start of a line\n// returns index of last newline in more-indented block\n\nconst consumeMoreIndentedLines = (text, i) => {\n  let ch = text[i + 1];\n\n  while (ch === ' ' || ch === '\\t') {\n    do {\n      ch = text[i += 1];\n    } while (ch && ch !== '\\n');\n\n    ch = text[i + 1];\n  }\n\n  return i;\n};\n/**\n * Tries to keep input at up to `lineWidth` characters, splitting only on spaces\n * not followed by newlines or spaces unless `mode` is `'quoted'`. Lines are\n * terminated with `\\n` and started with `indent`.\n *\n * @param {string} text\n * @param {string} indent\n * @param {string} [mode='flow'] `'block'` prevents more-indented lines\n *   from being folded; `'quoted'` allows for `\\` escapes, including escaped\n *   newlines\n * @param {Object} options\n * @param {number} [options.indentAtStart] Accounts for leading contents on\n *   the first line, defaulting to `indent.length`\n * @param {number} [options.lineWidth=80]\n * @param {number} [options.minContentWidth=20] Allow highly indented lines to\n *   stretch the line width\n * @param {function} options.onFold Called once if the text is folded\n * @param {function} options.onOverflow Called once if any line of text exceeds\n *   lineWidth characters\n */\n\n\nfunction foldFlowLines(text, indent, mode, {\n  indentAtStart,\n  lineWidth = 80,\n  minContentWidth = 20,\n  onFold,\n  onOverflow\n}) {\n  if (!lineWidth || lineWidth < 0) return text;\n  const endStep = Math.max(1 + minContentWidth, 1 + lineWidth - indent.length);\n  if (text.length <= endStep) return text;\n  const folds = [];\n  const escapedFolds = {};\n  let end = lineWidth - (typeof indentAtStart === 'number' ? 
indentAtStart : indent.length);\n  let split = undefined;\n  let prev = undefined;\n  let overflow = false;\n  let i = -1;\n\n  if (mode === FOLD_BLOCK) {\n    i = consumeMoreIndentedLines(text, i);\n    if (i !== -1) end = i + endStep;\n  }\n\n  for (let ch; ch = text[i += 1];) {\n    if (mode === FOLD_QUOTED && ch === '\\\\') {\n      switch (text[i + 1]) {\n        case 'x':\n          i += 3;\n          break;\n\n        case 'u':\n          i += 5;\n          break;\n\n        case 'U':\n          i += 9;\n          break;\n\n        default:\n          i += 1;\n      }\n    }\n\n    if (ch === '\\n') {\n      if (mode === FOLD_BLOCK) i = consumeMoreIndentedLines(text, i);\n      end = i + endStep;\n      split = undefined;\n    } else {\n      if (ch === ' ' && prev && prev !== ' ' && prev !== '\\n' && prev !== '\\t') {\n        // space surrounded by non-space can be replaced with newline + indent\n        const next = text[i + 1];\n        if (next && next !== ' ' && next !== '\\n' && next !== '\\t') split = i;\n      }\n\n      if (i >= end) {\n        if (split) {\n          folds.push(split);\n          end = split + endStep;\n          split = undefined;\n        } else if (mode === FOLD_QUOTED) {\n          // white-space collected at end may stretch past lineWidth\n          while (prev === ' ' || prev === '\\t') {\n            prev = ch;\n            ch = text[i += 1];\n            overflow = true;\n          } // i - 2 accounts for not-dropped last char + newline-escaping \\\n\n\n          folds.push(i - 2);\n          escapedFolds[i - 2] = true;\n          end = i - 2 + endStep;\n          split = undefined;\n        } else {\n          overflow = true;\n        }\n      }\n    }\n\n    prev = ch;\n  }\n\n  if (overflow && onOverflow) onOverflow();\n  if (folds.length === 0) return text;\n  if (onFold) onFold();\n  let res = text.slice(0, folds[0]);\n\n  for (let i = 0; i < folds.length; ++i) {\n    const fold = folds[i];\n    const end = folds[i + 
1] || text.length;\n    if (mode === FOLD_QUOTED && escapedFolds[fold]) res += `${text[fold]}\\\\`;\n    res += `\\n${indent}${text.slice(fold + 1, end)}`;\n  }\n\n  return res;\n}\n\nconst getFoldOptions = ({\n  indentAtStart\n}) => indentAtStart ? Object.assign({\n  indentAtStart\n}, strOptions.fold) : strOptions.fold; // Also checks for lines starting with %, as parsing the output as YAML 1.1 will\n// presume that's starting a new document.\n\n\nconst containsDocumentMarker = str => /^(%|---|\\.\\.\\.)/m.test(str);\n\nfunction lineLengthOverLimit(str, limit) {\n  const strLen = str.length;\n  if (strLen <= limit) return false;\n\n  for (let i = 0, start = 0; i < strLen; ++i) {\n    if (str[i] === '\\n') {\n      if (i - start > limit) return true;\n      start = i + 1;\n      if (strLen - start <= limit) return false;\n    }\n  }\n\n  return true;\n}\n\nfunction doubleQuotedString(value, ctx) {\n  const {\n    implicitKey\n  } = ctx;\n  const {\n    jsonEncoding,\n    minMultiLineLength\n  } = strOptions.doubleQuoted;\n  const json = JSON.stringify(value);\n  if (jsonEncoding) return json;\n  const indent = ctx.indent || (containsDocumentMarker(value) ? 
'  ' : '');\n  let str = '';\n  let start = 0;\n\n  for (let i = 0, ch = json[i]; ch; ch = json[++i]) {\n    if (ch === ' ' && json[i + 1] === '\\\\' && json[i + 2] === 'n') {\n      // space before newline needs to be escaped to not be folded\n      str += json.slice(start, i) + '\\\\ ';\n      i += 1;\n      start = i;\n      ch = '\\\\';\n    }\n\n    if (ch === '\\\\') switch (json[i + 1]) {\n      case 'u':\n        {\n          str += json.slice(start, i);\n          const code = json.substr(i + 2, 4);\n\n          switch (code) {\n            case '0000':\n              str += '\\\\0';\n              break;\n\n            case '0007':\n              str += '\\\\a';\n              break;\n\n            case '000b':\n              str += '\\\\v';\n              break;\n\n            case '001b':\n              str += '\\\\e';\n              break;\n\n            case '0085':\n              str += '\\\\N';\n              break;\n\n            case '00a0':\n              str += '\\\\_';\n              break;\n\n            case '2028':\n              str += '\\\\L';\n              break;\n\n            case '2029':\n              str += '\\\\P';\n              break;\n\n            default:\n              if (code.substr(0, 2) === '00') str += '\\\\x' + code.substr(2);else str += json.substr(i, 6);\n          }\n\n          i += 5;\n          start = i + 1;\n        }\n        break;\n\n      case 'n':\n        if (implicitKey || json[i + 2] === '\"' || json.length < minMultiLineLength) {\n          i += 1;\n        } else {\n          // folding will eat first newline\n          str += json.slice(start, i) + '\\n\\n';\n\n          while (json[i + 2] === '\\\\' && json[i + 3] === 'n' && json[i + 4] !== '\"') {\n            str += '\\n';\n            i += 2;\n          }\n\n          str += indent; // space after newline needs to be escaped to not be folded\n\n          if (json[i + 2] === ' ') str += '\\\\';\n          i += 1;\n          start = i + 1;\n        
}\n\n        break;\n\n      default:\n        i += 1;\n    }\n  }\n\n  str = start ? str + json.slice(start) : json;\n  return implicitKey ? str : foldFlowLines(str, indent, FOLD_QUOTED, getFoldOptions(ctx));\n}\n\nfunction singleQuotedString(value, ctx) {\n  if (ctx.implicitKey) {\n    if (/\\n/.test(value)) return doubleQuotedString(value, ctx);\n  } else {\n    // single quoted string can't have leading or trailing whitespace around newline\n    if (/[ \\t]\\n|\\n[ \\t]/.test(value)) return doubleQuotedString(value, ctx);\n  }\n\n  const indent = ctx.indent || (containsDocumentMarker(value) ? '  ' : '');\n  const res = \"'\" + value.replace(/'/g, \"''\").replace(/\\n+/g, `$&\\n${indent}`) + \"'\";\n  return ctx.implicitKey ? res : foldFlowLines(res, indent, FOLD_FLOW, getFoldOptions(ctx));\n}\n\nfunction blockString({\n  comment,\n  type,\n  value\n}, ctx, onComment, onChompKeep) {\n  // 1. Block can't end in whitespace unless the last line is non-empty.\n  // 2. Strings consisting of only whitespace are best rendered explicitly.\n  if (/\\n[\\t ]+$/.test(value) || /^\\s*$/.test(value)) {\n    return doubleQuotedString(value, ctx);\n  }\n\n  const indent = ctx.indent || (ctx.forceBlockIndent || containsDocumentMarker(value) ? '  ' : '');\n  const indentSize = indent ? '2' : '1'; // root is at -1\n\n  const literal = type === PlainValue.Type.BLOCK_FOLDED ? false : type === PlainValue.Type.BLOCK_LITERAL ? true : !lineLengthOverLimit(value, strOptions.fold.lineWidth - indent.length);\n  let header = literal ? 
'|' : '>';\n  if (!value) return header + '\\n';\n  let wsStart = '';\n  let wsEnd = '';\n  value = value.replace(/[\\n\\t ]*$/, ws => {\n    const n = ws.indexOf('\\n');\n\n    if (n === -1) {\n      header += '-'; // strip\n    } else if (value === ws || n !== ws.length - 1) {\n      header += '+'; // keep\n\n      if (onChompKeep) onChompKeep();\n    }\n\n    wsEnd = ws.replace(/\\n$/, '');\n    return '';\n  }).replace(/^[\\n ]*/, ws => {\n    if (ws.indexOf(' ') !== -1) header += indentSize;\n    const m = ws.match(/ +$/);\n\n    if (m) {\n      wsStart = ws.slice(0, -m[0].length);\n      return m[0];\n    } else {\n      wsStart = ws;\n      return '';\n    }\n  });\n  if (wsEnd) wsEnd = wsEnd.replace(/\\n+(?!\\n|$)/g, `$&${indent}`);\n  if (wsStart) wsStart = wsStart.replace(/\\n+/g, `$&${indent}`);\n\n  if (comment) {\n    header += ' #' + comment.replace(/ ?[\\r\\n]+/g, ' ');\n    if (onComment) onComment();\n  }\n\n  if (!value) return `${header}${indentSize}\\n${indent}${wsEnd}`;\n\n  if (literal) {\n    value = value.replace(/\\n+/g, `$&${indent}`);\n    return `${header}\\n${indent}${wsStart}${value}${wsEnd}`;\n  }\n\n  value = value.replace(/\\n+/g, '\\n$&').replace(/(?:^|\\n)([\\t ].*)(?:([\\n\\t ]*)\\n(?![\\n\\t ]))?/g, '$1$2') // more-indented lines aren't folded\n  //         ^ ind.line  ^ empty     ^ capture next empty lines only at end of indent\n  .replace(/\\n+/g, `$&${indent}`);\n  const body = foldFlowLines(`${wsStart}${value}${wsEnd}`, indent, FOLD_BLOCK, strOptions.fold);\n  return `${header}\\n${indent}${body}`;\n}\n\nfunction plainString(item, ctx, onComment, onChompKeep) {\n  const {\n    comment,\n    type,\n    value\n  } = item;\n  const {\n    actualString,\n    implicitKey,\n    indent,\n    inFlow\n  } = ctx;\n\n  if (implicitKey && /[\\n[\\]{},]/.test(value) || inFlow && /[[\\]{},]/.test(value)) {\n    return doubleQuotedString(value, ctx);\n  }\n\n  if (!value || /^[\\n\\t ,[\\]{}#&*!|>'\"%@`]|^[?-]$|^[?-][ \\t]|[\\n:][ \\t]|[ 
\\t]\\n|[\\n\\t ]#|[\\n\\t :]$/.test(value)) {\n    // not allowed:\n    // - empty string, '-' or '?'\n    // - start with an indicator character (except [?:-]) or /[?-] /\n    // - '\\n ', ': ' or ' \\n' anywhere\n    // - '#' not preceded by a non-space char\n    // - end with ' ' or ':'\n    return implicitKey || inFlow || value.indexOf('\\n') === -1 ? value.indexOf('\"') !== -1 && value.indexOf(\"'\") === -1 ? singleQuotedString(value, ctx) : doubleQuotedString(value, ctx) : blockString(item, ctx, onComment, onChompKeep);\n  }\n\n  if (!implicitKey && !inFlow && type !== PlainValue.Type.PLAIN && value.indexOf('\\n') !== -1) {\n    // Where allowed & type not set explicitly, prefer block style for multiline strings\n    return blockString(item, ctx, onComment, onChompKeep);\n  }\n\n  if (indent === '' && containsDocumentMarker(value)) {\n    ctx.forceBlockIndent = true;\n    return blockString(item, ctx, onComment, onChompKeep);\n  }\n\n  const str = value.replace(/\\n+/g, `$&\\n${indent}`); // Verify that output will be parsed as a string, as e.g. plain numbers and\n  // booleans get parsed with those types in v1.2 (e.g. '42', 'true' & '0.9e-3'),\n  // and others in v1.1.\n\n  if (actualString) {\n    const {\n      tags\n    } = ctx.doc.schema;\n    const resolved = resolveScalar(str, tags, tags.scalarFallback).value;\n    if (typeof resolved !== 'string') return doubleQuotedString(value, ctx);\n  }\n\n  const body = implicitKey ? 
str : foldFlowLines(str, indent, FOLD_FLOW, getFoldOptions(ctx));\n\n  if (comment && !inFlow && (body.indexOf('\\n') !== -1 || comment.indexOf('\\n') !== -1)) {\n    if (onComment) onComment();\n    return addCommentBefore(body, indent, comment);\n  }\n\n  return body;\n}\n\nfunction stringifyString(item, ctx, onComment, onChompKeep) {\n  const {\n    defaultType\n  } = strOptions;\n  const {\n    implicitKey,\n    inFlow\n  } = ctx;\n  let {\n    type,\n    value\n  } = item;\n\n  if (typeof value !== 'string') {\n    value = String(value);\n    item = Object.assign({}, item, {\n      value\n    });\n  }\n\n  const _stringify = _type => {\n    switch (_type) {\n      case PlainValue.Type.BLOCK_FOLDED:\n      case PlainValue.Type.BLOCK_LITERAL:\n        return blockString(item, ctx, onComment, onChompKeep);\n\n      case PlainValue.Type.QUOTE_DOUBLE:\n        return doubleQuotedString(value, ctx);\n\n      case PlainValue.Type.QUOTE_SINGLE:\n        return singleQuotedString(value, ctx);\n\n      case PlainValue.Type.PLAIN:\n        return plainString(item, ctx, onComment, onChompKeep);\n\n      default:\n        return null;\n    }\n  };\n\n  if (type !== PlainValue.Type.QUOTE_DOUBLE && /[\\x00-\\x08\\x0b-\\x1f\\x7f-\\x9f]/.test(value)) {\n    // force double quotes on control characters\n    type = PlainValue.Type.QUOTE_DOUBLE;\n  } else if ((implicitKey || inFlow) && (type === PlainValue.Type.BLOCK_FOLDED || type === PlainValue.Type.BLOCK_LITERAL)) {\n    // should not happen; blocks are not valid inside flow containers\n    type = PlainValue.Type.QUOTE_DOUBLE;\n  }\n\n  let res = _stringify(type);\n\n  if (res === null) {\n    res = _stringify(defaultType);\n    if (res === null) throw new Error(`Unsupported default string type ${defaultType}`);\n  }\n\n  return res;\n}\n\nfunction stringifyNumber({\n  format,\n  minFractionDigits,\n  tag,\n  value\n}) {\n  if (typeof value === 'bigint') return String(value);\n  if (!isFinite(value)) return isNaN(value) ? 
'.nan' : value < 0 ? '-.inf' : '.inf';\n  let n = JSON.stringify(value);\n\n  if (!format && minFractionDigits && (!tag || tag === 'tag:yaml.org,2002:float') && /^\\d/.test(n)) {\n    let i = n.indexOf('.');\n\n    if (i < 0) {\n      i = n.length;\n      n += '.';\n    }\n\n    let d = minFractionDigits - (n.length - i - 1);\n\n    while (d-- > 0) n += '0';\n  }\n\n  return n;\n}\n\nfunction checkFlowCollectionEnd(errors, cst) {\n  let char, name;\n\n  switch (cst.type) {\n    case PlainValue.Type.FLOW_MAP:\n      char = '}';\n      name = 'flow map';\n      break;\n\n    case PlainValue.Type.FLOW_SEQ:\n      char = ']';\n      name = 'flow sequence';\n      break;\n\n    default:\n      errors.push(new PlainValue.YAMLSemanticError(cst, 'Not a flow collection!?'));\n      return;\n  }\n\n  let lastItem;\n\n  for (let i = cst.items.length - 1; i >= 0; --i) {\n    const item = cst.items[i];\n\n    if (!item || item.type !== PlainValue.Type.COMMENT) {\n      lastItem = item;\n      break;\n    }\n  }\n\n  if (lastItem && lastItem.char !== char) {\n    const msg = `Expected ${name} to end with ${char}`;\n    let err;\n\n    if (typeof lastItem.offset === 'number') {\n      err = new PlainValue.YAMLSemanticError(cst, msg);\n      err.offset = lastItem.offset + 1;\n    } else {\n      err = new PlainValue.YAMLSemanticError(lastItem, msg);\n      if (lastItem.range && lastItem.range.end) err.offset = lastItem.range.end - lastItem.range.start;\n    }\n\n    errors.push(err);\n  }\n}\nfunction checkFlowCommentSpace(errors, comment) {\n  const prev = comment.context.src[comment.range.start - 1];\n\n  if (prev !== '\\n' && prev !== '\\t' && prev !== ' ') {\n    const msg = 'Comments must be separated from other tokens by white space characters';\n    errors.push(new PlainValue.YAMLSemanticError(comment, msg));\n  }\n}\nfunction getLongKeyError(source, key) {\n  const sk = String(key);\n  const k = sk.substr(0, 8) + '...' 
+ sk.substr(-8);\n  return new PlainValue.YAMLSemanticError(source, `The \"${k}\" key is too long`);\n}\nfunction resolveComments(collection, comments) {\n  for (const {\n    afterKey,\n    before,\n    comment\n  } of comments) {\n    let item = collection.items[before];\n\n    if (!item) {\n      if (comment !== undefined) {\n        if (collection.comment) collection.comment += '\\n' + comment;else collection.comment = comment;\n      }\n    } else {\n      if (afterKey && item.value) item = item.value;\n\n      if (comment === undefined) {\n        if (afterKey || !item.commentBefore) item.spaceBefore = true;\n      } else {\n        if (item.commentBefore) item.commentBefore += '\\n' + comment;else item.commentBefore = comment;\n      }\n    }\n  }\n}\n\n// on error, will return { str: string, errors: Error[] }\nfunction resolveString(doc, node) {\n  const res = node.strValue;\n  if (!res) return '';\n  if (typeof res === 'string') return res;\n  res.errors.forEach(error => {\n    if (!error.source) error.source = node;\n    doc.errors.push(error);\n  });\n  return res.str;\n}\n\nfunction resolveTagHandle(doc, node) {\n  const {\n    handle,\n    suffix\n  } = node.tag;\n  let prefix = doc.tagPrefixes.find(p => p.handle === handle);\n\n  if (!prefix) {\n    const dtp = doc.getDefaults().tagPrefixes;\n    if (dtp) prefix = dtp.find(p => p.handle === handle);\n    if (!prefix) throw new PlainValue.YAMLSemanticError(node, `The ${handle} tag handle is non-default and was not declared.`);\n  }\n\n  if (!suffix) throw new PlainValue.YAMLSemanticError(node, `The ${handle} tag has no suffix.`);\n\n  if (handle === '!' 
&& (doc.version || doc.options.version) === '1.0') {\n    if (suffix[0] === '^') {\n      doc.warnings.push(new PlainValue.YAMLWarning(node, 'YAML 1.0 ^ tag expansion is not supported'));\n      return suffix;\n    }\n\n    if (/[:/]/.test(suffix)) {\n      // word/foo -> tag:word.yaml.org,2002:foo\n      const vocab = suffix.match(/^([a-z0-9-]+)\\/(.*)/i);\n      return vocab ? `tag:${vocab[1]}.yaml.org,2002:${vocab[2]}` : `tag:${suffix}`;\n    }\n  }\n\n  return prefix.prefix + decodeURIComponent(suffix);\n}\n\nfunction resolveTagName(doc, node) {\n  const {\n    tag,\n    type\n  } = node;\n  let nonSpecific = false;\n\n  if (tag) {\n    const {\n      handle,\n      suffix,\n      verbatim\n    } = tag;\n\n    if (verbatim) {\n      if (verbatim !== '!' && verbatim !== '!!') return verbatim;\n      const msg = `Verbatim tags aren't resolved, so ${verbatim} is invalid.`;\n      doc.errors.push(new PlainValue.YAMLSemanticError(node, msg));\n    } else if (handle === '!' && !suffix) {\n      nonSpecific = true;\n    } else {\n      try {\n        return resolveTagHandle(doc, node);\n      } catch (error) {\n        doc.errors.push(error);\n      }\n    }\n  }\n\n  switch (type) {\n    case PlainValue.Type.BLOCK_FOLDED:\n    case PlainValue.Type.BLOCK_LITERAL:\n    case PlainValue.Type.QUOTE_DOUBLE:\n    case PlainValue.Type.QUOTE_SINGLE:\n      return PlainValue.defaultTags.STR;\n\n    case PlainValue.Type.FLOW_MAP:\n    case PlainValue.Type.MAP:\n      return PlainValue.defaultTags.MAP;\n\n    case PlainValue.Type.FLOW_SEQ:\n    case PlainValue.Type.SEQ:\n      return PlainValue.defaultTags.SEQ;\n\n    case PlainValue.Type.PLAIN:\n      return nonSpecific ? 
PlainValue.defaultTags.STR : null;\n\n    default:\n      return null;\n  }\n}\n\nfunction resolveByTagName(doc, node, tagName) {\n  const {\n    tags\n  } = doc.schema;\n  const matchWithTest = [];\n\n  for (const tag of tags) {\n    if (tag.tag === tagName) {\n      if (tag.test) matchWithTest.push(tag);else {\n        const res = tag.resolve(doc, node);\n        return res instanceof Collection ? res : new Scalar(res);\n      }\n    }\n  }\n\n  const str = resolveString(doc, node);\n  if (typeof str === 'string' && matchWithTest.length > 0) return resolveScalar(str, matchWithTest, tags.scalarFallback);\n  return null;\n}\n\nfunction getFallbackTagName({\n  type\n}) {\n  switch (type) {\n    case PlainValue.Type.FLOW_MAP:\n    case PlainValue.Type.MAP:\n      return PlainValue.defaultTags.MAP;\n\n    case PlainValue.Type.FLOW_SEQ:\n    case PlainValue.Type.SEQ:\n      return PlainValue.defaultTags.SEQ;\n\n    default:\n      return PlainValue.defaultTags.STR;\n  }\n}\n\nfunction resolveTag(doc, node, tagName) {\n  try {\n    const res = resolveByTagName(doc, node, tagName);\n\n    if (res) {\n      if (tagName && node.tag) res.tag = tagName;\n      return res;\n    }\n  } catch (error) {\n    /* istanbul ignore if */\n    if (!error.source) error.source = node;\n    doc.errors.push(error);\n    return null;\n  }\n\n  try {\n    const fallback = getFallbackTagName(node);\n    if (!fallback) throw new Error(`The tag ${tagName} is unavailable`);\n    const msg = `The tag ${tagName} is unavailable, falling back to ${fallback}`;\n    doc.warnings.push(new PlainValue.YAMLWarning(node, msg));\n    const res = resolveByTagName(doc, node, fallback);\n    res.tag = tagName;\n    return res;\n  } catch (error) {\n    const refError = new PlainValue.YAMLReferenceError(node, error.message);\n    refError.stack = error.stack;\n    doc.errors.push(refError);\n    return null;\n  }\n}\n\nconst isCollectionItem = node => {\n  if (!node) return false;\n  const {\n    type\n  } = 
node;\n  return type === PlainValue.Type.MAP_KEY || type === PlainValue.Type.MAP_VALUE || type === PlainValue.Type.SEQ_ITEM;\n};\n\nfunction resolveNodeProps(errors, node) {\n  const comments = {\n    before: [],\n    after: []\n  };\n  let hasAnchor = false;\n  let hasTag = false;\n  const props = isCollectionItem(node.context.parent) ? node.context.parent.props.concat(node.props) : node.props;\n\n  for (const {\n    start,\n    end\n  } of props) {\n    switch (node.context.src[start]) {\n      case PlainValue.Char.COMMENT:\n        {\n          if (!node.commentHasRequiredWhitespace(start)) {\n            const msg = 'Comments must be separated from other tokens by white space characters';\n            errors.push(new PlainValue.YAMLSemanticError(node, msg));\n          }\n\n          const {\n            header,\n            valueRange\n          } = node;\n          const cc = valueRange && (start > valueRange.start || header && start > header.start) ? comments.after : comments.before;\n          cc.push(node.context.src.slice(start + 1, end));\n          break;\n        }\n      // Actual anchor & tag resolution is handled by schema, here we just complain\n\n      case PlainValue.Char.ANCHOR:\n        if (hasAnchor) {\n          const msg = 'A node can have at most one anchor';\n          errors.push(new PlainValue.YAMLSemanticError(node, msg));\n        }\n\n        hasAnchor = true;\n        break;\n\n      case PlainValue.Char.TAG:\n        if (hasTag) {\n          const msg = 'A node can have at most one tag';\n          errors.push(new PlainValue.YAMLSemanticError(node, msg));\n        }\n\n        hasTag = true;\n        break;\n    }\n  }\n\n  return {\n    comments,\n    hasAnchor,\n    hasTag\n  };\n}\n\nfunction resolveNodeValue(doc, node) {\n  const {\n    anchors,\n    errors,\n    schema\n  } = doc;\n\n  if (node.type === PlainValue.Type.ALIAS) {\n    const name = node.rawValue;\n    const src = anchors.getNode(name);\n\n    if (!src) {\n      
const msg = `Aliased anchor not found: ${name}`;\n      errors.push(new PlainValue.YAMLReferenceError(node, msg));\n      return null;\n    } // Lazy resolution for circular references\n\n\n    const res = new Alias(src);\n\n    anchors._cstAliases.push(res);\n\n    return res;\n  }\n\n  const tagName = resolveTagName(doc, node);\n  if (tagName) return resolveTag(doc, node, tagName);\n\n  if (node.type !== PlainValue.Type.PLAIN) {\n    const msg = `Failed to resolve ${node.type} node here`;\n    errors.push(new PlainValue.YAMLSyntaxError(node, msg));\n    return null;\n  }\n\n  try {\n    const str = resolveString(doc, node);\n    return resolveScalar(str, schema.tags, schema.tags.scalarFallback);\n  } catch (error) {\n    if (!error.source) error.source = node;\n    errors.push(error);\n    return null;\n  }\n} // sets node.resolved on success\n\n\nfunction resolveNode(doc, node) {\n  if (!node) return null;\n  if (node.error) doc.errors.push(node.error);\n  const {\n    comments,\n    hasAnchor,\n    hasTag\n  } = resolveNodeProps(doc.errors, node);\n\n  if (hasAnchor) {\n    const {\n      anchors\n    } = doc;\n    const name = node.anchor;\n    const prev = anchors.getNode(name); // At this point, aliases for any preceding node with the same anchor\n    // name have already been resolved, so it may safely be renamed.\n\n    if (prev) anchors.map[anchors.newName(name)] = prev; // During parsing, we need to store the CST node in anchors.map as\n    // anchors need to be available during resolution to allow for\n    // circular references.\n\n    anchors.map[name] = node;\n  }\n\n  if (node.type === PlainValue.Type.ALIAS && (hasAnchor || hasTag)) {\n    const msg = 'An alias node must not specify any properties';\n    doc.errors.push(new PlainValue.YAMLSemanticError(node, msg));\n  }\n\n  const res = resolveNodeValue(doc, node);\n\n  if (res) {\n    res.range = [node.range.start, node.range.end];\n    if (doc.options.keepCstNodes) res.cstNode = node;\n    if 
(doc.options.keepNodeTypes) res.type = node.type;\n    const cb = comments.before.join('\\n');\n\n    if (cb) {\n      res.commentBefore = res.commentBefore ? `${res.commentBefore}\\n${cb}` : cb;\n    }\n\n    const ca = comments.after.join('\\n');\n    if (ca) res.comment = res.comment ? `${res.comment}\\n${ca}` : ca;\n  }\n\n  return node.resolved = res;\n}\n\nfunction resolveMap(doc, cst) {\n  if (cst.type !== PlainValue.Type.MAP && cst.type !== PlainValue.Type.FLOW_MAP) {\n    const msg = `A ${cst.type} node cannot be resolved as a mapping`;\n    doc.errors.push(new PlainValue.YAMLSyntaxError(cst, msg));\n    return null;\n  }\n\n  const {\n    comments,\n    items\n  } = cst.type === PlainValue.Type.FLOW_MAP ? resolveFlowMapItems(doc, cst) : resolveBlockMapItems(doc, cst);\n  const map = new YAMLMap();\n  map.items = items;\n  resolveComments(map, comments);\n  let hasCollectionKey = false;\n\n  for (let i = 0; i < items.length; ++i) {\n    const {\n      key: iKey\n    } = items[i];\n    if (iKey instanceof Collection) hasCollectionKey = true;\n\n    if (doc.schema.merge && iKey && iKey.value === MERGE_KEY) {\n      items[i] = new Merge(items[i]);\n      const sources = items[i].value.items;\n      let error = null;\n      sources.some(node => {\n        if (node instanceof Alias) {\n          // During parsing, alias sources are CST nodes; to account for\n          // circular references their resolved values can't be used here.\n          const {\n            type\n          } = node.source;\n          if (type === PlainValue.Type.MAP || type === PlainValue.Type.FLOW_MAP) return false;\n          return error = 'Merge nodes aliases can only point to maps';\n        }\n\n        return error = 'Merge nodes can only have Alias nodes as values';\n      });\n      if (error) doc.errors.push(new PlainValue.YAMLSemanticError(cst, error));\n    } else {\n      for (let j = i + 1; j < items.length; ++j) {\n        const {\n          key: jKey\n        } = 
items[j];\n\n        if (iKey === jKey || iKey && jKey && Object.prototype.hasOwnProperty.call(iKey, 'value') && iKey.value === jKey.value) {\n          const msg = `Map keys must be unique; \"${iKey}\" is repeated`;\n          doc.errors.push(new PlainValue.YAMLSemanticError(cst, msg));\n          break;\n        }\n      }\n    }\n  }\n\n  if (hasCollectionKey && !doc.options.mapAsMap) {\n    const warn = 'Keys with collection values will be stringified as YAML due to JS Object restrictions. Use mapAsMap: true to avoid this.';\n    doc.warnings.push(new PlainValue.YAMLWarning(cst, warn));\n  }\n\n  cst.resolved = map;\n  return map;\n}\n\nconst valueHasPairComment = ({\n  context: {\n    lineStart,\n    node,\n    src\n  },\n  props\n}) => {\n  if (props.length === 0) return false;\n  const {\n    start\n  } = props[0];\n  if (node && start > node.valueRange.start) return false;\n  if (src[start] !== PlainValue.Char.COMMENT) return false;\n\n  for (let i = lineStart; i < start; ++i) if (src[i] === '\\n') return false;\n\n  return true;\n};\n\nfunction resolvePairComment(item, pair) {\n  if (!valueHasPairComment(item)) return;\n  const comment = item.getPropValue(0, PlainValue.Char.COMMENT, true);\n  let found = false;\n  const cb = pair.value.commentBefore;\n\n  if (cb && cb.startsWith(comment)) {\n    pair.value.commentBefore = cb.substr(comment.length + 1);\n    found = true;\n  } else {\n    const cc = pair.value.comment;\n\n    if (!item.node && cc && cc.startsWith(comment)) {\n      pair.value.comment = cc.substr(comment.length + 1);\n      found = true;\n    }\n  }\n\n  if (found) pair.comment = comment;\n}\n\nfunction resolveBlockMapItems(doc, cst) {\n  const comments = [];\n  const items = [];\n  let key = undefined;\n  let keyStart = null;\n\n  for (let i = 0; i < cst.items.length; ++i) {\n    const item = cst.items[i];\n\n    switch (item.type) {\n      case PlainValue.Type.BLANK_LINE:\n        comments.push({\n          afterKey: !!key,\n          
before: items.length\n        });\n        break;\n\n      case PlainValue.Type.COMMENT:\n        comments.push({\n          afterKey: !!key,\n          before: items.length,\n          comment: item.comment\n        });\n        break;\n\n      case PlainValue.Type.MAP_KEY:\n        if (key !== undefined) items.push(new Pair(key));\n        if (item.error) doc.errors.push(item.error);\n        key = resolveNode(doc, item.node);\n        keyStart = null;\n        break;\n\n      case PlainValue.Type.MAP_VALUE:\n        {\n          if (key === undefined) key = null;\n          if (item.error) doc.errors.push(item.error);\n\n          if (!item.context.atLineStart && item.node && item.node.type === PlainValue.Type.MAP && !item.node.context.atLineStart) {\n            const msg = 'Nested mappings are not allowed in compact mappings';\n            doc.errors.push(new PlainValue.YAMLSemanticError(item.node, msg));\n          }\n\n          let valueNode = item.node;\n\n          if (!valueNode && item.props.length > 0) {\n            // Comments on an empty mapping value need to be preserved, so we\n            // need to construct a minimal empty node here to use instead of the\n            // missing `item.node`. 
-- eemeli/yaml#19\n            valueNode = new PlainValue.PlainValue(PlainValue.Type.PLAIN, []);\n            valueNode.context = {\n              parent: item,\n              src: item.context.src\n            };\n            const pos = item.range.start + 1;\n            valueNode.range = {\n              start: pos,\n              end: pos\n            };\n            valueNode.valueRange = {\n              start: pos,\n              end: pos\n            };\n\n            if (typeof item.range.origStart === 'number') {\n              const origPos = item.range.origStart + 1;\n              valueNode.range.origStart = valueNode.range.origEnd = origPos;\n              valueNode.valueRange.origStart = valueNode.valueRange.origEnd = origPos;\n            }\n          }\n\n          const pair = new Pair(key, resolveNode(doc, valueNode));\n          resolvePairComment(item, pair);\n          items.push(pair);\n\n          if (key && typeof keyStart === 'number') {\n            if (item.range.start > keyStart + 1024) doc.errors.push(getLongKeyError(cst, key));\n          }\n\n          key = undefined;\n          keyStart = null;\n        }\n        break;\n\n      default:\n        if (key !== undefined) items.push(new Pair(key));\n        key = resolveNode(doc, item);\n        keyStart = item.range.start;\n        if (item.error) doc.errors.push(item.error);\n\n        next: for (let j = i + 1;; ++j) {\n          const nextItem = cst.items[j];\n\n          switch (nextItem && nextItem.type) {\n            case PlainValue.Type.BLANK_LINE:\n            case PlainValue.Type.COMMENT:\n              continue next;\n\n            case PlainValue.Type.MAP_VALUE:\n              break next;\n\n            default:\n              {\n                const msg = 'Implicit map keys need to be followed by map values';\n                doc.errors.push(new PlainValue.YAMLSemanticError(item, msg));\n                break next;\n              }\n          }\n        }\n\n        if 
(item.valueRangeContainsNewline) {\n          const msg = 'Implicit map keys need to be on a single line';\n          doc.errors.push(new PlainValue.YAMLSemanticError(item, msg));\n        }\n\n    }\n  }\n\n  if (key !== undefined) items.push(new Pair(key));\n  return {\n    comments,\n    items\n  };\n}\n\nfunction resolveFlowMapItems(doc, cst) {\n  const comments = [];\n  const items = [];\n  let key = undefined;\n  let explicitKey = false;\n  let next = '{';\n\n  for (let i = 0; i < cst.items.length; ++i) {\n    const item = cst.items[i];\n\n    if (typeof item.char === 'string') {\n      const {\n        char,\n        offset\n      } = item;\n\n      if (char === '?' && key === undefined && !explicitKey) {\n        explicitKey = true;\n        next = ':';\n        continue;\n      }\n\n      if (char === ':') {\n        if (key === undefined) key = null;\n\n        if (next === ':') {\n          next = ',';\n          continue;\n        }\n      } else {\n        if (explicitKey) {\n          if (key === undefined && char !== ',') key = null;\n          explicitKey = false;\n        }\n\n        if (key !== undefined) {\n          items.push(new Pair(key));\n          key = undefined;\n\n          if (char === ',') {\n            next = ':';\n            continue;\n          }\n        }\n      }\n\n      if (char === '}') {\n        if (i === cst.items.length - 1) continue;\n      } else if (char === next) {\n        next = ':';\n        continue;\n      }\n\n      const msg = `Flow map contains an unexpected ${char}`;\n      const err = new PlainValue.YAMLSyntaxError(cst, msg);\n      err.offset = offset;\n      doc.errors.push(err);\n    } else if (item.type === PlainValue.Type.BLANK_LINE) {\n      comments.push({\n        afterKey: !!key,\n        before: items.length\n      });\n    } else if (item.type === PlainValue.Type.COMMENT) {\n      checkFlowCommentSpace(doc.errors, item);\n      comments.push({\n        afterKey: !!key,\n        before: 
items.length,\n        comment: item.comment\n      });\n    } else if (key === undefined) {\n      if (next === ',') doc.errors.push(new PlainValue.YAMLSemanticError(item, 'Separator , missing in flow map'));\n      key = resolveNode(doc, item);\n    } else {\n      if (next !== ',') doc.errors.push(new PlainValue.YAMLSemanticError(item, 'Indicator : missing in flow map entry'));\n      items.push(new Pair(key, resolveNode(doc, item)));\n      key = undefined;\n      explicitKey = false;\n    }\n  }\n\n  checkFlowCollectionEnd(doc.errors, cst);\n  if (key !== undefined) items.push(new Pair(key));\n  return {\n    comments,\n    items\n  };\n}\n\nfunction resolveSeq(doc, cst) {\n  if (cst.type !== PlainValue.Type.SEQ && cst.type !== PlainValue.Type.FLOW_SEQ) {\n    const msg = `A ${cst.type} node cannot be resolved as a sequence`;\n    doc.errors.push(new PlainValue.YAMLSyntaxError(cst, msg));\n    return null;\n  }\n\n  const {\n    comments,\n    items\n  } = cst.type === PlainValue.Type.FLOW_SEQ ? resolveFlowSeqItems(doc, cst) : resolveBlockSeqItems(doc, cst);\n  const seq = new YAMLSeq();\n  seq.items = items;\n  resolveComments(seq, comments);\n\n  if (!doc.options.mapAsMap && items.some(it => it instanceof Pair && it.key instanceof Collection)) {\n    const warn = 'Keys with collection values will be stringified as YAML due to JS Object restrictions. 
Use mapAsMap: true to avoid this.';\n    doc.warnings.push(new PlainValue.YAMLWarning(cst, warn));\n  }\n\n  cst.resolved = seq;\n  return seq;\n}\n\nfunction resolveBlockSeqItems(doc, cst) {\n  const comments = [];\n  const items = [];\n\n  for (let i = 0; i < cst.items.length; ++i) {\n    const item = cst.items[i];\n\n    switch (item.type) {\n      case PlainValue.Type.BLANK_LINE:\n        comments.push({\n          before: items.length\n        });\n        break;\n\n      case PlainValue.Type.COMMENT:\n        comments.push({\n          comment: item.comment,\n          before: items.length\n        });\n        break;\n\n      case PlainValue.Type.SEQ_ITEM:\n        if (item.error) doc.errors.push(item.error);\n        items.push(resolveNode(doc, item.node));\n\n        if (item.hasProps) {\n          const msg = 'Sequence items cannot have tags or anchors before the - indicator';\n          doc.errors.push(new PlainValue.YAMLSemanticError(item, msg));\n        }\n\n        break;\n\n      default:\n        if (item.error) doc.errors.push(item.error);\n        doc.errors.push(new PlainValue.YAMLSyntaxError(item, `Unexpected ${item.type} node in sequence`));\n    }\n  }\n\n  return {\n    comments,\n    items\n  };\n}\n\nfunction resolveFlowSeqItems(doc, cst) {\n  const comments = [];\n  const items = [];\n  let explicitKey = false;\n  let key = undefined;\n  let keyStart = null;\n  let next = '[';\n  let prevItem = null;\n\n  for (let i = 0; i < cst.items.length; ++i) {\n    const item = cst.items[i];\n\n    if (typeof item.char === 'string') {\n      const {\n        char,\n        offset\n      } = item;\n\n      if (char !== ':' && (explicitKey || key !== undefined)) {\n        if (explicitKey && key === undefined) key = next ? 
items.pop() : null;\n        items.push(new Pair(key));\n        explicitKey = false;\n        key = undefined;\n        keyStart = null;\n      }\n\n      if (char === next) {\n        next = null;\n      } else if (!next && char === '?') {\n        explicitKey = true;\n      } else if (next !== '[' && char === ':' && key === undefined) {\n        if (next === ',') {\n          key = items.pop();\n\n          if (key instanceof Pair) {\n            const msg = 'Chaining flow sequence pairs is invalid';\n            const err = new PlainValue.YAMLSemanticError(cst, msg);\n            err.offset = offset;\n            doc.errors.push(err);\n          }\n\n          if (!explicitKey && typeof keyStart === 'number') {\n            const keyEnd = item.range ? item.range.start : item.offset;\n            if (keyEnd > keyStart + 1024) doc.errors.push(getLongKeyError(cst, key));\n            const {\n              src\n            } = prevItem.context;\n\n            for (let i = keyStart; i < keyEnd; ++i) if (src[i] === '\\n') {\n              const msg = 'Implicit keys of flow sequence pairs need to be on a single line';\n              doc.errors.push(new PlainValue.YAMLSemanticError(prevItem, msg));\n              break;\n            }\n          }\n        } else {\n          key = null;\n        }\n\n        keyStart = null;\n        explicitKey = false;\n        next = null;\n      } else if (next === '[' || char !== ']' || i < cst.items.length - 1) {\n        const msg = `Flow sequence contains an unexpected ${char}`;\n        const err = new PlainValue.YAMLSyntaxError(cst, msg);\n        err.offset = offset;\n        doc.errors.push(err);\n      }\n    } else if (item.type === PlainValue.Type.BLANK_LINE) {\n      comments.push({\n        before: items.length\n      });\n    } else if (item.type === PlainValue.Type.COMMENT) {\n      checkFlowCommentSpace(doc.errors, item);\n      comments.push({\n        comment: item.comment,\n        before: items.length\n      
});\n    } else {\n      if (next) {\n        const msg = `Expected a ${next} in flow sequence`;\n        doc.errors.push(new PlainValue.YAMLSemanticError(item, msg));\n      }\n\n      const value = resolveNode(doc, item);\n\n      if (key === undefined) {\n        items.push(value);\n        prevItem = item;\n      } else {\n        items.push(new Pair(key, value));\n        key = undefined;\n      }\n\n      keyStart = item.range.start;\n      next = ',';\n    }\n  }\n\n  checkFlowCollectionEnd(doc.errors, cst);\n  if (key !== undefined) items.push(new Pair(key));\n  return {\n    comments,\n    items\n  };\n}\n\nexports.Alias = Alias;\nexports.Collection = Collection;\nexports.Merge = Merge;\nexports.Node = Node;\nexports.Pair = Pair;\nexports.Scalar = Scalar;\nexports.YAMLMap = YAMLMap;\nexports.YAMLSeq = YAMLSeq;\nexports.addComment = addComment;\nexports.binaryOptions = binaryOptions;\nexports.boolOptions = boolOptions;\nexports.findPair = findPair;\nexports.intOptions = intOptions;\nexports.isEmptyPath = isEmptyPath;\nexports.nullOptions = nullOptions;\nexports.resolveMap = resolveMap;\nexports.resolveNode = resolveNode;\nexports.resolveSeq = resolveSeq;\nexports.resolveString = resolveString;\nexports.strOptions = strOptions;\nexports.stringifyNumber = stringifyNumber;\nexports.stringifyString = stringifyString;\nexports.toJSON = toJSON;\n\n\n/***/ }),\n\n/***/ 7383:\n/***/ ((__unused_webpack_module, exports, __webpack_require__) => {\n\n\"use strict\";\n\n\nvar PlainValue = __webpack_require__(5215);\nvar resolveSeq = __webpack_require__(6140);\n\n/* global atob, btoa, Buffer */\nconst binary = {\n  identify: value => value instanceof Uint8Array,\n  // Buffer inherits from Uint8Array\n  default: false,\n  tag: 'tag:yaml.org,2002:binary',\n\n  /**\n   * Returns a Buffer in node and an Uint8Array in browsers\n   *\n   * To use the resulting buffer as an image, you'll want to do something like:\n   *\n   *   const blob = new Blob([buffer], { type: 
'image/jpeg' })\n   *   document.querySelector('#photo').src = URL.createObjectURL(blob)\n   */\n  resolve: (doc, node) => {\n    const src = resolveSeq.resolveString(doc, node);\n\n    if (typeof Buffer === 'function') {\n      return Buffer.from(src, 'base64');\n    } else if (typeof atob === 'function') {\n      // On IE 11, atob() can't handle newlines\n      const str = atob(src.replace(/[\\n\\r]/g, ''));\n      const buffer = new Uint8Array(str.length);\n\n      for (let i = 0; i < str.length; ++i) buffer[i] = str.charCodeAt(i);\n\n      return buffer;\n    } else {\n      const msg = 'This environment does not support reading binary tags; either Buffer or atob is required';\n      doc.errors.push(new PlainValue.YAMLReferenceError(node, msg));\n      return null;\n    }\n  },\n  options: resolveSeq.binaryOptions,\n  stringify: ({\n    comment,\n    type,\n    value\n  }, ctx, onComment, onChompKeep) => {\n    let src;\n\n    if (typeof Buffer === 'function') {\n      src = value instanceof Buffer ? value.toString('base64') : Buffer.from(value.buffer).toString('base64');\n    } else if (typeof btoa === 'function') {\n      let s = '';\n\n      for (let i = 0; i < value.length; ++i) s += String.fromCharCode(value[i]);\n\n      src = btoa(s);\n    } else {\n      throw new Error('This environment does not support writing binary tags; either Buffer or btoa is required');\n    }\n\n    if (!type) type = resolveSeq.binaryOptions.defaultType;\n\n    if (type === PlainValue.Type.QUOTE_DOUBLE) {\n      value = src;\n    } else {\n      const {\n        lineWidth\n      } = resolveSeq.binaryOptions;\n      const n = Math.ceil(src.length / lineWidth);\n      const lines = new Array(n);\n\n      for (let i = 0, o = 0; i < n; ++i, o += lineWidth) {\n        lines[i] = src.substr(o, lineWidth);\n      }\n\n      value = lines.join(type === PlainValue.Type.BLOCK_LITERAL ? 
'\\n' : ' ');\n    }\n\n    return resolveSeq.stringifyString({\n      comment,\n      type,\n      value\n    }, ctx, onComment, onChompKeep);\n  }\n};\n\nfunction parsePairs(doc, cst) {\n  const seq = resolveSeq.resolveSeq(doc, cst);\n\n  for (let i = 0; i < seq.items.length; ++i) {\n    let item = seq.items[i];\n    if (item instanceof resolveSeq.Pair) continue;else if (item instanceof resolveSeq.YAMLMap) {\n      if (item.items.length > 1) {\n        const msg = 'Each pair must have its own sequence indicator';\n        throw new PlainValue.YAMLSemanticError(cst, msg);\n      }\n\n      const pair = item.items[0] || new resolveSeq.Pair();\n      if (item.commentBefore) pair.commentBefore = pair.commentBefore ? `${item.commentBefore}\\n${pair.commentBefore}` : item.commentBefore;\n      if (item.comment) pair.comment = pair.comment ? `${item.comment}\\n${pair.comment}` : item.comment;\n      item = pair;\n    }\n    seq.items[i] = item instanceof resolveSeq.Pair ? item : new resolveSeq.Pair(item);\n  }\n\n  return seq;\n}\nfunction createPairs(schema, iterable, ctx) {\n  const pairs = new resolveSeq.YAMLSeq(schema);\n  pairs.tag = 'tag:yaml.org,2002:pairs';\n\n  for (const it of iterable) {\n    let key, value;\n\n    if (Array.isArray(it)) {\n      if (it.length === 2) {\n        key = it[0];\n        value = it[1];\n      } else throw new TypeError(`Expected [key, value] tuple: ${it}`);\n    } else if (it && it instanceof Object) {\n      const keys = Object.keys(it);\n\n      if (keys.length === 1) {\n        key = keys[0];\n        value = it[key];\n      } else throw new TypeError(`Expected { key: value } tuple: ${it}`);\n    } else {\n      key = it;\n    }\n\n    const pair = schema.createPair(key, value, ctx);\n    pairs.items.push(pair);\n  }\n\n  return pairs;\n}\nconst pairs = {\n  default: false,\n  tag: 'tag:yaml.org,2002:pairs',\n  resolve: parsePairs,\n  createNode: createPairs\n};\n\nclass YAMLOMap extends resolveSeq.YAMLSeq {\n  constructor() 
{\n    super();\n\n    PlainValue._defineProperty(this, \"add\", resolveSeq.YAMLMap.prototype.add.bind(this));\n\n    PlainValue._defineProperty(this, \"delete\", resolveSeq.YAMLMap.prototype.delete.bind(this));\n\n    PlainValue._defineProperty(this, \"get\", resolveSeq.YAMLMap.prototype.get.bind(this));\n\n    PlainValue._defineProperty(this, \"has\", resolveSeq.YAMLMap.prototype.has.bind(this));\n\n    PlainValue._defineProperty(this, \"set\", resolveSeq.YAMLMap.prototype.set.bind(this));\n\n    this.tag = YAMLOMap.tag;\n  }\n\n  toJSON(_, ctx) {\n    const map = new Map();\n    if (ctx && ctx.onCreate) ctx.onCreate(map);\n\n    for (const pair of this.items) {\n      let key, value;\n\n      if (pair instanceof resolveSeq.Pair) {\n        key = resolveSeq.toJSON(pair.key, '', ctx);\n        value = resolveSeq.toJSON(pair.value, key, ctx);\n      } else {\n        key = resolveSeq.toJSON(pair, '', ctx);\n      }\n\n      if (map.has(key)) throw new Error('Ordered maps must not include duplicate keys');\n      map.set(key, value);\n    }\n\n    return map;\n  }\n\n}\n\nPlainValue._defineProperty(YAMLOMap, \"tag\", 'tag:yaml.org,2002:omap');\n\nfunction parseOMap(doc, cst) {\n  const pairs = parsePairs(doc, cst);\n  const seenKeys = [];\n\n  for (const {\n    key\n  } of pairs.items) {\n    if (key instanceof resolveSeq.Scalar) {\n      if (seenKeys.includes(key.value)) {\n        const msg = 'Ordered maps must not include duplicate keys';\n        throw new PlainValue.YAMLSemanticError(cst, msg);\n      } else {\n        seenKeys.push(key.value);\n      }\n    }\n  }\n\n  return Object.assign(new YAMLOMap(), pairs);\n}\n\nfunction createOMap(schema, iterable, ctx) {\n  const pairs = createPairs(schema, iterable, ctx);\n  const omap = new YAMLOMap();\n  omap.items = pairs.items;\n  return omap;\n}\n\nconst omap = {\n  identify: value => value instanceof Map,\n  nodeClass: YAMLOMap,\n  default: false,\n  tag: 'tag:yaml.org,2002:omap',\n  resolve: parseOMap,\n  
createNode: createOMap\n};\n\nclass YAMLSet extends resolveSeq.YAMLMap {\n  constructor() {\n    super();\n    this.tag = YAMLSet.tag;\n  }\n\n  add(key) {\n    const pair = key instanceof resolveSeq.Pair ? key : new resolveSeq.Pair(key);\n    const prev = resolveSeq.findPair(this.items, pair.key);\n    if (!prev) this.items.push(pair);\n  }\n\n  get(key, keepPair) {\n    const pair = resolveSeq.findPair(this.items, key);\n    return !keepPair && pair instanceof resolveSeq.Pair ? pair.key instanceof resolveSeq.Scalar ? pair.key.value : pair.key : pair;\n  }\n\n  set(key, value) {\n    if (typeof value !== 'boolean') throw new Error(`Expected boolean value for set(key, value) in a YAML set, not ${typeof value}`);\n    const prev = resolveSeq.findPair(this.items, key);\n\n    if (prev && !value) {\n      this.items.splice(this.items.indexOf(prev), 1);\n    } else if (!prev && value) {\n      this.items.push(new resolveSeq.Pair(key));\n    }\n  }\n\n  toJSON(_, ctx) {\n    return super.toJSON(_, ctx, Set);\n  }\n\n  toString(ctx, onComment, onChompKeep) {\n    if (!ctx) return JSON.stringify(this);\n    if (this.hasAllNullValues()) return super.toString(ctx, onComment, onChompKeep);else throw new Error('Set items must all have null values');\n  }\n\n}\n\nPlainValue._defineProperty(YAMLSet, \"tag\", 'tag:yaml.org,2002:set');\n\nfunction parseSet(doc, cst) {\n  const map = resolveSeq.resolveMap(doc, cst);\n  if (!map.hasAllNullValues()) throw new PlainValue.YAMLSemanticError(cst, 'Set items must all have null values');\n  return Object.assign(new YAMLSet(), map);\n}\n\nfunction createSet(schema, iterable, ctx) {\n  const set = new YAMLSet();\n\n  for (const value of iterable) set.items.push(schema.createPair(value, null, ctx));\n\n  return set;\n}\n\nconst set = {\n  identify: value => value instanceof Set,\n  nodeClass: YAMLSet,\n  default: false,\n  tag: 'tag:yaml.org,2002:set',\n  resolve: parseSet,\n  createNode: createSet\n};\n\nconst parseSexagesimal = (sign, 
parts) => {\n  const n = parts.split(':').reduce((n, p) => n * 60 + Number(p), 0);\n  return sign === '-' ? -n : n;\n}; // hhhh:mm:ss.sss\n\n\nconst stringifySexagesimal = ({\n  value\n}) => {\n  if (isNaN(value) || !isFinite(value)) return resolveSeq.stringifyNumber(value);\n  let sign = '';\n\n  if (value < 0) {\n    sign = '-';\n    value = Math.abs(value);\n  }\n\n  const parts = [value % 60]; // seconds, including ms\n\n  if (value < 60) {\n    parts.unshift(0); // at least one : is required\n  } else {\n    value = Math.round((value - parts[0]) / 60);\n    parts.unshift(value % 60); // minutes\n\n    if (value >= 60) {\n      value = Math.round((value - parts[0]) / 60);\n      parts.unshift(value); // hours\n    }\n  }\n\n  return sign + parts.map(n => n < 10 ? '0' + String(n) : String(n)).join(':').replace(/000000\\d*$/, '') // % 60 may introduce error\n  ;\n};\n\nconst intTime = {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:int',\n  format: 'TIME',\n  test: /^([-+]?)([0-9][0-9_]*(?::[0-5]?[0-9])+)$/,\n  resolve: (str, sign, parts) => parseSexagesimal(sign, parts.replace(/_/g, '')),\n  stringify: stringifySexagesimal\n};\nconst floatTime = {\n  identify: value => typeof value === 'number',\n  default: true,\n  tag: 'tag:yaml.org,2002:float',\n  format: 'TIME',\n  test: /^([-+]?)([0-9][0-9_]*(?::[0-5]?[0-9])+\\.[0-9_]*)$/,\n  resolve: (str, sign, parts) => parseSexagesimal(sign, parts.replace(/_/g, '')),\n  stringify: stringifySexagesimal\n};\nconst timestamp = {\n  identify: value => value instanceof Date,\n  default: true,\n  tag: 'tag:yaml.org,2002:timestamp',\n  // If the time zone is omitted, the timestamp is assumed to be specified in UTC. The time part\n  // may be omitted altogether, resulting in a date format. 
In such a case, the time part is\n  // assumed to be 00:00:00Z (start of day, UTC).\n  test: RegExp('^(?:' + '([0-9]{4})-([0-9]{1,2})-([0-9]{1,2})' + // YYYY-Mm-Dd\n  '(?:(?:t|T|[ \\\\t]+)' + // t | T | whitespace\n  '([0-9]{1,2}):([0-9]{1,2}):([0-9]{1,2}(\\\\.[0-9]+)?)' + // Hh:Mm:Ss(.ss)?\n  '(?:[ \\\\t]*(Z|[-+][012]?[0-9](?::[0-9]{2})?))?' + // Z | +5 | -03:30\n  ')?' + ')$'),\n  resolve: (str, year, month, day, hour, minute, second, millisec, tz) => {\n    if (millisec) millisec = (millisec + '00').substr(1, 3);\n    let date = Date.UTC(year, month - 1, day, hour || 0, minute || 0, second || 0, millisec || 0);\n\n    if (tz && tz !== 'Z') {\n      let d = parseSexagesimal(tz[0], tz.slice(1));\n      if (Math.abs(d) < 30) d *= 60;\n      date -= 60000 * d;\n    }\n\n    return new Date(date);\n  },\n  stringify: ({\n    value\n  }) => value.toISOString().replace(/((T00:00)?:00)?\\.000Z$/, '')\n};\n\n/* global console, process, YAML_SILENCE_DEPRECATION_WARNINGS, YAML_SILENCE_WARNINGS */\nfunction shouldWarn(deprecation) {\n  const env = typeof process !== 'undefined' && process.env || {};\n\n  if (deprecation) {\n    if (typeof YAML_SILENCE_DEPRECATION_WARNINGS !== 'undefined') return !YAML_SILENCE_DEPRECATION_WARNINGS;\n    return !env.YAML_SILENCE_DEPRECATION_WARNINGS;\n  }\n\n  if (typeof YAML_SILENCE_WARNINGS !== 'undefined') return !YAML_SILENCE_WARNINGS;\n  return !env.YAML_SILENCE_WARNINGS;\n}\n\nfunction warn(warning, type) {\n  if (shouldWarn(false)) {\n    const emit = typeof process !== 'undefined' && process.emitWarning; // This will throw in Jest if `warning` is an Error instance due to\n    // https://github.com/facebook/jest/issues/2549\n\n    if (emit) emit(warning, type);else {\n      // eslint-disable-next-line no-console\n      console.warn(type ? 
`${type}: ${warning}` : warning);\n    }\n  }\n}\nfunction warnFileDeprecation(filename) {\n  if (shouldWarn(true)) {\n    const path = filename.replace(/.*yaml[/\\\\]/i, '').replace(/\\.js$/, '').replace(/\\\\/g, '/');\n    warn(`The endpoint 'yaml/${path}' will be removed in a future release.`, 'DeprecationWarning');\n  }\n}\nconst warned = {};\nfunction warnOptionDeprecation(name, alternative) {\n  if (!warned[name] && shouldWarn(true)) {\n    warned[name] = true;\n    let msg = `The option '${name}' will be removed in a future release`;\n    msg += alternative ? `, use '${alternative}' instead.` : '.';\n    warn(msg, 'DeprecationWarning');\n  }\n}\n\nexports.binary = binary;\nexports.floatTime = floatTime;\nexports.intTime = intTime;\nexports.omap = omap;\nexports.pairs = pairs;\nexports.set = set;\nexports.timestamp = timestamp;\nexports.warn = warn;\nexports.warnFileDeprecation = warnFileDeprecation;\nexports.warnOptionDeprecation = warnOptionDeprecation;\n\n\n/***/ }),\n\n/***/ 3552:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nmodule.exports = __webpack_require__(5065).YAML\n\n\n/***/ }),\n\n/***/ 8571:\n/***/ ((module, __unused_webpack_exports, __webpack_require__) => {\n\nconst {promises: fs} = __webpack_require__(5747);\nconst YAML = __webpack_require__(3552);\n\nmodule.exports = async (github, repository, milestone, configPath) => {\n  return await fs.readFile(configPath, \"utf-8\")\n    .then(content => YAML.parse(content))\n    .then(config => {\n      let labelGroups = config.labels\n      let weightSortFunc = (a, b) => labelGroups[a].weight - labelGroups[b].weight\n\n      console.log(\"looking up PRs for milestone\", milestone, \"in repo\", repository);\n      return github.paginate(\"GET /search/issues\", {\n        q: `repo:${repository} is:pr is:merged milestone:${milestone}`,\n      }).then(issues => {\n        console.log(\"Issues count:\", issues.length);\n\n        let groupedIssues = groupIssuesByLabels(issues, 
Object.keys(labelGroups));\n\n        // generate issues list\n        let output = \"\";\n        for (let key of Object.keys(labelGroups).sort(weightSortFunc)) {\n          let displayGroup = labelGroups[key]\n          let issues = (groupedIssues[key] || []);\n          console.log(key, \"issues:\", issues.length);\n\n          if (issues.length > 0) {\n            output += `### ${displayGroup.title}\\n\\n`;\n            if (displayGroup.description) {\n              output += `${displayGroup.description.trim()}\\n\\n`;\n            }\n            issues.forEach(issue => {\n              output += createIssueEntry(issue);\n            });\n            output += \"\\n\";\n          }\n        }\n\n        let hiddenIssues = groupedIssues[\"\"] || [];\n        console.warn(\"Issues not displayed: \", hiddenIssues.length);\n        if (hiddenIssues.length > 0) {\n          console.warn(\" - \" + hiddenIssues.map(issue => issue.number).join(\", \"));\n        }\n\n        // generate contributors list\n        if (\n          config\n          && config.sections\n          && config.sections.contributors\n          && config.sections.contributors.title\n        ) {\n          output += `## ${config.sections.contributors.title}\\n\\n`;\n\n          if (config.sections.contributors.description) {\n            output += `${config.sections.contributors.description.trim()}\\n\\n`;\n          }\n\n          let uniqueFunc = (value, index, self) => self.indexOf(value) === index\n          output += issues\n            .map(issueContrib)\n            .filter(uniqueFunc)\n            .sort()\n            .map(v => `@${v}`)\n            .join(\", \")\n        }\n\n        return output.trim();\n      });\n    })\n};\n\nfunction createIssueEntry(issue) {\n  return `* ${issue.title} (#${issue.number} by @${issueContrib(issue)})\\n`;\n}\n\nfunction issueContrib(issue) {\n  return issue.user.login;\n}\n\nfunction groupIssuesByLabels(issues, labels) {\n  return 
issues.reduce((groupedMap, issue) => {\n    let typeLabel = issue.labels\n      .filter(label => labels.includes(label.name))\n      .map(label => label.name)[0] || \"\";\n\n    (groupedMap[typeLabel] = groupedMap[typeLabel] || []).push(issue);\n\n    return groupedMap;\n  }, {})\n}\n\n/***/ }),\n\n/***/ 2877:\n/***/ ((module) => {\n\nmodule.exports = eval(\"require\")(\"encoding\");\n\n\n/***/ }),\n\n/***/ 68:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = JSON.parse(\"[[[0,44],\\\"disallowed_STD3_valid\\\"],[[45,46],\\\"valid\\\"],[[47,47],\\\"disallowed_STD3_valid\\\"],[[48,57],\\\"valid\\\"],[[58,64],\\\"disallowed_STD3_valid\\\"],[[65,65],\\\"mapped\\\",[97]],[[66,66],\\\"mapped\\\",[98]],[[67,67],\\\"mapped\\\",[99]],[[68,68],\\\"mapped\\\",[100]],[[69,69],\\\"mapped\\\",[101]],[[70,70],\\\"mapped\\\",[102]],[[71,71],\\\"mapped\\\",[103]],[[72,72],\\\"mapped\\\",[104]],[[73,73],\\\"mapped\\\",[105]],[[74,74],\\\"mapped\\\",[106]],[[75,75],\\\"mapped\\\",[107]],[[76,76],\\\"mapped\\\",[108]],[[77,77],\\\"mapped\\\",[109]],[[78,78],\\\"mapped\\\",[110]],[[79,79],\\\"mapped\\\",[111]],[[80,80],\\\"mapped\\\",[112]],[[81,81],\\\"mapped\\\",[113]],[[82,82],\\\"mapped\\\",[114]],[[83,83],\\\"mapped\\\",[115]],[[84,84],\\\"mapped\\\",[116]],[[85,85],\\\"mapped\\\",[117]],[[86,86],\\\"mapped\\\",[118]],[[87,87],\\\"mapped\\\",[119]],[[88,88],\\\"mapped\\\",[120]],[[89,89],\\\"mapped\\\",[121]],[[90,90],\\\"mapped\\\",[122]],[[91,96],\\\"disallowed_STD3_valid\\\"],[[97,122],\\\"valid\\\"],[[123,127],\\\"disallowed_STD3_valid\\\"],[[128,159],\\\"disallowed\\\"],[[160,160],\\\"disallowed_STD3_mapped\\\",[32]],[[161,167],\\\"valid\\\",[],\\\"NV8\\\"],[[168,168],\\\"disallowed_STD3_mapped\\\",[32,776]],[[169,169],\\\"valid\\\",[],\\\"NV8\\\"],[[170,170],\\\"mapped\\\",[97]],[[171,172],\\\"valid\\\",[],\\\"NV8\\\"],[[173,173],\\\"ignored\\\"],[[174,174],\\\"valid\\\",[],\\\"NV8\\\"],[[175,175],\\\"disallowed_STD3_mapped\\\",[32,772]],[[176,177],\\\"valid\\\",[],
\\\"NV8\\\"],[[178,178],\\\"mapped\\\",[50]],[[179,179],\\\"mapped\\\",[51]],[[180,180],\\\"disallowed_STD3_mapped\\\",[32,769]],[[181,181],\\\"mapped\\\",[956]],[[182,182],\\\"valid\\\",[],\\\"NV8\\\"],[[183,183],\\\"valid\\\"],[[184,184],\\\"disallowed_STD3_mapped\\\",[32,807]],[[185,185],\\\"mapped\\\",[49]],[[186,186],\\\"mapped\\\",[111]],[[187,187],\\\"valid\\\",[],\\\"NV8\\\"],[[188,188],\\\"mapped\\\",[49,8260,52]],[[189,189],\\\"mapped\\\",[49,8260,50]],[[190,190],\\\"mapped\\\",[51,8260,52]],[[191,191],\\\"valid\\\",[],\\\"NV8\\\"],[[192,192],\\\"mapped\\\",[224]],[[193,193],\\\"mapped\\\",[225]],[[194,194],\\\"mapped\\\",[226]],[[195,195],\\\"mapped\\\",[227]],[[196,196],\\\"mapped\\\",[228]],[[197,197],\\\"mapped\\\",[229]],[[198,198],\\\"mapped\\\",[230]],[[199,199],\\\"mapped\\\",[231]],[[200,200],\\\"mapped\\\",[232]],[[201,201],\\\"mapped\\\",[233]],[[202,202],\\\"mapped\\\",[234]],[[203,203],\\\"mapped\\\",[235]],[[204,204],\\\"mapped\\\",[236]],[[205,205],\\\"mapped\\\",[237]],[[206,206],\\\"mapped\\\",[238]],[[207,207],\\\"mapped\\\",[239]],[[208,208],\\\"mapped\\\",[240]],[[209,209],\\\"mapped\\\",[241]],[[210,210],\\\"mapped\\\",[242]],[[211,211],\\\"mapped\\\",[243]],[[212,212],\\\"mapped\\\",[244]],[[213,213],\\\"mapped\\\",[245]],[[214,214],\\\"mapped\\\",[246]],[[215,215],\\\"valid\\\",[],\\\"NV8\\\"],[[216,216],\\\"mapped\\\",[248]],[[217,217],\\\"mapped\\\",[249]],[[218,218],\\\"mapped\\\",[250]],[[219,219],\\\"mapped\\\",[251]],[[220,220],\\\"mapped\\\",[252]],[[221,221],\\\"mapped\\\",[253]],[[222,222],\\\"mapped\\\",[254]],[[223,223],\\\"deviation\\\",[115,115]],[[224,246],\\\"valid\\\"],[[247,247],\\\"valid\\\",[],\\\"NV8\\\"],[[248,255],\\\"valid\\\"],[[256,256],\\\"mapped\\\",[257]],[[257,257],\\\"valid\\\"],[[258,258],\\\"mapped\\\",[259]],[[259,259],\\\"valid\\\"],[[260,260],\\\"mapped\\\",[261]],[[261,261],\\\"valid\\\"],[[262,262],\\\"mapped\\\",[263]],[[263,263],\\\"valid\\\"],[[264,264],\\\"mapped\\\",[265]],[[265,265],\\\"vali
d\\\"],[[266,266],\\\"mapped\\\",[267]],[[267,267],\\\"valid\\\"],[[268,268],\\\"mapped\\\",[269]],[[269,269],\\\"valid\\\"],[[270,270],\\\"mapped\\\",[271]],[[271,271],\\\"valid\\\"],[[272,272],\\\"mapped\\\",[273]],[[273,273],\\\"valid\\\"],[[274,274],\\\"mapped\\\",[275]],[[275,275],\\\"valid\\\"],[[276,276],\\\"mapped\\\",[277]],[[277,277],\\\"valid\\\"],[[278,278],\\\"mapped\\\",[279]],[[279,279],\\\"valid\\\"],[[280,280],\\\"mapped\\\",[281]],[[281,281],\\\"valid\\\"],[[282,282],\\\"mapped\\\",[283]],[[283,283],\\\"valid\\\"],[[284,284],\\\"mapped\\\",[285]],[[285,285],\\\"valid\\\"],[[286,286],\\\"mapped\\\",[287]],[[287,287],\\\"valid\\\"],[[288,288],\\\"mapped\\\",[289]],[[289,289],\\\"valid\\\"],[[290,290],\\\"mapped\\\",[291]],[[291,291],\\\"valid\\\"],[[292,292],\\\"mapped\\\",[293]],[[293,293],\\\"valid\\\"],[[294,294],\\\"mapped\\\",[295]],[[295,295],\\\"valid\\\"],[[296,296],\\\"mapped\\\",[297]],[[297,297],\\\"valid\\\"],[[298,298],\\\"mapped\\\",[299]],[[299,299],\\\"valid\\\"],[[300,300],\\\"mapped\\\",[301]],[[301,301],\\\"valid\\\"],[[302,302],\\\"mapped\\\",[303]],[[303,303],\\\"valid\\\"],[[304,304],\\\"mapped\\\",[105,775]],[[305,305],\\\"valid\\\"],[[306,307],\\\"mapped\\\",[105,106]],[[308,308],\\\"mapped\\\",[309]],[[309,309],\\\"valid\\\"],[[310,310],\\\"mapped\\\",[311]],[[311,312],\\\"valid\\\"],[[313,313],\\\"mapped\\\",[314]],[[314,314],\\\"valid\\\"],[[315,315],\\\"mapped\\\",[316]],[[316,316],\\\"valid\\\"],[[317,317],\\\"mapped\\\",[318]],[[318,318],\\\"valid\\\"],[[319,320],\\\"mapped\\\",[108,183]],[[321,321],\\\"mapped\\\",[322]],[[322,322],\\\"valid\\\"],[[323,323],\\\"mapped\\\",[324]],[[324,324],\\\"valid\\\"],[[325,325],\\\"mapped\\\",[326]],[[326,326],\\\"valid\\\"],[[327,327],\\\"mapped\\\",[328]],[[328,328],\\\"valid\\\"],[[329,329],\\\"mapped\\\",[700,110]],[[330,330],\\\"mapped\\\",[331]],[[331,331],\\\"valid\\\"],[[332,332],\\\"mapped\\\",[333]],[[333,333],\\\"valid\\\"],[[334,334],\\\"mapped\\\",[335]],[[335,335],\\\"v
alid\\\"],[[336,336],\\\"mapped\\\",[337]],[[337,337],\\\"valid\\\"],[[338,338],\\\"mapped\\\",[339]],[[339,339],\\\"valid\\\"],[[340,340],\\\"mapped\\\",[341]],[[341,341],\\\"valid\\\"],[[342,342],\\\"mapped\\\",[343]],[[343,343],\\\"valid\\\"],[[344,344],\\\"mapped\\\",[345]],[[345,345],\\\"valid\\\"],[[346,346],\\\"mapped\\\",[347]],[[347,347],\\\"valid\\\"],[[348,348],\\\"mapped\\\",[349]],[[349,349],\\\"valid\\\"],[[350,350],\\\"mapped\\\",[351]],[[351,351],\\\"valid\\\"],[[352,352],\\\"mapped\\\",[353]],[[353,353],\\\"valid\\\"],[[354,354],\\\"mapped\\\",[355]],[[355,355],\\\"valid\\\"],[[356,356],\\\"mapped\\\",[357]],[[357,357],\\\"valid\\\"],[[358,358],\\\"mapped\\\",[359]],[[359,359],\\\"valid\\\"],[[360,360],\\\"mapped\\\",[361]],[[361,361],\\\"valid\\\"],[[362,362],\\\"mapped\\\",[363]],[[363,363],\\\"valid\\\"],[[364,364],\\\"mapped\\\",[365]],[[365,365],\\\"valid\\\"],[[366,366],\\\"mapped\\\",[367]],[[367,367],\\\"valid\\\"],[[368,368],\\\"mapped\\\",[369]],[[369,369],\\\"valid\\\"],[[370,370],\\\"mapped\\\",[371]],[[371,371],\\\"valid\\\"],[[372,372],\\\"mapped\\\",[373]],[[373,373],\\\"valid\\\"],[[374,374],\\\"mapped\\\",[375]],[[375,375],\\\"valid\\\"],[[376,376],\\\"mapped\\\",[255]],[[377,377],\\\"mapped\\\",[378]],[[378,378],\\\"valid\\\"],[[379,379],\\\"mapped\\\",[380]],[[380,380],\\\"valid\\\"],[[381,381],\\\"mapped\\\",[382]],[[382,382],\\\"valid\\\"],[[383,383],\\\"mapped\\\",[115]],[[384,384],\\\"valid\\\"],[[385,385],\\\"mapped\\\",[595]],[[386,386],\\\"mapped\\\",[387]],[[387,387],\\\"valid\\\"],[[388,388],\\\"mapped\\\",[389]],[[389,389],\\\"valid\\\"],[[390,390],\\\"mapped\\\",[596]],[[391,391],\\\"mapped\\\",[392]],[[392,392],\\\"valid\\\"],[[393,393],\\\"mapped\\\",[598]],[[394,394],\\\"mapped\\\",[599]],[[395,395],\\\"mapped\\\",[396]],[[396,397],\\\"valid\\\"],[[398,398],\\\"mapped\\\",[477]],[[399,399],\\\"mapped\\\",[601]],[[400,400],\\\"mapped\\\",[603]],[[401,401],\\\"mapped\\\",[402]],[[402,402],\\\"valid\\\"],[[403,403],\\\"
mapped\\\",[608]],[[404,404],\\\"mapped\\\",[611]],[[405,405],\\\"valid\\\"],[[406,406],\\\"mapped\\\",[617]],[[407,407],\\\"mapped\\\",[616]],[[408,408],\\\"mapped\\\",[409]],[[409,411],\\\"valid\\\"],[[412,412],\\\"mapped\\\",[623]],[[413,413],\\\"mapped\\\",[626]],[[414,414],\\\"valid\\\"],[[415,415],\\\"mapped\\\",[629]],[[416,416],\\\"mapped\\\",[417]],[[417,417],\\\"valid\\\"],[[418,418],\\\"mapped\\\",[419]],[[419,419],\\\"valid\\\"],[[420,420],\\\"mapped\\\",[421]],[[421,421],\\\"valid\\\"],[[422,422],\\\"mapped\\\",[640]],[[423,423],\\\"mapped\\\",[424]],[[424,424],\\\"valid\\\"],[[425,425],\\\"mapped\\\",[643]],[[426,427],\\\"valid\\\"],[[428,428],\\\"mapped\\\",[429]],[[429,429],\\\"valid\\\"],[[430,430],\\\"mapped\\\",[648]],[[431,431],\\\"mapped\\\",[432]],[[432,432],\\\"valid\\\"],[[433,433],\\\"mapped\\\",[650]],[[434,434],\\\"mapped\\\",[651]],[[435,435],\\\"mapped\\\",[436]],[[436,436],\\\"valid\\\"],[[437,437],\\\"mapped\\\",[438]],[[438,438],\\\"valid\\\"],[[439,439],\\\"mapped\\\",[658]],[[440,440],\\\"mapped\\\",[441]],[[441,443],\\\"valid\\\"],[[444,444],\\\"mapped\\\",[445]],[[445,451],\\\"valid\\\"],[[452,454],\\\"mapped\\\",[100,382]],[[455,457],\\\"mapped\\\",[108,106]],[[458,460],\\\"mapped\\\",[110,106]],[[461,461],\\\"mapped\\\",[462]],[[462,462],\\\"valid\\\"],[[463,463],\\\"mapped\\\",[464]],[[464,464],\\\"valid\\\"],[[465,465],\\\"mapped\\\",[466]],[[466,466],\\\"valid\\\"],[[467,467],\\\"mapped\\\",[468]],[[468,468],\\\"valid\\\"],[[469,469],\\\"mapped\\\",[470]],[[470,470],\\\"valid\\\"],[[471,471],\\\"mapped\\\",[472]],[[472,472],\\\"valid\\\"],[[473,473],\\\"mapped\\\",[474]],[[474,474],\\\"valid\\\"],[[475,475],\\\"mapped\\\",[476]],[[476,477],\\\"valid\\\"],[[478,478],\\\"mapped\\\",[479]],[[479,479],\\\"valid\\\"],[[480,480],\\\"mapped\\\",[481]],[[481,481],\\\"valid\\\"],[[482,482],\\\"mapped\\\",[483]],[[483,483],\\\"valid\\\"],[[484,484],\\\"mapped\\\",[485]],[[485,485],\\\"valid\\\"],[[486,486],\\\"mapped\\\",[487]],[[487,4
87],\\\"valid\\\"],[[488,488],\\\"mapped\\\",[489]],[[489,489],\\\"valid\\\"],[[490,490],\\\"mapped\\\",[491]],[[491,491],\\\"valid\\\"],[[492,492],\\\"mapped\\\",[493]],[[493,493],\\\"valid\\\"],[[494,494],\\\"mapped\\\",[495]],[[495,496],\\\"valid\\\"],[[497,499],\\\"mapped\\\",[100,122]],[[500,500],\\\"mapped\\\",[501]],[[501,501],\\\"valid\\\"],[[502,502],\\\"mapped\\\",[405]],[[503,503],\\\"mapped\\\",[447]],[[504,504],\\\"mapped\\\",[505]],[[505,505],\\\"valid\\\"],[[506,506],\\\"mapped\\\",[507]],[[507,507],\\\"valid\\\"],[[508,508],\\\"mapped\\\",[509]],[[509,509],\\\"valid\\\"],[[510,510],\\\"mapped\\\",[511]],[[511,511],\\\"valid\\\"],[[512,512],\\\"mapped\\\",[513]],[[513,513],\\\"valid\\\"],[[514,514],\\\"mapped\\\",[515]],[[515,515],\\\"valid\\\"],[[516,516],\\\"mapped\\\",[517]],[[517,517],\\\"valid\\\"],[[518,518],\\\"mapped\\\",[519]],[[519,519],\\\"valid\\\"],[[520,520],\\\"mapped\\\",[521]],[[521,521],\\\"valid\\\"],[[522,522],\\\"mapped\\\",[523]],[[523,523],\\\"valid\\\"],[[524,524],\\\"mapped\\\",[525]],[[525,525],\\\"valid\\\"],[[526,526],\\\"mapped\\\",[527]],[[527,527],\\\"valid\\\"],[[528,528],\\\"mapped\\\",[529]],[[529,529],\\\"valid\\\"],[[530,530],\\\"mapped\\\",[531]],[[531,531],\\\"valid\\\"],[[532,532],\\\"mapped\\\",[533]],[[533,533],\\\"valid\\\"],[[534,534],\\\"mapped\\\",[535]],[[535,535],\\\"valid\\\"],[[536,536],\\\"mapped\\\",[537]],[[537,537],\\\"valid\\\"],[[538,538],\\\"mapped\\\",[539]],[[539,539],\\\"valid\\\"],[[540,540],\\\"mapped\\\",[541]],[[541,541],\\\"valid\\\"],[[542,542],\\\"mapped\\\",[543]],[[543,543],\\\"valid\\\"],[[544,544],\\\"mapped\\\",[414]],[[545,545],\\\"valid\\\"],[[546,546],\\\"mapped\\\",[547]],[[547,547],\\\"valid\\\"],[[548,548],\\\"mapped\\\",[549]],[[549,549],\\\"valid\\\"],[[550,550],\\\"mapped\\\",[551]],[[551,551],\\\"valid\\\"],[[552,552],\\\"mapped\\\",[553]],[[553,553],\\\"valid\\\"],[[554,554],\\\"mapped\\\",[555]],[[555,555],\\\"valid\\\"],[[556,556],\\\"mapped\\\",[557]],[[557,557],\\\"v
alid\\\"],[[558,558],\\\"mapped\\\",[559]],[[559,559],\\\"valid\\\"],[[560,560],\\\"mapped\\\",[561]],[[561,561],\\\"valid\\\"],[[562,562],\\\"mapped\\\",[563]],[[563,563],\\\"valid\\\"],[[564,566],\\\"valid\\\"],[[567,569],\\\"valid\\\"],[[570,570],\\\"mapped\\\",[11365]],[[571,571],\\\"mapped\\\",[572]],[[572,572],\\\"valid\\\"],[[573,573],\\\"mapped\\\",[410]],[[574,574],\\\"mapped\\\",[11366]],[[575,576],\\\"valid\\\"],[[577,577],\\\"mapped\\\",[578]],[[578,578],\\\"valid\\\"],[[579,579],\\\"mapped\\\",[384]],[[580,580],\\\"mapped\\\",[649]],[[581,581],\\\"mapped\\\",[652]],[[582,582],\\\"mapped\\\",[583]],[[583,583],\\\"valid\\\"],[[584,584],\\\"mapped\\\",[585]],[[585,585],\\\"valid\\\"],[[586,586],\\\"mapped\\\",[587]],[[587,587],\\\"valid\\\"],[[588,588],\\\"mapped\\\",[589]],[[589,589],\\\"valid\\\"],[[590,590],\\\"mapped\\\",[591]],[[591,591],\\\"valid\\\"],[[592,680],\\\"valid\\\"],[[681,685],\\\"valid\\\"],[[686,687],\\\"valid\\\"],[[688,688],\\\"mapped\\\",[104]],[[689,689],\\\"mapped\\\",[614]],[[690,690],\\\"mapped\\\",[106]],[[691,691],\\\"mapped\\\",[114]],[[692,692],\\\"mapped\\\",[633]],[[693,693],\\\"mapped\\\",[635]],[[694,694],\\\"mapped\\\",[641]],[[695,695],\\\"mapped\\\",[119]],[[696,696],\\\"mapped\\\",[121]],[[697,705],\\\"valid\\\"],[[706,709],\\\"valid\\\",[],\\\"NV8\\\"],[[710,721],\\\"valid\\\"],[[722,727],\\\"valid\\\",[],\\\"NV8\\\"],[[728,728],\\\"disallowed_STD3_mapped\\\",[32,774]],[[729,729],\\\"disallowed_STD3_mapped\\\",[32,775]],[[730,730],\\\"disallowed_STD3_mapped\\\",[32,778]],[[731,731],\\\"disallowed_STD3_mapped\\\",[32,808]],[[732,732],\\\"disallowed_STD3_mapped\\\",[32,771]],[[733,733],\\\"disallowed_STD3_mapped\\\",[32,779]],[[734,734],\\\"valid\\\",[],\\\"NV8\\\"],[[735,735],\\\"valid\\\",[],\\\"NV8\\\"],[[736,736],\\\"mapped\\\",[611]],[[737,737],\\\"mapped\\\",[108]],[[738,738],\\\"mapped\\\",[115]],[[739,739],\\\"mapped\\\",[120]],[[740,740],\\\"mapped\\\",[661]],[[741,745],\\\"valid\\\",[],\\\"NV8\\\"],[[746,747],
\\\"valid\\\",[],\\\"NV8\\\"],[[748,748],\\\"valid\\\"],[[749,749],\\\"valid\\\",[],\\\"NV8\\\"],[[750,750],\\\"valid\\\"],[[751,767],\\\"valid\\\",[],\\\"NV8\\\"],[[768,831],\\\"valid\\\"],[[832,832],\\\"mapped\\\",[768]],[[833,833],\\\"mapped\\\",[769]],[[834,834],\\\"valid\\\"],[[835,835],\\\"mapped\\\",[787]],[[836,836],\\\"mapped\\\",[776,769]],[[837,837],\\\"mapped\\\",[953]],[[838,846],\\\"valid\\\"],[[847,847],\\\"ignored\\\"],[[848,855],\\\"valid\\\"],[[856,860],\\\"valid\\\"],[[861,863],\\\"valid\\\"],[[864,865],\\\"valid\\\"],[[866,866],\\\"valid\\\"],[[867,879],\\\"valid\\\"],[[880,880],\\\"mapped\\\",[881]],[[881,881],\\\"valid\\\"],[[882,882],\\\"mapped\\\",[883]],[[883,883],\\\"valid\\\"],[[884,884],\\\"mapped\\\",[697]],[[885,885],\\\"valid\\\"],[[886,886],\\\"mapped\\\",[887]],[[887,887],\\\"valid\\\"],[[888,889],\\\"disallowed\\\"],[[890,890],\\\"disallowed_STD3_mapped\\\",[32,953]],[[891,893],\\\"valid\\\"],[[894,894],\\\"disallowed_STD3_mapped\\\",[59]],[[895,895],\\\"mapped\\\",[1011]],[[896,899],\\\"disallowed\\\"],[[900,900],\\\"disallowed_STD3_mapped\\\",[32,769]],[[901,901],\\\"disallowed_STD3_mapped\\\",[32,776,769]],[[902,902],\\\"mapped\\\",[940]],[[903,903],\\\"mapped\\\",[183]],[[904,904],\\\"mapped\\\",[941]],[[905,905],\\\"mapped\\\",[942]],[[906,906],\\\"mapped\\\",[943]],[[907,907],\\\"disallowed\\\"],[[908,908],\\\"mapped\\\",[972]],[[909,909],\\\"disallowed\\\"],[[910,910],\\\"mapped\\\",[973]],[[911,911],\\\"mapped\\\",[974]],[[912,912],\\\"valid\\\"],[[913,913],\\\"mapped\\\",[945]],[[914,914],\\\"mapped\\\",[946]],[[915,915],\\\"mapped\\\",[947]],[[916,916],\\\"mapped\\\",[948]],[[917,917],\\\"mapped\\\",[949]],[[918,918],\\\"mapped\\\",[950]],[[919,919],\\\"mapped\\\",[951]],[[920,920],\\\"mapped\\\",[952]],[[921,921],\\\"mapped\\\",[953]],[[922,922],\\\"mapped\\\",[954]],[[923,923],\\\"mapped\\\",[955]],[[924,924],\\\"mapped\\\",[956]],[[925,925],\\\"mapped\\\",[957]],[[926,926],\\\"mapped\\\",[958]],[[927,927],\\\"mapped\\\"
,[959]],[[928,928],\\\"mapped\\\",[960]],[[929,929],\\\"mapped\\\",[961]],[[930,930],\\\"disallowed\\\"],[[931,931],\\\"mapped\\\",[963]],[[932,932],\\\"mapped\\\",[964]],[[933,933],\\\"mapped\\\",[965]],[[934,934],\\\"mapped\\\",[966]],[[935,935],\\\"mapped\\\",[967]],[[936,936],\\\"mapped\\\",[968]],[[937,937],\\\"mapped\\\",[969]],[[938,938],\\\"mapped\\\",[970]],[[939,939],\\\"mapped\\\",[971]],[[940,961],\\\"valid\\\"],[[962,962],\\\"deviation\\\",[963]],[[963,974],\\\"valid\\\"],[[975,975],\\\"mapped\\\",[983]],[[976,976],\\\"mapped\\\",[946]],[[977,977],\\\"mapped\\\",[952]],[[978,978],\\\"mapped\\\",[965]],[[979,979],\\\"mapped\\\",[973]],[[980,980],\\\"mapped\\\",[971]],[[981,981],\\\"mapped\\\",[966]],[[982,982],\\\"mapped\\\",[960]],[[983,983],\\\"valid\\\"],[[984,984],\\\"mapped\\\",[985]],[[985,985],\\\"valid\\\"],[[986,986],\\\"mapped\\\",[987]],[[987,987],\\\"valid\\\"],[[988,988],\\\"mapped\\\",[989]],[[989,989],\\\"valid\\\"],[[990,990],\\\"mapped\\\",[991]],[[991,991],\\\"valid\\\"],[[992,992],\\\"mapped\\\",[993]],[[993,993],\\\"valid\\\"],[[994,994],\\\"mapped\\\",[995]],[[995,995],\\\"valid\\\"],[[996,996],\\\"mapped\\\",[997]],[[997,997],\\\"valid\\\"],[[998,998],\\\"mapped\\\",[999]],[[999,999],\\\"valid\\\"],[[1000,1000],\\\"mapped\\\",[1001]],[[1001,1001],\\\"valid\\\"],[[1002,1002],\\\"mapped\\\",[1003]],[[1003,1003],\\\"valid\\\"],[[1004,1004],\\\"mapped\\\",[1005]],[[1005,1005],\\\"valid\\\"],[[1006,1006],\\\"mapped\\\",[1007]],[[1007,1007],\\\"valid\\\"],[[1008,1008],\\\"mapped\\\",[954]],[[1009,1009],\\\"mapped\\\",[961]],[[1010,1010],\\\"mapped\\\",[963]],[[1011,1011],\\\"valid\\\"],[[1012,1012],\\\"mapped\\\",[952]],[[1013,1013],\\\"mapped\\\",[949]],[[1014,1014],\\\"valid\\\",[],\\\"NV8\\\"],[[1015,1015],\\\"mapped\\\",[1016]],[[1016,1016],\\\"valid\\\"],[[1017,1017],\\\"mapped\\\",[963]],[[1018,1018],\\\"mapped\\\",[1019]],[[1019,1019],\\\"valid\\\"],[[1020,1020],\\\"valid\\\"],[[1021,1021],\\\"mapped\\\",[891]],[[1022,1022],\\\"map
ped\\\",[892]],[[1023,1023],\\\"mapped\\\",[893]],[[1024,1024],\\\"mapped\\\",[1104]],[[1025,1025],\\\"mapped\\\",[1105]],[[1026,1026],\\\"mapped\\\",[1106]],[[1027,1027],\\\"mapped\\\",[1107]],[[1028,1028],\\\"mapped\\\",[1108]],[[1029,1029],\\\"mapped\\\",[1109]],[[1030,1030],\\\"mapped\\\",[1110]],[[1031,1031],\\\"mapped\\\",[1111]],[[1032,1032],\\\"mapped\\\",[1112]],[[1033,1033],\\\"mapped\\\",[1113]],[[1034,1034],\\\"mapped\\\",[1114]],[[1035,1035],\\\"mapped\\\",[1115]],[[1036,1036],\\\"mapped\\\",[1116]],[[1037,1037],\\\"mapped\\\",[1117]],[[1038,1038],\\\"mapped\\\",[1118]],[[1039,1039],\\\"mapped\\\",[1119]],[[1040,1040],\\\"mapped\\\",[1072]],[[1041,1041],\\\"mapped\\\",[1073]],[[1042,1042],\\\"mapped\\\",[1074]],[[1043,1043],\\\"mapped\\\",[1075]],[[1044,1044],\\\"mapped\\\",[1076]],[[1045,1045],\\\"mapped\\\",[1077]],[[1046,1046],\\\"mapped\\\",[1078]],[[1047,1047],\\\"mapped\\\",[1079]],[[1048,1048],\\\"mapped\\\",[1080]],[[1049,1049],\\\"mapped\\\",[1081]],[[1050,1050],\\\"mapped\\\",[1082]],[[1051,1051],\\\"mapped\\\",[1083]],[[1052,1052],\\\"mapped\\\",[1084]],[[1053,1053],\\\"mapped\\\",[1085]],[[1054,1054],\\\"mapped\\\",[1086]],[[1055,1055],\\\"mapped\\\",[1087]],[[1056,1056],\\\"mapped\\\",[1088]],[[1057,1057],\\\"mapped\\\",[1089]],[[1058,1058],\\\"mapped\\\",[1090]],[[1059,1059],\\\"mapped\\\",[1091]],[[1060,1060],\\\"mapped\\\",[1092]],[[1061,1061],\\\"mapped\\\",[1093]],[[1062,1062],\\\"mapped\\\",[1094]],[[1063,1063],\\\"mapped\\\",[1095]],[[1064,1064],\\\"mapped\\\",[1096]],[[1065,1065],\\\"mapped\\\",[1097]],[[1066,1066],\\\"mapped\\\",[1098]],[[1067,1067],\\\"mapped\\\",[1099]],[[1068,1068],\\\"mapped\\\",[1100]],[[1069,1069],\\\"mapped\\\",[1101]],[[1070,1070],\\\"mapped\\\",[1102]],[[1071,1071],\\\"mapped\\\",[1103]],[[1072,1103],\\\"valid\\\"],[[1104,1104],\\\"valid\\\"],[[1105,1116],\\\"valid\\\"],[[1117,1117],\\\"valid\\\"],[[1118,1119],\\\"valid\\\"],[[1120,1120],\\\"mapped\\\",[1121]],[[1121,1121],\\\"valid\\\"],[[1122,1122],\\\"m
apped\\\",[1123]],[[1123,1123],\\\"valid\\\"],[[1124,1124],\\\"mapped\\\",[1125]],[[1125,1125],\\\"valid\\\"],[[1126,1126],\\\"mapped\\\",[1127]],[[1127,1127],\\\"valid\\\"],[[1128,1128],\\\"mapped\\\",[1129]],[[1129,1129],\\\"valid\\\"],[[1130,1130],\\\"mapped\\\",[1131]],[[1131,1131],\\\"valid\\\"],[[1132,1132],\\\"mapped\\\",[1133]],[[1133,1133],\\\"valid\\\"],[[1134,1134],\\\"mapped\\\",[1135]],[[1135,1135],\\\"valid\\\"],[[1136,1136],\\\"mapped\\\",[1137]],[[1137,1137],\\\"valid\\\"],[[1138,1138],\\\"mapped\\\",[1139]],[[1139,1139],\\\"valid\\\"],[[1140,1140],\\\"mapped\\\",[1141]],[[1141,1141],\\\"valid\\\"],[[1142,1142],\\\"mapped\\\",[1143]],[[1143,1143],\\\"valid\\\"],[[1144,1144],\\\"mapped\\\",[1145]],[[1145,1145],\\\"valid\\\"],[[1146,1146],\\\"mapped\\\",[1147]],[[1147,1147],\\\"valid\\\"],[[1148,1148],\\\"mapped\\\",[1149]],[[1149,1149],\\\"valid\\\"],[[1150,1150],\\\"mapped\\\",[1151]],[[1151,1151],\\\"valid\\\"],[[1152,1152],\\\"mapped\\\",[1153]],[[1153,1153],\\\"valid\\\"],[[1154,1154],\\\"valid\\\",[],\\\"NV8\\\"],[[1155,1158],\\\"valid\\\"],[[1159,1159],\\\"valid\\\"],[[1160,1161],\\\"valid\\\",[],\\\"NV8\\\"],[[1162,1162],\\\"mapped\\\",[1163]],[[1163,1163],\\\"valid\\\"],[[1164,1164],\\\"mapped\\\",[1165]],[[1165,1165],\\\"valid\\\"],[[1166,1166],\\\"mapped\\\",[1167]],[[1167,1167],\\\"valid\\\"],[[1168,1168],\\\"mapped\\\",[1169]],[[1169,1169],\\\"valid\\\"],[[1170,1170],\\\"mapped\\\",[1171]],[[1171,1171],\\\"valid\\\"],[[1172,1172],\\\"mapped\\\",[1173]],[[1173,1173],\\\"valid\\\"],[[1174,1174],\\\"mapped\\\",[1175]],[[1175,1175],\\\"valid\\\"],[[1176,1176],\\\"mapped\\\",[1177]],[[1177,1177],\\\"valid\\\"],[[1178,1178],\\\"mapped\\\",[1179]],[[1179,1179],\\\"valid\\\"],[[1180,1180],\\\"mapped\\\",[1181]],[[1181,1181],\\\"valid\\\"],[[1182,1182],\\\"mapped\\\",[1183]],[[1183,1183],\\\"valid\\\"],[[1184,1184],\\\"mapped\\\",[1185]],[[1185,1185],\\\"valid\\\"],[[1186,1186],\\\"mapped\\\",[1187]],[[1187,1187],\\\"valid\\\"],[[1188,1188],\\\"map
ped\\\",[1189]],[[1189,1189],\\\"valid\\\"],[[1190,1190],\\\"mapped\\\",[1191]],[[1191,1191],\\\"valid\\\"],[[1192,1192],\\\"mapped\\\",[1193]],[[1193,1193],\\\"valid\\\"],[[1194,1194],\\\"mapped\\\",[1195]],[[1195,1195],\\\"valid\\\"],[[1196,1196],\\\"mapped\\\",[1197]],[[1197,1197],\\\"valid\\\"],[[1198,1198],\\\"mapped\\\",[1199]],[[1199,1199],\\\"valid\\\"],[[1200,1200],\\\"mapped\\\",[1201]],[[1201,1201],\\\"valid\\\"],[[1202,1202],\\\"mapped\\\",[1203]],[[1203,1203],\\\"valid\\\"],[[1204,1204],\\\"mapped\\\",[1205]],[[1205,1205],\\\"valid\\\"],[[1206,1206],\\\"mapped\\\",[1207]],[[1207,1207],\\\"valid\\\"],[[1208,1208],\\\"mapped\\\",[1209]],[[1209,1209],\\\"valid\\\"],[[1210,1210],\\\"mapped\\\",[1211]],[[1211,1211],\\\"valid\\\"],[[1212,1212],\\\"mapped\\\",[1213]],[[1213,1213],\\\"valid\\\"],[[1214,1214],\\\"mapped\\\",[1215]],[[1215,1215],\\\"valid\\\"],[[1216,1216],\\\"disallowed\\\"],[[1217,1217],\\\"mapped\\\",[1218]],[[1218,1218],\\\"valid\\\"],[[1219,1219],\\\"mapped\\\",[1220]],[[1220,1220],\\\"valid\\\"],[[1221,1221],\\\"mapped\\\",[1222]],[[1222,1222],\\\"valid\\\"],[[1223,1223],\\\"mapped\\\",[1224]],[[1224,1224],\\\"valid\\\"],[[1225,1225],\\\"mapped\\\",[1226]],[[1226,1226],\\\"valid\\\"],[[1227,1227],\\\"mapped\\\",[1228]],[[1228,1228],\\\"valid\\\"],[[1229,1229],\\\"mapped\\\",[1230]],[[1230,1230],\\\"valid\\\"],[[1231,1231],\\\"valid\\\"],[[1232,1232],\\\"mapped\\\",[1233]],[[1233,1233],\\\"valid\\\"],[[1234,1234],\\\"mapped\\\",[1235]],[[1235,1235],\\\"valid\\\"],[[1236,1236],\\\"mapped\\\",[1237]],[[1237,1237],\\\"valid\\\"],[[1238,1238],\\\"mapped\\\",[1239]],[[1239,1239],\\\"valid\\\"],[[1240,1240],\\\"mapped\\\",[1241]],[[1241,1241],\\\"valid\\\"],[[1242,1242],\\\"mapped\\\",[1243]],[[1243,1243],\\\"valid\\\"],[[1244,1244],\\\"mapped\\\",[1245]],[[1245,1245],\\\"valid\\\"],[[1246,1246],\\\"mapped\\\",[1247]],[[1247,1247],\\\"valid\\\"],[[1248,1248],\\\"mapped\\\",[1249]],[[1249,1249],\\\"valid\\\"],[[1250,1250],\\\"mapped\\\",[1251]],[[1
251,1251],\\\"valid\\\"],[[1252,1252],\\\"mapped\\\",[1253]],[[1253,1253],\\\"valid\\\"],[[1254,1254],\\\"mapped\\\",[1255]],[[1255,1255],\\\"valid\\\"],[[1256,1256],\\\"mapped\\\",[1257]],[[1257,1257],\\\"valid\\\"],[[1258,1258],\\\"mapped\\\",[1259]],[[1259,1259],\\\"valid\\\"],[[1260,1260],\\\"mapped\\\",[1261]],[[1261,1261],\\\"valid\\\"],[[1262,1262],\\\"mapped\\\",[1263]],[[1263,1263],\\\"valid\\\"],[[1264,1264],\\\"mapped\\\",[1265]],[[1265,1265],\\\"valid\\\"],[[1266,1266],\\\"mapped\\\",[1267]],[[1267,1267],\\\"valid\\\"],[[1268,1268],\\\"mapped\\\",[1269]],[[1269,1269],\\\"valid\\\"],[[1270,1270],\\\"mapped\\\",[1271]],[[1271,1271],\\\"valid\\\"],[[1272,1272],\\\"mapped\\\",[1273]],[[1273,1273],\\\"valid\\\"],[[1274,1274],\\\"mapped\\\",[1275]],[[1275,1275],\\\"valid\\\"],[[1276,1276],\\\"mapped\\\",[1277]],[[1277,1277],\\\"valid\\\"],[[1278,1278],\\\"mapped\\\",[1279]],[[1279,1279],\\\"valid\\\"],[[1280,1280],\\\"mapped\\\",[1281]],[[1281,1281],\\\"valid\\\"],[[1282,1282],\\\"mapped\\\",[1283]],[[1283,1283],\\\"valid\\\"],[[1284,1284],\\\"mapped\\\",[1285]],[[1285,1285],\\\"valid\\\"],[[1286,1286],\\\"mapped\\\",[1287]],[[1287,1287],\\\"valid\\\"],[[1288,1288],\\\"mapped\\\",[1289]],[[1289,1289],\\\"valid\\\"],[[1290,1290],\\\"mapped\\\",[1291]],[[1291,1291],\\\"valid\\\"],[[1292,1292],\\\"mapped\\\",[1293]],[[1293,1293],\\\"valid\\\"],[[1294,1294],\\\"mapped\\\",[1295]],[[1295,1295],\\\"valid\\\"],[[1296,1296],\\\"mapped\\\",[1297]],[[1297,1297],\\\"valid\\\"],[[1298,1298],\\\"mapped\\\",[1299]],[[1299,1299],\\\"valid\\\"],[[1300,1300],\\\"mapped\\\",[1301]],[[1301,1301],\\\"valid\\\"],[[1302,1302],\\\"mapped\\\",[1303]],[[1303,1303],\\\"valid\\\"],[[1304,1304],\\\"mapped\\\",[1305]],[[1305,1305],\\\"valid\\\"],[[1306,1306],\\\"mapped\\\",[1307]],[[1307,1307],\\\"valid\\\"],[[1308,1308],\\\"mapped\\\",[1309]],[[1309,1309],\\\"valid\\\"],[[1310,1310],\\\"mapped\\\",[1311]],[[1311,1311],\\\"valid\\\"],[[1312,1312],\\\"mapped\\\",[1313]],[[1313,1313],\\\"va
lid\\\"],[[1314,1314],\\\"mapped\\\",[1315]],[[1315,1315],\\\"valid\\\"],[[1316,1316],\\\"mapped\\\",[1317]],[[1317,1317],\\\"valid\\\"],[[1318,1318],\\\"mapped\\\",[1319]],[[1319,1319],\\\"valid\\\"],[[1320,1320],\\\"mapped\\\",[1321]],[[1321,1321],\\\"valid\\\"],[[1322,1322],\\\"mapped\\\",[1323]],[[1323,1323],\\\"valid\\\"],[[1324,1324],\\\"mapped\\\",[1325]],[[1325,1325],\\\"valid\\\"],[[1326,1326],\\\"mapped\\\",[1327]],[[1327,1327],\\\"valid\\\"],[[1328,1328],\\\"disallowed\\\"],[[1329,1329],\\\"mapped\\\",[1377]],[[1330,1330],\\\"mapped\\\",[1378]],[[1331,1331],\\\"mapped\\\",[1379]],[[1332,1332],\\\"mapped\\\",[1380]],[[1333,1333],\\\"mapped\\\",[1381]],[[1334,1334],\\\"mapped\\\",[1382]],[[1335,1335],\\\"mapped\\\",[1383]],[[1336,1336],\\\"mapped\\\",[1384]],[[1337,1337],\\\"mapped\\\",[1385]],[[1338,1338],\\\"mapped\\\",[1386]],[[1339,1339],\\\"mapped\\\",[1387]],[[1340,1340],\\\"mapped\\\",[1388]],[[1341,1341],\\\"mapped\\\",[1389]],[[1342,1342],\\\"mapped\\\",[1390]],[[1343,1343],\\\"mapped\\\",[1391]],[[1344,1344],\\\"mapped\\\",[1392]],[[1345,1345],\\\"mapped\\\",[1393]],[[1346,1346],\\\"mapped\\\",[1394]],[[1347,1347],\\\"mapped\\\",[1395]],[[1348,1348],\\\"mapped\\\",[1396]],[[1349,1349],\\\"mapped\\\",[1397]],[[1350,1350],\\\"mapped\\\",[1398]],[[1351,1351],\\\"mapped\\\",[1399]],[[1352,1352],\\\"mapped\\\",[1400]],[[1353,1353],\\\"mapped\\\",[1401]],[[1354,1354],\\\"mapped\\\",[1402]],[[1355,1355],\\\"mapped\\\",[1403]],[[1356,1356],\\\"mapped\\\",[1404]],[[1357,1357],\\\"mapped\\\",[1405]],[[1358,1358],\\\"mapped\\\",[1406]],[[1359,1359],\\\"mapped\\\",[1407]],[[1360,1360],\\\"mapped\\\",[1408]],[[1361,1361],\\\"mapped\\\",[1409]],[[1362,1362],\\\"mapped\\\",[1410]],[[1363,1363],\\\"mapped\\\",[1411]],[[1364,1364],\\\"mapped\\\",[1412]],[[1365,1365],\\\"mapped\\\",[1413]],[[1366,1366],\\\"mapped\\\",[1414]],[[1367,1368],\\\"disallowed\\\"],[[1369,1369],\\\"valid\\\"],[[1370,1375],\\\"valid\\\",[],\\\"NV8\\\"],[[1376,1376],\\\"disallowed\\\"],[[137
7,1414],\\\"valid\\\"],[[1415,1415],\\\"mapped\\\",[1381,1410]],[[1416,1416],\\\"disallowed\\\"],[[1417,1417],\\\"valid\\\",[],\\\"NV8\\\"],[[1418,1418],\\\"valid\\\",[],\\\"NV8\\\"],[[1419,1420],\\\"disallowed\\\"],[[1421,1422],\\\"valid\\\",[],\\\"NV8\\\"],[[1423,1423],\\\"valid\\\",[],\\\"NV8\\\"],[[1424,1424],\\\"disallowed\\\"],[[1425,1441],\\\"valid\\\"],[[1442,1442],\\\"valid\\\"],[[1443,1455],\\\"valid\\\"],[[1456,1465],\\\"valid\\\"],[[1466,1466],\\\"valid\\\"],[[1467,1469],\\\"valid\\\"],[[1470,1470],\\\"valid\\\",[],\\\"NV8\\\"],[[1471,1471],\\\"valid\\\"],[[1472,1472],\\\"valid\\\",[],\\\"NV8\\\"],[[1473,1474],\\\"valid\\\"],[[1475,1475],\\\"valid\\\",[],\\\"NV8\\\"],[[1476,1476],\\\"valid\\\"],[[1477,1477],\\\"valid\\\"],[[1478,1478],\\\"valid\\\",[],\\\"NV8\\\"],[[1479,1479],\\\"valid\\\"],[[1480,1487],\\\"disallowed\\\"],[[1488,1514],\\\"valid\\\"],[[1515,1519],\\\"disallowed\\\"],[[1520,1524],\\\"valid\\\"],[[1525,1535],\\\"disallowed\\\"],[[1536,1539],\\\"disallowed\\\"],[[1540,1540],\\\"disallowed\\\"],[[1541,1541],\\\"disallowed\\\"],[[1542,1546],\\\"valid\\\",[],\\\"NV8\\\"],[[1547,1547],\\\"valid\\\",[],\\\"NV8\\\"],[[1548,1548],\\\"valid\\\",[],\\\"NV8\\\"],[[1549,1551],\\\"valid\\\",[],\\\"NV8\\\"],[[1552,1557],\\\"valid\\\"],[[1558,1562],\\\"valid\\\"],[[1563,1563],\\\"valid\\\",[],\\\"NV8\\\"],[[1564,1564],\\\"disallowed\\\"],[[1565,1565],\\\"disallowed\\\"],[[1566,1566],\\\"valid\\\",[],\\\"NV8\\\"],[[1567,1567],\\\"valid\\\",[],\\\"NV8\\\"],[[1568,1568],\\\"valid\\\"],[[1569,1594],\\\"valid\\\"],[[1595,1599],\\\"valid\\\"],[[1600,1600],\\\"valid\\\",[],\\\"NV8\\\"],[[1601,1618],\\\"valid\\\"],[[1619,1621],\\\"valid\\\"],[[1622,1624],\\\"valid\\\"],[[1625,1630],\\\"valid\\\"],[[1631,1631],\\\"valid\\\"],[[1632,1641],\\\"valid\\\"],[[1642,1645],\\\"valid\\\",[],\\\"NV8\\\"],[[1646,1647],\\\"valid\\\"],[[1648,1652],\\\"valid\\\"],[[1653,1653],\\\"mapped\\\",[1575,1652]],[[1654,1654],\\\"mapped\\\",[1608,1652]],[[1655,1655],\\\"mapped\\\",[173
5,1652]],[[1656,1656],\\\"mapped\\\",[1610,1652]],[[1657,1719],\\\"valid\\\"],[[1720,1721],\\\"valid\\\"],[[1722,1726],\\\"valid\\\"],[[1727,1727],\\\"valid\\\"],[[1728,1742],\\\"valid\\\"],[[1743,1743],\\\"valid\\\"],[[1744,1747],\\\"valid\\\"],[[1748,1748],\\\"valid\\\",[],\\\"NV8\\\"],[[1749,1756],\\\"valid\\\"],[[1757,1757],\\\"disallowed\\\"],[[1758,1758],\\\"valid\\\",[],\\\"NV8\\\"],[[1759,1768],\\\"valid\\\"],[[1769,1769],\\\"valid\\\",[],\\\"NV8\\\"],[[1770,1773],\\\"valid\\\"],[[1774,1775],\\\"valid\\\"],[[1776,1785],\\\"valid\\\"],[[1786,1790],\\\"valid\\\"],[[1791,1791],\\\"valid\\\"],[[1792,1805],\\\"valid\\\",[],\\\"NV8\\\"],[[1806,1806],\\\"disallowed\\\"],[[1807,1807],\\\"disallowed\\\"],[[1808,1836],\\\"valid\\\"],[[1837,1839],\\\"valid\\\"],[[1840,1866],\\\"valid\\\"],[[1867,1868],\\\"disallowed\\\"],[[1869,1871],\\\"valid\\\"],[[1872,1901],\\\"valid\\\"],[[1902,1919],\\\"valid\\\"],[[1920,1968],\\\"valid\\\"],[[1969,1969],\\\"valid\\\"],[[1970,1983],\\\"disallowed\\\"],[[1984,2037],\\\"valid\\\"],[[2038,2042],\\\"valid\\\",[],\\\"NV8\\\"],[[2043,2047],\\\"disallowed\\\"],[[2048,2093],\\\"valid\\\"],[[2094,2095],\\\"disallowed\\\"],[[2096,2110],\\\"valid\\\",[],\\\"NV8\\\"],[[2111,2111],\\\"disallowed\\\"],[[2112,2139],\\\"valid\\\"],[[2140,2141],\\\"disallowed\\\"],[[2142,2142],\\\"valid\\\",[],\\\"NV8\\\"],[[2143,2207],\\\"disallowed\\\"],[[2208,2208],\\\"valid\\\"],[[2209,2209],\\\"valid\\\"],[[2210,2220],\\\"valid\\\"],[[2221,2226],\\\"valid\\\"],[[2227,2228],\\\"valid\\\"],[[2229,2274],\\\"disallowed\\\"],[[2275,2275],\\\"valid\\\"],[[2276,2302],\\\"valid\\\"],[[2303,2303],\\\"valid\\\"],[[2304,2304],\\\"valid\\\"],[[2305,2307],\\\"valid\\\"],[[2308,2308],\\\"valid\\\"],[[2309,2361],\\\"valid\\\"],[[2362,2363],\\\"valid\\\"],[[2364,2381],\\\"valid\\\"],[[2382,2382],\\\"valid\\\"],[[2383,2383],\\\"valid\\\"],[[2384,2388],\\\"valid\\\"],[[2389,2389],\\\"valid\\\"],[[2390,2391],\\\"valid\\\"],[[2392,2392],\\\"mapped\\\",[2325,2364]],[[2393,2393],
\\\"mapped\\\",[2326,2364]],[[2394,2394],\\\"mapped\\\",[2327,2364]],[[2395,2395],\\\"mapped\\\",[2332,2364]],[[2396,2396],\\\"mapped\\\",[2337,2364]],[[2397,2397],\\\"mapped\\\",[2338,2364]],[[2398,2398],\\\"mapped\\\",[2347,2364]],[[2399,2399],\\\"mapped\\\",[2351,2364]],[[2400,2403],\\\"valid\\\"],[[2404,2405],\\\"valid\\\",[],\\\"NV8\\\"],[[2406,2415],\\\"valid\\\"],[[2416,2416],\\\"valid\\\",[],\\\"NV8\\\"],[[2417,2418],\\\"valid\\\"],[[2419,2423],\\\"valid\\\"],[[2424,2424],\\\"valid\\\"],[[2425,2426],\\\"valid\\\"],[[2427,2428],\\\"valid\\\"],[[2429,2429],\\\"valid\\\"],[[2430,2431],\\\"valid\\\"],[[2432,2432],\\\"valid\\\"],[[2433,2435],\\\"valid\\\"],[[2436,2436],\\\"disallowed\\\"],[[2437,2444],\\\"valid\\\"],[[2445,2446],\\\"disallowed\\\"],[[2447,2448],\\\"valid\\\"],[[2449,2450],\\\"disallowed\\\"],[[2451,2472],\\\"valid\\\"],[[2473,2473],\\\"disallowed\\\"],[[2474,2480],\\\"valid\\\"],[[2481,2481],\\\"disallowed\\\"],[[2482,2482],\\\"valid\\\"],[[2483,2485],\\\"disallowed\\\"],[[2486,2489],\\\"valid\\\"],[[2490,2491],\\\"disallowed\\\"],[[2492,2492],\\\"valid\\\"],[[2493,2493],\\\"valid\\\"],[[2494,2500],\\\"valid\\\"],[[2501,2502],\\\"disallowed\\\"],[[2503,2504],\\\"valid\\\"],[[2505,2506],\\\"disallowed\\\"],[[2507,2509],\\\"valid\\\"],[[2510,2510],\\\"valid\\\"],[[2511,2518],\\\"disallowed\\\"],[[2519,2519],\\\"valid\\\"],[[2520,2523],\\\"disallowed\\\"],[[2524,2524],\\\"mapped\\\",[2465,2492]],[[2525,2525],\\\"mapped\\\",[2466,2492]],[[2526,2526],\\\"disallowed\\\"],[[2527,2527],\\\"mapped\\\",[2479,2492]],[[2528,2531],\\\"valid\\\"],[[2532,2533],\\\"disallowed\\\"],[[2534,2545],\\\"valid\\\"],[[2546,2554],\\\"valid\\\",[],\\\"NV8\\\"],[[2555,2555],\\\"valid\\\",[],\\\"NV8\\\"],[[2556,2560],\\\"disallowed\\\"],[[2561,2561],\\\"valid\\\"],[[2562,2562],\\\"valid\\\"],[[2563,2563],\\\"valid\\\"],[[2564,2564],\\\"disallowed\\\"],[[2565,2570],\\\"valid\\\"],[[2571,2574],\\\"disallowed\\\"],[[2575,2576],\\\"valid\\\"],[[2577,2578],\\\"disallowed\\\"],[[
2579,2600],\\\"valid\\\"],[[2601,2601],\\\"disallowed\\\"],[[2602,2608],\\\"valid\\\"],[[2609,2609],\\\"disallowed\\\"],[[2610,2610],\\\"valid\\\"],[[2611,2611],\\\"mapped\\\",[2610,2620]],[[2612,2612],\\\"disallowed\\\"],[[2613,2613],\\\"valid\\\"],[[2614,2614],\\\"mapped\\\",[2616,2620]],[[2615,2615],\\\"disallowed\\\"],[[2616,2617],\\\"valid\\\"],[[2618,2619],\\\"disallowed\\\"],[[2620,2620],\\\"valid\\\"],[[2621,2621],\\\"disallowed\\\"],[[2622,2626],\\\"valid\\\"],[[2627,2630],\\\"disallowed\\\"],[[2631,2632],\\\"valid\\\"],[[2633,2634],\\\"disallowed\\\"],[[2635,2637],\\\"valid\\\"],[[2638,2640],\\\"disallowed\\\"],[[2641,2641],\\\"valid\\\"],[[2642,2648],\\\"disallowed\\\"],[[2649,2649],\\\"mapped\\\",[2582,2620]],[[2650,2650],\\\"mapped\\\",[2583,2620]],[[2651,2651],\\\"mapped\\\",[2588,2620]],[[2652,2652],\\\"valid\\\"],[[2653,2653],\\\"disallowed\\\"],[[2654,2654],\\\"mapped\\\",[2603,2620]],[[2655,2661],\\\"disallowed\\\"],[[2662,2676],\\\"valid\\\"],[[2677,2677],\\\"valid\\\"],[[2678,2688],\\\"disallowed\\\"],[[2689,2691],\\\"valid\\\"],[[2692,2692],\\\"disallowed\\\"],[[2693,2699],\\\"valid\\\"],[[2700,2700],\\\"valid\\\"],[[2701,2701],\\\"valid\\\"],[[2702,2702],\\\"disallowed\\\"],[[2703,2705],\\\"valid\\\"],[[2706,2706],\\\"disallowed\\\"],[[2707,2728],\\\"valid\\\"],[[2729,2729],\\\"disallowed\\\"],[[2730,2736],\\\"valid\\\"],[[2737,2737],\\\"disallowed\\\"],[[2738,2739],\\\"valid\\\"],[[2740,2740],\\\"disallowed\\\"],[[2741,2745],\\\"valid\\\"],[[2746,2747],\\\"disallowed\\\"],[[2748,2757],\\\"valid\\\"],[[2758,2758],\\\"disallowed\\\"],[[2759,2761],\\\"valid\\\"],[[2762,2762],\\\"disallowed\\\"],[[2763,2765],\\\"valid\\\"],[[2766,2767],\\\"disallowed\\\"],[[2768,2768],\\\"valid\\\"],[[2769,2783],\\\"disallowed\\\"],[[2784,2784],\\\"valid\\\"],[[2785,2787],\\\"valid\\\"],[[2788,2789],\\\"disallowed\\\"],[[2790,2799],\\\"valid\\\"],[[2800,2800],\\\"valid\\\",[],\\\"NV8\\\"],[[2801,2801],\\\"valid\\\",[],\\\"NV8\\\"],[[2802,2808],\\\"disallowed\\\"],
[[2809,2809],\\\"valid\\\"],[[2810,2816],\\\"disallowed\\\"],[[2817,2819],\\\"valid\\\"],[[2820,2820],\\\"disallowed\\\"],[[2821,2828],\\\"valid\\\"],[[2829,2830],\\\"disallowed\\\"],[[2831,2832],\\\"valid\\\"],[[2833,2834],\\\"disallowed\\\"],[[2835,2856],\\\"valid\\\"],[[2857,2857],\\\"disallowed\\\"],[[2858,2864],\\\"valid\\\"],[[2865,2865],\\\"disallowed\\\"],[[2866,2867],\\\"valid\\\"],[[2868,2868],\\\"disallowed\\\"],[[2869,2869],\\\"valid\\\"],[[2870,2873],\\\"valid\\\"],[[2874,2875],\\\"disallowed\\\"],[[2876,2883],\\\"valid\\\"],[[2884,2884],\\\"valid\\\"],[[2885,2886],\\\"disallowed\\\"],[[2887,2888],\\\"valid\\\"],[[2889,2890],\\\"disallowed\\\"],[[2891,2893],\\\"valid\\\"],[[2894,2901],\\\"disallowed\\\"],[[2902,2903],\\\"valid\\\"],[[2904,2907],\\\"disallowed\\\"],[[2908,2908],\\\"mapped\\\",[2849,2876]],[[2909,2909],\\\"mapped\\\",[2850,2876]],[[2910,2910],\\\"disallowed\\\"],[[2911,2913],\\\"valid\\\"],[[2914,2915],\\\"valid\\\"],[[2916,2917],\\\"disallowed\\\"],[[2918,2927],\\\"valid\\\"],[[2928,2928],\\\"valid\\\",[],\\\"NV8\\\"],[[2929,2929],\\\"valid\\\"],[[2930,2935],\\\"valid\\\",[],\\\"NV8\\\"],[[2936,2945],\\\"disallowed\\\"],[[2946,2947],\\\"valid\\\"],[[2948,2948],\\\"disallowed\\\"],[[2949,2954],\\\"valid\\\"],[[2955,2957],\\\"disallowed\\\"],[[2958,2960],\\\"valid\\\"],[[2961,2961],\\\"disallowed\\\"],[[2962,2965],\\\"valid\\\"],[[2966,2968],\\\"disallowed\\\"],[[2969,2970],\\\"valid\\\"],[[2971,2971],\\\"disallowed\\\"],[[2972,2972],\\\"valid\\\"],[[2973,2973],\\\"disallowed\\\"],[[2974,2975],\\\"valid\\\"],[[2976,2978],\\\"disallowed\\\"],[[2979,2980],\\\"valid\\\"],[[2981,2983],\\\"disallowed\\\"],[[2984,2986],\\\"valid\\\"],[[2987,2989],\\\"disallowed\\\"],[[2990,2997],\\\"valid\\\"],[[2998,2998],\\\"valid\\\"],[[2999,3001],\\\"valid\\\"],[[3002,3005],\\\"disallowed\\\"],[[3006,3010],\\\"valid\\\"],[[3011,3013],\\\"disallowed\\\"],[[3014,3016],\\\"valid\\\"],[[3017,3017],\\\"disallowed\\\"],[[3018,3021],\\\"valid\\\"],[[3022,3023],\\\"
disallowed\\\"],[[3024,3024],\\\"valid\\\"],[[3025,3030],\\\"disallowed\\\"],[[3031,3031],\\\"valid\\\"],[[3032,3045],\\\"disallowed\\\"],[[3046,3046],\\\"valid\\\"],[[3047,3055],\\\"valid\\\"],[[3056,3058],\\\"valid\\\",[],\\\"NV8\\\"],[[3059,3066],\\\"valid\\\",[],\\\"NV8\\\"],[[3067,3071],\\\"disallowed\\\"],[[3072,3072],\\\"valid\\\"],[[3073,3075],\\\"valid\\\"],[[3076,3076],\\\"disallowed\\\"],[[3077,3084],\\\"valid\\\"],[[3085,3085],\\\"disallowed\\\"],[[3086,3088],\\\"valid\\\"],[[3089,3089],\\\"disallowed\\\"],[[3090,3112],\\\"valid\\\"],[[3113,3113],\\\"disallowed\\\"],[[3114,3123],\\\"valid\\\"],[[3124,3124],\\\"valid\\\"],[[3125,3129],\\\"valid\\\"],[[3130,3132],\\\"disallowed\\\"],[[3133,3133],\\\"valid\\\"],[[3134,3140],\\\"valid\\\"],[[3141,3141],\\\"disallowed\\\"],[[3142,3144],\\\"valid\\\"],[[3145,3145],\\\"disallowed\\\"],[[3146,3149],\\\"valid\\\"],[[3150,3156],\\\"disallowed\\\"],[[3157,3158],\\\"valid\\\"],[[3159,3159],\\\"disallowed\\\"],[[3160,3161],\\\"valid\\\"],[[3162,3162],\\\"valid\\\"],[[3163,3167],\\\"disallowed\\\"],[[3168,3169],\\\"valid\\\"],[[3170,3171],\\\"valid\\\"],[[3172,3173],\\\"disallowed\\\"],[[3174,3183],\\\"valid\\\"],[[3184,3191],\\\"disallowed\\\"],[[3192,3199],\\\"valid\\\",[],\\\"NV8\\\"],[[3200,3200],\\\"disallowed\\\"],[[3201,3201],\\\"valid\\\"],[[3202,3203],\\\"valid\\\"],[[3204,3204],\\\"disallowed\\\"],[[3205,3212],\\\"valid\\\"],[[3213,3213],\\\"disallowed\\\"],[[3214,3216],\\\"valid\\\"],[[3217,3217],\\\"disallowed\\\"],[[3218,3240],\\\"valid\\\"],[[3241,3241],\\\"disallowed\\\"],[[3242,3251],\\\"valid\\\"],[[3252,3252],\\\"disallowed\\\"],[[3253,3257],\\\"valid\\\"],[[3258,3259],\\\"disallowed\\\"],[[3260,3261],\\\"valid\\\"],[[3262,3268],\\\"valid\\\"],[[3269,3269],\\\"disallowed\\\"],[[3270,3272],\\\"valid\\\"],[[3273,3273],\\\"disallowed\\\"],[[3274,3277],\\\"valid\\\"],[[3278,3284],\\\"disallowed\\\"],[[3285,3286],\\\"valid\\\"],[[3287,3293],\\\"disallowed\\\"],[[3294,3294],\\\"valid\\\"],[[3295,3295],\\\"
disallowed\\\"],[[3296,3297],\\\"valid\\\"],[[3298,3299],\\\"valid\\\"],[[3300,3301],\\\"disallowed\\\"],[[3302,3311],\\\"valid\\\"],[[3312,3312],\\\"disallowed\\\"],[[3313,3314],\\\"valid\\\"],[[3315,3328],\\\"disallowed\\\"],[[3329,3329],\\\"valid\\\"],[[3330,3331],\\\"valid\\\"],[[3332,3332],\\\"disallowed\\\"],[[3333,3340],\\\"valid\\\"],[[3341,3341],\\\"disallowed\\\"],[[3342,3344],\\\"valid\\\"],[[3345,3345],\\\"disallowed\\\"],[[3346,3368],\\\"valid\\\"],[[3369,3369],\\\"valid\\\"],[[3370,3385],\\\"valid\\\"],[[3386,3386],\\\"valid\\\"],[[3387,3388],\\\"disallowed\\\"],[[3389,3389],\\\"valid\\\"],[[3390,3395],\\\"valid\\\"],[[3396,3396],\\\"valid\\\"],[[3397,3397],\\\"disallowed\\\"],[[3398,3400],\\\"valid\\\"],[[3401,3401],\\\"disallowed\\\"],[[3402,3405],\\\"valid\\\"],[[3406,3406],\\\"valid\\\"],[[3407,3414],\\\"disallowed\\\"],[[3415,3415],\\\"valid\\\"],[[3416,3422],\\\"disallowed\\\"],[[3423,3423],\\\"valid\\\"],[[3424,3425],\\\"valid\\\"],[[3426,3427],\\\"valid\\\"],[[3428,3429],\\\"disallowed\\\"],[[3430,3439],\\\"valid\\\"],[[3440,3445],\\\"valid\\\",[],\\\"NV8\\\"],[[3446,3448],\\\"disallowed\\\"],[[3449,3449],\\\"valid\\\",[],\\\"NV8\\\"],[[3450,3455],\\\"valid\\\"],[[3456,3457],\\\"disallowed\\\"],[[3458,3459],\\\"valid\\\"],[[3460,3460],\\\"disallowed\\\"],[[3461,3478],\\\"valid\\\"],[[3479,3481],\\\"disallowed\\\"],[[3482,3505],\\\"valid\\\"],[[3506,3506],\\\"disallowed\\\"],[[3507,3515],\\\"valid\\\"],[[3516,3516],\\\"disallowed\\\"],[[3517,3517],\\\"valid\\\"],[[3518,3519],\\\"disallowed\\\"],[[3520,3526],\\\"valid\\\"],[[3527,3529],\\\"disallowed\\\"],[[3530,3530],\\\"valid\\\"],[[3531,3534],\\\"disallowed\\\"],[[3535,3540],\\\"valid\\\"],[[3541,3541],\\\"disallowed\\\"],[[3542,3542],\\\"valid\\\"],[[3543,3543],\\\"disallowed\\\"],[[3544,3551],\\\"valid\\\"],[[3552,3557],\\\"disallowed\\\"],[[3558,3567],\\\"valid\\\"],[[3568,3569],\\\"disallowed\\\"],[[3570,3571],\\\"valid\\\"],[[3572,3572],\\\"valid\\\",[],\\\"NV8\\\"],[[3573,3584],\\\"disal
lowed\\\"],[[3585,3634],\\\"valid\\\"],[[3635,3635],\\\"mapped\\\",[3661,3634]],[[3636,3642],\\\"valid\\\"],[[3643,3646],\\\"disallowed\\\"],[[3647,3647],\\\"valid\\\",[],\\\"NV8\\\"],[[3648,3662],\\\"valid\\\"],[[3663,3663],\\\"valid\\\",[],\\\"NV8\\\"],[[3664,3673],\\\"valid\\\"],[[3674,3675],\\\"valid\\\",[],\\\"NV8\\\"],[[3676,3712],\\\"disallowed\\\"],[[3713,3714],\\\"valid\\\"],[[3715,3715],\\\"disallowed\\\"],[[3716,3716],\\\"valid\\\"],[[3717,3718],\\\"disallowed\\\"],[[3719,3720],\\\"valid\\\"],[[3721,3721],\\\"disallowed\\\"],[[3722,3722],\\\"valid\\\"],[[3723,3724],\\\"disallowed\\\"],[[3725,3725],\\\"valid\\\"],[[3726,3731],\\\"disallowed\\\"],[[3732,3735],\\\"valid\\\"],[[3736,3736],\\\"disallowed\\\"],[[3737,3743],\\\"valid\\\"],[[3744,3744],\\\"disallowed\\\"],[[3745,3747],\\\"valid\\\"],[[3748,3748],\\\"disallowed\\\"],[[3749,3749],\\\"valid\\\"],[[3750,3750],\\\"disallowed\\\"],[[3751,3751],\\\"valid\\\"],[[3752,3753],\\\"disallowed\\\"],[[3754,3755],\\\"valid\\\"],[[3756,3756],\\\"disallowed\\\"],[[3757,3762],\\\"valid\\\"],[[3763,3763],\\\"mapped\\\",[3789,3762]],[[3764,3769],\\\"valid\\\"],[[3770,3770],\\\"disallowed\\\"],[[3771,3773],\\\"valid\\\"],[[3774,3775],\\\"disallowed\\\"],[[3776,3780],\\\"valid\\\"],[[3781,3781],\\\"disallowed\\\"],[[3782,3782],\\\"valid\\\"],[[3783,3783],\\\"disallowed\\\"],[[3784,3789],\\\"valid\\\"],[[3790,3791],\\\"disallowed\\\"],[[3792,3801],\\\"valid\\\"],[[3802,3803],\\\"disallowed\\\"],[[3804,3804],\\\"mapped\\\",[3755,3737]],[[3805,3805],\\\"mapped\\\",[3755,3745]],[[3806,3807],\\\"valid\\\"],[[3808,3839],\\\"disallowed\\\"],[[3840,3840],\\\"valid\\\"],[[3841,3850],\\\"valid\\\",[],\\\"NV8\\\"],[[3851,3851],\\\"valid\\\"],[[3852,3852],\\\"mapped\\\",[3851]],[[3853,3863],\\\"valid\\\",[],\\\"NV8\\\"],[[3864,3865],\\\"valid\\\"],[[3866,3871],\\\"valid\\\",[],\\\"NV8\\\"],[[3872,3881],\\\"valid\\\"],[[3882,3892],\\\"valid\\\",[],\\\"NV8\\\"],[[3893,3893],\\\"valid\\\"],[[3894,3894],\\\"valid\\\",[],\\\"NV8\\\"],[
[3895,3895],\\\"valid\\\"],[[3896,3896],\\\"valid\\\",[],\\\"NV8\\\"],[[3897,3897],\\\"valid\\\"],[[3898,3901],\\\"valid\\\",[],\\\"NV8\\\"],[[3902,3906],\\\"valid\\\"],[[3907,3907],\\\"mapped\\\",[3906,4023]],[[3908,3911],\\\"valid\\\"],[[3912,3912],\\\"disallowed\\\"],[[3913,3916],\\\"valid\\\"],[[3917,3917],\\\"mapped\\\",[3916,4023]],[[3918,3921],\\\"valid\\\"],[[3922,3922],\\\"mapped\\\",[3921,4023]],[[3923,3926],\\\"valid\\\"],[[3927,3927],\\\"mapped\\\",[3926,4023]],[[3928,3931],\\\"valid\\\"],[[3932,3932],\\\"mapped\\\",[3931,4023]],[[3933,3944],\\\"valid\\\"],[[3945,3945],\\\"mapped\\\",[3904,4021]],[[3946,3946],\\\"valid\\\"],[[3947,3948],\\\"valid\\\"],[[3949,3952],\\\"disallowed\\\"],[[3953,3954],\\\"valid\\\"],[[3955,3955],\\\"mapped\\\",[3953,3954]],[[3956,3956],\\\"valid\\\"],[[3957,3957],\\\"mapped\\\",[3953,3956]],[[3958,3958],\\\"mapped\\\",[4018,3968]],[[3959,3959],\\\"mapped\\\",[4018,3953,3968]],[[3960,3960],\\\"mapped\\\",[4019,3968]],[[3961,3961],\\\"mapped\\\",[4019,3953,3968]],[[3962,3968],\\\"valid\\\"],[[3969,3969],\\\"mapped\\\",[3953,3968]],[[3970,3972],\\\"valid\\\"],[[3973,3973],\\\"valid\\\",[],\\\"NV8\\\"],[[3974,3979],\\\"valid\\\"],[[3980,3983],\\\"valid\\\"],[[3984,3986],\\\"valid\\\"],[[3987,3987],\\\"mapped\\\",[3986,4023]],[[3988,3989],\\\"valid\\\"],[[3990,3990],\\\"valid\\\"],[[3991,3991],\\\"valid\\\"],[[3992,3992],\\\"disallowed\\\"],[[3993,3996],\\\"valid\\\"],[[3997,3997],\\\"mapped\\\",[3996,4023]],[[3998,4001],\\\"valid\\\"],[[4002,4002],\\\"mapped\\\",[4001,4023]],[[4003,4006],\\\"valid\\\"],[[4007,4007],\\\"mapped\\\",[4006,4023]],[[4008,4011],\\\"valid\\\"],[[4012,4012],\\\"mapped\\\",[4011,4023]],[[4013,4013],\\\"valid\\\"],[[4014,4016],\\\"valid\\\"],[[4017,4023],\\\"valid\\\"],[[4024,4024],\\\"valid\\\"],[[4025,4025],\\\"mapped\\\",[3984,4021]],[[4026,4028],\\\"valid\\\"],[[4029,4029],\\\"disallowed\\\"],[[4030,4037],\\\"valid\\\",[],\\\"NV8\\\"],[[4038,4038],\\\"valid\\\"],[[4039,4044],\\\"valid\\\",[],\\\"NV8\\\
"],[[4045,4045],\\\"disallowed\\\"],[[4046,4046],\\\"valid\\\",[],\\\"NV8\\\"],[[4047,4047],\\\"valid\\\",[],\\\"NV8\\\"],[[4048,4049],\\\"valid\\\",[],\\\"NV8\\\"],[[4050,4052],\\\"valid\\\",[],\\\"NV8\\\"],[[4053,4056],\\\"valid\\\",[],\\\"NV8\\\"],[[4057,4058],\\\"valid\\\",[],\\\"NV8\\\"],[[4059,4095],\\\"disallowed\\\"],[[4096,4129],\\\"valid\\\"],[[4130,4130],\\\"valid\\\"],[[4131,4135],\\\"valid\\\"],[[4136,4136],\\\"valid\\\"],[[4137,4138],\\\"valid\\\"],[[4139,4139],\\\"valid\\\"],[[4140,4146],\\\"valid\\\"],[[4147,4149],\\\"valid\\\"],[[4150,4153],\\\"valid\\\"],[[4154,4159],\\\"valid\\\"],[[4160,4169],\\\"valid\\\"],[[4170,4175],\\\"valid\\\",[],\\\"NV8\\\"],[[4176,4185],\\\"valid\\\"],[[4186,4249],\\\"valid\\\"],[[4250,4253],\\\"valid\\\"],[[4254,4255],\\\"valid\\\",[],\\\"NV8\\\"],[[4256,4293],\\\"disallowed\\\"],[[4294,4294],\\\"disallowed\\\"],[[4295,4295],\\\"mapped\\\",[11559]],[[4296,4300],\\\"disallowed\\\"],[[4301,4301],\\\"mapped\\\",[11565]],[[4302,4303],\\\"disallowed\\\"],[[4304,4342],\\\"valid\\\"],[[4343,4344],\\\"valid\\\"],[[4345,4346],\\\"valid\\\"],[[4347,4347],\\\"valid\\\",[],\\\"NV8\\\"],[[4348,4348],\\\"mapped\\\",[4316]],[[4349,4351],\\\"valid\\\"],[[4352,4441],\\\"valid\\\",[],\\\"NV8\\\"],[[4442,4446],\\\"valid\\\",[],\\\"NV8\\\"],[[4447,4448],\\\"disallowed\\\"],[[4449,4514],\\\"valid\\\",[],\\\"NV8\\\"],[[4515,4519],\\\"valid\\\",[],\\\"NV8\\\"],[[4520,4601],\\\"valid\\\",[],\\\"NV8\\\"],[[4602,4607],\\\"valid\\\",[],\\\"NV8\\\"],[[4608,4614],\\\"valid\\\"],[[4615,4615],\\\"valid\\\"],[[4616,4678],\\\"valid\\\"],[[4679,4679],\\\"valid\\\"],[[4680,4680],\\\"valid\\\"],[[4681,4681],\\\"disallowed\\\"],[[4682,4685],\\\"valid\\\"],[[4686,4687],\\\"disallowed\\\"],[[4688,4694],\\\"valid\\\"],[[4695,4695],\\\"disallowed\\\"],[[4696,4696],\\\"valid\\\"],[[4697,4697],\\\"disallowed\\\"],[[4698,4701],\\\"valid\\\"],[[4702,4703],\\\"disallowed\\\"],[[4704,4742],\\\"valid\\\"],[[4743,4743],\\\"valid\\\"],[[4744,4744],\\\"valid\\\"],[[4745
,4745],\\\"disallowed\\\"],[[4746,4749],\\\"valid\\\"],[[4750,4751],\\\"disallowed\\\"],[[4752,4782],\\\"valid\\\"],[[4783,4783],\\\"valid\\\"],[[4784,4784],\\\"valid\\\"],[[4785,4785],\\\"disallowed\\\"],[[4786,4789],\\\"valid\\\"],[[4790,4791],\\\"disallowed\\\"],[[4792,4798],\\\"valid\\\"],[[4799,4799],\\\"disallowed\\\"],[[4800,4800],\\\"valid\\\"],[[4801,4801],\\\"disallowed\\\"],[[4802,4805],\\\"valid\\\"],[[4806,4807],\\\"disallowed\\\"],[[4808,4814],\\\"valid\\\"],[[4815,4815],\\\"valid\\\"],[[4816,4822],\\\"valid\\\"],[[4823,4823],\\\"disallowed\\\"],[[4824,4846],\\\"valid\\\"],[[4847,4847],\\\"valid\\\"],[[4848,4878],\\\"valid\\\"],[[4879,4879],\\\"valid\\\"],[[4880,4880],\\\"valid\\\"],[[4881,4881],\\\"disallowed\\\"],[[4882,4885],\\\"valid\\\"],[[4886,4887],\\\"disallowed\\\"],[[4888,4894],\\\"valid\\\"],[[4895,4895],\\\"valid\\\"],[[4896,4934],\\\"valid\\\"],[[4935,4935],\\\"valid\\\"],[[4936,4954],\\\"valid\\\"],[[4955,4956],\\\"disallowed\\\"],[[4957,4958],\\\"valid\\\"],[[4959,4959],\\\"valid\\\"],[[4960,4960],\\\"valid\\\",[],\\\"NV8\\\"],[[4961,4988],\\\"valid\\\",[],\\\"NV8\\\"],[[4989,4991],\\\"disallowed\\\"],[[4992,5007],\\\"valid\\\"],[[5008,5017],\\\"valid\\\",[],\\\"NV8\\\"],[[5018,5023],\\\"disallowed\\\"],[[5024,5108],\\\"valid\\\"],[[5109,5109],\\\"valid\\\"],[[5110,5111],\\\"disallowed\\\"],[[5112,5112],\\\"mapped\\\",[5104]],[[5113,5113],\\\"mapped\\\",[5105]],[[5114,5114],\\\"mapped\\\",[5106]],[[5115,5115],\\\"mapped\\\",[5107]],[[5116,5116],\\\"mapped\\\",[5108]],[[5117,5117],\\\"mapped\\\",[5109]],[[5118,5119],\\\"disallowed\\\"],[[5120,5120],\\\"valid\\\",[],\\\"NV8\\\"],[[5121,5740],\\\"valid\\\"],[[5741,5742],\\\"valid\\\",[],\\\"NV8\\\"],[[5743,5750],\\\"valid\\\"],[[5751,5759],\\\"valid\\\"],[[5760,5760],\\\"disallowed\\\"],[[5761,5786],\\\"valid\\\"],[[5787,5788],\\\"valid\\\",[],\\\"NV8\\\"],[[5789,5791],\\\"disallowed\\\"],[[5792,5866],\\\"valid\\\"],[[5867,5872],\\\"valid\\\",[],\\\"NV8\\\"],[[5873,5880],\\\"valid\\\"],[[58
81,5887],\\\"disallowed\\\"],[[5888,5900],\\\"valid\\\"],[[5901,5901],\\\"disallowed\\\"],[[5902,5908],\\\"valid\\\"],[[5909,5919],\\\"disallowed\\\"],[[5920,5940],\\\"valid\\\"],[[5941,5942],\\\"valid\\\",[],\\\"NV8\\\"],[[5943,5951],\\\"disallowed\\\"],[[5952,5971],\\\"valid\\\"],[[5972,5983],\\\"disallowed\\\"],[[5984,5996],\\\"valid\\\"],[[5997,5997],\\\"disallowed\\\"],[[5998,6000],\\\"valid\\\"],[[6001,6001],\\\"disallowed\\\"],[[6002,6003],\\\"valid\\\"],[[6004,6015],\\\"disallowed\\\"],[[6016,6067],\\\"valid\\\"],[[6068,6069],\\\"disallowed\\\"],[[6070,6099],\\\"valid\\\"],[[6100,6102],\\\"valid\\\",[],\\\"NV8\\\"],[[6103,6103],\\\"valid\\\"],[[6104,6107],\\\"valid\\\",[],\\\"NV8\\\"],[[6108,6108],\\\"valid\\\"],[[6109,6109],\\\"valid\\\"],[[6110,6111],\\\"disallowed\\\"],[[6112,6121],\\\"valid\\\"],[[6122,6127],\\\"disallowed\\\"],[[6128,6137],\\\"valid\\\",[],\\\"NV8\\\"],[[6138,6143],\\\"disallowed\\\"],[[6144,6149],\\\"valid\\\",[],\\\"NV8\\\"],[[6150,6150],\\\"disallowed\\\"],[[6151,6154],\\\"valid\\\",[],\\\"NV8\\\"],[[6155,6157],\\\"ignored\\\"],[[6158,6158],\\\"disallowed\\\"],[[6159,6159],\\\"disallowed\\\"],[[6160,6169],\\\"valid\\\"],[[6170,6175],\\\"disallowed\\\"],[[6176,6263],\\\"valid\\\"],[[6264,6271],\\\"disallowed\\\"],[[6272,6313],\\\"valid\\\"],[[6314,6314],\\\"valid\\\"],[[6315,6319],\\\"disallowed\\\"],[[6320,6389],\\\"valid\\\"],[[6390,6399],\\\"disallowed\\\"],[[6400,6428],\\\"valid\\\"],[[6429,6430],\\\"valid\\\"],[[6431,6431],\\\"disallowed\\\"],[[6432,6443],\\\"valid\\\"],[[6444,6447],\\\"disallowed\\\"],[[6448,6459],\\\"valid\\\"],[[6460,6463],\\\"disallowed\\\"],[[6464,6464],\\\"valid\\\",[],\\\"NV8\\\"],[[6465,6467],\\\"disallowed\\\"],[[6468,6469],\\\"valid\\\",[],\\\"NV8\\\"],[[6470,6509],\\\"valid\\\"],[[6510,6511],\\\"disallowed\\\"],[[6512,6516],\\\"valid\\\"],[[6517,6527],\\\"disallowed\\\"],[[6528,6569],\\\"valid\\\"],[[6570,6571],\\\"valid\\\"],[[6572,6575],\\\"disallowed\\\"],[[6576,6601],\\\"valid\\\"],[[6602,6607],\\\
"disallowed\\\"],[[6608,6617],\\\"valid\\\"],[[6618,6618],\\\"valid\\\",[],\\\"XV8\\\"],[[6619,6621],\\\"disallowed\\\"],[[6622,6623],\\\"valid\\\",[],\\\"NV8\\\"],[[6624,6655],\\\"valid\\\",[],\\\"NV8\\\"],[[6656,6683],\\\"valid\\\"],[[6684,6685],\\\"disallowed\\\"],[[6686,6687],\\\"valid\\\",[],\\\"NV8\\\"],[[6688,6750],\\\"valid\\\"],[[6751,6751],\\\"disallowed\\\"],[[6752,6780],\\\"valid\\\"],[[6781,6782],\\\"disallowed\\\"],[[6783,6793],\\\"valid\\\"],[[6794,6799],\\\"disallowed\\\"],[[6800,6809],\\\"valid\\\"],[[6810,6815],\\\"disallowed\\\"],[[6816,6822],\\\"valid\\\",[],\\\"NV8\\\"],[[6823,6823],\\\"valid\\\"],[[6824,6829],\\\"valid\\\",[],\\\"NV8\\\"],[[6830,6831],\\\"disallowed\\\"],[[6832,6845],\\\"valid\\\"],[[6846,6846],\\\"valid\\\",[],\\\"NV8\\\"],[[6847,6911],\\\"disallowed\\\"],[[6912,6987],\\\"valid\\\"],[[6988,6991],\\\"disallowed\\\"],[[6992,7001],\\\"valid\\\"],[[7002,7018],\\\"valid\\\",[],\\\"NV8\\\"],[[7019,7027],\\\"valid\\\"],[[7028,7036],\\\"valid\\\",[],\\\"NV8\\\"],[[7037,7039],\\\"disallowed\\\"],[[7040,7082],\\\"valid\\\"],[[7083,7085],\\\"valid\\\"],[[7086,7097],\\\"valid\\\"],[[7098,7103],\\\"valid\\\"],[[7104,7155],\\\"valid\\\"],[[7156,7163],\\\"disallowed\\\"],[[7164,7167],\\\"valid\\\",[],\\\"NV8\\\"],[[7168,7223],\\\"valid\\\"],[[7224,7226],\\\"disallowed\\\"],[[7227,7231],\\\"valid\\\",[],\\\"NV8\\\"],[[7232,7241],\\\"valid\\\"],[[7242,7244],\\\"disallowed\\\"],[[7245,7293],\\\"valid\\\"],[[7294,7295],\\\"valid\\\",[],\\\"NV8\\\"],[[7296,7359],\\\"disallowed\\\"],[[7360,7367],\\\"valid\\\",[],\\\"NV8\\\"],[[7368,7375],\\\"disallowed\\\"],[[7376,7378],\\\"valid\\\"],[[7379,7379],\\\"valid\\\",[],\\\"NV8\\\"],[[7380,7410],\\\"valid\\\"],[[7411,7414],\\\"valid\\\"],[[7415,7415],\\\"disallowed\\\"],[[7416,7417],\\\"valid\\\"],[[7418,7423],\\\"disallowed\\\"],[[7424,7467],\\\"valid\\\"],[[7468,7468],\\\"mapped\\\",[97]],[[7469,7469],\\\"mapped\\\",[230]],[[7470,7470],\\\"mapped\\\",[98]],[[7471,7471],\\\"valid\\\"],[[7472,7472],\\\"
mapped\\\",[100]],[[7473,7473],\\\"mapped\\\",[101]],[[7474,7474],\\\"mapped\\\",[477]],[[7475,7475],\\\"mapped\\\",[103]],[[7476,7476],\\\"mapped\\\",[104]],[[7477,7477],\\\"mapped\\\",[105]],[[7478,7478],\\\"mapped\\\",[106]],[[7479,7479],\\\"mapped\\\",[107]],[[7480,7480],\\\"mapped\\\",[108]],[[7481,7481],\\\"mapped\\\",[109]],[[7482,7482],\\\"mapped\\\",[110]],[[7483,7483],\\\"valid\\\"],[[7484,7484],\\\"mapped\\\",[111]],[[7485,7485],\\\"mapped\\\",[547]],[[7486,7486],\\\"mapped\\\",[112]],[[7487,7487],\\\"mapped\\\",[114]],[[7488,7488],\\\"mapped\\\",[116]],[[7489,7489],\\\"mapped\\\",[117]],[[7490,7490],\\\"mapped\\\",[119]],[[7491,7491],\\\"mapped\\\",[97]],[[7492,7492],\\\"mapped\\\",[592]],[[7493,7493],\\\"mapped\\\",[593]],[[7494,7494],\\\"mapped\\\",[7426]],[[7495,7495],\\\"mapped\\\",[98]],[[7496,7496],\\\"mapped\\\",[100]],[[7497,7497],\\\"mapped\\\",[101]],[[7498,7498],\\\"mapped\\\",[601]],[[7499,7499],\\\"mapped\\\",[603]],[[7500,7500],\\\"mapped\\\",[604]],[[7501,7501],\\\"mapped\\\",[103]],[[7502,7502],\\\"valid\\\"],[[7503,7503],\\\"mapped\\\",[107]],[[7504,7504],\\\"mapped\\\",[109]],[[7505,7505],\\\"mapped\\\",[331]],[[7506,7506],\\\"mapped\\\",[111]],[[7507,7507],\\\"mapped\\\",[596]],[[7508,7508],\\\"mapped\\\",[7446]],[[7509,7509],\\\"mapped\\\",[7447]],[[7510,7510],\\\"mapped\\\",[112]],[[7511,7511],\\\"mapped\\\",[116]],[[7512,7512],\\\"mapped\\\",[117]],[[7513,7513],\\\"mapped\\\",[7453]],[[7514,7514],\\\"mapped\\\",[623]],[[7515,7515],\\\"mapped\\\",[118]],[[7516,7516],\\\"mapped\\\",[7461]],[[7517,7517],\\\"mapped\\\",[946]],[[7518,7518],\\\"mapped\\\",[947]],[[7519,7519],\\\"mapped\\\",[948]],[[7520,7520],\\\"mapped\\\",[966]],[[7521,7521],\\\"mapped\\\",[967]],[[7522,7522],\\\"mapped\\\",[105]],[[7523,7523],\\\"mapped\\\",[114]],[[7524,7524],\\\"mapped\\\",[117]],[[7525,7525],\\\"mapped\\\",[118]],[[7526,7526],\\\"mapped\\\",[946]],[[7527,7527],\\\"mapped\\\",[947]],[[7528,7528],\\\"mapped\\\",[961]],[[7529,7529],\\\"mapped\\\",[966]
],[[7530,7530],\\\"mapped\\\",[967]],[[7531,7531],\\\"valid\\\"],[[7532,7543],\\\"valid\\\"],[[7544,7544],\\\"mapped\\\",[1085]],[[7545,7578],\\\"valid\\\"],[[7579,7579],\\\"mapped\\\",[594]],[[7580,7580],\\\"mapped\\\",[99]],[[7581,7581],\\\"mapped\\\",[597]],[[7582,7582],\\\"mapped\\\",[240]],[[7583,7583],\\\"mapped\\\",[604]],[[7584,7584],\\\"mapped\\\",[102]],[[7585,7585],\\\"mapped\\\",[607]],[[7586,7586],\\\"mapped\\\",[609]],[[7587,7587],\\\"mapped\\\",[613]],[[7588,7588],\\\"mapped\\\",[616]],[[7589,7589],\\\"mapped\\\",[617]],[[7590,7590],\\\"mapped\\\",[618]],[[7591,7591],\\\"mapped\\\",[7547]],[[7592,7592],\\\"mapped\\\",[669]],[[7593,7593],\\\"mapped\\\",[621]],[[7594,7594],\\\"mapped\\\",[7557]],[[7595,7595],\\\"mapped\\\",[671]],[[7596,7596],\\\"mapped\\\",[625]],[[7597,7597],\\\"mapped\\\",[624]],[[7598,7598],\\\"mapped\\\",[626]],[[7599,7599],\\\"mapped\\\",[627]],[[7600,7600],\\\"mapped\\\",[628]],[[7601,7601],\\\"mapped\\\",[629]],[[7602,7602],\\\"mapped\\\",[632]],[[7603,7603],\\\"mapped\\\",[642]],[[7604,7604],\\\"mapped\\\",[643]],[[7605,7605],\\\"mapped\\\",[427]],[[7606,7606],\\\"mapped\\\",[649]],[[7607,7607],\\\"mapped\\\",[650]],[[7608,7608],\\\"mapped\\\",[7452]],[[7609,7609],\\\"mapped\\\",[651]],[[7610,7610],\\\"mapped\\\",[652]],[[7611,7611],\\\"mapped\\\",[122]],[[7612,7612],\\\"mapped\\\",[656]],[[7613,7613],\\\"mapped\\\",[657]],[[7614,7614],\\\"mapped\\\",[658]],[[7615,7615],\\\"mapped\\\",[952]],[[7616,7619],\\\"valid\\\"],[[7620,7626],\\\"valid\\\"],[[7627,7654],\\\"valid\\\"],[[7655,7669],\\\"valid\\\"],[[7670,7675],\\\"disallowed\\\"],[[7676,7676],\\\"valid\\\"],[[7677,7677],\\\"valid\\\"],[[7678,7679],\\\"valid\\\"],[[7680,7680],\\\"mapped\\\",[7681]],[[7681,7681],\\\"valid\\\"],[[7682,7682],\\\"mapped\\\",[7683]],[[7683,7683],\\\"valid\\\"],[[7684,7684],\\\"mapped\\\",[7685]],[[7685,7685],\\\"valid\\\"],[[7686,7686],\\\"mapped\\\",[7687]],[[7687,7687],\\\"valid\\\"],[[7688,7688],\\\"mapped\\\",[7689]],[[7689,7689],\\\"valid\\\
"],[[7690,7690],\\\"mapped\\\",[7691]],[[7691,7691],\\\"valid\\\"],[[7692,7692],\\\"mapped\\\",[7693]],[[7693,7693],\\\"valid\\\"],[[7694,7694],\\\"mapped\\\",[7695]],[[7695,7695],\\\"valid\\\"],[[7696,7696],\\\"mapped\\\",[7697]],[[7697,7697],\\\"valid\\\"],[[7698,7698],\\\"mapped\\\",[7699]],[[7699,7699],\\\"valid\\\"],[[7700,7700],\\\"mapped\\\",[7701]],[[7701,7701],\\\"valid\\\"],[[7702,7702],\\\"mapped\\\",[7703]],[[7703,7703],\\\"valid\\\"],[[7704,7704],\\\"mapped\\\",[7705]],[[7705,7705],\\\"valid\\\"],[[7706,7706],\\\"mapped\\\",[7707]],[[7707,7707],\\\"valid\\\"],[[7708,7708],\\\"mapped\\\",[7709]],[[7709,7709],\\\"valid\\\"],[[7710,7710],\\\"mapped\\\",[7711]],[[7711,7711],\\\"valid\\\"],[[7712,7712],\\\"mapped\\\",[7713]],[[7713,7713],\\\"valid\\\"],[[7714,7714],\\\"mapped\\\",[7715]],[[7715,7715],\\\"valid\\\"],[[7716,7716],\\\"mapped\\\",[7717]],[[7717,7717],\\\"valid\\\"],[[7718,7718],\\\"mapped\\\",[7719]],[[7719,7719],\\\"valid\\\"],[[7720,7720],\\\"mapped\\\",[7721]],[[7721,7721],\\\"valid\\\"],[[7722,7722],\\\"mapped\\\",[7723]],[[7723,7723],\\\"valid\\\"],[[7724,7724],\\\"mapped\\\",[7725]],[[7725,7725],\\\"valid\\\"],[[7726,7726],\\\"mapped\\\",[7727]],[[7727,7727],\\\"valid\\\"],[[7728,7728],\\\"mapped\\\",[7729]],[[7729,7729],\\\"valid\\\"],[[7730,7730],\\\"mapped\\\",[7731]],[[7731,7731],\\\"valid\\\"],[[7732,7732],\\\"mapped\\\",[7733]],[[7733,7733],\\\"valid\\\"],[[7734,7734],\\\"mapped\\\",[7735]],[[7735,7735],\\\"valid\\\"],[[7736,7736],\\\"mapped\\\",[7737]],[[7737,7737],\\\"valid\\\"],[[7738,7738],\\\"mapped\\\",[7739]],[[7739,7739],\\\"valid\\\"],[[7740,7740],\\\"mapped\\\",[7741]],[[7741,7741],\\\"valid\\\"],[[7742,7742],\\\"mapped\\\",[7743]],[[7743,7743],\\\"valid\\\"],[[7744,7744],\\\"mapped\\\",[7745]],[[7745,7745],\\\"valid\\\"],[[7746,7746],\\\"mapped\\\",[7747]],[[7747,7747],\\\"valid\\\"],[[7748,7748],\\\"mapped\\\",[7749]],[[7749,7749],\\\"valid\\\"],[[7750,7750],\\\"mapped\\\",[7751]],[[7751,7751],\\\"valid\\\"],[[7752,7752],
\\\"mapped\\\",[7753]],[[7753,7753],\\\"valid\\\"],[[7754,7754],\\\"mapped\\\",[7755]],[[7755,7755],\\\"valid\\\"],[[7756,7756],\\\"mapped\\\",[7757]],[[7757,7757],\\\"valid\\\"],[[7758,7758],\\\"mapped\\\",[7759]],[[7759,7759],\\\"valid\\\"],[[7760,7760],\\\"mapped\\\",[7761]],[[7761,7761],\\\"valid\\\"],[[7762,7762],\\\"mapped\\\",[7763]],[[7763,7763],\\\"valid\\\"],[[7764,7764],\\\"mapped\\\",[7765]],[[7765,7765],\\\"valid\\\"],[[7766,7766],\\\"mapped\\\",[7767]],[[7767,7767],\\\"valid\\\"],[[7768,7768],\\\"mapped\\\",[7769]],[[7769,7769],\\\"valid\\\"],[[7770,7770],\\\"mapped\\\",[7771]],[[7771,7771],\\\"valid\\\"],[[7772,7772],\\\"mapped\\\",[7773]],[[7773,7773],\\\"valid\\\"],[[7774,7774],\\\"mapped\\\",[7775]],[[7775,7775],\\\"valid\\\"],[[7776,7776],\\\"mapped\\\",[7777]],[[7777,7777],\\\"valid\\\"],[[7778,7778],\\\"mapped\\\",[7779]],[[7779,7779],\\\"valid\\\"],[[7780,7780],\\\"mapped\\\",[7781]],[[7781,7781],\\\"valid\\\"],[[7782,7782],\\\"mapped\\\",[7783]],[[7783,7783],\\\"valid\\\"],[[7784,7784],\\\"mapped\\\",[7785]],[[7785,7785],\\\"valid\\\"],[[7786,7786],\\\"mapped\\\",[7787]],[[7787,7787],\\\"valid\\\"],[[7788,7788],\\\"mapped\\\",[7789]],[[7789,7789],\\\"valid\\\"],[[7790,7790],\\\"mapped\\\",[7791]],[[7791,7791],\\\"valid\\\"],[[7792,7792],\\\"mapped\\\",[7793]],[[7793,7793],\\\"valid\\\"],[[7794,7794],\\\"mapped\\\",[7795]],[[7795,7795],\\\"valid\\\"],[[7796,7796],\\\"mapped\\\",[7797]],[[7797,7797],\\\"valid\\\"],[[7798,7798],\\\"mapped\\\",[7799]],[[7799,7799],\\\"valid\\\"],[[7800,7800],\\\"mapped\\\",[7801]],[[7801,7801],\\\"valid\\\"],[[7802,7802],\\\"mapped\\\",[7803]],[[7803,7803],\\\"valid\\\"],[[7804,7804],\\\"mapped\\\",[7805]],[[7805,7805],\\\"valid\\\"],[[7806,7806],\\\"mapped\\\",[7807]],[[7807,7807],\\\"valid\\\"],[[7808,7808],\\\"mapped\\\",[7809]],[[7809,7809],\\\"valid\\\"],[[7810,7810],\\\"mapped\\\",[7811]],[[7811,7811],\\\"valid\\\"],[[7812,7812],\\\"mapped\\\",[7813]],[[7813,7813],\\\"valid\\\"],[[7814,7814],\\\"mapped\\\",[
7815]],[[7815,7815],\\\"valid\\\"],[[7816,7816],\\\"mapped\\\",[7817]],[[7817,7817],\\\"valid\\\"],[[7818,7818],\\\"mapped\\\",[7819]],[[7819,7819],\\\"valid\\\"],[[7820,7820],\\\"mapped\\\",[7821]],[[7821,7821],\\\"valid\\\"],[[7822,7822],\\\"mapped\\\",[7823]],[[7823,7823],\\\"valid\\\"],[[7824,7824],\\\"mapped\\\",[7825]],[[7825,7825],\\\"valid\\\"],[[7826,7826],\\\"mapped\\\",[7827]],[[7827,7827],\\\"valid\\\"],[[7828,7828],\\\"mapped\\\",[7829]],[[7829,7833],\\\"valid\\\"],[[7834,7834],\\\"mapped\\\",[97,702]],[[7835,7835],\\\"mapped\\\",[7777]],[[7836,7837],\\\"valid\\\"],[[7838,7838],\\\"mapped\\\",[115,115]],[[7839,7839],\\\"valid\\\"],[[7840,7840],\\\"mapped\\\",[7841]],[[7841,7841],\\\"valid\\\"],[[7842,7842],\\\"mapped\\\",[7843]],[[7843,7843],\\\"valid\\\"],[[7844,7844],\\\"mapped\\\",[7845]],[[7845,7845],\\\"valid\\\"],[[7846,7846],\\\"mapped\\\",[7847]],[[7847,7847],\\\"valid\\\"],[[7848,7848],\\\"mapped\\\",[7849]],[[7849,7849],\\\"valid\\\"],[[7850,7850],\\\"mapped\\\",[7851]],[[7851,7851],\\\"valid\\\"],[[7852,7852],\\\"mapped\\\",[7853]],[[7853,7853],\\\"valid\\\"],[[7854,7854],\\\"mapped\\\",[7855]],[[7855,7855],\\\"valid\\\"],[[7856,7856],\\\"mapped\\\",[7857]],[[7857,7857],\\\"valid\\\"],[[7858,7858],\\\"mapped\\\",[7859]],[[7859,7859],\\\"valid\\\"],[[7860,7860],\\\"mapped\\\",[7861]],[[7861,7861],\\\"valid\\\"],[[7862,7862],\\\"mapped\\\",[7863]],[[7863,7863],\\\"valid\\\"],[[7864,7864],\\\"mapped\\\",[7865]],[[7865,7865],\\\"valid\\\"],[[7866,7866],\\\"mapped\\\",[7867]],[[7867,7867],\\\"valid\\\"],[[7868,7868],\\\"mapped\\\",[7869]],[[7869,7869],\\\"valid\\\"],[[7870,7870],\\\"mapped\\\",[7871]],[[7871,7871],\\\"valid\\\"],[[7872,7872],\\\"mapped\\\",[7873]],[[7873,7873],\\\"valid\\\"],[[7874,7874],\\\"mapped\\\",[7875]],[[7875,7875],\\\"valid\\\"],[[7876,7876],\\\"mapped\\\",[7877]],[[7877,7877],\\\"valid\\\"],[[7878,7878],\\\"mapped\\\",[7879]],[[7879,7879],\\\"valid\\\"],[[7880,7880],\\\"mapped\\\",[7881]],[[7881,7881],\\\"valid\\\"],[[78
82,7882],\\\"mapped\\\",[7883]],[[7883,7883],\\\"valid\\\"],[[7884,7884],\\\"mapped\\\",[7885]],[[7885,7885],\\\"valid\\\"],[[7886,7886],\\\"mapped\\\",[7887]],[[7887,7887],\\\"valid\\\"],[[7888,7888],\\\"mapped\\\",[7889]],[[7889,7889],\\\"valid\\\"],[[7890,7890],\\\"mapped\\\",[7891]],[[7891,7891],\\\"valid\\\"],[[7892,7892],\\\"mapped\\\",[7893]],[[7893,7893],\\\"valid\\\"],[[7894,7894],\\\"mapped\\\",[7895]],[[7895,7895],\\\"valid\\\"],[[7896,7896],\\\"mapped\\\",[7897]],[[7897,7897],\\\"valid\\\"],[[7898,7898],\\\"mapped\\\",[7899]],[[7899,7899],\\\"valid\\\"],[[7900,7900],\\\"mapped\\\",[7901]],[[7901,7901],\\\"valid\\\"],[[7902,7902],\\\"mapped\\\",[7903]],[[7903,7903],\\\"valid\\\"],[[7904,7904],\\\"mapped\\\",[7905]],[[7905,7905],\\\"valid\\\"],[[7906,7906],\\\"mapped\\\",[7907]],[[7907,7907],\\\"valid\\\"],[[7908,7908],\\\"mapped\\\",[7909]],[[7909,7909],\\\"valid\\\"],[[7910,7910],\\\"mapped\\\",[7911]],[[7911,7911],\\\"valid\\\"],[[7912,7912],\\\"mapped\\\",[7913]],[[7913,7913],\\\"valid\\\"],[[7914,7914],\\\"mapped\\\",[7915]],[[7915,7915],\\\"valid\\\"],[[7916,7916],\\\"mapped\\\",[7917]],[[7917,7917],\\\"valid\\\"],[[7918,7918],\\\"mapped\\\",[7919]],[[7919,7919],\\\"valid\\\"],[[7920,7920],\\\"mapped\\\",[7921]],[[7921,7921],\\\"valid\\\"],[[7922,7922],\\\"mapped\\\",[7923]],[[7923,7923],\\\"valid\\\"],[[7924,7924],\\\"mapped\\\",[7925]],[[7925,7925],\\\"valid\\\"],[[7926,7926],\\\"mapped\\\",[7927]],[[7927,7927],\\\"valid\\\"],[[7928,7928],\\\"mapped\\\",[7929]],[[7929,7929],\\\"valid\\\"],[[7930,7930],\\\"mapped\\\",[7931]],[[7931,7931],\\\"valid\\\"],[[7932,7932],\\\"mapped\\\",[7933]],[[7933,7933],\\\"valid\\\"],[[7934,7934],\\\"mapped\\\",[7935]],[[7935,7935],\\\"valid\\\"],[[7936,7943],\\\"valid\\\"],[[7944,7944],\\\"mapped\\\",[7936]],[[7945,7945],\\\"mapped\\\",[7937]],[[7946,7946],\\\"mapped\\\",[7938]],[[7947,7947],\\\"mapped\\\",[7939]],[[7948,7948],\\\"mapped\\\",[7940]],[[7949,7949],\\\"mapped\\\",[7941]],[[7950,7950],\\\"mapped\\\",[794
2]],[[7951,7951],\\\"mapped\\\",[7943]],[[7952,7957],\\\"valid\\\"],[[7958,7959],\\\"disallowed\\\"],[[7960,7960],\\\"mapped\\\",[7952]],[[7961,7961],\\\"mapped\\\",[7953]],[[7962,7962],\\\"mapped\\\",[7954]],[[7963,7963],\\\"mapped\\\",[7955]],[[7964,7964],\\\"mapped\\\",[7956]],[[7965,7965],\\\"mapped\\\",[7957]],[[7966,7967],\\\"disallowed\\\"],[[7968,7975],\\\"valid\\\"],[[7976,7976],\\\"mapped\\\",[7968]],[[7977,7977],\\\"mapped\\\",[7969]],[[7978,7978],\\\"mapped\\\",[7970]],[[7979,7979],\\\"mapped\\\",[7971]],[[7980,7980],\\\"mapped\\\",[7972]],[[7981,7981],\\\"mapped\\\",[7973]],[[7982,7982],\\\"mapped\\\",[7974]],[[7983,7983],\\\"mapped\\\",[7975]],[[7984,7991],\\\"valid\\\"],[[7992,7992],\\\"mapped\\\",[7984]],[[7993,7993],\\\"mapped\\\",[7985]],[[7994,7994],\\\"mapped\\\",[7986]],[[7995,7995],\\\"mapped\\\",[7987]],[[7996,7996],\\\"mapped\\\",[7988]],[[7997,7997],\\\"mapped\\\",[7989]],[[7998,7998],\\\"mapped\\\",[7990]],[[7999,7999],\\\"mapped\\\",[7991]],[[8000,8005],\\\"valid\\\"],[[8006,8007],\\\"disallowed\\\"],[[8008,8008],\\\"mapped\\\",[8000]],[[8009,8009],\\\"mapped\\\",[8001]],[[8010,8010],\\\"mapped\\\",[8002]],[[8011,8011],\\\"mapped\\\",[8003]],[[8012,8012],\\\"mapped\\\",[8004]],[[8013,8013],\\\"mapped\\\",[8005]],[[8014,8015],\\\"disallowed\\\"],[[8016,8023],\\\"valid\\\"],[[8024,8024],\\\"disallowed\\\"],[[8025,8025],\\\"mapped\\\",[8017]],[[8026,8026],\\\"disallowed\\\"],[[8027,8027],\\\"mapped\\\",[8019]],[[8028,8028],\\\"disallowed\\\"],[[8029,8029],\\\"mapped\\\",[8021]],[[8030,8030],\\\"disallowed\\\"],[[8031,8031],\\\"mapped\\\",[8023]],[[8032,8039],\\\"valid\\\"],[[8040,8040],\\\"mapped\\\",[8032]],[[8041,8041],\\\"mapped\\\",[8033]],[[8042,8042],\\\"mapped\\\",[8034]],[[8043,8043],\\\"mapped\\\",[8035]],[[8044,8044],\\\"mapped\\\",[8036]],[[8045,8045],\\\"mapped\\\",[8037]],[[8046,8046],\\\"mapped\\\",[8038]],[[8047,8047],\\\"mapped\\\",[8039]],[[8048,8048],\\\"valid\\\"],[[8049,8049],\\\"mapped\\\",[940]],[[8050,8050],\\\"valid\\\
"],[[8051,8051],\\\"mapped\\\",[941]],[[8052,8052],\\\"valid\\\"],[[8053,8053],\\\"mapped\\\",[942]],[[8054,8054],\\\"valid\\\"],[[8055,8055],\\\"mapped\\\",[943]],[[8056,8056],\\\"valid\\\"],[[8057,8057],\\\"mapped\\\",[972]],[[8058,8058],\\\"valid\\\"],[[8059,8059],\\\"mapped\\\",[973]],[[8060,8060],\\\"valid\\\"],[[8061,8061],\\\"mapped\\\",[974]],[[8062,8063],\\\"disallowed\\\"],[[8064,8064],\\\"mapped\\\",[7936,953]],[[8065,8065],\\\"mapped\\\",[7937,953]],[[8066,8066],\\\"mapped\\\",[7938,953]],[[8067,8067],\\\"mapped\\\",[7939,953]],[[8068,8068],\\\"mapped\\\",[7940,953]],[[8069,8069],\\\"mapped\\\",[7941,953]],[[8070,8070],\\\"mapped\\\",[7942,953]],[[8071,8071],\\\"mapped\\\",[7943,953]],[[8072,8072],\\\"mapped\\\",[7936,953]],[[8073,8073],\\\"mapped\\\",[7937,953]],[[8074,8074],\\\"mapped\\\",[7938,953]],[[8075,8075],\\\"mapped\\\",[7939,953]],[[8076,8076],\\\"mapped\\\",[7940,953]],[[8077,8077],\\\"mapped\\\",[7941,953]],[[8078,8078],\\\"mapped\\\",[7942,953]],[[8079,8079],\\\"mapped\\\",[7943,953]],[[8080,8080],\\\"mapped\\\",[7968,953]],[[8081,8081],\\\"mapped\\\",[7969,953]],[[8082,8082],\\\"mapped\\\",[7970,953]],[[8083,8083],\\\"mapped\\\",[7971,953]],[[8084,8084],\\\"mapped\\\",[7972,953]],[[8085,8085],\\\"mapped\\\",[7973,953]],[[8086,8086],\\\"mapped\\\",[7974,953]],[[8087,8087],\\\"mapped\\\",[7975,953]],[[8088,8088],\\\"mapped\\\",[7968,953]],[[8089,8089],\\\"mapped\\\",[7969,953]],[[8090,8090],\\\"mapped\\\",[7970,953]],[[8091,8091],\\\"mapped\\\",[7971,953]],[[8092,8092],\\\"mapped\\\",[7972,953]],[[8093,8093],\\\"mapped\\\",[7973,953]],[[8094,8094],\\\"mapped\\\",[7974,953]],[[8095,8095],\\\"mapped\\\",[7975,953]],[[8096,8096],\\\"mapped\\\",[8032,953]],[[8097,8097],\\\"mapped\\\",[8033,953]],[[8098,8098],\\\"mapped\\\",[8034,953]],[[8099,8099],\\\"mapped\\\",[8035,953]],[[8100,8100],\\\"mapped\\\",[8036,953]],[[8101,8101],\\\"mapped\\\",[8037,953]],[[8102,8102],\\\"mapped\\\",[8038,953]],[[8103,8103],\\\"mapped\\\",[8039,953]],[[8104,8104],\
\\"mapped\\\",[8032,953]],[[8105,8105],\\\"mapped\\\",[8033,953]],[[8106,8106],\\\"mapped\\\",[8034,953]],[[8107,8107],\\\"mapped\\\",[8035,953]],[[8108,8108],\\\"mapped\\\",[8036,953]],[[8109,8109],\\\"mapped\\\",[8037,953]],[[8110,8110],\\\"mapped\\\",[8038,953]],[[8111,8111],\\\"mapped\\\",[8039,953]],[[8112,8113],\\\"valid\\\"],[[8114,8114],\\\"mapped\\\",[8048,953]],[[8115,8115],\\\"mapped\\\",[945,953]],[[8116,8116],\\\"mapped\\\",[940,953]],[[8117,8117],\\\"disallowed\\\"],[[8118,8118],\\\"valid\\\"],[[8119,8119],\\\"mapped\\\",[8118,953]],[[8120,8120],\\\"mapped\\\",[8112]],[[8121,8121],\\\"mapped\\\",[8113]],[[8122,8122],\\\"mapped\\\",[8048]],[[8123,8123],\\\"mapped\\\",[940]],[[8124,8124],\\\"mapped\\\",[945,953]],[[8125,8125],\\\"disallowed_STD3_mapped\\\",[32,787]],[[8126,8126],\\\"mapped\\\",[953]],[[8127,8127],\\\"disallowed_STD3_mapped\\\",[32,787]],[[8128,8128],\\\"disallowed_STD3_mapped\\\",[32,834]],[[8129,8129],\\\"disallowed_STD3_mapped\\\",[32,776,834]],[[8130,8130],\\\"mapped\\\",[8052,953]],[[8131,8131],\\\"mapped\\\",[951,953]],[[8132,8132],\\\"mapped\\\",[942,953]],[[8133,8133],\\\"disallowed\\\"],[[8134,8134],\\\"valid\\\"],[[8135,8135],\\\"mapped\\\",[8134,953]],[[8136,8136],\\\"mapped\\\",[8050]],[[8137,8137],\\\"mapped\\\",[941]],[[8138,8138],\\\"mapped\\\",[8052]],[[8139,8139],\\\"mapped\\\",[942]],[[8140,8140],\\\"mapped\\\",[951,953]],[[8141,8141],\\\"disallowed_STD3_mapped\\\",[32,787,768]],[[8142,8142],\\\"disallowed_STD3_mapped\\\",[32,787,769]],[[8143,8143],\\\"disallowed_STD3_mapped\\\",[32,787,834]],[[8144,8146],\\\"valid\\\"],[[8147,8147],\\\"mapped\\\",[912]],[[8148,8149],\\\"disallowed\\\"],[[8150,8151],\\\"valid\\\"],[[8152,8152],\\\"mapped\\\",[8144]],[[8153,8153],\\\"mapped\\\",[8145]],[[8154,8154],\\\"mapped\\\",[8054]],[[8155,8155],\\\"mapped\\\",[943]],[[8156,8156],\\\"disallowed\\\"],[[8157,8157],\\\"disallowed_STD3_mapped\\\",[32,788,768]],[[8158,8158],\\\"disallowed_STD3_mapped\\\",[32,788,769]],[[8159,8159],\\\"dis
allowed_STD3_mapped\\\",[32,788,834]],[[8160,8162],\\\"valid\\\"],[[8163,8163],\\\"mapped\\\",[944]],[[8164,8167],\\\"valid\\\"],[[8168,8168],\\\"mapped\\\",[8160]],[[8169,8169],\\\"mapped\\\",[8161]],[[8170,8170],\\\"mapped\\\",[8058]],[[8171,8171],\\\"mapped\\\",[973]],[[8172,8172],\\\"mapped\\\",[8165]],[[8173,8173],\\\"disallowed_STD3_mapped\\\",[32,776,768]],[[8174,8174],\\\"disallowed_STD3_mapped\\\",[32,776,769]],[[8175,8175],\\\"disallowed_STD3_mapped\\\",[96]],[[8176,8177],\\\"disallowed\\\"],[[8178,8178],\\\"mapped\\\",[8060,953]],[[8179,8179],\\\"mapped\\\",[969,953]],[[8180,8180],\\\"mapped\\\",[974,953]],[[8181,8181],\\\"disallowed\\\"],[[8182,8182],\\\"valid\\\"],[[8183,8183],\\\"mapped\\\",[8182,953]],[[8184,8184],\\\"mapped\\\",[8056]],[[8185,8185],\\\"mapped\\\",[972]],[[8186,8186],\\\"mapped\\\",[8060]],[[8187,8187],\\\"mapped\\\",[974]],[[8188,8188],\\\"mapped\\\",[969,953]],[[8189,8189],\\\"disallowed_STD3_mapped\\\",[32,769]],[[8190,8190],\\\"disallowed_STD3_mapped\\\",[32,788]],[[8191,8191],\\\"disallowed\\\"],[[8192,8202],\\\"disallowed_STD3_mapped\\\",[32]],[[8203,8203],\\\"ignored\\\"],[[8204,8205],\\\"deviation\\\",[]],[[8206,8207],\\\"disallowed\\\"],[[8208,8208],\\\"valid\\\",[],\\\"NV8\\\"],[[8209,8209],\\\"mapped\\\",[8208]],[[8210,8214],\\\"valid\\\",[],\\\"NV8\\\"],[[8215,8215],\\\"disallowed_STD3_mapped\\\",[32,819]],[[8216,8227],\\\"valid\\\",[],\\\"NV8\\\"],[[8228,8230],\\\"disallowed\\\"],[[8231,8231],\\\"valid\\\",[],\\\"NV8\\\"],[[8232,8238],\\\"disallowed\\\"],[[8239,8239],\\\"disallowed_STD3_mapped\\\",[32]],[[8240,8242],\\\"valid\\\",[],\\\"NV8\\\"],[[8243,8243],\\\"mapped\\\",[8242,8242]],[[8244,8244],\\\"mapped\\\",[8242,8242,8242]],[[8245,8245],\\\"valid\\\",[],\\\"NV8\\\"],[[8246,8246],\\\"mapped\\\",[8245,8245]],[[8247,8247],\\\"mapped\\\",[8245,8245,8245]],[[8248,8251],\\\"valid\\\",[],\\\"NV8\\\"],[[8252,8252],\\\"disallowed_STD3_mapped\\\",[33,33]],[[8253,8253],\\\"valid\\\",[],\\\"NV8\\\"],[[8254,8254],\\\"disallowed
_STD3_mapped\\\",[32,773]],[[8255,8262],\\\"valid\\\",[],\\\"NV8\\\"],[[8263,8263],\\\"disallowed_STD3_mapped\\\",[63,63]],[[8264,8264],\\\"disallowed_STD3_mapped\\\",[63,33]],[[8265,8265],\\\"disallowed_STD3_mapped\\\",[33,63]],[[8266,8269],\\\"valid\\\",[],\\\"NV8\\\"],[[8270,8274],\\\"valid\\\",[],\\\"NV8\\\"],[[8275,8276],\\\"valid\\\",[],\\\"NV8\\\"],[[8277,8278],\\\"valid\\\",[],\\\"NV8\\\"],[[8279,8279],\\\"mapped\\\",[8242,8242,8242,8242]],[[8280,8286],\\\"valid\\\",[],\\\"NV8\\\"],[[8287,8287],\\\"disallowed_STD3_mapped\\\",[32]],[[8288,8288],\\\"ignored\\\"],[[8289,8291],\\\"disallowed\\\"],[[8292,8292],\\\"ignored\\\"],[[8293,8293],\\\"disallowed\\\"],[[8294,8297],\\\"disallowed\\\"],[[8298,8303],\\\"disallowed\\\"],[[8304,8304],\\\"mapped\\\",[48]],[[8305,8305],\\\"mapped\\\",[105]],[[8306,8307],\\\"disallowed\\\"],[[8308,8308],\\\"mapped\\\",[52]],[[8309,8309],\\\"mapped\\\",[53]],[[8310,8310],\\\"mapped\\\",[54]],[[8311,8311],\\\"mapped\\\",[55]],[[8312,8312],\\\"mapped\\\",[56]],[[8313,8313],\\\"mapped\\\",[57]],[[8314,8314],\\\"disallowed_STD3_mapped\\\",[43]],[[8315,8315],\\\"mapped\\\",[8722]],[[8316,8316],\\\"disallowed_STD3_mapped\\\",[61]],[[8317,8317],\\\"disallowed_STD3_mapped\\\",[40]],[[8318,8318],\\\"disallowed_STD3_mapped\\\",[41]],[[8319,8319],\\\"mapped\\\",[110]],[[8320,8320],\\\"mapped\\\",[48]],[[8321,8321],\\\"mapped\\\",[49]],[[8322,8322],\\\"mapped\\\",[50]],[[8323,8323],\\\"mapped\\\",[51]],[[8324,8324],\\\"mapped\\\",[52]],[[8325,8325],\\\"mapped\\\",[53]],[[8326,8326],\\\"mapped\\\",[54]],[[8327,8327],\\\"mapped\\\",[55]],[[8328,8328],\\\"mapped\\\",[56]],[[8329,8329],\\\"mapped\\\",[57]],[[8330,8330],\\\"disallowed_STD3_mapped\\\",[43]],[[8331,8331],\\\"mapped\\\",[8722]],[[8332,8332],\\\"disallowed_STD3_mapped\\\",[61]],[[8333,8333],\\\"disallowed_STD3_mapped\\\",[40]],[[8334,8334],\\\"disallowed_STD3_mapped\\\",[41]],[[8335,8335],\\\"disallowed\\\"],[[8336,8336],\\\"mapped\\\",[97]],[[8337,8337],\\\"mapped\\\",[101]],[[8338,8
338],\\\"mapped\\\",[111]],[[8339,8339],\\\"mapped\\\",[120]],[[8340,8340],\\\"mapped\\\",[601]],[[8341,8341],\\\"mapped\\\",[104]],[[8342,8342],\\\"mapped\\\",[107]],[[8343,8343],\\\"mapped\\\",[108]],[[8344,8344],\\\"mapped\\\",[109]],[[8345,8345],\\\"mapped\\\",[110]],[[8346,8346],\\\"mapped\\\",[112]],[[8347,8347],\\\"mapped\\\",[115]],[[8348,8348],\\\"mapped\\\",[116]],[[8349,8351],\\\"disallowed\\\"],[[8352,8359],\\\"valid\\\",[],\\\"NV8\\\"],[[8360,8360],\\\"mapped\\\",[114,115]],[[8361,8362],\\\"valid\\\",[],\\\"NV8\\\"],[[8363,8363],\\\"valid\\\",[],\\\"NV8\\\"],[[8364,8364],\\\"valid\\\",[],\\\"NV8\\\"],[[8365,8367],\\\"valid\\\",[],\\\"NV8\\\"],[[8368,8369],\\\"valid\\\",[],\\\"NV8\\\"],[[8370,8373],\\\"valid\\\",[],\\\"NV8\\\"],[[8374,8376],\\\"valid\\\",[],\\\"NV8\\\"],[[8377,8377],\\\"valid\\\",[],\\\"NV8\\\"],[[8378,8378],\\\"valid\\\",[],\\\"NV8\\\"],[[8379,8381],\\\"valid\\\",[],\\\"NV8\\\"],[[8382,8382],\\\"valid\\\",[],\\\"NV8\\\"],[[8383,8399],\\\"disallowed\\\"],[[8400,8417],\\\"valid\\\",[],\\\"NV8\\\"],[[8418,8419],\\\"valid\\\",[],\\\"NV8\\\"],[[8420,8426],\\\"valid\\\",[],\\\"NV8\\\"],[[8427,8427],\\\"valid\\\",[],\\\"NV8\\\"],[[8428,8431],\\\"valid\\\",[],\\\"NV8\\\"],[[8432,8432],\\\"valid\\\",[],\\\"NV8\\\"],[[8433,8447],\\\"disallowed\\\"],[[8448,8448],\\\"disallowed_STD3_mapped\\\",[97,47,99]],[[8449,8449],\\\"disallowed_STD3_mapped\\\",[97,47,115]],[[8450,8450],\\\"mapped\\\",[99]],[[8451,8451],\\\"mapped\\\",[176,99]],[[8452,8452],\\\"valid\\\",[],\\\"NV8\\\"],[[8453,8453],\\\"disallowed_STD3_mapped\\\",[99,47,111]],[[8454,8454],\\\"disallowed_STD3_mapped\\\",[99,47,117]],[[8455,8455],\\\"mapped\\\",[603]],[[8456,8456],\\\"valid\\\",[],\\\"NV8\\\"],[[8457,8457],\\\"mapped\\\",[176,102]],[[8458,8458],\\\"mapped\\\",[103]],[[8459,8462],\\\"mapped\\\",[104]],[[8463,8463],\\\"mapped\\\",[295]],[[8464,8465],\\\"mapped\\\",[105]],[[8466,8467],\\\"mapped\\\",[108]],[[8468,8468],\\\"valid\\\",[],\\\"NV8\\\"],[[8469,8469],\\\"mapped\\\",[110]]
,[[8470,8470],\\\"mapped\\\",[110,111]],[[8471,8472],\\\"valid\\\",[],\\\"NV8\\\"],[[8473,8473],\\\"mapped\\\",[112]],[[8474,8474],\\\"mapped\\\",[113]],[[8475,8477],\\\"mapped\\\",[114]],[[8478,8479],\\\"valid\\\",[],\\\"NV8\\\"],[[8480,8480],\\\"mapped\\\",[115,109]],[[8481,8481],\\\"mapped\\\",[116,101,108]],[[8482,8482],\\\"mapped\\\",[116,109]],[[8483,8483],\\\"valid\\\",[],\\\"NV8\\\"],[[8484,8484],\\\"mapped\\\",[122]],[[8485,8485],\\\"valid\\\",[],\\\"NV8\\\"],[[8486,8486],\\\"mapped\\\",[969]],[[8487,8487],\\\"valid\\\",[],\\\"NV8\\\"],[[8488,8488],\\\"mapped\\\",[122]],[[8489,8489],\\\"valid\\\",[],\\\"NV8\\\"],[[8490,8490],\\\"mapped\\\",[107]],[[8491,8491],\\\"mapped\\\",[229]],[[8492,8492],\\\"mapped\\\",[98]],[[8493,8493],\\\"mapped\\\",[99]],[[8494,8494],\\\"valid\\\",[],\\\"NV8\\\"],[[8495,8496],\\\"mapped\\\",[101]],[[8497,8497],\\\"mapped\\\",[102]],[[8498,8498],\\\"disallowed\\\"],[[8499,8499],\\\"mapped\\\",[109]],[[8500,8500],\\\"mapped\\\",[111]],[[8501,8501],\\\"mapped\\\",[1488]],[[8502,8502],\\\"mapped\\\",[1489]],[[8503,8503],\\\"mapped\\\",[1490]],[[8504,8504],\\\"mapped\\\",[1491]],[[8505,8505],\\\"mapped\\\",[105]],[[8506,8506],\\\"valid\\\",[],\\\"NV8\\\"],[[8507,8507],\\\"mapped\\\",[102,97,120]],[[8508,8508],\\\"mapped\\\",[960]],[[8509,8510],\\\"mapped\\\",[947]],[[8511,8511],\\\"mapped\\\",[960]],[[8512,8512],\\\"mapped\\\",[8721]],[[8513,8516],\\\"valid\\\",[],\\\"NV8\\\"],[[8517,8518],\\\"mapped\\\",[100]],[[8519,8519],\\\"mapped\\\",[101]],[[8520,8520],\\\"mapped\\\",[105]],[[8521,8521],\\\"mapped\\\",[106]],[[8522,8523],\\\"valid\\\",[],\\\"NV8\\\"],[[8524,8524],\\\"valid\\\",[],\\\"NV8\\\"],[[8525,8525],\\\"valid\\\",[],\\\"NV8\\\"],[[8526,8526],\\\"valid\\\"],[[8527,8527],\\\"valid\\\",[],\\\"NV8\\\"],[[8528,8528],\\\"mapped\\\",[49,8260,55]],[[8529,8529],\\\"mapped\\\",[49,8260,57]],[[8530,8530],\\\"mapped\\\",[49,8260,49,48]],[[8531,8531],\\\"mapped\\\",[49,8260,51]],[[8532,8532],\\\"mapped\\\",[50,8260,51]],[[8533,8533],\\\
"mapped\\\",[49,8260,53]],[[8534,8534],\\\"mapped\\\",[50,8260,53]],[[8535,8535],\\\"mapped\\\",[51,8260,53]],[[8536,8536],\\\"mapped\\\",[52,8260,53]],[[8537,8537],\\\"mapped\\\",[49,8260,54]],[[8538,8538],\\\"mapped\\\",[53,8260,54]],[[8539,8539],\\\"mapped\\\",[49,8260,56]],[[8540,8540],\\\"mapped\\\",[51,8260,56]],[[8541,8541],\\\"mapped\\\",[53,8260,56]],[[8542,8542],\\\"mapped\\\",[55,8260,56]],[[8543,8543],\\\"mapped\\\",[49,8260]],[[8544,8544],\\\"mapped\\\",[105]],[[8545,8545],\\\"mapped\\\",[105,105]],[[8546,8546],\\\"mapped\\\",[105,105,105]],[[8547,8547],\\\"mapped\\\",[105,118]],[[8548,8548],\\\"mapped\\\",[118]],[[8549,8549],\\\"mapped\\\",[118,105]],[[8550,8550],\\\"mapped\\\",[118,105,105]],[[8551,8551],\\\"mapped\\\",[118,105,105,105]],[[8552,8552],\\\"mapped\\\",[105,120]],[[8553,8553],\\\"mapped\\\",[120]],[[8554,8554],\\\"mapped\\\",[120,105]],[[8555,8555],\\\"mapped\\\",[120,105,105]],[[8556,8556],\\\"mapped\\\",[108]],[[8557,8557],\\\"mapped\\\",[99]],[[8558,8558],\\\"mapped\\\",[100]],[[8559,8559],\\\"mapped\\\",[109]],[[8560,8560],\\\"mapped\\\",[105]],[[8561,8561],\\\"mapped\\\",[105,105]],[[8562,8562],\\\"mapped\\\",[105,105,105]],[[8563,8563],\\\"mapped\\\",[105,118]],[[8564,8564],\\\"mapped\\\",[118]],[[8565,8565],\\\"mapped\\\",[118,105]],[[8566,8566],\\\"mapped\\\",[118,105,105]],[[8567,8567],\\\"mapped\\\",[118,105,105,105]],[[8568,8568],\\\"mapped\\\",[105,120]],[[8569,8569],\\\"mapped\\\",[120]],[[8570,8570],\\\"mapped\\\",[120,105]],[[8571,8571],\\\"mapped\\\",[120,105,105]],[[8572,8572],\\\"mapped\\\",[108]],[[8573,8573],\\\"mapped\\\",[99]],[[8574,8574],\\\"mapped\\\",[100]],[[8575,8575],\\\"mapped\\\",[109]],[[8576,8578],\\\"valid\\\",[],\\\"NV8\\\"],[[8579,8579],\\\"disallowed\\\"],[[8580,8580],\\\"valid\\\"],[[8581,8584],\\\"valid\\\",[],\\\"NV8\\\"],[[8585,8585],\\\"mapped\\\",[48,8260,51]],[[8586,8587],\\\"valid\\\",[],\\\"NV8\\\"],[[8588,8591],\\\"disallowed\\\"],[[8592,8682],\\\"valid\\\",[],\\\"NV8\\\"],[[8683,8691],\\\"va
lid\\\",[],\\\"NV8\\\"],[[8692,8703],\\\"valid\\\",[],\\\"NV8\\\"],[[8704,8747],\\\"valid\\\",[],\\\"NV8\\\"],[[8748,8748],\\\"mapped\\\",[8747,8747]],[[8749,8749],\\\"mapped\\\",[8747,8747,8747]],[[8750,8750],\\\"valid\\\",[],\\\"NV8\\\"],[[8751,8751],\\\"mapped\\\",[8750,8750]],[[8752,8752],\\\"mapped\\\",[8750,8750,8750]],[[8753,8799],\\\"valid\\\",[],\\\"NV8\\\"],[[8800,8800],\\\"disallowed_STD3_valid\\\"],[[8801,8813],\\\"valid\\\",[],\\\"NV8\\\"],[[8814,8815],\\\"disallowed_STD3_valid\\\"],[[8816,8945],\\\"valid\\\",[],\\\"NV8\\\"],[[8946,8959],\\\"valid\\\",[],\\\"NV8\\\"],[[8960,8960],\\\"valid\\\",[],\\\"NV8\\\"],[[8961,8961],\\\"valid\\\",[],\\\"NV8\\\"],[[8962,9000],\\\"valid\\\",[],\\\"NV8\\\"],[[9001,9001],\\\"mapped\\\",[12296]],[[9002,9002],\\\"mapped\\\",[12297]],[[9003,9082],\\\"valid\\\",[],\\\"NV8\\\"],[[9083,9083],\\\"valid\\\",[],\\\"NV8\\\"],[[9084,9084],\\\"valid\\\",[],\\\"NV8\\\"],[[9085,9114],\\\"valid\\\",[],\\\"NV8\\\"],[[9115,9166],\\\"valid\\\",[],\\\"NV8\\\"],[[9167,9168],\\\"valid\\\",[],\\\"NV8\\\"],[[9169,9179],\\\"valid\\\",[],\\\"NV8\\\"],[[9180,9191],\\\"valid\\\",[],\\\"NV8\\\"],[[9192,9192],\\\"valid\\\",[],\\\"NV8\\\"],[[9193,9203],\\\"valid\\\",[],\\\"NV8\\\"],[[9204,9210],\\\"valid\\\",[],\\\"NV8\\\"],[[9211,9215],\\\"disallowed\\\"],[[9216,9252],\\\"valid\\\",[],\\\"NV8\\\"],[[9253,9254],\\\"valid\\\",[],\\\"NV8\\\"],[[9255,9279],\\\"disallowed\\\"],[[9280,9290],\\\"valid\\\",[],\\\"NV8\\\"],[[9291,9311],\\\"disallowed\\\"],[[9312,9312],\\\"mapped\\\",[49]],[[9313,9313],\\\"mapped\\\",[50]],[[9314,9314],\\\"mapped\\\",[51]],[[9315,9315],\\\"mapped\\\",[52]],[[9316,9316],\\\"mapped\\\",[53]],[[9317,9317],\\\"mapped\\\",[54]],[[9318,9318],\\\"mapped\\\",[55]],[[9319,9319],\\\"mapped\\\",[56]],[[9320,9320],\\\"mapped\\\",[57]],[[9321,9321],\\\"mapped\\\",[49,48]],[[9322,9322],\\\"mapped\\\",[49,49]],[[9323,9323],\\\"mapped\\\",[49,50]],[[9324,9324],\\\"mapped\\\",[49,51]],[[9325,9325],\\\"mapped\\\",[49,52]],[[9326,9326],\\\"m
apped\\\",[49,53]],[[9327,9327],\\\"mapped\\\",[49,54]],[[9328,9328],\\\"mapped\\\",[49,55]],[[9329,9329],\\\"mapped\\\",[49,56]],[[9330,9330],\\\"mapped\\\",[49,57]],[[9331,9331],\\\"mapped\\\",[50,48]],[[9332,9332],\\\"disallowed_STD3_mapped\\\",[40,49,41]],[[9333,9333],\\\"disallowed_STD3_mapped\\\",[40,50,41]],[[9334,9334],\\\"disallowed_STD3_mapped\\\",[40,51,41]],[[9335,9335],\\\"disallowed_STD3_mapped\\\",[40,52,41]],[[9336,9336],\\\"disallowed_STD3_mapped\\\",[40,53,41]],[[9337,9337],\\\"disallowed_STD3_mapped\\\",[40,54,41]],[[9338,9338],\\\"disallowed_STD3_mapped\\\",[40,55,41]],[[9339,9339],\\\"disallowed_STD3_mapped\\\",[40,56,41]],[[9340,9340],\\\"disallowed_STD3_mapped\\\",[40,57,41]],[[9341,9341],\\\"disallowed_STD3_mapped\\\",[40,49,48,41]],[[9342,9342],\\\"disallowed_STD3_mapped\\\",[40,49,49,41]],[[9343,9343],\\\"disallowed_STD3_mapped\\\",[40,49,50,41]],[[9344,9344],\\\"disallowed_STD3_mapped\\\",[40,49,51,41]],[[9345,9345],\\\"disallowed_STD3_mapped\\\",[40,49,52,41]],[[9346,9346],\\\"disallowed_STD3_mapped\\\",[40,49,53,41]],[[9347,9347],\\\"disallowed_STD3_mapped\\\",[40,49,54,41]],[[9348,9348],\\\"disallowed_STD3_mapped\\\",[40,49,55,41]],[[9349,9349],\\\"disallowed_STD3_mapped\\\",[40,49,56,41]],[[9350,9350],\\\"disallowed_STD3_mapped\\\",[40,49,57,41]],[[9351,9351],\\\"disallowed_STD3_mapped\\\",[40,50,48,41]],[[9352,9371],\\\"disallowed\\\"],[[9372,9372],\\\"disallowed_STD3_mapped\\\",[40,97,41]],[[9373,9373],\\\"disallowed_STD3_mapped\\\",[40,98,41]],[[9374,9374],\\\"disallowed_STD3_mapped\\\",[40,99,41]],[[9375,9375],\\\"disallowed_STD3_mapped\\\",[40,100,41]],[[9376,9376],\\\"disallowed_STD3_mapped\\\",[40,101,41]],[[9377,9377],\\\"disallowed_STD3_mapped\\\",[40,102,41]],[[9378,9378],\\\"disallowed_STD3_mapped\\\",[40,103,41]],[[9379,9379],\\\"disallowed_STD3_mapped\\\",[40,104,41]],[[9380,9380],\\\"disallowed_STD3_mapped\\\",[40,105,41]],[[9381,9381],\\\"disallowed_STD3_mapped\\\",[40,106,41]],[[9382,9382],\\\"disallowed_STD3_mapped\\\"
,[40,107,41]],[[9383,9383],\\\"disallowed_STD3_mapped\\\",[40,108,41]],[[9384,9384],\\\"disallowed_STD3_mapped\\\",[40,109,41]],[[9385,9385],\\\"disallowed_STD3_mapped\\\",[40,110,41]],[[9386,9386],\\\"disallowed_STD3_mapped\\\",[40,111,41]],[[9387,9387],\\\"disallowed_STD3_mapped\\\",[40,112,41]],[[9388,9388],\\\"disallowed_STD3_mapped\\\",[40,113,41]],[[9389,9389],\\\"disallowed_STD3_mapped\\\",[40,114,41]],[[9390,9390],\\\"disallowed_STD3_mapped\\\",[40,115,41]],[[9391,9391],\\\"disallowed_STD3_mapped\\\",[40,116,41]],[[9392,9392],\\\"disallowed_STD3_mapped\\\",[40,117,41]],[[9393,9393],\\\"disallowed_STD3_mapped\\\",[40,118,41]],[[9394,9394],\\\"disallowed_STD3_mapped\\\",[40,119,41]],[[9395,9395],\\\"disallowed_STD3_mapped\\\",[40,120,41]],[[9396,9396],\\\"disallowed_STD3_mapped\\\",[40,121,41]],[[9397,9397],\\\"disallowed_STD3_mapped\\\",[40,122,41]],[[9398,9398],\\\"mapped\\\",[97]],[[9399,9399],\\\"mapped\\\",[98]],[[9400,9400],\\\"mapped\\\",[99]],[[9401,9401],\\\"mapped\\\",[100]],[[9402,9402],\\\"mapped\\\",[101]],[[9403,9403],\\\"mapped\\\",[102]],[[9404,9404],\\\"mapped\\\",[103]],[[9405,9405],\\\"mapped\\\",[104]],[[9406,9406],\\\"mapped\\\",[105]],[[9407,9407],\\\"mapped\\\",[106]],[[9408,9408],\\\"mapped\\\",[107]],[[9409,9409],\\\"mapped\\\",[108]],[[9410,9410],\\\"mapped\\\",[109]],[[9411,9411],\\\"mapped\\\",[110]],[[9412,9412],\\\"mapped\\\",[111]],[[9413,9413],\\\"mapped\\\",[112]],[[9414,9414],\\\"mapped\\\",[113]],[[9415,9415],\\\"mapped\\\",[114]],[[9416,9416],\\\"mapped\\\",[115]],[[9417,9417],\\\"mapped\\\",[116]],[[9418,9418],\\\"mapped\\\",[117]],[[9419,9419],\\\"mapped\\\",[118]],[[9420,9420],\\\"mapped\\\",[119]],[[9421,9421],\\\"mapped\\\",[120]],[[9422,9422],\\\"mapped\\\",[121]],[[9423,9423],\\\"mapped\\\",[122]],[[9424,9424],\\\"mapped\\\",[97]],[[9425,9425],\\\"mapped\\\",[98]],[[9426,9426],\\\"mapped\\\",[99]],[[9427,9427],\\\"mapped\\\",[100]],[[9428,9428],\\\"mapped\\\",[101]],[[9429,9429],\\\"mapped\\\",[102]],[[9430,9430],\\\"
mapped\\\",[103]],[[9431,9431],\\\"mapped\\\",[104]],[[9432,9432],\\\"mapped\\\",[105]],[[9433,9433],\\\"mapped\\\",[106]],[[9434,9434],\\\"mapped\\\",[107]],[[9435,9435],\\\"mapped\\\",[108]],[[9436,9436],\\\"mapped\\\",[109]],[[9437,9437],\\\"mapped\\\",[110]],[[9438,9438],\\\"mapped\\\",[111]],[[9439,9439],\\\"mapped\\\",[112]],[[9440,9440],\\\"mapped\\\",[113]],[[9441,9441],\\\"mapped\\\",[114]],[[9442,9442],\\\"mapped\\\",[115]],[[9443,9443],\\\"mapped\\\",[116]],[[9444,9444],\\\"mapped\\\",[117]],[[9445,9445],\\\"mapped\\\",[118]],[[9446,9446],\\\"mapped\\\",[119]],[[9447,9447],\\\"mapped\\\",[120]],[[9448,9448],\\\"mapped\\\",[121]],[[9449,9449],\\\"mapped\\\",[122]],[[9450,9450],\\\"mapped\\\",[48]],[[9451,9470],\\\"valid\\\",[],\\\"NV8\\\"],[[9471,9471],\\\"valid\\\",[],\\\"NV8\\\"],[[9472,9621],\\\"valid\\\",[],\\\"NV8\\\"],[[9622,9631],\\\"valid\\\",[],\\\"NV8\\\"],[[9632,9711],\\\"valid\\\",[],\\\"NV8\\\"],[[9712,9719],\\\"valid\\\",[],\\\"NV8\\\"],[[9720,9727],\\\"valid\\\",[],\\\"NV8\\\"],[[9728,9747],\\\"valid\\\",[],\\\"NV8\\\"],[[9748,9749],\\\"valid\\\",[],\\\"NV8\\\"],[[9750,9751],\\\"valid\\\",[],\\\"NV8\\\"],[[9752,9752],\\\"valid\\\",[],\\\"NV8\\\"],[[9753,9753],\\\"valid\\\",[],\\\"NV8\\\"],[[9754,9839],\\\"valid\\\",[],\\\"NV8\\\"],[[9840,9841],\\\"valid\\\",[],\\\"NV8\\\"],[[9842,9853],\\\"valid\\\",[],\\\"NV8\\\"],[[9854,9855],\\\"valid\\\",[],\\\"NV8\\\"],[[9856,9865],\\\"valid\\\",[],\\\"NV8\\\"],[[9866,9873],\\\"valid\\\",[],\\\"NV8\\\"],[[9874,9884],\\\"valid\\\",[],\\\"NV8\\\"],[[9885,9885],\\\"valid\\\",[],\\\"NV8\\\"],[[9886,9887],\\\"valid\\\",[],\\\"NV8\\\"],[[9888,9889],\\\"valid\\\",[],\\\"NV8\\\"],[[9890,9905],\\\"valid\\\",[],\\\"NV8\\\"],[[9906,9906],\\\"valid\\\",[],\\\"NV8\\\"],[[9907,9916],\\\"valid\\\",[],\\\"NV8\\\"],[[9917,9919],\\\"valid\\\",[],\\\"NV8\\\"],[[9920,9923],\\\"valid\\\",[],\\\"NV8\\\"],[[9924,9933],\\\"valid\\\",[],\\\"NV8\\\"],[[9934,9934],\\\"valid\\\",[],\\\"NV8\\\"],[[9935,9953],\\\"valid\\\",[],\\\"NV
8\\\"],[[9954,9954],\\\"valid\\\",[],\\\"NV8\\\"],[[9955,9955],\\\"valid\\\",[],\\\"NV8\\\"],[[9956,9959],\\\"valid\\\",[],\\\"NV8\\\"],[[9960,9983],\\\"valid\\\",[],\\\"NV8\\\"],[[9984,9984],\\\"valid\\\",[],\\\"NV8\\\"],[[9985,9988],\\\"valid\\\",[],\\\"NV8\\\"],[[9989,9989],\\\"valid\\\",[],\\\"NV8\\\"],[[9990,9993],\\\"valid\\\",[],\\\"NV8\\\"],[[9994,9995],\\\"valid\\\",[],\\\"NV8\\\"],[[9996,10023],\\\"valid\\\",[],\\\"NV8\\\"],[[10024,10024],\\\"valid\\\",[],\\\"NV8\\\"],[[10025,10059],\\\"valid\\\",[],\\\"NV8\\\"],[[10060,10060],\\\"valid\\\",[],\\\"NV8\\\"],[[10061,10061],\\\"valid\\\",[],\\\"NV8\\\"],[[10062,10062],\\\"valid\\\",[],\\\"NV8\\\"],[[10063,10066],\\\"valid\\\",[],\\\"NV8\\\"],[[10067,10069],\\\"valid\\\",[],\\\"NV8\\\"],[[10070,10070],\\\"valid\\\",[],\\\"NV8\\\"],[[10071,10071],\\\"valid\\\",[],\\\"NV8\\\"],[[10072,10078],\\\"valid\\\",[],\\\"NV8\\\"],[[10079,10080],\\\"valid\\\",[],\\\"NV8\\\"],[[10081,10087],\\\"valid\\\",[],\\\"NV8\\\"],[[10088,10101],\\\"valid\\\",[],\\\"NV8\\\"],[[10102,10132],\\\"valid\\\",[],\\\"NV8\\\"],[[10133,10135],\\\"valid\\\",[],\\\"NV8\\\"],[[10136,10159],\\\"valid\\\",[],\\\"NV8\\\"],[[10160,10160],\\\"valid\\\",[],\\\"NV8\\\"],[[10161,10174],\\\"valid\\\",[],\\\"NV8\\\"],[[10175,10175],\\\"valid\\\",[],\\\"NV8\\\"],[[10176,10182],\\\"valid\\\",[],\\\"NV8\\\"],[[10183,10186],\\\"valid\\\",[],\\\"NV8\\\"],[[10187,10187],\\\"valid\\\",[],\\\"NV8\\\"],[[10188,10188],\\\"valid\\\",[],\\\"NV8\\\"],[[10189,10189],\\\"valid\\\",[],\\\"NV8\\\"],[[10190,10191],\\\"valid\\\",[],\\\"NV8\\\"],[[10192,10219],\\\"valid\\\",[],\\\"NV8\\\"],[[10220,10223],\\\"valid\\\",[],\\\"NV8\\\"],[[10224,10239],\\\"valid\\\",[],\\\"NV8\\\"],[[10240,10495],\\\"valid\\\",[],\\\"NV8\\\"],[[10496,10763],\\\"valid\\\",[],\\\"NV8\\\"],[[10764,10764],\\\"mapped\\\",[8747,8747,8747,8747]],[[10765,10867],\\\"valid\\\",[],\\\"NV8\\\"],[[10868,10868],\\\"disallowed_STD3_mapped\\\",[58,58,61]],[[10869,10869],\\\"disallowed_STD3_mapped\\\",[61,61]],[
[10870,10870],\\\"disallowed_STD3_mapped\\\",[61,61,61]],[[10871,10971],\\\"valid\\\",[],\\\"NV8\\\"],[[10972,10972],\\\"mapped\\\",[10973,824]],[[10973,11007],\\\"valid\\\",[],\\\"NV8\\\"],[[11008,11021],\\\"valid\\\",[],\\\"NV8\\\"],[[11022,11027],\\\"valid\\\",[],\\\"NV8\\\"],[[11028,11034],\\\"valid\\\",[],\\\"NV8\\\"],[[11035,11039],\\\"valid\\\",[],\\\"NV8\\\"],[[11040,11043],\\\"valid\\\",[],\\\"NV8\\\"],[[11044,11084],\\\"valid\\\",[],\\\"NV8\\\"],[[11085,11087],\\\"valid\\\",[],\\\"NV8\\\"],[[11088,11092],\\\"valid\\\",[],\\\"NV8\\\"],[[11093,11097],\\\"valid\\\",[],\\\"NV8\\\"],[[11098,11123],\\\"valid\\\",[],\\\"NV8\\\"],[[11124,11125],\\\"disallowed\\\"],[[11126,11157],\\\"valid\\\",[],\\\"NV8\\\"],[[11158,11159],\\\"disallowed\\\"],[[11160,11193],\\\"valid\\\",[],\\\"NV8\\\"],[[11194,11196],\\\"disallowed\\\"],[[11197,11208],\\\"valid\\\",[],\\\"NV8\\\"],[[11209,11209],\\\"disallowed\\\"],[[11210,11217],\\\"valid\\\",[],\\\"NV8\\\"],[[11218,11243],\\\"disallowed\\\"],[[11244,11247],\\\"valid\\\",[],\\\"NV8\\\"],[[11248,11263],\\\"disallowed\\\"],[[11264,11264],\\\"mapped\\\",[11312]],[[11265,11265],\\\"mapped\\\",[11313]],[[11266,11266],\\\"mapped\\\",[11314]],[[11267,11267],\\\"mapped\\\",[11315]],[[11268,11268],\\\"mapped\\\",[11316]],[[11269,11269],\\\"mapped\\\",[11317]],[[11270,11270],\\\"mapped\\\",[11318]],[[11271,11271],\\\"mapped\\\",[11319]],[[11272,11272],\\\"mapped\\\",[11320]],[[11273,11273],\\\"mapped\\\",[11321]],[[11274,11274],\\\"mapped\\\",[11322]],[[11275,11275],\\\"mapped\\\",[11323]],[[11276,11276],\\\"mapped\\\",[11324]],[[11277,11277],\\\"mapped\\\",[11325]],[[11278,11278],\\\"mapped\\\",[11326]],[[11279,11279],\\\"mapped\\\",[11327]],[[11280,11280],\\\"mapped\\\",[11328]],[[11281,11281],\\\"mapped\\\",[11329]],[[11282,11282],\\\"mapped\\\",[11330]],[[11283,11283],\\\"mapped\\\",[11331]],[[11284,11284],\\\"mapped\\\",[11332]],[[11285,11285],\\\"mapped\\\",[11333]],[[11286,11286],\\\"mapped\\\",[11334]],[[11287,11287],\\\"mapped\\\
",[11335]],[[11288,11288],\\\"mapped\\\",[11336]],[[11289,11289],\\\"mapped\\\",[11337]],[[11290,11290],\\\"mapped\\\",[11338]],[[11291,11291],\\\"mapped\\\",[11339]],[[11292,11292],\\\"mapped\\\",[11340]],[[11293,11293],\\\"mapped\\\",[11341]],[[11294,11294],\\\"mapped\\\",[11342]],[[11295,11295],\\\"mapped\\\",[11343]],[[11296,11296],\\\"mapped\\\",[11344]],[[11297,11297],\\\"mapped\\\",[11345]],[[11298,11298],\\\"mapped\\\",[11346]],[[11299,11299],\\\"mapped\\\",[11347]],[[11300,11300],\\\"mapped\\\",[11348]],[[11301,11301],\\\"mapped\\\",[11349]],[[11302,11302],\\\"mapped\\\",[11350]],[[11303,11303],\\\"mapped\\\",[11351]],[[11304,11304],\\\"mapped\\\",[11352]],[[11305,11305],\\\"mapped\\\",[11353]],[[11306,11306],\\\"mapped\\\",[11354]],[[11307,11307],\\\"mapped\\\",[11355]],[[11308,11308],\\\"mapped\\\",[11356]],[[11309,11309],\\\"mapped\\\",[11357]],[[11310,11310],\\\"mapped\\\",[11358]],[[11311,11311],\\\"disallowed\\\"],[[11312,11358],\\\"valid\\\"],[[11359,11359],\\\"disallowed\\\"],[[11360,11360],\\\"mapped\\\",[11361]],[[11361,11361],\\\"valid\\\"],[[11362,11362],\\\"mapped\\\",[619]],[[11363,11363],\\\"mapped\\\",[7549]],[[11364,11364],\\\"mapped\\\",[637]],[[11365,11366],\\\"valid\\\"],[[11367,11367],\\\"mapped\\\",[11368]],[[11368,11368],\\\"valid\\\"],[[11369,11369],\\\"mapped\\\",[11370]],[[11370,11370],\\\"valid\\\"],[[11371,11371],\\\"mapped\\\",[11372]],[[11372,11372],\\\"valid\\\"],[[11373,11373],\\\"mapped\\\",[593]],[[11374,11374],\\\"mapped\\\",[625]],[[11375,11375],\\\"mapped\\\",[592]],[[11376,11376],\\\"mapped\\\",[594]],[[11377,11377],\\\"valid\\\"],[[11378,11378],\\\"mapped\\\",[11379]],[[11379,11379],\\\"valid\\\"],[[11380,11380],\\\"valid\\\"],[[11381,11381],\\\"mapped\\\",[11382]],[[11382,11383],\\\"valid\\\"],[[11384,11387],\\\"valid\\\"],[[11388,11388],\\\"mapped\\\",[106]],[[11389,11389],\\\"mapped\\\",[118]],[[11390,11390],\\\"mapped\\\",[575]],[[11391,11391],\\\"mapped\\\",[576]],[[11392,11392],\\\"mapped\\\",[11393]],[[11393,113
93],\\\"valid\\\"],[[11394,11394],\\\"mapped\\\",[11395]],[[11395,11395],\\\"valid\\\"],[[11396,11396],\\\"mapped\\\",[11397]],[[11397,11397],\\\"valid\\\"],[[11398,11398],\\\"mapped\\\",[11399]],[[11399,11399],\\\"valid\\\"],[[11400,11400],\\\"mapped\\\",[11401]],[[11401,11401],\\\"valid\\\"],[[11402,11402],\\\"mapped\\\",[11403]],[[11403,11403],\\\"valid\\\"],[[11404,11404],\\\"mapped\\\",[11405]],[[11405,11405],\\\"valid\\\"],[[11406,11406],\\\"mapped\\\",[11407]],[[11407,11407],\\\"valid\\\"],[[11408,11408],\\\"mapped\\\",[11409]],[[11409,11409],\\\"valid\\\"],[[11410,11410],\\\"mapped\\\",[11411]],[[11411,11411],\\\"valid\\\"],[[11412,11412],\\\"mapped\\\",[11413]],[[11413,11413],\\\"valid\\\"],[[11414,11414],\\\"mapped\\\",[11415]],[[11415,11415],\\\"valid\\\"],[[11416,11416],\\\"mapped\\\",[11417]],[[11417,11417],\\\"valid\\\"],[[11418,11418],\\\"mapped\\\",[11419]],[[11419,11419],\\\"valid\\\"],[[11420,11420],\\\"mapped\\\",[11421]],[[11421,11421],\\\"valid\\\"],[[11422,11422],\\\"mapped\\\",[11423]],[[11423,11423],\\\"valid\\\"],[[11424,11424],\\\"mapped\\\",[11425]],[[11425,11425],\\\"valid\\\"],[[11426,11426],\\\"mapped\\\",[11427]],[[11427,11427],\\\"valid\\\"],[[11428,11428],\\\"mapped\\\",[11429]],[[11429,11429],\\\"valid\\\"],[[11430,11430],\\\"mapped\\\",[11431]],[[11431,11431],\\\"valid\\\"],[[11432,11432],\\\"mapped\\\",[11433]],[[11433,11433],\\\"valid\\\"],[[11434,11434],\\\"mapped\\\",[11435]],[[11435,11435],\\\"valid\\\"],[[11436,11436],\\\"mapped\\\",[11437]],[[11437,11437],\\\"valid\\\"],[[11438,11438],\\\"mapped\\\",[11439]],[[11439,11439],\\\"valid\\\"],[[11440,11440],\\\"mapped\\\",[11441]],[[11441,11441],\\\"valid\\\"],[[11442,11442],\\\"mapped\\\",[11443]],[[11443,11443],\\\"valid\\\"],[[11444,11444],\\\"mapped\\\",[11445]],[[11445,11445],\\\"valid\\\"],[[11446,11446],\\\"mapped\\\",[11447]],[[11447,11447],\\\"valid\\\"],[[11448,11448],\\\"mapped\\\",[11449]],[[11449,11449],\\\"valid\\\"],[[11450,11450],\\\"mapped\\\",[11451]],[[11451,11
451],\\\"valid\\\"],[[11452,11452],\\\"mapped\\\",[11453]],[[11453,11453],\\\"valid\\\"],[[11454,11454],\\\"mapped\\\",[11455]],[[11455,11455],\\\"valid\\\"],[[11456,11456],\\\"mapped\\\",[11457]],[[11457,11457],\\\"valid\\\"],[[11458,11458],\\\"mapped\\\",[11459]],[[11459,11459],\\\"valid\\\"],[[11460,11460],\\\"mapped\\\",[11461]],[[11461,11461],\\\"valid\\\"],[[11462,11462],\\\"mapped\\\",[11463]],[[11463,11463],\\\"valid\\\"],[[11464,11464],\\\"mapped\\\",[11465]],[[11465,11465],\\\"valid\\\"],[[11466,11466],\\\"mapped\\\",[11467]],[[11467,11467],\\\"valid\\\"],[[11468,11468],\\\"mapped\\\",[11469]],[[11469,11469],\\\"valid\\\"],[[11470,11470],\\\"mapped\\\",[11471]],[[11471,11471],\\\"valid\\\"],[[11472,11472],\\\"mapped\\\",[11473]],[[11473,11473],\\\"valid\\\"],[[11474,11474],\\\"mapped\\\",[11475]],[[11475,11475],\\\"valid\\\"],[[11476,11476],\\\"mapped\\\",[11477]],[[11477,11477],\\\"valid\\\"],[[11478,11478],\\\"mapped\\\",[11479]],[[11479,11479],\\\"valid\\\"],[[11480,11480],\\\"mapped\\\",[11481]],[[11481,11481],\\\"valid\\\"],[[11482,11482],\\\"mapped\\\",[11483]],[[11483,11483],\\\"valid\\\"],[[11484,11484],\\\"mapped\\\",[11485]],[[11485,11485],\\\"valid\\\"],[[11486,11486],\\\"mapped\\\",[11487]],[[11487,11487],\\\"valid\\\"],[[11488,11488],\\\"mapped\\\",[11489]],[[11489,11489],\\\"valid\\\"],[[11490,11490],\\\"mapped\\\",[11491]],[[11491,11492],\\\"valid\\\"],[[11493,11498],\\\"valid\\\",[],\\\"NV8\\\"],[[11499,11499],\\\"mapped\\\",[11500]],[[11500,11500],\\\"valid\\\"],[[11501,11501],\\\"mapped\\\",[11502]],[[11502,11505],\\\"valid\\\"],[[11506,11506],\\\"mapped\\\",[11507]],[[11507,11507],\\\"valid\\\"],[[11508,11512],\\\"disallowed\\\"],[[11513,11519],\\\"valid\\\",[],\\\"NV8\\\"],[[11520,11557],\\\"valid\\\"],[[11558,11558],\\\"disallowed\\\"],[[11559,11559],\\\"valid\\\"],[[11560,11564],\\\"disallowed\\\"],[[11565,11565],\\\"valid\\\"],[[11566,11567],\\\"disallowed\\\"],[[11568,11621],\\\"valid\\\"],[[11622,11623],\\\"valid\\\"],[[11624,11630
],\\\"disallowed\\\"],[[11631,11631],\\\"mapped\\\",[11617]],[[11632,11632],\\\"valid\\\",[],\\\"NV8\\\"],[[11633,11646],\\\"disallowed\\\"],[[11647,11647],\\\"valid\\\"],[[11648,11670],\\\"valid\\\"],[[11671,11679],\\\"disallowed\\\"],[[11680,11686],\\\"valid\\\"],[[11687,11687],\\\"disallowed\\\"],[[11688,11694],\\\"valid\\\"],[[11695,11695],\\\"disallowed\\\"],[[11696,11702],\\\"valid\\\"],[[11703,11703],\\\"disallowed\\\"],[[11704,11710],\\\"valid\\\"],[[11711,11711],\\\"disallowed\\\"],[[11712,11718],\\\"valid\\\"],[[11719,11719],\\\"disallowed\\\"],[[11720,11726],\\\"valid\\\"],[[11727,11727],\\\"disallowed\\\"],[[11728,11734],\\\"valid\\\"],[[11735,11735],\\\"disallowed\\\"],[[11736,11742],\\\"valid\\\"],[[11743,11743],\\\"disallowed\\\"],[[11744,11775],\\\"valid\\\"],[[11776,11799],\\\"valid\\\",[],\\\"NV8\\\"],[[11800,11803],\\\"valid\\\",[],\\\"NV8\\\"],[[11804,11805],\\\"valid\\\",[],\\\"NV8\\\"],[[11806,11822],\\\"valid\\\",[],\\\"NV8\\\"],[[11823,11823],\\\"valid\\\"],[[11824,11824],\\\"valid\\\",[],\\\"NV8\\\"],[[11825,11825],\\\"valid\\\",[],\\\"NV8\\\"],[[11826,11835],\\\"valid\\\",[],\\\"NV8\\\"],[[11836,11842],\\\"valid\\\",[],\\\"NV8\\\"],[[11843,11903],\\\"disallowed\\\"],[[11904,11929],\\\"valid\\\",[],\\\"NV8\\\"],[[11930,11930],\\\"disallowed\\\"],[[11931,11934],\\\"valid\\\",[],\\\"NV8\\\"],[[11935,11935],\\\"mapped\\\",[27597]],[[11936,12018],\\\"valid\\\",[],\\\"NV8\\\"],[[12019,12019],\\\"mapped\\\",[40863]],[[12020,12031],\\\"disallowed\\\"],[[12032,12032],\\\"mapped\\\",[19968]],[[12033,12033],\\\"mapped\\\",[20008]],[[12034,12034],\\\"mapped\\\",[20022]],[[12035,12035],\\\"mapped\\\",[20031]],[[12036,12036],\\\"mapped\\\",[20057]],[[12037,12037],\\\"mapped\\\",[20101]],[[12038,12038],\\\"mapped\\\",[20108]],[[12039,12039],\\\"mapped\\\",[20128]],[[12040,12040],\\\"mapped\\\",[20154]],[[12041,12041],\\\"mapped\\\",[20799]],[[12042,12042],\\\"mapped\\\",[20837]],[[12043,12043],\\\"mapped\\\",[20843]],[[12044,12044],\\\"mapped\\\",[20866]]
,[[12045,12045],\\\"mapped\\\",[20886]],[[12046,12046],\\\"mapped\\\",[20907]],[[12047,12047],\\\"mapped\\\",[20960]],[[12048,12048],\\\"mapped\\\",[20981]],[[12049,12049],\\\"mapped\\\",[20992]],[[12050,12050],\\\"mapped\\\",[21147]],[[12051,12051],\\\"mapped\\\",[21241]],[[12052,12052],\\\"mapped\\\",[21269]],[[12053,12053],\\\"mapped\\\",[21274]],[[12054,12054],\\\"mapped\\\",[21304]],[[12055,12055],\\\"mapped\\\",[21313]],[[12056,12056],\\\"mapped\\\",[21340]],[[12057,12057],\\\"mapped\\\",[21353]],[[12058,12058],\\\"mapped\\\",[21378]],[[12059,12059],\\\"mapped\\\",[21430]],[[12060,12060],\\\"mapped\\\",[21448]],[[12061,12061],\\\"mapped\\\",[21475]],[[12062,12062],\\\"mapped\\\",[22231]],[[12063,12063],\\\"mapped\\\",[22303]],[[12064,12064],\\\"mapped\\\",[22763]],[[12065,12065],\\\"mapped\\\",[22786]],[[12066,12066],\\\"mapped\\\",[22794]],[[12067,12067],\\\"mapped\\\",[22805]],[[12068,12068],\\\"mapped\\\",[22823]],[[12069,12069],\\\"mapped\\\",[22899]],[[12070,12070],\\\"mapped\\\",[23376]],[[12071,12071],\\\"mapped\\\",[23424]],[[12072,12072],\\\"mapped\\\",[23544]],[[12073,12073],\\\"mapped\\\",[23567]],[[12074,12074],\\\"mapped\\\",[23586]],[[12075,12075],\\\"mapped\\\",[23608]],[[12076,12076],\\\"mapped\\\",[23662]],[[12077,12077],\\\"mapped\\\",[23665]],[[12078,12078],\\\"mapped\\\",[24027]],[[12079,12079],\\\"mapped\\\",[24037]],[[12080,12080],\\\"mapped\\\",[24049]],[[12081,12081],\\\"mapped\\\",[24062]],[[12082,12082],\\\"mapped\\\",[24178]],[[12083,12083],\\\"mapped\\\",[24186]],[[12084,12084],\\\"mapped\\\",[24191]],[[12085,12085],\\\"mapped\\\",[24308]],[[12086,12086],\\\"mapped\\\",[24318]],[[12087,12087],\\\"mapped\\\",[24331]],[[12088,12088],\\\"mapped\\\",[24339]],[[12089,12089],\\\"mapped\\\",[24400]],[[12090,12090],\\\"mapped\\\",[24417]],[[12091,12091],\\\"mapped\\\",[24435]],[[12092,12092],\\\"mapped\\\",[24515]],[[12093,12093],\\\"mapped\\\",[25096]],[[12094,12094],\\\"mapped\\\",[25142]],[[12095,12095],\\\"mapped\\\",[25163]],[[12096,12
096],\\\"mapped\\\",[25903]],[[12097,12097],\\\"mapped\\\",[25908]],[[12098,12098],\\\"mapped\\\",[25991]],[[12099,12099],\\\"mapped\\\",[26007]],[[12100,12100],\\\"mapped\\\",[26020]],[[12101,12101],\\\"mapped\\\",[26041]],[[12102,12102],\\\"mapped\\\",[26080]],[[12103,12103],\\\"mapped\\\",[26085]],[[12104,12104],\\\"mapped\\\",[26352]],[[12105,12105],\\\"mapped\\\",[26376]],[[12106,12106],\\\"mapped\\\",[26408]],[[12107,12107],\\\"mapped\\\",[27424]],[[12108,12108],\\\"mapped\\\",[27490]],[[12109,12109],\\\"mapped\\\",[27513]],[[12110,12110],\\\"mapped\\\",[27571]],[[12111,12111],\\\"mapped\\\",[27595]],[[12112,12112],\\\"mapped\\\",[27604]],[[12113,12113],\\\"mapped\\\",[27611]],[[12114,12114],\\\"mapped\\\",[27663]],[[12115,12115],\\\"mapped\\\",[27668]],[[12116,12116],\\\"mapped\\\",[27700]],[[12117,12117],\\\"mapped\\\",[28779]],[[12118,12118],\\\"mapped\\\",[29226]],[[12119,12119],\\\"mapped\\\",[29238]],[[12120,12120],\\\"mapped\\\",[29243]],[[12121,12121],\\\"mapped\\\",[29247]],[[12122,12122],\\\"mapped\\\",[29255]],[[12123,12123],\\\"mapped\\\",[29273]],[[12124,12124],\\\"mapped\\\",[29275]],[[12125,12125],\\\"mapped\\\",[29356]],[[12126,12126],\\\"mapped\\\",[29572]],[[12127,12127],\\\"mapped\\\",[29577]],[[12128,12128],\\\"mapped\\\",[29916]],[[12129,12129],\\\"mapped\\\",[29926]],[[12130,12130],\\\"mapped\\\",[29976]],[[12131,12131],\\\"mapped\\\",[29983]],[[12132,12132],\\\"mapped\\\",[29992]],[[12133,12133],\\\"mapped\\\",[30000]],[[12134,12134],\\\"mapped\\\",[30091]],[[12135,12135],\\\"mapped\\\",[30098]],[[12136,12136],\\\"mapped\\\",[30326]],[[12137,12137],\\\"mapped\\\",[30333]],[[12138,12138],\\\"mapped\\\",[30382]],[[12139,12139],\\\"mapped\\\",[30399]],[[12140,12140],\\\"mapped\\\",[30446]],[[12141,12141],\\\"mapped\\\",[30683]],[[12142,12142],\\\"mapped\\\",[30690]],[[12143,12143],\\\"mapped\\\",[30707]],[[12144,12144],\\\"mapped\\\",[31034]],[[12145,12145],\\\"mapped\\\",[31160]],[[12146,12146],\\\"mapped\\\",[31166]],[[12147,12147],\\\"ma
pped\\\",[31348]],[[12148,12148],\\\"mapped\\\",[31435]],[[12149,12149],\\\"mapped\\\",[31481]],[[12150,12150],\\\"mapped\\\",[31859]],[[12151,12151],\\\"mapped\\\",[31992]],[[12152,12152],\\\"mapped\\\",[32566]],[[12153,12153],\\\"mapped\\\",[32593]],[[12154,12154],\\\"mapped\\\",[32650]],[[12155,12155],\\\"mapped\\\",[32701]],[[12156,12156],\\\"mapped\\\",[32769]],[[12157,12157],\\\"mapped\\\",[32780]],[[12158,12158],\\\"mapped\\\",[32786]],[[12159,12159],\\\"mapped\\\",[32819]],[[12160,12160],\\\"mapped\\\",[32895]],[[12161,12161],\\\"mapped\\\",[32905]],[[12162,12162],\\\"mapped\\\",[33251]],[[12163,12163],\\\"mapped\\\",[33258]],[[12164,12164],\\\"mapped\\\",[33267]],[[12165,12165],\\\"mapped\\\",[33276]],[[12166,12166],\\\"mapped\\\",[33292]],[[12167,12167],\\\"mapped\\\",[33307]],[[12168,12168],\\\"mapped\\\",[33311]],[[12169,12169],\\\"mapped\\\",[33390]],[[12170,12170],\\\"mapped\\\",[33394]],[[12171,12171],\\\"mapped\\\",[33400]],[[12172,12172],\\\"mapped\\\",[34381]],[[12173,12173],\\\"mapped\\\",[34411]],[[12174,12174],\\\"mapped\\\",[34880]],[[12175,12175],\\\"mapped\\\",[34892]],[[12176,12176],\\\"mapped\\\",[34915]],[[12177,12177],\\\"mapped\\\",[35198]],[[12178,12178],\\\"mapped\\\",[35211]],[[12179,12179],\\\"mapped\\\",[35282]],[[12180,12180],\\\"mapped\\\",[35328]],[[12181,12181],\\\"mapped\\\",[35895]],[[12182,12182],\\\"mapped\\\",[35910]],[[12183,12183],\\\"mapped\\\",[35925]],[[12184,12184],\\\"mapped\\\",[35960]],[[12185,12185],\\\"mapped\\\",[35997]],[[12186,12186],\\\"mapped\\\",[36196]],[[12187,12187],\\\"mapped\\\",[36208]],[[12188,12188],\\\"mapped\\\",[36275]],[[12189,12189],\\\"mapped\\\",[36523]],[[12190,12190],\\\"mapped\\\",[36554]],[[12191,12191],\\\"mapped\\\",[36763]],[[12192,12192],\\\"mapped\\\",[36784]],[[12193,12193],\\\"mapped\\\",[36789]],[[12194,12194],\\\"mapped\\\",[37009]],[[12195,12195],\\\"mapped\\\",[37193]],[[12196,12196],\\\"mapped\\\",[37318]],[[12197,12197],\\\"mapped\\\",[37324]],[[12198,12198],\\\"mapped\\\",[3
7329]],[[12199,12199],\\\"mapped\\\",[38263]],[[12200,12200],\\\"mapped\\\",[38272]],[[12201,12201],\\\"mapped\\\",[38428]],[[12202,12202],\\\"mapped\\\",[38582]],[[12203,12203],\\\"mapped\\\",[38585]],[[12204,12204],\\\"mapped\\\",[38632]],[[12205,12205],\\\"mapped\\\",[38737]],[[12206,12206],\\\"mapped\\\",[38750]],[[12207,12207],\\\"mapped\\\",[38754]],[[12208,12208],\\\"mapped\\\",[38761]],[[12209,12209],\\\"mapped\\\",[38859]],[[12210,12210],\\\"mapped\\\",[38893]],[[12211,12211],\\\"mapped\\\",[38899]],[[12212,12212],\\\"mapped\\\",[38913]],[[12213,12213],\\\"mapped\\\",[39080]],[[12214,12214],\\\"mapped\\\",[39131]],[[12215,12215],\\\"mapped\\\",[39135]],[[12216,12216],\\\"mapped\\\",[39318]],[[12217,12217],\\\"mapped\\\",[39321]],[[12218,12218],\\\"mapped\\\",[39340]],[[12219,12219],\\\"mapped\\\",[39592]],[[12220,12220],\\\"mapped\\\",[39640]],[[12221,12221],\\\"mapped\\\",[39647]],[[12222,12222],\\\"mapped\\\",[39717]],[[12223,12223],\\\"mapped\\\",[39727]],[[12224,12224],\\\"mapped\\\",[39730]],[[12225,12225],\\\"mapped\\\",[39740]],[[12226,12226],\\\"mapped\\\",[39770]],[[12227,12227],\\\"mapped\\\",[40165]],[[12228,12228],\\\"mapped\\\",[40565]],[[12229,12229],\\\"mapped\\\",[40575]],[[12230,12230],\\\"mapped\\\",[40613]],[[12231,12231],\\\"mapped\\\",[40635]],[[12232,12232],\\\"mapped\\\",[40643]],[[12233,12233],\\\"mapped\\\",[40653]],[[12234,12234],\\\"mapped\\\",[40657]],[[12235,12235],\\\"mapped\\\",[40697]],[[12236,12236],\\\"mapped\\\",[40701]],[[12237,12237],\\\"mapped\\\",[40718]],[[12238,12238],\\\"mapped\\\",[40723]],[[12239,12239],\\\"mapped\\\",[40736]],[[12240,12240],\\\"mapped\\\",[40763]],[[12241,12241],\\\"mapped\\\",[40778]],[[12242,12242],\\\"mapped\\\",[40786]],[[12243,12243],\\\"mapped\\\",[40845]],[[12244,12244],\\\"mapped\\\",[40860]],[[12245,12245],\\\"mapped\\\",[40864]],[[12246,12271],\\\"disallowed\\\"],[[12272,12283],\\\"disallowed\\\"],[[12284,12287],\\\"disallowed\\\"],[[12288,12288],\\\"disallowed_STD3_mapped\\\",[32]],[[1
2289,12289],\\\"valid\\\",[],\\\"NV8\\\"],[[12290,12290],\\\"mapped\\\",[46]],[[12291,12292],\\\"valid\\\",[],\\\"NV8\\\"],[[12293,12295],\\\"valid\\\"],[[12296,12329],\\\"valid\\\",[],\\\"NV8\\\"],[[12330,12333],\\\"valid\\\"],[[12334,12341],\\\"valid\\\",[],\\\"NV8\\\"],[[12342,12342],\\\"mapped\\\",[12306]],[[12343,12343],\\\"valid\\\",[],\\\"NV8\\\"],[[12344,12344],\\\"mapped\\\",[21313]],[[12345,12345],\\\"mapped\\\",[21316]],[[12346,12346],\\\"mapped\\\",[21317]],[[12347,12347],\\\"valid\\\",[],\\\"NV8\\\"],[[12348,12348],\\\"valid\\\"],[[12349,12349],\\\"valid\\\",[],\\\"NV8\\\"],[[12350,12350],\\\"valid\\\",[],\\\"NV8\\\"],[[12351,12351],\\\"valid\\\",[],\\\"NV8\\\"],[[12352,12352],\\\"disallowed\\\"],[[12353,12436],\\\"valid\\\"],[[12437,12438],\\\"valid\\\"],[[12439,12440],\\\"disallowed\\\"],[[12441,12442],\\\"valid\\\"],[[12443,12443],\\\"disallowed_STD3_mapped\\\",[32,12441]],[[12444,12444],\\\"disallowed_STD3_mapped\\\",[32,12442]],[[12445,12446],\\\"valid\\\"],[[12447,12447],\\\"mapped\\\",[12424,12426]],[[12448,12448],\\\"valid\\\",[],\\\"NV8\\\"],[[12449,12542],\\\"valid\\\"],[[12543,12543],\\\"mapped\\\",[12467,12488]],[[12544,12548],\\\"disallowed\\\"],[[12549,12588],\\\"valid\\\"],[[12589,12589],\\\"valid\\\"],[[12590,12592],\\\"disallowed\\\"],[[12593,12593],\\\"mapped\\\",[4352]],[[12594,12594],\\\"mapped\\\",[4353]],[[12595,12595],\\\"mapped\\\",[4522]],[[12596,12596],\\\"mapped\\\",[4354]],[[12597,12597],\\\"mapped\\\",[4524]],[[12598,12598],\\\"mapped\\\",[4525]],[[12599,12599],\\\"mapped\\\",[4355]],[[12600,12600],\\\"mapped\\\",[4356]],[[12601,12601],\\\"mapped\\\",[4357]],[[12602,12602],\\\"mapped\\\",[4528]],[[12603,12603],\\\"mapped\\\",[4529]],[[12604,12604],\\\"mapped\\\",[4530]],[[12605,12605],\\\"mapped\\\",[4531]],[[12606,12606],\\\"mapped\\\",[4532]],[[12607,12607],\\\"mapped\\\",[4533]],[[12608,12608],\\\"mapped\\\",[4378]],[[12609,12609],\\\"mapped\\\",[4358]],[[12610,12610],\\\"mapped\\\",[4359]],[[12611,12611],\\\"mapped\\\",[
4360]],[[12612,12612],\\\"mapped\\\",[4385]],[[12613,12613],\\\"mapped\\\",[4361]],[[12614,12614],\\\"mapped\\\",[4362]],[[12615,12615],\\\"mapped\\\",[4363]],[[12616,12616],\\\"mapped\\\",[4364]],[[12617,12617],\\\"mapped\\\",[4365]],[[12618,12618],\\\"mapped\\\",[4366]],[[12619,12619],\\\"mapped\\\",[4367]],[[12620,12620],\\\"mapped\\\",[4368]],[[12621,12621],\\\"mapped\\\",[4369]],[[12622,12622],\\\"mapped\\\",[4370]],[[12623,12623],\\\"mapped\\\",[4449]],[[12624,12624],\\\"mapped\\\",[4450]],[[12625,12625],\\\"mapped\\\",[4451]],[[12626,12626],\\\"mapped\\\",[4452]],[[12627,12627],\\\"mapped\\\",[4453]],[[12628,12628],\\\"mapped\\\",[4454]],[[12629,12629],\\\"mapped\\\",[4455]],[[12630,12630],\\\"mapped\\\",[4456]],[[12631,12631],\\\"mapped\\\",[4457]],[[12632,12632],\\\"mapped\\\",[4458]],[[12633,12633],\\\"mapped\\\",[4459]],[[12634,12634],\\\"mapped\\\",[4460]],[[12635,12635],\\\"mapped\\\",[4461]],[[12636,12636],\\\"mapped\\\",[4462]],[[12637,12637],\\\"mapped\\\",[4463]],[[12638,12638],\\\"mapped\\\",[4464]],[[12639,12639],\\\"mapped\\\",[4465]],[[12640,12640],\\\"mapped\\\",[4466]],[[12641,12641],\\\"mapped\\\",[4467]],[[12642,12642],\\\"mapped\\\",[4468]],[[12643,12643],\\\"mapped\\\",[4469]],[[12644,12644],\\\"disallowed\\\"],[[12645,12645],\\\"mapped\\\",[4372]],[[12646,12646],\\\"mapped\\\",[4373]],[[12647,12647],\\\"mapped\\\",[4551]],[[12648,12648],\\\"mapped\\\",[4552]],[[12649,12649],\\\"mapped\\\",[4556]],[[12650,12650],\\\"mapped\\\",[4558]],[[12651,12651],\\\"mapped\\\",[4563]],[[12652,12652],\\\"mapped\\\",[4567]],[[12653,12653],\\\"mapped\\\",[4569]],[[12654,12654],\\\"mapped\\\",[4380]],[[12655,12655],\\\"mapped\\\",[4573]],[[12656,12656],\\\"mapped\\\",[4575]],[[12657,12657],\\\"mapped\\\",[4381]],[[12658,12658],\\\"mapped\\\",[4382]],[[12659,12659],\\\"mapped\\\",[4384]],[[12660,12660],\\\"mapped\\\",[4386]],[[12661,12661],\\\"mapped\\\",[4387]],[[12662,12662],\\\"mapped\\\",[4391]],[[12663,12663],\\\"mapped\\\",[4393]],[[12664,12664],\\\"m
apped\\\",[4395]],[[12665,12665],\\\"mapped\\\",[4396]],[[12666,12666],\\\"mapped\\\",[4397]],[[12667,12667],\\\"mapped\\\",[4398]],[[12668,12668],\\\"mapped\\\",[4399]],[[12669,12669],\\\"mapped\\\",[4402]],[[12670,12670],\\\"mapped\\\",[4406]],[[12671,12671],\\\"mapped\\\",[4416]],[[12672,12672],\\\"mapped\\\",[4423]],[[12673,12673],\\\"mapped\\\",[4428]],[[12674,12674],\\\"mapped\\\",[4593]],[[12675,12675],\\\"mapped\\\",[4594]],[[12676,12676],\\\"mapped\\\",[4439]],[[12677,12677],\\\"mapped\\\",[4440]],[[12678,12678],\\\"mapped\\\",[4441]],[[12679,12679],\\\"mapped\\\",[4484]],[[12680,12680],\\\"mapped\\\",[4485]],[[12681,12681],\\\"mapped\\\",[4488]],[[12682,12682],\\\"mapped\\\",[4497]],[[12683,12683],\\\"mapped\\\",[4498]],[[12684,12684],\\\"mapped\\\",[4500]],[[12685,12685],\\\"mapped\\\",[4510]],[[12686,12686],\\\"mapped\\\",[4513]],[[12687,12687],\\\"disallowed\\\"],[[12688,12689],\\\"valid\\\",[],\\\"NV8\\\"],[[12690,12690],\\\"mapped\\\",[19968]],[[12691,12691],\\\"mapped\\\",[20108]],[[12692,12692],\\\"mapped\\\",[19977]],[[12693,12693],\\\"mapped\\\",[22235]],[[12694,12694],\\\"mapped\\\",[19978]],[[12695,12695],\\\"mapped\\\",[20013]],[[12696,12696],\\\"mapped\\\",[19979]],[[12697,12697],\\\"mapped\\\",[30002]],[[12698,12698],\\\"mapped\\\",[20057]],[[12699,12699],\\\"mapped\\\",[19993]],[[12700,12700],\\\"mapped\\\",[19969]],[[12701,12701],\\\"mapped\\\",[22825]],[[12702,12702],\\\"mapped\\\",[22320]],[[12703,12703],\\\"mapped\\\",[20154]],[[12704,12727],\\\"valid\\\"],[[12728,12730],\\\"valid\\\"],[[12731,12735],\\\"disallowed\\\"],[[12736,12751],\\\"valid\\\",[],\\\"NV8\\\"],[[12752,12771],\\\"valid\\\",[],\\\"NV8\\\"],[[12772,12783],\\\"disallowed\\\"],[[12784,12799],\\\"valid\\\"],[[12800,12800],\\\"disallowed_STD3_mapped\\\",[40,4352,41]],[[12801,12801],\\\"disallowed_STD3_mapped\\\",[40,4354,41]],[[12802,12802],\\\"disallowed_STD3_mapped\\\",[40,4355,41]],[[12803,12803],\\\"disallowed_STD3_mapped\\\",[40,4357,41]],[[12804,12804],\\\"disallowed_
STD3_mapped\\\",[40,4358,41]],[[12805,12805],\\\"disallowed_STD3_mapped\\\",[40,4359,41]],[[12806,12806],\\\"disallowed_STD3_mapped\\\",[40,4361,41]],[[12807,12807],\\\"disallowed_STD3_mapped\\\",[40,4363,41]],[[12808,12808],\\\"disallowed_STD3_mapped\\\",[40,4364,41]],[[12809,12809],\\\"disallowed_STD3_mapped\\\",[40,4366,41]],[[12810,12810],\\\"disallowed_STD3_mapped\\\",[40,4367,41]],[[12811,12811],\\\"disallowed_STD3_mapped\\\",[40,4368,41]],[[12812,12812],\\\"disallowed_STD3_mapped\\\",[40,4369,41]],[[12813,12813],\\\"disallowed_STD3_mapped\\\",[40,4370,41]],[[12814,12814],\\\"disallowed_STD3_mapped\\\",[40,44032,41]],[[12815,12815],\\\"disallowed_STD3_mapped\\\",[40,45208,41]],[[12816,12816],\\\"disallowed_STD3_mapped\\\",[40,45796,41]],[[12817,12817],\\\"disallowed_STD3_mapped\\\",[40,46972,41]],[[12818,12818],\\\"disallowed_STD3_mapped\\\",[40,47560,41]],[[12819,12819],\\\"disallowed_STD3_mapped\\\",[40,48148,41]],[[12820,12820],\\\"disallowed_STD3_mapped\\\",[40,49324,41]],[[12821,12821],\\\"disallowed_STD3_mapped\\\",[40,50500,41]],[[12822,12822],\\\"disallowed_STD3_mapped\\\",[40,51088,41]],[[12823,12823],\\\"disallowed_STD3_mapped\\\",[40,52264,41]],[[12824,12824],\\\"disallowed_STD3_mapped\\\",[40,52852,41]],[[12825,12825],\\\"disallowed_STD3_mapped\\\",[40,53440,41]],[[12826,12826],\\\"disallowed_STD3_mapped\\\",[40,54028,41]],[[12827,12827],\\\"disallowed_STD3_mapped\\\",[40,54616,41]],[[12828,12828],\\\"disallowed_STD3_mapped\\\",[40,51452,41]],[[12829,12829],\\\"disallowed_STD3_mapped\\\",[40,50724,51204,41]],[[12830,12830],\\\"disallowed_STD3_mapped\\\",[40,50724,54980,41]],[[12831,12831],\\\"disallowed\\\"],[[12832,12832],\\\"disallowed_STD3_mapped\\\",[40,19968,41]],[[12833,12833],\\\"disallowed_STD3_mapped\\\",[40,20108,41]],[[12834,12834],\\\"disallowed_STD3_mapped\\\",[40,19977,41]],[[12835,12835],\\\"disallowed_STD3_mapped\\\",[40,22235,41]],[[12836,12836],\\\"disallowed_STD3_mapped\\\",[40,20116,41]],[[12837,12837],\\\"disallowed_STD3_mapped
\\\",[40,20845,41]],[[12838,12838],\\\"disallowed_STD3_mapped\\\",[40,19971,41]],[[12839,12839],\\\"disallowed_STD3_mapped\\\",[40,20843,41]],[[12840,12840],\\\"disallowed_STD3_mapped\\\",[40,20061,41]],[[12841,12841],\\\"disallowed_STD3_mapped\\\",[40,21313,41]],[[12842,12842],\\\"disallowed_STD3_mapped\\\",[40,26376,41]],[[12843,12843],\\\"disallowed_STD3_mapped\\\",[40,28779,41]],[[12844,12844],\\\"disallowed_STD3_mapped\\\",[40,27700,41]],[[12845,12845],\\\"disallowed_STD3_mapped\\\",[40,26408,41]],[[12846,12846],\\\"disallowed_STD3_mapped\\\",[40,37329,41]],[[12847,12847],\\\"disallowed_STD3_mapped\\\",[40,22303,41]],[[12848,12848],\\\"disallowed_STD3_mapped\\\",[40,26085,41]],[[12849,12849],\\\"disallowed_STD3_mapped\\\",[40,26666,41]],[[12850,12850],\\\"disallowed_STD3_mapped\\\",[40,26377,41]],[[12851,12851],\\\"disallowed_STD3_mapped\\\",[40,31038,41]],[[12852,12852],\\\"disallowed_STD3_mapped\\\",[40,21517,41]],[[12853,12853],\\\"disallowed_STD3_mapped\\\",[40,29305,41]],[[12854,12854],\\\"disallowed_STD3_mapped\\\",[40,36001,41]],[[12855,12855],\\\"disallowed_STD3_mapped\\\",[40,31069,41]],[[12856,12856],\\\"disallowed_STD3_mapped\\\",[40,21172,41]],[[12857,12857],\\\"disallowed_STD3_mapped\\\",[40,20195,41]],[[12858,12858],\\\"disallowed_STD3_mapped\\\",[40,21628,41]],[[12859,12859],\\\"disallowed_STD3_mapped\\\",[40,23398,41]],[[12860,12860],\\\"disallowed_STD3_mapped\\\",[40,30435,41]],[[12861,12861],\\\"disallowed_STD3_mapped\\\",[40,20225,41]],[[12862,12862],\\\"disallowed_STD3_mapped\\\",[40,36039,41]],[[12863,12863],\\\"disallowed_STD3_mapped\\\",[40,21332,41]],[[12864,12864],\\\"disallowed_STD3_mapped\\\",[40,31085,41]],[[12865,12865],\\\"disallowed_STD3_mapped\\\",[40,20241,41]],[[12866,12866],\\\"disallowed_STD3_mapped\\\",[40,33258,41]],[[12867,12867],\\\"disallowed_STD3_mapped\\\",[40,33267,41]],[[12868,12868],\\\"mapped\\\",[21839]],[[12869,12869],\\\"mapped\\\",[24188]],[[12870,12870],\\\"mapped\\\",[25991]],[[12871,12871],\\\"mapped\\\",[31
631]],[[12872,12879],\\\"valid\\\",[],\\\"NV8\\\"],[[12880,12880],\\\"mapped\\\",[112,116,101]],[[12881,12881],\\\"mapped\\\",[50,49]],[[12882,12882],\\\"mapped\\\",[50,50]],[[12883,12883],\\\"mapped\\\",[50,51]],[[12884,12884],\\\"mapped\\\",[50,52]],[[12885,12885],\\\"mapped\\\",[50,53]],[[12886,12886],\\\"mapped\\\",[50,54]],[[12887,12887],\\\"mapped\\\",[50,55]],[[12888,12888],\\\"mapped\\\",[50,56]],[[12889,12889],\\\"mapped\\\",[50,57]],[[12890,12890],\\\"mapped\\\",[51,48]],[[12891,12891],\\\"mapped\\\",[51,49]],[[12892,12892],\\\"mapped\\\",[51,50]],[[12893,12893],\\\"mapped\\\",[51,51]],[[12894,12894],\\\"mapped\\\",[51,52]],[[12895,12895],\\\"mapped\\\",[51,53]],[[12896,12896],\\\"mapped\\\",[4352]],[[12897,12897],\\\"mapped\\\",[4354]],[[12898,12898],\\\"mapped\\\",[4355]],[[12899,12899],\\\"mapped\\\",[4357]],[[12900,12900],\\\"mapped\\\",[4358]],[[12901,12901],\\\"mapped\\\",[4359]],[[12902,12902],\\\"mapped\\\",[4361]],[[12903,12903],\\\"mapped\\\",[4363]],[[12904,12904],\\\"mapped\\\",[4364]],[[12905,12905],\\\"mapped\\\",[4366]],[[12906,12906],\\\"mapped\\\",[4367]],[[12907,12907],\\\"mapped\\\",[4368]],[[12908,12908],\\\"mapped\\\",[4369]],[[12909,12909],\\\"mapped\\\",[4370]],[[12910,12910],\\\"mapped\\\",[44032]],[[12911,12911],\\\"mapped\\\",[45208]],[[12912,12912],\\\"mapped\\\",[45796]],[[12913,12913],\\\"mapped\\\",[46972]],[[12914,12914],\\\"mapped\\\",[47560]],[[12915,12915],\\\"mapped\\\",[48148]],[[12916,12916],\\\"mapped\\\",[49324]],[[12917,12917],\\\"mapped\\\",[50500]],[[12918,12918],\\\"mapped\\\",[51088]],[[12919,12919],\\\"mapped\\\",[52264]],[[12920,12920],\\\"mapped\\\",[52852]],[[12921,12921],\\\"mapped\\\",[53440]],[[12922,12922],\\\"mapped\\\",[54028]],[[12923,12923],\\\"mapped\\\",[54616]],[[12924,12924],\\\"mapped\\\",[52280,44256]],[[12925,12925],\\\"mapped\\\",[51452,51032]],[[12926,12926],\\\"mapped\\\",[50864]],[[12927,12927],\\\"valid\\\",[],\\\"NV8\\\"],[[12928,12928],\\\"mapped\\\",[19968]],[[12929,12929],\\\"mapped\\\
",[20108]],[[12930,12930],\\\"mapped\\\",[19977]],[[12931,12931],\\\"mapped\\\",[22235]],[[12932,12932],\\\"mapped\\\",[20116]],[[12933,12933],\\\"mapped\\\",[20845]],[[12934,12934],\\\"mapped\\\",[19971]],[[12935,12935],\\\"mapped\\\",[20843]],[[12936,12936],\\\"mapped\\\",[20061]],[[12937,12937],\\\"mapped\\\",[21313]],[[12938,12938],\\\"mapped\\\",[26376]],[[12939,12939],\\\"mapped\\\",[28779]],[[12940,12940],\\\"mapped\\\",[27700]],[[12941,12941],\\\"mapped\\\",[26408]],[[12942,12942],\\\"mapped\\\",[37329]],[[12943,12943],\\\"mapped\\\",[22303]],[[12944,12944],\\\"mapped\\\",[26085]],[[12945,12945],\\\"mapped\\\",[26666]],[[12946,12946],\\\"mapped\\\",[26377]],[[12947,12947],\\\"mapped\\\",[31038]],[[12948,12948],\\\"mapped\\\",[21517]],[[12949,12949],\\\"mapped\\\",[29305]],[[12950,12950],\\\"mapped\\\",[36001]],[[12951,12951],\\\"mapped\\\",[31069]],[[12952,12952],\\\"mapped\\\",[21172]],[[12953,12953],\\\"mapped\\\",[31192]],[[12954,12954],\\\"mapped\\\",[30007]],[[12955,12955],\\\"mapped\\\",[22899]],[[12956,12956],\\\"mapped\\\",[36969]],[[12957,12957],\\\"mapped\\\",[20778]],[[12958,12958],\\\"mapped\\\",[21360]],[[12959,12959],\\\"mapped\\\",[27880]],[[12960,12960],\\\"mapped\\\",[38917]],[[12961,12961],\\\"mapped\\\",[20241]],[[12962,12962],\\\"mapped\\\",[20889]],[[12963,12963],\\\"mapped\\\",[27491]],[[12964,12964],\\\"mapped\\\",[19978]],[[12965,12965],\\\"mapped\\\",[20013]],[[12966,12966],\\\"mapped\\\",[19979]],[[12967,12967],\\\"mapped\\\",[24038]],[[12968,12968],\\\"mapped\\\",[21491]],[[12969,12969],\\\"mapped\\\",[21307]],[[12970,12970],\\\"mapped\\\",[23447]],[[12971,12971],\\\"mapped\\\",[23398]],[[12972,12972],\\\"mapped\\\",[30435]],[[12973,12973],\\\"mapped\\\",[20225]],[[12974,12974],\\\"mapped\\\",[36039]],[[12975,12975],\\\"mapped\\\",[21332]],[[12976,12976],\\\"mapped\\\",[22812]],[[12977,12977],\\\"mapped\\\",[51,54]],[[12978,12978],\\\"mapped\\\",[51,55]],[[12979,12979],\\\"mapped\\\",[51,56]],[[12980,12980],\\\"mapped\\\",[51,57]],
[[12981,12981],\\\"mapped\\\",[52,48]],[[12982,12982],\\\"mapped\\\",[52,49]],[[12983,12983],\\\"mapped\\\",[52,50]],[[12984,12984],\\\"mapped\\\",[52,51]],[[12985,12985],\\\"mapped\\\",[52,52]],[[12986,12986],\\\"mapped\\\",[52,53]],[[12987,12987],\\\"mapped\\\",[52,54]],[[12988,12988],\\\"mapped\\\",[52,55]],[[12989,12989],\\\"mapped\\\",[52,56]],[[12990,12990],\\\"mapped\\\",[52,57]],[[12991,12991],\\\"mapped\\\",[53,48]],[[12992,12992],\\\"mapped\\\",[49,26376]],[[12993,12993],\\\"mapped\\\",[50,26376]],[[12994,12994],\\\"mapped\\\",[51,26376]],[[12995,12995],\\\"mapped\\\",[52,26376]],[[12996,12996],\\\"mapped\\\",[53,26376]],[[12997,12997],\\\"mapped\\\",[54,26376]],[[12998,12998],\\\"mapped\\\",[55,26376]],[[12999,12999],\\\"mapped\\\",[56,26376]],[[13000,13000],\\\"mapped\\\",[57,26376]],[[13001,13001],\\\"mapped\\\",[49,48,26376]],[[13002,13002],\\\"mapped\\\",[49,49,26376]],[[13003,13003],\\\"mapped\\\",[49,50,26376]],[[13004,13004],\\\"mapped\\\",[104,103]],[[13005,13005],\\\"mapped\\\",[101,114,103]],[[13006,13006],\\\"mapped\\\",[101,118]],[[13007,13007],\\\"mapped\\\",[108,116,100]],[[13008,13008],\\\"mapped\\\",[12450]],[[13009,13009],\\\"mapped\\\",[12452]],[[13010,13010],\\\"mapped\\\",[12454]],[[13011,13011],\\\"mapped\\\",[12456]],[[13012,13012],\\\"mapped\\\",[12458]],[[13013,13013],\\\"mapped\\\",[12459]],[[13014,13014],\\\"mapped\\\",[12461]],[[13015,13015],\\\"mapped\\\",[12463]],[[13016,13016],\\\"mapped\\\",[12465]],[[13017,13017],\\\"mapped\\\",[12467]],[[13018,13018],\\\"mapped\\\",[12469]],[[13019,13019],\\\"mapped\\\",[12471]],[[13020,13020],\\\"mapped\\\",[12473]],[[13021,13021],\\\"mapped\\\",[12475]],[[13022,13022],\\\"mapped\\\",[12477]],[[13023,13023],\\\"mapped\\\",[12479]],[[13024,13024],\\\"mapped\\\",[12481]],[[13025,13025],\\\"mapped\\\",[12484]],[[13026,13026],\\\"mapped\\\",[12486]],[[13027,13027],\\\"mapped\\\",[12488]],[[13028,13028],\\\"mapped\\\",[12490]],[[13029,13029],\\\"mapped\\\",[12491]],[[13030,13030],\\\"mapped\\\
",[12492]],[[13031,13031],\\\"mapped\\\",[12493]],[[13032,13032],\\\"mapped\\\",[12494]],[[13033,13033],\\\"mapped\\\",[12495]],[[13034,13034],\\\"mapped\\\",[12498]],[[13035,13035],\\\"mapped\\\",[12501]],[[13036,13036],\\\"mapped\\\",[12504]],[[13037,13037],\\\"mapped\\\",[12507]],[[13038,13038],\\\"mapped\\\",[12510]],[[13039,13039],\\\"mapped\\\",[12511]],[[13040,13040],\\\"mapped\\\",[12512]],[[13041,13041],\\\"mapped\\\",[12513]],[[13042,13042],\\\"mapped\\\",[12514]],[[13043,13043],\\\"mapped\\\",[12516]],[[13044,13044],\\\"mapped\\\",[12518]],[[13045,13045],\\\"mapped\\\",[12520]],[[13046,13046],\\\"mapped\\\",[12521]],[[13047,13047],\\\"mapped\\\",[12522]],[[13048,13048],\\\"mapped\\\",[12523]],[[13049,13049],\\\"mapped\\\",[12524]],[[13050,13050],\\\"mapped\\\",[12525]],[[13051,13051],\\\"mapped\\\",[12527]],[[13052,13052],\\\"mapped\\\",[12528]],[[13053,13053],\\\"mapped\\\",[12529]],[[13054,13054],\\\"mapped\\\",[12530]],[[13055,13055],\\\"disallowed\\\"],[[13056,13056],\\\"mapped\\\",[12450,12497,12540,12488]],[[13057,13057],\\\"mapped\\\",[12450,12523,12501,12449]],[[13058,13058],\\\"mapped\\\",[12450,12531,12506,12450]],[[13059,13059],\\\"mapped\\\",[12450,12540,12523]],[[13060,13060],\\\"mapped\\\",[12452,12491,12531,12464]],[[13061,13061],\\\"mapped\\\",[12452,12531,12481]],[[13062,13062],\\\"mapped\\\",[12454,12457,12531]],[[13063,13063],\\\"mapped\\\",[12456,12473,12463,12540,12489]],[[13064,13064],\\\"mapped\\\",[12456,12540,12459,12540]],[[13065,13065],\\\"mapped\\\",[12458,12531,12473]],[[13066,13066],\\\"mapped\\\",[12458,12540,12512]],[[13067,13067],\\\"mapped\\\",[12459,12452,12522]],[[13068,13068],\\\"mapped\\\",[12459,12521,12483,12488]],[[13069,13069],\\\"mapped\\\",[12459,12525,12522,12540]],[[13070,13070],\\\"mapped\\\",[12460,12525,12531]],[[13071,13071],\\\"mapped\\\",[12460,12531,12510]],[[13072,13072],\\\"mapped\\\",[12462,12460]],[[13073,13073],\\\"mapped\\\",[12462,12491,12540]],[[13074,13074],\\\"mapped\\\",[12461,12517,12522,125
40]],[[13075,13075],\\\"mapped\\\",[12462,12523,12480,12540]],[[13076,13076],\\\"mapped\\\",[12461,12525]],[[13077,13077],\\\"mapped\\\",[12461,12525,12464,12521,12512]],[[13078,13078],\\\"mapped\\\",[12461,12525,12513,12540,12488,12523]],[[13079,13079],\\\"mapped\\\",[12461,12525,12527,12483,12488]],[[13080,13080],\\\"mapped\\\",[12464,12521,12512]],[[13081,13081],\\\"mapped\\\",[12464,12521,12512,12488,12531]],[[13082,13082],\\\"mapped\\\",[12463,12523,12476,12452,12525]],[[13083,13083],\\\"mapped\\\",[12463,12525,12540,12493]],[[13084,13084],\\\"mapped\\\",[12465,12540,12473]],[[13085,13085],\\\"mapped\\\",[12467,12523,12490]],[[13086,13086],\\\"mapped\\\",[12467,12540,12509]],[[13087,13087],\\\"mapped\\\",[12469,12452,12463,12523]],[[13088,13088],\\\"mapped\\\",[12469,12531,12481,12540,12512]],[[13089,13089],\\\"mapped\\\",[12471,12522,12531,12464]],[[13090,13090],\\\"mapped\\\",[12475,12531,12481]],[[13091,13091],\\\"mapped\\\",[12475,12531,12488]],[[13092,13092],\\\"mapped\\\",[12480,12540,12473]],[[13093,13093],\\\"mapped\\\",[12487,12471]],[[13094,13094],\\\"mapped\\\",[12489,12523]],[[13095,13095],\\\"mapped\\\",[12488,12531]],[[13096,13096],\\\"mapped\\\",[12490,12494]],[[13097,13097],\\\"mapped\\\",[12494,12483,12488]],[[13098,13098],\\\"mapped\\\",[12495,12452,12484]],[[13099,13099],\\\"mapped\\\",[12497,12540,12475,12531,12488]],[[13100,13100],\\\"mapped\\\",[12497,12540,12484]],[[13101,13101],\\\"mapped\\\",[12496,12540,12524,12523]],[[13102,13102],\\\"mapped\\\",[12500,12450,12473,12488,12523]],[[13103,13103],\\\"mapped\\\",[12500,12463,12523]],[[13104,13104],\\\"mapped\\\",[12500,12467]],[[13105,13105],\\\"mapped\\\",[12499,12523]],[[13106,13106],\\\"mapped\\\",[12501,12449,12521,12483,12489]],[[13107,13107],\\\"mapped\\\",[12501,12451,12540,12488]],[[13108,13108],\\\"mapped\\\",[12502,12483,12471,12455,12523]],[[13109,13109],\\\"mapped\\\",[12501,12521,12531]],[[13110,13110],\\\"mapped\\\",[12504,12463,12479,12540,12523]],[[13111,13111],\\\"mapped\\
\",[12506,12477]],[[13112,13112],\\\"mapped\\\",[12506,12491,12498]],[[13113,13113],\\\"mapped\\\",[12504,12523,12484]],[[13114,13114],\\\"mapped\\\",[12506,12531,12473]],[[13115,13115],\\\"mapped\\\",[12506,12540,12472]],[[13116,13116],\\\"mapped\\\",[12505,12540,12479]],[[13117,13117],\\\"mapped\\\",[12509,12452,12531,12488]],[[13118,13118],\\\"mapped\\\",[12508,12523,12488]],[[13119,13119],\\\"mapped\\\",[12507,12531]],[[13120,13120],\\\"mapped\\\",[12509,12531,12489]],[[13121,13121],\\\"mapped\\\",[12507,12540,12523]],[[13122,13122],\\\"mapped\\\",[12507,12540,12531]],[[13123,13123],\\\"mapped\\\",[12510,12452,12463,12525]],[[13124,13124],\\\"mapped\\\",[12510,12452,12523]],[[13125,13125],\\\"mapped\\\",[12510,12483,12495]],[[13126,13126],\\\"mapped\\\",[12510,12523,12463]],[[13127,13127],\\\"mapped\\\",[12510,12531,12471,12519,12531]],[[13128,13128],\\\"mapped\\\",[12511,12463,12525,12531]],[[13129,13129],\\\"mapped\\\",[12511,12522]],[[13130,13130],\\\"mapped\\\",[12511,12522,12496,12540,12523]],[[13131,13131],\\\"mapped\\\",[12513,12460]],[[13132,13132],\\\"mapped\\\",[12513,12460,12488,12531]],[[13133,13133],\\\"mapped\\\",[12513,12540,12488,12523]],[[13134,13134],\\\"mapped\\\",[12516,12540,12489]],[[13135,13135],\\\"mapped\\\",[12516,12540,12523]],[[13136,13136],\\\"mapped\\\",[12518,12450,12531]],[[13137,13137],\\\"mapped\\\",[12522,12483,12488,12523]],[[13138,13138],\\\"mapped\\\",[12522,12521]],[[13139,13139],\\\"mapped\\\",[12523,12500,12540]],[[13140,13140],\\\"mapped\\\",[12523,12540,12502,12523]],[[13141,13141],\\\"mapped\\\",[12524,12512]],[[13142,13142],\\\"mapped\\\",[12524,12531,12488,12466,12531]],[[13143,13143],\\\"mapped\\\",[12527,12483,12488]],[[13144,13144],\\\"mapped\\\",[48,28857]],[[13145,13145],\\\"mapped\\\",[49,28857]],[[13146,13146],\\\"mapped\\\",[50,28857]],[[13147,13147],\\\"mapped\\\",[51,28857]],[[13148,13148],\\\"mapped\\\",[52,28857]],[[13149,13149],\\\"mapped\\\",[53,28857]],[[13150,13150],\\\"mapped\\\",[54,28857]],[[13151,
13151],\\\"mapped\\\",[55,28857]],[[13152,13152],\\\"mapped\\\",[56,28857]],[[13153,13153],\\\"mapped\\\",[57,28857]],[[13154,13154],\\\"mapped\\\",[49,48,28857]],[[13155,13155],\\\"mapped\\\",[49,49,28857]],[[13156,13156],\\\"mapped\\\",[49,50,28857]],[[13157,13157],\\\"mapped\\\",[49,51,28857]],[[13158,13158],\\\"mapped\\\",[49,52,28857]],[[13159,13159],\\\"mapped\\\",[49,53,28857]],[[13160,13160],\\\"mapped\\\",[49,54,28857]],[[13161,13161],\\\"mapped\\\",[49,55,28857]],[[13162,13162],\\\"mapped\\\",[49,56,28857]],[[13163,13163],\\\"mapped\\\",[49,57,28857]],[[13164,13164],\\\"mapped\\\",[50,48,28857]],[[13165,13165],\\\"mapped\\\",[50,49,28857]],[[13166,13166],\\\"mapped\\\",[50,50,28857]],[[13167,13167],\\\"mapped\\\",[50,51,28857]],[[13168,13168],\\\"mapped\\\",[50,52,28857]],[[13169,13169],\\\"mapped\\\",[104,112,97]],[[13170,13170],\\\"mapped\\\",[100,97]],[[13171,13171],\\\"mapped\\\",[97,117]],[[13172,13172],\\\"mapped\\\",[98,97,114]],[[13173,13173],\\\"mapped\\\",[111,118]],[[13174,13174],\\\"mapped\\\",[112,99]],[[13175,13175],\\\"mapped\\\",[100,109]],[[13176,13176],\\\"mapped\\\",[100,109,50]],[[13177,13177],\\\"mapped\\\",[100,109,51]],[[13178,13178],\\\"mapped\\\",[105,117]],[[13179,13179],\\\"mapped\\\",[24179,25104]],[[13180,13180],\\\"mapped\\\",[26157,21644]],[[13181,13181],\\\"mapped\\\",[22823,27491]],[[13182,13182],\\\"mapped\\\",[26126,27835]],[[13183,13183],\\\"mapped\\\",[26666,24335,20250,31038]],[[13184,13184],\\\"mapped\\\",[112,97]],[[13185,13185],\\\"mapped\\\",[110,97]],[[13186,13186],\\\"mapped\\\",[956,97]],[[13187,13187],\\\"mapped\\\",[109,97]],[[13188,13188],\\\"mapped\\\",[107,97]],[[13189,13189],\\\"mapped\\\",[107,98]],[[13190,13190],\\\"mapped\\\",[109,98]],[[13191,13191],\\\"mapped\\\",[103,98]],[[13192,13192],\\\"mapped\\\",[99,97,108]],[[13193,13193],\\\"mapped\\\",[107,99,97,108]],[[13194,13194],\\\"mapped\\\",[112,102]],[[13195,13195],\\\"mapped\\\",[110,102]],[[13196,13196],\\\"mapped\\\",[956,102]],[[13197,13197],\\\"
mapped\\\",[956,103]],[[13198,13198],\\\"mapped\\\",[109,103]],[[13199,13199],\\\"mapped\\\",[107,103]],[[13200,13200],\\\"mapped\\\",[104,122]],[[13201,13201],\\\"mapped\\\",[107,104,122]],[[13202,13202],\\\"mapped\\\",[109,104,122]],[[13203,13203],\\\"mapped\\\",[103,104,122]],[[13204,13204],\\\"mapped\\\",[116,104,122]],[[13205,13205],\\\"mapped\\\",[956,108]],[[13206,13206],\\\"mapped\\\",[109,108]],[[13207,13207],\\\"mapped\\\",[100,108]],[[13208,13208],\\\"mapped\\\",[107,108]],[[13209,13209],\\\"mapped\\\",[102,109]],[[13210,13210],\\\"mapped\\\",[110,109]],[[13211,13211],\\\"mapped\\\",[956,109]],[[13212,13212],\\\"mapped\\\",[109,109]],[[13213,13213],\\\"mapped\\\",[99,109]],[[13214,13214],\\\"mapped\\\",[107,109]],[[13215,13215],\\\"mapped\\\",[109,109,50]],[[13216,13216],\\\"mapped\\\",[99,109,50]],[[13217,13217],\\\"mapped\\\",[109,50]],[[13218,13218],\\\"mapped\\\",[107,109,50]],[[13219,13219],\\\"mapped\\\",[109,109,51]],[[13220,13220],\\\"mapped\\\",[99,109,51]],[[13221,13221],\\\"mapped\\\",[109,51]],[[13222,13222],\\\"mapped\\\",[107,109,51]],[[13223,13223],\\\"mapped\\\",[109,8725,115]],[[13224,13224],\\\"mapped\\\",[109,8725,115,50]],[[13225,13225],\\\"mapped\\\",[112,97]],[[13226,13226],\\\"mapped\\\",[107,112,97]],[[13227,13227],\\\"mapped\\\",[109,112,97]],[[13228,13228],\\\"mapped\\\",[103,112,97]],[[13229,13229],\\\"mapped\\\",[114,97,100]],[[13230,13230],\\\"mapped\\\",[114,97,100,8725,115]],[[13231,13231],\\\"mapped\\\",[114,97,100,8725,115,50]],[[13232,13232],\\\"mapped\\\",[112,115]],[[13233,13233],\\\"mapped\\\",[110,115]],[[13234,13234],\\\"mapped\\\",[956,115]],[[13235,13235],\\\"mapped\\\",[109,115]],[[13236,13236],\\\"mapped\\\",[112,118]],[[13237,13237],\\\"mapped\\\",[110,118]],[[13238,13238],\\\"mapped\\\",[956,118]],[[13239,13239],\\\"mapped\\\",[109,118]],[[13240,13240],\\\"mapped\\\",[107,118]],[[13241,13241],\\\"mapped\\\",[109,118]],[[13242,13242],\\\"mapped\\\",[112,119]],[[13243,13243],\\\"mapped\\\",[110,119]],[[13244,1324
4],\\\"mapped\\\",[956,119]],[[13245,13245],\\\"mapped\\\",[109,119]],[[13246,13246],\\\"mapped\\\",[107,119]],[[13247,13247],\\\"mapped\\\",[109,119]],[[13248,13248],\\\"mapped\\\",[107,969]],[[13249,13249],\\\"mapped\\\",[109,969]],[[13250,13250],\\\"disallowed\\\"],[[13251,13251],\\\"mapped\\\",[98,113]],[[13252,13252],\\\"mapped\\\",[99,99]],[[13253,13253],\\\"mapped\\\",[99,100]],[[13254,13254],\\\"mapped\\\",[99,8725,107,103]],[[13255,13255],\\\"disallowed\\\"],[[13256,13256],\\\"mapped\\\",[100,98]],[[13257,13257],\\\"mapped\\\",[103,121]],[[13258,13258],\\\"mapped\\\",[104,97]],[[13259,13259],\\\"mapped\\\",[104,112]],[[13260,13260],\\\"mapped\\\",[105,110]],[[13261,13261],\\\"mapped\\\",[107,107]],[[13262,13262],\\\"mapped\\\",[107,109]],[[13263,13263],\\\"mapped\\\",[107,116]],[[13264,13264],\\\"mapped\\\",[108,109]],[[13265,13265],\\\"mapped\\\",[108,110]],[[13266,13266],\\\"mapped\\\",[108,111,103]],[[13267,13267],\\\"mapped\\\",[108,120]],[[13268,13268],\\\"mapped\\\",[109,98]],[[13269,13269],\\\"mapped\\\",[109,105,108]],[[13270,13270],\\\"mapped\\\",[109,111,108]],[[13271,13271],\\\"mapped\\\",[112,104]],[[13272,13272],\\\"disallowed\\\"],[[13273,13273],\\\"mapped\\\",[112,112,109]],[[13274,13274],\\\"mapped\\\",[112,114]],[[13275,13275],\\\"mapped\\\",[115,114]],[[13276,13276],\\\"mapped\\\",[115,118]],[[13277,13277],\\\"mapped\\\",[119,98]],[[13278,13278],\\\"mapped\\\",[118,8725,109]],[[13279,13279],\\\"mapped\\\",[97,8725,109]],[[13280,13280],\\\"mapped\\\",[49,26085]],[[13281,13281],\\\"mapped\\\",[50,26085]],[[13282,13282],\\\"mapped\\\",[51,26085]],[[13283,13283],\\\"mapped\\\",[52,26085]],[[13284,13284],\\\"mapped\\\",[53,26085]],[[13285,13285],\\\"mapped\\\",[54,26085]],[[13286,13286],\\\"mapped\\\",[55,26085]],[[13287,13287],\\\"mapped\\\",[56,26085]],[[13288,13288],\\\"mapped\\\",[57,26085]],[[13289,13289],\\\"mapped\\\",[49,48,26085]],[[13290,13290],\\\"mapped\\\",[49,49,26085]],[[13291,13291],\\\"mapped\\\",[49,50,26085]],[[13292,13292],\
\\"mapped\\\",[49,51,26085]],[[13293,13293],\\\"mapped\\\",[49,52,26085]],[[13294,13294],\\\"mapped\\\",[49,53,26085]],[[13295,13295],\\\"mapped\\\",[49,54,26085]],[[13296,13296],\\\"mapped\\\",[49,55,26085]],[[13297,13297],\\\"mapped\\\",[49,56,26085]],[[13298,13298],\\\"mapped\\\",[49,57,26085]],[[13299,13299],\\\"mapped\\\",[50,48,26085]],[[13300,13300],\\\"mapped\\\",[50,49,26085]],[[13301,13301],\\\"mapped\\\",[50,50,26085]],[[13302,13302],\\\"mapped\\\",[50,51,26085]],[[13303,13303],\\\"mapped\\\",[50,52,26085]],[[13304,13304],\\\"mapped\\\",[50,53,26085]],[[13305,13305],\\\"mapped\\\",[50,54,26085]],[[13306,13306],\\\"mapped\\\",[50,55,26085]],[[13307,13307],\\\"mapped\\\",[50,56,26085]],[[13308,13308],\\\"mapped\\\",[50,57,26085]],[[13309,13309],\\\"mapped\\\",[51,48,26085]],[[13310,13310],\\\"mapped\\\",[51,49,26085]],[[13311,13311],\\\"mapped\\\",[103,97,108]],[[13312,19893],\\\"valid\\\"],[[19894,19903],\\\"disallowed\\\"],[[19904,19967],\\\"valid\\\",[],\\\"NV8\\\"],[[19968,40869],\\\"valid\\\"],[[40870,40891],\\\"valid\\\"],[[40892,40899],\\\"valid\\\"],[[40900,40907],\\\"valid\\\"],[[40908,40908],\\\"valid\\\"],[[40909,40917],\\\"valid\\\"],[[40918,40959],\\\"disallowed\\\"],[[40960,42124],\\\"valid\\\"],[[42125,42127],\\\"disallowed\\\"],[[42128,42145],\\\"valid\\\",[],\\\"NV8\\\"],[[42146,42147],\\\"valid\\\",[],\\\"NV8\\\"],[[42148,42163],\\\"valid\\\",[],\\\"NV8\\\"],[[42164,42164],\\\"valid\\\",[],\\\"NV8\\\"],[[42165,42176],\\\"valid\\\",[],\\\"NV8\\\"],[[42177,42177],\\\"valid\\\",[],\\\"NV8\\\"],[[42178,42180],\\\"valid\\\",[],\\\"NV8\\\"],[[42181,42181],\\\"valid\\\",[],\\\"NV8\\\"],[[42182,42182],\\\"valid\\\",[],\\\"NV8\\\"],[[42183,42191],\\\"disallowed\\\"],[[42192,42237],\\\"valid\\\"],[[42238,42239],\\\"valid\\\",[],\\\"NV8\\\"],[[42240,42508],\\\"valid\\\"],[[42509,42511],\\\"valid\\\",[],\\\"NV8\\\"],[[42512,42539],\\\"valid\\\"],[[42540,42559],\\\"disallowed\\\"],[[42560,42560],\\\"mapped\\\",[42561]],[[42561,42561],\\\"valid\\\"],[[4
2562,42562],\\\"mapped\\\",[42563]],[[42563,42563],\\\"valid\\\"],[[42564,42564],\\\"mapped\\\",[42565]],[[42565,42565],\\\"valid\\\"],[[42566,42566],\\\"mapped\\\",[42567]],[[42567,42567],\\\"valid\\\"],[[42568,42568],\\\"mapped\\\",[42569]],[[42569,42569],\\\"valid\\\"],[[42570,42570],\\\"mapped\\\",[42571]],[[42571,42571],\\\"valid\\\"],[[42572,42572],\\\"mapped\\\",[42573]],[[42573,42573],\\\"valid\\\"],[[42574,42574],\\\"mapped\\\",[42575]],[[42575,42575],\\\"valid\\\"],[[42576,42576],\\\"mapped\\\",[42577]],[[42577,42577],\\\"valid\\\"],[[42578,42578],\\\"mapped\\\",[42579]],[[42579,42579],\\\"valid\\\"],[[42580,42580],\\\"mapped\\\",[42581]],[[42581,42581],\\\"valid\\\"],[[42582,42582],\\\"mapped\\\",[42583]],[[42583,42583],\\\"valid\\\"],[[42584,42584],\\\"mapped\\\",[42585]],[[42585,42585],\\\"valid\\\"],[[42586,42586],\\\"mapped\\\",[42587]],[[42587,42587],\\\"valid\\\"],[[42588,42588],\\\"mapped\\\",[42589]],[[42589,42589],\\\"valid\\\"],[[42590,42590],\\\"mapped\\\",[42591]],[[42591,42591],\\\"valid\\\"],[[42592,42592],\\\"mapped\\\",[42593]],[[42593,42593],\\\"valid\\\"],[[42594,42594],\\\"mapped\\\",[42595]],[[42595,42595],\\\"valid\\\"],[[42596,42596],\\\"mapped\\\",[42597]],[[42597,42597],\\\"valid\\\"],[[42598,42598],\\\"mapped\\\",[42599]],[[42599,42599],\\\"valid\\\"],[[42600,42600],\\\"mapped\\\",[42601]],[[42601,42601],\\\"valid\\\"],[[42602,42602],\\\"mapped\\\",[42603]],[[42603,42603],\\\"valid\\\"],[[42604,42604],\\\"mapped\\\",[42605]],[[42605,42607],\\\"valid\\\"],[[42608,42611],\\\"valid\\\",[],\\\"NV8\\\"],[[42612,42619],\\\"valid\\\"],[[42620,42621],\\\"valid\\\"],[[42622,42622],\\\"valid\\\",[],\\\"NV8\\\"],[[42623,42623],\\\"valid\\\"],[[42624,42624],\\\"mapped\\\",[42625]],[[42625,42625],\\\"valid\\\"],[[42626,42626],\\\"mapped\\\",[42627]],[[42627,42627],\\\"valid\\\"],[[42628,42628],\\\"mapped\\\",[42629]],[[42629,42629],\\\"valid\\\"],[[42630,42630],\\\"mapped\\\",[42631]],[[42631,42631],\\\"valid\\\"],[[42632,42632],\\\"mapped\\\"
,[42633]],[[42633,42633],\\\"valid\\\"],[[42634,42634],\\\"mapped\\\",[42635]],[[42635,42635],\\\"valid\\\"],[[42636,42636],\\\"mapped\\\",[42637]],[[42637,42637],\\\"valid\\\"],[[42638,42638],\\\"mapped\\\",[42639]],[[42639,42639],\\\"valid\\\"],[[42640,42640],\\\"mapped\\\",[42641]],[[42641,42641],\\\"valid\\\"],[[42642,42642],\\\"mapped\\\",[42643]],[[42643,42643],\\\"valid\\\"],[[42644,42644],\\\"mapped\\\",[42645]],[[42645,42645],\\\"valid\\\"],[[42646,42646],\\\"mapped\\\",[42647]],[[42647,42647],\\\"valid\\\"],[[42648,42648],\\\"mapped\\\",[42649]],[[42649,42649],\\\"valid\\\"],[[42650,42650],\\\"mapped\\\",[42651]],[[42651,42651],\\\"valid\\\"],[[42652,42652],\\\"mapped\\\",[1098]],[[42653,42653],\\\"mapped\\\",[1100]],[[42654,42654],\\\"valid\\\"],[[42655,42655],\\\"valid\\\"],[[42656,42725],\\\"valid\\\"],[[42726,42735],\\\"valid\\\",[],\\\"NV8\\\"],[[42736,42737],\\\"valid\\\"],[[42738,42743],\\\"valid\\\",[],\\\"NV8\\\"],[[42744,42751],\\\"disallowed\\\"],[[42752,42774],\\\"valid\\\",[],\\\"NV8\\\"],[[42775,42778],\\\"valid\\\"],[[42779,42783],\\\"valid\\\"],[[42784,42785],\\\"valid\\\",[],\\\"NV8\\\"],[[42786,42786],\\\"mapped\\\",[42787]],[[42787,42787],\\\"valid\\\"],[[42788,42788],\\\"mapped\\\",[42789]],[[42789,42789],\\\"valid\\\"],[[42790,42790],\\\"mapped\\\",[42791]],[[42791,42791],\\\"valid\\\"],[[42792,42792],\\\"mapped\\\",[42793]],[[42793,42793],\\\"valid\\\"],[[42794,42794],\\\"mapped\\\",[42795]],[[42795,42795],\\\"valid\\\"],[[42796,42796],\\\"mapped\\\",[42797]],[[42797,42797],\\\"valid\\\"],[[42798,42798],\\\"mapped\\\",[42799]],[[42799,42801],\\\"valid\\\"],[[42802,42802],\\\"mapped\\\",[42803]],[[42803,42803],\\\"valid\\\"],[[42804,42804],\\\"mapped\\\",[42805]],[[42805,42805],\\\"valid\\\"],[[42806,42806],\\\"mapped\\\",[42807]],[[42807,42807],\\\"valid\\\"],[[42808,42808],\\\"mapped\\\",[42809]],[[42809,42809],\\\"valid\\\"],[[42810,42810],\\\"mapped\\\",[42811]],[[42811,42811],\\\"valid\\\"],[[42812,42812],\\\"mapped\\\",[42813]],[
[42813,42813],\\\"valid\\\"],[[42814,42814],\\\"mapped\\\",[42815]],[[42815,42815],\\\"valid\\\"],[[42816,42816],\\\"mapped\\\",[42817]],[[42817,42817],\\\"valid\\\"],[[42818,42818],\\\"mapped\\\",[42819]],[[42819,42819],\\\"valid\\\"],[[42820,42820],\\\"mapped\\\",[42821]],[[42821,42821],\\\"valid\\\"],[[42822,42822],\\\"mapped\\\",[42823]],[[42823,42823],\\\"valid\\\"],[[42824,42824],\\\"mapped\\\",[42825]],[[42825,42825],\\\"valid\\\"],[[42826,42826],\\\"mapped\\\",[42827]],[[42827,42827],\\\"valid\\\"],[[42828,42828],\\\"mapped\\\",[42829]],[[42829,42829],\\\"valid\\\"],[[42830,42830],\\\"mapped\\\",[42831]],[[42831,42831],\\\"valid\\\"],[[42832,42832],\\\"mapped\\\",[42833]],[[42833,42833],\\\"valid\\\"],[[42834,42834],\\\"mapped\\\",[42835]],[[42835,42835],\\\"valid\\\"],[[42836,42836],\\\"mapped\\\",[42837]],[[42837,42837],\\\"valid\\\"],[[42838,42838],\\\"mapped\\\",[42839]],[[42839,42839],\\\"valid\\\"],[[42840,42840],\\\"mapped\\\",[42841]],[[42841,42841],\\\"valid\\\"],[[42842,42842],\\\"mapped\\\",[42843]],[[42843,42843],\\\"valid\\\"],[[42844,42844],\\\"mapped\\\",[42845]],[[42845,42845],\\\"valid\\\"],[[42846,42846],\\\"mapped\\\",[42847]],[[42847,42847],\\\"valid\\\"],[[42848,42848],\\\"mapped\\\",[42849]],[[42849,42849],\\\"valid\\\"],[[42850,42850],\\\"mapped\\\",[42851]],[[42851,42851],\\\"valid\\\"],[[42852,42852],\\\"mapped\\\",[42853]],[[42853,42853],\\\"valid\\\"],[[42854,42854],\\\"mapped\\\",[42855]],[[42855,42855],\\\"valid\\\"],[[42856,42856],\\\"mapped\\\",[42857]],[[42857,42857],\\\"valid\\\"],[[42858,42858],\\\"mapped\\\",[42859]],[[42859,42859],\\\"valid\\\"],[[42860,42860],\\\"mapped\\\",[42861]],[[42861,42861],\\\"valid\\\"],[[42862,42862],\\\"mapped\\\",[42863]],[[42863,42863],\\\"valid\\\"],[[42864,42864],\\\"mapped\\\",[42863]],[[42865,42872],\\\"valid\\\"],[[42873,42873],\\\"mapped\\\",[42874]],[[42874,42874],\\\"valid\\\"],[[42875,42875],\\\"mapped\\\",[42876]],[[42876,42876],\\\"valid\\\"],[[42877,42877],\\\"mapped\\\",[7545]],[
[42878,42878],\\\"mapped\\\",[42879]],[[42879,42879],\\\"valid\\\"],[[42880,42880],\\\"mapped\\\",[42881]],[[42881,42881],\\\"valid\\\"],[[42882,42882],\\\"mapped\\\",[42883]],[[42883,42883],\\\"valid\\\"],[[42884,42884],\\\"mapped\\\",[42885]],[[42885,42885],\\\"valid\\\"],[[42886,42886],\\\"mapped\\\",[42887]],[[42887,42888],\\\"valid\\\"],[[42889,42890],\\\"valid\\\",[],\\\"NV8\\\"],[[42891,42891],\\\"mapped\\\",[42892]],[[42892,42892],\\\"valid\\\"],[[42893,42893],\\\"mapped\\\",[613]],[[42894,42894],\\\"valid\\\"],[[42895,42895],\\\"valid\\\"],[[42896,42896],\\\"mapped\\\",[42897]],[[42897,42897],\\\"valid\\\"],[[42898,42898],\\\"mapped\\\",[42899]],[[42899,42899],\\\"valid\\\"],[[42900,42901],\\\"valid\\\"],[[42902,42902],\\\"mapped\\\",[42903]],[[42903,42903],\\\"valid\\\"],[[42904,42904],\\\"mapped\\\",[42905]],[[42905,42905],\\\"valid\\\"],[[42906,42906],\\\"mapped\\\",[42907]],[[42907,42907],\\\"valid\\\"],[[42908,42908],\\\"mapped\\\",[42909]],[[42909,42909],\\\"valid\\\"],[[42910,42910],\\\"mapped\\\",[42911]],[[42911,42911],\\\"valid\\\"],[[42912,42912],\\\"mapped\\\",[42913]],[[42913,42913],\\\"valid\\\"],[[42914,42914],\\\"mapped\\\",[42915]],[[42915,42915],\\\"valid\\\"],[[42916,42916],\\\"mapped\\\",[42917]],[[42917,42917],\\\"valid\\\"],[[42918,42918],\\\"mapped\\\",[42919]],[[42919,42919],\\\"valid\\\"],[[42920,42920],\\\"mapped\\\",[42921]],[[42921,42921],\\\"valid\\\"],[[42922,42922],\\\"mapped\\\",[614]],[[42923,42923],\\\"mapped\\\",[604]],[[42924,42924],\\\"mapped\\\",[609]],[[42925,42925],\\\"mapped\\\",[620]],[[42926,42927],\\\"disallowed\\\"],[[42928,42928],\\\"mapped\\\",[670]],[[42929,42929],\\\"mapped\\\",[647]],[[42930,42930],\\\"mapped\\\",[669]],[[42931,42931],\\\"mapped\\\",[43859]],[[42932,42932],\\\"mapped\\\",[42933]],[[42933,42933],\\\"valid\\\"],[[42934,42934],\\\"mapped\\\",[42935]],[[42935,42935],\\\"valid\\\"],[[42936,42998],\\\"disallowed\\\"],[[42999,42999],\\\"valid\\\"],[[43000,43000],\\\"mapped\\\",[295]],[[43001,43001]
,\\\"mapped\\\",[339]],[[43002,43002],\\\"valid\\\"],[[43003,43007],\\\"valid\\\"],[[43008,43047],\\\"valid\\\"],[[43048,43051],\\\"valid\\\",[],\\\"NV8\\\"],[[43052,43055],\\\"disallowed\\\"],[[43056,43065],\\\"valid\\\",[],\\\"NV8\\\"],[[43066,43071],\\\"disallowed\\\"],[[43072,43123],\\\"valid\\\"],[[43124,43127],\\\"valid\\\",[],\\\"NV8\\\"],[[43128,43135],\\\"disallowed\\\"],[[43136,43204],\\\"valid\\\"],[[43205,43213],\\\"disallowed\\\"],[[43214,43215],\\\"valid\\\",[],\\\"NV8\\\"],[[43216,43225],\\\"valid\\\"],[[43226,43231],\\\"disallowed\\\"],[[43232,43255],\\\"valid\\\"],[[43256,43258],\\\"valid\\\",[],\\\"NV8\\\"],[[43259,43259],\\\"valid\\\"],[[43260,43260],\\\"valid\\\",[],\\\"NV8\\\"],[[43261,43261],\\\"valid\\\"],[[43262,43263],\\\"disallowed\\\"],[[43264,43309],\\\"valid\\\"],[[43310,43311],\\\"valid\\\",[],\\\"NV8\\\"],[[43312,43347],\\\"valid\\\"],[[43348,43358],\\\"disallowed\\\"],[[43359,43359],\\\"valid\\\",[],\\\"NV8\\\"],[[43360,43388],\\\"valid\\\",[],\\\"NV8\\\"],[[43389,43391],\\\"disallowed\\\"],[[43392,43456],\\\"valid\\\"],[[43457,43469],\\\"valid\\\",[],\\\"NV8\\\"],[[43470,43470],\\\"disallowed\\\"],[[43471,43481],\\\"valid\\\"],[[43482,43485],\\\"disallowed\\\"],[[43486,43487],\\\"valid\\\",[],\\\"NV8\\\"],[[43488,43518],\\\"valid\\\"],[[43519,43519],\\\"disallowed\\\"],[[43520,43574],\\\"valid\\\"],[[43575,43583],\\\"disallowed\\\"],[[43584,43597],\\\"valid\\\"],[[43598,43599],\\\"disallowed\\\"],[[43600,43609],\\\"valid\\\"],[[43610,43611],\\\"disallowed\\\"],[[43612,43615],\\\"valid\\\",[],\\\"NV8\\\"],[[43616,43638],\\\"valid\\\"],[[43639,43641],\\\"valid\\\",[],\\\"NV8\\\"],[[43642,43643],\\\"valid\\\"],[[43644,43647],\\\"valid\\\"],[[43648,43714],\\\"valid\\\"],[[43715,43738],\\\"disallowed\\\"],[[43739,43741],\\\"valid\\\"],[[43742,43743],\\\"valid\\\",[],\\\"NV8\\\"],[[43744,43759],\\\"valid\\\"],[[43760,43761],\\\"valid\\\",[],\\\"NV8\\\"],[[43762,43766],\\\"valid\\\"],[[43767,43776],\\\"disallowed\\\"],[[43777,43782],\\\"val
id\\\"],[[43783,43784],\\\"disallowed\\\"],[[43785,43790],\\\"valid\\\"],[[43791,43792],\\\"disallowed\\\"],[[43793,43798],\\\"valid\\\"],[[43799,43807],\\\"disallowed\\\"],[[43808,43814],\\\"valid\\\"],[[43815,43815],\\\"disallowed\\\"],[[43816,43822],\\\"valid\\\"],[[43823,43823],\\\"disallowed\\\"],[[43824,43866],\\\"valid\\\"],[[43867,43867],\\\"valid\\\",[],\\\"NV8\\\"],[[43868,43868],\\\"mapped\\\",[42791]],[[43869,43869],\\\"mapped\\\",[43831]],[[43870,43870],\\\"mapped\\\",[619]],[[43871,43871],\\\"mapped\\\",[43858]],[[43872,43875],\\\"valid\\\"],[[43876,43877],\\\"valid\\\"],[[43878,43887],\\\"disallowed\\\"],[[43888,43888],\\\"mapped\\\",[5024]],[[43889,43889],\\\"mapped\\\",[5025]],[[43890,43890],\\\"mapped\\\",[5026]],[[43891,43891],\\\"mapped\\\",[5027]],[[43892,43892],\\\"mapped\\\",[5028]],[[43893,43893],\\\"mapped\\\",[5029]],[[43894,43894],\\\"mapped\\\",[5030]],[[43895,43895],\\\"mapped\\\",[5031]],[[43896,43896],\\\"mapped\\\",[5032]],[[43897,43897],\\\"mapped\\\",[5033]],[[43898,43898],\\\"mapped\\\",[5034]],[[43899,43899],\\\"mapped\\\",[5035]],[[43900,43900],\\\"mapped\\\",[5036]],[[43901,43901],\\\"mapped\\\",[5037]],[[43902,43902],\\\"mapped\\\",[5038]],[[43903,43903],\\\"mapped\\\",[5039]],[[43904,43904],\\\"mapped\\\",[5040]],[[43905,43905],\\\"mapped\\\",[5041]],[[43906,43906],\\\"mapped\\\",[5042]],[[43907,43907],\\\"mapped\\\",[5043]],[[43908,43908],\\\"mapped\\\",[5044]],[[43909,43909],\\\"mapped\\\",[5045]],[[43910,43910],\\\"mapped\\\",[5046]],[[43911,43911],\\\"mapped\\\",[5047]],[[43912,43912],\\\"mapped\\\",[5048]],[[43913,43913],\\\"mapped\\\",[5049]],[[43914,43914],\\\"mapped\\\",[5050]],[[43915,43915],\\\"mapped\\\",[5051]],[[43916,43916],\\\"mapped\\\",[5052]],[[43917,43917],\\\"mapped\\\",[5053]],[[43918,43918],\\\"mapped\\\",[5054]],[[43919,43919],\\\"mapped\\\",[5055]],[[43920,43920],\\\"mapped\\\",[5056]],[[43921,43921],\\\"mapped\\\",[5057]],[[43922,43922],\\\"mapped\\\",[5058]],[[43923,43923],\\\"mapped\\\",[5059]],[[439
24,43924],\\\"mapped\\\",[5060]],[[43925,43925],\\\"mapped\\\",[5061]],[[43926,43926],\\\"mapped\\\",[5062]],[[43927,43927],\\\"mapped\\\",[5063]],[[43928,43928],\\\"mapped\\\",[5064]],[[43929,43929],\\\"mapped\\\",[5065]],[[43930,43930],\\\"mapped\\\",[5066]],[[43931,43931],\\\"mapped\\\",[5067]],[[43932,43932],\\\"mapped\\\",[5068]],[[43933,43933],\\\"mapped\\\",[5069]],[[43934,43934],\\\"mapped\\\",[5070]],[[43935,43935],\\\"mapped\\\",[5071]],[[43936,43936],\\\"mapped\\\",[5072]],[[43937,43937],\\\"mapped\\\",[5073]],[[43938,43938],\\\"mapped\\\",[5074]],[[43939,43939],\\\"mapped\\\",[5075]],[[43940,43940],\\\"mapped\\\",[5076]],[[43941,43941],\\\"mapped\\\",[5077]],[[43942,43942],\\\"mapped\\\",[5078]],[[43943,43943],\\\"mapped\\\",[5079]],[[43944,43944],\\\"mapped\\\",[5080]],[[43945,43945],\\\"mapped\\\",[5081]],[[43946,43946],\\\"mapped\\\",[5082]],[[43947,43947],\\\"mapped\\\",[5083]],[[43948,43948],\\\"mapped\\\",[5084]],[[43949,43949],\\\"mapped\\\",[5085]],[[43950,43950],\\\"mapped\\\",[5086]],[[43951,43951],\\\"mapped\\\",[5087]],[[43952,43952],\\\"mapped\\\",[5088]],[[43953,43953],\\\"mapped\\\",[5089]],[[43954,43954],\\\"mapped\\\",[5090]],[[43955,43955],\\\"mapped\\\",[5091]],[[43956,43956],\\\"mapped\\\",[5092]],[[43957,43957],\\\"mapped\\\",[5093]],[[43958,43958],\\\"mapped\\\",[5094]],[[43959,43959],\\\"mapped\\\",[5095]],[[43960,43960],\\\"mapped\\\",[5096]],[[43961,43961],\\\"mapped\\\",[5097]],[[43962,43962],\\\"mapped\\\",[5098]],[[43963,43963],\\\"mapped\\\",[5099]],[[43964,43964],\\\"mapped\\\",[5100]],[[43965,43965],\\\"mapped\\\",[5101]],[[43966,43966],\\\"mapped\\\",[5102]],[[43967,43967],\\\"mapped\\\",[5103]],[[43968,44010],\\\"valid\\\"],[[44011,44011],\\\"valid\\\",[],\\\"NV8\\\"],[[44012,44013],\\\"valid\\\"],[[44014,44015],\\\"disallowed\\\"],[[44016,44025],\\\"valid\\\"],[[44026,44031],\\\"disallowed\\\"],[[44032,55203],\\\"valid\\\"],[[55204,55215],\\\"disallowed\\\"],[[55216,55238],\\\"valid\\\",[],\\\"NV8\\\"],[[55239,55242],\\\
"disallowed\\\"],[[55243,55291],\\\"valid\\\",[],\\\"NV8\\\"],[[55292,55295],\\\"disallowed\\\"],[[55296,57343],\\\"disallowed\\\"],[[57344,63743],\\\"disallowed\\\"],[[63744,63744],\\\"mapped\\\",[35912]],[[63745,63745],\\\"mapped\\\",[26356]],[[63746,63746],\\\"mapped\\\",[36554]],[[63747,63747],\\\"mapped\\\",[36040]],[[63748,63748],\\\"mapped\\\",[28369]],[[63749,63749],\\\"mapped\\\",[20018]],[[63750,63750],\\\"mapped\\\",[21477]],[[63751,63752],\\\"mapped\\\",[40860]],[[63753,63753],\\\"mapped\\\",[22865]],[[63754,63754],\\\"mapped\\\",[37329]],[[63755,63755],\\\"mapped\\\",[21895]],[[63756,63756],\\\"mapped\\\",[22856]],[[63757,63757],\\\"mapped\\\",[25078]],[[63758,63758],\\\"mapped\\\",[30313]],[[63759,63759],\\\"mapped\\\",[32645]],[[63760,63760],\\\"mapped\\\",[34367]],[[63761,63761],\\\"mapped\\\",[34746]],[[63762,63762],\\\"mapped\\\",[35064]],[[63763,63763],\\\"mapped\\\",[37007]],[[63764,63764],\\\"mapped\\\",[27138]],[[63765,63765],\\\"mapped\\\",[27931]],[[63766,63766],\\\"mapped\\\",[28889]],[[63767,63767],\\\"mapped\\\",[29662]],[[63768,63768],\\\"mapped\\\",[33853]],[[63769,63769],\\\"mapped\\\",[37226]],[[63770,63770],\\\"mapped\\\",[39409]],[[63771,63771],\\\"mapped\\\",[20098]],[[63772,63772],\\\"mapped\\\",[21365]],[[63773,63773],\\\"mapped\\\",[27396]],[[63774,63774],\\\"mapped\\\",[29211]],[[63775,63775],\\\"mapped\\\",[34349]],[[63776,63776],\\\"mapped\\\",[40478]],[[63777,63777],\\\"mapped\\\",[23888]],[[63778,63778],\\\"mapped\\\",[28651]],[[63779,63779],\\\"mapped\\\",[34253]],[[63780,63780],\\\"mapped\\\",[35172]],[[63781,63781],\\\"mapped\\\",[25289]],[[63782,63782],\\\"mapped\\\",[33240]],[[63783,63783],\\\"mapped\\\",[34847]],[[63784,63784],\\\"mapped\\\",[24266]],[[63785,63785],\\\"mapped\\\",[26391]],[[63786,63786],\\\"mapped\\\",[28010]],[[63787,63787],\\\"mapped\\\",[29436]],[[63788,63788],\\\"mapped\\\",[37070]],[[63789,63789],\\\"mapped\\\",[20358]],[[63790,63790],\\\"mapped\\\",[20919]],[[63791,63791],\\\"mapped\\\",[21214]],
[[63792,63792],\\\"mapped\\\",[25796]],[[63793,63793],\\\"mapped\\\",[27347]],[[63794,63794],\\\"mapped\\\",[29200]],[[63795,63795],\\\"mapped\\\",[30439]],[[63796,63796],\\\"mapped\\\",[32769]],[[63797,63797],\\\"mapped\\\",[34310]],[[63798,63798],\\\"mapped\\\",[34396]],[[63799,63799],\\\"mapped\\\",[36335]],[[63800,63800],\\\"mapped\\\",[38706]],[[63801,63801],\\\"mapped\\\",[39791]],[[63802,63802],\\\"mapped\\\",[40442]],[[63803,63803],\\\"mapped\\\",[30860]],[[63804,63804],\\\"mapped\\\",[31103]],[[63805,63805],\\\"mapped\\\",[32160]],[[63806,63806],\\\"mapped\\\",[33737]],[[63807,63807],\\\"mapped\\\",[37636]],[[63808,63808],\\\"mapped\\\",[40575]],[[63809,63809],\\\"mapped\\\",[35542]],[[63810,63810],\\\"mapped\\\",[22751]],[[63811,63811],\\\"mapped\\\",[24324]],[[63812,63812],\\\"mapped\\\",[31840]],[[63813,63813],\\\"mapped\\\",[32894]],[[63814,63814],\\\"mapped\\\",[29282]],[[63815,63815],\\\"mapped\\\",[30922]],[[63816,63816],\\\"mapped\\\",[36034]],[[63817,63817],\\\"mapped\\\",[38647]],[[63818,63818],\\\"mapped\\\",[22744]],[[63819,63819],\\\"mapped\\\",[23650]],[[63820,63820],\\\"mapped\\\",[27155]],[[63821,63821],\\\"mapped\\\",[28122]],[[63822,63822],\\\"mapped\\\",[28431]],[[63823,63823],\\\"mapped\\\",[32047]],[[63824,63824],\\\"mapped\\\",[32311]],[[63825,63825],\\\"mapped\\\",[38475]],[[63826,63826],\\\"mapped\\\",[21202]],[[63827,63827],\\\"mapped\\\",[32907]],[[63828,63828],\\\"mapped\\\",[20956]],[[63829,63829],\\\"mapped\\\",[20940]],[[63830,63830],\\\"mapped\\\",[31260]],[[63831,63831],\\\"mapped\\\",[32190]],[[63832,63832],\\\"mapped\\\",[33777]],[[63833,63833],\\\"mapped\\\",[38517]],[[63834,63834],\\\"mapped\\\",[35712]],[[63835,63835],\\\"mapped\\\",[25295]],[[63836,63836],\\\"mapped\\\",[27138]],[[63837,63837],\\\"mapped\\\",[35582]],[[63838,63838],\\\"mapped\\\",[20025]],[[63839,63839],\\\"mapped\\\",[23527]],[[63840,63840],\\\"mapped\\\",[24594]],[[63841,63841],\\\"mapped\\\",[29575]],[[63842,63842],\\\"mapped\\\",[30064]],[[63843,638
43],\\\"mapped\\\",[21271]],[[63844,63844],\\\"mapped\\\",[30971]],[[63845,63845],\\\"mapped\\\",[20415]],[[63846,63846],\\\"mapped\\\",[24489]],[[63847,63847],\\\"mapped\\\",[19981]],[[63848,63848],\\\"mapped\\\",[27852]],[[63849,63849],\\\"mapped\\\",[25976]],[[63850,63850],\\\"mapped\\\",[32034]],[[63851,63851],\\\"mapped\\\",[21443]],[[63852,63852],\\\"mapped\\\",[22622]],[[63853,63853],\\\"mapped\\\",[30465]],[[63854,63854],\\\"mapped\\\",[33865]],[[63855,63855],\\\"mapped\\\",[35498]],[[63856,63856],\\\"mapped\\\",[27578]],[[63857,63857],\\\"mapped\\\",[36784]],[[63858,63858],\\\"mapped\\\",[27784]],[[63859,63859],\\\"mapped\\\",[25342]],[[63860,63860],\\\"mapped\\\",[33509]],[[63861,63861],\\\"mapped\\\",[25504]],[[63862,63862],\\\"mapped\\\",[30053]],[[63863,63863],\\\"mapped\\\",[20142]],[[63864,63864],\\\"mapped\\\",[20841]],[[63865,63865],\\\"mapped\\\",[20937]],[[63866,63866],\\\"mapped\\\",[26753]],[[63867,63867],\\\"mapped\\\",[31975]],[[63868,63868],\\\"mapped\\\",[33391]],[[63869,63869],\\\"mapped\\\",[35538]],[[63870,63870],\\\"mapped\\\",[37327]],[[63871,63871],\\\"mapped\\\",[21237]],[[63872,63872],\\\"mapped\\\",[21570]],[[63873,63873],\\\"mapped\\\",[22899]],[[63874,63874],\\\"mapped\\\",[24300]],[[63875,63875],\\\"mapped\\\",[26053]],[[63876,63876],\\\"mapped\\\",[28670]],[[63877,63877],\\\"mapped\\\",[31018]],[[63878,63878],\\\"mapped\\\",[38317]],[[63879,63879],\\\"mapped\\\",[39530]],[[63880,63880],\\\"mapped\\\",[40599]],[[63881,63881],\\\"mapped\\\",[40654]],[[63882,63882],\\\"mapped\\\",[21147]],[[63883,63883],\\\"mapped\\\",[26310]],[[63884,63884],\\\"mapped\\\",[27511]],[[63885,63885],\\\"mapped\\\",[36706]],[[63886,63886],\\\"mapped\\\",[24180]],[[63887,63887],\\\"mapped\\\",[24976]],[[63888,63888],\\\"mapped\\\",[25088]],[[63889,63889],\\\"mapped\\\",[25754]],[[63890,63890],\\\"mapped\\\",[28451]],[[63891,63891],\\\"mapped\\\",[29001]],[[63892,63892],\\\"mapped\\\",[29833]],[[63893,63893],\\\"mapped\\\",[31178]],[[63894,63894],\\\"map
ped\\\",[32244]],[[63895,63895],\\\"mapped\\\",[32879]],[[63896,63896],\\\"mapped\\\",[36646]],[[63897,63897],\\\"mapped\\\",[34030]],[[63898,63898],\\\"mapped\\\",[36899]],[[63899,63899],\\\"mapped\\\",[37706]],[[63900,63900],\\\"mapped\\\",[21015]],[[63901,63901],\\\"mapped\\\",[21155]],[[63902,63902],\\\"mapped\\\",[21693]],[[63903,63903],\\\"mapped\\\",[28872]],[[63904,63904],\\\"mapped\\\",[35010]],[[63905,63905],\\\"mapped\\\",[35498]],[[63906,63906],\\\"mapped\\\",[24265]],[[63907,63907],\\\"mapped\\\",[24565]],[[63908,63908],\\\"mapped\\\",[25467]],[[63909,63909],\\\"mapped\\\",[27566]],[[63910,63910],\\\"mapped\\\",[31806]],[[63911,63911],\\\"mapped\\\",[29557]],[[63912,63912],\\\"mapped\\\",[20196]],[[63913,63913],\\\"mapped\\\",[22265]],[[63914,63914],\\\"mapped\\\",[23527]],[[63915,63915],\\\"mapped\\\",[23994]],[[63916,63916],\\\"mapped\\\",[24604]],[[63917,63917],\\\"mapped\\\",[29618]],[[63918,63918],\\\"mapped\\\",[29801]],[[63919,63919],\\\"mapped\\\",[32666]],[[63920,63920],\\\"mapped\\\",[32838]],[[63921,63921],\\\"mapped\\\",[37428]],[[63922,63922],\\\"mapped\\\",[38646]],[[63923,63923],\\\"mapped\\\",[38728]],[[63924,63924],\\\"mapped\\\",[38936]],[[63925,63925],\\\"mapped\\\",[20363]],[[63926,63926],\\\"mapped\\\",[31150]],[[63927,63927],\\\"mapped\\\",[37300]],[[63928,63928],\\\"mapped\\\",[38584]],[[63929,63929],\\\"mapped\\\",[24801]],[[63930,63930],\\\"mapped\\\",[20102]],[[63931,63931],\\\"mapped\\\",[20698]],[[63932,63932],\\\"mapped\\\",[23534]],[[63933,63933],\\\"mapped\\\",[23615]],[[63934,63934],\\\"mapped\\\",[26009]],[[63935,63935],\\\"mapped\\\",[27138]],[[63936,63936],\\\"mapped\\\",[29134]],[[63937,63937],\\\"mapped\\\",[30274]],[[63938,63938],\\\"mapped\\\",[34044]],[[63939,63939],\\\"mapped\\\",[36988]],[[63940,63940],\\\"mapped\\\",[40845]],[[63941,63941],\\\"mapped\\\",[26248]],[[63942,63942],\\\"mapped\\\",[38446]],[[63943,63943],\\\"mapped\\\",[21129]],[[63944,63944],\\\"mapped\\\",[26491]],[[63945,63945],\\\"mapped\\\",[26
611]],[[63946,63946],\\\"mapped\\\",[27969]],[[63947,63947],\\\"mapped\\\",[28316]],[[63948,63948],\\\"mapped\\\",[29705]],[[63949,63949],\\\"mapped\\\",[30041]],[[63950,63950],\\\"mapped\\\",[30827]],[[63951,63951],\\\"mapped\\\",[32016]],[[63952,63952],\\\"mapped\\\",[39006]],[[63953,63953],\\\"mapped\\\",[20845]],[[63954,63954],\\\"mapped\\\",[25134]],[[63955,63955],\\\"mapped\\\",[38520]],[[63956,63956],\\\"mapped\\\",[20523]],[[63957,63957],\\\"mapped\\\",[23833]],[[63958,63958],\\\"mapped\\\",[28138]],[[63959,63959],\\\"mapped\\\",[36650]],[[63960,63960],\\\"mapped\\\",[24459]],[[63961,63961],\\\"mapped\\\",[24900]],[[63962,63962],\\\"mapped\\\",[26647]],[[63963,63963],\\\"mapped\\\",[29575]],[[63964,63964],\\\"mapped\\\",[38534]],[[63965,63965],\\\"mapped\\\",[21033]],[[63966,63966],\\\"mapped\\\",[21519]],[[63967,63967],\\\"mapped\\\",[23653]],[[63968,63968],\\\"mapped\\\",[26131]],[[63969,63969],\\\"mapped\\\",[26446]],[[63970,63970],\\\"mapped\\\",[26792]],[[63971,63971],\\\"mapped\\\",[27877]],[[63972,63972],\\\"mapped\\\",[29702]],[[63973,63973],\\\"mapped\\\",[30178]],[[63974,63974],\\\"mapped\\\",[32633]],[[63975,63975],\\\"mapped\\\",[35023]],[[63976,63976],\\\"mapped\\\",[35041]],[[63977,63977],\\\"mapped\\\",[37324]],[[63978,63978],\\\"mapped\\\",[38626]],[[63979,63979],\\\"mapped\\\",[21311]],[[63980,63980],\\\"mapped\\\",[28346]],[[63981,63981],\\\"mapped\\\",[21533]],[[63982,63982],\\\"mapped\\\",[29136]],[[63983,63983],\\\"mapped\\\",[29848]],[[63984,63984],\\\"mapped\\\",[34298]],[[63985,63985],\\\"mapped\\\",[38563]],[[63986,63986],\\\"mapped\\\",[40023]],[[63987,63987],\\\"mapped\\\",[40607]],[[63988,63988],\\\"mapped\\\",[26519]],[[63989,63989],\\\"mapped\\\",[28107]],[[63990,63990],\\\"mapped\\\",[33256]],[[63991,63991],\\\"mapped\\\",[31435]],[[63992,63992],\\\"mapped\\\",[31520]],[[63993,63993],\\\"mapped\\\",[31890]],[[63994,63994],\\\"mapped\\\",[29376]],[[63995,63995],\\\"mapped\\\",[28825]],[[63996,63996],\\\"mapped\\\",[35672]],[[639
97,63997],\\\"mapped\\\",[20160]],[[63998,63998],\\\"mapped\\\",[33590]],[[63999,63999],\\\"mapped\\\",[21050]],[[64000,64000],\\\"mapped\\\",[20999]],[[64001,64001],\\\"mapped\\\",[24230]],[[64002,64002],\\\"mapped\\\",[25299]],[[64003,64003],\\\"mapped\\\",[31958]],[[64004,64004],\\\"mapped\\\",[23429]],[[64005,64005],\\\"mapped\\\",[27934]],[[64006,64006],\\\"mapped\\\",[26292]],[[64007,64007],\\\"mapped\\\",[36667]],[[64008,64008],\\\"mapped\\\",[34892]],[[64009,64009],\\\"mapped\\\",[38477]],[[64010,64010],\\\"mapped\\\",[35211]],[[64011,64011],\\\"mapped\\\",[24275]],[[64012,64012],\\\"mapped\\\",[20800]],[[64013,64013],\\\"mapped\\\",[21952]],[[64014,64015],\\\"valid\\\"],[[64016,64016],\\\"mapped\\\",[22618]],[[64017,64017],\\\"valid\\\"],[[64018,64018],\\\"mapped\\\",[26228]],[[64019,64020],\\\"valid\\\"],[[64021,64021],\\\"mapped\\\",[20958]],[[64022,64022],\\\"mapped\\\",[29482]],[[64023,64023],\\\"mapped\\\",[30410]],[[64024,64024],\\\"mapped\\\",[31036]],[[64025,64025],\\\"mapped\\\",[31070]],[[64026,64026],\\\"mapped\\\",[31077]],[[64027,64027],\\\"mapped\\\",[31119]],[[64028,64028],\\\"mapped\\\",[38742]],[[64029,64029],\\\"mapped\\\",[31934]],[[64030,64030],\\\"mapped\\\",[32701]],[[64031,64031],\\\"valid\\\"],[[64032,64032],\\\"mapped\\\",[34322]],[[64033,64033],\\\"valid\\\"],[[64034,64034],\\\"mapped\\\",[35576]],[[64035,64036],\\\"valid\\\"],[[64037,64037],\\\"mapped\\\",[36920]],[[64038,64038],\\\"mapped\\\",[37117]],[[64039,64041],\\\"valid\\\"],[[64042,64042],\\\"mapped\\\",[39151]],[[64043,64043],\\\"mapped\\\",[39164]],[[64044,64044],\\\"mapped\\\",[39208]],[[64045,64045],\\\"mapped\\\",[40372]],[[64046,64046],\\\"mapped\\\",[37086]],[[64047,64047],\\\"mapped\\\",[38583]],[[64048,64048],\\\"mapped\\\",[20398]],[[64049,64049],\\\"mapped\\\",[20711]],[[64050,64050],\\\"mapped\\\",[20813]],[[64051,64051],\\\"mapped\\\",[21193]],[[64052,64052],\\\"mapped\\\",[21220]],[[64053,64053],\\\"mapped\\\",[21329]],[[64054,64054],\\\"mapped\\\",[21917]],[
[64055,64055],\\\"mapped\\\",[22022]],[[64056,64056],\\\"mapped\\\",[22120]],[[64057,64057],\\\"mapped\\\",[22592]],[[64058,64058],\\\"mapped\\\",[22696]],[[64059,64059],\\\"mapped\\\",[23652]],[[64060,64060],\\\"mapped\\\",[23662]],[[64061,64061],\\\"mapped\\\",[24724]],[[64062,64062],\\\"mapped\\\",[24936]],[[64063,64063],\\\"mapped\\\",[24974]],[[64064,64064],\\\"mapped\\\",[25074]],[[64065,64065],\\\"mapped\\\",[25935]],[[64066,64066],\\\"mapped\\\",[26082]],[[64067,64067],\\\"mapped\\\",[26257]],[[64068,64068],\\\"mapped\\\",[26757]],[[64069,64069],\\\"mapped\\\",[28023]],[[64070,64070],\\\"mapped\\\",[28186]],[[64071,64071],\\\"mapped\\\",[28450]],[[64072,64072],\\\"mapped\\\",[29038]],[[64073,64073],\\\"mapped\\\",[29227]],[[64074,64074],\\\"mapped\\\",[29730]],[[64075,64075],\\\"mapped\\\",[30865]],[[64076,64076],\\\"mapped\\\",[31038]],[[64077,64077],\\\"mapped\\\",[31049]],[[64078,64078],\\\"mapped\\\",[31048]],[[64079,64079],\\\"mapped\\\",[31056]],[[64080,64080],\\\"mapped\\\",[31062]],[[64081,64081],\\\"mapped\\\",[31069]],[[64082,64082],\\\"mapped\\\",[31117]],[[64083,64083],\\\"mapped\\\",[31118]],[[64084,64084],\\\"mapped\\\",[31296]],[[64085,64085],\\\"mapped\\\",[31361]],[[64086,64086],\\\"mapped\\\",[31680]],[[64087,64087],\\\"mapped\\\",[32244]],[[64088,64088],\\\"mapped\\\",[32265]],[[64089,64089],\\\"mapped\\\",[32321]],[[64090,64090],\\\"mapped\\\",[32626]],[[64091,64091],\\\"mapped\\\",[32773]],[[64092,64092],\\\"mapped\\\",[33261]],[[64093,64094],\\\"mapped\\\",[33401]],[[64095,64095],\\\"mapped\\\",[33879]],[[64096,64096],\\\"mapped\\\",[35088]],[[64097,64097],\\\"mapped\\\",[35222]],[[64098,64098],\\\"mapped\\\",[35585]],[[64099,64099],\\\"mapped\\\",[35641]],[[64100,64100],\\\"mapped\\\",[36051]],[[64101,64101],\\\"mapped\\\",[36104]],[[64102,64102],\\\"mapped\\\",[36790]],[[64103,64103],\\\"mapped\\\",[36920]],[[64104,64104],\\\"mapped\\\",[38627]],[[64105,64105],\\\"mapped\\\",[38911]],[[64106,64106],\\\"mapped\\\",[38971]],[[64107,6410
7],\\\"mapped\\\",[24693]],[[64108,64108],\\\"mapped\\\",[148206]],[[64109,64109],\\\"mapped\\\",[33304]],[[64110,64111],\\\"disallowed\\\"],[[64112,64112],\\\"mapped\\\",[20006]],[[64113,64113],\\\"mapped\\\",[20917]],[[64114,64114],\\\"mapped\\\",[20840]],[[64115,64115],\\\"mapped\\\",[20352]],[[64116,64116],\\\"mapped\\\",[20805]],[[64117,64117],\\\"mapped\\\",[20864]],[[64118,64118],\\\"mapped\\\",[21191]],[[64119,64119],\\\"mapped\\\",[21242]],[[64120,64120],\\\"mapped\\\",[21917]],[[64121,64121],\\\"mapped\\\",[21845]],[[64122,64122],\\\"mapped\\\",[21913]],[[64123,64123],\\\"mapped\\\",[21986]],[[64124,64124],\\\"mapped\\\",[22618]],[[64125,64125],\\\"mapped\\\",[22707]],[[64126,64126],\\\"mapped\\\",[22852]],[[64127,64127],\\\"mapped\\\",[22868]],[[64128,64128],\\\"mapped\\\",[23138]],[[64129,64129],\\\"mapped\\\",[23336]],[[64130,64130],\\\"mapped\\\",[24274]],[[64131,64131],\\\"mapped\\\",[24281]],[[64132,64132],\\\"mapped\\\",[24425]],[[64133,64133],\\\"mapped\\\",[24493]],[[64134,64134],\\\"mapped\\\",[24792]],[[64135,64135],\\\"mapped\\\",[24910]],[[64136,64136],\\\"mapped\\\",[24840]],[[64137,64137],\\\"mapped\\\",[24974]],[[64138,64138],\\\"mapped\\\",[24928]],[[64139,64139],\\\"mapped\\\",[25074]],[[64140,64140],\\\"mapped\\\",[25140]],[[64141,64141],\\\"mapped\\\",[25540]],[[64142,64142],\\\"mapped\\\",[25628]],[[64143,64143],\\\"mapped\\\",[25682]],[[64144,64144],\\\"mapped\\\",[25942]],[[64145,64145],\\\"mapped\\\",[26228]],[[64146,64146],\\\"mapped\\\",[26391]],[[64147,64147],\\\"mapped\\\",[26395]],[[64148,64148],\\\"mapped\\\",[26454]],[[64149,64149],\\\"mapped\\\",[27513]],[[64150,64150],\\\"mapped\\\",[27578]],[[64151,64151],\\\"mapped\\\",[27969]],[[64152,64152],\\\"mapped\\\",[28379]],[[64153,64153],\\\"mapped\\\",[28363]],[[64154,64154],\\\"mapped\\\",[28450]],[[64155,64155],\\\"mapped\\\",[28702]],[[64156,64156],\\\"mapped\\\",[29038]],[[64157,64157],\\\"mapped\\\",[30631]],[[64158,64158],\\\"mapped\\\",[29237]],[[64159,64159],\\\"mapped\
\\",[29359]],[[64160,64160],\\\"mapped\\\",[29482]],[[64161,64161],\\\"mapped\\\",[29809]],[[64162,64162],\\\"mapped\\\",[29958]],[[64163,64163],\\\"mapped\\\",[30011]],[[64164,64164],\\\"mapped\\\",[30237]],[[64165,64165],\\\"mapped\\\",[30239]],[[64166,64166],\\\"mapped\\\",[30410]],[[64167,64167],\\\"mapped\\\",[30427]],[[64168,64168],\\\"mapped\\\",[30452]],[[64169,64169],\\\"mapped\\\",[30538]],[[64170,64170],\\\"mapped\\\",[30528]],[[64171,64171],\\\"mapped\\\",[30924]],[[64172,64172],\\\"mapped\\\",[31409]],[[64173,64173],\\\"mapped\\\",[31680]],[[64174,64174],\\\"mapped\\\",[31867]],[[64175,64175],\\\"mapped\\\",[32091]],[[64176,64176],\\\"mapped\\\",[32244]],[[64177,64177],\\\"mapped\\\",[32574]],[[64178,64178],\\\"mapped\\\",[32773]],[[64179,64179],\\\"mapped\\\",[33618]],[[64180,64180],\\\"mapped\\\",[33775]],[[64181,64181],\\\"mapped\\\",[34681]],[[64182,64182],\\\"mapped\\\",[35137]],[[64183,64183],\\\"mapped\\\",[35206]],[[64184,64184],\\\"mapped\\\",[35222]],[[64185,64185],\\\"mapped\\\",[35519]],[[64186,64186],\\\"mapped\\\",[35576]],[[64187,64187],\\\"mapped\\\",[35531]],[[64188,64188],\\\"mapped\\\",[35585]],[[64189,64189],\\\"mapped\\\",[35582]],[[64190,64190],\\\"mapped\\\",[35565]],[[64191,64191],\\\"mapped\\\",[35641]],[[64192,64192],\\\"mapped\\\",[35722]],[[64193,64193],\\\"mapped\\\",[36104]],[[64194,64194],\\\"mapped\\\",[36664]],[[64195,64195],\\\"mapped\\\",[36978]],[[64196,64196],\\\"mapped\\\",[37273]],[[64197,64197],\\\"mapped\\\",[37494]],[[64198,64198],\\\"mapped\\\",[38524]],[[64199,64199],\\\"mapped\\\",[38627]],[[64200,64200],\\\"mapped\\\",[38742]],[[64201,64201],\\\"mapped\\\",[38875]],[[64202,64202],\\\"mapped\\\",[38911]],[[64203,64203],\\\"mapped\\\",[38923]],[[64204,64204],\\\"mapped\\\",[38971]],[[64205,64205],\\\"mapped\\\",[39698]],[[64206,64206],\\\"mapped\\\",[40860]],[[64207,64207],\\\"mapped\\\",[141386]],[[64208,64208],\\\"mapped\\\",[141380]],[[64209,64209],\\\"mapped\\\",[144341]],[[64210,64210],\\\"mapped\\\",[152
61]],[[64211,64211],\\\"mapped\\\",[16408]],[[64212,64212],\\\"mapped\\\",[16441]],[[64213,64213],\\\"mapped\\\",[152137]],[[64214,64214],\\\"mapped\\\",[154832]],[[64215,64215],\\\"mapped\\\",[163539]],[[64216,64216],\\\"mapped\\\",[40771]],[[64217,64217],\\\"mapped\\\",[40846]],[[64218,64255],\\\"disallowed\\\"],[[64256,64256],\\\"mapped\\\",[102,102]],[[64257,64257],\\\"mapped\\\",[102,105]],[[64258,64258],\\\"mapped\\\",[102,108]],[[64259,64259],\\\"mapped\\\",[102,102,105]],[[64260,64260],\\\"mapped\\\",[102,102,108]],[[64261,64262],\\\"mapped\\\",[115,116]],[[64263,64274],\\\"disallowed\\\"],[[64275,64275],\\\"mapped\\\",[1396,1398]],[[64276,64276],\\\"mapped\\\",[1396,1381]],[[64277,64277],\\\"mapped\\\",[1396,1387]],[[64278,64278],\\\"mapped\\\",[1406,1398]],[[64279,64279],\\\"mapped\\\",[1396,1389]],[[64280,64284],\\\"disallowed\\\"],[[64285,64285],\\\"mapped\\\",[1497,1460]],[[64286,64286],\\\"valid\\\"],[[64287,64287],\\\"mapped\\\",[1522,1463]],[[64288,64288],\\\"mapped\\\",[1506]],[[64289,64289],\\\"mapped\\\",[1488]],[[64290,64290],\\\"mapped\\\",[1491]],[[64291,64291],\\\"mapped\\\",[1492]],[[64292,64292],\\\"mapped\\\",[1499]],[[64293,64293],\\\"mapped\\\",[1500]],[[64294,64294],\\\"mapped\\\",[1501]],[[64295,64295],\\\"mapped\\\",[1512]],[[64296,64296],\\\"mapped\\\",[1514]],[[64297,64297],\\\"disallowed_STD3_mapped\\\",[43]],[[64298,64298],\\\"mapped\\\",[1513,1473]],[[64299,64299],\\\"mapped\\\",[1513,1474]],[[64300,64300],\\\"mapped\\\",[1513,1468,1473]],[[64301,64301],\\\"mapped\\\",[1513,1468,1474]],[[64302,64302],\\\"mapped\\\",[1488,1463]],[[64303,64303],\\\"mapped\\\",[1488,1464]],[[64304,64304],\\\"mapped\\\",[1488,1468]],[[64305,64305],\\\"mapped\\\",[1489,1468]],[[64306,64306],\\\"mapped\\\",[1490,1468]],[[64307,64307],\\\"mapped\\\",[1491,1468]],[[64308,64308],\\\"mapped\\\",[1492,1468]],[[64309,64309],\\\"mapped\\\",[1493,1468]],[[64310,64310],\\\"mapped\\\",[1494,1468]],[[64311,64311],\\\"disallowed\\\"],[[64312,64312],\\\"mapped\\\",[
1496,1468]],[[64313,64313],\\\"mapped\\\",[1497,1468]],[[64314,64314],\\\"mapped\\\",[1498,1468]],[[64315,64315],\\\"mapped\\\",[1499,1468]],[[64316,64316],\\\"mapped\\\",[1500,1468]],[[64317,64317],\\\"disallowed\\\"],[[64318,64318],\\\"mapped\\\",[1502,1468]],[[64319,64319],\\\"disallowed\\\"],[[64320,64320],\\\"mapped\\\",[1504,1468]],[[64321,64321],\\\"mapped\\\",[1505,1468]],[[64322,64322],\\\"disallowed\\\"],[[64323,64323],\\\"mapped\\\",[1507,1468]],[[64324,64324],\\\"mapped\\\",[1508,1468]],[[64325,64325],\\\"disallowed\\\"],[[64326,64326],\\\"mapped\\\",[1510,1468]],[[64327,64327],\\\"mapped\\\",[1511,1468]],[[64328,64328],\\\"mapped\\\",[1512,1468]],[[64329,64329],\\\"mapped\\\",[1513,1468]],[[64330,64330],\\\"mapped\\\",[1514,1468]],[[64331,64331],\\\"mapped\\\",[1493,1465]],[[64332,64332],\\\"mapped\\\",[1489,1471]],[[64333,64333],\\\"mapped\\\",[1499,1471]],[[64334,64334],\\\"mapped\\\",[1508,1471]],[[64335,64335],\\\"mapped\\\",[1488,1500]],[[64336,64337],\\\"mapped\\\",[1649]],[[64338,64341],\\\"mapped\\\",[1659]],[[64342,64345],\\\"mapped\\\",[1662]],[[64346,64349],\\\"mapped\\\",[1664]],[[64350,64353],\\\"mapped\\\",[1658]],[[64354,64357],\\\"mapped\\\",[1663]],[[64358,64361],\\\"mapped\\\",[1657]],[[64362,64365],\\\"mapped\\\",[1700]],[[64366,64369],\\\"mapped\\\",[1702]],[[64370,64373],\\\"mapped\\\",[1668]],[[64374,64377],\\\"mapped\\\",[1667]],[[64378,64381],\\\"mapped\\\",[1670]],[[64382,64385],\\\"mapped\\\",[1671]],[[64386,64387],\\\"mapped\\\",[1677]],[[64388,64389],\\\"mapped\\\",[1676]],[[64390,64391],\\\"mapped\\\",[1678]],[[64392,64393],\\\"mapped\\\",[1672]],[[64394,64395],\\\"mapped\\\",[1688]],[[64396,64397],\\\"mapped\\\",[1681]],[[64398,64401],\\\"mapped\\\",[1705]],[[64402,64405],\\\"mapped\\\",[1711]],[[64406,64409],\\\"mapped\\\",[1715]],[[64410,64413],\\\"mapped\\\",[1713]],[[64414,64415],\\\"mapped\\\",[1722]],[[64416,64419],\\\"mapped\\\",[1723]],[[64420,64421],\\\"mapped\\\",[1728]],[[64422,64425],\\\"mapped\\\",[1729]],[[644
26,64429],\\\"mapped\\\",[1726]],[[64430,64431],\\\"mapped\\\",[1746]],[[64432,64433],\\\"mapped\\\",[1747]],[[64434,64449],\\\"valid\\\",[],\\\"NV8\\\"],[[64450,64466],\\\"disallowed\\\"],[[64467,64470],\\\"mapped\\\",[1709]],[[64471,64472],\\\"mapped\\\",[1735]],[[64473,64474],\\\"mapped\\\",[1734]],[[64475,64476],\\\"mapped\\\",[1736]],[[64477,64477],\\\"mapped\\\",[1735,1652]],[[64478,64479],\\\"mapped\\\",[1739]],[[64480,64481],\\\"mapped\\\",[1733]],[[64482,64483],\\\"mapped\\\",[1737]],[[64484,64487],\\\"mapped\\\",[1744]],[[64488,64489],\\\"mapped\\\",[1609]],[[64490,64491],\\\"mapped\\\",[1574,1575]],[[64492,64493],\\\"mapped\\\",[1574,1749]],[[64494,64495],\\\"mapped\\\",[1574,1608]],[[64496,64497],\\\"mapped\\\",[1574,1735]],[[64498,64499],\\\"mapped\\\",[1574,1734]],[[64500,64501],\\\"mapped\\\",[1574,1736]],[[64502,64504],\\\"mapped\\\",[1574,1744]],[[64505,64507],\\\"mapped\\\",[1574,1609]],[[64508,64511],\\\"mapped\\\",[1740]],[[64512,64512],\\\"mapped\\\",[1574,1580]],[[64513,64513],\\\"mapped\\\",[1574,1581]],[[64514,64514],\\\"mapped\\\",[1574,1605]],[[64515,64515],\\\"mapped\\\",[1574,1609]],[[64516,64516],\\\"mapped\\\",[1574,1610]],[[64517,64517],\\\"mapped\\\",[1576,1580]],[[64518,64518],\\\"mapped\\\",[1576,1581]],[[64519,64519],\\\"mapped\\\",[1576,1582]],[[64520,64520],\\\"mapped\\\",[1576,1605]],[[64521,64521],\\\"mapped\\\",[1576,1609]],[[64522,64522],\\\"mapped\\\",[1576,1610]],[[64523,64523],\\\"mapped\\\",[1578,1580]],[[64524,64524],\\\"mapped\\\",[1578,1581]],[[64525,64525],\\\"mapped\\\",[1578,1582]],[[64526,64526],\\\"mapped\\\",[1578,1605]],[[64527,64527],\\\"mapped\\\",[1578,1609]],[[64528,64528],\\\"mapped\\\",[1578,1610]],[[64529,64529],\\\"mapped\\\",[1579,1580]],[[64530,64530],\\\"mapped\\\",[1579,1605]],[[64531,64531],\\\"mapped\\\",[1579,1609]],[[64532,64532],\\\"mapped\\\",[1579,1610]],[[64533,64533],\\\"mapped\\\",[1580,1581]],[[64534,64534],\\\"mapped\\\",[1580,1605]],[[64535,64535],\\\"mapped\\\",[1581,1580]],[[64536,6453
6],\\\"mapped\\\",[1581,1605]],[[64537,64537],\\\"mapped\\\",[1582,1580]],[[64538,64538],\\\"mapped\\\",[1582,1581]],[[64539,64539],\\\"mapped\\\",[1582,1605]],[[64540,64540],\\\"mapped\\\",[1587,1580]],[[64541,64541],\\\"mapped\\\",[1587,1581]],[[64542,64542],\\\"mapped\\\",[1587,1582]],[[64543,64543],\\\"mapped\\\",[1587,1605]],[[64544,64544],\\\"mapped\\\",[1589,1581]],[[64545,64545],\\\"mapped\\\",[1589,1605]],[[64546,64546],\\\"mapped\\\",[1590,1580]],[[64547,64547],\\\"mapped\\\",[1590,1581]],[[64548,64548],\\\"mapped\\\",[1590,1582]],[[64549,64549],\\\"mapped\\\",[1590,1605]],[[64550,64550],\\\"mapped\\\",[1591,1581]],[[64551,64551],\\\"mapped\\\",[1591,1605]],[[64552,64552],\\\"mapped\\\",[1592,1605]],[[64553,64553],\\\"mapped\\\",[1593,1580]],[[64554,64554],\\\"mapped\\\",[1593,1605]],[[64555,64555],\\\"mapped\\\",[1594,1580]],[[64556,64556],\\\"mapped\\\",[1594,1605]],[[64557,64557],\\\"mapped\\\",[1601,1580]],[[64558,64558],\\\"mapped\\\",[1601,1581]],[[64559,64559],\\\"mapped\\\",[1601,1582]],[[64560,64560],\\\"mapped\\\",[1601,1605]],[[64561,64561],\\\"mapped\\\",[1601,1609]],[[64562,64562],\\\"mapped\\\",[1601,1610]],[[64563,64563],\\\"mapped\\\",[1602,1581]],[[64564,64564],\\\"mapped\\\",[1602,1605]],[[64565,64565],\\\"mapped\\\",[1602,1609]],[[64566,64566],\\\"mapped\\\",[1602,1610]],[[64567,64567],\\\"mapped\\\",[1603,1575]],[[64568,64568],\\\"mapped\\\",[1603,1580]],[[64569,64569],\\\"mapped\\\",[1603,1581]],[[64570,64570],\\\"mapped\\\",[1603,1582]],[[64571,64571],\\\"mapped\\\",[1603,1604]],[[64572,64572],\\\"mapped\\\",[1603,1605]],[[64573,64573],\\\"mapped\\\",[1603,1609]],[[64574,64574],\\\"mapped\\\",[1603,1610]],[[64575,64575],\\\"mapped\\\",[1604,1580]],[[64576,64576],\\\"mapped\\\",[1604,1581]],[[64577,64577],\\\"mapped\\\",[1604,1582]],[[64578,64578],\\\"mapped\\\",[1604,1605]],[[64579,64579],\\\"mapped\\\",[1604,1609]],[[64580,64580],\\\"mapped\\\",[1604,1610]],[[64581,64581],\\\"mapped\\\",[1605,1580]],[[64582,64582],\\\"mapped\\\",[160
5,1581]],[[64583,64583],\\\"mapped\\\",[1605,1582]],[[64584,64584],\\\"mapped\\\",[1605,1605]],[[64585,64585],\\\"mapped\\\",[1605,1609]],[[64586,64586],\\\"mapped\\\",[1605,1610]],[[64587,64587],\\\"mapped\\\",[1606,1580]],[[64588,64588],\\\"mapped\\\",[1606,1581]],[[64589,64589],\\\"mapped\\\",[1606,1582]],[[64590,64590],\\\"mapped\\\",[1606,1605]],[[64591,64591],\\\"mapped\\\",[1606,1609]],[[64592,64592],\\\"mapped\\\",[1606,1610]],[[64593,64593],\\\"mapped\\\",[1607,1580]],[[64594,64594],\\\"mapped\\\",[1607,1605]],[[64595,64595],\\\"mapped\\\",[1607,1609]],[[64596,64596],\\\"mapped\\\",[1607,1610]],[[64597,64597],\\\"mapped\\\",[1610,1580]],[[64598,64598],\\\"mapped\\\",[1610,1581]],[[64599,64599],\\\"mapped\\\",[1610,1582]],[[64600,64600],\\\"mapped\\\",[1610,1605]],[[64601,64601],\\\"mapped\\\",[1610,1609]],[[64602,64602],\\\"mapped\\\",[1610,1610]],[[64603,64603],\\\"mapped\\\",[1584,1648]],[[64604,64604],\\\"mapped\\\",[1585,1648]],[[64605,64605],\\\"mapped\\\",[1609,1648]],[[64606,64606],\\\"disallowed_STD3_mapped\\\",[32,1612,1617]],[[64607,64607],\\\"disallowed_STD3_mapped\\\",[32,1613,1617]],[[64608,64608],\\\"disallowed_STD3_mapped\\\",[32,1614,1617]],[[64609,64609],\\\"disallowed_STD3_mapped\\\",[32,1615,1617]],[[64610,64610],\\\"disallowed_STD3_mapped\\\",[32,1616,1617]],[[64611,64611],\\\"disallowed_STD3_mapped\\\",[32,1617,1648]],[[64612,64612],\\\"mapped\\\",[1574,1585]],[[64613,64613],\\\"mapped\\\",[1574,1586]],[[64614,64614],\\\"mapped\\\",[1574,1605]],[[64615,64615],\\\"mapped\\\",[1574,1606]],[[64616,64616],\\\"mapped\\\",[1574,1609]],[[64617,64617],\\\"mapped\\\",[1574,1610]],[[64618,64618],\\\"mapped\\\",[1576,1585]],[[64619,64619],\\\"mapped\\\",[1576,1586]],[[64620,64620],\\\"mapped\\\",[1576,1605]],[[64621,64621],\\\"mapped\\\",[1576,1606]],[[64622,64622],\\\"mapped\\\",[1576,1609]],[[64623,64623],\\\"mapped\\\",[1576,1610]],[[64624,64624],\\\"mapped\\\",[1578,1585]],[[64625,64625],\\\"mapped\\\",[1578,1586]],[[64626,64626],\\\"mapped\\\
",[1578,1605]],[[64627,64627],\\\"mapped\\\",[1578,1606]],[[64628,64628],\\\"mapped\\\",[1578,1609]],[[64629,64629],\\\"mapped\\\",[1578,1610]],[[64630,64630],\\\"mapped\\\",[1579,1585]],[[64631,64631],\\\"mapped\\\",[1579,1586]],[[64632,64632],\\\"mapped\\\",[1579,1605]],[[64633,64633],\\\"mapped\\\",[1579,1606]],[[64634,64634],\\\"mapped\\\",[1579,1609]],[[64635,64635],\\\"mapped\\\",[1579,1610]],[[64636,64636],\\\"mapped\\\",[1601,1609]],[[64637,64637],\\\"mapped\\\",[1601,1610]],[[64638,64638],\\\"mapped\\\",[1602,1609]],[[64639,64639],\\\"mapped\\\",[1602,1610]],[[64640,64640],\\\"mapped\\\",[1603,1575]],[[64641,64641],\\\"mapped\\\",[1603,1604]],[[64642,64642],\\\"mapped\\\",[1603,1605]],[[64643,64643],\\\"mapped\\\",[1603,1609]],[[64644,64644],\\\"mapped\\\",[1603,1610]],[[64645,64645],\\\"mapped\\\",[1604,1605]],[[64646,64646],\\\"mapped\\\",[1604,1609]],[[64647,64647],\\\"mapped\\\",[1604,1610]],[[64648,64648],\\\"mapped\\\",[1605,1575]],[[64649,64649],\\\"mapped\\\",[1605,1605]],[[64650,64650],\\\"mapped\\\",[1606,1585]],[[64651,64651],\\\"mapped\\\",[1606,1586]],[[64652,64652],\\\"mapped\\\",[1606,1605]],[[64653,64653],\\\"mapped\\\",[1606,1606]],[[64654,64654],\\\"mapped\\\",[1606,1609]],[[64655,64655],\\\"mapped\\\",[1606,1610]],[[64656,64656],\\\"mapped\\\",[1609,1648]],[[64657,64657],\\\"mapped\\\",[1610,1585]],[[64658,64658],\\\"mapped\\\",[1610,1586]],[[64659,64659],\\\"mapped\\\",[1610,1605]],[[64660,64660],\\\"mapped\\\",[1610,1606]],[[64661,64661],\\\"mapped\\\",[1610,1609]],[[64662,64662],\\\"mapped\\\",[1610,1610]],[[64663,64663],\\\"mapped\\\",[1574,1580]],[[64664,64664],\\\"mapped\\\",[1574,1581]],[[64665,64665],\\\"mapped\\\",[1574,1582]],[[64666,64666],\\\"mapped\\\",[1574,1605]],[[64667,64667],\\\"mapped\\\",[1574,1607]],[[64668,64668],\\\"mapped\\\",[1576,1580]],[[64669,64669],\\\"mapped\\\",[1576,1581]],[[64670,64670],\\\"mapped\\\",[1576,1582]],[[64671,64671],\\\"mapped\\\",[1576,1605]],[[64672,64672],\\\"mapped\\\",[1576,1607]],[[64673
,64673],\\\"mapped\\\",[1578,1580]],[[64674,64674],\\\"mapped\\\",[1578,1581]],[[64675,64675],\\\"mapped\\\",[1578,1582]],[[64676,64676],\\\"mapped\\\",[1578,1605]],[[64677,64677],\\\"mapped\\\",[1578,1607]],[[64678,64678],\\\"mapped\\\",[1579,1605]],[[64679,64679],\\\"mapped\\\",[1580,1581]],[[64680,64680],\\\"mapped\\\",[1580,1605]],[[64681,64681],\\\"mapped\\\",[1581,1580]],[[64682,64682],\\\"mapped\\\",[1581,1605]],[[64683,64683],\\\"mapped\\\",[1582,1580]],[[64684,64684],\\\"mapped\\\",[1582,1605]],[[64685,64685],\\\"mapped\\\",[1587,1580]],[[64686,64686],\\\"mapped\\\",[1587,1581]],[[64687,64687],\\\"mapped\\\",[1587,1582]],[[64688,64688],\\\"mapped\\\",[1587,1605]],[[64689,64689],\\\"mapped\\\",[1589,1581]],[[64690,64690],\\\"mapped\\\",[1589,1582]],[[64691,64691],\\\"mapped\\\",[1589,1605]],[[64692,64692],\\\"mapped\\\",[1590,1580]],[[64693,64693],\\\"mapped\\\",[1590,1581]],[[64694,64694],\\\"mapped\\\",[1590,1582]],[[64695,64695],\\\"mapped\\\",[1590,1605]],[[64696,64696],\\\"mapped\\\",[1591,1581]],[[64697,64697],\\\"mapped\\\",[1592,1605]],[[64698,64698],\\\"mapped\\\",[1593,1580]],[[64699,64699],\\\"mapped\\\",[1593,1605]],[[64700,64700],\\\"mapped\\\",[1594,1580]],[[64701,64701],\\\"mapped\\\",[1594,1605]],[[64702,64702],\\\"mapped\\\",[1601,1580]],[[64703,64703],\\\"mapped\\\",[1601,1581]],[[64704,64704],\\\"mapped\\\",[1601,1582]],[[64705,64705],\\\"mapped\\\",[1601,1605]],[[64706,64706],\\\"mapped\\\",[1602,1581]],[[64707,64707],\\\"mapped\\\",[1602,1605]],[[64708,64708],\\\"mapped\\\",[1603,1580]],[[64709,64709],\\\"mapped\\\",[1603,1581]],[[64710,64710],\\\"mapped\\\",[1603,1582]],[[64711,64711],\\\"mapped\\\",[1603,1604]],[[64712,64712],\\\"mapped\\\",[1603,1605]],[[64713,64713],\\\"mapped\\\",[1604,1580]],[[64714,64714],\\\"mapped\\\",[1604,1581]],[[64715,64715],\\\"mapped\\\",[1604,1582]],[[64716,64716],\\\"mapped\\\",[1604,1605]],[[64717,64717],\\\"mapped\\\",[1604,1607]],[[64718,64718],\\\"mapped\\\",[1605,1580]],[[64719,64719],\\\"mapped\\\"
,[1605,1581]],[[64720,64720],\\\"mapped\\\",[1605,1582]],[[64721,64721],\\\"mapped\\\",[1605,1605]],[[64722,64722],\\\"mapped\\\",[1606,1580]],[[64723,64723],\\\"mapped\\\",[1606,1581]],[[64724,64724],\\\"mapped\\\",[1606,1582]],[[64725,64725],\\\"mapped\\\",[1606,1605]],[[64726,64726],\\\"mapped\\\",[1606,1607]],[[64727,64727],\\\"mapped\\\",[1607,1580]],[[64728,64728],\\\"mapped\\\",[1607,1605]],[[64729,64729],\\\"mapped\\\",[1607,1648]],[[64730,64730],\\\"mapped\\\",[1610,1580]],[[64731,64731],\\\"mapped\\\",[1610,1581]],[[64732,64732],\\\"mapped\\\",[1610,1582]],[[64733,64733],\\\"mapped\\\",[1610,1605]],[[64734,64734],\\\"mapped\\\",[1610,1607]],[[64735,64735],\\\"mapped\\\",[1574,1605]],[[64736,64736],\\\"mapped\\\",[1574,1607]],[[64737,64737],\\\"mapped\\\",[1576,1605]],[[64738,64738],\\\"mapped\\\",[1576,1607]],[[64739,64739],\\\"mapped\\\",[1578,1605]],[[64740,64740],\\\"mapped\\\",[1578,1607]],[[64741,64741],\\\"mapped\\\",[1579,1605]],[[64742,64742],\\\"mapped\\\",[1579,1607]],[[64743,64743],\\\"mapped\\\",[1587,1605]],[[64744,64744],\\\"mapped\\\",[1587,1607]],[[64745,64745],\\\"mapped\\\",[1588,1605]],[[64746,64746],\\\"mapped\\\",[1588,1607]],[[64747,64747],\\\"mapped\\\",[1603,1604]],[[64748,64748],\\\"mapped\\\",[1603,1605]],[[64749,64749],\\\"mapped\\\",[1604,1605]],[[64750,64750],\\\"mapped\\\",[1606,1605]],[[64751,64751],\\\"mapped\\\",[1606,1607]],[[64752,64752],\\\"mapped\\\",[1610,1605]],[[64753,64753],\\\"mapped\\\",[1610,1607]],[[64754,64754],\\\"mapped\\\",[1600,1614,1617]],[[64755,64755],\\\"mapped\\\",[1600,1615,1617]],[[64756,64756],\\\"mapped\\\",[1600,1616,1617]],[[64757,64757],\\\"mapped\\\",[1591,1609]],[[64758,64758],\\\"mapped\\\",[1591,1610]],[[64759,64759],\\\"mapped\\\",[1593,1609]],[[64760,64760],\\\"mapped\\\",[1593,1610]],[[64761,64761],\\\"mapped\\\",[1594,1609]],[[64762,64762],\\\"mapped\\\",[1594,1610]],[[64763,64763],\\\"mapped\\\",[1587,1609]],[[64764,64764],\\\"mapped\\\",[1587,1610]],[[64765,64765],\\\"mapped\\\",[1588,
1609]],[[64766,64766],\\\"mapped\\\",[1588,1610]],[[64767,64767],\\\"mapped\\\",[1581,1609]],[[64768,64768],\\\"mapped\\\",[1581,1610]],[[64769,64769],\\\"mapped\\\",[1580,1609]],[[64770,64770],\\\"mapped\\\",[1580,1610]],[[64771,64771],\\\"mapped\\\",[1582,1609]],[[64772,64772],\\\"mapped\\\",[1582,1610]],[[64773,64773],\\\"mapped\\\",[1589,1609]],[[64774,64774],\\\"mapped\\\",[1589,1610]],[[64775,64775],\\\"mapped\\\",[1590,1609]],[[64776,64776],\\\"mapped\\\",[1590,1610]],[[64777,64777],\\\"mapped\\\",[1588,1580]],[[64778,64778],\\\"mapped\\\",[1588,1581]],[[64779,64779],\\\"mapped\\\",[1588,1582]],[[64780,64780],\\\"mapped\\\",[1588,1605]],[[64781,64781],\\\"mapped\\\",[1588,1585]],[[64782,64782],\\\"mapped\\\",[1587,1585]],[[64783,64783],\\\"mapped\\\",[1589,1585]],[[64784,64784],\\\"mapped\\\",[1590,1585]],[[64785,64785],\\\"mapped\\\",[1591,1609]],[[64786,64786],\\\"mapped\\\",[1591,1610]],[[64787,64787],\\\"mapped\\\",[1593,1609]],[[64788,64788],\\\"mapped\\\",[1593,1610]],[[64789,64789],\\\"mapped\\\",[1594,1609]],[[64790,64790],\\\"mapped\\\",[1594,1610]],[[64791,64791],\\\"mapped\\\",[1587,1609]],[[64792,64792],\\\"mapped\\\",[1587,1610]],[[64793,64793],\\\"mapped\\\",[1588,1609]],[[64794,64794],\\\"mapped\\\",[1588,1610]],[[64795,64795],\\\"mapped\\\",[1581,1609]],[[64796,64796],\\\"mapped\\\",[1581,1610]],[[64797,64797],\\\"mapped\\\",[1580,1609]],[[64798,64798],\\\"mapped\\\",[1580,1610]],[[64799,64799],\\\"mapped\\\",[1582,1609]],[[64800,64800],\\\"mapped\\\",[1582,1610]],[[64801,64801],\\\"mapped\\\",[1589,1609]],[[64802,64802],\\\"mapped\\\",[1589,1610]],[[64803,64803],\\\"mapped\\\",[1590,1609]],[[64804,64804],\\\"mapped\\\",[1590,1610]],[[64805,64805],\\\"mapped\\\",[1588,1580]],[[64806,64806],\\\"mapped\\\",[1588,1581]],[[64807,64807],\\\"mapped\\\",[1588,1582]],[[64808,64808],\\\"mapped\\\",[1588,1605]],[[64809,64809],\\\"mapped\\\",[1588,1585]],[[64810,64810],\\\"mapped\\\",[1587,1585]],[[64811,64811],\\\"mapped\\\",[1589,1585]],[[64812,64812],
\\\"mapped\\\",[1590,1585]],[[64813,64813],\\\"mapped\\\",[1588,1580]],[[64814,64814],\\\"mapped\\\",[1588,1581]],[[64815,64815],\\\"mapped\\\",[1588,1582]],[[64816,64816],\\\"mapped\\\",[1588,1605]],[[64817,64817],\\\"mapped\\\",[1587,1607]],[[64818,64818],\\\"mapped\\\",[1588,1607]],[[64819,64819],\\\"mapped\\\",[1591,1605]],[[64820,64820],\\\"mapped\\\",[1587,1580]],[[64821,64821],\\\"mapped\\\",[1587,1581]],[[64822,64822],\\\"mapped\\\",[1587,1582]],[[64823,64823],\\\"mapped\\\",[1588,1580]],[[64824,64824],\\\"mapped\\\",[1588,1581]],[[64825,64825],\\\"mapped\\\",[1588,1582]],[[64826,64826],\\\"mapped\\\",[1591,1605]],[[64827,64827],\\\"mapped\\\",[1592,1605]],[[64828,64829],\\\"mapped\\\",[1575,1611]],[[64830,64831],\\\"valid\\\",[],\\\"NV8\\\"],[[64832,64847],\\\"disallowed\\\"],[[64848,64848],\\\"mapped\\\",[1578,1580,1605]],[[64849,64850],\\\"mapped\\\",[1578,1581,1580]],[[64851,64851],\\\"mapped\\\",[1578,1581,1605]],[[64852,64852],\\\"mapped\\\",[1578,1582,1605]],[[64853,64853],\\\"mapped\\\",[1578,1605,1580]],[[64854,64854],\\\"mapped\\\",[1578,1605,1581]],[[64855,64855],\\\"mapped\\\",[1578,1605,1582]],[[64856,64857],\\\"mapped\\\",[1580,1605,1581]],[[64858,64858],\\\"mapped\\\",[1581,1605,1610]],[[64859,64859],\\\"mapped\\\",[1581,1605,1609]],[[64860,64860],\\\"mapped\\\",[1587,1581,1580]],[[64861,64861],\\\"mapped\\\",[1587,1580,1581]],[[64862,64862],\\\"mapped\\\",[1587,1580,1609]],[[64863,64864],\\\"mapped\\\",[1587,1605,1581]],[[64865,64865],\\\"mapped\\\",[1587,1605,1580]],[[64866,64867],\\\"mapped\\\",[1587,1605,1605]],[[64868,64869],\\\"mapped\\\",[1589,1581,1581]],[[64870,64870],\\\"mapped\\\",[1589,1605,1605]],[[64871,64872],\\\"mapped\\\",[1588,1581,1605]],[[64873,64873],\\\"mapped\\\",[1588,1580,1610]],[[64874,64875],\\\"mapped\\\",[1588,1605,1582]],[[64876,64877],\\\"mapped\\\",[1588,1605,1605]],[[64878,64878],\\\"mapped\\\",[1590,1581,1609]],[[64879,64880],\\\"mapped\\\",[1590,1582,1605]],[[64881,64882],\\\"mapped\\\",[1591,1605,1581]],[[64
883,64883],\\\"mapped\\\",[1591,1605,1605]],[[64884,64884],\\\"mapped\\\",[1591,1605,1610]],[[64885,64885],\\\"mapped\\\",[1593,1580,1605]],[[64886,64887],\\\"mapped\\\",[1593,1605,1605]],[[64888,64888],\\\"mapped\\\",[1593,1605,1609]],[[64889,64889],\\\"mapped\\\",[1594,1605,1605]],[[64890,64890],\\\"mapped\\\",[1594,1605,1610]],[[64891,64891],\\\"mapped\\\",[1594,1605,1609]],[[64892,64893],\\\"mapped\\\",[1601,1582,1605]],[[64894,64894],\\\"mapped\\\",[1602,1605,1581]],[[64895,64895],\\\"mapped\\\",[1602,1605,1605]],[[64896,64896],\\\"mapped\\\",[1604,1581,1605]],[[64897,64897],\\\"mapped\\\",[1604,1581,1610]],[[64898,64898],\\\"mapped\\\",[1604,1581,1609]],[[64899,64900],\\\"mapped\\\",[1604,1580,1580]],[[64901,64902],\\\"mapped\\\",[1604,1582,1605]],[[64903,64904],\\\"mapped\\\",[1604,1605,1581]],[[64905,64905],\\\"mapped\\\",[1605,1581,1580]],[[64906,64906],\\\"mapped\\\",[1605,1581,1605]],[[64907,64907],\\\"mapped\\\",[1605,1581,1610]],[[64908,64908],\\\"mapped\\\",[1605,1580,1581]],[[64909,64909],\\\"mapped\\\",[1605,1580,1605]],[[64910,64910],\\\"mapped\\\",[1605,1582,1580]],[[64911,64911],\\\"mapped\\\",[1605,1582,1605]],[[64912,64913],\\\"disallowed\\\"],[[64914,64914],\\\"mapped\\\",[1605,1580,1582]],[[64915,64915],\\\"mapped\\\",[1607,1605,1580]],[[64916,64916],\\\"mapped\\\",[1607,1605,1605]],[[64917,64917],\\\"mapped\\\",[1606,1581,1605]],[[64918,64918],\\\"mapped\\\",[1606,1581,1609]],[[64919,64920],\\\"mapped\\\",[1606,1580,1605]],[[64921,64921],\\\"mapped\\\",[1606,1580,1609]],[[64922,64922],\\\"mapped\\\",[1606,1605,1610]],[[64923,64923],\\\"mapped\\\",[1606,1605,1609]],[[64924,64925],\\\"mapped\\\",[1610,1605,1605]],[[64926,64926],\\\"mapped\\\",[1576,1582,1610]],[[64927,64927],\\\"mapped\\\",[1578,1580,1610]],[[64928,64928],\\\"mapped\\\",[1578,1580,1609]],[[64929,64929],\\\"mapped\\\",[1578,1582,1610]],[[64930,64930],\\\"mapped\\\",[1578,1582,1609]],[[64931,64931],\\\"mapped\\\",[1578,1605,1610]],[[64932,64932],\\\"mapped\\\",[1578,1605,1609]],[
[64933,64933],\\\"mapped\\\",[1580,1605,1610]],[[64934,64934],\\\"mapped\\\",[1580,1581,1609]],[[64935,64935],\\\"mapped\\\",[1580,1605,1609]],[[64936,64936],\\\"mapped\\\",[1587,1582,1609]],[[64937,64937],\\\"mapped\\\",[1589,1581,1610]],[[64938,64938],\\\"mapped\\\",[1588,1581,1610]],[[64939,64939],\\\"mapped\\\",[1590,1581,1610]],[[64940,64940],\\\"mapped\\\",[1604,1580,1610]],[[64941,64941],\\\"mapped\\\",[1604,1605,1610]],[[64942,64942],\\\"mapped\\\",[1610,1581,1610]],[[64943,64943],\\\"mapped\\\",[1610,1580,1610]],[[64944,64944],\\\"mapped\\\",[1610,1605,1610]],[[64945,64945],\\\"mapped\\\",[1605,1605,1610]],[[64946,64946],\\\"mapped\\\",[1602,1605,1610]],[[64947,64947],\\\"mapped\\\",[1606,1581,1610]],[[64948,64948],\\\"mapped\\\",[1602,1605,1581]],[[64949,64949],\\\"mapped\\\",[1604,1581,1605]],[[64950,64950],\\\"mapped\\\",[1593,1605,1610]],[[64951,64951],\\\"mapped\\\",[1603,1605,1610]],[[64952,64952],\\\"mapped\\\",[1606,1580,1581]],[[64953,64953],\\\"mapped\\\",[1605,1582,1610]],[[64954,64954],\\\"mapped\\\",[1604,1580,1605]],[[64955,64955],\\\"mapped\\\",[1603,1605,1605]],[[64956,64956],\\\"mapped\\\",[1604,1580,1605]],[[64957,64957],\\\"mapped\\\",[1606,1580,1581]],[[64958,64958],\\\"mapped\\\",[1580,1581,1610]],[[64959,64959],\\\"mapped\\\",[1581,1580,1610]],[[64960,64960],\\\"mapped\\\",[1605,1580,1610]],[[64961,64961],\\\"mapped\\\",[1601,1605,1610]],[[64962,64962],\\\"mapped\\\",[1576,1581,1610]],[[64963,64963],\\\"mapped\\\",[1603,1605,1605]],[[64964,64964],\\\"mapped\\\",[1593,1580,1605]],[[64965,64965],\\\"mapped\\\",[1589,1605,1605]],[[64966,64966],\\\"mapped\\\",[1587,1582,1610]],[[64967,64967],\\\"mapped\\\",[1606,1580,1610]],[[64968,64975],\\\"disallowed\\\"],[[64976,65007],\\\"disallowed\\\"],[[65008,65008],\\\"mapped\\\",[1589,1604,1746]],[[65009,65009],\\\"mapped\\\",[1602,1604,1746]],[[65010,65010],\\\"mapped\\\",[1575,1604,1604,1607]],[[65011,65011],\\\"mapped\\\",[1575,1603,1576,1585]],[[65012,65012],\\\"mapped\\\",[1605,1581,1605,158
3]],[[65013,65013],\\\"mapped\\\",[1589,1604,1593,1605]],[[65014,65014],\\\"mapped\\\",[1585,1587,1608,1604]],[[65015,65015],\\\"mapped\\\",[1593,1604,1610,1607]],[[65016,65016],\\\"mapped\\\",[1608,1587,1604,1605]],[[65017,65017],\\\"mapped\\\",[1589,1604,1609]],[[65018,65018],\\\"disallowed_STD3_mapped\\\",[1589,1604,1609,32,1575,1604,1604,1607,32,1593,1604,1610,1607,32,1608,1587,1604,1605]],[[65019,65019],\\\"disallowed_STD3_mapped\\\",[1580,1604,32,1580,1604,1575,1604,1607]],[[65020,65020],\\\"mapped\\\",[1585,1740,1575,1604]],[[65021,65021],\\\"valid\\\",[],\\\"NV8\\\"],[[65022,65023],\\\"disallowed\\\"],[[65024,65039],\\\"ignored\\\"],[[65040,65040],\\\"disallowed_STD3_mapped\\\",[44]],[[65041,65041],\\\"mapped\\\",[12289]],[[65042,65042],\\\"disallowed\\\"],[[65043,65043],\\\"disallowed_STD3_mapped\\\",[58]],[[65044,65044],\\\"disallowed_STD3_mapped\\\",[59]],[[65045,65045],\\\"disallowed_STD3_mapped\\\",[33]],[[65046,65046],\\\"disallowed_STD3_mapped\\\",[63]],[[65047,65047],\\\"mapped\\\",[12310]],[[65048,65048],\\\"mapped\\\",[12311]],[[65049,65049],\\\"disallowed\\\"],[[65050,65055],\\\"disallowed\\\"],[[65056,65059],\\\"valid\\\"],[[65060,65062],\\\"valid\\\"],[[65063,65069],\\\"valid\\\"],[[65070,65071],\\\"valid\\\"],[[65072,65072],\\\"disallowed\\\"],[[65073,65073],\\\"mapped\\\",[8212]],[[65074,65074],\\\"mapped\\\",[8211]],[[65075,65076],\\\"disallowed_STD3_mapped\\\",[95]],[[65077,65077],\\\"disallowed_STD3_mapped\\\",[40]],[[65078,65078],\\\"disallowed_STD3_mapped\\\",[41]],[[65079,65079],\\\"disallowed_STD3_mapped\\\",[123]],[[65080,65080],\\\"disallowed_STD3_mapped\\\",[125]],[[65081,65081],\\\"mapped\\\",[12308]],[[65082,65082],\\\"mapped\\\",[12309]],[[65083,65083],\\\"mapped\\\",[12304]],[[65084,65084],\\\"mapped\\\",[12305]],[[65085,65085],\\\"mapped\\\",[12298]],[[65086,65086],\\\"mapped\\\",[12299]],[[65087,65087],\\\"mapped\\\",[12296]],[[65088,65088],\\\"mapped\\\",[12297]],[[65089,65089],\\\"mapped\\\",[12300]],[[65090,65090],\\\"mapped
\\\",[12301]],[[65091,65091],\\\"mapped\\\",[12302]],[[65092,65092],\\\"mapped\\\",[12303]],[[65093,65094],\\\"valid\\\",[],\\\"NV8\\\"],[[65095,65095],\\\"disallowed_STD3_mapped\\\",[91]],[[65096,65096],\\\"disallowed_STD3_mapped\\\",[93]],[[65097,65100],\\\"disallowed_STD3_mapped\\\",[32,773]],[[65101,65103],\\\"disallowed_STD3_mapped\\\",[95]],[[65104,65104],\\\"disallowed_STD3_mapped\\\",[44]],[[65105,65105],\\\"mapped\\\",[12289]],[[65106,65106],\\\"disallowed\\\"],[[65107,65107],\\\"disallowed\\\"],[[65108,65108],\\\"disallowed_STD3_mapped\\\",[59]],[[65109,65109],\\\"disallowed_STD3_mapped\\\",[58]],[[65110,65110],\\\"disallowed_STD3_mapped\\\",[63]],[[65111,65111],\\\"disallowed_STD3_mapped\\\",[33]],[[65112,65112],\\\"mapped\\\",[8212]],[[65113,65113],\\\"disallowed_STD3_mapped\\\",[40]],[[65114,65114],\\\"disallowed_STD3_mapped\\\",[41]],[[65115,65115],\\\"disallowed_STD3_mapped\\\",[123]],[[65116,65116],\\\"disallowed_STD3_mapped\\\",[125]],[[65117,65117],\\\"mapped\\\",[12308]],[[65118,65118],\\\"mapped\\\",[12309]],[[65119,65119],\\\"disallowed_STD3_mapped\\\",[35]],[[65120,65120],\\\"disallowed_STD3_mapped\\\",[38]],[[65121,65121],\\\"disallowed_STD3_mapped\\\",[42]],[[65122,65122],\\\"disallowed_STD3_mapped\\\",[43]],[[65123,65123],\\\"mapped\\\",[45]],[[65124,65124],\\\"disallowed_STD3_mapped\\\",[60]],[[65125,65125],\\\"disallowed_STD3_mapped\\\",[62]],[[65126,65126],\\\"disallowed_STD3_mapped\\\",[61]],[[65127,65127],\\\"disallowed\\\"],[[65128,65128],\\\"disallowed_STD3_mapped\\\",[92]],[[65129,65129],\\\"disallowed_STD3_mapped\\\",[36]],[[65130,65130],\\\"disallowed_STD3_mapped\\\",[37]],[[65131,65131],\\\"disallowed_STD3_mapped\\\",[64]],[[65132,65135],\\\"disallowed\\\"],[[65136,65136],\\\"disallowed_STD3_mapped\\\",[32,1611]],[[65137,65137],\\\"mapped\\\",[1600,1611]],[[65138,65138],\\\"disallowed_STD3_mapped\\\",[32,1612]],[[65139,65139],\\\"valid\\\"],[[65140,65140],\\\"disallowed_STD3_mapped\\\",[32,1613]],[[65141,65141],\\\"disallowed\\\"]
,[[65142,65142],\\\"disallowed_STD3_mapped\\\",[32,1614]],[[65143,65143],\\\"mapped\\\",[1600,1614]],[[65144,65144],\\\"disallowed_STD3_mapped\\\",[32,1615]],[[65145,65145],\\\"mapped\\\",[1600,1615]],[[65146,65146],\\\"disallowed_STD3_mapped\\\",[32,1616]],[[65147,65147],\\\"mapped\\\",[1600,1616]],[[65148,65148],\\\"disallowed_STD3_mapped\\\",[32,1617]],[[65149,65149],\\\"mapped\\\",[1600,1617]],[[65150,65150],\\\"disallowed_STD3_mapped\\\",[32,1618]],[[65151,65151],\\\"mapped\\\",[1600,1618]],[[65152,65152],\\\"mapped\\\",[1569]],[[65153,65154],\\\"mapped\\\",[1570]],[[65155,65156],\\\"mapped\\\",[1571]],[[65157,65158],\\\"mapped\\\",[1572]],[[65159,65160],\\\"mapped\\\",[1573]],[[65161,65164],\\\"mapped\\\",[1574]],[[65165,65166],\\\"mapped\\\",[1575]],[[65167,65170],\\\"mapped\\\",[1576]],[[65171,65172],\\\"mapped\\\",[1577]],[[65173,65176],\\\"mapped\\\",[1578]],[[65177,65180],\\\"mapped\\\",[1579]],[[65181,65184],\\\"mapped\\\",[1580]],[[65185,65188],\\\"mapped\\\",[1581]],[[65189,65192],\\\"mapped\\\",[1582]],[[65193,65194],\\\"mapped\\\",[1583]],[[65195,65196],\\\"mapped\\\",[1584]],[[65197,65198],\\\"mapped\\\",[1585]],[[65199,65200],\\\"mapped\\\",[1586]],[[65201,65204],\\\"mapped\\\",[1587]],[[65205,65208],\\\"mapped\\\",[1588]],[[65209,65212],\\\"mapped\\\",[1589]],[[65213,65216],\\\"mapped\\\",[1590]],[[65217,65220],\\\"mapped\\\",[1591]],[[65221,65224],\\\"mapped\\\",[1592]],[[65225,65228],\\\"mapped\\\",[1593]],[[65229,65232],\\\"mapped\\\",[1594]],[[65233,65236],\\\"mapped\\\",[1601]],[[65237,65240],\\\"mapped\\\",[1602]],[[65241,65244],\\\"mapped\\\",[1603]],[[65245,65248],\\\"mapped\\\",[1604]],[[65249,65252],\\\"mapped\\\",[1605]],[[65253,65256],\\\"mapped\\\",[1606]],[[65257,65260],\\\"mapped\\\",[1607]],[[65261,65262],\\\"mapped\\\",[1608]],[[65263,65264],\\\"mapped\\\",[1609]],[[65265,65268],\\\"mapped\\\",[1610]],[[65269,65270],\\\"mapped\\\",[1604,1570]],[[65271,65272],\\\"mapped\\\",[1604,1571]],[[65273,65274],\\\"mapped\\\",[1604,1573]],[[
65275,65276],\\\"mapped\\\",[1604,1575]],[[65277,65278],\\\"disallowed\\\"],[[65279,65279],\\\"ignored\\\"],[[65280,65280],\\\"disallowed\\\"],[[65281,65281],\\\"disallowed_STD3_mapped\\\",[33]],[[65282,65282],\\\"disallowed_STD3_mapped\\\",[34]],[[65283,65283],\\\"disallowed_STD3_mapped\\\",[35]],[[65284,65284],\\\"disallowed_STD3_mapped\\\",[36]],[[65285,65285],\\\"disallowed_STD3_mapped\\\",[37]],[[65286,65286],\\\"disallowed_STD3_mapped\\\",[38]],[[65287,65287],\\\"disallowed_STD3_mapped\\\",[39]],[[65288,65288],\\\"disallowed_STD3_mapped\\\",[40]],[[65289,65289],\\\"disallowed_STD3_mapped\\\",[41]],[[65290,65290],\\\"disallowed_STD3_mapped\\\",[42]],[[65291,65291],\\\"disallowed_STD3_mapped\\\",[43]],[[65292,65292],\\\"disallowed_STD3_mapped\\\",[44]],[[65293,65293],\\\"mapped\\\",[45]],[[65294,65294],\\\"mapped\\\",[46]],[[65295,65295],\\\"disallowed_STD3_mapped\\\",[47]],[[65296,65296],\\\"mapped\\\",[48]],[[65297,65297],\\\"mapped\\\",[49]],[[65298,65298],\\\"mapped\\\",[50]],[[65299,65299],\\\"mapped\\\",[51]],[[65300,65300],\\\"mapped\\\",[52]],[[65301,65301],\\\"mapped\\\",[53]],[[65302,65302],\\\"mapped\\\",[54]],[[65303,65303],\\\"mapped\\\",[55]],[[65304,65304],\\\"mapped\\\",[56]],[[65305,65305],\\\"mapped\\\",[57]],[[65306,65306],\\\"disallowed_STD3_mapped\\\",[58]],[[65307,65307],\\\"disallowed_STD3_mapped\\\",[59]],[[65308,65308],\\\"disallowed_STD3_mapped\\\",[60]],[[65309,65309],\\\"disallowed_STD3_mapped\\\",[61]],[[65310,65310],\\\"disallowed_STD3_mapped\\\",[62]],[[65311,65311],\\\"disallowed_STD3_mapped\\\",[63]],[[65312,65312],\\\"disallowed_STD3_mapped\\\",[64]],[[65313,65313],\\\"mapped\\\",[97]],[[65314,65314],\\\"mapped\\\",[98]],[[65315,65315],\\\"mapped\\\",[99]],[[65316,65316],\\\"mapped\\\",[100]],[[65317,65317],\\\"mapped\\\",[101]],[[65318,65318],\\\"mapped\\\",[102]],[[65319,65319],\\\"mapped\\\",[103]],[[65320,65320],\\\"mapped\\\",[104]],[[65321,65321],\\\"mapped\\\",[105]],[[65322,65322],\\\"mapped\\\",[106]],[[65323,65323],\\\
"mapped\\\",[107]],[[65324,65324],\\\"mapped\\\",[108]],[[65325,65325],\\\"mapped\\\",[109]],[[65326,65326],\\\"mapped\\\",[110]],[[65327,65327],\\\"mapped\\\",[111]],[[65328,65328],\\\"mapped\\\",[112]],[[65329,65329],\\\"mapped\\\",[113]],[[65330,65330],\\\"mapped\\\",[114]],[[65331,65331],\\\"mapped\\\",[115]],[[65332,65332],\\\"mapped\\\",[116]],[[65333,65333],\\\"mapped\\\",[117]],[[65334,65334],\\\"mapped\\\",[118]],[[65335,65335],\\\"mapped\\\",[119]],[[65336,65336],\\\"mapped\\\",[120]],[[65337,65337],\\\"mapped\\\",[121]],[[65338,65338],\\\"mapped\\\",[122]],[[65339,65339],\\\"disallowed_STD3_mapped\\\",[91]],[[65340,65340],\\\"disallowed_STD3_mapped\\\",[92]],[[65341,65341],\\\"disallowed_STD3_mapped\\\",[93]],[[65342,65342],\\\"disallowed_STD3_mapped\\\",[94]],[[65343,65343],\\\"disallowed_STD3_mapped\\\",[95]],[[65344,65344],\\\"disallowed_STD3_mapped\\\",[96]],[[65345,65345],\\\"mapped\\\",[97]],[[65346,65346],\\\"mapped\\\",[98]],[[65347,65347],\\\"mapped\\\",[99]],[[65348,65348],\\\"mapped\\\",[100]],[[65349,65349],\\\"mapped\\\",[101]],[[65350,65350],\\\"mapped\\\",[102]],[[65351,65351],\\\"mapped\\\",[103]],[[65352,65352],\\\"mapped\\\",[104]],[[65353,65353],\\\"mapped\\\",[105]],[[65354,65354],\\\"mapped\\\",[106]],[[65355,65355],\\\"mapped\\\",[107]],[[65356,65356],\\\"mapped\\\",[108]],[[65357,65357],\\\"mapped\\\",[109]],[[65358,65358],\\\"mapped\\\",[110]],[[65359,65359],\\\"mapped\\\",[111]],[[65360,65360],\\\"mapped\\\",[112]],[[65361,65361],\\\"mapped\\\",[113]],[[65362,65362],\\\"mapped\\\",[114]],[[65363,65363],\\\"mapped\\\",[115]],[[65364,65364],\\\"mapped\\\",[116]],[[65365,65365],\\\"mapped\\\",[117]],[[65366,65366],\\\"mapped\\\",[118]],[[65367,65367],\\\"mapped\\\",[119]],[[65368,65368],\\\"mapped\\\",[120]],[[65369,65369],\\\"mapped\\\",[121]],[[65370,65370],\\\"mapped\\\",[122]],[[65371,65371],\\\"disallowed_STD3_mapped\\\",[123]],[[65372,65372],\\\"disallowed_STD3_mapped\\\",[124]],[[65373,65373],\\\"disallowed_STD3_mapped\\\",[12
5]],[[65374,65374],\\\"disallowed_STD3_mapped\\\",[126]],[[65375,65375],\\\"mapped\\\",[10629]],[[65376,65376],\\\"mapped\\\",[10630]],[[65377,65377],\\\"mapped\\\",[46]],[[65378,65378],\\\"mapped\\\",[12300]],[[65379,65379],\\\"mapped\\\",[12301]],[[65380,65380],\\\"mapped\\\",[12289]],[[65381,65381],\\\"mapped\\\",[12539]],[[65382,65382],\\\"mapped\\\",[12530]],[[65383,65383],\\\"mapped\\\",[12449]],[[65384,65384],\\\"mapped\\\",[12451]],[[65385,65385],\\\"mapped\\\",[12453]],[[65386,65386],\\\"mapped\\\",[12455]],[[65387,65387],\\\"mapped\\\",[12457]],[[65388,65388],\\\"mapped\\\",[12515]],[[65389,65389],\\\"mapped\\\",[12517]],[[65390,65390],\\\"mapped\\\",[12519]],[[65391,65391],\\\"mapped\\\",[12483]],[[65392,65392],\\\"mapped\\\",[12540]],[[65393,65393],\\\"mapped\\\",[12450]],[[65394,65394],\\\"mapped\\\",[12452]],[[65395,65395],\\\"mapped\\\",[12454]],[[65396,65396],\\\"mapped\\\",[12456]],[[65397,65397],\\\"mapped\\\",[12458]],[[65398,65398],\\\"mapped\\\",[12459]],[[65399,65399],\\\"mapped\\\",[12461]],[[65400,65400],\\\"mapped\\\",[12463]],[[65401,65401],\\\"mapped\\\",[12465]],[[65402,65402],\\\"mapped\\\",[12467]],[[65403,65403],\\\"mapped\\\",[12469]],[[65404,65404],\\\"mapped\\\",[12471]],[[65405,65405],\\\"mapped\\\",[12473]],[[65406,65406],\\\"mapped\\\",[12475]],[[65407,65407],\\\"mapped\\\",[12477]],[[65408,65408],\\\"mapped\\\",[12479]],[[65409,65409],\\\"mapped\\\",[12481]],[[65410,65410],\\\"mapped\\\",[12484]],[[65411,65411],\\\"mapped\\\",[12486]],[[65412,65412],\\\"mapped\\\",[12488]],[[65413,65413],\\\"mapped\\\",[12490]],[[65414,65414],\\\"mapped\\\",[12491]],[[65415,65415],\\\"mapped\\\",[12492]],[[65416,65416],\\\"mapped\\\",[12493]],[[65417,65417],\\\"mapped\\\",[12494]],[[65418,65418],\\\"mapped\\\",[12495]],[[65419,65419],\\\"mapped\\\",[12498]],[[65420,65420],\\\"mapped\\\",[12501]],[[65421,65421],\\\"mapped\\\",[12504]],[[65422,65422],\\\"mapped\\\",[12507]],[[65423,65423],\\\"mapped\\\",[12510]],[[65424,65424],\\\"mapped\\\",[1251
1]],[[65425,65425],\\\"mapped\\\",[12512]],[[65426,65426],\\\"mapped\\\",[12513]],[[65427,65427],\\\"mapped\\\",[12514]],[[65428,65428],\\\"mapped\\\",[12516]],[[65429,65429],\\\"mapped\\\",[12518]],[[65430,65430],\\\"mapped\\\",[12520]],[[65431,65431],\\\"mapped\\\",[12521]],[[65432,65432],\\\"mapped\\\",[12522]],[[65433,65433],\\\"mapped\\\",[12523]],[[65434,65434],\\\"mapped\\\",[12524]],[[65435,65435],\\\"mapped\\\",[12525]],[[65436,65436],\\\"mapped\\\",[12527]],[[65437,65437],\\\"mapped\\\",[12531]],[[65438,65438],\\\"mapped\\\",[12441]],[[65439,65439],\\\"mapped\\\",[12442]],[[65440,65440],\\\"disallowed\\\"],[[65441,65441],\\\"mapped\\\",[4352]],[[65442,65442],\\\"mapped\\\",[4353]],[[65443,65443],\\\"mapped\\\",[4522]],[[65444,65444],\\\"mapped\\\",[4354]],[[65445,65445],\\\"mapped\\\",[4524]],[[65446,65446],\\\"mapped\\\",[4525]],[[65447,65447],\\\"mapped\\\",[4355]],[[65448,65448],\\\"mapped\\\",[4356]],[[65449,65449],\\\"mapped\\\",[4357]],[[65450,65450],\\\"mapped\\\",[4528]],[[65451,65451],\\\"mapped\\\",[4529]],[[65452,65452],\\\"mapped\\\",[4530]],[[65453,65453],\\\"mapped\\\",[4531]],[[65454,65454],\\\"mapped\\\",[4532]],[[65455,65455],\\\"mapped\\\",[4533]],[[65456,65456],\\\"mapped\\\",[4378]],[[65457,65457],\\\"mapped\\\",[4358]],[[65458,65458],\\\"mapped\\\",[4359]],[[65459,65459],\\\"mapped\\\",[4360]],[[65460,65460],\\\"mapped\\\",[4385]],[[65461,65461],\\\"mapped\\\",[4361]],[[65462,65462],\\\"mapped\\\",[4362]],[[65463,65463],\\\"mapped\\\",[4363]],[[65464,65464],\\\"mapped\\\",[4364]],[[65465,65465],\\\"mapped\\\",[4365]],[[65466,65466],\\\"mapped\\\",[4366]],[[65467,65467],\\\"mapped\\\",[4367]],[[65468,65468],\\\"mapped\\\",[4368]],[[65469,65469],\\\"mapped\\\",[4369]],[[65470,65470],\\\"mapped\\\",[4370]],[[65471,65473],\\\"disallowed\\\"],[[65474,65474],\\\"mapped\\\",[4449]],[[65475,65475],\\\"mapped\\\",[4450]],[[65476,65476],\\\"mapped\\\",[4451]],[[65477,65477],\\\"mapped\\\",[4452]],[[65478,65478],\\\"mapped\\\",[4453]],[[65479,654
79],\\\"mapped\\\",[4454]],[[65480,65481],\\\"disallowed\\\"],[[65482,65482],\\\"mapped\\\",[4455]],[[65483,65483],\\\"mapped\\\",[4456]],[[65484,65484],\\\"mapped\\\",[4457]],[[65485,65485],\\\"mapped\\\",[4458]],[[65486,65486],\\\"mapped\\\",[4459]],[[65487,65487],\\\"mapped\\\",[4460]],[[65488,65489],\\\"disallowed\\\"],[[65490,65490],\\\"mapped\\\",[4461]],[[65491,65491],\\\"mapped\\\",[4462]],[[65492,65492],\\\"mapped\\\",[4463]],[[65493,65493],\\\"mapped\\\",[4464]],[[65494,65494],\\\"mapped\\\",[4465]],[[65495,65495],\\\"mapped\\\",[4466]],[[65496,65497],\\\"disallowed\\\"],[[65498,65498],\\\"mapped\\\",[4467]],[[65499,65499],\\\"mapped\\\",[4468]],[[65500,65500],\\\"mapped\\\",[4469]],[[65501,65503],\\\"disallowed\\\"],[[65504,65504],\\\"mapped\\\",[162]],[[65505,65505],\\\"mapped\\\",[163]],[[65506,65506],\\\"mapped\\\",[172]],[[65507,65507],\\\"disallowed_STD3_mapped\\\",[32,772]],[[65508,65508],\\\"mapped\\\",[166]],[[65509,65509],\\\"mapped\\\",[165]],[[65510,65510],\\\"mapped\\\",[8361]],[[65511,65511],\\\"disallowed\\\"],[[65512,65512],\\\"mapped\\\",[9474]],[[65513,65513],\\\"mapped\\\",[8592]],[[65514,65514],\\\"mapped\\\",[8593]],[[65515,65515],\\\"mapped\\\",[8594]],[[65516,65516],\\\"mapped\\\",[8595]],[[65517,65517],\\\"mapped\\\",[9632]],[[65518,65518],\\\"mapped\\\",[9675]],[[65519,65528],\\\"disallowed\\\"],[[65529,65531],\\\"disallowed\\\"],[[65532,65532],\\\"disallowed\\\"],[[65533,65533],\\\"disallowed\\\"],[[65534,65535],\\\"disallowed\\\"],[[65536,65547],\\\"valid\\\"],[[65548,65548],\\\"disallowed\\\"],[[65549,65574],\\\"valid\\\"],[[65575,65575],\\\"disallowed\\\"],[[65576,65594],\\\"valid\\\"],[[65595,65595],\\\"disallowed\\\"],[[65596,65597],\\\"valid\\\"],[[65598,65598],\\\"disallowed\\\"],[[65599,65613],\\\"valid\\\"],[[65614,65615],\\\"disallowed\\\"],[[65616,65629],\\\"valid\\\"],[[65630,65663],\\\"disallowed\\\"],[[65664,65786],\\\"valid\\\"],[[65787,65791],\\\"disallowed\\\"],[[65792,65794],\\\"valid\\\",[],\\\"NV8\\\"],[[65795,
65798],\\\"disallowed\\\"],[[65799,65843],\\\"valid\\\",[],\\\"NV8\\\"],[[65844,65846],\\\"disallowed\\\"],[[65847,65855],\\\"valid\\\",[],\\\"NV8\\\"],[[65856,65930],\\\"valid\\\",[],\\\"NV8\\\"],[[65931,65932],\\\"valid\\\",[],\\\"NV8\\\"],[[65933,65935],\\\"disallowed\\\"],[[65936,65947],\\\"valid\\\",[],\\\"NV8\\\"],[[65948,65951],\\\"disallowed\\\"],[[65952,65952],\\\"valid\\\",[],\\\"NV8\\\"],[[65953,65999],\\\"disallowed\\\"],[[66000,66044],\\\"valid\\\",[],\\\"NV8\\\"],[[66045,66045],\\\"valid\\\"],[[66046,66175],\\\"disallowed\\\"],[[66176,66204],\\\"valid\\\"],[[66205,66207],\\\"disallowed\\\"],[[66208,66256],\\\"valid\\\"],[[66257,66271],\\\"disallowed\\\"],[[66272,66272],\\\"valid\\\"],[[66273,66299],\\\"valid\\\",[],\\\"NV8\\\"],[[66300,66303],\\\"disallowed\\\"],[[66304,66334],\\\"valid\\\"],[[66335,66335],\\\"valid\\\"],[[66336,66339],\\\"valid\\\",[],\\\"NV8\\\"],[[66340,66351],\\\"disallowed\\\"],[[66352,66368],\\\"valid\\\"],[[66369,66369],\\\"valid\\\",[],\\\"NV8\\\"],[[66370,66377],\\\"valid\\\"],[[66378,66378],\\\"valid\\\",[],\\\"NV8\\\"],[[66379,66383],\\\"disallowed\\\"],[[66384,66426],\\\"valid\\\"],[[66427,66431],\\\"disallowed\\\"],[[66432,66461],\\\"valid\\\"],[[66462,66462],\\\"disallowed\\\"],[[66463,66463],\\\"valid\\\",[],\\\"NV8\\\"],[[66464,66499],\\\"valid\\\"],[[66500,66503],\\\"disallowed\\\"],[[66504,66511],\\\"valid\\\"],[[66512,66517],\\\"valid\\\",[],\\\"NV8\\\"],[[66518,66559],\\\"disallowed\\\"],[[66560,66560],\\\"mapped\\\",[66600]],[[66561,66561],\\\"mapped\\\",[66601]],[[66562,66562],\\\"mapped\\\",[66602]],[[66563,66563],\\\"mapped\\\",[66603]],[[66564,66564],\\\"mapped\\\",[66604]],[[66565,66565],\\\"mapped\\\",[66605]],[[66566,66566],\\\"mapped\\\",[66606]],[[66567,66567],\\\"mapped\\\",[66607]],[[66568,66568],\\\"mapped\\\",[66608]],[[66569,66569],\\\"mapped\\\",[66609]],[[66570,66570],\\\"mapped\\\",[66610]],[[66571,66571],\\\"mapped\\\",[66611]],[[66572,66572],\\\"mapped\\\",[66612]],[[66573,66573],\\\"mapped\\\",[
66613]],[[66574,66574],\\\"mapped\\\",[66614]],[[66575,66575],\\\"mapped\\\",[66615]],[[66576,66576],\\\"mapped\\\",[66616]],[[66577,66577],\\\"mapped\\\",[66617]],[[66578,66578],\\\"mapped\\\",[66618]],[[66579,66579],\\\"mapped\\\",[66619]],[[66580,66580],\\\"mapped\\\",[66620]],[[66581,66581],\\\"mapped\\\",[66621]],[[66582,66582],\\\"mapped\\\",[66622]],[[66583,66583],\\\"mapped\\\",[66623]],[[66584,66584],\\\"mapped\\\",[66624]],[[66585,66585],\\\"mapped\\\",[66625]],[[66586,66586],\\\"mapped\\\",[66626]],[[66587,66587],\\\"mapped\\\",[66627]],[[66588,66588],\\\"mapped\\\",[66628]],[[66589,66589],\\\"mapped\\\",[66629]],[[66590,66590],\\\"mapped\\\",[66630]],[[66591,66591],\\\"mapped\\\",[66631]],[[66592,66592],\\\"mapped\\\",[66632]],[[66593,66593],\\\"mapped\\\",[66633]],[[66594,66594],\\\"mapped\\\",[66634]],[[66595,66595],\\\"mapped\\\",[66635]],[[66596,66596],\\\"mapped\\\",[66636]],[[66597,66597],\\\"mapped\\\",[66637]],[[66598,66598],\\\"mapped\\\",[66638]],[[66599,66599],\\\"mapped\\\",[66639]],[[66600,66637],\\\"valid\\\"],[[66638,66717],\\\"valid\\\"],[[66718,66719],\\\"disallowed\\\"],[[66720,66729],\\\"valid\\\"],[[66730,66815],\\\"disallowed\\\"],[[66816,66855],\\\"valid\\\"],[[66856,66863],\\\"disallowed\\\"],[[66864,66915],\\\"valid\\\"],[[66916,66926],\\\"disallowed\\\"],[[66927,66927],\\\"valid\\\",[],\\\"NV8\\\"],[[66928,67071],\\\"disallowed\\\"],[[67072,67382],\\\"valid\\\"],[[67383,67391],\\\"disallowed\\\"],[[67392,67413],\\\"valid\\\"],[[67414,67423],\\\"disallowed\\\"],[[67424,67431],\\\"valid\\\"],[[67432,67583],\\\"disallowed\\\"],[[67584,67589],\\\"valid\\\"],[[67590,67591],\\\"disallowed\\\"],[[67592,67592],\\\"valid\\\"],[[67593,67593],\\\"disallowed\\\"],[[67594,67637],\\\"valid\\\"],[[67638,67638],\\\"disallowed\\\"],[[67639,67640],\\\"valid\\\"],[[67641,67643],\\\"disallowed\\\"],[[67644,67644],\\\"valid\\\"],[[67645,67646],\\\"disallowed\\\"],[[67647,67647],\\\"valid\\\"],[[67648,67669],\\\"valid\\\"],[[67670,67670],\\\"disallowe
d\\\"],[[67671,67679],\\\"valid\\\",[],\\\"NV8\\\"],[[67680,67702],\\\"valid\\\"],[[67703,67711],\\\"valid\\\",[],\\\"NV8\\\"],[[67712,67742],\\\"valid\\\"],[[67743,67750],\\\"disallowed\\\"],[[67751,67759],\\\"valid\\\",[],\\\"NV8\\\"],[[67760,67807],\\\"disallowed\\\"],[[67808,67826],\\\"valid\\\"],[[67827,67827],\\\"disallowed\\\"],[[67828,67829],\\\"valid\\\"],[[67830,67834],\\\"disallowed\\\"],[[67835,67839],\\\"valid\\\",[],\\\"NV8\\\"],[[67840,67861],\\\"valid\\\"],[[67862,67865],\\\"valid\\\",[],\\\"NV8\\\"],[[67866,67867],\\\"valid\\\",[],\\\"NV8\\\"],[[67868,67870],\\\"disallowed\\\"],[[67871,67871],\\\"valid\\\",[],\\\"NV8\\\"],[[67872,67897],\\\"valid\\\"],[[67898,67902],\\\"disallowed\\\"],[[67903,67903],\\\"valid\\\",[],\\\"NV8\\\"],[[67904,67967],\\\"disallowed\\\"],[[67968,68023],\\\"valid\\\"],[[68024,68027],\\\"disallowed\\\"],[[68028,68029],\\\"valid\\\",[],\\\"NV8\\\"],[[68030,68031],\\\"valid\\\"],[[68032,68047],\\\"valid\\\",[],\\\"NV8\\\"],[[68048,68049],\\\"disallowed\\\"],[[68050,68095],\\\"valid\\\",[],\\\"NV8\\\"],[[68096,68099],\\\"valid\\\"],[[68100,68100],\\\"disallowed\\\"],[[68101,68102],\\\"valid\\\"],[[68103,68107],\\\"disallowed\\\"],[[68108,68115],\\\"valid\\\"],[[68116,68116],\\\"disallowed\\\"],[[68117,68119],\\\"valid\\\"],[[68120,68120],\\\"disallowed\\\"],[[68121,68147],\\\"valid\\\"],[[68148,68151],\\\"disallowed\\\"],[[68152,68154],\\\"valid\\\"],[[68155,68158],\\\"disallowed\\\"],[[68159,68159],\\\"valid\\\"],[[68160,68167],\\\"valid\\\",[],\\\"NV8\\\"],[[68168,68175],\\\"disallowed\\\"],[[68176,68184],\\\"valid\\\",[],\\\"NV8\\\"],[[68185,68191],\\\"disallowed\\\"],[[68192,68220],\\\"valid\\\"],[[68221,68223],\\\"valid\\\",[],\\\"NV8\\\"],[[68224,68252],\\\"valid\\\"],[[68253,68255],\\\"valid\\\",[],\\\"NV8\\\"],[[68256,68287],\\\"disallowed\\\"],[[68288,68295],\\\"valid\\\"],[[68296,68296],\\\"valid\\\",[],\\\"NV8\\\"],[[68297,68326],\\\"valid\\\"],[[68327,68330],\\\"disallowed\\\"],[[68331,68342],\\\"valid\\\",[],\\\"NV
8\\\"],[[68343,68351],\\\"disallowed\\\"],[[68352,68405],\\\"valid\\\"],[[68406,68408],\\\"disallowed\\\"],[[68409,68415],\\\"valid\\\",[],\\\"NV8\\\"],[[68416,68437],\\\"valid\\\"],[[68438,68439],\\\"disallowed\\\"],[[68440,68447],\\\"valid\\\",[],\\\"NV8\\\"],[[68448,68466],\\\"valid\\\"],[[68467,68471],\\\"disallowed\\\"],[[68472,68479],\\\"valid\\\",[],\\\"NV8\\\"],[[68480,68497],\\\"valid\\\"],[[68498,68504],\\\"disallowed\\\"],[[68505,68508],\\\"valid\\\",[],\\\"NV8\\\"],[[68509,68520],\\\"disallowed\\\"],[[68521,68527],\\\"valid\\\",[],\\\"NV8\\\"],[[68528,68607],\\\"disallowed\\\"],[[68608,68680],\\\"valid\\\"],[[68681,68735],\\\"disallowed\\\"],[[68736,68736],\\\"mapped\\\",[68800]],[[68737,68737],\\\"mapped\\\",[68801]],[[68738,68738],\\\"mapped\\\",[68802]],[[68739,68739],\\\"mapped\\\",[68803]],[[68740,68740],\\\"mapped\\\",[68804]],[[68741,68741],\\\"mapped\\\",[68805]],[[68742,68742],\\\"mapped\\\",[68806]],[[68743,68743],\\\"mapped\\\",[68807]],[[68744,68744],\\\"mapped\\\",[68808]],[[68745,68745],\\\"mapped\\\",[68809]],[[68746,68746],\\\"mapped\\\",[68810]],[[68747,68747],\\\"mapped\\\",[68811]],[[68748,68748],\\\"mapped\\\",[68812]],[[68749,68749],\\\"mapped\\\",[68813]],[[68750,68750],\\\"mapped\\\",[68814]],[[68751,68751],\\\"mapped\\\",[68815]],[[68752,68752],\\\"mapped\\\",[68816]],[[68753,68753],\\\"mapped\\\",[68817]],[[68754,68754],\\\"mapped\\\",[68818]],[[68755,68755],\\\"mapped\\\",[68819]],[[68756,68756],\\\"mapped\\\",[68820]],[[68757,68757],\\\"mapped\\\",[68821]],[[68758,68758],\\\"mapped\\\",[68822]],[[68759,68759],\\\"mapped\\\",[68823]],[[68760,68760],\\\"mapped\\\",[68824]],[[68761,68761],\\\"mapped\\\",[68825]],[[68762,68762],\\\"mapped\\\",[68826]],[[68763,68763],\\\"mapped\\\",[68827]],[[68764,68764],\\\"mapped\\\",[68828]],[[68765,68765],\\\"mapped\\\",[68829]],[[68766,68766],\\\"mapped\\\",[68830]],[[68767,68767],\\\"mapped\\\",[68831]],[[68768,68768],\\\"mapped\\\",[68832]],[[68769,68769],\\\"mapped\\\",[68833]],[[68770,6877
0],\\\"mapped\\\",[68834]],[[68771,68771],\\\"mapped\\\",[68835]],[[68772,68772],\\\"mapped\\\",[68836]],[[68773,68773],\\\"mapped\\\",[68837]],[[68774,68774],\\\"mapped\\\",[68838]],[[68775,68775],\\\"mapped\\\",[68839]],[[68776,68776],\\\"mapped\\\",[68840]],[[68777,68777],\\\"mapped\\\",[68841]],[[68778,68778],\\\"mapped\\\",[68842]],[[68779,68779],\\\"mapped\\\",[68843]],[[68780,68780],\\\"mapped\\\",[68844]],[[68781,68781],\\\"mapped\\\",[68845]],[[68782,68782],\\\"mapped\\\",[68846]],[[68783,68783],\\\"mapped\\\",[68847]],[[68784,68784],\\\"mapped\\\",[68848]],[[68785,68785],\\\"mapped\\\",[68849]],[[68786,68786],\\\"mapped\\\",[68850]],[[68787,68799],\\\"disallowed\\\"],[[68800,68850],\\\"valid\\\"],[[68851,68857],\\\"disallowed\\\"],[[68858,68863],\\\"valid\\\",[],\\\"NV8\\\"],[[68864,69215],\\\"disallowed\\\"],[[69216,69246],\\\"valid\\\",[],\\\"NV8\\\"],[[69247,69631],\\\"disallowed\\\"],[[69632,69702],\\\"valid\\\"],[[69703,69709],\\\"valid\\\",[],\\\"NV8\\\"],[[69710,69713],\\\"disallowed\\\"],[[69714,69733],\\\"valid\\\",[],\\\"NV8\\\"],[[69734,69743],\\\"valid\\\"],[[69744,69758],\\\"disallowed\\\"],[[69759,69759],\\\"valid\\\"],[[69760,69818],\\\"valid\\\"],[[69819,69820],\\\"valid\\\",[],\\\"NV8\\\"],[[69821,69821],\\\"disallowed\\\"],[[69822,69825],\\\"valid\\\",[],\\\"NV8\\\"],[[69826,69839],\\\"disallowed\\\"],[[69840,69864],\\\"valid\\\"],[[69865,69871],\\\"disallowed\\\"],[[69872,69881],\\\"valid\\\"],[[69882,69887],\\\"disallowed\\\"],[[69888,69940],\\\"valid\\\"],[[69941,69941],\\\"disallowed\\\"],[[69942,69951],\\\"valid\\\"],[[69952,69955],\\\"valid\\\",[],\\\"NV8\\\"],[[69956,69967],\\\"disallowed\\\"],[[69968,70003],\\\"valid\\\"],[[70004,70005],\\\"valid\\\",[],\\\"NV8\\\"],[[70006,70006],\\\"valid\\\"],[[70007,70015],\\\"disallowed\\\"],[[70016,70084],\\\"valid\\\"],[[70085,70088],\\\"valid\\\",[],\\\"NV8\\\"],[[70089,70089],\\\"valid\\\",[],\\\"NV8\\\"],[[70090,70092],\\\"valid\\\"],[[70093,70093],\\\"valid\\\",[],\\\"NV8\\\"],[[70094,7
0095],\\\"disallowed\\\"],[[70096,70105],\\\"valid\\\"],[[70106,70106],\\\"valid\\\"],[[70107,70107],\\\"valid\\\",[],\\\"NV8\\\"],[[70108,70108],\\\"valid\\\"],[[70109,70111],\\\"valid\\\",[],\\\"NV8\\\"],[[70112,70112],\\\"disallowed\\\"],[[70113,70132],\\\"valid\\\",[],\\\"NV8\\\"],[[70133,70143],\\\"disallowed\\\"],[[70144,70161],\\\"valid\\\"],[[70162,70162],\\\"disallowed\\\"],[[70163,70199],\\\"valid\\\"],[[70200,70205],\\\"valid\\\",[],\\\"NV8\\\"],[[70206,70271],\\\"disallowed\\\"],[[70272,70278],\\\"valid\\\"],[[70279,70279],\\\"disallowed\\\"],[[70280,70280],\\\"valid\\\"],[[70281,70281],\\\"disallowed\\\"],[[70282,70285],\\\"valid\\\"],[[70286,70286],\\\"disallowed\\\"],[[70287,70301],\\\"valid\\\"],[[70302,70302],\\\"disallowed\\\"],[[70303,70312],\\\"valid\\\"],[[70313,70313],\\\"valid\\\",[],\\\"NV8\\\"],[[70314,70319],\\\"disallowed\\\"],[[70320,70378],\\\"valid\\\"],[[70379,70383],\\\"disallowed\\\"],[[70384,70393],\\\"valid\\\"],[[70394,70399],\\\"disallowed\\\"],[[70400,70400],\\\"valid\\\"],[[70401,70403],\\\"valid\\\"],[[70404,70404],\\\"disallowed\\\"],[[70405,70412],\\\"valid\\\"],[[70413,70414],\\\"disallowed\\\"],[[70415,70416],\\\"valid\\\"],[[70417,70418],\\\"disallowed\\\"],[[70419,70440],\\\"valid\\\"],[[70441,70441],\\\"disallowed\\\"],[[70442,70448],\\\"valid\\\"],[[70449,70449],\\\"disallowed\\\"],[[70450,70451],\\\"valid\\\"],[[70452,70452],\\\"disallowed\\\"],[[70453,70457],\\\"valid\\\"],[[70458,70459],\\\"disallowed\\\"],[[70460,70468],\\\"valid\\\"],[[70469,70470],\\\"disallowed\\\"],[[70471,70472],\\\"valid\\\"],[[70473,70474],\\\"disallowed\\\"],[[70475,70477],\\\"valid\\\"],[[70478,70479],\\\"disallowed\\\"],[[70480,70480],\\\"valid\\\"],[[70481,70486],\\\"disallowed\\\"],[[70487,70487],\\\"valid\\\"],[[70488,70492],\\\"disallowed\\\"],[[70493,70499],\\\"valid\\\"],[[70500,70501],\\\"disallowed\\\"],[[70502,70508],\\\"valid\\\"],[[70509,70511],\\\"disallowed\\\"],[[70512,70516],\\\"valid\\\"],[[70517,70783],\\\"disallowed\\\"]
,[[70784,70853],\\\"valid\\\"],[[70854,70854],\\\"valid\\\",[],\\\"NV8\\\"],[[70855,70855],\\\"valid\\\"],[[70856,70863],\\\"disallowed\\\"],[[70864,70873],\\\"valid\\\"],[[70874,71039],\\\"disallowed\\\"],[[71040,71093],\\\"valid\\\"],[[71094,71095],\\\"disallowed\\\"],[[71096,71104],\\\"valid\\\"],[[71105,71113],\\\"valid\\\",[],\\\"NV8\\\"],[[71114,71127],\\\"valid\\\",[],\\\"NV8\\\"],[[71128,71133],\\\"valid\\\"],[[71134,71167],\\\"disallowed\\\"],[[71168,71232],\\\"valid\\\"],[[71233,71235],\\\"valid\\\",[],\\\"NV8\\\"],[[71236,71236],\\\"valid\\\"],[[71237,71247],\\\"disallowed\\\"],[[71248,71257],\\\"valid\\\"],[[71258,71295],\\\"disallowed\\\"],[[71296,71351],\\\"valid\\\"],[[71352,71359],\\\"disallowed\\\"],[[71360,71369],\\\"valid\\\"],[[71370,71423],\\\"disallowed\\\"],[[71424,71449],\\\"valid\\\"],[[71450,71452],\\\"disallowed\\\"],[[71453,71467],\\\"valid\\\"],[[71468,71471],\\\"disallowed\\\"],[[71472,71481],\\\"valid\\\"],[[71482,71487],\\\"valid\\\",[],\\\"NV8\\\"],[[71488,71839],\\\"disallowed\\\"],[[71840,71840],\\\"mapped\\\",[71872]],[[71841,71841],\\\"mapped\\\",[71873]],[[71842,71842],\\\"mapped\\\",[71874]],[[71843,71843],\\\"mapped\\\",[71875]],[[71844,71844],\\\"mapped\\\",[71876]],[[71845,71845],\\\"mapped\\\",[71877]],[[71846,71846],\\\"mapped\\\",[71878]],[[71847,71847],\\\"mapped\\\",[71879]],[[71848,71848],\\\"mapped\\\",[71880]],[[71849,71849],\\\"mapped\\\",[71881]],[[71850,71850],\\\"mapped\\\",[71882]],[[71851,71851],\\\"mapped\\\",[71883]],[[71852,71852],\\\"mapped\\\",[71884]],[[71853,71853],\\\"mapped\\\",[71885]],[[71854,71854],\\\"mapped\\\",[71886]],[[71855,71855],\\\"mapped\\\",[71887]],[[71856,71856],\\\"mapped\\\",[71888]],[[71857,71857],\\\"mapped\\\",[71889]],[[71858,71858],\\\"mapped\\\",[71890]],[[71859,71859],\\\"mapped\\\",[71891]],[[71860,71860],\\\"mapped\\\",[71892]],[[71861,71861],\\\"mapped\\\",[71893]],[[71862,71862],\\\"mapped\\\",[71894]],[[71863,71863],\\\"mapped\\\",[71895]],[[71864,71864],\\\"mapped\\\",[71
896]],[[71865,71865],\\\"mapped\\\",[71897]],[[71866,71866],\\\"mapped\\\",[71898]],[[71867,71867],\\\"mapped\\\",[71899]],[[71868,71868],\\\"mapped\\\",[71900]],[[71869,71869],\\\"mapped\\\",[71901]],[[71870,71870],\\\"mapped\\\",[71902]],[[71871,71871],\\\"mapped\\\",[71903]],[[71872,71913],\\\"valid\\\"],[[71914,71922],\\\"valid\\\",[],\\\"NV8\\\"],[[71923,71934],\\\"disallowed\\\"],[[71935,71935],\\\"valid\\\"],[[71936,72383],\\\"disallowed\\\"],[[72384,72440],\\\"valid\\\"],[[72441,73727],\\\"disallowed\\\"],[[73728,74606],\\\"valid\\\"],[[74607,74648],\\\"valid\\\"],[[74649,74649],\\\"valid\\\"],[[74650,74751],\\\"disallowed\\\"],[[74752,74850],\\\"valid\\\",[],\\\"NV8\\\"],[[74851,74862],\\\"valid\\\",[],\\\"NV8\\\"],[[74863,74863],\\\"disallowed\\\"],[[74864,74867],\\\"valid\\\",[],\\\"NV8\\\"],[[74868,74868],\\\"valid\\\",[],\\\"NV8\\\"],[[74869,74879],\\\"disallowed\\\"],[[74880,75075],\\\"valid\\\"],[[75076,77823],\\\"disallowed\\\"],[[77824,78894],\\\"valid\\\"],[[78895,82943],\\\"disallowed\\\"],[[82944,83526],\\\"valid\\\"],[[83527,92159],\\\"disallowed\\\"],[[92160,92728],\\\"valid\\\"],[[92729,92735],\\\"disallowed\\\"],[[92736,92766],\\\"valid\\\"],[[92767,92767],\\\"disallowed\\\"],[[92768,92777],\\\"valid\\\"],[[92778,92781],\\\"disallowed\\\"],[[92782,92783],\\\"valid\\\",[],\\\"NV8\\\"],[[92784,92879],\\\"disallowed\\\"],[[92880,92909],\\\"valid\\\"],[[92910,92911],\\\"disallowed\\\"],[[92912,92916],\\\"valid\\\"],[[92917,92917],\\\"valid\\\",[],\\\"NV8\\\"],[[92918,92927],\\\"disallowed\\\"],[[92928,92982],\\\"valid\\\"],[[92983,92991],\\\"valid\\\",[],\\\"NV8\\\"],[[92992,92995],\\\"valid\\\"],[[92996,92997],\\\"valid\\\",[],\\\"NV8\\\"],[[92998,93007],\\\"disallowed\\\"],[[93008,93017],\\\"valid\\\"],[[93018,93018],\\\"disallowed\\\"],[[93019,93025],\\\"valid\\\",[],\\\"NV8\\\"],[[93026,93026],\\\"disallowed\\\"],[[93027,93047],\\\"valid\\\"],[[93048,93052],\\\"disallowed\\\"],[[93053,93071],\\\"valid\\\"],[[93072,93951],\\\"disallowed\\\"],[
[93952,94020],\\\"valid\\\"],[[94021,94031],\\\"disallowed\\\"],[[94032,94078],\\\"valid\\\"],[[94079,94094],\\\"disallowed\\\"],[[94095,94111],\\\"valid\\\"],[[94112,110591],\\\"disallowed\\\"],[[110592,110593],\\\"valid\\\"],[[110594,113663],\\\"disallowed\\\"],[[113664,113770],\\\"valid\\\"],[[113771,113775],\\\"disallowed\\\"],[[113776,113788],\\\"valid\\\"],[[113789,113791],\\\"disallowed\\\"],[[113792,113800],\\\"valid\\\"],[[113801,113807],\\\"disallowed\\\"],[[113808,113817],\\\"valid\\\"],[[113818,113819],\\\"disallowed\\\"],[[113820,113820],\\\"valid\\\",[],\\\"NV8\\\"],[[113821,113822],\\\"valid\\\"],[[113823,113823],\\\"valid\\\",[],\\\"NV8\\\"],[[113824,113827],\\\"ignored\\\"],[[113828,118783],\\\"disallowed\\\"],[[118784,119029],\\\"valid\\\",[],\\\"NV8\\\"],[[119030,119039],\\\"disallowed\\\"],[[119040,119078],\\\"valid\\\",[],\\\"NV8\\\"],[[119079,119080],\\\"disallowed\\\"],[[119081,119081],\\\"valid\\\",[],\\\"NV8\\\"],[[119082,119133],\\\"valid\\\",[],\\\"NV8\\\"],[[119134,119134],\\\"mapped\\\",[119127,119141]],[[119135,119135],\\\"mapped\\\",[119128,119141]],[[119136,119136],\\\"mapped\\\",[119128,119141,119150]],[[119137,119137],\\\"mapped\\\",[119128,119141,119151]],[[119138,119138],\\\"mapped\\\",[119128,119141,119152]],[[119139,119139],\\\"mapped\\\",[119128,119141,119153]],[[119140,119140],\\\"mapped\\\",[119128,119141,119154]],[[119141,119154],\\\"valid\\\",[],\\\"NV8\\\"],[[119155,119162],\\\"disallowed\\\"],[[119163,119226],\\\"valid\\\",[],\\\"NV8\\\"],[[119227,119227],\\\"mapped\\\",[119225,119141]],[[119228,119228],\\\"mapped\\\",[119226,119141]],[[119229,119229],\\\"mapped\\\",[119225,119141,119150]],[[119230,119230],\\\"mapped\\\",[119226,119141,119150]],[[119231,119231],\\\"mapped\\\",[119225,119141,119151]],[[119232,119232],\\\"mapped\\\",[119226,119141,119151]],[[119233,119261],\\\"valid\\\",[],\\\"NV8\\\"],[[119262,119272],\\\"valid\\\",[],\\\"NV8\\\"],[[119273,119295],\\\"disallowed\\\"],[[119296,119365],\\\"valid\\\",[],\\\"N
V8\\\"],[[119366,119551],\\\"disallowed\\\"],[[119552,119638],\\\"valid\\\",[],\\\"NV8\\\"],[[119639,119647],\\\"disallowed\\\"],[[119648,119665],\\\"valid\\\",[],\\\"NV8\\\"],[[119666,119807],\\\"disallowed\\\"],[[119808,119808],\\\"mapped\\\",[97]],[[119809,119809],\\\"mapped\\\",[98]],[[119810,119810],\\\"mapped\\\",[99]],[[119811,119811],\\\"mapped\\\",[100]],[[119812,119812],\\\"mapped\\\",[101]],[[119813,119813],\\\"mapped\\\",[102]],[[119814,119814],\\\"mapped\\\",[103]],[[119815,119815],\\\"mapped\\\",[104]],[[119816,119816],\\\"mapped\\\",[105]],[[119817,119817],\\\"mapped\\\",[106]],[[119818,119818],\\\"mapped\\\",[107]],[[119819,119819],\\\"mapped\\\",[108]],[[119820,119820],\\\"mapped\\\",[109]],[[119821,119821],\\\"mapped\\\",[110]],[[119822,119822],\\\"mapped\\\",[111]],[[119823,119823],\\\"mapped\\\",[112]],[[119824,119824],\\\"mapped\\\",[113]],[[119825,119825],\\\"mapped\\\",[114]],[[119826,119826],\\\"mapped\\\",[115]],[[119827,119827],\\\"mapped\\\",[116]],[[119828,119828],\\\"mapped\\\",[117]],[[119829,119829],\\\"mapped\\\",[118]],[[119830,119830],\\\"mapped\\\",[119]],[[119831,119831],\\\"mapped\\\",[120]],[[119832,119832],\\\"mapped\\\",[121]],[[119833,119833],\\\"mapped\\\",[122]],[[119834,119834],\\\"mapped\\\",[97]],[[119835,119835],\\\"mapped\\\",[98]],[[119836,119836],\\\"mapped\\\",[99]],[[119837,119837],\\\"mapped\\\",[100]],[[119838,119838],\\\"mapped\\\",[101]],[[119839,119839],\\\"mapped\\\",[102]],[[119840,119840],\\\"mapped\\\",[103]],[[119841,119841],\\\"mapped\\\",[104]],[[119842,119842],\\\"mapped\\\",[105]],[[119843,119843],\\\"mapped\\\",[106]],[[119844,119844],\\\"mapped\\\",[107]],[[119845,119845],\\\"mapped\\\",[108]],[[119846,119846],\\\"mapped\\\",[109]],[[119847,119847],\\\"mapped\\\",[110]],[[119848,119848],\\\"mapped\\\",[111]],[[119849,119849],\\\"mapped\\\",[112]],[[119850,119850],\\\"mapped\\\",[113]],[[119851,119851],\\\"mapped\\\",[114]],[[119852,119852],\\\"mapped\\\",[115]],[[119853,119853],\\\"mapped\\\",[116]]
,[[119854,119854],\\\"mapped\\\",[117]],[[119855,119855],\\\"mapped\\\",[118]],[[119856,119856],\\\"mapped\\\",[119]],[[119857,119857],\\\"mapped\\\",[120]],[[119858,119858],\\\"mapped\\\",[121]],[[119859,119859],\\\"mapped\\\",[122]],[[119860,119860],\\\"mapped\\\",[97]],[[119861,119861],\\\"mapped\\\",[98]],[[119862,119862],\\\"mapped\\\",[99]],[[119863,119863],\\\"mapped\\\",[100]],[[119864,119864],\\\"mapped\\\",[101]],[[119865,119865],\\\"mapped\\\",[102]],[[119866,119866],\\\"mapped\\\",[103]],[[119867,119867],\\\"mapped\\\",[104]],[[119868,119868],\\\"mapped\\\",[105]],[[119869,119869],\\\"mapped\\\",[106]],[[119870,119870],\\\"mapped\\\",[107]],[[119871,119871],\\\"mapped\\\",[108]],[[119872,119872],\\\"mapped\\\",[109]],[[119873,119873],\\\"mapped\\\",[110]],[[119874,119874],\\\"mapped\\\",[111]],[[119875,119875],\\\"mapped\\\",[112]],[[119876,119876],\\\"mapped\\\",[113]],[[119877,119877],\\\"mapped\\\",[114]],[[119878,119878],\\\"mapped\\\",[115]],[[119879,119879],\\\"mapped\\\",[116]],[[119880,119880],\\\"mapped\\\",[117]],[[119881,119881],\\\"mapped\\\",[118]],[[119882,119882],\\\"mapped\\\",[119]],[[119883,119883],\\\"mapped\\\",[120]],[[119884,119884],\\\"mapped\\\",[121]],[[119885,119885],\\\"mapped\\\",[122]],[[119886,119886],\\\"mapped\\\",[97]],[[119887,119887],\\\"mapped\\\",[98]],[[119888,119888],\\\"mapped\\\",[99]],[[119889,119889],\\\"mapped\\\",[100]],[[119890,119890],\\\"mapped\\\",[101]],[[119891,119891],\\\"mapped\\\",[102]],[[119892,119892],\\\"mapped\\\",[103]],[[119893,119893],\\\"disallowed\\\"],[[119894,119894],\\\"mapped\\\",[105]],[[119895,119895],\\\"mapped\\\",[106]],[[119896,119896],\\\"mapped\\\",[107]],[[119897,119897],\\\"mapped\\\",[108]],[[119898,119898],\\\"mapped\\\",[109]],[[119899,119899],\\\"mapped\\\",[110]],[[119900,119900],\\\"mapped\\\",[111]],[[119901,119901],\\\"mapped\\\",[112]],[[119902,119902],\\\"mapped\\\",[113]],[[119903,119903],\\\"mapped\\\",[114]],[[119904,119904],\\\"mapped\\\",[115]],[[119905,119905],\
\\"mapped\\\",[116]],[[119906,119906],\\\"mapped\\\",[117]],[[119907,119907],\\\"mapped\\\",[118]],[[119908,119908],\\\"mapped\\\",[119]],[[119909,119909],\\\"mapped\\\",[120]],[[119910,119910],\\\"mapped\\\",[121]],[[119911,119911],\\\"mapped\\\",[122]],[[119912,119912],\\\"mapped\\\",[97]],[[119913,119913],\\\"mapped\\\",[98]],[[119914,119914],\\\"mapped\\\",[99]],[[119915,119915],\\\"mapped\\\",[100]],[[119916,119916],\\\"mapped\\\",[101]],[[119917,119917],\\\"mapped\\\",[102]],[[119918,119918],\\\"mapped\\\",[103]],[[119919,119919],\\\"mapped\\\",[104]],[[119920,119920],\\\"mapped\\\",[105]],[[119921,119921],\\\"mapped\\\",[106]],[[119922,119922],\\\"mapped\\\",[107]],[[119923,119923],\\\"mapped\\\",[108]],[[119924,119924],\\\"mapped\\\",[109]],[[119925,119925],\\\"mapped\\\",[110]],[[119926,119926],\\\"mapped\\\",[111]],[[119927,119927],\\\"mapped\\\",[112]],[[119928,119928],\\\"mapped\\\",[113]],[[119929,119929],\\\"mapped\\\",[114]],[[119930,119930],\\\"mapped\\\",[115]],[[119931,119931],\\\"mapped\\\",[116]],[[119932,119932],\\\"mapped\\\",[117]],[[119933,119933],\\\"mapped\\\",[118]],[[119934,119934],\\\"mapped\\\",[119]],[[119935,119935],\\\"mapped\\\",[120]],[[119936,119936],\\\"mapped\\\",[121]],[[119937,119937],\\\"mapped\\\",[122]],[[119938,119938],\\\"mapped\\\",[97]],[[119939,119939],\\\"mapped\\\",[98]],[[119940,119940],\\\"mapped\\\",[99]],[[119941,119941],\\\"mapped\\\",[100]],[[119942,119942],\\\"mapped\\\",[101]],[[119943,119943],\\\"mapped\\\",[102]],[[119944,119944],\\\"mapped\\\",[103]],[[119945,119945],\\\"mapped\\\",[104]],[[119946,119946],\\\"mapped\\\",[105]],[[119947,119947],\\\"mapped\\\",[106]],[[119948,119948],\\\"mapped\\\",[107]],[[119949,119949],\\\"mapped\\\",[108]],[[119950,119950],\\\"mapped\\\",[109]],[[119951,119951],\\\"mapped\\\",[110]],[[119952,119952],\\\"mapped\\\",[111]],[[119953,119953],\\\"mapped\\\",[112]],[[119954,119954],\\\"mapped\\\",[113]],[[119955,119955],\\\"mapped\\\",[114]],[[119956,119956],\\\"mapped\\\",[11
5]],[[119957,119957],\\\"mapped\\\",[116]],[[119958,119958],\\\"mapped\\\",[117]],[[119959,119959],\\\"mapped\\\",[118]],[[119960,119960],\\\"mapped\\\",[119]],[[119961,119961],\\\"mapped\\\",[120]],[[119962,119962],\\\"mapped\\\",[121]],[[119963,119963],\\\"mapped\\\",[122]],[[119964,119964],\\\"mapped\\\",[97]],[[119965,119965],\\\"disallowed\\\"],[[119966,119966],\\\"mapped\\\",[99]],[[119967,119967],\\\"mapped\\\",[100]],[[119968,119969],\\\"disallowed\\\"],[[119970,119970],\\\"mapped\\\",[103]],[[119971,119972],\\\"disallowed\\\"],[[119973,119973],\\\"mapped\\\",[106]],[[119974,119974],\\\"mapped\\\",[107]],[[119975,119976],\\\"disallowed\\\"],[[119977,119977],\\\"mapped\\\",[110]],[[119978,119978],\\\"mapped\\\",[111]],[[119979,119979],\\\"mapped\\\",[112]],[[119980,119980],\\\"mapped\\\",[113]],[[119981,119981],\\\"disallowed\\\"],[[119982,119982],\\\"mapped\\\",[115]],[[119983,119983],\\\"mapped\\\",[116]],[[119984,119984],\\\"mapped\\\",[117]],[[119985,119985],\\\"mapped\\\",[118]],[[119986,119986],\\\"mapped\\\",[119]],[[119987,119987],\\\"mapped\\\",[120]],[[119988,119988],\\\"mapped\\\",[121]],[[119989,119989],\\\"mapped\\\",[122]],[[119990,119990],\\\"mapped\\\",[97]],[[119991,119991],\\\"mapped\\\",[98]],[[119992,119992],\\\"mapped\\\",[99]],[[119993,119993],\\\"mapped\\\",[100]],[[119994,119994],\\\"disallowed\\\"],[[119995,119995],\\\"mapped\\\",[102]],[[119996,119996],\\\"disallowed\\\"],[[119997,119997],\\\"mapped\\\",[104]],[[119998,119998],\\\"mapped\\\",[105]],[[119999,119999],\\\"mapped\\\",[106]],[[120000,120000],\\\"mapped\\\",[107]],[[120001,120001],\\\"mapped\\\",[108]],[[120002,120002],\\\"mapped\\\",[109]],[[120003,120003],\\\"mapped\\\",[110]],[[120004,120004],\\\"disallowed\\\"],[[120005,120005],\\\"mapped\\\",[112]],[[120006,120006],\\\"mapped\\\",[113]],[[120007,120007],\\\"mapped\\\",[114]],[[120008,120008],\\\"mapped\\\",[115]],[[120009,120009],\\\"mapped\\\",[116]],[[120010,120010],\\\"mapped\\\",[117]],[[120011,120011],\\\"mapped\
\\",[118]],[[120012,120012],\\\"mapped\\\",[119]],[[120013,120013],\\\"mapped\\\",[120]],[[120014,120014],\\\"mapped\\\",[121]],[[120015,120015],\\\"mapped\\\",[122]],[[120016,120016],\\\"mapped\\\",[97]],[[120017,120017],\\\"mapped\\\",[98]],[[120018,120018],\\\"mapped\\\",[99]],[[120019,120019],\\\"mapped\\\",[100]],[[120020,120020],\\\"mapped\\\",[101]],[[120021,120021],\\\"mapped\\\",[102]],[[120022,120022],\\\"mapped\\\",[103]],[[120023,120023],\\\"mapped\\\",[104]],[[120024,120024],\\\"mapped\\\",[105]],[[120025,120025],\\\"mapped\\\",[106]],[[120026,120026],\\\"mapped\\\",[107]],[[120027,120027],\\\"mapped\\\",[108]],[[120028,120028],\\\"mapped\\\",[109]],[[120029,120029],\\\"mapped\\\",[110]],[[120030,120030],\\\"mapped\\\",[111]],[[120031,120031],\\\"mapped\\\",[112]],[[120032,120032],\\\"mapped\\\",[113]],[[120033,120033],\\\"mapped\\\",[114]],[[120034,120034],\\\"mapped\\\",[115]],[[120035,120035],\\\"mapped\\\",[116]],[[120036,120036],\\\"mapped\\\",[117]],[[120037,120037],\\\"mapped\\\",[118]],[[120038,120038],\\\"mapped\\\",[119]],[[120039,120039],\\\"mapped\\\",[120]],[[120040,120040],\\\"mapped\\\",[121]],[[120041,120041],\\\"mapped\\\",[122]],[[120042,120042],\\\"mapped\\\",[97]],[[120043,120043],\\\"mapped\\\",[98]],[[120044,120044],\\\"mapped\\\",[99]],[[120045,120045],\\\"mapped\\\",[100]],[[120046,120046],\\\"mapped\\\",[101]],[[120047,120047],\\\"mapped\\\",[102]],[[120048,120048],\\\"mapped\\\",[103]],[[120049,120049],\\\"mapped\\\",[104]],[[120050,120050],\\\"mapped\\\",[105]],[[120051,120051],\\\"mapped\\\",[106]],[[120052,120052],\\\"mapped\\\",[107]],[[120053,120053],\\\"mapped\\\",[108]],[[120054,120054],\\\"mapped\\\",[109]],[[120055,120055],\\\"mapped\\\",[110]],[[120056,120056],\\\"mapped\\\",[111]],[[120057,120057],\\\"mapped\\\",[112]],[[120058,120058],\\\"mapped\\\",[113]],[[120059,120059],\\\"mapped\\\",[114]],[[120060,120060],\\\"mapped\\\",[115]],[[120061,120061],\\\"mapped\\\",[116]],[[120062,120062],\\\"mapped\\\",[117]],[[1200
63,120063],\\\"mapped\\\",[118]],[[120064,120064],\\\"mapped\\\",[119]],[[120065,120065],\\\"mapped\\\",[120]],[[120066,120066],\\\"mapped\\\",[121]],[[120067,120067],\\\"mapped\\\",[122]],[[120068,120068],\\\"mapped\\\",[97]],[[120069,120069],\\\"mapped\\\",[98]],[[120070,120070],\\\"disallowed\\\"],[[120071,120071],\\\"mapped\\\",[100]],[[120072,120072],\\\"mapped\\\",[101]],[[120073,120073],\\\"mapped\\\",[102]],[[120074,120074],\\\"mapped\\\",[103]],[[120075,120076],\\\"disallowed\\\"],[[120077,120077],\\\"mapped\\\",[106]],[[120078,120078],\\\"mapped\\\",[107]],[[120079,120079],\\\"mapped\\\",[108]],[[120080,120080],\\\"mapped\\\",[109]],[[120081,120081],\\\"mapped\\\",[110]],[[120082,120082],\\\"mapped\\\",[111]],[[120083,120083],\\\"mapped\\\",[112]],[[120084,120084],\\\"mapped\\\",[113]],[[120085,120085],\\\"disallowed\\\"],[[120086,120086],\\\"mapped\\\",[115]],[[120087,120087],\\\"mapped\\\",[116]],[[120088,120088],\\\"mapped\\\",[117]],[[120089,120089],\\\"mapped\\\",[118]],[[120090,120090],\\\"mapped\\\",[119]],[[120091,120091],\\\"mapped\\\",[120]],[[120092,120092],\\\"mapped\\\",[121]],[[120093,120093],\\\"disallowed\\\"],[[120094,120094],\\\"mapped\\\",[97]],[[120095,120095],\\\"mapped\\\",[98]],[[120096,120096],\\\"mapped\\\",[99]],[[120097,120097],\\\"mapped\\\",[100]],[[120098,120098],\\\"mapped\\\",[101]],[[120099,120099],\\\"mapped\\\",[102]],[[120100,120100],\\\"mapped\\\",[103]],[[120101,120101],\\\"mapped\\\",[104]],[[120102,120102],\\\"mapped\\\",[105]],[[120103,120103],\\\"mapped\\\",[106]],[[120104,120104],\\\"mapped\\\",[107]],[[120105,120105],\\\"mapped\\\",[108]],[[120106,120106],\\\"mapped\\\",[109]],[[120107,120107],\\\"mapped\\\",[110]],[[120108,120108],\\\"mapped\\\",[111]],[[120109,120109],\\\"mapped\\\",[112]],[[120110,120110],\\\"mapped\\\",[113]],[[120111,120111],\\\"mapped\\\",[114]],[[120112,120112],\\\"mapped\\\",[115]],[[120113,120113],\\\"mapped\\\",[116]],[[120114,120114],\\\"mapped\\\",[117]],[[120115,120115],\\\"mapped\\\
",[118]],[[120116,120116],\\\"mapped\\\",[119]],[[120117,120117],\\\"mapped\\\",[120]],[[120118,120118],\\\"mapped\\\",[121]],[[120119,120119],\\\"mapped\\\",[122]],[[120120,120120],\\\"mapped\\\",[97]],[[120121,120121],\\\"mapped\\\",[98]],[[120122,120122],\\\"disallowed\\\"],[[120123,120123],\\\"mapped\\\",[100]],[[120124,120124],\\\"mapped\\\",[101]],[[120125,120125],\\\"mapped\\\",[102]],[[120126,120126],\\\"mapped\\\",[103]],[[120127,120127],\\\"disallowed\\\"],[[120128,120128],\\\"mapped\\\",[105]],[[120129,120129],\\\"mapped\\\",[106]],[[120130,120130],\\\"mapped\\\",[107]],[[120131,120131],\\\"mapped\\\",[108]],[[120132,120132],\\\"mapped\\\",[109]],[[120133,120133],\\\"disallowed\\\"],[[120134,120134],\\\"mapped\\\",[111]],[[120135,120137],\\\"disallowed\\\"],[[120138,120138],\\\"mapped\\\",[115]],[[120139,120139],\\\"mapped\\\",[116]],[[120140,120140],\\\"mapped\\\",[117]],[[120141,120141],\\\"mapped\\\",[118]],[[120142,120142],\\\"mapped\\\",[119]],[[120143,120143],\\\"mapped\\\",[120]],[[120144,120144],\\\"mapped\\\",[121]],[[120145,120145],\\\"disallowed\\\"],[[120146,120146],\\\"mapped\\\",[97]],[[120147,120147],\\\"mapped\\\",[98]],[[120148,120148],\\\"mapped\\\",[99]],[[120149,120149],\\\"mapped\\\",[100]],[[120150,120150],\\\"mapped\\\",[101]],[[120151,120151],\\\"mapped\\\",[102]],[[120152,120152],\\\"mapped\\\",[103]],[[120153,120153],\\\"mapped\\\",[104]],[[120154,120154],\\\"mapped\\\",[105]],[[120155,120155],\\\"mapped\\\",[106]],[[120156,120156],\\\"mapped\\\",[107]],[[120157,120157],\\\"mapped\\\",[108]],[[120158,120158],\\\"mapped\\\",[109]],[[120159,120159],\\\"mapped\\\",[110]],[[120160,120160],\\\"mapped\\\",[111]],[[120161,120161],\\\"mapped\\\",[112]],[[120162,120162],\\\"mapped\\\",[113]],[[120163,120163],\\\"mapped\\\",[114]],[[120164,120164],\\\"mapped\\\",[115]],[[120165,120165],\\\"mapped\\\",[116]],[[120166,120166],\\\"mapped\\\",[117]],[[120167,120167],\\\"mapped\\\",[118]],[[120168,120168],\\\"mapped\\\",[119]],[[120169,120169],
\\\"mapped\\\",[120]],[[120170,120170],\\\"mapped\\\",[121]],[[120171,120171],\\\"mapped\\\",[122]],[[120172,120172],\\\"mapped\\\",[97]],[[120173,120173],\\\"mapped\\\",[98]],[[120174,120174],\\\"mapped\\\",[99]],[[120175,120175],\\\"mapped\\\",[100]],[[120176,120176],\\\"mapped\\\",[101]],[[120177,120177],\\\"mapped\\\",[102]],[[120178,120178],\\\"mapped\\\",[103]],[[120179,120179],\\\"mapped\\\",[104]],[[120180,120180],\\\"mapped\\\",[105]],[[120181,120181],\\\"mapped\\\",[106]],[[120182,120182],\\\"mapped\\\",[107]],[[120183,120183],\\\"mapped\\\",[108]],[[120184,120184],\\\"mapped\\\",[109]],[[120185,120185],\\\"mapped\\\",[110]],[[120186,120186],\\\"mapped\\\",[111]],[[120187,120187],\\\"mapped\\\",[112]],[[120188,120188],\\\"mapped\\\",[113]],[[120189,120189],\\\"mapped\\\",[114]],[[120190,120190],\\\"mapped\\\",[115]],[[120191,120191],\\\"mapped\\\",[116]],[[120192,120192],\\\"mapped\\\",[117]],[[120193,120193],\\\"mapped\\\",[118]],[[120194,120194],\\\"mapped\\\",[119]],[[120195,120195],\\\"mapped\\\",[120]],[[120196,120196],\\\"mapped\\\",[121]],[[120197,120197],\\\"mapped\\\",[122]],[[120198,120198],\\\"mapped\\\",[97]],[[120199,120199],\\\"mapped\\\",[98]],[[120200,120200],\\\"mapped\\\",[99]],[[120201,120201],\\\"mapped\\\",[100]],[[120202,120202],\\\"mapped\\\",[101]],[[120203,120203],\\\"mapped\\\",[102]],[[120204,120204],\\\"mapped\\\",[103]],[[120205,120205],\\\"mapped\\\",[104]],[[120206,120206],\\\"mapped\\\",[105]],[[120207,120207],\\\"mapped\\\",[106]],[[120208,120208],\\\"mapped\\\",[107]],[[120209,120209],\\\"mapped\\\",[108]],[[120210,120210],\\\"mapped\\\",[109]],[[120211,120211],\\\"mapped\\\",[110]],[[120212,120212],\\\"mapped\\\",[111]],[[120213,120213],\\\"mapped\\\",[112]],[[120214,120214],\\\"mapped\\\",[113]],[[120215,120215],\\\"mapped\\\",[114]],[[120216,120216],\\\"mapped\\\",[115]],[[120217,120217],\\\"mapped\\\",[116]],[[120218,120218],\\\"mapped\\\",[117]],[[120219,120219],\\\"mapped\\\",[118]],[[120220,120220],\\\"mapped\\\",[1
19]],[[120221,120221],\\\"mapped\\\",[120]],[[120222,120222],\\\"mapped\\\",[121]],[[120223,120223],\\\"mapped\\\",[122]],[[120224,120224],\\\"mapped\\\",[97]],[[120225,120225],\\\"mapped\\\",[98]],[[120226,120226],\\\"mapped\\\",[99]],[[120227,120227],\\\"mapped\\\",[100]],[[120228,120228],\\\"mapped\\\",[101]],[[120229,120229],\\\"mapped\\\",[102]],[[120230,120230],\\\"mapped\\\",[103]],[[120231,120231],\\\"mapped\\\",[104]],[[120232,120232],\\\"mapped\\\",[105]],[[120233,120233],\\\"mapped\\\",[106]],[[120234,120234],\\\"mapped\\\",[107]],[[120235,120235],\\\"mapped\\\",[108]],[[120236,120236],\\\"mapped\\\",[109]],[[120237,120237],\\\"mapped\\\",[110]],[[120238,120238],\\\"mapped\\\",[111]],[[120239,120239],\\\"mapped\\\",[112]],[[120240,120240],\\\"mapped\\\",[113]],[[120241,120241],\\\"mapped\\\",[114]],[[120242,120242],\\\"mapped\\\",[115]],[[120243,120243],\\\"mapped\\\",[116]],[[120244,120244],\\\"mapped\\\",[117]],[[120245,120245],\\\"mapped\\\",[118]],[[120246,120246],\\\"mapped\\\",[119]],[[120247,120247],\\\"mapped\\\",[120]],[[120248,120248],\\\"mapped\\\",[121]],[[120249,120249],\\\"mapped\\\",[122]],[[120250,120250],\\\"mapped\\\",[97]],[[120251,120251],\\\"mapped\\\",[98]],[[120252,120252],\\\"mapped\\\",[99]],[[120253,120253],\\\"mapped\\\",[100]],[[120254,120254],\\\"mapped\\\",[101]],[[120255,120255],\\\"mapped\\\",[102]],[[120256,120256],\\\"mapped\\\",[103]],[[120257,120257],\\\"mapped\\\",[104]],[[120258,120258],\\\"mapped\\\",[105]],[[120259,120259],\\\"mapped\\\",[106]],[[120260,120260],\\\"mapped\\\",[107]],[[120261,120261],\\\"mapped\\\",[108]],[[120262,120262],\\\"mapped\\\",[109]],[[120263,120263],\\\"mapped\\\",[110]],[[120264,120264],\\\"mapped\\\",[111]],[[120265,120265],\\\"mapped\\\",[112]],[[120266,120266],\\\"mapped\\\",[113]],[[120267,120267],\\\"mapped\\\",[114]],[[120268,120268],\\\"mapped\\\",[115]],[[120269,120269],\\\"mapped\\\",[116]],[[120270,120270],\\\"mapped\\\",[117]],[[120271,120271],\\\"mapped\\\",[118]],[[120272,120
272],\\\"mapped\\\",[119]],[[120273,120273],\\\"mapped\\\",[120]],[[120274,120274],\\\"mapped\\\",[121]],[[120275,120275],\\\"mapped\\\",[122]],[[120276,120276],\\\"mapped\\\",[97]],[[120277,120277],\\\"mapped\\\",[98]],[[120278,120278],\\\"mapped\\\",[99]],[[120279,120279],\\\"mapped\\\",[100]],[[120280,120280],\\\"mapped\\\",[101]],[[120281,120281],\\\"mapped\\\",[102]],[[120282,120282],\\\"mapped\\\",[103]],[[120283,120283],\\\"mapped\\\",[104]],[[120284,120284],\\\"mapped\\\",[105]],[[120285,120285],\\\"mapped\\\",[106]],[[120286,120286],\\\"mapped\\\",[107]],[[120287,120287],\\\"mapped\\\",[108]],[[120288,120288],\\\"mapped\\\",[109]],[[120289,120289],\\\"mapped\\\",[110]],[[120290,120290],\\\"mapped\\\",[111]],[[120291,120291],\\\"mapped\\\",[112]],[[120292,120292],\\\"mapped\\\",[113]],[[120293,120293],\\\"mapped\\\",[114]],[[120294,120294],\\\"mapped\\\",[115]],[[120295,120295],\\\"mapped\\\",[116]],[[120296,120296],\\\"mapped\\\",[117]],[[120297,120297],\\\"mapped\\\",[118]],[[120298,120298],\\\"mapped\\\",[119]],[[120299,120299],\\\"mapped\\\",[120]],[[120300,120300],\\\"mapped\\\",[121]],[[120301,120301],\\\"mapped\\\",[122]],[[120302,120302],\\\"mapped\\\",[97]],[[120303,120303],\\\"mapped\\\",[98]],[[120304,120304],\\\"mapped\\\",[99]],[[120305,120305],\\\"mapped\\\",[100]],[[120306,120306],\\\"mapped\\\",[101]],[[120307,120307],\\\"mapped\\\",[102]],[[120308,120308],\\\"mapped\\\",[103]],[[120309,120309],\\\"mapped\\\",[104]],[[120310,120310],\\\"mapped\\\",[105]],[[120311,120311],\\\"mapped\\\",[106]],[[120312,120312],\\\"mapped\\\",[107]],[[120313,120313],\\\"mapped\\\",[108]],[[120314,120314],\\\"mapped\\\",[109]],[[120315,120315],\\\"mapped\\\",[110]],[[120316,120316],\\\"mapped\\\",[111]],[[120317,120317],\\\"mapped\\\",[112]],[[120318,120318],\\\"mapped\\\",[113]],[[120319,120319],\\\"mapped\\\",[114]],[[120320,120320],\\\"mapped\\\",[115]],[[120321,120321],\\\"mapped\\\",[116]],[[120322,120322],\\\"mapped\\\",[117]],[[120323,120323],\\\"mapped\\
\",[118]],[[120324,120324],\\\"mapped\\\",[119]],[[120325,120325],\\\"mapped\\\",[120]],[[120326,120326],\\\"mapped\\\",[121]],[[120327,120327],\\\"mapped\\\",[122]],[[120328,120328],\\\"mapped\\\",[97]],[[120329,120329],\\\"mapped\\\",[98]],[[120330,120330],\\\"mapped\\\",[99]],[[120331,120331],\\\"mapped\\\",[100]],[[120332,120332],\\\"mapped\\\",[101]],[[120333,120333],\\\"mapped\\\",[102]],[[120334,120334],\\\"mapped\\\",[103]],[[120335,120335],\\\"mapped\\\",[104]],[[120336,120336],\\\"mapped\\\",[105]],[[120337,120337],\\\"mapped\\\",[106]],[[120338,120338],\\\"mapped\\\",[107]],[[120339,120339],\\\"mapped\\\",[108]],[[120340,120340],\\\"mapped\\\",[109]],[[120341,120341],\\\"mapped\\\",[110]],[[120342,120342],\\\"mapped\\\",[111]],[[120343,120343],\\\"mapped\\\",[112]],[[120344,120344],\\\"mapped\\\",[113]],[[120345,120345],\\\"mapped\\\",[114]],[[120346,120346],\\\"mapped\\\",[115]],[[120347,120347],\\\"mapped\\\",[116]],[[120348,120348],\\\"mapped\\\",[117]],[[120349,120349],\\\"mapped\\\",[118]],[[120350,120350],\\\"mapped\\\",[119]],[[120351,120351],\\\"mapped\\\",[120]],[[120352,120352],\\\"mapped\\\",[121]],[[120353,120353],\\\"mapped\\\",[122]],[[120354,120354],\\\"mapped\\\",[97]],[[120355,120355],\\\"mapped\\\",[98]],[[120356,120356],\\\"mapped\\\",[99]],[[120357,120357],\\\"mapped\\\",[100]],[[120358,120358],\\\"mapped\\\",[101]],[[120359,120359],\\\"mapped\\\",[102]],[[120360,120360],\\\"mapped\\\",[103]],[[120361,120361],\\\"mapped\\\",[104]],[[120362,120362],\\\"mapped\\\",[105]],[[120363,120363],\\\"mapped\\\",[106]],[[120364,120364],\\\"mapped\\\",[107]],[[120365,120365],\\\"mapped\\\",[108]],[[120366,120366],\\\"mapped\\\",[109]],[[120367,120367],\\\"mapped\\\",[110]],[[120368,120368],\\\"mapped\\\",[111]],[[120369,120369],\\\"mapped\\\",[112]],[[120370,120370],\\\"mapped\\\",[113]],[[120371,120371],\\\"mapped\\\",[114]],[[120372,120372],\\\"mapped\\\",[115]],[[120373,120373],\\\"mapped\\\",[116]],[[120374,120374],\\\"mapped\\\",[117]],[[12037
5,120375],\\\"mapped\\\",[118]],[[120376,120376],\\\"mapped\\\",[119]],[[120377,120377],\\\"mapped\\\",[120]],[[120378,120378],\\\"mapped\\\",[121]],[[120379,120379],\\\"mapped\\\",[122]],[[120380,120380],\\\"mapped\\\",[97]],[[120381,120381],\\\"mapped\\\",[98]],[[120382,120382],\\\"mapped\\\",[99]],[[120383,120383],\\\"mapped\\\",[100]],[[120384,120384],\\\"mapped\\\",[101]],[[120385,120385],\\\"mapped\\\",[102]],[[120386,120386],\\\"mapped\\\",[103]],[[120387,120387],\\\"mapped\\\",[104]],[[120388,120388],\\\"mapped\\\",[105]],[[120389,120389],\\\"mapped\\\",[106]],[[120390,120390],\\\"mapped\\\",[107]],[[120391,120391],\\\"mapped\\\",[108]],[[120392,120392],\\\"mapped\\\",[109]],[[120393,120393],\\\"mapped\\\",[110]],[[120394,120394],\\\"mapped\\\",[111]],[[120395,120395],\\\"mapped\\\",[112]],[[120396,120396],\\\"mapped\\\",[113]],[[120397,120397],\\\"mapped\\\",[114]],[[120398,120398],\\\"mapped\\\",[115]],[[120399,120399],\\\"mapped\\\",[116]],[[120400,120400],\\\"mapped\\\",[117]],[[120401,120401],\\\"mapped\\\",[118]],[[120402,120402],\\\"mapped\\\",[119]],[[120403,120403],\\\"mapped\\\",[120]],[[120404,120404],\\\"mapped\\\",[121]],[[120405,120405],\\\"mapped\\\",[122]],[[120406,120406],\\\"mapped\\\",[97]],[[120407,120407],\\\"mapped\\\",[98]],[[120408,120408],\\\"mapped\\\",[99]],[[120409,120409],\\\"mapped\\\",[100]],[[120410,120410],\\\"mapped\\\",[101]],[[120411,120411],\\\"mapped\\\",[102]],[[120412,120412],\\\"mapped\\\",[103]],[[120413,120413],\\\"mapped\\\",[104]],[[120414,120414],\\\"mapped\\\",[105]],[[120415,120415],\\\"mapped\\\",[106]],[[120416,120416],\\\"mapped\\\",[107]],[[120417,120417],\\\"mapped\\\",[108]],[[120418,120418],\\\"mapped\\\",[109]],[[120419,120419],\\\"mapped\\\",[110]],[[120420,120420],\\\"mapped\\\",[111]],[[120421,120421],\\\"mapped\\\",[112]],[[120422,120422],\\\"mapped\\\",[113]],[[120423,120423],\\\"mapped\\\",[114]],[[120424,120424],\\\"mapped\\\",[115]],[[120425,120425],\\\"mapped\\\",[116]],[[120426,120426],\\\"map
ped\\\",[117]],[[120427,120427],\\\"mapped\\\",[118]],[[120428,120428],\\\"mapped\\\",[119]],[[120429,120429],\\\"mapped\\\",[120]],[[120430,120430],\\\"mapped\\\",[121]],[[120431,120431],\\\"mapped\\\",[122]],[[120432,120432],\\\"mapped\\\",[97]],[[120433,120433],\\\"mapped\\\",[98]],[[120434,120434],\\\"mapped\\\",[99]],[[120435,120435],\\\"mapped\\\",[100]],[[120436,120436],\\\"mapped\\\",[101]],[[120437,120437],\\\"mapped\\\",[102]],[[120438,120438],\\\"mapped\\\",[103]],[[120439,120439],\\\"mapped\\\",[104]],[[120440,120440],\\\"mapped\\\",[105]],[[120441,120441],\\\"mapped\\\",[106]],[[120442,120442],\\\"mapped\\\",[107]],[[120443,120443],\\\"mapped\\\",[108]],[[120444,120444],\\\"mapped\\\",[109]],[[120445,120445],\\\"mapped\\\",[110]],[[120446,120446],\\\"mapped\\\",[111]],[[120447,120447],\\\"mapped\\\",[112]],[[120448,120448],\\\"mapped\\\",[113]],[[120449,120449],\\\"mapped\\\",[114]],[[120450,120450],\\\"mapped\\\",[115]],[[120451,120451],\\\"mapped\\\",[116]],[[120452,120452],\\\"mapped\\\",[117]],[[120453,120453],\\\"mapped\\\",[118]],[[120454,120454],\\\"mapped\\\",[119]],[[120455,120455],\\\"mapped\\\",[120]],[[120456,120456],\\\"mapped\\\",[121]],[[120457,120457],\\\"mapped\\\",[122]],[[120458,120458],\\\"mapped\\\",[97]],[[120459,120459],\\\"mapped\\\",[98]],[[120460,120460],\\\"mapped\\\",[99]],[[120461,120461],\\\"mapped\\\",[100]],[[120462,120462],\\\"mapped\\\",[101]],[[120463,120463],\\\"mapped\\\",[102]],[[120464,120464],\\\"mapped\\\",[103]],[[120465,120465],\\\"mapped\\\",[104]],[[120466,120466],\\\"mapped\\\",[105]],[[120467,120467],\\\"mapped\\\",[106]],[[120468,120468],\\\"mapped\\\",[107]],[[120469,120469],\\\"mapped\\\",[108]],[[120470,120470],\\\"mapped\\\",[109]],[[120471,120471],\\\"mapped\\\",[110]],[[120472,120472],\\\"mapped\\\",[111]],[[120473,120473],\\\"mapped\\\",[112]],[[120474,120474],\\\"mapped\\\",[113]],[[120475,120475],\\\"mapped\\\",[114]],[[120476,120476],\\\"mapped\\\",[115]],[[120477,120477],\\\"mapped\\\",[116]],[[
120478,120478],\\\"mapped\\\",[117]],[[120479,120479],\\\"mapped\\\",[118]],[[120480,120480],\\\"mapped\\\",[119]],[[120481,120481],\\\"mapped\\\",[120]],[[120482,120482],\\\"mapped\\\",[121]],[[120483,120483],\\\"mapped\\\",[122]],[[120484,120484],\\\"mapped\\\",[305]],[[120485,120485],\\\"mapped\\\",[567]],[[120486,120487],\\\"disallowed\\\"],[[120488,120488],\\\"mapped\\\",[945]],[[120489,120489],\\\"mapped\\\",[946]],[[120490,120490],\\\"mapped\\\",[947]],[[120491,120491],\\\"mapped\\\",[948]],[[120492,120492],\\\"mapped\\\",[949]],[[120493,120493],\\\"mapped\\\",[950]],[[120494,120494],\\\"mapped\\\",[951]],[[120495,120495],\\\"mapped\\\",[952]],[[120496,120496],\\\"mapped\\\",[953]],[[120497,120497],\\\"mapped\\\",[954]],[[120498,120498],\\\"mapped\\\",[955]],[[120499,120499],\\\"mapped\\\",[956]],[[120500,120500],\\\"mapped\\\",[957]],[[120501,120501],\\\"mapped\\\",[958]],[[120502,120502],\\\"mapped\\\",[959]],[[120503,120503],\\\"mapped\\\",[960]],[[120504,120504],\\\"mapped\\\",[961]],[[120505,120505],\\\"mapped\\\",[952]],[[120506,120506],\\\"mapped\\\",[963]],[[120507,120507],\\\"mapped\\\",[964]],[[120508,120508],\\\"mapped\\\",[965]],[[120509,120509],\\\"mapped\\\",[966]],[[120510,120510],\\\"mapped\\\",[967]],[[120511,120511],\\\"mapped\\\",[968]],[[120512,120512],\\\"mapped\\\",[969]],[[120513,120513],\\\"mapped\\\",[8711]],[[120514,120514],\\\"mapped\\\",[945]],[[120515,120515],\\\"mapped\\\",[946]],[[120516,120516],\\\"mapped\\\",[947]],[[120517,120517],\\\"mapped\\\",[948]],[[120518,120518],\\\"mapped\\\",[949]],[[120519,120519],\\\"mapped\\\",[950]],[[120520,120520],\\\"mapped\\\",[951]],[[120521,120521],\\\"mapped\\\",[952]],[[120522,120522],\\\"mapped\\\",[953]],[[120523,120523],\\\"mapped\\\",[954]],[[120524,120524],\\\"mapped\\\",[955]],[[120525,120525],\\\"mapped\\\",[956]],[[120526,120526],\\\"mapped\\\",[957]],[[120527,120527],\\\"mapped\\\",[958]],[[120528,120528],\\\"mapped\\\",[959]],[[120529,120529],\\\"mapped\\\",[960]],[[120530,12053
0],\\\"mapped\\\",[961]],[[120531,120532],\\\"mapped\\\",[963]],[[120533,120533],\\\"mapped\\\",[964]],[[120534,120534],\\\"mapped\\\",[965]],[[120535,120535],\\\"mapped\\\",[966]],[[120536,120536],\\\"mapped\\\",[967]],[[120537,120537],\\\"mapped\\\",[968]],[[120538,120538],\\\"mapped\\\",[969]],[[120539,120539],\\\"mapped\\\",[8706]],[[120540,120540],\\\"mapped\\\",[949]],[[120541,120541],\\\"mapped\\\",[952]],[[120542,120542],\\\"mapped\\\",[954]],[[120543,120543],\\\"mapped\\\",[966]],[[120544,120544],\\\"mapped\\\",[961]],[[120545,120545],\\\"mapped\\\",[960]],[[120546,120546],\\\"mapped\\\",[945]],[[120547,120547],\\\"mapped\\\",[946]],[[120548,120548],\\\"mapped\\\",[947]],[[120549,120549],\\\"mapped\\\",[948]],[[120550,120550],\\\"mapped\\\",[949]],[[120551,120551],\\\"mapped\\\",[950]],[[120552,120552],\\\"mapped\\\",[951]],[[120553,120553],\\\"mapped\\\",[952]],[[120554,120554],\\\"mapped\\\",[953]],[[120555,120555],\\\"mapped\\\",[954]],[[120556,120556],\\\"mapped\\\",[955]],[[120557,120557],\\\"mapped\\\",[956]],[[120558,120558],\\\"mapped\\\",[957]],[[120559,120559],\\\"mapped\\\",[958]],[[120560,120560],\\\"mapped\\\",[959]],[[120561,120561],\\\"mapped\\\",[960]],[[120562,120562],\\\"mapped\\\",[961]],[[120563,120563],\\\"mapped\\\",[952]],[[120564,120564],\\\"mapped\\\",[963]],[[120565,120565],\\\"mapped\\\",[964]],[[120566,120566],\\\"mapped\\\",[965]],[[120567,120567],\\\"mapped\\\",[966]],[[120568,120568],\\\"mapped\\\",[967]],[[120569,120569],\\\"mapped\\\",[968]],[[120570,120570],\\\"mapped\\\",[969]],[[120571,120571],\\\"mapped\\\",[8711]],[[120572,120572],\\\"mapped\\\",[945]],[[120573,120573],\\\"mapped\\\",[946]],[[120574,120574],\\\"mapped\\\",[947]],[[120575,120575],\\\"mapped\\\",[948]],[[120576,120576],\\\"mapped\\\",[949]],[[120577,120577],\\\"mapped\\\",[950]],[[120578,120578],\\\"mapped\\\",[951]],[[120579,120579],\\\"mapped\\\",[952]],[[120580,120580],\\\"mapped\\\",[953]],[[120581,120581],\\\"mapped\\\",[954]],[[120582,120582],\\\"ma
pped\\\",[955]],[[120583,120583],\\\"mapped\\\",[956]],[[120584,120584],\\\"mapped\\\",[957]],[[120585,120585],\\\"mapped\\\",[958]],[[120586,120586],\\\"mapped\\\",[959]],[[120587,120587],\\\"mapped\\\",[960]],[[120588,120588],\\\"mapped\\\",[961]],[[120589,120590],\\\"mapped\\\",[963]],[[120591,120591],\\\"mapped\\\",[964]],[[120592,120592],\\\"mapped\\\",[965]],[[120593,120593],\\\"mapped\\\",[966]],[[120594,120594],\\\"mapped\\\",[967]],[[120595,120595],\\\"mapped\\\",[968]],[[120596,120596],\\\"mapped\\\",[969]],[[120597,120597],\\\"mapped\\\",[8706]],[[120598,120598],\\\"mapped\\\",[949]],[[120599,120599],\\\"mapped\\\",[952]],[[120600,120600],\\\"mapped\\\",[954]],[[120601,120601],\\\"mapped\\\",[966]],[[120602,120602],\\\"mapped\\\",[961]],[[120603,120603],\\\"mapped\\\",[960]],[[120604,120604],\\\"mapped\\\",[945]],[[120605,120605],\\\"mapped\\\",[946]],[[120606,120606],\\\"mapped\\\",[947]],[[120607,120607],\\\"mapped\\\",[948]],[[120608,120608],\\\"mapped\\\",[949]],[[120609,120609],\\\"mapped\\\",[950]],[[120610,120610],\\\"mapped\\\",[951]],[[120611,120611],\\\"mapped\\\",[952]],[[120612,120612],\\\"mapped\\\",[953]],[[120613,120613],\\\"mapped\\\",[954]],[[120614,120614],\\\"mapped\\\",[955]],[[120615,120615],\\\"mapped\\\",[956]],[[120616,120616],\\\"mapped\\\",[957]],[[120617,120617],\\\"mapped\\\",[958]],[[120618,120618],\\\"mapped\\\",[959]],[[120619,120619],\\\"mapped\\\",[960]],[[120620,120620],\\\"mapped\\\",[961]],[[120621,120621],\\\"mapped\\\",[952]],[[120622,120622],\\\"mapped\\\",[963]],[[120623,120623],\\\"mapped\\\",[964]],[[120624,120624],\\\"mapped\\\",[965]],[[120625,120625],\\\"mapped\\\",[966]],[[120626,120626],\\\"mapped\\\",[967]],[[120627,120627],\\\"mapped\\\",[968]],[[120628,120628],\\\"mapped\\\",[969]],[[120629,120629],\\\"mapped\\\",[8711]],[[120630,120630],\\\"mapped\\\",[945]],[[120631,120631],\\\"mapped\\\",[946]],[[120632,120632],\\\"mapped\\\",[947]],[[120633,120633],\\\"mapped\\\",[948]],[[120634,120634],\\\"mapped\\\",
[949]],[[120635,120635],\\\"mapped\\\",[950]],[[120636,120636],\\\"mapped\\\",[951]],[[120637,120637],\\\"mapped\\\",[952]],[[120638,120638],\\\"mapped\\\",[953]],[[120639,120639],\\\"mapped\\\",[954]],[[120640,120640],\\\"mapped\\\",[955]],[[120641,120641],\\\"mapped\\\",[956]],[[120642,120642],\\\"mapped\\\",[957]],[[120643,120643],\\\"mapped\\\",[958]],[[120644,120644],\\\"mapped\\\",[959]],[[120645,120645],\\\"mapped\\\",[960]],[[120646,120646],\\\"mapped\\\",[961]],[[120647,120648],\\\"mapped\\\",[963]],[[120649,120649],\\\"mapped\\\",[964]],[[120650,120650],\\\"mapped\\\",[965]],[[120651,120651],\\\"mapped\\\",[966]],[[120652,120652],\\\"mapped\\\",[967]],[[120653,120653],\\\"mapped\\\",[968]],[[120654,120654],\\\"mapped\\\",[969]],[[120655,120655],\\\"mapped\\\",[8706]],[[120656,120656],\\\"mapped\\\",[949]],[[120657,120657],\\\"mapped\\\",[952]],[[120658,120658],\\\"mapped\\\",[954]],[[120659,120659],\\\"mapped\\\",[966]],[[120660,120660],\\\"mapped\\\",[961]],[[120661,120661],\\\"mapped\\\",[960]],[[120662,120662],\\\"mapped\\\",[945]],[[120663,120663],\\\"mapped\\\",[946]],[[120664,120664],\\\"mapped\\\",[947]],[[120665,120665],\\\"mapped\\\",[948]],[[120666,120666],\\\"mapped\\\",[949]],[[120667,120667],\\\"mapped\\\",[950]],[[120668,120668],\\\"mapped\\\",[951]],[[120669,120669],\\\"mapped\\\",[952]],[[120670,120670],\\\"mapped\\\",[953]],[[120671,120671],\\\"mapped\\\",[954]],[[120672,120672],\\\"mapped\\\",[955]],[[120673,120673],\\\"mapped\\\",[956]],[[120674,120674],\\\"mapped\\\",[957]],[[120675,120675],\\\"mapped\\\",[958]],[[120676,120676],\\\"mapped\\\",[959]],[[120677,120677],\\\"mapped\\\",[960]],[[120678,120678],\\\"mapped\\\",[961]],[[120679,120679],\\\"mapped\\\",[952]],[[120680,120680],\\\"mapped\\\",[963]],[[120681,120681],\\\"mapped\\\",[964]],[[120682,120682],\\\"mapped\\\",[965]],[[120683,120683],\\\"mapped\\\",[966]],[[120684,120684],\\\"mapped\\\",[967]],[[120685,120685],\\\"mapped\\\",[968]],[[120686,120686],\\\"mapped\\\",[969]],[[1
20687,120687],\\\"mapped\\\",[8711]],[[120688,120688],\\\"mapped\\\",[945]],[[120689,120689],\\\"mapped\\\",[946]],[[120690,120690],\\\"mapped\\\",[947]],[[120691,120691],\\\"mapped\\\",[948]],[[120692,120692],\\\"mapped\\\",[949]],[[120693,120693],\\\"mapped\\\",[950]],[[120694,120694],\\\"mapped\\\",[951]],[[120695,120695],\\\"mapped\\\",[952]],[[120696,120696],\\\"mapped\\\",[953]],[[120697,120697],\\\"mapped\\\",[954]],[[120698,120698],\\\"mapped\\\",[955]],[[120699,120699],\\\"mapped\\\",[956]],[[120700,120700],\\\"mapped\\\",[957]],[[120701,120701],\\\"mapped\\\",[958]],[[120702,120702],\\\"mapped\\\",[959]],[[120703,120703],\\\"mapped\\\",[960]],[[120704,120704],\\\"mapped\\\",[961]],[[120705,120706],\\\"mapped\\\",[963]],[[120707,120707],\\\"mapped\\\",[964]],[[120708,120708],\\\"mapped\\\",[965]],[[120709,120709],\\\"mapped\\\",[966]],[[120710,120710],\\\"mapped\\\",[967]],[[120711,120711],\\\"mapped\\\",[968]],[[120712,120712],\\\"mapped\\\",[969]],[[120713,120713],\\\"mapped\\\",[8706]],[[120714,120714],\\\"mapped\\\",[949]],[[120715,120715],\\\"mapped\\\",[952]],[[120716,120716],\\\"mapped\\\",[954]],[[120717,120717],\\\"mapped\\\",[966]],[[120718,120718],\\\"mapped\\\",[961]],[[120719,120719],\\\"mapped\\\",[960]],[[120720,120720],\\\"mapped\\\",[945]],[[120721,120721],\\\"mapped\\\",[946]],[[120722,120722],\\\"mapped\\\",[947]],[[120723,120723],\\\"mapped\\\",[948]],[[120724,120724],\\\"mapped\\\",[949]],[[120725,120725],\\\"mapped\\\",[950]],[[120726,120726],\\\"mapped\\\",[951]],[[120727,120727],\\\"mapped\\\",[952]],[[120728,120728],\\\"mapped\\\",[953]],[[120729,120729],\\\"mapped\\\",[954]],[[120730,120730],\\\"mapped\\\",[955]],[[120731,120731],\\\"mapped\\\",[956]],[[120732,120732],\\\"mapped\\\",[957]],[[120733,120733],\\\"mapped\\\",[958]],[[120734,120734],\\\"mapped\\\",[959]],[[120735,120735],\\\"mapped\\\",[960]],[[120736,120736],\\\"mapped\\\",[961]],[[120737,120737],\\\"mapped\\\",[952]],[[120738,120738],\\\"mapped\\\",[963]],[[120739,120
739],\\\"mapped\\\",[964]],[[120740,120740],\\\"mapped\\\",[965]],[[120741,120741],\\\"mapped\\\",[966]],[[120742,120742],\\\"mapped\\\",[967]],[[120743,120743],\\\"mapped\\\",[968]],[[120744,120744],\\\"mapped\\\",[969]],[[120745,120745],\\\"mapped\\\",[8711]],[[120746,120746],\\\"mapped\\\",[945]],[[120747,120747],\\\"mapped\\\",[946]],[[120748,120748],\\\"mapped\\\",[947]],[[120749,120749],\\\"mapped\\\",[948]],[[120750,120750],\\\"mapped\\\",[949]],[[120751,120751],\\\"mapped\\\",[950]],[[120752,120752],\\\"mapped\\\",[951]],[[120753,120753],\\\"mapped\\\",[952]],[[120754,120754],\\\"mapped\\\",[953]],[[120755,120755],\\\"mapped\\\",[954]],[[120756,120756],\\\"mapped\\\",[955]],[[120757,120757],\\\"mapped\\\",[956]],[[120758,120758],\\\"mapped\\\",[957]],[[120759,120759],\\\"mapped\\\",[958]],[[120760,120760],\\\"mapped\\\",[959]],[[120761,120761],\\\"mapped\\\",[960]],[[120762,120762],\\\"mapped\\\",[961]],[[120763,120764],\\\"mapped\\\",[963]],[[120765,120765],\\\"mapped\\\",[964]],[[120766,120766],\\\"mapped\\\",[965]],[[120767,120767],\\\"mapped\\\",[966]],[[120768,120768],\\\"mapped\\\",[967]],[[120769,120769],\\\"mapped\\\",[968]],[[120770,120770],\\\"mapped\\\",[969]],[[120771,120771],\\\"mapped\\\",[8706]],[[120772,120772],\\\"mapped\\\",[949]],[[120773,120773],\\\"mapped\\\",[952]],[[120774,120774],\\\"mapped\\\",[954]],[[120775,120775],\\\"mapped\\\",[966]],[[120776,120776],\\\"mapped\\\",[961]],[[120777,120777],\\\"mapped\\\",[960]],[[120778,120779],\\\"mapped\\\",[989]],[[120780,120781],\\\"disallowed\\\"],[[120782,120782],\\\"mapped\\\",[48]],[[120783,120783],\\\"mapped\\\",[49]],[[120784,120784],\\\"mapped\\\",[50]],[[120785,120785],\\\"mapped\\\",[51]],[[120786,120786],\\\"mapped\\\",[52]],[[120787,120787],\\\"mapped\\\",[53]],[[120788,120788],\\\"mapped\\\",[54]],[[120789,120789],\\\"mapped\\\",[55]],[[120790,120790],\\\"mapped\\\",[56]],[[120791,120791],\\\"mapped\\\",[57]],[[120792,120792],\\\"mapped\\\",[48]],[[120793,120793],\\\"mapped\\\",[4
9]],[[120794,120794],\\\"mapped\\\",[50]],[[120795,120795],\\\"mapped\\\",[51]],[[120796,120796],\\\"mapped\\\",[52]],[[120797,120797],\\\"mapped\\\",[53]],[[120798,120798],\\\"mapped\\\",[54]],[[120799,120799],\\\"mapped\\\",[55]],[[120800,120800],\\\"mapped\\\",[56]],[[120801,120801],\\\"mapped\\\",[57]],[[120802,120802],\\\"mapped\\\",[48]],[[120803,120803],\\\"mapped\\\",[49]],[[120804,120804],\\\"mapped\\\",[50]],[[120805,120805],\\\"mapped\\\",[51]],[[120806,120806],\\\"mapped\\\",[52]],[[120807,120807],\\\"mapped\\\",[53]],[[120808,120808],\\\"mapped\\\",[54]],[[120809,120809],\\\"mapped\\\",[55]],[[120810,120810],\\\"mapped\\\",[56]],[[120811,120811],\\\"mapped\\\",[57]],[[120812,120812],\\\"mapped\\\",[48]],[[120813,120813],\\\"mapped\\\",[49]],[[120814,120814],\\\"mapped\\\",[50]],[[120815,120815],\\\"mapped\\\",[51]],[[120816,120816],\\\"mapped\\\",[52]],[[120817,120817],\\\"mapped\\\",[53]],[[120818,120818],\\\"mapped\\\",[54]],[[120819,120819],\\\"mapped\\\",[55]],[[120820,120820],\\\"mapped\\\",[56]],[[120821,120821],\\\"mapped\\\",[57]],[[120822,120822],\\\"mapped\\\",[48]],[[120823,120823],\\\"mapped\\\",[49]],[[120824,120824],\\\"mapped\\\",[50]],[[120825,120825],\\\"mapped\\\",[51]],[[120826,120826],\\\"mapped\\\",[52]],[[120827,120827],\\\"mapped\\\",[53]],[[120828,120828],\\\"mapped\\\",[54]],[[120829,120829],\\\"mapped\\\",[55]],[[120830,120830],\\\"mapped\\\",[56]],[[120831,120831],\\\"mapped\\\",[57]],[[120832,121343],\\\"valid\\\",[],\\\"NV8\\\"],[[121344,121398],\\\"valid\\\"],[[121399,121402],\\\"valid\\\",[],\\\"NV8\\\"],[[121403,121452],\\\"valid\\\"],[[121453,121460],\\\"valid\\\",[],\\\"NV8\\\"],[[121461,121461],\\\"valid\\\"],[[121462,121475],\\\"valid\\\",[],\\\"NV8\\\"],[[121476,121476],\\\"valid\\\"],[[121477,121483],\\\"valid\\\",[],\\\"NV8\\\"],[[121484,121498],\\\"disallowed\\\"],[[121499,121503],\\\"valid\\\"],[[121504,121504],\\\"disallowed\\\"],[[121505,121519],\\\"valid\\\"],[[121520,124927],\\\"disallowed\\\"],[[124928,12512
4],\\\"valid\\\"],[[125125,125126],\\\"disallowed\\\"],[[125127,125135],\\\"valid\\\",[],\\\"NV8\\\"],[[125136,125142],\\\"valid\\\"],[[125143,126463],\\\"disallowed\\\"],[[126464,126464],\\\"mapped\\\",[1575]],[[126465,126465],\\\"mapped\\\",[1576]],[[126466,126466],\\\"mapped\\\",[1580]],[[126467,126467],\\\"mapped\\\",[1583]],[[126468,126468],\\\"disallowed\\\"],[[126469,126469],\\\"mapped\\\",[1608]],[[126470,126470],\\\"mapped\\\",[1586]],[[126471,126471],\\\"mapped\\\",[1581]],[[126472,126472],\\\"mapped\\\",[1591]],[[126473,126473],\\\"mapped\\\",[1610]],[[126474,126474],\\\"mapped\\\",[1603]],[[126475,126475],\\\"mapped\\\",[1604]],[[126476,126476],\\\"mapped\\\",[1605]],[[126477,126477],\\\"mapped\\\",[1606]],[[126478,126478],\\\"mapped\\\",[1587]],[[126479,126479],\\\"mapped\\\",[1593]],[[126480,126480],\\\"mapped\\\",[1601]],[[126481,126481],\\\"mapped\\\",[1589]],[[126482,126482],\\\"mapped\\\",[1602]],[[126483,126483],\\\"mapped\\\",[1585]],[[126484,126484],\\\"mapped\\\",[1588]],[[126485,126485],\\\"mapped\\\",[1578]],[[126486,126486],\\\"mapped\\\",[1579]],[[126487,126487],\\\"mapped\\\",[1582]],[[126488,126488],\\\"mapped\\\",[1584]],[[126489,126489],\\\"mapped\\\",[1590]],[[126490,126490],\\\"mapped\\\",[1592]],[[126491,126491],\\\"mapped\\\",[1594]],[[126492,126492],\\\"mapped\\\",[1646]],[[126493,126493],\\\"mapped\\\",[1722]],[[126494,126494],\\\"mapped\\\",[1697]],[[126495,126495],\\\"mapped\\\",[1647]],[[126496,126496],\\\"disallowed\\\"],[[126497,126497],\\\"mapped\\\",[1576]],[[126498,126498],\\\"mapped\\\",[1580]],[[126499,126499],\\\"disallowed\\\"],[[126500,126500],\\\"mapped\\\",[1607]],[[126501,126502],\\\"disallowed\\\"],[[126503,126503],\\\"mapped\\\",[1581]],[[126504,126504],\\\"disallowed\\\"],[[126505,126505],\\\"mapped\\\",[1610]],[[126506,126506],\\\"mapped\\\",[1603]],[[126507,126507],\\\"mapped\\\",[1604]],[[126508,126508],\\\"mapped\\\",[1605]],[[126509,126509],\\\"mapped\\\",[1606]],[[126510,126510],\\\"mapped\\\",[1587]],[[12
6511,126511],\\\"mapped\\\",[1593]],[[126512,126512],\\\"mapped\\\",[1601]],[[126513,126513],\\\"mapped\\\",[1589]],[[126514,126514],\\\"mapped\\\",[1602]],[[126515,126515],\\\"disallowed\\\"],[[126516,126516],\\\"mapped\\\",[1588]],[[126517,126517],\\\"mapped\\\",[1578]],[[126518,126518],\\\"mapped\\\",[1579]],[[126519,126519],\\\"mapped\\\",[1582]],[[126520,126520],\\\"disallowed\\\"],[[126521,126521],\\\"mapped\\\",[1590]],[[126522,126522],\\\"disallowed\\\"],[[126523,126523],\\\"mapped\\\",[1594]],[[126524,126529],\\\"disallowed\\\"],[[126530,126530],\\\"mapped\\\",[1580]],[[126531,126534],\\\"disallowed\\\"],[[126535,126535],\\\"mapped\\\",[1581]],[[126536,126536],\\\"disallowed\\\"],[[126537,126537],\\\"mapped\\\",[1610]],[[126538,126538],\\\"disallowed\\\"],[[126539,126539],\\\"mapped\\\",[1604]],[[126540,126540],\\\"disallowed\\\"],[[126541,126541],\\\"mapped\\\",[1606]],[[126542,126542],\\\"mapped\\\",[1587]],[[126543,126543],\\\"mapped\\\",[1593]],[[126544,126544],\\\"disallowed\\\"],[[126545,126545],\\\"mapped\\\",[1589]],[[126546,126546],\\\"mapped\\\",[1602]],[[126547,126547],\\\"disallowed\\\"],[[126548,126548],\\\"mapped\\\",[1588]],[[126549,126550],\\\"disallowed\\\"],[[126551,126551],\\\"mapped\\\",[1582]],[[126552,126552],\\\"disallowed\\\"],[[126553,126553],\\\"mapped\\\",[1590]],[[126554,126554],\\\"disallowed\\\"],[[126555,126555],\\\"mapped\\\",[1594]],[[126556,126556],\\\"disallowed\\\"],[[126557,126557],\\\"mapped\\\",[1722]],[[126558,126558],\\\"disallowed\\\"],[[126559,126559],\\\"mapped\\\",[1647]],[[126560,126560],\\\"disallowed\\\"],[[126561,126561],\\\"mapped\\\",[1576]],[[126562,126562],\\\"mapped\\\",[1580]],[[126563,126563],\\\"disallowed\\\"],[[126564,126564],\\\"mapped\\\",[1607]],[[126565,126566],\\\"disallowed\\\"],[[126567,126567],\\\"mapped\\\",[1581]],[[126568,126568],\\\"mapped\\\",[1591]],[[126569,126569],\\\"mapped\\\",[1610]],[[126570,126570],\\\"mapped\\\",[1603]],[[126571,126571],\\\"disallowed\\\"],[[126572,126572],\\\"
mapped\\\",[1605]],[[126573,126573],\\\"mapped\\\",[1606]],[[126574,126574],\\\"mapped\\\",[1587]],[[126575,126575],\\\"mapped\\\",[1593]],[[126576,126576],\\\"mapped\\\",[1601]],[[126577,126577],\\\"mapped\\\",[1589]],[[126578,126578],\\\"mapped\\\",[1602]],[[126579,126579],\\\"disallowed\\\"],[[126580,126580],\\\"mapped\\\",[1588]],[[126581,126581],\\\"mapped\\\",[1578]],[[126582,126582],\\\"mapped\\\",[1579]],[[126583,126583],\\\"mapped\\\",[1582]],[[126584,126584],\\\"disallowed\\\"],[[126585,126585],\\\"mapped\\\",[1590]],[[126586,126586],\\\"mapped\\\",[1592]],[[126587,126587],\\\"mapped\\\",[1594]],[[126588,126588],\\\"mapped\\\",[1646]],[[126589,126589],\\\"disallowed\\\"],[[126590,126590],\\\"mapped\\\",[1697]],[[126591,126591],\\\"disallowed\\\"],[[126592,126592],\\\"mapped\\\",[1575]],[[126593,126593],\\\"mapped\\\",[1576]],[[126594,126594],\\\"mapped\\\",[1580]],[[126595,126595],\\\"mapped\\\",[1583]],[[126596,126596],\\\"mapped\\\",[1607]],[[126597,126597],\\\"mapped\\\",[1608]],[[126598,126598],\\\"mapped\\\",[1586]],[[126599,126599],\\\"mapped\\\",[1581]],[[126600,126600],\\\"mapped\\\",[1591]],[[126601,126601],\\\"mapped\\\",[1610]],[[126602,126602],\\\"disallowed\\\"],[[126603,126603],\\\"mapped\\\",[1604]],[[126604,126604],\\\"mapped\\\",[1605]],[[126605,126605],\\\"mapped\\\",[1606]],[[126606,126606],\\\"mapped\\\",[1587]],[[126607,126607],\\\"mapped\\\",[1593]],[[126608,126608],\\\"mapped\\\",[1601]],[[126609,126609],\\\"mapped\\\",[1589]],[[126610,126610],\\\"mapped\\\",[1602]],[[126611,126611],\\\"mapped\\\",[1585]],[[126612,126612],\\\"mapped\\\",[1588]],[[126613,126613],\\\"mapped\\\",[1578]],[[126614,126614],\\\"mapped\\\",[1579]],[[126615,126615],\\\"mapped\\\",[1582]],[[126616,126616],\\\"mapped\\\",[1584]],[[126617,126617],\\\"mapped\\\",[1590]],[[126618,126618],\\\"mapped\\\",[1592]],[[126619,126619],\\\"mapped\\\",[1594]],[[126620,126624],\\\"disallowed\\\"],[[126625,126625],\\\"mapped\\\",[1576]],[[126626,126626],\\\"mapped\\\",[1580]]
,[[126627,126627],\\\"mapped\\\",[1583]],[[126628,126628],\\\"disallowed\\\"],[[126629,126629],\\\"mapped\\\",[1608]],[[126630,126630],\\\"mapped\\\",[1586]],[[126631,126631],\\\"mapped\\\",[1581]],[[126632,126632],\\\"mapped\\\",[1591]],[[126633,126633],\\\"mapped\\\",[1610]],[[126634,126634],\\\"disallowed\\\"],[[126635,126635],\\\"mapped\\\",[1604]],[[126636,126636],\\\"mapped\\\",[1605]],[[126637,126637],\\\"mapped\\\",[1606]],[[126638,126638],\\\"mapped\\\",[1587]],[[126639,126639],\\\"mapped\\\",[1593]],[[126640,126640],\\\"mapped\\\",[1601]],[[126641,126641],\\\"mapped\\\",[1589]],[[126642,126642],\\\"mapped\\\",[1602]],[[126643,126643],\\\"mapped\\\",[1585]],[[126644,126644],\\\"mapped\\\",[1588]],[[126645,126645],\\\"mapped\\\",[1578]],[[126646,126646],\\\"mapped\\\",[1579]],[[126647,126647],\\\"mapped\\\",[1582]],[[126648,126648],\\\"mapped\\\",[1584]],[[126649,126649],\\\"mapped\\\",[1590]],[[126650,126650],\\\"mapped\\\",[1592]],[[126651,126651],\\\"mapped\\\",[1594]],[[126652,126703],\\\"disallowed\\\"],[[126704,126705],\\\"valid\\\",[],\\\"NV8\\\"],[[126706,126975],\\\"disallowed\\\"],[[126976,127019],\\\"valid\\\",[],\\\"NV8\\\"],[[127020,127023],\\\"disallowed\\\"],[[127024,127123],\\\"valid\\\",[],\\\"NV8\\\"],[[127124,127135],\\\"disallowed\\\"],[[127136,127150],\\\"valid\\\",[],\\\"NV8\\\"],[[127151,127152],\\\"disallowed\\\"],[[127153,127166],\\\"valid\\\",[],\\\"NV8\\\"],[[127167,127167],\\\"valid\\\",[],\\\"NV8\\\"],[[127168,127168],\\\"disallowed\\\"],[[127169,127183],\\\"valid\\\",[],\\\"NV8\\\"],[[127184,127184],\\\"disallowed\\\"],[[127185,127199],\\\"valid\\\",[],\\\"NV8\\\"],[[127200,127221],\\\"valid\\\",[],\\\"NV8\\\"],[[127222,127231],\\\"disallowed\\\"],[[127232,127232],\\\"disallowed\\\"],[[127233,127233],\\\"disallowed_STD3_mapped\\\",[48,44]],[[127234,127234],\\\"disallowed_STD3_mapped\\\",[49,44]],[[127235,127235],\\\"disallowed_STD3_mapped\\\",[50,44]],[[127236,127236],\\\"disallowed_STD3_mapped\\\",[51,44]],[[127237,127237],\\\"
disallowed_STD3_mapped\\\",[52,44]],[[127238,127238],\\\"disallowed_STD3_mapped\\\",[53,44]],[[127239,127239],\\\"disallowed_STD3_mapped\\\",[54,44]],[[127240,127240],\\\"disallowed_STD3_mapped\\\",[55,44]],[[127241,127241],\\\"disallowed_STD3_mapped\\\",[56,44]],[[127242,127242],\\\"disallowed_STD3_mapped\\\",[57,44]],[[127243,127244],\\\"valid\\\",[],\\\"NV8\\\"],[[127245,127247],\\\"disallowed\\\"],[[127248,127248],\\\"disallowed_STD3_mapped\\\",[40,97,41]],[[127249,127249],\\\"disallowed_STD3_mapped\\\",[40,98,41]],[[127250,127250],\\\"disallowed_STD3_mapped\\\",[40,99,41]],[[127251,127251],\\\"disallowed_STD3_mapped\\\",[40,100,41]],[[127252,127252],\\\"disallowed_STD3_mapped\\\",[40,101,41]],[[127253,127253],\\\"disallowed_STD3_mapped\\\",[40,102,41]],[[127254,127254],\\\"disallowed_STD3_mapped\\\",[40,103,41]],[[127255,127255],\\\"disallowed_STD3_mapped\\\",[40,104,41]],[[127256,127256],\\\"disallowed_STD3_mapped\\\",[40,105,41]],[[127257,127257],\\\"disallowed_STD3_mapped\\\",[40,106,41]],[[127258,127258],\\\"disallowed_STD3_mapped\\\",[40,107,41]],[[127259,127259],\\\"disallowed_STD3_mapped\\\",[40,108,41]],[[127260,127260],\\\"disallowed_STD3_mapped\\\",[40,109,41]],[[127261,127261],\\\"disallowed_STD3_mapped\\\",[40,110,41]],[[127262,127262],\\\"disallowed_STD3_mapped\\\",[40,111,41]],[[127263,127263],\\\"disallowed_STD3_mapped\\\",[40,112,41]],[[127264,127264],\\\"disallowed_STD3_mapped\\\",[40,113,41]],[[127265,127265],\\\"disallowed_STD3_mapped\\\",[40,114,41]],[[127266,127266],\\\"disallowed_STD3_mapped\\\",[40,115,41]],[[127267,127267],\\\"disallowed_STD3_mapped\\\",[40,116,41]],[[127268,127268],\\\"disallowed_STD3_mapped\\\",[40,117,41]],[[127269,127269],\\\"disallowed_STD3_mapped\\\",[40,118,41]],[[127270,127270],\\\"disallowed_STD3_mapped\\\",[40,119,41]],[[127271,127271],\\\"disallowed_STD3_mapped\\\",[40,120,41]],[[127272,127272],\\\"disallowed_STD3_mapped\\\",[40,121,41]],[[127273,127273],\\\"disallowed_STD3_mapped\\\",[40,122,41]],[[127274,127
274],\\\"mapped\\\",[12308,115,12309]],[[127275,127275],\\\"mapped\\\",[99]],[[127276,127276],\\\"mapped\\\",[114]],[[127277,127277],\\\"mapped\\\",[99,100]],[[127278,127278],\\\"mapped\\\",[119,122]],[[127279,127279],\\\"disallowed\\\"],[[127280,127280],\\\"mapped\\\",[97]],[[127281,127281],\\\"mapped\\\",[98]],[[127282,127282],\\\"mapped\\\",[99]],[[127283,127283],\\\"mapped\\\",[100]],[[127284,127284],\\\"mapped\\\",[101]],[[127285,127285],\\\"mapped\\\",[102]],[[127286,127286],\\\"mapped\\\",[103]],[[127287,127287],\\\"mapped\\\",[104]],[[127288,127288],\\\"mapped\\\",[105]],[[127289,127289],\\\"mapped\\\",[106]],[[127290,127290],\\\"mapped\\\",[107]],[[127291,127291],\\\"mapped\\\",[108]],[[127292,127292],\\\"mapped\\\",[109]],[[127293,127293],\\\"mapped\\\",[110]],[[127294,127294],\\\"mapped\\\",[111]],[[127295,127295],\\\"mapped\\\",[112]],[[127296,127296],\\\"mapped\\\",[113]],[[127297,127297],\\\"mapped\\\",[114]],[[127298,127298],\\\"mapped\\\",[115]],[[127299,127299],\\\"mapped\\\",[116]],[[127300,127300],\\\"mapped\\\",[117]],[[127301,127301],\\\"mapped\\\",[118]],[[127302,127302],\\\"mapped\\\",[119]],[[127303,127303],\\\"mapped\\\",[120]],[[127304,127304],\\\"mapped\\\",[121]],[[127305,127305],\\\"mapped\\\",[122]],[[127306,127306],\\\"mapped\\\",[104,118]],[[127307,127307],\\\"mapped\\\",[109,118]],[[127308,127308],\\\"mapped\\\",[115,100]],[[127309,127309],\\\"mapped\\\",[115,115]],[[127310,127310],\\\"mapped\\\",[112,112,118]],[[127311,127311],\\\"mapped\\\",[119,99]],[[127312,127318],\\\"valid\\\",[],\\\"NV8\\\"],[[127319,127319],\\\"valid\\\",[],\\\"NV8\\\"],[[127320,127326],\\\"valid\\\",[],\\\"NV8\\\"],[[127327,127327],\\\"valid\\\",[],\\\"NV8\\\"],[[127328,127337],\\\"valid\\\",[],\\\"NV8\\\"],[[127338,127338],\\\"mapped\\\",[109,99]],[[127339,127339],\\\"mapped\\\",[109,100]],[[127340,127343],\\\"disallowed\\\"],[[127344,127352],\\\"valid\\\",[],\\\"NV8\\\"],[[127353,127353],\\\"valid\\\",[],\\\"NV8\\\"],[[127354,127354],\\\"valid\\\",[],\\\"N
V8\\\"],[[127355,127356],\\\"valid\\\",[],\\\"NV8\\\"],[[127357,127358],\\\"valid\\\",[],\\\"NV8\\\"],[[127359,127359],\\\"valid\\\",[],\\\"NV8\\\"],[[127360,127369],\\\"valid\\\",[],\\\"NV8\\\"],[[127370,127373],\\\"valid\\\",[],\\\"NV8\\\"],[[127374,127375],\\\"valid\\\",[],\\\"NV8\\\"],[[127376,127376],\\\"mapped\\\",[100,106]],[[127377,127386],\\\"valid\\\",[],\\\"NV8\\\"],[[127387,127461],\\\"disallowed\\\"],[[127462,127487],\\\"valid\\\",[],\\\"NV8\\\"],[[127488,127488],\\\"mapped\\\",[12411,12363]],[[127489,127489],\\\"mapped\\\",[12467,12467]],[[127490,127490],\\\"mapped\\\",[12469]],[[127491,127503],\\\"disallowed\\\"],[[127504,127504],\\\"mapped\\\",[25163]],[[127505,127505],\\\"mapped\\\",[23383]],[[127506,127506],\\\"mapped\\\",[21452]],[[127507,127507],\\\"mapped\\\",[12487]],[[127508,127508],\\\"mapped\\\",[20108]],[[127509,127509],\\\"mapped\\\",[22810]],[[127510,127510],\\\"mapped\\\",[35299]],[[127511,127511],\\\"mapped\\\",[22825]],[[127512,127512],\\\"mapped\\\",[20132]],[[127513,127513],\\\"mapped\\\",[26144]],[[127514,127514],\\\"mapped\\\",[28961]],[[127515,127515],\\\"mapped\\\",[26009]],[[127516,127516],\\\"mapped\\\",[21069]],[[127517,127517],\\\"mapped\\\",[24460]],[[127518,127518],\\\"mapped\\\",[20877]],[[127519,127519],\\\"mapped\\\",[26032]],[[127520,127520],\\\"mapped\\\",[21021]],[[127521,127521],\\\"mapped\\\",[32066]],[[127522,127522],\\\"mapped\\\",[29983]],[[127523,127523],\\\"mapped\\\",[36009]],[[127524,127524],\\\"mapped\\\",[22768]],[[127525,127525],\\\"mapped\\\",[21561]],[[127526,127526],\\\"mapped\\\",[28436]],[[127527,127527],\\\"mapped\\\",[25237]],[[127528,127528],\\\"mapped\\\",[25429]],[[127529,127529],\\\"mapped\\\",[19968]],[[127530,127530],\\\"mapped\\\",[19977]],[[127531,127531],\\\"mapped\\\",[36938]],[[127532,127532],\\\"mapped\\\",[24038]],[[127533,127533],\\\"mapped\\\",[20013]],[[127534,127534],\\\"mapped\\\",[21491]],[[127535,127535],\\\"mapped\\\",[25351]],[[127536,127536],\\\"mapped\\\",[36208]],[[127537,12
7537],\\\"mapped\\\",[25171]],[[127538,127538],\\\"mapped\\\",[31105]],[[127539,127539],\\\"mapped\\\",[31354]],[[127540,127540],\\\"mapped\\\",[21512]],[[127541,127541],\\\"mapped\\\",[28288]],[[127542,127542],\\\"mapped\\\",[26377]],[[127543,127543],\\\"mapped\\\",[26376]],[[127544,127544],\\\"mapped\\\",[30003]],[[127545,127545],\\\"mapped\\\",[21106]],[[127546,127546],\\\"mapped\\\",[21942]],[[127547,127551],\\\"disallowed\\\"],[[127552,127552],\\\"mapped\\\",[12308,26412,12309]],[[127553,127553],\\\"mapped\\\",[12308,19977,12309]],[[127554,127554],\\\"mapped\\\",[12308,20108,12309]],[[127555,127555],\\\"mapped\\\",[12308,23433,12309]],[[127556,127556],\\\"mapped\\\",[12308,28857,12309]],[[127557,127557],\\\"mapped\\\",[12308,25171,12309]],[[127558,127558],\\\"mapped\\\",[12308,30423,12309]],[[127559,127559],\\\"mapped\\\",[12308,21213,12309]],[[127560,127560],\\\"mapped\\\",[12308,25943,12309]],[[127561,127567],\\\"disallowed\\\"],[[127568,127568],\\\"mapped\\\",[24471]],[[127569,127569],\\\"mapped\\\",[21487]],[[127570,127743],\\\"disallowed\\\"],[[127744,127776],\\\"valid\\\",[],\\\"NV8\\\"],[[127777,127788],\\\"valid\\\",[],\\\"NV8\\\"],[[127789,127791],\\\"valid\\\",[],\\\"NV8\\\"],[[127792,127797],\\\"valid\\\",[],\\\"NV8\\\"],[[127798,127798],\\\"valid\\\",[],\\\"NV8\\\"],[[127799,127868],\\\"valid\\\",[],\\\"NV8\\\"],[[127869,127869],\\\"valid\\\",[],\\\"NV8\\\"],[[127870,127871],\\\"valid\\\",[],\\\"NV8\\\"],[[127872,127891],\\\"valid\\\",[],\\\"NV8\\\"],[[127892,127903],\\\"valid\\\",[],\\\"NV8\\\"],[[127904,127940],\\\"valid\\\",[],\\\"NV8\\\"],[[127941,127941],\\\"valid\\\",[],\\\"NV8\\\"],[[127942,127946],\\\"valid\\\",[],\\\"NV8\\\"],[[127947,127950],\\\"valid\\\",[],\\\"NV8\\\"],[[127951,127955],\\\"valid\\\",[],\\\"NV8\\\"],[[127956,127967],\\\"valid\\\",[],\\\"NV8\\\"],[[127968,127984],\\\"valid\\\",[],\\\"NV8\\\"],[[127985,127991],\\\"valid\\\",[],\\\"NV8\\\"],[[127992,127999],\\\"valid\\\",[],\\\"NV8\\\"],[[128000,128062],\\\"valid\\\",[],\\\"
NV8\\\"],[[128063,128063],\\\"valid\\\",[],\\\"NV8\\\"],[[128064,128064],\\\"valid\\\",[],\\\"NV8\\\"],[[128065,128065],\\\"valid\\\",[],\\\"NV8\\\"],[[128066,128247],\\\"valid\\\",[],\\\"NV8\\\"],[[128248,128248],\\\"valid\\\",[],\\\"NV8\\\"],[[128249,128252],\\\"valid\\\",[],\\\"NV8\\\"],[[128253,128254],\\\"valid\\\",[],\\\"NV8\\\"],[[128255,128255],\\\"valid\\\",[],\\\"NV8\\\"],[[128256,128317],\\\"valid\\\",[],\\\"NV8\\\"],[[128318,128319],\\\"valid\\\",[],\\\"NV8\\\"],[[128320,128323],\\\"valid\\\",[],\\\"NV8\\\"],[[128324,128330],\\\"valid\\\",[],\\\"NV8\\\"],[[128331,128335],\\\"valid\\\",[],\\\"NV8\\\"],[[128336,128359],\\\"valid\\\",[],\\\"NV8\\\"],[[128360,128377],\\\"valid\\\",[],\\\"NV8\\\"],[[128378,128378],\\\"disallowed\\\"],[[128379,128419],\\\"valid\\\",[],\\\"NV8\\\"],[[128420,128420],\\\"disallowed\\\"],[[128421,128506],\\\"valid\\\",[],\\\"NV8\\\"],[[128507,128511],\\\"valid\\\",[],\\\"NV8\\\"],[[128512,128512],\\\"valid\\\",[],\\\"NV8\\\"],[[128513,128528],\\\"valid\\\",[],\\\"NV8\\\"],[[128529,128529],\\\"valid\\\",[],\\\"NV8\\\"],[[128530,128532],\\\"valid\\\",[],\\\"NV8\\\"],[[128533,128533],\\\"valid\\\",[],\\\"NV8\\\"],[[128534,128534],\\\"valid\\\",[],\\\"NV8\\\"],[[128535,128535],\\\"valid\\\",[],\\\"NV8\\\"],[[128536,128536],\\\"valid\\\",[],\\\"NV8\\\"],[[128537,128537],\\\"valid\\\",[],\\\"NV8\\\"],[[128538,128538],\\\"valid\\\",[],\\\"NV8\\\"],[[128539,128539],\\\"valid\\\",[],\\\"NV8\\\"],[[128540,128542],\\\"valid\\\",[],\\\"NV8\\\"],[[128543,128543],\\\"valid\\\",[],\\\"NV8\\\"],[[128544,128549],\\\"valid\\\",[],\\\"NV8\\\"],[[128550,128551],\\\"valid\\\",[],\\\"NV8\\\"],[[128552,128555],\\\"valid\\\",[],\\\"NV8\\\"],[[128556,128556],\\\"valid\\\",[],\\\"NV8\\\"],[[128557,128557],\\\"valid\\\",[],\\\"NV8\\\"],[[128558,128559],\\\"valid\\\",[],\\\"NV8\\\"],[[128560,128563],\\\"valid\\\",[],\\\"NV8\\\"],[[128564,128564],\\\"valid\\\",[],\\\"NV8\\\"],[[128565,128576],\\\"valid\\\",[],\\\"NV8\\\"],[[128577,128578],\\\"valid\\\",[],\\\
"NV8\\\"],[[128579,128580],\\\"valid\\\",[],\\\"NV8\\\"],[[128581,128591],\\\"valid\\\",[],\\\"NV8\\\"],[[128592,128639],\\\"valid\\\",[],\\\"NV8\\\"],[[128640,128709],\\\"valid\\\",[],\\\"NV8\\\"],[[128710,128719],\\\"valid\\\",[],\\\"NV8\\\"],[[128720,128720],\\\"valid\\\",[],\\\"NV8\\\"],[[128721,128735],\\\"disallowed\\\"],[[128736,128748],\\\"valid\\\",[],\\\"NV8\\\"],[[128749,128751],\\\"disallowed\\\"],[[128752,128755],\\\"valid\\\",[],\\\"NV8\\\"],[[128756,128767],\\\"disallowed\\\"],[[128768,128883],\\\"valid\\\",[],\\\"NV8\\\"],[[128884,128895],\\\"disallowed\\\"],[[128896,128980],\\\"valid\\\",[],\\\"NV8\\\"],[[128981,129023],\\\"disallowed\\\"],[[129024,129035],\\\"valid\\\",[],\\\"NV8\\\"],[[129036,129039],\\\"disallowed\\\"],[[129040,129095],\\\"valid\\\",[],\\\"NV8\\\"],[[129096,129103],\\\"disallowed\\\"],[[129104,129113],\\\"valid\\\",[],\\\"NV8\\\"],[[129114,129119],\\\"disallowed\\\"],[[129120,129159],\\\"valid\\\",[],\\\"NV8\\\"],[[129160,129167],\\\"disallowed\\\"],[[129168,129197],\\\"valid\\\",[],\\\"NV8\\\"],[[129198,129295],\\\"disallowed\\\"],[[129296,129304],\\\"valid\\\",[],\\\"NV8\\\"],[[129305,129407],\\\"disallowed\\\"],[[129408,129412],\\\"valid\\\",[],\\\"NV8\\\"],[[129413,129471],\\\"disallowed\\\"],[[129472,129472],\\\"valid\\\",[],\\\"NV8\\\"],[[129473,131069],\\\"disallowed\\\"],[[131070,131071],\\\"disallowed\\\"],[[131072,173782],\\\"valid\\\"],[[173783,173823],\\\"disallowed\\\"],[[173824,177972],\\\"valid\\\"],[[177973,177983],\\\"disallowed\\\"],[[177984,178205],\\\"valid\\\"],[[178206,178207],\\\"disallowed\\\"],[[178208,183969],\\\"valid\\\"],[[183970,194559],\\\"disallowed\\\"],[[194560,194560],\\\"mapped\\\",[20029]],[[194561,194561],\\\"mapped\\\",[20024]],[[194562,194562],\\\"mapped\\\",[20033]],[[194563,194563],\\\"mapped\\\",[131362]],[[194564,194564],\\\"mapped\\\",[20320]],[[194565,194565],\\\"mapped\\\",[20398]],[[194566,194566],\\\"mapped\\\",[20411]],[[194567,194567],\\\"mapped\\\",[20482]],[[194568,194568],\\\"
mapped\\\",[20602]],[[194569,194569],\\\"mapped\\\",[20633]],[[194570,194570],\\\"mapped\\\",[20711]],[[194571,194571],\\\"mapped\\\",[20687]],[[194572,194572],\\\"mapped\\\",[13470]],[[194573,194573],\\\"mapped\\\",[132666]],[[194574,194574],\\\"mapped\\\",[20813]],[[194575,194575],\\\"mapped\\\",[20820]],[[194576,194576],\\\"mapped\\\",[20836]],[[194577,194577],\\\"mapped\\\",[20855]],[[194578,194578],\\\"mapped\\\",[132380]],[[194579,194579],\\\"mapped\\\",[13497]],[[194580,194580],\\\"mapped\\\",[20839]],[[194581,194581],\\\"mapped\\\",[20877]],[[194582,194582],\\\"mapped\\\",[132427]],[[194583,194583],\\\"mapped\\\",[20887]],[[194584,194584],\\\"mapped\\\",[20900]],[[194585,194585],\\\"mapped\\\",[20172]],[[194586,194586],\\\"mapped\\\",[20908]],[[194587,194587],\\\"mapped\\\",[20917]],[[194588,194588],\\\"mapped\\\",[168415]],[[194589,194589],\\\"mapped\\\",[20981]],[[194590,194590],\\\"mapped\\\",[20995]],[[194591,194591],\\\"mapped\\\",[13535]],[[194592,194592],\\\"mapped\\\",[21051]],[[194593,194593],\\\"mapped\\\",[21062]],[[194594,194594],\\\"mapped\\\",[21106]],[[194595,194595],\\\"mapped\\\",[21111]],[[194596,194596],\\\"mapped\\\",[13589]],[[194597,194597],\\\"mapped\\\",[21191]],[[194598,194598],\\\"mapped\\\",[21193]],[[194599,194599],\\\"mapped\\\",[21220]],[[194600,194600],\\\"mapped\\\",[21242]],[[194601,194601],\\\"mapped\\\",[21253]],[[194602,194602],\\\"mapped\\\",[21254]],[[194603,194603],\\\"mapped\\\",[21271]],[[194604,194604],\\\"mapped\\\",[21321]],[[194605,194605],\\\"mapped\\\",[21329]],[[194606,194606],\\\"mapped\\\",[21338]],[[194607,194607],\\\"mapped\\\",[21363]],[[194608,194608],\\\"mapped\\\",[21373]],[[194609,194611],\\\"mapped\\\",[21375]],[[194612,194612],\\\"mapped\\\",[133676]],[[194613,194613],\\\"mapped\\\",[28784]],[[194614,194614],\\\"mapped\\\",[21450]],[[194615,194615],\\\"mapped\\\",[21471]],[[194616,194616],\\\"mapped\\\",[133987]],[[194617,194617],\\\"mapped\\\",[21483]],[[194618,194618],\\\"mapped\\\",[21489]],[[1946
19,194619],\\\"mapped\\\",[21510]],[[194620,194620],\\\"mapped\\\",[21662]],[[194621,194621],\\\"mapped\\\",[21560]],[[194622,194622],\\\"mapped\\\",[21576]],[[194623,194623],\\\"mapped\\\",[21608]],[[194624,194624],\\\"mapped\\\",[21666]],[[194625,194625],\\\"mapped\\\",[21750]],[[194626,194626],\\\"mapped\\\",[21776]],[[194627,194627],\\\"mapped\\\",[21843]],[[194628,194628],\\\"mapped\\\",[21859]],[[194629,194630],\\\"mapped\\\",[21892]],[[194631,194631],\\\"mapped\\\",[21913]],[[194632,194632],\\\"mapped\\\",[21931]],[[194633,194633],\\\"mapped\\\",[21939]],[[194634,194634],\\\"mapped\\\",[21954]],[[194635,194635],\\\"mapped\\\",[22294]],[[194636,194636],\\\"mapped\\\",[22022]],[[194637,194637],\\\"mapped\\\",[22295]],[[194638,194638],\\\"mapped\\\",[22097]],[[194639,194639],\\\"mapped\\\",[22132]],[[194640,194640],\\\"mapped\\\",[20999]],[[194641,194641],\\\"mapped\\\",[22766]],[[194642,194642],\\\"mapped\\\",[22478]],[[194643,194643],\\\"mapped\\\",[22516]],[[194644,194644],\\\"mapped\\\",[22541]],[[194645,194645],\\\"mapped\\\",[22411]],[[194646,194646],\\\"mapped\\\",[22578]],[[194647,194647],\\\"mapped\\\",[22577]],[[194648,194648],\\\"mapped\\\",[22700]],[[194649,194649],\\\"mapped\\\",[136420]],[[194650,194650],\\\"mapped\\\",[22770]],[[194651,194651],\\\"mapped\\\",[22775]],[[194652,194652],\\\"mapped\\\",[22790]],[[194653,194653],\\\"mapped\\\",[22810]],[[194654,194654],\\\"mapped\\\",[22818]],[[194655,194655],\\\"mapped\\\",[22882]],[[194656,194656],\\\"mapped\\\",[136872]],[[194657,194657],\\\"mapped\\\",[136938]],[[194658,194658],\\\"mapped\\\",[23020]],[[194659,194659],\\\"mapped\\\",[23067]],[[194660,194660],\\\"mapped\\\",[23079]],[[194661,194661],\\\"mapped\\\",[23000]],[[194662,194662],\\\"mapped\\\",[23142]],[[194663,194663],\\\"mapped\\\",[14062]],[[194664,194664],\\\"disallowed\\\"],[[194665,194665],\\\"mapped\\\",[23304]],[[194666,194667],\\\"mapped\\\",[23358]],[[194668,194668],\\\"mapped\\\",[137672]],[[194669,194669],\\\"mapped\\\",[23491
]],[[194670,194670],\\\"mapped\\\",[23512]],[[194671,194671],\\\"mapped\\\",[23527]],[[194672,194672],\\\"mapped\\\",[23539]],[[194673,194673],\\\"mapped\\\",[138008]],[[194674,194674],\\\"mapped\\\",[23551]],[[194675,194675],\\\"mapped\\\",[23558]],[[194676,194676],\\\"disallowed\\\"],[[194677,194677],\\\"mapped\\\",[23586]],[[194678,194678],\\\"mapped\\\",[14209]],[[194679,194679],\\\"mapped\\\",[23648]],[[194680,194680],\\\"mapped\\\",[23662]],[[194681,194681],\\\"mapped\\\",[23744]],[[194682,194682],\\\"mapped\\\",[23693]],[[194683,194683],\\\"mapped\\\",[138724]],[[194684,194684],\\\"mapped\\\",[23875]],[[194685,194685],\\\"mapped\\\",[138726]],[[194686,194686],\\\"mapped\\\",[23918]],[[194687,194687],\\\"mapped\\\",[23915]],[[194688,194688],\\\"mapped\\\",[23932]],[[194689,194689],\\\"mapped\\\",[24033]],[[194690,194690],\\\"mapped\\\",[24034]],[[194691,194691],\\\"mapped\\\",[14383]],[[194692,194692],\\\"mapped\\\",[24061]],[[194693,194693],\\\"mapped\\\",[24104]],[[194694,194694],\\\"mapped\\\",[24125]],[[194695,194695],\\\"mapped\\\",[24169]],[[194696,194696],\\\"mapped\\\",[14434]],[[194697,194697],\\\"mapped\\\",[139651]],[[194698,194698],\\\"mapped\\\",[14460]],[[194699,194699],\\\"mapped\\\",[24240]],[[194700,194700],\\\"mapped\\\",[24243]],[[194701,194701],\\\"mapped\\\",[24246]],[[194702,194702],\\\"mapped\\\",[24266]],[[194703,194703],\\\"mapped\\\",[172946]],[[194704,194704],\\\"mapped\\\",[24318]],[[194705,194706],\\\"mapped\\\",[140081]],[[194707,194707],\\\"mapped\\\",[33281]],[[194708,194709],\\\"mapped\\\",[24354]],[[194710,194710],\\\"mapped\\\",[14535]],[[194711,194711],\\\"mapped\\\",[144056]],[[194712,194712],\\\"mapped\\\",[156122]],[[194713,194713],\\\"mapped\\\",[24418]],[[194714,194714],\\\"mapped\\\",[24427]],[[194715,194715],\\\"mapped\\\",[14563]],[[194716,194716],\\\"mapped\\\",[24474]],[[194717,194717],\\\"mapped\\\",[24525]],[[194718,194718],\\\"mapped\\\",[24535]],[[194719,194719],\\\"mapped\\\",[24569]],[[194720,194720],\\\"mapp
ed\\\",[24705]],[[194721,194721],\\\"mapped\\\",[14650]],[[194722,194722],\\\"mapped\\\",[14620]],[[194723,194723],\\\"mapped\\\",[24724]],[[194724,194724],\\\"mapped\\\",[141012]],[[194725,194725],\\\"mapped\\\",[24775]],[[194726,194726],\\\"mapped\\\",[24904]],[[194727,194727],\\\"mapped\\\",[24908]],[[194728,194728],\\\"mapped\\\",[24910]],[[194729,194729],\\\"mapped\\\",[24908]],[[194730,194730],\\\"mapped\\\",[24954]],[[194731,194731],\\\"mapped\\\",[24974]],[[194732,194732],\\\"mapped\\\",[25010]],[[194733,194733],\\\"mapped\\\",[24996]],[[194734,194734],\\\"mapped\\\",[25007]],[[194735,194735],\\\"mapped\\\",[25054]],[[194736,194736],\\\"mapped\\\",[25074]],[[194737,194737],\\\"mapped\\\",[25078]],[[194738,194738],\\\"mapped\\\",[25104]],[[194739,194739],\\\"mapped\\\",[25115]],[[194740,194740],\\\"mapped\\\",[25181]],[[194741,194741],\\\"mapped\\\",[25265]],[[194742,194742],\\\"mapped\\\",[25300]],[[194743,194743],\\\"mapped\\\",[25424]],[[194744,194744],\\\"mapped\\\",[142092]],[[194745,194745],\\\"mapped\\\",[25405]],[[194746,194746],\\\"mapped\\\",[25340]],[[194747,194747],\\\"mapped\\\",[25448]],[[194748,194748],\\\"mapped\\\",[25475]],[[194749,194749],\\\"mapped\\\",[25572]],[[194750,194750],\\\"mapped\\\",[142321]],[[194751,194751],\\\"mapped\\\",[25634]],[[194752,194752],\\\"mapped\\\",[25541]],[[194753,194753],\\\"mapped\\\",[25513]],[[194754,194754],\\\"mapped\\\",[14894]],[[194755,194755],\\\"mapped\\\",[25705]],[[194756,194756],\\\"mapped\\\",[25726]],[[194757,194757],\\\"mapped\\\",[25757]],[[194758,194758],\\\"mapped\\\",[25719]],[[194759,194759],\\\"mapped\\\",[14956]],[[194760,194760],\\\"mapped\\\",[25935]],[[194761,194761],\\\"mapped\\\",[25964]],[[194762,194762],\\\"mapped\\\",[143370]],[[194763,194763],\\\"mapped\\\",[26083]],[[194764,194764],\\\"mapped\\\",[26360]],[[194765,194765],\\\"mapped\\\",[26185]],[[194766,194766],\\\"mapped\\\",[15129]],[[194767,194767],\\\"mapped\\\",[26257]],[[194768,194768],\\\"mapped\\\",[15112]],[[194769,194
769],\\\"mapped\\\",[15076]],[[194770,194770],\\\"mapped\\\",[20882]],[[194771,194771],\\\"mapped\\\",[20885]],[[194772,194772],\\\"mapped\\\",[26368]],[[194773,194773],\\\"mapped\\\",[26268]],[[194774,194774],\\\"mapped\\\",[32941]],[[194775,194775],\\\"mapped\\\",[17369]],[[194776,194776],\\\"mapped\\\",[26391]],[[194777,194777],\\\"mapped\\\",[26395]],[[194778,194778],\\\"mapped\\\",[26401]],[[194779,194779],\\\"mapped\\\",[26462]],[[194780,194780],\\\"mapped\\\",[26451]],[[194781,194781],\\\"mapped\\\",[144323]],[[194782,194782],\\\"mapped\\\",[15177]],[[194783,194783],\\\"mapped\\\",[26618]],[[194784,194784],\\\"mapped\\\",[26501]],[[194785,194785],\\\"mapped\\\",[26706]],[[194786,194786],\\\"mapped\\\",[26757]],[[194787,194787],\\\"mapped\\\",[144493]],[[194788,194788],\\\"mapped\\\",[26766]],[[194789,194789],\\\"mapped\\\",[26655]],[[194790,194790],\\\"mapped\\\",[26900]],[[194791,194791],\\\"mapped\\\",[15261]],[[194792,194792],\\\"mapped\\\",[26946]],[[194793,194793],\\\"mapped\\\",[27043]],[[194794,194794],\\\"mapped\\\",[27114]],[[194795,194795],\\\"mapped\\\",[27304]],[[194796,194796],\\\"mapped\\\",[145059]],[[194797,194797],\\\"mapped\\\",[27355]],[[194798,194798],\\\"mapped\\\",[15384]],[[194799,194799],\\\"mapped\\\",[27425]],[[194800,194800],\\\"mapped\\\",[145575]],[[194801,194801],\\\"mapped\\\",[27476]],[[194802,194802],\\\"mapped\\\",[15438]],[[194803,194803],\\\"mapped\\\",[27506]],[[194804,194804],\\\"mapped\\\",[27551]],[[194805,194805],\\\"mapped\\\",[27578]],[[194806,194806],\\\"mapped\\\",[27579]],[[194807,194807],\\\"mapped\\\",[146061]],[[194808,194808],\\\"mapped\\\",[138507]],[[194809,194809],\\\"mapped\\\",[146170]],[[194810,194810],\\\"mapped\\\",[27726]],[[194811,194811],\\\"mapped\\\",[146620]],[[194812,194812],\\\"mapped\\\",[27839]],[[194813,194813],\\\"mapped\\\",[27853]],[[194814,194814],\\\"mapped\\\",[27751]],[[194815,194815],\\\"mapped\\\",[27926]],[[194816,194816],\\\"mapped\\\",[27966]],[[194817,194817],\\\"mapped\\\",[280
23]],[[194818,194818],\\\"mapped\\\",[27969]],[[194819,194819],\\\"mapped\\\",[28009]],[[194820,194820],\\\"mapped\\\",[28024]],[[194821,194821],\\\"mapped\\\",[28037]],[[194822,194822],\\\"mapped\\\",[146718]],[[194823,194823],\\\"mapped\\\",[27956]],[[194824,194824],\\\"mapped\\\",[28207]],[[194825,194825],\\\"mapped\\\",[28270]],[[194826,194826],\\\"mapped\\\",[15667]],[[194827,194827],\\\"mapped\\\",[28363]],[[194828,194828],\\\"mapped\\\",[28359]],[[194829,194829],\\\"mapped\\\",[147153]],[[194830,194830],\\\"mapped\\\",[28153]],[[194831,194831],\\\"mapped\\\",[28526]],[[194832,194832],\\\"mapped\\\",[147294]],[[194833,194833],\\\"mapped\\\",[147342]],[[194834,194834],\\\"mapped\\\",[28614]],[[194835,194835],\\\"mapped\\\",[28729]],[[194836,194836],\\\"mapped\\\",[28702]],[[194837,194837],\\\"mapped\\\",[28699]],[[194838,194838],\\\"mapped\\\",[15766]],[[194839,194839],\\\"mapped\\\",[28746]],[[194840,194840],\\\"mapped\\\",[28797]],[[194841,194841],\\\"mapped\\\",[28791]],[[194842,194842],\\\"mapped\\\",[28845]],[[194843,194843],\\\"mapped\\\",[132389]],[[194844,194844],\\\"mapped\\\",[28997]],[[194845,194845],\\\"mapped\\\",[148067]],[[194846,194846],\\\"mapped\\\",[29084]],[[194847,194847],\\\"disallowed\\\"],[[194848,194848],\\\"mapped\\\",[29224]],[[194849,194849],\\\"mapped\\\",[29237]],[[194850,194850],\\\"mapped\\\",[29264]],[[194851,194851],\\\"mapped\\\",[149000]],[[194852,194852],\\\"mapped\\\",[29312]],[[194853,194853],\\\"mapped\\\",[29333]],[[194854,194854],\\\"mapped\\\",[149301]],[[194855,194855],\\\"mapped\\\",[149524]],[[194856,194856],\\\"mapped\\\",[29562]],[[194857,194857],\\\"mapped\\\",[29579]],[[194858,194858],\\\"mapped\\\",[16044]],[[194859,194859],\\\"mapped\\\",[29605]],[[194860,194861],\\\"mapped\\\",[16056]],[[194862,194862],\\\"mapped\\\",[29767]],[[194863,194863],\\\"mapped\\\",[29788]],[[194864,194864],\\\"mapped\\\",[29809]],[[194865,194865],\\\"mapped\\\",[29829]],[[194866,194866],\\\"mapped\\\",[29898]],[[194867,194867],\\\"m
apped\\\",[16155]],[[194868,194868],\\\"mapped\\\",[29988]],[[194869,194869],\\\"mapped\\\",[150582]],[[194870,194870],\\\"mapped\\\",[30014]],[[194871,194871],\\\"mapped\\\",[150674]],[[194872,194872],\\\"mapped\\\",[30064]],[[194873,194873],\\\"mapped\\\",[139679]],[[194874,194874],\\\"mapped\\\",[30224]],[[194875,194875],\\\"mapped\\\",[151457]],[[194876,194876],\\\"mapped\\\",[151480]],[[194877,194877],\\\"mapped\\\",[151620]],[[194878,194878],\\\"mapped\\\",[16380]],[[194879,194879],\\\"mapped\\\",[16392]],[[194880,194880],\\\"mapped\\\",[30452]],[[194881,194881],\\\"mapped\\\",[151795]],[[194882,194882],\\\"mapped\\\",[151794]],[[194883,194883],\\\"mapped\\\",[151833]],[[194884,194884],\\\"mapped\\\",[151859]],[[194885,194885],\\\"mapped\\\",[30494]],[[194886,194887],\\\"mapped\\\",[30495]],[[194888,194888],\\\"mapped\\\",[30538]],[[194889,194889],\\\"mapped\\\",[16441]],[[194890,194890],\\\"mapped\\\",[30603]],[[194891,194891],\\\"mapped\\\",[16454]],[[194892,194892],\\\"mapped\\\",[16534]],[[194893,194893],\\\"mapped\\\",[152605]],[[194894,194894],\\\"mapped\\\",[30798]],[[194895,194895],\\\"mapped\\\",[30860]],[[194896,194896],\\\"mapped\\\",[30924]],[[194897,194897],\\\"mapped\\\",[16611]],[[194898,194898],\\\"mapped\\\",[153126]],[[194899,194899],\\\"mapped\\\",[31062]],[[194900,194900],\\\"mapped\\\",[153242]],[[194901,194901],\\\"mapped\\\",[153285]],[[194902,194902],\\\"mapped\\\",[31119]],[[194903,194903],\\\"mapped\\\",[31211]],[[194904,194904],\\\"mapped\\\",[16687]],[[194905,194905],\\\"mapped\\\",[31296]],[[194906,194906],\\\"mapped\\\",[31306]],[[194907,194907],\\\"mapped\\\",[31311]],[[194908,194908],\\\"mapped\\\",[153980]],[[194909,194910],\\\"mapped\\\",[154279]],[[194911,194911],\\\"disallowed\\\"],[[194912,194912],\\\"mapped\\\",[16898]],[[194913,194913],\\\"mapped\\\",[154539]],[[194914,194914],\\\"mapped\\\",[31686]],[[194915,194915],\\\"mapped\\\",[31689]],[[194916,194916],\\\"mapped\\\",[16935]],[[194917,194917],\\\"mapped\\\",[154752]]
,[[194918,194918],\\\"mapped\\\",[31954]],[[194919,194919],\\\"mapped\\\",[17056]],[[194920,194920],\\\"mapped\\\",[31976]],[[194921,194921],\\\"mapped\\\",[31971]],[[194922,194922],\\\"mapped\\\",[32000]],[[194923,194923],\\\"mapped\\\",[155526]],[[194924,194924],\\\"mapped\\\",[32099]],[[194925,194925],\\\"mapped\\\",[17153]],[[194926,194926],\\\"mapped\\\",[32199]],[[194927,194927],\\\"mapped\\\",[32258]],[[194928,194928],\\\"mapped\\\",[32325]],[[194929,194929],\\\"mapped\\\",[17204]],[[194930,194930],\\\"mapped\\\",[156200]],[[194931,194931],\\\"mapped\\\",[156231]],[[194932,194932],\\\"mapped\\\",[17241]],[[194933,194933],\\\"mapped\\\",[156377]],[[194934,194934],\\\"mapped\\\",[32634]],[[194935,194935],\\\"mapped\\\",[156478]],[[194936,194936],\\\"mapped\\\",[32661]],[[194937,194937],\\\"mapped\\\",[32762]],[[194938,194938],\\\"mapped\\\",[32773]],[[194939,194939],\\\"mapped\\\",[156890]],[[194940,194940],\\\"mapped\\\",[156963]],[[194941,194941],\\\"mapped\\\",[32864]],[[194942,194942],\\\"mapped\\\",[157096]],[[194943,194943],\\\"mapped\\\",[32880]],[[194944,194944],\\\"mapped\\\",[144223]],[[194945,194945],\\\"mapped\\\",[17365]],[[194946,194946],\\\"mapped\\\",[32946]],[[194947,194947],\\\"mapped\\\",[33027]],[[194948,194948],\\\"mapped\\\",[17419]],[[194949,194949],\\\"mapped\\\",[33086]],[[194950,194950],\\\"mapped\\\",[23221]],[[194951,194951],\\\"mapped\\\",[157607]],[[194952,194952],\\\"mapped\\\",[157621]],[[194953,194953],\\\"mapped\\\",[144275]],[[194954,194954],\\\"mapped\\\",[144284]],[[194955,194955],\\\"mapped\\\",[33281]],[[194956,194956],\\\"mapped\\\",[33284]],[[194957,194957],\\\"mapped\\\",[36766]],[[194958,194958],\\\"mapped\\\",[17515]],[[194959,194959],\\\"mapped\\\",[33425]],[[194960,194960],\\\"mapped\\\",[33419]],[[194961,194961],\\\"mapped\\\",[33437]],[[194962,194962],\\\"mapped\\\",[21171]],[[194963,194963],\\\"mapped\\\",[33457]],[[194964,194964],\\\"mapped\\\",[33459]],[[194965,194965],\\\"mapped\\\",[33469]],[[194966,194966],\
\\"mapped\\\",[33510]],[[194967,194967],\\\"mapped\\\",[158524]],[[194968,194968],\\\"mapped\\\",[33509]],[[194969,194969],\\\"mapped\\\",[33565]],[[194970,194970],\\\"mapped\\\",[33635]],[[194971,194971],\\\"mapped\\\",[33709]],[[194972,194972],\\\"mapped\\\",[33571]],[[194973,194973],\\\"mapped\\\",[33725]],[[194974,194974],\\\"mapped\\\",[33767]],[[194975,194975],\\\"mapped\\\",[33879]],[[194976,194976],\\\"mapped\\\",[33619]],[[194977,194977],\\\"mapped\\\",[33738]],[[194978,194978],\\\"mapped\\\",[33740]],[[194979,194979],\\\"mapped\\\",[33756]],[[194980,194980],\\\"mapped\\\",[158774]],[[194981,194981],\\\"mapped\\\",[159083]],[[194982,194982],\\\"mapped\\\",[158933]],[[194983,194983],\\\"mapped\\\",[17707]],[[194984,194984],\\\"mapped\\\",[34033]],[[194985,194985],\\\"mapped\\\",[34035]],[[194986,194986],\\\"mapped\\\",[34070]],[[194987,194987],\\\"mapped\\\",[160714]],[[194988,194988],\\\"mapped\\\",[34148]],[[194989,194989],\\\"mapped\\\",[159532]],[[194990,194990],\\\"mapped\\\",[17757]],[[194991,194991],\\\"mapped\\\",[17761]],[[194992,194992],\\\"mapped\\\",[159665]],[[194993,194993],\\\"mapped\\\",[159954]],[[194994,194994],\\\"mapped\\\",[17771]],[[194995,194995],\\\"mapped\\\",[34384]],[[194996,194996],\\\"mapped\\\",[34396]],[[194997,194997],\\\"mapped\\\",[34407]],[[194998,194998],\\\"mapped\\\",[34409]],[[194999,194999],\\\"mapped\\\",[34473]],[[195000,195000],\\\"mapped\\\",[34440]],[[195001,195001],\\\"mapped\\\",[34574]],[[195002,195002],\\\"mapped\\\",[34530]],[[195003,195003],\\\"mapped\\\",[34681]],[[195004,195004],\\\"mapped\\\",[34600]],[[195005,195005],\\\"mapped\\\",[34667]],[[195006,195006],\\\"mapped\\\",[34694]],[[195007,195007],\\\"disallowed\\\"],[[195008,195008],\\\"mapped\\\",[34785]],[[195009,195009],\\\"mapped\\\",[34817]],[[195010,195010],\\\"mapped\\\",[17913]],[[195011,195011],\\\"mapped\\\",[34912]],[[195012,195012],\\\"mapped\\\",[34915]],[[195013,195013],\\\"mapped\\\",[161383]],[[195014,195014],\\\"mapped\\\",[35031]],[[19
5015,195015],\\\"mapped\\\",[35038]],[[195016,195016],\\\"mapped\\\",[17973]],[[195017,195017],\\\"mapped\\\",[35066]],[[195018,195018],\\\"mapped\\\",[13499]],[[195019,195019],\\\"mapped\\\",[161966]],[[195020,195020],\\\"mapped\\\",[162150]],[[195021,195021],\\\"mapped\\\",[18110]],[[195022,195022],\\\"mapped\\\",[18119]],[[195023,195023],\\\"mapped\\\",[35488]],[[195024,195024],\\\"mapped\\\",[35565]],[[195025,195025],\\\"mapped\\\",[35722]],[[195026,195026],\\\"mapped\\\",[35925]],[[195027,195027],\\\"mapped\\\",[162984]],[[195028,195028],\\\"mapped\\\",[36011]],[[195029,195029],\\\"mapped\\\",[36033]],[[195030,195030],\\\"mapped\\\",[36123]],[[195031,195031],\\\"mapped\\\",[36215]],[[195032,195032],\\\"mapped\\\",[163631]],[[195033,195033],\\\"mapped\\\",[133124]],[[195034,195034],\\\"mapped\\\",[36299]],[[195035,195035],\\\"mapped\\\",[36284]],[[195036,195036],\\\"mapped\\\",[36336]],[[195037,195037],\\\"mapped\\\",[133342]],[[195038,195038],\\\"mapped\\\",[36564]],[[195039,195039],\\\"mapped\\\",[36664]],[[195040,195040],\\\"mapped\\\",[165330]],[[195041,195041],\\\"mapped\\\",[165357]],[[195042,195042],\\\"mapped\\\",[37012]],[[195043,195043],\\\"mapped\\\",[37105]],[[195044,195044],\\\"mapped\\\",[37137]],[[195045,195045],\\\"mapped\\\",[165678]],[[195046,195046],\\\"mapped\\\",[37147]],[[195047,195047],\\\"mapped\\\",[37432]],[[195048,195048],\\\"mapped\\\",[37591]],[[195049,195049],\\\"mapped\\\",[37592]],[[195050,195050],\\\"mapped\\\",[37500]],[[195051,195051],\\\"mapped\\\",[37881]],[[195052,195052],\\\"mapped\\\",[37909]],[[195053,195053],\\\"mapped\\\",[166906]],[[195054,195054],\\\"mapped\\\",[38283]],[[195055,195055],\\\"mapped\\\",[18837]],[[195056,195056],\\\"mapped\\\",[38327]],[[195057,195057],\\\"mapped\\\",[167287]],[[195058,195058],\\\"mapped\\\",[18918]],[[195059,195059],\\\"mapped\\\",[38595]],[[195060,195060],\\\"mapped\\\",[23986]],[[195061,195061],\\\"mapped\\\",[38691]],[[195062,195062],\\\"mapped\\\",[168261]],[[195063,195063],\\\"map
ped\\\",[168474]],[[195064,195064],\\\"mapped\\\",[19054]],[[195065,195065],\\\"mapped\\\",[19062]],[[195066,195066],\\\"mapped\\\",[38880]],[[195067,195067],\\\"mapped\\\",[168970]],[[195068,195068],\\\"mapped\\\",[19122]],[[195069,195069],\\\"mapped\\\",[169110]],[[195070,195071],\\\"mapped\\\",[38923]],[[195072,195072],\\\"mapped\\\",[38953]],[[195073,195073],\\\"mapped\\\",[169398]],[[195074,195074],\\\"mapped\\\",[39138]],[[195075,195075],\\\"mapped\\\",[19251]],[[195076,195076],\\\"mapped\\\",[39209]],[[195077,195077],\\\"mapped\\\",[39335]],[[195078,195078],\\\"mapped\\\",[39362]],[[195079,195079],\\\"mapped\\\",[39422]],[[195080,195080],\\\"mapped\\\",[19406]],[[195081,195081],\\\"mapped\\\",[170800]],[[195082,195082],\\\"mapped\\\",[39698]],[[195083,195083],\\\"mapped\\\",[40000]],[[195084,195084],\\\"mapped\\\",[40189]],[[195085,195085],\\\"mapped\\\",[19662]],[[195086,195086],\\\"mapped\\\",[19693]],[[195087,195087],\\\"mapped\\\",[40295]],[[195088,195088],\\\"mapped\\\",[172238]],[[195089,195089],\\\"mapped\\\",[19704]],[[195090,195090],\\\"mapped\\\",[172293]],[[195091,195091],\\\"mapped\\\",[172558]],[[195092,195092],\\\"mapped\\\",[172689]],[[195093,195093],\\\"mapped\\\",[40635]],[[195094,195094],\\\"mapped\\\",[19798]],[[195095,195095],\\\"mapped\\\",[40697]],[[195096,195096],\\\"mapped\\\",[40702]],[[195097,195097],\\\"mapped\\\",[40709]],[[195098,195098],\\\"mapped\\\",[40719]],[[195099,195099],\\\"mapped\\\",[40726]],[[195100,195100],\\\"mapped\\\",[40763]],[[195101,195101],\\\"mapped\\\",[173568]],[[195102,196605],\\\"disallowed\\\"],[[196606,196607],\\\"disallowed\\\"],[[196608,262141],\\\"disallowed\\\"],[[262142,262143],\\\"disallowed\\\"],[[262144,327677],\\\"disallowed\\\"],[[327678,327679],\\\"disallowed\\\"],[[327680,393213],\\\"disallowed\\\"],[[393214,393215],\\\"disallowed\\\"],[[393216,458749],\\\"disallowed\\\"],[[458750,458751],\\\"disallowed\\\"],[[458752,524285],\\\"disallowed\\\"],[[524286,524287],\\\"disallowed\\\"],[[524288,589
821],\\\"disallowed\\\"],[[589822,589823],\\\"disallowed\\\"],[[589824,655357],\\\"disallowed\\\"],[[655358,655359],\\\"disallowed\\\"],[[655360,720893],\\\"disallowed\\\"],[[720894,720895],\\\"disallowed\\\"],[[720896,786429],\\\"disallowed\\\"],[[786430,786431],\\\"disallowed\\\"],[[786432,851965],\\\"disallowed\\\"],[[851966,851967],\\\"disallowed\\\"],[[851968,917501],\\\"disallowed\\\"],[[917502,917503],\\\"disallowed\\\"],[[917504,917504],\\\"disallowed\\\"],[[917505,917505],\\\"disallowed\\\"],[[917506,917535],\\\"disallowed\\\"],[[917536,917631],\\\"disallowed\\\"],[[917632,917759],\\\"disallowed\\\"],[[917760,917999],\\\"ignored\\\"],[[918000,983037],\\\"disallowed\\\"],[[983038,983039],\\\"disallowed\\\"],[[983040,1048573],\\\"disallowed\\\"],[[1048574,1048575],\\\"disallowed\\\"],[[1048576,1114109],\\\"disallowed\\\"],[[1114110,1114111],\\\"disallowed\\\"]]\");\n\n/***/ }),\n\n/***/ 2357:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"assert\");\n\n/***/ }),\n\n/***/ 3129:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"child_process\");\n\n/***/ }),\n\n/***/ 6417:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"crypto\");\n\n/***/ }),\n\n/***/ 8614:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"events\");\n\n/***/ }),\n\n/***/ 5747:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"fs\");\n\n/***/ }),\n\n/***/ 8605:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"http\");\n\n/***/ }),\n\n/***/ 7211:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"https\");\n\n/***/ }),\n\n/***/ 1631:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"net\");\n\n/***/ }),\n\n/***/ 2087:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"os\");\n\n/***/ }),\n\n/***/ 5622:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"path\");\n\n/***/ }),\n\n/***/ 4213:\n/***/ 
((module) => {\n\n\"use strict\";\nmodule.exports = require(\"punycode\");\n\n/***/ }),\n\n/***/ 2413:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"stream\");\n\n/***/ }),\n\n/***/ 4016:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"tls\");\n\n/***/ }),\n\n/***/ 8835:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"url\");\n\n/***/ }),\n\n/***/ 1669:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"util\");\n\n/***/ }),\n\n/***/ 8761:\n/***/ ((module) => {\n\n\"use strict\";\nmodule.exports = require(\"zlib\");\n\n/***/ })\n\n/******/ \t});\n/************************************************************************/\n/******/ \t// The module cache\n/******/ \tvar __webpack_module_cache__ = {};\n/******/ \t\n/******/ \t// The require function\n/******/ \tfunction __webpack_require__(moduleId) {\n/******/ \t\t// Check if module is in cache\n/******/ \t\tif(__webpack_module_cache__[moduleId]) {\n/******/ \t\t\treturn __webpack_module_cache__[moduleId].exports;\n/******/ \t\t}\n/******/ \t\t// Create a new module (and put it into the cache)\n/******/ \t\tvar module = __webpack_module_cache__[moduleId] = {\n/******/ \t\t\t// no module.id needed\n/******/ \t\t\t// no module.loaded needed\n/******/ \t\t\texports: {}\n/******/ \t\t};\n/******/ \t\n/******/ \t\t// Execute the module function\n/******/ \t\tvar threw = true;\n/******/ \t\ttry {\n/******/ \t\t\t__webpack_modules__[moduleId].call(module.exports, module, module.exports, __webpack_require__);\n/******/ \t\t\tthrew = false;\n/******/ \t\t} finally {\n/******/ \t\t\tif(threw) delete __webpack_module_cache__[moduleId];\n/******/ \t\t}\n/******/ \t\n/******/ \t\t// Return the exports of the module\n/******/ \t\treturn module.exports;\n/******/ \t}\n/******/ \t\n/************************************************************************/\n/******/ \t/* webpack/runtime/compat get default export */\n/******/ \t(() => {\n/******/ \t\t// 
getDefaultExport function for compatibility with non-harmony modules\n/******/ \t\t__webpack_require__.n = (module) => {\n/******/ \t\t\tvar getter = module && module.__esModule ?\n/******/ \t\t\t\t() => module['default'] :\n/******/ \t\t\t\t() => module;\n/******/ \t\t\t__webpack_require__.d(getter, { a: getter });\n/******/ \t\t\treturn getter;\n/******/ \t\t};\n/******/ \t})();\n/******/ \t\n/******/ \t/* webpack/runtime/define property getters */\n/******/ \t(() => {\n/******/ \t\t// define getter functions for harmony exports\n/******/ \t\t__webpack_require__.d = (exports, definition) => {\n/******/ \t\t\tfor(var key in definition) {\n/******/ \t\t\t\tif(__webpack_require__.o(definition, key) && !__webpack_require__.o(exports, key)) {\n/******/ \t\t\t\t\tObject.defineProperty(exports, key, { enumerable: true, get: definition[key] });\n/******/ \t\t\t\t}\n/******/ \t\t\t}\n/******/ \t\t};\n/******/ \t})();\n/******/ \t\n/******/ \t/* webpack/runtime/hasOwnProperty shorthand */\n/******/ \t(() => {\n/******/ \t\t__webpack_require__.o = (obj, prop) => Object.prototype.hasOwnProperty.call(obj, prop)\n/******/ \t})();\n/******/ \t\n/******/ \t/* webpack/runtime/make namespace object */\n/******/ \t(() => {\n/******/ \t\t// define __esModule on exports\n/******/ \t\t__webpack_require__.r = (exports) => {\n/******/ \t\t\tif(typeof Symbol !== 'undefined' && Symbol.toStringTag) {\n/******/ \t\t\t\tObject.defineProperty(exports, Symbol.toStringTag, { value: 'Module' });\n/******/ \t\t\t}\n/******/ \t\t\tObject.defineProperty(exports, '__esModule', { value: true });\n/******/ \t\t};\n/******/ \t})();\n/******/ \t\n/******/ \t/* webpack/runtime/compat */\n/******/ \t\n/******/ \t__webpack_require__.ab = __dirname + \"/\";/************************************************************************/\n/******/ \t// module exports must be returned from runtime so entry inlining is disabled\n/******/ \t// startup\n/******/ \t// Load entry module and return exports\n/******/ 
\treturn __webpack_require__(4582);\n/******/ })()\n;"
  },
  {
    "path": ".github/workflows/actions/release-notes/local.js",
    "content": "/*\n * This file is the main entry for local development and manual testing.\n */\n\nconst path = require('path');\nconst releaseNotes = require(path.resolve('release-notes.js'));\n\nconst {Octokit} = require(\"@octokit/rest\");\nconst github = new Octokit({auth: mustGetEnvVar('GITHUB_TOKEN')});\n\nreleaseNotes(\n  github,\n  \"buildpacks/pack\",\n  mustGetArg(0, \"milestone\"),\n  mustGetArg(1, \"config-path\")\n)\n  .then(console.log)\n  .catch(err => {\n    console.error(err);\n    process.exit(1);\n  });\n\nfunction mustGetArg(position, name) {\n  let value = process.argv[position + 2];\n  if (!value) {\n    console.error(`'${name}' must be provided as argument ${position}.`);\n    process.exit(1);\n  }\n  return value;\n}\n\nfunction mustGetEnvVar(envVar) {\n  let value = process.env[envVar];\n  if (!value) {\n    console.error(`'${envVar}' env var must be set.`);\n    process.exit(1);\n  }\n  return value;\n}"
  },
  {
    "path": ".github/workflows/actions/release-notes/package.json",
    "content": "{\n  \"name\": \"release-notes\",\n  \"version\": \"1.0.0\",\n  \"scripts\": {\n    \"local\": \"node local.js\",\n    \"build\": \"ncc build action.js -o dist/\"\n  },\n  \"dependencies\": {\n    \"@actions/core\": \"^1.10.0\",\n    \"@actions/github\": \"^6.0.1\",\n    \"@octokit/rest\": \"^22.0.0\",\n    \"yaml\": \"^1.10.0\"\n  },\n  \"devDependencies\": {\n    \"@vercel/ncc\": \"^0.24.1\"\n  }\n}\n"
  },
  {
    "path": ".github/workflows/actions/release-notes/release-notes.js",
    "content": "const {promises: fs} = require('fs');\nconst YAML = require('yaml');\n\nmodule.exports = async (github, repository, milestone, configPath) => {\n  return await fs.readFile(configPath, \"utf-8\")\n    .then(content => YAML.parse(content))\n    .then(config => {\n      let labelGroups = config.labels\n      let weightSortFunc = (a, b) => labelGroups[a].weight - labelGroups[b].weight\n\n      console.log(\"looking up PRs for milestone\", milestone, \"in repo\", repository);\n      return github.paginate(\"GET /search/issues\", {\n        q: `repo:${repository} is:pr is:merged milestone:${milestone}`,\n      }).then(issues => {\n        console.log(\"Issues count:\", issues.length);\n\n        let groupedIssues = groupIssuesByLabels(issues, Object.keys(labelGroups));\n\n        // generate issues list\n        let output = \"\";\n        for (let key of Object.keys(labelGroups).sort(weightSortFunc)) {\n          let displayGroup = labelGroups[key]\n          let issues = (groupedIssues[key] || []);\n          console.log(key, \"issues:\", issues.length);\n\n          if (issues.length > 0) {\n            output += `### ${displayGroup.title}\\n\\n`;\n            if (displayGroup.description) {\n              output += `${displayGroup.description.trim()}\\n\\n`;\n            }\n            issues.forEach(issue => {\n              output += createIssueEntry(issue);\n            });\n            output += \"\\n\";\n          }\n        }\n\n        let hiddenIssues = groupedIssues[\"\"] || [];\n        console.warn(\"Issues not displayed: \", hiddenIssues.length);\n        if (hiddenIssues.length > 0) {\n          console.warn(\" - \" + hiddenIssues.map(issue => issue.number).join(\", \"));\n        }\n\n        // generate contributors list\n        if (\n          config\n          && config.sections\n          && config.sections.contributors\n          && config.sections.contributors.title\n        ) {\n          output += `## 
${config.sections.contributors.title}\\n\\n`;\n\n          if (config.sections.contributors.description) {\n            output += `${config.sections.contributors.description.trim()}\\n\\n`;\n          }\n\n          let uniqueFunc = (value, index, self) => self.indexOf(value) === index\n          output += issues\n            .map(issueContrib)\n            .filter(uniqueFunc)\n            .sort()\n            .map(v => `@${v}`)\n            .join(\", \")\n        }\n\n        return output.trim();\n      });\n    })\n};\n\nfunction createIssueEntry(issue) {\n  return `* ${issue.title} (#${issue.number} by @${issueContrib(issue)})\\n`;\n}\n\nfunction issueContrib(issue) {\n  return issue.user.login;\n}\n\nfunction groupIssuesByLabels(issues, labels) {\n  return issues.reduce((groupedMap, issue) => {\n    let typeLabel = issue.labels\n      .filter(label => labels.includes(label.name))\n      .map(label => label.name)[0] || \"\";\n\n    (groupedMap[typeLabel] = groupedMap[typeLabel] || []).push(issue);\n\n    return groupedMap;\n  }, {})\n}"
  },
  {
    "path": ".github/workflows/benchmark.yml",
    "content": "name: benchmark\non:\n  push:\n    branches:\n      - main\n\npermissions:\n  contents: write\n  deployments: write\n\njobs:\n  benchmark:\n    name: Run Go benchmark example\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Set up go\n        uses: actions/setup-go@v6\n        with:\n          go-version: 1.25\n          check-latest: true\n          go-version-file: 'go.mod'\n      - name: Set up go env\n        run: |\n          echo \"GOPATH=$(go env GOPATH)\" >> $GITHUB_ENV\n          echo \"$(go env GOPATH)/bin\" >> $GITHUB_PATH\n        shell: bash\n      - name: Run benchmark\n        run: |\n             git clone https://github.com/buildpacks/samples.git\n             mkdir out || (exit 0)\n             go test -bench=. -benchtime=1s ./benchmarks/... -tags=benchmarks | tee ./out/benchmark.txt\n\n      - name: Store benchmark result\n        uses: benchmark-action/github-action-benchmark@v1\n        with:\n          name: Go Benchmark\n          tool: 'go'\n          output-file-path: ./out/benchmark.txt\n          github-token: ${{ secrets.GITHUB_TOKEN }}\n          auto-push: true\n          # Show alert with commit comment on detecting possible performance regression\n          alert-threshold: '200%'\n          comment-on-alert: true\n          fail-on-alert: true\n          alert-comment-cc-users: '@buildpacks/platform-maintainers'\n"
  },
  {
    "path": ".github/workflows/build.yml",
    "content": "name: build\n\non:\n  push:\n    branches:\n      - main\n      - 'release/**'\n  pull_request:\n    paths-ignore:\n      - '**.md'\n      - 'resources/**'\n      - 'CODEOWNERS'\n      - 'LICENSE'\n    branches:\n      - main\n      - 'release/**'\n\njobs:\n  test:\n    strategy:\n      fail-fast: false\n      matrix:\n        config: [macos, linux, windows]\n        include:\n          - config: macos\n            # since macos-14 the latest runner is arm64\n            os: macos-arm64\n            runner: macos-latest\n            no_docker: \"true\"\n            pack_bin: pack\n          - config: linux\n            os: linux\n            runner: ubuntu-latest\n            no_docker: \"false\"\n            pack_bin: pack\n          - config: windows\n            os: windows\n            runner: windows-latest\n            no_docker: \"true\"\n            pack_bin: pack.exe\n    runs-on: ${{ matrix.runner }}\n    env:\n      PACK_BIN: ${{ matrix.pack_bin }}\n      NO_DOCKER: ${{ matrix.no_docker }}\n    steps:\n      - name: Set git to use LF and symlinks\n        if: matrix.os == 'windows'\n        run: |\n          git config --global core.autocrlf false\n          git config --global core.eol lf\n          git config --global core.symlinks true\n      - uses: actions/checkout@v4\n      - name: Derive pack version from branch name Unix\n        if: runner.os != 'Windows'\n        run: |\n          [[ $GITHUB_REF =~ ^refs\\/heads\\/release/(.*)$ ]] && version=${BASH_REMATCH[1]} || version=0.0.0\n          echo \"PACK_VERSION=${version}\" >> $GITHUB_ENV\n        shell: bash\n      - name: Derive pack version from branch name Windows\n        if: runner.os == 'Windows'\n        run: |\n          if ($Env:GITHUB_REF -match '^refs\\/heads\\/release/(.*)$') {\n            $refmatch=$Matches[1]\n            echo \"PACK_VERSION=${refmatch}\" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append\n          }\n          else {\n            echo 
\"PACK_VERSION=0.0.0\" | Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append\n          }\n        shell: powershell\n      - name: Set up go\n        uses: actions/setup-go@v6\n        with:\n          check-latest: true\n          go-version-file: 'go.mod'\n      - name: Set up go env for Unix\n        if: runner.os != 'Windows'\n        run: |\n          echo \"GOPATH=$(go env GOPATH)\" >> $GITHUB_ENV\n          echo \"$(go env GOPATH)/bin\" >> $GITHUB_PATH\n        shell: bash\n      - name: Set up go env for Windows\n        if: runner.os == 'Windows'\n        run: |\n          echo \"GOPATH=$(go env GOPATH)\"| Out-File -FilePath $Env:GITHUB_ENV -Encoding utf8 -Append\n          echo \"$(go env GOPATH)\\bin\" | Out-File -FilePath $Env:GITHUB_PATH -Encoding utf8 -Append\n        shell: powershell\n      - name: Install Make on Windows\n        if: runner.os == 'Windows'\n        run: choco install make -y\n      - name: Verify\n        run: make verify\n      - name: Test\n        env:\n          TEST_COVERAGE: 1\n          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        run: make test\n      - name: Upload Coverage\n        uses: codecov/codecov-action@v3\n        with:\n          token: ${{ secrets.CODECOV_TOKEN }}\n          file: ./out/tests/coverage-unit.txt\n          flags: unit,os_${{ matrix.os }}\n          fail_ci_if_error: false\n          verbose: true\n      - name: Prepare Codecov\n        if: matrix.os == 'windows'\n        uses: crazy-max/ghaction-chocolatey@v3\n        with:\n          args: install codecov -y\n      - name: run Codecov\n        if: matrix.os == 'windows'\n        run: |\n          codecov.exe -f ./out/tests/coverage-unit.txt -v --flag os_windows\n      - name: Build Unix\n        if: runner.os != 'Windows'\n        run: |\n          make build\n        env:\n          PACK_BUILD: ${{ github.run_number }}\n        shell: bash\n      - name: Build Windows\n        if: runner.os == 'Windows'\n        run: |\n          
make build\n        env:\n          PACK_BUILD: ${{ github.run_number }}\n        shell: powershell\n      - uses: actions/upload-artifact@v4\n        with:\n          name: pack-${{ matrix.os }}\n          path: out/${{ env.PACK_BIN }}\n  build-additional-archs:\n    if: ${{ startsWith(github.ref, 'refs/heads/release/') }}\n    strategy:\n      fail-fast: false\n      matrix:\n        include:\n          - name: freebsd\n            goarch: amd64\n            goos: freebsd\n          - name: freebsd-arm64\n            goarch: arm64\n            goos: freebsd\n          - name: linux-arm64\n            goarch: arm64\n            goos: linux\n          - name: macos\n            # since macos-14 default runner is arm, we need to build for intel architecture later\n            goarch: amd64\n            goos: darwin\n          - name: linux-s390x\n            goarch: s390x\n            goos: linux\n          - name: linux-ppc64le\n            goarch: ppc64le\n            goos: linux\n    needs: test\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Set up go\n        uses: actions/setup-go@v6\n        with:\n          check-latest: true\n          go-version-file: 'go.mod'\n      - name: Build\n        run: |\n          [[ $GITHUB_REF =~ ^refs\\/heads\\/release/(.*)$ ]] && version=${BASH_REMATCH[1]} || version=0.0.0\n          env PACK_VERSION=${version} GOARCH=${{ matrix.goarch }} GOOS=${{ matrix.goos }} make build\n        env:\n          PACK_BUILD: ${{ github.run_number }}\n      - uses: actions/upload-artifact@v4\n        with:\n          name: pack-${{ matrix.name }}\n          path: out/${{ env.PACK_BIN }}\n  release:\n    if: ${{ startsWith(github.ref, 'refs/heads/release/') }}\n    needs: build-additional-archs\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Derive pack version from branch name\n        shell: bash\n        run: |\n          echo \"GITHUB_REF=${GITHUB_REF}\"\n  
        [[ $GITHUB_REF =~ ^refs\\/heads\\/release\\/(.*)$ ]] && version=${BASH_REMATCH[1]}\n          if [[ -z \"${version}\" ]]; then\n            echo \"ERROR: pack version not detected.\"\n            exit 1\n          fi\n          echo \"PACK_VERSION=${version}\" >> $GITHUB_ENV\n\n          [[ \"${version}\" =~ ^([^-]+).*$ ]] && milestone=${BASH_REMATCH[1]}\n          if [[ -z \"${milestone}\" ]]; then\n            echo \"ERROR: couldn't determine the milestone to lookup from version: ${version}.\"\n            exit 1\n          fi\n\n          echo \"PACK_MILESTONE=${milestone}\" >> $GITHUB_ENV\n      - name: Download artifacts\n        uses: actions/download-artifact@v5\n      - name: Package artifacts - macos\n        run: |\n          chmod +x pack-macos/pack\n          filename=pack-v${{ env.PACK_VERSION }}-macos.tgz\n          tar -C pack-macos -vzcf $filename pack\n          shasum -a 256 $filename > $filename.sha256\n      - name: Package artifacts - freebsd\n        run: |\n          chmod +x pack-freebsd/pack\n          filename=pack-v${{ env.PACK_VERSION }}-freebsd.tgz\n          tar -C pack-freebsd -vzcf $filename pack\n          shasum -a 256 $filename > $filename.sha256\n      - name: Package artifacts - freebsd-arm64\n        run: |\n          chmod +x pack-freebsd-arm64/pack\n          filename=pack-v${{ env.PACK_VERSION }}-freebsd-arm64.tgz\n          tar -C pack-freebsd-arm64 -vzcf $filename pack\n          shasum -a 256 $filename > $filename.sha256\n      - name: Package artifacts - linux-arm64\n        run: |\n          chmod +x pack-linux-arm64/pack\n          filename=pack-v${{ env.PACK_VERSION }}-linux-arm64.tgz\n          tar -C pack-linux-arm64 -vzcf $filename pack\n          shasum -a 256 $filename > $filename.sha256\n      - name: Package artifacts - linux-s390x\n        run: |\n          chmod +x pack-linux-s390x/pack\n          filename=pack-v${{ env.PACK_VERSION }}-linux-s390x.tgz\n          tar -C pack-linux-s390x -vzcf $filename 
pack\n          shasum -a 256 $filename > $filename.sha256\n      - name: Package artifacts - linux-ppc64le\n        run: |\n          chmod +x pack-linux-ppc64le/pack\n          filename=pack-v${{ env.PACK_VERSION }}-linux-ppc64le.tgz\n          tar -C pack-linux-ppc64le -vzcf $filename pack\n          shasum -a 256 $filename > $filename.sha256\n      - name: Package artifacts - macos-arm64\n        run: |\n          chmod +x pack-macos-arm64/pack\n          filename=pack-v${{ env.PACK_VERSION }}-macos-arm64.tgz\n          tar -C pack-macos-arm64 -vzcf $filename pack\n          shasum -a 256 $filename > $filename.sha256\n      - name: Package artifacts - linux\n        run: |\n          chmod +x pack-linux/pack\n          filename=pack-v${{ env.PACK_VERSION }}-linux.tgz\n          tar -C pack-linux -vzcf $filename pack\n          shasum -a 256 $filename > $filename.sha256\n      - name: Package artifacts - windows\n        run: |\n          filename=pack-v${{ env.PACK_VERSION }}-windows.zip\n          zip -j $filename pack-windows/pack.exe\n          shasum -a 256 $filename > $filename.sha256\n      - name: Extract lifecycle version\n        id: lifecycle_version\n        run: |\n          LIFECYCLE_VERSION=$(./pack-linux/pack report | grep 'Default Lifecycle Version:' | grep -o '[^ ]*$')\n          echo \"version=$LIFECYCLE_VERSION\" >> $GITHUB_OUTPUT\n      - name: Extract pack help\n        id: pack_help\n        # Multiline outputs use a syntax similar to heredocs.\n        # see https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#multiline-strings\n        run: |\n          DELIMITER=\"$(uuidgen)\"\n          echo \"help<<${DELIMITER}\" >> $GITHUB_OUTPUT\n          ./pack-linux/pack --help >> $GITHUB_OUTPUT\n          echo \"${DELIMITER}\" >> $GITHUB_OUTPUT\n      - name: Generate changelog\n        uses: ./.github/workflows/actions/release-notes\n        id: changelog\n        with:\n          github-token: ${{ 
secrets.GITHUB_TOKEN }}\n          milestone: ${{ env.PACK_MILESTONE }}\n      - name: Create Pre-Release\n        if: ${{ env.PACK_VERSION != env.PACK_MILESTONE }}\n        uses: softprops/action-gh-release@v2\n        env:\n          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        with:\n          target_commitish: ${{ github.sha }}\n          tag_name: v${{ env.PACK_VERSION }}\n          name: pack v${{ env.PACK_VERSION }}\n          draft: true\n          prerelease: true\n          files: pack-v${{ env.PACK_VERSION }}-*\n          body: |\n            ## Prerequisites\n\n            - A container runtime such as [Docker](https://www.docker.com/get-started) or [podman](https://podman.io/get-started) must be available to execute builds.\n\n            ## Install\n\n            #### FreeBSD\n\n            ##### AMD64\n\n            ```bash\n            (curl -sSL \"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-freebsd.tgz\" | sudo tar -C /usr/local/bin/ --no-same-owner -xzv pack)\n            ```\n\n            ##### ARM64\n\n            ```bash\n            (curl -sSL \"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-freebsd-arm64.tgz\" | sudo tar -C /usr/local/bin/ --no-same-owner -xzv pack)\n            ```\n\n            #### Linux\n\n            ##### AMD64\n\n            ```bash\n            (curl -sSL \"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-linux.tgz\" | sudo tar -C /usr/local/bin/ --no-same-owner -xzv pack)\n            ```\n\n            ##### ARM64\n\n            ```bash\n            (curl -sSL \"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-linux-arm64.tgz\" | sudo tar -C /usr/local/bin/ --no-same-owner -xzv pack)\n            ```\n\n            ##### S390X\n\n            ```bash\n            (curl 
-sSL \"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-linux-s390x.tgz\" | sudo tar -C /usr/local/bin/ --no-same-owner -xzv pack)\n            ```\n            ##### PPC64LE\n\n            ```bash\n            (curl -sSL \"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-linux-ppc64le.tgz\" | sudo tar -C /usr/local/bin/ --no-same-owner -xzv pack)\n            ```\n\n            #### MacOS\n\n            ##### Intel\n\n            ```bash\n            (curl -sSL \"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-macos.tgz\" | sudo tar -C /usr/local/bin/ --no-same-owner -xzv pack)\n            ```\n\n            ##### Apple Silicon\n\n            ```bash\n            (curl -sSL \"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-macos-arm64.tgz\" | sudo tar -C /usr/local/bin/ --no-same-owner -xzv pack)\n            ```\n            #### Manually\n\n            1. Download the `.tgz` or `.zip` file for your platform\n            2. Extract the `pack` binary\n            3. 
(Optional) Add the directory containing `pack` to `PATH`, or copy `pack` to a directory like `/usr/local/bin`\n\n            ## Run\n\n            Run the command `pack`.\n\n            You should see the following output:\n\n            ```text\n            ${{ steps.pack_help.outputs.help }}\n            ```\n\n            ## Info\n\n            Builders created with this release of the pack CLI contain [lifecycle v${{ steps.lifecycle_version.outputs.version }}](https://github.com/buildpacks/lifecycle/releases/tag/v${{ steps.lifecycle_version.outputs.version }}) by default.\n\n            ## Changelog\n\n            ${{ steps.changelog.outputs.contents }}\n\n      - name: Create Beta Release\n        if: ${{ env.PACK_VERSION == env.PACK_MILESTONE }}\n        uses: softprops/action-gh-release@v2\n        env:\n          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        with:\n          tag_name: v${{ env.PACK_VERSION }}\n          name: pack v${{ env.PACK_VERSION }}\n          draft: true\n          files: pack-v${{ env.PACK_VERSION }}-*\n          body: |\n            ## Prerequisites\n\n            - A container runtime such as [Docker](https://www.docker.com/get-started) or [podman](https://podman.io/get-started) must be available to execute builds.\n\n            ## Install\n\n            For instructions on installing `pack`, see our [installation docs](https://buildpacks.io/docs/tools/pack/cli/install/).\n\n            ## Run\n\n            Run the command `pack`.\n\n            You should see the following output:\n\n            ```text\n            ${{ steps.pack_help.outputs.help }}\n            ```\n\n            ## Info\n\n            Builders created with this release of the pack CLI contain [lifecycle v${{ steps.lifecycle_version.outputs.version }}](https://github.com/buildpacks/lifecycle/releases/tag/v${{ steps.lifecycle_version.outputs.version }}) by default.\n\n            ## Changelog\n\n            ${{ steps.changelog.outputs.contents }}\n"
  },
  {
    "path": ".github/workflows/check-latest-release.yml",
    "content": "name: Check latest pack release\n\non:\n  schedule:\n    - cron: 0 2 * * 1,4\n  workflow_dispatch: {}\n\njobs:\n  check-release:\n    runs-on:\n      - ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - uses: actions/setup-go@v6\n        with:\n          go-version-file: 'go.mod'\n      - name: Read go versions\n        id: read-go\n        env:\n          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        run: |\n          #!/usr/bin/env bash\n          \n          set -euo pipefail\n          \n          LATEST_GO_VERSION=$(go version | cut -d ' ' -f 3)\n          \n          LATEST_RELEASE_VERSION=$(gh release list --exclude-drafts --exclude-pre-releases -L 1 | cut -d $'\\t' -f 1 | cut -d ' ' -f 2)\n          \n          wget https://github.com/$GITHUB_REPOSITORY/releases/download/$LATEST_RELEASE_VERSION/pack-$LATEST_RELEASE_VERSION-linux.tgz -O out.tgz\n          tar xzf out.tgz\n          LATEST_RELEASE_GO_VERSION=$(go version ./pack | cut -d ' ' -f 2)\n          \n          echo \"latest-go-version=${LATEST_GO_VERSION}\" >> \"$GITHUB_OUTPUT\"\n          echo \"latest-release-go-version=${LATEST_RELEASE_GO_VERSION}\" >> \"$GITHUB_OUTPUT\"\n          \n          LATEST_RELEASE_VERSION=$(echo $LATEST_RELEASE_VERSION | cut -d \\v -f 2)\n          echo \"latest-release-version=${LATEST_RELEASE_VERSION}\" >> \"$GITHUB_OUTPUT\"\n      - name: Create issue if needed\n        if: ${{ steps.read-go.outputs.latest-go-version != steps.read-go.outputs.latest-release-go-version }}\n        env:\n          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        run: |\n          #!/usr/bin/env bash\n          \n          set -euo pipefail\n          \n          title=\"Upgrade $GITHUB_REPOSITORY to ${{ steps.read-go.outputs.latest-go-version }}\"\n          label=${{ steps.read-go.outputs.latest-go-version }}\n          \n          # Create label to use for exact search\n          gh label create \"$label\" || true\n          \n          
search_output=$(gh issue list --search \"$title\" --label \"$label\")\n          \n          body=\"Latest $GITHUB_REPOSITORY release v${{ steps.read-go.outputs.latest-release-version }} is built with Go version ${{ steps.read-go.outputs.latest-release-go-version }}; newer version ${{ steps.read-go.outputs.latest-go-version }} is available.\"\n          \n          if [ -z \"${search_output// }\" ]\n          then\n            echo \"No issues matched search; creating new issue...\"\n            gh issue create \\\n              --label \"type/bug\" \\\n              --label \"status/triage\" \\\n              --label \"$label\" \\\n              --title \"$title\" \\\n              --body \"$body\"\n          else\n            echo \"Found matching issues:\"\n            echo $search_output\n          fi\n      - name: Scan latest release image\n        id: scan-image\n        uses: anchore/scan-action@v6\n        with:\n          image: docker.io/buildpacksio/pack:${{ steps.read-go.outputs.latest-release-version }}\n      - name: Create issue if needed\n        if: failure() && steps.scan-image.outcome == 'failure'\n        env:\n          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        run: |\n            #!/usr/bin/env bash\n\n            set -euo pipefail\n\n            title=\"CVE(s) found\"\n            label=cve\n\n            # Create label to use for exact search\n            gh label create \"$label\" || true\n\n            search_output=$(gh issue list --search \"$title\" --label \"$label\")\n\n            GITHUB_WORKFLOW_URL=https://github.com/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID\n            body=\"Latest docker.io/buildpacksio/pack v${{ steps.read-go.outputs.latest-release-version }} triggered CVE(s) from Grype. 
For further details, see: $GITHUB_WORKFLOW_URL\"\n\n            if [ -z \"${search_output// }\" ]\n            then\n              echo \"No issues matched search; creating new issue...\"\n              gh issue create \\\n              --label \"type/bug\" \\\n              --label \"status/triage\" \\\n              --label \"$label\" \\\n              --title \"$title\" \\\n              --body \"$body\"\n            else\n              echo \"Found matching issues:\"\n              echo $search_output\n            fi\n"
  },
  {
    "path": ".github/workflows/codeql-analysis.yml",
    "content": "# For most projects, this workflow file will not need changing; you simply need\n# to commit it to your repository.\n#\n# You may wish to alter this file to override the set of languages analyzed,\n# or to provide custom queries or build logic.\nname: \"CodeQL\"\n\non:\n  push:\n    branches: [main, 'releases/*.*.*']\n  pull_request:\n    # The branches below must be a subset of the branches above\n    branches: [main, 'releases/*.*.*']\n  schedule:\n    - cron: '0 16 * * 3'\n\njobs:\n  analyze:\n    name: Analyze\n    runs-on: ubuntu-latest\n\n    strategy:\n      fail-fast: false\n      matrix:\n        # Override automatic language detection by changing the below list\n        # Supported options are ['csharp', 'cpp', 'go', 'java', 'javascript', 'python']\n        language: ['go']\n        # Learn more...\n        # https://docs.github.com/en/github/finding-security-vulnerabilities-and-errors-in-your-code/configuring-code-scanning#overriding-automatic-language-detection\n\n    steps:\n    - name: Checkout repository\n      uses: actions/checkout@v4\n\n    # Initializes the CodeQL tools for scanning.\n    - name: Initialize CodeQL\n      uses: github/codeql-action/init@v3\n      with:\n        languages: ${{ matrix.language }}\n        # If you wish to specify custom queries, you can do so here or in a config file.\n        # By default, queries listed here will override any specified in a config file.\n        # Prefix the list here with \"+\" to use these queries and those in the config file.\n        # queries: ./path/to/local/query, your-org/your-repo/queries@main\n\n    # Autobuild attempts to build any compiled languages  (C/C++, C#, or Java).\n    # If this step fails, then you should remove it and run the build manually (see below)\n    - name: Autobuild\n      uses: github/codeql-action/autobuild@v3\n\n    # ℹ️ Command-line programs to run using the OS shell.\n    # 📚 https://git.io/JvXDl\n\n    # ✏️ If the Autobuild fails above, remove 
it and uncomment the following three lines\n    #    and modify them (or add more) to build your code if your project\n    #    uses a compiled language\n\n    #- run: |\n    #   make bootstrap\n    #   make release\n\n    - name: Perform CodeQL Analysis\n      uses: github/codeql-action/analyze@v3\n"
  },
  {
    "path": ".github/workflows/compatibility.yml",
    "content": "name: compatibility\n\non:\n  push:\n    paths-ignore:\n      - '**.md'\n      - 'resources/**'\n      - 'CODEOWNERS'\n      - 'LICENSE'\n    branches:\n      - main\n      - 'release/**'\n  pull_request:\n    paths-ignore:\n      - '**.md'\n      - 'resources/**'\n      - 'CODEOWNERS'\n      - 'LICENSE'\n    branches:\n      - main\n      - 'release/**'\n\njobs:\n  acceptance-combo:\n    strategy:\n      matrix:\n        pack_kind: [current, previous]\n        create_builder_kind: [current, previous]\n        lifecycle_kind: [current, previous]\n        exclude:\n          # For all previous versions these were tested prior to release\n          - pack_kind: previous\n            create_builder_kind: previous\n            lifecycle_kind: previous\n          # Previous versions of pack cannot create a builder with a newer version of lifecycle\n          - pack_kind: current\n            create_builder_kind: previous\n            lifecycle_kind: current\n          # Previous versions of pack cannot create a builder with a newer version of lifecycle\n          - pack_kind: previous\n            create_builder_kind: previous\n            lifecycle_kind: current\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Set up go\n        uses: actions/setup-go@v6\n        with:\n          check-latest: true\n          go-version-file: 'go.mod'\n      - name: Set up go env\n        run: |\n          echo \"GOPATH=$(go env GOPATH)\" >> $GITHUB_ENV\n          echo \"$(go env GOPATH)/bin\" >> $GITHUB_PATH\n        shell: bash\n      - name: Acceptance\n        env:\n          ACCEPTANCE_SUITE_CONFIG: '[{\"pack\": \"${{ matrix.pack_kind }}\", \"pack_create_builder\": \"${{ matrix.create_builder_kind }}\", \"lifecycle\": \"${{ matrix.lifecycle_kind }}\"}]'\n          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        run: |\n          make acceptance\n          docker network prune\n"
  },
  {
    "path": ".github/workflows/delivery/archlinux/README.md",
    "content": "# Arch Linux\n\nThere are two maintained packages by us and one official archlinux package:\n\n- [pack-cli](https://archlinux.org/packages/extra/x86_64/pack-cli/): Official Archlinux package in the 'Extra' repo.\n- [pack-cli-bin](https://aur.archlinux.org/packages/pack-cli-bin/): The latest release of `pack`, precompiled.\n- [pack-cli-git](https://aur.archlinux.org/packages/pack-cli-git/): An unreleased version of `pack`, compiled from source of the `main` branch.\n\n\n## Current State\n\nThe following depicts the current state of automation:\n\n| package      | tested | distributed |\n| ---          | ---    | ---         |\n| pack-cli     | yes    | yes         |\n| pack-cli-bin | yes    | yes         |\n| pack-cli-git | yes    | yes         |\n\n## Run Locally\n\n> **CAUTION:** This makes changes directly to the published packages. To prevent changes, comment out `git push` in `publish-package.sh`.\n\n```shell script\ndocker pull nektos/act-environments-ubuntu:18.04\ndocker pull archlinux:latest\n\nexport GITHUB_TOKEN=\"<YOUR_GH_TOKEN>\"\nexport AUR_KEY=\"<AUR_KEY>\"\n\nact -P ubuntu-latest=nektos/act-environments-ubuntu:18.04 \\\n    -e .github/workflows/testdata/event-release.json \\\n    -s GITHUB_TOKEN -s AUR_KEY \\\n    -j <JOB_NAME>\n```\n"
  },
  {
    "path": ".github/workflows/delivery/archlinux/pack-cli-bin/PKGBUILD",
    "content": "# Maintainer: Michael William Le Nguyen <michael at mail dot ttp dot codes>\n# Maintainer: Buildpacks Maintainers <cncf-buildpacks-maintainers at lists dot cncf dot io>\npkgname=pack-cli-bin\npkgver={{PACK_VERSION}}\npkgrel=1\npkgdesc=\"CLI for building apps using Cloud Native Buildpacks\"\narch=('x86_64')\nurl=\"https://buildpacks.io/\"\nlicense=('Apache')\nprovides=('pack-cli')\nconflicts=('pack-cli')\nsource=(\"{{BIN_TGZ_URL}}\")\nsha512sums=(\"{{BIN_TGZ_SHA}}\")\npackage() {\n\tinstall -D -m755 \"${srcdir}/pack\" \"${pkgdir}/usr/bin/pack\"\n}"
  },
  {
    "path": ".github/workflows/delivery/archlinux/pack-cli-git/PKGBUILD",
    "content": "# Maintainer: Michael William Le Nguyen <michael at mail dot ttp dot codes>\n# Maintainer: Buildpacks Maintainers <cncf-buildpacks-maintainers at lists dot cncf dot io>\npkgname=pack-cli-git\npkgver={{PACK_VERSION}}+r{{GIT_REVISION}}.g{{GIT_COMMIT}}\npkgrel=1\npkgdesc=\"CLI for building apps using Cloud Native Buildpacks\"\narch=('x86_64')\nurl=\"https://buildpacks.io/\"\nlicense=('Apache')\nmakedepends=(\n\t'git'\n\t'go-pie'\n)\nprovides=('pack-cli')\nconflicts=('pack-cli')\nsource=(\"${pkgname}::git+https://github.com/buildpacks/pack\")\nsha512sums=(\"SKIP\")\nbuild() {\n\texport GOPATH=\"${srcdir}/go\"\n\tcd \"${srcdir}/${pkgname}\"\n\tPACK_VERSION={{PACK_VERSION}} make build\n}\npackage() {\n\texport GOPATH=\"${srcdir}/go\"\n\tgo clean -modcache\n\tinstall -D -m755 \"${srcdir}/${pkgname}/out/pack\" \"${pkgdir}/usr/bin/pack\"\n}"
  },
  {
    "path": ".github/workflows/delivery/archlinux/publish-package.sh",
    "content": "#!/usr/bin/env bash\nset -e\nset -u\n\n# ensure variable is set\n: \"$PACK_VERSION\"\n: \"$PACKAGE_NAME\"\n: \"$AUR_KEY\"\n: \"$GITHUB_WORKSPACE\"\n\nPACKAGE_DIR=\"${GITHUB_WORKSPACE}/${PACKAGE_NAME}\"\nPACKAGE_AUR_DIR=\"${GITHUB_WORKSPACE}/${PACKAGE_NAME}-aur\"\n\n# setup non-root user\nuseradd -m archie\n\n# add non-root user to sudoers\npacman -Sy --noconfirm sudo\necho 'archie ALL=(ALL:ALL) NOPASSWD:ALL' >> /etc/sudoers\n\necho '> Install dependencies'\npacman -Sy --noconfirm git openssh base-devel libffi\n\necho '> Configuring ssh...'\nSSH_HOME=\"/root/.ssh\"\nmkdir -p \"${SSH_HOME}\"\nchmod 700 \"${SSH_HOME}\"\n\necho '> Starting ssh-agent...'\neval $(ssh-agent)\n\necho '> Add Github to known_hosts...'\nssh-keyscan -H aur.archlinux.org >> \"${SSH_HOME}/known_hosts\"\nchmod 644 \"${SSH_HOME}/known_hosts\"\n\necho '> Adding AUR_KEY...'\nssh-add - <<< \"$AUR_KEY\"\n\necho '> Cloning aur...'\ngit clone \"ssh://aur@aur.archlinux.org/${PACKAGE_NAME}.git\" \"${PACKAGE_AUR_DIR}\"\nchown -R archie \"${PACKAGE_AUR_DIR}\"\npushd \"${PACKAGE_AUR_DIR}\" > /dev/null\n  echo '> Declare directory ${PACKAGE_AUR_DIR} as safe'\n  git config --global --add safe.directory \"${PACKAGE_AUR_DIR}\"\n\n  echo '> Checking out master...'\n  git checkout master\n\n  echo '> Applying changes...'\n  rm -rf ./*\n  cp -R \"${PACKAGE_DIR}\"/* ./\n  \n  su archie -c \"makepkg --printsrcinfo\" > .SRCINFO  \n  \n  echo '> Committing changes...'\n  git config --global user.name \"github-bot\"\n  git config --global user.email \"action@github.com\"\n  git diff --color | cat\n  git add .\n  git commit -m \"Version ${PACK_VERSION}\"\n  git push -f\n\npopd > /dev/null\n"
  },
  {
    "path": ".github/workflows/delivery/archlinux/test-install-package.sh",
    "content": "#!/usr/bin/env bash\nset -e\nset -u\n\n# ensure variable is set\n: \"$PACKAGE_NAME\"\n: \"$GITHUB_WORKSPACE\"\n\n# setup non-root user\nuseradd -m archie\n\n# add non-root user to sudoers\npacman -Sy --noconfirm sudo\necho 'archie ALL=(ALL:ALL) NOPASSWD:ALL' >> /etc/sudoers\n\n# setup workspace\nWORKSPACE=$(mktemp -d -t \"$PACKAGE_NAME-XXXXXXXXXX\")\ncp -R \"$GITHUB_WORKSPACE/$PACKAGE_NAME/\"* \"$WORKSPACE\"\nchown -R archie \"$WORKSPACE\"\n\n# run everything else as non-root user\npushd \"$WORKSPACE\" > /dev/null\nsu archie << \"EOF\"\necho -n '> Debug info:'\nls -al\nsha512sum ./*\n\necho -n '> Installing AUR packaging deps...'\nsudo pacman -Sy --noconfirm git base-devel libffi\n\necho -n '> Installing package...'\nmakepkg -sri --noconfirm\n\n# print version\necho -n '> Installed pack version: '\npack --version\nEOF\npopd > /dev/null\n"
  },
  {
    "path": ".github/workflows/delivery/chocolatey/pack.nuspec",
    "content": "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n<!-- Do not remove this test for UTF-8: if “Ω” doesn’t appear as greek uppercase omega letter enclosed in quotation marks, you should use an editor that supports UTF-8, not this one. -->\n<package xmlns=\"http://schemas.microsoft.com/packaging/2015/06/nuspec.xsd\">\n  <metadata>\n    <id>pack</id>\n    <version>{{PACK_VERSION}}</version>\n    <packageSourceUrl>https://github.com/buildpacks/pack</packageSourceUrl>\n    <owners>Cloud Native Buildpack Authors</owners>\n\n    <title>pack</title>\n    <authors>Cloud Native Buildpack Authors</authors>\n    <projectUrl>https://github.com/buildpacks/pack</projectUrl>\n    <iconUrl>https://rawcdn.githack.com/buildpacks/artwork/36f02fae98ed82bb3175918796b6cca7acb813df/light-background/logo-light.png</iconUrl>\n    <licenseUrl>https://github.com/buildpacks/pack/blob/main/LICENSE</licenseUrl>\n    <requireLicenseAcceptance>false</requireLicenseAcceptance>\n    <projectSourceUrl>https://github.com/buildpacks/pack</projectSourceUrl>\n    <docsUrl>https://buildpacks.io/docs/</docsUrl>\n    <mailingListUrl>https://lists.cncf.io/g/cncf-buildpacks</mailingListUrl>\n    <bugTrackerUrl>https://github.com/buildpacks/pack/issues</bugTrackerUrl>\n    <tags>pack cloud-native-buildpacks cncf</tags>\n    <summary>pack is a CLI for building apps using Cloud Native Buildpacks.</summary>\n    <description>\nThis package installs/upgrades the pack - Buildpacks CLI release.\n\n[![Slack](https://slack.buildpacks.io/badge.svg)](https://slack.buildpacks.io/)\n\n`pack` makes it easy for...\n- [**App Developers**](https://buildpacks.io/docs/app-developer-guide/) to use buildpacks to convert code into runnable images.\n- [**Buildpack Authors**](https://buildpacks.io/docs/buildpack-author-guide/) to develop and package buildpacks for distribution.\n- [**Operators**](https://buildpacks.io/docs/operator-guide/) to package buildpacks for distribution and maintain applications.\n\n## 
Usage\n\n![Usage](https://github.com/buildpacks/pack/raw/main/resources/pack-build.gif)\n\n## Getting Started\n\nGet started by running through our tutorial: [An App’s Brief Journey from Source to Image](https://buildpacks.io/docs/app-journey)\n\n## Specifications\n`pack` is a CLI implementation of the [Platform Interface Specification][platform-spec] for [Cloud Native Buildpacks](https://buildpacks.io/).\n\n[platform-spec]: https://github.com/buildpacks/spec/blob/main/platform.md\n    </description>\n  </metadata>\n  <files>\n    <file src=\"tools\\**\" target=\"tools\" />\n  </files>\n</package>\n"
  },
  {
    "path": ".github/workflows/delivery/chocolatey/tools/VERIFICATION.txt",
    "content": "\nVERIFICATION\nVerification is intended to assist the Chocolatey moderators and community\nin verifying that this package's contents are trustworthy.\n\nThis release is published by the Cloud Native Buildpacks project, creators of the Pack CLI.\n"
  },
  {
    "path": ".github/workflows/delivery/homebrew/pack.rb",
    "content": "###\n# This file is autogenerated from https://github.com/buildpacks/pack/tree/main/.github/workflows/delivery/homebrew/\n# Changes should be committed there. \n###\nclass Pack < Formula\n  desc \"A CLI for building apps using Cloud Native Buildpacks\"\n  homepage \"https://github.com/buildpacks/pack\"\n  version \"{{PACK_VERSION}}\"\n  version_scheme 1\n\n  if OS.mac? && Hardware::CPU.arm?\n    url \"{{MACOS_ARM64_URL}}\"\n    sha256 \"{{MACOS_ARM64_SHA}}\"\n  elsif OS.mac?\n    url \"{{MACOS_URL}}\"\n    sha256 \"{{MACOS_SHA}}\"\n  elsif OS.linux? && Hardware::CPU.arm?\n    url \"{{LINUX_ARM64_URL}}\"\n    sha256 \"{{LINUX_ARM64_SHA}}\"\n  else\n    url \"{{LINUX_URL}}\"\n    sha256 \"{{LINUX_SHA}}\"\n  end\n\n  def install\n    bin.install \"pack\"\n  end\nend\n"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/1_dependencies.sh",
    "content": "function dependencies() {\n    : \"$GO_DEP_PACKAGE_NAME\"\n\n    echo \"> Installing dev tools...\"\n    apt-get update\n    apt-get install gnupg debhelper dput dh-make devscripts lintian software-properties-common -y\n\n    echo \"> Installing git...\"\n    apt-get install git -y\n\n    echo \"> Installing go...\"\n    add-apt-repository ppa:longsleep/golang-backports -y\n    apt-get update\n    apt-get install $GO_DEP_PACKAGE_NAME -y\n}\n"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/2_create-ppa.sh",
    "content": "function create_ppa() {\n  # verify the following are set.\n  : \"$GPG_PUBLIC_KEY\"\n  : \"$GPG_PRIVATE_KEY\"\n  : \"$PACKAGE_VERSION\"\n  : \"$PACKAGE_NAME\"\n  : \"$MAINTAINER_NAME\"\n  : \"$MAINTAINER_EMAIL\"\n  : \"$SCRIPT_DIR\"\n\n  echo \"> Importing GPG keys...\"\n  gpg --import <(echo \"$GPG_PUBLIC_KEY\")\n  gpg --allow-secret-key-import --import <(echo \"$GPG_PRIVATE_KEY\")\n\n  # Dependencies fail to be pulled in during the Launchpad build process.\n  echo \"> Vendoring dependencies...\"\n  go mod vendor\n\n  echo \"> Creating package: ${PACKAGE_NAME}_${PACKAGE_VERSION}\"\n  echo \"> Generating skeleton of a debian package...\"\n  export DEBEMAIL=$MAINTAINER_EMAIL\n  export DEBFULLNAME=$MAINTAINER_NAME\n  dh_make -p \"${PACKAGE_NAME}_${PACKAGE_VERSION}\" --single --native --copyright apache --email \"${MAINTAINER_EMAIL}\" -y\n\n  echo \"> Copying templated configuration files...\"\n  cp \"$SCRIPT_DIR/debian/\"* debian/\n\n  echo \"=======\"\n  echo \"compat\"\n  echo \"=======\"\n  cat debian/compat\n  echo\n  echo \"=======\"\n  echo \"changelog\"\n  echo \"=======\"\n  cat debian/changelog\n  echo\n  echo \"=======\"\n  echo \"control\"\n  echo \"=======\"\n  cat debian/control\n  echo\n  echo \"=======\"\n  echo \"rules\"\n  echo \"=======\"\n  cat debian/rules\n  echo\n  echo \"=======\"\n  echo \"copyright\"\n  echo \"=======\"\n  cat debian/copyright\n  echo\n\n  echo \"> Removing empty default files created by dh_make...\"\n  rm -f debian/*.ex\n  rm -f debian/*.EX\n  rm -f debian/README.*\n\n  # Ubuntu ONLY accepts source packages.\n  echo \"> Build a source based debian package...\"\n  debuild -S\n\n  # debuild places everything in parent directory\n  echo \"> Files created in: ${PWD}/..\"\n  ls -al ${PWD}/..\n}\n"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/3_test-ppa.sh",
    "content": "function test_ppa {\n    : \"$GITHUB_WORKSPACE\"\n\n    echo \"> Creating a test directory...\"\n    testdir=\"$(mktemp -d)\"\n\n    echo \"> Source Dir: '$GITHUB_WORKSPACE'\"\n    echo \"> Test Dir: '$testdir'\"\n    cp -R $GITHUB_WORKSPACE/* $testdir\n\n    pushd $testdir\n        echo \"> Building a debian binary package...\"\n        debuild -b -us -uc\n\n        echo \"> Installing binary package...\"\n        dpkg -i ../*.deb\n\n        echo \"> Contents installed by the build debain package:\"\n        dpkg -L pack-cli\n    popd\n}\n"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/4_upload-ppa.sh",
    "content": "function upload_ppa {\n    echo \"> Uploading PPA...\"\n    dput \"ppa:cncf-buildpacks/pack-cli\" ./../*.changes\n}"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/debian/README",
    "content": "The Debian Package {{PACKAGE_NAME}}\n----------------------------\n\nCLI for building apps using Cloud Native Buildpacks.\n\nFor extensive documentation see: https://buildpacks.io/docs/tools/pack/cli/pack/\n\nPlease file issues and bugs at https://github.com/buildpacks/pack/\n\n -- {{MAINTAINER_NAME}} <{{MAINTAINER_EMAIL}}>  {{DATE_TIME}}"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/debian/changelog",
    "content": "{{PACKAGE_NAME}} ({{PACK_VERSION}}-0ubuntu1~{{UBUNTU_VERSION}}) {{UBUNTU_VERSION}}; urgency=medium\n\n  * For complete changelog see https://github.com/buildpacks/pack/releases/tag/v{{PACK_VERSION}}.\n\n -- {{MAINTAINER_NAME}} <{{MAINTAINER_EMAIL}}>  {{DATE_TIME}}\n"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/debian/compat",
    "content": "11"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/debian/control",
    "content": "Source: {{PACKAGE_NAME}}\nSection: utils\nPriority: optional\nMaintainer: {{MAINTAINER_NAME}} <{{MAINTAINER_EMAIL}}>\nBuild-Depends: debhelper (>= 11), git, {{GO_DEP_ENTRY}}\nStandards-Version: 3.9.8\nVcs-Git: git@github.com/{{REPO}}.git\nVcs-Browser: https://github.com/{{REPO}}\nHomepage: {{HOMEPAGE}}\n\nPackage: {{PACKAGE_NAME}}\nArchitecture: {{ARCH}}\nDescription: {{DESCRIPTION}}\n"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/debian/copyright",
    "content": "Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/\nUpstream-Name: {{PACKAGE_NAME}}\n\nFiles: *\nLicense: Apache-2.0\n\nFiles: debian/*\nCopyright: 2020 {{MAINTAINER_NAME}} <{{MAINTAINER_EMAIL}}>\nLicense: Apache-2.0\n\nLicense: Apache-2.0\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n .\n https://www.apache.org/licenses/LICENSE-2.0\n .\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.\n .\n On Debian systems, the complete text of the Apache version 2.0 license\n can be found in \"/usr/share/common-licenses/Apache-2.0\".\n"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/debian/rules",
    "content": "#!/usr/bin/make -f\n# See debhelper(7).\n# DH_VERBOSE outputs every command that modifies files on the build system.\nexport DH_VERBOSE = 1\n\n%:\n\tdh $@\n\noverride_dh_auto_test:\n\noverride_dh_auto_build:\n\tmkdir -p /tmp/.cache/go-build\n\tGOCACHE=/tmp/.cache/go-build GOFLAGS=\"-mod=vendor\" LDFLAGS=\"\" PACK_VERSION='{{PACK_VERSION}}' PATH=\"${PATH}:{{GO_DEP_LIB_PATH}}/bin\" dh_auto_build -- build\n\trm -r /tmp/.cache/go-build\n"
  },
  {
    "path": ".github/workflows/delivery/ubuntu/deliver.sh",
    "content": "#!/usr/bin/env bash\n\nset -e\nset -o pipefail\n\nreadonly SCRIPT_DIR=\"$(cd \"$(dirname \"${BASH_SOURCE[0]}\")\" && pwd)\"\n\necho \"PWD=${PWD}\"\necho \"SCRIPT_DIR=${SCRIPT_DIR}\"\n\nsource \"$SCRIPT_DIR/1_dependencies.sh\"\nsource \"$SCRIPT_DIR/2_create-ppa.sh\"\nsource \"$SCRIPT_DIR/3_test-ppa.sh\"\nsource \"$SCRIPT_DIR/4_upload-ppa.sh\"\n\necho\necho \"++++++++++++++++++++++++++++\"\necho \"> Installing dependencies...\"\necho \"++++++++++++++++++++++++++++\"\necho\ndependencies\n\necho\necho \"++++++++++++++++++++++++++++\"\necho \"> Creating PPA...\"\necho \"++++++++++++++++++++++++++++\"\necho\ncreate_ppa\n\necho\necho \"++++++++++++++++++++++++++++\"\necho \"> Testing PPA...\"\necho \"++++++++++++++++++++++++++++\"\necho\ntest_ppa\n\necho\necho \"++++++++++++++++++++++++++++\"\necho \"> Uploading PPA...\"\necho \"++++++++++++++++++++++++++++\"\necho\nupload_ppa\n"
  },
  {
    "path": ".github/workflows/delivery-archlinux-git.yml",
    "content": "name: delivery / archlinux / git\n\non:\n#  push:\n#    branches:\n#      - main\n  workflow_dispatch:\n\njobs:\n  pack-cli-git:\n    runs-on: ubuntu-latest\n    env:\n      PACKAGE_NAME: pack-cli-git\n    steps:\n      - uses: actions/checkout@v4\n        with:\n          fetch-depth: 0\n      - name: Setup working dir\n        run: |\n          mkdir -p ${{ env.PACKAGE_NAME }}\n          cp .github/workflows/delivery/archlinux/${{ env.PACKAGE_NAME }}/PKGBUILD ${{ env.PACKAGE_NAME }}/PKGBUILD\n      - name: Metadata\n        id: metadata\n        run: |\n          git_description=$(git describe --tags --long)\n          version=$(echo \"${git_description}\" | awk -F- '{print $(1)}' | sed 's/^v//')\n          revision=$(echo \"${git_description}\" | awk -F- '{print $(NF-1)}')\n          commit=$(echo \"${git_description}\" | awk -F- '{print $(NF)}'  | sed 's/^g//')\n          echo \"version=$version\" >> $GITHUB_OUTPUT\n          echo \"revision=$revision\" >> $GITHUB_OUTPUT\n          echo \"commit=$commit\" >> $GITHUB_OUTPUT\n      - name: Fill PKGBUILD\n        uses: cschleiden/replace-tokens@v1\n        with:\n          files: ${{ env.PACKAGE_NAME }}/PKGBUILD\n          tokenPrefix: '{{'\n          tokenSuffix: '}}'\n        env:\n          PACK_VERSION: ${{ steps.metadata.outputs.version }}\n          GIT_REVISION: ${{ steps.metadata.outputs.revision }}\n          GIT_COMMIT: ${{ steps.metadata.outputs.commit }}\n      - name: Print PKGBUILD\n        run: cat ${{ env.PACKAGE_NAME }}/PKGBUILD\n      - name: Test\n        uses: docker://archlinux:latest\n        with:\n          entrypoint: .github/workflows/delivery/archlinux/test-install-package.sh\n      - name: Publish\n        uses: docker://archlinux:latest\n        env:\n          PACK_VERSION: ${{ steps.metadata.outputs.version }}\n          AUR_KEY: ${{ secrets.AUR_KEY }}\n        with:\n          entrypoint: .github/workflows/delivery/archlinux/publish-package.sh\n"
  },
  {
    "path": ".github/workflows/delivery-archlinux.yml",
    "content": "name: delivery / archlinux\n\non:\n  release:\n    types:\n      - released\n  workflow_dispatch:\n    inputs:\n      tag_name:\n        description: The release tag to distribute\n        required: true\n\njobs:\n  pack-cli-bin:\n    runs-on: ubuntu-latest\n    env:\n      PACKAGE_NAME: pack-cli-bin\n    steps:\n      - uses: actions/checkout@v4\n      - name: Determine version\n        uses: actions/github-script@v7\n        id: version\n        with:\n          result-encoding: string\n          script: |\n            let payload = context.payload;\n            let tag = (payload.release && payload.release.tag_name) || (payload.inputs && payload.inputs.tag_name);\n            if (!tag) {\n              throw \"ERROR: unable to determine tag\"\n            }\n            return tag.replace(/^v/, '');\n      - name: Set PACK_VERSION\n        run: echo \"PACK_VERSION=${{ steps.version.outputs.result }}\" >> $GITHUB_ENV\n        shell: bash\n      - name: Setup working dir\n        run: |\n          mkdir -p ${{ env.PACKAGE_NAME }}/\n          cp .github/workflows/delivery/archlinux/${{ env.PACKAGE_NAME }}/PKGBUILD ${{ env.PACKAGE_NAME }}/PKGBUILD\n      - name: Lookup assets\n        uses: actions/github-script@v7\n        id: assets\n        with:\n          script: |\n            let tag_name = \"v${{ env.PACK_VERSION }}\";\n            var release = context.payload.release || await github.rest.repos.listReleases(context.repo)\n                  .then(result => result.data.find(r => r.tag_name === tag_name))\n                  .catch(err => {throw \"ERROR: \" + err.message});\n\n            if (!release) {\n              throw \"no release found with tag: \" + tag_name;\n            }\n\n            let asset = release.assets.find(a => a.name.endsWith(\"linux.tgz\"));\n            if (!asset) {\n              throw \"ERROR: Failed to find linux asset!\";\n            }\n\n            core.setOutput(\"linux_name\", asset.name);\n            core.setOutput(\"linux_url\", asset.browser_download_url);\n      - name: Metadata\n        id: metadata\n        run: |\n          curl -sSL ${{ steps.assets.outputs.linux_url }} -o ${{ steps.assets.outputs.linux_name }}\n          sha512=$(sha512sum ${{ steps.assets.outputs.linux_name }} | cut -d ' ' -f1)\n          echo \"url=${{ steps.assets.outputs.linux_url }}\" >> $GITHUB_OUTPUT\n          echo \"sha512=$sha512\" >> $GITHUB_OUTPUT\n      - name: Fill PKGBUILD\n        uses: cschleiden/replace-tokens@v1\n        with:\n          files: ${{ env.PACKAGE_NAME }}/PKGBUILD\n          tokenPrefix: '{{'\n          tokenSuffix: '}}'\n        env:\n          PACK_VERSION: ${{ env.PACK_VERSION }}\n          BIN_TGZ_URL: ${{ steps.metadata.outputs.url }}\n          BIN_TGZ_SHA: ${{ steps.metadata.outputs.sha512 }}\n      - name: Print PKGBUILD\n        run: cat ${{ env.PACKAGE_NAME }}/PKGBUILD\n      - name: Test\n        uses: docker://archlinux:latest\n        with:\n          entrypoint: .github/workflows/delivery/archlinux/test-install-package.sh\n      - name: Publish\n        uses: docker://archlinux:latest\n        env:\n          AUR_KEY: ${{ secrets.AUR_KEY }}\n        with:\n          entrypoint: .github/workflows/delivery/archlinux/publish-package.sh\n"
  },
  {
    "path": ".github/workflows/delivery-chocolatey.yml",
    "content": "name: delivery / chocolatey\n\non:\n  release:\n    types:\n      - released\n  workflow_dispatch:\n    inputs:\n      tag_name:\n        description: The release tag to distribute\n        required: true\n\nenv:\n  CHOCO_PATH: chocolatey\n\ndefaults:\n  run:\n    shell: bash\n\njobs:\n  deliver-chocolatey:\n    runs-on: windows-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Determine version\n        uses: actions/github-script@v7\n        id: version\n        with:\n          result-encoding: string\n          script: |\n            let tag = (context.payload.release && context.payload.release.tag_name)\n              || (context.payload.inputs && context.payload.inputs.tag_name);\n\n            if (!tag) {\n              throw \"ERROR: unable to determine tag\";\n            }\n\n            updatedTag = tag.replace(/^v/, '')\n            core.exportVariable('PACK_VERSION', updatedTag);\n            return updatedTag;\n      - name: Setup working dir\n        run: |\n          mkdir -p ${{ env.CHOCO_PATH }}/source\n          cp -r .github/workflows/delivery/chocolatey/. ${{ env.CHOCO_PATH }}/\n          ls -R ${{ env.CHOCO_PATH }}\n      - name: Download and unzip Pack (Windows)\n        run: |\n          url=\"https://github.com/buildpacks/pack/releases/download/v${{ env.PACK_VERSION }}/pack-v${{ env.PACK_VERSION }}-windows.zip\"\n          filename=pack.zip\n          tools_path=${{ env.CHOCO_PATH }}/tools/\n          zip_path=\"$tools_path/$filename\"\n\n          curl -sSL \"$url\" -o \"$zip_path\"\n\n          apt-get update && apt-get install unzip\n          unzip -o \"$zip_path\" -d $tools_path\n          rm \"$zip_path\"\n          ls $tools_path\n      - name: Fill nuspec\n        run: |\n          file=${{ env.CHOCO_PATH }}/pack.nuspec\n          sed -i \"s/{{PACK_VERSION}}/${{ env.PACK_VERSION }}/g\" $file\n          cat $file\n      - name: Fill License\n        run: |\n          file=\"${{ env.CHOCO_PATH }}/tools/LICENSE.txt\"\n          cp LICENSE $file\n          cat $file\n      - name: build-release\n        uses: crazy-max/ghaction-chocolatey@v3\n        with:\n          args: pack ${{ env.CHOCO_PATH }}/pack.nuspec --outputdirectory ${{ env.CHOCO_PATH }}/source\n      - name: list files\n        run: |\n          ls ${{ env.CHOCO_PATH }}\n          ls ${{ env.CHOCO_PATH }}/tools\n      - name: Test Release\n        uses: crazy-max/ghaction-chocolatey@v3\n        with:\n          args: install pack -s ${{ env.CHOCO_PATH }}/source\n      - name: Ensure Pack Installed\n        run: pack help\n      - name: Upload Release\n        uses: crazy-max/ghaction-chocolatey@v3\n        with:\n          args: push ${{ env.CHOCO_PATH }}/source/pack.${{ env.PACK_VERSION }}.nupkg -s https://push.chocolatey.org/ -k ${{ secrets.CHOCO_KEY }}\n"
  },
  {
    "path": ".github/workflows/delivery-docker.yml",
    "content": "name: delivery / docker\n\non:\n  release:\n    types:\n      - released\n  workflow_dispatch:\n    inputs:\n      tag_name:\n        description: The release tag to distribute\n        required: true\n      tag_latest:\n        description: Tag as latest\n        required: false\n        type: boolean\n        default: false\n\nenv:\n  REGISTRY_NAME: 'docker.io'\n  USER_NAME: 'buildpacksio'\n  IMG_NAME: 'pack'\n\njobs:\n  deliver-docker:\n    strategy:\n      matrix:\n        config: [tiny, base]\n        include:\n          - config: tiny\n            base_image: gcr.io/distroless/static\n            suffix:\n          - config: base\n            base_image: ubuntu:jammy\n            suffix: -base\n    runs-on: ubuntu-latest\n    steps:\n      - name: Determine version\n        uses: actions/github-script@v7\n        id: version\n        with:\n          result-encoding: string\n          script: |\n            let tag = (context.payload.release && context.payload.release.tag_name)\n              || (context.payload.inputs && context.payload.inputs.tag_name);\n\n            if (!tag) {\n              throw \"ERROR: unable to determine tag\";\n            }\n\n            return tag.replace(/^v/, '');\n      - name: Checkout source at tag\n        uses: actions/checkout@v4\n        with:\n          ref: v${{ steps.version.outputs.result }}\n      - name: Determine App Name\n        run: 'echo \"IMG_NAME=${{ env.REGISTRY_NAME }}/${{ env.USER_NAME }}/${{ env.IMG_NAME }}\" >> $GITHUB_ENV'\n      - name: Login to Dockerhub\n        uses: docker/login-action@v3\n        with:\n          username: ${{ secrets.DOCKER_USERNAME }}\n          password: ${{ secrets.DOCKER_PASSWORD }}\n      - uses: docker/setup-qemu-action@v3\n      - uses: docker/setup-buildx-action@v3\n      - uses: buildpacks/github-actions/setup-tools@v5.9.4\n      - name: Buildx Build/Publish\n        run: |\n          docker buildx build . \\\n            --tag ${{ env.IMG_NAME }}:${{ steps.version.outputs.result }}${{ matrix.suffix }} \\\n            --platform linux/amd64,linux/arm64,linux/s390x,linux/ppc64le \\\n            --build-arg pack_version=${{ steps.version.outputs.result }} \\\n            --build-arg base_image=${{ matrix.base_image }} \\\n            --provenance=false \\\n            --push\n      - name: Tag Image as Base\n        if: ${{ (github.event.release != '' || github.event.inputs.tag_latest) && matrix.config == 'base' }}\n        run: |\n          crane copy ${{ env.IMG_NAME }}:${{ steps.version.outputs.result }}${{ matrix.suffix }} ${{ env.IMG_NAME }}:base\n      - name: Tag Image as Latest\n        if: ${{ (github.event.release != '' || github.event.inputs.tag_latest) && matrix.config != 'base' }}\n        run: |\n          crane copy ${{ env.IMG_NAME }}:${{ steps.version.outputs.result }}${{ matrix.suffix }} ${{ env.IMG_NAME }}:latest\n"
  },
  {
    "path": ".github/workflows/delivery-homebrew.yml",
    "content": "name: delivery / homebrew\n\non:\n  release:\n    types:\n      - released\n  workflow_dispatch:\n    inputs:\n      tag_name:\n        description: The release tag to distribute\n        required: true\n\njobs:\n  update-tap:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - name: Checkout tap\n        uses: actions/checkout@v4\n        with:\n          repository: buildpack/homebrew-tap\n          path: homebrew-tap\n          token: ${{ secrets.PLATFORM_GITHUB_TOKEN }}\n      - name: Copy pack.rb\n        run: cp .github/workflows/delivery/homebrew/pack.rb homebrew-tap/Formula/pack.rb\n      - name: Lookup assets\n        uses: actions/github-script@v7\n        id: assets\n        with:\n          script: |\n            let payload = context.payload;\n            let tag_name = (payload.release && payload.release.tag_name) || (payload.inputs && payload.inputs.tag_name);\n            if (!tag_name) {\n              throw \"ERROR: unable to determine tag\"\n            }\n\n            core.setOutput(\"pack_version\", tag_name.replace(/^v/, ''));\n\n            var release = payload.release || await github.rest.repos.listReleases(context.repo)\n                  .then(result => result.data.find(r => r.tag_name === tag_name))\n                  .catch(err => {throw \"ERROR: \" + err.message});\n\n            if (!release) {\n              throw \"no release found with tag: \" + tag_name;\n            }\n\n            release.assets.forEach(asset => {\n              if (asset.name.endsWith(\"linux.tgz\")) {\n                core.setOutput(\"linux_name\", asset.name);\n                core.setOutput(\"linux_url\", asset.browser_download_url);\n              }\n\n              if (asset.name.endsWith(\"linux-arm64.tgz\")) {\n                core.setOutput(\"linux_arm64_name\", asset.name);\n                core.setOutput(\"linux_arm64_url\", asset.browser_download_url);\n              }\n\n              if (asset.name.endsWith(\"macos.tgz\")) {\n                core.setOutput(\"macos_name\", asset.name);\n                core.setOutput(\"macos_url\", asset.browser_download_url);\n              }\n\n              if (asset.name.endsWith(\"macos-arm64.tgz\")) {\n                core.setOutput(\"macos_arm64_name\", asset.name);\n                core.setOutput(\"macos_arm64_url\", asset.browser_download_url);\n              }\n            });\n      - name: Generate asset checksums\n        id: checksums\n        run: |\n          curl -sSL ${{ steps.assets.outputs.linux_url }} -o ${{ steps.assets.outputs.linux_name }}\n          linux_sha256=$(sha256sum ${{ steps.assets.outputs.linux_name }} | cut -d ' ' -f1)\n          echo \"linux_sha256=$linux_sha256\" >> $GITHUB_OUTPUT\n\n          curl -sSL ${{ steps.assets.outputs.linux_arm64_url }} -o ${{ steps.assets.outputs.linux_arm64_name }}\n          linux_arm64_sha256=$(sha256sum ${{ steps.assets.outputs.linux_arm64_name }} | cut -d ' ' -f1)\n          echo \"linux_arm64_sha256=$linux_arm64_sha256\" >> $GITHUB_OUTPUT\n\n          curl -sSL ${{ steps.assets.outputs.macos_url }} -o ${{ steps.assets.outputs.macos_name }}\n          macos_sha256=$(sha256sum ${{ steps.assets.outputs.macos_name }} | cut -d ' ' -f1)\n          echo \"macos_sha256=$macos_sha256\" >> $GITHUB_OUTPUT\n\n          curl -sSL ${{ steps.assets.outputs.macos_arm64_url }} -o ${{ steps.assets.outputs.macos_arm64_name }}\n          macos_arm64_sha256=$(sha256sum ${{ steps.assets.outputs.macos_arm64_name }} | cut -d ' ' -f1)\n          echo \"macos_arm64_sha256=$macos_arm64_sha256\" >> $GITHUB_OUTPUT\n      - name: Fill pack.rb\n        uses: cschleiden/replace-tokens@v1\n        with:\n          files: homebrew-tap/Formula/pack.rb\n          tokenPrefix: '{{'\n          tokenSuffix: '}}'\n        env:\n          LINUX_URL: ${{ steps.assets.outputs.linux_url }}\n          LINUX_SHA: ${{ steps.checksums.outputs.linux_sha256 }}\n          LINUX_ARM64_URL: ${{ steps.assets.outputs.linux_arm64_url }}\n          LINUX_ARM64_SHA: ${{ steps.checksums.outputs.linux_arm64_sha256 }}\n          MACOS_URL: ${{ steps.assets.outputs.macos_url }}\n          MACOS_SHA: ${{ steps.checksums.outputs.macos_sha256 }}\n          MACOS_ARM64_URL: ${{ steps.assets.outputs.macos_arm64_url }}\n          MACOS_ARM64_SHA: ${{ steps.checksums.outputs.macos_arm64_sha256 }}\n          PACK_VERSION: ${{ steps.assets.outputs.pack_version }}\n      - run: cat homebrew-tap/Formula/pack.rb\n      - name: Commit changes\n        run: |\n          git config --global user.name \"buildpack-bot\"\n          git config --global user.email \"cncf-buildpacks-maintainers@lists.cncf.io\"\n\n          cd homebrew-tap\n          git add Formula/pack.rb\n          git commit -m \"Version ${{ github.event.release.tag_name }}\"\n          git push"
  },
  {
    "path": ".github/workflows/delivery-release-dispatch.yml",
    "content": "name: delivery / release-dispatch\n\non:\n  release:\n    types:\n      - released\n\njobs:\n  send-release-dispatch:\n    runs-on: ubuntu-latest\n    strategy:\n      matrix:\n        repo: ['buildpacks/docs', 'buildpacks/samples', 'buildpacks/pack-orb', 'buildpacks/github-actions']\n    steps:\n      - name: Repository Dispatch\n        uses: peter-evans/repository-dispatch@v3\n        with:\n          token: ${{ secrets.PLATFORM_GITHUB_TOKEN }}\n          event-type: pack-release\n          repository: ${{ matrix.repo }}\n"
  },
  {
    "path": ".github/workflows/delivery-ubuntu.yml",
    "content": "name: delivery / ubuntu\n\non:\n  release:\n    types:\n      - released\n  workflow_dispatch:\n    inputs:\n      tag_name:\n        description: The release tag to distribute\n        required: true\n\nenv:\n  MAINTAINER_NAME: \"cncf-buildpacks\"\n  MAINTAINER_EMAIL: \"cncf-buildpacks-maintainers@lists.cncf.io\"\n  PACKAGE_NAME: \"pack-cli\"\n\njobs:\n    deliver-ppa:\n      strategy:\n        fail-fast: false\n        matrix:\n          target: [focal, jammy, noble, plucky]\n      runs-on: ubuntu-22.04\n      steps:\n          - name: Checkout code\n            uses: actions/checkout@v4\n\n          - name: Metadata\n            id: metadata\n            run: |\n                echo \"date=$(date +\"%a, %d %b %Y %T %z\")\" >> $GITHUB_OUTPUT\n\n          - name: Determine version\n            uses: actions/github-script@v7\n            id: version\n            with:\n              result-encoding: string\n              script: |\n                let payload = context.payload;\n                let tag = (payload.release && payload.release.tag_name) || (payload.inputs && payload.inputs.tag_name);\n                if (!tag) {\n                  throw \"ERROR: unable to determine tag\"\n                }\n                return tag.replace(/^v/, '');\n\n          - name: Fill debian/*\n            uses: cschleiden/replace-tokens@v1\n            with:\n              files: '[\".github/workflows/delivery/ubuntu/debian/*\"]'\n              tokenPrefix: '{{'\n              tokenSuffix: '}}'\n            env:\n              ARCH: \"any\"\n              DATE_TIME: ${{ steps.metadata.outputs.date }}\n              DESCRIPTION: \"CLI for building apps using Cloud Native Buildpacks\"\n              GO_DEP_ENTRY: golang (>=1.24)\n              GO_DEP_LIB_PATH: /usr/lib/go\n              GO_DEP_PACKAGE_NAME: golang\n              HOMEPAGE: \"https://buildpacks.io\"\n              PACK_VERSION: ${{ steps.version.outputs.result }}\n              REPO: \"buildpacks/pack\"\n              UBUNTU_VERSION: ${{ matrix.target }}\n\n            ###\n            # NOTE: 'uses' does not support interpolation so we have to manually define the\n            # following steps per variant.\n            ###\n\n\n          - name: Deliver focal\n            if: matrix.target == 'focal'\n            uses: docker://ubuntu:focal\n            with:\n              entrypoint: .github/workflows/delivery/ubuntu/deliver.sh\n            env:\n              DEBIAN_FRONTEND: \"noninteractive\"\n              GO_DEP_PACKAGE_NAME: golang\n              GPG_PRIVATE_KEY: ${{ secrets.GPG_PRIVATE_KEY }}\n              GPG_PUBLIC_KEY: ${{ secrets.GPG_PUBLIC_KEY }}\n              PACKAGE_VERSION: ${{ steps.version.outputs.result }}\n\n          - name: Deliver jammy\n            if: matrix.target == 'jammy'\n            uses: docker://ubuntu:jammy\n            with:\n              entrypoint: .github/workflows/delivery/ubuntu/deliver.sh\n            env:\n              DEBIAN_FRONTEND: \"noninteractive\"\n              GO_DEP_PACKAGE_NAME: golang\n              GPG_PRIVATE_KEY: ${{ secrets.GPG_PRIVATE_KEY }}\n              GPG_PUBLIC_KEY: ${{ secrets.GPG_PUBLIC_KEY }}\n              PACKAGE_VERSION: ${{ steps.version.outputs.result }}\n\n          - name: Deliver noble\n            if: matrix.target == 'noble'\n            uses: docker://ubuntu:noble\n            with:\n              entrypoint: .github/workflows/delivery/ubuntu/deliver.sh\n            env:\n              DEBIAN_FRONTEND: \"noninteractive\"\n              GO_DEP_PACKAGE_NAME: golang\n              GPG_PRIVATE_KEY: ${{ secrets.GPG_PRIVATE_KEY }}\n              GPG_PUBLIC_KEY: ${{ secrets.GPG_PUBLIC_KEY }}\n              PACKAGE_VERSION: ${{ steps.version.outputs.result }}\n\n          - name: Deliver plucky\n            if: matrix.target == 'plucky'\n            uses: docker://ubuntu:plucky\n            with:\n              entrypoint: .github/workflows/delivery/ubuntu/deliver.sh\n            env:\n              DEBIAN_FRONTEND: \"noninteractive\"\n              GO_DEP_PACKAGE_NAME: golang\n              GPG_PRIVATE_KEY: ${{ secrets.GPG_PRIVATE_KEY }}\n              GPG_PUBLIC_KEY: ${{ secrets.GPG_PUBLIC_KEY }}\n              PACKAGE_VERSION: ${{ steps.version.outputs.result }}\n"
  },
  {
    "path": ".github/workflows/privileged-pr-process.yml",
    "content": "name: \"Process Pull Request\"\n# NOTE: This workflow is privileged, and shouldn't be used to checkout the repo or run any of the PR code\n# See more in: https://securitylab.github.com/research/github-actions-preventing-pwn-requests\non:\n  - pull_request_target\n\njobs:\n  label:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/labeler@v4\n        with:\n          repo-token: \"${{ secrets.GITHUB_TOKEN }}\"\n  add-milestone:\n    runs-on: ubuntu-latest\n    steps:\n      - name: Add milestone\n        uses: actions/github-script@v7\n        id: assets\n        with:\n          script: |\n            var milestone = await github.rest.issues.listMilestones({\n                      owner: context.repo.owner,\n                      repo: context.repo.repo,\n                      state: 'open',\n                      sort: 'due_on',\n                      direction: 'asc'\n                   })\n                  .then(result => result.data[0])\n                  .catch(err => {throw \"ERROR: \" + err.message});\n\n            if (milestone) {\n                await github.rest.issues.update({\n                  owner: context.repo.owner,\n                  repo: context.repo.repo,\n                  issue_number: context.payload.number,\n                  milestone: milestone.number,\n                  });\n                core.setOutput('milestone', milestone.number);\n              } else {\n              console.log(`No milestone found. Please create one first.`);\n            }\n"
  },
  {
    "path": ".github/workflows/release-merge.yml",
    "content": "# Merges changes to release branches back into main.\nname: release-merge\n\non:\n  push:\n    branches:\n      - 'release/**'\n\njobs:\n  merge:\n    runs-on: ubuntu-latest\n    steps:\n      - name: Merge\n        run: |\n          echo '> Configuring ssh...'\n          SSH_HOME=\"${HOME}/.ssh\"\n          mkdir -p \"${SSH_HOME}\"\n          chmod 700 \"${SSH_HOME}\"\n\n          echo '> Starting ssh-agent...'\n          eval $(ssh-agent)\n\n          echo '> Add Github to known_hosts...'\n          ssh-keyscan -H github.com >> \"${SSH_HOME}/known_hosts\"\n          chmod 644 \"${SSH_HOME}/known_hosts\"\n\n          echo '> Adding DEPLOY_KEY...'\n          ssh-add - <<< \"${{ secrets.DEPLOY_KEY }}\"\n\n          echo '> Attempting merge...'\n          ls -al\n          git config --global user.name \"github-bot\"\n          git config --global user.email \"action@github.com\"\n          git clone git@github.com:${GITHUB_REPOSITORY}.git\n          cd pack\n          git checkout main\n          git merge origin/${GITHUB_REF#refs/heads/} --no-edit\n          git show -1\n          git push"
  },
  {
    "path": ".github/workflows/scripts/generate-release-event.sh",
    "content": "#!/usr/bin/env bash\nset -e\n\n: ${GITHUB_TOKEN?\"Need to set GITHUB_TOKEN env var.\"}\n\nusage() {\n  echo \"Usage: \"\n  echo \"  $0 <owner>/<repo> <version>\"\n  echo \"    <owner>/<repo> github repository\"\n  echo \"    <version>  version of release to generate\"\n  exit 1; \n}\n\nmustHaveExec() {\n  local bin=$1\n  local e=$(command -v \"$bin\")\n  if [[ -z \"$e\" ]]; then\n    echo \"Need '$bin' to be available\"\n    exit 1\n  fi\n}\n\nGITHUB_REPO=\"${1}\"\nif [[ -z \"${GITHUB_REPO}\" ]]; then\n  echo \"Must specify GitHub repository\"\n  echo\n  usage\n  exit 1\nfi\n\nVERSION=\"${2}\"\nif [[ -z \"${VERSION}\" ]]; then\n  echo \"Must specify a version\"\n  echo\n  usage\n  exit 1\nfi\n\nmustHaveExec curl\nmustHaveExec jq\n\nrelease=$(curl -sSL -H \"Authorization: token ${GITHUB_TOKEN}\" \"https://api.github.com/repos/${GITHUB_REPO}/releases/tags/v${VERSION}\")\nrepository=$(curl -sSL -H \"Authorization: token ${GITHUB_TOKEN}\" \"https://api.github.com/repos/${GITHUB_REPO}\")\n\ntmpEventFile=$(mktemp)\n\ncat <<EOF > \"$tmpEventFile\"\n{\n  \"action\": \"published\",\n  \"release\": ${release},\n  \"repository\": ${repository},\n  \"sender\": {\n    \"login\": \"Codertocat\",\n    \"id\": 21031067,\n    \"node_id\": \"MDQ6VXNlcjIxMDMxMDY3\",\n    \"avatar_url\": \"https://avatars1.githubusercontent.com/u/21031067?v=4\",\n    \"gravatar_id\": \"\",\n    \"url\": \"https://api.github.com/users/Codertocat\",\n    \"html_url\": \"https://github.com/Codertocat\",\n    \"followers_url\": \"https://api.github.com/users/Codertocat/followers\",\n    \"following_url\": \"https://api.github.com/users/Codertocat/following{/other_user}\",\n    \"gists_url\": \"https://api.github.com/users/Codertocat/gists{/gist_id}\",\n    \"starred_url\": \"https://api.github.com/users/Codertocat/starred{/owner}{/repo}\",\n    \"subscriptions_url\": \"https://api.github.com/users/Codertocat/subscriptions\",\n    \"organizations_url\": \"https://api.github.com/users/Codertocat/orgs\",\n    \"repos_url\": \"https://api.github.com/users/Codertocat/repos\",\n    \"events_url\": \"https://api.github.com/users/Codertocat/events{/privacy}\",\n    \"received_events_url\": \"https://api.github.com/users/Codertocat/received_events\",\n    \"type\": \"User\",\n    \"site_admin\": false\n  }\n}\nEOF\n\necho \"$tmpEventFile\""
  },
  {
    "path": ".github/workflows/scripts/generate-workflow_dispatch-event.sh",
    "content": "#!/usr/bin/env bash\nset -e\n\nusage() {\n  echo \"Usage: \"\n  echo \"  $0 <input-json>\"\n  echo\n  echo \"    <input-json> inputs in JSON format\"\n  echo\n  echo \"Examples: \"\n  echo \"  $0 '{\\\"tag\\\": \\\"v1.2.3\\\"}'\"\n  echo \"  $0 '{\\\"issue_number\\\": 123}'\"\n  echo\n  exit 1;\n}\n\nINPUT_JSON=\"${1}\"\nif [[ -z \"${INPUT_JSON}\" ]]; then\n  echo \"Must specify input json\"\n  echo\n  usage\n  exit 1\nfi\n\ntmpEventFile=$(mktemp)\ncat <<EOF > \"$tmpEventFile\"\n{\n  \"action\": \"workflow_dispatch\",\n  \"inputs\": ${INPUT_JSON}\n}\nEOF\n\necho \"$tmpEventFile\""
  },
  {
    "path": ".github/workflows/scripts/test-job.sh",
    "content": "#!/usr/bin/env bash\nset -e\n\n: ${GITHUB_TOKEN?\"Need to set GITHUB_TOKEN env var.\"}\n\nusage() {\n  echo \"Usage: \"\n  echo \"  $0 <workflow> <job> [event]\"\n  echo \"    <workflow>  the workflow file to use\"\n  echo \"    <job>  job name to execute\"\n  echo \"    [event]  event file\"\n  exit 1; \n}\n\nWORKFLOW_FILE=\"${1}\"\nif [[ -z \"${WORKFLOW_FILE}\" ]]; then\n  echo \"Must specify a workflow file\"\n  echo\n  usage\n  exit 1\nfi\n\nJOB_NAME=\"${2}\"\nif [[ -z \"${JOB_NAME}\" ]]; then\n  echo \"Must specify a job\"\n  echo\n  usage\n  exit 1\nfi\n\nEVENT_FILE=\"${3}\"\n\nACT_EXEC=$(command -v act)\nif [[ -z \"${ACT_EXEC}\" ]]; then\n  echo \"Need act to be available: https://github.com/nektos/act\"\n  exit 1\nfi\n\nif [[ -n \"${EVENT_FILE}\" ]]; then\n  ${ACT_EXEC} \\\n    -v \\\n    -e \"${EVENT_FILE}\" \\\n    -P ubuntu-latest=nektos/act-environments-ubuntu:18.04 \\\n    -s GITHUB_TOKEN \\\n    -W \"${WORKFLOW_FILE}\" \\\n    -j \"${JOB_NAME}\"\nelse\n  ${ACT_EXEC} \\\n    -v \\\n    -P ubuntu-latest=nektos/act-environments-ubuntu:18.04 \\\n    -s GITHUB_TOKEN \\\n    -W \"${WORKFLOW_FILE}\" \\\n    -j \"${JOB_NAME}\"\nfi"
  },
  {
    "path": ".gitignore",
    "content": ".pack/\n/pack\nout/\nbenchmarks.test\n*.out\n\n# Jetbrains Goland\n.idea/\n\n# Build outputs\nartifacts/\n.DS_Store\n\n# Travis unencrypted file\n.travis/key.pem\n\n"
  },
  {
    "path": ".gitpod.yml",
    "content": "\ntasks:\n  - name: Setup\n    before: chmod ugo+w /var/run/docker.sock\n    init: make build\n    command: chmod ugo+w /var/run/docker.sock\n\ngithub:\n  prebuilds:\n    master: true\n    branches: true\n    pullRequests: true\n    pullRequestsFromForks: true\n    addCheck: true\n\nvscode:\n  extensions:\n    - golang.go\n"
  },
  {
    "path": "CODEOWNERS",
    "content": "* @buildpacks/platform-maintainers @buildpacks/toc\n"
  },
  {
    "path": "CONTRIBUTING.md",
    "content": "We're glad you are interested in contributing to this project. We hope that this\ndocument helps you get started.\n\n## Policies\n\nThis repository adheres to the following project policies:\n\n- [Code of Conduct][code-of-conduct] - How we should act with each other.\n- [Contributing][contributing] - General contributing standards.\n- [Security][security] - Reporting security concerns.\n- [Support][support] - Getting support.\n\n## Contributing to this repository\n\n### Development\n\nAside from the policies above, you may find [DEVELOPMENT.md](DEVELOPMENT.md) helpful; it provides specific details\nto assist you while developing in this repository.\n\nWe welcome any and all contributions! The issues we are prioritizing are visible in the repository [milestones](https://github.com/buildpacks/pack/milestones), and we especially welcome contributions in those areas. One good place to start contributing is in our [documentation](https://github.com/buildpacks/docs/issues), though you are welcome to start in this repository!\n\n#### Preparing for a Pull Request\n\nAfter making all your changes, but before creating a [Pull Request][pull-request-process], you should run\n`make prepare-for-pr`. This command runs a set of other tasks that resolve or report any simple issues that would\notherwise arise during the pull request review process.\n\n### User Acceptance on a Pull Request\n\nRunning user acceptance on a pull request is just as critical as reviewing the code changes. It gives you, a contributor and user, direct insight into how a feature works and allows you to provide feedback on what could be improved.\n\n#### Downloading PR binaries\n\n1. On GitHub's Pull Request view, click on the **Checks** tab.\n2. On the left panel, click on the **build** step.\n3. At the bottom, there is an **Artifacts** section.\n4. Click on the zip file for the platform you are running.\n\n#### Setup\n\n1. Unzip the binary:\n    ```shell\n    unzip pack-{{PLATFORM}}.zip\n    ```\n2. Enable execution of the binary _(macOS/Linux only)_:\n    ```shell\n    chmod +x ./pack\n    ```\n\n    > For macOS, you might need to allow your terminal to execute applications from unverified developers. See [Apple Support](https://support.apple.com/en-us/HT202491).\n    > \n    > A quick solution is to add an exception for the downloaded pack binary: `sudo spctl --add -v ./pack`\n3. You should now be able to execute pack via:\n    - macOS: `./pack`\n    - Linux: `./pack`\n    - Windows: `pack.exe`\n\n\n#### Writing Feedback\n\nWhen providing feedback, please provide a succinct title, a summary of the observation, what you expected, and some output or screenshots.\n\nHere's a simple template you can use:\n\n```text\n\n#### <!-- title -->\n\n<!-- a summary of what you observed -->\n\n###### Expected\n\n<!-- describe what you expected -->\n\n###### Output\n\n<!-- output / logs / screenshots -->\n```\n\n\n[code-of-conduct]: https://github.com/buildpacks/.github/blob/main/CODE_OF_CONDUCT.md\n[contributing]: https://github.com/buildpacks/.github/blob/main/CONTRIBUTING.md\n[security]: https://github.com/buildpacks/.github/blob/main/SECURITY.md\n[support]: https://github.com/buildpacks/.github/blob/main/SUPPORT.md\n[pull-request-process]: https://github.com/buildpacks/.github/blob/main/CONTRIBUTIONS.md#pull-request-process\n"
  },
  {
    "path": "DEVELOPMENT.md",
    "content": "# Development\n\n## Prerequisites\n\n* [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)\n    * macOS: _(built-in)_\n    * Windows:\n        * `choco install git -y`\n        * `git config --global core.autocrlf false`\n* [Go](https://golang.org/doc/install)\n    * macOS: `brew install go`\n    * Windows: `choco install golang -y`\n* [Docker](https://www.docker.com/products/docker-desktop)\n* Make (and build tools)\n    * macOS: `xcode-select --install`\n    * Windows:\n        * `choco install cygwin make -y`\n        * `[Environment]::SetEnvironmentVariable(\"PATH\", \"C:\\tools\\cygwin\\bin;$ENV:PATH\", \"MACHINE\")`\n\nAlternatively, you can use Gitpod to run a pre-configured dev environment in the cloud, right from your browser.\n\n[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/buildpacks/pack)\n\n\n### Windows Caveats\n\n* Symlinks - Some of our tests attempt to create symlinks. On Windows, this requires the appropriate [permission to be granted](https://stackoverflow.com/a/24353758).\n\n### Testing GitHub Actions on forks\n\nThe pack release process involves chaining a series of GitHub Actions workflows together, such as:\n* The \"build\" workflow, which creates:\n    * .tgz files containing the pack binaries and shasums for the .tgz files\n    * a draft release with the above artifacts\n* The \"delivery-docker\" workflow, which builds and pushes OCI images containing the pack binary\n* The \"benchmark\" workflow, which runs performance checks for each commit and uploads reports to GitHub Pages\n\nIt can be rather cumbersome to test changes to these workflows, as they are heavily intertwined. Thus, we recommend forking the buildpacks/pack repository on GitHub and running through the entire release process end-to-end.\n\nFor the fork, it is necessary to complete the following preparations:\n\n* Add the following secrets:\n    * `DOCKER_PASSWORD` for the delivery-docker workflow, if not using ghcr.io\n    * `DOCKER_USERNAME` for the delivery-docker workflow, if not using ghcr.io\n    * `DEPLOY_KEY` for the release-merge workflow, as an SSH private key for repository access\n* Enable the issues feature on the repository and create `status/triage` and `type/bug` labels for the check-latest-release workflow\n* Create a branch named `gh-pages` for uploading benchmark reports for the benchmark workflow\n\nThe `tools/test-fork.sh` script can be used to update the source code to reflect the state of the fork and disable workflows that should not run on the fork repository.\nIt can be invoked like so: `./tools/test-fork.sh <registry repo name>`\n\n## Tasks\n\n### Building\n\nTo build pack:\n```shell\nmake build\n```\n\nThis will output the binary to the directory `out/`.\n\nOptions:\n\n| ENV_VAR      | Description                                                            | Default |\n|--------------|------------------------------------------------------------------------|---------|\n| GOCMD        | Change the `go` executable. For example, [richgo][rgo] for testing.    | go      |\n| PACK_BIN     | Change the name or location of the binary relative to `out/`.          | pack    |\n| PACK_VERSION | Tell `pack` what version to consider itself                            | `dev`   |\n\n[rgo]: https://github.com/kyoh86/richgo\n\n_NOTE: This project uses [go modules](https://github.com/golang/go/wiki/Modules) for dependency management._\n\n### Testing\n\nTo run unit and integration tests:\n```shell\nmake unit\n```\nTest output will be streamed to your terminal and also saved to the file `out/unit`.\n\nTo run acceptance tests:\n```shell\nmake acceptance\n```\nTest output will be streamed to your terminal and also saved to the file `out/acceptance`.\n\nAlternatively, to run all tests:\n```shell\nmake test\n```\n\nTo run our full acceptance suite (including cross-compatibility for n-1 `pack` and `lifecycle`):\n```shell\nmake acceptance-all\n```\n\n### Tidy\n\nTo format the code:\n```shell\nmake format\n```\n\nTo tidy up the codebase and dependencies:\n```shell\nmake tidy\n```\n\n### Verification\n\nTo verify formatting and code quality:\n```shell\nmake verify\n```\n\n### Prepare for PR\n\nRuns various checks to ensure compliance:\n```shell\nmake prepare-for-pr\n```\n\n### Acceptance Tests\n\nSome options users can provide to our acceptance tests are:\n\n| ENV_VAR | Description | Default |\n|---|---|---|\n| ACCEPTANCE_SUITE_CONFIG | A set of configurations for how to run the acceptance tests, describing the version of `pack` used for testing, the version of `pack` used to create the builders used in the test, and the version of `lifecycle` binaries used to test with GitHub | `[{\"pack\": \"current\", \"pack_create_builder\": \"current\", \"lifecycle\": \"default\"}]` |\n| COMPILE_PACK_WITH_VERSION | Tell `pack` what version to consider itself | `dev` |\n| GITHUB_TOKEN | A GitHub token, used when downloading `pack` and `lifecycle` releases from GitHub during the test setup | \"\" |\n| LIFECYCLE_IMAGE | Image reference to be used in untrusted builder workflows | docker.io/buildpacksio/lifecycle:<lifecycle version> |\n| LIFECYCLE_PATH | Path to a `.tgz` file filled with a set of `lifecycle` binaries | The GitHub release for the default version of lifecycle in `pack` |\n| PACK_PATH | Path to a `pack` executable | A compiled version of the current branch |\n| PREVIOUS_LIFECYCLE_IMAGE | Image reference to be used in untrusted builder workflows, used to test compatibility of `pack` with the n-1 version of the `lifecycle` | docker.io/buildpacksio/lifecycle:<PREVIOUS_LIFECYCLE_PATH lifecycle version>, buildpacksio/lifecycle:<n-1 lifecycle version> |\n| PREVIOUS_LIFECYCLE_PATH | Path to a `.tgz` file filled with a set of `lifecycle` binaries, used to test compatibility of `pack` with the n-1 version of the `lifecycle` | The GitHub release for the n-1 release of `lifecycle` |\n| PREVIOUS_PACK_FIXTURES_PATH | Path to a set of fixtures, used to override the most up-to-date fixtures, in case of changed functionality | `acceptance/testdata/pack_previous_fixtures_overrides` |\n| PREVIOUS_PACK_PATH | Path to a `pack` executable, used to test compatibility with the n-1 version of `pack` | The most recent release from `pack`'s GitHub releases |\n"
  },
  {
    "path": "Dockerfile",
    "content": "ARG base_image=gcr.io/distroless/static\n\nFROM golang:1.25 AS builder\nARG pack_version\nENV PACK_VERSION=$pack_version\nWORKDIR /app\nCOPY . .\nRUN make build\n\nFROM ${base_image}\nCOPY --from=builder /app/out/pack /usr/local/bin/pack\nENTRYPOINT [ \"/usr/local/bin/pack\" ]\n"
  },
  {
    "path": "LICENSE",
    "content": "                                 Apache License\n                           Version 2.0, January 2004\n                        http://www.apache.org/licenses/\n\n   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n   1. Definitions.\n\n      \"License\" shall mean the terms and conditions for use, reproduction,\n      and distribution as defined by Sections 1 through 9 of this document.\n\n      \"Licensor\" shall mean the copyright owner or entity authorized by\n      the copyright owner that is granting the License.\n\n      \"Legal Entity\" shall mean the union of the acting entity and all\n      other entities that control, are controlled by, or are under common\n      control with that entity. For the purposes of this definition,\n      \"control\" means (i) the power, direct or indirect, to cause the\n      direction or management of such entity, whether by contract or\n      otherwise, or (ii) ownership of fifty percent (50%) or more of the\n      outstanding shares, or (iii) beneficial ownership of such entity.\n\n      \"You\" (or \"Your\") shall mean an individual or Legal Entity\n      exercising permissions granted by this License.\n\n      \"Source\" form shall mean the preferred form for making modifications,\n      including but not limited to software source code, documentation\n      source, and configuration files.\n\n      \"Object\" form shall mean any form resulting from mechanical\n      transformation or translation of a Source form, including but\n      not limited to compiled object code, generated documentation,\n      and conversions to other media types.\n\n      \"Work\" shall mean the work of authorship, whether in Source or\n      Object form, made available under the License, as indicated by a\n      copyright notice that is included in or attached to the work\n      (an example is provided in the Appendix below).\n\n      \"Derivative Works\" shall mean any work, whether in Source or Object\n      
form, that is based on (or derived from) the Work and for which the\n      editorial revisions, annotations, elaborations, or other modifications\n      represent, as a whole, an original work of authorship. For the purposes\n      of this License, Derivative Works shall not include works that remain\n      separable from, or merely link (or bind by name) to the interfaces of,\n      the Work and Derivative Works thereof.\n\n      \"Contribution\" shall mean any work of authorship, including\n      the original version of the Work and any modifications or additions\n      to that Work or Derivative Works thereof, that is intentionally\n      submitted to Licensor for inclusion in the Work by the copyright owner\n      or by an individual or Legal Entity authorized to submit on behalf of\n      the copyright owner. For the purposes of this definition, \"submitted\"\n      means any form of electronic, verbal, or written communication sent\n      to the Licensor or its representatives, including but not limited to\n      communication on electronic mailing lists, source code control systems,\n      and issue tracking systems that are managed by, or on behalf of, the\n      Licensor for the purpose of discussing and improving the Work, but\n      excluding communication that is conspicuously marked or otherwise\n      designated in writing by the copyright owner as \"Not a Contribution.\"\n\n      \"Contributor\" shall mean Licensor and any individual or Legal Entity\n      on behalf of whom a Contribution has been received by Licensor and\n      subsequently incorporated within the Work.\n\n   2. Grant of Copyright License. 
Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      copyright license to reproduce, prepare Derivative Works of,\n      publicly display, publicly perform, sublicense, and distribute the\n      Work and such Derivative Works in Source or Object form.\n\n   3. Grant of Patent License. Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      (except as stated in this section) patent license to make, have made,\n      use, offer to sell, sell, import, and otherwise transfer the Work,\n      where such license applies only to those patent claims licensable\n      by such Contributor that are necessarily infringed by their\n      Contribution(s) alone or by combination of their Contribution(s)\n      with the Work to which such Contribution(s) was submitted. If You\n      institute patent litigation against any entity (including a\n      cross-claim or counterclaim in a lawsuit) alleging that the Work\n      or a Contribution incorporated within the Work constitutes direct\n      or contributory patent infringement, then any patent licenses\n      granted to You under this License for that Work shall terminate\n      as of the date such litigation is filed.\n\n   4. Redistribution. 
You may reproduce and distribute copies of the\n      Work or Derivative Works thereof in any medium, with or without\n      modifications, and in Source or Object form, provided that You\n      meet the following conditions:\n\n      (a) You must give any other recipients of the Work or\n          Derivative Works a copy of this License; and\n\n      (b) You must cause any modified files to carry prominent notices\n          stating that You changed the files; and\n\n      (c) You must retain, in the Source form of any Derivative Works\n          that You distribute, all copyright, patent, trademark, and\n          attribution notices from the Source form of the Work,\n          excluding those notices that do not pertain to any part of\n          the Derivative Works; and\n\n      (d) If the Work includes a \"NOTICE\" text file as part of its\n          distribution, then any Derivative Works that You distribute must\n          include a readable copy of the attribution notices contained\n          within such NOTICE file, excluding those notices that do not\n          pertain to any part of the Derivative Works, in at least one\n          of the following places: within a NOTICE text file distributed\n          as part of the Derivative Works; within the Source form or\n          documentation, if provided along with the Derivative Works; or,\n          within a display generated by the Derivative Works, if and\n          wherever such third-party notices normally appear. The contents\n          of the NOTICE file are for informational purposes only and\n          do not modify the License. 
You may add Your own attribution\n          notices within Derivative Works that You distribute, alongside\n          or as an addendum to the NOTICE text from the Work, provided\n          that such additional attribution notices cannot be construed\n          as modifying the License.\n\n      You may add Your own copyright statement to Your modifications and\n      may provide additional or different license terms and conditions\n      for use, reproduction, or distribution of Your modifications, or\n      for any such Derivative Works as a whole, provided Your use,\n      reproduction, and distribution of the Work otherwise complies with\n      the conditions stated in this License.\n\n   5. Submission of Contributions. Unless You explicitly state otherwise,\n      any Contribution intentionally submitted for inclusion in the Work\n      by You to the Licensor shall be under the terms and conditions of\n      this License, without any additional terms or conditions.\n      Notwithstanding the above, nothing herein shall supersede or modify\n      the terms of any separate license agreement you may have executed\n      with Licensor regarding such Contributions.\n\n   6. Trademarks. This License does not grant permission to use the trade\n      names, trademarks, service marks, or product names of the Licensor,\n      except as required for reasonable and customary use in describing the\n      origin of the Work and reproducing the content of the NOTICE file.\n\n   7. Disclaimer of Warranty. Unless required by applicable law or\n      agreed to in writing, Licensor provides the Work (and each\n      Contributor provides its Contributions) on an \"AS IS\" BASIS,\n      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n      implied, including, without limitation, any warranties or conditions\n      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n      PARTICULAR PURPOSE. 
You are solely responsible for determining the\n      appropriateness of using or redistributing the Work and assume any\n      risks associated with Your exercise of permissions under this License.\n\n   8. Limitation of Liability. In no event and under no legal theory,\n      whether in tort (including negligence), contract, or otherwise,\n      unless required by applicable law (such as deliberate and grossly\n      negligent acts) or agreed to in writing, shall any Contributor be\n      liable to You for damages, including any direct, indirect, special,\n      incidental, or consequential damages of any character arising as a\n      result of this License or out of the use or inability to use the\n      Work (including but not limited to damages for loss of goodwill,\n      work stoppage, computer failure or malfunction, or any and all\n      other commercial damages or losses), even if such Contributor\n      has been advised of the possibility of such damages.\n\n   9. Accepting Warranty or Additional Liability. While redistributing\n      the Work or Derivative Works thereof, You may choose to offer,\n      and charge a fee for, acceptance of support, warranty, indemnity,\n      or other liability obligations and/or rights consistent with this\n      License. However, in accepting such obligations, You may act only\n      on Your own behalf and on Your sole responsibility, not on behalf\n      of any other Contributor, and only if You agree to indemnify,\n      defend, and hold each Contributor harmless for any liability\n      incurred by, or claims asserted against, such Contributor by reason\n      of your accepting any such warranty or additional liability.\n\n   END OF TERMS AND CONDITIONS\n\n   APPENDIX: How to apply the Apache License to your work.\n\n      To apply the Apache License to your work, attach the following\n      boilerplate notice, with the fields enclosed by brackets \"[]\"\n      replaced with your own identifying information. 
(Don't include\n      the brackets!)  The text should be enclosed in the appropriate\n      comment syntax for the file format. We also recommend that a\n      file or class name and description of purpose be included on the\n      same \"printed page\" as the copyright notice for easier\n      identification within third-party archives.\n\n   Copyright 2018 The Cloud Native Buildpacks Authors\n\n   Licensed under the Apache License, Version 2.0 (the \"License\");\n   you may not use this file except in compliance with the License.\n   You may obtain a copy of the License at\n\n       http://www.apache.org/licenses/LICENSE-2.0\n\n   Unless required by applicable law or agreed to in writing, software\n   distributed under the License is distributed on an \"AS IS\" BASIS,\n   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n   See the License for the specific language governing permissions and\n   limitations under the License.\n"
  },
  {
    "path": "Makefile",
    "content": "ifeq ($(OS),Windows_NT)\nSHELL:=cmd.exe\n\n# Need BLANK due to makefile parsing of `\\`\n# (see: https://stackoverflow.com/questions/54733231/how-to-escape-a-backslash-in-the-end-to-mean-literal-backslash-in-makefile/54733416#54733416)\nBLANK:=\n\n# Define variable named `/` to represent OS path separator (usable as `$/` in this file)\n/:=\\$(BLANK)\nCAT=type\nRMRF=rmdir /q /s\nSRC=$(shell dir /q /s /b *.go | findstr /v $/out$/)\nGOIMPORTS_DIFF_OPTION=\"-l\" # Windows can't do diff-mode because it's missing the \"diff\" binary\nPACK_BIN?=pack.exe\nelse\n/:=/\nCAT=cat\nRMRF=rm -rf\nSRC=$(shell find . -type f -name '*.go' -not -path \"*/out/*\")\nGOIMPORTS_DIFF_OPTION:=\"-d\"\nPACK_BIN?=pack\nendif\n\nACCEPTANCE_TIMEOUT?=$(TEST_TIMEOUT)\nARCHIVE_NAME=pack-$(PACK_VERSION)\nGOCMD?=go\nGOFLAGS?=\nGOTESTFLAGS?=-v -count=1 -parallel=1\nPACKAGE_BASE=github.com/buildpacks/pack\nPACK_GITSHA1=$(shell git rev-parse --short=7 HEAD)\nPACK_VERSION?=0.0.0\nTEST_TIMEOUT?=1200s\nUNIT_TIMEOUT?=$(TEST_TIMEOUT)\nNO_DOCKER?=\n\n# Clean build flags\nclean_build := $(strip ${PACK_BUILD})\nclean_sha := $(strip ${PACK_GITSHA1})\n\n# Append build number and git sha to version, if not-empty\nifneq ($(and $(clean_build),$(clean_sha)),)\nPACK_VERSION:=${PACK_VERSION}+git-${clean_sha}.build-${clean_build}\nelse ifneq ($(clean_build),)\nPACK_VERSION:=${PACK_VERSION}+build-${clean_build}\nelse ifneq ($(clean_sha),)\nPACK_VERSION:=${PACK_VERSION}+git-${clean_sha}\nendif\n\nexport GOFLAGS:=$(GOFLAGS)\nexport CGO_ENABLED=0\n\nBINDIR:=/usr/bin/\n\n.DEFAULT_GOAL := build\n\n# this target must be listed first in order for it to be a default target,\n# so that ubuntu_ppa's may be constructed using default build tools.\n## build: Build the program\nbuild: out\n\t@echo \"=====> Building...\"\n\t$(GOCMD) build -ldflags \"-s -w -X 'github.com/buildpacks/pack/pkg/client.Version=${PACK_VERSION}' -extldflags '${LDFLAGS}'\" -trimpath -o ./out/$(PACK_BIN) -a\n\n## all: Run clean, verify, test, and 
build operations\nall: clean verify test build\n\n## clean: Clean the workspace\nclean:\n\t@echo \"=====> Cleaning workspace...\"\n\t@$(RMRF) .$/out benchmarks.test || (exit 0)\n\n## format: Format the code\nformat: install-goimports\n\t@echo \"=====> Formatting code...\"\n\t@goimports -l -w -local ${PACKAGE_BASE} ${SRC}\n\t@go run tools/pedantic_imports/main.go ${PACKAGE_BASE} ${SRC}\n\n## generate: Generate mocks\ngenerate: install-mockgen\n\t@echo \"=====> Generating mocks...\"\n\t$(GOCMD) generate ./...\n\n## lint: Check the code\nlint: install-golangci-lint\n\t@echo \"=====> Linting code...\"\n\t@golangci-lint run -c golangci.yaml\n\n## test: Run unit and acceptance tests\ntest: unit acceptance\n\n## unit: Append coverage arguments\nifeq ($(TEST_COVERAGE), 1)\nunit: GOTESTFLAGS:=$(GOTESTFLAGS) -coverprofile=./out/tests/coverage-unit.txt -covermode=atomic\nendif\nifeq ($(NO_DOCKER),)\nunit: GOTESTFLAGS:=$(GOTESTFLAGS) --tags=example\nendif\nunit: out\n\t@echo \"> Running unit/integration tests...\"\n\t$(GOCMD) test $(GOTESTFLAGS) -timeout=$(UNIT_TIMEOUT) ./...\n\n## acceptance: Run acceptance tests\nacceptance: out\n\t@echo \"=====> Running acceptance tests...\"\n\t$(GOCMD) test $(GOTESTFLAGS) -timeout=$(ACCEPTANCE_TIMEOUT) -tags=acceptance ./acceptance\n\n## acceptance-all: Run all acceptance tests\nacceptance-all: export ACCEPTANCE_SUITE_CONFIG:=$(shell $(CAT) .$/acceptance$/testconfig$/all.json)\nacceptance-all:\n\t@echo \"=====> Running acceptance tests...\"\n\t$(GOCMD) test $(GOTESTFLAGS) -timeout=$(ACCEPTANCE_TIMEOUT) -tags=acceptance ./acceptance\n\n## prepare-for-pr: Run clean, verify, and test operations and check for uncommitted changes\nprepare-for-pr: tidy verify test\n\t@git diff-index --quiet HEAD -- ||\\\n\t(echo \"-----------------\" &&\\\n\techo \"NOTICE: There are some files that have not been committed.\" &&\\\n\techo \"-----------------\\n\" &&\\\n\tgit status &&\\\n\techo \"\\n-----------------\" &&\\\n\techo \"NOTICE: There are some files 
that have not been committed.\" &&\\\n\techo \"-----------------\\n\"  &&\\\n\texit 0)\n\n## verify: Run format and lint checks\nverify: verify-format lint\n\n## verify-format: Verify the format\nverify-format: install-goimports\n\t@echo \"=====> Verifying format...\"\n\t$(if $(shell goimports -l -local ${PACKAGE_BASE} ${SRC}), @echo ERROR: Format verification failed! && goimports ${GOIMPORTS_DIFF_OPTION} -local ${PACKAGE_BASE} ${SRC} && exit 1)\n\n## benchmark: Run benchmark tests\nbenchmark: out\n\t@echo \"=====> Running benchmarks\"\n\t$(GOCMD) test -run=^$  -bench=. -benchtime=1s -benchmem -memprofile=./out/bench_mem.out -cpuprofile=./out/bench_cpu.out -tags=benchmarks ./benchmarks/ -v\n# NOTE: You can analyze the results, using go tool pprof. For instance, you can start a server to see a graph of the cpu usage by running\n# go tool pprof -http=\":8082\" out/bench_cpu.out. Alternatively, you can run go tool pprof, and in the ensuing cli, run\n# commands like top10 or web to dig down into the cpu and memory usage\n# For more, see https://blog.golang.org/pprof\n\n## package: Package the program\npackage: out\n\ttar czf .$/out$/$(ARCHIVE_NAME).tgz -C .$/out$/ $(PACK_BIN)\n\n## install: Install the program to the system\ninstall:\n\tmkdir -p ${DESTDIR}${BINDIR}\n\tcp ./out/$(PACK_BIN) ${DESTDIR}${BINDIR}/\n\n## install-mockgen: Used only by apt-get install when installing ubuntu ppa\ninstall-mockgen:\n\t@echo \"=====> Installing mockgen...\"\n\tcd tools && $(GOCMD) install github.com/golang/mock/mockgen\n\n## install-goimports: Install goimports dependency\ninstall-goimports:\n\t@echo \"=====> Installing goimports...\"\n\tcd tools && $(GOCMD) install golang.org/x/tools/cmd/goimports\n\n## install-golangci-lint: Install golangci-lint dependency\ninstall-golangci-lint:\n\t@echo \"=====> Installing golangci-lint...\"\n\tcd tools && $(GOCMD) install github.com/golangci/golangci-lint/v2/cmd/golangci-lint@v2.0.2\n\n## mod-tidy: Tidy Go modules\nmod-tidy:\n\t$(GOCMD) mod 
tidy  -compat=1.25\n\tcd tools && $(GOCMD) mod tidy -compat=1.25\n\n## tidy: Tidy modules and format the code\ntidy: mod-tidy format\n\n# NOTE: Windows doesn't support `-p`\n## out: Make a directory for output\nout:\n\t@mkdir out || (exit 0)\n\tmkdir out$/tests || (exit 0)\n\n## help: Display help information\nhelp: Makefile\n\t@echo \"\"\n\t@echo \"Usage:\"\n\t@echo \"\"\n\t@echo \"  make [target]\"\n\t@echo \"\"\n\t@echo \"Targets:\"\n\t@echo \"\"\n\t@awk -F ':|##' '/^[^\\.%\\t][^\\t]*:.*##/{printf \"  \\033[36m%-20s\\033[0m %s\\n\", $$1, $$NF}' $(MAKEFILE_LIST) | sort\n\t@sed -n 's/^##//p' ${MAKEFILE_LIST} | column -t -s ':' |  sed -e 's/^/ /'\n\n.PHONY: clean build format imports lint test unit acceptance prepare-for-pr verify verify-format benchmark \n"
  },
  {
    "path": "README.md",
    "content": "# pack - Buildpack CLI\n\n[![Build results](https://github.com/buildpacks/pack/workflows/build/badge.svg)](https://github.com/buildpacks/pack/actions)\n[![Go Report Card](https://goreportcard.com/badge/github.com/buildpacks/pack)](https://goreportcard.com/report/github.com/buildpacks/pack)\n[![codecov](https://codecov.io/gh/buildpacks/pack/branch/main/graph/badge.svg)](https://codecov.io/gh/buildpacks/pack)\n[![GoDoc](https://godoc.org/github.com/buildpacks/pack?status.svg)](https://godoc.org/github.com/buildpacks/pack)\n[![GitHub license](https://img.shields.io/github/license/buildpacks/pack)](https://github.com/buildpacks/pack/blob/main/LICENSE)\n[![CII Best Practices](https://bestpractices.coreinfrastructure.org/projects/4748/badge)](https://bestpractices.coreinfrastructure.org/projects/4748)\n[![Slack](https://img.shields.io/badge/slack-join-ff69b4.svg?logo=slack)](https://slack.cncf.io/)\n[![Gitpod ready-to-code](https://img.shields.io/badge/Gitpod-ready--to--code-blue?logo=gitpod)](https://gitpod.io/#https://github.com/buildpacks/pack)\n\n`pack` makes it easy for...\n- [**App Developers**][app-dev] to use buildpacks to convert code into runnable images.\n- [**Buildpack Authors**][bp-author] to develop and package buildpacks for distribution.\n- [**Operators**][operator] to package buildpacks for distribution and maintain applications.\n\n## Usage\n\n<img src=\"resources/pack-build.gif\" width=\"600px\" />\n\n## Getting Started\n\nGet started by running through our tutorial: [An App’s Brief Journey from Source to Image][getting-started]\n\n## Contributing\n\n- [CONTRIBUTING](CONTRIBUTING.md) - Information on how to contribute, including the pull request process.\n- [DEVELOPMENT](DEVELOPMENT.md) - Further detail to help you during the development process.\n- [RELEASE](RELEASE.md) - Further details about our release process.\n\n## Documentation\n\nCheck out the [command line documentation][pack-docs].\n\n## Specifications\n\n`pack` is a CLI implementation of the [Platform Interface Specification][platform-spec] for [Cloud Native Buildpacks][buildpacks.io].\n\nTo learn more about the details, check out the [specs repository][specs].\n\n[app-dev]: https://buildpacks.io/docs/for-app-developers/\n[bp-author]: https://buildpacks.io/docs/for-buildpack-authors/\n[operator]: https://buildpacks.io/docs/for-platform-operators/\n[buildpacks.io]: https://buildpacks.io/\n[install-pack]: https://buildpacks.io/docs/install-pack/\n[getting-started]: https://buildpacks.io/docs/app-journey\n[specs]: https://github.com/buildpacks/spec/\n[platform-spec]: https://github.com/buildpacks/spec/blob/main/platform.md\n[pack-docs]: https://buildpacks.io/docs/tools/pack/cli/pack/\n"
  },
  {
    "path": "RELEASE.md",
    "content": "# Release Process\n\nPack follows a 6-week release cadence, composed of 3 phases:\n  - [Development](#development)\n  - [Feature Complete](#feature-complete)\n  - [Release Finalization](#release-finalization)\n\n## Roles\n\n#### Release Manager\n\nOne of the [maintainers][maintainers] is designated as the release manager. They communicate the release status to the working group meetings, schedule additional meetings with the `pack` [maintainers][maintainers] as needed, and finalize the release. They also handle any other release needs that arise.\n\n## Phases\n\n### Development\n\nOur development flow is detailed in [Development](DEVELOPMENT.md).\n\n### Feature Complete\n\n5 business days prior to a scheduled release, we enter `feature complete`.\n\nDuring this period, a **Release Candidate** (RC) is published and used for further User Acceptance Testing (`UAT`). Furthermore, additional RCs may be published based on assessment by the `pack` [maintainers][maintainers] of the **impact**, **effort**, and **risk** of including the changes in the upcoming release. Any other changes may be merged into the `main` branch through the normal process, and will make it into the next release.\n\nTo produce the release candidate, the [release manager](#release-manager) will:\n- Create a new release branch in the form `release/<VERSION>-rc<NUMBER>`, yielding a draft GitHub release to be published.\n- Publish the [GitHub release][release]:\n    - Tag release branch as `v<VERSION>-rc<NUMBER>`.\n    - The release should be marked as \"pre-release\".\n    - The GitHub release will contain the following:\n        - **artifacts**\n        - **release notes**\n    - The release notes should be edited and cleaned up.\n- Merge the release branch into `main`.\n\n### Release Finalization\n\nThe [release manager](#release-manager) will:\n- Create a new release branch in the form `release/<VERSION>`, yielding a draft GitHub release to be published.\n- Publish the [GitHub release][release]:\n    - Tag release branch as `v<VERSION>`.\n    - The GitHub release will contain the following:\n        - **artifacts**\n        - **release notes**\n        - **migration guide** (if necessary)\n- Merge the release branch into `main`.\n- Create a new [milestone](https://github.com/buildpacks/pack/milestones) for the next version, and set its delivery date 6 weeks out.\n- Move all still-open PRs/issues from the delivered milestone to the new milestone.\n- Close the delivered milestone.\n- Send out release notifications, if deemed necessary, on:\n  - The [cncf-buildpacks mailing list](https://lists.cncf.io/g/cncf-buildpacks)\n  - Twitter\n- Post-release, you should be able to remove any acceptance test constraints (in [acceptance/invoke/pack.go](acceptance/invoke/pack.go)) in the `featureTests` struct. Create a PR removing them, in order to ensure our acceptance tests are clean.\n\nAnd with that, you're done!\n\n## Manual Releasing\n\nWe release pack to a number of systems, including `homebrew`, `docker`, and `archlinux`. All of our delivery pipelines\nhave `workflow_dispatch` triggers in case a maintainer needs to trigger them manually. To activate one, go to the\n[actions page](https://github.com/buildpacks/pack/actions), and select the desired workflow. Run it by providing the pack\nversion to release, in the format `v<version>`.\n\n_For more information, see the [release process RFC][release-process]._\n\n[maintainers]: https://github.com/buildpacks/community/blob/main/TEAMS.md#platform-team\n[release-process]: https://github.com/buildpacks/rfcs/blob/main/text/0039-release-process.md#change-control-board\n[release]: https://github.com/buildpacks/pack/releases\n"
  },
  {
    "path": "acceptance/acceptance_test.go",
    "content": "//go:build acceptance\n\npackage acceptance\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"crypto/sha256\"\n\t\"encoding/hex\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"runtime\"\n\t\"strings\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\tv1 \"github.com/google/go-containerregistry/pkg/v1\"\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/pelletier/go-toml\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\tyaml \"gopkg.in/yaml.v3\"\n\n\t\"github.com/buildpacks/pack/acceptance/assertions\"\n\t\"github.com/buildpacks/pack/acceptance/buildpacks\"\n\t\"github.com/buildpacks/pack/acceptance/config\"\n\t\"github.com/buildpacks/pack/acceptance/invoke\"\n\t\"github.com/buildpacks/pack/acceptance/managers\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/cache\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst (\n\trunImage   = \"pack-test/run\"\n\tbuildImage = \"pack-test/build\"\n)\n\nvar (\n\tdockerCli      *client.Client\n\tregistryConfig *h.TestRegistryConfig\n\tsuiteManager   *SuiteManager\n\timageManager   managers.ImageManager\n\tassertImage    assertions.ImageAssertionManager\n)\n\nfunc TestAcceptance(t *testing.T) {\n\tvar err error\n\n\th.RequireDocker(t)\n\n\tassert := h.NewAssertionManager(t)\n\n\tdockerCli, err = client.New(client.FromEnv)\n\tassert.Nil(err)\n\n\timageManager = managers.NewImageManager(t, dockerCli)\n\n\tregistryConfig = h.RunRegistry(t)\n\tdefer registryConfig.RmRegistry(t)\n\n\tassertImage = assertions.NewImageAssertionManager(t, imageManager, registryConfig)\n\n\tinputConfigManager, err := 
config.NewInputConfigurationManager()\n\tassert.Nil(err)\n\n\tassetsConfig := config.ConvergedAssetManager(t, assert, inputConfigManager)\n\n\tsuiteManager = &SuiteManager{out: t.Logf}\n\tsuite := spec.New(\"acceptance suite\", spec.Report(report.Terminal{}))\n\n\tif inputConfigManager.Combinations().IncludesCurrentSubjectPack() {\n\t\tsuite(\"p_current\", func(t *testing.T, when spec.G, it spec.S) {\n\t\t\ttestWithoutSpecificBuilderRequirement(\n\t\t\t\tt,\n\t\t\t\twhen,\n\t\t\t\tit,\n\t\t\t\tassetsConfig.NewPackAsset(config.Current),\n\t\t\t)\n\t\t}, spec.Report(report.Terminal{}))\n\t}\n\n\tfor _, combo := range inputConfigManager.Combinations() {\n\t\t// see https://github.com/golang/go/wiki/CommonMistakes#using-reference-to-loop-iterator-variable\n\t\tcombo := combo\n\n\t\tt.Logf(`setting up run combination %s: %s`,\n\t\t\tstyle.Symbol(combo.String()),\n\t\t\tcombo.Describe(assetsConfig),\n\t\t)\n\n\t\tsuite(combo.String(), func(t *testing.T, when spec.G, it spec.S) {\n\t\t\ttestAcceptance(\n\t\t\t\tt,\n\t\t\t\twhen,\n\t\t\t\tit,\n\t\t\t\tassetsConfig.NewPackAsset(combo.Pack),\n\t\t\t\tassetsConfig.NewPackAsset(combo.PackCreateBuilder),\n\t\t\t\tassetsConfig.NewLifecycleAsset(combo.Lifecycle),\n\t\t\t)\n\t\t}, spec.Report(report.Terminal{}))\n\t}\n\n\tsuite.Run(t)\n\n\tassert.Nil(suiteManager.CleanUp())\n}\n\n// These tests either (a) do not require a builder or (b) do not require a specific builder to be provided\n// in order to test compatibility.\n// They should only be run against the \"current\" (i.e., main) version of pack.\nfunc testWithoutSpecificBuilderRequirement(\n\tt *testing.T,\n\twhen spec.G,\n\tit spec.S,\n\tpackConfig config.PackAsset,\n) {\n\tvar (\n\t\tpack             *invoke.PackInvoker\n\t\tassert           = h.NewAssertionManager(t)\n\t\tbuildpackManager buildpacks.BuildModuleManager\n\t)\n\n\tit.Before(func() {\n\t\tpack = invoke.NewPackInvoker(t, assert, packConfig, 
registryConfig.DockerConfigDir)\n\t\tpack.EnableExperimental()\n\t\tbuildpackManager = buildpacks.NewBuildModuleManager(t, assert)\n\t})\n\n\tit.After(func() {\n\t\tpack.Cleanup()\n\t})\n\n\twhen(\"invalid subcommand\", func() {\n\t\tit(\"prints usage\", func() {\n\t\t\toutput, err := pack.Run(\"some-bad-command\")\n\t\t\tassert.NotNil(err)\n\n\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\tassertOutput.ReportsCommandUnknown(\"some-bad-command\")\n\t\t\tassertOutput.IncludesUsagePrompt()\n\t\t})\n\t})\n\n\twhen(\"build with default builders not set\", func() {\n\t\tit(\"informs the user\", func() {\n\t\t\toutput, err := pack.Run(\n\t\t\t\t\"build\", \"some/image\",\n\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t)\n\n\t\t\tassert.NotNil(err)\n\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\tassertOutput.IncludesMessageToSetDefaultBuilder()\n\t\t\tassertOutput.IncludesPrefixedGoogleBuilder()\n\t\t\tassertOutput.IncludesPrefixedHerokuBuilders()\n\t\t\tassertOutput.IncludesPrefixedPaketoBuilders()\n\t\t})\n\t})\n\n\twhen(\"buildpack\", func() {\n\t\twhen(\"package\", func() {\n\t\t\tvar (\n\t\t\t\ttmpDir                         string\n\t\t\t\tbuildpackManager               buildpacks.BuildModuleManager\n\t\t\t\tsimplePackageConfigFixtureName = \"package.toml\"\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"buildpack-package-tests\")\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tbuildpackManager = buildpacks.NewBuildModuleManager(t, assert)\n\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, buildpacks.BpSimpleLayersParent, buildpacks.BpSimpleLayers)\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\tassert.Nil(os.RemoveAll(tmpDir))\n\t\t\t})\n\n\t\t\tgenerateAggregatePackageToml := func(buildpackURI, nestedPackageName, operatingSystem string) string {\n\t\t\t\tt.Helper()\n\t\t\t\tpackageTomlFile, err := os.CreateTemp(tmpDir, 
\"package_aggregate-*.toml\")\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tpack.FixtureManager().TemplateFixtureToFile(\n\t\t\t\t\t\"package_aggregate.toml\",\n\t\t\t\t\tpackageTomlFile,\n\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\"BuildpackURI\": buildpackURI,\n\t\t\t\t\t\t\"PackageName\":  nestedPackageName,\n\t\t\t\t\t\t\"OS\":           operatingSystem,\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\tassert.Nil(packageTomlFile.Close())\n\n\t\t\t\treturn packageTomlFile.Name()\n\t\t\t}\n\n\t\t\tgenerateMultiPlatformCompositeBuildpackPackageToml := func(buildpackURI, dependencyURI string) string {\n\t\t\t\tt.Helper()\n\t\t\t\tpackageTomlFile, err := os.CreateTemp(tmpDir, \"package_multi_platform-*.toml\")\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tpack.FixtureManager().TemplateFixtureToFile(\n\t\t\t\t\t\"package_multi_platform.toml\",\n\t\t\t\t\tpackageTomlFile,\n\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\"BuildpackURI\": buildpackURI,\n\t\t\t\t\t\t\"PackageName\":  dependencyURI,\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\tassert.Nil(packageTomlFile.Close())\n\n\t\t\t\treturn packageTomlFile.Name()\n\t\t\t}\n\n\t\t\twhen(\"no --format is provided\", func() {\n\t\t\t\tit(\"creates the package as image\", func() {\n\t\t\t\t\tpackageName := \"test/package-\" + h.RandString(10)\n\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, imageManager.HostOS())\n\n\t\t\t\t\toutput := pack.RunSuccessfully(\"buildpack\", \"package\", packageName, \"-c\", packageTomlPath)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsPackageCreation(packageName)\n\t\t\t\t\tdefer imageManager.CleanupImages(packageName)\n\n\t\t\t\t\tassertImage.ExistsLocally(packageName)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--format image\", func() {\n\t\t\t\tit(\"creates the package\", func() {\n\t\t\t\t\tt.Log(\"package w/ only buildpacks\")\n\t\t\t\t\tnestedPackageName := \"test/package-\" + h.RandString(10)\n\t\t\t\t\tpackageName := \"test/package-\" + 
h.RandString(10)\n\n\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, imageManager.HostOS())\n\t\t\t\t\taggregatePackageToml := generateAggregatePackageToml(\"simple-layers-parent-buildpack.tgz\", nestedPackageName, imageManager.HostOS())\n\n\t\t\t\t\tpackageBuildpack := buildpacks.NewPackageImage(\n\t\t\t\t\t\tt,\n\t\t\t\t\t\tpack,\n\t\t\t\t\t\tpackageName,\n\t\t\t\t\t\taggregatePackageToml,\n\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\t\t\t\t\tbuildpacks.BpSimpleLayersParent,\n\t\t\t\t\t\t\tbuildpacks.NewPackageImage(\n\t\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\t\tnestedPackageName,\n\t\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayers),\n\t\t\t\t\t\t\t),\n\t\t\t\t\t\t),\n\t\t\t\t\t)\n\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, packageBuildpack)\n\t\t\t\t\tdefer imageManager.CleanupImages(nestedPackageName, packageName)\n\n\t\t\t\t\tassertImage.ExistsLocally(nestedPackageName)\n\t\t\t\t\tassertImage.ExistsLocally(packageName)\n\t\t\t\t})\n\n\t\t\t\twhen(\"--publish\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// used to avoid authentication issues with the local registry\n\t\t\t\t\t\tos.Setenv(\"DOCKER_CONFIG\", registryConfig.DockerConfigDir)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"no --targets\", func() {\n\t\t\t\t\t\tit(\"publishes image to registry\", func() {\n\t\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, imageManager.HostOS())\n\t\t\t\t\t\t\tnestedPackageName := registryConfig.RepoName(\"test/package-\" + h.RandString(10))\n\n\t\t\t\t\t\t\tnestedPackage := 
buildpacks.NewPackageImage(\n\t\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\t\tnestedPackageName,\n\t\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayers),\n\t\t\t\t\t\t\t\tbuildpacks.WithPublish(),\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, nestedPackage)\n\n\t\t\t\t\t\t\taggregatePackageToml := generateAggregatePackageToml(\"simple-layers-parent-buildpack.tgz\", nestedPackageName, imageManager.HostOS())\n\t\t\t\t\t\t\tpackageName := registryConfig.RepoName(\"test/package-\" + h.RandString(10))\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"buildpack\", \"package\", packageName,\n\t\t\t\t\t\t\t\t\"-c\", aggregatePackageToml,\n\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tdefer imageManager.CleanupImages(packageName)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsPackagePublished(packageName)\n\n\t\t\t\t\t\t\tassertImage.NotExistsLocally(packageName)\n\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(packageName)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--targets\", func() {\n\t\t\t\t\t\tvar packageName string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.MultiPlatformBuildersAndBuildPackages), \"multi-platform builders and buildpack packages are available since 0.34.0\")\n\t\t\t\t\t\t\tpackageName = registryConfig.RepoName(\"simple-multi-platform-buildpack\" + h.RandString(8))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"simple buildpack on disk\", func() {\n\t\t\t\t\t\t\tvar path string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t// create a simple buildpack on disk\n\t\t\t\t\t\t\t\tsourceDir := filepath.Join(\"testdata\", \"mock_buildpacks\")\n\t\t\t\t\t\t\t\tpath = filepath.Join(tmpDir, \"simple-layers-buildpack\")\n\t\t\t\t\t\t\t\terr := buildpacks.BpFolderSimpleLayers.Prepare(sourceDir, tmpDir)\n\t\t\t\t\t\t\t\th.AssertNil(t, 
err)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"publishes images for each requested target to the registry and creates an image index\", func() {\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"buildpack\", \"package\", packageName,\n\t\t\t\t\t\t\t\t\t\"--path\", path,\n\t\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\t\t\t\"--target\", \"linux/arm64\",\n\t\t\t\t\t\t\t\t\t\"--target\", \"windows/amd64\",\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tdefer imageManager.CleanupImages(packageName)\n\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsPackagePublished(packageName)\n\n\t\t\t\t\t\t\t\tassertImage.NotExistsLocally(packageName)\n\t\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(packageName)\n\n\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(packageName)\n\t\t\t\t\t\t\t\th.AssertRemoteImageIndex(t, packageName, types.OCIImageIndex, 3)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"composite buildpack on disk\", func() {\n\t\t\t\t\t\t\tvar packageTomlPath string\n\n\t\t\t\t\t\t\twhen(\"dependencies are not available in a registry\", func() {\n\t\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t\t// creates a composite buildpack on disk\n\t\t\t\t\t\t\t\t\tsourceDir := filepath.Join(\"testdata\", \"mock_buildpacks\")\n\n\t\t\t\t\t\t\t\t\terr := buildpacks.MetaBpDependency.Prepare(sourceDir, tmpDir)\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\terr = buildpacks.MetaBpFolder.Prepare(sourceDir, tmpDir)\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\tpackageTomlPath = filepath.Join(tmpDir, \"meta-buildpack\", \"package.toml\")\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\t\t\t\t\toutput, err := pack.Run(\n\t\t\t\t\t\t\t\t\t\t\"buildpack\", \"package\", packageName,\n\t\t\t\t\t\t\t\t\t\t\"--config\", 
packageTomlPath,\n\t\t\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"linux/arm64\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"windows/amd64\",\n\t\t\t\t\t\t\t\t\t\t\"--verbose\",\n\t\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\t\tassert.NotNil(err)\n\t\t\t\t\t\t\t\t\th.AssertContains(t, output, \"uri '../meta-buildpack-dependency' is not allowed when creating a composite multi-platform buildpack; push your dependencies to a registry and use 'docker://<image>' instead\")\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"dependencies are available in a registry\", func() {\n\t\t\t\t\t\t\t\tvar depPackageName string\n\n\t\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t\t// multi-platform composite buildpacks require the dependencies to be available in a registry\n\t\t\t\t\t\t\t\t\t// let's push it\n\n\t\t\t\t\t\t\t\t\t// first creates the simple buildpack dependency on disk\n\t\t\t\t\t\t\t\t\tdepSourceDir := filepath.Join(\"testdata\", \"mock_buildpacks\")\n\t\t\t\t\t\t\t\t\tdepPath := filepath.Join(tmpDir, \"meta-buildpack-dependency\")\n\t\t\t\t\t\t\t\t\terr := buildpacks.MetaBpDependency.Prepare(depSourceDir, tmpDir)\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\t// push the dependency to a registry\n\t\t\t\t\t\t\t\t\tdepPackageName = registryConfig.RepoName(\"simple-multi-platform-buildpack\" + h.RandString(8))\n\t\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\t\"buildpack\", \"package\", depPackageName,\n\t\t\t\t\t\t\t\t\t\t\"--path\", depPath,\n\t\t\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"linux/arm64\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"windows/amd64\",\n\t\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, 
output).ReportsPackagePublished(depPackageName)\n\t\t\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(depPackageName)\n\t\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(depPackageName)\n\n\t\t\t\t\t\t\t\t\t// let's prepare the composite buildpack to use the simple buildpack dependency prepared above\n\t\t\t\t\t\t\t\t\tpackageTomlPath = generateMultiPlatformCompositeBuildpackPackageToml(\".\", depPackageName)\n\n\t\t\t\t\t\t\t\t\t// We need to copy the buildpack toml to the folder where the packageTomlPath was created\n\t\t\t\t\t\t\t\t\tpackageTomlDir := filepath.Dir(packageTomlPath)\n\t\t\t\t\t\t\t\t\tsourceDir := filepath.Join(\"testdata\", \"mock_buildpacks\", \"meta-buildpack\", \"buildpack.toml\")\n\t\t\t\t\t\t\t\t\th.CopyFile(t, sourceDir, filepath.Join(packageTomlDir, \"buildpack.toml\"))\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\tit(\"publishes images for each requested target to the registry and creates an image index\", func() {\n\t\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\t\"buildpack\", \"package\", packageName,\n\t\t\t\t\t\t\t\t\t\t\"--config\", packageTomlPath,\n\t\t\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"linux/arm64\",\n\t\t\t\t\t\t\t\t\t\t\"--target\", \"windows/amd64\",\n\t\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\t\tdefer imageManager.CleanupImages(packageName)\n\t\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsPackagePublished(packageName)\n\n\t\t\t\t\t\t\t\t\tassertImage.NotExistsLocally(packageName)\n\t\t\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(packageName)\n\n\t\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(packageName)\n\t\t\t\t\t\t\t\t\th.AssertRemoteImageIndex(t, packageName, types.OCIImageIndex, 3)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"new multi-platform folder structure is 
used\", func() {\n\t\t\t\t\t\tvar packageName string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.MultiPlatformBuildersAndBuildPackages), \"multi-platform builders and buildpack packages are available since 0.34.0\")\n\t\t\t\t\t\t\tpackageName = registryConfig.RepoName(\"simple-multi-platform-buildpack\" + h.RandString(8))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"simple buildpack on disk\", func() {\n\t\t\t\t\t\t\tvar path string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t// create a simple buildpack on disk\n\t\t\t\t\t\t\t\tsourceDir := filepath.Join(\"testdata\", \"mock_buildpacks\")\n\t\t\t\t\t\t\t\tpath = filepath.Join(tmpDir, \"multi-platform-buildpack\")\n\t\t\t\t\t\t\t\terr := buildpacks.MultiPlatformFolderBP.Prepare(sourceDir, tmpDir)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"publishes images for each target specified in buildpack.toml to the registry and creates an image index\", func() {\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"buildpack\", \"package\", packageName,\n\t\t\t\t\t\t\t\t\t\"--path\", path,\n\t\t\t\t\t\t\t\t\t\"--publish\", \"--verbose\",\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tdefer imageManager.CleanupImages(packageName)\n\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsPackagePublished(packageName)\n\n\t\t\t\t\t\t\t\tassertImage.NotExistsLocally(packageName)\n\t\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(packageName)\n\n\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(packageName)\n\t\t\t\t\t\t\t\th.AssertRemoteImageIndex(t, packageName, types.OCIImageIndex, 3)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"--pull-policy=never\", func() {\n\t\t\t\t\tit(\"should use local image\", func() {\n\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, 
imageManager.HostOS())\n\t\t\t\t\t\tnestedPackageName := \"test/package-\" + h.RandString(10)\n\t\t\t\t\t\tnestedPackage := buildpacks.NewPackageImage(\n\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\tnestedPackageName,\n\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayers),\n\t\t\t\t\t\t)\n\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, nestedPackage)\n\t\t\t\t\t\tdefer imageManager.CleanupImages(nestedPackageName)\n\t\t\t\t\t\taggregatePackageToml := generateAggregatePackageToml(\"simple-layers-parent-buildpack.tgz\", nestedPackageName, imageManager.HostOS())\n\n\t\t\t\t\t\tpackageName := registryConfig.RepoName(\"test/package-\" + h.RandString(10))\n\t\t\t\t\t\tdefer imageManager.CleanupImages(packageName)\n\t\t\t\t\t\tpack.JustRunSuccessfully(\n\t\t\t\t\t\t\t\"buildpack\", \"package\", packageName,\n\t\t\t\t\t\t\t\"-c\", aggregatePackageToml,\n\t\t\t\t\t\t\t\"--pull-policy\", \"never\")\n\n\t\t\t\t\t\tassertImage.ExistsLocally(packageName)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should not pull image from registry\", func() {\n\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, imageManager.HostOS())\n\t\t\t\t\t\tnestedPackageName := registryConfig.RepoName(\"test/package-\" + h.RandString(10))\n\t\t\t\t\t\tnestedPackage := buildpacks.NewPackageImage(\n\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\tnestedPackageName,\n\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\tbuildpacks.WithPublish(),\n\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayers),\n\t\t\t\t\t\t)\n\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, nestedPackage)\n\t\t\t\t\t\taggregatePackageToml := generateAggregatePackageToml(\"simple-layers-parent-buildpack.tgz\", nestedPackageName, imageManager.HostOS())\n\n\t\t\t\t\t\tpackageName := registryConfig.RepoName(\"test/package-\" + h.RandString(10))\n\n\t\t\t\t\t\toutput, err := 
pack.Run(\n\t\t\t\t\t\t\t\"buildpack\", \"package\", packageName,\n\t\t\t\t\t\t\t\"-c\", aggregatePackageToml,\n\t\t\t\t\t\t\t\"--pull-policy\", \"never\",\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tassert.NotNil(err)\n\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsImageNotExistingOnDaemon(nestedPackageName)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--format file\", func() {\n\t\t\t\twhen(\"the file extension is .cnb\", func() {\n\t\t\t\t\tit(\"creates the package with the same extension\", func() {\n\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, imageManager.HostOS())\n\t\t\t\t\t\tdestinationFile := filepath.Join(tmpDir, \"package.cnb\")\n\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\"buildpack\", \"package\", destinationFile,\n\t\t\t\t\t\t\t\"--format\", \"file\",\n\t\t\t\t\t\t\t\"-c\", packageTomlPath,\n\t\t\t\t\t\t)\n\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsPackageCreation(destinationFile)\n\t\t\t\t\t\th.AssertTarball(t, destinationFile)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"the file extension is empty\", func() {\n\t\t\t\t\tit(\"creates the package with a .cnb extension\", func() {\n\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, imageManager.HostOS())\n\t\t\t\t\t\tdestinationFile := filepath.Join(tmpDir, \"package\")\n\t\t\t\t\t\texpectedFile := filepath.Join(tmpDir, \"package.cnb\")\n\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\"buildpack\", \"package\", destinationFile,\n\t\t\t\t\t\t\t\"--format\", \"file\",\n\t\t\t\t\t\t\t\"-c\", packageTomlPath,\n\t\t\t\t\t\t)\n\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsPackageCreation(expectedFile)\n\t\t\t\t\t\th.AssertTarball(t, expectedFile)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"the file extension is not .cnb\", func() {\n\t\t\t\t\tit(\"creates the package with the given extension but 
shows a warning\", func() {\n\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, imageManager.HostOS())\n\t\t\t\t\t\tdestinationFile := filepath.Join(tmpDir, \"package.tar.gz\")\n\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\"buildpack\", \"package\", destinationFile,\n\t\t\t\t\t\t\t\"--format\", \"file\",\n\t\t\t\t\t\t\t\"-c\", packageTomlPath,\n\t\t\t\t\t\t)\n\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\tassertOutput.ReportsPackageCreation(destinationFile)\n\t\t\t\t\t\tassertOutput.ReportsInvalidExtension(\".gz\")\n\t\t\t\t\t\th.AssertTarball(t, destinationFile)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"package.toml is invalid\", func() {\n\t\t\t\tit(\"displays an error\", func() {\n\t\t\t\t\toutput, err := pack.Run(\n\t\t\t\t\t\t\"buildpack\", \"package\", \"some-package\",\n\t\t\t\t\t\t\"-c\", pack.FixtureManager().FixtureLocation(\"invalid_package.toml\"),\n\t\t\t\t\t)\n\n\t\t\t\t\tassert.NotNil(err)\n\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\tassertOutput.ReportsReadingConfig()\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspect\", func() {\n\t\t\tvar tmpDir string\n\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"buildpack-inspect-tests\")\n\t\t\t\tassert.Nil(err)\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\tassert.Succeeds(os.RemoveAll(tmpDir))\n\t\t\t})\n\n\t\t\twhen(\"buildpack archive\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\n\t\t\t\t\tpackageFileLocation := filepath.Join(\n\t\t\t\t\t\ttmpDir,\n\t\t\t\t\t\tfmt.Sprintf(\"buildpack-%s.cnb\", h.RandString(8)),\n\t\t\t\t\t)\n\n\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, \"package_for_build_cmd.toml\", imageManager.HostOS())\n\n\t\t\t\t\tpackageFile := 
buildpacks.NewPackageFile(\n\t\t\t\t\t\tt,\n\t\t\t\t\t\tpack,\n\t\t\t\t\t\tpackageFileLocation,\n\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayersParent,\n\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayers,\n\t\t\t\t\t\t),\n\t\t\t\t\t)\n\n\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, packageFile)\n\n\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateFixture(\n\t\t\t\t\t\t\"inspect_buildpack_output.txt\",\n\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\"buildpack_source\": \"LOCAL ARCHIVE\",\n\t\t\t\t\t\t\t\"buildpack_name\":   packageFileLocation,\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\n\t\t\t\t\toutput := pack.RunSuccessfully(\"buildpack\", \"inspect\", packageFileLocation)\n\t\t\t\t\tassert.TrimmedEq(output, expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"buildpack image\", func() {\n\t\t\t\twhen(\"inspect\", func() {\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, \"package_for_build_cmd.toml\", imageManager.HostOS())\n\t\t\t\t\t\tpackageImageName := registryConfig.RepoName(\"buildpack-\" + h.RandString(8))\n\n\t\t\t\t\t\tpackageImage := buildpacks.NewPackageImage(\n\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\tpackageImageName,\n\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayersParent,\n\t\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayers,\n\t\t\t\t\t\t\t),\n\t\t\t\t\t\t)\n\t\t\t\t\t\tdefer imageManager.CleanupImages(packageImageName)\n\n\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, packageImage)\n\n\t\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateFixture(\n\t\t\t\t\t\t\t\"inspect_buildpack_output.txt\",\n\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\"buildpack_source\": \"LOCAL IMAGE\",\n\t\t\t\t\t\t\t\t\"buildpack_name\":   
packageImageName,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\toutput := pack.RunSuccessfully(\"buildpack\", \"inspect\", packageImageName)\n\t\t\t\t\t\tassert.TrimmedEq(output, expectedOutput)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"builder\", func() {\n\t\twhen(\"suggest\", func() {\n\t\t\tit(\"displays suggested builders\", func() {\n\t\t\t\toutput := pack.RunSuccessfully(\"builder\", \"suggest\")\n\n\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\tassertOutput.IncludesSuggestedBuildersHeading()\n\t\t\t\tassertOutput.IncludesPrefixedGoogleBuilder()\n\t\t\t\tassertOutput.IncludesPrefixedHerokuBuilders()\n\t\t\t\tassertOutput.IncludesPrefixedPaketoBuilders()\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"config\", func() {\n\t\twhen(\"default-builder\", func() {\n\t\t\tit(\"sets the default builder in ~/.pack/config.toml\", func() {\n\t\t\t\tbuilderName := \"paketobuildpacks/builder-jammy-base\"\n\t\t\t\toutput := pack.RunSuccessfully(\"config\", \"default-builder\", builderName)\n\n\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSettingDefaultBuilder(builderName)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"trusted-builders\", func() {\n\t\t\tit(\"prints list of trusted builders\", func() {\n\t\t\t\toutput := pack.RunSuccessfully(\"config\", \"trusted-builders\")\n\n\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\tassertOutput.IncludesTrustedBuildersHeading()\n\t\t\t\tassertOutput.IncludesHerokuBuilders()\n\t\t\t\tassertOutput.IncludesGoogleBuilder()\n\t\t\t\tassertOutput.IncludesPaketoBuilders()\n\t\t\t})\n\n\t\t\twhen(\"add\", func() {\n\t\t\t\tit(\"sets the builder as trusted in ~/.pack/config.toml\", func() {\n\t\t\t\t\tbuilderName := \"some-builder\" + h.RandString(10)\n\n\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"trusted-builders\", \"add\", builderName)\n\t\t\t\t\tassert.Contains(pack.ConfigFileContents(), builderName)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"remove\", 
func() {\n\t\t\t\tit(\"removes the previously trusted builder from ~/${PACK_HOME}/config.toml\", func() {\n\t\t\t\t\tbuilderName := \"some-builder\" + h.RandString(10)\n\n\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"trusted-builders\", \"add\", builderName)\n\n\t\t\t\t\tassert.Contains(pack.ConfigFileContents(), builderName)\n\n\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"trusted-builders\", \"remove\", builderName)\n\n\t\t\t\t\tassert.NotContains(pack.ConfigFileContents(), builderName)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"list\", func() {\n\t\t\t\tit(\"prints list of trusted builders\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\"config\", \"trusted-builders\", \"list\")\n\n\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\tassertOutput.IncludesTrustedBuildersHeading()\n\t\t\t\t\tassertOutput.IncludesHerokuBuilders()\n\t\t\t\t\tassertOutput.IncludesGoogleBuilder()\n\t\t\t\t\tassertOutput.IncludesPaketoBuilders()\n\t\t\t\t})\n\n\t\t\t\tit(\"shows a builder trusted by pack config trusted-builders add\", func() {\n\t\t\t\t\tbuilderName := \"some-builder\" + h.RandString(10)\n\n\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"trusted-builders\", \"add\", builderName)\n\n\t\t\t\t\toutput := pack.RunSuccessfully(\"config\", \"trusted-builders\", \"list\")\n\t\t\t\t\tassert.Contains(output, builderName)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"stack\", func() {\n\t\twhen(\"suggest\", func() {\n\t\t\tit(\"displays suggested stacks\", func() {\n\t\t\t\toutput, err := pack.Run(\"stack\", \"suggest\")\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassertions.NewOutputAssertionManager(t, output).IncludesSuggestedStacksHeading()\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"report\", func() {\n\t\twhen(\"default builder is set\", func() {\n\t\t\tit(\"redacts default builder\", func() {\n\t\t\t\tpack.RunSuccessfully(\"config\", \"default-builder\", \"paketobuildpacks/builder-jammy-base\")\n\n\t\t\t\toutput := 
pack.RunSuccessfully(\"report\")\n\t\t\t\tversion := pack.Version()\n\n\t\t\t\tlayoutRepoDir := filepath.Join(pack.Home(), \"layout-repo\")\n\t\t\t\tif runtime.GOOS == \"windows\" {\n\t\t\t\t\tlayoutRepoDir = strings.ReplaceAll(layoutRepoDir, `\\`, `\\\\`)\n\t\t\t\t}\n\n\t\t\t\texpectedOutput := pack.FixtureManager().TemplateFixture(\n\t\t\t\t\t\"report_output.txt\",\n\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\"DefaultBuilder\": \"[REDACTED]\",\n\t\t\t\t\t\t\"Version\":        version,\n\t\t\t\t\t\t\"OS\":             runtime.GOOS,\n\t\t\t\t\t\t\"Arch\":           runtime.GOARCH,\n\t\t\t\t\t\t\"LayoutRepoDir\":  layoutRepoDir,\n\t\t\t\t\t},\n\t\t\t\t)\n\t\t\t\tassert.Equal(output, expectedOutput)\n\t\t\t})\n\n\t\t\tit(\"explicit mode doesn't redact\", func() {\n\t\t\t\tpack.RunSuccessfully(\"config\", \"default-builder\", \"paketobuildpacks/builder-jammy-base\")\n\n\t\t\t\toutput := pack.RunSuccessfully(\"report\", \"--explicit\")\n\t\t\t\tversion := pack.Version()\n\n\t\t\t\tlayoutRepoDir := filepath.Join(pack.Home(), \"layout-repo\")\n\t\t\t\tif runtime.GOOS == \"windows\" {\n\t\t\t\t\tlayoutRepoDir = strings.ReplaceAll(layoutRepoDir, `\\`, `\\\\`)\n\t\t\t\t}\n\n\t\t\t\texpectedOutput := pack.FixtureManager().TemplateFixture(\n\t\t\t\t\t\"report_output.txt\",\n\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\"DefaultBuilder\": \"paketobuildpacks/builder-jammy-base\",\n\t\t\t\t\t\t\"Version\":        version,\n\t\t\t\t\t\t\"OS\":             runtime.GOOS,\n\t\t\t\t\t\t\"Arch\":           runtime.GOARCH,\n\t\t\t\t\t\t\"LayoutRepoDir\":  layoutRepoDir,\n\t\t\t\t\t},\n\t\t\t\t)\n\t\t\t\tassert.Equal(output, expectedOutput)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"manifest\", func() {\n\t\tvar (\n\t\t\tindexRepoName  string\n\t\t\trepoName1      string\n\t\t\trepoName2      string\n\t\t\tindexLocalPath string\n\t\t\ttmpDir         string\n\t\t\terr            error\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.ManifestCommands), 
\"pack manifest commands are available since 0.34.0\")\n\n\t\t\t// local storage path\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"manifest-commands-test\")\n\t\t\tassert.Nil(err)\n\t\t\tos.Setenv(\"XDG_RUNTIME_DIR\", tmpDir)\n\n\t\t\t// manifest commands are experimental\n\t\t\tpack.EnableExperimental()\n\n\t\t\t// used to avoid authentication issues with the local registry\n\t\t\tos.Setenv(\"DOCKER_CONFIG\", registryConfig.DockerConfigDir)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\tassert.Succeeds(os.RemoveAll(tmpDir))\n\t\t})\n\n\t\twhen(\"create\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = registryConfig.RepoName(h.NewRandomIndexRepoName())\n\n\t\t\t\t// Manifest 1\n\t\t\t\trepoName1 = fmt.Sprintf(\"%s:%s\", indexRepoName, \"busybox-amd64\")\n\t\t\t\th.CreateRemoteImage(t, indexRepoName, \"busybox-amd64\", \"busybox@sha256:a236a6469768c17ca1a6ac81a35fe6fbc1efd76b0dcdf5aebb1cf5f0774ee539\")\n\n\t\t\t\t// Manifest 2\n\t\t\t\trepoName2 = fmt.Sprintf(\"%s:%s\", indexRepoName, \"busybox-arm64\")\n\t\t\t\th.CreateRemoteImage(t, indexRepoName, \"busybox-arm64\", \"busybox@sha256:0bcc1b827b855c65eaf6e031e894e682b6170160b8a676e1df7527a19d51fb1a\")\n\t\t\t})\n\n\t\t\twhen(\"--publish\", func() {\n\t\t\t\tit(\"creates and pushes the index to a remote registry\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\"manifest\", \"create\", \"--publish\", indexRepoName, repoName1, repoName2)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(indexRepoName)\n\t\t\t\t\th.AssertRemoteImageIndex(t, indexRepoName, types.OCIImageIndex, 2)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"no --publish\", func() {\n\t\t\t\tit(\"creates the index locally\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\"manifest\", \"create\", indexRepoName, repoName1, repoName2)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, 
output).ReportsSuccessfulIndexLocallyCreated(indexRepoName)\n\n\t\t\t\t\tindexLocalPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\t\t\t\t\tindex := h.ReadIndexManifest(t, indexLocalPath)\n\t\t\t\t\th.AssertEq(t, len(index.Manifests), 2)\n\t\t\t\t\th.AssertEq(t, index.MediaType, types.OCIImageIndex)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index is already created\", func() {\n\t\t\tvar digest v1.Hash\n\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = registryConfig.RepoName(h.NewRandomIndexRepoName())\n\n\t\t\t\t// Manifest 1\n\t\t\t\trepoName1 = fmt.Sprintf(\"%s:%s\", indexRepoName, \"busybox-amd64\")\n\t\t\t\timage1 := h.CreateRemoteImage(t, indexRepoName, \"busybox-amd64\", \"busybox@sha256:a236a6469768c17ca1a6ac81a35fe6fbc1efd76b0dcdf5aebb1cf5f0774ee539\")\n\t\t\t\tdigest, err = image1.Digest()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\t// Manifest 2\n\t\t\t\trepoName2 = fmt.Sprintf(\"%s:%s\", indexRepoName, \"busybox-arm64\")\n\t\t\t\th.CreateRemoteImage(t, indexRepoName, \"busybox-arm64\", \"busybox@sha256:0bcc1b827b855c65eaf6e031e894e682b6170160b8a676e1df7527a19d51fb1a\")\n\n\t\t\t\t// create an index locally\n\t\t\t\tpack.RunSuccessfully(\"manifest\", \"create\", indexRepoName, repoName1)\n\t\t\t\tindexLocalPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\t\t\t})\n\n\t\t\twhen(\"add\", func() {\n\t\t\t\tit(\"adds the manifest to the index\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\"manifest\", \"add\", indexRepoName, repoName2)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulManifestAddedToIndex(repoName2)\n\n\t\t\t\t\tindex := h.ReadIndexManifest(t, indexLocalPath)\n\t\t\t\t\th.AssertEq(t, len(index.Manifests), 2)\n\t\t\t\t\th.AssertEq(t, index.MediaType, types.OCIImageIndex)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"remove\", func() {\n\t\t\t\tit(\"removes the index from local storage\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\"manifest\", \"remove\", 
indexRepoName)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexDeleted()\n\n\t\t\t\t\th.AssertPathDoesNotExists(t, indexLocalPath)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"annotate\", func() {\n\t\t\t\tit(\"adds annotations to the manifest in the index\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\"manifest\", \"annotate\", indexRepoName, repoName1, \"--annotations\", \"foo=bar\")\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexAnnotated(repoName1, indexRepoName)\n\n\t\t\t\t\tindex := h.ReadIndexManifest(t, indexLocalPath)\n\t\t\t\t\th.AssertEq(t, len(index.Manifests), 1)\n\t\t\t\t\th.AssertEq(t, len(index.Manifests[0].Annotations), 1)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"rm\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\t// we need to point to the manifest digest we want to delete\n\t\t\t\t\trepoName1 = fmt.Sprintf(\"%s@%s\", repoName1, digest.String())\n\t\t\t\t})\n\n\t\t\t\tit(\"removes the manifest from the index\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\"manifest\", \"rm\", indexRepoName, repoName1)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulRemoveManifestFromIndex(indexRepoName)\n\n\t\t\t\t\tindex := h.ReadIndexManifest(t, indexLocalPath)\n\t\t\t\t\th.AssertEq(t, len(index.Manifests), 0)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"push\", func() {\n\t\t\t\tit(\"pushes the index to a remote registry\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\"manifest\", \"push\", indexRepoName)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(indexRepoName)\n\t\t\t\t\th.AssertRemoteImageIndex(t, indexRepoName, types.OCIImageIndex, 1)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc testAcceptance(\n\tt *testing.T,\n\twhen spec.G,\n\tit spec.S,\n\tsubjectPackConfig, createBuilderPackConfig config.PackAsset,\n\tlifecycle config.LifecycleAsset,\n) {\n\tvar (\n\t\tpack, createBuilderPack 
*invoke.PackInvoker\n\t\tbuildpackManager        buildpacks.BuildModuleManager\n\t\tbpDir                   = buildModulesDir()\n\t\tassert                  = h.NewAssertionManager(t)\n\t)\n\n\tit.Before(func() {\n\t\tpack = invoke.NewPackInvoker(t, assert, subjectPackConfig, registryConfig.DockerConfigDir)\n\t\tpack.EnableExperimental()\n\n\t\tcreateBuilderPack = invoke.NewPackInvoker(t, assert, createBuilderPackConfig, registryConfig.DockerConfigDir)\n\t\tcreateBuilderPack.EnableExperimental()\n\n\t\tbuildpackManager = buildpacks.NewBuildModuleManager(\n\t\t\tt,\n\t\t\tassert,\n\t\t)\n\t})\n\n\tit.After(func() {\n\t\tpack.Cleanup()\n\t\tcreateBuilderPack.Cleanup()\n\t})\n\n\twhen(\"stack is created\", func() {\n\t\tvar (\n\t\t\trunImageMirror  string\n\t\t\tstackBaseImages = map[string][]string{\n\t\t\t\t\"linux\":   {\"ubuntu:bionic\"},\n\t\t\t\t\"windows\": {\"mcr.microsoft.com/windows/nanoserver:1809\", \"golang:1.17-nanoserver-1809\"},\n\t\t\t}\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvalue, err := suiteManager.RunTaskOnceString(\"create-stack\",\n\t\t\t\tfunc() (string, error) {\n\t\t\t\t\trunImageMirror := registryConfig.RepoName(runImage)\n\t\t\t\t\terr := createStack(t, dockerCli, runImageMirror)\n\t\t\t\t\tif err != nil {\n\t\t\t\t\t\treturn \"\", err\n\t\t\t\t\t}\n\n\t\t\t\t\treturn runImageMirror, nil\n\t\t\t\t})\n\t\t\tassert.Nil(err)\n\n\t\t\tbaseStackNames := stackBaseImages[imageManager.HostOS()]\n\t\t\tsuiteManager.RegisterCleanUp(\"remove-stack-images\", func() error {\n\t\t\t\timageManager.CleanupImages(baseStackNames...)\n\t\t\t\timageManager.CleanupImages(runImage, buildImage, value)\n\t\t\t\treturn nil\n\t\t\t})\n\n\t\t\trunImageMirror = value\n\t\t})\n\n\t\twhen(\"builder is created\", func() {\n\t\t\tvar builderName string\n\n\t\t\tit.Before(func() {\n\t\t\t\tkey := taskKey(\n\t\t\t\t\t\"create-builder\",\n\t\t\t\t\tappend(\n\t\t\t\t\t\t[]string{runImageMirror, createBuilderPackConfig.Path(), 
lifecycle.Identifier()},\n\t\t\t\t\t\tcreateBuilderPackConfig.FixturePaths()...,\n\t\t\t\t\t)...,\n\t\t\t\t)\n\t\t\t\tvalue, err := suiteManager.RunTaskOnceString(key, func() (string, error) {\n\t\t\t\t\treturn createBuilder(t, assert, createBuilderPack, lifecycle, buildpackManager, runImageMirror)\n\t\t\t\t})\n\t\t\t\tassert.Nil(err)\n\t\t\t\tsuiteManager.RegisterCleanUp(\"clean-\"+key, func() error {\n\t\t\t\t\timageManager.CleanupImages(value)\n\t\t\t\t\treturn nil\n\t\t\t\t})\n\n\t\t\t\tbuilderName = value\n\t\t\t})\n\n\t\t\twhen(\"complex builder\", func() {\n\t\t\t\twhen(\"builder has duplicate buildpacks\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// create our nested builder\n\t\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"These tests are not yet compatible with Windows-based containers\")\n\n\t\t\t\t\t\t// Create a task key; the suite manager runs the task at most once per key,\n\t\t\t\t\t\t// de-duplicating builder creation across tests.\n\t\t\t\t\t\tkey := taskKey(\n\t\t\t\t\t\t\t\"create-complex-builder\",\n\t\t\t\t\t\t\tappend(\n\t\t\t\t\t\t\t\t[]string{runImageMirror, createBuilderPackConfig.Path(), lifecycle.Identifier()},\n\t\t\t\t\t\t\t\tcreateBuilderPackConfig.FixturePaths()...,\n\t\t\t\t\t\t\t)...,\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tvalue, err := suiteManager.RunTaskOnceString(key, func() (string, error) {\n\t\t\t\t\t\t\treturn createComplexBuilder(\n\t\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\t\tassert,\n\t\t\t\t\t\t\t\tcreateBuilderPack,\n\t\t\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\t\t\tbuildpackManager,\n\t\t\t\t\t\t\t\trunImageMirror,\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t// register a cleanup task to remove the builder image\n\t\t\t\t\t\tsuiteManager.RegisterCleanUp(\"clean-\"+key, func() error {\n\t\t\t\t\t\t\timageManager.CleanupImages(value)\n\t\t\t\t\t\t\treturn nil\n\t\t\t\t\t\t})\n\t\t\t\t\t\tbuilderName = value\n\n\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\"config\", 
\"run-image-mirrors\", \"add\", \"pack-test/run\", \"--mirror\", \"some-registry.com/pack-test/run1\")\n\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\tassertOutput.ReportsSuccesfulRunImageMirrorsAdd(\"pack-test/run\", \"some-registry.com/pack-test/run1\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"buildpack layers have no duplication\", func() {\n\t\t\t\t\t\tassertImage.DoesNotHaveDuplicateLayers(builderName)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"builder has extensions\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.SkipIf(t, !createBuilderPack.SupportsFeature(invoke.BuildImageExtensions), \"\")\n\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.BuildImageExtensions), \"\")\n\t\t\t\t\t\th.SkipIf(t, !lifecycle.SupportsFeature(config.BuildImageExtensions), \"\")\n\t\t\t\t\t\t// Create a task key; the suite manager runs the task at most once per key,\n\t\t\t\t\t\t// de-duplicating builder creation across tests.\n\t\t\t\t\t\tkey := taskKey(\n\t\t\t\t\t\t\t\"create-builder-with-extensions\",\n\t\t\t\t\t\t\tappend(\n\t\t\t\t\t\t\t\t[]string{runImageMirror, createBuilderPackConfig.Path(), lifecycle.Identifier()},\n\t\t\t\t\t\t\t\tcreateBuilderPackConfig.FixturePaths()...,\n\t\t\t\t\t\t\t)...,\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tvalue, err := suiteManager.RunTaskOnceString(key, func() (string, error) {\n\t\t\t\t\t\t\treturn createBuilderWithExtensions(\n\t\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\t\tassert,\n\t\t\t\t\t\t\t\tcreateBuilderPack,\n\t\t\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\t\t\tbuildpackManager,\n\t\t\t\t\t\t\t\trunImageMirror,\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t// register a cleanup task to remove the builder image\n\t\t\t\t\t\tsuiteManager.RegisterCleanUp(\"clean-\"+key, func() error {\n\t\t\t\t\t\t\timageManager.CleanupImages(value)\n\t\t\t\t\t\t\treturn nil\n\t\t\t\t\t\t})\n\t\t\t\t\t\tbuilderName = value\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates builder\", func() {\n\t\t\t\t\t\tif 
imageManager.HostOS() != \"windows\" {\n\t\t\t\t\t\t\t// Linux containers (including Linux containers on Windows)\n\t\t\t\t\t\t\textSimpleLayersDiffID := \"sha256:d24758b8b75b13292746fe7a06666f28a9499da31826a60afe6ee6b8cba29b73\"\n\t\t\t\t\t\t\textReadEnvDiffID := \"sha256:43072b16e96564a4dd6bd2e74c55c3c94af78cf99d869cab1e62c873e1fa6780\"\n\t\t\t\t\t\t\tbpSimpleLayersDiffID := \"sha256:ade9da86859fa4ea50a513757f9b242bf1038667abf92dad3d018974a17f0ea7\"\n\t\t\t\t\t\t\tbpReadEnvDiffID := \"sha256:db0797077ba8deff7054ab5578133b8f0206b6393de34b5bfd795cf50f6afdbd\"\n\t\t\t\t\t\t\t// extensions\n\t\t\t\t\t\t\tassertImage.HasLabelContaining(builderName, \"io.buildpacks.extension.layers\", `{\"read/env\":{\"read-env-version\":{\"api\":\"0.9\",\"layerDiffID\":\"`+extReadEnvDiffID+`\",\"name\":\"Read Env Extension\"}},\"simple/layers\":{\"simple-layers-version\":{\"api\":\"0.7\",\"layerDiffID\":\"`+extSimpleLayersDiffID+`\",\"name\":\"Simple Layers Extension\"}}}`)\n\t\t\t\t\t\t\tassertImage.HasLabelContaining(builderName, \"io.buildpacks.buildpack.order-extensions\", `[{\"group\":[{\"id\":\"read/env\",\"version\":\"read-env-version\"},{\"id\":\"simple/layers\",\"version\":\"simple-layers-version\"}]}]`)\n\t\t\t\t\t\t\t// buildpacks\n\t\t\t\t\t\t\tassertImage.HasLabelContaining(builderName, \"io.buildpacks.buildpack.layers\", `{\"read/env\":{\"read-env-version\":{\"api\":\"0.7\",\"stacks\":[{\"id\":\"pack.test.stack\"}],\"layerDiffID\":\"`+bpReadEnvDiffID+`\",\"name\":\"Read Env Buildpack\"}},\"simple/layers\":{\"simple-layers-version\":{\"api\":\"0.7\",\"stacks\":[{\"id\":\"pack.test.stack\"}],\"layerDiffID\":\"`+bpSimpleLayersDiffID+`\",\"name\":\"Simple Layers Buildpack\"}}}`)\n\t\t\t\t\t\t\tassertImage.HasLabelContaining(builderName, \"io.buildpacks.buildpack.order\", 
`[{\"group\":[{\"id\":\"read/env\",\"version\":\"read-env-version\",\"optional\":true},{\"id\":\"simple/layers\",\"version\":\"simple-layers-version\",\"optional\":true}]}]`)\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"build\", func() {\n\t\t\t\t\t\tvar repo, repoName string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"\")\n\n\t\t\t\t\t\t\trepo = \"some-org/\" + h.RandString(10)\n\t\t\t\t\t\t\trepoName = registryConfig.RepoName(repo)\n\t\t\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"lifecycle-image\", lifecycle.Image())\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"\")\n\n\t\t\t\t\t\t\timageManager.CleanupImages(repoName)\n\t\t\t\t\t\t\tref, err := name.ParseReference(repoName, name.WeakValidation)\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tcacheImage := cache.NewImageCache(ref, dockerCli)\n\t\t\t\t\t\t\tlogger := logging.NewSimpleLogger(&bytes.Buffer{})\n\t\t\t\t\t\t\tbuildCacheVolume, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"build\", dockerCli, logger)\n\t\t\t\t\t\t\tlaunchCacheVolume, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"launch\", dockerCli, logger)\n\t\t\t\t\t\t\tcacheImage.Clear(context.TODO())\n\t\t\t\t\t\t\tbuildCacheVolume.Clear(context.TODO())\n\t\t\t\t\t\t\tlaunchCacheVolume.Clear(context.TODO())\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"there are build image extensions\", func() {\n\t\t\t\t\t\t\tit(\"uses the 5 phases, and runs the extender (build)\", func() {\n\t\t\t\t\t\t\t\torigLifecycle := lifecycle.Image()\n\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--network\", \"host\", // export target is the daemon, but we need to be able to reach the registry where the builder image is saved\n\t\t\t\t\t\t\t\t\t\"-B\", 
builderName,\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\tassertOutput := assertions.NewLifecycleOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertOutput.IncludesTagOrEphemeralLifecycle(origLifecycle)\n\t\t\t\t\t\t\t\tassertOutput.IncludesSeparatePhasesWithBuildExtension()\n\n\t\t\t\t\t\t\t\tt.Log(\"inspecting image\")\n\t\t\t\t\t\t\t\tinspectCmd := \"inspect\"\n\t\t\t\t\t\t\t\tif !pack.Supports(\"inspect\") {\n\t\t\t\t\t\t\t\t\tinspectCmd = \"inspect-image\"\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(inspectCmd, repoName)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"there are run image extensions\", func() {\n\t\t\t\t\t\t\twhen(\"switching the run image\", func() {\n\t\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.RunImageExtensions), \"\")\n\t\t\t\t\t\t\t\t\th.SkipIf(t, !lifecycle.SupportsFeature(config.RunImageExtensions), \"\")\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\tit(\"uses the 5 phases, and tries to pull the new run image by name before restore, and by identifier after restore\", func() {\n\t\t\t\t\t\t\t\t\toutput, _ := pack.Run(\n\t\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\t\"--network\", \"host\",\n\t\t\t\t\t\t\t\t\t\t\"-B\", builderName,\n\t\t\t\t\t\t\t\t\t\t\"--env\", \"EXT_RUN_SWITCH=1\",\n\t\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\t\th.AssertContains(t, output, \"Pulling image 'busybox:latest'\")\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"extending the run image\", func() {\n\t\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.RunImageExtensions), \"\")\n\t\t\t\t\t\t\t\t\th.SkipIf(t, !lifecycle.SupportsFeature(config.RunImageExtensions), \"\")\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\tit(\"uses the 5 phases, and runs the extender 
(run)\", func() {\n\t\t\t\t\t\t\t\t\torigLifecycle := lifecycle.Image()\n\n\t\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\t\"--network\", \"host\", // export target is the daemon, but we need to be able to reach the registry where the builder image and run image are saved\n\t\t\t\t\t\t\t\t\t\t\"-B\", builderName,\n\t\t\t\t\t\t\t\t\t\t\"--env\", \"EXT_RUN=1\",\n\t\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\t\tassertOutput := assertions.NewLifecycleOutputAssertionManager(t, output)\n\n\t\t\t\t\t\t\t\t\tassertOutput.IncludesTagOrEphemeralLifecycle(origLifecycle)\n\t\t\t\t\t\t\t\t\tassertOutput.IncludesSeparatePhasesWithRunExtension()\n\n\t\t\t\t\t\t\t\t\tt.Log(\"inspecting image\")\n\t\t\t\t\t\t\t\t\tinspectCmd := \"inspect\"\n\t\t\t\t\t\t\t\t\tif !pack.Supports(\"inspect\") {\n\t\t\t\t\t\t\t\t\t\tinspectCmd = \"inspect-image\"\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(inspectCmd, repoName)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"builder.toml is invalid\", func() {\n\t\t\t\tit(\"displays an error\", func() {\n\t\t\t\t\tbuilderConfigPath := createBuilderPack.FixtureManager().FixtureLocation(\"invalid_builder.toml\")\n\n\t\t\t\t\toutput, err := createBuilderPack.Run(\n\t\t\t\t\t\t\"builder\", \"create\", \"some-builder:build\",\n\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t)\n\n\t\t\t\t\tassert.NotNil(err)\n\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\tassertOutput.ReportsInvalidBuilderToml()\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"system buildpacks\", func() {\n\t\t\t\tvar (\n\t\t\t\t\tbuilderWithSystemBP                string\n\t\t\t\t\tbuilderWithFailingSystemBP         
string\n\t\t\t\t\tbuilderWithOptionalFailingSystemBP string\n\t\t\t\t\tregularBuilder                     string\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\t// Create builder with system buildpacks\n\t\t\t\t\tbuilderWithSystemBP = fmt.Sprintf(\"pack.local/builder-with-system-bps/%s\", h.RandString(10))\n\t\t\t\t\th.SkipIf(t, !createBuilderPack.Supports(\"builder create\"), \"pack builder create not supported\")\n\n\t\t\t\t\tcreateBuilderPack.JustRunSuccessfully(\n\t\t\t\t\t\t\"builder\", \"create\", builderWithSystemBP,\n\t\t\t\t\t\t\"--config\", createBuilderPack.FixtureManager().FixtureLocation(\"builder_with_system_buildpacks.toml\"),\n\t\t\t\t\t)\n\n\t\t\t\t\t// Create builder with failing system buildpack\n\t\t\t\t\tbuilderWithFailingSystemBP = fmt.Sprintf(\"pack.local/builder-fail-system/%s\", h.RandString(10))\n\t\t\t\t\tcreateBuilderPack.JustRunSuccessfully(\n\t\t\t\t\t\t\"builder\", \"create\", builderWithFailingSystemBP,\n\t\t\t\t\t\t\"--config\", createBuilderPack.FixtureManager().FixtureLocation(\"builder_with_failing_system_buildpack.toml\"),\n\t\t\t\t\t)\n\n\t\t\t\t\t// Create builder with optional failing system buildpack\n\t\t\t\t\tbuilderWithOptionalFailingSystemBP = fmt.Sprintf(\"pack.local/builder-optional-fail/%s\", h.RandString(10))\n\t\t\t\t\tcreateBuilderPack.JustRunSuccessfully(\n\t\t\t\t\t\t\"builder\", \"create\", builderWithOptionalFailingSystemBP,\n\t\t\t\t\t\t\"--config\", createBuilderPack.FixtureManager().FixtureLocation(\"builder_with_optional_failing_system_buildpack.toml\"),\n\t\t\t\t\t)\n\n\t\t\t\t\t// Create regular builder for comparison\n\t\t\t\t\tregularBuilder = fmt.Sprintf(\"pack.local/regular-builder/%s\", h.RandString(10))\n\t\t\t\t\tcreateBuilderPack.JustRunSuccessfully(\n\t\t\t\t\t\t\"builder\", \"create\", regularBuilder,\n\t\t\t\t\t\t\"--config\", createBuilderPack.FixtureManager().FixtureLocation(\"builder.toml\"),\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit.After(func() 
{\n\t\t\t\t\timageManager.CleanupImages(builderWithSystemBP)\n\t\t\t\t\timageManager.CleanupImages(builderWithFailingSystemBP)\n\t\t\t\t\timageManager.CleanupImages(builderWithOptionalFailingSystemBP)\n\t\t\t\t\timageManager.CleanupImages(regularBuilder)\n\t\t\t\t})\n\n\t\t\t\twhen(\"inspecting builder with system buildpacks\", func() {\n\t\t\t\t\tit(\"shows system buildpacks in builder info\", func() {\n\t\t\t\t\t\toutput := createBuilderPack.RunSuccessfully(\"builder\", \"inspect\", builderWithSystemBP)\n\n\t\t\t\t\t\t// Verify system buildpacks are shown in the output\n\t\t\t\t\t\th.AssertContains(t, output, \"system/pre\")\n\t\t\t\t\t\th.AssertContains(t, output, \"system/post\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"building with system buildpacks\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\tappImage string\n\t\t\t\t\t\tappPath  string\n\t\t\t\t\t)\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tappPath = filepath.Join(\"testdata\", \"mock_app\")\n\t\t\t\t\t\tappImage = fmt.Sprintf(\"pack.local/app/%s\", h.RandString(10))\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\timageManager.CleanupImages(appImage)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"system buildpacks are enabled (default)\", func() {\n\t\t\t\t\t\tit(\"runs pre-system buildpacks before regular buildpacks\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", appImage,\n\t\t\t\t\t\t\t\t\"--path\", appPath,\n\t\t\t\t\t\t\t\t\"--builder\", builderWithSystemBP,\n\t\t\t\t\t\t\t\t\"--no-color\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t// Verify pre-system buildpack ran before the main buildpack\n\t\t\t\t\t\t\th.AssertContains(t, output, \"DETECT: System Pre buildpack\")\n\t\t\t\t\t\t\th.AssertContains(t, output, \"BUILD: System Pre buildpack\")\n\t\t\t\t\t\t\th.AssertContains(t, output, \"Simple Layers Buildpack\")\n\n\t\t\t\t\t\t\t// Verify order: system pre should come before main buildpack\n\t\t\t\t\t\t\tsystemPreIndex := strings.Index(output, \"BUILD: System Pre 
buildpack\")\n\t\t\t\t\t\t\tmainBuildpackIndex := strings.Index(output, \"Simple Layers Buildpack\")\n\t\t\t\t\t\t\tif systemPreIndex == -1 || mainBuildpackIndex == -1 || systemPreIndex >= mainBuildpackIndex {\n\t\t\t\t\t\t\t\tt.Fatalf(\"Expected system pre buildpack to run before main buildpack\")\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"runs post-system buildpacks after regular buildpacks\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", appImage,\n\t\t\t\t\t\t\t\t\"--path\", appPath,\n\t\t\t\t\t\t\t\t\"--builder\", builderWithSystemBP,\n\t\t\t\t\t\t\t\t\"--no-color\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t// Verify post-system buildpack ran after the main buildpack\n\t\t\t\t\t\t\th.AssertContains(t, output, \"BUILD: System Post buildpack\")\n\n\t\t\t\t\t\t\t// Verify order: system post should come after main buildpack\n\t\t\t\t\t\t\tmainBuildpackIndex := strings.Index(output, \"Simple Layers Buildpack\")\n\t\t\t\t\t\t\tsystemPostIndex := strings.Index(output, \"BUILD: System Post buildpack\")\n\t\t\t\t\t\t\tif mainBuildpackIndex == -1 || systemPostIndex == -1 || mainBuildpackIndex >= systemPostIndex {\n\t\t\t\t\t\t\t\tt.Fatalf(\"Expected system post buildpack to run after main buildpack\")\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"builds successfully with system buildpacks\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", appImage,\n\t\t\t\t\t\t\t\t\"--path\", appPath,\n\t\t\t\t\t\t\t\t\"--builder\", builderWithSystemBP,\n\t\t\t\t\t\t\t\t\"--verbose\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t// Verify system buildpack contributed during build\n\t\t\t\t\t\t\th.AssertContains(t, output, \"BUILD: System Pre buildpack\")\n\t\t\t\t\t\t\th.AssertContains(t, output, \"BUILD: System Post buildpack\")\n\n\t\t\t\t\t\t\t// Verify the image was successfully built\n\t\t\t\t\t\t\th.AssertContains(t, output, \"Successfully built 
image\")\n\t\t\t\t\t\t\tassertImage.ExistsLocally(appImage)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--disable-system-buildpacks flag is used\", func() {\n\t\t\t\t\t\tit(\"does not run system buildpacks\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", appImage,\n\t\t\t\t\t\t\t\t\"--path\", appPath,\n\t\t\t\t\t\t\t\t\"--builder\", builderWithSystemBP,\n\t\t\t\t\t\t\t\t\"--disable-system-buildpacks\",\n\t\t\t\t\t\t\t\t\"--no-color\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t// Verify system buildpacks did not run\n\t\t\t\t\t\t\th.AssertNotContains(t, output, \"DETECT: System Pre buildpack\")\n\t\t\t\t\t\t\th.AssertNotContains(t, output, \"BUILD: System Pre buildpack\")\n\t\t\t\t\t\t\th.AssertNotContains(t, output, \"BUILD: System Post buildpack\")\n\n\t\t\t\t\t\t\t// Verify main buildpack still runs\n\t\t\t\t\t\t\th.AssertContains(t, output, \"Simple Layers Buildpack\")\n\n\t\t\t\t\t\t\t// Verify the image was successfully built\n\t\t\t\t\t\t\th.AssertContains(t, output, \"Successfully built image\")\n\t\t\t\t\t\t\tassertImage.ExistsLocally(appImage)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"builder has no system buildpacks\", func() {\n\t\t\t\t\t\tit(\"builds normally without system buildpacks\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", appImage,\n\t\t\t\t\t\t\t\t\"--path\", appPath,\n\t\t\t\t\t\t\t\t\"--builder\", regularBuilder,\n\t\t\t\t\t\t\t\t\"--no-color\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t// Verify no system buildpacks ran\n\t\t\t\t\t\t\th.AssertNotContains(t, output, \"System Pre buildpack\")\n\t\t\t\t\t\t\th.AssertNotContains(t, output, \"System Post buildpack\")\n\n\t\t\t\t\t\t\t// Verify main buildpack runs\n\t\t\t\t\t\t\th.AssertContains(t, output, \"Simple Layers Buildpack\")\n\n\t\t\t\t\t\t\t// Verify the image was successfully built\n\t\t\t\t\t\t\th.AssertContains(t, output, \"Successfully built 
image\")\n\t\t\t\t\t\t\tassertImage.ExistsLocally(appImage)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"required system buildpack fails detection\", func() {\n\t\t\t\t\t\tit(\"fails the build\", func() {\n\t\t\t\t\t\t\toutput, err := pack.Run(\n\t\t\t\t\t\t\t\t\"build\", appImage,\n\t\t\t\t\t\t\t\t\"--path\", appPath,\n\t\t\t\t\t\t\t\t\"--builder\", builderWithFailingSystemBP,\n\t\t\t\t\t\t\t\t\"--no-color\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t// Build should fail\n\t\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t\t\th.AssertContains(t, output, \"DETECT: System Fail Detect buildpack (will fail)\")\n\t\t\t\t\t\t\th.AssertContains(t, output, \"No buildpack groups passed detection\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"optional system buildpack fails detection\", func() {\n\t\t\t\t\t\tit(\"continues with the build\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", appImage,\n\t\t\t\t\t\t\t\t\"--path\", appPath,\n\t\t\t\t\t\t\t\t\"--builder\", builderWithOptionalFailingSystemBP,\n\t\t\t\t\t\t\t\t\"--no-color\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t// Build should succeed despite optional system buildpack failing\n\t\t\t\t\t\t\th.AssertContains(t, output, \"DETECT: System Fail Detect buildpack (will fail)\")\n\t\t\t\t\t\t\th.AssertContains(t, output, \"DETECT: System Pre buildpack\")\n\t\t\t\t\t\t\th.AssertContains(t, output, \"BUILD: System Pre buildpack\")\n\t\t\t\t\t\t\th.AssertContains(t, output, \"Simple Layers Buildpack\")\n\n\t\t\t\t\t\t\t// Verify the failed optional buildpack didn't run build\n\t\t\t\t\t\t\th.AssertNotContains(t, output, \"BUILD: System Fail Detect buildpack\")\n\n\t\t\t\t\t\t\t// Verify the image was successfully built\n\t\t\t\t\t\t\th.AssertContains(t, output, \"Successfully built image\")\n\t\t\t\t\t\t\tassertImage.ExistsLocally(appImage)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"build\", func() {\n\t\t\t\tvar repo, repoName string\n\n\t\t\t\tit.Before(func() 
{\n\t\t\t\t\trepo = \"some-org/\" + h.RandString(10)\n\t\t\t\t\trepoName = registryConfig.RepoName(repo)\n\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"lifecycle-image\", lifecycle.Image())\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\timageManager.CleanupImages(repoName)\n\t\t\t\t\tref, err := name.ParseReference(repoName, name.WeakValidation)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tcacheImage := cache.NewImageCache(ref, dockerCli)\n\t\t\t\t\tlogger := logging.NewSimpleLogger(&bytes.Buffer{})\n\t\t\t\t\tbuildCacheVolume, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"build\", dockerCli, logger)\n\t\t\t\t\tlaunchCacheVolume, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"launch\", dockerCli, logger)\n\t\t\t\t\tcacheImage.Clear(context.TODO())\n\t\t\t\t\tbuildCacheVolume.Clear(context.TODO())\n\t\t\t\t\tlaunchCacheVolume.Clear(context.TODO())\n\t\t\t\t})\n\n\t\t\t\twhen(\"builder is untrusted\", func() {\n\t\t\t\t\tvar untrustedBuilderName string\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tvar err error\n\t\t\t\t\t\tuntrustedBuilderName, err = createBuilder(\n\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\tassert,\n\t\t\t\t\t\t\tcreateBuilderPack,\n\t\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\t\tbuildpackManager,\n\t\t\t\t\t\t\trunImageMirror,\n\t\t\t\t\t\t)\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\tsuiteManager.RegisterCleanUp(\"remove-lifecycle-\"+lifecycle.Image(), func() error {\n\t\t\t\t\t\t\t// Try to get image ID, but ignore errors if image doesn't exist\n\t\t\t\t\t\t\t// (e.g., if it was pulled by digest instead of tag)\n\t\t\t\t\t\t\tinspect, err := imageManager.InspectLocal(lifecycle.Image())\n\t\t\t\t\t\t\tif err != nil {\n\t\t\t\t\t\t\t\treturn nil\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\timageManager.CleanupImages(inspect.ID)\n\t\t\t\t\t\t\treturn nil\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\timageManager.CleanupImages(untrustedBuilderName)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"daemon\", func() {\n\t\t\t\t\t\tit(\"uses the 5 phases\", 
func() {\n\t\t\t\t\t\t\torigLifecycle := lifecycle.Image()\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"-B\", untrustedBuilderName,\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\tassertOutput := assertions.NewLifecycleOutputAssertionManager(t, output)\n\t\t\t\t\t\t\tassertOutput.IncludesTagOrEphemeralLifecycle(origLifecycle)\n\t\t\t\t\t\t\tassertOutput.IncludesSeparatePhases()\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--publish\", func() {\n\t\t\t\t\t\tit(\"uses the 5 phases\", func() {\n\t\t\t\t\t\t\torigLifecycle := lifecycle.Image()\n\n\t\t\t\t\t\t\tbuildArgs := []string{\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"-B\", untrustedBuilderName,\n\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tif imageManager.HostOS() != \"windows\" {\n\t\t\t\t\t\t\t\tbuildArgs = append(buildArgs, \"--network\", \"host\")\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"build\", buildArgs...)\n\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\tassertOutput := assertions.NewLifecycleOutputAssertionManager(t, output)\n\t\t\t\t\t\t\tassertOutput.IncludesTagOrEphemeralLifecycle(origLifecycle)\n\t\t\t\t\t\t\tassertOutput.IncludesSeparatePhases()\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"additional tags\", func() {\n\t\t\t\t\t\tvar additionalRepoName string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tadditionalRepoName = fmt.Sprintf(\"%s_additional\", repoName)\n\t\t\t\t\t\t})\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\timageManager.CleanupImages(additionalRepoName)\n\t\t\t\t\t\t})\n\t\t\t\t\t\tit(\"pushes image to additional tags\", func() {\n\t\t\t\t\t\t\toutput := 
pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"-B\", untrustedBuilderName,\n\t\t\t\t\t\t\t\t\"--tag\", additionalRepoName,\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t\t\tassert.Contains(output, additionalRepoName)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"builder is trusted (and set as default)\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tpack.RunSuccessfully(\"config\", \"default-builder\", builderName)\n\t\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"trusted-builders\", \"add\", builderName)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates a runnable, rebuildable image on daemon from app dir\", func() {\n\t\t\t\t\t\tappPath := filepath.Join(\"testdata\", \"mock_app\")\n\n\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\"-p\", appPath,\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\n\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t\tassertOutput.ReportsUsingBuildCacheVolume()\n\t\t\t\t\t\tassertOutput.ReportsSelectingRunImageMirror(runImageMirror)\n\n\t\t\t\t\t\tt.Log(\"app is runnable\")\n\t\t\t\t\t\tassertImage.RunsWithOutput(repoName, \"Launch Dep Contents\", \"Cached Dep Contents\")\n\n\t\t\t\t\t\tt.Log(\"it uses the run image as a base image\")\n\t\t\t\t\t\tassertImage.HasBaseImage(repoName, runImage)\n\n\t\t\t\t\t\tt.Log(\"sets the run image metadata\")\n\t\t\t\t\t\tassertImage.HasLabelContaining(repoName, \"io.buildpacks.lifecycle.metadata\", fmt.Sprintf(`\"image\":\"pack-test/run\",\"mirrors\":[\"%s\"]`, runImageMirror))\n\n\t\t\t\t\t\tt.Log(\"sets the source metadata\")\n\t\t\t\t\t\tassertImage.HasLabelContaining(repoName, \"io.buildpacks.project.metadata\", 
(`{\"source\":{\"type\":\"project\",\"version\":{\"declared\":\"1.0.2\"},\"metadata\":{\"url\":\"https://github.com/buildpacks/pack\"}}}`))\n\n\t\t\t\t\t\tt.Log(\"registry is empty\")\n\t\t\t\t\t\tassertImage.NotExistsInRegistry(repo)\n\n\t\t\t\t\t\tt.Log(\"add a local mirror\")\n\t\t\t\t\t\tlocalRunImageMirror := registryConfig.RepoName(\"pack-test/run-mirror\")\n\t\t\t\t\t\timageManager.TagImage(runImage, localRunImageMirror)\n\t\t\t\t\t\tdefer imageManager.CleanupImages(localRunImageMirror)\n\t\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"run-image-mirrors\", \"add\", runImage, \"-m\", localRunImageMirror)\n\n\t\t\t\t\t\tt.Log(\"rebuild\")\n\t\t\t\t\t\toutput = pack.RunSuccessfully(\n\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\"-p\", appPath,\n\t\t\t\t\t\t)\n\t\t\t\t\t\tassertOutput = assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t\tassertOutput.ReportsSelectingRunImageMirrorFromLocalConfig(localRunImageMirror)\n\t\t\t\t\t\tif pack.SupportsFeature(invoke.FixesRunImageMetadata) {\n\t\t\t\t\t\t\tt.Log(fmt.Sprintf(\"run-image mirror %s was NOT added into 'io.buildpacks.lifecycle.metadata' label\", localRunImageMirror))\n\t\t\t\t\t\t\tassertImage.HasLabelNotContaining(repoName, \"io.buildpacks.lifecycle.metadata\", fmt.Sprintf(`\"image\":\"%s\"`, localRunImageMirror))\n\t\t\t\t\t\t\tt.Log(fmt.Sprintf(\"run-image %s was added into 'io.buildpacks.lifecycle.metadata' label\", runImage))\n\t\t\t\t\t\t\tassertImage.HasLabelContaining(repoName, \"io.buildpacks.lifecycle.metadata\", fmt.Sprintf(`\"image\":\"%s\"`, runImage))\n\t\t\t\t\t\t}\n\t\t\t\t\t\tcachedLaunchLayer := \"simple/layers:cached-launch-layer\"\n\n\t\t\t\t\t\tassertLifecycleOutput := assertions.NewLifecycleOutputAssertionManager(t, 
output)\n\t\t\t\t\t\tassertLifecycleOutput.ReportsRestoresCachedLayer(cachedLaunchLayer)\n\t\t\t\t\t\tassertLifecycleOutput.ReportsExporterReusingUnchangedLayer(cachedLaunchLayer)\n\t\t\t\t\t\tassertLifecycleOutput.ReportsCacheReuse(cachedLaunchLayer)\n\n\t\t\t\t\t\tt.Log(\"app is runnable\")\n\t\t\t\t\t\tassertImage.RunsWithOutput(repoName, \"Launch Dep Contents\", \"Cached Dep Contents\")\n\n\t\t\t\t\t\tt.Log(\"rebuild with --clear-cache\")\n\t\t\t\t\t\toutput = pack.RunSuccessfully(\"build\", repoName, \"-p\", appPath, \"--clear-cache\")\n\n\t\t\t\t\t\tassertOutput = assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t\tassertLifecycleOutput = assertions.NewLifecycleOutputAssertionManager(t, output)\n\t\t\t\t\t\tassertLifecycleOutput.ReportsExporterReusingUnchangedLayer(cachedLaunchLayer)\n\t\t\t\t\t\tassertLifecycleOutput.ReportsCacheCreation(cachedLaunchLayer)\n\n\t\t\t\t\t\tt.Log(\"cacher adds layers\")\n\t\t\t\t\t\tassert.Matches(output, regexp.MustCompile(`(?i)Adding cache layer 'simple/layers:cached-launch-layer'`))\n\n\t\t\t\t\t\tt.Log(\"inspecting image\")\n\t\t\t\t\t\tinspectCmd := \"inspect\"\n\t\t\t\t\t\tif !pack.Supports(\"inspect\") {\n\t\t\t\t\t\t\tinspectCmd = \"inspect-image\"\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tvar (\n\t\t\t\t\t\t\twebCommand      string\n\t\t\t\t\t\t\thelloCommand    string\n\t\t\t\t\t\t\thelloArgs       []string\n\t\t\t\t\t\t\thelloArgsPrefix string\n\t\t\t\t\t\t\timageWorkdir    string\n\t\t\t\t\t\t)\n\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" {\n\t\t\t\t\t\t\twebCommand = \".\\\\run\"\n\t\t\t\t\t\t\thelloCommand = \"cmd\"\n\t\t\t\t\t\t\thelloArgs = []string{\"/c\", \"echo hello world\"}\n\t\t\t\t\t\t\thelloArgsPrefix = \" \"\n\t\t\t\t\t\t\timageWorkdir = \"c:\\\\workspace\"\n\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\twebCommand = \"./run\"\n\t\t\t\t\t\t\thelloCommand = \"echo\"\n\t\t\t\t\t\t\thelloArgs = []string{\"hello\", 
\"world\"}\n\t\t\t\t\t\t\thelloArgsPrefix = \"\"\n\t\t\t\t\t\t\timageWorkdir = \"/workspace\"\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tformats := []compareFormat{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\textension:   \"json\",\n\t\t\t\t\t\t\t\tcompareFunc: assert.EqualJSON,\n\t\t\t\t\t\t\t\toutputArg:   \"json\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\textension:   \"yaml\",\n\t\t\t\t\t\t\t\tcompareFunc: assert.EqualYAML,\n\t\t\t\t\t\t\t\toutputArg:   \"yaml\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\textension:   \"toml\",\n\t\t\t\t\t\t\t\tcompareFunc: assert.EqualTOML,\n\t\t\t\t\t\t\t\toutputArg:   \"toml\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\t\t\t\t\t\tfor _, format := range formats {\n\t\t\t\t\t\t\tt.Logf(\"inspecting image %s format\", format.outputArg)\n\n\t\t\t\t\t\t\toutput = pack.RunSuccessfully(inspectCmd, repoName, \"--output\", format.outputArg)\n\t\t\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateFixture(\n\t\t\t\t\t\t\t\tfmt.Sprintf(\"inspect_image_local_output.%s\", format.extension),\n\t\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\t\"image_name\":             repoName,\n\t\t\t\t\t\t\t\t\t\"base_image_id\":          h.ImageID(t, runImageMirror),\n\t\t\t\t\t\t\t\t\t\"base_image_top_layer\":   h.TopLayerDiffID(t, runImageMirror),\n\t\t\t\t\t\t\t\t\t\"run_image_local_mirror\": localRunImageMirror,\n\t\t\t\t\t\t\t\t\t\"run_image_mirror\":       runImageMirror,\n\t\t\t\t\t\t\t\t\t\"web_command\":            webCommand,\n\t\t\t\t\t\t\t\t\t\"hello_command\":          helloCommand,\n\t\t\t\t\t\t\t\t\t\"hello_args\":             helloArgs,\n\t\t\t\t\t\t\t\t\t\"hello_args_prefix\":      helloArgsPrefix,\n\t\t\t\t\t\t\t\t\t\"image_workdir\":          imageWorkdir,\n\t\t\t\t\t\t\t\t\t\"rebasable\":              true,\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tformat.compareFunc(output, expectedOutput)\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--no-color\", func() {\n\t\t\t\t\t\tit(\"doesn't have color\", func() 
{\n\t\t\t\t\t\t\tappPath := filepath.Join(\"testdata\", \"mock_app\")\n\n\t\t\t\t\t\t\t// --no-color is set as a default option in our tests, and doesn't need to be explicitly provided\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"build\", repoName, \"-p\", appPath)\n\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t\t\tassertOutput.WithoutColors()\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--quiet\", func() {\n\t\t\t\t\t\tit(\"only logs app name and sha\", func() {\n\t\t\t\t\t\t\tappPath := filepath.Join(\"testdata\", \"mock_app\")\n\n\t\t\t\t\t\t\tpack.SetVerbose(false)\n\t\t\t\t\t\t\tdefer pack.SetVerbose(true)\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"build\", repoName, \"-p\", appPath, \"--quiet\")\n\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\tassertOutput.ReportSuccessfulQuietBuild(repoName)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"supports building app from a zip file\", func() {\n\t\t\t\t\t\tappPath := filepath.Join(\"testdata\", \"mock_app.zip\")\n\t\t\t\t\t\toutput := pack.RunSuccessfully(\"build\", repoName, \"-p\", appPath)\n\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--network\", func() {\n\t\t\t\t\t\tvar tmpDir string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"temporarily disabled on WCOW due to CI flakiness\")\n\n\t\t\t\t\t\t\tvar err error\n\t\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"archive-buildpacks-\")\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, buildpacks.BpInternetCapable)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"temporarily disabled on WCOW due to CI 
flakiness\")\n\t\t\t\t\t\t\tassert.Succeeds(os.RemoveAll(tmpDir))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the network mode is not provided\", func() {\n\t\t\t\t\t\t\tit(\"reports buildpack access to internet\", func() {\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.BpInternetCapable.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsConnectedToInternet()\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the network mode is set to default\", func() {\n\t\t\t\t\t\t\tit(\"reports buildpack access to internet\", func() {\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.BpInternetCapable.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t\t\"--network\", \"default\",\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsConnectedToInternet()\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the network mode is set to none\", func() {\n\t\t\t\t\t\t\tit(\"reports buildpack disconnected from internet\", func() {\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.BpInternetCapable.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t\t\"--network\", \"none\",\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, 
output)\n\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsDisconnectedFromInternet()\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--volume\", func() {\n\t\t\t\t\t\tvar (\n\t\t\t\t\t\t\tvolumeRoot   = \"/\"\n\t\t\t\t\t\t\tslash        = \"/\"\n\t\t\t\t\t\t\ttmpDir       string\n\t\t\t\t\t\t\ttmpVolumeSrc string\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.SkipIf(t, os.Getenv(\"DOCKER_HOST\") != \"\", \"cannot mount volume when DOCKER_HOST is set\")\n\t\t\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"These tests are broken on Windows Containers on Windows when not using the creator; see https://github.com/buildpacks/pack/issues/2147\")\n\n\t\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" {\n\t\t\t\t\t\t\t\tvolumeRoot = `c:\\`\n\t\t\t\t\t\t\t\tslash = `\\`\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tvar err error\n\t\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"volume-buildpack-tests-\")\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, buildpacks.BpReadVolume, buildpacks.BpReadWriteVolume)\n\n\t\t\t\t\t\t\ttmpVolumeSrc, err = os.MkdirTemp(\"\", \"volume-mount-source\")\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tassert.Succeeds(os.Chmod(tmpVolumeSrc, 0777)) // Override umask\n\n\t\t\t\t\t\t\t// Some OSes (like macOS) use symlinks for the standard temp dir.\n\t\t\t\t\t\t\t// Resolve it so it can be properly mounted by the Docker daemon.\n\t\t\t\t\t\t\ttmpVolumeSrc, err = filepath.EvalSymlinks(tmpVolumeSrc)\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tmpVolumeSrc, \"some-file\"), []byte(\"some-content\\n\"), 0777)\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t_ = os.RemoveAll(tmpDir)\n\t\t\t\t\t\t\t_ = os.RemoveAll(tmpVolumeSrc)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"volume is read-only\", func() {\n\t\t\t\t\t\t\tit(\"mounts the provided volume in the detect and build phases\", func() 
{\n\t\t\t\t\t\t\t\tvolumeDest := volumeRoot + \"platform\" + slash + \"volume-mount-target\"\n\t\t\t\t\t\t\t\ttestFilePath := volumeDest + slash + \"some-file\"\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--volume\", fmt.Sprintf(\"%s:%s\", tmpVolumeSrc, volumeDest),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.BpReadVolume.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t\t\"--env\", \"TEST_FILE_PATH=\"+testFilePath,\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tbpOutputAsserts := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tbpOutputAsserts.ReportsReadingFileContents(\"Detect\", testFilePath, \"some-content\")\n\t\t\t\t\t\t\t\tbpOutputAsserts.ReportsReadingFileContents(\"Build\", testFilePath, \"some-content\")\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"should fail to write\", func() {\n\t\t\t\t\t\t\t\tvolumeDest := volumeRoot + \"platform\" + slash + \"volume-mount-target\"\n\t\t\t\t\t\t\t\ttestDetectFilePath := volumeDest + slash + \"detect-file\"\n\t\t\t\t\t\t\t\ttestBuildFilePath := volumeDest + slash + \"build-file\"\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--volume\", fmt.Sprintf(\"%s:%s\", tmpVolumeSrc, volumeDest),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.BpReadWriteVolume.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t\t\"--env\", \"DETECT_TEST_FILE_PATH=\"+testDetectFilePath,\n\t\t\t\t\t\t\t\t\t\"--env\", \"BUILD_TEST_FILE_PATH=\"+testBuildFilePath,\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tbpOutputAsserts := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tbpOutputAsserts.ReportsFailingToWriteFileContents(\"Detect\", testDetectFilePath)\n\t\t\t\t\t\t\t\tbpOutputAsserts.ReportsFailingToWriteFileContents(\"Build\", 
testBuildFilePath)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"volume is read-write\", func() {\n\t\t\t\t\t\t\tit(\"can be written to\", func() {\n\t\t\t\t\t\t\t\tvolumeDest := volumeRoot + \"volume-mount-target\"\n\t\t\t\t\t\t\t\ttestDetectFilePath := volumeDest + slash + \"detect-file\"\n\t\t\t\t\t\t\t\ttestBuildFilePath := volumeDest + slash + \"build-file\"\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--volume\", fmt.Sprintf(\"%s:%s:rw\", tmpVolumeSrc, volumeDest),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.BpReadWriteVolume.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t\t\"--env\", \"DETECT_TEST_FILE_PATH=\"+testDetectFilePath,\n\t\t\t\t\t\t\t\t\t\"--env\", \"BUILD_TEST_FILE_PATH=\"+testBuildFilePath,\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tbpOutputAsserts := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tbpOutputAsserts.ReportsWritingFileContents(\"Detect\", testDetectFilePath)\n\t\t\t\t\t\t\t\tbpOutputAsserts.ReportsReadingFileContents(\"Detect\", testDetectFilePath, \"some-content\")\n\t\t\t\t\t\t\t\tbpOutputAsserts.ReportsWritingFileContents(\"Build\", testBuildFilePath)\n\t\t\t\t\t\t\t\tbpOutputAsserts.ReportsReadingFileContents(\"Build\", testBuildFilePath, \"some-content\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--default-process\", func() {\n\t\t\t\t\t\tit(\"sets the default process from those in the process list\", func() {\n\t\t\t\t\t\t\tpack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"--default-process\", \"hello\",\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tassertImage.RunsWithLogs(repoName, \"hello world\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--buildpack\", func() {\n\t\t\t\t\t\twhen(\"the argument is an ID\", func() {\n\t\t\t\t\t\t\tit(\"adds the 
buildpacks to the builder if necessary and runs them\", func() {\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", \"simple/layers\", // can omit version if only one\n\t\t\t\t\t\t\t\t\t\"--buildpack\", \"noop.buildpack@noop.buildpack.version\",\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\n\t\t\t\t\t\t\t\tassertTestAppOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertTestAppOutput.ReportsBuildStep(\"Simple Layers Buildpack\")\n\t\t\t\t\t\t\t\tassertTestAppOutput.ReportsBuildStep(\"NOOP Buildpack\")\n\t\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\tt.Log(\"app is runnable\")\n\t\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\t\"Launch Dep Contents\",\n\t\t\t\t\t\t\t\t\t\"Cached Dep Contents\",\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the argument is an archive\", func() {\n\t\t\t\t\t\t\tvar tmpDir string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tvar err error\n\t\t\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"archive-buildpack-tests-\")\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\tassert.Succeeds(os.RemoveAll(tmpDir))\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"adds the buildpack to the builder and runs it\", func() {\n\t\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, buildpacks.BpArchiveNotInBuilder)\n\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.BpArchiveNotInBuilder.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertOutput := 
assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\"local/bp\", \"local-bp-version\")\n\t\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsBuildStep(\"Local Buildpack\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the argument is a directory\", func() {\n\t\t\t\t\t\t\tvar tmpDir string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tvar err error\n\t\t\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"folder-buildpack-tests-\")\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\t_ = os.RemoveAll(tmpDir)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"adds the buildpack to the builder and runs it\", func() {\n\t\t\t\t\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"buildpack directories not supported on windows\")\n\n\t\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, buildpacks.BpFolderNotInBuilder)\n\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.BpFolderNotInBuilder.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\"local/bp\", \"local-bp-version\")\n\t\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsBuildStep(\"Local Buildpack\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the argument is a meta-buildpack directory\", func() {\n\t\t\t\t\t\t\tvar tmpDir string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tvar err 
error\n\t\t\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"folder-buildpack-tests-\")\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\t_ = os.RemoveAll(tmpDir)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"adds the buildpacks to the builder and runs them\", func() {\n\t\t\t\t\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"buildpack directories not supported on windows\")\n\t\t\t\t\t\t\t\t// This only works if pack is new, therefore skip if pack is old\n\t\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.MetaBuildpackFolder), \"\")\n\n\t\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, buildpacks.MetaBpFolder)\n\t\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, buildpacks.MetaBpDependency)\n\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpacks.MetaBpFolder.FullPathIn(tmpDir),\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\"local/meta-bp\", \"local-meta-bp-version\")\n\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\"local/meta-bp-dep\", \"local-meta-bp-version\")\n\t\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsBuildStep(\"Local Meta-Buildpack Dependency\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the argument is a buildpackage image\", func() {\n\t\t\t\t\t\t\tvar (\n\t\t\t\t\t\t\t\ttmpDir           string\n\t\t\t\t\t\t\t\tpackageImageName string\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\timageManager.CleanupImages(packageImageName)\n\t\t\t\t\t\t\t\t_ = os.RemoveAll(tmpDir)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"adds the 
buildpacks to the builder and runs them\", func() {\n\t\t\t\t\t\t\t\tpackageImageName = registryConfig.RepoName(\"buildpack-\" + h.RandString(8))\n\n\t\t\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, \"package_for_build_cmd.toml\", imageManager.HostOS())\n\t\t\t\t\t\t\t\tpackageImage := buildpacks.NewPackageImage(\n\t\t\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\t\t\tpackageImageName,\n\t\t\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\t\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayersParent,\n\t\t\t\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayers,\n\t\t\t\t\t\t\t\t\t),\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, packageImage)\n\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", packageImageName,\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\n\t\t\t\t\t\t\t\t\t\"simple/layers/parent\",\n\t\t\t\t\t\t\t\t\t\"simple-layers-parent-version\",\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\"simple/layers\", \"simple-layers-version\")\n\t\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsBuildStep(\"Simple Layers Buildpack\")\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"buildpackage is in a registry\", func() {\n\t\t\t\t\t\t\t\tit(\"adds the buildpacks to the builder and runs them\", func() {\n\t\t\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.PlatformRetries), \"\")\n\t\t\t\t\t\t\t\t\tpackageImageName = registryConfig.RepoName(\"buildpack-\" + h.RandString(8))\n\n\t\t\t\t\t\t\t\t\tpackageTomlPath := 
generatePackageTomlWithOS(t, assert, pack, tmpDir, \"package_for_build_cmd.toml\", imageManager.HostOS())\n\t\t\t\t\t\t\t\t\tpackageImage := buildpacks.NewPackageImage(\n\t\t\t\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\t\t\t\tpackageImageName,\n\t\t\t\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\t\t\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayersParent,\n\t\t\t\t\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayers,\n\t\t\t\t\t\t\t\t\t\t),\n\t\t\t\t\t\t\t\t\t\tbuildpacks.WithPublish(),\n\t\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, packageImage)\n\n\t\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\t\"--buildpack\", packageImageName,\n\t\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\n\t\t\t\t\t\t\t\t\t\t\"simple/layers/parent\",\n\t\t\t\t\t\t\t\t\t\t\"simple-layers-parent-version\",\n\t\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\"simple/layers\", \"simple-layers-version\")\n\t\t\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsBuildStep(\"Simple Layers Buildpack\")\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the argument is a buildpackage file\", func() {\n\t\t\t\t\t\t\tvar tmpDir string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tvar err error\n\t\t\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"package-file\")\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\tassert.Succeeds(os.RemoveAll(tmpDir))\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"adds the 
buildpacks to the builder and runs them\", func() {\n\t\t\t\t\t\t\t\tpackageFileLocation := filepath.Join(\n\t\t\t\t\t\t\t\t\ttmpDir,\n\t\t\t\t\t\t\t\t\tfmt.Sprintf(\"buildpack-%s.cnb\", h.RandString(8)),\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, \"package_for_build_cmd.toml\", imageManager.HostOS())\n\t\t\t\t\t\t\t\tpackageFile := buildpacks.NewPackageFile(\n\t\t\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\t\t\tpackageFileLocation,\n\t\t\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\t\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayersParent,\n\t\t\t\t\t\t\t\t\t\tbuildpacks.BpFolderSimpleLayers,\n\t\t\t\t\t\t\t\t\t),\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, packageFile)\n\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", packageFileLocation,\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\n\t\t\t\t\t\t\t\t\t\"simple/layers/parent\",\n\t\t\t\t\t\t\t\t\t\"simple-layers-parent-version\",\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tassertOutput.ReportsAddingBuildpack(\"simple/layers\", \"simple-layers-version\")\n\t\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\tassertBuildpackOutput := assertions.NewTestBuildpackOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\tassertBuildpackOutput.ReportsBuildStep(\"Simple Layers Buildpack\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the buildpack stack doesn't match the builder\", func() {\n\t\t\t\t\t\t\tvar otherStackBuilderTgz string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t// The Platform API is new if pack is new AND the lifecycle is new\n\t\t\t\t\t\t\t\t// Therefore skip 
if pack is old OR the lifecycle is old\n\t\t\t\t\t\t\t\th.SkipIf(t,\n\t\t\t\t\t\t\t\t\tpack.SupportsFeature(invoke.StackValidation) ||\n\t\t\t\t\t\t\t\t\t\tapi.MustParse(lifecycle.LatestPlatformAPIVersion()).LessThan(\"0.12\"), \"\")\n\t\t\t\t\t\t\t\totherStackBuilderTgz = h.CreateTGZ(t, filepath.Join(bpDir, \"other-stack-buildpack\"), \"./\", 0755)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\th.SkipIf(t,\n\t\t\t\t\t\t\t\t\tpack.SupportsFeature(invoke.StackValidation) ||\n\t\t\t\t\t\t\t\t\t\tapi.MustParse(lifecycle.LatestPlatformAPIVersion()).LessThan(\"0.12\"), \"\")\n\t\t\t\t\t\t\t\tassert.Succeeds(os.Remove(otherStackBuilderTgz))\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\t\t\t_, err := pack.Run(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--buildpack\", otherStackBuilderTgz,\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"platform API < 0.12\", func() {\n\t\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t\t// The Platform API is old if pack is old OR the lifecycle is old\n\t\t\t\t\t\t\t\t\t// Therefore skip if pack is new AND the lifecycle is new\n\t\t\t\t\t\t\t\t\th.SkipIf(t,\n\t\t\t\t\t\t\t\t\t\t!pack.SupportsFeature(invoke.StackValidation) &&\n\t\t\t\t\t\t\t\t\t\t\tapi.MustParse(lifecycle.LatestPlatformAPIVersion()).AtLeast(\"0.12\"), \"\")\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\t\t\t\toutput, err := pack.Run(\n\t\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\t\"--buildpack\", otherStackBuilderTgz,\n\t\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\t\tassert.NotNil(err)\n\t\t\t\t\t\t\t\t\tassert.Contains(output, \"other/stack/bp\")\n\t\t\t\t\t\t\t\t\tassert.Contains(output, \"other-stack-version\")\n\t\t\t\t\t\t\t\t\tassert.Contains(output, \"does not support stack 
'pack.test.stack'\")\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--env-file\", func() {\n\t\t\t\t\t\tvar envPath string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tenvfile, err := os.CreateTemp(\"\", \"envfile\")\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tdefer envfile.Close()\n\n\t\t\t\t\t\t\terr = os.Setenv(\"ENV2_CONTENTS\", \"Env2 Layer Contents From Environment\")\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tenvfile.WriteString(`\n            DETECT_ENV_BUILDPACK=true\n\t\t\tENV1_CONTENTS=Env1 Layer Contents From File\n\t\t\tENV2_CONTENTS\n\t\t\t`)\n\t\t\t\t\t\t\tenvPath = envfile.Name()\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\tassert.Succeeds(os.Unsetenv(\"ENV2_CONTENTS\"))\n\t\t\t\t\t\t\tassert.Succeeds(os.RemoveAll(envPath))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"provides the env vars to the build and detect steps\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--env-file\", envPath,\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"Env2 Layer Contents From Environment\",\n\t\t\t\t\t\t\t\t\"Env1 Layer Contents From File\",\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--env\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tassert.Succeeds(os.Setenv(\"ENV2_CONTENTS\", \"Env2 Layer Contents From Environment\"))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\tassert.Succeeds(os.Unsetenv(\"ENV2_CONTENTS\"))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"provides the env vars to the build and detect steps\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", 
\"mock_app\"),\n\t\t\t\t\t\t\t\t\"--env\", \"DETECT_ENV_BUILDPACK=true\",\n\t\t\t\t\t\t\t\t\"--env\", `ENV1_CONTENTS=\"Env1 Layer Contents From Command Line\"`,\n\t\t\t\t\t\t\t\t\"--env\", \"ENV2_CONTENTS\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"Env2 Layer Contents From Environment\",\n\t\t\t\t\t\t\t\t\"Env1 Layer Contents From Command Line\",\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--run-image\", func() {\n\t\t\t\t\t\tvar runImageName string\n\n\t\t\t\t\t\twhen(\"the run-image has the correct stack ID\", func() {\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tuser := func() string {\n\t\t\t\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" {\n\t\t\t\t\t\t\t\t\t\treturn \"ContainerAdministrator\"\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\treturn \"root\"\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\trunImageName = h.CreateImageOnRemote(t, dockerCli, registryConfig, \"custom-run-image\"+h.RandString(10), fmt.Sprintf(`\n\t\t\t\t\t\t\t\t\t\t\t\t\tFROM %s\n\t\t\t\t\t\t\t\t\t\t\t\t\tUSER %s\n\t\t\t\t\t\t\t\t\t\t\t\t\tRUN echo \"custom-run\" > /custom-run.txt\n\t\t\t\t\t\t\t\t\t\t\t\t\tUSER pack\n\t\t\t\t\t\t\t\t\t\t\t\t`, runImage, user()))\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\timageManager.CleanupImages(runImageName)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"uses the run image as the base image\", func() {\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--run-image\", runImageName,\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, 
output)\n\t\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulImageBuild(repoName)\n\t\t\t\t\t\t\t\tassertOutput.ReportsPullingImage(runImageName)\n\n\t\t\t\t\t\t\t\tt.Log(\"app is runnable\")\n\t\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\t\"Launch Dep Contents\",\n\t\t\t\t\t\t\t\t\t\"Cached Dep Contents\",\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tt.Log(\"uses the run image as the base image\")\n\t\t\t\t\t\t\t\tassertImage.HasBaseImage(repoName, runImageName)\n\t\t\t\t\t\t\t\tif pack.SupportsFeature(invoke.FixesRunImageMetadata) {\n\t\t\t\t\t\t\t\t\tt.Log(fmt.Sprintf(\"run-image %s was added into 'io.buildpacks.lifecycle.metadata' label\", runImageName))\n\t\t\t\t\t\t\t\t\tassertImage.HasLabelContaining(repoName, \"io.buildpacks.lifecycle.metadata\", fmt.Sprintf(`\"image\":\"%s\"`, runImageName))\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"the run image has the wrong stack ID\", func() {\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\trunImageName = h.CreateImageOnRemote(t, dockerCli, registryConfig, \"custom-run-image\"+h.RandString(10), fmt.Sprintf(`\n\t\t\t\t\t\t\t\t\t\t\t\t\tFROM %s\n\t\t\t\t\t\t\t\t\t\t\t\t\tLABEL io.buildpacks.stack.id=other.stack.id\n\t\t\t\t\t\t\t\t\t\t\t\t\tUSER pack\n\t\t\t\t\t\t\t\t\t\t\t\t`, runImage))\n\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\timageManager.CleanupImages(runImageName)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"should validate stack\", func() {\n\t\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t\th.SkipIf(t, pack.SupportsFeature(invoke.StackWarning), \"stack is validated in prior versions\")\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\tit(\"fails with a message\", func() {\n\n\t\t\t\t\t\t\t\t\toutput, err := pack.Run(\n\t\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\t\"--run-image\", 
runImageName,\n\t\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\t\tassert.NotNil(err)\n\n\t\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\t\tassertOutput.ReportsRunImageStackNotMatchingBuilder(\n\t\t\t\t\t\t\t\t\t\t\"other.stack.id\",\n\t\t\t\t\t\t\t\t\t\t\"pack.test.stack\",\n\t\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"should not validate stack\", func() {\n\t\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.StackWarning), \"stack is no longer validated\")\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\tit(\"succeeds with a warning\", func() {\n\n\t\t\t\t\t\t\t\t\toutput, err := pack.Run(\n\t\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\t\"--run-image\", runImageName,\n\t\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\t\t\tassertOutput.ReportsDeprecatedUseOfStack()\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--publish\", func() {\n\t\t\t\t\t\tit(\"creates image on the registry\", func() {\n\t\t\t\t\t\t\tbuildArgs := []string{\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tif imageManager.HostOS() != \"windows\" {\n\t\t\t\t\t\t\t\tbuildArgs = append(buildArgs, \"--network\", \"host\")\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"build\", buildArgs...)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\tt.Log(\"checking that registry has contents\")\n\t\t\t\t\t\t\tassertImage.ExistsInRegistryCatalog(repo)\n\n\t\t\t\t\t\t\tcmdName := \"inspect\"\n\t\t\t\t\t\t\tif !pack.Supports(\"inspect\") {\n\t\t\t\t\t\t\t\tcmdName = 
\"inspect-image\"\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tt.Log(\"inspect-image\")\n\t\t\t\t\t\t\tvar (\n\t\t\t\t\t\t\t\twebCommand      string\n\t\t\t\t\t\t\t\thelloCommand    string\n\t\t\t\t\t\t\t\thelloArgs       []string\n\t\t\t\t\t\t\t\thelloArgsPrefix string\n\t\t\t\t\t\t\t\timageWorkdir    string\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" {\n\t\t\t\t\t\t\t\twebCommand = \".\\\\run\"\n\t\t\t\t\t\t\t\thelloCommand = \"cmd\"\n\t\t\t\t\t\t\t\thelloArgs = []string{\"/c\", \"echo hello world\"}\n\t\t\t\t\t\t\t\thelloArgsPrefix = \" \"\n\t\t\t\t\t\t\t\timageWorkdir = \"c:\\\\workspace\"\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\twebCommand = \"./run\"\n\t\t\t\t\t\t\t\thelloCommand = \"echo\"\n\t\t\t\t\t\t\t\thelloArgs = []string{\"hello\", \"world\"}\n\t\t\t\t\t\t\t\thelloArgsPrefix = \"\"\n\t\t\t\t\t\t\t\timageWorkdir = \"/workspace\"\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tformats := []compareFormat{\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\textension:   \"json\",\n\t\t\t\t\t\t\t\t\tcompareFunc: assert.EqualJSON,\n\t\t\t\t\t\t\t\t\toutputArg:   \"json\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\textension:   \"yaml\",\n\t\t\t\t\t\t\t\t\tcompareFunc: assert.EqualYAML,\n\t\t\t\t\t\t\t\t\toutputArg:   \"yaml\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\textension:   \"toml\",\n\t\t\t\t\t\t\t\t\tcompareFunc: assert.EqualTOML,\n\t\t\t\t\t\t\t\t\toutputArg:   \"toml\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tfor _, format := range formats {\n\t\t\t\t\t\t\t\tt.Logf(\"inspecting image %s format\", format.outputArg)\n\n\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(cmdName, repoName, \"--output\", format.outputArg)\n\n\t\t\t\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateFixture(\n\t\t\t\t\t\t\t\t\tfmt.Sprintf(\"inspect_image_published_output.%s\", format.extension),\n\t\t\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\t\t\"image_name\":           repoName,\n\t\t\t\t\t\t\t\t\t\t\"base_image_ref\":     
  strings.Join([]string{runImageMirror, h.Digest(t, runImageMirror)}, \"@\"),\n\t\t\t\t\t\t\t\t\t\t\"base_image_top_layer\": h.TopLayerDiffID(t, runImageMirror),\n\t\t\t\t\t\t\t\t\t\t\"run_image_mirror\":     runImageMirror,\n\t\t\t\t\t\t\t\t\t\t\"web_command\":          webCommand,\n\t\t\t\t\t\t\t\t\t\t\"hello_command\":        helloCommand,\n\t\t\t\t\t\t\t\t\t\t\"hello_args\":           helloArgs,\n\t\t\t\t\t\t\t\t\t\t\"hello_args_prefix\":    helloArgsPrefix,\n\t\t\t\t\t\t\t\t\t\t\"image_workdir\":        imageWorkdir,\n\t\t\t\t\t\t\t\t\t\t\"rebasable\":            true,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\tformat.compareFunc(output, expectedOutput)\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\timageManager.PullImage(repoName, registryConfig.RegistryAuth())\n\n\t\t\t\t\t\t\tt.Log(\"app is runnable\")\n\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"Launch Dep Contents\",\n\t\t\t\t\t\t\t\t\"Cached Dep Contents\",\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"additional tags are specified with --tag\", func() {\n\t\t\t\t\t\t\tvar additionalRepo string\n\t\t\t\t\t\t\tvar additionalRepoName string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tadditionalRepo = fmt.Sprintf(\"%s_additional\", repo)\n\t\t\t\t\t\t\t\tadditionalRepoName = fmt.Sprintf(\"%s_additional\", repoName)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\timageManager.CleanupImages(additionalRepoName)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\tit(\"creates additional tags on the registry\", func() {\n\t\t\t\t\t\t\t\tbuildArgs := []string{\n\t\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t\t\t\"--tag\", additionalRepoName,\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\tif imageManager.HostOS() != \"windows\" {\n\t\t\t\t\t\t\t\t\tbuildArgs = append(buildArgs, \"--network\", \"host\")\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\toutput := 
pack.RunSuccessfully(\"build\", buildArgs...)\n\t\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\t\tt.Log(\"checking that registry has contents\")\n\t\t\t\t\t\t\t\tassertImage.ExistsInRegistryCatalog(repo)\n\t\t\t\t\t\t\t\tassertImage.ExistsInRegistryCatalog(additionalRepo)\n\n\t\t\t\t\t\t\t\timageManager.PullImage(repoName, registryConfig.RegistryAuth())\n\t\t\t\t\t\t\t\timageManager.PullImage(additionalRepoName, registryConfig.RegistryAuth())\n\n\t\t\t\t\t\t\t\tt.Log(\"additional app is runnable\")\n\t\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\t\tadditionalRepoName,\n\t\t\t\t\t\t\t\t\t\"Launch Dep Contents\",\n\t\t\t\t\t\t\t\t\t\"Cached Dep Contents\",\n\t\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\t\timageDigest := h.Digest(t, repoName)\n\t\t\t\t\t\t\t\tadditionalDigest := h.Digest(t, additionalRepoName)\n\n\t\t\t\t\t\t\t\tassert.Equal(imageDigest, additionalDigest)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--cache-image\", func() {\n\t\t\t\t\t\tvar cacheImageName string\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tcacheImageName = fmt.Sprintf(\"%s-cache\", repoName)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"creates image and cache image on the registry\", func() {\n\t\t\t\t\t\t\tbuildArgs := []string{\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t\t\"--cache-image\",\n\t\t\t\t\t\t\t\tcacheImageName,\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tif imageManager.HostOS() != \"windows\" {\n\t\t\t\t\t\t\t\tbuildArgs = append(buildArgs, \"--network\", \"host\")\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"build\", buildArgs...)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\tcacheImageRef, err := name.ParseReference(cacheImageName, 
name.WeakValidation)\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\tt.Log(\"checking that registry has contents\")\n\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(repoName)\n\t\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" {\n\t\t\t\t\t\t\t\t// Cache images are automatically Linux container images, and therefore can't be pulled\n\t\t\t\t\t\t\t\t// and inspected correctly on WCOW systems\n\t\t\t\t\t\t\t\t// https://github.com/buildpacks/lifecycle/issues/529\n\t\t\t\t\t\t\t\timageManager.PullImage(cacheImageRef.Name(), registryConfig.RegistryAuth())\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(cacheImageRef.Name())\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tdefer imageManager.CleanupImages(cacheImageRef.Name())\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--cache with options for build cache as image\", func() {\n\t\t\t\t\t\tvar cacheImageName, cacheFlags string\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tcacheImageName = fmt.Sprintf(\"%s-cache\", repoName)\n\t\t\t\t\t\t\tcacheFlags = fmt.Sprintf(\"type=build;format=image;name=%s\", cacheImageName)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"creates image and cache image on the registry\", func() {\n\t\t\t\t\t\t\tbuildArgs := []string{\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t\t\"--cache\",\n\t\t\t\t\t\t\t\tcacheFlags,\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tif imageManager.HostOS() != \"windows\" {\n\t\t\t\t\t\t\t\tbuildArgs = append(buildArgs, \"--network\", \"host\")\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"build\", buildArgs...)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\tcacheImageRef, err := name.ParseReference(cacheImageName, name.WeakValidation)\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\tt.Log(\"checking that registry has 
contents\")\n\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(repoName)\n\t\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" {\n\t\t\t\t\t\t\t\t// Cache images are automatically Linux container images, and therefore can't be pulled\n\t\t\t\t\t\t\t\t// and inspected correctly on WCOW systems\n\t\t\t\t\t\t\t\t// https://github.com/buildpacks/lifecycle/issues/529\n\t\t\t\t\t\t\t\timageManager.PullImage(cacheImageRef.Name(), registryConfig.RegistryAuth())\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(cacheImageRef.Name())\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tdefer imageManager.CleanupImages(cacheImageRef.Name())\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--cache with options for build cache as bind\", func() {\n\t\t\t\t\t\tvar bindCacheDir, cacheFlags string\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.Cache), \"\")\n\t\t\t\t\t\t\tcacheBindName := strings.ReplaceAll(strings.ReplaceAll(fmt.Sprintf(\"%s-bind\", repoName), string(filepath.Separator), \"-\"), \":\", \"-\")\n\t\t\t\t\t\t\tvar err error\n\t\t\t\t\t\t\tbindCacheDir, err = os.MkdirTemp(\"\", cacheBindName)\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tcacheFlags = fmt.Sprintf(\"type=build;format=bind;source=%s\", bindCacheDir)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"creates the image and stores build cache in the bind mount\", func() {\n\t\t\t\t\t\t\tbuildArgs := []string{\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--cache\",\n\t\t\t\t\t\t\t\tcacheFlags,\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"build\", buildArgs...)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulImageBuild(repoName)\n\n\t\t\t\t\t\t\tt.Log(\"checking that bind mount has cache contents\")\n\t\t\t\t\t\t\tassert.FileExists(fmt.Sprintf(\"%s/committed\", bindCacheDir))\n\t\t\t\t\t\t\tdefer 
os.RemoveAll(bindCacheDir)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"ctrl+c\", func() {\n\t\t\t\t\t\tit(\"stops the execution\", func() {\n\t\t\t\t\t\t\tvar buf = new(bytes.Buffer)\n\t\t\t\t\t\t\tcommand := pack.StartWithWriter(\n\t\t\t\t\t\t\t\tbuf,\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tgo command.TerminateAtStep(\"DETECTING\")\n\n\t\t\t\t\t\t\terr := command.Wait()\n\t\t\t\t\t\t\tassert.NotNil(err)\n\t\t\t\t\t\t\tassert.NotContains(buf.String(), \"Successfully built image\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--descriptor\", func() {\n\n\t\t\t\t\t\twhen(\"using an included buildpack\", func() {\n\t\t\t\t\t\t\tvar tempAppDir, tempWorkingDir, origWorkingDir string\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"buildpack directories not supported on windows\")\n\n\t\t\t\t\t\t\t\tvar err error\n\t\t\t\t\t\t\t\ttempAppDir, err = os.MkdirTemp(\"\", \"descriptor-app\")\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\ttempWorkingDir, err = os.MkdirTemp(\"\", \"descriptor-app\")\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\torigWorkingDir, err = os.Getwd()\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\t// Create test directories and files:\n\t\t\t\t\t\t\t\t//\n\t\t\t\t\t\t\t\t// ├── cookie.jar\n\t\t\t\t\t\t\t\t// ├── descriptor-buildpack/...\n\t\t\t\t\t\t\t\t// ├── media\n\t\t\t\t\t\t\t\t// │   ├── mountain.jpg\n\t\t\t\t\t\t\t\t// │   └── person.png\n\t\t\t\t\t\t\t\t// └── test.sh\n\t\t\t\t\t\t\t\tassert.Succeeds(os.Mkdir(filepath.Join(tempAppDir, \"descriptor-buildpack\"), os.ModePerm))\n\t\t\t\t\t\t\t\th.RecursiveCopy(t, filepath.Join(bpDir, \"descriptor-buildpack\"), filepath.Join(tempAppDir, \"descriptor-buildpack\"))\n\n\t\t\t\t\t\t\t\terr = os.Mkdir(filepath.Join(tempAppDir, \"media\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = 
os.WriteFile(filepath.Join(tempAppDir, \"media\", \"mountain.jpg\"), []byte(\"fake image bytes\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"media\", \"person.png\"), []byte(\"fake image bytes\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"cookie.jar\"), []byte(\"chocolate chip\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"test.sh\"), []byte(\"echo test\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\tprojectToml := `\n[project]\nname = \"exclude test\"\n[[project.licenses]]\ntype = \"MIT\"\n[build]\nexclude = [ \"*.sh\", \"media/person.png\", \"descriptor-buildpack\" ]\n\n[[build.buildpacks]]\nuri = \"descriptor-buildpack\"\n`\n\t\t\t\t\t\t\t\texcludeDescriptorPath := filepath.Join(tempAppDir, \"project.toml\")\n\t\t\t\t\t\t\t\terr = os.WriteFile(excludeDescriptorPath, []byte(projectToml), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\t// set working dir to be outside of the app we are building\n\t\t\t\t\t\t\t\tassert.Succeeds(os.Chdir(tempWorkingDir))\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\tos.RemoveAll(tempAppDir)\n\t\t\t\t\t\t\t\tif origWorkingDir != \"\" {\n\t\t\t\t\t\t\t\t\tassert.Succeeds(os.Chdir(origWorkingDir))\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\tit(\"uses buildpack specified by descriptor\", func() {\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\",\n\t\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\t\"-p\", tempAppDir,\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"person.png\")\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"test.sh\")\n\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"exclude and include\", func() {\n\t\t\t\t\t\t\tvar buildpackTgz, tempAppDir string\n\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tbuildpackTgz = h.CreateTGZ(t, 
filepath.Join(bpDir, \"descriptor-buildpack\"), \"./\", 0755)\n\n\t\t\t\t\t\t\t\tvar err error\n\t\t\t\t\t\t\t\ttempAppDir, err = os.MkdirTemp(\"\", \"descriptor-app\")\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\t// Create test directories and files:\n\t\t\t\t\t\t\t\t//\n\t\t\t\t\t\t\t\t// ├── cookie.jar\n\t\t\t\t\t\t\t\t// ├── other-cookie.jar\n\t\t\t\t\t\t\t\t// ├── nested-cookie.jar\n\t\t\t\t\t\t\t\t// ├── nested\n\t\t\t\t\t\t\t\t// │   └── nested-cookie.jar\n\t\t\t\t\t\t\t\t// ├── secrets\n\t\t\t\t\t\t\t\t// │   ├── api_keys.json\n\t\t\t\t\t\t\t\t// │   └── user_token\n\t\t\t\t\t\t\t\t// ├── media\n\t\t\t\t\t\t\t\t// │   ├── mountain.jpg\n\t\t\t\t\t\t\t\t// │   └── person.png\n\t\t\t\t\t\t\t\t// └── test.sh\n\n\t\t\t\t\t\t\t\terr = os.Mkdir(filepath.Join(tempAppDir, \"secrets\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"secrets\", \"api_keys.json\"), []byte(\"{}\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"secrets\", \"user_token\"), []byte(\"token\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\terr = os.Mkdir(filepath.Join(tempAppDir, \"nested\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"nested\", \"nested-cookie.jar\"), []byte(\"chocolate chip\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"other-cookie.jar\"), []byte(\"chocolate chip\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"nested-cookie.jar\"), []byte(\"chocolate chip\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\terr = os.Mkdir(filepath.Join(tempAppDir, \"media\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"media\", \"mountain.jpg\"), []byte(\"fake image bytes\"), 
0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"media\", \"person.png\"), []byte(\"fake image bytes\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"cookie.jar\"), []byte(\"chocolate chip\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t\terr = os.WriteFile(filepath.Join(tempAppDir, \"test.sh\"), []byte(\"echo test\"), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\t\tassert.Succeeds(os.RemoveAll(tempAppDir))\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"should exclude ALL specified files and directories\", func() {\n\t\t\t\t\t\t\t\tprojectToml := `\n[project]\nname = \"exclude test\"\n[[project.licenses]]\ntype = \"MIT\"\n[build]\nexclude = [ \"*.sh\", \"secrets/\", \"media/metadata\", \"/other-cookie.jar\" ,\"/nested-cookie.jar\"]\n`\n\t\t\t\t\t\t\t\texcludeDescriptorPath := filepath.Join(tempAppDir, \"exclude.toml\")\n\t\t\t\t\t\t\t\terr := os.WriteFile(excludeDescriptorPath, []byte(projectToml), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\",\n\t\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\t\"-p\", tempAppDir,\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpackTgz,\n\t\t\t\t\t\t\t\t\t\"--descriptor\", excludeDescriptorPath,\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"api_keys.json\")\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"user_token\")\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"test.sh\")\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"other-cookie.jar\")\n\n\t\t\t\t\t\t\t\tassert.Contains(output, \"cookie.jar\")\n\t\t\t\t\t\t\t\tassert.Contains(output, \"nested-cookie.jar\")\n\t\t\t\t\t\t\t\tassert.Contains(output, \"mountain.jpg\")\n\t\t\t\t\t\t\t\tassert.Contains(output, \"person.png\")\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"should ONLY include specified files and directories\", func() 
{\n\t\t\t\t\t\t\t\tprojectToml := `\n[project]\nname = \"include test\"\n[[project.licenses]]\ntype = \"MIT\"\n[build]\ninclude = [ \"*.jar\", \"media/mountain.jpg\", \"/media/person.png\", ]\n`\n\t\t\t\t\t\t\t\tincludeDescriptorPath := filepath.Join(tempAppDir, \"include.toml\")\n\t\t\t\t\t\t\t\terr := os.WriteFile(includeDescriptorPath, []byte(projectToml), 0755)\n\t\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\",\n\t\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\t\"-p\", tempAppDir,\n\t\t\t\t\t\t\t\t\t\"--buildpack\", buildpackTgz,\n\t\t\t\t\t\t\t\t\t\"--descriptor\", includeDescriptorPath,\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"api_keys.json\")\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"user_token\")\n\t\t\t\t\t\t\t\tassert.NotContains(output, \"test.sh\")\n\n\t\t\t\t\t\t\t\tassert.Contains(output, \"cookie.jar\")\n\t\t\t\t\t\t\t\tassert.Contains(output, \"mountain.jpg\")\n\t\t\t\t\t\t\t\tassert.Contains(output, \"person.png\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--creation-time\", func() {\n\t\t\t\t\t\twhen(\"provided as 'now'\", func() {\n\t\t\t\t\t\t\tit(\"image has create time of the current time\", func() {\n\t\t\t\t\t\t\t\texpectedTime := time.Now()\n\t\t\t\t\t\t\t\tpack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--creation-time\", \"now\",\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tassertImage.HasCreateTime(repoName, expectedTime)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"provided as unix timestamp\", func() {\n\t\t\t\t\t\t\tit(\"image has create time of the time that was provided\", func() {\n\t\t\t\t\t\t\t\tpack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\t\"--creation-time\", 
\"1566172801\",\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\texpectedTime, err := time.Parse(\"2006-01-02T03:04:05Z\", \"2019-08-19T00:00:01Z\")\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tassertImage.HasCreateTime(repoName, expectedTime)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"not provided\", func() {\n\t\t\t\t\t\t\tit(\"image has create time of Jan 1, 1980\", func() {\n\t\t\t\t\t\t\t\tpack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\texpectedTime, err := time.Parse(\"2006-01-02T03:04:05Z\", \"1980-01-01T00:00:01Z\")\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tassertImage.HasCreateTime(repoName, expectedTime)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--platform\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.PlatformOption), \"\")\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"uses the builder with the desired platform\", func() {\n\t\t\t\t\t\t\toutput, _ := pack.Run(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--platform\", \"linux/not-exist-arch\",\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\th.AssertContainsMatch(t, output, \"Pulling image '.*test/builder.*' with platform 'linux/not-exist-arch\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"build --buildpack <flattened buildpack>\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\ttmpDir                         string\n\t\t\t\t\t\tflattenedPackageName           string\n\t\t\t\t\t\tsimplePackageConfigFixtureName = \"package.toml\"\n\t\t\t\t\t)\n\n\t\t\t\t\tgenerateAggregatePackageToml := func(buildpackURI, nestedPackageName, operatingSystem string) string {\n\t\t\t\t\t\tt.Helper()\n\t\t\t\t\t\tpackageTomlFile, err := os.CreateTemp(tmpDir, 
\"package_aggregate-*.toml\")\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\tpack.FixtureManager().TemplateFixtureToFile(\n\t\t\t\t\t\t\t\"package_aggregate.toml\",\n\t\t\t\t\t\t\tpackageTomlFile,\n\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\"BuildpackURI\": buildpackURI,\n\t\t\t\t\t\t\t\t\"PackageName\":  nestedPackageName,\n\t\t\t\t\t\t\t\t\"OS\":           operatingSystem,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tassert.Nil(packageTomlFile.Close())\n\t\t\t\t\t\treturn packageTomlFile.Name()\n\t\t\t\t\t}\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.BuildpackFlatten), \"\")\n\t\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"buildpack directories not supported on windows\")\n\n\t\t\t\t\t\tvar err error\n\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"buildpack-package-flattened-tests\")\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\tbuildpackManager = buildpacks.NewBuildModuleManager(t, assert)\n\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, buildpacks.BpSimpleLayersParent, buildpacks.BpSimpleLayers)\n\n\t\t\t\t\t\t// set up a flattened buildpack\n\t\t\t\t\t\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, simplePackageConfigFixtureName, imageManager.HostOS())\n\t\t\t\t\t\tnestedPackageName := \"test/flattened-package-\" + h.RandString(10)\n\t\t\t\t\t\tnestedPackage := buildpacks.NewPackageImage(\n\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\tpack,\n\t\t\t\t\t\t\tnestedPackageName,\n\t\t\t\t\t\t\tpackageTomlPath,\n\t\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayers),\n\t\t\t\t\t\t)\n\t\t\t\t\t\tbuildpackManager.PrepareBuildModules(tmpDir, nestedPackage)\n\t\t\t\t\t\tassertImage.ExistsLocally(nestedPackageName)\n\n\t\t\t\t\t\taggregatePackageToml := generateAggregatePackageToml(\"simple-layers-parent-buildpack.tgz\", nestedPackageName, imageManager.HostOS())\n\t\t\t\t\t\tflattenedPackageName = \"test/package-\" + h.RandString(10)\n\n\t\t\t\t\t\t_ = 
pack.RunSuccessfully(\n\t\t\t\t\t\t\t\"buildpack\", \"package\", flattenedPackageName,\n\t\t\t\t\t\t\t\"-c\", aggregatePackageToml,\n\t\t\t\t\t\t\t\"--flatten\",\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tassertImage.ExistsLocally(flattenedPackageName)\n\t\t\t\t\t\tassertImage.HasLengthLayers(flattenedPackageName, 1)\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\tassert.Nil(os.RemoveAll(tmpDir))\n\t\t\t\t\t\timageManager.CleanupImages(flattenedPackageName)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--flatten\", func() {\n\t\t\t\t\t\tit(\"does not write duplicate tar files when creating the ephemeral builder\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--buildpack\", fmt.Sprintf(\"docker://%s\", flattenedPackageName),\n\t\t\t\t\t\t\t\t\"--builder\", builderName,\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t// buildpack returning an empty tar file is non-deterministic,\n\t\t\t\t\t\t\t// but we expect one of them to throw the warning\n\t\t\t\t\t\t\th.AssertContainsMatch(t, output, \"Buildpack '(simple/layers@simple-layers-version|simple/layers/parent@simple-layers-parent-version)' is a component of a flattened buildpack that will be added elsewhere, skipping...\")\n\n\t\t\t\t\t\t\t// simple/layers BP exists on the builder and in the flattened buildpack\n\t\t\t\t\t\t\th.AssertContainsMatch(t, output, \"Buildpack 'simple/layers@simple-layers-version' already exists on builder with same contents, skipping...\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"inspecting builder\", func() {\n\t\t\t\twhen(\"inspecting a nested builder\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// create our nested builder\n\t\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"These tests are not yet compatible with Windows-based containers\")\n\n\t\t\t\t\t\t// create a task, handled by a 'task manager' which executes our pack commands 
during tests.\n\t\t\t\t\t\t// looks like this is used to de-dup tasks\n\t\t\t\t\t\tkey := taskKey(\n\t\t\t\t\t\t\t\"create-complex-builder\",\n\t\t\t\t\t\t\tappend(\n\t\t\t\t\t\t\t\t[]string{runImageMirror, createBuilderPackConfig.Path(), lifecycle.Identifier()},\n\t\t\t\t\t\t\t\tcreateBuilderPackConfig.FixturePaths()...,\n\t\t\t\t\t\t\t)...,\n\t\t\t\t\t\t)\n\t\t\t\t\t\t// run task on taskmanager and save output, in case there are future calls to the same task\n\t\t\t\t\t\t// likely all our changes need to go on the createBuilderPack.\n\t\t\t\t\t\tvalue, err := suiteManager.RunTaskOnceString(key, func() (string, error) {\n\t\t\t\t\t\t\treturn createComplexBuilder(\n\t\t\t\t\t\t\t\tt,\n\t\t\t\t\t\t\t\tassert,\n\t\t\t\t\t\t\t\tcreateBuilderPack,\n\t\t\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\t\t\tbuildpackManager,\n\t\t\t\t\t\t\t\trunImageMirror,\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t// register task to be run to 'clean up' a task\n\t\t\t\t\t\tsuiteManager.RegisterCleanUp(\"clean-\"+key, func() error {\n\t\t\t\t\t\t\timageManager.CleanupImages(value)\n\t\t\t\t\t\t\treturn nil\n\t\t\t\t\t\t})\n\t\t\t\t\t\tbuilderName = value\n\n\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\"config\", \"run-image-mirrors\", \"add\", \"pack-test/run\", \"--mirror\", \"some-registry.com/pack-test/run1\")\n\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\tassertOutput.ReportsSuccesfulRunImageMirrorsAdd(\"pack-test/run\", \"some-registry.com/pack-test/run1\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"displays nested Detection Order groups\", func() {\n\t\t\t\t\t\tvar output string\n\t\t\t\t\t\tif pack.Supports(\"builder inspect\") {\n\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"builder\", \"inspect\", builderName)\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"inspect-builder\", 
builderName)\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tdeprecatedBuildpackAPIs,\n\t\t\t\t\t\t\tsupportedBuildpackAPIs,\n\t\t\t\t\t\t\tdeprecatedPlatformAPIs,\n\t\t\t\t\t\t\tsupportedPlatformAPIs := lifecycle.OutputForAPIs()\n\n\t\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateVersionedFixture(\n\t\t\t\t\t\t\t\"inspect_%s_builder_nested_output.txt\",\n\t\t\t\t\t\t\tcreateBuilderPack.SanitizedVersion(),\n\t\t\t\t\t\t\t\"inspect_builder_nested_output.txt\",\n\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\"builder_name\":              builderName,\n\t\t\t\t\t\t\t\t\"lifecycle_version\":         lifecycle.Version(),\n\t\t\t\t\t\t\t\t\"deprecated_buildpack_apis\": deprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\"supported_buildpack_apis\":  supportedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\"deprecated_platform_apis\":  deprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\t\"supported_platform_apis\":   supportedPlatformAPIs,\n\t\t\t\t\t\t\t\t\"run_image_mirror\":          runImageMirror,\n\t\t\t\t\t\t\t\t\"pack_version\":              createBuilderPack.Version(),\n\t\t\t\t\t\t\t\t\"trusted\":                   \"No\",\n\n\t\t\t\t\t\t\t\t// set previous pack template fields\n\t\t\t\t\t\t\t\t\"buildpack_api_version\": lifecycle.EarliestBuildpackAPIVersion(),\n\t\t\t\t\t\t\t\t\"platform_api_version\":  lifecycle.EarliestPlatformAPIVersion(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tassert.TrimmedEq(output, expectedOutput)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"provides nested detection output up to depth\", func() {\n\t\t\t\t\t\tdepth := \"1\"\n\t\t\t\t\t\tvar output string\n\t\t\t\t\t\tif pack.Supports(\"builder inspect\") {\n\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"builder\", \"inspect\", \"--depth\", depth, builderName)\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"inspect-builder\", \"--depth\", depth, 
builderName)\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tdeprecatedBuildpackAPIs,\n\t\t\t\t\t\t\tsupportedBuildpackAPIs,\n\t\t\t\t\t\t\tdeprecatedPlatformAPIs,\n\t\t\t\t\t\t\tsupportedPlatformAPIs := lifecycle.OutputForAPIs()\n\n\t\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateVersionedFixture(\n\t\t\t\t\t\t\t\"inspect_%s_builder_nested_depth_2_output.txt\",\n\t\t\t\t\t\t\tcreateBuilderPack.SanitizedVersion(),\n\t\t\t\t\t\t\t\"inspect_builder_nested_depth_2_output.txt\",\n\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\"builder_name\":              builderName,\n\t\t\t\t\t\t\t\t\"lifecycle_version\":         lifecycle.Version(),\n\t\t\t\t\t\t\t\t\"deprecated_buildpack_apis\": deprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\"supported_buildpack_apis\":  supportedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\"deprecated_platform_apis\":  deprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\t\"supported_platform_apis\":   supportedPlatformAPIs,\n\t\t\t\t\t\t\t\t\"run_image_mirror\":          runImageMirror,\n\t\t\t\t\t\t\t\t\"pack_version\":              createBuilderPack.Version(),\n\t\t\t\t\t\t\t\t\"trusted\":                   \"No\",\n\n\t\t\t\t\t\t\t\t// set previous pack template fields\n\t\t\t\t\t\t\t\t\"buildpack_api_version\": lifecycle.EarliestBuildpackAPIVersion(),\n\t\t\t\t\t\t\t\t\"platform_api_version\":  lifecycle.EarliestPlatformAPIVersion(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tassert.TrimmedEq(output, expectedOutput)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"output format is toml\", func() {\n\t\t\t\t\t\tit(\"prints builder information in toml format\", func() {\n\t\t\t\t\t\t\tvar output string\n\t\t\t\t\t\t\tif pack.Supports(\"builder inspect\") {\n\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"builder\", \"inspect\", builderName, \"--output\", \"toml\")\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"inspect-builder\", builderName, \"--output\", \"toml\")\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\terr := 
toml.NewDecoder(strings.NewReader(string(output))).Decode(&struct{}{})\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\tdeprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\tsupportedBuildpackAPIs,\n\t\t\t\t\t\t\t\tdeprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\tsupportedPlatformAPIs := lifecycle.TOMLOutputForAPIs()\n\n\t\t\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateVersionedFixture(\n\t\t\t\t\t\t\t\t\"inspect_%s_builder_nested_output_toml.txt\",\n\t\t\t\t\t\t\t\tcreateBuilderPack.SanitizedVersion(),\n\t\t\t\t\t\t\t\t\"inspect_builder_nested_output_toml.txt\",\n\t\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\t\"builder_name\":              builderName,\n\t\t\t\t\t\t\t\t\t\"lifecycle_version\":         lifecycle.Version(),\n\t\t\t\t\t\t\t\t\t\"deprecated_buildpack_apis\": deprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\t\"supported_buildpack_apis\":  supportedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\t\"deprecated_platform_apis\":  deprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\t\t\"supported_platform_apis\":   supportedPlatformAPIs,\n\t\t\t\t\t\t\t\t\t\"run_image_mirror\":          runImageMirror,\n\t\t\t\t\t\t\t\t\t\"pack_version\":              createBuilderPack.Version(),\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tassert.TrimmedEq(string(output), expectedOutput)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"output format is yaml\", func() {\n\t\t\t\t\t\tit(\"prints builder information in yaml format\", func() {\n\t\t\t\t\t\t\tvar output string\n\t\t\t\t\t\t\tif pack.Supports(\"builder inspect\") {\n\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"builder\", \"inspect\", builderName, \"--output\", \"yaml\")\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"inspect-builder\", builderName, \"--output\", \"yaml\")\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\terr := yaml.Unmarshal([]byte(output), 
&struct{}{})\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\tdeprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\tsupportedBuildpackAPIs,\n\t\t\t\t\t\t\t\tdeprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\tsupportedPlatformAPIs := lifecycle.YAMLOutputForAPIs(14)\n\n\t\t\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateVersionedFixture(\n\t\t\t\t\t\t\t\t\"inspect_%s_builder_nested_output_yaml.txt\",\n\t\t\t\t\t\t\t\tcreateBuilderPack.SanitizedVersion(),\n\t\t\t\t\t\t\t\t\"inspect_builder_nested_output_yaml.txt\",\n\t\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\t\"builder_name\":              builderName,\n\t\t\t\t\t\t\t\t\t\"lifecycle_version\":         lifecycle.Version(),\n\t\t\t\t\t\t\t\t\t\"deprecated_buildpack_apis\": deprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\t\"supported_buildpack_apis\":  supportedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\t\"deprecated_platform_apis\":  deprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\t\t\"supported_platform_apis\":   supportedPlatformAPIs,\n\t\t\t\t\t\t\t\t\t\"run_image_mirror\":          runImageMirror,\n\t\t\t\t\t\t\t\t\t\"pack_version\":              createBuilderPack.Version(),\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tassert.TrimmedEq(string(output), expectedOutput)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"output format is json\", func() {\n\t\t\t\t\t\tit(\"prints builder information in json format\", func() {\n\t\t\t\t\t\t\tvar output string\n\t\t\t\t\t\t\tif pack.Supports(\"builder inspect\") {\n\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"builder\", \"inspect\", builderName, \"--output\", \"json\")\n\t\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\t\toutput = pack.RunSuccessfully(\"inspect-builder\", builderName, \"--output\", \"json\")\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\terr := json.Unmarshal([]byte(output), &struct{}{})\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\tvar prettifiedOutput bytes.Buffer\n\t\t\t\t\t\t\terr = json.Indent(&prettifiedOutput, []byte(output), \"\", \"  
\")\n\t\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\t\tdeprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\tsupportedBuildpackAPIs,\n\t\t\t\t\t\t\t\tdeprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\tsupportedPlatformAPIs := lifecycle.JSONOutputForAPIs(8)\n\n\t\t\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateVersionedFixture(\n\t\t\t\t\t\t\t\t\"inspect_%s_builder_nested_output_json.txt\",\n\t\t\t\t\t\t\t\tcreateBuilderPack.SanitizedVersion(),\n\t\t\t\t\t\t\t\t\"inspect_builder_nested_output_json.txt\",\n\t\t\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\t\t\"builder_name\":              builderName,\n\t\t\t\t\t\t\t\t\t\"lifecycle_version\":         lifecycle.Version(),\n\t\t\t\t\t\t\t\t\t\"deprecated_buildpack_apis\": deprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\t\"supported_buildpack_apis\":  supportedBuildpackAPIs,\n\t\t\t\t\t\t\t\t\t\"deprecated_platform_apis\":  deprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\t\t\"supported_platform_apis\":   supportedPlatformAPIs,\n\t\t\t\t\t\t\t\t\t\"run_image_mirror\":          runImageMirror,\n\t\t\t\t\t\t\t\t\t\"pack_version\":              createBuilderPack.Version(),\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tassert.Equal(prettifiedOutput.String(), expectedOutput)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\tit(\"displays configuration for a builder (local and remote)\", func() {\n\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\"config\", \"run-image-mirrors\", \"add\", \"pack-test/run\", \"--mirror\", \"some-registry.com/pack-test/run1\",\n\t\t\t\t\t)\n\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\tassertOutput.ReportsSuccesfulRunImageMirrorsAdd(\"pack-test/run\", \"some-registry.com/pack-test/run1\")\n\n\t\t\t\t\tif pack.Supports(\"builder inspect\") {\n\t\t\t\t\t\toutput = pack.RunSuccessfully(\"builder\", \"inspect\", builderName)\n\t\t\t\t\t} else {\n\t\t\t\t\t\toutput = pack.RunSuccessfully(\"inspect-builder\", 
builderName)\n\t\t\t\t\t}\n\n\t\t\t\t\tdeprecatedBuildpackAPIs,\n\t\t\t\t\t\tsupportedBuildpackAPIs,\n\t\t\t\t\t\tdeprecatedPlatformAPIs,\n\t\t\t\t\t\tsupportedPlatformAPIs := lifecycle.OutputForAPIs()\n\n\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateVersionedFixture(\n\t\t\t\t\t\t\"inspect_%s_builder_output.txt\",\n\t\t\t\t\t\tcreateBuilderPack.SanitizedVersion(),\n\t\t\t\t\t\t\"inspect_builder_output.txt\",\n\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\"builder_name\":              builderName,\n\t\t\t\t\t\t\t\"lifecycle_version\":         lifecycle.Version(),\n\t\t\t\t\t\t\t\"deprecated_buildpack_apis\": deprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\"supported_buildpack_apis\":  supportedBuildpackAPIs,\n\t\t\t\t\t\t\t\"deprecated_platform_apis\":  deprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\"supported_platform_apis\":   supportedPlatformAPIs,\n\t\t\t\t\t\t\t\"run_image_mirror\":          runImageMirror,\n\t\t\t\t\t\t\t\"pack_version\":              createBuilderPack.Version(),\n\t\t\t\t\t\t\t\"trusted\":                   \"No\",\n\n\t\t\t\t\t\t\t// set previous pack template fields\n\t\t\t\t\t\t\t\"buildpack_api_version\": lifecycle.EarliestBuildpackAPIVersion(),\n\t\t\t\t\t\t\t\"platform_api_version\":  lifecycle.EarliestPlatformAPIVersion(),\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\n\t\t\t\t\tassert.TrimmedEq(output, expectedOutput)\n\t\t\t\t})\n\n\t\t\t\tit(\"indicates builder is trusted\", func() {\n\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"trusted-builders\", \"add\", builderName)\n\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"run-image-mirrors\", \"add\", \"pack-test/run\", \"--mirror\", \"some-registry.com/pack-test/run1\")\n\n\t\t\t\t\tvar output string\n\t\t\t\t\tif pack.Supports(\"builder inspect\") {\n\t\t\t\t\t\toutput = pack.RunSuccessfully(\"builder\", \"inspect\", builderName)\n\t\t\t\t\t} else {\n\t\t\t\t\t\toutput = pack.RunSuccessfully(\"inspect-builder\", 
builderName)\n\t\t\t\t\t}\n\n\t\t\t\t\tdeprecatedBuildpackAPIs,\n\t\t\t\t\t\tsupportedBuildpackAPIs,\n\t\t\t\t\t\tdeprecatedPlatformAPIs,\n\t\t\t\t\t\tsupportedPlatformAPIs := lifecycle.OutputForAPIs()\n\n\t\t\t\t\texpectedOutput := pack.FixtureManager().TemplateVersionedFixture(\n\t\t\t\t\t\t\"inspect_%s_builder_output.txt\",\n\t\t\t\t\t\tcreateBuilderPack.SanitizedVersion(),\n\t\t\t\t\t\t\"inspect_builder_output.txt\",\n\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\"builder_name\":              builderName,\n\t\t\t\t\t\t\t\"lifecycle_version\":         lifecycle.Version(),\n\t\t\t\t\t\t\t\"deprecated_buildpack_apis\": deprecatedBuildpackAPIs,\n\t\t\t\t\t\t\t\"supported_buildpack_apis\":  supportedBuildpackAPIs,\n\t\t\t\t\t\t\t\"deprecated_platform_apis\":  deprecatedPlatformAPIs,\n\t\t\t\t\t\t\t\"supported_platform_apis\":   supportedPlatformAPIs,\n\t\t\t\t\t\t\t\"run_image_mirror\":          runImageMirror,\n\t\t\t\t\t\t\t\"pack_version\":              createBuilderPack.Version(),\n\t\t\t\t\t\t\t\"trusted\":                   \"Yes\",\n\n\t\t\t\t\t\t\t// set previous pack template fields\n\t\t\t\t\t\t\t\"buildpack_api_version\": lifecycle.EarliestBuildpackAPIVersion(),\n\t\t\t\t\t\t\t\"platform_api_version\":  lifecycle.EarliestPlatformAPIVersion(),\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\n\t\t\t\t\tassert.TrimmedEq(output, expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"rebase\", func() {\n\t\t\t\tvar repoName, runBefore, origID string\n\t\t\t\tvar buildRunImage func(string, string, string)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"trusted-builders\", \"add\", builderName)\n\n\t\t\t\t\trepoName = registryConfig.RepoName(\"some-org/\" + h.RandString(10))\n\t\t\t\t\trunBefore = registryConfig.RepoName(\"run-before/\" + h.RandString(10))\n\n\t\t\t\t\tbuildRunImage = func(newRunImage, contents1, contents2 string) {\n\t\t\t\t\t\tuser := func() string {\n\t\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" 
{\n\t\t\t\t\t\t\t\treturn \"ContainerAdministrator\"\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\treturn \"root\"\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\th.CreateImage(t, dockerCli, newRunImage, fmt.Sprintf(`\n\t\t\t\t\t\t\t\t\t\t\t\t\tFROM %s\n\t\t\t\t\t\t\t\t\t\t\t\t\tUSER %s\n\t\t\t\t\t\t\t\t\t\t\t\t\tRUN echo %s > /contents1.txt\n\t\t\t\t\t\t\t\t\t\t\t\t\tRUN echo %s > /contents2.txt\n\t\t\t\t\t\t\t\t\t\t\t\t\tUSER pack\n\t\t\t\t\t\t\t\t\t\t\t\t`, runImage, user(), contents1, contents2))\n\t\t\t\t\t}\n\n\t\t\t\t\tbuildRunImage(runBefore, \"contents-before-1\", \"contents-before-2\")\n\t\t\t\t\tpack.RunSuccessfully(\n\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\"--builder\", builderName,\n\t\t\t\t\t\t\"--run-image\", runBefore,\n\t\t\t\t\t\t\"--pull-policy\", \"never\",\n\t\t\t\t\t)\n\t\t\t\t\torigID = h.ImageID(t, repoName)\n\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\"contents-before-1\",\n\t\t\t\t\t\t\"contents-before-2\",\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\timageManager.CleanupImages(origID, repoName, runBefore)\n\t\t\t\t\tref, err := name.ParseReference(repoName, name.WeakValidation)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tlogger := logging.NewSimpleLogger(&bytes.Buffer{})\n\t\t\t\t\tbuildCacheVolume, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"build\", dockerCli, logger)\n\t\t\t\t\tlaunchCacheVolume, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"launch\", dockerCli, logger)\n\t\t\t\t\tassert.Succeeds(buildCacheVolume.Clear(context.TODO()))\n\t\t\t\t\tassert.Succeeds(launchCacheVolume.Clear(context.TODO()))\n\t\t\t\t})\n\n\t\t\t\twhen(\"daemon\", func() {\n\t\t\t\t\twhen(\"--run-image\", func() {\n\t\t\t\t\t\tvar runAfter string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\trunAfter = registryConfig.RepoName(\"run-after/\" + h.RandString(10))\n\t\t\t\t\t\t\tbuildRunImage(runAfter, \"contents-after-1\", 
\"contents-after-2\")\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\timageManager.CleanupImages(runAfter)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"uses provided run image\", func() {\n\t\t\t\t\t\t\targs := []string{\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"--run-image\", runAfter,\n\t\t\t\t\t\t\t\t\"--pull-policy\", \"never\",\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\tif pack.SupportsFeature(invoke.ForceRebase) {\n\t\t\t\t\t\t\t\targs = append(args, \"--force\")\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"rebase\", args...)\n\n\t\t\t\t\t\t\tassert.Contains(output, fmt.Sprintf(\"Successfully rebased image '%s'\", repoName))\n\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"contents-after-1\",\n\t\t\t\t\t\t\t\t\"contents-after-2\",\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"local config has a mirror\", func() {\n\t\t\t\t\t\tvar localRunImageMirror string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\timageManager.CleanupImages(repoName)\n\t\t\t\t\t\t\tpack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--builder\", builderName,\n\t\t\t\t\t\t\t\t\"--pull-policy\", \"never\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tlocalRunImageMirror = registryConfig.RepoName(\"run-after/\" + h.RandString(10))\n\t\t\t\t\t\t\tbuildRunImage(localRunImageMirror, \"local-mirror-after-1\", \"local-mirror-after-2\")\n\t\t\t\t\t\t\tpack.JustRunSuccessfully(\"config\", \"run-image-mirrors\", \"add\", runImage, \"-m\", localRunImageMirror)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\timageManager.CleanupImages(localRunImageMirror)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"prefers the local mirror\", func() {\n\t\t\t\t\t\t\targs := []string{repoName, \"--pull-policy\", \"never\"}\n\t\t\t\t\t\t\tif pack.SupportsFeature(invoke.ForceRebase) {\n\t\t\t\t\t\t\t\targs = append(args, 
\"--force\")\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"rebase\", args...)\n\n\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\tassertOutput.ReportsSelectingRunImageMirrorFromLocalConfig(localRunImageMirror)\n\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulRebase(repoName)\n\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"local-mirror-after-1\",\n\t\t\t\t\t\t\t\t\"local-mirror-after-2\",\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"image metadata has a mirror\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t// clean up existing mirror first to avoid leaking images\n\t\t\t\t\t\t\timageManager.CleanupImages(runImageMirror, repoName)\n\t\t\t\t\t\t\tbuildRunImage(runImageMirror, \"mirror-after-1\", \"mirror-after-2\")\n\n\t\t\t\t\t\t\tpack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"build\", repoName,\n\t\t\t\t\t\t\t\t\"-p\", filepath.Join(\"testdata\", \"mock_app\"),\n\t\t\t\t\t\t\t\t\"--builder\", builderName,\n\t\t\t\t\t\t\t\t\"--pull-policy\", \"never\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"selects the best mirror\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"rebase\", repoName, \"--pull-policy\", \"never\")\n\n\t\t\t\t\t\t\tassertOutput := assertions.NewOutputAssertionManager(t, output)\n\t\t\t\t\t\t\tassertOutput.ReportsSelectingRunImageMirror(runImageMirror)\n\t\t\t\t\t\t\tassertOutput.ReportsSuccessfulRebase(repoName)\n\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"mirror-after-1\",\n\t\t\t\t\t\t\t\t\"mirror-after-2\",\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"--publish\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tassert.Succeeds(h.PushImage(dockerCli, repoName, registryConfig))\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--run-image\", func() {\n\t\t\t\t\t\tvar runAfter string\n\n\t\t\t\t\t\tit.Before(func() 
{\n\t\t\t\t\t\t\trunAfter = registryConfig.RepoName(\"run-after/\" + h.RandString(10))\n\t\t\t\t\t\t\tbuildRunImage(runAfter, \"contents-after-1\", \"contents-after-2\")\n\t\t\t\t\t\t\tassert.Succeeds(h.PushImage(dockerCli, runAfter, registryConfig))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\timageManager.CleanupImages(runAfter)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"uses provided run image\", func() {\n\t\t\t\t\t\t\targs := []string{repoName, \"--publish\", \"--run-image\", runAfter}\n\t\t\t\t\t\t\tif pack.SupportsFeature(invoke.ForceRebase) {\n\t\t\t\t\t\t\t\targs = append(args, \"--force\")\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\"rebase\", args...)\n\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulRebase(repoName)\n\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(repoName)\n\t\t\t\t\t\t\tassertImage.RunsWithOutput(\n\t\t\t\t\t\t\t\trepoName,\n\t\t\t\t\t\t\t\t\"contents-after-1\",\n\t\t\t\t\t\t\t\t\"contents-after-2\",\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"multi-platform\", func() {\n\t\t\t\tvar (\n\t\t\t\t\ttmpDir                    string\n\t\t\t\t\tmultiArchBuildpackPackage string\n\t\t\t\t\tbuilderTomlPath           string\n\t\t\t\t\tremoteRunImage            string\n\t\t\t\t\tremoteBuildImage          string\n\t\t\t\t\terr                       error\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.SkipIf(t, !pack.SupportsFeature(invoke.MultiPlatformBuildersAndBuildPackages), \"multi-platform builders and buildpack packages are available since 0.34.0\")\n\n\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"multi-platform-builder-create-tests\")\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t// used to avoid authentication issues with the local registry\n\t\t\t\t\tos.Setenv(\"DOCKER_CONFIG\", registryConfig.DockerConfigDir)\n\n\t\t\t\t\t// create a multi-platform buildpack and push it to a registry\n\t\t\t\t\tmultiArchBuildpackPackage = 
registryConfig.RepoName(\"simple-multi-platform-buildpack\" + h.RandString(8))\n\t\t\t\t\tsourceDir := filepath.Join(\"testdata\", \"mock_buildpacks\")\n\t\t\t\t\tpath := filepath.Join(tmpDir, \"simple-layers-buildpack\")\n\t\t\t\t\terr = buildpacks.BpFolderSimpleLayers.Prepare(sourceDir, tmpDir)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\"buildpack\", \"package\", multiArchBuildpackPackage,\n\t\t\t\t\t\t\"--path\", path,\n\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\"--target\", \"linux/arm64\",\n\t\t\t\t\t\t\"--target\", \"windows/amd64\",\n\t\t\t\t\t)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsPackagePublished(multiArchBuildpackPackage)\n\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(multiArchBuildpackPackage)\n\t\t\t\t\th.AssertRemoteImageIndex(t, multiArchBuildpackPackage, types.OCIImageIndex, 3)\n\n\t\t\t\t\t// runImage and buildImage are saved in the daemon, for this test we want them to be available in a registry\n\t\t\t\t\tremoteRunImage = registryConfig.RepoName(runImage + h.RandString(8))\n\t\t\t\t\tremoteBuildImage = registryConfig.RepoName(buildImage + h.RandString(8))\n\n\t\t\t\t\timageManager.TagImage(runImage, remoteRunImage)\n\t\t\t\t\timageManager.TagImage(buildImage, remoteBuildImage)\n\n\t\t\t\t\th.AssertNil(t, h.PushImage(dockerCli, remoteRunImage, registryConfig))\n\t\t\t\t\th.AssertNil(t, h.PushImage(dockerCli, remoteBuildImage, registryConfig))\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\timageManager.CleanupImages(remoteBuildImage)\n\t\t\t\t\timageManager.CleanupImages(remoteRunImage)\n\t\t\t\t\tos.RemoveAll(tmpDir)\n\t\t\t\t})\n\n\t\t\t\tgenerateMultiPlatformBuilderToml := func(template, buildpackURI, buildImage, runImage string) string {\n\t\t\t\t\tt.Helper()\n\t\t\t\t\tbuildpackToml, err := os.CreateTemp(tmpDir, 
\"buildpack-*.toml\")\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tpack.FixtureManager().TemplateFixtureToFile(\n\t\t\t\t\t\ttemplate,\n\t\t\t\t\t\tbuildpackToml,\n\t\t\t\t\t\tmap[string]interface{}{\n\t\t\t\t\t\t\t\"BuildpackURI\": buildpackURI,\n\t\t\t\t\t\t\t\"BuildImage\":   buildImage,\n\t\t\t\t\t\t\t\"RunImage\":     runImage,\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\t\t\t\t\tassert.Nil(buildpackToml.Close())\n\t\t\t\t\treturn buildpackToml.Name()\n\t\t\t\t}\n\n\t\t\t\twhen(\"builder.toml has no targets but the user provides --target\", func() {\n\t\t\t\t\twhen(\"--publish\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tbuilderName = registryConfig.RepoName(\"remote-multi-platform-builder\" + h.RandString(8))\n\n\t\t\t\t\t\t\t// We need to configure our builder.toml with image references that point to our ephemeral registry\n\t\t\t\t\t\t\tbuilderTomlPath = generateMultiPlatformBuilderToml(\"builder_multi_platform-no-targets.toml\", multiArchBuildpackPackage, remoteBuildImage, remoteRunImage)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"publishes builder images for each requested target to the registry and creates an image index\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"builder\", \"create\", builderName,\n\t\t\t\t\t\t\t\t\"--config\", builderTomlPath,\n\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\t\t\"--target\", \"linux/arm64\",\n\t\t\t\t\t\t\t\t\"--target\", \"windows/amd64\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tdefer imageManager.CleanupImages(builderName)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsBuilderCreated(builderName)\n\n\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(builderName)\n\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(builderName)\n\t\t\t\t\t\t\th.AssertRemoteImageIndex(t, builderName, types.OCIImageIndex, 3)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--daemon\", func() 
{\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tbuilderName = registryConfig.RepoName(\"local-multi-platform-builder\" + h.RandString(8))\n\n\t\t\t\t\t\t\t// We need to configure our builder.toml with image references that point to our ephemeral registry\n\t\t\t\t\t\t\tbuilderTomlPath = generateMultiPlatformBuilderToml(\"builder_multi_platform-no-targets.toml\", multiArchBuildpackPackage, buildImage, runImage)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"publishes builder image to the daemon for the given target\", func() {\n\t\t\t\t\t\t\tplatform := \"linux/amd64\"\n\t\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" {\n\t\t\t\t\t\t\t\tplatform = \"windows/amd64\"\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"builder\", \"create\", builderName,\n\t\t\t\t\t\t\t\t\"--config\", builderTomlPath,\n\t\t\t\t\t\t\t\t\"--target\", platform,\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tdefer imageManager.CleanupImages(builderName)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsBuilderCreated(builderName)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"builder.toml has targets\", func() {\n\t\t\t\t\twhen(\"--publish\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tbuilderName = registryConfig.RepoName(\"remote-multi-platform-builder\" + h.RandString(8))\n\n\t\t\t\t\t\t\t// We need to configure our builder.toml with image references that point to our ephemeral registry\n\t\t\t\t\t\t\tbuilderTomlPath = generateMultiPlatformBuilderToml(\"builder_multi_platform.toml\", multiArchBuildpackPackage, remoteBuildImage, remoteRunImage)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"publishes builder images for each configured target to the registry and creates an image index\", func() {\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"builder\", \"create\", builderName,\n\t\t\t\t\t\t\t\t\"--config\", builderTomlPath,\n\t\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tdefer imageManager.CleanupImages(builderName)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsBuilderCreated(builderName)\n\n\t\t\t\t\t\t\tassertImage.CanBePulledFromRegistry(builderName)\n\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsSuccessfulIndexPushed(builderName)\n\t\t\t\t\t\t\th.AssertRemoteImageIndex(t, builderName, types.OCIImageIndex, 2)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--daemon\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tbuilderName = registryConfig.RepoName(\"local-multi-platform-builder\" + h.RandString(8))\n\n\t\t\t\t\t\t\t// We need to configure our builder.toml with image references that point to our ephemeral registry\n\t\t\t\t\t\t\tbuilderTomlPath = generateMultiPlatformBuilderToml(\"builder_multi_platform.toml\", multiArchBuildpackPackage, buildImage, runImage)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"publishes builder image to the daemon for the given target\", func() {\n\t\t\t\t\t\t\tplatform := \"linux/amd64\"\n\t\t\t\t\t\t\tif imageManager.HostOS() == \"windows\" {\n\t\t\t\t\t\t\t\tplatform = \"windows/amd64\"\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\toutput := pack.RunSuccessfully(\n\t\t\t\t\t\t\t\t\"builder\", \"create\", builderName,\n\t\t\t\t\t\t\t\t\"--config\", builderTomlPath,\n\t\t\t\t\t\t\t\t\"--target\", platform,\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tdefer imageManager.CleanupImages(builderName)\n\t\t\t\t\t\t\tassertions.NewOutputAssertionManager(t, output).ReportsBuilderCreated(builderName)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder create\", func() {\n\t\t\twhen(\"--flatten=<buildpacks>\", func() {\n\t\t\t\tit(\"should flatten together all specified buildpacks\", func() {\n\t\t\t\t\th.SkipIf(t, !createBuilderPack.SupportsFeature(invoke.FlattenBuilderCreationV2), \"pack version <= 0.33.0 fails with this test\")\n\t\t\t\t\th.SkipIf(t, imageManager.HostOS() == \"windows\", \"These tests are not yet compatible with Windows-based 
containers\")\n\n\t\t\t\t\t// create a task, handled by a 'task manager' which executes our pack commands during tests.\n\t\t\t\t\t// looks like this is used to de-dup tasks\n\t\t\t\t\tkey := taskKey(\n\t\t\t\t\t\t\"create-complex-flattened-builder\",\n\t\t\t\t\t\tappend(\n\t\t\t\t\t\t\t[]string{runImageMirror, createBuilderPackConfig.Path(), lifecycle.Identifier()},\n\t\t\t\t\t\t\tcreateBuilderPackConfig.FixturePaths()...,\n\t\t\t\t\t\t)...,\n\t\t\t\t\t)\n\n\t\t\t\t\tbuilderName, err := suiteManager.RunTaskOnceString(key, func() (string, error) {\n\t\t\t\t\t\treturn createFlattenBuilder(t,\n\t\t\t\t\t\t\tassert,\n\t\t\t\t\t\t\tbuildpackManager,\n\t\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\t\tcreateBuilderPack,\n\t\t\t\t\t\t\trunImageMirror)\n\t\t\t\t\t})\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t// register task to be run to 'clean up' a task\n\t\t\t\t\tsuiteManager.RegisterCleanUp(\"clean-\"+key, func() error {\n\t\t\t\t\t\timageManager.CleanupImages(builderName)\n\t\t\t\t\t\treturn nil\n\t\t\t\t\t})\n\n\t\t\t\t\tassertImage.ExistsLocally(builderName)\n\n\t\t\t\t\t// 3 layers for runtime OS\n\t\t\t\t\t// 1 layer setting cnb, platform, layers folders\n\t\t\t\t\t// 1 layer for lifecycle binaries\n\t\t\t\t\t// 1 layer for order.toml\n\t\t\t\t\t// 1 layer for run.toml\n\t\t\t\t\t// 1 layer for stack.toml\n\t\t\t\t\t// 1 layer status file changed\n\t\t\t\t\t// Base Layers = 9\n\n\t\t\t\t\t// 1 layer for 3 flattened buildpacks\n\t\t\t\t\t// 3 layers for single buildpacks not flattened\n\t\t\t\t\tassertImage.HasLengthLayers(builderName, 13)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc buildModulesDir() string {\n\treturn filepath.Join(\"testdata\", \"mock_buildpacks\")\n}\n\nfunc createComplexBuilder(t *testing.T,\n\tassert h.AssertionManager,\n\tpack *invoke.PackInvoker,\n\tlifecycle config.LifecycleAsset,\n\tbuildpackManager buildpacks.BuildModuleManager,\n\trunImageMirror string,\n) (string, error) {\n\n\tt.Log(\"creating complex builder image...\")\n\n\t// CREATE TEMP 
WORKING DIR\n\ttmpDir, err := os.MkdirTemp(\"\", \"create-complex-test-builder\")\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer os.RemoveAll(tmpDir)\n\n\t// ARCHIVE BUILDPACKS\n\tbuilderBuildpacks := []buildpacks.TestBuildModule{\n\t\tbuildpacks.BpNoop,\n\t\tbuildpacks.BpNoop2,\n\t\tbuildpacks.BpOtherStack,\n\t\tbuildpacks.BpReadEnv,\n\t}\n\n\ttemplateMapping := map[string]interface{}{\n\t\t\"run_image_mirror\": runImageMirror,\n\t}\n\n\tpackageImageName := registryConfig.RepoName(\"nested-level-1-buildpack-\" + h.RandString(8))\n\tnestedLevelTwoBuildpackName := registryConfig.RepoName(\"nested-level-2-buildpack-\" + h.RandString(8))\n\tsimpleLayersBuildpackName := registryConfig.RepoName(\"simple-layers-buildpack-\" + h.RandString(8))\n\tsimpleLayersBuildpackDifferentShaName := registryConfig.RepoName(\"simple-layers-buildpack-different-name-\" + h.RandString(8))\n\n\ttemplateMapping[\"package_id\"] = \"simple/nested-level-1\"\n\ttemplateMapping[\"package_image_name\"] = packageImageName\n\ttemplateMapping[\"nested_level_1_buildpack\"] = packageImageName\n\ttemplateMapping[\"nested_level_2_buildpack\"] = nestedLevelTwoBuildpackName\n\ttemplateMapping[\"simple_layers_buildpack\"] = simpleLayersBuildpackName\n\ttemplateMapping[\"simple_layers_buildpack_different_sha\"] = simpleLayersBuildpackDifferentShaName\n\n\tfixtureManager := pack.FixtureManager()\n\n\tnestedLevelOneConfigFile, err := os.CreateTemp(tmpDir, \"nested-level-1-package.toml\")\n\tassert.Nil(err)\n\tfixtureManager.TemplateFixtureToFile(\n\t\t\"nested-level-1-buildpack_package.toml\",\n\t\tnestedLevelOneConfigFile,\n\t\ttemplateMapping,\n\t)\n\terr = nestedLevelOneConfigFile.Close()\n\tassert.Nil(err)\n\n\tnestedLevelTwoConfigFile, err := os.CreateTemp(tmpDir, \"nested-level-2-package.toml\")\n\tassert.Nil(err)\n\tfixtureManager.TemplateFixtureToFile(\n\t\t\"nested-level-2-buildpack_package.toml\",\n\t\tnestedLevelTwoConfigFile,\n\t\ttemplateMapping,\n\t)\n\n\terr = 
nestedLevelTwoConfigFile.Close()\n\tassert.Nil(err)\n\n\tpackageImageBuildpack := buildpacks.NewPackageImage(\n\t\tt,\n\t\tpack,\n\t\tpackageImageName,\n\t\tnestedLevelOneConfigFile.Name(),\n\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\tbuildpacks.BpNestedLevelOne,\n\t\t\tbuildpacks.NewPackageImage(\n\t\t\t\tt,\n\t\t\t\tpack,\n\t\t\t\tnestedLevelTwoBuildpackName,\n\t\t\t\tnestedLevelTwoConfigFile.Name(),\n\t\t\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\t\t\tbuildpacks.BpNestedLevelTwo,\n\t\t\t\t\tbuildpacks.NewPackageImage(\n\t\t\t\t\t\tt,\n\t\t\t\t\t\tpack,\n\t\t\t\t\t\tsimpleLayersBuildpackName,\n\t\t\t\t\t\tfixtureManager.FixtureLocation(\"simple-layers-buildpack_package.toml\"),\n\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayers),\n\t\t\t\t\t),\n\t\t\t\t),\n\t\t\t),\n\t\t),\n\t)\n\n\tsimpleLayersDifferentShaBuildpack := buildpacks.NewPackageImage(\n\t\tt,\n\t\tpack,\n\t\tsimpleLayersBuildpackDifferentShaName,\n\t\tfixtureManager.FixtureLocation(\"simple-layers-buildpack-different-sha_package.toml\"),\n\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayersDifferentSha),\n\t)\n\n\tdefer imageManager.CleanupImages(packageImageName, nestedLevelTwoBuildpackName, simpleLayersBuildpackName, simpleLayersBuildpackDifferentShaName)\n\n\tbuilderBuildpacks = append(\n\t\tbuilderBuildpacks,\n\t\tpackageImageBuildpack,\n\t\tsimpleLayersDifferentShaBuildpack,\n\t)\n\n\tbuildpackManager.PrepareBuildModules(tmpDir, builderBuildpacks...)\n\n\t// ADD lifecycle\n\tif lifecycle.HasLocation() {\n\t\tlifecycleURI := lifecycle.EscapedPath()\n\t\tt.Logf(\"adding lifecycle path '%s' to builder config\", lifecycleURI)\n\t\ttemplateMapping[\"lifecycle_uri\"] = lifecycleURI\n\t} else {\n\t\tlifecycleVersion := lifecycle.Version()\n\t\tt.Logf(\"adding lifecycle version '%s' to builder config\", lifecycleVersion)\n\t\ttemplateMapping[\"lifecycle_version\"] = lifecycleVersion\n\t}\n\n\t// RENDER builder.toml\n\tbuilderConfigFile, err := os.CreateTemp(tmpDir, 
\"nested_builder.toml\")\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tpack.FixtureManager().TemplateFixtureToFile(\"nested_builder.toml\", builderConfigFile, templateMapping)\n\n\terr = builderConfigFile.Close()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\t// NAME BUILDER\n\tbldr := registryConfig.RepoName(\"test/builder-\" + h.RandString(10))\n\n\t// CREATE BUILDER\n\toutput := pack.RunSuccessfully(\n\t\t\"builder\", \"create\", bldr,\n\t\t\"-c\", builderConfigFile.Name(),\n\t\t\"--no-color\",\n\t)\n\n\tassert.Contains(output, fmt.Sprintf(\"Successfully created builder image '%s'\", bldr))\n\tassert.Succeeds(h.PushImage(dockerCli, bldr, registryConfig))\n\n\treturn bldr, nil\n}\n\nfunc createBuilder(\n\tt *testing.T,\n\tassert h.AssertionManager,\n\tpack *invoke.PackInvoker,\n\tlifecycle config.LifecycleAsset,\n\tbuildpackManager buildpacks.BuildModuleManager,\n\trunImageMirror string,\n) (string, error) {\n\tt.Log(\"creating builder image...\")\n\n\t// CREATE TEMP WORKING DIR\n\ttmpDir, err := os.MkdirTemp(\"\", \"create-test-builder\")\n\tassert.Nil(err)\n\tdefer os.RemoveAll(tmpDir)\n\n\ttemplateMapping := map[string]interface{}{\n\t\t\"run_image_mirror\": runImageMirror,\n\t}\n\n\t// ARCHIVE BUILDPACKS\n\tbuilderBuildpacks := []buildpacks.TestBuildModule{\n\t\tbuildpacks.BpNoop,\n\t\tbuildpacks.BpNoop2,\n\t\tbuildpacks.BpOtherStack,\n\t\tbuildpacks.BpReadEnv,\n\t}\n\n\tpackageTomlPath := generatePackageTomlWithOS(t, assert, pack, tmpDir, \"package.toml\", imageManager.HostOS())\n\tpackageImageName := registryConfig.RepoName(\"simple-layers-package-image-buildpack-\" + h.RandString(8))\n\n\tpackageImageBuildpack := buildpacks.NewPackageImage(\n\t\tt,\n\t\tpack,\n\t\tpackageImageName,\n\t\tpackageTomlPath,\n\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayers),\n\t)\n\n\tdefer imageManager.CleanupImages(packageImageName)\n\n\tbuilderBuildpacks = append(builderBuildpacks, packageImageBuildpack)\n\n\ttemplateMapping[\"package_image_name\"] = 
packageImageName\n\ttemplateMapping[\"package_id\"] = \"simple/layers\"\n\n\tbuildpackManager.PrepareBuildModules(tmpDir, builderBuildpacks...)\n\n\t// ADD lifecycle\n\tvar lifecycleURI string\n\tvar lifecycleVersion string\n\tif lifecycle.HasLocation() {\n\t\tlifecycleURI = lifecycle.EscapedPath()\n\t\tt.Logf(\"adding lifecycle path '%s' to builder config\", lifecycleURI)\n\t\ttemplateMapping[\"lifecycle_uri\"] = lifecycleURI\n\t} else {\n\t\tlifecycleVersion = lifecycle.Version()\n\t\tt.Logf(\"adding lifecycle version '%s' to builder config\", lifecycleVersion)\n\t\ttemplateMapping[\"lifecycle_version\"] = lifecycleVersion\n\t}\n\n\t// RENDER builder.toml\n\tconfigFileName := \"builder.toml\"\n\n\tbuilderConfigFile, err := os.CreateTemp(tmpDir, \"builder.toml\")\n\tassert.Nil(err)\n\n\tpack.FixtureManager().TemplateFixtureToFile(\n\t\tconfigFileName,\n\t\tbuilderConfigFile,\n\t\ttemplateMapping,\n\t)\n\n\terr = builderConfigFile.Close()\n\tassert.Nil(err)\n\n\t// NAME BUILDER\n\tbldr := registryConfig.RepoName(\"test/builder-\" + h.RandString(10))\n\n\t// CREATE BUILDER\n\toutput := pack.RunSuccessfully(\n\t\t\"builder\", \"create\", bldr,\n\t\t\"-c\", builderConfigFile.Name(),\n\t\t\"--no-color\",\n\t)\n\n\tassert.Contains(output, fmt.Sprintf(\"Successfully created builder image '%s'\", bldr))\n\tassert.Succeeds(h.PushImage(dockerCli, bldr, registryConfig))\n\n\treturn bldr, nil\n}\n\nfunc createBuilderWithExtensions(\n\tt *testing.T,\n\tassert h.AssertionManager,\n\tpack *invoke.PackInvoker,\n\tlifecycle config.LifecycleAsset,\n\tbuildpackManager buildpacks.BuildModuleManager,\n\trunImageMirror string,\n) (string, error) {\n\tt.Log(\"creating builder image with extensions...\")\n\n\t// CREATE TEMP WORKING DIR\n\ttmpDir, err := os.MkdirTemp(\"\", \"create-test-builder-extensions\")\n\tassert.Nil(err)\n\tdefer os.RemoveAll(tmpDir)\n\n\ttemplateMapping := map[string]interface{}{\n\t\t\"run_image_mirror\": runImageMirror,\n\t}\n\n\t// 
BUILDPACKS\n\tbuilderBuildpacks := []buildpacks.TestBuildModule{\n\t\tbuildpacks.BpReadEnv,            // archive buildpack\n\t\tbuildpacks.BpFolderSimpleLayers, // folder buildpack\n\t}\n\tbuildpackManager.PrepareBuildModules(tmpDir, builderBuildpacks...)\n\n\t// EXTENSIONS\n\tbuilderExtensions := []buildpacks.TestBuildModule{\n\t\tbuildpacks.ExtReadEnv,            // archive extension\n\t\tbuildpacks.ExtFolderSimpleLayers, // folder extension\n\t}\n\tbuildpackManager.PrepareBuildModules(tmpDir, builderExtensions...)\n\n\t// ADD lifecycle\n\tvar lifecycleURI string\n\tvar lifecycleVersion string\n\tif lifecycle.HasLocation() {\n\t\tlifecycleURI = lifecycle.EscapedPath()\n\t\tt.Logf(\"adding lifecycle path '%s' to builder config\", lifecycleURI)\n\t\ttemplateMapping[\"lifecycle_uri\"] = lifecycleURI\n\t} else {\n\t\tlifecycleVersion = lifecycle.Version()\n\t\tt.Logf(\"adding lifecycle version '%s' to builder config\", lifecycleVersion)\n\t\ttemplateMapping[\"lifecycle_version\"] = lifecycleVersion\n\t}\n\n\t// RENDER builder.toml\n\tconfigFileName := \"builder_extensions.toml\"\n\n\tbuilderConfigFile, err := os.CreateTemp(tmpDir, \"builder.toml\")\n\tassert.Nil(err)\n\n\tpack.FixtureManager().TemplateFixtureToFile(\n\t\tconfigFileName,\n\t\tbuilderConfigFile,\n\t\ttemplateMapping,\n\t)\n\n\terr = builderConfigFile.Close()\n\tassert.Nil(err)\n\n\t// NAME BUILDER\n\tbldr := registryConfig.RepoName(\"test/builder-\" + h.RandString(10))\n\n\t// SET EXPERIMENTAL\n\tpack.JustRunSuccessfully(\"config\", \"experimental\", \"true\")\n\n\t// CREATE BUILDER\n\toutput := pack.RunSuccessfully(\n\t\t\"builder\", \"create\", bldr,\n\t\t\"-c\", builderConfigFile.Name(),\n\t\t\"--no-color\",\n\t)\n\n\tassert.Contains(output, fmt.Sprintf(\"Successfully created builder image '%s'\", bldr))\n\tassert.Succeeds(h.PushImage(dockerCli, bldr, registryConfig))\n\n\treturn bldr, nil\n}\n\nfunc generatePackageTomlWithOS(\n\tt *testing.T,\n\tassert h.AssertionManager,\n\tpack 
*invoke.PackInvoker,\n\ttmpDir string,\n\tfixtureName string,\n\tplatform_os string,\n) string {\n\tt.Helper()\n\n\tpackageTomlFile, err := os.CreateTemp(tmpDir, \"package-*.toml\")\n\tassert.Nil(err)\n\n\tpack.FixtureManager().TemplateFixtureToFile(\n\t\tfixtureName,\n\t\tpackageTomlFile,\n\t\tmap[string]interface{}{\n\t\t\t\"OS\": platform_os,\n\t\t},\n\t)\n\n\tassert.Nil(packageTomlFile.Close())\n\n\treturn packageTomlFile.Name()\n}\n\nfunc createStack(t *testing.T, dockerCli *client.Client, runImageMirror string) error {\n\tt.Helper()\n\tt.Log(\"creating stack images...\")\n\n\tstackBaseDir := filepath.Join(\"testdata\", \"mock_stack\", imageManager.HostOS())\n\n\tif err := createStackImage(dockerCli, runImage, filepath.Join(stackBaseDir, \"run\")); err != nil {\n\t\treturn err\n\t}\n\tif err := createStackImage(dockerCli, buildImage, filepath.Join(stackBaseDir, \"build\")); err != nil {\n\t\treturn err\n\t}\n\n\timageManager.TagImage(runImage, runImageMirror)\n\tif err := h.PushImage(dockerCli, runImageMirror, registryConfig); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\nfunc createStackImage(dockerCli *client.Client, repoName string, dir string) error {\n\tdefaultFilterFunc := func(file string) bool { return true }\n\n\tctx := context.Background()\n\tbuildContext := archive.ReadDirAsTar(dir, \"/\", 0, 0, -1, true, false, defaultFilterFunc)\n\n\treturn h.CheckImageBuildResult(dockerCli.ImageBuild(ctx, buildContext, client.ImageBuildOptions{\n\t\tTags:        []string{repoName},\n\t\tRemove:      true,\n\t\tForceRemove: true,\n\t}))\n}\n\nfunc createFlattenBuilder(\n\tt *testing.T,\n\tassert h.AssertionManager,\n\tbuildpackManager buildpacks.BuildModuleManager,\n\tlifecycle config.LifecycleAsset,\n\tpack *invoke.PackInvoker,\n\trunImageMirror string,\n) (string, error) {\n\tt.Helper()\n\tt.Log(\"creating flattened builder image...\")\n\n\t// CREATE TEMP WORKING DIR\n\ttmpDir, err := os.MkdirTemp(\"\", \"create-complex-test-flattened-builder\")\n\tif 
err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer os.RemoveAll(tmpDir)\n\n\t// ARCHIVE BUILDPACKS\n\tbuilderBuildpacks := []buildpacks.TestBuildModule{\n\t\tbuildpacks.BpNoop,\n\t\tbuildpacks.BpNoop2,\n\t\tbuildpacks.BpOtherStack,\n\t\tbuildpacks.BpReadEnv,\n\t}\n\n\ttemplateMapping := map[string]interface{}{\n\t\t\"run_image_mirror\": runImageMirror,\n\t}\n\n\tpackageImageName := registryConfig.RepoName(\"nested-level-1-buildpack-\" + h.RandString(8))\n\tnestedLevelTwoBuildpackName := registryConfig.RepoName(\"nested-level-2-buildpack-\" + h.RandString(8))\n\tsimpleLayersBuildpackName := registryConfig.RepoName(\"simple-layers-buildpack-\" + h.RandString(8))\n\tsimpleLayersBuildpackDifferentShaName := registryConfig.RepoName(\"simple-layers-buildpack-different-name-\" + h.RandString(8))\n\n\ttemplateMapping[\"package_id\"] = \"simple/nested-level-1\"\n\ttemplateMapping[\"package_image_name\"] = packageImageName\n\ttemplateMapping[\"nested_level_1_buildpack\"] = packageImageName\n\ttemplateMapping[\"nested_level_2_buildpack\"] = nestedLevelTwoBuildpackName\n\ttemplateMapping[\"simple_layers_buildpack\"] = simpleLayersBuildpackName\n\ttemplateMapping[\"simple_layers_buildpack_different_sha\"] = simpleLayersBuildpackDifferentShaName\n\n\tfixtureManager := pack.FixtureManager()\n\n\tnestedLevelOneConfigFile, err := os.CreateTemp(tmpDir, \"nested-level-1-package.toml\")\n\tassert.Nil(err)\n\tfixtureManager.TemplateFixtureToFile(\n\t\t\"nested-level-1-buildpack_package.toml\",\n\t\tnestedLevelOneConfigFile,\n\t\ttemplateMapping,\n\t)\n\terr = nestedLevelOneConfigFile.Close()\n\tassert.Nil(err)\n\n\tnestedLevelTwoConfigFile, err := os.CreateTemp(tmpDir, \"nested-level-2-package.toml\")\n\tassert.Nil(err)\n\tfixtureManager.TemplateFixtureToFile(\n\t\t\"nested-level-2-buildpack_package.toml\",\n\t\tnestedLevelTwoConfigFile,\n\t\ttemplateMapping,\n\t)\n\n\terr = nestedLevelTwoConfigFile.Close()\n\tassert.Nil(err)\n\n\tpackageImageBuildpack := 
buildpacks.NewPackageImage(\n\t\tt,\n\t\tpack,\n\t\tpackageImageName,\n\t\tnestedLevelOneConfigFile.Name(),\n\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\tbuildpacks.BpNestedLevelOne,\n\t\t\tbuildpacks.NewPackageImage(\n\t\t\t\tt,\n\t\t\t\tpack,\n\t\t\t\tnestedLevelTwoBuildpackName,\n\t\t\t\tnestedLevelTwoConfigFile.Name(),\n\t\t\t\tbuildpacks.WithRequiredBuildpacks(\n\t\t\t\t\tbuildpacks.BpNestedLevelTwo,\n\t\t\t\t\tbuildpacks.NewPackageImage(\n\t\t\t\t\t\tt,\n\t\t\t\t\t\tpack,\n\t\t\t\t\t\tsimpleLayersBuildpackName,\n\t\t\t\t\t\tfixtureManager.FixtureLocation(\"simple-layers-buildpack_package.toml\"),\n\t\t\t\t\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayers),\n\t\t\t\t\t),\n\t\t\t\t),\n\t\t\t),\n\t\t),\n\t)\n\n\tsimpleLayersDifferentShaBuildpack := buildpacks.NewPackageImage(\n\t\tt,\n\t\tpack,\n\t\tsimpleLayersBuildpackDifferentShaName,\n\t\tfixtureManager.FixtureLocation(\"simple-layers-buildpack-different-sha_package.toml\"),\n\t\tbuildpacks.WithRequiredBuildpacks(buildpacks.BpSimpleLayersDifferentSha),\n\t)\n\n\tdefer imageManager.CleanupImages(packageImageName, nestedLevelTwoBuildpackName, simpleLayersBuildpackName, simpleLayersBuildpackDifferentShaName)\n\n\tbuilderBuildpacks = append(\n\t\tbuilderBuildpacks,\n\t\tpackageImageBuildpack,\n\t\tsimpleLayersDifferentShaBuildpack,\n\t)\n\n\tbuildpackManager.PrepareBuildModules(tmpDir, builderBuildpacks...)\n\n\t// ADD lifecycle\n\tif lifecycle.HasLocation() {\n\t\tlifecycleURI := lifecycle.EscapedPath()\n\t\tt.Logf(\"adding lifecycle path '%s' to builder config\", lifecycleURI)\n\t\ttemplateMapping[\"lifecycle_uri\"] = lifecycleURI\n\t} else {\n\t\tlifecycleVersion := lifecycle.Version()\n\t\tt.Logf(\"adding lifecycle version '%s' to builder config\", lifecycleVersion)\n\t\ttemplateMapping[\"lifecycle_version\"] = lifecycleVersion\n\t}\n\n\t// RENDER builder.toml\n\tbuilderConfigFile, err := os.CreateTemp(tmpDir, \"nested_builder.toml\")\n\tif err != nil {\n\t\treturn \"\", 
err\n\t}\n\n\tpack.FixtureManager().TemplateFixtureToFile(\"nested_builder.toml\", builderConfigFile, templateMapping)\n\n\terr = builderConfigFile.Close()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\t// NAME BUILDER\n\tbldr := registryConfig.RepoName(\"test/flatten-builder-\" + h.RandString(10))\n\n\t// CREATE BUILDER\n\toutput := pack.RunSuccessfully(\n\t\t\"builder\", \"create\", bldr,\n\t\t\"-c\", builderConfigFile.Name(),\n\t\t\"--no-color\",\n\t\t\"--verbose\",\n\t\t\"--flatten\", \"read/env@read-env-version,noop.buildpack@noop.buildpack.version,noop.buildpack@noop.buildpack.later-version\",\n\t)\n\n\tassert.Contains(output, fmt.Sprintf(\"Successfully created builder image '%s'\", bldr))\n\tassert.Succeeds(h.PushImage(dockerCli, bldr, registryConfig))\n\n\treturn bldr, nil\n}\n\n// taskKey creates a key from the prefix and all arguments to be unique\nfunc taskKey(prefix string, args ...string) string {\n\thash := sha256.New()\n\tfor _, v := range args {\n\t\thash.Write([]byte(v))\n\t}\n\treturn fmt.Sprintf(\"%s-%s\", prefix, hex.EncodeToString(hash.Sum(nil)))\n}\n\ntype compareFormat struct {\n\textension   string\n\tcompareFunc func(string, string)\n\toutputArg   string\n}\n"
  },
  {
    "path": "acceptance/assertions/image.go",
    "content": "//go:build acceptance\n\npackage assertions\n\nimport (\n\t\"fmt\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/buildpacks/pack/acceptance/managers\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype ImageAssertionManager struct {\n\ttestObject   *testing.T\n\tassert       h.AssertionManager\n\timageManager managers.ImageManager\n\tregistry     *h.TestRegistryConfig\n}\n\nfunc NewImageAssertionManager(t *testing.T, imageManager managers.ImageManager, registry *h.TestRegistryConfig) ImageAssertionManager {\n\treturn ImageAssertionManager{\n\t\ttestObject:   t,\n\t\tassert:       h.NewAssertionManager(t),\n\t\timageManager: imageManager,\n\t\tregistry:     registry,\n\t}\n}\n\nfunc (a ImageAssertionManager) ExistsLocally(name string) {\n\ta.testObject.Helper()\n\t_, err := a.imageManager.InspectLocal(name)\n\ta.assert.Nil(err)\n}\n\nfunc (a ImageAssertionManager) NotExistsLocally(name string) {\n\ta.testObject.Helper()\n\t_, err := a.imageManager.InspectLocal(name)\n\ta.assert.ErrorContains(err, \"No such image\")\n}\n\nfunc (a ImageAssertionManager) HasBaseImage(image, base string) {\n\ta.testObject.Helper()\n\timageInspect, err := a.imageManager.InspectLocal(image)\n\ta.assert.Nil(err)\n\tbaseInspect, err := a.imageManager.InspectLocal(base)\n\ta.assert.Nil(err)\n\tfor i, layer := range baseInspect.RootFS.Layers {\n\t\ta.assert.Equal(imageInspect.RootFS.Layers[i], layer)\n\t}\n}\n\nfunc (a ImageAssertionManager) HasCreateTime(image string, expectedTime time.Time) {\n\ta.testObject.Helper()\n\tinspect, err := a.imageManager.InspectLocal(image)\n\ta.assert.Nil(err)\n\tactualTime, err := time.Parse(\"2006-01-02T15:04:05Z\", inspect.Created)\n\ta.assert.Nil(err)\n\ta.assert.TrueWithMessage(actualTime.Sub(expectedTime) < 5*time.Second && expectedTime.Sub(actualTime) < 5*time.Second, fmt.Sprintf(\"expected image create time %s to match expected time %s\", actualTime, expectedTime))\n}\n\nfunc (a ImageAssertionManager) HasLabelContaining(image, 
label, data string) {\n\ta.testObject.Helper()\n\tinspect, err := a.imageManager.InspectLocal(image)\n\ta.assert.Nil(err)\n\tvalue, ok := inspect.Config.Labels[label]\n\ta.assert.TrueWithMessage(ok, fmt.Sprintf(\"expected label %s to exist\", label))\n\ta.assert.Contains(value, data)\n}\n\nfunc (a ImageAssertionManager) HasLabelNotContaining(image, label, data string) {\n\ta.testObject.Helper()\n\tinspect, err := a.imageManager.InspectLocal(image)\n\ta.assert.Nil(err)\n\tvalue, ok := inspect.Config.Labels[label]\n\ta.assert.TrueWithMessage(ok, fmt.Sprintf(\"expected label %s to exist\", label))\n\ta.assert.NotContains(value, data)\n}\n\nfunc (a ImageAssertionManager) HasLengthLayers(image string, length int) {\n\ta.testObject.Helper()\n\tinspect, err := a.imageManager.InspectLocal(image)\n\ta.assert.Nil(err)\n\ta.assert.TrueWithMessage(len(inspect.RootFS.Layers) == length, fmt.Sprintf(\"expected image to have %d layers, found %d\", length, len(inspect.RootFS.Layers)))\n}\n\nfunc (a ImageAssertionManager) RunsWithOutput(image string, expectedOutputs ...string) {\n\ta.testObject.Helper()\n\tcontainerName := \"test-\" + h.RandString(10)\n\tcontainer := a.imageManager.ExposePortOnImage(image, containerName)\n\tdefer container.Cleanup()\n\n\toutput := container.WaitForResponse(managers.DefaultDuration)\n\ta.assert.ContainsAll(output, expectedOutputs...)\n}\n\nfunc (a ImageAssertionManager) RunsWithLogs(image string, expectedOutputs ...string) {\n\ta.testObject.Helper()\n\tcontainer := a.imageManager.CreateContainer(image)\n\tdefer container.Cleanup()\n\n\toutput := container.RunWithOutput()\n\ta.assert.ContainsAll(output, expectedOutputs...)\n}\n\nfunc (a ImageAssertionManager) CanBePulledFromRegistry(name string) {\n\ta.testObject.Helper()\n\ta.imageManager.PullImage(name, a.registry.RegistryAuth())\n\ta.ExistsLocally(name)\n}\n\nfunc (a ImageAssertionManager) ExistsInRegistryCatalog(name string) {\n\ta.testObject.Helper()\n\tcontents, err := 
a.registry.RegistryCatalog()\n\ta.assert.Nil(err)\n\ta.assert.ContainsWithMessage(contents, name, fmt.Sprintf(\"Expected to see image %s in %%s\", name))\n}\n\nfunc (a ImageAssertionManager) NotExistsInRegistry(name string) {\n\ta.testObject.Helper()\n\tcontents, err := a.registry.RegistryCatalog()\n\ta.assert.Nil(err)\n\ta.assert.NotContainWithMessage(\n\t\tcontents,\n\t\tname,\n\t\t\"Didn't expect to see image %s in the registry\",\n\t)\n}\n\nfunc (a ImageAssertionManager) DoesNotHaveDuplicateLayers(name string) {\n\ta.testObject.Helper()\n\n\tout, err := a.imageManager.InspectLocal(name)\n\ta.assert.Nil(err)\n\n\tlayerSet := map[string]interface{}{}\n\tfor _, layer := range out.RootFS.Layers {\n\t\t_, ok := layerSet[layer]\n\t\tif ok {\n\t\t\ta.testObject.Fatalf(\"duplicate layer found in builder %s\", layer)\n\t\t}\n\t\tlayerSet[layer] = true\n\t}\n}\n"
  },
  {
    "path": "acceptance/assertions/lifecycle_output.go",
    "content": "//go:build acceptance\n\npackage assertions\n\nimport (\n\t\"fmt\"\n\t\"regexp\"\n\t\"strings\"\n\t\"testing\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype LifecycleOutputAssertionManager struct {\n\ttestObject *testing.T\n\tassert     h.AssertionManager\n\toutput     string\n}\n\nfunc NewLifecycleOutputAssertionManager(t *testing.T, output string) LifecycleOutputAssertionManager {\n\treturn LifecycleOutputAssertionManager{\n\t\ttestObject: t,\n\t\tassert:     h.NewAssertionManager(t),\n\t\toutput:     output,\n\t}\n}\n\nfunc (l LifecycleOutputAssertionManager) ReportsRestoresCachedLayer(layer string) {\n\tl.testObject.Helper()\n\tl.testObject.Log(\"restores the cache\")\n\n\tl.assert.MatchesAll(\n\t\tl.output,\n\t\tregexp.MustCompile(fmt.Sprintf(`(?i)Restoring data for \"%s\" from cache`, layer)),\n\t\tregexp.MustCompile(fmt.Sprintf(`(?i)Restoring metadata for \"%s\" from app image`, layer)),\n\t)\n}\n\nfunc (l LifecycleOutputAssertionManager) ReportsExporterReusingUnchangedLayer(layer string) {\n\tl.testObject.Helper()\n\tl.testObject.Log(\"exporter reuses unchanged layers\")\n\n\tl.assert.Matches(l.output, regexp.MustCompile(fmt.Sprintf(`(?i)Reusing layer '%s'`, layer)))\n}\n\nfunc (l LifecycleOutputAssertionManager) ReportsCacheReuse(layer string) {\n\tl.testObject.Helper()\n\tl.testObject.Log(\"cacher reuses unchanged layers\")\n\n\tl.assert.Matches(l.output, regexp.MustCompile(fmt.Sprintf(`(?i)Reusing cache layer '%s'`, layer)))\n}\n\nfunc (l LifecycleOutputAssertionManager) ReportsCacheCreation(layer string) {\n\tl.testObject.Helper()\n\tl.testObject.Log(\"cacher adds layers\")\n\n\tl.assert.Matches(l.output, regexp.MustCompile(fmt.Sprintf(`(?i)Adding cache layer '%s'`, layer)))\n}\n\nfunc (l LifecycleOutputAssertionManager) ReportsSkippingBuildpackLayerAnalysis() {\n\tl.testObject.Helper()\n\tl.testObject.Log(\"skips buildpack layer analysis\")\n\n\tl.assert.Matches(l.output, regexp.MustCompile(`(?i)Skipping buildpack layer 
analysis`))\n}\n\nfunc (l LifecycleOutputAssertionManager) IncludesSeparatePhases() {\n\tl.testObject.Helper()\n\n\tl.assert.ContainsAll(l.output, \"[detector]\", \"[analyzer]\", \"[builder]\", \"[exporter]\")\n}\n\nfunc (l LifecycleOutputAssertionManager) IncludesSeparatePhasesWithBuildExtension() {\n\tl.testObject.Helper()\n\n\t// Earlier pack versions print `[extender]`, later pack versions print `[extender (build)]`.\n\t// Removing the `]` for the extend phase allows us to navigate compat suite complexity without undue headache.\n\t// When the previous pack version is old enough to be dropped from the suite, we can make the matcher more precise.\n\tl.assert.ContainsAll(l.output, \"[detector]\", \"[analyzer]\", \"[extender\", \"[exporter]\")\n}\n\nfunc (l LifecycleOutputAssertionManager) IncludesSeparatePhasesWithRunExtension() {\n\tl.testObject.Helper()\n\n\tl.assert.ContainsAll(l.output, \"[detector]\", \"[analyzer]\", \"[extender (run)]\", \"[exporter]\")\n}\n\nfunc (l LifecycleOutputAssertionManager) IncludesTagOrEphemeralLifecycle(tag string) {\n\tl.testObject.Helper()\n\n\tif !strings.Contains(l.output, tag) {\n\t\tif !strings.Contains(l.output, \"pack.local/lifecycle\") {\n\t\t\tl.testObject.Fatalf(\"Unable to locate reference to lifecycle image within output\")\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "acceptance/assertions/output.go",
    "content": "//go:build acceptance\n\npackage assertions\n\nimport (\n\t\"fmt\"\n\t\"regexp\"\n\t\"strings\"\n\t\"testing\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype OutputAssertionManager struct {\n\ttestObject *testing.T\n\tassert     h.AssertionManager\n\toutput     string\n}\n\nfunc NewOutputAssertionManager(t *testing.T, output string) OutputAssertionManager {\n\treturn OutputAssertionManager{\n\t\ttestObject: t,\n\t\tassert:     h.NewAssertionManager(t),\n\t\toutput:     output,\n\t}\n}\n\nfunc (o OutputAssertionManager) ReportsSuccessfulImageBuild(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully built image '%s'\", name)\n}\n\nfunc (o OutputAssertionManager) ReportsSuccessfulIndexLocallyCreated(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully created manifest list '%s'\", name)\n}\n\nfunc (o OutputAssertionManager) ReportsSuccessfulIndexPushed(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully pushed manifest list '%s' to registry\", name)\n}\n\nfunc (o OutputAssertionManager) ReportsSuccessfulManifestAddedToIndex(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully added image '%s' to index\", name)\n}\n\nfunc (o OutputAssertionManager) ReportsSuccessfulIndexDeleted() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"Successfully deleted manifest list(s) from local storage\")\n}\n\nfunc (o OutputAssertionManager) ReportsSuccessfulIndexAnnotated(name, manifest string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully annotated image '%s' in index '%s'\", name, manifest)\n}\n\nfunc (o OutputAssertionManager) ReportsSuccessfulRemoveManifestFromIndex(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully removed image(s) from index: '%s'\", name)\n}\n\nfunc (o OutputAssertionManager) ReportSuccessfulQuietBuild(name 
string) {\n\to.testObject.Helper()\n\to.testObject.Log(\"quiet mode\")\n\n\to.assert.Matches(strings.TrimSpace(o.output), regexp.MustCompile(name+`@sha256:[\\w]{12,64}`))\n}\n\nfunc (o OutputAssertionManager) ReportsSuccessfulRebase(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully rebased image '%s'\", name)\n}\n\nfunc (o OutputAssertionManager) ReportsUsingBuildCacheVolume() {\n\to.testObject.Helper()\n\n\to.testObject.Log(\"uses a build cache volume\")\n\to.assert.Contains(o.output, \"Using build cache volume\")\n}\n\nfunc (o OutputAssertionManager) ReportsSelectingRunImageMirror(mirror string) {\n\to.testObject.Helper()\n\n\to.testObject.Log(\"selects expected run image mirror\")\n\to.assert.ContainsF(o.output, \"Selected run image mirror '%s'\", mirror)\n}\n\nfunc (o OutputAssertionManager) ReportsSelectingRunImageMirrorFromLocalConfig(mirror string) {\n\to.testObject.Helper()\n\n\to.testObject.Log(\"local run-image mirror is selected\")\n\to.assert.ContainsF(o.output, \"Selected run image mirror '%s' from local config\", mirror)\n}\n\nfunc (o OutputAssertionManager) ReportsSkippingRestore() {\n\to.testObject.Helper()\n\to.testObject.Log(\"skips restore\")\n\n\to.assert.Contains(o.output, \"Skipping 'restore' due to clearing cache\")\n}\n\nfunc (o OutputAssertionManager) ReportsRunImageStackNotMatchingBuilder(runImageStack, builderStack string) {\n\to.testObject.Helper()\n\n\to.assert.Contains(\n\t\to.output,\n\t\tfmt.Sprintf(\"run-image stack id '%s' does not match builder stack '%s'\", runImageStack, builderStack),\n\t)\n}\n\nfunc (o OutputAssertionManager) ReportsDeprecatedUseOfStack() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"Warning: deprecated usage of stack\")\n}\n\nfunc (o OutputAssertionManager) WithoutColors() {\n\to.testObject.Helper()\n\to.testObject.Log(\"has no color\")\n\n\to.assert.NoMatches(o.output, regexp.MustCompile(`\\x1b\\[[0-9;]*m`))\n}\n\nfunc (o OutputAssertionManager) 
ReportsAddingBuildpack(name, version string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Adding buildpack '%s' version '%s' to builder\", name, version)\n}\n\nfunc (o OutputAssertionManager) ReportsPullingImage(image string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Pulling image '%s'\", image)\n}\n\nfunc (o OutputAssertionManager) ReportsImageNotExistingOnDaemon(image string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"image '%s' does not exist on the daemon\", image)\n}\n\nfunc (o OutputAssertionManager) ReportsPackageCreation(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully created package '%s'\", name)\n}\n\nfunc (o OutputAssertionManager) ReportsInvalidExtension(extension string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"'%s' is not a valid extension for a packaged buildpack. Packaged buildpacks must have a '.cnb' extension\", extension)\n}\n\nfunc (o OutputAssertionManager) ReportsPackagePublished(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully published package '%s'\", name)\n}\n\nfunc (o OutputAssertionManager) ReportsCommandUnknown(command string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, `unknown command \"%s\" for \"pack\"`, command)\n}\n\nfunc (o OutputAssertionManager) IncludesUsagePrompt() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"Run 'pack --help' for usage.\")\n}\n\nfunc (o OutputAssertionManager) ReportsBuilderCreated(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Successfully created builder image '%s'\", name)\n}\n\nfunc (o OutputAssertionManager) ReportsSettingDefaultBuilder(name string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Builder '%s' is now the default builder\", name)\n}\n\nfunc (o OutputAssertionManager) IncludesSuggestedBuildersHeading() 
{\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"Suggested builders:\")\n}\n\nfunc (o OutputAssertionManager) IncludesMessageToSetDefaultBuilder() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"Please select a default builder with:\")\n}\n\nfunc (o OutputAssertionManager) IncludesSuggestedStacksHeading() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"Stacks maintained by the community:\")\n}\n\nfunc (o OutputAssertionManager) IncludesTrustedBuildersHeading() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"Trusted Builders:\")\n}\n\nconst googleBuilder = \"gcr.io/buildpacks/builder:google-22\"\n\nfunc (o OutputAssertionManager) IncludesGoogleBuilder() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, googleBuilder)\n}\n\nfunc (o OutputAssertionManager) IncludesPrefixedGoogleBuilder() {\n\to.testObject.Helper()\n\n\to.assert.Matches(o.output, regexp.MustCompile(fmt.Sprintf(`Google:\\s+'%s'`, googleBuilder)))\n}\n\nvar herokuBuilders = []string{\n\t\"heroku/builder:24\",\n}\n\nfunc (o OutputAssertionManager) IncludesHerokuBuilders() {\n\to.testObject.Helper()\n\n\to.assert.ContainsAll(o.output, herokuBuilders...)\n}\n\nfunc (o OutputAssertionManager) IncludesPrefixedHerokuBuilders() {\n\to.testObject.Helper()\n\n\tfor _, builder := range herokuBuilders {\n\t\to.assert.Matches(o.output, regexp.MustCompile(fmt.Sprintf(`Heroku:\\s+'%s'`, builder)))\n\t}\n}\n\nvar paketoBuilders = []string{\n\t\"paketobuildpacks/builder-jammy-base\",\n\t\"paketobuildpacks/builder-jammy-full\",\n\t\"paketobuildpacks/builder-jammy-tiny\",\n}\n\nfunc (o OutputAssertionManager) IncludesPaketoBuilders() {\n\to.testObject.Helper()\n\n\to.assert.ContainsAll(o.output, paketoBuilders...)\n}\n\nfunc (o OutputAssertionManager) IncludesPrefixedPaketoBuilders() {\n\to.testObject.Helper()\n\n\tfor _, builder := range paketoBuilders {\n\t\to.assert.Matches(o.output, regexp.MustCompile(fmt.Sprintf(`Paketo Buildpacks:\\s+'%s'`, 
builder)))\n\t}\n}\n\nfunc (o OutputAssertionManager) IncludesDeprecationWarning() {\n\to.testObject.Helper()\n\n\to.assert.Matches(o.output, regexp.MustCompile(`Warning: Command 'pack [\w-]+' has been deprecated, please use 'pack [\w-\s]+' instead`))\n}\n\nfunc (o OutputAssertionManager) ReportsSuccesfulRunImageMirrorsAdd(image, mirror string) {\n\to.testObject.Helper()\n\n\to.assert.ContainsF(o.output, \"Run Image '%s' configured with mirror '%s'\\n\", image, mirror)\n}\n\nfunc (o OutputAssertionManager) ReportsReadingConfig() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"reading config\")\n}\n\nfunc (o OutputAssertionManager) ReportsInvalidBuilderToml() {\n\to.testObject.Helper()\n\n\to.assert.Contains(o.output, \"invalid builder toml\")\n}\n"
  },
  {
    "path": "acceptance/assertions/test_buildpack_output.go",
    "content": "//go:build acceptance\n\npackage assertions\n\nimport (\n\t\"testing\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype TestBuildpackOutputAssertionManager struct {\n\ttestObject *testing.T\n\tassert     h.AssertionManager\n\toutput     string\n}\n\nfunc NewTestBuildpackOutputAssertionManager(t *testing.T, output string) TestBuildpackOutputAssertionManager {\n\treturn TestBuildpackOutputAssertionManager{\n\t\ttestObject: t,\n\t\tassert:     h.NewAssertionManager(t),\n\t\toutput:     output,\n\t}\n}\n\nfunc (t TestBuildpackOutputAssertionManager) ReportsReadingFileContents(phase, path, content string) {\n\tt.testObject.Helper()\n\n\tt.assert.ContainsF(t.output, \"%s: Reading file '%s': %s\", phase, path, content)\n}\n\nfunc (t TestBuildpackOutputAssertionManager) ReportsWritingFileContents(phase, path string) {\n\tt.testObject.Helper()\n\n\tt.assert.ContainsF(t.output, \"%s: Writing file '%s': written\", phase, path)\n}\n\nfunc (t TestBuildpackOutputAssertionManager) ReportsFailingToWriteFileContents(phase, path string) {\n\tt.testObject.Helper()\n\n\tt.assert.ContainsF(t.output, \"%s: Writing file '%s': failed\", phase, path)\n}\n\nfunc (t TestBuildpackOutputAssertionManager) ReportsConnectedToInternet() {\n\tt.testObject.Helper()\n\n\tt.assert.Contains(t.output, \"RESULT: Connected to the internet\")\n}\n\nfunc (t TestBuildpackOutputAssertionManager) ReportsDisconnectedFromInternet() {\n\tt.testObject.Helper()\n\n\tt.assert.Contains(t.output, \"RESULT: Disconnected from the internet\")\n}\n\nfunc (t TestBuildpackOutputAssertionManager) ReportsBuildStep(message string) {\n\tt.testObject.Helper()\n\n\tt.assert.ContainsF(t.output, \"Build: %s\", message)\n}\n"
  },
  {
    "path": "acceptance/buildpacks/archive_buildpack.go",
    "content": "//go:build acceptance\n\npackage buildpacks\n\nimport (\n\t\"archive/tar\"\n\t\"compress/gzip\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\n\t\"github.com/pkg/errors\"\n)\n\nconst (\n\tdefaultBasePath = \"./\"\n\tdefaultUid      = 0\n\tdefaultGid      = 0\n\tdefaultMode     = 0755\n)\n\ntype archiveBuildModule struct {\n\tname string\n}\n\nfunc (a archiveBuildModule) Prepare(sourceDir, destination string) error {\n\tlocation, err := a.createTgz(sourceDir)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"creating archive for build module %s\", a)\n\t}\n\n\terr = os.Rename(location, filepath.Join(destination, a.FileName()))\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"renaming temporary archive for build module %s\", a)\n\t}\n\n\treturn nil\n}\n\nfunc (a archiveBuildModule) FileName() string {\n\treturn fmt.Sprintf(\"%s.tgz\", a)\n}\n\nfunc (a archiveBuildModule) String() string {\n\treturn a.name\n}\n\nfunc (a archiveBuildModule) FullPathIn(parentFolder string) string {\n\treturn filepath.Join(parentFolder, a.FileName())\n}\n\nfunc (a archiveBuildModule) createTgz(sourceDir string) (string, error) {\n\ttempFile, err := os.CreateTemp(\"\", \"*.tgz\")\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"creating temporary archive\")\n\t}\n\tdefer tempFile.Close()\n\n\tgZipper := gzip.NewWriter(tempFile)\n\tdefer gZipper.Close()\n\n\ttarWriter := tar.NewWriter(gZipper)\n\tdefer tarWriter.Close()\n\n\tarchiveSource := filepath.Join(sourceDir, a.name)\n\terr = archive.WriteDirToTar(\n\t\ttarWriter,\n\t\tarchiveSource,\n\t\tdefaultBasePath,\n\t\tdefaultUid,\n\t\tdefaultGid,\n\t\tdefaultMode,\n\t\ttrue,\n\t\tfalse,\n\t\tnil,\n\t)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"writing to temporary archive\")\n\t}\n\n\treturn tempFile.Name(), nil\n}\n\nvar (\n\tBpSimpleLayersParent       = &archiveBuildModule{name: \"simple-layers-parent-buildpack\"}\n\tBpSimpleLayers             = 
&archiveBuildModule{name: \"simple-layers-buildpack\"}\n\tBpSimpleLayersDifferentSha = &archiveBuildModule{name: \"simple-layers-buildpack-different-sha\"}\n\tBpInternetCapable          = &archiveBuildModule{name: \"internet-capable-buildpack\"}\n\tBpReadVolume               = &archiveBuildModule{name: \"read-volume-buildpack\"}\n\tBpReadWriteVolume          = &archiveBuildModule{name: \"read-write-volume-buildpack\"}\n\tBpArchiveNotInBuilder      = &archiveBuildModule{name: \"not-in-builder-buildpack\"}\n\tBpNoop                     = &archiveBuildModule{name: \"noop-buildpack\"}\n\tBpNoop2                    = &archiveBuildModule{name: \"noop-buildpack-2\"}\n\tBpOtherStack               = &archiveBuildModule{name: \"other-stack-buildpack\"}\n\tBpReadEnv                  = &archiveBuildModule{name: \"read-env-buildpack\"}\n\tBpNestedLevelOne           = &archiveBuildModule{name: \"nested-level-1-buildpack\"}\n\tBpNestedLevelTwo           = &archiveBuildModule{name: \"nested-level-2-buildpack\"}\n\tExtReadEnv                 = &archiveBuildModule{name: \"read-env-extension\"}\n)\n"
  },
  {
    "path": "acceptance/buildpacks/folder_buildpack.go",
    "content": "//go:build acceptance\n\npackage buildpacks\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype folderBuildModule struct {\n\tname string\n}\n\nfunc (f folderBuildModule) Prepare(sourceDir, destination string) error {\n\tsourceBuildpack := filepath.Join(sourceDir, f.name)\n\tinfo, err := os.Stat(sourceBuildpack)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"retrieving folder info for folder: %s: %w\", sourceBuildpack, err)\n\t}\n\n\tdestinationBuildpack := filepath.Join(destination, f.name)\n\terr = os.Mkdir(filepath.Join(destinationBuildpack), info.Mode())\n\tif err != nil {\n\t\treturn fmt.Errorf(\"creating temp folder in: %s: %w\", destinationBuildpack, err)\n\t}\n\n\terr = h.RecursiveCopyE(filepath.Join(sourceDir, f.name), destinationBuildpack)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"copying folder build module %s: %w\", f.name, err)\n\t}\n\n\treturn nil\n}\n\nfunc (f folderBuildModule) FullPathIn(parentFolder string) string {\n\treturn filepath.Join(parentFolder, f.name)\n}\n\nvar (\n\tBpFolderNotInBuilder       = folderBuildModule{name: \"not-in-builder-buildpack\"}\n\tBpFolderSimpleLayersParent = folderBuildModule{name: \"simple-layers-parent-buildpack\"}\n\tBpFolderSimpleLayers       = folderBuildModule{name: \"simple-layers-buildpack\"}\n\tExtFolderSimpleLayers      = folderBuildModule{name: \"simple-layers-extension\"}\n\tMetaBpFolder               = folderBuildModule{name: \"meta-buildpack\"}\n\tMetaBpDependency           = folderBuildModule{name: \"meta-buildpack-dependency\"}\n\tMultiPlatformFolderBP      = folderBuildModule{name: \"multi-platform-buildpack\"}\n)\n"
  },
  {
    "path": "acceptance/buildpacks/manager.go",
    "content": "//go:build acceptance\n\npackage buildpacks\n\nimport (\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype BuildModuleManager struct {\n\ttestObject *testing.T\n\tassert     testhelpers.AssertionManager\n\tsourceDir  string\n}\n\ntype BuildModuleManagerModifier func(b *BuildModuleManager)\n\nfunc NewBuildModuleManager(t *testing.T, assert testhelpers.AssertionManager, modifiers ...BuildModuleManagerModifier) BuildModuleManager {\n\tm := BuildModuleManager{\n\t\ttestObject: t,\n\t\tassert:     assert,\n\t\tsourceDir:  filepath.Join(\"testdata\", \"mock_buildpacks\"),\n\t}\n\n\tfor _, mod := range modifiers {\n\t\tmod(&m)\n\t}\n\n\treturn m\n}\n\ntype TestBuildModule interface {\n\tPrepare(source, destination string) error\n}\n\nfunc (b BuildModuleManager) PrepareBuildModules(destination string, modules ...TestBuildModule) {\n\tb.testObject.Helper()\n\n\tfor _, module := range modules {\n\t\terr := module.Prepare(b.sourceDir, destination)\n\t\tb.assert.Nil(err)\n\t}\n}\n\ntype Modifiable interface {\n\tSetPublish()\n\tSetBuildpacks([]TestBuildModule)\n}\ntype PackageModifier func(p Modifiable)\n\nfunc WithRequiredBuildpacks(buildpacks ...TestBuildModule) PackageModifier {\n\treturn func(p Modifiable) {\n\t\tp.SetBuildpacks(buildpacks)\n\t}\n}\n\nfunc WithPublish() PackageModifier {\n\treturn func(p Modifiable) {\n\t\tp.SetPublish()\n\t}\n}\n"
  },
  {
    "path": "acceptance/buildpacks/package_file_buildpack.go",
    "content": "//go:build acceptance\n\npackage buildpacks\n\nimport (\n\t\"errors\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/acceptance/invoke\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype PackageFile struct {\n\ttestObject           *testing.T\n\tpack                 *invoke.PackInvoker\n\tdestination          string\n\tsourceConfigLocation string\n\tbuildpacks           []TestBuildModule\n}\n\nfunc (p *PackageFile) SetBuildpacks(buildpacks []TestBuildModule) {\n\tp.buildpacks = buildpacks\n}\n\nfunc (p *PackageFile) SetPublish() {}\n\nfunc NewPackageFile(\n\tt *testing.T,\n\tpack *invoke.PackInvoker,\n\tdestination, configLocation string,\n\tmodifiers ...PackageModifier,\n) PackageFile {\n\n\tp := PackageFile{\n\t\ttestObject:           t,\n\t\tpack:                 pack,\n\t\tdestination:          destination,\n\t\tsourceConfigLocation: configLocation,\n\t}\n\tfor _, mod := range modifiers {\n\t\tmod(&p)\n\t}\n\n\treturn p\n}\n\nfunc (p PackageFile) Prepare(sourceDir, _ string) error {\n\tp.testObject.Helper()\n\tp.testObject.Log(\"creating package file from:\", sourceDir)\n\n\ttmpDir, err := os.MkdirTemp(\"\", \"package-buildpacks\")\n\tif err != nil {\n\t\treturn fmt.Errorf(\"creating temp dir for package buildpacks: %w\", err)\n\t}\n\tdefer os.RemoveAll(tmpDir)\n\n\tfor _, buildpack := range p.buildpacks {\n\t\terr = buildpack.Prepare(sourceDir, tmpDir)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"preparing buildpack %s: %w\", buildpack, err)\n\t\t}\n\t}\n\n\tconfigLocation := filepath.Join(tmpDir, \"package.toml\")\n\th.CopyFile(p.testObject, p.sourceConfigLocation, configLocation)\n\n\tpackArgs := []string{\n\t\tp.destination,\n\t\t\"--no-color\",\n\t\t\"-c\", configLocation,\n\t\t\"--format\", \"file\",\n\t}\n\n\toutput := p.pack.RunSuccessfully(\"buildpack\", append([]string{\"package\"}, packArgs...)...)\n\tif !strings.Contains(output, fmt.Sprintf(\"Successfully created 
package '%s'\", p.destination)) {\n\t\treturn errors.New(\"failed to create package\")\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "acceptance/buildpacks/package_image_buildpack.go",
    "content": "//go:build acceptance\n\npackage buildpacks\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/acceptance/assertions\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n\n\t\"github.com/buildpacks/pack/acceptance/invoke\"\n)\n\ntype PackageImage struct {\n\ttestObject           *testing.T\n\tpack                 *invoke.PackInvoker\n\tname                 string\n\tsourceConfigLocation string\n\tbuildpacks           []TestBuildModule\n\tpublish              bool\n}\n\nfunc (p *PackageImage) SetBuildpacks(buildpacks []TestBuildModule) {\n\tp.buildpacks = buildpacks\n}\n\nfunc (p *PackageImage) SetPublish() {\n\tp.publish = true\n}\n\nfunc NewPackageImage(\n\tt *testing.T,\n\tpack *invoke.PackInvoker,\n\tname, configLocation string,\n\tmodifiers ...PackageModifier,\n) PackageImage {\n\tp := PackageImage{\n\t\ttestObject:           t,\n\t\tpack:                 pack,\n\t\tname:                 name,\n\t\tsourceConfigLocation: configLocation,\n\t\tpublish:              false,\n\t}\n\n\tfor _, mod := range modifiers {\n\t\tmod(&p)\n\t}\n\treturn p\n}\n\nfunc (p PackageImage) Prepare(sourceDir, _ string) error {\n\tp.testObject.Helper()\n\tp.testObject.Log(\"creating package image from:\", sourceDir)\n\n\ttmpDir, err := os.MkdirTemp(\"\", \"package-buildpacks\")\n\tif err != nil {\n\t\treturn fmt.Errorf(\"creating temp dir for package buildpacks: %w\", err)\n\t}\n\tdefer os.RemoveAll(tmpDir)\n\n\tfor _, buildpack := range p.buildpacks {\n\t\terr = buildpack.Prepare(sourceDir, tmpDir)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"preparing buildpack %s: %w\", buildpack, err)\n\t\t}\n\t}\n\n\tconfigLocation := filepath.Join(tmpDir, \"package.toml\")\n\th.CopyFile(p.testObject, p.sourceConfigLocation, configLocation)\n\n\tpackArgs := []string{\n\t\tp.name,\n\t\t\"--no-color\",\n\t\t\"-c\", configLocation,\n\t}\n\n\tif p.publish {\n\t\tpackArgs = append(packArgs, 
\"--publish\")\n\t}\n\n\tp.testObject.Log(\"packaging image: \", p.name)\n\toutput := p.pack.RunSuccessfully(\"buildpack\", append([]string{\"package\"}, packArgs...)...)\n\tassertOutput := assertions.NewOutputAssertionManager(p.testObject, output)\n\tif p.publish {\n\t\tassertOutput.ReportsPackagePublished(p.name)\n\t} else {\n\t\tassertOutput.ReportsPackageCreation(p.name)\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "acceptance/config/asset_manager.go",
    "content": "//go:build acceptance\n\npackage config\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"testing\"\n\n\tacceptanceOS \"github.com/buildpacks/pack/acceptance/os\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst (\n\tdefaultCompilePackVersion = \"0.0.0\"\n)\n\nvar (\n\tcurrentPackFixturesDir           = filepath.Join(\"testdata\", \"pack_fixtures\")\n\tpreviousPackFixturesOverridesDir = filepath.Join(\"testdata\", \"pack_previous_fixtures_overrides\")\n\tlifecycleTgzExp                  = regexp.MustCompile(`lifecycle-v\\d+.\\d+.\\d+(-pre.\\d+)?(-rc.\\d+)?\\+linux.x86-64.tgz`)\n)\n\ntype AssetManager struct {\n\tpackPath                    string\n\tpackFixturesPath            string\n\tpreviousPackPath            string\n\tpreviousPackFixturesPaths   []string\n\tgithubAssetFetcher          *GithubAssetFetcher\n\tlifecyclePath               string\n\tlifecycleDescriptor         builder.LifecycleDescriptor\n\tlifecycleImage              string\n\tpreviousLifecyclePath       string\n\tpreviousLifecycleDescriptor builder.LifecycleDescriptor\n\tpreviousLifecycleImage      string\n\tdefaultLifecycleDescriptor  builder.LifecycleDescriptor\n\ttestObject                  *testing.T\n}\n\nfunc ConvergedAssetManager(t *testing.T, assert h.AssertionManager, inputConfig InputConfigurationManager) AssetManager {\n\tt.Helper()\n\n\tvar (\n\t\tconvergedCurrentPackPath             string\n\t\tconvergedPreviousPackPath            string\n\t\tconvergedPreviousPackFixturesPaths   []string\n\t\tconvergedCurrentLifecyclePath        string\n\t\tconvergedCurrentLifecycleImage       string\n\t\tconvergedCurrentLifecycleDescriptor  builder.LifecycleDescriptor\n\t\tconvergedPreviousLifecyclePath       
string\n\t\tconvergedPreviousLifecycleImage      string\n\t\tconvergedPreviousLifecycleDescriptor builder.LifecycleDescriptor\n\t\tconvergedDefaultLifecycleDescriptor  builder.LifecycleDescriptor\n\t)\n\n\tgithubAssetFetcher, err := NewGithubAssetFetcher(t, inputConfig.githubToken)\n\th.AssertNil(t, err)\n\n\tassetBuilder := assetManagerBuilder{\n\t\ttestObject:         t,\n\t\tassert:             assert,\n\t\tinputConfig:        inputConfig,\n\t\tgithubAssetFetcher: githubAssetFetcher,\n\t}\n\n\tif inputConfig.combinations.requiresCurrentPack() {\n\t\tconvergedCurrentPackPath = assetBuilder.ensureCurrentPack()\n\t}\n\n\tif inputConfig.combinations.requiresPreviousPack() {\n\t\tconvergedPreviousPackPath = assetBuilder.ensurePreviousPack()\n\t\tconvergedPreviousPackFixturesPath := assetBuilder.ensurePreviousPackFixtures()\n\n\t\tconvergedPreviousPackFixturesPaths = []string{previousPackFixturesOverridesDir, convergedPreviousPackFixturesPath}\n\t}\n\n\tif inputConfig.combinations.requiresCurrentLifecycle() {\n\t\tconvergedCurrentLifecyclePath, convergedCurrentLifecycleImage, convergedCurrentLifecycleDescriptor = assetBuilder.ensureCurrentLifecycle()\n\t}\n\n\tif inputConfig.combinations.requiresPreviousLifecycle() {\n\t\tconvergedPreviousLifecyclePath, convergedPreviousLifecycleImage, convergedPreviousLifecycleDescriptor = assetBuilder.ensurePreviousLifecycle()\n\t}\n\n\tif inputConfig.combinations.requiresDefaultLifecycle() {\n\t\tconvergedDefaultLifecycleDescriptor = assetBuilder.defaultLifecycleDescriptor()\n\t}\n\n\treturn AssetManager{\n\t\tpackPath:                    convergedCurrentPackPath,\n\t\tpackFixturesPath:            currentPackFixturesDir,\n\t\tpreviousPackPath:            convergedPreviousPackPath,\n\t\tpreviousPackFixturesPaths:   convergedPreviousPackFixturesPaths,\n\t\tlifecyclePath:               convergedCurrentLifecyclePath,\n\t\tlifecycleImage:              convergedCurrentLifecycleImage,\n\t\tlifecycleDescriptor:         
convergedCurrentLifecycleDescriptor,\n\t\tpreviousLifecyclePath:       convergedPreviousLifecyclePath,\n\t\tpreviousLifecycleImage:      convergedPreviousLifecycleImage,\n\t\tpreviousLifecycleDescriptor: convergedPreviousLifecycleDescriptor,\n\t\tdefaultLifecycleDescriptor:  convergedDefaultLifecycleDescriptor,\n\t\ttestObject:                  t,\n\t}\n}\n\nfunc (a AssetManager) PackPaths(kind ComboValue) (packPath string, packFixturesPaths []string) {\n\ta.testObject.Helper()\n\n\tswitch kind {\n\tcase Current:\n\t\tpackPath = a.packPath\n\t\tpackFixturesPaths = []string{a.packFixturesPath}\n\tcase Previous:\n\t\tpackPath = a.previousPackPath\n\t\tpackFixturesPaths = a.previousPackFixturesPaths\n\tdefault:\n\t\ta.testObject.Fatalf(\"pack kind must be current or previous, was %s\", kind)\n\t}\n\n\treturn packPath, packFixturesPaths\n}\n\nfunc (a AssetManager) LifecyclePath(kind ComboValue) string {\n\ta.testObject.Helper()\n\n\tswitch kind {\n\tcase Current:\n\t\treturn a.lifecyclePath\n\tcase Previous:\n\t\treturn a.previousLifecyclePath\n\tcase DefaultKind:\n\t\treturn \"\"\n\t}\n\n\ta.testObject.Fatalf(\"lifecycle kind must be previous, current or default was %s\", kind)\n\treturn \"\" // Unreachable\n}\n\nfunc (a AssetManager) LifecycleDescriptor(kind ComboValue) builder.LifecycleDescriptor {\n\ta.testObject.Helper()\n\n\tswitch kind {\n\tcase Current:\n\t\treturn a.lifecycleDescriptor\n\tcase Previous:\n\t\treturn a.previousLifecycleDescriptor\n\tcase DefaultKind:\n\t\treturn a.defaultLifecycleDescriptor\n\t}\n\n\ta.testObject.Fatalf(\"lifecycle kind must be previous, current or default was %s\", kind)\n\treturn builder.LifecycleDescriptor{} // Unreachable\n}\n\nfunc (a AssetManager) LifecycleImage(kind ComboValue) string {\n\ta.testObject.Helper()\n\n\tswitch kind {\n\tcase Current:\n\t\treturn a.lifecycleImage\n\tcase Previous:\n\t\treturn a.previousLifecycleImage\n\tcase DefaultKind:\n\t\treturn fmt.Sprintf(\"%s:%s\", config.DefaultLifecycleImageRepo, 
a.defaultLifecycleDescriptor.Info.Version)\n\t}\n\n\ta.testObject.Fatalf(\"lifecycle kind must be previous, current or default was %s\", kind)\n\treturn \"\" // Unreachable\n}\n\ntype assetManagerBuilder struct {\n\ttestObject         *testing.T\n\tassert             h.AssertionManager\n\tinputConfig        InputConfigurationManager\n\tgithubAssetFetcher *GithubAssetFetcher\n}\n\nfunc (b assetManagerBuilder) ensureCurrentPack() string {\n\tb.testObject.Helper()\n\n\tif b.inputConfig.packPath != \"\" {\n\t\treturn b.inputConfig.packPath\n\t}\n\n\tcompileWithVersion := b.inputConfig.compilePackWithVersion\n\tif compileWithVersion == \"\" {\n\t\tcompileWithVersion = defaultCompilePackVersion\n\t}\n\n\treturn b.buildPack(compileWithVersion)\n}\n\nfunc (b assetManagerBuilder) ensurePreviousPack() string {\n\tb.testObject.Helper()\n\n\tif b.inputConfig.previousPackPath != \"\" {\n\t\treturn b.inputConfig.previousPackPath\n\t}\n\n\tb.testObject.Logf(\n\t\t\"run combinations %+v require %s to be set\",\n\t\tb.inputConfig.combinations,\n\t\tstyle.Symbol(envPreviousPackPath),\n\t)\n\n\tversion, err := b.githubAssetFetcher.FetchReleaseVersion(\"buildpacks\", \"pack\", 0)\n\tb.assert.Nil(err)\n\n\tassetDir, err := b.githubAssetFetcher.FetchReleaseAsset(\n\t\t\"buildpacks\",\n\t\t\"pack\",\n\t\tversion,\n\t\tacceptanceOS.PackBinaryExp,\n\t\ttrue,\n\t)\n\tb.assert.Nil(err)\n\tassetPath := filepath.Join(assetDir, acceptanceOS.PackBinaryName)\n\n\tb.testObject.Logf(\"using %s for previous pack path\", assetPath)\n\n\treturn assetPath\n}\n\nfunc (b assetManagerBuilder) ensurePreviousPackFixtures() string {\n\tb.testObject.Helper()\n\n\tif b.inputConfig.previousPackFixturesPath != \"\" {\n\t\treturn b.inputConfig.previousPackFixturesPath\n\t}\n\n\tb.testObject.Logf(\n\t\t\"run combinations %+v require %s to be set\",\n\t\tb.inputConfig.combinations,\n\t\tstyle.Symbol(envPreviousPackFixturesPath),\n\t)\n\n\tversion, err := b.githubAssetFetcher.FetchReleaseVersion(\"buildpacks\", 
\"pack\", 0)\n\tb.assert.Nil(err)\n\n\tsourceDir, err := b.githubAssetFetcher.FetchReleaseSource(\"buildpacks\", \"pack\", version)\n\tb.assert.Nil(err)\n\n\tsourceDirFiles, err := os.ReadDir(sourceDir)\n\tb.assert.Nil(err)\n\t// GitHub source tarballs have a top-level directory whose name includes the current commit sha.\n\tinnerDir := sourceDirFiles[0].Name()\n\tfixturesDir := filepath.Join(sourceDir, innerDir, \"acceptance\", \"testdata\", \"pack_fixtures\")\n\n\tb.testObject.Logf(\"using %s for previous pack fixtures path\", fixturesDir)\n\n\treturn fixturesDir\n}\n\nfunc (b assetManagerBuilder) ensureCurrentLifecycle() (string, string, builder.LifecycleDescriptor) {\n\tb.testObject.Helper()\n\n\tlifecyclePath := b.inputConfig.lifecyclePath\n\n\tif lifecyclePath == \"\" {\n\t\tb.testObject.Logf(\n\t\t\t\"run combinations %+v require %s to be set\",\n\t\t\tb.inputConfig.combinations,\n\t\t\tstyle.Symbol(envLifecyclePath),\n\t\t)\n\n\t\tlifecyclePath = b.downloadLifecycleRelative(0)\n\n\t\tb.testObject.Logf(\"using %s for current lifecycle path\", lifecyclePath)\n\t}\n\n\tlifecycle, err := builder.NewLifecycle(blob.NewBlob(lifecyclePath))\n\tb.assert.Nil(err)\n\n\tlifecycleImage := b.inputConfig.lifecycleImage\n\n\tif lifecycleImage == \"\" {\n\t\tlifecycleImage = fmt.Sprintf(\"%s:%s\", config.DefaultLifecycleImageRepo, lifecycle.Descriptor().Info.Version)\n\n\t\tb.testObject.Logf(\"using %s for current lifecycle image\", lifecycleImage)\n\t}\n\n\treturn lifecyclePath, lifecycleImage, lifecycle.Descriptor()\n}\n\nfunc (b assetManagerBuilder) ensurePreviousLifecycle() (string, string, builder.LifecycleDescriptor) {\n\tb.testObject.Helper()\n\n\tpreviousLifecyclePath := b.inputConfig.previousLifecyclePath\n\n\tif previousLifecyclePath == \"\" {\n\t\tb.testObject.Logf(\n\t\t\t\"run combinations %+v require %s to be set\",\n\t\t\tb.inputConfig.combinations,\n\t\t\tstyle.Symbol(envPreviousLifecyclePath),\n\t\t)\n\n\t\tpreviousLifecyclePath = 
b.downloadLifecycleRelative(-1)\n\n\t\tb.testObject.Logf(\"using %s for previous lifecycle path\", previousLifecyclePath)\n\t}\n\n\tlifecycle, err := builder.NewLifecycle(blob.NewBlob(previousLifecyclePath))\n\tb.assert.Nil(err)\n\n\tpreviousLifecycleImage := b.inputConfig.previousLifecycleImage\n\n\tif previousLifecycleImage == \"\" {\n\t\tpreviousLifecycleImage = fmt.Sprintf(\"%s:%s\", config.DefaultLifecycleImageRepo, lifecycle.Descriptor().Info.Version)\n\n\t\tb.testObject.Logf(\"using %s for previous lifecycle image\", previousLifecycleImage)\n\t}\n\n\treturn previousLifecyclePath, previousLifecycleImage, lifecycle.Descriptor()\n}\n\nfunc (b assetManagerBuilder) downloadLifecycle(version string) string {\n\tpath, err := b.githubAssetFetcher.FetchReleaseAsset(\n\t\t\"buildpacks\",\n\t\t\"lifecycle\",\n\t\tversion,\n\t\tlifecycleTgzExp,\n\t\tfalse,\n\t)\n\tb.assert.Nil(err)\n\n\treturn path\n}\n\nfunc (b assetManagerBuilder) downloadLifecycleRelative(relativeVersion int) string {\n\tb.testObject.Helper()\n\n\tversion, err := b.githubAssetFetcher.FetchReleaseVersion(\"buildpacks\", \"lifecycle\", relativeVersion)\n\tb.assert.Nil(err)\n\n\treturn b.downloadLifecycle(version)\n}\n\nfunc (b assetManagerBuilder) buildPack(compileVersion string) string {\n\tb.testObject.Helper()\n\n\tpackTmpDir, err := os.MkdirTemp(\"\", \"pack.acceptance.binary.\")\n\tb.assert.Nil(err)\n\n\tpackPath := filepath.Join(packTmpDir, acceptanceOS.PackBinaryName)\n\n\tcwd, err := os.Getwd()\n\tb.assert.Nil(err)\n\n\tcmd := exec.Command(\"go\", \"build\",\n\t\t\"-ldflags\", fmt.Sprintf(\"-X 'github.com/buildpacks/pack/cmd.Version=%s'\", compileVersion),\n\t\t\"-o\", packPath,\n\t)\n\tif filepath.Base(cwd) == \"acceptance\" {\n\t\tcmd.Dir = filepath.Dir(cwd)\n\t}\n\n\tb.testObject.Logf(\"building pack: [CWD=%s] %s\", cmd.Dir, cmd.Args)\n\t_, err = cmd.CombinedOutput()\n\tb.assert.Nil(err)\n\n\treturn packPath\n}\n\nfunc (b assetManagerBuilder) defaultLifecycleDescriptor() builder.LifecycleDescriptor {\n\tlifecyclePath := b.downloadLifecycle(\"v\" + builder.DefaultLifecycleVersion)\n\n\tb.testObject.Logf(\"using %s for default lifecycle path\", lifecyclePath)\n\n\tlifecycle, err := builder.NewLifecycle(blob.NewBlob(lifecyclePath))\n\tb.assert.Nil(err)\n\n\treturn lifecycle.Descriptor()\n}\n"
  },
  {
    "path": "acceptance/config/github_asset_fetcher.go",
    "content": "//go:build acceptance\n\npackage config\n\nimport (\n\t\"archive/tar\"\n\t\"archive/zip\"\n\t\"context\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"net/http\"\n\t\"os\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"sort\"\n\t\"strconv\"\n\t\"strings\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/google/go-github/v30/github\"\n\t\"github.com/pkg/errors\"\n\t\"golang.org/x/oauth2\"\n\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst (\n\tassetCacheDir         = \"test-assets-cache\"\n\tassetCacheManifest    = \"github.json\"\n\tcacheManifestLifetime = 1 * time.Hour\n)\n\ntype GithubAssetFetcher struct {\n\tctx          context.Context\n\ttestObject   *testing.T\n\tgithubClient *github.Client\n\tcacheDir     string\n}\n\ntype assetCache map[string]map[string]cachedRepo\ntype cachedRepo struct {\n\tAssets   cachedAssets\n\tSources  cachedSources\n\tVersions cachedVersions\n}\ntype cachedAssets map[string][]string\ntype cachedSources map[string]string\ntype cachedVersions map[string]string\n\nfunc NewGithubAssetFetcher(t *testing.T, githubToken string) (*GithubAssetFetcher, error) {\n\tt.Helper()\n\n\trelativeCacheDir := filepath.Join(\"..\", \"out\", \"tests\", assetCacheDir)\n\tcacheDir, err := filepath.Abs(relativeCacheDir)\n\tif err != nil {\n\t\treturn nil, errors.Wrapf(err, \"getting absolute path for %s\", relativeCacheDir)\n\t}\n\tif err := os.MkdirAll(cacheDir, 0755); err != nil {\n\t\treturn nil, errors.Wrapf(err, \"creating directory %s\", cacheDir)\n\t}\n\n\tctx := context.TODO()\n\thttpClient := new(http.Client)\n\tif githubToken != \"\" {\n\t\tt.Log(\"using provided github token\")\n\t\ttokenSource := oauth2.StaticTokenSource(&oauth2.Token{\n\t\t\tAccessToken: githubToken,\n\t\t})\n\t\thttpClient = oauth2.NewClient(ctx, tokenSource)\n\t}\n\n\treturn &GithubAssetFetcher{\n\t\tctx:          ctx,\n\t\ttestObject:   t,\n\t\tgithubClient: 
github.NewClient(httpClient),\n\t\tcacheDir:     cacheDir,\n\t}, nil\n}\n\n// Fetch a GitHub release asset for the given repo that matches the regular expression.\n// The expression is something like 'pack-v\\d+.\\d+.\\d+-macos'.\n// The asset may be found in the local cache or downloaded from GitHub.\n// The return value is the location of the asset on disk, or any error encountered.\nfunc (f *GithubAssetFetcher) FetchReleaseAsset(owner, repo, version string, expr *regexp.Regexp, extract bool) (string, error) {\n\tf.testObject.Helper()\n\n\tif destPath, _ := f.cachedAsset(owner, repo, version, expr); destPath != \"\" {\n\t\tf.testObject.Logf(\"found %s in cache for %s/%s %s\", destPath, owner, repo, version)\n\t\treturn destPath, nil\n\t}\n\n\trelease, _, err := f.githubClient.Repositories.GetReleaseByTag(f.ctx, owner, repo, version)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"getting release\")\n\t}\n\n\tvar desiredAsset *github.ReleaseAsset\n\tfor _, asset := range release.Assets {\n\t\tif expr.MatchString(*asset.Name) {\n\t\t\tdesiredAsset = asset\n\t\t\tbreak\n\t\t}\n\t}\n\tif desiredAsset == nil {\n\t\treturn \"\", fmt.Errorf(\"could not find asset matching expression %s\", expr.String())\n\t}\n\n\tvar returnPath string\n\textractType := extractType(extract, *desiredAsset.Name)\n\tswitch extractType {\n\tcase \"tgz\":\n\t\ttargetDir := filepath.Join(f.cacheDir, stripExtension(*desiredAsset.Name))\n\t\tif err := os.MkdirAll(targetDir, 0755); err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"creating directory %s\", targetDir)\n\t\t}\n\n\t\tif err := f.downloadAndExtractTgz(*desiredAsset.BrowserDownloadURL, targetDir); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\n\t\treturnPath = targetDir\n\tcase \"zip\":\n\t\ttargetPath := filepath.Join(f.cacheDir, *desiredAsset.Name)\n\t\tif err := f.downloadAndExtractZip(*desiredAsset.BrowserDownloadURL, targetPath); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\n\t\treturnPath = 
stripExtension(targetPath)\n\tdefault:\n\t\ttargetPath := filepath.Join(f.cacheDir, *desiredAsset.Name)\n\t\tif err := f.downloadAndSave(*desiredAsset.BrowserDownloadURL, targetPath); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\n\t\treturnPath = targetPath\n\t}\n\n\terr = f.writeCacheManifest(owner, repo, func(cache assetCache) {\n\t\texistingAssets, found := cache[owner][repo].Assets[version]\n\t\tif found {\n\t\t\tcache[owner][repo].Assets[version] = append(existingAssets, returnPath)\n\t\t} else {\n\t\t\tcache[owner][repo].Assets[version] = []string{returnPath}\n\t\t}\n\t})\n\tif err != nil {\n\t\tf.testObject.Log(errors.Wrap(err, \"writing cache\").Error())\n\t}\n\treturn returnPath, nil\n}\n\nfunc extractType(extract bool, assetName string) string {\n\tif extract && strings.Contains(assetName, \".tgz\") {\n\t\treturn \"tgz\"\n\t}\n\tif extract && strings.Contains(assetName, \".zip\") {\n\t\treturn \"zip\"\n\t}\n\treturn \"none\"\n}\n\nfunc (f *GithubAssetFetcher) FetchReleaseSource(owner, repo, version string) (string, error) {\n\tf.testObject.Helper()\n\n\tif destDir, _ := f.cachedSource(owner, repo, version); destDir != \"\" {\n\t\tf.testObject.Logf(\"found %s in cache for %s/%s %s\", destDir, owner, repo, version)\n\t\treturn destDir, nil\n\t}\n\n\trelease, _, err := f.githubClient.Repositories.GetReleaseByTag(f.ctx, owner, repo, version)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"getting release\")\n\t}\n\n\tdestDir := filepath.Join(f.cacheDir, strings.ReplaceAll(*release.Name, \" \", \"-\")+\"-source\")\n\tif err := os.MkdirAll(destDir, 0755); err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"creating directory %s\", destDir)\n\t}\n\n\tif err := f.downloadAndExtractTgz(*release.TarballURL, destDir); err != nil {\n\t\treturn \"\", err\n\t}\n\n\terr = f.writeCacheManifest(owner, repo, func(cache assetCache) {\n\t\tcache[owner][repo].Sources[version] = destDir\n\t})\n\tif err != nil {\n\t\tf.testObject.Log(errors.Wrap(err, \"writing 
cache\").Error())\n\t}\n\treturn destDir, nil\n}\n\n// Fetch a GitHub release version that is n minor versions older than the latest.\n// Ex: when n is 0, the latest release version is returned.\n// Ex: when n is -1, the latest patch of the previous minor version is returned.\nfunc (f *GithubAssetFetcher) FetchReleaseVersion(owner, repo string, n int) (string, error) {\n\tf.testObject.Helper()\n\n\tif version, _ := f.cachedVersion(owner, repo, n); version != \"\" {\n\t\tf.testObject.Logf(\"found %s in cache for %s/%s %d\", version, owner, repo, n)\n\t\treturn version, nil\n\t}\n\n\t// get all release versions\n\trawReleases, _, err := f.githubClient.Repositories.ListReleases(f.ctx, owner, repo, nil)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"listing releases\")\n\t}\n\tif len(rawReleases) == 0 {\n\t\treturn \"\", fmt.Errorf(\"no releases found for %s/%s\", owner, repo)\n\t}\n\n\t// exclude drafts and pre-releases\n\tvar releases []*github.RepositoryRelease\n\tfor _, release := range rawReleases {\n\t\tif !*release.Draft && !*release.Prerelease {\n\t\t\treleases = append(releases, release)\n\t\t}\n\t}\n\tif len(releases) == 0 {\n\t\treturn \"\", fmt.Errorf(\"no non-draft releases found for %s/%s\", owner, repo)\n\t}\n\n\t// sort all release versions\n\tversions := make([]*semver.Version, len(releases))\n\tfor i, release := range releases {\n\t\tversion, err := semver.NewVersion(*release.TagName)\n\t\tif err != nil {\n\t\t\treturn \"\", errors.Wrap(err, \"parsing semver\")\n\t\t}\n\t\tversions[i] = version\n\t}\n\tsort.Sort(semver.Collection(versions))\n\n\tlatestVersion := versions[len(versions)-1]\n\n\t// get latest patch of previous minor\n\tconstraint, err := semver.NewConstraint(\n\t\tfmt.Sprintf(\"~%d.%d.x\", latestVersion.Major(), latestVersion.Minor()+int64(n)),\n\t)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"parsing semver constraint\")\n\t}\n\tvar latestPatchOfPreviousMinor *semver.Version\n\tfor i := len(versions) - 1; i >= 0; i-- 
{\n\t\tif constraint.Check(versions[i]) {\n\t\t\tlatestPatchOfPreviousMinor = versions[i]\n\t\t\tbreak\n\t\t}\n\t}\n\tif latestPatchOfPreviousMinor == nil {\n\t\treturn \"\", errors.New(\"obtaining latest patch of previous minor\")\n\t}\n\tformattedVersion := fmt.Sprintf(\"v%s\", latestPatchOfPreviousMinor.String())\n\n\terr = f.writeCacheManifest(owner, repo, func(cache assetCache) {\n\t\tcache[owner][repo].Versions[strconv.Itoa(n)] = formattedVersion\n\t})\n\tif err != nil {\n\t\tf.testObject.Log(errors.Wrap(err, \"writing cache\").Error())\n\t}\n\treturn formattedVersion, nil\n}\n\nfunc (f *GithubAssetFetcher) cachedAsset(owner, repo, version string, expr *regexp.Regexp) (string, error) {\n\tf.testObject.Helper()\n\n\tcache, err := f.loadCacheManifest()\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"loading cache\")\n\t}\n\n\tassets, found := cache[owner][repo].Assets[version]\n\tif found {\n\t\tfor _, asset := range assets {\n\t\t\tif expr.MatchString(asset) {\n\t\t\t\treturn asset, nil\n\t\t\t}\n\t\t}\n\t}\n\treturn \"\", nil\n}\n\nfunc (f *GithubAssetFetcher) cachedSource(owner, repo, version string) (string, error) {\n\tf.testObject.Helper()\n\n\tcache, err := f.loadCacheManifest()\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"loading cache\")\n\t}\n\n\tvalue, found := cache[owner][repo].Sources[version]\n\tif found {\n\t\treturn value, nil\n\t}\n\treturn \"\", nil\n}\n\nfunc (f *GithubAssetFetcher) cachedVersion(owner, repo string, n int) (string, error) {\n\tf.testObject.Helper()\n\n\tcache, err := f.loadCacheManifest()\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"loading cache\")\n\t}\n\n\tvalue, found := cache[owner][repo].Versions[strconv.Itoa(n)]\n\tif found {\n\t\treturn value, nil\n\t}\n\treturn \"\", nil\n}\n\nfunc (f *GithubAssetFetcher) loadCacheManifest() (assetCache, error) {\n\tf.testObject.Helper()\n\n\tcacheManifest, err := os.Stat(filepath.Join(f.cacheDir, assetCacheManifest))\n\tif os.IsNotExist(err) 
{\n\t\treturn assetCache{}, nil\n\t}\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"statting cache manifest\")\n\t}\n\n\t// invalidate cache manifest that is too old\n\tif time.Since(cacheManifest.ModTime()) > cacheManifestLifetime {\n\t\treturn assetCache{}, nil\n\t}\n\n\tcontent, err := os.ReadFile(filepath.Join(f.cacheDir, assetCacheManifest))\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"reading cache manifest\")\n\t}\n\n\tvar cache assetCache\n\terr = json.Unmarshal(content, &cache)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"unmarshaling cache manifest content\")\n\t}\n\n\treturn cache, nil\n}\n\nfunc (f *GithubAssetFetcher) writeCacheManifest(owner, repo string, op func(cache assetCache)) error {\n\tf.testObject.Helper()\n\n\tcache, err := f.loadCacheManifest()\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"loading cache\")\n\t}\n\n\t// init keys for owner and repo\n\tif _, found := cache[owner]; !found {\n\t\tcache[owner] = map[string]cachedRepo{}\n\t}\n\tif _, found := cache[owner][repo]; !found {\n\t\tcache[owner][repo] = cachedRepo{\n\t\t\tAssets:   cachedAssets{},\n\t\t\tSources:  cachedSources{},\n\t\t\tVersions: cachedVersions{},\n\t\t}\n\t}\n\n\top(cache)\n\n\tcontent, err := json.Marshal(cache)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"marshaling cache manifest content\")\n\t}\n\n\treturn os.WriteFile(filepath.Join(f.cacheDir, assetCacheManifest), content, 0644)\n}\n\nfunc (f *GithubAssetFetcher) downloadAndSave(assetURI, destPath string) error {\n\tf.testObject.Helper()\n\n\tdownloader := blob.NewDownloader(logging.NewSimpleLogger(&testWriter{t: f.testObject}), f.cacheDir)\n\n\tassetBlob, err := downloader.Download(f.ctx, assetURI)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"downloading blob %s\", assetURI)\n\t}\n\n\tassetReader, err := assetBlob.Open()\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"opening blob\")\n\t}\n\tdefer assetReader.Close()\n\n\tdestFile, err := os.OpenFile(destPath, os.O_CREATE|os.O_RDWR, 0644)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"opening file 
%s\", destPath)\n\t}\n\tdefer destFile.Close()\n\n\tif _, err = io.Copy(destFile, assetReader); err != nil {\n\t\treturn errors.Wrap(err, \"copying data\")\n\t}\n\n\treturn nil\n}\n\nfunc (f *GithubAssetFetcher) downloadAndExtractTgz(assetURI, destDir string) error {\n\tf.testObject.Helper()\n\n\tdownloader := blob.NewDownloader(logging.NewSimpleLogger(&testWriter{t: f.testObject}), f.cacheDir)\n\n\tassetBlob, err := downloader.Download(f.ctx, assetURI)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"downloading blob %s\", assetURI)\n\t}\n\n\tassetReader, err := assetBlob.Open()\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"opening blob\")\n\t}\n\tdefer assetReader.Close()\n\n\tif err := extractTgz(assetReader, destDir); err != nil {\n\t\treturn errors.Wrap(err, \"extracting tgz\")\n\t}\n\n\treturn nil\n}\n\nfunc (f *GithubAssetFetcher) downloadAndExtractZip(assetURI, destPath string) error {\n\tf.testObject.Helper()\n\n\tif err := f.downloadAndSave(assetURI, destPath); err != nil {\n\t\treturn err\n\t}\n\n\tif err := extractZip(destPath); err != nil {\n\t\treturn errors.Wrap(err, \"extracting zip\")\n\t}\n\n\treturn nil\n}\n\nfunc stripExtension(assetFilename string) string {\n\treturn strings.TrimSuffix(assetFilename, path.Ext(assetFilename))\n}\n\nfunc extractTgz(reader io.Reader, destDir string) error {\n\ttarReader := tar.NewReader(reader)\n\n\tfor {\n\t\theader, err := tarReader.Next()\n\n\t\tswitch err {\n\t\tcase nil:\n\t\t\t// keep going\n\t\tcase io.EOF:\n\t\t\treturn nil\n\t\tdefault:\n\t\t\treturn err\n\t\t}\n\n\t\ttarget := filepath.Join(destDir, header.Name)\n\n\t\tswitch header.Typeflag {\n\t\tcase tar.TypeDir:\n\t\t\tif err := os.MkdirAll(target, 0755); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\tcase tar.TypeReg:\n\t\t\ttargetFile, err := os.OpenFile(target, os.O_CREATE|os.O_RDWR, os.FileMode(header.Mode))\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tif _, err := io.Copy(targetFile, tarReader); err != nil 
{\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\ttargetFile.Close()\n\t\t}\n\t}\n}\n\nfunc extractZip(zipPath string) error {\n\tzipReader, err := zip.OpenReader(zipPath)\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer zipReader.Close()\n\n\tparentDir := filepath.Dir(zipPath)\n\n\tfor _, f := range zipReader.File {\n\t\ttarget := filepath.Join(parentDir, f.Name)\n\n\t\tif f.FileInfo().IsDir() {\n\t\t\tos.MkdirAll(target, f.Mode())\n\t\t\tcontinue\n\t\t}\n\n\t\ttargetFile, err := os.OpenFile(target, os.O_CREATE|os.O_RDWR, f.Mode())\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tsourceFile, err := f.Open()\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\t_, err = io.Copy(targetFile, sourceFile)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tsourceFile.Close()\n\t\ttargetFile.Close()\n\t}\n\n\treturn nil\n}\n\ntype testWriter struct {\n\tt *testing.T\n}\n\nfunc (w *testWriter) Write(p []byte) (n int, err error) {\n\tw.t.Log(string(p))\n\treturn len(p), nil\n}\n"
  },
  {
    "path": "acceptance/config/input_configuration_manager.go",
    "content": "//go:build acceptance\n\npackage config\n\nimport (\n\t\"encoding/json\"\n\t\"os\"\n\t\"os/user\"\n\t\"path/filepath\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n)\n\nconst (\n\tenvAcceptanceSuiteConfig    = \"ACCEPTANCE_SUITE_CONFIG\"\n\tenvCompilePackWithVersion   = \"COMPILE_PACK_WITH_VERSION\"\n\tenvGitHubToken              = \"GITHUB_TOKEN\"\n\tenvLifecycleImage           = \"LIFECYCLE_IMAGE\"\n\tenvLifecyclePath            = \"LIFECYCLE_PATH\"\n\tenvPackPath                 = \"PACK_PATH\"\n\tenvPreviousLifecycleImage   = \"PREVIOUS_LIFECYCLE_IMAGE\"\n\tenvPreviousLifecyclePath    = \"PREVIOUS_LIFECYCLE_PATH\"\n\tenvPreviousPackFixturesPath = \"PREVIOUS_PACK_FIXTURES_PATH\"\n\tenvPreviousPackPath         = \"PREVIOUS_PACK_PATH\"\n)\n\ntype InputConfigurationManager struct {\n\tpackPath                 string\n\tpreviousPackPath         string\n\tpreviousPackFixturesPath string\n\tlifecyclePath            string\n\tpreviousLifecyclePath    string\n\tlifecycleImage           string\n\tpreviousLifecycleImage   string\n\tcompilePackWithVersion   string\n\tgithubToken              string\n\tcombinations             ComboSet\n}\n\nfunc NewInputConfigurationManager() (InputConfigurationManager, error) {\n\tpackPath := os.Getenv(envPackPath)\n\tpreviousPackPath := os.Getenv(envPreviousPackPath)\n\tpreviousPackFixturesPath := os.Getenv(envPreviousPackFixturesPath)\n\tlifecyclePath := os.Getenv(envLifecyclePath)\n\tpreviousLifecyclePath := os.Getenv(envPreviousLifecyclePath)\n\n\terr := resolveAbsolutePaths(&packPath, &previousPackPath, &previousPackFixturesPath, &lifecyclePath, &previousLifecyclePath)\n\tif err != nil {\n\t\treturn InputConfigurationManager{}, err\n\t}\n\n\tvar combos ComboSet\n\n\tcomboConfig := os.Getenv(envAcceptanceSuiteConfig)\n\tif comboConfig != \"\" {\n\t\tif err := json.Unmarshal([]byte(comboConfig), &combos); err != nil {\n\t\t\treturn InputConfigurationManager{}, errors.Errorf(\"failed to parse combination config: %s\", 
err)\n\t\t}\n\t} else {\n\t\tcombos = defaultRunCombo\n\t}\n\n\tif lifecyclePath != \"\" && len(combos) == 1 && combos[0] == defaultRunCombo[0] {\n\t\tcombos[0].Lifecycle = Current\n\t}\n\n\treturn InputConfigurationManager{\n\t\tpackPath:                 packPath,\n\t\tpreviousPackPath:         previousPackPath,\n\t\tpreviousPackFixturesPath: previousPackFixturesPath,\n\t\tlifecyclePath:            lifecyclePath,\n\t\tpreviousLifecyclePath:    previousLifecyclePath,\n\t\tlifecycleImage:           os.Getenv(envLifecycleImage),\n\t\tpreviousLifecycleImage:   os.Getenv(envPreviousLifecycleImage),\n\t\tcompilePackWithVersion:   os.Getenv(envCompilePackWithVersion),\n\t\tgithubToken:              os.Getenv(envGitHubToken),\n\t\tcombinations:             combos,\n\t}, nil\n}\n\nfunc (i InputConfigurationManager) Combinations() ComboSet {\n\treturn i.combinations\n}\n\nfunc resolveAbsolutePaths(paths ...*string) error {\n\tfor _, path := range paths {\n\t\tif *path == \"\" {\n\t\t\tcontinue\n\t\t}\n\n\t\t// Manually expand ~ to home dir\n\t\tif strings.HasPrefix(*path, \"~/\") {\n\t\t\tusr, err := user.Current()\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"getting current user\")\n\t\t\t}\n\t\t\tdir := usr.HomeDir\n\t\t\t*path = filepath.Join(dir, (*path)[2:])\n\t\t} else {\n\t\t\tabsPath, err := filepath.Abs(*path)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"getting absolute path for %s\", *path)\n\t\t\t}\n\t\t\t*path = absPath\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "acceptance/config/lifecycle_asset.go",
    "content": "//go:build acceptance\n\npackage config\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n)\n\ntype LifecycleAsset struct {\n\tpath       string\n\tdescriptor builder.LifecycleDescriptor\n\timage      string\n}\n\nfunc (a AssetManager) NewLifecycleAsset(kind ComboValue) LifecycleAsset {\n\treturn LifecycleAsset{\n\t\tpath:       a.LifecyclePath(kind),\n\t\tdescriptor: a.LifecycleDescriptor(kind),\n\t\timage:      a.LifecycleImage(kind),\n\t}\n}\n\nfunc (l *LifecycleAsset) Version() string {\n\treturn l.SemVer().String()\n}\n\nfunc (l *LifecycleAsset) SemVer() *builder.Version {\n\treturn l.descriptor.Info.Version\n}\n\nfunc (l *LifecycleAsset) Identifier() string {\n\tif l.HasLocation() {\n\t\treturn l.path\n\t} else {\n\t\treturn l.Version()\n\t}\n}\n\nfunc (l *LifecycleAsset) HasLocation() bool {\n\treturn l.path != \"\"\n}\n\nfunc (l *LifecycleAsset) EscapedPath() string {\n\treturn strings.ReplaceAll(l.path, `\\`, `\\\\`)\n}\n\nfunc (l *LifecycleAsset) Image() string {\n\treturn l.image\n}\n\nfunc earliestVersion(versions []*api.Version) *api.Version {\n\tvar earliest *api.Version\n\tfor _, version := range versions {\n\t\tswitch {\n\t\tcase version == nil:\n\t\t\tcontinue\n\t\tcase earliest == nil:\n\t\t\tearliest = version\n\t\tcase earliest.Compare(version) > 0:\n\t\t\tearliest = version\n\t\t}\n\t}\n\treturn earliest\n}\n\nfunc latestVersion(versions []*api.Version) *api.Version {\n\tvar latest *api.Version\n\tfor _, version := range versions {\n\t\tswitch {\n\t\tcase version == nil:\n\t\t\tcontinue\n\t\tcase latest == nil:\n\t\t\tlatest = version\n\t\tcase latest.Compare(version) < 0:\n\t\t\tlatest = version\n\t\t}\n\t}\n\treturn latest\n}\nfunc (l *LifecycleAsset) EarliestBuildpackAPIVersion() string {\n\treturn earliestVersion(l.descriptor.APIs.Buildpack.Supported).String()\n}\n\nfunc (l *LifecycleAsset) 
EarliestPlatformAPIVersion() string {\n\treturn earliestVersion(l.descriptor.APIs.Platform.Supported).String()\n}\n\nfunc (l *LifecycleAsset) LatestPlatformAPIVersion() string {\n\treturn latestVersion(l.descriptor.APIs.Platform.Supported).String()\n}\n\nfunc (l *LifecycleAsset) OutputForAPIs() (deprecatedBuildpackAPIs, supportedBuildpackAPIs, deprecatedPlatformAPIs, supportedPlatformAPIs string) {\n\tstringify := func(apiSet builder.APISet) string {\n\t\tversions := apiSet.AsStrings()\n\t\tif len(versions) == 0 {\n\t\t\treturn \"(none)\"\n\t\t}\n\t\treturn strings.Join(versions, \", \")\n\t}\n\n\treturn stringify(l.descriptor.APIs.Buildpack.Deprecated),\n\t\tstringify(l.descriptor.APIs.Buildpack.Supported),\n\t\tstringify(l.descriptor.APIs.Platform.Deprecated),\n\t\tstringify(l.descriptor.APIs.Platform.Supported)\n}\n\nfunc (l *LifecycleAsset) TOMLOutputForAPIs() (deprecatedBuildpacksAPIs,\n\tsupportedBuildpacksAPIs,\n\tdeprecatedPlatformAPIs,\n\tsupportedPlatformAPIs string,\n) {\n\tstringify := func(apiSet builder.APISet) string {\n\t\tif len(apiSet) == 0 {\n\t\t\treturn \"[]\"\n\t\t}\n\n\t\tvar quotedAPIs []string\n\t\tfor _, a := range apiSet {\n\t\t\tquotedAPIs = append(quotedAPIs, fmt.Sprintf(\"%q\", a))\n\t\t}\n\n\t\treturn fmt.Sprintf(\"[%s]\", strings.Join(quotedAPIs, \", \"))\n\t}\n\n\treturn stringify(l.descriptor.APIs.Buildpack.Deprecated),\n\t\tstringify(l.descriptor.APIs.Buildpack.Supported),\n\t\tstringify(l.descriptor.APIs.Platform.Deprecated),\n\t\tstringify(l.descriptor.APIs.Platform.Supported)\n}\n\nfunc (l *LifecycleAsset) YAMLOutputForAPIs(baseIndentationWidth int) (deprecatedBuildpacksAPIs,\n\tsupportedBuildpacksAPIs,\n\tdeprecatedPlatformAPIs,\n\tsupportedPlatformAPIs string,\n) {\n\tstringify := func(apiSet builder.APISet, baseIndentationWidth int) string {\n\t\tif len(apiSet) == 0 {\n\t\t\treturn \"[]\"\n\t\t}\n\n\t\tapiIndentation := strings.Repeat(\" \", baseIndentationWidth+2)\n\n\t\tvar quotedAPIs 
[]string\n\t\tfor _, a := range apiSet {\n\t\t\tquotedAPIs = append(quotedAPIs, fmt.Sprintf(`%s- %q`, apiIndentation, a))\n\t\t}\n\n\t\treturn fmt.Sprintf(`\n%s`, strings.Join(quotedAPIs, \"\\n\"))\n\t}\n\n\treturn stringify(l.descriptor.APIs.Buildpack.Deprecated, baseIndentationWidth),\n\t\tstringify(l.descriptor.APIs.Buildpack.Supported, baseIndentationWidth),\n\t\tstringify(l.descriptor.APIs.Platform.Deprecated, baseIndentationWidth),\n\t\tstringify(l.descriptor.APIs.Platform.Supported, baseIndentationWidth)\n}\n\nfunc (l *LifecycleAsset) JSONOutputForAPIs(baseIndentationWidth int) (\n\tdeprecatedBuildpacksAPIs,\n\tsupportedBuildpacksAPIs,\n\tdeprecatedPlatformAPIs,\n\tsupportedPlatformAPIs string,\n) {\n\tstringify := func(apiSet builder.APISet, baseIndentationWidth int) string {\n\t\tif len(apiSet) < 1 {\n\t\t\tif apiSet == nil {\n\t\t\t\treturn \"null\"\n\t\t\t}\n\t\t\treturn \"[]\"\n\t\t}\n\n\t\tapiIndentation := strings.Repeat(\" \", baseIndentationWidth+2)\n\n\t\tvar quotedAPIs []string\n\t\tfor _, a := range apiSet {\n\t\t\tquotedAPIs = append(quotedAPIs, fmt.Sprintf(`%s%q`, apiIndentation, a))\n\t\t}\n\n\t\tlineEndSeparator := `,\n`\n\n\t\treturn fmt.Sprintf(`[\n%s\n%s]`, strings.Join(quotedAPIs, lineEndSeparator), strings.Repeat(\" \", baseIndentationWidth))\n\t}\n\n\treturn stringify(l.descriptor.APIs.Buildpack.Deprecated, baseIndentationWidth),\n\t\tstringify(l.descriptor.APIs.Buildpack.Supported, baseIndentationWidth),\n\t\tstringify(l.descriptor.APIs.Platform.Deprecated, baseIndentationWidth),\n\t\tstringify(l.descriptor.APIs.Platform.Supported, baseIndentationWidth)\n}\n\ntype LifecycleFeature int\n\nconst (\n\tCreationTime LifecycleFeature = iota\n\tBuildImageExtensions\n\tRunImageExtensions\n)\n\ntype LifecycleAssetSupported func(l *LifecycleAsset) bool\n\nfunc supportsPlatformAPI(version string) LifecycleAssetSupported {\n\treturn func(i *LifecycleAsset) bool {\n\t\tfor _, platformAPI := range i.descriptor.APIs.Platform.Supported {\n\t\t\tif 
platformAPI.AtLeast(version) {\n\t\t\t\treturn true\n\t\t\t}\n\t\t}\n\t\tfor _, platformAPI := range i.descriptor.APIs.Platform.Deprecated {\n\t\t\tif platformAPI.AtLeast(version) {\n\t\t\t\treturn true\n\t\t\t}\n\t\t}\n\t\treturn false\n\t}\n}\n\nvar lifecycleFeatureTests = map[LifecycleFeature]LifecycleAssetSupported{\n\tCreationTime:         supportsPlatformAPI(\"0.9\"),\n\tBuildImageExtensions: supportsPlatformAPI(\"0.10\"),\n\tRunImageExtensions:   supportsPlatformAPI(\"0.12\"),\n}\n\nfunc (l *LifecycleAsset) SupportsFeature(f LifecycleFeature) bool {\n\treturn lifecycleFeatureTests[f](l)\n}\n\nfunc (l *LifecycleAsset) atLeast074() bool {\n\treturn !l.SemVer().LessThan(semver.MustParse(\"0.7.4\"))\n}\n"
  },
  {
    "path": "acceptance/config/pack_assets.go",
    "content": "//go:build acceptance\n\npackage config\n\ntype PackAsset struct {\n\tpath         string\n\tfixturePaths []string\n}\n\nfunc (a AssetManager) NewPackAsset(kind ComboValue) PackAsset {\n\tpath, fixtures := a.PackPaths(kind)\n\n\treturn PackAsset{\n\t\tpath:         path,\n\t\tfixturePaths: fixtures,\n\t}\n}\n\nfunc (p PackAsset) Path() string {\n\treturn p.path\n}\n\nfunc (p PackAsset) FixturePaths() []string {\n\treturn p.fixturePaths\n}\n"
  },
  {
    "path": "acceptance/config/run_combination.go",
    "content": "//go:build acceptance\n\npackage config\n\nimport (\n\t\"encoding/json\"\n\t\"fmt\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype ComboValue int\n\nconst (\n\tCurrent ComboValue = iota\n\tPrevious\n\tDefaultKind\n)\n\nfunc (v ComboValue) String() string {\n\tswitch v {\n\tcase Current:\n\t\treturn \"current\"\n\tcase Previous:\n\t\treturn \"previous\"\n\tcase DefaultKind:\n\t\treturn \"default\"\n\t}\n\treturn \"\"\n}\n\ntype RunCombo struct {\n\tPack              ComboValue `json:\"pack\"`\n\tPackCreateBuilder ComboValue `json:\"pack_create_builder\"`\n\tLifecycle         ComboValue `json:\"lifecycle\"`\n}\n\nvar defaultRunCombo = []*RunCombo{\n\t{Pack: Current, PackCreateBuilder: Current, Lifecycle: DefaultKind},\n}\n\nfunc (c *RunCombo) UnmarshalJSON(b []byte) error {\n\tvar o map[string]string\n\n\tif err := json.Unmarshal(b, &o); err != nil {\n\t\treturn errors.Errorf(`failed to unmarshal run combo element: %s`, b)\n\t}\n\n\tfor k, v := range o {\n\t\tswitch k {\n\t\tcase \"pack\":\n\t\t\tval, err := validatedPackKind(v)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Errorf(\"failed to parse kind of %s: %s\", style.Symbol(\"pack\"), err)\n\t\t\t}\n\n\t\t\tc.Pack = val\n\t\tcase \"pack_create_builder\":\n\t\t\tval, err := validatedPackKind(v)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Errorf(\n\t\t\t\t\t\"failed to parse kind of %s: %s\", style.Symbol(\"pack_create_builder\"),\n\t\t\t\t\terr,\n\t\t\t\t)\n\t\t\t}\n\n\t\t\tc.PackCreateBuilder = val\n\t\tcase \"lifecycle\":\n\t\t\tval, err := validateLifecycleKind(v)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Errorf(\"failed to parse kind of %s: %s\", style.Symbol(\"lifecycle\"), err)\n\t\t\t}\n\n\t\t\tc.Lifecycle = val\n\t\tdefault:\n\t\t\treturn errors.Errorf(\"unknown key %s in run combo\", style.Symbol(k))\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc validatedPackKind(k string) (ComboValue, error) {\n\tswitch k {\n\tcase \"current\":\n\t\treturn 
Current, nil\n\tcase \"previous\":\n\t\treturn Previous, nil\n\tdefault:\n\t\treturn Current, errors.Errorf(\"must be either current or previous, was %s\", style.Symbol(k))\n\t}\n}\n\nfunc validateLifecycleKind(k string) (ComboValue, error) {\n\tswitch k {\n\tcase \"current\":\n\t\treturn Current, nil\n\tcase \"previous\":\n\t\treturn Previous, nil\n\tcase \"default\":\n\t\treturn DefaultKind, nil\n\tdefault:\n\t\treturn Current, errors.Errorf(\"must be current, previous, or default, was %s\", style.Symbol(k))\n\t}\n}\n\nfunc (c *RunCombo) String() string {\n\treturn fmt.Sprintf(\"p_%s cb_%s lc_%s\", c.Pack, c.PackCreateBuilder, c.Lifecycle)\n}\n\nfunc (c *RunCombo) Describe(assets AssetManager) string {\n\tpackPath, packFixturesPaths := assets.PackPaths(c.Pack)\n\tcbPackPath, cbPackFixturesPaths := assets.PackPaths(c.PackCreateBuilder)\n\tlifecyclePath := assets.LifecyclePath(c.Lifecycle)\n\tlifecycleDescriptor := assets.LifecycleDescriptor(c.Lifecycle)\n\n\treturn fmt.Sprintf(`\npack:\n|__ path: %s\n|__ fixtures: %s\n\ncreate builder:\n|__ pack path: %s\n|__ pack fixtures: %s\n\nlifecycle:\n|__ path: %s\n|__ version: %s\n|__ buildpack api: %s\n|__ platform api: %s\n`,\n\t\tpackPath,\n\t\tpackFixturesPaths,\n\t\tcbPackPath,\n\t\tcbPackFixturesPaths,\n\t\tlifecyclePath,\n\t\tlifecycleDescriptor.Info.Version,\n\t\tlifecycleDescriptor.API.BuildpackVersion,\n\t\tlifecycleDescriptor.API.PlatformVersion,\n\t)\n}\n\ntype ComboSet []*RunCombo\n\nfunc (combos ComboSet) requiresCurrentPack() bool {\n\treturn combos.requiresPackKind(Current)\n}\n\nfunc (combos ComboSet) requiresPreviousPack() bool {\n\treturn combos.requiresPackKind(Previous)\n}\n\nfunc (combos ComboSet) requiresPackKind(k ComboValue) bool {\n\tfor _, c := range combos {\n\t\tif c.Pack == k || c.PackCreateBuilder == k {\n\t\t\treturn true\n\t\t}\n\t}\n\n\treturn false\n}\n\nfunc (combos ComboSet) IncludesCurrentSubjectPack() bool {\n\tfor _, c := range combos {\n\t\tif c.Pack == Current {\n\t\t\treturn 
true\n\t\t}\n\t}\n\n\treturn false\n}\n\nfunc (combos ComboSet) requiresCurrentLifecycle() bool {\n\treturn combos.requiresLifecycleKind(Current)\n}\n\nfunc (combos ComboSet) requiresPreviousLifecycle() bool {\n\treturn combos.requiresLifecycleKind(Previous)\n}\n\nfunc (combos ComboSet) requiresDefaultLifecycle() bool {\n\treturn combos.requiresLifecycleKind(DefaultKind)\n}\n\nfunc (combos ComboSet) requiresLifecycleKind(k ComboValue) bool {\n\tfor _, c := range combos {\n\t\tif c.Lifecycle == k {\n\t\t\treturn true\n\t\t}\n\t}\n\n\treturn false\n}\n"
  },
  {
    "path": "acceptance/invoke/pack.go",
    "content": "//go:build acceptance\n\npackage invoke\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"strings\"\n\t\"sync\"\n\t\"testing\"\n\n\t\"github.com/Masterminds/semver\"\n\n\tacceptanceOS \"github.com/buildpacks/pack/acceptance/os\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype PackInvoker struct {\n\ttestObject      *testing.T\n\tassert          h.AssertionManager\n\tpath            string\n\thome            string\n\tdockerConfigDir string\n\tfixtureManager  PackFixtureManager\n\tverbose         bool\n}\n\ntype packPathsProvider interface {\n\tPath() string\n\tFixturePaths() []string\n}\n\nfunc NewPackInvoker(\n\ttestObject *testing.T,\n\tassert h.AssertionManager,\n\tpackAssets packPathsProvider,\n\tdockerConfigDir string,\n) *PackInvoker {\n\n\ttestObject.Helper()\n\n\thome, err := os.MkdirTemp(\"\", \"buildpack.pack.home.\")\n\tif err != nil {\n\t\ttestObject.Fatalf(\"couldn't create home folder for pack: %s\", err)\n\t}\n\n\treturn &PackInvoker{\n\t\ttestObject:      testObject,\n\t\tassert:          assert,\n\t\tpath:            packAssets.Path(),\n\t\thome:            home,\n\t\tdockerConfigDir: dockerConfigDir,\n\t\tverbose:         true,\n\t\tfixtureManager: PackFixtureManager{\n\t\t\ttestObject: testObject,\n\t\t\tassert:     assert,\n\t\t\tlocations:  packAssets.FixturePaths(),\n\t\t},\n\t}\n}\n\nfunc (i *PackInvoker) Cleanup() {\n\tif i == nil {\n\t\treturn\n\t}\n\ti.testObject.Helper()\n\n\terr := os.RemoveAll(i.home)\n\ti.assert.Nil(err)\n}\n\nfunc (i *PackInvoker) cmd(name string, args ...string) *exec.Cmd {\n\ti.testObject.Helper()\n\n\tcmdArgs := append([]string{name}, args...)\n\tcmdArgs = append(cmdArgs, \"--no-color\")\n\tif i.verbose {\n\t\tcmdArgs = append(cmdArgs, \"--verbose\")\n\t}\n\n\tcmd := i.baseCmd(cmdArgs...)\n\n\tcmd.Env = append(os.Environ(), \"DOCKER_CONFIG=\"+i.dockerConfigDir)\n\tif i.home != \"\" {\n\t\tcmd.Env = append(cmd.Env, 
\"PACK_HOME=\"+i.home)\n\t}\n\n\treturn cmd\n}\n\nfunc (i *PackInvoker) baseCmd(parts ...string) *exec.Cmd {\n\treturn exec.Command(i.path, parts...)\n}\n\nfunc (i *PackInvoker) Run(name string, args ...string) (string, error) {\n\ti.testObject.Helper()\n\n\toutput, err := i.cmd(name, args...).CombinedOutput()\n\n\treturn string(output), err\n}\n\nfunc (i *PackInvoker) SetVerbose(verbose bool) {\n\ti.verbose = verbose\n}\n\nfunc (i *PackInvoker) RunSuccessfully(name string, args ...string) string {\n\ti.testObject.Helper()\n\n\toutput, err := i.Run(name, args...)\n\ti.assert.NilWithMessage(err, output)\n\n\treturn output\n}\n\nfunc (i *PackInvoker) JustRunSuccessfully(name string, args ...string) {\n\ti.testObject.Helper()\n\n\t_ = i.RunSuccessfully(name, args...)\n}\n\nfunc (i *PackInvoker) StartWithWriter(combinedOutput *bytes.Buffer, name string, args ...string) *InterruptCmd {\n\tcmd := i.cmd(name, args...)\n\tcmd.Stderr = combinedOutput\n\tcmd.Stdout = combinedOutput\n\n\terr := cmd.Start()\n\ti.assert.Nil(err)\n\n\treturn &InterruptCmd{\n\t\ttestObject:     i.testObject,\n\t\tassert:         i.assert,\n\t\tcmd:            cmd,\n\t\tcombinedOutput: combinedOutput,\n\t}\n}\n\nfunc (i *PackInvoker) Home() string {\n\treturn i.home\n}\n\ntype InterruptCmd struct {\n\ttestObject     *testing.T\n\tassert         h.AssertionManager\n\tcmd            *exec.Cmd\n\tcombinedOutput *bytes.Buffer\n\toutputMux      sync.Mutex\n}\n\nfunc (c *InterruptCmd) TerminateAtStep(pattern string) {\n\tc.testObject.Helper()\n\n\tfor {\n\t\tc.outputMux.Lock()\n\t\tif strings.Contains(c.combinedOutput.String(), pattern) {\n\t\t\terr := c.cmd.Process.Signal(acceptanceOS.InterruptSignal)\n\t\t\tc.assert.Nil(err)\n\t\t\th.AssertNil(c.testObject, err)\n\t\t\treturn\n\t\t}\n\t\tc.outputMux.Unlock()\n\t}\n}\n\nfunc (c *InterruptCmd) Wait() error {\n\treturn c.cmd.Wait()\n}\n\nfunc (i *PackInvoker) Version() string {\n\ti.testObject.Helper()\n\treturn 
strings.TrimSpace(i.RunSuccessfully(\"version\"))\n}\n\nfunc (i *PackInvoker) SanitizedVersion() string {\n\ti.testObject.Helper()\n\t// Sanitizing any git commit sha and build number from the version output\n\tre := regexp.MustCompile(`\\d+\\.\\d+\\.\\d+`)\n\treturn re.FindString(strings.TrimSpace(i.RunSuccessfully(\"version\")))\n}\n\nfunc (i *PackInvoker) EnableExperimental() {\n\ti.testObject.Helper()\n\n\ti.JustRunSuccessfully(\"config\", \"experimental\", \"true\")\n}\n\n// Supports returns whether or not the executor's pack binary supports a\n// given command string. The command string can take one of four forms:\n//   - \"<command>\" (e.g. \"create-builder\")\n//   - \"<flag>\" (e.g. \"--verbose\")\n//   - \"<command> <flag>\" (e.g. \"build --network\")\n//   - \"<command>... <flag>\" (e.g. \"config trusted-builder --network\")\n//\n// Any other form may return false.\nfunc (i *PackInvoker) Supports(command string) bool {\n\ti.testObject.Helper()\n\n\tparts := strings.Split(command, \" \")\n\n\tvar cmdParts = []string{\"help\"}\n\tvar search string\n\n\tif len(parts) > 1 {\n\t\tlast := len(parts) - 1\n\t\tcmdParts = append(cmdParts, parts[:last]...)\n\t\tsearch = parts[last]\n\t} else {\n\t\tcmdParts = append(cmdParts, command)\n\t\tsearch = command\n\t}\n\n\t// \\b does not match between whitespace and the leading dash of a flag, so\n\t// anchor on whitespace or string boundaries instead, and quote the search\n\t// term so it is matched literally.\n\tre := regexp.MustCompile(fmt.Sprintf(`(\\A|\\s)%s(\\s|\\z)`, regexp.QuoteMeta(search)))\n\toutput, err := i.baseCmd(cmdParts...).CombinedOutput()\n\ti.assert.Nil(err)\n\n\treturn re.MatchString(string(output)) && !strings.Contains(string(output), \"Unknown help topic\")\n}\n\ntype Feature int\n\nconst (\n\tCreationTime Feature = 
iota\n\tCache\n\tBuildImageExtensions\n\tRunImageExtensions\n\tStackValidation\n\tForceRebase\n\tBuildpackFlatten\n\tMetaBuildpackFolder\n\tPlatformRetries\n\tFlattenBuilderCreationV2\n\tFixesRunImageMetadata\n\tManifestCommands\n\tPlatformOption\n\tMultiPlatformBuildersAndBuildPackages\n\tStackWarning\n)\n\nvar featureTests = map[Feature]func(i *PackInvoker) bool{\n\tCreationTime: func(i *PackInvoker) bool {\n\t\treturn i.Supports(\"build --creation-time\")\n\t},\n\tCache: func(i *PackInvoker) bool {\n\t\treturn i.Supports(\"build --cache\")\n\t},\n\tBuildImageExtensions: func(i *PackInvoker) bool {\n\t\treturn i.laterThan(\"v0.27.0\")\n\t},\n\tRunImageExtensions: func(i *PackInvoker) bool {\n\t\treturn i.laterThan(\"v0.29.0\")\n\t},\n\tStackValidation: func(i *PackInvoker) bool {\n\t\treturn !i.atLeast(\"v0.30.0\")\n\t},\n\tForceRebase: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.30.0\")\n\t},\n\tBuildpackFlatten: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.30.0\")\n\t},\n\tMetaBuildpackFolder: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.30.0\")\n\t},\n\tPlatformRetries: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.32.1\")\n\t},\n\tFlattenBuilderCreationV2: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.33.1\")\n\t},\n\tFixesRunImageMetadata: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.34.0\")\n\t},\n\tManifestCommands: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.34.0\")\n\t},\n\tPlatformOption: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.34.0\")\n\t},\n\tMultiPlatformBuildersAndBuildPackages: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.34.0\")\n\t},\n\tStackWarning: func(i *PackInvoker) bool {\n\t\treturn i.atLeast(\"v0.37.0\")\n\t},\n}\n\nfunc (i *PackInvoker) SupportsFeature(f Feature) bool {\n\treturn featureTests[f](i)\n}\n\nfunc (i *PackInvoker) semanticVersion() *semver.Version {\n\tversion := i.Version()\n\tsemanticVersion, err := 
semver.NewVersion(strings.TrimPrefix(strings.Split(version, \" \")[0], \"v\"))\n\ti.assert.Nil(err)\n\n\treturn semanticVersion\n}\n\n// laterThan returns true if the pack version is newer than the provided version\n// (a 0.0.0 development build is treated as newer than any release)\nfunc (i *PackInvoker) laterThan(version string) bool {\n\tprovidedVersion := semver.MustParse(version)\n\tver := i.semanticVersion()\n\treturn ver.Compare(providedVersion) > 0 || ver.Equal(semver.MustParse(\"0.0.0\"))\n}\n\n// atLeast returns true if the pack version is the same as or newer than the\n// provided version (a 0.0.0 development build always qualifies)\nfunc (i *PackInvoker) atLeast(version string) bool {\n\tminimalVersion := semver.MustParse(version)\n\tver := i.semanticVersion()\n\treturn ver.Equal(minimalVersion) || ver.GreaterThan(minimalVersion) || ver.Equal(semver.MustParse(\"0.0.0\"))\n}\n\nfunc (i *PackInvoker) ConfigFileContents() string {\n\ti.testObject.Helper()\n\n\tcontents, err := os.ReadFile(filepath.Join(i.home, \"config.toml\"))\n\ti.assert.Nil(err)\n\n\treturn string(contents)\n}\n\nfunc (i *PackInvoker) FixtureManager() PackFixtureManager {\n\treturn i.fixtureManager\n}\n"
  },
  {
    "path": "acceptance/invoke/pack_fixtures.go",
    "content": "//go:build acceptance\n\npackage invoke\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\t\"text/template\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype PackFixtureManager struct {\n\ttestObject *testing.T\n\tassert     h.AssertionManager\n\tlocations  []string\n}\n\nfunc (m PackFixtureManager) FixtureLocation(name string) string {\n\tm.testObject.Helper()\n\n\tfor _, dir := range m.locations {\n\t\tfixtureLocation := filepath.Join(dir, name)\n\t\t_, err := os.Stat(fixtureLocation)\n\t\tif !os.IsNotExist(err) {\n\t\t\treturn fixtureLocation\n\t\t}\n\t}\n\n\tm.testObject.Fatalf(\"fixture %s does not exist in %v\", name, m.locations)\n\n\treturn \"\"\n}\n\nfunc (m PackFixtureManager) VersionedFixtureOrFallbackLocation(pattern, version, fallback string) string {\n\tm.testObject.Helper()\n\n\tversionedName := fmt.Sprintf(pattern, version)\n\n\tfor _, dir := range m.locations {\n\t\tfixtureLocation := filepath.Join(dir, versionedName)\n\t\t_, err := os.Stat(fixtureLocation)\n\t\tif !os.IsNotExist(err) {\n\t\t\treturn fixtureLocation\n\t\t}\n\t}\n\n\treturn m.FixtureLocation(fallback)\n}\n\nfunc (m PackFixtureManager) TemplateFixture(templateName string, templateData map[string]interface{}) string {\n\tm.testObject.Helper()\n\n\toutputTemplate, err := os.ReadFile(m.FixtureLocation(templateName))\n\tm.assert.Nil(err)\n\n\treturn m.fillTemplate(outputTemplate, templateData)\n}\n\nfunc (m PackFixtureManager) TemplateVersionedFixture(\n\tversionedPattern, version, fallback string,\n\ttemplateData map[string]interface{},\n) string {\n\tm.testObject.Helper()\n\toutputTemplate, err := os.ReadFile(m.VersionedFixtureOrFallbackLocation(versionedPattern, version, fallback))\n\tm.assert.Nil(err)\n\n\treturn m.fillTemplate(outputTemplate, templateData)\n}\n\nfunc (m PackFixtureManager) TemplateFixtureToFile(name string, destination *os.File, data map[string]interface{}) {\n\t_, err := 
io.WriteString(destination, m.TemplateFixture(name, data))\n\tm.assert.Nil(err)\n}\n\nfunc (m PackFixtureManager) fillTemplate(templateContents []byte, data map[string]interface{}) string {\n\ttpl, err := template.New(\"\").\n\t\tFuncs(template.FuncMap{\n\t\t\t\"StringsJoin\": strings.Join,\n\t\t\t\"StringsDoubleQuote\": func(s []string) []string {\n\t\t\t\tresult := []string{}\n\t\t\t\tfor _, str := range s {\n\t\t\t\t\tresult = append(result, fmt.Sprintf(`\"%s\"`, str))\n\t\t\t\t}\n\t\t\t\treturn result\n\t\t\t},\n\t\t\t\"StringsEscapeBackslash\": func(s string) string {\n\t\t\t\tresult := []rune{}\n\t\t\t\tfor _, elem := range s {\n\t\t\t\t\tswitch {\n\t\t\t\t\tcase elem == '\\\\':\n\t\t\t\t\t\tresult = append(result, '\\\\', '\\\\')\n\t\t\t\t\tdefault:\n\t\t\t\t\t\tresult = append(result, elem)\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\treturn string(result)\n\t\t\t},\n\t\t}).\n\t\tParse(string(templateContents))\n\tm.assert.Nil(err)\n\n\tvar templatedContent bytes.Buffer\n\terr = tpl.Execute(&templatedContent, data)\n\tm.assert.Nil(err)\n\n\treturn templatedContent.String()\n}\n"
  },
  {
    "path": "acceptance/managers/image_manager.go",
    "content": "//go:build acceptance\n\npackage managers\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/moby/moby/api/types/container\"\n\t\"github.com/moby/moby/api/types/image\"\n\t\"github.com/moby/moby/api/types/network\"\n\t\"github.com/moby/moby/client\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nvar DefaultDuration = 10 * time.Second\n\ntype ImageManager struct {\n\ttestObject *testing.T\n\tassert     h.AssertionManager\n\tdockerCli  *client.Client\n}\n\nfunc NewImageManager(t *testing.T, dockerCli *client.Client) ImageManager {\n\treturn ImageManager{\n\t\ttestObject: t,\n\t\tassert:     h.NewAssertionManager(t),\n\t\tdockerCli:  dockerCli,\n\t}\n}\n\nfunc (im ImageManager) CleanupImages(imageNames ...string) {\n\tim.testObject.Helper()\n\terr := h.DockerRmi(im.dockerCli, imageNames...)\n\tif err != nil {\n\t\tim.testObject.Logf(\"%s: Failed to remove image from %s\", err, imageNames)\n\t}\n}\n\nfunc (im ImageManager) InspectLocal(imageName string) (image.InspectResponse, error) {\n\tim.testObject.Helper()\n\tresult, err := im.dockerCli.ImageInspect(context.Background(), imageName)\n\treturn result.InspectResponse, err\n}\n\nfunc (im ImageManager) GetImageID(image string) string {\n\tim.testObject.Helper()\n\tinspect, err := im.InspectLocal(image)\n\tim.assert.Nil(err)\n\treturn inspect.ID\n}\n\nfunc (im ImageManager) HostOS() string {\n\tim.testObject.Helper()\n\tinfoResult, err := im.dockerCli.Info(context.Background(), client.InfoOptions{})\n\tim.assert.Nil(err)\n\treturn infoResult.Info.OSType\n}\n\nfunc (im ImageManager) TagImage(imageName, ref string) {\n\tim.testObject.Helper()\n\t_, err := im.dockerCli.ImageTag(context.Background(), client.ImageTagOptions{\n\t\tSource: imageName,\n\t\tTarget: ref,\n\t})\n\tim.assert.Nil(err)\n}\n\nfunc (im ImageManager) PullImage(image, registryAuth string) {\n\tim.testObject.Helper()\n\terr := h.PullImageWithAuth(im.dockerCli, image, 
registryAuth)\n\tim.assert.Nil(err)\n}\n\nfunc (im ImageManager) ExposePortOnImage(image, containerName string) TestContainer {\n\tim.testObject.Helper()\n\tctx := context.Background()\n\n\tctr, err := im.dockerCli.ContainerCreate(ctx, client.ContainerCreateOptions{\n\t\tConfig: &container.Config{\n\t\t\tImage:        image,\n\t\t\tExposedPorts: network.PortSet{network.MustParsePort(\"8080/tcp\"): {}},\n\t\t\tHealthcheck:  nil,\n\t\t},\n\t\tHostConfig: &container.HostConfig{\n\t\t\tPortBindings: network.PortMap{\n\t\t\t\tnetwork.MustParsePort(\"8080/tcp\"): []network.PortBinding{{}},\n\t\t\t},\n\t\t\tAutoRemove: true,\n\t\t},\n\t\tName: containerName,\n\t})\n\tim.assert.Nil(err)\n\n\t_, err = im.dockerCli.ContainerStart(ctx, ctr.ID, client.ContainerStartOptions{})\n\tim.assert.Nil(err)\n\treturn TestContainer{\n\t\ttestObject: im.testObject,\n\t\tdockerCli:  im.dockerCli,\n\t\tassert:     im.assert,\n\t\tname:       containerName,\n\t\tid:         ctr.ID,\n\t}\n}\n\nfunc (im ImageManager) CreateContainer(name string) TestContainer {\n\tim.testObject.Helper()\n\tcontainerName := \"test-\" + h.RandString(10)\n\tctr, err := im.dockerCli.ContainerCreate(context.Background(), client.ContainerCreateOptions{\n\t\tConfig: &container.Config{\n\t\t\tImage: name,\n\t\t},\n\t\tName: containerName,\n\t})\n\tim.assert.Nil(err)\n\n\treturn TestContainer{\n\t\ttestObject: im.testObject,\n\t\tdockerCli:  im.dockerCli,\n\t\tassert:     im.assert,\n\t\tname:       containerName,\n\t\tid:         ctr.ID,\n\t}\n}\n\ntype TestContainer struct {\n\ttestObject *testing.T\n\tdockerCli  *client.Client\n\tassert     h.AssertionManager\n\tname       string\n\tid         string\n}\n\nfunc (t TestContainer) RunWithOutput() string {\n\tt.testObject.Helper()\n\tvar b bytes.Buffer\n\terr := h.RunContainer(context.Background(), t.dockerCli, t.id, &b, &b)\n\tt.assert.Nil(err)\n\treturn b.String()\n}\n\nfunc (t TestContainer) Cleanup() 
{\n\tt.testObject.Helper()\n\tt.dockerCli.ContainerKill(context.Background(), t.name, client.ContainerKillOptions{Signal: \"SIGKILL\"})\n\tt.dockerCli.ContainerRemove(context.Background(), t.name, client.ContainerRemoveOptions{Force: true})\n}\n\nfunc (t TestContainer) WaitForResponse(duration time.Duration) string {\n\tt.testObject.Helper()\n\tticker := time.NewTicker(500 * time.Millisecond)\n\tdefer ticker.Stop()\n\ttimer := time.NewTimer(duration)\n\tdefer timer.Stop()\n\n\tappURI := fmt.Sprintf(\"http://%s\", h.RegistryHost(h.DockerHostname(t.testObject), t.hostPort()))\n\tfor {\n\t\tselect {\n\t\tcase <-ticker.C:\n\t\t\tresp, err := h.HTTPGetE(appURI, map[string]string{})\n\t\t\tif err != nil {\n\t\t\t\tbreak\n\t\t\t}\n\t\t\treturn resp\n\t\tcase <-timer.C:\n\t\t\tt.testObject.Fatalf(\"timeout waiting for response: %v\", duration)\n\t\t}\n\t}\n}\n\nfunc (t TestContainer) hostPort() string {\n\tt.testObject.Helper()\n\tresult, err := t.dockerCli.ContainerInspect(context.Background(), t.name, client.ContainerInspectOptions{})\n\tt.assert.Nil(err)\n\ti := result.Container\n\tfor _, port := range i.NetworkSettings.Ports {\n\t\tfor _, binding := range port {\n\t\t\treturn binding.HostPort\n\t\t}\n\t}\n\n\tt.testObject.Fatalf(\"Failed to fetch host port for %s: no ports exposed\", t.name)\n\treturn \"\"\n}\n"
  },
  {
    "path": "acceptance/os/variables.go",
    "content": "//go:build acceptance && !windows\n\npackage os\n\nimport \"os\"\n\nconst PackBinaryName = \"pack\"\n\nvar InterruptSignal = os.Interrupt\n"
  },
  {
    "path": "acceptance/os/variables_darwin.go",
    "content": "//go:build acceptance && darwin && amd64\n\npackage os\n\nimport \"regexp\"\n\nvar PackBinaryExp = regexp.MustCompile(`pack-v\\d+\\.\\d+\\.\\d+-macos\\.`)\n"
  },
  {
    "path": "acceptance/os/variables_darwin_arm64.go",
    "content": "//go:build acceptance && darwin && arm64\n\npackage os\n\nimport \"regexp\"\n\nvar PackBinaryExp = regexp.MustCompile(`pack-v\\d+\\.\\d+\\.\\d+-macos-`)\n"
  },
  {
    "path": "acceptance/os/variables_linux.go",
    "content": "//go:build acceptance && linux\n\npackage os\n\nimport \"regexp\"\n\nvar PackBinaryExp = regexp.MustCompile(`pack-v\\d+\\.\\d+\\.\\d+-linux\\.`)\n"
  },
  {
    "path": "acceptance/os/variables_windows.go",
    "content": "//go:build acceptance && windows\n\npackage os\n\nimport (\n\t\"os\"\n\t\"regexp\"\n)\n\nconst PackBinaryName = \"pack.exe\"\n\nvar (\n\tPackBinaryExp   = regexp.MustCompile(`pack-v\\d+\\.\\d+\\.\\d+-windows`)\n\tInterruptSignal = os.Kill\n)\n"
  },
  {
    "path": "acceptance/suite_manager.go",
    "content": "//go:build acceptance\n\npackage acceptance\n\ntype SuiteManager struct {\n\tout          func(format string, args ...interface{})\n\tresults      map[string]interface{}\n\tcleanUpTasks map[string]func() error\n}\n\nfunc (s *SuiteManager) CleanUp() error {\n\tfor key, cleanUp := range s.cleanUpTasks {\n\t\ts.out(\"Running cleanup task '%s'\\n\", key)\n\t\tif err := cleanUp(); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc (s *SuiteManager) RegisterCleanUp(key string, cleanUp func() error) {\n\tif s.cleanUpTasks == nil {\n\t\ts.cleanUpTasks = map[string]func() error{}\n\t}\n\n\ts.cleanUpTasks[key] = cleanUp\n}\n\nfunc (s *SuiteManager) RunTaskOnceString(key string, run func() (string, error)) (string, error) {\n\tv, err := s.runTaskOnce(key, func() (interface{}, error) {\n\t\treturn run()\n\t})\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\treturn v.(string), nil\n}\n\nfunc (s *SuiteManager) runTaskOnce(key string, run func() (interface{}, error)) (interface{}, error) {\n\tif s.results == nil {\n\t\ts.results = map[string]interface{}{}\n\t}\n\n\tvalue, found := s.results[key]\n\tif !found {\n\t\ts.out(\"Running task '%s'\\n\", key)\n\t\tv, err := run()\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\n\t\ts.results[key] = v\n\n\t\treturn v, nil\n\t}\n\n\treturn value, nil\n}\n"
  },
  {
    "path": "acceptance/testconfig/all.json",
    "content": "[\n  {\"pack\": \"current\", \"pack_create_builder\": \"current\", \"lifecycle\": \"current\"},\n  {\"pack\": \"current\", \"pack_create_builder\": \"current\", \"lifecycle\": \"previous\"},\n  {\"pack\": \"current\", \"pack_create_builder\": \"previous\", \"lifecycle\": \"previous\"},\n  {\"pack\": \"previous\", \"pack_create_builder\": \"current\", \"lifecycle\": \"current\"},\n  {\"pack\": \"previous\", \"pack_create_builder\": \"current\", \"lifecycle\": \"previous\"}\n]"
  },
  {
    "path": "acceptance/testdata/mock_app/project.toml",
    "content": "[project]\n  version = \"1.0.2\"\n  source-url = \"https://github.com/buildpacks/pack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_app/run",
    "content": "#!/usr/bin/env bash\n\nset -x\n\nport=\"${1-8080}\"\n\necho \"listening on port $port\"\n\nresp=$(echo \"HTTP/1.1 200 OK\\n\" && cat \"$PWD\"/*-deps/*-dep /contents*.txt)\nwhile true; do\n  nc -l -p \"$port\" -c \"echo \\\"$resp\\\"\"\ndone\n"
  },
  {
    "path": "acceptance/testdata/mock_app/run.bat",
    "content": "@echo off\n\nset port=8080\nif [%1] neq [] set port=%1\n\nC:\\util\\server.exe -p %port% -g \"%cd%\\*-deps\\*-dep, c:\\contents*.txt\"\n\n\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/descriptor-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: Descriptor Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\nls -laR\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/descriptor-buildpack/bin/build.bat",
    "content": "@echo off\n\necho ---- Build: Descriptor Buildpack\n\ndir /s\n\necho ---- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/descriptor-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\necho \"---> Detect: Descriptor Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/descriptor-buildpack/bin/detect.bat",
    "content": "@echo off\n\necho ---- Detect: Descriptor Buildpack\n\necho ---- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/descriptor-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"descriptor/bp\"\n  version = \"descriptor-bp-version\"\n  name = \"Descriptor Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/internet-capable-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: Internet Capable Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\n\nif netcat -z -w 1 google.com 80; then\n  echo \"RESULT: Connected to the internet\"\nelse\n  echo \"RESULT: Disconnected from the internet\"\nfi\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/internet-capable-buildpack/bin/build.bat",
    "content": "@echo off\n\necho ---- Build: Internet capable buildpack\n\nping -n 1 google.com\n\nif %ERRORLEVEL% equ 0 (\n  echo RESULT: Connected to the internet\n) else (\n  echo RESULT: Disconnected from the internet\n)\n\necho ---- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/internet-capable-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\necho \"---> Internet capable buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\n\nif netcat -z -w 1 google.com 80; then\n  echo \"RESULT: Connected to the internet\"\nelse\n  echo \"RESULT: Disconnected from the internet\"\nfi\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/internet-capable-buildpack/bin/detect.bat",
    "content": "@echo off\n\necho ---- Detect: Internet capable buildpack\n\nping -n 1 google.com\n\nif %ERRORLEVEL% equ 0 (\n  echo RESULT: Connected to the internet\n) else (\n  echo RESULT: Disconnected from the internet\n)\n\necho ---- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/internet-capable-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"internet/bp\"\n  version = \"internet-bp-version\"\n  name = \"Internet Capable Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/meta-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"local/meta-bp\"\n  version = \"local-meta-bp-version\"\n  name = \"Local Meta-Buildpack\"\n\n[[order]]\n[[order.group]]\nid = \"local/meta-bp-dep\"\nversion = \"local-meta-bp-version\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/meta-buildpack/package.toml",
    "content": "[buildpack]\nuri = \".\"\n\n[[dependencies]]\nuri = \"../meta-buildpack-dependency\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/meta-buildpack-dependency/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: Local Meta-Buildpack Dependency\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/meta-buildpack-dependency/bin/build.bat",
    "content": "@echo off\n\necho ---- Build: Local Meta-Buildpack Dependency\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/meta-buildpack-dependency/bin/detect",
    "content": "#!/usr/bin/env bash\n\n## always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/meta-buildpack-dependency/bin/detect.bat",
    "content": "@echo off\n:: always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/meta-buildpack-dependency/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"local/meta-bp-dep\"\n  version = \"local-meta-bp-version\"\n  name = \"Local Meta-Buildpack Dependency\"\n\n[[stacks]]\n  id = \"pack.test.stack\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/multi-platform-buildpack/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n  name = \"Simple Layers Buildpack\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[targets]]\nos = \"windows\"\narch = \"amd64\"\n\n[[stacks]]\n  id = \"*\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/multi-platform-buildpack/linux/amd64/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: NOOP Buildpack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/multi-platform-buildpack/linux/amd64/bin/detect",
    "content": "#!/usr/bin/env bash\n\n## always detect"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/multi-platform-buildpack/windows/amd64/bin/build.bat",
    "content": "@echo off\n\necho ---- Build: NOOP Buildpack\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/multi-platform-buildpack/windows/amd64/bin/detect.bat",
    "content": "@echo off\n:: always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/nested-level-1-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"simple/nested-level-1\"\n  version = \"nested-l1-version\"\n  name = \"Nested Level One Buildpack\"\n\n[[order]]\n  [[order.group]]\n    id = \"simple/nested-level-2\"\n    version = \"nested-l2-version\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/nested-level-2-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"simple/nested-level-2\"\n  version = \"nested-l2-version\"\n  name = \"Nested Level Two Buildpack\"\n\n[[order]]\n  [[order.group]]\n    id = \"simple/layers\"\n    version = \"simple-layers-version\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/noop-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: NOOP Buildpack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/noop-buildpack/bin/build.bat",
    "content": "@echo off\n\necho ---- Build: NOOP Buildpack\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/noop-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\n## always detect"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/noop-buildpack/bin/detect.bat",
    "content": "@echo off\n:: always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/noop-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"noop.buildpack\"\n  version = \"noop.buildpack.version\"\n  name = \"NOOP Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/noop-buildpack-2/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: NOOP Buildpack (later version)\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/noop-buildpack-2/bin/detect",
    "content": "#!/usr/bin/env bash\n\n## always detect"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/noop-buildpack-2/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"noop.buildpack\"\n  version = \"noop.buildpack.later-version\"\n  name = \"NOOP Buildpack\"\n  homepage = \"http://geocities.com/cool-bp\"\n\n[[stacks]]\n  id = \"pack.test.stack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/not-in-builder-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: Local Buildpack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/not-in-builder-buildpack/bin/build.bat",
    "content": "@echo off\n\necho ---- Build: Local Buildpack\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/not-in-builder-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\n## always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/not-in-builder-buildpack/bin/detect.bat",
    "content": "@echo off\n:: always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/not-in-builder-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"local/bp\"\n  version = \"local-bp-version\"\n  name = \"Local Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/other-stack-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: Other Stack Buildpack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/other-stack-buildpack/bin/build.bat",
    "content": "@echo off\n\necho ---- Build: Other Stack Buildpack\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/other-stack-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\n## always detect"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/other-stack-buildpack/bin/detect.bat",
    "content": "@echo off\n:: always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/other-stack-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"other/stack/bp\"\n  version = \"other-stack-version\"\n  name = \"Other Stack Buildpack\"\n\n[[stacks]]\n  id = \"other.stack\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-env-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: Read Env Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\nlaunch_dir=$1\nplatform_dir=$2\n\n## makes a launch layer\nif [[ -f \"$platform_dir/env/ENV1_CONTENTS\" ]]; then\n    echo \"making env1 layer\"\n    mkdir \"$launch_dir/env1-launch-layer\"\n    contents=$(cat \"$platform_dir/env/ENV1_CONTENTS\")\n    echo \"$contents\" > \"$launch_dir/env1-launch-layer/env1-launch-dep\"\n    ln -snf \"$launch_dir/env1-launch-layer\" env1-launch-deps\n    echo \"[types]\" > \"$launch_dir/env1-launch-layer.toml\"\n    echo \"launch = true\" >> \"$launch_dir/env1-launch-layer.toml\"\nfi\n\n## makes a launch layer\nif [[ -f \"$platform_dir/env/ENV2_CONTENTS\" ]]; then\n    echo \"making env2 layer\"\n    mkdir \"$launch_dir/env2-launch-layer\"\n    contents=$(cat \"$platform_dir/env/ENV2_CONTENTS\")\n    echo \"$contents\" > \"$launch_dir/env2-launch-layer/env2-launch-dep\"\n    ln -snf \"$launch_dir/env2-launch-layer\" env2-launch-deps\n    echo \"[types]\" > \"$launch_dir/env2-launch-layer.toml\"\n    echo \"launch = true\" >> \"$launch_dir/env2-launch-layer.toml\"\nfi\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-env-buildpack/bin/build.bat",
    "content": "@echo off\nsetlocal EnableDelayedExpansion\n\nset launch_dir=%1\nset platform_dir=%2\n\n:: makes a launch layer\nif exist %platform_dir%\\env\\ENV1_CONTENTS (\n    echo making env1 layer\n    mkdir %launch_dir%\\env1-launch-layer\n    set /p contents=<%platform_dir%\\env\\ENV1_CONTENTS\n    echo !contents!> %launch_dir%\\env1-launch-layer\\env1-launch-dep\n    mklink /j env1-launch-deps %launch_dir%\\env1-launch-layer\n    echo [types] > %launch_dir%\\env1-launch-layer.toml\n    echo launch = true >> %launch_dir%\\env1-launch-layer.toml\n)\n\n:: makes a launch layer\nif exist %platform_dir%\\env\\ENV2_CONTENTS (\n    echo making env2 layer\n    mkdir %launch_dir%\\env2-launch-layer\n    set /p contents=<%platform_dir%\\env\\ENV2_CONTENTS\n    echo !contents!> %launch_dir%\\env2-launch-layer\\env2-launch-dep\n    mklink /j env2-launch-deps %launch_dir%\\env2-launch-layer\n    echo [types] > %launch_dir%\\env2-launch-layer.toml\n    echo launch = true >> %launch_dir%\\env2-launch-layer.toml\n)\n\necho --- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-env-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\necho \"---> DETECT: Printenv buildpack\"\n\nset -o errexit\nset -o pipefail\n\nplatform_dir=$1\n\nif [[ ! -f $platform_dir/env/DETECT_ENV_BUILDPACK ]]; then\n exit 1\nfi"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-env-buildpack/bin/detect.bat",
    "content": "@echo off\n\necho DETECT: Printenv buildpack\n\nset platform_dir=%1\n\nif not exist %platform_dir%\\env\\DETECT_ENV_BUILDPACK (\n    exit 1\n)\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-env-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  name = \"Read Env Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-env-extension/bin/detect",
    "content": "#!/usr/bin/env bash\n\necho \"---> Detect: Read Env Extension\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-env-extension/bin/generate",
    "content": "#!/usr/bin/env bash\n\necho \"---> Generate: Read Env Extension\"\n\n# 1. Get args\noutput_dir=$CNB_OUTPUT_DIR\n\n# 2. Generate build.Dockerfile\ncat >> \"${output_dir}/build.Dockerfile\" <<EOL\nARG base_image\nFROM \\${base_image}\n\nRUN echo \"Hello World\"\nEOL\n\n# 3. Optionally generate run.Dockerfile\nif [[ -z \"$EXT_RUN\" ]]; then\n  echo \"Skipping run image extension, not requested...\"\nelse\n  echo \"Generating run.Dockerfile for run image extension...\"\n  cat >>\"${output_dir}/run.Dockerfile\" <<EOL\nARG base_image\nFROM \\${base_image}\n\nUSER root\nRUN echo \"Hello World\" > /from-ext.txt\n\nARG user_id\nUSER \\${user_id}\nEOL\nfi\n\nif [[ -z \"$EXT_RUN_SWITCH\" ]]; then\n  echo \"Skipping run image switch, not requested...\"\nelse\n  echo \"Generating run.Dockerfile for run image switch...\"\n  cat >>\"${output_dir}/run.Dockerfile\" <<EOL\nFROM busybox:latest\nEOL\nfi"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-env-extension/extension.toml",
    "content": "api = \"0.9\"\n\n[extension]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  name = \"Read Env Extension\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-volume-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\nTEST_FILE_PATH=${TEST_FILE_PATH:?\"env var must be set\"}\n\necho \"---> Build: Volume Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\necho \"Build: Reading file '${TEST_FILE_PATH}': $(< \"${TEST_FILE_PATH}\")\"\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-volume-buildpack/bin/build.bat",
    "content": "@echo off\n\necho --- Build: Volume Buildpack\n\nset /p content=<%TEST_FILE_PATH%\necho Build: Reading file '%TEST_FILE_PATH%': %content%\n\necho --- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-volume-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\nTEST_FILE_PATH=${TEST_FILE_PATH:?\"env var must be set\"}\n\necho \"---> Detect: Volume Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\necho \"Detect: Reading file '${TEST_FILE_PATH}': $(< \"${TEST_FILE_PATH}\")\"\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-volume-buildpack/bin/detect.bat",
    "content": "@echo off\n\necho --- Detect: Volume Buildpack\n\nset /p content=<%TEST_FILE_PATH%\necho Detect: Reading file '%TEST_FILE_PATH%': %content%\n\necho --- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-volume-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"volume/bp\"\n  version = \"volume-bp-version\"\n  name = \"Volume Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-write-volume-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\nTEST_FILE_PATH=${BUILD_TEST_FILE_PATH:?\"env var must be set\"}\n\necho \"---> Build: Read/Write Volume Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\necho \"Build: Writing file '${TEST_FILE_PATH}': $(echo \"some-content\" > \"${TEST_FILE_PATH}\" && echo \"written\" || echo \"failed\")\"\necho \"Build: Reading file '${TEST_FILE_PATH}': $(< \"${TEST_FILE_PATH}\")\"\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-write-volume-buildpack/bin/build.bat",
    "content": "@echo off\n\nset TEST_FILE_PATH=%BUILD_TEST_FILE_PATH%\n\necho --- Build: Read/Write Volume Buildpack\n\necho some-content> %TEST_FILE_PATH%\nif exist %TEST_FILE_PATH% (\n    echo Build: Writing file '%TEST_FILE_PATH%': written\n) else (\n    echo Build: Writing file '%TEST_FILE_PATH%': failed\n)\n\nset /p content=<%TEST_FILE_PATH%\necho Build: Reading file '%TEST_FILE_PATH%': %content%\n\necho --- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-write-volume-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\nTEST_FILE_PATH=${DETECT_TEST_FILE_PATH:?\"env var must be set\"}\n\necho \"---> Detect: Read/Write Volume Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\necho \"Detect: Writing file '${TEST_FILE_PATH}': $(echo \"some-content\" > \"${TEST_FILE_PATH}\" && echo \"written\" || echo \"failed\")\"\necho \"Detect: Reading file '${TEST_FILE_PATH}': $(< \"${TEST_FILE_PATH}\")\"\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-write-volume-buildpack/bin/detect.bat",
    "content": "@echo off\n\nset TEST_FILE_PATH=%DETECT_TEST_FILE_PATH%\n\necho --- Detect: Read/Write Volume Buildpack\n\necho some-content> %TEST_FILE_PATH%\nif exist %TEST_FILE_PATH% (\n    echo Detect: Writing file '%TEST_FILE_PATH%': written\n) else (\n    echo Detect: Writing file '%TEST_FILE_PATH%': failed\n)\n\nset /p content=<%TEST_FILE_PATH%\necho Detect: Reading file '%TEST_FILE_PATH%': %content%\n\necho --- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/read-write-volume-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"rw-volume/bp\"\n  version = \"rw-volume-bp-version\"\n  name = \"Read/Write Volume Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: Simple Layers Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\nlaunch_dir=$1\n\n## makes a launch layer\necho \"making launch layer\"\n\n# Add color line, to test for --no-color\necho \"Color: \u001b[0mStyled\"\n\nmkdir \"$launch_dir/launch-layer\"\necho \"Launch Dep Contents\" > \"$launch_dir/launch-layer/launch-dep\"\nln -snf \"$launch_dir/launch-layer\" launch-deps\necho \"[types]\" > \"$launch_dir/launch-layer.toml\"\necho \"launch = true\" >> \"$launch_dir/launch-layer.toml\"\n\n## makes a cached launch layer\nif [[ ! -f \"$launch_dir/cached-launch-layer.toml\" ]]; then\n    echo \"making cached launch layer\"\n    mkdir \"$launch_dir/cached-launch-layer\"\n    echo \"Cached Dep Contents\" > \"$launch_dir/cached-launch-layer/cached-dep\"\n    ln -snf \"$launch_dir/cached-launch-layer\" cached-deps\n    echo \"[types]\" > \"$launch_dir/cached-launch-layer.toml\"\n    echo \"launch = true\" >> \"$launch_dir/cached-launch-layer.toml\"\n    echo \"cache = true\" >> \"$launch_dir/cached-launch-layer.toml\"\nelse\n  echo \"reusing cached launch layer\"\n  echo \"[types]\" > \"$launch_dir/cached-launch-layer.toml\"\n  echo \"launch = true\" >> \"$launch_dir/cached-launch-layer.toml\"\n  echo \"cache = true\" >> \"$launch_dir/cached-launch-layer.toml\"\n  ln -snf \"$launch_dir/cached-launch-layer\" cached-deps\nfi\n\n## adds a process\ncat <<EOF > \"$launch_dir/launch.toml\"\n[[processes]]\n  type = \"web\"\n  command = \"./run\"\n  args = [\"8080\"]\n  default = true\n\n[[processes]]\n  type = \"hello\"\n  command = \"echo\"\n  args = [\"hello\", \"world\"]\n  direct = true\nEOF\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack/bin/build.bat",
    "content": "@echo off\necho --- Build: Simple Layers Buildpack\n\nset launch_dir=%1\n\n:: makes a launch layer\necho making launch layer %launch_dir%\\launch-layer\nmkdir %launch_dir%\\launch-layer\necho Launch Dep Contents > %launch_dir%\\launch-layer\\launch-dep\nmklink /j launch-deps %launch_dir%\\launch-layer\necho [types] > %launch_dir%\\launch-layer.toml\necho launch = true >> %launch_dir%\\launch-layer.toml\n\n:: makes a cached launch layer\nif not exist %launch_dir%\\cached-launch-layer.toml (\n    echo making cached launch layer %launch_dir%\\cached-launch-layer\n    mkdir %launch_dir%\\cached-launch-layer\n    echo Cached Dep Contents > %launch_dir%\\cached-launch-layer\\cached-dep\n    mklink /j cached-deps %launch_dir%\\cached-launch-layer\n    echo [types] > %launch_dir%\\cached-launch-layer.toml\n    echo launch = true >> %launch_dir%\\cached-launch-layer.toml\n    echo cache = true >> %launch_dir%\\cached-launch-layer.toml\n) else (\n    echo reusing cached launch layer %launch_dir%\\cached-launch-layer\n    echo [types] > %launch_dir%\\cached-launch-layer.toml\n    echo launch = true >> %launch_dir%\\cached-launch-layer.toml\n    echo cache = true >> %launch_dir%\\cached-launch-layer.toml\n    mklink /j cached-deps %launch_dir%\\cached-launch-layer\n)\n\n:: adds a process\n(\necho [[processes]]\necho   type = \"web\"\necho   command = '.\\run'\necho   args = [\"8080\"]\necho   default = true\necho.\necho [[processes]]\necho   type = \"hello\"\necho   command = \"cmd\"\necho   args = [\"/c\", \"echo hello world\"]\necho   direct = true\n) > %launch_dir%\\launch.toml\n\necho --- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\n## always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack/bin/detect.bat",
    "content": "@echo off\n:: always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n  name = \"Simple Layers Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack-different-sha/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> Build: Simple Layers Different Sha Buildpack\"\n\nset -o errexit\nset -o nounset\nset -o pipefail\n\nlaunch_dir=$1\n\n## makes a launch layer\necho \"making launch layer\"\n\n# Add color line, to test for --no-color\necho \"Color: \u001b[0mStyled\"\n\nmkdir \"$launch_dir/launch-layer\"\necho \"Launch Dep Contents\" > \"$launch_dir/launch-layer/launch-dep\"\nln -snf \"$launch_dir/launch-layer\" launch-deps\necho \"[types]\" > \"$launch_dir/launch-layer.toml\"\necho \"launch = true\" >> \"$launch_dir/launch-layer.toml\"\n\n## makes a cached launch layer\nif [[ ! -f \"$launch_dir/cached-launch-layer.toml\" ]]; then\n    echo \"making cached launch layer\"\n    mkdir \"$launch_dir/cached-launch-layer\"\n    echo \"Cached Dep Contents\" > \"$launch_dir/cached-launch-layer/cached-dep\"\n    ln -snf \"$launch_dir/cached-launch-layer\" cached-deps\n    echo \"[types]\" > \"$launch_dir/cached-launch-layer.toml\"\n    echo \"launch = true\" >> \"$launch_dir/cached-launch-layer.toml\"\n    echo \"cache = true\" >> \"$launch_dir/cached-launch-layer.toml\"\nelse\n  echo \"reusing cached launch layer\"\n  ln -snf \"$launch_dir/cached-launch-layer\" cached-deps\nfi\n\n## adds a process\ncat <<EOF > \"$launch_dir/launch.toml\"\n[[processes]]\n  type = \"web\"\n  command = \"./run\"\n  args = [\"8080\"]\n  default = true\n\n[[processes]]\n  type = \"hello\"\n  command = \"echo\"\n  args = [\"hello\", \"world\"]\n  direct = true\nEOF\n\necho \"---> Done\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack-different-sha/bin/build.bat",
    "content": "@echo off\necho --- Build: Simple Layers Different Sha Buildpack\n\nset launch_dir=%1\n\n:: makes a launch layer\necho making launch layer %launch_dir%\\launch-layer\nmkdir %launch_dir%\\launch-layer\necho Launch Dep Contents > \"%launch_dir%\\launch-layer\\launch-dep\nmklink /j launch-deps %launch_dir%\\launch-layer\necho [types] > %launch_dir%\\launch-layer.toml\necho launch = true >> %launch_dir%\\launch-layer.toml\n\n:: makes a cached launch layer\nif not exist %launch_dir%\\cached-launch-layer.toml (\n    echo making cached launch layer %launch_dir%\\cached-launch-layer\n    mkdir %launch_dir%\\cached-launch-layer\n    echo Cached Dep Contents > %launch_dir%\\cached-launch-layer\\cached-dep\n    mklink /j cached-deps %launch_dir%\\cached-launch-layer\n    echo [types] > %launch_dir%\\cached-launch-layer.toml\n    echo launch = true >> %launch_dir%\\cached-launch-layer.toml\n    echo cache = true >> %launch_dir%\\cached-launch-layer.toml\n) else (\n    echo reusing cached launch layer %launch_dir%\\cached-launch-layer\n    mklink /j cached-deps %launch_dir%\\cached-launch-layer\n)\n\n:: adds a process\n(\necho [[processes]]\necho   type = \"web\"\necho   command = '.\\run'\necho   args = [\"8080\"]\necho   default = true\necho.\necho [[processes]]\necho   type = \"hello\"\necho   command = \"cmd\"\necho   args = [\"/c\", \"echo hello world\"]\necho   direct = true\n) > %launch_dir%\\launch.toml\n\necho --- Done\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack-different-sha/bin/detect",
    "content": "#!/usr/bin/env bash\n\n## always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack-different-sha/bin/detect.bat",
    "content": "@echo off\n:: always detect\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack-different-sha/bin/extra_file.txt",
    "content": "Just some extra content to change the sha256 of this buildpack :)"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-buildpack-different-sha/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n  name = \"Simple Layers Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-extension/bin/detect",
    "content": "#!/usr/bin/env bash\n\necho \"---> Detect: Simple Layers Extension\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-extension/bin/generate",
    "content": "#!/usr/bin/env bash\n\necho \"---> Generate: Simple Layers Extension\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-extension/extension.toml",
    "content": "api = \"0.7\"\n\n[extension]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n  name = \"Simple Layers Extension\"\n"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/simple-layers-parent-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"simple/layers/parent\"\n  version = \"simple-layers-parent-version\"\n  name = \"Simple Layers Parent Buildpack\"\n\n[[order]]\n[[order.group]]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-fail-detect/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> BUILD: System Fail Detect buildpack (should never run)\"\n\n# This should never be reached\nexit 1"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-fail-detect/bin/detect",
    "content": "#!/usr/bin/env bash\n\necho \"---> DETECT: System Fail Detect buildpack (will fail)\"\n\n# Always fail detection\nexit 1"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-fail-detect/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"system/fail-detect\"\n  version = \"system-fail-detect-version\"\n  name = \"System Fail Detect Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-post-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> BUILD: System Post buildpack\"\n\nset -o errexit\nset -o pipefail\n\nlayers_dir=$1\nplatform_dir=$2\n\n# Create a layer to verify it ran\nmkdir -p \"${layers_dir}/system-post\"\ncat > \"${layers_dir}/system-post.toml\" <<EOF\nlaunch = true\ncache = true\nEOF\n\necho \"System Post Buildpack was here\" > \"${layers_dir}/system-post/marker\"\n\nexit 0"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-post-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\necho \"---> DETECT: System Post buildpack\"\n\n# Always pass detection for testing\nexit 0"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-post-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"system/post\"\n  version = \"system-post-version\"\n  name = \"System Post Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\""
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-pre-buildpack/bin/build",
    "content": "#!/usr/bin/env bash\n\necho \"---> BUILD: System Pre buildpack\"\n\nset -o errexit\nset -o pipefail\n\nlayers_dir=$1\nplatform_dir=$2\n\n# Create a layer to verify it ran\nmkdir -p \"${layers_dir}/system-pre\"\ncat > \"${layers_dir}/system-pre.toml\" <<EOF\nlaunch = true\ncache = true\nEOF\n\necho \"System Pre Buildpack was here\" > \"${layers_dir}/system-pre/marker\"\n\nexit 0"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-pre-buildpack/bin/detect",
    "content": "#!/usr/bin/env bash\n\necho \"---> DETECT: System Pre buildpack\"\n\n# Always pass detection for testing\nexit 0"
  },
  {
    "path": "acceptance/testdata/mock_buildpacks/system-pre-buildpack/buildpack.toml",
    "content": "api = \"0.7\"\n\n[buildpack]\n  id = \"system/pre\"\n  version = \"system-pre-version\"\n  name = \"System Pre Buildpack\"\n\n[[stacks]]\n  id = \"pack.test.stack\""
  },
  {
    "path": "acceptance/testdata/mock_stack/create-stack.sh",
    "content": "#!/usr/bin/env bash\n\ndir=\"$(cd $(dirname $0) && pwd)\"\nos=$(docker info --format '{{json .}}' | jq -r .OSType)\n\ndocker build --tag pack-test/build \"$dir\"/$os/build\ndocker build --tag pack-test/run \"$dir\"/$os/run\n"
  },
  {
    "path": "acceptance/testdata/mock_stack/linux/build/Dockerfile",
    "content": "FROM ubuntu:bionic\n\nENV CNB_USER_ID=2222\nENV CNB_GROUP_ID=3333\n\nRUN \\\n  groupadd pack --gid 3333 && \\\n  useradd --uid 2222 --gid 3333 -m -s /bin/bash pack\n\nRUN apt-get update && apt-get -yq install netcat\nLABEL io.buildpacks.stack.id=pack.test.stack\nLABEL io.buildpacks.stack.mixins=\"[\\\"mixinA\\\", \\\"build:mixinTwo\\\", \\\"netcat\\\", \\\"mixin3\\\"]\"\n\nUSER pack\n"
  },
  {
    "path": "acceptance/testdata/mock_stack/linux/run/Dockerfile",
    "content": "FROM ubuntu:bionic\n\nENV CNB_USER_ID=2222\nENV CNB_GROUP_ID=3333\n\nRUN \\\n  groupadd pack --gid 3333 && \\\n  useradd --uid 2222 --gid 3333 -m -s /bin/bash pack\n\nRUN apt-get update && apt-get -yq install netcat\nLABEL io.buildpacks.stack.id=pack.test.stack\nLABEL io.buildpacks.stack.mixins=\"[\\\"mixinA\\\", \\\"netcat\\\", \\\"mixin3\\\"]\"\n\nUSER pack\n"
  },
  {
    "path": "acceptance/testdata/mock_stack/windows/build/Dockerfile",
    "content": "FROM mcr.microsoft.com/windows/nanoserver:1809\n\n# non-zero sets all user-owned directories to BUILTIN\\Users\nENV CNB_USER_ID=1\nENV CNB_GROUP_ID=1\n\nUSER ContainerAdministrator\n\nRUN net users /ADD pack /passwordreq:no /expires:never\n\nLABEL io.buildpacks.stack.id=pack.test.stack\nLABEL io.buildpacks.stack.mixins=\"[\\\"mixinA\\\", \\\"build:mixinTwo\\\", \\\"netcat\\\", \\\"mixin3\\\"]\"\n\nUSER pack\n"
  },
  {
    "path": "acceptance/testdata/mock_stack/windows/run/Dockerfile",
    "content": "FROM golang:1.17-nanoserver-1809 AS gobuild\n\n# bake in a simple server util\nCOPY server.go /util/server.go\nWORKDIR /util\nRUN go build server.go\n\nFROM mcr.microsoft.com/windows/nanoserver:1809\n\nCOPY --from=gobuild /util/server.exe /util/server.exe\n\n# non-zero sets all user-owned directories to BUILTIN\\Users\nENV CNB_USER_ID=1\nENV CNB_GROUP_ID=1\n\nUSER ContainerAdministrator\n\nRUN net users /ADD pack /passwordreq:no /expires:never\n\nLABEL io.buildpacks.stack.id=pack.test.stack\nLABEL io.buildpacks.stack.mixins=\"[\\\"mixinA\\\", \\\"netcat\\\", \\\"mixin3\\\"]\"\n\nUSER pack\n\n# launcher requires a non-empty PATH to workaround https://github.com/buildpacks/pack/issues/800\nENV PATH c:\\\\Windows\\\\system32;C:\\\\Windows\n"
  },
  {
    "path": "acceptance/testdata/mock_stack/windows/run/server.go",
    "content": "/*\n-p=\"8080\": port to expose\n-g:        file globs to read (comma-separated)\n*/\npackage main\n\nimport (\n\t\"flag\"\n\t\"log\"\n\t\"net/http\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n)\n\nfunc main() {\n\tport := flag.String(\"p\", \"8080\", \"port to expose\")\n\tglob := flag.String(\"g\", \"\", \"file globs to read\")\n\tflag.Parse()\n\n\tvar resp []string\n\n\tglobs := strings.Split(*glob, \",\")\n\tfor _, glob := range globs {\n\t\tpaths, err := filepath.Glob(strings.TrimSpace(glob))\n\t\tif err != nil {\n\t\t\tpanic(err.Error())\n\t\t}\n\n\t\tfor _, path := range paths {\n\t\t\tcontents, err := os.ReadFile(filepath.Clean(path))\n\t\t\tif err != nil {\n\t\t\t\tpanic(err.Error())\n\t\t\t}\n\n\t\t\tresp = append(resp, string(contents))\n\t\t}\n\t}\n\n\thttp.HandleFunc(\"/\", func(w http.ResponseWriter, r *http.Request) {\n\t\tw.Write([]byte(strings.Join(resp, \"\\n\")))\n\t})\n\n\tlog.Printf(\"Serving %s on HTTP port: %s\\n\", *glob, *port)\n\tlog.Fatal(http.ListenAndServe(\":\"+*port, nil))\n}\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/.gitattributes",
    "content": "*.txt text eol=lf\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/builder.toml",
    "content": "[[buildpacks]]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  uri = \"read-env-buildpack.tgz\"\n\n[[buildpacks]]\n  # intentionally missing id/version as they are optional\n  uri = \"noop-buildpack.tgz\"\n\n[[buildpacks]]\n  # noop-buildpack-2 has the same id but a different version compared to noop-buildpack\n  uri = \"noop-buildpack-2.tgz\"\n\n{{- if .package_image_name}}\n[[buildpacks]]\n  image = \"{{.package_image_name}}\"\n{{- end}}\n\n[[order]]\n{{- if .package_id}}\n[[order.group]]\n  id = \"{{.package_id}}\"\n  # intentionlly missing version to test support\n{{- end}}\n\n[[order.group]]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  optional = true\n\n[stack]\n  id = \"pack.test.stack\"\n  build-image = \"pack-test/build\"\n  run-image = \"pack-test/run\"\n  run-image-mirrors = [\"{{.run_image_mirror}}\"]\n\n[lifecycle]\n{{- if .lifecycle_uri}}\n  uri = \"{{.lifecycle_uri}}\"\n{{- end}}\n{{- if .lifecycle_version}}\n  version = \"{{.lifecycle_version}}\"\n{{- end}}\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/builder_extensions.toml",
    "content": "[[buildpacks]]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  uri = \"read-env-buildpack.tgz\"\n\n[[buildpacks]]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n  uri = \"simple-layers-buildpack\"\n\n[[extensions]]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  uri = \"read-env-extension.tgz\"\n\n[[extensions]]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n  uri = \"simple-layers-extension\"\n\n[[order]]\n[[order.group]]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  optional = true\n\n[[order.group]]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n  optional = true\n\n[[order-extensions]]\n[[order-extensions.group]]\n  id = \"read/env\"\n  version = \"read-env-version\"\n\n[[order-extensions.group]]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n\n[stack]\n  id = \"pack.test.stack\"\n  build-image = \"pack-test/build\"\n  run-image = \"pack-test/run\"\n  run-image-mirrors = [\"{{.run_image_mirror}}\"]\n\n[lifecycle]\n{{- if .lifecycle_uri}}\n  uri = \"{{.lifecycle_uri}}\"\n{{- end}}\n{{- if .lifecycle_version}}\n  version = \"{{.lifecycle_version}}\"\n{{- end}}\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/builder_multi_platform-no-targets.toml",
    "content": "[[buildpacks]]\nid = \"simple/layers\"\nversion = \"simple-layers-version\"\nuri = \"{{ .BuildpackURI }}\"\n\n[[order]]\n[[order.group]]\nid = \"simple/layers\"\nversion = \"simple-layers-version\"\n\n[build]\nimage = \"{{ .BuildImage }}\"\n\n[run]\n[[run.images]]\nimage = \"{{ .RunImage }}\"\n\n\n\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/builder_multi_platform.toml",
    "content": "[[buildpacks]]\nid = \"simple/layers\"\nversion = \"simple-layers-version\"\nuri = \"{{ .BuildpackURI }}\"\n\n[[order]]\n[[order.group]]\nid = \"simple/layers\"\nversion = \"simple-layers-version\"\n\n# Targets the buildpack will work with\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"windows\"\narch = \"amd64\"\n\n[build]\nimage = \"{{ .BuildImage }}\"\n\n[run]\n[[run.images]]\nimage = \"{{ .RunImage }}\"\n\n\n\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/builder_with_failing_system_buildpack.toml",
    "content": "[[buildpacks]]\n  id = \"simple-layers-buildpack\"\n  uri = \"file://{{.Fixtures}}/simple-layers-buildpack\"\n\n[[buildpacks]]\n  id = \"system/fail-detect\"\n  uri = \"file://{{.Fixtures}}/system-fail-detect\"\n\n[[buildpacks]]\n  id = \"system/post\"\n  uri = \"file://{{.Fixtures}}/system-post-buildpack\"\n\n# System buildpacks configuration\n[system]\n[system.pre]\nbuildpacks = [\n  { id = \"system/fail-detect\", version = \"system-fail-detect-version\", optional = false }\n]\n\n[system.post]\nbuildpacks = [\n  { id = \"system/post\", version = \"system-post-version\", optional = true }\n]\n\n[[order]]\n[[order.group]]\n  id = \"simple-layers-buildpack\"\n  version = \"simple-layers-buildpack-version\"\n\n[stack]\n  id = \"pack.test.stack\"\n  build-image = \"pack-test/build\"\n  run-image = \"pack-test/run\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/builder_with_optional_failing_system_buildpack.toml",
    "content": "[[buildpacks]]\n  id = \"simple-layers-buildpack\"\n  uri = \"file://{{.Fixtures}}/simple-layers-buildpack\"\n\n[[buildpacks]]\n  id = \"system/fail-detect\"\n  uri = \"file://{{.Fixtures}}/system-fail-detect\"\n\n[[buildpacks]]\n  id = \"system/pre\"\n  uri = \"file://{{.Fixtures}}/system-pre-buildpack\"\n\n# System buildpacks configuration\n[system]\n[system.pre]\nbuildpacks = [\n  { id = \"system/fail-detect\", version = \"system-fail-detect-version\", optional = true },\n  { id = \"system/pre\", version = \"system-pre-version\", optional = false }\n]\n\n[[order]]\n[[order.group]]\n  id = \"simple-layers-buildpack\"\n  version = \"simple-layers-buildpack-version\"\n\n[stack]\n  id = \"pack.test.stack\"\n  build-image = \"pack-test/build\"\n  run-image = \"pack-test/run\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/builder_with_system_buildpacks.toml",
    "content": "[[buildpacks]]\n  id = \"simple-layers-buildpack\"\n  uri = \"file://{{.Fixtures}}/simple-layers-buildpack\"\n\n[[buildpacks]]\n  id = \"system/pre\"\n  uri = \"file://{{.Fixtures}}/system-pre-buildpack\"\n\n[[buildpacks]]\n  id = \"system/post\"\n  uri = \"file://{{.Fixtures}}/system-post-buildpack\"\n\n# System buildpacks configuration\n[system]\n[system.pre]\nbuildpacks = [\n  { id = \"system/pre\", version = \"system-pre-version\", optional = false }\n]\n\n[system.post]\nbuildpacks = [\n  { id = \"system/post\", version = \"system-post-version\", optional = true }\n]\n\n[[order]]\n[[order.group]]\n  id = \"simple-layers-buildpack\"\n  version = \"simple-layers-buildpack-version\"\n\n[stack]\n  id = \"pack.test.stack\"\n  build-image = \"pack-test/build\"\n  run-image = \"pack-test/run\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_0.20.0_builder_nested_depth_2_output.txt",
    "content": "Inspecting builder: '{{.builder_name}}'\n\nREMOTE:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                           NAME                      VERSION                             HOMEPAGE\n  noop.buildpack               NOOP Buildpack            noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack               NOOP Buildpack            noop.buildpack.version              -\n  read/env                     Read Env Buildpack        read-env-version                    -\n  simple/layers                -                         simple-layers-version               -\n  simple/nested-level-1        -                         nested-l1-version                   -\n  simple/nested-level-2        -                         nested-l2-version                   -\n\nDetection Order:\n └ Group #1:\n    ├ simple/nested-level-1\n    │  └ Group #1:\n    │     └ simple/nested-level-2@nested-l2-version\n    └ read/env@read-env-version                        (optional)\n\nLOCAL:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ 
.deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                           NAME                      VERSION                             HOMEPAGE\n  noop.buildpack               NOOP Buildpack            noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack               NOOP Buildpack            noop.buildpack.version              -\n  read/env                     Read Env Buildpack        read-env-version                    -\n  simple/layers                -                         simple-layers-version               -\n  simple/nested-level-1        -                         nested-l1-version                   -\n  simple/nested-level-2        -                         nested-l2-version                   -\n\nDetection Order:\n └ Group #1:\n    ├ simple/nested-level-1\n    │  └ Group #1:\n    │     └ simple/nested-level-2@nested-l2-version\n    └ read/env@read-env-version                        (optional)\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_0.20.0_builder_nested_output.txt",
    "content": "Inspecting builder: '{{.builder_name}}'\n\nREMOTE:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                           NAME                      VERSION                             HOMEPAGE\n  noop.buildpack               NOOP Buildpack            noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack               NOOP Buildpack            noop.buildpack.version              -\n  read/env                     Read Env Buildpack        read-env-version                    -\n  simple/layers                -                         simple-layers-version               -\n  simple/nested-level-1        -                         nested-l1-version                   -\n  simple/nested-level-2        -                         nested-l2-version                   -\n\nDetection Order:\n └ Group #1:\n    ├ simple/nested-level-1\n    │  └ Group #1:\n    │     └ simple/nested-level-2@nested-l2-version\n    │        └ Group #1:\n    │           └ simple/layers@simple-layers-version\n    └ read/env@read-env-version                          (optional)\n\nLOCAL:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ 
.supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                           NAME                      VERSION                             HOMEPAGE\n  noop.buildpack               NOOP Buildpack            noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack               NOOP Buildpack            noop.buildpack.version              -\n  read/env                     Read Env Buildpack        read-env-version                    -\n  simple/layers                -                         simple-layers-version               -\n  simple/nested-level-1        -                         nested-l1-version                   -\n  simple/nested-level-2        -                         nested-l2-version                   -\n\nDetection Order:\n └ Group #1:\n    ├ simple/nested-level-1\n    │  └ Group #1:\n    │     └ simple/nested-level-2@nested-l2-version\n    │        └ Group #1:\n    │           └ simple/layers@simple-layers-version\n    └ read/env@read-env-version                          (optional)\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_0.20.0_builder_nested_output_json.txt",
    "content": "{\n  \"builder_name\": \"{{.builder_name}}\",\n  \"trusted\": false,\n  \"default\": false,\n  \"remote_info\": {\n    \"created_by\": {\n      \"name\": \"Pack CLI\",\n      \"version\": \"{{.pack_version}}\"\n    },\n    \"stack\": {\n      \"id\": \"pack.test.stack\",\n      \"mixins\": [\n        \"mixinA\",\n        \"netcat\",\n        \"mixin3\",\n        \"build:mixinTwo\"\n      ]\n    },\n    \"lifecycle\": {\n      \"version\": \"{{.lifecycle_version}}\",\n      \"buildpack_apis\": {\n        \"deprecated\": {{.deprecated_buildpack_apis}},\n        \"supported\": {{.supported_buildpack_apis}}\n      },\n      \"platform_apis\": {\n        \"deprecated\": {{.deprecated_platform_apis}},\n        \"supported\": {{.supported_platform_apis}}\n      }\n    },\n    \"run_images\": [\n      {\n        \"name\": \"some-registry.com/pack-test/run1\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"pack-test/run\"\n      },\n      {\n        \"name\": \"{{.run_image_mirror}}\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"id\": \"noop.buildpack\",\n        \"name\": \"NOOP Buildpack\",\n        \"version\": \"noop.buildpack.later-version\",\n        \"homepage\": \"http://geocities.com/cool-bp\"\n      },\n      {\n        \"id\": \"noop.buildpack\",\n        \"name\": \"NOOP Buildpack\",\n        \"version\": \"noop.buildpack.version\"\n      },\n      {\n        \"id\": \"read/env\",\n        \"name\": \"Read Env Buildpack\",\n        \"version\": \"read-env-version\"\n      },\n      {\n        \"id\": \"simple/layers\",\n        \"version\": \"simple-layers-version\"\n      },\n      {\n        \"id\": \"simple/nested-level-1\",\n        \"version\": \"nested-l1-version\"\n      },\n      {\n        \"id\": \"simple/nested-level-2\",\n        \"version\": \"nested-l2-version\"\n      }\n    ],\n    \"detection_order\": [\n      {\n        \"buildpacks\": [\n          {\n            \"id\": 
\"simple/nested-level-1\",\n            \"buildpacks\": [\n              {\n                \"id\": \"simple/nested-level-2\",\n                \"version\": \"nested-l2-version\",\n                \"buildpacks\": [\n                  {\n                    \"id\": \"simple/layers\",\n                    \"version\": \"simple-layers-version\"\n                  }\n                ]\n              }\n            ]\n          },\n          {\n            \"id\": \"read/env\",\n            \"version\": \"read-env-version\",\n            \"optional\": true\n          }\n        ]\n      }\n    ]\n  },\n  \"local_info\": {\n    \"created_by\": {\n      \"name\": \"Pack CLI\",\n      \"version\": \"{{.pack_version}}\"\n    },\n    \"stack\": {\n      \"id\": \"pack.test.stack\",\n      \"mixins\": [\n        \"mixinA\",\n        \"netcat\",\n        \"mixin3\",\n        \"build:mixinTwo\"\n      ]\n    },\n    \"lifecycle\": {\n      \"version\": \"{{.lifecycle_version}}\",\n      \"buildpack_apis\": {\n        \"deprecated\": {{.deprecated_buildpack_apis}},\n        \"supported\": {{.supported_buildpack_apis}}\n      },\n      \"platform_apis\": {\n        \"deprecated\": {{.deprecated_platform_apis}},\n        \"supported\": {{.supported_platform_apis}}\n      }\n    },\n    \"run_images\": [\n      {\n        \"name\": \"some-registry.com/pack-test/run1\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"pack-test/run\"\n      },\n      {\n        \"name\": \"{{.run_image_mirror}}\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"id\": \"noop.buildpack\",\n        \"name\": \"NOOP Buildpack\",\n        \"version\": \"noop.buildpack.later-version\",\n        \"homepage\": \"http://geocities.com/cool-bp\"\n      },\n      {\n        \"id\": \"noop.buildpack\",\n        \"name\": \"NOOP Buildpack\",\n        \"version\": \"noop.buildpack.version\"\n      },\n      {\n        \"id\": \"read/env\",\n        \"name\": \"Read Env 
Buildpack\",\n        \"version\": \"read-env-version\"\n      },\n      {\n        \"id\": \"simple/layers\",\n        \"version\": \"simple-layers-version\"\n      },\n      {\n        \"id\": \"simple/nested-level-1\",\n        \"version\": \"nested-l1-version\"\n      },\n      {\n        \"id\": \"simple/nested-level-2\",\n        \"version\": \"nested-l2-version\"\n      }\n    ],\n    \"detection_order\": [\n      {\n        \"buildpacks\": [\n          {\n            \"id\": \"simple/nested-level-1\",\n            \"buildpacks\": [\n              {\n                \"id\": \"simple/nested-level-2\",\n                \"version\": \"nested-l2-version\",\n                \"buildpacks\": [\n                  {\n                    \"id\": \"simple/layers\",\n                    \"version\": \"simple-layers-version\"\n                  }\n                ]\n              }\n            ]\n          },\n          {\n            \"id\": \"read/env\",\n            \"version\": \"read-env-version\",\n            \"optional\": true\n          }\n        ]\n      }\n    ]\n  }\n}\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_0.20.0_builder_nested_output_toml.txt",
    "content": "builder_name = \"{{.builder_name}}\"\ntrusted = false\ndefault = false\n\n[remote_info]\n\n  [remote_info.created_by]\n    Name = \"Pack CLI\"\n    Version = \"{{.pack_version}}\"\n\n  [remote_info.stack]\n    id = \"pack.test.stack\"\n    mixins = [\"mixinA\", \"netcat\", \"mixin3\", \"build:mixinTwo\"]\n\n  [remote_info.lifecycle]\n    version = \"{{.lifecycle_version}}\"\n\n    [remote_info.lifecycle.buildpack_apis]\n      deprecated = {{.deprecated_buildpack_apis}}\n      supported = {{.supported_buildpack_apis}}\n\n    [remote_info.lifecycle.platform_apis]\n      deprecated = {{.deprecated_platform_apis}}\n      supported = {{.supported_platform_apis}}\n\n  [[remote_info.run_images]]\n    name = \"some-registry.com/pack-test/run1\"\n    user_configured = true\n\n  [[remote_info.run_images]]\n    name = \"pack-test/run\"\n\n  [[remote_info.run_images]]\n    name = \"{{.run_image_mirror}}\"\n\n  [[remote_info.buildpacks]]\n    id = \"noop.buildpack\"\n    name = \"NOOP Buildpack\"\n    version = \"noop.buildpack.later-version\"\n    homepage = \"http://geocities.com/cool-bp\"\n\n  [[remote_info.buildpacks]]\n    id = \"noop.buildpack\"\n    name = \"NOOP Buildpack\"\n    version = \"noop.buildpack.version\"\n\n  [[remote_info.buildpacks]]\n    id = \"read/env\"\n    name = \"Read Env Buildpack\"\n    version = \"read-env-version\"\n\n  [[remote_info.buildpacks]]\n    id = \"simple/layers\"\n    version = \"simple-layers-version\"\n\n  [[remote_info.buildpacks]]\n    id = \"simple/nested-level-1\"\n    version = \"nested-l1-version\"\n\n  [[remote_info.buildpacks]]\n    id = \"simple/nested-level-2\"\n    version = \"nested-l2-version\"\n\n  [[remote_info.detection_order]]\n\n    [[remote_info.detection_order.buildpacks]]\n      id = \"simple/nested-level-1\"\n\n      [[remote_info.detection_order.buildpacks.buildpacks]]\n        id = \"simple/nested-level-2\"\n        version = \"nested-l2-version\"\n\n        
[[remote_info.detection_order.buildpacks.buildpacks.buildpacks]]\n          id = \"simple/layers\"\n          version = \"simple-layers-version\"\n\n    [[remote_info.detection_order.buildpacks]]\n      id = \"read/env\"\n      version = \"read-env-version\"\n      optional = true\n\n[local_info]\n\n  [local_info.created_by]\n    Name = \"Pack CLI\"\n    Version = \"{{.pack_version}}\"\n\n  [local_info.stack]\n    id = \"pack.test.stack\"\n    mixins = [\"mixinA\", \"netcat\", \"mixin3\", \"build:mixinTwo\"]\n\n  [local_info.lifecycle]\n    version = \"{{.lifecycle_version}}\"\n\n    [local_info.lifecycle.buildpack_apis]\n      deprecated = {{.deprecated_buildpack_apis}}\n      supported = {{.supported_buildpack_apis}}\n\n    [local_info.lifecycle.platform_apis]\n      deprecated = {{.deprecated_platform_apis}}\n      supported = {{.supported_platform_apis}}\n\n  [[local_info.run_images]]\n    name = \"some-registry.com/pack-test/run1\"\n    user_configured = true\n\n  [[local_info.run_images]]\n    name = \"pack-test/run\"\n\n  [[local_info.run_images]]\n    name = \"{{.run_image_mirror}}\"\n\n  [[local_info.buildpacks]]\n    id = \"noop.buildpack\"\n    name = \"NOOP Buildpack\"\n    version = \"noop.buildpack.later-version\"\n    homepage = \"http://geocities.com/cool-bp\"\n\n  [[local_info.buildpacks]]\n    id = \"noop.buildpack\"\n    name = \"NOOP Buildpack\"\n    version = \"noop.buildpack.version\"\n\n  [[local_info.buildpacks]]\n    id = \"read/env\"\n    name = \"Read Env Buildpack\"\n    version = \"read-env-version\"\n\n  [[local_info.buildpacks]]\n    id = \"simple/layers\"\n    version = \"simple-layers-version\"\n\n  [[local_info.buildpacks]]\n    id = \"simple/nested-level-1\"\n    version = \"nested-l1-version\"\n\n  [[local_info.buildpacks]]\n    id = \"simple/nested-level-2\"\n    version = \"nested-l2-version\"\n\n  [[local_info.detection_order]]\n\n    [[local_info.detection_order.buildpacks]]\n      id = \"simple/nested-level-1\"\n\n      
[[local_info.detection_order.buildpacks.buildpacks]]\n        id = \"simple/nested-level-2\"\n        version = \"nested-l2-version\"\n\n        [[local_info.detection_order.buildpacks.buildpacks.buildpacks]]\n          id = \"simple/layers\"\n          version = \"simple-layers-version\"\n\n    [[local_info.detection_order.buildpacks]]\n      id = \"read/env\"\n      version = \"read-env-version\"\n      optional = true\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_0.20.0_builder_nested_output_yaml.txt",
    "content": "sharedbuilderinfo:\n    builder_name: {{.builder_name}}\n    trusted: false\n    default: false\nremote_info:\n    created_by:\n        name: Pack CLI\n        version: {{.pack_version}}\n    stack:\n        id: pack.test.stack\n        mixins:\n            - mixinA\n            - netcat\n            - mixin3\n            - build:mixinTwo\n    lifecycle:\n        version: {{.lifecycle_version}}\n        buildpack_apis:\n            deprecated: {{.deprecated_buildpack_apis}}\n            supported: {{.supported_buildpack_apis}}\n        platform_apis:\n            deprecated: {{.deprecated_platform_apis}}\n            supported: {{.supported_platform_apis}}\n    run_images:\n        - name: some-registry.com/pack-test/run1\n          user_configured: true\n        - name: pack-test/run\n        - name: {{.run_image_mirror}}\n    buildpacks:\n        - id: noop.buildpack\n          name: NOOP Buildpack\n          version: noop.buildpack.later-version\n          homepage: http://geocities.com/cool-bp\n        - id: noop.buildpack\n          name: NOOP Buildpack\n          version: noop.buildpack.version\n        - id: read/env\n          name: Read Env Buildpack\n          version: read-env-version\n        - id: simple/layers\n          version: simple-layers-version\n        - id: simple/nested-level-1\n          version: nested-l1-version\n        - id: simple/nested-level-2\n          version: nested-l2-version\n    detection_order:\n        - buildpacks:\n            - id: simple/nested-level-1\n              buildpacks:\n                - id: simple/nested-level-2\n                  version: nested-l2-version\n                  buildpacks:\n                    - id: simple/layers\n                      version: simple-layers-version\n            - id: read/env\n              version: read-env-version\n              optional: true\nlocal_info:\n    created_by:\n        name: Pack CLI\n        version: {{.pack_version}}\n    stack:\n        id: 
pack.test.stack\n        mixins:\n            - mixinA\n            - netcat\n            - mixin3\n            - build:mixinTwo\n    lifecycle:\n        version: {{.lifecycle_version}}\n        buildpack_apis:\n            deprecated: {{.deprecated_buildpack_apis}}\n            supported: {{.supported_buildpack_apis}}\n        platform_apis:\n            deprecated: {{.deprecated_platform_apis}}\n            supported: {{.supported_platform_apis}}\n    run_images:\n        - name: some-registry.com/pack-test/run1\n          user_configured: true\n        - name: pack-test/run\n        - name: {{.run_image_mirror}}\n    buildpacks:\n        - id: noop.buildpack\n          name: NOOP Buildpack\n          version: noop.buildpack.later-version\n          homepage: http://geocities.com/cool-bp\n        - id: noop.buildpack\n          name: NOOP Buildpack\n          version: noop.buildpack.version\n        - id: read/env\n          name: Read Env Buildpack\n          version: read-env-version\n        - id: simple/layers\n          version: simple-layers-version\n        - id: simple/nested-level-1\n          version: nested-l1-version\n        - id: simple/nested-level-2\n          version: nested-l2-version\n    detection_order:\n        - buildpacks:\n            - id: simple/nested-level-1\n              buildpacks:\n                - id: simple/nested-level-2\n                  version: nested-l2-version\n                  buildpacks:\n                    - id: simple/layers\n                      version: simple-layers-version\n            - id: read/env\n              version: read-env-version\n              optional: true\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_0.20.0_builder_output.txt",
    "content": "Inspecting builder: '{{.builder_name}}'\n\nREMOTE:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                    NAME                      VERSION                             HOMEPAGE\n  noop.buildpack        NOOP Buildpack            noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack        NOOP Buildpack            noop.buildpack.version              -\n  read/env              Read Env Buildpack        read-env-version                    -\n  simple/layers         -                         simple-layers-version               -\n\nDetection Order:\n └ Group #1:\n    ├ simple/layers\n    └ read/env@read-env-version    (optional)\n\nLOCAL:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                    NAME                      VERSION                             HOMEPAGE\n  noop.buildpack        NOOP Buildpack            
noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack        NOOP Buildpack            noop.buildpack.version              -\n  read/env              Read Env Buildpack        read-env-version                    -\n  simple/layers         -                         simple-layers-version               -\n\nDetection Order:\n └ Group #1:\n    ├ simple/layers\n    └ read/env@read-env-version    (optional)\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_X.Y.Z_builder_output.txt",
    "content": "Files like these represent the expected output of calling `pack inspect-builder` on a builder created with pack vX.Y.Z."
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_builder_nested_depth_2_output.txt",
    "content": "Inspecting builder: '{{.builder_name}}'\n\nREMOTE:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                           NAME                              VERSION                             HOMEPAGE\n  noop.buildpack               NOOP Buildpack                    noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack               NOOP Buildpack                    noop.buildpack.version              -\n  read/env                     Read Env Buildpack                read-env-version                    -\n  simple/layers                Simple Layers Buildpack           simple-layers-version               -\n  simple/nested-level-1        Nested Level One Buildpack        nested-l1-version                   -\n  simple/nested-level-2        Nested Level Two Buildpack        nested-l2-version                   -\n\nDetection Order:\n └ Group #1:\n    ├ simple/nested-level-1\n    │  └ Group #1:\n    │     └ simple/nested-level-2@nested-l2-version\n    └ read/env@read-env-version                        (optional)\n\nLOCAL:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  
Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                           NAME                              VERSION                             HOMEPAGE\n  noop.buildpack               NOOP Buildpack                    noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack               NOOP Buildpack                    noop.buildpack.version              -\n  read/env                     Read Env Buildpack                read-env-version                    -\n  simple/layers                Simple Layers Buildpack           simple-layers-version               -\n  simple/nested-level-1        Nested Level One Buildpack        nested-l1-version                   -\n  simple/nested-level-2        Nested Level Two Buildpack        nested-l2-version                   -\n\nDetection Order:\n └ Group #1:\n    ├ simple/nested-level-1\n    │  └ Group #1:\n    │     └ simple/nested-level-2@nested-l2-version\n    └ read/env@read-env-version                        (optional)\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_builder_nested_output.txt",
    "content": "Inspecting builder: '{{.builder_name}}'\n\nREMOTE:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                           NAME                              VERSION                             HOMEPAGE\n  noop.buildpack               NOOP Buildpack                    noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack               NOOP Buildpack                    noop.buildpack.version              -\n  read/env                     Read Env Buildpack                read-env-version                    -\n  simple/layers                Simple Layers Buildpack           simple-layers-version               -\n  simple/nested-level-1        Nested Level One Buildpack        nested-l1-version                   -\n  simple/nested-level-2        Nested Level Two Buildpack        nested-l2-version                   -\n\nDetection Order:\n └ Group #1:\n    ├ simple/nested-level-1\n    │  └ Group #1:\n    │     └ simple/nested-level-2@nested-l2-version\n    │        └ Group #1:\n    │           └ simple/layers@simple-layers-version\n    └ read/env@read-env-version                          (optional)\n\nLOCAL:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ 
.deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                           NAME                              VERSION                             HOMEPAGE\n  noop.buildpack               NOOP Buildpack                    noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack               NOOP Buildpack                    noop.buildpack.version              -\n  read/env                     Read Env Buildpack                read-env-version                    -\n  simple/layers                Simple Layers Buildpack           simple-layers-version               -\n  simple/nested-level-1        Nested Level One Buildpack        nested-l1-version                   -\n  simple/nested-level-2        Nested Level Two Buildpack        nested-l2-version                   -\n\nDetection Order:\n └ Group #1:\n    ├ simple/nested-level-1\n    │  └ Group #1:\n    │     └ simple/nested-level-2@nested-l2-version\n    │        └ Group #1:\n    │           └ simple/layers@simple-layers-version\n    └ read/env@read-env-version                          (optional)\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_builder_nested_output_json.txt",
    "content": "{\n  \"builder_name\": \"{{.builder_name}}\",\n  \"trusted\": false,\n  \"default\": false,\n  \"remote_info\": {\n    \"created_by\": {\n      \"name\": \"Pack CLI\",\n      \"version\": \"{{.pack_version}}\"\n    },\n    \"stack\": {\n      \"id\": \"pack.test.stack\",\n      \"mixins\": [\n        \"mixinA\",\n        \"netcat\",\n        \"mixin3\",\n        \"build:mixinTwo\"\n      ]\n    },\n    \"lifecycle\": {\n      \"version\": \"{{.lifecycle_version}}\",\n      \"buildpack_apis\": {\n        \"deprecated\": {{.deprecated_buildpack_apis}},\n        \"supported\": {{.supported_buildpack_apis}}\n      },\n      \"platform_apis\": {\n        \"deprecated\": {{.deprecated_platform_apis}},\n        \"supported\": {{.supported_platform_apis}}\n      }\n    },\n    \"run_images\": [\n      {\n        \"name\": \"some-registry.com/pack-test/run1\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"pack-test/run\"\n      },\n      {\n        \"name\": \"{{.run_image_mirror}}\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"id\": \"noop.buildpack\",\n        \"name\": \"NOOP Buildpack\",\n        \"version\": \"noop.buildpack.later-version\",\n        \"homepage\": \"http://geocities.com/cool-bp\"\n      },\n      {\n        \"id\": \"noop.buildpack\",\n        \"name\": \"NOOP Buildpack\",\n        \"version\": \"noop.buildpack.version\"\n      },\n      {\n        \"id\": \"read/env\",\n        \"name\": \"Read Env Buildpack\",\n        \"version\": \"read-env-version\"\n      },\n      {\n        \"id\": \"simple/layers\",\n        \"name\": \"Simple Layers Buildpack\",\n        \"version\": \"simple-layers-version\"\n      },\n      {\n        \"id\": \"simple/nested-level-1\",\n        \"name\": \"Nested Level One Buildpack\",\n        \"version\": \"nested-l1-version\"\n      },\n      {\n        \"id\": \"simple/nested-level-2\",\n        \"name\": \"Nested Level Two Buildpack\",\n        \"version\": 
\"nested-l2-version\"\n      }\n    ],\n    \"detection_order\": [\n      {\n        \"buildpacks\": [\n          {\n            \"id\": \"simple/nested-level-1\",\n            \"buildpacks\": [\n              {\n                \"id\": \"simple/nested-level-2\",\n                \"version\": \"nested-l2-version\",\n                \"buildpacks\": [\n                  {\n                    \"id\": \"simple/layers\",\n                    \"version\": \"simple-layers-version\"\n                  }\n                ]\n              }\n            ]\n          },\n          {\n            \"id\": \"read/env\",\n            \"version\": \"read-env-version\",\n            \"optional\": true\n          }\n        ]\n      }\n    ]\n  },\n  \"local_info\": {\n    \"created_by\": {\n      \"name\": \"Pack CLI\",\n      \"version\": \"{{.pack_version}}\"\n    },\n    \"stack\": {\n      \"id\": \"pack.test.stack\",\n      \"mixins\": [\n        \"mixinA\",\n        \"netcat\",\n        \"mixin3\",\n        \"build:mixinTwo\"\n      ]\n    },\n    \"lifecycle\": {\n      \"version\": \"{{.lifecycle_version}}\",\n      \"buildpack_apis\": {\n        \"deprecated\": {{.deprecated_buildpack_apis}},\n        \"supported\": {{.supported_buildpack_apis}}\n      },\n      \"platform_apis\": {\n        \"deprecated\": {{.deprecated_platform_apis}},\n        \"supported\": {{.supported_platform_apis}}\n      }\n    },\n    \"run_images\": [\n      {\n        \"name\": \"some-registry.com/pack-test/run1\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"pack-test/run\"\n      },\n      {\n        \"name\": \"{{.run_image_mirror}}\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"id\": \"noop.buildpack\",\n        \"name\": \"NOOP Buildpack\",\n        \"version\": \"noop.buildpack.later-version\",\n        \"homepage\": \"http://geocities.com/cool-bp\"\n      },\n      {\n        \"id\": \"noop.buildpack\",\n        \"name\": \"NOOP 
Buildpack\",\n        \"version\": \"noop.buildpack.version\"\n      },\n      {\n        \"id\": \"read/env\",\n        \"name\": \"Read Env Buildpack\",\n        \"version\": \"read-env-version\"\n      },\n      {\n        \"id\": \"simple/layers\",\n        \"name\": \"Simple Layers Buildpack\",\n        \"version\": \"simple-layers-version\"\n      },\n      {\n        \"id\": \"simple/nested-level-1\",\n        \"name\": \"Nested Level One Buildpack\",\n        \"version\": \"nested-l1-version\"\n      },\n      {\n        \"id\": \"simple/nested-level-2\",\n        \"name\": \"Nested Level Two Buildpack\",\n        \"version\": \"nested-l2-version\"\n      }\n    ],\n    \"detection_order\": [\n      {\n        \"buildpacks\": [\n          {\n            \"id\": \"simple/nested-level-1\",\n            \"buildpacks\": [\n              {\n                \"id\": \"simple/nested-level-2\",\n                \"version\": \"nested-l2-version\",\n                \"buildpacks\": [\n                  {\n                    \"id\": \"simple/layers\",\n                    \"version\": \"simple-layers-version\"\n                  }\n                ]\n              }\n            ]\n          },\n          {\n            \"id\": \"read/env\",\n            \"version\": \"read-env-version\",\n            \"optional\": true\n          }\n        ]\n      }\n    ]\n  }\n}\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_builder_nested_output_toml.txt",
    "content": "builder_name = \"{{.builder_name}}\"\ntrusted = false\ndefault = false\n\n[remote_info]\n\n  [remote_info.created_by]\n    Name = \"Pack CLI\"\n    Version = \"{{.pack_version}}\"\n\n  [remote_info.stack]\n    id = \"pack.test.stack\"\n    mixins = [\"mixinA\", \"netcat\", \"mixin3\", \"build:mixinTwo\"]\n\n  [remote_info.lifecycle]\n    version = \"{{.lifecycle_version}}\"\n\n    [remote_info.lifecycle.buildpack_apis]\n      deprecated = {{.deprecated_buildpack_apis}}\n      supported = {{.supported_buildpack_apis}}\n\n    [remote_info.lifecycle.platform_apis]\n      deprecated = {{.deprecated_platform_apis}}\n      supported = {{.supported_platform_apis}}\n\n  [[remote_info.run_images]]\n    name = \"some-registry.com/pack-test/run1\"\n    user_configured = true\n\n  [[remote_info.run_images]]\n    name = \"pack-test/run\"\n\n  [[remote_info.run_images]]\n    name = \"{{.run_image_mirror}}\"\n\n  [[remote_info.buildpacks]]\n    id = \"noop.buildpack\"\n    name = \"NOOP Buildpack\"\n    version = \"noop.buildpack.later-version\"\n    homepage = \"http://geocities.com/cool-bp\"\n\n  [[remote_info.buildpacks]]\n    id = \"noop.buildpack\"\n    name = \"NOOP Buildpack\"\n    version = \"noop.buildpack.version\"\n\n  [[remote_info.buildpacks]]\n    id = \"read/env\"\n    name = \"Read Env Buildpack\"\n    version = \"read-env-version\"\n\n  [[remote_info.buildpacks]]\n    id = \"simple/layers\"\n    name = \"Simple Layers Buildpack\"\n    version = \"simple-layers-version\"\n\n  [[remote_info.buildpacks]]\n    id = \"simple/nested-level-1\"\n    name = \"Nested Level One Buildpack\"\n    version = \"nested-l1-version\"\n\n  [[remote_info.buildpacks]]\n    id = \"simple/nested-level-2\"\n    name = \"Nested Level Two Buildpack\"\n    version = \"nested-l2-version\"\n\n  [[remote_info.detection_order]]\n\n    [[remote_info.detection_order.buildpacks]]\n      id = \"simple/nested-level-1\"\n\n      [[remote_info.detection_order.buildpacks.buildpacks]]\n  
      id = \"simple/nested-level-2\"\n        version = \"nested-l2-version\"\n\n        [[remote_info.detection_order.buildpacks.buildpacks.buildpacks]]\n          id = \"simple/layers\"\n          version = \"simple-layers-version\"\n\n    [[remote_info.detection_order.buildpacks]]\n      id = \"read/env\"\n      version = \"read-env-version\"\n      optional = true\n\n[local_info]\n\n  [local_info.created_by]\n    Name = \"Pack CLI\"\n    Version = \"{{.pack_version}}\"\n\n  [local_info.stack]\n    id = \"pack.test.stack\"\n    mixins = [\"mixinA\", \"netcat\", \"mixin3\", \"build:mixinTwo\"]\n\n  [local_info.lifecycle]\n    version = \"{{.lifecycle_version}}\"\n\n    [local_info.lifecycle.buildpack_apis]\n      deprecated = {{.deprecated_buildpack_apis}}\n      supported = {{.supported_buildpack_apis}}\n\n    [local_info.lifecycle.platform_apis]\n      deprecated = {{.deprecated_platform_apis}}\n      supported = {{.supported_platform_apis}}\n\n  [[local_info.run_images]]\n    name = \"some-registry.com/pack-test/run1\"\n    user_configured = true\n\n  [[local_info.run_images]]\n    name = \"pack-test/run\"\n\n  [[local_info.run_images]]\n    name = \"{{.run_image_mirror}}\"\n\n  [[local_info.buildpacks]]\n    id = \"noop.buildpack\"\n    name = \"NOOP Buildpack\"\n    version = \"noop.buildpack.later-version\"\n    homepage = \"http://geocities.com/cool-bp\"\n\n  [[local_info.buildpacks]]\n    id = \"noop.buildpack\"\n    name = \"NOOP Buildpack\"\n    version = \"noop.buildpack.version\"\n\n  [[local_info.buildpacks]]\n    id = \"read/env\"\n    name = \"Read Env Buildpack\"\n    version = \"read-env-version\"\n\n  [[local_info.buildpacks]]\n    id = \"simple/layers\"\n    name = \"Simple Layers Buildpack\"\n    version = \"simple-layers-version\"\n\n  [[local_info.buildpacks]]\n    id = \"simple/nested-level-1\"\n    name = \"Nested Level One Buildpack\"\n    version = \"nested-l1-version\"\n\n  [[local_info.buildpacks]]\n    id = \"simple/nested-level-2\"\n 
   name = \"Nested Level Two Buildpack\"\n    version = \"nested-l2-version\"\n\n  [[local_info.detection_order]]\n\n    [[local_info.detection_order.buildpacks]]\n      id = \"simple/nested-level-1\"\n\n      [[local_info.detection_order.buildpacks.buildpacks]]\n        id = \"simple/nested-level-2\"\n        version = \"nested-l2-version\"\n\n        [[local_info.detection_order.buildpacks.buildpacks.buildpacks]]\n          id = \"simple/layers\"\n          version = \"simple-layers-version\"\n\n    [[local_info.detection_order.buildpacks]]\n      id = \"read/env\"\n      version = \"read-env-version\"\n      optional = true\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_builder_nested_output_yaml.txt",
    "content": "sharedbuilderinfo:\n    builder_name: {{.builder_name}}\n    trusted: false\n    default: false\nremote_info:\n    created_by:\n        name: Pack CLI\n        version: {{.pack_version}}\n    stack:\n        id: pack.test.stack\n        mixins:\n            - mixinA\n            - netcat\n            - mixin3\n            - build:mixinTwo\n    lifecycle:\n        version: {{.lifecycle_version}}\n        buildpack_apis:\n            deprecated: {{.deprecated_buildpack_apis}}\n            supported: {{.supported_buildpack_apis}}\n        platform_apis:\n            deprecated: {{.deprecated_platform_apis}}\n            supported: {{.supported_platform_apis}}\n    run_images:\n        - name: some-registry.com/pack-test/run1\n          user_configured: true\n        - name: pack-test/run\n        - name: {{.run_image_mirror}}\n    buildpacks:\n        - id: noop.buildpack\n          name: NOOP Buildpack\n          version: noop.buildpack.later-version\n          homepage: http://geocities.com/cool-bp\n        - id: noop.buildpack\n          name: NOOP Buildpack\n          version: noop.buildpack.version\n        - id: read/env\n          name: Read Env Buildpack\n          version: read-env-version\n        - id: simple/layers\n          name: Simple Layers Buildpack\n          version: simple-layers-version\n        - id: simple/nested-level-1\n          name: Nested Level One Buildpack\n          version: nested-l1-version\n        - id: simple/nested-level-2\n          name: Nested Level Two Buildpack\n          version: nested-l2-version\n    detection_order:\n        - buildpacks:\n            - id: simple/nested-level-1\n              buildpacks:\n                - id: simple/nested-level-2\n                  version: nested-l2-version\n                  buildpacks:\n                    - id: simple/layers\n                      version: simple-layers-version\n            - id: read/env\n              version: read-env-version\n              
optional: true\nlocal_info:\n    created_by:\n        name: Pack CLI\n        version: {{.pack_version}}\n    stack:\n        id: pack.test.stack\n        mixins:\n            - mixinA\n            - netcat\n            - mixin3\n            - build:mixinTwo\n    lifecycle:\n        version: {{.lifecycle_version}}\n        buildpack_apis:\n            deprecated: {{.deprecated_buildpack_apis}}\n            supported: {{.supported_buildpack_apis}}\n        platform_apis:\n            deprecated: {{.deprecated_platform_apis}}\n            supported: {{.supported_platform_apis}}\n    run_images:\n        - name: some-registry.com/pack-test/run1\n          user_configured: true\n        - name: pack-test/run\n        - name: {{.run_image_mirror}}\n    buildpacks:\n        - id: noop.buildpack\n          name: NOOP Buildpack\n          version: noop.buildpack.later-version\n          homepage: http://geocities.com/cool-bp\n        - id: noop.buildpack\n          name: NOOP Buildpack\n          version: noop.buildpack.version\n        - id: read/env\n          name: Read Env Buildpack\n          version: read-env-version\n        - id: simple/layers\n          name: Simple Layers Buildpack\n          version: simple-layers-version\n        - id: simple/nested-level-1\n          name: Nested Level One Buildpack\n          version: nested-l1-version\n        - id: simple/nested-level-2\n          name: Nested Level Two Buildpack\n          version: nested-l2-version\n    detection_order:\n        - buildpacks:\n            - id: simple/nested-level-1\n              buildpacks:\n                - id: simple/nested-level-2\n                  version: nested-l2-version\n                  buildpacks:\n                    - id: simple/layers\n                      version: simple-layers-version\n            - id: read/env\n              version: read-env-version\n              optional: true\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_builder_output.txt",
    "content": "Inspecting builder: '{{.builder_name}}'\n\nREMOTE:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                    NAME                           VERSION                             HOMEPAGE\n  noop.buildpack        NOOP Buildpack                 noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack        NOOP Buildpack                 noop.buildpack.version              -\n  read/env              Read Env Buildpack             read-env-version                    -\n  simple/layers         Simple Layers Buildpack        simple-layers-version               -\n\nDetection Order:\n └ Group #1:\n    ├ simple/layers\n    └ read/env@read-env-version    (optional)\n\nLOCAL:\n\nCreated By:\n  Name: Pack CLI\n  Version: {{.pack_version}}\n\nTrusted: {{.trusted}}\n\nStack:\n  ID: pack.test.stack\n  Mixins:\n    mixinA\n    netcat\n    mixin3\n    build:mixinTwo\n\nLifecycle:\n  Version: {{.lifecycle_version}}\n  Buildpack APIs:\n    Deprecated: {{ .deprecated_buildpack_apis }}\n    Supported: {{ .supported_buildpack_apis }}\n  Platform APIs:\n    Deprecated: {{ .deprecated_platform_apis }}\n    Supported: {{ .supported_platform_apis }}\n\nRun Images:\n  some-registry.com/pack-test/run1    (user-configured)\n  pack-test/run\n  {{.run_image_mirror}}\n\nBuildpacks:\n  ID                    NAME                           VERSION                             HOMEPAGE\n  noop.buildpack     
   NOOP Buildpack                 noop.buildpack.later-version        http://geocities.com/cool-bp\n  noop.buildpack        NOOP Buildpack                 noop.buildpack.version              -\n  read/env              Read Env Buildpack             read-env-version                    -\n  simple/layers         Simple Layers Buildpack        simple-layers-version               -\n\nDetection Order:\n └ Group #1:\n    ├ simple/layers\n    └ read/env@read-env-version    (optional)\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_buildpack_output.txt",
    "content": "Inspecting buildpack: '{{ .buildpack_name }}'\n\n{{ .buildpack_source }}:\n\nStacks:\n  ID: pack.test.stack\n    Mixins:\n      (none)\n\nBuildpacks:\n  ID                          NAME                                  VERSION                             HOMEPAGE\n  simple/layers               Simple Layers Buildpack               simple-layers-version               -\n  simple/layers/parent        Simple Layers Parent Buildpack        simple-layers-parent-version        -\n\nDetection Order:\n └ Group #1:\n    └ simple/layers/parent@simple-layers-parent-version\n       └ Group #1:\n          └ simple/layers@simple-layers-version\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_image_local_output.json",
    "content": "{\n  \"image_name\": \"{{.image_name}}\",\n  \"remote_info\": null,\n  \"local_info\": {\n    \"stack\": \"pack.test.stack\",\n    \"rebasable\": {{.rebasable}},\n    \"base_image\": {\n      \"top_layer\": \"{{.base_image_top_layer}}\",\n      \"reference\": \"{{.base_image_id}}\"\n    },\n    \"run_images\": [\n      {\n        \"name\": \"{{.run_image_local_mirror}}\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"pack-test/run\"\n      },\n      {\n        \"name\": \"{{.run_image_mirror}}\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"id\": \"simple/layers\",\n        \"version\": \"simple-layers-version\"\n      }\n    ],\n    \"extensions\": null,\n    \"processes\": [\n      {\n        \"type\": \"web\",\n        \"shell\": \"bash\",\n        \"command\": \"{{ ( StringsEscapeBackslash .web_command ) }}\",\n        \"default\": true,\n        \"args\": [\n          \"8080\"\n        ],\n        \"working-dir\": \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n      },\n      {\n        \"type\": \"hello\",\n        \"shell\": \"\",\n        \"command\": \"{{.hello_command}}\",\n        \"default\": false,\n        \"args\": [\n          {{ ( StringsJoin (StringsDoubleQuote .hello_args) \",\" ) }}\n        ],\n        \"working-dir\": \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n      }\n    ]\n  }\n}\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_image_local_output.toml",
    "content": "image_name = \"{{.image_name}}\"\n\n[local_info]\nstack = \"pack.test.stack\"\nrebasable = {{.rebasable}}\n\n  [local_info.base_image]\n  top_layer = \"{{.base_image_top_layer}}\"\n  reference = \"{{.base_image_id}}\"\n\n  [[local_info.run_images]]\n  name = \"{{.run_image_local_mirror}}\"\n  user_configured = true\n\n  [[local_info.run_images]]\n  name = \"pack-test/run\"\n\n  [[local_info.run_images]]\n  name = \"{{.run_image_mirror}}\"\n\n  [[local_info.buildpacks]]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n\n  [[local_info.processes]]\n  type = \"web\"\n  shell = \"bash\"\n  command = \"{{ ( StringsEscapeBackslash .web_command ) }}\"\n  default = true\n  args = [ \"8080\" ]\n  working-dir = \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n\n  [[local_info.processes]]\n  type = \"hello\"\n  shell = \"\"\n  command = \"{{.hello_command}}\"\n  default = false\n  args = [ {{ ( StringsJoin (StringsDoubleQuote .hello_args) \",\") }} ]\n  working-dir = \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_image_local_output.yaml",
    "content": "---\nimage_name: \"{{.image_name}}\"\nremote_info:\nlocal_info:\n  stack: pack.test.stack\n  rebasable: {{.rebasable}}\n  base_image:\n    top_layer: \"{{.base_image_top_layer}}\"\n    reference: \"{{.base_image_id}}\"\n  run_images:\n    - name: \"{{.run_image_local_mirror}}\"\n      user_configured: true\n    - name: pack-test/run\n    - name: \"{{.run_image_mirror}}\"\n  buildpacks:\n    - id: simple/layers\n      version: simple-layers-version\n  extensions: []\n  processes:\n    - type: web\n      shell: bash\n      command: \"{{ ( StringsEscapeBackslash .web_command ) }}\"\n      default: true\n      args:\n        - '8080'\n      working-dir: \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n    - type: hello\n      shell: ''\n      command: \"{{.hello_command}}\"\n      default: false\n      args: [ {{ ( StringsJoin (StringsDoubleQuote .hello_args) \",\") }} ]\n      working-dir: \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_image_published_output.json",
    "content": "{\n  \"image_name\": \"{{.image_name}}\",\n  \"local_info\": null,\n  \"remote_info\": {\n    \"stack\": \"pack.test.stack\",\n    \"rebasable\": {{.rebasable}},\n    \"base_image\": {\n      \"top_layer\": \"{{.base_image_top_layer}}\",\n      \"reference\": \"{{.base_image_ref}}\"\n    },\n    \"run_images\": [\n      {\n        \"name\": \"pack-test/run\"\n      },\n      {\n        \"name\": \"{{.run_image_mirror}}\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"id\": \"simple/layers\",\n        \"version\": \"simple-layers-version\"\n      }\n    ],\n    \"extensions\": null,\n    \"processes\": [\n      {\n        \"type\": \"web\",\n        \"shell\": \"bash\",\n        \"command\": \"{{( StringsEscapeBackslash .web_command )}}\",\n        \"default\": true,\n        \"args\": [\n          \"8080\"\n        ],\n        \"working-dir\": \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n      },\n      {\n        \"type\": \"hello\",\n        \"shell\": \"\",\n        \"command\": \"{{.hello_command}}\",\n        \"default\": false,\n        \"args\": [\n          {{ ( StringsJoin (StringsDoubleQuote .hello_args) \",\" ) }}\n        ],\n        \"working-dir\": \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n      }\n    ]\n  }\n}\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_image_published_output.toml",
    "content": "image_name = \"{{.image_name}}\"\n\n[remote_info]\nstack = \"pack.test.stack\"\nrebasable = {{.rebasable}}\n\n  [remote_info.base_image]\n  top_layer = \"{{.base_image_top_layer}}\"\n  reference = \"{{.base_image_ref}}\"\n\n  [[remote_info.run_images]]\n  name = \"pack-test/run\"\n\n  [[remote_info.run_images]]\n  name = \"{{.run_image_mirror}}\"\n\n  [[remote_info.buildpacks]]\n  id = \"simple/layers\"\n  version = \"simple-layers-version\"\n\n  [[remote_info.processes]]\n  type = \"web\"\n  shell = \"bash\"\n  command = \"{{( StringsEscapeBackslash .web_command )}}\"\n  default = true\n  args = [ \"8080\" ]\n  working-dir = \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n\n  [[remote_info.processes]]\n  type = \"hello\"\n  shell = \"\"\n  command = \"{{.hello_command}}\"\n  default = false\n  args = [ {{ ( StringsJoin (StringsDoubleQuote .hello_args) \",\" ) }} ]\n  working-dir = \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/inspect_image_published_output.yaml",
    "content": "---\nimage_name: \"{{.image_name}}\"\nlocal_info: null\nremote_info:\n  stack: pack.test.stack\n  rebasable: {{.rebasable}}\n  base_image:\n    top_layer: \"{{.base_image_top_layer}}\"\n    reference: \"{{.base_image_ref}}\"\n  run_images:\n    - name: pack-test/run\n    - name: \"{{.run_image_mirror}}\"\n  buildpacks:\n    - id: simple/layers\n      version: simple-layers-version\n  extensions: []\n  processes:\n    - type: web\n      shell: bash\n      command: \"{{( StringsEscapeBackslash .web_command )}}\"\n      default: true\n      args:\n        - '8080'\n      working-dir: \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n    - type: hello\n      shell: ''\n      command: \"{{.hello_command}}\"\n      default: false\n      args: [ {{ ( StringsJoin (StringsDoubleQuote .hello_args) \",\" ) }} ]\n      working-dir: \"{{ ( StringsEscapeBackslash .image_workdir ) }}\"\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/invalid_builder.toml",
    "content": "[[buildpacks]]\n  url = \"read-env-buildpack.tgz\"\n\n[stack]\n  id = \"pack.test.stack\"\n  build-image = \"pack-test/build\"\n  run-image = \"pack-test/run\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/invalid_package.toml",
    "content": "[buildpack]\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/nested-level-1-buildpack_package.toml",
    "content": "[buildpack]\nuri = \"nested-level-1-buildpack.tgz\"\n\n[[dependencies]]\n  image = \"{{.simple_layers_buildpack}}\"\n\n[[dependencies]]\n  image = \"{{.nested_level_2_buildpack}}\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/nested-level-2-buildpack_package.toml",
    "content": "[buildpack]\nuri = \"nested-level-2-buildpack.tgz\"\n\n[[dependencies]]\nimage = \"{{.simple_layers_buildpack}}\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/nested_builder.toml",
    "content": "[[buildpacks]]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  uri = \"read-env-buildpack.tgz\"\n\n[[buildpacks]]\n  # intentionally missing id/version as they are optional\n  uri = \"noop-buildpack.tgz\"\n\n[[buildpacks]]\n  # noop-buildpack-2 has the same id but a different version compared to noop-buildpack\n  uri = \"noop-buildpack-2.tgz\"\n\n{{- if .simple_layers_buildpack_different_sha}}\n[[buildpacks]]\n  image = \"{{.simple_layers_buildpack_different_sha}}\"\n  version = \"simple-layers-version\"\n{{- end}}\n\n{{- if .nested_level_2_buildpack}}\n[[buildpacks]]\n  image = \"{{.nested_level_2_buildpack}}\"\n{{- end}}\n\n{{- if .nested_level_1_buildpack}}\n[[buildpacks]]\n  image = \"{{.nested_level_1_buildpack}}\"\n{{- end}}\n\n[[order]]\n{{- if .package_id}}\n[[order.group]]\n  id = \"{{.package_id}}\"\n  # intentionally missing version to test support\n{{- end}}\n\n[[order.group]]\n  id = \"read/env\"\n  version = \"read-env-version\"\n  optional = true\n\n[stack]\n  id = \"pack.test.stack\"\n  build-image = \"pack-test/build\"\n  run-image = \"pack-test/run\"\n  run-image-mirrors = [\"{{.run_image_mirror}}\"]\n\n[lifecycle]\n{{- if .lifecycle_uri}}\n  uri = \"{{.lifecycle_uri}}\"\n{{- end}}\n{{- if .lifecycle_version}}\n  version = \"{{.lifecycle_version}}\"\n{{- end}}"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/package.toml",
    "content": "[buildpack]\nuri = \"simple-layers-buildpack.tgz\"\n\n[platform]\nos = \"{{ .OS }}\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/package_aggregate.toml",
    "content": "[buildpack]\nuri = \"{{ .BuildpackURI }}\"\n\n[[dependencies]]\nimage = \"{{ .PackageName }}\"\n\n[platform]\nos = \"{{ .OS }}\"\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/package_for_build_cmd.toml",
    "content": "[buildpack]\nuri = \"simple-layers-parent-buildpack\"\n\n[[dependencies]]\nuri = \"simple-layers-buildpack\"\n\n[platform]\nos = \"{{ .OS }}\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/package_multi_platform.toml",
    "content": "[buildpack]\nuri = \"{{ .BuildpackURI }}\"\n\n[[dependencies]]\nuri = \"{{ .PackageName }}\"\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/report_output.txt",
    "content": "Pack:\n  Version:  {{ .Version }}\n  OS/Arch:  {{ .OS }}/{{ .Arch }}\n\nDefault Lifecycle Version:  0.21.0\n\nSupported Platform APIs:  0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 0.10, 0.11, 0.12, 0.13, 0.14, 0.15\n\nConfig:\n  default-builder-image = \"{{ .DefaultBuilder }}\"\n  experimental = true\n  layout-repo-dir = \"{{ .LayoutRepoDir }}\"\n"
  },
  {
    "path": "acceptance/testdata/pack_fixtures/simple-layers-buildpack-different-sha_package.toml",
    "content": "[buildpack]\nuri = \"simple-layers-buildpack-different-sha.tgz\""
  },
  {
    "path": "acceptance/testdata/pack_fixtures/simple-layers-buildpack_package.toml",
    "content": "[buildpack]\nuri = \"simple-layers-buildpack.tgz\""
  },
  {
    "path": "acceptance/testdata/pack_previous_fixtures_overrides/.gitkeep",
    "content": ""
  },
  {
    "path": "benchmarks/build_test.go",
    "content": "//go:build benchmarks\n\npackage benchmarks\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\tdockerCli \"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\tcfg \"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nvar (\n\tbaseImg               string\n\ttrustedImg            string\n\tbuilder               string\n\tmockAppPath           string\n\tpaketoBuilder         string\n\tadditionalMockAppPath string\n\tadditionalBuildpack   string\n)\n\nfunc BenchmarkBuild(b *testing.B) {\n\tsetEnv()\n\tdockerClient, err := dockerCli.New(dockerCli.FromEnv)\n\tif err != nil {\n\t\tb.Error(errors.Wrap(err, \"creating docker client\"))\n\t}\n\n\tif err = h.PullImageWithAuth(dockerClient, builder, \"\"); err != nil {\n\t\tb.Error(errors.Wrapf(err, \"pulling builder %s\", builder))\n\t}\n\n\tcmd := createCmd(b, dockerClient)\n\n\tb.Run(\"with Untrusted Builder\", func(b *testing.B) {\n\t\tfor i := 0; i < b.N; i++ {\n\t\t\t// perform the operation we're analyzing\n\t\t\tcmd.SetArgs([]string{fmt.Sprintf(\"%s%d\", baseImg, i), \"-p\", mockAppPath, \"-B\", builder})\n\t\t\tif err = cmd.Execute(); err != nil {\n\t\t\t\tb.Error(errors.Wrapf(err, \"running build #%d\", i))\n\t\t\t}\n\t\t}\n\t})\n\n\tb.Run(\"with Trusted Builder\", func(b *testing.B) {\n\t\tfor i := 0; i < b.N; i++ {\n\t\t\t// perform the operation we're analyzing\n\t\t\tcmd.SetArgs([]string{fmt.Sprintf(\"%s%d\", trustedImg, i), \"-p\", mockAppPath, \"-B\", builder, \"--trust-builder\"})\n\t\t\tif err = cmd.Execute(); err != nil {\n\t\t\t\tb.Error(errors.Wrapf(err, \"running build #%d\", i))\n\t\t\t}\n\t\t}\n\t})\n\n\tb.Run(\"with Additional Buildpack\", func(b *testing.B) {\n\t\tfor i := 0; i < b.N; i++ {\n\t\t\t// perform the operation we're analyzing\n\t\t\tcmd.SetArgs([]string{fmt.Sprintf(\"%s%d\", trustedImg, i), \"-p\", additionalMockAppPath, \"-B\", paketoBuilder, \"--buildpack\", additionalBuildpack})\n\t\t\tif err = cmd.Execute(); err != nil {\n\t\t\t\tb.Error(errors.Wrapf(err, \"running build #%d\", i))\n\t\t\t}\n\t\t}\n\t})\n\n\t// Cleanup\n\tfor i := 0; i < b.N; i++ {\n\t\tif err = h.DockerRmi(dockerClient, fmt.Sprintf(\"%s%d\", baseImg, i)); err != nil {\n\t\t\tb.Error(errors.Wrapf(err, \"deleting image #%d\", i))\n\t\t}\n\n\t\tif err = h.DockerRmi(dockerClient, fmt.Sprintf(\"%s%d\", trustedImg, i)); err != nil {\n\t\t\tb.Error(errors.Wrapf(err, \"deleting image #%d\", i))\n\t\t}\n\t}\n\n\tif err = h.DockerRmi(dockerClient, builder); err != nil {\n\t\tb.Error(errors.Wrapf(err, \"deleting builder %s\", builder))\n\t}\n}\n\nfunc createCmd(b *testing.B, docker *dockerCli.Client) *cobra.Command {\n\toutBuf := bytes.Buffer{}\n\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\tpackClient, err := client.NewClient(client.WithLogger(logger), client.WithDockerClient(docker), client.WithExperimental(true))\n\tif err != nil {\n\t\tb.Error(errors.Wrap(err, \"creating packClient\"))\n\t}\n\treturn commands.Build(logger, cfg.Config{}, packClient)\n}\n\nfunc setEnv() {\n\tif baseImg = os.Getenv(\"baseImg\"); baseImg == \"\" {\n\t\tbaseImg = \"some-org/\" + h.RandString(10)\n\t}\n\ttrustedImg = baseImg + \"-trusted-\"\n\tif builder = os.Getenv(\"builder\"); builder == \"\" {\n\t\tbuilder = \"cnbs/sample-builder:bionic\"\n\t}\n\tif mockAppPath = os.Getenv(\"mockAppPath\"); mockAppPath == \"\" {\n\t\tmockAppPath = filepath.Join(\"..\", \"acceptance\", \"testdata\", \"mock_app\")\n\t}\n\tif paketoBuilder = os.Getenv(\"paketoBuilder\"); paketoBuilder == \"\" {\n\t\tpaketoBuilder = \"paketobuildpacks/builder-jammy-base\"\n\t}\n\tif additionalMockAppPath = os.Getenv(\"additionalMockAppPath\"); additionalMockAppPath == \"\" {\n\t\tadditionalMockAppPath = filepath.Join(\"..\", \"samples\", \"apps\", \"java-maven\")\n\t}\n\tif additionalBuildpack = os.Getenv(\"additionalBuildpack\"); additionalBuildpack == \"\" {\n\t\tadditionalBuildpack = \"paketobuildpacks/java:latest\"\n\t}\n}\n"
  },
  {
    "path": "builder/buildpack_identifier.go",
    "content": "package builder\n\ntype BpIdentifier interface {\n\tId() string\n}\n"
  },
  {
    "path": "builder/config_reader.go",
    "content": "package builder\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\n// Config is a builder configuration file\ntype Config struct {\n\tDescription     string           `toml:\"description\"`\n\tBuildpacks      ModuleCollection `toml:\"buildpacks\"`\n\tExtensions      ModuleCollection `toml:\"extensions\"`\n\tOrder           dist.Order       `toml:\"order\"`\n\tOrderExtensions dist.Order       `toml:\"order-extensions\"`\n\tStack           StackConfig      `toml:\"stack\"`\n\tLifecycle       LifecycleConfig  `toml:\"lifecycle\"`\n\tRun             RunConfig        `toml:\"run\"`\n\tBuild           BuildConfig      `toml:\"build\"`\n\tTargets         []dist.Target    `toml:\"targets\"`\n\tSystem          dist.System      `toml:\"system\"`\n}\n\n// ModuleCollection is a list of ModuleConfigs\ntype ModuleCollection []ModuleConfig\n\n// ModuleConfig details the configuration of a Buildpack or Extension\ntype ModuleConfig struct {\n\tdist.ModuleInfo\n\tdist.ImageOrURI\n}\n\nfunc (c *ModuleConfig) DisplayString() string {\n\tif c.FullName() != \"\" {\n\t\treturn c.FullName()\n\t}\n\n\treturn c.ImageOrURI.DisplayString()\n}\n\n// StackConfig details the configuration of a Stack\ntype StackConfig struct {\n\tID              string   `toml:\"id\"`\n\tBuildImage      string   `toml:\"build-image\"`\n\tRunImage        string   `toml:\"run-image\"`\n\tRunImageMirrors []string `toml:\"run-image-mirrors,omitempty\"`\n}\n\n// LifecycleConfig details the configuration of the Lifecycle\ntype LifecycleConfig struct {\n\tURI     string `toml:\"uri\"`\n\tVersion string `toml:\"version\"`\n}\n\n// RunConfig set of run image configuration\ntype RunConfig struct {\n\tImages []RunImageConfig `toml:\"images\"`\n}\n\n// RunImageConfig run image id 
and mirrors\ntype RunImageConfig struct {\n\tImage   string   `toml:\"image\"`\n\tMirrors []string `toml:\"mirrors,omitempty\"`\n}\n\n// BuildConfig build image configuration\ntype BuildConfig struct {\n\tImage string           `toml:\"image\"`\n\tEnv   []BuildConfigEnv `toml:\"env\"`\n}\n\ntype Suffix string\n\nconst (\n\tNONE     Suffix = \"\"\n\tDEFAULT  Suffix = \"default\"\n\tOVERRIDE Suffix = \"override\"\n\tAPPEND   Suffix = \"append\"\n\tPREPEND  Suffix = \"prepend\"\n)\n\ntype BuildConfigEnv struct {\n\tName    string   `toml:\"name\"`\n\tValue   string   `toml:\"value\"`\n\tSuffix  Suffix   `toml:\"suffix,omitempty\"`\n\tDelim   string   `toml:\"delim,omitempty\"`\n\tExecEnv []string `toml:\"exec-env,omitempty\"`\n}\n\n// ReadConfig reads a builder configuration from the file path provided and returns the\n// configuration along with any warnings encountered while parsing\nfunc ReadConfig(path string) (config Config, warnings []string, err error) {\n\tfile, err := os.Open(filepath.Clean(path))\n\tif err != nil {\n\t\treturn Config{}, nil, errors.Wrap(err, \"opening config file\")\n\t}\n\tdefer file.Close()\n\n\tconfig, err = parseConfig(file)\n\tif err != nil {\n\t\treturn Config{}, nil, errors.Wrapf(err, \"parse contents of '%s'\", path)\n\t}\n\n\tif len(config.Order) == 0 {\n\t\twarnings = append(warnings, fmt.Sprintf(\"empty %s definition\", style.Symbol(\"order\")))\n\t}\n\n\tconfig.mergeStackWithImages()\n\n\treturn config, warnings, nil\n}\n\n// ValidateConfig validates the config\nfunc ValidateConfig(c Config) error {\n\tif c.Build.Image == \"\" && c.Stack.BuildImage == \"\" {\n\t\treturn errors.New(\"build.image is required\")\n\t} else if c.Build.Image != \"\" && c.Stack.BuildImage != \"\" && c.Build.Image != c.Stack.BuildImage {\n\t\treturn errors.New(\"build.image and stack.build-image do not match\")\n\t}\n\n\tif len(c.Run.Images) == 0 && (c.Stack.RunImage == \"\" || c.Stack.ID == \"\") {\n\t\treturn errors.New(\"run.images are 
required\")\n\t}\n\n\tfor _, runImage := range c.Run.Images {\n\t\tif runImage.Image == \"\" {\n\t\t\treturn errors.New(\"run.images.image is required\")\n\t\t}\n\t}\n\n\tif c.Stack.RunImage != \"\" && c.Run.Images[0].Image != c.Stack.RunImage {\n\t\treturn errors.New(\"run.images and stack.run-image do not match\")\n\t}\n\n\treturn nil\n}\n\nfunc (c *Config) mergeStackWithImages() {\n\t// RFC-0096\n\tif c.Build.Image != \"\" {\n\t\tc.Stack.BuildImage = c.Build.Image\n\t} else if c.Build.Image == \"\" && c.Stack.BuildImage != \"\" {\n\t\tc.Build.Image = c.Stack.BuildImage\n\t}\n\n\tif len(c.Run.Images) != 0 {\n\t\t// use the first run image as the \"stack\"\n\t\tc.Stack.RunImage = c.Run.Images[0].Image\n\t\tc.Stack.RunImageMirrors = c.Run.Images[0].Mirrors\n\t} else if len(c.Run.Images) == 0 && c.Stack.RunImage != \"\" {\n\t\tc.Run.Images = []RunImageConfig{{\n\t\t\tImage:   c.Stack.RunImage,\n\t\t\tMirrors: c.Stack.RunImageMirrors,\n\t\t},\n\t\t}\n\t}\n}\n\n// parseConfig reads a builder configuration from file\nfunc parseConfig(file *os.File) (Config, error) {\n\tbuilderConfig := Config{}\n\ttomlMetadata, err := toml.NewDecoder(file).Decode(&builderConfig)\n\tif err != nil {\n\t\treturn Config{}, errors.Wrap(err, \"decoding toml contents\")\n\t}\n\n\tundecodedKeys := tomlMetadata.Undecoded()\n\tif len(undecodedKeys) > 0 {\n\t\tunknownElementsMsg := config.FormatUndecodedKeys(undecodedKeys)\n\n\t\treturn Config{}, errors.Errorf(\"%s in %s\",\n\t\t\tunknownElementsMsg,\n\t\t\tstyle.Symbol(file.Name()),\n\t\t)\n\t}\n\n\treturn builderConfig, nil\n}\n\nfunc ParseBuildConfigEnv(env []BuildConfigEnv, path string) (envMap map[string]string, warnings []string, err error) {\n\tenvMap = map[string]string{}\n\tvar appendOrPrependWithoutDelim = 0\n\tfor _, v := range env {\n\t\tif name := v.Name; name == \"\" || len(name) == 0 {\n\t\t\treturn nil, nil, errors.Wrapf(errors.Errorf(\"env name should not be empty\"), \"parse contents of '%s'\", path)\n\t\t}\n\t\tif val := 
v.Value; val == \"\" || len(val) == 0 {\n\t\t\twarnings = append(warnings, fmt.Sprintf(\"empty value for key/name %s\", style.Symbol(v.Name)))\n\t\t}\n\t\tsuffixName, delimName, err := getBuildConfigEnvFileName(v)\n\t\tif err != nil {\n\t\t\treturn envMap, warnings, err\n\t\t}\n\t\tif val, e := envMap[suffixName]; e {\n\t\t\twarnings = append(warnings, fmt.Sprintf(errors.Errorf(\"overriding env with name: %s and suffix: %s from %s to %s\", style.Symbol(v.Name), style.Symbol(string(v.Suffix)), style.Symbol(val), style.Symbol(v.Value)).Error(), \"parse contents of '%s'\", path))\n\t\t}\n\t\tif val, e := envMap[delimName]; e {\n\t\t\twarnings = append(warnings, fmt.Sprintf(errors.Errorf(\"overriding env with name: %s and delim: %s from %s to %s\", style.Symbol(v.Name), style.Symbol(v.Delim), style.Symbol(val), style.Symbol(v.Value)).Error(), \"parse contents of '%s'\", path))\n\t\t}\n\t\tif delim := v.Delim; (delim != \"\" || len(delim) != 0) && (delimName != \"\" || len(delimName) != 0) {\n\t\t\tenvMap[delimName] = delim\n\t\t}\n\t\tenvMap[suffixName] = v.Value\n\t}\n\n\tfor k := range envMap {\n\t\tname, suffix, err := getFilePrefixSuffix(k)\n\t\tif err != nil {\n\t\t\tcontinue\n\t\t}\n\t\tif _, ok := envMap[name+\".delim\"]; (suffix == \"append\" || suffix == \"prepend\") && !ok {\n\t\t\twarnings = append(warnings, fmt.Sprintf(errors.Errorf(\"env with name/key %s with suffix %s must have a %s value\", style.Symbol(name), style.Symbol(suffix), style.Symbol(\"delim\")).Error(), \"parse contents of '%s'\", path))\n\t\t\tappendOrPrependWithoutDelim++\n\t\t}\n\t}\n\tif appendOrPrependWithoutDelim > 0 {\n\t\treturn envMap, warnings, errors.Errorf(\"error parsing [[build.env]] in file '%s'\", path)\n\t}\n\treturn envMap, warnings, err\n}\n\nfunc getBuildConfigEnvFileName(env BuildConfigEnv) (suffixName, delimName string, err error) {\n\tsuffix, err := getActionType(env.Suffix)\n\tif err != nil {\n\t\treturn suffixName, delimName, err\n\t}\n\tif suffix == \"\" 
{\n\t\tsuffixName = env.Name\n\t} else {\n\t\tsuffixName = env.Name + suffix\n\t}\n\tif delim := env.Delim; delim != \"\" || len(delim) != 0 {\n\t\tdelimName = env.Name + \".delim\"\n\t}\n\treturn suffixName, delimName, err\n}\n\nfunc getActionType(suffix Suffix) (suffixString string, err error) {\n\tconst delim = \".\"\n\tswitch suffix {\n\tcase NONE:\n\t\treturn \"\", nil\n\tcase DEFAULT:\n\t\treturn delim + string(DEFAULT), nil\n\tcase OVERRIDE:\n\t\treturn delim + string(OVERRIDE), nil\n\tcase APPEND:\n\t\treturn delim + string(APPEND), nil\n\tcase PREPEND:\n\t\treturn delim + string(PREPEND), nil\n\tdefault:\n\t\treturn suffixString, errors.Errorf(\"unknown action type %s\", style.Symbol(string(suffix)))\n\t}\n}\n\nfunc getFilePrefixSuffix(filename string) (prefix, suffix string, err error) {\n\tval := strings.Split(filename, \".\")\n\tif len(val) <= 1 {\n\t\treturn val[0], suffix, errors.Errorf(\"Suffix might be null\")\n\t}\n\tif len(val) == 2 {\n\t\tsuffix = val[1]\n\t} else {\n\t\tsuffix = strings.Join(val[1:], \".\")\n\t}\n\treturn val[0], suffix, err\n}\n"
  },
  {
    "path": "builder/config_reader_test.go",
    "content": "package builder_test\n\nimport (\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/builder\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfig(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testConfig\", testConfig, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfig(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#ReadConfig\", func() {\n\t\tvar (\n\t\t\ttmpDir            string\n\t\t\tbuilderConfigPath string\n\t\t\terr               error\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"config-test\")\n\t\t\th.AssertNil(t, err)\n\t\t\tbuilderConfigPath = filepath.Join(tmpDir, \"builder.toml\")\n\t\t})\n\n\t\tit.After(func() {\n\t\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t\t})\n\n\t\twhen(\"file is written properly\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(`\n[[buildpacks]]\n  id = \"buildpack/1\"\n  version = \"0.0.1\"\n  uri = \"https://example.com/buildpack-1.tgz\"\n\n[[buildpacks]]\n  image = \"example.com/buildpack:2\"\n\n[[buildpacks]]\n  uri = \"https://example.com/buildpack-3.tgz\"\n\n[[order]]\n[[order.group]]\n  id = \"buildpack/1\"\n  exec-env = [\"production\"]\n\n[[build.env]]\n  name = \"key1\"\n  value = \"value1\"\n  suffix = \"append\"\n  delim = \"%\"\n  exec-env = [\"test\"]\n\n`), 0666))\n\t\t\t})\n\n\t\t\tit(\"returns a builder config\", func() {\n\t\t\t\tbuilderConfig, warns, err := builder.ReadConfig(builderConfigPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(warns), 0)\n\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[0].ID, \"buildpack/1\")\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[0].Version, \"0.0.1\")\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[0].URI, 
\"https://example.com/buildpack-1.tgz\")\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[0].ImageName, \"\")\n\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[1].ID, \"\")\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[1].URI, \"\")\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[1].ImageName, \"example.com/buildpack:2\")\n\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[2].ID, \"\")\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[2].URI, \"https://example.com/buildpack-3.tgz\")\n\t\t\t\th.AssertEq(t, builderConfig.Buildpacks[2].ImageName, \"\")\n\n\t\t\t\th.AssertEq(t, builderConfig.Order[0].Group[0].ID, \"buildpack/1\")\n\t\t\t\th.AssertTrue(t, len(builderConfig.Order[0].Group[0].ExecEnv) == 1)\n\t\t\t\th.AssertEq(t, builderConfig.Order[0].Group[0].ExecEnv[0], \"production\")\n\n\t\t\t\th.AssertTrue(t, len(builderConfig.Build.Env) == 1)\n\t\t\t\th.AssertEq(t, builderConfig.Build.Env[0].Name, \"key1\")\n\t\t\t\th.AssertEq(t, builderConfig.Build.Env[0].Value, \"value1\")\n\t\t\t\th.AssertEq(t, string(builderConfig.Build.Env[0].Suffix), \"append\")\n\t\t\t\th.AssertTrue(t, len(builderConfig.Build.Env[0].ExecEnv) == 1)\n\t\t\t\th.AssertEq(t, builderConfig.Build.Env[0].ExecEnv[0], \"test\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"an error occurs while reading\", func() {\n\t\t\tit(\"bubbles up the error\", func() {\n\t\t\t\t_, _, err := builder.ReadConfig(builderConfigPath)\n\t\t\t\th.AssertError(t, err, \"opening config file\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"detecting warnings\", func() {\n\t\t\twhen(\"'groups' field is used\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(`\n[[buildpacks]]\n  id = \"some.buildpack\"\n  version = \"some.buildpack.version\"\n\n[[groups]]\n[[groups.buildpacks]]\n  id = \"some.buildpack\"\n  version = \"some.buildpack.version\"\n\n[[order]]\n[[order.group]]\n  id = \"some.buildpack\"\n`), 0666))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns error when obsolete 'groups' field is used\", func() 
{\n\t\t\t\t\t_, warns, err := builder.ReadConfig(builderConfigPath)\n\t\t\t\t\th.AssertError(t, err, \"parse contents of\")\n\t\t\t\t\th.AssertEq(t, len(warns), 0)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"'order' is missing or empty\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(`\n[[buildpacks]]\n  id = \"some.buildpack\"\n  version = \"some.buildpack.version\"\n`), 0666))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns warnings\", func() {\n\t\t\t\t\t_, warns, err := builder.ReadConfig(builderConfigPath)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertSliceContainsOnly(t, warns, \"empty 'order' definition\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"unknown buildpack key is present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(`\n[[buildpacks]]\nurl = \"noop-buildpack.tgz\"\n`), 0666))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t_, _, err := builder.ReadConfig(builderConfigPath)\n\t\t\t\t\th.AssertError(t, err, \"unknown configuration element 'buildpacks.url'\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"unknown array table is present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(`\n[[buidlpack]]\nuri = \"noop-buildpack.tgz\"\n`), 0666))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t_, _, err := builder.ReadConfig(builderConfigPath)\n\t\t\t\t\th.AssertError(t, err, \"unknown configuration element 'buidlpack'\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"system buildpack is defined\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(`\n[[system.pre.buildpacks]]\n  id = \"id-1\"\n  version = \"1.0\"\n  optional = false\n\n[[system.post.buildpacks]]\n  id = \"id-2\"\n  version = \"2.0\"\n  optional = true\n`), 0666))\n\t\t\t})\n\n\t\t\tit(\"returns a builder config\", func() {\n\t\t\t\tbuilderConfig, _, err := 
builder.ReadConfig(builderConfigPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(builderConfig.System.Pre.Buildpacks), 1)\n\t\t\t\th.AssertEq(t, len(builderConfig.System.Post.Buildpacks), 1)\n\n\t\t\t\t// Verify system.pre.buildpacks\n\t\t\t\th.AssertEq(t, builderConfig.System.Pre.Buildpacks[0].ID, \"id-1\")\n\t\t\t\th.AssertEq(t, builderConfig.System.Pre.Buildpacks[0].Version, \"1.0\")\n\t\t\t\th.AssertEq(t, builderConfig.System.Pre.Buildpacks[0].Optional, false)\n\n\t\t\t\t// Verify system.post.buildpacks\n\t\t\t\th.AssertEq(t, builderConfig.System.Post.Buildpacks[0].ID, \"id-2\")\n\t\t\t\th.AssertEq(t, builderConfig.System.Post.Buildpacks[0].Version, \"2.0\")\n\t\t\t\th.AssertEq(t, builderConfig.System.Post.Buildpacks[0].Optional, true)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#ValidateConfig()\", func() {\n\t\tvar (\n\t\t\ttestID         = \"testID\"\n\t\t\ttestRunImage   = \"test-run-image\"\n\t\t\ttestBuildImage = \"test-build-image\"\n\t\t)\n\n\t\tit(\"returns error if no stack id and no run images\", func() {\n\t\t\tconfig := builder.Config{\n\t\t\t\tStack: builder.StackConfig{\n\t\t\t\t\tBuildImage: testBuildImage,\n\t\t\t\t\tRunImage:   testRunImage,\n\t\t\t\t}}\n\t\t\th.AssertError(t, builder.ValidateConfig(config), \"run.images are required\")\n\t\t})\n\n\t\tit(\"returns error if no build image\", func() {\n\t\t\tconfig := builder.Config{\n\t\t\t\tStack: builder.StackConfig{\n\t\t\t\t\tID:       testID,\n\t\t\t\t\tRunImage: testRunImage,\n\t\t\t\t}}\n\t\t\th.AssertError(t, builder.ValidateConfig(config), \"build.image is required\")\n\t\t})\n\n\t\tit(\"returns error if no run image\", func() {\n\t\t\tconfig := builder.Config{\n\t\t\t\tStack: builder.StackConfig{\n\t\t\t\t\tID:         testID,\n\t\t\t\t\tBuildImage: testBuildImage,\n\t\t\t\t}}\n\t\t\th.AssertError(t, builder.ValidateConfig(config), \"run.images are required\")\n\t\t})\n\n\t\tit(\"returns error if no run images image\", func() {\n\t\t\tconfig := builder.Config{\n\t\t\t\tBuild: 
builder.BuildConfig{\n\t\t\t\t\tImage: testBuildImage,\n\t\t\t\t},\n\t\t\t\tRun: builder.RunConfig{\n\t\t\t\t\tImages: []builder.RunImageConfig{{\n\t\t\t\t\t\tImage: \"\",\n\t\t\t\t\t}},\n\t\t\t\t}}\n\t\t\th.AssertError(t, builder.ValidateConfig(config), \"run.images.image is required\")\n\t\t})\n\n\t\tit(\"returns error if no stack or run image\", func() {\n\t\t\tconfig := builder.Config{\n\t\t\t\tBuild: builder.BuildConfig{\n\t\t\t\t\tImage: testBuildImage,\n\t\t\t\t}}\n\t\t\th.AssertError(t, builder.ValidateConfig(config), \"run.images are required\")\n\t\t})\n\n\t\tit(\"returns error if no stack and no build image\", func() {\n\t\t\tconfig := builder.Config{\n\t\t\t\tRun: builder.RunConfig{\n\t\t\t\t\tImages: []builder.RunImageConfig{{\n\t\t\t\t\t\tImage: testBuildImage,\n\t\t\t\t\t}},\n\t\t\t\t}}\n\t\t\th.AssertError(t, builder.ValidateConfig(config), \"build.image is required\")\n\t\t})\n\n\t\tit(\"returns error if no stack, run, or build image\", func() {\n\t\t\tconfig := builder.Config{}\n\t\t\th.AssertError(t, builder.ValidateConfig(config), \"build.image is required\")\n\t\t})\n\t})\n\twhen(\"#ParseBuildConfigEnv()\", func() {\n\t\tit(\"should return an error when name is not defined\", func() {\n\t\t\t_, _, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{\n\t\t\t\t{\n\t\t\t\t\tName:  \"\",\n\t\t\t\t\tValue: \"value\",\n\t\t\t\t},\n\t\t\t}, \"\")\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t\tit(\"should warn when the value is nil or empty string\", func() {\n\t\t\tenv, warn, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{\n\t\t\t\t{\n\t\t\t\t\tName:   \"key\",\n\t\t\t\t\tValue:  \"\",\n\t\t\t\t\tSuffix: \"override\",\n\t\t\t\t},\n\t\t\t}, \"\")\n\n\t\t\th.AssertNotNil(t, warn)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertMapContains[string, string](t, env, h.NewKeyValue[string, string](\"key.override\", \"\"))\n\t\t})\n\t\tit(\"should return an error when unknown suffix is specified\", func() {\n\t\t\t_, _, err := 
builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{\n\t\t\t\t{\n\t\t\t\t\tName:   \"key\",\n\t\t\t\t\tValue:  \"\",\n\t\t\t\t\tSuffix: \"invalid\",\n\t\t\t\t},\n\t\t\t}, \"\")\n\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t\tit(\"should override and show a warning when suffix or delim is defined multiple times\", func() {\n\t\t\tenv, warn, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{\n\t\t\t\t{\n\t\t\t\t\tName:   \"key1\",\n\t\t\t\t\tValue:  \"value1\",\n\t\t\t\t\tSuffix: \"append\",\n\t\t\t\t\tDelim:  \"%\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName:   \"key1\",\n\t\t\t\t\tValue:  \"value2\",\n\t\t\t\t\tSuffix: \"append\",\n\t\t\t\t\tDelim:  \",\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName:   \"key1\",\n\t\t\t\t\tValue:  \"value3\",\n\t\t\t\t\tSuffix: \"default\",\n\t\t\t\t\tDelim:  \";\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName:   \"key1\",\n\t\t\t\t\tValue:  \"value4\",\n\t\t\t\t\tSuffix: \"prepend\",\n\t\t\t\t\tDelim:  \":\",\n\t\t\t\t},\n\t\t\t}, \"\")\n\n\t\t\th.AssertNotNil(t, warn)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertMapContains[string, string](\n\t\t\t\tt,\n\t\t\t\tenv,\n\t\t\t\th.NewKeyValue[string, string](\"key1.append\", \"value2\"),\n\t\t\t\th.NewKeyValue[string, string](\"key1.default\", \"value3\"),\n\t\t\t\th.NewKeyValue[string, string](\"key1.prepend\", \"value4\"),\n\t\t\t\th.NewKeyValue[string, string](\"key1.delim\", \":\"),\n\t\t\t)\n\t\t\th.AssertMapNotContains[string, string](\n\t\t\t\tt,\n\t\t\t\tenv,\n\t\t\t\th.NewKeyValue[string, string](\"key1.append\", \"value1\"),\n\t\t\t\th.NewKeyValue[string, string](\"key1.delim\", \"%\"),\n\t\t\t\th.NewKeyValue[string, string](\"key1.delim\", \",\"),\n\t\t\t\th.NewKeyValue[string, string](\"key1.delim\", \";\"),\n\t\t\t)\n\t\t})\n\t\tit(\"should return an error when `suffix` is defined as `append` or `prepend` without a `delim`\", func() {\n\t\t\t_, warn, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{\n\t\t\t\t{\n\t\t\t\t\tName:   \"key\",\n\t\t\t\t\tValue:  
\"value\",\n\t\t\t\t\tSuffix: \"append\",\n\t\t\t\t},\n\t\t\t}, \"\")\n\n\t\t\th.AssertNotNil(t, warn)\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t\tit(\"when suffix is NONE or omitted should default to `override`\", func() {\n\t\t\tenv, warn, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{\n\t\t\t\t{\n\t\t\t\t\tName:   \"key\",\n\t\t\t\t\tValue:  \"value\",\n\t\t\t\t\tSuffix: \"\",\n\t\t\t\t},\n\t\t\t}, \"\")\n\n\t\t\th.AssertNotNil(t, warn)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertMapContains[string, string](t, env, h.NewKeyValue[string, string](\"key\", \"value\"))\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "builder/detection_order.go",
    "content": "package builder\n\nimport (\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype DetectionOrderEntry struct {\n\tdist.ModuleRef      `yaml:\",inline\"`\n\tCyclical            bool           `json:\"cyclic,omitempty\" yaml:\"cyclic,omitempty\" toml:\"cyclic,omitempty\"`\n\tGroupDetectionOrder DetectionOrder `json:\"buildpacks,omitempty\" yaml:\"buildpacks,omitempty\" toml:\"buildpacks,omitempty\"`\n}\n\ntype DetectionOrder []DetectionOrderEntry\n\nconst (\n\tOrderDetectionMaxDepth = -1\n\tOrderDetectionNone     = 0\n)\n"
  },
  {
    "path": "buildpackage/config_reader.go",
    "content": "package buildpackage\n\nimport (\n\t\"path/filepath\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\nconst defaultOS = \"linux\"\n\n// Config encapsulates the possible configuration options for buildpackage creation.\ntype Config struct {\n\tBuildpack    dist.BuildpackURI `toml:\"buildpack\"`\n\tExtension    dist.BuildpackURI `toml:\"extension\"`\n\tDependencies []dist.ImageOrURI `toml:\"dependencies\"`\n\t// deprecated\n\tPlatform dist.Platform `toml:\"platform\"`\n\n\t// Define targets for composite buildpacks\n\tTargets []dist.Target `toml:\"targets\"`\n}\n\nfunc DefaultConfig() Config {\n\treturn Config{\n\t\tBuildpack: dist.BuildpackURI{\n\t\t\tURI: \".\",\n\t\t},\n\t\tPlatform: dist.Platform{\n\t\t\tOS: defaultOS,\n\t\t},\n\t}\n}\n\nfunc DefaultExtensionConfig() Config {\n\treturn Config{\n\t\tExtension: dist.BuildpackURI{\n\t\t\tURI: \".\",\n\t\t},\n\t\tPlatform: dist.Platform{\n\t\t\tOS: defaultOS,\n\t\t},\n\t}\n}\n\n// NewConfigReader returns an instance of ConfigReader. 
It does not take any parameters.\nfunc NewConfigReader() *ConfigReader {\n\treturn &ConfigReader{}\n}\n\n// ConfigReader implements a Read method for buildpackage configuration which parses and validates buildpackage\n// configuration from a toml file.\ntype ConfigReader struct{}\n\n// Read reads and validates a buildpackage configuration from the file path provided and returns the\n// configuration and any error that occurred during reading or validation.\nfunc (r *ConfigReader) Read(path string) (Config, error) {\n\tpackageConfig := Config{}\n\n\ttomlMetadata, err := toml.DecodeFile(path, &packageConfig)\n\tif err != nil {\n\t\treturn packageConfig, errors.Wrap(err, \"decoding toml\")\n\t}\n\n\tundecodedKeys := tomlMetadata.Undecoded()\n\tif len(undecodedKeys) > 0 {\n\t\tunknownElementsMsg := config.FormatUndecodedKeys(undecodedKeys)\n\n\t\treturn packageConfig, errors.Errorf(\"%s in %s\",\n\t\t\tunknownElementsMsg,\n\t\t\tstyle.Symbol(path),\n\t\t)\n\t}\n\n\tif packageConfig.Buildpack.URI == \"\" && packageConfig.Extension.URI == \"\" {\n\t\tif packageConfig.Buildpack.URI == \"\" {\n\t\t\treturn packageConfig, errors.Errorf(\"missing %s configuration\", style.Symbol(\"buildpack.uri\"))\n\t\t}\n\t\treturn packageConfig, errors.Errorf(\"missing %s configuration\", style.Symbol(\"extension.uri\"))\n\t}\n\n\tif packageConfig.Platform.OS == \"\" {\n\t\tpackageConfig.Platform.OS = defaultOS\n\t}\n\n\tif packageConfig.Platform.OS != \"linux\" && packageConfig.Platform.OS != \"windows\" {\n\t\treturn packageConfig, errors.Errorf(\"invalid %s configuration: only [%s, %s] is permitted, found %s\",\n\t\t\tstyle.Symbol(\"platform.os\"), style.Symbol(\"linux\"), style.Symbol(\"windows\"), style.Symbol(packageConfig.Platform.OS))\n\t}\n\n\tconfigDir, err := filepath.Abs(filepath.Dir(path))\n\tif err != nil {\n\t\treturn packageConfig, err\n\t}\n\n\tif err := validateURI(packageConfig.Buildpack.URI, configDir); err != nil {\n\t\treturn packageConfig, err\n\t}\n\n\tfor _, dep 
:= range packageConfig.Dependencies {\n\t\tif dep.URI != \"\" && dep.ImageName != \"\" {\n\t\t\treturn packageConfig, errors.Errorf(\n\t\t\t\t\"dependency configured with both %s and %s\",\n\t\t\t\tstyle.Symbol(\"uri\"),\n\t\t\t\tstyle.Symbol(\"image\"),\n\t\t\t)\n\t\t}\n\n\t\tif dep.URI != \"\" {\n\t\t\tif err := validateURI(dep.URI, configDir); err != nil {\n\t\t\t\treturn packageConfig, err\n\t\t\t}\n\t\t}\n\t}\n\n\treturn packageConfig, nil\n}\n\nfunc (r *ConfigReader) ReadBuildpackDescriptor(path string) (dist.BuildpackDescriptor, error) {\n\tbuildpackCfg := dist.BuildpackDescriptor{}\n\n\t_, err := toml.DecodeFile(path, &buildpackCfg)\n\tif err != nil {\n\t\treturn dist.BuildpackDescriptor{}, err\n\t}\n\n\treturn buildpackCfg, nil\n}\n\nfunc validateURI(uri, relativeBaseDir string) error {\n\tlocatorType, err := buildpack.GetLocatorType(uri, relativeBaseDir, nil)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif locatorType == buildpack.InvalidLocator {\n\t\treturn errors.Errorf(\"invalid locator %s\", style.Symbol(uri))\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "buildpackage/config_reader_test.go",
    "content": "package buildpackage_test\n\nimport (\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildpackageConfigReader(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Buildpackage Config Reader\", testBuildpackageConfigReader, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildpackageConfigReader(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#Read\", func() {\n\t\tvar tmpDir string\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"buildpackage-config-test\")\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\tos.RemoveAll(tmpDir)\n\t\t})\n\n\t\tit(\"returns default buildpack config\", func() {\n\t\t\texpected := buildpackage.Config{\n\t\t\t\tBuildpack: dist.BuildpackURI{\n\t\t\t\t\tURI: \".\",\n\t\t\t\t},\n\t\t\t\tPlatform: dist.Platform{\n\t\t\t\t\tOS: \"linux\",\n\t\t\t\t},\n\t\t\t}\n\t\t\tactual := buildpackage.DefaultConfig()\n\n\t\t\th.AssertEq(t, actual, expected)\n\t\t})\n\n\t\tit(\"returns default extension config\", func() {\n\t\t\texpected := buildpackage.Config{\n\t\t\t\tExtension: dist.BuildpackURI{\n\t\t\t\t\tURI: \".\",\n\t\t\t\t},\n\t\t\t\tPlatform: dist.Platform{\n\t\t\t\t\tOS: \"linux\",\n\t\t\t\t},\n\t\t\t}\n\t\t\tactual := buildpackage.DefaultExtensionConfig()\n\n\t\t\th.AssertEq(t, actual, expected)\n\t\t})\n\n\t\tit(\"returns correct config when provided toml file is valid\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(validPackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\tconfig, err := 
packageConfigReader.Read(configFile)\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, config.Platform.OS, \"windows\")\n\t\t\th.AssertEq(t, config.Buildpack.URI, \"https://example.com/bp/a.tgz\")\n\t\t\th.AssertEq(t, len(config.Dependencies), 1)\n\t\t\th.AssertEq(t, config.Dependencies[0].URI, \"https://example.com/bp/b.tgz\")\n\t\t})\n\n\t\tit(\"returns a config with 'linux' as default when platform is missing\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(validPackageWithoutPlatformToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\tconfig, err := packageConfigReader.Read(configFile)\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, config.Platform.OS, \"linux\")\n\t\t})\n\n\t\tit(\"returns an error when toml decode fails\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(brokenPackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\n\t\t\th.AssertError(t, err, \"decoding toml\")\n\t\t})\n\n\t\tit(\"returns an error when buildpack uri is invalid\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(invalidBPURIPackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"invalid locator\")\n\t\t\th.AssertError(t, err, \"invalid/uri@version-is-invalid\")\n\t\t})\n\n\t\tit(\"returns an error when platform os is invalid\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(invalidPlatformOSPackageToml), 
os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"invalid 'platform.os' configuration\")\n\t\t\th.AssertError(t, err, \"only ['linux', 'windows'] is permitted\")\n\t\t})\n\n\t\tit(\"returns an error when dependency uri is invalid\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(invalidDepURIPackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"invalid locator\")\n\t\t\th.AssertError(t, err, \"invalid/uri@version-is-invalid\")\n\t\t})\n\n\t\tit(\"returns an error when unknown array table is present\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(invalidDepTablePackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"unknown configuration element\")\n\t\t\th.AssertError(t, err, \"dependenceis\")\n\t\t\th.AssertNotContains(t, err.Error(), \".image\")\n\t\t\th.AssertError(t, err, configFile)\n\t\t})\n\n\t\tit(\"returns an error when unknown buildpack key is present\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(unknownBPKeyPackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"unknown configuration element \")\n\t\t\th.AssertError(t, err, 
\"buildpack.url\")\n\t\t\th.AssertError(t, err, configFile)\n\t\t})\n\n\t\tit(\"returns an error when multiple unknown keys are present\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(multipleUnknownKeysPackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"unknown configuration elements\")\n\t\t\th.AssertError(t, err, \"'buildpack.url'\")\n\t\t\th.AssertError(t, err, \"', '\")\n\t\t\th.AssertError(t, err, \"'dependenceis'\")\n\t\t})\n\n\t\tit(\"returns an error when both dependency options are configured\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(conflictingDependencyKeysPackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"dependency configured with both 'uri' and 'image'\")\n\t\t})\n\n\t\tit(\"returns an error when no buildpack is configured\", func() {\n\t\t\tconfigFile := filepath.Join(tmpDir, \"package.toml\")\n\n\t\t\terr := os.WriteFile(configFile, []byte(missingBuildpackPackageToml), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\t_, err = packageConfigReader.Read(configFile)\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"missing 'buildpack.uri' configuration\")\n\t\t})\n\t})\n\n\twhen(\"#ReadBuildpackDescriptor\", func() {\n\t\tvar tmpDir string\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"buildpack-descriptor-test\")\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\t_ = os.RemoveAll(tmpDir)\n\t\t})\n\n\t\tit(\"returns 
exec-env when a composite buildpack toml file is provided\", func() {\n\t\t\tbuildPackTomlFilePath := filepath.Join(tmpDir, \"buildpack-1.toml\")\n\n\t\t\terr := os.WriteFile(buildPackTomlFilePath, []byte(validCompositeBuildPackTomlWithExecEnv), os.ModePerm)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackageConfigReader := buildpackage.NewConfigReader()\n\n\t\t\tbuildpackDescriptor, err := packageConfigReader.ReadBuildpackDescriptor(buildPackTomlFilePath)\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertTrue(t, len(buildpackDescriptor.Order()) == 1)\n\t\t\th.AssertTrue(t, len(buildpackDescriptor.Order()[0].Group) == 2)\n\t\t\th.AssertTrue(t, len(buildpackDescriptor.Order()[0].Group[0].ExecEnv) == 1)\n\t\t\th.AssertTrue(t, len(buildpackDescriptor.Order()[0].Group[1].ExecEnv) == 1)\n\t\t\th.AssertEq(t, buildpackDescriptor.Order()[0].Group[0].ExecEnv[0], \"production.1\")\n\t\t\th.AssertEq(t, buildpackDescriptor.Order()[0].Group[1].ExecEnv[0], \"production.2\")\n\t\t})\n\t})\n}\n\nconst validPackageToml = `\n[buildpack]\nuri = \"https://example.com/bp/a.tgz\"\n\n[[dependencies]]\nuri = \"https://example.com/bp/b.tgz\"\n\n[platform]\nos = \"windows\"\n`\n\nconst validPackageWithoutPlatformToml = `\n[buildpack]\nuri = \"https://example.com/bp/a.tgz\"\n\n[[dependencies]]\nuri = \"https://example.com/bp/b.tgz\"\n`\n\nconst brokenPackageToml = `\n[buildpack # missing closing bracket\nuri = \"https://example.com/bp/a.tgz\"\n\n[dependencies]] # missing opening bracket\nuri = \"https://example.com/bp/b.tgz\"\n`\n\nconst invalidBPURIPackageToml = `\n[buildpack]\nuri = \"invalid/uri@version-is-invalid\"\n`\n\nconst invalidDepURIPackageToml = `\n[buildpack]\nuri = \"noop-buildpack.tgz\"\n\n[[dependencies]]\nuri = \"invalid/uri@version-is-invalid\"\n`\n\nconst invalidDepTablePackageToml = `\n[buildpack]\nuri = \"noop-buildpack.tgz\"\n\n[[dependenceis]] # Notice: this is misspelled\nimage = \"some/package-dep\"\n`\n\nconst invalidPlatformOSPackageToml = `\n[buildpack]\nuri = 
\"https://example.com/bp/a.tgz\"\n\n[platform]\nos = \"some-incorrect-platform\"\n`\n\nconst unknownBPKeyPackageToml = `\n[buildpack]\nurl = \"noop-buildpack.tgz\"\n`\n\nconst multipleUnknownKeysPackageToml = `\n[buildpack]\nurl = \"noop-buildpack.tgz\"\n\n[[dependenceis]] # Notice: this is misspelled\nimage = \"some/package-dep\"\n`\n\nconst conflictingDependencyKeysPackageToml = `\n[buildpack]\nuri = \"noop-buildpack.tgz\"\n\n[[dependencies]]\nuri = \"bp/b\"\nimage = \"some/package-dep\"\n`\n\nconst missingBuildpackPackageToml = `\n[[dependencies]]\nuri = \"bp/b\"\n`\n\nconst validCompositeBuildPackTomlWithExecEnv = `\napi = \"0.15\"\n\n[buildpack]\nid = \"samples/hello-universe\"\nversion = \"0.0.1\"\nname = \"Hello Universe Buildpack\"\n\n# Order used for detection\n[[order]]\n[[order.group]]\nid = \"samples/hello-world\"\nversion = \"0.0.1\"\nexec-env = [\"production.1\"]\n\n[[order.group]]\nid = \"samples/hello-moon\"\nversion = \"0.0.1\"\nexec-env = [\"production.2\"]\n`\n"
  },
  {
    "path": "cmd/cmd.go",
    "content": "package cmd\n\nimport (\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/buildpackage\"\n\tbuilderwriter \"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\timagewriter \"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/internal/term\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ConfigurableLogger defines behavior required by the PackCommand\ntype ConfigurableLogger interface {\n\tlogging.Logger\n\tWantTime(f bool)\n\tWantQuiet(f bool)\n\tWantVerbose(f bool)\n}\n\n// NewPackCommand generates a Pack command\n//\n//nolint:staticcheck\nfunc NewPackCommand(logger ConfigurableLogger) (*cobra.Command, error) {\n\tcobra.EnableCommandSorting = false\n\tcfg, cfgPath, err := initConfig()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tpackClient, err := initClient(logger, cfg)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\trootCmd := &cobra.Command{\n\t\tUse:   \"pack\",\n\t\tShort: \"CLI for building apps using Cloud Native Buildpacks\",\n\t\tPersistentPreRun: func(cmd *cobra.Command, args []string) {\n\t\t\tif fs := cmd.Flags(); fs != nil {\n\t\t\t\tif forceColor, err := fs.GetBool(\"force-color\"); err == nil && !forceColor {\n\t\t\t\t\tif flag, err := fs.GetBool(\"no-color\"); err == nil && flag {\n\t\t\t\t\t\tcolor.Disable(flag)\n\t\t\t\t\t}\n\n\t\t\t\t\t_, canDisplayColor := term.IsTerminal(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t\t\t\t\tif !canDisplayColor {\n\t\t\t\t\t\tcolor.Disable(true)\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\tif flag, err := fs.GetBool(\"quiet\"); err == nil {\n\t\t\t\t\tlogger.WantQuiet(flag)\n\t\t\t\t}\n\t\t\t\tif flag, err := fs.GetBool(\"verbose\"); err == nil {\n\t\t\t\t\tlogger.WantVerbose(flag)\n\t\t\t\t}\n\t\t\t\tif flag, err := 
fs.GetBool(\"timestamps\"); err == nil {\n\t\t\t\t\tlogger.WantTime(flag)\n\t\t\t\t}\n\t\t\t}\n\t\t},\n\t}\n\n\trootCmd.PersistentFlags().Bool(\"no-color\", false, \"Disable color output\")\n\trootCmd.PersistentFlags().Bool(\"force-color\", false, \"Force color output\")\n\trootCmd.PersistentFlags().Bool(\"timestamps\", false, \"Enable timestamps in output\")\n\trootCmd.PersistentFlags().BoolP(\"quiet\", \"q\", false, \"Show less output\")\n\trootCmd.PersistentFlags().BoolP(\"verbose\", \"v\", false, \"Show more output\")\n\trootCmd.Flags().Bool(\"version\", false, \"Show current 'pack' version\")\n\n\tcommands.AddHelpFlag(rootCmd, \"pack\")\n\n\trootCmd.AddCommand(commands.Build(logger, cfg, packClient))\n\trootCmd.AddCommand(commands.NewBuilderCommand(logger, cfg, packClient))\n\trootCmd.AddCommand(commands.NewBuildpackCommand(logger, cfg, packClient, buildpackage.NewConfigReader()))\n\trootCmd.AddCommand(commands.NewExtensionCommand(logger, cfg, packClient, buildpackage.NewConfigReader()))\n\trootCmd.AddCommand(commands.NewConfigCommand(logger, cfg, cfgPath, packClient))\n\trootCmd.AddCommand(commands.InspectImage(logger, imagewriter.NewFactory(), cfg, packClient))\n\trootCmd.AddCommand(commands.NewStackCommand(logger))\n\trootCmd.AddCommand(commands.Rebase(logger, cfg, packClient))\n\trootCmd.AddCommand(commands.NewSBOMCommand(logger, cfg, packClient))\n\n\trootCmd.AddCommand(commands.InspectBuildpack(logger, cfg, packClient))\n\trootCmd.AddCommand(commands.InspectBuilder(logger, cfg, packClient, builderwriter.NewFactory()))\n\n\trootCmd.AddCommand(commands.SetDefaultBuilder(logger, cfg, cfgPath, packClient))\n\trootCmd.AddCommand(commands.SetRunImagesMirrors(logger, cfg, cfgPath))\n\trootCmd.AddCommand(commands.SuggestBuilders(logger, packClient))\n\trootCmd.AddCommand(commands.TrustBuilder(logger, cfg, cfgPath))\n\trootCmd.AddCommand(commands.UntrustBuilder(logger, cfg, cfgPath))\n\trootCmd.AddCommand(commands.ListTrustedBuilders(logger, 
cfg))\n\trootCmd.AddCommand(commands.CreateBuilder(logger, cfg, packClient))\n\trootCmd.AddCommand(commands.PackageBuildpack(logger, cfg, packClient, buildpackage.NewConfigReader()))\n\n\tif cfg.Experimental {\n\t\trootCmd.AddCommand(commands.AddBuildpackRegistry(logger, cfg, cfgPath))\n\t\trootCmd.AddCommand(commands.ListBuildpackRegistries(logger, cfg))\n\t\trootCmd.AddCommand(commands.RegisterBuildpack(logger, cfg, packClient))\n\t\trootCmd.AddCommand(commands.SetDefaultRegistry(logger, cfg, cfgPath))\n\t\trootCmd.AddCommand(commands.RemoveRegistry(logger, cfg, cfgPath))\n\t\trootCmd.AddCommand(commands.YankBuildpack(logger, cfg, packClient))\n\t\trootCmd.AddCommand(commands.NewManifestCommand(logger, packClient))\n\t}\n\n\tpackHome, err := config.PackHome()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\trootCmd.AddCommand(commands.CompletionCommand(logger, packHome))\n\trootCmd.AddCommand(commands.Report(logger, packClient.Version(), cfgPath))\n\trootCmd.AddCommand(commands.Version(logger, packClient.Version()))\n\n\trootCmd.Version = packClient.Version()\n\trootCmd.SetVersionTemplate(`{{.Version}}{{\"\\n\"}}`)\n\trootCmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\trootCmd.SetErr(logging.GetWriterForLevel(logger, logging.ErrorLevel))\n\n\treturn rootCmd, nil\n}\n\nfunc initConfig() (config.Config, string, error) {\n\tpath, err := config.DefaultConfigPath()\n\tif err != nil {\n\t\treturn config.Config{}, \"\", errors.Wrap(err, \"getting config path\")\n\t}\n\n\tcfg, err := config.Read(path)\n\tif err != nil {\n\t\treturn config.Config{}, \"\", errors.Wrap(err, \"reading pack config\")\n\t}\n\treturn cfg, path, nil\n}\n\nfunc initClient(logger logging.Logger, cfg config.Config) (*client.Client, error) {\n\tif err := client.ProcessDockerContext(logger); err != nil {\n\t\treturn nil, err\n\t}\n\n\tdc, err := tryInitSSHDockerClient()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\t// If we got a docker client from SSH, use it directly\n\tif 
dc != nil {\n\t\treturn client.NewClient(client.WithLogger(logger), client.WithExperimental(cfg.Experimental), client.WithRegistryMirrors(cfg.RegistryMirrors), client.WithDockerClient(dc))\n\t}\n\n\treturn client.NewClient(client.WithLogger(logger), client.WithExperimental(cfg.Experimental), client.WithRegistryMirrors(cfg.RegistryMirrors))\n}\n"
  },
  {
    "path": "cmd/docker_init.go",
    "content": "package cmd\n\nimport (\n\t\"bufio\"\n\t\"bytes\"\n\t\"encoding/base64\"\n\t\"errors\"\n\t\"fmt\"\n\t\"io\"\n\t\"net/http\"\n\t\"net/url\"\n\t\"os\"\n\t\"strings\"\n\n\tdockerClient \"github.com/moby/moby/client\"\n\t\"golang.org/x/crypto/ssh\"\n\t\"golang.org/x/term\"\n\n\t\"github.com/buildpacks/pack/internal/sshdialer\"\n)\n\nfunc tryInitSSHDockerClient() (*dockerClient.Client, error) {\n\tdockerHost := os.Getenv(\"DOCKER_HOST\")\n\t_url, err := url.Parse(dockerHost)\n\tisSSH := err == nil && _url.Scheme == \"ssh\"\n\n\tif !isSSH {\n\t\treturn nil, nil\n\t}\n\n\tcredentialsConfig := sshdialer.Config{\n\t\tIdentity:           os.Getenv(\"DOCKER_HOST_SSH_IDENTITY\"),\n\t\tPassPhrase:         os.Getenv(\"DOCKER_HOST_SSH_IDENTITY_PASSPHRASE\"),\n\t\tPasswordCallback:   newReadSecretCbk(\"please enter password:\"),\n\t\tPassPhraseCallback: newReadSecretCbk(\"please enter passphrase to private key:\"),\n\t\tHostKeyCallback:    newHostKeyCbk(),\n\t}\n\tdialContext, err := sshdialer.NewDialContext(_url, credentialsConfig)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\thttpClient := &http.Client{\n\t\t// No tls\n\t\t// No proxy\n\t\tTransport: &http.Transport{\n\t\t\tDialContext: dialContext,\n\t\t},\n\t}\n\n\tdockerClientOpts := []dockerClient.Opt{\n\t\tdockerClient.WithHTTPClient(httpClient),\n\t\tdockerClient.WithHost(\"http://dummy\"),\n\t\tdockerClient.WithDialContext(dialContext),\n\t}\n\n\treturn dockerClient.New(dockerClientOpts...)\n}\n\n// readSecret prompts for a secret and returns value input by user from stdin\n// Unlike terminal.ReadPassword(), $(echo $SECRET | podman...) 
is supported.\n// Additionally, all input after `<secret>\\n` is queued to the command.\n//\n// NOTE: this code is based on \"github.com/containers/podman/v3/pkg/terminal\"\nfunc readSecret(prompt string) (pw []byte, err error) {\n\tfd := int(os.Stdin.Fd())\n\tif term.IsTerminal(fd) {\n\t\tfmt.Fprint(os.Stderr, prompt)\n\t\tpw, err = term.ReadPassword(fd)\n\t\tfmt.Fprintln(os.Stderr)\n\t\treturn pw, err\n\t}\n\n\tvar b [1]byte\n\tfor {\n\t\tn, err := os.Stdin.Read(b[:])\n\t\t// terminal.readSecret discards any '\\r', so we do the same\n\t\tif n > 0 && b[0] != '\\r' {\n\t\t\tif b[0] == '\\n' {\n\t\t\t\treturn pw, nil\n\t\t\t}\n\t\t\tpw = append(pw, b[0])\n\t\t\t// limit size, so that a wrong input won't fill up the memory\n\t\t\tif len(pw) > 1024 {\n\t\t\t\terr = errors.New(\"password too long, 1024 byte limit\")\n\t\t\t}\n\t\t}\n\t\tif err != nil {\n\t\t\t// terminal.readSecret accepts EOF-terminated passwords\n\t\t\t// if non-empty, so we do the same\n\t\t\tif err == io.EOF && len(pw) > 0 {\n\t\t\t\terr = nil\n\t\t\t}\n\t\t\treturn pw, err\n\t\t}\n\t}\n}\n\nfunc newReadSecretCbk(prompt string) sshdialer.SecretCallback {\n\tvar secretSet bool\n\tvar secret string\n\treturn func() (string, error) {\n\t\tif secretSet {\n\t\t\treturn secret, nil\n\t\t}\n\n\t\tp, err := readSecret(prompt)\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tsecretSet = true\n\t\tsecret = string(p)\n\n\t\treturn secret, err\n\t}\n}\n\nfunc newHostKeyCbk() sshdialer.HostKeyCallback {\n\tvar trust []byte\n\treturn func(hostPort string, pubKey ssh.PublicKey) error {\n\t\tif bytes.Equal(trust, pubKey.Marshal()) {\n\t\t\treturn nil\n\t\t}\n\t\tmsg := `The authenticity of host %s cannot be established.\n%s key fingerprint is %s\nAre you sure you want to continue connecting (yes/no)? 
`\n\t\tfmt.Fprintf(os.Stderr, msg, hostPort, pubKey.Type(), ssh.FingerprintSHA256(pubKey))\n\t\treader := bufio.NewReader(os.Stdin)\n\t\tanswer, err := reader.ReadString('\\n')\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tanswer = strings.TrimRight(answer, \"\\r\\n\")\n\t\tanswer = strings.ToLower(answer)\n\n\t\tif answer == \"yes\" || answer == \"y\" {\n\t\t\ttrust = pubKey.Marshal()\n\t\t\tfmt.Fprintf(os.Stderr, \"To avoid this in future add following line into your ~/.ssh/known_hosts:\\n%s %s %s\\n\",\n\t\t\t\thostPort, pubKey.Type(), base64.StdEncoding.EncodeToString(trust))\n\t\t\treturn nil\n\t\t}\n\n\t\treturn errors.New(\"key rejected\")\n\t}\n}\n"
  },
  {
    "path": "codecov.yml",
    "content": "codecov:\n  notify:\n    after_n_builds: 4\n\ncoverage:\n  round: up\n  status:\n    project:\n      default:\n        threshold: 1%\n    patch:\n      default:\n        threshold: 10%\n\ncomment:\n  layout: \"reach,diff,flags\"\n  require_changes: yes\n  after_n_builds: 4"
  },
  {
    "path": "go.mod",
    "content": "module github.com/buildpacks/pack\n\nrequire (\n\tgithub.com/BurntSushi/toml v1.5.0\n\tgithub.com/Masterminds/semver v1.5.0\n\tgithub.com/Microsoft/go-winio v0.6.2\n\tgithub.com/apex/log v1.9.0\n\tgithub.com/buildpacks/imgutil v0.0.0-20251202182233-51c1c8c186ea\n\tgithub.com/buildpacks/lifecycle v0.21.0\n\tgithub.com/chainguard-dev/kaniko v1.25.12\n\tgithub.com/containerd/errdefs v1.0.0\n\tgithub.com/docker/cli v29.3.1+incompatible\n\tgithub.com/docker/docker v28.5.2+incompatible\n\tgithub.com/dustin/go-humanize v1.0.1\n\tgithub.com/gdamore/tcell/v2 v2.13.8\n\tgithub.com/go-git/go-git/v5 v5.17.2\n\tgithub.com/golang/mock v1.6.0\n\tgithub.com/google/go-cmp v0.7.0\n\tgithub.com/google/go-containerregistry v0.21.4\n\tgithub.com/google/go-github/v30 v30.1.0\n\tgithub.com/hectane/go-acl v0.0.0-20190604041725-da78bae5fc95\n\tgithub.com/heroku/color v0.0.6\n\tgithub.com/mitchellh/ioprogress v0.0.0-20180201004757-6a23b12fa88e\n\tgithub.com/moby/go-archive v0.2.0\n\tgithub.com/moby/moby/api v1.54.1\n\tgithub.com/moby/moby/client v0.4.0\n\tgithub.com/onsi/gomega v1.39.1\n\tgithub.com/opencontainers/go-digest v1.0.0\n\tgithub.com/opencontainers/image-spec v1.1.1\n\tgithub.com/pelletier/go-toml v1.9.5\n\tgithub.com/pkg/errors v0.9.1\n\tgithub.com/rivo/tview v0.42.0\n\tgithub.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06\n\tgithub.com/sclevine/spec v1.4.0\n\tgithub.com/spf13/cobra v1.10.2\n\tgolang.org/x/crypto v0.49.0\n\tgolang.org/x/mod v0.34.0\n\tgolang.org/x/oauth2 v0.36.0\n\tgolang.org/x/sync v0.20.0\n\tgolang.org/x/sys v0.42.0\n\tgolang.org/x/term v0.41.0\n\tgolang.org/x/text v0.35.0\n\tgopkg.in/yaml.v3 v3.0.1\n)\n\nrequire (\n\tcyphar.com/go-pathrs v0.2.1 // indirect\n\tdario.cat/mergo v1.0.2 // indirect\n\tgithub.com/Azure/azure-sdk-for-go v68.0.0+incompatible // indirect\n\tgithub.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c // indirect\n\tgithub.com/Azure/go-autorest v14.2.0+incompatible // 
indirect\n\tgithub.com/Azure/go-autorest/autorest v0.11.30 // indirect\n\tgithub.com/Azure/go-autorest/autorest/adal v0.9.24 // indirect\n\tgithub.com/Azure/go-autorest/autorest/azure/auth v0.5.13 // indirect\n\tgithub.com/Azure/go-autorest/autorest/azure/cli v0.4.7 // indirect\n\tgithub.com/Azure/go-autorest/autorest/date v0.3.1 // indirect\n\tgithub.com/Azure/go-autorest/logger v0.2.2 // indirect\n\tgithub.com/Azure/go-autorest/tracing v0.6.1 // indirect\n\tgithub.com/ProtonMail/go-crypto v1.3.0 // indirect\n\tgithub.com/agext/levenshtein v1.2.3 // indirect\n\tgithub.com/aws/aws-sdk-go-v2 v1.41.4 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/config v1.32.12 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/credentials v1.19.12 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.20 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/internal/configsources v1.4.20 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.20 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/internal/ini v1.8.6 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/ecr v1.55.3 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/ecrpublic v1.38.10 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.7 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.20 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/signin v1.0.8 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/sso v1.30.13 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.17 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/sts v1.41.9 // indirect\n\tgithub.com/aws/smithy-go v1.24.2 // indirect\n\tgithub.com/awslabs/amazon-ecr-credential-helper/ecr-login v0.12.0 // indirect\n\tgithub.com/beorn7/perks v1.0.1 // indirect\n\tgithub.com/cespare/xxhash/v2 v2.3.0 // indirect\n\tgithub.com/chrismellard/docker-credential-acr-env v0.0.0-20230304212654-82a0ddb27589 // indirect\n\tgithub.com/cloudflare/circl v1.6.3 // 
indirect\n\tgithub.com/containerd/errdefs/pkg v0.3.0 // indirect\n\tgithub.com/containerd/log v0.1.0 // indirect\n\tgithub.com/containerd/stargz-snapshotter/estargz v0.18.2 // indirect\n\tgithub.com/containerd/typeurl/v2 v2.2.3 // indirect\n\tgithub.com/cyphar/filepath-securejoin v0.6.1 // indirect\n\tgithub.com/dimchansky/utfbom v1.1.1 // indirect\n\tgithub.com/distribution/reference v0.6.0 // indirect\n\tgithub.com/docker/distribution v2.8.3+incompatible // indirect\n\tgithub.com/docker/docker-credential-helpers v0.9.5 // indirect\n\tgithub.com/docker/go-connections v0.6.0 // indirect\n\tgithub.com/docker/go-metrics v0.0.1 // indirect\n\tgithub.com/docker/go-units v0.5.0 // indirect\n\tgithub.com/emirpasic/gods v1.18.1 // indirect\n\tgithub.com/felixge/httpsnoop v1.0.4 // indirect\n\tgithub.com/gdamore/encoding v1.0.1 // indirect\n\tgithub.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 // indirect\n\tgithub.com/go-git/go-billy/v5 v5.8.0 // indirect\n\tgithub.com/go-logr/logr v1.4.3 // indirect\n\tgithub.com/go-logr/stdr v1.2.2 // indirect\n\tgithub.com/go-viper/mapstructure/v2 v2.4.0 // indirect\n\tgithub.com/gogo/protobuf v1.3.2 // indirect\n\tgithub.com/golang-jwt/jwt/v4 v4.5.2 // indirect\n\tgithub.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8 // indirect\n\tgithub.com/google/go-querystring v1.1.0 // indirect\n\tgithub.com/google/shlex v0.0.0-20191202100458-e7afc7fbc510 // indirect\n\tgithub.com/google/uuid v1.6.0 // indirect\n\tgithub.com/gorilla/mux v1.8.1 // indirect\n\tgithub.com/inconshreveable/mousetrap v1.1.0 // indirect\n\tgithub.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 // indirect\n\tgithub.com/kevinburke/ssh_config v1.2.0 // indirect\n\tgithub.com/klauspost/compress v1.18.5 // indirect\n\tgithub.com/lucasb-eyer/go-colorful v1.3.0 // indirect\n\tgithub.com/mattn/go-colorable v0.1.14 // indirect\n\tgithub.com/mattn/go-isatty v0.0.20 // indirect\n\tgithub.com/mitchellh/go-homedir v1.1.0 // 
indirect\n\tgithub.com/moby/buildkit v0.28.1 // indirect\n\tgithub.com/moby/docker-image-spec v1.3.1 // indirect\n\tgithub.com/moby/patternmatcher v0.6.1 // indirect\n\tgithub.com/moby/sys/atomicwriter v0.1.0 // indirect\n\tgithub.com/moby/sys/sequential v0.6.0 // indirect\n\tgithub.com/moby/sys/user v0.4.0 // indirect\n\tgithub.com/moby/sys/userns v0.1.0 // indirect\n\tgithub.com/moby/term v0.5.2 // indirect\n\tgithub.com/morikuni/aec v1.1.0 // indirect\n\tgithub.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect\n\tgithub.com/opencontainers/selinux v1.13.1 // indirect\n\tgithub.com/pjbgf/sha1cd v0.3.2 // indirect\n\tgithub.com/planetscale/vtprotobuf v0.6.1-0.20240319094008-0393e58bdf10 // indirect\n\tgithub.com/prometheus/client_golang v1.23.2 // indirect\n\tgithub.com/prometheus/client_model v0.6.2 // indirect\n\tgithub.com/prometheus/common v0.66.1 // indirect\n\tgithub.com/prometheus/procfs v0.17.0 // indirect\n\tgithub.com/rivo/uniseg v0.4.7 // indirect\n\tgithub.com/sergi/go-diff v1.3.2-0.20230802210424-5b0b94c5c0d3 // indirect\n\tgithub.com/sirupsen/logrus v1.9.4 // indirect\n\tgithub.com/skeema/knownhosts v1.3.1 // indirect\n\tgithub.com/spf13/pflag v1.0.10 // indirect\n\tgithub.com/tonistiigi/go-csvvalue v0.0.0-20240814133006-030d3b2625d0 // indirect\n\tgithub.com/vbatts/tar-split v0.12.2 // indirect\n\tgithub.com/xanzy/ssh-agent v0.3.3 // indirect\n\tgithub.com/xeipuuv/gojsonpointer v0.0.0-20180127040702-4e3ac2762d5f // indirect\n\tgithub.com/xeipuuv/gojsonreference v0.0.0-20180127040603-bd5ef7bd5415 // indirect\n\tgithub.com/xeipuuv/gojsonschema v1.2.0 // indirect\n\tgo.opentelemetry.io/auto/sdk v1.2.1 // indirect\n\tgo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.63.0 // indirect\n\tgo.opentelemetry.io/otel v1.40.0 // indirect\n\tgo.opentelemetry.io/otel/metric v1.40.0 // indirect\n\tgo.opentelemetry.io/otel/trace v1.40.0 // indirect\n\tgo.yaml.in/yaml/v2 v2.4.3 // indirect\n\tgo.yaml.in/yaml/v3 v3.0.4 // 
indirect\n\tgolang.org/x/net v0.52.0 // indirect\n\tgoogle.golang.org/protobuf v1.36.11 // indirect\n\tgopkg.in/warnings.v0 v0.1.2 // indirect\n)\n\nreplace github.com/BurntSushi/toml => github.com/BurntSushi/toml v1.3.2\n\ngo 1.25.8\n"
  },
  {
    "path": "go.sum",
    "content": "cyphar.com/go-pathrs v0.2.1 h1:9nx1vOgwVvX1mNBWDu93+vaceedpbsDqo+XuBGL40b8=\ncyphar.com/go-pathrs v0.2.1/go.mod h1:y8f1EMG7r+hCuFf/rXsKqMJrJAUoADZGNh5/vZPKcGc=\ndario.cat/mergo v1.0.2 h1:85+piFYR1tMbRrLcDwR18y4UKJ3aH1Tbzi24VRW1TK8=\ndario.cat/mergo v1.0.2/go.mod h1:E/hbnu0NxMFBjpMIE34DRGLWqDy0g5FuKDhCb31ngxA=\ngithub.com/AdaLogics/go-fuzz-headers v0.0.0-20240806141605-e8a1dd7889d6 h1:He8afgbRMd7mFxO99hRNu+6tazq8nFF9lIwo9JFroBk=\ngithub.com/AdaLogics/go-fuzz-headers v0.0.0-20240806141605-e8a1dd7889d6/go.mod h1:8o94RPi1/7XTJvwPpRSzSUedZrtlirdB3r9Z20bi2f8=\ngithub.com/Azure/azure-sdk-for-go v68.0.0+incompatible h1:fcYLmCpyNYRnvJbPerq7U0hS+6+I79yEDJBqVNcqUzU=\ngithub.com/Azure/azure-sdk-for-go v68.0.0+incompatible/go.mod h1:9XXNKU+eRnpl9moKnB4QOLf1HestfXbmab5FXxiDBjc=\ngithub.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c h1:udKWzYgxTojEKWjV8V+WSxDXJ4NFATAsZjh8iIbsQIg=\ngithub.com/Azure/go-ansiterm v0.0.0-20250102033503-faa5f7b0171c/go.mod h1:xomTg63KZ2rFqZQzSB4Vz2SUXa1BpHTVz9L5PTmPC4E=\ngithub.com/Azure/go-autorest v14.2.0+incompatible h1:V5VMDjClD3GiElqLWO7mz2MxNAK/vTfRHdAubSIPRgs=\ngithub.com/Azure/go-autorest v14.2.0+incompatible/go.mod h1:r+4oMnoxhatjLLJ6zxSWATqVooLgysK6ZNox3g/xq24=\ngithub.com/Azure/go-autorest/autorest v0.11.28/go.mod h1:MrkzG3Y3AH668QyF9KRk5neJnGgmhQ6krbhR8Q5eMvA=\ngithub.com/Azure/go-autorest/autorest v0.11.30 h1:iaZ1RGz/ALZtN5eq4Nr1SOFSlf2E4pDI3Tcsl+dZPVE=\ngithub.com/Azure/go-autorest/autorest v0.11.30/go.mod h1:t1kpPIOpIVX7annvothKvb0stsrXa37i7b+xpmBW8Fs=\ngithub.com/Azure/go-autorest/autorest/adal v0.9.18/go.mod h1:XVVeme+LZwABT8K5Lc3hA4nAe8LDBVle26gTrguhhPQ=\ngithub.com/Azure/go-autorest/autorest/adal v0.9.22/go.mod h1:XuAbAEUv2Tta//+voMI038TrJBqjKam0me7qR+L8Cmk=\ngithub.com/Azure/go-autorest/autorest/adal v0.9.24 h1:BHZfgGsGwdkHDyZdtQRQk1WeUdW0m2WPAwuHZwUi5i4=\ngithub.com/Azure/go-autorest/autorest/adal v0.9.24/go.mod 
h1:7T1+g0PYFmACYW5LlG2fcoPiPlFHjClyRGL7dRlP5c8=\ngithub.com/Azure/go-autorest/autorest/azure/auth v0.5.13 h1:Ov8avRZi2vmrE2JcXw+tu5K/yB41r7xK9GZDiBF7NdM=\ngithub.com/Azure/go-autorest/autorest/azure/auth v0.5.13/go.mod h1:5BAVfWLWXihP47vYrPuBKKf4cS0bXI+KM9Qx6ETDJYo=\ngithub.com/Azure/go-autorest/autorest/azure/cli v0.4.6/go.mod h1:piCfgPho7BiIDdEQ1+g4VmKyD5y+p/XtSNqE6Hc4QD0=\ngithub.com/Azure/go-autorest/autorest/azure/cli v0.4.7 h1:Q9R3utmFg9K1B4OYtAZ7ZUUvIUdzQt7G2MN5Hi/d670=\ngithub.com/Azure/go-autorest/autorest/azure/cli v0.4.7/go.mod h1:bVrAueELJ0CKLBpUHDIvD516TwmHmzqwCpvONWRsw3s=\ngithub.com/Azure/go-autorest/autorest/date v0.3.0/go.mod h1:BI0uouVdmngYNUzGWeSYnokU+TrmwEsOqdt8Y6sso74=\ngithub.com/Azure/go-autorest/autorest/date v0.3.1 h1:o9Z8Jyt+VJJTCZ/UORishuHOusBwolhjokt9s5k8I4w=\ngithub.com/Azure/go-autorest/autorest/date v0.3.1/go.mod h1:Dz/RDmXlfiFFS/eW+b/xMUSFs1tboPVy6UjgADToWDM=\ngithub.com/Azure/go-autorest/autorest/mocks v0.4.1/go.mod h1:LTp+uSrOhSkaKrUy935gNZuuIPPVsHlr9DSOxSayd+k=\ngithub.com/Azure/go-autorest/autorest/mocks v0.4.2 h1:PGN4EDXnuQbojHbU0UWoNvmu9AGVwYHG9/fkDYhtAfw=\ngithub.com/Azure/go-autorest/autorest/mocks v0.4.2/go.mod h1:Vy7OitM9Kei0i1Oj+LvyAWMXJHeKH1MVlzFugfVrmyU=\ngithub.com/Azure/go-autorest/logger v0.2.1/go.mod h1:T9E3cAhj2VqvPOtCYAvby9aBXkZmbF5NWuPV8+WeEW8=\ngithub.com/Azure/go-autorest/logger v0.2.2 h1:hYqBsEBywrrOSW24kkOCXRcKfKhK76OzLTfF+MYDE2o=\ngithub.com/Azure/go-autorest/logger v0.2.2/go.mod h1:I5fg9K52o+iuydlWfa9T5K6WFos9XYr9dYTFzpqgibw=\ngithub.com/Azure/go-autorest/tracing v0.6.0/go.mod h1:+vhtPC754Xsa23ID7GlGsrdKBpUA79WCAKPPZVC2DeU=\ngithub.com/Azure/go-autorest/tracing v0.6.1 h1:YUMSrC/CeD1ZnnXcNYU4a/fzsO35u2Fsful9L/2nyR0=\ngithub.com/Azure/go-autorest/tracing v0.6.1/go.mod h1:/3EgjbsjraOqiicERAeu3m7/z0x1TzjQGAwDrJrXGkc=\ngithub.com/BurntSushi/toml v1.3.2 h1:o7IhLm0Msx3BaB+n3Ag7L8EVlByGnpq14C4YWiu/gL8=\ngithub.com/BurntSushi/toml v1.3.2/go.mod 
h1:CxXYINrC8qIiEnFrOxCa7Jy5BFHlXnUU2pbicEuybxQ=\ngithub.com/Masterminds/semver v1.5.0 h1:H65muMkzWKEuNDnfl9d70GUjFniHKHRbFPGBuZ3QEww=\ngithub.com/Masterminds/semver v1.5.0/go.mod h1:MB6lktGJrhw8PrUyiEoblNEGEQ+RzHPF078ddwwvV3Y=\ngithub.com/Masterminds/semver/v3 v3.4.0 h1:Zog+i5UMtVoCU8oKka5P7i9q9HgrJeGzI9SA1Xbatp0=\ngithub.com/Masterminds/semver/v3 v3.4.0/go.mod h1:4V+yj/TJE1HU9XfppCwVMZq3I84lprf4nC11bSS5beM=\ngithub.com/Microsoft/go-winio v0.5.2/go.mod h1:WpS1mjBmmwHBEWmogvA2mj8546UReBk4v8QkMxJ6pZY=\ngithub.com/Microsoft/go-winio v0.6.2 h1:F2VQgta7ecxGYO8k3ZZz3RS8fVIXVxONVUPlNERoyfY=\ngithub.com/Microsoft/go-winio v0.6.2/go.mod h1:yd8OoFMLzJbo9gZq8j5qaps8bJ9aShtEA8Ipt1oGCvU=\ngithub.com/ProtonMail/go-crypto v1.3.0 h1:ILq8+Sf5If5DCpHQp4PbZdS1J7HDFRXz/+xKBiRGFrw=\ngithub.com/ProtonMail/go-crypto v1.3.0/go.mod h1:9whxjD8Rbs29b4XWbB8irEcE8KHMqaR2e7GWU1R+/PE=\ngithub.com/agext/levenshtein v1.2.3 h1:YB2fHEn0UJagG8T1rrWknE3ZQzWM06O8AMAatNn7lmo=\ngithub.com/agext/levenshtein v1.2.3/go.mod h1:JEDfjyjHDjOF/1e4FlBE/PkbqA9OfWu2ki2W0IB5558=\ngithub.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=\ngithub.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=\ngithub.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be h1:9AeTilPcZAjCFIImctFaOjnTIavg87rW78vTPkQqLI8=\ngithub.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be/go.mod h1:ySMOLuWl6zY27l47sB3qLNK6tF2fkHG55UZxx8oIVo4=\ngithub.com/apex/log v1.9.0 h1:FHtw/xuaM8AgmvDDTI9fiwoAL25Sq2cxojnZICUU8l0=\ngithub.com/apex/log v1.9.0/go.mod h1:m82fZlWIuiWzWP04XCTXmnX0xRkYYbCdYn8jbJeLBEA=\ngithub.com/apex/logs v1.0.0/go.mod h1:XzxuLZ5myVHDy9SAmYpamKKRNApGj54PfYLcFrXqDwo=\ngithub.com/aphistic/golf v0.0.0-20180712155816-02c07f170c5a/go.mod h1:3NqKYiepwy8kCu4PNA+aP7WUV72eXWJeP9/r3/K9aLE=\ngithub.com/aphistic/sweet v0.2.0/go.mod h1:fWDlIh/isSE9n6EPsRmC0det+whmX6dJid3stzu0Xys=\ngithub.com/armon/go-socks5 
v0.0.0-20160902184237-e75332964ef5 h1:0CwZNZbxp69SHPdPJAN/hZIm0C4OItdklCFmMRWYpio=\ngithub.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5/go.mod h1:wHh0iHkYZB8zMSxRWpUBQtwG5a7fFgvEO+odwuTv2gs=\ngithub.com/aws/aws-sdk-go v1.20.6/go.mod h1:KmX6BPdI08NWTb3/sm4ZGu5ShLoqVDhKgpiN924inxo=\ngithub.com/aws/aws-sdk-go-v2 v1.41.4 h1:10f50G7WyU02T56ox1wWXq+zTX9I1zxG46HYuG1hH/k=\ngithub.com/aws/aws-sdk-go-v2 v1.41.4/go.mod h1:mwsPRE8ceUUpiTgF7QmQIJ7lgsKUPQOUl3o72QBrE1o=\ngithub.com/aws/aws-sdk-go-v2/config v1.32.12 h1:O3csC7HUGn2895eNrLytOJQdoL2xyJy0iYXhoZ1OmP0=\ngithub.com/aws/aws-sdk-go-v2/config v1.32.12/go.mod h1:96zTvoOFR4FURjI+/5wY1vc1ABceROO4lWgWJuxgy0g=\ngithub.com/aws/aws-sdk-go-v2/credentials v1.19.12 h1:oqtA6v+y5fZg//tcTWahyN9PEn5eDU/Wpvc2+kJ4aY8=\ngithub.com/aws/aws-sdk-go-v2/credentials v1.19.12/go.mod h1:U3R1RtSHx6NB0DvEQFGyf/0sbrpJrluENHdPy1j/3TE=\ngithub.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.20 h1:zOgq3uezl5nznfoK3ODuqbhVg1JzAGDUhXOsU0IDCAo=\ngithub.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.20/go.mod h1:z/MVwUARehy6GAg/yQ1GO2IMl0k++cu1ohP9zo887wE=\ngithub.com/aws/aws-sdk-go-v2/internal/configsources v1.4.20 h1:CNXO7mvgThFGqOFgbNAP2nol2qAWBOGfqR/7tQlvLmc=\ngithub.com/aws/aws-sdk-go-v2/internal/configsources v1.4.20/go.mod h1:oydPDJKcfMhgfcgBUZaG+toBbwy8yPWubJXBVERtI4o=\ngithub.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.20 h1:tN6W/hg+pkM+tf9XDkWUbDEjGLb+raoBMFsTodcoYKw=\ngithub.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.20/go.mod h1:YJ898MhD067hSHA6xYCx5ts/jEd8BSOLtQDL3iZsvbc=\ngithub.com/aws/aws-sdk-go-v2/internal/ini v1.8.6 h1:qYQ4pzQ2Oz6WpQ8T3HvGHnZydA72MnLuFK9tJwmrbHw=\ngithub.com/aws/aws-sdk-go-v2/internal/ini v1.8.6/go.mod h1:O3h0IK87yXci+kg6flUKzJnWeziQUKciKrLjcatSNcY=\ngithub.com/aws/aws-sdk-go-v2/service/ecr v1.55.3 h1:RtGctYMmkTerGClvdY6bHXdtly4FeYw9wz/NPz62LF8=\ngithub.com/aws/aws-sdk-go-v2/service/ecr v1.55.3/go.mod h1:vBfBu24Ka3/5UZtepbTV0gnc9VPLT8ok+0oDDaYAzn4=\ngithub.com/aws/aws-sdk-go-v2/service/ecrpublic v1.38.10 
h1:1A/sI3LNMi3fhRI5TFLMwwo7ALAALSFVCSGvFlr1Iys=\ngithub.com/aws/aws-sdk-go-v2/service/ecrpublic v1.38.10/go.mod h1:Diyyyz0b43X13pdi1mVMqlTwDjOmRbJMvDsqnduUYWM=\ngithub.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.7 h1:5EniKhLZe4xzL7a+fU3C2tfUN4nWIqlLesfrjkuPFTY=\ngithub.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.7/go.mod h1:x0nZssQ3qZSnIcePWLvcoFisRXJzcTVvYpAAdYX8+GI=\ngithub.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.20 h1:2HvVAIq+YqgGotK6EkMf+KIEqTISmTYh5zLpYyeTo1Y=\ngithub.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.20/go.mod h1:V4X406Y666khGa8ghKmphma/7C0DAtEQYhkq9z4vpbk=\ngithub.com/aws/aws-sdk-go-v2/service/signin v1.0.8 h1:0GFOLzEbOyZABS3PhYfBIx2rNBACYcKty+XGkTgw1ow=\ngithub.com/aws/aws-sdk-go-v2/service/signin v1.0.8/go.mod h1:LXypKvk85AROkKhOG6/YEcHFPoX+prKTowKnVdcaIxE=\ngithub.com/aws/aws-sdk-go-v2/service/sso v1.30.13 h1:kiIDLZ005EcKomYYITtfsjn7dtOwHDOFy7IbPXKek2o=\ngithub.com/aws/aws-sdk-go-v2/service/sso v1.30.13/go.mod h1:2h/xGEowcW/g38g06g3KpRWDlT+OTfxxI0o1KqayAB8=\ngithub.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.17 h1:jzKAXIlhZhJbnYwHbvUQZEB8KfgAEuG0dc08Bkda7NU=\ngithub.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.17/go.mod h1:Al9fFsXjv4KfbzQHGe6V4NZSZQXecFcvaIF4e70FoRA=\ngithub.com/aws/aws-sdk-go-v2/service/sts v1.41.9 h1:Cng+OOwCHmFljXIxpEVXAGMnBia8MSU6Ch5i9PgBkcU=\ngithub.com/aws/aws-sdk-go-v2/service/sts v1.41.9/go.mod h1:LrlIndBDdjA/EeXeyNBle+gyCwTlizzW5ycgWnvIxkk=\ngithub.com/aws/smithy-go v1.24.2 h1:FzA3bu/nt/vDvmnkg+R8Xl46gmzEDam6mZ1hzmwXFng=\ngithub.com/aws/smithy-go v1.24.2/go.mod h1:YE2RhdIuDbA5E5bTdciG9KrW3+TiEONeUWCqxX9i1Fc=\ngithub.com/awslabs/amazon-ecr-credential-helper/ecr-login v0.12.0 h1:JFWXO6QPihCknDdnL6VaQE57km4ZKheHIGd9YiOGcTo=\ngithub.com/awslabs/amazon-ecr-credential-helper/ecr-login v0.12.0/go.mod h1:046/oLyFlYdAghYQE2yHXi/E//VM5Cf3/dFmA+3CZ0c=\ngithub.com/aybabtme/rgbterm v0.0.0-20170906152045-cc83f3b3ce59/go.mod 
h1:q/89r3U2H7sSsE2t6Kca0lfwTK8JdoNGS/yzM/4iH5I=\ngithub.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=\ngithub.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=\ngithub.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=\ngithub.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=\ngithub.com/buildpacks/imgutil v0.0.0-20251202182233-51c1c8c186ea h1:91PTHjeL3uzjr2/jk1SJuFZp3ObodKawy79BKdio+VE=\ngithub.com/buildpacks/imgutil v0.0.0-20251202182233-51c1c8c186ea/go.mod h1:yw2U9Ec8KUk7jWY97K0+e6GqrUAr05uTK6LwOEZyupw=\ngithub.com/buildpacks/lifecycle v0.21.0 h1:s2okNv1I4rETBC4CRm4ly7DRr5eTqx1bpKXyf+ywVms=\ngithub.com/buildpacks/lifecycle v0.21.0/go.mod h1:5Rb9kld2v1XYQC14fJrIGESqmuJvSYHVlUxAW5MAxNU=\ngithub.com/cespare/xxhash/v2 v2.3.0 h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=\ngithub.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=\ngithub.com/chainguard-dev/kaniko v1.25.12 h1:3CyKF5zfGyadWRiq4vVC8tk2KtviKZTr6ajQcIOJ09M=\ngithub.com/chainguard-dev/kaniko v1.25.12/go.mod h1:mSjXdmLYuSF8BpGXyHtkTMGFIRvzlaDNxnMkk5SCTsU=\ngithub.com/chrismellard/docker-credential-acr-env v0.0.0-20230304212654-82a0ddb27589 h1:krfRl01rzPzxSxyLyrChD+U+MzsBXbm0OwYYB67uF+4=\ngithub.com/chrismellard/docker-credential-acr-env v0.0.0-20230304212654-82a0ddb27589/go.mod h1:OuDyvmLnMCwa2ep4Jkm6nyA0ocJuZlGyk2gGseVzERM=\ngithub.com/cloudflare/circl v1.6.3 h1:9GPOhQGF9MCYUeXyMYlqTR6a5gTrgR/fBLXvUgtVcg8=\ngithub.com/cloudflare/circl v1.6.3/go.mod h1:2eXP6Qfat4O/Yhh8BznvKnJ+uzEoTQ6jVKJRn81BiS4=\ngithub.com/containerd/errdefs v1.0.0 h1:tg5yIfIlQIrxYtu9ajqY42W3lpS19XqdxRQeEwYG8PI=\ngithub.com/containerd/errdefs v1.0.0/go.mod h1:+YBYIdtsnF4Iw6nWZhJcqGSg/dwvV7tyJ/kCkyJ2k+M=\ngithub.com/containerd/errdefs/pkg v0.3.0 h1:9IKJ06FvyNlexW690DXuQNx2KA2cUJXx151Xdx3ZPPE=\ngithub.com/containerd/errdefs/pkg v0.3.0/go.mod 
h1:NJw6s9HwNuRhnjJhM7pylWwMyAkmCQvQ4GpJHEqRLVk=\ngithub.com/containerd/log v0.1.0 h1:TCJt7ioM2cr/tfR8GPbGf9/VRAX8D2B4PjzCpfX540I=\ngithub.com/containerd/log v0.1.0/go.mod h1:VRRf09a7mHDIRezVKTRCrOq78v577GXq3bSa3EhrzVo=\ngithub.com/containerd/stargz-snapshotter/estargz v0.18.2 h1:yXkZFYIzz3eoLwlTUZKz2iQ4MrckBxJjkmD16ynUTrw=\ngithub.com/containerd/stargz-snapshotter/estargz v0.18.2/go.mod h1:XyVU5tcJ3PRpkA9XS2T5us6Eg35yM0214Y+wvrZTBrY=\ngithub.com/containerd/typeurl/v2 v2.2.3 h1:yNA/94zxWdvYACdYO8zofhrTVuQY73fFU1y++dYSw40=\ngithub.com/containerd/typeurl/v2 v2.2.3/go.mod h1:95ljDnPfD3bAbDJRugOiShd/DlAAsxGtUBhJxIn7SCk=\ngithub.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=\ngithub.com/creack/pty v1.1.24 h1:bJrF4RRfyJnbTJqzRLHzcGaZK1NeM5kTC9jGgovnR1s=\ngithub.com/creack/pty v1.1.24/go.mod h1:08sCNb52WyoAwi2QDyzUCTgcvVFhUzewun7wtTfvcwE=\ngithub.com/cyphar/filepath-securejoin v0.6.1 h1:5CeZ1jPXEiYt3+Z6zqprSAgSWiggmpVyciv8syjIpVE=\ngithub.com/cyphar/filepath-securejoin v0.6.1/go.mod h1:A8hd4EnAeyujCJRrICiOWqjS1AX0a9kM5XL+NwKoYSc=\ngithub.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=\ngithub.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=\ngithub.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc h1:U9qPSI2PIWSS1VwoXQT9A3Wy9MM3WgvqSxFWenqJduM=\ngithub.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=\ngithub.com/dimchansky/utfbom v1.1.1 h1:vV6w1AhK4VMnhBno/TPVCoK9U/LP0PkLCS9tbxHdi/U=\ngithub.com/dimchansky/utfbom v1.1.1/go.mod h1:SxdoEBH5qIqFocHMyGOXVAybYJdr71b1Q/j0mACtrfE=\ngithub.com/distribution/reference v0.6.0 h1:0IXCQ5g4/QMHHkarYzh5l+u8T3t73zM5QvfrDyIgxBk=\ngithub.com/distribution/reference v0.6.0/go.mod h1:BbU0aIcezP1/5jX/8MP0YiH4SdvB5Y4f/wlDRiLyi3E=\ngithub.com/docker/cli v29.3.1+incompatible h1:M04FDj2TRehDacrosh7Vlkgc7AuQoWloQkf1PA5hmoI=\ngithub.com/docker/cli v29.3.1+incompatible/go.mod 
h1:JLrzqnKDaYBop7H2jaqPtU4hHvMKP+vjCwu2uszcLI8=\ngithub.com/docker/distribution v2.8.3+incompatible h1:AtKxIZ36LoNK51+Z6RpzLpddBirtxJnzDrHLEKxTAYk=\ngithub.com/docker/distribution v2.8.3+incompatible/go.mod h1:J2gT2udsDAN96Uj4KfcMRqY0/ypR+oyYUYmja8H+y+w=\ngithub.com/docker/docker v28.5.2+incompatible h1:DBX0Y0zAjZbSrm1uzOkdr1onVghKaftjlSWt4AFexzM=\ngithub.com/docker/docker v28.5.2+incompatible/go.mod h1:eEKB0N0r5NX/I1kEveEz05bcu8tLC/8azJZsviup8Sk=\ngithub.com/docker/docker-credential-helpers v0.9.5 h1:EFNN8DHvaiK8zVqFA2DT6BjXE0GzfLOZ38ggPTKePkY=\ngithub.com/docker/docker-credential-helpers v0.9.5/go.mod h1:v1S+hepowrQXITkEfw6o4+BMbGot02wiKpzWhGUZK6c=\ngithub.com/docker/go-connections v0.6.0 h1:LlMG9azAe1TqfR7sO+NJttz1gy6KO7VJBh+pMmjSD94=\ngithub.com/docker/go-connections v0.6.0/go.mod h1:AahvXYshr6JgfUJGdDCs2b5EZG/vmaMAntpSFH5BFKE=\ngithub.com/docker/go-metrics v0.0.1 h1:AgB/0SvBxihN0X8OR4SjsblXkbMvalQ8cjmtKQ2rQV8=\ngithub.com/docker/go-metrics v0.0.1/go.mod h1:cG1hvH2utMXtqgqqYE9plW6lDxS3/5ayHzueweSI3Vw=\ngithub.com/docker/go-units v0.5.0 h1:69rxXcBk27SvSaaxTtLh/8llcHD8vYHT7WSdRZ/jvr4=\ngithub.com/docker/go-units v0.5.0/go.mod h1:fgPhTUdO+D/Jk86RDLlptpiXQzgHJF7gydDDbaIK4Dk=\ngithub.com/docker/libtrust v0.0.0-20160708172513-aabc10ec26b7 h1:UhxFibDNY/bfvqU5CAUmr9zpesgbU6SWc8/B4mflAE4=\ngithub.com/docker/libtrust v0.0.0-20160708172513-aabc10ec26b7/go.mod h1:cyGadeNEkKy96OOhEzfZl+yxihPEzKnqJwvfuSUqbZE=\ngithub.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=\ngithub.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto=\ngithub.com/elazarl/goproxy v1.7.2 h1:Y2o6urb7Eule09PjlhQRGNsqRfPmYI3KKQLFpCAV3+o=\ngithub.com/elazarl/goproxy v1.7.2/go.mod h1:82vkLNir0ALaW14Rc399OTTjyNREgmdL2cVoIbS6XaE=\ngithub.com/emirpasic/gods v1.18.1 h1:FXtiHYKDGKCW2KzwZKx0iC0PQmdlorYgdFG9jPXJ1Bc=\ngithub.com/emirpasic/gods v1.18.1/go.mod h1:8tpGGwCnJ5H4r6BWwaV6OrWmMoPhUl5jm/FMNAnJvWQ=\ngithub.com/fatih/color v1.7.0/go.mod 
h1:Zm6kSWBoL9eyXnKyktHP6abPY2pDugNf5KwzbycvMj4=\ngithub.com/felixge/httpsnoop v1.0.4 h1:NFTV2Zj1bL4mc9sqWACXbQFVBBg2W3GPvqp8/ESS2Wg=\ngithub.com/felixge/httpsnoop v1.0.4/go.mod h1:m8KPJKqk1gH5J9DgRY2ASl2lWCfGKXixSwevea8zH2U=\ngithub.com/fsnotify/fsnotify v1.4.7/go.mod h1:jwhsz4b93w/PPRr/qN1Yymfu8t87LnFCMoQvtojpjFo=\ngithub.com/gdamore/encoding v1.0.1 h1:YzKZckdBL6jVt2Gc+5p82qhrGiqMdG/eNs6Wy0u3Uhw=\ngithub.com/gdamore/encoding v1.0.1/go.mod h1:0Z0cMFinngz9kS1QfMjCP8TY7em3bZYeeklsSDPivEo=\ngithub.com/gdamore/tcell/v2 v2.13.8 h1:Mys/Kl5wfC/GcC5Cx4C2BIQH9dbnhnkPgS9/wF3RlfU=\ngithub.com/gdamore/tcell/v2 v2.13.8/go.mod h1:+Wfe208WDdB7INEtCsNrAN6O2m+wsTPk1RAovjaILlo=\ngithub.com/gliderlabs/ssh v0.3.8 h1:a4YXD1V7xMF9g5nTkdfnja3Sxy1PVDCj1Zg4Wb8vY6c=\ngithub.com/gliderlabs/ssh v0.3.8/go.mod h1:xYoytBv1sV0aL3CavoDuJIQNURXkkfPA/wxQ1pL1fAU=\ngithub.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 h1:+zs/tPmkDkHx3U66DAb0lQFJrpS6731Oaa12ikc+DiI=\ngithub.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376/go.mod h1:an3vInlBmSxCcxctByoQdvwPiA7DTK7jaaFDBTtu0ic=\ngithub.com/go-git/go-billy/v5 v5.8.0 h1:I8hjc3LbBlXTtVuFNJuwYuMiHvQJDq1AT6u4DwDzZG0=\ngithub.com/go-git/go-billy/v5 v5.8.0/go.mod h1:RpvI/rw4Vr5QA+Z60c6d6LXH0rYJo0uD5SqfmrrheCY=\ngithub.com/go-git/go-git-fixtures/v4 v4.3.2-0.20231010084843-55a94097c399 h1:eMje31YglSBqCdIqdhKBW8lokaMrL3uTkpGYlE2OOT4=\ngithub.com/go-git/go-git-fixtures/v4 v4.3.2-0.20231010084843-55a94097c399/go.mod h1:1OCfN199q1Jm3HZlxleg+Dw/mwps2Wbk9frAWm+4FII=\ngithub.com/go-git/go-git/v5 v5.17.2 h1:B+nkdlxdYrvyFK4GPXVU8w1U+YkbsgciIR7f2sZJ104=\ngithub.com/go-git/go-git/v5 v5.17.2/go.mod h1:pW/VmeqkanRFqR6AljLcs7EA7FbZaN5MQqO7oZADXpo=\ngithub.com/go-kit/kit v0.8.0/go.mod h1:xBxKIO96dXMWWy0MnWVtmwkA9/13aqxPnvrjFYMA2as=\ngithub.com/go-logfmt/logfmt v0.3.0/go.mod h1:Qt1PoO58o5twSAckw1HlFXLmHsOX5/0LbT9GBnD5lWE=\ngithub.com/go-logfmt/logfmt v0.4.0/go.mod h1:3RMwSq7FuexP4Kalkev3ejPJsZTpXXBr9+V4qmtdjCk=\ngithub.com/go-logr/logr v1.2.2/go.mod 
h1:jdQByPbusPIv2/zmleS9BjJVeZ6kBagPoEUsqbVz/1A=\ngithub.com/go-logr/logr v1.4.3 h1:CjnDlHq8ikf6E492q6eKboGOC0T8CDaOvkHCIg8idEI=\ngithub.com/go-logr/logr v1.4.3/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=\ngithub.com/go-logr/stdr v1.2.2 h1:hSWxHoqTgW2S2qGc0LTAI563KZ5YKYRhT3MFKZMbjag=\ngithub.com/go-logr/stdr v1.2.2/go.mod h1:mMo/vtBO5dYbehREoey6XUKy/eSumjCCveDpRre4VKE=\ngithub.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=\ngithub.com/go-task/slim-sprig/v3 v3.0.0 h1:sUs3vkvUymDpBKi3qH1YSqBQk9+9D/8M2mN1vB6EwHI=\ngithub.com/go-task/slim-sprig/v3 v3.0.0/go.mod h1:W848ghGpv3Qj3dhTPRyJypKRiqCdHZiAzKg9hl15HA8=\ngithub.com/go-viper/mapstructure/v2 v2.4.0 h1:EBsztssimR/CONLSZZ04E8qAkxNYq4Qp9LvH92wZUgs=\ngithub.com/go-viper/mapstructure/v2 v2.4.0/go.mod h1:oJDH3BJKyqBA2TXFhDsKDGDTlndYOZ6rGS0BRZIxGhM=\ngithub.com/gogo/protobuf v1.1.1/go.mod h1:r8qH/GZQm5c6nD/R0oafs1akxWv10x8SbQlK7atdtwQ=\ngithub.com/gogo/protobuf v1.3.2 h1:Ov1cvc58UF3b5XjBnZv7+opcTcQFZebYjWzi34vdm4Q=\ngithub.com/gogo/protobuf v1.3.2/go.mod h1:P1XiOD3dCwIKUDQYPy72D8LYyHL2YPYrpS2s69NZV8Q=\ngithub.com/golang-jwt/jwt/v4 v4.0.0/go.mod h1:/xlHOz8bRuivTWchD4jCa+NbatV+wEUSzwAxVc6locg=\ngithub.com/golang-jwt/jwt/v4 v4.2.0/go.mod h1:/xlHOz8bRuivTWchD4jCa+NbatV+wEUSzwAxVc6locg=\ngithub.com/golang-jwt/jwt/v4 v4.5.0/go.mod h1:m21LjoU+eqJr34lmDMbreY2eSTRJ1cv77w39/MY0Ch0=\ngithub.com/golang-jwt/jwt/v4 v4.5.2 h1:YtQM7lnr8iZ+j5q71MGKkNw9Mn7AjHM68uc9g5fXeUI=\ngithub.com/golang-jwt/jwt/v4 v4.5.2/go.mod h1:m21LjoU+eqJr34lmDMbreY2eSTRJ1cv77w39/MY0Ch0=\ngithub.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8 h1:f+oWsMOmNPc8JmEHVZIycC7hBoQxHH9pNKQORJNozsQ=\ngithub.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8/go.mod h1:wcDNUvekVysuuOpQKo3191zZyTpiI6se1N1ULghS0sw=\ngithub.com/golang/mock v1.6.0 h1:ErTB+efbowRARo13NNdxyJji2egdxLGQhRaY+DUumQc=\ngithub.com/golang/mock v1.6.0/go.mod h1:p6yTPP+5HYm5mzsMV8JkE6ZKdX+/wYM6Hr+LicevLPs=\ngithub.com/golang/protobuf v1.2.0/go.mod 
h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=\ngithub.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=\ngithub.com/golang/protobuf v1.3.2/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=\ngithub.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=\ngithub.com/google/go-cmp v0.5.2/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=\ngithub.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=\ngithub.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=\ngithub.com/google/go-containerregistry v0.21.4 h1:VrhlIQtdhE6riZW//MjPrcJ1snAjPoCCpPHqGOygrv8=\ngithub.com/google/go-containerregistry v0.21.4/go.mod h1:kxgc23zQ2qMY/hAKt0wCbB/7tkeovAP2mE2ienynJUw=\ngithub.com/google/go-github/v30 v30.1.0 h1:VLDx+UolQICEOKu2m4uAoMti1SxuEBAl7RSEG16L+Oo=\ngithub.com/google/go-github/v30 v30.1.0/go.mod h1:n8jBpHl45a/rlBUtRJMOG4GhNADUQFEufcolZ95JfU8=\ngithub.com/google/go-querystring v1.0.0/go.mod h1:odCYkC5MyYFN7vkCjXpyrEuKhc/BUO6wN/zVPAxq5ck=\ngithub.com/google/go-querystring v1.1.0 h1:AnCroh3fv4ZBgVIf1Iwtovgjaw/GiKJo8M8yD/fhyJ8=\ngithub.com/google/go-querystring v1.1.0/go.mod h1:Kcdr2DB4koayq7X8pmAG4sNG59So17icRSOU623lUBU=\ngithub.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=\ngithub.com/google/pprof v0.0.0-20260115054156-294ebfa9ad83 h1:z2ogiKUYzX5Is6zr/vP9vJGqPwcdqsWjOt+V8J7+bTc=\ngithub.com/google/pprof v0.0.0-20260115054156-294ebfa9ad83/go.mod h1:MxpfABSjhmINe3F1It9d+8exIHFvUqtLIRCdOGNXqiI=\ngithub.com/google/shlex v0.0.0-20191202100458-e7afc7fbc510 h1:El6M4kTTCOh6aBiKaUGG7oYTSPP8MxqL4YI3kZKwcP4=\ngithub.com/google/shlex v0.0.0-20191202100458-e7afc7fbc510/go.mod h1:pupxD2MaaD3pAXIBCelhxNneeOaAeabZDe5s4K6zSpQ=\ngithub.com/google/uuid v1.1.1/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=\ngithub.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=\ngithub.com/google/uuid v1.6.0/go.mod 
h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=\ngithub.com/gorilla/mux v1.8.1 h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=\ngithub.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=\ngithub.com/hectane/go-acl v0.0.0-20190604041725-da78bae5fc95 h1:S4qyfL2sEm5Budr4KVMyEniCy+PbS55651I/a+Kn/NQ=\ngithub.com/hectane/go-acl v0.0.0-20190604041725-da78bae5fc95/go.mod h1:QiyDdbZLaJ/mZP4Zwc9g2QsfaEA4o7XvvgZegSci5/E=\ngithub.com/heroku/color v0.0.6 h1:UTFFMrmMLFcL3OweqP1lAdp8i1y/9oHqkeHjQ/b/Ny0=\ngithub.com/heroku/color v0.0.6/go.mod h1:ZBvOcx7cTF2QKOv4LbmoBtNl5uB17qWxGuzZrsi1wLU=\ngithub.com/hpcloud/tail v1.0.0/go.mod h1:ab1qPbhIpdTxEkNHXyeSf5vhxWSCs/tWer42PpOxQnU=\ngithub.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=\ngithub.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=\ngithub.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 h1:BQSFePA1RWJOlocH6Fxy8MmwDt+yVQYULKfN0RoTN8A=\ngithub.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99/go.mod h1:1lJo3i6rXxKeerYnT8Nvf0QmHCRC1n8sfWVwXF2Frvo=\ngithub.com/jmespath/go-jmespath v0.0.0-20180206201540-c2b33e8439af/go.mod h1:Nht3zPeWKUH0NzdCt2Blrr5ys8VGpn0CEB0cQHVjt7k=\ngithub.com/jpillora/backoff v0.0.0-20180909062703-3050d21c67d7/go.mod h1:2iMrUgbbvHEiQClaW2NsSzMyGHqN+rDFqY705q49KG0=\ngithub.com/json-iterator/go v1.1.6/go.mod h1:+SdeFBvtyEkXs7REEP0seUULqWtbJapLOCVDaaPEHmU=\ngithub.com/json-iterator/go v1.1.7/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=\ngithub.com/julienschmidt/httprouter v1.2.0/go.mod h1:SYymIcj16QtmaHHD7aYtjjsJG7VTCxuUUipMqKk8s4w=\ngithub.com/kevinburke/ssh_config v1.2.0 h1:x584FjTGwHzMwvHx18PXxbBVzfnxogHaAReU4gf13a4=\ngithub.com/kevinburke/ssh_config v1.2.0/go.mod h1:CT57kijsi8u/K/BOFA39wgDQJ9CxiF4nAY/ojJ6r6mM=\ngithub.com/kisielk/errcheck v1.5.0/go.mod h1:pFxgyoBC7bSaBwPgfKdkLd5X25qrDl4LWUI2bnpBCr8=\ngithub.com/kisielk/gotool v1.0.0/go.mod 
h1:XhKaO+MFFWcvkIS/tQcRk01m1F5IRFswLeQ+oQHNcck=\ngithub.com/klauspost/compress v1.18.5 h1:/h1gH5Ce+VWNLSWqPzOVn6XBO+vJbCNGvjoaGBFW2IE=\ngithub.com/klauspost/compress v1.18.5/go.mod h1:cwPg85FWrGar70rWktvGQj8/hthj3wpl0PGDogxkrSQ=\ngithub.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=\ngithub.com/kr/logfmt v0.0.0-20140226030751-b84e30acd515/go.mod h1:+0opPa2QZZtGFBFZlji/RkVcI2GknAs/DXo4wKdlNEc=\ngithub.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=\ngithub.com/kr/pretty v0.2.0/go.mod h1:ipq/a2n7PKx3OHsz4KJII5eveXtPO4qwEXGdVfWzfnI=\ngithub.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=\ngithub.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=\ngithub.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=\ngithub.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=\ngithub.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=\ngithub.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=\ngithub.com/kylelemons/godebug v1.1.0 h1:RPNrshWIDI6G2gRW9EHilWtl7Z6Sb1BR0xunSBf0SNc=\ngithub.com/kylelemons/godebug v1.1.0/go.mod h1:9/0rRGxNHcop5bhtWyNeEfOS8JIWk580+fNqagV/RAw=\ngithub.com/lucasb-eyer/go-colorful v1.3.0 h1:2/yBRLdWBZKrf7gB40FoiKfAWYQ0lqNcbuQwVHXptag=\ngithub.com/lucasb-eyer/go-colorful v1.3.0/go.mod h1:R4dSotOR9KMtayYi1e77YzuveK+i7ruzyGqttikkLy0=\ngithub.com/mattn/go-colorable v0.1.1/go.mod h1:FuOcm+DKB9mbwrcAfNl7/TZVBZ6rcnceauSikq3lYCQ=\ngithub.com/mattn/go-colorable v0.1.2/go.mod h1:U0ppj6V5qS13XJ6of8GYAs25YV2eR4EVcfRqFIhoBtE=\ngithub.com/mattn/go-colorable v0.1.14 h1:9A9LHSqF/7dyVVX6g0U9cwm9pG3kP9gSzcuIPHPsaIE=\ngithub.com/mattn/go-colorable v0.1.14/go.mod h1:6LmQG8QLFO4G5z1gPvYEzlUgJ2wF+stgPZH1UqBm1s8=\ngithub.com/mattn/go-isatty v0.0.5/go.mod h1:Iq45c/XA43vh69/j3iqttzPXn0bhXyGjM0Hdxcsrc5s=\ngithub.com/mattn/go-isatty v0.0.8/go.mod 
h1:Iq45c/XA43vh69/j3iqttzPXn0bhXyGjM0Hdxcsrc5s=\ngithub.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=\ngithub.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=\ngithub.com/matttproud/golang_protobuf_extensions v1.0.1/go.mod h1:D8He9yQNgCq6Z5Ld7szi9bcBfOoFv/3dc6xSMkL2PC0=\ngithub.com/mgutz/ansi v0.0.0-20170206155736-9520e82c474b/go.mod h1:01TrycV0kFyexm33Z7vhZRXopbI8J3TDReVlkTgMUxE=\ngithub.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y=\ngithub.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=\ngithub.com/mitchellh/ioprogress v0.0.0-20180201004757-6a23b12fa88e h1:Qa6dnn8DlasdXRnacluu8HzPts0S1I9zvvUPDbBnXFI=\ngithub.com/mitchellh/ioprogress v0.0.0-20180201004757-6a23b12fa88e/go.mod h1:waEya8ee1Ro/lgxpVhkJI4BVASzkm3UZqkx/cFJiYHM=\ngithub.com/moby/buildkit v0.28.1 h1:Tq6H6gOMU2JyEQ5rA0pa7Ey3VGNR3qpw90liSIpMQoo=\ngithub.com/moby/buildkit v0.28.1/go.mod h1:xO6wb9VBXszkIBxaGTLXc1rQORVQFIJRt3GSX7KzCFc=\ngithub.com/moby/docker-image-spec v1.3.1 h1:jMKff3w6PgbfSa69GfNg+zN/XLhfXJGnEx3Nl2EsFP0=\ngithub.com/moby/docker-image-spec v1.3.1/go.mod h1:eKmb5VW8vQEh/BAr2yvVNvuiJuY6UIocYsFu/DxxRpo=\ngithub.com/moby/go-archive v0.2.0 h1:zg5QDUM2mi0JIM9fdQZWC7U8+2ZfixfTYoHL7rWUcP8=\ngithub.com/moby/go-archive v0.2.0/go.mod h1:mNeivT14o8xU+5q1YnNrkQVpK+dnNe/K6fHqnTg4qPU=\ngithub.com/moby/moby/api v1.54.1 h1:TqVzuJkOLsgLDDwNLmYqACUuTehOHRGKiPhvH8V3Nn4=\ngithub.com/moby/moby/api v1.54.1/go.mod h1:+RQ6wluLwtYaTd1WnPLykIDPekkuyD/ROWQClE83pzs=\ngithub.com/moby/moby/client v0.4.0 h1:S+2XegzHQrrvTCvF6s5HFzcrywWQmuVnhOXe2kiWjIw=\ngithub.com/moby/moby/client v0.4.0/go.mod h1:QWPbvWchQbxBNdaLSpoKpCdf5E+WxFAgNHogCWDoa7g=\ngithub.com/moby/patternmatcher v0.6.1 h1:qlhtafmr6kgMIJjKJMDmMWq7WLkKIo23hsrpR3x084U=\ngithub.com/moby/patternmatcher v0.6.1/go.mod h1:hDPoyOpDY7OrrMDLaYoY3hf52gNCR/YOUYxkhApJIxc=\ngithub.com/moby/sys/atomicwriter v0.1.0 
h1:kw5D/EqkBwsBFi0ss9v1VG3wIkVhzGvLklJ+w3A14Sw=\ngithub.com/moby/sys/atomicwriter v0.1.0/go.mod h1:Ul8oqv2ZMNHOceF643P6FKPXeCmYtlQMvpizfsSoaWs=\ngithub.com/moby/sys/sequential v0.6.0 h1:qrx7XFUd/5DxtqcoH1h438hF5TmOvzC/lspjy7zgvCU=\ngithub.com/moby/sys/sequential v0.6.0/go.mod h1:uyv8EUTrca5PnDsdMGXhZe6CCe8U/UiTWd+lL+7b/Ko=\ngithub.com/moby/sys/user v0.4.0 h1:jhcMKit7SA80hivmFJcbB1vqmw//wU61Zdui2eQXuMs=\ngithub.com/moby/sys/user v0.4.0/go.mod h1:bG+tYYYJgaMtRKgEmuueC0hJEAZWwtIbZTB+85uoHjs=\ngithub.com/moby/sys/userns v0.1.0 h1:tVLXkFOxVu9A64/yh59slHVv9ahO9UIev4JZusOLG/g=\ngithub.com/moby/sys/userns v0.1.0/go.mod h1:IHUYgu/kao6N8YZlp9Cf444ySSvCmDlmzUcYfDHOl28=\ngithub.com/moby/term v0.5.2 h1:6qk3FJAFDs6i/q3W/pQ97SX192qKfZgGjCQqfCJkgzQ=\ngithub.com/moby/term v0.5.2/go.mod h1:d3djjFCrjnB+fl8NJux+EJzu0msscUP+f8it8hPkFLc=\ngithub.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=\ngithub.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=\ngithub.com/modern-go/reflect2 v0.0.0-20180701023420-4b7aa43c6742/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=\ngithub.com/modern-go/reflect2 v1.0.1/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=\ngithub.com/morikuni/aec v1.1.0 h1:vBBl0pUnvi/Je71dsRrhMBtreIqNMYErSAbEeb8jrXQ=\ngithub.com/morikuni/aec v1.1.0/go.mod h1:xDRgiq/iw5l+zkao76YTKzKttOp2cwPEne25HDkJnBw=\ngithub.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 h1:C3w9PqII01/Oq1c1nUAm88MOHcQC9l5mIlSMApZMrHA=\ngithub.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822/go.mod h1:+n7T8mK8HuQTcFwEeznm/DIxMOiR9yIdICNftLE1DvQ=\ngithub.com/mwitkow/go-conntrack v0.0.0-20161129095857-cc309e4a2223/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=\ngithub.com/onsi/ginkgo v1.6.0 h1:Ix8l273rp3QzYgXSR+c8d1fTG7UPgYkOSELPhiY/YGw=\ngithub.com/onsi/ginkgo v1.6.0/go.mod 
h1:lLunBs/Ym6LB5Z9jYTR76FiuTmxDTDusOGeTQH+WWjE=\ngithub.com/onsi/ginkgo/v2 v2.28.0 h1:Rrf+lVLmtlBIKv6KrIGJCjyY8N36vDVcutbGJkyqjJc=\ngithub.com/onsi/ginkgo/v2 v2.28.0/go.mod h1:ArE1D/XhNXBXCBkKOLkbsb2c81dQHCRcF5zwn/ykDRo=\ngithub.com/onsi/gomega v1.5.0/go.mod h1:ex+gbHU/CVuBBDIJjb2X0qEXbFg53c61hWP/1CpauHY=\ngithub.com/onsi/gomega v1.39.1 h1:1IJLAad4zjPn2PsnhH70V4DKRFlrCzGBNrNaru+Vf28=\ngithub.com/onsi/gomega v1.39.1/go.mod h1:hL6yVALoTOxeWudERyfppUcZXjMwIMLnuSfruD2lcfg=\ngithub.com/opencontainers/go-digest v1.0.0 h1:apOUWs51W5PlhuyGyz9FCeeBIOUDA/6nW8Oi/yOhh5U=\ngithub.com/opencontainers/go-digest v1.0.0/go.mod h1:0JzlMkj0TRzQZfJkVvzbP0HBR3IKzErnv2BNG4W4MAM=\ngithub.com/opencontainers/image-spec v1.1.1 h1:y0fUlFfIZhPF1W537XOLg0/fcx6zcHCJwooC2xJA040=\ngithub.com/opencontainers/image-spec v1.1.1/go.mod h1:qpqAh3Dmcf36wStyyWU+kCeDgrGnAve2nCC8+7h8Q0M=\ngithub.com/opencontainers/selinux v1.13.1 h1:A8nNeceYngH9Ow++M+VVEwJVpdFmrlxsN22F+ISDCJE=\ngithub.com/opencontainers/selinux v1.13.1/go.mod h1:S10WXZ/osk2kWOYKy1x2f/eXF5ZHJoUs8UU/2caNRbg=\ngithub.com/pelletier/go-toml v1.9.5 h1:4yBQzkHv+7BHq2PQUZF3Mx0IYxG7LsP222s7Agd3ve8=\ngithub.com/pelletier/go-toml v1.9.5/go.mod h1:u1nR/EPcESfeI/szUZKdtJ0xRNbUoANCkoOuaOx1Y+c=\ngithub.com/pjbgf/sha1cd v0.3.2 h1:a9wb0bp1oC2TGwStyn0Umc/IGKQnEgF0vVaZ8QF8eo4=\ngithub.com/pjbgf/sha1cd v0.3.2/go.mod h1:zQWigSxVmsHEZow5qaLtPYxpcKMMQpa09ixqBxuCS6A=\ngithub.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=\ngithub.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=\ngithub.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=\ngithub.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=\ngithub.com/planetscale/vtprotobuf v0.6.1-0.20240319094008-0393e58bdf10 h1:GFCKgmp0tecUJ0sJuv4pzYCqS9+RGSn52M3FUwPs+uo=\ngithub.com/planetscale/vtprotobuf v0.6.1-0.20240319094008-0393e58bdf10/go.mod 
h1:t/avpk3KcrXxUnYOhZhMXJlSEyie6gQbtLq5NM3loB8=\ngithub.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=\ngithub.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 h1:Jamvg5psRIccs7FGNTlIRMkT8wgtp5eCXdBlqhYGL6U=\ngithub.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=\ngithub.com/prometheus/client_golang v0.9.1/go.mod h1:7SWBe2y4D6OKWSNQJUaRYU/AaXPKyh/dDVn+NZz0KFw=\ngithub.com/prometheus/client_golang v1.0.0/go.mod h1:db9x61etRT2tGnBNRi70OPL5FsnadC4Ky3P0J6CfImo=\ngithub.com/prometheus/client_golang v1.1.0/go.mod h1:I1FGZT9+L76gKKOs5djB6ezCbFQP1xR9D75/vuwEF3g=\ngithub.com/prometheus/client_golang v1.23.2 h1:Je96obch5RDVy3FDMndoUsjAhG5Edi49h0RJWRi/o0o=\ngithub.com/prometheus/client_golang v1.23.2/go.mod h1:Tb1a6LWHB3/SPIzCoaDXI4I8UHKeFTEQ1YCr+0Gyqmg=\ngithub.com/prometheus/client_model v0.0.0-20180712105110-5c3871d89910/go.mod h1:MbSGuTsp3dbXC40dX6PRTWyKYBIrTGTE9sqQNg2J8bo=\ngithub.com/prometheus/client_model v0.0.0-20190129233127-fd36f4220a90/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=\ngithub.com/prometheus/client_model v0.6.2 h1:oBsgwpGs7iVziMvrGhE53c/GrLUsZdHnqNwqPLxwZyk=\ngithub.com/prometheus/client_model v0.6.2/go.mod h1:y3m2F6Gdpfy6Ut/GBsUqTWZqCUvMVzSfMLjcu6wAwpE=\ngithub.com/prometheus/common v0.4.1/go.mod h1:TNfzLD0ON7rHzMJeJkieUDPYmFC7Snx/y86RQel1bk4=\ngithub.com/prometheus/common v0.6.0/go.mod h1:eBmuwkDJBwy6iBfxCBob6t6dR6ENT/y+J+Zk0j9GMYc=\ngithub.com/prometheus/common v0.66.1 h1:h5E0h5/Y8niHc5DlaLlWLArTQI7tMrsfQjHV+d9ZoGs=\ngithub.com/prometheus/common v0.66.1/go.mod h1:gcaUsgf3KfRSwHY4dIMXLPV0K/Wg1oZ8+SbZk/HH/dA=\ngithub.com/prometheus/procfs v0.0.0-20181005140218-185b4288413d/go.mod h1:c3At6R/oaqEKCNdg8wHV1ftS6bRYblBhIjjI8uT2IGk=\ngithub.com/prometheus/procfs v0.0.2/go.mod h1:TjEm7ze935MbeOT/UhFTIMYKhuLP4wbCsTZCD3I8kEA=\ngithub.com/prometheus/procfs v0.0.3/go.mod 
h1:4A/X28fw3Fc593LaREMrKMqOKvUAntwMDaekg4FpcdQ=\ngithub.com/prometheus/procfs v0.17.0 h1:FuLQ+05u4ZI+SS/w9+BWEM2TXiHKsUQ9TADiRH7DuK0=\ngithub.com/prometheus/procfs v0.17.0/go.mod h1:oPQLaDAMRbA+u8H5Pbfq+dl3VDAvHxMUOVhe0wYB2zw=\ngithub.com/rivo/tview v0.42.0 h1:b/ftp+RxtDsHSaynXTbJb+/n/BxDEi+W3UfF5jILK6c=\ngithub.com/rivo/tview v0.42.0/go.mod h1:cSfIYfhpSGCjp3r/ECJb+GKS7cGJnqV8vfjQPwoXyfY=\ngithub.com/rivo/uniseg v0.4.7 h1:WUdvkW8uEhrYfLC4ZzdpI2ztxP1I582+49Oc5Mq64VQ=\ngithub.com/rivo/uniseg v0.4.7/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=\ngithub.com/rogpeppe/fastuuid v1.1.0/go.mod h1:jVj6XXZzXRy/MSR5jhDC/2q6DgLz+nrA6LYCDYWNEvQ=\ngithub.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=\ngithub.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=\ngithub.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=\ngithub.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06 h1:OkMGxebDjyw0ULyrTYWeN0UNCCkmCWfjPnIA2W6oviI=\ngithub.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06/go.mod h1:+ePHsJ1keEjQtpvf9HHw0f4ZeJ0TLRsxhunSI2hYJSs=\ngithub.com/sclevine/spec v1.4.0 h1:z/Q9idDcay5m5irkZ28M7PtQM4aOISzOpj4bUPkDee8=\ngithub.com/sclevine/spec v1.4.0/go.mod h1:LvpgJaFyvQzRvc1kaDs0bulYwzC70PbiYjC4QnFHkOM=\ngithub.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=\ngithub.com/sergi/go-diff v1.3.2-0.20230802210424-5b0b94c5c0d3 h1:n661drycOFuPLCN3Uc8sB6B/s6Z4t2xvBgU1htSHuq8=\ngithub.com/sergi/go-diff v1.3.2-0.20230802210424-5b0b94c5c0d3/go.mod h1:A0bzQcvG0E7Rwjx0REVgAGH58e96+X0MeOfepqsbeW4=\ngithub.com/sirupsen/logrus v1.2.0/go.mod h1:LxeOpSwHxABJmUn/MG1IvRgCAasNZTLOkJPxbbu5VWo=\ngithub.com/sirupsen/logrus v1.7.0/go.mod h1:yWOB1SBYBC5VeMP7gHvWumXLIWorT60ONWic61uBYv0=\ngithub.com/sirupsen/logrus v1.9.4 h1:TsZE7l11zFCLZnZ+teH4Umoq5BhEIfIzfRDZ1Uzql2w=\ngithub.com/sirupsen/logrus v1.9.4/go.mod 
h1:ftWc9WdOfJ0a92nsE2jF5u5ZwH8Bv2zdeOC42RjbV2g=\ngithub.com/skeema/knownhosts v1.3.1 h1:X2osQ+RAjK76shCbvhHHHVl3ZlgDm8apHEHFqRjnBY8=\ngithub.com/skeema/knownhosts v1.3.1/go.mod h1:r7KTdC8l4uxWRyK2TpQZ/1o5HaSzh06ePQNxPwTcfiY=\ngithub.com/smartystreets/assertions v1.0.0/go.mod h1:kHHU4qYBaI3q23Pp3VPrmWhuIUrLW/7eUrw0BU5VaoM=\ngithub.com/smartystreets/go-aws-auth v0.0.0-20180515143844-0c1422d1fdb9/go.mod h1:SnhjPscd9TpLiy1LpzGSKh3bXCfxxXuqd9xmQJy3slM=\ngithub.com/smartystreets/gunit v1.0.0/go.mod h1:qwPWnhz6pn0NnRBP++URONOVyNkPyr4SauJk4cUOwJs=\ngithub.com/spf13/cobra v1.10.2 h1:DMTTonx5m65Ic0GOoRY2c16WCbHxOOw6xxezuLaBpcU=\ngithub.com/spf13/cobra v1.10.2/go.mod h1:7C1pvHqHw5A4vrJfjNwvOdzYu0Gml16OCs2GRiTUUS4=\ngithub.com/spf13/pflag v1.0.9/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=\ngithub.com/spf13/pflag v1.0.10 h1:4EBh2KAYBwaONj6b2Ye1GiHfwjqyROoF4RwYO+vPwFk=\ngithub.com/spf13/pflag v1.0.10/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=\ngithub.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=\ngithub.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=\ngithub.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=\ngithub.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=\ngithub.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=\ngithub.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=\ngithub.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=\ngithub.com/stretchr/testify v1.6.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=\ngithub.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=\ngithub.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=\ngithub.com/stretchr/testify v1.8.2/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=\ngithub.com/stretchr/testify v1.11.1 
h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=\ngithub.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=\ngithub.com/tj/assert v0.0.0-20171129193455-018094318fb0/go.mod h1:mZ9/Rh9oLWpLLDRpvE+3b7gP/C2YyLFYxNmcLnPTMe0=\ngithub.com/tj/assert v0.0.3 h1:Df/BlaZ20mq6kuai7f5z2TvPFiwC3xaWJSDQNiIS3Rk=\ngithub.com/tj/assert v0.0.3/go.mod h1:Ne6X72Q+TB1AteidzQncjw9PabbMp4PBMZ1k+vd1Pvk=\ngithub.com/tj/go-buffer v1.1.0/go.mod h1:iyiJpfFcR2B9sXu7KvjbT9fpM4mOelRSDTbntVj52Uc=\ngithub.com/tj/go-elastic v0.0.0-20171221160941-36157cbbebc2/go.mod h1:WjeM0Oo1eNAjXGDx2yma7uG2XoyRZTq1uv3M/o7imD0=\ngithub.com/tj/go-kinesis v0.0.0-20171128231115-08b17f58cb1b/go.mod h1:/yhzCV0xPfx6jb1bBgRFjl5lytqVqZXEaeqWP8lTEao=\ngithub.com/tj/go-spin v1.1.0/go.mod h1:Mg1mzmePZm4dva8Qz60H2lHwmJ2loum4VIrLgVnKwh4=\ngithub.com/tonistiigi/go-csvvalue v0.0.0-20240814133006-030d3b2625d0 h1:2f304B10LaZdB8kkVEaoXvAMVan2tl9AiK4G0odjQtE=\ngithub.com/tonistiigi/go-csvvalue v0.0.0-20240814133006-030d3b2625d0/go.mod h1:278M4p8WsNh3n4a1eqiFcV2FGk7wE5fwUpUom9mK9lE=\ngithub.com/vbatts/tar-split v0.12.2 h1:w/Y6tjxpeiFMR47yzZPlPj/FcPLpXbTUi/9H7d3CPa4=\ngithub.com/vbatts/tar-split v0.12.2/go.mod h1:eF6B6i6ftWQcDqEn3/iGFRFRo8cBIMSJVOpnNdfTMFA=\ngithub.com/xanzy/ssh-agent v0.3.3 h1:+/15pJfg/RsTxqYcX6fHqOXZwwMP+2VyYWJeWM2qQFM=\ngithub.com/xanzy/ssh-agent v0.3.3/go.mod h1:6dzNDKs0J9rVPHPhaGCukekBHKqfl+L3KghI1Bc68Uw=\ngithub.com/xeipuuv/gojsonpointer v0.0.0-20180127040702-4e3ac2762d5f h1:J9EGpcZtP0E/raorCMxlFGSTBrsSlaDGf3jU/qvAE2c=\ngithub.com/xeipuuv/gojsonpointer v0.0.0-20180127040702-4e3ac2762d5f/go.mod h1:N2zxlSyiKSe5eX1tZViRH5QA0qijqEDrYZiPEAiq3wU=\ngithub.com/xeipuuv/gojsonreference v0.0.0-20180127040603-bd5ef7bd5415 h1:EzJWgHovont7NscjpAxXsDA8S8BMYve8Y5+7cuRE7R0=\ngithub.com/xeipuuv/gojsonreference v0.0.0-20180127040603-bd5ef7bd5415/go.mod h1:GwrjFmJcFw6At/Gs6z4yjiIwzuJ1/+UwLxMQDVQXShQ=\ngithub.com/xeipuuv/gojsonschema v1.2.0 
h1:LhYJRs+L4fBtjZUfuSZIKGeVu0QRy8e5Xi7D17UxZ74=\ngithub.com/xeipuuv/gojsonschema v1.2.0/go.mod h1:anYRn/JVcOK2ZgGU+IjEV4nwlhoK5sQluxsYJ78Id3Y=\ngithub.com/yuin/goldmark v1.1.27/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=\ngithub.com/yuin/goldmark v1.2.1/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=\ngithub.com/yuin/goldmark v1.3.5/go.mod h1:mwnBkeHKe2W/ZEtQ+71ViKU8L12m81fl3OWwC1Zlc8k=\ngithub.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=\ngo.opentelemetry.io/auto/sdk v1.2.1 h1:jXsnJ4Lmnqd11kwkBV2LgLoFMZKizbCi5fNZ/ipaZ64=\ngo.opentelemetry.io/auto/sdk v1.2.1/go.mod h1:KRTj+aOaElaLi+wW1kO/DZRXwkF4C5xPbEe3ZiIhN7Y=\ngo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.63.0 h1:RbKq8BG0FI8OiXhBfcRtqqHcZcka+gU3cskNuf05R18=\ngo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.63.0/go.mod h1:h06DGIukJOevXaj/xrNjhi/2098RZzcLTbc0jDAUbsg=\ngo.opentelemetry.io/otel v1.40.0 h1:oA5YeOcpRTXq6NN7frwmwFR0Cn3RhTVZvXsP4duvCms=\ngo.opentelemetry.io/otel v1.40.0/go.mod h1:IMb+uXZUKkMXdPddhwAHm6UfOwJyh4ct1ybIlV14J0g=\ngo.opentelemetry.io/otel/metric v1.40.0 h1:rcZe317KPftE2rstWIBitCdVp89A2HqjkxR3c11+p9g=\ngo.opentelemetry.io/otel/metric v1.40.0/go.mod h1:ib/crwQH7N3r5kfiBZQbwrTge743UDc7DTFVZrrXnqc=\ngo.opentelemetry.io/otel/sdk v1.40.0 h1:KHW/jUzgo6wsPh9At46+h4upjtccTmuZCFAc9OJ71f8=\ngo.opentelemetry.io/otel/sdk v1.40.0/go.mod h1:Ph7EFdYvxq72Y8Li9q8KebuYUr2KoeyHx0DRMKrYBUE=\ngo.opentelemetry.io/otel/sdk/metric v1.40.0 h1:mtmdVqgQkeRxHgRv4qhyJduP3fYJRMX4AtAlbuWdCYw=\ngo.opentelemetry.io/otel/sdk/metric v1.40.0/go.mod h1:4Z2bGMf0KSK3uRjlczMOeMhKU2rhUqdWNoKcYrtcBPg=\ngo.opentelemetry.io/otel/trace v1.40.0 h1:WA4etStDttCSYuhwvEa8OP8I5EWu24lkOzp+ZYblVjw=\ngo.opentelemetry.io/otel/trace v1.40.0/go.mod h1:zeAhriXecNGP/s2SEG3+Y8X9ujcJOTqQ5RgdEJcawiA=\ngo.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=\ngo.uber.org/goleak v1.3.0/go.mod 
h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=\ngo.yaml.in/yaml/v2 v2.4.3 h1:6gvOSjQoTB3vt1l+CU+tSyi/HOjfOjRLJ4YwYZGwRO0=\ngo.yaml.in/yaml/v2 v2.4.3/go.mod h1:zSxWcmIDjOzPXpjlTTbAsKokqkDNAVtZO0WOMiT90s8=\ngo.yaml.in/yaml/v3 v3.0.4 h1:tfq32ie2Jv2UxXFdLJdh3jXuOzWiL1fo0bu/FbuKpbc=\ngo.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=\ngolang.org/x/crypto v0.0.0-20180904163835-0709b304e793/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=\ngolang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=\ngolang.org/x/crypto v0.0.0-20190426145343-a29dc8fdc734/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=\ngolang.org/x/crypto v0.0.0-20191011191535-87dc89f01550/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=\ngolang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=\ngolang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=\ngolang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=\ngolang.org/x/crypto v0.0.0-20220722155217-630584e8d5aa/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=\ngolang.org/x/crypto v0.17.0/go.mod h1:gCAAfMLgwOJRpTjQ2zCCt2OcSfYMTeZVSRtQlPC7Nq4=\ngolang.org/x/crypto v0.49.0 h1:+Ng2ULVvLHnJ/ZFEq4KdcDd/cfjrrjjNSXNzxg0Y4U4=\ngolang.org/x/crypto v0.49.0/go.mod h1:ErX4dUh2UM+CFYiXZRTcMpEcN8b/1gxEuv3nODoYtCA=\ngolang.org/x/mod v0.2.0/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=\ngolang.org/x/mod v0.3.0/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=\ngolang.org/x/mod v0.4.2/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=\ngolang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=\ngolang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=\ngolang.org/x/mod v0.34.0 
h1:xIHgNUUnW6sYkcM5Jleh05DvLOtwc6RitGHbDk4akRI=\ngolang.org/x/mod v0.34.0/go.mod h1:ykgH52iCZe79kzLLMhyCUzhMci+nQj+0XkbXpNYtVjY=\ngolang.org/x/net v0.0.0-20180906233101-161cd47e91fd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=\ngolang.org/x/net v0.0.0-20181114220301-adae6a3d119a/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=\ngolang.org/x/net v0.0.0-20190311183353-d8887717615a/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=\ngolang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=\ngolang.org/x/net v0.0.0-20190613194153-d28f0bde5980/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20200226121028-0de0cce0169b/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20201021035429-f5854403a974/go.mod h1:sp8m0HH+o8qH0wwXwYZr8TS3Oi6o0r6Gce1SSxlDquU=\ngolang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=\ngolang.org/x/net v0.0.0-20210405180319-a5a99cb37ef4/go.mod h1:p54w0d4576C0XHj96bSt6lcn1PtDYWL6XObtHCRCNQM=\ngolang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=\ngolang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=\ngolang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=\ngolang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=\ngolang.org/x/net v0.52.0 h1:He/TN1l0e4mmR3QqHMT2Xab3Aj3L9qjbhRm78/6jrW0=\ngolang.org/x/net v0.52.0/go.mod h1:R1MAz7uMZxVMualyPXb+VaqGSa3LIaUqk0eEt3w36Sw=\ngolang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=\ngolang.org/x/oauth2 v0.36.0 h1:peZ/1z27fi9hUOFCAZaHyrpWG5lwe0RJEEEeH0ThlIs=\ngolang.org/x/oauth2 v0.36.0/go.mod 
h1:YDBUJMTkDnJS+A4BP4eZBjCqtokkg1hODuPjwiGPO7Q=\ngolang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20190911185100-cd5d95a43a6e/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20201020160332-67f06af15bc9/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20210220032951-036812b2e83c/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.20.0 h1:e0PTpb7pjO8GAtTs2dQ6jYa5BWYlMuX047Dco/pItO4=\ngolang.org/x/sync v0.20.0/go.mod h1:9xrNwdLfx4jkKbNva9FpL6vEN7evnE43NNNJQ2LF3+0=\ngolang.org/x/sys v0.0.0-20180905080454-ebe1bf3edb33/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20180909124046-d0be0721c37e/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20181116152217-5ac8a444bdc5/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20190222072716-a9d3bda3a223/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190529164535-6a60838ec259/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190801041406-cbf593c0f2f3/go.mod 
h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20191026070338-33540a1f6037/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200930185726-fdedc70b468f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210124154548-22da62e12c0c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210330210617-4fbd30eecc44/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210510120138-977fb7262007/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20210616094352-59db8d763f22/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.15.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=\ngolang.org/x/sys v0.42.0 h1:omrd2nAlyT5ESRdCLYdm3+fMfNFE/+Rf4bDIQImRJeo=\ngolang.org/x/sys v0.42.0/go.mod h1:4GL1E5IUh+htKOUEOaiffhrAeqysfVGipDYzABqnCmw=\ngolang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=\ngolang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod 
h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=\ngolang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=\ngolang.org/x/term v0.8.0/go.mod h1:xPskH00ivmX89bAKVGSKKtLOWNx2+17Eiy94tnKShWo=\ngolang.org/x/term v0.15.0/go.mod h1:BDl952bC7+uMoWR75FIrCDx79TPU9oHkTZ9yRbYOrX0=\ngolang.org/x/term v0.41.0 h1:QCgPso/Q3RTJx2Th4bDLqML4W6iJiaXFq2/ftQF13YU=\ngolang.org/x/term v0.41.0/go.mod h1:3pfBgksrReYfZ5lvYM0kSO0LIkAl4Yl2bXOkKP7Ec2A=\ngolang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=\ngolang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=\ngolang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=\ngolang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=\ngolang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=\ngolang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=\ngolang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=\ngolang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=\ngolang.org/x/text v0.35.0 h1:JOVx6vVDFokkpaq1AEptVzLTpDe9KGpj5tR4/X+ybL8=\ngolang.org/x/text v0.35.0/go.mod h1:khi/HExzZJ2pGnjenulevKNX1W67CUy0AsXcNubPGCA=\ngolang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=\ngolang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.0.0-20200619180055-7c47624df98f/go.mod h1:EkVYQZoAsY45+roYkvgYkIh4xh/qjgUK9TdY2XT94GE=\ngolang.org/x/tools v0.0.0-20210106214847-113979e3529a/go.mod h1:emZCQorbCU4vsT4fOWvOPXz4eW1wZW4PmDk9uLelYpA=\ngolang.org/x/tools v0.1.1/go.mod h1:o0xws9oXOQQZyjljx8fwUC0k7L1pTE6eaCbjGeHmOkk=\ngolang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=\ngolang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=\ngolang.org/x/tools v0.43.0 
h1:12BdW9CeB3Z+J/I/wj34VMl8X+fEXBxVR90JeMX5E7s=\ngolang.org/x/tools v0.43.0/go.mod h1:uHkMso649BX2cZK6+RpuIPXS3ho2hZo4FVwfoy1vIk0=\ngolang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngolang.org/x/xerrors v0.0.0-20191011141410-1b5146add898/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngolang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngolang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngoogle.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=\ngoogle.golang.org/protobuf v1.36.11 h1:fV6ZwhNocDyBLK0dj+fg8ektcVegBBuEolpbTQyBNVE=\ngoogle.golang.org/protobuf v1.36.11/go.mod h1:HTf+CrKn2C3g5S8VImy6tdcUvCska2kB7j23XfzDpco=\ngopkg.in/alecthomas/kingpin.v2 v2.2.6/go.mod h1:FMv+mEhP44yOT+4EoQTLFTRgOQ1FBLkstjWtayDeSgw=\ngopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=\ngopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=\ngopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=\ngopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=\ngopkg.in/fsnotify.v1 v1.4.7/go.mod h1:Tz8NjZHkW78fSQdbUxIjBTcgA1z1m8ZHf0WmKUhAMys=\ngopkg.in/tomb.v1 v1.0.0-20141024135613-dd632973f1e7/go.mod h1:dt/ZhP58zS4L8KSrWDmTeBkI65Dw0HsyUHuEVlX15mw=\ngopkg.in/warnings.v0 v0.1.2 h1:wFXVbFY8DY5/xOe1ECiWdKCzZlxgshcYVNkBHstARME=\ngopkg.in/warnings.v0 v0.1.2/go.mod h1:jksf8JmL6Qr/oQM2OXTHunEvvTAsrWBLb6OOjuVWRNI=\ngopkg.in/yaml.v2 v2.2.1/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=\ngopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=\ngopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=\ngopkg.in/yaml.v3 
v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=\ngopkg.in/yaml.v3 v3.0.0-20200605160147-a5ece683394c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=\ngopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=\ngopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=\ngotest.tools/v3 v3.5.2 h1:7koQfIKdy+I8UTetycgUqXWSDwpgv193Ka+qRsmBY8Q=\ngotest.tools/v3 v3.5.2/go.mod h1:LtdLGcnqToBH83WByAAi/wiwSFCArdFIUV/xxN4pcjA=\npgregory.net/rapid v1.2.0 h1:keKAYRcjm+e1F0oAuU5F5+YPAWcyxNNRK2wud503Gnk=\npgregory.net/rapid v1.2.0/go.mod h1:PY5XlDGj0+V1FCq0o192FdRhpKHGTRIWBgqjDBTrq04=\n"
  },
  {
    "path": "golangci.yaml",
    "content": "version: \"2\"\nlinters:\n  default: none\n  enable:\n    - bodyclose\n    - dogsled\n    - gocritic\n    - govet\n    - ineffassign\n    - misspell\n    - nakedret\n    - revive\n    - rowserrcheck\n    - staticcheck\n    - unconvert\n    - unused\n    - whitespace\n  settings:\n    revive:\n      rules:\n        - name: error-strings\n          disabled: true\n  exclusions:\n    generated: lax\n    presets:\n      - comments\n      - common-false-positives\n      - legacy\n      - std-error-handling\n    paths:\n      - third_party$\n      - builtin$\n      - examples$\nformatters:\n  enable:\n    - goimports\n  settings:\n    goimports:\n      local-prefixes:\n        - github.com/buildpacks/pack\n  exclusions:\n    generated: lax\n    paths:\n      - third_party$\n      - builtin$\n      - examples$\nissues:\n  default: info\n  rules:\n    - linters:\n      - staticcheck: info"
  },
  {
    "path": "internal/build/container_ops.go",
    "content": "package build\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"runtime\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\tdockerClient \"github.com/moby/moby/client\"\n\n\tdarchive \"github.com/moby/go-archive\"\n\t\"github.com/pkg/errors\"\n\n\tcerrdefs \"github.com/containerd/errdefs\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/container\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n)\n\ntype ContainerOperation func(ctrClient DockerClient, ctx context.Context, containerID string, stdout, stderr io.Writer) error\n\n// CopyOut copies container directories to a handler function. The handler is responsible for closing the Reader.\nfunc CopyOut(handler func(closer io.ReadCloser) error, srcs ...string) ContainerOperation {\n\treturn func(ctrClient DockerClient, ctx context.Context, containerID string, stdout, stderr io.Writer) error {\n\t\tfor _, src := range srcs {\n\t\t\tresult, err := ctrClient.CopyFromContainer(ctx, containerID, dockerClient.CopyFromContainerOptions{SourcePath: src})\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\treader := result.Content\n\n\t\t\terr = handler(reader)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\n\t\treturn nil\n\t}\n}\n\n// CopyOutMaybe copies container directories to a handler function. 
The handler is responsible for closing the Reader.\n// CopyOutMaybe differs from CopyOut in that it will silently continue to the next source file if the file reader cannot be instantiated\n// because the source file does not exist in the container.\nfunc CopyOutMaybe(handler func(closer io.ReadCloser) error, srcs ...string) ContainerOperation {\n\treturn func(ctrClient DockerClient, ctx context.Context, containerID string, stdout, stderr io.Writer) error {\n\t\tfor _, src := range srcs {\n\t\t\tresult, err := ctrClient.CopyFromContainer(ctx, containerID, dockerClient.CopyFromContainerOptions{SourcePath: src})\n\t\t\tif err != nil {\n\t\t\t\tif cerrdefs.IsNotFound(err) {\n\t\t\t\t\tcontinue\n\t\t\t\t}\n\t\t\t\treturn err\n\t\t\t}\n\t\t\treader := result.Content\n\n\t\t\terr = handler(reader)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\n\t\treturn nil\n\t}\n}\n\nfunc CopyOutTo(src, dest string) ContainerOperation {\n\treturn CopyOut(func(reader io.ReadCloser) error {\n\t\tinfo := darchive.CopyInfo{\n\t\t\tPath:  src,\n\t\t\tIsDir: true,\n\t\t}\n\n\t\tdefer reader.Close()\n\t\treturn darchive.CopyTo(reader, info, dest)\n\t}, src)\n}\n\nfunc CopyOutToMaybe(src, dest string) ContainerOperation {\n\treturn CopyOutMaybe(func(reader io.ReadCloser) error {\n\t\tinfo := darchive.CopyInfo{\n\t\t\tPath:  src,\n\t\t\tIsDir: true,\n\t\t}\n\n\t\tdefer reader.Close()\n\t\treturn darchive.CopyTo(reader, info, dest)\n\t}, src)\n}\n\n// CopyDir copies a local directory (src) to the destination on the container while filtering files and changing its UID/GID.\n// If includeRoot is set, the UID/GID will be set on the dst directory.\nfunc CopyDir(src, dst string, uid, gid int, os string, includeRoot bool, fileFilter func(string) bool) ContainerOperation {\n\treturn func(ctrClient DockerClient, ctx context.Context, containerID string, stdout, stderr io.Writer) error {\n\t\ttarPath := dst\n\t\tif os == \"windows\" {\n\t\t\ttarPath = 
paths.WindowsToSlash(dst)\n\t\t}\n\n\t\treader, err := createReader(src, tarPath, uid, gid, includeRoot, fileFilter)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"create tar archive from '%s'\", src)\n\t\t}\n\t\tdefer reader.Close()\n\n\t\tif os == \"windows\" {\n\t\t\treturn copyDirWindows(ctx, ctrClient, containerID, reader, dst, stdout, stderr)\n\t\t}\n\t\treturn copyDir(ctx, ctrClient, containerID, reader)\n\t}\n}\n\nfunc copyDir(ctx context.Context, ctrClient DockerClient, containerID string, appReader io.Reader) error {\n\tvar clientErr, err error\n\n\tdoneChan := make(chan interface{})\n\tpr, pw := io.Pipe()\n\tgo func() {\n\t\t_, clientErr = ctrClient.CopyToContainer(ctx, containerID, dockerClient.CopyToContainerOptions{\n\t\t\tDestinationPath: \"/\",\n\t\t\tContent:         pr,\n\t\t})\n\t\tclose(doneChan)\n\t}()\n\tfunc() {\n\t\tdefer pw.Close()\n\t\t_, err = io.Copy(pw, appReader)\n\t}()\n\n\t<-doneChan\n\tif err == nil {\n\t\terr = clientErr\n\t}\n\n\treturn err\n}\n\n// copyDirWindows provides an alternate, Windows container-specific implementation of copyDir.\n// This implementation is needed because copying directly to a mounted volume is currently buggy\n// for Windows containers and does not work. 
Instead, we perform the copy from inside a container\n// using xcopy.\n// See: https://github.com/moby/moby/issues/40771\nfunc copyDirWindows(ctx context.Context, ctrClient DockerClient, containerID string, reader io.Reader, dst string, stdout, stderr io.Writer) error {\n\tinspectResult, err := ctrClient.ContainerInspect(ctx, containerID, dockerClient.ContainerInspectOptions{})\n\tif err != nil {\n\t\treturn err\n\t}\n\tinfo := inspectResult.Container\n\n\tbaseName := paths.WindowsBasename(dst)\n\n\tmnt, err := findMount(info, dst)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tctr, err := ctrClient.ContainerCreate(ctx, dockerClient.ContainerCreateOptions{\n\t\tConfig: &dcontainer.Config{\n\t\t\tImage: info.Image,\n\t\t\tCmd: []string{\n\t\t\t\t\"cmd\",\n\t\t\t\t\"/c\",\n\n\t\t\t\t// xcopy args\n\t\t\t\t// e - recursively create subdirectories\n\t\t\t\t// h - copy hidden and system files\n\t\t\t\t// b - copy symlinks, do not dereference\n\t\t\t\t// x - copy attributes\n\t\t\t\t// y - suppress prompting\n\t\t\t\tfmt.Sprintf(`xcopy c:\\windows\\%s %s /e /h /b /x /y`, baseName, dst),\n\t\t\t},\n\t\t\tWorkingDir: \"/\",\n\t\t\tUser:       windowsContainerAdmin,\n\t\t},\n\t\tHostConfig: &dcontainer.HostConfig{\n\t\t\tBinds:     []string{fmt.Sprintf(\"%s:%s\", mnt.Name, mnt.Destination)},\n\t\t\tIsolation: dcontainer.IsolationProcess,\n\t\t},\n\t})\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"creating prep container\")\n\t}\n\tdefer ctrClient.ContainerRemove(context.Background(), ctr.ID, dockerClient.ContainerRemoveOptions{Force: true})\n\n\t_, err = ctrClient.CopyToContainer(ctx, ctr.ID, dockerClient.CopyToContainerOptions{\n\t\tDestinationPath: \"/windows\",\n\t\tContent:         reader,\n\t})\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"copy app to container\")\n\t}\n\n\treturn container.RunWithHandler(\n\t\tctx,\n\t\tctrClient,\n\t\tctr.ID,\n\t\tcontainer.DefaultHandler(\n\t\t\tio.Discard, // Suppress xcopy output\n\t\t\tstderr,\n\t\t),\n\t)\n}\n\nfunc 
findMount(info dcontainer.InspectResponse, dst string) (dcontainer.MountPoint, error) {\n\tfor _, m := range info.Mounts {\n\t\tif m.Destination == dst {\n\t\t\treturn m, nil\n\t\t}\n\t}\n\treturn dcontainer.MountPoint{}, fmt.Errorf(\"no matching mount found for %s\", dst)\n}\n\nfunc writeToml(ctrClient DockerClient, ctx context.Context, data interface{}, dstPath string, containerID string, os string, stdout, stderr io.Writer) error {\n\tbuf := &bytes.Buffer{}\n\terr := toml.NewEncoder(buf).Encode(data)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"marshaling data to %s\", dstPath)\n\t}\n\n\ttarBuilder := archive.TarBuilder{}\n\n\ttarPath := dstPath\n\tif os == \"windows\" {\n\t\ttarPath = paths.WindowsToSlash(dstPath)\n\t}\n\n\ttarBuilder.AddFile(tarPath, 0755, archive.NormalizedDateTime, buf.Bytes())\n\treader := tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\tdefer reader.Close()\n\n\tif os == \"windows\" {\n\t\tdirName := paths.WindowsDir(dstPath)\n\t\treturn copyDirWindows(ctx, ctrClient, containerID, reader, dirName, stdout, stderr)\n\t}\n\n\t_, err = ctrClient.CopyToContainer(ctx, containerID, dockerClient.CopyToContainerOptions{\n\t\tDestinationPath: \"/\",\n\t\tContent:         reader,\n\t})\n\treturn err\n}\n\n// WriteProjectMetadata writes a `project-metadata.toml` based on the ProjectMetadata provided to the destination path.\nfunc WriteProjectMetadata(dstPath string, metadata files.ProjectMetadata, os string) ContainerOperation {\n\treturn func(ctrClient DockerClient, ctx context.Context, containerID string, stdout, stderr io.Writer) error {\n\t\treturn writeToml(ctrClient, ctx, metadata, dstPath, containerID, os, stdout, stderr)\n\t}\n}\n\n// WriteStackToml writes a `stack.toml` based on the StackMetadata provided to the destination path.\nfunc WriteStackToml(dstPath string, stack builder.StackMetadata, os string) ContainerOperation {\n\treturn func(ctrClient DockerClient, ctx context.Context, containerID string, stdout, stderr io.Writer) 
error {\n\t\treturn writeToml(ctrClient, ctx, stack, dstPath, containerID, os, stdout, stderr)\n\t}\n}\n\n// WriteRunToml writes a `run.toml` based on the RunConfig provided to the destination path.\nfunc WriteRunToml(dstPath string, runImages []builder.RunImageMetadata, os string) ContainerOperation {\n\trunImageData := builder.RunImages{\n\t\tImages: runImages,\n\t}\n\treturn func(ctrClient DockerClient, ctx context.Context, containerID string, stdout, stderr io.Writer) error {\n\t\treturn writeToml(ctrClient, ctx, runImageData, dstPath, containerID, os, stdout, stderr)\n\t}\n}\n\nfunc createReader(src, dst string, uid, gid int, includeRoot bool, fileFilter func(string) bool) (io.ReadCloser, error) {\n\tfi, err := os.Stat(src)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif fi.IsDir() {\n\t\tvar mode int64 = -1\n\t\tif runtime.GOOS == \"windows\" {\n\t\t\tmode = 0777\n\t\t}\n\n\t\treturn archive.ReadDirAsTar(src, dst, uid, gid, mode, false, includeRoot, fileFilter), nil\n\t}\n\n\treturn archive.ReadZipAsTar(src, dst, uid, gid, -1, false, fileFilter), nil\n}\n\n// EnsureVolumeAccess grants full access permissions to volumes for UID/GID-based user\n// When UID/GID are 0 it grants explicit full access to BUILTIN\\Administrators and any other UID/GID grants full access to BUILTIN\\Users\n// Changing permissions on volumes through stopped containers does not work on Docker for Windows so we start the container and make change using icacls\n// See: https://github.com/moby/moby/issues/40771\nfunc EnsureVolumeAccess(uid, gid int, os string, volumeNames ...string) ContainerOperation {\n\treturn func(ctrClient DockerClient, ctx context.Context, containerID string, stdout, stderr io.Writer) error {\n\t\tif os != \"windows\" {\n\t\t\treturn nil\n\t\t}\n\n\t\tinspectResult, err := ctrClient.ContainerInspect(ctx, containerID, dockerClient.ContainerInspectOptions{})\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tcontainerInfo := inspectResult.Container\n\n\t\tcmd := 
\"\"\n\t\tbinds := []string{}\n\t\tfor i, volumeName := range volumeNames {\n\t\t\tcontainerPath := fmt.Sprintf(\"c:/volume-mnt-%d\", i)\n\t\t\tbinds = append(binds, fmt.Sprintf(\"%s:%s\", volumeName, containerPath))\n\n\t\t\tif cmd != \"\" {\n\t\t\t\tcmd += \"&&\"\n\t\t\t}\n\n\t\t\t// icacls args\n\t\t\t// /grant - add new permissions instead of replacing\n\t\t\t// (OI) - object inherit\n\t\t\t// (CI) - container inherit\n\t\t\t// F - full access\n\t\t\t// /t - recursively apply\n\t\t\t// /l - perform on a symbolic link itself versus its target\n\t\t\t// /q - suppress success messages\n\t\t\tcmd += fmt.Sprintf(`icacls %s /grant *%s:(OI)(CI)F /t /l /q`, containerPath, paths.WindowsPathSID(uid, gid))\n\t\t}\n\n\t\tctr, err := ctrClient.ContainerCreate(ctx, dockerClient.ContainerCreateOptions{\n\t\t\tConfig: &dcontainer.Config{\n\t\t\t\tImage:      containerInfo.Image,\n\t\t\t\tCmd:        []string{\"cmd\", \"/c\", cmd},\n\t\t\t\tWorkingDir: \"/\",\n\t\t\t\tUser:       windowsContainerAdmin,\n\t\t\t},\n\t\t\tHostConfig: &dcontainer.HostConfig{\n\t\t\t\tBinds:     binds,\n\t\t\t\tIsolation: dcontainer.IsolationProcess,\n\t\t\t},\n\t\t})\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tdefer ctrClient.ContainerRemove(context.Background(), ctr.ID, dockerClient.ContainerRemoveOptions{Force: true})\n\n\t\treturn container.RunWithHandler(\n\t\t\tctx,\n\t\t\tctrClient,\n\t\t\tctr.ID,\n\t\t\tcontainer.DefaultHandler(\n\t\t\t\tio.Discard, // Suppress icacls output\n\t\t\t\tstderr,\n\t\t\t),\n\t\t)\n\t}\n}\n"
  },
  {
    "path": "internal/build/container_ops_test.go",
    "content": "package build_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"io\"\n\t\"io/fs\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/heroku/color\"\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\t\"github.com/moby/moby/api/types/mount\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/container\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nvar ctrClient *client.Client\n\n// TestContainerOperations are integration tests for the container operations against a docker daemon\nfunc TestContainerOperations(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\th.RequireDocker(t)\n\n\tvar err error\n\tctrClient, err = client.New(client.FromEnv)\n\th.AssertNil(t, err)\n\n\tspec.Run(t, \"container-ops\", testContainerOps, spec.Report(report.Terminal{}), spec.Sequential())\n}\n\nfunc testContainerOps(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\timageName string\n\t\tosType    string\n\t)\n\n\tit.Before(func() {\n\t\timageName = \"container-ops.test-\" + h.RandString(10)\n\n\t\tinfoResult, err := ctrClient.Info(context.TODO(), client.InfoOptions{})\n\t\th.AssertNil(t, err)\n\t\tosType = infoResult.Info.OSType\n\n\t\tdockerfileContent := `FROM busybox`\n\t\tif osType == \"windows\" {\n\t\t\tdockerfileContent = `FROM mcr.microsoft.com/windows/nanoserver:1809`\n\t\t}\n\n\t\th.CreateImage(t, ctrClient, imageName, dockerfileContent)\n\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\th.DockerRmi(ctrClient, imageName)\n\t})\n\n\twhen(\"#CopyDir\", func() {\n\t\tit(\"writes contents with proper owner/permissions\", func() {\n\t\t\tcontainerDir := \"/some-vol\"\n\t\t\tif osType == \"windows\" 
{\n\t\t\t\tcontainerDir = `c:\\some-vol`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/some-vol\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q /s c:\\some-vol`}\n\t\t\t}\n\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\t// chmod in case umask sets the wrong bits during a `git clone`.\n\t\t\tdir := filepath.Join(\"testdata\", \"fake-app\")\n\t\t\terr = filepath.WalkDir(dir, func(path string, d fs.DirEntry, err error) error {\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t\tif d.IsDir() {\n\t\t\t\t\treturn nil\n\t\t\t\t}\n\n\t\t\t\treturn os.Chmod(path, 0644)\n\t\t\t})\n\t\t\th.AssertNil(t, err)\n\t\t\tcopyDirOp := build.CopyDir(dir, containerDir, 123, 456, osType, false, nil)\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = copyDirOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\tif osType == \"windows\" {\n\t\t\t\t// Expected WCOW results\n\t\t\t\th.AssertContainsMatch(t, strings.ReplaceAll(outBuf.String(), \"\\r\", \"\"), `\n(.*)    <DIR>          ...                    .\n(.*)    <DIR>          ...                    ..\n(.*)                17 ...                    fake-app-file\n(.*)    <SYMLINK>      ...                    fake-app-symlink \\[fake-app-file\\]\n(.*)                 0 ...                    
file-to-ignore\n`)\n\t\t\t} else {\n\t\t\t\tif runtime.GOOS == \"windows\" {\n\t\t\t\t\t// Expected LCOW results\n\t\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `\n-rwxrwxrwx    1 123      456 (.*) fake-app-file\nlrwxrwxrwx    1 123      456 (.*) fake-app-symlink -> fake-app-file\n-rwxrwxrwx    1 123      456 (.*) file-to-ignore\n`)\n\t\t\t\t} else {\n\t\t\t\t\t// Expected results\n\t\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `\n-rw-r--r--    1 123      456 (.*) fake-app-file\nlrwxrwxrwx    1 123      456 (.*) fake-app-symlink -> fake-app-file\n-rw-r--r--    1 123      456 (.*) file-to-ignore\n`)\n\t\t\t\t}\n\t\t\t}\n\t\t})\n\n\t\twhen(\"includeRoot\", func() {\n\t\t\tit(\"copies root dir with new GID, UID and permissions\", func() {\n\t\t\t\tcontainerDir := \"/some-vol\"\n\t\t\t\tif osType == \"windows\" {\n\t\t\t\t\tcontainerDir = `c:\\some-vol`\n\t\t\t\t}\n\n\t\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/\"}\n\t\t\t\tif osType == \"windows\" {\n\t\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q c:`}\n\t\t\t\t}\n\n\t\t\t\tctx := context.Background()\n\t\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\t\tcopyDirOp := build.CopyDir(filepath.Join(\"testdata\", \"fake-app\"), containerDir, 123, 456, osType, true, nil)\n\n\t\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\t\terr = copyDirOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\t\tif osType == \"windows\" {\n\t\t\t\t\t// Expected WCOW results\n\t\t\t\t\th.AssertContainsMatch(t, strings.ReplaceAll(outBuf.String(), \"\\r\", \"\"), `\n(.*)    <DIR>          ...                    
some-vol\n`)\n\t\t\t\t} else {\n\t\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `\ndrwxrwxrwx    2 123      456 (.*) some-vol\n`)\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\n\t\tit(\"writes contents ignoring from file filter\", func() {\n\t\t\tcontainerDir := \"/some-vol\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\some-vol`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/some-vol\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q /s /n c:\\some-vol`}\n\t\t\t}\n\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\tcopyDirOp := build.CopyDir(filepath.Join(\"testdata\", \"fake-app\"), containerDir, 123, 456, osType, false, func(filename string) bool {\n\t\t\t\treturn filepath.Base(filename) != \"file-to-ignore\"\n\t\t\t})\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = copyDirOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\th.AssertContains(t, outBuf.String(), \"fake-app-file\")\n\t\t\th.AssertNotContains(t, outBuf.String(), \"file-to-ignore\")\n\t\t})\n\n\t\tit(\"writes contents from zip file\", func() {\n\t\t\tcontainerDir := \"/some-vol\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\some-vol`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/some-vol\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q /s /n c:\\some-vol`}\n\t\t\t}\n\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\tcopyDirOp := build.CopyDir(filepath.Join(\"testdata\", 
\"fake-app.zip\"), containerDir, 123, 456, osType, false, nil)\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = copyDirOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\tif osType == \"windows\" {\n\t\t\t\th.AssertContainsMatch(t, strings.ReplaceAll(outBuf.String(), \"\\r\", \"\"), `\n(.*)    <DIR>          ...                    .\n(.*)    <DIR>          ...                    ..\n(.*)                17 ...                    fake-app-file\n`)\n\t\t\t} else {\n\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `\n-rw-r--r--    1 123      456 (.*) fake-app-file\n`)\n\t\t\t}\n\t\t})\n\t})\n\n\twhen(\"#CopyOut\", func() {\n\t\tit(\"reads the contents of a container directory\", func() {\n\t\t\th.SkipIf(t, osType == \"windows\", \"copying directories out of windows containers not yet supported\")\n\n\t\t\tcontainerDir := \"/some-vol\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\some-vol`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/some-vol\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q /s c:\\some-vol`}\n\t\t\t}\n\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\tcopyDirOp := build.CopyDir(filepath.Join(\"testdata\", \"fake-app\"), containerDir, 123, 456, osType, false, nil)\n\t\t\terr = copyDirOp(ctrClient, ctx, ctr.ID, io.Discard, io.Discard)\n\t\t\th.AssertNil(t, err)\n\n\t\t\ttarDestination, err := os.CreateTemp(\"\", \"pack.container.ops.test.\")\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer os.RemoveAll(tarDestination.Name())\n\n\t\t\thandler := func(reader io.ReadCloser) error {\n\t\t\t\tdefer reader.Close()\n\n\t\t\t\tcontents, err := 
io.ReadAll(reader)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = os.WriteFile(tarDestination.Name(), contents, 0600)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treturn nil\n\t\t\t}\n\n\t\t\tcopyOutDirsOp := build.CopyOut(handler, containerDir)\n\t\t\terr = copyOutDirsOp(ctrClient, ctx, ctr.ID, io.Discard, io.Discard)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(io.Discard, io.Discard))\n\t\t\th.AssertNil(t, err)\n\n\t\t\tseparator := \"/\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tseparator = `\\`\n\t\t\t}\n\n\t\t\th.AssertTarball(t, tarDestination.Name())\n\t\t\th.AssertTarHasFile(t, tarDestination.Name(), fmt.Sprintf(\"some-vol%sfake-app-file\", separator))\n\t\t\th.AssertTarHasFile(t, tarDestination.Name(), fmt.Sprintf(\"some-vol%sfake-app-symlink\", separator))\n\t\t\th.AssertTarHasFile(t, tarDestination.Name(), fmt.Sprintf(\"some-vol%sfile-to-ignore\", separator))\n\t\t})\n\t})\n\n\twhen(\"#CopyOutMaybe\", func() {\n\t\tit(\"reads the contents of a container directory\", func() {\n\t\t\th.SkipIf(t, osType == \"windows\", \"copying directories out of windows containers not yet supported\")\n\n\t\t\tcontainerDir := \"/some-vol\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\some-vol`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/some-vol\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q /s c:\\some-vol`}\n\t\t\t}\n\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\tcopyDirOp := build.CopyDir(filepath.Join(\"testdata\", \"fake-app\"), containerDir, 123, 456, osType, false, nil)\n\t\t\terr = copyDirOp(ctrClient, ctx, ctr.ID, io.Discard, io.Discard)\n\t\t\th.AssertNil(t, err)\n\n\t\t\ttarDestination, err := os.CreateTemp(\"\", \"pack.container.ops.test.\")\n\t\t\th.AssertNil(t, 
err)\n\t\t\tdefer os.RemoveAll(tarDestination.Name())\n\n\t\t\thandler := func(reader io.ReadCloser) error {\n\t\t\t\tdefer reader.Close()\n\n\t\t\t\tcontents, err := io.ReadAll(reader)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = os.WriteFile(tarDestination.Name(), contents, 0600)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treturn nil\n\t\t\t}\n\n\t\t\tcopyOutDirsOp := build.CopyOutMaybe(handler, containerDir)\n\t\t\terr = copyOutDirsOp(ctrClient, ctx, ctr.ID, io.Discard, io.Discard)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(io.Discard, io.Discard))\n\t\t\th.AssertNil(t, err)\n\n\t\t\tseparator := \"/\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tseparator = `\\`\n\t\t\t}\n\n\t\t\th.AssertTarball(t, tarDestination.Name())\n\t\t\th.AssertTarHasFile(t, tarDestination.Name(), fmt.Sprintf(\"some-vol%sfake-app-file\", separator))\n\t\t\th.AssertTarHasFile(t, tarDestination.Name(), fmt.Sprintf(\"some-vol%sfake-app-symlink\", separator))\n\t\t\th.AssertTarHasFile(t, tarDestination.Name(), fmt.Sprintf(\"some-vol%sfile-to-ignore\", separator))\n\t\t})\n\t})\n\n\twhen(\"#WriteStackToml\", func() {\n\t\tit(\"writes file\", func() {\n\t\t\tcontainerDir := \"/layers-vol\"\n\t\t\tcontainerPath := \"/layers-vol/stack.toml\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\layers-vol`\n\t\t\t\tcontainerPath = `c:\\layers-vol\\stack.toml`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/layers-vol/stack.toml\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q /n c:\\layers-vol\\stack.toml`}\n\t\t\t}\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\twriteOp := build.WriteStackToml(containerPath, builder.StackMetadata{\n\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\tImage: 
\"image-1\",\n\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\"mirror-1\",\n\t\t\t\t\t\t\"mirror-2\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}, osType)\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = writeOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\tif osType == \"windows\" {\n\t\t\t\th.AssertContains(t, outBuf.String(), `01/01/1980  12:00 AM                69 ...                    stack.toml`)\n\t\t\t} else {\n\t\t\t\th.AssertContains(t, outBuf.String(), `-rwxr-xr-x    1 root     root            69 Jan  1  1980 /layers-vol/stack.toml`)\n\t\t\t}\n\t\t})\n\n\t\tit(\"has expected contents\", func() {\n\t\t\tcontainerDir := \"/layers-vol\"\n\t\t\tcontainerPath := \"/layers-vol/stack.toml\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\layers-vol`\n\t\t\t\tcontainerPath = `c:\\layers-vol\\stack.toml`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"cat\", \"/layers-vol/stack.toml\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `type c:\\layers-vol\\stack.toml`}\n\t\t\t}\n\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\twriteOp := build.WriteStackToml(containerPath, builder.StackMetadata{\n\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\tImage: \"image-1\",\n\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\"mirror-1\",\n\t\t\t\t\t\t\"mirror-2\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}, osType)\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = writeOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, 
errBuf.String(), \"\")\n\t\t\th.AssertContains(t, outBuf.String(), `[run-image]\n  image = \"image-1\"\n  mirrors = [\"mirror-1\", \"mirror-2\"]\n`)\n\t\t})\n\t})\n\n\twhen(\"#WriteRunToml\", func() {\n\t\tit(\"writes file\", func() {\n\t\t\tcontainerDir := \"/layers-vol\"\n\t\t\tcontainerPath := \"/layers-vol/run.toml\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\layers-vol`\n\t\t\t\tcontainerPath = `c:\\layers-vol\\run.toml`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/layers-vol/run.toml\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q /n c:\\layers-vol\\run.toml`}\n\t\t\t}\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\twriteOp := build.WriteRunToml(containerPath, []builder.RunImageMetadata{builder.RunImageMetadata{\n\t\t\t\tImage: \"image-1\",\n\t\t\t\tMirrors: []string{\n\t\t\t\t\t\"mirror-1\",\n\t\t\t\t\t\"mirror-2\",\n\t\t\t\t},\n\t\t\t},\n\t\t\t}, osType)\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = writeOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\tif osType == \"windows\" {\n\t\t\t\th.AssertContains(t, outBuf.String(), `01/01/1980  12:00 AM                68 ...                    
run.toml`)\n\t\t\t} else {\n\t\t\t\th.AssertContains(t, outBuf.String(), `-rwxr-xr-x    1 root     root            68 Jan  1  1980 /layers-vol/run.toml`)\n\t\t\t}\n\t\t})\n\n\t\tit(\"has expected contents\", func() {\n\t\t\tcontainerDir := \"/layers-vol\"\n\t\t\tcontainerPath := \"/layers-vol/run.toml\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\layers-vol`\n\t\t\t\tcontainerPath = `c:\\layers-vol\\run.toml`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"cat\", \"/layers-vol/run.toml\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `type c:\\layers-vol\\run.toml`}\n\t\t\t}\n\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\twriteOp := build.WriteRunToml(containerPath, []builder.RunImageMetadata{\n\t\t\t\t{\n\t\t\t\t\tImage: \"image-1\",\n\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\"mirror-1\",\n\t\t\t\t\t\t\"mirror-2\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tImage: \"image-2\",\n\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\"mirror-3\",\n\t\t\t\t\t\t\"mirror-4\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}, osType)\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = writeOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\th.AssertContains(t, outBuf.String(), `[[images]]\n  image = \"image-1\"\n  mirrors = [\"mirror-1\", \"mirror-2\"]\n\n[[images]]\n  image = \"image-2\"\n  mirrors = [\"mirror-3\", \"mirror-4\"]\n`)\n\t\t})\n\t})\n\n\twhen(\"#WriteProjectMetadata\", func() {\n\t\tit(\"writes file\", func() {\n\t\t\tcontainerDir := \"/layers-vol\"\n\t\t\tp := \"/layers-vol/project-metadata.toml\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = 
`c:\\layers-vol`\n\t\t\t\tp = `c:\\layers-vol\\project-metadata.toml`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"ls\", \"-al\", \"/layers-vol/project-metadata.toml\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `dir /q /n c:\\layers-vol\\project-metadata.toml`}\n\t\t\t}\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\twriteOp := build.WriteProjectMetadata(p, files.ProjectMetadata{\n\t\t\t\tSource: &files.ProjectSource{\n\t\t\t\t\tType: \"project\",\n\t\t\t\t\tVersion: map[string]interface{}{\n\t\t\t\t\t\t\"declared\": \"1.0.2\",\n\t\t\t\t\t},\n\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\"url\": \"https://github.com/buildpacks/pack\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}, osType)\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = writeOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\tif osType == \"windows\" {\n\t\t\t\th.AssertContains(t, outBuf.String(), `01/01/1980  12:00 AM               137 ...                    
project-metadata.toml`)\n\t\t\t} else {\n\t\t\t\th.AssertContains(t, outBuf.String(), `-rwxr-xr-x    1 root     root           137 Jan  1  1980 /layers-vol/project-metadata.toml`)\n\t\t\t}\n\t\t})\n\n\t\tit(\"has expected contents\", func() {\n\t\t\tcontainerDir := \"/layers-vol\"\n\t\t\tp := \"/layers-vol/project-metadata.toml\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tcontainerDir = `c:\\layers-vol`\n\t\t\t\tp = `c:\\layers-vol\\project-metadata.toml`\n\t\t\t}\n\n\t\t\tctrCmd := []string{\"cat\", \"/layers-vol/project-metadata.toml\"}\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `type c:\\layers-vol\\project-metadata.toml`}\n\t\t\t}\n\n\t\t\tctx := context.Background()\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\twriteOp := build.WriteProjectMetadata(p, files.ProjectMetadata{\n\t\t\t\tSource: &files.ProjectSource{\n\t\t\t\t\tType: \"project\",\n\t\t\t\t\tVersion: map[string]interface{}{\n\t\t\t\t\t\t\"declared\": \"1.0.2\",\n\t\t\t\t\t},\n\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\"url\": \"https://github.com/buildpacks/pack\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}, osType)\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\t\t\terr = writeOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertContains(t, outBuf.String(), `[source]\n  type = \"project\"\n  [source.version]\n    declared = \"1.0.2\"\n  [source.metadata]\n    url = \"https://github.com/buildpacks/pack\"\n`)\n\t\t})\n\t})\n\n\twhen(\"#EnsureVolumeAccess\", func() {\n\t\tit(\"changes owner of volume\", func() {\n\t\t\th.SkipIf(t, osType != \"windows\", \"no-op for linux\")\n\n\t\t\tctx := context.Background()\n\n\t\t\tctrCmd := 
[]string{\"ls\", \"-al\", \"/my-volume\"}\n\t\t\tcontainerDir := \"/my-volume\"\n\t\t\tif osType == \"windows\" {\n\t\t\t\tctrCmd = []string{\"cmd\", \"/c\", `icacls c:\\my-volume`}\n\t\t\t\tcontainerDir = `c:\\my-volume`\n\t\t\t}\n\n\t\t\tctr, err := createContainer(ctx, imageName, containerDir, osType, ctrCmd...)\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer cleanupContainer(ctx, ctr.ID)\n\n\t\t\tinspectResult, err := ctrClient.ContainerInspect(ctx, ctr.ID, client.ContainerInspectOptions{})\n\t\t\tif err != nil {\n\t\t\t\treturn\n\t\t\t}\n\t\t\tinspect := inspectResult.Container\n\n\t\t\t// use container's current volumes\n\t\t\tvar ctrVolumes []string\n\t\t\tfor _, m := range inspect.Mounts {\n\t\t\t\tif m.Type == mount.TypeVolume {\n\t\t\t\t\tctrVolumes = append(ctrVolumes, m.Name)\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tvar outBuf, errBuf bytes.Buffer\n\n\t\t\t// reuse same volume twice to demonstrate multiple ops\n\t\t\tinitVolumeOp := build.EnsureVolumeAccess(123, 456, osType, ctrVolumes[0], ctrVolumes[0])\n\t\t\terr = initVolumeOp(ctrClient, ctx, ctr.ID, &outBuf, &errBuf)\n\t\t\th.AssertNil(t, err)\n\t\t\terr = container.RunWithHandler(ctx, ctrClient, ctr.ID, container.DefaultHandler(&outBuf, &errBuf))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, errBuf.String(), \"\")\n\t\t\th.AssertContains(t, outBuf.String(), `BUILTIN\\Users:(OI)(CI)(F)`)\n\t\t})\n\t})\n}\n\nfunc createContainer(ctx context.Context, imageName, containerDir, osType string, cmd ...string) (client.ContainerCreateResult, error) {\n\tisolationType := dcontainer.IsolationDefault\n\tif osType == \"windows\" {\n\t\tisolationType = dcontainer.IsolationProcess\n\t}\n\n\treturn ctrClient.ContainerCreate(ctx, client.ContainerCreateOptions{\n\t\tConfig: &dcontainer.Config{\n\t\t\tImage: imageName,\n\t\t\tCmd:   cmd,\n\t\t},\n\t\tHostConfig: &dcontainer.HostConfig{\n\t\t\tBinds:     []string{fmt.Sprintf(\"%s:%s\", fmt.Sprintf(\"tests-volume-%s\", h.RandString(5)), 
filepath.ToSlash(containerDir))},\n\t\t\tIsolation: isolationType,\n\t\t},\n\t})\n}\n\nfunc cleanupContainer(ctx context.Context, ctrID string) {\n\tinspectResult, err := ctrClient.ContainerInspect(ctx, ctrID, client.ContainerInspectOptions{})\n\tif err != nil {\n\t\treturn\n\t}\n\tinspect := inspectResult.Container\n\n\t// remove container\n\t_, err = ctrClient.ContainerRemove(ctx, ctrID, client.ContainerRemoveOptions{})\n\tif err != nil {\n\t\treturn\n\t}\n\n\t// remove volumes\n\tfor _, m := range inspect.Mounts {\n\t\tif m.Type == mount.TypeVolume {\n\t\t\t_, err = ctrClient.VolumeRemove(ctx, m.Name, client.VolumeRemoveOptions{Force: true})\n\t\t\tif err != nil {\n\t\t\t\treturn\n\t\t\t}\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "internal/build/docker.go",
    "content": "package build\n\nimport (\n\t\"context\"\n\n\tdockerClient \"github.com/moby/moby/client\"\n)\n\ntype DockerClient interface {\n\tImageRemove(ctx context.Context, image string, options dockerClient.ImageRemoveOptions) (dockerClient.ImageRemoveResult, error)\n\tVolumeRemove(ctx context.Context, volumeID string, options dockerClient.VolumeRemoveOptions) (dockerClient.VolumeRemoveResult, error)\n\tContainerWait(ctx context.Context, containerID string, options dockerClient.ContainerWaitOptions) dockerClient.ContainerWaitResult\n\tContainerAttach(ctx context.Context, container string, options dockerClient.ContainerAttachOptions) (dockerClient.ContainerAttachResult, error)\n\tContainerStart(ctx context.Context, container string, options dockerClient.ContainerStartOptions) (dockerClient.ContainerStartResult, error)\n\tContainerCreate(ctx context.Context, options dockerClient.ContainerCreateOptions) (dockerClient.ContainerCreateResult, error)\n\tCopyFromContainer(ctx context.Context, containerID string, options dockerClient.CopyFromContainerOptions) (dockerClient.CopyFromContainerResult, error)\n\tContainerInspect(ctx context.Context, containerID string, options dockerClient.ContainerInspectOptions) (dockerClient.ContainerInspectResult, error)\n\tContainerRemove(ctx context.Context, container string, options dockerClient.ContainerRemoveOptions) (dockerClient.ContainerRemoveResult, error)\n\tCopyToContainer(ctx context.Context, container string, options dockerClient.CopyToContainerOptions) (dockerClient.CopyToContainerResult, error)\n\tNetworkCreate(ctx context.Context, name string, options dockerClient.NetworkCreateOptions) (dockerClient.NetworkCreateResult, error)\n\tNetworkRemove(ctx context.Context, networkID string, options dockerClient.NetworkRemoveOptions) (dockerClient.NetworkRemoveResult, error)\n}\n"
  },
  {
    "path": "internal/build/fakes/cache.go",
    "content": "package fakes\n\nimport (\n\t\"context\"\n\n\t\"github.com/buildpacks/pack/pkg/cache\"\n)\n\ntype FakeCache struct {\n\tReturnForType  cache.Type\n\tReturnForClear error\n\tReturnForName  string\n\n\tTypeCallCount  int\n\tClearCallCount int\n\tNameCallCount  int\n}\n\nfunc NewFakeCache() *FakeCache {\n\treturn &FakeCache{}\n}\n\nfunc (f *FakeCache) Type() cache.Type {\n\tf.TypeCallCount++\n\treturn f.ReturnForType\n}\n\nfunc (f *FakeCache) Clear(ctx context.Context) error {\n\tf.ClearCallCount++\n\treturn f.ReturnForClear\n}\nfunc (f *FakeCache) Name() string {\n\tf.NameCallCount++\n\treturn f.ReturnForName\n}\n"
  },
  {
    "path": "internal/build/fakes/fake_builder.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/imgutil\"\n\tifakes \"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype FakeBuilder struct {\n\tReturnForImage               imgutil.Image\n\tReturnForUID                 int\n\tReturnForGID                 int\n\tReturnForLifecycleDescriptor builder.LifecycleDescriptor\n\tReturnForStack               builder.StackMetadata\n\tReturnForRunImages           []builder.RunImageMetadata\n\tReturnForOrderExtensions     dist.Order\n}\n\nfunc NewFakeBuilder(ops ...func(*FakeBuilder)) (*FakeBuilder, error) {\n\tfakeBuilder := &FakeBuilder{\n\t\tReturnForImage: ifakes.NewImage(\"some-builder-name\", \"\", nil),\n\t\tReturnForUID:   99,\n\t\tReturnForGID:   99,\n\t\tReturnForLifecycleDescriptor: builder.LifecycleDescriptor{\n\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\tVersion: &builder.Version{Version: *semver.MustParse(\"12.34\")},\n\t\t\t},\n\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\tSupported: builder.APISet{api.MustParse(\"0.4\")},\n\t\t\t\t},\n\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\tSupported: builder.APISet{api.MustParse(\"0.4\")},\n\t\t\t\t},\n\t\t\t},\n\t\t},\n\t\tReturnForStack: builder.StackMetadata{},\n\t}\n\n\tfor _, op := range ops {\n\t\top(fakeBuilder)\n\t}\n\n\treturn fakeBuilder, nil\n}\n\nfunc WithDeprecatedPlatformAPIs(apis []*api.Version) func(*FakeBuilder) {\n\treturn func(builder *FakeBuilder) {\n\t\tbuilder.ReturnForLifecycleDescriptor.APIs.Platform.Deprecated = apis\n\t}\n}\n\nfunc WithSupportedPlatformAPIs(apis []*api.Version) func(*FakeBuilder) {\n\treturn func(builder *FakeBuilder) {\n\t\tbuilder.ReturnForLifecycleDescriptor.APIs.Platform.Supported = apis\n\t}\n}\n\nfunc WithImage(image imgutil.Image) 
func(*FakeBuilder) {\n\treturn func(builder *FakeBuilder) {\n\t\tbuilder.ReturnForImage = image\n\t}\n}\n\nfunc WithOrderExtensions(orderExt dist.Order) func(*FakeBuilder) {\n\treturn func(builder *FakeBuilder) {\n\t\tbuilder.ReturnForOrderExtensions = orderExt\n\t}\n}\n\nfunc WithUID(uid int) func(*FakeBuilder) {\n\treturn func(builder *FakeBuilder) {\n\t\tbuilder.ReturnForUID = uid\n\t}\n}\n\nfunc WithGID(gid int) func(*FakeBuilder) {\n\treturn func(builder *FakeBuilder) {\n\t\tbuilder.ReturnForGID = gid\n\t}\n}\n\nfunc (b *FakeBuilder) Name() string {\n\treturn b.ReturnForImage.Name()\n}\n\nfunc (b *FakeBuilder) Image() imgutil.Image {\n\treturn b.ReturnForImage\n}\n\nfunc (b *FakeBuilder) UID() int {\n\treturn b.ReturnForUID\n}\n\nfunc (b *FakeBuilder) GID() int {\n\treturn b.ReturnForGID\n}\n\nfunc (b *FakeBuilder) LifecycleDescriptor() builder.LifecycleDescriptor {\n\treturn b.ReturnForLifecycleDescriptor\n}\n\nfunc (b *FakeBuilder) OrderExtensions() dist.Order {\n\treturn b.ReturnForOrderExtensions\n}\n\nfunc (b *FakeBuilder) Stack() builder.StackMetadata {\n\treturn b.ReturnForStack\n}\n\nfunc (b *FakeBuilder) RunImages() []builder.RunImageMetadata {\n\treturn b.ReturnForRunImages\n}\n\nfunc (b *FakeBuilder) System() dist.System { return dist.System{} }\n\nfunc WithBuilder(builder *FakeBuilder) func(*build.LifecycleOptions) {\n\treturn func(opts *build.LifecycleOptions) {\n\t\topts.Builder = builder\n\t}\n}\n\n// WithEnableUsernsHost creates a LifecycleOptions option that enables userns=host\nfunc WithEnableUsernsHost() func(*build.LifecycleOptions) {\n\treturn func(opts *build.LifecycleOptions) {\n\t\topts.EnableUsernsHost = true\n\t}\n}\n\n// WithExecutionEnvironment creates a LifecycleOptions option that sets the execution environment\nfunc WithExecutionEnvironment(execEnv string) func(*build.LifecycleOptions) {\n\treturn func(opts *build.LifecycleOptions) {\n\t\topts.ExecutionEnvironment = execEnv\n\t}\n}\n"
  },
  {
    "path": "internal/build/fakes/fake_phase.go",
    "content": "package fakes\n\nimport \"context\"\n\ntype FakePhase struct {\n\tCleanupCallCount int\n\tRunCallCount     int\n}\n\nfunc (p *FakePhase) Cleanup() error {\n\tp.CleanupCallCount++\n\n\treturn nil\n}\n\nfunc (p *FakePhase) Run(ctx context.Context) error {\n\tp.RunCallCount++\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/build/fakes/fake_phase_factory.go",
    "content": "package fakes\n\nimport \"github.com/buildpacks/pack/internal/build\"\n\ntype FakePhaseFactory struct {\n\tNewCallCount          int\n\tReturnForNew          build.RunnerCleaner\n\tNewCalledWithProvider []*build.PhaseConfigProvider\n}\n\nfunc NewFakePhaseFactory(ops ...func(*FakePhaseFactory)) *FakePhaseFactory {\n\tfakePhaseFactory := &FakePhaseFactory{\n\t\tReturnForNew: &FakePhase{},\n\t}\n\n\tfor _, op := range ops {\n\t\top(fakePhaseFactory)\n\t}\n\n\treturn fakePhaseFactory\n}\n\nfunc WhichReturnsForNew(phase build.RunnerCleaner) func(*FakePhaseFactory) {\n\treturn func(factory *FakePhaseFactory) {\n\t\tfactory.ReturnForNew = phase\n\t}\n}\n\nfunc (f *FakePhaseFactory) New(phaseConfigProvider *build.PhaseConfigProvider) build.RunnerCleaner {\n\tf.NewCallCount++\n\tf.NewCalledWithProvider = append(f.NewCalledWithProvider, phaseConfigProvider)\n\n\treturn f.ReturnForNew\n}\n"
  },
  {
    "path": "internal/build/fakes/fake_termui.go",
    "content": "package fakes\n\nimport (\n\t\"io\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/container\"\n)\n\ntype FakeTermui struct {\n\tHandlerFunc    container.Handler\n\tReadLayersFunc func(reader io.ReadCloser)\n}\n\nfunc (f *FakeTermui) Run(funk func()) error {\n\treturn nil\n}\n\nfunc (f *FakeTermui) Handler() container.Handler {\n\treturn f.HandlerFunc\n}\n\nfunc (f *FakeTermui) ReadLayers(reader io.ReadCloser) error {\n\tf.ReadLayersFunc(reader)\n\treturn nil\n}\n\nfunc WithTermui(screen build.Termui) func(*build.LifecycleOptions) {\n\treturn func(opts *build.LifecycleOptions) {\n\t\topts.Interactive = true\n\t\topts.Termui = screen\n\t}\n}\n\nfunc (f *FakeTermui) Debug(msg string) {\n\t// not implemented\n}\n\nfunc (f *FakeTermui) Debugf(fmt string, v ...interface{}) {\n\t// not implemented\n}\n\nfunc (f *FakeTermui) Info(msg string) {\n\t// not implemented\n}\n\nfunc (f *FakeTermui) Infof(fmt string, v ...interface{}) {\n\t// not implemented\n}\n\nfunc (f *FakeTermui) Warn(msg string) {\n\t// not implemented\n}\n\nfunc (f *FakeTermui) Warnf(fmt string, v ...interface{}) {\n\t// not implemented\n}\n\nfunc (f *FakeTermui) Error(msg string) {\n\t// not implemented\n}\n\nfunc (f *FakeTermui) Errorf(fmt string, v ...interface{}) {\n\t// not implemented\n}\n\nfunc (f *FakeTermui) Writer() io.Writer {\n\t// not implemented\n\treturn nil\n}\n\nfunc (f *FakeTermui) IsVerbose() bool {\n\t// not implemented\n\treturn false\n}\n"
  },
  {
    "path": "internal/build/lifecycle_execution.go",
    "content": "package build\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"math/rand\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strconv\"\n\t\"time\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/buildpacks/lifecycle/auth\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\t\"golang.org/x/sync/errgroup\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/cache\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst (\n\tdefaultProcessType = \"web\"\n\toverrideGID        = 0\n\toverrideUID        = 0\n\tsourceDateEpochEnv = \"SOURCE_DATE_EPOCH\"\n)\n\ntype LifecycleExecution struct {\n\tlogger       logging.Logger\n\tdocker       DockerClient\n\tplatformAPI  *api.Version\n\tlayersVolume string\n\tappVolume    string\n\tos           string\n\tmountPaths   mountPaths\n\topts         LifecycleOptions\n\ttmpDir       string\n}\n\nfunc NewLifecycleExecution(logger logging.Logger, docker DockerClient, tmpDir string, opts LifecycleOptions) (*LifecycleExecution, error) {\n\tlatestSupportedPlatformAPI, err := FindLatestSupported(append(\n\t\topts.Builder.LifecycleDescriptor().APIs.Platform.Deprecated,\n\t\topts.Builder.LifecycleDescriptor().APIs.Platform.Supported...,\n\t), opts.LifecycleApis)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tosType, err := opts.Builder.Image().OS()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\texec := &LifecycleExecution{\n\t\tlogger:       logger,\n\t\tdocker:       docker,\n\t\tlayersVolume: paths.FilterReservedNames(\"pack-layers-\" + randString(10)),\n\t\tappVolume:    paths.FilterReservedNames(\"pack-app-\" + randString(10)),\n\t\tplatformAPI:  latestSupportedPlatformAPI,\n\t\topts:         opts,\n\t\tos:           
osType,\n\t\tmountPaths:   mountPathsForOS(osType, opts.Workspace),\n\t\ttmpDir:       tmpDir,\n\t}\n\n\tif opts.Interactive {\n\t\texec.logger = opts.Termui\n\t}\n\n\treturn exec, nil\n}\n\n// intersection of two sorted lists of api versions\nfunc apiIntersection(apisA, apisB []*api.Version) []*api.Version {\n\tbind := 0\n\taind := 0\n\tapis := []*api.Version{}\n\tfor ; aind < len(apisA); aind++ {\n\t\tfor ; bind < len(apisB) && apisA[aind].Compare(apisB[bind]) > 0; bind++ {\n\t\t}\n\t\tif bind == len(apisB) {\n\t\t\tbreak\n\t\t}\n\t\tif apisA[aind].Equal(apisB[bind]) {\n\t\t\tapis = append(apis, apisA[aind])\n\t\t}\n\t}\n\treturn apis\n}\n\n// FindLatestSupported finds the latest Platform API version supported by both the builder and the lifecycle.\nfunc FindLatestSupported(builderapis []*api.Version, lifecycleapis []string) (*api.Version, error) {\n\tvar apis []*api.Version\n\t// if a custom lifecycle image was used we need to take an intersection of its supported apis with the builder's supported apis.\n\t// generally no custom lifecycle is used, which will be indicated by the lifecycleapis list being empty in the struct.\n\tif len(lifecycleapis) > 0 {\n\t\tlcapis := []*api.Version{}\n\t\tfor _, ver := range lifecycleapis {\n\t\t\tv, err := api.NewVersion(ver)\n\t\t\tif err != nil {\n\t\t\t\treturn nil, fmt.Errorf(\"unable to parse lifecycle api version %s (%v)\", ver, err)\n\t\t\t}\n\t\t\tlcapis = append(lcapis, v)\n\t\t}\n\t\tapis = apiIntersection(lcapis, builderapis)\n\t} else {\n\t\tapis = builderapis\n\t}\n\n\tfor i := len(SupportedPlatformAPIVersions) - 1; i >= 0; i-- {\n\t\tfor _, version := range apis {\n\t\t\tif SupportedPlatformAPIVersions[i].Equal(version) {\n\t\t\t\treturn version, nil\n\t\t\t}\n\t\t}\n\t}\n\n\treturn nil, errors.New(\"unable to find a supported Platform API version\")\n}\n\nfunc randString(n int) string {\n\tb := make([]byte, n)\n\tfor i := range b {\n\t\tb[i] = 'a' + byte(rand.Intn(26))\n\t}\n\treturn string(b)\n}\n\nfunc (l 
*LifecycleExecution) Builder() Builder {\n\treturn l.opts.Builder\n}\n\nfunc (l *LifecycleExecution) AppPath() string {\n\treturn l.opts.AppPath\n}\n\nfunc (l LifecycleExecution) AppDir() string {\n\treturn l.mountPaths.appDir()\n}\n\nfunc (l *LifecycleExecution) AppVolume() string {\n\treturn l.appVolume\n}\n\nfunc (l *LifecycleExecution) LayersVolume() string {\n\treturn l.layersVolume\n}\n\nfunc (l *LifecycleExecution) PlatformAPI() *api.Version {\n\treturn l.platformAPI\n}\n\nfunc (l *LifecycleExecution) ImageName() name.Reference {\n\treturn l.opts.Image\n}\n\nfunc (l *LifecycleExecution) PrevImageName() string {\n\treturn l.opts.PreviousImage\n}\n\nconst maxNetworkRemoveRetries = 2\n\nfunc (l *LifecycleExecution) Run(ctx context.Context, phaseFactoryCreator PhaseFactoryCreator) error {\n\tphaseFactory := phaseFactoryCreator(l)\n\n\tvar buildCache Cache\n\tif l.opts.CacheImage != \"\" || (l.opts.Cache.Build.Format == cache.CacheImage) {\n\t\tcacheImageName := l.opts.CacheImage\n\t\tif cacheImageName == \"\" {\n\t\t\tcacheImageName = l.opts.Cache.Build.Source\n\t\t}\n\t\tcacheImage, err := name.ParseReference(cacheImageName, name.WeakValidation)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"invalid cache image name: %s\", err)\n\t\t}\n\t\tbuildCache = cache.NewImageCache(cacheImage, l.docker)\n\t} else {\n\t\tswitch l.opts.Cache.Build.Format {\n\t\tcase cache.CacheVolume:\n\t\t\tvar err error\n\t\t\tbuildCache, err = cache.NewVolumeCache(l.opts.Image, l.opts.Cache.Build, \"build\", l.docker, l.logger)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tl.logger.Debugf(\"Using build cache volume %s\", style.Symbol(buildCache.Name()))\n\t\tcase cache.CacheBind:\n\t\t\tbuildCache = cache.NewBindCache(l.opts.Cache.Build, l.docker)\n\t\t\tl.logger.Debugf(\"Using build cache dir %s\", style.Symbol(buildCache.Name()))\n\t\t}\n\t}\n\n\tif l.opts.ClearCache {\n\t\tif err := buildCache.Clear(ctx); err != nil {\n\t\t\treturn errors.Wrap(err, \"clearing build 
cache\")\n\t\t}\n\t\tl.logger.Debugf(\"Build cache %s cleared\", style.Symbol(buildCache.Name()))\n\t}\n\n\tlaunchCache, err := cache.NewVolumeCache(l.opts.Image, l.opts.Cache.Launch, \"launch\", l.docker, l.logger)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif l.opts.Network == \"\" {\n\t\t// start an ephemeral bridge network\n\t\tdriver := \"bridge\"\n\t\tif l.os == \"windows\" {\n\t\t\tdriver = \"nat\"\n\t\t}\n\t\tnetworkName := fmt.Sprintf(\"pack.local-network-%x\", randString(10))\n\t\tresult, err := l.docker.NetworkCreate(ctx, networkName, client.NetworkCreateOptions{\n\t\t\tDriver: driver,\n\t\t})\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"failed to create ephemeral %s network: %w\", driver, err)\n\t\t}\n\t\tdefer func() {\n\t\t\tfor i := 0; i <= maxNetworkRemoveRetries; i++ {\n\t\t\t\ttime.Sleep(100 * time.Duration(i) * time.Millisecond) // wait if retrying\n\t\t\t\tif _, err = l.docker.NetworkRemove(ctx, networkName, client.NetworkRemoveOptions{}); err != nil {\n\t\t\t\t\tcontinue\n\t\t\t\t}\n\t\t\t\tbreak\n\t\t\t}\n\t\t}()\n\t\tl.logger.Debugf(\"Created ephemeral bridge network %s with ID %s\", networkName, result.ID)\n\t\tfor _, warning := range result.Warning {\n\t\t\tl.logger.Warn(warning)\n\t\t}\n\t\tl.opts.Network = networkName\n\t}\n\n\tif !l.opts.UseCreator {\n\t\tif l.platformAPI.LessThan(\"0.7\") {\n\t\t\tl.logger.Info(style.Step(\"DETECTING\"))\n\t\t\tif err := l.Detect(ctx, phaseFactory); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tl.logger.Info(style.Step(\"ANALYZING\"))\n\t\t\tif err := l.Analyze(ctx, buildCache, launchCache, phaseFactory); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t} else {\n\t\t\tl.logger.Info(style.Step(\"ANALYZING\"))\n\t\t\tif err := l.Analyze(ctx, buildCache, launchCache, phaseFactory); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tl.logger.Info(style.Step(\"DETECTING\"))\n\t\t\tif err := l.Detect(ctx, phaseFactory); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\n\t\tvar kanikoCache 
Cache\n\t\tif l.PlatformAPI().AtLeast(\"0.12\") {\n\t\t\t// lifecycle 0.17.0 (introduces support for Platform API 0.12) and above will ensure that\n\t\t\t// this volume is owned by the CNB user,\n\t\t\t// and hence the restorer (after dropping privileges) will be able to write to it.\n\t\t\tkanikoCache, err = cache.NewVolumeCache(l.opts.Image, l.opts.Cache.Kaniko, \"kaniko\", l.docker, l.logger)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t} else {\n\t\t\tswitch {\n\t\t\tcase buildCache.Type() == cache.Volume:\n\t\t\t\t// Re-use the build cache as the kaniko cache. Earlier versions of the lifecycle (0.16.x and below)\n\t\t\t\t// already ensure this volume is owned by the CNB user.\n\t\t\t\tkanikoCache = buildCache\n\t\t\tcase l.hasExtensionsForBuild():\n\t\t\t\t// We need a usable kaniko cache, so error in this case.\n\t\t\t\treturn fmt.Errorf(\"build cache must be volume cache when building with extensions\")\n\t\t\tdefault:\n\t\t\t\t// The kaniko cache is unused, so it doesn't matter that it's not usable.\n\t\t\t\tkanikoCache, err = cache.NewVolumeCache(l.opts.Image, l.opts.Cache.Kaniko, \"kaniko\", l.docker, l.logger)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\n\t\tvar (\n\t\t\tephemeralRunImage string\n\t\t\terr               error\n\t\t)\n\t\tif l.runImageChanged() || l.hasExtensionsForRun() {\n\t\t\t// Pull the run image by name in case we fail to pull it by identifier later.\n\t\t\tif ephemeralRunImage, err = l.opts.FetchRunImageWithLifecycleLayer(l.runImageNameAfterExtensions()); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\n\t\tl.logger.Info(style.Step(\"RESTORING\"))\n\t\tif l.opts.ClearCache && l.PlatformAPI().LessThan(\"0.10\") {\n\t\t\tl.logger.Info(\"Skipping 'restore' due to clearing cache\")\n\t\t} else if err := l.Restore(ctx, buildCache, kanikoCache, phaseFactory); err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif l.runImageChanged() || l.hasExtensionsForRun() {\n\t\t\tif newEphemeralRunImage, err 
:= l.opts.FetchRunImageWithLifecycleLayer(l.runImageIdentifierAfterExtensions()); err == nil {\n\t\t\t\t// If the run image was switched by extensions, the run image reference as written by the __restorer__ will be a digest reference\n\t\t\t\t// that is pullable from a registry.\n\t\t\t\t// However, if the run image is only extended (not switched), the run image reference as written by the __analyzer__ may be an image identifier\n\t\t\t\t// (in the daemon case), and will not be pullable.\n\t\t\t\tephemeralRunImage = newEphemeralRunImage\n\t\t\t}\n\t\t}\n\n\t\tgroup, _ := errgroup.WithContext(context.TODO())\n\t\tif l.platformAPI.AtLeast(\"0.10\") && l.hasExtensionsForBuild() {\n\t\t\tgroup.Go(func() error {\n\t\t\t\tl.logger.Info(style.Step(\"EXTENDING (BUILD)\"))\n\t\t\t\treturn l.ExtendBuild(ctx, kanikoCache, phaseFactory, l.extensionsAreExperimental())\n\t\t\t})\n\t\t} else {\n\t\t\tgroup.Go(func() error {\n\t\t\t\tl.logger.Info(style.Step(\"BUILDING\"))\n\t\t\t\treturn l.Build(ctx, phaseFactory)\n\t\t\t})\n\t\t}\n\n\t\tif l.platformAPI.AtLeast(\"0.12\") && l.hasExtensionsForRun() {\n\t\t\tgroup.Go(func() error {\n\t\t\t\tl.logger.Info(style.Step(\"EXTENDING (RUN)\"))\n\t\t\t\treturn l.ExtendRun(ctx, kanikoCache, phaseFactory, ephemeralRunImage, l.extensionsAreExperimental())\n\t\t\t})\n\t\t}\n\n\t\tif err := group.Wait(); err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tl.logger.Info(style.Step(\"EXPORTING\"))\n\t\treturn l.Export(ctx, buildCache, launchCache, kanikoCache, phaseFactory)\n\t}\n\n\tif l.platformAPI.AtLeast(\"0.10\") && l.hasExtensions() && !l.opts.UseCreatorWithExtensions {\n\t\treturn errors.New(\"builder has an order for extensions which is not supported when using the creator; re-run without '--trust-builder' or re-tag builder to avoid trusting it\")\n\t}\n\treturn l.Create(ctx, buildCache, launchCache, phaseFactory)\n}\n\nfunc (l *LifecycleExecution) Cleanup() error {\n\tvar reterr error\n\tif _, err := l.docker.VolumeRemove(context.Background(), 
l.layersVolume, client.VolumeRemoveOptions{Force: true}); err != nil {\n\t\treterr = errors.Wrapf(err, \"failed to clean up layers volume %s\", l.layersVolume)\n\t}\n\tif _, err := l.docker.VolumeRemove(context.Background(), l.appVolume, client.VolumeRemoveOptions{Force: true}); err != nil {\n\t\treterr = errors.Wrapf(err, \"failed to clean up app volume %s\", l.appVolume)\n\t}\n\tif err := os.RemoveAll(l.tmpDir); err != nil {\n\t\treterr = errors.Wrapf(err, \"failed to clean up working directory %s\", l.tmpDir)\n\t}\n\treturn reterr\n}\n\nfunc (l *LifecycleExecution) Create(ctx context.Context, buildCache, launchCache Cache, phaseFactory PhaseFactory) error {\n\tflags := addTags([]string{\n\t\t\"-app\", l.mountPaths.appDir(),\n\t\t\"-cache-dir\", l.mountPaths.cacheDir(),\n\t\t\"-run-image\", l.opts.RunImage,\n\t}, l.opts.AdditionalTags)\n\n\tif l.opts.ClearCache {\n\t\tflags = append(flags, \"-skip-restore\")\n\t}\n\n\tif l.opts.GID >= overrideGID {\n\t\tflags = append(flags, \"-gid\", strconv.Itoa(l.opts.GID))\n\t}\n\n\tif l.opts.UID >= overrideUID {\n\t\tflags = append(flags, \"-uid\", strconv.Itoa(l.opts.UID))\n\t}\n\n\tif l.platformAPI.AtLeast(\"0.13\") {\n\t\tfor _, reg := range l.opts.InsecureRegistries {\n\t\t\tflags = append(flags, \"-insecure-registry\", reg)\n\t\t}\n\t}\n\n\tif l.opts.PreviousImage != \"\" {\n\t\tif l.opts.Image == nil {\n\t\t\treturn errors.New(\"image can't be nil\")\n\t\t}\n\n\t\timage, err := name.ParseReference(l.opts.Image.Name(), name.WeakValidation)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"invalid image name: %s\", err)\n\t\t}\n\n\t\tprevImage, err := name.ParseReference(l.opts.PreviousImage, name.WeakValidation)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"invalid previous image name: %s\", err)\n\t\t}\n\t\tif l.opts.Publish {\n\t\t\tif image.Context().RegistryStr() != prevImage.Context().RegistryStr() {\n\t\t\t\treturn fmt.Errorf(`when --publish is used, <previous-image> must be in the same image registry as <image>\n 
               image registry = %s\n                previous-image registry = %s`, image.Context().RegistryStr(), prevImage.Context().RegistryStr())\n\t\t\t}\n\t\t}\n\n\t\tflags = append(flags, \"-previous-image\", l.opts.PreviousImage)\n\t}\n\n\tprocessType := determineDefaultProcessType(l.platformAPI, l.opts.DefaultProcessType)\n\tif processType != \"\" {\n\t\tflags = append(flags, \"-process-type\", processType)\n\t}\n\n\tvar cacheBindOp PhaseConfigProviderOperation\n\tswitch buildCache.Type() {\n\tcase cache.Image:\n\t\tflags = append(flags, \"-cache-image\", buildCache.Name())\n\t\tcacheBindOp = WithBinds(l.opts.Volumes...)\n\tcase cache.Volume, cache.Bind:\n\t\tcacheBindOp = WithBinds(append(l.opts.Volumes, fmt.Sprintf(\"%s:%s\", buildCache.Name(), l.mountPaths.cacheDir()))...)\n\t}\n\n\twithEnv := NullOp()\n\tif l.opts.CreationTime != nil && l.platformAPI.AtLeast(\"0.9\") {\n\t\twithEnv = WithEnv(fmt.Sprintf(\"%s=%s\", sourceDateEpochEnv, strconv.Itoa(int(l.opts.CreationTime.Unix()))))\n\t}\n\n\topts := []PhaseConfigProviderOperation{\n\t\tWithFlags(l.withLogLevel(flags...)...),\n\t\tWithArgs(l.opts.Image.String()),\n\t\tWithNetwork(l.opts.Network),\n\t\tcacheBindOp,\n\t\tWithContainerOperations(WriteProjectMetadata(l.mountPaths.projectPath(), l.opts.ProjectMetadata, l.os)),\n\t\tWithContainerOperations(CopyDir(l.opts.AppPath, l.mountPaths.appDir(), l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, true, l.opts.FileFilter)),\n\t\tIf(l.opts.SBOMDestinationDir != \"\", WithPostContainerRunOperations(\n\t\t\tEnsureVolumeAccess(l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, l.layersVolume, l.appVolume),\n\t\t\tCopyOutTo(l.mountPaths.sbomDir(), l.opts.SBOMDestinationDir))),\n\t\tIf(l.opts.ReportDestinationDir != \"\", WithPostContainerRunOperations(\n\t\t\tEnsureVolumeAccess(l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, l.layersVolume, l.appVolume),\n\t\t\tCopyOutTo(l.mountPaths.reportPath(), l.opts.ReportDestinationDir))),\n\t\tIf(l.opts.Interactive, 
WithPostContainerRunOperations(\n\t\t\tEnsureVolumeAccess(l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, l.layersVolume, l.appVolume),\n\t\t\tCopyOut(l.opts.Termui.ReadLayers, l.mountPaths.layersDir(), l.mountPaths.appDir()))),\n\t\twithEnv,\n\t}\n\n\tif l.opts.Layout {\n\t\tvar err error\n\t\topts, err = l.appendLayoutOperations(opts)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\tif l.opts.Publish || l.opts.Layout {\n\t\tauthConfig, err := auth.BuildEnvVar(l.opts.Keychain, l.opts.Image.String(), l.opts.RunImage, l.opts.CacheImage, l.opts.PreviousImage)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\topts = append(opts, WithRoot(), WithRegistryAccess(authConfig))\n\t} else {\n\t\topts = append(opts,\n\t\t\tWithDaemonAccess(l.opts.DockerHost),\n\t\t\tWithFlags(\"-daemon\", \"-launch-cache\", l.mountPaths.launchCacheDir()),\n\t\t\tWithBinds(fmt.Sprintf(\"%s:%s\", launchCache.Name(), l.mountPaths.launchCacheDir())),\n\t\t)\n\t}\n\n\tcreate := phaseFactory.New(NewPhaseConfigProvider(\"creator\", l, opts...))\n\tdefer create.Cleanup()\n\treturn create.Run(ctx)\n}\n\nfunc (l *LifecycleExecution) Detect(ctx context.Context, phaseFactory PhaseFactory) error {\n\tflags := []string{\"-app\", l.mountPaths.appDir()}\n\n\tenvOp := NullOp()\n\tif l.platformAPI.AtLeast(\"0.10\") && l.hasExtensions() {\n\t\tenvOp = If(l.extensionsAreExperimental(), WithEnv(\"CNB_EXPERIMENTAL_MODE=warn\"))\n\t}\n\n\tconfigProvider := NewPhaseConfigProvider(\n\t\t\"detector\",\n\t\tl,\n\t\tWithLogPrefix(\"detector\"),\n\t\tWithArgs(\n\t\t\tl.withLogLevel()...,\n\t\t),\n\t\tWithNetwork(l.opts.Network),\n\t\tWithBinds(l.opts.Volumes...),\n\t\tWithContainerOperations(\n\t\t\tEnsureVolumeAccess(l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, l.layersVolume, l.appVolume),\n\t\t\tCopyDir(l.opts.AppPath, l.mountPaths.appDir(), l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, true, l.opts.FileFilter),\n\t\t),\n\t\tWithFlags(flags...),\n\t\tIf(l.hasExtensions(), 
WithPostContainerRunOperations(\n\t\t\tCopyOutToMaybe(filepath.Join(l.mountPaths.layersDir(), \"analyzed.toml\"), l.tmpDir))),\n\t\tIf(l.hasExtensions(), WithPostContainerRunOperations(\n\t\t\tCopyOutToMaybe(filepath.Join(l.mountPaths.layersDir(), \"generated\"), l.tmpDir))),\n\t\tenvOp,\n\t)\n\n\tdetect := phaseFactory.New(configProvider)\n\tdefer detect.Cleanup()\n\treturn detect.Run(ctx)\n}\n\nfunc (l *LifecycleExecution) extensionsAreExperimental() bool {\n\treturn l.PlatformAPI().AtLeast(\"0.10\") && l.platformAPI.LessThan(\"0.13\")\n}\n\nfunc (l *LifecycleExecution) Restore(ctx context.Context, buildCache Cache, kanikoCache Cache, phaseFactory PhaseFactory) error {\n\t// build up flags and ops\n\tvar flags []string\n\tif l.opts.ClearCache {\n\t\tflags = append(flags, \"-skip-layers\")\n\t}\n\tvar registryImages []string\n\n\t// for cache\n\tcacheBindOp := NullOp()\n\tswitch buildCache.Type() {\n\tcase cache.Image:\n\t\tflags = append(flags, \"-cache-image\", buildCache.Name())\n\t\tregistryImages = append(registryImages, buildCache.Name())\n\tcase cache.Volume:\n\t\tflags = append(flags, \"-cache-dir\", l.mountPaths.cacheDir())\n\t\tcacheBindOp = WithBinds(fmt.Sprintf(\"%s:%s\", buildCache.Name(), l.mountPaths.cacheDir()))\n\t}\n\n\t// for gid\n\tif l.opts.GID >= overrideGID {\n\t\tflags = append(flags, \"-gid\", strconv.Itoa(l.opts.GID))\n\t}\n\n\tif l.opts.UID >= overrideUID {\n\t\tflags = append(flags, \"-uid\", strconv.Itoa(l.opts.UID))\n\t}\n\n\tif l.platformAPI.AtLeast(\"0.13\") {\n\t\tfor _, reg := range l.opts.InsecureRegistries {\n\t\t\tflags = append(flags, \"-insecure-registry\", reg)\n\t\t}\n\t}\n\n\t// for kaniko\n\tkanikoCacheBindOp := NullOp()\n\tif (l.platformAPI.AtLeast(\"0.10\") && l.hasExtensionsForBuild()) ||\n\t\tl.platformAPI.AtLeast(\"0.12\") {\n\t\tif l.hasExtensionsForBuild() {\n\t\t\tflags = append(flags, \"-build-image\", l.opts.BuilderImage)\n\t\t\tregistryImages = append(registryImages, l.opts.BuilderImage)\n\t\t}\n\t\tif 
l.runImageChanged() || l.hasExtensionsForRun() {\n\t\t\tregistryImages = append(registryImages, l.runImageNameAfterExtensions())\n\t\t}\n\t\tif l.hasExtensionsForBuild() || l.hasExtensionsForRun() {\n\t\t\tkanikoCacheBindOp = WithBinds(fmt.Sprintf(\"%s:%s\", kanikoCache.Name(), l.mountPaths.kanikoCacheDir()))\n\t\t}\n\t}\n\n\t// for auths\n\tregistryOp := NullOp()\n\tif len(registryImages) > 0 {\n\t\tauthConfig, err := auth.BuildEnvVar(l.opts.Keychain, registryImages...)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tregistryOp = WithRegistryAccess(authConfig)\n\t}\n\n\t// for export to OCI layout\n\tlayoutOp := NullOp()\n\tlayoutBindOp := NullOp()\n\tif l.opts.Layout && l.platformAPI.AtLeast(\"0.12\") {\n\t\tlayoutOp = withLayoutOperation()\n\t\tlayoutBindOp = WithBinds(l.opts.Volumes...)\n\t}\n\n\tdockerOp := NullOp()\n\tif !l.opts.Publish && !l.opts.Layout && l.platformAPI.AtLeast(\"0.12\") {\n\t\tdockerOp = WithDaemonAccess(l.opts.DockerHost)\n\t\tflags = append(flags, \"-daemon\")\n\t}\n\n\tflagsOp := WithFlags(flags...)\n\n\tconfigProvider := NewPhaseConfigProvider(\n\t\t\"restorer\",\n\t\tl,\n\t\tWithLogPrefix(\"restorer\"),\n\t\tWithImage(l.opts.LifecycleImage),\n\t\tWithEnv(fmt.Sprintf(\"%s=%d\", builder.EnvUID, l.opts.Builder.UID()), fmt.Sprintf(\"%s=%d\", builder.EnvGID, l.opts.Builder.GID())),\n\t\tWithRoot(), // remove after platform API 0.2 is no longer supported\n\t\tWithArgs(\n\t\t\tl.withLogLevel()...,\n\t\t),\n\t\tWithNetwork(l.opts.Network),\n\t\tcacheBindOp,\n\t\tdockerOp,\n\t\tflagsOp,\n\t\tkanikoCacheBindOp,\n\t\tregistryOp,\n\t\tlayoutOp,\n\t\tlayoutBindOp,\n\t\tIf(l.hasExtensions(), WithPostContainerRunOperations(\n\t\t\tCopyOutToMaybe(filepath.Join(l.mountPaths.layersDir(), \"analyzed.toml\"), l.tmpDir))),\n\t)\n\n\trestore := phaseFactory.New(configProvider)\n\tdefer restore.Cleanup()\n\treturn restore.Run(ctx)\n}\n\nfunc (l *LifecycleExecution) Analyze(ctx context.Context, buildCache, launchCache Cache, phaseFactory PhaseFactory) 
error {\n\tvar flags []string\n\targs := []string{l.opts.Image.String()}\n\tplatformAPILessThan07 := l.platformAPI.LessThan(\"0.7\")\n\n\tcacheBindOp := NullOp()\n\tif l.opts.ClearCache {\n\t\tif platformAPILessThan07 || l.platformAPI.AtLeast(\"0.9\") {\n\t\t\targs = prependArg(\"-skip-layers\", args)\n\t\t}\n\t} else {\n\t\tswitch buildCache.Type() {\n\t\tcase cache.Image:\n\t\t\tflags = append(flags, \"-cache-image\", buildCache.Name())\n\t\tcase cache.Volume:\n\t\t\tif platformAPILessThan07 {\n\t\t\t\targs = append([]string{\"-cache-dir\", l.mountPaths.cacheDir()}, args...)\n\t\t\t\tcacheBindOp = WithBinds(fmt.Sprintf(\"%s:%s\", buildCache.Name(), l.mountPaths.cacheDir()))\n\t\t\t}\n\t\t}\n\t}\n\n\tlaunchCacheBindOp := NullOp()\n\tif l.platformAPI.AtLeast(\"0.9\") {\n\t\tif !l.opts.Publish {\n\t\t\targs = append([]string{\"-launch-cache\", l.mountPaths.launchCacheDir()}, args...)\n\t\t\tlaunchCacheBindOp = WithBinds(fmt.Sprintf(\"%s:%s\", launchCache.Name(), l.mountPaths.launchCacheDir()))\n\t\t}\n\t}\n\n\tif l.opts.GID >= overrideGID {\n\t\tflags = append(flags, \"-gid\", strconv.Itoa(l.opts.GID))\n\t}\n\n\tif l.opts.UID >= overrideUID {\n\t\tflags = append(flags, \"-uid\", strconv.Itoa(l.opts.UID))\n\t}\n\n\tif l.platformAPI.AtLeast(\"0.13\") {\n\t\tfor _, reg := range l.opts.InsecureRegistries {\n\t\t\tflags = append(flags, \"-insecure-registry\", reg)\n\t\t}\n\t}\n\n\tif l.opts.PreviousImage != \"\" {\n\t\tif l.opts.Image == nil {\n\t\t\treturn errors.New(\"image can't be nil\")\n\t\t}\n\n\t\timage, err := name.ParseReference(l.opts.Image.Name(), name.WeakValidation)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"invalid image name: %s\", err)\n\t\t}\n\n\t\tprevImage, err := name.ParseReference(l.opts.PreviousImage, name.WeakValidation)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"invalid previous image name: %s\", err)\n\t\t}\n\t\tif l.opts.Publish {\n\t\t\tif image.Context().RegistryStr() != prevImage.Context().RegistryStr() {\n\t\t\t\treturn 
fmt.Errorf(`when --publish is used, <previous-image> must be in the same image registry as <image>\n\t            image registry = %s\n\t            previous-image registry = %s`, image.Context().RegistryStr(), prevImage.Context().RegistryStr())\n\t\t\t}\n\t\t}\n\t\tif platformAPILessThan07 {\n\t\t\tl.opts.Image = prevImage\n\t\t} else {\n\t\t\targs = append([]string{\"-previous-image\", l.opts.PreviousImage}, args...)\n\t\t}\n\t}\n\n\tstackOp := NullOp()\n\trunOp := NullOp()\n\tif !platformAPILessThan07 {\n\t\tfor _, tag := range l.opts.AdditionalTags {\n\t\t\targs = append([]string{\"-tag\", tag}, args...)\n\t\t}\n\t\tif l.opts.RunImage != \"\" {\n\t\t\targs = append([]string{\"-run-image\", l.opts.RunImage}, args...)\n\t\t}\n\t\tif l.platformAPI.LessThan(\"0.12\") {\n\t\t\targs = append([]string{\"-stack\", l.mountPaths.stackPath()}, args...)\n\t\t\tstackOp = WithContainerOperations(WriteStackToml(l.mountPaths.stackPath(), l.opts.Builder.Stack(), l.os))\n\t\t} else {\n\t\t\targs = append([]string{\"-run\", l.mountPaths.runPath()}, args...)\n\t\t\trunOp = WithContainerOperations(WriteRunToml(l.mountPaths.runPath(), l.opts.Builder.RunImages(), l.os))\n\t\t}\n\t}\n\n\tlayoutOp := NullOp()\n\tif l.opts.Layout && l.platformAPI.AtLeast(\"0.12\") {\n\t\tlayoutOp = withLayoutOperation()\n\t}\n\n\tflagsOp := WithFlags(flags...)\n\n\tvar analyze RunnerCleaner\n\tif l.opts.Publish || l.opts.Layout {\n\t\tauthConfig, err := auth.BuildEnvVar(l.opts.Keychain, l.opts.Image.String(), l.opts.RunImage, l.opts.CacheImage, l.opts.PreviousImage)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tconfigProvider := NewPhaseConfigProvider(\n\t\t\t\"analyzer\",\n\t\t\tl,\n\t\t\tWithLogPrefix(\"analyzer\"),\n\t\t\tWithImage(l.opts.LifecycleImage),\n\t\t\tWithEnv(fmt.Sprintf(\"%s=%d\", builder.EnvUID, l.opts.Builder.UID()), fmt.Sprintf(\"%s=%d\", builder.EnvGID, 
l.opts.Builder.GID())),\n\t\t\tWithRegistryAccess(authConfig),\n\t\t\tWithRoot(),\n\t\t\tWithArgs(l.withLogLevel(args...)...),\n\t\t\tWithNetwork(l.opts.Network),\n\t\t\tflagsOp,\n\t\t\tcacheBindOp,\n\t\t\tstackOp,\n\t\t\trunOp,\n\t\t\tlayoutOp,\n\t\t)\n\n\t\tanalyze = phaseFactory.New(configProvider)\n\t} else {\n\t\tconfigProvider := NewPhaseConfigProvider(\n\t\t\t\"analyzer\",\n\t\t\tl,\n\t\t\tWithLogPrefix(\"analyzer\"),\n\t\t\tWithImage(l.opts.LifecycleImage),\n\t\t\tWithEnv(\n\t\t\t\tfmt.Sprintf(\"%s=%d\", builder.EnvUID, l.opts.Builder.UID()),\n\t\t\t\tfmt.Sprintf(\"%s=%d\", builder.EnvGID, l.opts.Builder.GID()),\n\t\t\t),\n\t\t\tWithDaemonAccess(l.opts.DockerHost),\n\t\t\tlaunchCacheBindOp,\n\t\t\tWithFlags(l.withLogLevel(\"-daemon\")...),\n\t\t\tWithArgs(args...),\n\t\t\tflagsOp,\n\t\t\tWithNetwork(l.opts.Network),\n\t\t\tcacheBindOp,\n\t\t\tstackOp,\n\t\t\trunOp,\n\t\t)\n\n\t\tanalyze = phaseFactory.New(configProvider)\n\t}\n\n\tdefer analyze.Cleanup()\n\treturn analyze.Run(ctx)\n}\n\nfunc (l *LifecycleExecution) Build(ctx context.Context, phaseFactory PhaseFactory) error {\n\tflags := []string{\"-app\", l.mountPaths.appDir()}\n\tconfigProvider := NewPhaseConfigProvider(\n\t\t\"builder\",\n\t\tl,\n\t\tWithLogPrefix(\"builder\"),\n\t\tWithArgs(l.withLogLevel()...),\n\t\tWithNetwork(l.opts.Network),\n\t\tWithBinds(l.opts.Volumes...),\n\t\tWithFlags(flags...),\n\t)\n\n\tbuild := phaseFactory.New(configProvider)\n\tdefer build.Cleanup()\n\treturn build.Run(ctx)\n}\n\nfunc (l *LifecycleExecution) ExtendBuild(ctx context.Context, kanikoCache Cache, phaseFactory PhaseFactory, experimental bool) error {\n\tflags := []string{\"-app\", l.mountPaths.appDir()}\n\n\tconfigProvider := NewPhaseConfigProvider(\n\t\t\"extender\",\n\t\tl,\n\t\tWithLogPrefix(\"extender (build)\"),\n\t\tWithArgs(l.withLogLevel()...),\n\t\tWithBinds(l.opts.Volumes...),\n\t\tIf(experimental, 
WithEnv(\"CNB_EXPERIMENTAL_MODE=warn\")),\n\t\tWithFlags(flags...),\n\t\tWithNetwork(l.opts.Network),\n\t\tWithRoot(),\n\t\tWithBinds(fmt.Sprintf(\"%s:%s\", kanikoCache.Name(), l.mountPaths.kanikoCacheDir())),\n\t)\n\n\textend := phaseFactory.New(configProvider)\n\tdefer extend.Cleanup()\n\treturn extend.Run(ctx)\n}\n\nfunc (l *LifecycleExecution) ExtendRun(ctx context.Context, kanikoCache Cache, phaseFactory PhaseFactory, runImageName string, experimental bool) error {\n\tflags := []string{\"-app\", l.mountPaths.appDir(), \"-kind\", \"run\"}\n\n\tconfigProvider := NewPhaseConfigProvider(\n\t\t\"extender\",\n\t\tl,\n\t\tWithLogPrefix(\"extender (run)\"),\n\t\tWithArgs(l.withLogLevel()...),\n\t\tWithBinds(l.opts.Volumes...),\n\t\tIf(experimental, WithEnv(\"CNB_EXPERIMENTAL_MODE=warn\")),\n\t\tWithFlags(flags...),\n\t\tWithNetwork(l.opts.Network),\n\t\tWithRoot(),\n\t\tWithImage(runImageName),\n\t\tWithBinds(fmt.Sprintf(\"%s:%s\", kanikoCache.Name(), l.mountPaths.kanikoCacheDir())),\n\t)\n\n\textend := phaseFactory.New(configProvider)\n\tdefer extend.Cleanup()\n\treturn extend.Run(ctx)\n}\n\nfunc determineDefaultProcessType(platformAPI *api.Version, providedValue string) string {\n\tshouldSetForceDefault := platformAPI.Compare(api.MustParse(\"0.4\")) >= 0 &&\n\t\tplatformAPI.Compare(api.MustParse(\"0.6\")) < 0\n\tif providedValue == \"\" && shouldSetForceDefault {\n\t\treturn defaultProcessType\n\t}\n\n\treturn providedValue\n}\n\nfunc (l *LifecycleExecution) Export(ctx context.Context, buildCache, launchCache, kanikoCache Cache, phaseFactory PhaseFactory) error {\n\tflags := []string{\n\t\t\"-app\", l.mountPaths.appDir(),\n\t\t\"-cache-dir\", l.mountPaths.cacheDir(),\n\t}\n\n\texpEnv := NullOp()\n\tkanikoCacheBindOp := NullOp()\n\tif l.platformAPI.LessThan(\"0.12\") {\n\t\tflags = append(flags, \"-stack\", l.mountPaths.stackPath())\n\t} else {\n\t\tflags = append(flags, \"-run\", l.mountPaths.runPath())\n\t\tif l.hasExtensionsForRun() {\n\t\t\texpEnv = 
If(l.extensionsAreExperimental(), WithEnv(\"CNB_EXPERIMENTAL_MODE=warn\"))\n\t\t\tkanikoCacheBindOp = WithBinds(fmt.Sprintf(\"%s:%s\", kanikoCache.Name(), l.mountPaths.kanikoCacheDir()))\n\t\t}\n\t}\n\n\tif l.platformAPI.LessThan(\"0.7\") {\n\t\tflags = append(flags, \"-run-image\", l.opts.RunImage)\n\t}\n\tprocessType := determineDefaultProcessType(l.platformAPI, l.opts.DefaultProcessType)\n\tif processType != \"\" {\n\t\tflags = append(flags, \"-process-type\", processType)\n\t}\n\tif l.opts.GID >= overrideGID {\n\t\tflags = append(flags, \"-gid\", strconv.Itoa(l.opts.GID))\n\t}\n\n\tif l.opts.UID >= overrideUID {\n\t\tflags = append(flags, \"-uid\", strconv.Itoa(l.opts.UID))\n\t}\n\n\tif l.platformAPI.AtLeast(\"0.13\") {\n\t\tfor _, reg := range l.opts.InsecureRegistries {\n\t\t\tflags = append(flags, \"-insecure-registry\", reg)\n\t\t}\n\t}\n\n\tcacheBindOp := NullOp()\n\tswitch buildCache.Type() {\n\tcase cache.Image:\n\t\tflags = append(flags, \"-cache-image\", buildCache.Name())\n\tcase cache.Volume:\n\t\tcacheBindOp = WithBinds(fmt.Sprintf(\"%s:%s\", buildCache.Name(), l.mountPaths.cacheDir()))\n\t}\n\n\tepochEnv := NullOp()\n\tif l.opts.CreationTime != nil && l.platformAPI.AtLeast(\"0.9\") {\n\t\tepochEnv = WithEnv(fmt.Sprintf(\"%s=%s\", sourceDateEpochEnv, strconv.Itoa(int(l.opts.CreationTime.Unix()))))\n\t}\n\n\topts := []PhaseConfigProviderOperation{\n\t\tWithLogPrefix(\"exporter\"),\n\t\tWithImage(l.opts.LifecycleImage),\n\t\tWithEnv(\n\t\t\tfmt.Sprintf(\"%s=%d\", builder.EnvUID, l.opts.Builder.UID()),\n\t\t\tfmt.Sprintf(\"%s=%d\", builder.EnvGID, l.opts.Builder.GID()),\n\t\t),\n\t\tWithFlags(\n\t\t\tl.withLogLevel(flags...)...,\n\t\t),\n\t\tWithArgs(append([]string{l.opts.Image.String()}, l.opts.AdditionalTags...)...),\n\t\tWithRoot(),\n\t\tWithNetwork(l.opts.Network),\n\t\tcacheBindOp,\n\t\tkanikoCacheBindOp,\n\t\tWithContainerOperations(WriteStackToml(l.mountPaths.stackPath(), l.opts.Builder.Stack(), 
l.os)),\n\t\tWithContainerOperations(WriteRunToml(l.mountPaths.runPath(), l.opts.Builder.RunImages(), l.os)),\n\t\tWithContainerOperations(WriteProjectMetadata(l.mountPaths.projectPath(), l.opts.ProjectMetadata, l.os)),\n\t\tIf(l.opts.SBOMDestinationDir != \"\", WithPostContainerRunOperations(\n\t\t\tEnsureVolumeAccess(l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, l.layersVolume, l.appVolume),\n\t\t\tCopyOutTo(l.mountPaths.sbomDir(), l.opts.SBOMDestinationDir))),\n\t\tIf(l.opts.ReportDestinationDir != \"\", WithPostContainerRunOperations(\n\t\t\tEnsureVolumeAccess(l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, l.layersVolume, l.appVolume),\n\t\t\tCopyOutTo(l.mountPaths.reportPath(), l.opts.ReportDestinationDir))),\n\t\tIf(l.opts.Interactive, WithPostContainerRunOperations(\n\t\t\tEnsureVolumeAccess(l.opts.Builder.UID(), l.opts.Builder.GID(), l.os, l.layersVolume, l.appVolume),\n\t\t\tCopyOut(l.opts.Termui.ReadLayers, l.mountPaths.layersDir(), l.mountPaths.appDir()))),\n\t\tepochEnv,\n\t\texpEnv,\n\t}\n\n\tif l.opts.Layout && l.platformAPI.AtLeast(\"0.12\") {\n\t\tvar err error\n\t\topts, err = l.appendLayoutOperations(opts)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\topts = append(opts, WithBinds(l.opts.Volumes...))\n\t}\n\n\tvar export RunnerCleaner\n\tif l.opts.Publish || l.opts.Layout {\n\t\tauthConfig, err := auth.BuildEnvVar(l.opts.Keychain, l.opts.Image.String(), l.opts.RunImage, l.opts.CacheImage, l.opts.PreviousImage)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\topts = append(\n\t\t\topts,\n\t\t\tWithRegistryAccess(authConfig),\n\t\t\tWithRoot(),\n\t\t)\n\t\texport = phaseFactory.New(NewPhaseConfigProvider(\"exporter\", l, opts...))\n\t} else {\n\t\topts = append(\n\t\t\topts,\n\t\t\tWithDaemonAccess(l.opts.DockerHost),\n\t\t\tWithFlags(\"-daemon\", \"-launch-cache\", l.mountPaths.launchCacheDir()),\n\t\t\tWithBinds(fmt.Sprintf(\"%s:%s\", launchCache.Name(), l.mountPaths.launchCacheDir())),\n\t\t)\n\t\texport = 
phaseFactory.New(NewPhaseConfigProvider(\"exporter\", l, opts...))\n\t}\n\n\tdefer export.Cleanup()\n\treturn export.Run(ctx)\n}\n\nfunc (l *LifecycleExecution) withLogLevel(args ...string) []string {\n\tif l.logger.IsVerbose() {\n\t\treturn append([]string{\"-log-level\", \"debug\"}, args...)\n\t}\n\treturn args\n}\n\nfunc (l *LifecycleExecution) hasExtensions() bool {\n\treturn len(l.opts.Builder.OrderExtensions()) > 0\n}\n\nfunc (l *LifecycleExecution) hasExtensionsForBuild() bool {\n\tif !l.hasExtensions() {\n\t\treturn false\n\t}\n\tgeneratedDir := filepath.Join(l.tmpDir, \"generated\")\n\tfis, err := os.ReadDir(filepath.Join(generatedDir, \"build\"))\n\tif err == nil && len(fis) > 0 {\n\t\t// on older platforms, we need to find a file such as <layers>/generated/build/<buildpack-id>/Dockerfile\n\t\t// on newer platforms, <layers>/generated/build doesn't exist\n\t\treturn true\n\t}\n\t// on newer platforms, we need to find a file such as <layers>/generated/<buildpack-id>/build.Dockerfile\n\tfis, err = os.ReadDir(generatedDir)\n\tif err != nil {\n\t\tl.logger.Warnf(\"failed to read generated directory, assuming no build image extensions: %s\", err)\n\t\treturn false\n\t}\n\tfor _, fi := range fis {\n\t\tif _, err := os.Stat(filepath.Join(generatedDir, fi.Name(), \"build.Dockerfile\")); err == nil {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc (l *LifecycleExecution) hasExtensionsForRun() bool {\n\tif !l.hasExtensions() {\n\t\treturn false\n\t}\n\tvar amd files.Analyzed\n\tif _, err := toml.DecodeFile(filepath.Join(l.tmpDir, \"analyzed.toml\"), &amd); err != nil {\n\t\tl.logger.Warnf(\"failed to parse analyzed.toml file, assuming no run image extensions: %s\", err)\n\t\treturn false\n\t}\n\tif amd.RunImage == nil {\n\t\t// this shouldn't be reachable\n\t\tl.logger.Warnf(\"found no run image in analyzed.toml file, assuming no run image extensions...\")\n\t\treturn false\n\t}\n\treturn amd.RunImage.Extend\n}\n\nfunc (l *LifecycleExecution) 
runImageIdentifierAfterExtensions() string {\n\tif !l.hasExtensions() {\n\t\treturn l.opts.RunImage\n\t}\n\tvar amd files.Analyzed\n\tif _, err := toml.DecodeFile(filepath.Join(l.tmpDir, \"analyzed.toml\"), &amd); err != nil {\n\t\tl.logger.Warnf(\"failed to parse analyzed.toml file, assuming run image identifier did not change: %s\", err)\n\t\treturn l.opts.RunImage\n\t}\n\tif amd.RunImage == nil || amd.RunImage.Reference == \"\" {\n\t\t// this shouldn't be reachable\n\t\tl.logger.Warnf(\"found no run image in analyzed.toml file, assuming run image identifier did not change...\")\n\t\treturn l.opts.RunImage\n\t}\n\treturn amd.RunImage.Reference\n}\n\nfunc (l *LifecycleExecution) runImageNameAfterExtensions() string {\n\tif !l.hasExtensions() {\n\t\treturn l.opts.RunImage\n\t}\n\tvar amd files.Analyzed\n\tif _, err := toml.DecodeFile(filepath.Join(l.tmpDir, \"analyzed.toml\"), &amd); err != nil {\n\t\tl.logger.Warnf(\"failed to parse analyzed.toml file, assuming run image name did not change: %s\", err)\n\t\treturn l.opts.RunImage\n\t}\n\tif amd.RunImage == nil || amd.RunImage.Image == \"\" {\n\t\t// this shouldn't be reachable\n\t\tl.logger.Warnf(\"found no run image in analyzed.toml file, assuming run image name did not change...\")\n\t\treturn l.opts.RunImage\n\t}\n\treturn amd.RunImage.Image\n}\n\nfunc (l *LifecycleExecution) runImageChanged() bool {\n\tcurrentRunImage := l.runImageNameAfterExtensions()\n\treturn currentRunImage != \"\" && currentRunImage != l.opts.RunImage\n}\n\nfunc (l *LifecycleExecution) appendLayoutOperations(opts []PhaseConfigProviderOperation) ([]PhaseConfigProviderOperation, error) {\n\topts = append(opts, withLayoutOperation())\n\treturn opts, nil\n}\n\nfunc withLayoutOperation() PhaseConfigProviderOperation {\n\tlayoutDir := filepath.Join(paths.RootDir, \"layout-repo\")\n\treturn WithEnv(\"CNB_USE_LAYOUT=true\", \"CNB_LAYOUT_DIR=\"+layoutDir, \"CNB_EXPERIMENTAL_MODE=warn\")\n}\n\nfunc prependArg(arg string, args []string) []string 
{\n\treturn append([]string{arg}, args...)\n}\n\nfunc addTags(flags, additionalTags []string) []string {\n\tfor _, tag := range additionalTags {\n\t\tflags = append(flags, \"-tag\", tag)\n\t}\n\treturn flags\n}\n"
  },
  {
    "path": "internal/build/lifecycle_execution_test.go",
    "content": "package build_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strconv\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/apex/log\"\n\tifakes \"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/heroku/color\"\n\t\"github.com/moby/moby/api/types/container\"\n\t\"github.com/moby/moby/api/types/network\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/build/fakes\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/cache\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\n// TestLifecycleExecution are unit tests that test each possible phase to ensure they are executed with the proper parameters\nfunc TestLifecycleExecution(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"phases\", testLifecycleExecution, spec.Report(report.Terminal{}), spec.Sequential())\n}\n\nfunc testLifecycleExecution(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tdockerConfigDir string\n\t\ttmpDir          string\n\n\t\t// lifecycle options\n\t\tprovidedClearCache     bool\n\t\tprovidedPublish        bool\n\t\tprovidedUseCreator     bool\n\t\tprovidedLayout         bool\n\t\tprovidedDockerHost     string\n\t\tprovidedNetworkMode    = \"some-network-mode\"\n\t\tprovidedRunImage       = \"some-run-image\"\n\t\tprovidedTargetImage    = \"some-target-image\"\n\t\tprovidedAdditionalTags = []string{\"some-additional-tag1\", \"some-additional-tag2\"}\n\t\tprovidedVolumes        = 
[]string{\"some-mount-source:/some-mount-target\"}\n\n\t\t// builder options\n\t\tprovidedBuilderImage = \"some-registry.com/some-namespace/some-builder-name\"\n\t\twithOS               = \"linux\"\n\t\tplatformAPI          = build.SupportedPlatformAPIVersions[0] // TODO: update the tests to target the latest api by default and make earlier apis special cases\n\t\tprovidedUID          = 2222\n\t\tprovidedGID          = 3333\n\t\tprovidedOrderExt     dist.Order\n\n\t\tlifecycle        *build.LifecycleExecution\n\t\tfakeBuildCache   = newFakeVolumeCache()\n\t\tfakeLaunchCache  *fakes.FakeCache\n\t\tfakeKanikoCache  *fakes.FakeCache\n\t\tfakePhase        *fakes.FakePhase\n\t\tfakePhaseFactory *fakes.FakePhaseFactory\n\t\tfakeFetcher      fakeImageFetcher\n\t\tconfigProvider   *build.PhaseConfigProvider\n\n\t\textensionsForBuild, extensionsForRun bool\n\t\textensionsRunImageName               string\n\t\textensionsRunImageIdentifier         string\n\t\tuseCreatorWithExtensions             bool\n\t)\n\n\tvar configureDefaultTestLifecycle = func(opts *build.LifecycleOptions) {\n\t\topts.AdditionalTags = providedAdditionalTags\n\t\topts.BuilderImage = providedBuilderImage\n\t\topts.ClearCache = providedClearCache\n\t\topts.DockerHost = providedDockerHost\n\t\topts.Network = providedNetworkMode\n\t\topts.Publish = providedPublish\n\t\topts.RunImage = providedRunImage\n\t\topts.UseCreator = providedUseCreator\n\t\topts.Volumes = providedVolumes\n\t\topts.Layout = providedLayout\n\t\topts.Keychain = authn.DefaultKeychain\n\t\topts.UseCreatorWithExtensions = useCreatorWithExtensions\n\n\t\ttargetImageRef, err := name.ParseReference(providedTargetImage)\n\t\th.AssertNil(t, err)\n\t\topts.Image = targetImageRef\n\t}\n\n\tvar lifecycleOps = []func(*build.LifecycleOptions){configureDefaultTestLifecycle}\n\n\tit.Before(func() {\n\t\t// Avoid contaminating tests with existing docker configuration.\n\t\t// GGCR resolves the default keychain by inspecting DOCKER_CONFIG - this is used 
by the Analyze step\n\t\t// when constructing the auth config (see `auth.BuildEnvVar` in phases.go).\n\t\tvar err error\n\t\tdockerConfigDir, err = os.MkdirTemp(\"\", \"empty-docker-config-dir\")\n\t\th.AssertNil(t, err)\n\t\th.AssertNil(t, os.Setenv(\"DOCKER_CONFIG\", dockerConfigDir))\n\n\t\timage := ifakes.NewImage(\"some-image\", \"\", nil)\n\t\th.AssertNil(t, image.SetOS(withOS))\n\n\t\tfakeBuilder, err := fakes.NewFakeBuilder(\n\t\t\tfakes.WithSupportedPlatformAPIs([]*api.Version{platformAPI}),\n\t\t\tfakes.WithUID(providedUID),\n\t\t\tfakes.WithGID(providedGID),\n\t\t\tfakes.WithOrderExtensions(providedOrderExt),\n\t\t\tfakes.WithImage(image),\n\t\t)\n\t\th.AssertNil(t, err)\n\t\tfakeFetcher = fakeImageFetcher{\n\t\t\tcallCount:           0,\n\t\t\tcalledWithArgAtCall: make(map[int]string),\n\t\t}\n\t\twithFakeFetchRunImageFunc := func(opts *build.LifecycleOptions) {\n\t\t\topts.FetchRunImageWithLifecycleLayer = newFakeFetchRunImageFunc(&fakeFetcher)\n\t\t}\n\t\tlifecycleOps = append(lifecycleOps, fakes.WithBuilder(fakeBuilder), withFakeFetchRunImageFunc)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack.unit\")\n\t\th.AssertNil(t, err)\n\t\tlifecycle = newTestLifecycleExec(t, true, tmpDir, lifecycleOps...)\n\n\t\t// construct fixtures for extensions\n\t\tif extensionsForBuild {\n\t\t\tif platformAPI.LessThan(\"0.13\") {\n\t\t\t\terr = os.MkdirAll(filepath.Join(tmpDir, \"generated\", \"build\", \"some-buildpack-id\"), 0755)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t} else {\n\t\t\t\terr = os.MkdirAll(filepath.Join(tmpDir, \"generated\", \"some-buildpack-id\"), 0755)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\t_, err = os.Create(filepath.Join(tmpDir, \"generated\", \"some-buildpack-id\", \"build.Dockerfile\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t}\n\t\t}\n\t\tamd := files.Analyzed{RunImage: &files.RunImage{\n\t\t\tExtend: false,\n\t\t\tImage:  \"\",\n\t\t}}\n\t\tif extensionsForRun {\n\t\t\tamd.RunImage.Extend = true\n\t\t}\n\t\tif extensionsRunImageName != \"\" 
{\n\t\t\tamd.RunImage.Image = extensionsRunImageName\n\t\t}\n\t\tif extensionsRunImageIdentifier != \"\" {\n\t\t\tamd.RunImage.Reference = extensionsRunImageIdentifier\n\t\t}\n\t\tf, err := os.Create(filepath.Join(tmpDir, \"analyzed.toml\"))\n\t\th.AssertNil(t, err)\n\t\ttoml.NewEncoder(f).Encode(amd)\n\t\th.AssertNil(t, f.Close())\n\n\t\tfakeLaunchCache = fakes.NewFakeCache()\n\t\tfakeLaunchCache.ReturnForType = cache.Volume\n\t\tfakeLaunchCache.ReturnForName = \"some-launch-cache\"\n\n\t\tfakeKanikoCache = fakes.NewFakeCache()\n\t\tfakeKanikoCache.ReturnForType = cache.Volume\n\t\tfakeKanikoCache.ReturnForName = \"some-kaniko-cache\"\n\n\t\tfakePhase = &fakes.FakePhase{}\n\t\tfakePhaseFactory = fakes.NewFakePhaseFactory(fakes.WhichReturnsForNew(fakePhase))\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.Unsetenv(\"DOCKER_CONFIG\"))\n\t\th.AssertNil(t, os.RemoveAll(dockerConfigDir))\n\t\t_ = os.RemoveAll(tmpDir)\n\t})\n\n\twhen(\"#NewLifecycleExecution\", func() {\n\t\twhen(\"lifecycle supports multiple platform APIs\", func() {\n\t\t\tit(\"selects the latest supported version\", func() {\n\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(fakes.WithSupportedPlatformAPIs([]*api.Version{\n\t\t\t\t\tapi.MustParse(\"0.2\"),\n\t\t\t\t\tapi.MustParse(\"0.3\"),\n\t\t\t\t\tapi.MustParse(\"0.4\"),\n\t\t\t\t\tapi.MustParse(\"0.5\"),\n\t\t\t\t\tapi.MustParse(\"0.6\"),\n\t\t\t\t\tapi.MustParse(\"0.7\"),\n\t\t\t\t\tapi.MustParse(\"0.8\"),\n\t\t\t\t}))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tlifecycleExec := newTestLifecycleExec(t, false, \"some-temp-dir\", fakes.WithBuilder(fakeBuilder))\n\t\t\t\th.AssertEq(t, lifecycleExec.PlatformAPI().String(), \"0.8\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"supported platform API is deprecated\", func() {\n\t\t\tit(\"selects the deprecated version\", func() {\n\t\t\t\tfakeBuilder, err := 
fakes.NewFakeBuilder(\n\t\t\t\t\tfakes.WithDeprecatedPlatformAPIs([]*api.Version{api.MustParse(\"0.4\")}),\n\t\t\t\t\tfakes.WithSupportedPlatformAPIs([]*api.Version{api.MustParse(\"1.2\")}),\n\t\t\t\t)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tlifecycleExec := newTestLifecycleExec(t, false, \"some-temp-dir\", fakes.WithBuilder(fakeBuilder))\n\t\t\t\th.AssertEq(t, lifecycleExec.PlatformAPI().String(), \"0.4\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"pack doesn't support any lifecycle supported platform API\", func() {\n\t\t\tit(\"errors\", func() {\n\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(\n\t\t\t\t\tfakes.WithSupportedPlatformAPIs([]*api.Version{api.MustParse(\"1.2\")}),\n\t\t\t\t)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t_, err = newTestLifecycleExecErr(t, false, \"some-temp-dir\", fakes.WithBuilder(fakeBuilder))\n\t\t\t\th.AssertError(t, err, \"unable to find a supported Platform API version\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"FindLatestSupported\", func() {\n\t\tit(\"chooses a shared version\", func() {\n\t\t\tversion, err := build.FindLatestSupported([]*api.Version{api.MustParse(\"0.6\"), api.MustParse(\"0.7\"), api.MustParse(\"0.8\")}, []string{\"0.7\"})\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, version, api.MustParse(\"0.7\"))\n\t\t})\n\n\t\tit(\"chooses a shared version, highest builder supported version\", func() {\n\t\t\tversion, err := build.FindLatestSupported([]*api.Version{api.MustParse(\"0.4\"), api.MustParse(\"0.5\"), api.MustParse(\"0.7\")}, []string{\"0.7\", \"0.8\"})\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, version, api.MustParse(\"0.7\"))\n\t\t})\n\n\t\tit(\"chooses a shared version, lowest builder supported version\", func() {\n\t\t\tversion, err := build.FindLatestSupported([]*api.Version{api.MustParse(\"0.4\"), api.MustParse(\"0.5\"), api.MustParse(\"0.7\")}, []string{\"0.1\", \"0.2\", \"0.4\"})\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, version, api.MustParse(\"0.4\"))\n\t\t})\n\n\t\tit(\"Interprets empty lifecycle versions 
list as lack of constraints\", func() {\n\t\t\tversion, err := build.FindLatestSupported([]*api.Version{api.MustParse(\"0.6\"), api.MustParse(\"0.7\")}, []string{})\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, version, api.MustParse(\"0.7\"))\n\t\t})\n\n\t\tit(\"errors with no shared version, builder has no versions supported for some reason\", func() {\n\t\t\t_, err := build.FindLatestSupported([]*api.Version{}, []string{\"0.7\"})\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\n\t\tit(\"errors with no shared version, builder less than lifecycle\", func() {\n\t\t\t_, err := build.FindLatestSupported([]*api.Version{api.MustParse(\"0.4\"), api.MustParse(\"0.5\")}, []string{\"0.7\", \"0.8\"})\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\n\t\tit(\"errors with no shared version, builder greater than lifecycle\", func() {\n\t\t\t_, err := build.FindLatestSupported([]*api.Version{api.MustParse(\"0.8\"), api.MustParse(\"0.9\")}, []string{\"0.6\", \"0.7\"})\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t})\n\n\twhen(\"Run\", func() {\n\t\tvar (\n\t\t\timageName   name.Tag\n\t\t\tfakeBuilder *fakes.FakeBuilder\n\t\t\toutBuf      bytes.Buffer\n\t\t\tlogger      *logging.LogWithWriters\n\t\t\tdocker      *fakeDockerClient\n\t\t\tfakeTermui  *fakes.FakeTermui\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\timageName, err = name.NewTag(\"/some/image\", name.WeakValidation)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tfakeTermui = &fakes.FakeTermui{}\n\n\t\t\tfakeBuilder, err = fakes.NewFakeBuilder(fakes.WithSupportedPlatformAPIs([]*api.Version{api.MustParse(\"0.3\")}))\n\t\t\th.AssertNil(t, err)\n\t\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\tdocker = &fakeDockerClient{}\n\t\t\th.AssertNil(t, err)\n\t\t\tfakePhaseFactory = fakes.NewFakePhaseFactory()\n\t\t})\n\n\t\twhen(\"Run using creator\", func() {\n\t\t\tit(\"succeeds\", func() {\n\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\tPublish:      false,\n\t\t\t\t\tClearCache:   false,\n\t\t\t\t\tRunImage:     
\"test\",\n\t\t\t\t\tImage:        imageName,\n\t\t\t\t\tBuilder:      fakeBuilder,\n\t\t\t\t\tTrustBuilder: false,\n\t\t\t\t\tUseCreator:   true,\n\t\t\t\t\tTermui:       fakeTermui,\n\t\t\t\t}\n\n\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, filepath.Base(lifecycle.AppDir()), \"workspace\")\n\n\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 1)\n\n\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\tif entry.Name() == \"creator\" {\n\t\t\t\t\t\th.AssertSliceContains(t, entry.ContainerConfig().Cmd, \"/some/image\")\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t})\n\n\t\t\twhen(\"Run with workspace dir\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\tClearCache:   false,\n\t\t\t\t\t\tRunImage:     \"test\",\n\t\t\t\t\t\tImage:        imageName,\n\t\t\t\t\t\tBuilder:      fakeBuilder,\n\t\t\t\t\t\tTrustBuilder: true,\n\t\t\t\t\t\tWorkspace:    \"app\",\n\t\t\t\t\t\tUseCreator:   true,\n\t\t\t\t\t\tTermui:       fakeTermui,\n\t\t\t\t\t}\n\n\t\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 1)\n\n\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\tif entry.Name() == \"creator\" {\n\t\t\t\t\t\t\th.AssertSliceContainsInOrder(t, entry.ContainerConfig().Cmd, \"-app\", 
\"/app\")\n\t\t\t\t\t\t\th.AssertSliceContains(t, entry.ContainerConfig().Cmd, \"/some/image\")\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"there are extensions\", func() {\n\t\t\t\tprovidedUseCreator = true\n\t\t\t\tprovidedOrderExt = dist.Order{dist.OrderEntry{Group: []dist.ModuleRef{ /* don't care */ }}}\n\n\t\t\t\twhen(\"platform < 0.10\", func() {\n\t\t\t\t\tplatformAPI = api.MustParse(\"0.9\")\n\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 1)\n\n\t\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\tif entry.Name() == \"creator\" {\n\t\t\t\t\t\t\t\th.AssertSliceContains(t, entry.ContainerConfig().Cmd, providedTargetImage)\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform >= 0.10\", func() {\n\t\t\t\t\tplatformAPI = api.MustParse(\"0.10\")\n\n\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"use creator with extensions supported by the lifecycle\", func() {\n\t\t\t\t\t\tuseCreatorWithExtensions = true\n\n\t\t\t\t\t\tit(\"allows the build to proceed (but the creator will error if extensions are detected)\", func() {\n\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 1)\n\n\t\t\t\t\t\t\tfor _, entry := range 
fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\t\tif entry.Name() == \"creator\" {\n\t\t\t\t\t\t\t\t\th.AssertSliceContains(t, entry.ContainerConfig().Cmd, providedTargetImage)\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Run without using creator\", func() {\n\t\t\twhen(\"platform < 0.7\", func() {\n\t\t\t\tit(\"calls the phases with the right order\", func() {\n\t\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\tClearCache:   false,\n\t\t\t\t\t\tRunImage:     \"test\",\n\t\t\t\t\t\tImage:        imageName,\n\t\t\t\t\t\tBuilder:      fakeBuilder,\n\t\t\t\t\t\tTrustBuilder: false,\n\t\t\t\t\t\tUseCreator:   false,\n\t\t\t\t\t\tTermui:       fakeTermui,\n\t\t\t\t\t}\n\n\t\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\t\t\t\t\texpectedPhases := []string{\n\t\t\t\t\t\t\"detector\", \"analyzer\", \"restorer\", \"builder\", \"exporter\",\n\t\t\t\t\t}\n\t\t\t\t\tfor i, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\th.AssertEq(t, entry.Name(), expectedPhases[i])\n\t\t\t\t\t}\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform >= 0.7\", func() {\n\t\t\t\tit(\"calls the phases with the right order\", func() {\n\t\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(fakes.WithSupportedPlatformAPIs([]*api.Version{api.MustParse(\"0.7\")}))\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\tClearCache:   false,\n\t\t\t\t\t\tRunImage:     \"test\",\n\t\t\t\t\t\tImage:        imageName,\n\t\t\t\t\t\tBuilder:      
fakeBuilder,\n\t\t\t\t\t\tTrustBuilder: false,\n\t\t\t\t\t\tUseCreator:   false,\n\t\t\t\t\t\tTermui:       fakeTermui,\n\t\t\t\t\t}\n\n\t\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\t\t\t\t\texpectedPhases := []string{\n\t\t\t\t\t\t\"analyzer\", \"detector\", \"restorer\", \"builder\", \"exporter\",\n\t\t\t\t\t}\n\t\t\t\t\tfor i, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\th.AssertEq(t, entry.Name(), expectedPhases[i])\n\t\t\t\t\t}\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"succeeds\", func() {\n\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\tPublish:      false,\n\t\t\t\t\tClearCache:   false,\n\t\t\t\t\tRunImage:     \"test\",\n\t\t\t\t\tImage:        imageName,\n\t\t\t\t\tBuilder:      fakeBuilder,\n\t\t\t\t\tTrustBuilder: false,\n\t\t\t\t\tUseCreator:   false,\n\t\t\t\t\tTermui:       fakeTermui,\n\t\t\t\t}\n\n\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\n\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\tswitch entry.Name() {\n\t\t\t\t\tcase \"exporter\":\n\t\t\t\t\t\th.AssertSliceContains(t, entry.ContainerConfig().Cmd, \"/some/image\")\n\t\t\t\t\tcase \"analyzer\":\n\t\t\t\t\t\th.AssertSliceContains(t, entry.ContainerConfig().Cmd, \"/some/image\")\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t})\n\n\t\t\twhen(\"Run with workspace 
dir\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\tClearCache:   false,\n\t\t\t\t\t\tRunImage:     \"test\",\n\t\t\t\t\t\tImage:        imageName,\n\t\t\t\t\t\tBuilder:      fakeBuilder,\n\t\t\t\t\t\tTrustBuilder: false,\n\t\t\t\t\t\tWorkspace:    \"app\",\n\t\t\t\t\t\tUseCreator:   false,\n\t\t\t\t\t\tTermui:       fakeTermui,\n\t\t\t\t\t}\n\n\t\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, filepath.Base(lifecycle.AppDir()), \"app\")\n\n\t\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\n\t\t\t\t\tappCount := 0\n\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\tswitch entry.Name() {\n\t\t\t\t\t\tcase \"detector\", \"builder\", \"exporter\":\n\t\t\t\t\t\t\th.AssertSliceContainsInOrder(t, entry.ContainerConfig().Cmd, \"-app\", \"/app\")\n\t\t\t\t\t\t\tappCount++\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t\th.AssertEq(t, appCount, 3)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--clear-cache\", func() {\n\t\t\t\tprovidedUseCreator = false\n\t\t\t\tprovidedClearCache = true\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) { // allow buildCache.Clear to succeed without requiring the docker daemon to be running\n\t\t\t\t\toptions.Cache.Build.Format = cache.CacheBind\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform < 0.10\", func() {\n\t\t\t\t\tplatformAPI = api.MustParse(\"0.9\")\n\n\t\t\t\t\tit(\"does not run restore\", func() {\n\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\treturn 
fakePhaseFactory\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 4)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform >= 0.10\", func() {\n\t\t\t\t\tplatformAPI = api.MustParse(\"0.10\")\n\n\t\t\t\t\tit(\"runs restore\", func() {\n\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"extensions\", func() {\n\t\t\t\tprovidedUseCreator = false\n\t\t\t\tprovidedOrderExt = dist.Order{dist.OrderEntry{Group: []dist.ModuleRef{ /* don't care */ }}}\n\n\t\t\t\twhen(\"for build\", func() {\n\t\t\t\t\twhen(\"present in <layers>/generated/<buildpack-id>\", func() {\n\t\t\t\t\t\textensionsForBuild = true\n\n\t\t\t\t\t\twhen(\"platform >= 0.13\", func() {\n\t\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.13\")\n\n\t\t\t\t\t\t\tit(\"runs the extender (build)\", func() {\n\t\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\n\t\t\t\t\t\t\t\tvar found bool\n\t\t\t\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\t\t\tif entry.Name() == \"extender\" {\n\t\t\t\t\t\t\t\t\t\tfound = true\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\th.AssertEq(t, found, true)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"present in <layers>/generated/build\", func() {\n\t\t\t\t\t\textensionsForBuild = true\n\n\t\t\t\t\t\twhen(\"platform < 0.10\", func() {\n\t\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.9\")\n\n\t\t\t\t\t\t\tit(\"runs the 
builder\", func() {\n\t\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\n\t\t\t\t\t\t\t\tvar found bool\n\t\t\t\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\t\t\tif entry.Name() == \"builder\" {\n\t\t\t\t\t\t\t\t\t\tfound = true\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\th.AssertEq(t, found, true)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"platform 0.10 to 0.12\", func() {\n\t\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.10\")\n\n\t\t\t\t\t\t\tit(\"runs the extender (build)\", func() {\n\t\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\n\t\t\t\t\t\t\t\tvar found bool\n\t\t\t\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\t\t\tif entry.Name() == \"extender\" {\n\t\t\t\t\t\t\t\t\t\tfound = true\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\th.AssertEq(t, found, true)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"not present in <layers>/generated/build\", func() {\n\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.10\")\n\n\t\t\t\t\t\tit(\"runs the builder\", func() {\n\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\n\t\t\t\t\t\t\tvar found bool\n\t\t\t\t\t\t\tfor _, entry := range 
fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\t\tif entry.Name() == \"builder\" {\n\t\t\t\t\t\t\t\t\tfound = true\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\th.AssertEq(t, found, true)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"for run\", func() {\n\t\t\t\t\twhen(\"analyzed.toml run image\", func() {\n\t\t\t\t\t\twhen(\"matches provided run image\", func() {\n\t\t\t\t\t\t\tit(\"does nothing\", func() {\n\t\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeFetcher.callCount, 0)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"does not match provided run image\", func() {\n\t\t\t\t\t\t\textensionsRunImageName = \"some-new-run-image\"\n\t\t\t\t\t\t\textensionsRunImageIdentifier = \"some-new-run-image-identifier\"\n\n\t\t\t\t\t\t\tit(\"pulls the new run image\", func() {\n\t\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeFetcher.callCount, 2)\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeFetcher.calledWithArgAtCall[0], \"some-new-run-image\")\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeFetcher.calledWithArgAtCall[1], \"some-new-run-image-identifier\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"analyzed.toml run image extend\", func() {\n\t\t\t\t\t\twhen(\"true\", func() {\n\t\t\t\t\t\t\textensionsForRun = true\n\n\t\t\t\t\t\t\twhen(\"platform >= 0.12\", func() {\n\t\t\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\n\t\t\t\t\t\t\t\tit(\"runs the extender (run)\", func() {\n\t\t\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory 
{\n\t\t\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 6)\n\n\t\t\t\t\t\t\t\t\tvar found bool\n\t\t\t\t\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\t\t\t\tif entry.Name() == \"extender\" {\n\t\t\t\t\t\t\t\t\t\t\tfound = true\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\th.AssertEq(t, found, true)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"platform < 0.12\", func() {\n\t\t\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.11\")\n\n\t\t\t\t\t\t\t\tit(\"doesn't run the extender\", func() {\n\t\t\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\n\t\t\t\t\t\t\t\t\tvar found bool\n\t\t\t\t\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\t\t\t\tif entry.Name() == \"extender\" {\n\t\t\t\t\t\t\t\t\t\t\tfound = true\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\th.AssertEq(t, found, false)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"false\", func() {\n\t\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\n\t\t\t\t\t\t\tit(\"doesn't run the extender\", func() {\n\t\t\t\t\t\t\t\terr := lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, len(fakePhaseFactory.NewCalledWithProvider), 5)\n\n\t\t\t\t\t\t\t\tvar found bool\n\t\t\t\t\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\t\t\t\t\tif entry.Name() == \"extender\" 
{\n\t\t\t\t\t\t\t\t\t\tfound = true\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\th.AssertEq(t, found, false)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"network is not provided\", func() {\n\t\t\tit(\"creates an ephemeral bridge network\", func() {\n\t\t\t\tbeforeNetworks := func() int {\n\t\t\t\t\tnetworks, err := docker.NetworkList(context.Background(), client.NetworkListOptions{})\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\treturn len(networks.Items)\n\t\t\t\t}()\n\n\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\tImage:   imageName,\n\t\t\t\t\tBuilder: fakeBuilder,\n\t\t\t\t\tTermui:  fakeTermui,\n\t\t\t\t}\n\n\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tfor _, entry := range fakePhaseFactory.NewCalledWithProvider {\n\t\t\t\t\th.AssertContains(t, string(entry.HostConfig().NetworkMode), \"pack.local-network-\")\n\t\t\t\t\th.AssertEq(t, entry.HostConfig().NetworkMode.IsDefault(), false)\n\t\t\t\t\th.AssertEq(t, entry.HostConfig().NetworkMode.IsHost(), false)\n\t\t\t\t\th.AssertEq(t, entry.HostConfig().NetworkMode.IsNone(), false)\n\t\t\t\t\th.AssertEq(t, entry.HostConfig().NetworkMode.IsPrivate(), true)\n\t\t\t\t\th.AssertEq(t, entry.HostConfig().NetworkMode.IsUserDefined(), true)\n\t\t\t\t}\n\n\t\t\t\tafterNetworks := func() int {\n\t\t\t\t\tnetworks, err := docker.NetworkList(context.Background(), client.NetworkListOptions{})\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\treturn len(networks.Items)\n\t\t\t\t}()\n\t\t\t\th.AssertEq(t, beforeNetworks, afterNetworks)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Error cases\", func() {\n\t\t\twhen(\"passed invalid\", func() {\n\t\t\t\tit(\"fails for cache-image\", func() {\n\t\t\t\t\topts := 
build.LifecycleOptions{\n\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\tClearCache:   false,\n\t\t\t\t\t\tRunImage:     \"test\",\n\t\t\t\t\t\tImage:        imageName,\n\t\t\t\t\t\tBuilder:      fakeBuilder,\n\t\t\t\t\t\tTrustBuilder: false,\n\t\t\t\t\t\tUseCreator:   false,\n\t\t\t\t\t\tCacheImage:   \"%%%\",\n\t\t\t\t\t\tTermui:       fakeTermui,\n\t\t\t\t\t}\n\n\t\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"invalid cache image name: %s\", \"could not parse reference: %%\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"fails for cache flags\", func() {\n\t\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\tClearCache:   false,\n\t\t\t\t\t\tRunImage:     \"test\",\n\t\t\t\t\t\tImage:        imageName,\n\t\t\t\t\t\tBuilder:      fakeBuilder,\n\t\t\t\t\t\tTrustBuilder: false,\n\t\t\t\t\t\tUseCreator:   false,\n\t\t\t\t\t\tCache: cache.CacheOpts{\n\t\t\t\t\t\t\tBuild: cache.CacheInfo{\n\t\t\t\t\t\t\t\tFormat: cache.CacheImage,\n\t\t\t\t\t\t\t\tSource: \"%%%\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tTermui: fakeTermui,\n\t\t\t\t\t}\n\n\t\t\t\t\tlifecycle, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = lifecycle.Run(context.Background(), func(execution *build.LifecycleExecution) build.PhaseFactory {\n\t\t\t\t\t\treturn fakePhaseFactory\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"invalid cache image name: %s\", \"could not parse reference: %%\"))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Create\", func() {\n\t\tit.Before(func() {\n\t\t\terr := lifecycle.Create(context.Background(), fakeBuildCache, fakeLaunchCache, 
fakePhaseFactory)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\th.AssertEq(t, configProvider.Name(), \"creator\")\n\t\t})\n\n\t\tit(\"creates a phase and then runs it\", func() {\n\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\t\t})\n\n\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t\t[]string{\"-run-image\", providedRunImage},\n\t\t\t\t[]string{providedTargetImage},\n\t\t\t)\n\t\t})\n\n\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t})\n\n\t\twhen(\"clear cache\", func() {\n\t\t\tprovidedClearCache = true\n\n\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-skip-restore\"},\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"clear cache is false\", func() {\n\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-cache-dir\", \"/cache\"},\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"using a cache image\", func() {\n\t\t\tprovidedClearCache = true\n\t\t\tfakeBuildCache = newFakeImageCache()\n\n\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-skip-restore\"},\n\t\t\t\t\t[]string{\"-cache-image\", 
\"some-cache-image\"},\n\t\t\t\t)\n\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \":/cache\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"additional tags are specified\", func() {\n\t\t\tit(\"configures phases with additional tags\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-tag\", providedAdditionalTags[0], \"-tag\", providedAdditionalTags[1]},\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"publish\", func() {\n\t\t\tprovidedPublish = true\n\n\t\t\tit(\"configures the phase with binds\", func() {\n\t\t\t\texpectedBinds := providedVolumes\n\t\t\t\texpectedBinds = append(expectedBinds, \"some-cache:/cache\")\n\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\t\t\t})\n\n\t\t\tit(\"configures the phase with root\", func() {\n\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t\t})\n\n\t\t\tit(\"configures the phase with registry access\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_REGISTRY_AUTH={}\")\n\t\t\t})\n\n\t\t\twhen(\"using a cache image\", func() {\n\t\t\t\tfakeBuildCache = newFakeImageCache()\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-cache-image\", \"some-cache-image\"},\n\t\t\t\t\t)\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \":/cache\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform 0.3\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.3\")\n\n\t\t\t\tit(\"doesn't hint at default process type\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-process-type\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform 0.4\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.4\")\n\n\t\t\t\tit(\"hints at default process type\", func() 
{\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t, configProvider.ContainerConfig().Cmd, []string{\"-process-type\", \"web\"})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform >= 0.6\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.6\")\n\n\t\t\t\twhen(\"no user provided process type is present\", func() {\n\t\t\t\t\tit(\"doesn't provide 'web' as default process type\", func() {\n\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-process-type\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"publish is false\", func() {\n\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-daemon\"},\n\t\t\t\t\t[]string{\"-launch-cache\", \"/launch-cache\"},\n\t\t\t\t)\n\t\t\t})\n\n\t\t\twhen(\"no docker-host\", func() {\n\t\t\t\tit(\"configures the phase with daemon access\", func() {\n\t\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, \"/var/run/docker.sock:/var/run/docker.sock\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"tcp docker-host\", func() {\n\t\t\t\tprovidedDockerHost = `tcp://localhost:1234`\n\n\t\t\t\tit(\"configures the phase with daemon access with tcp docker-host\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \"/var/run/docker.sock:/var/run/docker.sock\")\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"DOCKER_HOST=tcp://localhost:1234\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"alternative unix socket docker-host\", func() {\n\t\t\t\tprovidedDockerHost = `unix:///home/user/docker.sock`\n\n\t\t\t\tit(\"configures the phase with daemon access\", func() {\n\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, \"/home/user/docker.sock:/var/run/docker.sock\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"alternative windows pipe 
docker-host\", func() {\n\t\t\t\tprovidedDockerHost = `npipe:\\\\\\\\.\\pipe\\docker_engine_alt`\n\n\t\t\t\tit(\"configures the phase with daemon access\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \"/home/user/docker.sock:/var/run/docker.sock\")\n\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, `\\\\.\\pipe\\docker_engine_alt:\\\\.\\pipe\\docker_engine`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"environment variable DOCKER_HOST is set\", func() {\n\t\t\t\tprovidedDockerHost = `inherit`\n\n\t\t\t\tvar (\n\t\t\t\t\toldDH       string\n\t\t\t\t\toldDHExists bool\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\toldDH, oldDHExists = os.LookupEnv(\"DOCKER_HOST\")\n\t\t\t\t\tos.Setenv(\"DOCKER_HOST\", \"tcp://example.com:1234\")\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\tif oldDHExists {\n\t\t\t\t\t\tos.Setenv(\"DOCKER_HOST\", oldDH)\n\t\t\t\t\t} else {\n\t\t\t\t\t\tos.Unsetenv(\"DOCKER_HOST\")\n\t\t\t\t\t}\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with daemon access with inherited docker-host\", func() {\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, true, \"some-temp-dir\", lifecycleOps...)\n\t\t\t\t\tfakePhase := &fakes.FakePhase{}\n\t\t\t\t\tfakePhaseFactory := fakes.NewFakePhaseFactory(fakes.WhichReturnsForNew(fakePhase))\n\t\t\t\t\terr := lifecycle.Create(context.Background(), fakeBuildCache, fakeLaunchCache, fakePhaseFactory)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\t\t\tconfigProvider := fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"DOCKER_HOST=tcp://example.com:1234\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"docker-host with unknown protocol\", func() {\n\t\t\t\tprovidedDockerHost = `withoutprotocol`\n\n\t\t\t\tit(\"configures the phase with daemon access\", func() 
{\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"DOCKER_HOST=withoutprotocol\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"configures the phase with binds\", func() {\n\t\t\t\texpectedBinds := providedVolumes\n\t\t\t\texpectedBinds = append(expectedBinds, \"some-cache:/cache\", \"some-launch-cache:/launch-cache\")\n\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\t\t\t})\n\n\t\t\twhen(\"platform 0.3\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.3\")\n\n\t\t\t\tit(\"doesn't hint at default process type\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-process-type\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform 0.4\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.4\")\n\n\t\t\t\tit(\"hints at default process type\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t, configProvider.ContainerConfig().Cmd, []string{\"-process-type\", \"web\"})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform >= 0.6\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.6\")\n\n\t\t\t\twhen(\"no user provided process type is present\", func() {\n\t\t\t\t\tit(\"doesn't provide 'web' as default process type\", func() {\n\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-process-type\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"override GID\", func() {\n\t\t\twhen(\"override GID is provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.GID = 2\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-gid\", \"2\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"override GID is not provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) 
{\n\t\t\t\t\toptions.GID = -1\n\t\t\t\t})\n\n\t\t\t\tit(\"gid is not added to the expected arguments\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-gid\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"override UID\", func() {\n\t\t\twhen(\"override UID is provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.UID = 1001\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-uid\", \"1001\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"override UID is not provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.UID = -1\n\t\t\t\t})\n\n\t\t\t\tit(\"uid is not added to the expected arguments\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-uid\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"-previous-image is used and builder is trusted\", func() {\n\t\t\twhen(\"image is invalid\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\timageName, err := name.NewTag(\"/x/y/?!z\", name.WeakValidation)\n\t\t\t\t\th.AssertError(t, err, \"repository can only contain the characters `abcdefghijklmnopqrstuvwxyz0123456789_-./`\")\n\n\t\t\t\t\tlifecycleOps := append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t\toptions.PreviousImage = \"some-previous-image\"\n\t\t\t\t\t})\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, true, \"some-temp-dir\", lifecycleOps...)\n\n\t\t\t\t\terr = lifecycle.Create(context.Background(), fakeBuildCache, fakeLaunchCache, fakePhaseFactory)\n\t\t\t\t\th.AssertError(t, err, \"invalid image name\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"previous-image is invalid\", func() {\n\t\t\t\tit(\"errors\", func() 
{\n\t\t\t\t\timageName, err := name.NewTag(\"/some/image\", name.WeakValidation)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tlifecycleOps := append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\toptions.PreviousImage = \"%%%\"\n\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t})\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, true, \"some-temp-dir\", lifecycleOps...)\n\n\t\t\t\t\terr = lifecycle.Create(context.Background(), fakeBuildCache, fakeLaunchCache, fakePhaseFactory)\n\t\t\t\t\th.AssertError(t, err, \"invalid previous image name\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--publish is false\", func() {\n\t\t\t\timageName, _ := name.NewTag(\"/some/image\", name.WeakValidation)\n\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.PreviousImage = \"some-previous-image\"\n\t\t\t\t\toptions.Image = imageName\n\t\t\t\t})\n\n\t\t\t\tit(\"successfully passes previous-image to creator\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t, configProvider.ContainerConfig().Cmd, []string{\"-previous-image\", \"some-previous-image\"})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--publish is true\", func() {\n\t\t\t\tprovidedPublish = true\n\n\t\t\t\twhen(\"previous-image and image are in the same registry\", func() {\n\t\t\t\t\timageName, _ := name.NewTag(\"/some/image\", name.WeakValidation)\n\n\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\toptions.PreviousImage = \"index.docker.io/some/previous:latest\"\n\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"successfully passes previous-image to creator\", func() {\n\t\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t, configProvider.ContainerConfig().Cmd, []string{\"-previous-image\", \"index.docker.io/some/previous:latest\"})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"previous-image and image are not in the same registry\", func() {\n\t\t\t\t\tit(\"errors\", func() 
{\n\t\t\t\t\t\timageName, err := name.NewTag(\"/some/image\", name.WeakValidation)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tlifecycleOps := append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\t\toptions.PreviousImage = \"example.io/some/previous:latest\"\n\t\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t\t})\n\t\t\t\t\t\tlifecycle := newTestLifecycleExec(t, true, \"some-temp-dir\", lifecycleOps...)\n\n\t\t\t\t\t\terr = lifecycle.Create(context.Background(), fakeBuildCache, fakeLaunchCache, fakePhaseFactory)\n\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"%s\", err))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"interactive mode\", func() {\n\t\t\tlifecycleOps = append(lifecycleOps, func(opts *build.LifecycleOptions) {\n\t\t\t\topts.Interactive = true\n\t\t\t\topts.Termui = &fakes.FakeTermui{ReadLayersFunc: func(_ io.ReadCloser) {\n\t\t\t\t\t// no-op\n\t\t\t\t}}\n\t\t\t})\n\n\t\t\tit(\"provides the termui readLayersFunc as a post container operation\", func() {\n\t\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\n\t\t\t\th.AssertEq(t, len(configProvider.PostContainerRunOps()), 2)\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[0], \"EnsureVolumeAccess\")\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[1], \"CopyOut\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"sbom destination directory is provided\", func() {\n\t\t\tlifecycleOps = append(lifecycleOps, func(opts *build.LifecycleOptions) {\n\t\t\t\topts.SBOMDestinationDir = \"some-destination-dir\"\n\t\t\t})\n\n\t\t\tit(\"provides copy-sbom-func as a post container operation\", func() {\n\t\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\n\t\t\t\th.AssertEq(t, len(configProvider.PostContainerRunOps()), 2)\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[0], \"EnsureVolumeAccess\")\n\t\t\t\th.AssertFunctionName(t, 
configProvider.PostContainerRunOps()[1], \"CopyOut\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"report destination directory is provided\", func() {\n\t\t\tlifecycleOps = append(lifecycleOps, func(opts *build.LifecycleOptions) {\n\t\t\t\topts.ReportDestinationDir = \"a-destination-dir\"\n\t\t\t})\n\n\t\t\tit(\"provides copy-report-func as a post container operation\", func() {\n\t\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\n\t\t\t\th.AssertEq(t, len(configProvider.PostContainerRunOps()), 2)\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[0], \"EnsureVolumeAccess\")\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[1], \"CopyOut\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--creation-time\", func() {\n\t\t\twhen(\"platform < 0.9\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.8\")\n\n\t\t\t\tintTime, _ := strconv.ParseInt(\"1234567890\", 10, 64)\n\t\t\t\tprovidedTime := time.Unix(intTime, 0).UTC()\n\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(baseOpts *build.LifecycleOptions) {\n\t\t\t\t\tbaseOpts.CreationTime = &providedTime\n\t\t\t\t})\n\n\t\t\t\tit(\"is ignored\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Env, \"SOURCE_DATE_EPOCH=1234567890\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform >= 0.9\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.9\")\n\n\t\t\t\twhen(\"provided\", func() {\n\t\t\t\t\tintTime, _ := strconv.ParseInt(\"1234567890\", 10, 64)\n\t\t\t\t\tprovidedTime := time.Unix(intTime, 0).UTC()\n\n\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(baseOpts *build.LifecycleOptions) {\n\t\t\t\t\t\tbaseOpts.CreationTime = &providedTime\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"configures the phase with env SOURCE_DATE_EPOCH\", func() {\n\t\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"SOURCE_DATE_EPOCH=1234567890\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"not provided\", func() 
{\n\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(baseOpts *build.LifecycleOptions) {\n\t\t\t\t\t\tbaseOpts.CreationTime = nil\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"does not panic\", func() {\n\t\t\t\t\t\t// no-op\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"layout\", func() {\n\t\t\tprovidedLayout = true\n\t\t\tlayoutRepo := filepath.Join(paths.RootDir, \"layout-repo\")\n\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\n\t\t\tit(\"configures the phase with oci layout environment variables\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_USE_LAYOUT=true\")\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_LAYOUT_DIR=%s\", layoutRepo))\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_EXPERIMENTAL_MODE=warn\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Detect\", func() {\n\t\tit.Before(func() {\n\t\t\terr := lifecycle.Detect(context.Background(), fakePhaseFactory)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\th.AssertEq(t, configProvider.Name(), \"detector\")\n\t\t})\n\n\t\tit(\"creates a phase and then runs it\", func() {\n\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\t\t})\n\n\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t)\n\t\t})\n\n\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t})\n\n\t\tit(\"configures the phase to copy app dir\", func() {\n\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, 
providedVolumes...)\n\t\t\th.AssertEq(t, len(configProvider.ContainerOps()), 2)\n\t\t\th.AssertFunctionName(t, configProvider.ContainerOps()[0], \"EnsureVolumeAccess\")\n\t\t\th.AssertFunctionName(t, configProvider.ContainerOps()[1], \"CopyDir\")\n\t\t})\n\n\t\twhen(\"extensions\", func() {\n\t\t\tplatformAPI = api.MustParse(\"0.10\")\n\n\t\t\twhen(\"present in the order\", func() {\n\t\t\t\tprovidedOrderExt = dist.Order{dist.OrderEntry{Group: []dist.ModuleRef{ /* don't care */ }}}\n\n\t\t\t\tit(\"sets CNB_EXPERIMENTAL_MODE=warn in the environment\", func() {\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_EXPERIMENTAL_MODE=warn\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"not present in the order\", func() {\n\t\t\t\tit(\"does not set CNB_EXPERIMENTAL_MODE=warn in the environment\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Env, \"CNB_EXPERIMENTAL_MODE=warn\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Analyze\", func() {\n\t\tit.Before(func() {\n\t\t\terr := lifecycle.Analyze(context.Background(), fakeBuildCache, fakeLaunchCache, fakePhaseFactory)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\th.AssertEq(t, configProvider.Name(), \"analyzer\")\n\t\t})\n\n\t\tit(\"creates a phase and then runs it\", func() {\n\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\t\t})\n\n\t\twhen(\"platform < 0.7\", func() {\n\t\t\twhen(\"clear cache\", func() {\n\t\t\t\tprovidedClearCache = true\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Cmd, \"-skip-layers\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"clear cache is false\", func() {\n\t\t\t\tit(\"configures the phase with the expected 
arguments\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-cache-dir\", \"/cache\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"using a cache image\", func() {\n\t\t\t\tfakeBuildCache = newFakeImageCache()\n\n\t\t\t\tit(\"configures the phase with a build cache image\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \":/cache\")\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-cache-image\", \"some-cache-image\"},\n\t\t\t\t\t)\n\t\t\t\t\th.AssertSliceNotContains(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t\"-cache-dir\",\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\twhen(\"clear-cache\", func() {\n\t\t\t\t\tprovidedClearCache = true\n\n\t\t\t\t\tit(\"cache is omitted from Analyze\", func() {\n\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-cache-image\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"platform >= 0.7\", func() {\n\t\t\tplatformAPI = api.MustParse(\"0.7\")\n\n\t\t\tit(\"doesn't set cache dir\", func() {\n\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \":/cache\")\n\t\t\t})\n\n\t\t\tit(\"passes additional tags\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-tag\", \"some-additional-tag2\", \"-tag\", \"some-additional-tag1\"},\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"passes run image\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-run-image\", \"some-run-image\"},\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"passes stack\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-stack\", 
\"/layers/stack.toml\"},\n\t\t\t\t)\n\t\t\t})\n\n\t\t\twhen(\"previous image\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.PreviousImage = \"some-previous-image\"\n\t\t\t\t})\n\n\t\t\t\tit(\"passes previous image\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-previous-image\", \"some-previous-image\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"platform >= 0.12\", func() {\n\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\n\t\t\tit(\"passes run\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-run\", \"/layers/run.toml\"},\n\t\t\t\t)\n\t\t\t\th.AssertSliceNotContains(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\"-stack\",\n\t\t\t\t)\n\t\t\t})\n\n\t\t\twhen(\"layout is true\", func() {\n\t\t\t\tprovidedLayout = true\n\n\t\t\t\tit(\"configures the phase with the expected environment variables\", func() {\n\t\t\t\t\tlayoutDir := filepath.Join(paths.RootDir, \"layout-repo\")\n\t\t\t\t\th.AssertSliceContains(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Env, \"CNB_USE_LAYOUT=true\", fmt.Sprintf(\"CNB_LAYOUT_DIR=%s\", layoutDir),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"publish\", func() {\n\t\t\tprovidedPublish = true\n\n\t\t\twhen(\"lifecycle image\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.LifecycleImage = \"some-lifecycle-image\"\n\t\t\t\t})\n\n\t\t\t\tit(\"runs the phase with the lifecycle image\", func() {\n\t\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().Image, \"some-lifecycle-image\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"sets the CNB_USER_ID and CNB_GROUP_ID in the environment\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_USER_ID=%d\", 
providedUID))\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_GROUP_ID=%d\", providedGID))\n\t\t\t})\n\n\t\t\tit(\"configures the phase with registry access\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_REGISTRY_AUTH={}\")\n\t\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t\t})\n\n\t\t\tit(\"configures the phase with root\", func() {\n\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t\t})\n\n\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t\t\t[]string{providedTargetImage},\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"configures the phase with binds\", func() {\n\t\t\t\texpectedBind := \"some-cache:/cache\"\n\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBind)\n\t\t\t})\n\n\t\t\twhen(\"using a cache image\", func() {\n\t\t\t\tfakeBuildCache = newFakeImageCache()\n\n\t\t\t\tit(\"configures the phase with a build cache images\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \":/cache\")\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-cache-image\", \"some-cache-image\"},\n\t\t\t\t\t)\n\t\t\t\t\th.AssertSliceNotContains(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t\"-cache-dir\",\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"publish is false\", func() {\n\t\t\twhen(\"lifecycle image\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.LifecycleImage = \"some-lifecycle-image\"\n\t\t\t\t})\n\n\t\t\t\tit(\"runs the phase with the lifecycle image\", func() {\n\t\t\t\t\th.AssertEq(t, 
configProvider.ContainerConfig().Image, \"some-lifecycle-image\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"sets the CNB_USER_ID and CNB_GROUP_ID in the environment\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_USER_ID=%d\", providedUID))\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_GROUP_ID=%d\", providedGID))\n\t\t\t})\n\n\t\t\tit(\"configures the phase with daemon access\", func() {\n\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, \"/var/run/docker.sock:/var/run/docker.sock\")\n\t\t\t})\n\n\t\t\twhen(\"tcp docker-host\", func() {\n\t\t\t\tprovidedDockerHost = `tcp://localhost:1234`\n\n\t\t\t\tit(\"configures the phase with daemon access with TCP docker-host\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \"/var/run/docker.sock:/var/run/docker.sock\")\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"DOCKER_HOST=tcp://localhost:1234\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t\t\t[]string{\"-daemon\"},\n\t\t\t\t\t[]string{providedTargetImage},\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t\t})\n\n\t\t\tit(\"configures the phase with binds\", func() {\n\t\t\t\texpectedBind := \"some-cache:/cache\"\n\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBind)\n\t\t\t})\n\n\t\t\twhen(\"platform >= 0.9\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.9\")\n\n\t\t\t\tprovidedClearCache = true\n\n\t\t\t\tit(\"configures the phase 
with launch cache and skip layers\", func() {\n\t\t\t\t\texpectedBinds := []string{\"some-launch-cache:/launch-cache\"}\n\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-skip-layers\"},\n\t\t\t\t\t\t[]string{\"-launch-cache\", \"/launch-cache\"},\n\t\t\t\t\t)\n\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\t\t\t\t})\n\n\t\t\t\twhen(\"override GID\", func() {\n\t\t\t\t\twhen(\"override GID is provided\", func() {\n\t\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\t\toptions.GID = 2\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t\t\t[]string{\"-gid\", \"2\"},\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"override GID is not provided\", func() {\n\t\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\t\toptions.GID = -1\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"gid is not added to the expected arguments\", func() {\n\t\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-gid\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"override UID\", func() {\n\t\t\t\t\twhen(\"override UID is provided\", func() {\n\t\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\t\toptions.UID = 1001\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t\t\t[]string{\"-uid\", \"1001\"},\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"override UID is not provided\", func() {\n\t\t\t\t\t\tlifecycleOps = append(lifecycleOps, 
func(options *build.LifecycleOptions) {\n\t\t\t\t\t\t\toptions.UID = -1\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"uid is not added to the expected arguments\", func() {\n\t\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-uid\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"previous-image is used and builder is untrusted\", func() {\n\t\t\twhen(\"image is invalid\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\tvar imageName name.Tag\n\t\t\t\t\timageName, err := name.NewTag(\"/x/y/?!z\", name.WeakValidation)\n\t\t\t\t\th.AssertError(t, err, \"repository can only contain the characters `abcdefghijklmnopqrstuvwxyz0123456789_-./`\")\n\n\t\t\t\t\tlifecycleOps := append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t\toptions.PreviousImage = \"some-previous-image\"\n\t\t\t\t\t})\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, true, \"some-temp-dir\", lifecycleOps...)\n\n\t\t\t\t\terr = lifecycle.Analyze(context.Background(), fakeBuildCache, fakeLaunchCache, fakePhaseFactory)\n\t\t\t\t\th.AssertError(t, err, \"invalid image name\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"previous-image is invalid\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\tvar imageName name.Tag\n\t\t\t\t\timageName, err := name.NewTag(\"/some/image\", name.WeakValidation)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tlifecycleOps := append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\toptions.PreviousImage = \"%%%\"\n\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t})\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, true, \"some-temp-dir\", lifecycleOps...)\n\n\t\t\t\t\terr = lifecycle.Analyze(context.Background(), fakeBuildCache, fakeLaunchCache, fakePhaseFactory)\n\t\t\t\t\th.AssertError(t, err, \"invalid previous image name\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--publish is false\", func() {\n\t\t\t\twhen(\"previous image\", func() {\n\t\t\t\t\timageName, 
_ := name.NewTag(\"/some/image\", name.WeakValidation)\n\n\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\toptions.PreviousImage = \"previous-image\"\n\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"successfully passes previous-image to analyzer\", func() {\n\t\t\t\t\t\tprevImage, err := name.ParseReference(lifecycle.PrevImageName(), name.WeakValidation)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, lifecycle.ImageName().Name(), prevImage.Name())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--publish is true\", func() {\n\t\t\t\tprovidedPublish = true\n\n\t\t\t\twhen(\"previous-image and image are in the same registry\", func() {\n\t\t\t\t\timageName, _ := name.NewTag(\"/some/image\", name.WeakValidation)\n\n\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\toptions.PreviousImage = \"index.docker.io/some/previous:latest\"\n\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"successfully passes previous-image to analyzer\", func() {\n\t\t\t\t\t\tprevImage, err := name.ParseReference(lifecycle.PrevImageName(), name.WeakValidation)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, lifecycle.ImageName().Name(), prevImage.Name())\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"previous-image and image are not in the same registry\", func() {\n\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\timageName, err := name.NewTag(\"/some/image\", name.WeakValidation)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tlifecycleOps := append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\t\toptions.PreviousImage = \"example.io/some/previous:latest\"\n\t\t\t\t\t\t\toptions.Image = imageName\n\t\t\t\t\t\t})\n\t\t\t\t\t\tlifecycle := newTestLifecycleExec(t, true, \"some-temp-dir\", lifecycleOps...)\n\n\t\t\t\t\t\terr = lifecycle.Analyze(context.Background(), fakeBuildCache, fakeLaunchCache, 
fakePhaseFactory)\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Restore\", func() {\n\t\tit.Before(func() {\n\t\t\terr := lifecycle.Restore(context.Background(), fakeBuildCache, fakeKanikoCache, fakePhaseFactory)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\th.AssertEq(t, configProvider.Name(), \"restorer\")\n\t\t})\n\n\t\twhen(\"lifecycle image\", func() {\n\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\toptions.LifecycleImage = \"some-lifecycle-image\"\n\t\t\t})\n\n\t\t\tit(\"runs the phase with the lifecycle image\", func() {\n\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().Image, \"some-lifecycle-image\")\n\t\t\t})\n\t\t})\n\n\t\tit(\"sets the CNB_USER_ID and CNB_GROUP_ID in the environment\", func() {\n\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_USER_ID=%d\", providedUID))\n\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_GROUP_ID=%d\", providedGID))\n\t\t})\n\n\t\tit(\"creates a phase and then runs it\", func() {\n\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\t\t})\n\n\t\tit(\"configures the phase with root access\", func() {\n\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t})\n\n\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t\t[]string{\"-cache-dir\", \"/cache\"},\n\t\t\t)\n\t\t})\n\n\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, 
container.NetworkMode(providedNetworkMode))\n\t\t})\n\n\t\tit(\"configures the phase with binds\", func() {\n\t\t\texpectedBind := \"some-cache:/cache\"\n\n\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBind)\n\t\t})\n\n\t\twhen(\"there are extensions\", func() {\n\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\t\t\tprovidedOrderExt = dist.Order{dist.OrderEntry{Group: []dist.ModuleRef{ /* don't care */ }}}\n\n\t\t\twhen(\"for build\", func() {\n\t\t\t\textensionsForBuild = true\n\n\t\t\t\tit(\"configures the phase with registry access\", func() {\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_REGISTRY_AUTH={}\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"for run\", func() {\n\t\t\t\textensionsForRun = true\n\n\t\t\t\tit(\"configures the phase with registry access\", func() {\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_REGISTRY_AUTH={}\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"using cache image\", func() {\n\t\t\tfakeBuildCache = newFakeImageCache()\n\n\t\t\tit(\"configures the phase with a cache image\", func() {\n\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \":/cache\")\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-cache-image\", \"some-cache-image\"},\n\t\t\t\t)\n\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-cache-dir\")\n\t\t\t})\n\n\t\t\tit(\"configures the phase with registry access\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_REGISTRY_AUTH={}\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"override GID\", func() {\n\t\t\twhen(\"override GID is provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.GID = 2\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() 
{\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-gid\", \"2\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"override GID is not provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.GID = -1\n\t\t\t\t})\n\n\t\t\t\tit(\"gid is not added to the expected arguments\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-gid\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"override UID\", func() {\n\t\t\twhen(\"override UID is provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.UID = 1001\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-uid\", \"1001\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"override UID is not provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.UID = -1\n\t\t\t\t})\n\n\t\t\t\tit(\"uid is not added to the expected arguments\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-uid\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--clear-cache\", func() {\n\t\t\tprovidedClearCache = true\n\n\t\t\tit(\"provides -skip-layers\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Cmd, \"-skip-layers\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"extensions\", func() {\n\t\t\tprovidedOrderExt = dist.Order{dist.OrderEntry{Group: []dist.ModuleRef{ /* don't care */ }}}\n\n\t\t\twhen(\"for build\", func() {\n\t\t\t\twhen(\"present in <layers>/generated/build\", func() {\n\t\t\t\t\textensionsForBuild = true\n\n\t\t\t\t\twhen(\"platform < 0.10\", func() {\n\t\t\t\t\t\tplatformAPI = 
api.MustParse(\"0.9\")\n\n\t\t\t\t\t\tit(\"does not provide -build-image or /kaniko bind\", func() {\n\t\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-build-image\")\n\t\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \"some-kaniko-cache:/kaniko\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"platform >= 0.10\", func() {\n\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.10\")\n\n\t\t\t\t\t\tit(\"provides -build-image and /kaniko bind\", func() {\n\t\t\t\t\t\t\th.AssertSliceContainsInOrder(t, configProvider.ContainerConfig().Cmd, \"-build-image\", providedBuilderImage)\n\t\t\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_REGISTRY_AUTH={}\")\n\t\t\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, \"some-kaniko-cache:/kaniko\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"not present in <layers>/generated/build\", func() {\n\t\t\t\t\tplatformAPI = api.MustParse(\"0.10\")\n\n\t\t\t\t\tit(\"does not provide -build-image or /kaniko bind\", func() {\n\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-build-image\")\n\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \"some-kaniko-cache:/kaniko\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"for run\", func() {\n\t\t\t\twhen(\"analyzed.toml extend\", func() {\n\t\t\t\t\twhen(\"true\", func() {\n\t\t\t\t\t\textensionsForRun = true\n\n\t\t\t\t\t\twhen(\"platform >= 0.12\", func() {\n\t\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\n\t\t\t\t\t\t\tit(\"provides /kaniko bind\", func() {\n\t\t\t\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, \"some-kaniko-cache:/kaniko\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"platform < 0.12\", func() {\n\t\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.11\")\n\n\t\t\t\t\t\t\tit(\"does not provide /kaniko bind\", func() {\n\t\t\t\t\t\t\t\th.AssertSliceNotContains(t, 
configProvider.HostConfig().Binds, \"some-kaniko-cache:/kaniko\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"false\", func() {\n\t\t\t\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\n\t\t\t\t\t\tit(\"does not provide /kaniko bind\", func() {\n\t\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \"some-kaniko-cache:/kaniko\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"publish is false\", func() {\n\t\t\twhen(\"platform >= 0.12\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\n\t\t\t\tit(\"configures the phase with daemon access\", func() {\n\t\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, \"/var/run/docker.sock:/var/run/docker.sock\")\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-daemon\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"layout is true\", func() {\n\t\t\twhen(\"platform >= 0.12\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\t\t\t\tprovidedLayout = true\n\n\t\t\t\tit(\"it configures the phase with access to provided volumes\", func() {\n\t\t\t\t\t// this is required to read the /layout-repo\n\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, providedVolumes...)\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected environment variables\", func() {\n\t\t\t\t\tlayoutDir := filepath.Join(paths.RootDir, \"layout-repo\")\n\t\t\t\t\th.AssertSliceContains(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Env, \"CNB_USE_LAYOUT=true\", fmt.Sprintf(\"CNB_LAYOUT_DIR=%s\", layoutDir),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Build\", func() {\n\t\tit.Before(func() {\n\t\t\terr := lifecycle.Build(context.Background(), 
fakePhaseFactory)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\th.AssertEq(t, configProvider.Name(), \"builder\")\n\t\t})\n\n\t\tit(\"creates a phase and then runs it\", func() {\n\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\t\t})\n\n\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t)\n\t\t})\n\n\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t})\n\n\t\tit(\"configures the phase with binds\", func() {\n\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, providedVolumes...)\n\t\t})\n\t})\n\n\twhen(\"#ExtendBuild\", func() {\n\t\tvar experimental bool\n\t\tit.Before(func() {\n\t\t\texperimental = true\n\t\t\terr := lifecycle.ExtendBuild(context.Background(), fakeKanikoCache, fakePhaseFactory, experimental)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\th.AssertEq(t, configProvider.Name(), \"extender\")\n\t\t})\n\n\t\tit(\"creates a phase and then runs it\", func() {\n\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\t\t})\n\n\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\th.AssertSliceContainsInOrder(t, configProvider.ContainerConfig().Cmd, \"-log-level\", \"debug\")\n\t\t\th.AssertSliceContainsInOrder(t, configProvider.ContainerConfig().Cmd, \"-app\", 
\"/workspace\")\n\t\t})\n\n\t\tit(\"configures the phase with binds\", func() {\n\t\t\texpectedBinds := providedVolumes\n\t\t\texpectedBinds = append(expectedBinds, \"some-kaniko-cache:/kaniko\")\n\n\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\t\t})\n\n\t\tit(\"sets CNB_EXPERIMENTAL_MODE=warn in the environment\", func() {\n\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_EXPERIMENTAL_MODE=warn\")\n\t\t})\n\n\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t})\n\n\t\tit(\"configures the phase with root\", func() {\n\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t})\n\n\t\twhen(\"experimental is false\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\texperimental = false\n\t\t\t\terr := lifecycle.ExtendBuild(context.Background(), fakeKanikoCache, fakePhaseFactory, experimental)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\t\th.AssertEq(t, configProvider.Name(), \"extender\")\n\t\t\t})\n\n\t\t\tit(\"CNB_EXPERIMENTAL_MODE=warn is not enable in the environment\", func() {\n\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Env, \"CNB_EXPERIMENTAL_MODE=warn\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#ExtendRun\", func() {\n\t\tvar experimental bool\n\t\tit.Before(func() {\n\t\t\texperimental = true\n\t\t\terr := lifecycle.ExtendRun(context.Background(), fakeKanikoCache, fakePhaseFactory, \"some-run-image\", experimental)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\tconfigProvider = 
fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\th.AssertEq(t, configProvider.Name(), \"extender\")\n\t\t})\n\n\t\tit(\"creates a phase and then runs it\", func() {\n\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\t\t})\n\n\t\tit(\"runs the phase with the run image\", func() {\n\t\t\th.AssertEq(t, configProvider.ContainerConfig().Image, \"some-run-image\")\n\t\t})\n\n\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\th.AssertSliceContainsInOrder(t, configProvider.ContainerConfig().Entrypoint, \"\") // the run image may have an entrypoint configured, override it\n\t\t\th.AssertSliceContainsInOrder(t, configProvider.ContainerConfig().Cmd, \"-log-level\", \"debug\")\n\t\t\th.AssertSliceContainsInOrder(t, configProvider.ContainerConfig().Cmd, \"-app\", \"/workspace\")\n\t\t\th.AssertSliceContainsInOrder(t, configProvider.ContainerConfig().Cmd, \"-kind\", \"run\")\n\t\t})\n\n\t\tit(\"configures the phase with binds\", func() {\n\t\t\texpectedBinds := providedVolumes\n\t\t\texpectedBinds = append(expectedBinds, \"some-kaniko-cache:/kaniko\")\n\n\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\t\t})\n\n\t\tit(\"sets CNB_EXPERIMENTAL_MODE=warn in the environment\", func() {\n\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_EXPERIMENTAL_MODE=warn\")\n\t\t})\n\n\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t})\n\n\t\tit(\"configures the phase with root\", func() {\n\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t})\n\n\t\twhen(\"experimental is false\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\texperimental = false\n\t\t\t\terr := lifecycle.ExtendRun(context.Background(), fakeKanikoCache, fakePhaseFactory, \"some-run-image\", 
experimental)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\t\th.AssertEq(t, configProvider.Name(), \"extender\")\n\t\t\t})\n\n\t\t\tit(\"CNB_EXPERIMENTAL_MODE=warn is not enable in the environment\", func() {\n\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Env, \"CNB_EXPERIMENTAL_MODE=warn\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Export\", func() {\n\t\tit.Before(func() {\n\t\t\terr := lifecycle.Export(context.Background(), fakeBuildCache, fakeLaunchCache, fakeKanikoCache, fakePhaseFactory)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlastCallIndex := len(fakePhaseFactory.NewCalledWithProvider) - 1\n\t\t\th.AssertNotEq(t, lastCallIndex, -1)\n\n\t\t\tconfigProvider = fakePhaseFactory.NewCalledWithProvider[lastCallIndex]\n\t\t\th.AssertEq(t, configProvider.Name(), \"exporter\")\n\t\t})\n\n\t\tit(\"creates a phase and then runs it\", func() {\n\t\t\th.AssertEq(t, fakePhase.CleanupCallCount, 1)\n\t\t\th.AssertEq(t, fakePhase.RunCallCount, 1)\n\t\t})\n\n\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t\t[]string{\"-cache-dir\", \"/cache\"},\n\t\t\t\t[]string{\"-run-image\", providedRunImage},\n\t\t\t\t[]string{\"-stack\", \"/layers/stack.toml\"},\n\t\t\t\t[]string{providedTargetImage},\n\t\t\t)\n\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-run\")\n\t\t})\n\n\t\twhen(\"platform >= 0.12\", func() {\n\t\t\tplatformAPI = api.MustParse(\"0.12\")\n\n\t\t\tit(\"provides -run instead of -stack\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-run\", 
\"/layers/run.toml\"},\n\t\t\t\t)\n\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-stack\")\n\t\t\t})\n\n\t\t\twhen(\"there are extensions\", func() {\n\t\t\t\tprovidedOrderExt = dist.Order{dist.OrderEntry{Group: []dist.ModuleRef{ /* don't care */ }}}\n\n\t\t\t\twhen(\"for run\", func() {\n\t\t\t\t\textensionsForRun = true\n\n\t\t\t\t\tit(\"sets CNB_EXPERIMENTAL_MODE=warn in the environment\", func() {\n\t\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"CNB_EXPERIMENTAL_MODE=warn\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"configures the phase with binds\", func() {\n\t\t\t\t\t\texpectedBinds := []string{\"some-cache:/cache\", \"some-launch-cache:/launch-cache\", \"some-kaniko-cache:/kaniko\"}\n\n\t\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"layout is true\", func() {\n\t\t\t\tprovidedLayout = true\n\n\t\t\t\tit(\"it configures the phase with access to provided volumes\", func() {\n\t\t\t\t\t// this is required to read the /layout-repo\n\t\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, providedVolumes...)\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected environment variables\", func() {\n\t\t\t\t\tlayoutDir := filepath.Join(paths.RootDir, \"layout-repo\")\n\t\t\t\t\th.AssertSliceContains(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Env, \"CNB_USE_LAYOUT=true\", fmt.Sprintf(\"CNB_LAYOUT_DIR=%s\", layoutDir),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"additional tags are specified\", func() {\n\t\t\tit(\"passes tag arguments to the exporter\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t\t\t[]string{\"-cache-dir\", \"/cache\"},\n\t\t\t\t\t[]string{\"-run-image\", providedRunImage},\n\t\t\t\t\t[]string{providedTargetImage, providedAdditionalTags[0], 
providedAdditionalTags[1]},\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"platform >= 0.7\", func() {\n\t\t\tplatformAPI = api.MustParse(\"0.7\")\n\n\t\t\tit(\"doesn't hint at default process type\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-log-level\", \"debug\"},\n\t\t\t\t\t[]string{\"-cache-dir\", \"/cache\"},\n\t\t\t\t\t[]string{providedTargetImage, providedAdditionalTags[0], providedAdditionalTags[1]},\n\t\t\t\t)\n\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-run-image\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"using cache image\", func() {\n\t\t\tfakeBuildCache = newFakeImageCache()\n\n\t\t\tit(\"configures phase with cache image\", func() {\n\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \":/cache\")\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-cache-image\", \"some-cache-image\"},\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"publish\", func() {\n\t\t\tprovidedPublish = true\n\n\t\t\twhen(\"lifecycle image\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.LifecycleImage = \"some-lifecycle-image\"\n\t\t\t\t})\n\n\t\t\t\tit(\"runs the phase with the lifecycle image\", func() {\n\t\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().Image, \"some-lifecycle-image\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"sets the CNB_USER_ID and CNB_GROUP_ID in the environment\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_USER_ID=%d\", providedUID))\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_GROUP_ID=%d\", providedGID))\n\t\t\t})\n\n\t\t\tit(\"configures the phase with registry access\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, 
\"CNB_REGISTRY_AUTH={}\")\n\t\t\t})\n\n\t\t\tit(\"configures the phase with root\", func() {\n\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().User, \"root\")\n\t\t\t})\n\n\t\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t\t})\n\n\t\t\tit(\"configures the phase with binds\", func() {\n\t\t\t\texpectedBind := \"some-cache:/cache\"\n\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBind)\n\t\t\t})\n\n\t\t\tit(\"configures the phase to write stack toml\", func() {\n\t\t\t\texpectedBinds := []string{\"some-cache:/cache\"}\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\n\t\t\t\th.AssertEq(t, len(configProvider.ContainerOps()), 3)\n\t\t\t\th.AssertFunctionName(t, configProvider.ContainerOps()[0], \"WriteStackToml\")\n\t\t\t\th.AssertFunctionName(t, configProvider.ContainerOps()[1], \"WriteRunToml\")\n\t\t\t\th.AssertFunctionName(t, configProvider.ContainerOps()[2], \"WriteProjectMetadata\")\n\t\t\t})\n\n\t\t\twhen(\"default process type\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.DefaultProcessType = \"test-process\"\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with default process type\", func() {\n\t\t\t\t\texpectedDefaultProc := []string{\"-process-type\", \"test-process\"}\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t, configProvider.ContainerConfig().Cmd, expectedDefaultProc)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"using cache image and publishing\", func() {\n\t\t\t\tfakeBuildCache = newFakeImageCache()\n\n\t\t\t\tit(\"configures phase with cache image\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, 
\":/cache\")\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-cache-image\", \"some-cache-image\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform 0.3\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.3\")\n\n\t\t\t\tit(\"doesn't hint at default process type\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-process-type\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform 0.4\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.4\")\n\n\t\t\t\tit(\"hints at default process type\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t, configProvider.ContainerConfig().Cmd, []string{\"-process-type\", \"web\"})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform >= 0.6\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.6\")\n\n\t\t\t\twhen(\"no user provided process type is present\", func() {\n\t\t\t\t\tit(\"doesn't provide 'web' as default process type\", func() {\n\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-process-type\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"publish is false\", func() {\n\t\t\twhen(\"lifecycle image\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.LifecycleImage = \"some-lifecycle-image\"\n\t\t\t\t})\n\n\t\t\t\tit(\"runs the phase with the lifecycle image\", func() {\n\t\t\t\t\th.AssertEq(t, configProvider.ContainerConfig().Image, \"some-lifecycle-image\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"sets the CNB_USER_ID and CNB_GROUP_ID in the environment\", func() {\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_USER_ID=%d\", providedUID))\n\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, fmt.Sprintf(\"CNB_GROUP_ID=%d\", providedGID))\n\t\t\t})\n\n\t\t\tit(\"configures the phase with daemon access\", func() {\n\t\t\t\th.AssertEq(t, 
configProvider.ContainerConfig().User, \"root\")\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, \"/var/run/docker.sock:/var/run/docker.sock\")\n\t\t\t})\n\n\t\t\twhen(\"tcp docker-host\", func() {\n\t\t\t\tprovidedDockerHost = `tcp://localhost:1234`\n\n\t\t\t\tit(\"configures the phase with daemon access with tcp docker-host\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.HostConfig().Binds, \"/var/run/docker.sock:/var/run/docker.sock\")\n\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"DOCKER_HOST=tcp://localhost:1234\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t[]string{\"-daemon\"},\n\t\t\t\t\t[]string{\"-launch-cache\", \"/launch-cache\"},\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"configures the phase with the expected network mode\", func() {\n\t\t\t\th.AssertEq(t, configProvider.HostConfig().NetworkMode, container.NetworkMode(providedNetworkMode))\n\t\t\t})\n\n\t\t\tit(\"configures the phase with binds\", func() {\n\t\t\t\texpectedBinds := []string{\"some-cache:/cache\", \"some-launch-cache:/launch-cache\"}\n\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\t\t\t})\n\n\t\t\tit(\"configures the phase to write stack toml\", func() {\n\t\t\t\texpectedBinds := []string{\"some-cache:/cache\", \"some-launch-cache:/launch-cache\"}\n\t\t\t\th.AssertSliceContains(t, configProvider.HostConfig().Binds, expectedBinds...)\n\n\t\t\t\th.AssertEq(t, len(configProvider.ContainerOps()), 3)\n\t\t\t\th.AssertFunctionName(t, configProvider.ContainerOps()[0], \"WriteStackToml\")\n\t\t\t\th.AssertFunctionName(t, configProvider.ContainerOps()[1], \"WriteRunToml\")\n\t\t\t\th.AssertFunctionName(t, configProvider.ContainerOps()[2], \"WriteProjectMetadata\")\n\t\t\t})\n\n\t\t\twhen(\"default process type\", func() 
{\n\t\t\t\twhen(\"provided\", func() {\n\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\t\toptions.DefaultProcessType = \"test-process\"\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"configures the phase with default process type\", func() {\n\t\t\t\t\t\texpectedDefaultProc := []string{\"-process-type\", \"test-process\"}\n\t\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t, configProvider.ContainerConfig().Cmd, expectedDefaultProc)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform 0.3\", func() {\n\t\t\t\t\tplatformAPI = api.MustParse(\"0.3\")\n\n\t\t\t\t\tit(\"doesn't hint at default process type\", func() {\n\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-process-type\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform 0.4\", func() {\n\t\t\t\t\tplatformAPI = api.MustParse(\"0.4\")\n\n\t\t\t\t\tit(\"hints at default process type\", func() {\n\t\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t, configProvider.ContainerConfig().Cmd, []string{\"-process-type\", \"web\"})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform >= 0.6\", func() {\n\t\t\t\t\tplatformAPI = api.MustParse(\"0.6\")\n\n\t\t\t\t\twhen(\"no user provided process type is present\", func() {\n\t\t\t\t\t\tit(\"doesn't provide 'web' as default process type\", func() {\n\t\t\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-process-type\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"override GID\", func() {\n\t\t\twhen(\"override GID is provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.GID = 2\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-gid\", \"2\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"override GID is not provided\", 
func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.GID = -1\n\t\t\t\t})\n\n\t\t\t\tit(\"gid is not added to the expected arguments\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-gid\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"override UID\", func() {\n\t\t\twhen(\"override UID is provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.UID = 1001\n\t\t\t\t})\n\n\t\t\t\tit(\"configures the phase with the expected arguments\", func() {\n\t\t\t\t\th.AssertIncludeAllExpectedPatterns(t,\n\t\t\t\t\t\tconfigProvider.ContainerConfig().Cmd,\n\t\t\t\t\t\t[]string{\"-uid\", \"1001\"},\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"override UID is not provided\", func() {\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(options *build.LifecycleOptions) {\n\t\t\t\t\toptions.UID = -1\n\t\t\t\t})\n\n\t\t\t\tit(\"uid is not added to the expected arguments\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Cmd, \"-uid\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"interactive mode\", func() {\n\t\t\tlifecycleOps = append(lifecycleOps, func(opts *build.LifecycleOptions) {\n\t\t\t\topts.Interactive = true\n\t\t\t\topts.Termui = &fakes.FakeTermui{ReadLayersFunc: func(_ io.ReadCloser) {\n\t\t\t\t\t// no-op\n\t\t\t\t}}\n\t\t\t})\n\n\t\t\tit(\"provides the termui readLayersFunc as a post container operation\", func() {\n\t\t\t\th.AssertEq(t, len(configProvider.PostContainerRunOps()), 2)\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[0], \"EnsureVolumeAccess\")\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[1], \"CopyOut\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"sbom destination directory is provided\", func() {\n\t\t\tlifecycleOps = append(lifecycleOps, func(opts *build.LifecycleOptions) {\n\t\t\t\topts.SBOMDestinationDir = 
\"some-destination-dir\"\n\t\t\t})\n\n\t\t\tit(\"provides copy-sbom-func as a post container operation\", func() {\n\t\t\t\th.AssertEq(t, len(configProvider.PostContainerRunOps()), 2)\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[0], \"EnsureVolumeAccess\")\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[1], \"CopyOut\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"report destination directory is provided\", func() {\n\t\t\tlifecycleOps = append(lifecycleOps, func(opts *build.LifecycleOptions) {\n\t\t\t\topts.ReportDestinationDir = \"a-destination-dir\"\n\t\t\t})\n\n\t\t\tit(\"provides copy-report-func as a post container operation\", func() {\n\t\t\t\th.AssertEq(t, len(configProvider.PostContainerRunOps()), 2)\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[0], \"EnsureVolumeAccess\")\n\t\t\t\th.AssertFunctionName(t, configProvider.PostContainerRunOps()[1], \"CopyOut\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--creation-time\", func() {\n\t\t\twhen(\"platform < 0.9\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.8\")\n\n\t\t\t\tintTime, _ := strconv.ParseInt(\"1234567890\", 10, 64)\n\t\t\t\tprovidedTime := time.Unix(intTime, 0).UTC()\n\n\t\t\t\tlifecycleOps = append(lifecycleOps, func(baseOpts *build.LifecycleOptions) {\n\t\t\t\t\tbaseOpts.CreationTime = &providedTime\n\t\t\t\t})\n\n\t\t\t\tit(\"is ignored\", func() {\n\t\t\t\t\th.AssertSliceNotContains(t, configProvider.ContainerConfig().Env, \"SOURCE_DATE_EPOCH=1234567890\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform >= 0.9\", func() {\n\t\t\t\tplatformAPI = api.MustParse(\"0.9\")\n\n\t\t\t\twhen(\"provided\", func() {\n\t\t\t\t\tintTime, _ := strconv.ParseInt(\"1234567890\", 10, 64)\n\t\t\t\t\tprovidedTime := time.Unix(intTime, 0).UTC()\n\n\t\t\t\t\tlifecycleOps = append(lifecycleOps, func(baseOpts *build.LifecycleOptions) {\n\t\t\t\t\t\tbaseOpts.CreationTime = &providedTime\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"configures the phase with env SOURCE_DATE_EPOCH\", 
func() {\n\t\t\t\t\t\th.AssertSliceContains(t, configProvider.ContainerConfig().Env, \"SOURCE_DATE_EPOCH=1234567890\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"not provided\", func() {\n\t\t\t\t\tit(\"does not panic\", func() {\n\t\t\t\t\t\t// no-op\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc newFakeVolumeCache() *fakes.FakeCache {\n\tc := fakes.NewFakeCache()\n\tc.ReturnForType = cache.Volume\n\tc.ReturnForName = \"some-cache\"\n\treturn c\n}\n\nfunc newFakeImageCache() *fakes.FakeCache {\n\tc := fakes.NewFakeCache()\n\tc.ReturnForType = cache.Image\n\tc.ReturnForName = \"some-cache-image\"\n\treturn c\n}\n\nfunc newFakeFetchRunImageFunc(f *fakeImageFetcher) func(name string) (string, error) {\n\treturn func(name string) (string, error) {\n\t\treturn fmt.Sprintf(\"ephemeral-%s\", name), f.fetchRunImage(name)\n\t}\n}\n\ntype fakeImageFetcher struct {\n\tcallCount           int\n\tcalledWithArgAtCall map[int]string\n}\n\nfunc (f *fakeImageFetcher) fetchRunImage(name string) error {\n\tf.calledWithArgAtCall[f.callCount] = name\n\tf.callCount++\n\treturn nil\n}\n\ntype fakeDockerClient struct {\n\tnNetworks int\n\tbuild.DockerClient\n}\n\nfunc (f *fakeDockerClient) NetworkList(ctx context.Context, opts client.NetworkListOptions) (client.NetworkListResult, error) {\n\tret := make([]network.Summary, f.nNetworks)\n\treturn client.NetworkListResult{Items: ret}, nil\n}\n\nfunc (f *fakeDockerClient) NetworkCreate(ctx context.Context, name string, options client.NetworkCreateOptions) (client.NetworkCreateResult, error) {\n\tf.nNetworks++\n\treturn client.NetworkCreateResult{}, nil\n}\n\nfunc (f *fakeDockerClient) NetworkRemove(ctx context.Context, network string, options client.NetworkRemoveOptions) (client.NetworkRemoveResult, error) {\n\tf.nNetworks--\n\treturn client.NetworkRemoveResult{}, nil\n}\n\nfunc newTestLifecycleExecErr(t *testing.T, logVerbose bool, tmpDir string, ops ...func(*build.LifecycleOptions)) (*build.LifecycleExecution, error) 
{\n\tdocker, err := client.New(client.FromEnv)\n\th.AssertNil(t, err)\n\n\tvar outBuf bytes.Buffer\n\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\tif logVerbose {\n\t\tlogger.Level = log.DebugLevel\n\t}\n\n\tdefaultBuilder, err := fakes.NewFakeBuilder()\n\th.AssertNil(t, err)\n\n\topts := build.LifecycleOptions{\n\t\tAppPath:    \"some-app-path\",\n\t\tBuilder:    defaultBuilder,\n\t\tHTTPProxy:  \"some-http-proxy\",\n\t\tHTTPSProxy: \"some-https-proxy\",\n\t\tNoProxy:    \"some-no-proxy\",\n\t\tTermui:     &fakes.FakeTermui{},\n\t}\n\n\tfor _, op := range ops {\n\t\top(&opts)\n\t}\n\n\treturn build.NewLifecycleExecution(logger, docker, tmpDir, opts)\n}\n\nfunc newTestLifecycleExec(t *testing.T, logVerbose bool, tmpDir string, ops ...func(*build.LifecycleOptions)) *build.LifecycleExecution {\n\tt.Helper()\n\n\tlifecycleExec, err := newTestLifecycleExecErr(t, logVerbose, tmpDir, ops...)\n\th.AssertNil(t, err)\n\treturn lifecycleExec\n}\n"
  },
  {
    "path": "internal/build/lifecycle_executor.go",
    "content": "package build\n\nimport (\n\t\"context\"\n\t\"io\"\n\t\"os\"\n\t\"time\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/container\"\n\t\"github.com/buildpacks/pack/pkg/cache\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nvar (\n\t// SupportedPlatformAPIVersions lists the Platform API versions pack supports, from earliest to latest\n\tSupportedPlatformAPIVersions = builder.APISet{\n\t\tapi.MustParse(\"0.3\"),\n\t\tapi.MustParse(\"0.4\"),\n\t\tapi.MustParse(\"0.5\"),\n\t\tapi.MustParse(\"0.6\"),\n\t\tapi.MustParse(\"0.7\"),\n\t\tapi.MustParse(\"0.8\"),\n\t\tapi.MustParse(\"0.9\"),\n\t\tapi.MustParse(\"0.10\"),\n\t\tapi.MustParse(\"0.11\"),\n\t\tapi.MustParse(\"0.12\"),\n\t\tapi.MustParse(\"0.13\"),\n\t\tapi.MustParse(\"0.14\"),\n\t\tapi.MustParse(\"0.15\"),\n\t}\n)\n\ntype Builder interface {\n\tName() string\n\tUID() int\n\tGID() int\n\tLifecycleDescriptor() builder.LifecycleDescriptor\n\tStack() builder.StackMetadata\n\tRunImages() []builder.RunImageMetadata\n\tImage() imgutil.Image\n\tOrderExtensions() dist.Order\n\tSystem() dist.System\n}\n\ntype LifecycleExecutor struct {\n\tlogger logging.Logger\n\tdocker DockerClient\n}\n\ntype Cache interface {\n\tName() string\n\tClear(context.Context) error\n\tType() cache.Type\n}\n\ntype Termui interface {\n\tlogging.Logger\n\n\tRun(funk func()) error\n\tHandler() container.Handler\n\tReadLayers(reader io.ReadCloser) error\n}\n\ntype LifecycleOptions struct {\n\tAppPath                         string\n\tImage                           name.Reference\n\tBuilder                         Builder\n\tBuilderImage                    string // differs from Builder.Name() 
and Builder.Image().Name() in that it includes the registry context\n\tLifecycleImage                  string\n\tLifecycleApis                   []string // optional - populated only if custom lifecycle image is downloaded, from that lifecycle image's labels.\n\tRunImage                        string\n\tFetchRunImageWithLifecycleLayer func(name string) (string, error)\n\tProjectMetadata                 files.ProjectMetadata\n\tClearCache                      bool\n\tPublish                         bool\n\tTrustBuilder                    bool\n\tUseCreator                      bool\n\tUseCreatorWithExtensions        bool\n\tInteractive                     bool\n\tLayout                          bool\n\tTermui                          Termui\n\tDockerHost                      string\n\tCache                           cache.CacheOpts\n\tExecutionEnvironment            string\n\tCacheImage                      string\n\tHTTPProxy                       string\n\tHTTPSProxy                      string\n\tNoProxy                         string\n\tNetwork                         string\n\tAdditionalTags                  []string\n\tVolumes                         []string\n\tInsecureRegistries              []string\n\tDefaultProcessType              string\n\tFileFilter                      func(string) bool\n\tWorkspace                       string\n\tGID                             int\n\tUID                             int\n\tPreviousImage                   string\n\tReportDestinationDir            string\n\tSBOMDestinationDir              string\n\tCreationTime                    *time.Time\n\tKeychain                        authn.Keychain\n\tEnableUsernsHost                bool\n}\n\nfunc NewLifecycleExecutor(logger logging.Logger, docker DockerClient) *LifecycleExecutor {\n\treturn &LifecycleExecutor{logger: logger, docker: docker}\n}\n\nfunc (l *LifecycleExecutor) Execute(ctx context.Context, opts LifecycleOptions) error {\n\ttmpDir, err := os.MkdirTemp(\"\", 
\"pack.tmp\")\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tlifecycleExec, err := NewLifecycleExecution(l.logger, l.docker, tmpDir, opts)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif !opts.Interactive {\n\t\tdefer lifecycleExec.Cleanup()\n\t\treturn lifecycleExec.Run(ctx, NewDefaultPhaseFactory)\n\t}\n\n\treturn opts.Termui.Run(func() {\n\t\tdefer lifecycleExec.Cleanup()\n\t\tlifecycleExec.Run(ctx, NewDefaultPhaseFactory)\n\t})\n}\n"
  },
  {
    "path": "internal/build/mount_paths.go",
    "content": "package build\n\nimport \"strings\"\n\ntype mountPaths struct {\n\tvolume    string\n\tseparator string\n\tworkspace string\n}\n\nfunc mountPathsForOS(os, workspace string) mountPaths {\n\tif workspace == \"\" {\n\t\tworkspace = \"workspace\"\n\t}\n\tif os == \"windows\" {\n\t\treturn mountPaths{\n\t\t\tvolume:    `c:`,\n\t\t\tseparator: `\\`,\n\t\t\tworkspace: workspace,\n\t\t}\n\t}\n\treturn mountPaths{\n\t\tvolume:    \"\",\n\t\tseparator: \"/\",\n\t\tworkspace: workspace,\n\t}\n}\n\nfunc (m mountPaths) join(parts ...string) string {\n\treturn strings.Join(parts, m.separator)\n}\n\nfunc (m mountPaths) layersDir() string {\n\treturn m.join(m.volume, \"layers\")\n}\n\nfunc (m mountPaths) stackPath() string {\n\treturn m.join(m.layersDir(), \"stack.toml\")\n}\n\nfunc (m mountPaths) runPath() string {\n\treturn m.join(m.layersDir(), \"run.toml\")\n}\n\nfunc (m mountPaths) projectPath() string {\n\treturn m.join(m.layersDir(), \"project-metadata.toml\")\n}\n\nfunc (m mountPaths) reportPath() string {\n\treturn m.join(m.layersDir(), \"report.toml\")\n}\n\nfunc (m mountPaths) appDirName() string {\n\treturn m.workspace\n}\n\nfunc (m mountPaths) appDir() string {\n\treturn m.join(m.volume, m.appDirName())\n}\n\nfunc (m mountPaths) cacheDir() string {\n\treturn m.join(m.volume, \"cache\")\n}\n\nfunc (m mountPaths) kanikoCacheDir() string {\n\treturn m.join(m.volume, \"kaniko\")\n}\n\nfunc (m mountPaths) launchCacheDir() string {\n\treturn m.join(m.volume, \"launch-cache\")\n}\n\nfunc (m mountPaths) sbomDir() string {\n\treturn m.join(m.volume, \"layers\", \"sbom\")\n}\n"
  },
  {
    "path": "internal/build/phase.go",
    "content": "package build\n\nimport (\n\t\"context\"\n\t\"io\"\n\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/container\"\n)\n\ntype Phase struct {\n\tname                string\n\tinfoWriter          io.Writer\n\terrorWriter         io.Writer\n\tdocker              DockerClient\n\thandler             container.Handler\n\tctrConf             *dcontainer.Config\n\thostConf            *dcontainer.HostConfig\n\tctr                 client.ContainerCreateResult\n\tuid, gid            int\n\tappPath             string\n\tcontainerOps        []ContainerOperation\n\tpostContainerRunOps []ContainerOperation\n\tfileFilter          func(string) bool\n}\n\nfunc (p *Phase) Run(ctx context.Context) error {\n\tvar err error\n\tp.ctr, err = p.docker.ContainerCreate(ctx, client.ContainerCreateOptions{\n\t\tConfig:     p.ctrConf,\n\t\tHostConfig: p.hostConf,\n\t})\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"failed to create '%s' container\", p.name)\n\t}\n\n\tfor _, containerOp := range p.containerOps {\n\t\tif err := containerOp(p.docker, ctx, p.ctr.ID, p.infoWriter, p.errorWriter); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\thandler := container.DefaultHandler(p.infoWriter, p.errorWriter)\n\tif p.handler != nil {\n\t\thandler = p.handler\n\t}\n\n\terr = container.RunWithHandler(\n\t\tctx,\n\t\tp.docker,\n\t\tp.ctr.ID,\n\t\thandler)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tfor _, containerOp := range p.postContainerRunOps {\n\t\tif err := containerOp(p.docker, ctx, p.ctr.ID, p.infoWriter, p.errorWriter); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc (p *Phase) Cleanup() error {\n\t_, err := p.docker.ContainerRemove(context.Background(), p.ctr.ID, client.ContainerRemoveOptions{Force: true})\n\treturn err\n}\n"
  },
  {
    "path": "internal/build/phase_config_provider.go",
    "content": "package build\n\nimport (\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"strings\"\n\n\t\"github.com/moby/moby/api/types/container\"\n\n\tpcontainer \"github.com/buildpacks/pack/internal/container\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst (\n\tlinuxContainerAdmin   = \"root\"\n\twindowsContainerAdmin = \"ContainerAdministrator\"\n\tplatformAPIEnvVar     = \"CNB_PLATFORM_API\"\n\texecutionEnvVar       = \"CNB_EXEC_ENV\"\n)\n\ntype PhaseConfigProviderOperation func(*PhaseConfigProvider)\n\ntype PhaseConfigProvider struct {\n\tctrConf             *container.Config\n\thostConf            *container.HostConfig\n\tname                string\n\tos                  string\n\tcontainerOps        []ContainerOperation\n\tpostContainerRunOps []ContainerOperation\n\tinfoWriter          io.Writer\n\terrorWriter         io.Writer\n\thandler             pcontainer.Handler\n}\n\nfunc NewPhaseConfigProvider(name string, lifecycleExec *LifecycleExecution, ops ...PhaseConfigProviderOperation) *PhaseConfigProvider {\n\thostConf := new(container.HostConfig)\n\tif lifecycleExec.opts.EnableUsernsHost {\n\t\thostConf.UsernsMode = \"host\"\n\t}\n\tif lifecycleExec.os != \"windows\" {\n\t\thostConf.SecurityOpt = []string{\"no-new-privileges=true\"}\n\t}\n\tprovider := &PhaseConfigProvider{\n\t\tctrConf:     new(container.Config),\n\t\thostConf:    hostConf,\n\t\tname:        name,\n\t\tos:          lifecycleExec.os,\n\t\tinfoWriter:  logging.GetWriterForLevel(lifecycleExec.logger, logging.InfoLevel),\n\t\terrorWriter: logging.GetWriterForLevel(lifecycleExec.logger, logging.ErrorLevel),\n\t}\n\n\tprovider.ctrConf.Image = lifecycleExec.opts.Builder.Name()\n\tprovider.ctrConf.Labels = map[string]string{\"author\": \"pack\"}\n\n\tif lifecycleExec.os == \"windows\" {\n\t\tprovider.hostConf.Isolation = container.IsolationProcess\n\t}\n\n\tops = append(ops,\n\t\tWithEnv(fmt.Sprintf(\"%s=%s\", platformAPIEnvVar, 
lifecycleExec.platformAPI.String())),\n\t\tIf(lifecycleExec.platformAPI.AtLeast(\"0.15\"), WithEnv(fmt.Sprintf(\"%s=%s\", executionEnvVar, lifecycleExec.opts.ExecutionEnvironment))),\n\t\tWithLifecycleProxy(lifecycleExec),\n\t\tWithBinds([]string{\n\t\t\tfmt.Sprintf(\"%s:%s\", lifecycleExec.layersVolume, lifecycleExec.mountPaths.layersDir()),\n\t\t\tfmt.Sprintf(\"%s:%s\", lifecycleExec.appVolume, lifecycleExec.mountPaths.appDir()),\n\t\t}...),\n\t)\n\n\tfor _, op := range ops {\n\t\top(provider)\n\t}\n\n\tprovider.ctrConf.Entrypoint = []string{\"\"} // override entrypoint in case it is set\n\tprovider.ctrConf.Cmd = append([]string{\"/cnb/lifecycle/\" + name}, provider.ctrConf.Cmd...)\n\n\tlifecycleExec.logger.Debugf(\"Running the %s on OS %s from image %s with:\", style.Symbol(provider.Name()), style.Symbol(provider.os), style.Symbol(provider.ctrConf.Image))\n\tlifecycleExec.logger.Debug(\"Container Settings:\")\n\tlifecycleExec.logger.Debugf(\"  Args: %s\", style.Symbol(strings.Join(provider.ctrConf.Cmd, \" \")))\n\tlifecycleExec.logger.Debugf(\"  System Envs: %s\", style.Symbol(strings.Join(sanitized(provider.ctrConf.Env), \" \")))\n\tlifecycleExec.logger.Debugf(\"  Image: %s\", style.Symbol(provider.ctrConf.Image))\n\tlifecycleExec.logger.Debugf(\"  User: %s\", style.Symbol(provider.ctrConf.User))\n\tlifecycleExec.logger.Debugf(\"  Labels: %s\", style.Symbol(fmt.Sprintf(\"%s\", provider.ctrConf.Labels)))\n\n\tlifecycleExec.logger.Debug(\"Host Settings:\")\n\tlifecycleExec.logger.Debugf(\"  Binds: %s\", style.Symbol(strings.Join(provider.hostConf.Binds, \" \")))\n\tlifecycleExec.logger.Debugf(\"  Network Mode: %s\", style.Symbol(string(provider.hostConf.NetworkMode)))\n\n\tif lifecycleExec.opts.Interactive {\n\t\tprovider.handler = lifecycleExec.opts.Termui.Handler()\n\t}\n\n\treturn provider\n}\n\nfunc sanitized(origEnv []string) []string {\n\tvar sanitizedEnv []string\n\tfor _, env := range origEnv {\n\t\tif strings.HasPrefix(env, \"CNB_REGISTRY_AUTH\") 
{\n\t\t\tsanitizedEnv = append(sanitizedEnv, \"CNB_REGISTRY_AUTH=<redacted>\")\n\t\t\tcontinue\n\t\t}\n\t\tsanitizedEnv = append(sanitizedEnv, env)\n\t}\n\treturn sanitizedEnv\n}\n\nfunc (p *PhaseConfigProvider) ContainerConfig() *container.Config {\n\treturn p.ctrConf\n}\n\nfunc (p *PhaseConfigProvider) ContainerOps() []ContainerOperation {\n\treturn p.containerOps\n}\n\nfunc (p *PhaseConfigProvider) PostContainerRunOps() []ContainerOperation {\n\treturn p.postContainerRunOps\n}\n\nfunc (p *PhaseConfigProvider) HostConfig() *container.HostConfig {\n\treturn p.hostConf\n}\n\nfunc (p *PhaseConfigProvider) Handler() pcontainer.Handler {\n\treturn p.handler\n}\n\nfunc (p *PhaseConfigProvider) Name() string {\n\treturn p.name\n}\n\nfunc (p *PhaseConfigProvider) ErrorWriter() io.Writer {\n\treturn p.errorWriter\n}\n\nfunc (p *PhaseConfigProvider) InfoWriter() io.Writer {\n\treturn p.infoWriter\n}\n\nfunc NullOp() PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {}\n}\n\nfunc WithArgs(args ...string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.ctrConf.Cmd = append(provider.ctrConf.Cmd, args...)\n\t}\n}\n\n// WithFlags differs from WithArgs as flags are always prepended\nfunc WithFlags(flags ...string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.ctrConf.Cmd = append(flags, provider.ctrConf.Cmd...)\n\t}\n}\n\nfunc WithBinds(binds ...string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.hostConf.Binds = append(provider.hostConf.Binds, binds...)\n\t}\n}\n\nfunc WithDaemonAccess(dockerHost string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tWithRoot()(provider)\n\t\tif dockerHost == \"inherit\" {\n\t\t\tdockerHost = os.Getenv(\"DOCKER_HOST\")\n\t\t}\n\t\tvar bind string\n\t\tif dockerHost == \"\" {\n\t\t\tbind = \"/var/run/docker.sock:/var/run/docker.sock\"\n\t\t\tif 
provider.os == \"windows\" {\n\t\t\t\tbind = `\\\\.\\pipe\\docker_engine:\\\\.\\pipe\\docker_engine`\n\t\t\t}\n\t\t} else {\n\t\t\tswitch {\n\t\t\tcase strings.HasPrefix(dockerHost, \"unix://\"):\n\t\t\t\tbind = fmt.Sprintf(\"%s:/var/run/docker.sock\", strings.TrimPrefix(dockerHost, \"unix://\"))\n\t\t\tcase strings.HasPrefix(dockerHost, \"npipe://\") || strings.HasPrefix(dockerHost, `npipe:\\\\`):\n\t\t\t\tsub := ([]rune(dockerHost))[8:]\n\t\t\t\tbind = fmt.Sprintf(`%s:\\\\.\\pipe\\docker_engine`, string(sub))\n\t\t\tdefault:\n\t\t\t\tprovider.ctrConf.Env = append(provider.ctrConf.Env, fmt.Sprintf(`DOCKER_HOST=%s`, dockerHost))\n\t\t\t}\n\t\t}\n\t\tif bind != \"\" {\n\t\t\tprovider.hostConf.Binds = append(provider.hostConf.Binds, bind)\n\t\t}\n\t\tif provider.os != \"windows\" {\n\t\t\tprovider.hostConf.SecurityOpt = []string{\"label=disable\"}\n\t\t}\n\t}\n}\n\nfunc WithEnv(envs ...string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.ctrConf.Env = append(provider.ctrConf.Env, envs...)\n\t}\n}\n\nfunc WithImage(image string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.ctrConf.Image = image\n\t}\n}\n\n// WithLogPrefix sets a prefix for logs produced by this phase\nfunc WithLogPrefix(prefix string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tif prefix != \"\" {\n\t\t\tprovider.infoWriter = logging.NewPrefixWriter(provider.infoWriter, prefix)\n\t\t\tprovider.errorWriter = logging.NewPrefixWriter(provider.errorWriter, prefix)\n\t\t}\n\t}\n}\n\nfunc WithLifecycleProxy(lifecycleExec *LifecycleExecution) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tif lifecycleExec.opts.HTTPProxy != \"\" {\n\t\t\tprovider.ctrConf.Env = append(provider.ctrConf.Env, \"HTTP_PROXY=\"+lifecycleExec.opts.HTTPProxy, \"http_proxy=\"+lifecycleExec.opts.HTTPProxy)\n\t\t}\n\n\t\tif lifecycleExec.opts.HTTPSProxy != \"\" 
{\n\t\t\tprovider.ctrConf.Env = append(provider.ctrConf.Env, \"HTTPS_PROXY=\"+lifecycleExec.opts.HTTPSProxy, \"https_proxy=\"+lifecycleExec.opts.HTTPSProxy)\n\t\t}\n\n\t\tif lifecycleExec.opts.NoProxy != \"\" {\n\t\t\tprovider.ctrConf.Env = append(provider.ctrConf.Env, \"NO_PROXY=\"+lifecycleExec.opts.NoProxy, \"no_proxy=\"+lifecycleExec.opts.NoProxy)\n\t\t}\n\t}\n}\n\nfunc WithNetwork(networkMode string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.hostConf.NetworkMode = container.NetworkMode(networkMode)\n\t}\n}\n\nfunc WithRegistryAccess(authConfig string) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.ctrConf.Env = append(provider.ctrConf.Env, fmt.Sprintf(`CNB_REGISTRY_AUTH=%s`, authConfig))\n\t}\n}\n\nfunc WithRoot() PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tif provider.os == \"windows\" {\n\t\t\tprovider.ctrConf.User = windowsContainerAdmin\n\t\t} else {\n\t\t\tprovider.ctrConf.User = linuxContainerAdmin\n\t\t}\n\t}\n}\n\nfunc WithContainerOperations(operations ...ContainerOperation) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.containerOps = append(provider.containerOps, operations...)\n\t}\n}\n\nfunc WithPostContainerRunOperations(operations ...ContainerOperation) PhaseConfigProviderOperation {\n\treturn func(provider *PhaseConfigProvider) {\n\t\tprovider.postContainerRunOps = append(provider.postContainerRunOps, operations...)\n\t}\n}\n\nfunc If(expression bool, operation PhaseConfigProviderOperation) PhaseConfigProviderOperation {\n\tif expression {\n\t\treturn operation\n\t}\n\n\treturn NullOp()\n}\n"
  },
  {
    "path": "internal/build/phase_config_provider_test.go",
    "content": "package build_test\n\nimport (\n\t\"bytes\"\n\t\"io\"\n\t\"testing\"\n\n\tifakes \"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/build/fakes\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPhaseConfigProvider(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"phase_config_provider\", testPhaseConfigProvider, spec.Report(report.Terminal{}), spec.Sequential())\n}\n\nfunc testPhaseConfigProvider(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#NewPhaseConfigProvider\", func() {\n\t\tit(\"returns a phase config provider with defaults\", func() {\n\t\t\texpectedBuilderImage := ifakes.NewImage(\"some-builder-name\", \"\", nil)\n\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(fakes.WithImage(expectedBuilderImage))\n\t\t\th.AssertNil(t, err)\n\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\", fakes.WithBuilder(fakeBuilder))\n\t\t\texpectedPhaseName := \"some-name\"\n\t\t\texpectedCmd := []string{\"/cnb/lifecycle/\" + expectedPhaseName}\n\n\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(expectedPhaseName, lifecycle)\n\n\t\t\th.AssertEq(t, phaseConfigProvider.Name(), expectedPhaseName)\n\t\t\th.AssertEq(t, phaseConfigProvider.ContainerConfig().Cmd, expectedCmd)\n\t\t\th.AssertEq(t, phaseConfigProvider.ContainerConfig().Image, expectedBuilderImage.Name())\n\t\t\th.AssertEq(t, phaseConfigProvider.ContainerConfig().Labels, map[string]string{\"author\": \"pack\"})\n\n\t\t\t// NewFakeBuilder sets the Platform API\n\t\t\th.AssertSliceContains(t, phaseConfigProvider.ContainerConfig().Env, 
\"CNB_PLATFORM_API=0.4\")\n\n\t\t\t// CreateFakeLifecycleExecution sets the following:\n\t\t\th.AssertSliceContains(t, phaseConfigProvider.ContainerConfig().Env, \"HTTP_PROXY=some-http-proxy\")\n\t\t\th.AssertSliceContains(t, phaseConfigProvider.ContainerConfig().Env, \"http_proxy=some-http-proxy\")\n\t\t\th.AssertSliceContains(t, phaseConfigProvider.ContainerConfig().Env, \"HTTPS_PROXY=some-https-proxy\")\n\t\t\th.AssertSliceContains(t, phaseConfigProvider.ContainerConfig().Env, \"https_proxy=some-https-proxy\")\n\t\t\th.AssertSliceContains(t, phaseConfigProvider.ContainerConfig().Env, \"NO_PROXY=some-no-proxy\")\n\t\t\th.AssertSliceContains(t, phaseConfigProvider.ContainerConfig().Env, \"no_proxy=some-no-proxy\")\n\n\t\t\th.AssertSliceContainsMatch(t, phaseConfigProvider.HostConfig().Binds, \"pack-layers-.*:/layers\")\n\t\t\th.AssertSliceContainsMatch(t, phaseConfigProvider.HostConfig().Binds, \"pack-app-.*:/workspace\")\n\n\t\t\th.AssertEq(t, phaseConfigProvider.HostConfig().Isolation, dcontainer.IsolationEmpty)\n\t\t\th.AssertEq(t, phaseConfigProvider.HostConfig().UsernsMode, dcontainer.UsernsMode(\"\"))\n\t\t\th.AssertSliceContains(t, phaseConfigProvider.HostConfig().SecurityOpt, \"no-new-privileges=true\")\n\t\t})\n\n\t\twhen(\"userns-host is enabled\", func() {\n\t\t\tit(\"sets user namespace mode to host\", func() {\n\t\t\t\texpectedBuilderImage := ifakes.NewImage(\"some-builder-name\", \"\", nil)\n\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(fakes.WithImage(expectedBuilderImage))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\", fakes.WithBuilder(fakeBuilder), fakes.WithEnableUsernsHost())\n\t\t\t\texpectedPhaseName := \"some-name\"\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(expectedPhaseName, lifecycle)\n\n\t\t\t\th.AssertEq(t, phaseConfigProvider.HostConfig().UsernsMode, dcontainer.UsernsMode(\"host\"))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"building for Windows\", func() 
{\n\t\t\tit(\"sets process isolation\", func() {\n\t\t\t\tfakeBuilderImage := ifakes.NewImage(\"fake-builder\", \"\", nil)\n\t\t\t\th.AssertNil(t, fakeBuilderImage.SetOS(\"windows\"))\n\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(fakes.WithImage(fakeBuilderImage))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\", fakes.WithBuilder(fakeBuilder))\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\"some-name\", lifecycle)\n\n\t\t\t\th.AssertEq(t, phaseConfigProvider.HostConfig().Isolation, dcontainer.IsolationProcess)\n\t\t\t\th.AssertSliceNotContains(t, phaseConfigProvider.HostConfig().SecurityOpt, \"no-new-privileges=true\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"building with interactive mode\", func() {\n\t\t\tit(\"returns a phase config provider with interactive args\", func() {\n\t\t\t\thandler := func(bodyChan <-chan dcontainer.WaitResponse, errChan <-chan error, reader io.Reader) error {\n\t\t\t\t\treturn errors.New(\"i was called\")\n\t\t\t\t}\n\n\t\t\t\tfakeTermui := &fakes.FakeTermui{HandlerFunc: handler}\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\", fakes.WithTermui(fakeTermui))\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\"some-name\", lifecycle)\n\n\t\t\t\th.AssertError(t, phaseConfigProvider.Handler()(nil, nil, nil), \"i was called\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithArgs\", func() {\n\t\t\tit(\"sets args on the config\", func() {\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\t\t\t\texpectedArgs := []string{\"some-arg-1\", \"some-arg-2\"}\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycle,\n\t\t\t\t\tbuild.WithArgs(expectedArgs...),\n\t\t\t\t)\n\n\t\t\t\tcmd := phaseConfigProvider.ContainerConfig().Cmd\n\t\t\t\th.AssertSliceContainsInOrder(t, cmd, \"some-arg-1\", \"some-arg-2\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithFlags\", func() 
{\n\t\t\tit(\"sets args on the config\", func() {\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycle,\n\t\t\t\t\tbuild.WithArgs(\"arg-1\", \"arg-2\"),\n\t\t\t\t\tbuild.WithFlags(\"flag-1\", \"flag-2\"),\n\t\t\t\t)\n\n\t\t\t\tcmd := phaseConfigProvider.ContainerConfig().Cmd\n\t\t\t\th.AssertSliceContainsInOrder(t, cmd, \"flag-1\", \"flag-2\", \"arg-1\", \"arg-2\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithBinds\", func() {\n\t\t\tit(\"sets binds on the config\", func() {\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\t\t\t\texpectedBinds := []string{\"some-bind-1\", \"some-bind-2\"}\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycle,\n\t\t\t\t\tbuild.WithBinds(expectedBinds...),\n\t\t\t\t)\n\n\t\t\t\th.AssertSliceContains(t, phaseConfigProvider.HostConfig().Binds, expectedBinds...)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithDaemonAccess\", func() {\n\t\t\twhen(\"building for non-Windows\", func() {\n\t\t\t\tit(\"sets daemon access on the config\", func() {\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\n\t\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\t\"some-name\",\n\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\tbuild.WithDaemonAccess(\"\"),\n\t\t\t\t\t)\n\n\t\t\t\t\th.AssertEq(t, phaseConfigProvider.ContainerConfig().User, \"root\")\n\t\t\t\t\th.AssertSliceContains(t, phaseConfigProvider.HostConfig().Binds, \"/var/run/docker.sock:/var/run/docker.sock\")\n\t\t\t\t\th.AssertSliceContains(t, phaseConfigProvider.HostConfig().SecurityOpt, \"label=disable\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"building for Windows\", func() {\n\t\t\t\tit(\"sets daemon access on the config\", func() {\n\t\t\t\t\tfakeBuilderImage := ifakes.NewImage(\"fake-builder\", \"\", nil)\n\t\t\t\t\th.AssertNil(t, 
fakeBuilderImage.SetOS(\"windows\"))\n\t\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(fakes.WithImage(fakeBuilderImage))\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\", fakes.WithBuilder(fakeBuilder))\n\n\t\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\t\"some-name\",\n\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\tbuild.WithDaemonAccess(\"\"),\n\t\t\t\t\t)\n\n\t\t\t\t\th.AssertEq(t, phaseConfigProvider.ContainerConfig().User, \"ContainerAdministrator\")\n\t\t\t\t\th.AssertSliceContains(t, phaseConfigProvider.HostConfig().Binds, `\\\\.\\pipe\\docker_engine:\\\\.\\pipe\\docker_engine`)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithEnv\", func() {\n\t\t\tit(\"sets the environment on the config\", func() {\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycle,\n\t\t\t\t\tbuild.WithEnv(\"SOME_VARIABLE=some-value\"),\n\t\t\t\t)\n\n\t\t\t\th.AssertSliceContains(t, phaseConfigProvider.ContainerConfig().Env, \"SOME_VARIABLE=some-value\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"execution environment is set\", func() {\n\t\t\twhen(\"platform API >= 0.15\", func() {\n\t\t\t\tit(\"sets CNB_EXEC_ENV environment variable\", func() {\n\t\t\t\t\texpectedBuilderImage := ifakes.NewImage(\"some-builder-name\", \"\", nil)\n\t\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(\n\t\t\t\t\t\tfakes.WithImage(expectedBuilderImage),\n\t\t\t\t\t\tfakes.WithSupportedPlatformAPIs([]*api.Version{api.MustParse(\"0.15\")}),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\",\n\t\t\t\t\t\tfakes.WithBuilder(fakeBuilder),\n\t\t\t\t\t\tfakes.WithExecutionEnvironment(\"test\"),\n\t\t\t\t\t)\n\n\t\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\"some-name\", lifecycle)\n\n\t\t\t\t\th.AssertSliceContains(t, 
phaseConfigProvider.ContainerConfig().Env, \"CNB_EXEC_ENV=test\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"platform API < 0.15\", func() {\n\t\t\t\tit(\"does not set CNB_EXEC_ENV environment variable\", func() {\n\t\t\t\t\texpectedBuilderImage := ifakes.NewImage(\"some-builder-name\", \"\", nil)\n\t\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(\n\t\t\t\t\t\tfakes.WithImage(expectedBuilderImage),\n\t\t\t\t\t\tfakes.WithSupportedPlatformAPIs([]*api.Version{api.MustParse(\"0.14\")}),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\",\n\t\t\t\t\t\tfakes.WithBuilder(fakeBuilder),\n\t\t\t\t\t\tfakes.WithExecutionEnvironment(\"test\"),\n\t\t\t\t\t)\n\n\t\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\"some-name\", lifecycle)\n\n\t\t\t\t\th.AssertSliceNotContains(t, phaseConfigProvider.ContainerConfig().Env, \"CNB_EXEC_ENV=test\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithImage\", func() {\n\t\t\tit(\"sets the image on the config\", func() {\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycle,\n\t\t\t\t\tbuild.WithImage(\"some-image-name\"),\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, phaseConfigProvider.ContainerConfig().Image, \"some-image-name\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithNetwork\", func() {\n\t\t\tit(\"sets the network mode on the config\", func() {\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\t\t\t\texpectedNetworkMode := \"some-network-mode\"\n\n\t\t\t\tphaseConfigProvider := 
build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycle,\n\t\t\t\t\tbuild.WithNetwork(expectedNetworkMode),\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(\n\t\t\t\t\tt,\n\t\t\t\t\tphaseConfigProvider.HostConfig().NetworkMode,\n\t\t\t\t\tdcontainer.NetworkMode(expectedNetworkMode),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithRegistryAccess\", func() {\n\t\t\tit(\"sets registry access on the config\", func() {\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\t\t\t\tauthConfig := \"some-auth-config\"\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycle,\n\t\t\t\t\tbuild.WithRegistryAccess(authConfig),\n\t\t\t\t)\n\n\t\t\t\th.AssertSliceContains(\n\t\t\t\t\tt,\n\t\t\t\t\tphaseConfigProvider.ContainerConfig().Env,\n\t\t\t\t\t\"CNB_REGISTRY_AUTH=\"+authConfig,\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithRoot\", func() {\n\t\t\twhen(\"building for non-Windows\", func() {\n\t\t\t\tit(\"sets root user on the config\", func() {\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\n\t\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\t\"some-name\",\n\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\tbuild.WithRoot(),\n\t\t\t\t\t)\n\n\t\t\t\t\th.AssertEq(t, phaseConfigProvider.ContainerConfig().User, \"root\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"building for Windows\", func() {\n\t\t\t\tit(\"sets root user on the config\", func() {\n\t\t\t\t\tfakeBuilderImage := ifakes.NewImage(\"fake-builder\", \"\", nil)\n\t\t\t\t\th.AssertNil(t, fakeBuilderImage.SetOS(\"windows\"))\n\t\t\t\t\tfakeBuilder, err := fakes.NewFakeBuilder(fakes.WithImage(fakeBuilderImage))\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\", fakes.WithBuilder(fakeBuilder))\n\n\t\t\t\t\tphaseConfigProvider := 
build.NewPhaseConfigProvider(\n\t\t\t\t\t\t\"some-name\",\n\t\t\t\t\t\tlifecycle,\n\t\t\t\t\t\tbuild.WithRoot(),\n\t\t\t\t\t)\n\n\t\t\t\t\th.AssertEq(t, phaseConfigProvider.ContainerConfig().User, \"ContainerAdministrator\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with WithLogPrefix\", func() {\n\t\t\tit(\"sets prefix writers\", func() {\n\t\t\t\tlifecycle := newTestLifecycleExec(t, false, \"some-temp-dir\")\n\n\t\t\t\tphaseConfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycle,\n\t\t\t\t\tbuild.WithLogPrefix(\"some-prefix\"),\n\t\t\t\t)\n\n\t\t\t\t_, isType := phaseConfigProvider.InfoWriter().(*logging.PrefixWriter)\n\t\t\t\th.AssertEq(t, isType, true)\n\n\t\t\t\t_, isType = phaseConfigProvider.ErrorWriter().(*logging.PrefixWriter)\n\t\t\t\th.AssertEq(t, isType, true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"verbose\", func() {\n\t\t\tit(\"prints debug information about the phase\", func() {\n\t\t\t\tvar outBuf bytes.Buffer\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\n\t\t\t\tdocker, err := client.New(client.FromEnv)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tdefaultBuilder, err := fakes.NewFakeBuilder()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\tAppPath: \"some-app-path\",\n\t\t\t\t\tBuilder: defaultBuilder,\n\t\t\t\t}\n\n\t\t\t\tlifecycleExec, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t_ = build.NewPhaseConfigProvider(\n\t\t\t\t\t\"some-name\",\n\t\t\t\t\tlifecycleExec,\n\t\t\t\t\tbuild.WithRoot(),\n\t\t\t\t)\n\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Running the 'some-name' on OS\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Args: '/cnb/lifecycle/some-name'\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"System Envs: 'CNB_PLATFORM_API=0.4'\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Image: 'some-builder-name'\")\n\t\t\t\th.AssertContains(t, 
outBuf.String(), \"User:\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Labels: 'map[author:pack]'\")\n\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `Binds: \\'\\S+:\\S+layers \\S+:\\S+workspace'`)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Network Mode: ''\")\n\t\t\t})\n\n\t\t\twhen(\"there is registry auth\", func() {\n\t\t\t\tit(\"sanitizes the output\", func() {\n\t\t\t\t\tauthConfig := \"some-auth-config\"\n\n\t\t\t\t\tvar outBuf bytes.Buffer\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\n\t\t\t\t\tdocker, err := client.New(client.FromEnv)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tdefaultBuilder, err := fakes.NewFakeBuilder()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\topts := build.LifecycleOptions{\n\t\t\t\t\t\tAppPath: \"some-app-path\",\n\t\t\t\t\t\tBuilder: defaultBuilder,\n\t\t\t\t\t}\n\n\t\t\t\t\tlifecycleExec, err := build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", opts)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t_ = build.NewPhaseConfigProvider(\n\t\t\t\t\t\t\"some-name\",\n\t\t\t\t\t\tlifecycleExec,\n\t\t\t\t\t\tbuild.WithRegistryAccess(authConfig),\n\t\t\t\t\t)\n\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"System Envs: 'CNB_REGISTRY_AUTH=<redacted> CNB_PLATFORM_API=0.4'\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/build/phase_factory.go",
    "content": "package build\n\nimport \"context\"\n\ntype RunnerCleaner interface {\n\tRun(ctx context.Context) error\n\tCleanup() error\n}\n\ntype PhaseFactory interface {\n\tNew(provider *PhaseConfigProvider) RunnerCleaner\n}\n\ntype DefaultPhaseFactory struct {\n\tlifecycleExec *LifecycleExecution\n}\n\ntype PhaseFactoryCreator func(*LifecycleExecution) PhaseFactory\n\nfunc NewDefaultPhaseFactory(lifecycleExec *LifecycleExecution) PhaseFactory {\n\treturn &DefaultPhaseFactory{lifecycleExec: lifecycleExec}\n}\n\nfunc (m *DefaultPhaseFactory) New(provider *PhaseConfigProvider) RunnerCleaner {\n\treturn &Phase{\n\t\tctrConf:             provider.ContainerConfig(),\n\t\thostConf:            provider.HostConfig(),\n\t\tname:                provider.Name(),\n\t\tdocker:              m.lifecycleExec.docker,\n\t\tinfoWriter:          provider.InfoWriter(),\n\t\terrorWriter:         provider.ErrorWriter(),\n\t\thandler:             provider.handler,\n\t\tuid:                 m.lifecycleExec.opts.Builder.UID(),\n\t\tgid:                 m.lifecycleExec.opts.Builder.GID(),\n\t\tappPath:             m.lifecycleExec.opts.AppPath,\n\t\tcontainerOps:        provider.containerOps,\n\t\tpostContainerRunOps: provider.postContainerRunOps,\n\t\tfileFilter:          m.lifecycleExec.opts.FileFilter,\n\t}\n}\n"
  },
  {
    "path": "internal/build/phase_test.go",
    "content": "package build_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"io\"\n\t\"net\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"runtime\"\n\t\"strconv\"\n\t\"sync\"\n\t\"testing\"\n\n\t// \"github.com/docker/docker/api/types/volume\"\n\n\t\"github.com/buildpacks/imgutil/local\"\n\t\"github.com/buildpacks/lifecycle/auth\"\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\n\t// \"github.com/docker/docker/api/types/filters\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/heroku/color\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/build/fakes\"\n\t\"github.com/buildpacks/pack/internal/container\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst phaseName = \"phase\"\n\nvar repoName string\n\n// TestPhase is an integration test suite to ensure that the phase options are propagated to the container.\nfunc TestPhase(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\th.RequireDocker(t)\n\n\tvar err error\n\tctrClient, err = client.New(client.FromEnv)\n\th.AssertNil(t, err)\n\n\tinfo, err := ctrClient.Info(context.TODO(), client.InfoOptions{})\n\th.AssertNil(t, err)\n\th.SkipIf(t, info.Info.OSType == \"windows\", \"These tests are not yet compatible with Windows-based containers\")\n\n\trepoName = \"phase.test.lc-\" + h.RandString(10)\n\twd, err := os.Getwd()\n\th.AssertNil(t, err)\n\th.CreateImageFromDir(t, ctrClient, repoName, filepath.Join(wd, \"testdata\", \"fake-lifecycle\"))\n\tdefer h.DockerRmi(ctrClient, repoName)\n\n\tspec.Run(t, \"phase\", testPhase, spec.Report(report.Terminal{}), spec.Sequential())\n}\n\nfunc testPhase(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlifecycleExec  
*build.LifecycleExecution\n\t\tphaseFactory   build.PhaseFactory\n\t\toutBuf, errBuf bytes.Buffer\n\t\tdocker         client.APIClient\n\t\tlogger         logging.Logger\n\t\tosType         string\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\tvar err error\n\t\tdocker, err = client.New(client.FromEnv)\n\t\th.AssertNil(t, err)\n\n\t\tinfo, err := ctrClient.Info(context.Background(), client.InfoOptions{})\n\t\th.AssertNil(t, err)\n\t\tosType = info.Info.OSType\n\n\t\tlifecycleExec, err = CreateFakeLifecycleExecution(logger, docker, filepath.Join(\"testdata\", \"fake-app\"), repoName)\n\t\th.AssertNil(t, err)\n\t\tphaseFactory = build.NewDefaultPhaseFactory(lifecycleExec)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNilE(t, lifecycleExec.Cleanup())\n\t})\n\n\twhen(\"Phase\", func() {\n\t\twhen(\"#Run\", func() {\n\t\t\tit(\"runs the subject phase on the builder image\", func() {\n\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec)\n\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"running some-lifecycle-phase\")\n\t\t\t})\n\n\t\t\tit(\"prefixes the output with the phase name\", func() {\n\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithLogPrefix(\"phase\"))\n\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"[phase] running some-lifecycle-phase\")\n\t\t\t})\n\n\t\t\tit(\"attaches the same layers volume to each phase\", func() {\n\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"write\", \"/layers/test.txt\", \"test-layers\"))\n\t\t\t\twritePhase := phaseFactory.New(configProvider)\n\n\t\t\t\tassertRunSucceeds(t, writePhase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"write 
test\")\n\n\t\t\t\tconfigProvider = build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"read\", \"/layers/test.txt\"))\n\t\t\t\treadPhase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, readPhase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"file contents: test-layers\")\n\t\t\t})\n\n\t\t\tit(\"attaches the same app volume to each phase\", func() {\n\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"write\", \"/workspace/test.txt\", \"test-app\"))\n\t\t\t\twritePhase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, writePhase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"write test\")\n\n\t\t\t\tconfigProvider = build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"read\", \"/workspace/test.txt\"))\n\t\t\t\treadPhase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, readPhase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"file contents: test-app\")\n\t\t\t})\n\n\t\t\tit(\"runs the phase with provided handlers\", func() {\n\t\t\t\tvar actual string\n\t\t\t\tvar handler container.Handler = func(bodyChan <-chan dcontainer.WaitResponse, errChan <-chan error, reader io.Reader) error {\n\t\t\t\t\tdata, _ := io.ReadAll(reader)\n\t\t\t\t\tactual = string(data)\n\t\t\t\t\treturn nil\n\t\t\t\t}\n\n\t\t\t\tvar err error\n\t\t\t\tlifecycleExec, err = CreateFakeLifecycleExecution(logger, docker, filepath.Join(\"testdata\", \"fake-app\"), repoName, handler)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tphaseFactory = build.NewDefaultPhaseFactory(lifecycleExec)\n\n\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec)\n\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, phase, nil, nil)\n\t\t\t\th.AssertContains(t, actual, \"running some-lifecycle-phase\")\n\t\t\t})\n\n\t\t\tit(\"copies the app into the app volume\", func() 
{\n\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\tphaseName,\n\t\t\t\t\tlifecycleExec,\n\t\t\t\t\tbuild.WithArgs(\"read\", \"/workspace/fake-app-file\"),\n\t\t\t\t\tbuild.WithContainerOperations(\n\t\t\t\t\t\tbuild.CopyDir(\n\t\t\t\t\t\t\tlifecycleExec.AppPath(),\n\t\t\t\t\t\t\t\"/workspace\",\n\t\t\t\t\t\t\tlifecycleExec.Builder().UID(),\n\t\t\t\t\t\t\tlifecycleExec.Builder().GID(),\n\t\t\t\t\t\t\tosType,\n\t\t\t\t\t\t\tfalse,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t),\n\t\t\t\t\t),\n\t\t\t\t)\n\t\t\t\treadPhase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, readPhase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"file contents: fake-app-contents\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"file uid/gid: 111/222\")\n\n\t\t\t\tconfigProvider = build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"delete\", \"/workspace/fake-app-file\"))\n\t\t\t\tdeletePhase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, deletePhase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"delete test\")\n\n\t\t\t\tconfigProvider = build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"read\", \"/workspace/fake-app-file\"))\n\t\t\t\treadPhase2 := phaseFactory.New(configProvider)\n\t\t\t\terr := readPhase2.Run(context.TODO())\n\t\t\t\th.AssertNil(t, readPhase2.Cleanup())\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"failed to read file\")\n\t\t\t})\n\n\t\t\twhen(\"app is a dir\", func() {\n\t\t\t\tit(\"preserves original mod times\", func() {\n\t\t\t\t\tassertAppModTimePreserved(t, lifecycleExec, phaseFactory, &outBuf, &errBuf, osType)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"app is a zip\", func() {\n\t\t\t\tit(\"preserves original mod times\", func() {\n\t\t\t\t\tvar err error\n\t\t\t\t\tlifecycleExec, err = CreateFakeLifecycleExecution(logger, docker, filepath.Join(\"testdata\", \"fake-app.zip\"), 
repoName)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tphaseFactory = build.NewDefaultPhaseFactory(lifecycleExec)\n\n\t\t\t\t\tassertAppModTimePreserved(t, lifecycleExec, phaseFactory, &outBuf, &errBuf, osType)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"is posix\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"Skipping on windows\")\n\t\t\t\t})\n\n\t\t\t\twhen(\"restricted directory is present\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\terr              error\n\t\t\t\t\t\ttmpFakeAppDir    string\n\t\t\t\t\t\tdirWithoutAccess string\n\t\t\t\t\t)\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.SkipIf(t, os.Getuid() == 0, \"Skipping b/c current user is root\")\n\n\t\t\t\t\t\ttmpFakeAppDir, err = os.MkdirTemp(\"\", \"fake-app\")\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tdirWithoutAccess = filepath.Join(tmpFakeAppDir, \"bad-dir\")\n\t\t\t\t\t\terr := os.MkdirAll(dirWithoutAccess, 0222)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.AssertNilE(t, os.RemoveAll(tmpFakeAppDir))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\t\tlifecycleExec, err = CreateFakeLifecycleExecution(logger, docker, tmpFakeAppDir, repoName)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tphaseFactory = build.NewDefaultPhaseFactory(lifecycleExec)\n\t\t\t\t\t\treadPhase := phaseFactory.New(build.NewPhaseConfigProvider(\n\t\t\t\t\t\t\tphaseName,\n\t\t\t\t\t\t\tlifecycleExec,\n\t\t\t\t\t\t\tbuild.WithArgs(\"read\", \"/workspace/fake-app-file\"),\n\t\t\t\t\t\t\tbuild.WithContainerOperations(\n\t\t\t\t\t\t\t\tbuild.CopyDir(lifecycleExec.AppPath(), \"/workspace\", 0, 0, osType, false, nil),\n\t\t\t\t\t\t\t),\n\t\t\t\t\t\t))\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\terr = readPhase.Run(context.TODO())\n\t\t\t\t\t\tdefer readPhase.Cleanup()\n\n\t\t\t\t\t\th.AssertNotNil(t, 
err)\n\t\t\t\t\t\th.AssertContains(t,\n\t\t\t\t\t\t\terr.Error(),\n\t\t\t\t\t\t\tfmt.Sprintf(\"open %s: permission denied\", dirWithoutAccess),\n\t\t\t\t\t\t)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"sets the proxy vars in the container\", func() {\n\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"proxy\"))\n\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"HTTP_PROXY=some-http-proxy\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"HTTPS_PROXY=some-https-proxy\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"NO_PROXY=some-no-proxy\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"http_proxy=some-http-proxy\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"https_proxy=some-https-proxy\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"no_proxy=some-no-proxy\")\n\t\t\t})\n\n\t\t\twhen(\"#WithArgs\", func() {\n\t\t\t\tit(\"runs the subject phase with args\", func() {\n\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"some\", \"args\"))\n\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\t\th.AssertContains(t, outBuf.String(), `received args [/cnb/lifecycle/phase some args]`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#WithDaemonAccess\", func() {\n\t\t\t\twhen(\"with standard docker socket\", func() {\n\t\t\t\t\tit(\"allows daemon access inside the container\", func() {\n\t\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"daemon\"), build.WithDaemonAccess(\"\"))\n\t\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"daemon test\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"with non standard docker socket\", func() 
{\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.SkipIf(t, runtime.GOOS != \"linux\", \"Skipped on non-linux\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"allows daemon access inside the container\", func() {\n\t\t\t\t\t\ttmp, err := os.MkdirTemp(\"\", \"testSocketDir\")\n\t\t\t\t\t\tif err != nil {\n\t\t\t\t\t\t\tt.Fatal(err)\n\t\t\t\t\t\t}\n\t\t\t\t\t\tdefer os.RemoveAll(tmp)\n\n\t\t\t\t\t\tnewName := filepath.Join(tmp, \"docker.sock\")\n\t\t\t\t\t\terr = os.Symlink(\"/var/run/docker.sock\", newName)\n\t\t\t\t\t\tif err != nil {\n\t\t\t\t\t\t\tt.Fatal(err)\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"daemon\"), build.WithDaemonAccess(\"unix://\"+newName))\n\t\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"daemon test\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"with TCP docker-host\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.SkipIf(t, runtime.GOOS != \"linux\", \"Skipped on non-linux\")\n\n\t\t\t\t\t\t// this test is problematic in GitPod due to the special networking used\n\t\t\t\t\t\t// see: https://github.com/gitpod-io/gitpod/issues/6446\n\t\t\t\t\t\th.SkipIf(t, os.Getenv(\"GITPOD_WORKSPACE_ID\") != \"\", \"Skipped on GitPod\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"allows daemon access inside the container\", func() {\n\t\t\t\t\t\tforwardCtx, cancelForward := context.WithCancel(context.Background())\n\t\t\t\t\t\tdefer cancelForward()\n\n\t\t\t\t\t\tportChan := make(chan int, 1)\n\t\t\t\t\t\tforwardExited := make(chan struct{}, 1)\n\t\t\t\t\t\tgo func() {\n\t\t\t\t\t\t\tforwardUnix2TCP(forwardCtx, t, portChan)\n\t\t\t\t\t\t\tforwardExited <- struct{}{}\n\t\t\t\t\t\t}()\n\n\t\t\t\t\t\tdockerHost := fmt.Sprintf(\"tcp://127.0.0.1:%d\", <-portChan)\n\t\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, 
lifecycleExec,\n\t\t\t\t\t\t\tbuild.WithArgs(\"daemon\"),\n\t\t\t\t\t\t\tbuild.WithDaemonAccess(dockerHost),\n\t\t\t\t\t\t\tbuild.WithNetwork(\"host\"))\n\t\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"daemon test\")\n\t\t\t\t\t\tcancelForward()\n\t\t\t\t\t\t<-forwardExited\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#WithRoot\", func() {\n\t\t\t\tit(\"sets the container's user to root\", func() {\n\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"user\"), build.WithRoot())\n\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"current user is root\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#WithBinds\", func() {\n\t\t\t\tit.After(func() {\n\t\t\t\t\t_, err := docker.VolumeRemove(context.TODO(), \"some-volume\", client.VolumeRemoveOptions{Force: true})\n\t\t\t\t\th.AssertNilE(t, err)\n\t\t\t\t})\n\n\t\t\t\tit(\"mounts volumes inside container\", func() {\n\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"binds\"), build.WithBinds(\"some-volume:/mounted\"))\n\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"binds test\")\n\t\t\t\t\tbody, err := docker.VolumeList(context.TODO(), client.VolumeListOptions{Filters: client.Filters{\"name\": {\"some-volume\": true}}})\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, len(body.Items), 1)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#WithRegistryAccess\", func() {\n\t\t\t\tvar registry *h.TestRegistryConfig\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tregistry = h.RunRegistry(t)\n\t\t\t\t\th.AssertNil(t, os.Setenv(\"DOCKER_CONFIG\", registry.DockerConfigDir))\n\t\t\t\t})\n\n\t\t\t\tit.After(func() 
{\n\t\t\t\t\tif registry != nil {\n\t\t\t\t\t\tregistry.StopRegistry(t)\n\t\t\t\t\t}\n\t\t\t\t\th.AssertNilE(t, os.Unsetenv(\"DOCKER_CONFIG\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"provides auth for registry in the container\", func() {\n\t\t\t\t\trepoName := h.CreateImageOnRemote(t, ctrClient, registry, \"packs/build:v3alpha2\", \"FROM busybox\")\n\n\t\t\t\t\tauthConfig, err := auth.BuildEnvVar(authn.DefaultKeychain, repoName)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(\n\t\t\t\t\t\tphaseName,\n\t\t\t\t\t\tlifecycleExec,\n\t\t\t\t\t\tbuild.WithArgs(\"registry\", repoName),\n\t\t\t\t\t\tbuild.WithRegistryAccess(authConfig),\n\t\t\t\t\t\tbuild.WithNetwork(\"host\"),\n\t\t\t\t\t)\n\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"registry test\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#WithNetwork\", func() {\n\t\t\t\tit(\"specifies a network for the container\", func() {\n\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec, build.WithArgs(\"network\"), build.WithNetwork(\"none\"))\n\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\t\t\th.AssertNotContainsMatch(t, outBuf.String(), `interface: eth\\d+`)\n\t\t\t\t\th.AssertContains(t, outBuf.String(), `error connecting to internet:`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#WithPostContainerRunOperations\", func() {\n\t\t\t\tit(\"runs the operation after the container command\", func() {\n\t\t\t\t\ttarDestinationPath, err := os.CreateTemp(\"\", \"pack.phase.test.\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer os.RemoveAll(tarDestinationPath.Name())\n\n\t\t\t\t\thandler := func(reader io.ReadCloser) error {\n\t\t\t\t\t\tdefer reader.Close()\n\n\t\t\t\t\t\tcontents, err := io.ReadAll(reader)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\terr = 
os.WriteFile(tarDestinationPath.Name(), contents, 0600)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\treturn nil\n\t\t\t\t\t}\n\n\t\t\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec,\n\t\t\t\t\t\tbuild.WithArgs(\"write\", \"/workspace/test.txt\", \"test-app\"),\n\t\t\t\t\t\tbuild.WithPostContainerRunOperations(build.CopyOut(handler, \"/workspace/test.txt\")))\n\t\t\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\n\t\t\t\t\th.AssertTarFileContents(t, tarDestinationPath.Name(), \"test.txt\", \"test-app\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Cleanup\", func() {\n\t\tit.Before(func() {\n\t\t\tconfigProvider := build.NewPhaseConfigProvider(phaseName, lifecycleExec)\n\t\t\tphase := phaseFactory.New(configProvider)\n\t\t\tassertRunSucceeds(t, phase, &outBuf, &errBuf)\n\t\t\th.AssertContains(t, outBuf.String(), \"running some-lifecycle-phase\")\n\n\t\t\th.AssertNil(t, lifecycleExec.Cleanup())\n\t\t})\n\n\t\tit(\"should delete the layers volume\", func() {\n\t\t\tbody, err := docker.VolumeList(context.TODO(), client.VolumeListOptions{\n\t\t\t\tFilters: client.Filters{\"name\": {lifecycleExec.LayersVolume(): true}}})\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, len(body.Items), 0)\n\t\t})\n\n\t\tit(\"should delete the app volume\", func() {\n\t\t\tbody, err := docker.VolumeList(context.TODO(), client.VolumeListOptions{\n\t\t\t\tFilters: client.Filters{\"name\": {lifecycleExec.AppVolume(): true}}})\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, len(body.Items), 0)\n\t\t})\n\t})\n}\n\nfunc assertAppModTimePreserved(t *testing.T, lifecycle *build.LifecycleExecution, phaseFactory build.PhaseFactory, outBuf *bytes.Buffer, errBuf *bytes.Buffer, osType string) {\n\tt.Helper()\n\treadPhase := phaseFactory.New(build.NewPhaseConfigProvider(\n\t\tphaseName,\n\t\tlifecycle,\n\t\tbuild.WithArgs(\"read\", 
\"/workspace/fake-app-file\"),\n\t\tbuild.WithContainerOperations(\n\t\t\tbuild.CopyDir(lifecycle.AppPath(), \"/workspace\", 0, 0, osType, false, nil),\n\t\t),\n\t))\n\tassertRunSucceeds(t, readPhase, outBuf, errBuf)\n\n\tmatches := regexp.MustCompile(regexp.QuoteMeta(\"file mod time (unix): \") + \"(.*)\").FindStringSubmatch(outBuf.String())\n\th.AssertEq(t, len(matches), 2)\n\th.AssertFalse(t, matches[1] == strconv.FormatInt(archive.NormalizedDateTime.Unix(), 10))\n}\n\nfunc assertRunSucceeds(t *testing.T, phase build.RunnerCleaner, outBuf *bytes.Buffer, errBuf *bytes.Buffer) {\n\tt.Helper()\n\tif err := phase.Run(context.TODO()); err != nil {\n\t\th.AssertNilE(t, phase.Cleanup())\n\t\tt.Fatalf(\"Failed to run phase: %s\\nstdout:\\n%s\\nstderr:\\n%s\\n\", err, outBuf.String(), errBuf.String())\n\t}\n\th.AssertNilE(t, phase.Cleanup())\n}\n\nfunc CreateFakeLifecycleExecution(logger logging.Logger, docker client.APIClient, appDir string, repoName string, handler ...container.Handler) (*build.LifecycleExecution, error) {\n\tbuilderImage, err := local.NewImage(repoName, docker, local.FromBaseImage(repoName))\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tfakeBuilder, err := fakes.NewFakeBuilder(\n\t\tfakes.WithUID(111), fakes.WithGID(222),\n\t\tfakes.WithImage(builderImage),\n\t)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tvar (\n\t\tinteractive bool\n\t\ttermui      build.Termui\n\t)\n\n\tif len(handler) != 0 {\n\t\tinteractive = true\n\t\ttermui = &fakes.FakeTermui{HandlerFunc: handler[0]}\n\t}\n\n\treturn build.NewLifecycleExecution(logger, docker, \"some-temp-dir\", build.LifecycleOptions{\n\t\tAppPath:     appDir,\n\t\tBuilder:     fakeBuilder,\n\t\tHTTPProxy:   \"some-http-proxy\",\n\t\tHTTPSProxy:  \"some-https-proxy\",\n\t\tNoProxy:     \"some-no-proxy\",\n\t\tInteractive: interactive,\n\t\tTermui:      termui,\n\t})\n}\n\n// helper function to expose standard UNIX socket `/var/run/docker.sock` via TCP localhost:PORT\n// where PORT is picked 
automatically and returned via outPort parameter\nfunc forwardUnix2TCP(ctx context.Context, t *testing.T, outPort chan<- int) {\n\twg := sync.WaitGroup{}\n\terrChan := make(chan error, 1)\n\n\tforwardCon := func(tcpCon net.Conn) {\n\t\tdefer wg.Done()\n\n\t\tdefer tcpCon.Close()\n\t\tgo func() {\n\t\t\t<-ctx.Done()\n\t\t\ttcpCon.Close()\n\t\t}()\n\n\t\tunixCon, err := net.Dial(\"unix\", \"/var/run/docker.sock\")\n\t\tif err != nil {\n\t\t\terrChan <- err\n\t\t\treturn\n\t\t}\n\t\tdefer unixCon.Close()\n\t\tgo func() {\n\t\t\t<-ctx.Done()\n\t\t\tunixCon.Close()\n\t\t}()\n\n\t\tgo io.Copy(tcpCon, unixCon)\n\t\tio.Copy(unixCon, tcpCon)\n\t}\n\n\tlistener, err := net.Listen(\"tcp4\", \":\")\n\tif err != nil {\n\t\tt.Fatal(err)\n\t}\n\toutPort <- listener.Addr().(*net.TCPAddr).Port\n\tdefer listener.Close()\n\tgo func() {\n\t\t<-ctx.Done()\n\t\tlistener.Close()\n\t}()\n\tfor {\n\t\tselect {\n\t\tcase <-ctx.Done():\n\t\t\tgoto out\n\t\tcase err = <-errChan:\n\t\t\tt.Fatal(err)\n\t\tdefault:\n\t\t\tc, err := listener.Accept()\n\t\t\tif err != nil {\n\t\t\t\tif ctx.Err() == context.Canceled {\n\t\t\t\t\tgoto out\n\t\t\t\t} else {\n\t\t\t\t\tt.Fatal(err)\n\t\t\t\t}\n\t\t\t}\n\t\t\twg.Add(1)\n\t\t\tgo forwardCon(c)\n\t\t}\n\t}\nout:\n\twg.Wait()\n}\n"
  },
  {
    "path": "internal/build/testdata/fake-app/fake-app-file",
    "content": "fake-app-contents"
  },
  {
    "path": "internal/build/testdata/fake-app/file-to-ignore",
    "content": ""
  },
  {
    "path": "internal/build/testdata/fake-lifecycle/Dockerfile",
    "content": "FROM golang:1.23\n\nRUN mkdir /lifecycle\nWORKDIR /go/src/step\nCOPY . .\nRUN GO111MODULE=on GOFLAGS=-mod=mod go build -o /cnb/lifecycle/phase ./phase.go\n\nENV CNB_USER_ID 111\nENV CNB_GROUP_ID 222\n\nLABEL io.buildpacks.stack.id=\"test.stack\"\nLABEL io.buildpacks.builder.metadata=\"{\\\"buildpacks\\\":[{\\\"id\\\":\\\"just/buildpack.id\\\",\\\"version\\\":\\\"1.2.3\\\"}],\\\"lifecycle\\\":{\\\"version\\\":\\\"0.5.0\\\",\\\"api\\\":{\\\"buildpack\\\":\\\"0.2\\\",\\\"platform\\\":\\\"0.1\\\"}}}\""
  },
  {
    "path": "internal/build/testdata/fake-lifecycle/go.mod",
    "content": "module step\n\nrequire (\n\tgithub.com/buildpacks/lifecycle v0.19.3\n\tgithub.com/docker/docker v28.5.1+incompatible\n\tgithub.com/google/go-containerregistry v0.19.2\n)\n\nrequire (\n\tgithub.com/Azure/azure-sdk-for-go v68.0.0+incompatible // indirect\n\tgithub.com/Azure/go-autorest v14.2.0+incompatible // indirect\n\tgithub.com/Azure/go-autorest/autorest v0.11.30 // indirect\n\tgithub.com/Azure/go-autorest/autorest/adal v0.9.24 // indirect\n\tgithub.com/Azure/go-autorest/autorest/azure/auth v0.5.13 // indirect\n\tgithub.com/Azure/go-autorest/autorest/azure/cli v0.4.7 // indirect\n\tgithub.com/Azure/go-autorest/autorest/date v0.3.1 // indirect\n\tgithub.com/Azure/go-autorest/logger v0.2.2 // indirect\n\tgithub.com/Azure/go-autorest/tracing v0.6.1 // indirect\n\tgithub.com/Microsoft/go-winio v0.6.2 // indirect\n\tgithub.com/aws/aws-sdk-go-v2 v1.36.3 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/config v1.29.14 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/credentials v1.17.67 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.16.30 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/internal/configsources v1.3.34 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.34 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/internal/ini v1.8.3 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/ecr v1.44.0 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/ecrpublic v1.33.0 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.12.3 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.12.15 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/sso v1.25.3 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/ssooidc v1.30.1 // indirect\n\tgithub.com/aws/aws-sdk-go-v2/service/sts v1.33.19 // indirect\n\tgithub.com/aws/smithy-go v1.22.3 // indirect\n\tgithub.com/awslabs/amazon-ecr-credential-helper/ecr-login v0.9.1 // indirect\n\tgithub.com/beorn7/perks v1.0.1 // 
indirect\n\tgithub.com/cespare/xxhash/v2 v2.3.0 // indirect\n\tgithub.com/chrismellard/docker-credential-acr-env v0.0.0-20230304212654-82a0ddb27589 // indirect\n\tgithub.com/containerd/errdefs v1.0.0 // indirect\n\tgithub.com/containerd/errdefs/pkg v0.3.0 // indirect\n\tgithub.com/containerd/log v0.1.0 // indirect\n\tgithub.com/containerd/stargz-snapshotter/estargz v0.16.3 // indirect\n\tgithub.com/dimchansky/utfbom v1.1.1 // indirect\n\tgithub.com/distribution/reference v0.6.0 // indirect\n\tgithub.com/docker/cli v28.2.2+incompatible // indirect\n\tgithub.com/docker/distribution v2.8.3+incompatible // indirect\n\tgithub.com/docker/docker-credential-helpers v0.9.3 // indirect\n\tgithub.com/docker/go-connections v0.5.0 // indirect\n\tgithub.com/docker/go-metrics v0.0.1 // indirect\n\tgithub.com/docker/go-units v0.5.0 // indirect\n\tgithub.com/felixge/httpsnoop v1.0.4 // indirect\n\tgithub.com/go-logr/logr v1.4.3 // indirect\n\tgithub.com/go-logr/stdr v1.2.2 // indirect\n\tgithub.com/golang-jwt/jwt/v4 v4.5.2 // indirect\n\tgithub.com/gorilla/mux v1.8.1 // indirect\n\tgithub.com/klauspost/compress v1.18.0 // indirect\n\tgithub.com/mitchellh/go-homedir v1.1.0 // indirect\n\tgithub.com/moby/docker-image-spec v1.3.1 // indirect\n\tgithub.com/moby/sys/atomicwriter v0.1.0 // indirect\n\tgithub.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect\n\tgithub.com/opencontainers/go-digest v1.0.0 // indirect\n\tgithub.com/opencontainers/image-spec v1.1.1 // indirect\n\tgithub.com/pkg/errors v0.9.1 // indirect\n\tgithub.com/prometheus/client_golang v1.22.0 // indirect\n\tgithub.com/prometheus/client_model v0.6.2 // indirect\n\tgithub.com/prometheus/common v0.64.0 // indirect\n\tgithub.com/prometheus/procfs v0.16.1 // indirect\n\tgithub.com/sirupsen/logrus v1.9.3 // indirect\n\tgithub.com/vbatts/tar-split v0.12.1 // indirect\n\tgo.opentelemetry.io/auto/sdk v1.1.0 // indirect\n\tgo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.61.0 // 
indirect\n\tgo.opentelemetry.io/otel v1.38.0 // indirect\n\tgo.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.38.0 // indirect\n\tgo.opentelemetry.io/otel/metric v1.38.0 // indirect\n\tgo.opentelemetry.io/otel/sdk v1.38.0 // indirect\n\tgo.opentelemetry.io/otel/sdk/metric v1.38.0 // indirect\n\tgo.opentelemetry.io/otel/trace v1.38.0 // indirect\n\tgo.opentelemetry.io/proto/otlp v1.9.0 // indirect\n\tgolang.org/x/crypto v0.38.0 // indirect\n\tgolang.org/x/sync v0.15.0 // indirect\n\tgolang.org/x/sys v0.35.0 // indirect\n\tgoogle.golang.org/protobuf v1.36.10 // indirect\n)\n\ngo 1.23.0\n"
  },
  {
    "path": "internal/build/testdata/fake-lifecycle/go.sum",
    "content": "github.com/Azure/azure-sdk-for-go v68.0.0+incompatible h1:fcYLmCpyNYRnvJbPerq7U0hS+6+I79yEDJBqVNcqUzU=\ngithub.com/Azure/azure-sdk-for-go v68.0.0+incompatible/go.mod h1:9XXNKU+eRnpl9moKnB4QOLf1HestfXbmab5FXxiDBjc=\ngithub.com/Azure/go-ansiterm v0.0.0-20230124172434-306776ec8161 h1:L/gRVlceqvL25UVaW/CKtUDjefjrs0SPonmDGUVOYP0=\ngithub.com/Azure/go-ansiterm v0.0.0-20230124172434-306776ec8161/go.mod h1:xomTg63KZ2rFqZQzSB4Vz2SUXa1BpHTVz9L5PTmPC4E=\ngithub.com/Azure/go-autorest v14.2.0+incompatible h1:V5VMDjClD3GiElqLWO7mz2MxNAK/vTfRHdAubSIPRgs=\ngithub.com/Azure/go-autorest v14.2.0+incompatible/go.mod h1:r+4oMnoxhatjLLJ6zxSWATqVooLgysK6ZNox3g/xq24=\ngithub.com/Azure/go-autorest/autorest v0.11.28/go.mod h1:MrkzG3Y3AH668QyF9KRk5neJnGgmhQ6krbhR8Q5eMvA=\ngithub.com/Azure/go-autorest/autorest v0.11.30 h1:iaZ1RGz/ALZtN5eq4Nr1SOFSlf2E4pDI3Tcsl+dZPVE=\ngithub.com/Azure/go-autorest/autorest v0.11.30/go.mod h1:t1kpPIOpIVX7annvothKvb0stsrXa37i7b+xpmBW8Fs=\ngithub.com/Azure/go-autorest/autorest/adal v0.9.18/go.mod h1:XVVeme+LZwABT8K5Lc3hA4nAe8LDBVle26gTrguhhPQ=\ngithub.com/Azure/go-autorest/autorest/adal v0.9.22/go.mod h1:XuAbAEUv2Tta//+voMI038TrJBqjKam0me7qR+L8Cmk=\ngithub.com/Azure/go-autorest/autorest/adal v0.9.24 h1:BHZfgGsGwdkHDyZdtQRQk1WeUdW0m2WPAwuHZwUi5i4=\ngithub.com/Azure/go-autorest/autorest/adal v0.9.24/go.mod h1:7T1+g0PYFmACYW5LlG2fcoPiPlFHjClyRGL7dRlP5c8=\ngithub.com/Azure/go-autorest/autorest/azure/auth v0.5.13 h1:Ov8avRZi2vmrE2JcXw+tu5K/yB41r7xK9GZDiBF7NdM=\ngithub.com/Azure/go-autorest/autorest/azure/auth v0.5.13/go.mod h1:5BAVfWLWXihP47vYrPuBKKf4cS0bXI+KM9Qx6ETDJYo=\ngithub.com/Azure/go-autorest/autorest/azure/cli v0.4.6/go.mod h1:piCfgPho7BiIDdEQ1+g4VmKyD5y+p/XtSNqE6Hc4QD0=\ngithub.com/Azure/go-autorest/autorest/azure/cli v0.4.7 h1:Q9R3utmFg9K1B4OYtAZ7ZUUvIUdzQt7G2MN5Hi/d670=\ngithub.com/Azure/go-autorest/autorest/azure/cli v0.4.7/go.mod h1:bVrAueELJ0CKLBpUHDIvD516TwmHmzqwCpvONWRsw3s=\ngithub.com/Azure/go-autorest/autorest/date v0.3.0/go.mod 
h1:BI0uouVdmngYNUzGWeSYnokU+TrmwEsOqdt8Y6sso74=\ngithub.com/Azure/go-autorest/autorest/date v0.3.1 h1:o9Z8Jyt+VJJTCZ/UORishuHOusBwolhjokt9s5k8I4w=\ngithub.com/Azure/go-autorest/autorest/date v0.3.1/go.mod h1:Dz/RDmXlfiFFS/eW+b/xMUSFs1tboPVy6UjgADToWDM=\ngithub.com/Azure/go-autorest/autorest/mocks v0.4.1/go.mod h1:LTp+uSrOhSkaKrUy935gNZuuIPPVsHlr9DSOxSayd+k=\ngithub.com/Azure/go-autorest/autorest/mocks v0.4.2 h1:PGN4EDXnuQbojHbU0UWoNvmu9AGVwYHG9/fkDYhtAfw=\ngithub.com/Azure/go-autorest/autorest/mocks v0.4.2/go.mod h1:Vy7OitM9Kei0i1Oj+LvyAWMXJHeKH1MVlzFugfVrmyU=\ngithub.com/Azure/go-autorest/logger v0.2.1/go.mod h1:T9E3cAhj2VqvPOtCYAvby9aBXkZmbF5NWuPV8+WeEW8=\ngithub.com/Azure/go-autorest/logger v0.2.2 h1:hYqBsEBywrrOSW24kkOCXRcKfKhK76OzLTfF+MYDE2o=\ngithub.com/Azure/go-autorest/logger v0.2.2/go.mod h1:I5fg9K52o+iuydlWfa9T5K6WFos9XYr9dYTFzpqgibw=\ngithub.com/Azure/go-autorest/tracing v0.6.0/go.mod h1:+vhtPC754Xsa23ID7GlGsrdKBpUA79WCAKPPZVC2DeU=\ngithub.com/Azure/go-autorest/tracing v0.6.1 h1:YUMSrC/CeD1ZnnXcNYU4a/fzsO35u2Fsful9L/2nyR0=\ngithub.com/Azure/go-autorest/tracing v0.6.1/go.mod h1:/3EgjbsjraOqiicERAeu3m7/z0x1TzjQGAwDrJrXGkc=\ngithub.com/Microsoft/go-winio v0.6.2 h1:F2VQgta7ecxGYO8k3ZZz3RS8fVIXVxONVUPlNERoyfY=\ngithub.com/Microsoft/go-winio v0.6.2/go.mod h1:yd8OoFMLzJbo9gZq8j5qaps8bJ9aShtEA8Ipt1oGCvU=\ngithub.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=\ngithub.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=\ngithub.com/apex/log v1.9.0 h1:FHtw/xuaM8AgmvDDTI9fiwoAL25Sq2cxojnZICUU8l0=\ngithub.com/apex/log v1.9.0/go.mod h1:m82fZlWIuiWzWP04XCTXmnX0xRkYYbCdYn8jbJeLBEA=\ngithub.com/aws/aws-sdk-go-v2 v1.36.3 h1:mJoei2CxPutQVxaATCzDUjcZEjVRdpsiiXi2o38yqWM=\ngithub.com/aws/aws-sdk-go-v2 v1.36.3/go.mod h1:LLXuLpgzEbD766Z5ECcRmi8AzSwfZItDtmABVkRLGzg=\ngithub.com/aws/aws-sdk-go-v2/config v1.29.14 
h1:f+eEi/2cKCg9pqKBoAIwRGzVb70MRKqWX4dg1BDcSJM=\ngithub.com/aws/aws-sdk-go-v2/config v1.29.14/go.mod h1:wVPHWcIFv3WO89w0rE10gzf17ZYy+UVS1Geq8Iei34g=\ngithub.com/aws/aws-sdk-go-v2/credentials v1.17.67 h1:9KxtdcIA/5xPNQyZRgUSpYOE6j9Bc4+D7nZua0KGYOM=\ngithub.com/aws/aws-sdk-go-v2/credentials v1.17.67/go.mod h1:p3C44m+cfnbv763s52gCqrjaqyPikj9Sg47kUVaNZQQ=\ngithub.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.16.30 h1:x793wxmUWVDhshP8WW2mlnXuFrO4cOd3HLBroh1paFw=\ngithub.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.16.30/go.mod h1:Jpne2tDnYiFascUEs2AWHJL9Yp7A5ZVy3TNyxaAjD6M=\ngithub.com/aws/aws-sdk-go-v2/internal/configsources v1.3.34 h1:ZK5jHhnrioRkUNOc+hOgQKlUL5JeC3S6JgLxtQ+Rm0Q=\ngithub.com/aws/aws-sdk-go-v2/internal/configsources v1.3.34/go.mod h1:p4VfIceZokChbA9FzMbRGz5OV+lekcVtHlPKEO0gSZY=\ngithub.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.34 h1:SZwFm17ZUNNg5Np0ioo/gq8Mn6u9w19Mri8DnJ15Jf0=\ngithub.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.6.34/go.mod h1:dFZsC0BLo346mvKQLWmoJxT+Sjp+qcVR1tRVHQGOH9Q=\ngithub.com/aws/aws-sdk-go-v2/internal/ini v1.8.3 h1:bIqFDwgGXXN1Kpp99pDOdKMTTb5d2KyU5X/BZxjOkRo=\ngithub.com/aws/aws-sdk-go-v2/internal/ini v1.8.3/go.mod h1:H5O/EsxDWyU+LP/V8i5sm8cxoZgc2fdNR9bxlOFrQTo=\ngithub.com/aws/aws-sdk-go-v2/service/ecr v1.44.0 h1:E+UTVTDH6XTSjqxHWRuY8nB6s+05UllneWxnycplHFk=\ngithub.com/aws/aws-sdk-go-v2/service/ecr v1.44.0/go.mod h1:iQ1skgw1XRK+6Lgkb0I9ODatAP72WoTILh0zXQ5DtbU=\ngithub.com/aws/aws-sdk-go-v2/service/ecrpublic v1.33.0 h1:wA2O6pZ2r5smqJunFP4hp7qptMW4EQxs8O6RVHPulOE=\ngithub.com/aws/aws-sdk-go-v2/service/ecrpublic v1.33.0/go.mod h1:RZL7ov7c72wSmoM8bIiVxRHgcVdzhNkVW2J36C8RF4s=\ngithub.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.12.3 h1:eAh2A4b5IzM/lum78bZ590jy36+d/aFLgKF/4Vd1xPE=\ngithub.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.12.3/go.mod h1:0yKJC/kb8sAnmlYa6Zs3QVYqaC8ug2AbnNChv5Ox3uA=\ngithub.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.12.15 
h1:dM9/92u2F1JbDaGooxTq18wmmFzbJRfXfVfy96/1CXM=\ngithub.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.12.15/go.mod h1:SwFBy2vjtA0vZbjjaFtfN045boopadnoVPhu4Fv66vY=\ngithub.com/aws/aws-sdk-go-v2/service/sso v1.25.3 h1:1Gw+9ajCV1jogloEv1RRnvfRFia2cL6c9cuKV2Ps+G8=\ngithub.com/aws/aws-sdk-go-v2/service/sso v1.25.3/go.mod h1:qs4a9T5EMLl/Cajiw2TcbNt2UNo/Hqlyp+GiuG4CFDI=\ngithub.com/aws/aws-sdk-go-v2/service/ssooidc v1.30.1 h1:hXmVKytPfTy5axZ+fYbR5d0cFmC3JvwLm5kM83luako=\ngithub.com/aws/aws-sdk-go-v2/service/ssooidc v1.30.1/go.mod h1:MlYRNmYu/fGPoxBQVvBYr9nyr948aY/WLUvwBMBJubs=\ngithub.com/aws/aws-sdk-go-v2/service/sts v1.33.19 h1:1XuUZ8mYJw9B6lzAkXhqHlJd/XvaX32evhproijJEZY=\ngithub.com/aws/aws-sdk-go-v2/service/sts v1.33.19/go.mod h1:cQnB8CUnxbMU82JvlqjKR2HBOm3fe9pWorWBza6MBJ4=\ngithub.com/aws/smithy-go v1.22.3 h1:Z//5NuZCSW6R4PhQ93hShNbyBbn8BWCmCVCt+Q8Io5k=\ngithub.com/aws/smithy-go v1.22.3/go.mod h1:t1ufH5HMublsJYulve2RKmHDC15xu1f26kHCp/HgceI=\ngithub.com/awslabs/amazon-ecr-credential-helper/ecr-login v0.9.1 h1:50sS0RWhGpW/yZx2KcDNEb1u1MANv5BMEkJgcieEDTA=\ngithub.com/awslabs/amazon-ecr-credential-helper/ecr-login v0.9.1/go.mod h1:ErZOtbzuHabipRTDTor0inoRlYwbsV1ovwSxjGs/uJo=\ngithub.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=\ngithub.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=\ngithub.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=\ngithub.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=\ngithub.com/buildpacks/lifecycle v0.19.3 h1:T6dwX+/Nq7Q41Pb2zVu54MLrJPt93KEMNj4dHkXINbA=\ngithub.com/buildpacks/lifecycle v0.19.3/go.mod h1:BoLvGP1fjOqab59dariHDhVh5uIQuQ7yoIfj0orvL8M=\ngithub.com/cenkalti/backoff/v5 v5.0.3 h1:ZN+IMa753KfX5hd8vVaMixjnqRZ3y8CuJKRKj1xcsSM=\ngithub.com/cenkalti/backoff/v5 v5.0.3/go.mod h1:rkhZdG3JZukswDf7f0cwqPNk4K0sa+F97BxZthm/crw=\ngithub.com/cespare/xxhash/v2 v2.3.0 
h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=\ngithub.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=\ngithub.com/chrismellard/docker-credential-acr-env v0.0.0-20230304212654-82a0ddb27589 h1:krfRl01rzPzxSxyLyrChD+U+MzsBXbm0OwYYB67uF+4=\ngithub.com/chrismellard/docker-credential-acr-env v0.0.0-20230304212654-82a0ddb27589/go.mod h1:OuDyvmLnMCwa2ep4Jkm6nyA0ocJuZlGyk2gGseVzERM=\ngithub.com/containerd/errdefs v1.0.0 h1:tg5yIfIlQIrxYtu9ajqY42W3lpS19XqdxRQeEwYG8PI=\ngithub.com/containerd/errdefs v1.0.0/go.mod h1:+YBYIdtsnF4Iw6nWZhJcqGSg/dwvV7tyJ/kCkyJ2k+M=\ngithub.com/containerd/errdefs/pkg v0.3.0 h1:9IKJ06FvyNlexW690DXuQNx2KA2cUJXx151Xdx3ZPPE=\ngithub.com/containerd/errdefs/pkg v0.3.0/go.mod h1:NJw6s9HwNuRhnjJhM7pylWwMyAkmCQvQ4GpJHEqRLVk=\ngithub.com/containerd/log v0.1.0 h1:TCJt7ioM2cr/tfR8GPbGf9/VRAX8D2B4PjzCpfX540I=\ngithub.com/containerd/log v0.1.0/go.mod h1:VRRf09a7mHDIRezVKTRCrOq78v577GXq3bSa3EhrzVo=\ngithub.com/containerd/stargz-snapshotter/estargz v0.16.3 h1:7evrXtoh1mSbGj/pfRccTampEyKpjpOnS3CyiV1Ebr8=\ngithub.com/containerd/stargz-snapshotter/estargz v0.16.3/go.mod h1:uyr4BfYfOj3G9WBVE8cOlQmXAbPN9VEQpBBeJIuOipU=\ngithub.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=\ngithub.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=\ngithub.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=\ngithub.com/dimchansky/utfbom v1.1.1 h1:vV6w1AhK4VMnhBno/TPVCoK9U/LP0PkLCS9tbxHdi/U=\ngithub.com/dimchansky/utfbom v1.1.1/go.mod h1:SxdoEBH5qIqFocHMyGOXVAybYJdr71b1Q/j0mACtrfE=\ngithub.com/distribution/reference v0.6.0 h1:0IXCQ5g4/QMHHkarYzh5l+u8T3t73zM5QvfrDyIgxBk=\ngithub.com/distribution/reference v0.6.0/go.mod h1:BbU0aIcezP1/5jX/8MP0YiH4SdvB5Y4f/wlDRiLyi3E=\ngithub.com/docker/cli v28.2.2+incompatible h1:qzx5BNUDFqlvyq4AHzdNB7gSyVTmU4cgsyN9SdInc1A=\ngithub.com/docker/cli v28.2.2+incompatible/go.mod 
h1:JLrzqnKDaYBop7H2jaqPtU4hHvMKP+vjCwu2uszcLI8=\ngithub.com/docker/distribution v2.8.3+incompatible h1:AtKxIZ36LoNK51+Z6RpzLpddBirtxJnzDrHLEKxTAYk=\ngithub.com/docker/distribution v2.8.3+incompatible/go.mod h1:J2gT2udsDAN96Uj4KfcMRqY0/ypR+oyYUYmja8H+y+w=\ngithub.com/docker/docker v28.5.1+incompatible h1:Bm8DchhSD2J6PsFzxC35TZo4TLGR2PdW/E69rU45NhM=\ngithub.com/docker/docker v28.5.1+incompatible/go.mod h1:eEKB0N0r5NX/I1kEveEz05bcu8tLC/8azJZsviup8Sk=\ngithub.com/docker/docker-credential-helpers v0.9.3 h1:gAm/VtF9wgqJMoxzT3Gj5p4AqIjCBS4wrsOh9yRqcz8=\ngithub.com/docker/docker-credential-helpers v0.9.3/go.mod h1:x+4Gbw9aGmChi3qTLZj8Dfn0TD20M/fuWy0E5+WDeCo=\ngithub.com/docker/go-connections v0.5.0 h1:USnMq7hx7gwdVZq1L49hLXaFtUdTADjXGp+uj1Br63c=\ngithub.com/docker/go-connections v0.5.0/go.mod h1:ov60Kzw0kKElRwhNs9UlUHAE/F9Fe6GLaXnqyDdmEXc=\ngithub.com/docker/go-metrics v0.0.1 h1:AgB/0SvBxihN0X8OR4SjsblXkbMvalQ8cjmtKQ2rQV8=\ngithub.com/docker/go-metrics v0.0.1/go.mod h1:cG1hvH2utMXtqgqqYE9plW6lDxS3/5ayHzueweSI3Vw=\ngithub.com/docker/go-units v0.5.0 h1:69rxXcBk27SvSaaxTtLh/8llcHD8vYHT7WSdRZ/jvr4=\ngithub.com/docker/go-units v0.5.0/go.mod h1:fgPhTUdO+D/Jk86RDLlptpiXQzgHJF7gydDDbaIK4Dk=\ngithub.com/docker/libtrust v0.0.0-20160708172513-aabc10ec26b7 h1:UhxFibDNY/bfvqU5CAUmr9zpesgbU6SWc8/B4mflAE4=\ngithub.com/docker/libtrust v0.0.0-20160708172513-aabc10ec26b7/go.mod h1:cyGadeNEkKy96OOhEzfZl+yxihPEzKnqJwvfuSUqbZE=\ngithub.com/felixge/httpsnoop v1.0.4 h1:NFTV2Zj1bL4mc9sqWACXbQFVBBg2W3GPvqp8/ESS2Wg=\ngithub.com/felixge/httpsnoop v1.0.4/go.mod h1:m8KPJKqk1gH5J9DgRY2ASl2lWCfGKXixSwevea8zH2U=\ngithub.com/go-kit/kit v0.8.0/go.mod h1:xBxKIO96dXMWWy0MnWVtmwkA9/13aqxPnvrjFYMA2as=\ngithub.com/go-logfmt/logfmt v0.3.0/go.mod h1:Qt1PoO58o5twSAckw1HlFXLmHsOX5/0LbT9GBnD5lWE=\ngithub.com/go-logfmt/logfmt v0.4.0/go.mod h1:3RMwSq7FuexP4Kalkev3ejPJsZTpXXBr9+V4qmtdjCk=\ngithub.com/go-logr/logr v1.2.2/go.mod h1:jdQByPbusPIv2/zmleS9BjJVeZ6kBagPoEUsqbVz/1A=\ngithub.com/go-logr/logr v1.4.3 
h1:CjnDlHq8ikf6E492q6eKboGOC0T8CDaOvkHCIg8idEI=\ngithub.com/go-logr/logr v1.4.3/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=\ngithub.com/go-logr/stdr v1.2.2 h1:hSWxHoqTgW2S2qGc0LTAI563KZ5YKYRhT3MFKZMbjag=\ngithub.com/go-logr/stdr v1.2.2/go.mod h1:mMo/vtBO5dYbehREoey6XUKy/eSumjCCveDpRre4VKE=\ngithub.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=\ngithub.com/gogo/protobuf v1.1.1/go.mod h1:r8qH/GZQm5c6nD/R0oafs1akxWv10x8SbQlK7atdtwQ=\ngithub.com/golang-jwt/jwt/v4 v4.0.0/go.mod h1:/xlHOz8bRuivTWchD4jCa+NbatV+wEUSzwAxVc6locg=\ngithub.com/golang-jwt/jwt/v4 v4.2.0/go.mod h1:/xlHOz8bRuivTWchD4jCa+NbatV+wEUSzwAxVc6locg=\ngithub.com/golang-jwt/jwt/v4 v4.5.0/go.mod h1:m21LjoU+eqJr34lmDMbreY2eSTRJ1cv77w39/MY0Ch0=\ngithub.com/golang-jwt/jwt/v4 v4.5.2 h1:YtQM7lnr8iZ+j5q71MGKkNw9Mn7AjHM68uc9g5fXeUI=\ngithub.com/golang-jwt/jwt/v4 v4.5.2/go.mod h1:m21LjoU+eqJr34lmDMbreY2eSTRJ1cv77w39/MY0Ch0=\ngithub.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=\ngithub.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=\ngithub.com/golang/protobuf v1.3.2/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=\ngithub.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=\ngithub.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=\ngithub.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=\ngithub.com/google/go-containerregistry v0.19.2 h1:TannFKE1QSajsP6hPWb5oJNgKe1IKjHukIKDUmvsV6w=\ngithub.com/google/go-containerregistry v0.19.2/go.mod h1:YCMFNQeeXeLF+dnhhWkqDItx/JSkH01j1Kis4PsjzFI=\ngithub.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=\ngithub.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=\ngithub.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=\ngithub.com/gorilla/mux v1.8.1 
h1:TuBL49tXwgrFYWhqrNgrUNEY92u81SPhu7sTdzQEiWY=\ngithub.com/gorilla/mux v1.8.1/go.mod h1:AKf9I4AEqPTmMytcMc0KkNouC66V3BtZ4qD5fmWSiMQ=\ngithub.com/grpc-ecosystem/grpc-gateway/v2 v2.27.2 h1:8Tjv8EJ+pM1xP8mK6egEbD1OgnVTyacbefKhmbLhIhU=\ngithub.com/grpc-ecosystem/grpc-gateway/v2 v2.27.2/go.mod h1:pkJQ2tZHJ0aFOVEEot6oZmaVEZcRme73eIFmhiVuRWs=\ngithub.com/json-iterator/go v1.1.6/go.mod h1:+SdeFBvtyEkXs7REEP0seUULqWtbJapLOCVDaaPEHmU=\ngithub.com/json-iterator/go v1.1.7/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=\ngithub.com/julienschmidt/httprouter v1.2.0/go.mod h1:SYymIcj16QtmaHHD7aYtjjsJG7VTCxuUUipMqKk8s4w=\ngithub.com/klauspost/compress v1.18.0 h1:c/Cqfb0r+Yi+JtIEq73FWXVkRonBlf0CRNYc8Zttxdo=\ngithub.com/klauspost/compress v1.18.0/go.mod h1:2Pp+KzxcywXVXMr50+X0Q/Lsb43OQHYWRCY2AiWywWQ=\ngithub.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=\ngithub.com/kr/logfmt v0.0.0-20140226030751-b84e30acd515/go.mod h1:+0opPa2QZZtGFBFZlji/RkVcI2GknAs/DXo4wKdlNEc=\ngithub.com/kylelemons/godebug v1.1.0 h1:RPNrshWIDI6G2gRW9EHilWtl7Z6Sb1BR0xunSBf0SNc=\ngithub.com/kylelemons/godebug v1.1.0/go.mod h1:9/0rRGxNHcop5bhtWyNeEfOS8JIWk580+fNqagV/RAw=\ngithub.com/matttproud/golang_protobuf_extensions v1.0.1/go.mod h1:D8He9yQNgCq6Z5Ld7szi9bcBfOoFv/3dc6xSMkL2PC0=\ngithub.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y=\ngithub.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=\ngithub.com/moby/docker-image-spec v1.3.1 h1:jMKff3w6PgbfSa69GfNg+zN/XLhfXJGnEx3Nl2EsFP0=\ngithub.com/moby/docker-image-spec v1.3.1/go.mod h1:eKmb5VW8vQEh/BAr2yvVNvuiJuY6UIocYsFu/DxxRpo=\ngithub.com/moby/sys/atomicwriter v0.1.0 h1:kw5D/EqkBwsBFi0ss9v1VG3wIkVhzGvLklJ+w3A14Sw=\ngithub.com/moby/sys/atomicwriter v0.1.0/go.mod h1:Ul8oqv2ZMNHOceF643P6FKPXeCmYtlQMvpizfsSoaWs=\ngithub.com/moby/sys/sequential v0.6.0 h1:qrx7XFUd/5DxtqcoH1h438hF5TmOvzC/lspjy7zgvCU=\ngithub.com/moby/sys/sequential 
v0.6.0/go.mod h1:uyv8EUTrca5PnDsdMGXhZe6CCe8U/UiTWd+lL+7b/Ko=\ngithub.com/moby/term v0.5.0 h1:xt8Q1nalod/v7BqbG21f8mQPqH+xAaC9C3N3wfWbVP0=\ngithub.com/moby/term v0.5.0/go.mod h1:8FzsFHVUBGZdbDsJw/ot+X+d5HLUbvklYLJ9uGfcI3Y=\ngithub.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=\ngithub.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=\ngithub.com/modern-go/reflect2 v0.0.0-20180701023420-4b7aa43c6742/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=\ngithub.com/modern-go/reflect2 v1.0.1/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=\ngithub.com/morikuni/aec v1.0.0 h1:nP9CBfwrvYnBRgY6qfDQkygYDmYwOilePFkwzv4dU8A=\ngithub.com/morikuni/aec v1.0.0/go.mod h1:BbKIizmSmc5MMPqRYbxO4ZU0S0+P200+tUnFx7PXmsc=\ngithub.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 h1:C3w9PqII01/Oq1c1nUAm88MOHcQC9l5mIlSMApZMrHA=\ngithub.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822/go.mod h1:+n7T8mK8HuQTcFwEeznm/DIxMOiR9yIdICNftLE1DvQ=\ngithub.com/mwitkow/go-conntrack v0.0.0-20161129095857-cc309e4a2223/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=\ngithub.com/opencontainers/go-digest v1.0.0 h1:apOUWs51W5PlhuyGyz9FCeeBIOUDA/6nW8Oi/yOhh5U=\ngithub.com/opencontainers/go-digest v1.0.0/go.mod h1:0JzlMkj0TRzQZfJkVvzbP0HBR3IKzErnv2BNG4W4MAM=\ngithub.com/opencontainers/image-spec v1.1.1 h1:y0fUlFfIZhPF1W537XOLg0/fcx6zcHCJwooC2xJA040=\ngithub.com/opencontainers/image-spec v1.1.1/go.mod h1:qpqAh3Dmcf36wStyyWU+kCeDgrGnAve2nCC8+7h8Q0M=\ngithub.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=\ngithub.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=\ngithub.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=\ngithub.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=\ngithub.com/pmezard/go-difflib v1.0.0/go.mod 
h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=\ngithub.com/prometheus/client_golang v0.9.1/go.mod h1:7SWBe2y4D6OKWSNQJUaRYU/AaXPKyh/dDVn+NZz0KFw=\ngithub.com/prometheus/client_golang v1.0.0/go.mod h1:db9x61etRT2tGnBNRi70OPL5FsnadC4Ky3P0J6CfImo=\ngithub.com/prometheus/client_golang v1.1.0/go.mod h1:I1FGZT9+L76gKKOs5djB6ezCbFQP1xR9D75/vuwEF3g=\ngithub.com/prometheus/client_golang v1.22.0 h1:rb93p9lokFEsctTys46VnV1kLCDpVZ0a/Y92Vm0Zc6Q=\ngithub.com/prometheus/client_golang v1.22.0/go.mod h1:R7ljNsLXhuQXYZYtw6GAE9AZg8Y7vEW5scdCXrWRXC0=\ngithub.com/prometheus/client_model v0.0.0-20180712105110-5c3871d89910/go.mod h1:MbSGuTsp3dbXC40dX6PRTWyKYBIrTGTE9sqQNg2J8bo=\ngithub.com/prometheus/client_model v0.0.0-20190129233127-fd36f4220a90/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=\ngithub.com/prometheus/client_model v0.6.2 h1:oBsgwpGs7iVziMvrGhE53c/GrLUsZdHnqNwqPLxwZyk=\ngithub.com/prometheus/client_model v0.6.2/go.mod h1:y3m2F6Gdpfy6Ut/GBsUqTWZqCUvMVzSfMLjcu6wAwpE=\ngithub.com/prometheus/common v0.4.1/go.mod h1:TNfzLD0ON7rHzMJeJkieUDPYmFC7Snx/y86RQel1bk4=\ngithub.com/prometheus/common v0.6.0/go.mod h1:eBmuwkDJBwy6iBfxCBob6t6dR6ENT/y+J+Zk0j9GMYc=\ngithub.com/prometheus/common v0.64.0 h1:pdZeA+g617P7oGv1CzdTzyeShxAGrTBsolKNOLQPGO4=\ngithub.com/prometheus/common v0.64.0/go.mod h1:0gZns+BLRQ3V6NdaerOhMbwwRbNh9hkGINtQAsP5GS8=\ngithub.com/prometheus/procfs v0.0.0-20181005140218-185b4288413d/go.mod h1:c3At6R/oaqEKCNdg8wHV1ftS6bRYblBhIjjI8uT2IGk=\ngithub.com/prometheus/procfs v0.0.2/go.mod h1:TjEm7ze935MbeOT/UhFTIMYKhuLP4wbCsTZCD3I8kEA=\ngithub.com/prometheus/procfs v0.0.3/go.mod h1:4A/X28fw3Fc593LaREMrKMqOKvUAntwMDaekg4FpcdQ=\ngithub.com/prometheus/procfs v0.16.1 h1:hZ15bTNuirocR6u0JZ6BAHHmwS1p8B4P6MRqxtzMyRg=\ngithub.com/prometheus/procfs v0.16.1/go.mod h1:teAbpZRB1iIAJYREa1LsoWUXykVXA1KlTmWl8x/U+Is=\ngithub.com/sclevine/spec v1.4.0 h1:z/Q9idDcay5m5irkZ28M7PtQM4aOISzOpj4bUPkDee8=\ngithub.com/sclevine/spec v1.4.0/go.mod 
h1:LvpgJaFyvQzRvc1kaDs0bulYwzC70PbiYjC4QnFHkOM=\ngithub.com/sirupsen/logrus v1.2.0/go.mod h1:LxeOpSwHxABJmUn/MG1IvRgCAasNZTLOkJPxbbu5VWo=\ngithub.com/sirupsen/logrus v1.9.3 h1:dueUQJ1C2q9oE3F7wvmSGAaVtTmUizReu6fjN8uqzbQ=\ngithub.com/sirupsen/logrus v1.9.3/go.mod h1:naHLuLoDiP4jHNo9R0sCBMtWGeIprob74mVsIT4qYEQ=\ngithub.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=\ngithub.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=\ngithub.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=\ngithub.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=\ngithub.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=\ngithub.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=\ngithub.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=\ngithub.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=\ngithub.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=\ngithub.com/stretchr/testify v1.8.2/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=\ngithub.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=\ngithub.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=\ngithub.com/vbatts/tar-split v0.12.1 h1:CqKoORW7BUWBe7UL/iqTVvkTBOF8UvOMKOIZykxnnbo=\ngithub.com/vbatts/tar-split v0.12.1/go.mod h1:eF6B6i6ftWQcDqEn3/iGFRFRo8cBIMSJVOpnNdfTMFA=\ngithub.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=\ngo.opentelemetry.io/auto/sdk v1.1.0 h1:cH53jehLUN6UFLY71z+NDOiNJqDdPRaXzTel0sJySYA=\ngo.opentelemetry.io/auto/sdk v1.1.0/go.mod h1:3wSPjt5PWp2RhlCcmmOial7AvC4DQqZb7a7wCow3W8A=\ngo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.61.0 
h1:F7Jx+6hwnZ41NSFTO5q4LYDtJRXBf2PD0rNBkeB/lus=\ngo.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.61.0/go.mod h1:UHB22Z8QsdRDrnAtX4PntOl36ajSxcdUMt1sF7Y6E7Q=\ngo.opentelemetry.io/otel v1.38.0 h1:RkfdswUDRimDg0m2Az18RKOsnI8UDzppJAtj01/Ymk8=\ngo.opentelemetry.io/otel v1.38.0/go.mod h1:zcmtmQ1+YmQM9wrNsTGV/q/uyusom3P8RxwExxkZhjM=\ngo.opentelemetry.io/otel/exporters/otlp/otlptrace v1.38.0 h1:GqRJVj7UmLjCVyVJ3ZFLdPRmhDUp2zFmQe3RHIOsw24=\ngo.opentelemetry.io/otel/exporters/otlp/otlptrace v1.38.0/go.mod h1:ri3aaHSmCTVYu2AWv44YMauwAQc0aqI9gHKIcSbI1pU=\ngo.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.38.0 h1:aTL7F04bJHUlztTsNGJ2l+6he8c+y/b//eR0jjjemT4=\ngo.opentelemetry.io/otel/exporters/otlp/otlptrace/otlptracehttp v1.38.0/go.mod h1:kldtb7jDTeol0l3ewcmd8SDvx3EmIE7lyvqbasU3QC4=\ngo.opentelemetry.io/otel/metric v1.38.0 h1:Kl6lzIYGAh5M159u9NgiRkmoMKjvbsKtYRwgfrA6WpA=\ngo.opentelemetry.io/otel/metric v1.38.0/go.mod h1:kB5n/QoRM8YwmUahxvI3bO34eVtQf2i4utNVLr9gEmI=\ngo.opentelemetry.io/otel/sdk v1.38.0 h1:l48sr5YbNf2hpCUj/FoGhW9yDkl+Ma+LrVl8qaM5b+E=\ngo.opentelemetry.io/otel/sdk v1.38.0/go.mod h1:ghmNdGlVemJI3+ZB5iDEuk4bWA3GkTpW+DOoZMYBVVg=\ngo.opentelemetry.io/otel/sdk/metric v1.38.0 h1:aSH66iL0aZqo//xXzQLYozmWrXxyFkBJ6qT5wthqPoM=\ngo.opentelemetry.io/otel/sdk/metric v1.38.0/go.mod h1:dg9PBnW9XdQ1Hd6ZnRz689CbtrUp0wMMs9iPcgT9EZA=\ngo.opentelemetry.io/otel/trace v1.38.0 h1:Fxk5bKrDZJUH+AMyyIXGcFAPah0oRcT+LuNtJrmcNLE=\ngo.opentelemetry.io/otel/trace v1.38.0/go.mod h1:j1P9ivuFsTceSWe1oY+EeW3sc+Pp42sO++GHkg4wwhs=\ngo.opentelemetry.io/proto/otlp v1.9.0 h1:l706jCMITVouPOqEnii2fIAuO3IVGBRPV5ICjceRb/A=\ngo.opentelemetry.io/proto/otlp v1.9.0/go.mod h1:xE+Cx5E/eEHw+ISFkwPLwCZefwVjY+pqKg1qcK03+/4=\ngolang.org/x/crypto v0.0.0-20180904163835-0709b304e793/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=\ngolang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=\ngolang.org/x/crypto 
v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=\ngolang.org/x/crypto v0.0.0-20220722155217-630584e8d5aa/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=\ngolang.org/x/crypto v0.17.0/go.mod h1:gCAAfMLgwOJRpTjQ2zCCt2OcSfYMTeZVSRtQlPC7Nq4=\ngolang.org/x/crypto v0.38.0 h1:jt+WWG8IZlBnVbomuhg2Mdq0+BBQaHbtqHEFEigjUV8=\ngolang.org/x/crypto v0.38.0/go.mod h1:MvrbAqul58NNYPKnOra203SB9vpuZW0e+RRZV+Ggqjw=\ngolang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=\ngolang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=\ngolang.org/x/net v0.0.0-20181114220301-adae6a3d119a/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=\ngolang.org/x/net v0.0.0-20190613194153-d28f0bde5980/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=\ngolang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=\ngolang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=\ngolang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=\ngolang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=\ngolang.org/x/net v0.43.0 h1:lat02VYK2j4aLzMzecihNvTlJNQUq316m2Mr9rnM6YE=\ngolang.org/x/net v0.43.0/go.mod h1:vhO1fvI4dGsIjh73sWfUVjj3N7CA9WkKJNQm2svM6Jg=\ngolang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod 
h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.15.0 h1:KWH3jNZsfyT6xfAfKiz6MRNmd46ByHDYaZ7KSkCtdW8=\ngolang.org/x/sync v0.15.0/go.mod h1:1dzgHSNfp02xaA81J2MS99Qcpr2w7fw1gpm99rleRqA=\ngolang.org/x/sys v0.0.0-20180905080454-ebe1bf3edb33/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20181116152217-5ac8a444bdc5/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20190801041406-cbf593c0f2f3/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.15.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=\ngolang.org/x/sys v0.35.0 h1:vz1N37gP5bs89s7He8XuIYXpyY0+QlsKmzipCbUtyxI=\ngolang.org/x/sys v0.35.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=\ngolang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=\ngolang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=\ngolang.org/x/term 
v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=\ngolang.org/x/term v0.8.0/go.mod h1:xPskH00ivmX89bAKVGSKKtLOWNx2+17Eiy94tnKShWo=\ngolang.org/x/term v0.15.0/go.mod h1:BDl952bC7+uMoWR75FIrCDx79TPU9oHkTZ9yRbYOrX0=\ngolang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=\ngolang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=\ngolang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=\ngolang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=\ngolang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=\ngolang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=\ngolang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=\ngolang.org/x/text v0.28.0 h1:rhazDwis8INMIwQ4tpjLDzUhx6RlXqZNPEM0huQojng=\ngolang.org/x/text v0.28.0/go.mod h1:U8nCwOR8jO/marOQ0QbDiOngZVEBB7MAiitBuMjXiNU=\ngolang.org/x/time v0.5.0 h1:o7cqy6amK/52YcAKIPlM3a+Fpj35zvRj2TP+e1xFSfk=\ngolang.org/x/time v0.5.0/go.mod h1:3BpzKBy/shNhVucY/MWOyx10tF3SFh9QdLuxbVysPQM=\ngolang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=\ngolang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=\ngolang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=\ngolang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngoogle.golang.org/genproto v0.0.0-20240213162025-012b6fc9bca9 h1:9+tzLLstTlPTRyJTh+ah5wIMsBW5c4tQwGTN3thOW9Y=\ngoogle.golang.org/genproto/googleapis/api v0.0.0-20250825161204-c5933d9347a5 h1:BIRfGDEjiHRrk0QKZe3Xv2ieMhtgRGeLcZQ0mIVn4EY=\ngoogle.golang.org/genproto/googleapis/api v0.0.0-20250825161204-c5933d9347a5/go.mod 
h1:j3QtIyytwqGr1JUDtYXwtMXWPKsEa5LtzIFN1Wn5WvE=\ngoogle.golang.org/genproto/googleapis/rpc v0.0.0-20250825161204-c5933d9347a5 h1:eaY8u2EuxbRv7c3NiGK0/NedzVsCcV6hDuU5qPX5EGE=\ngoogle.golang.org/genproto/googleapis/rpc v0.0.0-20250825161204-c5933d9347a5/go.mod h1:M4/wBTSeyLxupu3W3tJtOgB14jILAS/XWPSSa3TAlJc=\ngoogle.golang.org/grpc v1.75.1 h1:/ODCNEuf9VghjgO3rqLcfg8fiOP0nSluljWFlDxELLI=\ngoogle.golang.org/grpc v1.75.1/go.mod h1:JtPAzKiq4v1xcAB2hydNlWI2RnF85XXcV0mhKXr2ecQ=\ngoogle.golang.org/protobuf v1.36.10 h1:AYd7cD/uASjIL6Q9LiTjz8JLcrh/88q5UObnmY3aOOE=\ngoogle.golang.org/protobuf v1.36.10/go.mod h1:HTf+CrKn2C3g5S8VImy6tdcUvCska2kB7j23XfzDpco=\ngopkg.in/alecthomas/kingpin.v2 v2.2.6/go.mod h1:FMv+mEhP44yOT+4EoQTLFTRgOQ1FBLkstjWtayDeSgw=\ngopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=\ngopkg.in/yaml.v2 v2.2.1/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=\ngopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=\ngopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=\ngopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=\ngotest.tools/v3 v3.0.3 h1:4AuOwCGf4lLR9u3YOe2awrHygurzhO/HeQ6laiA6Sx0=\ngotest.tools/v3 v3.0.3/go.mod h1:Z7Lb0S5l+klDB31fvDQX8ss/FlKDxtlFlw3Oa8Ymbl8=\n"
  },
  {
    "path": "internal/build/testdata/fake-lifecycle/phase.go",
    "content": "package main\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"net\"\n\t\"net/http\"\n\t\"os\"\n\t\"os/user\"\n\t\"path/filepath\"\n\t\"syscall\"\n\n\t\"github.com/buildpacks/lifecycle/auth\"\n\t\"github.com/docker/docker/api/types/container\"\n\tdockercli \"github.com/docker/docker/client\"\n\tv1remote \"github.com/google/go-containerregistry/pkg/v1/remote\"\n)\n\nfunc main() {\n\tfmt.Println(\"running some-lifecycle-phase\")\n\tfmt.Printf(\"received args %+v\\n\", os.Args)\n\tif len(os.Args) > 3 && os.Args[1] == \"write\" {\n\t\ttestWrite(os.Args[2], os.Args[3])\n\t}\n\tif len(os.Args) > 1 && os.Args[1] == \"daemon\" {\n\t\ttestDaemon()\n\t}\n\tif len(os.Args) > 1 && os.Args[1] == \"registry\" {\n\t\ttestRegistryAccess(os.Args[2])\n\t}\n\tif len(os.Args) > 2 && os.Args[1] == \"read\" {\n\t\ttestRead(os.Args[2])\n\t}\n\tif len(os.Args) > 2 && os.Args[1] == \"delete\" {\n\t\ttestDelete(os.Args[2])\n\t}\n\tif len(os.Args) > 1 && os.Args[1] == \"env\" {\n\t\ttestEnv()\n\t}\n\tif len(os.Args) > 1 && os.Args[1] == \"buildpacks\" {\n\t\ttestBuildpacks()\n\t}\n\tif len(os.Args) > 1 && os.Args[1] == \"proxy\" {\n\t\ttestProxy()\n\t}\n\tif len(os.Args) > 1 && os.Args[1] == \"binds\" {\n\t\ttestBinds()\n\t}\n\tif len(os.Args) > 1 && os.Args[1] == \"network\" {\n\t\ttestNetwork()\n\t}\n\tif len(os.Args) > 1 && os.Args[1] == \"user\" {\n\t\ttestUser()\n\t}\n}\n\nfunc testWrite(filename, contents string) {\n\tfmt.Println(\"write test\")\n\tfile, err := os.Create(filename)\n\tif err != nil {\n\t\tfmt.Printf(\"failed to create %s: %s\\n\", filename, err)\n\t\tos.Exit(1)\n\t}\n\tdefer file.Close()\n\t_, err = file.Write([]byte(contents))\n\tif err != nil {\n\t\tfmt.Printf(\"failed to write to %s: %s\\n\", filename, err)\n\t\tos.Exit(1)\n\t}\n}\n\nfunc testDaemon() {\n\tfmt.Println(\"daemon test\")\n\tcli, err := dockercli.NewClientWithOpts(dockercli.FromEnv, dockercli.WithAPIVersionNegotiation())\n\tif err != nil {\n\t\tfmt.Printf(\"failed to create new docker client: 
%s\\n\", err)\n\t\tos.Exit(1)\n\t}\n\t_, err = cli.ContainerList(context.TODO(), container.ListOptions{})\n\tif err != nil {\n\t\tfmt.Printf(\"failed to access docker daemon: %s\\n\", err)\n\t\tos.Exit(1)\n\t}\n}\n\nfunc testRegistryAccess(repoName string) {\n\tfmt.Println(\"registry test\")\n\tfmt.Printf(\"CNB_REGISTRY_AUTH=%+v\\n\", os.Getenv(\"CNB_REGISTRY_AUTH\"))\n\tkeychain, err := auth.NewEnvKeychain(\"CNB_REGISTRY_AUTH\")\n\tif err != nil {\n\t\tfmt.Println(\"fail creating keychain:\", err)\n\t\tos.Exit(1)\n\t}\n\tref, authenticator, err := auth.ReferenceForRepoName(keychain, repoName)\n\tif err != nil {\n\t\tfmt.Println(\"fail:\", err)\n\t\tos.Exit(1)\n\t}\n\n\t_, err = v1remote.Image(ref, v1remote.WithAuth(authenticator))\n\tif err != nil {\n\t\tfmt.Println(\"failed to access image:\", err)\n\t\tos.Exit(1)\n\t}\n}\n\nfunc testRead(filename string) {\n\tfmt.Println(\"read test\")\n\tcontents, err := os.ReadFile(filepath.Clean(filename))\n\tif err != nil {\n\t\tfmt.Printf(\"failed to read file '%s'\\n\", filename)\n\t\tos.Exit(1)\n\t}\n\tfmt.Println(\"file contents:\", string(contents))\n\tinfo, err := os.Stat(filename)\n\tif err != nil {\n\t\tfmt.Printf(\"failed to stat file '%s'\\n\", filename)\n\t\tos.Exit(1)\n\t}\n\tstat := info.Sys().(*syscall.Stat_t)\n\tfmt.Printf(\"file uid/gid: %d/%d\\n\", stat.Uid, stat.Gid)\n\tfmt.Printf(\"file mod time (unix): %d\\n\", info.ModTime().Unix())\n}\n\nfunc testEnv() {\n\tfmt.Println(\"env test\")\n\tfis, err := os.ReadDir(\"/platform/env\")\n\tif err != nil {\n\t\tfmt.Printf(\"failed to read /platform/env dir: %s\\n\", err)\n\t\tos.Exit(1)\n\t}\n\tfor _, fi := range fis {\n\t\tcontents, err := os.ReadFile(filepath.Join(\"/\", \"platform\", \"env\", fi.Name()))\n\t\tif err != nil {\n\t\t\tfmt.Printf(\"failed to read file /platform/env/%s: %s\\n\", fi.Name(), err)\n\t\t\tos.Exit(1)\n\t\t}\n\t\tfmt.Printf(\"%s=%s\\n\", fi.Name(), string(contents))\n\t}\n}\n\nfunc testDelete(filename string) {\n\tfmt.Println(\"delete 
test\")\n\terr := os.RemoveAll(filename)\n\tif err != nil {\n\t\tfmt.Printf(\"failed to delete file '%s'\\n\", filename)\n\t\tos.Exit(10)\n\t}\n}\n\nfunc testBuildpacks() {\n\tfmt.Println(\"buildpacks test\")\n\n\treadDir(\"/buildpacks\")\n}\n\nfunc testProxy() {\n\tfmt.Println(\"proxy test\")\n\tfmt.Println(\"HTTP_PROXY=\" + os.Getenv(\"HTTP_PROXY\"))\n\tfmt.Println(\"HTTPS_PROXY=\" + os.Getenv(\"HTTPS_PROXY\"))\n\tfmt.Println(\"NO_PROXY=\" + os.Getenv(\"NO_PROXY\"))\n\tfmt.Println(\"http_proxy=\" + os.Getenv(\"http_proxy\"))\n\tfmt.Println(\"https_proxy=\" + os.Getenv(\"https_proxy\"))\n\tfmt.Println(\"no_proxy=\" + os.Getenv(\"no_proxy\"))\n}\n\nfunc testNetwork() {\n\tfmt.Println(\"testing network\")\n\tifaces, err := net.Interfaces()\n\tif err != nil {\n\t\tfmt.Print(fmt.Errorf(\"localAddresses: %+v\\n\", err.Error()))\n\t\treturn\n\t}\n\tfmt.Printf(\"iterating over %v interfaces\\n\", len(ifaces))\n\tfor _, i := range ifaces {\n\t\tfmt.Printf(\"interface: %s\\n\", i.Name)\n\t}\n\n\tresp, err := http.Get(\"http://google.com\")\n\tif err != nil {\n\t\tfmt.Printf(\"error connecting to internet: %s\\n\", err.Error())\n\t\treturn\n\t}\n\tfmt.Printf(\"response status %s\\n\", resp.Status)\n}\n\nfunc testBinds() {\n\tfmt.Println(\"binds test\")\n\treadDir(\"/mounted\")\n}\n\nfunc readDir(dir string) {\n\tfis, err := os.ReadDir(dir)\n\tif err != nil {\n\t\tfmt.Printf(\"failed to read %s dir: %s\\n\", dir, err)\n\t\tos.Exit(1)\n\t}\n\tfor _, fi := range fis {\n\t\tabsPath := filepath.Join(dir, fi.Name())\n\t\tinfo, err := fi.Info()\n\t\tif err != nil {\n\t\t\tfmt.Printf(\"failed to get info for %s: %s\\n\", fi.Name(), err)\n\t\t\tos.Exit(1)\n\t\t}\n\t\tstat := info.Sys().(*syscall.Stat_t)\n\t\tfmt.Printf(\"%s %d/%d \\n\", absPath, stat.Uid, stat.Gid)\n\t\tif fi.IsDir() {\n\t\t\treadDir(absPath)\n\t\t}\n\t}\n}\n\nfunc testUser() {\n\tfmt.Println(\"user test\")\n\tuser, err := user.Current()\n\tif err != nil {\n\t\tfmt.Printf(\"failed to determine current user: %s\\n\", 
err)\n\t\treturn\n\t}\n\n\tfmt.Printf(\"current user is %s\\n\", user.Name)\n}\n"
  },
  {
    "path": "internal/builder/builder.go",
    "content": "package builder\n\nimport (\n\t\"archive/tar\"\n\t\"bytes\"\n\te \"errors\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"sort\"\n\t\"strconv\"\n\t\"strings\"\n\t\"time\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/layer\"\n\t\"github.com/buildpacks/pack/internal/stack\"\n\tistrings \"github.com/buildpacks/pack/internal/strings\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\n\tlifecycleplatform \"github.com/buildpacks/lifecycle/platform\"\n)\n\nvar buildConfigDir = cnbBuildConfigDir()\n\nconst (\n\tpackName = \"Pack CLI\"\n\n\tcnbDir        = \"/cnb\"\n\tbuildpacksDir = \"/cnb/buildpacks\"\n\n\torderPath          = \"/cnb/order.toml\"\n\tstackPath          = \"/cnb/stack.toml\"\n\tsystemPath         = \"/cnb/system.toml\"\n\trunPath            = \"/cnb/run.toml\"\n\tplatformDir        = \"/platform\"\n\tlifecycleDir       = \"/cnb/lifecycle\"\n\tcompatLifecycleDir = \"/lifecycle\"\n\tworkspaceDir       = \"/workspace\"\n\tlayersDir          = \"/layers\"\n\n\temptyTarDiffID = \"sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\"\n\n\tmetadataLabel = \"io.buildpacks.builder.metadata\"\n\tstackLabel    = \"io.buildpacks.stack.id\"\n\n\tEnvUID = \"CNB_USER_ID\"\n\tEnvGID = \"CNB_GROUP_ID\"\n\n\tModuleOnBuilderMessage = `%s %s already exists on builder and will be overwritten\n  - existing diffID: %s\n  - new diffID: %s`\n\n\tModulePreviouslyDefinedMessage = `%s %s was previously defined with different contents and will be overwritten\n  - previous diffID: %s\n  - using diffID: %s`\n)\n\n// Builder represents a pack builder, used to build images\ntype Builder struct 
{\n\tbaseImageName        string\n\tbuildConfigEnv       map[string]string\n\timage                imgutil.Image\n\tlayerWriterFactory   archive.TarWriterFactory\n\tlifecycle            Lifecycle\n\tlifecycleDescriptor  LifecycleDescriptor\n\tadditionalBuildpacks buildpack.ManagedCollection\n\tadditionalExtensions buildpack.ManagedCollection\n\tmetadata             Metadata\n\tmixins               []string\n\tenv                  map[string]string\n\tuid, gid             int\n\tStackID              string\n\treplaceOrder         bool\n\torder                dist.Order\n\torderExtensions      dist.Order\n\tsystem               dist.System\n\tvalidateMixins       bool\n\tsaveProhibited       bool\n}\n\ntype orderTOML struct {\n\tOrder    dist.Order `toml:\"order,omitempty\"`\n\tOrderExt dist.Order `toml:\"order-extensions,omitempty\"`\n}\n\ntype systemTOML struct {\n\tSystem dist.System `toml:\"system\"`\n}\n\n// moduleWithDiffID is a Build Module which content was written on disk in a tar file and the content hash was calculated\ntype moduleWithDiffID struct {\n\ttarPath string\n\tdiffID  string\n\tmodule  buildpack.BuildModule\n}\n\ntype BuilderOption func(*options) error\n\ntype options struct {\n\ttoFlatten      buildpack.FlattenModuleInfos\n\tlabels         map[string]string\n\trunImage       string\n\tsaveProhibited bool\n}\n\nfunc WithRunImage(name string) BuilderOption {\n\treturn func(o *options) error {\n\t\to.runImage = name\n\t\treturn nil\n\t}\n}\n\nfunc WithoutSave() BuilderOption {\n\treturn func(o *options) error {\n\t\to.saveProhibited = true\n\t\treturn nil\n\t}\n}\n\n// FromImage constructs a builder from a builder image\nfunc FromImage(img imgutil.Image) (*Builder, error) {\n\treturn constructBuilder(img, \"\", true)\n}\n\n// New constructs a new builder from a base image\nfunc New(baseImage imgutil.Image, name string, ops ...BuilderOption) (*Builder, error) {\n\treturn constructBuilder(baseImage, name, false, ops...)\n}\n\nfunc 
constructBuilder(img imgutil.Image, newName string, errOnMissingLabel bool, ops ...BuilderOption) (*Builder, error) {\n\tvar metadata Metadata\n\tif ok, err := dist.GetLabel(img, metadataLabel, &metadata); err != nil {\n\t\treturn nil, errors.Wrapf(err, \"getting label %s\", metadataLabel)\n\t} else if !ok && errOnMissingLabel {\n\t\treturn nil, fmt.Errorf(\"builder %s missing label %s -- try recreating builder\", style.Symbol(img.Name()), style.Symbol(metadataLabel))\n\t}\n\n\tsystem := dist.System{}\n\tif _, err := dist.GetLabel(img, SystemLabel, &system); err != nil {\n\t\treturn nil, errors.Wrapf(err, \"getting label %s\", SystemLabel)\n\t}\n\n\topts := &options{}\n\tfor _, op := range ops {\n\t\tif err := op(opts); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t}\n\n\timageOS, err := img.OS()\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"getting image OS\")\n\t}\n\tlayerWriterFactory, err := layer.NewWriterFactory(imageOS)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif opts.runImage != \"\" {\n\t\t// FIXME: for now the mirrors are gone if you override the run-image (open an issue if preserving the mirrors is desired)\n\t\tmetadata.RunImages = []RunImageMetadata{{Image: opts.runImage}}\n\t\tmetadata.Stack.RunImage = RunImageMetadata{Image: opts.runImage}\n\t}\n\n\tfor labelKey, labelValue := range opts.labels {\n\t\terr = img.SetLabel(labelKey, labelValue)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrapf(err, \"adding label %s=%s\", labelKey, labelValue)\n\t\t}\n\t}\n\n\tbldr := &Builder{\n\t\tbaseImageName:        img.Name(),\n\t\timage:                img,\n\t\tlayerWriterFactory:   layerWriterFactory,\n\t\tmetadata:             metadata,\n\t\tlifecycleDescriptor:  constructLifecycleDescriptor(metadata),\n\t\tenv:                  map[string]string{},\n\t\tbuildConfigEnv:       map[string]string{},\n\t\tvalidateMixins:       true,\n\t\tadditionalBuildpacks: buildpack.NewManagedCollectionV2(opts.toFlatten),\n\t\tadditionalExtensions: 
buildpack.NewManagedCollectionV2(opts.toFlatten),\n\t\tsaveProhibited:       opts.saveProhibited,\n\t\tsystem:               system,\n\t}\n\n\tif err := addImgLabelsToBuildr(bldr); err != nil {\n\t\treturn nil, errors.Wrap(err, \"adding image labels to builder\")\n\t}\n\n\tif newName != \"\" && img.Name() != newName {\n\t\timg.Rename(newName)\n\t}\n\n\treturn bldr, nil\n}\n\nfunc WithFlattened(modules buildpack.FlattenModuleInfos) BuilderOption {\n\treturn func(o *options) error {\n\t\to.toFlatten = modules\n\t\treturn nil\n\t}\n}\n\nfunc WithLabels(labels map[string]string) BuilderOption {\n\treturn func(o *options) error {\n\t\to.labels = labels\n\t\treturn nil\n\t}\n}\n\nfunc constructLifecycleDescriptor(metadata Metadata) LifecycleDescriptor {\n\treturn CompatDescriptor(LifecycleDescriptor{\n\t\tInfo: LifecycleInfo{\n\t\t\tVersion: metadata.Lifecycle.Version,\n\t\t},\n\t\tAPI:  metadata.Lifecycle.API,\n\t\tAPIs: metadata.Lifecycle.APIs,\n\t})\n}\n\nfunc addImgLabelsToBuildr(bldr *Builder) error {\n\tvar err error\n\tbldr.uid, bldr.gid, err = userAndGroupIDs(bldr.image)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tbldr.StackID, err = bldr.image.Label(stackLabel)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"get label %s from image %s\", style.Symbol(stackLabel), style.Symbol(bldr.image.Name()))\n\t}\n\n\tif _, err = dist.GetLabel(bldr.image, stack.MixinsLabel, &bldr.mixins); err != nil {\n\t\treturn errors.Wrapf(err, \"getting label %s\", stack.MixinsLabel)\n\t}\n\n\tif _, err = dist.GetLabel(bldr.image, OrderLabel, &bldr.order); err != nil {\n\t\treturn errors.Wrapf(err, \"getting label %s\", OrderLabel)\n\t}\n\n\tif _, err = dist.GetLabel(bldr.image, OrderExtensionsLabel, &bldr.orderExtensions); err != nil {\n\t\treturn errors.Wrapf(err, \"getting label %s\", OrderExtensionsLabel)\n\t}\n\n\treturn nil\n}\n\n// Getters\n\n// Description returns the builder description\nfunc (b *Builder) Description() string {\n\treturn b.metadata.Description\n}\n\n// 
LifecycleDescriptor returns the LifecycleDescriptor\nfunc (b *Builder) LifecycleDescriptor() LifecycleDescriptor {\n\treturn b.lifecycleDescriptor\n}\n\n// Buildpacks returns the buildpack list\nfunc (b *Builder) Buildpacks() []dist.ModuleInfo {\n\treturn b.metadata.Buildpacks\n}\n\n// Extensions returns the extensions list\nfunc (b *Builder) Extensions() []dist.ModuleInfo {\n\treturn b.metadata.Extensions\n}\n\n// CreatedBy returns metadata around the creation of the builder\nfunc (b *Builder) CreatedBy() CreatorMetadata {\n\treturn b.metadata.CreatedBy\n}\n\n// Order returns the order\nfunc (b *Builder) Order() dist.Order {\n\treturn b.order\n}\n\n// OrderExtensions returns the order for extensions\nfunc (b *Builder) OrderExtensions() dist.Order {\n\treturn b.orderExtensions\n}\n\n// BaseImageName returns the name of the builder base image\nfunc (b *Builder) BaseImageName() string {\n\treturn b.baseImageName\n}\n\n// Name returns the name of the builder\nfunc (b *Builder) Name() string {\n\treturn b.image.Name()\n}\n\n// Image returns the base image\nfunc (b *Builder) Image() imgutil.Image {\n\treturn b.image\n}\n\n// Stack returns the stack metadata\nfunc (b *Builder) Stack() StackMetadata {\n\treturn b.metadata.Stack\n}\n\n// System returns the system buildpacks configuration\nfunc (b *Builder) System() dist.System { return b.system }\n\n// RunImages returns all run image metadata\nfunc (b *Builder) RunImages() []RunImageMetadata {\n\treturn append(b.metadata.RunImages, b.Stack().RunImage)\n}\n\n// DefaultRunImage returns the default run image metadata\nfunc (b *Builder) DefaultRunImage() RunImageMetadata {\n\t// run.images are ensured in builder.ValidateConfig()\n\t// per the spec, we use the first one as the default\n\treturn b.RunImages()[0]\n}\n\n// Mixins returns the mixins of the builder\nfunc (b *Builder) Mixins() []string {\n\treturn b.mixins\n}\n\n// UID returns the UID of the builder\nfunc (b *Builder) UID() int {\n\treturn b.uid\n}\n\n// GID returns 
the GID of the builder\nfunc (b *Builder) GID() int {\n\treturn b.gid\n}\n\nfunc (b *Builder) AllModules(kind string) []buildpack.BuildModule {\n\treturn b.moduleManager(kind).AllModules()\n}\n\nfunc (b *Builder) moduleManager(kind string) buildpack.ManagedCollection {\n\tswitch kind {\n\tcase buildpack.KindBuildpack:\n\t\treturn b.additionalBuildpacks\n\tcase buildpack.KindExtension:\n\t\treturn b.additionalExtensions\n\t}\n\treturn nil\n}\n\nfunc (b *Builder) FlattenedModules(kind string) [][]buildpack.BuildModule {\n\tmanager := b.moduleManager(kind)\n\treturn manager.FlattenedModules()\n}\n\nfunc (b *Builder) ShouldFlatten(module buildpack.BuildModule) bool {\n\treturn b.additionalBuildpacks.ShouldFlatten(module)\n}\n\n// Setters\n\n// AddBuildpack adds a buildpack to the builder\nfunc (b *Builder) AddBuildpack(bp buildpack.BuildModule) {\n\tb.additionalBuildpacks.AddModules(bp)\n\tb.metadata.Buildpacks = append(b.metadata.Buildpacks, bp.Descriptor().Info())\n}\n\nfunc (b *Builder) AddBuildpacks(main buildpack.BuildModule, dependencies []buildpack.BuildModule) {\n\tb.additionalBuildpacks.AddModules(main, dependencies...)\n\tb.metadata.Buildpacks = append(b.metadata.Buildpacks, main.Descriptor().Info())\n\tfor _, dep := range dependencies {\n\t\tb.metadata.Buildpacks = append(b.metadata.Buildpacks, dep.Descriptor().Info())\n\t}\n}\n\n// AddExtension adds an extension to the builder\nfunc (b *Builder) AddExtension(bp buildpack.BuildModule) {\n\tb.additionalExtensions.AddModules(bp)\n\tb.metadata.Extensions = append(b.metadata.Extensions, bp.Descriptor().Info())\n}\n\n// SetLifecycle sets the lifecycle of the builder\nfunc (b *Builder) SetLifecycle(lifecycle Lifecycle) {\n\tb.lifecycle = lifecycle\n\tb.lifecycleDescriptor = lifecycle.Descriptor()\n}\n\n// SetEnv sets an environment variable to a value\nfunc (b *Builder) SetEnv(env map[string]string) {\n\tb.env = env\n}\n\n// SetBuildConfigEnv sets an environment variable to a value that will take action on 
platform environment variables based on filename suffix\nfunc (b *Builder) SetBuildConfigEnv(env map[string]string) {\n\tb.buildConfigEnv = env\n}\n\n// SetOrder sets the order of the builder\nfunc (b *Builder) SetOrder(order dist.Order) {\n\tb.order = order\n\tb.replaceOrder = true\n}\n\n// SetOrderExtensions sets the extensions order of the builder\nfunc (b *Builder) SetOrderExtensions(order dist.Order) {\n\tfor i, entry := range order {\n\t\tfor j, ref := range entry.Group {\n\t\t\tref.Optional = false // ensure `optional = true` isn't redundantly printed for extensions (as they are always optional)\n\t\t\tentry.Group[j] = ref\n\t\t}\n\t\torder[i] = entry\n\t}\n\tb.orderExtensions = order\n\tb.replaceOrder = true\n}\n\n// SetDescription sets the description of the builder\nfunc (b *Builder) SetDescription(description string) {\n\tb.metadata.Description = description\n}\n\n// SetStack sets the stack of the builder\nfunc (b *Builder) SetStack(stackConfig builder.StackConfig) {\n\tb.metadata.Stack = StackMetadata{\n\t\tRunImage: RunImageMetadata{\n\t\t\tImage:   stackConfig.RunImage,\n\t\t\tMirrors: stackConfig.RunImageMirrors,\n\t\t},\n\t}\n}\n\n// SetSystem sets the system buildpacks of the builder\nfunc (b *Builder) SetSystem(system dist.System) {\n\tb.system = system\n}\n\n// SetRunImage sets the run image of the builder\nfunc (b *Builder) SetRunImage(runConfig builder.RunConfig) {\n\tvar runImages []RunImageMetadata\n\tfor _, i := range runConfig.Images {\n\t\trunImages = append(runImages, RunImageMetadata{\n\t\t\tImage:   i.Image,\n\t\t\tMirrors: i.Mirrors,\n\t\t})\n\t}\n\tb.metadata.RunImages = runImages\n}\n\n// SetValidateMixins sets whether the builder should validate mixins\nfunc (b *Builder) SetValidateMixins(to bool) {\n\tb.validateMixins = to\n}\n\n// Save saves the builder\nfunc (b *Builder) Save(logger logging.Logger, creatorMetadata CreatorMetadata, additionalTags ...string) error {\n\tif b.saveProhibited {\n\t\treturn fmt.Errorf(\"failed to save builder %s 
as saving is not allowed\", b.Name())\n\t}\n\n\tlogger.Debugf(\"Creating builder with the following buildpacks:\")\n\tfor _, bpInfo := range b.metadata.Buildpacks {\n\t\tlogger.Debugf(\"-> %s\", style.Symbol(bpInfo.FullName()))\n\t}\n\n\ttmpDir, err := os.MkdirTemp(\"\", \"create-builder-scratch\")\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer os.RemoveAll(tmpDir)\n\n\tdirsTar, err := b.defaultDirsLayer(tmpDir)\n\tif err != nil {\n\t\treturn err\n\t}\n\tif err := b.image.AddLayer(dirsTar); err != nil {\n\t\treturn errors.Wrap(err, \"adding default dirs layer\")\n\t}\n\n\tif b.lifecycle != nil {\n\t\tlifecycleDescriptor := b.lifecycle.Descriptor()\n\t\tb.metadata.Lifecycle.LifecycleInfo = lifecycleDescriptor.Info\n\t\tb.metadata.Lifecycle.API = lifecycleDescriptor.API\n\t\tb.metadata.Lifecycle.APIs = lifecycleDescriptor.APIs\n\t\tlifecycleTar, err := b.lifecycleLayer(tmpDir)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif err := b.image.AddLayer(lifecycleTar); err != nil {\n\t\t\treturn errors.Wrap(err, \"adding lifecycle layer\")\n\t\t}\n\t}\n\n\tif b.validateMixins {\n\t\tif err := b.validateBuildpacks(); err != nil {\n\t\t\treturn errors.Wrap(err, \"validating buildpacks\")\n\t\t}\n\t}\n\n\tif err := validateExtensions(b.lifecycleDescriptor, b.Extensions(), b.AllModules(buildpack.KindExtension)); err != nil {\n\t\treturn errors.Wrap(err, \"validating extensions\")\n\t}\n\n\tbpLayers := dist.ModuleLayers{}\n\tif _, err := dist.GetLabel(b.image, dist.BuildpackLayersLabel, &bpLayers); err != nil {\n\t\treturn errors.Wrapf(err, \"getting label %s\", dist.BuildpackLayersLabel)\n\t}\n\n\tvar excludedBuildpacks []buildpack.BuildModule\n\texcludedBuildpacks, err = b.addFlattenedModules(buildpack.KindBuildpack, logger, tmpDir, b.image, b.additionalBuildpacks.FlattenedModules(), bpLayers)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\terr = b.addExplodedModules(buildpack.KindBuildpack, logger, tmpDir, b.image, append(b.additionalBuildpacks.ExplodedModules(), 
excludedBuildpacks...), bpLayers)\n\tif err != nil {\n\t\treturn err\n\t}\n\tif err := dist.SetLabel(b.image, dist.BuildpackLayersLabel, bpLayers); err != nil {\n\t\treturn err\n\t}\n\n\textLayers := dist.ModuleLayers{}\n\tif _, err := dist.GetLabel(b.image, dist.ExtensionLayersLabel, &extLayers); err != nil {\n\t\treturn errors.Wrapf(err, \"getting label %s\", dist.ExtensionLayersLabel)\n\t}\n\n\tvar excludedExtensions []buildpack.BuildModule\n\texcludedExtensions, err = b.addFlattenedModules(buildpack.KindExtension, logger, tmpDir, b.image, b.additionalExtensions.FlattenedModules(), extLayers)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\terr = b.addExplodedModules(buildpack.KindExtension, logger, tmpDir, b.image, append(b.additionalExtensions.ExplodedModules(), excludedExtensions...), extLayers)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif err := dist.SetLabel(b.image, dist.ExtensionLayersLabel, extLayers); err != nil {\n\t\treturn err\n\t}\n\n\tif b.replaceOrder {\n\t\tresolvedOrderBp, err := processOrder(b.metadata.Buildpacks, b.order, buildpack.KindBuildpack)\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"processing buildpacks order\")\n\t\t}\n\t\tresolvedOrderExt, err := processOrder(b.metadata.Extensions, b.orderExtensions, buildpack.KindExtension)\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"processing extensions order\")\n\t\t}\n\n\t\torderTar, err := b.orderLayer(resolvedOrderBp, resolvedOrderExt, tmpDir)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif err := b.image.AddLayer(orderTar); err != nil {\n\t\t\treturn errors.Wrap(err, \"adding order.tar layer\")\n\t\t}\n\t\tif err := dist.SetLabel(b.image, OrderLabel, b.order); err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif err := dist.SetLabel(b.image, OrderExtensionsLabel, b.orderExtensions); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\tif len(b.system.Pre.Buildpacks) > 0 || len(b.system.Post.Buildpacks) > 0 {\n\t\tresolvedSystemBp, err := processSystem(b.metadata.Buildpacks, 
b.system, buildpack.KindBuildpack)\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"processing system buildpacks\")\n\t\t}\n\n\t\tsystemTar, err := b.systemLayer(resolvedSystemBp, tmpDir)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif err := b.image.AddLayer(systemTar); err != nil {\n\t\t\treturn errors.Wrap(err, \"adding system.tar layer\")\n\t\t}\n\t\tif err := dist.SetLabel(b.image, SystemLabel, b.system); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\tstackTar, err := b.stackLayer(tmpDir)\n\tif err != nil {\n\t\treturn err\n\t}\n\tif err := b.image.AddLayer(stackTar); err != nil {\n\t\treturn errors.Wrap(err, \"adding stack.tar layer\")\n\t}\n\n\trunImageTar, err := b.runImageLayer(tmpDir)\n\tif err != nil {\n\t\treturn err\n\t}\n\tif err := b.image.AddLayer(runImageTar); err != nil {\n\t\treturn errors.Wrap(err, \"adding run.tar layer\")\n\t}\n\n\tif len(b.buildConfigEnv) > 0 {\n\t\tlogger.Debugf(\"Provided Build Config Environment Variables\\n  %s\", style.Map(b.buildConfigEnv, \"  \", \"\\n\"))\n\t\tbuildConfigEnvTar, err := b.buildConfigEnvLayer(tmpDir, b.buildConfigEnv)\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"retrieving build-config-env layer\")\n\t\t}\n\n\t\tif err := b.image.AddLayer(buildConfigEnvTar); err != nil {\n\t\t\treturn errors.Wrap(err, \"adding build-config-env layer\")\n\t\t}\n\t}\n\n\tif len(b.env) > 0 {\n\t\tlogger.Debugf(\"Provided Environment Variables\\n  %s\", style.Map(b.env, \"  \", \"\\n\"))\n\t}\n\n\tenvTar, err := b.envLayer(tmpDir, b.env)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif err := b.image.AddLayer(envTar); err != nil {\n\t\treturn errors.Wrap(err, \"adding env layer\")\n\t}\n\n\tif creatorMetadata.Name == \"\" {\n\t\tcreatorMetadata.Name = packName\n\t}\n\n\tb.metadata.CreatedBy = creatorMetadata\n\n\tif err := dist.SetLabel(b.image, metadataLabel, b.metadata); err != nil {\n\t\treturn err\n\t}\n\n\tif err := dist.SetLabel(b.image, stack.MixinsLabel, b.mixins); err != nil {\n\t\treturn err\n\t}\n\n\tif 
err := b.image.SetWorkingDir(layersDir); err != nil {\n\t\treturn errors.Wrap(err, \"failed to set working dir\")\n\t}\n\n\tlogger.Debugf(\"Builder creation completed, starting image save\")\n\terr = b.image.Save(additionalTags...)\n\tlogger.Debugf(\"Image save completed\")\n\treturn err\n}\n\n// Helpers\n\nfunc (b *Builder) addExplodedModules(kind string, logger logging.Logger, tmpDir string, image imgutil.Image, additionalModules []buildpack.BuildModule, layers dist.ModuleLayers) error {\n\tcollectionToAdd := map[string]moduleWithDiffID{}\n\ttoAdd, errs := explodeModules(kind, tmpDir, additionalModules, logger)\n\tif len(errs) > 0 {\n\t\treturn e.Join(errs...)\n\t}\n\n\tfor i, additionalModule := range toAdd {\n\t\tinfo, diffID, layerTar, module := additionalModule.module.Descriptor().Info(), additionalModule.diffID, additionalModule.tarPath, additionalModule.module\n\n\t\t// check against builder layers\n\t\tif existingInfo, ok := layers[info.ID][info.Version]; ok {\n\t\t\tif existingInfo.LayerDiffID == diffID {\n\t\t\t\tlogger.Debugf(\"%s %s already exists on builder with same contents, skipping...\", istrings.Title(kind), style.Symbol(info.FullName()))\n\t\t\t\tcontinue\n\t\t\t} else {\n\t\t\t\twhiteoutsTar, err := b.whiteoutLayer(tmpDir, i, info)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\n\t\t\t\tif err := image.AddLayer(whiteoutsTar); err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"adding whiteout layer tar\")\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tlogger.Debugf(ModuleOnBuilderMessage, kind, style.Symbol(info.FullName()), style.Symbol(existingInfo.LayerDiffID), style.Symbol(diffID))\n\t\t}\n\n\t\t// check against other modules to be added\n\t\tif otherAdditionalMod, ok := collectionToAdd[info.FullName()]; ok {\n\t\t\tif otherAdditionalMod.diffID == diffID {\n\t\t\t\tlogger.Debugf(\"%s %s with same contents is already being added, skipping...\", istrings.Title(kind), 
style.Symbol(info.FullName()))\n\t\t\t\tcontinue\n\t\t\t}\n\n\t\t\tlogger.Debugf(ModulePreviouslyDefinedMessage, kind, style.Symbol(info.FullName()), style.Symbol(otherAdditionalMod.diffID), style.Symbol(diffID))\n\t\t}\n\n\t\t// note: if same id@version is in additionalModules, last one wins (see warnings above)\n\t\tcollectionToAdd[info.FullName()] = moduleWithDiffID{\n\t\t\ttarPath: layerTar,\n\t\t\tdiffID:  diffID,\n\t\t\tmodule:  module,\n\t\t}\n\t}\n\n\t// Fixes 1453\n\tkeys := sortKeys(collectionToAdd)\n\tfor _, k := range keys {\n\t\tmodule := collectionToAdd[k]\n\t\tlogger.Debugf(\"Adding %s %s (diffID=%s)\", kind, style.Symbol(module.module.Descriptor().Info().FullName()), module.diffID)\n\t\tif err := image.AddLayerWithDiffID(module.tarPath, module.diffID); err != nil {\n\t\t\treturn errors.Wrapf(err,\n\t\t\t\t\"adding layer tar for %s %s\",\n\t\t\t\tkind,\n\t\t\t\tstyle.Symbol(module.module.Descriptor().Info().FullName()),\n\t\t\t)\n\t\t}\n\n\t\tdist.AddToLayersMD(layers, module.module.Descriptor(), module.diffID)\n\t}\n\n\treturn nil\n}\n\nfunc (b *Builder) addFlattenedModules(kind string, logger logging.Logger, tmpDir string, image imgutil.Image, flattenModules [][]buildpack.BuildModule, layers dist.ModuleLayers) ([]buildpack.BuildModule, error) {\n\tcollectionToAdd := map[string]moduleWithDiffID{}\n\tvar (\n\t\tbuildModuleExcluded []buildpack.BuildModule\n\t\tfinalTarPath        string\n\t\terr                 error\n\t)\n\n\tbuildModuleWriter := buildpack.NewBuildModuleWriter(logger, b.layerWriterFactory)\n\n\tfor i, additionalModules := range flattenModules {\n\t\tmodFlattenTmpDir := filepath.Join(tmpDir, fmt.Sprintf(\"%s-%s-flatten\", kind, strconv.Itoa(i)))\n\t\tif err := os.MkdirAll(modFlattenTmpDir, os.ModePerm); err != nil {\n\t\t\treturn nil, errors.Wrap(err, \"creating flatten temp dir\")\n\t\t}\n\n\t\tfinalTarPath, buildModuleExcluded, err = buildModuleWriter.NToLayerTar(modFlattenTmpDir, fmt.Sprintf(\"%s-flatten-%s\", kind, 
strconv.Itoa(i)), additionalModules, nil)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrapf(err, \"writing layer %s\", finalTarPath)\n\t\t}\n\n\t\tdiffID, err := dist.LayerDiffID(finalTarPath)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrapf(err, \"calculating diff layer %s\", finalTarPath)\n\t\t}\n\n\t\tfor _, module := range additionalModules {\n\t\t\tcollectionToAdd[module.Descriptor().Info().FullName()] = moduleWithDiffID{\n\t\t\t\ttarPath: finalTarPath,\n\t\t\t\tdiffID:  diffID.String(),\n\t\t\t\tmodule:  module,\n\t\t\t}\n\t\t}\n\t}\n\n\t// Fixes 1453\n\tkeys := sortKeys(collectionToAdd)\n\tdiffIDAdded := map[string]string{}\n\tfor _, k := range keys {\n\t\tmodule := collectionToAdd[k]\n\t\tbp := module.module\n\t\taddLayer := true\n\t\tif b.ShouldFlatten(bp) {\n\t\t\tif _, ok := diffIDAdded[module.diffID]; !ok {\n\t\t\t\tdiffIDAdded[module.diffID] = module.tarPath\n\t\t\t} else {\n\t\t\t\taddLayer = false\n\t\t\t\tlogger.Debugf(\"Squashing %s %s (diffID=%s)\", kind, style.Symbol(bp.Descriptor().Info().FullName()), module.diffID)\n\t\t\t}\n\t\t}\n\t\tif addLayer {\n\t\t\tlogger.Debugf(\"Adding %s %s (diffID=%s)\", kind, style.Symbol(bp.Descriptor().Info().FullName()), module.diffID)\n\t\t\tif err = image.AddLayerWithDiffID(module.tarPath, module.diffID); err != nil {\n\t\t\t\treturn nil, errors.Wrapf(err,\n\t\t\t\t\t\"adding layer tar for %s %s\",\n\t\t\t\t\tkind,\n\t\t\t\t\tstyle.Symbol(module.module.Descriptor().Info().FullName()),\n\t\t\t\t)\n\t\t\t}\n\t\t}\n\t\tdist.AddToLayersMD(layers, bp.Descriptor(), module.diffID)\n\t}\n\n\treturn buildModuleExcluded, nil\n}\n\nfunc processOrder(modulesOnBuilder []dist.ModuleInfo, order dist.Order, kind string) (dist.Order, error) {\n\tresolved := dist.Order{}\n\tfor idx, g := range order {\n\t\tresolved = append(resolved, dist.OrderEntry{})\n\t\tfor _, ref := range g.Group {\n\t\t\tvar err error\n\t\t\tif ref, err = resolveRef(modulesOnBuilder, ref, kind); err != nil {\n\t\t\t\treturn dist.Order{}, 
err\n\t\t\t}\n\t\t\tresolved[idx].Group = append(resolved[idx].Group, ref)\n\t\t}\n\t}\n\treturn resolved, nil\n}\n\nfunc processSystem(modulesOnBuilder []dist.ModuleInfo, system dist.System, kind string) (dist.System, error) {\n\tresolved := dist.System{}\n\n\t// Pre buildpacks\n\tfor _, bp := range system.Pre.Buildpacks {\n\t\tvar (\n\t\t\tref dist.ModuleRef\n\t\t\terr error\n\t\t)\n\t\tif ref, err = resolveRef(modulesOnBuilder, bp, kind); err != nil {\n\t\t\treturn dist.System{}, err\n\t\t}\n\t\tresolved.Pre.Buildpacks = append(resolved.Pre.Buildpacks, ref)\n\t}\n\n\t// Post buildpacks\n\tfor _, bp := range system.Post.Buildpacks {\n\t\tvar (\n\t\t\tref dist.ModuleRef\n\t\t\terr error\n\t\t)\n\t\tif ref, err = resolveRef(modulesOnBuilder, bp, kind); err != nil {\n\t\t\treturn dist.System{}, err\n\t\t}\n\t\tresolved.Post.Buildpacks = append(resolved.Post.Buildpacks, ref)\n\t}\n\n\treturn resolved, nil\n}\n\nfunc resolveRef(moduleList []dist.ModuleInfo, ref dist.ModuleRef, kind string) (dist.ModuleRef, error) {\n\tvar matching []dist.ModuleInfo\n\tfor _, bp := range moduleList {\n\t\tif ref.ID == bp.ID {\n\t\t\tmatching = append(matching, bp)\n\t\t}\n\t}\n\n\tif len(matching) == 0 {\n\t\treturn dist.ModuleRef{},\n\t\t\tfmt.Errorf(\"no versions of %s %s were found on the builder\", kind, style.Symbol(ref.ID))\n\t}\n\n\tif ref.Version == \"\" {\n\t\tif len(uniqueVersions(matching)) > 1 {\n\t\t\treturn dist.ModuleRef{},\n\t\t\t\tfmt.Errorf(\"unable to resolve version: multiple versions of %s - must specify an explicit version\", style.Symbol(ref.ID))\n\t\t}\n\n\t\tref.Version = matching[0].Version\n\t}\n\n\tif !hasElementWithVersion(matching, ref.Version) {\n\t\treturn dist.ModuleRef{},\n\t\t\tfmt.Errorf(\"%s %s with version %s was not found on the builder\", kind, style.Symbol(ref.ID), style.Symbol(ref.Version))\n\t}\n\n\treturn ref, nil\n}\n\nfunc hasElementWithVersion(moduleList []dist.ModuleInfo, version string) bool {\n\tfor _, el := range moduleList {\n\t\tif 
el.Version == version {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc (b *Builder) validateBuildpacks() error {\n\tbpLookup := map[string]interface{}{}\n\n\tfor _, bp := range b.Buildpacks() {\n\t\tbpLookup[bp.FullName()] = nil\n\t}\n\n\tfor _, bp := range b.AllModules(buildpack.KindBuildpack) {\n\t\tbpd := bp.Descriptor()\n\t\tif err := validateLifecycleCompat(bpd, b.LifecycleDescriptor()); err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif len(bpd.Order()) > 0 { // order buildpack\n\t\t\tfor _, g := range bpd.Order() {\n\t\t\t\tfor _, r := range g.Group {\n\t\t\t\t\tif _, ok := bpLookup[r.FullName()]; !ok {\n\t\t\t\t\t\treturn fmt.Errorf(\n\t\t\t\t\t\t\t\"buildpack %s not found on the builder\",\n\t\t\t\t\t\t\tstyle.Symbol(r.FullName()),\n\t\t\t\t\t\t)\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t} else if err := bpd.EnsureStackSupport(b.StackID, b.Mixins(), false); err != nil {\n\t\t\treturn err\n\t\t} else {\n\t\t\tbuildOS, err := b.Image().OS()\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tbuildArch, err := b.Image().Architecture()\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tbuildDistroName, err := b.Image().Label(lifecycleplatform.OSDistroNameLabel)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tbuildDistroVersion, err := b.Image().Label(lifecycleplatform.OSDistroVersionLabel)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tif err := bpd.EnsureTargetSupport(buildOS, buildArch, buildDistroName, buildDistroVersion); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\t// TODO ensure at least one run-image\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc validateExtensions(lifecycleDescriptor LifecycleDescriptor, allExtensions []dist.ModuleInfo, extsToValidate []buildpack.BuildModule) error {\n\textLookup := map[string]interface{}{}\n\n\tfor _, ext := range allExtensions {\n\t\textLookup[ext.FullName()] = nil\n\t}\n\n\tfor _, ext := range extsToValidate {\n\t\textd := ext.Descriptor()\n\t\tif err := 
validateLifecycleCompat(extd, lifecycleDescriptor); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc validateLifecycleCompat(descriptor buildpack.Descriptor, lifecycleDescriptor LifecycleDescriptor) error {\n\tcompatible := false\n\tfor _, version := range append(lifecycleDescriptor.APIs.Buildpack.Supported, lifecycleDescriptor.APIs.Buildpack.Deprecated...) {\n\t\tcompatible = version.Compare(descriptor.API()) == 0\n\t\tif compatible {\n\t\t\tbreak\n\t\t}\n\t}\n\n\tif !compatible {\n\t\treturn fmt.Errorf(\n\t\t\t\"%s %s (Buildpack API %s) is incompatible with lifecycle %s (Buildpack API(s) %s)\",\n\t\t\tdescriptor.Kind(),\n\t\t\tstyle.Symbol(descriptor.Info().FullName()),\n\t\t\tdescriptor.API().String(),\n\t\t\tstyle.Symbol(lifecycleDescriptor.Info.Version.String()),\n\t\t\tstrings.Join(lifecycleDescriptor.APIs.Buildpack.Supported.AsStrings(), \", \"),\n\t\t)\n\t}\n\n\treturn nil\n}\n\nfunc userAndGroupIDs(img imgutil.Image) (int, int, error) {\n\tsUID, err := img.Env(EnvUID)\n\tif err != nil {\n\t\treturn 0, 0, errors.Wrap(err, \"reading builder env variables\")\n\t} else if sUID == \"\" {\n\t\treturn 0, 0, fmt.Errorf(\"image %s missing required env var %s\", style.Symbol(img.Name()), style.Symbol(EnvUID))\n\t}\n\n\tsGID, err := img.Env(EnvGID)\n\tif err != nil {\n\t\treturn 0, 0, errors.Wrap(err, \"reading builder env variables\")\n\t} else if sGID == \"\" {\n\t\treturn 0, 0, fmt.Errorf(\"image %s missing required env var %s\", style.Symbol(img.Name()), style.Symbol(EnvGID))\n\t}\n\n\tvar uid, gid int\n\tuid, err = strconv.Atoi(sUID)\n\tif err != nil {\n\t\treturn 0, 0, fmt.Errorf(\"failed to parse %s, value %s should be an integer\", style.Symbol(EnvUID), style.Symbol(sUID))\n\t}\n\n\tgid, err = strconv.Atoi(sGID)\n\tif err != nil {\n\t\treturn 0, 0, fmt.Errorf(\"failed to parse %s, value %s should be an integer\", style.Symbol(EnvGID), style.Symbol(sGID))\n\t}\n\n\treturn uid, gid, nil\n}\n\nfunc uniqueVersions(buildpacks 
[]dist.ModuleInfo) []string {\n\tresults := []string{}\n\tset := map[string]interface{}{}\n\tfor _, bpInfo := range buildpacks {\n\t\t_, ok := set[bpInfo.Version]\n\t\tif !ok {\n\t\t\tresults = append(results, bpInfo.Version)\n\t\t\tset[bpInfo.Version] = true\n\t\t}\n\t}\n\treturn results\n}\n\nfunc (b *Builder) defaultDirsLayer(dest string) (string, error) {\n\tfh, err := os.Create(filepath.Join(dest, \"dirs.tar\"))\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer fh.Close()\n\n\tlw := b.layerWriterFactory.NewWriter(fh)\n\tdefer lw.Close()\n\n\tts := archive.NormalizedDateTime\n\n\tfor _, path := range []string{workspaceDir, layersDir} {\n\t\tif err := lw.WriteHeader(b.packOwnedDir(path, ts)); err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"creating %s dir in layer\", style.Symbol(path))\n\t\t}\n\t}\n\n\t// can't use filepath.Join(), to ensure Windows doesn't transform it to Windows join\n\tfor _, path := range []string{cnbDir, dist.BuildpacksDir, dist.ExtensionsDir, platformDir, platformDir + \"/env\", buildConfigDir, buildConfigDir + \"/env\"} {\n\t\tif err := lw.WriteHeader(b.rootOwnedDir(path, ts)); err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"creating %s dir in layer\", style.Symbol(path))\n\t\t}\n\t}\n\n\treturn fh.Name(), nil\n}\n\nfunc (b *Builder) packOwnedDir(path string, time time.Time) *tar.Header {\n\treturn &tar.Header{\n\t\tTypeflag: tar.TypeDir,\n\t\tName:     path,\n\t\tMode:     0755,\n\t\tModTime:  time,\n\t\tUid:      b.uid,\n\t\tGid:      b.gid,\n\t}\n}\n\nfunc (b *Builder) rootOwnedDir(path string, time time.Time) *tar.Header {\n\treturn &tar.Header{\n\t\tTypeflag: tar.TypeDir,\n\t\tName:     path,\n\t\tMode:     0755,\n\t\tModTime:  time,\n\t}\n}\n\nfunc (b *Builder) lifecycleLayer(dest string) (string, error) {\n\tfh, err := os.Create(filepath.Join(dest, \"lifecycle.tar\"))\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer fh.Close()\n\n\tlw := b.layerWriterFactory.NewWriter(fh)\n\tdefer lw.Close()\n\n\tif err := 
lw.WriteHeader(&tar.Header{\n\t\tTypeflag: tar.TypeDir,\n\t\tName:     lifecycleDir,\n\t\tMode:     0755,\n\t\tModTime:  archive.NormalizedDateTime,\n\t}); err != nil {\n\t\treturn \"\", err\n\t}\n\n\terr = b.embedLifecycleTar(lw)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"embedding lifecycle tar\")\n\t}\n\n\tif err := lw.WriteHeader(&tar.Header{\n\t\tName:     compatLifecycleDir,\n\t\tLinkname: lifecycleDir,\n\t\tTypeflag: tar.TypeSymlink,\n\t\tMode:     0644,\n\t\tModTime:  archive.NormalizedDateTime,\n\t}); err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"creating %s symlink\", style.Symbol(compatLifecycleDir))\n\t}\n\n\treturn fh.Name(), nil\n}\n\nfunc (b *Builder) embedLifecycleTar(tw archive.TarWriter) error {\n\tvar regex = regexp.MustCompile(`^[^/]+/([^/]+)$`)\n\n\tlr, err := b.lifecycle.Open()\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"failed to open lifecycle\")\n\t}\n\tdefer lr.Close()\n\ttr := tar.NewReader(lr)\n\tfor {\n\t\theader, err := tr.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"failed to get next tar entry\")\n\t\t}\n\n\t\tpathMatches := regex.FindStringSubmatch(path.Clean(header.Name))\n\t\tif pathMatches != nil {\n\t\t\tbinaryName := pathMatches[1]\n\n\t\t\theader.Name = lifecycleDir + \"/\" + binaryName\n\t\t\terr = tw.WriteHeader(header)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"failed to write header for '%s'\", header.Name)\n\t\t\t}\n\n\t\t\tbuf, err := io.ReadAll(tr)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"failed to read contents of '%s'\", header.Name)\n\t\t\t}\n\n\t\t\t_, err = tw.Write(buf)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"failed to write contents to '%s'\", header.Name)\n\t\t\t}\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc (b *Builder) orderLayer(order dist.Order, orderExt dist.Order, dest string) (string, error) {\n\tcontents, err := orderFileContents(order, orderExt)\n\tif err != nil 
{\n\t\treturn \"\", err\n\t}\n\n\tlayerTar := filepath.Join(dest, \"order.tar\")\n\terr = layer.CreateSingleFileTar(layerTar, orderPath, contents, b.layerWriterFactory)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"failed to create order.toml layer tar\")\n\t}\n\n\treturn layerTar, nil\n}\n\nfunc orderFileContents(order dist.Order, orderExt dist.Order) (string, error) {\n\tbuf := &bytes.Buffer{}\n\ttomlData := orderTOML{Order: order, OrderExt: orderExt}\n\tif err := toml.NewEncoder(buf).Encode(tomlData); err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"failed to marshal order.toml\")\n\t}\n\treturn buf.String(), nil\n}\n\nfunc (b *Builder) systemLayer(system dist.System, dest string) (string, error) {\n\tcontents, err := systemFileContents(system)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tlayerTar := filepath.Join(dest, \"system.tar\")\n\terr = layer.CreateSingleFileTar(layerTar, systemPath, contents, b.layerWriterFactory)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"failed to create system.toml layer tar\")\n\t}\n\n\treturn layerTar, nil\n}\n\nfunc systemFileContents(system dist.System) (string, error) {\n\tbuf := &bytes.Buffer{}\n\ttomlData := systemTOML{System: system}\n\tif err := toml.NewEncoder(buf).Encode(tomlData); err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"failed to marshal system.toml\")\n\t}\n\treturn buf.String(), nil\n}\n\nfunc (b *Builder) stackLayer(dest string) (string, error) {\n\tbuf := &bytes.Buffer{}\n\tvar err error\n\tif b.metadata.Stack.RunImage.Image != \"\" {\n\t\terr = toml.NewEncoder(buf).Encode(b.metadata.Stack)\n\t} else if len(b.metadata.RunImages) > 0 {\n\t\terr = toml.NewEncoder(buf).Encode(StackMetadata{RunImage: b.metadata.RunImages[0]})\n\t}\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"failed to marshal stack.toml\")\n\t}\n\n\tlayerTar := filepath.Join(dest, \"stack.tar\")\n\terr = layer.CreateSingleFileTar(layerTar, stackPath, buf.String(), b.layerWriterFactory)\n\tif err != nil 
{\n\t\treturn \"\", errors.Wrapf(err, \"failed to create stack.toml layer tar\")\n\t}\n\n\treturn layerTar, nil\n}\n\nfunc (b *Builder) runImageLayer(dest string) (string, error) {\n\tbuf := &bytes.Buffer{}\n\terr := toml.NewEncoder(buf).Encode(RunImages{\n\t\tImages: b.metadata.RunImages,\n\t})\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"failed to marshal run.toml\")\n\t}\n\n\tlayerTar := filepath.Join(dest, \"run.tar\")\n\terr = layer.CreateSingleFileTar(layerTar, runPath, buf.String(), b.layerWriterFactory)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"failed to create run.toml layer tar\")\n\t}\n\n\treturn layerTar, nil\n}\n\nfunc (b *Builder) envLayer(dest string, env map[string]string) (string, error) {\n\tfh, err := os.Create(filepath.Join(dest, \"env.tar\"))\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer fh.Close()\n\n\tlw := b.layerWriterFactory.NewWriter(fh)\n\tdefer lw.Close()\n\n\tfor k, v := range env {\n\t\tif err := lw.WriteHeader(&tar.Header{\n\t\t\tName:    path.Join(platformDir, \"env\", k),\n\t\t\tSize:    int64(len(v)),\n\t\t\tMode:    0644,\n\t\t\tModTime: archive.NormalizedDateTime,\n\t\t}); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tif _, err := lw.Write([]byte(v)); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t}\n\n\treturn fh.Name(), nil\n}\n\nfunc (b *Builder) buildConfigEnvLayer(dest string, env map[string]string) (string, error) {\n\tfh, err := os.Create(filepath.Join(dest, \"build-config-env.tar\"))\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer fh.Close()\n\tlw := b.layerWriterFactory.NewWriter(fh)\n\tdefer lw.Close()\n\tfor k, v := range env {\n\t\tif err := lw.WriteHeader(&tar.Header{\n\t\t\tName:    path.Join(cnbBuildConfigDir(), \"env\", k),\n\t\t\tSize:    int64(len(v)),\n\t\t\tMode:    0644,\n\t\t\tModTime: archive.NormalizedDateTime,\n\t\t}); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tif _, err := lw.Write([]byte(v)); err != nil {\n\t\t\treturn \"\", 
err\n\t\t}\n\t}\n\n\treturn fh.Name(), nil\n}\n\nfunc (b *Builder) whiteoutLayer(tmpDir string, i int, bpInfo dist.ModuleInfo) (string, error) {\n\tbpWhiteoutsTmpDir := filepath.Join(tmpDir, strconv.Itoa(i)+\"_whiteouts\")\n\tif err := os.MkdirAll(bpWhiteoutsTmpDir, os.ModePerm); err != nil {\n\t\treturn \"\", errors.Wrap(err, \"creating buildpack whiteouts temp dir\")\n\t}\n\n\tfh, err := os.Create(filepath.Join(bpWhiteoutsTmpDir, \"whiteouts.tar\"))\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer fh.Close()\n\n\tlw := b.layerWriterFactory.NewWriter(fh)\n\tdefer lw.Close()\n\n\tif err := lw.WriteHeader(&tar.Header{\n\t\tName: path.Join(buildpacksDir, strings.ReplaceAll(bpInfo.ID, \"/\", \"_\"), fmt.Sprintf(\".wh.%s\", bpInfo.Version)),\n\t\tSize: int64(0),\n\t\tMode: 0644,\n\t}); err != nil {\n\t\treturn \"\", err\n\t}\n\tif _, err := lw.Write([]byte(\"\")); err != nil {\n\t\treturn \"\", errors.Wrapf(err,\n\t\t\t\"creating whiteout layers' tarfile for buildpack %s\",\n\t\t\tstyle.Symbol(bpInfo.FullName()),\n\t\t)\n\t}\n\n\treturn fh.Name(), nil\n}\n\nfunc sortKeys(collection map[string]moduleWithDiffID) []string {\n\tkeys := make([]string, 0, len(collection))\n\tfor k := range collection {\n\t\tkeys = append(keys, k)\n\t}\n\tsort.Strings(keys)\n\treturn keys\n}\n\n// explodeModules takes a collection of build modules and concurrently reads their tar files.\n// It assumes the modules were extracted with `buildpack.extractBuildpacks`, which when provided a flattened buildpack package containing N buildpacks,\n// will return N modules: 1 module with a single tar containing ALL N buildpacks, and N-1 modules with empty tar files.\n// As we iterate through the modules, in case a flattened module (tar containing all N buildpacks) is found,\n// explodeModules will split the module into N modules, each with a single tar containing a single buildpack.\n// In case a module with an empty tar file is found, it is ignored.\nfunc explodeModules(kind, tmpDir string, 
additionalModules []buildpack.BuildModule, logger logging.Logger) ([]moduleWithDiffID, []error) {\n\tmodInfoChans := make([]chan modInfo, len(additionalModules))\n\tfor i := range modInfoChans {\n\t\tmodInfoChans[i] = make(chan modInfo, 1)\n\t}\n\n\t// Explode modules concurrently\n\tfor i, module := range additionalModules {\n\t\tgo func(i int, module buildpack.BuildModule) {\n\t\t\tmodTmpDir := filepath.Join(tmpDir, fmt.Sprintf(\"%s-%s\", kind, strconv.Itoa(i)))\n\t\t\tif err := os.MkdirAll(modTmpDir, os.ModePerm); err != nil {\n\t\t\t\tmodInfoChans[i] <- handleError(module, err, fmt.Sprintf(\"creating %s temp dir %s\", kind, modTmpDir))\n\t\t\t}\n\t\t\tmoduleTars, err := buildpack.ToNLayerTar(modTmpDir, module)\n\t\t\tif err != nil {\n\t\t\t\tmodInfoChans[i] <- handleError(module, err, fmt.Sprintf(\"creating %s tar file at path %s\", module.Descriptor().Info().FullName(), modTmpDir))\n\t\t\t}\n\t\t\tmodInfoChans[i] <- modInfo{moduleTars: moduleTars}\n\t\t}(i, module)\n\t}\n\n\t// Iterate over modules sequentially, building up the result.\n\tvar (\n\t\tresult []moduleWithDiffID\n\t\terrs   []error\n\t)\n\tfor i, module := range additionalModules {\n\t\tmi := <-modInfoChans[i]\n\t\tif mi.err != nil {\n\t\t\terrs = append(errs, mi.err)\n\t\t\tcontinue\n\t\t}\n\t\tif len(mi.moduleTars) == 1 {\n\t\t\t// This entry is an individual buildpack or extension, or a module with empty tar\n\t\t\tmoduleTar := mi.moduleTars[0]\n\t\t\tdiffID, err := dist.LayerDiffID(moduleTar.Path())\n\t\t\tif err != nil {\n\t\t\t\terrs = append(errs, errors.Wrapf(err, \"calculating layer diffID for path %s\", moduleTar.Path()))\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tif diffID.String() == emptyTarDiffID {\n\t\t\t\tlogger.Debugf(\"%s %s is a component of a flattened buildpack that will be added elsewhere, skipping...\", istrings.Title(kind), style.Symbol(moduleTar.Info().FullName()))\n\t\t\t\tcontinue // we don't need to keep modules with empty tars\n\t\t\t}\n\t\t\tresult = append(result, 
moduleWithDiffID{\n\t\t\t\ttarPath: moduleTar.Path(),\n\t\t\t\tdiffID:  diffID.String(),\n\t\t\t\tmodule:  module,\n\t\t\t})\n\t\t} else {\n\t\t\t// This entry is a flattened buildpack that was exploded, we need to add each exploded buildpack to the result in order\n\t\t\tfor _, moduleTar := range mi.moduleTars {\n\t\t\t\tdiffID, err := dist.LayerDiffID(moduleTar.Path())\n\t\t\t\tif err != nil {\n\t\t\t\t\terrs = append(errs, errors.Wrapf(err, \"calculating layer diffID for path %s\", moduleTar.Path()))\n\t\t\t\t\tcontinue\n\t\t\t\t}\n\t\t\t\texplodedMod := moduleWithDiffID{\n\t\t\t\t\ttarPath: moduleTar.Path(),\n\t\t\t\t\tdiffID:  diffID.String(),\n\t\t\t\t}\n\t\t\t\t// find the module \"info\" for this buildpack - it could be the current module, or one of the modules with empty tars that was ignored\n\t\t\t\tif namesMatch(module, moduleTar) {\n\t\t\t\t\texplodedMod.module = module\n\t\t\t\t} else {\n\t\t\t\t\tfor _, additionalModule := range additionalModules {\n\t\t\t\t\t\tif namesMatch(additionalModule, moduleTar) {\n\t\t\t\t\t\t\texplodedMod.module = additionalModule\n\t\t\t\t\t\t\tbreak\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\tresult = append(result, explodedMod)\n\t\t\t}\n\t\t}\n\t}\n\n\treturn result, errs\n}\n\nfunc handleError(module buildpack.BuildModule, err error, message string) modInfo {\n\tmoduleTar := errModuleTar{\n\t\tmodule: module,\n\t}\n\treturn modInfo{moduleTars: []buildpack.ModuleTar{moduleTar}, err: errors.Wrap(err, message)}\n}\n\nfunc namesMatch(module buildpack.BuildModule, moduleOnDisk buildpack.ModuleTar) bool {\n\treturn moduleOnDisk.Info().FullName() == fmt.Sprintf(\"%s@%s\", module.Descriptor().EscapedID(), module.Descriptor().Info().Version) ||\n\t\tmoduleOnDisk.Info().FullName() == module.Descriptor().Info().FullName()\n}\n\ntype modInfo struct {\n\tmoduleTars []buildpack.ModuleTar\n\terr        error\n}\n\ntype errModuleTar struct {\n\tmodule buildpack.BuildModule\n}\n\nfunc (e errModuleTar) Info() dist.ModuleInfo 
{\n\treturn e.module.Descriptor().Info()\n}\n\nfunc (e errModuleTar) Path() string {\n\treturn \"\"\n}\n\nfunc cnbBuildConfigDir() string {\n\tif env, ok := os.LookupEnv(\"CNB_BUILD_CONFIG_DIR\"); ok {\n\t\treturn env\n\t}\n\n\treturn \"/cnb/build-config\"\n}\n"
  },
  {
    "path": "internal/builder/builder_test.go",
    "content": "package builder_test\n\nimport (\n\t\"bytes\"\n\t\"crypto/sha256\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"slices\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/testmocks\"\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuilder(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder\", testBuilder, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuilder(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tbaseImage      *fakes.Image\n\t\tsubject        *builder.Builder\n\t\tmockController *gomock.Controller\n\t\tmockLifecycle  *testmocks.MockLifecycle\n\t\tbp1v1          buildpack.BuildModule\n\t\tbp1v2          buildpack.BuildModule\n\t\tbp2v1          buildpack.BuildModule\n\t\tbp2v2          buildpack.BuildModule\n\t\text1v1         buildpack.BuildModule\n\t\text1v2         buildpack.BuildModule\n\t\text2v1         buildpack.BuildModule\n\t\tbpOrder        buildpack.BuildModule\n\t\toutBuf         bytes.Buffer\n\t\tlogger         logging.Logger\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tbaseImage = fakes.NewImage(\"base/image\", \"\", nil)\n\t\tmockController = 
gomock.NewController(t)\n\n\t\tlifecycleTarReader := archive.ReadDirAsTar(\n\t\t\tfilepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\"),\n\t\t\t\".\", 0, 0, 0755, true, false, nil,\n\t\t)\n\n\t\tdescriptorContents, err := os.ReadFile(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\", \"lifecycle.toml\"))\n\t\th.AssertNil(t, err)\n\n\t\tlifecycleDescriptor, err := builder.ParseDescriptor(string(descriptorContents))\n\t\th.AssertNil(t, err)\n\n\t\tmockLifecycle = testmocks.NewMockLifecycle(mockController)\n\t\tmockLifecycle.EXPECT().Open().Return(lifecycleTarReader, nil).AnyTimes()\n\t\tmockLifecycle.EXPECT().Descriptor().Return(builder.CompatDescriptor(lifecycleDescriptor)).AnyTimes()\n\n\t\tbp1v1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t},\n\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\tMixins: []string{\"mixinX\", \"mixinY\"},\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbp1v2, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\tVersion: \"buildpack-1-version-2\",\n\t\t\t},\n\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\tMixins: []string{\"mixinX\", \"mixinY\"},\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbp2v1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-2-id\",\n\t\t\t\tVersion: \"buildpack-2-version-1\",\n\t\t\t},\n\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\tMixins: []string{\"build:mixinA\", \"run:mixinB\"},\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\text1v1, err = 
ifakes.NewFakeExtension(dist.ExtensionDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.9\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"extension-1-id\",\n\t\t\t\tVersion: \"extension-1-version-1\",\n\t\t\t},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\text1v2, err = ifakes.NewFakeExtension(dist.ExtensionDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.9\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"extension-1-id\",\n\t\t\t\tVersion: \"extension-1-version-2\",\n\t\t\t},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\text2v1, err = ifakes.NewFakeExtension(dist.ExtensionDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.9\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"extension-2-id\",\n\t\t\t\tVersion: \"extension-2-version-1\",\n\t\t\t},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbpOrder, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"order-buildpack-id\",\n\t\t\t\tVersion: \"order-buildpack-version\",\n\t\t\t},\n\t\t\tWithOrder: []dist.OrderEntry{{\n\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t{\n\t\t\t\t\t\tModuleInfo: bp1v1.Descriptor().Info(),\n\t\t\t\t\t\tOptional:   true,\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tModuleInfo: bp2v1.Descriptor().Info(),\n\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNilE(t, baseImage.Cleanup())\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"the base image is not valid\", func() {\n\t\twhen(\"#FromImage\", func() {\n\t\t\twhen(\"metadata isn't valid\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.builder.metadata\",\n\t\t\t\t\t\t`{\"something-random\": ,}`,\n\t\t\t\t\t))\n\n\t\t\t\t\t_, err := builder.FromImage(baseImage)\n\t\t\t\t\th.AssertError(t, err, \"getting label\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#New\", 
func() {\n\t\t\twhen(\"metadata isn't valid\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.builder.metadata\",\n\t\t\t\t\t\t`{\"something-random\": ,}`,\n\t\t\t\t\t))\n\n\t\t\t\t\t_, err := builder.FromImage(baseImage)\n\t\t\t\t\th.AssertError(t, err, \"getting label\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"missing CNB_USER_ID\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t_, err := builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertError(t, err, \"image 'base/image' missing required env var 'CNB_USER_ID'\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"missing CNB_GROUP_ID\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t_, err := builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertError(t, err, \"image 'base/image' missing required env var 'CNB_GROUP_ID'\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"CNB_USER_ID is not an int\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"not an int\"))\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", \"4321\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t_, err := builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertError(t, err, \"failed to parse 'CNB_USER_ID', value 'not an int' should be an integer\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"CNB_GROUP_ID is not an int\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", \"not an int\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t_, err := builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertError(t, err, \"failed to parse 'CNB_GROUP_ID', value 'not an int' should be an 
integer\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"missing stack id label and run image\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", \"4321\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"does not return an error\", func() {\n\t\t\t\t\t_, err := builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertNilE(t, err)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"mixins metadata is malformed\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", \"4321\"))\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.id\", \"some-id\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.mixins\", `{\"mixinX\", \"mixinY\", \"build:mixinA\"}`))\n\t\t\t\t\t_, err := builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertError(t, err, \"getting label io.buildpacks.stack.mixins\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"order metadata is malformed\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", \"4321\"))\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.id\", \"some-id\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.buildpack.order\", `{\"something\", }`))\n\t\t\t\t\t_, err := builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertError(t, err, \"getting label io.buildpacks.buildpack.order\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"the base image is a valid build image\", func() {\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", 
\"4321\"))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.id\", \"some.stack.id\"))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinX\", \"mixinY\", \"build:mixinA\"]`))\n\t\t\tsubject, err = builder.New(baseImage, \"some/builder\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\tsubject.SetLifecycle(mockLifecycle)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\th.AssertNilE(t, baseImage.Cleanup())\n\t\t})\n\n\t\twhen(\"#Save\", func() {\n\t\t\tit(\"creates a builder from the image and renames it\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t\th.AssertEq(t, baseImage.Name(), \"some/builder\")\n\t\t\t})\n\n\t\t\tit(\"adds creator metadata\", func() {\n\t\t\t\ttestName := \"test-name\"\n\t\t\t\ttestVersion := \"1.2.5\"\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{\n\t\t\t\t\tName:    testName,\n\t\t\t\t\tVersion: testVersion,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar metadata builder.Metadata\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\n\t\t\t\th.AssertEq(t, metadata.CreatedBy.Name, testName)\n\t\t\t\th.AssertEq(t, metadata.CreatedBy.Version, testVersion)\n\t\t\t})\n\n\t\t\tit(\"adds creator name if not provided\", func() {\n\t\t\t\ttestVersion := \"1.2.5\"\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{\n\t\t\t\t\tVersion: testVersion,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar metadata builder.Metadata\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\n\t\t\t\th.AssertEq(t, metadata.CreatedBy.Name, \"Pack CLI\")\n\t\t\t\th.AssertEq(t, metadata.CreatedBy.Version, 
testVersion)\n\t\t\t})\n\n\t\t\tit(\"creates the workspace dir with CNB user and group\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/workspace\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/workspace\",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasOwnerAndGroup(1234, 4321),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"creates the layers dir with CNB user and group\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/layers\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/layers\",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasOwnerAndGroup(1234, 4321),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"creates the cnb dir\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb\",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"creates the build-config dir\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/build-config\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/build-config\",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasOwnerAndGroup(0, 
0),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\t\t\tit(\"creates the buildpacks dir\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/buildpacks\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/buildpacks\",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"creates the platform dir\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/platform\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/platform\",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/platform/env\",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"sets the working dir to the layers dir\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tworkingDir, err := baseImage.WorkingDir()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, workingDir, \"/layers\")\n\t\t\t})\n\n\t\t\tit(\"does not overwrite the order layer when SetOrder has not been called\", func() {\n\t\t\t\ttmpDir, err := os.MkdirTemp(\"\", \"\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer os.RemoveAll(tmpDir)\n\n\t\t\t\tlayerFile := filepath.Join(tmpDir, \"order.tar\")\n\t\t\t\terr = archive.CreateSingleFileTar(layerFile, 
\"/cnb/order.toml\", \"some content\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertNil(t, baseImage.AddLayer(layerFile))\n\t\t\t\th.AssertNil(t, baseImage.Save())\n\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/order.toml\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/order.toml\", h.ContentEquals(\"some content\"))\n\t\t\t})\n\n\t\t\tit(\"adds additional tags as requested\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}, \"additional-tag-one\", \"additional-tag-two\"))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t\th.AssertEq(t, baseImage.Name(), \"some/builder\")\n\t\t\t\tsavedNames := baseImage.SavedNames()\n\t\t\t\tslices.Sort(savedNames)\n\t\t\t\th.AssertEq(t, 3, len(savedNames))\n\t\t\t\th.AssertEq(t, \"additional-tag-one\", savedNames[0])\n\t\t\t\th.AssertEq(t, \"additional-tag-two\", savedNames[1])\n\t\t\t\th.AssertEq(t, \"some/builder\", savedNames[2])\n\t\t\t})\n\n\t\t\twhen(\"validating order\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsubject.SetLifecycle(mockLifecycle)\n\t\t\t\t})\n\n\t\t\t\twhen(\"has single buildpack\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should resolve unset version (to legacy label and order.toml)\", func() {\n\t\t\t\t\t\tsubject.SetOrder(dist.Order{{\n\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: bp1v1.Descriptor().Info().ID}}},\n\t\t\t\t\t\t}})\n\n\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/order.toml\")\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/order.toml\", h.ContentEquals(`[[order]]\n\n  [[order.group]]\n    id = 
\"buildpack-1-id\"\n    version = \"buildpack-1-version-1\"\n`))\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"order points to missing buildpack id\", func() {\n\t\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\t\tsubject.SetOrder(dist.Order{{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"missing-buildpack-id\"}}},\n\t\t\t\t\t\t\t}})\n\n\t\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\n\t\t\t\t\t\t\th.AssertError(t, err, \"no versions of buildpack 'missing-buildpack-id' were found on the builder\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"order points to missing buildpack version\", func() {\n\t\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\t\tsubject.SetOrder(dist.Order{{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack-1-id\", Version: \"missing-buildpack-version\"}}},\n\t\t\t\t\t\t\t}})\n\n\t\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\n\t\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'buildpack-1-id' with version 'missing-buildpack-version' was not found on the builder\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"has repeated buildpacks with the same ID and version\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"order omits version\", func() {\n\t\t\t\t\t\tit(\"should de-duplicate identical buildpacks\", func() {\n\t\t\t\t\t\t\tsubject.SetOrder(dist.Order{\n\t\t\t\t\t\t\t\t{Group: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:       bp1v1.Descriptor().Info().ID,\n\t\t\t\t\t\t\t\t\t\tHomepage: bp1v1.Descriptor().Info().Homepage,\n\t\t\t\t\t\t\t\t\t}}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{Group: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:       
bp1v1.Descriptor().Info().ID,\n\t\t\t\t\t\t\t\t\t\tHomepage: bp1v1.Descriptor().Info().Homepage,\n\t\t\t\t\t\t\t\t\t}}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"has multiple buildpacks with same ID\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v2)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"order explicitly sets version\", func() {\n\t\t\t\t\t\tit(\"should keep order version\", func() {\n\t\t\t\t\t\t\tsubject.SetOrder(dist.Order{{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{ModuleInfo: bp1v1.Descriptor().Info()}},\n\t\t\t\t\t\t\t}})\n\n\t\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/order.toml\")\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/order.toml\", h.ContentEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack-1-id\"\n    version = \"buildpack-1-version-1\"\n`))\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"order version is empty\", func() {\n\t\t\t\t\t\tit(\"return error\", func() {\n\t\t\t\t\t\t\tsubject.SetOrder(dist.Order{{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack-1-id\"}}},\n\t\t\t\t\t\t\t}})\n\n\t\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\t\t\th.AssertError(t, err, \"multiple versions of 'buildpack-1-id' - must specify an explicit version\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"validating buildpacks\", func() {\n\t\t\t\twhen(\"nested buildpack does not exist\", func() {\n\t\t\t\t\twhen(\"buildpack by id does not exist\", func() {\n\t\t\t\t\t\tit(\"returns an error\", func() 
{\n\t\t\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\t\t\tsubject.AddBuildpack(bpOrder)\n\n\t\t\t\t\t\t\t// order buildpack requires bp2v1\n\t\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\n\t\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'buildpack-2-id@buildpack-2-version-1' not found on the builder\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"buildpack version does not exist\", func() {\n\t\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t\tsubject.AddBuildpack(bp1v2)\n\t\t\t\t\t\t\tsubject.AddBuildpack(bp2v1)\n\n\t\t\t\t\t\t\t// order buildpack requires bp1v1 rather than bp1v2\n\t\t\t\t\t\t\tsubject.AddBuildpack(bpOrder)\n\n\t\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\n\t\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'buildpack-1-id@buildpack-1-version-1' not found on the builder\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"buildpack stack id does not match\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tWithInfo:   bp1v1.Descriptor().Info(),\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"other.stack.id\"}},\n\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tsubject.AddBuildpack(bp)\n\t\t\t\t\t\terr = subject.Save(logger, builder.CreatorMetadata{})\n\n\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'buildpack-1-id@buildpack-1-version-1' does not support stack 'some.stack.id'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"buildpack is not compatible with lifecycle\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.1\"),\n\t\t\t\t\t\t\tWithInfo:   bp1v1.Descriptor().Info(),\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\th.AssertNil(t, 
err)\n\n\t\t\t\t\t\tsubject.AddBuildpack(bp)\n\t\t\t\t\t\terr = subject.Save(logger, builder.CreatorMetadata{})\n\n\t\t\t\t\t\th.AssertError(t,\n\t\t\t\t\t\t\terr,\n\t\t\t\t\t\t\t\"buildpack 'buildpack-1-id@buildpack-1-version-1' (Buildpack API 0.1) is incompatible with lifecycle '0.0.0' (Buildpack API(s) 0.2, 0.3, 0.4, 0.9)\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"buildpack mixins are not satisfied\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tWithInfo: bp1v1.Descriptor().Info(),\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\t\t\t\tMixins: []string{\"missing\"},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tsubject.AddBuildpack(bp)\n\t\t\t\t\t\terr = subject.Save(logger, builder.CreatorMetadata{})\n\n\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'buildpack-1-id@buildpack-1-version-1' requires missing mixin(s): missing\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"getting layers label\", func() {\n\t\t\t\tit(\"fails if layers label isn't set correctly\", func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.buildpack.layers\",\n\t\t\t\t\t\t`{\"something-here: }`,\n\t\t\t\t\t))\n\n\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\th.AssertError(t, err, \"getting label io.buildpacks.buildpack.layers\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"saving with duplicated buildpacks\", func() {\n\t\t\t\tit(\"adds a single buildpack to the builder image\", func() {\n\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\tsubject.AddBuildpack(bp2v1)\n\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\n\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\t\t// Expect 7 layers from the 
following locations:\n\t\t\t\t\t//  - 1 from defaultDirsLayer\n\t\t\t\t\t//  - 1 from lifecycleLayer\n\t\t\t\t\t//  - 2 from buildpacks\n\t\t\t\t\t//  - 1 from orderLayer\n\t\t\t\t\t//  - 1 from stackLayer\n\t\t\t\t\t//  - 1 from runImageLayer\n\t\t\t\t\th.AssertEq(t, baseImage.NumberOfAddedLayers(), 7)\n\t\t\t\t})\n\n\t\t\t\twhen(\"duplicated buildpack, has different contents\", func() {\n\t\t\t\t\tvar bp1v1Alt buildpack.BuildModule\n\t\t\t\t\tvar bp1v1AltWithNewContent buildpack.BuildModule\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tvar err error\n\t\t\t\t\t\tbp1v1Alt, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\t\t\t\tMixins: []string{\"mixinX\", \"mixinY\"},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t}, 0644, ifakes.WithExtraBuildpackContents(\"coolbeans\", \"a file cool as beans\"))\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tbp1v1AltWithNewContent, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\t\t\t\tMixins: []string{\"mixinX\", \"mixinY\"},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t}, 0644, ifakes.WithExtraBuildpackContents(\"coolwatermelon\", \"a file cool as watermelon\"))\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"uses the whiteout layers\", func() {\n\t\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, 
logging.WithVerbose())\n\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1Alt)\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1AltWithNewContent)\n\n\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\t\t\toldPath := filepath.Join(\"/cnb\", \"buildpacks\", \"buildpack-1-id\", \"buildpack-1-version-1\", \"coolbeans\")\n\t\t\t\t\t\tlayer, err := baseImage.FindLayerWithPath(oldPath)\n\n\t\t\t\t\t\th.AssertEq(t, layer, \"\")\n\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"could not find '%s' in any layer\", oldPath))\n\n\t\t\t\t\t\tnewPath := filepath.Join(\"/cnb\", \"buildpacks\", \"buildpack-1-id\", \"buildpack-1-version-1\", \"coolwatermelon\")\n\t\t\t\t\t\tlayer, err = baseImage.FindLayerWithPath(newPath)\n\n\t\t\t\t\t\th.AssertNotEq(t, layer, \"\")\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"uses the last buildpack\", func() {\n\t\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1Alt)\n\n\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\t\t\t// Expect 6 layers from the following locations:\n\t\t\t\t\t\t//  - 1 from defaultDirsLayer\n\t\t\t\t\t\t//  - 1 from lifecycleLayer\n\t\t\t\t\t\t//  - 1 from buildpacks\n\t\t\t\t\t\t//  - 1 from orderLayer\n\t\t\t\t\t\t//  - 1 from stackLayer\n\t\t\t\t\t\t//  - 1 from runImageLayer\n\t\t\t\t\t\th.AssertEq(t, baseImage.NumberOfAddedLayers(), 6)\n\t\t\t\t\t\toldSha256 := \"2ba2e8563f7f43533ba26047a44f3e8bb7dd009043bd73a0e6aadb02c084955c\"\n\t\t\t\t\t\tnewSha256 := \"719faea06424d01bb5788ce63c1167e8d382b2d9df8fcf3a0a54ea9b2e3b4045\"\n\t\t\t\t\t\tif runtime.GOOS == \"windows\" {\n\t\t\t\t\t\t\tnewSha256 = 
\"d99d31efba72ebf98e8101ada9e89464566e943c05367c561b116c2cb86837c9\"\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(`buildpack 'buildpack-1-id@buildpack-1-version-1' was previously defined with different contents and will be overwritten\n  - previous diffID: 'sha256:%s'\n  - using diffID: 'sha256:%s'`, oldSha256, newSha256))\n\n\t\t\t\t\t\tlayer, err := baseImage.FindLayerWithPath(filepath.Join(\"/cnb\", \"buildpacks\", \"buildpack-1-id\", \"buildpack-1-version-1\", \"coolbeans\"))\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tbpLayer, err := os.Open(layer)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tdefer bpLayer.Close()\n\n\t\t\t\t\t\thsh := sha256.New()\n\t\t\t\t\t\t_, err = io.Copy(hsh, bpLayer)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, newSha256, fmt.Sprintf(\"%x\", hsh.Sum(nil)))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"adding buildpack that already exists on the image\", func() {\n\t\t\t\t\tit(\"skips adding buildpack that already exists\", func() {\n\t\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\t\t\tdiffID := \"2ba2e8563f7f43533ba26047a44f3e8bb7dd009043bd73a0e6aadb02c084955c\"\n\t\t\t\t\t\tbpLayer := dist.ModuleLayers{\n\t\t\t\t\t\t\t\"buildpack-1-id\": map[string]dist.ModuleLayerInfo{\n\t\t\t\t\t\t\t\t\"buildpack-1-version-1\": {\n\t\t\t\t\t\t\t\t\tAPI:         api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tStacks:      nil,\n\t\t\t\t\t\t\t\t\tOrder:       nil,\n\t\t\t\t\t\t\t\t\tLayerDiffID: fmt.Sprintf(\"sha256:%s\", diffID),\n\t\t\t\t\t\t\t\t\tHomepage:    \"\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\t\t\t\t\t\tbpLayerString, err := json.Marshal(bpLayer)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertNil(t, baseImage.SetLabel( // label builder as already having a buildpack with diffID 
`diffID`\n\t\t\t\t\t\t\tdist.BuildpackLayersLabel,\n\t\t\t\t\t\t\tstring(bpLayerString),\n\t\t\t\t\t\t))\n\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\t\terr = subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tfmt.Println(outBuf.String())\n\t\t\t\t\t\texpectedLog := \"Buildpack 'buildpack-1-id@buildpack-1-version-1' already exists on builder with same contents, skipping...\"\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), expectedLog)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"error adding buildpacks to builder\", func() {\n\t\t\t\twhen(\"unable to convert buildpack to layer tar\", func() {\n\t\t\t\t\tvar bp1v1Err buildpack.BuildModule\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tvar err error\n\t\t\t\t\t\tbp1v1Err, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\t\t\t\tMixins: []string{\"mixinX\", \"mixinY\"},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t}, 0644, ifakes.WithBpOpenError(errors.New(\"unable to open buildpack\")))\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\tsubject.AddBuildpack(bp1v1Err)\n\n\t\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\n\t\t\t\t\t\th.AssertError(t, err, \"unable to open buildpack\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"modules are added in random order\", func() {\n\t\t\t\tvar fakeLayerImage *h.FakeAddedLayerImage\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tvar err error\n\t\t\t\t\tfakeLayerImage = &h.FakeAddedLayerImage{Image: baseImage}\n\t\t\t\t\tsubject, err = builder.New(fakeLayerImage, \"some/builder\")\n\t\t\t\t\th.AssertNil(t, 
err)\n\t\t\t\t\tsubject.SetLifecycle(mockLifecycle)\n\n\t\t\t\t\tbp2v2, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse("0.2"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      "buildpack-2-id",\n\t\t\t\t\t\t\tVersion: "buildpack-2-version-2",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\t\tID:     "some.stack.id",\n\t\t\t\t\t\t\tMixins: []string{"build:mixinA", "run:mixinB"},\n\t\t\t\t\t\t}},\n\t\t\t\t\t}, 0644)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\n\t\t\t\tit("layers are written ordered by buildpacks ID & Version", func() {\n\t\t\t\t\t// add buildpacks in a random order\n\t\t\t\t\tsubject.AddBuildpack(bp2v2)\n\t\t\t\t\tsubject.AddBuildpack(bp1v2)\n\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\tsubject.AddBuildpack(bp2v1)\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\n\t\t\t\t\tlayers := fakeLayerImage.AddedLayersOrder()\n\t\t\t\t\th.AssertEq(t, len(layers), 4)\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[0], h.LayerFileName(bp1v1)))\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[1], h.LayerFileName(bp1v2)))\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[2], h.LayerFileName(bp2v1)))\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[3], h.LayerFileName(bp2v2)))\n\t\t\t\t})\n\n\t\t\t\tit("extensions are written ordered by extension ID & Version", func() {\n\t\t\t\t\t// add extensions in a random order\n\t\t\t\t\tsubject.AddExtension(ext2v1)\n\t\t\t\t\tsubject.AddExtension(ext1v2)\n\t\t\t\t\tsubject.AddExtension(ext1v1)\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\n\t\t\t\t\tlayers := fakeLayerImage.AddedLayersOrder()\n\t\t\t\t\th.AssertEq(t, len(layers), 3)\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[0], h.LayerFileName(ext1v1)))\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[1], h.LayerFileName(ext1v2)))\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[2], 
h.LayerFileName(ext2v1)))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen("system buildpacks", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsubject.SetLifecycle(mockLifecycle)\n\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\tsubject.SetSystem(dist.System{\n\t\t\t\t\t\tPre: dist.SystemBuildpacks{\n\t\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: bp1v1.Descriptor().Info().ID}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\tit("should write system buildpacks to system.toml", func() {\n\t\t\t\t\terr := subject.Save(logger, builder.CreatorMetadata{})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath("/cnb/system.toml")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertOnTarEntry(t, layerTar, "/cnb/system.toml", h.ContentEquals(`[system]\n  [system.pre]\n\n    [[system.pre.buildpacks]]\n      id = "buildpack-1-id"\n      version = "buildpack-1-version-1"\n`))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t\twhen("#SetLifecycle", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t})\n\n\t\t\tit("should set the lifecycle version successfully", func() {\n\t\t\t\th.AssertEq(t, subject.LifecycleDescriptor().Info.Version.String(), "0.0.0")\n\t\t\t})\n\n\t\t\tit("should add the lifecycle binaries as an image layer", func() {\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath("/cnb/lifecycle")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, "/cnb/lifecycle",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, "/cnb/lifecycle/detector",\n\t\t\t\t\th.ContentEquals("detector"),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, 
\"/cnb/lifecycle/restorer\",\n\t\t\t\t\th.ContentEquals(\"restorer\"),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/lifecycle/analyzer\",\n\t\t\t\t\th.ContentEquals(\"analyzer\"),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/lifecycle/builder\",\n\t\t\t\t\th.ContentEquals(\"builder\"),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/lifecycle/exporter\",\n\t\t\t\t\th.ContentEquals(\"exporter\"),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/lifecycle/launcher\",\n\t\t\t\t\th.ContentEquals(\"launcher\"),\n\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\n\t\t\t\tit(\"should add lifecycle symlink\", func() {\n\t\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/lifecycle\",\n\t\t\t\t\t\th.SymlinksTo(\"/cnb/lifecycle\"),\n\t\t\t\t\t\th.HasFileMode(0644),\n\t\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"sets the lifecycle version on the metadata\", func() {\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar metadata builder.Metadata\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\t\t\t\th.AssertEq(t, metadata.Lifecycle.Version.String(), \"0.0.0\")\n\t\t\t\th.AssertEq(t, metadata.Lifecycle.API.BuildpackVersion.String(), \"0.2\")\n\t\t\t\th.AssertEq(t, metadata.Lifecycle.API.PlatformVersion.String(), \"0.2\")\n\t\t\t\th.AssertNotNil(t, metadata.Lifecycle.APIs)\n\t\t\t\th.AssertEq(t, metadata.Lifecycle.APIs.Buildpack.Deprecated.AsStrings(), []string{})\n\t\t\t\th.AssertEq(t, 
metadata.Lifecycle.APIs.Buildpack.Supported.AsStrings(), []string{\"0.2\", \"0.3\", \"0.4\", \"0.9\"})\n\t\t\t\th.AssertEq(t, metadata.Lifecycle.APIs.Platform.Deprecated.AsStrings(), []string{\"0.2\"})\n\t\t\t\th.AssertEq(t, metadata.Lifecycle.APIs.Platform.Supported.AsStrings(), []string{\"0.3\", \"0.4\"})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#AddBuildpack\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\tsubject.AddBuildpack(bp1v2)\n\t\t\t\tsubject.AddBuildpack(bp2v1)\n\t\t\t\tsubject.AddBuildpack(bpOrder)\n\t\t\t})\n\n\t\t\tit(\"adds the buildpack as an image layer\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t\tassertImageHasBPLayer(t, baseImage, bp1v1)\n\t\t\t\tassertImageHasBPLayer(t, baseImage, bp1v2)\n\t\t\t\tassertImageHasBPLayer(t, baseImage, bp2v1)\n\t\t\t\tassertImageHasOrderBpLayer(t, baseImage, bpOrder)\n\t\t\t})\n\n\t\t\tit(\"adds the buildpack metadata\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar metadata builder.Metadata\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\t\t\t\th.AssertEq(t, len(metadata.Buildpacks), 4)\n\n\t\t\t\th.AssertEq(t, metadata.Buildpacks[0].ID, \"buildpack-1-id\")\n\t\t\t\th.AssertEq(t, metadata.Buildpacks[0].Version, \"buildpack-1-version-1\")\n\n\t\t\t\th.AssertEq(t, metadata.Buildpacks[1].ID, \"buildpack-1-id\")\n\t\t\t\th.AssertEq(t, metadata.Buildpacks[1].Version, \"buildpack-1-version-2\")\n\n\t\t\t\th.AssertEq(t, metadata.Buildpacks[2].ID, \"buildpack-2-id\")\n\t\t\t\th.AssertEq(t, metadata.Buildpacks[2].Version, \"buildpack-2-version-1\")\n\n\t\t\t\th.AssertEq(t, metadata.Buildpacks[3].ID, \"order-buildpack-id\")\n\t\t\t\th.AssertEq(t, metadata.Buildpacks[3].Version, 
\"order-buildpack-version\")\n\t\t\t})\n\n\t\t\tit(\"adds the buildpack layers label\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.buildpack.layers\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar layers dist.ModuleLayers\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &layers))\n\t\t\t\th.AssertEq(t, len(layers), 3)\n\t\t\t\th.AssertEq(t, len(layers[\"buildpack-1-id\"]), 2)\n\t\t\t\th.AssertEq(t, len(layers[\"buildpack-2-id\"]), 1)\n\n\t\t\t\th.AssertEq(t, len(layers[\"buildpack-1-id\"][\"buildpack-1-version-1\"].Order), 0)\n\t\t\t\th.AssertEq(t, len(layers[\"buildpack-1-id\"][\"buildpack-1-version-1\"].Stacks), 1)\n\t\t\t\th.AssertEq(t, layers[\"buildpack-1-id\"][\"buildpack-1-version-1\"].Stacks[0].ID, \"some.stack.id\")\n\t\t\t\th.AssertSliceContainsOnly(t, layers[\"buildpack-1-id\"][\"buildpack-1-version-1\"].Stacks[0].Mixins, \"mixinX\", \"mixinY\")\n\n\t\t\t\th.AssertEq(t, len(layers[\"buildpack-1-id\"][\"buildpack-1-version-2\"].Order), 0)\n\t\t\t\th.AssertEq(t, len(layers[\"buildpack-1-id\"][\"buildpack-1-version-2\"].Stacks), 1)\n\t\t\t\th.AssertEq(t, layers[\"buildpack-1-id\"][\"buildpack-1-version-2\"].Stacks[0].ID, \"some.stack.id\")\n\t\t\t\th.AssertSliceContainsOnly(t, layers[\"buildpack-1-id\"][\"buildpack-1-version-2\"].Stacks[0].Mixins, \"mixinX\", \"mixinY\")\n\n\t\t\t\th.AssertEq(t, len(layers[\"buildpack-2-id\"][\"buildpack-2-version-1\"].Order), 0)\n\t\t\t\th.AssertEq(t, len(layers[\"buildpack-2-id\"][\"buildpack-2-version-1\"].Stacks), 1)\n\t\t\t\th.AssertEq(t, layers[\"buildpack-2-id\"][\"buildpack-2-version-1\"].Stacks[0].ID, \"some.stack.id\")\n\t\t\t\th.AssertSliceContainsOnly(t, layers[\"buildpack-2-id\"][\"buildpack-2-version-1\"].Stacks[0].Mixins, \"build:mixinA\", \"run:mixinB\")\n\n\t\t\t\th.AssertEq(t, len(layers[\"order-buildpack-id\"][\"order-buildpack-version\"].Order), 
1)\n\t\t\t\th.AssertEq(t, len(layers[\"order-buildpack-id\"][\"order-buildpack-version\"].Order[0].Group), 2)\n\t\t\t\th.AssertEq(t, layers[\"order-buildpack-id\"][\"order-buildpack-version\"].Order[0].Group[0].ID, \"buildpack-1-id\")\n\t\t\t\th.AssertEq(t, layers[\"order-buildpack-id\"][\"order-buildpack-version\"].Order[0].Group[0].Version, \"buildpack-1-version-1\")\n\t\t\t\th.AssertEq(t, layers[\"order-buildpack-id\"][\"order-buildpack-version\"].Order[0].Group[0].Optional, true)\n\t\t\t\th.AssertEq(t, layers[\"order-buildpack-id\"][\"order-buildpack-version\"].Order[0].Group[1].ID, \"buildpack-2-id\")\n\t\t\t\th.AssertEq(t, layers[\"order-buildpack-id\"][\"order-buildpack-version\"].Order[0].Group[1].Version, \"buildpack-2-version-1\")\n\t\t\t\th.AssertEq(t, layers[\"order-buildpack-id\"][\"order-buildpack-version\"].Order[0].Group[1].Optional, false)\n\t\t\t})\n\n\t\t\twhen(\"base image already has buildpack layers label\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tvar mdJSON bytes.Buffer\n\t\t\t\t\th.AssertNil(t, json.Compact(\n\t\t\t\t\t\t&mdJSON,\n\t\t\t\t\t\t[]byte(`{\n  \"buildpack-1-id\": {\n    \"buildpack-1-version-1\": {\n      \"layerDiffID\": \"sha256:buildpack-1-version-1-diff-id\"\n    },\n    \"buildpack-1-version-2\": {\n      \"layerDiffID\": \"sha256:buildpack-1-version-2-diff-id\"\n    }\n  }\n}\n`)))\n\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.buildpack.layers\",\n\t\t\t\t\t\tmdJSON.String(),\n\t\t\t\t\t))\n\n\t\t\t\t\tvar err error\n\t\t\t\t\tsubject, err = builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tsubject.AddBuildpack(bp1v2)\n\t\t\t\t\tsubject.AddBuildpack(bp2v1)\n\n\t\t\t\t\tsubject.SetLifecycle(mockLifecycle)\n\t\t\t\t})\n\n\t\t\t\tit(\"appends buildpack layer info\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\t\tlabel, err := 
baseImage.Label(\"io.buildpacks.buildpack.layers\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar layers dist.ModuleLayers\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &layers))\n\t\t\t\t\th.AssertEq(t, len(layers), 2)\n\t\t\t\t\th.AssertEq(t, len(layers[\"buildpack-1-id\"]), 2)\n\t\t\t\t\th.AssertEq(t, len(layers[\"buildpack-2-id\"]), 1)\n\n\t\t\t\t\th.AssertEq(t, layers[\"buildpack-1-id\"][\"buildpack-1-version-1\"].LayerDiffID, \"sha256:buildpack-1-version-1-diff-id\")\n\n\t\t\t\t\th.AssertUnique(t,\n\t\t\t\t\t\tlayers[\"buildpack-1-id\"][\"buildpack-1-version-1\"].LayerDiffID,\n\t\t\t\t\t\tlayers[\"buildpack-1-id\"][\"buildpack-1-version-2\"].LayerDiffID,\n\t\t\t\t\t\tlayers[\"buildpack-2-id\"][\"buildpack-2-version-1\"].LayerDiffID,\n\t\t\t\t\t)\n\n\t\t\t\t\th.AssertEq(t, len(layers[\"buildpack-1-id\"][\"buildpack-1-version-1\"].Order), 0)\n\t\t\t\t\th.AssertEq(t, len(layers[\"buildpack-1-id\"][\"buildpack-1-version-2\"].Order), 0)\n\t\t\t\t\th.AssertEq(t, len(layers[\"buildpack-2-id\"][\"buildpack-2-version-1\"].Order), 0)\n\t\t\t\t})\n\n\t\t\t\tit(\"informs when overriding existing buildpack, and log level is DEBUG\", func() {\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.buildpack.layers\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar layers dist.ModuleLayers\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &layers))\n\n\t\t\t\t\th.AssertContains(t,\n\t\t\t\t\t\toutBuf.String(),\n\t\t\t\t\t\t\"buildpack 'buildpack-1-id@buildpack-1-version-2' already exists on builder and will be overwritten\",\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNotContains(t, layers[\"buildpack-1-id\"][\"buildpack-1-version-2\"].LayerDiffID, \"buildpack-1-version-2-diff-id\")\n\t\t\t\t})\n\n\t\t\t\tit(\"doesn't message when overriding existing 
buildpack when log level is INFO\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.buildpack.layers\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar layers dist.ModuleLayers\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &layers))\n\n\t\t\t\t\th.AssertNotContains(t,\n\t\t\t\t\t\toutBuf.String(),\n\t\t\t\t\t\t\"buildpack 'buildpack-1-id@buildpack-1-version-2' already exists on builder and will be overwritten\",\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNotContains(t, layers[\"buildpack-1-id\"][\"buildpack-1-version-2\"].LayerDiffID, \"buildpack-1-version-2-diff-id\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"base image already has metadata\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.builder.metadata\",\n\t\t\t\t\t\t`{\n\"buildpacks\":[{\"id\":\"prev.id\"}],\n\"groups\":[{\"buildpacks\":[{\"id\":\"prev.id\"}]}],\n\"stack\":{\"runImage\":{\"image\":\"prev/run\",\"mirrors\":[\"prev/mirror\"]}},\n\"lifecycle\":{\"version\":\"6.6.6\",\"apis\":{\"buildpack\":{\"deprecated\":[\"0.1\"],\"supported\":[\"0.2\",\"0.3\"]},\"platform\":{\"deprecated\":[],\"supported\":[\"2.3\",\"2.4\"]}}}\n}`,\n\t\t\t\t\t))\n\n\t\t\t\t\tvar err error\n\t\t\t\t\tsubject, err = builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t\t})\n\n\t\t\t\tit(\"appends the buildpack to the metadata\", func() {\n\t\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar metadata builder.Metadata\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\t\t\t\t\th.AssertEq(t, len(metadata.Buildpacks), 2)\n\n\t\t\t\t\t// keeps 
original metadata\n\t\t\t\t\th.AssertEq(t, metadata.Buildpacks[0].ID, \"prev.id\")\n\t\t\t\t\th.AssertEq(t, metadata.Stack.RunImage.Image, \"prev/run\")\n\t\t\t\t\th.AssertEq(t, metadata.Stack.RunImage.Mirrors[0], \"prev/mirror\")\n\t\t\t\t\th.AssertEq(t, subject.LifecycleDescriptor().Info.Version.String(), \"6.6.6\")\n\n\t\t\t\t\t// adds new buildpack\n\t\t\t\t\th.AssertEq(t, metadata.Buildpacks[1].ID, \"buildpack-1-id\")\n\t\t\t\t\th.AssertEq(t, metadata.Buildpacks[1].Version, \"buildpack-1-version-1\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#AddExtension\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsubject.AddExtension(ext1v1)\n\t\t\t\tsubject.AddExtension(ext1v2)\n\t\t\t\tsubject.AddExtension(ext2v1)\n\t\t\t})\n\n\t\t\tit(\"adds the extension as an image layer\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t\tassertImageHasExtLayer(t, baseImage, ext1v1)\n\t\t\t\tassertImageHasExtLayer(t, baseImage, ext1v2)\n\t\t\t\tassertImageHasExtLayer(t, baseImage, ext2v1)\n\t\t\t})\n\n\t\t\tit(\"adds the extension metadata\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar metadata builder.Metadata\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\t\t\t\th.AssertEq(t, len(metadata.Extensions), 3)\n\n\t\t\t\th.AssertEq(t, metadata.Extensions[0].ID, \"extension-1-id\")\n\t\t\t\th.AssertEq(t, metadata.Extensions[0].Version, \"extension-1-version-1\")\n\n\t\t\t\th.AssertEq(t, metadata.Extensions[1].ID, \"extension-1-id\")\n\t\t\t\th.AssertEq(t, metadata.Extensions[1].Version, \"extension-1-version-2\")\n\n\t\t\t\th.AssertEq(t, metadata.Extensions[2].ID, \"extension-2-id\")\n\t\t\t\th.AssertEq(t, metadata.Extensions[2].Version, 
\"extension-2-version-1\")\n\t\t\t})\n\n\t\t\tit(\"adds the extension layers label\", func() {\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.extension.layers\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar layers dist.ModuleLayers\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &layers))\n\t\t\t\th.AssertEq(t, len(layers), 2)\n\t\t\t\th.AssertEq(t, len(layers[\"extension-1-id\"]), 2)\n\t\t\t\th.AssertEq(t, len(layers[\"extension-2-id\"]), 1)\n\n\t\t\t\th.AssertEq(t, layers[\"extension-1-id\"][\"extension-1-version-1\"].API.String(), \"0.9\")\n\t\t\t\th.AssertEq(t, layers[\"extension-1-id\"][\"extension-1-version-2\"].API.String(), \"0.9\")\n\t\t\t\th.AssertEq(t, layers[\"extension-2-id\"][\"extension-2-version-1\"].API.String(), \"0.9\")\n\t\t\t})\n\n\t\t\twhen(\"base image already has extension layers label\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tvar mdJSON bytes.Buffer\n\t\t\t\t\th.AssertNil(t, json.Compact(\n\t\t\t\t\t\t&mdJSON,\n\t\t\t\t\t\t[]byte(`{\n\t\t\t \"extension-1-id\": {\n\t\t\t   \"extension-1-version-1\": {\n\t\t\t     \"layerDiffID\": \"sha256:extension-1-version-1-diff-id\"\n\t\t\t   },\n\t\t\t   \"extension-1-version-2\": {\n\t\t\t     \"layerDiffID\": \"sha256:extension-1-version-2-diff-id\"\n\t\t\t   }\n\t\t\t }\n\t\t\t}\n\t\t\t`)))\n\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.extension.layers\",\n\t\t\t\t\t\tmdJSON.String(),\n\t\t\t\t\t))\n\n\t\t\t\t\tvar err error\n\t\t\t\t\tsubject, err = builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tsubject.AddExtension(ext1v2)\n\t\t\t\t\tsubject.AddExtension(ext2v1)\n\n\t\t\t\t\tsubject.SetLifecycle(mockLifecycle)\n\t\t\t\t})\n\n\t\t\t\tit(\"appends extension layer info\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, 
baseImage.IsSaved(), true)\n\n\t\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.extension.layers\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar layers dist.ModuleLayers\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &layers))\n\t\t\t\t\th.AssertEq(t, len(layers), 2)\n\t\t\t\t\th.AssertEq(t, len(layers[\"extension-1-id\"]), 2)\n\t\t\t\t\th.AssertEq(t, len(layers[\"extension-2-id\"]), 1)\n\n\t\t\t\t\th.AssertEq(t, layers[\"extension-1-id\"][\"extension-1-version-1\"].LayerDiffID, \"sha256:extension-1-version-1-diff-id\")\n\n\t\t\t\t\th.AssertUnique(t,\n\t\t\t\t\t\tlayers[\"extension-1-id\"][\"extension-1-version-1\"].LayerDiffID,\n\t\t\t\t\t\tlayers[\"extension-1-id\"][\"extension-1-version-2\"].LayerDiffID,\n\t\t\t\t\t\tlayers[\"extension-2-id\"][\"extension-2-version-1\"].LayerDiffID,\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit(\"informs when overriding existing extension, and log level is DEBUG\", func() {\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.extension.layers\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar layers dist.ModuleLayers\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &layers))\n\n\t\t\t\t\th.AssertContains(t,\n\t\t\t\t\t\toutBuf.String(),\n\t\t\t\t\t\t\"extension 'extension-1-id@extension-1-version-2' already exists on builder and will be overwritten\",\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNotContains(t, layers[\"extension-1-id\"][\"extension-1-version-2\"].LayerDiffID, \"extension-1-version-2-diff-id\")\n\t\t\t\t})\n\n\t\t\t\tit(\"doesn't message when overriding existing extension when log level is INFO\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\n\t\t\t\t\tlabel, err := 
baseImage.Label(\"io.buildpacks.extension.layers\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar layers dist.ModuleLayers\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &layers))\n\n\t\t\t\t\th.AssertNotContains(t,\n\t\t\t\t\t\toutBuf.String(),\n\t\t\t\t\t\t\"extension 'extension-1-id@extension-1-version-2' already exists on builder and will be overwritten\",\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNotContains(t, layers[\"extension-1-id\"][\"extension-1-version-2\"].LayerDiffID, \"extension-1-version-2-diff-id\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"base image already has metadata\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.builder.metadata\",\n\t\t\t\t\t\t`{\n\t\t\t\"extensions\":[{\"id\":\"prev.id\"}],\n\t\t\t\"lifecycle\":{\"version\":\"6.6.6\",\"apis\":{\"buildpack\":{\"deprecated\":[\"0.1\"],\"supported\":[\"0.2\",\"0.3\",\"0.9\"]},\"platform\":{\"deprecated\":[],\"supported\":[\"2.3\",\"2.4\"]}}}\n\t\t\t}`,\n\t\t\t\t\t))\n\n\t\t\t\t\tvar err error\n\t\t\t\t\tsubject, err = builder.New(baseImage, \"some/builder\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tsubject.AddExtension(ext1v1)\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t\t})\n\n\t\t\t\tit(\"appends the extensions to the metadata\", func() {\n\t\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar metadata builder.Metadata\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\t\t\t\t\th.AssertEq(t, len(metadata.Extensions), 2)\n\n\t\t\t\t\t// keeps original metadata\n\t\t\t\t\th.AssertEq(t, metadata.Extensions[0].ID, \"prev.id\")\n\t\t\t\t\th.AssertEq(t, subject.LifecycleDescriptor().Info.Version.String(), \"6.6.6\")\n\n\t\t\t\t\t// adds new extension\n\t\t\t\t\th.AssertEq(t, metadata.Extensions[1].ID, \"extension-1-id\")\n\t\t\t\t\th.AssertEq(t, 
metadata.Extensions[1].Version, \"extension-1-version-1\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#SetOrder\", func() {\n\t\t\twhen(\"the buildpacks exist in the image\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsubject.AddBuildpack(bp1v1)\n\t\t\t\t\tsubject.AddBuildpack(bp2v1)\n\t\t\t\t\tsubject.SetOrder(dist.Order{\n\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\tID: bp1v1.Descriptor().Info().ID,\n\t\t\t\t\t\t\t\t\t// Version excluded intentionally\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: bp2v1.Descriptor().Info(),\n\t\t\t\t\t\t\t\tOptional:   true,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}},\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t\t})\n\n\t\t\t\tit(\"adds the order.toml to the image\", func() {\n\t\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/order.toml\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/order.toml\",\n\t\t\t\t\t\th.ContentEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack-1-id\"\n    version = \"buildpack-1-version-1\"\n\n  [[order.group]]\n    id = \"buildpack-2-id\"\n    version = \"buildpack-2-version-1\"\n    optional = true\n`),\n\t\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit(\"adds the order to the order label\", func() {\n\t\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.buildpack.order\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar order dist.Order\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &order))\n\t\t\t\t\th.AssertEq(t, len(order), 1)\n\t\t\t\t\th.AssertEq(t, len(order[0].Group), 2)\n\t\t\t\t\th.AssertEq(t, order[0].Group[0].ID, \"buildpack-1-id\")\n\t\t\t\t\th.AssertEq(t, order[0].Group[0].Version, \"\")\n\t\t\t\t\th.AssertEq(t, order[0].Group[0].Optional, 
false)\n\t\t\t\t\th.AssertEq(t, order[0].Group[1].ID, \"buildpack-2-id\")\n\t\t\t\t\th.AssertEq(t, order[0].Group[1].Version, \"buildpack-2-version-1\")\n\t\t\t\t\th.AssertEq(t, order[0].Group[1].Optional, true)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#SetOrderExtensions\", func() {\n\t\t\twhen(\"the extensions exist in the image\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsubject.AddExtension(ext1v1)\n\t\t\t\t\tsubject.AddExtension(ext2v1)\n\t\t\t\t\tsubject.SetOrderExtensions(dist.Order{\n\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\tID: ext1v1.Descriptor().Info().ID,\n\t\t\t\t\t\t\t\t\t// Version excluded intentionally\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: ext2v1.Descriptor().Info(),\n\t\t\t\t\t\t\t\tOptional:   true, // extensions are always optional; this shouldn't be redundantly printed\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}},\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t\t})\n\n\t\t\t\tit(\"adds the order.toml to the image\", func() {\n\t\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/order.toml\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/order.toml\",\n\t\t\t\t\t\th.ContentEquals(`[[order-extensions]]\n\n  [[order-extensions.group]]\n    id = \"extension-1-id\"\n    version = \"extension-1-version-1\"\n\n  [[order-extensions.group]]\n    id = \"extension-2-id\"\n    version = \"extension-2-version-1\"\n`),\n\t\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit(\"adds the order for extensions to the order-extensions label\", func() {\n\t\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.buildpack.order-extensions\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tvar orderExt dist.Order\n\t\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), 
&orderExt))\n\t\t\t\t\th.AssertEq(t, len(orderExt), 1)\n\t\t\t\t\th.AssertEq(t, len(orderExt[0].Group), 2)\n\t\t\t\t\th.AssertEq(t, orderExt[0].Group[0].ID, \"extension-1-id\")\n\t\t\t\t\th.AssertEq(t, orderExt[0].Group[0].Version, \"\")\n\t\t\t\t\th.AssertEq(t, orderExt[0].Group[0].Optional, false)\n\t\t\t\t\th.AssertEq(t, orderExt[0].Group[1].ID, \"extension-2-id\")\n\t\t\t\t\th.AssertEq(t, orderExt[0].Group[1].Version, \"extension-2-version-1\")\n\t\t\t\t\th.AssertEq(t, orderExt[0].Group[1].Optional, false)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#SetDescription\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsubject.SetDescription(\"Some description\")\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t})\n\n\t\t\tit(\"sets the description on the metadata\", func() {\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar metadata builder.Metadata\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\t\t\t\th.AssertEq(t, metadata.Description, \"Some description\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#SetStack\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsubject.SetStack(pubbldr.StackConfig{\n\t\t\t\t\tRunImage:        \"some/run\",\n\t\t\t\t\tRunImageMirrors: []string{\"some/mirror\", \"other/mirror\"},\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t})\n\n\t\t\tit(\"adds the stack.toml to the image\", func() {\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/stack.toml\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/stack.toml\",\n\t\t\t\t\th.ContentEquals(`[run-image]\n  image = \"some/run\"\n  mirrors = [\"some/mirror\", \"other/mirror\"]\n`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"adds the stack to the metadata\", 
func() {\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar metadata builder.Metadata\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\t\t\t\th.AssertEq(t, metadata.Stack.RunImage.Image, \"some/run\")\n\t\t\t\th.AssertEq(t, metadata.Stack.RunImage.Mirrors[0], \"some/mirror\")\n\t\t\t\th.AssertEq(t, metadata.Stack.RunImage.Mirrors[1], \"other/mirror\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#SetRunImage\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsubject.SetRunImage(pubbldr.RunConfig{Images: []pubbldr.RunImageConfig{{\n\t\t\t\t\tImage:   \"some/run\",\n\t\t\t\t\tMirrors: []string{\"some/mirror\", \"other/mirror\"},\n\t\t\t\t}}})\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t})\n\n\t\t\tit(\"adds the run.toml to the image\", func() {\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/run.toml\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/run.toml\",\n\t\t\t\t\th.ContentEquals(`[[images]]\n  image = \"some/run\"\n  mirrors = [\"some/mirror\", \"other/mirror\"]\n`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"adds the stack.toml to the image\", func() {\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/stack.toml\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/stack.toml\",\n\t\t\t\t\th.ContentEquals(`[run-image]\n  image = \"some/run\"\n  mirrors = [\"some/mirror\", \"other/mirror\"]\n`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"adds the run image to the metadata\", func() {\n\t\t\t\tlabel, err := baseImage.Label(\"io.buildpacks.builder.metadata\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar metadata builder.Metadata\n\t\t\t\th.AssertNil(t, json.Unmarshal([]byte(label), &metadata))\n\t\t\t\th.AssertEq(t, 
metadata.RunImages[0].Image, "some/run")\n\t\t\t\th.AssertEq(t, metadata.RunImages[0].Mirrors[0], "some/mirror")\n\t\t\t\th.AssertEq(t, metadata.RunImages[0].Mirrors[1], "other/mirror")\n\t\t\t})\n\t\t})\n\n\t\twhen("CNB_BUILD_CONFIG_DIR is defined", func() {\n\t\t\tvar buildConfigEnvName = "CNB_BUILD_CONFIG_DIR"\n\t\t\tvar buildConfigEnvValue = "/cnb/dup-build-config-dir"\n\t\t\tit.Before(func() {\n\t\t\t\tos.Setenv(buildConfigEnvName, buildConfigEnvValue)\n\t\t\t\tsubject.SetBuildConfigEnv(map[string]string{\n\t\t\t\t\t"SOME_KEY":         "some-val",\n\t\t\t\t\t"OTHER_KEY.append": "other-val",\n\t\t\t\t\t"OTHER_KEY.delim":  ":",\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t})\n\t\t\tit.After(func() {\n\t\t\t\tos.Unsetenv(buildConfigEnvName)\n\t\t\t})\n\n\t\t\tit("adds the env vars as files to the image", func() {\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(buildConfigEnvValue + "/env/SOME_KEY")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, buildConfigEnvValue+"/env/SOME_KEY",\n\t\t\t\t\th.ContentEquals(`some-val`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, buildConfigEnvValue+"/env/OTHER_KEY.append",\n\t\t\t\t\th.ContentEquals(`other-val`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, buildConfigEnvValue+"/env/OTHER_KEY.delim",\n\t\t\t\t\th.ContentEquals(`:`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen("#SetBuildConfigEnv", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tos.Unsetenv("CNB_BUILD_CONFIG_DIR")\n\t\t\t\tsubject.SetBuildConfigEnv(map[string]string{\n\t\t\t\t\t"SOME_KEY":         "some-val",\n\t\t\t\t\t"OTHER_KEY.append": "other-val",\n\t\t\t\t\t"OTHER_KEY.delim":  
\":\",\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t})\n\n\t\t\tit(\"adds the env vars as files to the image\", func() {\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/cnb/build-config/env/SOME_KEY\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/build-config/env/SOME_KEY\",\n\t\t\t\t\th.ContentEquals(`some-val`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/build-config/env/OTHER_KEY.append\",\n\t\t\t\t\th.ContentEquals(`other-val`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/cnb/build-config/env/OTHER_KEY.delim\",\n\t\t\t\t\th.ContentEquals(`:`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#SetEnv\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsubject.SetEnv(map[string]string{\n\t\t\t\t\t\"SOME_KEY\":  \"some-val\",\n\t\t\t\t\t\"OTHER_KEY\": \"other-val\",\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t})\n\n\t\t\tit(\"adds the env vars as files to the image\", func() {\n\t\t\t\tlayerTar, err := baseImage.FindLayerWithPath(\"/platform/env/SOME_KEY\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/platform/env/SOME_KEY\",\n\t\t\t\t\th.ContentEquals(`some-val`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, layerTar, \"/platform/env/OTHER_KEY\",\n\t\t\t\t\th.ContentEquals(`other-val`),\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#DefaultRunImage\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsubject.SetRunImage(pubbldr.RunConfig{Images: []pubbldr.RunImageConfig{{\n\t\t\t\t\tImage:   \"some/run\",\n\t\t\t\t\tMirrors: 
[]string{\"some/mirror\", \"other/mirror\"},\n\t\t\t\t}}})\n\t\t\t\th.AssertNil(t, subject.Save(logger, builder.CreatorMetadata{}))\n\t\t\t\th.AssertEq(t, baseImage.IsSaved(), true)\n\t\t\t})\n\n\t\t\tit(\"adds the run.toml to the image\", func() {\n\t\t\t\tactual := subject.DefaultRunImage()\n\t\t\t\th.AssertEq(t, actual.Image, \"some/run\")\n\t\t\t\th.AssertEq(t, actual.Mirrors, []string{\"some/mirror\", \"other/mirror\"})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"builder exists\", func() {\n\t\tvar builderImage imgutil.Image\n\n\t\tit.Before(func() {\n\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", \"4321\"))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.id\", \"some.stack.id\"))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinX\", \"mixinY\", \"build:mixinA\"]`))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\"io.buildpacks.builder.metadata\",\n\t\t\t\t`{\"description\": \"some-description\", \"createdBy\": {\"name\": \"some-name\", \"version\": \"1.2.3\"}, \"buildpacks\": [{\"id\": \"buildpack-1-id\"}, {\"id\": \"buildpack-2-id\"}], \"groups\": [{\"buildpacks\": [{\"id\": \"buildpack-1-id\", \"version\": \"buildpack-1-version\", \"optional\": false}, {\"id\": \"buildpack-2-id\", \"version\": \"buildpack-2-version-1\", \"optional\": true}]}], \"stack\": {\"runImage\": {\"image\": \"prev/run\", \"mirrors\": [\"prev/mirror\"]}}, \"lifecycle\": {\"version\": \"6.6.6\"}}`,\n\t\t\t))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\"io.buildpacks.buildpack.order\",\n\t\t\t\t`[{\"group\": [{\"id\": \"buildpack-1-id\", \"optional\": false}, {\"id\": \"buildpack-2-id\", \"version\": \"buildpack-2-version-1\", \"optional\": true}]}]`,\n\t\t\t))\n\n\t\t\tbuilderImage = baseImage\n\t\t})\n\n\t\twhen(\"#FromImage\", func() {\n\t\t\tvar bldr *builder.Builder\n\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\t\t\t\tbldr, err = 
builder.FromImage(builderImage)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"gets builder from image\", func() {\n\t\t\t\th.AssertEq(t, bldr.Buildpacks()[0].ID, \"buildpack-1-id\")\n\t\t\t\th.AssertEq(t, bldr.Buildpacks()[1].ID, \"buildpack-2-id\")\n\n\t\t\t\torder := bldr.Order()\n\t\t\t\th.AssertEq(t, len(order), 1)\n\t\t\t\th.AssertEq(t, len(order[0].Group), 2)\n\t\t\t\th.AssertEq(t, order[0].Group[0].ID, \"buildpack-1-id\")\n\t\t\t\th.AssertEq(t, order[0].Group[0].Version, \"\")\n\t\t\t\th.AssertEq(t, order[0].Group[0].Optional, false)\n\t\t\t\th.AssertEq(t, order[0].Group[1].ID, \"buildpack-2-id\")\n\t\t\t\th.AssertEq(t, order[0].Group[1].Version, \"buildpack-2-version-1\")\n\t\t\t\th.AssertEq(t, order[0].Group[1].Optional, true)\n\t\t\t})\n\n\t\t\tit(\"gets mixins from image\", func() {\n\t\t\t\th.AssertSliceContainsOnly(t, bldr.Mixins(), \"mixinX\", \"mixinY\", \"build:mixinA\")\n\t\t\t})\n\n\t\t\twhen(\"metadata is missing\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, builderImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.builder.metadata\",\n\t\t\t\t\t\t\"\",\n\t\t\t\t\t))\n\t\t\t\t})\n\n\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t_, err := builder.FromImage(builderImage)\n\t\t\t\t\th.AssertError(t, err, \"missing label 'io.buildpacks.builder.metadata'\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#Description\", func() {\n\t\t\t\tit(\"return description\", func() {\n\t\t\t\t\th.AssertEq(t, bldr.Description(), \"some-description\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#CreatedBy\", func() {\n\t\t\t\tit(\"return CreatedBy\", func() {\n\t\t\t\t\texpectedCreatorMetadata := builder.CreatorMetadata{\n\t\t\t\t\t\tName:    \"some-name\",\n\t\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t\t}\n\t\t\t\t\th.AssertEq(t, bldr.CreatedBy(), expectedCreatorMetadata)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#Name\", func() {\n\t\t\t\tit(\"return Name\", func() {\n\t\t\t\t\th.AssertEq(t, bldr.Name(), 
\"base/image\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#Image\", func() {\n\t\t\t\tit(\"return Image\", func() {\n\t\t\t\t\th.AssertSameInstance(t, bldr.Image(), baseImage)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#Stack\", func() {\n\t\t\t\tit(\"return Stack\", func() {\n\t\t\t\t\texpectedStack := builder.StackMetadata{\n\t\t\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\t\t\tImage:   \"prev/run\",\n\t\t\t\t\t\t\tMirrors: []string{\"prev/mirror\"}}}\n\n\t\t\t\t\th.AssertEq(t, bldr.Stack(), expectedStack)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#UID\", func() {\n\t\t\t\tit(\"return UID\", func() {\n\t\t\t\t\th.AssertEq(t, bldr.UID(), 1234)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#GID\", func() {\n\t\t\t\tit(\"return GID\", func() {\n\t\t\t\t\th.AssertEq(t, bldr.GID(), 4321)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#BaseImageName\", func() {\n\t\t\t\tit(\"return name of base image\", func() {\n\t\t\t\t\th.AssertEq(t, bldr.BaseImageName(), \"base/image\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#New\", func() {\n\t\t\twhen(\"#WithRunImage\", func() {\n\t\t\t\t// Current runImage information in builder image:\n\t\t\t\t// \"stack\": {\"runImage\": {\"image\": \"prev/run\", \"mirrors\": [\"prev/mirror\"]}}\n\t\t\t\tvar newBuilder *builder.Builder\n\t\t\t\tnewRunImage := \"another/run\"\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tvar err error\n\t\t\t\t\tnewBuilder, err = builder.New(builderImage, \"newBuilder/image\", builder.WithRunImage(newRunImage))\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\n\t\t\t\tit(\"overrides the run image metadata (which becomes run.toml)\", func() {\n\t\t\t\t\t// RunImages() returns Stacks + RunImages metadata.\n\t\t\t\t\tmetadata := newBuilder.RunImages()\n\t\t\t\t\th.AssertTrue(t, len(metadata) == 2)\n\t\t\t\t\tfor _, m := range metadata {\n\t\t\t\t\t\t// Both images must be equal to the expected run-image\n\t\t\t\t\t\th.AssertEq(t, m.Image, newRunImage)\n\t\t\t\t\t\th.AssertEq(t, len(m.Mirrors), 
0)\n\t\t\t\t\t}\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"flatten\", func() {\n\t\tvar (\n\t\t\tbldr         *builder.Builder\n\t\t\tbuilderImage imgutil.Image\n\t\t\tdeps         []buildpack.BuildModule\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", \"4321\"))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.id\", \"some.stack.id\"))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinX\", \"mixinY\", \"build:mixinA\"]`))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\"io.buildpacks.builder.metadata\",\n\t\t\t\t`{\"description\": \"some-description\", \"createdBy\": {\"name\": \"some-name\", \"version\": \"1.2.3\"}, \"buildpacks\": [{\"id\": \"buildpack-1-id\"}, {\"id\": \"buildpack-2-id\"}], \"groups\": [{\"buildpacks\": [{\"id\": \"buildpack-1-id\", \"version\": \"buildpack-1-version\", \"optional\": false}, {\"id\": \"buildpack-2-id\", \"version\": \"buildpack-2-version-1\", \"optional\": true}]}], \"stack\": {\"runImage\": {\"image\": \"prev/run\", \"mirrors\": [\"prev/mirror\"]}}, \"lifecycle\": {\"version\": \"6.6.6\"}}`,\n\t\t\t))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\n\t\t\t\t\"io.buildpacks.buildpack.order\",\n\t\t\t\t`[{\"group\": [{\"id\": \"buildpack-1-id\", \"optional\": false}, {\"id\": \"buildpack-2-id\", \"version\": \"buildpack-2-version-1\", \"optional\": true}]}]`,\n\t\t\t))\n\n\t\t\tbuilderImage = baseImage\n\t\t\tdeps = []buildpack.BuildModule{bp2v1, bp1v2}\n\t\t})\n\n\t\twhen(\"buildpacks to be flattened are NOT defined\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\t\t\t\tbldr, err = builder.New(builderImage, \"some-builder\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t// Let's add the buildpacks\n\t\t\t\tbldr.AddBuildpacks(bp1v1, deps)\n\t\t\t})\n\n\t\t\twhen(\"#FlattenedModules\", func() {\n\t\t\t\tit(\"it return an empty array\", func() 
{\n\t\t\t\t\th.AssertEq(t, len(bldr.FlattenedModules(buildpack.KindBuildpack)), 0)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#AllModules\", func() {\n\t\t\t\tit(\"it returns each buildpack individually\", func() {\n\t\t\t\t\th.AssertEq(t, len(bldr.AllModules(buildpack.KindBuildpack)), 3)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#ShouldFlatten\", func() {\n\t\t\t\tit(\"it returns false for each buildpack\", func() {\n\t\t\t\t\th.AssertFalse(t, bldr.ShouldFlatten(bp1v1))\n\t\t\t\t\th.AssertFalse(t, bldr.ShouldFlatten(bp2v1))\n\t\t\t\t\th.AssertFalse(t, bldr.ShouldFlatten(bp1v2))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpacks to be flattened are defined\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\t\t\t\tflattenModules, err := buildpack.ParseFlattenBuildModules([]string{\"buildpack-1-id@buildpack-1-version-1,buildpack-1-id@buildpack-1-version-2,buildpack-2-id@buildpack-2-version-1\"})\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tbldr, err = builder.New(builderImage, \"some-builder\", builder.WithFlattened(flattenModules))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t// Let's add the buildpacks\n\t\t\t\tbldr.AddBuildpacks(bp1v1, deps)\n\t\t\t})\n\n\t\t\twhen(\"#FlattenedModules\", func() {\n\t\t\t\tit(\"it return one array with all buildpacks on it\", func() {\n\t\t\t\t\th.AssertEq(t, len(bldr.FlattenedModules(buildpack.KindBuildpack)), 1)\n\t\t\t\t\th.AssertEq(t, len(bldr.FlattenedModules(buildpack.KindBuildpack)[0]), 3)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#AllModules\", func() {\n\t\t\t\tit(\"it returns each buildpack individually\", func() {\n\t\t\t\t\th.AssertEq(t, len(bldr.AllModules(buildpack.KindBuildpack)), 3)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#ShouldFlatten\", func() {\n\t\t\t\tit(\"it returns true for each buildpack\", func() {\n\t\t\t\t\th.AssertTrue(t, bldr.ShouldFlatten(bp1v1))\n\t\t\t\t\th.AssertTrue(t, bldr.ShouldFlatten(bp2v1))\n\t\t\t\t\th.AssertTrue(t, bldr.ShouldFlatten(bp1v2))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"labels\", 
func() {\n\t\tvar (\n\t\t\tcustomLabels, imageLabels map[string]string\n\t\t\terr                       error\n\t\t)\n\t\tit.Before(func() {\n\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\th.AssertNil(t, baseImage.SetEnv(\"CNB_GROUP_ID\", \"4321\"))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.id\", \"some.stack.id\"))\n\t\t\th.AssertNil(t, baseImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinX\", \"mixinY\", \"build:mixinA\"]`))\n\t\t})\n\n\t\tit.After(func() {\n\t\t\th.AssertNilE(t, baseImage.Cleanup())\n\t\t})\n\n\t\tit(\"should set labels to the image\", func() {\n\t\t\tcustomLabels = map[string]string{\"test.label.one\": \"1\", \"test.label.two\": \"2\"}\n\t\t\tsubject, err = builder.New(baseImage, \"some/builder\", builder.WithLabels(customLabels))\n\t\t\th.AssertNil(t, err)\n\n\t\t\timageLabels, err = baseImage.Labels()\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, imageLabels[\"test.label.one\"], \"1\")\n\t\t\th.AssertEq(t, imageLabels[\"test.label.two\"], \"2\")\n\t\t})\n\t})\n}\n\nfunc assertImageHasBPLayer(t *testing.T, image *fakes.Image, bp buildpack.BuildModule) {\n\tt.Helper()\n\n\tdirPath := fmt.Sprintf(\"/cnb/buildpacks/%s/%s\", bp.Descriptor().Info().ID, bp.Descriptor().Info().Version)\n\tlayerTar, err := image.FindLayerWithPath(dirPath)\n\th.AssertNil(t, err)\n\n\th.AssertOnTarEntry(t, layerTar, dirPath,\n\t\th.IsDirectory(),\n\t)\n\n\th.AssertOnTarEntry(t, layerTar, path.Dir(dirPath),\n\t\th.IsDirectory(),\n\t)\n\n\th.AssertOnTarEntry(t, layerTar, dirPath+\"/bin/build\",\n\t\th.ContentEquals(\"build-contents\"),\n\t)\n\n\th.AssertOnTarEntry(t, layerTar, dirPath+\"/bin/detect\",\n\t\th.ContentEquals(\"detect-contents\"),\n\t)\n}\n\nfunc assertImageHasExtLayer(t *testing.T, image *fakes.Image, ext buildpack.BuildModule) {\n\tt.Helper()\n\n\tdirPath := fmt.Sprintf(\"/cnb/extensions/%s/%s\", ext.Descriptor().Info().ID, ext.Descriptor().Info().Version)\n\tlayerTar, err := 
image.FindLayerWithPath(dirPath)\n\th.AssertNil(t, err)\n\n\th.AssertOnTarEntry(t, layerTar, dirPath,\n\t\th.IsDirectory(),\n\t)\n\n\th.AssertOnTarEntry(t, layerTar, path.Dir(dirPath),\n\t\th.IsDirectory(),\n\t)\n\n\th.AssertOnTarEntry(t, layerTar, dirPath+\"/bin/generate\",\n\t\th.ContentEquals(\"generate-contents\"),\n\t)\n\n\th.AssertOnTarEntry(t, layerTar, dirPath+\"/bin/detect\",\n\t\th.ContentEquals(\"detect-contents\"),\n\t)\n}\n\nfunc assertImageHasOrderBpLayer(t *testing.T, image *fakes.Image, bp buildpack.BuildModule) {\n\tt.Helper()\n\n\tdirPath := fmt.Sprintf(\"/cnb/buildpacks/%s/%s\", bp.Descriptor().Info().ID, bp.Descriptor().Info().Version)\n\tlayerTar, err := image.FindLayerWithPath(dirPath)\n\th.AssertNil(t, err)\n\n\th.AssertOnTarEntry(t, layerTar, dirPath,\n\t\th.IsDirectory(),\n\t)\n\n\th.AssertOnTarEntry(t, layerTar, path.Dir(dirPath),\n\t\th.IsDirectory(),\n\t)\n}\n"
  },
  {
    "path": "internal/builder/descriptor.go",
    "content": "package builder\n\nimport (\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/pkg/errors\"\n)\n\n// LifecycleDescriptor contains information described in the lifecycle.toml\ntype LifecycleDescriptor struct {\n\tInfo LifecycleInfo `toml:\"lifecycle\"`\n\t// Deprecated: Use `LifecycleAPIs` instead\n\tAPI  LifecycleAPI  `toml:\"api\"`\n\tAPIs LifecycleAPIs `toml:\"apis\"`\n}\n\n// LifecycleInfo contains information about the lifecycle\ntype LifecycleInfo struct {\n\tVersion *Version `toml:\"version\" json:\"version\" yaml:\"version\"`\n}\n\n// LifecycleAPI describes which API versions the lifecycle satisfies\ntype LifecycleAPI struct {\n\tBuildpackVersion *api.Version `toml:\"buildpack\" json:\"buildpack\"`\n\tPlatformVersion  *api.Version `toml:\"platform\" json:\"platform\"`\n}\n\n// LifecycleAPIs describes the supported API versions per specification\ntype LifecycleAPIs struct {\n\tBuildpack APIVersions `toml:\"buildpack\" json:\"buildpack\"`\n\tPlatform  APIVersions `toml:\"platform\" json:\"platform\"`\n}\n\ntype APISet []*api.Version\n\nfunc (a APISet) search(comp func(prevMatch, value *api.Version) bool) *api.Version {\n\tvar match *api.Version\n\tfor _, version := range a {\n\t\tswitch {\n\t\tcase version == nil:\n\t\t\tcontinue\n\t\tcase match == nil:\n\t\t\tmatch = version\n\t\tcase comp(match, version):\n\t\t\tmatch = version\n\t\t}\n\t}\n\n\treturn match\n}\n\nfunc (a APISet) Earliest() *api.Version {\n\treturn a.search(func(prevMatch, value *api.Version) bool { return value.Compare(prevMatch) < 0 })\n}\n\nfunc (a APISet) Latest() *api.Version {\n\treturn a.search(func(prevMatch, value *api.Version) bool { return value.Compare(prevMatch) > 0 })\n}\n\nfunc (a APISet) AsStrings() []string {\n\tverStrings := make([]string, len(a))\n\tfor i, version := range a {\n\t\tverStrings[i] = version.String()\n\t}\n\n\treturn verStrings\n}\n\n// APIVersions describes the supported API versions\ntype APIVersions 
struct {\n\tDeprecated APISet `toml:\"deprecated\" json:\"deprecated\" yaml:\"deprecated\"`\n\tSupported  APISet `toml:\"supported\" json:\"supported\" yaml:\"supported\"`\n}\n\n// ParseDescriptor parses LifecycleDescriptor from toml formatted string.\nfunc ParseDescriptor(contents string) (LifecycleDescriptor, error) {\n\tdescriptor := LifecycleDescriptor{}\n\t_, err := toml.Decode(contents, &descriptor)\n\tif err != nil {\n\t\treturn descriptor, errors.Wrap(err, \"decoding descriptor\")\n\t}\n\n\treturn descriptor, nil\n}\n\n// CompatDescriptor provides compatibility by mapping new fields to old and vice-versa\nfunc CompatDescriptor(descriptor LifecycleDescriptor) LifecycleDescriptor {\n\tif len(descriptor.APIs.Buildpack.Supported) != 0 || len(descriptor.APIs.Platform.Supported) != 0 {\n\t\t// select earliest value for deprecated parameters\n\t\tif len(descriptor.APIs.Buildpack.Supported) != 0 {\n\t\t\tdescriptor.API.BuildpackVersion =\n\t\t\t\tappend(descriptor.APIs.Buildpack.Deprecated, descriptor.APIs.Buildpack.Supported...).Earliest()\n\t\t}\n\t\tif len(descriptor.APIs.Platform.Supported) != 0 {\n\t\t\tdescriptor.API.PlatformVersion =\n\t\t\t\tappend(descriptor.APIs.Platform.Deprecated, descriptor.APIs.Platform.Supported...).Earliest()\n\t\t}\n\t} else if descriptor.API.BuildpackVersion != nil && descriptor.API.PlatformVersion != nil {\n\t\t// fill supported with deprecated field\n\t\tdescriptor.APIs = LifecycleAPIs{\n\t\t\tBuildpack: APIVersions{\n\t\t\t\tSupported: APISet{descriptor.API.BuildpackVersion},\n\t\t\t},\n\t\t\tPlatform: APIVersions{\n\t\t\t\tSupported: APISet{descriptor.API.PlatformVersion},\n\t\t\t},\n\t\t}\n\t}\n\n\treturn descriptor\n}\n"
  },
  {
    "path": "internal/builder/descriptor_test.go",
    "content": "package builder_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestDescriptor(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder\", testDescriptor, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testDescriptor(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"CompatDescriptor\", func() {\n\t\twhen(\"missing apis\", func() {\n\t\t\tit(\"makes a lifecycle from a blob\", func() {\n\t\t\t\tdescriptor := builder.CompatDescriptor(builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{},\n\t\t\t\t\tAPI: builder.LifecycleAPI{\n\t\t\t\t\t\tBuildpackVersion: api.MustParse(\"0.2\"),\n\t\t\t\t\t\tPlatformVersion:  api.MustParse(\"0.3\"),\n\t\t\t\t\t},\n\t\t\t\t})\n\n\t\t\t\th.AssertEq(t, descriptor.API.BuildpackVersion.String(), \"0.2\")\n\t\t\t\th.AssertEq(t, descriptor.API.PlatformVersion.String(), \"0.3\")\n\n\t\t\t\t// fill supported with deprecated field\n\t\t\t\th.AssertEq(t, descriptor.APIs.Buildpack.Deprecated.AsStrings(), []string{})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Buildpack.Supported.AsStrings(), []string{\"0.2\"})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Platform.Deprecated.AsStrings(), []string{})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Platform.Supported.AsStrings(), []string{\"0.3\"})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"missing api\", func() {\n\t\t\tit(\"sets lowest value supported\", func() {\n\t\t\t\tdescriptor := builder.CompatDescriptor(builder.LifecycleDescriptor{\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tSupported: builder.APISet{api.MustParse(\"0.2\"), api.MustParse(\"0.3\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tSupported: 
builder.APISet{api.MustParse(\"1.2\"), api.MustParse(\"2.3\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t})\n\n\t\t\t\th.AssertEq(t, descriptor.APIs.Buildpack.Deprecated.AsStrings(), []string{})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Buildpack.Supported.AsStrings(), []string{\"0.2\", \"0.3\"})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Platform.Deprecated.AsStrings(), []string{})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Platform.Supported.AsStrings(), []string{\"1.2\", \"2.3\"})\n\n\t\t\t\t// select lowest value for deprecated parameters\n\t\t\t\th.AssertEq(t, descriptor.API.BuildpackVersion.String(), \"0.2\")\n\t\t\t\th.AssertEq(t, descriptor.API.PlatformVersion.String(), \"1.2\")\n\t\t\t})\n\n\t\t\tit(\"sets lowest value supported + deprecated\", func() {\n\t\t\t\tdescriptor := builder.CompatDescriptor(builder.LifecycleDescriptor{\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"0.1\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"0.2\"), api.MustParse(\"0.3\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"1.1\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"1.2\"), api.MustParse(\"2.3\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t})\n\n\t\t\t\th.AssertEq(t, descriptor.APIs.Buildpack.Deprecated.AsStrings(), []string{\"0.1\"})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Buildpack.Supported.AsStrings(), []string{\"0.2\", \"0.3\"})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Platform.Deprecated.AsStrings(), []string{\"1.1\"})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Platform.Supported.AsStrings(), []string{\"1.2\", \"2.3\"})\n\n\t\t\t\t// select lowest value for deprecated parameters\n\t\t\t\th.AssertEq(t, descriptor.API.BuildpackVersion.String(), \"0.1\")\n\t\t\t\th.AssertEq(t, descriptor.API.PlatformVersion.String(), \"1.1\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"missing api + apis\", func() 
{\n\t\t\tit(\"makes a lifecycle from a blob\", func() {\n\t\t\t\tdescriptor := builder.CompatDescriptor(builder.LifecycleDescriptor{})\n\n\t\t\t\th.AssertNil(t, descriptor.API.BuildpackVersion)\n\t\t\t\th.AssertNil(t, descriptor.API.PlatformVersion)\n\n\t\t\t\t// fill supported with deprecated field\n\t\t\t\th.AssertEq(t, descriptor.APIs.Buildpack.Deprecated.AsStrings(), []string{})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Buildpack.Supported.AsStrings(), []string{})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Platform.Deprecated.AsStrings(), []string{})\n\t\t\t\th.AssertEq(t, descriptor.APIs.Platform.Supported.AsStrings(), []string{})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"Earliest\", func() {\n\t\tit(\"returns lowest value\", func() {\n\t\t\th.AssertEq(\n\t\t\t\tt,\n\t\t\t\tbuilder.APISet{api.MustParse(\"2.1\"), api.MustParse(\"0.1\"), api.MustParse(\"1.1\")}.Earliest().String(),\n\t\t\t\t\"0.1\",\n\t\t\t)\n\t\t})\n\t})\n\n\twhen(\"Latest\", func() {\n\t\tit(\"returns highest value\", func() {\n\t\t\th.AssertEq(\n\t\t\t\tt,\n\t\t\t\tbuilder.APISet{api.MustParse(\"1.1\"), api.MustParse(\"2.1\"), api.MustParse(\"0.1\")}.Latest().String(),\n\t\t\t\t\"2.1\",\n\t\t\t)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/builder/detection_order_calculator.go",
    "content": "package builder\n\nimport (\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype DetectionOrderCalculator struct{}\n\nfunc NewDetectionOrderCalculator() *DetectionOrderCalculator {\n\treturn &DetectionOrderCalculator{}\n}\n\ntype detectionOrderRecurser struct {\n\tlayers   dist.ModuleLayers\n\tmaxDepth int\n}\n\nfunc newDetectionOrderRecurser(layers dist.ModuleLayers, maxDepth int) *detectionOrderRecurser {\n\treturn &detectionOrderRecurser{\n\t\tlayers:   layers,\n\t\tmaxDepth: maxDepth,\n\t}\n}\n\nfunc (c *DetectionOrderCalculator) Order(\n\torder dist.Order,\n\tlayers dist.ModuleLayers,\n\tmaxDepth int,\n) (pubbldr.DetectionOrder, error) {\n\trecurser := newDetectionOrderRecurser(layers, maxDepth)\n\n\treturn recurser.detectionOrderFromOrder(order, dist.ModuleRef{}, 0, map[string]interface{}{}), nil\n}\n\nfunc (r *detectionOrderRecurser) detectionOrderFromOrder(\n\torder dist.Order,\n\tparentBuildpack dist.ModuleRef,\n\tcurrentDepth int,\n\tvisited map[string]interface{},\n) pubbldr.DetectionOrder {\n\tvar detectionOrder pubbldr.DetectionOrder\n\tfor _, orderEntry := range order {\n\t\tvisitedCopy := copyMap(visited)\n\t\tgroupDetectionOrder := r.detectionOrderFromGroup(orderEntry.Group, currentDepth, visitedCopy)\n\n\t\tdetectionOrderEntry := pubbldr.DetectionOrderEntry{\n\t\t\tModuleRef:           parentBuildpack,\n\t\t\tGroupDetectionOrder: groupDetectionOrder,\n\t\t}\n\n\t\tdetectionOrder = append(detectionOrder, detectionOrderEntry)\n\t}\n\n\treturn detectionOrder\n}\n\nfunc (r *detectionOrderRecurser) detectionOrderFromGroup(\n\tgroup []dist.ModuleRef,\n\tcurrentDepth int,\n\tvisited map[string]interface{},\n) pubbldr.DetectionOrder {\n\tvar groupDetectionOrder pubbldr.DetectionOrder\n\n\tfor _, bp := range group {\n\t\t_, bpSeen := visited[bp.FullName()]\n\t\tif !bpSeen {\n\t\t\tvisited[bp.FullName()] = true\n\t\t}\n\n\t\tlayer, ok := r.layers.Get(bp.ID, bp.Version)\n\t\tif ok && 
len(layer.Order) > 0 && r.shouldGoDeeper(currentDepth) && !bpSeen {\n\t\t\tgroupOrder := r.detectionOrderFromOrder(layer.Order, bp, currentDepth+1, visited)\n\t\t\tgroupDetectionOrder = append(groupDetectionOrder, groupOrder...)\n\t\t} else {\n\t\t\tgroupDetectionOrderEntry := pubbldr.DetectionOrderEntry{\n\t\t\t\tModuleRef: bp,\n\t\t\t\tCyclical:  bpSeen,\n\t\t\t}\n\t\t\tgroupDetectionOrder = append(groupDetectionOrder, groupDetectionOrderEntry)\n\t\t}\n\t}\n\n\treturn groupDetectionOrder\n}\n\nfunc (r *detectionOrderRecurser) shouldGoDeeper(currentDepth int) bool {\n\tif r.maxDepth == pubbldr.OrderDetectionMaxDepth {\n\t\treturn true\n\t}\n\n\tif currentDepth < r.maxDepth {\n\t\treturn true\n\t}\n\n\treturn false\n}\n\nfunc copyMap(toCopy map[string]interface{}) map[string]interface{} {\n\tresult := make(map[string]interface{}, len(toCopy))\n\tfor key := range toCopy {\n\t\tresult[key] = true\n\t}\n\n\treturn result\n}\n"
  },
  {
    "path": "internal/builder/detection_order_calculator_test.go",
    "content": "package builder_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n)\n\nfunc TestDetectionOrderCalculator(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testDetectionOrderCalculator\", testDetectionOrderCalculator, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testDetectionOrderCalculator(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"Order\", func() {\n\t\tvar (\n\t\t\tassert = h.NewAssertionManager(t)\n\n\t\t\ttestBuildpackOne = dist.ModuleInfo{\n\t\t\t\tID:      \"test.buildpack\",\n\t\t\t\tVersion: \"test.buildpack.version\",\n\t\t\t}\n\t\t\ttestBuildpackTwo = dist.ModuleInfo{\n\t\t\t\tID:      \"test.buildpack.2\",\n\t\t\t\tVersion: \"test.buildpack.2.version\",\n\t\t\t}\n\t\t\ttestTopNestedBuildpack = dist.ModuleInfo{\n\t\t\t\tID:      \"test.top.nested\",\n\t\t\t\tVersion: \"test.top.nested.version\",\n\t\t\t}\n\t\t\ttestLevelOneNestedBuildpack = dist.ModuleInfo{\n\t\t\t\tID:      \"test.nested.level.one\",\n\t\t\t\tVersion: \"test.nested.level.one.version\",\n\t\t\t}\n\t\t\ttestLevelOneNestedBuildpackTwo = dist.ModuleInfo{\n\t\t\t\tID:      \"test.nested.level.one.two\",\n\t\t\t\tVersion: \"test.nested.level.one.two.version\",\n\t\t\t}\n\t\t\ttestLevelOneNestedBuildpackThree = dist.ModuleInfo{\n\t\t\t\tID:      \"test.nested.level.one.three\",\n\t\t\t\tVersion: \"test.nested.level.one.three.version\",\n\t\t\t}\n\t\t\ttestLevelTwoNestedBuildpack = dist.ModuleInfo{\n\t\t\t\tID:      \"test.nested.level.two\",\n\t\t\t\tVersion: \"test.nested.level.two.version\",\n\t\t\t}\n\t\t\ttopLevelOrder = dist.Order{\n\t\t\t\t{\n\t\t\t\t\tGroup: 
[]dist.ModuleRef{\n\t\t\t\t\t\t{ModuleInfo: testBuildpackOne},\n\t\t\t\t\t\t{ModuleInfo: testBuildpackTwo},\n\t\t\t\t\t\t{ModuleInfo: testTopNestedBuildpack},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t\tbuildpackLayers = dist.ModuleLayers{\n\t\t\t\t\"test.buildpack\": {\n\t\t\t\t\t\"test.buildpack.version\": dist.ModuleLayerInfo{\n\t\t\t\t\t\tAPI:         api.MustParse(\"0.2\"),\n\t\t\t\t\t\tLayerDiffID: \"layer:diff\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"test.top.nested\": {\n\t\t\t\t\t\"test.top.nested.version\": dist.ModuleLayerInfo{\n\t\t\t\t\t\tAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{ModuleInfo: testLevelOneNestedBuildpack},\n\t\t\t\t\t\t\t\t\t{ModuleInfo: testLevelOneNestedBuildpackTwo},\n\t\t\t\t\t\t\t\t\t{ModuleInfo: testLevelOneNestedBuildpackThree},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"layer:diff\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"test.nested.level.one\": {\n\t\t\t\t\t\"test.nested.level.one.version\": dist.ModuleLayerInfo{\n\t\t\t\t\t\tAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{ModuleInfo: testLevelTwoNestedBuildpack},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"layer:diff\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"test.nested.level.one.three\": {\n\t\t\t\t\t\"test.nested.level.one.three.version\": dist.ModuleLayerInfo{\n\t\t\t\t\t\tAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{ModuleInfo: testLevelTwoNestedBuildpack},\n\t\t\t\t\t\t\t\t\t{ModuleInfo: testTopNestedBuildpack},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"layer:diff\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t)\n\n\t\twhen(\"called with no depth\", func() {\n\t\t\tit(\"returns detection 
order with top level order of buildpacks\", func() {\n\t\t\t\tcalculator := builder.NewDetectionOrderCalculator()\n\t\t\t\torder, err := calculator.Order(topLevelOrder, buildpackLayers, pubbldr.OrderDetectionNone)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\texpectedOrder := pubbldr.DetectionOrder{\n\t\t\t\t\t{\n\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testBuildpackOne}},\n\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testBuildpackTwo}},\n\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testTopNestedBuildpack}},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tassert.Equal(order, expectedOrder)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with max depth\", func() {\n\t\t\tit(\"returns detection order for nested buildpacks\", func() {\n\t\t\t\tcalculator := builder.NewDetectionOrderCalculator()\n\t\t\t\torder, err := calculator.Order(topLevelOrder, buildpackLayers, pubbldr.OrderDetectionMaxDepth)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\texpectedOrder := pubbldr.DetectionOrder{\n\t\t\t\t\t{\n\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testBuildpackOne}},\n\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testBuildpackTwo}},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testTopNestedBuildpack},\n\t\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testLevelOneNestedBuildpack},\n\t\t\t\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testLevelTwoNestedBuildpack}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testLevelOneNestedBuildpackTwo}},\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testLevelOneNestedBuildpackThree},\n\t\t\t\t\t\t\t\t\t\tGroupDetectionOrder: 
pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testLevelTwoNestedBuildpack},\n\t\t\t\t\t\t\t\t\t\t\t\tCyclical:  false,\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testTopNestedBuildpack},\n\t\t\t\t\t\t\t\t\t\t\t\tCyclical:  true,\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tassert.Equal(order, expectedOrder)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"called with a depth of 1\", func() {\n\t\t\tit(\"returns detection order for first level of nested buildpacks\", func() {\n\t\t\t\tcalculator := builder.NewDetectionOrderCalculator()\n\t\t\t\torder, err := calculator.Order(topLevelOrder, buildpackLayers, 1)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\texpectedOrder := pubbldr.DetectionOrder{\n\t\t\t\t\t{\n\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testBuildpackOne}},\n\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testBuildpackTwo}},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testTopNestedBuildpack},\n\t\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testLevelOneNestedBuildpack}},\n\t\t\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testLevelOneNestedBuildpackTwo}},\n\t\t\t\t\t\t\t\t\t{ModuleRef: dist.ModuleRef{ModuleInfo: testLevelOneNestedBuildpackThree}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tassert.Equal(order, expectedOrder)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a buildpack is referenced in a sub detection group\", func() {\n\t\t\tit(\"marks the buildpack as cyclic and doesn't attempt to calculate that buildpack's order\", func() {\n\t\t\t\tcyclicBuildpackLayers := dist.ModuleLayers{\n\t\t\t\t\t\"test.top.nested\": 
{\n\t\t\t\t\t\t\"test.top.nested.version\": dist.ModuleLayerInfo{\n\t\t\t\t\t\t\tAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t{ModuleInfo: testLevelOneNestedBuildpack},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tLayerDiffID: \"layer:diff\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t\"test.nested.level.one\": {\n\t\t\t\t\t\t\"test.nested.level.one.version\": dist.ModuleLayerInfo{\n\t\t\t\t\t\t\tAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t{ModuleInfo: testTopNestedBuildpack},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tLayerDiffID: \"layer:diff\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tcyclicOrder := dist.Order{\n\t\t\t\t\t{\n\t\t\t\t\t\tGroup: []dist.ModuleRef{{ModuleInfo: testTopNestedBuildpack}},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tcalculator := builder.NewDetectionOrderCalculator()\n\t\t\t\torder, err := calculator.Order(cyclicOrder, cyclicBuildpackLayers, pubbldr.OrderDetectionMaxDepth)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\texpectedOrder := pubbldr.DetectionOrder{\n\t\t\t\t\t{\n\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testTopNestedBuildpack},\n\t\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testLevelOneNestedBuildpack},\n\t\t\t\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: testTopNestedBuildpack,\n\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\tCyclical: 
true,\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tassert.Equal(order, expectedOrder)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/builder/fakes/fake_detection_calculator.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype FakeDetectionCalculator struct {\n\tReturnForOrder builder.DetectionOrder\n\n\tErrorForOrder error\n\n\tReceivedTopOrder dist.Order\n\tReceivedLayers   dist.ModuleLayers\n\tReceivedDepth    int\n}\n\nfunc (c *FakeDetectionCalculator) Order(\n\ttopOrder dist.Order,\n\tlayers dist.ModuleLayers,\n\tdepth int,\n) (builder.DetectionOrder, error) {\n\tc.ReceivedTopOrder = topOrder\n\tc.ReceivedLayers = layers\n\tc.ReceivedDepth = depth\n\n\treturn c.ReturnForOrder, c.ErrorForOrder\n}\n"
  },
  {
    "path": "internal/builder/fakes/fake_inspectable.go",
    "content": "package fakes\n\ntype FakeInspectable struct {\n\tReturnForLabel string\n\n\tErrorForLabel error\n\n\tReceivedName string\n}\n\nfunc (f *FakeInspectable) Label(name string) (string, error) {\n\tf.ReceivedName = name\n\n\treturn f.ReturnForLabel, f.ErrorForLabel\n}\n"
  },
  {
    "path": "internal/builder/fakes/fake_inspectable_fetcher.go",
    "content": "package fakes\n\nimport (\n\t\"context\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype FakeInspectableFetcher struct {\n\tInspectableToReturn *FakeInspectable\n\tErrorToReturn       error\n\n\tCallCount int\n\n\tReceivedName       string\n\tReceivedDaemon     bool\n\tReceivedPullPolicy image.PullPolicy\n}\n\nfunc (f *FakeInspectableFetcher) Fetch(ctx context.Context, name string, options image.FetchOptions) (builder.Inspectable, error) {\n\tf.CallCount++\n\n\tf.ReceivedName = name\n\tf.ReceivedDaemon = options.Daemon\n\tf.ReceivedPullPolicy = options.PullPolicy\n\n\treturn f.InspectableToReturn, f.ErrorToReturn\n}\n"
  },
  {
    "path": "internal/builder/fakes/fake_label_manager.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype FakeLabelManager struct {\n\tReturnForMetadata        builder.Metadata\n\tReturnForStackID         string\n\tReturnForMixins          []string\n\tReturnForOrder           dist.Order\n\tReturnForBuildpackLayers dist.ModuleLayers\n\tReturnForOrderExtensions dist.Order\n\n\tErrorForMetadata        error\n\tErrorForStackID         error\n\tErrorForMixins          error\n\tErrorForOrder           error\n\tErrorForBuildpackLayers error\n\tErrorForOrderExtensions error\n}\n\nfunc (m *FakeLabelManager) Metadata() (builder.Metadata, error) {\n\treturn m.ReturnForMetadata, m.ErrorForMetadata\n}\n\nfunc (m *FakeLabelManager) StackID() (string, error) {\n\treturn m.ReturnForStackID, m.ErrorForStackID\n}\n\nfunc (m *FakeLabelManager) Mixins() ([]string, error) {\n\treturn m.ReturnForMixins, m.ErrorForMixins\n}\n\nfunc (m *FakeLabelManager) Order() (dist.Order, error) {\n\treturn m.ReturnForOrder, m.ErrorForOrder\n}\n\nfunc (m *FakeLabelManager) OrderExtensions() (dist.Order, error) {\n\treturn m.ReturnForOrderExtensions, m.ErrorForOrderExtensions\n}\n\nfunc (m *FakeLabelManager) BuildpackLayers() (dist.ModuleLayers, error) {\n\treturn m.ReturnForBuildpackLayers, m.ErrorForBuildpackLayers\n}\n"
  },
  {
    "path": "internal/builder/fakes/fake_label_manager_factory.go",
    "content": "package fakes\n\nimport \"github.com/buildpacks/pack/internal/builder\"\n\ntype FakeLabelManagerFactory struct {\n\tBuilderLabelManagerToReturn builder.LabelInspector\n\n\tReceivedInspectable builder.Inspectable\n}\n\nfunc NewFakeLabelManagerFactory(builderLabelManagerToReturn builder.LabelInspector) *FakeLabelManagerFactory {\n\treturn &FakeLabelManagerFactory{\n\t\tBuilderLabelManagerToReturn: builderLabelManagerToReturn,\n\t}\n}\n\nfunc (f *FakeLabelManagerFactory) BuilderLabelManager(inspectable builder.Inspectable) builder.LabelInspector {\n\tf.ReceivedInspectable = inspectable\n\n\treturn f.BuilderLabelManagerToReturn\n}\n"
  },
  {
    "path": "internal/builder/image_fetcher_wrapper.go",
    "content": "package builder\n\nimport (\n\t\"context\"\n\n\t\"github.com/buildpacks/imgutil\"\n\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype ImageFetcher interface {\n\t// Fetch fetches an image by resolving it both remotely and locally depending on provided parameters.\n\t// If daemon is true, it will return a `local.Image`. Pull, applicable only when daemon is true, will\n\t// attempt to pull a remote image first.\n\tFetch(ctx context.Context, name string, options image.FetchOptions) (imgutil.Image, error)\n\n\t// CheckReadAccess verifies if an image is accessible with read permissions.\n\t// When FetchOptions.Daemon is true and the image doesn't exist in the daemon,\n\t// the behavior is dictated by the pull policy:\n\t//   - PullNever: returns false\n\t//   - PullAlways or PullIfNotPresent: it will check read access for the remote image.\n\t// When FetchOptions.Daemon is false it will check read access for the remote image.\n\tCheckReadAccess(repo string, options image.FetchOptions) bool\n}\n\ntype ImageFetcherWrapper struct {\n\tfetcher ImageFetcher\n}\n\nfunc NewImageFetcherWrapper(fetcher ImageFetcher) *ImageFetcherWrapper {\n\treturn &ImageFetcherWrapper{\n\t\tfetcher: fetcher,\n\t}\n}\n\nfunc (w *ImageFetcherWrapper) Fetch(\n\tctx context.Context,\n\tname string,\n\toptions image.FetchOptions,\n) (Inspectable, error) {\n\treturn w.fetcher.Fetch(ctx, name, options)\n}\n\nfunc (w *ImageFetcherWrapper) CheckReadAccessValidator(repo string, options image.FetchOptions) bool {\n\treturn w.fetcher.CheckReadAccess(repo, options)\n}\n"
  },
  {
    "path": "internal/builder/inspect.go",
    "content": "package builder\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"sort\"\n\t\"strings\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype Info struct {\n\tDescription     string\n\tStackID         string\n\tMixins          []string\n\tRunImages       []pubbldr.RunImageConfig\n\tBuildpacks      []dist.ModuleInfo\n\tOrder           pubbldr.DetectionOrder\n\tBuildpackLayers dist.ModuleLayers\n\tLifecycle       LifecycleDescriptor\n\tCreatedBy       CreatorMetadata\n\tExtensions      []dist.ModuleInfo\n\tOrderExtensions pubbldr.DetectionOrder\n}\n\ntype Inspectable interface {\n\tLabel(name string) (string, error)\n}\n\ntype InspectableFetcher interface {\n\tFetch(ctx context.Context, name string, options image.FetchOptions) (Inspectable, error)\n}\n\ntype LabelManagerFactory interface {\n\tBuilderLabelManager(inspectable Inspectable) LabelInspector\n}\n\ntype LabelInspector interface {\n\tMetadata() (Metadata, error)\n\tStackID() (string, error)\n\tMixins() ([]string, error)\n\tOrder() (dist.Order, error)\n\tBuildpackLayers() (dist.ModuleLayers, error)\n\tOrderExtensions() (dist.Order, error)\n}\n\ntype DetectionCalculator interface {\n\tOrder(topOrder dist.Order, layers dist.ModuleLayers, depth int) (pubbldr.DetectionOrder, error)\n}\n\ntype Inspector struct {\n\timageFetcher             InspectableFetcher\n\tlabelManagerFactory      LabelManagerFactory\n\tdetectionOrderCalculator DetectionCalculator\n}\n\nfunc NewInspector(fetcher InspectableFetcher, factory LabelManagerFactory, calculator DetectionCalculator) *Inspector {\n\treturn &Inspector{\n\t\timageFetcher:             fetcher,\n\t\tlabelManagerFactory:      factory,\n\t\tdetectionOrderCalculator: calculator,\n\t}\n}\n\nfunc (i *Inspector) Inspect(name string, daemon bool, orderDetectionDepth int) (Info, error) {\n\tinspectable, err := i.imageFetcher.Fetch(context.Background(), name, 
image.FetchOptions{Daemon: daemon, PullPolicy: image.PullNever})\n\tif err != nil {\n\t\treturn Info{}, fmt.Errorf(\"fetching builder image: %w\", err)\n\t}\n\n\tlabelManager := i.labelManagerFactory.BuilderLabelManager(inspectable)\n\n\tmetadata, err := labelManager.Metadata()\n\tif err != nil {\n\t\treturn Info{}, fmt.Errorf(\"reading image metadata: %w\", err)\n\t}\n\n\tstackID, _ := labelManager.StackID() // ignore error because stack is optional\n\n\tmixins, err := labelManager.Mixins()\n\tif err != nil {\n\t\treturn Info{}, fmt.Errorf(\"reading image mixins: %w\", err)\n\t}\n\n\tvar commonMixins, buildMixins []string\n\tcommonMixins = []string{}\n\tfor _, mixin := range mixins {\n\t\tif strings.HasPrefix(mixin, \"build:\") {\n\t\t\tbuildMixins = append(buildMixins, mixin)\n\t\t} else {\n\t\t\tcommonMixins = append(commonMixins, mixin)\n\t\t}\n\t}\n\n\torderExtensions, err := labelManager.OrderExtensions()\n\tif err != nil {\n\t\treturn Info{}, fmt.Errorf(\"reading image order extensions: %w\", err)\n\t}\n\n\torder, err := labelManager.Order()\n\tif err != nil {\n\t\treturn Info{}, fmt.Errorf(\"reading image order: %w\", err)\n\t}\n\n\tlayers, err := labelManager.BuildpackLayers()\n\tif err != nil {\n\t\treturn Info{}, fmt.Errorf(\"reading image buildpack layers: %w\", err)\n\t}\n\n\tdetectionOrder, err := i.detectionOrderCalculator.Order(order, layers, orderDetectionDepth)\n\tif err != nil {\n\t\treturn Info{}, fmt.Errorf(\"calculating detection order: %w\", err)\n\t}\n\n\tdetectionOrderExtensions := orderExttoPubbldrDetectionOrderExt(orderExtensions)\n\n\tlifecycle := CompatDescriptor(LifecycleDescriptor{\n\t\tInfo: LifecycleInfo{Version: metadata.Lifecycle.Version},\n\t\tAPI:  metadata.Lifecycle.API,\n\t\tAPIs: metadata.Lifecycle.APIs,\n\t})\n\n\tvar runImages []pubbldr.RunImageConfig\n\tfor _, ri := range metadata.RunImages {\n\t\trunImages = append(runImages, pubbldr.RunImageConfig{\n\t\t\tImage:   ri.Image,\n\t\t\tMirrors: 
ri.Mirrors,\n\t\t})\n\t}\n\taddStackRunImage := true\n\tfor _, ri := range runImages {\n\t\tif ri.Image == metadata.Stack.RunImage.Image {\n\t\t\taddStackRunImage = false\n\t\t}\n\t}\n\tif addStackRunImage && metadata.Stack.RunImage.Image != \"\" {\n\t\trunImages = append(runImages, pubbldr.RunImageConfig{\n\t\t\tImage:   metadata.Stack.RunImage.Image,\n\t\t\tMirrors: metadata.Stack.RunImage.Mirrors,\n\t\t})\n\t}\n\n\treturn Info{\n\t\tDescription:     metadata.Description,\n\t\tStackID:         stackID,\n\t\tMixins:          append(commonMixins, buildMixins...),\n\t\tRunImages:       runImages,\n\t\tBuildpacks:      sortBuildPacksByID(uniqueBuildpacks(metadata.Buildpacks)),\n\t\tOrder:           detectionOrder,\n\t\tBuildpackLayers: layers,\n\t\tLifecycle:       lifecycle,\n\t\tCreatedBy:       metadata.CreatedBy,\n\t\tExtensions:      metadata.Extensions,\n\t\tOrderExtensions: detectionOrderExtensions,\n\t}, nil\n}\n\nfunc orderExttoPubbldrDetectionOrderExt(orderExt dist.Order) pubbldr.DetectionOrder {\n\tvar detectionOrderExt pubbldr.DetectionOrder\n\n\tfor _, orderEntry := range orderExt {\n\t\tvar detectionOrderEntry pubbldr.DetectionOrderEntry\n\t\tfor _, moduleRef := range orderEntry.Group {\n\t\t\tdetectionOrderEntry.ModuleRef = moduleRef\n\t\t}\n\t\tdetectionOrderExt = append(detectionOrderExt, detectionOrderEntry)\n\t}\n\n\treturn detectionOrderExt\n}\n\nfunc uniqueBuildpacks(buildpacks []dist.ModuleInfo) []dist.ModuleInfo {\n\tfoundBuildpacks := map[string]interface{}{}\n\tvar uniqueBuildpacks []dist.ModuleInfo\n\n\tfor _, bp := range buildpacks {\n\t\t_, ok := foundBuildpacks[bp.FullName()]\n\t\tif !ok {\n\t\t\tuniqueBuildpacks = append(uniqueBuildpacks, bp)\n\t\t\tfoundBuildpacks[bp.FullName()] = true\n\t\t}\n\t}\n\n\treturn uniqueBuildpacks\n}\n\nfunc sortBuildPacksByID(buildpacks []dist.ModuleInfo) []dist.ModuleInfo {\n\tsort.Slice(buildpacks, func(i int, j int) bool {\n\t\tif buildpacks[i].ID == buildpacks[j].ID {\n\t\t\treturn buildpacks[i].Version 
< buildpacks[j].Version\n\t\t}\n\n\t\treturn buildpacks[i].ID < buildpacks[j].ID\n\t})\n\n\treturn buildpacks\n}\n"
  },
  {
    "path": "internal/builder/inspect_test.go",
    "content": "package builder_test\n\nimport (\n\t\"errors\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/fakes\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n)\n\nconst (\n\ttestBuilderName        = \"test-builder\"\n\ttestBuilderDescription = \"Test Builder Description\"\n\ttestStackID            = \"test-builder-stack-id\"\n\ttestRunImage           = \"test/run-image\"\n\ttestStackRunImage      = \"test/stack-run-image\"\n)\n\nvar (\n\ttestTopNestedBuildpack = dist.ModuleInfo{\n\t\tID:      \"test.top.nested\",\n\t\tVersion: \"test.top.nested.version\",\n\t}\n\ttestNestedBuildpack = dist.ModuleInfo{\n\t\tID:       \"test.nested\",\n\t\tVersion:  \"test.nested.version\",\n\t\tHomepage: \"http://geocities.com/top-bp\",\n\t}\n\ttestBuildpack = dist.ModuleInfo{\n\t\tID:      \"test.bp.two\",\n\t\tVersion: \"test.bp.two.version\",\n\t}\n\ttestBuildpackVersionTwo = dist.ModuleInfo{\n\t\tID:      \"test.bp.two\",\n\t\tVersion: \"test.bp.two.version-2\",\n\t}\n\ttestBuildpacks = []dist.ModuleInfo{\n\t\ttestBuildpack,\n\t\ttestNestedBuildpack,\n\t\ttestTopNestedBuildpack,\n\t}\n\ttestLifecycleInfo = builder.LifecycleInfo{\n\t\tVersion: builder.VersionMustParse(\"1.2.3\"),\n\t}\n\ttestBuildpackVersions = builder.APIVersions{\n\t\tDeprecated: builder.APISet{api.MustParse(\"0.1\")},\n\t\tSupported:  builder.APISet{api.MustParse(\"1.2\"), api.MustParse(\"1.3\")},\n\t}\n\ttestPlatformVersions = builder.APIVersions{\n\t\tSupported: builder.APISet{api.MustParse(\"2.3\"), api.MustParse(\"2.4\")},\n\t}\n\tinspectTestLifecycle = builder.LifecycleMetadata{\n\t\tLifecycleInfo: testLifecycleInfo,\n\t\tAPIs: 
builder.LifecycleAPIs{\n\t\t\tBuildpack: testBuildpackVersions,\n\t\t\tPlatform:  testPlatformVersions,\n\t\t},\n\t}\n\ttestCreatorData = builder.CreatorMetadata{\n\t\tName:    \"pack\",\n\t\tVersion: \"1.2.3\",\n\t}\n\ttestMetadata = builder.Metadata{\n\t\tDescription: testBuilderDescription,\n\t\tBuildpacks:  testBuildpacks,\n\t\tStack:       testStack,\n\t\tLifecycle:   inspectTestLifecycle,\n\t\tCreatedBy:   testCreatorData,\n\t\tRunImages: []builder.RunImageMetadata{\n\t\t\t{\n\t\t\t\tImage:   testRunImage,\n\t\t\t\tMirrors: testRunImageMirrors,\n\t\t\t},\n\t\t},\n\t}\n\ttestMixins          = []string{\"build:mixinA\", \"mixinX\", \"mixinY\"}\n\texpectedTestMixins  = []string{\"mixinX\", \"mixinY\", \"build:mixinA\"}\n\ttestRunImageMirrors = []string{\"test/first-run-image-mirror\", \"test/second-run-image-mirror\"}\n\ttestStack           = builder.StackMetadata{\n\t\tRunImage: builder.RunImageMetadata{\n\t\t\tImage:   testStackRunImage,\n\t\t\tMirrors: testRunImageMirrors,\n\t\t},\n\t}\n\ttestOrder = dist.Order{\n\t\tdist.OrderEntry{Group: []dist.ModuleRef{\n\t\t\t{ModuleInfo: testBuildpack, Optional: false},\n\t\t}},\n\t\tdist.OrderEntry{Group: []dist.ModuleRef{\n\t\t\t{ModuleInfo: testNestedBuildpack, Optional: false},\n\t\t\t{ModuleInfo: testTopNestedBuildpack, Optional: true},\n\t\t}},\n\t}\n\ttestOrderExtensions = dist.Order{\n\t\tdist.OrderEntry{Group: []dist.ModuleRef{\n\t\t\t{ModuleInfo: testBuildpack, Optional: false},\n\t\t}},\n\t\tdist.OrderEntry{Group: []dist.ModuleRef{\n\t\t\t{ModuleInfo: testNestedBuildpack, Optional: false},\n\t\t\t{ModuleInfo: testTopNestedBuildpack, Optional: true},\n\t\t}},\n\t}\n\ttestLayers = dist.ModuleLayers{\n\t\t\"test.top.nested\": {\n\t\t\t\"test.top.nested.version\": {\n\t\t\t\tAPI:         api.MustParse(\"0.2\"),\n\t\t\t\tOrder:       testOrder,\n\t\t\t\tLayerDiffID: \"sha256:test.top.nested.sha256\",\n\t\t\t\tHomepage:    \"http://geocities.com/top-bp\",\n\t\t\t},\n\t\t},\n\t\t\"test.bp.two\": 
{\n\t\t\t\"test.bp.two.version\": {\n\t\t\t\tAPI:         api.MustParse(\"0.2\"),\n\t\t\t\tStacks:      []dist.Stack{{ID: \"test.stack.id\"}},\n\t\t\t\tLayerDiffID: \"sha256:test.bp.two.sha256\",\n\t\t\t\tHomepage:    \"http://geocities.com/cool-bp\",\n\t\t\t},\n\t\t},\n\t}\n\texpectedTestLifecycle = builder.LifecycleDescriptor{\n\t\tInfo: testLifecycleInfo,\n\t\tAPI: builder.LifecycleAPI{\n\t\t\tBuildpackVersion: api.MustParse(\"0.1\"),\n\t\t\tPlatformVersion:  api.MustParse(\"2.3\"),\n\t\t},\n\t\tAPIs: builder.LifecycleAPIs{\n\t\t\tBuildpack: testBuildpackVersions,\n\t\t\tPlatform:  testPlatformVersions,\n\t\t},\n\t}\n\texpectedDetectionTestOrder = pubbldr.DetectionOrder{\n\t\t{\n\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\tModuleInfo: testBuildpack,\n\t\t\t},\n\t\t},\n\t\t{\n\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\tModuleInfo: testTopNestedBuildpack,\n\t\t\t},\n\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t{\n\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\tModuleInfo: testNestedBuildpack,\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t},\n\t}\n)\n\nfunc TestInspect(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testInspect\", testInspect, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testInspect(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"Inspect\", func() {\n\t\tvar assert = h.NewAssertionManager(t)\n\n\t\tit(\"calls Fetch on inspectableFetcher with expected arguments\", func() {\n\t\t\tfetcher := newDefaultInspectableFetcher()\n\n\t\t\tinspector := builder.NewInspector(fetcher, newDefaultLabelManagerFactory(), newDefaultDetectionCalculator())\n\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(fetcher.CallCount, 1)\n\t\t\tassert.Equal(fetcher.ReceivedName, testBuilderName)\n\t\t\tassert.Equal(fetcher.ReceivedDaemon, true)\n\t\t\tassert.Equal(fetcher.ReceivedPullPolicy, 
image.PullNever)\n\t\t})\n\n\t\tit(\"instantiates a builder label manager with the correct inspectable\", func() {\n\t\t\tinspectable := newNoOpInspectable()\n\n\t\t\tfetcher := &fakes.FakeInspectableFetcher{\n\t\t\t\tInspectableToReturn: inspectable,\n\t\t\t}\n\n\t\t\tlabelManagerFactory := newDefaultLabelManagerFactory()\n\n\t\t\tinspector := builder.NewInspector(fetcher, labelManagerFactory, newDefaultDetectionCalculator())\n\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(labelManagerFactory.ReceivedInspectable, inspectable)\n\t\t})\n\n\t\tit(\"calls `Order` on detectionCalculator with expected arguments\", func() {\n\t\t\tdetectionOrderCalculator := newDefaultDetectionCalculator()\n\n\t\t\tinspector := builder.NewInspector(\n\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\tnewDefaultLabelManagerFactory(),\n\t\t\t\tdetectionOrderCalculator,\n\t\t\t)\n\t\t\t_, err := inspector.Inspect(testBuilderName, true, 3)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(detectionOrderCalculator.ReceivedTopOrder, testOrder)\n\t\t\tassert.Equal(detectionOrderCalculator.ReceivedLayers, testLayers)\n\t\t\tassert.Equal(detectionOrderCalculator.ReceivedDepth, 3)\n\t\t})\n\n\t\tit(\"returns Info object with expected fields\", func() {\n\t\t\tfetcher := newDefaultInspectableFetcher()\n\n\t\t\tinspector := builder.NewInspector(fetcher, newDefaultLabelManagerFactory(), newDefaultDetectionCalculator())\n\t\t\tinfo, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(info.Description, testBuilderDescription)\n\t\t\tassert.Equal(info.StackID, testStackID)\n\t\t\tassert.Equal(info.Mixins, expectedTestMixins)\n\t\t\tassert.Equal(len(info.RunImages), 2)\n\t\t\tassert.Equal(info.RunImages[0].Image, testRunImage)\n\t\t\tassert.Equal(info.RunImages[1].Image, testStackRunImage)\n\t\t\tassert.Equal(info.RunImages[0].Mirrors, 
testRunImageMirrors)\n\t\t\tassert.Equal(info.RunImages[1].Mirrors, testRunImageMirrors)\n\t\t\tassert.Equal(info.Buildpacks, testBuildpacks)\n\t\t\tassert.Equal(info.Order, expectedDetectionTestOrder)\n\t\t\tassert.Equal(info.BuildpackLayers, testLayers)\n\t\t\tassert.Equal(info.Lifecycle, expectedTestLifecycle)\n\t\t\tassert.Equal(info.CreatedBy, testCreatorData)\n\t\t})\n\n\t\tit(\"sorts buildPacks by ID then Version\", func() {\n\t\t\tmetadata := builder.Metadata{\n\t\t\t\tDescription: testBuilderDescription,\n\t\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t\ttestNestedBuildpack,\n\t\t\t\t\ttestBuildpackVersionTwo,\n\t\t\t\t\ttestBuildpack,\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tlabelManager := newLabelManager(returnForMetadata(metadata))\n\t\t\tinspector := builder.NewInspector(\n\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\tnewLabelManagerFactory(labelManager),\n\t\t\t\tnewDefaultDetectionCalculator(),\n\t\t\t)\n\n\t\t\tinfo, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\n\t\t\tassert.Nil(err)\n\t\t\tassert.Equal(info.Buildpacks, []dist.ModuleInfo{testBuildpack, testBuildpackVersionTwo, testNestedBuildpack})\n\t\t})\n\n\t\twhen(\"there are duplicated buildpacks in metadata\", func() {\n\t\t\tit(\"returns deduplicated buildpacks\", func() {\n\t\t\t\tmetadata := builder.Metadata{\n\t\t\t\t\tDescription: testBuilderDescription,\n\t\t\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t\t\ttestTopNestedBuildpack,\n\t\t\t\t\t\ttestNestedBuildpack,\n\t\t\t\t\t\ttestTopNestedBuildpack,\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tlabelManager := newLabelManager(returnForMetadata(metadata))\n\n\t\t\t\tinspector := builder.NewInspector(\n\t\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\t\tnewLabelManagerFactory(labelManager),\n\t\t\t\t\tnewDefaultDetectionCalculator(),\n\t\t\t\t)\n\t\t\t\tinfo, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\n\t\t\t\tassert.Nil(err)\n\t\t\t\tassert.Equal(info.Buildpacks, 
[]dist.ModuleInfo{testNestedBuildpack, testTopNestedBuildpack})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"label manager returns an error for `Metadata`\", func() {\n\t\t\tit(\"returns the wrapped error\", func() {\n\t\t\t\texpectedBaseError := errors.New(\"failed to parse\")\n\n\t\t\t\tlabelManager := newLabelManager(errorForMetadata(expectedBaseError))\n\n\t\t\t\tinspector := builder.NewInspector(\n\t\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\t\tnewLabelManagerFactory(labelManager),\n\t\t\t\t\tnewDefaultDetectionCalculator(),\n\t\t\t\t)\n\t\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\n\t\t\t\tassert.ErrorWithMessage(err, \"reading image metadata: failed to parse\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"label manager returns an error for `StackID`\", func() {\n\t\t\tit(\"does not return the error because stack is optional\", func() {\n\t\t\t\texpectedBaseError := errors.New(\"label not found\")\n\n\t\t\t\tlabelManager := newLabelManager(errorForStackID(expectedBaseError))\n\n\t\t\t\tinspector := builder.NewInspector(\n\t\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\t\tnewLabelManagerFactory(labelManager),\n\t\t\t\t\tnewDefaultDetectionCalculator(),\n\t\t\t\t)\n\t\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\n\t\t\t\tassert.Nil(err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"label manager returns an error for `Mixins`\", func() {\n\t\t\tit(\"returns the wrapped error\", func() {\n\t\t\t\texpectedBaseError := errors.New(\"label not found\")\n\n\t\t\t\tlabelManager := newLabelManager(errorForMixins(expectedBaseError))\n\n\t\t\t\tinspector := builder.NewInspector(\n\t\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\t\tnewLabelManagerFactory(labelManager),\n\t\t\t\t\tnewDefaultDetectionCalculator(),\n\t\t\t\t)\n\t\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\n\t\t\t\tassert.ErrorWithMessage(err, \"reading image mixins: label not found\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"label manager 
returns an error for `Order`\", func() {\n\t\t\tit(\"returns the wrapped error\", func() {\n\t\t\t\texpectedBaseError := errors.New(\"label not found\")\n\n\t\t\t\tlabelManager := newLabelManager(errorForOrder(expectedBaseError))\n\n\t\t\t\tinspector := builder.NewInspector(\n\t\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\t\tnewLabelManagerFactory(labelManager),\n\t\t\t\t\tnewDefaultDetectionCalculator(),\n\t\t\t\t)\n\t\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\n\t\t\t\tassert.ErrorWithMessage(err, \"reading image order: label not found\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"label manager returns an error for `OrderExtensions`\", func() {\n\t\t\tit(\"returns the wrapped error\", func() {\n\t\t\t\texpectedBaseError := errors.New(\"label not found\")\n\n\t\t\t\tlabelManager := newLabelManager(errorForOrderExtensions(expectedBaseError))\n\n\t\t\t\tinspector := builder.NewInspector(\n\t\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\t\tnewLabelManagerFactory(labelManager),\n\t\t\t\t\tnewDefaultDetectionCalculator(),\n\t\t\t\t)\n\t\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\n\t\t\t\tassert.ErrorWithMessage(err, \"reading image order extensions: label not found\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"label manager returns an error for `BuildpackLayers`\", func() {\n\t\t\tit(\"returns the wrapped error\", func() {\n\t\t\t\texpectedBaseError := errors.New(\"label not found\")\n\n\t\t\t\tlabelManager := newLabelManager(errorForBuildpackLayers(expectedBaseError))\n\n\t\t\t\tinspector := builder.NewInspector(\n\t\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\t\tnewLabelManagerFactory(labelManager),\n\t\t\t\t\tnewDefaultDetectionCalculator(),\n\t\t\t\t)\n\t\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionNone)\n\n\t\t\t\tassert.ErrorWithMessage(err, \"reading image buildpack layers: label not found\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"detection calculator returns an error 
for `Order`\", func() {\n\t\t\tit(\"returns the wrapped error\", func() {\n\t\t\t\texpectedBaseError := errors.New(\"couldn't read label\")\n\n\t\t\t\tinspector := builder.NewInspector(\n\t\t\t\t\tnewDefaultInspectableFetcher(),\n\t\t\t\t\tnewDefaultLabelManagerFactory(),\n\t\t\t\t\tnewDetectionCalculator(errorForDetectionOrder(expectedBaseError)),\n\t\t\t\t)\n\t\t\t\t_, err := inspector.Inspect(testBuilderName, true, pubbldr.OrderDetectionMaxDepth)\n\n\t\t\t\tassert.ErrorWithMessage(err, \"calculating detection order: couldn't read label\")\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc newDefaultInspectableFetcher() *fakes.FakeInspectableFetcher {\n\treturn &fakes.FakeInspectableFetcher{\n\t\tInspectableToReturn: newNoOpInspectable(),\n\t}\n}\n\nfunc newNoOpInspectable() *fakes.FakeInspectable {\n\treturn &fakes.FakeInspectable{}\n}\n\nfunc newDefaultLabelManagerFactory() *fakes.FakeLabelManagerFactory {\n\treturn newLabelManagerFactory(newDefaultLabelManager())\n}\n\nfunc newLabelManagerFactory(manager builder.LabelInspector) *fakes.FakeLabelManagerFactory {\n\treturn fakes.NewFakeLabelManagerFactory(manager)\n}\n\nfunc newDefaultLabelManager() *fakes.FakeLabelManager {\n\treturn &fakes.FakeLabelManager{\n\t\tReturnForMetadata:        testMetadata,\n\t\tReturnForStackID:         testStackID,\n\t\tReturnForMixins:          testMixins,\n\t\tReturnForOrder:           testOrder,\n\t\tReturnForBuildpackLayers: testLayers,\n\t\tReturnForOrderExtensions: testOrderExtensions,\n\t}\n}\n\ntype labelManagerModifier func(manager *fakes.FakeLabelManager)\n\nfunc returnForMetadata(metadata builder.Metadata) labelManagerModifier {\n\treturn func(manager *fakes.FakeLabelManager) {\n\t\tmanager.ReturnForMetadata = metadata\n\t}\n}\n\nfunc errorForMetadata(err error) labelManagerModifier {\n\treturn func(manager *fakes.FakeLabelManager) {\n\t\tmanager.ErrorForMetadata = err\n\t}\n}\n\nfunc errorForStackID(err error) labelManagerModifier {\n\treturn func(manager *fakes.FakeLabelManager) 
{\n\t\tmanager.ErrorForStackID = err\n\t}\n}\n\nfunc errorForMixins(err error) labelManagerModifier {\n\treturn func(manager *fakes.FakeLabelManager) {\n\t\tmanager.ErrorForMixins = err\n\t}\n}\n\nfunc errorForOrder(err error) labelManagerModifier {\n\treturn func(manager *fakes.FakeLabelManager) {\n\t\tmanager.ErrorForOrder = err\n\t}\n}\n\nfunc errorForOrderExtensions(err error) labelManagerModifier {\n\treturn func(manager *fakes.FakeLabelManager) {\n\t\tmanager.ErrorForOrderExtensions = err\n\t}\n}\n\nfunc errorForBuildpackLayers(err error) labelManagerModifier {\n\treturn func(manager *fakes.FakeLabelManager) {\n\t\tmanager.ErrorForBuildpackLayers = err\n\t}\n}\n\nfunc newLabelManager(modifiers ...labelManagerModifier) *fakes.FakeLabelManager {\n\tmanager := newDefaultLabelManager()\n\n\tfor _, mod := range modifiers {\n\t\tmod(manager)\n\t}\n\n\treturn manager\n}\n\nfunc newDefaultDetectionCalculator() *fakes.FakeDetectionCalculator {\n\treturn &fakes.FakeDetectionCalculator{\n\t\tReturnForOrder: expectedDetectionTestOrder,\n\t}\n}\n\ntype detectionCalculatorModifier func(calculator *fakes.FakeDetectionCalculator)\n\nfunc errorForDetectionOrder(err error) detectionCalculatorModifier {\n\treturn func(calculator *fakes.FakeDetectionCalculator) {\n\t\tcalculator.ErrorForOrder = err\n\t}\n}\n\nfunc newDetectionCalculator(modifiers ...detectionCalculatorModifier) *fakes.FakeDetectionCalculator {\n\tcalculator := newDefaultDetectionCalculator()\n\n\tfor _, mod := range modifiers {\n\t\tmod(calculator)\n\t}\n\n\treturn calculator\n}\n"
  },
  {
    "path": "internal/builder/label_manager.go",
    "content": "package builder\n\nimport (\n\t\"encoding/json\"\n\t\"fmt\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\n\t\"github.com/buildpacks/pack/internal/stack\"\n)\n\ntype LabelManager struct {\n\tinspectable Inspectable\n}\n\nfunc NewLabelManager(inspectable Inspectable) *LabelManager {\n\treturn &LabelManager{inspectable: inspectable}\n}\n\nfunc (m *LabelManager) Metadata() (Metadata, error) {\n\tvar parsedMetadata Metadata\n\terr := m.labelJSON(metadataLabel, &parsedMetadata)\n\treturn parsedMetadata, err\n}\n\nfunc (m *LabelManager) StackID() (string, error) {\n\treturn m.labelContent(stackLabel)\n}\n\nfunc (m *LabelManager) Mixins() ([]string, error) {\n\tparsedMixins := []string{}\n\terr := m.labelJSONDefaultEmpty(stack.MixinsLabel, &parsedMixins)\n\treturn parsedMixins, err\n}\n\nfunc (m *LabelManager) Order() (dist.Order, error) {\n\tparsedOrder := dist.Order{}\n\terr := m.labelJSONDefaultEmpty(OrderLabel, &parsedOrder)\n\treturn parsedOrder, err\n}\n\nfunc (m *LabelManager) OrderExtensions() (dist.Order, error) {\n\tparsedOrder := dist.Order{}\n\terr := m.labelJSONDefaultEmpty(OrderExtensionsLabel, &parsedOrder)\n\treturn parsedOrder, err\n}\n\nfunc (m *LabelManager) BuildpackLayers() (dist.ModuleLayers, error) {\n\tparsedLayers := dist.ModuleLayers{}\n\terr := m.labelJSONDefaultEmpty(dist.BuildpackLayersLabel, &parsedLayers)\n\treturn parsedLayers, err\n}\n\nfunc (m *LabelManager) labelContent(labelName string) (string, error) {\n\tcontent, err := m.inspectable.Label(labelName)\n\tif err != nil {\n\t\treturn \"\", fmt.Errorf(\"getting label %s: %w\", labelName, err)\n\t}\n\n\tif content == \"\" {\n\t\treturn \"\", fmt.Errorf(\"builder missing label %s -- try recreating builder\", labelName)\n\t}\n\n\treturn content, nil\n}\n\nfunc (m *LabelManager) labelJSON(labelName string, targetObject interface{}) error {\n\trawContent, err := m.labelContent(labelName)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\terr = json.Unmarshal([]byte(rawContent), 
targetObject)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"parsing label content for %s: %w\", labelName, err)\n\t}\n\n\treturn nil\n}\n\nfunc (m *LabelManager) labelJSONDefaultEmpty(labelName string, targetObject interface{}) error {\n\trawContent, err := m.inspectable.Label(labelName)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"getting label %s: %w\", labelName, err)\n\t}\n\n\tif rawContent == \"\" {\n\t\treturn nil\n\t}\n\n\terr = json.Unmarshal([]byte(rawContent), targetObject)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"parsing label content for %s: %w\", labelName, err)\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/builder/label_manager_provider.go",
    "content": "package builder\n\ntype LabelManagerProvider struct{}\n\nfunc NewLabelManagerProvider() *LabelManagerProvider {\n\treturn &LabelManagerProvider{}\n}\n\nfunc (p *LabelManagerProvider) BuilderLabelManager(inspectable Inspectable) LabelInspector {\n\treturn NewLabelManager(inspectable)\n}\n"
  },
  {
    "path": "internal/builder/label_manager_test.go",
    "content": "package builder_test\n\nimport (\n\t\"errors\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/fakes\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestLabelManager(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testLabelManager\", testLabelManager, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testLabelManager(t *testing.T, when spec.G, it spec.S) {\n\tvar assert = h.NewAssertionManager(t)\n\n\twhen(\"Metadata\", func() {\n\t\tconst (\n\t\t\tbuildpackFormat = `{\n      \"id\": \"%s\",\n      \"version\": \"%s\",\n      \"homepage\": \"%s\"\n    }`\n\t\t\tlifecycleFormat = `{\n      \"version\": \"%s\",\n      \"api\": {\n        \"buildpack\": \"%s\",\n        \"platform\": \"%s\"\n      },\n      \"apis\": {\n        \"buildpack\": {\"deprecated\": [\"%s\"], \"supported\": [\"%s\", \"%s\"]},\n        \"platform\": {\"deprecated\": [\"%s\"], \"supported\": [\"%s\", \"%s\"]}\n      }\n    }`\n\t\t\tmetadataFormat = `{\n  \"description\": \"%s\",\n  \"stack\": {\n    \"runImage\": {\n      \"image\": \"%s\",\n      \"mirrors\": [\"%s\"]\n    }\n  },\n  \"buildpacks\": [\n    %s,\n    %s\n  ],\n  \"lifecycle\": %s,\n  \"createdBy\": {\"name\": \"%s\", \"version\": \"%s\"}\n}`\n\t\t)\n\n\t\tvar (\n\t\t\texpectedDescription    = \"Test image description\"\n\t\t\texpectedRunImage       = \"some/run-image\"\n\t\t\texpectedRunImageMirror = \"gcr.io/some/default\"\n\t\t\texpectedBuildpacks     = []dist.ModuleInfo{\n\t\t\t\t{\n\t\t\t\t\tID:       \"test.buildpack\",\n\t\t\t\t\tVersion:  \"test.buildpack.version\",\n\t\t\t\t\tHomepage: \"http://geocities.com/test-bp\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       
\"test.buildpack.two\",\n\t\t\t\t\tVersion:  \"test.buildpack.two.version\",\n\t\t\t\t\tHomepage: \"http://geocities.com/test-bp-two\",\n\t\t\t\t},\n\t\t\t}\n\t\t\texpectedLifecycleVersion    = builder.VersionMustParse(\"1.2.3\")\n\t\t\texpectedBuildpackAPI        = api.MustParse(\"0.1\")\n\t\t\texpectedPlatformAPI         = api.MustParse(\"2.3\")\n\t\t\texpectedBuildpackDeprecated = \"0.1\"\n\t\t\texpectedBuildpackSupported  = []string{\"1.2\", \"1.3\"}\n\t\t\texpectedPlatformDeprecated  = \"1.2\"\n\t\t\texpectedPlatformSupported   = []string{\"2.3\", \"2.4\"}\n\t\t\texpectedCreatorName         = \"pack\"\n\t\t\texpectedVersion             = \"2.3.4\"\n\n\t\t\trawMetadata = fmt.Sprintf(\n\t\t\t\tmetadataFormat,\n\t\t\t\texpectedDescription,\n\t\t\t\texpectedRunImage,\n\t\t\t\texpectedRunImageMirror,\n\t\t\t\tfmt.Sprintf(\n\t\t\t\t\tbuildpackFormat,\n\t\t\t\t\texpectedBuildpacks[0].ID,\n\t\t\t\t\texpectedBuildpacks[0].Version,\n\t\t\t\t\texpectedBuildpacks[0].Homepage,\n\t\t\t\t),\n\t\t\t\tfmt.Sprintf(\n\t\t\t\t\tbuildpackFormat,\n\t\t\t\t\texpectedBuildpacks[1].ID,\n\t\t\t\t\texpectedBuildpacks[1].Version,\n\t\t\t\t\texpectedBuildpacks[1].Homepage,\n\t\t\t\t),\n\t\t\t\tfmt.Sprintf(\n\t\t\t\t\tlifecycleFormat,\n\t\t\t\t\texpectedLifecycleVersion,\n\t\t\t\t\texpectedBuildpackAPI,\n\t\t\t\t\texpectedPlatformAPI,\n\t\t\t\t\texpectedBuildpackDeprecated,\n\t\t\t\t\texpectedBuildpackSupported[0],\n\t\t\t\t\texpectedBuildpackSupported[1],\n\t\t\t\t\texpectedPlatformDeprecated,\n\t\t\t\t\texpectedPlatformSupported[0],\n\t\t\t\t\texpectedPlatformSupported[1],\n\t\t\t\t),\n\t\t\t\texpectedCreatorName,\n\t\t\t\texpectedVersion,\n\t\t\t)\n\t\t)\n\n\t\tit(\"returns full metadata\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(rawMetadata))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\tmetadata, err := labelManager.Metadata()\n\t\t\tassert.Nil(err)\n\t\t\tassert.Equal(metadata.Description, 
expectedDescription)\n\t\t\tassert.Equal(metadata.Stack.RunImage.Image, expectedRunImage)\n\t\t\tassert.Equal(metadata.Stack.RunImage.Mirrors, []string{expectedRunImageMirror})\n\t\t\tassert.Equal(metadata.Buildpacks, expectedBuildpacks)\n\t\t\tassert.Equal(metadata.Lifecycle.Version, expectedLifecycleVersion)\n\t\t\tassert.Equal(metadata.Lifecycle.API.BuildpackVersion, expectedBuildpackAPI)\n\t\t\tassert.Equal(metadata.Lifecycle.API.PlatformVersion, expectedPlatformAPI)\n\t\t\tassert.Equal(metadata.Lifecycle.APIs.Buildpack.Deprecated.AsStrings(), []string{expectedBuildpackDeprecated})\n\t\t\tassert.Equal(metadata.Lifecycle.APIs.Buildpack.Supported.AsStrings(), expectedBuildpackSupported)\n\t\t\tassert.Equal(metadata.Lifecycle.APIs.Platform.Deprecated.AsStrings(), []string{expectedPlatformDeprecated})\n\t\t\tassert.Equal(metadata.Lifecycle.APIs.Platform.Supported.AsStrings(), expectedPlatformSupported)\n\t\t\tassert.Equal(metadata.CreatedBy.Name, expectedCreatorName)\n\t\t\tassert.Equal(metadata.CreatedBy.Version, expectedVersion)\n\t\t})\n\n\t\tit(\"requests the expected label\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(rawMetadata))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t_, err := labelManager.Metadata()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectable.ReceivedName, \"io.buildpacks.builder.metadata\")\n\t\t})\n\n\t\twhen(\"inspectable returns an error for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\texpectedError := errors.New(\"couldn't find label\")\n\n\t\t\t\tinspectable := newInspectable(errorForLabel(expectedError))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.Metadata()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"getting label io.buildpacks.builder.metadata: couldn't find label\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns invalid json for `Label`\", func() {\n\t\t\tit(\"returns 
a wrapped error\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"{\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.Metadata()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"parsing label content for io.buildpacks.builder.metadata: unexpected end of JSON input\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns empty content for `Label`\", func() {\n\t\t\tit(\"returns an error suggesting rebuilding the builder\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.Metadata()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"builder missing label io.buildpacks.builder.metadata -- try recreating builder\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"StackID\", func() {\n\t\tit(\"returns the stack ID\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(\"some.stack.id\"))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\tstackID, err := labelManager.StackID()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(stackID, \"some.stack.id\")\n\t\t})\n\n\t\tit(\"requests the expected label\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(\"some.stack.id\"))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t_, err := labelManager.StackID()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectable.ReceivedName, \"io.buildpacks.stack.id\")\n\t\t})\n\n\t\twhen(\"inspectable returns empty content for `Label`\", func() {\n\t\t\tit(\"returns an error suggesting rebuilding the builder\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.StackID()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"builder missing label io.buildpacks.stack.id -- try recreating 
builder\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns an error for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\texpectedError := errors.New(\"couldn't find label\")\n\n\t\t\t\tinspectable := newInspectable(errorForLabel(expectedError))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.StackID()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"getting label io.buildpacks.stack.id: couldn't find label\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"Mixins\", func() {\n\t\tit(\"returns the mixins\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(`[\"mixinX\", \"mixinY\", \"build:mixinA\"]`))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\tmixins, err := labelManager.Mixins()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(mixins, []string{\"mixinX\", \"mixinY\", \"build:mixinA\"})\n\t\t})\n\n\t\tit(\"requests the expected label\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(`[\"mixinX\", \"mixinY\", \"build:mixinA\"]`))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t_, err := labelManager.Mixins()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectable.ReceivedName, \"io.buildpacks.stack.mixins\")\n\t\t})\n\n\t\twhen(\"inspectable returns empty content for `Label`\", func() {\n\t\t\tit(\"returns empty stack mixins\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\tmixins, err := labelManager.Mixins()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(mixins, []string{})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns an error for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\texpectedError := errors.New(\"couldn't find label\")\n\n\t\t\t\tinspectable := newInspectable(errorForLabel(expectedError))\n\n\t\t\t\tlabelManager := 
builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.Mixins()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"getting label io.buildpacks.stack.mixins: couldn't find label\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns invalid json for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"{\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.Mixins()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"parsing label content for io.buildpacks.stack.mixins: unexpected end of JSON input\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"Order\", func() {\n\t\tvar rawOrder = `[{\"group\": [{\"id\": \"buildpack-1-id\", \"optional\": false}, {\"id\": \"buildpack-2-id\", \"version\": \"buildpack-2-version-1\", \"optional\": true}]}]`\n\n\t\tit(\"returns the order\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(rawOrder))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\torder, err := labelManager.Order()\n\t\t\tassert.Nil(err)\n\n\t\t\texpectedOrder := dist.Order{\n\t\t\t\t{\n\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID: \"buildpack-1-id\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"buildpack-2-id\",\n\t\t\t\t\t\t\t\tVersion: \"buildpack-2-version-1\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOptional: true,\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tassert.Equal(order, expectedOrder)\n\t\t})\n\n\t\tit(\"requests the expected label\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(rawOrder))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t_, err := labelManager.Order()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectable.ReceivedName, 
\"io.buildpacks.buildpack.order\")\n\t\t})\n\n\t\twhen(\"inspectable returns empty content for `Label`\", func() {\n\t\t\tit(\"returns an empty order object\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\torder, err := labelManager.Order()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(order, dist.Order{})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns an error for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\texpectedError := errors.New(\"couldn't find label\")\n\n\t\t\t\tinspectable := newInspectable(errorForLabel(expectedError))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.Order()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"getting label io.buildpacks.buildpack.order: couldn't find label\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns invalid json for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"{\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.Order()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"parsing label content for io.buildpacks.buildpack.order: unexpected end of JSON input\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"OrderExtensions\", func() {\n\t\tvar rawOrder = `[{\"group\": [{\"id\": \"buildpack-1-id\", \"optional\": false}, {\"id\": \"buildpack-2-id\", \"version\": \"buildpack-2-version-1\", \"optional\": true}]}]`\n\n\t\tit(\"returns the order\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(rawOrder))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\torder, err := labelManager.OrderExtensions()\n\t\t\tassert.Nil(err)\n\n\t\t\texpectedOrder := dist.Order{\n\t\t\t\t{\n\t\t\t\t\tGroup: 
[]dist.ModuleRef{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID: \"buildpack-1-id\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"buildpack-2-id\",\n\t\t\t\t\t\t\t\tVersion: \"buildpack-2-version-1\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOptional: true,\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tassert.Equal(order, expectedOrder)\n\t\t})\n\n\t\tit(\"requests the expected label\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(rawOrder))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t_, err := labelManager.OrderExtensions()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectable.ReceivedName, \"io.buildpacks.buildpack.order-extensions\")\n\t\t})\n\n\t\twhen(\"inspectable returns empty content for `Label`\", func() {\n\t\t\tit(\"returns an empty order object\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\torder, err := labelManager.OrderExtensions()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(order, dist.Order{})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns an error for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\texpectedError := errors.New(\"couldn't find label\")\n\n\t\t\t\tinspectable := newInspectable(errorForLabel(expectedError))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.OrderExtensions()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"getting label io.buildpacks.buildpack.order-extensions: couldn't find label\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns invalid json for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"{\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, 
err := labelManager.OrderExtensions()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"parsing label content for io.buildpacks.buildpack.order-extensions: unexpected end of JSON input\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"ModuleLayers\", func() {\n\t\tvar rawLayers = `\n{\n  \"buildpack-1-id\": {\n    \"buildpack-1-version-1\": {\n      \"api\": \"0.1\",\n      \"layerDiffID\": \"sha256:buildpack-1-version-1-diff-id\"\n    },\n    \"buildpack-1-version-2\": {\n      \"api\": \"0.2\",\n      \"layerDiffID\": \"sha256:buildpack-1-version-2-diff-id\"\n    }\n  }\n}\n`\n\n\t\tit(\"returns the layers\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(rawLayers))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\tlayers, err := labelManager.BuildpackLayers()\n\t\t\tassert.Nil(err)\n\n\t\t\texpectedLayers := dist.ModuleLayers{\n\t\t\t\t\"buildpack-1-id\": {\n\t\t\t\t\t\"buildpack-1-version-1\": dist.ModuleLayerInfo{\n\t\t\t\t\t\tAPI:         api.MustParse(\"0.1\"),\n\t\t\t\t\t\tLayerDiffID: \"sha256:buildpack-1-version-1-diff-id\",\n\t\t\t\t\t},\n\t\t\t\t\t\"buildpack-1-version-2\": dist.ModuleLayerInfo{\n\t\t\t\t\t\tAPI:         api.MustParse(\"0.2\"),\n\t\t\t\t\t\tLayerDiffID: \"sha256:buildpack-1-version-2-diff-id\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tassert.Equal(layers, expectedLayers)\n\t\t})\n\n\t\tit(\"requests the expected label\", func() {\n\t\t\tinspectable := newInspectable(returnForLabel(rawLayers))\n\n\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t_, err := labelManager.BuildpackLayers()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectable.ReceivedName, \"io.buildpacks.buildpack.layers\")\n\t\t})\n\n\t\twhen(\"inspectable returns empty content for `Label`\", func() {\n\t\t\tit(\"returns an empty buildpack layers object\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\tlayers, 
err := labelManager.BuildpackLayers()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(layers, dist.ModuleLayers{})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns an error for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\texpectedError := errors.New(\"couldn't find label\")\n\n\t\t\t\tinspectable := newInspectable(errorForLabel(expectedError))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.BuildpackLayers()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"getting label io.buildpacks.buildpack.layers: couldn't find label\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspectable returns invalid json for `Label`\", func() {\n\t\t\tit(\"returns a wrapped error\", func() {\n\t\t\t\tinspectable := newInspectable(returnForLabel(\"{\"))\n\n\t\t\t\tlabelManager := builder.NewLabelManager(inspectable)\n\t\t\t\t_, err := labelManager.BuildpackLayers()\n\n\t\t\t\tassert.ErrorWithMessage(\n\t\t\t\t\terr,\n\t\t\t\t\t\"parsing label content for io.buildpacks.buildpack.layers: unexpected end of JSON input\",\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\t})\n}\n\ntype inspectableModifier func(i *fakes.FakeInspectable)\n\nfunc returnForLabel(response string) inspectableModifier {\n\treturn func(i *fakes.FakeInspectable) {\n\t\ti.ReturnForLabel = response\n\t}\n}\n\nfunc errorForLabel(err error) inspectableModifier {\n\treturn func(i *fakes.FakeInspectable) {\n\t\ti.ErrorForLabel = err\n\t}\n}\n\nfunc newInspectable(modifiers ...inspectableModifier) *fakes.FakeInspectable {\n\tinspectable := &fakes.FakeInspectable{}\n\n\tfor _, mod := range modifiers {\n\t\tmod(inspectable)\n\t}\n\n\treturn inspectable\n}\n"
  },
  {
    "path": "internal/builder/lifecycle.go",
    "content": "package builder\n\nimport (\n\t\"archive/tar\"\n\t\"fmt\"\n\t\"io\"\n\t\"path\"\n\t\"regexp\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n)\n\n// DefaultLifecycleVersion A snapshot of the latest tested lifecycle version values\nconst (\n\tDefaultLifecycleVersion = \"0.21.0\"\n)\n\n// Blob is an interface to wrap opening blobs\ntype Blob interface {\n\tOpen() (io.ReadCloser, error)\n}\n\n// Lifecycle is an implementation of the CNB Lifecycle spec\n//\n//go:generate mockgen -package testmocks -destination testmocks/mock_lifecycle.go github.com/buildpacks/pack/internal/builder Lifecycle\ntype Lifecycle interface {\n\tBlob\n\tDescriptor() LifecycleDescriptor\n}\n\ntype lifecycle struct {\n\tdescriptor LifecycleDescriptor\n\tBlob\n}\n\n// NewLifecycle creates a Lifecycle from a Blob\nfunc NewLifecycle(blob Blob) (Lifecycle, error) {\n\tvar err error\n\n\tbr, err := blob.Open()\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"open lifecycle blob\")\n\t}\n\tdefer br.Close()\n\n\t_, buf, err := archive.ReadTarEntry(br, \"lifecycle.toml\")\n\tif err != nil && errors.Cause(err) == archive.ErrEntryNotExist {\n\t\treturn nil, err\n\t} else if err != nil {\n\t\treturn nil, errors.Wrap(err, \"reading lifecycle descriptor\")\n\t}\n\n\tlifecycleDescriptor, err := ParseDescriptor(string(buf))\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tlifecycle := &lifecycle{Blob: blob, descriptor: CompatDescriptor(lifecycleDescriptor)}\n\n\tif err = lifecycle.validateBinaries(); err != nil {\n\t\treturn nil, errors.Wrap(err, \"validating binaries\")\n\t}\n\n\treturn lifecycle, nil\n}\n\n// Descriptor returns the LifecycleDescriptor\nfunc (l *lifecycle) Descriptor() LifecycleDescriptor {\n\treturn l.descriptor\n}\n\nfunc (l *lifecycle) validateBinaries() error {\n\trc, err := l.Open()\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"create lifecycle blob reader\")\n\t}\n\tdefer rc.Close()\n\tregex := 
regexp.MustCompile(`^[^/]+/([^/]+)$`)\n\theaders := map[string]bool{}\n\ttr := tar.NewReader(rc)\n\tfor {\n\t\theader, err := tr.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"failed to get next tar entry\")\n\t\t}\n\n\t\tpathMatches := regex.FindStringSubmatch(path.Clean(header.Name))\n\t\tif pathMatches != nil {\n\t\t\theaders[pathMatches[1]] = true\n\t\t}\n\t}\n\tfor _, p := range l.binaries() {\n\t\t_, found := headers[p]\n\t\tif !found {\n\t\t\t_, found = headers[p+\".exe\"]\n\t\t\tif !found {\n\t\t\t\treturn fmt.Errorf(\"did not find '%s' in tar\", p)\n\t\t\t}\n\t\t}\n\t}\n\treturn nil\n}\n\n// binaries returns a list of all binaries contained in the lifecycle.\nfunc (l *lifecycle) binaries() []string {\n\tbinaries := []string{\n\t\t\"detector\",\n\t\t\"restorer\",\n\t\t\"analyzer\",\n\t\t\"builder\",\n\t\t\"exporter\",\n\t\t\"launcher\",\n\t\t\"creator\",\n\t}\n\treturn binaries\n}\n\n// SupportedLinuxArchitecture returns true for each binary architecture available at https://github.com/buildpacks/lifecycle/releases/\nfunc SupportedLinuxArchitecture(arch string) bool {\n\treturn arch == \"arm64\" || arch == \"ppc64le\" || arch == \"s390x\" || arch == \"x86-64\"\n}\n"
  },
  {
    "path": "internal/builder/lifecycle_test.go",
    "content": "package builder_test\n\nimport (\n\t\"archive/tar\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestLifecycle(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testLifecycle\", testLifecycle, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testLifecycle(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#NewLifecycle\", func() {\n\t\twhen(\"platform api 0.3\", func() {\n\t\t\tit(\"makes a lifecycle from a blob\", func() {\n\t\t\t\t_, err := builder.NewLifecycle(blob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.3\")))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"platform api 0.4\", func() {\n\t\t\tit(\"makes a lifecycle from a blob\", func() {\n\t\t\t\t_, err := builder.NewLifecycle(blob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"the blob can't open\", func() {\n\t\t\tit(\"throws an error\", func() {\n\t\t\t\t_, err := builder.NewLifecycle(blob.NewBlob(filepath.Join(\"testdata\", \"doesn't exist\")))\n\t\t\t\th.AssertError(t, err, \"open lifecycle blob\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there is no descriptor file\", func() {\n\t\t\tit(\"throws an error\", func() {\n\t\t\t\t_, err := builder.NewLifecycle(&fakeEmptyBlob{})\n\t\t\t\th.AssertError(t, err, \"could not find entry path 'lifecycle.toml': not exist\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"the descriptor file isn't valid\", func() {\n\t\t\tvar tmpDir string\n\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"lifecycle\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertNil(t, os.WriteFile(filepath.Join(tmpDir, 
\"lifecycle.toml\"), []byte(`\n[api]\n  platform \"0.1\"\n`), 0711))\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t\t\t})\n\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\t_, err := builder.NewLifecycle(blob.NewBlob(tmpDir))\n\t\t\t\th.AssertError(t, err, \"decoding descriptor\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"the lifecycle has incomplete list of binaries\", func() {\n\t\t\tvar tmpDir string\n\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertNil(t, os.WriteFile(filepath.Join(tmpDir, \"lifecycle.toml\"), []byte(`\n[api]\n  platform = \"0.2\"\n  buildpack = \"0.3\"\n\n[lifecycle]\n  version = \"1.2.3\"\n`), os.ModePerm))\n\n\t\t\t\th.AssertNil(t, os.Mkdir(filepath.Join(tmpDir, \"lifecycle\"), os.ModePerm))\n\t\t\t\th.AssertNil(t, os.WriteFile(filepath.Join(tmpDir, \"lifecycle\", \"analyzer\"), []byte(\"content\"), os.ModePerm))\n\t\t\t\th.AssertNil(t, os.WriteFile(filepath.Join(tmpDir, \"lifecycle\", \"detector\"), []byte(\"content\"), os.ModePerm))\n\t\t\t\th.AssertNil(t, os.WriteFile(filepath.Join(tmpDir, \"lifecycle\", \"builder\"), []byte(\"content\"), os.ModePerm))\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t\t\t})\n\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\t_, err := builder.NewLifecycle(blob.NewBlob(tmpDir))\n\t\t\t\th.AssertError(t, err, \"validating binaries\")\n\t\t\t})\n\t\t})\n\t})\n}\n\ntype fakeEmptyBlob struct {\n}\n\nfunc (f *fakeEmptyBlob) Open() (io.ReadCloser, error) {\n\tpr, pw := io.Pipe()\n\tgo func() {\n\t\tdefer pw.Close()\n\t\ttw := tar.NewWriter(pw)\n\t\tdefer tw.Close()\n\t}()\n\treturn pr, nil\n}\n"
  },
  {
    "path": "internal/builder/metadata.go",
    "content": "package builder\n\nimport \"github.com/buildpacks/pack/pkg/dist\"\n\nconst (\n\tOrderLabel           = \"io.buildpacks.buildpack.order\"\n\tOrderExtensionsLabel = \"io.buildpacks.buildpack.order-extensions\"\n\tSystemLabel          = \"io.buildpacks.buildpack.system\"\n)\n\ntype Metadata struct {\n\tDescription string             `json:\"description\"`\n\tBuildpacks  []dist.ModuleInfo  `json:\"buildpacks\"`\n\tExtensions  []dist.ModuleInfo  `json:\"extensions\"`\n\tStack       StackMetadata      `json:\"stack\"`\n\tLifecycle   LifecycleMetadata  `json:\"lifecycle\"`\n\tCreatedBy   CreatorMetadata    `json:\"createdBy\"`\n\tRunImages   []RunImageMetadata `json:\"images\"`\n}\n\ntype CreatorMetadata struct {\n\tName    string `json:\"name\" yaml:\"name\"`\n\tVersion string `json:\"version\" yaml:\"version\"`\n}\n\ntype LifecycleMetadata struct {\n\tLifecycleInfo\n\t// Deprecated: use APIs instead\n\tAPI  LifecycleAPI  `json:\"api\"`\n\tAPIs LifecycleAPIs `json:\"apis\"`\n}\n\ntype StackMetadata struct {\n\tRunImage RunImageMetadata `json:\"runImage\" toml:\"run-image\"`\n}\n\ntype RunImages struct {\n\tImages []RunImageMetadata `json:\"images\" toml:\"images\"`\n}\n\ntype RunImageMetadata struct {\n\tImage   string   `json:\"image\" toml:\"image\"`\n\tMirrors []string `json:\"mirrors\" toml:\"mirrors\"`\n}\n"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/analyzer",
    "content": "analyzer"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/builder",
    "content": "builder"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/creator",
    "content": "creator"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/detector",
    "content": "detector"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/exporter",
    "content": "exporter"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/launcher",
    "content": "launcher"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/restorer",
    "content": "restorer"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.3/lifecycle.toml",
    "content": "[lifecycle]\nversion = \"0.0.0\"\n\n[api]\nbuildpack = \"0.2\"\nplatform = \"0.3\""
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/analyzer",
    "content": "analyzer"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/builder",
    "content": "builder"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/creator",
    "content": "creator"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/detector",
    "content": "detector"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/exporter",
    "content": "exporter"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/launcher",
    "content": "launcher"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/restorer",
    "content": "restorer"
  },
  {
    "path": "internal/builder/testdata/lifecycle/platform-0.4/lifecycle.toml",
    "content": "[lifecycle]\nversion = \"0.0.0\"\n\n[apis]\n[apis.buildpack]\ndeprecated = []\nsupported = [\"0.2\", \"0.3\", \"0.4\", \"0.9\"]\n\n[apis.platform]\ndeprecated = [\"0.2\"]\nsupported = [\"0.3\", \"0.4\"]"
  },
  {
    "path": "internal/builder/testmocks/mock_lifecycle.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/internal/builder (interfaces: Lifecycle)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\tio \"io\"\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n\n\tbuilder \"github.com/buildpacks/pack/internal/builder\"\n)\n\n// MockLifecycle is a mock of Lifecycle interface.\ntype MockLifecycle struct {\n\tctrl     *gomock.Controller\n\trecorder *MockLifecycleMockRecorder\n}\n\n// MockLifecycleMockRecorder is the mock recorder for MockLifecycle.\ntype MockLifecycleMockRecorder struct {\n\tmock *MockLifecycle\n}\n\n// NewMockLifecycle creates a new mock instance.\nfunc NewMockLifecycle(ctrl *gomock.Controller) *MockLifecycle {\n\tmock := &MockLifecycle{ctrl: ctrl}\n\tmock.recorder = &MockLifecycleMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockLifecycle) EXPECT() *MockLifecycleMockRecorder {\n\treturn m.recorder\n}\n\n// Descriptor mocks base method.\nfunc (m *MockLifecycle) Descriptor() builder.LifecycleDescriptor {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Descriptor\")\n\tret0, _ := ret[0].(builder.LifecycleDescriptor)\n\treturn ret0\n}\n\n// Descriptor indicates an expected call of Descriptor.\nfunc (mr *MockLifecycleMockRecorder) Descriptor() *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Descriptor\", reflect.TypeOf((*MockLifecycle)(nil).Descriptor))\n}\n\n// Open mocks base method.\nfunc (m *MockLifecycle) Open() (io.ReadCloser, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Open\")\n\tret0, _ := ret[0].(io.ReadCloser)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// Open indicates an expected call of Open.\nfunc (mr *MockLifecycleMockRecorder) Open() *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, 
\"Open\", reflect.TypeOf((*MockLifecycle)(nil).Open))\n}\n"
  },
  {
    "path": "internal/builder/trusted_builder.go",
    "content": "package builder\n\nimport (\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n)\n\ntype KnownBuilder struct {\n\tVendor             string\n\tImage              string\n\tDefaultDescription string\n\tSuggested          bool\n\tTrusted            bool\n}\n\nvar KnownBuilders = []KnownBuilder{\n\t{\n\t\tVendor:             \"Google\",\n\t\tImage:              \"gcr.io/buildpacks/builder:google-22\",\n\t\tDefaultDescription: \"Ubuntu 22.04 base image with buildpacks for .NET, Dart, Go, Java, Node.js, PHP, Python, and Ruby\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Heroku\",\n\t\tImage:              \"heroku/builder:24\",\n\t\tDefaultDescription: \"Ubuntu 24.04 AMD64+ARM64 base image with buildpacks for Go, Java, Node.js, PHP, Python, Ruby & Scala.\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Heroku\",\n\t\tImage:              \"heroku/builder:22\",\n\t\tDefaultDescription: \"Ubuntu 22.04 AMD64 base image with buildpacks for Go, Java, Node.js, PHP, Python, Ruby & Scala.\",\n\t\tSuggested:          false,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Heroku\",\n\t\tImage:              \"heroku/builder:20\",\n\t\tDefaultDescription: \"Ubuntu 20.04 AMD64 base image with buildpacks for Go, Java, Node.js, PHP, Python, Ruby & Scala.\",\n\t\tSuggested:          false,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Paketo Buildpacks\",\n\t\tImage:              \"paketobuildpacks/builder-jammy-base\",\n\t\tDefaultDescription: \"Small base image with buildpacks for Java, Node.js, Golang, .NET Core, Python & Ruby\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Paketo Buildpacks\",\n\t\tImage:              \"paketobuildpacks/builder-jammy-full\",\n\t\tDefaultDescription: \"Larger base image 
with buildpacks for Java, Node.js, Golang, .NET Core, Python, Ruby, & PHP\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Paketo Buildpacks\",\n\t\tImage:              \"paketobuildpacks/builder-jammy-tiny\",\n\t\tDefaultDescription: \"Tiny base image (jammy build image, distroless run image) with buildpacks for Golang & Java\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Paketo Buildpacks\",\n\t\tImage:              \"paketobuildpacks/builder-jammy-buildpackless-static\",\n\t\tDefaultDescription: \"Static base image (jammy build image, distroless run image) suitable for static binaries like Go or Rust\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Paketo Buildpacks\",\n\t\tImage:              \"paketobuildpacks/ubuntu-noble-builder\",\n\t\tDefaultDescription: \"Small base image with buildpacks for Java, Node.js or .NET Core\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Paketo Buildpacks\",\n\t\tImage:              \"paketobuildpacks/builder-ubi8-base\",\n\t\tDefaultDescription: \"Universal Base Image (RHEL8) with buildpacks to build Node.js or Java runtimes. 
Also supports the new extension feature (aka applying a Dockerfile)\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Paketo Buildpacks\",\n\t\tImage:              \"paketobuildpacks/ubi-9-builder\",\n\t\tDefaultDescription: \"Universal Base Image (RHEL9) with buildpacks to build Node.js runtimes.\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n\t{\n\t\tVendor:             \"Paketo Buildpacks\",\n\t\tImage:              \"paketobuildpacks/ubi-10-builder\",\n\t\tDefaultDescription: \"Universal Base Image (RHEL10) with buildpacks to build Node.js runtimes.\",\n\t\tSuggested:          true,\n\t\tTrusted:            true,\n\t},\n}\n\nfunc IsKnownTrustedBuilder(builderName string) bool {\n\tfor _, knownBuilder := range KnownBuilders {\n\t\tif builderName == knownBuilder.Image && knownBuilder.Trusted {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc IsTrustedBuilder(cfg config.Config, builderName string) (bool, error) {\n\tbuilderReference, err := name.ParseReference(builderName, name.WithDefaultTag(\"\"))\n\tif err != nil {\n\t\treturn false, err\n\t}\n\tfor _, trustedBuilder := range cfg.TrustedBuilders {\n\t\ttrustedBuilderReference, err := name.ParseReference(trustedBuilder.Name, name.WithDefaultTag(\"\"))\n\t\tif err != nil {\n\t\t\treturn false, err\n\t\t}\n\t\tif trustedBuilderReference.Identifier() != \"\" {\n\t\t\tif builderReference.Name() == trustedBuilderReference.Name() {\n\t\t\t\treturn true, nil\n\t\t\t}\n\t\t} else {\n\t\t\tif builderReference.Context().RepositoryStr() == trustedBuilderReference.Context().RepositoryStr() {\n\t\t\t\treturn true, nil\n\t\t\t}\n\t\t}\n\t}\n\treturn false, nil\n}\n"
  },
  {
    "path": "internal/builder/trusted_builder_test.go",
    "content": "package builder_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tbldr \"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestTrustedBuilder(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Trusted Builder\", trustedBuilder, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc trustedBuilder(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"IsKnownTrustedBuilder\", func() {\n\t\tit(\"matches exactly\", func() {\n\t\t\th.AssertTrue(t, bldr.IsKnownTrustedBuilder(\"paketobuildpacks/builder-jammy-base\"))\n\t\t\th.AssertFalse(t, bldr.IsKnownTrustedBuilder(\"paketobuildpacks/builder-jammy-base:latest\"))\n\t\t\th.AssertFalse(t, bldr.IsKnownTrustedBuilder(\"paketobuildpacks/builder-jammy-base:1.2.3\"))\n\t\t\th.AssertFalse(t, bldr.IsKnownTrustedBuilder(\"my/private/builder\"))\n\t\t})\n\t})\n\n\twhen(\"IsTrustedBuilder\", func() {\n\t\tit(\"trust image without tag\", func() {\n\t\t\tcfg := config.Config{\n\t\t\t\tTrustedBuilders: []config.TrustedBuilder{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"my/trusted/builder-jammy\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\ttrustedBuilders := []string{\n\t\t\t\t\"my/trusted/builder-jammy\",\n\t\t\t\t\"my/trusted/builder-jammy:latest\",\n\t\t\t\t\"my/trusted/builder-jammy:1.2.3\",\n\t\t\t}\n\n\t\t\tuntrustedBuilders := []string{\n\t\t\t\t\"my/private/builder\",            // random builder\n\t\t\t\t\"my/trusted/builder-jammy-base\", // shared prefix\n\t\t\t}\n\n\t\t\tfor _, builder := range trustedBuilders {\n\t\t\t\tisTrusted, err := bldr.IsTrustedBuilder(cfg, builder)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertTrue(t, isTrusted)\n\t\t\t}\n\n\t\t\tfor _, builder := range untrustedBuilders {\n\t\t\t\tisTrusted, err := bldr.IsTrustedBuilder(cfg, builder)\n\t\t\t\th.AssertNil(t, 
err)\n\t\t\t\th.AssertFalse(t, isTrusted)\n\t\t\t}\n\t\t})\n\t\tit(\"trust image with tag\", func() {\n\t\t\tcfg := config.Config{\n\t\t\t\tTrustedBuilders: []config.TrustedBuilder{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"my/trusted/builder-jammy:1.2.3\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"my/trusted/builder-jammy:latest\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\ttrustedBuilders := []string{\n\t\t\t\t\"my/trusted/builder-jammy:1.2.3\",\n\t\t\t\t\"my/trusted/builder-jammy:latest\",\n\t\t\t}\n\n\t\t\tuntrustedBuilders := []string{\n\t\t\t\t\"my/private/builder\",\n\t\t\t\t\"my/trusted/builder-jammy\",\n\t\t\t\t\"my/trusted/builder-jammy:2.0.0\",\n\t\t\t\t\"my/trusted/builder-jammy-base\",\n\t\t\t}\n\n\t\t\tfor _, builder := range trustedBuilders {\n\t\t\t\tisTrusted, err := bldr.IsTrustedBuilder(cfg, builder)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertTrue(t, isTrusted)\n\t\t\t}\n\n\t\t\tfor _, builder := range untrustedBuilders {\n\t\t\t\tisTrusted, err := bldr.IsTrustedBuilder(cfg, builder)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertFalse(t, isTrusted)\n\t\t\t}\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/builder/version.go",
    "content": "package builder\n\nimport (\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/pkg/errors\"\n)\n\n// Version is an extension to semver.Version to make it marshalable.\ntype Version struct {\n\tsemver.Version\n}\n\n// VersionMustParse parses a string into a Version\nfunc VersionMustParse(v string) *Version {\n\treturn &Version{Version: *semver.MustParse(v)}\n}\n\n// String returns the string value of the Version\nfunc (v *Version) String() string {\n\treturn v.Version.String()\n}\n\n// Equal compares two Versions\nfunc (v *Version) Equal(other *Version) bool {\n\tif other != nil {\n\t\treturn v.Version.Equal(&other.Version)\n\t}\n\n\treturn false\n}\n\n// MarshalText makes Version satisfy the encoding.TextMarshaler interface.\nfunc (v *Version) MarshalText() ([]byte, error) {\n\treturn []byte(v.Original()), nil\n}\n\n// UnmarshalText makes Version satisfy the encoding.TextUnmarshaler interface.\nfunc (v *Version) UnmarshalText(text []byte) error {\n\ts := string(text)\n\tw, err := semver.NewVersion(s)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"invalid semantic version %s\", s)\n\t}\n\n\tv.Version = *w\n\treturn nil\n}\n"
  },
  {
    "path": "internal/builder/version_test.go",
    "content": "package builder_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestVersion(t *testing.T) {\n\tspec.Run(t, \"testVersion\", testVersion, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testVersion(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#VersionMustParse\", func() {\n\t\tit(\"parses string\", func() {\n\t\t\ttestVersion := \"1.2.3\"\n\t\t\tversion := builder.VersionMustParse(testVersion)\n\t\t\th.AssertEq(t, testVersion, version.String())\n\t\t})\n\t})\n\n\twhen(\"#Equal\", func() {\n\t\tvar version = builder.VersionMustParse(\"1.2.3\")\n\n\t\tit(\"matches\", func() {\n\t\t\totherVersion := builder.VersionMustParse(\"1.2.3\")\n\t\t\th.AssertTrue(t, version.Equal(otherVersion))\n\t\t})\n\n\t\tit(\"returns false if doesn't match exactly\", func() {\n\t\t\totherVersion := builder.VersionMustParse(\"1.2\")\n\t\t\th.AssertFalse(t, version.Equal(otherVersion))\n\t\t})\n\n\t\tit(\"handles nil case\", func() {\n\t\t\th.AssertFalse(t, version.Equal(nil))\n\t\t})\n\t})\n\n\twhen(\"MarshalText\", func() {\n\t\tit(\"marshals text\", func() {\n\t\t\ttestVersion := \"1.2.3\"\n\t\t\tversion := builder.VersionMustParse(testVersion)\n\t\t\tbytesVersion, err := version.MarshalText()\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, bytesVersion, []byte(testVersion))\n\t\t})\n\t})\n\n\twhen(\"UnmarshalText\", func() {\n\t\tit(\"overwrites existing version\", func() {\n\t\t\ttestVersion := \"1.2.3\"\n\t\t\tversion := builder.VersionMustParse(testVersion)\n\n\t\t\tnewVersion := \"1.4.5\"\n\t\t\terr := version.UnmarshalText([]byte(newVersion))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, version.String(), newVersion)\n\t\t})\n\n\t\tit(\"fails if provided invalid semver\", func() {\n\t\t\ttestVersion := \"1.2.3\"\n\t\t\tversion := 
builder.VersionMustParse(testVersion)\n\n\t\t\tnewVersion := \"1.x\"\n\t\t\terr := version.UnmarshalText([]byte(newVersion))\n\t\t\th.AssertError(t, err, \"invalid semantic version\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/builder/writer/factory.go",
    "content": "package writer\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype Factory struct{}\n\ntype BuilderWriter interface {\n\tPrint(\n\t\tlogger logging.Logger,\n\t\tlocalRunImages []config.RunImage,\n\t\tlocal, remote *client.BuilderInfo,\n\t\tlocalErr, remoteErr error,\n\t\tbuilderInfo SharedBuilderInfo,\n\t) error\n}\n\ntype SharedBuilderInfo struct {\n\tName      string `json:\"builder_name\" yaml:\"builder_name\" toml:\"builder_name\"`\n\tTrusted   bool   `json:\"trusted\" yaml:\"trusted\" toml:\"trusted\"`\n\tIsDefault bool   `json:\"default\" yaml:\"default\" toml:\"default\"`\n}\n\ntype BuilderWriterFactory interface {\n\tWriter(kind string) (BuilderWriter, error)\n}\n\nfunc NewFactory() *Factory {\n\treturn &Factory{}\n}\n\nfunc (f *Factory) Writer(kind string) (BuilderWriter, error) {\n\tswitch kind {\n\tcase \"human-readable\":\n\t\treturn NewHumanReadable(), nil\n\tcase \"json\":\n\t\treturn NewJSON(), nil\n\tcase \"yaml\":\n\t\treturn NewYAML(), nil\n\tcase \"toml\":\n\t\treturn NewTOML(), nil\n\t}\n\n\treturn nil, fmt.Errorf(\"output format %s is not supported\", style.Symbol(kind))\n}\n"
  },
  {
    "path": "internal/builder/writer/factory_test.go",
    "content": "package writer_test\n\nimport (\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n)\n\nfunc TestFactory(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder Writer Factory\", testFactory, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testFactory(t *testing.T, when spec.G, it spec.S) {\n\tvar assert = h.NewAssertionManager(t)\n\n\twhen(\"Writer\", func() {\n\t\twhen(\"output format is human-readable\", func() {\n\t\t\tit(\"returns a HumanReadable writer\", func() {\n\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\treturnedWriter, err := factory.Writer(\"human-readable\")\n\t\t\t\tassert.Nil(err)\n\t\t\t\t_, ok := returnedWriter.(*writer.HumanReadable)\n\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\tok,\n\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.HumanReadable`\", returnedWriter),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"output format is json\", func() {\n\t\t\tit(\"returns a JSON writer\", func() {\n\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\treturnedWriter, err := factory.Writer(\"json\")\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\t_, ok := returnedWriter.(*writer.JSON)\n\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\tok,\n\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.JSON`\", returnedWriter),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"output format is yaml\", func() {\n\t\t\tit(\"returns a YAML writer\", func() {\n\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\treturnedWriter, err := factory.Writer(\"yaml\")\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\t_, ok := returnedWriter.(*writer.YAML)\n\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\tok,\n\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.YAML`\", 
returnedWriter),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"output format is toml\", func() {\n\t\t\tit(\"returns a TOML writer\", func() {\n\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\treturnedWriter, err := factory.Writer(\"toml\")\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\t_, ok := returnedWriter.(*writer.TOML)\n\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\tok,\n\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.TOML`\", returnedWriter),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"output format is not supported\", func() {\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t_, err := factory.Writer(\"mind-beam\")\n\t\t\t\tassert.ErrorWithMessage(err, \"output format 'mind-beam' is not supported\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/builder/writer/human_readable.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"strings\"\n\t\"text/tabwriter\"\n\t\"text/template\"\n\n\tstrs \"github.com/buildpacks/pack/internal/strings\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst (\n\twriterMinWidth     = 0\n\twriterTabWidth     = 0\n\tbuildpacksTabWidth = 8\n\textensionsTabWidth = 8\n\tdefaultTabWidth    = 4\n\twriterPadChar      = ' '\n\twriterFlags        = 0\n\tnone               = \"(none)\"\n\n\toutputTemplate = `\n{{ if ne .Info.Description \"\" -}}\nDescription: {{ .Info.Description }}\n\n{{ end -}}\n{{- if ne .Info.CreatedBy.Name \"\" -}}\nCreated By:\n  Name: {{ .Info.CreatedBy.Name }}\n  Version: {{ .Info.CreatedBy.Version }}\n\n{{ end -}}\n\nTrusted: {{.Trusted}}\n\n{{ if ne .Info.Stack \"\" -}}Stack:\n  ID: {{ .Info.Stack }}{{ end -}}\n{{- if .Verbose}}\n{{- if ne (len .Info.Mixins) 0 }}\n  Mixins:\n{{- end }}\n{{- range $index, $mixin := .Info.Mixins }}\n    {{ $mixin }}\n{{- end }}\n{{- end }}\n{{ .Lifecycle }}\n{{ .RunImages }}\n{{ .Buildpacks }}\n{{ .Order }}\n{{- if ne .Extensions \"\" }}\n{{ .Extensions }}\n{{- end }}\n{{- if ne .OrderExtensions \"\" }}\n{{ .OrderExtensions }}\n{{- end }}`\n)\n\ntype HumanReadable struct{}\n\nfunc NewHumanReadable() *HumanReadable {\n\treturn &HumanReadable{}\n}\n\nfunc (h *HumanReadable) Print(\n\tlogger logging.Logger,\n\tlocalRunImages []config.RunImage,\n\tlocal, remote *client.BuilderInfo,\n\tlocalErr, remoteErr error,\n\tbuilderInfo SharedBuilderInfo,\n) error {\n\tif local == nil && remote == nil {\n\t\treturn fmt.Errorf(\"unable to find builder '%s' locally or remotely\", builderInfo.Name)\n\t}\n\n\tif builderInfo.IsDefault 
{\n\t\tlogger.Infof(\"Inspecting default builder: %s\\n\", style.Symbol(builderInfo.Name))\n\t} else {\n\t\tlogger.Infof(\"Inspecting builder: %s\\n\", style.Symbol(builderInfo.Name))\n\t}\n\n\tlogger.Info(\"\\nREMOTE:\\n\")\n\terr := writeBuilderInfo(logger, localRunImages, remote, remoteErr, builderInfo)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"writing remote builder info: %w\", err)\n\t}\n\tlogger.Info(\"\\nLOCAL:\\n\")\n\terr = writeBuilderInfo(logger, localRunImages, local, localErr, builderInfo)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"writing local builder info: %w\", err)\n\t}\n\n\treturn nil\n}\n\nfunc writeBuilderInfo(\n\tlogger logging.Logger,\n\tlocalRunImages []config.RunImage,\n\tinfo *client.BuilderInfo,\n\terr error,\n\tsharedInfo SharedBuilderInfo,\n) error {\n\tif err != nil {\n\t\tlogger.Errorf(\"%s\\n\", err)\n\t\treturn nil\n\t}\n\n\tif info == nil {\n\t\tlogger.Info(\"(not present)\\n\")\n\t\treturn nil\n\t}\n\n\tvar warnings []string\n\n\trunImagesString, runImagesWarnings, err := runImagesOutput(info.RunImages, localRunImages, sharedInfo.Name)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"compiling run images output: %w\", err)\n\t}\n\torderString, orderWarnings, err := detectionOrderOutput(info.Order, sharedInfo.Name)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"compiling detection order output: %w\", err)\n\t}\n\n\tvar orderExtString string\n\tvar orderExtWarnings []string\n\n\tif info.Extensions != nil {\n\t\torderExtString, orderExtWarnings, err = detectionOrderExtOutput(info.OrderExtensions, sharedInfo.Name)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"compiling detection order extensions output: %w\", err)\n\t\t}\n\t}\n\tbuildpacksString, buildpacksWarnings, err := buildpacksOutput(info.Buildpacks, sharedInfo.Name)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"compiling buildpacks output: %w\", err)\n\t}\n\tlifecycleString, lifecycleWarnings := lifecycleOutput(info.Lifecycle, sharedInfo.Name)\n\n\tvar extensionsString string\n\tvar 
extensionsWarnings []string\n\n\tif info.Extensions != nil {\n\t\textensionsString, extensionsWarnings, err = extensionsOutput(info.Extensions, sharedInfo.Name)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"compiling extensions output: %w\", err)\n\t\t}\n\t}\n\n\twarnings = append(warnings, runImagesWarnings...)\n\twarnings = append(warnings, orderWarnings...)\n\twarnings = append(warnings, buildpacksWarnings...)\n\twarnings = append(warnings, lifecycleWarnings...)\n\tif info.Extensions != nil {\n\t\twarnings = append(warnings, extensionsWarnings...)\n\t\twarnings = append(warnings, orderExtWarnings...)\n\t}\n\toutputTemplate, _ := template.New(\"\").Parse(outputTemplate)\n\n\terr = outputTemplate.Execute(\n\t\tlogger.Writer(),\n\t\t&struct {\n\t\t\tInfo            client.BuilderInfo\n\t\t\tVerbose         bool\n\t\t\tBuildpacks      string\n\t\t\tRunImages       string\n\t\t\tOrder           string\n\t\t\tTrusted         string\n\t\t\tLifecycle       string\n\t\t\tExtensions      string\n\t\t\tOrderExtensions string\n\t\t}{\n\t\t\t*info,\n\t\t\tlogger.IsVerbose(),\n\t\t\tbuildpacksString,\n\t\t\trunImagesString,\n\t\t\torderString,\n\t\t\tstringFromBool(sharedInfo.Trusted),\n\t\t\tlifecycleString,\n\t\t\textensionsString,\n\t\t\torderExtString,\n\t\t},\n\t)\n\n\tfor _, warning := range warnings {\n\t\tlogger.Warn(warning)\n\t}\n\n\treturn err\n}\n\ntype trailingSpaceStrippingWriter struct {\n\toutput io.Writer\n\n\tpotentialDiscard []byte\n}\n\nfunc (w *trailingSpaceStrippingWriter) Write(p []byte) (n int, err error) {\n\tvar doWrite []byte\n\n\tfor _, b := range p {\n\t\tswitch b {\n\t\tcase writerPadChar:\n\t\t\tw.potentialDiscard = append(w.potentialDiscard, b)\n\t\tcase '\\n':\n\t\t\tw.potentialDiscard = []byte{}\n\t\t\tdoWrite = append(doWrite, b)\n\t\tdefault:\n\t\t\tdoWrite = append(doWrite, w.potentialDiscard...)\n\t\t\tdoWrite = append(doWrite, b)\n\t\t\tw.potentialDiscard = []byte{}\n\t\t}\n\t}\n\n\tif len(doWrite) > 0 {\n\t\tactualWrote, err := 
w.output.Write(doWrite)\n\t\tif err != nil {\n\t\t\treturn actualWrote, err\n\t\t}\n\t}\n\n\treturn len(p), nil\n}\n\nfunc stringFromBool(subject bool) string {\n\tif subject {\n\t\treturn \"Yes\"\n\t}\n\n\treturn \"No\"\n}\n\nfunc runImagesOutput(\n\trunImages []pubbldr.RunImageConfig,\n\tlocalRunImages []config.RunImage,\n\tbuilderName string,\n) (string, []string, error) {\n\toutput := \"Run Images:\\n\"\n\n\ttabWriterBuf := bytes.Buffer{}\n\n\tlocalMirrorTabWriter := tabwriter.NewWriter(&tabWriterBuf, writerMinWidth, writerTabWidth, defaultTabWidth, writerPadChar, writerFlags)\n\terr := writeLocalMirrors(localMirrorTabWriter, runImages, localRunImages)\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"writing local mirrors: %w\", err)\n\t}\n\n\tvar warnings []string\n\n\tif len(runImages) == 0 {\n\t\twarnings = append(\n\t\t\twarnings,\n\t\t\tfmt.Sprintf(\"%s does not specify a run image\", builderName),\n\t\t\t\"Users must build with an explicitly specified run image\",\n\t\t)\n\t} else {\n\t\tfor _, runImage := range runImages {\n\t\t\tif runImage.Image != \"\" {\n\t\t\t\t_, err = fmt.Fprintf(localMirrorTabWriter, \"  %s\\n\", runImage.Image)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn \"\", []string{}, fmt.Errorf(\"writing to tabwriter: %w\", err)\n\t\t\t\t}\n\t\t\t}\n\t\t\tfor _, m := range runImage.Mirrors {\n\t\t\t\t_, err = fmt.Fprintf(localMirrorTabWriter, \"  %s\\n\", m)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn \"\", []string{}, fmt.Errorf(\"writing to tab writer: %w\", err)\n\t\t\t\t}\n\t\t\t}\n\t\t\terr = localMirrorTabWriter.Flush()\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", []string{}, fmt.Errorf(\"flushing tab writer: %w\", err)\n\t\t\t}\n\t\t}\n\t}\n\trunImageOutput := tabWriterBuf.String()\n\tif runImageOutput == \"\" {\n\t\trunImageOutput = fmt.Sprintf(\"  %s\\n\", none)\n\t}\n\n\toutput += runImageOutput\n\n\treturn output, warnings, nil\n}\n\nfunc writeLocalMirrors(logWriter io.Writer, runImages []pubbldr.RunImageConfig, 
localRunImages []config.RunImage) error {\n\tfor _, i := range localRunImages {\n\t\tfor _, ri := range runImages {\n\t\t\tif i.Image == ri.Image {\n\t\t\t\tfor _, m := range i.Mirrors {\n\t\t\t\t\t_, err := fmt.Fprintf(logWriter, \"  %s\\t(user-configured)\\n\", m)\n\t\t\t\t\tif err != nil {\n\t\t\t\t\t\treturn fmt.Errorf(\"writing local mirror: %s: %w\", m, err)\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc extensionsOutput(extensions []dist.ModuleInfo, builderName string) (string, []string, error) {\n\toutput := \"Extensions:\\n\"\n\n\tif len(extensions) == 0 {\n\t\treturn fmt.Sprintf(\"%s  %s\\n\", output, none), nil, nil\n\t}\n\n\tvar (\n\t\ttabWriterBuf         = bytes.Buffer{}\n\t\tspaceStrippingWriter = &trailingSpaceStrippingWriter{\n\t\t\toutput: &tabWriterBuf,\n\t\t}\n\t\textensionsTabWriter = tabwriter.NewWriter(spaceStrippingWriter, writerMinWidth, writerTabWidth, extensionsTabWidth, writerPadChar, writerFlags)\n\t)\n\n\t_, err := fmt.Fprint(extensionsTabWriter, \"  ID\\tNAME\\tVERSION\\tHOMEPAGE\\n\")\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"writing to tab writer: %w\", err)\n\t}\n\n\tfor _, b := range extensions {\n\t\t_, err = fmt.Fprintf(extensionsTabWriter, \"  %s\\t%s\\t%s\\t%s\\n\", b.ID, strs.ValueOrDefault(b.Name, \"-\"), b.Version, strs.ValueOrDefault(b.Homepage, \"-\"))\n\t\tif err != nil {\n\t\t\treturn \"\", []string{}, fmt.Errorf(\"writing to tab writer: %w\", err)\n\t\t}\n\t}\n\n\terr = extensionsTabWriter.Flush()\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"flushing tab writer: %w\", err)\n\t}\n\n\toutput += tabWriterBuf.String()\n\treturn output, []string{}, nil\n}\n\nfunc buildpacksOutput(buildpacks []dist.ModuleInfo, builderName string) (string, []string, error) {\n\toutput := \"Buildpacks:\\n\"\n\n\tif len(buildpacks) == 0 {\n\t\twarnings := []string{\n\t\t\tfmt.Sprintf(\"%s has no buildpacks\", builderName),\n\t\t\t\"Users must supply buildpacks from the host 
machine\",\n\t\t}\n\n\t\treturn fmt.Sprintf(\"%s  %s\\n\", output, none), warnings, nil\n\t}\n\n\tvar (\n\t\ttabWriterBuf         = bytes.Buffer{}\n\t\tspaceStrippingWriter = &trailingSpaceStrippingWriter{\n\t\t\toutput: &tabWriterBuf,\n\t\t}\n\t\tbuildpacksTabWriter = tabwriter.NewWriter(spaceStrippingWriter, writerMinWidth, writerTabWidth, buildpacksTabWidth, writerPadChar, writerFlags)\n\t)\n\n\t_, err := fmt.Fprint(buildpacksTabWriter, \"  ID\\tNAME\\tVERSION\\tHOMEPAGE\\n\")\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"writing to tab writer: %w\", err)\n\t}\n\n\tfor _, b := range buildpacks {\n\t\t_, err = fmt.Fprintf(buildpacksTabWriter, \"  %s\\t%s\\t%s\\t%s\\n\", b.ID, strs.ValueOrDefault(b.Name, \"-\"), b.Version, strs.ValueOrDefault(b.Homepage, \"-\"))\n\t\tif err != nil {\n\t\t\treturn \"\", []string{}, fmt.Errorf(\"writing to tab writer: %w\", err)\n\t\t}\n\t}\n\n\terr = buildpacksTabWriter.Flush()\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"flushing tab writer: %w\", err)\n\t}\n\n\toutput += tabWriterBuf.String()\n\treturn output, []string{}, nil\n}\n\nconst lifecycleFormat = `\nLifecycle:\n  Version: %s\n  Buildpack APIs:\n    Deprecated: %s\n    Supported: %s\n  Platform APIs:\n    Deprecated: %s\n    Supported: %s\n`\n\nfunc lifecycleOutput(lifecycleInfo builder.LifecycleDescriptor, builderName string) (string, []string) {\n\tvar warnings []string\n\n\tversion := none\n\tif lifecycleInfo.Info.Version != nil {\n\t\tversion = lifecycleInfo.Info.Version.String()\n\t}\n\n\tif version == none {\n\t\twarnings = append(warnings, fmt.Sprintf(\"%s does not specify a Lifecycle version\", builderName))\n\t}\n\n\tsupportedBuildpackAPIs := stringFromAPISet(lifecycleInfo.APIs.Buildpack.Supported)\n\tif supportedBuildpackAPIs == none {\n\t\twarnings = append(warnings, fmt.Sprintf(\"%s does not specify supported Lifecycle Buildpack APIs\", builderName))\n\t}\n\n\tsupportedPlatformAPIs := 
stringFromAPISet(lifecycleInfo.APIs.Platform.Supported)\n\tif supportedPlatformAPIs == none {\n\t\twarnings = append(warnings, fmt.Sprintf(\"%s does not specify supported Lifecycle Platform APIs\", builderName))\n\t}\n\n\treturn fmt.Sprintf(\n\t\tlifecycleFormat,\n\t\tversion,\n\t\tstringFromAPISet(lifecycleInfo.APIs.Buildpack.Deprecated),\n\t\tsupportedBuildpackAPIs,\n\t\tstringFromAPISet(lifecycleInfo.APIs.Platform.Deprecated),\n\t\tsupportedPlatformAPIs,\n\t), warnings\n}\n\nfunc stringFromAPISet(versions builder.APISet) string {\n\tif len(versions) == 0 {\n\t\treturn none\n\t}\n\n\treturn strings.Join(versions.AsStrings(), \", \")\n}\n\nconst (\n\tbranchPrefix     = \" ├ \"\n\tlastBranchPrefix = \" └ \"\n\ttrunkPrefix      = \" │ \"\n)\n\nfunc detectionOrderOutput(order pubbldr.DetectionOrder, builderName string) (string, []string, error) {\n\toutput := \"Detection Order:\\n\"\n\n\tif len(order) == 0 {\n\t\twarnings := []string{\n\t\t\tfmt.Sprintf(\"%s has no buildpacks\", builderName),\n\t\t\t\"Users must build with explicitly specified buildpacks\",\n\t\t}\n\n\t\treturn fmt.Sprintf(\"%s  %s\\n\", output, none), warnings, nil\n\t}\n\n\ttabWriterBuf := bytes.Buffer{}\n\tspaceStrippingWriter := &trailingSpaceStrippingWriter{\n\t\toutput: &tabWriterBuf,\n\t}\n\n\tdetectionOrderTabWriter := tabwriter.NewWriter(spaceStrippingWriter, writerMinWidth, writerTabWidth, defaultTabWidth, writerPadChar, writerFlags)\n\terr := writeDetectionOrderGroup(detectionOrderTabWriter, order, \"\")\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"writing detection order group: %w\", err)\n\t}\n\terr = detectionOrderTabWriter.Flush()\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"flushing tab writer: %w\", err)\n\t}\n\n\toutput += tabWriterBuf.String()\n\treturn output, []string{}, nil\n}\n\nfunc detectionOrderExtOutput(order pubbldr.DetectionOrder, builderName string) (string, []string, error) {\n\toutput := \"Detection Order (Extensions):\\n\"\n\n\tif 
len(order) == 0 {\n\t\treturn fmt.Sprintf(\"%s  %s\\n\", output, none), nil, nil\n\t}\n\n\ttabWriterBuf := bytes.Buffer{}\n\tspaceStrippingWriter := &trailingSpaceStrippingWriter{\n\t\toutput: &tabWriterBuf,\n\t}\n\n\tdetectionOrderExtTabWriter := tabwriter.NewWriter(spaceStrippingWriter, writerMinWidth, writerTabWidth, defaultTabWidth, writerPadChar, writerFlags)\n\terr := writeDetectionOrderGroup(detectionOrderExtTabWriter, order, \"\")\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"writing detection order group: %w\", err)\n\t}\n\terr = detectionOrderExtTabWriter.Flush()\n\tif err != nil {\n\t\treturn \"\", []string{}, fmt.Errorf(\"flushing tab writer: %w\", err)\n\t}\n\n\toutput += tabWriterBuf.String()\n\treturn output, []string{}, nil\n}\n\nfunc writeDetectionOrderGroup(writer io.Writer, order pubbldr.DetectionOrder, prefix string) error {\n\tgroupNumber := 0\n\n\tfor i, orderEntry := range order {\n\t\tlastInGroup := i == len(order)-1\n\t\tincludesSubGroup := len(orderEntry.GroupDetectionOrder) > 0\n\n\t\torderPrefix, err := writeAndUpdateEntryPrefix(writer, lastInGroup, prefix)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"writing detection group prefix: %w\", err)\n\t\t}\n\n\t\tif includesSubGroup {\n\t\t\tgroupPrefix := orderPrefix\n\n\t\t\tif orderEntry.ID != \"\" {\n\t\t\t\terr = writeDetectionOrderBuildpack(writer, orderEntry)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn fmt.Errorf(\"writing detection order buildpack: %w\", err)\n\t\t\t\t}\n\n\t\t\t\t_, err = fmt.Fprintf(writer, \"%s%s\", groupPrefix, lastBranchPrefix)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn fmt.Errorf(\"writing to detection order group writer: %w\", err)\n\t\t\t\t}\n\t\t\t\tgroupPrefix = fmt.Sprintf(\"%s   \", groupPrefix)\n\t\t\t}\n\n\t\t\tgroupNumber++\n\t\t\t_, err = fmt.Fprintf(writer, \"Group #%d:\\n\", groupNumber)\n\t\t\tif err != nil {\n\t\t\t\treturn fmt.Errorf(\"writing to detection order group writer: %w\", err)\n\t\t\t}\n\t\t\terr = writeDetectionOrderGroup(writer, orderEntry.GroupDetectionOrder, groupPrefix)\n\t\t\tif err != nil {\n\t\t\t\treturn fmt.Errorf(\"writing detection order group: %w\", err)\n\t\t\t}\n\t\t} else {\n\t\t\terr := writeDetectionOrderBuildpack(writer, orderEntry)\n\t\t\tif err != nil {\n\t\t\t\treturn fmt.Errorf(\"writing detection order buildpack: %w\", err)\n\t\t\t}\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc writeAndUpdateEntryPrefix(writer io.Writer, last bool, prefix string) (string, error) {\n\tif last {\n\t\t_, err := fmt.Fprintf(writer, \"%s%s\", prefix, lastBranchPrefix)\n\t\tif err != nil {\n\t\t\treturn \"\", fmt.Errorf(\"writing detection order prefix: %w\", err)\n\t\t}\n\t\treturn fmt.Sprintf(\"%s%s\", prefix, \"   \"), nil\n\t}\n\n\t_, err := fmt.Fprintf(writer, \"%s%s\", prefix, branchPrefix)\n\tif err != nil {\n\t\treturn \"\", fmt.Errorf(\"writing detection order prefix: %w\", err)\n\t}\n\treturn fmt.Sprintf(\"%s%s\", prefix, trunkPrefix), nil\n}\n\nfunc writeDetectionOrderBuildpack(writer io.Writer, entry pubbldr.DetectionOrderEntry) error {\n\t_, err := fmt.Fprintf(\n\t\twriter,\n\t\t\"%s\\t%s%s\\n\",\n\t\tentry.FullName(),\n\t\tstringFromOptional(entry.Optional),\n\t\tstringFromCyclical(entry.Cyclical),\n\t)\n\n\tif err != nil {\n\t\treturn fmt.Errorf(\"writing buildpack in detection order: %w\", err)\n\t}\n\n\treturn nil\n}\n\nfunc stringFromOptional(optional bool) string {\n\tif optional {\n\t\treturn \"(optional)\"\n\t}\n\n\treturn \"\"\n}\n\nfunc stringFromCyclical(cyclical bool) string {\n\tif cyclical {\n\t\treturn \"[cyclic]\"\n\t}\n\n\treturn \"\"\n}\n"
  },
  {
    "path": "internal/builder/writer/human_readable_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"testing\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestHumanReadable(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder Writer\", testHumanReadable, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testHumanReadable(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo *client.BuilderInfo\n\t\tlocalInfo  *client.BuilderInfo\n\n\t\texpectedRemoteOutput = `\nREMOTE:\n\nDescription: Some remote description\n\nCreated By:\n  Name: Pack CLI\n  Version: 1.2.3\n\nTrusted: No\n\nStack:\n  ID: test.stack.id\n\nLifecycle:\n  Version: 6.7.8\n  Buildpack APIs:\n    Deprecated: (none)\n    Supported: 1.2, 2.3\n  Platform APIs:\n    Deprecated: 0.1, 1.2\n    Supported: 4.5\n\nRun Images:\n  first/local     (user-configured)\n  second/local    (user-configured)\n  some/run-image\n  first/default\n  second/default\n\nBuildpacks:\n  ID                     NAME        VERSION                        HOMEPAGE\n  test.top.nested        -           test.top.nested.version        -\n  test.nested            -                                          http://geocities.com/top-bp\n  test.bp.one            -           test.bp.one.version            http://geocities.com/cool-bp\n  test.bp.two            -           test.bp.two.version            -\n 
 test.bp.three          -           test.bp.three.version          -\n\nDetection Order:\n ├ Group #1:\n │  ├ test.top.nested@test.top.nested.version\n │  │  └ Group #1:\n │  │     ├ test.nested\n │  │     │  └ Group #1:\n │  │     │     └ test.bp.one@test.bp.one.version      (optional)\n │  │     ├ test.bp.three@test.bp.three.version        (optional)\n │  │     └ test.nested.two@test.nested.two.version\n │  │        └ Group #2:\n │  │           └ test.bp.one@test.bp.one.version    (optional)[cyclic]\n │  └ test.bp.two@test.bp.two.version                (optional)\n └ test.bp.three@test.bp.three.version\n\nExtensions:\n  ID                   NAME        VERSION                      HOMEPAGE\n  test.bp.one          -           test.bp.one.version          http://geocities.com/cool-bp\n  test.bp.two          -           test.bp.two.version          -\n  test.bp.three        -           test.bp.three.version        -\n\nDetection Order (Extensions):\n ├ test.top.nested@test.top.nested.version\n ├ test.bp.one@test.bp.one.version            (optional)\n ├ test.bp.two@test.bp.two.version            (optional)\n └ test.bp.three@test.bp.three.version\n`\n\t\texpectedRemoteOutputWithoutExtensions = `\nREMOTE:\n\nDescription: Some remote description\n\nCreated By:\n  Name: Pack CLI\n  Version: 1.2.3\n\nTrusted: No\n\nStack:\n  ID: test.stack.id\n\nLifecycle:\n  Version: 6.7.8\n  Buildpack APIs:\n    Deprecated: (none)\n    Supported: 1.2, 2.3\n  Platform APIs:\n    Deprecated: 0.1, 1.2\n    Supported: 4.5\n\nRun Images:\n  first/local     (user-configured)\n  second/local    (user-configured)\n  some/run-image\n  first/default\n  second/default\n\nBuildpacks:\n  ID                     NAME        VERSION                        HOMEPAGE\n  test.top.nested        -           test.top.nested.version        -\n  test.nested            -                                          http://geocities.com/top-bp\n  test.bp.one            -           test.bp.one.version            
http://geocities.com/cool-bp\n  test.bp.two            -           test.bp.two.version            -\n  test.bp.three          -           test.bp.three.version          -\n\nDetection Order:\n ├ Group #1:\n │  ├ test.top.nested@test.top.nested.version\n │  │  └ Group #1:\n │  │     ├ test.nested\n │  │     │  └ Group #1:\n │  │     │     └ test.bp.one@test.bp.one.version      (optional)\n │  │     ├ test.bp.three@test.bp.three.version        (optional)\n │  │     └ test.nested.two@test.nested.two.version\n │  │        └ Group #2:\n │  │           └ test.bp.one@test.bp.one.version    (optional)[cyclic]\n │  └ test.bp.two@test.bp.two.version                (optional)\n └ test.bp.three@test.bp.three.version\n`\n\n\t\texpectedLocalOutput = `\nLOCAL:\n\nDescription: Some local description\n\nCreated By:\n  Name: Pack CLI\n  Version: 4.5.6\n\nTrusted: No\n\nStack:\n  ID: test.stack.id\n\nLifecycle:\n  Version: 4.5.6\n  Buildpack APIs:\n    Deprecated: 4.5, 6.7\n    Supported: 8.9, 10.11\n  Platform APIs:\n    Deprecated: (none)\n    Supported: 7.8\n\nRun Images:\n  first/local     (user-configured)\n  second/local    (user-configured)\n  some/run-image\n  first/local-default\n  second/local-default\n\nBuildpacks:\n  ID                     NAME        VERSION                        HOMEPAGE\n  test.top.nested        -           test.top.nested.version        -\n  test.nested            -                                          http://geocities.com/top-bp\n  test.bp.one            -           test.bp.one.version            http://geocities.com/cool-bp\n  test.bp.two            -           test.bp.two.version            -\n  test.bp.three          -           test.bp.three.version          -\n\nDetection Order:\n ├ Group #1:\n │  ├ test.top.nested@test.top.nested.version\n │  │  └ Group #1:\n │  │     ├ test.nested\n │  │     │  └ Group #1:\n │  │     │     └ test.bp.one@test.bp.one.version      (optional)\n │  │     ├ test.bp.three@test.bp.three.version        
(optional)\n │  │     └ test.nested.two@test.nested.two.version\n │  │        └ Group #2:\n │  │           └ test.bp.one@test.bp.one.version    (optional)[cyclic]\n │  └ test.bp.two@test.bp.two.version                (optional)\n └ test.bp.three@test.bp.three.version\n\nExtensions:\n  ID                   NAME        VERSION                      HOMEPAGE\n  test.bp.one          -           test.bp.one.version          http://geocities.com/cool-bp\n  test.bp.two          -           test.bp.two.version          -\n  test.bp.three        -           test.bp.three.version        -\n\nDetection Order (Extensions):\n ├ test.top.nested@test.top.nested.version\n ├ test.bp.one@test.bp.one.version            (optional)\n ├ test.bp.two@test.bp.two.version            (optional)\n └ test.bp.three@test.bp.three.version\n`\n\n\t\texpectedLocalOutputWithoutExtensions = `\nLOCAL:\n\nDescription: Some local description\n\nCreated By:\n  Name: Pack CLI\n  Version: 4.5.6\n\nTrusted: No\n\nStack:\n  ID: test.stack.id\n\nLifecycle:\n  Version: 4.5.6\n  Buildpack APIs:\n    Deprecated: 4.5, 6.7\n    Supported: 8.9, 10.11\n  Platform APIs:\n    Deprecated: (none)\n    Supported: 7.8\n\nRun Images:\n  first/local     (user-configured)\n  second/local    (user-configured)\n  some/run-image\n  first/local-default\n  second/local-default\n\nBuildpacks:\n  ID                     NAME        VERSION                        HOMEPAGE\n  test.top.nested        -           test.top.nested.version        -\n  test.nested            -                                          http://geocities.com/top-bp\n  test.bp.one            -           test.bp.one.version            http://geocities.com/cool-bp\n  test.bp.two            -           test.bp.two.version            -\n  test.bp.three          -           test.bp.three.version          -\n\nDetection Order:\n ├ Group #1:\n │  ├ test.top.nested@test.top.nested.version\n │  │  └ Group #1:\n │  │     ├ test.nested\n │  │     │  └ Group #1:\n │  │     │  
   └ test.bp.one@test.bp.one.version      (optional)\n │  │     ├ test.bp.three@test.bp.three.version        (optional)\n │  │     └ test.nested.two@test.nested.two.version\n │  │        └ Group #2:\n │  │           └ test.bp.one@test.bp.one.version    (optional)[cyclic]\n │  └ test.bp.two@test.bp.two.version                (optional)\n └ test.bp.three@test.bp.three.version\n`\n\n\t\texpectedVerboseStack = `\nStack:\n  ID: test.stack.id\n  Mixins:\n    mixin1\n    mixin2\n    build:mixin3\n    build:mixin4\n`\n\t\texpectedNilLifecycleVersion = `\nLifecycle:\n  Version: (none)\n`\n\t\texpectedEmptyRunImages = `\nRun Images:\n  (none)\n`\n\t\texpectedEmptyBuildpacks = `\nBuildpacks:\n  (none)\n`\n\t\texpectedEmptyOrder = `\nDetection Order:\n  (none)\n`\n\t\texpectedEmptyOrderExt = `\nDetection Order (Extensions):\n  (none)\n`\n\t\texpectedMissingLocalInfo = `\nLOCAL:\n(not present)\n`\n\t\texpectedMissingRemoteInfo = `\nREMOTE:\n(not present)\n`\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\tremoteInfo = &client.BuilderInfo{\n\t\t\t\tDescription:     \"Some remote description\",\n\t\t\t\tStack:           \"test.stack.id\",\n\t\t\t\tMixins:          []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"},\n\t\t\t\tRunImages:       []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"first/default\", \"second/default\"}}},\n\t\t\t\tBuildpacks:      buildpacks,\n\t\t\t\tOrder:           order,\n\t\t\t\tExtensions:      extensions,\n\t\t\t\tOrderExtensions: orderExtensions,\n\t\t\t\tBuildpackLayers: dist.ModuleLayers{},\n\t\t\t\tLifecycle: builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\tVersion: *semver.MustParse(\"6.7.8\"),\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: nil,\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"1.2\"), 
api.MustParse(\"2.3\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"0.1\"), api.MustParse(\"1.2\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"4.5\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\tName:    \"Pack CLI\",\n\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tlocalInfo = &client.BuilderInfo{\n\t\t\t\tDescription:     \"Some local description\",\n\t\t\t\tStack:           \"test.stack.id\",\n\t\t\t\tMixins:          []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"},\n\t\t\t\tRunImages:       []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"first/local-default\", \"second/local-default\"}}},\n\t\t\t\tBuildpacks:      buildpacks,\n\t\t\t\tOrder:           order,\n\t\t\t\tExtensions:      extensions,\n\t\t\t\tOrderExtensions: orderExtensions,\n\t\t\t\tBuildpackLayers: dist.ModuleLayers{},\n\t\t\t\tLifecycle: builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\tVersion: *semver.MustParse(\"4.5.6\"),\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"4.5\"), api.MustParse(\"6.7\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"8.9\"), api.MustParse(\"10.11\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: nil,\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"7.8\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\tName:    \"Pack CLI\",\n\t\t\t\t\tVersion: \"4.5.6\",\n\t\t\t\t},\n\t\t\t}\n\n\t\t\toutBuf = bytes.Buffer{}\n\t\t})\n\n\t\tit(\"prints both local and remote builders in a human readable format\", func() {\n\t\t\thumanReadableWriter := 
writer.NewHumanReadable()\n\n\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Contains(outBuf.String(), \"Inspecting builder: 'test-builder'\")\n\t\t\tassert.Contains(outBuf.String(), expectedRemoteOutput)\n\t\t\tassert.Contains(outBuf.String(), expectedLocalOutput)\n\t\t})\n\n\t\twhen(\"builder is default\", func() {\n\t\t\tit(\"prints inspecting default builder\", func() {\n\t\t\t\tdefaultSharedBuildInfo := sharedBuilderInfo\n\t\t\t\tdefaultSharedBuildInfo.IsDefault = true\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, defaultSharedBuildInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), \"Inspecting default builder: 'test-builder'\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist locally or remotely\", func() {\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\tlocalInfo = nil\n\t\t\t\tremoteInfo = nil\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"unable to find builder 'test-builder' locally or remotely\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist locally\", func() {\n\t\t\tit(\"shows not present for local builder, and normal output for remote\", func() {\n\t\t\t\tlocalInfo = nil\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, 
sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedMissingLocalInfo)\n\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist remotely\", func() {\n\t\t\tit(\"shows not present for remote builder, and normal output for local\", func() {\n\t\t\t\tremoteInfo = nil\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedMissingRemoteInfo)\n\t\t\t\tassert.Contains(outBuf.String(), expectedLocalOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"localErr is an error\", func() {\n\t\t\tit(\"error is logged, local info is not displayed, but remote info is\", func() {\n\t\t\t\terrorMessage := \"failed to retrieve local info\"\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, errors.New(errorMessage), nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), errorMessage)\n\t\t\t\tassert.NotContains(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"remoteErr is an error\", func() {\n\t\t\tit(\"error is logged, remote info is not displayed, but local info is\", func() {\n\t\t\t\terrorMessage := \"failed to retrieve remote info\"\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, errors.New(errorMessage), 
sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), errorMessage)\n\t\t\t\tassert.NotContains(outBuf.String(), expectedRemoteOutput)\n\t\t\t\tassert.Contains(outBuf.String(), expectedLocalOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"description is blank\", func() {\n\t\t\tit(\"doesn't print the description block\", func() {\n\t\t\t\tlocalInfo.Description = \"\"\n\t\t\t\tremoteInfo.Description = \"\"\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"Description:\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"created by name is blank\", func() {\n\t\t\tit(\"doesn't print created by block\", func() {\n\t\t\t\tlocalInfo.CreatedBy.Name = \"\"\n\t\t\t\tremoteInfo.CreatedBy.Name = \"\"\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"Created By:\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"logger is verbose\", func() {\n\t\t\tit(\"displays mixins associated with the stack\", func() {\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedVerboseStack)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"lifecycle version is not set\", func() {\n\t\t\tit(\"displays lifecycle version as (none) and warns that version is not set\", func() {\n\t\t\t\tlocalInfo.Lifecycle.Info.Version 
= nil\n\t\t\t\tremoteInfo.Lifecycle.Info.Version = nil\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedNilLifecycleVersion)\n\t\t\t\tassert.Contains(outBuf.String(), \"test-builder does not specify a Lifecycle version\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there are no supported buildpack APIs specified\", func() {\n\t\t\tit(\"prints a warning\", func() {\n\t\t\t\tlocalInfo.Lifecycle.APIs.Buildpack.Supported = builder.APISet{}\n\t\t\t\tremoteInfo.Lifecycle.APIs.Buildpack.Supported = builder.APISet{}\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), \"test-builder does not specify supported Lifecycle Buildpack APIs\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there are no supported platform APIs specified\", func() {\n\t\t\tit(\"prints a warning\", func() {\n\t\t\t\tlocalInfo.Lifecycle.APIs.Platform.Supported = builder.APISet{}\n\t\t\t\tremoteInfo.Lifecycle.APIs.Platform.Supported = builder.APISet{}\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), \"test-builder does not specify supported Lifecycle Platform APIs\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no run images are specified\", func() {\n\t\t\tit(\"displays run images as (none) and warns about unset run image\", func() {\n\t\t\t\tlocalInfo.RunImages 
= []pubbldr.RunImageConfig{}\n\t\t\t\tremoteInfo.RunImages = []pubbldr.RunImageConfig{}\n\t\t\t\temptyLocalRunImages := []config.RunImage{}\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, emptyLocalRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedEmptyRunImages)\n\t\t\t\tassert.Contains(outBuf.String(), \"test-builder does not specify a run image\")\n\t\t\t\tassert.Contains(outBuf.String(), \"Users must build with an explicitly specified run image\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no buildpacks are specified\", func() {\n\t\t\tit(\"displays buildpacks as (none) and prints warnings\", func() {\n\t\t\t\tlocalInfo.Buildpacks = []dist.ModuleInfo{}\n\t\t\t\tremoteInfo.Buildpacks = []dist.ModuleInfo{}\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedEmptyBuildpacks)\n\t\t\t\tassert.Contains(outBuf.String(), \"test-builder has no buildpacks\")\n\t\t\t\tassert.Contains(outBuf.String(), \"Users must supply buildpacks from the host machine\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no extensions are specified\", func() {\n\t\t\tit(\"displays no extensions as (none)\", func() {\n\t\t\t\tlocalInfo.Extensions = []dist.ModuleInfo{}\n\t\t\t\tremoteInfo.Extensions = []dist.ModuleInfo{}\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), \"Inspecting 
builder: 'test-builder'\")\n\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteOutputWithoutExtensions)\n\t\t\t\tassert.Contains(outBuf.String(), expectedLocalOutputWithoutExtensions)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"multiple top level groups\", func() {\n\t\t\tit(\"displays order correctly\", func() {\n\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no detection order is specified\", func() {\n\t\t\tit(\"displays detection order as (none) and prints warnings\", func() {\n\t\t\t\tlocalInfo.Order = pubbldr.DetectionOrder{}\n\t\t\t\tremoteInfo.Order = pubbldr.DetectionOrder{}\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedEmptyOrder)\n\t\t\t\tassert.Contains(outBuf.String(), \"test-builder has no buildpacks\")\n\t\t\t\tassert.Contains(outBuf.String(), \"Users must build with explicitly specified buildpacks\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no detection order for extension is specified\", func() {\n\t\t\tit(\"displays detection order for extensions as (none)\", func() {\n\t\t\t\tlocalInfo.OrderExtensions = pubbldr.DetectionOrder{}\n\t\t\t\tremoteInfo.OrderExtensions = pubbldr.DetectionOrder{}\n\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedEmptyOrderExt)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/builder/writer/json.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\t\"encoding/json\"\n)\n\ntype JSON struct {\n\tStructuredFormat\n}\n\nfunc NewJSON() BuilderWriter {\n\treturn &JSON{\n\t\tStructuredFormat: StructuredFormat{\n\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\tbuf, err := json.Marshal(i)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\t\t\t\tformattedBuf := bytes.NewBuffer(nil)\n\t\t\t\terr = json.Indent(formattedBuf, buf, \"\", \"  \")\n\t\t\t\treturn formattedBuf.Bytes(), err\n\t\t\t},\n\t\t},\n\t}\n}\n"
  },
  {
    "path": "internal/builder/writer/json_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"encoding/json\"\n\t\"errors\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestJSON(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder Writer\", testJSON, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testJSON(t *testing.T, when spec.G, it spec.S) {\n\tconst (\n\t\texpectedRemoteRunImages = `\"run_images\": [\n      {\n        \"name\": \"first/local\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"second/local\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"some/run-image\"\n      },\n      {\n        \"name\": \"first/default\"\n      },\n      {\n        \"name\": \"second/default\"\n      }\n    ]`\n\t\texpectedLocalRunImages = `\"run_images\": [\n      {\n        \"name\": \"first/local\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"second/local\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"some/run-image\"\n      },\n      {\n        \"name\": \"first/local-default\"\n      },\n      {\n        \"name\": \"second/local-default\"\n      }\n    ]`\n\n\t\texpectedBuildpacks = `\"buildpacks\": [\n      {\n        \"id\": \"test.top.nested\",\n        \"version\": \"test.top.nested.version\"\n      },\n      {\n        \"id\": \"test.nested\",\n        \"homepage\": 
\"http://geocities.com/top-bp\"\n      },\n      {\n        \"id\": \"test.bp.one\",\n        \"version\": \"test.bp.one.version\",\n        \"homepage\": \"http://geocities.com/cool-bp\"\n      },\n      {\n        \"id\": \"test.bp.two\",\n        \"version\": \"test.bp.two.version\"\n      },\n      {\n        \"id\": \"test.bp.three\",\n        \"version\": \"test.bp.three.version\"\n      }\n    ]`\n\n\t\texpectedExtensions = `\"extensions\": [\n      {\n        \"homepage\": \"http://geocities.com/cool-bp\",\n        \"id\": \"test.bp.one\",\n        \"version\": \"test.bp.one.version\"\n      },\n      {\n        \"id\": \"test.bp.two\",\n        \"version\": \"test.bp.two.version\"\n      },\n      {\n        \"id\": \"test.bp.three\",\n        \"version\": \"test.bp.three.version\"\n      }\n    ]`\n\t\texpectedDetectionOrder = `\"detection_order\": [\n      {\n        \"buildpacks\": [\n          {\n            \"id\": \"test.top.nested\",\n            \"version\": \"test.top.nested.version\",\n            \"buildpacks\": [\n              {\n                \"id\": \"test.nested\",\n                \"homepage\": \"http://geocities.com/top-bp\",\n                \"buildpacks\": [\n                  {\n                    \"id\": \"test.bp.one\",\n                    \"version\": \"test.bp.one.version\",\n                    \"homepage\": \"http://geocities.com/cool-bp\",\n                    \"optional\": true\n                  }\n                ]\n              },\n              {\n                \"id\": \"test.bp.three\",\n                \"version\": \"test.bp.three.version\",\n                \"optional\": true\n              },\n              {\n                \"id\": \"test.nested.two\",\n                \"version\": \"test.nested.two.version\",\n                \"buildpacks\": [\n                  {\n                    \"id\": \"test.bp.one\",\n                    \"version\": \"test.bp.one.version\",\n                    \"homepage\": 
\"http://geocities.com/cool-bp\",\n                    \"optional\": true,\n                    \"cyclic\": true\n                  }\n                ]\n              }\n            ]\n          },\n          {\n            \"id\": \"test.bp.two\",\n            \"version\": \"test.bp.two.version\",\n            \"optional\": true\n          }\n        ]\n      },\n      {\n        \"id\": \"test.bp.three\",\n        \"version\": \"test.bp.three.version\"\n      }\n    ]`\n\t\texpectedOrderExtensions = `\"order_extensions\": [\n\t  {\n\t\t\"id\": \"test.top.nested\",\n\t\t\"version\": \"test.top.nested.version\"\n\t  },\n\t  {\n\t\t\"homepage\": \"http://geocities.com/cool-bp\",\n\t\t\"id\": \"test.bp.one\",\n\t\t\"version\": \"test.bp.one.version\",\n\t\t\"optional\": true\n\t  },\n\t  {\n\t\t\"id\": \"test.bp.two\",\n\t\t\"version\": \"test.bp.two.version\",\n\t\t\"optional\": true\n\t  },\n      {\n        \"id\": \"test.bp.three\",\n        \"version\": \"test.bp.three.version\"\n      }\n    ]`\n\t\texpectedStackWithMixins = `\"stack\": {\n      \"id\": \"test.stack.id\",\n      \"mixins\": [\n        \"mixin1\",\n        \"mixin2\",\n        \"build:mixin3\",\n        \"build:mixin4\"\n      ]\n    }`\n\t)\n\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo *client.BuilderInfo\n\t\tlocalInfo  *client.BuilderInfo\n\n\t\texpectedRemoteInfo = fmt.Sprintf(`\"remote_info\": {\n    \"description\": \"Some remote description\",\n    \"created_by\": {\n      \"name\": \"Pack CLI\",\n      \"version\": \"1.2.3\"\n    },\n    \"stack\": {\n      \"id\": \"test.stack.id\"\n    },\n    \"lifecycle\": {\n      \"version\": \"6.7.8\",\n      \"buildpack_apis\": {\n        \"deprecated\": null,\n        \"supported\": [\n          \"1.2\",\n          \"2.3\"\n        ]\n      },\n      \"platform_apis\": {\n        \"deprecated\": [\n          \"0.1\",\n          \"1.2\"\n        ],\n        \"supported\": [\n          \"4.5\"\n      
  ]\n      }\n    },\n    %s,\n    %s,\n    %s,\n\t%s,\n\t%s\n  }`, expectedRemoteRunImages, expectedBuildpacks, expectedDetectionOrder, expectedExtensions, expectedOrderExtensions)\n\n\t\texpectedLocalInfo = fmt.Sprintf(`\"local_info\": {\n    \"description\": \"Some local description\",\n    \"created_by\": {\n      \"name\": \"Pack CLI\",\n      \"version\": \"4.5.6\"\n    },\n    \"stack\": {\n      \"id\": \"test.stack.id\"\n    },\n    \"lifecycle\": {\n      \"version\": \"4.5.6\",\n      \"buildpack_apis\": {\n        \"deprecated\": [\n          \"4.5\",\n          \"6.7\"\n        ],\n        \"supported\": [\n          \"8.9\",\n          \"10.11\"\n        ]\n      },\n      \"platform_apis\": {\n        \"deprecated\": null,\n        \"supported\": [\n          \"7.8\"\n        ]\n      }\n    },\n    %s,\n    %s,\n    %s,\n\t%s,\n\t%s\n  }`, expectedLocalRunImages, expectedBuildpacks, expectedDetectionOrder, expectedExtensions, expectedOrderExtensions)\n\n\t\texpectedPrettifiedJSON = fmt.Sprintf(`{\n  \"builder_name\": \"test-builder\",\n  \"trusted\": false,\n  \"default\": false,\n  %s,\n  %s\n}\n`, expectedRemoteInfo, expectedLocalInfo)\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\tremoteInfo = &client.BuilderInfo{\n\t\t\t\tDescription:     \"Some remote description\",\n\t\t\t\tStack:           \"test.stack.id\",\n\t\t\t\tMixins:          []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"},\n\t\t\t\tRunImages:       []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"first/default\", \"second/default\"}}},\n\t\t\t\tBuildpacks:      buildpacks,\n\t\t\t\tOrder:           order,\n\t\t\t\tExtensions:      extensions,\n\t\t\t\tOrderExtensions: orderExtensions,\n\t\t\t\tBuildpackLayers: dist.ModuleLayers{},\n\t\t\t\tLifecycle: builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\tVersion: 
*semver.MustParse(\"6.7.8\"),\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: nil,\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"1.2\"), api.MustParse(\"2.3\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"0.1\"), api.MustParse(\"1.2\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"4.5\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\tName:    \"Pack CLI\",\n\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tlocalInfo = &client.BuilderInfo{\n\t\t\t\tDescription:     \"Some local description\",\n\t\t\t\tStack:           \"test.stack.id\",\n\t\t\t\tMixins:          []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"},\n\t\t\t\tRunImages:       []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"first/local-default\", \"second/local-default\"}}},\n\t\t\t\tBuildpacks:      buildpacks,\n\t\t\t\tOrder:           order,\n\t\t\t\tExtensions:      extensions,\n\t\t\t\tOrderExtensions: orderExtensions,\n\t\t\t\tBuildpackLayers: dist.ModuleLayers{},\n\t\t\t\tLifecycle: builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\tVersion: *semver.MustParse(\"4.5.6\"),\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"4.5\"), api.MustParse(\"6.7\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"8.9\"), api.MustParse(\"10.11\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: nil,\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"7.8\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\tName:    
\"Pack CLI\",\n\t\t\t\t\tVersion: \"4.5.6\",\n\t\t\t\t},\n\t\t\t}\n\t\t})\n\n\t\tit(\"prints both local and remote builders as valid JSON\", func() {\n\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\terr := jsonWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\tassert.Nil(err)\n\n\t\t\tprettyJSON, err := validPrettifiedJSONOutput(outBuf)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.ContainsJSON(prettyJSON, expectedPrettifiedJSON)\n\t\t})\n\n\t\twhen(\"builder doesn't exist locally or remotely\", func() {\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonWriter.Print(logger, localRunImages, nil, nil, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"unable to find builder 'test-builder' locally or remotely\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist locally\", func() {\n\t\t\tit(\"shows null for local builder, and normal output for remote\", func() {\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonWriter.Print(logger, localRunImages, nil, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettyJSON, err := validPrettifiedJSONOutput(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(prettyJSON, `{\"local_info\": null}`)\n\t\t\t\tassert.ContainsJSON(prettyJSON, fmt.Sprintf(\"{%s}\", expectedRemoteInfo))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist remotely\", func() {\n\t\t\tit(\"shows null for remote builder, and normal output for local\", func() {\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonWriter.Print(logger, localRunImages, localInfo, nil, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettyJSON, err := 
validPrettifiedJSONOutput(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(prettyJSON, `{\"remote_info\": null}`)\n\t\t\t\tassert.ContainsJSON(prettyJSON, fmt.Sprintf(\"{%s}\", expectedLocalInfo))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"localErr is an error\", func() {\n\t\t\tit(\"returns the error, and doesn't write any json output\", func() {\n\t\t\t\texpectedErr := errors.New(\"failed to retrieve local info\")\n\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonWriter.Print(logger, localRunImages, localInfo, remoteInfo, expectedErr, nil, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'test-builder': failed to retrieve local info\")\n\n\t\t\t\tassert.Equal(outBuf.String(), \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"remoteErr is an error\", func() {\n\t\t\tit(\"returns the error, and doesn't write any json output\", func() {\n\t\t\t\texpectedErr := errors.New(\"failed to retrieve remote info\")\n\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, expectedErr, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'test-builder': failed to retrieve remote info\")\n\n\t\t\t\tassert.Equal(outBuf.String(), \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"logger is verbose\", func() {\n\t\t\tit(\"displays mixins associated with the stack\", func() {\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := jsonWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettifiedJSON, err := validPrettifiedJSONOutput(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(prettifiedJSON, fmt.Sprintf(\"{%s}\", 
expectedStackWithMixins))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no run images are specified\", func() {\n\t\t\tit(\"displays run images as empty list\", func() {\n\t\t\t\tlocalInfo.RunImages = []pubbldr.RunImageConfig{}\n\t\t\t\tremoteInfo.RunImages = []pubbldr.RunImageConfig{}\n\t\t\t\temptyLocalRunImages := []config.RunImage{}\n\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := jsonWriter.Print(logger, emptyLocalRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettifiedJSON, err := validPrettifiedJSONOutput(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(prettifiedJSON, `{\"run_images\": []}`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no buildpacks are specified\", func() {\n\t\t\tit(\"displays buildpacks as empty list\", func() {\n\t\t\t\tlocalInfo.Buildpacks = []dist.ModuleInfo{}\n\t\t\t\tremoteInfo.Buildpacks = []dist.ModuleInfo{}\n\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := jsonWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettifiedJSON, err := validPrettifiedJSONOutput(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(prettifiedJSON, `{\"buildpacks\": []}`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no detection order is specified\", func() {\n\t\t\tit(\"displays detection order as empty list\", func() {\n\t\t\t\tlocalInfo.Order = pubbldr.DetectionOrder{}\n\t\t\t\tremoteInfo.Order = pubbldr.DetectionOrder{}\n\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := jsonWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettifiedJSON, err := 
validPrettifiedJSONOutput(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(prettifiedJSON, `{\"detection_order\": []}`)\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc validPrettifiedJSONOutput(source bytes.Buffer) (string, error) {\n\terr := json.Unmarshal(source.Bytes(), &struct{}{})\n\tif err != nil {\n\t\treturn \"\", fmt.Errorf(\"failed to unmarshal to json: %w\", err)\n\t}\n\n\tvar prettifiedOutput bytes.Buffer\n\terr = json.Indent(&prettifiedOutput, source.Bytes(), \"\", \"  \")\n\tif err != nil {\n\t\treturn \"\", fmt.Errorf(\"failed to prettify source json: %w\", err)\n\t}\n\n\treturn prettifiedOutput.String(), nil\n}\n"
  },
  {
    "path": "internal/builder/writer/shared_builder_test.go",
    "content": "package writer_test\n\nimport (\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\nvar (\n\ttestTopNestedBuildpack = dist.ModuleInfo{\n\t\tID:      \"test.top.nested\",\n\t\tVersion: \"test.top.nested.version\",\n\t}\n\ttestNestedBuildpack = dist.ModuleInfo{\n\t\tID:       \"test.nested\",\n\t\tHomepage: \"http://geocities.com/top-bp\",\n\t}\n\ttestBuildpackOne = dist.ModuleInfo{\n\t\tID:       \"test.bp.one\",\n\t\tVersion:  \"test.bp.one.version\",\n\t\tHomepage: \"http://geocities.com/cool-bp\",\n\t}\n\ttestBuildpackTwo = dist.ModuleInfo{\n\t\tID:      \"test.bp.two\",\n\t\tVersion: \"test.bp.two.version\",\n\t}\n\ttestBuildpackThree = dist.ModuleInfo{\n\t\tID:      \"test.bp.three\",\n\t\tVersion: \"test.bp.three.version\",\n\t}\n\ttestNestedBuildpackTwo = dist.ModuleInfo{\n\t\tID:      \"test.nested.two\",\n\t\tVersion: \"test.nested.two.version\",\n\t}\n\n\tbuildpacks = []dist.ModuleInfo{\n\t\ttestTopNestedBuildpack,\n\t\ttestNestedBuildpack,\n\t\ttestBuildpackOne,\n\t\ttestBuildpackTwo,\n\t\ttestBuildpackThree,\n\t}\n\n\torder = pubbldr.DetectionOrder{\n\t\tpubbldr.DetectionOrderEntry{\n\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\tpubbldr.DetectionOrderEntry{\n\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\tModuleInfo: testTopNestedBuildpack,\n\t\t\t\t\t},\n\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\tpubbldr.DetectionOrderEntry{\n\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testNestedBuildpack},\n\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\tpubbldr.DetectionOrderEntry{\n\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: testBuildpackOne,\n\t\t\t\t\t\t\t\t\t\tOptional:   
true,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tpubbldr.DetectionOrderEntry{\n\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\tModuleInfo: testBuildpackThree,\n\t\t\t\t\t\t\t\tOptional:   true,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tpubbldr.DetectionOrderEntry{\n\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{ModuleInfo: testNestedBuildpackTwo},\n\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\tpubbldr.DetectionOrderEntry{\n\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: testBuildpackOne,\n\t\t\t\t\t\t\t\t\t\tOptional:   true,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tCyclical: true,\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tpubbldr.DetectionOrderEntry{\n\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\tModuleInfo: testBuildpackTwo,\n\t\t\t\t\t\tOptional:   true,\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t},\n\t\tpubbldr.DetectionOrderEntry{\n\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\tModuleInfo: testBuildpackThree,\n\t\t\t},\n\t\t},\n\t}\n\n\textensions = []dist.ModuleInfo{\n\t\ttestBuildpackOne,\n\t\ttestBuildpackTwo,\n\t\ttestBuildpackThree,\n\t}\n\n\torderExtensions = pubbldr.DetectionOrder{\n\t\tpubbldr.DetectionOrderEntry{\n\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\tModuleInfo: testTopNestedBuildpack,\n\t\t\t},\n\t\t},\n\t\tpubbldr.DetectionOrderEntry{\n\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\tModuleInfo: testBuildpackOne,\n\t\t\t\tOptional:   true,\n\t\t\t},\n\t\t},\n\t\tpubbldr.DetectionOrderEntry{\n\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\tModuleInfo: testBuildpackTwo,\n\t\t\t\tOptional:   true,\n\t\t\t},\n\t\t},\n\t\tpubbldr.DetectionOrderEntry{\n\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\tModuleInfo: testBuildpackThree,\n\t\t\t},\n\t\t},\n\t}\n\n\tsharedBuilderInfo = writer.SharedBuilderInfo{\n\t\tName:      \"test-builder\",\n\t\tTrusted:   false,\n\t\tIsDefault: false,\n\t}\n\n\tlocalRunImages = 
[]config.RunImage{\n\t\t{Image: \"some/run-image\", Mirrors: []string{\"first/local\", \"second/local\"}},\n\t}\n)\n"
  },
  {
    "path": "internal/builder/writer/structured_format.go",
    "content": "package writer\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype InspectOutput struct {\n\tSharedBuilderInfo\n\tRemoteInfo *BuilderInfo `json:\"remote_info\" yaml:\"remote_info\" toml:\"remote_info\"`\n\tLocalInfo  *BuilderInfo `json:\"local_info\" yaml:\"local_info\" toml:\"local_info\"`\n}\n\ntype RunImage struct {\n\tName           string `json:\"name\" yaml:\"name\" toml:\"name\"`\n\tUserConfigured bool   `json:\"user_configured,omitempty\" yaml:\"user_configured,omitempty\" toml:\"user_configured,omitempty\"`\n}\n\ntype Lifecycle struct {\n\tbuilder.LifecycleInfo `yaml:\"lifecycleinfo,inline\"`\n\tBuildpackAPIs         builder.APIVersions `json:\"buildpack_apis\" yaml:\"buildpack_apis\" toml:\"buildpack_apis\"`\n\tPlatformAPIs          builder.APIVersions `json:\"platform_apis\" yaml:\"platform_apis\" toml:\"platform_apis\"`\n}\n\ntype Stack struct {\n\tID     string   `json:\"id\" yaml:\"id\" toml:\"id\"`\n\tMixins []string `json:\"mixins,omitempty\" yaml:\"mixins,omitempty\" toml:\"mixins,omitempty\"`\n}\n\ntype BuilderInfo struct {\n\tDescription            string                  `json:\"description,omitempty\" yaml:\"description,omitempty\" toml:\"description,omitempty\"`\n\tCreatedBy              builder.CreatorMetadata `json:\"created_by\" yaml:\"created_by\" toml:\"created_by\"`\n\tStack                  *Stack                  `json:\"stack,omitempty\" yaml:\"stack,omitempty\" toml:\"stack,omitempty\"`\n\tLifecycle              Lifecycle               `json:\"lifecycle\" yaml:\"lifecycle\" toml:\"lifecycle\"`\n\tRunImages              []RunImage              `json:\"run_images\" yaml:\"run_images\" 
toml:\"run_images\"`\n\tBuildpacks             []dist.ModuleInfo       `json:\"buildpacks\" yaml:\"buildpacks\" toml:\"buildpacks\"`\n\tpubbldr.DetectionOrder `json:\"detection_order\" yaml:\"detection_order\" toml:\"detection_order\"`\n\tExtensions             []dist.ModuleInfo      `json:\"extensions,omitempty\" yaml:\"extensions,omitempty\" toml:\"extensions,omitempty\"`\n\tOrderExtensions        pubbldr.DetectionOrder `json:\"order_extensions,omitempty\" yaml:\"order_extensions,omitempty\" toml:\"order_extensions,omitempty\"`\n}\n\ntype StructuredFormat struct {\n\tMarshalFunc func(interface{}) ([]byte, error)\n}\n\nfunc (w *StructuredFormat) Print(\n\tlogger logging.Logger,\n\tlocalRunImages []config.RunImage,\n\tlocal, remote *client.BuilderInfo,\n\tlocalErr, remoteErr error,\n\tbuilderInfo SharedBuilderInfo,\n) error {\n\tif localErr != nil {\n\t\treturn fmt.Errorf(\"preparing output for %s: %w\", style.Symbol(builderInfo.Name), localErr)\n\t}\n\n\tif remoteErr != nil {\n\t\treturn fmt.Errorf(\"preparing output for %s: %w\", style.Symbol(builderInfo.Name), remoteErr)\n\t}\n\n\toutputInfo := InspectOutput{SharedBuilderInfo: builderInfo}\n\n\tif local != nil {\n\t\tvar stack *Stack\n\t\tif local.Stack != \"\" {\n\t\t\tstack = &Stack{ID: local.Stack}\n\t\t}\n\n\t\t// Only attach mixins when a stack exists; guards against a nil-pointer dereference.\n\t\tif stack != nil && logger.IsVerbose() {\n\t\t\tstack.Mixins = local.Mixins\n\t\t}\n\n\t\toutputInfo.LocalInfo = &BuilderInfo{\n\t\t\tDescription: local.Description,\n\t\t\tCreatedBy:   local.CreatedBy,\n\t\t\tStack:       stack,\n\t\t\tLifecycle: Lifecycle{\n\t\t\t\tLifecycleInfo: local.Lifecycle.Info,\n\t\t\t\tBuildpackAPIs: local.Lifecycle.APIs.Buildpack,\n\t\t\t\tPlatformAPIs:  local.Lifecycle.APIs.Platform,\n\t\t\t},\n\t\t\tRunImages:       runImages(local.RunImages, localRunImages),\n\t\t\tBuildpacks:      local.Buildpacks,\n\t\t\tDetectionOrder:  local.Order,\n\t\t\tExtensions:      local.Extensions,\n\t\t\tOrderExtensions: local.OrderExtensions,\n\t\t}\n\t}\n\n\tif remote != nil {\n\t\tvar stack *Stack\n\t\tif remote.Stack != \"\" {\n\t\t\tstack = &Stack{ID: remote.Stack}\n\t\t}\n\n\t\t// Only attach mixins when a stack exists; guards against a nil-pointer dereference.\n\t\tif stack != nil && logger.IsVerbose() {\n\t\t\tstack.Mixins = remote.Mixins\n\t\t}\n\n\t\toutputInfo.RemoteInfo = &BuilderInfo{\n\t\t\tDescription: remote.Description,\n\t\t\tCreatedBy:   remote.CreatedBy,\n\t\t\tStack:       stack,\n\t\t\tLifecycle: Lifecycle{\n\t\t\t\tLifecycleInfo: remote.Lifecycle.Info,\n\t\t\t\tBuildpackAPIs: remote.Lifecycle.APIs.Buildpack,\n\t\t\t\tPlatformAPIs:  remote.Lifecycle.APIs.Platform,\n\t\t\t},\n\t\t\tRunImages:       runImages(remote.RunImages, localRunImages),\n\t\t\tBuildpacks:      remote.Buildpacks,\n\t\t\tDetectionOrder:  remote.Order,\n\t\t\tExtensions:      remote.Extensions,\n\t\t\tOrderExtensions: remote.OrderExtensions,\n\t\t}\n\t}\n\n\tif outputInfo.LocalInfo == nil && outputInfo.RemoteInfo == nil {\n\t\treturn fmt.Errorf(\"unable to find builder %s locally or remotely\", style.Symbol(builderInfo.Name))\n\t}\n\n\tvar (\n\t\toutput []byte\n\t\terr    error\n\t)\n\tif output, err = w.MarshalFunc(outputInfo); err != nil {\n\t\treturn fmt.Errorf(\"untested, unexpected failure while marshaling: %w\", err)\n\t}\n\n\tlogger.Info(string(output))\n\n\treturn nil\n}\n\nfunc runImages(runImages []pubbldr.RunImageConfig, localRunImages []config.RunImage) []RunImage {\n\timages := []RunImage{}\n\n\tfor _, i := range localRunImages {\n\t\tfor _, runImage := range runImages {\n\t\t\tif i.Image == runImage.Image {\n\t\t\t\tfor _, m := range i.Mirrors {\n\t\t\t\t\timages = append(images, RunImage{Name: m, UserConfigured: true})\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tfor _, runImage := range runImages {\n\t\timages = append(images, RunImage{Name: runImage.Image})\n\t\tfor _, m := range runImage.Mirrors {\n\t\t\timages = append(images, RunImage{Name: m})\n\t\t}\n\t}\n\n\treturn images\n}\n"
  },
  {
    "path": "internal/builder/writer/toml.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\n\t\"github.com/pelletier/go-toml\"\n)\n\ntype TOML struct {\n\tStructuredFormat\n}\n\nfunc NewTOML() BuilderWriter {\n\treturn &TOML{\n\t\tStructuredFormat: StructuredFormat{\n\t\t\tMarshalFunc: func(v interface{}) ([]byte, error) {\n\t\t\t\tbuf := bytes.NewBuffer(nil)\n\t\t\t\terr := toml.NewEncoder(buf).Order(toml.OrderPreserve).PromoteAnonymous(false).Encode(v)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\t\t\t\treturn buf.Bytes(), nil\n\t\t\t},\n\t\t},\n\t}\n}\n"
  },
  {
    "path": "internal/builder/writer/toml_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/pelletier/go-toml\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestTOML(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder Writer\", testTOML, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testTOML(t *testing.T, when spec.G, it spec.S) {\n\tconst (\n\t\texpectedRemoteRunImages = `  [[remote_info.run_images]]\n    name = \"first/local\"\n    user_configured = true\n\n  [[remote_info.run_images]]\n    name = \"second/local\"\n    user_configured = true\n\n  [[remote_info.run_images]]\n    name = \"some/run-image\"\n\n  [[remote_info.run_images]]\n    name = \"first/default\"\n\n  [[remote_info.run_images]]\n    name = \"second/default\"`\n\n\t\texpectedLocalRunImages = `  [[local_info.run_images]]\n    name = \"first/local\"\n    user_configured = true\n\n  [[local_info.run_images]]\n    name = \"second/local\"\n    user_configured = true\n\n  [[local_info.run_images]]\n    name = \"some/run-image\"\n\n  [[local_info.run_images]]\n    name = \"first/local-default\"\n\n  [[local_info.run_images]]\n    name = \"second/local-default\"`\n\n\t\texpectedLocalBuildpacks = `  [[local_info.buildpacks]]\n    id = \"test.top.nested\"\n    version = \"test.top.nested.version\"\n\n  [[local_info.buildpacks]]\n    id = \"test.nested\"\n    homepage = 
\"http://geocities.com/top-bp\"\n\n  [[local_info.buildpacks]]\n    id = \"test.bp.one\"\n    version = \"test.bp.one.version\"\n    homepage = \"http://geocities.com/cool-bp\"\n\n  [[local_info.buildpacks]]\n    id = \"test.bp.two\"\n    version = \"test.bp.two.version\"\n\n  [[local_info.buildpacks]]\n    id = \"test.bp.three\"\n    version = \"test.bp.three.version\"`\n\n\t\texpectedRemoteBuildpacks = `  [[remote_info.buildpacks]]\n    id = \"test.top.nested\"\n    version = \"test.top.nested.version\"\n\n  [[remote_info.buildpacks]]\n    id = \"test.nested\"\n    homepage = \"http://geocities.com/top-bp\"\n\n  [[remote_info.buildpacks]]\n    id = \"test.bp.one\"\n    version = \"test.bp.one.version\"\n    homepage = \"http://geocities.com/cool-bp\"\n\n  [[remote_info.buildpacks]]\n    id = \"test.bp.two\"\n    version = \"test.bp.two.version\"\n\n  [[remote_info.buildpacks]]\n    id = \"test.bp.three\"\n    version = \"test.bp.three.version\"`\n\n\t\texpectedLocalDetectionOrder = `  [[local_info.detection_order]]\n\n    [[local_info.detection_order.buildpacks]]\n      id = \"test.top.nested\"\n      version = \"test.top.nested.version\"\n\n      [[local_info.detection_order.buildpacks.buildpacks]]\n        id = \"test.nested\"\n        homepage = \"http://geocities.com/top-bp\"\n\n        [[local_info.detection_order.buildpacks.buildpacks.buildpacks]]\n          id = \"test.bp.one\"\n          version = \"test.bp.one.version\"\n          homepage = \"http://geocities.com/cool-bp\"\n          optional = true\n\n      [[local_info.detection_order.buildpacks.buildpacks]]\n        id = \"test.bp.three\"\n        version = \"test.bp.three.version\"\n        optional = true\n\n      [[local_info.detection_order.buildpacks.buildpacks]]\n        id = \"test.nested.two\"\n        version = \"test.nested.two.version\"\n\n        [[local_info.detection_order.buildpacks.buildpacks.buildpacks]]\n          id = \"test.bp.one\"\n          version = \"test.bp.one.version\"\n   
       homepage = \"http://geocities.com/cool-bp\"\n          optional = true\n          cyclic = true\n\n    [[local_info.detection_order.buildpacks]]\n      id = \"test.bp.two\"\n      version = \"test.bp.two.version\"\n      optional = true\n\n  [[local_info.detection_order]]\n    id = \"test.bp.three\"\n    version = \"test.bp.three.version\"`\n\n\t\texpectedRemoteDetectionOrder = `  [[remote_info.detection_order]]\n\n    [[remote_info.detection_order.buildpacks]]\n      id = \"test.top.nested\"\n      version = \"test.top.nested.version\"\n\n      [[remote_info.detection_order.buildpacks.buildpacks]]\n        id = \"test.nested\"\n        homepage = \"http://geocities.com/top-bp\"\n\n        [[remote_info.detection_order.buildpacks.buildpacks.buildpacks]]\n          id = \"test.bp.one\"\n          version = \"test.bp.one.version\"\n          homepage = \"http://geocities.com/cool-bp\"\n          optional = true\n\n      [[remote_info.detection_order.buildpacks.buildpacks]]\n        id = \"test.bp.three\"\n        version = \"test.bp.three.version\"\n        optional = true\n\n      [[remote_info.detection_order.buildpacks.buildpacks]]\n        id = \"test.nested.two\"\n        version = \"test.nested.two.version\"\n\n        [[remote_info.detection_order.buildpacks.buildpacks.buildpacks]]\n          id = \"test.bp.one\"\n          version = \"test.bp.one.version\"\n          homepage = \"http://geocities.com/cool-bp\"\n          optional = true\n          cyclic = true\n\n    [[remote_info.detection_order.buildpacks]]\n      id = \"test.bp.two\"\n      version = \"test.bp.two.version\"\n      optional = true\n\n  [[remote_info.detection_order]]\n    id = \"test.bp.three\"\n    version = \"test.bp.three.version\"`\n\n\t\tstackWithMixins = `  [stack]\n    id = \"test.stack.id\"\n    mixins = [\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"]`\n\t)\n\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo 
*client.BuilderInfo\n\t\tlocalInfo  *client.BuilderInfo\n\n\t\texpectedRemoteInfo = fmt.Sprintf(`[remote_info]\n  description = \"Some remote description\"\n\n  [remote_info.created_by]\n    Name = \"Pack CLI\"\n    Version = \"1.2.3\"\n\n  [remote_info.stack]\n    id = \"test.stack.id\"\n\n  [remote_info.lifecycle]\n    version = \"6.7.8\"\n\n    [remote_info.lifecycle.buildpack_apis]\n      deprecated = []\n      supported = [\"1.2\", \"2.3\"]\n\n    [remote_info.lifecycle.platform_apis]\n      deprecated = [\"0.1\", \"1.2\"]\n      supported = [\"4.5\"]\n\n%s\n\n%s\n\n%s`, expectedRemoteRunImages, expectedRemoteBuildpacks, expectedRemoteDetectionOrder)\n\n\t\texpectedLocalInfo = fmt.Sprintf(`[local_info]\n  description = \"Some local description\"\n\n  [local_info.created_by]\n    Name = \"Pack CLI\"\n    Version = \"4.5.6\"\n\n  [local_info.stack]\n    id = \"test.stack.id\"\n\n  [local_info.lifecycle]\n    version = \"4.5.6\"\n\n    [local_info.lifecycle.buildpack_apis]\n      deprecated = [\"4.5\", \"6.7\"]\n      supported = [\"8.9\", \"10.11\"]\n\n    [local_info.lifecycle.platform_apis]\n      deprecated = []\n      supported = [\"7.8\"]\n\n%s\n\n%s\n\n%s`, expectedLocalRunImages, expectedLocalBuildpacks, expectedLocalDetectionOrder)\n\n\t\texpectedPrettifiedTOML = fmt.Sprintf(`builder_name = \"test-builder\"\ntrusted = false\ndefault = false\n\n%s\n\n%s`, expectedRemoteInfo, expectedLocalInfo)\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\tremoteInfo = &client.BuilderInfo{\n\t\t\t\tDescription:     \"Some remote description\",\n\t\t\t\tStack:           \"test.stack.id\",\n\t\t\t\tMixins:          []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"},\n\t\t\t\tRunImages:       []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"first/default\", \"second/default\"}}},\n\t\t\t\tBuildpacks:      buildpacks,\n\t\t\t\tOrder:           order,\n\t\t\t\tBuildpackLayers: dist.ModuleLayers{},\n\t\t\t\tLifecycle: 
builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\tVersion: *semver.MustParse(\"6.7.8\"),\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: nil,\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"1.2\"), api.MustParse(\"2.3\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"0.1\"), api.MustParse(\"1.2\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"4.5\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\tName:    \"Pack CLI\",\n\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tlocalInfo = &client.BuilderInfo{\n\t\t\t\tDescription:     \"Some local description\",\n\t\t\t\tStack:           \"test.stack.id\",\n\t\t\t\tMixins:          []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"},\n\t\t\t\tRunImages:       []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"first/local-default\", \"second/local-default\"}}},\n\t\t\t\tBuildpacks:      buildpacks,\n\t\t\t\tOrder:           order,\n\t\t\t\tBuildpackLayers: dist.ModuleLayers{},\n\t\t\t\tLifecycle: builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\tVersion: *semver.MustParse(\"4.5.6\"),\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"4.5\"), api.MustParse(\"6.7\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"8.9\"), api.MustParse(\"10.11\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: nil,\n\t\t\t\t\t\t\tSupported:  
builder.APISet{api.MustParse(\"7.8\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\tName:    \"Pack CLI\",\n\t\t\t\t\tVersion: \"4.5.6\",\n\t\t\t\t},\n\t\t\t}\n\t\t})\n\n\t\tit(\"prints both local and remote builders as valid TOML\", func() {\n\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\terr := tomlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Succeeds(validTOMLOutput(outBuf))\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.ContainsTOML(outBuf.String(), expectedPrettifiedTOML)\n\t\t})\n\n\t\twhen(\"builder doesn't exist locally or remotely\", func() {\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, localRunImages, nil, nil, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"unable to find builder 'test-builder' locally or remotely\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist locally\", func() {\n\t\t\tit(\"shows null for local builder, and normal output for remote\", func() {\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, localRunImages, nil, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Succeeds(validTOMLOutput(outBuf))\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"local_info\")\n\t\t\t\tassert.ContainsTOML(outBuf.String(), expectedRemoteInfo)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist remotely\", func() {\n\t\t\tit(\"shows null for remote builder, and normal output for local\", func() {\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, 
localRunImages, localInfo, nil, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Succeeds(validTOMLOutput(outBuf))\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"remote_info\")\n\t\t\t\tassert.ContainsTOML(outBuf.String(), expectedLocalInfo)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"localErr is an error\", func() {\n\t\t\tit(\"returns the error, and doesn't write any toml output\", func() {\n\t\t\t\texpectedErr := errors.New(\"failed to retrieve local info\")\n\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, localRunImages, localInfo, remoteInfo, expectedErr, nil, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'test-builder': failed to retrieve local info\")\n\n\t\t\t\tassert.Equal(outBuf.String(), \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"remoteErr is an error\", func() {\n\t\t\tit(\"returns the error, and doesn't write any toml output\", func() {\n\t\t\t\texpectedErr := errors.New(\"failed to retrieve remote info\")\n\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, expectedErr, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'test-builder': failed to retrieve remote info\")\n\n\t\t\t\tassert.Equal(outBuf.String(), \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"logger is verbose\", func() {\n\t\t\tit(\"displays mixins associated with the stack\", func() {\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := tomlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Succeeds(validTOMLOutput(outBuf))\n\t\t\t\tassert.ContainsTOML(outBuf.String(), 
stackWithMixins)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no run images are specified\", func() {\n\t\t\tit(\"omits run images from output\", func() {\n\t\t\t\tlocalInfo.RunImages = []pubbldr.RunImageConfig{}\n\t\t\t\tremoteInfo.RunImages = []pubbldr.RunImageConfig{}\n\t\t\t\temptyLocalRunImages := []config.RunImage{}\n\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := tomlWriter.Print(logger, emptyLocalRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Succeeds(validTOMLOutput(outBuf))\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"run_images\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no buildpacks are specified\", func() {\n\t\t\tit(\"omits buildpacks from output\", func() {\n\t\t\t\tlocalInfo.Buildpacks = []dist.ModuleInfo{}\n\t\t\t\tremoteInfo.Buildpacks = []dist.ModuleInfo{}\n\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := tomlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Succeeds(validTOMLOutput(outBuf))\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"local_info.buildpacks\")\n\t\t\t\tassert.NotContains(outBuf.String(), \"remote_info.buildpacks\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no detection order is specified\", func() {\n\t\t\tit(\"omits detection order from output\", func() {\n\t\t\t\tlocalInfo.Order = pubbldr.DetectionOrder{}\n\t\t\t\tremoteInfo.Order = pubbldr.DetectionOrder{}\n\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := tomlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Succeeds(validTOMLOutput(outBuf))\n\t\t\t\tassert.NotContains(outBuf.String(), 
\"detection_order\")\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc validTOMLOutput(source bytes.Buffer) error {\n\terr := toml.NewDecoder(&source).Decode(&struct{}{})\n\tif err != nil {\n\t\treturn fmt.Errorf(\"failed to unmarshal to toml: %w\", err)\n\t}\n\treturn nil\n}\n"
  },
  {
    "path": "internal/builder/writer/yaml.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\n\t\"gopkg.in/yaml.v3\"\n)\n\ntype YAML struct {\n\tStructuredFormat\n}\n\nfunc NewYAML() BuilderWriter {\n\treturn &YAML{\n\t\tStructuredFormat: StructuredFormat{\n\t\t\tMarshalFunc: func(v interface{}) ([]byte, error) {\n\t\t\t\tbuf := bytes.NewBuffer(nil)\n\t\t\t\tif err := yaml.NewEncoder(buf).Encode(v); err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\t\t\t\treturn buf.Bytes(), nil\n\t\t\t},\n\t\t},\n\t}\n}\n"
  },
  {
    "path": "internal/builder/writer/yaml_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\tyaml \"gopkg.in/yaml.v3\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestYAML(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder Writer\", testYAML, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testYAML(t *testing.T, when spec.G, it spec.S) {\n\tconst (\n\t\texpectedRemoteRunImages = `    run_images:\n        - name: first/local\n          user_configured: true\n        - name: second/local\n          user_configured: true\n        - name: some/run-image\n        - name: first/default\n        - name: second/default`\n\n\t\texpectedLocalRunImages = `    run_images:\n        - name: first/local\n          user_configured: true\n        - name: second/local\n          user_configured: true\n        - name: some/run-image\n        - name: first/local-default\n        - name: second/local-default`\n\n\t\texpectedBuildpacks = `    buildpacks:\n        - id: test.top.nested\n          version: test.top.nested.version\n        - id: test.nested\n          homepage: http://geocities.com/top-bp\n        - id: test.bp.one\n          version: test.bp.one.version\n          homepage: http://geocities.com/cool-bp\n        - id: test.bp.two\n          version: test.bp.two.version\n        - id: test.bp.three\n          version: test.bp.three.version`\n\n\t\texpectedExtensions = `    
extensions:\n        - id: test.bp.one\n          version: test.bp.one.version\n          homepage: http://geocities.com/cool-bp\n        - id: test.bp.two\n          version: test.bp.two.version\n        - id: test.bp.three\n          version: test.bp.three.version`\n\n\t\texpectedDetectionOrder = `    detection_order:\n        - buildpacks:\n            - id: test.top.nested\n              version: test.top.nested.version\n              buildpacks:\n                - id: test.nested\n                  homepage: http://geocities.com/top-bp\n                  buildpacks:\n                    - id: test.bp.one\n                      version: test.bp.one.version\n                      homepage: http://geocities.com/cool-bp\n                      optional: true\n                - id: test.bp.three\n                  version: test.bp.three.version\n                  optional: true\n                - id: test.nested.two\n                  version: test.nested.two.version\n                  buildpacks:\n                    - id: test.bp.one\n                      version: test.bp.one.version\n                      homepage: http://geocities.com/cool-bp\n                      optional: true\n                      cyclic: true\n            - id: test.bp.two\n              version: test.bp.two.version\n              optional: true\n        - id: test.bp.three\n          version: test.bp.three.version`\n\n\t\texpectedOrderExtensions = `    order_extensions:\n        - id: test.top.nested\n          version: test.top.nested.version\n        - id: test.bp.one\n          version: test.bp.one.version\n          homepage: http://geocities.com/cool-bp\n          optional: true\n        - id: test.bp.two\n          version: test.bp.two.version\n          optional: true\n        - id: test.bp.three\n          version: test.bp.three.version`\n\t\texpectedStackWithMixins = `    stack:\n        id: test.stack.id\n        mixins:\n            - mixin1\n            - mixin2\n            
- build:mixin3\n            - build:mixin4`\n\t)\n\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo *client.BuilderInfo\n\t\tlocalInfo  *client.BuilderInfo\n\n\t\texpectedRemoteInfo = fmt.Sprintf(`remote_info:\n    description: Some remote description\n    created_by:\n        name: Pack CLI\n        version: 1.2.3\n    stack:\n        id: test.stack.id\n    lifecycle:\n        version: 6.7.8\n        buildpack_apis:\n            deprecated: []\n            supported:\n                - \"1.2\"\n                - \"2.3\"\n        platform_apis:\n            deprecated:\n                - \"0.1\"\n                - \"1.2\"\n            supported:\n                - \"4.5\"\n%s\n%s\n%s\n%s\n%s`, expectedRemoteRunImages, expectedBuildpacks, expectedDetectionOrder, expectedExtensions, expectedOrderExtensions)\n\n\t\texpectedLocalInfo = fmt.Sprintf(`local_info:\n    description: Some local description\n    created_by:\n        name: Pack CLI\n        version: 4.5.6\n    stack:\n        id: test.stack.id\n    lifecycle:\n        version: 4.5.6\n        buildpack_apis:\n            deprecated:\n                - \"4.5\"\n                - \"6.7\"\n            supported:\n                - \"8.9\"\n                - \"10.11\"\n        platform_apis:\n            deprecated: []\n            supported:\n                - \"7.8\"\n%s\n%s\n%s\n%s\n%s`, expectedLocalRunImages, expectedBuildpacks, expectedDetectionOrder, expectedExtensions, expectedOrderExtensions)\n\n\t\texpectedPrettifiedYAML = fmt.Sprintf(`    builder_name: test-builder\n    trusted: false\n    default: false\n%s\n%s`, expectedRemoteInfo, expectedLocalInfo)\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\tremoteInfo = &client.BuilderInfo{\n\t\t\t\tDescription:     \"Some remote description\",\n\t\t\t\tStack:           \"test.stack.id\",\n\t\t\t\tMixins:          []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"},\n\t\t\t\tRunImages:  
     []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"first/default\", \"second/default\"}}},\n\t\t\t\tBuildpacks:      buildpacks,\n\t\t\t\tOrder:           order,\n\t\t\t\tExtensions:      extensions,\n\t\t\t\tOrderExtensions: orderExtensions,\n\t\t\t\tBuildpackLayers: dist.ModuleLayers{},\n\t\t\t\tLifecycle: builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\tVersion: *semver.MustParse(\"6.7.8\"),\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: nil,\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"1.2\"), api.MustParse(\"2.3\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"0.1\"), api.MustParse(\"1.2\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"4.5\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\tName:    \"Pack CLI\",\n\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tlocalInfo = &client.BuilderInfo{\n\t\t\t\tDescription:     \"Some local description\",\n\t\t\t\tStack:           \"test.stack.id\",\n\t\t\t\tMixins:          []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"},\n\t\t\t\tRunImages:       []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"first/local-default\", \"second/local-default\"}}},\n\t\t\t\tBuildpacks:      buildpacks,\n\t\t\t\tOrder:           order,\n\t\t\t\tExtensions:      extensions,\n\t\t\t\tOrderExtensions: orderExtensions,\n\t\t\t\tBuildpackLayers: dist.ModuleLayers{},\n\t\t\t\tLifecycle: builder.LifecycleDescriptor{\n\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\tVersion: *semver.MustParse(\"4.5.6\"),\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\tBuildpack: 
builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"4.5\"), api.MustParse(\"6.7\")},\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"8.9\"), api.MustParse(\"10.11\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\tDeprecated: nil,\n\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"7.8\")},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\tName:    \"Pack CLI\",\n\t\t\t\t\tVersion: \"4.5.6\",\n\t\t\t\t},\n\t\t\t}\n\t\t})\n\n\t\tit(\"prints both local and remote builders as valid YAML\", func() {\n\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\terr := yamlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\tassert.Nil(err)\n\n\t\t\tprettyYAML, err := validYAML(outBuf)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Contains(prettyYAML, expectedPrettifiedYAML)\n\t\t})\n\n\t\twhen(\"builder doesn't exist locally or remotely\", func() {\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, localRunImages, nil, nil, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"unable to find builder 'test-builder' locally or remotely\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist locally\", func() {\n\t\t\tit(\"shows null for local builder, and normal output for remote\", func() {\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, localRunImages, nil, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettyYAML, err := validYAML(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(prettyYAML, `local_info: null`)\n\t\t\t\tassert.ContainsYAML(prettyYAML, 
expectedRemoteInfo)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder doesn't exist remotely\", func() {\n\t\t\tit(\"shows null for remote builder, and normal output for local\", func() {\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, localRunImages, localInfo, nil, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettyYAML, err := validYAML(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(prettyYAML, `remote_info: null`)\n\t\t\t\tassert.ContainsYAML(prettyYAML, expectedLocalInfo)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"localErr is an error\", func() {\n\t\t\tit(\"returns the error, and doesn't write any yaml output\", func() {\n\t\t\t\texpectedErr := errors.New(\"failed to retrieve local info\")\n\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, localRunImages, localInfo, remoteInfo, expectedErr, nil, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'test-builder': failed to retrieve local info\")\n\n\t\t\t\tassert.Equal(outBuf.String(), \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"remoteErr is an error\", func() {\n\t\t\tit(\"returns the error, and doesn't write any yaml output\", func() {\n\t\t\t\texpectedErr := errors.New(\"failed to retrieve remote info\")\n\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, expectedErr, sharedBuilderInfo)\n\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'test-builder': failed to retrieve remote info\")\n\n\t\t\t\tassert.Equal(outBuf.String(), \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"logger is verbose\", func() {\n\t\t\tit(\"displays mixins associated with the stack\", func() {\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := 
logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := yamlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettifiedYAML, err := validYAML(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(prettifiedYAML, expectedStackWithMixins)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no run images are specified\", func() {\n\t\t\tit(\"displays run images as empty list\", func() {\n\t\t\t\tlocalInfo.RunImages = []pubbldr.RunImageConfig{}\n\t\t\t\tremoteInfo.RunImages = []pubbldr.RunImageConfig{}\n\t\t\t\temptyLocalRunImages := []config.RunImage{}\n\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := yamlWriter.Print(logger, emptyLocalRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettifiedYAML, err := validYAML(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(prettifiedYAML, `run_images: []`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no buildpacks are specified\", func() {\n\t\t\tit(\"displays buildpacks as empty list\", func() {\n\t\t\t\tlocalInfo.Buildpacks = []dist.ModuleInfo{}\n\t\t\t\tremoteInfo.Buildpacks = []dist.ModuleInfo{}\n\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := yamlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettifiedYAML, err := validYAML(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(prettifiedYAML, `buildpacks: []`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no detection order is specified\", func() {\n\t\t\tit(\"displays detection order as empty list\", func() {\n\t\t\t\tlocalInfo.Order = pubbldr.DetectionOrder{}\n\t\t\t\tremoteInfo.Order = pubbldr.DetectionOrder{}\n\n\t\t\t\tyamlWriter := 
writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t\terr := yamlWriter.Print(logger, localRunImages, localInfo, remoteInfo, nil, nil, sharedBuilderInfo)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tprettifiedYAML, err := validYAML(outBuf)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(prettifiedYAML, `detection_order: []`)\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc validYAML(source bytes.Buffer) (string, error) {\n\terr := yaml.Unmarshal(source.Bytes(), &struct{}{})\n\tif err != nil {\n\t\treturn \"\", fmt.Errorf(\"failed to unmarshal to yaml: %w\", err)\n\t}\n\n\treturn source.String(), nil\n}\n"
  },
  {
    "path": "internal/commands/add_registry.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use config registries add instead\nfunc AddBuildpackRegistry(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tvar (\n\t\tsetDefault   bool\n\t\tregistryType string\n\t)\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"add-registry <name> <url>\",\n\t\tArgs:    cobra.ExactArgs(2),\n\t\tHidden:  true,\n\t\tShort:   \"Add buildpack registry to your pack config file\",\n\t\tExample: \"pack add-registry my-registry https://github.com/buildpacks/my-registry\",\n\t\tLong: \"A Buildpack Registry is a (still experimental) place to publish, store, and discover buildpacks. \" +\n\t\t\t\"Users can add buildpacks registries using add-registry, and publish/yank buildpacks from it, as well as use those buildpacks when building applications.\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"add-registry\", \"config registries add\")\n\t\t\tnewRegistry := config.Registry{\n\t\t\t\tName: args[0],\n\t\t\t\tURL:  args[1],\n\t\t\t\tType: registryType,\n\t\t\t}\n\n\t\t\treturn addRegistryToConfig(logger, newRegistry, setDefault, cfg, cfgPath)\n\t\t}),\n\t}\n\tcmd.Flags().BoolVar(&setDefault, \"default\", false, \"Set this buildpack registry as the default\")\n\tcmd.Flags().StringVar(&registryType, \"type\", \"github\", \"Type of buildpack registry [git|github]\")\n\tAddHelpFlag(cmd, \"add-registry\")\n\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/add_registry_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestAddRegistry(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testAddRegistryCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testAddRegistryCommand(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"AddBuildpackRegistry\", func() {\n\t\tvar (\n\t\t\ttmpDir     string\n\t\t\tconfigFile string\n\t\t\toutBuf     bytes.Buffer\n\t\t\tcommand    *cobra.Command\n\t\t\tlogger     = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\tassert     = h.NewAssertionManager(t)\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack-home-*\")\n\t\t\tassert.Nil(err)\n\n\t\t\tconfigFile = filepath.Join(tmpDir, \"config.toml\")\n\t\t\tcommand = commands.AddBuildpackRegistry(logger, config.Config{}, configFile)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\tassert.Nil(os.RemoveAll(tmpDir))\n\t\t})\n\n\t\twhen(\"add buildpack registry\", func() {\n\t\t\tit(\"adds to registry list\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"bp\", \"https://github.com/buildpacks/registry-index/\"})\n\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\tcfg, err := config.Read(configFile)\n\t\t\t\tassert.Nil(err)\n\t\t\t\tassert.Equal(len(cfg.Registries), 1)\n\t\t\t\tassert.Equal(cfg.Registries[0].Name, \"bp\")\n\t\t\t\tassert.Equal(cfg.Registries[0].Type, \"github\")\n\t\t\t\tassert.Equal(cfg.Registries[0].URL, \"https://github.com/buildpacks/registry-index/\")\n\t\t\t\tassert.Equal(cfg.DefaultRegistryName, 
\"\")\n\t\t\t\tassert.Contains(outBuf.String(), \"been deprecated, please use 'pack config registries add' instead\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"default is true\", func() {\n\t\t\tit(\"sets newly added registry as the default\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"bp\", \"https://github.com/buildpacks/registry-index/\", \"--default\"})\n\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\tcfg, err := config.Read(configFile)\n\t\t\t\tassert.Nil(err)\n\t\t\t\tassert.Equal(len(cfg.Registries), 1)\n\t\t\t\tassert.Equal(cfg.Registries[0].Name, \"bp\")\n\t\t\t\tassert.Equal(cfg.DefaultRegistryName, \"bp\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"validation\", func() {\n\t\t\tit(\"fails with missing args\", func() {\n\t\t\t\tcommand.SetOut(io.Discard)\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.ErrorContains(err, \"accepts 2 arg\")\n\t\t\t})\n\n\t\t\tit(\"should validate type\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"bp\", \"https://github.com/buildpacks/registry-index/\", \"--type=bogus\"})\n\t\t\t\tassert.Error(command.Execute())\n\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, \"'bogus' is not a valid type. 
Supported types are: 'git', 'github'.\")\n\t\t\t})\n\n\t\t\tit(\"should throw error when registry already exists\", func() {\n\t\t\t\tcommand := commands.AddBuildpackRegistry(logger, config.Config{\n\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"bp\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index/\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"bp\", \"https://github.com/buildpacks/registry-index/\"})\n\t\t\t\tassert.Error(command.Execute())\n\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, \"Buildpack registry 'bp' already exists.\")\n\t\t\t})\n\n\t\t\tit(\"should throw error when registry name is official\", func() {\n\t\t\t\tcommand := commands.AddBuildpackRegistry(logger, config.Config{}, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"official\", \"https://github.com/buildpacks/registry-index/\", \"--type=github\"})\n\t\t\t\tassert.Error(command.Execute())\n\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, \"'official' is a reserved registry, please provide a different name\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/build.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"strconv\"\n\t\"strings\"\n\t\"time\"\n\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\tbldr \"github.com/buildpacks/pack/internal/builder\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/cache\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/project\"\n\tprojectTypes \"github.com/buildpacks/pack/pkg/project/types\"\n)\n\ntype BuildFlags struct {\n\tPublish                bool\n\tClearCache             bool\n\tDisableSystemBuilpacks bool\n\tTrustBuilder           bool\n\tTrustExtraBuildpacks   bool\n\tInteractive            bool\n\tSparse                 bool\n\tEnableUsernsHost       bool\n\tDockerHost             string\n\tCacheImage             string\n\tCache                  cache.CacheOpts\n\tAppPath                string\n\tBuilder                string\n\tExecutionEnv           string\n\tRegistry               string\n\tRunImage               string\n\tPlatform               string\n\tPolicy                 string\n\tNetwork                string\n\tDescriptorPath         string\n\tDefaultProcessType     string\n\tLifecycleImage         string\n\tEnv                    []string\n\tEnvFiles               []string\n\tBuildpacks             []string\n\tExtensions             []string\n\tVolumes                []string\n\tAdditionalTags         []string\n\tWorkspace              string\n\tGID                    int\n\tUID                    int\n\tPreviousImage          string\n\tSBOMDestinationDir     string\n\tReportDestinationDir   string\n\tDateTime               string\n\tPreBuildpacks          []string\n\tPostBuildpacks         []string\n\tInsecureRegistries     
[]string\n}\n\n// Build an image from source code\nfunc Build(logger logging.Logger, cfg config.Config, packClient PackClient) *cobra.Command {\n\tvar flags BuildFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"build <image-name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Generate app image from source code\",\n\t\tExample: \"pack build test_img --path apps/test-app --builder cnbs/sample-builder:bionic\",\n\t\tLong: \"Pack Build uses Cloud Native Buildpacks to create a runnable app image from source code.\\n\\nPack Build \" +\n\t\t\t\"requires an image name, which will be generated from the source code. Build defaults to the current directory, \" +\n\t\t\t\"but you can use `--path` to specify another source code directory. Build requires a `builder`, which can either \" +\n\t\t\t\"be provided directly to build using `--builder`, or can be set using the `set-default-builder` command. For more \" +\n\t\t\t\"on how to use `pack build`, see: https://buildpacks.io/docs/app-developer-guide/build-an-app/.\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tinputImageName := client.ParseInputImageReference(args[0])\n\t\t\tif err := validateBuildFlags(&flags, cfg, inputImageName, logger); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tinputPreviousImage := client.ParseInputImageReference(flags.PreviousImage)\n\n\t\t\tdescriptor, actualDescriptorPath, err := parseProjectToml(flags.AppPath, flags.DescriptorPath, logger)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tif actualDescriptorPath != \"\" {\n\t\t\t\tlogger.Debugf(\"Using project descriptor located at %s\", style.Symbol(actualDescriptorPath))\n\t\t\t}\n\n\t\t\tbuilder := flags.Builder\n\t\t\t// We only override the builder to the one in the project descriptor\n\t\t\t// if it was not explicitly set by the user\n\t\t\tif !cmd.Flags().Changed(\"builder\") && descriptor.Build.Builder != \"\" {\n\t\t\t\tbuilder = descriptor.Build.Builder\n\t\t\t}\n\n\t\t\tif 
builder == \"\" {\n\t\t\t\tsuggestSettingBuilder(logger, packClient)\n\t\t\t\treturn client.NewSoftError()\n\t\t\t}\n\n\t\t\tbuildpacks := flags.Buildpacks\n\t\t\textensions := flags.Extensions\n\n\t\t\tenv, err := parseEnv(flags.EnvFiles, flags.Env)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tisTrusted, err := bldr.IsTrustedBuilder(cfg, builder)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\ttrustBuilder := isTrusted || bldr.IsKnownTrustedBuilder(builder) || flags.TrustBuilder\n\t\t\tif trustBuilder {\n\t\t\t\tlogger.Debugf(\"Builder %s is trusted\", style.Symbol(builder))\n\t\t\t\tif flags.LifecycleImage != \"\" {\n\t\t\t\t\tlogger.Warn(\"Ignoring the provided lifecycle image as the builder is trusted, running the creator in a single container using the provided builder\")\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tlogger.Debugf(\"Builder %s is untrusted\", style.Symbol(builder))\n\t\t\t\tlogger.Debug(\"As a result, the phases of the lifecycle which require root access will be run in separate trusted ephemeral containers.\")\n\t\t\t\tlogger.Debug(\"For more information, see https://medium.com/buildpacks/faster-more-secure-builds-with-pack-0-11-0-4d0c633ca619\")\n\t\t\t}\n\n\t\t\tif !trustBuilder && len(flags.Volumes) > 0 {\n\t\t\t\tlogger.Warn(\"Using untrusted builder with volume mounts. 
If there is sensitive data in the volumes, this may present a security vulnerability.\")\n\t\t\t}\n\n\t\t\tstringPolicy := flags.Policy\n\t\t\tif stringPolicy == \"\" {\n\t\t\t\tstringPolicy = cfg.PullPolicy\n\t\t\t}\n\t\t\tpullPolicy, err := image.ParsePullPolicy(stringPolicy)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"parsing pull policy %s\", stringPolicy)\n\t\t\t}\n\n\t\t\tvar lifecycleImage string\n\t\t\tif flags.LifecycleImage != \"\" {\n\t\t\t\tref, err := name.ParseReference(flags.LifecycleImage)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"parsing lifecycle image %s\", flags.LifecycleImage)\n\t\t\t\t}\n\t\t\t\tlifecycleImage = ref.Name()\n\t\t\t}\n\n\t\t\terr = isForbiddenTag(cfg, inputImageName.Name(), lifecycleImage, builder)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"forbidden image name\")\n\t\t\t}\n\n\t\t\tvar gid = -1\n\t\t\tif cmd.Flags().Changed(\"gid\") {\n\t\t\t\tgid = flags.GID\n\t\t\t}\n\n\t\t\tvar uid = -1\n\t\t\tif cmd.Flags().Changed(\"uid\") {\n\t\t\t\tuid = flags.UID\n\t\t\t}\n\n\t\t\tdateTime, err := parseTime(flags.DateTime)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"parsing creation time %s\", flags.DateTime)\n\t\t\t}\n\t\t\tif err := packClient.Build(cmd.Context(), client.BuildOptions{\n\t\t\t\tAppPath:           flags.AppPath,\n\t\t\t\tBuilder:           builder,\n\t\t\t\tRegistry:          flags.Registry,\n\t\t\t\tAdditionalMirrors: getMirrors(cfg),\n\t\t\t\tAdditionalTags:    flags.AdditionalTags,\n\t\t\t\tRunImage:          flags.RunImage,\n\t\t\t\tEnv:               env,\n\t\t\t\tImage:             inputImageName.Name(),\n\t\t\t\tPublish:           flags.Publish,\n\t\t\t\tDockerHost:        flags.DockerHost,\n\t\t\t\tPlatform:          flags.Platform,\n\t\t\t\tPullPolicy:        pullPolicy,\n\t\t\t\tClearCache:        flags.ClearCache,\n\t\t\t\tTrustBuilder: func(string) bool {\n\t\t\t\t\treturn trustBuilder\n\t\t\t\t},\n\t\t\t\tTrustExtraBuildpacks: 
flags.TrustExtraBuildpacks,\n\t\t\t\tBuildpacks:           buildpacks,\n\t\t\t\tExtensions:           extensions,\n\t\t\t\tContainerConfig: client.ContainerConfig{\n\t\t\t\t\tNetwork: flags.Network,\n\t\t\t\t\tVolumes: flags.Volumes,\n\t\t\t\t},\n\t\t\t\tDefaultProcessType:       flags.DefaultProcessType,\n\t\t\t\tProjectDescriptorBaseDir: filepath.Dir(actualDescriptorPath),\n\t\t\t\tProjectDescriptor:        descriptor,\n\t\t\t\tCache:                    flags.Cache,\n\t\t\t\tCacheImage:               flags.CacheImage,\n\t\t\t\tWorkspace:                flags.Workspace,\n\t\t\t\tLifecycleImage:           lifecycleImage,\n\t\t\t\tGroupID:                  gid,\n\t\t\t\tUserID:                   uid,\n\t\t\t\tPreviousImage:            inputPreviousImage.Name(),\n\t\t\t\tInteractive:              flags.Interactive,\n\t\t\t\tSBOMDestinationDir:       flags.SBOMDestinationDir,\n\t\t\t\tReportDestinationDir:     flags.ReportDestinationDir,\n\t\t\t\tCreationTime:             dateTime,\n\t\t\t\tPreBuildpacks:            flags.PreBuildpacks,\n\t\t\t\tPostBuildpacks:           flags.PostBuildpacks,\n\t\t\t\tDisableSystemBuildpacks:  flags.DisableSystemBuilpacks,\n\t\t\t\tEnableUsernsHost:         flags.EnableUsernsHost,\n\t\t\t\tLayoutConfig: &client.LayoutConfig{\n\t\t\t\t\tSparse:             flags.Sparse,\n\t\t\t\t\tInputImage:         inputImageName,\n\t\t\t\t\tPreviousInputImage: inputPreviousImage,\n\t\t\t\t\tLayoutRepoDir:      cfg.LayoutRepositoryDir,\n\t\t\t\t},\n\t\t\t\tCNBExecutionEnv:    flags.ExecutionEnv,\n\t\t\t\tInsecureRegistries: flags.InsecureRegistries,\n\t\t\t}); err != nil {\n\t\t\t\treturn errors.Wrap(err, \"failed to build\")\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully built image %s\", style.Symbol(inputImageName.Name()))\n\t\t\treturn nil\n\t\t}),\n\t}\n\tbuildCommandFlags(cmd, &flags, cfg)\n\tAddHelpFlag(cmd, \"build\")\n\treturn cmd\n}\n\nfunc parseTime(providedTime string) (*time.Time, error) {\n\tvar parsedTime time.Time\n\tswitch providedTime 
{\n\tcase \"\":\n\t\treturn nil, nil\n\tcase \"now\":\n\t\tparsedTime = time.Now().UTC()\n\tdefault:\n\t\tintTime, err := strconv.ParseInt(providedTime, 10, 64)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrap(err, \"parsing unix timestamp\")\n\t\t}\n\t\tparsedTime = time.Unix(intTime, 0).UTC()\n\t}\n\treturn &parsedTime, nil\n}\n\nfunc buildCommandFlags(cmd *cobra.Command, buildFlags *BuildFlags, cfg config.Config) {\n\tcmd.Flags().StringVarP(&buildFlags.AppPath, \"path\", \"p\", \"\", \"Path to app dir or zip-formatted file (defaults to current working directory)\")\n\tcmd.Flags().StringSliceVarP(&buildFlags.Buildpacks, \"buildpack\", \"b\", nil, \"Buildpack to use. One of:\\n  a buildpack by id and version in the form of '<buildpack>@<version>',\\n  path to a buildpack directory (not supported on Windows),\\n  path/URL to a buildpack .tar or .tgz file, or\\n  a packaged buildpack image name in the form of '<hostname>/<repo>[:<tag>]'\"+stringSliceHelp(\"buildpack\"))\n\tcmd.Flags().StringSliceVarP(&buildFlags.Extensions, \"extension\", \"\", nil, \"Extension to use. 
One of:\\n  an extension by id and version in the form of '<extension>@<version>',\\n  path to an extension directory (not supported on Windows),\\n  path/URL to an extension .tar or .tgz file, or\\n  a packaged extension image name in the form of '<hostname>/<repo>[:<tag>]'\"+stringSliceHelp(\"extension\"))\n\tcmd.Flags().StringArrayVar(&buildFlags.InsecureRegistries, \"insecure-registry\", []string{}, \"List of insecure registries (only available for API >= 0.13)\")\n\tcmd.Flags().StringVarP(&buildFlags.Builder, \"builder\", \"B\", cfg.DefaultBuilder, \"Builder image\")\n\tcmd.Flags().Var(&buildFlags.Cache, \"cache\",\n\t\t`Cache options used to define cache techniques for the build process.\n- Cache as bind: 'type=<build/launch>;format=bind;source=<path to directory>'\n- Cache as image (requires --publish): 'type=<build/launch>;format=image;name=<registry image name>'\n- Cache as volume: 'type=<build/launch>;format=volume;[name=<volume name>]'\n    - If no name is provided, a random name will be generated.\n`)\n\tcmd.Flags().StringVar(&buildFlags.CacheImage, \"cache-image\", \"\", `Cache build layers in a remote registry. Requires --publish`)\n\tcmd.Flags().BoolVar(&buildFlags.ClearCache, \"clear-cache\", false, \"Clear image's associated cache before building\")\n\tcmd.Flags().StringVar(&buildFlags.DateTime, \"creation-time\", \"\", \"Desired creation time in the output image config. Accepted values are Unix timestamps (e.g., '1641013200'), or 'now'. Platform API version must be at least 0.9 to use this feature.\")\n\tcmd.Flags().StringVarP(&buildFlags.DescriptorPath, \"descriptor\", \"d\", \"\", \"Path to the project descriptor file\")\n\tcmd.Flags().StringVarP(&buildFlags.DefaultProcessType, \"default-process\", \"D\", \"\", `Set the default process type. 
(default \"web\")`)\n\tcmd.Flags().BoolVar(&buildFlags.DisableSystemBuilpacks, \"disable-system-buildpacks\", false, \"Disable System Buildpacks\")\n\tcmd.Flags().StringArrayVarP(&buildFlags.Env, \"env\", \"e\", []string{}, \"Build-time environment variable, in the form 'VAR=VALUE' or 'VAR'.\\nWhen using the latter value-less form, the value will be taken from the current\\n  environment at the time this command is executed.\\nThis flag may be specified multiple times and will override\\n  individual values defined by --env-file.\"+stringArrayHelp(\"env\")+\"\\nNOTE: These are NOT available at image runtime.\")\n\tcmd.Flags().StringArrayVar(&buildFlags.EnvFiles, \"env-file\", []string{}, \"Build-time environment variables file\\nOne variable per line, of the form 'VAR=VALUE' or 'VAR'\\nWhen using the latter value-less form, the value will be taken from the current\\n  environment at the time this command is executed\\nNOTE: These are NOT available at image runtime.\")\n\tcmd.Flags().StringVar(&buildFlags.Network, \"network\", \"\", \"Connect detect and build containers to network\")\n\tcmd.Flags().StringArrayVar(&buildFlags.PreBuildpacks, \"pre-buildpack\", []string{}, \"Buildpacks to prepend to the groups in the builder's order\")\n\tcmd.Flags().StringArrayVar(&buildFlags.PostBuildpacks, \"post-buildpack\", []string{}, \"Buildpacks to append to the groups in the builder's order\")\n\tcmd.Flags().BoolVar(&buildFlags.Publish, \"publish\", false, \"Publish the application image directly to the container registry specified in <image-name>, instead of the daemon. 
The run image must also reside in the registry.\")\n\tcmd.Flags().StringVar(&buildFlags.DockerHost, \"docker-host\", \"\",\n\t\t`Address to docker daemon that will be exposed to the build container.\nIf not set (or set to empty string) the standard socket location will be used.\nSpecial value 'inherit' may be used in which case DOCKER_HOST environment variable will be used.\nThis option may set DOCKER_HOST environment variable for the build container if needed.\n`)\n\tcmd.Flags().StringVar(&buildFlags.LifecycleImage, \"lifecycle-image\", cfg.LifecycleImage, `Custom lifecycle image to use for analysis, restore, and export when builder is untrusted.`)\n\tcmd.Flags().StringVar(&buildFlags.Platform, \"platform\", \"\", `Platform to build on (e.g., \"linux/amd64\").`)\n\tcmd.Flags().StringVar(&buildFlags.Policy, \"pull-policy\", \"\", `Pull policy to use. Accepted values are always, never, and if-not-present. (default \"always\")`)\n\tcmd.Flags().StringVar(&buildFlags.ExecutionEnv, \"exec-env\", \"production\", `Execution environment to use. 
(default \"production\")`)\n\tcmd.Flags().StringVarP(&buildFlags.Registry, \"buildpack-registry\", \"r\", cfg.DefaultRegistryName, \"Buildpack Registry by name\")\n\tcmd.Flags().StringVar(&buildFlags.RunImage, \"run-image\", \"\", \"Run image (defaults to default stack's run image)\")\n\tcmd.Flags().StringSliceVarP(&buildFlags.AdditionalTags, \"tag\", \"t\", nil, \"Additional tags to push the output image to.\\nTags should be in the format 'image:tag' or 'repository/image:tag'.\"+stringSliceHelp(\"tag\"))\n\tcmd.Flags().BoolVar(&buildFlags.TrustBuilder, \"trust-builder\", false, \"Trust the provided builder.\\nAll lifecycle phases will be run in a single container.\\nFor more on trusted builders, and when to trust or untrust a builder, check out our docs here: https://buildpacks.io/docs/tools/pack/concepts/trusted_builders\")\n\tcmd.Flags().BoolVar(&buildFlags.TrustExtraBuildpacks, \"trust-extra-buildpacks\", false, \"Trust buildpacks that are provided in addition to the buildpacks on the builder\")\n\tcmd.Flags().StringArrayVar(&buildFlags.Volumes, \"volume\", nil, \"Mount host volume into the build container, in the form '<host path>:<target path>[:<options>]'.\\n- 'host path': Name of the volume or absolute directory path to mount.\\n- 'target path': The path where the file or directory is available in the container.\\n- 'options' (default \\\"ro\\\"): An optional comma separated list of mount options.\\n    - \\\"ro\\\", volume contents are read-only.\\n    - \\\"rw\\\", volume contents are readable and writeable.\\n    - \\\"volume-opt=<key>=<value>\\\", can be specified more than once, takes a key-value pair consisting of the option name and its value.\"+stringArrayHelp(\"volume\"))\n\tcmd.Flags().StringVar(&buildFlags.Workspace, \"workspace\", \"\", \"Location at which to mount the app dir in the build image\")\n\tcmd.Flags().IntVar(&buildFlags.GID, \"gid\", 0, `Override GID of user's group in the stack's build and run images. 
The provided value must be a positive number`)\n\tcmd.Flags().IntVar(&buildFlags.UID, \"uid\", 0, `Override UID of user in the stack's build and run images. The provided value must be a positive number`)\n\tcmd.Flags().StringVar(&buildFlags.PreviousImage, \"previous-image\", \"\", \"Set previous image to a particular tag reference, digest reference, or (when performing a daemon build) image ID\")\n\tcmd.Flags().StringVar(&buildFlags.SBOMDestinationDir, \"sbom-output-dir\", \"\", \"Path to export SBoM contents.\\nOmitting the flag will yield no SBoM content.\")\n\tcmd.Flags().StringVar(&buildFlags.ReportDestinationDir, \"report-output-dir\", \"\", \"Path to export build report.toml.\\nOmitting the flag will yield no report file.\")\n\tcmd.Flags().BoolVar(&buildFlags.Interactive, \"interactive\", false, \"Launch a terminal UI to depict the build process\")\n\tcmd.Flags().BoolVar(&buildFlags.Sparse, \"sparse\", false, \"Use this flag to avoid saving the run-image layers on disk when the application image is exported to OCI layout format\")\n\tcmd.Flags().BoolVar(&buildFlags.EnableUsernsHost, \"userns-host\", false, \"Enable user namespace isolation for the build containers\")\n\tif !cfg.Experimental {\n\t\tcmd.Flags().MarkHidden(\"interactive\")\n\t\tcmd.Flags().MarkHidden(\"sparse\")\n\t}\n}\n\nfunc validateBuildFlags(flags *BuildFlags, cfg config.Config, inputImageRef client.InputImageReference, logger logging.Logger) error {\n\tif flags.Registry != \"\" && !cfg.Experimental {\n\t\treturn client.NewExperimentError(\"Support for buildpack registries is currently experimental.\")\n\t}\n\n\tif flags.Cache.Launch.Format == cache.CacheImage {\n\t\tlogger.Warn(\"cache definition: 'launch' cache in format 'image' is not supported.\")\n\t}\n\n\tif flags.Cache.Build.Format == cache.CacheImage && flags.CacheImage != \"\" {\n\t\treturn errors.New(\"'cache' flag with 'image' format cannot be used with 'cache-image' flag.\")\n\t}\n\n\tif flags.Cache.Build.Format == cache.CacheImage 
&& !flags.Publish {\n\t\treturn errors.New(\"image cache format requires the 'publish' flag\")\n\t}\n\n\tif flags.CacheImage != \"\" && !flags.Publish {\n\t\treturn errors.New(\"cache-image flag requires the publish flag\")\n\t}\n\n\tif flags.GID < 0 {\n\t\treturn errors.New(\"gid flag must be in the range of 0-2147483647\")\n\t}\n\n\tif flags.UID < 0 {\n\t\treturn errors.New(\"uid flag must be in the range of 0-2147483647\")\n\t}\n\n\tif flags.Interactive && !cfg.Experimental {\n\t\treturn client.NewExperimentError(\"Interactive mode is currently experimental.\")\n\t}\n\n\tif inputImageRef.Layout() && !cfg.Experimental {\n\t\treturn client.NewExperimentError(\"Exporting to OCI layout is currently experimental.\")\n\t}\n\n\tif _, err := os.Stat(inputImageRef.Name()); err == nil && flags.AppPath == \"\" {\n\t\tlogger.Warnf(\"You are building an image named '%s'. If you mean it as an app directory path, run 'pack build <args> --path %s'\",\n\t\t\tinputImageRef.Name(), inputImageRef.Name())\n\t}\n\n\tif flags.ExecutionEnv != \"\" && flags.ExecutionEnv != \"production\" && flags.ExecutionEnv != \"test\" {\n\t\t// RFC: the / character is reserved in case we need to introduce namespacing in the future.\n\t\tvar executionEnvRegex = regexp.MustCompile(`^[a-zA-Z0-9.-]+$`)\n\t\tif ok := executionEnvRegex.MatchString(flags.ExecutionEnv); !ok {\n\t\t\treturn errors.New(\"exec-env MUST only contain numbers, letters, and the characters: . 
or -\")\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc parseEnv(envFiles []string, envVars []string) (map[string]string, error) {\n\tenv := map[string]string{}\n\n\tfor _, envFile := range envFiles {\n\t\tenvFileVars, err := parseEnvFile(envFile)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrapf(err, \"failed to parse env file '%s'\", envFile)\n\t\t}\n\n\t\tfor k, v := range envFileVars {\n\t\t\tenv[k] = v\n\t\t}\n\t}\n\tfor _, envVar := range envVars {\n\t\tenv = addEnvVar(env, envVar)\n\t}\n\treturn env, nil\n}\n\nfunc parseEnvFile(filename string) (map[string]string, error) {\n\tout := make(map[string]string)\n\tf, err := os.ReadFile(filepath.Clean(filename))\n\tif err != nil {\n\t\treturn nil, errors.Wrapf(err, \"open %s\", filename)\n\t}\n\tfor _, line := range strings.Split(string(f), \"\\n\") {\n\t\tline = strings.TrimSpace(line)\n\t\tif line == \"\" {\n\t\t\tcontinue\n\t\t}\n\t\tout = addEnvVar(out, line)\n\t}\n\treturn out, nil\n}\n\nfunc addEnvVar(env map[string]string, item string) map[string]string {\n\tarr := strings.SplitN(item, \"=\", 2)\n\tif len(arr) > 1 {\n\t\tenv[arr[0]] = arr[1]\n\t} else {\n\t\tenv[arr[0]] = os.Getenv(arr[0])\n\t}\n\treturn env\n}\n\nfunc parseProjectToml(appPath, descriptorPath string, logger logging.Logger) (projectTypes.Descriptor, string, error) {\n\tactualPath := descriptorPath\n\tcomputePath := descriptorPath == \"\"\n\n\tif computePath {\n\t\tactualPath = filepath.Join(appPath, \"project.toml\")\n\t}\n\n\tif _, err := os.Stat(actualPath); err != nil {\n\t\tif computePath {\n\t\t\treturn projectTypes.Descriptor{}, \"\", nil\n\t\t}\n\t\treturn projectTypes.Descriptor{}, \"\", errors.Wrap(err, \"stat project descriptor\")\n\t}\n\n\tdescriptor, err := project.ReadProjectDescriptor(actualPath, logger)\n\treturn descriptor, actualPath, err\n}\n\nfunc isForbiddenTag(cfg config.Config, input, lifecycle, builder string) error {\n\tinputImage, err := name.ParseReference(input)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"invalid 
image name %s\", input)\n\t}\n\n\tif builder != \"\" {\n\t\tbuilderImage, err := name.ParseReference(builder)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"parsing builder image %s\", builder)\n\t\t}\n\t\tif inputImage.Context().RepositoryStr() == builderImage.Context().RepositoryStr() {\n\t\t\treturn fmt.Errorf(\"name must not match builder image name\")\n\t\t}\n\t}\n\n\tif lifecycle != \"\" {\n\t\tlifecycleImage, err := name.ParseReference(lifecycle)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"parsing lifecycle image %s\", lifecycle)\n\t\t}\n\t\tif inputImage.Context().RepositoryStr() == lifecycleImage.Context().RepositoryStr() {\n\t\t\treturn fmt.Errorf(\"name must not match lifecycle image name\")\n\t\t}\n\t}\n\n\ttrustedBuilders := getTrustedBuilders(cfg)\n\tfor _, trustedBuilder := range trustedBuilders {\n\t\tbuilder, err := name.ParseReference(trustedBuilder)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif inputImage.Context().RepositoryStr() == builder.Context().RepositoryStr() {\n\t\t\treturn fmt.Errorf(\"name must not match trusted builder name\")\n\t\t}\n\t}\n\n\tdefaultLifecycleImageRef, err := name.ParseReference(config.DefaultLifecycleImageRepo)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"parsing default lifecycle image %s\", config.DefaultLifecycleImageRepo)\n\t}\n\n\tif inputImage.Context().RepositoryStr() == defaultLifecycleImageRef.Context().RepositoryStr() {\n\t\treturn fmt.Errorf(\"name must not match default lifecycle image name\")\n\t}\n\n\tif cfg.DefaultBuilder != \"\" {\n\t\tdefaultBuilderImage, err := name.ParseReference(cfg.DefaultBuilder)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"parsing default builder %s\", cfg.DefaultBuilder)\n\t\t}\n\t\tif inputImage.Context().RepositoryStr() == defaultBuilderImage.Context().RepositoryStr() {\n\t\t\treturn fmt.Errorf(\"name must not match default builder image name\")\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/build_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"reflect\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\tprojectTypes \"github.com/buildpacks/pack/pkg/project/types\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testBuildCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tcfg = config.Config{}\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\n\t\tcommand = commands.Build(logger, cfg, mockClient)\n\t})\n\n\twhen(\"#BuildCommand\", func() {\n\t\twhen(\"no builder is specified\", func() {\n\t\t\tit(\"returns a soft error\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tInspectBuilder(gomock.Any(), false).\n\t\t\t\t\tReturn(&client.BuilderInfo{Description: \"\"}, nil).\n\t\t\t\t\tAnyTimes()\n\n\t\t\t\tcommand.SetArgs([]string{\"image\"})\n\t\t\t\terr := 
command.Execute()\n\t\t\t\th.AssertError(t, err, client.NewSoftError().Error())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a builder and image are set\", func() {\n\t\t\tit(\"builds an image with a builder\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithImage(\"my-builder\", \"image\")).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\n\t\t\tit(\"builds an image with a builder set via the short flag\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithImage(\"my-builder\", \"image\")).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\tcommand.SetArgs([]string{\"-B\", \"my-builder\", \"image\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Builder 'my-builder' is untrusted\")\n\t\t\t})\n\n\t\t\twhen(\"the builder is trusted\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithTrustedBuilder(true)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcfg := config.Config{TrustedBuilders: []config.TrustedBuilder{{Name: \"my-builder\"}}}\n\t\t\t\t\tcommand = commands.Build(logger, cfg, mockClient)\n\t\t\t\t})\n\t\t\t\tit(\"sets the trust builder option\", func() {\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Builder 'my-builder' is trusted\")\n\t\t\t\t})\n\t\t\t\twhen(\"a lifecycle-image is provided\", func() {\n\t\t\t\t\tit(\"ignores the provided lifecycle image and uses the trusted builder's lifecycle\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--lifecycle-image\", \"some-lifecycle-image\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t\th.AssertContains(t, 
outBuf.String(), \"Warning: Ignoring the provided lifecycle image as the builder is trusted, running the creator in a single container using the provided builder\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"the builder is known to be trusted and suggested\", func() {\n\t\t\t\tit(\"sets the trust builder option\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithTrustedBuilder(true)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"heroku/builder:24\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Builder 'heroku/builder:24' is trusted\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"the builder is known to be trusted but not suggested\", func() {\n\t\t\t\tit(\"sets the trust builder option\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithTrustedBuilder(true)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"heroku/builder:22\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Builder 'heroku/builder:22' is trusted\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"the image name matches a builder name\", func() {\n\t\t\t\tit(\"refuses to build\", func() {\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"heroku/builder:test\", \"--builder\", \"heroku/builder:24\"})\n\t\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"name must not match builder image name\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"the image name matches a trusted-builder name\", func() {\n\t\t\t\tit(\"refuses to build\", func() {\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"heroku/builder:test\", \"--builder\", \"test\", \"--trust-builder\"})\n\t\t\t\t\th.AssertNotNil(t, 
command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"name must not match trusted builder name\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"the image name matches a lifecycle image name\", func() {\n\t\t\t\tit(\"refuses to build\", func() {\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"buildpacksio/lifecycle:test\", \"--builder\", \"test\", \"--trust-builder\"})\n\t\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"name must not match default lifecycle image name\")\n\t\t\t\t})\n\n\t\t\t\tit(\"refuses to build when using fully qualified name\", func() {\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"docker.io/buildpacksio/lifecycle:test\", \"--builder\", \"test\", \"--trust-builder\"})\n\t\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"name must not match default lifecycle image name\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"the builder is not trusted\", func() {\n\t\t\t\tit(\"warns the user that the builder is untrusted\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithTrustedBuilder(false)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"org/builder:unknown\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Builder 'org/builder:unknown' is untrusted\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--buildpack-registry flag is specified but experimental isn't set in the config\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--buildpack-registry\", \"some-registry\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"Support for buildpack registries is currently 
experimental.\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a network is given\", func() {\n\t\t\tit(\"forwards the network onto the client\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithNetwork(\"my-network\")).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--network\", \"my-network\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--platform\", func() {\n\t\t\tit(\"sets platform\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPlatform(\"linux/amd64\")).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--platform\", \"linux/amd64\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy\", func() {\n\t\t\tit(\"sets pull-policy=never\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPullPolicy(image.PullNever)).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--pull-policy\", \"never\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t\tit(\"returns error for unknown policy\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--pull-policy\", \"unknown-policy\"})\n\t\t\t\th.AssertError(t, command.Execute(), \"parsing pull policy\")\n\t\t\t})\n\t\t\tit(\"takes precedence over a configured pull policy\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPullPolicy(image.PullNever)).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcfg := config.Config{PullPolicy: \"if-not-present\"}\n\t\t\t\tcommand := commands.Build(logger, cfg, mockClient)\n\n\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--pull-policy\", \"never\"})\n\t\t\t\th.AssertNil(t, 
command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy is not specified\", func() {\n\t\t\twhen(\"no pull policy set in config\", func() {\n\t\t\t\tit(\"uses the default policy\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPullPolicy(image.PullAlways)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"pull policy is set in config\", func() {\n\t\t\t\tit(\"uses the set policy\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPullPolicy(image.PullNever)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcfg := config.Config{PullPolicy: \"never\"}\n\t\t\t\t\tcommand := commands.Build(logger, cfg, mockClient)\n\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"volume mounts are specified\", func() {\n\t\t\tit(\"mounts the volumes\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithVolumes([]string{\"a:b\", \"c:d\"})).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--volume\", \"a:b\", \"--volume\", \"c:d\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\n\t\t\tit(\"warns when running with an untrusted builder\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithVolumes([]string{\"a:b\", \"c:d\"})).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--volume\", \"a:b\", \"--volume\", \"c:d\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Warning: Using untrusted builder with volume mounts\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a default process is specified\", 
func() {\n\t\t\tit(\"sets that process\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsDefaultProcess(\"my-proc\")).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--default-process\", \"my-proc\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"env file\", func() {\n\t\t\twhen(\"an env file is provided\", func() {\n\t\t\t\tvar envPath string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tenvfile, err := os.CreateTemp(\"\", \"envfile\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer envfile.Close()\n\n\t\t\t\t\tenvfile.WriteString(`KEY=VALUE`)\n\t\t\t\t\tenvPath = envfile.Name()\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNil(t, os.RemoveAll(envPath))\n\t\t\t\t})\n\n\t\t\t\tit(\"builds an image with env variables read from the env file\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithEnv(map[string]string{\n\t\t\t\t\t\t\t\"KEY\": \"VALUE\",\n\t\t\t\t\t\t})).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--env-file\", envPath})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"an env file is provided but doesn't exist\", func() {\n\t\t\t\tit(\"fails to run\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--env-file\", \"\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, \"parse env file\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"an empty env file is provided\", func() {\n\t\t\t\tvar envPath string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tenvfile, err := os.CreateTemp(\"\", \"envfile\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer envfile.Close()\n\n\t\t\t\t\tenvfile.WriteString(``)\n\t\t\t\t\tenvPath = envfile.Name()\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNil(t, 
os.RemoveAll(envPath))\n\t\t\t\t})\n\n\t\t\t\tit(\"successfully builds\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithEnv(map[string]string{})).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--env-file\", envPath})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"two env files are provided with conflicting keys\", func() {\n\t\t\t\tvar envPath1 string\n\t\t\t\tvar envPath2 string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tenvfile1, err := os.CreateTemp(\"\", \"envfile\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer envfile1.Close()\n\n\t\t\t\t\tenvfile1.WriteString(\"KEY1=VALUE1\\nKEY2=IGNORED\")\n\t\t\t\t\tenvPath1 = envfile1.Name()\n\n\t\t\t\t\tenvfile2, err := os.CreateTemp(\"\", \"envfile\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer envfile2.Close()\n\n\t\t\t\t\tenvfile2.WriteString(\"KEY2=VALUE2\")\n\t\t\t\t\tenvPath2 = envfile2.Name()\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNil(t, os.RemoveAll(envPath1))\n\t\t\t\t\th.AssertNil(t, os.RemoveAll(envPath2))\n\t\t\t\t})\n\n\t\t\t\tit(\"builds an image with the last value of each env variable\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithEnv(map[string]string{\n\t\t\t\t\t\t\t\"KEY1\": \"VALUE1\",\n\t\t\t\t\t\t\t\"KEY2\": \"VALUE2\",\n\t\t\t\t\t\t})).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--env-file\", envPath1, \"--env-file\", envPath2})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a cache-image is passed\", func() {\n\t\t\twhen(\"--publish is not used\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--cache-image\", \"some-cache-image\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, 
\"cache-image flag requires the publish flag\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"--publish is used\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithCacheImage(\"some-cache-image\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--cache-image\", \"some-cache-image\", \"--publish\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"cache flag with 'format=image' is passed\", func() {\n\t\t\twhen(\"--publish is not used\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--cache\", \"type=build;format=image;name=myorg/myimage:cache\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, \"image cache format requires the 'publish' flag\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"--publish is used\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithCacheFlags(\"type=build;format=image;name=myorg/myimage:cache;type=launch;format=volume;\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--cache\", \"type=build;format=image;name=myorg/myimage:cache\", \"--publish\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"used together with --cache-image\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--cache-image\", \"some-cache-image\", \"--cache\", \"type=build;format=image;name=myorg/myimage:cache\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, \"'cache' flag with 'image' format cannot be used with 'cache-image' flag\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"'type=launch;format=image' is used\", func() {\n\t\t\t\tit(\"warns\", func() 
{\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithCacheFlags(\"type=build;format=volume;type=launch;format=image;name=myorg/myimage:cache;\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--cache\", \"type=launch;format=image;name=myorg/myimage:cache\", \"--publish\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Warning: cache definition: 'launch' cache in format 'image' is not supported.\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a valid lifecycle-image is provided\", func() {\n\t\t\twhen(\"only the image repo is provided\", func() {\n\t\t\t\tit(\"uses the provided lifecycle-image and parses it correctly\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithLifecycleImage(\"index.docker.io/library/some-lifecycle-image:latest\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--lifecycle-image\", \"some-lifecycle-image\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"a custom image repo is provided\", func() {\n\t\t\t\tit(\"uses the provided lifecycle-image and parses it correctly\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithLifecycleImage(\"test.com/some-lifecycle-image:latest\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--lifecycle-image\", \"test.com/some-lifecycle-image\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"a custom image repo is provided with a tag\", func() {\n\t\t\t\tit(\"uses the provided lifecycle-image and parses it correctly\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), 
EqBuildOptionsWithLifecycleImage(\"test.com/some-lifecycle-image:v1\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--lifecycle-image\", \"test.com/some-lifecycle-image:v1\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"a custom image repo is provided with a digest\", func() {\n\t\t\t\tit(\"uses the provided lifecycle-image and parses it correctly\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithLifecycleImage(\"test.com/some-lifecycle-image@sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--lifecycle-image\", \"test.com/some-lifecycle-image@sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"an invalid lifecycle-image is provided\", func() {\n\t\t\twhen(\"the repo name is invalid\", func() {\n\t\t\t\tit(\"returns a parse error\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--lifecycle-image\", \"some-!nv@l!d-image\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, \"could not parse reference: some-!nv@l!d-image\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a lifecycle-image is not provided\", func() {\n\t\t\twhen(\"a lifecycle-image is set in the config\", func() {\n\t\t\t\tit(\"uses the lifecycle-image from the config after parsing it\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithLifecycleImage(\"index.docker.io/library/some-lifecycle-image:latest\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcfg := config.Config{LifecycleImage: \"some-lifecycle-image\"}\n\t\t\t\t\tcommand := commands.Build(logger, cfg, 
mockClient)\n\n\t\t\t\t\tlogger.WantVerbose(true)\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"a lifecycle-image is not set in the config\", func() {\n\t\t\t\tit(\"passes an empty lifecycle image and does not throw an error\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithLifecycleImage(\"\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"env vars are passed as flags\", func() {\n\t\t\tvar (\n\t\t\t\ttmpVar   = \"tmpVar\"\n\t\t\t\ttmpValue = \"tmpKey\"\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.Setenv(tmpVar, tmpValue))\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNil(t, os.Unsetenv(tmpVar))\n\t\t\t})\n\n\t\t\tit(\"sets flag variables\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithEnv(map[string]string{\n\t\t\t\t\t\t\"KEY\":  \"VALUE\",\n\t\t\t\t\t\ttmpVar: tmpValue,\n\t\t\t\t\t})).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--env\", \"KEY=VALUE\", \"--env\", tmpVar})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"build fails\", func() {\n\t\t\tit(\"should show an error\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), gomock.Any()).\n\t\t\t\t\tReturn(errors.New(\"\"))\n\n\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"failed to build\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"user specifies an invalid project descriptor file\", func() {\n\t\t\tit(\"should show an error\", func() {\n\t\t\t\tprojectTomlPath := 
\"/incorrect/path/to/project.toml\"\n\n\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"--descriptor\", projectTomlPath, \"image\"})\n\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"parsing project descriptor\", func() {\n\t\t\twhen(\"file is valid\", func() {\n\t\t\t\tvar projectTomlPath string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tprojectToml, err := os.CreateTemp(\"\", \"project.toml\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer projectToml.Close()\n\n\t\t\t\t\tprojectToml.WriteString(`\n[project]\nname = \"Sample\"\n\n[[build.buildpacks]]\nid = \"example/lua\"\nversion = \"1.0\"\n`)\n\t\t\t\t\tprojectTomlPath = projectToml.Name()\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNil(t, os.RemoveAll(projectTomlPath))\n\t\t\t\t})\n\n\t\t\t\tit(\"should build an image with configuration in descriptor\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithProjectDescriptor(projectTypes.Descriptor{\n\t\t\t\t\t\t\tProject: projectTypes.Project{\n\t\t\t\t\t\t\t\tName: \"Sample\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\tID:      \"example/lua\",\n\t\t\t\t\t\t\t\t\tVersion: \"1.0\",\n\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tSchemaVersion: api.MustParse(\"0.1\"),\n\t\t\t\t\t\t})).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"--descriptor\", projectTomlPath, \"image\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"file has a builder specified\", func() {\n\t\t\t\tvar projectTomlPath string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tprojectToml, err := os.CreateTemp(\"\", \"project.toml\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer projectToml.Close()\n\n\t\t\t\t\tprojectToml.WriteString(`\n[project]\nname = \"Sample\"\n\n[build]\nbuilder = 
\"my-builder\"\n`)\n\t\t\t\t\tprojectTomlPath = projectToml.Name()\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNil(t, os.RemoveAll(projectTomlPath))\n\t\t\t\t})\n\t\t\t\twhen(\"a builder is not explicitly passed by the user\", func() {\n\t\t\t\t\tit(\"should build an image with configuration in descriptor\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithBuilder(\"my-builder\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"--descriptor\", projectTomlPath, \"image\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"a builder is explicitly passed by the user\", func() {\n\t\t\t\t\tit(\"should build an image with the passed builder flag\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithBuilder(\"flag-builder\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"flag-builder\", \"--descriptor\", projectTomlPath, \"image\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"file is invalid\", func() {\n\t\t\t\tvar projectTomlPath string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tprojectToml, err := os.CreateTemp(\"\", \"project.toml\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer projectToml.Close()\n\n\t\t\t\t\tprojectToml.WriteString(\"project]\")\n\t\t\t\t\tprojectTomlPath = projectToml.Name()\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNil(t, os.RemoveAll(projectTomlPath))\n\t\t\t\t})\n\n\t\t\t\tit(\"should fail to build\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"--descriptor\", projectTomlPath, \"image\"})\n\t\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"descriptor path is NOT specified\", func() {\n\t\t\t\twhen(\"project.toml exists in source repo\", func() {\n\t\t\t\t\tit.Before(func() 
{\n\t\t\t\t\t\th.AssertNil(t, os.Chdir(\"testdata\"))\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.AssertNil(t, os.Chdir(\"..\"))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should use project.toml in source repo\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithProjectDescriptor(projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tProject: projectTypes.Project{\n\t\t\t\t\t\t\t\t\tName: \"Sample\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\tID:      \"example/lua\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"1.0\",\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\tEnv: []projectTypes.EnvVar{{\n\t\t\t\t\t\t\t\t\t\tName:  \"KEY1\",\n\t\t\t\t\t\t\t\t\t\tValue: \"VALUE1\",\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tSchemaVersion: api.MustParse(\"0.1\"),\n\t\t\t\t\t\t\t})).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"project.toml does NOT exist in source repo\", func() {\n\t\t\t\t\tit(\"should use empty descriptor\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithEnv(map[string]string{})).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"descriptor path is specified\", func() {\n\t\t\t\twhen(\"descriptor file exists\", func() {\n\t\t\t\t\tvar projectTomlPath string\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tprojectTomlPath = filepath.Join(\"testdata\", \"project.toml\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should use specified descriptor\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), 
EqBuildOptionsWithProjectDescriptor(projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tProject: projectTypes.Project{\n\t\t\t\t\t\t\t\t\tName: \"Sample\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\tID:      \"example/lua\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"1.0\",\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\tEnv: []projectTypes.EnvVar{{\n\t\t\t\t\t\t\t\t\t\tName:  \"KEY1\",\n\t\t\t\t\t\t\t\t\t\tValue: \"VALUE1\",\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tSchemaVersion: api.MustParse(\"0.1\"),\n\t\t\t\t\t\t\t})).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"--descriptor\", projectTomlPath, \"image\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"descriptor file does NOT exist in source repo\", func() {\n\t\t\t\t\tit(\"should fail with an error message\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"--descriptor\", \"non-existent-path\", \"image\"})\n\t\t\t\t\t\th.AssertError(t, command.Execute(), \"stat project descriptor\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"additional tags are specified\", func() {\n\t\t\tit(\"forwards additional tags to lifecycle\", func() {\n\t\t\t\texpectedTags := []string{\"additional-tag-1\", \"additional-tag-2\"}\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithAdditionalTags(expectedTags)).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--tag\", expectedTags[0], \"--tag\", expectedTags[1]})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"gid flag is provided\", func() {\n\t\t\twhen(\"--gid is a valid value\", func() {\n\t\t\t\tit(\"override build option should be set to true\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), 
EqBuildOptionsWithOverrideGroupID(1)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--gid\", \"1\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"--gid is an invalid value\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--gid\", \"-1\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, \"gid flag must be in the range of 0-2147483647\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"gid flag is not provided\", func() {\n\t\t\tit(\"override build option should be set to false\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithOverrideGroupID(-1)).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"previous-image flag is provided\", func() {\n\t\t\twhen(\"image is invalid\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"/x@/y/?!z\", \"--previous-image\", \"previous-image\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, \"forbidden image name\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"previous-image is invalid\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPreviousImage(\"%%%\")).\n\t\t\t\t\t\tReturn(errors.New(\"\"))\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--previous-image\", \"%%%\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, \"failed to build\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--publish is false\", func() {\n\t\t\t\tit(\"previous-image should be passed to builder\", func() 
{\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPreviousImage(\"previous-image\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"image\", \"--previous-image\", \"previous-image\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--publish is true\", func() {\n\t\t\t\twhen(\"image and previous-image are in same registry\", func() {\n\t\t\t\t\tit(\"previous-image should be passed to builder\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPreviousImage(\"index.docker.io/some/previous:latest\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"--builder\", \"my-builder\", \"index.docker.io/some/image:latest\", \"--previous-image\", \"index.docker.io/some/previous:latest\", \"--publish\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"interactive flag is provided but experimental isn't set in the config\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--interactive\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"Interactive mode is currently experimental.\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"sbom destination directory is provided\", func() {\n\t\t\tit(\"forwards the sbom output dir to the client\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithSBOMOutputDir(\"some-output-dir\")).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--sbom-output-dir\", \"some-output-dir\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--creation-time\", func() {\n\t\t\twhen(\"provided as 'now'\", func() {\n\t\t\t\tit(\"passes it to the builder\", func() {\n\t\t\t\t\texpectedTime := 
time.Now().UTC()\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithDateTime(&expectedTime)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--creation-time\", \"now\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"provided as unix timestamp\", func() {\n\t\t\t\tit(\"passes it to the builder\", func() {\n\t\t\t\t\texpectedTime, err := time.Parse(\"2006-01-02T03:04:05Z\", \"2019-08-19T00:00:01Z\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithDateTime(&expectedTime)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--creation-time\", \"1566172801\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"not provided\", func() {\n\t\t\t\tit(\"is nil\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithDateTime(nil)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"path to app dir or zip-formatted file is provided\", func() {\n\t\t\tit(\"builds with the specified path\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithPath(\"my-source\")).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--path\", \"my-source\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a local path with the same string as the specified image name exists\", func() {\n\t\t\twhen(\"an app path is specified\", func() {\n\t\t\t\tit(\"doesn't warn that the positional argument will not be treated as the source path\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), 
EqBuildOptionsWithImage(\"my-builder\", \"testdata\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"testdata\", \"--builder\", \"my-builder\", \"--path\", \"my-source\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertNotContainsMatch(t, outBuf.String(), `Warning: You are building an image named '([^']+)'\\. If you mean it as an app directory path, run 'pack build <args> --path ([^']+)'`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"no app path is specified\", func() {\n\t\t\t\tit(\"warns that the positional argument will not be treated as the source path\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithImage(\"my-builder\", \"testdata\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"testdata\", \"--builder\", \"my-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Warning: You are building an image named 'testdata'. If you mean it as an app directory path, run 'pack build <args> --path testdata'\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"export to OCI layout is expected but experimental isn't set in the config\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"oci:image\", \"--builder\", \"my-builder\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"Exporting to OCI layout is currently experimental.\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--exec-env\", func() {\n\t\t\twhen(\"is not provided\", func() {\n\t\t\t\tit(\"sets 'production' as the default value\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithExecEnv(\"production\")).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"is provided\", func() {\n\t\t\t\twhen(\"contains valid 
characters\", func() {\n\t\t\t\t\tit(\"forwards the exec-value (only letters) into the client\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithExecEnv(\"something\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--exec-env\", \"something\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"forwards the exec-value (only numbers) into the client\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithExecEnv(\"1234\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--exec-env\", \"1234\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"forwards the exec-value (mix letters and numbers) into the client\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithExecEnv(\"env1\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--exec-env\", \"env1\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"forwards the exec-value (mix letters, numbers and .) 
into the client\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithExecEnv(\"env1.1\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--exec-env\", \"env1.1\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"forwards the exec-value (mix letters, numbers and -) into the client\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithExecEnv(\"env-1\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--exec-env\", \"env-1\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"forwards the exec-value (mix letters, numbers, . and  -) into the client\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithExecEnv(\"env-1.1\")).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--exec-env\", \"env-1.1\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"contains invalid characters\", func() {\n\t\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--exec-env\", \"$production\"})\n\t\t\t\t\t\terr := command.Execute()\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t\th.AssertError(t, err, \"exec-env MUST only contain numbers, letters, and the characters: . 
or -\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--insecure-registry is provided\", func() {\n\t\t\tit(\"sets one insecure registry\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithInsecureRegistries([]string{\n\t\t\t\t\t\t\"foo.bar\",\n\t\t\t\t\t})).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--insecure-registry\", \"foo.bar\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\n\t\t\tit(\"sets more than one insecure registry\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithInsecureRegistries([]string{\n\t\t\t\t\t\t\"foo.bar\",\n\t\t\t\t\t\t\"foo.com\",\n\t\t\t\t\t})).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"image\", \"--builder\", \"my-builder\", \"--insecure-registry\", \"foo.bar\", \"--insecure-registry\", \"foo.com\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"export to OCI layout is expected\", func() {\n\t\tvar (\n\t\t\tsparse        bool\n\t\t\tpreviousImage string\n\t\t\tlayoutDir     string\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tlayoutDir = filepath.Join(paths.RootDir, \"local\", \"repo\")\n\t\t\tpreviousImage = \"\"\n\t\t\tcfg = config.Config{\n\t\t\t\tExperimental:        true,\n\t\t\t\tLayoutRepositoryDir: layoutDir,\n\t\t\t}\n\t\t\tcommand = commands.Build(logger, cfg, mockClient)\n\t\t})\n\n\t\twhen(\"path to save the image is provided\", func() {\n\t\t\tit(\"build is called with oci layout configuration\", func() {\n\t\t\t\tsparse = false\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithLayoutConfig(\"image\", previousImage, sparse, layoutDir)).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"oci:image\", \"--builder\", \"my-builder\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"previous-image flag is provided\", func() 
{\n\t\t\tit(\"build is called with oci layout configuration\", func() {\n\t\t\t\tsparse = false\n\t\t\t\tpreviousImage = \"my-previous-image\"\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithLayoutConfig(\"image\", previousImage, sparse, layoutDir)).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"oci:image\", \"--previous-image\", \"oci:my-previous-image\", \"--builder\", \"my-builder\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"-sparse flag is provided\", func() {\n\t\t\tit(\"build is called with oci layout configuration and sparse true\", func() {\n\t\t\t\tsparse = true\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tBuild(gomock.Any(), EqBuildOptionsWithLayoutConfig(\"image\", previousImage, sparse, layoutDir)).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\"oci:image\", \"--sparse\", \"--builder\", \"my-builder\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc EqBuildOptionsWithImage(builder, image string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Builder=%s and Image=%s\", builder, image),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.Builder == builder && o.Image == image\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsDefaultProcess(defaultProc string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Default Process Type=%s\", defaultProc),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.DefaultProcessType == defaultProc\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithPlatform(platform string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Platform=%s\", platform),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.Platform == platform\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithPullPolicy(policy image.PullPolicy) gomock.Matcher {\n\treturn 
buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"PullPolicy=%s\", policy),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.PullPolicy == policy\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithCacheImage(cacheImage string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"CacheImage=%s\", cacheImage),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.CacheImage == cacheImage\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithCacheFlags(cacheFlags string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"CacheFlags=%s\", cacheFlags),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.Cache.String() == cacheFlags\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithLifecycleImage(lifecycleImage string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"LifecycleImage=%s\", lifecycleImage),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.LifecycleImage == lifecycleImage\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithNetwork(network string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Network=%s\", network),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.ContainerConfig.Network == network\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithBuilder(builder string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Builder=%s\", builder),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.Builder == builder\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithTrustedBuilder(trustBuilder bool) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Trust Builder=%t\", trustBuilder),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.TrustBuilder(o.Builder) == trustBuilder\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithVolumes(volumes []string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Volumes=%s\", 
volumes),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn reflect.DeepEqual(o.ContainerConfig.Volumes, volumes)\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithAdditionalTags(additionalTags []string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"AdditionalTags=%s\", additionalTags),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn reflect.DeepEqual(o.AdditionalTags, additionalTags)\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithProjectDescriptor(descriptor projectTypes.Descriptor) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Descriptor=%s\", descriptor),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn reflect.DeepEqual(o.ProjectDescriptor, descriptor)\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithEnv(env map[string]string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Env=%+v\", env),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\tfor k, v := range o.Env {\n\t\t\t\tif env[k] != v {\n\t\t\t\t\treturn false\n\t\t\t\t}\n\t\t\t}\n\t\t\tfor k, v := range env {\n\t\t\t\tif o.Env[k] != v {\n\t\t\t\t\treturn false\n\t\t\t\t}\n\t\t\t}\n\t\t\treturn true\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithOverrideGroupID(gid int) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"GID=%d\", gid),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.GroupID == gid\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithPreviousImage(prevImage string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Previous image=%s\", prevImage),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.PreviousImage == prevImage\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithSBOMOutputDir(s string) interface{} {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"sbom-destination-dir=%s\", s),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.SBOMDestinationDir == 
s\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithDateTime(t *time.Time) interface{} {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"CreationTime=%s\", t),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\tif t == nil {\n\t\t\t\treturn o.CreationTime == nil\n\t\t\t}\n\t\t\treturn o.CreationTime.Sub(*t) < 5*time.Second && t.Sub(*o.CreationTime) < 5*time.Second\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithPath(path string) interface{} {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"AppPath=%s\", path),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.AppPath == path\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithLayoutConfig(image, previousImage string, sparse bool, layoutDir string) interface{} {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"image=%s, previous-image=%s, sparse=%t, layout-dir=%s\", image, previousImage, sparse, layoutDir),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\tif o.Layout() {\n\t\t\t\tresult := o.Image == image\n\t\t\t\tif previousImage != \"\" {\n\t\t\t\t\tresult = result && previousImage == o.PreviousImage\n\t\t\t\t}\n\t\t\t\treturn result && o.LayoutConfig.Sparse == sparse && o.LayoutConfig.LayoutRepoDir == layoutDir\n\t\t\t}\n\t\t\treturn false\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithExecEnv(s string) interface{} {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"exec-env=%s\", s),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\treturn o.CNBExecutionEnv == s\n\t\t},\n\t}\n}\n\nfunc EqBuildOptionsWithInsecureRegistries(insecureRegistries []string) gomock.Matcher {\n\treturn buildOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Insecure Registries=%s\", insecureRegistries),\n\t\tequals: func(o client.BuildOptions) bool {\n\t\t\tif len(o.InsecureRegistries) != len(insecureRegistries) {\n\t\t\t\treturn false\n\t\t\t}\n\t\t\treturn reflect.DeepEqual(o.InsecureRegistries, insecureRegistries)\n\t\t},\n\t}\n}\n\ntype buildOptionsMatcher struct {\n\tequals     
 func(client.BuildOptions) bool\n\tdescription string\n}\n\nfunc (m buildOptionsMatcher) Matches(x interface{}) bool {\n\tif b, ok := x.(client.BuildOptions); ok {\n\t\treturn m.equals(b)\n\t}\n\treturn false\n}\n\nfunc (m buildOptionsMatcher) String() string {\n\treturn \"is a BuildOptions with \" + m.description\n}\n"
  },
  {
    "path": "internal/commands/builder.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\tbuilderwriter \"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc NewBuilderCommand(logger logging.Logger, cfg config.Config, client PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"builder\",\n\t\tAliases: []string{\"builders\"},\n\t\tShort:   \"Interact with builders\",\n\t\tRunE:    nil,\n\t}\n\n\tcmd.AddCommand(BuilderCreate(logger, cfg, client))\n\tcmd.AddCommand(BuilderInspect(logger, cfg, client, builderwriter.NewFactory()))\n\tcmd.AddCommand(BuilderSuggest(logger, client))\n\tAddHelpFlag(cmd, \"builder\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/builder_create.go",
"content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// BuilderCreateFlags defines flags provided to the BuilderCreate command\ntype BuilderCreateFlags struct {\n\tPublish               bool\n\tAppendImageNameSuffix bool\n\tBuilderTomlPath       string\n\tRegistry              string\n\tPolicy                string\n\tFlatten               []string\n\tTargets               []string\n\tLabel                 map[string]string\n\tAdditionalTags        []string\n}\n\n// BuilderCreate creates a builder image, based on a builder config\nfunc BuilderCreate(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tvar flags BuilderCreateFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"create <image-name> --config <builder-config-path>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Create builder image\",\n\t\tExample: \"pack builder create my-builder:bionic --config ./builder.toml\",\n\t\tLong: `A builder is an image that bundles all the bits and information on how to build your apps, such as buildpacks, an implementation of the lifecycle, and a build-time environment that pack uses when executing the lifecycle. When building an app, you can use community builders; you can see our suggestions by running\n\n\tpack builders suggest\n\nCreating a custom builder allows you to control what buildpacks are used and what image apps are based on. 
For more on how to create a builder, see: https://buildpacks.io/docs/operator-guide/create-a-builder/.\n`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tif err := validateCreateFlags(&flags, cfg); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tstringPolicy := flags.Policy\n\t\t\tif stringPolicy == \"\" {\n\t\t\t\tstringPolicy = cfg.PullPolicy\n\t\t\t}\n\t\t\tpullPolicy, err := image.ParsePullPolicy(stringPolicy)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"parsing pull policy %s\", flags.Policy)\n\t\t\t}\n\n\t\t\tbuilderConfig, warns, err := builder.ReadConfig(flags.BuilderTomlPath)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"invalid builder toml\")\n\t\t\t}\n\t\t\tfor _, w := range warns {\n\t\t\t\tlogger.Warnf(\"builder configuration: %s\", w)\n\t\t\t}\n\n\t\t\trelativeBaseDir, err := filepath.Abs(filepath.Dir(flags.BuilderTomlPath))\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"getting absolute path for config\")\n\t\t\t}\n\n\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv(builderConfig.Build.Env, flags.BuilderTomlPath)\n\t\t\tfor _, v := range warnings {\n\t\t\t\tlogger.Warn(v)\n\t\t\t}\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\ttoFlatten, err := buildpack.ParseFlattenBuildModules(flags.Flatten)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tmultiArchCfg, err := processMultiArchitectureConfig(logger, flags.Targets, builderConfig.Targets, !flags.Publish)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tif len(multiArchCfg.Targets()) == 0 {\n\t\t\t\tlogger.Infof(\"Pro tip: use the --target flag OR [[targets]] in builder.toml to specify the desired platform\")\n\t\t\t}\n\n\t\t\tif !flags.Publish && flags.AppendImageNameSuffix {\n\t\t\t\tlogger.Warnf(\"--append-image-name-suffix will be ignored; it only takes effect when combined with --publish\")\n\t\t\t}\n\n\t\t\t// Create temporary directory for lifecycle downloads when using Docker images\n\t\t\tvar 
tempDir string\n\t\t\tif hasDockerLifecycle(builderConfig) {\n\t\t\t\ttempDir, err = os.MkdirTemp(\"\", \"pack-builder-*\")\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"creating temporary directory\")\n\t\t\t\t}\n\t\t\t\tdefer func() {\n\t\t\t\t\tif cleanupErr := os.RemoveAll(tempDir); cleanupErr != nil {\n\t\t\t\t\t\tlogger.Debugf(\"Failed to clean up temporary directory %s: %v\", tempDir, cleanupErr)\n\t\t\t\t\t}\n\t\t\t\t}()\n\t\t\t}\n\n\t\t\timageName := args[0]\n\t\t\tif err := pack.CreateBuilder(cmd.Context(), client.CreateBuilderOptions{\n\t\t\t\tRelativeBaseDir:       relativeBaseDir,\n\t\t\t\tBuildConfigEnv:        envMap,\n\t\t\t\tBuilderName:           imageName,\n\t\t\t\tConfig:                builderConfig,\n\t\t\t\tPublish:               flags.Publish,\n\t\t\t\tAppendImageNameSuffix: flags.AppendImageNameSuffix && flags.Publish,\n\t\t\t\tRegistry:              flags.Registry,\n\t\t\t\tPullPolicy:            pullPolicy,\n\t\t\t\tFlatten:               toFlatten,\n\t\t\t\tLabels:                flags.Label,\n\t\t\t\tTargets:               multiArchCfg.Targets(),\n\t\t\t\tTempDirectory:         tempDir,\n\t\t\t\tAdditionalTags:        flags.AdditionalTags,\n\t\t\t}); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully created builder image %s\", style.Symbol(imageName))\n\t\t\tlogging.Tip(logger, \"Run %s to use this builder\", style.Symbol(fmt.Sprintf(\"pack build <image-name> --builder %s\", imageName)))\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().StringVarP(&flags.Registry, \"buildpack-registry\", \"R\", cfg.DefaultRegistryName, \"Buildpack Registry by name\")\n\tif !cfg.Experimental {\n\t\tcmd.Flags().MarkHidden(\"buildpack-registry\")\n\t}\n\tcmd.Flags().StringVarP(&flags.BuilderTomlPath, \"config\", \"c\", \"\", \"Path to builder TOML file (required)\")\n\tcmd.Flags().BoolVar(&flags.Publish, \"publish\", false, \"Publish the builder directly to the container registry specified in <image-name>, instead 
of the daemon.\")\n\tcmd.Flags().BoolVar(&flags.AppendImageNameSuffix, \"append-image-name-suffix\", false, \"Append an [os]-[arch] suffix to intermediate image tags when creating a multi-arch image; useful when publishing to a registry that doesn't allow overwriting existing tags\")\n\tcmd.Flags().StringVar(&flags.Policy, \"pull-policy\", \"\", \"Pull policy to use. Accepted values are always, never, and if-not-present. The default is always\")\n\tcmd.Flags().StringArrayVar(&flags.Flatten, \"flatten\", nil, \"List of buildpacks to flatten together into a single layer (format: '<buildpack-id>@<buildpack-version>,<buildpack-id>@<buildpack-version>')\")\n\tcmd.Flags().StringToStringVarP(&flags.Label, \"label\", \"l\", nil, \"Labels to add to the builder image, in the form of '<name>=<value>'\")\n\tcmd.Flags().StringSliceVarP(&flags.Targets, \"target\", \"t\", nil,\n\t\t`Target platforms to build for.\nTargets should be in the format '[os][/arch][/variant]:[distroname@osversion@anotherversion];[distroname@osversion]'.\n- To specify two different architectures:  '--target \"linux/amd64\" --target \"linux/arm64\"'\n- To specify the distribution version: '--target \"linux/arm/v6:ubuntu@14.04\"'\n- To specify multiple distribution versions: '--target \"linux/arm/v6:ubuntu@14.04\"  --target \"linux/arm/v6:ubuntu@16.04\"'\n\t`)\n\tcmd.Flags().StringSliceVarP(&flags.AdditionalTags, \"tag\", \"\", nil, \"Additional tags to push the output image to.\\nTags should be in the format 'image:tag' or 'repository/image:tag'.\"+stringSliceHelp(\"tag\"))\n\n\tAddHelpFlag(cmd, \"create\")\n\treturn cmd\n}\n\nfunc hasDockerLifecycle(builderConfig builder.Config) bool {\n\treturn buildpack.HasDockerLocator(builderConfig.Lifecycle.URI)\n}\n\nfunc validateCreateFlags(flags *BuilderCreateFlags, cfg config.Config) error {\n\tif flags.Publish && flags.Policy == image.PullNever.String() {\n\t\treturn errors.Errorf(\"--publish and --pull-policy never cannot be used together. 
The --publish flag requires the use of remote images.\")\n\t}\n\n\tif flags.Registry != \"\" && !cfg.Experimental {\n\t\treturn client.NewExperimentError(\"Support for buildpack registries is currently experimental.\")\n\t}\n\n\tif flags.BuilderTomlPath == \"\" {\n\t\treturn errors.Errorf(\"Please provide a builder config path, using --config.\")\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/builder_create_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"reflect\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst validConfig = `\n[[buildpacks]]\n  id = \"some.buildpack\"\n\n[[order]]\n\t[[order.group]]\n\t\tid = \"some.buildpack\"\n\n`\n\nconst validConfigWithTargets = `\n[[buildpacks]]\nid = \"some.buildpack\"\n\n[[order]]\n[[order.group]]\nid = \"some.buildpack\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n`\n\nconst validConfigWithExtensions = `\n[[buildpacks]]\n  id = \"some.buildpack\"\n\n[[extensions]]\n  id = \"some.extension\"\n\n[[order]]\n\t[[order.group]]\n\t\tid = \"some.buildpack\"\n\n[[order-extensions]]\n\t[[order-extensions.group]]\n\t\tid = \"some.extension\"\n\n`\n\nvar BuildConfigEnvSuffixNone = builder.BuildConfigEnv{\n\tName:  \"suffixNone\",\n\tValue: \"suffixNoneValue\",\n}\n\nvar BuildConfigEnvSuffixNoneWithEmptySuffix = builder.BuildConfigEnv{\n\tName:   \"suffixNoneWithEmptySuffix\",\n\tValue:  \"suffixNoneWithEmptySuffixValue\",\n\tSuffix: \"\",\n}\n\nvar BuildConfigEnvSuffixDefault = builder.BuildConfigEnv{\n\tName:   \"suffixDefault\",\n\tValue:  \"suffixDefaultValue\",\n\tSuffix: \"default\",\n}\n\nvar BuildConfigEnvSuffixOverride = builder.BuildConfigEnv{\n\tName:   \"suffixOverride\",\n\tValue:  \"suffixOverrideValue\",\n\tSuffix: \"override\",\n}\n\nvar BuildConfigEnvSuffixAppend 
= builder.BuildConfigEnv{\n\tName:   \"suffixAppend\",\n\tValue:  \"suffixAppendValue\",\n\tSuffix: \"append\",\n\tDelim:  \":\",\n}\n\nvar BuildConfigEnvSuffixPrepend = builder.BuildConfigEnv{\n\tName:   \"suffixPrepend\",\n\tValue:  \"suffixPrependValue\",\n\tSuffix: \"prepend\",\n\tDelim:  \":\",\n}\n\nvar BuildConfigEnvDelimWithoutSuffix = builder.BuildConfigEnv{\n\tName:  \"delimWithoutSuffix\",\n\tDelim: \":\",\n}\n\nvar BuildConfigEnvSuffixUnknown = builder.BuildConfigEnv{\n\tName:   \"suffixUnknown\",\n\tValue:  \"suffixUnknownValue\",\n\tSuffix: \"unknown\",\n}\n\nvar BuildConfigEnvSuffixMultiple = []builder.BuildConfigEnv{\n\t{\n\t\tName:   \"MY_VAR\",\n\t\tValue:  \"suffixAppendValueValue\",\n\t\tSuffix: \"append\",\n\t\tDelim:  \";\",\n\t},\n\t{\n\t\tName:   \"MY_VAR\",\n\t\tValue:  \"suffixDefaultValue\",\n\t\tSuffix: \"default\",\n\t\tDelim:  \"%\",\n\t},\n\t{\n\t\tName:   \"MY_VAR\",\n\t\tValue:  \"suffixPrependValue\",\n\t\tSuffix: \"prepend\",\n\t\tDelim:  \":\",\n\t},\n}\n\nvar BuildConfigEnvEmptyValue = builder.BuildConfigEnv{\n\tName:  \"warning\",\n\tValue: \"\",\n}\n\nvar BuildConfigEnvEmptyName = builder.BuildConfigEnv{\n\tName:   \"\",\n\tValue:  \"suffixUnknownValue\",\n\tSuffix: \"default\",\n}\n\nvar BuildConfigEnvSuffixPrependWithoutDelim = builder.BuildConfigEnv{\n\tName:   \"suffixPrepend\",\n\tValue:  \"suffixPrependValue\",\n\tSuffix: \"prepend\",\n}\n\nvar BuildConfigEnvDelimWithoutSuffixAppendOrPrepend = builder.BuildConfigEnv{\n\tName:  \"delimWithoutActionAppendOrPrepend\",\n\tValue: \"some-value\",\n\tDelim: \":\",\n}\n\nvar BuildConfigEnvDelimWithSameSuffixAndName = []builder.BuildConfigEnv{\n\t{\n\t\tName:   \"MY_VAR\",\n\t\tValue:  \"some-value\",\n\t\tSuffix: \"\",\n\t},\n\t{\n\t\tName:  \"MY_VAR\",\n\t\tValue: \"some-value\",\n\t},\n}\n\nfunc TestCreateCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"CreateCommand\", testCreateCommand, spec.Parallel(), 
spec.Report(report.Terminal{}))\n}\n\nfunc testCreateCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand           *cobra.Command\n\t\tlogger            logging.Logger\n\t\toutBuf            bytes.Buffer\n\t\tmockController    *gomock.Controller\n\t\tmockClient        *testmocks.MockPackClient\n\t\ttmpDir            string\n\t\tbuilderConfigPath string\n\t\tcfg               config.Config\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"create-builder-test\")\n\t\th.AssertNil(t, err)\n\t\tbuilderConfigPath = filepath.Join(tmpDir, \"builder.toml\")\n\t\tcfg = config.Config{}\n\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tcommand = commands.BuilderCreate(logger, cfg, mockClient)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"#Create\", func() {\n\t\twhen(\"both --publish and pull-policy=never flags are specified\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", \"some-config-path\",\n\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t\t\"never\",\n\t\t\t\t})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"--publish and --pull-policy never cannot be used together. 
The --publish flag requires the use of remote images.\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy\", func() {\n\t\t\tit(\"returns error for unknown policy\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\"--pull-policy\", \"unknown-policy\",\n\t\t\t\t})\n\t\t\t\th.AssertError(t, command.Execute(), \"parsing pull policy\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy is not specified\", func() {\n\t\t\twhen(\"configured pull policy is invalid\", func() {\n\t\t\t\tit(\"errors when config set with unknown policy\", func() {\n\t\t\t\t\tcfg = config.Config{PullPolicy: \"unknown-policy\"}\n\t\t\t\t\tcommand = commands.BuilderCreate(logger, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t})\n\t\t\t\t\th.AssertError(t, command.Execute(), \"parsing pull policy\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--buildpack-registry flag is specified but experimental isn't set in the config\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", \"some-config-path\",\n\t\t\t\t\t\"--buildpack-registry\", \"some-registry\",\n\t\t\t\t})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"Support for buildpack registries is currently experimental.\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"warnings encountered in builder.toml\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(`\n[[buildpacks]]\n  id = \"some.buildpack\"\n`), 0666))\n\t\t\t})\n\n\t\t\tit(\"logs the warnings\", func() {\n\t\t\t\tmockClient.EXPECT().CreateBuilder(gomock.Any(), gomock.Any()).Return(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, 
command.Execute())\n\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Warning: builder configuration: empty 'order' definition\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"uses --builder-config\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(validConfig), 0666))\n\t\t\t})\n\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--builder-config\", builderConfigPath,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, command.Execute(), \"unknown flag: --builder-config\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#ParseBuildpackConfigEnv\", func() {\n\t\t\tit(\"should create envMap as expected when suffix is omitted\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvSuffixNone}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvSuffixNone.Name: BuildConfigEnvSuffixNone.Value,\n\t\t\t\t})\n\t\t\t\th.AssertEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"should create envMap as expected when suffix is empty string\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvSuffixNoneWithEmptySuffix}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvSuffixNoneWithEmptySuffix.Name: BuildConfigEnvSuffixNoneWithEmptySuffix.Value,\n\t\t\t\t})\n\t\t\t\th.AssertEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"should create envMap as expected when suffix is `default`\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvSuffixDefault}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvSuffixDefault.Name + \".\" + string(BuildConfigEnvSuffixDefault.Suffix): BuildConfigEnvSuffixDefault.Value,\n\t\t\t\t})\n\t\t\t\th.AssertEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, 
err)\n\t\t\t})\n\t\t\tit(\"should create envMap as expected when suffix is `override`\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvSuffixOverride}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvSuffixOverride.Name + \".\" + string(BuildConfigEnvSuffixOverride.Suffix): BuildConfigEnvSuffixOverride.Value,\n\t\t\t\t})\n\t\t\t\th.AssertEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"should create envMap as expected when suffix is `append`\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvSuffixAppend}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvSuffixAppend.Name + \".\" + string(BuildConfigEnvSuffixAppend.Suffix): BuildConfigEnvSuffixAppend.Value,\n\t\t\t\t\tBuildConfigEnvSuffixAppend.Name + \".delim\":                                        BuildConfigEnvSuffixAppend.Delim,\n\t\t\t\t})\n\t\t\t\th.AssertEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"should create envMap as expected when suffix is `prepend`\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvSuffixPrepend}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvSuffixPrepend.Name + \".\" + string(BuildConfigEnvSuffixPrepend.Suffix): BuildConfigEnvSuffixPrepend.Value,\n\t\t\t\t\tBuildConfigEnvSuffixPrepend.Name + \".delim\":                                         BuildConfigEnvSuffixPrepend.Delim,\n\t\t\t\t})\n\t\t\t\th.AssertEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"should create envMap as expected when delim is specified\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvDelimWithoutSuffix}, \"\")\n\t\t\t\th.AssertEq(t, envMap, 
map[string]string{\n\t\t\t\t\tBuildConfigEnvDelimWithoutSuffix.Name:            BuildConfigEnvDelimWithoutSuffix.Value,\n\t\t\t\t\tBuildConfigEnvDelimWithoutSuffix.Name + \".delim\": BuildConfigEnvDelimWithoutSuffix.Delim,\n\t\t\t\t})\n\t\t\t\th.AssertNotEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"should create envMap with a warning when `value` is empty\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvEmptyValue}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvEmptyValue.Name: BuildConfigEnvEmptyValue.Value,\n\t\t\t\t})\n\t\t\t\th.AssertNotEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"should return an error when `name` is empty\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvEmptyName}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string(nil))\n\t\t\t\th.AssertEq(t, len(warnings), 0)\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t\tit(\"should return warnings when `append` or `prepend` is used without `delim`\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvSuffixPrependWithoutDelim}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvSuffixPrependWithoutDelim.Name + \".\" + string(BuildConfigEnvSuffixPrependWithoutDelim.Suffix): BuildConfigEnvSuffixPrependWithoutDelim.Value,\n\t\t\t\t})\n\t\t\t\th.AssertNotEq(t, len(warnings), 0)\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t\tit(\"should return an error when unknown `suffix` is used\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv([]builder.BuildConfigEnv{BuildConfigEnvSuffixUnknown}, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{})\n\t\t\t\th.AssertEq(t, len(warnings), 0)\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t\tit(\"should override with the last 
specified delim when `[[build.env]]` has multiple delims with same `name` with a `append` or `prepend` suffix\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv(BuildConfigEnvSuffixMultiple, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvSuffixMultiple[0].Name + \".\" + string(BuildConfigEnvSuffixMultiple[0].Suffix): BuildConfigEnvSuffixMultiple[0].Value,\n\t\t\t\t\tBuildConfigEnvSuffixMultiple[1].Name + \".\" + string(BuildConfigEnvSuffixMultiple[1].Suffix): BuildConfigEnvSuffixMultiple[1].Value,\n\t\t\t\t\tBuildConfigEnvSuffixMultiple[2].Name + \".\" + string(BuildConfigEnvSuffixMultiple[2].Suffix): BuildConfigEnvSuffixMultiple[2].Value,\n\t\t\t\t\tBuildConfigEnvSuffixMultiple[2].Name + \".delim\":                                             BuildConfigEnvSuffixMultiple[2].Delim,\n\t\t\t\t})\n\t\t\t\th.AssertNotEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"should override `value` with the last read value when a `[[build.env]]` has same `name` with same `suffix`\", func() {\n\t\t\t\tenvMap, warnings, err := builder.ParseBuildConfigEnv(BuildConfigEnvDelimWithSameSuffixAndName, \"\")\n\t\t\t\th.AssertEq(t, envMap, map[string]string{\n\t\t\t\t\tBuildConfigEnvDelimWithSameSuffixAndName[1].Name: BuildConfigEnvDelimWithSameSuffixAndName[1].Value,\n\t\t\t\t})\n\t\t\t\th.AssertNotEq(t, len(warnings), 0)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no config provided\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t})\n\t\t\t\th.AssertError(t, command.Execute(), \"Please provide a builder config path\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder config has extensions but experimental isn't set in the config\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(validConfigWithExtensions), 0666))\n\t\t\t})\n\n\t\t\tit(\"errors\", func() 
{\n\t\t\t\tmockClient.EXPECT().CreateBuilder(gomock.Any(), gomock.Any()).Return(errors.New(\"builder config contains image extensions, but the lifecycle Platform API version (0.12) is older than 0.13; support for image extensions with Platform API < 0.13 is currently experimental\"))\n\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, command.Execute(), \"support for image extensions with Platform API < 0.13 is currently experimental\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--flatten\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(validConfig), 0666))\n\t\t\t})\n\n\t\t\twhen(\"requested buildpack doesn't have format <buildpack>@<version>\", func() {\n\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\t\"--flatten\", \"some-buildpack\",\n\t\t\t\t\t})\n\t\t\t\t\th.AssertError(t, command.Execute(), fmt.Sprintf(\"invalid format %s; please use '<buildpack-id>@<buildpack-version>' to add buildpacks to be flattened\", \"some-buildpack\"))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--label\", func() {\n\t\t\twhen(\"can not be parsed\", func() {\n\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\t\"--label\", \"name+value\",\n\t\t\t\t\t})\n\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertError(t, err, \"invalid argument \\\"name+value\\\" for \\\"-l, --label\\\" flag: name+value must be formatted as key=value\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"multi-platform builder is expected to be created\", func() {\n\t\t\twhen(\"builder config has no targets defined\", func() 
{\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(validConfig), 0666))\n\t\t\t\t})\n\t\t\t\twhen(\"daemon\", func() {\n\t\t\t\t\tit(\"errors when exporting to daemon\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\t\"--target\", \"windows/amd64\",\n\t\t\t\t\t\t})\n\t\t\t\t\t\terr := command.Execute()\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t\th.AssertError(t, err, \"when exporting to daemon only one target is allowed\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"--publish\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmockClient.EXPECT().CreateBuilder(gomock.Any(), EqCreateBuilderOptionsTargets([]dist.Target{\n\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t})).Return(nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates a builder with the given targets\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\t\"--target\", \"windows/amd64\",\n\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"builder config has targets defined\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(validConfigWithTargets), 0666))\n\t\t\t\t})\n\n\t\t\t\twhen(\"--publish\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmockClient.EXPECT().CreateBuilder(gomock.Any(), EqCreateBuilderOptionsTargets([]dist.Target{\n\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t})).Return(nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates a builder with the given targets\", func() 
{\n\t\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"invalid target flag is used\", func() {\n\t\t\t\t\tit(\"errors with a message when invalid target flag is used\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\t\t\"--target\", \"something/wrong\",\n\t\t\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"--targets\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmockClient.EXPECT().CreateBuilder(gomock.Any(), EqCreateBuilderOptionsTargets([]dist.Target{\n\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t})).Return(nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates a builder with the given targets\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\t\"--target\", \"linux/amd64\",\n\t\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc EqCreateBuilderOptionsTargets(targets []dist.Target) gomock.Matcher {\n\treturn createbuilderOptionsMatcher{\n\t\tdescription: fmt.Sprintf(\"Target=%v\", targets),\n\t\tequals: func(o client.CreateBuilderOptions) bool {\n\t\t\tif len(o.Targets) != len(targets) {\n\t\t\t\treturn false\n\t\t\t}\n\t\t\treturn reflect.DeepEqual(o.Targets, targets)\n\t\t},\n\t}\n}\n\ntype createbuilderOptionsMatcher struct {\n\tequals      func(options client.CreateBuilderOptions) bool\n\tdescription string\n}\n\nfunc (m createbuilderOptionsMatcher) Matches(x interface{}) bool {\n\tif b, ok := x.(client.CreateBuilderOptions); ok {\n\t\treturn m.equals(b)\n\t}\n\treturn false\n}\n\nfunc (m 
createbuilderOptionsMatcher) String() string {\n\treturn \"is a CreateBuilderOptions with \" + m.description\n}\n"
  },
  {
    "path": "internal/commands/builder_inspect.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\n\tbldr \"github.com/buildpacks/pack/internal/builder\"\n)\n\ntype BuilderInspector interface {\n\tInspectBuilder(name string, daemon bool, modifiers ...client.BuilderInspectionModifier) (*client.BuilderInfo, error)\n}\n\ntype BuilderInspectFlags struct {\n\tDepth        int\n\tOutputFormat string\n}\n\nfunc BuilderInspect(logger logging.Logger,\n\tcfg config.Config,\n\tinspector BuilderInspector,\n\twriterFactory writer.BuilderWriterFactory,\n) *cobra.Command {\n\tvar flags BuilderInspectFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"inspect <builder-image-name>\",\n\t\tArgs:    cobra.MaximumNArgs(1),\n\t\tAliases: []string{\"inspect-builder\"},\n\t\tShort:   \"Show information about a builder\",\n\t\tExample: \"pack builder inspect cnbs/sample-builder:bionic\",\n\t\tLong:    \"Show information about the builder provided. 
If no argument is provided, it will inspect the default builder, if one has been set.\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\timageName := cfg.DefaultBuilder\n\t\t\tif len(args) >= 1 {\n\t\t\t\timageName = args[0]\n\t\t\t}\n\n\t\t\tif imageName == \"\" {\n\t\t\t\tsuggestSettingBuilder(logger, inspector)\n\t\t\t\treturn client.NewSoftError()\n\t\t\t}\n\n\t\t\treturn inspectBuilder(logger, imageName, flags, cfg, inspector, writerFactory)\n\t\t}),\n\t}\n\n\tcmd.Flags().IntVarP(&flags.Depth, \"depth\", \"d\", builder.OrderDetectionMaxDepth, \"Max depth to display for Detection Order.\\nOmission of this flag or values < 0 will display the entire tree.\")\n\tcmd.Flags().StringVarP(&flags.OutputFormat, \"output\", \"o\", \"human-readable\", \"Output format to display builder detail (json, yaml, toml, human-readable).\\nOmission of this flag will display as human-readable.\")\n\tAddHelpFlag(cmd, \"inspect\")\n\treturn cmd\n}\n\nfunc inspectBuilder(\n\tlogger logging.Logger,\n\timageName string,\n\tflags BuilderInspectFlags,\n\tcfg config.Config,\n\tinspector BuilderInspector,\n\twriterFactory writer.BuilderWriterFactory,\n) error {\n\tisTrusted, err := bldr.IsTrustedBuilder(cfg, imageName)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tbuilderInfo := writer.SharedBuilderInfo{\n\t\tName:      imageName,\n\t\tIsDefault: imageName == cfg.DefaultBuilder,\n\t\tTrusted:   isTrusted,\n\t}\n\n\tlocalInfo, localErr := inspector.InspectBuilder(imageName, true, client.WithDetectionOrderDepth(flags.Depth))\n\tremoteInfo, remoteErr := inspector.InspectBuilder(imageName, false, client.WithDetectionOrderDepth(flags.Depth))\n\n\twriter, err := writerFactory.Writer(flags.OutputFormat)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn writer.Print(logger, cfg.RunImages, localInfo, remoteInfo, localErr, remoteErr, builderInfo)\n}\n"
  },
  {
    "path": "internal/commands/builder_inspect_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"regexp\"\n\t\"testing\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/fakes\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nvar (\n\tminimalLifecycleDescriptor = builder.LifecycleDescriptor{\n\t\tInfo: builder.LifecycleInfo{Version: builder.VersionMustParse(\"3.4\")},\n\t\tAPI: builder.LifecycleAPI{\n\t\t\tBuildpackVersion: api.MustParse(\"1.2\"),\n\t\t\tPlatformVersion:  api.MustParse(\"2.3\"),\n\t\t},\n\t}\n\n\texpectedLocalRunImages = []config.RunImage{\n\t\t{Image: \"some/run-image\", Mirrors: []string{\"first/local\", \"second/local\"}},\n\t}\n\texpectedLocalInfo = &client.BuilderInfo{\n\t\tDescription: \"test-local-builder\",\n\t\tStack:       \"local-stack\",\n\t\tRunImages:   []pubbldr.RunImageConfig{{Image: \"local/image\"}},\n\t\tLifecycle:   minimalLifecycleDescriptor,\n\t}\n\texpectedRemoteInfo = &client.BuilderInfo{\n\t\tDescription: \"test-remote-builder\",\n\t\tStack:       \"remote-stack\",\n\t\tRunImages:   []pubbldr.RunImageConfig{{Image: \"remote/image\"}},\n\t\tLifecycle:   minimalLifecycleDescriptor,\n\t}\n\texpectedLocalDisplay  = \"Sample output for local builder\"\n\texpectedRemoteDisplay = \"Sample output for remote builder\"\n\texpectedBuilderInfo   = writer.SharedBuilderInfo{\n\t\tName:      \"default/builder\",\n\t\tTrusted:   false,\n\t\tIsDefault: true,\n\t}\n)\n\nfunc TestBuilderInspectCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer 
color.Disable(false)\n\tspec.Run(t, \"BuilderInspectCommand\", testBuilderInspectCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuilderInspectCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger logging.Logger\n\t\toutBuf bytes.Buffer\n\t\tcfg    config.Config\n\t)\n\n\tit.Before(func() {\n\t\tcfg = config.Config{\n\t\t\tDefaultBuilder: \"default/builder\",\n\t\t\tRunImages:      expectedLocalRunImages,\n\t\t}\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t})\n\n\twhen(\"BuilderInspect\", func() {\n\t\tvar (\n\t\t\tassert = h.NewAssertionManager(t)\n\t\t)\n\n\t\tit(\"passes output of local and remote builders to correct writer\", func() {\n\t\t\tbuilderInspector := newDefaultBuilderInspector()\n\t\t\tbuilderWriter := newDefaultBuilderWriter()\n\t\t\tbuilderWriterFactory := newWriterFactory(returnsForWriter(builderWriter))\n\n\t\t\tcommand := commands.BuilderInspect(logger, cfg, builderInspector, builderWriterFactory)\n\t\t\tcommand.SetArgs([]string{})\n\t\t\terr := command.Execute()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(builderWriter.ReceivedInfoForLocal, expectedLocalInfo)\n\t\t\tassert.Equal(builderWriter.ReceivedInfoForRemote, expectedRemoteInfo)\n\t\t\tassert.Equal(builderWriter.ReceivedBuilderInfo, expectedBuilderInfo)\n\t\t\tassert.Equal(builderWriter.ReceivedLocalRunImages, expectedLocalRunImages)\n\t\t\tassert.Equal(builderWriterFactory.ReceivedForKind, \"human-readable\")\n\t\t\tassert.Equal(builderInspector.ReceivedForLocalName, \"default/builder\")\n\t\t\tassert.Equal(builderInspector.ReceivedForRemoteName, \"default/builder\")\n\t\t\tassert.ContainsF(outBuf.String(), \"LOCAL:\\n%s\", expectedLocalDisplay)\n\t\t\tassert.ContainsF(outBuf.String(), \"REMOTE:\\n%s\", expectedRemoteDisplay)\n\t\t})\n\n\t\twhen(\"image name is provided as first arg\", func() {\n\t\t\tit(\"passes that image name to the inspector\", func() {\n\t\t\t\tbuilderInspector := newDefaultBuilderInspector()\n\t\t\t\twriter 
:= newDefaultBuilderWriter()\n\t\t\t\tcommand := commands.BuilderInspect(logger, cfg, builderInspector, newWriterFactory(returnsForWriter(writer)))\n\t\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(builderInspector.ReceivedForLocalName, \"some/image\")\n\t\t\t\tassert.Equal(builderInspector.ReceivedForRemoteName, \"some/image\")\n\t\t\t\tassert.Equal(writer.ReceivedBuilderInfo.IsDefault, false)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"depth flag is provided\", func() {\n\t\t\tit(\"passes a modifier to the builder inspector\", func() {\n\t\t\t\tbuilderInspector := newDefaultBuilderInspector()\n\t\t\t\tcommand := commands.BuilderInspect(logger, cfg, builderInspector, newDefaultWriterFactory())\n\t\t\t\tcommand.SetArgs([]string{\"--depth\", \"5\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(builderInspector.CalculatedConfigForLocal.OrderDetectionDepth, 5)\n\t\t\t\tassert.Equal(builderInspector.CalculatedConfigForRemote.OrderDetectionDepth, 5)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"output type is set to json\", func() {\n\t\t\tit(\"passes json to the writer factory\", func() {\n\t\t\t\twriterFactory := newDefaultWriterFactory()\n\t\t\t\tcommand := commands.BuilderInspect(logger, cfg, newDefaultBuilderInspector(), writerFactory)\n\t\t\t\tcommand.SetArgs([]string{\"--output\", \"json\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(writerFactory.ReceivedForKind, \"json\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"output type is set to toml using the shorthand flag\", func() {\n\t\t\tit(\"passes toml to the writer factory\", func() {\n\t\t\t\twriterFactory := newDefaultWriterFactory()\n\t\t\t\tcommand := commands.BuilderInspect(logger, cfg, newDefaultBuilderInspector(), writerFactory)\n\t\t\t\tcommand.SetArgs([]string{\"-o\", \"toml\"})\n\n\t\t\t\terr := 
command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(writerFactory.ReceivedForKind, \"toml\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder inspector returns an error for local builder\", func() {\n\t\t\tit(\"passes that error to the writer to handle appropriately\", func() {\n\t\t\t\tbaseError := errors.New(\"couldn't inspect local\")\n\n\t\t\t\tbuilderInspector := newBuilderInspector(errorsForLocal(baseError))\n\t\t\t\tbuilderWriter := newDefaultBuilderWriter()\n\t\t\t\tbuilderWriterFactory := newWriterFactory(returnsForWriter(builderWriter))\n\n\t\t\t\tcommand := commands.BuilderInspect(logger, cfg, builderInspector, builderWriterFactory)\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ErrorWithMessage(builderWriter.ReceivedErrorForLocal, \"couldn't inspect local\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder inspector returns an error for remote builder\", func() {\n\t\t\tit(\"passes that error to the writer to handle appropriately\", func() {\n\t\t\t\tbaseError := errors.New(\"couldn't inspect remote\")\n\n\t\t\t\tbuilderInspector := newBuilderInspector(errorsForRemote(baseError))\n\t\t\t\tbuilderWriter := newDefaultBuilderWriter()\n\t\t\t\tbuilderWriterFactory := newWriterFactory(returnsForWriter(builderWriter))\n\n\t\t\t\tcommand := commands.BuilderInspect(logger, cfg, builderInspector, builderWriterFactory)\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ErrorWithMessage(builderWriter.ReceivedErrorForRemote, \"couldn't inspect remote\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"image is trusted\", func() {\n\t\t\tit(\"passes builder info with trusted true to the writer's `Print` method\", func() {\n\t\t\t\tcfg.TrustedBuilders = []config.TrustedBuilder{\n\t\t\t\t\t{Name: \"trusted/builder\"},\n\t\t\t\t}\n\t\t\t\twriter := newDefaultBuilderWriter()\n\n\t\t\t\tcommand := 
commands.BuilderInspect(\n\t\t\t\t\tlogger,\n\t\t\t\t\tcfg,\n\t\t\t\t\tnewDefaultBuilderInspector(),\n\t\t\t\t\tnewWriterFactory(returnsForWriter(writer)),\n\t\t\t\t)\n\t\t\t\tcommand.SetArgs([]string{\"trusted/builder\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(writer.ReceivedBuilderInfo.Trusted, true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"default builder is configured and is the same as specified by the command\", func() {\n\t\t\tit(\"passes builder info with isDefault true to the writer's `Print` method\", func() {\n\t\t\t\tcfg.DefaultBuilder = \"the/default-builder\"\n\t\t\t\twriter := newDefaultBuilderWriter()\n\n\t\t\t\tcommand := commands.BuilderInspect(\n\t\t\t\t\tlogger,\n\t\t\t\t\tcfg,\n\t\t\t\t\tnewDefaultBuilderInspector(),\n\t\t\t\t\tnewWriterFactory(returnsForWriter(writer)),\n\t\t\t\t)\n\t\t\t\tcommand.SetArgs([]string{\"the/default-builder\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(writer.ReceivedBuilderInfo.IsDefault, true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"default builder is empty and no builder is specified in command args\", func() {\n\t\t\tit(\"suggests builders and returns a soft error\", func() {\n\t\t\t\tcfg.DefaultBuilder = \"\"\n\n\t\t\t\tcommand := commands.BuilderInspect(logger, cfg, newDefaultBuilderInspector(), newDefaultWriterFactory())\n\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Error(err)\n\t\t\t\tif !errors.Is(err, client.SoftError{}) {\n\t\t\t\t\tt.Fatalf(\"expect a client.SoftError, got: %s\", err)\n\t\t\t\t}\n\n\t\t\t\tassert.Contains(outBuf.String(), `Please select a default builder with:\n\n\tpack config default-builder <builder-image>`)\n\n\t\t\t\tassert.Matches(outBuf.String(), regexp.MustCompile(`Paketo Buildpacks:\\s+'paketobuildpacks/builder-jammy-base'`))\n\t\t\t\tassert.Matches(outBuf.String(), regexp.MustCompile(`Paketo 
Buildpacks:\\s+'paketobuildpacks/builder-jammy-full'`))\n\t\t\t\tassert.Matches(outBuf.String(), regexp.MustCompile(`Heroku:\\s+'heroku/builder:24'`))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"print returns an error\", func() {\n\t\t\tit(\"returns that error\", func() {\n\t\t\t\tbaseError := errors.New(\"couldn't write builder\")\n\n\t\t\t\tbuilderWriter := newBuilderWriter(errorsForPrint(baseError))\n\t\t\t\tcommand := commands.BuilderInspect(\n\t\t\t\t\tlogger,\n\t\t\t\t\tcfg,\n\t\t\t\t\tnewDefaultBuilderInspector(),\n\t\t\t\t\tnewWriterFactory(returnsForWriter(builderWriter)),\n\t\t\t\t)\n\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.ErrorWithMessage(err, \"couldn't write builder\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"writer factory returns an error\", func() {\n\t\t\tit(\"returns that error\", func() {\n\t\t\t\tbaseError := errors.New(\"invalid output format\")\n\n\t\t\t\twriterFactory := newWriterFactory(errorsForWriter(baseError))\n\t\t\t\tcommand := commands.BuilderInspect(logger, cfg, newDefaultBuilderInspector(), writerFactory)\n\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.ErrorWithMessage(err, \"invalid output format\")\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc newDefaultBuilderInspector() *fakes.FakeBuilderInspector {\n\treturn &fakes.FakeBuilderInspector{\n\t\tInfoForLocal:  expectedLocalInfo,\n\t\tInfoForRemote: expectedRemoteInfo,\n\t}\n}\n\nfunc newDefaultBuilderWriter() *fakes.FakeBuilderWriter {\n\treturn &fakes.FakeBuilderWriter{\n\t\tPrintForLocal:  expectedLocalDisplay,\n\t\tPrintForRemote: expectedRemoteDisplay,\n\t}\n}\n\nfunc newDefaultWriterFactory() *fakes.FakeBuilderWriterFactory {\n\treturn &fakes.FakeBuilderWriterFactory{\n\t\tReturnForWriter: newDefaultBuilderWriter(),\n\t}\n}\n\ntype BuilderWriterModifier func(w *fakes.FakeBuilderWriter)\n\nfunc errorsForPrint(err error) BuilderWriterModifier {\n\treturn func(w *fakes.FakeBuilderWriter) {\n\t\tw.ErrorForPrint = 
err\n\t}\n}\n\nfunc newBuilderWriter(modifiers ...BuilderWriterModifier) *fakes.FakeBuilderWriter {\n\tw := newDefaultBuilderWriter()\n\n\tfor _, mod := range modifiers {\n\t\tmod(w)\n\t}\n\n\treturn w\n}\n\ntype WriterFactoryModifier func(f *fakes.FakeBuilderWriterFactory)\n\nfunc returnsForWriter(writer writer.BuilderWriter) WriterFactoryModifier {\n\treturn func(f *fakes.FakeBuilderWriterFactory) {\n\t\tf.ReturnForWriter = writer\n\t}\n}\n\nfunc errorsForWriter(err error) WriterFactoryModifier {\n\treturn func(f *fakes.FakeBuilderWriterFactory) {\n\t\tf.ErrorForWriter = err\n\t}\n}\n\nfunc newWriterFactory(modifiers ...WriterFactoryModifier) *fakes.FakeBuilderWriterFactory {\n\tf := newDefaultWriterFactory()\n\n\tfor _, mod := range modifiers {\n\t\tmod(f)\n\t}\n\n\treturn f\n}\n\ntype BuilderInspectorModifier func(i *fakes.FakeBuilderInspector)\n\nfunc errorsForLocal(err error) BuilderInspectorModifier {\n\treturn func(i *fakes.FakeBuilderInspector) {\n\t\ti.ErrorForLocal = err\n\t}\n}\n\nfunc errorsForRemote(err error) BuilderInspectorModifier {\n\treturn func(i *fakes.FakeBuilderInspector) {\n\t\ti.ErrorForRemote = err\n\t}\n}\n\nfunc newBuilderInspector(modifiers ...BuilderInspectorModifier) *fakes.FakeBuilderInspector {\n\ti := newDefaultBuilderInspector()\n\n\tfor _, mod := range modifiers {\n\t\tmod(i)\n\t}\n\n\treturn i\n}\n"
  },
  {
    "path": "internal/commands/builder_suggest.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc BuilderSuggest(logger logging.Logger, inspector BuilderInspector) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"suggest\",\n\t\tArgs:    cobra.NoArgs,\n\t\tShort:   \"List the recommended builders\",\n\t\tExample: \"pack builder suggest\",\n\t\tRun: func(cmd *cobra.Command, s []string) {\n\t\t\tsuggestBuilders(logger, inspector)\n\t\t},\n\t}\n\n\tAddHelpFlag(cmd, \"suggest\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/builder_suggest_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tbldr \"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestSuggestCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"BuilderSuggestCommand\", testSuggestCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testSuggestCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t})\n\n\twhen(\"#WriteSuggestedBuilder\", func() {\n\t\twhen(\"description metadata exists\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuilder(\"gcr.io/some/builder:latest\", false).Return(&client.BuilderInfo{\n\t\t\t\t\tDescription: \"Remote description\",\n\t\t\t\t}, nil)\n\t\t\t})\n\n\t\t\tit(\"displays descriptions from metadata\", func() {\n\t\t\t\tcommands.WriteSuggestedBuilder(logger, mockClient, []bldr.KnownBuilder{{\n\t\t\t\t\tVendor:             \"Builder\",\n\t\t\t\t\tImage:              \"gcr.io/some/builder:latest\",\n\t\t\t\t\tDefaultDescription: \"Default description\",\n\t\t\t\t}})\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Suggested builders:\")\n\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `Builder:\\s+'gcr.io/some/builder:latest'\\s+Remote 
description`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"description metadata does not exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuilder(gomock.Any(), false).Return(&client.BuilderInfo{\n\t\t\t\t\tDescription: \"\",\n\t\t\t\t}, nil).AnyTimes()\n\t\t\t})\n\n\t\t\tit(\"displays default descriptions\", func() {\n\t\t\t\tcommands.WriteSuggestedBuilder(logger, mockClient, []bldr.KnownBuilder{{\n\t\t\t\t\tVendor:             \"Builder\",\n\t\t\t\t\tImage:              \"gcr.io/some/builder:latest\",\n\t\t\t\t\tDefaultDescription: \"Default description\",\n\t\t\t\t}})\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Suggested builders:\")\n\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `Builder:\\s+'gcr.io/some/builder:latest'\\s+Default description`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"error inspecting images\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuilder(gomock.Any(), false).Return(nil, errors.New(\"some error\")).AnyTimes()\n\t\t\t})\n\n\t\t\tit(\"displays default descriptions\", func() {\n\t\t\t\tcommands.WriteSuggestedBuilder(logger, mockClient, []bldr.KnownBuilder{{\n\t\t\t\t\tVendor:             \"Builder\",\n\t\t\t\t\tImage:              \"gcr.io/some/builder:latest\",\n\t\t\t\t\tDefaultDescription: \"Default description\",\n\t\t\t\t}})\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Suggested builders:\")\n\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `Builder:\\s+'gcr.io/some/builder:latest'\\s+Default description`)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/builder_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuilderCommand(t *testing.T) {\n\tspec.Run(t, \"BuilderCommand\", testBuilderCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuilderCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd    *cobra.Command\n\t\tlogger logging.Logger\n\t\toutBuf bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController := gomock.NewController(t)\n\t\tmockClient := testmocks.NewMockPackClient(mockController)\n\t\tcmd = commands.NewBuilderCommand(logger, config.Config{}, mockClient)\n\t\tcmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\twhen(\"builder\", func() {\n\t\tit(\"prints help text\", func() {\n\t\t\tcmd.SetArgs([]string{})\n\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, \"Interact with builders\")\n\t\t\th.AssertContains(t, output, \"Usage:\")\n\t\t\tfor _, command := range []string{\"create\", \"suggest\", \"inspect\"} {\n\t\t\t\th.AssertContains(t, output, command)\n\t\t\t\th.AssertNotContains(t, output, command+\"-builder\")\n\t\t\t}\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/buildpack.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc NewBuildpackCommand(logger logging.Logger, cfg config.Config, client PackClient, packageConfigReader PackageConfigReader) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"buildpack\",\n\t\tAliases: []string{\"buildpacks\"},\n\t\tShort:   \"Interact with buildpacks\",\n\t\tRunE:    nil,\n\t}\n\n\tcmd.AddCommand(BuildpackInspect(logger, cfg, client))\n\tcmd.AddCommand(BuildpackPackage(logger, cfg, client, packageConfigReader))\n\tcmd.AddCommand(BuildpackNew(logger, client))\n\tcmd.AddCommand(BuildpackPull(logger, cfg, client))\n\tcmd.AddCommand(BuildpackRegister(logger, cfg, client))\n\tcmd.AddCommand(BuildpackYank(logger, cfg, client))\n\n\tAddHelpFlag(cmd, \"buildpack\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/buildpack_inspect.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype BuildpackInspectFlags struct {\n\tDepth    int\n\tRegistry string\n\tVerbose  bool\n}\n\nfunc BuildpackInspect(logger logging.Logger, cfg config.Config, client PackClient) *cobra.Command {\n\tvar flags BuildpackInspectFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"inspect <image-name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Show information about a buildpack\",\n\t\tExample: \"pack buildpack inspect cnbs/sample-package:hello-universe\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tbuildpackName := args[0]\n\t\t\tregistry := flags.Registry\n\t\t\tif registry == \"\" {\n\t\t\t\tregistry = cfg.DefaultRegistryName\n\t\t\t}\n\n\t\t\treturn buildpackInspect(logger, buildpackName, registry, flags, cfg, client)\n\t\t}),\n\t}\n\n\tcmd.Flags().IntVarP(&flags.Depth, \"depth\", \"d\", -1, \"Max depth to display for Detection Order.\\nOmission of this flag or values < 0 will display the entire tree.\")\n\tcmd.Flags().StringVarP(&flags.Registry, \"registry\", \"r\", \"\", \"buildpack registry that may be searched\")\n\tcmd.Flags().BoolVarP(&flags.Verbose, \"verbose\", \"v\", false, \"show more output\")\n\tAddHelpFlag(cmd, \"inspect\")\n\treturn cmd\n}\n\nfunc buildpackInspect(logger logging.Logger, buildpackName, registryName string, flags BuildpackInspectFlags, _ config.Config, pack PackClient) error {\n\tlogger.Infof(\"Inspecting buildpack: %s\\n\", style.Symbol(buildpackName))\n\n\tinspectedBuildpacksOutput, err := inspectAllBuildpacks(\n\t\tpack,\n\t\tflags,\n\t\tclient.InspectBuildpackOptions{\n\t\t\tBuildpackName: buildpackName,\n\t\t\tDaemon:        true,\n\t\t\tRegistry:      
registryName,\n\t\t},\n\t\tclient.InspectBuildpackOptions{\n\t\t\tBuildpackName: buildpackName,\n\t\t\tDaemon:        false,\n\t\t\tRegistry:      registryName,\n\t\t})\n\tif err != nil {\n\t\treturn fmt.Errorf(\"error writing buildpack output: %q\", err)\n\t}\n\n\tlogger.Info(inspectedBuildpacksOutput)\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/buildpack_inspect_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst complexOutputSection = `Stacks:\n  ID: io.buildpacks.stacks.first-stack\n    Mixins:\n      (omitted)\n  ID: io.buildpacks.stacks.second-stack\n    Mixins:\n      (omitted)\n\nBuildpacks:\n  ID                                 NAME        VERSION        HOMEPAGE\n  some/first-inner-buildpack         -           1.0.0          first-inner-buildpack-homepage\n  some/second-inner-buildpack        -           2.0.0          second-inner-buildpack-homepage\n  some/third-inner-buildpack         -           3.0.0          third-inner-buildpack-homepage\n  some/top-buildpack                 top         0.0.1          top-buildpack-homepage\n\nDetection Order:\n └ Group #1:\n    └ some/top-buildpack@0.0.1\n       ├ Group #1:\n       │  ├ some/first-inner-buildpack@1.0.0\n       │  │  ├ Group #1:\n       │  │  │  ├ some/first-inner-buildpack@1.0.0    [cyclic]\n       │  │  │  └ some/third-inner-buildpack@3.0.0\n       │  │  └ Group #2:\n       │  │     └ some/third-inner-buildpack@3.0.0\n       │  └ some/second-inner-buildpack@2.0.0\n       └ Group #2:\n          └ some/first-inner-buildpack@1.0.0\n             ├ Group #1:\n             │  ├ some/first-inner-buildpack@1.0.0    [cyclic]\n             │  └ 
some/third-inner-buildpack@3.0.0\n             └ Group #2:\n                └ some/third-inner-buildpack@3.0.0`\n\nconst simpleOutputSection = `Stacks:\n  ID: io.buildpacks.stacks.first-stack\n    Mixins:\n      (omitted)\n  ID: io.buildpacks.stacks.second-stack\n    Mixins:\n      (omitted)\n\nBuildpacks:\n  ID                                NAME        VERSION        HOMEPAGE\n  some/single-buildpack             some        0.0.1          single-buildpack-homepage\n  some/buildpack-no-homepage        -           0.0.2          -\n\nDetection Order:\n └ Group #1:\n    └ some/single-buildpack@0.0.1`\n\nconst inspectOutputTemplate = `Inspecting buildpack: '%s'\n\n%s\n\n%s\n`\n\nconst depthOutputSection = `\nDetection Order:\n └ Group #1:\n    └ some/top-buildpack@0.0.1\n       ├ Group #1:\n       │  ├ some/first-inner-buildpack@1.0.0\n       │  └ some/second-inner-buildpack@2.0.0\n       └ Group #2:\n          └ some/first-inner-buildpack@1.0.0`\n\nconst simpleMixinOutputSection = `\n  ID: io.buildpacks.stacks.first-stack\n    Mixins:\n      mixin1\n      mixin2\n      build:mixin3\n      build:mixin4\n  ID: io.buildpacks.stacks.second-stack\n    Mixins:\n      mixin1\n      mixin2`\n\nfunc TestBuildpackInspectCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"BuildpackInspectCommand\", testBuildpackInspectCommand, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildpackInspectCommand(t *testing.T, when spec.G, it spec.S) {\n\tapiVersion, err := api.NewVersion(\"0.2\")\n\tif err != nil {\n\t\tt.Fail()\n\t}\n\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t\tcomplexInfo    *client.BuildpackInfo\n\t\tsimpleInfo     *client.BuildpackInfo\n\t\tassert         = h.NewAssertionManager(t)\n\t)\n\n\tit.Before(func() 
{\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\tcfg = config.Config{\n\t\t\tDefaultRegistryName: \"default-registry\",\n\t\t}\n\n\t\tcomplexInfo = &client.BuildpackInfo{\n\t\t\tBuildpackMetadata: buildpack.Metadata{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:       \"some/top-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t\tName:     \"top\",\n\t\t\t\t},\n\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/first-inner-buildpack\",\n\t\t\t\t\tVersion:  \"1.0.0\",\n\t\t\t\t\tHomepage: \"first-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/second-inner-buildpack\",\n\t\t\t\t\tVersion:  \"2.0.0\",\n\t\t\t\t\tHomepage: \"second-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/third-inner-buildpack\",\n\t\t\t\t\tVersion:  \"3.0.0\",\n\t\t\t\t\tHomepage: \"third-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/top-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t\tName:     \"top\",\n\t\t\t\t},\n\t\t\t},\n\t\t\tOrder: dist.Order{\n\t\t\t\t{\n\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:       \"some/top-buildpack\",\n\t\t\t\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t\t\t\t\tName:     \"top\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpackLayers: 
dist.ModuleLayers{\n\t\t\t\t\"some/first-inner-buildpack\": {\n\t\t\t\t\t\"1.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/third-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"3.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/third-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"3.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:first-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"first-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"some/second-inner-buildpack\": {\n\t\t\t\t\t\"2.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", 
\"mixin2\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:second-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"second-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"some/third-inner-buildpack\": {\n\t\t\t\t\t\"3.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:third-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"third-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"some/top-buildpack\": {\n\t\t\t\t\t\"0.0.1\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/second-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"2.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:top-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"top-buildpack-homepage\",\n\t\t\t\t\t\tName:        
\"top\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t}\n\n\t\tsimpleInfo = &client.BuildpackInfo{\n\t\t\tBuildpackMetadata: buildpack.Metadata{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:       \"some/single-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tHomepage: \"single-buildpack-homepage\",\n\t\t\t\t\tName:     \"some\",\n\t\t\t\t},\n\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/single-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tName:     \"some\",\n\t\t\t\t\tHomepage: \"single-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:      \"some/buildpack-no-homepage\",\n\t\t\t\t\tVersion: \"0.0.2\",\n\t\t\t\t},\n\t\t\t},\n\t\t\tOrder: dist.Order{\n\t\t\t\t{\n\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:       \"some/single-buildpack\",\n\t\t\t\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\t\t\t\tHomepage: \"single-buildpack-homepage\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpackLayers: dist.ModuleLayers{\n\t\t\t\t\"some/single-buildpack\": {\n\t\t\t\t\t\"0.0.1\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:single-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"single-buildpack-homepage\",\n\t\t\t\t\t\tName:        \"some\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t}\n\n\t\tcommand 
= commands.BuildpackInspect(logger, cfg, mockClient)\n\t})\n\n\twhen(\"BuildpackInspect\", func() {\n\t\twhen(\"inspecting an image\", func() {\n\t\t\twhen(\"both remote and local image are present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tcomplexInfo.Location = buildpack.PackageLocator\n\t\t\t\t\tsimpleInfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(complexInfo, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"test/buildpack\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\tlocalOutputSection := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"test/buildpack\",\n\t\t\t\t\t\t\"LOCAL IMAGE:\",\n\t\t\t\t\t\tcomplexOutputSection)\n\n\t\t\t\t\tremoteOutputSection := fmt.Sprintf(\"%s\\n\\n%s\",\n\t\t\t\t\t\t\"REMOTE IMAGE:\",\n\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), localOutputSection)\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), remoteOutputSection)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"only a local image is present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tcomplexInfo.Location = buildpack.PackageLocator\n\t\t\t\t\tsimpleInfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-local-test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(complexInfo, 
nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-local-test/buildpack\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(nil, errors.Wrap(image.ErrNotFound, \"remote image not found!\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"displays output for local image\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"only-local-test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"only-local-test/buildpack\",\n\t\t\t\t\t\t\"LOCAL IMAGE:\",\n\t\t\t\t\t\tcomplexOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"only a remote image is present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tcomplexInfo.Location = buildpack.PackageLocator\n\t\t\t\t\tsimpleInfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-remote-test/buildpack\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(complexInfo, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-remote-test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(nil, errors.Wrap(image.ErrNotFound, \"local image not found!\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"displays output for remote image\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"only-remote-test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"only-remote-test/buildpack\",\n\t\t\t\t\t\t\"REMOTE IMAGE:\",\n\t\t\t\t\t\tcomplexOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), 
expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspecting a buildpack uri\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsimpleInfo.Location = buildpack.URILocator\n\t\t\t})\n\n\t\t\twhen(\"uri is a local path\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"/path/to/test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"/path/to/test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"/path/to/test/buildpack\",\n\t\t\t\t\t\t\"LOCAL ARCHIVE:\",\n\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\tassert.TrimmedEq(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"uri is an http or https location\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsimpleInfo.Location = buildpack.URILocator\n\t\t\t\t})\n\t\t\t\twhen(\"uri is an https location\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\t\tBuildpackName: \"https://path/to/test/buildpack\",\n\t\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\"https://path/to/test/buildpack\"})\n\t\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\t\"https://path/to/test/buildpack\",\n\t\t\t\t\t\t\t\"REMOTE ARCHIVE:\",\n\t\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\t\tassert.TrimmedEq(outBuf.String(), expectedOutput)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspecting a buildpack on the registry\", func() 
{\n\t\t\tit.Before(func() {\n\t\t\t\tsimpleInfo.Location = buildpack.RegistryLocator\n\t\t\t})\n\n\t\t\twhen(\"using the default registry\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"urn:cnb:registry:test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t})\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"urn:cnb:registry:test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"urn:cnb:registry:test/buildpack\",\n\t\t\t\t\t\t\"REGISTRY IMAGE:\",\n\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\tassert.TrimmedEq(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"using a user provided registry\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"urn:cnb:registry:test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"urn:cnb:registry:test/buildpack\", \"-r\", \"some-registry\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"urn:cnb:registry:test/buildpack\",\n\t\t\t\t\t\t\"REGISTRY IMAGE:\",\n\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\tassert.TrimmedEq(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a depth flag is passed\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tcomplexInfo.Location = buildpack.URILocator\n\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"/other/path/to/test/buildpack\",\n\t\t\t\t\tDaemon:        
true,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(complexInfo, nil)\n\t\t\t})\n\n\t\t\tit(\"displays detection order to specified depth\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"/other/path/to/test/buildpack\", \"-d\", \"2\"})\n\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), depthOutputSection)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"verbose flag is passed\", func() {\n\t\tit.Before(func() {\n\t\t\tsimpleInfo.Location = buildpack.URILocator\n\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\tBuildpackName: \"/another/path/to/test/buildpack\",\n\t\t\t\tDaemon:        true,\n\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t}).Return(simpleInfo, nil)\n\t\t})\n\n\t\tit(\"displays all mixins\", func() {\n\t\t\tcommand.SetArgs([]string{\"/another/path/to/test/buildpack\", \"-v\"})\n\t\t\tassert.Nil(command.Execute())\n\n\t\t\tassert.AssertTrimmedContains(outBuf.String(), simpleMixinOutputSection)\n\t\t})\n\t})\n\n\twhen(\"failure cases\", func() {\n\t\twhen(\"unable to inspect buildpack image\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"failure-case/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.Wrap(image.ErrNotFound, \"unable to inspect local failure-case/buildpack\"))\n\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"failure-case/buildpack\",\n\t\t\t\t\tDaemon:        false,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.Wrap(image.ErrNotFound, \"unable to inspect remote failure-case/buildpack\"))\n\t\t\t})\n\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"failure-case/buildpack\"})\n\t\t\t\terr := 
command.Execute()\n\t\t\t\tassert.Error(err)\n\t\t\t})\n\t\t})\n\t\twhen(\"unable to inspect buildpack archive\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"http://path/to/failure-case/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.New(\"error inspecting local archive\"))\n\t\t\t})\n\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"http://path/to/failure-case/buildpack\"})\n\t\t\t\terr := command.Execute()\n\n\t\t\t\tassert.Error(err)\n\t\t\t\tassert.Contains(err.Error(), \"error inspecting local archive\")\n\t\t\t})\n\t\t})\n\t\twhen(\"unable to inspect both remote and local images\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"image-failure-case/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.Wrap(image.ErrNotFound, \"error inspecting local archive\"))\n\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"image-failure-case/buildpack\",\n\t\t\t\t\tDaemon:        false,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.Wrap(image.ErrNotFound, \"error inspecting remote archive\"))\n\t\t\t})\n\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"image-failure-case/buildpack\"})\n\t\t\t\terr := command.Execute()\n\n\t\t\t\tassert.Error(err)\n\t\t\t\tassert.Contains(err.Error(), \"error writing buildpack output: \\\"error inspecting local archive: not found, error inspecting remote archive: not found\\\"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"unable to inspect buildpack on registry\", func() {\n\t\t\tit.Before(func() 
{\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"urn:cnb:registry:registry-failure/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.New(\"error inspecting registry image\"))\n\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"urn:cnb:registry:registry-failure/buildpack\",\n\t\t\t\t\tDaemon:        false,\n\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.New(\"error inspecting registry image\"))\n\t\t\t})\n\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"urn:cnb:registry:registry-failure/buildpack\", \"-r\", \"some-registry\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Error(err)\n\t\t\t\tassert.Contains(err.Error(), \"error inspecting registry image\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/buildpack_new.go",
    "content": "package commands\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/internal/target\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// BuildpackNewFlags defines flags provided to the BuildpackNew command\ntype BuildpackNewFlags struct {\n\tAPI  string\n\tPath string\n\t// Deprecated: Stacks are deprecated\n\tStacks  []string\n\tTargets []string\n\tVersion string\n}\n\n// BuildpackCreator creates buildpacks\ntype BuildpackCreator interface {\n\tNewBuildpack(ctx context.Context, options client.NewBuildpackOptions) error\n}\n\n// BuildpackNew generates the scaffolding of a buildpack\nfunc BuildpackNew(logger logging.Logger, creator BuildpackCreator) *cobra.Command {\n\tvar flags BuildpackNewFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"new <id>\",\n\t\tShort:   \"Creates basic scaffolding of a buildpack.\",\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(1), cobra.OnlyValidArgs),\n\t\tExample: \"pack buildpack new sample/my-buildpack\",\n\t\tLong:    \"buildpack new generates the basic scaffolding of a buildpack repository. It creates a new directory `name` in the current directory (or at `path`, if passed as a flag), initializes a buildpack.toml, and creates two executable bash scripts, `bin/detect` and `bin/build`.
\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tid := args[0]\n\t\t\tidParts := strings.Split(id, \"/\")\n\t\t\tdirName := idParts[len(idParts)-1]\n\n\t\t\tvar path string\n\t\t\tif len(flags.Path) == 0 {\n\t\t\t\tcwd, err := os.Getwd()\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t\tpath = filepath.Join(cwd, dirName)\n\t\t\t} else {\n\t\t\t\tpath = flags.Path\n\t\t\t}\n\n\t\t\t_, err := os.Stat(path)\n\t\t\tif !os.IsNotExist(err) {\n\t\t\t\treturn fmt.Errorf(\"directory %s exists\", style.Symbol(path))\n\t\t\t}\n\n\t\t\tvar stacks []dist.Stack\n\t\t\tfor _, s := range flags.Stacks {\n\t\t\t\tstacks = append(stacks, dist.Stack{\n\t\t\t\t\tID:     s,\n\t\t\t\t\tMixins: []string{},\n\t\t\t\t})\n\t\t\t}\n\n\t\t\tvar targets []dist.Target\n\t\t\tif len(flags.Targets) == 0 && len(flags.Stacks) == 0 {\n\t\t\t\ttargets = []dist.Target{{\n\t\t\t\t\tOS:   runtime.GOOS,\n\t\t\t\t\tArch: runtime.GOARCH,\n\t\t\t\t}}\n\t\t\t} else {\n\t\t\t\tif targets, err = target.ParseTargets(flags.Targets, logger); err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tif err := creator.NewBuildpack(cmd.Context(), client.NewBuildpackOptions{\n\t\t\t\tAPI:     flags.API,\n\t\t\t\tID:      id,\n\t\t\t\tPath:    path,\n\t\t\t\tStacks:  stacks,\n\t\t\t\tTargets: targets,\n\t\t\t\tVersion: flags.Version,\n\t\t\t}); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tlogger.Infof(\"Successfully created %s\", style.Symbol(id))\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().StringVarP(&flags.API, \"api\", \"a\", \"0.8\", \"Buildpack API compatibility of the generated buildpack\")\n\tcmd.Flags().StringVarP(&flags.Path, \"path\", \"p\", \"\", \"Path to generate the buildpack\")\n\tcmd.Flags().StringVarP(&flags.Version, \"version\", \"V\", \"1.0.0\", \"Version of the generated buildpack\")\n\tcmd.Flags().StringSliceVarP(&flags.Stacks, \"stacks\", \"s\", nil, \"Stack(s) this buildpack will be compatible 
with\"+stringSliceHelp(\"stack\"))\n\tcmd.Flags().MarkDeprecated(\"stacks\", \"prefer `--targets` instead: https://github.com/buildpacks/rfcs/blob/main/text/0096-remove-stacks-mixins.md\")\n\tcmd.Flags().StringSliceVarP(&flags.Targets, \"targets\", \"t\", nil,\n\t\t`Targets are of the form 'os/arch/variant', for example 'linux/amd64' or 'linux/arm64/v8'. The full format for a target is [os][/arch][/variant]:[distroname@osversion@anotherversion];[distroname@osversion]\n\t- Base case for two different architectures: '--targets \"linux/amd64\" --targets \"linux/arm64\"'\n\t- Case for a distribution version: '--targets \"windows/amd64:windows-nano@10.0.19041.1415\"'\n\t- Case for one architecture with multiple distribution versions: '--targets \"linux/arm/v6:ubuntu@14.04\" --targets \"linux/arm/v6:ubuntu@16.04\"'\n\t`)\n\n\tAddHelpFlag(cmd, \"new\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/buildpack_new_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildpackNewCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"BuildpackNewCommand\", testBuildpackNewCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildpackNewCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\ttmpDir         string\n\t)\n\ttargets := []dist.Target{{\n\t\tOS:   runtime.GOOS,\n\t\tArch: runtime.GOARCH,\n\t}}\n\n\tit.Before(func() {\n\t\tvar err error\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"build-test\")\n\t\th.AssertNil(t, err)\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\n\t\tcommand = commands.BuildpackNew(logger, mockClient)\n\t})\n\n\tit.After(func() {\n\t\tos.RemoveAll(tmpDir)\n\t})\n\n\twhen(\"BuildpackNew#Execute\", func() {\n\t\tit(\"uses the args to generate artifacts\", func() {\n\t\t\tmockClient.EXPECT().NewBuildpack(gomock.Any(), client.NewBuildpackOptions{\n\t\t\t\tAPI:     \"0.8\",\n\t\t\t\tID:      \"example/some-cnb\",\n\t\t\t\tPath:    filepath.Join(tmpDir, \"some-cnb\"),\n\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\tTargets: 
targets,\n\t\t\t}).Return(nil).MaxTimes(1)\n\n\t\t\tpath := filepath.Join(tmpDir, \"some-cnb\")\n\t\t\tcommand.SetArgs([]string{\"--path\", path, \"example/some-cnb\"})\n\n\t\t\terr := command.Execute()\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit(\"stops if the directory already exists\", func() {\n\t\t\terr := os.MkdirAll(tmpDir, 0600)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tcommand.SetArgs([]string{\"--path\", tmpDir, \"example/some-cnb\"})\n\t\t\terr = command.Execute()\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertContains(t, outBuf.String(), \"ERROR: directory\")\n\t\t})\n\n\t\twhen(\"target flag is specified\", func() {\n\t\t\tit(\"uses the target to generate artifacts\", func() {\n\t\t\t\tmockClient.EXPECT().NewBuildpack(gomock.Any(), client.NewBuildpackOptions{\n\t\t\t\t\tAPI:     \"0.8\",\n\t\t\t\t\tID:      \"example/targets\",\n\t\t\t\t\tPath:    filepath.Join(tmpDir, \"targets\"),\n\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\tTargets: []dist.Target{{\n\t\t\t\t\t\tOS:          \"linux\",\n\t\t\t\t\t\tArch:        \"arm\",\n\t\t\t\t\t\tArchVariant: \"v6\",\n\t\t\t\t\t\tDistributions: []dist.Distribution{{\n\t\t\t\t\t\t\tName:    \"ubuntu\",\n\t\t\t\t\t\t\tVersion: \"14.04\",\n\t\t\t\t\t\t}},\n\t\t\t\t\t}},\n\t\t\t\t}).Return(nil).MaxTimes(1)\n\n\t\t\t\tpath := filepath.Join(tmpDir, \"targets\")\n\t\t\t\tcommand.SetArgs([]string{\"--path\", path, \"example/targets\", \"--targets\", \"linux/arm/v6:ubuntu@14.04\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t\tit(\"shows an error when an invalid [os]/[arch] is passed\", func() {\n\t\t\t\tmockClient.EXPECT().NewBuildpack(gomock.Any(), client.NewBuildpackOptions{\n\t\t\t\t\tAPI:     \"0.8\",\n\t\t\t\t\tID:      \"example/targets\",\n\t\t\t\t\tPath:    filepath.Join(tmpDir, \"targets\"),\n\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\tTargets: []dist.Target{{\n\t\t\t\t\t\tOS:          \"os\",\n\t\t\t\t\t\tArch:        \"arm\",\n\t\t\t\t\t\tArchVariant: \"v6\",\n\t\t\t\t\t\tDistributions: 
[]dist.Distribution{{\n\t\t\t\t\t\t\tName:    \"ubuntu\",\n\t\t\t\t\t\t\tVersion: \"14.04\",\n\t\t\t\t\t\t}, {\n\t\t\t\t\t\t\tName:    \"ubuntu\",\n\t\t\t\t\t\t\tVersion: \"16.04\",\n\t\t\t\t\t\t}},\n\t\t\t\t\t}},\n\t\t\t\t}).Return(nil).MaxTimes(1)\n\n\t\t\t\tpath := filepath.Join(tmpDir, \"targets\")\n\t\t\t\tcommand.SetArgs([]string{\"--path\", path, \"example/targets\", \"--targets\", \"os/arm/v6:ubuntu@14.04@16.04\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t\twhen(\"it should\", func() {\n\t\t\t\tit(\"support format [os][/arch][/variant]:[name@version];[some-name@version]\", func() {\n\t\t\t\t\tmockClient.EXPECT().NewBuildpack(gomock.Any(), client.NewBuildpackOptions{\n\t\t\t\t\t\tAPI:     \"0.8\",\n\t\t\t\t\t\tID:      \"example/targets\",\n\t\t\t\t\t\tPath:    filepath.Join(tmpDir, \"targets\"),\n\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\tTargets: []dist.Target{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tOS:          \"linux\",\n\t\t\t\t\t\t\t\tArch:        \"arm\",\n\t\t\t\t\t\t\t\tArchVariant: \"v6\",\n\t\t\t\t\t\t\t\tDistributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tName:    \"ubuntu\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"14.04\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tName:    \"debian\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"8.10\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tOS:   \"windows\",\n\t\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t\t\tDistributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tName:    \"windows-nano\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"10.0.19041.1415\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}).Return(nil).MaxTimes(1)\n\n\t\t\t\t\tpath := filepath.Join(tmpDir, \"targets\")\n\t\t\t\t\tcommand.SetArgs([]string{\"--path\", path, \"example/targets\", \"--targets\", \"linux/arm/v6:ubuntu@14.04;debian@8.10\", \"-t\", 
\"windows/amd64:windows-nano@10.0.19041.1415\"})\n\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"stacks \", func() {\n\t\t\t\tit(\"flag should show deprecated message when used\", func() {\n\t\t\t\t\tmockClient.EXPECT().NewBuildpack(gomock.Any(), client.NewBuildpackOptions{\n\t\t\t\t\t\tAPI:     \"0.8\",\n\t\t\t\t\t\tID:      \"example/stacks\",\n\t\t\t\t\t\tPath:    filepath.Join(tmpDir, \"stacks\"),\n\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\tStacks: []dist.Stack{{\n\t\t\t\t\t\t\tID:     \"io.buildpacks.stacks.jammy\",\n\t\t\t\t\t\t\tMixins: []string{},\n\t\t\t\t\t\t}},\n\t\t\t\t\t}).Return(nil).MaxTimes(1)\n\n\t\t\t\t\tpath := filepath.Join(tmpDir, \"stacks\")\n\t\t\t\t\toutput := new(bytes.Buffer)\n\t\t\t\t\tcommand.SetOut(output)\n\t\t\t\t\tcommand.SetErr(output)\n\t\t\t\t\tcommand.SetArgs([]string{\"--path\", path, \"example/stacks\", \"--stacks\", \"io.buildpacks.stacks.jammy\"})\n\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertContains(t, output.String(), \"Flag --stacks has been deprecated,\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/buildpack_package.go",
    "content": "package commands\n\nimport (\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// BuildpackPackageFlags define flags provided to the BuildpackPackage command\ntype BuildpackPackageFlags struct {\n\tPackageTomlPath       string\n\tFormat                string\n\tPolicy                string\n\tBuildpackRegistry     string\n\tPath                  string\n\tFlattenExclude        []string\n\tTargets               []string\n\tLabel                 map[string]string\n\tPublish               bool\n\tFlatten               bool\n\tAppendImageNameSuffix bool\n\tAdditionalTags        []string\n}\n\n// BuildpackPackager packages buildpacks\ntype BuildpackPackager interface {\n\tPackageBuildpack(ctx context.Context, options client.PackageBuildpackOptions) error\n}\n\n// PackageConfigReader reads BuildpackPackage configs\ntype PackageConfigReader interface {\n\tRead(path string) (pubbldpkg.Config, error)\n\tReadBuildpackDescriptor(path string) (dist.BuildpackDescriptor, error)\n}\n\n// BuildpackPackage packages (a) buildpack(s) into OCI format, based on a package config\nfunc BuildpackPackage(logger logging.Logger, cfg config.Config, packager BuildpackPackager, packageConfigReader PackageConfigReader) *cobra.Command {\n\tvar flags BuildpackPackageFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"package <name> --config <config-path>\",\n\t\tShort:   \"Package a buildpack in OCI format.\",\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(1), cobra.OnlyValidArgs),\n\t\tExample: \"pack buildpack package my-buildpack --config ./package.toml\\npack buildpack package 
my-buildpack.cnb --config ./package.toml -f file\",\n\t\tLong: \"buildpack package allows users to package (a) buildpack(s) into OCI format, which can then be hosted in \" +\n\t\t\t\"image repositories or persisted on disk as a '.cnb' file. You can also package a number of buildpacks \" +\n\t\t\t\"together, to enable easier distribution of a set of buildpacks. \" +\n\t\t\t\"Packaged buildpacks can be used as inputs to `pack build` (using the `--buildpack` flag), \" +\n\t\t\t\"and they can be included in the configs used in `pack builder create` and `pack buildpack package`. For more \" +\n\t\t\t\"on how to package a buildpack, see: https://buildpacks.io/docs/buildpack-author-guide/package-a-buildpack/.\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tif err := validateBuildpackPackageFlags(cfg, &flags); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tstringPolicy := flags.Policy\n\t\t\tif stringPolicy == \"\" {\n\t\t\t\tstringPolicy = cfg.PullPolicy\n\t\t\t}\n\t\t\tpullPolicy, err := image.ParsePullPolicy(stringPolicy)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"parsing pull policy\")\n\t\t\t}\n\t\t\tbpPackageCfg := pubbldpkg.DefaultConfig()\n\t\t\tvar bpPath string\n\t\t\tif flags.Path != \"\" {\n\t\t\t\tif bpPath, err = filepath.Abs(flags.Path); err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"resolving buildpack path\")\n\t\t\t\t}\n\t\t\t\tbpPackageCfg.Buildpack.URI = bpPath\n\t\t\t}\n\t\t\trelativeBaseDir := \"\"\n\t\t\tif flags.PackageTomlPath != \"\" {\n\t\t\t\tbpPackageCfg, err = packageConfigReader.Read(flags.PackageTomlPath)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"reading config\")\n\t\t\t\t}\n\n\t\t\t\trelativeBaseDir, err = filepath.Abs(filepath.Dir(flags.PackageTomlPath))\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"getting absolute path for config\")\n\t\t\t\t}\n\t\t\t}\n\t\t\tname := args[0]\n\t\t\tif flags.Format == client.FormatFile {\n\t\t\t\tswitch ext := filepath.Ext(name); ext {\n\t\t\t\tcase client.CNBExtension:\n\t\t\t\tcase \"\":\n\t\t\t\t\tname += client.CNBExtension\n\t\t\t\tdefault:\n\t\t\t\t\tlogger.Warnf(\"%s is not a valid extension for a packaged buildpack. Packaged buildpacks must have a %s extension\", style.Symbol(ext), style.Symbol(client.CNBExtension))\n\t\t\t\t}\n\t\t\t}\n\t\t\tif flags.Flatten {\n\t\t\t\tlogger.Warn(\"Flattening a buildpack package could break the distribution specification. Please use it with caution.\")\n\t\t\t}\n\n\t\t\ttargets, isCompositeBP, err := processBuildpackPackageTargets(flags.Path, packageConfigReader, bpPackageCfg)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tdaemon := !flags.Publish && flags.Format == \"\"\n\t\t\tmultiArchCfg, err := processMultiArchitectureConfig(logger, flags.Targets, targets, daemon)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tif len(multiArchCfg.Targets()) == 0 {\n\t\t\t\tif isCompositeBP {\n\t\t\t\t\tlogger.Infof(\"Pro tip: use --target flag OR [[targets]] in package.toml to specify the desired platform (os/arch/variant); using os %s\", style.Symbol(bpPackageCfg.Platform.OS))\n\t\t\t\t} else {\n\t\t\t\t\tlogger.Infof(\"Pro tip: use --target flag OR [[targets]] in buildpack.toml to specify the desired platform (os/arch/variant); using os %s\", style.Symbol(bpPackageCfg.Platform.OS))\n\t\t\t\t}\n\t\t\t} else if !isCompositeBP {\n\t\t\t\t// FIXME: Check if we can copy the config files during layers creation.\n\t\t\t\tfilesToClean, err := multiArchCfg.CopyConfigFiles(bpPath, \"buildpack\")\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t\tdefer clean(filesToClean)\n\t\t\t}\n\n\t\t\tif !flags.Publish && flags.AppendImageNameSuffix {\n\t\t\t\tlogger.Warnf(\"--append-image-name-suffix will be ignored; use it together with --publish\")\n\t\t\t}\n\n\t\t\tif err := packager.PackageBuildpack(cmd.Context(), client.PackageBuildpackOptions{\n\t\t\t\tRelativeBaseDir:       relativeBaseDir,\n\t\t\t\tName:  
                name,\n\t\t\t\tFormat:                flags.Format,\n\t\t\t\tConfig:                bpPackageCfg,\n\t\t\t\tPublish:               flags.Publish,\n\t\t\t\tAppendImageNameSuffix: flags.AppendImageNameSuffix && flags.Publish,\n\t\t\t\tPullPolicy:            pullPolicy,\n\t\t\t\tRegistry:              flags.BuildpackRegistry,\n\t\t\t\tFlatten:               flags.Flatten,\n\t\t\t\tFlattenExclude:        flags.FlattenExclude,\n\t\t\t\tLabels:                flags.Label,\n\t\t\t\tTargets:               multiArchCfg.Targets(),\n\t\t\t\tAdditionalTags:        flags.AdditionalTags,\n\t\t\t}); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\taction := \"created\"\n\t\t\tlocation := \"docker daemon\"\n\t\t\tif flags.Publish {\n\t\t\t\taction = \"published\"\n\t\t\t\tlocation = \"registry\"\n\t\t\t}\n\t\t\tif flags.Format == client.FormatFile {\n\t\t\t\tlocation = \"file\"\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully %s package %s and saved to %s\", action, style.Symbol(name), location)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().StringVarP(&flags.PackageTomlPath, \"config\", \"c\", \"\", \"Path to package TOML config\")\n\tcmd.Flags().StringVarP(&flags.Format, \"format\", \"f\", \"\", `Format to save package as (\"image\" or \"file\")`)\n\tcmd.Flags().BoolVar(&flags.Publish, \"publish\", false, `Publish the buildpack directly to the container registry specified in <name>, instead of the daemon (applies to \"--format=image\" only).`)\n\tcmd.Flags().BoolVar(&flags.AppendImageNameSuffix, \"append-image-name-suffix\", false, \"When publishing to a registry that doesn't allow overwriting existing tags, use this flag to append an [os]-[arch] suffix to package <name>\")\n\tcmd.Flags().StringVar(&flags.Policy, \"pull-policy\", \"\", \"Pull policy to use. Accepted values are always, never, and if-not-present. 
The default is always\")\n\tcmd.Flags().StringVarP(&flags.Path, \"path\", \"p\", \"\", \"Path to the Buildpack that needs to be packaged\")\n\tcmd.Flags().StringVarP(&flags.BuildpackRegistry, \"buildpack-registry\", \"r\", \"\", \"Buildpack Registry name\")\n\tcmd.Flags().BoolVar(&flags.Flatten, \"flatten\", false, \"Flatten the buildpack into a single layer\")\n\tcmd.Flags().StringSliceVarP(&flags.FlattenExclude, \"flatten-exclude\", \"e\", nil, \"Buildpacks to exclude from flattening, in the form of '<buildpack-id>@<buildpack-version>'\")\n\tcmd.Flags().StringToStringVarP(&flags.Label, \"label\", \"l\", nil, \"Labels to add to packaged Buildpack, in the form of '<name>=<value>'\")\n\tcmd.Flags().StringSliceVarP(&flags.Targets, \"target\", \"t\", nil,\n\t\t`Target platforms to build for.\nTargets should be in the format '[os][/arch][/variant]:[distroname@osversion@anotherversion];[distroname@osversion]'.\n- To specify two different architectures: '--target \"linux/amd64\" --target \"linux/arm64\"'\n- To specify the distribution version: '--target \"linux/arm/v6:ubuntu@14.04\"'\n- To specify multiple distribution versions: '--target \"linux/arm/v6:ubuntu@14.04\" --target \"linux/arm/v6:ubuntu@16.04\"'`)\n\tcmd.Flags().StringSliceVarP(&flags.AdditionalTags, \"tag\", \"\", nil, \"Additional tags to push the output image to.\\nTags should be in the format 'image:tag' or 'repository/image:tag'.\"+stringSliceHelp(\"tag\"))\n\tif !cfg.Experimental {\n\t\tcmd.Flags().MarkHidden(\"flatten\")\n\t\tcmd.Flags().MarkHidden(\"flatten-exclude\")\n\t}\n\tAddHelpFlag(cmd, \"package\")\n\treturn cmd\n}\n\nfunc validateBuildpackPackageFlags(cfg config.Config, p *BuildpackPackageFlags) error {\n\tif p.Publish && p.Policy == image.PullNever.String() {\n\t\treturn errors.Errorf(\"--publish and --pull-policy never cannot be used together. 
The --publish flag requires the use of remote images.\")\n\t}\n\tif p.PackageTomlPath != \"\" && p.Path != \"\" {\n\t\treturn errors.Errorf(\"--config and --path cannot be used together. Please specify the relative path to the Buildpack directory in the package config file.\")\n\t}\n\n\tif p.Flatten {\n\t\tif !cfg.Experimental {\n\t\t\treturn client.NewExperimentError(\"Flattening a buildpack package is currently experimental.\")\n\t\t}\n\n\t\tif len(p.FlattenExclude) > 0 {\n\t\t\tfor _, exclude := range p.FlattenExclude {\n\t\t\t\tif strings.Count(exclude, \"@\") != 1 {\n\t\t\t\t\treturn errors.Errorf(\"invalid format %s; please use '<buildpack-id>@<buildpack-version>' to exclude buildpack from flattening\", exclude)\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\treturn nil\n}\n\n// processBuildpackPackageTargets returns the list of targets defined in the configuration file; it could be the buildpack.toml or\n// the package.toml if the buildpack is a composite buildpack\nfunc processBuildpackPackageTargets(path string, packageConfigReader PackageConfigReader, bpPackageCfg pubbldpkg.Config) ([]dist.Target, bool, error) {\n\tvar (\n\t\ttargets       []dist.Target\n\t\torder         dist.Order\n\t\tisCompositeBP bool\n\t)\n\n\t// Read targets from buildpack.toml\n\tpathToBuildpackToml := filepath.Join(path, \"buildpack.toml\")\n\tif _, err := os.Stat(pathToBuildpackToml); err == nil {\n\t\tbuildpackCfg, err := packageConfigReader.ReadBuildpackDescriptor(pathToBuildpackToml)\n\t\tif err != nil {\n\t\t\treturn nil, false, err\n\t\t}\n\t\ttargets = buildpackCfg.Targets()\n\t\torder = buildpackCfg.Order()\n\t\tisCompositeBP = len(order) > 0\n\t}\n\n\t// When composite buildpack, targets are defined in package.toml - See RFC-0128\n\tif isCompositeBP {\n\t\ttargets = bpPackageCfg.Targets\n\t}\n\treturn targets, isCompositeBP, nil\n}\n\nfunc clean(paths []string) error {\n\t// we need to clean the buildpack.toml for each place where we copied to\n\tif len(paths) > 0 {\n\t\tfor _, path 
:= range paths {\n\t\t\tos.Remove(path)\n\t\t}\n\t}\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/buildpack_package_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/fakes\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPackageCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"PackageCommand\", testPackageCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPackageCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger *logging.LogWithWriters\n\t\toutBuf bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t})\n\n\twhen(\"Package#Execute\", func() {\n\t\tvar fakeBuildpackPackager *fakes.FakeBuildpackPackager\n\n\t\tit.Before(func() {\n\t\t\tfakeBuildpackPackager = &fakes.FakeBuildpackPackager{}\n\t\t})\n\n\t\twhen(\"valid package config\", func() {\n\t\t\tit(\"reads package config from the configured path\", func() {\n\t\t\t\tfakePackageConfigReader := fakes.NewFakePackageConfigReader()\n\t\t\t\texpectedPackageConfigPath := \"/path/to/some/file\"\n\n\t\t\t\tcmd := packageCommand(\n\t\t\t\t\twithPackageConfigReader(fakePackageConfigReader),\n\t\t\t\t\twithPackageConfigPath(expectedPackageConfigPath),\n\t\t\t\t)\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, fakePackageConfigReader.ReadCalledWithArg, expectedPackageConfigPath)\n\t\t\t})\n\n\t\t\tit(\"creates package with correct image name\", func() {\n\t\t\t\tcmd := 
packageCommand(\n\t\t\t\t\twithImageName(\"my-specific-image\"),\n\t\t\t\t\twithBuildpackPackager(fakeBuildpackPackager),\n\t\t\t\t)\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\th.AssertEq(t, receivedOptions.Name, \"my-specific-image\")\n\t\t\t})\n\n\t\t\tit(\"creates package with config returned by the reader\", func() {\n\t\t\t\tmyConfig := pubbldpkg.Config{\n\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: \"test\"},\n\t\t\t\t}\n\n\t\t\t\tcmd := packageCommand(\n\t\t\t\t\twithBuildpackPackager(fakeBuildpackPackager),\n\t\t\t\t\twithPackageConfigReader(fakes.NewFakePackageConfigReader(whereReadReturns(myConfig, nil))),\n\t\t\t\t)\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\th.AssertEq(t, receivedOptions.Config, myConfig)\n\t\t\t})\n\n\t\t\twhen(\"file format\", func() {\n\t\t\t\twhen(\"extension is .cnb\", func() {\n\t\t\t\t\tit(\"does not modify the name\", func() {\n\t\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"test.cnb\", \"-f\", \"file\"})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\t\th.AssertEq(t, receivedOptions.Name, \"test.cnb\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"extension is empty\", func() {\n\t\t\t\t\tit(\"appends .cnb to the name\", func() {\n\t\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"test\", \"-f\", \"file\"})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\t\th.AssertEq(t, receivedOptions.Name, \"test.cnb\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"extension is something other than .cnb\", func() {\n\t\t\t\t\tit(\"does not modify the name 
but shows a warning\", func() {\n\t\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager), withLogger(logger))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"test.tar.gz\", \"-f\", \"file\"})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\t\th.AssertEq(t, receivedOptions.Name, \"test.tar.gz\")\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"'.gz' is not a valid extension for a packaged buildpack. Packaged buildpacks must have a '.cnb' extension\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"flatten is set to true\", func() {\n\t\t\t\t\twhen(\"experimental is true\", func() {\n\t\t\t\t\t\twhen(\"flatten exclude doesn't have format <buildpack>@<version>\", func() {\n\t\t\t\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\t\t\t\tcmd := packageCommand(withClientConfig(config.Config{Experimental: true}), withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\t\t\t\tcmd.SetArgs([]string{\"test\", \"-f\", \"file\", \"--flatten\", \"--flatten-exclude\", \"some-buildpack\"})\n\n\t\t\t\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"invalid format %s; please use '<buildpack-id>@<buildpack-version>' to exclude buildpack from flattening\", \"some-buildpack\"))\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"no exclusions\", func() {\n\t\t\t\t\t\t\tit(\"creates package with correct image name and warns flatten is being used\", func() {\n\t\t\t\t\t\t\t\tcmd := packageCommand(\n\t\t\t\t\t\t\t\t\twithClientConfig(config.Config{Experimental: true}),\n\t\t\t\t\t\t\t\t\twithBuildpackPackager(fakeBuildpackPackager),\n\t\t\t\t\t\t\t\t\twithLogger(logger),\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\tcmd.SetArgs([]string{\"my-flatten-image\", \"-f\", \"file\", \"--flatten\"})\n\t\t\t\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\treceivedOptions := 
fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\t\t\t\th.AssertEq(t, receivedOptions.Name, \"my-flatten-image.cnb\")\n\t\t\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Flattening a buildpack package could break the distribution specification. Please use it with caution.\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"experimental is false\", func() {\n\t\t\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\t\t\tcmd := packageCommand(withClientConfig(config.Config{Experimental: false}), withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\t\t\tcmd.SetArgs([]string{\"test\", \"-f\", \"file\", \"--flatten\"})\n\n\t\t\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\t\t\th.AssertError(t, err, \"Flattening a buildpack package is currently experimental.\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"there is a path flag\", func() {\n\t\t\t\tit(\"returns an error saying that it cannot be used with the config flag\", func() {\n\t\t\t\t\tmyConfig := pubbldpkg.Config{\n\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: \"test\"},\n\t\t\t\t\t}\n\n\t\t\t\t\tcmd := packageCommand(\n\t\t\t\t\t\twithBuildpackPackager(fakeBuildpackPackager),\n\t\t\t\t\t\twithPackageConfigReader(fakes.NewFakePackageConfigReader(whereReadReturns(myConfig, nil))),\n\t\t\t\t\t\twithPath(\"..\"),\n\t\t\t\t\t)\n\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\th.AssertError(t, err, \"--config and --path cannot be used together\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"pull-policy\", func() {\n\t\t\t\tvar pullPolicyArgs = []string{\n\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t}\n\n\t\t\t\tit(\"pull-policy=never sets policy\", func() {\n\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\tcmd.SetArgs(append(pullPolicyArgs, \"never\"))\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\treceivedOptions := 
fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullNever)\n\t\t\t\t})\n\n\t\t\t\tit(\"pull-policy=always sets policy\", func() {\n\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\tcmd.SetArgs(append(pullPolicyArgs, \"always\"))\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullAlways)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"no --pull-policy\", func() {\n\t\t\t\tvar pullPolicyArgs = []string{\n\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t}\n\n\t\t\t\tit(\"uses the default policy when no policy configured\", func() {\n\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\tcmd.SetArgs(pullPolicyArgs)\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullAlways)\n\t\t\t\t})\n\t\t\t\tit(\"uses the configured pull policy when policy configured\", func() {\n\t\t\t\t\tcmd := packageCommand(\n\t\t\t\t\t\twithBuildpackPackager(fakeBuildpackPackager),\n\t\t\t\t\t\twithClientConfig(config.Config{PullPolicy: \"never\"}),\n\t\t\t\t\t)\n\n\t\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t})\n\n\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullNever)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"composite buildpack\", func() {\n\t\t\t\twhen(\"multi-platform\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\ttargets    []dist.Target\n\t\t\t\t\t\tdescriptor dist.BuildpackDescriptor\n\t\t\t\t\t\tconfig     pubbldpkg.Config\n\t\t\t\t\t\tpath       
string\n\t\t\t\t\t)\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\ttargets = []dist.Target{\n\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t}\n\t\t\t\t\t\tconfig = pubbldpkg.Config{Buildpack: dist.BuildpackURI{URI: \"test\"}}\n\t\t\t\t\t\tdescriptor = dist.BuildpackDescriptor{WithTargets: targets}\n\t\t\t\t\t\tpath = \"testdata\"\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates a multi-platform buildpack package\", func() {\n\t\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager), withPackageConfigReader(fakes.NewFakePackageConfigReader(whereReadReturns(config, nil), whereReadBuildpackDescriptor(descriptor, nil))))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"some-name\", \"-p\", path})\n\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\t\th.AssertEq(t, fakeBuildpackPackager.CreateCalledWithOptions.Targets, targets)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"additional tags are specified\", func() {\n\t\t\t\tit(\"forwards additional tags to buildpackPackager\", func() {\n\t\t\t\t\texpectedTags := []string{\"additional-tag-1\", \"additional-tag-2\"}\n\t\t\t\t\tcmd := packageCommand(\n\t\t\t\t\t\twithBuildpackPackager(fakeBuildpackPackager),\n\t\t\t\t\t)\n\t\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\t\"my-specific-image\",\n\t\t\t\t\t\t\"--tag\", expectedTags[0], \"--tag\", expectedTags[1],\n\t\t\t\t\t})\n\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.AdditionalTags[0], expectedTags[0])\n\t\t\t\t\th.AssertEq(t, receivedOptions.AdditionalTags[1], expectedTags[1])\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no config path is specified\", func() {\n\t\t\twhen(\"no path is specified\", func() {\n\t\t\t\tit(\"creates a default config with the uri set to the current working directory\", func() {\n\t\t\t\t\tcmd := 
packageCommand(withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\tcmd.SetArgs([]string{\"some-name\"})\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.Config.Buildpack.URI, \".\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"a path is specified\", func() {\n\t\t\t\twhen(\"not multi-platform\", func() {\n\t\t\t\t\tit(\"creates a default config with the appropriate path\", func() {\n\t\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"some-name\", \"-p\", \"..\"})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\t\tbpPath, _ := filepath.Abs(\"..\")\n\t\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\t\th.AssertEq(t, receivedOptions.Config.Buildpack.URI, bpPath)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"multi-platform\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\ttargets    []dist.Target\n\t\t\t\t\t\tdescriptor dist.BuildpackDescriptor\n\t\t\t\t\t\tpath       string\n\t\t\t\t\t)\n\n\t\t\t\t\twhen(\"single buildpack\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\ttargets = []dist.Target{\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tdescriptor = dist.BuildpackDescriptor{WithTargets: targets}\n\t\t\t\t\t\t\tpath = \"testdata\"\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"creates a multi-platform buildpack package\", func() {\n\t\t\t\t\t\t\tcmd := packageCommand(withBuildpackPackager(fakeBuildpackPackager), withPackageConfigReader(fakes.NewFakePackageConfigReader(whereReadBuildpackDescriptor(descriptor, nil))))\n\t\t\t\t\t\t\tcmd.SetArgs([]string{\"some-name\", \"-p\", path})\n\n\t\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\t\t\th.AssertEq(t, fakeBuildpackPackager.CreateCalledWithOptions.Targets, 
targets)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"invalid flags\", func() {\n\t\twhen(\"both --publish and --pull-policy never flags are specified\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcmd := packageCommand()\n\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\", \"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\"--pull-policy\", \"never\",\n\t\t\t\t})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"--publish and --pull-policy never cannot be used together. The --publish flag requires the use of remote images.\")\n\t\t\t})\n\t\t})\n\n\t\tit(\"logs an error and exits when package toml is invalid\", func() {\n\t\t\texpectedErr := errors.New(\"it went wrong\")\n\n\t\t\tcmd := packageCommand(\n\t\t\t\twithLogger(logger),\n\t\t\t\twithPackageConfigReader(\n\t\t\t\t\tfakes.NewFakePackageConfigReader(whereReadReturns(pubbldpkg.Config{}, expectedErr)),\n\t\t\t\t),\n\t\t\t)\n\n\t\t\terr := cmd.Execute()\n\t\t\th.AssertNotNil(t, err)\n\n\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"ERROR: reading config: %s\", expectedErr))\n\t\t})\n\n\t\twhen(\"package-config is specified\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcmd := packageCommand()\n\t\t\t\tcmd.SetArgs([]string{\"some-name\", \"--package-config\", \"some-path\"})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"unknown flag: --package-config\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy unknown-policy\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tcmd := packageCommand()\n\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t\t\"unknown-policy\",\n\t\t\t\t})\n\n\t\t\t\th.AssertError(t, cmd.Execute(), \"parsing pull policy\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--label cannot be parsed\", func() 
{\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcmd := packageCommand()\n\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\", \"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--label\", \"name+value\",\n\t\t\t\t})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"invalid argument \\\"name+value\\\" for \\\"-l, --label\\\" flag: name+value must be formatted as key=value\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--target cannot be parsed\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcmd := packageCommand()\n\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\", \"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--target\", \"something/wrong\", \"--publish\",\n\t\t\t\t})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"unknown target: 'something/wrong'\")\n\t\t\t})\n\t\t})\n\t})\n}\n\ntype packageCommandConfig struct {\n\tlogger              *logging.LogWithWriters\n\tpackageConfigReader *fakes.FakePackageConfigReader\n\tbuildpackPackager   *fakes.FakeBuildpackPackager\n\tclientConfig        config.Config\n\timageName           string\n\tconfigPath          string\n\tpath                string\n}\n\ntype packageCommandOption func(config *packageCommandConfig)\n\nfunc packageCommand(ops ...packageCommandOption) *cobra.Command {\n\tconfig := &packageCommandConfig{\n\t\tlogger:              logging.NewLogWithWriters(&bytes.Buffer{}, &bytes.Buffer{}),\n\t\tpackageConfigReader: fakes.NewFakePackageConfigReader(),\n\t\tbuildpackPackager:   &fakes.FakeBuildpackPackager{},\n\t\tclientConfig:        config.Config{},\n\t\timageName:           \"some-image-name\",\n\t\tconfigPath:          \"/path/to/some/file\",\n\t}\n\n\tfor _, op := range ops {\n\t\top(config)\n\t}\n\n\tcmd := commands.BuildpackPackage(config.logger, config.clientConfig, config.buildpackPackager, 
config.packageConfigReader)\n\tcmd.SetArgs([]string{config.imageName, \"--config\", config.configPath, \"-p\", config.path})\n\n\treturn cmd\n}\n\nfunc withLogger(logger *logging.LogWithWriters) packageCommandOption {\n\treturn func(config *packageCommandConfig) {\n\t\tconfig.logger = logger\n\t}\n}\n\nfunc withPackageConfigReader(reader *fakes.FakePackageConfigReader) packageCommandOption {\n\treturn func(config *packageCommandConfig) {\n\t\tconfig.packageConfigReader = reader\n\t}\n}\n\nfunc withBuildpackPackager(creator *fakes.FakeBuildpackPackager) packageCommandOption {\n\treturn func(config *packageCommandConfig) {\n\t\tconfig.buildpackPackager = creator\n\t}\n}\n\nfunc withImageName(name string) packageCommandOption {\n\treturn func(config *packageCommandConfig) {\n\t\tconfig.imageName = name\n\t}\n}\n\nfunc withPath(name string) packageCommandOption {\n\treturn func(config *packageCommandConfig) {\n\t\tconfig.path = name\n\t}\n}\n\nfunc withPackageConfigPath(path string) packageCommandOption {\n\treturn func(config *packageCommandConfig) {\n\t\tconfig.configPath = path\n\t}\n}\n\nfunc withClientConfig(clientCfg config.Config) packageCommandOption {\n\treturn func(config *packageCommandConfig) {\n\t\tconfig.clientConfig = clientCfg\n\t}\n}\n\nfunc whereReadReturns(config pubbldpkg.Config, err error) func(*fakes.FakePackageConfigReader) {\n\treturn func(r *fakes.FakePackageConfigReader) {\n\t\tr.ReadReturnConfig = config\n\t\tr.ReadReturnError = err\n\t}\n}\n\nfunc whereReadBuildpackDescriptor(descriptor dist.BuildpackDescriptor, err error) func(*fakes.FakePackageConfigReader) {\n\treturn func(r *fakes.FakePackageConfigReader) {\n\t\tr.ReadBuildpackDescriptorReturn = descriptor\n\t\tr.ReadBuildpackDescriptorReturnError = err\n\t}\n}\n"
  },
  {
    "path": "internal/commands/buildpack_pull.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// BuildpackPullFlags consist of flags applicable to the `buildpack pull` command\ntype BuildpackPullFlags struct {\n\t// BuildpackRegistry is the name of the buildpack registry to use to search for\n\tBuildpackRegistry string\n}\n\n// BuildpackPull pulls a buildpack and stores it locally\nfunc BuildpackPull(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tvar flags BuildpackPullFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"pull <uri>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Pull a buildpack from a registry and store it locally\",\n\t\tExample: \"pack buildpack pull example/my-buildpack@1.0.0\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tregistry, err := config.GetRegistry(cfg, flags.BuildpackRegistry)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\topts := client.PullBuildpackOptions{\n\t\t\t\tURI:          args[0],\n\t\t\t\tRegistryName: registry.Name,\n\t\t\t}\n\n\t\t\tif err := pack.PullBuildpack(cmd.Context(), opts); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully pulled %s\", style.Symbol(opts.URI))\n\t\t\treturn nil\n\t\t}),\n\t}\n\tcmd.Flags().StringVarP(&flags.BuildpackRegistry, \"buildpack-registry\", \"r\", \"\", \"Buildpack Registry name\")\n\tAddHelpFlag(cmd, \"pull\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/buildpack_pull_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPullBuildpackCommand(t *testing.T) {\n\tspec.Run(t, \"PullBuildpackCommand\", testPullBuildpackCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPullBuildpackCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcfg = config.Config{}\n\n\t\tcommand = commands.BuildpackPull(logger, cfg, mockClient)\n\t})\n\n\twhen(\"#BuildpackPullCommand\", func() {\n\t\twhen(\"no buildpack is provided\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpack uri is provided\", func() {\n\t\t\tit(\"should work for required args\", func() {\n\t\t\t\tbuildpackImage := \"buildpack/image\"\n\t\t\t\topts := client.PullBuildpackOptions{\n\t\t\t\t\tURI:          buildpackImage,\n\t\t\t\t\tRegistryName: \"official\",\n\t\t\t\t}\n\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tPullBuildpack(gomock.Any(), opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{buildpackImage})\n\t\t\t\th.AssertNil(t, 
command.Execute())\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/buildpack_register.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype BuildpackRegisterFlags struct {\n\tBuildpackRegistry string\n}\n\nfunc BuildpackRegister(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tvar opts client.RegisterBuildpackOptions\n\tvar flags BuildpackRegisterFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"register <image>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Register a buildpack to a registry\",\n\t\tExample: \"pack buildpack register my-buildpack\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tregistry, err := config.GetRegistry(cfg, flags.BuildpackRegistry)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\topts.ImageName = args[0]\n\t\t\topts.Type = registry.Type\n\t\t\topts.URL = registry.URL\n\t\t\topts.Name = registry.Name\n\n\t\t\tif err := pack.RegisterBuildpack(cmd.Context(), opts); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully registered %s\", style.Symbol(opts.ImageName))\n\t\t\treturn nil\n\t\t}),\n\t}\n\tcmd.Flags().StringVarP(&flags.BuildpackRegistry, \"buildpack-registry\", \"r\", \"\", \"Buildpack Registry name\")\n\tAddHelpFlag(cmd, \"register\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/buildpack_register_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRegisterCommand(t *testing.T) {\n\tspec.Run(t, \"RegisterCommand\", testRegisterCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testRegisterCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd            *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcfg = config.Config{}\n\n\t\tcmd = commands.BuildpackRegister(logger, cfg, mockClient)\n\t})\n\n\tit.After(func() {})\n\n\twhen(\"#RegisterBuildpackCommand\", func() {\n\t\twhen(\"no image is provided\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"image name is provided\", func() {\n\t\t\tvar buildpackImage = \"buildpack/image\"\n\n\t\t\tit(\"should work for required args\", func() {\n\t\t\t\topts := client.RegisterBuildpackOptions{\n\t\t\t\t\tImageName: buildpackImage,\n\t\t\t\t\tType:      \"github\",\n\t\t\t\t\tURL:       \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\tName:      \"official\",\n\t\t\t\t}\n\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tRegisterBuildpack(gomock.Any(), 
opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcmd.SetArgs([]string{buildpackImage})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t})\n\n\t\t\twhen(\"config.toml exists\", func() {\n\t\t\t\tit(\"should consume registry config values\", func() {\n\t\t\t\t\tcfg = config.Config{\n\t\t\t\t\t\tDefaultRegistryName: \"berneuse\",\n\t\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"berneuse\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  \"https://github.com/berneuse/buildpack-registry\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\t\t\t\t\tcmd = commands.BuildpackRegister(logger, cfg, mockClient)\n\t\t\t\t\topts := client.RegisterBuildpackOptions{\n\t\t\t\t\t\tImageName: buildpackImage,\n\t\t\t\t\t\tType:      \"github\",\n\t\t\t\t\t\tURL:       \"https://github.com/berneuse/buildpack-registry\",\n\t\t\t\t\t\tName:      \"berneuse\",\n\t\t\t\t\t}\n\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tRegisterBuildpack(gomock.Any(), opts).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcmd.SetArgs([]string{buildpackImage})\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t})\n\n\t\t\t\tit(\"should handle config errors\", func() {\n\t\t\t\t\tcfg = config.Config{\n\t\t\t\t\t\tDefaultRegistryName: \"missing registry\",\n\t\t\t\t\t}\n\t\t\t\t\tcmd = commands.BuildpackRegister(logger, cfg, mockClient)\n\t\t\t\t\tcmd.SetArgs([]string{buildpackImage})\n\n\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"should support buildpack-registry flag\", func() {\n\t\t\t\tbuildpackRegistry := \"override\"\n\t\t\t\tcfg = config.Config{\n\t\t\t\t\tDefaultRegistryName: \"default\",\n\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"default\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/default/buildpack-registry\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"override\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\tURL:  
\"https://github.com/override/buildpack-registry\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\topts := client.RegisterBuildpackOptions{\n\t\t\t\t\tImageName: buildpackImage,\n\t\t\t\t\tType:      \"github\",\n\t\t\t\t\tURL:       \"https://github.com/override/buildpack-registry\",\n\t\t\t\t\tName:      \"override\",\n\t\t\t\t}\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tRegisterBuildpack(gomock.Any(), opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcmd = commands.BuildpackRegister(logger, cfg, mockClient)\n\t\t\t\tcmd.SetArgs([]string{buildpackImage, \"--buildpack-registry\", buildpackRegistry})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/buildpack_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/fakes\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildpackCommand(t *testing.T) {\n\tspec.Run(t, \"BuildpackCommand\", testBuildpackCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildpackCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd        *cobra.Command\n\t\tlogger     logging.Logger\n\t\toutBuf     bytes.Buffer\n\t\tmockClient *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController := gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcmd = commands.NewBuildpackCommand(logger, config.Config{}, mockClient, fakes.NewFakePackageConfigReader())\n\t\tcmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\twhen(\"buildpack\", func() {\n\t\tit(\"prints help text\", func() {\n\t\t\tcmd.SetArgs([]string{})\n\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, \"Interact with buildpacks\")\n\t\t\tfor _, command := range []string{\"Usage\", \"package\", \"register\", \"yank\", \"pull\", \"inspect\"} {\n\t\t\t\th.AssertContains(t, output, command)\n\t\t\t}\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/buildpack_yank.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype BuildpackYankFlags struct {\n\tBuildpackRegistry string\n\tUndo              bool\n}\n\nfunc BuildpackYank(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tvar flags BuildpackYankFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"yank <buildpack-id-and-version>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Mark a buildpack on a Buildpack registry as unusable\",\n\t\tExample: \"pack yank my-buildpack@0.0.1\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tbuildpackIDVersion := args[0]\n\n\t\t\tregistry, err := config.GetRegistry(cfg, flags.BuildpackRegistry)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tid, version, err := parseIDVersion(buildpackIDVersion)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\tID:      id,\n\t\t\t\tVersion: version,\n\t\t\t\tType:    \"github\",\n\t\t\t\tURL:     registry.URL,\n\t\t\t\tYank:    !flags.Undo,\n\t\t\t}\n\n\t\t\tif err := pack.YankBuildpack(opts); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully yanked %s\", style.Symbol(buildpackIDVersion))\n\t\t\treturn nil\n\t\t}),\n\t}\n\tcmd.Flags().StringVarP(&flags.BuildpackRegistry, \"buildpack-registry\", \"r\", \"\", \"Buildpack Registry name\")\n\tcmd.Flags().BoolVarP(&flags.Undo, \"undo\", \"u\", false, \"undo previously yanked buildpack\")\n\tAddHelpFlag(cmd, \"yank\")\n\n\treturn cmd\n}\n\nfunc parseIDVersion(buildpackIDVersion string) (string, string, error) {\n\tparts := strings.Split(buildpackIDVersion, \"@\")\n\tif len(parts) != 2 {\n\t\treturn \"\", \"\", fmt.Errorf(\"invalid buildpack id@version %s\", 
style.Symbol(buildpackIDVersion))\n\t}\n\n\treturn parts[0], parts[1], nil\n}\n"
  },
  {
    "path": "internal/commands/buildpack_yank_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestYankCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"YankCommand\", testYankCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testYankCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd            *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcfg = config.Config{}\n\n\t\tcmd = commands.BuildpackYank(logger, cfg, mockClient)\n\t})\n\n\twhen(\"#YankBuildpackCommand\", func() {\n\t\twhen(\"no buildpack id@version is provided\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"id@version argument is provided\", func() {\n\t\t\tvar (\n\t\t\t\tbuildpackIDVersion string\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\tbuildpackIDVersion = \"heroku/rust@0.0.1\"\n\t\t\t})\n\n\t\t\tit(\"should work for required args\", func() {\n\t\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\t\tID:      \"heroku/rust\",\n\t\t\t\t\tVersion: \"0.0.1\",\n\t\t\t\t\tType:    \"github\",\n\t\t\t\t\tURL:  
   \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\tYank:    true,\n\t\t\t\t}\n\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tYankBuildpack(opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcmd.SetArgs([]string{buildpackIDVersion})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t})\n\n\t\t\tit(\"should fail for invalid buildpack id/version\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"mybuildpack\"})\n\t\t\t\terr := cmd.Execute()\n\n\t\t\t\th.AssertError(t, err, \"invalid buildpack id@version 'mybuildpack'\")\n\t\t\t})\n\n\t\t\tit(\"should use the default registry defined in config.toml\", func() {\n\t\t\t\tcfg = config.Config{\n\t\t\t\t\tDefaultRegistryName: \"official\",\n\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"official\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\t\tID:      \"heroku/rust\",\n\t\t\t\t\tVersion: \"0.0.1\",\n\t\t\t\t\tType:    \"github\",\n\t\t\t\t\tURL:     \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\tYank:    true,\n\t\t\t\t}\n\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tYankBuildpack(opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcmd.SetArgs([]string{buildpackIDVersion})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t})\n\n\t\t\tit(\"should undo\", func() {\n\t\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\t\tID:      \"heroku/rust\",\n\t\t\t\t\tVersion: \"0.0.1\",\n\t\t\t\t\tType:    \"github\",\n\t\t\t\t\tURL:     \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\tYank:    false,\n\t\t\t\t}\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tYankBuildpack(opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcmd.SetArgs([]string{buildpackIDVersion, \"--undo\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t})\n\n\t\t\twhen(\"buildpack-registry flag is used\", func() {\n\t\t\t\tit(\"should use the specified buildpack registry\", func() 
{\n\t\t\t\t\tbuildpackRegistry := \"override\"\n\t\t\t\t\tcfg = config.Config{\n\t\t\t\t\t\tDefaultRegistryName: \"default\",\n\t\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"default\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  \"https://github.com/default/buildpack-registry\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"override\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  \"https://github.com/override/buildpack-registry\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\t\t\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\t\t\tID:      \"heroku/rust\",\n\t\t\t\t\t\tVersion: \"0.0.1\",\n\t\t\t\t\t\tType:    \"github\",\n\t\t\t\t\t\tURL:     \"https://github.com/override/buildpack-registry\",\n\t\t\t\t\t\tYank:    true,\n\t\t\t\t\t}\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tYankBuildpack(opts).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcmd = commands.BuildpackYank(logger, cfg, mockClient)\n\t\t\t\t\tcmd.SetArgs([]string{buildpackIDVersion, \"--buildpack-registry\", buildpackRegistry})\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t})\n\n\t\t\t\tit(\"should handle config errors\", func() {\n\t\t\t\t\tcfg = config.Config{\n\t\t\t\t\t\tDefaultRegistryName: \"missing registry\",\n\t\t\t\t\t}\n\t\t\t\t\tcmd = commands.BuildpackYank(logger, cfg, mockClient)\n\t\t\t\t\tcmd.SetArgs([]string{buildpackIDVersion})\n\n\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/commands.go",
    "content": "package commands\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"os\"\n\t\"os/signal\"\n\t\"syscall\"\n\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/internal/target\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n//go:generate mockgen -package testmocks -destination testmocks/mock_pack_client.go github.com/buildpacks/pack/internal/commands PackClient\ntype PackClient interface {\n\tInspectBuilder(string, bool, ...client.BuilderInspectionModifier) (*client.BuilderInfo, error)\n\tInspectImage(string, bool) (*client.ImageInfo, error)\n\tRebase(context.Context, client.RebaseOptions) error\n\tCreateBuilder(context.Context, client.CreateBuilderOptions) error\n\tNewBuildpack(context.Context, client.NewBuildpackOptions) error\n\tPackageBuildpack(ctx context.Context, opts client.PackageBuildpackOptions) error\n\tPackageExtension(ctx context.Context, opts client.PackageBuildpackOptions) error\n\tBuild(context.Context, client.BuildOptions) error\n\tRegisterBuildpack(context.Context, client.RegisterBuildpackOptions) error\n\tYankBuildpack(client.YankBuildpackOptions) error\n\tInspectBuildpack(client.InspectBuildpackOptions) (*client.BuildpackInfo, error)\n\tInspectExtension(client.InspectExtensionOptions) (*client.ExtensionInfo, error)\n\tPullBuildpack(context.Context, client.PullBuildpackOptions) error\n\tDownloadSBOM(name string, options client.DownloadSBOMOptions) error\n\tCreateManifest(ctx context.Context, opts client.CreateManifestOptions) error\n\tAnnotateManifest(ctx context.Context, opts client.ManifestAnnotateOptions) error\n\tAddManifest(ctx context.Context, opts client.ManifestAddOptions) 
error\n\tDeleteManifest(name []string) error\n\tRemoveManifest(name string, images []string) error\n\tPushManifest(client.PushManifestOptions) error\n\tInspectManifest(string) error\n}\n\nfunc AddHelpFlag(cmd *cobra.Command, commandName string) {\n\tcmd.Flags().BoolP(\"help\", \"h\", false, fmt.Sprintf(\"Help for '%s'\", commandName))\n}\n\nfunc CreateCancellableContext() context.Context {\n\tsignals := make(chan os.Signal, 1)\n\tsignal.Notify(signals, syscall.SIGINT, syscall.SIGTERM)\n\tctx, cancel := context.WithCancel(context.Background())\n\n\tgo func() {\n\t\t<-signals\n\t\tcancel()\n\t}()\n\n\treturn ctx\n}\n\nfunc logError(logger logging.Logger, f func(cmd *cobra.Command, args []string) error) func(*cobra.Command, []string) error {\n\treturn func(cmd *cobra.Command, args []string) error {\n\t\tcmd.SilenceErrors = true\n\t\tcmd.SilenceUsage = true\n\t\terr := f(cmd, args)\n\t\tif err != nil {\n\t\t\tif _, isSoftError := errors.Cause(err).(client.SoftError); !isSoftError {\n\t\t\t\tlogger.Error(err.Error())\n\t\t\t}\n\n\t\t\tif _, isExpError := errors.Cause(err).(client.ExperimentError); isExpError {\n\t\t\t\tconfigPath, err := config.DefaultConfigPath()\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t\tenableExperimentalTip(logger, configPath)\n\t\t\t}\n\t\t\treturn err\n\t\t}\n\t\treturn nil\n\t}\n}\n\nfunc enableExperimentalTip(logger logging.Logger, configPath string) {\n\tlogging.Tip(logger, \"To enable experimental features, run `pack config experimental true` to add %s to %s.\", style.Symbol(\"experimental = true\"), style.Symbol(configPath))\n}\n\nfunc stringArrayHelp(name string) string {\n\treturn fmt.Sprintf(\"\\nRepeat for each %s in order (comma-separated lists not accepted)\", name)\n}\n\nfunc stringSliceHelp(name string) string {\n\treturn fmt.Sprintf(\"\\nRepeat for each %s in order, or supply once by comma-separated list\", name)\n}\n\nfunc getMirrors(config config.Config) map[string][]string {\n\tmirrors := 
map[string][]string{}\n\tfor _, ri := range config.RunImages {\n\t\tmirrors[ri.Image] = ri.Mirrors\n\t}\n\treturn mirrors\n}\n\nfunc deprecationWarning(logger logging.Logger, oldCmd, replacementCmd string) {\n\tlogger.Warnf(\"Command %s has been deprecated, please use %s instead\", style.Symbol(\"pack \"+oldCmd), style.Symbol(\"pack \"+replacementCmd))\n}\n\nfunc parseFormatFlag(value string) (types.MediaType, error) {\n\tvar format types.MediaType\n\tswitch value {\n\tcase \"oci\":\n\t\tformat = types.OCIImageIndex\n\tcase \"docker\":\n\t\tformat = types.DockerManifestList\n\tdefault:\n\t\treturn format, errors.Errorf(\"%s invalid media type format\", value)\n\t}\n\treturn format, nil\n}\n\n// processMultiArchitectureConfig takes an array of targets with format: [os][/arch][/variant]:[distroname@osversion@anotherversion];[distroname@osversion]\n// and a list of targets defined in a configuration file (buildpack.toml or package.toml) and creates a multi-architecture configuration\nfunc processMultiArchitectureConfig(logger logging.Logger, userTargets []string, configTargets []dist.Target, daemon bool) (*buildpack.MultiArchConfig, error) {\n\tvar (\n\t\texpectedTargets []dist.Target\n\t\terr             error\n\t)\n\tif len(userTargets) > 0 {\n\t\tif expectedTargets, err = target.ParseTargets(userTargets, logger); err != nil {\n\t\t\treturn &buildpack.MultiArchConfig{}, err\n\t\t}\n\t\tif len(expectedTargets) > 1 && daemon {\n\t\t\t// when exporting to the daemon, only one target is allowed\n\t\t\treturn &buildpack.MultiArchConfig{}, errors.Errorf(\"when exporting to daemon only one target is allowed\")\n\t\t}\n\t}\n\n\tmultiArchCfg, err := buildpack.NewMultiArchConfig(configTargets, expectedTargets, logger)\n\tif err != nil {\n\t\treturn &buildpack.MultiArchConfig{}, err\n\t}\n\treturn multiArchCfg, nil\n}\n"
  },
  {
    "path": "internal/commands/completion.go",
    "content": "package commands\n\nimport (\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype CompletionFlags struct {\n\tShell string\n}\n\ntype completionFunc func(packHome string, cmd *cobra.Command) (path string, err error)\n\nvar shellExtensions = map[string]completionFunc{\n\t\"bash\": func(packHome string, cmd *cobra.Command) (path string, err error) {\n\t\tp := filepath.Join(packHome, \"completion.sh\")\n\t\treturn p, cmd.GenBashCompletionFile(p)\n\t},\n\t\"fish\": func(packHome string, cmd *cobra.Command) (path string, err error) {\n\t\tp := filepath.Join(packHome, \"completion.fish\")\n\t\treturn p, cmd.GenFishCompletionFile(p, true)\n\t},\n\t\"powershell\": func(packHome string, cmd *cobra.Command) (path string, err error) {\n\t\tp := filepath.Join(packHome, \"completion.ps1\")\n\t\treturn p, cmd.GenPowerShellCompletionFile(p)\n\t},\n\t\"zsh\": func(packHome string, cmd *cobra.Command) (path string, err error) {\n\t\tp := filepath.Join(packHome, \"completion.zsh\")\n\t\treturn p, cmd.GenZshCompletionFile(p)\n\t},\n}\n\nfunc CompletionCommand(logger logging.Logger, packHome string) *cobra.Command {\n\tvar flags CompletionFlags\n\tvar completionCmd = &cobra.Command{\n\t\tUse:   \"completion\",\n\t\tShort: \"Outputs completion script location\",\n\t\tLong: `Generates completion script and outputs its location.\n\nTo configure your bash shell to load completions for each session, add the following to your '.bashrc' or '.bash_profile':\n\n\t. $(pack completion)\n\nTo configure your fish shell to load completions for each session, add the following to your '~/.config/fish/config.fish':\n\n\tsource (pack completion --shell fish)\n\nTo configure your powershell to load completions for each session, add the following to your '$Home\\[My ]Documents\\PowerShell\\\nMicrosoft.PowerShell_profile.ps1':\n\n\t. 
$(pack completion --shell powershell)\n\nTo configure your zsh shell to load completions for each session, add the following to your '.zshrc':\n\n\t. $(pack completion --shell zsh)\n\n  \n\t`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tcompletionFunc, ok := shellExtensions[flags.Shell]\n\t\t\tif !ok {\n\t\t\t\treturn errors.Errorf(\"%s is unsupported shell\", flags.Shell)\n\t\t\t}\n\n\t\t\tif err := os.MkdirAll(packHome, os.ModePerm); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tcompletionPath, err := completionFunc(packHome, cmd.Parent())\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tlogger.Info(completionPath)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcompletionCmd.Flags().StringVarP(&flags.Shell, \"shell\", \"s\", \"bash\", \"Generates completion file for [bash|fish|powershell|zsh]\")\n\treturn completionCmd\n}\n"
  },
  {
    "path": "internal/commands/completion_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestCompletionCommand(t *testing.T) {\n\tspec.Run(t, \"Commands\", testCompletionCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testCompletionCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand  *cobra.Command\n\t\tlogger   logging.Logger\n\t\toutBuf   bytes.Buffer\n\t\tpackHome string\n\t\tassert   = h.NewAssertionManager(t)\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tvar err error\n\t\tpackHome, err = os.MkdirTemp(\"\", \"\")\n\t\tassert.Nil(err)\n\n\t\t// the CompletionCommand calls a method on its Parent(), so it needs to have\n\t\t// one.\n\t\tcommand = &cobra.Command{}\n\t\tcommand.AddCommand(commands.CompletionCommand(logger, packHome))\n\t\tcommand.SetArgs([]string{\"completion\"})\n\t})\n\n\tit.After(func() {\n\t\tos.RemoveAll(packHome)\n\t})\n\n\twhen(\"#CompletionCommand\", func() {\n\t\twhen(\"Shell flag is empty(default value)\", func() {\n\t\t\tit(\"errors should not be occurred\", func() {\n\t\t\t\tassert.Nil(command.Execute())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"PackHome directory does not exist\", func() {\n\t\t\tit(\"should create completion file\", func() {\n\t\t\t\tmissingDir := filepath.Join(packHome, \"not-found\")\n\n\t\t\t\tcommand = &cobra.Command{}\n\t\t\t\tcommand.AddCommand(commands.CompletionCommand(logger, missingDir))\n\t\t\t\tcommand.SetArgs([]string{\"completion\"})\n\n\t\t\t\tassert.Nil(command.Execute())\n\t\t\t\tassert.Contains(outBuf.String(), filepath.Join(missingDir, \"completion.sh\"))\n\t\t\t})\n\t\t})\n\n\t\tfor _, test := range []struct {\n\t\t\tshell     string\n\t\t\textension 
string\n\t\t}{\n\t\t\t{shell: \"bash\", extension: \".sh\"},\n\t\t\t{shell: \"fish\", extension: \".fish\"},\n\t\t\t{shell: \"powershell\", extension: \".ps1\"},\n\t\t\t{shell: \"zsh\", extension: \".zsh\"},\n\t\t} {\n\t\t\tshell := test.shell\n\t\t\textension := test.extension\n\n\t\t\twhen(\"shell is \"+shell, func() {\n\t\t\t\tit(\"should create completion file ending in \"+extension, func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"completion\", \"--shell\", shell})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedFile := filepath.Join(packHome, \"completion\"+extension)\n\t\t\t\t\tassert.Contains(outBuf.String(), expectedFile)\n\t\t\t\t\tassert.FileExists(expectedFile)\n\t\t\t\t\tassert.FileIsNotEmpty(expectedFile)\n\t\t\t\t})\n\t\t\t})\n\t\t}\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc NewConfigCommand(logger logging.Logger, cfg config.Config, cfgPath string, client PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"config\",\n\t\tShort: \"Interact with your local pack config file\",\n\t\tRunE:  nil,\n\t}\n\n\tcmd.AddCommand(ConfigDefaultBuilder(logger, cfg, cfgPath, client))\n\tcmd.AddCommand(ConfigExperimental(logger, cfg, cfgPath))\n\tcmd.AddCommand(ConfigPullPolicy(logger, cfg, cfgPath))\n\tcmd.AddCommand(ConfigRegistries(logger, cfg, cfgPath))\n\tcmd.AddCommand(ConfigRunImagesMirrors(logger, cfg, cfgPath))\n\tcmd.AddCommand(ConfigTrustedBuilder(logger, cfg, cfgPath))\n\tcmd.AddCommand(ConfigLifecycleImage(logger, cfg, cfgPath))\n\tcmd.AddCommand(ConfigRegistryMirrors(logger, cfg, cfgPath))\n\n\tAddHelpFlag(cmd, \"config\")\n\treturn cmd\n}\n\ntype editCfgFunc func(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error\n\nfunc generateAdd(cmdName string, logger logging.Logger, cfg config.Config, cfgPath string, addFunc editCfgFunc) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"add\",\n\t\tArgs:  cobra.ExactArgs(1),\n\t\tShort: fmt.Sprintf(\"Add a %s\", cmdName),\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\treturn addFunc(args, logger, cfg, cfgPath)\n\t\t}),\n\t}\n\n\treturn cmd\n}\n\nfunc generateRemove(cmdName string, logger logging.Logger, cfg config.Config, cfgPath string, rmFunc editCfgFunc) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"remove\",\n\t\tArgs:  cobra.ExactArgs(1),\n\t\tShort: fmt.Sprintf(\"Remove a %s\", cmdName),\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\treturn rmFunc(args, logger, cfg, cfgPath)\n\t\t}),\n\t}\n\n\treturn cmd\n}\n\ntype listFunc func(args []string, logger logging.Logger, cfg 
config.Config)\n\nfunc generateListCmd(cmdName string, logger logging.Logger, cfg config.Config, listFunc listFunc) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"list\",\n\t\tArgs:  cobra.MaximumNArgs(1),\n\t\tShort: fmt.Sprintf(\"List %s\", cmdName),\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tlistFunc(args, logger, cfg)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/config_default_builder.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nvar suggestedBuilderString = \"For suggested builders, run `pack builder suggest`.\"\n\nfunc ConfigDefaultBuilder(logger logging.Logger, cfg config.Config, cfgPath string, client PackClient) *cobra.Command {\n\tvar unset bool\n\n\tcmd := &cobra.Command{\n\t\tUse:   \"default-builder\",\n\t\tArgs:  cobra.MaximumNArgs(1),\n\t\tShort: \"List, set and unset the default builder used by other commands\",\n\t\tLong: \"List, set, and unset the default builder used by other commands.\\n\\n\" +\n\t\t\t\"* To list your default builder, run `pack config default-builder`.\\n\" +\n\t\t\t\"* To set your default builder, run `pack config default-builder <builder-name>`.\\n\" +\n\t\t\t\"* To unset your default builder, run `pack config default-builder --unset`.\\n\\n\" +\n\t\t\tsuggestedBuilderString,\n\t\tExample: \"pack config default-builder cnbs/sample-builder:bionic\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tswitch {\n\t\t\tcase unset:\n\t\t\t\tif cfg.DefaultBuilder == \"\" {\n\t\t\t\t\tlogger.Info(\"No default builder was set\")\n\t\t\t\t} else {\n\t\t\t\t\toldBuilder := cfg.DefaultBuilder\n\t\t\t\t\tcfg.DefaultBuilder = \"\"\n\t\t\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\t\t\treturn errors.Wrapf(err, \"failed to write to config at %s\", cfgPath)\n\t\t\t\t\t}\n\t\t\t\t\tlogger.Infof(\"Successfully unset default builder %s\", style.Symbol(oldBuilder))\n\t\t\t\t}\n\t\t\tcase len(args) == 0:\n\t\t\t\tif cfg.DefaultBuilder != \"\" {\n\t\t\t\t\tlogger.Infof(\"The current default builder is %s\", style.Symbol(cfg.DefaultBuilder))\n\t\t\t\t} else {\n\t\t\t\t\tlogger.Infof(\"No default builder is set. 
\\n\\n%s\", suggestedBuilderString)\n\t\t\t\t}\n\t\t\t\treturn nil\n\t\t\tdefault:\n\t\t\t\timageName := args[0]\n\t\t\t\tif err := validateBuilderExists(logger, imageName, client); err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"validating that builder %s exists\", style.Symbol(imageName))\n\t\t\t\t}\n\n\t\t\t\tcfg.DefaultBuilder = imageName\n\t\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"failed to write to config at %s\", cfgPath)\n\t\t\t\t}\n\t\t\t\tlogger.Infof(\"Builder %s is now the default builder\", style.Symbol(imageName))\n\t\t\t}\n\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().BoolVarP(&unset, \"unset\", \"u\", false, \"Unset the current default builder\")\n\tAddHelpFlag(cmd, \"config default-builder\")\n\treturn cmd\n}\n\nfunc validateBuilderExists(logger logging.Logger, imageName string, client PackClient) error {\n\tlogger.Debug(\"Verifying local image...\")\n\tinfo, err := client.InspectBuilder(imageName, true)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif info == nil {\n\t\tlogger.Debug(\"Verifying remote image...\")\n\t\tinfo, err := client.InspectBuilder(imageName, false)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"failed to inspect remote image %s\", style.Symbol(imageName))\n\t\t}\n\n\t\tif info == nil {\n\t\t\treturn fmt.Errorf(\"builder %s not found\", style.Symbol(imageName))\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/config_default_builder_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigDefaultBuilder(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ConfigDefaultBuilderCommand\", testConfigDefaultBuilder, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigDefaultBuilder(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd            *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\ttempPackHome   string\n\t\tconfigPath     string\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\t\tcmd = commands.ConfigDefaultBuilder(logger, config.Config{}, configPath, mockClient)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#ConfigDefaultBuilder\", func() {\n\t\twhen(\"no args\", func() {\n\t\t\tit(\"lists current default builder if one is set\", func() {\n\t\t\t\tcmd = commands.ConfigDefaultBuilder(logger, config.Config{DefaultBuilder: 
\"some/builder\"}, configPath, mockClient)\n\t\t\t\tcmd.SetArgs([]string{})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"some/builder\")\n\t\t\t})\n\n\t\t\tit(\"suggests setting a builder if none is set\", func() {\n\t\t\t\tcmd.SetArgs([]string{})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"No default builder is set.\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"run `pack builder suggest`\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"unset\", func() {\n\t\t\tit(\"unsets current default builder\", func() {\n\t\t\t\tcfg := config.Config{DefaultBuilder: \"some/builder\"}\n\t\t\t\th.AssertNil(t, config.Write(cfg, configPath))\n\t\t\t\tcmd = commands.ConfigDefaultBuilder(logger, cfg, configPath, mockClient)\n\t\t\t\tcmd.SetArgs([]string{\"--unset\"})\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tcfg, err = config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.DefaultBuilder, \"\")\n\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"Successfully unset default builder %s\", style.Symbol(\"some/builder\")))\n\t\t\t})\n\n\t\t\tit(\"clarifies if no builder was set\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"--unset\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"No default builder was set\")\n\t\t\t})\n\n\t\t\tit(\"gives clear error if unable to write to config\", func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(configPath, []byte(\"some-data\"), 0001))\n\t\t\t\tcmd = commands.ConfigDefaultBuilder(logger, config.Config{DefaultBuilder: \"some/builder\"}, configPath, mockClient)\n\t\t\t\tcmd.SetArgs([]string{\"--unset\"})\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"failed to write to config at \"+configPath)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"set\", func() {\n\t\t\twhen(\"valid builder is provided\", func() {\n\t\t\t\twhen(\"in local\", func() {\n\t\t\t\t\tvar imageName = 
\"some/image\"\n\n\t\t\t\t\tit(\"sets default builder\", func() {\n\t\t\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, true).Return(&client.BuilderInfo{\n\t\t\t\t\t\t\tStack: \"test.stack.id\",\n\t\t\t\t\t\t}, nil)\n\n\t\t\t\t\t\tcmd.SetArgs([]string{imageName})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"Builder '%s' is now the default builder\", imageName))\n\n\t\t\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, cfg.DefaultBuilder, \"some/image\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"gives clear error if unable to write to config\", func() {\n\t\t\t\t\t\th.AssertNil(t, os.WriteFile(configPath, []byte(\"some-data\"), 0001))\n\t\t\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, true).Return(&client.BuilderInfo{\n\t\t\t\t\t\t\tStack: \"test.stack.id\",\n\t\t\t\t\t\t}, nil)\n\t\t\t\t\t\tcmd = commands.ConfigDefaultBuilder(logger, config.Config{}, configPath, mockClient)\n\t\t\t\t\t\tcmd.SetArgs([]string{imageName})\n\t\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\t\th.AssertError(t, err, \"failed to write to config at \"+configPath)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"in remote\", func() {\n\t\t\t\t\tit(\"sets default builder\", func() {\n\t\t\t\t\t\timageName := \"some/image\"\n\n\t\t\t\t\t\tlocalCall := mockClient.EXPECT().InspectBuilder(imageName, true).Return(nil, nil)\n\n\t\t\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, false).Return(&client.BuilderInfo{\n\t\t\t\t\t\t\tStack: \"test.stack.id\",\n\t\t\t\t\t\t}, nil).After(localCall)\n\n\t\t\t\t\t\tcmd.SetArgs([]string{imageName})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"Builder '%s' is now the default builder\", imageName))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"gives clear error if unable to inspect remote image\", func() {\n\t\t\t\t\t\timageName := \"some/image\"\n\n\t\t\t\t\t\tlocalCall := 
mockClient.EXPECT().InspectBuilder(imageName, true).Return(nil, nil)\n\n\t\t\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, false).Return(&client.BuilderInfo{\n\t\t\t\t\t\t\tStack: \"test.stack.id\",\n\t\t\t\t\t\t}, client.SoftError{}).After(localCall)\n\n\t\t\t\t\t\tcmd.SetArgs([]string{imageName})\n\t\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"failed to inspect remote image %s\", style.Symbol(imageName)))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"invalid builder is provided\", func() {\n\t\t\t\tit(\"error is presented\", func() {\n\t\t\t\t\timageName := \"nonbuilder/image\"\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, true).Return(\n\t\t\t\t\t\tnil,\n\t\t\t\t\t\tfmt.Errorf(\"failed to inspect image %s\", imageName))\n\n\t\t\t\t\tcmd.SetArgs([]string{imageName})\n\n\t\t\t\t\th.AssertNotNil(t, cmd.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"validating that builder %s exists\", style.Symbol(\"nonbuilder/image\")))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"non-existent builder is provided\", func() {\n\t\t\t\tit(\"error is present\", func() {\n\t\t\t\t\timageName := \"nonexisting/image\"\n\n\t\t\t\t\tlocalCall := mockClient.EXPECT().InspectBuilder(imageName, true).Return(\n\t\t\t\t\t\tnil,\n\t\t\t\t\t\tnil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, false).Return(\n\t\t\t\t\t\tnil,\n\t\t\t\t\t\tnil).After(localCall)\n\n\t\t\t\t\tcmd.SetArgs([]string{imageName})\n\n\t\t\t\t\th.AssertNotNil(t, cmd.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"builder 'nonexisting/image' not found\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config_experimental.go",
    "content": "package commands\n\nimport (\n\t\"path/filepath\"\n\t\"strconv\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc ConfigExperimental(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"experimental [<true | false>]\",\n\t\tArgs:  cobra.MaximumNArgs(1),\n\t\tShort: \"List and set the current 'experimental' value from the config\",\n\t\tLong: \"Experimental features in pack are gated, and require adding `experimental=true` to the pack config, either manually or by using this command.\\n\\n\" +\n\t\t\t\"* Running `pack config experimental` prints whether experimental features are currently enabled.\\n\" +\n\t\t\t\"* Running `pack config experimental <true | false>` enables or disables experimental features.\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tswitch {\n\t\t\tcase len(args) == 0:\n\t\t\t\tif cfg.Experimental {\n\t\t\t\t\tlogger.Infof(\"Experimental features are enabled! To turn them off, run `pack config experimental false`\")\n\t\t\t\t} else {\n\t\t\t\t\tlogger.Info(\"Experimental features aren't currently enabled. 
To enable them, run `pack config experimental true`\")\n\t\t\t\t}\n\t\t\tdefault:\n\t\t\t\tval, err := strconv.ParseBool(args[0])\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"invalid value %s provided\", style.Symbol(args[0]))\n\t\t\t\t}\n\t\t\t\tcfg.Experimental = val\n\t\t\t\tif cfg.Experimental {\n\t\t\t\t\tcfg.LayoutRepositoryDir = filepath.Join(filepath.Dir(cfgPath), \"layout-repo\")\n\t\t\t\t} else {\n\t\t\t\t\tcfg.LayoutRepositoryDir = \"\"\n\t\t\t\t}\n\n\t\t\t\tif err = config.Write(cfg, cfgPath); err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"writing to config\")\n\t\t\t\t}\n\n\t\t\t\tif cfg.Experimental {\n\t\t\t\t\tlogger.Info(\"Experimental features enabled!\")\n\t\t\t\t} else {\n\t\t\t\t\tlogger.Info(\"Experimental features disabled\")\n\t\t\t\t}\n\t\t\t}\n\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"experimental\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/config_experimental_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigExperimental(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ConfigExperimentalCommand\", testConfigExperimental, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigExperimental(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd          *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t\tconfigPath   string\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcmd = commands.ConfigExperimental(logger, config.Config{}, configPath)\n\t\tcmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#ConfigExperimental\", func() {\n\t\twhen(\"list values\", func() {\n\t\t\tit(\"prints a clear message if false\", func() {\n\t\t\t\tcmd.SetArgs([]string{})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, \"Experimental features aren't currently enabled\")\n\t\t\t})\n\n\t\t\tit(\"prints a clear message if true\", func() {\n\t\t\t\tcmd = commands.ConfigExperimental(logger, config.Config{Experimental: true}, configPath)\n\t\t\t\tcmd.SetArgs([]string{})\n\t\t\t\th.AssertNil(t, 
cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, \"Experimental features are enabled!\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"set\", func() {\n\t\t\tit(\"sets true if provided\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"true\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, \"Experimental features enabled\")\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.Experimental, true)\n\n\t\t\t\t// oci layout repo is configured\n\t\t\t\tlayoutDir := filepath.Join(filepath.Dir(configPath), \"layout-repo\")\n\t\t\t\th.AssertEq(t, cfg.LayoutRepositoryDir, layoutDir)\n\t\t\t})\n\n\t\t\tit(\"sets false if provided\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"false\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, \"Experimental features disabled\")\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.Experimental, false)\n\n\t\t\t\t// oci layout repo is cleaned\n\t\t\t\th.AssertEq(t, cfg.LayoutRepositoryDir, \"\")\n\t\t\t})\n\n\t\t\tit(\"returns error if invalid value provided\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"disable-me\"})\n\t\t\t\th.AssertError(t, cmd.Execute(), fmt.Sprintf(\"invalid value %s provided\", style.Symbol(\"disable-me\")))\n\t\t\t\t// output := outBuf.String()\n\t\t\t\t// h.AssertContains(t, output, \"Experimental features disabled.\")\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.Experimental, false)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config_lifecycle_image.go",
    "content": "package commands\n\nimport (\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc ConfigLifecycleImage(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tvar unset bool\n\n\tcmd := &cobra.Command{\n\t\tUse:   \"lifecycle-image <lifecycle-image>\",\n\t\tArgs:  cobra.MaximumNArgs(1),\n\t\tShort: \"Configure a custom container image for the lifecycle\",\n\t\tLong: \"You can use this command to set a custom image to fetch the lifecycle from. \" +\n\t\t\t\"This will be used for untrusted builders. If unset, defaults to: \" + config.DefaultLifecycleImageRepo + \".\\n\" +\n\t\t\t\"For more on trusted builders, and when to trust or untrust a builder, \" +\n\t\t\t\"check out our docs here: https://buildpacks.io/docs/tools/pack/concepts/trusted_builders/\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tswitch {\n\t\t\tcase unset:\n\t\t\t\tif len(args) > 0 {\n\t\t\t\t\treturn errors.Errorf(\"lifecycle image and --unset cannot be specified simultaneously\")\n\t\t\t\t}\n\n\t\t\t\tif cfg.LifecycleImage == \"\" {\n\t\t\t\t\tlogger.Info(\"No custom lifecycle image was set.\")\n\t\t\t\t} else {\n\t\t\t\t\toldImage := cfg.LifecycleImage\n\t\t\t\t\tcfg.LifecycleImage = \"\"\n\t\t\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\t\t\treturn errors.Wrapf(err, \"failed to write to config at %s\", cfgPath)\n\t\t\t\t\t}\n\t\t\t\t\tlogger.Infof(\"Successfully unset custom lifecycle image %s\", style.Symbol(oldImage))\n\t\t\t\t}\n\t\t\tcase len(args) == 0:\n\t\t\t\tif cfg.LifecycleImage != \"\" {\n\t\t\t\t\tlogger.Infof(\"The current custom lifecycle image is %s\", style.Symbol(cfg.LifecycleImage))\n\t\t\t\t} else {\n\t\t\t\t\tlogger.Infof(\"No custom lifecycle image is set. 
Lifecycle images from %s repo will be used.\", style.Symbol(config.DefaultLifecycleImageRepo))\n\t\t\t\t}\n\t\t\t\treturn nil\n\t\t\tdefault:\n\t\t\t\timageName := args[0]\n\t\t\t\t_, err := name.ParseReference(imageName)\n\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"Invalid image name %s provided\", style.Symbol(imageName))\n\t\t\t\t}\n\t\t\t\tif imageName == cfg.LifecycleImage {\n\t\t\t\t\tlogger.Infof(\"Custom lifecycle image is already set to %s\", style.Symbol(imageName))\n\t\t\t\t\treturn nil\n\t\t\t\t}\n\n\t\t\t\tcfg.LifecycleImage = imageName\n\t\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"failed to write to config at %s\", cfgPath)\n\t\t\t\t}\n\t\t\t\tlogger.Infof(\"Image %s will now be used as the lifecycle image\", style.Symbol(imageName))\n\t\t\t}\n\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().BoolVarP(&unset, \"unset\", \"u\", false, \"Unset custom lifecycle image, and use the lifecycle images from \"+style.Symbol(config.DefaultLifecycleImageRepo))\n\tAddHelpFlag(cmd, \"lifecycle-image\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/config_lifecycle_image_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigLifecycleImage(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ConfigLifecycleImage\", testConfigLifecycleImageCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigLifecycleImageCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand      *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t\tconfigFile   string\n\t\tassert       = h.NewAssertionManager(t)\n\t\tcfg          = config.Config{}\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigFile = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcommand = commands.ConfigLifecycleImage(logger, cfg, configFile)\n\t\tcommand.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#ConfigLifecycleImage\", func() {\n\t\twhen(\"list\", func() {\n\t\t\twhen(\"no custom lifecycle image was set\", func() {\n\t\t\t\tit(\"lists the default\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tassert.Contains(outBuf.String(), config.DefaultLifecycleImageRepo)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"custom lifecycle-image was set\", func() {\n\t\t\t\tit(\"lists the custom image\", func() {\n\t\t\t\t\tcfg.LifecycleImage = 
\"custom-lifecycle/image:v1\"\n\t\t\t\t\tcommand = commands.ConfigLifecycleImage(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"custom-lifecycle/image:v1\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t\twhen(\"set\", func() {\n\t\t\twhen(\"custom lifecycle-image provided is the same as configured lifecycle-image\", func() {\n\t\t\t\tit(\"provides a helpful message\", func() {\n\t\t\t\t\tcfg.LifecycleImage = \"custom-lifecycle/image:v1\"\n\t\t\t\t\tcommand = commands.ConfigLifecycleImage(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"custom-lifecycle/image:v1\"})\n\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\toutput := outBuf.String()\n\t\t\t\t\th.AssertEq(t, strings.TrimSpace(output), `Custom lifecycle image is already set to 'custom-lifecycle/image:v1'`)\n\t\t\t\t})\n\t\t\t\tit(\"it does not change the configured\", func() {\n\t\t\t\t\tcommand = commands.ConfigLifecycleImage(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"custom-lifecycle/image:v1\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\treadCfg, err := config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.LifecycleImage, \"custom-lifecycle/image:v1\")\n\n\t\t\t\t\tcommand = commands.ConfigLifecycleImage(logger, readCfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"custom-lifecycle/image:v1\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\treadCfg, err = config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.LifecycleImage, \"custom-lifecycle/image:v1\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"valid lifecycle-image is specified\", func() {\n\t\t\t\tit(\"sets the lifecycle-image in config\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"custom-lifecycle/image:v1\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\treadCfg, err := 
config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.LifecycleImage, \"custom-lifecycle/image:v1\")\n\t\t\t\t})\n\t\t\t\tit(\"returns clear error if fails to write\", func() {\n\t\t\t\t\tassert.Nil(os.WriteFile(configFile, []byte(\"something\"), 0001))\n\t\t\t\t\tcommand := commands.ConfigLifecycleImage(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"custom-lifecycle/image:v1\"})\n\t\t\t\t\tassert.ErrorContains(command.Execute(), \"failed to write to config at\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"invalid lifecycle-image is specified\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"custom$1#-lifecycle/image-repo\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, `Invalid image name`)\n\t\t\t\t})\n\t\t\t\tit(\"returns clear error if fails to write\", func() {\n\t\t\t\t\tassert.Nil(os.WriteFile(configFile, []byte(\"something\"), 0001))\n\t\t\t\t\tcommand := commands.ConfigLifecycleImage(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"custom-lifecycle/image:v1\"})\n\t\t\t\t\tassert.ErrorContains(command.Execute(), \"failed to write to config at\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t\twhen(\"unset\", func() {\n\t\t\twhen(\"the custom lifecycle image is set\", func() {\n\t\t\t\tit(\"removes set lifecycle image and resets to default lifecycle image\", func() {\n\t\t\t\t\tcommand = commands.ConfigLifecycleImage(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"custom-lifecycle/image:v1\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\treadCfg, err := config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.LifecycleImage, \"custom-lifecycle/image:v1\")\n\n\t\t\t\t\tcommand = commands.ConfigLifecycleImage(logger, readCfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\treadCfg, err = 
config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.LifecycleImage, \"\")\n\t\t\t\t})\n\t\t\t\tit(\"returns clear error if fails to write\", func() {\n\t\t\t\t\tassert.Nil(os.WriteFile(configFile, []byte(\"something\"), 0001))\n\t\t\t\t\tcommand := commands.ConfigLifecycleImage(logger, config.Config{LifecycleImage: \"custom-lifecycle/image:v1\"}, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\t\tassert.ErrorContains(command.Execute(), \"failed to write to config at\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"the custom lifecycle image is not set\", func() {\n\t\t\t\tit(\"returns clear message that no custom lifecycle image is set\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\t\t\t\t\toutput := outBuf.String()\n\t\t\t\t\th.AssertEq(t, strings.TrimSpace(output), `No custom lifecycle image was set.`)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t\twhen(\"--unset and lifecycle image to set is provided\", func() {\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"custom-lifecycle/image:v1\",\n\t\t\t\t\t\"--unset\",\n\t\t\t\t})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, `lifecycle image and --unset cannot be specified simultaneously`)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config_pull_policy.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc ConfigPullPolicy(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tvar unset bool\n\n\tcmd := &cobra.Command{\n\t\tUse:   \"pull-policy <always | if-not-present | never>\",\n\t\tArgs:  cobra.MaximumNArgs(1),\n\t\tShort: \"List, set and unset the global pull policy used by other commands\",\n\t\tLong: \"You can use this command to list, set, and unset the default pull policy that will be used when working with containers:\\n\" +\n\t\t\t\"* To list your pull policy, run `pack config pull-policy`.\\n\" +\n\t\t\t\"* To set your pull policy, run `pack config pull-policy <always | if-not-present | never>`.\\n\" +\n\t\t\t\"* To unset your pull policy, run `pack config pull-policy --unset`.\\n\" +\n\t\t\tfmt.Sprintf(\"Unsetting the pull policy will reset the policy to the default, which is %s\", style.Symbol(\"always\")),\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tswitch {\n\t\t\tcase unset:\n\t\t\t\tif len(args) > 0 {\n\t\t\t\t\treturn errors.Errorf(\"pull policy and --unset cannot be specified simultaneously\")\n\t\t\t\t}\n\t\t\t\toldPullPolicy := cfg.PullPolicy\n\t\t\t\tcfg.PullPolicy = \"\"\n\t\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"writing config to %s\", cfgPath)\n\t\t\t\t}\n\n\t\t\t\tpullPolicy, err := image.ParsePullPolicy(cfg.PullPolicy)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\n\t\t\t\tlogger.Infof(\"Successfully unset pull policy %s\", style.Symbol(oldPullPolicy))\n\t\t\t\tlogger.Infof(\"Pull policy has been set to %s\", style.Symbol(pullPolicy.String()))\n\t\t\tcase len(args) == 0: // 
list\n\t\t\t\tpullPolicy, err := image.ParsePullPolicy(cfg.PullPolicy)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\n\t\t\t\tlogger.Infof(\"The current pull policy is %s\", style.Symbol(pullPolicy.String()))\n\t\t\tdefault: // set\n\t\t\t\tnewPullPolicy := args[0]\n\n\t\t\t\tif newPullPolicy == cfg.PullPolicy {\n\t\t\t\t\tlogger.Infof(\"Pull policy is already set to %s\", style.Symbol(newPullPolicy))\n\t\t\t\t\treturn nil\n\t\t\t\t}\n\n\t\t\t\tpullPolicy, err := image.ParsePullPolicy(newPullPolicy)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\n\t\t\t\tcfg.PullPolicy = newPullPolicy\n\t\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"writing config to %s\", cfgPath)\n\t\t\t\t}\n\n\t\t\t\tlogger.Infof(\"Successfully set %s as the pull policy\", style.Symbol(pullPolicy.String()))\n\t\t\t}\n\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().BoolVarP(&unset, \"unset\", \"u\", false, \"Unset pull policy, and set it back to the default pull-policy, which is \"+style.Symbol(\"always\"))\n\tAddHelpFlag(cmd, \"pull-policy\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/config_pull_policy_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigPullPolicy(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ConfigPullPolicyCommand\", testConfigPullPolicyCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigPullPolicyCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand      *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t\tconfigFile   string\n\t\tassert       = h.NewAssertionManager(t)\n\t\tcfg          = config.Config{}\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigFile = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcommand = commands.ConfigPullPolicy(logger, cfg, configFile)\n\t\tcommand.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#ConfigPullPolicy\", func() {\n\t\twhen(\"list\", func() {\n\t\t\twhen(\"no policy is specified\", func() {\n\t\t\t\tit(\"lists default pull policy\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"always\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"policy set to always in config\", func() {\n\t\t\t\tit(\"lists always as pull policy\", func() {\n\t\t\t\t\tcfg.PullPolicy = \"always\"\n\t\t\t\t\tcommand = 
commands.ConfigPullPolicy(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"always\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"policy set to never in config\", func() {\n\t\t\t\tit(\"lists never as pull policy\", func() {\n\t\t\t\t\tcfg.PullPolicy = \"never\"\n\t\t\t\t\tcommand = commands.ConfigPullPolicy(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"never\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"policy set to if-not-present in config\", func() {\n\t\t\t\tit(\"lists if-not-present as pull policy\", func() {\n\t\t\t\t\tcfg.PullPolicy = \"if-not-present\"\n\t\t\t\t\tcommand = commands.ConfigPullPolicy(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"if-not-present\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t\twhen(\"set\", func() {\n\t\t\twhen(\"policy provided is the same as configured pull policy\", func() {\n\t\t\t\tit(\"provides a helpful message\", func() {\n\t\t\t\t\tcfg.PullPolicy = \"if-not-present\"\n\t\t\t\t\tcommand = commands.ConfigPullPolicy(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"if-not-present\"})\n\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\toutput := outBuf.String()\n\t\t\t\t\th.AssertEq(t, strings.TrimSpace(output), `Pull policy is already set to 'if-not-present'`)\n\t\t\t\t})\n\t\t\t\tit(\"it does not change the configured policy\", func() {\n\t\t\t\t\tcommand = commands.ConfigPullPolicy(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"never\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\treadCfg, err := config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.PullPolicy, \"never\")\n\n\t\t\t\t\tcommand = commands.ConfigPullPolicy(logger, cfg, 
configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"never\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\treadCfg, err = config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.PullPolicy, \"never\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"invalid policy is specified\", func() {\n\t\t\t\tit(\"does not write invalid policy to config\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"never\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\tcommand.SetArgs([]string{\"invalid-policy\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertError(t, err, `invalid pull policy invalid-policy`)\n\n\t\t\t\t\treadCfg, err := config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.PullPolicy, \"never\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"valid policy is specified\", func() {\n\t\t\t\tit(\"sets the policy in config\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"never\"})\n\t\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\t\treadCfg, err := config.Read(configFile)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(readCfg.PullPolicy, \"never\")\n\t\t\t\t})\n\t\t\t\tit(\"returns clear error if fails to write\", func() {\n\t\t\t\t\tassert.Nil(os.WriteFile(configFile, []byte(\"something\"), 0001))\n\t\t\t\t\tcommand := commands.ConfigPullPolicy(logger, cfg, configFile)\n\t\t\t\t\tcommand.SetArgs([]string{\"if-not-present\"})\n\t\t\t\t\tassert.ErrorContains(command.Execute(), \"writing config to\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t\twhen(\"unset\", func() {\n\t\t\tit(\"removes set policy and resets to default pull policy\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"never\"})\n\t\t\t\tcommand = commands.ConfigPullPolicy(logger, cfg, configFile)\n\n\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\tcfg, err := config.Read(configFile)\n\t\t\t\tassert.Nil(err)\n\t\t\t\tassert.Equal(cfg.PullPolicy, \"\")\n\t\t\t})\n\t\t\tit(\"returns clear error 
if fails to write\", func() {\n\t\t\t\tassert.Nil(os.WriteFile(configFile, []byte(\"something\"), 0001))\n\t\t\t\tcommand := commands.ConfigPullPolicy(logger, config.Config{PullPolicy: \"never\"}, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\tassert.ErrorContains(command.Execute(), \"writing config to\")\n\t\t\t})\n\t\t})\n\t\twhen(\"--unset and policy to set is provided\", func() {\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"never\",\n\t\t\t\t\t\"--unset\",\n\t\t\t\t})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, `pull policy and --unset cannot be specified simultaneously`)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config_registries.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/slices\"\n\t\"github.com/buildpacks/pack/internal/stringset\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/registry\"\n)\n\nvar (\n\tbpRegistryExplanation = \"A Buildpack Registry is a (still experimental) place to publish, store, and discover buildpacks. \"\n)\n\nvar (\n\tsetDefault   bool\n\tregistryType string\n)\n\nfunc ConfigRegistries(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"registries\",\n\t\tAliases: []string{\"registry\", \"registreis\"},\n\t\tShort:   \"List, add and remove registries\",\n\t\tLong:    bpRegistryExplanation + \"\\nYou can use the attached commands to list, add, and remove registries from your config\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tlistRegistries(args, logger, cfg)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tlistCmd := generateListCmd(\"registries\", logger, cfg, listRegistries)\n\tlistCmd.Example = \"pack config registries list\"\n\tlistCmd.Long = bpRegistryExplanation + \"List Registries saved in the pack config.\\n\\nShow the registries that are either added by default or have been explicitly added by using `pack config registries add`\"\n\tcmd.AddCommand(listCmd)\n\n\taddCmd := generateAdd(\"registries\", logger, cfg, cfgPath, addRegistry)\n\taddCmd.Args = cobra.ExactArgs(2)\n\taddCmd.Example = \"pack config registries add my-registry https://github.com/buildpacks/my-registry\"\n\taddCmd.Long = bpRegistryExplanation + \"Users can add registries from the config by using registries remove, and publish/yank buildpacks from it, as well as use those buildpacks when building 
applications.\"\n\taddCmd.Flags().BoolVar(&setDefault, \"default\", false, \"Set this buildpack registry as the default\")\n\taddCmd.Flags().StringVar(&registryType, \"type\", \"github\", \"Type of buildpack registry [git|github]\")\n\tcmd.AddCommand(addCmd)\n\n\trmCmd := generateRemove(\"registries\", logger, cfg, cfgPath, removeRegistry)\n\trmCmd.Example = \"pack config registries remove myregistry\"\n\trmCmd.Long = bpRegistryExplanation + \"Users can remove registries from the config by using `pack config registries remove <registry>`\"\n\tcmd.AddCommand(rmCmd)\n\n\tcmd.AddCommand(ConfigRegistriesDefault(logger, cfg, cfgPath))\n\n\tAddHelpFlag(cmd, \"registries\")\n\treturn cmd\n}\n\nfunc addRegistry(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error {\n\tnewRegistry := config.Registry{\n\t\tName: args[0],\n\t\tURL:  args[1],\n\t\tType: registryType,\n\t}\n\n\treturn addRegistryToConfig(logger, newRegistry, setDefault, cfg, cfgPath)\n}\n\nfunc addRegistryToConfig(logger logging.Logger, newRegistry config.Registry, setDefault bool, cfg config.Config, cfgPath string) error {\n\tif newRegistry.Name == config.OfficialRegistryName {\n\t\treturn errors.Errorf(\"%s is a reserved registry, please provide a different name\",\n\t\t\tstyle.Symbol(config.OfficialRegistryName))\n\t}\n\n\tif _, ok := stringset.FromSlice(registry.Types)[newRegistry.Type]; !ok {\n\t\treturn errors.Errorf(\"%s is not a valid type. 
Supported types are: %s.\",\n\t\t\tstyle.Symbol(newRegistry.Type),\n\t\t\tstrings.Join(slices.MapString(registry.Types, style.Symbol), \", \"))\n\t}\n\n\tif registriesContains(config.GetRegistries(cfg), newRegistry.Name) {\n\t\treturn errors.Errorf(\"Buildpack registry %s already exists.\",\n\t\t\tstyle.Symbol(newRegistry.Name))\n\t}\n\n\tif setDefault {\n\t\tcfg.DefaultRegistryName = newRegistry.Name\n\t}\n\tcfg.Registries = append(cfg.Registries, newRegistry)\n\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\treturn errors.Wrapf(err, \"writing config to %s\", cfgPath)\n\t}\n\n\tlogger.Infof(\"Successfully added %s to registries\", style.Symbol(newRegistry.Name))\n\treturn nil\n}\n\nfunc removeRegistry(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error {\n\tregistryName := args[0]\n\n\tif registryName == config.OfficialRegistryName {\n\t\treturn errors.Errorf(\"%s is a reserved registry name, please provide a different registry\",\n\t\t\tstyle.Symbol(config.OfficialRegistryName))\n\t}\n\n\tindex := findRegistryIndex(cfg.Registries, registryName)\n\tif index < 0 {\n\t\treturn errors.Errorf(\"registry %s does not exist\", style.Symbol(registryName))\n\t}\n\n\tcfg.Registries = removeBPRegistry(index, cfg.Registries)\n\tif cfg.DefaultRegistryName == registryName {\n\t\tcfg.DefaultRegistryName = config.OfficialRegistryName\n\t}\n\n\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\treturn errors.Wrapf(err, \"writing config to %s\", cfgPath)\n\t}\n\n\tlogger.Infof(\"Successfully removed %s from registries\", style.Symbol(registryName))\n\treturn nil\n}\n\nfunc listRegistries(args []string, logger logging.Logger, cfg config.Config) {\n\tfor _, currRegistry := range config.GetRegistries(cfg) {\n\t\tisDefaultRegistry := (currRegistry.Name == cfg.DefaultRegistryName) ||\n\t\t\t(currRegistry.Name == config.OfficialRegistryName && cfg.DefaultRegistryName == 
\"\")\n\n\t\tlogger.Info(fmtRegistry(\n\t\t\tcurrRegistry,\n\t\t\tisDefaultRegistry,\n\t\t\tlogger.IsVerbose()))\n\t}\n\tlogging.Tip(logger, \"Run %s to add additional registries\", style.Symbol(\"pack config registries add\"))\n}\n\n// Local private funcs\nfunc fmtRegistry(registry config.Registry, isDefaultRegistry, isVerbose bool) string {\n\tregistryOutput := fmt.Sprintf(\"  %s\", registry.Name)\n\tif isDefaultRegistry {\n\t\tregistryOutput = fmt.Sprintf(\"* %s\", registry.Name)\n\t}\n\tif isVerbose {\n\t\tregistryOutput = fmt.Sprintf(\"%-12s %s\", registryOutput, registry.URL)\n\t}\n\n\treturn registryOutput\n}\n\nfunc registriesContains(registries []config.Registry, registry string) bool {\n\treturn findRegistryIndex(registries, registry) != -1\n}\n\nfunc findRegistryIndex(registries []config.Registry, registryName string) int {\n\tfor index, r := range registries {\n\t\tif r.Name == registryName {\n\t\t\treturn index\n\t\t}\n\t}\n\n\treturn -1\n}\n\nfunc removeBPRegistry(index int, registries []config.Registry) []config.Registry {\n\tregistries[index] = registries[len(registries)-1]\n\treturn registries[:len(registries)-1]\n}\n"
  },
  {
    "path": "internal/commands/config_registries_default.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc ConfigRegistriesDefault(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tvar unset bool\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"default <name>\",\n\t\tArgs:    cobra.MaximumNArgs(1),\n\t\tShort:   \"Set default registry\",\n\t\tExample: \"pack config registries default myregistry\",\n\t\tLong: bpRegistryExplanation + \"\\n\\nYou can use this command to list, set, and unset a default registry, which will be used when looking for buildpacks:\" +\n\t\t\t\"* To list your default registry, run `pack config registries default`.\\n\" +\n\t\t\t\"* To set your default registry, run `pack config registries default <registry-name>`.\\n\" +\n\t\t\t\"* To unset your default registry, run `pack config registries default --unset`.\\n\" +\n\t\t\tfmt.Sprintf(\"Unsetting the default registry will reset the default-registry to be the official buildpacks registry, %s\", style.Symbol(config.DefaultRegistry().URL)),\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tswitch {\n\t\t\tcase unset:\n\t\t\t\tif cfg.DefaultRegistryName == \"\" || cfg.DefaultRegistryName == config.OfficialRegistryName {\n\t\t\t\t\treturn errors.Errorf(\"Registry %s is a protected registry, and can be replaced as a default registry, but not removed entirely. 
\"+\n\t\t\t\t\t\t\"To add a new registry and set as default, run `pack config registries add <registry-name> <registry-url> --default.\\n\"+\n\t\t\t\t\t\t\"To set an existing registry as default, call `pack config registries default <registry-name>`\", style.Symbol(config.OfficialRegistryName))\n\t\t\t\t}\n\t\t\t\toldRegistry := cfg.DefaultRegistryName\n\t\t\t\tcfg.DefaultRegistryName = \"\"\n\t\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\t\treturn errors.Wrapf(err, \"writing config to %s\", cfgPath)\n\t\t\t\t}\n\t\t\t\tlogger.Infof(\"Successfully unset default registry %s\", style.Symbol(oldRegistry))\n\t\t\t\tlogger.Infof(\"Default registry has been set to %s\", style.Symbol(config.OfficialRegistryName))\n\t\t\tcase len(args) == 0: // list\n\t\t\t\tif cfg.DefaultRegistryName == \"\" {\n\t\t\t\t\tcfg.DefaultRegistryName = config.OfficialRegistryName\n\t\t\t\t}\n\t\t\t\tlogger.Infof(\"The current default registry is %s\", style.Symbol(cfg.DefaultRegistryName))\n\t\t\tdefault: // set\n\t\t\t\tregistryName := args[0]\n\t\t\t\tif !registriesContains(config.GetRegistries(cfg), registryName) {\n\t\t\t\t\treturn errors.Errorf(\"no registry with the name %s exists\", style.Symbol(registryName))\n\t\t\t\t}\n\n\t\t\t\tif cfg.DefaultRegistryName != registryName {\n\t\t\t\t\tcfg.DefaultRegistryName = registryName\n\t\t\t\t\terr := config.Write(cfg, cfgPath)\n\t\t\t\t\tif err != nil {\n\t\t\t\t\t\treturn errors.Wrapf(err, \"writing config to %s\", cfgPath)\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\tlogger.Infof(\"Successfully set %s as the default registry\", style.Symbol(registryName))\n\t\t\t}\n\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().BoolVarP(&unset, \"unset\", \"u\", false, \"Unset the current default registry, and set it to the official buildpacks registry\")\n\tAddHelpFlag(cmd, \"default\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/config_registries_default_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigRegistriesDefault(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"ConfigRegistriesDefaultCommand\", testConfigRegistriesDefaultCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigRegistriesDefaultCommand(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#ConfigRegistriesDefault\", func() {\n\t\tvar (\n\t\t\ttmpDir     string\n\t\t\tconfigFile string\n\t\t\toutBuf     bytes.Buffer\n\t\t\tlogger     = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\tassert     = h.NewAssertionManager(t)\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack-home-*\")\n\t\t\tassert.Nil(err)\n\n\t\t\tconfigFile = filepath.Join(tmpDir, \"config.toml\")\n\t\t})\n\n\t\tit.After(func() {\n\t\t\t_ = os.RemoveAll(tmpDir)\n\t\t})\n\n\t\twhen(\"list\", func() {\n\t\t\tit(\"returns official if none is set\", func() {\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, config.Config{}, configFile)\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\tassert.Contains(outBuf.String(), \"official\")\n\t\t\t})\n\n\t\t\tit(\"returns the default registry if one is set\", func() {\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, config.Config{DefaultRegistryName: \"some-registry\"}, configFile)\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\tassert.Contains(outBuf.String(), 
\"some-registry\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"set default\", func() {\n\t\t\tit(\"should set the default registry\", func() {\n\t\t\t\tcfg := config.Config{\n\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"myregistry\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, cfg, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"myregistry\"})\n\t\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\t\tcfg, err := config.Read(configFile)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(cfg.DefaultRegistryName, \"myregistry\")\n\t\t\t})\n\n\t\t\tit(\"should fail if no corresponding registry exists\", func() {\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, config.Config{}, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"myregistry\"})\n\t\t\t\tassert.Error(command.Execute())\n\n\t\t\t\toutput := outBuf.String()\n\t\t\t\tassert.Contains(output, \"no registry with the name 'myregistry' exists\")\n\t\t\t})\n\n\t\t\tit(\"returns clear error if fails to write\", func() {\n\t\t\t\tassert.Nil(os.WriteFile(configFile, []byte(\"something\"), 0001))\n\t\t\t\tcfg := config.Config{\n\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"myregistry\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, cfg, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"myregistry\"})\n\t\t\t\tassert.ErrorContains(command.Execute(), \"writing config to\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--unset\", func() {\n\t\t\tit(\"should unset the default registry, if set\", func() {\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, config.Config{DefaultRegistryName: \"some-registry\"}, 
configFile)\n\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\tcfg, err := config.Read(configFile)\n\t\t\t\tassert.Nil(err)\n\t\t\t\tassert.Equal(cfg.DefaultRegistryName, \"\")\n\t\t\t\tassert.Contains(outBuf.String(), fmt.Sprintf(\"Successfully unset default registry %s\", style.Symbol(\"some-registry\")))\n\t\t\t})\n\n\t\t\tit(\"should return an error if official registry is being unset\", func() {\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, config.Config{DefaultRegistryName: config.OfficialRegistryName}, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\tassert.ErrorContains(command.Execute(), fmt.Sprintf(\"Registry %s is a protected registry\", style.Symbol(config.OfficialRegistryName)))\n\t\t\t})\n\n\t\t\tit(\"should return a clear message if no registry is set\", func() {\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, config.Config{}, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\tassert.ErrorContains(command.Execute(), fmt.Sprintf(\"Registry %s is a protected registry\", style.Symbol(config.OfficialRegistryName)))\n\t\t\t})\n\n\t\t\tit(\"returns clear error if fails to write\", func() {\n\t\t\t\tassert.Nil(os.WriteFile(configFile, []byte(\"something\"), 0001))\n\t\t\t\tcommand := commands.ConfigRegistriesDefault(logger, config.Config{DefaultRegistryName: \"some-registry\"}, configFile)\n\t\t\t\tcommand.SetArgs([]string{\"--unset\"})\n\t\t\t\tassert.ErrorContains(command.Execute(), \"writing config to\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config_registries_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigRegistries(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ConfigRegistriesCommand\", testConfigRegistries, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigRegistries(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd               *cobra.Command\n\t\tlogger            logging.Logger\n\t\tcfgWithRegistries config.Config\n\t\toutBuf            bytes.Buffer\n\t\ttempPackHome      string\n\t\tconfigPath        string\n\t\tassert            = h.NewAssertionManager(t)\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\tassert.Nil(err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcmd = commands.ConfigRegistries(logger, config.Config{}, configPath)\n\t\tcmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\n\t\tcfgWithRegistries = config.Config{\n\t\t\tDefaultRegistryName: \"private registry\",\n\t\t\tRegistries: []config.Registry{\n\t\t\t\t{\n\t\t\t\t\tName: \"public registry\",\n\t\t\t\t\tType: \"github\",\n\t\t\t\t\tURL:  \"https://github.com/buildpacks/public-registry\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName: \"private registry\",\n\t\t\t\t\tType: \"github\",\n\t\t\t\t\tURL:  \"https://github.com/buildpacks/private-registry\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName: \"personal registry\",\n\t\t\t\t\tType: \"github\",\n\t\t\t\t\tURL:  
\"https://github.com/buildpacks/personal-registry\",\n\t\t\t\t},\n\t\t\t},\n\t\t}\n\t})\n\n\tit.After(func() {\n\t\tassert.Nil(os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"-h\", func() {\n\t\tit(\"prints help text\", func() {\n\t\t\tcmd.SetArgs([]string{\"-h\"})\n\t\t\tassert.Nil(cmd.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\tassert.Contains(output, \"Usage:\")\n\t\t\tfor _, command := range []string{\"add\", \"remove\", \"list\", \"default\"} {\n\t\t\t\tassert.Contains(output, command)\n\t\t\t}\n\t\t})\n\t})\n\n\twhen(\"no args\", func() {\n\t\tit(\"calls list\", func() {\n\t\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\tcfgWithRegistries = config.Config{\n\t\t\t\tDefaultRegistryName: \"private registry\",\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"public registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/public-registry\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"private registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/private-registry\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"personal registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/personal-registry\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t\tcmd = commands.ConfigRegistries(logger, cfgWithRegistries, configPath)\n\t\t\tcmd.SetArgs([]string{})\n\n\t\t\tassert.Nil(cmd.Execute())\n\t\t\tassert.Contains(outBuf.String(), \"public registry\")\n\t\t\tassert.Contains(outBuf.String(), \"* private registry\")\n\t\t\tassert.Contains(outBuf.String(), \"personal registry\")\n\t\t})\n\t})\n\n\twhen(\"list\", func() {\n\t\tvar args = []string{\"list\"}\n\t\tit.Before(func() {\n\t\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\tcmd = commands.ConfigRegistries(logger, cfgWithRegistries, configPath)\n\t\t\tcmd.SetArgs(args)\n\t\t})\n\n\t\tit(\"should list all registries\", func() 
{\n\t\t\tassert.Nil(cmd.Execute())\n\n\t\t\tassert.Contains(outBuf.String(), \"public registry\")\n\t\t\tassert.Contains(outBuf.String(), \"* private registry\")\n\t\t\tassert.Contains(outBuf.String(), \"personal registry\")\n\t\t})\n\n\t\tit(\"should list registries in verbose mode\", func() {\n\t\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\tcmd = commands.ConfigRegistries(logger, cfgWithRegistries, configPath)\n\t\t\tcmd.SetArgs(args)\n\t\t\tassert.Nil(cmd.Execute())\n\n\t\t\tassert.Contains(outBuf.String(), \"public registry\")\n\t\t\tassert.Contains(outBuf.String(), \"https://github.com/buildpacks/public-registry\")\n\n\t\t\tassert.Contains(outBuf.String(), \"* private registry\")\n\t\t\tassert.Contains(outBuf.String(), \"https://github.com/buildpacks/private-registry\")\n\n\t\t\tassert.Contains(outBuf.String(), \"personal registry\")\n\t\t\tassert.Contains(outBuf.String(), \"https://github.com/buildpacks/personal-registry\")\n\n\t\t\tassert.Contains(outBuf.String(), \"official\")\n\t\t\tassert.Contains(outBuf.String(), \"https://github.com/buildpacks/registry-index\")\n\t\t})\n\n\t\tit(\"should indicate official as the default registry by default\", func() {\n\t\t\tcfgWithRegistries.DefaultRegistryName = \"\"\n\t\t\tcmd = commands.ConfigRegistries(logger, cfgWithRegistries, configPath)\n\t\t\tcmd.SetArgs(args)\n\n\t\t\tassert.Nil(cmd.Execute())\n\n\t\t\tassert.Contains(outBuf.String(), \"* official\")\n\t\t\tassert.Contains(outBuf.String(), \"public registry\")\n\t\t\tassert.Contains(outBuf.String(), \"private registry\")\n\t\t\tassert.Contains(outBuf.String(), \"personal registry\")\n\t\t})\n\n\t\tit(\"should use official when no registries are defined\", func() {\n\t\t\tcmd = commands.ConfigRegistries(logger, config.Config{}, configPath)\n\t\t\tcmd.SetArgs(args)\n\n\t\t\tassert.Nil(cmd.Execute())\n\n\t\t\tassert.Contains(outBuf.String(), \"* official\")\n\t\t})\n\t})\n\n\twhen(\"add\", func() {\n\t\tvar (\n\t\t\targs 
= []string{\"add\", \"bp\", \"https://github.com/buildpacks/registry-index/\"}\n\t\t)\n\n\t\twhen(\"add buildpack registry\", func() {\n\t\t\tit(\"adds to registry list\", func() {\n\t\t\t\tcmd.SetArgs(args)\n\t\t\t\tassert.Succeeds(cmd.Execute())\n\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\tassert.Nil(err)\n\t\t\t\tassert.Equal(len(cfg.Registries), 1)\n\t\t\t\tassert.Equal(cfg.Registries[0].Name, \"bp\")\n\t\t\t\tassert.Equal(cfg.Registries[0].Type, \"github\")\n\t\t\t\tassert.Equal(cfg.Registries[0].URL, \"https://github.com/buildpacks/registry-index/\")\n\t\t\t\tassert.Equal(cfg.DefaultRegistryName, \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"default is true\", func() {\n\t\t\tit(\"sets newly added registry as the default\", func() {\n\t\t\t\tcmd.SetArgs(append(args, \"--default\"))\n\t\t\t\tassert.Succeeds(cmd.Execute())\n\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\tassert.Nil(err)\n\t\t\t\tassert.Equal(len(cfg.Registries), 1)\n\t\t\t\tassert.Equal(cfg.Registries[0].Name, \"bp\")\n\t\t\t\tassert.Equal(cfg.DefaultRegistryName, \"bp\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"validation\", func() {\n\t\t\tit(\"fails with missing args\", func() {\n\t\t\t\tcmd.SetOut(io.Discard)\n\t\t\t\tcmd.SetArgs([]string{\"add\"})\n\t\t\t\terr := cmd.Execute()\n\t\t\t\tassert.ErrorContains(err, \"accepts 2 arg\")\n\t\t\t})\n\n\t\t\tit(\"should validate type\", func() {\n\t\t\t\tcmd.SetArgs(append(args, \"--type=bogus\"))\n\t\t\t\tassert.Error(cmd.Execute())\n\n\t\t\t\toutput := outBuf.String()\n\t\t\t\tassert.Contains(output, \"'bogus' is not a valid type. 
Supported types are: 'git', 'github'.\")\n\t\t\t})\n\n\t\t\tit(\"should throw error when registry already exists\", func() {\n\t\t\t\tcommand := commands.ConfigRegistries(logger, config.Config{\n\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"bp\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index/\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}, configPath)\n\t\t\t\tcommand.SetArgs(args)\n\t\t\t\tassert.Error(command.Execute())\n\n\t\t\t\toutput := outBuf.String()\n\t\t\t\tassert.Contains(output, \"Buildpack registry 'bp' already exists.\")\n\t\t\t})\n\n\t\t\tit(\"should throw error when registry name is official\", func() {\n\t\t\t\tcmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t\t\t\tcmd.SetErr(&outBuf)\n\t\t\t\tcmd.SetArgs([]string{\"add\", \"official\", \"https://github.com/buildpacks/registry-index/\", \"--type=github\"})\n\n\t\t\t\tassert.Error(cmd.Execute())\n\n\t\t\t\toutput := outBuf.String()\n\t\t\t\tassert.Contains(output, \"'official' is a reserved registry, please provide a different name\")\n\t\t\t})\n\n\t\t\tit(\"returns clear error if fails to write\", func() {\n\t\t\t\tassert.Nil(os.WriteFile(configPath, []byte(\"something\"), 0001))\n\t\t\t\tcmd.SetArgs(args)\n\t\t\t\tassert.ErrorContains(cmd.Execute(), \"writing config to\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"remove\", func() {\n\t\tit.Before(func() {\n\t\t\tcmd = commands.ConfigRegistries(logger, cfgWithRegistries, configPath)\n\t\t})\n\n\t\tit(\"should remove the registry\", func() {\n\t\t\tcmd.SetArgs([]string{\"remove\", \"public registry\"})\n\t\t\tassert.Succeeds(cmd.Execute())\n\n\t\t\tnewCfg, err := config.Read(configPath)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(newCfg, config.Config{\n\t\t\t\tDefaultRegistryName: \"private registry\",\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"personal registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  
\"https://github.com/buildpacks/personal-registry\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"private registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/private-registry\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t})\n\n\t\tit(\"should remove the registry and matching default registry name\", func() {\n\t\t\tcmd.SetArgs([]string{\"remove\", \"private registry\"})\n\t\t\tassert.Succeeds(cmd.Execute())\n\n\t\t\tnewCfg, err := config.Read(configPath)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(newCfg, config.Config{\n\t\t\t\tDefaultRegistryName: config.OfficialRegistryName,\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"public registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/public-registry\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"personal registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/personal-registry\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t})\n\n\t\tit(\"should return error when registry does NOT already exist\", func() {\n\t\t\tcmd.SetArgs([]string{\"remove\", \"missing-registry\"})\n\t\t\tassert.Error(cmd.Execute())\n\n\t\t\toutput := outBuf.String()\n\t\t\tassert.Contains(output, \"registry 'missing-registry' does not exist\")\n\t\t})\n\n\t\tit(\"should throw error when registry name is official\", func() {\n\t\t\tcmd.SetArgs([]string{\"remove\", \"official\"})\n\t\t\tassert.Error(cmd.Execute())\n\n\t\t\toutput := outBuf.String()\n\t\t\tassert.Contains(output, \"'official' is a reserved registry name, please provide a different registry\")\n\t\t})\n\n\t\tit(\"returns clear error if fails to write\", func() {\n\t\t\tassert.Nil(os.WriteFile(configPath, []byte(\"something\"), 0001))\n\t\t\tcmd.SetArgs([]string{\"remove\", \"public registry\"})\n\t\t\tassert.ErrorContains(cmd.Execute(), \"writing config to\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config_registry_mirrors.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nvar registryMirror string\n\nfunc ConfigRegistryMirrors(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"registry-mirrors\",\n\t\tShort:   \"List, add and remove OCI registry mirrors\",\n\t\tAliases: []string{\"registry-mirror\"},\n\t\tArgs:    cobra.MaximumNArgs(3),\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tlistRegistryMirrors(args, logger, cfg)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tlistCmd := generateListCmd(cmd.Use, logger, cfg, listRegistryMirrors)\n\tlistCmd.Long = \"List all registry mirrors.\"\n\tlistCmd.Use = \"list\"\n\tlistCmd.Example = \"pack config registry-mirrors list\"\n\tcmd.AddCommand(listCmd)\n\n\taddCmd := generateAdd(\"mirror for a registry\", logger, cfg, cfgPath, addRegistryMirror)\n\taddCmd.Use = \"add <registry> [-m <mirror...]\"\n\taddCmd.Long = \"Set mirror for a given registry.\"\n\taddCmd.Example = \"pack config registry-mirrors add index.docker.io --mirror 10.0.0.1\\npack config registry-mirrors add '*' --mirror 10.0.0.1\"\n\taddCmd.Flags().StringVarP(&registryMirror, \"mirror\", \"m\", \"\", \"Registry mirror\")\n\tcmd.AddCommand(addCmd)\n\n\trmCmd := generateRemove(\"mirror for a registry\", logger, cfg, cfgPath, removeRegistryMirror)\n\trmCmd.Use = \"remove <registry>\"\n\trmCmd.Long = \"Remove mirror for a given registry.\"\n\trmCmd.Example = \"pack config registry-mirrors remove index.docker.io\"\n\tcmd.AddCommand(rmCmd)\n\n\tAddHelpFlag(cmd, \"run-image-mirrors\")\n\treturn cmd\n}\n\nfunc addRegistryMirror(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error {\n\tregistry := args[0]\n\tif registryMirror == \"\" 
{\n\t\tlogger.Infof(\"A registry mirror was not provided.\")\n\t\treturn nil\n\t}\n\n\tif cfg.RegistryMirrors == nil {\n\t\tcfg.RegistryMirrors = map[string]string{}\n\t}\n\n\tcfg.RegistryMirrors[registry] = registryMirror\n\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\treturn errors.Wrapf(err, \"failed to write to %s\", cfgPath)\n\t}\n\n\tlogger.Infof(\"Registry %s configured with mirror %s\", style.Symbol(registry), style.Symbol(registryMirror))\n\treturn nil\n}\n\nfunc removeRegistryMirror(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error {\n\tregistry := args[0]\n\t_, ok := cfg.RegistryMirrors[registry]\n\tif !ok {\n\t\tlogger.Infof(\"No registry mirror has been set for %s\", style.Symbol(registry))\n\t\treturn nil\n\t}\n\n\tdelete(cfg.RegistryMirrors, registry)\n\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\treturn errors.Wrapf(err, \"failed to write to %s\", cfgPath)\n\t}\n\n\tlogger.Infof(\"Removed mirror for %s\", style.Symbol(registry))\n\treturn nil\n}\n\nfunc listRegistryMirrors(args []string, logger logging.Logger, cfg config.Config) {\n\tif len(cfg.RegistryMirrors) == 0 {\n\t\tlogger.Info(\"No registry mirrors have been set\")\n\t\treturn\n\t}\n\n\tbuf := strings.Builder{}\n\tbuf.WriteString(\"Registry Mirrors:\\n\")\n\tfor registry, mirror := range cfg.RegistryMirrors {\n\t\tbuf.WriteString(fmt.Sprintf(\"  %s: %s\\n\", registry, style.Symbol(mirror)))\n\t}\n\n\tlogger.Info(buf.String())\n}\n"
  },
  {
    "path": "internal/commands/config_registry_mirrors_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigRegistryMirrors(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ConfigRunImageMirrorsCommand\", testConfigRegistryMirrorsCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigRegistryMirrorsCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd          *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t\tconfigPath   string\n\t\tregistry1    = \"index.docker.io\"\n\t\tregistry2    = \"us.gcr.io\"\n\t\ttestMirror1  = \"10.0.0.1\"\n\t\ttestMirror2  = \"10.0.0.2\"\n\t\ttestCfg      = config.Config{\n\t\t\tRegistryMirrors: map[string]string{\n\t\t\t\tregistry1: testMirror1,\n\t\t\t\tregistry2: testMirror2,\n\t\t\t},\n\t\t}\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcmd = commands.ConfigRegistryMirrors(logger, testCfg, configPath)\n\t\tcmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"-h\", func() {\n\t\tit(\"prints available commands\", func() {\n\t\t\tcmd.SetArgs([]string{\"-h\"})\n\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, 
\"Usage:\")\n\t\t\tfor _, command := range []string{\"add\", \"remove\", \"list\"} {\n\t\t\t\th.AssertContains(t, output, command)\n\t\t\t}\n\t\t})\n\t})\n\n\twhen(\"no arguments\", func() {\n\t\tit(\"lists registry mirrors\", func() {\n\t\t\tcmd.SetArgs([]string{})\n\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, strings.TrimSpace(output), `Registry Mirrors:`)\n\t\t\th.AssertContains(t, strings.TrimSpace(output), `index.docker.io: '10.0.0.1'`)\n\t\t\th.AssertContains(t, strings.TrimSpace(output), `us.gcr.io: '10.0.0.2'`)\n\t\t})\n\t})\n\n\twhen(\"add\", func() {\n\t\twhen(\"no registry is specified\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"add\"})\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config path doesn't exist\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tfakePath := filepath.Join(tempPackHome, \"not-exist.toml\")\n\t\t\t\th.AssertNil(t, os.WriteFile(fakePath, []byte(\"something\"), 0001))\n\t\t\t\tcmd = commands.ConfigRegistryMirrors(logger, config.Config{}, fakePath)\n\t\t\t\tcmd.SetArgs([]string{\"add\", registry1, \"-m\", testMirror1})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"failed to write to\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"mirrors are provided\", func() {\n\t\t\tit(\"adds them as mirrors to the config\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"add\", \"asia.gcr.io\", \"-m\", \"10.0.0.3\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg, config.Config{\n\t\t\t\t\tRegistryMirrors: map[string]string{\n\t\t\t\t\t\tregistry1:     testMirror1,\n\t\t\t\t\t\tregistry2:     testMirror2,\n\t\t\t\t\t\t\"asia.gcr.io\": \"10.0.0.3\",\n\t\t\t\t\t},\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"replaces pre-existing mirrors in the config\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"add\", 
registry1, \"-m\", \"10.0.0.3\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg, config.Config{\n\t\t\t\t\tRegistryMirrors: map[string]string{\n\t\t\t\t\t\tregistry1: \"10.0.0.3\",\n\t\t\t\t\t\tregistry2: testMirror2,\n\t\t\t\t\t},\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no mirrors are provided\", func() {\n\t\t\tit(\"preserves old mirrors, and prints helpful message\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"add\", registry1})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"A registry mirror was not provided\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"remove\", func() {\n\t\twhen(\"no registry is specified\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"remove\"})\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"registry provided isn't present\", func() {\n\t\t\tit(\"prints a clear message\", func() {\n\t\t\t\tfakeImage := \"not-set-image\"\n\t\t\t\tcmd.SetArgs([]string{\"remove\", fakeImage})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, fmt.Sprintf(\"No registry mirror has been set for %s\", style.Symbol(fakeImage)))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config path doesn't exist\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tfakePath := filepath.Join(tempPackHome, \"not-exist.toml\")\n\t\t\t\th.AssertNil(t, os.WriteFile(fakePath, []byte(\"something\"), 0001))\n\t\t\t\tcmd = commands.ConfigRegistryMirrors(logger, testCfg, fakePath)\n\t\t\t\tcmd.SetArgs([]string{\"remove\", registry1})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"failed to write to\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"registry is provided\", func() {\n\t\t\tit(\"removes the given registry\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"remove\", registry1})\n\t\t\t\th.AssertNil(t, 
cmd.Execute())\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.RegistryMirrors, map[string]string{\n\t\t\t\t\tregistry2: testMirror2,\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"list\", func() {\n\t\twhen(\"mirrors were previously set\", func() {\n\t\t\tit(\"lists registry mirrors\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"list\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, registry1)\n\t\t\t\th.AssertContains(t, output, testMirror1)\n\t\t\t\th.AssertContains(t, output, registry2)\n\t\t\t\th.AssertContains(t, output, testMirror2)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no registry mirrors were set\", func() {\n\t\t\tit(\"prints a clear message\", func() {\n\t\t\t\tcmd = commands.ConfigRegistryMirrors(logger, config.Config{}, configPath)\n\t\t\t\tcmd.SetArgs([]string{\"list\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertNotContains(t, output, registry1)\n\t\t\t\th.AssertNotContains(t, output, testMirror1)\n\t\t\t\th.AssertNotContains(t, output, registry2)\n\t\t\t\th.AssertNotContains(t, output, testMirror2)\n\n\t\t\t\th.AssertContains(t, output, \"No registry mirrors have been set\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config_run_image_mirrors.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"sort\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/stringset\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nvar mirrors []string\n\nfunc ConfigRunImagesMirrors(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"run-image-mirrors\",\n\t\tShort: \"List, add and remove run image mirrors\",\n\t\tArgs:  cobra.MaximumNArgs(3),\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tlistRunImageMirror(args, logger, cfg)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tlistCmd := generateListCmd(cmd.Use, logger, cfg, listRunImageMirror)\n\tlistCmd.Long = \"List all run image mirrors. If a run image is provided, it will return \"\n\tlistCmd.Use = \"list [<run-image>]\"\n\tlistCmd.Example = \"pack config run-image-mirrors list\"\n\tcmd.AddCommand(listCmd)\n\n\taddCmd := generateAdd(\"mirror for a run image\", logger, cfg, cfgPath, addRunImageMirror)\n\taddCmd.Use = \"add <image> [-m <mirror...]\"\n\taddCmd.Long = \"Set mirrors to other repositories for a given run image\"\n\taddCmd.Example = \"pack config run-image-mirrors add cnbs/sample-stack-run:bionic --mirror index.docker.io/cnbs/sample-stack-run:bionic --mirror gcr.io/cnbs/sample-stack-run:bionic\"\n\taddCmd.Flags().StringSliceVarP(&mirrors, \"mirror\", \"m\", nil, \"Run image mirror\"+stringSliceHelp(\"mirror\"))\n\tcmd.AddCommand(addCmd)\n\n\trmCmd := generateRemove(\"mirror for a run image\", logger, cfg, cfgPath, removeRunImageMirror)\n\trmCmd.Use = \"remove <image> [-m <mirror...]\"\n\trmCmd.Long = \"Remove mirrors for a given run image. If specific mirrors are passed, they will be removed. 
\" +\n\t\t\"If no mirrors are provided, all mirrors for the given run image will be removed from the config.\"\n\trmCmd.Example = \"pack config run-image-mirrors remove cnbs/sample-stack-run:bionic\"\n\trmCmd.Flags().StringSliceVarP(&mirrors, \"mirror\", \"m\", nil, \"Run image mirror\"+stringSliceHelp(\"mirror\"))\n\tcmd.AddCommand(rmCmd)\n\n\tAddHelpFlag(cmd, \"run-image-mirrors\")\n\treturn cmd\n}\n\nfunc addRunImageMirror(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error {\n\trunImage := args[0]\n\tif len(mirrors) == 0 {\n\t\tlogger.Infof(\"No run image mirrors were provided.\")\n\t\treturn nil\n\t}\n\n\tnewMirrors := mirrors\n\tfor _, image := range cfg.RunImages {\n\t\tif image.Image == runImage {\n\t\t\tnewMirrors = append(newMirrors, image.Mirrors...)\n\t\t\tbreak\n\t\t}\n\t}\n\n\tcfg = config.SetRunImageMirrors(cfg, runImage, dedupAndSortSlice(newMirrors))\n\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\treturn errors.Wrapf(err, \"failed to write to %s\", cfgPath)\n\t}\n\n\tfor _, mirror := range mirrors {\n\t\tlogger.Infof(\"Run Image %s configured with mirror %s\", style.Symbol(runImage), style.Symbol(mirror))\n\t}\n\treturn nil\n}\n\nfunc removeRunImageMirror(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error {\n\timage := args[0]\n\n\tidx := -1\n\tfor i, runImage := range cfg.RunImages {\n\t\tif runImage.Image == image {\n\t\t\tidx = i\n\t\t}\n\t}\n\n\tif idx == -1 || len(cfg.RunImages) == 0 {\n\t\t// Run Image wasn't found\n\t\tlogger.Infof(\"No run image mirrors have been set for %s\", style.Symbol(image))\n\t\treturn nil\n\t}\n\n\tmirrorsMap := stringset.FromSlice(mirrors)\n\tvar newMirrors []string\n\tfor _, currMirror := range cfg.RunImages[idx].Mirrors {\n\t\tif _, ok := mirrorsMap[currMirror]; !ok {\n\t\t\tnewMirrors = append(newMirrors, currMirror)\n\t\t}\n\t}\n\n\tif len(newMirrors) == 0 || len(mirrors) == 0 {\n\t\tlastImageIdx := len(cfg.RunImages) - 1\n\t\tcfg.RunImages[idx] 
= cfg.RunImages[lastImageIdx]\n\t\tcfg.RunImages = cfg.RunImages[:lastImageIdx]\n\t} else {\n\t\tcfg = config.SetRunImageMirrors(cfg, image, newMirrors)\n\t}\n\n\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\treturn errors.Wrapf(err, \"failed to write to %s\", cfgPath)\n\t}\n\tif len(mirrors) == 0 {\n\t\tlogger.Infof(\"Removed all run image mirrors for %s\", style.Symbol(image))\n\t} else {\n\t\tlogger.Infof(\"Removed mirrors %s for %s\", strings.Join(mirrors, \", \"), style.Symbol(image))\n\t}\n\n\treturn nil\n}\n\nfunc listRunImageMirror(args []string, logger logging.Logger, cfg config.Config) {\n\tvar (\n\t\treqImage string\n\t\tfound    = false\n\t)\n\n\tif len(args) > 0 {\n\t\treqImage = args[0]\n\t}\n\n\tbuf := strings.Builder{}\n\tbuf.WriteString(\"Run Image Mirrors:\\n\")\n\tfor _, runImage := range cfg.RunImages {\n\t\tif (reqImage != \"\" && runImage.Image == reqImage) || reqImage == \"\" {\n\t\t\tfound = true\n\t\t\tbuf.WriteString(fmt.Sprintf(\"  %s:\\n\", style.Symbol(runImage.Image)))\n\t\t\tfor _, mirror := range runImage.Mirrors {\n\t\t\t\tbuf.WriteString(fmt.Sprintf(\"    %s\\n\", mirror))\n\t\t\t}\n\t\t}\n\t}\n\n\tif !found {\n\t\tsuffix := \"\"\n\t\tif reqImage != \"\" {\n\t\t\tsuffix = fmt.Sprintf(\" for %s\", style.Symbol(reqImage))\n\t\t}\n\t\tlogger.Infof(\"No run image mirrors have been set%s\", suffix)\n\t} else {\n\t\tlogger.Info(buf.String())\n\t}\n}\n\nfunc dedupAndSortSlice(slice []string) []string {\n\tset := stringset.FromSlice(slice)\n\tvar newSlice []string\n\tfor s := range set {\n\t\tnewSlice = append(newSlice, s)\n\t}\n\tsort.Strings(newSlice)\n\treturn newSlice\n}\n"
  },
  {
    "path": "internal/commands/config_run_image_mirrors_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigRunImageMirrors(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ConfigRunImageMirrorsCommand\", testConfigRunImageMirrorsCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigRunImageMirrorsCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd          *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t\tconfigPath   string\n\t\trunImage     = \"test/image\"\n\t\ttestMirror1  = \"example.com/some/run1\"\n\t\ttestMirror2  = \"example.com/some/run2\"\n\t\ttestCfg      = config.Config{\n\t\t\tExperimental: true,\n\t\t\tRunImages: []config.RunImage{{\n\t\t\t\tImage:   runImage,\n\t\t\t\tMirrors: []string{testMirror1, testMirror2},\n\t\t\t}},\n\t\t}\n\t\texpandedCfg = config.Config{\n\t\t\tExperimental: true,\n\t\t\tRunImages: append(testCfg.RunImages, config.RunImage{\n\t\t\t\tImage:   \"new-image\",\n\t\t\t\tMirrors: []string{\"some-mirror1\", \"some-mirror2\"},\n\t\t\t}),\n\t\t}\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcmd = commands.ConfigRunImagesMirrors(logger, testCfg, configPath)\n\t\tcmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\tit.After(func() 
{\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"-h\", func() {\n\t\tit(\"prints available commands\", func() {\n\t\t\tcmd.SetArgs([]string{\"-h\"})\n\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, \"Usage:\")\n\t\t\tfor _, command := range []string{\"add\", \"remove\", \"list\"} {\n\t\t\t\th.AssertContains(t, output, command)\n\t\t\t}\n\t\t})\n\t})\n\n\twhen(\"no arguments\", func() {\n\t\tit(\"lists run image mirrors\", func() {\n\t\t\tcmd.SetArgs([]string{})\n\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertEq(t, strings.TrimSpace(output), `Run Image Mirrors:\n  'test/image':\n    example.com/some/run1\n    example.com/some/run2`)\n\t\t})\n\t})\n\n\twhen(\"add\", func() {\n\t\twhen(\"no run image is specified\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"add\"})\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config path doesn't exist\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tfakePath := filepath.Join(tempPackHome, \"not-exist.toml\")\n\t\t\t\th.AssertNil(t, os.WriteFile(fakePath, []byte(\"something\"), 0001))\n\t\t\t\tcmd = commands.ConfigRunImagesMirrors(logger, config.Config{}, fakePath)\n\t\t\t\tcmd.SetArgs([]string{\"add\", runImage, \"-m\", testMirror1})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"failed to write to\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"mirrors are provided\", func() {\n\t\t\tit(\"adds them as mirrors to the config\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"add\", runImage, \"-m\", testMirror1, \"-m\", testMirror2})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg, testCfg)\n\t\t\t\t// This ensures that there are no dups\n\t\t\t\th.AssertEq(t, len(cfg.RunImages[0].Mirrors), 2)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no 
mirrors are provided\", func() {\n\t\t\tit(\"preserves old mirrors, and prints helpful message\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"add\", runImage})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"No run image mirrors were provided\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"remove\", func() {\n\t\twhen(\"no run image is specified\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"remove\"})\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"run image provided isn't present\", func() {\n\t\t\tit(\"prints a clear message\", func() {\n\t\t\t\tfakeImage := \"not-set-image\"\n\t\t\t\tcmd.SetArgs([]string{\"remove\", fakeImage})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, fmt.Sprintf(\"No run image mirrors have been set for %s\", style.Symbol(fakeImage)))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config path doesn't exist\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tfakePath := filepath.Join(tempPackHome, \"not-exist.toml\")\n\t\t\t\th.AssertNil(t, os.WriteFile(fakePath, []byte(\"something\"), 0001))\n\t\t\t\tcmd = commands.ConfigRunImagesMirrors(logger, testCfg, fakePath)\n\t\t\t\tcmd.SetArgs([]string{\"remove\", runImage, \"-m\", testMirror1})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"failed to write to\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"mirrors are provided\", func() {\n\t\t\tit(\"removes them for the given run image\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"remove\", runImage, \"-m\", testMirror2})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.RunImages, []config.RunImage{{\n\t\t\t\t\tImage:   runImage,\n\t\t\t\t\tMirrors: []string{testMirror1},\n\t\t\t\t}})\n\t\t\t})\n\n\t\t\tit(\"removes the image if all mirrors are removed\", func() 
{\n\t\t\t\tcmd.SetArgs([]string{\"remove\", runImage, \"-m\", testMirror1, \"-m\", testMirror2})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.RunImages, []config.RunImage{})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no mirrors are provided\", func() {\n\t\t\tit(\"removes all mirrors for the given run image\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"remove\", runImage})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.Experimental, testCfg.Experimental)\n\t\t\t\th.AssertEq(t, cfg.RunImages, []config.RunImage{})\n\t\t\t})\n\n\t\t\tit(\"preserves all mirrors aside from the given run image\", func() {\n\t\t\t\tcmd = commands.ConfigRunImagesMirrors(logger, expandedCfg, configPath)\n\t\t\t\tcmd.SetArgs([]string{\"remove\", runImage})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\tcfg, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.Experimental, testCfg.Experimental)\n\t\t\t\th.AssertNotEq(t, cfg.RunImages, []config.RunImage{})\n\t\t\t\th.AssertEq(t, cfg.RunImages, []config.RunImage{expandedCfg.RunImages[1]})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"list\", func() {\n\t\twhen(\"mirrors were previously set\", func() {\n\t\t\tit(\"lists run image mirrors\", func() {\n\t\t\t\tcmd.SetArgs([]string{\"list\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := outBuf.String()\n\t\t\t\th.AssertContains(t, output, runImage)\n\t\t\t\th.AssertContains(t, output, testMirror1)\n\t\t\t\th.AssertContains(t, output, testMirror2)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no run image mirrors were set\", func() {\n\t\t\tit(\"prints a clear message\", func() {\n\t\t\t\tcmd = commands.ConfigRunImagesMirrors(logger, config.Config{}, configPath)\n\t\t\t\tcmd.SetArgs([]string{\"list\"})\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\toutput := 
outBuf.String()\n\t\t\t\th.AssertNotContains(t, output, runImage)\n\t\t\t\th.AssertNotContains(t, output, testMirror1)\n\n\t\t\t\th.AssertContains(t, output, \"No run image mirrors have been set\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"run image provided\", func() {\n\t\t\twhen(\"mirrors are set\", func() {\n\t\t\t\tit(\"returns image mirrors\", func() {\n\t\t\t\t\tcmd = commands.ConfigRunImagesMirrors(logger, expandedCfg, configPath)\n\t\t\t\t\tcmd.SetArgs([]string{\"list\", \"new-image\"})\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\toutput := outBuf.String()\n\t\t\t\t\th.AssertNotContains(t, output, runImage)\n\t\t\t\t\th.AssertNotContains(t, output, testMirror1)\n\t\t\t\t\th.AssertContains(t, output, \"new-image\")\n\t\t\t\t\th.AssertContains(t, output, \"some-mirror1\")\n\t\t\t\t\th.AssertContains(t, output, \"some-mirror2\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"mirrors aren't set\", func() {\n\t\t\t\tit(\"prints a clear message\", func() {\n\t\t\t\t\tfakeImage := \"not-set-image\"\n\t\t\t\t\tcmd.SetArgs([]string{\"list\", fakeImage})\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\toutput := outBuf.String()\n\t\t\t\t\th.AssertNotContains(t, output, runImage)\n\t\t\t\t\th.AssertNotContains(t, output, testMirror1)\n\t\t\t\t\th.AssertContains(t, output, fmt.Sprintf(\"No run image mirrors have been set for %s\", style.Symbol(fakeImage)))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/config_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfigCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ConfigCommands\", testConfigCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfigCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand      *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t\tconfigPath   string\n\t\tmockClient   *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tmockController := gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcommand = commands.NewConfigCommand(logger, config.Config{Experimental: true}, configPath, mockClient)\n\t\tcommand.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"config\", func() {\n\t\tit(\"prints help text\", func() {\n\t\t\tcommand.SetArgs([]string{})\n\t\t\th.AssertNil(t, command.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, \"Usage:\")\n\t\t\tfor _, command := range []string{\"trusted-builders\", \"run-image-mirrors\", \"default-builder\", \"experimental\", 
\"registries\", \"pull-policy\", \"registry-mirrors\"} {\n\t\t\t\th.AssertContains(t, output, command)\n\t\t\t}\n\t\t})\n\t})\n}\n\ntype configManager struct {\n\ttestObject *testing.T\n\tconfigPath string\n}\n\nfunc newConfigManager(t *testing.T, configPath string) configManager {\n\treturn configManager{\n\t\ttestObject: t,\n\t\tconfigPath: configPath,\n\t}\n}\n\nfunc (c configManager) configWithTrustedBuilders(trustedBuilders ...string) config.Config {\n\tc.testObject.Helper()\n\n\tcfg := config.Config{}\n\tfor _, builderName := range trustedBuilders {\n\t\tcfg.TrustedBuilders = append(cfg.TrustedBuilders, config.TrustedBuilder{Name: builderName})\n\t}\n\terr := config.Write(cfg, c.configPath)\n\th.AssertNil(c.testObject, err)\n\n\treturn cfg\n}\n"
  },
  {
    "path": "internal/commands/config_trusted_builder.go",
    "content": "package commands\n\nimport (\n\t\"sort\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\tbldr \"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc ConfigTrustedBuilder(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"trusted-builders\",\n\t\tShort: \"List, add and remove trusted builders\",\n\t\tLong: \"When pack considers a builder to be trusted, `pack build` operations will use a single lifecycle binary \" +\n\t\t\t\"called the creator. This is more efficient than using an untrusted builder, where pack will execute \" +\n\t\t\t\"five separate lifecycle binaries, each in its own container: analyze, detect, restore, build and export.\\n\\n\" +\n\t\t\t\"For more on trusted builders, and when to trust or untrust a builder, \" +\n\t\t\t\"check out our docs here: https://buildpacks.io/docs/tools/pack/concepts/trusted_builders/\",\n\t\tAliases: []string{\"trusted-builder\", \"trust-builder\", \"trust-builders\"},\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tlistTrustedBuilders(args, logger, cfg)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tlistCmd := generateListCmd(\"trusted-builders\", logger, cfg, listTrustedBuilders)\n\tlistCmd.Long = \"List Trusted Builders.\\n\\nShow the builders that are either trusted by default or have been explicitly trusted locally using `trusted-builder add`\"\n\tlistCmd.Example = \"pack config trusted-builders list\"\n\tcmd.AddCommand(listCmd)\n\n\taddCmd := generateAdd(\"trusted-builders\", logger, cfg, cfgPath, addTrustedBuilder)\n\taddCmd.Long = \"Trust builder.\\n\\nWhen building with this builder, all lifecycle phases will be run in a single container using the builder image.\"\n\taddCmd.Example = \"pack config trusted-builders add 
cnbs/sample-stack-run:bionic\"\n\tcmd.AddCommand(addCmd)\n\n\trmCmd := generateRemove(\"trusted-builders\", logger, cfg, cfgPath, removeTrustedBuilder)\n\trmCmd.Long = \"Stop trusting builder.\\n\\nWhen building with this builder, all lifecycle phases will no longer be run in a single container using the builder image.\"\n\trmCmd.Example = \"pack config trusted-builders remove cnbs/sample-stack-run:bionic\"\n\tcmd.AddCommand(rmCmd)\n\n\tAddHelpFlag(cmd, \"trusted-builders\")\n\treturn cmd\n}\n\nfunc addTrustedBuilder(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error {\n\timageName := args[0]\n\tbuilderToTrust := config.TrustedBuilder{Name: imageName}\n\n\tisTrusted, err := bldr.IsTrustedBuilder(cfg, imageName)\n\tif err != nil {\n\t\treturn err\n\t}\n\tif isTrusted || bldr.IsKnownTrustedBuilder(imageName) {\n\t\tlogger.Infof(\"Builder %s is already trusted\", style.Symbol(imageName))\n\t\treturn nil\n\t}\n\n\tcfg.TrustedBuilders = append(cfg.TrustedBuilders, builderToTrust)\n\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\treturn errors.Wrap(err, \"writing config\")\n\t}\n\tlogger.Infof(\"Builder %s is now trusted\", style.Symbol(imageName))\n\n\treturn nil\n}\n\nfunc removeTrustedBuilder(args []string, logger logging.Logger, cfg config.Config, cfgPath string) error {\n\tbuilder := args[0]\n\n\texistingTrustedBuilders := cfg.TrustedBuilders\n\tcfg.TrustedBuilders = []config.TrustedBuilder{}\n\tfor _, trustedBuilder := range existingTrustedBuilders {\n\t\tif trustedBuilder.Name == builder {\n\t\t\tcontinue\n\t\t}\n\n\t\tcfg.TrustedBuilders = append(cfg.TrustedBuilders, trustedBuilder)\n\t}\n\n\t// Builder is not in the trusted builder list\n\tif len(existingTrustedBuilders) == len(cfg.TrustedBuilders) {\n\t\tif bldr.IsKnownTrustedBuilder(builder) {\n\t\t\t// Attempted to untrust a known trusted builder\n\t\t\treturn errors.Errorf(\"Builder %s is a known trusted builder. 
Currently pack doesn't support making these builders untrusted\", style.Symbol(builder))\n\t\t}\n\n\t\tlogger.Infof(\"Builder %s wasn't trusted\", style.Symbol(builder))\n\t\treturn nil\n\t}\n\n\terr := config.Write(cfg, cfgPath)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"writing config file\")\n\t}\n\n\tlogger.Infof(\"Builder %s is no longer trusted\", style.Symbol(builder))\n\treturn nil\n}\n\nfunc getTrustedBuilders(cfg config.Config) []string {\n\tvar trustedBuilders []string\n\tfor _, knownBuilder := range bldr.KnownBuilders {\n\t\tif knownBuilder.Trusted {\n\t\t\ttrustedBuilders = append(trustedBuilders, knownBuilder.Image)\n\t\t}\n\t}\n\n\tfor _, builder := range cfg.TrustedBuilders {\n\t\ttrustedBuilders = append(trustedBuilders, builder.Name)\n\t}\n\n\tsort.Strings(trustedBuilders)\n\treturn trustedBuilders\n}\n\nfunc listTrustedBuilders(args []string, logger logging.Logger, cfg config.Config) {\n\tlogger.Info(\"Trusted Builders:\")\n\n\ttrustedBuilders := getTrustedBuilders(cfg)\n\tfor _, builder := range trustedBuilders {\n\t\tlogger.Infof(\"  %s\", builder)\n\t}\n}\n"
  },
  {
    "path": "internal/commands/config_trusted_builder_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestTrustedBuilderCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"TrustedBuilderCommands\", testTrustedBuilderCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testTrustedBuilderCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand      *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t\tconfigPath   string\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcommand = commands.ConfigTrustedBuilder(logger, config.Config{}, configPath)\n\t\tcommand.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"no args\", func() {\n\t\tit(\"prints list of trusted builders\", func() {\n\t\t\tcommand.SetArgs([]string{})\n\t\t\th.AssertNil(t, command.Execute())\n\t\t\th.AssertContainsAllInOrder(t,\n\t\t\t\toutBuf,\n\t\t\t\t\"gcr.io/buildpacks/builder:google-22\",\n\t\t\t\t\"heroku/builder:20\",\n\t\t\t\t\"heroku/builder:22\",\n\t\t\t\t\"heroku/builder:24\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-base\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-full\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-tiny\",\n\t\t\t)\n\t\t})\n\n\t\tit(\"works 
with alias of trusted-builders\", func() {\n\t\t\tcommand.SetArgs([]string{})\n\t\t\th.AssertNil(t, command.Execute())\n\t\t\th.AssertContainsAllInOrder(t,\n\t\t\t\toutBuf,\n\t\t\t\t\"gcr.io/buildpacks/builder:google-22\",\n\t\t\t\t\"heroku/builder:20\",\n\t\t\t\t\"heroku/builder:22\",\n\t\t\t\t\"heroku/builder:24\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-base\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-full\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-tiny\",\n\t\t\t)\n\t\t})\n\t})\n\n\twhen(\"list\", func() {\n\t\tvar args = []string{\"list\"}\n\n\t\tit(\"shows suggested builders and locally trusted builder in alphabetical order\", func() {\n\t\t\tbuilderName := \"great-builder-\" + h.RandString(8)\n\n\t\t\tcommand.SetArgs(args)\n\t\t\th.AssertNil(t, command.Execute())\n\t\t\th.AssertNotContains(t, outBuf.String(), builderName)\n\t\t\th.AssertContainsAllInOrder(t,\n\t\t\t\toutBuf,\n\t\t\t\t\"gcr.io/buildpacks/builder:google-22\",\n\t\t\t\t\"heroku/builder:20\",\n\t\t\t\t\"heroku/builder:22\",\n\t\t\t\t\"heroku/builder:24\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-base\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-full\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-tiny\",\n\t\t\t)\n\t\t\toutBuf.Reset()\n\n\t\t\tconfigManager := newConfigManager(t, configPath)\n\t\t\tcommand = commands.ConfigTrustedBuilder(logger, configManager.configWithTrustedBuilders(builderName), configPath)\n\t\t\tcommand.SetArgs(args)\n\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\th.AssertContainsAllInOrder(t,\n\t\t\t\toutBuf,\n\t\t\t\t\"gcr.io/buildpacks/builder:google-22\",\n\t\t\t\tbuilderName,\n\t\t\t\t\"heroku/builder:20\",\n\t\t\t\t\"heroku/builder:22\",\n\t\t\t\t\"heroku/builder:24\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-base\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-full\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-tiny\",\n\t\t\t)\n\t\t})\n\t})\n\n\twhen(\"add\", func() {\n\t\tvar args = []string{\"add\"}\n\t\twhen(\"no builder is provided\", func() 
{\n\t\t\tit(\"prints usage\", func() {\n\t\t\t\tcommand.SetArgs(args)\n\t\t\t\th.AssertError(t, command.Execute(), \"accepts 1 arg(s)\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"can't write to config path\", func() {\n\t\t\tit(\"fails\", func() {\n\t\t\t\ttempPath := filepath.Join(tempPackHome, \"non-existent-file.toml\")\n\t\t\t\th.AssertNil(t, os.WriteFile(tempPath, []byte(\"something\"), 0111))\n\t\t\t\tcommand = commands.ConfigTrustedBuilder(logger, config.Config{}, tempPath)\n\t\t\t\tcommand.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t\t\t\tcommand.SetArgs(append(args, \"some-builder\"))\n\t\t\t\th.AssertError(t, command.Execute(), \"writing config\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder is provided\", func() {\n\t\t\twhen(\"builder is not already trusted\", func() {\n\t\t\t\tit(\"updates the config\", func() {\n\t\t\t\t\tcommand.SetArgs(append(args, \"some-builder\"))\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertContains(t, string(b), `[[trusted-builders]]\n  name = \"some-builder\"`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"builder is already trusted\", func() {\n\t\t\t\tit(\"does nothing\", func() {\n\t\t\t\t\tcommand.SetArgs(append(args, \"some-already-trusted-builder\"))\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\toldContents, err := os.ReadFile(configPath)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tcommand.SetArgs(append(args, \"some-already-trusted-builder\"))\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tnewContents, err := os.ReadFile(configPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, newContents, oldContents)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"builder is a suggested builder\", func() {\n\t\t\t\tit(\"does nothing\", func() {\n\t\t\t\t\th.AssertNil(t, os.WriteFile(configPath, []byte(\"\"), os.ModePerm))\n\n\t\t\t\t\tcommand.SetArgs(append(args, 
\"paketobuildpacks/builder-jammy-base\"))\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\toldContents, err := os.ReadFile(configPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, string(oldContents), \"\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"remove\", func() {\n\t\tvar (\n\t\t\targs          = []string{\"remove\"}\n\t\t\tconfigManager configManager\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tconfigManager = newConfigManager(t, configPath)\n\t\t})\n\n\t\twhen(\"no builder is provided\", func() {\n\t\t\tit(\"prints usage\", func() {\n\t\t\t\tcfg := configManager.configWithTrustedBuilders()\n\t\t\t\tcommand := commands.ConfigTrustedBuilder(logger, cfg, configPath)\n\t\t\t\tcommand.SetArgs(args)\n\t\t\t\tcommand.SetOut(&outBuf)\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg(s), received 0\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Usage:\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder is already trusted\", func() {\n\t\t\tit(\"removes builder from the config\", func() {\n\t\t\t\tbuilderName := \"some-builder\"\n\n\t\t\t\tcfg := configManager.configWithTrustedBuilders(builderName)\n\t\t\t\tcommand := commands.ConfigTrustedBuilder(logger, cfg, configPath)\n\t\t\t\tcommand.SetArgs(append(args, builderName))\n\n\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotContains(t, string(b), builderName)\n\n\t\t\t\th.AssertContains(t,\n\t\t\t\t\toutBuf.String(),\n\t\t\t\t\tfmt.Sprintf(\"Builder %s is no longer trusted\", style.Symbol(builderName)),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"removes only the named builder when multiple builders are trusted\", func() {\n\t\t\t\tuntrustBuilder := \"stop/trusting:me\"\n\t\t\t\tstillTrustedBuilder := \"very/safe/builder\"\n\n\t\t\t\tcfg := configManager.configWithTrustedBuilders(untrustBuilder, stillTrustedBuilder)\n\t\t\t\tcommand := commands.ConfigTrustedBuilder(logger, cfg, 
configPath)\n\t\t\t\tcommand.SetArgs(append(args, untrustBuilder))\n\n\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, string(b), stillTrustedBuilder)\n\t\t\t\th.AssertNotContains(t, string(b), untrustBuilder)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder wasn't already trusted\", func() {\n\t\t\tit(\"does nothing and reports builder wasn't trusted\", func() {\n\t\t\t\tneverTrustedBuilder := \"never/trusted-builder\"\n\t\t\t\tstillTrustedBuilder := \"very/safe/builder\"\n\n\t\t\t\tcfg := configManager.configWithTrustedBuilders(stillTrustedBuilder)\n\t\t\t\tcommand := commands.ConfigTrustedBuilder(logger, cfg, configPath)\n\t\t\t\tcommand.SetArgs(append(args, neverTrustedBuilder))\n\n\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, string(b), stillTrustedBuilder)\n\t\t\t\th.AssertNotContains(t, string(b), neverTrustedBuilder)\n\n\t\t\t\th.AssertContains(t,\n\t\t\t\t\toutBuf.String(),\n\t\t\t\t\tfmt.Sprintf(\"Builder %s wasn't trusted\", style.Symbol(neverTrustedBuilder)),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder is a suggested builder\", func() {\n\t\t\tit(\"does nothing and reports that the builder can't be untrusted\", func() {\n\t\t\t\tbuilder := \"paketobuildpacks/builder-jammy-base\"\n\t\t\t\tcommand := commands.ConfigTrustedBuilder(logger, config.Config{}, configPath)\n\t\t\t\tcommand.SetArgs(append(args, builder))\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"Builder %s is a known trusted builder. Currently pack doesn't support making these builders untrusted\", style.Symbol(builder)))\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/create_builder.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"path/filepath\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use 'builder create' instead.\n// CreateBuilder creates a builder image, based on a builder config\nfunc CreateBuilder(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tvar flags BuilderCreateFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"create-builder <image-name> --config <builder-config-path>\",\n\t\tHidden:  true,\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Create builder image\",\n\t\tExample: \"pack create-builder my-builder:bionic --config ./builder.toml\",\n\t\tLong: `A builder is an image that bundles all the bits and information on how to build your apps, such as buildpacks, an implementation of the lifecycle, and a build-time environment that pack uses when executing the lifecycle. When building an app, you can use community builders; you can see our suggestions by running\n\n\tpack builder suggest\n\nCreating a custom builder allows you to control what buildpacks are used and what image apps are based on. 
For more on how to create a builder, see: https://buildpacks.io/docs/operator-guide/create-a-builder/.\n`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"create-builder\", \"builder create\")\n\n\t\t\tif err := validateCreateFlags(&flags, cfg); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tstringPolicy := flags.Policy\n\t\t\tif stringPolicy == \"\" {\n\t\t\t\tstringPolicy = cfg.PullPolicy\n\t\t\t}\n\t\t\tpullPolicy, err := image.ParsePullPolicy(stringPolicy)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"parsing pull policy %s\", flags.Policy)\n\t\t\t}\n\n\t\t\tbuilderConfig, warnings, err := builder.ReadConfig(flags.BuilderTomlPath)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"invalid builder toml\")\n\t\t\t}\n\t\t\tfor _, w := range warnings {\n\t\t\t\tlogger.Warnf(\"builder configuration: %s\", w)\n\t\t\t}\n\n\t\t\trelativeBaseDir, err := filepath.Abs(filepath.Dir(flags.BuilderTomlPath))\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"getting absolute path for config\")\n\t\t\t}\n\n\t\t\timageName := args[0]\n\t\t\tif err := pack.CreateBuilder(cmd.Context(), client.CreateBuilderOptions{\n\t\t\t\tRelativeBaseDir: relativeBaseDir,\n\t\t\t\tBuilderName:     imageName,\n\t\t\t\tConfig:          builderConfig,\n\t\t\t\tPublish:         flags.Publish,\n\t\t\t\tRegistry:        flags.Registry,\n\t\t\t\tPullPolicy:      pullPolicy,\n\t\t\t}); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully created builder image %s\", style.Symbol(imageName))\n\t\t\tlogging.Tip(logger, \"Run %s to use this builder\", style.Symbol(fmt.Sprintf(\"pack build <image-name> --builder %s\", imageName)))\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().StringVarP(&flags.Registry, \"buildpack-registry\", \"R\", cfg.DefaultRegistryName, \"Buildpack Registry by name\")\n\tif !cfg.Experimental 
{\n\t\tcmd.Flags().MarkHidden(\"buildpack-registry\")\n\t}\n\tcmd.Flags().StringVarP(&flags.BuilderTomlPath, \"config\", \"c\", \"\", \"Path to builder TOML file (required)\")\n\tcmd.Flags().BoolVar(&flags.Publish, \"publish\", false, \"Publish the builder directly to the container registry specified in <image-name>, instead of the daemon.\")\n\tcmd.Flags().StringVar(&flags.Policy, \"pull-policy\", \"\", \"Pull policy to use. Accepted values are always, never, and if-not-present. The default is always\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/create_builder_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestCreateBuilderCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"CreateBuilderCommand\", testCreateBuilderCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testCreateBuilderCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand           *cobra.Command\n\t\tlogger            logging.Logger\n\t\toutBuf            bytes.Buffer\n\t\tmockController    *gomock.Controller\n\t\tmockClient        *testmocks.MockPackClient\n\t\ttmpDir            string\n\t\tbuilderConfigPath string\n\t\tcfg               config.Config\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"create-builder-test\")\n\t\th.AssertNil(t, err)\n\t\tbuilderConfigPath = filepath.Join(tmpDir, \"builder.toml\")\n\t\tcfg = config.Config{}\n\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tcommand = commands.CreateBuilder(logger, cfg, mockClient)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"#CreateBuilder\", func() {\n\t\tit(\"gives deprecation warning\", func() {\n\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(validConfig), 0666))\n\t\t\tmockClient.EXPECT().CreateBuilder(gomock.Any(), 
gomock.Any()).Return(nil)\n\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\"some/builder\",\n\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t})\n\n\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\th.AssertContains(t, outBuf.String(), \"Warning: Command 'pack create-builder' has been deprecated, please use 'pack builder create' instead\")\n\t\t})\n\n\t\twhen(\"both --publish and pull-policy=never flags are specified\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", \"some-config-path\",\n\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t\t\"never\",\n\t\t\t\t})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"--publish and --pull-policy never cannot be used together. The --publish flag requires the use of remote images.\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy\", func() {\n\t\t\tit(\"returns error for unknown policy\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t\"--pull-policy\", \"unknown-policy\",\n\t\t\t\t})\n\t\t\t\th.AssertError(t, command.Execute(), \"parsing pull policy\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy is not specified\", func() {\n\t\t\twhen(\"configured pull policy is invalid\", func() {\n\t\t\t\tit(\"returns error for when config set with unknown policy\", func() {\n\t\t\t\t\tcfg = config.Config{PullPolicy: \"unknown-policy\"}\n\t\t\t\t\tcommand = commands.BuilderCreate(logger, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t\t})\n\t\t\t\t\th.AssertError(t, command.Execute(), \"parsing pull policy\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--buildpack-registry flag is specified but experimental isn't set in the config\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() 
{\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", \"some-config-path\",\n\t\t\t\t\t\"--buildpack-registry\", \"some-registry\",\n\t\t\t\t})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"Support for buildpack registries is currently experimental.\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"warnings encountered in builder.toml\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(`\n[[buildpacks]]\n  id = \"some.buildpack\"\n`), 0666))\n\t\t\t})\n\n\t\t\tit(\"logs the warnings\", func() {\n\t\t\t\tmockClient.EXPECT().CreateBuilder(gomock.Any(), gomock.Any()).Return(nil)\n\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Warning: builder configuration: empty 'order' definition\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"uses --builder-config\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(validConfig), 0666))\n\t\t\t})\n\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--builder-config\", builderConfigPath,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, command.Execute(), \"unknown flag: --builder-config\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no config provided\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t})\n\t\t\t\th.AssertError(t, command.Execute(), \"Please provide a builder config path\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder config has extensions but experimental isn't set in the config\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(builderConfigPath, []byte(validConfigWithExtensions), 0666))\n\t\t\t})\n\n\t\t\tit(\"errors\", 
func() {\n\t\t\t\tmockClient.EXPECT().CreateBuilder(gomock.Any(), gomock.Any()).Return(errors.New(\"builder config contains image extensions, but the lifecycle Platform API version (0.12) is older than 0.13; support for image extensions with Platform API < 0.13 is currently experimental\"))\n\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\"--config\", builderConfigPath,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, command.Execute(), \"support for image extensions with Platform API < 0.13 is currently experimental\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/download_sbom.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\tcpkg \"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype DownloadSBOMFlags struct {\n\tRemote         bool\n\tDestinationDir string\n}\n\nfunc DownloadSBOM(\n\tlogger logging.Logger,\n\tclient PackClient,\n) *cobra.Command {\n\tvar flags DownloadSBOMFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"download <image-name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Download SBoM from specified image\",\n\t\tLong:    \"Download layer containing structured Software Bill of Materials (SBoM) from specified image\",\n\t\tExample: \"pack sbom download buildpacksio/pack\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\timg := args[0]\n\t\t\toptions := cpkg.DownloadSBOMOptions{\n\t\t\t\tDaemon:         !flags.Remote,\n\t\t\t\tDestinationDir: flags.DestinationDir,\n\t\t\t}\n\n\t\t\treturn client.DownloadSBOM(img, options)\n\t\t}),\n\t}\n\tAddHelpFlag(cmd, \"download\")\n\tcmd.Flags().BoolVar(&flags.Remote, \"remote\", false, \"Download SBoM of image in remote registry (without pulling image)\")\n\tcmd.Flags().StringVarP(&flags.DestinationDir, \"output-dir\", \"o\", \".\", \"Path to export SBoM contents.\\nIt defaults export to the current working directory.\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/download_sbom_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\tcpkg \"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestDownloadSBOMCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"DownloadSBOMCommand\", testDownloadSBOMCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testDownloadSBOMCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tcommand = commands.DownloadSBOM(logger, mockClient)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"#DownloadSBOM\", func() {\n\t\twhen(\"happy path\", func() {\n\t\t\tit(\"returns no error\", func() {\n\t\t\t\tmockClient.EXPECT().DownloadSBOM(\"some/image\", cpkg.DownloadSBOMOptions{\n\t\t\t\t\tDaemon:         true,\n\t\t\t\t\tDestinationDir: \".\",\n\t\t\t\t})\n\t\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"the remote flag is specified\", func() {\n\t\t\tit(\"respects the remote flag\", func() {\n\t\t\t\tmockClient.EXPECT().DownloadSBOM(\"some/image\", cpkg.DownloadSBOMOptions{\n\t\t\t\t\tDaemon:         
false,\n\t\t\t\t\tDestinationDir: \".\",\n\t\t\t\t})\n\t\t\t\tcommand.SetArgs([]string{\"some/image\", \"--remote\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"the output-dir flag is specified\", func() {\n\t\t\tit(\"respects the output-dir flag\", func() {\n\t\t\t\tmockClient.EXPECT().DownloadSBOM(\"some/image\", cpkg.DownloadSBOMOptions{\n\t\t\t\t\tDaemon:         true,\n\t\t\t\t\tDestinationDir: \"some-destination-dir\",\n\t\t\t\t})\n\t\t\t\tcommand.SetArgs([]string{\"some/image\", \"--output-dir\", \"some-destination-dir\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"the client returns an error\", func() {\n\t\t\tit(\"returns the error\", func() {\n\t\t\t\tmockClient.EXPECT().DownloadSBOM(\"some/image\", cpkg.DownloadSBOMOptions{\n\t\t\t\t\tDaemon:         true,\n\t\t\t\t\tDestinationDir: \".\",\n\t\t\t\t}).Return(errors.New(\"some-error\"))\n\n\t\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"some-error\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/extension.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc NewExtensionCommand(logger logging.Logger, cfg config.Config, client PackClient, packageConfigReader PackageConfigReader) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"extension\",\n\t\tAliases: []string{\"extensions\"},\n\t\tShort:   \"Interact with extensions\",\n\t\tRunE:    nil,\n\t}\n\n\tcmd.AddCommand(ExtensionInspect(logger, cfg, client))\n\t// client and packageConfigReader to be passed later on\n\tcmd.AddCommand(ExtensionPackage(logger, cfg, client, packageConfigReader))\n\t// client to be passed later on\n\tcmd.AddCommand(ExtensionNew(logger))\n\tcmd.AddCommand(ExtensionPull(logger, cfg, client))\n\tcmd.AddCommand(ExtensionRegister(logger, cfg, client))\n\tcmd.AddCommand(ExtensionYank(logger, cfg, client))\n\n\tAddHelpFlag(cmd, \"extension\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/extension_inspect.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc ExtensionInspect(logger logging.Logger, cfg config.Config, client PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"inspect <extension-name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Show information about an extension\",\n\t\tExample: \"pack extension inspect <example-extension>\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\textensionName := args[0]\n\t\t\treturn extensionInspect(logger, extensionName, cfg, client)\n\t\t}),\n\t}\n\tAddHelpFlag(cmd, \"inspect\")\n\treturn cmd\n}\n\nfunc extensionInspect(logger logging.Logger, extensionName string, _ config.Config, pack PackClient) error {\n\tlogger.Infof(\"Inspecting extension: %s\\n\", style.Symbol(extensionName))\n\n\tinspectedExtensionsOutput, err := inspectAllExtensions(\n\t\tpack,\n\t\tclient.InspectExtensionOptions{\n\t\t\tExtensionName: extensionName,\n\t\t\tDaemon:        true,\n\t\t},\n\t\tclient.InspectExtensionOptions{\n\t\t\tExtensionName: extensionName,\n\t\t\tDaemon:        false,\n\t\t})\n\tif err != nil {\n\t\treturn fmt.Errorf(\"error writing extension output: %q\", err)\n\t}\n\n\tlogger.Info(inspectedExtensionsOutput)\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/extension_inspect_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst extensionOutputSection = `Extension:\n  ID                           NAME        VERSION        HOMEPAGE\n  some/single-extension        some        0.0.1          single-extension-homepage`\n\nconst inspectExtensionOutputTemplate = `Inspecting extension: '%s'\n\n%s\n\n%s\n`\n\nfunc TestExtensionInspectCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ExtensionInspectCommand\", testExtensionInspectCommand, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testExtensionInspectCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t\tinfo           *client.ExtensionInfo\n\t\tassert         = h.NewAssertionManager(t)\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\tinfo = &client.ExtensionInfo{\n\t\t\tExtension: dist.ModuleInfo{\n\t\t\t\tID:       \"some/single-extension\",\n\t\t\t\tVersion:  
\"0.0.1\",\n\t\t\t\tName:     \"some\",\n\t\t\t\tHomepage: \"single-extension-homepage\",\n\t\t\t},\n\t\t}\n\n\t\tcommand = commands.ExtensionInspect(logger, cfg, mockClient)\n\t})\n\n\twhen(\"ExtensionInspect\", func() {\n\t\twhen(\"inspecting an image\", func() {\n\t\t\twhen(\"both remote and local image are present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tinfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectExtension(client.InspectExtensionOptions{\n\t\t\t\t\t\tExtensionName: \"test/extension\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}).Return(info, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectExtension(client.InspectExtensionOptions{\n\t\t\t\t\t\tExtensionName: \"test/extension\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t}).Return(info, nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"test/extension\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\tlocalOutputSection := fmt.Sprintf(inspectExtensionOutputTemplate,\n\t\t\t\t\t\t\"test/extension\",\n\t\t\t\t\t\t\"LOCAL IMAGE:\",\n\t\t\t\t\t\textensionOutputSection)\n\n\t\t\t\t\tremoteOutputSection := fmt.Sprintf(\"%s\\n\\n%s\",\n\t\t\t\t\t\t\"REMOTE IMAGE:\",\n\t\t\t\t\t\textensionOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), localOutputSection)\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), remoteOutputSection)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"only a local image is present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tinfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectExtension(client.InspectExtensionOptions{\n\t\t\t\t\t\tExtensionName: \"only-local-test/extension\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}).Return(info, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectExtension(client.InspectExtensionOptions{\n\t\t\t\t\t\tExtensionName: \"only-local-test/extension\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t}).Return(nil, 
errors.Wrap(image.ErrNotFound, \"remote image not found!\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"displays output for local image\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"only-local-test/extension\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectExtensionOutputTemplate,\n\t\t\t\t\t\t\"only-local-test/extension\",\n\t\t\t\t\t\t\"LOCAL IMAGE:\",\n\t\t\t\t\t\textensionOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"only a remote image is present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tinfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectExtension(client.InspectExtensionOptions{\n\t\t\t\t\t\tExtensionName: \"only-remote-test/extension\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t}).Return(info, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectExtension(client.InspectExtensionOptions{\n\t\t\t\t\t\tExtensionName: \"only-remote-test/extension\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}).Return(nil, errors.Wrap(image.ErrNotFound, \"local image not found!\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"displays output for remote image\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"only-remote-test/extension\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectExtensionOutputTemplate,\n\t\t\t\t\t\t\"only-remote-test/extension\",\n\t\t\t\t\t\t\"REMOTE IMAGE:\",\n\t\t\t\t\t\textensionOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"failure cases\", func() {\n\t\twhen(\"unable to inspect extension image\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectExtension(client.InspectExtensionOptions{\n\t\t\t\t\tExtensionName: \"failure-case/extension\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t}).Return(&client.ExtensionInfo{}, errors.Wrap(image.ErrNotFound, \"unable to inspect 
local failure-case/extension\"))\n\n\t\t\t\tmockClient.EXPECT().InspectExtension(client.InspectExtensionOptions{\n\t\t\t\t\tExtensionName: \"failure-case/extension\",\n\t\t\t\t\tDaemon:        false,\n\t\t\t\t}).Return(&client.ExtensionInfo{}, errors.Wrap(image.ErrNotFound, \"unable to inspect remote failure-case/extension\"))\n\t\t\t})\n\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"failure-case/extension\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Error(err)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/extension_new.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ExtensionNewFlags define flags provided to the ExtensionNew command\ntype ExtensionNewFlags struct {\n\tAPI     string\n\tPath    string\n\tStacks  []string\n\tVersion string\n}\n\n// extensioncreator type to be added here and argument also to be added in the function\n\n// ExtensionNew generates the scaffolding of an extension\nfunc ExtensionNew(logger logging.Logger) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"new <id>\",\n\t\tShort:   \"Creates basic scaffolding of an extension\",\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(1), cobra.OnlyValidArgs),\n\t\tExample: \"pack extension new <example-extension>\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\t// logic will go here\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\t// flags will go here\n\n\tAddHelpFlag(cmd, \"new\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/extension_package.go",
    "content": "package commands\n\nimport (\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ExtensionPackageFlags define flags provided to the ExtensionPackage command\ntype ExtensionPackageFlags struct {\n\tPackageTomlPath string\n\tFormat          string\n\tTargets         []string\n\tPublish         bool\n\tPolicy          string\n\tPath            string\n\tAdditionalTags  []string\n}\n\n// ExtensionPackager packages extensions\ntype ExtensionPackager interface {\n\tPackageExtension(ctx context.Context, options client.PackageBuildpackOptions) error\n}\n\n// ExtensionPackage packages (a) extension(s) into OCI format, based on a package config\nfunc ExtensionPackage(logger logging.Logger, cfg config.Config, packager ExtensionPackager, packageConfigReader PackageConfigReader) *cobra.Command {\n\tvar flags ExtensionPackageFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"package <name> --config <config-path>\",\n\t\tShort:   \"Package an extension in OCI format\",\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(1), cobra.OnlyValidArgs),\n\t\tExample: \"pack extension package /output/file.cnb --path /extracted/from/tgz/folder --format file\\npack extension package registry/image-name --path  /extracted/from/tgz/folder --format image --publish\",\n\t\tLong: \"extension package allows users to package (an) extension(s) into OCI format, which can then to be hosted in \" +\n\t\t\t\"image repositories or persisted on disk as a '.cnb' file.\" +\n\t\t\t\"Packaged extensions can be used as inputs to `pack build` (using the `--extension` flag), \" +\n\t\t\t\"and they can be included 
in the configs used in `pack builder create` and `pack extension package`. For more \" +\n\t\t\t\"on how to package an extension, see: https://buildpacks.io/docs/buildpack-author-guide/package-a-buildpack/.\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tif err := validateExtensionPackageFlags(&flags); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tstringPolicy := flags.Policy\n\t\t\tif stringPolicy == \"\" {\n\t\t\t\tstringPolicy = cfg.PullPolicy\n\t\t\t}\n\n\t\t\tpullPolicy, err := image.ParsePullPolicy(stringPolicy)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"parsing pull policy\")\n\t\t\t}\n\n\t\t\texPackageCfg := pubbldpkg.DefaultExtensionConfig()\n\t\t\tvar exPath string\n\t\t\tif flags.Path != \"\" {\n\t\t\t\tif exPath, err = filepath.Abs(flags.Path); err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"resolving extension path\")\n\t\t\t\t}\n\t\t\t\texPackageCfg.Extension.URI = exPath\n\t\t\t}\n\t\t\trelativeBaseDir := \"\"\n\t\t\tif flags.PackageTomlPath != \"\" {\n\t\t\t\texPackageCfg, err = packageConfigReader.Read(flags.PackageTomlPath)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"reading config\")\n\t\t\t\t}\n\n\t\t\t\trelativeBaseDir, err = filepath.Abs(filepath.Dir(flags.PackageTomlPath))\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"getting absolute path for config\")\n\t\t\t\t}\n\t\t\t}\n\t\t\tname := args[0]\n\t\t\tif flags.Format == client.FormatFile {\n\t\t\t\tswitch ext := filepath.Ext(name); ext {\n\t\t\t\tcase client.CNBExtension:\n\t\t\t\tcase \"\":\n\t\t\t\t\tname += client.CNBExtension\n\t\t\t\tdefault:\n\t\t\t\t\tlogger.Warnf(\"%s is not a valid extension for a packaged extension. 
Packaged extensions must have a %s extension\", style.Symbol(ext), style.Symbol(client.CNBExtension))\n\t\t\t\t}\n\t\t\t}\n\n\t\t\ttargets, err := processExtensionPackageTargets(flags.Path, packageConfigReader, exPackageCfg)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tdaemon := !flags.Publish && flags.Format == \"\"\n\t\t\tmultiArchCfg, err := processMultiArchitectureConfig(logger, flags.Targets, targets, daemon)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tif len(multiArchCfg.Targets()) == 0 {\n\t\t\t\tlogger.Infof(\"Pro tip: use --target flag OR [[targets]] in extension.toml to specify the desired platform (os/arch/variant); using os %s\", style.Symbol(exPackageCfg.Platform.OS))\n\t\t\t} else {\n\t\t\t\t// FIXME: Check if we can copy the config files during layers creation.\n\t\t\t\tfilesToClean, err := multiArchCfg.CopyConfigFiles(exPath, \"extension\")\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t\tdefer clean(filesToClean)\n\t\t\t}\n\n\t\t\tif err := packager.PackageExtension(cmd.Context(), client.PackageBuildpackOptions{\n\t\t\t\tRelativeBaseDir: relativeBaseDir,\n\t\t\t\tName:            name,\n\t\t\t\tFormat:          flags.Format,\n\t\t\t\tConfig:          exPackageCfg,\n\t\t\t\tPublish:         flags.Publish,\n\t\t\t\tPullPolicy:      pullPolicy,\n\t\t\t\tTargets:         multiArchCfg.Targets(),\n\t\t\t\tAdditionalTags:  flags.AdditionalTags,\n\t\t\t}); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\taction := \"created\"\n\t\t\tlocation := \"docker daemon\"\n\t\t\tif flags.Publish {\n\t\t\t\taction = \"published\"\n\t\t\t\tlocation = \"registry\"\n\t\t\t}\n\t\t\tif flags.Format == client.FormatFile {\n\t\t\t\tlocation = \"file\"\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully %s package %s and saved to %s\", action, style.Symbol(name), location)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\t// flags will be added here\n\tcmd.Flags().StringVarP(&flags.PackageTomlPath, \"config\", \"c\", \"\", \"Path to package 
TOML config\")\n\tcmd.Flags().StringVarP(&flags.Format, \"format\", \"f\", \"\", `Format to save package as (\"image\" or \"file\")`)\n\tcmd.Flags().BoolVar(&flags.Publish, \"publish\", false, `Publish the extension directly to the container registry specified in <name>, instead of the daemon (applies to \"--format=image\" only).`)\n\tcmd.Flags().StringVar(&flags.Policy, \"pull-policy\", \"\", \"Pull policy to use. Accepted values are always, never, and if-not-present. The default is always\")\n\tcmd.Flags().StringVarP(&flags.Path, \"path\", \"p\", \"\", \"Path to the Extension that needs to be packaged\")\n\tcmd.Flags().StringSliceVarP(&flags.Targets, \"target\", \"t\", nil,\n\t\t`Target platforms to build for.\nTargets should be in the format '[os][/arch][/variant]:[distroname@osversion@anotherversion];[distroname@osversion]'.\n- To specify two different architectures: '--target \"linux/amd64\" --target \"linux/arm64\"'\n- To specify the distribution version: '--target \"linux/arm/v6:ubuntu@14.04\"'\n- To specify multiple distribution versions: '--target \"linux/arm/v6:ubuntu@14.04\"  --target \"linux/arm/v6:ubuntu@16.04\"'\n\t`)\n\tcmd.Flags().StringSliceVarP(&flags.AdditionalTags, \"tag\", \"\", nil, \"Additional tags to push the output image to.\\nTags should be in the format 'image:tag' or 'repository/image:tag'.\"+stringSliceHelp(\"tag\"))\n\tAddHelpFlag(cmd, \"package\")\n\treturn cmd\n}\n\nfunc validateExtensionPackageFlags(p *ExtensionPackageFlags) error {\n\tif p.Publish && p.Policy == image.PullNever.String() {\n\t\treturn errors.Errorf(\"--publish and --pull-policy=never cannot be used together. 
The --publish flag requires the use of remote images.\")\n\t}\n\treturn nil\n}\n\n// processExtensionPackageTargets returns the list of targets defined in extension.toml\nfunc processExtensionPackageTargets(path string, packageConfigReader PackageConfigReader, bpPackageCfg pubbldpkg.Config) ([]dist.Target, error) {\n\tvar targets []dist.Target\n\n\t// Read targets from extension.toml\n\tpathToExtensionToml := filepath.Join(path, \"extension.toml\")\n\tif _, err := os.Stat(pathToExtensionToml); err == nil {\n\t\tbuildpackCfg, err := packageConfigReader.ReadBuildpackDescriptor(pathToExtensionToml)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\ttargets = buildpackCfg.Targets()\n\t}\n\n\treturn targets, nil\n}\n"
  },
  {
    "path": "internal/commands/extension_package_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/fakes\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestExtensionPackageCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ExtensionPackageCommand\", testExtensionPackageCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testExtensionPackageCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger *logging.LogWithWriters\n\t\toutBuf bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t})\n\n\twhen(\"Package#Execute\", func() {\n\t\tvar fakeExtensionPackager *fakes.FakeBuildpackPackager\n\n\t\tit.Before(func() {\n\t\t\tfakeExtensionPackager = &fakes.FakeBuildpackPackager{}\n\t\t})\n\n\t\twhen(\"valid package config\", func() {\n\t\t\tit(\"reads package config from the configured path\", func() {\n\t\t\t\tfakePackageConfigReader := fakes.NewFakePackageConfigReader()\n\t\t\t\texpectedPackageConfigPath := \"/path/to/some/file\"\n\n\t\t\t\tcmd := packageExtensionCommand(\n\t\t\t\t\twithExtensionPackageConfigReader(fakePackageConfigReader),\n\t\t\t\t\twithExtensionPackageConfigPath(expectedPackageConfigPath),\n\t\t\t\t)\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, fakePackageConfigReader.ReadCalledWithArg, expectedPackageConfigPath)\n\t\t\t})\n\n\t\t\tit(\"creates package with correct image 
name\", func() {\n\t\t\t\tcmd := packageExtensionCommand(\n\t\t\t\t\twithExtensionImageName(\"my-specific-image\"),\n\t\t\t\t\twithExtensionPackager(fakeExtensionPackager),\n\t\t\t\t)\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\th.AssertEq(t, receivedOptions.Name, \"my-specific-image\")\n\t\t\t})\n\n\t\t\tit(\"creates package with config returned by the reader\", func() {\n\t\t\t\tmyConfig := pubbldpkg.Config{\n\t\t\t\t\tExtension: dist.BuildpackURI{URI: \"test\"},\n\t\t\t\t}\n\n\t\t\t\tcmd := packageExtensionCommand(\n\t\t\t\t\twithExtensionPackager(fakeExtensionPackager),\n\t\t\t\t\twithExtensionPackageConfigReader(fakes.NewFakePackageConfigReader(whereReadReturns(myConfig, nil))),\n\t\t\t\t)\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\th.AssertEq(t, receivedOptions.Config, myConfig)\n\t\t\t})\n\n\t\t\twhen(\"file format\", func() {\n\t\t\t\twhen(\"extension is .cnb\", func() {\n\t\t\t\t\tit(\"does not modify the name\", func() {\n\t\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"test.cnb\", \"-f\", \"file\"})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\t\th.AssertEq(t, receivedOptions.Name, \"test.cnb\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"extension is empty\", func() {\n\t\t\t\t\tit(\"appends .cnb to the name\", func() {\n\t\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"test\", \"-f\", \"file\"})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\t\th.AssertEq(t, receivedOptions.Name, 
\"test.cnb\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"extension is something other than .cnb\", func() {\n\t\t\t\t\tit(\"does not modify the name but shows a warning\", func() {\n\t\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager), withExtensionLogger(logger))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"test.tar.gz\", \"-f\", \"file\"})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\t\th.AssertEq(t, receivedOptions.Name, \"test.tar.gz\")\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"'.gz' is not a valid extension for a packaged extension. Packaged extensions must have a '.cnb' extension\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"pull-policy\", func() {\n\t\t\t\tvar pullPolicyArgs = []string{\n\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t}\n\n\t\t\t\tit(\"pull-policy=never sets policy\", func() {\n\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager))\n\t\t\t\t\tcmd.SetArgs(append(pullPolicyArgs, \"never\"))\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullNever)\n\t\t\t\t})\n\n\t\t\t\tit(\"pull-policy=always sets policy\", func() {\n\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager))\n\t\t\t\t\tcmd.SetArgs(append(pullPolicyArgs, \"always\"))\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullAlways)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"no --pull-policy\", func() {\n\t\t\t\tvar pullPolicyArgs = []string{\n\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t}\n\n\t\t\t\tit(\"uses the default policy when 
no policy configured\", func() {\n\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager))\n\t\t\t\t\tcmd.SetArgs(pullPolicyArgs)\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullAlways)\n\t\t\t\t})\n\t\t\t\tit(\"uses the configured pull policy when policy configured\", func() {\n\t\t\t\t\tcmd := packageExtensionCommand(\n\t\t\t\t\t\twithExtensionPackager(fakeExtensionPackager),\n\t\t\t\t\t\twithExtensionClientConfig(config.Config{PullPolicy: \"never\"}),\n\t\t\t\t\t)\n\n\t\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t})\n\n\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullNever)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no config path is specified\", func() {\n\t\t\twhen(\"no path is specified\", func() {\n\t\t\t\tit(\"creates a default config with the uri set to the current working directory\", func() {\n\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager))\n\t\t\t\t\tcmd.SetArgs([]string{\"some-name\"})\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\n\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.Config.Extension.URI, \".\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a path is specified\", func() {\n\t\t\twhen(\"no multi-platform\", func() {\n\t\t\t\tit(\"creates a default config with the appropriate path\", func() {\n\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager))\n\t\t\t\t\tcmd.SetArgs([]string{\"some-name\", \"-p\", \"..\"})\n\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\tbpPath, _ := 
filepath.Abs(\"..\")\n\t\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.Config.Extension.URI, bpPath)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"multi-platform\", func() {\n\t\t\t\tvar targets []dist.Target\n\n\t\t\t\twhen(\"single extension\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\ttargets = []dist.Target{\n\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates a multi-platform extension package\", func() {\n\t\t\t\t\t\tcmd := packageExtensionCommand(withExtensionPackager(fakeExtensionPackager))\n\t\t\t\t\t\tcmd.SetArgs([]string{\"some-name\", \"-p\", \"some-path\", \"--target\", \"linux/amd64\", \"--target\", \"windows/amd64\", \"--format\", \"image\", \"--publish\"})\n\t\t\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\t\t\th.AssertEq(t, fakeExtensionPackager.CreateCalledWithOptions.Targets, targets)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"additional tags are specified\", func() {\n\t\t\tit(\"forwards additional tags to PackageExtension\", func() {\n\t\t\t\texpectedTags := []string{\"additional-tag-1\", \"additional-tag-2\"}\n\t\t\t\tcmd := packageExtensionCommand(\n\t\t\t\t\twithExtensionPackager(fakeExtensionPackager),\n\t\t\t\t)\n\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\"my-specific-image\",\n\t\t\t\t\t\"--tag\", expectedTags[0], \"--tag\", expectedTags[1],\n\t\t\t\t})\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treceivedOptions := fakeExtensionPackager.CreateCalledWithOptions\n\t\t\t\th.AssertEq(t, receivedOptions.AdditionalTags[0], expectedTags[0])\n\t\t\t\th.AssertEq(t, receivedOptions.AdditionalTags[1], expectedTags[1])\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"invalid flags\", func() {\n\t\twhen(\"both --publish and --pull-policy never flags are specified\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcmd := 
packageExtensionCommand()\n\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\", \"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\"--pull-policy\", \"never\",\n\t\t\t\t})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"--publish and --pull-policy=never cannot be used together. The --publish flag requires the use of remote images.\")\n\t\t\t})\n\t\t})\n\n\t\tit(\"logs an error and exits when package toml is invalid\", func() {\n\t\t\texpectedErr := errors.New(\"it went wrong\")\n\n\t\t\tcmd := packageExtensionCommand(\n\t\t\t\twithExtensionLogger(logger),\n\t\t\t\twithExtensionPackageConfigReader(\n\t\t\t\t\tfakes.NewFakePackageConfigReader(whereReadReturns(pubbldpkg.Config{}, expectedErr)),\n\t\t\t\t),\n\t\t\t)\n\n\t\t\terr := cmd.Execute()\n\t\t\th.AssertNotNil(t, err)\n\n\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"ERROR: reading config: %s\", expectedErr))\n\t\t})\n\n\t\twhen(\"package-config is specified\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcmd := packageExtensionCommand()\n\t\t\t\tcmd.SetArgs([]string{\"some-name\", \"--package-config\", \"some-path\"})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"unknown flag: --package-config\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy unknown-policy\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tcmd := packageExtensionCommand()\n\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t\t\"unknown-policy\",\n\t\t\t\t})\n\n\t\t\t\th.AssertError(t, cmd.Execute(), \"parsing pull policy\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--target cannot be parsed\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tcmd := packageCommand()\n\t\t\t\tcmd.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\", \"--config\", 
\"/path/to/some/file\",\n\t\t\t\t\t\"--target\", \"something/wrong\", \"--publish\",\n\t\t\t\t})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"unknown target: 'something/wrong'\")\n\t\t\t})\n\t\t})\n\t})\n}\n\ntype packageExtensionCommandConfig struct {\n\tlogger              *logging.LogWithWriters\n\tpackageConfigReader *fakes.FakePackageConfigReader\n\textensionPackager   *fakes.FakeBuildpackPackager\n\tclientConfig        config.Config\n\timageName           string\n\tconfigPath          string\n}\n\ntype packageExtensionCommandOption func(config *packageExtensionCommandConfig)\n\nfunc packageExtensionCommand(ops ...packageExtensionCommandOption) *cobra.Command {\n\tconfig := &packageExtensionCommandConfig{\n\t\tlogger:              logging.NewLogWithWriters(&bytes.Buffer{}, &bytes.Buffer{}),\n\t\tpackageConfigReader: fakes.NewFakePackageConfigReader(),\n\t\textensionPackager:   &fakes.FakeBuildpackPackager{},\n\t\tclientConfig:        config.Config{},\n\t\timageName:           \"some-image-name\",\n\t\tconfigPath:          \"/path/to/some/file\",\n\t}\n\n\tfor _, op := range ops {\n\t\top(config)\n\t}\n\n\tcmd := commands.ExtensionPackage(config.logger, config.clientConfig, config.extensionPackager, config.packageConfigReader)\n\tcmd.SetArgs([]string{config.imageName, \"--config\", config.configPath})\n\n\treturn cmd\n}\n\nfunc withExtensionLogger(logger *logging.LogWithWriters) packageExtensionCommandOption {\n\treturn func(config *packageExtensionCommandConfig) {\n\t\tconfig.logger = logger\n\t}\n}\n\nfunc withExtensionPackageConfigReader(reader *fakes.FakePackageConfigReader) packageExtensionCommandOption {\n\treturn func(config *packageExtensionCommandConfig) {\n\t\tconfig.packageConfigReader = reader\n\t}\n}\n\nfunc withExtensionPackager(creator *fakes.FakeBuildpackPackager) packageExtensionCommandOption {\n\treturn func(config *packageExtensionCommandConfig) {\n\t\tconfig.extensionPackager = 
creator\n\t}\n}\n\nfunc withExtensionImageName(name string) packageExtensionCommandOption {\n\treturn func(config *packageExtensionCommandConfig) {\n\t\tconfig.imageName = name\n\t}\n}\n\nfunc withExtensionPackageConfigPath(path string) packageExtensionCommandOption {\n\treturn func(config *packageExtensionCommandConfig) {\n\t\tconfig.configPath = path\n\t}\n}\n\nfunc withExtensionClientConfig(clientCfg config.Config) packageExtensionCommandOption {\n\treturn func(config *packageExtensionCommandConfig) {\n\t\tconfig.clientConfig = clientCfg\n\t}\n}\n"
  },
  {
    "path": "internal/commands/extension_pull.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ExtensionPullFlags consist of flags applicable to the `extension pull` command\ntype ExtensionPullFlags struct {\n\t// ExtensionRegistry is the name of the extension registry to use to search for\n\tExtensionRegistry string\n}\n\n// ExtensionPull pulls an extension and stores it locally\nfunc ExtensionPull(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"pull <uri>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Pull an extension from a registry and store it locally\",\n\t\tExample: \"pack extension pull <extension-example>\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\t// logic will be added here\n\t\t\treturn nil\n\t\t}),\n\t}\n\t// flags will be added here\n\tAddHelpFlag(cmd, \"pull\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/extension_register.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype ExtensionRegisterFlags struct {\n\tExtensionRegistry string\n}\n\nfunc ExtensionRegister(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"register <image>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Register an extension to a registry\",\n\t\tExample: \"pack extension register <extension-example>\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\t// logic will be added here\n\t\t\treturn nil\n\t\t}),\n\t}\n\t// flags will be added here\n\tAddHelpFlag(cmd, \"register\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/extension_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/fakes\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestExtensionCommand(t *testing.T) {\n\tspec.Run(t, \"ExtensionCommand\", testExtensionCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testExtensionCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcmd        *cobra.Command\n\t\tlogger     logging.Logger\n\t\toutBuf     bytes.Buffer\n\t\tmockClient *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController := gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcmd = commands.NewExtensionCommand(logger, config.Config{}, mockClient, fakes.NewFakePackageConfigReader())\n\t\tcmd.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\n\twhen(\"extension\", func() {\n\t\tit(\"prints help text\", func() {\n\t\t\tcmd.SetArgs([]string{})\n\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, \"Interact with extensions\")\n\t\t\tfor _, command := range []string{\"Usage\", \"package\", \"register\", \"yank\", \"pull\", \"inspect\"} {\n\t\t\t\th.AssertContains(t, output, command)\n\t\t\t}\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/extension_yank.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype ExtensionYankFlags struct {\n\tExtensionRegistry string\n\tUndo              bool\n}\n\nfunc ExtensionYank(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"yank <extension-id-and-version>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Yank an extension from a registry\",\n\t\tExample: \"pack yank <extension-example>\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\t// logic will be added here\n\t\t\treturn nil\n\t\t}),\n\t}\n\t// flags will be added here\n\tAddHelpFlag(cmd, \"yank\")\n\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/fakes/fake_builder_inspector.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/buildpacks/pack/pkg/client\"\n)\n\ntype FakeBuilderInspector struct {\n\tInfoForLocal   *client.BuilderInfo\n\tInfoForRemote  *client.BuilderInfo\n\tErrorForLocal  error\n\tErrorForRemote error\n\n\tReceivedForLocalName      string\n\tReceivedForRemoteName     string\n\tCalculatedConfigForLocal  client.BuilderInspectionConfig\n\tCalculatedConfigForRemote client.BuilderInspectionConfig\n}\n\nfunc (i *FakeBuilderInspector) InspectBuilder(\n\tname string,\n\tdaemon bool,\n\tmodifiers ...client.BuilderInspectionModifier,\n) (*client.BuilderInfo, error) {\n\tif daemon {\n\t\ti.CalculatedConfigForLocal = client.BuilderInspectionConfig{}\n\t\tfor _, mod := range modifiers {\n\t\t\tmod(&i.CalculatedConfigForLocal)\n\t\t}\n\t\ti.ReceivedForLocalName = name\n\t\treturn i.InfoForLocal, i.ErrorForLocal\n\t}\n\n\ti.CalculatedConfigForRemote = client.BuilderInspectionConfig{}\n\tfor _, mod := range modifiers {\n\t\tmod(&i.CalculatedConfigForRemote)\n\t}\n\ti.ReceivedForRemoteName = name\n\treturn i.InfoForRemote, i.ErrorForRemote\n}\n"
  },
  {
    "path": "internal/commands/fakes/fake_builder_writer.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype FakeBuilderWriter struct {\n\tPrintForLocal  string\n\tPrintForRemote string\n\tErrorForPrint  error\n\n\tReceivedInfoForLocal   *client.BuilderInfo\n\tReceivedInfoForRemote  *client.BuilderInfo\n\tReceivedErrorForLocal  error\n\tReceivedErrorForRemote error\n\tReceivedBuilderInfo    writer.SharedBuilderInfo\n\tReceivedLocalRunImages []config.RunImage\n}\n\nfunc (w *FakeBuilderWriter) Print(\n\tlogger logging.Logger,\n\tlocalRunImages []config.RunImage,\n\tlocal, remote *client.BuilderInfo,\n\tlocalErr, remoteErr error,\n\tbuilderInfo writer.SharedBuilderInfo,\n) error {\n\tw.ReceivedInfoForLocal = local\n\tw.ReceivedInfoForRemote = remote\n\tw.ReceivedErrorForLocal = localErr\n\tw.ReceivedErrorForRemote = remoteErr\n\tw.ReceivedBuilderInfo = builderInfo\n\tw.ReceivedLocalRunImages = localRunImages\n\n\tlogger.Infof(\"\\nLOCAL:\\n%s\\n\", w.PrintForLocal)\n\tlogger.Infof(\"\\nREMOTE:\\n%s\\n\", w.PrintForRemote)\n\n\treturn w.ErrorForPrint\n}\n"
  },
  {
    "path": "internal/commands/fakes/fake_builder_writer_factory.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n)\n\ntype FakeBuilderWriterFactory struct {\n\tReturnForWriter writer.BuilderWriter\n\tErrorForWriter  error\n\n\tReceivedForKind string\n}\n\nfunc (f *FakeBuilderWriterFactory) Writer(kind string) (writer.BuilderWriter, error) {\n\tf.ReceivedForKind = kind\n\n\treturn f.ReturnForWriter, f.ErrorForWriter\n}\n"
  },
  {
    "path": "internal/commands/fakes/fake_buildpack_packager.go",
    "content": "package fakes\n\nimport (\n\t\"context\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n)\n\ntype FakeBuildpackPackager struct {\n\tCreateCalledWithOptions client.PackageBuildpackOptions\n}\n\nfunc (c *FakeBuildpackPackager) PackageBuildpack(ctx context.Context, opts client.PackageBuildpackOptions) error {\n\tc.CreateCalledWithOptions = opts\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/fakes/fake_extension_packager.go",
    "content": "package fakes\n\nimport (\n\t\"context\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n)\n\nfunc (c *FakeBuildpackPackager) PackageExtension(ctx context.Context, opts client.PackageBuildpackOptions) error {\n\tc.CreateCalledWithOptions = opts\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/fakes/fake_inspect_image_writer.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype FakeInspectImageWriter struct {\n\tPrintForLocal  string\n\tPrintForRemote string\n\tErrorForPrint  error\n\n\tReceivedInfoForLocal   *client.ImageInfo\n\tReceivedInfoForRemote  *client.ImageInfo\n\tRecievedGeneralInfo    inspectimage.GeneralInfo\n\tReceivedErrorForLocal  error\n\tReceivedErrorForRemote error\n}\n\nfunc (w *FakeInspectImageWriter) Print(\n\tlogger logging.Logger,\n\tsharedInfo inspectimage.GeneralInfo,\n\tlocal, remote *client.ImageInfo,\n\tlocalErr, remoteErr error,\n) error {\n\tw.ReceivedInfoForLocal = local\n\tw.ReceivedInfoForRemote = remote\n\tw.ReceivedErrorForLocal = localErr\n\tw.ReceivedErrorForRemote = remoteErr\n\tw.RecievedGeneralInfo = sharedInfo\n\n\tlogger.Infof(\"\\nLOCAL:\\n%s\\n\", w.PrintForLocal)\n\tlogger.Infof(\"\\nREMOTE:\\n%s\\n\", w.PrintForRemote)\n\n\treturn w.ErrorForPrint\n}\n"
  },
  {
    "path": "internal/commands/fakes/fake_inspect_image_writer_factory.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n)\n\ntype FakeInspectImageWriterFactory struct {\n\tReturnForWriter writer.InspectImageWriter\n\tErrorForWriter  error\n\n\tReceivedForKind string\n\tReceivedForBOM  bool\n}\n\nfunc (f *FakeInspectImageWriterFactory) Writer(kind string, bom bool) (writer.InspectImageWriter, error) {\n\tf.ReceivedForKind = kind\n\tf.ReceivedForBOM = bom\n\n\treturn f.ReturnForWriter, f.ErrorForWriter\n}\n"
  },
  {
    "path": "internal/commands/fakes/fake_package_config_reader.go",
    "content": "package fakes\n\nimport (\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype FakePackageConfigReader struct {\n\tReadCalledWithArg string\n\tReadReturnConfig  pubbldpkg.Config\n\tReadReturnError   error\n\n\tReadBuildpackDescriptorCalledWithArg string\n\tReadBuildpackDescriptorReturn        dist.BuildpackDescriptor\n\tReadExtensionDescriptorReturn        dist.ExtensionDescriptor\n\tReadBuildpackDescriptorReturnError   error\n}\n\nfunc (r *FakePackageConfigReader) Read(path string) (pubbldpkg.Config, error) {\n\tr.ReadCalledWithArg = path\n\n\treturn r.ReadReturnConfig, r.ReadReturnError\n}\n\nfunc (r *FakePackageConfigReader) ReadBuildpackDescriptor(path string) (dist.BuildpackDescriptor, error) {\n\tr.ReadBuildpackDescriptorCalledWithArg = path\n\n\treturn r.ReadBuildpackDescriptorReturn, r.ReadBuildpackDescriptorReturnError\n}\n\nfunc NewFakePackageConfigReader(ops ...func(*FakePackageConfigReader)) *FakePackageConfigReader {\n\tfakePackageConfigReader := &FakePackageConfigReader{\n\t\tReadReturnConfig: pubbldpkg.Config{},\n\t\tReadReturnError:  nil,\n\t}\n\n\tfor _, op := range ops {\n\t\top(fakePackageConfigReader)\n\t}\n\n\treturn fakePackageConfigReader\n}\n"
  },
  {
    "path": "internal/commands/inspect_builder.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder/writer\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use builder inspect instead.\nfunc InspectBuilder(\n\tlogger logging.Logger,\n\tcfg config.Config,\n\tinspector BuilderInspector,\n\twriterFactory writer.BuilderWriterFactory,\n) *cobra.Command {\n\tvar flags BuilderInspectFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"inspect-builder <builder-image-name>\",\n\t\tArgs:    cobra.MaximumNArgs(2),\n\t\tHidden:  true,\n\t\tShort:   \"Show information about a builder\",\n\t\tExample: \"pack inspect-builder cnbs/sample-builder:bionic\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\timageName := cfg.DefaultBuilder\n\t\t\tif len(args) >= 1 {\n\t\t\t\timageName = args[0]\n\t\t\t}\n\n\t\t\tif imageName == \"\" {\n\t\t\t\tsuggestSettingBuilder(logger, inspector)\n\t\t\t\treturn client.NewSoftError()\n\t\t\t}\n\n\t\t\treturn inspectBuilder(logger, imageName, flags, cfg, inspector, writerFactory)\n\t\t}),\n\t}\n\tcmd.Flags().IntVarP(&flags.Depth, \"depth\", \"d\", builder.OrderDetectionMaxDepth, \"Max depth to display for Detection Order.\\nOmission of this flag or values < 0 will display the entire tree.\")\n\tcmd.Flags().StringVarP(&flags.OutputFormat, \"output\", \"o\", \"human-readable\", \"Output format to display builder detail (json, yaml, toml, human-readable).\\nOmission of this flag will display as human-readable.\")\n\tAddHelpFlag(cmd, \"inspect-builder\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/inspect_builder_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"regexp\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestInspectBuilderCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"InspectBuilderCommand\", testInspectBuilderCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testInspectBuilderCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger logging.Logger\n\t\toutBuf bytes.Buffer\n\t\tcfg    config.Config\n\t)\n\n\tit.Before(func() {\n\t\tcfg = config.Config{\n\t\t\tDefaultBuilder: \"default/builder\",\n\t\t\tRunImages:      expectedLocalRunImages,\n\t\t}\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t})\n\n\twhen(\"InspectBuilder\", func() {\n\t\tvar (\n\t\t\tassert = h.NewAssertionManager(t)\n\t\t)\n\n\t\tit(\"passes output of local and remote builders to correct writer\", func() {\n\t\t\tbuilderInspector := newDefaultBuilderInspector()\n\t\t\tbuilderWriter := newDefaultBuilderWriter()\n\t\t\tbuilderWriterFactory := newWriterFactory(returnsForWriter(builderWriter))\n\n\t\t\tcommand := commands.InspectBuilder(logger, cfg, builderInspector, builderWriterFactory)\n\t\t\tcommand.SetArgs([]string{})\n\t\t\terr := command.Execute()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(builderWriter.ReceivedInfoForLocal, expectedLocalInfo)\n\t\t\tassert.Equal(builderWriter.ReceivedInfoForRemote, expectedRemoteInfo)\n\t\t\tassert.Equal(builderWriter.ReceivedBuilderInfo, expectedBuilderInfo)\n\t\t\tassert.Equal(builderWriter.ReceivedLocalRunImages, expectedLocalRunImages)\n\t\t\tassert.Equal(builderWriterFactory.ReceivedForKind, 
\"human-readable\")\n\t\t\tassert.Equal(builderInspector.ReceivedForLocalName, \"default/builder\")\n\t\t\tassert.Equal(builderInspector.ReceivedForRemoteName, \"default/builder\")\n\t\t\tassert.ContainsF(outBuf.String(), \"LOCAL:\\n%s\", expectedLocalDisplay)\n\t\t\tassert.ContainsF(outBuf.String(), \"REMOTE:\\n%s\", expectedRemoteDisplay)\n\t\t})\n\n\t\twhen(\"image name is provided as first arg\", func() {\n\t\t\tit(\"passes that image name to the inspector\", func() {\n\t\t\t\tbuilderInspector := newDefaultBuilderInspector()\n\t\t\t\twriter := newDefaultBuilderWriter()\n\t\t\t\tcommand := commands.InspectBuilder(logger, cfg, builderInspector, newWriterFactory(returnsForWriter(writer)))\n\t\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(builderInspector.ReceivedForLocalName, \"some/image\")\n\t\t\t\tassert.Equal(builderInspector.ReceivedForRemoteName, \"some/image\")\n\t\t\t\tassert.Equal(writer.ReceivedBuilderInfo.IsDefault, false)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"depth flag is provided\", func() {\n\t\t\tit(\"passes a modifier to the builder inspector\", func() {\n\t\t\t\tbuilderInspector := newDefaultBuilderInspector()\n\t\t\t\tcommand := commands.InspectBuilder(logger, cfg, builderInspector, newDefaultWriterFactory())\n\t\t\t\tcommand.SetArgs([]string{\"--depth\", \"5\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(builderInspector.CalculatedConfigForLocal.OrderDetectionDepth, 5)\n\t\t\t\tassert.Equal(builderInspector.CalculatedConfigForRemote.OrderDetectionDepth, 5)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"output type is set to json\", func() {\n\t\t\tit(\"passes json to the writer factory\", func() {\n\t\t\t\twriterFactory := newDefaultWriterFactory()\n\t\t\t\tcommand := commands.InspectBuilder(logger, cfg, newDefaultBuilderInspector(), writerFactory)\n\t\t\t\tcommand.SetArgs([]string{\"--output\", \"json\"})\n\n\t\t\t\terr := 
command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(writerFactory.ReceivedForKind, \"json\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"output type is set to toml using the shorthand flag\", func() {\n\t\t\tit(\"passes toml to the writer factory\", func() {\n\t\t\t\twriterFactory := newDefaultWriterFactory()\n\t\t\t\tcommand := commands.InspectBuilder(logger, cfg, newDefaultBuilderInspector(), writerFactory)\n\t\t\t\tcommand.SetArgs([]string{\"-o\", \"toml\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(writerFactory.ReceivedForKind, \"toml\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder inspector returns an error for local builder\", func() {\n\t\t\tit(\"passes that error to the writer to handle appropriately\", func() {\n\t\t\t\tbaseError := errors.New(\"couldn't inspect local\")\n\n\t\t\t\tbuilderInspector := newBuilderInspector(errorsForLocal(baseError))\n\t\t\t\tbuilderWriter := newDefaultBuilderWriter()\n\t\t\t\tbuilderWriterFactory := newWriterFactory(returnsForWriter(builderWriter))\n\n\t\t\t\tcommand := commands.InspectBuilder(logger, cfg, builderInspector, builderWriterFactory)\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ErrorWithMessage(builderWriter.ReceivedErrorForLocal, \"couldn't inspect local\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder inspector returns an error for remote builder\", func() {\n\t\t\tit(\"passes that error to the writer to handle appropriately\", func() {\n\t\t\t\tbaseError := errors.New(\"couldn't inspect remote\")\n\n\t\t\t\tbuilderInspector := newBuilderInspector(errorsForRemote(baseError))\n\t\t\t\tbuilderWriter := newDefaultBuilderWriter()\n\t\t\t\tbuilderWriterFactory := newWriterFactory(returnsForWriter(builderWriter))\n\n\t\t\t\tcommand := commands.InspectBuilder(logger, cfg, builderInspector, builderWriterFactory)\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\terr := 
command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ErrorWithMessage(builderWriter.ReceivedErrorForRemote, \"couldn't inspect remote\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"image is trusted\", func() {\n\t\t\tit(\"passes builder info with trusted true to the writer's `Print` method\", func() {\n\t\t\t\tcfg.TrustedBuilders = []config.TrustedBuilder{\n\t\t\t\t\t{Name: \"trusted/builder\"},\n\t\t\t\t}\n\t\t\t\twriter := newDefaultBuilderWriter()\n\n\t\t\t\tcommand := commands.InspectBuilder(\n\t\t\t\t\tlogger,\n\t\t\t\t\tcfg,\n\t\t\t\t\tnewDefaultBuilderInspector(),\n\t\t\t\t\tnewWriterFactory(returnsForWriter(writer)),\n\t\t\t\t)\n\t\t\t\tcommand.SetArgs([]string{\"trusted/builder\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(writer.ReceivedBuilderInfo.Trusted, true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"default builder is configured and is the same as specified by the command\", func() {\n\t\t\tit(\"passes builder info with isDefault true to the writer's `Print` method\", func() {\n\t\t\t\tcfg.DefaultBuilder = \"the/default-builder\"\n\t\t\t\twriter := newDefaultBuilderWriter()\n\n\t\t\t\tcommand := commands.InspectBuilder(\n\t\t\t\t\tlogger,\n\t\t\t\t\tcfg,\n\t\t\t\t\tnewDefaultBuilderInspector(),\n\t\t\t\t\tnewWriterFactory(returnsForWriter(writer)),\n\t\t\t\t)\n\t\t\t\tcommand.SetArgs([]string{\"the/default-builder\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(writer.ReceivedBuilderInfo.IsDefault, true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"default builder is empty and no builder is specified in command args\", func() {\n\t\t\tit(\"suggests builders and returns a soft error\", func() {\n\t\t\t\tcfg.DefaultBuilder = \"\"\n\n\t\t\t\tcommand := commands.InspectBuilder(logger, cfg, newDefaultBuilderInspector(), newDefaultWriterFactory())\n\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Error(err)\n\t\t\t\tif !errors.Is(err, client.SoftError{}) 
{\n\t\t\t\t\tt.Fatalf(\"expect a client.SoftError, got: %s\", err)\n\t\t\t\t}\n\n\t\t\t\tassert.Contains(outBuf.String(), `Please select a default builder with:\n\n\tpack config default-builder <builder-image>`)\n\n\t\t\t\tassert.Matches(outBuf.String(), regexp.MustCompile(`Paketo Buildpacks:\\s+'paketobuildpacks/builder-jammy-base'`))\n\t\t\t\tassert.Matches(outBuf.String(), regexp.MustCompile(`Paketo Buildpacks:\\s+'paketobuildpacks/builder-jammy-full'`))\n\t\t\t\tassert.Matches(outBuf.String(), regexp.MustCompile(`Heroku:\\s+'heroku/builder:24'`))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"print returns an error\", func() {\n\t\t\tit(\"returns that error\", func() {\n\t\t\t\tbaseError := errors.New(\"couldn't write builder\")\n\n\t\t\t\tbuilderWriter := newBuilderWriter(errorsForPrint(baseError))\n\t\t\t\tcommand := commands.InspectBuilder(\n\t\t\t\t\tlogger,\n\t\t\t\t\tcfg,\n\t\t\t\t\tnewDefaultBuilderInspector(),\n\t\t\t\t\tnewWriterFactory(returnsForWriter(builderWriter)),\n\t\t\t\t)\n\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.ErrorWithMessage(err, \"couldn't write builder\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"writer factory returns an error\", func() {\n\t\t\tit(\"returns that error\", func() {\n\t\t\t\tbaseError := errors.New(\"invalid output format\")\n\n\t\t\t\twriterFactory := newWriterFactory(errorsForWriter(baseError))\n\t\t\t\tcommand := commands.InspectBuilder(logger, cfg, newDefaultBuilderInspector(), writerFactory)\n\t\t\t\tcommand.SetArgs([]string{})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.ErrorWithMessage(err, \"invalid output format\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/inspect_buildpack.go",
    "content": "package commands\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"strings\"\n\t\"text/tabwriter\"\n\t\"text/template\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\tstrs \"github.com/buildpacks/pack/internal/strings\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst inspectBuildpackTemplate = `\n{{ .Location -}}:\n\nStacks:\n{{- range $stackIndex, $stack := .Metadata.Stacks }}\n  ID: {{ $stack.ID }}\n    Mixins:\n  {{- if $.ListMixins }}\n    {{- if eq (len $stack.Mixins) 0 }}\n      (none)\n    {{- else }}\n      {{- range $mixinIndex, $mixin := $stack.Mixins }}\n      {{ $mixin }}\n      {{- end }}\n    {{- end }}\n  {{- else }}\n      (omitted)\n  {{- end }}\n{{- end }}\n\nBuildpacks:\n{{ .Buildpacks }}\n\nDetection Order:\n{{- if ne .Order \"\" }}\n{{ .Order }}\n{{- else }}\n  (none)\n{{ end }}\n`\n\nconst (\n\twriterMinWidth     = 0\n\twriterTabWidth     = 0\n\tbuildpacksTabWidth = 8\n\tdefaultTabWidth    = 4\n\twriterPadChar      = ' '\n\twriterFlags        = 0\n)\n\n// Deprecated: Use buildpack inspect instead.\nfunc InspectBuildpack(logger logging.Logger, cfg config.Config, client PackClient) *cobra.Command {\n\tvar flags BuildpackInspectFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"inspect-buildpack <image-name>\",\n\t\tArgs:    cobra.RangeArgs(1, 4),\n\t\tHidden:  true,\n\t\tShort:   \"Show information about a buildpack\",\n\t\tExample: \"pack inspect-buildpack cnbs/sample-package:hello-universe\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tbuildpackName := args[0]\n\t\t\tregistry := flags.Registry\n\t\t\tif registry == \"\" {\n\t\t\t\t//nolint:staticcheck\n\t\t\t\tregistry = cfg.DefaultRegistry\n\t\t\t}\n\n\t\t\treturn buildpackInspect(logger, buildpackName, registry, flags, cfg, 
client)\n\t\t}),\n\t}\n\tcmd.Flags().IntVarP(&flags.Depth, \"depth\", \"d\", -1, \"Max depth to display for Detection Order.\\nOmission of this flag or values < 0 will display the entire tree.\")\n\tcmd.Flags().StringVarP(&flags.Registry, \"registry\", \"r\", \"\", \"buildpack registry that may be searched\")\n\tcmd.Flags().BoolVarP(&flags.Verbose, \"verbose\", \"v\", false, \"show more output\")\n\tAddHelpFlag(cmd, \"inspect-buildpack\")\n\treturn cmd\n}\n\nfunc inspectAllBuildpacks(client PackClient, flags BuildpackInspectFlags, options ...client.InspectBuildpackOptions) (string, error) {\n\tbuf := bytes.NewBuffer(nil)\n\terrArray := []error{}\n\tfor _, option := range options {\n\t\tnextResult, err := client.InspectBuildpack(option)\n\t\tif err != nil {\n\t\t\terrArray = append(errArray, err)\n\t\t\tcontinue\n\t\t}\n\n\t\tprefix := determinePrefix(option.BuildpackName, nextResult.Location, option.Daemon)\n\n\t\toutput, err := inspectBuildpackOutput(nextResult, prefix, flags)\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\n\t\tif _, err := buf.Write(output); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\n\t\tif nextResult.Location != buildpack.PackageLocator {\n\t\t\treturn buf.String(), nil\n\t\t}\n\t}\n\tif len(errArray) == len(options) {\n\t\treturn \"\", joinErrors(errArray)\n\t}\n\treturn buf.String(), nil\n}\n\nfunc inspectBuildpackOutput(info *client.BuildpackInfo, prefix string, flags BuildpackInspectFlags) (output []byte, err error) {\n\ttpl := template.Must(template.New(\"inspect-buildpack\").Parse(inspectBuildpackTemplate))\n\tbpOutput, err := buildpacksOutput(info.Buildpacks)\n\tif err != nil {\n\t\treturn []byte{}, fmt.Errorf(\"error writing buildpack output: %q\", err)\n\t}\n\torderOutput, err := detectionOrderOutput(info.Order, info.BuildpackLayers, flags.Depth)\n\tif err != nil {\n\t\treturn []byte{}, fmt.Errorf(\"error writing detection order output: %q\", err)\n\t}\n\tbuf := bytes.NewBuffer(nil)\n\n\terr = tpl.Execute(buf, &struct 
{\n\t\tLocation   string\n\t\tMetadata   buildpack.Metadata\n\t\tListMixins bool\n\t\tBuildpacks string\n\t\tOrder      string\n\t}{\n\t\tLocation:   prefix,\n\t\tMetadata:   info.BuildpackMetadata,\n\t\tListMixins: flags.Verbose,\n\t\tBuildpacks: bpOutput,\n\t\tOrder:      orderOutput,\n\t})\n\n\tif err != nil {\n\t\treturn []byte{}, fmt.Errorf(\"error templating buildpack output template: %q\", err)\n\t}\n\treturn buf.Bytes(), nil\n}\n\nfunc determinePrefix(name string, locator buildpack.LocatorType, daemon bool) string {\n\tswitch locator {\n\tcase buildpack.RegistryLocator:\n\t\treturn \"REGISTRY IMAGE\"\n\tcase buildpack.PackageLocator:\n\t\tif daemon {\n\t\t\treturn \"LOCAL IMAGE\"\n\t\t}\n\t\treturn \"REMOTE IMAGE\"\n\tcase buildpack.URILocator:\n\t\tif strings.HasPrefix(name, \"http\") {\n\t\t\treturn \"REMOTE ARCHIVE\"\n\t\t}\n\t\treturn \"LOCAL ARCHIVE\"\n\t}\n\treturn \"UNKNOWN SOURCE\"\n}\n\nfunc buildpacksOutput(bps []dist.ModuleInfo) (string, error) {\n\tbuf := &bytes.Buffer{}\n\n\ttabWriter := new(tabwriter.Writer).Init(buf, writerMinWidth, writerTabWidth, buildpacksTabWidth, writerPadChar, writerFlags)\n\tif _, err := fmt.Fprint(tabWriter, \"  ID\\tNAME\\tVERSION\\tHOMEPAGE\\n\"); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tfor _, bp := range bps {\n\t\tif _, err := fmt.Fprintf(tabWriter, \"  %s\\t%s\\t%s\\t%s\\n\", bp.ID, strs.ValueOrDefault(bp.Name, \"-\"), bp.Version, strs.ValueOrDefault(bp.Homepage, \"-\")); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t}\n\n\tif err := tabWriter.Flush(); err != nil {\n\t\treturn \"\", err\n\t}\n\n\treturn strings.TrimSuffix(buf.String(), \"\\n\"), nil\n}\n\n// Converting between output formats here is awkward; this could use a cleaner solution.\nfunc detectionOrderOutput(order dist.Order, layers dist.ModuleLayers, maxDepth int) (string, error) {\n\tbuf := strings.Builder{}\n\ttabWriter := new(tabwriter.Writer).Init(&buf, writerMinWidth, writerTabWidth, defaultTabWidth, writerPadChar, writerFlags)\n\tbuildpackSet := 
map[client.BuildpackInfoKey]bool{}\n\n\tif err := orderOutputRecurrence(tabWriter, \"\", order, layers, buildpackSet, 0, maxDepth); err != nil {\n\t\treturn \"\", err\n\t}\n\tif err := tabWriter.Flush(); err != nil {\n\t\treturn \"\", fmt.Errorf(\"error flushing tabWriter output: %s\", err)\n\t}\n\treturn strings.TrimSuffix(buf.String(), \"\\n\"), nil\n}\n\n// Recursively generate output for every buildpack in an order.\nfunc orderOutputRecurrence(w io.Writer, prefix string, order dist.Order, layers dist.ModuleLayers, buildpackSet map[client.BuildpackInfoKey]bool, curDepth, maxDepth int) error {\n\t// exit if maxDepth is exceeded\n\tif validMaxDepth(maxDepth) && maxDepth <= curDepth {\n\t\treturn nil\n\t}\n\n\t// otherwise iterate over all nested buildpacks\n\tfor groupIndex, group := range order {\n\t\tlastGroup := groupIndex == (len(order) - 1)\n\t\tif err := displayGroup(w, prefix, groupIndex+1, lastGroup); err != nil {\n\t\t\treturn fmt.Errorf(\"error when printing group info: %q\", err)\n\t\t}\n\t\tfor bpIndex, buildpackEntry := range group.Group {\n\t\t\tlastBuildpack := bpIndex == len(group.Group)-1\n\n\t\t\tkey := client.BuildpackInfoKey{\n\t\t\t\tID:      buildpackEntry.ID,\n\t\t\t\tVersion: buildpackEntry.Version,\n\t\t\t}\n\t\t\t_, visited := buildpackSet[key]\n\t\t\tbuildpackSet[key] = true\n\n\t\t\tcurBuildpackLayer, ok := layers.Get(buildpackEntry.ID, buildpackEntry.Version)\n\t\t\tif !ok {\n\t\t\t\treturn fmt.Errorf(\"error: missing buildpack %s@%s from layer metadata\", buildpackEntry.ID, buildpackEntry.Version)\n\t\t\t}\n\n\t\t\tnewBuildpackPrefix := updatePrefix(prefix, lastGroup)\n\t\t\tif err := displayBuildpack(w, newBuildpackPrefix, buildpackEntry, visited, bpIndex == len(group.Group)-1); err != nil {\n\t\t\t\treturn fmt.Errorf(\"error when printing buildpack info: %q\", err)\n\t\t\t}\n\n\t\t\tnewGroupPrefix := updatePrefix(newBuildpackPrefix, lastBuildpack)\n\t\t\tif !visited {\n\t\t\t\tif err := orderOutputRecurrence(w, newGroupPrefix, 
curBuildpackLayer.Order, layers, buildpackSet, curDepth+1, maxDepth); err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t}\n\n\t\t\t// remove key from set after recurrence completes, so we only detect cycles.\n\t\t\tdelete(buildpackSet, key)\n\t\t}\n\t}\n\treturn nil\n}\n\nconst (\n\tbranchPrefix     = \" ├ \"\n\tlastBranchPrefix = \" └ \"\n\ttrunkPrefix      = \" │ \"\n)\n\nfunc updatePrefix(oldPrefix string, last bool) string {\n\tif last {\n\t\treturn oldPrefix + \"   \"\n\t}\n\treturn oldPrefix + trunkPrefix\n}\n\nfunc validMaxDepth(depth int) bool {\n\treturn depth >= 0\n}\n\nfunc displayGroup(w io.Writer, prefix string, groupCount int, last bool) error {\n\ttreePrefix := branchPrefix\n\tif last {\n\t\ttreePrefix = lastBranchPrefix\n\t}\n\t_, err := fmt.Fprintf(w, \"%s%sGroup #%d:\\n\", prefix, treePrefix, groupCount)\n\treturn err\n}\n\nfunc displayBuildpack(w io.Writer, prefix string, entry dist.ModuleRef, visited bool, last bool) error {\n\tvar optional string\n\tif entry.Optional {\n\t\toptional = \"(optional)\"\n\t}\n\n\tvisitedStatus := \"\"\n\tif visited {\n\t\tvisitedStatus = \"[cyclic]\"\n\t}\n\n\tbpRef := entry.ID\n\tif entry.Version != \"\" {\n\t\tbpRef += \"@\" + entry.Version\n\t}\n\n\ttreePrefix := branchPrefix\n\tif last {\n\t\ttreePrefix = lastBranchPrefix\n\t}\n\n\t_, err := fmt.Fprintf(w, \"%s%s%s\\t%s%s\\n\", prefix, treePrefix, bpRef, optional, visitedStatus)\n\treturn err\n}\n\nfunc joinErrors(errs []error) error {\n\terrStrings := make([]string, len(errs))\n\tfor idx, err := range errs {\n\t\terrStrings[idx] = err.Error()\n\t}\n\n\treturn errors.New(strings.Join(errStrings, \", \"))\n}\n"
  },
  {
    "path": "internal/commands/inspect_buildpack_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestInspectBuildpackCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"InspectBuildpackCommand\", testInspectBuildpackCommand, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testInspectBuildpackCommand(t *testing.T, when spec.G, it spec.S) {\n\tapiVersion, err := api.NewVersion(\"0.2\")\n\tif err != nil {\n\t\tt.Fail()\n\t}\n\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t\tcomplexInfo    *client.BuildpackInfo\n\t\tsimpleInfo     *client.BuildpackInfo\n\t\tassert         = h.NewAssertionManager(t)\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\tcfg = config.Config{\n\t\t\tDefaultRegistry: \"default-registry\",\n\t\t}\n\n\t\tcomplexInfo = &client.BuildpackInfo{\n\t\t\tBuildpackMetadata: buildpack.Metadata{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:       \"some/top-buildpack\",\n\t\t\t\t\tVersion: 
 \"0.0.1\",\n\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/first-inner-buildpack\",\n\t\t\t\t\tVersion:  \"1.0.0\",\n\t\t\t\t\tHomepage: \"first-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/second-inner-buildpack\",\n\t\t\t\t\tVersion:  \"2.0.0\",\n\t\t\t\t\tHomepage: \"second-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/third-inner-buildpack\",\n\t\t\t\t\tVersion:  \"3.0.0\",\n\t\t\t\t\tHomepage: \"third-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/top-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t\tName:     \"top\",\n\t\t\t\t},\n\t\t\t},\n\t\t\tOrder: dist.Order{\n\t\t\t\t{\n\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:       \"some/top-buildpack\",\n\t\t\t\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpackLayers: dist.ModuleLayers{\n\t\t\t\t\"some/first-inner-buildpack\": {\n\t\t\t\t\t\"1.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: 
[]dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/third-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"3.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/third-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"3.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:first-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"first-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"some/second-inner-buildpack\": {\n\t\t\t\t\t\"2.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:second-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"second-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"some/third-inner-buildpack\": {\n\t\t\t\t\t\"3.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: 
[]string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:third-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"third-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"some/top-buildpack\": {\n\t\t\t\t\t\"0.0.1\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/second-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"2.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:top-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"top-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t}\n\n\t\tsimpleInfo = &client.BuildpackInfo{\n\t\t\tBuildpackMetadata: buildpack.Metadata{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:       \"some/single-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tHomepage: \"single-homepage-homepace\",\n\t\t\t\t},\n\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", 
\"mixin2\"}},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/single-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tHomepage: \"single-buildpack-homepage\",\n\t\t\t\t\tName:     \"some\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:      \"some/buildpack-no-homepage\",\n\t\t\t\t\tVersion: \"0.0.2\",\n\t\t\t\t},\n\t\t\t},\n\t\t\tOrder: dist.Order{\n\t\t\t\t{\n\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:       \"some/single-buildpack\",\n\t\t\t\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\t\t\t\tHomepage: \"single-buildpack-homepage\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpackLayers: dist.ModuleLayers{\n\t\t\t\t\"some/single-buildpack\": {\n\t\t\t\t\t\"0.0.1\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\", Mixins: []string{\"mixin1\", \"mixin2\", \"build:mixin3\", \"build:mixin4\"}},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\", Mixins: []string{\"mixin1\", \"mixin2\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:single-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"single-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t}\n\n\t\tcommand = commands.InspectBuildpack(logger, cfg, mockClient)\n\t})\n\n\twhen(\"InspectBuildpack\", func() {\n\t\twhen(\"inspecting an image\", func() {\n\t\t\twhen(\"both remote and local image are present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tcomplexInfo.Location = buildpack.PackageLocator\n\t\t\t\t\tsimpleInfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(complexInfo, 
nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"test/buildpack\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\tlocalOutputSection := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"test/buildpack\",\n\t\t\t\t\t\t\"LOCAL IMAGE:\",\n\t\t\t\t\t\tcomplexOutputSection)\n\n\t\t\t\t\tremoteOutputSection := fmt.Sprintf(\"%s\\n\\n%s\",\n\t\t\t\t\t\t\"REMOTE IMAGE:\",\n\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), localOutputSection)\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), remoteOutputSection)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"only a local image is present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tcomplexInfo.Location = buildpack.PackageLocator\n\t\t\t\t\tsimpleInfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-local-test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(complexInfo, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-local-test/buildpack\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(nil, errors.Wrap(image.ErrNotFound, \"remote image not found!\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"displays output for local image\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"only-local-test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"only-local-test/buildpack\",\n\t\t\t\t\t\t\"LOCAL 
IMAGE:\",\n\t\t\t\t\t\tcomplexOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"only a remote image is present\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tcomplexInfo.Location = buildpack.PackageLocator\n\t\t\t\t\tsimpleInfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-remote-test/buildpack\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(complexInfo, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-remote-test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(nil, errors.Wrap(image.ErrNotFound, \"local image not found!\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"displays output for remote image\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"only-remote-test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"only-remote-test/buildpack\",\n\t\t\t\t\t\t\"REMOTE IMAGE:\",\n\t\t\t\t\t\tcomplexOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"local docker daemon is not running\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tcomplexInfo.Location = buildpack.PackageLocator\n\t\t\t\t\tsimpleInfo.Location = buildpack.PackageLocator\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"only-remote-test/buildpack\",\n\t\t\t\t\t\tDaemon:        false,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(complexInfo, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: 
\"only-remote-test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(nil, errors.New(\"the docker daemon is not running\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"displays output for remote image\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"only-remote-test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"only-remote-test/buildpack\",\n\t\t\t\t\t\t\"REMOTE IMAGE:\",\n\t\t\t\t\t\tcomplexOutputSection)\n\n\t\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspecting a buildpack uri\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsimpleInfo.Location = buildpack.URILocator\n\t\t\t})\n\t\t\twhen(\"uri is a local path\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"/path/to/test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t})\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"/path/to/test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"/path/to/test/buildpack\",\n\t\t\t\t\t\t\"LOCAL ARCHIVE:\",\n\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\tassert.TrimmedEq(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"uri is an http or https location\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsimpleInfo.Location = buildpack.URILocator\n\t\t\t\t})\n\t\t\t\twhen(\"uri is an https location\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\t\tBuildpackName: \"https://path/to/test/buildpack\",\n\t\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\t\tRegistry:      
\"default-registry\",\n\t\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{\"https://path/to/test/buildpack\"})\n\t\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\t\"https://path/to/test/buildpack\",\n\t\t\t\t\t\t\t\"REMOTE ARCHIVE:\",\n\t\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\t\tassert.TrimmedEq(outBuf.String(), expectedOutput)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"inspecting a buildpack on the registry\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsimpleInfo.Location = buildpack.RegistryLocator\n\t\t\t})\n\t\t\twhen(\"using the default registry\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"urn:cnb:registry:test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t})\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"urn:cnb:registry:test/buildpack\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"urn:cnb:registry:test/buildpack\",\n\t\t\t\t\t\t\"REGISTRY IMAGE:\",\n\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\tassert.TrimmedEq(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"using a user provided registry\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"urn:cnb:registry:test/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t\t}).Return(simpleInfo, nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"urn:cnb:registry:test/buildpack\", \"-r\", 
\"some-registry\"})\n\t\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\t\texpectedOutput := fmt.Sprintf(inspectOutputTemplate,\n\t\t\t\t\t\t\"urn:cnb:registry:test/buildpack\",\n\t\t\t\t\t\t\"REGISTRY IMAGE:\",\n\t\t\t\t\t\tsimpleOutputSection)\n\n\t\t\t\t\tassert.TrimmedEq(outBuf.String(), expectedOutput)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a depth flag is passed\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tcomplexInfo.Location = buildpack.URILocator\n\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"/other/path/to/test/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(complexInfo, nil)\n\t\t\t})\n\t\t\tit(\"displays detection order to specified depth\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"/other/path/to/test/buildpack\", \"-d\", \"2\"})\n\t\t\t\tassert.Nil(command.Execute())\n\n\t\t\t\tassert.AssertTrimmedContains(outBuf.String(), depthOutputSection)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"verbose flag is passed\", func() {\n\t\tit.Before(func() {\n\t\t\tsimpleInfo.Location = buildpack.URILocator\n\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\tBuildpackName: \"/another/path/to/test/buildpack\",\n\t\t\t\tDaemon:        true,\n\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t}).Return(simpleInfo, nil)\n\t\t})\n\t\tit(\"displays all mixins\", func() {\n\t\t\tcommand.SetArgs([]string{\"/another/path/to/test/buildpack\", \"-v\"})\n\t\t\tassert.Nil(command.Execute())\n\n\t\t\tassert.AssertTrimmedContains(outBuf.String(), simpleMixinOutputSection)\n\t\t})\n\t})\n\n\twhen(\"failure cases\", func() {\n\t\twhen(\"unable to inspect buildpack image\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"failure-case/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      
\"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.Wrap(image.ErrNotFound, \"unable to inspect local failure-case/buildpack\"))\n\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"failure-case/buildpack\",\n\t\t\t\t\tDaemon:        false,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.Wrap(image.ErrNotFound, \"unable to inspect remote failure-case/buildpack\"))\n\t\t\t})\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"failure-case/buildpack\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Error(err)\n\t\t\t})\n\t\t})\n\t\twhen(\"unable to inspect buildpack archive\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"http://path/to/failure-case/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.New(\"error inspecting local archive\"))\n\t\t\t})\n\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"http://path/to/failure-case/buildpack\"})\n\t\t\t\terr := command.Execute()\n\n\t\t\t\tassert.Error(err)\n\t\t\t\tassert.Contains(err.Error(), \"error inspecting local archive\")\n\t\t\t})\n\t\t})\n\t\twhen(\"unable to inspect both remote and local images\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"image-failure-case/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      \"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.Wrap(image.ErrNotFound, \"error inspecting local archive\"))\n\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"image-failure-case/buildpack\",\n\t\t\t\t\tDaemon:        false,\n\t\t\t\t\tRegistry:      
\"default-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.Wrap(image.ErrNotFound, \"error inspecting remote archive\"))\n\t\t\t})\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"image-failure-case/buildpack\"})\n\t\t\t\terr := command.Execute()\n\n\t\t\t\tassert.Error(err)\n\t\t\t\tassert.Contains(err.Error(), \"error writing buildpack output: \\\"error inspecting local archive: not found, error inspecting remote archive: not found\\\"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"unable to inspect buildpack on registry\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"urn:cnb:registry:registry-failure/buildpack\",\n\t\t\t\t\tDaemon:        true,\n\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.New(\"error inspecting registry image\"))\n\n\t\t\t\tmockClient.EXPECT().InspectBuildpack(client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: \"urn:cnb:registry:registry-failure/buildpack\",\n\t\t\t\t\tDaemon:        false,\n\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t}).Return(&client.BuildpackInfo{}, errors.New(\"error inspecting registry image\"))\n\t\t\t})\n\n\t\t\tit(\"errors\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"urn:cnb:registry:registry-failure/buildpack\", \"-r\", \"some-registry\"})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\tassert.Error(err)\n\t\t\t\tassert.Contains(err.Error(), \"error inspecting registry image\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/inspect_extension.go",
    "content": "package commands\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"strings\"\n\t\"text/tabwriter\"\n\t\"text/template\"\n\n\tstrs \"github.com/buildpacks/pack/internal/strings\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\nconst inspectExtensionTemplate = `\n{{ .Location -}}:\n\nExtension:\n{{ .Extension }}\n`\n\nfunc inspectAllExtensions(client PackClient, options ...client.InspectExtensionOptions) (string, error) {\n\tbuf := bytes.NewBuffer(nil)\n\terrArray := []error{}\n\tfor _, option := range options {\n\t\tnextResult, err := client.InspectExtension(option)\n\t\tif err != nil {\n\t\t\terrArray = append(errArray, err)\n\t\t\tcontinue\n\t\t}\n\n\t\tprefix := determinePrefix(option.ExtensionName, nextResult.Location, option.Daemon)\n\n\t\toutput, err := inspectExtensionOutput(nextResult, prefix)\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\n\t\tif _, err := buf.Write(output); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\n\t\tif nextResult.Location != buildpack.PackageLocator {\n\t\t\treturn buf.String(), nil\n\t\t}\n\t}\n\tif len(errArray) == len(options) {\n\t\treturn \"\", joinErrors(errArray)\n\t}\n\treturn buf.String(), nil\n}\n\nfunc inspectExtensionOutput(info *client.ExtensionInfo, prefix string) (output []byte, err error) {\n\ttpl := template.Must(template.New(\"inspect-extension\").Parse(inspectExtensionTemplate))\n\texOutput, err := extensionsOutput(info.Extension)\n\tif err != nil {\n\t\treturn []byte{}, fmt.Errorf(\"error writing extension output: %q\", err)\n\t}\n\n\tbuf := bytes.NewBuffer(nil)\n\terr = tpl.Execute(buf, &struct {\n\t\tLocation  string\n\t\tExtension string\n\t}{\n\t\tLocation:  prefix,\n\t\tExtension: exOutput,\n\t})\n\n\tif err != nil {\n\t\treturn []byte{}, fmt.Errorf(\"error templating extension output template: %q\", err)\n\t}\n\treturn buf.Bytes(), nil\n}\n\nfunc extensionsOutput(ex dist.ModuleInfo) (string, error) 
{\n\tbuf := &bytes.Buffer{}\n\n\ttabWriter := new(tabwriter.Writer).Init(buf, writerMinWidth, writerPadChar, buildpacksTabWidth, writerPadChar, writerFlags)\n\tif _, err := fmt.Fprint(tabWriter, \"  ID\\tNAME\\tVERSION\\tHOMEPAGE\\n\"); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tif _, err := fmt.Fprintf(tabWriter, \"  %s\\t%s\\t%s\\t%s\\n\", ex.ID, strs.ValueOrDefault(ex.Name, \"-\"), ex.Version, strs.ValueOrDefault(ex.Homepage, \"-\")); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tif err := tabWriter.Flush(); err != nil {\n\t\treturn \"\", err\n\t}\n\n\treturn strings.TrimSuffix(buf.String(), \"\\n\"), nil\n}\n"
  },
  {
    "path": "internal/commands/inspect_image.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n//go:generate mockgen -package testmocks -destination testmocks/mock_inspect_image_writer_factory.go github.com/buildpacks/pack/internal/commands InspectImageWriterFactory\ntype InspectImageWriterFactory interface {\n\tWriter(kind string, BOM bool) (writer.InspectImageWriter, error)\n}\n\ntype InspectImageFlags struct {\n\tBOM          bool\n\tOutputFormat string\n}\n\nfunc InspectImage(\n\tlogger logging.Logger,\n\twriterFactory InspectImageWriterFactory,\n\tcfg config.Config,\n\tclient PackClient,\n) *cobra.Command {\n\tvar flags InspectImageFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"inspect <image-name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tAliases: []string{\"inspect-image\"},\n\t\tShort:   \"Show information about a built app image\",\n\t\tExample: \"pack inspect buildpacksio/pack\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\timg := args[0]\n\n\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\tName:            img,\n\t\t\t\tRunImageMirrors: cfg.RunImages,\n\t\t\t}\n\n\t\t\tw, err := writerFactory.Writer(flags.OutputFormat, flags.BOM)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tremote, remoteErr := client.InspectImage(img, false)\n\t\t\tlocal, localErr := client.InspectImage(img, true)\n\n\t\t\tif flags.BOM {\n\t\t\t\tlogger.Warn(\"Using the '--bom' flag with 'pack inspect-image <image-name>' is deprecated. 
Users are encouraged to use 'pack sbom download <image-name>'.\")\n\t\t\t}\n\n\t\t\tif err := w.Print(logger, sharedImageInfo, local, remote, localErr, remoteErr); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\treturn nil\n\t\t}),\n\t}\n\tAddHelpFlag(cmd, \"inspect\")\n\tcmd.Flags().BoolVar(&flags.BOM, \"bom\", false, \"print bill of materials\")\n\tcmd.Flags().StringVarP(&flags.OutputFormat, \"output\", \"o\", \"human-readable\", \"Output format to display builder detail (json, yaml, toml, human-readable).\\nOmission of this flag will display as human-readable.\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/inspect_image_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/fakes\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nvar (\n\texpectedLocalImageDisplay  = \"Sample output for local image\"\n\texpectedRemoteImageDisplay = \"Sample output for remote image\"\n\n\texpectedSharedInfo = inspectimage.GeneralInfo{\n\t\tName: \"some/image\",\n\t}\n\n\texpectedLocalImageInfo = &client.ImageInfo{\n\t\tStackID:    \"local.image.stack\",\n\t\tBuildpacks: nil,\n\t\tBase:       files.RunImageForRebase{},\n\t\tBOM:        nil,\n\t\tStack:      files.Stack{},\n\t\tProcesses:  client.ProcessDetails{},\n\t}\n\n\texpectedRemoteImageInfo = &client.ImageInfo{\n\t\tStackID:    \"remote.image.stack\",\n\t\tBuildpacks: nil,\n\t\tBase:       files.RunImageForRebase{},\n\t\tBOM:        nil,\n\t\tStack:      files.Stack{},\n\t\tProcesses:  client.ProcessDetails{},\n\t}\n\n\texpectedLocalImageWithExtensionInfo = &client.ImageInfo{\n\t\tStackID:    \"local.image.stack\",\n\t\tBuildpacks: nil,\n\t\tExtensions: nil,\n\t\tBase:       files.RunImageForRebase{},\n\t\tBOM:        nil,\n\t\tStack:      files.Stack{},\n\t\tProcesses:  client.ProcessDetails{},\n\t}\n\n\texpectedRemoteImageWithExtensionInfo = &client.ImageInfo{\n\t\tStackID:    \"remote.image.stack\",\n\t\tBuildpacks: nil,\n\t\tExtensions: nil,\n\t\tBase:       files.RunImageForRebase{},\n\t\tBOM:        nil,\n\t\tStack:      
files.Stack{},\n\t\tProcesses:  client.ProcessDetails{},\n\t}\n)\n\nfunc TestInspectImageCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Commands\", testInspectImageCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testInspectImageCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t)\n\n\tit.Before(func() {\n\t\tcfg = config.Config{}\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\twhen(\"#InspectImage\", func() {\n\t\tvar (\n\t\t\tassert = h.NewAssertionManager(t)\n\t\t)\n\t\tit(\"passes output of local and remote builders to correct writer\", func() {\n\t\t\tinspectImageWriter := newDefaultInspectImageWriter()\n\t\t\tinspectImageWriterFactory := newImageWriterFactory(inspectImageWriter)\n\n\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", true).Return(expectedLocalImageInfo, nil)\n\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", false).Return(expectedRemoteImageInfo, nil)\n\t\t\tcommand := commands.InspectImage(logger, inspectImageWriterFactory, cfg, mockClient)\n\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\t\t\terr := command.Execute()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectImageWriter.ReceivedInfoForLocal, expectedLocalImageInfo)\n\t\t\tassert.Equal(inspectImageWriter.ReceivedInfoForRemote, expectedRemoteImageInfo)\n\t\t\tassert.Equal(inspectImageWriter.RecievedGeneralInfo, expectedSharedInfo)\n\t\t\tassert.Equal(inspectImageWriter.ReceivedErrorForLocal, nil)\n\t\t\tassert.Equal(inspectImageWriter.ReceivedErrorForRemote, nil)\n\t\t\tassert.Equal(inspectImageWriterFactory.ReceivedForKind, 
\"human-readable\")\n\n\t\t\tassert.ContainsF(outBuf.String(), \"LOCAL:\\n%s\", expectedLocalImageDisplay)\n\t\t\tassert.ContainsF(outBuf.String(), \"REMOTE:\\n%s\", expectedRemoteImageDisplay)\n\t\t})\n\n\t\tit(\"passes output of local and remote builders to correct writer for extension\", func() {\n\t\t\tinspectImageWriter := newDefaultInspectImageWriter()\n\t\t\tinspectImageWriterFactory := newImageWriterFactory(inspectImageWriter)\n\n\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", true).Return(expectedLocalImageWithExtensionInfo, nil)\n\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", false).Return(expectedRemoteImageWithExtensionInfo, nil)\n\t\t\tcommand := commands.InspectImage(logger, inspectImageWriterFactory, cfg, mockClient)\n\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\t\t\terr := command.Execute()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectImageWriter.ReceivedInfoForLocal, expectedLocalImageWithExtensionInfo)\n\t\t\tassert.Equal(inspectImageWriter.ReceivedInfoForRemote, expectedRemoteImageWithExtensionInfo)\n\t\t\tassert.Equal(inspectImageWriter.RecievedGeneralInfo, expectedSharedInfo)\n\t\t\tassert.Equal(inspectImageWriter.ReceivedErrorForLocal, nil)\n\t\t\tassert.Equal(inspectImageWriter.ReceivedErrorForRemote, nil)\n\t\t\tassert.Equal(inspectImageWriterFactory.ReceivedForKind, \"human-readable\")\n\n\t\t\tassert.ContainsF(outBuf.String(), \"LOCAL:\\n%s\", expectedLocalImageDisplay)\n\t\t\tassert.ContainsF(outBuf.String(), \"REMOTE:\\n%s\", expectedRemoteImageDisplay)\n\t\t})\n\n\t\tit(\"passes configured run image mirrors to the writer\", func() {\n\t\t\tcfg = config.Config{\n\t\t\t\tRunImages: []config.RunImage{{\n\t\t\t\t\tImage:   \"image-name\",\n\t\t\t\t\tMirrors: []string{\"first-mirror\", \"second-mirror2\"},\n\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"image-name2\",\n\t\t\t\t\t\tMirrors: []string{\"other-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tTrustedBuilders: nil,\n\t\t\t\tRegistries:      
nil,\n\t\t\t}\n\n\t\t\tinspectImageWriter := newDefaultInspectImageWriter()\n\t\t\tinspectImageWriterFactory := newImageWriterFactory(inspectImageWriter)\n\n\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", true).Return(expectedLocalImageInfo, nil)\n\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", false).Return(expectedRemoteImageInfo, nil)\n\n\t\t\tcommand := commands.InspectImage(logger, inspectImageWriterFactory, cfg, mockClient)\n\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\t\t\terr := command.Execute()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectImageWriter.RecievedGeneralInfo.RunImageMirrors, cfg.RunImages)\n\t\t})\n\n\t\tit(\"passes configured run image mirrors to the writer for extension\", func() {\n\t\t\tcfg = config.Config{\n\t\t\t\tRunImages: []config.RunImage{{\n\t\t\t\t\tImage:   \"image-name\",\n\t\t\t\t\tMirrors: []string{\"first-mirror\", \"second-mirror2\"},\n\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"image-name2\",\n\t\t\t\t\t\tMirrors: []string{\"other-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tTrustedBuilders: nil,\n\t\t\t\tRegistries:      nil,\n\t\t\t}\n\n\t\t\tinspectImageWriter := newDefaultInspectImageWriter()\n\t\t\tinspectImageWriterFactory := newImageWriterFactory(inspectImageWriter)\n\n\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", true).Return(expectedLocalImageWithExtensionInfo, nil)\n\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", false).Return(expectedRemoteImageWithExtensionInfo, nil)\n\n\t\t\tcommand := commands.InspectImage(logger, inspectImageWriterFactory, cfg, mockClient)\n\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\t\t\terr := command.Execute()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(inspectImageWriter.RecievedGeneralInfo.RunImageMirrors, cfg.RunImages)\n\t\t})\n\n\t\twhen(\"error cases\", func() {\n\t\t\twhen(\"client returns an error when inspecting\", func() {\n\t\t\t\tit(\"passes errors to the Writer\", func() {\n\t\t\t\t\tinspectImageWriter := 
newDefaultInspectImageWriter()\n\t\t\t\t\tinspectImageWriterFactory := newImageWriterFactory(inspectImageWriter)\n\n\t\t\t\t\tlocalErr := errors.New(\"local inspection error\")\n\t\t\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", true).Return(nil, localErr)\n\n\t\t\t\t\tremoteErr := errors.New(\"remote inspection error\")\n\t\t\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", false).Return(nil, remoteErr)\n\n\t\t\t\t\tcommand := commands.InspectImage(logger, inspectImageWriterFactory, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.ErrorWithMessage(inspectImageWriter.ReceivedErrorForLocal, \"local inspection error\")\n\t\t\t\t\tassert.ErrorWithMessage(inspectImageWriter.ReceivedErrorForRemote, \"remote inspection error\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"writerFactory fails to create a writer\", func() {\n\t\t\t\tit(\"returns the error\", func() {\n\t\t\t\t\twriterFactoryErr := errors.New(\"unable to create writer factory\")\n\n\t\t\t\t\terroneousInspectImageWriterFactory := &fakes.FakeInspectImageWriterFactory{\n\t\t\t\t\t\tReturnForWriter: nil,\n\t\t\t\t\t\tErrorForWriter:  writerFactoryErr,\n\t\t\t\t\t}\n\n\t\t\t\t\tcommand := commands.InspectImage(logger, erroneousInspectImageWriterFactory, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\tassert.ErrorWithMessage(err, \"unable to create writer factory\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"Print fails\", func() {\n\t\t\t\tit(\"returns the error\", func() {\n\t\t\t\t\tprintError := errors.New(\"unable to print\")\n\t\t\t\t\tinspectImageWriter := &fakes.FakeInspectImageWriter{\n\t\t\t\t\t\tErrorForPrint: printError,\n\t\t\t\t\t}\n\t\t\t\t\tinspectImageWriterFactory := newImageWriterFactory(inspectImageWriter)\n\n\t\t\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", true).Return(expectedLocalImageInfo, 
nil)\n\t\t\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", false).Return(expectedRemoteImageInfo, nil)\n\n\t\t\t\t\tcommand := commands.InspectImage(logger, inspectImageWriterFactory, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\tassert.ErrorWithMessage(err, \"unable to print\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"Print fails for extension\", func() {\n\t\t\t\tit(\"returns the error\", func() {\n\t\t\t\t\tprintError := errors.New(\"unable to print\")\n\t\t\t\t\tinspectImageWriter := &fakes.FakeInspectImageWriter{\n\t\t\t\t\t\tErrorForPrint: printError,\n\t\t\t\t\t}\n\t\t\t\t\tinspectImageWriterFactory := newImageWriterFactory(inspectImageWriter)\n\n\t\t\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", true).Return(expectedLocalImageWithExtensionInfo, nil)\n\t\t\t\t\tmockClient.EXPECT().InspectImage(\"some/image\", false).Return(expectedRemoteImageWithExtensionInfo, nil)\n\n\t\t\t\t\tcommand := commands.InspectImage(logger, inspectImageWriterFactory, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{\"some/image\"})\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\tassert.ErrorWithMessage(err, \"unable to print\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc newDefaultInspectImageWriter() *fakes.FakeInspectImageWriter {\n\treturn &fakes.FakeInspectImageWriter{\n\t\tPrintForLocal:  expectedLocalImageDisplay,\n\t\tPrintForRemote: expectedRemoteImageDisplay,\n\t}\n}\n\nfunc newImageWriterFactory(writer *fakes.FakeInspectImageWriter) *fakes.FakeInspectImageWriterFactory {\n\treturn &fakes.FakeInspectImageWriterFactory{\n\t\tReturnForWriter: writer,\n\t}\n}\n"
  },
  {
    "path": "internal/commands/list_registries.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use config registries list instead\nfunc ListBuildpackRegistries(logger logging.Logger, cfg config.Config) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"list-registries\",\n\t\tArgs:    cobra.NoArgs,\n\t\tHidden:  true,\n\t\tShort:   \"List buildpack registries\",\n\t\tExample: \"pack list-registries\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"list-registries\", \"config registries list\")\n\t\t\tlistRegistries(args, logger, cfg)\n\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/list_registries_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc TestListRegistries(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"ListRegistriesCommand\", testListRegistriesCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testListRegistriesCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand *cobra.Command\n\t\tlogger  logging.Logger\n\t\toutBuf  bytes.Buffer\n\t\tcfg     config.Config\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tcfg = config.Config{\n\t\t\tDefaultRegistryName: \"private registry\",\n\t\t\tRegistries: []config.Registry{\n\t\t\t\t{\n\t\t\t\t\tName: \"public registry\",\n\t\t\t\t\tType: \"github\",\n\t\t\t\t\tURL:  \"https://github.com/buildpacks/public-registry\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName: \"private registry\",\n\t\t\t\t\tType: \"github\",\n\t\t\t\t\tURL:  \"https://github.com/buildpacks/private-registry\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName: \"personal registry\",\n\t\t\t\t\tType: \"github\",\n\t\t\t\t\tURL:  \"https://github.com/buildpacks/personal-registry\",\n\t\t\t\t},\n\t\t\t},\n\t\t}\n\t\tcommand = commands.ListBuildpackRegistries(logger, cfg)\n\t\tcommand.SetArgs([]string{})\n\t})\n\n\twhen(\"#ListBuildpackRegistries\", func() {\n\t\tit(\"should list all registries\", func() {\n\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\th.AssertContains(t, outBuf.String(), \"has been deprecated, please use 'pack config registries list'\")\n\t\t\th.AssertContains(t, outBuf.String(), \"public registry\")\n\t\t\th.AssertContains(t, outBuf.String(), \"* private 
registry\")\n\t\t\th.AssertContains(t, outBuf.String(), \"personal registry\")\n\t\t})\n\n\t\tit(\"should list registries in verbose mode\", func() {\n\t\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\tcommand = commands.ListBuildpackRegistries(logger, cfg)\n\t\t\tcommand.SetArgs([]string{})\n\n\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\th.AssertContains(t, outBuf.String(), \"public registry\")\n\t\t\th.AssertContains(t, outBuf.String(), \"https://github.com/buildpacks/public-registry\")\n\n\t\t\th.AssertContains(t, outBuf.String(), \"* private registry\")\n\t\t\th.AssertContains(t, outBuf.String(), \"https://github.com/buildpacks/private-registry\")\n\n\t\t\th.AssertContains(t, outBuf.String(), \"personal registry\")\n\t\t\th.AssertContains(t, outBuf.String(), \"https://github.com/buildpacks/personal-registry\")\n\n\t\t\th.AssertContains(t, outBuf.String(), \"official\")\n\t\t\th.AssertContains(t, outBuf.String(), \"https://github.com/buildpacks/registry-index\")\n\t\t})\n\n\t\tit(\"should indicate official as the default registry by default\", func() {\n\t\t\tcfg.DefaultRegistryName = \"\"\n\t\t\tcommand = commands.ListBuildpackRegistries(logger, cfg)\n\t\t\tcommand.SetArgs([]string{})\n\n\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\th.AssertContains(t, outBuf.String(), \"* official\")\n\t\t\th.AssertContains(t, outBuf.String(), \"public registry\")\n\t\t\th.AssertContains(t, outBuf.String(), \"private registry\")\n\t\t\th.AssertContains(t, outBuf.String(), \"personal registry\")\n\t\t})\n\n\t\tit(\"should use official when no registries are defined\", func() {\n\t\t\tcfg.DefaultRegistryName = \"\"\n\t\t\tcommand = commands.ListBuildpackRegistries(logger, config.Config{})\n\t\t\tcommand.SetArgs([]string{})\n\n\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\th.AssertContains(t, outBuf.String(), \"* official\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/list_trusted_builders.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use `config trusted-builders list` instead\nfunc ListTrustedBuilders(logger logging.Logger, cfg config.Config) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"list-trusted-builders\",\n\t\tShort:   \"List Trusted Builders\",\n\t\tLong:    \"List Trusted Builders.\\n\\nShow the builders that are either trusted by default or have been explicitly trusted locally using `trusted-builder add`\",\n\t\tExample: \"pack list-trusted-builders\",\n\t\tHidden:  true,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"list-trusted-builders\", \"config trusted-builders list\")\n\t\t\tlistTrustedBuilders(args, logger, cfg)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"list-trusted-builders\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/list_trusted_builders_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestListTrustedBuildersCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Commands\", testListTrustedBuildersCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testListTrustedBuildersCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand      *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tcommand = commands.ListTrustedBuilders(logger, config.Config{})\n\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\th.AssertNil(t, os.Setenv(\"PACK_HOME\", tempPackHome))\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.Unsetenv(\"PACK_HOME\"))\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#ListTrustedBuilders\", func() {\n\t\tit(\"succeeds\", func() {\n\t\t\th.AssertNil(t, command.Execute())\n\t\t})\n\n\t\tit(\"shows header\", func() {\n\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\th.AssertContains(t, outBuf.String(), \"Trusted Builders:\")\n\t\t})\n\n\t\tit(\"shows suggested builders and locally trusted builder in alphabetical order\", func() {\n\t\t\tbuilderName := \"great-builder-\" + h.RandString(8)\n\n\t\t\th.AssertNil(t, command.Execute())\n\t\t\th.AssertNotContains(t, outBuf.String(), 
builderName)\n\t\t\th.AssertContainsAllInOrder(t,\n\t\t\t\toutBuf,\n\t\t\t\t\"gcr.io/buildpacks/builder:google-22\",\n\t\t\t\t\"heroku/builder:20\",\n\t\t\t\t\"heroku/builder:22\",\n\t\t\t\t\"heroku/builder:24\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-base\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-full\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-tiny\",\n\t\t\t)\n\n\t\t\tlistTrustedBuildersCommand := commands.ListTrustedBuilders(\n\t\t\t\tlogger,\n\t\t\t\tconfig.Config{\n\t\t\t\t\tTrustedBuilders: []config.TrustedBuilder{{Name: builderName}},\n\t\t\t\t},\n\t\t\t)\n\n\t\t\toutBuf.Reset()\n\n\t\t\th.AssertNil(t, listTrustedBuildersCommand.Execute())\n\n\t\t\th.AssertContainsAllInOrder(t,\n\t\t\t\toutBuf,\n\t\t\t\t\"gcr.io/buildpacks/builder:google-22\",\n\t\t\t\tbuilderName,\n\t\t\t\t\"heroku/builder:20\",\n\t\t\t\t\"heroku/builder:22\",\n\t\t\t\t\"heroku/builder:24\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-base\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-full\",\n\t\t\t\t\"paketobuildpacks/builder-jammy-tiny\",\n\t\t\t)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/manifest.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc NewManifestCommand(logger logging.Logger, client PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"manifest\",\n\t\tShort: \"Interact with OCI image indexes\",\n\t\tLong: `An image index is a higher-level manifest which points to specific image manifests and is ideal for one or more platforms; see: https://github.com/opencontainers/image-spec/ for more details\n\n'pack manifest' commands provide tooling to create, update, or delete images indexes or push them to a remote registry.\n'pack' will save a local copy of the image index at '$PACK_HOME/manifests'; the environment variable 'XDG_RUNTIME_DIR' \ncan be set to override the location, allowing manifests to be edited locally before being pushed to a registry.\n\nThese commands are experimental. For more information, consult the RFC which can be found at https://github.com/buildpacks/rfcs/blob/main/text/0124-pack-manifest-list-commands.md`,\n\t\tRunE: nil,\n\t}\n\n\tcmd.AddCommand(ManifestCreate(logger, client))\n\tcmd.AddCommand(ManifestAdd(logger, client))\n\tcmd.AddCommand(ManifestAnnotate(logger, client))\n\tcmd.AddCommand(ManifestDelete(logger, client))\n\tcmd.AddCommand(ManifestInspect(logger, client))\n\tcmd.AddCommand(ManifestPush(logger, client))\n\tcmd.AddCommand(ManifestRemove(logger, client))\n\n\tAddHelpFlag(cmd, \"manifest\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/manifest_add.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ManifestAdd adds a new image to a manifest list (image index).\nfunc ManifestAdd(logger logging.Logger, pack PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"add [OPTIONS] <manifest-list> <manifest> [flags]\",\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(2), cobra.OnlyValidArgs),\n\t\tShort:   \"Add an image to a manifest list.\",\n\t\tExample: `pack manifest add my-image-index my-image:some-arch`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) (err error) {\n\t\t\treturn pack.AddManifest(cmd.Context(), client.ManifestAddOptions{\n\t\t\t\tIndexRepoName: args[0],\n\t\t\t\tRepoName:      args[1],\n\t\t\t})\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"add\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/manifest_add_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestManifestAddCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testManifestAddCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testManifestAddCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcommand = commands.ManifestAdd(logger, mockClient)\n\t})\n\n\twhen(\"args are valid\", func() {\n\t\tvar indexRepoName string\n\t\tit.Before(func() {\n\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\twhen(\"no extra flag is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().AddManifest(\n\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\tgomock.Eq(client.ManifestAddOptions{\n\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\tRepoName:      \"busybox:1.36-musl\",\n\t\t\t\t\t\t}),\n\t\t\t\t\t).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should call add manifest operation with the given arguments\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"busybox:1.36-musl\"})\n\t\t\t\t\terr := 
command.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--help\", func() {\n\t\t\t\tit(\"should have help flag\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--help\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"args are invalid\", func() {\n\t\tit(\"errors when missing mandatory arguments\", func() {\n\t\t\tcommand.SetArgs([]string{\"some-index\"})\n\t\t\terr := command.Execute()\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"accepts 2 arg(s), received 1\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/manifest_annotate.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ManifestAnnotateFlags define flags provided to the ManifestAnnotate\ntype ManifestAnnotateFlags struct {\n\tos, arch, variant string\n\tannotations       map[string]string\n}\n\n// ManifestAnnotate modifies a manifest list and updates the platform information within the index for an image in the list.\nfunc ManifestAnnotate(logger logging.Logger, pack PackClient) *cobra.Command {\n\tvar flags ManifestAnnotateFlags\n\tcmd := &cobra.Command{\n\t\tUse:     \"annotate [OPTIONS] <manifest-list> <manifest> [flags]\",\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(2), cobra.OnlyValidArgs),\n\t\tShort:   \"Add or update information about an entry in a manifest list.\",\n\t\tExample: `pack manifest annotate my-image-index my-image:some-arch --arch some-other-arch`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) (err error) {\n\t\t\tif err = validateManifestAnnotateFlags(flags); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\treturn pack.AnnotateManifest(cmd.Context(), client.ManifestAnnotateOptions{\n\t\t\t\tIndexRepoName: args[0],\n\t\t\t\tRepoName:      args[1],\n\t\t\t\tOS:            flags.os,\n\t\t\t\tOSArch:        flags.arch,\n\t\t\t\tOSVariant:     flags.variant,\n\t\t\t\tAnnotations:   flags.annotations,\n\t\t\t})\n\t\t}),\n\t}\n\n\tcmd.Flags().StringVar(&flags.os, \"os\", \"\", \"Set the OS\")\n\tcmd.Flags().StringVar(&flags.arch, \"arch\", \"\", \"Set the architecture\")\n\tcmd.Flags().StringVar(&flags.variant, \"variant\", \"\", \"Set the architecture variant\")\n\tcmd.Flags().StringToStringVar(&flags.annotations, \"annotations\", make(map[string]string, 0), \"Set an `annotation` for the specified image\")\n\n\tAddHelpFlag(cmd, \"annotate\")\n\treturn cmd\n}\n\nfunc validateManifestAnnotateFlags(flags ManifestAnnotateFlags) error {\n\tif flags.os == \"\" 
&& flags.arch == \"\" && flags.variant == \"\" && len(flags.annotations) == 0 {\n\t\treturn fmt.Errorf(\"one of --os, --arch, or --variant is required\")\n\t}\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/manifest_annotate_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestManifestAnnotationsCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testManifestAnnotateCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testManifestAnnotateCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\n\t\tcommand = commands.ManifestAnnotate(logger, mockClient)\n\t})\n\n\twhen(\"args are valid\", func() {\n\t\tvar (\n\t\t\tindexRepoName string\n\t\t\trepoName      string\n\t\t)\n\t\tit.Before(func() {\n\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\trepoName = \"busybox@sha256:6457d53fb065d6f250e1504b9bc42d5b6c65941d57532c072d929dd0628977d0\"\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\twhen(\"--os is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tAnnotateManifest(\n\t\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\t\tgomock.Eq(client.ManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      repoName,\n\t\t\t\t\t\t\t\tOS:            \"linux\",\n\t\t\t\t\t\t\t\tAnnotations:   
map[string]string{},\n\t\t\t\t\t\t\t}),\n\t\t\t\t\t\t).\n\t\t\t\t\t\tReturn(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should annotate images with given flags\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, repoName, \"--os\", \"linux\"})\n\t\t\t\t\th.AssertNilE(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--arch is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tAnnotateManifest(\n\t\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\t\tgomock.Eq(client.ManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      repoName,\n\t\t\t\t\t\t\t\tOSArch:        \"amd64\",\n\t\t\t\t\t\t\t\tAnnotations:   map[string]string{},\n\t\t\t\t\t\t\t}),\n\t\t\t\t\t\t).\n\t\t\t\t\t\tReturn(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should annotate images with given flags\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, repoName, \"--arch\", \"amd64\"})\n\t\t\t\t\th.AssertNilE(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--variant is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tAnnotateManifest(\n\t\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\t\tgomock.Eq(client.ManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      repoName,\n\t\t\t\t\t\t\t\tOSVariant:     \"V6\",\n\t\t\t\t\t\t\t\tAnnotations:   map[string]string{},\n\t\t\t\t\t\t\t}),\n\t\t\t\t\t\t).\n\t\t\t\t\t\tReturn(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should annotate images with given flags\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, repoName, \"--variant\", \"V6\"})\n\t\t\t\t\th.AssertNilE(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--annotations are provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tAnnotateManifest(\n\t\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\t\tgomock.Eq(client.ManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: 
indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      repoName,\n\t\t\t\t\t\t\t\tAnnotations:   map[string]string{\"foo\": \"bar\"},\n\t\t\t\t\t\t\t}),\n\t\t\t\t\t\t).\n\t\t\t\t\t\tReturn(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should annotate images with given flags\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, repoName, \"--annotations\", \"foo=bar\"})\n\t\t\t\t\th.AssertNilE(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--help\", func() {\n\t\t\t\tit(\"should have help flag\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--help\"})\n\t\t\t\t\th.AssertNilE(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"args are invalid\", func() {\n\t\tit(\"errors when no options are provided\", func() {\n\t\t\tcommand.SetArgs([]string{\"foo\", \"bar\"})\n\t\t\th.AssertError(t, command.Execute(), \"one of --os, --arch, or --variant is required\")\n\t\t})\n\n\t\tit(\"errors when missing mandatory arguments\", func() {\n\t\t\tcommand.SetArgs([]string{\"some-index\"})\n\t\t\terr := command.Execute()\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"accepts 2 arg(s), received 1\")\n\t\t})\n\n\t\tit(\"errors when annotations are invalid\", func() {\n\t\t\tcommand.SetArgs([]string{\"some-index\", \"some-manifest\", \"--annotations\", \"some-key\"})\n\t\t\terr := command.Execute()\n\t\t\th.AssertEq(t, err.Error(), `invalid argument \"some-key\" for \"--annotations\" flag: some-key must be formatted as key=value`)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/manifest_create.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ManifestCreateFlags define flags provided to the ManifestCreate\ntype ManifestCreateFlags struct {\n\tformat            string\n\tinsecure, publish bool\n}\n\n// ManifestCreate creates an image index for a multi-arch image\nfunc ManifestCreate(logger logging.Logger, pack PackClient) *cobra.Command {\n\tvar flags ManifestCreateFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"create <manifest-list> <manifest> [<manifest> ... ] [flags]\",\n\t\tArgs:    cobra.MatchAll(cobra.MinimumNArgs(2), cobra.OnlyValidArgs),\n\t\tShort:   \"Create a new manifest list.\",\n\t\tExample: `pack manifest create my-image-index my-image:some-arch my-image:some-other-arch`,\n\t\tLong:    `Create a new manifest list (e.g., for multi-arch images) which will be stored locally for manipulating images within the index`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tformat, err := parseFormatFlag(strings.ToLower(flags.format))\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tif err = validateCreateManifestFlags(flags); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\treturn pack.CreateManifest(\n\t\t\t\tcmd.Context(),\n\t\t\t\tclient.CreateManifestOptions{\n\t\t\t\t\tIndexRepoName: args[0],\n\t\t\t\t\tRepoNames:     args[1:],\n\t\t\t\t\tFormat:        format,\n\t\t\t\t\tInsecure:      flags.insecure,\n\t\t\t\t\tPublish:       flags.publish,\n\t\t\t\t},\n\t\t\t)\n\t\t}),\n\t}\n\n\tcmdFlags := cmd.Flags()\n\n\tcmdFlags.StringVarP(&flags.format, \"format\", \"f\", \"oci\", \"Media type to use when saving the image index. 
Accepted values are: oci, docker\")\n\tcmdFlags.BoolVar(&flags.insecure, \"insecure\", false, \"When pushing the index to a registry, do not use TLS encryption or certificate verification; use with --publish\")\n\tcmdFlags.BoolVar(&flags.publish, \"publish\", false, \"Publish directly to a registry without saving a local copy\")\n\n\tAddHelpFlag(cmd, \"create\")\n\treturn cmd\n}\n\nfunc validateCreateManifestFlags(flags ManifestCreateFlags) error {\n\tif flags.insecure && !flags.publish {\n\t\treturn fmt.Errorf(\"insecure flag requires the publish flag\")\n\t}\n\treturn nil\n}\n"
  },
  {
    "path": "internal/commands/manifest_create_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestManifestCreateCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testManifestCreateCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testManifestCreateCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\n\t\tcommand = commands.ManifestCreate(logger, mockClient)\n\t})\n\n\twhen(\"args are valid\", func() {\n\t\tvar indexRepoName string\n\t\tit.Before(func() {\n\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\twhen(\"no extra flags are provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.\n\t\t\t\t\t\tEXPECT().\n\t\t\t\t\t\tCreateManifest(gomock.Any(),\n\t\t\t\t\t\t\tclient.CreateManifestOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoNames:     []string{\"some-manifest\"},\n\t\t\t\t\t\t\t\tFormat:        types.OCIImageIndex,\n\t\t\t\t\t\t\t\tInsecure:      false,\n\t\t\t\t\t\t\t\tPublish:       
false,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should call create operation with default configuration\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"some-manifest\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--format is docker\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.\n\t\t\t\t\t\tEXPECT().\n\t\t\t\t\t\tCreateManifest(gomock.Any(),\n\t\t\t\t\t\t\tclient.CreateManifestOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoNames:     []string{\"some-manifest\"},\n\t\t\t\t\t\t\t\tFormat:        types.DockerManifestList,\n\t\t\t\t\t\t\t\tInsecure:      false,\n\t\t\t\t\t\t\t\tPublish:       false,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should call create operation with docker media type\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"some-manifest\", \"-f\", \"docker\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--publish\", func() {\n\t\t\t\twhen(\"--insecure\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmockClient.\n\t\t\t\t\t\t\tEXPECT().\n\t\t\t\t\t\t\tCreateManifest(gomock.Any(),\n\t\t\t\t\t\t\t\tclient.CreateManifestOptions{\n\t\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\t\tRepoNames:     []string{\"some-manifest\"},\n\t\t\t\t\t\t\t\t\tFormat:        types.OCIImageIndex,\n\t\t\t\t\t\t\t\t\tInsecure:      true,\n\t\t\t\t\t\t\t\t\tPublish:       true,\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t).Return(nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should call create operation with publish and insecure\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"some-manifest\", \"--publish\", \"--insecure\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"no --insecure\", func() {\n\t\t\t\t\tit.Before(func() 
{\n\t\t\t\t\t\tmockClient.\n\t\t\t\t\t\t\tEXPECT().\n\t\t\t\t\t\t\tCreateManifest(gomock.Any(),\n\t\t\t\t\t\t\t\tclient.CreateManifestOptions{\n\t\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\t\tRepoNames:     []string{\"some-manifest\"},\n\t\t\t\t\t\t\t\t\tFormat:        types.OCIImageIndex,\n\t\t\t\t\t\t\t\t\tInsecure:      false,\n\t\t\t\t\t\t\t\t\tPublish:       true,\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t).Return(nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should call create operation with publish\", func() {\n\t\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"some-manifest\", \"--publish\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--help\", func() {\n\t\t\t\tit(\"should have help flag\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--help\"})\n\t\t\t\t\th.AssertNilE(t, command.Execute())\n\t\t\t\t\th.AssertEq(t, outBuf.String(), \"\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"invalid arguments\", func() {\n\t\twhen(\"--insecure is used without publish\", func() {\n\t\t\tit(\"errors with a message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"something\", \"some-manifest\", \"--insecure\"})\n\t\t\t\th.AssertError(t, command.Execute(), \"insecure flag requires the publish flag\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"invalid media type\", func() {\n\t\t\tvar format string\n\t\t\tit.Before(func() {\n\t\t\t\tformat = \"invalid\"\n\t\t\t})\n\n\t\t\tit(\"errors with a message\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"some-index\", \"some-manifest\", \"--format\", format})\n\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/manifest_inspect.go",
    "content": "package commands\n\nimport (\n\t\"errors\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ManifestInspect shows the manifest information stored locally\nfunc ManifestInspect(logger logging.Logger, pack PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"inspect <manifest-list>\",\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(1), cobra.OnlyValidArgs),\n\t\tShort:   \"Display information about a manifest list.\",\n\t\tExample: `pack manifest inspect my-image-index`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tif args[0] == \"\" {\n\t\t\t\treturn errors.New(\"'<manifest-list>' is required\")\n\t\t\t}\n\t\t\treturn pack.InspectManifest(args[0])\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"inspect\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/manifest_inspect_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestManifestInspectCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testManifestInspectCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testManifestInspectCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcommand = commands.ManifestInspect(logger, mockClient)\n\t})\n\n\twhen(\"args are valid\", func() {\n\t\tvar indexRepoName string\n\t\tit.Before(func() {\n\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\twhen(\"no extra flags are provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().InspectManifest(indexRepoName).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should call inspect operation with the given index repo name\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--help\", func() {\n\t\t\t\tit(\"should have help flag\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--help\"})\n\t\t\t\t\th.AssertNilE(t, command.Execute())\n\t\t\t\t\th.AssertEq(t, 
outBuf.String(), \"\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/manifest_push.go",
    "content": "package commands\n\nimport (\n\t\"strings\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ManifestPushFlags define flags provided to the ManifestPush\ntype ManifestPushFlags struct {\n\tformat          string\n\tinsecure, purge bool\n}\n\n// ManifestPush pushes a manifest list to a remote registry.\nfunc ManifestPush(logger logging.Logger, pack PackClient) *cobra.Command {\n\tvar flags ManifestPushFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"push [OPTIONS] <manifest-list> [flags]\",\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(1), cobra.OnlyValidArgs),\n\t\tShort:   \"Push a manifest list to a registry.\",\n\t\tExample: `pack manifest push my-image-index`,\n\t\tLong: `manifest push' pushes a manifest list to a registry.\nUse other 'pack manifest' commands to prepare the manifest list locally, then use the push command.`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tformat, err := parseFormatFlag(strings.ToLower(flags.format))\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\treturn pack.PushManifest(client.PushManifestOptions{\n\t\t\t\tIndexRepoName: args[0],\n\t\t\t\tFormat:        format,\n\t\t\t\tInsecure:      flags.insecure,\n\t\t\t\tPurge:         flags.purge,\n\t\t\t})\n\t\t}),\n\t}\n\n\tcmd.Flags().StringVarP(&flags.format, \"format\", \"f\", \"oci\", \"Media type to use when saving the image index. Accepted values are: oci, docker\")\n\tcmd.Flags().BoolVar(&flags.insecure, \"insecure\", false, \"When pushing the index to a registry, do not use TLS encryption or certificate verification\")\n\tcmd.Flags().BoolVar(&flags.purge, \"purge\", false, \"Delete the manifest list from local storage if pushing succeeds\")\n\n\tAddHelpFlag(cmd, \"push\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/manifest_push_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestManifestPushCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testManifestPushCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testManifestPushCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\n\t\tcommand = commands.ManifestPush(logger, mockClient)\n\t})\n\n\twhen(\"args are valid\", func() {\n\t\tvar indexRepoName string\n\t\tit.Before(func() {\n\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\twhen(\"no extra flag is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tPushManifest(gomock.Eq(client.PushManifestOptions{\n\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\tFormat:        types.OCIImageIndex,\n\t\t\t\t\t\t\tInsecure:      false,\n\t\t\t\t\t\t\tPurge:         false,\n\t\t\t\t\t\t})).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should call push operation with default configuration\", func() 
{\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--format is docker\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tPushManifest(gomock.Eq(client.PushManifestOptions{\n\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\tFormat:        types.DockerManifestList,\n\t\t\t\t\t\t\tInsecure:      false,\n\t\t\t\t\t\t\tPurge:         false,\n\t\t\t\t\t\t})).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should call push operation with docker media type\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"-f\", \"docker\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--purge\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tPushManifest(gomock.Eq(client.PushManifestOptions{\n\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\tFormat:        types.OCIImageIndex,\n\t\t\t\t\t\t\tInsecure:      false,\n\t\t\t\t\t\t\tPurge:         true,\n\t\t\t\t\t\t})).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should call push operation with purge enabled\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"--purge\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--insecure\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tPushManifest(gomock.Eq(client.PushManifestOptions{\n\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\tFormat:        types.OCIImageIndex,\n\t\t\t\t\t\t\tInsecure:      true,\n\t\t\t\t\t\t\tPurge:         false,\n\t\t\t\t\t\t})).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should call push operation with insecure enabled\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"--insecure\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--help\", func() {\n\t\t\t\tit(\"should have help flag\", func() 
{\n\t\t\t\t\tcommand.SetArgs([]string{\"--help\"})\n\t\t\t\t\th.AssertNilE(t, command.Execute())\n\t\t\t\t\th.AssertEq(t, outBuf.String(), \"\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index doesn't exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.\n\t\t\t\t\tEXPECT().\n\t\t\t\t\tPushManifest(\n\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t).\n\t\t\t\t\tAnyTimes().\n\t\t\t\t\tReturn(errors.New(\"unable to push image\"))\n\t\t\t})\n\n\t\t\tit(\"should return an error when the index does not exist locally\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"some-index\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"args are invalid\", func() {\n\t\twhen(\"--format is invalid\", func() {\n\t\t\tit(\"should return an error for an invalid media type\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"some-index\", \"-f\", \"bad-media-type\"})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"invalid media type format\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/manifest_remove.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ManifestDelete deletes one or more manifest lists from local storage\nfunc ManifestDelete(logger logging.Logger, pack PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"remove [manifest-list] [manifest-list...]\",\n\t\tArgs:    cobra.MatchAll(cobra.MinimumNArgs(1), cobra.OnlyValidArgs),\n\t\tShort:   \"Remove one or more manifest lists from local storage\",\n\t\tExample: `pack manifest remove my-image-index`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tif err := pack.DeleteManifest(args); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"remove\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/manifest_remove_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestManifestDeleteCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testManifestDeleteCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testManifestDeleteCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcommand = commands.ManifestDelete(logger, mockClient)\n\t})\n\n\twhen(\"args are valid\", func() {\n\t\tvar indexRepoName string\n\t\tit.Before(func() {\n\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\twhen(\"no extra flags are provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().DeleteManifest(\n\t\t\t\t\t\tgomock.Eq([]string{indexRepoName}),\n\t\t\t\t\t).Return(nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should delete index\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--help\", func() {\n\t\t\t\tit(\"should have help flag\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"--help\"})\n\t\t\t\t\th.AssertNil(t, 
command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index does not exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().DeleteManifest(\n\t\t\t\t\tgomock.Eq([]string{\"some-non-existent-index\"}),\n\t\t\t\t).Return(errors.New(\"image index doesn't exist\"))\n\t\t\t})\n\n\t\t\tit(\"should return an error\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"some-non-existent-index\"})\n\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/manifest_rm.go",
"content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// ManifestRemove will remove the specified image manifest if it is already referenced in the index\nfunc ManifestRemove(logger logging.Logger, pack PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"rm [manifest-list] [manifest] [manifest...] [flags]\",\n\t\tArgs:    cobra.MatchAll(cobra.MinimumNArgs(2), cobra.OnlyValidArgs),\n\t\tShort:   \"Remove an image manifest from a manifest list.\",\n\t\tExample: `pack manifest rm my-image-index my-image@sha256:<some-sha>`,\n\t\tLong: `'manifest rm' will remove the specified image manifest if it is already referenced in the index.\nUsers must pass the digest of the image in order to delete it from the index.\nTo discard __all__ the images in an index and the index itself, use 'manifest remove'.`,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tif err := pack.RemoveManifest(args[0], args[1:]); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"rm\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/manifest_rm_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestManifestRemoveCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testManifestRemoveCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testManifestRemoveCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcommand = commands.ManifestRemove(logger, mockClient)\n\t})\n\twhen(\"args are valid\", func() {\n\t\tvar indexRepoName string\n\t\tit.Before(func() {\n\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\twhen(\"no extra flags are provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockClient.EXPECT().RemoveManifest(\n\t\t\t\t\t\tgomock.Eq(indexRepoName),\n\t\t\t\t\t\tgomock.Eq([]string{\"some-image\"}),\n\t\t\t\t\t).Return(nil)\n\t\t\t\t})\n\t\t\t\tit(\"should remove index\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"some-image\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--help\", func() {\n\t\t\t\tit(\"should have help flag\", func() 
{\n\t\t\t\t\tcommand.SetArgs([]string{\"--help\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertEq(t, outBuf.String(), \"\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index does not exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().RemoveManifest(\n\t\t\t\t\tgomock.Eq(indexRepoName),\n\t\t\t\t\tgomock.Eq([]string{\"some-image\"}),\n\t\t\t\t).Return(errors.New(\"image index doesn't exist\"))\n\t\t\t})\n\t\t\tit(\"should return an error\", func() {\n\t\t\t\tcommand.SetArgs([]string{indexRepoName, \"some-image\"})\n\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"args are invalid\", func() {\n\t\tit(\"errors when missing mandatory arguments\", func() {\n\t\t\tcommand.SetArgs([]string{\"some-index\"})\n\t\t\terr := command.Execute()\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"requires at least 2 arg(s), only received 1\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/manifest_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestNewManifestCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testNewManifestCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testNewManifestCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         *logging.LogWithWriters\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\n\t\tcommand = commands.NewManifestCommand(logger, mockClient)\n\t\tcommand.SetOut(logging.GetWriterForLevel(logger, logging.InfoLevel))\n\t})\n\tit(\"should have help flag\", func() {\n\t\tcommand.SetArgs([]string{})\n\t\terr := command.Execute()\n\t\th.AssertNilE(t, err)\n\n\t\toutput := outBuf.String()\n\t\th.AssertContains(t, output, \"Usage:\")\n\t\tfor _, command := range []string{\"create\", \"add\", \"annotate\", \"inspect\", \"remove\", \"rm\"} {\n\t\t\th.AssertContains(t, output, command)\n\t\t}\n\t})\n}\n"
  },
  {
    "path": "internal/commands/package_buildpack.go",
"content": "package commands\n\nimport (\n\t\"path/filepath\"\n\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: use BuildpackPackage instead\n// PackageBuildpack packages (a) buildpack(s) into OCI format, based on a package config\nfunc PackageBuildpack(logger logging.Logger, cfg config.Config, packager BuildpackPackager, packageConfigReader PackageConfigReader) *cobra.Command {\n\tvar flags BuildpackPackageFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     `package-buildpack <name> --config <package-config-path>`,\n\t\tHidden:  true,\n\t\tArgs:    cobra.MatchAll(cobra.ExactArgs(1), cobra.OnlyValidArgs),\n\t\tShort:   \"Package buildpack in OCI format.\",\n\t\tExample: \"pack package-buildpack my-buildpack --config ./package.toml\",\n\t\tLong: \"package-buildpack allows users to package (a) buildpack(s) into OCI format, which can then be hosted in \" +\n\t\t\t\"image repositories. You can also package a number of buildpacks together, to enable easier distribution of \" +\n\t\t\t\"a set of buildpacks. Packaged buildpacks can be used as inputs to `pack build` (using the `--buildpack` flag), \" +\n\t\t\t\"and they can be included in the configs used in `pack builder create` and `pack buildpack package`. For more \" +\n\t\t\t\"on how to package a buildpack, see: https://buildpacks.io/docs/buildpack-author-guide/package-a-buildpack/.\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"package-buildpack\", \"buildpack package\")\n\n\t\t\tif err := validateBuildpackPackageFlags(cfg, &flags); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tstringPolicy := flags.Policy\n\t\t\tif stringPolicy == \"\" {\n\t\t\t\tstringPolicy = cfg.PullPolicy\n\t\t\t}\n\t\t\tpullPolicy, err := image.ParsePullPolicy(stringPolicy)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"parsing pull policy\")\n\t\t\t}\n\n\t\t\tcfg := pubbldpkg.DefaultConfig()\n\t\t\trelativeBaseDir := \"\"\n\t\t\tif flags.PackageTomlPath != \"\" {\n\t\t\t\tcfg, err = packageConfigReader.Read(flags.PackageTomlPath)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"reading config\")\n\t\t\t\t}\n\n\t\t\t\trelativeBaseDir, err = filepath.Abs(filepath.Dir(flags.PackageTomlPath))\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrap(err, \"getting absolute path for config\")\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tname := args[0]\n\t\t\tif err := packager.PackageBuildpack(cmd.Context(), client.PackageBuildpackOptions{\n\t\t\t\tRelativeBaseDir: relativeBaseDir,\n\t\t\t\tName:            name,\n\t\t\t\tFormat:          flags.Format,\n\t\t\t\tConfig:          cfg,\n\t\t\t\tPublish:         flags.Publish,\n\t\t\t\tPullPolicy:      pullPolicy,\n\t\t\t\tRegistry:        flags.BuildpackRegistry,\n\t\t\t}); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\taction := \"created\"\n\t\t\tif flags.Publish {\n\t\t\t\taction = \"published\"\n\t\t\t}\n\n\t\t\tlogger.Infof(\"Successfully %s package %s\", action, style.Symbol(name))\n\t\t\treturn nil\n\t\t}),\n\t}\n\tcmd.Flags().StringVarP(&flags.PackageTomlPath, \"config\", \"c\", \"\", \"Path to package TOML config (required)\")\n\n\tcmd.Flags().StringVarP(&flags.Format, \"format\", \"f\", \"\", `Format to save package as (\"image\" or \"file\")`)\n\tcmd.Flags().BoolVar(&flags.Publish, \"publish\", false, `Publish the buildpack directly to the container registry specified in <name>, instead of the daemon (applies to \"--format=image\" only).`)\n\tcmd.Flags().StringVar(&flags.Policy, \"pull-policy\", \"\", \"Pull policy to use. Accepted values are always, never, and if-not-present. The default is always\")\n\tcmd.Flags().StringVarP(&flags.BuildpackRegistry, \"buildpack-registry\", \"r\", \"\", \"Buildpack Registry name\")\n\n\tAddHelpFlag(cmd, \"package-buildpack\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/package_buildpack_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/fakes\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPackageBuildpackCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"PackageBuildpackCommand\", testPackageBuildpackCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPackageBuildpackCommand(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"PackageBuildpack#Execute\", func() {\n\t\twhen(\"valid package config\", func() {\n\t\t\tit(\"prints deprecation warning\", func() {\n\t\t\t\tvar outBuf bytes.Buffer\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\tcmd := packageBuildpackCommand(withLogger(logger))\n\t\t\t\th.AssertNil(t, cmd.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Warning: Command 'pack package-buildpack' has been deprecated, please use 'pack buildpack package' instead\")\n\t\t\t})\n\n\t\t\tit(\"reads package config from the configured path\", func() {\n\t\t\t\tfakePackageConfigReader := fakes.NewFakePackageConfigReader()\n\t\t\t\texpectedPackageConfigPath := \"/path/to/some/file\"\n\n\t\t\t\tpackageBuildpackCommand := packageBuildpackCommand(\n\t\t\t\t\twithPackageConfigReader(fakePackageConfigReader),\n\t\t\t\t\twithPackageConfigPath(expectedPackageConfigPath),\n\t\t\t\t)\n\n\t\t\t\terr := packageBuildpackCommand.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, 
fakePackageConfigReader.ReadCalledWithArg, expectedPackageConfigPath)\n\t\t\t})\n\n\t\t\tit(\"creates package with correct image name\", func() {\n\t\t\t\tfakeBuildpackPackager := &fakes.FakeBuildpackPackager{}\n\n\t\t\t\tpackageBuildpackCommand := packageBuildpackCommand(\n\t\t\t\t\twithImageName(\"my-specific-image\"),\n\t\t\t\t\twithBuildpackPackager(fakeBuildpackPackager),\n\t\t\t\t)\n\n\t\t\t\terr := packageBuildpackCommand.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\n\t\t\t\th.AssertEq(t, receivedOptions.Name, \"my-specific-image\")\n\t\t\t})\n\n\t\t\tit(\"creates package with config returned by the reader\", func() {\n\t\t\t\tfakeBuildpackPackager := &fakes.FakeBuildpackPackager{}\n\n\t\t\t\tmyConfig := pubbldpkg.Config{\n\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: \"test\"},\n\t\t\t\t}\n\n\t\t\t\tpackageBuildpackCommand := packageBuildpackCommand(\n\t\t\t\t\twithBuildpackPackager(fakeBuildpackPackager),\n\t\t\t\t\twithPackageConfigReader(fakes.NewFakePackageConfigReader(whereReadReturns(myConfig, nil))),\n\t\t\t\t)\n\n\t\t\t\terr := packageBuildpackCommand.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\n\t\t\t\th.AssertEq(t, receivedOptions.Config, myConfig)\n\t\t\t})\n\n\t\t\twhen(\"pull-policy\", func() {\n\t\t\t\tvar (\n\t\t\t\t\toutBuf                bytes.Buffer\n\t\t\t\t\tcmd                   *cobra.Command\n\t\t\t\t\targs                  []string\n\t\t\t\t\tfakeBuildpackPackager *fakes.FakeBuildpackPackager\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\tfakeBuildpackPackager = &fakes.FakeBuildpackPackager{}\n\n\t\t\t\t\tcmd = packageBuildpackCommand(withLogger(logger), withBuildpackPackager(fakeBuildpackPackager))\n\t\t\t\t\targs = []string{\n\t\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\t\"--config\", 
\"/path/to/some/file\",\n\t\t\t\t\t}\n\t\t\t\t})\n\n\t\t\t\tit(\"pull-policy=never sets policy\", func() {\n\t\t\t\t\targs = append(args, \"--pull-policy\", \"never\")\n\t\t\t\t\tcmd.SetArgs(args)\n\n\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullNever)\n\t\t\t\t})\n\n\t\t\t\tit(\"pull-policy=always sets policy\", func() {\n\t\t\t\t\targs = append(args, \"--pull-policy\", \"always\")\n\t\t\t\t\tcmd.SetArgs(args)\n\n\t\t\t\t\terr := cmd.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\treceivedOptions := fakeBuildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullAlways)\n\t\t\t\t})\n\t\t\t\tit(\"takes precedence over a configured pull policy\", func() {\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&bytes.Buffer{}, &bytes.Buffer{})\n\t\t\t\t\tconfigReader := fakes.NewFakePackageConfigReader()\n\t\t\t\t\tbuildpackPackager := &fakes.FakeBuildpackPackager{}\n\t\t\t\t\tclientConfig := config.Config{PullPolicy: \"if-not-present\"}\n\n\t\t\t\t\tcommand := commands.PackageBuildpack(logger, clientConfig, buildpackPackager, configReader)\n\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t\t\t\"never\",\n\t\t\t\t\t})\n\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\treceivedOptions := buildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullNever)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"configured pull policy\", func() {\n\t\t\t\tit(\"uses the configured pull policy\", func() {\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&bytes.Buffer{}, &bytes.Buffer{})\n\t\t\t\t\tconfigReader := fakes.NewFakePackageConfigReader()\n\t\t\t\t\tbuildpackPackager := 
&fakes.FakeBuildpackPackager{}\n\t\t\t\t\tclientConfig := config.Config{PullPolicy: \"never\"}\n\n\t\t\t\t\tcommand := commands.PackageBuildpack(logger, clientConfig, buildpackPackager, configReader)\n\t\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t})\n\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\treceivedOptions := buildpackPackager.CreateCalledWithOptions\n\t\t\t\t\th.AssertEq(t, receivedOptions.PullPolicy, image.PullNever)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"invalid flags\", func() {\n\t\twhen(\"both --publish and --pull-policy never flags are specified\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\tlogger := logging.NewLogWithWriters(&bytes.Buffer{}, &bytes.Buffer{})\n\t\t\t\tconfigReader := fakes.NewFakePackageConfigReader()\n\t\t\t\tbuildpackPackager := &fakes.FakeBuildpackPackager{}\n\t\t\t\tclientConfig := config.Config{}\n\n\t\t\t\tcommand := commands.PackageBuildpack(logger, clientConfig, buildpackPackager, configReader)\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--publish\",\n\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t\t\"never\",\n\t\t\t\t})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"--publish and --pull-policy never cannot be used together. 
The --publish flag requires the use of remote images.\")\n\t\t\t})\n\t\t})\n\n\t\tit(\"logs an error and exits when package toml is invalid\", func() {\n\t\t\toutBuf := &bytes.Buffer{}\n\t\t\texpectedErr := errors.New(\"it went wrong\")\n\n\t\t\tpackageBuildpackCommand := packageBuildpackCommand(\n\t\t\t\twithLogger(logging.NewLogWithWriters(outBuf, outBuf)),\n\t\t\t\twithPackageConfigReader(\n\t\t\t\t\tfakes.NewFakePackageConfigReader(whereReadReturns(pubbldpkg.Config{}, expectedErr)),\n\t\t\t\t),\n\t\t\t)\n\n\t\t\terr := packageBuildpackCommand.Execute()\n\t\t\th.AssertNotNil(t, err)\n\n\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"ERROR: reading config: %s\", expectedErr))\n\t\t})\n\n\t\twhen(\"package-config is specified\", func() {\n\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\toutBuf := &bytes.Buffer{}\n\n\t\t\t\tconfig := &packageCommandConfig{\n\t\t\t\t\tlogger:              logging.NewLogWithWriters(outBuf, outBuf),\n\t\t\t\t\tpackageConfigReader: fakes.NewFakePackageConfigReader(),\n\t\t\t\t\tbuildpackPackager:   &fakes.FakeBuildpackPackager{},\n\n\t\t\t\t\timageName:  \"some-image-name\",\n\t\t\t\t\tconfigPath: \"/path/to/some/file\",\n\t\t\t\t}\n\n\t\t\t\tcmd := commands.PackageBuildpack(config.logger, config.clientConfig, config.buildpackPackager, config.packageConfigReader)\n\t\t\t\tcmd.SetArgs([]string{config.imageName, \"--package-config\", config.configPath})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertError(t, err, \"unknown flag: --package-config\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no config path is specified\", func() {\n\t\t\tit(\"creates a default config\", func() {\n\t\t\t\tconfig := &packageCommandConfig{\n\t\t\t\t\tlogger:              logging.NewLogWithWriters(&bytes.Buffer{}, &bytes.Buffer{}),\n\t\t\t\t\tpackageConfigReader: fakes.NewFakePackageConfigReader(),\n\t\t\t\t\tbuildpackPackager:   &fakes.FakeBuildpackPackager{},\n\n\t\t\t\t\timageName: \"some-image-name\",\n\t\t\t\t}\n\n\t\t\t\tcmd := 
commands.PackageBuildpack(config.logger, config.clientConfig, config.buildpackPackager, config.packageConfigReader)\n\t\t\t\tcmd.SetArgs([]string{config.imageName})\n\n\t\t\t\terr := cmd.Execute()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treceivedOptions := config.buildpackPackager.CreateCalledWithOptions\n\t\t\t\th.AssertEq(t, receivedOptions.Config.Buildpack.URI, \".\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"--pull-policy unknown-policy\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tlogger := logging.NewLogWithWriters(&bytes.Buffer{}, &bytes.Buffer{})\n\t\t\t\tconfigReader := fakes.NewFakePackageConfigReader()\n\t\t\t\tbuildpackPackager := &fakes.FakeBuildpackPackager{}\n\t\t\t\tclientConfig := config.Config{}\n\n\t\t\t\tcommand := commands.PackageBuildpack(logger, clientConfig, buildpackPackager, configReader)\n\t\t\t\tcommand.SetArgs([]string{\n\t\t\t\t\t\"some-image-name\",\n\t\t\t\t\t\"--config\", \"/path/to/some/file\",\n\t\t\t\t\t\"--pull-policy\",\n\t\t\t\t\t\"unknown-policy\",\n\t\t\t\t})\n\n\t\t\t\th.AssertError(t, command.Execute(), \"parsing pull policy\")\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc packageBuildpackCommand(ops ...packageCommandOption) *cobra.Command {\n\tconfig := &packageCommandConfig{\n\t\tlogger:              logging.NewLogWithWriters(&bytes.Buffer{}, &bytes.Buffer{}),\n\t\tpackageConfigReader: fakes.NewFakePackageConfigReader(),\n\t\tbuildpackPackager:   &fakes.FakeBuildpackPackager{},\n\t\tclientConfig:        config.Config{},\n\n\t\timageName:  \"some-image-name\",\n\t\tconfigPath: \"/path/to/some/file\",\n\t}\n\n\tfor _, op := range ops {\n\t\top(config)\n\t}\n\n\tcmd := commands.PackageBuildpack(config.logger, config.clientConfig, config.buildpackPackager, config.packageConfigReader)\n\tcmd.SetArgs([]string{config.imageName, \"--config\", config.configPath})\n\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/rebase.go",
    "content": "package commands\n\nimport (\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc Rebase(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tvar opts client.RebaseOptions\n\tvar policy string\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"rebase <image-name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Rebase app image with latest run image\",\n\t\tExample: \"pack rebase buildpacksio/pack\",\n\t\tLong: \"Rebase allows you to quickly swap out the underlying OS layers (run image) of an app image generated by `pack build` \" +\n\t\t\t\"with a newer version of the run image, without re-building the application.\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\topts.RepoName = args[0]\n\t\t\topts.AdditionalMirrors = getMirrors(cfg)\n\n\t\t\tvar err error\n\t\t\tstringPolicy := policy\n\t\t\tif stringPolicy == \"\" {\n\t\t\t\tstringPolicy = cfg.PullPolicy\n\t\t\t}\n\t\t\topts.PullPolicy, err = image.ParsePullPolicy(stringPolicy)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"parsing pull policy %s\", stringPolicy)\n\t\t\t}\n\n\t\t\tif err := pack.Rebase(cmd.Context(), opts); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully rebased image %s\", style.Symbol(opts.RepoName))\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().BoolVar(&opts.Publish, \"publish\", false, \"Publish the rebased application image directly to the container registry specified in <image-name>, instead of the daemon. 
The previous application image must also reside in the registry.\")\n\tcmd.Flags().StringVar(&opts.RunImage, \"run-image\", \"\", \"Run image to use for rebasing\")\n\tcmd.Flags().StringVar(&policy, \"pull-policy\", \"\", \"Pull policy to use. Accepted values are always, never, and if-not-present. The default is always\")\n\tcmd.Flags().StringVar(&opts.PreviousImage, \"previous-image\", \"\", \"Image to rebase. Set to a particular tag reference, digest reference, or (when performing a daemon build) image ID. Use this flag in combination with <image-name> to avoid replacing the original image.\")\n\tcmd.Flags().StringVar(&opts.ReportDestinationDir, \"report-output-dir\", \"\", \"Path to export build report.toml.\\nOmitting the flag yields no report file.\")\n\tcmd.Flags().BoolVar(&opts.Force, \"force\", false, \"Perform rebase operation without target validation (only available for API >= 0.12)\")\n\tcmd.Flags().StringArrayVar(&opts.InsecureRegistries, \"insecure-registry\", []string{}, \"List of insecure registries (only available for API >= 0.13)\")\n\tAddHelpFlag(cmd, \"rebase\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/rebase_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRebaseCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Commands\", testRebaseCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testRebaseCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tcfg = config.Config{}\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\n\t\tcommand = commands.Rebase(logger, cfg, mockClient)\n\t})\n\n\twhen(\"#RebaseCommand\", func() {\n\t\twhen(\"no image is provided\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"image name is provided\", func() {\n\t\t\tvar (\n\t\t\t\trepoName string\n\t\t\t\topts     client.RebaseOptions\n\t\t\t)\n\t\t\tit.Before(func() {\n\t\t\t\trunImage := \"test/image\"\n\t\t\t\ttestMirror1 := \"example.com/some/run1\"\n\t\t\t\ttestMirror2 := \"example.com/some/run2\"\n\n\t\t\t\tcfg.RunImages = 
[]config.RunImage{{\n\t\t\t\t\tImage:   runImage,\n\t\t\t\t\tMirrors: []string{testMirror1, testMirror2},\n\t\t\t\t}}\n\t\t\t\tcommand = commands.Rebase(logger, cfg, mockClient)\n\n\t\t\t\trepoName = \"test/repo-image\"\n\t\t\t\topts = client.RebaseOptions{\n\t\t\t\t\tRepoName:   repoName,\n\t\t\t\t\tPublish:    false,\n\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\tRunImage:   \"\",\n\t\t\t\t\tAdditionalMirrors: map[string][]string{\n\t\t\t\t\t\trunImage: {testMirror1, testMirror2},\n\t\t\t\t\t},\n\t\t\t\t\tInsecureRegistries: []string{},\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"works\", func() {\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tRebase(gomock.Any(), opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{repoName})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\n\t\t\twhen(\"--pull-policy never\", func() {\n\t\t\t\tit(\"works\", func() {\n\t\t\t\t\topts.PullPolicy = image.PullNever\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tRebase(gomock.Any(), opts).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{repoName, \"--pull-policy\", \"never\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t\tit(\"takes precedence over config policy\", func() {\n\t\t\t\t\topts.PullPolicy = image.PullNever\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tRebase(gomock.Any(), opts).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcfg.PullPolicy = \"if-not-present\"\n\t\t\t\t\tcommand = commands.Rebase(logger, cfg, mockClient)\n\n\t\t\t\t\tcommand.SetArgs([]string{repoName, \"--pull-policy\", \"never\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--pull-policy unknown-policy\", func() {\n\t\t\t\tit(\"fails to run\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{repoName, \"--pull-policy\", \"unknown-policy\"})\n\t\t\t\t\th.AssertError(t, command.Execute(), \"parsing pull policy\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--pull-policy not set\", func() {\n\t\t\t\twhen(\"no policy set in config\", func() 
{\n\t\t\t\t\tit(\"uses the default policy\", func() {\n\t\t\t\t\t\topts.PullPolicy = image.PullAlways\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tRebase(gomock.Any(), opts).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{repoName})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"policy is set in config\", func() {\n\t\t\t\t\tit(\"uses set policy\", func() {\n\t\t\t\t\t\topts.PullPolicy = image.PullIfNotPresent\n\t\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\t\tRebase(gomock.Any(), opts).\n\t\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\t\tcfg.PullPolicy = \"if-not-present\"\n\t\t\t\t\t\tcommand = commands.Rebase(logger, cfg, mockClient)\n\n\t\t\t\t\t\tcommand.SetArgs([]string{repoName})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"--force is set\", func() {\n\t\t\t\t\tit(\"passes it through\", func() {\n\t\t\t\t\t\topts.Force = true\n\t\t\t\t\t\tmockClient.EXPECT().Rebase(gomock.Any(), opts).Return(nil)\n\t\t\t\t\t\tcommand = commands.Rebase(logger, cfg, mockClient)\n\t\t\t\t\t\tcommand.SetArgs([]string{repoName, \"--force\"})\n\t\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"image name and previous image are provided\", func() {\n\t\t\t\tvar expectedOpts client.RebaseOptions\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\trunImage := \"test/image\"\n\t\t\t\t\ttestMirror1 := \"example.com/some/run1\"\n\t\t\t\t\ttestMirror2 := \"example.com/some/run2\"\n\n\t\t\t\t\tcfg.RunImages = []config.RunImage{{\n\t\t\t\t\t\tImage:   runImage,\n\t\t\t\t\t\tMirrors: []string{testMirror1, testMirror2},\n\t\t\t\t\t}}\n\t\t\t\t\tcommand = commands.Rebase(logger, cfg, mockClient)\n\n\t\t\t\t\trepoName = \"test/repo-image\"\n\t\t\t\t\tpreviousImage := \"example.com/previous-image:tag\" // Example of previous image with tag\n\t\t\t\t\topts := client.RebaseOptions{\n\t\t\t\t\t\tRepoName:   repoName,\n\t\t\t\t\t\tPublish:    
false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\tRunImage:   \"\",\n\t\t\t\t\t\tAdditionalMirrors: map[string][]string{\n\t\t\t\t\t\t\trunImage: {testMirror1, testMirror2},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPreviousImage:      previousImage,\n\t\t\t\t\t\tInsecureRegistries: []string{},\n\t\t\t\t\t}\n\t\t\t\t\texpectedOpts = opts\n\t\t\t\t})\n\n\t\t\t\tit(\"works\", func() {\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tRebase(gomock.Any(), gomock.Eq(expectedOpts)).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{repoName, \"--previous-image\", \"example.com/previous-image:tag\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"--insecure-registry is provided\", func() {\n\t\t\t\tit(\"sets one insecure registry\", func() {\n\t\t\t\t\topts.PullPolicy = image.PullAlways\n\t\t\t\t\topts.InsecureRegistries = []string{\n\t\t\t\t\t\t\"foo.bar\",\n\t\t\t\t\t}\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tRebase(gomock.Any(), opts).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{repoName, \"--insecure-registry\", \"foo.bar\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\n\t\t\t\tit(\"sets more than one insecure registry\", func() {\n\t\t\t\t\topts.PullPolicy = image.PullAlways\n\t\t\t\t\topts.InsecureRegistries = []string{\n\t\t\t\t\t\t\"foo.bar\",\n\t\t\t\t\t\t\"foo.com\",\n\t\t\t\t\t}\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tRebase(gomock.Any(), opts).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{repoName, \"--insecure-registry\", \"foo.bar\", \"--insecure-registry\", \"foo.com\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/register_buildpack.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use BuildpackRegister instead\nfunc RegisterBuildpack(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tvar opts client.RegisterBuildpackOptions\n\tvar flags BuildpackRegisterFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"register-buildpack <image>\",\n\t\tHidden:  true,\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Register the buildpack to a registry\",\n\t\tExample: \"pack register-buildpack my-buildpack\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"register-buildpack\", \"buildpack register\")\n\t\t\tregistry, err := config.GetRegistry(cfg, flags.BuildpackRegistry)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\topts.ImageName = args[0]\n\t\t\topts.Type = registry.Type\n\t\t\topts.URL = registry.URL\n\t\t\topts.Name = registry.Name\n\n\t\t\tif err := pack.RegisterBuildpack(cmd.Context(), opts); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully registered %s\", style.Symbol(opts.ImageName))\n\t\t\treturn nil\n\t\t}),\n\t}\n\tcmd.Flags().StringVarP(&flags.BuildpackRegistry, \"buildpack-registry\", \"r\", \"\", \"Buildpack Registry name\")\n\tAddHelpFlag(cmd, \"register-buildpack\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/register_buildpack_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRegisterBuildpackCommand(t *testing.T) {\n\tspec.Run(t, \"Commands\", testRegisterBuildpackCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testRegisterBuildpackCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcfg = config.Config{}\n\n\t\tcommand = commands.RegisterBuildpack(logger, cfg, mockClient)\n\t})\n\n\tit.After(func() {})\n\n\twhen(\"#RegisterBuildpackCommand\", func() {\n\t\twhen(\"no image is provided\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"image name is provided\", func() {\n\t\t\tvar (\n\t\t\t\tbuildpackImage string\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\tbuildpackImage = \"buildpack/image\"\n\t\t\t})\n\n\t\t\tit(\"should work for required args\", func() {\n\t\t\t\topts := client.RegisterBuildpackOptions{\n\t\t\t\t\tImageName: buildpackImage,\n\t\t\t\t\tType:      \"github\",\n\t\t\t\t\tURL:       
\"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\tName:      \"official\",\n\t\t\t\t}\n\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tRegisterBuildpack(gomock.Any(), opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{buildpackImage})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\n\t\t\twhen(\"config.toml exists\", func() {\n\t\t\t\tit(\"should consume registry config values\", func() {\n\t\t\t\t\tcfg = config.Config{\n\t\t\t\t\t\tDefaultRegistryName: \"berneuse\",\n\t\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"berneuse\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  \"https://github.com/berneuse/buildpack-registry\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\t\t\t\t\tcommand = commands.RegisterBuildpack(logger, cfg, mockClient)\n\t\t\t\t\topts := client.RegisterBuildpackOptions{\n\t\t\t\t\t\tImageName: buildpackImage,\n\t\t\t\t\t\tType:      \"github\",\n\t\t\t\t\t\tURL:       \"https://github.com/berneuse/buildpack-registry\",\n\t\t\t\t\t\tName:      \"berneuse\",\n\t\t\t\t\t}\n\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tRegisterBuildpack(gomock.Any(), opts).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{buildpackImage})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\n\t\t\t\tit(\"should handle config errors\", func() {\n\t\t\t\t\tcfg = config.Config{\n\t\t\t\t\t\tDefaultRegistryName: \"missing registry\",\n\t\t\t\t\t}\n\t\t\t\t\tcommand = commands.RegisterBuildpack(logger, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{buildpackImage})\n\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"should support buildpack-registry flag\", func() {\n\t\t\t\tbuildpackRegistry := \"override\"\n\t\t\t\tcfg = config.Config{\n\t\t\t\t\tDefaultRegistryName: \"default\",\n\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"default\",\n\t\t\t\t\t\t\tType: 
\"github\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/default/buildpack-registry\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"override\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/override/buildpack-registry\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\topts := client.RegisterBuildpackOptions{\n\t\t\t\t\tImageName: buildpackImage,\n\t\t\t\t\tType:      \"github\",\n\t\t\t\t\tURL:       \"https://github.com/override/buildpack-registry\",\n\t\t\t\t\tName:      \"override\",\n\t\t\t\t}\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tRegisterBuildpack(gomock.Any(), opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand = commands.RegisterBuildpack(logger, cfg, mockClient)\n\t\t\t\tcommand.SetArgs([]string{buildpackImage, \"--buildpack-registry\", buildpackRegistry})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/remove_registry.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use config registries remove instead\nfunc RemoveRegistry(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"remove-registry <name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tHidden:  true,\n\t\tShort:   \"Remove registry\",\n\t\tExample: \"pack remove-registry myregistry\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"remove-registry\", \"config registries remove\")\n\t\t\treturn removeRegistry(args, logger, cfg, cfgPath)\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"remove-registry\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/remove_registry_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRemoveRegistry(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"RemoveRegistryCommand\", testRemoveRegistryCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testRemoveRegistryCommand(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#RemoveRegistry\", func() {\n\t\tvar (\n\t\t\toutBuf     bytes.Buffer\n\t\t\tlogger     = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\ttmpDir     string\n\t\t\tconfigFile string\n\t\t\tcfg        config.Config\n\t\t\tassert     = h.NewAssertionManager(t)\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack-home-*\")\n\t\t\tassert.Nil(err)\n\n\t\t\tcfg = config.Config{\n\t\t\t\tDefaultRegistryName: \"buildpack-registry\",\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"buildpack-registry\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"elbandito-registry\",\n\t\t\t\t\t\tURL:  \"https://github.com/elbandito/registry-index\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tconfigFile = filepath.Join(tmpDir, \"config.toml\")\n\t\t\terr = config.Write(cfg, configFile)\n\t\t\tassert.Nil(err)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\t_ = os.RemoveAll(tmpDir)\n\t\t})\n\n\t\tit(\"should remove the registry\", func() {\n\t\t\tcommand := commands.RemoveRegistry(logger, cfg, 
configFile)\n\t\t\tcommand.SetArgs([]string{\"elbandito-registry\"})\n\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\tnewCfg, err := config.Read(configFile)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(newCfg, config.Config{\n\t\t\t\tDefaultRegistryName: \"buildpack-registry\",\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"buildpack-registry\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\tassert.Contains(outBuf.String(), \"been deprecated, please use 'pack config registries remove' instead\")\n\t\t})\n\n\t\tit(\"should remove the registry and matching default registry name\", func() {\n\t\t\tcommand := commands.RemoveRegistry(logger, cfg, configFile)\n\t\t\tcommand.SetArgs([]string{\"buildpack-registry\"})\n\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\tnewCfg, err := config.Read(configFile)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(newCfg, config.Config{\n\t\t\t\tDefaultRegistryName: config.OfficialRegistryName,\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"elbandito-registry\",\n\t\t\t\t\t\tURL:  \"https://github.com/elbandito/registry-index\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t})\n\n\t\tit(\"should return error when registry does NOT already exist\", func() {\n\t\t\tcommand := commands.RemoveRegistry(logger, cfg, configFile)\n\t\t\tcommand.SetArgs([]string{\"missing-registry\"})\n\t\t\tassert.Error(command.Execute())\n\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, \"registry 'missing-registry' does not exist\")\n\t\t})\n\n\t\tit(\"should throw error when registry name is official\", func() {\n\t\t\tcommand := commands.RemoveRegistry(logger, config.Config{}, configFile)\n\t\t\tcommand.SetArgs([]string{\"official\"})\n\t\t\tassert.Error(command.Execute())\n\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, \"'official' is a 
reserved registry name, please provide a different registry\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/report.go",
    "content": "package commands\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"runtime\"\n\t\"strings\"\n\t\"text/template\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc Report(logger logging.Logger, version, cfgPath string) *cobra.Command {\n\tvar explicit bool\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"report\",\n\t\tArgs:    cobra.NoArgs,\n\t\tShort:   \"Display useful information for reporting an issue\",\n\t\tExample: \"pack report\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tvar buf bytes.Buffer\n\t\t\terr := generateOutput(&buf, version, cfgPath, explicit)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tlogger.Info(buf.String())\n\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tcmd.Flags().BoolVarP(&explicit, \"explicit\", \"e\", false, \"Print config without redacting information\")\n\tAddHelpFlag(cmd, \"report\")\n\treturn cmd\n}\n\nfunc generateOutput(writer io.Writer, version, cfgPath string, explicit bool) error {\n\ttpl := template.Must(template.New(\"\").Parse(`Pack:\n  Version:  {{ .Version }}\n  OS/Arch:  {{ .OS }}/{{ .Arch }}\n\nDefault Lifecycle Version:  {{ .DefaultLifecycleVersion }}\n\nSupported Platform APIs:  {{ .SupportedPlatformAPIs }}\n\nConfig:\n{{ .Config -}}`))\n\n\tconfigData := \"\"\n\tif data, err := os.ReadFile(filepath.Clean(cfgPath)); err != nil {\n\t\tconfigData = fmt.Sprintf(\"(no config file found at %s)\", cfgPath)\n\t} else {\n\t\tvar padded strings.Builder\n\n\t\tfor _, line := range strings.Split(string(data), \"\\n\") {\n\t\t\tif !explicit {\n\t\t\t\tline = sanitize(line)\n\t\t\t}\n\t\t\t_, _ = fmt.Fprintf(&padded, \"  %s\\n\", line)\n\t\t}\n\t\tconfigData = strings.TrimRight(padded.String(), \" \\n\")\n\t}\n\n\tplatformAPIs := strings.Join(build.SupportedPlatformAPIVersions.AsStrings(), 
\", \")\n\n\treturn tpl.Execute(writer, map[string]string{\n\t\t\"Version\":                 version,\n\t\t\"OS\":                      runtime.GOOS,\n\t\t\"Arch\":                    runtime.GOARCH,\n\t\t\"DefaultLifecycleVersion\": builder.DefaultLifecycleVersion,\n\t\t\"SupportedPlatformAPIs\":   platformAPIs,\n\t\t\"Config\":                  configData,\n\t})\n}\n\nfunc sanitize(line string) string {\n\tre := regexp.MustCompile(`\"(.*?)\"`)\n\tredactedString := `\"[REDACTED]\"`\n\tsensitiveFields := []string{\n\t\t\"default-builder-image\",\n\t\t\"image\",\n\t\t\"mirrors\",\n\t\t\"name\",\n\t\t\"url\",\n\t}\n\tfor _, field := range sensitiveFields {\n\t\tif strings.HasPrefix(strings.TrimSpace(line), field) {\n\t\t\treturn re.ReplaceAllString(line, redactedString)\n\t\t}\n\t}\n\n\treturn line\n}\n"
  },
  {
    "path": "internal/commands/report_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestReport(t *testing.T) {\n\tspec.Run(t, \"ReportCommand\", testReportCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testReportCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand           *cobra.Command\n\t\tlogger            logging.Logger\n\t\toutBuf            bytes.Buffer\n\t\ttempPackHome      string\n\t\tpackConfigPath    string\n\t\ttempPackEmptyHome string\n\t\ttestVersion       = \"1.2.3\"\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\n\t\tpackConfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\t\tcommand = commands.Report(logger, testVersion, packConfigPath)\n\t\tcommand.SetArgs([]string{})\n\t\th.AssertNil(t, os.WriteFile(packConfigPath, []byte(`\ndefault-builder-image = \"some/image\"\nexperimental = true\n\n[[run-images]]\n  image = \"super-secret-project/run\"\n  mirrors = [\"gcr.io/super-secret-project/run\", \"secret.io/super-secret-project/run\"]\n\n[[trusted-builders]]\n  name = \"super-secret-project/builder\"\n\n[[registries]]\n  name = \"secret-registry\"\n  type = \"github\"\n  url = \"https://github.com/super-secret-project/registry\"\n`), 0666))\n\n\t\ttempPackEmptyHome, err = os.MkdirTemp(\"\", \"\")\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t\th.AssertNil(t, os.RemoveAll(tempPackEmptyHome))\n\t})\n\n\twhen(\"#ReportCommand\", func() {\n\t\twhen(\"config.toml is present\", func() {\n\t\t\tit(\"presents output\", 
func() {\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), `experimental = true`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `Version:  `+testVersion)\n\n\t\t\t\th.AssertContains(t, outBuf.String(), `default-builder-image = \"[REDACTED]\"`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `name = \"[REDACTED]\"`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `url = \"[REDACTED]\"`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `image = \"[REDACTED]\"`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `mirrors = [\"[REDACTED]\", \"[REDACTED]\"]`)\n\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `default-builder-image = \"some/image\"`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `image = \"super-secret-project/run\"`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `mirrors = [\"gcr.io/super-secret-project/run\", \"secret.io/super-secret-project/run\"]`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `name = \"super-secret-project/builder\"`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `name = \"secret-registry\"`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `url = \"https://github.com/super-secret-project/registry\"`)\n\t\t\t})\n\n\t\t\tit(\"doesn't sanitize output if explicit\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"-e\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), `experimental = true`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `Version:  `+testVersion)\n\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `default-builder-image = \"[REDACTED]\"`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `name = \"[REDACTED]\"`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `url = \"[REDACTED]\"`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `image = \"[REDACTED]\"`)\n\t\t\t\th.AssertNotContains(t, outBuf.String(), `mirrors = [\"[REDACTED]\", \"[REDACTED]\"]`)\n\n\t\t\t\th.AssertContains(t, outBuf.String(), `default-builder-image = 
\"some/image\"`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `image = \"super-secret-project/run\"`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `mirrors = [\"gcr.io/super-secret-project/run\", \"secret.io/super-secret-project/run\"]`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `name = \"super-secret-project/builder\"`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `name = \"secret-registry\"`)\n\t\t\t\th.AssertContains(t, outBuf.String(), `url = \"https://github.com/super-secret-project/registry\"`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config.toml is not present\", func() {\n\t\t\tit(\"logs a message\", func() {\n\t\t\t\tcommand = commands.Report(logger, testVersion, filepath.Join(tempPackEmptyHome, \"/config.toml\"))\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"(no config file found at %s)\", filepath.Join(tempPackEmptyHome, \"config.toml\")))\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/sbom.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc NewSBOMCommand(logger logging.Logger, cfg config.Config, client PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:   \"sbom\",\n\t\tShort: \"Interact with SBoM\",\n\t\tRunE:  nil,\n\t}\n\n\tcmd.AddCommand(DownloadSBOM(logger, client))\n\tAddHelpFlag(cmd, \"sbom\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/set_default_builder.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use `pack config default-builder`\nfunc SetDefaultBuilder(logger logging.Logger, cfg config.Config, cfgPath string, client PackClient) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"set-default-builder <builder-name>\",\n\t\tHidden:  true,\n\t\tArgs:    cobra.MaximumNArgs(1),\n\t\tShort:   \"Set default builder used by other commands\",\n\t\tLong:    \"Set default builder used by other commands.\\n\\n** For suggested builders simply leave builder name empty. **\",\n\t\tExample: \"pack set-default-builder cnbs/sample-builder:bionic\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"set-default-builder\", \"config default-builder\")\n\t\t\tif len(args) < 1 || args[0] == \"\" {\n\t\t\t\tlogger.Infof(\"Usage:\\n\\t%s\\n\", cmd.UseLine())\n\t\t\t\tsuggestBuilders(logger, client)\n\t\t\t\treturn nil\n\t\t\t}\n\n\t\t\timageName := args[0]\n\n\t\t\tlogger.Debug(\"Verifying local image...\")\n\t\t\tinfo, err := client.InspectBuilder(imageName, true)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tif info == nil {\n\t\t\t\tlogger.Debug(\"Verifying remote image...\")\n\t\t\t\tinfo, err := client.InspectBuilder(imageName, false)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\n\t\t\t\tif info == nil {\n\t\t\t\t\treturn fmt.Errorf(\"builder %s not found\", style.Symbol(imageName))\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tcfg.DefaultBuilder = imageName\n\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Builder %s is now the default builder\", style.Symbol(imageName))\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"set-default-builder\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/set_default_builder_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestSetDefaultBuilderCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Commands\", testSetDefaultBuilderCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testSetDefaultBuilderCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\ttempPackHome   string\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\tvar err error\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tcommand = commands.SetDefaultBuilder(logger, config.Config{}, filepath.Join(tempPackHome, \"config.toml\"), mockClient)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#SetDefaultBuilder\", func() {\n\t\twhen(\"no builder provided\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuilder(gomock.Any(), false).Return(&client.BuilderInfo{}, nil).AnyTimes()\n\t\t\t})\n\n\t\t\tit(\"display suggested builders\", func() {\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\th.AssertNil(t, 
command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Suggested builders:\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"empty builder name is provided\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuilder(gomock.Any(), false).Return(&client.BuilderInfo{}, nil).AnyTimes()\n\t\t\t})\n\n\t\t\tit(\"display suggested builders\", func() {\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Suggested builders:\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"valid builder is provided\", func() {\n\t\t\twhen(\"in local\", func() {\n\t\t\t\tit(\"sets default builder\", func() {\n\t\t\t\t\timageName := \"some/image\"\n\t\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, true).Return(&client.BuilderInfo{\n\t\t\t\t\t\tStack: \"test.stack.id\",\n\t\t\t\t\t}, nil)\n\n\t\t\t\t\tcommand.SetArgs([]string{imageName})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"Builder '%s' is now the default builder\", imageName))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"in remote\", func() {\n\t\t\t\tit(\"sets default builder\", func() {\n\t\t\t\t\timageName := \"some/image\"\n\n\t\t\t\t\tlocalCall := mockClient.EXPECT().InspectBuilder(imageName, true).Return(nil, nil)\n\n\t\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, false).Return(&client.BuilderInfo{\n\t\t\t\t\t\tStack: \"test.stack.id\",\n\t\t\t\t\t}, nil).After(localCall)\n\n\t\t\t\t\tcommand.SetArgs([]string{imageName})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"Builder '%s' is now the default builder\", imageName))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"invalid builder is provided\", func() {\n\t\t\tit(\"error is presented\", func() {\n\t\t\t\timageName := \"nonbuilder/image\"\n\n\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, true).Return(\n\t\t\t\t\tnil,\n\t\t\t\t\tfmt.Errorf(\"failed to inspect image 
%s\", imageName))\n\n\t\t\t\tcommand.SetArgs([]string{imageName})\n\n\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"ERROR: failed to inspect image nonbuilder/image\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"non-existent builder is provided\", func() {\n\t\t\tit(\"error is presented\", func() {\n\t\t\t\timageName := \"nonexisting/image\"\n\n\t\t\t\tlocalCall := mockClient.EXPECT().InspectBuilder(imageName, true).Return(\n\t\t\t\t\tnil,\n\t\t\t\t\tnil)\n\n\t\t\t\tmockClient.EXPECT().InspectBuilder(imageName, false).Return(\n\t\t\t\t\tnil,\n\t\t\t\t\tnil).After(localCall)\n\n\t\t\t\tcommand.SetArgs([]string{imageName})\n\n\t\t\t\th.AssertNotNil(t, command.Execute())\n\t\t\t\th.AssertContains(t, outBuf.String(), \"ERROR: builder 'nonexisting/image' not found\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/set_default_registry.go",
    "content": "package commands\n\nimport (\n\t\"github.com/pkg/errors\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use `pack config registries default` instead\nfunc SetDefaultRegistry(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tvar (\n\t\tregistryName string\n\t)\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"set-default-registry <name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tHidden:  true,\n\t\tShort:   \"Set default registry\",\n\t\tExample: \"pack set-default-registry myregistry\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"set-default-registry\", \"config registries default\")\n\t\t\tregistryName = args[0]\n\t\t\tif !registriesContains(config.GetRegistries(cfg), registryName) {\n\t\t\t\treturn errors.Errorf(\"no registry with the name %s exists\", style.Symbol(registryName))\n\t\t\t}\n\n\t\t\tif cfg.DefaultRegistryName != registryName {\n\t\t\t\tcfg.DefaultRegistryName = registryName\n\t\t\t\terr := config.Write(cfg, cfgPath)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tlogger.Infof(\"Successfully set %s as the default registry\", style.Symbol(registryName))\n\n\t\t\treturn nil\n\t\t}),\n\t}\n\tAddHelpFlag(cmd, \"set-default-registry\")\n\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/set_default_registry_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestSetDefaultRegistry(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"SetDefaultRegistryCommand\", testSetDefaultRegistryCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testSetDefaultRegistryCommand(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#SetDefaultRegistry\", func() {\n\t\tvar (\n\t\t\toutBuf     bytes.Buffer\n\t\t\tlogger     = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\ttmpDir     string\n\t\t\tconfigFile string\n\t\t\tassert     = h.NewAssertionManager(t)\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack-home-*\")\n\t\t\tassert.Nil(err)\n\n\t\t\tconfigFile = filepath.Join(tmpDir, \"config.toml\")\n\t\t})\n\n\t\tit.After(func() {\n\t\t\t_ = os.RemoveAll(tmpDir)\n\t\t})\n\n\t\tit(\"should set the default registry\", func() {\n\t\t\tcfg := config.Config{\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"myregistry\",\n\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t\tcommand := commands.SetDefaultRegistry(logger, cfg, configFile)\n\t\t\tcommand.SetArgs([]string{\"myregistry\"})\n\t\t\tassert.Succeeds(command.Execute())\n\n\t\t\tcfg, err := config.Read(configFile)\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(cfg.DefaultRegistryName, \"myregistry\")\n\t\t\tassert.Contains(outBuf.String(), \"has been deprecated, please use 'pack config registries default'\")\n\t\t})\n\n\t\tit(\"should fail if no corresponding 
registry exists\", func() {\n\t\t\tcommand := commands.SetDefaultRegistry(logger, config.Config{}, configFile)\n\t\t\tcommand.SetArgs([]string{\"myregistry\"})\n\t\t\tassert.Error(command.Execute())\n\n\t\t\toutput := outBuf.String()\n\t\t\th.AssertContains(t, output, \"no registry with the name 'myregistry' exists\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/set_run_image_mirrors.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use `pack config run-image-mirrors add` instead\n// SetRunImagesMirrors sets run image mirros for a given run image\nfunc SetRunImagesMirrors(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tvar mirrors []string\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"set-run-image-mirrors <run-image-name> --mirror <run-image-mirror>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tHidden:  true,\n\t\tShort:   \"Set mirrors to other repositories for a given run image\",\n\t\tExample: \"pack set-run-image-mirrors cnbs/sample-stack-run:bionic --mirror index.docker.io/cnbs/sample-stack-run:bionic\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"set-run-image-mirrors\", \"config run-image-mirrors\")\n\t\t\trunImage := args[0]\n\t\t\tcfg = config.SetRunImageMirrors(cfg, runImage, mirrors)\n\t\t\tif err := config.Write(cfg, cfgPath); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tfor _, mirror := range mirrors {\n\t\t\t\tlogger.Infof(\"Run Image %s configured with mirror %s\", style.Symbol(runImage), style.Symbol(mirror))\n\t\t\t}\n\t\t\tif len(mirrors) == 0 {\n\t\t\t\tlogger.Infof(\"All mirrors removed for Run Image %s\", style.Symbol(runImage))\n\t\t\t}\n\t\t\treturn nil\n\t\t}),\n\t}\n\tcmd.Flags().StringSliceVarP(&mirrors, \"mirror\", \"m\", nil, \"Run image mirror\"+stringSliceHelp(\"mirror\"))\n\tAddHelpFlag(cmd, \"set-run-image-mirrors\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/set_run_image_mirrors_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestSetRunImageMirrorsCommand(t *testing.T) {\n\tspec.Run(t, \"Commands\", testSetRunImageMirrorsCommand, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testSetRunImageMirrorsCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand      *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\tcfg          config.Config\n\t\ttempPackHome string\n\t\tcfgPath      string\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tcfg = config.Config{}\n\t\tvar err error\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tcfgPath = filepath.Join(tempPackHome, \"config.toml\")\n\n\t\tcommand = commands.SetRunImagesMirrors(logger, cfg, cfgPath)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#SetRunImageMirrors\", func() {\n\t\tvar (\n\t\t\trunImage        string\n\t\t\ttestMirror1     string\n\t\t\ttestMirror2     string\n\t\t\ttestRunImageCfg []config.RunImage\n\t\t)\n\t\tit.Before(func() {\n\t\t\trunImage = \"test/image\"\n\t\t\ttestMirror1 = \"example.com/some/run1\"\n\t\t\ttestMirror2 = \"example.com/some/run2\"\n\t\t\ttestRunImageCfg = []config.RunImage{{\n\t\t\t\tImage:   runImage,\n\t\t\t\tMirrors: []string{testMirror1, testMirror2},\n\t\t\t}}\n\t\t})\n\n\t\twhen(\"no run image is specified\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"mirrors are provided\", func() 
{\n\t\t\tit(\"adds them as mirrors to the config\", func() {\n\t\t\t\tcommand.SetArgs([]string{runImage, \"-m\", testMirror1, \"-m\", testMirror2})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\tcfg, err := config.Read(cfgPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.RunImages, testRunImageCfg)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no mirrors are provided\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tcfg.RunImages = testRunImageCfg\n\t\t\t\tcommand = commands.SetRunImagesMirrors(logger, cfg, cfgPath)\n\t\t\t})\n\n\t\t\tit(\"removes all mirrors for the run image\", func() {\n\t\t\t\tcommand.SetArgs([]string{runImage})\n\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\tcfg, err := config.Read(cfgPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, cfg.RunImages, []config.RunImage{{Image: runImage}})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/stack.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc NewStackCommand(logger logging.Logger) *cobra.Command {\n\tcommand := cobra.Command{\n\t\tUse:   \"stack\",\n\t\tShort: \"(deprecated) Interact with stacks\",\n\t\tLong:  \"(Deprecated)\\nStacks are deprecated in favor of using BuildImages and RunImages directly, but will continue to be supported throughout all of 2023 and '24 if not longer. Please see our docs for more details- https://buildpacks.io/docs/concepts/components/stack\",\n\t\tRunE:  nil,\n\t}\n\n\tcommand.AddCommand(stackSuggest(logger))\n\treturn &command\n}\n"
  },
  {
    "path": "internal/commands/stack_suggest.go",
    "content": "package commands\n\nimport (\n\t\"bytes\"\n\t\"html/template\"\n\t\"sort\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype suggestedStack struct {\n\tID          string\n\tDescription string\n\tMaintainer  string\n\tBuildImage  string\n\tRunImage    string\n}\n\nvar suggestedStacks = []suggestedStack{\n\t{\n\t\tID:          \"Deprecation Notice\",\n\t\tDescription: \"Stacks are deprecated in favor of using BuildImages and RunImages directly, but will continue to be supported throughout all of 2023 and 2024 if not longer. Please see our docs for more details- https://buildpacks.io/docs/concepts/components/stack\",\n\t\tMaintainer:  \"CNB\",\n\t},\n\t{\n\t\tID:          \"heroku-20\",\n\t\tDescription: \"The official Heroku stack based on Ubuntu 20.04\",\n\t\tMaintainer:  \"Heroku\",\n\t\tBuildImage:  \"heroku/heroku:20-cnb-build\",\n\t\tRunImage:    \"heroku/heroku:20-cnb\",\n\t},\n\t{\n\t\tID:          \"io.buildpacks.stacks.jammy\",\n\t\tDescription: \"A minimal Paketo stack based on Ubuntu 22.04\",\n\t\tMaintainer:  \"Paketo Project\",\n\t\tBuildImage:  \"paketobuildpacks/build-jammy-base\",\n\t\tRunImage:    \"paketobuildpacks/run-jammy-base\",\n\t},\n\t{\n\t\tID:          \"io.buildpacks.stacks.jammy\",\n\t\tDescription: \"A large Paketo stack based on Ubuntu 22.04\",\n\t\tMaintainer:  \"Paketo Project\",\n\t\tBuildImage:  \"paketobuildpacks/build-jammy-full\",\n\t\tRunImage:    \"paketobuildpacks/run-jammy-full\",\n\t},\n\t{\n\t\tID:          \"io.buildpacks.stacks.jammy.tiny\",\n\t\tDescription: \"A tiny Paketo stack based on Ubuntu 22.04, similar to distroless\",\n\t\tMaintainer:  \"Paketo Project\",\n\t\tBuildImage:  \"paketobuildpacks/build-jammy-tiny\",\n\t\tRunImage:    \"paketobuildpacks/run-jammy-tiny\",\n\t},\n\t{\n\t\tID:          \"io.buildpacks.stacks.jammy.static\",\n\t\tDescription: \"A static Paketo stack based on Ubuntu 22.04, similar to distroless\",\n\t\tMaintainer:  \"Paketo 
Project\",\n\t\tBuildImage:  \"paketobuildpacks/build-jammy-static\",\n\t\tRunImage:    \"paketobuildpacks/run-jammy-static\",\n\t},\n}\n\nfunc stackSuggest(logger logging.Logger) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"suggest\",\n\t\tArgs:    cobra.NoArgs,\n\t\tShort:   \"(deprecated) List the recommended stacks\",\n\t\tExample: \"pack stack suggest\",\n\t\tRunE: logError(logger, func(*cobra.Command, []string) error {\n\t\t\tSuggest(logger)\n\t\t\treturn nil\n\t\t}),\n\t}\n\n\treturn cmd\n}\n\nfunc Suggest(log logging.Logger) {\n\tsort.Slice(suggestedStacks, func(i, j int) bool { return suggestedStacks[i].ID < suggestedStacks[j].ID })\n\ttmpl := template.Must(template.New(\"\").Parse(`Stacks maintained by the community:\n{{- range . }}\n\n    Stack ID: {{ .ID }}\n    Description: {{ .Description }}\n    Maintainer: {{ .Maintainer }}\n    Build Image: {{ .BuildImage }}\n    Run Image: {{ .RunImage }}\n{{- end }}\n`))\n\n\tbuf := &bytes.Buffer{}\n\ttmpl.Execute(buf, suggestedStacks)\n\tlog.Info(buf.String())\n}\n"
  },
  {
    "path": "internal/commands/stack_suggest_test.go",
    "content": "package commands\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestStacksSuggestCommand(t *testing.T) {\n\tspec.Run(t, \"StacksSuggestCommand\", testStacksSuggestCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testStacksSuggestCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand *cobra.Command\n\t\toutBuf  bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tcommand = stackSuggest(logging.NewLogWithWriters(&outBuf, &outBuf))\n\t})\n\n\twhen(\"#SuggestStacks\", func() {\n\t\tit(\"displays stack information\", func() {\n\t\t\tcommand.SetArgs([]string{})\n\t\t\th.AssertNil(t, command.Execute())\n\t\t\th.AssertEq(t, outBuf.String(), `Stacks maintained by the community:\n\n    Stack ID: Deprecation Notice\n    Description: Stacks are deprecated in favor of using BuildImages and RunImages directly, but will continue to be supported throughout all of 2023 and 2024 if not longer. 
Please see our docs for more details- https://buildpacks.io/docs/concepts/components/stack\n    Maintainer: CNB\n    Build Image: \n    Run Image: \n\n    Stack ID: heroku-20\n    Description: The official Heroku stack based on Ubuntu 20.04\n    Maintainer: Heroku\n    Build Image: heroku/heroku:20-cnb-build\n    Run Image: heroku/heroku:20-cnb\n\n    Stack ID: io.buildpacks.stacks.jammy\n    Description: A minimal Paketo stack based on Ubuntu 22.04\n    Maintainer: Paketo Project\n    Build Image: paketobuildpacks/build-jammy-base\n    Run Image: paketobuildpacks/run-jammy-base\n\n    Stack ID: io.buildpacks.stacks.jammy\n    Description: A large Paketo stack based on Ubuntu 22.04\n    Maintainer: Paketo Project\n    Build Image: paketobuildpacks/build-jammy-full\n    Run Image: paketobuildpacks/run-jammy-full\n\n    Stack ID: io.buildpacks.stacks.jammy.static\n    Description: A static Paketo stack based on Ubuntu 22.04, similar to distroless\n    Maintainer: Paketo Project\n    Build Image: paketobuildpacks/build-jammy-static\n    Run Image: paketobuildpacks/run-jammy-static\n\n    Stack ID: io.buildpacks.stacks.jammy.tiny\n    Description: A tiny Paketo stack based on Ubuntu 22.04, similar to distroless\n    Maintainer: Paketo Project\n    Build Image: paketobuildpacks/build-jammy-tiny\n    Run Image: paketobuildpacks/run-jammy-tiny\n`)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/stack_test.go",
    "content": "package commands\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestStackCommand(t *testing.T) {\n\tspec.Run(t, \"StackCommand\", testStackCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testStackCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand *cobra.Command\n\t\toutBuf  bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tcommand = NewStackCommand(logging.NewLogWithWriters(&outBuf, &outBuf))\n\t})\n\n\twhen(\"#Stack\", func() {\n\t\tit(\"displays stack information\", func() {\n\t\t\tcommand.SetArgs([]string{})\n\t\t\tbb := bytes.NewBufferString(\"\") // In most tests we don't seem to need to this, not sure why it's necessary here.\n\t\t\tcommand.SetOut(bb)\n\t\t\th.AssertNil(t, command.Execute())\n\t\t\th.AssertEq(t, bb.String(), `(Deprecated)\nStacks are deprecated in favor of using BuildImages and RunImages directly, but will continue to be supported throughout all of 2023 and '24 if not longer. Please see our docs for more details- https://buildpacks.io/docs/concepts/components/stack\n\nUsage:\n  stack [command]\n\nAvailable Commands:\n  completion  Generate the autocompletion script for the specified shell\n  help        Help about any command\n  suggest     (deprecated) List the recommended stacks\n\nFlags:\n  -h, --help   help for stack\n\nUse \"stack [command] --help\" for more information about a command.\n`)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/suggest_builders.go",
    "content": "package commands\n\nimport (\n\t\"fmt\"\n\t\"sort\"\n\t\"sync\"\n\t\"text/tabwriter\"\n\n\t\"github.com/spf13/cobra\"\n\n\tbldr \"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use `builder suggest` instead.\nfunc SuggestBuilders(logger logging.Logger, inspector BuilderInspector) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"suggest-builders\",\n\t\tHidden:  true,\n\t\tArgs:    cobra.NoArgs,\n\t\tShort:   \"Display list of recommended builders\",\n\t\tExample: \"pack suggest-builders\",\n\t\tRun: func(cmd *cobra.Command, s []string) {\n\t\t\tdeprecationWarning(logger, \"suggest-builder\", \"builder suggest\")\n\t\t\tsuggestBuilders(logger, inspector)\n\t\t},\n\t}\n\n\treturn cmd\n}\n\nfunc suggestSettingBuilder(logger logging.Logger, inspector BuilderInspector) {\n\tlogger.Info(\"Please select a default builder with:\")\n\tlogger.Info(\"\")\n\tlogger.Info(\"\\tpack config default-builder <builder-image>\")\n\tlogger.Info(\"\")\n\tsuggestBuilders(logger, inspector)\n}\n\nfunc suggestBuilders(logger logging.Logger, client BuilderInspector) {\n\tsuggestedBuilders := []bldr.KnownBuilder{}\n\tfor _, knownBuilder := range bldr.KnownBuilders {\n\t\tif knownBuilder.Suggested {\n\t\t\tsuggestedBuilders = append(suggestedBuilders, knownBuilder)\n\t\t}\n\t}\n\tWriteSuggestedBuilder(logger, client, suggestedBuilders)\n}\n\nfunc WriteSuggestedBuilder(logger logging.Logger, inspector BuilderInspector, builders []bldr.KnownBuilder) {\n\tsort.Slice(builders, func(i, j int) bool {\n\t\tif builders[i].Vendor == builders[j].Vendor {\n\t\t\treturn builders[i].Image < builders[j].Image\n\t\t}\n\n\t\treturn builders[i].Vendor < builders[j].Vendor\n\t})\n\n\tlogger.Info(\"Suggested builders:\")\n\n\t// Fetch descriptions concurrently.\n\tdescriptions := make([]string, len(builders))\n\n\tvar wg sync.WaitGroup\n\twg.Add(len(builders))\n\n\tfor i, 
builder := range builders {\n\t\tgo func(w *sync.WaitGroup, i int, builder bldr.KnownBuilder) {\n\t\t\tdescriptions[i] = getBuilderDescription(builder, inspector)\n\t\t\tw.Done()\n\t\t}(&wg, i, builder)\n\t}\n\n\twg.Wait()\n\n\ttw := tabwriter.NewWriter(logger.Writer(), 10, 10, 5, ' ', tabwriter.TabIndent)\n\tfor i, builder := range builders {\n\t\tfmt.Fprintf(tw, \"\\t%s:\\t%s\\t%s\\t\\n\", builder.Vendor, style.Symbol(builder.Image), descriptions[i])\n\t}\n\tfmt.Fprintln(tw)\n\n\tlogging.Tip(logger, \"Learn more about a specific builder with:\")\n\tlogger.Info(\"\\tpack builder inspect <builder-image>\")\n}\n\nfunc getBuilderDescription(builder bldr.KnownBuilder, inspector BuilderInspector) string {\n\tinfo, err := inspector.InspectBuilder(builder.Image, false)\n\tif err == nil && info != nil && info.Description != \"\" {\n\t\treturn info.Description\n\t}\n\n\treturn builder.DefaultDescription\n}\n"
  },
  {
    "path": "internal/commands/suggest_builders_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tbldr \"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestSuggestBuildersCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"SuggestBuilderCommand\", testSuggestBuildersCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testSuggestBuildersCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t})\n\n\twhen(\"#WriteSuggestedBuilder\", func() {\n\t\twhen(\"description metadata exists\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuilder(\"gcr.io/some/builder:latest\", false).Return(&client.BuilderInfo{\n\t\t\t\t\tDescription: \"Remote description\",\n\t\t\t\t}, nil)\n\t\t\t})\n\n\t\t\tit(\"displays descriptions from metadata\", func() {\n\t\t\t\tcommands.WriteSuggestedBuilder(logger, mockClient, []bldr.KnownBuilder{{\n\t\t\t\t\tVendor:             \"Builder\",\n\t\t\t\t\tImage:              \"gcr.io/some/builder:latest\",\n\t\t\t\t\tDefaultDescription: \"Default description\",\n\t\t\t\t}})\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Suggested builders:\")\n\t\t\t\th.AssertContainsMatch(t, outBuf.String(), 
`Builder:\\s+'gcr.io/some/builder:latest'\\s+Remote description`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"description metadata does not exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuilder(gomock.Any(), false).Return(&client.BuilderInfo{\n\t\t\t\t\tDescription: \"\",\n\t\t\t\t}, nil).AnyTimes()\n\t\t\t})\n\n\t\t\tit(\"displays default descriptions\", func() {\n\t\t\t\tcommands.WriteSuggestedBuilder(logger, mockClient, []bldr.KnownBuilder{{\n\t\t\t\t\tVendor:             \"Builder\",\n\t\t\t\t\tImage:              \"gcr.io/some/builder:latest\",\n\t\t\t\t\tDefaultDescription: \"Default description\",\n\t\t\t\t}})\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Suggested builders:\")\n\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `Builder:\\s+'gcr.io/some/builder:latest'\\s+Default description`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"error inspecting images\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockClient.EXPECT().InspectBuilder(gomock.Any(), false).Return(nil, errors.New(\"some error\")).AnyTimes()\n\t\t\t})\n\n\t\t\tit(\"displays default descriptions\", func() {\n\t\t\t\tcommands.WriteSuggestedBuilder(logger, mockClient, []bldr.KnownBuilder{{\n\t\t\t\t\tVendor:             \"Builder\",\n\t\t\t\t\tImage:              \"gcr.io/some/builder:latest\",\n\t\t\t\t\tDefaultDescription: \"Default description\",\n\t\t\t\t}})\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Suggested builders:\")\n\t\t\t\th.AssertContainsMatch(t, outBuf.String(), `Builder:\\s+'gcr.io/some/builder:latest'\\s+Default description`)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/testdata/buildpack.toml",
    "content": ""
  },
  {
    "path": "internal/commands/testdata/inspect_image_output.json",
    "content": "{\"remote\":[{\"name\":\"name-1\",\"version\":\"version-1\",\"metadata\":{\"RemoteData\":{\"String\":\"aString\",\"Bool\":true,\"Int\":123,\"Nested\":{\"String\":\"anotherString\"}}},\"buildpack\":{\"id\":\"test.bp.one.remote\",\"version\":\"1.0.0\"}}],\"local\":[{\"name\":\"name-1\",\"version\":\"version-1\",\"metadata\":{\"LocalData\":{\"String\":\"\",\"Bool\":false,\"Int\":456,\"Nested\":{\"String\":\"\"}}},\"buildpack\":{\"id\":\"test.bp.one.remote\",\"version\":\"1.0.0\"}}]}"
  },
  {
    "path": "internal/commands/testdata/project.toml",
    "content": "[project]\nname = \"Sample\"\n\n[[build.buildpacks]]\nid = \"example/lua\"\nversion = \"1.0\"\n\n[[build.env]]\nname = \"KEY1\"\nvalue = \"VALUE1\"\n"
  },
  {
    "path": "internal/commands/testmocks/mock_inspect_image_writer_factory.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/internal/commands (interfaces: InspectImageWriterFactory)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n\n\twriter \"github.com/buildpacks/pack/internal/inspectimage/writer\"\n)\n\n// MockInspectImageWriterFactory is a mock of InspectImageWriterFactory interface.\ntype MockInspectImageWriterFactory struct {\n\tctrl     *gomock.Controller\n\trecorder *MockInspectImageWriterFactoryMockRecorder\n}\n\n// MockInspectImageWriterFactoryMockRecorder is the mock recorder for MockInspectImageWriterFactory.\ntype MockInspectImageWriterFactoryMockRecorder struct {\n\tmock *MockInspectImageWriterFactory\n}\n\n// NewMockInspectImageWriterFactory creates a new mock instance.\nfunc NewMockInspectImageWriterFactory(ctrl *gomock.Controller) *MockInspectImageWriterFactory {\n\tmock := &MockInspectImageWriterFactory{ctrl: ctrl}\n\tmock.recorder = &MockInspectImageWriterFactoryMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockInspectImageWriterFactory) EXPECT() *MockInspectImageWriterFactoryMockRecorder {\n\treturn m.recorder\n}\n\n// Writer mocks base method.\nfunc (m *MockInspectImageWriterFactory) Writer(arg0 string, arg1 bool) (writer.InspectImageWriter, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Writer\", arg0, arg1)\n\tret0, _ := ret[0].(writer.InspectImageWriter)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// Writer indicates an expected call of Writer.\nfunc (mr *MockInspectImageWriterFactoryMockRecorder) Writer(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Writer\", reflect.TypeOf((*MockInspectImageWriterFactory)(nil).Writer), arg0, arg1)\n}\n"
  },
  {
    "path": "internal/commands/testmocks/mock_pack_client.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/internal/commands (interfaces: PackClient)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\tcontext \"context\"\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n\n\tclient \"github.com/buildpacks/pack/pkg/client\"\n)\n\n// MockPackClient is a mock of PackClient interface.\ntype MockPackClient struct {\n\tctrl     *gomock.Controller\n\trecorder *MockPackClientMockRecorder\n}\n\n// MockPackClientMockRecorder is the mock recorder for MockPackClient.\ntype MockPackClientMockRecorder struct {\n\tmock *MockPackClient\n}\n\n// NewMockPackClient creates a new mock instance.\nfunc NewMockPackClient(ctrl *gomock.Controller) *MockPackClient {\n\tmock := &MockPackClient{ctrl: ctrl}\n\tmock.recorder = &MockPackClientMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockPackClient) EXPECT() *MockPackClientMockRecorder {\n\treturn m.recorder\n}\n\n// AddManifest mocks base method.\nfunc (m *MockPackClient) AddManifest(arg0 context.Context, arg1 client.ManifestAddOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"AddManifest\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// AddManifest indicates an expected call of AddManifest.\nfunc (mr *MockPackClientMockRecorder) AddManifest(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"AddManifest\", reflect.TypeOf((*MockPackClient)(nil).AddManifest), arg0, arg1)\n}\n\n// AnnotateManifest mocks base method.\nfunc (m *MockPackClient) AnnotateManifest(arg0 context.Context, arg1 client.ManifestAnnotateOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"AnnotateManifest\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// AnnotateManifest indicates an expected call of 
AnnotateManifest.\nfunc (mr *MockPackClientMockRecorder) AnnotateManifest(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"AnnotateManifest\", reflect.TypeOf((*MockPackClient)(nil).AnnotateManifest), arg0, arg1)\n}\n\n// Build mocks base method.\nfunc (m *MockPackClient) Build(arg0 context.Context, arg1 client.BuildOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Build\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// Build indicates an expected call of Build.\nfunc (mr *MockPackClientMockRecorder) Build(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Build\", reflect.TypeOf((*MockPackClient)(nil).Build), arg0, arg1)\n}\n\n// CreateBuilder mocks base method.\nfunc (m *MockPackClient) CreateBuilder(arg0 context.Context, arg1 client.CreateBuilderOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"CreateBuilder\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// CreateBuilder indicates an expected call of CreateBuilder.\nfunc (mr *MockPackClientMockRecorder) CreateBuilder(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CreateBuilder\", reflect.TypeOf((*MockPackClient)(nil).CreateBuilder), arg0, arg1)\n}\n\n// CreateManifest mocks base method.\nfunc (m *MockPackClient) CreateManifest(arg0 context.Context, arg1 client.CreateManifestOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"CreateManifest\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// CreateManifest indicates an expected call of CreateManifest.\nfunc (mr *MockPackClientMockRecorder) CreateManifest(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CreateManifest\", 
reflect.TypeOf((*MockPackClient)(nil).CreateManifest), arg0, arg1)\n}\n\n// DeleteManifest mocks base method.\nfunc (m *MockPackClient) DeleteManifest(arg0 []string) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"DeleteManifest\", arg0)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// DeleteManifest indicates an expected call of DeleteManifest.\nfunc (mr *MockPackClientMockRecorder) DeleteManifest(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"DeleteManifest\", reflect.TypeOf((*MockPackClient)(nil).DeleteManifest), arg0)\n}\n\n// DownloadSBOM mocks base method.\nfunc (m *MockPackClient) DownloadSBOM(arg0 string, arg1 client.DownloadSBOMOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"DownloadSBOM\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// DownloadSBOM indicates an expected call of DownloadSBOM.\nfunc (mr *MockPackClientMockRecorder) DownloadSBOM(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"DownloadSBOM\", reflect.TypeOf((*MockPackClient)(nil).DownloadSBOM), arg0, arg1)\n}\n\n// InspectBuilder mocks base method.\nfunc (m *MockPackClient) InspectBuilder(arg0 string, arg1 bool, arg2 ...client.BuilderInspectionModifier) (*client.BuilderInfo, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0, arg1}\n\tfor _, a := range arg2 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"InspectBuilder\", varargs...)\n\tret0, _ := ret[0].(*client.BuilderInfo)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// InspectBuilder indicates an expected call of InspectBuilder.\nfunc (mr *MockPackClientMockRecorder) InspectBuilder(arg0, arg1 interface{}, arg2 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0, arg1}, arg2...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"InspectBuilder\", 
reflect.TypeOf((*MockPackClient)(nil).InspectBuilder), varargs...)\n}\n\n// InspectBuildpack mocks base method.\nfunc (m *MockPackClient) InspectBuildpack(arg0 client.InspectBuildpackOptions) (*client.BuildpackInfo, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"InspectBuildpack\", arg0)\n\tret0, _ := ret[0].(*client.BuildpackInfo)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// InspectBuildpack indicates an expected call of InspectBuildpack.\nfunc (mr *MockPackClientMockRecorder) InspectBuildpack(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"InspectBuildpack\", reflect.TypeOf((*MockPackClient)(nil).InspectBuildpack), arg0)\n}\n\n// InspectExtension mocks base method.\nfunc (m *MockPackClient) InspectExtension(arg0 client.InspectExtensionOptions) (*client.ExtensionInfo, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"InspectExtension\", arg0)\n\tret0, _ := ret[0].(*client.ExtensionInfo)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// InspectExtension indicates an expected call of InspectExtension.\nfunc (mr *MockPackClientMockRecorder) InspectExtension(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"InspectExtension\", reflect.TypeOf((*MockPackClient)(nil).InspectExtension), arg0)\n}\n\n// InspectImage mocks base method.\nfunc (m *MockPackClient) InspectImage(arg0 string, arg1 bool) (*client.ImageInfo, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"InspectImage\", arg0, arg1)\n\tret0, _ := ret[0].(*client.ImageInfo)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// InspectImage indicates an expected call of InspectImage.\nfunc (mr *MockPackClientMockRecorder) InspectImage(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"InspectImage\", reflect.TypeOf((*MockPackClient)(nil).InspectImage), 
arg0, arg1)\n}\n\n// InspectManifest mocks base method.\nfunc (m *MockPackClient) InspectManifest(arg0 string) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"InspectManifest\", arg0)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// InspectManifest indicates an expected call of InspectManifest.\nfunc (mr *MockPackClientMockRecorder) InspectManifest(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"InspectManifest\", reflect.TypeOf((*MockPackClient)(nil).InspectManifest), arg0)\n}\n\n// NewBuildpack mocks base method.\nfunc (m *MockPackClient) NewBuildpack(arg0 context.Context, arg1 client.NewBuildpackOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NewBuildpack\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// NewBuildpack indicates an expected call of NewBuildpack.\nfunc (mr *MockPackClientMockRecorder) NewBuildpack(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NewBuildpack\", reflect.TypeOf((*MockPackClient)(nil).NewBuildpack), arg0, arg1)\n}\n\n// PackageBuildpack mocks base method.\nfunc (m *MockPackClient) PackageBuildpack(arg0 context.Context, arg1 client.PackageBuildpackOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PackageBuildpack\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// PackageBuildpack indicates an expected call of PackageBuildpack.\nfunc (mr *MockPackClientMockRecorder) PackageBuildpack(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PackageBuildpack\", reflect.TypeOf((*MockPackClient)(nil).PackageBuildpack), arg0, arg1)\n}\n\n// PackageExtension mocks base method.\nfunc (m *MockPackClient) PackageExtension(arg0 context.Context, arg1 client.PackageBuildpackOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PackageExtension\", 
arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// PackageExtension indicates an expected call of PackageExtension.\nfunc (mr *MockPackClientMockRecorder) PackageExtension(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PackageExtension\", reflect.TypeOf((*MockPackClient)(nil).PackageExtension), arg0, arg1)\n}\n\n// PullBuildpack mocks base method.\nfunc (m *MockPackClient) PullBuildpack(arg0 context.Context, arg1 client.PullBuildpackOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PullBuildpack\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// PullBuildpack indicates an expected call of PullBuildpack.\nfunc (mr *MockPackClientMockRecorder) PullBuildpack(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PullBuildpack\", reflect.TypeOf((*MockPackClient)(nil).PullBuildpack), arg0, arg1)\n}\n\n// PushManifest mocks base method.\nfunc (m *MockPackClient) PushManifest(arg0 client.PushManifestOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PushManifest\", arg0)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// PushManifest indicates an expected call of PushManifest.\nfunc (mr *MockPackClientMockRecorder) PushManifest(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PushManifest\", reflect.TypeOf((*MockPackClient)(nil).PushManifest), arg0)\n}\n\n// Rebase mocks base method.\nfunc (m *MockPackClient) Rebase(arg0 context.Context, arg1 client.RebaseOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Rebase\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// Rebase indicates an expected call of Rebase.\nfunc (mr *MockPackClientMockRecorder) Rebase(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn 
mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Rebase\", reflect.TypeOf((*MockPackClient)(nil).Rebase), arg0, arg1)\n}\n\n// RegisterBuildpack mocks base method.\nfunc (m *MockPackClient) RegisterBuildpack(arg0 context.Context, arg1 client.RegisterBuildpackOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"RegisterBuildpack\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// RegisterBuildpack indicates an expected call of RegisterBuildpack.\nfunc (mr *MockPackClientMockRecorder) RegisterBuildpack(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"RegisterBuildpack\", reflect.TypeOf((*MockPackClient)(nil).RegisterBuildpack), arg0, arg1)\n}\n\n// RemoveManifest mocks base method.\nfunc (m *MockPackClient) RemoveManifest(arg0 string, arg1 []string) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"RemoveManifest\", arg0, arg1)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// RemoveManifest indicates an expected call of RemoveManifest.\nfunc (mr *MockPackClientMockRecorder) RemoveManifest(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"RemoveManifest\", reflect.TypeOf((*MockPackClient)(nil).RemoveManifest), arg0, arg1)\n}\n\n// YankBuildpack mocks base method.\nfunc (m *MockPackClient) YankBuildpack(arg0 client.YankBuildpackOptions) error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"YankBuildpack\", arg0)\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// YankBuildpack indicates an expected call of YankBuildpack.\nfunc (mr *MockPackClientMockRecorder) YankBuildpack(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"YankBuildpack\", reflect.TypeOf((*MockPackClient)(nil).YankBuildpack), arg0)\n}\n"
  },
  {
    "path": "internal/commands/trust_builder.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use `config trusted-builders add` instead\nfunc TrustBuilder(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"trust-builder <builder-name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Trust builder\",\n\t\tLong:    \"Trust builder.\\n\\nWhen building with this builder, all lifecycle phases will be run in a single container using the builder image.\",\n\t\tExample: \"pack trust-builder cnbs/sample-stack-run:bionic\",\n\t\tHidden:  true,\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"trust-builder\", \"config trusted-builders add\")\n\t\t\treturn addTrustedBuilder(args, logger, cfg, cfgPath)\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"trust-builder\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/trust_builder_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestTrustBuilderCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Commands\", testTrustBuilderCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testTrustBuilderCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand      *cobra.Command\n\t\tlogger       logging.Logger\n\t\toutBuf       bytes.Buffer\n\t\ttempPackHome string\n\t\tconfigPath   string\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\t\tcommand = commands.TrustBuilder(logger, config.Config{}, configPath)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#TrustBuilder\", func() {\n\t\twhen(\"no builder is provided\", func() {\n\t\t\tit(\"prints usage\", func() {\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\th.AssertError(t, command.Execute(), \"accepts 1 arg(s)\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder is provided\", func() {\n\t\t\twhen(\"builder is not already trusted\", func() {\n\t\t\t\tit(\"updates the config\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"some-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertContains(t, string(b), `[[trusted-builders]]\n  name = \"some-builder\"`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"builder 
is already trusted\", func() {\n\t\t\t\tit(\"does nothing\", func() {\n\t\t\t\t\tcommand.SetArgs([]string{\"some-already-trusted-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\toldContents, err := os.ReadFile(configPath)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tcommand.SetArgs([]string{\"some-already-trusted-builder\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\t\tnewContents, err := os.ReadFile(configPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, newContents, oldContents)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"builder is a suggested builder\", func() {\n\t\t\t\tit(\"does nothing\", func() {\n\t\t\t\t\th.AssertNil(t, os.WriteFile(configPath, []byte(\"\"), os.ModePerm))\n\n\t\t\t\t\tcommand.SetArgs([]string{\"paketobuildpacks/builder-jammy-base\"})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t\toldContents, err := os.ReadFile(configPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, string(oldContents), \"\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/untrust_builder.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use `config trusted-builders remove` instead\nfunc UntrustBuilder(logger logging.Logger, cfg config.Config, cfgPath string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"untrust-builder <builder-name>\",\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Stop trusting builder\",\n\t\tHidden:  true,\n\t\tLong:    \"Stop trusting builder.\\n\\nWhen building with this builder, all lifecycle phases will be no longer be run in a single container using the builder image.\",\n\t\tExample: \"pack untrust-builder cnbs/sample-stack-run:bionic\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"untrust-builder\", \"config trusted-builders remove\")\n\t\t\treturn removeTrustedBuilder(args, logger, cfg, cfgPath)\n\t\t}),\n\t}\n\n\tAddHelpFlag(cmd, \"untrust-builder\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/untrust_builder_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestUntrustBuilderCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Commands\", testUntrustBuilderCommand, spec.Random(), spec.Report(report.Terminal{}))\n}\n\nfunc testUntrustBuilderCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger        logging.Logger\n\t\toutBuf        bytes.Buffer\n\t\ttempPackHome  string\n\t\tconfigPath    string\n\t\tconfigManager configManager\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\ttempPackHome, err = os.MkdirTemp(\"\", \"pack-home\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tempPackHome, \"config.toml\")\n\t\tconfigManager = newConfigManager(t, configPath)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tempPackHome))\n\t})\n\n\twhen(\"#UntrustBuilder\", func() {\n\t\twhen(\"no builder is provided\", func() {\n\t\t\tit(\"prints usage\", func() {\n\t\t\t\tcfg := configManager.configWithTrustedBuilders()\n\t\t\t\tcommand := commands.UntrustBuilder(logger, cfg, configPath)\n\t\t\t\tcommand.SetArgs([]string{})\n\t\t\t\tcommand.SetOut(&outBuf)\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg(s), received 0\")\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Usage:\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder is already trusted\", func() {\n\t\t\tit(\"removes builder from the config\", func() {\n\t\t\t\tbuilderName := \"some-builder\"\n\n\t\t\t\tcfg := 
configManager.configWithTrustedBuilders(builderName)\n\t\t\t\tcommand := commands.UntrustBuilder(logger, cfg, configPath)\n\t\t\t\tcommand.SetArgs([]string{builderName})\n\n\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotContains(t, string(b), builderName)\n\n\t\t\t\th.AssertContains(t,\n\t\t\t\t\toutBuf.String(),\n\t\t\t\t\tfmt.Sprintf(\"Builder %s is no longer trusted\", style.Symbol(builderName)),\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"removes only the named builder when multiple builders are trusted\", func() {\n\t\t\t\tuntrustBuilder := \"stop/trusting:me\"\n\t\t\t\tstillTrustedBuilder := \"very/safe/builder\"\n\n\t\t\t\tcfg := configManager.configWithTrustedBuilders(untrustBuilder, stillTrustedBuilder)\n\t\t\t\tcommand := commands.UntrustBuilder(logger, cfg, configPath)\n\t\t\t\tcommand.SetArgs([]string{untrustBuilder})\n\n\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, string(b), stillTrustedBuilder)\n\t\t\t\th.AssertNotContains(t, string(b), untrustBuilder)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder wasn't already trusted\", func() {\n\t\t\tit(\"does nothing and reports builder wasn't trusted\", func() {\n\t\t\t\tneverTrustedBuilder := \"never/trusted-builder\"\n\t\t\t\tstillTrustedBuilder := \"very/safe/builder\"\n\n\t\t\t\tcfg := configManager.configWithTrustedBuilders(stillTrustedBuilder)\n\t\t\t\tcommand := commands.UntrustBuilder(logger, cfg, configPath)\n\t\t\t\tcommand.SetArgs([]string{neverTrustedBuilder})\n\n\t\t\t\th.AssertNil(t, command.Execute())\n\n\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, string(b), stillTrustedBuilder)\n\t\t\t\th.AssertNotContains(t, string(b), neverTrustedBuilder)\n\n\t\t\t\th.AssertContains(t,\n\t\t\t\t\toutBuf.String(),\n\t\t\t\t\tfmt.Sprintf(\"Builder %s wasn't trusted\", 
 style.Symbol(neverTrustedBuilder)),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"builder is a suggested builder\", func() {\n\t\t\tit(\"errors and reports that the builder is a known trusted builder\", func() {\n\t\t\t\tbuilder := \"paketobuildpacks/builder-jammy-base\"\n\t\t\t\tcommand := commands.UntrustBuilder(logger, config.Config{}, configPath)\n\t\t\t\tcommand.SetArgs([]string{builder})\n\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"Builder %s is a known trusted builder. Currently pack doesn't support making these builders untrusted\", style.Symbol(builder)))\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/version.go",
    "content": "package commands\n\nimport (\n\t\"strings\"\n\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Version shows the current pack version\nfunc Version(logger logging.Logger, version string) *cobra.Command {\n\tcmd := &cobra.Command{\n\t\tUse:     \"version\",\n\t\tArgs:    cobra.NoArgs,\n\t\tShort:   \"Show current 'pack' version\",\n\t\tExample: \"pack version\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tlogger.Info(strings.TrimSpace(version))\n\t\t\treturn nil\n\t\t}),\n\t}\n\tAddHelpFlag(cmd, \"version\")\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/version_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestVersionCommand(t *testing.T) {\n\tspec.Run(t, \"Commands\", testVersionCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testVersionCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand     *cobra.Command\n\t\toutBuf      bytes.Buffer\n\t\ttestVersion = \"1.3.4\"\n\t)\n\n\tit.Before(func() {\n\t\tcommand = commands.Version(logging.NewLogWithWriters(&outBuf, &outBuf), testVersion)\n\t})\n\n\twhen(\"#Version\", func() {\n\t\tit(\"returns version\", func() {\n\t\t\tcommand.SetArgs([]string{})\n\t\t\th.AssertNil(t, command.Execute())\n\t\t\th.AssertEq(t, outBuf.String(), testVersion+\"\\n\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/commands/yank_buildpack.go",
    "content": "package commands\n\nimport (\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// Deprecated: Use yank instead\nfunc YankBuildpack(logger logging.Logger, cfg config.Config, pack PackClient) *cobra.Command {\n\tvar flags BuildpackYankFlags\n\n\tcmd := &cobra.Command{\n\t\tUse:     \"yank-buildpack <buildpack-id-and-version>\",\n\t\tHidden:  true,\n\t\tArgs:    cobra.ExactArgs(1),\n\t\tShort:   \"Yank the buildpack from the registry\",\n\t\tExample: \"pack yank-buildpack my-buildpack@0.0.1\",\n\t\tRunE: logError(logger, func(cmd *cobra.Command, args []string) error {\n\t\t\tdeprecationWarning(logger, \"yank-buildpack\", \"buildpack yank\")\n\t\t\tbuildpackIDVersion := args[0]\n\n\t\t\tregistry, err := config.GetRegistry(cfg, flags.BuildpackRegistry)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tid, version, err := parseIDVersion(buildpackIDVersion)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\tID:      id,\n\t\t\t\tVersion: version,\n\t\t\t\tType:    \"github\",\n\t\t\t\tURL:     registry.URL,\n\t\t\t\tYank:    !flags.Undo,\n\t\t\t}\n\n\t\t\tif err := pack.YankBuildpack(opts); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tlogger.Infof(\"Successfully yanked %s\", style.Symbol(buildpackIDVersion))\n\t\t\treturn nil\n\t\t}),\n\t}\n\tcmd.Flags().StringVarP(&flags.BuildpackRegistry, \"buildpack-registry\", \"r\", \"\", \"Buildpack Registry name\")\n\tcmd.Flags().BoolVarP(&flags.Undo, \"undo\", \"u\", false, \"undo previously yanked buildpack\")\n\tAddHelpFlag(cmd, \"yank-buildpack\")\n\n\treturn cmd\n}\n"
  },
  {
    "path": "internal/commands/yank_buildpack_test.go",
    "content": "package commands_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"github.com/spf13/cobra\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/buildpacks/pack/internal/commands/testmocks\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestYankBuildpackCommand(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"YankBuildpackCommand\", testYankBuildpackCommand, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testYankBuildpackCommand(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tcommand        *cobra.Command\n\t\tlogger         logging.Logger\n\t\toutBuf         bytes.Buffer\n\t\tmockController *gomock.Controller\n\t\tmockClient     *testmocks.MockPackClient\n\t\tcfg            config.Config\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\tmockController = gomock.NewController(t)\n\t\tmockClient = testmocks.NewMockPackClient(mockController)\n\t\tcfg = config.Config{}\n\n\t\tcommand = commands.YankBuildpack(logger, cfg, mockClient)\n\t})\n\n\twhen(\"#YankBuildpackCommand\", func() {\n\t\twhen(\"no buildpack id@version is provided\", func() {\n\t\t\tit(\"fails to run\", func() {\n\t\t\t\terr := command.Execute()\n\t\t\t\th.AssertError(t, err, \"accepts 1 arg\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"id@version argument is provided\", func() {\n\t\t\tvar (\n\t\t\t\tbuildpackIDVersion string\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\tbuildpackIDVersion = \"heroku/rust@0.0.1\"\n\t\t\t})\n\n\t\t\tit(\"should work for required args\", func() {\n\t\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\t\tID:      \"heroku/rust\",\n\t\t\t\t\tVersion: 
\"0.0.1\",\n\t\t\t\t\tType:    \"github\",\n\t\t\t\t\tURL:     \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\tYank:    true,\n\t\t\t\t}\n\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tYankBuildpack(opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{buildpackIDVersion})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\n\t\t\tit(\"should fail for invalid buildpack id/version\", func() {\n\t\t\t\tcommand.SetArgs([]string{\"mybuildpack\"})\n\t\t\t\terr := command.Execute()\n\n\t\t\t\th.AssertError(t, err, \"invalid buildpack id@version 'mybuildpack'\")\n\t\t\t})\n\n\t\t\tit(\"should use the default registry defined in config.toml\", func() {\n\t\t\t\tcfg = config.Config{\n\t\t\t\t\tDefaultRegistryName: \"official\",\n\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: \"official\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tcommand = commands.YankBuildpack(logger, cfg, mockClient)\n\t\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\t\tID:      \"heroku/rust\",\n\t\t\t\t\tVersion: \"0.0.1\",\n\t\t\t\t\tType:    \"github\",\n\t\t\t\t\tURL:     \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\tYank:    true,\n\t\t\t\t}\n\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tYankBuildpack(opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand.SetArgs([]string{buildpackIDVersion})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\n\t\t\tit(\"should undo\", func() {\n\t\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\t\tID:      \"heroku/rust\",\n\t\t\t\t\tVersion: \"0.0.1\",\n\t\t\t\t\tType:    \"github\",\n\t\t\t\t\tURL:     \"https://github.com/buildpacks/registry-index\",\n\t\t\t\t\tYank:    false,\n\t\t\t\t}\n\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\tYankBuildpack(opts).\n\t\t\t\t\tReturn(nil)\n\n\t\t\t\tcommand = commands.YankBuildpack(logger, cfg, 
mockClient)\n\t\t\t\tcommand.SetArgs([]string{buildpackIDVersion, \"--undo\"})\n\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t})\n\n\t\t\twhen(\"buildpack-registry flag is used\", func() {\n\t\t\t\tit(\"should use the specified buildpack registry\", func() {\n\t\t\t\t\tbuildpackRegistry := \"override\"\n\t\t\t\t\tcfg = config.Config{\n\t\t\t\t\t\tDefaultRegistryName: \"default\",\n\t\t\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"default\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  \"https://github.com/default/buildpack-registry\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"override\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  \"https://github.com/override/buildpack-registry\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\t\t\t\t\topts := client.YankBuildpackOptions{\n\t\t\t\t\t\tID:      \"heroku/rust\",\n\t\t\t\t\t\tVersion: \"0.0.1\",\n\t\t\t\t\t\tType:    \"github\",\n\t\t\t\t\t\tURL:     \"https://github.com/override/buildpack-registry\",\n\t\t\t\t\t\tYank:    true,\n\t\t\t\t\t}\n\t\t\t\t\tmockClient.EXPECT().\n\t\t\t\t\t\tYankBuildpack(opts).\n\t\t\t\t\t\tReturn(nil)\n\n\t\t\t\t\tcommand = commands.YankBuildpack(logger, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{buildpackIDVersion, \"--buildpack-registry\", buildpackRegistry})\n\t\t\t\t\th.AssertNil(t, command.Execute())\n\t\t\t\t})\n\n\t\t\t\tit(\"should handle config errors\", func() {\n\t\t\t\t\tcfg = config.Config{\n\t\t\t\t\t\tDefaultRegistryName: \"missing registry\",\n\t\t\t\t\t}\n\t\t\t\t\tcommand = commands.YankBuildpack(logger, cfg, mockClient)\n\t\t\t\t\tcommand.SetArgs([]string{buildpackIDVersion})\n\n\t\t\t\t\terr := command.Execute()\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/config/config.go",
    "content": "package config\n\nimport (\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype Config struct {\n\t// Deprecated: Use DefaultRegistryName instead. See https://github.com/buildpacks/pack/issues/747.\n\tDefaultRegistry     string            `toml:\"default-registry-url,omitempty\"`\n\tDefaultRegistryName string            `toml:\"default-registry,omitempty\"`\n\tDefaultBuilder      string            `toml:\"default-builder-image,omitempty\"`\n\tPullPolicy          string            `toml:\"pull-policy,omitempty\"`\n\tExperimental        bool              `toml:\"experimental,omitempty\"`\n\tRunImages           []RunImage        `toml:\"run-images\"`\n\tTrustedBuilders     []TrustedBuilder  `toml:\"trusted-builders,omitempty\"`\n\tRegistries          []Registry        `toml:\"registries,omitempty\"`\n\tLifecycleImage      string            `toml:\"lifecycle-image,omitempty\"`\n\tRegistryMirrors     map[string]string `toml:\"registry-mirrors,omitempty\"`\n\tLayoutRepositoryDir string            `toml:\"layout-repo-dir,omitempty\"`\n}\n\ntype VolumeConfig struct {\n\tVolumeKeys map[string]string `toml:\"volume-keys,omitempty\"`\n}\n\ntype Registry struct {\n\tName string `toml:\"name\"`\n\tType string `toml:\"type\"`\n\tURL  string `toml:\"url\"`\n}\n\ntype RunImage struct {\n\tImage   string   `toml:\"image\"`\n\tMirrors []string `toml:\"mirrors\"`\n}\n\ntype TrustedBuilder struct {\n\tName string `toml:\"name\"`\n}\n\nconst OfficialRegistryName = \"official\"\n\nfunc DefaultRegistry() Registry {\n\treturn Registry{\n\t\tOfficialRegistryName,\n\t\t\"github\",\n\t\t\"https://github.com/buildpacks/registry-index\",\n\t}\n}\n\nfunc DefaultConfigPath() (string, error) {\n\thome, err := PackHome()\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"getting pack home\")\n\t}\n\treturn filepath.Join(home, \"config.toml\"), nil\n}\n\nfunc 
DefaultVolumeKeysPath() (string, error) {\n\thome, err := PackHome()\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"getting pack home\")\n\t}\n\treturn filepath.Join(home, \"volume-keys.toml\"), nil\n}\n\nfunc PackHome() (string, error) {\n\tpackHome := os.Getenv(\"PACK_HOME\")\n\tif packHome == \"\" {\n\t\thome, err := os.UserHomeDir()\n\t\tif err != nil {\n\t\t\treturn \"\", errors.Wrap(err, \"getting user home\")\n\t\t}\n\t\tpackHome = filepath.Join(home, \".pack\")\n\t}\n\treturn packHome, nil\n}\n\nfunc Read(path string) (Config, error) {\n\tcfg := Config{}\n\t_, err := toml.DecodeFile(path, &cfg)\n\tif err != nil && !os.IsNotExist(err) {\n\t\treturn Config{}, errors.Wrapf(err, \"failed to read config file at path %s\", path)\n\t}\n\treturn cfg, nil\n}\n\nfunc ReadVolumeKeys(path string) (VolumeConfig, error) {\n\tcfg := VolumeConfig{}\n\t_, err := toml.DecodeFile(path, &cfg)\n\tif err != nil && !os.IsNotExist(err) {\n\t\treturn VolumeConfig{}, errors.Wrapf(err, \"failed to read config file at path %s\", path)\n\t}\n\treturn cfg, nil\n}\n\nfunc Write(cfg interface{}, path string) error {\n\tif err := MkdirAll(filepath.Dir(path)); err != nil {\n\t\treturn err\n\t}\n\tw, err := os.Create(path)\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer w.Close()\n\n\treturn toml.NewEncoder(w).Encode(cfg)\n}\n\nfunc MkdirAll(path string) error {\n\treturn os.MkdirAll(path, 0750)\n}\n\nfunc SetRunImageMirrors(cfg Config, image string, mirrors []string) Config {\n\tfor i := range cfg.RunImages {\n\t\tif cfg.RunImages[i].Image == image {\n\t\t\tcfg.RunImages[i].Mirrors = mirrors\n\t\t\treturn cfg\n\t\t}\n\t}\n\tcfg.RunImages = append(cfg.RunImages, RunImage{Image: image, Mirrors: mirrors})\n\treturn cfg\n}\n\nfunc GetRegistries(cfg Config) []Registry {\n\treturn append(cfg.Registries, DefaultRegistry())\n}\n\nfunc GetRegistry(cfg Config, registryName string) (Registry, error) {\n\tif registryName == \"\" && cfg.DefaultRegistryName != \"\" {\n\t\tregistryName = 
cfg.DefaultRegistryName\n\t}\n\tif registryName == \"\" && cfg.DefaultRegistryName == \"\" {\n\t\tregistryName = OfficialRegistryName\n\t}\n\tif registryName != \"\" {\n\t\tfor _, registry := range GetRegistries(cfg) {\n\t\t\tif registry.Name == registryName {\n\t\t\t\treturn registry, nil\n\t\t\t}\n\t\t}\n\t}\n\treturn Registry{}, errors.Errorf(\"registry %s is not defined in your config file\", style.Symbol(registryName))\n}\n\nconst DefaultLifecycleImageRepo = \"docker.io/buildpacksio/lifecycle\"\n"
  },
  {
    "path": "internal/config/config_helpers.go",
    "content": "package config\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nfunc FormatUndecodedKeys(undecodedKeys []toml.Key) string {\n\tunusedKeys := map[string]interface{}{}\n\tfor _, key := range undecodedKeys {\n\t\tkeyName := key.String()\n\n\t\tparent := strings.Split(keyName, \".\")[0]\n\n\t\tif _, ok := unusedKeys[parent]; !ok {\n\t\t\tunusedKeys[keyName] = nil\n\t\t}\n\t}\n\n\tvar errorKeys []string\n\tfor errorKey := range unusedKeys {\n\t\terrorKeys = append(errorKeys, style.Symbol(errorKey))\n\t}\n\n\tpluralizedElement := \"element\"\n\tif len(errorKeys) > 1 {\n\t\tpluralizedElement += \"s\"\n\t}\n\telements := strings.Join(errorKeys, \", \")\n\n\treturn fmt.Sprintf(\"unknown configuration %s %s\", pluralizedElement, elements)\n}\n"
  },
  {
    "path": "internal/config/config_test.go",
    "content": "package config_test\n\nimport (\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestConfig(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"config\", testConfig, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testConfig(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\ttmpDir     string\n\t\tconfigPath string\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack.config.test.\")\n\t\th.AssertNil(t, err)\n\t\tconfigPath = filepath.Join(tmpDir, \"config.toml\")\n\t})\n\n\tit.After(func() {\n\t\terr := os.RemoveAll(tmpDir)\n\t\th.AssertNil(t, err)\n\t})\n\n\twhen(\"#Read\", func() {\n\t\twhen(\"no config on disk\", func() {\n\t\t\tit(\"returns an empty config\", func() {\n\t\t\t\tsubject, err := config.Read(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, subject.DefaultBuilder, \"\")\n\t\t\t\th.AssertEq(t, len(subject.RunImages), 0)\n\t\t\t\th.AssertEq(t, subject.Experimental, false)\n\t\t\t\th.AssertEq(t, len(subject.RegistryMirrors), 0)\n\t\t\t\th.AssertEq(t, subject.LayoutRepositoryDir, \"\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Write\", func() {\n\t\twhen(\"no config on disk\", func() {\n\t\t\tit(\"writes config to disk\", func() {\n\t\t\t\th.AssertNil(t, config.Write(config.Config{\n\t\t\t\t\tDefaultBuilder: \"some/builder\",\n\t\t\t\t\tRunImages: []config.RunImage{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tImage:   \"some/run\",\n\t\t\t\t\t\t\tMirrors: []string{\"example.com/some/run\", \"example.com/some/mirror\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tImage:   \"other/run\",\n\t\t\t\t\t\t\tMirrors: []string{\"example.com/other/run\", \"example.com/other/mirror\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tTrustedBuilders: 
[]config.TrustedBuilder{\n\t\t\t\t\t\t{Name: \"some-trusted-builder\"},\n\t\t\t\t\t},\n\t\t\t\t\tRegistryMirrors: map[string]string{\n\t\t\t\t\t\t\"index.docker.io\": \"10.0.0.1\",\n\t\t\t\t\t},\n\t\t\t\t}, configPath))\n\n\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, string(b), `default-builder-image = \"some/builder\"`)\n\t\t\t\th.AssertContains(t, string(b), `[[run-images]]\n  image = \"some/run\"\n  mirrors = [\"example.com/some/run\", \"example.com/some/mirror\"]`)\n\n\t\t\t\th.AssertContains(t, string(b), `[[run-images]]\n  image = \"other/run\"\n  mirrors = [\"example.com/other/run\", \"example.com/other/mirror\"]`)\n\n\t\t\t\th.AssertContains(t, string(b), `[[trusted-builders]]\n  name = \"some-trusted-builder\"`)\n\n\t\t\t\th.AssertContains(t, string(b), `[registry-mirrors]\n  \"index.docker.io\" = \"10.0.0.1\"`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config on disk\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.AssertNil(t, os.WriteFile(configPath, []byte(\"some-old-contents\"), 0777))\n\t\t\t})\n\n\t\t\tit(\"replaces the file\", func() {\n\t\t\t\th.AssertNil(t, config.Write(config.Config{\n\t\t\t\t\tDefaultBuilder: \"some/builder\",\n\t\t\t\t}, configPath))\n\t\t\t\tb, err := os.ReadFile(configPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, string(b), `default-builder-image = \"some/builder\"`)\n\t\t\t\th.AssertNotContains(t, string(b), \"some-old-contents\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"directories are missing\", func() {\n\t\t\tit(\"creates the directories\", func() {\n\t\t\t\tmissingDirConfigPath := filepath.Join(tmpDir, \"not\", \"yet\", \"created\", \"config.toml\")\n\t\t\t\th.AssertNil(t, config.Write(config.Config{\n\t\t\t\t\tDefaultBuilder: \"some/builder\",\n\t\t\t\t}, missingDirConfigPath))\n\n\t\t\t\tb, err := os.ReadFile(missingDirConfigPath)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, string(b), `default-builder-image = 
\"some/builder\"`)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#MkdirAll\", func() {\n\t\twhen(\"the directory doesn't exist yet\", func() {\n\t\t\tit(\"creates the directory\", func() {\n\t\t\t\tpath := filepath.Join(tmpDir, \"a-new-dir\")\n\t\t\t\terr := config.MkdirAll(path)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tfi, err := os.Stat(path)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, fi.Mode().IsDir(), true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"the directory already exists\", func() {\n\t\t\tit(\"doesn't error\", func() {\n\t\t\t\terr := config.MkdirAll(tmpDir)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tfi, err := os.Stat(tmpDir)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, fi.Mode().IsDir(), true)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#SetRunImageMirrors\", func() {\n\t\twhen(\"run image exists in config\", func() {\n\t\t\tit(\"replaces the mirrors\", func() {\n\t\t\t\tcfg := config.SetRunImageMirrors(\n\t\t\t\t\tconfig.Config{\n\t\t\t\t\t\tRunImages: []config.RunImage{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tImage:   \"some/run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\"old/mirror\", \"other/mirror\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t\"some/run-image\",\n\t\t\t\t\t[]string{\"some-other/run\"},\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, len(cfg.RunImages), 1)\n\t\t\t\th.AssertEq(t, cfg.RunImages[0].Image, \"some/run-image\")\n\t\t\t\th.AssertSliceContainsOnly(t, cfg.RunImages[0].Mirrors, \"some-other/run\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"run image does not exist in config\", func() {\n\t\t\tit(\"adds the run image\", func() {\n\t\t\t\tcfg := config.SetRunImageMirrors(\n\t\t\t\t\tconfig.Config{},\n\t\t\t\t\t\"some/run-image\",\n\t\t\t\t\t[]string{\"some-other/run\"},\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, len(cfg.RunImages), 1)\n\t\t\t\th.AssertEq(t, cfg.RunImages[0].Image, \"some/run-image\")\n\t\t\t\th.AssertSliceContainsOnly(t, cfg.RunImages[0].Mirrors, \"some-other/run\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#GetRegistry\", func() {\n\t\tit(\"should 
return a default registry\", func() {\n\t\t\tcfg := config.Config{}\n\n\t\t\tregistry, err := config.GetRegistry(cfg, \"\")\n\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, registry, config.Registry{\n\t\t\t\tName: \"official\",\n\t\t\t\tType: \"github\",\n\t\t\t\tURL:  \"https://github.com/buildpacks/registry-index\",\n\t\t\t})\n\t\t})\n\n\t\tit(\"should return the corresponding registry\", func() {\n\t\t\tcfg := config.Config{\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/registry/buildpack-registry\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tregistry, err := config.GetRegistry(cfg, \"registry\")\n\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, registry, config.Registry{\n\t\t\t\tName: \"registry\",\n\t\t\t\tType: \"github\",\n\t\t\t\tURL:  \"https://github.com/registry/buildpack-registry\",\n\t\t\t})\n\t\t})\n\n\t\tit(\"should return the first matched registry\", func() {\n\t\t\tcfg := config.Config{\n\t\t\t\tRegistries: []config.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"duplicate registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/duplicate1/buildpack-registry\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"duplicate registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  \"https://github.com/duplicate2/buildpack-registry\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tregistry, err := config.GetRegistry(cfg, \"duplicate registry\")\n\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, registry, config.Registry{\n\t\t\t\tName: \"duplicate registry\",\n\t\t\t\tType: \"github\",\n\t\t\t\tURL:  \"https://github.com/duplicate1/buildpack-registry\",\n\t\t\t})\n\t\t})\n\n\t\tit(\"should return an error when mismatched\", func() {\n\t\t\tcfg := config.Config{}\n\t\t\t_, err := config.GetRegistry(cfg, \"missing\")\n\t\t\th.AssertError(t, err, \"registry 'missing' is not defined in your config 
file\")\n\t\t})\n\t})\n\twhen(\"#DefaultConfigPath\", func() {\n\t\tit.Before(func() {\n\t\t\th.AssertNil(t, os.Setenv(\"PACK_HOME\", tmpDir))\n\t\t})\n\n\t\tit.After(func() {\n\t\t\th.AssertNil(t, os.Unsetenv(\"PACK_HOME\"))\n\t\t})\n\n\t\tit(\"returns config path\", func() {\n\t\t\tcfgPath, err := config.DefaultConfigPath()\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, cfgPath, filepath.Join(tmpDir, \"config.toml\"))\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/container/run.go",
    "content": "package container\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"io\"\n\n\t\"github.com/docker/docker/pkg/stdcopy\"\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\tdockerClient \"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n)\n\ntype Handler func(bodyChan <-chan dcontainer.WaitResponse, errChan <-chan error, reader io.Reader) error\n\ntype DockerClient interface {\n\tContainerWait(ctx context.Context, containerID string, options dockerClient.ContainerWaitOptions) dockerClient.ContainerWaitResult\n\tContainerAttach(ctx context.Context, container string, options dockerClient.ContainerAttachOptions) (dockerClient.ContainerAttachResult, error)\n\tContainerStart(ctx context.Context, container string, options dockerClient.ContainerStartOptions) (dockerClient.ContainerStartResult, error)\n}\n\nfunc ContainerWaitWrapper(ctx context.Context, docker DockerClient, container string, condition dcontainer.WaitCondition) (<-chan dcontainer.WaitResponse, <-chan error) {\n\tbodyChan := make(chan dcontainer.WaitResponse)\n\terrChan := make(chan error)\n\n\tgo func() {\n\t\tdefer close(bodyChan)\n\t\tdefer close(errChan)\n\n\t\tresult := docker.ContainerWait(ctx, container, dockerClient.ContainerWaitOptions{Condition: dcontainer.WaitConditionNextExit})\n\t\tfor {\n\t\t\tselect {\n\t\t\tcase body := <-result.Result:\n\t\t\t\tbodyChan <- body\n\t\t\t\treturn\n\t\t\tcase err := <-result.Error:\n\t\t\t\terrChan <- err\n\t\t\t\treturn\n\t\t\t}\n\t\t}\n\t}()\n\n\treturn bodyChan, errChan\n}\n\nfunc RunWithHandler(ctx context.Context, docker DockerClient, ctrID string, handler Handler) error {\n\tbodyChan, errChan := ContainerWaitWrapper(ctx, docker, ctrID, dcontainer.WaitConditionNextExit)\n\n\tresp, err := docker.ContainerAttach(ctx, ctrID, dockerClient.ContainerAttachOptions{\n\t\tStream: true,\n\t\tStdout: true,\n\t\tStderr: true,\n\t})\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer resp.Close()\n\n\tif _, err := docker.ContainerStart(ctx, 
ctrID, dockerClient.ContainerStartOptions{}); err != nil {\n\t\treturn errors.Wrap(err, \"container start\")\n\t}\n\n\treturn handler(bodyChan, errChan, resp.Reader)\n}\n\nfunc DefaultHandler(out, errOut io.Writer) Handler {\n\treturn func(bodyChan <-chan dcontainer.WaitResponse, errChan <-chan error, reader io.Reader) error {\n\t\tcopyErr := make(chan error)\n\t\tgo func() {\n\t\t\t_, err := stdcopy.StdCopy(out, errOut, reader)\n\t\t\tdefer optionallyCloseWriter(out)\n\t\t\tdefer optionallyCloseWriter(errOut)\n\n\t\t\tcopyErr <- err\n\t\t}()\n\n\t\tselect {\n\t\tcase body := <-bodyChan:\n\t\t\tif body.StatusCode != 0 {\n\t\t\t\treturn fmt.Errorf(\"failed with status code: %d\", body.StatusCode)\n\t\t\t}\n\t\tcase err := <-errChan:\n\t\t\treturn err\n\t\t}\n\n\t\treturn <-copyErr\n\t}\n}\n\nfunc optionallyCloseWriter(writer io.Writer) error {\n\tif closer, ok := writer.(io.Closer); ok {\n\t\treturn closer.Close()\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/fakes/fake_buildpack.go",
    "content": "package fakes\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"path/filepath\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype fakeBuildpack struct {\n\tdescriptor dist.BuildpackDescriptor\n\tchmod      int64\n\toptions    []FakeBuildpackOption\n}\n\ntype fakeBuildpackConfig struct {\n\t// maping of extrafilename to stringified contents\n\tExtraFiles map[string]string\n\tOpenError  error\n}\n\nfunc newFakeBuildpackConfig() *fakeBuildpackConfig {\n\treturn &fakeBuildpackConfig{ExtraFiles: map[string]string{}}\n}\n\ntype FakeBuildpackOption func(*fakeBuildpackConfig)\n\nfunc WithExtraBuildpackContents(filename, contents string) FakeBuildpackOption {\n\treturn func(f *fakeBuildpackConfig) {\n\t\tf.ExtraFiles[filename] = contents\n\t}\n}\n\nfunc WithBpOpenError(err error) FakeBuildpackOption {\n\treturn func(f *fakeBuildpackConfig) {\n\t\tf.OpenError = err\n\t}\n}\n\n// NewFakeBuildpack creates a fake buildpack with contents:\n//\n//\t\t\\_ /cnb/buildpacks/{ID}\n//\t\t\\_ /cnb/buildpacks/{ID}/{version}\n//\t\t\\_ /cnb/buildpacks/{ID}/{version}/buildpack.toml\n//\t\t\\_ /cnb/buildpacks/{ID}/{version}/bin\n//\t\t\\_ /cnb/buildpacks/{ID}/{version}/bin/build\n//\t \tbuild-contents\n//\t\t\\_ /cnb/buildpacks/{ID}/{version}/bin/detect\n//\t \tdetect-contents\nfunc NewFakeBuildpack(descriptor dist.BuildpackDescriptor, chmod int64, options ...FakeBuildpackOption) (buildpack.BuildModule, error) {\n\treturn &fakeBuildpack{\n\t\tdescriptor: descriptor,\n\t\tchmod:      chmod,\n\t\toptions:    options,\n\t}, nil\n}\n\nfunc (b *fakeBuildpack) Descriptor() buildpack.Descriptor {\n\treturn &b.descriptor\n}\n\nfunc (b *fakeBuildpack) Open() (io.ReadCloser, error) {\n\tfConfig := newFakeBuildpackConfig()\n\tfor _, option := range b.options {\n\t\toption(fConfig)\n\t}\n\n\tif fConfig.OpenError != nil {\n\t\treturn nil, 
fConfig.OpenError\n\t}\n\n\tbuf := &bytes.Buffer{}\n\tif err := toml.NewEncoder(buf).Encode(b.descriptor); err != nil {\n\t\treturn nil, err\n\t}\n\n\ttarBuilder := archive.TarBuilder{}\n\tts := archive.NormalizedDateTime\n\ttarBuilder.AddDir(fmt.Sprintf(\"/cnb/buildpacks/%s\", b.descriptor.EscapedID()), b.chmod, ts)\n\tbpDir := fmt.Sprintf(\"/cnb/buildpacks/%s/%s\", b.descriptor.EscapedID(), b.descriptor.Info().Version)\n\ttarBuilder.AddDir(bpDir, b.chmod, ts)\n\ttarBuilder.AddFile(bpDir+\"/buildpack.toml\", b.chmod, ts, buf.Bytes())\n\n\tif len(b.descriptor.Order()) == 0 {\n\t\ttarBuilder.AddDir(bpDir+\"/bin\", b.chmod, ts)\n\t\ttarBuilder.AddFile(bpDir+\"/bin/build\", b.chmod, ts, []byte(\"build-contents\"))\n\t\ttarBuilder.AddFile(bpDir+\"/bin/detect\", b.chmod, ts, []byte(\"detect-contents\"))\n\t}\n\n\tfor extraFilename, extraContents := range fConfig.ExtraFiles {\n\t\ttarBuilder.AddFile(filepath.Join(bpDir, extraFilename), b.chmod, ts, []byte(extraContents))\n\t}\n\n\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory()), nil\n}\n"
  },
  {
    "path": "internal/fakes/fake_buildpack_blob.go",
    "content": "package fakes\n\nimport (\n\t\"bytes\"\n\t\"io\"\n\t\"time\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n)\n\ntype fakeBuildpackBlob struct {\n\tdescriptor buildpack.Descriptor\n\tchmod      int64\n}\n\n// NewFakeBuildpackBlob creates a fake blob with contents:\n//\n//\t\t\\_ buildpack.toml\n//\t\t\\_ bin\n//\t\t\\_ bin/build\n//\t \tbuild-contents\n//\t\t\\_ bin/detect\n//\t \tdetect-contents\nfunc NewFakeBuildpackBlob(descriptor buildpack.Descriptor, chmod int64) (blob.Blob, error) {\n\treturn &fakeBuildpackBlob{\n\t\tdescriptor: descriptor,\n\t\tchmod:      chmod,\n\t}, nil\n}\n\nfunc (b *fakeBuildpackBlob) Open() (reader io.ReadCloser, err error) {\n\tbuf := &bytes.Buffer{}\n\tif err = toml.NewEncoder(buf).Encode(b.descriptor); err != nil {\n\t\treturn nil, err\n\t}\n\n\ttarBuilder := archive.TarBuilder{}\n\n\ttarBuilder.AddFile(\"buildpack.toml\", b.chmod, time.Now(), buf.Bytes())\n\ttarBuilder.AddDir(\"bin\", b.chmod, time.Now())\n\ttarBuilder.AddFile(\"bin/build\", b.chmod, time.Now(), []byte(\"build-contents\"))\n\ttarBuilder.AddFile(\"bin/detect\", b.chmod, time.Now(), []byte(\"detect-contents\"))\n\n\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory()), err\n}\n"
  },
  {
    "path": "internal/fakes/fake_buildpack_tar.go",
    "content": "package fakes\n\nimport (\n\t\"io\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc CreateBuildpackTar(t *testing.T, tmpDir string, descriptor dist.BuildpackDescriptor) string {\n\tbuildpack, err := NewFakeBuildpackBlob(&descriptor, 0777)\n\th.AssertNil(t, err)\n\n\ttempFile, err := os.CreateTemp(tmpDir, \"bp-*.tar\")\n\th.AssertNil(t, err)\n\tdefer tempFile.Close()\n\n\treader, err := buildpack.Open()\n\th.AssertNil(t, err)\n\n\t_, err = io.Copy(tempFile, reader)\n\th.AssertNil(t, err)\n\n\treturn tempFile.Name()\n}\n"
  },
  {
    "path": "internal/fakes/fake_extension.go",
    "content": "package fakes\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"path/filepath\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype fakeExtension struct {\n\tdescriptor dist.ExtensionDescriptor\n\tchmod      int64\n\toptions    []FakeExtensionOption\n}\n\ntype fakeExtensionConfig struct {\n\t// maping of extrafilename to stringified contents\n\tExtraFiles map[string]string\n\tOpenError  error\n}\n\nfunc newFakeExtensionConfig() *fakeExtensionConfig {\n\treturn &fakeExtensionConfig{ExtraFiles: map[string]string{}}\n}\n\ntype FakeExtensionOption func(*fakeExtensionConfig)\n\nfunc WithExtraExtensionContents(filename, contents string) FakeExtensionOption {\n\treturn func(f *fakeExtensionConfig) {\n\t\tf.ExtraFiles[filename] = contents\n\t}\n}\n\nfunc WithExtOpenError(err error) FakeExtensionOption {\n\treturn func(f *fakeExtensionConfig) {\n\t\tf.OpenError = err\n\t}\n}\n\n// NewFakeExtension creates a fake extension with contents:\n//\n//\t\t\\_ /cnb/extensions/{ID}\n//\t\t\\_ /cnb/extensions/{ID}/{version}\n//\t\t\\_ /cnb/extensions/{ID}/{version}/extension.toml\n//\t\t\\_ /cnb/extensions/{ID}/{version}/bin\n//\t\t\\_ /cnb/extensions/{ID}/{version}/bin/generate\n//\t \tgenerate-contents\n//\t\t\\_ /cnb/extensions/{ID}/{version}/bin/detect\n//\t \tdetect-contents\nfunc NewFakeExtension(descriptor dist.ExtensionDescriptor, chmod int64, options ...FakeExtensionOption) (buildpack.BuildModule, error) {\n\treturn &fakeExtension{\n\t\tdescriptor: descriptor,\n\t\tchmod:      chmod,\n\t\toptions:    options,\n\t}, nil\n}\n\nfunc (b *fakeExtension) Descriptor() buildpack.Descriptor {\n\treturn &b.descriptor\n}\n\nfunc (b *fakeExtension) Open() (io.ReadCloser, error) {\n\tfConfig := newFakeExtensionConfig()\n\tfor _, option := range b.options {\n\t\toption(fConfig)\n\t}\n\n\tif fConfig.OpenError != nil {\n\t\treturn nil, 
fConfig.OpenError\n\t}\n\n\tbuf := &bytes.Buffer{}\n\tif err := toml.NewEncoder(buf).Encode(b.descriptor); err != nil {\n\t\treturn nil, err\n\t}\n\n\ttarBuilder := archive.TarBuilder{}\n\tts := archive.NormalizedDateTime\n\ttarBuilder.AddDir(fmt.Sprintf(\"/cnb/extensions/%s\", b.descriptor.EscapedID()), b.chmod, ts)\n\textDir := fmt.Sprintf(\"/cnb/extensions/%s/%s\", b.descriptor.EscapedID(), b.descriptor.Info().Version)\n\ttarBuilder.AddDir(extDir, b.chmod, ts)\n\ttarBuilder.AddFile(extDir+\"/extension.toml\", b.chmod, ts, buf.Bytes())\n\n\ttarBuilder.AddDir(extDir+\"/bin\", b.chmod, ts)\n\ttarBuilder.AddFile(extDir+\"/bin/generate\", b.chmod, ts, []byte(\"generate-contents\"))\n\ttarBuilder.AddFile(extDir+\"/bin/detect\", b.chmod, ts, []byte(\"detect-contents\"))\n\n\tfor extraFilename, extraContents := range fConfig.ExtraFiles {\n\t\ttarBuilder.AddFile(filepath.Join(extDir, extraFilename), b.chmod, ts, []byte(extraContents))\n\t}\n\n\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory()), nil\n}\n"
  },
  {
    "path": "internal/fakes/fake_extension_blob.go",
    "content": "package fakes\n\nimport (\n\t\"bytes\"\n\t\"io\"\n\t\"time\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n)\n\ntype fakeExtensionBlob struct {\n\tdescriptor buildpack.Descriptor\n\tchmod      int64\n}\n\nfunc NewFakeExtensionBlob(descriptor buildpack.Descriptor, chmod int64) (blob.Blob, error) {\n\treturn &fakeExtensionBlob{\n\t\tdescriptor: descriptor,\n\t\tchmod:      chmod,\n\t}, nil\n}\n\nfunc (b *fakeExtensionBlob) Open() (reader io.ReadCloser, err error) {\n\tbuf := &bytes.Buffer{}\n\tif err = toml.NewEncoder(buf).Encode(b.descriptor); err != nil {\n\t\treturn nil, err\n\t}\n\n\ttarBuilder := archive.TarBuilder{}\n\ttarBuilder.AddFile(\"extension.toml\", b.chmod, time.Now(), buf.Bytes())\n\ttarBuilder.AddDir(\"bin\", b.chmod, time.Now())\n\ttarBuilder.AddFile(\"bin/build\", b.chmod, time.Now(), []byte(\"build-contents\"))\n\ttarBuilder.AddFile(\"bin/detect\", b.chmod, time.Now(), []byte(\"detect-contents\"))\n\n\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory()), err\n}\n"
  },
  {
    "path": "internal/fakes/fake_extension_tar.go",
    "content": "package fakes\n\nimport (\n\t\"io\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc CreateExtensionTar(t *testing.T, tmpDir string, descriptor dist.ExtensionDescriptor) string {\n\textension, err := NewFakeExtensionBlob(&descriptor, 0777)\n\th.AssertNil(t, err)\n\n\ttempFile, err := os.CreateTemp(tmpDir, \"ex-*.tar\")\n\th.AssertNil(t, err)\n\tdefer tempFile.Close()\n\n\treader, err := extension.Open()\n\th.AssertNil(t, err)\n\n\t_, err = io.Copy(tempFile, reader)\n\th.AssertNil(t, err)\n\n\treturn tempFile.Name()\n}\n"
  },
  {
    "path": "internal/fakes/fake_image_fetcher.go",
    "content": "package fakes\n\nimport (\n\t\"context\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype FetchArgs struct {\n\tDaemon       bool\n\tPullPolicy   image.PullPolicy\n\tLayoutOption image.LayoutOption\n\tTarget       *dist.Target\n}\n\ntype FakeImageFetcher struct {\n\tLocalImages  map[string]imgutil.Image\n\tRemoteImages map[string]imgutil.Image\n\tFetchCalls   map[string]*FetchArgs\n}\n\nfunc NewFakeImageFetcher() *FakeImageFetcher {\n\treturn &FakeImageFetcher{\n\t\tLocalImages:  map[string]imgutil.Image{},\n\t\tRemoteImages: map[string]imgutil.Image{},\n\t\tFetchCalls:   map[string]*FetchArgs{},\n\t}\n}\n\nfunc (f *FakeImageFetcher) Fetch(ctx context.Context, name string, options image.FetchOptions) (imgutil.Image, error) {\n\tf.FetchCalls[name] = &FetchArgs{Daemon: options.Daemon, PullPolicy: options.PullPolicy, Target: options.Target, LayoutOption: options.LayoutOption}\n\n\tri, remoteFound := f.RemoteImages[name]\n\n\tif options.Daemon {\n\t\tli, localFound := f.LocalImages[name]\n\n\t\tif shouldPull(localFound, remoteFound, options.PullPolicy) {\n\t\t\tf.LocalImages[name] = ri\n\t\t\tli = ri\n\t\t}\n\t\tif !localFound {\n\t\t\treturn nil, errors.Wrapf(image.ErrNotFound, \"image '%s' does not exist on the daemon\", name)\n\t\t}\n\t\treturn li, nil\n\t}\n\n\tif !remoteFound {\n\t\treturn nil, errors.Wrapf(image.ErrNotFound, \"image '%s' does not exist in registry\", name)\n\t}\n\n\treturn ri, nil\n}\n\nfunc (f *FakeImageFetcher) CheckReadAccess(_ string, _ image.FetchOptions) bool {\n\treturn true\n}\n\nfunc (f *FakeImageFetcher) FetchForPlatform(ctx context.Context, name string, options image.FetchOptions) (imgutil.Image, error) {\n\t// For the fake implementation, FetchForPlatform behaves the same as Fetch\n\t// since we don't need to simulate the platform-specific digest resolution\n\treturn f.Fetch(ctx, name, 
options)\n}\n\nfunc shouldPull(localFound, remoteFound bool, policy image.PullPolicy) bool {\n\tif remoteFound && !localFound && policy == image.PullIfNotPresent {\n\t\treturn true\n\t}\n\n\treturn remoteFound && policy == image.PullAlways\n}\n"
  },
  {
    "path": "internal/fakes/fake_images.go",
    "content": "package fakes\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/fakes\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype FakeImageCreator func(name string, topLayerSha string, identifier imgutil.Identifier) *fakes.Image\n\nfunc NewFakeBuilderImage(t *testing.T, tmpDir, name string, stackID, uid, gid string, metadata builder.Metadata, bpLayers dist.ModuleLayers, order dist.Order, exLayers dist.ModuleLayers, orderExtensions dist.Order, system dist.System, creator FakeImageCreator) *fakes.Image {\n\tfakeBuilderImage := creator(name, \"\", nil)\n\n\th.AssertNil(t, fakeBuilderImage.SetLabel(\"io.buildpacks.stack.id\", stackID))\n\th.AssertNil(t, fakeBuilderImage.SetEnv(\"CNB_USER_ID\", uid))\n\th.AssertNil(t, fakeBuilderImage.SetEnv(\"CNB_GROUP_ID\", gid))\n\n\th.AssertNil(t, dist.SetLabel(fakeBuilderImage, \"io.buildpacks.builder.metadata\", metadata))\n\th.AssertNil(t, dist.SetLabel(fakeBuilderImage, \"io.buildpacks.buildpack.layers\", bpLayers))\n\n\tfor bpID, v := range bpLayers {\n\t\tfor bpVersion, bpLayerInfo := range v {\n\t\t\tbpInfo := dist.ModuleInfo{\n\t\t\t\tID:      bpID,\n\t\t\t\tVersion: bpVersion,\n\t\t\t}\n\n\t\t\tbuildpackDescriptor := dist.BuildpackDescriptor{\n\t\t\t\tWithAPI:    bpLayerInfo.API,\n\t\t\t\tWithInfo:   bpInfo,\n\t\t\t\tWithStacks: bpLayerInfo.Stacks,\n\t\t\t\tWithOrder:  bpLayerInfo.Order,\n\t\t\t}\n\n\t\t\tbuildpackTar := CreateBuildpackTar(t, tmpDir, buildpackDescriptor)\n\t\t\terr := fakeBuilderImage.AddLayer(buildpackTar)\n\t\t\th.AssertNil(t, err)\n\t\t}\n\t}\n\n\tfor exID, v := range exLayers {\n\t\tfor exVersion, exLayerInfo := range v {\n\t\t\texInfo := dist.ModuleInfo{\n\t\t\t\tID:      exID,\n\t\t\t\tVersion: 
exVersion,\n\t\t\t}\n\n\t\t\textensionDescriptor := dist.ExtensionDescriptor{\n\t\t\t\tWithAPI:  exLayerInfo.API,\n\t\t\t\tWithInfo: exInfo,\n\t\t\t}\n\n\t\t\textensionTar := CreateExtensionTar(t, tmpDir, extensionDescriptor)\n\t\t\terr := fakeBuilderImage.AddLayer(extensionTar)\n\t\t\th.AssertNil(t, err)\n\t\t}\n\t}\n\n\th.AssertNil(t, dist.SetLabel(fakeBuilderImage, \"io.buildpacks.buildpack.order\", order))\n\th.AssertNil(t, dist.SetLabel(fakeBuilderImage, \"io.buildpacks.extension.order\", orderExtensions))\n\n\ttarBuilder := archive.TarBuilder{}\n\torderTomlBytes := &bytes.Buffer{}\n\th.AssertNil(t, toml.NewEncoder(orderTomlBytes).Encode(orderTOML{Order: order, OrderExtensions: orderExtensions}))\n\ttarBuilder.AddFile(\"/cnb/order.toml\", 0777, archive.NormalizedDateTime, orderTomlBytes.Bytes())\n\n\torderTar := filepath.Join(tmpDir, fmt.Sprintf(\"order.%s.toml\", h.RandString(8)))\n\th.AssertNil(t, tarBuilder.WriteToPath(orderTar, archive.DefaultTarWriterFactory()))\n\th.AssertNil(t, fakeBuilderImage.AddLayer(orderTar))\n\n\tif len(system.Pre.Buildpacks) > 0 || len(system.Post.Buildpacks) > 0 {\n\t\th.AssertNil(t, dist.SetLabel(fakeBuilderImage, \"io.buildpacks.buildpack.system\", system))\n\t\tsystemTarBuilder := archive.TarBuilder{}\n\t\tsystemTomlBytes := &bytes.Buffer{}\n\t\th.AssertNil(t, toml.NewEncoder(systemTomlBytes).Encode(systemTOML{System: system}))\n\n\t\tsystemTarBuilder.AddFile(\"/cnb/system.toml\", 0777, archive.NormalizedDateTime, systemTomlBytes.Bytes())\n\n\t\tsystemTar := filepath.Join(tmpDir, fmt.Sprintf(\"system.%s.toml\", h.RandString(8)))\n\t\th.AssertNil(t, systemTarBuilder.WriteToPath(systemTar, archive.DefaultTarWriterFactory()))\n\t\th.AssertNil(t, fakeBuilderImage.AddLayer(systemTar))\n\t}\n\n\treturn fakeBuilderImage\n}\n\ntype orderTOML struct {\n\tOrder           dist.Order `toml:\"order\"`\n\tOrderExtensions dist.Order `toml:\"orderExtensions\"`\n}\n\ntype systemTOML struct {\n\tSystem dist.System `toml:\"system\"`\n}\n"
  },
  {
    "path": "internal/fakes/fake_lifecycle.go",
    "content": "package fakes\n\nimport (\n\t\"context\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n)\n\ntype FakeLifecycle struct {\n\tOpts build.LifecycleOptions\n}\n\nfunc (f *FakeLifecycle) Execute(ctx context.Context, opts build.LifecycleOptions) error {\n\tf.Opts = opts\n\treturn nil\n}\n"
  },
  {
    "path": "internal/fakes/fake_package.go",
    "content": "package fakes\n\nimport (\n\t\"errors\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/google/go-containerregistry/pkg/v1/tarball\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype Package interface {\n\tName() string\n\tBuildpackLayers() dist.ModuleLayers\n\tGetLayer(diffID string) (io.ReadCloser, error)\n}\n\nvar _ Package = (*fakePackage)(nil)\n\ntype fakePackage struct {\n\tname       string\n\tbpTarFiles map[string]string\n\tbpLayers   dist.ModuleLayers\n}\n\nfunc NewPackage(tmpDir string, name string, buildpacks []buildpack.BuildModule) (Package, error) {\n\tprocessBuildpack := func(bp buildpack.BuildModule) (tarFile string, diffID string, err error) {\n\t\ttarFile, err = buildpack.ToLayerTar(tmpDir, bp)\n\t\tif err != nil {\n\t\t\treturn \"\", \"\", err\n\t\t}\n\n\t\tlayer, err := tarball.LayerFromFile(tarFile)\n\t\tif err != nil {\n\t\t\treturn \"\", \"\", err\n\t\t}\n\n\t\thash, err := layer.DiffID()\n\t\tif err != nil {\n\t\t\treturn \"\", \"\", err\n\t\t}\n\n\t\treturn tarFile, hash.String(), nil\n\t}\n\n\tbpLayers := dist.ModuleLayers{}\n\tbpTarFiles := map[string]string{}\n\tfor _, bp := range buildpacks {\n\t\ttarFile, diffID, err := processBuildpack(bp)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tbpTarFiles[diffID] = tarFile\n\t\tdist.AddToLayersMD(bpLayers, bp.Descriptor(), diffID)\n\t}\n\n\treturn &fakePackage{\n\t\tname:       name,\n\t\tbpTarFiles: bpTarFiles,\n\t\tbpLayers:   bpLayers,\n\t}, nil\n}\n\nfunc (f *fakePackage) Name() string {\n\treturn f.name\n}\n\nfunc (f *fakePackage) BuildpackLayers() dist.ModuleLayers {\n\treturn f.bpLayers\n}\n\nfunc (f *fakePackage) GetLayer(diffID string) (io.ReadCloser, error) {\n\ttarFile, ok := f.bpTarFiles[diffID]\n\tif !ok {\n\t\treturn nil, errors.New(\"no layer found\")\n\t}\n\n\treturn os.Open(filepath.Clean(tarFile))\n}\n"
  },
  {
    "path": "internal/inspectimage/bom_display.go",
    "content": "package inspectimage\n\nimport (\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype BOMDisplay struct {\n\tRemote    []BOMEntryDisplay `json:\"remote\" yaml:\"remote\"`\n\tLocal     []BOMEntryDisplay `json:\"local\" yaml:\"local\"`\n\tRemoteErr string            `json:\"remote_error,omitempty\" yaml:\"remoteError,omitempty\"`\n\tLocalErr  string            `json:\"local_error,omitempty\" yaml:\"localError,omitempty\"`\n}\n\ntype BOMEntryDisplay struct {\n\tName      string                 `toml:\"name\" json:\"name\" yaml:\"name\"`\n\tVersion   string                 `toml:\"version,omitempty\" json:\"version,omitempty\" yaml:\"version,omitempty\"`\n\tMetadata  map[string]interface{} `toml:\"metadata\" json:\"metadata\" yaml:\"metadata\"`\n\tBuildpack dist.ModuleRef         `json:\"buildpacks\" yaml:\"buildpacks\" toml:\"buildpacks\"`\n}\n\nfunc NewBOMDisplay(info *client.ImageInfo) []BOMEntryDisplay {\n\tif info == nil {\n\t\treturn nil\n\t}\n\tif info != nil && info.Extensions != nil {\n\t\treturn displayBOMWithExtension(info.BOM)\n\t}\n\treturn displayBOM(info.BOM)\n}\n\nfunc displayBOM(bom []buildpack.BOMEntry) []BOMEntryDisplay {\n\tvar result []BOMEntryDisplay\n\tfor _, entry := range bom {\n\t\tresult = append(result, BOMEntryDisplay{\n\t\t\tName:     entry.Name,\n\t\t\tVersion:  entry.Version,\n\t\t\tMetadata: entry.Metadata,\n\n\t\t\tBuildpack: dist.ModuleRef{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      entry.Buildpack.ID,\n\t\t\t\t\tVersion: entry.Buildpack.Version,\n\t\t\t\t},\n\t\t\t\tOptional: entry.Buildpack.Optional,\n\t\t\t},\n\t\t})\n\t}\n\n\treturn result\n}\n\nfunc displayBOMWithExtension(bom []buildpack.BOMEntry) []BOMEntryDisplay {\n\tvar result []BOMEntryDisplay\n\tfor _, entry := range bom {\n\t\tresult = append(result, BOMEntryDisplay{\n\t\t\tName:     entry.Name,\n\t\t\tVersion:  entry.Version,\n\t\t\tMetadata: 
entry.Metadata,\n\n\t\t\tBuildpack: dist.ModuleRef{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      entry.Buildpack.ID,\n\t\t\t\t\tVersion: entry.Buildpack.Version,\n\t\t\t\t},\n\t\t\t\tOptional: entry.Buildpack.Optional,\n\t\t\t},\n\t\t})\n\t}\n\n\treturn result\n}\n"
  },
  {
    "path": "internal/inspectimage/info_display.go",
    "content": "package inspectimage\n\nimport (\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/buildpacks/lifecycle/launch\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype GeneralInfo struct {\n\tName            string\n\tRunImageMirrors []config.RunImage\n}\n\ntype RunImageMirrorDisplay struct {\n\tName           string `json:\"name\" yaml:\"name\" toml:\"name\"`\n\tUserConfigured bool   `json:\"user_configured,omitempty\" yaml:\"user_configured,omitempty\" toml:\"user_configured,omitempty\"`\n}\n\ntype StackDisplay struct {\n\tID     string   `json:\"id\" yaml:\"id\" toml:\"id\"`\n\tMixins []string `json:\"mixins,omitempty\" yaml:\"mixins,omitempty\" toml:\"mixins,omitempty\"`\n}\n\ntype ProcessDisplay struct {\n\tType    string   `json:\"type\" yaml:\"type\" toml:\"type\"`\n\tShell   string   `json:\"shell\" yaml:\"shell\" toml:\"shell\"`\n\tCommand string   `json:\"command\" yaml:\"command\" toml:\"command\"`\n\tDefault bool     `json:\"default\" yaml:\"default\" toml:\"default\"`\n\tArgs    []string `json:\"args\" yaml:\"args\" toml:\"args\"`\n\tWorkDir string   `json:\"working-dir\" yaml:\"working-dir\" toml:\"working-dir\"`\n}\n\ntype BaseDisplay struct {\n\tTopLayer  string `json:\"top_layer\" yaml:\"top_layer\" toml:\"top_layer\"`\n\tReference string `json:\"reference\" yaml:\"reference\" toml:\"reference\"`\n}\n\ntype InfoDisplay struct {\n\tStackID         string                  `json:\"stack\" yaml:\"stack\" toml:\"stack\"`\n\tBase            BaseDisplay             `json:\"base_image\" yaml:\"base_image\" toml:\"base_image\"`\n\tRunImageMirrors []RunImageMirrorDisplay `json:\"run_images\" yaml:\"run_images\" toml:\"run_images\"`\n\tBuildpacks      []dist.ModuleInfo       `json:\"buildpacks\" yaml:\"buildpacks\" toml:\"buildpacks\"`\n\tExtensions      []dist.ModuleInfo       
`json:\"extensions\" yaml:\"extensions\" toml:\"extensions\"`\n\tProcesses       []ProcessDisplay        `json:\"processes\" yaml:\"processes\" toml:\"processes\"`\n\tRebasable       bool                    `json:\"rebasable\" yaml:\"rebasable\" toml:\"rebasable\"`\n}\n\ntype InspectOutput struct {\n\tImageName string       `json:\"image_name\" yaml:\"image_name\" toml:\"image_name\"`\n\tRemote    *InfoDisplay `json:\"remote_info\" yaml:\"remote_info\" toml:\"remote_info\"`\n\tLocal     *InfoDisplay `json:\"local_info\" yaml:\"local_info\" toml:\"local_info\"`\n}\n\nfunc NewInfoDisplay(info *client.ImageInfo, generalInfo GeneralInfo) *InfoDisplay {\n\tif info == nil {\n\t\treturn nil\n\t}\n\t// info is non-nil here; only the Extensions check is needed\n\tif info.Extensions != nil {\n\t\treturn &InfoDisplay{\n\t\t\tStackID:         info.StackID,\n\t\t\tBase:            displayBase(info.Base),\n\t\t\tRunImageMirrors: displayMirrors(info, generalInfo),\n\t\t\tBuildpacks:      displayBuildpacks(info.Buildpacks),\n\t\t\tExtensions:      displayExtensions(info.Extensions),\n\t\t\tProcesses:       displayProcesses(info.Processes),\n\t\t\tRebasable:       info.Rebasable,\n\t\t}\n\t}\n\treturn &InfoDisplay{\n\t\tStackID:         info.StackID,\n\t\tBase:            displayBase(info.Base),\n\t\tRunImageMirrors: displayMirrors(info, generalInfo),\n\t\tBuildpacks:      displayBuildpacks(info.Buildpacks),\n\t\tProcesses:       displayProcesses(info.Processes),\n\t\tRebasable:       info.Rebasable,\n\t}\n}\n\n//\n// private functions\n//\n\nfunc getConfigMirrors(info *client.ImageInfo, imageMirrors []config.RunImage) []string {\n\tvar runImage string\n\tif info != nil {\n\t\trunImage = info.Stack.RunImage.Image\n\t}\n\n\tfor _, ri := range imageMirrors {\n\t\tif ri.Image == runImage {\n\t\t\treturn ri.Mirrors\n\t\t}\n\t}\n\treturn nil\n}\n\nfunc displayBase(base files.RunImageForRebase) BaseDisplay {\n\treturn BaseDisplay{\n\t\tTopLayer:  base.TopLayer,\n\t\tReference: base.Reference,\n\t}\n}\n\nfunc displayMirrors(info 
*client.ImageInfo, generalInfo GeneralInfo) []RunImageMirrorDisplay {\n\t// add all user configured run images, then add run images provided by info\n\tvar result []RunImageMirrorDisplay\n\tif info == nil {\n\t\treturn result\n\t}\n\n\tcfgMirrors := getConfigMirrors(info, generalInfo.RunImageMirrors)\n\tfor _, mirror := range cfgMirrors {\n\t\tif mirror != \"\" {\n\t\t\tresult = append(result, RunImageMirrorDisplay{\n\t\t\t\tName:           mirror,\n\t\t\t\tUserConfigured: true,\n\t\t\t})\n\t\t}\n\t}\n\n\t// Add run image as named by the stack.\n\tif info.Stack.RunImage.Image != \"\" {\n\t\tresult = append(result, RunImageMirrorDisplay{\n\t\t\tName:           info.Stack.RunImage.Image,\n\t\t\tUserConfigured: false,\n\t\t})\n\t}\n\n\tfor _, mirror := range info.Stack.RunImage.Mirrors {\n\t\tif mirror != \"\" {\n\t\t\tresult = append(result, RunImageMirrorDisplay{\n\t\t\t\tName:           mirror,\n\t\t\t\tUserConfigured: false,\n\t\t\t})\n\t\t}\n\t}\n\n\treturn result\n}\n\nfunc displayBuildpacks(buildpacks []buildpack.GroupElement) []dist.ModuleInfo {\n\tvar result []dist.ModuleInfo\n\tfor _, buildpack := range buildpacks {\n\t\tresult = append(result, dist.ModuleInfo{\n\t\t\tID:       buildpack.ID,\n\t\t\tVersion:  buildpack.Version,\n\t\t\tHomepage: buildpack.Homepage,\n\t\t})\n\t}\n\treturn result\n}\n\nfunc displayExtensions(extensions []buildpack.GroupElement) []dist.ModuleInfo {\n\tvar result []dist.ModuleInfo\n\tfor _, extension := range extensions {\n\t\tresult = append(result, dist.ModuleInfo{\n\t\t\tID:       extension.ID,\n\t\t\tVersion:  extension.Version,\n\t\t\tHomepage: extension.Homepage,\n\t\t})\n\t}\n\treturn result\n}\n\nfunc displayProcesses(details client.ProcessDetails) []ProcessDisplay {\n\tvar result []ProcessDisplay\n\tdetailsArray := details.OtherProcesses\n\tif details.DefaultProcess != nil {\n\t\tresult = append(result, convertToDisplay(*details.DefaultProcess, true))\n\t}\n\n\tfor _, detail := range detailsArray {\n\t\tresult = 
append(result, convertToDisplay(detail, false))\n\t}\n\treturn result\n}\n\nfunc convertToDisplay(proc launch.Process, isDefault bool) ProcessDisplay {\n\tvar shell string\n\tswitch proc.Direct {\n\tcase true:\n\t\tshell = \"\"\n\tcase false:\n\t\tshell = \"bash\"\n\t}\n\tvar argsToUse []string\n\tif len(proc.Command.Entries) > 1 {\n\t\targsToUse = proc.Command.Entries[1:]\n\t}\n\targsToUse = append(argsToUse, proc.Args...)\n\tresult := ProcessDisplay{\n\t\tType:    proc.Type,\n\t\tShell:   shell,\n\t\tCommand: proc.Command.Entries[0],\n\t\tDefault: isDefault,\n\t\tArgs:    argsToUse, // overridable args are supported for platform API >= 0.10 with buildpack API >= 0.9, but we can't determine the buildpack API from the metadata label (to be fixed in platform 0.11)\n\t\tWorkDir: proc.WorkingDirectory,\n\t}\n\n\treturn result\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/bom_json.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\t\"encoding/json\"\n)\n\ntype JSONBOM struct {\n\tStructuredBOMFormat\n}\n\nfunc NewJSONBOM() *JSONBOM {\n\treturn &JSONBOM{\n\t\tStructuredBOMFormat: StructuredBOMFormat{\n\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\tbuf := bytes.NewBuffer(nil)\n\t\t\t\tif err := json.NewEncoder(buf).Encode(i); err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\n\t\t\t\tformattedBuf := bytes.NewBuffer(nil)\n\t\t\t\tif err := json.Indent(formattedBuf, buf.Bytes(), \"\", \"  \"); err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\t\t\t\treturn formattedBuf.Bytes(), nil\n\t\t\t},\n\t\t},\n\t}\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/bom_json_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestJSONBOM(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"JSON BOM Writer\", testJSONBOM, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testJSONBOM(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo *client.ImageInfo\n\t\tlocalInfo  *client.ImageInfo\n\n\t\texpectedLocalOutput = `{\n  \"local\": [\n    {\n      \"name\": \"name-1\",\n      \"version\": \"version-1\",\n      \"metadata\": {\n        \"LocalData\": {\n          \"String\": \"\",\n          \"Bool\": false,\n          \"Int\": 456,\n          \"Nested\": {\n            \"String\": \"\"\n          }\n        }\n      },\n      \"buildpacks\": {\n        \"id\": \"test.bp.one.remote\",\n        \"version\": \"1.0.0\"\n      }\n    }\n  ]\n}`\n\t\texpectedRemoteOutput = `{\n  \"remote\": [\n    {\n      \"name\": \"name-1\",\n      \"version\": \"version-1\",\n      \"metadata\": {\n        \"RemoteData\": {\n          \"String\": \"aString\",\n          \"Bool\": true,\n          \"Int\": 123,\n          \"Nested\": {\n            \"String\": \"anotherString\"\n          }\n        }\n      },\n      \"buildpacks\": {\n        \"id\": \"test.bp.one.remote\",\n        \"version\": \"1.0.0\"\n      }\n    }\n  ]\n}`\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\ttype someData struct {\n\t\t\t\tString string\n\t\t\t\tBool   bool\n\t\t\t\tInt    int\n\t\t\t\tNested 
struct {\n\t\t\t\t\tString string\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tremoteInfo = &client.ImageInfo{\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"RemoteData\": someData{\n\t\t\t\t\t\t\t\tString: \"aString\",\n\t\t\t\t\t\t\t\tBool:   true,\n\t\t\t\t\t\t\t\tInt:    123,\n\t\t\t\t\t\t\t\tNested: struct {\n\t\t\t\t\t\t\t\t\tString string\n\t\t\t\t\t\t\t\t}{\n\t\t\t\t\t\t\t\t\tString: \"anotherString\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage\"},\n\t\t\t\t}}}\n\n\t\t\tlocalInfo = &client.ImageInfo{\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"LocalData\": someData{\n\t\t\t\t\t\t\t\tBool: false,\n\t\t\t\t\t\t\t\tInt:  456,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage\"},\n\t\t\t\t}},\n\t\t\t}\n\n\t\t\toutBuf = bytes.Buffer{}\n\t\t})\n\n\t\twhen(\"local and remote images exist\", func() {\n\t\t\tit(\"prints both local and remote image info in JSON format\", func() {\n\t\t\t\tjsonBOMWriter := writer.NewJSONBOM()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonBOMWriter.Print(logger, inspectimage.GeneralInfo{}, localInfo, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only local image exists\", func() {\n\t\t\tit(\"prints local image info in JSON format\", func() 
{\n\t\t\t\tjsonBOMWriter := writer.NewJSONBOM()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonBOMWriter.Print(logger, inspectimage.GeneralInfo{}, localInfo, nil, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedLocalOutput)\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.remote\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only remote image exists\", func() {\n\t\t\tit(\"prints remote image info in JSON format\", func() {\n\t\t\t\tjsonBOMWriter := writer.NewJSONBOM()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonBOMWriter.Print(logger, inspectimage.GeneralInfo{}, nil, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.local\")\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/bom_yaml.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\n\t\"gopkg.in/yaml.v3\"\n)\n\ntype YAMLBOM struct {\n\tStructuredBOMFormat\n}\n\nfunc NewYAMLBOM() *YAMLBOM {\n\treturn &YAMLBOM{\n\t\tStructuredBOMFormat: StructuredBOMFormat{\n\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\tbuf := bytes.NewBuffer(nil)\n\t\t\t\tif err := yaml.NewEncoder(buf).Encode(i); err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\t\t\t\treturn buf.Bytes(), nil\n\t\t\t},\n\t\t},\n\t}\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/bom_yaml_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestYAMLBOM(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"YAML BOM Writer\", testYAMLBOM, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testYAMLBOM(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo *client.ImageInfo\n\t\tlocalInfo  *client.ImageInfo\n\n\t\texpectedLocalOutput = `---\nlocal:\n- name: name-1\n  version: version-1\n  metadata:\n    LocalData:\n      string: ''\n      bool: false\n      int: 456\n      nested:\n        string: ''\n  buildpacks:\n    id: test.bp.one.remote\n    version: 1.0.0\n`\n\t\texpectedRemoteOutput = `---\nremote:\n- name: name-1\n  version: version-1\n  metadata:\n    RemoteData:\n      string: aString\n      bool: true\n      int: 123\n      nested:\n        string: anotherString\n  buildpacks:\n    id: test.bp.one.remote\n    version: 1.0.0\n`\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\ttype someData struct {\n\t\t\t\tString string\n\t\t\t\tBool   bool\n\t\t\t\tInt    int\n\t\t\t\tNested struct {\n\t\t\t\t\tString string\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tremoteInfo = &client.ImageInfo{\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"RemoteData\": someData{\n\t\t\t\t\t\t\t\tString: \"aString\",\n\t\t\t\t\t\t\t\tBool:   
true,\n\t\t\t\t\t\t\t\tInt:    123,\n\t\t\t\t\t\t\t\tNested: struct {\n\t\t\t\t\t\t\t\t\tString string\n\t\t\t\t\t\t\t\t}{\n\t\t\t\t\t\t\t\t\tString: \"anotherString\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage\"},\n\t\t\t\t}}}\n\n\t\t\tlocalInfo = &client.ImageInfo{\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"LocalData\": someData{\n\t\t\t\t\t\t\t\tBool: false,\n\t\t\t\t\t\t\t\tInt:  456,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage\"},\n\t\t\t\t}},\n\t\t\t}\n\n\t\t\toutBuf = bytes.Buffer{}\n\t\t})\n\n\t\twhen(\"local and remote images exist\", func() {\n\t\t\tit(\"prints both local and remote image info in YAML format\", func() {\n\t\t\t\tyamlBOMWriter := writer.NewYAMLBOM()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlBOMWriter.Print(logger, inspectimage.GeneralInfo{}, localInfo, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only local image exists\", func() {\n\t\t\tit(\"prints local image info in YAML format\", func() {\n\t\t\t\tyamlBOMWriter := writer.NewYAMLBOM()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlBOMWriter.Print(logger, inspectimage.GeneralInfo{}, localInfo, nil, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedLocalOutput)\n\n\t\t\t\tassert.NotContains(outBuf.String(), 
\"test.stack.id.remote\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only remote image exists\", func() {\n\t\t\tit(\"prints remote image info in YAML format\", func() {\n\t\t\t\tyamlBOMWriter := writer.NewYAMLBOM()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlBOMWriter.Print(logger, inspectimage.GeneralInfo{}, nil, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.local\")\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/factory.go",
    "content": "package writer\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype Factory struct{}\n\ntype InspectImageWriter interface {\n\tPrint(\n\t\tlogger logging.Logger,\n\t\tsharedInfo inspectimage.GeneralInfo,\n\t\tlocal, remote *client.ImageInfo,\n\t\tlocalErr, remoteErr error,\n\t) error\n}\n\nfunc NewFactory() *Factory {\n\treturn &Factory{}\n}\n\nfunc (f *Factory) Writer(kind string, bom bool) (InspectImageWriter, error) {\n\tif bom {\n\t\tswitch kind {\n\t\tcase \"human-readable\", \"json\":\n\t\t\treturn NewJSONBOM(), nil\n\t\tcase \"yaml\":\n\t\t\treturn NewYAMLBOM(), nil\n\t\t}\n\t} else {\n\t\tswitch kind {\n\t\tcase \"human-readable\":\n\t\t\treturn NewHumanReadable(), nil\n\t\tcase \"json\":\n\t\t\treturn NewJSON(), nil\n\t\tcase \"yaml\":\n\t\t\treturn NewYAML(), nil\n\t\tcase \"toml\":\n\t\t\treturn NewTOML(), nil\n\t\t}\n\t}\n\n\treturn nil, fmt.Errorf(\"output format %s is not supported\", style.Symbol(kind))\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/factory_test.go",
    "content": "package writer_test\n\nimport (\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n)\n\nfunc TestFactory(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder Writer Factory\", testFactory, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testFactory(t *testing.T, when spec.G, it spec.S) {\n\tvar assert = h.NewAssertionManager(t)\n\n\twhen(\"Writer\", func() {\n\t\twhen(\"Not BOM\", func() {\n\t\t\twhen(\"output format is human-readable\", func() {\n\t\t\t\tit(\"returns a HumanReadable writer\", func() {\n\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\treturnedWriter, err := factory.Writer(\"human-readable\", false)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t_, ok := returnedWriter.(*writer.HumanReadable)\n\t\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\t\tok,\n\t\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.HumanReadable`\", returnedWriter),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"output format is json\", func() {\n\t\t\t\tit(\"returns a JSON writer\", func() {\n\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\treturnedWriter, err := factory.Writer(\"json\", false)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t_, ok := returnedWriter.(*writer.JSON)\n\t\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\t\tok,\n\t\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.JSON`\", returnedWriter),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"output format is yaml\", func() {\n\t\t\t\tit(\"returns a YAML writer\", func() {\n\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\treturnedWriter, err := factory.Writer(\"yaml\", false)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t_, ok := 
returnedWriter.(*writer.YAML)\n\t\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\t\tok,\n\t\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.YAML`\", returnedWriter),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"output format is toml\", func() {\n\t\t\t\tit(\"returns a TOML writer\", func() {\n\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\treturnedWriter, err := factory.Writer(\"toml\", false)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t_, ok := returnedWriter.(*writer.TOML)\n\t\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\t\tok,\n\t\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.TOML`\", returnedWriter),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"output format is not supported\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\t_, err := factory.Writer(\"mind-beam\", false)\n\t\t\t\t\tassert.ErrorWithMessage(err, \"output format 'mind-beam' is not supported\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"BOM\", func() {\n\t\t\twhen(\"output format is json\", func() {\n\t\t\t\tit(\"returns a JSONBOM writer\", func() {\n\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\treturnedWriter, err := factory.Writer(\"json\", true)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t_, ok := returnedWriter.(*writer.JSONBOM)\n\t\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\t\tok,\n\t\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.JSONBOM`\", returnedWriter),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"output format is yaml\", func() {\n\t\t\t\tit(\"returns a YAMLBOM writer\", func() {\n\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\treturnedWriter, err := factory.Writer(\"yaml\", true)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t_, ok := returnedWriter.(*writer.YAMLBOM)\n\t\t\t\t\tassert.TrueWithMessage(\n\t\t\t\t\t\tok,\n\t\t\t\t\t\tfmt.Sprintf(\"expected %T to be assignable to type `*writer.YAMLBOM`\", 
returnedWriter),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"error cases\", func() {\n\t\t\t\twhen(\"output format is toml\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\t\t_, err := factory.Writer(\"toml\", true)\n\t\t\t\t\t\tassert.ErrorWithMessage(err, \"output format 'toml' is not supported\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"output format is not supported\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\tfactory := writer.NewFactory()\n\n\t\t\t\t\t\t_, err := factory.Writer(\"mind-BOM\", true)\n\t\t\t\t\t\tassert.ErrorWithMessage(err, \"output format 'mind-BOM' is not supported\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/human_readable.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"strings\"\n\t\"text/tabwriter\"\n\t\"text/template\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\tstrs \"github.com/buildpacks/pack/internal/strings\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype HumanReadable struct{}\n\nfunc NewHumanReadable() *HumanReadable {\n\treturn &HumanReadable{}\n}\n\nfunc (h *HumanReadable) Print(\n\tlogger logging.Logger,\n\tgeneralInfo inspectimage.GeneralInfo,\n\tlocal, remote *client.ImageInfo,\n\tlocalErr, remoteErr error,\n) error {\n\tif local == nil && remote == nil {\n\t\treturn fmt.Errorf(\"unable to find image '%s' locally or remotely\", generalInfo.Name)\n\t}\n\n\tlogger.Infof(\"Inspecting image: %s\\n\", style.Symbol(generalInfo.Name))\n\n\tif err := writeRemoteImageInfo(logger, generalInfo, remote, remoteErr); err != nil {\n\t\treturn err\n\t}\n\n\tif err := writeLocalImageInfo(logger, generalInfo, local, localErr); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\nfunc writeLocalImageInfo(\n\tlogger logging.Logger,\n\tgeneralInfo inspectimage.GeneralInfo,\n\tlocal *client.ImageInfo,\n\tlocalErr error) error {\n\tlogger.Info(\"\\nLOCAL:\\n\")\n\n\tif localErr != nil {\n\t\tlogger.Errorf(\"%s\\n\", localErr)\n\t\treturn nil\n\t}\n\n\tlocalDisplay := inspectimage.NewInfoDisplay(local, generalInfo)\n\tif localDisplay == nil {\n\t\tlogger.Info(\"(not present)\\n\")\n\t\treturn nil\n\t}\n\n\terr := writeImageInfo(logger, localDisplay)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"writing local builder info: %w\", err)\n\t}\n\n\treturn nil\n}\n\nfunc writeRemoteImageInfo(\n\tlogger logging.Logger,\n\tgeneralInfo inspectimage.GeneralInfo,\n\tremote *client.ImageInfo,\n\tremoteErr error) error {\n\tlogger.Info(\"\\nREMOTE:\\n\")\n\n\tif remoteErr != nil {\n\t\tlogger.Errorf(\"%s\\n\", remoteErr)\n\t\treturn nil\n\t}\n\n\tremoteDisplay 
:= inspectimage.NewInfoDisplay(remote, generalInfo)\n\tif remoteDisplay == nil {\n\t\tlogger.Info(\"(not present)\\n\")\n\t\treturn nil\n\t}\n\n\terr := writeImageInfo(logger, remoteDisplay)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"writing remote builder info: %w\", err)\n\t}\n\n\treturn nil\n}\n\nfunc writeImageInfo(\n\tlogger logging.Logger,\n\tinfo *inspectimage.InfoDisplay,\n) error {\n\timgTpl := getImageTemplate(info)\n\tremoteOutput, err := getInspectImageOutput(imgTpl, info)\n\tif err != nil {\n\t\tlogger.Error(err.Error())\n\t\treturn err\n\t} else {\n\t\tlogger.Info(remoteOutput.String())\n\t\treturn nil\n\t}\n}\n\nfunc getImageTemplate(info *inspectimage.InfoDisplay) *template.Template {\n\timgTpl := template.Must(template.New(\"runImages\").\n\t\tFuncs(template.FuncMap{\"StringsJoin\": strings.Join}).\n\t\tFuncs(template.FuncMap{\"StringsValueOrDefault\": strs.ValueOrDefault}).\n\t\tParse(runImagesTemplate))\n\timgTpl = template.Must(imgTpl.New(\"buildpacks\").Parse(buildpacksTemplate))\n\n\timgTpl = template.Must(imgTpl.New(\"processes\").Parse(processesTemplate))\n\n\timgTpl = template.Must(imgTpl.New(\"rebasable\").Parse(rebasableTemplate))\n\n\tif info != nil && info.Extensions != nil {\n\t\timgTpl = template.Must(imgTpl.New(\"extensions\").Parse(extensionsTemplate))\n\t\timgTpl = template.Must(imgTpl.New(\"image\").Parse(imageWithExtensionTemplate))\n\t} else {\n\t\timgTpl = template.Must(imgTpl.New(\"image\").Parse(imageTemplate))\n\t}\n\treturn imgTpl\n}\n\nfunc getInspectImageOutput(\n\ttpl *template.Template,\n\tinfo *inspectimage.InfoDisplay) (*bytes.Buffer, error) {\n\tif info == nil {\n\t\treturn bytes.NewBuffer([]byte(\"(not present)\")), nil\n\t}\n\tbuf := bytes.NewBuffer(nil)\n\ttw := tabwriter.NewWriter(buf, 0, 0, 8, ' ', 0)\n\tdefer func() {\n\t\ttw.Flush()\n\t}()\n\tif err := tpl.Execute(tw, &struct {\n\t\tInfo *inspectimage.InfoDisplay\n\t}{\n\t\tinfo,\n\t}); err != nil {\n\t\treturn bytes.NewBuffer(nil), err\n\t}\n\treturn buf, 
nil\n}\n\nvar runImagesTemplate = `\nRun Images:\n{{- range $_, $m := .Info.RunImageMirrors }}\n  {{- if $m.UserConfigured }}\n  {{$m.Name}}\t(user-configured)\n  {{- else }}\n  {{$m.Name}}\n  {{- end }}  \n{{- end }}\n{{- if not .Info.RunImageMirrors }}\n  (none)\n{{- end }}`\n\nvar buildpacksTemplate = `\nBuildpacks:\n{{- if .Info.Buildpacks }}\n  ID\tVERSION\tHOMEPAGE\n{{- range $_, $b := .Info.Buildpacks }}\n  {{ $b.ID }}\t{{ $b.Version }}\t{{ StringsValueOrDefault $b.Homepage \"-\" }}\n{{- end }}\n{{- else }}\n  (buildpack metadata not present)\n{{- end }}`\n\nvar extensionsTemplate = `\nExtensions:\n{{- if .Info.Extensions }}\n  ID\tVERSION\tHOMEPAGE\n{{- range $_, $b := .Info.Extensions }}\n  {{ $b.ID }}\t{{ $b.Version }}\t{{ StringsValueOrDefault $b.Homepage \"-\" }}\n{{- end }}\n{{- else }}\n  (extension metadata not present)\n{{- end }}`\n\nvar processesTemplate = `\n{{- if .Info.Processes }}\n\nProcesses:\n  TYPE\tSHELL\tCOMMAND\tARGS\tWORK DIR\n  {{- range $_, $p := .Info.Processes }}\n    {{- if $p.Default }}\n  {{ (printf \"%s %s\" $p.Type \"(default)\") }}\t{{ $p.Shell }}\t{{ $p.Command }}\t{{ StringsJoin $p.Args \" \"  }}\t{{ $p.WorkDir }}\n    {{- else }}\n  {{ $p.Type }}\t{{ $p.Shell }}\t{{ $p.Command }}\t{{ StringsJoin $p.Args \" \" }}\t{{ $p.WorkDir }}\n    {{- end }}\n  {{- end }}\n{{- end }}`\n\nvar rebasableTemplate = `\n\nRebasable: \n{{- if or .Info.Rebasable (eq .Info.Rebasable true)  }} true \n{{- else }} false \n{{- end }}`\n\nvar imageTemplate = `\nStack: {{ .Info.StackID }}\n\nBase Image:\n{{- if .Info.Base.Reference}}\n  Reference: {{ .Info.Base.Reference }}\n{{- end}}\n  Top Layer: {{ .Info.Base.TopLayer }}\n{{ template \"runImages\" . }}\n{{- template \"rebasable\" . }}\n{{ template \"buildpacks\" . }}{{ template \"processes\" . 
}}`\n\nvar imageWithExtensionTemplate = `\nStack: {{ .Info.StackID }}\n\nBase Image:\n{{- if .Info.Base.Reference}}\n  Reference: {{ .Info.Base.Reference }}\n{{- end}}\n  Top Layer: {{ .Info.Base.TopLayer }}\n{{ template \"runImages\" . }}\n{{- template \"rebasable\" . }}\n{{ template \"buildpacks\" . }}\n{{ template \"extensions\" . -}}\n{{ template \"processes\" . }}`\n"
  },
  {
    "path": "internal/inspectimage/writer/human_readable_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/buildpacks/lifecycle/launch\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestHumanReadable(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Human Readable Writer\", testHumanReadable, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testHumanReadable(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo              *client.ImageInfo\n\t\tremoteWithExtensionInfo *client.ImageInfo\n\t\tremoteInfoNoRebasable   *client.ImageInfo\n\n\t\tlocalInfo              *client.ImageInfo\n\t\tlocalWithExtensionInfo *client.ImageInfo\n\t\tlocalInfoNoRebasable   *client.ImageInfo\n\n\t\texpectedRemoteOutput = `REMOTE:\n\nStack: test.stack.id.remote\n\nBase Image:\n  Reference: some-remote-run-image-reference\n  Top Layer: some-remote-top-layer\n\nRun Images:\n  user-configured-mirror-for-remote        (user-configured)\n  some-remote-run-image\n  some-remote-mirror\n  other-remote-mirror\n\nRebasable: true\n\nBuildpacks:\n  ID                          VERSION        HOMEPAGE\n  test.bp.one.remote          1.0.0          https://some-homepage-one\n  test.bp.two.remote          2.0.0          https://some-homepage-two\n  test.bp.three.remote        3.0.0          -\n\nProcesses:\n  TYPE                              SHELL        COMMAND                      ARGS           
          WORK DIR\n  some-remote-type (default)        bash         /some/remote command         some remote args         /some-test-work-dir\n  other-remote-type                              /other/remote/command        other remote args        /other-test-work-dir`\n\t\texpectedRemoteNoRebasableOutput = `REMOTE:\n\nStack: test.stack.id.remote\n\nBase Image:\n  Reference: some-remote-run-image-reference\n  Top Layer: some-remote-top-layer\n\nRun Images:\n  user-configured-mirror-for-remote        (user-configured)\n  some-remote-run-image\n  some-remote-mirror\n  other-remote-mirror\n\nRebasable: false\n\nBuildpacks:\n  ID                          VERSION        HOMEPAGE\n  test.bp.one.remote          1.0.0          https://some-homepage-one\n  test.bp.two.remote          2.0.0          https://some-homepage-two\n  test.bp.three.remote        3.0.0          -\n\nProcesses:\n  TYPE                              SHELL        COMMAND                      ARGS                     WORK DIR\n  some-remote-type (default)        bash         /some/remote command         some remote args         /some-test-work-dir\n  other-remote-type                              /other/remote/command        other remote args        /other-test-work-dir`\n\n\t\texpectedRemoteWithExtensionOutput = `REMOTE:\n\nStack: test.stack.id.remote\n\nBase Image:\n  Reference: some-remote-run-image-reference\n  Top Layer: some-remote-top-layer\n\nRun Images:\n  user-configured-mirror-for-remote        (user-configured)\n  some-remote-run-image\n  some-remote-mirror\n  other-remote-mirror\n\nRebasable: true\n\nBuildpacks:\n  ID                          VERSION        HOMEPAGE\n  test.bp.one.remote          1.0.0          https://some-homepage-one\n  test.bp.two.remote          2.0.0          https://some-homepage-two\n  test.bp.three.remote        3.0.0          -\n\nExtensions:\n  ID                          VERSION        HOMEPAGE\n  test.bp.one.remote          1.0.0          
https://some-homepage-one\n  test.bp.two.remote          2.0.0          https://some-homepage-two\n  test.bp.three.remote        3.0.0          -\n\nProcesses:\n  TYPE                              SHELL        COMMAND                      ARGS                     WORK DIR\n  some-remote-type (default)        bash         /some/remote command         some remote args         /some-test-work-dir\n  other-remote-type                              /other/remote/command        other remote args        /other-test-work-dir`\n\n\t\texpectedLocalOutput = `LOCAL:\n\nStack: test.stack.id.local\n\nBase Image:\n  Reference: some-local-run-image-reference\n  Top Layer: some-local-top-layer\n\nRun Images:\n  user-configured-mirror-for-local        (user-configured)\n  some-local-run-image\n  some-local-mirror\n  other-local-mirror\n\nRebasable: true\n\nBuildpacks:\n  ID                         VERSION        HOMEPAGE\n  test.bp.one.local          1.0.0          https://some-homepage-one\n  test.bp.two.local          2.0.0          https://some-homepage-two\n  test.bp.three.local        3.0.0          -\n\nProcesses:\n  TYPE                             SHELL        COMMAND                     ARGS                    WORK DIR\n  some-local-type (default)        bash         /some/local command         some local args         /some-test-work-dir\n  other-local-type                              /other/local/command        other local args        /other-test-work-dir`\n\t\texpectedLocalNoRebasableOutput = `LOCAL:\n\nStack: test.stack.id.local\n\nBase Image:\n  Reference: some-local-run-image-reference\n  Top Layer: some-local-top-layer\n\nRun Images:\n  user-configured-mirror-for-local        (user-configured)\n  some-local-run-image\n  some-local-mirror\n  other-local-mirror\n\nRebasable: false\n\nBuildpacks:\n  ID                         VERSION        HOMEPAGE\n  test.bp.one.local          1.0.0          https://some-homepage-one\n  test.bp.two.local          2.0.0          
https://some-homepage-two\n  test.bp.three.local        3.0.0          -\n\nProcesses:\n  TYPE                             SHELL        COMMAND                     ARGS                    WORK DIR\n  some-local-type (default)        bash         /some/local command         some local args         /some-test-work-dir\n  other-local-type                              /other/local/command        other local args        /other-test-work-dir`\n\n\t\texpectedLocalWithExtensionOutput = `LOCAL:\n\nStack: test.stack.id.local\n\nBase Image:\n  Reference: some-local-run-image-reference\n  Top Layer: some-local-top-layer\n\nRun Images:\n  user-configured-mirror-for-local        (user-configured)\n  some-local-run-image\n  some-local-mirror\n  other-local-mirror\n\nRebasable: true\n\nBuildpacks:\n  ID                         VERSION        HOMEPAGE\n  test.bp.one.local          1.0.0          https://some-homepage-one\n  test.bp.two.local          2.0.0          https://some-homepage-two\n  test.bp.three.local        3.0.0          -\n\nExtensions:\n  ID                         VERSION        HOMEPAGE\n  test.bp.one.local          1.0.0          https://some-homepage-one\n  test.bp.two.local          2.0.0          https://some-homepage-two\n  test.bp.three.local        3.0.0          -\n\nProcesses:\n  TYPE                             SHELL        COMMAND                     ARGS                    WORK DIR\n  some-local-type (default)        bash         /some/local command         some local args         /some-test-work-dir\n  other-local-type                              /other/local/command        other local args        /other-test-work-dir`\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\tremoteInfo = getRemoteBasicImageInfo(t)\n\t\t\tremoteWithExtensionInfo = getRemoteImageInfoWithExtension(t)\n\t\t\tremoteInfoNoRebasable = getRemoteImageInfoNoRebasable(t)\n\n\t\t\tlocalInfo = getBasicLocalImageInfo(t)\n\t\t\tlocalWithExtensionInfo = 
getLocalImageInfoWithExtension(t)\n\t\t\tlocalInfoNoRebasable = getLocalImageInfoNoRebasable(t)\n\n\t\t\toutBuf = bytes.Buffer{}\n\t\t})\n\n\t\twhen(\"local and remote image exists\", func() {\n\t\t\tit(\"prints both local and remote image info in a human readable format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localInfo, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"localWithExtension and remoteWithExtension image exists\", func() {\n\t\t\tit(\"prints both localWithExtension and remoteWithExtension image info in a human readable format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := 
inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localWithExtensionInfo, remoteWithExtensionInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedLocalWithExtensionOutput)\n\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteWithExtensionOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only local image exists\", func() {\n\t\t\tit(\"prints local image info in a human readable format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localInfo, nil, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.NotContains(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t\tit(\"prints local no rebasable image info in a human readable format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   
\"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localInfoNoRebasable, nil, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedLocalNoRebasableOutput)\n\t\t\t\tassert.NotContains(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only localWithExtension image exists\", func() {\n\t\t\tit(\"prints localWithExtension image info in a human readable format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localWithExtensionInfo, nil, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Contains(outBuf.String(), expectedLocalWithExtensionOutput)\n\t\t\t\tassert.NotContains(outBuf.String(), 
expectedRemoteWithExtensionOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only remote image exists\", func() {\n\t\t\tit(\"prints remote image info in a human readable format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, nil, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t\tit(\"prints remote no rebasable image info in a human readable format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := 
logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, nil, remoteInfoNoRebasable, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteNoRebasableOutput)\n\t\t\t})\n\n\t\t\twhen(\"buildpack metadata is missing\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tremoteInfo.Buildpacks = []buildpack.GroupElement{}\n\t\t\t\t})\n\t\t\t\tit(\"displays a message indicating missing metadata\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\n\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, nil, remoteInfo, nil, nil)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"(buildpack metadata not present)\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"there are no run images\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tremoteInfo.Stack = files.Stack{}\n\t\t\t\t})\n\t\t\t\tit(\"displays a message indicating missing run images\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\n\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, nil, remoteInfo, nil, nil)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"Run Images:\\n  (none)\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only remoteWithExtension image exists\", func() {\n\t\t\tit(\"prints remoteWithExtension image info in a human readable format\", func() {\n\t\t\t\trunImageMirrors := 
[]config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, nil, remoteWithExtensionInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.NotContains(outBuf.String(), expectedLocalWithExtensionOutput)\n\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteWithExtensionOutput)\n\t\t\t})\n\n\t\t\twhen(\"buildpack metadata is missing\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tremoteWithExtensionInfo.Buildpacks = []buildpack.GroupElement{}\n\t\t\t\t})\n\t\t\t\tit(\"displays a message indicating missing metadata\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\n\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, nil, remoteWithExtensionInfo, nil, nil)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"(buildpack metadata not present)\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"there are no run images\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tremoteWithExtensionInfo.Stack = files.Stack{}\n\t\t\t\t})\n\t\t\t\tit(\"displays a message indicating missing 
run images\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\n\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, nil, remoteWithExtensionInfo, nil, nil)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Contains(outBuf.String(), \"Run Images:\\n  (none)\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"error handled cases\", func() {\n\t\t\twhen(\"there is a remoteErr\", func() {\n\t\t\t\tvar remoteErr error\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tremoteErr = errors.New(\"some remote error\")\n\t\t\t\t})\n\t\t\t\tit(\"displays the remote error and local info\", func() {\n\t\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t\t}\n\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localInfo, remoteInfo, nil, remoteErr)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Contains(outBuf.String(), expectedLocalOutput)\n\t\t\t\t\tassert.Contains(outBuf.String(), \"some remote error\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"there is a localErr\", func() {\n\t\t\t\tvar localErr 
error\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tlocalErr = errors.New(\"some local error\")\n\t\t\t\t})\n\t\t\t\tit(\"displays the remote info and local error\", func() {\n\t\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t\t}\n\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localInfo, remoteInfo, localErr, nil)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteOutput)\n\t\t\t\t\tassert.Contains(outBuf.String(), \"some local error\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"error handled cases with extensions\", func() {\n\t\t\t\twhen(\"there is a remoteErr\", func() {\n\t\t\t\t\tvar remoteErr error\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tremoteErr = errors.New(\"some remote error\")\n\t\t\t\t\t})\n\t\t\t\t\tit(\"displays the remote error and local info\", func() {\n\t\t\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\t\t\tMirrors: 
[]string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\t\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t\t\t}\n\t\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localWithExtensionInfo, remoteWithExtensionInfo, nil, remoteErr)\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\tassert.Contains(outBuf.String(), expectedLocalWithExtensionOutput)\n\t\t\t\t\t\tassert.Contains(outBuf.String(), \"some remote error\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"there is a localErr\", func() {\n\t\t\t\t\tvar localErr error\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tlocalErr = errors.New(\"some local error\")\n\t\t\t\t\t})\n\t\t\t\t\tit(\"displays the remote info and local error\", func() {\n\t\t\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\t\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t\t\t}\n\t\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\t\terr := humanReadableWriter.Print(logger, sharedImageInfo, localWithExtensionInfo, remoteWithExtensionInfo, localErr, 
nil)\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\tassert.Contains(outBuf.String(), expectedRemoteWithExtensionOutput)\n\t\t\t\t\t\tassert.Contains(outBuf.String(), \"some local error\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"error cases\", func() {\n\t\t\t\twhen(\"both localInfo and remoteInfo are nil\", func() {\n\t\t\t\t\tit(\"displays a 'missing image' error message\", func() {\n\t\t\t\t\t\thumanReadableWriter := writer.NewHumanReadable()\n\n\t\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\t\terr := humanReadableWriter.Print(logger, inspectimage.GeneralInfo{Name: \"missing-image\"}, nil, nil, nil, nil)\n\t\t\t\t\t\tassert.ErrorWithMessage(err, fmt.Sprintf(\"unable to find image '%s' locally or remotely\", \"missing-image\"))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc getRemoteBasicImageInfo(t testing.TB) *client.ImageInfo {\n\tt.Helper()\n\treturn getRemoteImageInfo(t, false, true)\n}\nfunc getRemoteImageInfoWithExtension(t testing.TB) *client.ImageInfo {\n\tt.Helper()\n\treturn getRemoteImageInfo(t, true, true)\n}\n\nfunc getRemoteImageInfoNoRebasable(t testing.TB) *client.ImageInfo {\n\tt.Helper()\n\treturn getRemoteImageInfo(t, false, false)\n}\n\nfunc getRemoteImageInfo(t testing.TB, extension bool, rebasable bool) *client.ImageInfo {\n\tt.Helper()\n\n\tmockedStackID := \"test.stack.id.remote\"\n\n\tmockedBuildpacks := []buildpack.GroupElement{\n\t\t{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t{ID: \"test.bp.two.remote\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t{ID: \"test.bp.three.remote\", Version: \"3.0.0\"},\n\t}\n\n\tmockedBase := files.RunImageForRebase{\n\t\tTopLayer:  \"some-remote-top-layer\",\n\t\tReference: \"some-remote-run-image-reference\",\n\t}\n\n\tmockedStack := files.Stack{\n\t\tRunImage: files.RunImageForExport{\n\t\t\tImage:   \"some-remote-run-image\",\n\t\t\tMirrors: []string{\"some-remote-mirror\", 
\"other-remote-mirror\"},\n\t\t},\n\t}\n\n\ttype someData struct {\n\t\tString string\n\t\tBool   bool\n\t\tInt    int\n\t\tNested struct {\n\t\t\tString string\n\t\t}\n\t}\n\tmockedMetadata := map[string]interface{}{\n\t\t\"RemoteData\": someData{\n\t\t\tString: \"aString\",\n\t\t\tBool:   true,\n\t\t\tInt:    123,\n\t\t\tNested: struct {\n\t\t\t\tString string\n\t\t\t}{\n\t\t\t\tString: \"anotherString\",\n\t\t\t},\n\t\t},\n\t}\n\n\tmockedBOM := []buildpack.BOMEntry{{\n\t\tRequire: buildpack.Require{\n\t\t\tName:     \"name-1\",\n\t\t\tMetadata: mockedMetadata,\n\t\t},\n\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\"},\n\t}}\n\n\tmockedProcesses := client.ProcessDetails{\n\t\tDefaultProcess: &launch.Process{\n\t\t\tType:             \"some-remote-type\",\n\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/remote command\"}},\n\t\t\tArgs:             []string{\"some\", \"remote\", \"args\"},\n\t\t\tDirect:           false,\n\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t},\n\t\tOtherProcesses: []launch.Process{\n\t\t\t{\n\t\t\t\tType:             \"other-remote-type\",\n\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/remote/command\"}},\n\t\t\t\tArgs:             []string{\"other\", \"remote\", \"args\"},\n\t\t\t\tDirect:           true,\n\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t},\n\t\t},\n\t}\n\n\tmockedExtension := []buildpack.GroupElement{\n\t\t{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t{ID: \"test.bp.two.remote\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t{ID: \"test.bp.three.remote\", Version: \"3.0.0\"},\n\t}\n\n\timageInfo := &client.ImageInfo{\n\t\tStackID:    mockedStackID,\n\t\tBuildpacks: mockedBuildpacks,\n\t\tBase:       mockedBase,\n\t\tStack:      mockedStack,\n\t\tBOM:        mockedBOM,\n\t\tProcesses:  mockedProcesses,\n\t\tRebasable:  rebasable,\n\t}\n\n\tif extension 
{\n\t\timageInfo.Extensions = mockedExtension\n\t}\n\n\treturn imageInfo\n}\n\nfunc getBasicLocalImageInfo(t testing.TB) *client.ImageInfo {\n\tt.Helper()\n\treturn getLocalImageInfo(t, false, true)\n}\n\nfunc getLocalImageInfoWithExtension(t testing.TB) *client.ImageInfo {\n\tt.Helper()\n\treturn getLocalImageInfo(t, true, true)\n}\n\nfunc getLocalImageInfoNoRebasable(t testing.TB) *client.ImageInfo {\n\tt.Helper()\n\treturn getLocalImageInfo(t, false, false)\n}\n\nfunc getLocalImageInfo(t testing.TB, extension bool, rebasable bool) *client.ImageInfo {\n\tt.Helper()\n\n\tmockedStackID := \"test.stack.id.local\"\n\n\tmockedBuildpacks := []buildpack.GroupElement{\n\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t{ID: \"test.bp.two.local\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t{ID: \"test.bp.three.local\", Version: \"3.0.0\"},\n\t}\n\n\tmockedBase := files.RunImageForRebase{\n\t\tTopLayer:  \"some-local-top-layer\",\n\t\tReference: \"some-local-run-image-reference\",\n\t}\n\n\tmockedPlatform := files.Stack{\n\t\tRunImage: files.RunImageForExport{\n\t\t\tImage:   \"some-local-run-image\",\n\t\t\tMirrors: []string{\"some-local-mirror\", \"other-local-mirror\"},\n\t\t},\n\t}\n\n\ttype someData struct {\n\t\tString string\n\t\tBool   bool\n\t\tInt    int\n\t\tNested struct {\n\t\t\tString string\n\t\t}\n\t}\n\tmockedMetadata := map[string]interface{}{\n\t\t\"LocalData\": someData{\n\t\t\tBool: false,\n\t\t\tInt:  456,\n\t\t},\n\t}\n\n\tmockedBOM := []buildpack.BOMEntry{{\n\t\tRequire: buildpack.Require{\n\t\t\tName:     \"name-1\",\n\t\t\tVersion:  \"version-1\",\n\t\t\tMetadata: mockedMetadata,\n\t\t},\n\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\"},\n\t}}\n\n\tmockedProcesses := client.ProcessDetails{\n\t\tDefaultProcess: &launch.Process{\n\t\t\tType:             \"some-local-type\",\n\t\t\tCommand:          launch.RawCommand{Entries: 
[]string{\"/some/local command\"}},\n\t\t\tArgs:             []string{\"some\", \"local\", \"args\"},\n\t\t\tDirect:           false,\n\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t},\n\t\tOtherProcesses: []launch.Process{\n\t\t\t{\n\t\t\t\tType:             \"other-local-type\",\n\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/local/command\"}},\n\t\t\t\tArgs:             []string{\"other\", \"local\", \"args\"},\n\t\t\t\tDirect:           true,\n\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t},\n\t\t},\n\t}\n\n\tmockedExtension := []buildpack.GroupElement{\n\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t{ID: \"test.bp.two.local\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t{ID: \"test.bp.three.local\", Version: \"3.0.0\"},\n\t}\n\n\timageInfo := &client.ImageInfo{\n\t\tStackID:    mockedStackID,\n\t\tBuildpacks: mockedBuildpacks,\n\t\tBase:       mockedBase,\n\t\tStack:      mockedPlatform,\n\t\tBOM:        mockedBOM,\n\t\tProcesses:  mockedProcesses,\n\t\tRebasable:  rebasable,\n\t}\n\n\tif extension {\n\t\timageInfo.Extensions = mockedExtension\n\t}\n\n\treturn imageInfo\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/json.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\t\"encoding/json\"\n)\n\ntype JSON struct {\n\tStructuredFormat\n}\n\nfunc NewJSON() *JSON {\n\treturn &JSON{\n\t\tStructuredFormat: StructuredFormat{\n\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\tbuf := bytes.NewBuffer(nil)\n\t\t\t\tif err := json.NewEncoder(buf).Encode(i); err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\n\t\t\t\tformattedBuf := bytes.NewBuffer(nil)\n\t\t\t\tif err := json.Indent(formattedBuf, buf.Bytes(), \"\", \"  \"); err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\t\t\t\treturn formattedBuf.Bytes(), nil\n\t\t\t},\n\t\t},\n\t}\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/json_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/buildpacks/lifecycle/launch\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestJSON(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"JSON Writer\", testJSON, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testJSON(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo            *client.ImageInfo\n\t\tremoteInfoNoRebasable *client.ImageInfo\n\t\tlocalInfo             *client.ImageInfo\n\t\tlocalInfoNoRebasable  *client.ImageInfo\n\n\t\texpectedLocalOutput = `{\n  \"local_info\": {\n    \"stack\": \"test.stack.id.local\",\n    \"rebasable\": true,\n    \"base_image\": {\n      \"top_layer\": \"some-local-top-layer\",\n      \"reference\": \"some-local-run-image-reference\"\n    },\n    \"run_images\": [\n      {\n        \"name\": \"user-configured-mirror-for-local\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"some-local-run-image\"\n      },\n      {\n        \"name\": \"some-local-mirror\"\n      },\n      {\n        \"name\": \"other-local-mirror\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"homepage\": \"https://some-homepage-one\",\n        \"id\": \"test.bp.one.local\",\n        \"version\": \"1.0.0\"\n      },\n      {\n        \"homepage\": \"https://some-homepage-two\",\n        \"id\": \"test.bp.two.local\",\n        
\"version\": \"2.0.0\"\n      }\n    ],\n\t\"extensions\": null,\n    \"processes\": [\n      {\n        \"type\": \"some-local-type\",\n        \"shell\": \"bash\",\n        \"command\": \"/some/local command\",\n        \"default\": true,\n        \"args\": [\n          \"some\",\n          \"local\",\n          \"args\"\n        ],\n\t\t\"working-dir\": \"/some-test-work-dir\"\n      },\n      {\n        \"type\": \"other-local-type\",\n        \"shell\": \"\",\n        \"command\": \"/other/local/command\",\n        \"default\": false,\n        \"args\": [\n          \"other\",\n          \"local\",\n          \"args\"\n        ],\n\t\t\"working-dir\": \"/other-test-work-dir\"\n      }\n    ]\n  }\n}`\n\t\texpectedLocalNoRebasableOutput = `{\n  \"local_info\": {\n    \"stack\": \"test.stack.id.local\",\n    \"rebasable\": false,\n    \"base_image\": {\n      \"top_layer\": \"some-local-top-layer\",\n      \"reference\": \"some-local-run-image-reference\"\n    },\n    \"run_images\": [\n      {\n        \"name\": \"user-configured-mirror-for-local\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"some-local-run-image\"\n      },\n      {\n        \"name\": \"some-local-mirror\"\n      },\n      {\n        \"name\": \"other-local-mirror\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"homepage\": \"https://some-homepage-one\",\n        \"id\": \"test.bp.one.local\",\n        \"version\": \"1.0.0\"\n      },\n      {\n        \"homepage\": \"https://some-homepage-two\",\n        \"id\": \"test.bp.two.local\",\n        \"version\": \"2.0.0\"\n      }\n    ],\n\t\"extensions\": null,\n    \"processes\": [\n      {\n        \"type\": \"some-local-type\",\n        \"shell\": \"bash\",\n        \"command\": \"/some/local command\",\n        \"default\": true,\n        \"args\": [\n          \"some\",\n          \"local\",\n          \"args\"\n        ],\n\t\t\"working-dir\": \"/some-test-work-dir\"\n      },\n      {\n        
\"type\": \"other-local-type\",\n        \"shell\": \"\",\n        \"command\": \"/other/local/command\",\n        \"default\": false,\n        \"args\": [\n          \"other\",\n          \"local\",\n          \"args\"\n        ],\n\t\t\"working-dir\": \"/other-test-work-dir\"\n      }\n    ]\n  }\n}`\n\t\texpectedRemoteOutput = `{  \n  \"remote_info\": {\n    \"stack\": \"test.stack.id.remote\",\n    \"rebasable\": true,\n    \"base_image\": {\n      \"top_layer\": \"some-remote-top-layer\",\n      \"reference\": \"some-remote-run-image-reference\"\n    },\n    \"run_images\": [\n      {\n        \"name\": \"user-configured-mirror-for-remote\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"some-remote-run-image\"\n      },\n      {\n        \"name\": \"some-remote-mirror\"\n      },\n      {\n        \"name\": \"other-remote-mirror\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"id\": \"test.bp.one.remote\",\n        \"version\": \"1.0.0\",\n        \"homepage\": \"https://some-homepage-one\"\n      },\n      {\n        \"id\": \"test.bp.two.remote\",\n        \"version\": \"2.0.0\",\n        \"homepage\": \"https://some-homepage-two\"\n      }\n    ],\n\t\"extensions\": null,\n    \"processes\": [\n      {\n        \"type\": \"some-remote-type\",\n        \"shell\": \"bash\",\n        \"command\": \"/some/remote command\",\n        \"default\": true,\n        \"args\": [\n          \"some\",\n          \"remote\",\n          \"args\"\n        ],\n\t\t\"working-dir\": \"/some-test-work-dir\"\n      },\n      {\n        \"type\": \"other-remote-type\",\n        \"shell\": \"\",\n        \"command\": \"/other/remote/command\",\n        \"default\": false,\n        \"args\": [\n          \"other\",\n          \"remote\",\n          \"args\"\n        ],\n\t\t\"working-dir\": \"/other-test-work-dir\"\n      }\n    ]\n  }\n}`\n\t\texpectedRemoteNoRebasableOutput = `{  \n  \"remote_info\": {\n    \"stack\": 
\"test.stack.id.remote\",\n    \"rebasable\": false,\n    \"base_image\": {\n      \"top_layer\": \"some-remote-top-layer\",\n      \"reference\": \"some-remote-run-image-reference\"\n    },\n    \"run_images\": [\n      {\n        \"name\": \"user-configured-mirror-for-remote\",\n        \"user_configured\": true\n      },\n      {\n        \"name\": \"some-remote-run-image\"\n      },\n      {\n        \"name\": \"some-remote-mirror\"\n      },\n      {\n        \"name\": \"other-remote-mirror\"\n      }\n    ],\n    \"buildpacks\": [\n      {\n        \"id\": \"test.bp.one.remote\",\n        \"version\": \"1.0.0\",\n        \"homepage\": \"https://some-homepage-one\"\n      },\n      {\n        \"id\": \"test.bp.two.remote\",\n        \"version\": \"2.0.0\",\n        \"homepage\": \"https://some-homepage-two\"\n      }\n    ],\n\t\"extensions\": null,\n    \"processes\": [\n      {\n        \"type\": \"some-remote-type\",\n        \"shell\": \"bash\",\n        \"command\": \"/some/remote command\",\n        \"default\": true,\n        \"args\": [\n          \"some\",\n          \"remote\",\n          \"args\"\n        ],\n\t\t\"working-dir\": \"/some-test-work-dir\"\n      },\n      {\n        \"type\": \"other-remote-type\",\n        \"shell\": \"\",\n        \"command\": \"/other/remote/command\",\n        \"default\": false,\n        \"args\": [\n          \"other\",\n          \"remote\",\n          \"args\"\n        ],\n\t\t\"working-dir\": \"/other-test-work-dir\"\n      }\n    ]\n  }\n}`\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\ttype someData struct {\n\t\t\t\tString string\n\t\t\t\tBool   bool\n\t\t\t\tInt    int\n\t\t\t\tNested struct {\n\t\t\t\t\tString string\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tremoteInfo = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.remote\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: 
\"test.bp.two.remote\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-remote-top-layer\",\n\t\t\t\t\tReference: \"some-remote-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-remote-mirror\", \"other-remote-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"RemoteData\": someData{\n\t\t\t\t\t\t\t\tString: \"aString\",\n\t\t\t\t\t\t\t\tBool:   true,\n\t\t\t\t\t\t\t\tInt:    123,\n\t\t\t\t\t\t\t\tNested: struct {\n\t\t\t\t\t\t\t\t\tString string\n\t\t\t\t\t\t\t\t}{\n\t\t\t\t\t\t\t\t\tString: \"anotherString\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-remote-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/remote command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"remote\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-remote-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/remote/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"remote\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: 
\"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: true,\n\t\t\t}\n\t\t\tremoteInfoNoRebasable = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.remote\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.remote\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-remote-top-layer\",\n\t\t\t\t\tReference: \"some-remote-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-remote-mirror\", \"other-remote-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"RemoteData\": someData{\n\t\t\t\t\t\t\t\tString: \"aString\",\n\t\t\t\t\t\t\t\tBool:   true,\n\t\t\t\t\t\t\t\tInt:    123,\n\t\t\t\t\t\t\t\tNested: struct {\n\t\t\t\t\t\t\t\t\tString string\n\t\t\t\t\t\t\t\t}{\n\t\t\t\t\t\t\t\t\tString: \"anotherString\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-remote-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/remote command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"remote\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: 
[]launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-remote-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/remote/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"remote\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: false,\n\t\t\t}\n\n\t\t\tlocalInfo = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.local\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.local\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-local-top-layer\",\n\t\t\t\t\tReference: \"some-local-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-local-mirror\", \"other-local-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"LocalData\": someData{\n\t\t\t\t\t\t\t\tBool: false,\n\t\t\t\t\t\t\t\tInt:  456,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-local-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/local command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"local\", \"args\"},\n\t\t\t\t\t\tDirect:           
false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-local-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/local/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"local\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: true,\n\t\t\t}\n\t\t\tlocalInfoNoRebasable = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.local\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.local\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-local-top-layer\",\n\t\t\t\t\tReference: \"some-local-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-local-mirror\", \"other-local-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"LocalData\": someData{\n\t\t\t\t\t\t\t\tBool: false,\n\t\t\t\t\t\t\t\tInt:  456,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-local-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/local 
command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"local\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-local-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/local/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"local\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: false,\n\t\t\t}\n\n\t\t\toutBuf = bytes.Buffer{}\n\t\t})\n\n\t\twhen(\"local and remote image exists\", func() {\n\t\t\tit(\"prints both local and remote image info in a JSON format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonWriter.Print(logger, sharedImageInfo, localInfo, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(outBuf.String(), `{ \"image_name\": \"test-image\" }`)\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t\tit(\"prints both local and remote non-rebasable image info in a JSON format\", func() 
{\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonWriter.Print(logger, sharedImageInfo, localInfoNoRebasable, remoteInfoNoRebasable, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(outBuf.String(), `{ \"image_name\": \"test-image\" }`)\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedLocalNoRebasableOutput)\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedRemoteNoRebasableOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only local image exists\", func() {\n\t\t\tit(\"prints local image info in JSON format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := 
jsonWriter.Print(logger, sharedImageInfo, localInfo, nil, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(outBuf.String(), `{ \"image_name\": \"test-image\" }`)\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedLocalOutput)\n\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.remote\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only remote image exists\", func() {\n\t\t\tit(\"prints remote image info in JSON format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\tjsonWriter := writer.NewJSON()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := jsonWriter.Print(logger, sharedImageInfo, nil, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsJSON(outBuf.String(), `{ \"image_name\": \"test-image\" }`)\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.local\")\n\t\t\t\tassert.ContainsJSON(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/structured_bom_format.go",
    "content": "package writer\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype StructuredBOMFormat struct {\n\tMarshalFunc func(interface{}) ([]byte, error)\n}\n\nfunc (w *StructuredBOMFormat) Print(\n\tlogger logging.Logger,\n\tgeneralInfo inspectimage.GeneralInfo,\n\tlocal, remote *client.ImageInfo,\n\tlocalErr, remoteErr error,\n) error {\n\tif local == nil && remote == nil {\n\t\treturn fmt.Errorf(\"unable to find image '%s' locally or remotely\", generalInfo.Name)\n\t}\n\tif localErr != nil && remoteErr != nil {\n\t\treturn fmt.Errorf(\"preparing BOM output for %s: local: %s remote: %s\", style.Symbol(generalInfo.Name), localErr, remoteErr)\n\t}\n\tout, err := w.MarshalFunc(inspectimage.BOMDisplay{\n\t\tRemote:    inspectimage.NewBOMDisplay(remote),\n\t\tLocal:     inspectimage.NewBOMDisplay(local),\n\t\tRemoteErr: errorString(remoteErr),\n\t\tLocalErr:  errorString(localErr),\n\t})\n\n\tif err != nil {\n\t\treturn err\n\t}\n\n\t_, err = logger.Writer().Write(out)\n\treturn err\n}\n\nfunc errorString(err error) string {\n\tif err == nil {\n\t\treturn \"\"\n\t}\n\treturn err.Error()\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/structured_bom_format_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestStructuredBOMFormat(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"StructuredBOMFormat Writer\", testStructuredBOMFormat, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testStructuredBOMFormat(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf *bytes.Buffer\n\n\t\tremoteInfo              *client.ImageInfo\n\t\tlocalInfo               *client.ImageInfo\n\t\tremoteWithExtensionInfo *client.ImageInfo\n\t\tlocalWithExtensionInfo  *client.ImageInfo\n\t\tgeneralInfo             inspectimage.GeneralInfo\n\t\tlogger                  *logging.LogWithWriters\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\toutBuf = bytes.NewBuffer(nil)\n\t\t\tlogger = logging.NewLogWithWriters(outBuf, outBuf)\n\t\t\tremoteInfo = &client.ImageInfo{\n\t\t\t\tBOM: []buildpack.BOMEntry{\n\t\t\t\t\t{\n\t\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\t\tName:    \"remote-require\",\n\t\t\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\t\"cool-remote\": \"beans\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tBuildpack: buildpack.GroupElement{\n\t\t\t\t\t\t\tID:      \"remote-buildpack\",\n\t\t\t\t\t\t\tVersion: \"remote-buildpack-version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t\tlocalInfo = 
&client.ImageInfo{\n\t\t\t\tBOM: []buildpack.BOMEntry{\n\t\t\t\t\t{\n\t\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\t\tName:    \"local-require\",\n\t\t\t\t\t\t\tVersion: \"4.5.6\",\n\t\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\t\"cool-local\": \"beans\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tBuildpack: buildpack.GroupElement{\n\t\t\t\t\t\t\tID:      \"local-buildpack\",\n\t\t\t\t\t\t\tVersion: \"local-buildpack-version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tremoteWithExtensionInfo = &client.ImageInfo{\n\t\t\t\tBOM: []buildpack.BOMEntry{\n\t\t\t\t\t{\n\t\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\t\tName:    \"remote-require\",\n\t\t\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\t\"cool-remote\": \"beans\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tBuildpack: buildpack.GroupElement{\n\t\t\t\t\t\t\tID:      \"remote-buildpack\",\n\t\t\t\t\t\t\tVersion: \"remote-buildpack-version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t\tlocalWithExtensionInfo = &client.ImageInfo{\n\t\t\t\tBOM: []buildpack.BOMEntry{\n\t\t\t\t\t{\n\t\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\t\tName:    \"local-require\",\n\t\t\t\t\t\t\tVersion: \"4.5.6\",\n\t\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\t\"cool-local\": \"beans\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tBuildpack: buildpack.GroupElement{\n\t\t\t\t\t\t\tID:      \"local-buildpack\",\n\t\t\t\t\t\t\tVersion: \"local-buildpack-version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tgeneralInfo = inspectimage.GeneralInfo{\n\t\t\t\tName: \"some-image-name\",\n\t\t\t\tRunImageMirrors: []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"first-mirror\", \"second-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t})\n\n\t\twhen(\"structured output\", func() {\n\t\t\tvar (\n\t\t\t\tlocalBomDisplay               
[]inspectimage.BOMEntryDisplay\n\t\t\t\tremoteBomDisplay              []inspectimage.BOMEntryDisplay\n\t\t\t\tlocalBomWithExtensionDisplay  []inspectimage.BOMEntryDisplay\n\t\t\t\tremoteBomWithExtensionDisplay []inspectimage.BOMEntryDisplay\n\t\t\t)\n\t\t\tit.Before(func() {\n\t\t\t\tlocalBomDisplay = []inspectimage.BOMEntryDisplay{{\n\t\t\t\t\tName:    \"local-require\",\n\t\t\t\t\tVersion: \"4.5.6\",\n\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\"cool-local\": \"beans\",\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: dist.ModuleRef{\n\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"local-buildpack\",\n\t\t\t\t\t\t\tVersion: \"local-buildpack-version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}}\n\t\t\t\tremoteBomDisplay = []inspectimage.BOMEntryDisplay{{\n\t\t\t\t\tName:    \"remote-require\",\n\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\"cool-remote\": \"beans\",\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: dist.ModuleRef{\n\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"remote-buildpack\",\n\t\t\t\t\t\t\tVersion: \"remote-buildpack-version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}}\n\n\t\t\t\tlocalBomWithExtensionDisplay = []inspectimage.BOMEntryDisplay{{\n\t\t\t\t\tName:    \"local-require\",\n\t\t\t\t\tVersion: \"4.5.6\",\n\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\"cool-local\": \"beans\",\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: dist.ModuleRef{\n\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"local-buildpack\",\n\t\t\t\t\t\t\tVersion: \"local-buildpack-version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}}\n\t\t\t\tremoteBomWithExtensionDisplay = []inspectimage.BOMEntryDisplay{{\n\t\t\t\t\tName:    \"remote-require\",\n\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\"cool-remote\": \"beans\",\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: dist.ModuleRef{\n\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      
\"remote-buildpack\",\n\t\t\t\t\t\t\tVersion: \"remote-buildpack-version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}}\n\t\t\t})\n\t\t\tit(\"passes correct info to structuredBOMWriter\", func() {\n\t\t\t\tvar marshalInput interface{}\n\n\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\tmarshalInput = i\n\t\t\t\t\t\treturn []byte(\"marshalled\"), nil\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, localInfo, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(marshalInput, inspectimage.BOMDisplay{\n\t\t\t\t\tRemote: remoteBomDisplay,\n\t\t\t\t\tLocal:  localBomDisplay,\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"passes correct info to structuredBOMWriter\", func() {\n\t\t\t\tvar marshalInput interface{}\n\n\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\tmarshalInput = i\n\t\t\t\t\t\treturn []byte(\"marshalled\"), nil\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, localWithExtensionInfo, remoteWithExtensionInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.Equal(marshalInput, inspectimage.BOMDisplay{\n\t\t\t\t\tRemote: remoteBomWithExtensionDisplay,\n\t\t\t\t\tLocal:  localBomWithExtensionDisplay,\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"a localErr is passed to Print\", func() {\n\t\t\t\tit(\"still marshals remote information\", func() {\n\t\t\t\t\tvar marshalInput interface{}\n\n\t\t\t\t\tlocalErr := errors.New(\"a local error occurred\")\n\t\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\t\tmarshalInput = i\n\t\t\t\t\t\t\treturn []byte(\"marshalled\"), nil\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\n\t\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, nil, remoteInfo, localErr, 
nil)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Equal(marshalInput, inspectimage.BOMDisplay{\n\t\t\t\t\t\tRemote:   remoteBomDisplay,\n\t\t\t\t\t\tLocal:    nil,\n\t\t\t\t\t\tLocalErr: localErr.Error(),\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"a localErr is passed to Print\", func() {\n\t\t\t\tit(\"still marshals remote information\", func() {\n\t\t\t\t\tvar marshalInput interface{}\n\n\t\t\t\t\tlocalErr := errors.New(\"a local error occurred\")\n\t\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\t\tmarshalInput = i\n\t\t\t\t\t\t\treturn []byte(\"marshalled\"), nil\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\n\t\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, nil, remoteWithExtensionInfo, localErr, nil)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Equal(marshalInput, inspectimage.BOMDisplay{\n\t\t\t\t\t\tRemote:   remoteBomWithExtensionDisplay,\n\t\t\t\t\t\tLocal:    nil,\n\t\t\t\t\t\tLocalErr: localErr.Error(),\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"a remoteErr is passed to Print\", func() {\n\t\t\t\tit(\"still marshals local information\", func() {\n\t\t\t\t\tvar marshalInput interface{}\n\n\t\t\t\t\tremoteErr := errors.New(\"a remote error occurred\")\n\t\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\t\tmarshalInput = i\n\t\t\t\t\t\t\treturn []byte(\"marshalled\"), nil\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\n\t\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, localInfo, nil, nil, remoteErr)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Equal(marshalInput, inspectimage.BOMDisplay{\n\t\t\t\t\t\tRemote:    nil,\n\t\t\t\t\t\tLocal:     localBomDisplay,\n\t\t\t\t\t\tRemoteErr: remoteErr.Error(),\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"a remoteErr is passed to Print\", func() {\n\t\t\t\tit(\"still marshals local information\", func() {\n\t\t\t\t\tvar 
marshalInput interface{}\n\n\t\t\t\t\tremoteErr := errors.New(\"a remote error occurred\")\n\t\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\t\tmarshalInput = i\n\t\t\t\t\t\t\treturn []byte(\"marshalled\"), nil\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\n\t\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, localWithExtensionInfo, nil, nil, remoteErr)\n\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\tassert.Equal(marshalInput, inspectimage.BOMDisplay{\n\t\t\t\t\t\tRemote:    nil,\n\t\t\t\t\t\tLocal:     localBomWithExtensionDisplay,\n\t\t\t\t\t\tRemoteErr: remoteErr.Error(),\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\t// Just test error cases; all error-free cases are tested in the JSON, TOML, and YAML subclasses.\n\t\twhen(\"failure cases\", func() {\n\t\t\twhen(\"both info objects are nil\", func() {\n\t\t\t\tit(\"displays a 'missing image' error message\", func() {\n\t\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\t\tMarshalFunc: testMarshalFunc,\n\t\t\t\t\t}\n\n\t\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, nil, nil, nil, nil)\n\t\t\t\t\tassert.ErrorWithMessage(err, fmt.Sprintf(\"unable to find image '%s' locally or remotely\", \"some-image-name\"))\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"fetching local and remote info errors\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\t\treturn []byte(\"cool\"), nil\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\t\t\t\t\tremoteErr := errors.New(\"a remote error occurred\")\n\t\t\t\t\tlocalErr := errors.New(\"a local error occurred\")\n\n\t\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, localInfo, remoteInfo, localErr, remoteErr)\n\t\t\t\t\tassert.ErrorContains(err, remoteErr.Error())\n\t\t\t\t\tassert.ErrorContains(err, 
localErr.Error())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"fetching local and remote info errors with extensions\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tstructuredBOMWriter := writer.StructuredBOMFormat{\n\t\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\t\treturn []byte(\"cool\"), nil\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\t\t\t\t\tremoteErr := errors.New(\"a remote error occurred\")\n\t\t\t\t\tlocalErr := errors.New(\"a local error occurred\")\n\n\t\t\t\t\terr := structuredBOMWriter.Print(logger, generalInfo, localWithExtensionInfo, remoteWithExtensionInfo, localErr, remoteErr)\n\t\t\t\t\tassert.ErrorContains(err, remoteErr.Error())\n\t\t\t\t\tassert.ErrorContains(err, localErr.Error())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/structured_format.go",
    "content": "package writer\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype StructuredFormat struct {\n\tMarshalFunc func(interface{}) ([]byte, error)\n}\n\nfunc (w *StructuredFormat) Print(\n\tlogger logging.Logger,\n\tgeneralInfo inspectimage.GeneralInfo,\n\tlocal, remote *client.ImageInfo,\n\tlocalErr, remoteErr error,\n) error {\n\t// Validate the inputs, then build the display objects and marshal them.\n\tif local == nil && remote == nil {\n\t\treturn fmt.Errorf(\"unable to find image '%s' locally or remotely\", generalInfo.Name)\n\t}\n\tif localErr != nil {\n\t\treturn fmt.Errorf(\"preparing output for %s: %w\", style.Symbol(generalInfo.Name), localErr)\n\t}\n\n\tif remoteErr != nil {\n\t\treturn fmt.Errorf(\"preparing output for %s: %w\", style.Symbol(generalInfo.Name), remoteErr)\n\t}\n\n\tlocalInfo := inspectimage.NewInfoDisplay(local, generalInfo)\n\tremoteInfo := inspectimage.NewInfoDisplay(remote, generalInfo)\n\n\tout, err := w.MarshalFunc(inspectimage.InspectOutput{\n\t\tImageName: generalInfo.Name,\n\t\tRemote:    remoteInfo,\n\t\tLocal:     localInfo,\n\t})\n\tif err != nil {\n\t\treturn fmt.Errorf(\"marshalling output for %s: %w\", style.Symbol(generalInfo.Name), err)\n\t}\n\n\t_, err = logger.Writer().Write(out)\n\treturn err\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/structured_format_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestStructuredFormat(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"StructuredFormat Writer\", testStructuredFormat, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testStructuredFormat(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo                    *client.ImageInfo\n\t\tlocalInfo                     *client.ImageInfo\n\t\tremoteWithExtensionInfo       *client.ImageInfo\n\t\tlocalWithExtensionInfo        *client.ImageInfo\n\t\tlocalInfoWithExtensionDisplay *inspectimage.InfoDisplay\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\tremoteInfo = &client.ImageInfo{}\n\t\t\tlocalInfo = &client.ImageInfo{}\n\t\t\tremoteWithExtensionInfo = &client.ImageInfo{}\n\t\t\tlocalWithExtensionInfo = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.local\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t},\n\t\t\t\tExtensions: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t},\n\t\t\t}\n\t\t\tlocalInfoWithExtensionDisplay = &inspectimage.InfoDisplay{\n\t\t\t\tStackID: \"test.stack.id.local\",\n\t\t\t\tBuildpacks: 
[]dist.ModuleInfo{\n\t\t\t\t\t{\n\t\t\t\t\t\tID:       \"test.bp.one.local\",\n\t\t\t\t\t\tVersion:  \"1.0.0\",\n\t\t\t\t\t\tHomepage: \"https://some-homepage-one\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tExtensions: []dist.ModuleInfo{\n\t\t\t\t\t{\n\t\t\t\t\t\tID:       \"test.bp.one.local\",\n\t\t\t\t\t\tVersion:  \"1.0.0\",\n\t\t\t\t\t\tHomepage: \"https://some-homepage-one\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t\toutBuf = bytes.Buffer{}\n\t\t})\n\n\t\t// Just test error cases; all error-free cases will be tested in JSON, TOML, and YAML subclasses.\n\t\twhen(\"failure cases\", func() {\n\t\t\twhen(\"both info objects are nil\", func() {\n\t\t\t\tit(\"displays a 'missing image' error message\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"missing-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\n\t\t\t\t\tstructuredWriter := writer.StructuredFormat{\n\t\t\t\t\t\tMarshalFunc: testMarshalFunc,\n\t\t\t\t\t}\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := structuredWriter.Print(logger, sharedImageInfo, nil, nil, nil, nil)\n\t\t\t\t\tassert.ErrorWithMessage(err, fmt.Sprintf(\"unable to find image '%s' locally or remotely\", \"missing-image\"))\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"a localErr is passed to Print\", func() {\n\t\t\t\tit(\"still prints remote information\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"localErr-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\t\t\t\t\tstructuredWriter := writer.StructuredFormat{\n\t\t\t\t\t\tMarshalFunc: testMarshalFunc,\n\t\t\t\t\t}\n\n\t\t\t\t\tlocalErr := errors.New(\"a local error occurred\")\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := structuredWriter.Print(logger, sharedImageInfo, nil, remoteInfo, localErr, nil)\n\t\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'localErr-image': a local error occurred\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"a localWithExtension is passed to Print\", func() {\n\t\t\t\tit(\"prints localWithExtension information\", func() {\n\t\t\t\t\tvar marshalInput interface{}\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"localExtension-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\t\t\t\t\tstructuredWriter := writer.StructuredFormat{\n\t\t\t\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\t\t\t\tmarshalInput = i\n\t\t\t\t\t\t\treturn []byte(\"marshalled\"), nil\n\t\t\t\t\t\t},\n\t\t\t\t\t}\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := structuredWriter.Print(logger, sharedImageInfo, localWithExtensionInfo, nil, nil, nil)\n\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\tassert.Equal(marshalInput, inspectimage.InspectOutput{\n\t\t\t\t\t\tImageName: \"localExtension-image\",\n\t\t\t\t\t\tLocal:     localInfoWithExtensionDisplay,\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"a localErr is passed to Print\", func() {\n\t\t\t\tit(\"still prints remote information\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"localErr-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\t\t\t\t\tstructuredWriter := writer.StructuredFormat{\n\t\t\t\t\t\tMarshalFunc: testMarshalFunc,\n\t\t\t\t\t}\n\n\t\t\t\t\tlocalErr := errors.New(\"a local error occurred\")\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := structuredWriter.Print(logger, sharedImageInfo, nil, remoteWithExtensionInfo, localErr, nil)\n\t\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'localErr-image': a local error occurred\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"a remoteErr is passed to Print\", func() {\n\t\t\t\tit(\"still prints local information\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"remoteErr-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\t\t\t\t\tstructuredWriter := writer.StructuredFormat{\n\t\t\t\t\t\tMarshalFunc: testMarshalFunc,\n\t\t\t\t\t}\n\n\t\t\t\t\tremoteErr := errors.New(\"a remote error occurred\")\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := structuredWriter.Print(logger, sharedImageInfo, localInfo, nil, nil, remoteErr)\n\t\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'remoteErr-image': a remote error occurred\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"a remoteErr is passed to Print\", func() {\n\t\t\t\tit(\"still prints local information\", func() {\n\t\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\t\tName:            \"remoteErr-image\",\n\t\t\t\t\t\tRunImageMirrors: []config.RunImage{},\n\t\t\t\t\t}\n\t\t\t\t\tstructuredWriter := writer.StructuredFormat{\n\t\t\t\t\t\tMarshalFunc: testMarshalFunc,\n\t\t\t\t\t}\n\n\t\t\t\t\tremoteErr := errors.New(\"a remote error occurred\")\n\n\t\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t\terr := structuredWriter.Print(logger, sharedImageInfo, localWithExtensionInfo, nil, nil, remoteErr)\n\t\t\t\t\tassert.ErrorWithMessage(err, \"preparing output for 'remoteErr-image': a remote error occurred\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\n//\n// test functions and helpers\n//\n\nfunc testMarshalFunc(i interface{}) ([]byte, error) {\n\treturn []byte(\"marshalled\"), nil\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/toml.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\n\t\"github.com/pelletier/go-toml\"\n)\n\ntype TOML struct {\n\tStructuredFormat\n}\n\nfunc NewTOML() *TOML {\n\treturn &TOML{\n\t\tStructuredFormat: StructuredFormat{\n\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\tbuf := bytes.NewBuffer(nil)\n\t\t\t\tif err := toml.NewEncoder(buf).Encode(i); err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\t\t\t\treturn buf.Bytes(), nil\n\t\t\t},\n\t\t},\n\t}\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/toml_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/buildpacks/lifecycle/launch\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestTOML(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"TOML Writer\", testTOML, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testTOML(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo            *client.ImageInfo\n\t\tremoteInfoNoRebasable *client.ImageInfo\n\t\tlocalInfo             *client.ImageInfo\n\t\tlocalInfoNoRebasable  *client.ImageInfo\n\n\t\texpectedLocalOutput = `[local_info]\nstack = 'test.stack.id.local'\nrebasable = true\n\n[local_info.base_image]\ntop_layer = 'some-local-top-layer'\nreference = 'some-local-run-image-reference'\n\n[[local_info.run_images]]\nname = 'user-configured-mirror-for-local'\nuser_configured = true\n\n[[local_info.run_images]]\nname = 'some-local-run-image'\n\n[[local_info.run_images]]\nname = 'some-local-mirror'\n\n[[local_info.run_images]]\nname = 'other-local-mirror'\n\n[[local_info.buildpacks]]\nid = 'test.bp.one.local'\nversion = '1.0.0'\nhomepage = 'https://some-homepage-one'\n\n[[local_info.buildpacks]]\nid = 'test.bp.two.local'\nversion = '2.0.0'\nhomepage = 'https://some-homepage-two'\n\n[[local_info.processes]]\ntype = 'some-local-type'\nshell = 'bash'\ncommand = '/some/local command'\ndefault = true\nargs = [\n    'some',\n    'local',\n    
'args',\n]\nworking-dir = \"/some-test-work-dir\"\n\n[[local_info.processes]]\ntype = 'other-local-type'\nshell = ''\ncommand = '/other/local/command'\ndefault = false\nargs = [\n    'other',\n    'local',\n    'args',\n]\nworking-dir = \"/other-test-work-dir\"\n`\n\t\texpectedLocalNoRebasableOutput = `[local_info]\nstack = 'test.stack.id.local'\nrebasable = false\n\n[local_info.base_image]\ntop_layer = 'some-local-top-layer'\nreference = 'some-local-run-image-reference'\n\n[[local_info.run_images]]\nname = 'user-configured-mirror-for-local'\nuser_configured = true\n\n[[local_info.run_images]]\nname = 'some-local-run-image'\n\n[[local_info.run_images]]\nname = 'some-local-mirror'\n\n[[local_info.run_images]]\nname = 'other-local-mirror'\n\n[[local_info.buildpacks]]\nid = 'test.bp.one.local'\nversion = '1.0.0'\nhomepage = 'https://some-homepage-one'\n\n[[local_info.buildpacks]]\nid = 'test.bp.two.local'\nversion = '2.0.0'\nhomepage = 'https://some-homepage-two'\n\n[[local_info.processes]]\ntype = 'some-local-type'\nshell = 'bash'\ncommand = '/some/local command'\ndefault = true\nargs = [\n    'some',\n    'local',\n    'args',\n]\nworking-dir = \"/some-test-work-dir\"\n\n[[local_info.processes]]\ntype = 'other-local-type'\nshell = ''\ncommand = '/other/local/command'\ndefault = false\nargs = [\n    'other',\n    'local',\n    'args',\n]\nworking-dir = \"/other-test-work-dir\"\n`\n\n\t\texpectedRemoteOutput = `\n[remote_info]\nstack = 'test.stack.id.remote'\nrebasable = true\n\n[remote_info.base_image]\ntop_layer = 'some-remote-top-layer'\nreference = 'some-remote-run-image-reference'\n\n[[remote_info.run_images]]\nname = 'user-configured-mirror-for-remote'\nuser_configured = true\n\n[[remote_info.run_images]]\nname = 'some-remote-run-image'\n\n[[remote_info.run_images]]\nname = 'some-remote-mirror'\n\n[[remote_info.run_images]]\nname = 'other-remote-mirror'\n\n[[remote_info.buildpacks]]\nid = 'test.bp.one.remote'\nversion = '1.0.0'\nhomepage = 
'https://some-homepage-one'\n\n[[remote_info.buildpacks]]\nid = 'test.bp.two.remote'\nversion = '2.0.0'\nhomepage = 'https://some-homepage-two'\n\n[[remote_info.processes]]\ntype = 'some-remote-type'\nshell = 'bash'\ncommand = '/some/remote command'\ndefault = true\nargs = [\n    'some',\n    'remote',\n    'args',\n]\nworking-dir = \"/some-test-work-dir\"\n\n[[remote_info.processes]]\ntype = 'other-remote-type'\nshell = ''\ncommand = '/other/remote/command'\ndefault = false\nargs = [\n    'other',\n    'remote',\n    'args',\n]\nworking-dir = \"/other-test-work-dir\"\n`\n\t\texpectedRemoteNoRebasableOutput = `\n[remote_info]\nstack = 'test.stack.id.remote'\nrebasable = false\n\n[remote_info.base_image]\ntop_layer = 'some-remote-top-layer'\nreference = 'some-remote-run-image-reference'\n\n[[remote_info.run_images]]\nname = 'user-configured-mirror-for-remote'\nuser_configured = true\n\n[[remote_info.run_images]]\nname = 'some-remote-run-image'\n\n[[remote_info.run_images]]\nname = 'some-remote-mirror'\n\n[[remote_info.run_images]]\nname = 'other-remote-mirror'\n\n[[remote_info.buildpacks]]\nid = 'test.bp.one.remote'\nversion = '1.0.0'\nhomepage = 'https://some-homepage-one'\n\n[[remote_info.buildpacks]]\nid = 'test.bp.two.remote'\nversion = '2.0.0'\nhomepage = 'https://some-homepage-two'\n\n[[remote_info.processes]]\ntype = 'some-remote-type'\nshell = 'bash'\ncommand = '/some/remote command'\ndefault = true\nargs = [\n    'some',\n    'remote',\n    'args',\n]\nworking-dir = \"/some-test-work-dir\"\n\n[[remote_info.processes]]\ntype = 'other-remote-type'\nshell = ''\ncommand = '/other/remote/command'\ndefault = false\nargs = [\n    'other',\n    'remote',\n    'args',\n]\nworking-dir = \"/other-test-work-dir\"\n`\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\ttype someData struct {\n\t\t\t\tString string\n\t\t\t\tBool   bool\n\t\t\t\tInt    int\n\t\t\t\tNested struct {\n\t\t\t\t\tString string\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tremoteInfo = 
&client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.remote\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.remote\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-remote-top-layer\",\n\t\t\t\t\tReference: \"some-remote-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-remote-mirror\", \"other-remote-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"RemoteData\": someData{\n\t\t\t\t\t\t\t\tString: \"aString\",\n\t\t\t\t\t\t\t\tBool:   true,\n\t\t\t\t\t\t\t\tInt:    123,\n\t\t\t\t\t\t\t\tNested: struct {\n\t\t\t\t\t\t\t\t\tString string\n\t\t\t\t\t\t\t\t}{\n\t\t\t\t\t\t\t\t\tString: \"anotherString\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-remote-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/remote command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"remote\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-remote-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: 
[]string{\"/other/remote/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"remote\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: true,\n\t\t\t}\n\t\t\tremoteInfoNoRebasable = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.remote\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.remote\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-remote-top-layer\",\n\t\t\t\t\tReference: \"some-remote-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-remote-mirror\", \"other-remote-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"RemoteData\": someData{\n\t\t\t\t\t\t\t\tString: \"aString\",\n\t\t\t\t\t\t\t\tBool:   true,\n\t\t\t\t\t\t\t\tInt:    123,\n\t\t\t\t\t\t\t\tNested: struct {\n\t\t\t\t\t\t\t\t\tString string\n\t\t\t\t\t\t\t\t}{\n\t\t\t\t\t\t\t\t\tString: \"anotherString\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-remote-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/remote command\"}},\n\t\t\t\t\t\tArgs:             
[]string{\"some\", \"remote\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-remote-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/remote/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"remote\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: false,\n\t\t\t}\n\n\t\t\tlocalInfo = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.local\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.local\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-local-top-layer\",\n\t\t\t\t\tReference: \"some-local-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-local-mirror\", \"other-local-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"LocalData\": someData{\n\t\t\t\t\t\t\t\tBool: false,\n\t\t\t\t\t\t\t\tInt:  456,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-local-type\",\n\t\t\t\t\t\tCommand:      
    launch.RawCommand{Entries: []string{\"/some/local command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"local\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-local-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/local/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"local\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: true,\n\t\t\t}\n\t\t\tlocalInfoNoRebasable = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.local\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.local\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-local-top-layer\",\n\t\t\t\t\tReference: \"some-local-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-local-mirror\", \"other-local-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"LocalData\": someData{\n\t\t\t\t\t\t\t\tBool: false,\n\t\t\t\t\t\t\t\tInt:  456,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: 
client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-local-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/local command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"local\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-local-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/local/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"local\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: false,\n\t\t\t}\n\n\t\t\toutBuf = bytes.Buffer{}\n\t\t})\n\n\t\twhen(\"local and remote images exist\", func() {\n\t\t\tit(\"prints both local and remote image info in TOML format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, sharedImageInfo, localInfo, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsTOML(outBuf.String(), `image_name = \"test-image\"`)\n\t\t\t\tassert.ContainsTOML(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.ContainsTOML(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t\tit(\"prints both local and remote non-rebasable image info in TOML format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, sharedImageInfo, localInfoNoRebasable, remoteInfoNoRebasable, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsTOML(outBuf.String(), `image_name = \"test-image\"`)\n\t\t\t\tassert.ContainsTOML(outBuf.String(), expectedLocalNoRebasableOutput)\n\t\t\t\tassert.ContainsTOML(outBuf.String(), expectedRemoteNoRebasableOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only local image exists\", func() {\n\t\t\tit(\"prints local image info in TOML format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, sharedImageInfo, localInfo, nil, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsTOML(outBuf.String(), `image_name = \"test-image\"`)\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.remote\")\n\t\t\t\tassert.ContainsTOML(outBuf.String(), expectedLocalOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only remote image exists\", func() {\n\t\t\tit(\"prints remote image info in TOML format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\ttomlWriter := writer.NewTOML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := tomlWriter.Print(logger, sharedImageInfo, nil, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsTOML(outBuf.String(), `image_name = \"test-image\"`)\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.local\")\n\t\t\t\tassert.ContainsTOML(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/yaml.go",
    "content": "package writer\n\nimport (\n\t\"bytes\"\n\n\t\"gopkg.in/yaml.v3\"\n)\n\ntype YAML struct {\n\tStructuredFormat\n}\n\nfunc NewYAML() *YAML {\n\treturn &YAML{\n\t\tStructuredFormat: StructuredFormat{\n\t\t\tMarshalFunc: func(i interface{}) ([]byte, error) {\n\t\t\t\tbuf := bytes.NewBuffer(nil)\n\t\t\t\tif err := yaml.NewEncoder(buf).Encode(i); err != nil {\n\t\t\t\t\treturn []byte{}, err\n\t\t\t\t}\n\t\t\t\treturn buf.Bytes(), nil\n\t\t\t},\n\t\t},\n\t}\n}\n"
  },
  {
    "path": "internal/inspectimage/writer/yaml_test.go",
    "content": "package writer_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/buildpacks/lifecycle/launch\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/inspectimage\"\n\t\"github.com/buildpacks/pack/internal/inspectimage/writer\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestYAML(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"YAML Writer\", testYAML, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testYAML(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\toutBuf bytes.Buffer\n\n\t\tremoteInfo            *client.ImageInfo\n\t\tremoteInfoNoRebasable *client.ImageInfo\n\t\tlocalInfo             *client.ImageInfo\n\t\tlocalInfoNoRebasable  *client.ImageInfo\n\n\t\texpectedLocalOutput = `---\nlocal_info:\n  stack: test.stack.id.local\n  rebasable: true\n  base_image:\n    top_layer: some-local-top-layer\n    reference: some-local-run-image-reference\n  run_images:\n  - name: user-configured-mirror-for-local\n    user_configured: true\n  - name: some-local-run-image\n  - name: some-local-mirror\n  - name: other-local-mirror\n  buildpacks:\n  - homepage: https://some-homepage-one\n    id: test.bp.one.local\n    version: 1.0.0\n  - homepage: https://some-homepage-two\n    id: test.bp.two.local\n    version: 2.0.0\n  extensions: []\n  processes:\n  - type: some-local-type\n    shell: bash\n    command: \"/some/local command\"\n    default: true\n    args:\n    - some\n    - local\n    - args\n    working-dir: /some-test-work-dir\n  - type: other-local-type\n    shell: ''\n    command: \"/other/local/command\"\n 
   default: false\n    args:\n    - other\n    - local\n    - args\n    working-dir: /other-test-work-dir\n`\n\t\texpectedLocalNoRebasableOutput = `---\nlocal_info:\n  stack: test.stack.id.local\n  rebasable: false\n  base_image:\n    top_layer: some-local-top-layer\n    reference: some-local-run-image-reference\n  run_images:\n  - name: user-configured-mirror-for-local\n    user_configured: true\n  - name: some-local-run-image\n  - name: some-local-mirror\n  - name: other-local-mirror\n  buildpacks:\n  - homepage: https://some-homepage-one\n    id: test.bp.one.local\n    version: 1.0.0\n  - homepage: https://some-homepage-two\n    id: test.bp.two.local\n    version: 2.0.0\n  extensions: []\n  processes:\n  - type: some-local-type\n    shell: bash\n    command: \"/some/local command\"\n    default: true\n    args:\n    - some\n    - local\n    - args\n    working-dir: /some-test-work-dir\n  - type: other-local-type\n    shell: ''\n    command: \"/other/local/command\"\n    default: false\n    args:\n    - other\n    - local\n    - args\n    working-dir: /other-test-work-dir\n`\n\t\texpectedRemoteOutput = `---\nremote_info:\n  stack: test.stack.id.remote\n  rebasable: true\n  base_image:\n    top_layer: some-remote-top-layer\n    reference: some-remote-run-image-reference\n  run_images:\n  - name: user-configured-mirror-for-remote\n    user_configured: true\n  - name: some-remote-run-image\n  - name: some-remote-mirror\n  - name: other-remote-mirror\n  buildpacks:\n  - homepage: https://some-homepage-one\n    id: test.bp.one.remote\n    version: 1.0.0\n  - homepage: https://some-homepage-two\n    id: test.bp.two.remote\n    version: 2.0.0\n  extensions: []\n  processes:\n  - type: some-remote-type\n    shell: bash\n    command: \"/some/remote command\"\n    default: true\n    args:\n    - some\n    - remote\n    - args\n    working-dir: /some-test-work-dir\n  - type: other-remote-type\n    shell: ''\n    command: \"/other/remote/command\"\n    default: false\n    
args:\n    - other\n    - remote\n    - args\n    working-dir: /other-test-work-dir\n`\n\t\texpectedRemoteNoRebasableOutput = `---\nremote_info:\n  stack: test.stack.id.remote\n  rebasable: false\n  base_image:\n    top_layer: some-remote-top-layer\n    reference: some-remote-run-image-reference\n  run_images:\n  - name: user-configured-mirror-for-remote\n    user_configured: true\n  - name: some-remote-run-image\n  - name: some-remote-mirror\n  - name: other-remote-mirror\n  buildpacks:\n  - homepage: https://some-homepage-one\n    id: test.bp.one.remote\n    version: 1.0.0\n  - homepage: https://some-homepage-two\n    id: test.bp.two.remote\n    version: 2.0.0\n  extensions: []\n  processes:\n  - type: some-remote-type\n    shell: bash\n    command: \"/some/remote command\"\n    default: true\n    args:\n    - some\n    - remote\n    - args\n    working-dir: /some-test-work-dir\n  - type: other-remote-type\n    shell: ''\n    command: \"/other/remote/command\"\n    default: false\n    args:\n    - other\n    - remote\n    - args\n    working-dir: /other-test-work-dir\n`\n\t)\n\n\twhen(\"Print\", func() {\n\t\tit.Before(func() {\n\t\t\ttype someData struct {\n\t\t\t\tString string\n\t\t\t\tBool   bool\n\t\t\t\tInt    int\n\t\t\t\tNested struct {\n\t\t\t\t\tString string\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tremoteInfo = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.remote\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.remote\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-remote-top-layer\",\n\t\t\t\t\tReference: \"some-remote-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-remote-mirror\", 
\"other-remote-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"RemoteData\": someData{\n\t\t\t\t\t\t\t\tString: \"aString\",\n\t\t\t\t\t\t\t\tBool:   true,\n\t\t\t\t\t\t\t\tInt:    123,\n\t\t\t\t\t\t\t\tNested: struct {\n\t\t\t\t\t\t\t\t\tString string\n\t\t\t\t\t\t\t\t}{\n\t\t\t\t\t\t\t\t\tString: \"anotherString\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-remote-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/remote command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"remote\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-remote-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/remote/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"remote\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: true,\n\t\t\t}\n\t\t\tremoteInfoNoRebasable = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.remote\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.remote\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: 
files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-remote-top-layer\",\n\t\t\t\t\tReference: \"some-remote-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-remote-mirror\", \"other-remote-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"RemoteData\": someData{\n\t\t\t\t\t\t\t\tString: \"aString\",\n\t\t\t\t\t\t\t\tBool:   true,\n\t\t\t\t\t\t\t\tInt:    123,\n\t\t\t\t\t\t\t\tNested: struct {\n\t\t\t\t\t\t\t\t\tString string\n\t\t\t\t\t\t\t\t}{\n\t\t\t\t\t\t\t\t\tString: \"anotherString\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-remote-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/remote command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"remote\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-remote-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/remote/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"remote\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: false,\n\t\t\t}\n\n\t\t\tlocalInfo = &client.ImageInfo{\n\t\t\t\tStackID: 
\"test.stack.id.local\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.local\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-local-top-layer\",\n\t\t\t\t\tReference: \"some-local-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-local-mirror\", \"other-local-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"LocalData\": someData{\n\t\t\t\t\t\t\t\tBool: false,\n\t\t\t\t\t\t\t\tInt:  456,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-local-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/local command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"local\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-local-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/local/command\"}},\n\t\t\t\t\t\t\tArgs:             []string{\"other\", \"local\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: 
\"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: true,\n\t\t\t}\n\t\t\tlocalInfoNoRebasable = &client.ImageInfo{\n\t\t\t\tStackID: \"test.stack.id.local\",\n\t\t\t\tBuildpacks: []buildpack.GroupElement{\n\t\t\t\t\t{ID: \"test.bp.one.local\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t\t{ID: \"test.bp.two.local\", Version: \"2.0.0\", Homepage: \"https://some-homepage-two\"},\n\t\t\t\t},\n\t\t\t\tBase: files.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-local-top-layer\",\n\t\t\t\t\tReference: \"some-local-run-image-reference\",\n\t\t\t\t},\n\t\t\t\tStack: files.Stack{\n\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"some-local-mirror\", \"other-local-mirror\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tBOM: []buildpack.BOMEntry{{\n\t\t\t\t\tRequire: buildpack.Require{\n\t\t\t\t\t\tName:    \"name-1\",\n\t\t\t\t\t\tVersion: \"version-1\",\n\t\t\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\t\t\"LocalData\": someData{\n\t\t\t\t\t\t\t\tBool: false,\n\t\t\t\t\t\t\t\tInt:  456,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tBuildpack: buildpack.GroupElement{ID: \"test.bp.one.remote\", Version: \"1.0.0\", Homepage: \"https://some-homepage-one\"},\n\t\t\t\t}},\n\t\t\t\tProcesses: client.ProcessDetails{\n\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\tType:             \"some-local-type\",\n\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/some/local command\"}},\n\t\t\t\t\t\tArgs:             []string{\"some\", \"local\", \"args\"},\n\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\tWorkingDirectory: \"/some-test-work-dir\",\n\t\t\t\t\t},\n\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType:             \"other-local-type\",\n\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/local/command\"}},\n\t\t\t\t\t\t\tArgs:             
[]string{\"other\", \"local\", \"args\"},\n\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\tWorkingDirectory: \"/other-test-work-dir\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tRebasable: false,\n\t\t\t}\n\n\t\t\toutBuf = bytes.Buffer{}\n\t\t})\n\n\t\twhen(\"local and remote images exist\", func() {\n\t\t\tit(\"prints both local and remote image info in a YAML format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, sharedImageInfo, localInfo, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(outBuf.String(), `\"image_name\": \"test-image\"`)\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t\tit(\"prints both local and remote non-rebasable image info in a YAML format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: 
[]string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, sharedImageInfo, localInfoNoRebasable, remoteInfoNoRebasable, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(outBuf.String(), `\"image_name\": \"test-image\"`)\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedLocalNoRebasableOutput)\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedRemoteNoRebasableOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only local image exists\", func() {\n\t\t\tit(\"prints local image info in YAML format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, sharedImageInfo, localInfo, nil, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(outBuf.String(), `\"image_name\": \"test-image\"`)\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedLocalOutput)\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.remote\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only remote image exists\", func() {\n\t\t\tit(\"prints remote image 
info in YAML format\", func() {\n\t\t\t\trunImageMirrors := []config.RunImage{\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"un-used-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"un-used\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-local-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-local\"},\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tImage:   \"some-remote-run-image\",\n\t\t\t\t\t\tMirrors: []string{\"user-configured-mirror-for-remote\"},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t\tsharedImageInfo := inspectimage.GeneralInfo{\n\t\t\t\t\tName:            \"test-image\",\n\t\t\t\t\tRunImageMirrors: runImageMirrors,\n\t\t\t\t}\n\t\t\t\tyamlWriter := writer.NewYAML()\n\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\terr := yamlWriter.Print(logger, sharedImageInfo, nil, remoteInfo, nil, nil)\n\t\t\t\tassert.Nil(err)\n\n\t\t\t\tassert.ContainsYAML(outBuf.String(), `\"image_name\": \"test-image\"`)\n\t\t\t\tassert.NotContains(outBuf.String(), \"test.stack.id.local\")\n\t\t\t\tassert.ContainsYAML(outBuf.String(), expectedRemoteOutput)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/layer/layer.go",
    "content": "package layer\n\nimport (\n\t\"github.com/buildpacks/pack/pkg/archive\"\n)\n\nfunc CreateSingleFileTar(tarFile, path, txt string, twf archive.TarWriterFactory) error {\n\ttarBuilder := archive.TarBuilder{}\n\ttarBuilder.AddFile(path, 0644, archive.NormalizedDateTime, []byte(txt))\n\treturn tarBuilder.WriteToPath(tarFile, twf)\n}\n"
  },
  {
    "path": "internal/layer/writer_factory.go",
    "content": "package layer\n\nimport (\n\t\"archive/tar\"\n\t\"fmt\"\n\t\"io\"\n\n\tilayer \"github.com/buildpacks/imgutil/layer\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n)\n\ntype WriterFactory struct {\n\tos string\n}\n\nfunc NewWriterFactory(imageOS string) (*WriterFactory, error) {\n\tif imageOS != \"freebsd\" && imageOS != \"linux\" && imageOS != \"windows\" {\n\t\treturn nil, fmt.Errorf(\"provided image OS '%s' must be either 'freebsd', 'linux' or 'windows'\", imageOS)\n\t}\n\n\treturn &WriterFactory{os: imageOS}, nil\n}\n\nfunc (f *WriterFactory) NewWriter(fileWriter io.Writer) archive.TarWriter {\n\tif f.os == \"windows\" {\n\t\treturn ilayer.NewWindowsWriter(fileWriter)\n\t}\n\n\t// Linux and FreeBSD images use tar.Writer\n\treturn tar.NewWriter(fileWriter)\n}\n"
  },
  {
    "path": "internal/layer/writer_factory_test.go",
    "content": "package layer_test\n\nimport (\n\t\"archive/tar\" //nolint\n\t\"testing\"\n\n\tilayer \"github.com/buildpacks/imgutil/layer\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/layer\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestTarWriterFactory(t *testing.T) {\n\tspec.Run(t, \"WriterFactory\", testWriterFactory, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testWriterFactory(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#NewWriterFactory\", func() {\n\t\tit(\"returns an error for invalid image OS\", func() {\n\t\t\t_, err := layer.NewWriterFactory(\"not-an-os\")\n\t\t\th.AssertError(t, err, \"provided image OS 'not-an-os' must be either 'freebsd', 'linux' or 'windows'\")\n\t\t})\n\t})\n\n\twhen(\"#NewWriter\", func() {\n\t\tit(\"returns a regular tar writer for FreeBSD\", func() {\n\t\t\tfactory, err := layer.NewWriterFactory(\"freebsd\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\t_, ok := factory.NewWriter(nil).(*tar.Writer)\n\t\t\tif !ok {\n\t\t\t\tt.Fatal(\"returned writer was not a regular tar writer\")\n\t\t\t}\n\t\t})\n\n\t\tit(\"returns a regular tar writer for Linux\", func() {\n\t\t\tfactory, err := layer.NewWriterFactory(\"linux\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\t_, ok := factory.NewWriter(nil).(*tar.Writer)\n\t\t\tif !ok {\n\t\t\t\tt.Fatal(\"returned writer was not a regular tar writer\")\n\t\t\t}\n\t\t})\n\n\t\tit(\"returns a Windows layer writer for Windows\", func() {\n\t\t\tfactory, err := layer.NewWriterFactory(\"windows\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\t_, ok := factory.NewWriter(nil).(*ilayer.WindowsWriter)\n\t\t\tif !ok {\n\t\t\t\tt.Fatal(\"returned writer was not a Windows layer writer\")\n\t\t\t}\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/name/name.go",
    "content": "package name\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\n\tgname \"github.com/google/go-containerregistry/pkg/name\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nconst (\n\tdefaultRefFormat = \"%s/%s:%s\"\n\tdigestRefFormat  = \"%s/%s@%s\"\n)\n\ntype Logger interface {\n\tInfof(fmt string, v ...interface{})\n}\n\nfunc TranslateRegistry(name string, registryMirrors map[string]string, logger Logger) (string, error) {\n\tif registryMirrors == nil {\n\t\treturn name, nil\n\t}\n\n\tsrcRef, err := gname.ParseReference(name, gname.WeakValidation)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tsrcContext := srcRef.Context()\n\tregistryMirror, ok := getMirror(srcContext, registryMirrors)\n\tif !ok {\n\t\treturn name, nil\n\t}\n\n\trefFormat := defaultRefFormat\n\tif strings.Contains(srcRef.Identifier(), \":\") {\n\t\trefFormat = digestRefFormat\n\t}\n\n\trefName := fmt.Sprintf(refFormat, registryMirror, srcContext.RepositoryStr(), srcRef.Identifier())\n\t_, err = gname.ParseReference(refName, gname.WeakValidation)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tlogger.Infof(\"Using mirror %s for %s\", style.Symbol(refName), name)\n\treturn refName, nil\n}\n\nfunc AppendSuffix(name string, target dist.Target) (string, error) {\n\treference, err := gname.ParseReference(name, gname.WeakValidation)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tsuffixPlatformTag := targetToTag(target)\n\tif suffixPlatformTag != \"\" {\n\t\tif reference.Identifier() == \"latest\" {\n\t\t\treturn fmt.Sprintf(\"%s:%s\", reference.Context(), suffixPlatformTag), nil\n\t\t}\n\t\tif !strings.Contains(reference.Identifier(), \":\") {\n\t\t\treturn fmt.Sprintf(\"%s:%s-%s\", reference.Context(), reference.Identifier(), suffixPlatformTag), nil\n\t\t}\n\t}\n\treturn name, nil\n}\n\nfunc getMirror(repo gname.Repository, registryMirrors map[string]string) (string, bool) {\n\tmirror, ok := registryMirrors[\"*\"]\n\tif ok 
{\n\t\treturn mirror, ok\n\t}\n\n\tmirror, ok = registryMirrors[repo.RegistryStr()]\n\treturn mirror, ok\n}\n\nfunc targetToTag(target dist.Target) string {\n\treturn strings.Join(target.ValuesAsSlice(), \"-\")\n}\n"
  },
  {
    "path": "internal/name/name_test.go",
    "content": "package name_test\n\nimport (\n\t\"io\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/name\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestTranslateRegistry(t *testing.T) {\n\tspec.Run(t, \"TranslateRegistry\", testTranslateRegistry, spec.Report(report.Terminal{}))\n}\n\nfunc testTranslateRegistry(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t\tlogger = logging.NewSimpleLogger(io.Discard)\n\t)\n\n\twhen(\"#TranslateRegistry\", func() {\n\t\tit(\"doesn't translate when there are no mirrors\", func() {\n\t\t\tinput := \"index.docker.io/my/buildpack:0.1\"\n\n\t\t\toutput, err := name.TranslateRegistry(input, nil, logger)\n\t\t\tassert.Nil(err)\n\t\t\tassert.Equal(output, input)\n\t\t})\n\n\t\tit(\"doesn't translate when there is no matching mirror\", func() {\n\t\t\tinput := \"index.docker.io/my/buildpack:0.1\"\n\t\t\tregistryMirrors := map[string]string{\n\t\t\t\t\"us.gcr.io\": \"10.0.0.1\",\n\t\t\t}\n\n\t\t\toutput, err := name.TranslateRegistry(input, registryMirrors, logger)\n\t\t\tassert.Nil(err)\n\t\t\tassert.Equal(output, input)\n\t\t})\n\n\t\tit(\"translates when there is a mirror\", func() {\n\t\t\tinput := \"index.docker.io/my/buildpack:0.1\"\n\t\t\texpected := \"10.0.0.1/my/buildpack:0.1\"\n\t\t\tregistryMirrors := map[string]string{\n\t\t\t\t\"index.docker.io\": \"10.0.0.1\",\n\t\t\t}\n\n\t\t\toutput, err := name.TranslateRegistry(input, registryMirrors, logger)\n\t\t\tassert.Nil(err)\n\t\t\tassert.Equal(output, expected)\n\t\t})\n\n\t\tit(\"prefers the wildcard mirror translation\", func() {\n\t\t\tinput := \"index.docker.io/my/buildpack:0.1\"\n\t\t\texpected := \"10.0.0.2/my/buildpack:0.1\"\n\t\t\tregistryMirrors := map[string]string{\n\t\t\t\t\"index.docker.io\": \"10.0.0.1\",\n\t\t\t\t\"*\":        
       \"10.0.0.2\",\n\t\t\t}\n\n\t\t\toutput, err := name.TranslateRegistry(input, registryMirrors, logger)\n\t\t\tassert.Nil(err)\n\t\t\tassert.Equal(output, expected)\n\t\t})\n\n\t\tit(\"translates a buildpack referenced by a digest\", func() {\n\t\t\tinput := \"buildpack/bp@sha256:7f48a442c056cd19ea48462e05faa2837ac3a13732c47616d20f11f8c847a8c4\"\n\t\t\texpected := \"myregistry.com/buildpack/bp@sha256:7f48a442c056cd19ea48462e05faa2837ac3a13732c47616d20f11f8c847a8c4\"\n\t\t\tregistryMirrors := map[string]string{\n\t\t\t\t\"index.docker.io\": \"myregistry.com\",\n\t\t\t}\n\n\t\t\toutput, err := name.TranslateRegistry(input, registryMirrors, logger)\n\t\t\tassert.Nil(err)\n\t\t\tassert.Equal(output, expected)\n\t\t})\n\t})\n\n\twhen(\"#AppendSuffix\", func() {\n\t\twhen(\"[os] is provided\", func() {\n\t\t\twhen(\"[arch] is provided\", func() {\n\t\t\t\twhen(\"[arch-variant] is provided\", func() {\n\t\t\t\t\twhen(\"tag is provided\", func() {\n\t\t\t\t\t\tit(\"appends [os]-[arch]-[arch-variant] to the given tag\", func() {\n\t\t\t\t\t\t\tinput := \"my.registry.com/my-repo/my-image:some-tag\"\n\t\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\t\tOS:          \"linux\",\n\t\t\t\t\t\t\t\tArch:        \"amd64\",\n\t\t\t\t\t\t\t\tArchVariant: \"v6\",\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tresult, err := name.AppendSuffix(input, target)\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tassert.Equal(result, \"my.registry.com/my-repo/my-image:some-tag-linux-amd64-v6\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t\twhen(\"tag is not provided\", func() {\n\t\t\t\t\t\tit(\"adds tag: [os]-[arch]-[arch-variant] to the given <image>\", func() {\n\t\t\t\t\t\t\tinput := \"my.registry.com/my-repo/my-image\"\n\t\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\t\tOS:          \"linux\",\n\t\t\t\t\t\t\t\tArch:        \"amd64\",\n\t\t\t\t\t\t\t\tArchVariant: \"v6\",\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tresult, err := name.AppendSuffix(input, 
target)\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tassert.Equal(result, \"my.registry.com/my-repo/my-image:linux-amd64-v6\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"[arch-variant] is not provided\", func() {\n\t\t\t\t\twhen(\"tag is provided\", func() {\n\t\t\t\t\t\t// my.registry.com/my-repo/my-image:some-tag\n\t\t\t\t\t\tit(\"appends [os]-[arch] to the given tag\", func() {\n\t\t\t\t\t\t\tinput := \"my.registry.com/my-repo/my-image:some-tag\"\n\t\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\t\tOS:   \"linux\",\n\t\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tresult, err := name.AppendSuffix(input, target)\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tassert.Equal(result, \"my.registry.com/my-repo/my-image:some-tag-linux-amd64\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t\twhen(\"tag is not provided\", func() {\n\t\t\t\t\t\t// my.registry.com/my-repo/my-image\n\t\t\t\t\t\tit(\"adds tag: [os]-[arch] to the given <image>\", func() {\n\t\t\t\t\t\t\tinput := \"my.registry.com/my-repo/my-image\"\n\t\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\t\tOS:   \"linux\",\n\t\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tresult, err := name.AppendSuffix(input, target)\n\t\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\t\tassert.Equal(result, \"my.registry.com/my-repo/my-image:linux-amd64\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"[arch] is not provided\", func() {\n\t\t\t\twhen(\"tag is provided\", func() {\n\t\t\t\t\t// my.registry.com/my-repo/my-image:some-tag\n\t\t\t\t\tit(\"appends [os] to the given tag\", func() {\n\t\t\t\t\t\tinput := \"my.registry.com/my-repo/my-image:some-tag\"\n\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\tOS: \"linux\",\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tresult, err := name.AppendSuffix(input, target)\n\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\tassert.Equal(result, \"my.registry.com/my-repo/my-image:some-tag-linux\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"tag is not 
provided\", func() {\n\t\t\t\t\t// my.registry.com/my-repo/my-image\n\t\t\t\t\tit(\"adds tag: [os] to the given <image>\", func() {\n\t\t\t\t\t\tinput := \"my.registry.com/my-repo/my-image\"\n\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\tOS: \"linux\",\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tresult, err := name.AppendSuffix(input, target)\n\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\tassert.Equal(result, \"my.registry.com/my-repo/my-image:linux\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"[os] is not provided\", func() {\n\t\t\tit(\"doesn't append anything and returns the same <image> name\", func() {\n\t\t\t\tinput := \"my.registry.com/my-repo/my-image\"\n\t\t\t\ttarget := dist.Target{}\n\n\t\t\t\tresult, err := name.AppendSuffix(input, target)\n\t\t\t\tassert.Nil(err)\n\t\t\t\tassert.Equal(result, input)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/paths/defaults_unix.go",
    "content": "//go:build unix\n\npackage paths\n\nconst (\n\tRootDir = `/`\n)\n"
  },
  {
    "path": "internal/paths/defaults_windows.go",
    "content": "package paths\n\nconst (\n\tRootDir = `c:\\`\n)\n"
  },
  {
    "path": "internal/paths/paths.go",
    "content": "package paths\n\nimport (\n\t\"net/url\"\n\t\"os\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"runtime\"\n\t\"strings\"\n)\n\nvar schemeRegexp = regexp.MustCompile(`^.+:/.*`)\n\nfunc IsURI(ref string) bool {\n\treturn schemeRegexp.MatchString(ref)\n}\n\nfunc IsDir(p string) (bool, error) {\n\tfileInfo, err := os.Stat(p)\n\tif err != nil {\n\t\treturn false, err\n\t}\n\n\treturn fileInfo.IsDir(), nil\n}\n\n// FilePathToURI converts a filepath to URI. If relativeTo is provided and path is\n// a relative path, it will be made absolute based on the provided value. Otherwise, the\n// current working directory is used.\nfunc FilePathToURI(path, relativeTo string) (string, error) {\n\tif IsURI(path) {\n\t\treturn path, nil\n\t}\n\n\tif !filepath.IsAbs(path) {\n\t\tvar err error\n\t\tpath, err = filepath.Abs(filepath.Join(relativeTo, path))\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t}\n\n\tif runtime.GOOS == \"windows\" {\n\t\tif strings.HasPrefix(path, `\\\\`) {\n\t\t\treturn \"file://\" + filepath.ToSlash(strings.TrimPrefix(path, `\\\\`)), nil\n\t\t}\n\t\treturn \"file:///\" + filepath.ToSlash(path), nil\n\t}\n\treturn \"file://\" + path, nil\n}\n\n// examples:\n//\n// - unix file: file://laptop/some%20dir/file.tgz\n//\n// - windows drive: file:///C:/Documents%20and%20Settings/file.tgz\n//\n// - windows share: file://laptop/My%20Documents/file.tgz\nfunc URIToFilePath(uri string) (string, error) {\n\tvar (\n\t\tosPath string\n\t\terr    error\n\t)\n\n\tosPath = filepath.FromSlash(strings.TrimPrefix(uri, \"file://\"))\n\n\tif osPath, err = url.PathUnescape(osPath); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tif runtime.GOOS == \"windows\" {\n\t\tif strings.HasPrefix(osPath, `\\`) {\n\t\t\treturn strings.TrimPrefix(osPath, `\\`), nil\n\t\t}\n\t\treturn `\\\\` + osPath, nil\n\t}\n\treturn osPath, nil\n}\n\nfunc FilterReservedNames(p string) string {\n\t// The following keys are reserved on Windows\n\t// 
https://docs.microsoft.com/en-us/windows/win32/fileio/naming-a-file?redirectedfrom=MSDN#win32-file-namespaces\n\treservedNameConversions := map[string]string{\n\t\t\"aux\": \"a_u_x\",\n\t\t\"com\": \"c_o_m\",\n\t\t\"con\": \"c_o_n\",\n\t\t\"lpt\": \"l_p_t\",\n\t\t\"nul\": \"n_u_l\",\n\t\t\"prn\": \"p_r_n\",\n\t}\n\tfor k, v := range reservedNameConversions {\n\t\tp = strings.ReplaceAll(p, k, v)\n\t}\n\n\treturn p\n}\n\n// WindowsDir is equivalent to path.Dir or filepath.Dir but always for Windows paths\n// reproduced because Windows implementation is not exported\nfunc WindowsDir(p string) string {\n\tpathElements := strings.Split(p, `\\`)\n\n\tdirName := strings.Join(pathElements[:len(pathElements)-1], `\\`)\n\n\treturn dirName\n}\n\n// WindowsBasename is equivalent to path.Base or filepath.Base but always for Windows paths\n// reproduced because Windows implementation is not exported\nfunc WindowsBasename(p string) string {\n\tpathElements := strings.Split(p, `\\`)\n\n\treturn pathElements[len(pathElements)-1]\n}\n\n// WindowsToSlash is equivalent to filepath.ToSlash but always for Windows paths\n// reproduced because Windows implementation is not exported\nfunc WindowsToSlash(p string) string {\n\tslashPath := strings.ReplaceAll(p, `\\`, \"/\") // convert slashes\n\tif len(slashPath) < 2 {\n\t\treturn \"\"\n\t}\n\n\treturn slashPath[2:] // strip volume\n}\n\n// WindowsPathSID returns the appropriate SID for a given UID and GID\n// This is the basic logic for path permissions in Pack and Lifecycle\nfunc WindowsPathSID(uid, gid int) string {\n\tif uid == 0 && gid == 0 {\n\t\treturn \"S-1-5-32-544\" // BUILTIN\\Administrators\n\t}\n\treturn \"S-1-5-32-545\" // BUILTIN\\Users\n}\n\n// CanonicalTarPath returns a cleaned path (see path.Clean) with leading slashes removed\nfunc CanonicalTarPath(p string) string {\n\treturn strings.TrimPrefix(path.Clean(p), \"/\")\n}\n"
  },
  {
    "path": "internal/paths/paths_test.go",
    "content": "package paths_test\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/paths\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPaths(t *testing.T) {\n\tspec.Run(t, \"Paths\", testPaths, spec.Report(report.Terminal{}))\n}\n\nfunc testPaths(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#IsURI\", func() {\n\t\tfor _, params := range []struct {\n\t\t\tdesc    string\n\t\t\turi     string\n\t\t\tisValid bool\n\t\t}{\n\t\t\t{\n\t\t\t\tdesc:    \"missing scheme\",\n\t\t\t\turi:     \":/invalid\",\n\t\t\t\tisValid: false,\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:    \"missing scheme\",\n\t\t\t\turi:     \"://invalid\",\n\t\t\t\tisValid: false,\n\t\t\t},\n\t\t\t{\n\t\t\t\turi:     \"file://host/file.txt\",\n\t\t\t\tisValid: true,\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:    \"no host (shorthand)\",\n\t\t\t\turi:     \"file:/valid\",\n\t\t\t\tisValid: true,\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:    \"no host\",\n\t\t\t\turi:     \"file:///valid\",\n\t\t\t\tisValid: true,\n\t\t\t},\n\t\t} {\n\t\t\tparams := params\n\n\t\t\twhen(params.desc+\":\"+params.uri, func() {\n\t\t\t\tit(fmt.Sprintf(\"returns %v\", params.isValid), func() {\n\t\t\t\t\th.AssertEq(t, paths.IsURI(params.uri), params.isValid)\n\t\t\t\t})\n\t\t\t})\n\t\t}\n\t})\n\n\twhen(\"#FilterReservedNames\", func() {\n\t\twhen(\"volume contains a reserved name\", func() {\n\t\t\tit(\"modifies the volume name\", func() {\n\t\t\t\tvolumeName := \"auxauxaux\"\n\t\t\t\tsubject := paths.FilterReservedNames(volumeName)\n\t\t\t\texpected := \"a_u_xa_u_xa_u_x\"\n\t\t\t\tif subject != expected {\n\t\t\t\t\tt.Fatalf(\"The volume should not contain reserved names\")\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\n\t\twhen(\"volume does not contain reserved names\", func() {\n\t\t\tit(\"does not modify the volume name\", func() {\n\t\t\t\tvolumeName := \"lbtlbtlbt\"\n\t\t\t\tsubject := 
paths.FilterReservedNames(volumeName)\n\t\t\t\tif subject != volumeName {\n\t\t\t\t\tt.Fatalf(\"The volume should not be modified\")\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#FilePathToURI\", func() {\n\t\twhen(\"is windows\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.SkipIf(t, runtime.GOOS != \"windows\", \"Skipped on non-windows\")\n\t\t\t})\n\n\t\t\twhen(\"path is absolute\", func() {\n\t\t\t\tit(\"returns uri\", func() {\n\t\t\t\t\turi, err := paths.FilePathToURI(`C:\\some\\file.txt`, \"\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, uri, `file:///C:/some/file.txt`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"path is relative\", func() {\n\t\t\t\tvar (\n\t\t\t\t\terr    error\n\t\t\t\t\togDir  string\n\t\t\t\t\ttmpDir string\n\t\t\t\t)\n\t\t\t\tit.Before(func() {\n\t\t\t\t\togDir, err = os.Getwd()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\ttmpDir = os.TempDir()\n\n\t\t\t\t\terr = os.Chdir(tmpDir)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\terr := os.Chdir(ogDir)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns uri\", func() {\n\t\t\t\t\tcwd, err := os.Getwd()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\turi, err := paths.FilePathToURI(`some\\file.tgz`, \"\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, uri, fmt.Sprintf(`file:///%s/some/file.tgz`, filepath.ToSlash(cwd)))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"is *nix\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"Skipped on windows\")\n\t\t\t})\n\n\t\t\twhen(\"path is absolute\", func() {\n\t\t\t\tit(\"returns uri\", func() {\n\t\t\t\t\turi, err := paths.FilePathToURI(\"/tmp/file.tgz\", \"\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, uri, \"file:///tmp/file.tgz\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"path is relative\", func() {\n\t\t\t\tit(\"returns uri\", func() {\n\t\t\t\t\tcwd, err := os.Getwd()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\turi, err := 
paths.FilePathToURI(\"some/file.tgz\", \"\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, uri, fmt.Sprintf(\"file://%s/some/file.tgz\", cwd))\n\t\t\t\t})\n\n\t\t\t\tit(\"returns uri based on relativeTo\", func() {\n\t\t\t\t\turi, err := paths.FilePathToURI(\"some/file.tgz\", \"/my/base/dir\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, uri, \"file:///my/base/dir/some/file.tgz\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#URIToFilePath\", func() {\n\t\twhen(\"is windows\", func() {\n\t\t\twhen(\"uri is drive\", func() {\n\t\t\t\tit(\"returns path\", func() {\n\t\t\t\t\th.SkipIf(t, runtime.GOOS != \"windows\", \"Skipped on non-windows\")\n\n\t\t\t\t\tpath, err := paths.URIToFilePath(`file:///c:/laptop/file.tgz`)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, path, `c:\\laptop\\file.tgz`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"uri is network share\", func() {\n\t\t\t\tit(\"returns path\", func() {\n\t\t\t\t\th.SkipIf(t, runtime.GOOS != \"windows\", \"Skipped on non-windows\")\n\n\t\t\t\t\tpath, err := paths.URIToFilePath(`file://laptop/file.tgz`)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, path, `\\\\laptop\\file.tgz`)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"is *nix\", func() {\n\t\t\twhen(\"uri is valid\", func() {\n\t\t\t\tit(\"returns path\", func() {\n\t\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"Skipped on windows\")\n\n\t\t\t\t\tpath, err := paths.URIToFilePath(`file:///tmp/file.tgz`)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, path, `/tmp/file.tgz`)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#WindowsDir\", func() {\n\t\tit(\"returns the path directory\", func() {\n\t\t\tpath := paths.WindowsDir(`C:\\layers\\file.txt`)\n\t\t\th.AssertEq(t, path, `C:\\layers`)\n\t\t})\n\n\t\tit(\"returns empty for empty\", func() {\n\t\t\tpath := paths.WindowsBasename(\"\")\n\t\t\th.AssertEq(t, path, \"\")\n\t\t})\n\t})\n\n\twhen(\"#WindowsBasename\", func() {\n\t\tit(\"returns the path 
basename\", func() {\n\t\t\tpath := paths.WindowsBasename(`C:\\layers\\file.txt`)\n\t\t\th.AssertEq(t, path, `file.txt`)\n\t\t})\n\n\t\tit(\"returns empty for empty\", func() {\n\t\t\tpath := paths.WindowsBasename(\"\")\n\t\t\th.AssertEq(t, path, \"\")\n\t\t})\n\t})\n\n\twhen(\"#WindowsToSlash\", func() {\n\t\tit(\"returns the path; backward slashes converted to forward with volume stripped \", func() {\n\t\t\tpath := paths.WindowsToSlash(`C:\\layers\\file.txt`)\n\t\t\th.AssertEq(t, path, `/layers/file.txt`)\n\t\t})\n\n\t\tit(\"returns / for volume\", func() {\n\t\t\tpath := paths.WindowsToSlash(`c:\\`)\n\t\t\th.AssertEq(t, path, `/`)\n\t\t})\n\n\t\tit(\"returns empty for empty\", func() {\n\t\t\tpath := paths.WindowsToSlash(\"\")\n\t\t\th.AssertEq(t, path, \"\")\n\t\t})\n\t})\n\n\twhen(\"#WindowsPathSID\", func() {\n\t\twhen(\"UID and GID are both 0\", func() {\n\t\t\tit(`returns the built-in BUILTIN\\Administrators SID`, func() {\n\t\t\t\tsid := paths.WindowsPathSID(0, 0)\n\t\t\t\th.AssertEq(t, sid, \"S-1-5-32-544\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"UID and GID are both non-zero\", func() {\n\t\t\tit(`returns the built-in BUILTIN\\Users SID`, func() {\n\t\t\t\tsid := paths.WindowsPathSID(99, 99)\n\t\t\t\th.AssertEq(t, sid, \"S-1-5-32-545\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#CanonicalTarPath\", func() {\n\t\tfor _, params := range []struct {\n\t\t\tdesc     string\n\t\t\tpath     string\n\t\t\texpected string\n\t\t}{\n\t\t\t{\n\t\t\t\tdesc:     \"noop\",\n\t\t\t\tpath:     \"my/clean/path\",\n\t\t\t\texpected: \"my/clean/path\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:     \"leading slash\",\n\t\t\t\tpath:     \"/my/path\",\n\t\t\t\texpected: \"my/path\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:     \"dot\",\n\t\t\t\tpath:     \"my/./path\",\n\t\t\t\texpected: \"my/path\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:     \"dotdot\",\n\t\t\t\tpath:     \"my/../my/path\",\n\t\t\t\texpected: \"my/path\",\n\t\t\t},\n\t\t} {\n\t\t\tparams := 
params\n\n\t\t\twhen(params.desc+\":\"+params.path, func() {\n\t\t\t\tit(fmt.Sprintf(\"returns %v\", params.expected), func() {\n\t\t\t\t\th.AssertEq(t, paths.CanonicalTarPath(params.path), params.expected)\n\t\t\t\t})\n\t\t\t})\n\t\t}\n\t})\n}\n"
  },
  {
    "path": "internal/registry/buildpack.go",
    "content": "package registry\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\tggcrname \"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\n// Buildpack contains information about a buildpack stored in a Registry\ntype Buildpack struct {\n\tNamespace string `json:\"ns\"`\n\tName      string `json:\"name\"`\n\tVersion   string `json:\"version\"`\n\tYanked    bool   `json:\"yanked\"`\n\tAddress   string `json:\"addr,omitempty\"`\n}\n\n// Validate that a buildpack reference contains required information\nfunc Validate(b Buildpack) error {\n\tif b.Address == \"\" {\n\t\treturn errors.New(\"invalid entry: address is a required field\")\n\t}\n\t_, err := ggcrname.NewDigest(b.Address)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"invalid entry: '%s' is not a digest reference\", b.Address)\n\t}\n\n\treturn nil\n}\n\n// ParseNamespaceName parses a buildpack ID into Namespace and Name\nfunc ParseNamespaceName(id string) (ns string, name string, err error) {\n\tparts := strings.Split(id, \"/\")\n\tif len(parts) < 2 {\n\t\treturn \"\", \"\", fmt.Errorf(\"invalid id %s does not contain a namespace\", style.Symbol(id))\n\t} else if len(parts) > 2 {\n\t\treturn \"\", \"\", fmt.Errorf(\"invalid id %s contains unexpected characters\", style.Symbol(id))\n\t}\n\n\treturn parts[0], parts[1], nil\n}\n"
  },
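The split performed by `registry.ParseNamespaceName` above (an ID must contain exactly one `/`, and any version suffix stays attached to the name, as the test file confirms) can be sketched standalone. `parseNamespaceName` here is a hypothetical local copy, not the package's exported API:

```go
package main

import (
	"fmt"
	"strings"
)

// parseNamespaceName mirrors registry.ParseNamespaceName: exactly one "/"
// separates namespace from name; anything else is rejected.
func parseNamespaceName(id string) (ns, name string, err error) {
	parts := strings.Split(id, "/")
	switch {
	case len(parts) < 2:
		return "", "", fmt.Errorf("invalid id %q does not contain a namespace", id)
	case len(parts) > 2:
		return "", "", fmt.Errorf("invalid id %q contains unexpected characters", id)
	}
	return parts[0], parts[1], nil
}

func main() {
	ns, name, _ := parseNamespaceName("heroku/rust@1.2.3")
	fmt.Println(ns, name) // heroku rust@1.2.3

	_, _, err := parseNamespaceName("bad id")
	fmt.Println(err != nil) // true
}
```

Note that `@version` is deliberately not parsed here; splitting off the version is `buildpack.ParseRegistryID`'s job.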
  {
    "path": "internal/registry/buildpack_test.go",
    "content": "package registry_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/registry\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRegistryBuildpack(t *testing.T) {\n\tspec.Run(t, \"Buildpack\", testRegistryBuildpack, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testRegistryBuildpack(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#Validate\", func() {\n\t\tit(\"errors when address is missing\", func() {\n\t\t\tb := registry.Buildpack{\n\t\t\t\tAddress: \"\",\n\t\t\t}\n\n\t\t\th.AssertNotNil(t, registry.Validate(b))\n\t\t})\n\n\t\tit(\"errors when not a digest\", func() {\n\t\t\tb := registry.Buildpack{\n\t\t\t\tAddress: \"example.com/some/package:18\",\n\t\t\t}\n\n\t\t\th.AssertNotNil(t, registry.Validate(b))\n\t\t})\n\n\t\tit(\"succeeds when address is a digest\", func() {\n\t\t\tb := registry.Buildpack{\n\t\t\t\tAddress: \"example.com/some/package@sha256:8c27fe111c11b722081701dfed3bd55e039b9ce92865473cf4cdfa918071c566\",\n\t\t\t}\n\n\t\t\th.AssertNil(t, registry.Validate(b))\n\t\t})\n\t})\n\n\twhen(\"#ParseNamespaceName\", func() {\n\t\tit(\"should parse buildpack id into namespace and name\", func() {\n\t\t\tconst id = \"heroku/rust@1.2.3\"\n\t\t\tnamespace, name, err := registry.ParseNamespaceName(id)\n\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, namespace, \"heroku\")\n\t\t\th.AssertEq(t, name, \"rust@1.2.3\")\n\t\t})\n\n\t\tit(\"should provide an error for invalid id\", func() {\n\t\t\tconst id = \"bad id\"\n\t\t\t_, _, err := registry.ParseNamespaceName(id)\n\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/registry/git.go",
    "content": "package registry\n\nimport (\n\t\"bytes\"\n\t\"text/template\"\n\n\t\"github.com/pkg/errors\"\n)\n\n// GitCommit commits a Buildpack to a registry Cache.\nfunc GitCommit(b Buildpack, username string, registryCache Cache) error {\n\tif err := registryCache.Initialize(); err != nil {\n\t\treturn err\n\t}\n\n\tcommitTemplate, err := template.New(\"buildpack\").Parse(GitCommitTemplate)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tvar commit bytes.Buffer\n\tif err := commitTemplate.Execute(&commit, b); err != nil {\n\t\treturn errors.Wrap(err, \"creating template\")\n\t}\n\n\tif err := registryCache.Commit(b, username, commit.String()); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n"
  },
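The commit-message templating used by `GitCommit` can be demonstrated in isolation. This sketch inlines the same template text as `GitCommitTemplate` and a minimal `buildpack` struct standing in for `registry.Buildpack`:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// Same template text as GitCommitTemplate in registry_cache.go.
const commitTemplate = `{{ if .Yanked }}YANK{{else}}ADD{{end}} {{.Namespace}}/{{.Name}}@{{.Version}}`

// buildpack is a minimal stand-in for registry.Buildpack.
type buildpack struct {
	Namespace, Name, Version string
	Yanked                   bool
}

// commitMessage renders the commit template for a buildpack, as GitCommit does.
func commitMessage(b buildpack) (string, error) {
	tmpl, err := template.New("buildpack").Parse(commitTemplate)
	if err != nil {
		return "", err
	}
	var buf bytes.Buffer
	if err := tmpl.Execute(&buf, b); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	msg, _ := commitMessage(buildpack{Namespace: "example", Name: "python", Version: "1.0.0"})
	fmt.Println(msg) // ADD example/python@1.0.0
}
```

With `Yanked: true` the same template produces `YANK example/python@1.0.0`, which is exactly what git_test.go asserts against the resulting commit messages.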
  {
    "path": "internal/registry/git_test.go",
    "content": "package registry_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"testing\"\n\n\t\"github.com/go-git/go-git/v5\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/registry\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestGit(t *testing.T) {\n\tspec.Run(t, \"Git\", testGit, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testGit(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tregistryCache   registry.Cache\n\t\ttmpDir          string\n\t\terr             error\n\t\tregistryFixture string\n\t\toutBuf          bytes.Buffer\n\t\tlogger          logging.Logger\n\t\tusername        = \"supra08\"\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"registry\")\n\t\th.AssertNil(t, err)\n\n\t\tregistryFixture = h.CreateRegistryFixture(t, tmpDir, filepath.Join(\"..\", \"..\", \"testdata\", \"registry\"))\n\t\tregistryCache, err = registry.NewRegistryCache(logger, tmpDir, registryFixture)\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\tif runtime.GOOS != \"windows\" {\n\t\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t\t}\n\t\tos.RemoveAll(tmpDir)\n\t})\n\n\twhen(\"#GitCommit\", func() {\n\t\twhen(\"ADD buildpack\", func() {\n\t\t\tit(\"commits addition\", func() {\n\t\t\t\terr := registry.GitCommit(registry.Buildpack{\n\t\t\t\t\tNamespace: \"example\",\n\t\t\t\t\tName:      \"python\",\n\t\t\t\t\tVersion:   \"1.0.0\",\n\t\t\t\t\tYanked:    false,\n\t\t\t\t\tAddress:   \"example.com\",\n\t\t\t\t}, username, registryCache)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\trepo, err := git.PlainOpen(registryCache.Root)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\thead, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tcIter, err := repo.Log(&git.LogOptions{From: head.Hash()})\n\t\t\t\th.AssertNil(t, 
err)\n\n\t\t\t\tcommit, err := cIter.Next()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, commit.Message, \"ADD example/python@1.0.0\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"YANK buildpack\", func() {\n\t\t\tit(\"commits yank\", func() {\n\t\t\t\terr := registry.GitCommit(registry.Buildpack{\n\t\t\t\t\tNamespace: \"example\",\n\t\t\t\t\tName:      \"python\",\n\t\t\t\t\tVersion:   \"1.0.0\",\n\t\t\t\t\tYanked:    true,\n\t\t\t\t\tAddress:   \"example.com\",\n\t\t\t\t}, username, registryCache)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\trepo, err := git.PlainOpen(registryCache.Root)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\thead, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tcIter, err := repo.Log(&git.LogOptions{From: head.Hash()})\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tcommit, err := cIter.Next()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, commit.Message, \"YANK example/python@1.0.0\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/registry/github.go",
    "content": "package registry\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"net/url\"\n\t\"os/exec\"\n\t\"strings\"\n\t\"text/template\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype GithubIssue struct {\n\tTitle string\n\tBody  string\n}\n\nfunc CreateGithubIssue(b Buildpack) (GithubIssue, error) {\n\ttitleTemplate, err := template.New(\"buildpack\").Parse(GithubIssueTitleTemplate)\n\tif err != nil {\n\t\treturn GithubIssue{}, err\n\t}\n\n\tbodyTemplate, err := template.New(\"buildpack\").Parse(GithubIssueBodyTemplate)\n\tif err != nil {\n\t\treturn GithubIssue{}, err\n\t}\n\n\tvar title bytes.Buffer\n\terr = titleTemplate.Execute(&title, b)\n\tif err != nil {\n\t\treturn GithubIssue{}, err\n\t}\n\n\tvar body bytes.Buffer\n\terr = bodyTemplate.Execute(&body, b)\n\tif err != nil {\n\t\treturn GithubIssue{}, err\n\t}\n\n\treturn GithubIssue{\n\t\ttitle.String(),\n\t\tbody.String(),\n\t}, nil\n}\n\nfunc CreateBrowserCmd(browserURL, os string) (*exec.Cmd, error) {\n\t_, err := url.ParseRequestURI(browserURL)\n\tif err != nil {\n\t\treturn nil, errors.Wrapf(err, \"invalid URL %s\", style.Symbol(browserURL))\n\t}\n\n\tswitch os {\n\tcase \"linux\":\n\t\treturn exec.Command(\"xdg-open\", browserURL), nil\n\tcase \"windows\":\n\t\treturn exec.Command(\"rundll32\", \"url.dll,FileProtocolHandler\", browserURL), nil\n\tcase \"darwin\":\n\t\treturn exec.Command(\"open\", browserURL), nil\n\tdefault:\n\t\treturn nil, fmt.Errorf(\"unsupported platform %s\", style.Symbol(os))\n\t}\n}\n\nfunc GetIssueURL(githubURL string) (*url.URL, error) {\n\tif githubURL == \"\" {\n\t\treturn nil, errors.New(\"missing github URL\")\n\t}\n\treturn url.Parse(fmt.Sprintf(\"%s/issues/new\", strings.TrimSuffix(githubURL, \"/\")))\n}\n"
  },
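`GetIssueURL`'s normalization above (trim any trailing slash, then append `/issues/new`) is easy to verify in isolation. `issueURL` below is a hypothetical local copy of that logic, not the exported function:

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// issueURL mirrors registry.GetIssueURL: reject empty input, trim a trailing
// slash, and append the GitHub "new issue" path.
func issueURL(githubURL string) (string, error) {
	if githubURL == "" {
		return "", fmt.Errorf("missing github URL")
	}
	u, err := url.Parse(fmt.Sprintf("%s/issues/new", strings.TrimSuffix(githubURL, "/")))
	if err != nil {
		return "", err
	}
	return u.String(), nil
}

func main() {
	// Trailing-slash and bare forms normalize to the same URL.
	u, _ := issueURL("https://github.com/buildpacks/")
	fmt.Println(u) // https://github.com/buildpacks/issues/new
}
```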
  {
    "path": "internal/registry/github_test.go",
    "content": "package registry_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/registry\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestGithub(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Github\", testGitHub, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testGitHub(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#CreateBrowserCommand\", func() {\n\t\tit(\"should error when browser URL is invalid\", func() {\n\t\t\t_, err := registry.CreateBrowserCmd(\"\", \"linux\")\n\t\t\th.AssertError(t, err, \"invalid URL\")\n\t\t})\n\n\t\tit(\"should error when os is unsupported\", func() {\n\t\t\t_, err := registry.CreateBrowserCmd(\"https://url.com\", \"valinor\")\n\t\t\th.AssertError(t, err, \"unsupported platform 'valinor'\")\n\t\t})\n\n\t\tit(\"should work for linux\", func() {\n\t\t\tcmd, err := registry.CreateBrowserCmd(\"https://buildpacks.io\", \"linux\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, len(cmd.Args), 2)\n\t\t\th.AssertEq(t, cmd.Args[0], \"xdg-open\")\n\t\t\th.AssertEq(t, cmd.Args[1], \"https://buildpacks.io\")\n\t\t})\n\n\t\tit(\"should work for darwin\", func() {\n\t\t\tcmd, err := registry.CreateBrowserCmd(\"https://buildpacks.io\", \"darwin\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, len(cmd.Args), 2)\n\t\t\th.AssertEq(t, cmd.Args[0], \"open\")\n\t\t\th.AssertEq(t, cmd.Args[1], \"https://buildpacks.io\")\n\t\t})\n\n\t\tit(\"should work for windows\", func() {\n\t\t\tcmd, err := registry.CreateBrowserCmd(\"https://buildpacks.io\", \"windows\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, len(cmd.Args), 3)\n\t\t\th.AssertEq(t, cmd.Args[0], \"rundll32\")\n\t\t\th.AssertEq(t, cmd.Args[1], \"url.dll,FileProtocolHandler\")\n\t\t\th.AssertEq(t, cmd.Args[2], \"https://buildpacks.io\")\n\t\t})\n\t})\n\n\twhen(\"#GetIssueURL\", func() 
{\n\t\tit(\"should return an issueURL\", func() {\n\t\t\turl, err := registry.GetIssueURL(\"https://github.com/buildpacks\")\n\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, url.String(), \"https://github.com/buildpacks/issues/new\")\n\t\t})\n\n\t\tit(\"should fail when url is empty\", func() {\n\t\t\t_, err := registry.GetIssueURL(\"\")\n\n\t\t\th.AssertError(t, err, \"missing github URL\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/registry/index.go",
    "content": "package registry\n\nimport (\n\t\"fmt\"\n\t\"path/filepath\"\n\t\"regexp\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nvar (\n\tvalidCharsPattern = \"[a-z0-9\\\\-.]+\"\n\tvalidCharsRegexp  = regexp.MustCompile(fmt.Sprintf(\"^%s$\", validCharsPattern))\n)\n\n// IndexPath resolves the path for a specific namespace and name of buildpack\nfunc IndexPath(rootDir, ns, name string) (string, error) {\n\tif err := validateField(\"namespace\", ns); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tif err := validateField(\"name\", name); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tvar indexDir string\n\tswitch {\n\tcase len(name) == 1:\n\t\tindexDir = \"1\"\n\tcase len(name) == 2:\n\t\tindexDir = \"2\"\n\tcase len(name) == 3:\n\t\tindexDir = filepath.Join(\"3\", name[:2])\n\tdefault:\n\t\tindexDir = filepath.Join(name[:2], name[2:4])\n\t}\n\n\treturn filepath.Join(rootDir, indexDir, fmt.Sprintf(\"%s_%s\", ns, name)), nil\n}\n\nfunc validateField(field, value string) error {\n\tlength := len(value)\n\tswitch {\n\tcase length == 0:\n\t\treturn errors.Errorf(\"%s cannot be empty\", style.Symbol(field))\n\tcase length > 253:\n\t\treturn errors.Errorf(\"%s too long (max 253 chars)\", style.Symbol(field))\n\t}\n\n\tif !validCharsRegexp.MatchString(value) {\n\t\treturn errors.Errorf(\"%s contains illegal characters (must match %s)\", style.Symbol(field), style.Symbol(validCharsPattern))\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "internal/registry/index_test.go",
    "content": "package registry_test\n\nimport (\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/registry\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestIndex(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Index\", testIndex, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testIndex(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#IndexPath\", func() {\n\t\twhen(\"valid\", func() {\n\t\t\tfor _, scenario := range []struct {\n\t\t\t\tdesc,\n\t\t\t\troot,\n\t\t\t\tns,\n\t\t\t\tname,\n\t\t\t\texpectation string\n\t\t\t}{\n\t\t\t\t{\n\t\t\t\t\tdesc:        \"1 char name\",\n\t\t\t\t\troot:        \"/tmp\",\n\t\t\t\t\tns:          \"acme\",\n\t\t\t\t\tname:        \"a\",\n\t\t\t\t\texpectation: filepath.Join(\"/tmp\", \"1\", \"acme_a\"),\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:        \"2 char name\",\n\t\t\t\t\troot:        \"/tmp\",\n\t\t\t\t\tns:          \"acme\",\n\t\t\t\t\tname:        \"ab\",\n\t\t\t\t\texpectation: filepath.Join(\"/tmp\", \"2\", \"acme_ab\"),\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:        \"3 char name\",\n\t\t\t\t\troot:        \"/tmp\",\n\t\t\t\t\tns:          \"acme\",\n\t\t\t\t\tname:        \"abc\",\n\t\t\t\t\texpectation: filepath.Join(\"/tmp\", \"3\", \"ab\", \"acme_abc\"),\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:        \"4 char name\",\n\t\t\t\t\troot:        \"/tmp\",\n\t\t\t\t\tns:          \"acme\",\n\t\t\t\t\tname:        \"abcd\",\n\t\t\t\t\texpectation: filepath.Join(\"/tmp\", \"ab\", \"cd\", \"acme_abcd\"),\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:        \"> 4 char name\",\n\t\t\t\t\troot:        \"/tmp\",\n\t\t\t\t\tns:          \"acme\",\n\t\t\t\t\tname:        \"acmelang\",\n\t\t\t\t\texpectation: filepath.Join(\"/tmp\", \"ac\", \"me\", \"acme_acmelang\"),\n\t\t\t\t},\n\t\t\t} {\n\t\t\t\tscenario := 
scenario\n\t\t\t\tit(scenario.desc, func() {\n\t\t\t\t\tindex, err := registry.IndexPath(scenario.root, scenario.ns, scenario.name)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, index, scenario.expectation)\n\t\t\t\t})\n\t\t\t}\n\t\t})\n\n\t\twhen(\"invalid\", func() {\n\t\t\tfor _, scenario := range []struct {\n\t\t\t\tdesc,\n\t\t\t\tns,\n\t\t\t\tname,\n\t\t\t\terror string\n\t\t\t}{\n\t\t\t\t{\n\t\t\t\t\tdesc:  \"ns is empty\",\n\t\t\t\t\tns:    \"\",\n\t\t\t\t\tname:  \"a\",\n\t\t\t\t\terror: \"'namespace' cannot be empty\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:  \"name is empty\",\n\t\t\t\t\tns:    \"a\",\n\t\t\t\t\tname:  \"\",\n\t\t\t\t\terror: \"'name' cannot be empty\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:  \"namespace has capital letters\",\n\t\t\t\t\tns:    \"Acme\",\n\t\t\t\t\tname:  \"buildpack\",\n\t\t\t\t\terror: \"'namespace' contains illegal characters (must match '[a-z0-9\\\\-.]+')\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:  \"name has capital letters\",\n\t\t\t\t\tns:    \"acme\",\n\t\t\t\t\tname:  \"Buildpack\",\n\t\t\t\t\terror: \"'name' contains illegal characters (must match '[a-z0-9\\\\-.]+')\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:  \"namespace is too long\",\n\t\t\t\t\tns:    h.RandString(254),\n\t\t\t\t\tname:  \"buildpack\",\n\t\t\t\t\terror: \"'namespace' too long (max 253 chars)\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tdesc:  \"name is too long\",\n\t\t\t\t\tns:    \"acme\",\n\t\t\t\t\tname:  h.RandString(254),\n\t\t\t\t\terror: \"'name' too long (max 253 chars)\",\n\t\t\t\t},\n\t\t\t} {\n\t\t\t\tscenario := scenario\n\t\t\t\twhen(scenario.desc, func() {\n\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\t_, err := registry.IndexPath(\"/tmp\", scenario.ns, scenario.name)\n\t\t\t\t\t\th.AssertError(t, err, scenario.error)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t}\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/registry/registry_cache.go",
    "content": "package registry\n\nimport (\n\t\"bufio\"\n\t\"crypto/sha256\"\n\t\"encoding/hex\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"net/url\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"time\"\n\n\t\"github.com/go-git/go-git/v5\"\n\t\"github.com/go-git/go-git/v5/plumbing/object\"\n\t\"github.com/pkg/errors\"\n\t\"golang.org/x/mod/semver\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst DefaultRegistryURL = \"https://github.com/buildpacks/registry-index\"\nconst DefaultRegistryName = \"official\"\nconst defaultRegistryDir = \"registry\"\n\n// Cache is a RegistryCache\ntype Cache struct {\n\tlogger      logging.Logger\n\turl         *url.URL\n\tRoot        string\n\tRegistryDir string\n}\n\nconst GithubIssueTitleTemplate = \"{{ if .Yanked }}YANK{{ else }}ADD{{ end }} {{.Namespace}}/{{.Name}}@{{.Version}}\"\nconst GithubIssueBodyTemplate = `\nid = \"{{.Namespace}}/{{.Name}}\"\nversion = \"{{.Version}}\"\n{{ if .Yanked }}{{ else if .Address }}addr = \"{{.Address}}\"{{ end }}\n`\nconst GitCommitTemplate = `{{ if .Yanked }}YANK{{else}}ADD{{end}} {{.Namespace}}/{{.Name}}@{{.Version}}`\n\n// Entry is a list of buildpacks stored in a registry\ntype Entry struct {\n\tBuildpacks []Buildpack `json:\"buildpacks\"`\n}\n\n// NewDefaultRegistryCache creates a new registry cache with default options\nfunc NewDefaultRegistryCache(logger logging.Logger, home string) (Cache, error) {\n\treturn NewRegistryCache(logger, home, DefaultRegistryURL)\n}\n\n// NewRegistryCache creates a new registry cache\nfunc NewRegistryCache(logger logging.Logger, home, registryURL string) (Cache, error) {\n\tif _, err := os.Stat(home); err != nil {\n\t\treturn Cache{}, errors.Wrapf(err, \"finding home %s\", home)\n\t}\n\n\tnormalizedURL, err := url.Parse(registryURL)\n\tif err != nil {\n\t\treturn Cache{}, errors.Wrapf(err, \"parsing registry url %s\", 
registryURL)\n\t}\n\n\tkey := sha256.New()\n\tkey.Write([]byte(normalizedURL.String()))\n\tcacheDir := fmt.Sprintf(\"%s-%s\", defaultRegistryDir, hex.EncodeToString(key.Sum(nil)))\n\n\treturn Cache{\n\t\turl:    normalizedURL,\n\t\tlogger: logger,\n\t\tRoot:   filepath.Join(home, cacheDir),\n\t}, nil\n}\n\n// LocateBuildpack stored in registry\nfunc (r *Cache) LocateBuildpack(bp string) (Buildpack, error) {\n\terr := r.Refresh()\n\tif err != nil {\n\t\treturn Buildpack{}, errors.Wrap(err, \"refreshing cache\")\n\t}\n\n\tns, name, version, err := buildpack.ParseRegistryID(bp)\n\tif err != nil {\n\t\treturn Buildpack{}, errors.Wrap(err, \"parsing buildpacks registry id\")\n\t}\n\n\tentry, err := r.readEntry(ns, name)\n\tif err != nil {\n\t\treturn Buildpack{}, errors.Wrap(err, \"reading entry\")\n\t}\n\n\tif len(entry.Buildpacks) > 0 {\n\t\tif version == \"\" {\n\t\t\thighestVersion := entry.Buildpacks[0]\n\t\t\tif len(entry.Buildpacks) > 1 {\n\t\t\t\tfor _, bp := range entry.Buildpacks[1:] {\n\t\t\t\t\tif semver.Compare(fmt.Sprintf(\"v%s\", bp.Version), fmt.Sprintf(\"v%s\", highestVersion.Version)) > 0 {\n\t\t\t\t\t\thighestVersion = bp\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t\treturn highestVersion, Validate(highestVersion)\n\t\t}\n\n\t\tfor _, bpIndex := range entry.Buildpacks {\n\t\t\tif bpIndex.Version == version {\n\t\t\t\treturn bpIndex, Validate(bpIndex)\n\t\t\t}\n\t\t}\n\t\treturn Buildpack{}, fmt.Errorf(\"could not find version for buildpack: %s\", bp)\n\t}\n\n\treturn Buildpack{}, fmt.Errorf(\"no entries for buildpack: %s\", bp)\n}\n\n// Refresh local Registry Cache\nfunc (r *Cache) Refresh() error {\n\tr.logger.Debugf(\"Refreshing registry cache for %s/%s\", r.url.Host, r.url.Path)\n\n\tif err := r.Initialize(); err != nil {\n\t\treturn errors.Wrapf(err, \"initializing (%s)\", r.Root)\n\t}\n\n\trepository, err := git.PlainOpen(r.Root)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"opening (%s)\", r.Root)\n\t}\n\n\tw, err := repository.Worktree()\n\tif 
err != nil {\n\t\treturn errors.Wrapf(err, \"reading (%s)\", r.Root)\n\t}\n\n\terr = w.Pull(&git.PullOptions{RemoteName: \"origin\"})\n\tif err == git.NoErrAlreadyUpToDate {\n\t\treturn nil\n\t}\n\treturn err\n}\n\n// Initialize a local Registry Cache\nfunc (r *Cache) Initialize() error {\n\t_, err := os.Stat(r.Root)\n\tif err != nil {\n\t\tif os.IsNotExist(err) {\n\t\t\terr = r.CreateCache()\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"creating registry cache\")\n\t\t\t}\n\t\t}\n\t}\n\n\tif err := r.validateCache(); err != nil {\n\t\terr = os.RemoveAll(r.Root)\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"resetting registry cache\")\n\t\t}\n\t\terr = r.CreateCache()\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"rebuilding registry cache\")\n\t\t}\n\t}\n\n\treturn nil\n}\n\n// CreateCache creates the cache on the filesystem\nfunc (r *Cache) CreateCache() error {\n\tvar repository *git.Repository\n\tr.logger.Debugf(\"Creating registry cache for %s/%s\", r.url.Host, r.url.Path)\n\n\tregistryDir, err := os.MkdirTemp(filepath.Dir(r.Root), \"registry\")\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tr.RegistryDir = registryDir\n\n\tif r.url.Host == \"dev.azure.com\" {\n\t\terr = exec.Command(\"git\", \"clone\", r.url.String(), r.RegistryDir).Run()\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"cloning remote registry with native git\")\n\t\t}\n\n\t\trepository, err = git.PlainOpen(r.RegistryDir)\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"opening remote registry clone\")\n\t\t}\n\t} else {\n\t\trepository, err = git.PlainClone(r.RegistryDir, false, &git.CloneOptions{\n\t\t\tURL: r.url.String(),\n\t\t})\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"cloning remote registry\")\n\t\t}\n\t}\n\n\tw, err := repository.Worktree()\n\tif err != nil {\n\t\treturn err\n\t}\n\n\terr = os.Rename(w.Filesystem.Root(), r.Root)\n\tif err != nil {\n\t\tif os.IsExist(err) {\n\t\t\t// If pack is run concurrently, this action might have 
already occurred\n\t\t\treturn nil\n\t\t}\n\t\treturn err\n\t}\n\treturn nil\n}\n\nfunc (r *Cache) validateCache() error {\n\tr.logger.Debugf(\"Validating registry cache for %s/%s\", r.url.Host, r.url.Path)\n\n\trepository, err := git.PlainOpen(r.Root)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"opening registry cache\")\n\t}\n\n\tremotes, err := repository.Remotes()\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"accessing registry cache\")\n\t}\n\n\tfor _, remote := range remotes {\n\t\tif remote.Config().Name == \"origin\" && remote.Config().URLs[0] == r.url.String() {\n\t\t\treturn nil\n\t\t}\n\t}\n\treturn errors.New(\"invalid registry cache remote\")\n}\n\n// Commit a Buildpack change\nfunc (r *Cache) Commit(b Buildpack, username, msg string) error {\n\tr.logger.Debugf(\"Creating commit in registry cache\")\n\n\tif msg == \"\" {\n\t\treturn errors.New(\"invalid commit message\")\n\t}\n\n\trepository, err := git.PlainOpen(r.Root)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"opening registry cache\")\n\t}\n\n\tw, err := repository.Worktree()\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"reading %s\", style.Symbol(r.Root))\n\t}\n\n\tindex, err := r.writeEntry(b)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"writing %s\", style.Symbol(index))\n\t}\n\n\trelativeIndexFile, err := filepath.Rel(r.Root, index)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"resolving relative path\")\n\t}\n\n\tif _, err := w.Add(relativeIndexFile); err != nil {\n\t\treturn errors.Wrapf(err, \"adding %s\", style.Symbol(index))\n\t}\n\n\tif _, err := w.Commit(msg, &git.CommitOptions{\n\t\tAuthor: &object.Signature{\n\t\t\tName:  username,\n\t\t\tEmail: \"\",\n\t\t\tWhen:  time.Now(),\n\t\t},\n\t}); err != nil {\n\t\treturn errors.Wrapf(err, \"committing\")\n\t}\n\n\treturn nil\n}\n\nfunc (r *Cache) writeEntry(b Buildpack) (string, error) {\n\tvar ns = b.Namespace\n\tvar name = b.Name\n\n\tindex, err := IndexPath(r.Root, ns, name)\n\tif err != nil 
{\n\t\treturn \"\", err\n\t}\n\n\tif _, err := os.Stat(index); os.IsNotExist(err) {\n\t\tif err := os.MkdirAll(filepath.Dir(index), 0750); err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"creating directory structure for: %s/%s\", ns, name)\n\t\t}\n\t} else if err == nil {\n\t\tentry, err := r.readEntry(ns, name)\n\t\tif err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"reading existing buildpack entries\")\n\t\t}\n\n\t\tavailableBuildpacks := entry.Buildpacks\n\n\t\tif len(availableBuildpacks) != 0 {\n\t\t\tif availableBuildpacks[len(availableBuildpacks)-1].Version == b.Version {\n\t\t\t\t// err is nil here, so wrapping it would return nil; construct a new error instead\n\t\t\t\treturn \"\", errors.New(\"same version exists, upgrade the version to add\")\n\t\t\t}\n\t\t}\n\t}\n\n\tf, err := os.OpenFile(filepath.Clean(index), os.O_APPEND|os.O_CREATE|os.O_RDWR, 0644)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"creating buildpack file: %s/%s\", ns, name)\n\t}\n\tdefer f.Close()\n\n\tnewline := \"\\n\"\n\tif runtime.GOOS == \"windows\" {\n\t\tnewline = \"\\r\\n\"\n\t}\n\n\tfileContents, err := json.Marshal(b)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"converting buildpack file to json: %s/%s\", ns, name)\n\t}\n\n\tfileContentsFormatted := string(fileContents) + newline\n\tif _, err := f.WriteString(fileContentsFormatted); err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"writing buildpack to file: %s/%s\", ns, name)\n\t}\n\n\treturn index, nil\n}\n\nfunc (r *Cache) readEntry(ns, name string) (Entry, error) {\n\tindex, err := IndexPath(r.Root, ns, name)\n\tif err != nil {\n\t\treturn Entry{}, err\n\t}\n\n\tif _, err := os.Stat(index); err != nil {\n\t\treturn Entry{}, errors.Wrapf(err, \"finding buildpack: %s/%s\", ns, name)\n\t}\n\n\tfile, err := os.Open(filepath.Clean(index))\n\tif err != nil {\n\t\treturn Entry{}, errors.Wrapf(err, \"opening index for buildpack: %s/%s\", ns, name)\n\t}\n\tdefer file.Close()\n\n\tentry := Entry{}\n\tscanner := 
bufio.NewScanner(file)\n\tfor scanner.Scan() {\n\t\tvar bp Buildpack\n\t\terr = json.Unmarshal([]byte(scanner.Text()), &bp)\n\t\tif err != nil {\n\t\t\treturn Entry{}, errors.Wrapf(err, \"parsing index for buildpack: %s/%s\", ns, name)\n\t\t}\n\n\t\tentry.Buildpacks = append(entry.Buildpacks, bp)\n\t}\n\n\tif err := scanner.Err(); err != nil {\n\t\treturn entry, errors.Wrapf(err, \"reading index for buildpack: %s/%s\", ns, name)\n\t}\n\n\treturn entry, nil\n}\n"
  },
  {
    "path": "internal/registry/registry_cache_test.go",
    "content": "package registry\n\nimport (\n\t\"bytes\"\n\t\"net/url\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/go-git/go-git/v5\"\n\t\"github.com/go-git/go-git/v5/plumbing/object\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRegistryCache(t *testing.T) {\n\tcolor.Disable(true)\n\tspec.Run(t, \"RegistryCache\", testRegistryCache, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testRegistryCache(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\ttmpDir          string\n\t\terr             error\n\t\tregistryFixture string\n\t\toutBuf          bytes.Buffer\n\t\tlogger          logging.Logger\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"registry\")\n\t\th.AssertNil(t, err)\n\n\t\tregistryFixture = h.CreateRegistryFixture(t, tmpDir, filepath.Join(\"..\", \"..\", \"testdata\", \"registry\"))\n\t})\n\n\tit.After(func() {\n\t\t// Ignoring the error for now, it failed randomly on windows\n\t\t_ = os.RemoveAll(tmpDir)\n\t})\n\n\twhen(\"#NewDefaultRegistryCache\", func() {\n\t\tit(\"creates a RegistryCache with default URL\", func() {\n\t\t\tregistryCache, err := NewDefaultRegistryCache(logger, tmpDir)\n\t\t\th.AssertNil(t, err)\n\t\t\tnormalizedURL, err := url.Parse(\"https://github.com/buildpacks/registry-index\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, registryCache.url, normalizedURL)\n\t\t})\n\t})\n\n\twhen(\"#NewRegistryCache\", func() {\n\t\twhen(\"home doesn't exist\", func() {\n\t\t\tit(\"fails to create a registry cache\", func() {\n\t\t\t\t_, err := NewRegistryCache(logger, \"/tmp/not-exist\", \"not-here\")\n\t\t\t\th.AssertError(t, err, \"finding home\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"registryURL isn't a valid url\", func() 
{\n\t\t\tit(\"fails to create a registry cache\", func() {\n\t\t\t\t_, err := NewRegistryCache(logger, tmpDir, \"://bad-uri\")\n\t\t\t\th.AssertError(t, err, \"parsing registry url\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"registryURL is Azure\", func() {\n\t\t\tit(\"creates a registry cache\", func() {\n\t\t\t\t_, err := NewRegistryCache(logger, tmpDir, \"https://dev.azure.com/\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\tit(\"creates a RegistryCache\", func() {\n\t\t\tregistryCache, err := NewRegistryCache(logger, tmpDir, registryFixture)\n\t\t\th.AssertNil(t, err)\n\t\t\texpectedRoot := filepath.Join(tmpDir, \"registry\")\n\t\t\tactualRoot := strings.Split(registryCache.Root, \"-\")[0]\n\t\t\th.AssertEq(t, actualRoot, expectedRoot)\n\t\t})\n\t})\n\n\twhen(\"#LocateBuildpack\", func() {\n\t\tvar (\n\t\t\tregistryCache Cache\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tregistryCache, err = NewRegistryCache(logger, tmpDir, registryFixture)\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit(\"locates a buildpack without version (name len 3)\", func() {\n\t\t\tbp, err := registryCache.LocateBuildpack(\"example/foo\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertNotNil(t, bp)\n\n\t\t\th.AssertEq(t, bp.Namespace, \"example\")\n\t\t\th.AssertEq(t, bp.Name, \"foo\")\n\t\t\th.AssertEq(t, bp.Version, \"1.2.0\")\n\t\t})\n\n\t\tit(\"locates a buildpack without version (name len 4)\", func() {\n\t\t\tbp, err := registryCache.LocateBuildpack(\"example/java\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertNotNil(t, bp)\n\n\t\t\th.AssertEq(t, bp.Namespace, \"example\")\n\t\t\th.AssertEq(t, bp.Name, \"java\")\n\t\t\th.AssertEq(t, bp.Version, \"1.0.0\")\n\t\t})\n\n\t\tit(\"locates a buildpack with version\", func() {\n\t\t\tbp, err := registryCache.LocateBuildpack(\"example/foo@1.1.0\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertNotNil(t, bp)\n\n\t\t\th.AssertEq(t, bp.Namespace, \"example\")\n\t\t\th.AssertEq(t, bp.Name, \"foo\")\n\t\t\th.AssertEq(t, bp.Version, 
\"1.1.0\")\n\t\t})\n\n\t\tit(\"returns error if can't parse buildpack id\", func() {\n\t\t\t_, err := registryCache.LocateBuildpack(\"quack\")\n\t\t\th.AssertError(t, err, \"parsing buildpacks registry id\")\n\t\t})\n\n\t\tit(\"returns error if buildpack id is empty\", func() {\n\t\t\t_, err := registryCache.LocateBuildpack(\"example/\")\n\t\t\th.AssertError(t, err, \"'name' cannot be empty\")\n\t\t})\n\n\t\tit(\"returns error if can't find buildpack with requested id\", func() {\n\t\t\t_, err := registryCache.LocateBuildpack(\"example/qu\")\n\t\t\th.AssertError(t, err, \"reading entry\")\n\t\t})\n\n\t\tit(\"returns error if can't find buildpack with requested version\", func() {\n\t\t\t_, err := registryCache.LocateBuildpack(\"example/foo@3.5.6\")\n\t\t\th.AssertError(t, err, \"could not find version\")\n\t\t})\n\t})\n\n\twhen(\"#Refresh\", func() {\n\t\tvar (\n\t\t\tregistryCache Cache\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tregistryCache, err = NewRegistryCache(logger, tmpDir, registryFixture)\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\twhen(\"registry has new commits\", func() {\n\t\t\tit(\"pulls the latest index\", func() {\n\t\t\t\th.AssertNil(t, registryCache.Refresh())\n\t\t\t\th.AssertGitHeadEq(t, registryFixture, registryCache.Root)\n\n\t\t\t\tr, err := git.PlainOpen(registryFixture)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tw, err := r.Worktree()\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tcommit, err := w.Commit(\"second\", &git.CommitOptions{\n\t\t\t\t\tAuthor: &object.Signature{\n\t\t\t\t\t\tName:  \"John Doe\",\n\t\t\t\t\t\tEmail: \"john@doe.org\",\n\t\t\t\t\t\tWhen:  time.Now(),\n\t\t\t\t\t},\n\t\t\t\t\tAllowEmptyCommits: true,\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t_, err = r.CommitObject(commit)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertNil(t, registryCache.Refresh())\n\t\t\t\th.AssertGitHeadEq(t, registryFixture, registryCache.Root)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Root is an empty string\", func() {\n\t\t\tit(\"fails to refresh\", 
func() {\n\t\t\t\tregistryCache.Root = \"\"\n\t\t\t\terr = registryCache.Refresh()\n\t\t\t\th.AssertError(t, err, \"initializing\")\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNil(t, os.RemoveAll(registryCache.RegistryDir))\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Initialize\", func() {\n\t\tvar (\n\t\t\tregistryCache Cache\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tregistryCache, err = NewRegistryCache(logger, tmpDir, registryFixture)\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\twhen(\"root is empty string\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tregistryCache.Root = \"\"\n\t\t\t})\n\n\t\t\tit(\"fails to create registry cache\", func() {\n\t\t\t\terr = registryCache.Initialize()\n\t\t\t\th.AssertError(t, err, \"creating registry cache\")\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNil(t, os.RemoveAll(registryCache.RegistryDir))\n\t\t\t})\n\n\t\t\twhen(\"url is empty string\", func() {\n\t\t\t\tit(\"fails to clone cache\", func() {\n\t\t\t\t\tregistryCache.url = &url.URL{}\n\n\t\t\t\t\terr = registryCache.Initialize()\n\t\t\t\t\th.AssertError(t, err, \"cloning remote registry\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Commit\", func() {\n\t\tbp := Buildpack{\n\t\t\tNamespace: \"example\",\n\t\t\tName:      \"python\",\n\t\t\tVersion:   \"1.0.0\",\n\t\t\tYanked:    false,\n\t\t\tAddress:   \"example.com\",\n\t\t}\n\n\t\tvar (\n\t\t\tregistryCache Cache\n\t\t\tmsg           = \"test commit message\"\n\t\t\tusername      = \"supra08\"\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tregistryCache, err = NewRegistryCache(logger, tmpDir, registryFixture)\n\t\t\th.AssertNil(t, err)\n\n\t\t\terr = registryCache.CreateCache()\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\twhen(\"correct buildpack and commit message is passed\", func() {\n\t\t\tit(\"creates a file and a commit\", func() {\n\t\t\t\th.AssertNil(t, registryCache.Commit(bp, username, msg))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"empty commit message is passed\", func() {\n\t\t\tit(\"fails to create commit\", 
func() {\n\t\t\t\terr := registryCache.Commit(bp, username, \"\")\n\t\t\t\th.AssertError(t, err, \"invalid commit message\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Root is an empty string\", func() {\n\t\t\tit(\"fails to create commit\", func() {\n\t\t\t\tregistryCache.Root = \"\"\n\t\t\t\terr = registryCache.Commit(bp, username, msg)\n\t\t\t\th.AssertError(t, err, \"opening registry cache: repository does not exist\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"name is empty in buildpack\", func() {\n\t\t\tit(\"fails to create commit\", func() {\n\t\t\t\tbp := Buildpack{\n\t\t\t\t\tNamespace: \"example\",\n\t\t\t\t\tName:      \"\",\n\t\t\t\t\tVersion:   \"1.0.0\",\n\t\t\t\t\tYanked:    false,\n\t\t\t\t\tAddress:   \"example.com\",\n\t\t\t\t}\n\t\t\t\terr = registryCache.Commit(bp, username, msg)\n\t\t\t\th.AssertError(t, err, \"'name' cannot be empty\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"namespace is empty in buildpack\", func() {\n\t\t\tit(\"fails to create commit\", func() {\n\t\t\t\tbp := Buildpack{\n\t\t\t\t\tNamespace: \"\",\n\t\t\t\t\tName:      \"python\",\n\t\t\t\t\tVersion:   \"1.0.0\",\n\t\t\t\t\tYanked:    false,\n\t\t\t\t\tAddress:   \"example.com\",\n\t\t\t\t}\n\t\t\t\terr = registryCache.Commit(bp, username, msg)\n\t\t\t\th.AssertError(t, err, \"'namespace' cannot be empty\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/slices/slices.go",
    "content": "package slices\n\nfunc MapString(vs []string, f func(string) string) []string {\n\tvsm := make([]string, len(vs))\n\tfor i, v := range vs {\n\t\tvsm[i] = f(v)\n\t}\n\treturn vsm\n}\n"
  },
  {
    "path": "internal/slices/slices_test.go",
    "content": "package slices_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\n\t\"github.com/buildpacks/pack/internal/slices\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestMapString(t *testing.T) {\n\tspec.Run(t, \"Slices\", func(t *testing.T, when spec.G, it spec.S) {\n\t\tvar (\n\t\t\tassert = h.NewAssertionManager(t)\n\t\t)\n\n\t\twhen(\"#MapString\", func() {\n\t\t\tit(\"maps each value\", func() {\n\t\t\t\tinput := []string{\"hello\", \"1\", \"2\", \"world\"}\n\t\t\t\texpected := []string{\"hello.\", \"1.\", \"2.\", \"world.\"}\n\t\t\t\tfn := func(v string) string {\n\t\t\t\t\treturn v + \".\"\n\t\t\t\t}\n\n\t\t\t\toutput := slices.MapString(input, fn)\n\t\t\t\tassert.Equal(output, expected)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/sshdialer/posix_test.go",
    "content": "//go:build !windows\n\npackage sshdialer_test\n\nimport (\n\t\"errors\"\n\t\"net\"\n\t\"os\"\n)\n\nfunc fixupPrivateKeyMod(path string) {\n\terr := os.Chmod(path, 0400)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n}\n\nfunc listen(addr string) (net.Listener, error) {\n\treturn net.Listen(\"unix\", addr)\n}\n\nfunc isErrClosed(err error) bool {\n\treturn errors.Is(err, net.ErrClosed)\n}\n"
  },
  {
    "path": "internal/sshdialer/server_test.go",
    "content": "package sshdialer_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"crypto/md5\"\n\t\"encoding/binary\"\n\t\"errors\"\n\t\"fmt\"\n\t\"io\"\n\t\"net\"\n\t\"net/http\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strconv\"\n\t\"strings\"\n\t\"sync\"\n\t\"testing\"\n\t\"time\"\n\n\t\"golang.org/x/crypto/ssh\"\n)\n\ntype SSHServer struct {\n\tlock           sync.Locker\n\tdockerServer   http.Server\n\tdockerListener listener\n\tdockerHost     string\n\thostIPv4       string\n\thostIPv6       string\n\tportIPv4       int\n\tportIPv6       int\n\thasDialStdio   bool\n\tisWin          bool\n}\n\nfunc (s *SSHServer) SetIsWindows(v bool) {\n\ts.lock.Lock()\n\tdefer s.lock.Unlock()\n\ts.isWin = v\n}\n\nfunc (s *SSHServer) IsWindows() bool {\n\ts.lock.Lock()\n\tdefer s.lock.Unlock()\n\treturn s.isWin\n}\n\nfunc (s *SSHServer) SetDockerHostEnvVar(host string) {\n\ts.lock.Lock()\n\tdefer s.lock.Unlock()\n\ts.dockerHost = host\n}\n\nfunc (s *SSHServer) GetDockerHostEnvVar() string {\n\ts.lock.Lock()\n\tdefer s.lock.Unlock()\n\treturn s.dockerHost\n}\n\nfunc (s *SSHServer) HasDialStdio() bool {\n\ts.lock.Lock()\n\tdefer s.lock.Unlock()\n\treturn s.hasDialStdio\n}\n\nfunc (s *SSHServer) SetHasDialStdio(v bool) {\n\ts.lock.Lock()\n\tdefer s.lock.Unlock()\n\ts.hasDialStdio = v\n}\n\nconst dockerUnixSocket = \"/home/testuser/test.sock\"\nconst dockerTCPSocket = \"localhost:1234\"\n\n// We need to set up SSH server against which we will run the tests.\n// This will return SSHServer structure representing the state of the testing server.\n// It also returns clean up procedure stopSSH used to shut down the server.\nfunc prepareSSHServer(t *testing.T) (sshServer *SSHServer, stopSSH func(), err error) {\n\tctx, cancel := context.WithCancel(context.Background())\n\tdefer func() {\n\t\tif err != nil {\n\t\t\tcancel()\n\t\t}\n\t}()\n\thttpServerErrChan := make(chan error)\n\tpollingLoopErr := make(chan error)\n\tpollingLoopIPv6Err := make(chan error)\n\n\thandlePing := 
http.HandlerFunc(func(writer http.ResponseWriter, request *http.Request) {\n\t\twriter.Header().Add(\"Content-Type\", \"text/plain\")\n\t\twriter.WriteHeader(200)\n\t\t_, _ = writer.Write([]byte(\"OK\"))\n\t})\n\n\tsshServer = &SSHServer{\n\t\tdockerServer: http.Server{\n\t\t\tHandler: handlePing,\n\t\t},\n\t\tdockerListener: listener{conns: make(chan net.Conn), closed: make(chan struct{})},\n\t\tlock:           &sync.Mutex{},\n\t}\n\n\tsshTCPListener, err := net.Listen(\"tcp4\", \"localhost:0\")\n\tif err != nil {\n\t\treturn sshServer, stopSSH, err\n\t}\n\n\thasIPv6 := true\n\tsshTCP6Listener, err := net.Listen(\"tcp6\", \"localhost:0\")\n\tif err != nil {\n\t\thasIPv6 = false\n\t\tt.Log(err)\n\t}\n\n\thost, p, err := net.SplitHostPort(sshTCPListener.Addr().String())\n\tif err != nil {\n\t\treturn sshServer, stopSSH, err\n\t}\n\tport, err := strconv.ParseInt(p, 10, 32)\n\tif err != nil {\n\t\treturn sshServer, stopSSH, err\n\t}\n\tsshServer.hostIPv4 = host\n\tsshServer.portIPv4 = int(port)\n\n\tif hasIPv6 {\n\t\thost, p, err = net.SplitHostPort(sshTCP6Listener.Addr().String())\n\t\tif err != nil {\n\t\t\treturn sshServer, stopSSH, err\n\t\t}\n\t\tport, err = strconv.ParseInt(p, 10, 32)\n\t\tif err != nil {\n\t\t\treturn sshServer, stopSSH, err\n\t\t}\n\t\tsshServer.hostIPv6 = host\n\t\tsshServer.portIPv6 = int(port)\n\t}\n\n\tt.Logf(\"Listening on %s\", sshTCPListener.Addr())\n\tif hasIPv6 {\n\t\tt.Logf(\"Listening on %s\", sshTCP6Listener.Addr())\n\t}\n\n\tgo func() {\n\t\thttpServerErrChan <- sshServer.dockerServer.Serve(&sshServer.dockerListener)\n\t}()\n\n\tstopSSH = func() {\n\t\tvar err error\n\t\tcancel()\n\n\t\tstopCtx, cancel := context.WithTimeout(context.Background(), time.Second*5)\n\t\tdefer cancel()\n\t\terr = sshServer.dockerServer.Shutdown(stopCtx)\n\t\tif err != nil {\n\t\t\tt.Error(err)\n\t\t}\n\n\t\terr = <-httpServerErrChan\n\t\tif err != nil && !strings.Contains(err.Error(), \"Server closed\") 
{\n\t\t\tt.Error(err)\n\t\t}\n\n\t\tsshTCPListener.Close()\n\t\terr = <-pollingLoopErr\n\t\tif err != nil && !errors.Is(err, net.ErrClosed) {\n\t\t\tt.Error(err)\n\t\t}\n\n\t\tif hasIPv6 {\n\t\t\tsshTCP6Listener.Close()\n\t\t\terr = <-pollingLoopIPv6Err\n\t\t\tif err != nil && !errors.Is(err, net.ErrClosed) {\n\t\t\t\tt.Error(err)\n\t\t\t}\n\t\t}\n\t}\n\n\tconnChan := make(chan net.Conn)\n\n\tgo func() {\n\t\tfor {\n\t\t\ttcpConn, err := sshTCPListener.Accept()\n\t\t\tif err != nil {\n\t\t\t\tpollingLoopErr <- err\n\t\t\t\treturn\n\t\t\t}\n\t\t\tconnChan <- tcpConn\n\t\t}\n\t}()\n\n\tif hasIPv6 {\n\t\tgo func() {\n\t\t\tfor {\n\t\t\t\ttcpConn, err := sshTCP6Listener.Accept()\n\t\t\t\tif err != nil {\n\t\t\t\t\tpollingLoopIPv6Err <- err\n\t\t\t\t\treturn\n\t\t\t\t}\n\t\t\t\tconnChan <- tcpConn\n\t\t\t}\n\t\t}()\n\t}\n\n\tgo func() {\n\t\tfor {\n\t\t\tconn := <-connChan\n\t\t\tgo func(conn net.Conn) {\n\t\t\t\terr := sshServer.handleConnection(ctx, conn)\n\t\t\t\tif err != nil {\n\t\t\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t\t\t}\n\t\t\t}(conn)\n\t\t}\n\t}()\n\n\treturn sshServer, stopSSH, err\n}\n\nfunc setupServerAuth(conf *ssh.ServerConfig) (err error) {\n\tpasswd := map[string]string{\n\t\t\"testuser\": \"idkfa\",\n\t\t\"root\":     \"iddqd\",\n\t}\n\n\tauthorizedKeysFiles := []string{\"id_ed25519.pub\", \"id_rsa.pub\"}\n\tauthorizedKeys := make(map[[16]byte][]byte, len(authorizedKeysFiles))\n\tfor _, key := range authorizedKeysFiles {\n\t\tkeyFileName := filepath.Join(\"testdata\", key)\n\t\tvar bs []byte\n\t\tbs, err = os.ReadFile(keyFileName)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tvar pk ssh.PublicKey\n\t\tpk, _, _, _, err = ssh.ParseAuthorizedKey(bs)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tbs = pk.Marshal()\n\t\tauthorizedKeys[md5.Sum(bs)] = bs\n\t}\n\n\t*conf = ssh.ServerConfig{\n\t\tPasswordCallback: func(conn ssh.ConnMetadata, password []byte) (*ssh.Permissions, error) {\n\t\t\tif p, ok := passwd[conn.User()]; ok && p 
== string(password) {\n\t\t\t\treturn nil, nil\n\t\t\t}\n\t\t\treturn nil, fmt.Errorf(\"incorrect password %q for user %q\", string(password), conn.User())\n\t\t},\n\t\tPublicKeyCallback: func(conn ssh.ConnMetadata, key ssh.PublicKey) (*ssh.Permissions, error) {\n\t\t\tkeyBytes := key.Marshal()\n\t\t\tif b, ok := authorizedKeys[md5.Sum(keyBytes)]; ok && bytes.Equal(b, keyBytes) {\n\t\t\t\treturn &ssh.Permissions{}, nil\n\t\t\t}\n\t\t\treturn nil, fmt.Errorf(\"untrusted public key: %q\", string(keyBytes))\n\t\t},\n\t}\n\n\thostKeys := []string{\"ssh_host_ecdsa_key\", \"ssh_host_ed25519_key\", \"ssh_host_rsa_key\"}\n\tserverKeysDir := filepath.Join(\"testdata\", \"etc\", \"ssh\")\n\tfor _, key := range hostKeys {\n\t\tkeyFileName := filepath.Join(serverKeysDir, key)\n\t\tvar b []byte\n\t\tb, err = os.ReadFile(keyFileName)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tvar signer ssh.Signer\n\t\tsigner, err = ssh.ParsePrivateKey(b)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tconf.AddHostKey(signer)\n\t}\n\n\treturn nil\n}\n\nfunc (s *SSHServer) handleConnection(ctx context.Context, conn net.Conn) error {\n\tvar config ssh.ServerConfig\n\tsetupServerAuth(&config)\n\tsshConn, newChannels, reqs, err := ssh.NewServerConn(conn, &config)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tgo func() {\n\t\t<-ctx.Done()\n\t\terr = sshConn.Close()\n\t\tif err != nil && !errors.Is(err, net.ErrClosed) {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t}\n\t}()\n\n\tvar wg sync.WaitGroup\n\n\twg.Add(1)\n\tgo func() {\n\t\tdefer wg.Done()\n\t\tssh.DiscardRequests(reqs)\n\t}()\n\n\tfor newChannel := range newChannels {\n\t\twg.Add(1)\n\t\tgo func(newChannel ssh.NewChannel) {\n\t\t\tdefer wg.Done()\n\t\t\ts.handleChannel(newChannel)\n\t\t}(newChannel)\n\t}\n\n\twg.Wait()\n\n\treturn nil\n}\n\nfunc (s *SSHServer) handleChannel(newChannel ssh.NewChannel) {\n\tvar err error\n\tswitch newChannel.ChannelType() {\n\tcase \"session\":\n\t\ts.handleSession(newChannel)\n\tcase 
\"direct-streamlocal@openssh.com\", \"direct-tcpip\":\n\t\ts.handleTunnel(newChannel)\n\tdefault:\n\t\terr = newChannel.Reject(ssh.UnknownChannelType, fmt.Sprintf(\"type of channel %q is not supported\", newChannel.ChannelType()))\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t}\n\t}\n}\n\nfunc (s *SSHServer) handleSession(newChannel ssh.NewChannel) {\n\tch, reqs, err := newChannel.Accept()\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\treturn\n\t}\n\n\tdefer ch.Close()\n\tfor req := range reqs {\n\t\tif req.Type == \"exec\" {\n\t\t\ts.handleExec(ch, req)\n\t\t\tbreak\n\t\t}\n\t}\n}\n\nfunc (s *SSHServer) handleExec(ch ssh.Channel, req *ssh.Request) {\n\tvar err error\n\terr = req.Reply(true, nil)\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\treturn\n\t}\n\texecData := struct {\n\t\tCommand string\n\t}{}\n\terr = ssh.Unmarshal(req.Payload, &execData)\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\treturn\n\t}\n\n\tsendExitCode := func(ret uint32) {\n\t\tmsg := []byte{0, 0, 0, 0}\n\t\tbinary.BigEndian.PutUint32(msg, ret)\n\t\t_, err = ch.SendRequest(\"exit-status\", false, msg)\n\t\tif err != nil && !errors.Is(err, io.EOF) {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t}\n\t}\n\n\tvar ret uint32\n\tswitch {\n\tcase execData.Command == \"set\":\n\t\tret = 0\n\t\tdh := s.GetDockerHostEnvVar()\n\t\tif dh != \"\" {\n\t\t\t_, _ = fmt.Fprintf(ch, \"DOCKER_HOST=%s\\n\", dh)\n\t\t}\n\tcase execData.Command == \"systeminfo\" && s.IsWindows():\n\t\t_, _ = fmt.Fprintln(ch, \"something Windows something\")\n\t\tret = 0\n\tcase execData.Command == \"docker system dial-stdio\" && s.HasDialStdio():\n\t\tpr, pw, conn := newPipeConn()\n\n\t\tselect {\n\t\tcase s.dockerListener.conns <- conn:\n\t\tcase <-s.dockerListener.closed:\n\t\t\terr = ch.Close()\n\t\t\tif err != nil {\n\t\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t\t}\n\t\t}\n\n\t\tcpDone := 
make(chan struct{})\n\t\tgo func() {\n\t\t\tvar err error\n\t\t\t_, err = io.Copy(pw, ch)\n\t\t\tif err != nil {\n\t\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t\t}\n\t\t\terr = pw.Close()\n\t\t\tif err != nil {\n\t\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t\t}\n\t\t\tcpDone <- struct{}{}\n\t\t}()\n\n\t\t_, err = io.Copy(ch, pr)\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t}\n\t\terr = pr.Close()\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t}\n\n\t\t<-cpDone\n\n\t\t<-conn.closed\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t}\n\n\t\tret = 0\n\tdefault:\n\t\t_, _ = fmt.Fprintf(ch.Stderr(), \"unknown command: %q\\n\", execData.Command)\n\t\tret = 127\n\t}\n\tsendExitCode(ret)\n}\n\nfunc newPipeConn() (*io.PipeReader, *io.PipeWriter, *rwcConn) {\n\tpr0, pw0 := io.Pipe()\n\tpr1, pw1 := io.Pipe()\n\trwc := pipeReaderWriterCloser{r: pr0, w: pw1}\n\treturn pr1, pw0, newRWCConn(rwc)\n}\n\ntype pipeReaderWriterCloser struct {\n\tr *io.PipeReader\n\tw *io.PipeWriter\n}\n\nfunc (d pipeReaderWriterCloser) Read(p []byte) (n int, err error) {\n\treturn d.r.Read(p)\n}\n\nfunc (d pipeReaderWriterCloser) Write(p []byte) (n int, err error) {\n\treturn d.w.Write(p)\n}\n\nfunc (d pipeReaderWriterCloser) Close() error {\n\terr := d.r.Close()\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn d.w.Close()\n}\n\nfunc (s *SSHServer) handleTunnel(newChannel ssh.NewChannel) {\n\tvar err error\n\n\tswitch newChannel.ChannelType() {\n\tcase \"direct-streamlocal@openssh.com\":\n\t\tbs := newChannel.ExtraData()\n\t\tunixExtraData := struct {\n\t\t\tSocketPath string\n\t\t\tReserved0  string\n\t\t\tReserved1  uint32\n\t\t}{}\n\t\terr = ssh.Unmarshal(bs, &unixExtraData)\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t\treturn\n\t\t}\n\t\tif unixExtraData.SocketPath != dockerUnixSocket {\n\t\t\terr = newChannel.Reject(ssh.ConnectionFailed, 
fmt.Sprintf(\"bad socket: %q\", unixExtraData.SocketPath))\n\t\t\tif err != nil {\n\t\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t\t}\n\t\t\treturn\n\t\t}\n\tcase \"direct-tcpip\":\n\t\tbs := newChannel.ExtraData()\n\t\ttcpExtraData := struct { //nolint:maligned\n\t\t\tHostLocal  string\n\t\t\tPortLocal  uint32\n\t\t\tHostRemote string\n\t\t\tPortRemote uint32\n\t\t}{}\n\t\terr = ssh.Unmarshal(bs, &tcpExtraData)\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t\treturn\n\t\t}\n\n\t\thostPort := fmt.Sprintf(\"%s:%d\", tcpExtraData.HostLocal, tcpExtraData.PortLocal)\n\t\tif hostPort != dockerTCPSocket {\n\t\t\terr = newChannel.Reject(ssh.ConnectionFailed, fmt.Sprintf(\"bad socket: '%s:%d'\", tcpExtraData.HostLocal, tcpExtraData.PortLocal))\n\t\t\tif err != nil {\n\t\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t\t}\n\t\t\treturn\n\t\t}\n\t}\n\n\tch, _, err := newChannel.Accept()\n\tif err != nil {\n\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\treturn\n\t}\n\tconn := newRWCConn(ch)\n\tselect {\n\tcase s.dockerListener.conns <- conn:\n\tcase <-s.dockerListener.closed:\n\t\terr = ch.Close()\n\t\tif err != nil {\n\t\t\tfmt.Fprintf(os.Stderr, \"err: %v\\n\", err)\n\t\t}\n\t\treturn\n\t}\n\t<-conn.closed\n}\n\ntype listener struct {\n\tconns  chan net.Conn\n\tclosed chan struct{}\n\to      sync.Once\n}\n\nfunc (l *listener) Accept() (net.Conn, error) {\n\tselect {\n\tcase <-l.closed:\n\t\treturn nil, net.ErrClosed\n\tcase conn := <-l.conns:\n\t\treturn conn, nil\n\t}\n}\n\nfunc (l *listener) Close() error {\n\tl.o.Do(func() {\n\t\tclose(l.closed)\n\t})\n\treturn nil\n}\n\nfunc (l *listener) Addr() net.Addr {\n\treturn &net.UnixAddr{Name: dockerUnixSocket, Net: \"unix\"}\n}\n\nfunc newRWCConn(rwc io.ReadWriteCloser) *rwcConn {\n\treturn &rwcConn{rwc: rwc, closed: make(chan struct{})}\n}\n\ntype rwcConn struct {\n\trwc    io.ReadWriteCloser\n\tclosed chan struct{}\n\to      sync.Once\n}\n\nfunc (c *rwcConn) Read(b []byte) 
(n int, err error) {\n\treturn c.rwc.Read(b)\n}\n\nfunc (c *rwcConn) Write(b []byte) (n int, err error) {\n\treturn c.rwc.Write(b)\n}\n\nfunc (c *rwcConn) Close() error {\n\tc.o.Do(func() {\n\t\tclose(c.closed)\n\t})\n\treturn c.rwc.Close()\n}\n\nfunc (c *rwcConn) LocalAddr() net.Addr {\n\treturn &net.UnixAddr{Name: dockerUnixSocket, Net: \"unix\"}\n}\n\nfunc (c *rwcConn) RemoteAddr() net.Addr {\n\treturn &net.UnixAddr{Name: \"@\", Net: \"unix\"}\n}\n\nfunc (c *rwcConn) SetDeadline(t time.Time) error { return nil }\n\nfunc (c *rwcConn) SetReadDeadline(t time.Time) error { return nil }\n\nfunc (c *rwcConn) SetWriteDeadline(t time.Time) error { return nil }\n"
  },
  {
    "path": "internal/sshdialer/ssh_agent_unix.go",
    "content": "//go:build unix\n\npackage sshdialer\n\nimport \"net\"\n\nfunc dialSSHAgent(addr string) (net.Conn, error) {\n\treturn net.Dial(\"unix\", addr)\n}\n"
  },
  {
    "path": "internal/sshdialer/ssh_agent_windows.go",
    "content": "package sshdialer\n\nimport (\n\t\"net\"\n\t\"strings\"\n\n\t\"github.com/Microsoft/go-winio\"\n)\n\nfunc dialSSHAgent(addr string) (net.Conn, error) {\n\tif strings.Contains(addr, \"\\\\pipe\\\\\") {\n\t\treturn winio.DialPipe(addr, nil)\n\t}\n\treturn net.Dial(\"unix\", addr)\n}\n"
  },
  {
    "path": "internal/sshdialer/ssh_dialer.go",
    "content": "// NOTE: this code is based on \"github.com/containers/podman/v3/pkg/bindings\"\n\npackage sshdialer\n\nimport (\n\t\"bufio\"\n\t\"bytes\"\n\t\"context\"\n\t\"errors\"\n\t\"fmt\"\n\t\"net\"\n\turlPkg \"net/url\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\t\"time\"\n\n\t\"github.com/docker/cli/cli/connhelper\"\n\t\"github.com/docker/docker/pkg/homedir\"\n\t\"golang.org/x/crypto/ssh\"\n\t\"golang.org/x/crypto/ssh/agent\"\n\t\"golang.org/x/crypto/ssh/knownhosts\"\n)\n\ntype SecretCallback func() (string, error)\ntype HostKeyCallback func(hostPort string, pubKey ssh.PublicKey) error\n\ntype Config struct {\n\tIdentity           string\n\tPassPhrase         string\n\tPasswordCallback   SecretCallback\n\tPassPhraseCallback SecretCallback\n\tHostKeyCallback    HostKeyCallback\n}\n\nconst defaultSSHPort = \"22\"\n\nfunc NewDialContext(url *urlPkg.URL, config Config) (func(ctx context.Context, network, addr string) (net.Conn, error), error) {\n\tsshConfig, err := NewSSHClientConfig(url, config)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tport := url.Port()\n\tif port == \"\" {\n\t\tport = defaultSSHPort\n\t}\n\thost := url.Hostname()\n\n\tsshClient, err := ssh.Dial(\"tcp\", net.JoinHostPort(host, port), sshConfig)\n\tif err != nil {\n\t\treturn nil, fmt.Errorf(\"failed to dial ssh: %w\", err)\n\t}\n\tdefer func() {\n\t\tif sshClient != nil {\n\t\t\tsshClient.Close()\n\t\t}\n\t}()\n\n\tvar dialContext func(ctx context.Context, network, addr string) (net.Conn, error)\n\tif url.Path == \"\" {\n\t\tdialContext, err = tryGetStdioDialContext(url, sshClient, config.Identity)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tif dialContext != nil {\n\t\t\treturn dialContext, nil\n\t\t}\n\t}\n\n\tvar addr string\n\tvar network string\n\n\tif url.Path != \"\" {\n\t\taddr = url.Path\n\t\tnetwork = \"unix\"\n\t} else {\n\t\tnetwork, addr, err = networkAndAddressFromRemoteDockerHost(sshClient)\n\t\tif err != nil {\n\t\t\treturn nil, 
err\n\t\t}\n\t}\n\n\td := dialer{sshClient: sshClient, addr: addr, network: network}\n\tsshClient = nil\n\tdialContext = d.DialContext\n\n\truntime.SetFinalizer(&d, func(d *dialer) {\n\t\td.Close()\n\t})\n\n\treturn dialContext, nil\n}\n\ntype dialer struct {\n\tsshClient *ssh.Client\n\tnetwork   string\n\taddr      string\n}\n\nfunc (d *dialer) DialContext(ctx context.Context, n, a string) (net.Conn, error) {\n\tconn, err := d.Dial(d.network, d.addr)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tgo func() {\n\t\tif ctx != nil {\n\t\t\t<-ctx.Done()\n\t\t\tconn.Close()\n\t\t}\n\t}()\n\treturn conn, nil\n}\n\nfunc (d *dialer) Dial(n, a string) (net.Conn, error) {\n\treturn d.sshClient.Dial(d.network, d.addr)\n}\n\nfunc (d *dialer) Close() error {\n\treturn d.sshClient.Close()\n}\n\nfunc isWindowsMachine(sshClient *ssh.Client) (bool, error) {\n\tsession, err := sshClient.NewSession()\n\tif err != nil {\n\t\treturn false, err\n\t}\n\tdefer session.Close()\n\n\tout, err := session.CombinedOutput(\"systeminfo\")\n\tif err == nil && strings.Contains(string(out), \"Windows\") {\n\t\treturn true, nil\n\t}\n\treturn false, nil\n}\n\nfunc networkAndAddressFromRemoteDockerHost(sshClient *ssh.Client) (network string, addr string, err error) {\n\tsession, err := sshClient.NewSession()\n\tif err != nil {\n\t\treturn network, addr, err\n\t}\n\tdefer session.Close()\n\n\tout, err := session.CombinedOutput(\"set\")\n\tif err != nil {\n\t\treturn network, addr, err\n\t}\n\n\tremoteDockerHost := \"unix:///var/run/docker.sock\"\n\tisWin, err := isWindowsMachine(sshClient)\n\tif err != nil {\n\t\treturn network, addr, err\n\t}\n\n\tif isWin {\n\t\tremoteDockerHost = \"npipe:////./pipe/docker_engine\"\n\t}\n\n\tscanner := bufio.NewScanner(bytes.NewBuffer(out))\n\tfor scanner.Scan() {\n\t\tif strings.HasPrefix(scanner.Text(), \"DOCKER_HOST=\") {\n\t\t\tparts := strings.SplitN(scanner.Text(), \"=\", 2)\n\t\t\tremoteDockerHost = strings.Trim(parts[1], 
`\"'`)\n\t\t\tbreak\n\t\t}\n\t}\n\n\tremoteDockerHostURL, err := urlPkg.Parse(remoteDockerHost)\n\tif err != nil {\n\t\treturn network, addr, err\n\t}\n\tswitch remoteDockerHostURL.Scheme {\n\tcase \"unix\":\n\t\taddr = remoteDockerHostURL.Path\n\tcase \"fd\":\n\t\tremoteDockerHostURL.Scheme = \"tcp\" // don't know why it works that way\n\t\tfallthrough\n\tcase \"tcp\":\n\t\taddr = remoteDockerHostURL.Host\n\tdefault:\n\t\treturn \"\", \"\", errors.New(\"scheme is not supported\")\n\t}\n\tnetwork = remoteDockerHostURL.Scheme\n\n\treturn network, addr, err\n}\n\nfunc tryGetStdioDialContext(url *urlPkg.URL, sshClient *ssh.Client, identity string) (func(ctx context.Context, network, addr string) (net.Conn, error), error) {\n\tsession, err := sshClient.NewSession()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer session.Close()\n\tsession.Stdin = nil\n\tsession.Stdout = nil\n\tsession.Stderr = nil\n\terr = session.Run(\"docker system dial-stdio\")\n\tif err == nil {\n\t\tvar opts []string\n\n\t\tif identity != \"\" {\n\t\t\topts = append(opts, \"-i\", identity)\n\t\t}\n\n\t\tconnHelper, err := connhelper.GetConnectionHelperWithSSHOpts(url.String(), opts)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tif connHelper != nil {\n\t\t\treturn connHelper.Dialer, nil\n\t\t}\n\t}\n\treturn nil, nil\n}\n\nfunc NewSSHClientConfig(url *urlPkg.URL, config Config) (*ssh.ClientConfig, error) {\n\tvar (\n\t\tauthMethods []ssh.AuthMethod\n\t\tsigners     []ssh.Signer\n\t)\n\n\tif pw, found := url.User.Password(); found {\n\t\tauthMethods = append(authMethods, ssh.Password(pw))\n\t}\n\n\t// add signer from explicit identity parameter\n\tif config.Identity != \"\" {\n\t\tsigner, err := loadSignerFromFile(config.Identity, []byte(config.PassPhrase), config.PassPhraseCallback)\n\t\tif err != nil {\n\t\t\treturn nil, fmt.Errorf(\"failed to parse identity file: %w\", err)\n\t\t}\n\t\tsigners = append(signers, signer)\n\t}\n\n\t// pulls signers (keys) from 
ssh-agent\n\tsignersFromAgent, err := getSignersFromAgent()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tsigners = append(signers, signersFromAgent...)\n\n\t// if there is no explicit identity file nor keys from ssh-agent then\n\t// add keys with standard name from ~/.ssh/\n\tif len(signers) == 0 {\n\t\tdefaultKeyPaths := getDefaultKeys()\n\t\tif len(defaultKeyPaths) == 1 {\n\t\t\tsigner, err := loadSignerFromFile(defaultKeyPaths[0], []byte(config.PassPhrase), config.PassPhraseCallback)\n\t\t\tif err != nil {\n\t\t\t\treturn nil, err\n\t\t\t}\n\t\t\tsigners = append(signers, signer)\n\t\t}\n\t}\n\n\tauthMethods = append(authMethods, signersToAuthMethods(signers)...)\n\n\tif len(authMethods) == 0 && config.PasswordCallback != nil {\n\t\tauthMethods = append(authMethods, ssh.PasswordCallback(config.PasswordCallback))\n\t}\n\n\tconst sshTimeout = 5\n\tclientConfig := &ssh.ClientConfig{\n\t\tUser:            url.User.Username(),\n\t\tAuth:            authMethods,\n\t\tHostKeyCallback: createHostKeyCallback(config.HostKeyCallback),\n\t\tHostKeyAlgorithms: []string{\n\t\t\tssh.KeyAlgoECDSA256,\n\t\t\tssh.KeyAlgoECDSA384,\n\t\t\tssh.KeyAlgoECDSA521,\n\t\t\tssh.KeyAlgoED25519,\n\t\t\tssh.KeyAlgoRSASHA512,\n\t\t\tssh.KeyAlgoRSASHA256,\n\t\t\tssh.KeyAlgoRSA,\n\t\t},\n\t\tTimeout: sshTimeout * time.Second,\n\t}\n\n\treturn clientConfig, nil\n}\n\n// returns signers from ssh agent\nfunc getSignersFromAgent() ([]ssh.Signer, error) {\n\tif sock, found := os.LookupEnv(\"SSH_AUTH_SOCK\"); found && sock != \"\" {\n\t\tvar err error\n\t\tvar agentSigners []ssh.Signer\n\t\tvar agentConn net.Conn\n\t\tagentConn, err = dialSSHAgent(sock)\n\t\tif err != nil {\n\t\t\treturn nil, fmt.Errorf(\"failed to connect to ssh-agent's socket: %w\", err)\n\t\t}\n\t\tagentSigners, err = agent.NewClient(agentConn).Signers()\n\t\tif err != nil {\n\t\t\treturn nil, fmt.Errorf(\"failed to get signers from ssh-agent: %w\", err)\n\t\t}\n\t\treturn agentSigners, nil\n\t}\n\treturn nil, nil\n}\n\n// 
Default key names.\nvar knownKeyNames = []string{\"id_rsa\", \"id_dsa\", \"id_ecdsa\", \"id_ecdsa_sk\", \"id_ed25519\", \"id_ed25519_sk\"}\n\n// returns paths to keys with standard names that are in the ~/.ssh/ directory\nfunc getDefaultKeys() []string {\n\tvar defaultKeyPaths []string\n\tif home, err := os.UserHomeDir(); err == nil {\n\t\tfor _, keyName := range knownKeyNames {\n\t\t\tp := filepath.Join(home, \".ssh\", keyName)\n\n\t\t\tfi, err := os.Stat(p)\n\t\t\tif err != nil {\n\t\t\t\tcontinue\n\t\t\t}\n\t\t\tif fi.Mode().IsRegular() {\n\t\t\t\tdefaultKeyPaths = append(defaultKeyPaths, p)\n\t\t\t}\n\t\t}\n\t}\n\treturn defaultKeyPaths\n}\n\n// transforms a slice of signers (keys) into a slice of authentication methods for the ssh client\nfunc signersToAuthMethods(signers []ssh.Signer) []ssh.AuthMethod {\n\tif len(signers) == 0 {\n\t\treturn nil\n\t}\n\n\tvar authMethods []ssh.AuthMethod\n\tdedup := make(map[string]ssh.Signer, len(signers))\n\t// Dedup signers based on fingerprint, ssh-agent keys override explicit identity\n\tfor _, s := range signers {\n\t\tfp := ssh.FingerprintSHA256(s.PublicKey())\n\t\tdedup[fp] = s\n\t}\n\n\tvar uniq []ssh.Signer\n\tfor _, s := range dedup {\n\t\tuniq = append(uniq, s)\n\t}\n\tauthMethods = append(authMethods, ssh.PublicKeysCallback(func() ([]ssh.Signer, error) {\n\t\treturn uniq, nil\n\t}))\n\n\treturn authMethods\n}\n\n// reads a key from the given path,\n// decrypting it if necessary\nfunc loadSignerFromFile(path string, passphrase []byte, passPhraseCallback SecretCallback) (ssh.Signer, error) {\n\tkey, err := os.ReadFile(path)\n\tif err != nil {\n\t\treturn nil, fmt.Errorf(\"failed to read key file: %w\", err)\n\t}\n\n\tsigner, err := ssh.ParsePrivateKey(key)\n\tif err != nil {\n\t\tvar missingPhraseError *ssh.PassphraseMissingError\n\t\tif ok := errors.As(err, &missingPhraseError); !ok {\n\t\t\treturn nil, fmt.Errorf(\"failed to parse private key: %w\", err)\n\t\t}\n\n\t\tif len(passphrase) == 0 && passPhraseCallback != nil 
{\n\t\t\tb, err := passPhraseCallback()\n\t\t\tif err != nil {\n\t\t\t\treturn nil, err\n\t\t\t}\n\t\t\tpassphrase = []byte(b)\n\t\t}\n\n\t\treturn ssh.ParsePrivateKeyWithPassphrase(key, passphrase)\n\t}\n\n\treturn signer, nil\n}\n\nfunc createHostKeyCallback(userCallback HostKeyCallback) ssh.HostKeyCallback {\n\treturn func(hostPort string, remote net.Addr, pubKey ssh.PublicKey) error {\n\t\tknownHosts := filepath.Join(homedir.Get(), \".ssh\", \"known_hosts\")\n\n\t\tfileCallback, err := knownhosts.New(knownHosts)\n\t\tif err != nil {\n\t\t\tif os.IsNotExist(err) {\n\t\t\t\terr = errKeyUnknown\n\t\t\t}\n\t\t} else {\n\t\t\terr = fileCallback(hostPort, remote, pubKey)\n\t\t\tif err == nil {\n\t\t\t\treturn nil\n\t\t\t}\n\t\t}\n\n\t\tif userCallback != nil {\n\t\t\terr = userCallback(hostPort, pubKey)\n\t\t\tif err == nil {\n\t\t\t\treturn nil\n\t\t\t}\n\t\t}\n\n\t\treturn err\n\t}\n}\n\nvar ErrKeyMismatchMsg = \"key mismatch\"\nvar ErrKeyUnknownMsg = \"key is unknown\"\n\n// I would expose those but since ssh pkg doesn't do correct error wrapping it would be entirely futile\nvar errKeyUnknown = errors.New(ErrKeyUnknownMsg)\n"
  },
  {
    "path": "internal/sshdialer/ssh_dialer_test.go",
    "content": "package sshdialer_test\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"io\"\n\t\"net\"\n\t\"net/http\"\n\t\"net/url\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\t\"sync\"\n\t\"testing\"\n\t\"text/template\"\n\t\"time\"\n\n\t\"github.com/docker/docker/pkg/homedir\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\t\"golang.org/x/crypto/ssh\"\n\t\"golang.org/x/crypto/ssh/agent\"\n\n\t\"github.com/buildpacks/pack/internal/sshdialer\"\n\tth \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype args struct {\n\tconnStr          string\n\tcredentialConfig sshdialer.Config\n}\ntype testParams struct {\n\tname        string\n\targs        args\n\tsetUpEnv    setUpEnvFn\n\tskipOnWin   bool\n\tCreateError string\n\tDialError   string\n}\n\nfunc TestCreateDialer(t *testing.T) {\n\tfor _, privateKey := range []string{\"id_ed25519\", \"id_rsa\", \"id_dsa\"} {\n\t\tpath := filepath.Join(\"testdata\", privateKey)\n\t\tfixupPrivateKeyMod(path)\n\t}\n\n\tdefer withoutSSHAgent(t)()\n\tdefer withCleanHome(t)()\n\n\tconnConfig, cleanUp, err := prepareSSHServer(t)\n\tth.AssertNil(t, err)\n\n\tdefer cleanUp()\n\ttime.Sleep(time.Second * 1)\n\n\ttests := []testParams{\n\t\t{\n\t\t\tname: \"read password from input\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d/home/testuser/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{PasswordCallback: func() (string, error) {\n\t\t\t\t\treturn \"idkfa\", nil\n\t\t\t\t}},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"password in url\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, 
withKnowHosts(connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"server key is not in known_hosts (the file doesn't exist)\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome),\n\t\t\tCreateError: sshdialer.ErrKeyUnknownMsg,\n\t\t},\n\t\t{\n\t\t\tname: \"server key is not in known_hosts (the file exists)\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome, withEmptyKnownHosts),\n\t\t\tCreateError: sshdialer.ErrKeyUnknownMsg,\n\t\t},\n\t\t{\n\t\t\tname: \"server key is not in known_hosts (the file doesn't exist) - user force trust\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{HostKeyCallback: func(hostPort string, pubKey ssh.PublicKey) error {\n\t\t\t\t\treturn nil\n\t\t\t\t}},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome),\n\t\t},\n\t\t{\n\t\t\tname: \"server key is not in known_hosts (the file exists) - user force trust\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{HostKeyCallback: func(hostPort string, pubKey ssh.PublicKey) error {\n\t\t\t\t\treturn nil\n\t\t\t\t}},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withEmptyKnownHosts),\n\t\t},\n\t\t{\n\t\t\tname: \"server key does not match the respective key in known_hosts\",\n\t\t\targs: args{connStr: 
fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome, withBadKnownHosts(connConfig)),\n\t\t\tCreateError: sshdialer.ErrKeyMismatchMsg,\n\t\t},\n\t\t{\n\t\t\tname: \"key from identity parameter\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d/home/testuser/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_ed25519\")},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"key at standard location with need to read passphrase\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d/home/testuser/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{PassPhraseCallback: func() (string, error) {\n\t\t\t\t\treturn \"idfa\", nil\n\t\t\t\t}},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKey(t, \"id_rsa\"), withKnowHosts(connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"key at standard location with explicitly set passphrase\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d/home/testuser/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{PassPhrase: \"idfa\"},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKey(t, \"id_rsa\"), withKnowHosts(connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"key at standard location with no passphrase\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKey(t, \"id_ed25519\"), 
withKnowHosts(connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"key from ssh-agent\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv: all(withGoodSSHAgent, withCleanHome, withKnowHosts(connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"password in url with IPv6\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser:idkfa@[%s]:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv6,\n\t\t\t\tconnConfig.portIPv6,\n\t\t\t)},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"broken known host\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome, withBrokenKnownHosts),\n\t\t\tCreateError: \"missing host pattern\",\n\t\t},\n\t\t{\n\t\t\tname: \"inaccessible known host\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome, withInaccessibleKnownHosts),\n\t\t\tskipOnWin:   true,\n\t\t\tCreateError: \"permission denied\",\n\t\t},\n\t\t{\n\t\t\tname: \"failing pass phrase cbk\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{PassPhraseCallback: func() (string, error) {\n\t\t\t\t\treturn \"\", errors.New(\"test_error_msg\")\n\t\t\t\t}},\n\t\t\t},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome, withKey(t, \"id_rsa\"), withKnowHosts(connConfig)),\n\t\t\tCreateError: \"test_error_msg\",\n\t\t},\n\t\t{\n\t\t\tname: \"with broken key at default location\",\n\t\t\targs: args{connStr: 
fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome, withKey(t, \"id_dsa\"), withKnowHosts(connConfig)),\n\t\t\tCreateError: \"failed to parse private key\",\n\t\t},\n\t\t{\n\t\t\tname: \"with broken key explicit\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_dsa\")},\n\t\t\t},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig)),\n\t\t\tCreateError: \"failed to parse private key\",\n\t\t},\n\t\t{\n\t\t\tname: \"with inaccessible key\",\n\t\t\targs: args{connStr: fmt.Sprintf(\"ssh://testuser:idkfa@%s:%d/home/testuser/test.sock\",\n\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\tconnConfig.portIPv4,\n\t\t\t)},\n\t\t\tsetUpEnv:    all(withoutSSHAgent, withCleanHome, withInaccessibleKey(\"id_rsa\"), withKnowHosts(connConfig)),\n\t\t\tskipOnWin:   true,\n\t\t\tCreateError: \"failed to read key file\",\n\t\t},\n\t\t{\n\t\t\tname: \"socket doesn't exist in remote\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d/does/not/exist/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{PasswordCallback: func() (string, error) {\n\t\t\t\t\treturn \"idkfa\", nil\n\t\t\t\t}},\n\t\t\t},\n\t\t\tsetUpEnv:  all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig)),\n\t\t\tDialError: \"failed to dial unix socket in the remote\",\n\t\t},\n\t\t{\n\t\t\tname: \"ssh agent non-existent socket\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d/does/not/exist/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t},\n\t\t\tsetUpEnv:    
all(withBadSSHAgentSocket, withCleanHome, withKnowHosts(connConfig)),\n\t\t\tCreateError: \"failed to connect to ssh-agent's socket\",\n\t\t},\n\t\t{\n\t\t\tname: \"bad ssh agent\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d/does/not/exist/test.sock\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t},\n\t\t\tsetUpEnv:    all(withBadSSHAgent, withCleanHome, withKnowHosts(connConfig)),\n\t\t\tCreateError: \"failed to get signers from ssh-agent\",\n\t\t},\n\t\t{\n\t\t\tname: \"use docker host from remote unix\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_ed25519\")},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig),\n\t\t\t\twithRemoteDockerHost(\"unix:///home/testuser/test.sock\", connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"use docker host from remote tcp\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_ed25519\")},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig),\n\t\t\t\twithRemoteDockerHost(\"tcp://localhost:1234\", connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"use docker host from remote fd\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_ed25519\")},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig),\n\t\t\t\twithRemoteDockerHost(\"fd://localhost:1234\", connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"use docker host 
from remote npipe\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_ed25519\")},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig),\n\t\t\t\twithRemoteDockerHost(\"npipe:////./pipe/docker_engine\", connConfig)),\n\t\t\tCreateError: \"not supported\",\n\t\t},\n\t\t{\n\t\t\tname: \"use emulated windows with default docker host\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_ed25519\")},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig),\n\t\t\t\twithEmulatingWindows(connConfig)),\n\t\t\tCreateError: \"not supported\",\n\t\t},\n\t\t{\n\t\t\tname: \"use emulated windows with tcp docker host\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_ed25519\")},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig), withEmulatingWindows(connConfig),\n\t\t\t\twithRemoteDockerHost(\"tcp://localhost:1234\", connConfig)),\n\t\t},\n\t\t{\n\t\t\tname: \"use docker system dial-stdio\",\n\t\t\targs: args{\n\t\t\t\tconnStr: fmt.Sprintf(\"ssh://testuser@%s:%d\",\n\t\t\t\t\tconnConfig.hostIPv4,\n\t\t\t\t\tconnConfig.portIPv4,\n\t\t\t\t),\n\t\t\t\tcredentialConfig: sshdialer.Config{Identity: filepath.Join(\"testdata\", \"id_ed25519\")},\n\t\t\t},\n\t\t\tsetUpEnv: all(withoutSSHAgent, withCleanHome, withKnowHosts(connConfig), withEmulatedDockerSystemDialStdio(connConfig), withFixedUpSSHCLI),\n\t\t},\n\t}\n\n\tfor _, 
ttx := range tests {\n\t\tspec.Run(t, \"sshDialer/\"+ttx.name, testCreateDialer(connConfig, ttx), spec.Report(report.Terminal{}))\n\t}\n}\n\n// these tests cannot be parallelized as they use the process-wide environment variable $HOME\nfunc testCreateDialer(connConfig *SSHServer, tt testParams) func(t *testing.T, when spec.G, it spec.S) {\n\treturn func(t *testing.T, when spec.G, it spec.S) {\n\t\tit(\"creates a dialer\", func() {\n\t\t\tu, err := url.Parse(tt.args.connStr)\n\t\t\tth.AssertNil(t, err)\n\n\t\t\tif net.ParseIP(u.Hostname()).To4() == nil && connConfig.hostIPv6 == \"\" {\n\t\t\t\tt.Skip(\"skipping ipv6 test since test environment doesn't support ipv6 connection\")\n\t\t\t}\n\n\t\t\tif tt.skipOnWin && runtime.GOOS == \"windows\" {\n\t\t\t\tt.Skip(\"skipping this test on windows\")\n\t\t\t}\n\n\t\t\tdefer tt.setUpEnv(t)()\n\n\t\t\tdialContext, err := sshdialer.NewDialContext(u, tt.args.credentialConfig)\n\n\t\t\tif tt.CreateError == \"\" {\n\t\t\t\tth.AssertEq(t, err, nil)\n\t\t\t} else {\n\t\t\t\t// I wish I could use errors.Is(),\n\t\t\t\t// however foreign code is not wrapping errors thoroughly\n\t\t\t\tif err != nil {\n\t\t\t\t\tth.AssertContains(t, err.Error(), tt.CreateError)\n\t\t\t\t} else {\n\t\t\t\t\tt.Error(\"expected error but got nil\")\n\t\t\t\t}\n\t\t\t}\n\t\t\tif err != nil {\n\t\t\t\treturn\n\t\t\t}\n\n\t\t\ttransport := http.Transport{DialContext: dialContext}\n\t\t\thttpClient := http.Client{Transport: &transport}\n\t\t\tdefer httpClient.CloseIdleConnections()\n\t\t\tresp, err := httpClient.Get(\"http://docker/\")\n\t\t\tif tt.DialError == \"\" {\n\t\t\t\tth.AssertEq(t, err, nil)\n\t\t\t} else {\n\t\t\t\t// I wish I could use errors.Is(),\n\t\t\t\t// however foreign code is not wrapping errors thoroughly\n\t\t\t\tif err != nil {\n\t\t\t\t\tth.AssertContains(t, err.Error(), tt.DialError)\n\t\t\t\t} else {\n\t\t\t\t\tt.Error(\"expected error but got nil\")\n\t\t\t\t}\n\t\t\t}\n\t\t\tif err != nil {\n\t\t\t\treturn\n\t\t\t}\n\t\t\tdefer 
resp.Body.Close()\n\n\t\t\tb, err := io.ReadAll(resp.Body)\n\t\t\tth.AssertTrue(t, err == nil)\n\t\t\tif err != nil {\n\t\t\t\treturn\n\t\t\t}\n\t\t\tth.AssertEq(t, string(b), \"OK\")\n\t\t})\n\t}\n}\n\n// function that prepares testing environment and returns clean up function\n// this should be used in conjunction with defer: `defer fn()()`\n// e.g. sets environment variables or starts mock up services\n// it returns clean up procedure that restores old values of environment variables\n// or shuts down mock up services\ntype setUpEnvFn func(t *testing.T) func()\n\n// combines multiple setUp routines into one setUp routine\nfunc all(fns ...setUpEnvFn) setUpEnvFn {\n\treturn func(t *testing.T) func() {\n\t\tt.Helper()\n\t\tvar cleanUps []func()\n\t\tfor _, fn := range fns {\n\t\t\tcleanUps = append(cleanUps, fn(t))\n\t\t}\n\n\t\treturn func() {\n\t\t\tfor i := len(cleanUps) - 1; i >= 0; i-- {\n\t\t\t\tcleanUps[i]()\n\t\t\t}\n\t\t}\n\t}\n}\n\nfunc cp(src, dest string) error {\n\tsrcFs, err := os.Stat(src)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"the cp() function failed to stat source file: %w\", err)\n\t}\n\n\tdata, err := os.ReadFile(src)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"the cp() function failed to read source file: %w\", err)\n\t}\n\n\t_, err = os.Stat(dest)\n\tif err == nil {\n\t\treturn fmt.Errorf(\"destination file already exists: %w\", os.ErrExist)\n\t}\n\n\treturn os.WriteFile(dest, data, srcFs.Mode())\n}\n\n// puts key from ./testdata/{keyName} to $HOME/.ssh/{keyName}\n// those keys are authorized by the testing ssh server\nfunc withKey(t *testing.T, keyName string) setUpEnvFn {\n\tt.Helper()\n\n\treturn func(t *testing.T) func() {\n\t\tt.Helper()\n\t\tvar err error\n\n\t\thome, err := os.UserHomeDir()\n\t\tth.AssertNil(t, err)\n\n\t\terr = os.MkdirAll(filepath.Join(home, \".ssh\"), 0700)\n\t\tth.AssertNil(t, err)\n\n\t\tkeySrc := filepath.Join(\"testdata\", keyName)\n\t\tkeyDest := filepath.Join(home, \".ssh\", keyName)\n\t\terr = cp(keySrc, 
keyDest)\n\t\tth.AssertNil(t, err)\n\n\t\treturn func() {\n\t\t\tos.Remove(keyDest)\n\t\t}\n\t}\n}\n\n// withInaccessibleKey creates an inaccessible key of a given type (specified by keyName)\nfunc withInaccessibleKey(keyName string) setUpEnvFn {\n\treturn func(t *testing.T) func() {\n\t\tt.Helper()\n\t\tvar err error\n\n\t\thome, err := os.UserHomeDir()\n\t\tth.AssertNil(t, err)\n\n\t\terr = os.MkdirAll(filepath.Join(home, \".ssh\"), 0700)\n\t\tth.AssertNil(t, err)\n\n\t\tkeyDest := filepath.Join(home, \".ssh\", keyName)\n\t\t_, err = os.OpenFile(keyDest, os.O_CREATE|os.O_WRONLY, 0000)\n\t\tth.AssertNil(t, err)\n\n\t\treturn func() {\n\t\t\tos.Remove(keyDest)\n\t\t}\n\t}\n}\n\n// sets a clean temporary $HOME for the test\n// this prevents interaction with the actual user home, which may contain .ssh/\nfunc withCleanHome(t *testing.T) func() {\n\tt.Helper()\n\thomeName := \"HOME\"\n\tif runtime.GOOS == \"windows\" {\n\t\thomeName = \"USERPROFILE\"\n\t}\n\ttmpDir, err := os.MkdirTemp(\"\", \"tmpHome\")\n\tth.AssertNil(t, err)\n\n\toldHome, hadHome := os.LookupEnv(homeName)\n\tos.Setenv(homeName, tmpDir)\n\n\treturn func() {\n\t\tif hadHome {\n\t\t\tos.Setenv(homeName, oldHome)\n\t\t} else {\n\t\t\tos.Unsetenv(homeName)\n\t\t}\n\t\tos.RemoveAll(tmpDir)\n\t}\n}\n\n// withKnowHosts creates $HOME/.ssh/known_hosts with correct entries\nfunc withKnowHosts(connConfig *SSHServer) setUpEnvFn {\n\treturn func(t *testing.T) func() {\n\t\tt.Helper()\n\n\t\tknownHosts := filepath.Join(homedir.Get(), \".ssh\", \"known_hosts\")\n\n\t\terr := os.MkdirAll(filepath.Join(homedir.Get(), \".ssh\"), 0700)\n\t\tth.AssertNil(t, err)\n\n\t\t_, err = os.Stat(knownHosts)\n\t\tif err == nil || !errors.Is(err, os.ErrNotExist) {\n\t\t\tt.Fatal(\"known_hosts already exists\")\n\t\t}\n\n\t\tf, err := os.OpenFile(knownHosts, os.O_CREATE|os.O_WRONLY, 0600)\n\t\tth.AssertNil(t, err)\n\t\tdefer f.Close()\n\n\t\t// generate known_hosts\n\t\tserverKeysDir := filepath.Join(\"testdata\", \"etc\", \"ssh\")\n\t\tfor _, k := 
range []string{\"ecdsa\"} {\n\t\t\tkeyPath := filepath.Join(serverKeysDir, fmt.Sprintf(\"ssh_host_%s_key.pub\", k))\n\t\t\tkey, err := os.ReadFile(keyPath)\n\t\t\tth.AssertNil(t, err)\n\n\t\t\tfmt.Fprintf(f, \"%s %s\", connConfig.hostIPv4, string(key))\n\t\t\tfmt.Fprintf(f, \"[%s]:%d %s\", connConfig.hostIPv4, connConfig.portIPv4, string(key))\n\n\t\t\tif connConfig.hostIPv6 != \"\" {\n\t\t\t\tfmt.Fprintf(f, \"%s %s\", connConfig.hostIPv6, string(key))\n\t\t\t\tfmt.Fprintf(f, \"[%s]:%d %s\", connConfig.hostIPv6, connConfig.portIPv6, string(key))\n\t\t\t}\n\t\t}\n\n\t\treturn func() {\n\t\t\tos.Remove(knownHosts)\n\t\t}\n\t}\n}\n\n// withBadKnownHosts creates $HOME/.ssh/known_hosts with incorrect entries\nfunc withBadKnownHosts(connConfig *SSHServer) setUpEnvFn {\n\treturn func(t *testing.T) func() {\n\t\tt.Helper()\n\n\t\tknownHosts := filepath.Join(homedir.Get(), \".ssh\", \"known_hosts\")\n\n\t\terr := os.MkdirAll(filepath.Join(homedir.Get(), \".ssh\"), 0700)\n\t\tth.AssertNil(t, err)\n\n\t\t_, err = os.Stat(knownHosts)\n\t\tif err == nil || !errors.Is(err, os.ErrNotExist) {\n\t\t\tt.Fatal(\"known_hosts already exists\")\n\t\t}\n\n\t\tf, err := os.OpenFile(knownHosts, os.O_CREATE|os.O_WRONLY, 0600)\n\t\tth.AssertNil(t, err)\n\t\tdefer f.Close()\n\n\t\tknownHostTemplate := `{{range $host := .}}{{$host}} ssh-dss AAAAB3NzaC1kc3MAAACBAKH4ufS3ABVb780oTgEL1eu+pI1p6YOq/1KJn5s3zm+L3cXXq76r5OM/roGEYrXWUDGRtfVpzYTAKoMWuqcVc0AZ2zOdYkoy1fSjJ3MqDGF53QEO3TXIUt3gUzmLOewwmZWle0RgMa9GHccv7XVVIZB36RR68ZEUswLaTnlVhXQ1AAAAFQCl4t/LnY7kuUI+tL2qT2XmxmiyqwAAAIB72XaO+LfyIiqBOaTkQf+5rvH1i6y6LDO1QD9pzGWUYw3y03AEveHJMjW0EjnYBKJjK39wcZNTieRyU54lhH/HWeWABn9NcQ3duEf1WSO/s7SPsFO2R6quqVSsStkqf2Yfdy4fl24mH41olwtNA6ft5nkVfkqrIa51si4jU8fBVAAAAIB8SSvyYBcyMGLUlQjzQqhhhAHer9x/1YbknVz+y5PHJLLjHjMC4ZRfLgNEojvMKQW46Te9Pwnudcwv19ho4F+kkCOfss7xjyH70gQm6Sj76DxClmnnPoSRq3qEAOMy5Oh+7vyzxm68KHqd/aOmUaiT1LgqgViS9+kNdCoVMGAMOg== mvasek@bellatrix\n{{$host}} ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLTxVVaQ93ReqHNlbjg5/nBRpuRuG6JIgNeJXWT1V4Dl+dMMrnad3uJBfyrNpvn8rv2qnn6gMTZVtTbLdo96pG0= mvasek@bellatrix\n{{$host}} ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKymJNQszrxetVffPZRfZGKWK786r0mNcg/Wah4+2wn mvasek@bellatrix\n{{$host}} ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/1/OCwec2Gyv5goNYYvos4iOA+a0NolOGsZA/93jmSArPY1zZS1UWeJ6dDTmxGoL/e7jm9lM6NJY7a/zM0C/GqCNRGR/aCUHBJTIgGtH+79FDKO/LWY6ClGY7Lw8qNgZpugbBw3N3HqTtyb2lELhFLT0FEb+le4WUbryooLK2zsz6DnqV4JvTYyyHcanS0h68iSXC7XbkZchvL99l5LT0gD1oDteBPKKFdNOwIjpMkk/IrbFM24xoNkaTDXN87EpQPQzYDfsoGymprc5OZZ8kzrtErQR+yfuunHfzzqDHWi7ga5pbgkuxNt10djWgCfBRsy07FTEgV0JirS0TCfwTBbqRzdjf3dgi8AP+WtkW3mcv4a1XYeqoBo2o9TbfyiA9kERs79UBN0mCe3KNX3Ns0PvutsRLaHmdJ49eaKWkJ6GgL37aqSlIwTixz2xY3eoDSkqHoZpx6Q1MdpSIl5gGVzlaobM/PNM1jqVdyUj+xpjHyiXwHQMKc3eJna7s8Jc= mvasek@bellatrix\n{{end}}`\n\n\t\ttmpl := template.New(knownHostTemplate)\n\t\ttmpl, err = tmpl.Parse(knownHostTemplate)\n\t\tth.AssertNil(t, err)\n\n\t\thosts := make([]string, 0, 4)\n\t\thosts = append(hosts, connConfig.hostIPv4, fmt.Sprintf(\"[%s]:%d\", connConfig.hostIPv4, connConfig.portIPv4))\n\t\tif connConfig.hostIPv6 != \"\" {\n\t\t\thosts = append(hosts, connConfig.hostIPv6, fmt.Sprintf(\"[%s]:%d\", connConfig.hostIPv6, connConfig.portIPv4))\n\t\t}\n\n\t\terr = tmpl.Execute(f, hosts)\n\t\tth.AssertNil(t, err)\n\n\t\treturn func() {\n\t\t\tos.Remove(knownHosts)\n\t\t}\n\t}\n}\n\n// withBrokenKnownHosts creates broken $HOME/.ssh/known_hosts\nfunc withBrokenKnownHosts(t *testing.T) func() {\n\tt.Helper()\n\n\tknownHosts := filepath.Join(homedir.Get(), \".ssh\", \"known_hosts\")\n\n\terr := os.MkdirAll(filepath.Join(homedir.Get(), \".ssh\"), 0700)\n\tth.AssertNil(t, err)\n\n\t_, err = os.Stat(knownHosts)\n\tif err == nil || !errors.Is(err, os.ErrNotExist) {\n\t\tt.Fatal(\"known_hosts already exists\")\n\t}\n\n\tf, err := os.OpenFile(knownHosts, os.O_CREATE|os.O_WRONLY, 0600)\n\tth.AssertNil(t, err)\n\tdefer f.Close()\n\n\t_, err = 
f.WriteString(\"somegarbage\\nsome rubish\\n stuff\\tqwerty\")\n\tth.AssertNil(t, err)\n\n\treturn func() {\n\t\tos.Remove(knownHosts)\n\t}\n}\n\n// withInaccessibleKnownHosts creates inaccessible $HOME/.ssh/known_hosts\nfunc withInaccessibleKnownHosts(t *testing.T) func() {\n\tt.Helper()\n\n\tknownHosts := filepath.Join(homedir.Get(), \".ssh\", \"known_hosts\")\n\n\terr := os.MkdirAll(filepath.Join(homedir.Get(), \".ssh\"), 0700)\n\tth.AssertNil(t, err)\n\n\t_, err = os.Stat(knownHosts)\n\tif err == nil || !errors.Is(err, os.ErrNotExist) {\n\t\tt.Fatal(\"known_hosts already exists\")\n\t}\n\n\tf, err := os.OpenFile(knownHosts, os.O_CREATE|os.O_WRONLY, 0000)\n\tth.AssertNil(t, err)\n\tdefer f.Close()\n\n\treturn func() {\n\t\tos.Remove(knownHosts)\n\t}\n}\n\n// withEmptyKnownHosts creates empty $HOME/.ssh/known_hosts\nfunc withEmptyKnownHosts(t *testing.T) func() {\n\tt.Helper()\n\n\tknownHosts := filepath.Join(homedir.Get(), \".ssh\", \"known_hosts\")\n\n\terr := os.MkdirAll(filepath.Join(homedir.Get(), \".ssh\"), 0700)\n\tth.AssertNil(t, err)\n\n\t_, err = os.Stat(knownHosts)\n\tif err == nil || !errors.Is(err, os.ErrNotExist) {\n\t\tt.Fatal(\"known_hosts already exists\")\n\t}\n\n\t_, err = os.Create(knownHosts)\n\tth.AssertNil(t, err)\n\n\treturn func() {\n\t\tos.Remove(knownHosts)\n\t}\n}\n\n// withoutSSHAgent unsets the SSH_AUTH_SOCK environment variable so ssh-agent is not used by test\nfunc withoutSSHAgent(t *testing.T) func() {\n\tt.Helper()\n\toldAuthSock, hadAuthSock := os.LookupEnv(\"SSH_AUTH_SOCK\")\n\tos.Unsetenv(\"SSH_AUTH_SOCK\")\n\n\treturn func() {\n\t\tif hadAuthSock {\n\t\t\tos.Setenv(\"SSH_AUTH_SOCK\", oldAuthSock)\n\t\t} else {\n\t\t\tos.Unsetenv(\"SSH_AUTH_SOCK\")\n\t\t}\n\t}\n}\n\n// withBadSSHAgentSocket sets the SSH_AUTH_SOCK environment variable to non-existing file\nfunc withBadSSHAgentSocket(t *testing.T) func() {\n\tt.Helper()\n\toldAuthSock, hadAuthSock := os.LookupEnv(\"SSH_AUTH_SOCK\")\n\tos.Setenv(\"SSH_AUTH_SOCK\", 
\"/does/not/exists.sock\")\n\n\treturn func() {\n\t\tif hadAuthSock {\n\t\t\tos.Setenv(\"SSH_AUTH_SOCK\", oldAuthSock)\n\t\t} else {\n\t\t\tos.Unsetenv(\"SSH_AUTH_SOCK\")\n\t\t}\n\t}\n}\n\n// withGoodSSHAgent starts serving ssh-agent on a temporary unix socket.\n// It sets the SSH_AUTH_SOCK environment variable to the temporary socket.\n// The agent will return correct keys for the testing ssh server.\nfunc withGoodSSHAgent(t *testing.T) func() {\n\tt.Helper()\n\n\tkey, err := os.ReadFile(filepath.Join(\"testdata\", \"id_ed25519\"))\n\tth.AssertNil(t, err)\n\n\tsigner, err := ssh.ParsePrivateKey(key)\n\tth.AssertNil(t, err)\n\n\treturn withSSHAgent(t, signerAgent{signer})\n}\n\n// withBadSSHAgent starts serving ssh-agent on a temporary unix socket.\n// It sets the SSH_AUTH_SOCK environment variable to the temporary socket.\n// The agent will return incorrect keys for the testing ssh server.\nfunc withBadSSHAgent(t *testing.T) func() {\n\treturn withSSHAgent(t, badAgent{})\n}\n\nfunc withSSHAgent(t *testing.T, ag agent.Agent) func() {\n\tvar err error\n\tt.Helper()\n\n\tvar tmpDirForSocket string\n\tvar agentSocketPath string\n\tif runtime.GOOS == \"windows\" {\n\t\tagentSocketPath = `\\\\.\\pipe\\openssh-ssh-agent-test`\n\t} else {\n\t\ttmpDirForSocket, err = os.MkdirTemp(\"\", \"forAuthSock\")\n\t\tth.AssertNil(t, err)\n\n\t\tagentSocketPath = filepath.Join(tmpDirForSocket, \"agent.sock\")\n\t}\n\n\tunixListener, err := listen(agentSocketPath)\n\tth.AssertNil(t, err)\n\n\tos.Setenv(\"SSH_AUTH_SOCK\", agentSocketPath)\n\n\tctx, cancel := context.WithCancel(context.Background())\n\terrChan := make(chan error, 1)\n\tvar wg sync.WaitGroup\n\n\tgo func() {\n\t\tfor {\n\t\t\tconn, err := unixListener.Accept()\n\t\t\tif err != nil {\n\t\t\t\terrChan <- err\n\n\t\t\t\treturn\n\t\t\t}\n\n\t\t\twg.Add(1)\n\t\t\tgo func(conn net.Conn) {\n\t\t\t\tdefer wg.Done()\n\t\t\t\tgo func() {\n\t\t\t\t\t<-ctx.Done()\n\t\t\t\t\tconn.Close()\n\t\t\t\t}()\n\t\t\t\terr := agent.ServeAgent(ag, 
conn)\n\t\t\t\tif err != nil {\n\t\t\t\t\tif !isErrClosed(err) {\n\t\t\t\t\t\tfmt.Fprintf(os.Stderr, \"agent.ServeAgent() failed: %v\\n\", err)\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}(conn)\n\t\t}\n\t}()\n\n\treturn func() {\n\t\tos.Unsetenv(\"SSH_AUTH_SOCK\")\n\n\t\terr := unixListener.Close()\n\t\tth.AssertNil(t, err)\n\n\t\terr = <-errChan\n\n\t\tif !isErrClosed(err) {\n\t\t\tt.Fatal(err)\n\t\t}\n\t\tcancel()\n\t\twg.Wait()\n\t\tif tmpDirForSocket != \"\" {\n\t\t\tos.RemoveAll(tmpDirForSocket)\n\t\t}\n\t}\n}\n\ntype signerAgent struct {\n\timpl ssh.Signer\n}\n\nfunc (a signerAgent) List() ([]*agent.Key, error) {\n\treturn []*agent.Key{{\n\t\tFormat: a.impl.PublicKey().Type(),\n\t\tBlob:   a.impl.PublicKey().Marshal(),\n\t}}, nil\n}\n\nfunc (a signerAgent) Sign(key ssh.PublicKey, data []byte) (*ssh.Signature, error) {\n\treturn a.impl.Sign(nil, data)\n}\n\nfunc (a signerAgent) Add(key agent.AddedKey) error {\n\tpanic(\"implement me\")\n}\n\nfunc (a signerAgent) Remove(key ssh.PublicKey) error {\n\tpanic(\"implement me\")\n}\n\nfunc (a signerAgent) RemoveAll() error {\n\tpanic(\"implement me\")\n}\n\nfunc (a signerAgent) Lock(passphrase []byte) error {\n\tpanic(\"implement me\")\n}\n\nfunc (a signerAgent) Unlock(passphrase []byte) error {\n\tpanic(\"implement me\")\n}\n\nfunc (a signerAgent) Signers() ([]ssh.Signer, error) {\n\tpanic(\"implement me\")\n}\n\nvar errBadAgent = errors.New(\"bad agent error\")\n\ntype badAgent struct{}\n\nfunc (b badAgent) List() ([]*agent.Key, error) {\n\treturn nil, errBadAgent\n}\n\nfunc (b badAgent) Sign(key ssh.PublicKey, data []byte) (*ssh.Signature, error) {\n\treturn nil, errBadAgent\n}\n\nfunc (b badAgent) Add(key agent.AddedKey) error {\n\treturn errBadAgent\n}\n\nfunc (b badAgent) Remove(key ssh.PublicKey) error {\n\treturn errBadAgent\n}\n\nfunc (b badAgent) RemoveAll() error {\n\treturn errBadAgent\n}\n\nfunc (b badAgent) Lock(passphrase []byte) error {\n\treturn errBadAgent\n}\n\nfunc (b badAgent) Unlock(passphrase []byte) 
error {\n\treturn errBadAgent\n}\n\nfunc (b badAgent) Signers() ([]ssh.Signer, error) {\n\treturn nil, errBadAgent\n}\n\n// The OpenSSH CLI doesn't take the HOME/USERPROFILE environment variable into account.\n// It determines the user's home directory differently (e.g. by reading /etc/passwd).\n// This means tests cannot mock the home dir just by setting an environment variable.\n// withFixedUpSSHCLI works around the problem by forcing usage of the known_hosts file under HOME/USERPROFILE.\nfunc withFixedUpSSHCLI(t *testing.T) func() {\n\tt.Helper()\n\n\tsshAbsPath, err := exec.LookPath(\"ssh\")\n\tth.AssertNil(t, err)\n\n\tsshScript := `#!/bin/sh\nSSH_BIN -o PasswordAuthentication=no -o ConnectTimeout=3 -o UserKnownHostsFile=\"$HOME/.ssh/known_hosts\" $@\n`\n\tif runtime.GOOS == \"windows\" {\n\t\tsshScript = `@echo off\nSSH_BIN -o PasswordAuthentication=no -o ConnectTimeout=3 -o UserKnownHostsFile=%USERPROFILE%\\.ssh\\known_hosts %*\n`\n\t}\n\tsshScript = strings.ReplaceAll(sshScript, \"SSH_BIN\", sshAbsPath)\n\n\thome, err := os.UserHomeDir()\n\tth.AssertNil(t, err)\n\n\thomeBin := filepath.Join(home, \"bin\")\n\terr = os.MkdirAll(homeBin, 0700)\n\tth.AssertNil(t, err)\n\n\tsshScriptName := \"ssh\"\n\tif runtime.GOOS == \"windows\" {\n\t\tsshScriptName = \"ssh.bat\"\n\t}\n\n\tsshScriptFullPath := filepath.Join(homeBin, sshScriptName)\n\terr = os.WriteFile(sshScriptFullPath, []byte(sshScript), 0700)\n\tth.AssertNil(t, err)\n\n\toldPath := os.Getenv(\"PATH\")\n\tos.Setenv(\"PATH\", homeBin+string(os.PathListSeparator)+oldPath)\n\treturn func() {\n\t\tos.Setenv(\"PATH\", oldPath)\n\t\tos.RemoveAll(homeBin)\n\t}\n}\n\n// withEmulatedDockerSystemDialStdio makes `docker system dial-stdio` viable in the testing ssh server.\n// It does so by enabling the dial-stdio flag on the testing SSH server.\nfunc withEmulatedDockerSystemDialStdio(sshServer *SSHServer) setUpEnvFn {\n\treturn func(t *testing.T) func() {\n\t\tt.Helper()\n\n\t\toldHasDialStdio := 
sshServer.HasDialStdio()\n\t\tsshServer.SetHasDialStdio(true)\n\t\treturn func() {\n\t\t\tsshServer.SetHasDialStdio(oldHasDialStdio)\n\t\t}\n\t}\n}\n\n// withEmulatingWindows makes changes to the testing ssh server such that\n// the server appears to be a Windows server for the simple check done by calling the `systeminfo` command\nfunc withEmulatingWindows(sshServer *SSHServer) setUpEnvFn {\n\treturn func(t *testing.T) func() {\n\t\toldIsWindows := sshServer.IsWindows()\n\t\tsshServer.SetIsWindows(true)\n\t\treturn func() {\n\t\t\tsshServer.SetIsWindows(oldIsWindows)\n\t\t}\n\t}\n}\n\n// withRemoteDockerHost makes changes to the testing ssh server such that\n// the DOCKER_HOST environment variable is set to the host parameter\nfunc withRemoteDockerHost(host string, sshServer *SSHServer) setUpEnvFn {\n\treturn func(t *testing.T) func() {\n\t\toldHost := sshServer.GetDockerHostEnvVar()\n\t\tsshServer.SetDockerHostEnvVar(host)\n\t\treturn func() {\n\t\t\tsshServer.SetDockerHostEnvVar(oldHost)\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "internal/sshdialer/testdata/etc/ssh/ssh_host_ecdsa_key",
    "content": "-----BEGIN OPENSSH PRIVATE KEY-----\nb3BlbnNzaC1rZXktdjEAAAAABG5vbmUAAAAEbm9uZQAAAAAAAAABAAAAaAAAABNlY2RzYS\n1zaGEyLW5pc3RwMjU2AAAACG5pc3RwMjU2AAAAQQRDB1fPXY58fSwaqyoj5lCfLtQ/NcIs\ngrKTA11vypVy9MUCWtdAQIXczmtRMTFCVozk3lwt9M4iKc79nCkkkfyrAAAAsPgat2v4Gr\ndrAAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEMHV89djnx9LBqr\nKiPmUJ8u1D81wiyCspMDXW/KlXL0xQJa10BAhdzOa1ExMUJWjOTeXC30ziIpzv2cKSSR/K\nsAAAAhAIcs2smJGAEKOvzL8Rfz5b1IpQqB8GzxycT3/53XOzaSAAAAEG12YXNla0BiZWxs\nYXRyaXgBAgMEBQYH\n-----END OPENSSH PRIVATE KEY-----\n"
  },
  {
    "path": "internal/sshdialer/testdata/etc/ssh/ssh_host_ecdsa_key.pub",
    "content": "ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEMHV89djnx9LBqrKiPmUJ8u1D81wiyCspMDXW/KlXL0xQJa10BAhdzOa1ExMUJWjOTeXC30ziIpzv2cKSSR/Ks= mvasek@bellatrix\n"
  },
  {
    "path": "internal/sshdialer/testdata/etc/ssh/ssh_host_ed25519_key",
    "content": "-----BEGIN OPENSSH PRIVATE KEY-----\nb3BlbnNzaC1rZXktdjEAAAAABG5vbmUAAAAEbm9uZQAAAAAAAAABAAAAMwAAAAtzc2gtZW\nQyNTUxOQAAACAhhd7wuRBaf9R7Q/HQi7lEWoukb/HrYDg394NpeOgsbAAAAJjz02VI89Nl\nSAAAAAtzc2gtZWQyNTUxOQAAACAhhd7wuRBaf9R7Q/HQi7lEWoukb/HrYDg394NpeOgsbA\nAAAEC9lvHqAASWXcZSm/Rih3V78uMejs+6sc6SOVhaogLwHyGF3vC5EFp/1HtD8dCLuURa\ni6Rv8etgODf3g2l46CxsAAAAEG12YXNla0BiZWxsYXRyaXgBAgMEBQ==\n-----END OPENSSH PRIVATE KEY-----\n"
  },
  {
    "path": "internal/sshdialer/testdata/etc/ssh/ssh_host_ed25519_key.pub",
    "content": "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICGF3vC5EFp/1HtD8dCLuURai6Rv8etgODf3g2l46Cxs mvasek@bellatrix\n"
  },
  {
    "path": "internal/sshdialer/testdata/etc/ssh/ssh_host_rsa_key",
    "content": "-----BEGIN OPENSSH PRIVATE KEY-----\nb3BlbnNzaC1rZXktdjEAAAAABG5vbmUAAAAEbm9uZQAAAAAAAAABAAABlwAAAAdzc2gtcn\nNhAAAAAwEAAQAAAYEA1VHmRdtHzOhCZCmcrYJdF9VjdPTXc9Hid9Bzexk0QfnG6C15gOyb\nbmX5YVxgViHzrrpaodLDCYBWu4+l2DPeG5gaIa8a1WAhgvEQRfJh09JcVkVWdz9A7Umgtb\n6k3d0QwlYaZrLS1SZGYg+ioxolebzAs/+dm1yfkSRXNf3fPuhyubO6AIBMBdcnIz3cwu3N\nZv6o3LshFYym7DaXA/LdHwmMZe2JaksqSbafbaxqZ6kIjvWUeOrI6R/3uLWM1BTMir9inn\n8iRrcxnAtSVG+Q5XgvakPuZDXvzyPP66kOcnT1x8DDHcy4PD0SyztDIjHWXp/XBwqDqWQb\nhiUBaOjeyYJ6qo9ZJnrGoaip5A8TlKtaDFS6aX1obHxYet+SvA5sKja/cx33NLh0FRfXXZ\nmkngEvrmP9oGX/AegMTbzE+t2kKDU80Mye1LmxWN/e8Rudib41hVTZ0U8PZcHmgAOMcmGf\nA4iVmWc+vEc5oYeY0WSk/zQKBTYNlQsNw6+O7qT3AAAFiCBt4WEgbeFhAAAAB3NzaC1yc2\nEAAAGBANVR5kXbR8zoQmQpnK2CXRfVY3T013PR4nfQc3sZNEH5xugteYDsm25l+WFcYFYh\n8666WqHSwwmAVruPpdgz3huYGiGvGtVgIYLxEEXyYdPSXFZFVnc/QO1JoLW+pN3dEMJWGm\nay0tUmRmIPoqMaJXm8wLP/nZtcn5EkVzX93z7ocrmzugCATAXXJyM93MLtzWb+qNy7IRWM\npuw2lwPy3R8JjGXtiWpLKkm2n22samepCI71lHjqyOkf97i1jNQUzIq/Yp5/Ika3MZwLUl\nRvkOV4L2pD7mQ1788jz+upDnJ09cfAwx3MuDw9Ess7QyIx1l6f1wcKg6lkG4YlAWjo3smC\neqqPWSZ6xqGoqeQPE5SrWgxUuml9aGx8WHrfkrwObCo2v3Md9zS4dBUX112ZpJ4BL65j/a\nBl/wHoDE28xPrdpCg1PNDMntS5sVjf3vEbnYm+NYVU2dFPD2XB5oADjHJhnwOIlZlnPrxH\nOaGHmNFkpP80CgU2DZULDcOvju6k9wAAAAMBAAEAAAGALccVk4grMF3nYXdMmC+RqruwTD\nj+w2wXHX8uSQxvmnjvpoObv38HG/nmOm6IffNrR+PV70Q7dp6D/lwlSvBWibVqZjAdogyv\nJFp3E4ugUsSh7CGVHKIGXOWgB2CSIMp//jRcFg3qELPWBtU0IaxKvoUzFW2VdPG7jHov/P\nYuImHfvNpE4DaoGdjCHV35Mhu2KJQdyMCfqPA2IhrU7ZQAv9hcuMLw6k6XFJqMPAz0CKrN\nm2A4LHq2AtFJZ+oN/rU3izl01xvH3WndwEJJ16L49ItvDvPCb8WJJJx5jmvMF7o3qJlOmM\nT39JvPB1zyQnClv6wOHQDUPLR0MNiKJ2OFVZWw9Ay7hgYXERqazkKctI0Hh4DtgFJNtOSN\nML5EEAgEYRlKsikhYRra922Gi6p7DJthhtd/e2QnHKnRLX2fed0rkBVS5qlN7baR/ujF9n\n5C+SN1+OsNSBhbe4aPAYapXKnN9UCDMyz6kDv0eSQs/arkgacoGzCtfnd/z5Jpl9kBAAAA\nwETMrjKmEAVT/w2xheu/Asky7ldZ6L4Y/Tm6g5htjgJYMz6PGNrTLQTz5fFlH5yphps+g4\noqYxJHgqbAD4DT4v8Lfc8CkQ/vy9xZ/jYrn6qD4+/5Mm8Vd7eOTvFJfBut8ZT9ZDvc5kGi\n2rfKSq2kvq0mSGhcdLM+6jbjC4twIQ373O3i3M1XNAO/L7FFhMSex0go5O+41Ni08kHllq\nvcQ
Xd9XvVzGkgZTI3wWrFe6+C0h3qLLLrr4Kgt39ZoHoOsBwAAAMEA+/0ZAd4/7hogsstf\nnp+OXNS4deFssz630m7ncJQDUEpgCQnRZphwABJoinfzj9oKSBfuiVfgvL/DJn/OjNhNT+\nknGY1nVB3mHIQbqb+AeLKilO3QG4tjOZzI7uxN0mmswN/brHcEKd4lQP6AMyXndF5S6Xv5\n8HmdFm+8HvPmcDPS8Mh8NM2gaPVH1O67Iv//DUBeriaN1GqXBXOTnT03es8cAw10I5898U\nKO9s/+Pe9dFcXVg7cemDNBr6Oz1E9JAAAAwQDYtzggjIQbtn3xHNC+5iIkXz+4oRkSYtC0\nFeymq7bmo3Xy1Dp85C2Oz3DSKjwR2mo5vPREV784NMiyIDFnnGxSX0Aq0xAPfmwvMlF9DU\nN4aJ/LIA2Fk2c3M2ZirUi600QNDweuaNnPN5OFBLMHxp6+olBZ22A/utN9ErWxSDJIUlix\ncDFkcDPhozdShr+hwzN3bGGJZq8+UZ6/y6gSb1gcNf7Ly1vupINRvmRin5xKNp/t4MaqZp\npAnGpiWXM8Ej8AAAAQbXZhc2VrQGJlbGxhdHJpeAECAw==\n-----END OPENSSH PRIVATE KEY-----\n"
  },
  {
    "path": "internal/sshdialer/testdata/etc/ssh/ssh_host_rsa_key.pub",
    "content": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVUeZF20fM6EJkKZytgl0X1WN09Ndz0eJ30HN7GTRB+cboLXmA7JtuZflhXGBWIfOuulqh0sMJgFa7j6XYM94bmBohrxrVYCGC8RBF8mHT0lxWRVZ3P0DtSaC1vqTd3RDCVhpmstLVJkZiD6KjGiV5vMCz/52bXJ+RJFc1/d8+6HK5s7oAgEwF1ycjPdzC7c1m/qjcuyEVjKbsNpcD8t0fCYxl7YlqSypJtp9trGpnqQiO9ZR46sjpH/e4tYzUFMyKv2KefyJGtzGcC1JUb5DleC9qQ+5kNe/PI8/rqQ5ydPXHwMMdzLg8PRLLO0MiMdZen9cHCoOpZBuGJQFo6N7Jgnqqj1kmesahqKnkDxOUq1oMVLppfWhsfFh635K8DmwqNr9zHfc0uHQVF9ddmaSeAS+uY/2gZf8B6AxNvMT63aQoNTzQzJ7UubFY397xG52JvjWFVNnRTw9lweaAA4xyYZ8DiJWZZz68Rzmhh5jRZKT/NAoFNg2VCw3Dr47upPc= mvasek@bellatrix\n"
  },
  {
    "path": "internal/sshdialer/testdata/id_dsa",
    "content": "somegarbage\nsomerugish"
  },
  {
    "path": "internal/sshdialer/testdata/id_dsa.pub",
    "content": "somegarbage\nsomerugish"
  },
  {
    "path": "internal/sshdialer/testdata/id_ed25519",
    "content": "-----BEGIN OPENSSH PRIVATE KEY-----\nb3BlbnNzaC1rZXktdjEAAAAABG5vbmUAAAAEbm9uZQAAAAAAAAABAAAAMwAAAAtzc2gtZW\nQyNTUxOQAAACCVbPGj8U+Zjq5NEgyP2996RFk+lIrXNMYLqLLikeJRJQAAAJjDwa3Yw8Gt\n2AAAAAtzc2gtZWQyNTUxOQAAACCVbPGj8U+Zjq5NEgyP2996RFk+lIrXNMYLqLLikeJRJQ\nAAAEDti7Y7pPMfJq3Cwztd3ZiM1orRTIibsTH2Y/NQPPFiHpVs8aPxT5mOrk0SDI/b33pE\nWT6Uitc0xguosuKR4lElAAAAFHRlc3R1c2VyQGV4YW1wbGUuY29tAQ==\n-----END OPENSSH PRIVATE KEY-----\n"
  },
  {
    "path": "internal/sshdialer/testdata/id_ed25519.pub",
    "content": "ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJVs8aPxT5mOrk0SDI/b33pEWT6Uitc0xguosuKR4lEl testuser@example.com\n"
  },
  {
    "path": "internal/sshdialer/testdata/id_rsa",
    "content": "-----BEGIN OPENSSH PRIVATE KEY-----\nb3BlbnNzaC1rZXktdjEAAAAACmFlczI1Ni1jdHIAAAAGYmNyeXB0AAAAGAAAABBqK6VQTJ\n8oCGZAMOYnHQg4AAAAEAAAAAEAAAIXAAAAB3NzaC1yc2EAAAADAQABAAACAQDe2RqNezOI\n3cRJq+PQYlcASjYRaJgFd/AjAYtB+u8C7C8OkuelIaiYavjUn1+Sx3VOkqSVwA7J7zUzUP\ncx/83BjCffvHtXLAmYfxojd9Z7Bh05kV1Ayx7Pn8xEbEkZpffLGrIy98Vs2MYUU7K7JhLO\nInkEfD1qolV5IpoIKazkulJEpsdTalrSn53IO0Afa+aayZIG+Fc4RiTHUqk6YcLZQnAcUA\noa6WCC31cIOhbllv83hVwjx9b5ZgX509o/WrBsR2cnL7qemBDJmq4RkL6RVcM0lyaGyMn+\nPfYCXbx4pn3bxdw7An3BS3/O/hTEFLh2tCJc55u7xbKJX265RpKMt/BdU8+VLuA8assVBk\nNhDwlLI+jvmC8fzqE7H/PLWO2XgTkMpDybEy2n4+8nRElnj4RB2J+vagiJPZFrr5SWQHMU\nD8fHAYvWo0AHPLly+NgkEu2ZRoAV3gRwZm4C8u9tXMNf4UNVcyxpgaylZwxnaiLNjwx6UC\nhcA0TAAGPygJ3UJ8D/Gakj3iekOd7Zcl7oE9dzByjIolVaPqXChAjfB8wWSiJ2LZRhVklL\nwHm+wKLEpIBhzai4OnF6k4NTyo1wxlfHb90rbyfn6/arUTJX7cU1NFd7cUw0lh+fL/Q15n\nJZsd5oCNt3yCTS9fpo9rPPPjCkL+XqUzVVOgbRK5IknwAAB1BqWuTHt/A4IsbG6ieTHPDv\nCuxsBR8OlbLcI3+QYoa0KR3fVgJbP4IWx5c/RkzF6+t394a3xkat8CYaQyZOmEcNCB4cAH\nqYnFTrEY6Q+zB76THbGPt1/1GQS3Y/UJsum4PYUG9cgT+ckmV7Zuk7Zcd3mP/kpRnDOHX4\nptKARsD8gCttLUu0UbdFoV/cLqDxvpc4VY2hxV/zP30wrP4w/o4ajvQIPji3QR52K4oQpq\ngFy1hitMS7rl/VduX7wQe/xRQeZmln7LywMEhC8IFLJsEFg1IqLPeiVM52GBBH1uBQLPRs\nAmiLLeCgP8DGPqwp+AGkWFmlHC+OVDxos0S43WlePgrLfWBfr0z+ezAMLVskkDqtkflZC9\nVbFA7caV+7W508YhXjWf+D0HP48jhjfsX8ZtDyG1hI/zW6l5hDeTAvyNH6f8VJ88P4kKXe\nSuvIOcMiBHJKw3lsQb7/bx9byVcy+dLrfttceITm/m+ebK+IYlrmR2qPWwJR43o0l9oyvO\nUDahn5qUFFP7HhAEFsg6X03nT200IIaqdc3K7+KfY+OdhP2H33e0e3wbMglgKqd2a4Co/o\nLLAq3V8/ViEdxhWZgNruy4AnzVsA18vmLvIk5cEh/jOMkB/O10hCpfRbQp/MEHT5a8BQph\nLYNLFLF4WrpSPJwRvNjCbW5IDH0bNh6ochwX0kzv18djPYe1mEhR22MaQMYP3VivXB2A8c\nv44dk6dQfzP7qqCYJP+GueGvRpvZTwC86R6wS/ppPCe4mbF0ANgB+x7UqkXzCpJ9CxaPl+\nEpbYmJVaHzWjxfVGWKAX1rZq3wfO0tLwppNkpVvi9ntNPx6frwEUcuiQ0aaF9Jutv3Voxn\nqA2wq5koWrUZuvu5GlwADlNhXh6kN2z6MCajGBTJ2KzvN80BnzqrL5LCYMJLWYWma4WYSD\nEOIcDeZxBtE23FmJTMf3WcexB6gGCmHoumC7rvnEpjWM2MFct1gzBhKfSciFbjQYxYH/je\nOgVyTsGr7ypqbyEAb/Ao4NWD97XV4WDXTmpI239pndEZjYetRUAhESEIG69a6KIPFPLD7m\nd2z
FONaL6okT8ozr/CAE7Q8z6SPRC5tx4lmFatXGW0i3LjSK90ilw8R9v4Ji7RXVSsWxBd\nkqcodchO2w1OWVVo/xqeEmvJKms8uYf2EpZfaCNS7WsZPJYHulwSG9WX0SabYRO+k9w3rj\nPjEP9DS6VV/j2TEkCoNhEuR3VF0GlVbfNfFfefsrfH6FmTawW8UFvMYoPNxgH3sZAjvfzx\nRGC3Iysvao72wNBIlDowgUPap8A+snUgQZ4YoCefhGSbcPDeuqqx6bDlEUVQkeEARYWgj6\nxek869h045LbDqdKiQbjhBpuIRbe7cjKjIBgG7X2Vl/W7fOMoHN8QOW6Y+Tzohyz8rEmAp\nMRgXlqs2hO/LGs28EXBUxKgaaQghzVz/LwGe06bUFRUL0ERaI2wHvM2a/mYgnMYytFeFFG\nICqnF+0dPLeUuoQBT3KWGspGa6tR+ZA11GBCK+7TYZi23e1n0zQptxl1me16q/NGHNfpQf\nGEKCML2RzSH+1p9hflIRMN1WQMRzPHu6gTHJkxnlKrTBGgHwAaG7vCEaQTbIcHVnenIjB5\nFntxr2gEz5dyiKnS52D2uNbJDhUu/Z/hKw8pTmLC7YcZO+wTneFtaZ4AxcMQXG5xF51vjy\nPow72o0M6gL1s6te9BmiYFxfmIWERmJ+mlmR80JSTVFcjma38bWX567gooXTyvd+GvaVIl\nmIx6EjdRqe5TK/r+JU92ggcyobTZI6iaYaOBSpi/i4hbpWX6n2RIoN1HYui5CMf2wvQT2F\njFrbmbi24d8xSw6DJeiDXuPVdIdYVqcLWIfAdNgNzVGw2cmagzTLjQrRZlCnpjTY/Uznj1\nJ7LivTdju+Pb7rbbxFJqeubnee/1gga2sIk2lgGcgOQmaDI8QMms4Pt47rG2w2hluP054V\nm918D7Y3dNNwJnp0G6Wv2Ecyo5A2oS4Jd6g1szn4dW7C6J5vWI8+eZyhW8VIKtDlMOftJX\n5DkEeXKrJD8+1WBo2kK2E+sfpzSvHjQQ0jXe39JMnogyNt45batknxLk+llZSNbkWI1IiR\nVzB7wkeH/r/zvz+pOSan1PXI0P+JYqEx7PEk4EL5hXwMViMUrZEgGxaMqFdWpjnY3fzA7J\niF4NoP2Gy21bC2KiZ4/3PshSBPprYkAyfkgQZrlgD1+ubcMYYrbVWK/bilyuDv/tQaW0Wi\nF4SsQaMRm8X/5wth5CH1M46diExCajv5HKMOuTJ0ML3oJphyuvUpmuvHG+VcU/tgKEDgJM\n1FWoLJqgdlNspUZYcQYTU2MH8sQFDNNnn3TFxFfKmdtLxpsKq1OGIuNPJKPGfd9kiUw4kP\nHKhZDUt2MpCtjKzl/wyGt+giNUL2t51XAT1eqA4Z1FfFU7ZtSQoZ95JA5VfR1yl8F6rWq8\nMUnDXSP1KUo0fm3Ojh6J25+2w=\n-----END OPENSSH PRIVATE KEY-----\n"
  },
  {
    "path": "internal/sshdialer/testdata/id_rsa.pub",
    "content": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDe2RqNezOI3cRJq+PQYlcASjYRaJgFd/AjAYtB+u8C7C8OkuelIaiYavjUn1+Sx3VOkqSVwA7J7zUzUPcx/83BjCffvHtXLAmYfxojd9Z7Bh05kV1Ayx7Pn8xEbEkZpffLGrIy98Vs2MYUU7K7JhLOInkEfD1qolV5IpoIKazkulJEpsdTalrSn53IO0Afa+aayZIG+Fc4RiTHUqk6YcLZQnAcUAoa6WCC31cIOhbllv83hVwjx9b5ZgX509o/WrBsR2cnL7qemBDJmq4RkL6RVcM0lyaGyMn+PfYCXbx4pn3bxdw7An3BS3/O/hTEFLh2tCJc55u7xbKJX265RpKMt/BdU8+VLuA8assVBkNhDwlLI+jvmC8fzqE7H/PLWO2XgTkMpDybEy2n4+8nRElnj4RB2J+vagiJPZFrr5SWQHMUD8fHAYvWo0AHPLly+NgkEu2ZRoAV3gRwZm4C8u9tXMNf4UNVcyxpgaylZwxnaiLNjwx6UChcA0TAAGPygJ3UJ8D/Gakj3iekOd7Zcl7oE9dzByjIolVaPqXChAjfB8wWSiJ2LZRhVklLwHm+wKLEpIBhzai4OnF6k4NTyo1wxlfHb90rbyfn6/arUTJX7cU1NFd7cUw0lh+fL/Q15nJZsd5oCNt3yCTS9fpo9rPPPjCkL+XqUzVVOgbRK5Iknw== testuser@example.com\n"
  },
  {
    "path": "internal/sshdialer/windows_test.go",
    "content": "//go:build windows\n\npackage sshdialer_test\n\nimport (\n\t\"errors\"\n\t\"net\"\n\t\"os/user\"\n\t\"strings\"\n\n\t\"github.com/Microsoft/go-winio\"\n\t\"github.com/hectane/go-acl\"\n)\n\nfunc fixupPrivateKeyMod(path string) {\n\tusr, err := user.Current()\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\tmode := uint32(0400)\n\terr = acl.Apply(path,\n\t\ttrue,\n\t\tfalse,\n\t\tacl.GrantName(((mode&0700)<<23)|((mode&0200)<<9), usr.Username))\n\n\t// See https://github.com/hectane/go-acl/issues/1\n\tif err != nil && err.Error() != \"The operation completed successfully.\" {\n\t\tpanic(err)\n\t}\n}\n\nfunc listen(addr string) (net.Listener, error) {\n\tif strings.Contains(addr, \"\\\\pipe\\\\\") {\n\t\treturn winio.ListenPipe(addr, nil)\n\t}\n\treturn net.Listen(\"unix\", addr)\n}\n\nfunc isErrClosed(err error) bool {\n\treturn errors.Is(err, net.ErrClosed) || errors.Is(err, winio.ErrPipeListenerClosed) || errors.Is(err, winio.ErrFileClosed)\n}\n"
  },
  {
    "path": "internal/stack/merge.go",
    "content": "package stack\n\nimport (\n\t\"sort\"\n\n\t\"github.com/buildpacks/pack/internal/stringset\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\nconst WildcardStack = \"*\"\n\n// MergeCompatible determines the allowable set of stacks that a combination of buildpacks may run on, given each\n// buildpack's set of stacks. Compatibility between the two sets of buildpack stacks is defined by the following rules:\n//\n//  1. The stack must be supported by both buildpacks. That is, any resulting stack ID must appear in both input sets.\n//  2. For each supported stack ID, all required mixins for all buildpacks must be provided by the result. That is,\n//     mixins for the stack ID in both input sets are unioned.\n//  3. If exactly one of the stack lists contains a wildcard stack, the list without the wildcard stack is returned.\n//  4. If both stack lists contain a wildcard stack, a list containing just the wildcard stack is returned.\n//\n// ---\n//\n// Examples:\n//\n//\tstacksA = [{ID: \"stack1\", mixins: [\"build:mixinA\", \"mixinB\", \"run:mixinC\"]}}]\n//\tstacksB = [{ID: \"stack1\", mixins: [\"build:mixinA\", \"run:mixinC\"]}}]\n//\tresult = [{ID: \"stack1\", mixins: [\"build:mixinA\", \"mixinB\", \"run:mixinC\"]}}]\n//\n//\tstacksA = [{ID: \"stack1\", mixins: [\"build:mixinA\"]}}, {ID: \"stack2\", mixins: [\"mixinA\"]}}]\n//\tstacksB = [{ID: \"stack1\", mixins: [\"run:mixinC\"]}}, {ID: \"stack2\", mixins: [\"mixinA\"]}}]\n//\tresult = [{ID: \"stack1\", mixins: [\"build:mixinA\", \"run:mixinC\"]}}, {ID: \"stack2\", mixins: [\"mixinA\"]}}]\n//\n//\tstacksA = [{ID: \"stack1\", mixins: [\"build:mixinA\"]}}, {ID: \"stack2\", mixins: [\"mixinA\"]}}]\n//\tstacksB = [{ID: \"stack2\", mixins: [\"mixinA\", \"run:mixinB\"]}}]\n//\tresult = [{ID: \"stack2\", mixins: [\"mixinA\", \"run:mixinB\"]}}]\n//\n//\tstacksA = [{ID: \"stack1\", mixins: [\"build:mixinA\"]}}]\n//\tstacksB = [{ID: \"stack2\", mixins: [\"mixinA\", \"run:mixinB\"]}}]\n//\tresult = 
[]\n//\n//\tstacksA = [{ID: \"*\"}, {ID: \"stack1\", mixins: [\"build:mixinC\"]}]\n//\tstacksB = [{ID: \"stack1\", mixins: [\"build:mixinA\"]}, {ID: \"stack2\", mixins: [\"mixinA\", \"run:mixinB\"]}]\n//\tresult = [{ID: \"stack1\", mixins: [\"build:mixinA\"]}, {ID: \"stack2\", mixins: [\"mixinA\", \"run:mixinB\"]}]\n//\n//\tstacksA = [{ID: \"stack1\", mixins: [\"build:mixinA\"]}, {ID: \"stack2\", mixins: [\"mixinA\", \"run:mixinB\"]}]\n//\tstacksB = [{ID: \"*\"}, {ID: \"stack1\", mixins: [\"build:mixinC\"]}]\n//\tresult = [{ID: \"stack1\", mixins: [\"build:mixinA\"]}, {ID: \"stack2\", mixins: [\"mixinA\", \"run:mixinB\"]}]\n//\n//\tstacksA = [{ID: \"*\"}, {ID: \"stack1\", mixins: [\"build:mixinA\"]}, {ID: \"stack2\", mixins: [\"mixinA\", \"run:mixinB\"]}]\n//\tstacksB = [{ID: \"*\"}, {ID: \"stack1\", mixins: [\"build:mixinC\"]}]\n//\tresult = [{ID: \"*\"}]\nfunc MergeCompatible(stacksA []dist.Stack, stacksB []dist.Stack) []dist.Stack {\n\tset := map[string][]string{}\n\tAHasWildcardStack, BHasWildcardStack := false, false\n\n\tfor _, s := range stacksA {\n\t\tset[s.ID] = s.Mixins\n\t\tif s.ID == WildcardStack {\n\t\t\tAHasWildcardStack = true\n\t\t}\n\t}\n\n\tfor _, s := range stacksB {\n\t\tif s.ID == WildcardStack {\n\t\t\tBHasWildcardStack = true\n\t\t}\n\t}\n\n\tif AHasWildcardStack && BHasWildcardStack {\n\t\treturn []dist.Stack{{ID: WildcardStack}}\n\t}\n\n\tif AHasWildcardStack {\n\t\treturn stacksB\n\t}\n\n\tif BHasWildcardStack {\n\t\treturn stacksA\n\t}\n\n\tvar results []dist.Stack\n\n\tfor _, s := range stacksB {\n\t\tif stackMixins, ok := set[s.ID]; ok {\n\t\t\tmixinsSet := stringset.FromSlice(append(stackMixins, s.Mixins...))\n\t\t\tvar mixins []string\n\t\t\tfor m := range mixinsSet {\n\t\t\t\tmixins = append(mixins, m)\n\t\t\t}\n\t\t\tsort.Strings(mixins)\n\n\t\t\tresults = append(results, dist.Stack{\n\t\t\t\tID:     s.ID,\n\t\t\t\tMixins: mixins,\n\t\t\t})\n\t\t}\n\t}\n\n\treturn results\n}\n"
  },
  {
    "path": "internal/stack/merge_test.go",
    "content": "package stack_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/stack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestMerge(t *testing.T) {\n\tspec.Run(t, \"testMerge\", testMerge, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testMerge(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"MergeCompatible\", func() {\n\t\twhen(\"a stack has more mixins than the other\", func() {\n\t\t\tit(\"add mixins\", func() {\n\t\t\t\tresult := stack.MergeCompatible(\n\t\t\t\t\t[]dist.Stack{{ID: \"stack1\", Mixins: []string{\"build:mixinA\", \"mixinB\", \"run:mixinC\"}}},\n\t\t\t\t\t[]dist.Stack{{ID: \"stack1\", Mixins: []string{\"build:mixinA\", \"run:mixinC\"}}},\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, len(result), 1)\n\t\t\t\th.AssertEq(t, result, []dist.Stack{{ID: \"stack1\", Mixins: []string{\"build:mixinA\", \"mixinB\", \"run:mixinC\"}}})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"stacks don't match id\", func() {\n\t\t\tit(\"returns no stacks\", func() {\n\t\t\t\tresult := stack.MergeCompatible(\n\t\t\t\t\t[]dist.Stack{{ID: \"stack1\", Mixins: []string{\"build:mixinA\", \"mixinB\", \"run:mixinC\"}}},\n\t\t\t\t\t[]dist.Stack{{ID: \"stack2\", Mixins: []string{\"build:mixinA\", \"run:mixinC\"}}},\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, len(result), 0)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a set of stacks has extra stacks\", func() {\n\t\t\tit(\"removes extra stacks\", func() {\n\t\t\t\tresult := stack.MergeCompatible(\n\t\t\t\t\t[]dist.Stack{{ID: \"stack1\"}},\n\t\t\t\t\t[]dist.Stack{\n\t\t\t\t\t\t{ID: \"stack1\"},\n\t\t\t\t\t\t{ID: \"stack2\"},\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, len(result), 1)\n\t\t\t\th.AssertEq(t, result, []dist.Stack{{ID: \"stack1\"}})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"a set has a wildcard stack\", func() {\n\t\t\tit(\"returns the other set of stacks\", func() {\n\t\t\t\tresult := 
stack.MergeCompatible(\n\t\t\t\t\t[]dist.Stack{{ID: \"*\"}},\n\t\t\t\t\t[]dist.Stack{\n\t\t\t\t\t\t{ID: \"stack1\"},\n\t\t\t\t\t\t{ID: \"stack2\"},\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, len(result), 2)\n\t\t\t\th.AssertEq(t, result, []dist.Stack{\n\t\t\t\t\t{ID: \"stack1\"},\n\t\t\t\t\t{ID: \"stack2\"},\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"returns the other set of stacks\", func() {\n\t\t\t\tresult := stack.MergeCompatible(\n\t\t\t\t\t[]dist.Stack{\n\t\t\t\t\t\t{ID: \"stack1\"},\n\t\t\t\t\t\t{ID: \"stack2\"},\n\t\t\t\t\t},\n\t\t\t\t\t[]dist.Stack{{ID: \"*\"}},\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, len(result), 2)\n\t\t\t\th.AssertEq(t, result, []dist.Stack{\n\t\t\t\t\t{ID: \"stack1\"},\n\t\t\t\t\t{ID: \"stack2\"},\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"returns the wildcard stack\", func() {\n\t\t\t\tresult := stack.MergeCompatible(\n\t\t\t\t\t[]dist.Stack{\n\t\t\t\t\t\t{ID: \"stack1\"},\n\t\t\t\t\t\t{ID: \"stack2\"},\n\t\t\t\t\t\t{ID: \"*\"},\n\t\t\t\t\t},\n\t\t\t\t\t[]dist.Stack{\n\t\t\t\t\t\t{ID: \"*\"},\n\t\t\t\t\t\t{ID: \"stack3\"},\n\t\t\t\t\t\t{ID: \"stack1\"},\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\th.AssertEq(t, len(result), 1)\n\t\t\t\th.AssertEq(t, result, []dist.Stack{\n\t\t\t\t\t{ID: \"*\"},\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/stack/mixins.go",
    "content": "package stack\n\nimport (\n\t\"fmt\"\n\t\"sort\"\n\t\"strings\"\n\n\t\"github.com/buildpacks/pack/internal/stringset\"\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nconst MixinsLabel = \"io.buildpacks.stack.mixins\"\n\nfunc ValidateMixins(buildImageName string, buildImageMixins []string, runImageName string, runImageMixins []string) error {\n\tif invalid := FindStageMixins(buildImageMixins, \"run\"); len(invalid) > 0 {\n\t\tsort.Strings(invalid)\n\t\treturn fmt.Errorf(\"%s contains run-only mixin(s): %s\", style.Symbol(buildImageName), strings.Join(invalid, \", \"))\n\t}\n\n\tif invalid := FindStageMixins(runImageMixins, \"build\"); len(invalid) > 0 {\n\t\tsort.Strings(invalid)\n\t\treturn fmt.Errorf(\"%s contains build-only mixin(s): %s\", style.Symbol(runImageName), strings.Join(invalid, \", \"))\n\t}\n\n\tbuildImageMixins = removeStageMixins(buildImageMixins, \"build\")\n\trunImageMixins = removeStageMixins(runImageMixins, \"run\")\n\n\t_, missing, _ := stringset.Compare(runImageMixins, buildImageMixins)\n\n\tif len(missing) > 0 {\n\t\tsort.Strings(missing)\n\t\treturn fmt.Errorf(\"%s missing required mixin(s): %s\", style.Symbol(runImageName), strings.Join(missing, \", \"))\n\t}\n\treturn nil\n}\n\nfunc FindStageMixins(mixins []string, stage string) []string {\n\tvar found []string\n\tfor _, m := range mixins {\n\t\tif strings.HasPrefix(m, stage+\":\") {\n\t\t\tfound = append(found, m)\n\t\t}\n\t}\n\treturn found\n}\n\nfunc removeStageMixins(mixins []string, stage string) []string {\n\tvar filtered []string\n\tfor _, m := range mixins {\n\t\tif !strings.HasPrefix(m, stage+\":\") {\n\t\t\tfiltered = append(filtered, m)\n\t\t}\n\t}\n\treturn filtered\n}\n"
  },
  {
    "path": "internal/stack/mixins_test.go",
    "content": "package stack_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/stack\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestMixinValidation(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testMixinValidation\", testMixinValidation, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testMixinValidation(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#ValidateMixins\", func() {\n\t\tit(\"ignores stage-specific mixins\", func() {\n\t\t\tbuildMixins := []string{\"mixinA\", \"build:mixinB\"}\n\t\t\trunMixins := []string{\"mixinA\", \"run:mixinC\"}\n\n\t\t\th.AssertNil(t, stack.ValidateMixins(\"some/build\", buildMixins, \"some/run\", runMixins))\n\t\t})\n\n\t\tit(\"allows extraneous run image mixins\", func() {\n\t\t\tbuildMixins := []string{\"mixinA\"}\n\t\t\trunMixins := []string{\"mixinA\", \"mixinB\"}\n\n\t\t\th.AssertNil(t, stack.ValidateMixins(\"some/build\", buildMixins, \"some/run\", runMixins))\n\t\t})\n\n\t\tit(\"returns an error with any missing run image mixins\", func() {\n\t\t\tbuildMixins := []string{\"mixinA\", \"mixinB\"}\n\t\t\trunMixins := []string{}\n\n\t\t\terr := stack.ValidateMixins(\"some/build\", buildMixins, \"some/run\", runMixins)\n\n\t\t\th.AssertError(t, err, \"'some/run' missing required mixin(s): mixinA, mixinB\")\n\t\t})\n\n\t\tit(\"returns an error with any invalid build image mixins\", func() {\n\t\t\tbuildMixins := []string{\"run:mixinA\", \"run:mixinB\"}\n\t\t\trunMixins := []string{}\n\n\t\t\terr := stack.ValidateMixins(\"some/build\", buildMixins, \"some/run\", runMixins)\n\n\t\t\th.AssertError(t, err, \"'some/build' contains run-only mixin(s): run:mixinA, run:mixinB\")\n\t\t})\n\n\t\tit(\"returns an error with any invalid run image mixins\", func() {\n\t\t\tbuildMixins := []string{}\n\t\t\trunMixins := []string{\"build:mixinA\", 
\"build:mixinB\"}\n\n\t\t\terr := stack.ValidateMixins(\"some/build\", buildMixins, \"some/run\", runMixins)\n\n\t\t\th.AssertError(t, err, \"'some/run' contains build-only mixin(s): build:mixinA, build:mixinB\")\n\t\t})\n\t})\n\n\twhen(\"#FindStageMixins\", func() {\n\t\tit(\"returns mixins with stage prefix\", func() {\n\t\t\tmixins := []string{\"mixinA\", \"run:mixinB\", \"mixinC\", \"run:mixinD\", \"build:mixinE\"}\n\n\t\t\trunMixins := stack.FindStageMixins(mixins, \"run\")\n\n\t\t\th.AssertEq(t, runMixins, []string{\"run:mixinB\", \"run:mixinD\"})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/strings/strings.go",
    "content": "package strings\n\nimport (\n\t\"golang.org/x/text/cases\"\n\t\"golang.org/x/text/language\"\n)\n\nfunc ValueOrDefault(str, def string) string {\n\tif str == \"\" {\n\t\treturn def\n\t}\n\n\treturn str\n}\n\nfunc Title(lower string) string {\n\treturn cases.Title(language.English).String(lower)\n}\n"
  },
  {
    "path": "internal/strings/strings_test.go",
    "content": "package strings_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/internal/strings\"\n\n\t\"github.com/sclevine/spec\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestValueOrDefault(t *testing.T) {\n\tspec.Run(t, \"Strings\", func(t *testing.T, when spec.G, it spec.S) {\n\t\tvar (\n\t\t\tassert = h.NewAssertionManager(t)\n\t\t)\n\n\t\twhen(\"#ValueOrDefault\", func() {\n\t\t\tit(\"returns value when value is non-empty\", func() {\n\t\t\t\toutput := strings.ValueOrDefault(\"some-value\", \"-\")\n\t\t\t\tassert.Equal(output, \"some-value\")\n\t\t\t})\n\n\t\t\tit(\"returns default when value is empty\", func() {\n\t\t\t\toutput := strings.ValueOrDefault(\"\", \"-\")\n\t\t\t\tassert.Equal(output, \"-\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"#Title\", func() {\n\t\t\tit(\"returns the provided string with title casing\", func() {\n\t\t\t\toutput := strings.Title(\"to title case\")\n\t\t\t\tassert.Equal(output, \"To Title Case\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/stringset/stringset.go",
    "content": "package stringset\n\n// FromSlice converts the given slice to a set in the form of unique keys in a map.\n// The value associated with each key should not be relied upon. A value is present\n// in the set if its key is present in the map, regardless of the key's value.\nfunc FromSlice(strings []string) map[string]interface{} {\n\tset := map[string]interface{}{}\n\tfor _, s := range strings {\n\t\tset[s] = nil\n\t}\n\treturn set\n}\n\n// Compare performs a set comparison between two slices. `extra` represents elements present in\n// `strings1` but not `strings2`. `missing` represents elements present in `strings2` that are\n// missing from `strings1`. `common` represents elements present in both slices. Since the input\n// slices are treated as sets, duplicates will be removed in any outputs.\nfunc Compare(strings1, strings2 []string) (extra []string, missing []string, common []string) {\n\tset1 := FromSlice(strings1)\n\tset2 := FromSlice(strings2)\n\n\tfor s := range set1 {\n\t\tif _, ok := set2[s]; !ok {\n\t\t\textra = append(extra, s)\n\t\t\tcontinue\n\t\t}\n\t\tcommon = append(common, s)\n\t}\n\n\tfor s := range set2 {\n\t\tif _, ok := set1[s]; !ok {\n\t\t\tmissing = append(missing, s)\n\t\t}\n\t}\n\n\treturn extra, missing, common\n}\n"
  },
  {
    "path": "internal/stringset/stringset_test.go",
    "content": "package stringset_test\n\nimport (\n\t\"sort\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/stringset\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestStringSet(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testStringSet\", testStringSet, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\n// NOTE: Do NOT use AssertSliceContains() or variants of it in these tests, as they use `stringset`\nfunc testStringSet(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#FromSlice\", func() {\n\t\tit(\"returns a map with elements as unique keys\", func() {\n\t\t\tslice := []string{\"a\", \"b\", \"a\", \"c\"}\n\n\t\t\tset := stringset.FromSlice(slice)\n\n\t\t\th.AssertEq(t, len(set), 3)\n\n\t\t\t_, ok := set[\"a\"]\n\t\t\th.AssertTrue(t, ok)\n\n\t\t\t_, ok = set[\"b\"]\n\t\t\th.AssertTrue(t, ok)\n\n\t\t\t_, ok = set[\"c\"]\n\t\t\th.AssertTrue(t, ok)\n\t\t})\n\t})\n\n\twhen(\"#Compare\", func() {\n\t\tit(\"returns elements in slice 1 but not in slice 2\", func() {\n\t\t\tslice1 := []string{\"a\", \"b\", \"c\", \"d\"}\n\t\t\tslice2 := []string{\"a\", \"c\"}\n\n\t\t\textra, _, _ := stringset.Compare(slice1, slice2)\n\n\t\t\th.AssertEq(t, len(extra), 2)\n\n\t\t\tsort.Strings(extra)\n\t\t\th.AssertEq(t, extra[0], \"b\")\n\t\t\th.AssertEq(t, extra[1], \"d\")\n\t\t})\n\n\t\tit(\"returns elements in slice 2 missing from slice 1\", func() {\n\t\t\tslice1 := []string{\"a\", \"c\"}\n\t\t\tslice2 := []string{\"a\", \"b\", \"c\", \"d\"}\n\n\t\t\t_, missing, _ := stringset.Compare(slice1, slice2)\n\n\t\t\th.AssertEq(t, len(missing), 2)\n\n\t\t\tsort.Strings(missing)\n\t\t\th.AssertEq(t, missing[0], \"b\")\n\t\t\th.AssertEq(t, missing[1], \"d\")\n\t\t})\n\n\t\tit(\"returns elements present in both slices\", func() {\n\t\t\tslice1 := []string{\"a\", \"b\", \"c\"}\n\t\t\tslice2 := []string{\"b\", \"c\", 
\"d\"}\n\n\t\t\t_, _, common := stringset.Compare(slice1, slice2)\n\n\t\t\th.AssertEq(t, len(common), 2)\n\n\t\t\tsort.Strings(common)\n\t\t\th.AssertEq(t, common[0], \"b\")\n\t\t\th.AssertEq(t, common[1], \"c\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/style/style.go",
    "content": "package style\n\nimport (\n\t\"fmt\"\n\t\"sort\"\n\t\"strings\"\n\n\t\"github.com/heroku/color\"\n)\n\nvar Symbol = func(value string) string {\n\tif color.Enabled() {\n\t\treturn Key(value)\n\t}\n\treturn \"'\" + value + \"'\"\n}\n\nvar Map = func(value map[string]string, prefix, separator string) string {\n\tresult := \"\"\n\n\tvar keys []string\n\n\tfor key := range value {\n\t\tkeys = append(keys, key)\n\t}\n\n\tsort.Strings(keys)\n\n\tfor _, key := range keys {\n\t\tresult += fmt.Sprintf(\"%s%s=%s%s\", prefix, key, value[key], separator)\n\t}\n\n\tif color.Enabled() {\n\t\treturn Key(strings.TrimSpace(result))\n\t}\n\treturn \"'\" + strings.TrimSpace(result) + \"'\"\n}\n\nvar SymbolF = func(format string, a ...interface{}) string {\n\tif color.Enabled() {\n\t\treturn Key(format, a...)\n\t}\n\treturn \"'\" + fmt.Sprintf(format, a...) + \"'\"\n}\n\nvar Key = color.HiBlueString\n\nvar Tip = color.New(color.FgGreen, color.Bold).SprintfFunc()\n\nvar Warn = color.New(color.FgYellow, color.Bold).SprintfFunc()\n\nvar Error = color.New(color.FgRed, color.Bold).SprintfFunc()\n\nvar Step = func(format string, a ...interface{}) string {\n\treturn color.CyanString(\"===> \"+format, a...)\n}\n\nvar Prefix = color.CyanString\nvar Waiting = color.HiBlackString\nvar Working = color.HiBlueString\nvar Complete = color.GreenString\nvar ProgressBar = color.HiBlueString\n"
  },
  {
    "path": "internal/style/style_test.go",
    "content": "package style_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestStyle(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testStyle\", testStyle, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testStyle(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#Symbol\", func() {\n\t\tit(\"It should return the expected value\", func() {\n\t\t\th.AssertEq(t, style.Symbol(\"Symbol\"), \"'Symbol'\")\n\t\t})\n\n\t\tit(\"It should return an empty string\", func() {\n\t\t\th.AssertEq(t, style.Symbol(\"\"), \"''\")\n\t\t})\n\n\t\tit(\"It should return the expected value while color enabled\", func() {\n\t\t\tcolor.Disable(false)\n\t\t\tdefer color.Disable(true)\n\t\t\th.AssertEq(t, style.Symbol(\"Symbol\"), \"\\x1b[94mSymbol\\x1b[0m\")\n\t\t})\n\n\t\tit(\"It should return an empty string while color enabled\", func() {\n\t\t\tcolor.Disable(false)\n\t\t\tdefer color.Disable(true)\n\t\t\th.AssertEq(t, style.Symbol(\"\"), \"\\x1b[94m\\x1b[0m\")\n\t\t})\n\t})\n\n\twhen(\"#SymbolF\", func() {\n\t\tit(\"It should return the expected value\", func() {\n\t\t\th.AssertEq(t, style.SymbolF(\"values %s %d\", \"hello\", 1), \"'values hello 1'\")\n\t\t})\n\n\t\tit(\"It should return an empty string\", func() {\n\t\t\th.AssertEq(t, style.SymbolF(\"\"), \"''\")\n\t\t})\n\n\t\tit(\"It should return the expected value while color enabled\", func() {\n\t\t\tcolor.Disable(false)\n\t\t\tdefer color.Disable(true)\n\t\t\th.AssertEq(t, style.SymbolF(\"values %s %d\", \"hello\", 1), \"\\x1b[94mvalues hello 1\\x1b[0m\")\n\t\t})\n\n\t\tit(\"It should return an empty string while color enabled\", func() {\n\t\t\tcolor.Disable(false)\n\t\t\tdefer color.Disable(true)\n\t\t\th.AssertEq(t, style.SymbolF(\"\"), 
\"\\x1b[94m\\x1b[0m\")\n\t\t})\n\t})\n\n\twhen(\"#Map\", func() {\n\t\tit(\"It should return a string with all key value pairs\", func() {\n\t\t\th.AssertEq(t, style.Map(map[string]string{\"FOO\": \"foo\", \"BAR\": \"bar\"}, \"\", \" \"), \"'BAR=bar FOO=foo'\")\n\t\t\th.AssertEq(t, style.Map(map[string]string{\"BAR\": \"bar\", \"FOO\": \"foo\"}, \"  \", \"\\n\"), \"'BAR=bar\\n  FOO=foo'\")\n\t\t})\n\n\t\tit(\"It should return a string with all key value pairs while color enabled\", func() {\n\t\t\tcolor.Disable(false)\n\t\t\tdefer color.Disable(true)\n\t\t\th.AssertEq(t, style.Map(map[string]string{\"FOO\": \"foo\", \"BAR\": \"bar\"}, \"\", \" \"), \"\\x1b[94mBAR=bar FOO=foo\\x1b[0m\")\n\t\t\th.AssertEq(t, style.Map(map[string]string{\"BAR\": \"bar\", \"FOO\": \"foo\"}, \"  \", \"\\n\"), \"\\x1b[94mBAR=bar\\n  FOO=foo\\x1b[0m\")\n\t\t})\n\n\t\tit(\"It should return an empty string\", func() {\n\t\t\th.AssertEq(t, style.Map(map[string]string{}, \"\", \" \"), \"''\")\n\t\t})\n\n\t\tit(\"It should return an empty string while color enabled\", func() {\n\t\t\tcolor.Disable(false)\n\t\t\tdefer color.Disable(true)\n\t\t\th.AssertEq(t, style.Map(map[string]string{}, \"\", \" \"), \"\\x1b[94m\\x1b[0m\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/target/parse.go",
    "content": "package target\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc ParseTargets(t []string, logger logging.Logger) (targets []dist.Target, err error) {\n\tfor _, v := range t {\n\t\ttarget, err := ParseTarget(v, logger)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\ttargets = append(targets, target)\n\t}\n\treturn targets, nil\n}\n\nfunc ParseTarget(t string, logger logging.Logger) (output dist.Target, err error) {\n\tnonDistro, distros, err := getTarget(t, logger)\n\tif v, _ := getSliceAt[string](nonDistro, 0); len(nonDistro) <= 1 && v == \"\" {\n\t\tlogger.Warn(\"os/arch must be defined\")\n\t}\n\tif err != nil {\n\t\treturn output, err\n\t}\n\tos, arch, variant, err := getPlatform(nonDistro, logger)\n\tif err != nil {\n\t\treturn output, err\n\t}\n\tv, err := ParseDistros(distros, logger)\n\tif err != nil {\n\t\treturn output, err\n\t}\n\toutput = dist.Target{\n\t\tOS:            os,\n\t\tArch:          arch,\n\t\tArchVariant:   variant,\n\t\tDistributions: v,\n\t}\n\treturn output, err\n}\n\nfunc ParseDistros(distroSlice string, logger logging.Logger) (distros []dist.Distribution, err error) {\n\tdistro := strings.Split(distroSlice, \";\")\n\tif l := len(distro); l == 1 && distro[0] == \"\" {\n\t\treturn nil, err\n\t}\n\tfor _, d := range distro {\n\t\tv, err := ParseDistro(d, logger)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tdistros = append(distros, v)\n\t}\n\treturn distros, nil\n}\n\nfunc ParseDistro(distroString string, logger logging.Logger) (distro dist.Distribution, err error) {\n\td := strings.Split(distroString, \"@\")\n\t// strings.Split always returns at least one element, so only the empty-name check is needed\n\tif d[0] == \"\" {\n\t\treturn distro, errors.Errorf(\"distro versions %s cannot be specified without a distro name\", style.Symbol(\"@\"+strings.Join(d[1:], \"@\")))\n\t}\n\tdistro.Name = d[0]\n\tif len(d) < 2 
{\n\t\tlogger.Warnf(\"distro with name %s has no specific version\", style.Symbol(d[0]))\n\t\treturn distro, err\n\t}\n\tif len(d) > 2 {\n\t\treturn distro, fmt.Errorf(\"invalid distro: %s\", distroString)\n\t}\n\tdistro.Version = d[1]\n\treturn distro, err\n}\n\nfunc getTarget(t string, logger logging.Logger) (nonDistro []string, distros string, err error) {\n\ttarget := strings.Split(t, \":\")\n\tif len(target) == 1 && target[0] == \"\" {\n\t\treturn nonDistro, distros, errors.Errorf(\"invalid target %s, at least one of [os][/arch][/archVariant] must be specified\", t)\n\t}\n\tif len(target) == 2 && target[0] == \"\" {\n\t\tv, _ := getSliceAt[string](target, 1)\n\t\tlogger.Warn(style.Warn(\"adding distros %s without [os][/arch][/variant]\", v))\n\t} else {\n\t\ti, _ := getSliceAt[string](target, 0)\n\t\tnonDistro = strings.Split(i, \"/\")\n\t}\n\tif i, err := getSliceAt[string](target, 1); err == nil {\n\t\tdistros = i\n\t}\n\treturn nonDistro, distros, err\n}\n\nfunc getSliceAt[T interface{}](slice []T, index int) (value T, err error) {\n\tif index < 0 || index >= len(slice) {\n\t\treturn value, errors.Errorf(\"index out of bounds, cannot access item at index %d of slice with length %d\", index, len(slice))\n\t}\n\n\treturn slice[index], err\n}\n"
  },
  {
    "path": "internal/target/parse_test.go",
    "content": "package target_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/target\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestParseTargets(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ParseTargets\", testParseTargets, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testParseTargets(t *testing.T, when spec.G, it spec.S) {\n\toutBuf := bytes.Buffer{}\n\tit.Before(func() {\n\t\toutBuf = bytes.Buffer{}\n\t})\n\n\twhen(\"target#ParseTarget\", func() {\n\t\tit(\"should log a warning when [os][/arch][/variant] is missing\", func() {\n\t\t\ttarget.ParseTarget(\":distro@version\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotEq(t, outBuf.String(), \"\")\n\t\t})\n\t\tit(\"should parse target as expected\", func() {\n\t\t\toutput, err := target.ParseTarget(\"linux/arm/v6\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertEq(t, outBuf.String(), \"\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, output, dist.Target{\n\t\t\t\tOS:          \"linux\",\n\t\t\t\tArch:        \"arm\",\n\t\t\t\tArchVariant: \"v6\",\n\t\t\t})\n\t\t})\n\t\tit(\"should return an error\", func() {\n\t\t\t_, err := target.ParseTarget(\"\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t\tit(\"should log a warning when only [os] has a typo or is unknown\", func() {\n\t\t\ttarget.ParseTarget(\"os/arm/v6\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotEq(t, outBuf.String(), \"\")\n\t\t})\n\t\tit(\"should log a warning when only [arch] has a typo or is unknown\", func() {\n\t\t\ttarget.ParseTarget(\"darwin/arm/v6\", 
logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotEq(t, outBuf.String(), \"\")\n\t\t})\n\t\tit(\"should log a warning when only [variant] has a typo or is unknown\", func() {\n\t\t\ttarget.ParseTarget(\"linux/arm/unknown\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotEq(t, outBuf.String(), \"\")\n\t\t})\n\t})\n\n\twhen(\"target#ParseTargets\", func() {\n\t\tit(\"should return an error when at least one target is invalid\", func() {\n\t\t\t_, err := target.ParseTargets([]string{\"linux/arm/v6\", \":distro@version\"}, logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t\tit(\"should parse targets as expected\", func() {\n\t\t\toutput, err := target.ParseTargets([]string{\"linux/arm/v6\", \"linux/amd64:ubuntu@22.04;debian@8.10;debian@10.06\"}, logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, output, []dist.Target{\n\t\t\t\t{\n\t\t\t\t\tOS:          \"linux\",\n\t\t\t\t\tArch:        \"arm\",\n\t\t\t\t\tArchVariant: \"v6\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tOS:   \"linux\",\n\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\tDistributions: []dist.Distribution{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName:    \"ubuntu\",\n\t\t\t\t\t\t\tVersion: \"22.04\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName:    \"debian\",\n\t\t\t\t\t\t\tVersion: \"8.10\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName:    \"debian\",\n\t\t\t\t\t\t\tVersion: \"10.06\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"target#ParseDistro\", func() {\n\t\tit(\"should parse distro as expected\", func() {\n\t\t\toutput, err := target.ParseDistro(\"ubuntu@22.04\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertEq(t, output, dist.Distribution{\n\t\t\t\tName:    \"ubuntu\",\n\t\t\t\tVersion: \"22.04\",\n\t\t\t})\n\t\t\th.AssertNil(t, err)\n\t\t})\n\t\tit(\"should return an error when name is missing\", func() {\n\t\t\t_, err := target.ParseDistro(\"@22.04@20.08\", 
logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t\tit(\"should return an error when there are two versions\", func() {\n\t\t\t_, err := target.ParseDistro(\"some-distro@22.04@20.08\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertError(t, err, \"invalid distro\")\n\t\t})\n\t\tit(\"should warn when distro version is not specified\", func() {\n\t\t\ttarget.ParseDistro(\"ubuntu\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotEq(t, outBuf.String(), \"\")\n\t\t})\n\t})\n\n\twhen(\"target#ParseDistros\", func() {\n\t\tit(\"should parse distros as expected\", func() {\n\t\t\toutput, err := target.ParseDistros(\"ubuntu@22.04;ubuntu@20.08;debian@8.10;debian@10.06\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertEq(t, output, []dist.Distribution{\n\t\t\t\t{\n\t\t\t\t\tName:    \"ubuntu\",\n\t\t\t\t\tVersion: \"22.04\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName:    \"ubuntu\",\n\t\t\t\t\tVersion: \"20.08\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName:    \"debian\",\n\t\t\t\t\tVersion: \"8.10\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tName:    \"debian\",\n\t\t\t\t\tVersion: \"10.06\",\n\t\t\t\t},\n\t\t\t})\n\t\t\th.AssertNil(t, err)\n\t\t})\n\t\tit(\"result should be nil\", func() {\n\t\t\toutput, err := target.ParseDistros(\"\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertEq(t, output, []dist.Distribution(nil))\n\t\t\th.AssertNil(t, err)\n\t\t})\n\t\tit(\"should return an error\", func() {\n\t\t\t_, err := target.ParseDistros(\";\", logging.NewLogWithWriters(&outBuf, &outBuf))\n\t\t\th.AssertNotNil(t, err)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/target/platform.go",
    "content": "package target\n\nimport (\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc getPlatform(t []string, logger logging.Logger) (os, arch, variant string, err error) {\n\tos, _ = getSliceAt[string](t, 0)\n\tarch, _ = getSliceAt[string](t, 1)\n\tvariant, _ = getSliceAt[string](t, 2)\n\tif !supportsOS(os) && supportsVariant(arch, variant) {\n\t\tlogger.Warn(style.Warn(\"unknown os %s, is this a typo?\", os))\n\t}\n\tif supportsArch(os, arch) && !supportsVariant(arch, variant) {\n\t\tlogger.Warn(style.Warn(\"unknown variant %s\", variant))\n\t}\n\tif supportsOS(os) && !supportsArch(os, arch) && supportsVariant(arch, variant) {\n\t\tlogger.Warn(style.Warn(\"unknown arch %s\", arch))\n\t}\n\tif !SupportsPlatform(os, arch, variant) {\n\t\treturn os, arch, variant, errors.Errorf(\"unknown target: %s\", style.Symbol(strings.Join(t, \"/\")))\n\t}\n\treturn os, arch, variant, err\n}\n\nvar supportedOSArchs = map[string][]string{\n\t\"aix\":       {\"ppc64\"},\n\t\"android\":   {\"386\", \"amd64\", \"arm\", \"arm64\"},\n\t\"darwin\":    {\"amd64\", \"arm64\"},\n\t\"dragonfly\": {\"amd64\"},\n\t\"freebsd\":   {\"386\", \"amd64\", \"arm\"},\n\t\"illumos\":   {\"amd64\"},\n\t\"ios\":       {\"arm64\"},\n\t\"js\":        {\"wasm\"},\n\t\"linux\":     {\"386\", \"amd64\", \"arm\", \"arm64\", \"loong64\", \"mips\", \"mipsle\", \"mips64\", \"mips64le\", \"ppc64\", \"ppc64le\", \"riscv64\", \"s390x\"},\n\t\"netbsd\":    {\"386\", \"amd64\", \"arm\"},\n\t\"openbsd\":   {\"386\", \"amd64\", \"arm\", \"arm64\"},\n\t\"plan9\":     {\"386\", \"amd64\", \"arm\"},\n\t\"solaris\":   {\"amd64\"},\n\t\"wasip1\":    {\"wasm\"},\n\t\"windows\":   {\"386\", \"amd64\", \"arm\", \"arm64\"},\n}\n\nvar supportedArchVariants = map[string][]string{\n\t\"386\":      {\"softfloat\", \"sse2\"},\n\t\"arm\":      {\"v5\", \"v6\", \"v7\"},\n\t\"amd64\":    {\"v1\", \"v2\", \"v3\", 
\"v4\"},\n\t\"mips\":     {\"hardfloat\", \"softfloat\"},\n\t\"mipsle\":   {\"hardfloat\", \"softfloat\"},\n\t\"mips64\":   {\"hardfloat\", \"softfloat\"},\n\t\"mips64le\": {\"hardfloat\", \"softfloat\"},\n\t\"ppc64\":    {\"power8\", \"power9\"},\n\t\"ppc64le\":  {\"power8\", \"power9\"},\n\t\"wasm\":     {\"satconv\", \"signext\"},\n}\n\nfunc supportsOS(os string) bool {\n\treturn supportedOSArchs[os] != nil\n}\n\nfunc supportsArch(os, arch string) bool {\n\tif supportsOS(os) {\n\t\tfor _, s := range supportedOSArchs[os] {\n\t\t\tif s == arch {\n\t\t\t\treturn true\n\t\t\t}\n\t\t}\n\t}\n\treturn false\n}\n\nfunc supportsVariant(arch, variant string) bool {\n\tif variant == \"\" {\n\t\treturn true\n\t}\n\tfor _, s := range supportedArchVariants[arch] {\n\t\tif s == variant {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc SupportsPlatform(os, arch, variant string) bool {\n\treturn supportsArch(os, arch) && supportsVariant(arch, variant)\n}\n"
  },
  {
    "path": "internal/target/platform_test.go",
    "content": "package target_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/target\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPlatforms(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"TestPlatforms\", testPlatforms, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPlatforms(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"target#SupportsPlatform\", func() {\n\t\tit(\"should return false when target not supported\", func() {\n\t\t\tb := target.SupportsPlatform(\"os\", \"arm\", \"v6\")\n\t\t\th.AssertFalse(t, b)\n\t\t})\n\t\tit(\"should return true when target is supported\", func() {\n\t\t\tb := target.SupportsPlatform(\"linux\", \"arm\", \"v6\")\n\t\t\th.AssertTrue(t, b)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/term/term.go",
    "content": "package term\n\nimport (\n\t\"io\"\n\n\t\"golang.org/x/term\"\n)\n\n// InvalidFileDescriptor based on https://golang.org/src/os/file_unix.go?s=2183:2210#L57\nconst InvalidFileDescriptor = ^(uintptr(0))\n\n// IsTerminal returns whether a writer is a terminal\nfunc IsTerminal(w io.Writer) (uintptr, bool) {\n\tif f, ok := w.(hasDescriptor); ok {\n\t\ttermFd := f.Fd()\n\t\tisTerm := term.IsTerminal(int(termFd))\n\t\treturn termFd, isTerm\n\t}\n\n\treturn InvalidFileDescriptor, false\n}\n\ntype hasDescriptor interface {\n\tFd() uintptr\n}\n"
  },
  {
    "path": "internal/term/term_test.go",
    "content": "package term_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/term\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestTerm(t *testing.T) {\n\tspec.Run(t, \"Term\", testTerm, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testTerm(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#IsTerminal\", func() {\n\t\tit(\"returns false for a pipe\", func() {\n\t\t\tr, _, _ := os.Pipe()\n\t\t\tfd, isTerm := term.IsTerminal(r)\n\t\t\th.AssertFalse(t, isTerm)\n\t\t\th.AssertNotEq(t, fd, term.InvalidFileDescriptor) // The mock writer is a pipe, and therefore has a file descriptor\n\t\t})\n\n\t\tit(\"returns InvalidFileDescriptor if passed a normal Writer\", func() {\n\t\t\tfd, isTerm := term.IsTerminal(&bytes.Buffer{})\n\t\t\th.AssertFalse(t, isTerm)\n\t\t\th.AssertEq(t, fd, term.InvalidFileDescriptor)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "internal/termui/branch.go",
    "content": "package termui\n\ntype Symbol int\n\nfunc (s Symbol) String() string {\n\tswitch s {\n\tcase NoBranchSymbol:\n\t\treturn \"    \"\n\tcase BranchSymbol:\n\t\treturn \"│   \"\n\tcase MiddleBranchSymbol:\n\t\treturn \"├── \"\n\tcase LastBranchSymbol:\n\t\treturn \"└── \"\n\tdefault:\n\t\treturn \"\"\n\t}\n}\n\nconst (\n\tNoBranchSymbol Symbol = iota\n\tBranchSymbol\n\tMiddleBranchSymbol\n\tLastBranchSymbol\n)\n\ntype Branches []Symbol\n\nfunc (b Branches) Add(s Symbol) Branches {\n\tbCopy := make(Branches, len(b))\n\tcopy(bCopy, b)\n\treturn append(bCopy, s)\n}\n\nfunc (b Branches) String() string {\n\tvar s string\n\tfor _, branch := range b[:len(b)-1] {\n\t\ts += branch.String()\n\t}\n\treturn s\n}\n"
  },
  {
    "path": "internal/termui/dashboard.go",
    "content": "package termui\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/gdamore/tcell/v2\"\n\t\"github.com/rivo/tview\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype Dashboard struct {\n\tapp           app\n\tbuildpackInfo []dist.ModuleInfo\n\tappTree       *tview.TreeView\n\tbuilderTree   *tview.TreeView\n\tplanList      *tview.List\n\tlogsView      *tview.TextView\n\tscreen        *tview.Flex\n\tleftPane      *tview.Flex\n\tnodes         map[string]*tview.TreeNode\n\n\tlogs string\n}\n\nfunc NewDashboard(app app, appName string, bldr buildr, runImageName string, buildpackInfo []dist.ModuleInfo, logs []string) *Dashboard {\n\td := &Dashboard{}\n\n\tappTree, builderTree := initTrees(appName, bldr, runImageName)\n\n\tplanList, logsView := d.initDashboard(buildpackInfo)\n\n\timagesView := tview.NewFlex().\n\t\tSetDirection(tview.FlexRow).\n\t\tAddItem(appTree, 0, 1, false).\n\t\tAddItem(builderTree, 0, 1, true)\n\n\timagesView.\n\t\tSetBorder(true).\n\t\tSetTitleAlign(tview.AlignLeft).\n\t\tSetTitle(\"| [::b]images[::-] |\").\n\t\tSetBackgroundColor(backgroundColor)\n\n\tleftPane := tview.NewFlex().\n\t\tSetDirection(tview.FlexRow).\n\t\tAddItem(imagesView, 11, 0, false).\n\t\tAddItem(planList, 0, 1, true)\n\n\tscreen := tview.NewFlex().\n\t\tSetDirection(tview.FlexColumn).\n\t\tAddItem(leftPane, 0, 1, true).\n\t\tAddItem(logsView, 0, 1, false)\n\n\td.app = app\n\td.buildpackInfo = buildpackInfo\n\td.appTree = appTree\n\td.builderTree = builderTree\n\td.planList = planList\n\td.leftPane = leftPane\n\td.logsView = logsView\n\td.screen = screen\n\n\tfor _, txt := range logs {\n\t\td.logs = d.logs + txt + \"\\n\"\n\t}\n\n\td.handleToggle()\n\td.setScreen()\n\treturn d\n}\n\nfunc (d *Dashboard) Handle(txt string) {\n\td.app.QueueUpdateDraw(func() {\n\t\td.logs = d.logs + txt + \"\\n\"\n\t\td.logsView.SetText(tview.TranslateANSI(d.logs))\n\t})\n}\n\nfunc (d *Dashboard) Stop() {\n\t// no-op\n\t// This method is a side effect of the ill-fitting 'page 
interface'\n\t// Trying to create a cleaner interface between the main termui controller\n\t// and child pages like this one is currently a work-in-progress\n}\n\nfunc (d *Dashboard) SetNodes(nodes map[string]*tview.TreeNode) {\n\td.nodes = nodes\n\n\t// activate plan list buttons\n\td.planList.SetMainTextColor(tcell.ColorMediumTurquoise).\n\t\tSetSelectedTextColor(tcell.ColorMediumTurquoise)\n\n\tidx := d.planList.GetCurrentItem()\n\td.planList.Clear()\n\tfor _, buildpackInfo := range d.buildpackInfo {\n\t\tbp := buildpackInfo\n\n\t\td.planList.AddItem(\n\t\t\tbp.FullName(),\n\t\t\tinfo(bp),\n\t\t\t'✔',\n\t\t\tfunc() {\n\t\t\t\tNewDive(d.app, d.buildpackInfo, bp, d.nodes, func() {\n\t\t\t\t\td.setScreen()\n\t\t\t\t})\n\t\t\t},\n\t\t)\n\t}\n\td.planList.SetCurrentItem(idx)\n\td.app.Draw()\n}\n\nfunc (d *Dashboard) handleToggle() {\n\td.planList.SetDoneFunc(func() {\n\t\tscreen := tview.NewFlex().\n\t\t\tSetDirection(tview.FlexColumn).\n\t\t\tAddItem(d.leftPane, 0, 1, false).\n\t\t\tAddItem(d.logsView, 0, 1, true)\n\t\td.app.SetRoot(screen, true)\n\t})\n\n\td.logsView.SetDoneFunc(func(key tcell.Key) {\n\t\tscreen := tview.NewFlex().\n\t\t\tSetDirection(tview.FlexColumn).\n\t\t\tAddItem(d.leftPane, 0, 1, true).\n\t\t\tAddItem(d.logsView, 0, 1, false)\n\t\td.app.SetRoot(screen, true)\n\t})\n}\n\nfunc (d *Dashboard) setScreen() {\n\td.app.SetRoot(d.screen, true)\n}\n\nfunc (d *Dashboard) initDashboard(buildpackInfos []dist.ModuleInfo) (*tview.List, *tview.TextView) {\n\tplanList := tview.NewList()\n\tplanList.SetMainTextColor(tcell.ColorDarkGrey).\n\t\tSetSelectedTextColor(tcell.ColorDarkGrey).\n\t\tSetSelectedBackgroundColor(tcell.ColorDarkSlateGray).\n\t\tSetSecondaryTextColor(tcell.ColorDimGray).\n\t\tSetBorder(true).\n\t\tSetBorderPadding(1, 1, 1, 1).\n\t\tSetTitle(\"| [::b]plan[::-] |\").\n\t\tSetTitleAlign(tview.AlignLeft).\n\t\tSetBackgroundColor(backgroundColor)\n\n\tfor _, buildpackInfo := range buildpackInfos {\n\t\tbp := 
buildpackInfo\n\n\t\tplanList.AddItem(\n\t\t\tbp.FullName(),\n\t\t\tinfo(bp),\n\t\t\t' ',\n\t\t\tfunc() {},\n\t\t)\n\t}\n\n\tlogsView := tview.NewTextView()\n\tlogsView.SetDynamicColors(true).\n\t\tSetTextAlign(tview.AlignLeft).\n\t\tSetBorderPadding(1, 1, 3, 1).\n\t\tSetTitleAlign(tview.AlignLeft).\n\t\tSetBackgroundColor(backgroundColor)\n\n\treturn planList, logsView\n}\n\nfunc initTrees(appName string, bldr buildr, runImageName string) (*tview.TreeView, *tview.TreeView) {\n\tvar (\n\t\tappImage     = tview.NewTreeNode(fmt.Sprintf(\"app: [white::b]%s\", appName)).SetColor(tcell.ColorDimGray)\n\t\tappRunImage  = tview.NewTreeNode(fmt.Sprintf(\" run: [white::b]%s\", runImageName)).SetColor(tcell.ColorDimGray)\n\t\tbuilderImage = tview.NewTreeNode(fmt.Sprintf(\"builder: [white::b]%s\", bldr.BaseImageName())).SetColor(tcell.ColorDimGray)\n\t\tlifecycle    = tview.NewTreeNode(fmt.Sprintf(\" lifecycle: [white::b]%s\", bldr.LifecycleDescriptor().Info.Version.String())).SetColor(tcell.ColorDimGray)\n\t\trunImage     = tview.NewTreeNode(fmt.Sprintf(\" run: [white::b]%s\", bldr.Stack().RunImage.Image)).SetColor(tcell.ColorDimGray)\n\t\tbuildpacks   = tview.NewTreeNode(\" [mediumturquoise::b]buildpacks\")\n\t)\n\n\tappImage.AddChild(appRunImage)\n\tbuilderImage.AddChild(lifecycle)\n\tbuilderImage.AddChild(runImage)\n\tbuilderImage.AddChild(buildpacks)\n\n\tappTree := tview.NewTreeView()\n\tappTree.\n\t\tSetRoot(appImage).\n\t\tSetGraphics(true).\n\t\tSetGraphicsColor(tcell.ColorMediumTurquoise).\n\t\tSetBorderPadding(1, 0, 4, 0).\n\t\tSetBackgroundColor(backgroundColor)\n\n\tbuilderTree := tview.NewTreeView()\n\tbuilderTree.\n\t\tSetRoot(builderImage).\n\t\tSetGraphics(true).\n\t\tSetGraphicsColor(tcell.ColorMediumTurquoise).\n\t\tSetBorderPadding(0, 0, 4, 0).\n\t\tSetBackgroundColor(backgroundColor)\n\n\treturn appTree, builderTree\n}\n\nfunc info(buildpackInfo dist.ModuleInfo) string {\n\tif buildpackInfo.Description != \"\" {\n\t\treturn 
buildpackInfo.Description\n\t}\n\n\tif buildpackInfo.Homepage != \"\" {\n\t\treturn buildpackInfo.Homepage\n\t}\n\n\treturn \"-\"\n}\n"
  },
  {
    "path": "internal/termui/detect.go",
    "content": "package termui\n\nimport (\n\t\"regexp\"\n\t\"time\"\n\n\t\"github.com/rivo/tview\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype Detect struct {\n\tapp  app\n\tbldr buildr\n\n\ttextView       *tview.TextView\n\tbuildpackRegex *regexp.Regexp\n\tbuildpackChan  chan dist.ModuleInfo\n\tdoneChan       chan bool\n}\n\nfunc NewDetect(app app, buildpackChan chan dist.ModuleInfo, bldr buildr) *Detect {\n\td := &Detect{\n\t\tapp:            app,\n\t\ttextView:       detectStatusTV(),\n\t\tbuildpackRegex: regexp.MustCompile(`^(\\S+)\\s+([\\d\\.]+)$`),\n\t\tbuildpackChan:  buildpackChan,\n\t\tdoneChan:       make(chan bool, 1),\n\t\tbldr:           bldr,\n\t}\n\n\tgo d.start()\n\tgrid := centered(d.textView)\n\td.app.SetRoot(grid, true)\n\treturn d\n}\n\nfunc (d *Detect) Handle(txt string) {\n\tm := d.buildpackRegex.FindStringSubmatch(txt)\n\tif len(m) == 3 {\n\t\td.buildpackChan <- d.find(m[1], m[2])\n\t}\n}\n\nfunc (d *Detect) Stop() {\n\td.doneChan <- true\n}\n\nfunc (d *Detect) SetNodes(map[string]*tview.TreeNode) {\n\t// no-op\n\t// This method is a side effect of the ill-fitting 'page interface'\n\t// Trying to create a cleaner interface between the main termui controller\n\t// and child pages like this one is currently a work-in-progress\n}\n\nfunc (d *Detect) start() {\n\tvar (\n\t\ti        = 0\n\t\tticker   = time.NewTicker(250 * time.Millisecond)\n\t\tdoneText = \"⌛️ Detected!\"\n\t\ttexts    = []string{\n\t\t\t\"⏳️ Detecting\",\n\t\t\t\"⏳️ Detecting.\",\n\t\t\t\"⏳️ Detecting..\",\n\t\t\t\"⏳️ Detecting...\",\n\t\t}\n\t)\n\n\tfor {\n\t\tselect {\n\t\tcase <-ticker.C:\n\t\t\td.app.QueueUpdateDraw(func() {\n\t\t\t\td.textView.SetText(texts[i])\n\t\t\t})\n\n\t\t\ti++\n\t\t\tif i == len(texts) {\n\t\t\t\ti = 0\n\t\t\t}\n\t\tcase <-d.doneChan:\n\t\t\tticker.Stop()\n\n\t\t\td.app.QueueUpdateDraw(func() {\n\t\t\t\td.textView.SetText(doneText)\n\t\t\t})\n\t\t\treturn\n\t\t}\n\t}\n}\n\nfunc (d *Detect) find(buildpackID, buildpackVersion string) 
dist.ModuleInfo {\n\tfor _, buildpack := range d.bldr.Buildpacks() {\n\t\tif buildpack.ID == buildpackID && buildpack.Version == buildpackVersion {\n\t\t\treturn buildpack\n\t\t}\n\t}\n\n\treturn dist.ModuleInfo{\n\t\tID:      buildpackID,\n\t\tVersion: buildpackVersion,\n\t}\n}\n\nfunc detectStatusTV() *tview.TextView {\n\ttv := tview.NewTextView()\n\ttv.SetBackgroundColor(backgroundColor)\n\treturn tv\n}\n\nfunc centered(p tview.Primitive) tview.Primitive {\n\treturn tview.NewGrid().\n\t\tSetColumns(0, 20, 0).\n\t\tSetRows(0, 1, 0).\n\t\tAddItem(tview.NewBox().SetBackgroundColor(backgroundColor), 0, 0, 1, 1, 0, 0, true).\n\t\tAddItem(tview.NewBox().SetBackgroundColor(backgroundColor), 0, 1, 1, 1, 0, 0, true).\n\t\tAddItem(tview.NewBox().SetBackgroundColor(backgroundColor), 0, 2, 1, 1, 0, 0, true).\n\t\tAddItem(tview.NewBox().SetBackgroundColor(backgroundColor), 1, 0, 1, 1, 0, 0, true).\n\t\tAddItem(p, 1, 1, 1, 1, 0, 0, true).\n\t\tAddItem(tview.NewBox().SetBackgroundColor(backgroundColor), 1, 2, 1, 1, 0, 0, true).\n\t\tAddItem(tview.NewBox().SetBackgroundColor(backgroundColor), 2, 0, 1, 1, 0, 0, true).\n\t\tAddItem(tview.NewBox().SetBackgroundColor(backgroundColor), 2, 1, 1, 1, 0, 0, true).\n\t\tAddItem(tview.NewBox().SetBackgroundColor(backgroundColor), 2, 2, 1, 1, 0, 0, true)\n}\n"
  },
  {
    "path": "internal/termui/dive.go",
    "content": "package termui\n\nimport (\n\t\"archive/tar\"\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/dustin/go-humanize\"\n\t\"github.com/gdamore/tcell/v2\"\n\t\"github.com/rivo/tview\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype Dive struct {\n\tapp               app\n\tmenuTable         *tview.Table\n\tfileExplorerTable *tview.Table\n\tbuildpackInfo     []dist.ModuleInfo\n\tbuildpacksTreeMap map[string]*tview.TreeNode\n\tescHandler        func()\n}\n\nfunc NewDive(app app, buildpackInfo []dist.ModuleInfo, selectedBuildpack dist.ModuleInfo, nodes map[string]*tview.TreeNode, escHandler func()) *Dive {\n\tmenu := initMenu(buildpackInfo, nodes)\n\tfileExplorerTable := initFileExplorer()\n\n\tscreen := tview.NewFlex().\n\t\tSetDirection(tview.FlexColumn).\n\t\tAddItem(menu, 0, 1, true).\n\t\tAddItem(fileExplorerTable, 0, 2, false)\n\n\td := &Dive{\n\t\tapp:               app,\n\t\tmenuTable:         menu,\n\t\tfileExplorerTable: fileExplorerTable,\n\t\tbuildpackInfo:     buildpackInfo,\n\t\tbuildpacksTreeMap: nodes,\n\t\tescHandler:        escHandler,\n\t}\n\n\td.handle()\n\n\tfor row := 0; row < d.menuTable.GetRowCount(); row++ {\n\t\tif strings.Contains(d.menuTable.GetCell(row, 0).Text, selectedBuildpack.FullName()) {\n\t\t\td.menuTable.Select(row, 0)\n\t\t}\n\t}\n\n\td.app.SetRoot(screen, true)\n\treturn d\n}\n\nfunc (d *Dive) handle() {\n\tselectionFunc := func(nodeKey string) func(row, column int) {\n\t\treturn func(row, column int) {\n\t\t\tnode := d.fileExplorerTable.GetCell(row, 3).GetReference().(*tview.TreeNode)\n\n\t\t\tif !node.GetReference().(*tar.Header).FileInfo().IsDir() {\n\t\t\t\treturn\n\t\t\t}\n\n\t\t\tif node.IsExpanded() {\n\t\t\t\tnode.Collapse()\n\t\t\t} else {\n\t\t\t\tnode.Expand()\n\t\t\t}\n\n\t\t\td.loadFileExplorerData(nodeKey)\n\t\t}\n\t}\n\n\td.menuTable.SetSelectionChangedFunc(func(row, column int) {\n\t\t// protect from panic\n\t\tif row < 0 {\n\t\t\treturn\n\t\t}\n\n\t\t// if SBOM\n\t\tif row == 
d.menuTable.GetRowCount()-1 {\n\t\t\tnodeKey := \"layers/sbom\"\n\n\t\t\td.loadFileExplorerData(nodeKey)\n\n\t\t\td.fileExplorerTable.ScrollToBeginning()\n\n\t\t\td.fileExplorerTable.SetSelectedFunc(selectionFunc(nodeKey))\n\t\t\treturn\n\t\t}\n\n\t\t// if buildpack\n\t\tselectedBuildpack := d.buildpackInfo[row-4]\n\t\tnodeKey := \"layers/\" + strings.ReplaceAll(selectedBuildpack.ID, \"/\", \"_\")\n\n\t\td.loadFileExplorerData(nodeKey)\n\n\t\td.fileExplorerTable.ScrollToBeginning()\n\n\t\td.fileExplorerTable.SetSelectedFunc(selectionFunc(nodeKey))\n\t})\n\n\td.menuTable.SetDoneFunc(func(key tcell.Key) {\n\t\tswitch key {\n\t\tcase tcell.KeyEscape:\n\t\t\td.escHandler()\n\t\tcase tcell.KeyTab:\n\t\t\tscreen := tview.NewFlex().\n\t\t\t\tSetDirection(tview.FlexColumn).\n\t\t\t\tAddItem(d.menuTable, 0, 1, false).\n\t\t\t\tAddItem(d.fileExplorerTable, 0, 2, true)\n\t\t\td.app.SetRoot(screen, true)\n\t\t}\n\t})\n\n\td.fileExplorerTable.SetDoneFunc(func(key tcell.Key) {\n\t\tswitch key {\n\t\tcase tcell.KeyEscape:\n\t\t\td.escHandler()\n\t\tcase tcell.KeyTab:\n\t\t\tscreen := tview.NewFlex().\n\t\t\t\tSetDirection(tview.FlexColumn).\n\t\t\t\tAddItem(d.menuTable, 0, 1, true).\n\t\t\t\tAddItem(d.fileExplorerTable, 0, 2, false)\n\t\t\td.app.SetRoot(screen, true)\n\t\t}\n\t})\n}\n\nfunc (d *Dive) loadFileExplorerData(nodeKey string) {\n\t// Configure tree\n\troot := tview.NewTreeNode(\"[::b]Filetree[::-]\")\n\tfor _, child := range d.buildpacksTreeMap[nodeKey].GetChildren() {\n\t\troot.AddChild(child)\n\t}\n\n\td.fileExplorerTable.Clear()\n\n\t// Configure Table\n\td.fileExplorerTable.SetCell(0, 0, tview.NewTableCell(\"[::b]Permission[::-]\").SetSelectable(false))\n\td.fileExplorerTable.SetCell(0, 1, tview.NewTableCell(\"[::b]UID:GID[::-]\").SetSelectable(false).\n\t\tSetAlign(tview.AlignRight))\n\td.fileExplorerTable.SetCell(0, 2, tview.NewTableCell(\"[::b]Size[::-]  \").SetSelectable(false).\n\t\tSetAlign(tview.AlignRight))\n\td.fileExplorerTable.SetCell(0, 3, 
tview.NewTableCell(\"[::b]Filetree[::-]\").SetSelectable(false))\n\n\tbranchesMapping := map[*tview.TreeNode]Branches{\n\t\troot: {},\n\t}\n\n\troot.Walk(func(node, parent *tview.TreeNode) bool {\n\t\tif node == root {\n\t\t\treturn true\n\t\t}\n\n\t\tchildCount := len(parent.GetChildren())\n\t\tisLast := parent.GetChildren()[childCount-1] == node\n\n\t\tif isLast {\n\t\t\tbranchesMapping[node] = branchesMapping[parent].Add(NoBranchSymbol)\n\t\t} else {\n\t\t\tbranchesMapping[node] = branchesMapping[parent].Add(BranchSymbol)\n\t\t}\n\n\t\treturn true\n\t})\n\n\tvar tableRow = 0\n\troot.Walk(func(node, parent *tview.TreeNode) bool {\n\t\tif node == root {\n\t\t\ttableRow++\n\t\t\treturn true\n\t\t}\n\n\t\tref := node.GetReference().(*tar.Header)\n\n\t\tcollapseIcon := \"\"\n\t\tif !node.IsExpanded() {\n\t\t\tcollapseIcon = \" ⬥ \"\n\t\t}\n\n\t\tsize := \"-\"\n\t\tif ref.Typeflag != tar.TypeDir {\n\t\t\tsize = humanize.Bytes(uint64(ref.Size))\n\t\t}\n\n\t\tcolor := \"\"\n\t\tswitch {\n\t\tcase ref.FileInfo().IsDir():\n\t\t\tcolor = \"[mediumturquoise::b]\"\n\t\tcase ref.Typeflag == tar.TypeSymlink, ref.Typeflag == tar.TypeLink:\n\t\t\tcolor = \"[purple]\"\n\t\tcase ref.FileInfo().Mode().Perm()&0111 != 0:\n\t\t\tcolor = \"[yellow]\"\n\t\t}\n\n\t\tchildCount := len(parent.GetChildren())\n\t\tisLast := parent.GetChildren()[childCount-1] == node\n\t\tbranches := branchesMapping[node].String()\n\n\t\tcurrentBranch := MiddleBranchSymbol.String()\n\t\tif isLast {\n\t\t\tcurrentBranch = LastBranchSymbol.String()\n\t\t}\n\n\t\twithLink := \"\"\n\t\tif ref.Typeflag == tar.TypeSymlink || ref.Typeflag == tar.TypeLink {\n\t\t\twithLink = \"[-:-:-] → \" + ref.Linkname\n\t\t}\n\n\t\td.fileExplorerTable.SetCell(tableRow, 0, tview.NewTableCell(color+ref.FileInfo().Mode().String()))\n\t\td.fileExplorerTable.SetCell(tableRow, 1, tview.NewTableCell(color+fmt.Sprintf(\"%d:%d\", ref.Uid, ref.Gid)).\n\t\t\tSetAlign(tview.AlignRight))\n\t\td.fileExplorerTable.SetCell(tableRow, 2, 
tview.NewTableCell(color+size+\"  \").\n\t\t\tSetAlign(tview.AlignRight))\n\t\td.fileExplorerTable.SetCell(tableRow, 3, tview.NewTableCell(branches+currentBranch+color+ref.FileInfo().Name()+withLink+collapseIcon).\n\t\t\tSetReference(node))\n\n\t\ttableRow++\n\t\treturn node.IsExpanded()\n\t})\n}\n\nfunc initMenu(buildpackInfos []dist.ModuleInfo, nodes map[string]*tview.TreeNode) *tview.Table {\n\tstyle := tcell.StyleDefault.\n\t\tForeground(tcell.ColorMediumTurquoise).\n\t\tBackground(tcell.ColorDarkSlateGray).\n\t\tAttributes(tcell.AttrBold)\n\n\ttable := tview.NewTable()\n\ttable.\n\t\tSetSelectable(true, false).\n\t\tSetSelectedStyle(style).\n\t\tSetBorder(true).\n\t\tSetBorderPadding(1, 1, 2, 1).\n\t\tSetTitle(\"| [::b]phases[::-] |\").\n\t\tSetTitleAlign(tview.AlignLeft).\n\t\tSetBackgroundColor(backgroundColor)\n\n\tvar i int\n\tfor _, phase := range []string{\"ANALYZE\", \"DETECT\", \"RESTORE\", \"BUILD\"} {\n\t\ttable.SetCell(i, 0,\n\t\t\ttview.NewTableCell(phase).\n\t\t\t\tSetTextColor(tcell.ColorDarkGray).\n\t\t\t\tSetSelectable(false))\n\t\ti++\n\t}\n\n\tfor _, buildpackInfo := range buildpackInfos {\n\t\ttable.SetCell(i, 0,\n\t\t\ttview.NewTableCell(\" ↳ \"+buildpackInfo.FullName()).\n\t\t\t\tSetTextColor(tcell.ColorMediumTurquoise).\n\t\t\t\tSetSelectable(true))\n\t\ti++\n\t}\n\n\ttable.SetCell(i, 0,\n\t\ttview.NewTableCell(\"EXPORT\").\n\t\t\tSetTextColor(tcell.ColorDarkGray).\n\t\t\tSetSelectable(false))\n\n\t// set spacing\n\ti++\n\ti++\n\tsbomTextColor := tcell.ColorMediumTurquoise\n\tsbomSelectable := true\n\tif _, ok := nodes[\"layers/sbom\"]; !ok {\n\t\tsbomTextColor = tcell.ColorDarkGray\n\t\tsbomSelectable = false\n\t}\n\n\ttable.SetCell(i, 0,\n\t\ttview.NewTableCell(\"SBOM\").\n\t\t\tSetTextColor(sbomTextColor).\n\t\t\tSetSelectable(sbomSelectable))\n\treturn table\n}\n\nfunc initFileExplorer() *tview.Table {\n\tstyle := 
tcell.StyleDefault.\n\t\tForeground(tcell.ColorMediumTurquoise).\n\t\tBackground(tcell.ColorDarkSlateGray).\n\t\tAttributes(tcell.AttrBold)\n\n\ttbl := tview.NewTable()\n\ttbl.SetFixed(1, 0).\n\t\tSetSelectedStyle(style).\n\t\tSetSelectable(true, false).\n\t\tSetBackgroundColor(backgroundColor).\n\t\tSetBorderPadding(1, 1, 4, 0)\n\treturn tbl\n}\n"
  },
  {
    "path": "internal/termui/dive_test.go",
    "content": "package termui\n\nimport (\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/rivo/tview\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/termui/fakes\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestDiveScreen(t *testing.T) {\n\tspec.Run(t, \"DiveScreen\", testDive, spec.Report(report.Terminal{}))\n}\n\nfunc testDive(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tfakeApp           app\n\t\tbuildpacks        []dist.ModuleInfo\n\t\tselectedBuildpack dist.ModuleInfo\n\t\tnodes             map[string]*tview.TreeNode\n\t)\n\n\tit.Before(func() {\n\t\tfakeApp = fakes.NewApp()\n\t\tbuildpacks = []dist.ModuleInfo{\n\t\t\t{ID: \"some/buildpack-1\", Version: \"0.0.1\"},\n\t\t\t{ID: \"some/buildpack-2\", Version: \"0.0.2\"}}\n\t\tselectedBuildpack = buildpacks[0]\n\n\t\t// fetch nodes\n\t\ttermui := &Termui{\n\t\t\tnodes: map[string]*tview.TreeNode{}}\n\t\tf, err := os.Open(\"./testdata/fake-layers.tar\")\n\t\th.AssertNil(t, err)\n\t\th.AssertNil(t, termui.ReadLayers(f))\n\t\tnodes = termui.nodes\n\t})\n\n\tit(\"loads buildpack and layer data\", func() {\n\t\tscreen := NewDive(fakeApp, buildpacks, selectedBuildpack, nodes, func() {})\n\t\th.AssertContains(t, screen.menuTable.GetCell(4, 0).Text, \"some/buildpack-1@0.0.1\")\n\t\th.AssertContains(t, screen.menuTable.GetCell(5, 0).Text, \"some/buildpack-2@0.0.2\")\n\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 0).Text, \"-rw-r--r--\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 1).Text, \"501:20\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 2).Text, \"14 B  \")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 3).Text, \"└── some-file-1.txt\")\n\n\t\t// select the other buildpack\n\t\tscreen.menuTable.Select(5, 0)\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 0).Text, \"drwxr-xr-x\")\n\t\th.AssertContains(t, 
screen.fileExplorerTable.GetCell(1, 1).Text, \"501:20\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 2).Text, \"-  \")\n\t\th.AssertContainsMatch(t, screen.fileExplorerTable.GetCell(1, 3).Text, \"└── .*some-dir\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(2, 0).Text, \"-rw-r--r--\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(2, 1).Text, \"501:20\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(2, 2).Text, \"14 B  \")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(2, 3).Text, \"    └── some-file-2.txt\")\n\n\t\t// select SBOM\n\t\tlastRow := screen.menuTable.GetRowCount() - 1\n\t\tscreen.menuTable.Select(lastRow, 0)\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 0).Text, \"drwxr-xr-x\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 1).Text, \"501:20\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(1, 2).Text, \"-  \")\n\t\th.AssertContainsMatch(t, screen.fileExplorerTable.GetCell(1, 3).Text, \"└── .*launch\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(2, 0).Text, \"-rw-r--r--\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(2, 1).Text, \"501:20\")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(2, 2).Text, \"32 B  \")\n\t\th.AssertContains(t, screen.fileExplorerTable.GetCell(2, 3).Text, \"    └── sbom.cdx.json\")\n\t})\n}\n"
  },
  {
    "path": "internal/termui/fakes/app.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/rivo/tview\"\n)\n\ntype App struct {\n\tSetRootCallCount int\n\tDrawCallCount    int\n\n\tdoneChan chan bool\n}\n\nfunc NewApp() *App {\n\treturn &App{\n\t\tdoneChan: make(chan bool, 1),\n\t}\n}\n\nfunc (a *App) SetRoot(root tview.Primitive, fullscreen bool) *tview.Application {\n\ta.SetRootCallCount++\n\treturn nil\n}\n\nfunc (a *App) Draw() *tview.Application {\n\ta.DrawCallCount++\n\treturn nil\n}\n\nfunc (a *App) QueueUpdateDraw(f func()) *tview.Application {\n\tf()\n\ta.DrawCallCount++\n\treturn nil\n}\n\nfunc (a *App) Run() error {\n\t<-a.doneChan\n\treturn nil\n}\n\nfunc (a *App) StopRunning() {\n\ta.doneChan <- true\n}\n\nfunc (a *App) ResetDrawCount() {\n\ta.DrawCallCount = 0\n}\n"
  },
  {
    "path": "internal/termui/fakes/builder.go",
    "content": "package fakes\n\nimport (\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype Builder struct {\n\tbaseImageName       string\n\tbuildpacks          []dist.ModuleInfo\n\tlifecycleDescriptor builder.LifecycleDescriptor\n\tstack               builder.StackMetadata\n}\n\nfunc NewBuilder(baseImageName string, buildpacks []dist.ModuleInfo, lifecycleDescriptor builder.LifecycleDescriptor, stack builder.StackMetadata) *Builder {\n\treturn &Builder{\n\t\tbaseImageName:       baseImageName,\n\t\tbuildpacks:          buildpacks,\n\t\tlifecycleDescriptor: lifecycleDescriptor,\n\t\tstack:               stack,\n\t}\n}\n\nfunc (b *Builder) BaseImageName() string {\n\treturn b.baseImageName\n}\n\nfunc (b *Builder) Buildpacks() []dist.ModuleInfo {\n\treturn b.buildpacks\n}\n\nfunc (b *Builder) LifecycleDescriptor() builder.LifecycleDescriptor {\n\treturn b.lifecycleDescriptor\n}\n\nfunc (b *Builder) Stack() builder.StackMetadata {\n\treturn b.stack\n}\n"
  },
  {
    "path": "internal/termui/fakes/docker_stdwriter.go",
    "content": "package fakes\n\nimport (\n\t\"io\"\n\t\"time\"\n\n\t\"github.com/docker/docker/pkg/stdcopy\"\n)\n\ntype DockerStdWriter struct {\n\twOut io.Writer\n\twErr io.Writer\n}\n\nfunc NewDockerStdWriter(w io.Writer) *DockerStdWriter {\n\treturn &DockerStdWriter{\n\t\twOut: stdcopy.NewStdWriter(w, stdcopy.Stdout),\n\t\twErr: stdcopy.NewStdWriter(w, stdcopy.Stderr),\n\t}\n}\n\nfunc (w *DockerStdWriter) WriteStdoutln(contents string) {\n\tw.write(contents+\"\\n\", stdcopy.Stdout)\n}\n\nfunc (w *DockerStdWriter) WriteStderrln(contents string) {\n\tw.write(contents+\"\\n\", stdcopy.Stderr)\n}\n\nfunc (w *DockerStdWriter) write(contents string, t stdcopy.StdType) {\n\tswitch t {\n\tcase stdcopy.Stdout:\n\t\tw.wOut.Write([]byte(contents))\n\tcase stdcopy.Stderr:\n\t\tw.wErr.Write([]byte(contents))\n\t}\n\n\t// guard against race conditions\n\ttime.Sleep(time.Millisecond)\n}\n"
  },
  {
    "path": "internal/termui/logger.go",
    "content": "package termui\n\nimport \"io\"\n\nfunc (s *Termui) Debug(msg string) {\n\t// not implemented\n}\n\nfunc (s *Termui) Debugf(fmt string, v ...interface{}) {\n\t// not implemented\n}\n\nfunc (s *Termui) Info(msg string) {\n\ts.textChan <- msg\n}\n\nfunc (s *Termui) Infof(fmt string, v ...interface{}) {\n\t// not implemented\n}\n\nfunc (s *Termui) Warn(msg string) {\n\t// not implemented\n}\n\nfunc (s *Termui) Warnf(fmt string, v ...interface{}) {\n\t// not implemented\n}\n\nfunc (s *Termui) Error(msg string) {\n\t// not implemented\n}\n\nfunc (s *Termui) Errorf(fmt string, v ...interface{}) {\n\t// not implemented\n}\n\nfunc (s *Termui) Writer() io.Writer {\n\t// not implemented\n\treturn nil\n}\n\nfunc (s *Termui) IsVerbose() bool {\n\t// not implemented\n\treturn false\n}\n"
  },
  {
    "path": "internal/termui/termui.go",
    "content": "package termui\n\nimport (\n\t\"archive/tar\"\n\t\"bufio\"\n\t\"io\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"strings\"\n\n\t\"github.com/docker/docker/pkg/stdcopy\"\n\t\"github.com/gdamore/tcell/v2\"\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\t\"github.com/rivo/tview\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/container\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\nvar (\n\tbackgroundColor = tcell.NewRGBColor(5, 30, 40)\n)\n\ntype app interface {\n\tSetRoot(root tview.Primitive, fullscreen bool) *tview.Application\n\tDraw() *tview.Application\n\tQueueUpdateDraw(f func()) *tview.Application\n\tRun() error\n}\n\ntype buildr interface {\n\tBaseImageName() string\n\tBuildpacks() []dist.ModuleInfo\n\tLifecycleDescriptor() builder.LifecycleDescriptor\n\tStack() builder.StackMetadata\n}\n\ntype page interface {\n\tHandle(txt string)\n\tStop()\n\tSetNodes(nodes map[string]*tview.TreeNode)\n}\n\ntype Termui struct {\n\tapp         app\n\tbldr        buildr\n\tcurrentPage page\n\n\tappName       string\n\trunImageName  string\n\texitCode      int64\n\ttextChan      chan string\n\tbuildpackChan chan dist.ModuleInfo\n\tnodes         map[string]*tview.TreeNode\n}\n\nfunc NewTermui(appName string, bldr *builder.Builder, runImageName string) *Termui {\n\treturn &Termui{\n\t\tappName:       appName,\n\t\tbldr:          bldr,\n\t\trunImageName:  runImageName,\n\t\tapp:           tview.NewApplication(),\n\t\tbuildpackChan: make(chan dist.ModuleInfo, 50),\n\t\ttextChan:      make(chan string, 50),\n\t\tnodes:         map[string]*tview.TreeNode{},\n\t}\n}\n\n// Run starts the terminal UI process in the foreground\n// and the passed in function in the background\nfunc (s *Termui) Run(funk func()) error {\n\tgo func() {\n\t\tfunk()\n\t\ts.showBuildStatus()\n\t}()\n\tgo s.handle()\n\tdefer s.stop()\n\n\ts.currentPage = NewDetect(s.app, s.buildpackChan, s.bldr)\n\treturn s.app.Run()\n}\n\nfunc (s 
*Termui) stop() {\n\tclose(s.textChan)\n}\n\nfunc (s *Termui) handle() {\n\tvar detectLogs []string\n\n\tfor txt := range s.textChan {\n\t\tswitch {\n\t\t// We need a line that signals when detect phase is completed.\n\t\t// Since the phase order is: analyze -> detect -> restore -> build -> ...\n\t\t// \"===> RESTORING\" would be the best option. But since restore is optional,\n\t\t// \"===> BUILDING\" serves as the next best option.\n\t\tcase strings.Contains(txt, \"===> BUILDING\"):\n\t\t\ts.currentPage.Stop()\n\n\t\t\ts.currentPage = NewDashboard(s.app, s.appName, s.bldr, s.runImageName, collect(s.buildpackChan), detectLogs)\n\t\t\ts.currentPage.Handle(txt)\n\t\tdefault:\n\t\t\tdetectLogs = append(detectLogs, txt)\n\t\t\ts.currentPage.Handle(txt)\n\t\t}\n\t}\n}\n\nfunc (s *Termui) Handler() container.Handler {\n\treturn func(bodyChan <-chan dcontainer.WaitResponse, errChan <-chan error, reader io.Reader) error {\n\t\tvar (\n\t\t\tcopyErr = make(chan error)\n\t\t\tr, w    = io.Pipe()\n\t\t\tscanner = bufio.NewScanner(r)\n\t\t)\n\n\t\tgo func() {\n\t\t\tdefer w.Close()\n\n\t\t\t_, err := stdcopy.StdCopy(w, io.Discard, reader)\n\t\t\tif err != nil {\n\t\t\t\tcopyErr <- err\n\t\t\t}\n\t\t}()\n\n\t\tfor {\n\t\t\tselect {\n\t\t\t//TODO: errors should show up on screen\n\t\t\t//      instead of halting loop\n\t\t\t//See: https://github.com/buildpacks/pack/issues/1262\n\t\t\tcase err := <-copyErr:\n\t\t\t\treturn err\n\t\t\tcase err := <-errChan:\n\t\t\t\treturn err\n\t\t\tcase body := <-bodyChan:\n\t\t\t\ts.exitCode = body.StatusCode\n\t\t\t\treturn nil\n\t\t\tdefault:\n\t\t\t\tif scanner.Scan() {\n\t\t\t\t\ts.textChan <- scanner.Text()\n\t\t\t\t\tcontinue\n\t\t\t\t}\n\n\t\t\t\tif err := scanner.Err(); err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n}\n\nfunc (s *Termui) ReadLayers(reader io.ReadCloser) error {\n\tdefer reader.Close()\n\n\ttr := tar.NewReader(reader)\n\n\tfor {\n\t\theader, err := tr.Next()\n\n\t\tswitch {\n\t\t// if no more files 
are found return\n\t\tcase err == io.EOF:\n\t\t\tif s.currentPage != nil {\n\t\t\t\ts.currentPage.SetNodes(s.nodes)\n\t\t\t}\n\t\t\treturn nil\n\n\t\t// return any other error\n\t\tcase err != nil:\n\t\t\treturn err\n\n\t\t// if the header is nil, just skip it (not sure how this happens)\n\t\tcase header == nil:\n\t\t\tcontinue\n\n\t\tdefault:\n\t\t\tname := path.Clean(header.Name)\n\t\t\tdir, base := filepath.Split(name)\n\t\t\tdir = strings.TrimSuffix(dir, \"/\")\n\n\t\t\tif s.nodes[dir] == nil {\n\t\t\t\ts.nodes[dir] = tview.NewTreeNode(dir)\n\t\t\t}\n\n\t\t\tnode := tview.NewTreeNode(base).SetReference(header)\n\t\t\ts.nodes[name] = node\n\t\t\ts.nodes[dir].AddChild(node)\n\t\t}\n\t}\n}\n\nfunc (s *Termui) showBuildStatus() {\n\tif s.exitCode == 0 {\n\t\ts.textChan <- \"[green::b]\\n\\nBUILD SUCCEEDED\"\n\t\treturn\n\t}\n\n\ts.textChan <- \"[red::b]\\n\\nBUILD FAILED\"\n}\n\nfunc collect(buildpackChan chan dist.ModuleInfo) []dist.ModuleInfo {\n\tclose(buildpackChan)\n\n\tvar result []dist.ModuleInfo\n\tfor txt := range buildpackChan {\n\t\tresult = append(result, txt)\n\t}\n\n\treturn result\n}\n"
  },
  {
    "path": "internal/termui/termui_test.go",
    "content": "package termui\n\nimport (\n\t\"archive/tar\"\n\t\"bytes\"\n\t\"errors\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"regexp\"\n\t\"strings\"\n\t\"testing\"\n\t\"time\"\n\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\t\"github.com/rivo/tview\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/termui/fakes\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestScreen(t *testing.T) {\n\tspec.Run(t, \"Termui\", testTermui, spec.Report(report.Terminal{}))\n}\n\nfunc testTermui(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert             = h.NewAssertionManager(t)\n\t\teventuallyInterval = 500 * time.Millisecond\n\t\teventuallyDuration = 5 * time.Second\n\t)\n\n\tit(\"performs the lifecycle\", func() {\n\t\tvar (\n\t\t\tfakeBuild           = make(chan bool, 1)\n\t\t\tfakeBodyChan        = make(chan dcontainer.WaitResponse, 1)\n\t\t\tfakeApp             = fakes.NewApp()\n\t\t\tr, w                = io.Pipe()\n\t\t\tfakeDockerStdWriter = fakes.NewDockerStdWriter(w)\n\n\t\t\tfakeBuilder = fakes.NewBuilder(\"some/basename\",\n\t\t\t\t[]dist.ModuleInfo{\n\t\t\t\t\t{ID: \"some/buildpack-1\", Version: \"0.0.1\", Homepage: \"https://some/buildpack-1\"},\n\t\t\t\t\t{ID: \"some/buildpack-2\", Version: \"0.0.2\", Homepage: \"https://some/buildpack-2\"},\n\t\t\t\t},\n\t\t\t\tbuilder.LifecycleDescriptor{Info: builder.LifecycleInfo{\n\t\t\t\t\tVersion: builder.VersionMustParse(\"0.0.1\"),\n\t\t\t\t}},\n\t\t\t\tbuilder.StackMetadata{\n\t\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\t\tImage: \"some/run-image\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t)\n\n\t\t\ts = &Termui{\n\t\t\t\tappName:       \"some/app-name\",\n\t\t\t\tbldr:          fakeBuilder,\n\t\t\t\trunImageName:  \"some/run-image-name\",\n\t\t\t\tapp:           fakeApp,\n\t\t\t\tbuildpackChan: make(chan dist.ModuleInfo, 
10),\n\t\t\t\ttextChan:      make(chan string, 10),\n\t\t\t\tnodes:         map[string]*tview.TreeNode{},\n\t\t\t}\n\t\t)\n\n\t\tdefer func() {\n\t\t\tfakeBodyChan <- dcontainer.WaitResponse{StatusCode: 0}\n\t\t\tfakeBuild <- true\n\t\t\tw.Close()\n\t\t\tfakeApp.StopRunning()\n\t\t}()\n\t\tgo s.Run(func() { <-fakeBuild })\n\t\tgo s.Handler()(fakeBodyChan, nil, r)\n\n\t\th.Eventually(t, func() bool {\n\t\t\treturn fakeApp.SetRootCallCount == 1\n\t\t}, eventuallyInterval, eventuallyDuration)\n\n\t\tdetectPage, ok := s.currentPage.(*Detect)\n\t\tassert.TrueWithMessage(ok, fmt.Sprintf(\"expected %T to be assignable to type `*screen.Detect`\", s.currentPage))\n\t\tassert.TrueWithMessage(fakeApp.DrawCallCount > 0, \"expect app.Draw() to be called\")\n\t\th.Eventually(t, func() bool {\n\t\t\treturn strings.Contains(detectPage.textView.GetText(true), \"Detecting\")\n\t\t}, eventuallyInterval, eventuallyDuration)\n\n\t\tfakeDockerStdWriter.WriteStdoutln(`1 of 2 buildpacks participating`)\n\t\tfakeDockerStdWriter.WriteStdoutln(`some/buildpack-1 0.0.1`)\n\n\t\t// move to next screen\n\t\tfakeDockerStdWriter.WriteStdoutln(`===> BUILDING`)\n\t\th.Eventually(t, func() bool {\n\t\t\treturn strings.Contains(detectPage.textView.GetText(true), \"Detected!\")\n\t\t}, eventuallyInterval, eventuallyDuration)\n\n\t\th.Eventually(t, func() bool {\n\t\t\t_, ok := s.currentPage.(*Dashboard)\n\t\t\treturn ok\n\t\t}, eventuallyInterval, eventuallyDuration)\n\t\tassert.Equal(fakeApp.SetRootCallCount, 2)\n\n\t\tdashboardPage, ok := s.currentPage.(*Dashboard)\n\t\tassert.TrueWithMessage(ok, fmt.Sprintf(\"expected %T to be assignable to type `*screen.Dashboard`\", s.currentPage))\n\t\tassert.Equal(dashboardPage.planList.GetItemCount(), 1)\n\t\tbuildpackName, buildpackDescription := dashboardPage.planList.GetItemText(0)\n\t\tassert.Equal(buildpackName, \"some/buildpack-1@0.0.1\")\n\t\tassert.Equal(buildpackDescription, 
\"https://some/buildpack-1\")\n\n\t\tassert.Matches(dashboardPage.appTree.GetRoot().GetText(), regexp.MustCompile(`app: .*some/app-name`))\n\t\tassert.Matches(dashboardPage.appTree.GetRoot().GetChildren()[0].GetText(), regexp.MustCompile(`run: .*some/run-image-name`))\n\t\tassert.Matches(dashboardPage.builderTree.GetRoot().GetText(), regexp.MustCompile(`builder: .*some/basename`))\n\t\tassert.Matches(dashboardPage.builderTree.GetRoot().GetChildren()[0].GetText(), regexp.MustCompile(`lifecycle: .*0.0.1`))\n\t\tassert.Matches(dashboardPage.builderTree.GetRoot().GetChildren()[1].GetText(), regexp.MustCompile(`run: .*some/run-image`))\n\t\tassert.Matches(dashboardPage.builderTree.GetRoot().GetChildren()[2].GetText(), regexp.MustCompile(`buildpacks`))\n\n\t\tfakeDockerStdWriter.WriteStdoutln(`some-build-logs`)\n\t\th.Eventually(t, func() bool {\n\t\t\treturn strings.Contains(dashboardPage.logsView.GetText(true), \"some-build-logs\")\n\t\t}, eventuallyInterval, eventuallyDuration)\n\n\t\t// extract /layers from build and provide to termui\n\t\tf, err := os.Open(\"./testdata/fake-layers.tar\")\n\t\th.AssertNil(t, err)\n\t\th.AssertNil(t, s.ReadLayers(f))\n\n\t\tbpChildren1 := dashboardPage.nodes[\"layers/some_buildpack-1\"].GetChildren()\n\t\th.AssertEq(t, len(bpChildren1), 1)\n\t\th.AssertEq(t, bpChildren1[0].GetText(), \"some-file-1.txt\")\n\t\th.AssertFalse(t, bpChildren1[0].GetReference().(*tar.Header).FileInfo().IsDir())\n\n\t\tbpChildren2 := dashboardPage.nodes[\"layers/some_buildpack-2\"].GetChildren()\n\t\th.AssertEq(t, len(bpChildren2), 1)\n\t\th.AssertEq(t, bpChildren2[0].GetText(), \"some-dir\")\n\t\th.AssertTrue(t, bpChildren2[0].GetReference().(*tar.Header).FileInfo().IsDir())\n\n\t\th.AssertEq(t, len(bpChildren2[0].GetChildren()), 1)\n\t\th.AssertEq(t, bpChildren2[0].GetChildren()[0].GetText(), \"some-file-2.txt\")\n\t\th.AssertFalse(t, bpChildren2[0].GetChildren()[0].GetReference().(*tar.Header).FileInfo().IsDir())\n\n\t\t// finish build\n\t\tfakeBodyChan 
<- dcontainer.WaitResponse{StatusCode: 0}\n\t\tw.Close()\n\t\ttime.Sleep(500 * time.Millisecond)\n\t\tfakeBuild <- true\n\t\th.Eventually(t, func() bool {\n\t\t\treturn strings.Contains(dashboardPage.logsView.GetText(true), \"BUILD SUCCEEDED\")\n\t\t}, eventuallyInterval, eventuallyDuration)\n\t})\n\n\tit(\"performs the lifecycle (when the builder is untrusted)\", func() {\n\t\tvar (\n\t\t\tfakeBuild           = make(chan bool, 1)\n\t\t\tfakeBodyChan        = make(chan dcontainer.WaitResponse, 1)\n\t\t\tfakeApp             = fakes.NewApp()\n\t\t\tr, w                = io.Pipe()\n\t\t\tfakeDockerStdWriter = fakes.NewDockerStdWriter(w)\n\n\t\t\tfakeBuilder = fakes.NewBuilder(\"some/basename\",\n\t\t\t\t[]dist.ModuleInfo{\n\t\t\t\t\t{ID: \"some/buildpack-1\", Version: \"0.0.1\", Homepage: \"https://some/buildpack-1\"},\n\t\t\t\t\t{ID: \"some/buildpack-2\", Version: \"0.0.2\", Homepage: \"https://some/buildpack-2\"},\n\t\t\t\t},\n\t\t\t\tbuilder.LifecycleDescriptor{Info: builder.LifecycleInfo{\n\t\t\t\t\tVersion: builder.VersionMustParse(\"0.0.1\"),\n\t\t\t\t}},\n\t\t\t\tbuilder.StackMetadata{\n\t\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\t\tImage: \"some/run-image\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t)\n\n\t\t\ts = &Termui{\n\t\t\t\tappName:       \"some/app-name\",\n\t\t\t\tbldr:          fakeBuilder,\n\t\t\t\trunImageName:  \"some/run-image-name\",\n\t\t\t\tapp:           fakeApp,\n\t\t\t\tbuildpackChan: make(chan dist.ModuleInfo, 10),\n\t\t\t\ttextChan:      make(chan string, 10),\n\t\t\t}\n\t\t)\n\n\t\tdefer func() {\n\t\t\tfakeBodyChan <- dcontainer.WaitResponse{StatusCode: 0}\n\t\t\tfakeBuild <- true\n\t\t\tw.Close()\n\t\t\tfakeApp.StopRunning()\n\t\t}()\n\t\tgo s.Run(func() { <-fakeBuild })\n\t\tgo s.Handler()(fakeBodyChan, nil, r)\n\n\t\th.Eventually(t, func() bool {\n\t\t\treturn fakeApp.SetRootCallCount == 1\n\t\t}, eventuallyInterval, eventuallyDuration)\n\n\t\tassert.Equal(fakeApp.SetRootCallCount, 1)\n\t\tcurrentPage, ok := 
s.currentPage.(*Detect)\n\t\tassert.TrueWithMessage(ok, fmt.Sprintf(\"expected %T to be assignable to type `*screen.Detect`\", s.currentPage))\n\t\tassert.TrueWithMessage(fakeApp.DrawCallCount > 0, \"expect app.Draw() to be called\")\n\t\th.Eventually(t, func() bool {\n\t\t\treturn strings.Contains(currentPage.textView.GetText(true), \"Detecting\")\n\t\t}, eventuallyInterval, eventuallyDuration)\n\n\t\t// move to next screen\n\t\ts.Info(`===> BUILDING`)\n\t\th.Eventually(t, func() bool {\n\t\t\treturn strings.Contains(currentPage.textView.GetText(true), \"Detected!\")\n\t\t}, eventuallyInterval, eventuallyDuration)\n\n\t\th.Eventually(t, func() bool {\n\t\t\t_, ok := s.currentPage.(*Dashboard)\n\t\t\treturn ok\n\t\t}, eventuallyInterval, eventuallyDuration)\n\t\tassert.Equal(fakeApp.SetRootCallCount, 2)\n\n\t\tdashboardPage, ok := s.currentPage.(*Dashboard)\n\t\tassert.TrueWithMessage(ok, fmt.Sprintf(\"expected %T to be assignable to type `*screen.Dashboard`\", s.currentPage))\n\n\t\tfakeDockerStdWriter.WriteStdoutln(`some-build-logs`)\n\t\th.Eventually(t, func() bool {\n\t\t\treturn strings.Contains(dashboardPage.logsView.GetText(true), \"some-build-logs\")\n\t\t}, eventuallyInterval, eventuallyDuration)\n\n\t\t// finish build\n\t\tfakeBodyChan <- dcontainer.WaitResponse{StatusCode: 1}\n\t\tw.Close()\n\t\ttime.Sleep(500 * time.Millisecond)\n\t\tfakeBuild <- true\n\t\th.Eventually(t, func() bool {\n\t\t\treturn strings.Contains(dashboardPage.logsView.GetText(true), \"BUILD FAILED\")\n\t\t}, eventuallyInterval, eventuallyDuration)\n\t})\n\n\t// TODO: change to show errors on-screen\n\t// See: https://github.com/buildpacks/pack/issues/1262\n\tit(\"returns errors from error channel\", func() {\n\t\tvar (\n\t\t\terrChan = make(chan error, 1)\n\t\t\tfakeApp = fakes.NewApp()\n\t\t\ts       = Termui{app: fakeApp}\n\t\t)\n\n\t\terrChan <- errors.New(\"some-error\")\n\n\t\terr := s.Handler()(nil, errChan, bytes.NewReader(nil))\n\t\tassert.ErrorContains(err, 
\"some-error\")\n\t})\n}\n"
  },
  {
    "path": "internal/termui/testdata/generate.sh",
    "content": "#!/usr/bin/env bash\n\nset -e\n\ndir=$(cd \"$(dirname \"$0\")\" && pwd)\nmkdir -p \"$dir/layers/some_buildpack-1\"\nmkdir -p \"$dir/layers/some_buildpack-2/some-dir\"\nmkdir -p \"$dir/layers/sbom/launch\"\n\necho -n \"some-content-1\" > \"$dir/layers/some_buildpack-1/some-file-1.txt\"\necho -n \"some-content-2\" > \"$dir/layers/some_buildpack-2/some-dir/some-file-2.txt\"\necho -n '{\"content\": \"some-sbom-content\"}' > \"$dir/layers/sbom/launch/sbom.cdx.json\"\n\n# create the archive relative to the testdata dir so the script works from any cwd\ntar cvf \"$dir/fake-layers.tar\" -C \"$dir\" layers\nrm -rf \"$dir/layers\"\n"
  },
  {
    "path": "main.go",
    "content": "package main\n\nimport (\n\t\"os\"\n\n\t\"github.com/heroku/color\"\n\n\t\"github.com/buildpacks/pack/cmd\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\n\t\"github.com/buildpacks/pack/internal/commands\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc main() {\n\t// create logger with defaults\n\tlogger := logging.NewLogWithWriters(color.Stdout(), color.Stderr())\n\n\trootCmd, err := cmd.NewPackCommand(logger)\n\tif err != nil {\n\t\tlogger.Error(err.Error())\n\t\tos.Exit(1)\n\t}\n\n\tctx := commands.CreateCancellableContext()\n\tif err := rootCmd.ExecuteContext(ctx); err != nil {\n\t\tif _, isSoftError := err.(client.SoftError); isSoftError {\n\t\t\tos.Exit(2)\n\t\t}\n\t\tos.Exit(1)\n\t}\n}\n"
  },
  {
    "path": "pkg/README.md",
    "content": "`pkg/` is a collection of utility packages used by the Pack CLI that are not specific to its internals. Some\npackages may move to `imgutil` once a project-wide common API is developed, but for now this is a first landing spot that\nallows us to share code with the community.\n"
  },
  {
    "path": "pkg/archive/archive.go",
    "content": "// Package archive defines a set of functions for reading and writing directories and files in a number of tar formats.\npackage archive // import \"github.com/buildpacks/pack/pkg/archive\"\n\nimport (\n\t\"archive/tar\"\n\t\"archive/zip\"\n\t\"io\"\n\t\"io/fs\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"time\"\n\n\t\"github.com/docker/docker/pkg/ioutils\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/paths\"\n)\n\nvar NormalizedDateTime time.Time\nvar Umask fs.FileMode\n\nfunc init() {\n\tNormalizedDateTime = time.Date(1980, time.January, 1, 0, 0, 1, 0, time.UTC)\n}\n\ntype TarWriter interface {\n\tWriteHeader(hdr *tar.Header) error\n\tWrite(b []byte) (int, error)\n\tClose() error\n}\n\ntype TarWriterFactory interface {\n\tNewWriter(io.Writer) TarWriter\n}\n\ntype defaultTarWriterFactory struct{}\n\nfunc DefaultTarWriterFactory() TarWriterFactory {\n\treturn defaultTarWriterFactory{}\n}\n\nfunc (defaultTarWriterFactory) NewWriter(w io.Writer) TarWriter {\n\treturn tar.NewWriter(w)\n}\n\nfunc ReadDirAsTar(srcDir, basePath string, uid, gid int, mode int64, normalizeModTime, includeRoot bool, fileFilter func(string) bool) io.ReadCloser {\n\treturn GenerateTar(func(tw TarWriter) error {\n\t\treturn WriteDirToTar(tw, srcDir, basePath, uid, gid, mode, normalizeModTime, includeRoot, fileFilter)\n\t})\n}\n\nfunc ReadZipAsTar(srcPath, basePath string, uid, gid int, mode int64, normalizeModTime bool, fileFilter func(string) bool) io.ReadCloser {\n\treturn GenerateTar(func(tw TarWriter) error {\n\t\treturn WriteZipToTar(tw, srcPath, basePath, uid, gid, mode, normalizeModTime, fileFilter)\n\t})\n}\n\nfunc GenerateTar(genFn func(TarWriter) error) io.ReadCloser {\n\treturn GenerateTarWithWriter(genFn, DefaultTarWriterFactory())\n}\n\n// GenerateTarWithWriter returns a reader to a tar from a generator function using a writer from the provided factory.\n// Note that the generator will not fully execute until the reader is fully read from. 
Any errors returned by the\n// generator will be returned when reading the reader.\nfunc GenerateTarWithWriter(genFn func(TarWriter) error, twf TarWriterFactory) io.ReadCloser {\n\terrChan := make(chan error)\n\tpr, pw := io.Pipe()\n\n\tgo func() {\n\t\ttw := twf.NewWriter(pw)\n\t\tdefer func() {\n\t\t\tif r := recover(); r != nil {\n\t\t\t\ttw.Close()\n\t\t\t\tpw.CloseWithError(errors.Errorf(\"panic: %v\", r))\n\t\t\t}\n\t\t}()\n\n\t\terr := genFn(tw)\n\n\t\tcloseErr := tw.Close()\n\t\tcloseErr = aggregateError(closeErr, pw.CloseWithError(err))\n\n\t\terrChan <- closeErr\n\t}()\n\n\treturn ioutils.NewReadCloserWrapper(pr, func() error {\n\t\tvar completeErr error\n\n\t\t// closing the reader ensures that if anything attempts\n\t\t// further reading it doesn't block waiting for content\n\t\tif err := pr.Close(); err != nil {\n\t\t\tcompleteErr = aggregateError(completeErr, err)\n\t\t}\n\n\t\t// wait until everything closes properly\n\t\tif err := <-errChan; err != nil {\n\t\t\tcompleteErr = aggregateError(completeErr, err)\n\t\t}\n\n\t\treturn completeErr\n\t})\n}\n\nfunc aggregateError(base, addition error) error {\n\tif addition == nil {\n\t\treturn base\n\t}\n\n\tif base == nil {\n\t\treturn addition\n\t}\n\n\treturn errors.Wrap(addition, base.Error())\n}\n\nfunc CreateSingleFileTarReader(path, txt string) io.ReadCloser {\n\ttarBuilder := TarBuilder{}\n\ttarBuilder.AddFile(path, 0644, NormalizedDateTime, []byte(txt))\n\treturn tarBuilder.Reader(DefaultTarWriterFactory())\n}\n\nfunc CreateSingleFileTar(tarFile, path, txt string) error {\n\ttarBuilder := TarBuilder{}\n\ttarBuilder.AddFile(path, 0644, NormalizedDateTime, []byte(txt))\n\treturn tarBuilder.WriteToPath(tarFile, DefaultTarWriterFactory())\n}\n\n// ErrEntryNotExist is an error returned if an entry path doesn't exist\nvar ErrEntryNotExist = errors.New(\"not exist\")\n\n// IsEntryNotExist detects whether a given error is of type ErrEntryNotExist\nfunc IsEntryNotExist(err error) bool {\n\treturn err == 
ErrEntryNotExist || errors.Cause(err) == ErrEntryNotExist\n}\n\n// ReadTarEntry reads a tar stream until it finds the entry at entryPath, and returns that entry's header and contents\nfunc ReadTarEntry(rc io.Reader, entryPath string) (*tar.Header, []byte, error) {\n\tcanonicalEntryPath := paths.CanonicalTarPath(entryPath)\n\ttr := tar.NewReader(rc)\n\tfor {\n\t\theader, err := tr.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrap(err, \"failed to get next tar entry\")\n\t\t}\n\n\t\tif paths.CanonicalTarPath(header.Name) == canonicalEntryPath {\n\t\t\tbuf, err := io.ReadAll(tr)\n\t\t\tif err != nil {\n\t\t\t\treturn nil, nil, errors.Wrapf(err, \"failed to read contents of '%s'\", entryPath)\n\t\t\t}\n\n\t\t\treturn header, buf, nil\n\t\t}\n\t}\n\n\treturn nil, nil, errors.Wrapf(ErrEntryNotExist, \"could not find entry path '%s'\", entryPath)\n}\n\n// WriteDirToTar writes the contents of a directory to a tar writer. `basePath` is the \"location\" in the tar where the\n// contents will be placed. The includeRoot param determines whether an entry is also written for basePath itself, with the\n// provided ownership and mode.\nfunc WriteDirToTar(tw TarWriter, srcDir, basePath string, uid, gid int, mode int64, normalizeModTime, includeRoot bool, fileFilter func(string) bool) error {\n\tif includeRoot {\n\t\tmode := modePermIfNegativeMode(mode)\n\t\terr := writeRootHeader(tw, basePath, mode, uid, gid, normalizeModTime)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\thardLinkFiles := map[uint64]string{}\n\treturn filepath.Walk(srcDir, func(file string, fi os.FileInfo, err error) error {\n\t\tvar relPath string\n\t\tif fileFilter != nil {\n\t\t\trelPath, err = filepath.Rel(srcDir, file)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tif !fileFilter(relPath) {\n\t\t\t\treturn nil\n\t\t\t}\n\t\t}\n\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif relPath == \"\" {\n\t\t\trelPath, err = filepath.Rel(srcDir, file)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\t\tif relPath == \".\" {\n\t\t\treturn nil\n\t\t}\n\n\t\tif hasModeSocket(fi) != 0 {\n\t\t\treturn nil\n\t\t}\n\n\t\tvar header *tar.Header\n\t\tif hasModeSymLink(fi) {\n\t\t\tif header, err = getHeaderFromSymLink(file, fi); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t} else {\n\t\t\tif header, err = tar.FileInfoHeader(fi, fi.Name()); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\n\t\theader.Name = getHeaderNameFromBaseAndRelPath(basePath, relPath)\n\t\tif err = processHardLinks(file, fi, hardLinkFiles, header); err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\terr = writeHeader(header, uid, gid, mode, normalizeModTime, tw)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif hasRegularMode(fi) && header.Size > 0 {\n\t\t\tf, err := os.Open(filepath.Clean(file))\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tdefer f.Close()\n\n\t\t\tif _, err := io.Copy(tw, f); err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\n\t\treturn nil\n\t})\n}\n\n// processHardLinks determines whether the given file has hard links associated with it; the given hardLinkFiles map keeps track\n// of any hard links previously processed. 
If the hard link was already found, the header is updated with\n// the previously recorded entry name; otherwise the newly found hard link is tracked in the map.\nfunc processHardLinks(file string, fi os.FileInfo, hardLinkFiles map[uint64]string, header *tar.Header) error {\n\tvar (\n\t\terr       error\n\t\thardlinks bool\n\t\tinode     uint64\n\t)\n\tif hardlinks, err = hasHardlinks(fi, file); err != nil {\n\t\treturn err\n\t}\n\tif hardlinks {\n\t\tinode, err = getInodeFromStat(fi.Sys(), file)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif processedPath, ok := hardLinkFiles[inode]; ok {\n\t\t\theader.Typeflag = tar.TypeLink\n\t\t\theader.Linkname = processedPath\n\t\t\theader.Size = 0\n\t\t} else {\n\t\t\thardLinkFiles[inode] = header.Name\n\t\t}\n\t}\n\treturn nil\n}\n\n// WriteZipToTar writes the contents of a zip file to a tar writer.\nfunc WriteZipToTar(tw TarWriter, srcZip, basePath string, uid, gid int, mode int64, normalizeModTime bool, fileFilter func(string) bool) error {\n\tzipReader, err := zip.OpenReader(srcZip)\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer zipReader.Close()\n\n\tvar fileMode int64\n\tfor _, f := range zipReader.File {\n\t\tif fileFilter != nil && !fileFilter(f.Name) {\n\t\t\tcontinue\n\t\t}\n\n\t\tfileMode = mode\n\t\tif isFatFile(f.FileHeader) {\n\t\t\tfileMode = 0777\n\t\t}\n\n\t\tvar header *tar.Header\n\t\tif f.Mode()&os.ModeSymlink != 0 {\n\t\t\ttarget, err := func() (string, error) {\n\t\t\t\tr, err := f.Open()\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn \"\", err\n\t\t\t\t}\n\t\t\t\tdefer r.Close()\n\n\t\t\t\t// contents is the target of the symlink\n\t\t\t\ttarget, err := io.ReadAll(r)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn \"\", err\n\t\t\t\t}\n\n\t\t\t\treturn string(target), nil\n\t\t\t}()\n\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\theader, err = tar.FileInfoHeader(f.FileInfo(), target)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t} else {\n\t\t\theader, err = 
tar.FileInfoHeader(f.FileInfo(), f.Name)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\n\t\theader.Name = filepath.ToSlash(filepath.Join(basePath, f.Name))\n\t\tfinalizeHeader(header, uid, gid, fileMode, normalizeModTime)\n\n\t\tif err := tw.WriteHeader(header); err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif f.Mode().IsRegular() {\n\t\t\terr := func() error {\n\t\t\t\tfi, err := f.Open()\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t\tdefer fi.Close()\n\n\t\t\t\t_, err = io.Copy(tw, fi)\n\t\t\t\treturn err\n\t\t\t}()\n\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\t}\n\n\treturn nil\n}\n\n// NormalizeHeader normalizes a tar.Header\n//\n// Normalizes the following:\n//   - ModTime\n//   - GID\n//   - UID\n//   - User Name\n//   - Group Name\nfunc NormalizeHeader(header *tar.Header, normalizeModTime bool) {\n\tif normalizeModTime {\n\t\theader.ModTime = NormalizedDateTime\n\t}\n\theader.Uid = 0\n\theader.Gid = 0\n\theader.Uname = \"\"\n\theader.Gname = \"\"\n}\n\n// IsZip detects whether or not a file is a zip archive\nfunc IsZip(path string) (bool, error) {\n\tr, err := zip.OpenReader(path)\n\n\tswitch err {\n\tcase nil:\n\t\tr.Close()\n\t\treturn true, nil\n\tcase zip.ErrFormat:\n\t\treturn false, nil\n\tdefault:\n\t\treturn false, err\n\t}\n}\n\nfunc isFatFile(header zip.FileHeader) bool {\n\tvar (\n\t\tcreatorFAT  uint16 = 0 // nolint:revive\n\t\tcreatorVFAT uint16 = 14\n\t)\n\n\t// This identifies FAT files, based on the `zip` source: https://golang.org/src/archive/zip/struct.go\n\tfirstByte := header.CreatorVersion >> 8\n\treturn firstByte == creatorFAT || firstByte == creatorVFAT\n}\n\nfunc finalizeHeader(header *tar.Header, uid, gid int, mode int64, normalizeModTime bool) {\n\tNormalizeHeader(header, normalizeModTime)\n\tif mode != -1 {\n\t\theader.Mode = mode\n\t}\n\theader.Uid = uid\n\theader.Gid = gid\n}\n\nfunc hasRegularMode(fi os.FileInfo) bool {\n\treturn fi.Mode().IsRegular()\n}\n\nfunc getHeaderNameFromBaseAndRelPath(basePath string, relPath string) string {\n\treturn filepath.ToSlash(filepath.Join(basePath, relPath))\n}\n\nfunc writeHeader(header *tar.Header, uid int, gid int, mode int64, normalizeModTime bool, tw TarWriter) error {\n\tfinalizeHeader(header, uid, gid, mode, normalizeModTime)\n\n\tif err := tw.WriteHeader(header); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\nfunc getHeaderFromSymLink(file string, fi os.FileInfo) (*tar.Header, error) {\n\ttarget, err := os.Readlink(file)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\t// Ensure that symlinks have Linux link names, independent of source OS\n\theader, err := tar.FileInfoHeader(fi, filepath.ToSlash(target))\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\treturn header, nil\n}\n\nfunc hasModeSymLink(fi os.FileInfo) bool {\n\treturn fi.Mode()&os.ModeSymlink != 0\n}\n\nfunc hasModeSocket(fi os.FileInfo) fs.FileMode {\n\treturn fi.Mode() & os.ModeSocket\n}\n\nfunc writeRootHeader(tw TarWriter, basePath string, mode int64, uid int, gid int, normalizeModTime bool) error {\n\trootHeader := &tar.Header{\n\t\tTypeflag: tar.TypeDir,\n\t\tName:     basePath,\n\t\tMode:     mode,\n\t}\n\n\tfinalizeHeader(rootHeader, uid, gid, mode, normalizeModTime)\n\n\tif err := tw.WriteHeader(rootHeader); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\nfunc modePermIfNegativeMode(mode int64) int64 {\n\tif mode == -1 {\n\t\treturn int64(fs.ModePerm)\n\t}\n\treturn mode\n}\n"
  },
  {
    "path": "pkg/archive/archive_test.go",
    "content": "package archive_test\n\nimport (\n\t\"archive/tar\"\n\t\"net\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestArchive(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Archive\", testArchive, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testArchive(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\ttmpDir string\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"create-tar-test\")\n\t\tif err != nil {\n\t\t\tt.Fatalf(\"failed to create tmp dir %s: %s\", tmpDir, err)\n\t\t}\n\t})\n\n\tit.After(func() {\n\t\tif err := os.RemoveAll(tmpDir); err != nil {\n\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\t// skip \"The process cannot access the file because it is being used by another process\" on windows\n\t\t\t\tt.Fatalf(\"failed to clean up tmp dir %s: %s\", tmpDir, err)\n\t\t\t}\n\t\t}\n\t})\n\n\twhen(\"#ReadDirAsTar\", func() {\n\t\tvar src string\n\t\tit.Before(func() {\n\t\t\tsrc = filepath.Join(\"testdata\", \"dir-to-tar\")\n\t\t})\n\n\t\tit(\"returns a TarReader of the dir\", func() {\n\t\t\trc := archive.ReadDirAsTar(src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, false, nil)\n\n\t\t\ttr := tar.NewReader(rc)\n\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", int64(os.ModePerm))\n\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", int64(os.ModePerm))\n\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t\tverify.NoMoreFilesExist()\n\t\t\t\th.AssertNil(t, 
rc.Close())\n\t\t\t}\n\t\t})\n\t\twhen(\"includeRoot\", func() {\n\t\t\tit(\"includes a modified root entry\", func() {\n\t\t\t\trc := archive.ReadDirAsTar(src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, true, nil)\n\t\t\t\ttr := tar.NewReader(rc)\n\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive\", int64(os.ModePerm))\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#ReadZipAsTar\", func() {\n\t\tvar src string\n\t\tit.Before(func() {\n\t\t\tsrc = filepath.Join(\"testdata\", \"zip-to-tar.zip\")\n\t\t})\n\n\t\tit(\"returns a TarReader of the dir\", func() {\n\t\t\trc := archive.ReadZipAsTar(src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, nil)\n\n\t\t\ttr := tar.NewReader(rc)\n\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", int64(os.ModePerm))\n\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", int64(os.ModePerm))\n\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\n\t\t\tverify.NoMoreFilesExist()\n\t\t\th.AssertNil(t, rc.Close())\n\t\t})\n\t})\n\n\twhen(\"#ReadTarEntry\", func() {\n\t\tvar (\n\t\t\terr     error\n\t\t\ttarFile *os.File\n\t\t)\n\t\tit.Before(func() {\n\t\t\ttarFile, err = os.CreateTemp(tmpDir, \"file.tgz\")\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\t_ = tarFile.Close()\n\t\t})\n\n\t\twhen(\"tgz has the path\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\terr = archive.CreateSingleFileTar(tarFile.Name(), \"file1\", \"file-1 content\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"returns the file contents\", func() {\n\t\t\t\t_, contents, err := archive.ReadTarEntry(tarFile, \"file1\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, string(contents), \"file-1 content\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"tgz has ./path\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\terr = 
archive.CreateSingleFileTar(tarFile.Name(), \"./file1\", \"file-1 content\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"returns the file contents\", func() {\n\t\t\t\t_, contents, err := archive.ReadTarEntry(tarFile, \"file1\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, string(contents), \"file-1 content\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"path doesn't exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\terr = archive.CreateSingleFileTar(tarFile.Name(), \"file1\", \"file-1 content\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"returns an entry-not-exist error\", func() {\n\t\t\t\t_, _, err := archive.ReadTarEntry(tarFile, \"file2\")\n\t\t\t\th.AssertError(t, err, \"could not find entry path\")\n\t\t\t\th.AssertTrue(t, archive.IsEntryNotExist(err))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"reader isn't tar\", func() {\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\treader := strings.NewReader(\"abcde\")\n\t\t\t\t_, _, err := archive.ReadTarEntry(reader, \"file1\")\n\t\t\t\th.AssertError(t, err, \"get next tar entry\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#CreateSingleFileTarReader\", func() {\n\t\tit(\"returns the file contents\", func() {\n\t\t\trc := archive.CreateSingleFileTarReader(\"file1\", \"file-1 content\")\n\t\t\t_, contents, err := archive.ReadTarEntry(rc, \"file1\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, string(contents), \"file-1 content\")\n\t\t})\n\t})\n\n\twhen(\"#IsEntryNotExist\", func() {\n\t\tit(\"detects wrapped ErrEntryNotExist errors\", func() {\n\t\t\th.AssertTrue(t, archive.IsEntryNotExist(errors.Wrap(archive.ErrEntryNotExist, \"something\")))\n\t\t\th.AssertFalse(t, archive.IsEntryNotExist(errors.New(\"something not err not exist\")))\n\t\t})\n\t})\n\n\twhen(\"#WriteDirToTar\", func() {\n\t\tvar src string\n\t\tit.Before(func() {\n\t\t\tsrc = filepath.Join(\"testdata\", \"dir-to-tar\")\n\t\t})\n\n\t\twhen(\"mode is set to 0777\", func() {\n\t\t\tit(\"writes a tar to the dest dir with 0777\", func() {\n\t\t\t\tfh, err := 
os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, false, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer file.Close()\n\n\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", int64(os.ModePerm))\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", int64(os.ModePerm))\n\t\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\n\t\twhen(\"mode is set to 0755\", func() {\n\t\t\tit(\"writes a tar to the dest dir with 0755\", func() {\n\t\t\t\tfh, err := os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0755, true, false, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer file.Close()\n\n\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", 0755)\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", 0755)\n\t\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\n\t\twhen(\"includeRoot is true\", func() 
{\n\t\t\tit(\"writes a tar to the root dir with the provided mode\", func() {\n\t\t\t\tfh, err := os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, true, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer file.Close()\n\n\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive\", int64(os.ModePerm))\n\t\t\t})\n\t\t\twhen(\"mode is set to -1\", func() {\n\t\t\t\tit(\"writes a tar to the root dir with default (0777) dir mode\", func() {\n\t\t\t\t\tfh, err := os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, -1, true, true, nil)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer file.Close()\n\n\t\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive\", 0777)\n\t\t\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", fileMode(t, filepath.Join(src, \"some-file.txt\")))\n\t\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", fileMode(t, filepath.Join(src, \"sub-dir\")))\n\t\t\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t\t\t}\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"has file 
filter\", func() {\n\t\t\tit(\"does not add files against the file filter\", func() {\n\t\t\t\ttarFile := filepath.Join(tmpDir, \"some.tar\")\n\t\t\t\tfh, err := os.Create(tarFile)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, false, func(path string) bool {\n\t\t\t\t\treturn !strings.Contains(path, \"some-file.txt\")\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer file.Close()\n\n\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", int64(os.ModePerm))\n\t\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"filter is only handed relevant section of the filepath\", func() {\n\t\t\t\ttarFile := filepath.Join(tmpDir, \"some.tar\")\n\t\t\t\tfh, err := os.Create(tarFile)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, false, func(path string) bool {\n\t\t\t\t\treturn !strings.Contains(path, \"dir-to-tar\")\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer file.Close()\n\n\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", int64(os.ModePerm))\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", 
int64(os.ModePerm))\n\t\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\n\t\twhen(\"normalize mod time is false\", func() {\n\t\t\tit(\"does not normalize mod times\", func() {\n\t\t\t\ttarFile := filepath.Join(tmpDir, \"some.tar\")\n\t\t\t\tfh, err := os.Create(tarFile)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/foo\", 1234, 2345, 0777, false, false, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\th.AssertOnTarEntry(t, tarFile, \"/foo/some-file.txt\",\n\t\t\t\t\th.DoesNotHaveModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"normalize mod time is true\", func() {\n\t\t\tit(\"normalizes mod times\", func() {\n\t\t\t\ttarFile := filepath.Join(tmpDir, \"some.tar\")\n\t\t\t\tfh, err := os.Create(tarFile)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/foo\", 1234, 2345, 0777, true, false, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\th.AssertOnTarEntry(t, tarFile, \"/foo/some-file.txt\",\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"is posix\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"Skipping on windows\")\n\t\t\t})\n\n\t\t\twhen(\"socket is present\", func() {\n\t\t\t\tvar (\n\t\t\t\t\terr        error\n\t\t\t\t\ttmpSrcDir  string\n\t\t\t\t\tfakeSocket net.Listener\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\ttmpSrcDir, err = os.MkdirTemp(\"\", \"socket-test\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tfakeSocket, err = net.Listen(\n\t\t\t\t\t\t\"unix\",\n\t\t\t\t\t\tfilepath.Join(tmpSrcDir, \"fake-socket\"),\n\t\t\t\t\t)\n\n\t\t\t\t\terr = 
os.WriteFile(filepath.Join(tmpSrcDir, \"fake-file\"), []byte(\"some-content\"), 0777)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\tos.RemoveAll(tmpSrcDir)\n\t\t\t\t\tfakeSocket.Close()\n\t\t\t\t})\n\n\t\t\t\tit(\"silently ignores the socket\", func() {\n\t\t\t\t\tfh, err := os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\t\terr = archive.WriteDirToTar(tw, tmpSrcDir, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, false, nil)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer file.Close()\n\n\t\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\t\tverify.NextFile(\n\t\t\t\t\t\t\"/nested/dir/dir-in-archive/fake-file\",\n\t\t\t\t\t\t\"some-content\",\n\t\t\t\t\t\t0777,\n\t\t\t\t\t)\n\t\t\t\t\tverify.NoMoreFilesExist()\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"hard link files are present\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsrc = filepath.Join(\"testdata\", \"dir-to-tar-with-hardlink\")\n\t\t\t\t// create a hard link\n\t\t\t\terr := os.Link(filepath.Join(src, \"original-file\"), filepath.Join(src, \"original-file-2\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\tos.RemoveAll(filepath.Join(src, \"original-file-2\"))\n\t\t\t})\n\n\t\t\tit(\"tar file doesn't include duplicated data\", func() {\n\t\t\t\toutputFilename := filepath.Join(tmpDir, \"file-with-hard-links.tar\")\n\t\t\t\tfh, err := os.Create(outputFilename)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\t\t\t\terr = archive.WriteDirToTar(tw, src, \"/nested/dir\", 1234, 2345, 0777, true, false, nil)\n\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, 
fh.Close())\n\t\t\t\th.AssertOnTarEntries(t, outputFilename,\n\t\t\t\t\t\"/nested/dir/original-file\",\n\t\t\t\t\t\"/nested/dir/original-file-2\",\n\t\t\t\t\th.AreEquivalentHardLinks(),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#WriteZipToTar\", func() {\n\t\tvar src string\n\t\tit.Before(func() {\n\t\t\tsrc = filepath.Join(\"testdata\", \"zip-to-tar.zip\")\n\t\t})\n\n\t\twhen(\"mode is set to 0777\", func() {\n\t\t\tit(\"writes a tar to the dest dir with 0777\", func() {\n\t\t\t\tfh, err := os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteZipToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer file.Close()\n\n\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", 0777)\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", 0777)\n\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"mode is set to -1\", func() {\n\t\t\tit(\"writes a tar to the dest dir with preexisting file mode\", func() {\n\t\t\t\tfh, err := os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteZipToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, -1, true, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer file.Close()\n\n\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\tverify := h.NewTarVerifier(t, 
tr, 1234, 2345)\n\t\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", 0644)\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", 0755)\n\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t})\n\n\t\t\twhen(\"files are compressed in fat (MSDOS) format\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsrc = filepath.Join(\"testdata\", \"fat-zip-to-tar.zip\")\n\t\t\t\t})\n\n\t\t\t\tit(\"writes a tar to the dest dir with 0777\", func() {\n\t\t\t\t\tfh, err := os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\t\terr = archive.WriteZipToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, -1, true, nil)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer file.Close()\n\n\t\t\t\t\ttr := tar.NewReader(file)\n\n\t\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\t\tverify.NextFile(\"/nested/dir/dir-in-archive/some-file.txt\", \"some-content\", 0777)\n\t\t\t\t\tverify.NoMoreFilesExist()\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"has file filter\", func() {\n\t\t\tit(\"follows it when adding files\", func() {\n\t\t\t\tfh, err := os.Create(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteZipToTar(tw, src, \"/nested/dir/dir-in-archive\", 1234, 2345, 0777, true, func(path string) bool {\n\t\t\t\t\treturn !strings.Contains(path, \"some-file.txt\")\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\tfile, err := os.Open(filepath.Join(tmpDir, \"some.tar\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer file.Close()\n\n\t\t\t\ttr := 
tar.NewReader(file)\n\n\t\t\t\tverify := h.NewTarVerifier(t, tr, 1234, 2345)\n\t\t\t\tverify.NextDirectory(\"/nested/dir/dir-in-archive/sub-dir\", 0777)\n\t\t\t\tverify.NextSymLink(\"/nested/dir/dir-in-archive/sub-dir/link-file\", \"../some-file.txt\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"normalize mod time is false\", func() {\n\t\t\tit(\"does not normalize mod times\", func() {\n\t\t\t\ttarFile := filepath.Join(tmpDir, \"some.tar\")\n\t\t\t\tfh, err := os.Create(tarFile)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteZipToTar(tw, src, \"/foo\", 1234, 2345, 0777, false, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\th.AssertOnTarEntry(t, tarFile, \"/foo/some-file.txt\",\n\t\t\t\t\th.DoesNotHaveModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"normalize mod time is true\", func() {\n\t\t\tit(\"normalizes mod times\", func() {\n\t\t\t\ttarFile := filepath.Join(tmpDir, \"some.tar\")\n\t\t\t\tfh, err := os.Create(tarFile)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttw := tar.NewWriter(fh)\n\n\t\t\t\terr = archive.WriteZipToTar(tw, src, \"/foo\", 1234, 2345, 0777, true, nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, tw.Close())\n\t\t\t\th.AssertNil(t, fh.Close())\n\n\t\t\t\th.AssertOnTarEntry(t, tarFile, \"/foo/some-file.txt\",\n\t\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#IsZip\", func() {\n\t\twhen(\"file is a zip file\", func() {\n\t\t\tit(\"returns true\", func() {\n\t\t\t\tpath := filepath.Join(\"testdata\", \"zip-to-tar.zip\")\n\t\t\t\tisZip, err := archive.IsZip(path)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertTrue(t, isZip)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"file is a jar file\", func() {\n\t\t\tit(\"returns true\", func() {\n\t\t\t\tpath := filepath.Join(\"testdata\", \"jar-file.jar\")\n\t\t\t\tisZip, err := archive.IsZip(path)\n\t\t\t\th.AssertNil(t, 
err)\n\t\t\t\th.AssertTrue(t, isZip)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"file is not a zip file\", func() {\n\t\t\twhen(\"file has some content\", func() {\n\t\t\t\tit(\"returns false\", func() {\n\t\t\t\t\tfile, err := os.CreateTemp(tmpDir, \"file.txt\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer file.Close()\n\n\t\t\t\t\terr = os.WriteFile(file.Name(), []byte(\"content\"), os.ModePerm)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tisZip, err := archive.IsZip(file.Name())\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertFalse(t, isZip)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"file doesn't have content\", func() {\n\t\t\t\tit(\"returns false\", func() {\n\t\t\t\t\tfile, err := os.CreateTemp(tmpDir, \"file.txt\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer file.Close()\n\n\t\t\t\t\tisZip, err := archive.IsZip(file.Name())\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertFalse(t, isZip)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc fileMode(t *testing.T, path string) int64 {\n\tt.Helper()\n\tinfo, err := os.Stat(path)\n\tif err != nil {\n\t\tt.Fatalf(\"failed to stat %s\", path)\n\t}\n\tmode := int64(info.Mode() & os.ModePerm)\n\treturn mode\n}\n"
  },
  {
    "path": "pkg/archive/archive_unix.go",
    "content": "//go:build unix\n\npackage archive\n\nimport (\n\t\"os\"\n\t\"syscall\"\n)\n\n// hasHardlinks check if the given files has a hard-link associated with it\nfunc hasHardlinks(fi os.FileInfo, path string) (bool, error) {\n\treturn fi.Sys().(*syscall.Stat_t).Nlink > 1, nil\n}\n\n// getInodeFromStat returns the inode (index node) value associated with the given file\nfunc getInodeFromStat(stat interface{}, path string) (inode uint64, err error) {\n\ts, ok := stat.(*syscall.Stat_t)\n\tif ok {\n\t\tinode = s.Ino\n\t}\n\treturn\n}\n"
  },
  {
    "path": "pkg/archive/archive_windows.go",
    "content": "//go:build windows\n\npackage archive\n\nimport (\n\t\"os\"\n\t\"syscall\"\n\n\t\"golang.org/x/sys/windows\"\n)\n\n// hasHardlinks returns true if the given file has hard-links associated with it\nfunc hasHardlinks(fi os.FileInfo, path string) (bool, error) {\n\tvar numberOfLinks uint32\n\tswitch v := fi.Sys().(type) {\n\tcase *syscall.ByHandleFileInformation:\n\t\tnumberOfLinks = v.NumberOfLinks\n\tdefault:\n\t\t// We need an instance of a ByHandleFileInformation to read NumberOfLinks\n\t\tinfo, err := open(path)\n\t\tif err != nil {\n\t\t\treturn false, err\n\t\t}\n\t\tnumberOfLinks = info.NumberOfLinks\n\t}\n\treturn numberOfLinks > 1, nil\n}\n\n// getInodeFromStat returns an equivalent representation of unix inode on windows based on FileIndexHigh and FileIndexLow values\nfunc getInodeFromStat(stat interface{}, path string) (inode uint64, err error) {\n\ts, ok := stat.(*syscall.ByHandleFileInformation)\n\tif ok {\n\t\tinode = (uint64(s.FileIndexHigh) << 32) | uint64(s.FileIndexLow)\n\t} else {\n\t\ts, err = open(path)\n\t\tif err == nil {\n\t\t\tinode = (uint64(s.FileIndexHigh) << 32) | uint64(s.FileIndexLow)\n\t\t}\n\t}\n\treturn\n}\n\n// open returns a ByHandleFileInformation object representation of the given file\nfunc open(path string) (*syscall.ByHandleFileInformation, error) {\n\tfPath, err := syscall.UTF16PtrFromString(path)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\thandle, err := syscall.CreateFile(\n\t\tfPath,\n\t\twindows.FILE_READ_ATTRIBUTES,\n\t\tsyscall.FILE_SHARE_READ|syscall.FILE_SHARE_WRITE|syscall.FILE_SHARE_DELETE,\n\t\tnil,\n\t\tsyscall.OPEN_EXISTING,\n\t\tsyscall.FILE_FLAG_BACKUP_SEMANTICS,\n\t\t0)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer syscall.CloseHandle(handle)\n\n\tvar info syscall.ByHandleFileInformation\n\tif err = syscall.GetFileInformationByHandle(handle, &info); err != nil {\n\t\treturn nil, err\n\t}\n\treturn &info, nil\n}\n"
  },
  {
    "path": "pkg/archive/tar_builder.go",
    "content": "package archive\n\nimport (\n\t\"archive/tar\"\n\t\"io\"\n\t\"os\"\n\t\"time\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype TarBuilder struct {\n\tfiles []fileEntry\n}\n\ntype fileEntry struct {\n\ttypeFlag byte\n\tpath     string\n\tmode     int64\n\tmodTime  time.Time\n\tcontents []byte\n}\n\nfunc (t *TarBuilder) AddFile(path string, mode int64, modTime time.Time, contents []byte) {\n\tt.files = append(t.files, fileEntry{\n\t\ttypeFlag: tar.TypeReg,\n\t\tpath:     path,\n\t\tmode:     mode,\n\t\tmodTime:  modTime,\n\t\tcontents: contents,\n\t})\n}\n\nfunc (t *TarBuilder) AddDir(path string, mode int64, modTime time.Time) {\n\tt.files = append(t.files, fileEntry{\n\t\ttypeFlag: tar.TypeDir,\n\t\tpath:     path,\n\t\tmode:     mode,\n\t\tmodTime:  modTime,\n\t})\n}\n\nfunc (t *TarBuilder) Reader(twf TarWriterFactory) io.ReadCloser {\n\tpr, pw := io.Pipe()\n\tgo func() {\n\t\tvar err error\n\t\tdefer func() {\n\t\t\tpw.CloseWithError(err)\n\t\t}()\n\t\t_, err = t.WriteTo(pw, twf)\n\t}()\n\n\treturn pr\n}\n\nfunc (t *TarBuilder) WriteToPath(path string, twf TarWriterFactory) error {\n\tfh, err := os.Create(path)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"create file for tar: %s\", style.Symbol(path))\n\t}\n\tdefer fh.Close()\n\n\t_, err = t.WriteTo(fh, twf)\n\treturn err\n}\n\nfunc (t *TarBuilder) WriteTo(w io.Writer, twf TarWriterFactory) (int64, error) {\n\tvar written int64\n\ttw := twf.NewWriter(w)\n\tdefer tw.Close()\n\n\tfor _, f := range t.files {\n\t\tif err := tw.WriteHeader(&tar.Header{\n\t\t\tTypeflag: f.typeFlag,\n\t\t\tName:     f.path,\n\t\t\tSize:     int64(len(f.contents)),\n\t\t\tMode:     f.mode,\n\t\t\tModTime:  f.modTime,\n\t\t}); err != nil {\n\t\t\treturn written, err\n\t\t}\n\n\t\tn, err := tw.Write(f.contents)\n\t\tif err != nil {\n\t\t\treturn written, err\n\t\t}\n\n\t\twritten += int64(n)\n\t}\n\n\treturn written, nil\n}\n"
  },
  {
    "path": "pkg/archive/tar_builder_test.go",
    "content": "package archive_test\n\nimport (\n\t\"archive/tar\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n)\n\nfunc TestTarBuilder(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"TarBuilder\", testTarBuilder, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testTarBuilder(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\ttmpDir     string\n\t\ttarBuilder archive.TarBuilder\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"tar-builder-test\")\n\t\th.AssertNil(t, err)\n\t\ttarBuilder = archive.TarBuilder{}\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t})\n\n\twhen(\"#AddFile\", func() {\n\t\tit(\"adds file\", func() {\n\t\t\ttarBuilder.AddFile(\"file1\", 0777, archive.NormalizedDateTime, []byte(\"file-1 content\"))\n\t\t\treader := tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\ttr := tar.NewReader(reader)\n\n\t\t\tverify := h.NewTarVerifier(t, tr, 0, 0)\n\t\t\tverify.NextFile(\"file1\", \"file-1 content\", int64(os.ModePerm))\n\t\t\tverify.NoMoreFilesExist()\n\t\t})\n\t})\n\n\twhen(\"#AddDir\", func() {\n\t\tit(\"adds dir\", func() {\n\t\t\ttarBuilder.AddDir(\"path/of/dir\", 0777, archive.NormalizedDateTime)\n\t\t\treader := tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\ttr := tar.NewReader(reader)\n\n\t\t\tverify := h.NewTarVerifier(t, tr, 0, 0)\n\t\t\tverify.NextDirectory(\"path/of/dir\", int64(os.ModePerm))\n\t\t\tverify.NoMoreFilesExist()\n\t\t})\n\t})\n\n\twhen(\"#WriteToPath\", func() {\n\t\tit(\"writes to path\", func() {\n\t\t\tpath := filepath.Join(tmpDir, \"some.txt\")\n\t\t\th.AssertNil(t, tarBuilder.WriteToPath(path, archive.DefaultTarWriterFactory()))\n\t\t})\n\n\t\tit(\"fails if dir doesn't exist\", 
func() {\n\t\t\tpath := \"dir/some.txt\"\n\t\t\th.AssertError(t, tarBuilder.WriteToPath(path, archive.DefaultTarWriterFactory()), \"create file for tar\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/archive/testdata/dir-to-tar/some-file.txt",
    "content": "some-content"
  },
  {
    "path": "pkg/archive/testdata/dir-to-tar-with-hardlink/original-file",
    "content": "foo\n"
  },
  {
    "path": "pkg/archive/umask_unix.go",
    "content": "//go:build unix\n\npackage archive\n\nimport (\n\t\"io/fs\"\n\t\"syscall\"\n)\n\nfunc init() {\n\tUmask = fs.FileMode(syscall.Umask(0))\n\tsyscall.Umask(int(Umask))\n}\n"
  },
  {
    "path": "pkg/blob/blob.go",
    "content": "package blob\n\nimport (\n\t\"bytes\"\n\t\"compress/gzip\"\n\t\"io\"\n\t\"os\"\n\n\t\"github.com/docker/docker/pkg/ioutils\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n)\n\ntype Blob interface {\n\tOpen() (io.ReadCloser, error)\n}\n\ntype blob struct {\n\tpath string\n}\n\nfunc NewBlob(path string) Blob {\n\treturn &blob{path: path}\n}\n\n// Open returns an io.ReadCloser whose contents are in tar archive format\nfunc (b blob) Open() (r io.ReadCloser, err error) {\n\tfi, err := os.Stat(b.path)\n\tif err != nil {\n\t\treturn nil, errors.Wrapf(err, \"read blob at path '%s'\", b.path)\n\t}\n\tif fi.IsDir() {\n\t\treturn archive.ReadDirAsTar(b.path, \".\", 0, 0, -1, true, false, nil), nil\n\t}\n\n\tfh, err := os.Open(b.path)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"open buildpack archive\")\n\t}\n\tdefer func() {\n\t\tif err != nil {\n\t\t\tfh.Close()\n\t\t}\n\t}()\n\n\tif ok, err := isGZip(fh); err != nil {\n\t\treturn nil, errors.Wrap(err, \"check header\")\n\t} else if !ok {\n\t\treturn fh, nil\n\t}\n\tgzr, err := gzip.NewReader(fh)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"create gzip reader\")\n\t}\n\n\trc := ioutils.NewReadCloserWrapper(gzr, func() error {\n\t\tdefer fh.Close()\n\t\treturn gzr.Close()\n\t})\n\treturn rc, nil\n}\n\nfunc isGZip(file io.ReadSeeker) (bool, error) {\n\tb := make([]byte, 3)\n\tif _, err := file.Seek(0, 0); err != nil {\n\t\treturn false, err\n\t}\n\t_, err := file.Read(b)\n\tif err != nil && err != io.EOF {\n\t\treturn false, err\n\t} else if err == io.EOF {\n\t\treturn false, nil\n\t}\n\tif _, err := file.Seek(0, 0); err != nil {\n\t\treturn false, err\n\t}\n\treturn bytes.Equal(b, []byte(\"\\x1f\\x8b\\x08\")), nil\n}\n"
  },
  {
    "path": "pkg/blob/blob_test.go",
    "content": "package blob_test\n\nimport (\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBlob(t *testing.T) {\n\tspec.Run(t, \"Buildpack\", testBlob, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBlob(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#Blob\", func() {\n\t\twhen(\"#Open\", func() {\n\t\t\tvar (\n\t\t\t\tblobDir  = filepath.Join(\"testdata\", \"blob\")\n\t\t\t\tblobPath string\n\t\t\t)\n\n\t\t\twhen(\"dir\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tblobPath = blobDir\n\t\t\t\t})\n\t\t\t\tit(\"returns a tar reader\", func() {\n\t\t\t\t\tassertBlob(t, blob.NewBlob(blobPath))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"tgz\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tblobPath = h.CreateTGZ(t, blobDir, \".\", -1)\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNil(t, os.Remove(blobPath))\n\t\t\t\t})\n\t\t\t\tit(\"returns a tar reader\", func() {\n\t\t\t\t\tassertBlob(t, blob.NewBlob(blobPath))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"tar\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tblobPath = h.CreateTAR(t, blobDir, \".\", -1)\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNil(t, os.Remove(blobPath))\n\t\t\t\t})\n\t\t\t\tit(\"returns a tar reader\", func() {\n\t\t\t\t\tassertBlob(t, blob.NewBlob(blobPath))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/blob/downloader.go",
    "content": "package blob\n\nimport (\n\t\"context\"\n\t\"crypto/sha256\"\n\t\"fmt\"\n\t\"io\"\n\t\"net/http\"\n\t\"net/url\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/mitchellh/ioprogress\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nconst (\n\tcacheDirPrefix = \"c\"\n\tcacheVersion   = \"2\"\n)\n\ntype Logger interface {\n\tDebugf(fmt string, v ...interface{})\n\tInfof(fmt string, v ...interface{})\n\tWriter() io.Writer\n}\n\ntype DownloaderOption func(d *downloader)\n\nfunc WithClient(client *http.Client) DownloaderOption {\n\treturn func(d *downloader) {\n\t\td.client = client\n\t}\n}\n\ntype Downloader interface {\n\tDownload(ctx context.Context, pathOrURI string) (Blob, error)\n}\n\ntype downloader struct {\n\tlogger       Logger\n\tbaseCacheDir string\n\tclient       *http.Client\n}\n\nfunc NewDownloader(logger Logger, baseCacheDir string, opts ...DownloaderOption) Downloader {\n\td := &downloader{\n\t\tlogger:       logger,\n\t\tbaseCacheDir: baseCacheDir,\n\t\tclient:       http.DefaultClient,\n\t}\n\n\tfor _, opt := range opts {\n\t\topt(d)\n\t}\n\n\treturn d\n}\n\nfunc (d *downloader) Download(ctx context.Context, pathOrURI string) (Blob, error) {\n\tif paths.IsURI(pathOrURI) {\n\t\tparsedURL, err := url.Parse(pathOrURI)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrapf(err, \"parsing path/uri %s\", style.Symbol(pathOrURI))\n\t\t}\n\n\t\tvar path string\n\t\tswitch parsedURL.Scheme {\n\t\tcase \"file\":\n\t\t\tpath, err = paths.URIToFilePath(pathOrURI)\n\t\tcase \"http\", \"https\":\n\t\t\tpath, err = d.handleHTTP(ctx, pathOrURI)\n\t\t\tif err != nil {\n\t\t\t\t// retry as we sometimes see `wsarecv: An existing connection was forcibly closed by the remote host.` on Windows\n\t\t\t\tpath, err = d.handleHTTP(ctx, pathOrURI)\n\t\t\t}\n\t\tdefault:\n\t\t\terr = fmt.Errorf(\"unsupported protocol %s in URI %s\", style.Symbol(parsedURL.Scheme), 
style.Symbol(pathOrURI))\n\t\t}\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\n\t\treturn &blob{path: path}, nil\n\t}\n\n\tpath := d.handleFile(pathOrURI)\n\n\treturn &blob{path: path}, nil\n}\n\nfunc (d *downloader) handleFile(path string) string {\n\tpath, err := filepath.Abs(path)\n\tif err != nil {\n\t\treturn \"\"\n\t}\n\n\treturn path\n}\n\nfunc (d *downloader) handleHTTP(ctx context.Context, uri string) (string, error) {\n\tcacheDir := d.versionedCacheDir()\n\n\tif err := os.MkdirAll(cacheDir, 0750); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tcachePath := filepath.Join(cacheDir, fmt.Sprintf(\"%x\", sha256.Sum256([]byte(uri))))\n\n\tetagFile := cachePath + \".etag\"\n\tetagExists, err := fileExists(etagFile)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tetag := \"\"\n\tif etagExists {\n\t\tbytes, err := os.ReadFile(filepath.Clean(etagFile))\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tetag = string(bytes)\n\t}\n\n\treader, etag, err := d.downloadAsStream(ctx, uri, etag)\n\tif err != nil {\n\t\treturn \"\", err\n\t} else if reader == nil {\n\t\treturn cachePath, nil\n\t}\n\tdefer reader.Close()\n\n\tfh, err := os.Create(cachePath)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"create cache path %s\", style.Symbol(cachePath))\n\t}\n\tdefer fh.Close()\n\n\t_, err = io.Copy(fh, reader)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"writing cache\")\n\t}\n\n\tif err = os.WriteFile(etagFile, []byte(etag), 0744); err != nil {\n\t\treturn \"\", errors.Wrap(err, \"writing etag\")\n\t}\n\n\treturn cachePath, nil\n}\n\nfunc (d *downloader) downloadAsStream(ctx context.Context, uri string, etag string) (io.ReadCloser, string, error) {\n\treq, err := http.NewRequest(\"GET\", uri, nil)\n\tif err != nil {\n\t\treturn nil, \"\", err\n\t}\n\treq = req.WithContext(ctx)\n\n\tif etag != \"\" {\n\t\treq.Header.Set(\"If-None-Match\", etag)\n\t}\n\n\tresp, err := d.client.Do(req) //nolint:bodyclose\n\tif err != nil {\n\t\treturn nil, 
\"\", err\n\t}\n\n\tif resp.StatusCode >= 200 && resp.StatusCode < 300 {\n\t\td.logger.Infof(\"Downloading from %s\", style.Symbol(uri))\n\t\treturn withProgress(d.logger.Writer(), resp.Body, resp.ContentLength), resp.Header.Get(\"Etag\"), nil\n\t}\n\n\tif resp.StatusCode == 304 {\n\t\td.logger.Debugf(\"Using cached version of %s\", style.Symbol(uri))\n\t\treturn nil, etag, nil\n\t}\n\n\treturn nil, \"\", fmt.Errorf(\n\t\t\"could not download from %s, code http status %s\",\n\t\tstyle.Symbol(uri), style.SymbolF(\"%d\", resp.StatusCode),\n\t)\n}\n\nfunc withProgress(writer io.Writer, rc io.ReadCloser, length int64) io.ReadCloser {\n\treturn &progressReader{\n\t\tCloser: rc,\n\t\tReader: &ioprogress.Reader{\n\t\t\tReader:   rc,\n\t\t\tSize:     length,\n\t\t\tDrawFunc: ioprogress.DrawTerminalf(writer, ioprogress.DrawTextFormatBytes),\n\t\t},\n\t}\n}\n\ntype progressReader struct {\n\t*ioprogress.Reader\n\tio.Closer\n}\n\nfunc (d *downloader) versionedCacheDir() string {\n\treturn filepath.Join(d.baseCacheDir, cacheDirPrefix+cacheVersion)\n}\n\nfunc fileExists(file string) (bool, error) {\n\t_, err := os.Stat(file)\n\tif err != nil {\n\t\tif os.IsNotExist(err) {\n\t\t\treturn false, nil\n\t\t}\n\t\treturn false, err\n\t}\n\treturn true, nil\n}\n"
  },
  {
    "path": "pkg/blob/downloader_test.go",
    "content": "package blob_test\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"io\"\n\t\"net/http\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/onsi/gomega/ghttp\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestDownloader(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Downloader\", testDownloader, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testDownloader(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#Download\", func() {\n\t\tvar (\n\t\t\tcacheDir string\n\t\t\terr      error\n\t\t\tsubject  blob.Downloader\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tcacheDir, err = os.MkdirTemp(\"\", \"cache\")\n\t\t\th.AssertNil(t, err)\n\t\t\tsubject = blob.NewDownloader(&logger{io.Discard}, cacheDir)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\th.AssertNil(t, os.RemoveAll(cacheDir))\n\t\t})\n\n\t\twhen(\"is path\", func() {\n\t\t\tvar (\n\t\t\t\trelPath string\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\trelPath = filepath.Join(\"testdata\", \"blob\")\n\t\t\t})\n\n\t\t\twhen(\"is absolute\", func() {\n\t\t\t\tit(\"return the absolute path\", func() {\n\t\t\t\t\tabsPath, err := filepath.Abs(relPath)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tb, err := subject.Download(context.TODO(), absPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tassertBlob(t, b)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"is relative\", func() {\n\t\t\t\tit(\"resolves the absolute path\", func() {\n\t\t\t\t\tb, err := subject.Download(context.TODO(), relPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tassertBlob(t, b)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"path is a file:// uri\", func() {\n\t\t\t\tit(\"resolves the absolute path\", func() {\n\t\t\t\t\tabsPath, err := 
filepath.Abs(relPath)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\turi, err := paths.FilePathToURI(absPath, \"\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tb, err := subject.Download(context.TODO(), uri)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tassertBlob(t, b)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"is uri\", func() {\n\t\t\tvar (\n\t\t\t\tserver *ghttp.Server\n\t\t\t\turi    string\n\t\t\t\ttgz    string\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\tserver = ghttp.NewServer()\n\t\t\t\turi = server.URL() + \"/downloader/somefile.tgz\"\n\n\t\t\t\ttgz = h.CreateTGZ(t, filepath.Join(\"testdata\", \"blob\"), \"./\", 0777)\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\tos.Remove(tgz)\n\t\t\t\tserver.Close()\n\t\t\t})\n\n\t\t\twhen(\"uri is valid\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tserver.AppendHandlers(func(w http.ResponseWriter, r *http.Request) {\n\t\t\t\t\t\tw.Header().Add(\"ETag\", \"A\")\n\t\t\t\t\t\thttp.ServeFile(w, r, tgz)\n\t\t\t\t\t})\n\n\t\t\t\t\tserver.AppendHandlers(func(w http.ResponseWriter, r *http.Request) {\n\t\t\t\t\t\tw.WriteHeader(304)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\tit(\"downloads from a 'http(s)://' URI\", func() {\n\t\t\t\t\tb, err := subject.Download(context.TODO(), uri)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tassertBlob(t, b)\n\t\t\t\t})\n\n\t\t\t\tit(\"uses cache from a 'http(s)://' URI tgz\", func() {\n\t\t\t\t\tb, err := subject.Download(context.TODO(), uri)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tassertBlob(t, b)\n\n\t\t\t\t\tb, err = subject.Download(context.TODO(), uri)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tassertBlob(t, b)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"uri is invalid\", func() {\n\t\t\t\twhen(\"uri file is not found\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tserver.AppendHandlers(func(w http.ResponseWriter, r *http.Request) {\n\t\t\t\t\t\t\tw.WriteHeader(404)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tserver.AppendHandlers(func(w http.ResponseWriter, r *http.Request) 
{\n\t\t\t\t\t\t\tw.WriteHeader(404)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should return error\", func() {\n\t\t\t\t\t\t_, err := subject.Download(context.TODO(), uri)\n\t\t\t\t\t\th.AssertError(t, err, \"could not download\")\n\t\t\t\t\t\th.AssertError(t, err, \"http status '404'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"uri is unsupported\", func() {\n\t\t\t\t\tit(\"should return error\", func() {\n\t\t\t\t\t\t_, err := subject.Download(context.TODO(), \"not-supported://file.tgz\")\n\t\t\t\t\t\th.AssertError(t, err, \"unsupported protocol 'not-supported'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc assertBlob(t *testing.T, b blob.Blob) {\n\tt.Helper()\n\tr, err := b.Open()\n\th.AssertNil(t, err)\n\tdefer r.Close()\n\n\t_, bytes, err := archive.ReadTarEntry(r, \"file.txt\")\n\th.AssertNil(t, err)\n\n\th.AssertEq(t, string(bytes), \"contents\")\n}\n\ntype logger struct {\n\twriter io.Writer\n}\n\nfunc (l *logger) Debugf(format string, v ...interface{}) {\n\tfmt.Fprintln(l.writer, format, v)\n}\n\nfunc (l *logger) Infof(format string, v ...interface{}) {\n\tfmt.Fprintln(l.writer, format, v)\n}\n\nfunc (l *logger) Writer() io.Writer {\n\treturn l.writer\n}\n"
  },
  {
    "path": "pkg/blob/testdata/blob/file.txt",
    "content": "contents"
  },
  {
    "path": "pkg/blob/testdata/buildpack/buildpack.toml",
    "content": "[buildpack]\n  id = \"bp.one\"\n  version = \"bp.one.version\"\n  homepage = \"http://one.buildpack\"\n\n# technically, this is invalid as a buildpack cannot have both an order and stacks\n# however, we're just testing that all the correct fields are parsed\n[[order]]\n[[order.group]]\n  id = \"bp.nested\"\n  version = \"bp.nested.version\"\n\n[[stacks]]\n  id = \"some.stack.id\"\n\n[[stacks]]\n  id = \"other.stack.id\"\n"
  },
  {
    "path": "pkg/blob/testdata/lifecycle/analyzer",
    "content": "analyzer"
  },
  {
    "path": "pkg/blob/testdata/lifecycle/builder",
    "content": "builder"
  },
  {
    "path": "pkg/blob/testdata/lifecycle/cacher",
    "content": "cacher"
  },
  {
    "path": "pkg/blob/testdata/lifecycle/detector",
    "content": "detector"
  },
  {
    "path": "pkg/blob/testdata/lifecycle/exporter",
    "content": "exporter"
  },
  {
    "path": "pkg/blob/testdata/lifecycle/launcher",
    "content": "launcher"
  },
  {
    "path": "pkg/blob/testdata/lifecycle/restorer",
    "content": "restorer"
  },
  {
    "path": "pkg/buildpack/build_module_info.go",
    "content": "package buildpack\n\nimport (\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype ModuleInfos interface {\n\tBuildModule() []dist.ModuleInfo\n}\n\ntype FlattenModuleInfos interface {\n\tFlattenModules() []ModuleInfos\n}\n\ntype flattenModules struct {\n\tmodules []ModuleInfos\n}\n\nfunc (fl *flattenModules) FlattenModules() []ModuleInfos {\n\treturn fl.modules\n}\n\ntype buildModuleInfosImpl struct {\n\tmodules []dist.ModuleInfo\n}\n\nfunc (b *buildModuleInfosImpl) BuildModule() []dist.ModuleInfo {\n\treturn b.modules\n}\n\nfunc ParseFlattenBuildModules(buildpacksID []string) (FlattenModuleInfos, error) {\n\tvar buildModuleInfos []ModuleInfos\n\tfor _, ids := range buildpacksID {\n\t\tmodules, err := parseBuildpackName(ids)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tbuildModuleInfos = append(buildModuleInfos, modules)\n\t}\n\treturn &flattenModules{modules: buildModuleInfos}, nil\n}\n\nfunc parseBuildpackName(names string) (ModuleInfos, error) {\n\tvar buildModuleInfos []dist.ModuleInfo\n\tids := strings.Split(names, \",\")\n\tfor _, id := range ids {\n\t\tif strings.Count(id, \"@\") != 1 {\n\t\t\treturn nil, errors.Errorf(\"invalid format %s; please use '<buildpack-id>@<buildpack-version>' to add buildpacks to be flattened\", id)\n\t\t}\n\t\tbpFullName := strings.Split(id, \"@\")\n\t\tidFromName := strings.TrimSpace(bpFullName[0])\n\t\tversionFromName := strings.TrimSpace(bpFullName[1])\n\t\tif idFromName == \"\" || versionFromName == \"\" {\n\t\t\treturn nil, errors.Errorf(\"invalid format %s; '<buildpack-id>' and '<buildpack-version>' must be specified\", id)\n\t\t}\n\n\t\tbpID := dist.ModuleInfo{\n\t\t\tID:      idFromName,\n\t\t\tVersion: versionFromName,\n\t\t}\n\t\tbuildModuleInfos = append(buildModuleInfos, bpID)\n\t}\n\treturn &buildModuleInfosImpl{modules: buildModuleInfos}, nil\n}\n"
  },
  {
    "path": "pkg/buildpack/build_module_info_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildModuleInfo(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"BuildModuleInfo\", testBuildModuleInfo, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildModuleInfo(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#ParseFlattenBuildModules\", func() {\n\t\twhen(\"buildpacksID have format <buildpack>@<version>\", func() {\n\t\t\tvar buildModules []string\n\t\t\twhen(\"one buildpackID is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tbuildModules = []string{\"some-buildpack@version-1\"}\n\t\t\t\t})\n\n\t\t\t\tit(\"parses successfully\", func() {\n\t\t\t\t\tflattenModuleInfos, err := buildpack.ParseFlattenBuildModules(buildModules)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNotNil(t, flattenModuleInfos)\n\t\t\t\t\th.AssertTrue(t, len(flattenModuleInfos.FlattenModules()) == 1)\n\t\t\t\t\th.AssertEq(t, flattenModuleInfos.FlattenModules()[0].BuildModule()[0].ID, \"some-buildpack\")\n\t\t\t\t\th.AssertEq(t, flattenModuleInfos.FlattenModules()[0].BuildModule()[0].Version, \"version-1\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"more than one buildpackID is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tbuildModules = []string{\"some-buildpack@version-1, another-buildpack@version-2\"}\n\t\t\t\t})\n\n\t\t\t\tit(\"parses multiple buildpackIDs\", func() {\n\t\t\t\t\tflattenModuleInfos, err := buildpack.ParseFlattenBuildModules(buildModules)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNotNil(t, flattenModuleInfos)\n\t\t\t\t\th.AssertTrue(t, len(flattenModuleInfos.FlattenModules()) == 1)\n\t\t\t\t\th.AssertTrue(t, len(flattenModuleInfos.FlattenModules()[0].BuildModule()) == 
2)\n\t\t\t\t\th.AssertEq(t, flattenModuleInfos.FlattenModules()[0].BuildModule()[0].ID, \"some-buildpack\")\n\t\t\t\t\th.AssertEq(t, flattenModuleInfos.FlattenModules()[0].BuildModule()[0].Version, \"version-1\")\n\t\t\t\t\th.AssertEq(t, flattenModuleInfos.FlattenModules()[0].BuildModule()[1].ID, \"another-buildpack\")\n\t\t\t\t\th.AssertEq(t, flattenModuleInfos.FlattenModules()[0].BuildModule()[1].Version, \"version-2\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpacksID don't have format <buildpack>@<version>\", func() {\n\t\t\twhen(\"@<version> is missing\", func() {\n\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\t_, err := buildpack.ParseFlattenBuildModules([]string{\"some-buildpack\"})\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"invalid format %s; please use '<buildpack-id>@<buildpack-version>' to add buildpacks to be flattened\", \"some-buildpack\"))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"<version> is missing\", func() {\n\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\t_, err := buildpack.ParseFlattenBuildModules([]string{\"some-buildpack@\"})\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"invalid format %s; '<buildpack-id>' and '<buildpack-version>' must be specified\", \"some-buildpack@\"))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"<buildpack> is missing\", func() {\n\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\t_, err := buildpack.ParseFlattenBuildModules([]string{\"@version-1\"})\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"invalid format %s; '<buildpack-id>' and '<buildpack-version>' must be specified\", \"@version-1\"))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"multiple @ are used\", func() {\n\t\t\t\tit(\"errors with a descriptive message\", func() {\n\t\t\t\t\t_, err := buildpack.ParseFlattenBuildModules([]string{\"some-buildpack@@version-1\"})\n\t\t\t\t\th.AssertNotNil(t, 
err)\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"invalid format %s; please use '<buildpack-id>@<buildpack-version>' to add buildpacks to be flattened\", \"some-buildpack@@version-1\"))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/buildpack/builder.go",
    "content": "package buildpack\n\nimport (\n\t\"archive/tar\"\n\t\"compress/gzip\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strconv\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/layer\"\n\tv1 \"github.com/google/go-containerregistry/pkg/v1\"\n\t\"github.com/google/go-containerregistry/pkg/v1/empty\"\n\t\"github.com/google/go-containerregistry/pkg/v1/layout\"\n\t\"github.com/google/go-containerregistry/pkg/v1/mutate\"\n\t\"github.com/google/go-containerregistry/pkg/v1/tarball\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\n\t\"github.com/buildpacks/pack/internal/stack\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype ImageFactory interface {\n\tNewImage(repoName string, local bool, target dist.Target) (imgutil.Image, error)\n}\n\ntype WorkableImage interface {\n\tSetLabel(string, string) error\n\tAddLayerWithDiffID(path, diffID string) error\n}\n\ntype layoutImage struct {\n\tv1.Image\n}\n\ntype toAdd struct {\n\ttarPath string\n\tdiffID  string\n\tmodule  BuildModule\n}\n\nfunc (i *layoutImage) SetLabel(key string, val string) error {\n\tconfigFile, err := i.ConfigFile()\n\tif err != nil {\n\t\treturn err\n\t}\n\tconfig := *configFile.Config.DeepCopy()\n\tif config.Labels == nil {\n\t\tconfig.Labels = map[string]string{}\n\t}\n\tconfig.Labels[key] = val\n\ti.Image, err = mutate.Config(i.Image, config)\n\treturn err\n}\n\nfunc (i *layoutImage) AddLayerWithDiffID(path, _ string) error {\n\ttarLayer, err := tarball.LayerFromFile(path, tarball.WithCompressionLevel(gzip.DefaultCompression))\n\tif err != nil {\n\t\treturn err\n\t}\n\ti.Image, err = mutate.AppendLayers(i.Image, tarLayer)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"add layer\")\n\t}\n\treturn nil\n}\n\ntype PackageBuilderOption func(*options) error\n\ntype options struct {\n\tflatten bool\n\texclude []string\n\tlogger  
logging.Logger\n\tfactory archive.TarWriterFactory\n}\n\ntype PackageBuilder struct {\n\tbuildpack                BuildModule\n\textension                BuildModule\n\tlogger                   logging.Logger\n\tlayerWriterFactory       archive.TarWriterFactory\n\tdependencies             ManagedCollection\n\timageFactory             ImageFactory\n\tflattenAllBuildpacks     bool\n\tflattenExcludeBuildpacks []string\n}\n\n// TODO: Rename to PackageBuilder\nfunc NewBuilder(imageFactory ImageFactory, ops ...PackageBuilderOption) *PackageBuilder {\n\topts := &options{}\n\tfor _, op := range ops {\n\t\tif err := op(opts); err != nil {\n\t\t\treturn nil\n\t\t}\n\t}\n\tmoduleManager := NewManagedCollectionV1(opts.flatten)\n\treturn &PackageBuilder{\n\t\timageFactory:             imageFactory,\n\t\tdependencies:             moduleManager,\n\t\tflattenAllBuildpacks:     opts.flatten,\n\t\tflattenExcludeBuildpacks: opts.exclude,\n\t\tlogger:                   opts.logger,\n\t\tlayerWriterFactory:       opts.factory,\n\t}\n}\n\nfunc FlattenAll() PackageBuilderOption {\n\treturn func(o *options) error {\n\t\to.flatten = true\n\t\treturn nil\n\t}\n}\n\nfunc DoNotFlatten(exclude []string) PackageBuilderOption {\n\treturn func(o *options) error {\n\t\to.flatten = true\n\t\to.exclude = exclude\n\t\treturn nil\n\t}\n}\n\nfunc WithLogger(logger logging.Logger) PackageBuilderOption {\n\treturn func(o *options) error {\n\t\to.logger = logger\n\t\treturn nil\n\t}\n}\n\nfunc WithLayerWriterFactory(factory archive.TarWriterFactory) PackageBuilderOption {\n\treturn func(o *options) error {\n\t\to.factory = factory\n\t\treturn nil\n\t}\n}\n\nfunc (b *PackageBuilder) SetBuildpack(buildpack BuildModule) {\n\tb.buildpack = buildpack\n}\nfunc (b *PackageBuilder) SetExtension(extension BuildModule) {\n\tb.extension = extension\n}\n\nfunc (b *PackageBuilder) AddDependency(buildpack BuildModule) {\n\tb.dependencies.AddModules(buildpack)\n}\n\nfunc (b *PackageBuilder) AddDependencies(main 
BuildModule, dependencies []BuildModule) {\n\tb.dependencies.AddModules(main, dependencies...)\n}\n\nfunc (b *PackageBuilder) ShouldFlatten(module BuildModule) bool {\n\treturn b.flattenAllBuildpacks || (b.dependencies.ShouldFlatten(module))\n}\n\nfunc (b *PackageBuilder) FlattenedModules() [][]BuildModule {\n\treturn b.dependencies.FlattenedModules()\n}\n\nfunc (b *PackageBuilder) AllModules() []BuildModule {\n\tall := b.dependencies.ExplodedModules()\n\tfor _, modules := range b.dependencies.FlattenedModules() {\n\t\tall = append(all, modules...)\n\t}\n\treturn all\n}\n\nfunc (b *PackageBuilder) finalizeImage(image WorkableImage, tmpDir string) error {\n\tif err := dist.SetLabel(image, MetadataLabel, &Metadata{\n\t\tModuleInfo: b.buildpack.Descriptor().Info(),\n\t\tStacks:     b.resolvedStacks(),\n\t}); err != nil {\n\t\treturn err\n\t}\n\n\tcollectionToAdd := map[string]toAdd{}\n\tvar individualBuildModules []BuildModule\n\n\t// Let's create the tarball for each flatten module\n\tif len(b.FlattenedModules()) > 0 {\n\t\tbuildModuleWriter := NewBuildModuleWriter(b.logger, b.layerWriterFactory)\n\t\texcludedModules := Set(b.flattenExcludeBuildpacks)\n\n\t\tvar (\n\t\t\tfinalTarPath string\n\t\t\terr          error\n\t\t)\n\t\tfor i, additionalModules := range b.FlattenedModules() {\n\t\t\tmodFlattenTmpDir := filepath.Join(tmpDir, fmt.Sprintf(\"buildpack-%s-flatten\", strconv.Itoa(i)))\n\t\t\tif err := os.MkdirAll(modFlattenTmpDir, os.ModePerm); err != nil {\n\t\t\t\treturn errors.Wrap(err, \"creating flatten temp dir\")\n\t\t\t}\n\n\t\t\tif b.flattenAllBuildpacks {\n\t\t\t\t// include the buildpack itself\n\t\t\t\tadditionalModules = append(additionalModules, b.buildpack)\n\t\t\t}\n\t\t\tfinalTarPath, individualBuildModules, err = buildModuleWriter.NToLayerTar(modFlattenTmpDir, fmt.Sprintf(\"buildpack-flatten-%s\", strconv.Itoa(i)), additionalModules, excludedModules)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"adding layer %s\", 
finalTarPath)\n\t\t\t}\n\n\t\t\tdiffID, err := dist.LayerDiffID(finalTarPath)\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"calculating diffID for layer %s\", finalTarPath)\n\t\t\t}\n\n\t\t\tfor _, module := range additionalModules {\n\t\t\t\tcollectionToAdd[module.Descriptor().Info().FullName()] = toAdd{\n\t\t\t\t\ttarPath: finalTarPath,\n\t\t\t\t\tdiffID:  diffID.String(),\n\t\t\t\t\tmodule:  module,\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\n\tif !b.flattenAllBuildpacks || len(b.FlattenedModules()) == 0 {\n\t\tindividualBuildModules = append(individualBuildModules, b.buildpack)\n\t}\n\n\t// Let's create the tarball for each individual module\n\tfor _, bp := range append(b.dependencies.ExplodedModules(), individualBuildModules...) {\n\t\tbpLayerTar, err := ToLayerTar(tmpDir, bp)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tdiffID, err := dist.LayerDiffID(bpLayerTar)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err,\n\t\t\t\t\"getting content hashes for buildpack %s\",\n\t\t\t\tstyle.Symbol(bp.Descriptor().Info().FullName()),\n\t\t\t)\n\t\t}\n\t\tcollectionToAdd[bp.Descriptor().Info().FullName()] = toAdd{\n\t\t\ttarPath: bpLayerTar,\n\t\t\tdiffID:  diffID.String(),\n\t\t\tmodule:  bp,\n\t\t}\n\t}\n\n\tbpLayers := dist.ModuleLayers{}\n\tdiffIDAdded := map[string]string{}\n\n\tfor key := range collectionToAdd {\n\t\tmodule := collectionToAdd[key]\n\t\tbp := module.module\n\t\taddLayer := true\n\t\tif b.ShouldFlatten(bp) {\n\t\t\tif _, ok := diffIDAdded[module.diffID]; !ok {\n\t\t\t\tdiffIDAdded[module.diffID] = module.tarPath\n\t\t\t} else {\n\t\t\t\taddLayer = false\n\t\t\t}\n\t\t}\n\t\tif addLayer {\n\t\t\tif err := image.AddLayerWithDiffID(module.tarPath, module.diffID); err != nil {\n\t\t\t\treturn errors.Wrapf(err, \"adding layer tar for buildpack %s\", style.Symbol(bp.Descriptor().Info().FullName()))\n\t\t\t}\n\t\t}\n\n\t\tdist.AddToLayersMD(bpLayers, bp.Descriptor(), module.diffID)\n\t}\n\n\tif err := dist.SetLabel(image, 
dist.BuildpackLayersLabel, bpLayers); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\nfunc (b *PackageBuilder) finalizeExtensionImage(image WorkableImage, tmpDir string) error {\n\tif err := dist.SetLabel(image, MetadataLabel, &Metadata{\n\t\tModuleInfo: b.extension.Descriptor().Info(),\n\t}); err != nil {\n\t\treturn err\n\t}\n\n\texLayers := dist.ModuleLayers{}\n\texLayerTar, err := ToLayerTar(tmpDir, b.extension)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tdiffID, err := dist.LayerDiffID(exLayerTar)\n\tif err != nil {\n\t\treturn errors.Wrapf(err,\n\t\t\t\"getting content hashes for extension %s\",\n\t\t\tstyle.Symbol(b.extension.Descriptor().Info().FullName()),\n\t\t)\n\t}\n\n\tif err := image.AddLayerWithDiffID(exLayerTar, diffID.String()); err != nil {\n\t\treturn errors.Wrapf(err, \"adding layer tar for extension %s\", style.Symbol(b.extension.Descriptor().Info().FullName()))\n\t}\n\n\tdist.AddToLayersMD(exLayers, b.extension.Descriptor(), diffID.String())\n\n\tif err := dist.SetLabel(image, dist.ExtensionLayersLabel, exLayers); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\nfunc (b *PackageBuilder) validate() error {\n\tif b.buildpack == nil && b.extension == nil {\n\t\treturn errors.New(\"buildpack or extension must be set\")\n\t}\n\n\t// we don't need to validate extensions because there are no order or stacks in extensions\n\tif b.buildpack != nil && b.extension == nil {\n\t\tif err := validateBuildpacks(b.buildpack, b.AllModules()); err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif len(b.resolvedStacks()) == 0 {\n\t\t\treturn errors.Errorf(\"no compatible stacks among provided buildpacks\")\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc (b *PackageBuilder) resolvedStacks() []dist.Stack {\n\tstacks := b.buildpack.Descriptor().Stacks()\n\tif len(stacks) == 0 && len(b.buildpack.Descriptor().Order()) == 0 {\n\t\t// For non-meta-buildpacks using targets, not stacks: assume any stack\n\t\tstacks = append(stacks, dist.Stack{ID: \"*\"})\n\t}\n\tfor _, 
bp := range b.AllModules() {\n\t\tbpd := bp.Descriptor()\n\t\tbpdStacks := bp.Descriptor().Stacks()\n\t\tif len(bpdStacks) == 0 && len(bpd.Order()) == 0 {\n\t\t\t// For non-meta-buildpacks using targets, not stacks: assume any stack\n\t\t\tbpdStacks = append(bpdStacks, dist.Stack{ID: \"*\"})\n\t\t}\n\n\t\tif len(stacks) == 0 {\n\t\t\tstacks = bpdStacks\n\t\t} else if len(bpdStacks) > 0 { // skip over \"meta-buildpacks\"\n\t\t\tstacks = stack.MergeCompatible(stacks, bpdStacks)\n\t\t}\n\t}\n\n\treturn stacks\n}\n\nfunc (b *PackageBuilder) SaveAsFile(path string, target dist.Target, labels map[string]string) error {\n\tif err := b.validate(); err != nil {\n\t\treturn err\n\t}\n\n\tlayoutImage, err := newLayoutImage(target)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"creating layout image\")\n\t}\n\n\tfor labelKey, labelValue := range labels {\n\t\terr = layoutImage.SetLabel(labelKey, labelValue)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"adding label %s=%s\", labelKey, labelValue)\n\t\t}\n\t}\n\n\ttempDirName := \"\"\n\tif b.buildpack != nil {\n\t\ttempDirName = \"package-buildpack\"\n\t} else if b.extension != nil {\n\t\ttempDirName = \"extension-buildpack\"\n\t}\n\n\ttmpDir, err := os.MkdirTemp(\"\", tempDirName)\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer os.RemoveAll(tmpDir)\n\n\tif b.buildpack != nil {\n\t\tif err := b.finalizeImage(layoutImage, tmpDir); err != nil {\n\t\t\treturn err\n\t\t}\n\t} else if b.extension != nil {\n\t\tif err := b.finalizeExtensionImage(layoutImage, tmpDir); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\tlayoutDir, err := os.MkdirTemp(tmpDir, \"oci-layout\")\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"creating oci-layout temp dir\")\n\t}\n\n\tp, err := layout.Write(layoutDir, empty.Index)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"writing index\")\n\t}\n\n\tif err := p.AppendImage(layoutImage); err != nil {\n\t\treturn errors.Wrap(err, \"writing layout\")\n\t}\n\n\toutputFile, err := 
os.Create(path)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"creating output file\")\n\t}\n\tdefer outputFile.Close()\n\n\ttw := tar.NewWriter(outputFile)\n\tdefer tw.Close()\n\n\treturn archive.WriteDirToTar(tw, layoutDir, \"/\", 0, 0, 0755, true, false, nil)\n}\n\nfunc newLayoutImage(target dist.Target) (*layoutImage, error) {\n\ti := empty.Image\n\n\tconfigFile, err := i.ConfigFile()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tconfigFile.OS = target.OS\n\tconfigFile.Architecture = target.Arch\n\ti, err = mutate.ConfigFile(i, configFile)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif target.OS == \"windows\" {\n\t\topener := func() (io.ReadCloser, error) {\n\t\t\treader, err := layer.WindowsBaseLayer()\n\t\t\treturn io.NopCloser(reader), err\n\t\t}\n\n\t\tbaseLayer, err := tarball.LayerFromOpener(opener, tarball.WithCompressionLevel(gzip.DefaultCompression))\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\n\t\ti, err = mutate.AppendLayers(i, baseLayer)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t}\n\n\treturn &layoutImage{Image: i}, nil\n}\n\nfunc (b *PackageBuilder) SaveAsImage(repoName string, publish bool, target dist.Target, labels map[string]string, additionalTags ...string) (imgutil.Image, error) {\n\tif err := b.validate(); err != nil {\n\t\treturn nil, err\n\t}\n\n\timage, err := b.imageFactory.NewImage(repoName, !publish, target)\n\tif err != nil {\n\t\treturn nil, errors.Wrapf(err, \"creating image\")\n\t}\n\n\tfor labelKey, labelValue := range labels {\n\t\terr = image.SetLabel(labelKey, labelValue)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrapf(err, \"adding label %s=%s\", labelKey, labelValue)\n\t\t}\n\t}\n\n\ttempDirName := \"\"\n\tif b.buildpack != nil {\n\t\ttempDirName = \"package-buildpack\"\n\t} else if b.extension != nil {\n\t\ttempDirName = \"extension-buildpack\"\n\t}\n\n\ttmpDir, err := os.MkdirTemp(\"\", tempDirName)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer os.RemoveAll(tmpDir)\n\tif 
b.buildpack != nil {\n\t\tif err := b.finalizeImage(image, tmpDir); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t} else if b.extension != nil {\n\t\tif err := b.finalizeExtensionImage(image, tmpDir); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t}\n\n\tif err := image.Save(additionalTags...); err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn image, nil\n}\n\nfunc validateBuildpacks(mainBP BuildModule, depBPs []BuildModule) error {\n\tdepsWithRefs := map[string][]dist.ModuleInfo{}\n\n\tfor _, bp := range depBPs {\n\t\tdepsWithRefs[bp.Descriptor().Info().FullName()] = nil\n\t}\n\n\tfor _, bp := range append([]BuildModule{mainBP}, depBPs...) { // List of everything\n\t\tbpd := bp.Descriptor()\n\t\tfor _, orderEntry := range bpd.Order() {\n\t\t\tfor _, groupEntry := range orderEntry.Group {\n\t\t\t\tbpFullName, err := groupEntry.FullNameWithVersion()\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn errors.Wrapf(\n\t\t\t\t\t\terr,\n\t\t\t\t\t\t\"buildpack %s must specify a version when referencing buildpack %s\",\n\t\t\t\t\t\tstyle.Symbol(bpd.Info().FullName()),\n\t\t\t\t\t\tstyle.Symbol(bpFullName),\n\t\t\t\t\t)\n\t\t\t\t}\n\t\t\t\tif _, ok := depsWithRefs[bpFullName]; !ok {\n\t\t\t\t\treturn errors.Errorf(\n\t\t\t\t\t\t\"buildpack %s references buildpack %s which is not present\",\n\t\t\t\t\t\tstyle.Symbol(bpd.Info().FullName()),\n\t\t\t\t\t\tstyle.Symbol(bpFullName),\n\t\t\t\t\t)\n\t\t\t\t}\n\n\t\t\t\tdepsWithRefs[bpFullName] = append(depsWithRefs[bpFullName], bpd.Info())\n\t\t\t}\n\t\t}\n\t}\n\n\tfor bp, refs := range depsWithRefs {\n\t\tif len(refs) == 0 {\n\t\t\treturn errors.Errorf(\n\t\t\t\t\"buildpack %s is not used by buildpack %s\",\n\t\t\t\tstyle.Symbol(bp),\n\t\t\t\tstyle.Symbol(mainBP.Descriptor().Info().FullName()),\n\t\t\t)\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/buildpack/builder_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"archive/tar\"\n\t\"bytes\"\n\t\"compress/gzip\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"slices\"\n\t\"testing\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/imgutil/layer\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/v1/stream\"\n\t\"github.com/heroku/color\"\n\tv1 \"github.com/opencontainers/image-spec/specs-go/v1\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPackageBuilder(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"PackageBuilder\", testPackageBuilder, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPackageBuilder(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController   *gomock.Controller\n\t\tmockImageFactory func(expectedImageOS string) *testmocks.MockImageFactory\n\t\ttmpDir           string\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\n\t\tmockImageFactory = func(expectedImageOS string) *testmocks.MockImageFactory {\n\t\t\timageFactory := testmocks.NewMockImageFactory(mockController)\n\n\t\t\tif expectedImageOS != \"\" {\n\t\t\t\tfakePackageImage := fakes.NewImage(\"some/package\", \"\", nil)\n\t\t\t\timageFactory.EXPECT().NewImage(\"some/package\", true, dist.Target{OS: expectedImageOS}).Return(fakePackageImage, nil).MaxTimes(1)\n\t\t\t}\n\n\t\t\treturn imageFactory\n\t\t}\n\n\t\tvar err error\n\t\ttmpDir, err = os.MkdirTemp(\"\", 
\"package_builder_tests\")\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNilE(t, os.RemoveAll(tmpDir))\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"validation\", func() {\n\t\tlinux := dist.Target{OS: \"linux\"}\n\t\twindows := dist.Target{OS: \"windows\"}\n\n\t\tfor _, _test := range []*struct {\n\t\t\tname            string\n\t\t\texpectedImageOS string\n\t\t\tfn              func(*buildpack.PackageBuilder) error\n\t\t}{\n\t\t\t{name: \"SaveAsImage\", expectedImageOS: \"linux\", fn: func(builder *buildpack.PackageBuilder) error {\n\t\t\t\t_, err := builder.SaveAsImage(\"some/package\", false, linux, map[string]string{})\n\t\t\t\treturn err\n\t\t\t}},\n\t\t\t{name: \"SaveAsImage\", expectedImageOS: \"windows\", fn: func(builder *buildpack.PackageBuilder) error {\n\t\t\t\t_, err := builder.SaveAsImage(\"some/package\", false, windows, map[string]string{})\n\t\t\t\treturn err\n\t\t\t}},\n\t\t\t{name: \"SaveAsFile\", expectedImageOS: \"linux\", fn: func(builder *buildpack.PackageBuilder) error {\n\t\t\t\treturn builder.SaveAsFile(path.Join(tmpDir, \"package.cnb\"), linux, map[string]string{})\n\t\t\t}},\n\t\t\t{name: \"SaveAsFile\", expectedImageOS: \"windows\", fn: func(builder *buildpack.PackageBuilder) error {\n\t\t\t\treturn builder.SaveAsFile(path.Join(tmpDir, \"package.cnb\"), windows, map[string]string{})\n\t\t\t}},\n\t\t} {\n\t\t\t// always use copies to avoid stale refs\n\t\t\ttestFn := _test.fn\n\t\t\texpectedImageOS := _test.expectedImageOS\n\t\t\ttestName := _test.name\n\t\t\t_test = nil\n\n\t\t\twhen(testName, func() {\n\t\t\t\twhen(expectedImageOS, func() {\n\t\t\t\t\twhen(\"validate buildpack\", func() {\n\t\t\t\t\t\twhen(\"buildpack not set\", func() {\n\t\t\t\t\t\t\tit(\"returns error\", func() {\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\terr := testFn(builder)\n\t\t\t\t\t\t\t\th.AssertError(t, err, \"buildpack or extension must be 
set\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"there is a buildpack not referenced\", func() {\n\t\t\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\t\t\tbp1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack\"}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(bp1)\n\n\t\t\t\t\t\t\t\tbp2, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.2.id\", Version: \"bp.2.version\"},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack\"}},\n\t\t\t\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(bp2)\n\n\t\t\t\t\t\t\t\terr = testFn(builder)\n\t\t\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'bp.2.id@bp.2.version' is not used by buildpack 'bp.1.id@bp.1.version'\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"there is a referenced buildpack from main buildpack that is not present\", func() {\n\t\t\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\t\t\tmainBP, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.present.id\", Version: 
\"bp.present.version\"}},\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.missing.id\", Version: \"bp.missing.version\"}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(mainBP)\n\n\t\t\t\t\t\t\t\tpresentBP, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.present.id\", Version: \"bp.present.version\"},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack\"}},\n\t\t\t\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(presentBP)\n\n\t\t\t\t\t\t\t\terr = testFn(builder)\n\t\t\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'bp.1.id@bp.1.version' references buildpack 'bp.missing.id@bp.missing.version' which is not present\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"there is a referenced buildpack from dependency buildpack that is not present\", func() {\n\t\t\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\t\t\tmainBP, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.present.id\", Version: \"bp.present.version\"}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(mainBP)\n\n\t\t\t\t\t\t\t\tpresentBP, err := 
ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"bp.present.id\", Version: \"bp.present.version\"},\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.missing.id\", Version: \"bp.missing.version\"}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(presentBP)\n\n\t\t\t\t\t\t\t\terr = testFn(builder)\n\t\t\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'bp.present.id@bp.present.version' references buildpack 'bp.missing.id@bp.missing.version' which is not present\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"there is a referenced buildpack from dependency buildpack that does not have proper version defined\", func() {\n\t\t\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\t\t\tmainBP, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.present.id\", Version: \"bp.present.version\"}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(mainBP)\n\n\t\t\t\t\t\t\t\tpresentBP, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"bp.present.id\", Version: \"bp.present.version\"},\n\t\t\t\t\t\t\t\t\tWithOrder: 
dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.missing.id\"}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(presentBP)\n\n\t\t\t\t\t\t\t\terr = testFn(builder)\n\t\t\t\t\t\t\t\th.AssertError(t, err, \"buildpack 'bp.present.id@bp.present.version' must specify a version when referencing buildpack 'bp.missing.id'\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"validate stacks\", func() {\n\t\t\t\t\t\twhen(\"buildpack does not define stacks\", func() {\n\t\t\t\t\t\t\tit(\"should succeed\", func() {\n\t\t\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.10\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: nil,\n\t\t\t\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(bp)\n\t\t\t\t\t\t\t\terr = testFn(builder)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"buildpack is meta-buildpack\", func() {\n\t\t\t\t\t\t\tit(\"should succeed\", func() {\n\t\t\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: nil,\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.nested.id\", Version: 
\"bp.nested.version\"}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(bp)\n\n\t\t\t\t\t\t\t\tdependency, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.nested.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.nested.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.1\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependency)\n\n\t\t\t\t\t\t\t\terr = testFn(builder)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"dependencies don't have a common stack\", func() {\n\t\t\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.2.id\", Version: \"bp.2.version\"},\n\t\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t\t}, {\n\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.3.id\", Version: \"bp.3.version\"},\n\t\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder := 
buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(bp)\n\n\t\t\t\t\t\t\t\tdependency1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.2.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.2.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.1\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.2\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependency1)\n\n\t\t\t\t\t\t\t\tdependency2, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.3.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.3.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.3\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependency2)\n\n\t\t\t\t\t\t\t\terr = testFn(builder)\n\t\t\t\t\t\t\t\th.AssertError(t, err, \"no compatible stacks among provided buildpacks\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"dependency has stacks that aren't supported by buildpack\", func() {\n\t\t\t\t\t\t\tit(\"should only support common stacks\", func() {\n\t\t\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: 
dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.2.id\", Version: \"bp.2.version\"},\n\t\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t\t}, {\n\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.3.id\", Version: \"bp.3.version\"},\n\t\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(bp)\n\n\t\t\t\t\t\t\t\tdependency1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.2.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.2.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.1\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.2\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependency1)\n\n\t\t\t\t\t\t\t\tdependency2, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.3.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.3.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.1\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependency2)\n\n\t\t\t\t\t\t\t\timg, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: expectedImageOS}, 
map[string]string{})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tmetadata := buildpack.Metadata{}\n\t\t\t\t\t\t\t\t_, err = dist.GetLabel(img, \"io.buildpacks.buildpackage.metadata\", &metadata)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, metadata.Stacks, []dist.Stack{{ID: \"stack.id.1\", Mixins: []string{\"Mixin-A\"}}})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"dependency has wildcard stacks\", func() {\n\t\t\t\t\t\t\tit(\"should support all the possible stacks\", func() {\n\t\t\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.2.id\", Version: \"bp.2.version\"},\n\t\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t\t}, {\n\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.3.id\", Version: \"bp.3.version\"},\n\t\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(bp)\n\n\t\t\t\t\t\t\t\tdependency1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.2.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.2.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: \"*\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, 
err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependency1)\n\n\t\t\t\t\t\t\t\tdependency2, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.3.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.3.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.1\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependency2)\n\n\t\t\t\t\t\t\t\timg, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: expectedImageOS}, map[string]string{})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tmetadata := buildpack.Metadata{}\n\t\t\t\t\t\t\t\t_, err = dist.GetLabel(img, \"io.buildpacks.buildpackage.metadata\", &metadata)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, metadata.Stacks, []dist.Stack{{ID: \"stack.id.1\", Mixins: []string{\"Mixin-A\"}}})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"dependency is meta-buildpack\", func() {\n\t\t\t\t\t\t\tit(\"should succeed and compute common stacks\", func() {\n\t\t\t\t\t\t\t\tbp, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: nil,\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.nested.id\", Version: \"bp.nested.version\"}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder := 
buildpack.NewBuilder(mockImageFactory(expectedImageOS))\n\t\t\t\t\t\t\t\tbuilder.SetBuildpack(bp)\n\n\t\t\t\t\t\t\t\tdependencyOrder, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.nested.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.nested.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\t\tID:      \"bp.nested.nested.id\",\n\t\t\t\t\t\t\t\t\t\t\t\tVersion: \"bp.nested.nested.version\",\n\t\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependencyOrder)\n\n\t\t\t\t\t\t\t\tdependencyNestedNested, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"bp.nested.nested.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"bp.nested.nested.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: \"stack.id.1\", Mixins: []string{\"Mixin-A\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithOrder: nil,\n\t\t\t\t\t\t\t\t}, 0644)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tbuilder.AddDependency(dependencyNestedNested)\n\n\t\t\t\t\t\t\t\timg, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: expectedImageOS}, map[string]string{})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\tmetadata := buildpack.Metadata{}\n\t\t\t\t\t\t\t\t_, err = dist.GetLabel(img, \"io.buildpacks.buildpackage.metadata\", &metadata)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, metadata.Stacks, []dist.Stack{{ID: \"stack.id.1\", Mixins: 
[]string{\"Mixin-A\"}}})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t}\n\t})\n\n\twhen(\"#SaveAsImage\", func() {\n\t\tit(\"sets metadata\", func() {\n\t\t\tbuildpack1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:          \"bp.1.id\",\n\t\t\t\t\tVersion:     \"bp.1.version\",\n\t\t\t\t\tName:        \"One\",\n\t\t\t\t\tDescription: \"some description\",\n\t\t\t\t\tHomepage:    \"https://example.com/homepage\",\n\t\t\t\t\tKeywords:    []string{\"some-keyword\"},\n\t\t\t\t\tLicenses: []dist.License{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType: \"MIT\",\n\t\t\t\t\t\t\tURI:  \"https://example.com/license\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t{ID: \"stack.id.1\"},\n\t\t\t\t\t{ID: \"stack.id.2\"},\n\t\t\t\t},\n\t\t\t\tWithOrder: nil,\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"linux\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\tvar customLabels = map[string]string{\"test.label.one\": \"1\", \"test.label.two\": \"2\"}\n\n\t\t\tpackageImage, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"linux\"}, customLabels)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tlabelData, err := packageImage.Label(\"io.buildpacks.buildpackage.metadata\")\n\t\t\th.AssertNil(t, err)\n\t\t\tvar md buildpack.Metadata\n\t\t\th.AssertNil(t, json.Unmarshal([]byte(labelData), &md))\n\n\t\t\th.AssertEq(t, md.ID, \"bp.1.id\")\n\t\t\th.AssertEq(t, md.Version, \"bp.1.version\")\n\t\t\th.AssertEq(t, len(md.Stacks), 2)\n\t\t\th.AssertEq(t, md.Stacks[0].ID, \"stack.id.1\")\n\t\t\th.AssertEq(t, md.Stacks[1].ID, \"stack.id.2\")\n\t\t\th.AssertEq(t, md.Keywords[0], \"some-keyword\")\n\t\t\th.AssertEq(t, md.Homepage, \"https://example.com/homepage\")\n\t\t\th.AssertEq(t, md.Name, \"One\")\n\t\t\th.AssertEq(t, md.Description, \"some 
description\")\n\t\t\th.AssertEq(t, md.Licenses[0].Type, \"MIT\")\n\t\t\th.AssertEq(t, md.Licenses[0].URI, \"https://example.com/license\")\n\n\t\t\tosVal, err := packageImage.OS()\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, osVal, \"linux\")\n\n\t\t\timageLabels, err := packageImage.Labels()\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, imageLabels[\"test.label.one\"], \"1\")\n\t\t\th.AssertEq(t, imageLabels[\"test.label.two\"], \"2\")\n\t\t})\n\n\t\tit(\"sets extension metadata\", func() {\n\t\t\textension1, err := ifakes.NewFakeExtension(dist.ExtensionDescriptor{\n\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:          \"ex.1.id\",\n\t\t\t\t\tVersion:     \"ex.1.version\",\n\t\t\t\t\tName:        \"One\",\n\t\t\t\t\tDescription: \"some description\",\n\t\t\t\t\tHomepage:    \"https://example.com/homepage\",\n\t\t\t\t\tKeywords:    []string{\"some-keyword\"},\n\t\t\t\t\tLicenses: []dist.License{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType: \"MIT\",\n\t\t\t\t\t\t\tURI:  \"https://example.com/license\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"linux\"))\n\t\t\tbuilder.SetExtension(extension1)\n\t\t\tpackageImage, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"linux\"}, map[string]string{})\n\t\t\th.AssertNil(t, err)\n\t\t\tlabelData, err := packageImage.Label(\"io.buildpacks.buildpackage.metadata\")\n\t\t\th.AssertNil(t, err)\n\t\t\tvar md buildpack.Metadata\n\t\t\th.AssertNil(t, json.Unmarshal([]byte(labelData), &md))\n\n\t\t\th.AssertEq(t, md.ID, \"ex.1.id\")\n\t\t\th.AssertEq(t, md.Version, \"ex.1.version\")\n\t\t\th.AssertEq(t, md.Keywords[0], \"some-keyword\")\n\t\t\th.AssertEq(t, md.Homepage, \"https://example.com/homepage\")\n\t\t\th.AssertEq(t, md.Name, \"One\")\n\t\t\th.AssertEq(t, md.Description, \"some description\")\n\t\t\th.AssertEq(t, md.Licenses[0].Type, \"MIT\")\n\t\t\th.AssertEq(t, 
md.Licenses[0].URI, \"https://example.com/license\")\n\n\t\t\tosVal, err := packageImage.OS()\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, osVal, \"linux\")\n\t\t})\n\n\t\tit(\"sets buildpack layers label\", func() {\n\t\t\tbuildpack1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1.id\", Version: \"bp.1.version\"},\n\t\t\t\tWithStacks: []dist.Stack{{ID: \"stack.id.1\"}, {ID: \"stack.id.2\"}},\n\t\t\t\tWithOrder:  nil,\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"linux\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\tpackageImage, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"linux\"}, map[string]string{})\n\t\t\th.AssertNil(t, err)\n\n\t\t\tvar bpLayers dist.ModuleLayers\n\t\t\t_, err = dist.GetLabel(packageImage, \"io.buildpacks.buildpack.layers\", &bpLayers)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbp1Info, ok1 := bpLayers[\"bp.1.id\"][\"bp.1.version\"]\n\t\t\th.AssertEq(t, ok1, true)\n\t\t\th.AssertEq(t, bp1Info.Stacks, []dist.Stack{{ID: \"stack.id.1\"}, {ID: \"stack.id.2\"}})\n\t\t})\n\n\t\tit(\"adds buildpack layers for linux\", func() {\n\t\t\tbuildpack1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1.id\", Version: \"bp.1.version\"},\n\t\t\t\tWithStacks: []dist.Stack{{ID: \"stack.id.1\"}, {ID: \"stack.id.2\"}},\n\t\t\t\tWithOrder:  nil,\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"linux\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\tpackageImage, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"linux\"}, map[string]string{})\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuildpackExists := func(name, version string) {\n\t\t\t\tt.Helper()\n\t\t\t\tdirPath := fmt.Sprintf(\"/cnb/buildpacks/%s/%s\", name, 
version)\n\t\t\t\tfakePackageImage := packageImage.(*fakes.Image)\n\t\t\t\tlayerTar, err := fakePackageImage.FindLayerWithPath(dirPath)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, dirPath,\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, dirPath+\"/bin/build\",\n\t\t\t\t\th.ContentEquals(\"build-contents\"),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0644),\n\t\t\t\t)\n\n\t\t\t\th.AssertOnTarEntry(t, layerTar, dirPath+\"/bin/detect\",\n\t\t\t\t\th.ContentEquals(\"detect-contents\"),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0644),\n\t\t\t\t)\n\t\t\t}\n\n\t\t\tbuildpackExists(\"bp.1.id\", \"bp.1.version\")\n\n\t\t\tfakePackageImage := packageImage.(*fakes.Image)\n\t\t\tosVal, err := fakePackageImage.OS()\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, osVal, \"linux\")\n\t\t})\n\n\t\tit(\"adds baselayer + buildpack layers for windows\", func() {\n\t\t\tbuildpack1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1.id\", Version: \"bp.1.version\"},\n\t\t\t\tWithStacks: []dist.Stack{{ID: \"stack.id.1\"}, {ID: \"stack.id.2\"}},\n\t\t\t\tWithOrder:  nil,\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"windows\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\t_, err = builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"windows\"}, map[string]string{})\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit(\"should report an error when custom label cannot be set\", func() {\n\t\t\tmockImageFactory = func(expectedImageOS string) *testmocks.MockImageFactory {\n\t\t\t\tvar imageWithLabelError = &imageWithLabelError{Image: fakes.NewImage(\"some/package\", \"\", nil)}\n\t\t\t\timageFactory := testmocks.NewMockImageFactory(mockController)\n\t\t\t\timageFactory.EXPECT().NewImage(\"some/package\", true, dist.Target{OS: 
expectedImageOS}).Return(imageWithLabelError, nil).MaxTimes(1)\n\t\t\t\treturn imageFactory\n\t\t\t}\n\n\t\t\tbuildpack1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:          \"bp.1.id\",\n\t\t\t\t\tVersion:     \"bp.1.version\",\n\t\t\t\t\tName:        \"One\",\n\t\t\t\t\tDescription: \"some description\",\n\t\t\t\t\tHomepage:    \"https://example.com/homepage\",\n\t\t\t\t\tKeywords:    []string{\"some-keyword\"},\n\t\t\t\t\tLicenses: []dist.License{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tType: \"MIT\",\n\t\t\t\t\t\t\tURI:  \"https://example.com/license\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t{ID: \"stack.id.1\"},\n\t\t\t\t\t{ID: \"stack.id.2\"},\n\t\t\t\t},\n\t\t\t\tWithOrder: nil,\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"linux\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\tvar customLabels = map[string]string{\"test.label.fail\": \"true\"}\n\n\t\t\t_, err = builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"linux\"}, customLabels)\n\t\t\th.AssertError(t, err, \"adding label test.label.fail=true\")\n\t\t})\n\n\t\tit(\"sets additional tags\", func() {\n\t\t\tbuildpack1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"linux\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\tpackageImage, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"linux\"}, map[string]string{}, \"additional-tag-one\", \"additional-tag-two\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\ti, ok := packageImage.(*fakes.Image)\n\t\t\th.AssertTrue(t, ok)\n\t\t\tsavedNames := i.SavedNames()\n\t\t\tslices.Sort(savedNames)\n\t\t\th.AssertEq(t, 3, len(savedNames))\n\t\t\th.AssertEq(t, \"additional-tag-one\", savedNames[0])\n\t\t\th.AssertEq(t, 
\"additional-tag-two\", savedNames[1])\n\t\t\th.AssertEq(t, \"some/package\", savedNames[2])\n\t\t})\n\n\t\twhen(\"flatten is set\", func() {\n\t\t\tvar (\n\t\t\t\tbuildpack1   buildpack.BuildModule\n\t\t\t\tbp1          buildpack.BuildModule\n\t\t\t\tcompositeBP2 buildpack.BuildModule\n\t\t\t\tbp21         buildpack.BuildModule\n\t\t\t\tbp22         buildpack.BuildModule\n\t\t\t\tcompositeBP3 buildpack.BuildModule\n\t\t\t\tbp31         buildpack.BuildModule\n\t\t\t\tlogger       logging.Logger\n\t\t\t\toutBuf       bytes.Buffer\n\t\t\t\terr          error\n\t\t\t)\n\t\t\tit.Before(func() {\n\t\t\t\tbp1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\tVersion: \"buildpack-1-version\",\n\t\t\t\t\t},\n\t\t\t\t}, 0644)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tbp21, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"buildpack-21-id\",\n\t\t\t\t\t\tVersion: \"buildpack-21-version\",\n\t\t\t\t\t},\n\t\t\t\t}, 0644)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tbp22, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"buildpack-22-id\",\n\t\t\t\t\t\tVersion: \"buildpack-22-version\",\n\t\t\t\t\t},\n\t\t\t\t}, 0644)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tbp31, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"buildpack-31-id\",\n\t\t\t\t\t\tVersion: \"buildpack-31-version\",\n\t\t\t\t\t},\n\t\t\t\t}, 0644)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tcompositeBP3, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo: 
dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"composite-buildpack-3-id\",\n\t\t\t\t\t\tVersion: \"composite-buildpack-3-version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithOrder: []dist.OrderEntry{{\n\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: bp31.Descriptor().Info(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}},\n\t\t\t\t}, 0644)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tcompositeBP2, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"composite-buildpack-2-id\",\n\t\t\t\t\t\tVersion: \"composite-buildpack-2-version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithOrder: []dist.OrderEntry{{\n\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: bp21.Descriptor().Info(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: bp22.Descriptor().Info(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: compositeBP3.Descriptor().Info(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}},\n\t\t\t\t}, 0644)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tbuildpack1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1.id\", Version: \"bp.1.version\"},\n\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"stack.id.1\"}, {ID: \"stack.id.2\"}},\n\t\t\t\t\tWithOrder: []dist.OrderEntry{{\n\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: bp1.Descriptor().Info(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: compositeBP2.Descriptor().Info(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}},\n\t\t\t\t}, 0644)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\t\t})\n\n\t\t\twhen(\"flatten all\", func() {\n\t\t\t\tvar builder *buildpack.PackageBuilder\n\n\t\t\t\twhen(\"no exclusions\", func() 
{\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tbuilder = buildpack.NewBuilder(mockImageFactory(\"linux\"),\n\t\t\t\t\t\t\tbuildpack.FlattenAll(),\n\t\t\t\t\t\t\tbuildpack.WithLogger(logger),\n\t\t\t\t\t\t\tbuildpack.WithLayerWriterFactory(archive.DefaultTarWriterFactory()))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"flatten all buildpacks\", func() {\n\t\t\t\t\t\tbuilder.SetBuildpack(buildpack1)\n\t\t\t\t\t\tbuilder.AddDependencies(bp1, nil)\n\t\t\t\t\t\tbuilder.AddDependencies(compositeBP2, []buildpack.BuildModule{bp21, bp22, compositeBP3, bp31})\n\n\t\t\t\t\t\tpackageImage, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"linux\"}, map[string]string{})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tfakePackageImage := packageImage.(*fakes.Image)\n\t\t\t\t\t\th.AssertEq(t, fakePackageImage.NumberOfAddedLayers(), 1)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"exclude buildpacks\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\texcluded := []string{bp31.Descriptor().Info().FullName()}\n\n\t\t\t\t\t\tbuilder = buildpack.NewBuilder(mockImageFactory(\"linux\"),\n\t\t\t\t\t\t\tbuildpack.DoNotFlatten(excluded),\n\t\t\t\t\t\t\tbuildpack.WithLogger(logger),\n\t\t\t\t\t\t\tbuildpack.WithLayerWriterFactory(archive.DefaultTarWriterFactory()))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates 2 layers\", func() {\n\t\t\t\t\t\tbuilder.SetBuildpack(buildpack1)\n\t\t\t\t\t\tbuilder.AddDependencies(bp1, nil)\n\t\t\t\t\t\tbuilder.AddDependencies(compositeBP2, []buildpack.BuildModule{bp21, bp22, compositeBP3, bp31})\n\n\t\t\t\t\t\tpackageImage, err := builder.SaveAsImage(\"some/package\", false, dist.Target{OS: \"linux\"}, map[string]string{})\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tfakePackageImage := packageImage.(*fakes.Image)\n\t\t\t\t\t\th.AssertEq(t, fakePackageImage.NumberOfAddedLayers(), 2)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#SaveAsFile\", func() {\n\t\tit(\"sets metadata\", func() {\n\t\t\tbuildpack1, err := 
ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1.id\", Version: \"bp.1.version\"},\n\t\t\t\tWithStacks: []dist.Stack{{ID: \"stack.id.1\"}, {ID: \"stack.id.2\"}},\n\t\t\t\tWithOrder:  nil,\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\tvar customLabels = map[string]string{\"test.label.one\": \"1\", \"test.label.two\": \"2\"}\n\n\t\t\toutputFile := filepath.Join(tmpDir, fmt.Sprintf(\"package-%s.cnb\", h.RandString(10)))\n\t\t\th.AssertNil(t, builder.SaveAsFile(outputFile, dist.Target{OS: \"linux\"}, customLabels))\n\n\t\t\twithContents := func(fn func(data []byte)) h.TarEntryAssertion {\n\t\t\t\treturn func(t *testing.T, header *tar.Header, data []byte) {\n\t\t\t\t\tfn(data)\n\t\t\t\t}\n\t\t\t}\n\n\t\t\th.AssertOnTarEntry(t, outputFile, \"/index.json\",\n\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\twithContents(func(data []byte) {\n\t\t\t\t\tindex := v1.Index{}\n\t\t\t\t\terr := json.Unmarshal(data, &index)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, len(index.Manifests), 1)\n\n\t\t\t\t\t// manifest: application/vnd.docker.distribution.manifest.v2+json\n\t\t\t\t\th.AssertOnTarEntry(t, outputFile,\n\t\t\t\t\t\t\"/blobs/sha256/\"+index.Manifests[0].Digest.Hex(),\n\t\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\t\th.IsJSON(),\n\n\t\t\t\t\t\twithContents(func(data []byte) {\n\t\t\t\t\t\t\tmanifest := v1.Manifest{}\n\t\t\t\t\t\t\terr := json.Unmarshal(data, &manifest)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t// config: application/vnd.docker.container.image.v1+json\n\t\t\t\t\t\t\th.AssertOnTarEntry(t, outputFile,\n\t\t\t\t\t\t\t\t\"/blobs/sha256/\"+manifest.Config.Digest.Hex(),\n\t\t\t\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\t\t\t\th.IsJSON(),\n\t\t\t\t\t\t\t\t// buildpackage 
metadata\n\t\t\t\t\t\t\t\th.ContentContains(`\"io.buildpacks.buildpackage.metadata\":\"{\\\"id\\\":\\\"bp.1.id\\\",\\\"version\\\":\\\"bp.1.version\\\",\\\"stacks\\\":[{\\\"id\\\":\\\"stack.id.1\\\"},{\\\"id\\\":\\\"stack.id.2\\\"}]}\"`),\n\t\t\t\t\t\t\t\t// buildpack layers metadata\n\t\t\t\t\t\t\t\th.ContentContains(`\"io.buildpacks.buildpack.layers\":\"{\\\"bp.1.id\\\":{\\\"bp.1.version\\\":{\\\"api\\\":\\\"0.2\\\",\\\"stacks\\\":[{\\\"id\\\":\\\"stack.id.1\\\"},{\\\"id\\\":\\\"stack.id.2\\\"}],\\\"layerDiffID\\\":\\\"sha256:44447e95b06b73496d1891de5afb01936e9999b97ea03dad6337d9f5610807a7\\\"}}`),\n\t\t\t\t\t\t\t\t// image os\n\t\t\t\t\t\t\t\th.ContentContains(`\"os\":\"linux\"`),\n\t\t\t\t\t\t\t\t// custom labels\n\t\t\t\t\t\t\t\th.ContentContains(`\"test.label.one\":\"1\"`),\n\t\t\t\t\t\t\t\th.ContentContains(`\"test.label.two\":\"2\"`),\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t}))\n\t\t\t\t}))\n\t\t})\n\n\t\tit(\"adds buildpack layers\", func() {\n\t\t\tbuildpack1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1.id\", Version: \"bp.1.version\"},\n\t\t\t\tWithStacks: []dist.Stack{{ID: \"stack.id.1\"}, {ID: \"stack.id.2\"}},\n\t\t\t\tWithOrder:  nil,\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\toutputFile := filepath.Join(tmpDir, fmt.Sprintf(\"package-%s.cnb\", h.RandString(10)))\n\t\t\th.AssertNil(t, builder.SaveAsFile(outputFile, dist.Target{OS: \"linux\"}, map[string]string{}))\n\n\t\t\th.AssertOnTarEntry(t, outputFile, \"/blobs\",\n\t\t\t\th.IsDirectory(),\n\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\th.HasFileMode(0755))\n\t\t\th.AssertOnTarEntry(t, outputFile, \"/blobs/sha256\",\n\t\t\t\th.IsDirectory(),\n\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\th.HasFileMode(0755))\n\n\t\t\tbpReader, err := buildpack1.Open()\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer 
bpReader.Close()\n\n\t\t\t// layer: application/vnd.docker.image.rootfs.diff.tar.gzip\n\t\t\tbuildpackLayerSHA, err := computeLayerSHA(bpReader)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertOnTarEntry(t, outputFile,\n\t\t\t\t\"/blobs/sha256/\"+buildpackLayerSHA,\n\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.IsGzipped(),\n\t\t\t\th.AssertOnNestedTar(\"/cnb/buildpacks/bp.1.id\",\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0644)),\n\t\t\t\th.AssertOnNestedTar(\"/cnb/buildpacks/bp.1.id/bp.1.version/bin/build\",\n\t\t\t\t\th.ContentEquals(\"build-contents\"),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0644)),\n\t\t\t\th.AssertOnNestedTar(\"/cnb/buildpacks/bp.1.id/bp.1.version/bin/detect\",\n\t\t\t\t\th.ContentEquals(\"detect-contents\"),\n\t\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\t\th.HasFileMode(0644)))\n\t\t})\n\n\t\tit(\"adds baselayer + buildpack layers for windows\", func() {\n\t\t\tbuildpack1, err := ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1.id\", Version: \"bp.1.version\"},\n\t\t\t\tWithStacks: []dist.Stack{{ID: \"stack.id.1\"}, {ID: \"stack.id.2\"}},\n\t\t\t\tWithOrder:  nil,\n\t\t\t}, 0644)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbuilder := buildpack.NewBuilder(mockImageFactory(\"\"))\n\t\t\tbuilder.SetBuildpack(buildpack1)\n\n\t\t\toutputFile := filepath.Join(tmpDir, fmt.Sprintf(\"package-%s.cnb\", h.RandString(10)))\n\t\t\th.AssertNil(t, builder.SaveAsFile(outputFile, dist.Target{OS: \"windows\"}, map[string]string{}))\n\n\t\t\t// Windows baselayer content is constant\n\t\t\texpectedBaseLayerReader, err := layer.WindowsBaseLayer()\n\t\t\th.AssertNil(t, err)\n\n\t\t\t// layer: application/vnd.docker.image.rootfs.diff.tar.gzip\n\t\t\texpectedBaseLayerSHA, err := computeLayerSHA(io.NopCloser(expectedBaseLayerReader))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertOnTarEntry(t, 
outputFile,\n\t\t\t\t\"/blobs/sha256/\"+expectedBaseLayerSHA,\n\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.IsGzipped(),\n\t\t\t)\n\n\t\t\tbpReader, err := buildpack1.Open()\n\t\t\th.AssertNil(t, err)\n\t\t\tdefer bpReader.Close()\n\n\t\t\tbuildpackLayerSHA, err := computeLayerSHA(bpReader)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertOnTarEntry(t, outputFile,\n\t\t\t\t\"/blobs/sha256/\"+buildpackLayerSHA,\n\t\t\t\th.HasOwnerAndGroup(0, 0),\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.IsGzipped(),\n\t\t\t)\n\t\t})\n\t})\n}\n\nfunc computeLayerSHA(reader io.ReadCloser) (string, error) {\n\tbpLayer := stream.NewLayer(reader, stream.WithCompressionLevel(gzip.DefaultCompression))\n\tcompressed, err := bpLayer.Compressed()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer compressed.Close()\n\n\tif _, err := io.Copy(io.Discard, compressed); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tdigest, err := bpLayer.Digest()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\treturn digest.Hex, nil\n}\n\ntype imageWithLabelError struct {\n\t*fakes.Image\n}\n\nfunc (i *imageWithLabelError) SetLabel(string, string) error {\n\treturn errors.New(\"Label could not be set\")\n}\n"
  },
  {
    "path": "pkg/buildpack/buildpack.go",
    "content": "package buildpack\n\nimport (\n\t\"archive/tar\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"strings\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\nconst (\n\tKindBuildpack = \"buildpack\"\n\tKindExtension = \"extension\"\n)\n\n//go:generate mockgen -package testmocks -destination ../testmocks/mock_build_module.go github.com/buildpacks/pack/pkg/buildpack BuildModule\ntype BuildModule interface {\n\t// Open returns a reader to a tar with contents structured as per the distribution spec\n\t// (currently '/cnb/buildpacks/{ID}/{version}/*', all entries with a zeroed-out\n\t// timestamp and root UID/GID).\n\tOpen() (io.ReadCloser, error)\n\tDescriptor() Descriptor\n}\n\ntype Descriptor interface {\n\tAPI() *api.Version\n\tEnsureStackSupport(stackID string, providedMixins []string, validateRunStageMixins bool) error\n\tEnsureTargetSupport(os, arch, distroName, distroVersion string) error\n\tEscapedID() string\n\tInfo() dist.ModuleInfo\n\tKind() string\n\tOrder() dist.Order\n\tStacks() []dist.Stack\n\tTargets() []dist.Target\n}\n\ntype Blob interface {\n\t// Open returns a io.ReadCloser for the contents of the Blob in tar format.\n\tOpen() (io.ReadCloser, error)\n}\n\ntype buildModule struct {\n\tdescriptor Descriptor\n\tBlob       `toml:\"-\"`\n}\n\nfunc (b *buildModule) Descriptor() Descriptor {\n\treturn b.descriptor\n}\n\n// FromBlob constructs a buildpack or extension from a blob. 
It is assumed that the buildpack\n// contents are structured as per the distribution spec (currently '/cnb/buildpacks/{ID}/{version}/*' or\n// '/cnb/extensions/{ID}/{version}/*').\nfunc FromBlob(descriptor Descriptor, blob Blob) BuildModule {\n\treturn &buildModule{\n\t\tBlob:       blob,\n\t\tdescriptor: descriptor,\n\t}\n}\n\n// FromBuildpackRootBlob constructs a buildpack from a blob. It is assumed that the buildpack contents reside at the\n// root of the blob. The constructed buildpack contents will be structured as per the distribution spec (currently\n// a tar with contents under '/cnb/buildpacks/{ID}/{version}/*').\nfunc FromBuildpackRootBlob(blob Blob, layerWriterFactory archive.TarWriterFactory, logger Logger) (BuildModule, error) {\n\tdescriptor := dist.BuildpackDescriptor{}\n\tdescriptor.WithAPI = api.MustParse(dist.AssumedBuildpackAPIVersion)\n\tundecodedKeys, err := readDescriptor(KindBuildpack, &descriptor, blob)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tif len(undecodedKeys) > 0 {\n\t\tlogger.Warnf(\"Ignoring unexpected key(s) in descriptor for buildpack %s: %s\", descriptor.EscapedID(), strings.Join(undecodedKeys, \", \"))\n\t}\n\tif err := detectPlatformSpecificValues(&descriptor, blob); err != nil {\n\t\treturn nil, err\n\t}\n\tif err := validateBuildpackDescriptor(descriptor); err != nil {\n\t\treturn nil, err\n\t}\n\treturn buildpackFrom(&descriptor, blob, layerWriterFactory)\n}\n\n// FromExtensionRootBlob constructs an extension from a blob. It is assumed that the extension contents reside at the\n// root of the blob. 
The constructed extension contents will be structured as per the distribution spec (currently\n// a tar with contents under '/cnb/extensions/{ID}/{version}/*').\nfunc FromExtensionRootBlob(blob Blob, layerWriterFactory archive.TarWriterFactory, logger Logger) (BuildModule, error) {\n\tdescriptor := dist.ExtensionDescriptor{}\n\tdescriptor.WithAPI = api.MustParse(dist.AssumedBuildpackAPIVersion)\n\tundecodedKeys, err := readDescriptor(KindExtension, &descriptor, blob)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tif len(undecodedKeys) > 0 {\n\t\tlogger.Warnf(\"Ignoring unexpected key(s) in descriptor for extension %s: %s\", descriptor.EscapedID(), strings.Join(undecodedKeys, \", \"))\n\t}\n\tif err := validateExtensionDescriptor(descriptor); err != nil {\n\t\treturn nil, err\n\t}\n\treturn buildpackFrom(&descriptor, blob, layerWriterFactory)\n}\n\nfunc readDescriptor(kind string, descriptor interface{}, blob Blob) (undecodedKeys []string, err error) {\n\trc, err := blob.Open()\n\tif err != nil {\n\t\treturn undecodedKeys, errors.Wrapf(err, \"open %s\", kind)\n\t}\n\tdefer rc.Close()\n\n\tdescriptorFile := kind + \".toml\"\n\n\t_, buf, err := archive.ReadTarEntry(rc, descriptorFile)\n\tif err != nil {\n\t\treturn undecodedKeys, errors.Wrapf(err, \"reading %s\", descriptorFile)\n\t}\n\n\tmd, err := toml.Decode(string(buf), descriptor)\n\tif err != nil {\n\t\treturn undecodedKeys, errors.Wrapf(err, \"decoding %s\", descriptorFile)\n\t}\n\n\tundecoded := md.Undecoded()\n\tfor _, k := range undecoded {\n\t\t// FIXME: we should ideally update dist.ModuleInfo to expect sbom-formats, but this breaks other tests;\n\t\t// it isn't possible to make [metadata] a decoded key because its type is undefined in the buildpack spec.\n\t\tif k.String() == \"metadata\" || strings.HasPrefix(k.String(), \"metadata.\") ||\n\t\t\tk.String() == \"buildpack.sbom-formats\" {\n\t\t\t// buildpack.toml & extension.toml can contain [metadata] which is 
arbitrary\n\t\t\tcontinue\n\t\t}\n\t\tundecodedKeys = append(undecodedKeys, k.String())\n\t}\n\n\treturn undecodedKeys, nil\n}\n\nfunc detectPlatformSpecificValues(descriptor *dist.BuildpackDescriptor, blob Blob) error {\n\tif val, err := hasFile(blob, path.Join(\"bin\", \"build\")); val {\n\t\tdescriptor.WithLinuxBuild = true\n\t} else if err != nil {\n\t\treturn err\n\t}\n\tif val, err := hasFile(blob, path.Join(\"bin\", \"build.bat\")); val {\n\t\tdescriptor.WithWindowsBuild = true\n\t} else if err != nil {\n\t\treturn err\n\t}\n\tif val, err := hasFile(blob, path.Join(\"bin\", \"build.exe\")); val {\n\t\tdescriptor.WithWindowsBuild = true\n\t} else if err != nil {\n\t\treturn err\n\t}\n\treturn nil\n}\n\nfunc hasFile(blob Blob, file string) (bool, error) {\n\trc, err := blob.Open()\n\tif err != nil {\n\t\treturn false, errors.Wrapf(err, \"open %s\", \"buildpack bin/\")\n\t}\n\tdefer rc.Close()\n\t_, _, err = archive.ReadTarEntry(rc, file)\n\treturn err == nil, nil\n}\n\nfunc buildpackFrom(descriptor Descriptor, blob Blob, layerWriterFactory archive.TarWriterFactory) (BuildModule, error) {\n\treturn &buildModule{\n\t\tdescriptor: descriptor,\n\t\tBlob: &distBlob{\n\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\treturn archive.GenerateTarWithWriter(\n\t\t\t\t\tfunc(tw archive.TarWriter) error {\n\t\t\t\t\t\treturn toDistTar(tw, descriptor, blob)\n\t\t\t\t\t},\n\t\t\t\t\tlayerWriterFactory,\n\t\t\t\t)\n\t\t\t},\n\t\t},\n\t}, nil\n}\n\ntype distBlob struct {\n\topenFn func() io.ReadCloser\n}\n\nfunc (b *distBlob) Open() (io.ReadCloser, error) {\n\treturn b.openFn(), nil\n}\n\nfunc toDistTar(tw archive.TarWriter, descriptor Descriptor, blob Blob) error {\n\tts := archive.NormalizedDateTime\n\n\tparentDir := dist.BuildpacksDir\n\tif descriptor.Kind() == KindExtension {\n\t\tparentDir = dist.ExtensionsDir\n\t}\n\n\tif err := tw.WriteHeader(&tar.Header{\n\t\tTypeflag: tar.TypeDir,\n\t\tName:     path.Join(parentDir, descriptor.EscapedID()),\n\t\tMode:     
0755,\n\t\tModTime:  ts,\n\t}); err != nil {\n\t\treturn errors.Wrapf(err, \"writing %s id dir header\", descriptor.Kind())\n\t}\n\n\tbaseTarDir := path.Join(parentDir, descriptor.EscapedID(), descriptor.Info().Version)\n\tif err := tw.WriteHeader(&tar.Header{\n\t\tTypeflag: tar.TypeDir,\n\t\tName:     baseTarDir,\n\t\tMode:     0755,\n\t\tModTime:  ts,\n\t}); err != nil {\n\t\treturn errors.Wrapf(err, \"writing %s version dir header\", descriptor.Kind())\n\t}\n\n\trc, err := blob.Open()\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"reading %s blob\", descriptor.Kind())\n\t}\n\tdefer rc.Close()\n\n\ttr := tar.NewReader(rc)\n\tfor {\n\t\theader, err := tr.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"failed to get next tar entry\")\n\t\t}\n\n\t\tarchive.NormalizeHeader(header, true)\n\t\theader.Name = path.Clean(header.Name)\n\t\tif header.Name == \".\" || header.Name == \"/\" {\n\t\t\tcontinue\n\t\t}\n\n\t\theader.Mode = calcFileMode(header)\n\t\theader.Name = path.Join(baseTarDir, header.Name)\n\n\t\tif header.Typeflag == tar.TypeLink {\n\t\t\theader.Linkname = path.Join(baseTarDir, path.Clean(header.Linkname))\n\t\t}\n\t\terr = tw.WriteHeader(header)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"failed to write header for '%s'\", header.Name)\n\t\t}\n\n\t\t_, err = io.Copy(tw, tr)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"failed to write contents to '%s'\", header.Name)\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc calcFileMode(header *tar.Header) int64 {\n\tswitch {\n\tcase header.Typeflag == tar.TypeDir:\n\t\treturn 0755\n\tcase nameOneOf(header.Name,\n\t\tpath.Join(\"bin\", \"build\"),\n\t\tpath.Join(\"bin\", \"detect\"),\n\t\tpath.Join(\"bin\", \"generate\"),\n\t):\n\t\treturn 0755\n\tcase anyExecBit(header.Mode):\n\t\treturn 0755\n\t}\n\n\treturn 0644\n}\n\nfunc nameOneOf(name string, paths ...string) bool {\n\tfor _, p := range paths {\n\t\tif name == p {\n\t\t\treturn 
true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc anyExecBit(mode int64) bool {\n\treturn mode&0111 != 0\n}\n\nfunc validateBuildpackDescriptor(bpd dist.BuildpackDescriptor) error {\n\tif bpd.Info().ID == \"\" {\n\t\treturn errors.Errorf(\"%s is required\", style.Symbol(\"buildpack.id\"))\n\t}\n\n\tif bpd.Info().Version == \"\" {\n\t\treturn errors.Errorf(\"%s is required\", style.Symbol(\"buildpack.version\"))\n\t}\n\n\tif len(bpd.Order()) >= 1 && (len(bpd.Stacks()) >= 1 || len(bpd.Targets()) >= 1) {\n\t\treturn errors.Errorf(\n\t\t\t\"buildpack %s: cannot have both %s/%s and an %s defined\",\n\t\t\tstyle.Symbol(bpd.Info().FullName()),\n\t\t\tstyle.Symbol(\"targets\"),\n\t\t\tstyle.Symbol(\"stacks\"),\n\t\t\tstyle.Symbol(\"order\"),\n\t\t)\n\t}\n\n\treturn nil\n}\n\nfunc validateExtensionDescriptor(extd dist.ExtensionDescriptor) error {\n\tif extd.Info().ID == \"\" {\n\t\treturn errors.Errorf(\"%s is required\", style.Symbol(\"extension.id\"))\n\t}\n\n\tif extd.Info().Version == \"\" {\n\t\treturn errors.Errorf(\"%s is required\", style.Symbol(\"extension.version\"))\n\t}\n\n\treturn nil\n}\n\nfunc ToLayerTar(dest string, module BuildModule) (string, error) {\n\tdescriptor := module.Descriptor()\n\tmodReader, err := module.Open()\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"opening blob\")\n\t}\n\tdefer modReader.Close()\n\n\tlayerTar := filepath.Join(dest, fmt.Sprintf(\"%s.%s.tar\", descriptor.EscapedID(), descriptor.Info().Version))\n\tfh, err := os.Create(layerTar)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"create file for tar\")\n\t}\n\tdefer fh.Close()\n\n\tif _, err := io.Copy(fh, modReader); err != nil {\n\t\treturn \"\", errors.Wrap(err, \"writing blob to tar\")\n\t}\n\n\treturn layerTar, nil\n}\n\nfunc ToNLayerTar(dest string, module BuildModule) ([]ModuleTar, error) {\n\tmodReader, err := module.Open()\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"opening blob\")\n\t}\n\tdefer modReader.Close()\n\n\ttarCollection := 
newModuleTarCollection(dest)\n\ttr := tar.NewReader(modReader)\n\n\tvar (\n\t\theader     *tar.Header\n\t\tforWindows bool\n\t)\n\n\tfor {\n\t\theader, err = tr.Next()\n\t\tif err != nil {\n\t\t\tif err == io.EOF {\n\t\t\t\treturn handleEmptyModule(dest, module)\n\t\t\t}\n\t\t\treturn nil, err\n\t\t}\n\t\tif _, err := sanitizePath(header.Name); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tif header.Name == \"Files\" {\n\t\t\tforWindows = true\n\t\t}\n\t\tif strings.Contains(header.Name, `/cnb/buildpacks/`) || strings.Contains(header.Name, `\\cnb\\buildpacks\\`) {\n\t\t\t// Only for Windows, the first four headers are:\n\t\t\t// - Files\n\t\t\t// - Hives\n\t\t\t// - Files/cnb\n\t\t\t// - Files/cnb/buildpacks\n\t\t\t// Skip over these until we find \"Files/cnb/buildpacks/<buildpack-id>\":\n\t\t\tbreak\n\t\t}\n\t}\n\t// The header should look like \"/cnb/buildpacks/<buildpack-id>\"\n\t// The version should be blank because the first header is missing <buildpack-version>.\n\torigID, origVersion := parseBpIDAndVersion(header)\n\tif origVersion != \"\" {\n\t\treturn nil, fmt.Errorf(\"first header '%s' contained unexpected version\", header.Name)\n\t}\n\n\tif err := toNLayerTar(origID, origVersion, header, tr, tarCollection, forWindows); err != nil {\n\t\treturn nil, err\n\t}\n\n\terrs := tarCollection.close()\n\tif len(errs) > 0 {\n\t\treturn nil, errors.New(\"closing files\")\n\t}\n\n\treturn tarCollection.moduleTars(), nil\n}\n\nfunc toNLayerTar(origID, origVersion string, firstHeader *tar.Header, tr *tar.Reader, tc *moduleTarCollection, forWindows bool) error {\n\ttoWrite := []*tar.Header{firstHeader}\n\tif origVersion == \"\" {\n\t\t// the first header only contains the id - e.g., /cnb/buildpacks/<buildpack-id>,\n\t\t// read the next header to get the version\n\t\tsecondHeader, err := tr.Next()\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"getting second header: %w; first header was %s\", err, firstHeader.Name)\n\t\t}\n\t\tif _, err := 
sanitizePath(secondHeader.Name); err != nil {\n\t\t\treturn err\n\t\t}\n\t\tnextID, nextVersion := parseBpIDAndVersion(secondHeader)\n\t\tif nextID != origID || nextVersion == \"\" {\n\t\t\treturn fmt.Errorf(\"second header '%s' contained unexpected id or missing version\", secondHeader.Name)\n\t\t}\n\t\torigVersion = nextVersion\n\t\ttoWrite = append(toWrite, secondHeader)\n\t} else {\n\t\t// the first header contains id and version - e.g., /cnb/buildpacks/<buildpack-id>/<buildpack-version>,\n\t\t// we need to write the parent header - e.g., /cnb/buildpacks/<buildpack-id>\n\t\trealFirstHeader := *firstHeader\n\t\trealFirstHeader.Name = filepath.ToSlash(filepath.Dir(firstHeader.Name))\n\t\ttoWrite = append([]*tar.Header{&realFirstHeader}, toWrite...)\n\t}\n\tif forWindows {\n\t\ttoWrite = append(windowsPreamble(), toWrite...)\n\t}\n\tmt, err := tc.get(origID, origVersion)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"getting module from collection: %w\", err)\n\t}\n\tfor _, h := range toWrite {\n\t\tif err := mt.writer.WriteHeader(h); err != nil {\n\t\t\treturn fmt.Errorf(\"failed to write header '%s': %w\", h.Name, err)\n\t\t}\n\t}\n\t// write the rest of the package\n\tvar header *tar.Header\n\tfor {\n\t\theader, err = tr.Next()\n\t\tif err != nil {\n\t\t\tif err == io.EOF {\n\t\t\t\treturn nil\n\t\t\t}\n\t\t\treturn fmt.Errorf(\"getting next header: %w\", err)\n\t\t}\n\t\tif _, err := sanitizePath(header.Name); err != nil {\n\t\t\treturn err\n\t\t}\n\t\tnextID, nextVersion := parseBpIDAndVersion(header)\n\t\tif nextID != origID || nextVersion != origVersion {\n\t\t\t// we found a new module, recurse\n\t\t\treturn toNLayerTar(nextID, nextVersion, header, tr, tc, forWindows)\n\t\t}\n\n\t\terr = mt.writer.WriteHeader(header)\n\t\tif err != nil {\n\t\t\treturn fmt.Errorf(\"failed to write header for '%s': %w\", header.Name, err)\n\t\t}\n\n\t\t_, err = io.Copy(mt.writer, tr)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"failed to write contents to '%s'\", 
header.Name)\n\t\t}\n\t}\n}\n\nfunc sanitizePath(path string) (string, error) {\n\tif strings.Contains(path, \"..\") {\n\t\treturn \"\", fmt.Errorf(\"path %s contains unexpected special elements\", path)\n\t}\n\treturn path, nil\n}\n\nfunc windowsPreamble() []*tar.Header {\n\treturn []*tar.Header{\n\t\t{\n\t\t\tName:     \"Files\",\n\t\t\tTypeflag: tar.TypeDir,\n\t\t},\n\t\t{\n\t\t\tName:     \"Hives\",\n\t\t\tTypeflag: tar.TypeDir,\n\t\t},\n\t\t{\n\t\t\tName:     \"Files/cnb\",\n\t\t\tTypeflag: tar.TypeDir,\n\t\t},\n\t\t{\n\t\t\tName:     \"Files/cnb/buildpacks\",\n\t\t\tTypeflag: tar.TypeDir,\n\t\t},\n\t}\n}\n\nfunc parseBpIDAndVersion(hdr *tar.Header) (id, version string) {\n\t// splitting \"/cnb/buildpacks/{ID}/{version}/*\" returns\n\t// [0] = \"\" -> first element is empty or \"Files\" in windows\n\t// [1] = \"cnb\"\n\t// [2] = \"buildpacks\"\n\t// [3] = \"{ID}\"\n\t// [4] = \"{version}\"\n\t// ...\n\tparts := strings.Split(strings.ReplaceAll(filepath.Clean(hdr.Name), `\\`, `/`), `/`)\n\tsize := len(parts)\n\tswitch {\n\tcase size < 4:\n\t\t// error\n\tcase size == 4:\n\t\tid = parts[3]\n\tcase size >= 5:\n\t\tid = parts[3]\n\t\tversion = parts[4]\n\t}\n\treturn id, version\n}\n\nfunc handleEmptyModule(dest string, module BuildModule) ([]ModuleTar, error) {\n\ttarFile, err := ToLayerTar(dest, module)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tlayerTar := &moduleTar{\n\t\tinfo: module.Descriptor().Info(),\n\t\tpath: tarFile,\n\t}\n\treturn []ModuleTar{layerTar}, nil\n}\n\n// Set returns a set of the given string slice.\nfunc Set(exclude []string) map[string]struct{} {\n\ttype void struct{}\n\tvar member void\n\tvar excludedModules = make(map[string]struct{})\n\tfor _, fullName := range exclude {\n\t\texcludedModules[fullName] = member\n\t}\n\treturn excludedModules\n}\n\ntype ModuleTar interface {\n\tInfo() dist.ModuleInfo\n\tPath() string\n}\n\ntype moduleTar struct {\n\tinfo   dist.ModuleInfo\n\tpath   string\n\twriter archive.TarWriter\n}\n\nfunc (t 
*moduleTar) Info() dist.ModuleInfo {\n\treturn t.info\n}\n\nfunc (t *moduleTar) Path() string {\n\treturn t.path\n}\n\nfunc newModuleTar(dest, id, version string) (moduleTar, error) {\n\tlayerTar := filepath.Join(dest, fmt.Sprintf(\"%s.%s.tar\", id, version))\n\tfh, err := os.Create(layerTar)\n\tif err != nil {\n\t\treturn moduleTar{}, errors.Wrapf(err, \"creating file at path %s\", layerTar)\n\t}\n\treturn moduleTar{\n\t\tinfo: dist.ModuleInfo{\n\t\t\tID:      id,\n\t\t\tVersion: version,\n\t\t},\n\t\tpath:   layerTar,\n\t\twriter: tar.NewWriter(fh),\n\t}, nil\n}\n\ntype moduleTarCollection struct {\n\trootPath string\n\tmodules  map[string]moduleTar\n}\n\nfunc newModuleTarCollection(rootPath string) *moduleTarCollection {\n\treturn &moduleTarCollection{\n\t\trootPath: rootPath,\n\t\tmodules:  map[string]moduleTar{},\n\t}\n}\n\nfunc (m *moduleTarCollection) get(id, version string) (moduleTar, error) {\n\tkey := fmt.Sprintf(\"%s@%s\", id, version)\n\tif _, ok := m.modules[key]; !ok {\n\t\tmodule, err := newModuleTar(m.rootPath, id, version)\n\t\tif err != nil {\n\t\t\treturn moduleTar{}, err\n\t\t}\n\t\tm.modules[key] = module\n\t}\n\treturn m.modules[key], nil\n}\n\nfunc (m *moduleTarCollection) moduleTars() []ModuleTar {\n\tvar modulesTar []ModuleTar\n\tfor _, v := range m.modules {\n\t\tv := v\n\t\tvv := &v\n\t\tmodulesTar = append(modulesTar, vv)\n\t}\n\treturn modulesTar\n}\n\nfunc (m *moduleTarCollection) close() []error {\n\tvar errors []error\n\tfor _, v := range m.modules {\n\t\terr := v.writer.Close()\n\t\tif err != nil {\n\t\t\terrors = append(errors, err)\n\t\t}\n\t}\n\treturn errors\n}\n"
  },
  {
    "path": "pkg/buildpack/buildpack_tar_writer.go",
    "content": "package buildpack\n\nimport (\n\t\"archive/tar\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path\"\n\t\"path/filepath\"\n\t\"strings\"\n\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\ntype BuildModuleWriter struct {\n\tlogger  logging.Logger\n\tfactory archive.TarWriterFactory\n}\n\n// NewBuildModuleWriter creates a BuildModule writer\nfunc NewBuildModuleWriter(logger logging.Logger, factory archive.TarWriterFactory) *BuildModuleWriter {\n\treturn &BuildModuleWriter{\n\t\tlogger:  logger,\n\t\tfactory: factory,\n\t}\n}\n\n// NToLayerTar creates a tar file containing the all the Buildpacks given, but excluding the ones which FullName() is\n// in the exclude list. It returns the path to the tar file, the list of Buildpacks that were excluded, and any error\nfunc (b *BuildModuleWriter) NToLayerTar(tarPath, filename string, modules []BuildModule, exclude map[string]struct{}) (string, []BuildModule, error) {\n\tlayerTar := filepath.Join(tarPath, fmt.Sprintf(\"%s.tar\", filename))\n\ttarFile, err := os.Create(layerTar)\n\tb.logger.Debugf(\"creating file %s\", style.Symbol(layerTar))\n\tif err != nil {\n\t\treturn \"\", nil, errors.Wrap(err, \"create file for tar\")\n\t}\n\n\tdefer tarFile.Close()\n\ttw := b.factory.NewWriter(tarFile)\n\tdefer tw.Close()\n\n\tparentFolderAdded := map[string]bool{}\n\tduplicated := map[string]bool{}\n\n\tvar buildModuleExcluded []BuildModule\n\tfor _, module := range modules {\n\t\tif _, ok := exclude[module.Descriptor().Info().FullName()]; !ok {\n\t\t\tif !duplicated[module.Descriptor().Info().FullName()] {\n\t\t\t\tduplicated[module.Descriptor().Info().FullName()] = true\n\t\t\t\tb.logger.Debugf(\"adding %s\", style.Symbol(module.Descriptor().Info().FullName()))\n\n\t\t\t\tif err := b.writeBuildModuleToTar(tw, module, &parentFolderAdded); err != nil 
{\n\t\t\t\t\treturn \"\", nil, errors.Wrapf(err, \"adding %s\", style.Symbol(module.Descriptor().Info().FullName()))\n\t\t\t\t}\n\t\t\t\trootPath := processRootPath(module)\n\t\t\t\tif !parentFolderAdded[rootPath] {\n\t\t\t\t\tparentFolderAdded[rootPath] = true\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\tb.logger.Debugf(\"skipping %s, it was already added\", style.Symbol(module.Descriptor().Info().FullName()))\n\t\t\t}\n\t\t} else {\n\t\t\tb.logger.Debugf(\"excluding %s from being flattened\", style.Symbol(module.Descriptor().Info().FullName()))\n\t\t\tbuildModuleExcluded = append(buildModuleExcluded, module)\n\t\t}\n\t}\n\n\tb.logger.Debugf(\"%s was created successfully\", style.Symbol(layerTar))\n\treturn layerTar, buildModuleExcluded, nil\n}\n\n// writeBuildModuleToTar writes the content of the given tar file into the writer, skipping the folders that were already added\nfunc (b *BuildModuleWriter) writeBuildModuleToTar(tw archive.TarWriter, module BuildModule, parentFolderAdded *map[string]bool) error {\n\tvar (\n\t\trc  io.ReadCloser\n\t\terr error\n\t)\n\n\tif rc, err = module.Open(); err != nil {\n\t\treturn err\n\t}\n\tdefer rc.Close()\n\n\ttr := tar.NewReader(rc)\n\n\tfor {\n\t\theader, err := tr.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tif err != nil {\n\t\t\treturn errors.Wrap(err, \"failed to get next tar entry\")\n\t\t}\n\n\t\tif (*parentFolderAdded)[header.Name] {\n\t\t\tb.logger.Debugf(\"folder %s was already added, skipping it\", style.Symbol(header.Name))\n\t\t\tcontinue\n\t\t}\n\n\t\terr = tw.WriteHeader(header)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"failed to write header for '%s'\", header.Name)\n\t\t}\n\n\t\t_, err = io.Copy(tw, tr)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"failed to write contents to '%s'\", header.Name)\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc processRootPath(module BuildModule) string {\n\tvar bpFolder string\n\tswitch module.Descriptor().Kind() {\n\tcase 
buildpack.KindBuildpack:\n\t\tbpFolder = \"buildpacks\"\n\tcase buildpack.KindExtension:\n\t\tbpFolder = \"extensions\"\n\tdefault:\n\t\tbpFolder = \"buildpacks\"\n\t}\n\tbpInfo := module.Descriptor().Info()\n\trootPath := path.Join(\"/cnb\", bpFolder, strings.ReplaceAll(bpInfo.ID, \"/\", \"_\"))\n\treturn rootPath\n}\n"
  },
  {
    "path": "pkg/buildpack/buildpack_tar_writer_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildModuleWriter(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testBuildModuleWriter\", testBuildModuleWriter, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\ntype void struct{}\n\nfunc testBuildModuleWriter(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\toutBuf            bytes.Buffer\n\t\tlogger            logging.Logger\n\t\tbuildModuleWriter *buildpack.BuildModuleWriter\n\t\tbp1v1             buildpack.BuildModule\n\t\tbp1v2             buildpack.BuildModule\n\t\tbp2v1             buildpack.BuildModule\n\t\tbp3v1             buildpack.BuildModule\n\t\tmember            void\n\t\ttmpDir            string\n\t\terr               error\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t\tbuildModuleWriter = buildpack.NewBuildModuleWriter(logger, archive.DefaultTarWriterFactory())\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"test_build_module_writer\")\n\t\th.AssertNil(t, err)\n\n\t\tbp1v1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t},\n\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\tID: \"*\",\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbp1v2, err = 
ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\tVersion: \"buildpack-1-version-2\",\n\t\t\t},\n\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\tID: \"*\",\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbp2v1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-2-id\",\n\t\t\t\tVersion: \"buildpack-2-version-1\",\n\t\t\t},\n\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\tID: \"*\",\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbp3v1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-3-id\",\n\t\t\t\tVersion: \"buildpack-3-version-1\",\n\t\t\t},\n\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\tID: \"*\",\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\terr := os.RemoveAll(tmpDir)\n\t\th.AssertNil(t, err)\n\t})\n\n\twhen(\"#NToLayerTar\", func() {\n\t\twhen(\"there are no excluded buildpacks\", func() {\n\t\t\twhen(\"there are no duplicated buildpacks\", func() {\n\t\t\t\tit(\"creates a tar\", func() {\n\t\t\t\t\tbpModules := []buildpack.BuildModule{bp1v1, bp2v1, bp3v1}\n\t\t\t\t\ttarFile, bpExcluded, err := buildModuleWriter.NToLayerTar(tmpDir, \"test-file-1\", bpModules, nil)\n\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertTrue(t, len(bpExcluded) == 0)\n\t\t\t\t\th.AssertNotNil(t, tarFile)\n\t\t\t\t\tassertBuildpackModuleWritten(t, tarFile, bpModules)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"there are duplicated buildpacks\", func() {\n\t\t\t\tit(\"creates a tar skipping root folder from duplicated buildpacks\", func() {\n\t\t\t\t\tbpModules := []buildpack.BuildModule{bp1v1, bp1v2, bp2v1, bp3v1}\n\t\t\t\t\ttarFile, bpExcluded, err := buildModuleWriter.NToLayerTar(tmpDir, \"test-file-2\", 
bpModules, nil)\n\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertTrue(t, len(bpExcluded) == 0)\n\t\t\t\t\th.AssertNotNil(t, tarFile)\n\t\t\t\t\tassertBuildpackModuleWritten(t, tarFile, bpModules)\n\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"folder '%s' was already added, skipping it\", \"/cnb/buildpacks/buildpack-1-id\"))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there are excluded buildpacks\", func() {\n\t\t\texclude := make(map[string]struct{})\n\t\t\tit.Before(func() {\n\t\t\t\texclude[bp2v1.Descriptor().Info().FullName()] = member\n\t\t\t})\n\n\t\t\twhen(\"there are no duplicated buildpacks\", func() {\n\t\t\t\tit(\"creates a tar skipping excluded buildpacks\", func() {\n\t\t\t\t\tbpModules := []buildpack.BuildModule{bp1v1, bp2v1, bp3v1}\n\t\t\t\t\ttarFile, bpExcluded, err := buildModuleWriter.NToLayerTar(tmpDir, \"test-file-3\", bpModules, exclude)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertTrue(t, len(bpExcluded) == 1)\n\t\t\t\t\th.AssertNotNil(t, tarFile)\n\t\t\t\t\tassertBuildpackModuleWritten(t, tarFile, []buildpack.BuildModule{bp1v1, bp3v1})\n\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"excluding %s from being flattened\", style.Symbol(bp2v1.Descriptor().Info().FullName())))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"there are duplicated buildpacks\", func() {\n\t\t\t\tit(\"creates a tar skipping excluded buildpacks and root folder from duplicated buildpacks\", func() {\n\t\t\t\t\tbpModules := []buildpack.BuildModule{bp1v1, bp1v2, bp2v1, bp3v1}\n\t\t\t\t\ttarFile, bpExcluded, err := buildModuleWriter.NToLayerTar(tmpDir, \"test-file-4\", bpModules, exclude)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertTrue(t, len(bpExcluded) == 1)\n\t\t\t\t\th.AssertNotNil(t, tarFile)\n\t\t\t\t\tassertBuildpackModuleWritten(t, tarFile, []buildpack.BuildModule{bp1v1, bp1v2, bp3v1})\n\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"folder '%s' was already added, skipping it\", 
\"/cnb/buildpacks/buildpack-1-id\"))\n\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"excluding %s from being flattened\", style.Symbol(bp2v1.Descriptor().Info().FullName())))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc assertBuildpackModuleWritten(t *testing.T, path string, modules []buildpack.BuildModule) {\n\tt.Helper()\n\tfor _, module := range modules {\n\t\tdirPath := fmt.Sprintf(\"/cnb/buildpacks/%s/%s\", module.Descriptor().Info().ID, module.Descriptor().Info().Version)\n\t\th.AssertOnTarEntry(t, path, dirPath,\n\t\t\th.IsDirectory(),\n\t\t)\n\t}\n}\n"
  },
  {
    "path": "pkg/buildpack/buildpack_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildpack(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"buildpack\", testBuildpack, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildpack(t *testing.T, when spec.G, it spec.S) {\n\tvar writeBlobToFile = func(bp buildpack.BuildModule) string {\n\t\tt.Helper()\n\n\t\tbpReader, err := bp.Open()\n\t\th.AssertNil(t, err)\n\n\t\ttmpDir, err := os.MkdirTemp(\"\", \"\")\n\t\th.AssertNil(t, err)\n\n\t\tp := filepath.Join(tmpDir, \"bp.tar\")\n\t\tbpWriter, err := os.Create(p)\n\t\th.AssertNil(t, err)\n\n\t\t_, err = io.Copy(bpWriter, bpReader)\n\t\th.AssertNil(t, err)\n\n\t\terr = bpReader.Close()\n\t\th.AssertNil(t, err)\n\n\t\treturn p\n\t}\n\n\twhen(\"#BuildpackFromRootBlob\", func() {\n\t\tit(\"parses the descriptor file\", func() {\n\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://geocities.com/cool-bp\"\n\n[[stacks]]\nid = \"some.stack.id\"\n`))\n\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t},\n\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, 
err)\n\n\t\t\th.AssertEq(t, bp.Descriptor().API().String(), \"0.3\")\n\t\t\th.AssertEq(t, bp.Descriptor().Info().ID, \"bp.one\")\n\t\t\th.AssertEq(t, bp.Descriptor().Info().Version, \"1.2.3\")\n\t\t\th.AssertEq(t, bp.Descriptor().Info().Homepage, \"http://geocities.com/cool-bp\")\n\t\t\th.AssertEq(t, bp.Descriptor().Stacks()[0].ID, \"some.stack.id\")\n\t\t})\n\n\t\tit(\"translates blob to distribution format\", func() {\n\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\n\n[[stacks]]\nid = \"some.stack.id\"\n`))\n\n\t\t\t\t\ttarBuilder.AddDir(\"bin\", 0700, time.Now())\n\t\t\t\t\ttarBuilder.AddFile(\"bin/detect\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\ttarBuilder.AddFile(\"bin/build\", 0700, time.Now(), []byte(\"build-contents\"))\n\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t},\n\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertNil(t, bp.Descriptor().EnsureTargetSupport(dist.DefaultTargetOSLinux, dist.DefaultTargetArch, \"\", \"\"))\n\n\t\t\ttarPath := writeBlobToFile(bp)\n\t\t\tdefer os.Remove(tarPath)\n\n\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\"/cnb/buildpacks/bp.one\",\n\t\t\t\th.IsDirectory(),\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t)\n\n\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3\",\n\t\t\t\th.IsDirectory(),\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t)\n\n\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/bin\",\n\t\t\t\th.IsDirectory(),\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t)\n\n\t\t\th.AssertOnTarEntry(t, 
tarPath,\n\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/bin/detect\",\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\th.ContentEquals(\"detect-contents\"),\n\t\t\t)\n\n\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/bin/build\",\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\th.ContentEquals(\"build-contents\"),\n\t\t\t)\n\t\t})\n\n\t\tit(\"translates blob to windows bat distribution format\", func() {\n\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.9\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\n`))\n\n\t\t\t\t\ttarBuilder.AddDir(\"bin\", 0700, time.Now())\n\t\t\t\t\ttarBuilder.AddFile(\"bin/detect\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\ttarBuilder.AddFile(\"bin/build.bat\", 0700, time.Now(), []byte(\"build-contents\"))\n\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t},\n\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbpDescriptor := bp.Descriptor().(*dist.BuildpackDescriptor)\n\t\t\th.AssertTrue(t, bpDescriptor.WithWindowsBuild)\n\t\t\th.AssertFalse(t, bpDescriptor.WithLinuxBuild)\n\n\t\t\ttarPath := writeBlobToFile(bp)\n\t\t\tdefer os.Remove(tarPath)\n\n\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/bin/build.bat\",\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\th.ContentEquals(\"build-contents\"),\n\t\t\t)\n\t\t})\n\n\t\tit(\"translates blob to windows exe distribution format\", func() {\n\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), 
[]byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\n`))\n\n\t\t\t\t\ttarBuilder.AddDir(\"bin\", 0700, time.Now())\n\t\t\t\t\ttarBuilder.AddFile(\"bin/detect\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\ttarBuilder.AddFile(\"bin/build.exe\", 0700, time.Now(), []byte(\"build-contents\"))\n\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t},\n\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbpDescriptor := bp.Descriptor().(*dist.BuildpackDescriptor)\n\t\t\th.AssertTrue(t, bpDescriptor.WithWindowsBuild)\n\t\t\th.AssertFalse(t, bpDescriptor.WithLinuxBuild)\n\n\t\t\ttarPath := writeBlobToFile(bp)\n\t\t\tdefer os.Remove(tarPath)\n\n\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/bin/build.exe\",\n\t\t\t\th.HasFileMode(0755),\n\t\t\t\th.HasModTime(archive.NormalizedDateTime),\n\t\t\t\th.ContentEquals(\"build-contents\"),\n\t\t\t)\n\t\t})\n\n\t\tit(\"surfaces errors encountered while reading blob\", func() {\n\t\t\trealBlob := &readerBlob{\n\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\n\n[[stacks]]\nid = \"some.stack.id\"\n`))\n\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&errorBlob{\n\t\t\t\trealBlob: realBlob,\n\t\t\t\tlimit:    4,\n\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tbpReader, err := bp.Open()\n\t\t\th.AssertNil(t, err)\n\n\t\t\t_, err = io.Copy(io.Discard, bpReader)\n\t\t\th.AssertError(t, err, \"error from errBlob (reached limit of 4)\")\n\t\t})\n\n\t\twhen(\"calculating permissions\", func() {\n\t\t\tbpTOMLData := `\napi = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\n\n[[stacks]]\nid = 
\"some.stack.id\"\n`\n\n\t\t\twhen(\"no exec bits set\", func() {\n\t\t\t\tit(\"sets to 0755 if directory\", func() {\n\t\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(bpTOMLData))\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"some-dir\", 0600, time.Now())\n\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t},\n\t\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\ttarPath := writeBlobToFile(bp)\n\t\t\t\t\tdefer os.Remove(tarPath)\n\n\t\t\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/some-dir\",\n\t\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"no exec bits set\", func() {\n\t\t\t\tit(\"sets to 0755 if 'bin/detect' or 'bin/build'\", func() {\n\t\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(bpTOMLData))\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"bin/detect\", 0600, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"bin/build\", 0600, time.Now(), []byte(\"build-contents\"))\n\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t},\n\t\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tbpDescriptor := bp.Descriptor().(*dist.BuildpackDescriptor)\n\t\t\t\t\th.AssertFalse(t, bpDescriptor.WithWindowsBuild)\n\t\t\t\t\th.AssertTrue(t, bpDescriptor.WithLinuxBuild)\n\n\t\t\t\t\ttarPath := writeBlobToFile(bp)\n\t\t\t\t\tdefer os.Remove(tarPath)\n\n\t\t\t\t\th.AssertOnTarEntry(t, 
tarPath,\n\t\t\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/bin/detect\",\n\t\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\t)\n\n\t\t\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/bin/build\",\n\t\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"not directory, 'bin/detect', or 'bin/build'\", func() {\n\t\t\t\tit(\"sets to 0755 if ANY exec bit is set\", func() {\n\t\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(bpTOMLData))\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"some-file\", 0700, time.Now(), []byte(\"some-data\"))\n\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t},\n\t\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\ttarPath := writeBlobToFile(bp)\n\t\t\t\t\tdefer os.Remove(tarPath)\n\n\t\t\t\t\th.AssertOnTarEntry(t, tarPath,\n\t\t\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/some-file\",\n\t\t\t\t\t\th.HasFileMode(0755),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"not directory, 'bin/detect', or 'bin/build'\", func() {\n\t\t\t\tit(\"sets to 0644 if NO exec bits set\", func() {\n\t\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(bpTOMLData))\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"some-file\", 0600, time.Now(), []byte(\"some-data\"))\n\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t},\n\t\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\ttarPath := writeBlobToFile(bp)\n\t\t\t\t\tdefer os.Remove(tarPath)\n\n\t\t\t\t\th.AssertOnTarEntry(t, 
tarPath,\n\t\t\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/some-file\",\n\t\t\t\t\t\th.HasFileMode(0644),\n\t\t\t\t\t)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there is no descriptor file\", func() {\n\t\t\tit(\"returns error\", func() {\n\t\t\t\t_, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t},\n\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertError(t, err, \"could not find entry path 'buildpack.toml'\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there is no api field\", func() {\n\t\t\tit(\"assumes an api version\", func() {\n\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\n\n[[stacks]]\nid = \"some.stack.id\"`))\n\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t},\n\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, bp.Descriptor().API().String(), \"0.1\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there is no id\", func() {\n\t\t\tit(\"returns error\", func() {\n\t\t\t\t_, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\n[buildpack]\nid = \"\"\nversion = \"1.2.3\"\n\n[[stacks]]\nid = \"some.stack.id\"`))\n\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t},\n\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertError(t, err, \"'buildpack.id' is required\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there is no version\", func() {\n\t\t\tit(\"returns error\", func() 
{\n\t\t\t\t_, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\n[buildpack]\nid = \"bp.one\"\nversion = \"\"\n\n[[stacks]]\nid = \"some.stack.id\"`))\n\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t},\n\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertError(t, err, \"'buildpack.version' is required\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"both stacks and order are present\", func() {\n\t\t\tit(\"returns error\", func() {\n\t\t\t\t_, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\n\n[[stacks]]\nid = \"some.stack.id\"\n\n[[order]]\n[[order.group]]\n  id = \"bp.nested\"\n  version = \"bp.nested.version\"\n`))\n\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t},\n\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertError(t, err, \"cannot have both 'targets'/'stacks' and an 'order' defined\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"missing stacks and order\", func() {\n\t\t\tit(\"does not return an error\", func() {\n\t\t\t\t_, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\n`))\n\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t},\n\t\t\t\t}, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"hardlink is present\", func() {\n\t\t\tvar bpRootFolder string\n\n\t\t\tit.Before(func() 
{\n\t\t\t\tbpRootFolder = filepath.Join(\"testdata\", \"buildpack-with-hardlink\")\n\t\t\t\t// create a hard link\n\t\t\t\terr := os.Link(filepath.Join(bpRootFolder, \"original-file\"), filepath.Join(bpRootFolder, \"original-file-2\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\tos.RemoveAll(filepath.Join(bpRootFolder, \"original-file-2\"))\n\t\t\t})\n\n\t\t\tit(\"hardlink is preserved in the output tar file\", func() {\n\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(blob.NewBlob(bpRootFolder), archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\ttarPath := writeBlobToFile(bp)\n\t\t\t\tdefer os.Remove(tarPath)\n\n\t\t\t\th.AssertOnTarEntries(t, tarPath,\n\t\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/original-file\",\n\t\t\t\t\t\"/cnb/buildpacks/bp.one/1.2.3/original-file-2\",\n\t\t\t\t\th.AreEquivalentHardLinks(),\n\t\t\t\t)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there are unexpected keys in the descriptor file\", func() {\n\t\t\tit(\"warns about them\", func() {\n\t\t\t\toutBuf := bytes.Buffer{}\n\t\t\t\tlogger := logging.NewLogWithWriters(&outBuf, &outBuf)\n\t\t\t\t_, err := buildpack.FromBuildpackRootBlob(&readerBlob{\n\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\ttarBuilder.AddFile(\"buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://geocities.com/cool-bp\"\nsbom-formats = [\"this should not warn\"]\nclear-env = true\n\n[[targets]]\nos = \"some-os\"\narch = \"some-arch\"\nvariant = \"some-arch-variant\"\n[[targets.distributions]]\nname = \"some-distro-name\"\nversion = \"some-distro-version\"\n[[targets.distros]]\nname = \"some-distro-name\"\nversions = [\"some-distro-version\"]\n\n[metadata]\nthis-key = \"is totally allowed and should not warn\"\n`))\n\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t},\n\t\t\t\t}, archive.DefaultTarWriterFactory(), 
logger)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, outBuf.String(), \"Warning: Ignoring unexpected key(s) in descriptor for buildpack bp.one: targets.distributions, targets.distributions.name, targets.distributions.version, targets.distros.versions\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Match\", func() {\n\t\tit(\"compares, using only the id and version\", func() {\n\t\t\tother := dist.ModuleInfo{\n\t\t\t\tID:          \"same\",\n\t\t\t\tVersion:     \"1.2.3\",\n\t\t\t\tDescription: \"something else\",\n\t\t\t\tHomepage:    \"something else\",\n\t\t\t\tKeywords:    []string{\"something\", \"else\"},\n\t\t\t\tLicenses: []dist.License{\n\t\t\t\t\t{\n\t\t\t\t\t\tType: \"MIT\",\n\t\t\t\t\t\tURI:  \"https://example.com\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tself := dist.ModuleInfo{\n\t\t\t\tID:      \"same\",\n\t\t\t\tVersion: \"1.2.3\",\n\t\t\t}\n\n\t\t\tmatch := self.Match(other)\n\n\t\t\th.AssertEq(t, match, true)\n\n\t\t\tself.ID = \"different\"\n\t\t\tmatch = self.Match(other)\n\n\t\t\th.AssertEq(t, match, false)\n\t\t})\n\t})\n\n\twhen(\"#Set\", func() {\n\t\tit(\"creates a set\", func() {\n\t\t\tvalues := []string{\"a\", \"b\", \"c\", \"a\"}\n\t\t\tset := buildpack.Set(values)\n\t\t\th.AssertEq(t, len(set), 3)\n\t\t})\n\t})\n\n\twhen(\"#ToNLayerTar\", func() {\n\t\tvar (\n\t\t\ttmpDir     string\n\t\t\texpectedBP []expectedBuildpack\n\t\t\terr        error\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"\")\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\terr := os.RemoveAll(tmpDir)\n\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\t// avoid \"The process cannot access the file because it is being used by another process\"\n\t\t\t\t// error on Windows\n\t\t\t\th.AssertNil(t, err)\n\t\t\t}\n\t\t})\n\n\t\twhen(\"BuildModule contains only an individual buildpack (default)\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\texpectedBP = []expectedBuildpack{\n\t\t\t\t\t{\n\t\t\t\t\t\tid:      
\"buildpack-1-id\",\n\t\t\t\t\t\tversion: \"buildpack-1-version-1\",\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"returns 1 tar file\", func() {\n\t\t\t\tbp := buildpack.FromBlob(\n\t\t\t\t\t&dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\tName:    \"buildpack-1\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t&readerBlob{\n\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\n\t\t\t\t\t\t\t// Buildpack 1\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"buildpack-1-id\"\nversion = \"buildpack-1-version-1\"\n\n`))\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin/detect\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin/build\", 0700, time.Now(), []byte(\"build-contents\"))\n\n\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\ttarPaths, err := buildpack.ToNLayerTar(tmpDir, bp)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(tarPaths), 1)\n\t\t\t\tassertBuildpacksToTar(t, tarPaths, expectedBP)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"BuildModule contains N flattened buildpacks\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\texpectedBP = []expectedBuildpack{\n\t\t\t\t\t{\n\t\t\t\t\t\tid:      
\"buildpack-1-id\",\n\t\t\t\t\t\tversion: \"buildpack-1-version-1\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tid:      \"buildpack-2-id\",\n\t\t\t\t\t\tversion: \"buildpack-2-version-1\",\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t})\n\t\t\twhen(\"not running on windows\", func() {\n\t\t\t\tit(\"returns N tar files\", func() {\n\t\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"\")\n\t\t\t\t\tbp := buildpack.FromBlob(\n\t\t\t\t\t\t&dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\t\tName:    \"buildpack-1\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t&readerBlob{\n\t\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\n\t\t\t\t\t\t\t\t// Buildpack 1\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"buildpack-1-id\"\nversion = \"buildpack-1-version-1\"\n\n`))\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin/detect\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin/build\", 0700, time.Now(), []byte(\"build-contents\"))\n\n\t\t\t\t\t\t\t\t// Buildpack 2\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-2-id\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1\", 0700, 
time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1/buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"buildpack-2-id\"\nversion = \"buildpack-2-version-1\"\n\n`))\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1/bin\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1/bin/detect\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1/bin/build\", 0700, time.Now(), []byte(\"build-contents\"))\n\n\t\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\n\t\t\t\t\ttarPaths, err := buildpack.ToNLayerTar(tmpDir, bp)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, len(tarPaths), 2)\n\t\t\t\t\tassertBuildpacksToTar(t, tarPaths, expectedBP)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"running on windows\", func() {\n\t\t\t\tit(\"returns N tar files\", func() {\n\t\t\t\t\th.SkipIf(t, runtime.GOOS != \"windows\", \"\")\n\t\t\t\t\tbp := buildpack.FromBlob(\n\t\t\t\t\t\t&dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\t\tName:    \"buildpack-1\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\t&readerBlob{\n\t\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\t\t\t\t\t\t\t\t// Windows tar format\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Hives\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files/cnb\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files/cnb/buildpacks\", 0700, time.Now())\n\n\t\t\t\t\t\t\t\t// Buildpack 
1\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files/cnb/buildpacks/buildpack-1-id\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"Files/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"buildpack-1-id\"\nversion = \"buildpack-1-version-1\"\n\n`))\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"Files/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin/detect.bat\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"Files/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin/build.bat\", 0700, time.Now(), []byte(\"build-contents\"))\n\n\t\t\t\t\t\t\t\t// Buildpack 2\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files/cnb/buildpacks/buildpack-2-id\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"Files/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1/buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"buildpack-2-id\"\nversion = \"buildpack-2-version-1\"\n\n`))\n\t\t\t\t\t\t\t\ttarBuilder.AddDir(\"Files/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1/bin\", 0700, time.Now())\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"Files/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1/bin/detect.bat\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\t\t\t\ttarBuilder.AddFile(\"Files/cnb/buildpacks/buildpack-2-id/buildpack-2-version-1/bin/build.bat\", 0700, time.Now(), []byte(\"build-contents\"))\n\n\t\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\n\t\t\t\t\ttarPaths, err := 
buildpack.ToNLayerTar(tmpDir, bp)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, len(tarPaths), 2)\n\t\t\t\t\tassertWindowsBuildpacksToTar(t, tarPaths, expectedBP)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"BuildModule contains buildpacks with same ID but different versions\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\texpectedBP = []expectedBuildpack{\n\t\t\t\t\t{\n\t\t\t\t\t\tid:      \"buildpack-1-id\",\n\t\t\t\t\t\tversion: \"buildpack-1-version-1\",\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tid:      \"buildpack-1-id\",\n\t\t\t\t\t\tversion: \"buildpack-1-version-2\",\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"returns N tar files, one per version\", func() {\n\t\t\t\tbp := buildpack.FromBlob(\n\t\t\t\t\t&dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\tName:    \"buildpack-1\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t&readerBlob{\n\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\n\t\t\t\t\t\t\t// Buildpack 1\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"buildpack-1-id\"\nversion = \"buildpack-1-version-1\"\n\n`))\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin/detect\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/bin/build\", 0700, time.Now(), 
[]byte(\"build-contents\"))\n\n\t\t\t\t\t\t\t// Same buildpack ID as before, but with a different version\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-2\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-2/buildpack.toml\", 0700, time.Now(), []byte(`\napi = \"0.3\"\n\n[buildpack]\nid = \"buildpack-1-id\"\nversion = \"buildpack-1-version-2\"\n\n`))\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-2/bin\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-2/bin/detect\", 0700, time.Now(), []byte(\"detect-contents\"))\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-2/bin/build\", 0700, time.Now(), []byte(\"build-contents\"))\n\n\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\ttarPaths, err := buildpack.ToNLayerTar(tmpDir, bp)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(tarPaths), 2)\n\t\t\t\tassertBuildpacksToTar(t, tarPaths, expectedBP)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"BuildModule could not be read\", func() {\n\t\t\tit(\"surfaces errors encountered while reading blob\", func() {\n\t\t\t\t_, err = buildpack.ToNLayerTar(tmpDir, &errorBuildModule{})\n\t\t\t\th.AssertError(t, err, \"opening blob\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"BuildModule is empty\", func() {\n\t\t\tit(\"returns a path to an empty tarball\", func() {\n\t\t\t\tbp := buildpack.FromBlob(\n\t\t\t\t\t&dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\tName:    \"buildpack-1\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t&readerBlob{\n\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\treturn 
io.NopCloser(strings.NewReader(\"\"))\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\ttarPaths, err := buildpack.ToNLayerTar(tmpDir, bp)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(tarPaths), 1)\n\t\t\t\th.AssertNotNil(t, tarPaths[0].Path())\n\t\t\t})\n\t\t})\n\n\t\twhen(\"BuildModule contains unexpected elements in the tarball file\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\texpectedBP = []expectedBuildpack{\n\t\t\t\t\t{\n\t\t\t\t\t\tid:      \"buildpack-1-id\",\n\t\t\t\t\t\tversion: \"buildpack-1-version-1\",\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\tbp := buildpack.FromBlob(\n\t\t\t\t\t&dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\t\t\t\tVersion: \"buildpack-1-version-1\",\n\t\t\t\t\t\t\tName:    \"buildpack-1\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t&readerBlob{\n\t\t\t\t\t\topenFn: func() io.ReadCloser {\n\t\t\t\t\t\t\ttarBuilder := archive.TarBuilder{}\n\n\t\t\t\t\t\t\t// Buildpack 1\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddDir(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1\", 0700, time.Now())\n\t\t\t\t\t\t\ttarBuilder.AddFile(\"/cnb/buildpacks/buildpack-1-id/buildpack-1-version-1/../hack\", 0700, time.Now(), []byte(\"harmful content\"))\n\t\t\t\t\t\t\treturn tarBuilder.Reader(archive.DefaultTarWriterFactory())\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t)\n\n\t\t\t\t_, err = buildpack.ToNLayerTar(tmpDir, bp)\n\t\t\t\th.AssertError(t, err, \"contains unexpected special elements\")\n\t\t\t})\n\t\t})\n\t})\n}\n\ntype errorBlob struct {\n\tcount    int\n\tlimit    int\n\trealBlob buildpack.Blob\n}\n\nfunc (e *errorBlob) Open() (io.ReadCloser, error) {\n\tif e.count < e.limit {\n\t\te.count++\n\t\treturn e.realBlob.Open()\n\t}\n\treturn nil, fmt.Errorf(\"error from errBlob (reached limit of %d)\", 
e.limit)\n}\n\ntype readerBlob struct {\n\topenFn func() io.ReadCloser\n}\n\nfunc (r *readerBlob) Open() (io.ReadCloser, error) {\n\treturn r.openFn(), nil\n}\n\ntype errorBuildModule struct {\n}\n\nfunc (eb *errorBuildModule) Open() (io.ReadCloser, error) {\n\treturn nil, errors.New(\"something happened opening the build module\")\n}\n\nfunc (eb *errorBuildModule) Descriptor() buildpack.Descriptor {\n\treturn nil\n}\n\ntype expectedBuildpack struct {\n\tid      string\n\tversion string\n}\n\nfunc assertBuildpacksToTar(t *testing.T, actual []buildpack.ModuleTar, expected []expectedBuildpack) {\n\tt.Helper()\n\tfor _, expectedBP := range expected {\n\t\tfound := false\n\t\tfor _, moduleTar := range actual {\n\t\t\tif expectedBP.id == moduleTar.Info().ID && expectedBP.version == moduleTar.Info().Version {\n\t\t\t\tfound = true\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"/cnb/buildpacks/%s\", expectedBP.id),\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"/cnb/buildpacks/%s/%s\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"/cnb/buildpacks/%s/%s/bin\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"/cnb/buildpacks/%s/%s/bin/build\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.HasFileMode(0700),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"/cnb/buildpacks/%s/%s/bin/detect\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.HasFileMode(0700),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"/cnb/buildpacks/%s/%s/buildpack.toml\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.HasFileMode(0700),\n\t\t\t\t)\n\t\t\t\tbreak\n\t\t\t}\n\t\t}\n\t\th.AssertTrue(t, found)\n\t}\n}\n\nfunc assertWindowsBuildpacksToTar(t *testing.T, actual []buildpack.ModuleTar, 
expected []expectedBuildpack) {\n\tt.Helper()\n\tfor _, expectedBP := range expected {\n\t\tfound := false\n\t\tfor _, moduleTar := range actual {\n\t\t\tif expectedBP.id == moduleTar.Info().ID && expectedBP.version == moduleTar.Info().Version {\n\t\t\t\tfound = true\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"Files/cnb/buildpacks/%s\", expectedBP.id),\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"Files/cnb/buildpacks/%s/%s\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"Files/cnb/buildpacks/%s/%s/bin\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.IsDirectory(),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"Files/cnb/buildpacks/%s/%s/bin/build.bat\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.HasFileMode(0700),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"Files/cnb/buildpacks/%s/%s/bin/detect.bat\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.HasFileMode(0700),\n\t\t\t\t)\n\t\t\t\th.AssertOnTarEntry(t, moduleTar.Path(), fmt.Sprintf(\"Files/cnb/buildpacks/%s/%s/buildpack.toml\", expectedBP.id, expectedBP.version),\n\t\t\t\t\th.HasFileMode(0700),\n\t\t\t\t)\n\t\t\t\tbreak\n\t\t\t}\n\t\t}\n\t\th.AssertTrue(t, found)\n\t}\n}\n"
  },
  {
    "path": "pkg/buildpack/buildpackage.go",
    "content": "package buildpack\n\nimport (\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\n// TODO: Move to dist\nconst MetadataLabel = \"io.buildpacks.buildpackage.metadata\"\n\ntype Metadata struct {\n\tdist.ModuleInfo\n\tStacks []dist.Stack `toml:\"stacks\" json:\"stacks\"`\n}\n"
  },
  {
    "path": "pkg/buildpack/downloader.go",
    "content": "package buildpack\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/imgutil\"\n\n\t\"github.com/buildpacks/pack/internal/layer\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype Logger interface {\n\tDebug(msg string)\n\tDebugf(fmt string, v ...interface{})\n\tInfo(msg string)\n\tInfof(fmt string, v ...interface{})\n\tWarn(msg string)\n\tWarnf(fmt string, v ...interface{})\n\tError(msg string)\n\tErrorf(fmt string, v ...interface{})\n}\n\ntype ImageFetcher interface {\n\tFetch(ctx context.Context, name string, options image.FetchOptions) (imgutil.Image, error)\n\tCheckReadAccess(repo string, options image.FetchOptions) bool\n}\n\ntype Downloader interface {\n\tDownload(ctx context.Context, pathOrURI string) (blob.Blob, error)\n}\n\n//go:generate mockgen -package testmocks -destination ../testmocks/mock_registry_resolver.go github.com/buildpacks/pack/pkg/buildpack RegistryResolver\n\ntype RegistryResolver interface {\n\tResolve(registryName, bpURI string) (string, error)\n}\n\ntype buildpackDownloader struct {\n\tlogger           Logger\n\timageFetcher     ImageFetcher\n\tdownloader       Downloader\n\tregistryResolver RegistryResolver\n}\n\nfunc NewDownloader(logger Logger, imageFetcher ImageFetcher, downloader Downloader, registryResolver RegistryResolver) *buildpackDownloader { //nolint:revive,gosimple\n\treturn &buildpackDownloader{\n\t\tlogger:           logger,\n\t\timageFetcher:     imageFetcher,\n\t\tdownloader:       downloader,\n\t\tregistryResolver: registryResolver,\n\t}\n}\n\ntype DownloadOptions struct {\n\t// Buildpack registry name. 
Defines where all registry buildpacks will be pulled from.\n\tRegistryName string\n\n\t// The base directory to use to resolve relative assets\n\tRelativeBaseDir string\n\n\t// Deprecated: the older alternative to specify the OS to download; use Target instead\n\tImageOS string\n\n\t// Deprecated: the older alternative to buildpack URI\n\tImageName string\n\n\t// The kind of module to download (valid values: \"buildpack\", \"extension\"). Defaults to \"buildpack\".\n\tModuleKind string\n\n\tDaemon bool\n\n\tPullPolicy image.PullPolicy\n\n\t// The OS/Architecture/Variant to download.\n\tTarget *dist.Target\n}\n\nfunc (c *buildpackDownloader) Download(ctx context.Context, moduleURI string, opts DownloadOptions) (BuildModule, []BuildModule, error) {\n\tkind := KindBuildpack\n\tif opts.ModuleKind == KindExtension {\n\t\tkind = KindExtension\n\t}\n\n\tvar err error\n\tvar locatorType LocatorType\n\tif moduleURI == \"\" && opts.ImageName != \"\" {\n\t\tc.logger.Warn(\"The 'image' key is deprecated. 
Use 'uri=\\\"docker://...\\\"' instead.\")\n\t\tmoduleURI = opts.ImageName\n\t\tlocatorType = PackageLocator\n\t} else {\n\t\tlocatorType, err = GetLocatorType(moduleURI, opts.RelativeBaseDir, []dist.ModuleInfo{})\n\t\tif err != nil {\n\t\t\treturn nil, nil, err\n\t\t}\n\t}\n\tvar mainBP BuildModule\n\tvar depBPs []BuildModule\n\tswitch locatorType {\n\tcase PackageLocator:\n\t\timageName := ParsePackageLocator(moduleURI)\n\t\tc.logger.Debugf(\"Downloading %s from image: %s\", kind, style.Symbol(imageName))\n\t\tmainBP, depBPs, err = extractPackaged(ctx, kind, imageName, c.imageFetcher, image.FetchOptions{\n\t\t\tDaemon:     opts.Daemon,\n\t\t\tPullPolicy: opts.PullPolicy,\n\t\t\tTarget:     opts.Target,\n\t\t})\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrapf(err, \"extracting from registry %s\", style.Symbol(moduleURI))\n\t\t}\n\tcase RegistryLocator:\n\t\tc.logger.Debugf(\"Downloading %s from registry: %s\", kind, style.Symbol(moduleURI))\n\t\taddress, err := c.registryResolver.Resolve(opts.RegistryName, moduleURI)\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrapf(err, \"locating in registry: %s\", style.Symbol(moduleURI))\n\t\t}\n\n\t\tmainBP, depBPs, err = extractPackaged(ctx, kind, address, c.imageFetcher, image.FetchOptions{\n\t\t\tDaemon:     opts.Daemon,\n\t\t\tPullPolicy: opts.PullPolicy,\n\t\t\tTarget:     opts.Target,\n\t\t})\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrapf(err, \"extracting from registry %s\", style.Symbol(moduleURI))\n\t\t}\n\tcase URILocator:\n\t\tmoduleURI, err = paths.FilePathToURI(moduleURI, opts.RelativeBaseDir)\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrapf(err, \"making absolute: %s\", style.Symbol(moduleURI))\n\t\t}\n\n\t\tc.logger.Debugf(\"Downloading %s from URI: %s\", kind, style.Symbol(moduleURI))\n\n\t\tblob, err := c.downloader.Download(ctx, moduleURI)\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrapf(err, \"downloading %s from %s\", kind, 
style.Symbol(moduleURI))\n\t\t}\n\n\t\timageOS := opts.ImageOS\n\t\tif opts.Target != nil {\n\t\t\timageOS = opts.Target.OS\n\t\t}\n\t\tmainBP, depBPs, err = decomposeBlob(blob, kind, imageOS, c.logger)\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrapf(err, \"extracting from %s\", style.Symbol(moduleURI))\n\t\t}\n\tdefault:\n\t\treturn nil, nil, fmt.Errorf(\"error reading %s: invalid locator: %s\", moduleURI, locatorType)\n\t}\n\treturn mainBP, depBPs, nil\n}\n\n// decomposeBlob decomposes a buildpack or extension blob into the main module (order buildpack or extension) and\n// (for buildpack blobs) its dependent buildpacks.\nfunc decomposeBlob(blob blob.Blob, kind string, imageOS string, logger Logger) (mainModule BuildModule, depModules []BuildModule, err error) {\n\tisOCILayout, err := IsOCILayoutBlob(blob)\n\tif err != nil {\n\t\treturn mainModule, depModules, errors.Wrapf(err, \"inspecting %s blob\", kind)\n\t}\n\n\tif isOCILayout {\n\t\tmainModule, depModules, err = fromOCILayoutBlob(blob, kind)\n\t\tif err != nil {\n\t\t\treturn mainModule, depModules, errors.Wrapf(err, \"extracting %ss\", kind)\n\t\t}\n\t} else {\n\t\tlayerWriterFactory, err := layer.NewWriterFactory(imageOS)\n\t\tif err != nil {\n\t\t\treturn mainModule, depModules, errors.Wrapf(err, \"get tar writer factory for OS %s\", style.Symbol(imageOS))\n\t\t}\n\n\t\tif kind == KindExtension {\n\t\t\tmainModule, err = FromExtensionRootBlob(blob, layerWriterFactory, logger)\n\t\t} else {\n\t\t\tmainModule, err = FromBuildpackRootBlob(blob, layerWriterFactory, logger)\n\t\t}\n\t\tif err != nil {\n\t\t\treturn mainModule, depModules, errors.Wrapf(err, \"reading %s\", kind)\n\t\t}\n\t}\n\n\treturn mainModule, depModules, nil\n}\n\nfunc fromOCILayoutBlob(blob blob.Blob, kind string) (mainModule BuildModule, depModules []BuildModule, err error) {\n\tswitch kind {\n\tcase KindBuildpack:\n\t\tmainModule, depModules, err = BuildpacksFromOCILayoutBlob(blob)\n\tcase KindExtension:\n\t\tmainModule, err 
= ExtensionsFromOCILayoutBlob(blob)\n\tdefault:\n\t\treturn nil, nil, fmt.Errorf(\"unknown module kind: %s\", kind)\n\t}\n\tif err != nil {\n\t\treturn nil, nil, err\n\t}\n\treturn mainModule, depModules, nil\n}\n\nfunc extractPackaged(ctx context.Context, kind string, pkgImageRef string, fetcher ImageFetcher, fetchOptions image.FetchOptions) (mainModule BuildModule, depModules []BuildModule, err error) {\n\tpkgImage, err := fetcher.Fetch(ctx, pkgImageRef, fetchOptions)\n\tif err != nil {\n\t\treturn nil, nil, errors.Wrapf(err, \"fetching image\")\n\t}\n\n\tswitch kind {\n\tcase KindBuildpack:\n\t\tmainModule, depModules, err = extractBuildpacks(pkgImage)\n\tcase KindExtension:\n\t\tmainModule, err = extractExtensions(pkgImage)\n\tdefault:\n\t\treturn nil, nil, fmt.Errorf(\"unknown module kind: %s\", kind)\n\t}\n\tif err != nil {\n\t\treturn nil, nil, errors.Wrapf(err, \"extracting %ss from %s\", kind, style.Symbol(pkgImageRef))\n\t}\n\treturn mainModule, depModules, nil\n}\n"
  },
  {
    "path": "pkg/buildpack/downloader_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\tmobysystem \"github.com/moby/moby/api/types/system\"\n\tdockerclient \"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildpackDownloader(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"BuildpackDownloader\", testBuildpackDownloader, spec.Report(report.Terminal{}))\n}\n\nfunc testBuildpackDownloader(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController       *gomock.Controller\n\t\tmockDownloader       *testmocks.MockBlobDownloader\n\t\tmockImageFactory     *testmocks.MockImageFactory\n\t\tmockImageFetcher     *testmocks.MockImageFetcher\n\t\tmockRegistryResolver *testmocks.MockRegistryResolver\n\t\tmockDockerClient     *testmocks.MockAPIClient\n\t\tbuildpackDownloader  client.BuildpackDownloader\n\t\tlogger               logging.Logger\n\t\tout                  bytes.Buffer\n\t\ttmpDir               string\n\t)\n\n\tvar createBuildpack = func(descriptor dist.BuildpackDescriptor) string {\n\t\tbp, err := ifakes.NewFakeBuildpackBlob(&descriptor, 0644)\n\t\th.AssertNil(t, err)\n\t\turl := 
fmt.Sprintf(\"https://example.com/bp.%s.tgz\", h.RandString(12))\n\t\tmockDownloader.EXPECT().Download(gomock.Any(), url).Return(bp, nil).AnyTimes()\n\t\treturn url\n\t}\n\n\tvar createPackage = func(imageName string) *fakes.Image {\n\t\tpackageImage := fakes.NewImage(imageName, \"\", nil)\n\t\tmockImageFactory.EXPECT().NewImage(packageImage.Name(), false, dist.Target{OS: \"linux\"}).Return(packageImage, nil)\n\n\t\tpack, err := client.NewClient(\n\t\t\tclient.WithLogger(logger),\n\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\tclient.WithDockerClient(mockDockerClient),\n\t\t)\n\t\th.AssertNil(t, err)\n\n\t\th.AssertNil(t, pack.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\tName: packageImage.Name(),\n\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI:    api.MustParse(\"0.3\"),\n\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"example/foo\", Version: \"1.1.0\"},\n\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t})},\n\t\t\t},\n\t\t\tPublish: true,\n\t\t}))\n\n\t\treturn packageImage\n\t}\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&out, &out, logging.WithVerbose())\n\t\tmockController = gomock.NewController(t)\n\t\tmockDownloader = testmocks.NewMockBlobDownloader(mockController)\n\t\tmockRegistryResolver = testmocks.NewMockRegistryResolver(mockController)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\t\tmockImageFactory = testmocks.NewMockImageFactory(mockController)\n\t\tmockDockerClient = testmocks.NewMockAPIClient(mockController)\n\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/bp-one.tgz\").Return(blob.NewBlob(filepath.Join(\"testdata\", \"buildpack\")), nil).AnyTimes()\n\t\tmockDownloader.EXPECT().Download(gomock.Any(), 
\"some/buildpack/dir\").Return(blob.NewBlob(filepath.Join(\"testdata\", \"buildpack\")), nil).AnyTimes()\n\n\t\tbuildpackDownloader = buildpack.NewDownloader(logger, mockImageFetcher, mockDownloader, mockRegistryResolver)\n\n\t\tmockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil).AnyTimes()\n\n\t\tmockRegistryResolver.EXPECT().\n\t\t\tResolve(\"some-registry\", \"urn:cnb:registry:example/foo@1.1.0\").\n\t\t\tReturn(\"example.com/some/package@sha256:74eb48882e835d8767f62940d453eb96ed2737de3a16573881dcea7dea769df7\", nil).\n\t\t\tAnyTimes()\n\t\tmockRegistryResolver.EXPECT().\n\t\t\tResolve(\"some-registry\", \"example/foo@1.1.0\").\n\t\t\tReturn(\"example.com/some/package@sha256:74eb48882e835d8767f62940d453eb96ed2737de3a16573881dcea7dea769df7\", nil).\n\t\t\tAnyTimes()\n\n\t\tvar err error\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"buildpack-downloader-test\")\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t})\n\n\twhen(\"#Download\", func() {\n\t\tvar (\n\t\t\tpackageImage    *fakes.Image\n\t\t\tdownloadOptions = buildpack.DownloadOptions{Target: &dist.Target{\n\t\t\t\tOS: \"linux\",\n\t\t\t}}\n\t\t)\n\n\t\tshouldFetchPackageImageWith := func(demon bool, pull image.PullPolicy, target *dist.Target) {\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), packageImage.Name(), image.FetchOptions{\n\t\t\t\tDaemon:     demon,\n\t\t\t\tPullPolicy: pull,\n\t\t\t\tTarget:     target,\n\t\t\t}).Return(packageImage, nil)\n\t\t}\n\n\t\twhen(\"package image lives in cnb registry\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tpackageImage = createPackage(\"example.com/some/package@sha256:74eb48882e835d8767f62940d453eb96ed2737de3a16573881dcea7dea769df7\")\n\t\t\t})\n\n\t\t\twhen(\"daemon=true and pull-policy=always\", func() {\n\t\t\t\tit(\"should pull and use local package image\", func() 
{\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tRegistryName: \"some-registry\",\n\t\t\t\t\t\tTarget:       &dist.Target{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\tDaemon:       true,\n\t\t\t\t\t\tPullPolicy:   image.PullAlways,\n\t\t\t\t\t}\n\n\t\t\t\t\tshouldFetchPackageImageWith(true, image.PullAlways, &dist.Target{OS: \"linux\", Arch: \"amd64\"})\n\t\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), \"urn:cnb:registry:example/foo@1.1.0\", downloadOptions)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"example/foo\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"ambiguous URI provided\", func() {\n\t\t\t\tit(\"should find package in registry\", func() {\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tRegistryName: \"some-registry\",\n\t\t\t\t\t\tTarget:       &dist.Target{OS: \"linux\"},\n\t\t\t\t\t\tDaemon:       true,\n\t\t\t\t\t\tPullPolicy:   image.PullAlways,\n\t\t\t\t\t}\n\n\t\t\t\t\tshouldFetchPackageImageWith(true, image.PullAlways, &dist.Target{OS: \"linux\"})\n\t\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), \"example/foo@1.1.0\", downloadOptions)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"example/foo\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"package image lives in docker registry\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tpackageImage = createPackage(\"docker.io/some/package-\" + h.RandString(12))\n\t\t\t})\n\n\t\t\tprepareFetcherWithMissingPackageImage := func() {\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), packageImage.Name(), gomock.Any()).Return(nil, image.ErrNotFound)\n\t\t\t}\n\n\t\t\twhen(\"image key is provided\", func() {\n\t\t\t\tit(\"should succeed\", func() {\n\t\t\t\t\tpackageImage = createPackage(\"some/package:tag\")\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tDaemon:     true,\n\t\t\t\t\t\tPullPolicy: 
image.PullAlways,\n\t\t\t\t\t\tTarget:     &dist.Target{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\tImageName:  \"some/package:tag\",\n\t\t\t\t\t}\n\n\t\t\t\t\tshouldFetchPackageImageWith(true, image.PullAlways, &dist.Target{OS: \"linux\", Arch: \"amd64\"})\n\t\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), \"\", downloadOptions)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"example/foo\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"daemon=true and pull-policy=always\", func() {\n\t\t\t\tit(\"should pull and use local package image\", func() {\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tTarget:     &dist.Target{OS: \"linux\"},\n\t\t\t\t\t\tImageName:  packageImage.Name(),\n\t\t\t\t\t\tDaemon:     true,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}\n\n\t\t\t\t\tshouldFetchPackageImageWith(true, image.PullAlways, &dist.Target{OS: \"linux\"})\n\t\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), \"\", downloadOptions)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"example/foo\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"daemon=false and pull-policy=always\", func() {\n\t\t\t\tit(\"should use remote package image\", func() {\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tTarget:     &dist.Target{OS: \"linux\"},\n\t\t\t\t\t\tImageName:  packageImage.Name(),\n\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}\n\n\t\t\t\t\tshouldFetchPackageImageWith(false, image.PullAlways, &dist.Target{OS: \"linux\"})\n\t\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), \"\", downloadOptions)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"example/foo\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"daemon=false and pull-policy=always\", func() {\n\t\t\t\tit(\"should use remote package URI\", func() 
{\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tTarget:     &dist.Target{OS: \"linux\"},\n\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}\n\t\t\t\t\tshouldFetchPackageImageWith(false, image.PullAlways, &dist.Target{OS: \"linux\"})\n\t\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), packageImage.Name(), downloadOptions)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"example/foo\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"publish=true and pull-policy=never\", func() {\n\t\t\t\tit(\"should push to registry and not pull package image\", func() {\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tTarget:     &dist.Target{OS: \"linux\"},\n\t\t\t\t\t\tImageName:  packageImage.Name(),\n\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}\n\n\t\t\t\t\tshouldFetchPackageImageWith(false, image.PullNever, &dist.Target{OS: \"linux\"})\n\t\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), \"\", downloadOptions)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"example/foo\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"daemon=true pull-policy=never and there is no local package image\", func() {\n\t\t\t\tit(\"should fail without trying to retrieve package image from registry\", func() {\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tTarget:     &dist.Target{OS: \"linux\"},\n\t\t\t\t\t\tImageName:  packageImage.Name(),\n\t\t\t\t\t\tDaemon:     true,\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}\n\t\t\t\t\tprepareFetcherWithMissingPackageImage()\n\t\t\t\t\t_, _, err := buildpackDownloader.Download(context.TODO(), \"\", downloadOptions)\n\t\t\t\t\th.AssertError(t, err, \"not found\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"package lives on filesystem\", func() {\n\t\t\tit(\"should successfully retrieve package from absolute 
path\", func() {\n\t\t\t\tbuildpackPath := filepath.Join(\"testdata\", \"buildpack\")\n\t\t\t\tbuildpackURI, _ := paths.FilePathToURI(buildpackPath, \"\")\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), buildpackURI).Return(blob.NewBlob(buildpackPath), nil).AnyTimes()\n\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), buildpackURI, downloadOptions)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"bp.one\")\n\t\t\t})\n\n\t\t\tit(\"should successfully retrieve package from relative path\", func() {\n\t\t\t\tbuildpackPath := filepath.Join(\"testdata\", \"buildpack\")\n\t\t\t\tbuildpackURI, _ := paths.FilePathToURI(buildpackPath, \"\")\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), buildpackURI).Return(blob.NewBlob(buildpackPath), nil).AnyTimes()\n\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\tTarget:          &dist.Target{OS: \"linux\"},\n\t\t\t\t\tRelativeBaseDir: \"testdata\",\n\t\t\t\t}\n\t\t\t\tmainBP, _, err := buildpackDownloader.Download(context.TODO(), \"buildpack\", downloadOptions)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"bp.one\")\n\t\t\t})\n\n\t\t\twhen(\"kind == extension\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\textensionPath := filepath.Join(\"testdata\", \"extension\")\n\t\t\t\t\textensionURI, _ := paths.FilePathToURI(extensionPath, \"\")\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), extensionURI).Return(blob.NewBlob(extensionPath), nil).AnyTimes()\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tTarget:          &dist.Target{OS: \"linux\"},\n\t\t\t\t\t\tModuleKind:      \"extension\",\n\t\t\t\t\t\tRelativeBaseDir: \"testdata\",\n\t\t\t\t\t}\n\t\t\t\t\tmainExt, _, err := buildpackDownloader.Download(context.TODO(), \"extension\", downloadOptions)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, mainExt.Descriptor().Info().ID, 
\"ext.one\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"kind == packagedExtension\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tpackagedExtensionPath := filepath.Join(\"testdata\", \"tree-extension.cnb\")\n\t\t\t\t\tpackagedExtensionURI, _ := paths.FilePathToURI(packagedExtensionPath, \"\")\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), packagedExtensionURI).Return(blob.NewBlob(packagedExtensionPath), nil).AnyTimes()\n\t\t\t\t\tdownloadOptions = buildpack.DownloadOptions{\n\t\t\t\t\t\tTarget:          &dist.Target{OS: \"linux\"},\n\t\t\t\t\t\tModuleKind:      \"extension\",\n\t\t\t\t\t\tRelativeBaseDir: \"testdata\",\n\t\t\t\t\t\tDaemon:          true,\n\t\t\t\t\t\tPullPolicy:      image.PullAlways,\n\t\t\t\t\t}\n\t\t\t\t\tmainExt, _, _ := buildpackDownloader.Download(context.TODO(), \"tree-extension.cnb\", downloadOptions)\n\t\t\t\t\th.AssertEq(t, mainExt.Descriptor().Info().ID, \"samples-tree\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"package image is not a valid package\", func() {\n\t\t\tit(\"errors\", func() {\n\t\t\t\tnotPackageImage := fakes.NewImage(\"docker.io/not/package\", \"\", nil)\n\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), notPackageImage.Name(), gomock.Any()).Return(notPackageImage, nil)\n\t\t\t\th.AssertNil(t, notPackageImage.SetLabel(\"io.buildpacks.buildpack.layers\", \"\"))\n\n\t\t\t\tdownloadOptions.ImageName = notPackageImage.Name()\n\t\t\t\t_, _, err := buildpackDownloader.Download(context.TODO(), \"\", downloadOptions)\n\t\t\t\th.AssertError(t, err, \"extracting buildpacks from 'docker.io/not/package': could not find label 'io.buildpacks.buildpackage.metadata'\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"invalid buildpack URI\", func() {\n\t\t\twhen(\"buildpack URI is from=builder:fake\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t_, _, err := buildpackDownloader.Download(context.TODO(), \"from=builder:fake\", downloadOptions)\n\t\t\t\t\th.AssertError(t, err, \"'from=builder:fake' is not a valid 
identifier\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"buildpack URI is from=builder\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t_, _, err := buildpackDownloader.Download(context.TODO(), \"from=builder\", downloadOptions)\n\t\t\t\t\th.AssertError(t, err,\n\t\t\t\t\t\t\"invalid locator: FromBuilderLocator\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"can't resolve buildpack in registry\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\tmockRegistryResolver.EXPECT().\n\t\t\t\t\t\tResolve(\"://bad-url\", \"urn:cnb:registry:fake\").\n\t\t\t\t\t\tReturn(\"\", errors.New(\"bad mhkay\")).\n\t\t\t\t\t\tAnyTimes()\n\n\t\t\t\t\tdownloadOptions.RegistryName = \"://bad-url\"\n\t\t\t\t\t_, _, err := buildpackDownloader.Download(context.TODO(), \"urn:cnb:registry:fake\", downloadOptions)\n\t\t\t\t\th.AssertError(t, err, \"locating in registry\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"can't download image from registry\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\tpackageImage := fakes.NewImage(\"example.com/some/package@sha256:74eb48882e835d8767f62940d453eb96ed2737de3a16573881dcea7dea769df7\", \"\", nil)\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), packageImage.Name(), image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways, Target: &dist.Target{OS: \"linux\"}}).Return(nil, errors.New(\"failed to pull\"))\n\n\t\t\t\t\tdownloadOptions.RegistryName = \"some-registry\"\n\t\t\t\t\t_, _, err := buildpackDownloader.Download(context.TODO(), \"urn:cnb:registry:example/foo@1.1.0\", downloadOptions)\n\t\t\t\t\th.AssertError(t, err,\n\t\t\t\t\t\t\"extracting from registry\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"buildpack URI is an invalid locator\", func() {\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t_, _, err := buildpackDownloader.Download(context.TODO(), \"nonsense string here\", downloadOptions)\n\t\t\t\t\th.AssertError(t, err,\n\t\t\t\t\t\t\"invalid locator: InvalidLocator\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/buildpack/locator_type.go",
"content": "package buildpack\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"regexp\"\n\t\"strings\"\n\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype LocatorType int\n\nconst (\n\tInvalidLocator LocatorType = iota\n\tFromBuilderLocator\n\tURILocator\n\tIDLocator\n\tPackageLocator\n\tRegistryLocator\n\t// added entries here should also be added to `String()`\n)\n\nconst (\n\tfromBuilderPrefix           = \"urn:cnb:builder\"\n\tdeprecatedFromBuilderPrefix = \"from=builder\"\n\tfromRegistryPrefix          = \"urn:cnb:registry\"\n\tfromDockerPrefix            = \"docker:/\"\n)\n\nvar (\n\t// https://semver.org/#is-there-a-suggested-regular-expression-regex-to-check-a-semver-string\n\tsemverPattern   = `(0|[1-9]\\d*)\\.(0|[1-9]\\d*)\\.(0|[1-9]\\d*)(?:-((?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\\.(?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\\+([0-9a-zA-Z-]+(?:\\.[0-9a-zA-Z-]+)*))?`\n\tregistryPattern = regexp.MustCompile(`^[a-z0-9\\-\\.]+\\/[a-z0-9\\-\\.]+(?:@` + semverPattern + `)?$`)\n)\n\nfunc (l LocatorType) String() string {\n\treturn []string{\n\t\t\"InvalidLocator\",\n\t\t\"FromBuilderLocator\",\n\t\t\"URILocator\",\n\t\t\"IDLocator\",\n\t\t\"PackageLocator\",\n\t\t\"RegistryLocator\",\n\t}[l]\n}\n\n// GetLocatorType determines which type of locator is designated by the given input.\n// If a type cannot be determined, `InvalidLocator` will be returned. 
If an error\n// is encountered, it will be returned.\nfunc GetLocatorType(locator string, relativeBaseDir string, buildpacksFromBuilder []dist.ModuleInfo) (LocatorType, error) {\n\tif locator == deprecatedFromBuilderPrefix {\n\t\treturn FromBuilderLocator, nil\n\t}\n\n\tif strings.HasPrefix(locator, fromBuilderPrefix+\":\") || strings.HasPrefix(locator, deprecatedFromBuilderPrefix+\":\") {\n\t\tif !isFoundInBuilder(locator, buildpacksFromBuilder) {\n\t\t\treturn InvalidLocator, fmt.Errorf(\"%s is not a valid identifier\", style.Symbol(locator))\n\t\t}\n\t\treturn IDLocator, nil\n\t}\n\n\tif strings.HasPrefix(locator, fromRegistryPrefix+\":\") {\n\t\treturn RegistryLocator, nil\n\t}\n\n\tif paths.IsURI(locator) {\n\t\tif HasDockerLocator(locator) {\n\t\t\tif _, err := name.ParseReference(locator); err == nil {\n\t\t\t\treturn PackageLocator, nil\n\t\t\t}\n\t\t}\n\t\treturn URILocator, nil\n\t}\n\n\treturn parseNakedLocator(locator, relativeBaseDir, buildpacksFromBuilder), nil\n}\n\nfunc HasDockerLocator(locator string) bool {\n\treturn strings.HasPrefix(locator, fromDockerPrefix)\n}\n\nfunc parseNakedLocator(locator, relativeBaseDir string, buildpacksFromBuilder []dist.ModuleInfo) LocatorType {\n\t// from here on, we're dealing with a naked locator, and we try to figure out what it is. To do this we check\n\t// the following characteristics in order:\n\t//   1. Does it match a path on the file system\n\t//   2. Does it match a buildpack ID in the builder\n\t//   3. Does it look like a Buildpack Registry ID\n\t//   4. 
Does it look like a Docker ref\n\tif isLocalFile(locator, relativeBaseDir) {\n\t\treturn URILocator\n\t}\n\n\tif isFoundInBuilder(locator, buildpacksFromBuilder) {\n\t\treturn IDLocator\n\t}\n\n\tif canBeRegistryRef(locator) {\n\t\treturn RegistryLocator\n\t}\n\n\tif canBePackageRef(locator) {\n\t\treturn PackageLocator\n\t}\n\n\treturn InvalidLocator\n}\n\nfunc canBePackageRef(locator string) bool {\n\tif _, err := name.ParseReference(locator); err == nil {\n\t\treturn true\n\t}\n\n\treturn false\n}\n\nfunc canBeRegistryRef(locator string) bool {\n\treturn registryPattern.MatchString(locator)\n}\n\nfunc isFoundInBuilder(locator string, candidates []dist.ModuleInfo) bool {\n\tid, version := ParseIDLocator(locator)\n\tfor _, c := range candidates {\n\t\tif id == c.ID && (version == \"\" || version == c.Version) {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc isLocalFile(locator, relativeBaseDir string) bool {\n\tif !filepath.IsAbs(locator) {\n\t\tlocator = filepath.Join(relativeBaseDir, locator)\n\t}\n\n\tif _, err := os.Stat(locator); err == nil {\n\t\treturn true\n\t}\n\n\treturn false\n}\n"
  },
  {
    "path": "pkg/buildpack/locator_type_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"fmt\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestGetLocatorType(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testGetLocatorType\", testGetLocatorType, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testGetLocatorType(t *testing.T, when spec.G, it spec.S) {\n\ttype testCase struct {\n\t\tlocator      string\n\t\tbuilderBPs   []dist.ModuleInfo\n\t\texpectedType buildpack.LocatorType\n\t\texpectedErr  string\n\t}\n\n\tvar localPath = func(path string) string {\n\t\treturn filepath.Join(\"testdata\", path)\n\t}\n\n\tfor _, tc := range []testCase{\n\t\t{\n\t\t\tlocator:      \"from=builder\",\n\t\t\texpectedType: buildpack.FromBuilderLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"from=builder:some-bp\",\n\t\t\tbuilderBPs:   []dist.ModuleInfo{{ID: \"some-bp\", Version: \"some-version\"}},\n\t\t\texpectedType: buildpack.IDLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:     \"from=builder:some-bp\",\n\t\t\texpectedErr: \"'from=builder:some-bp' is not a valid identifier\",\n\t\t},\n\t\t{\n\t\t\tlocator:     \"from=builder:some-bp@some-other-version\",\n\t\t\tbuilderBPs:  []dist.ModuleInfo{{ID: \"some-bp\", Version: \"some-version\"}},\n\t\t\texpectedErr: \"'from=builder:some-bp@some-other-version' is not a valid identifier\",\n\t\t},\n\t\t{\n\t\t\tlocator:      \"urn:cnb:builder:some-bp\",\n\t\t\tbuilderBPs:   []dist.ModuleInfo{{ID: \"some-bp\", Version: \"some-version\"}},\n\t\t\texpectedType: buildpack.IDLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:     \"urn:cnb:builder:some-bp\",\n\t\t\texpectedErr: \"'urn:cnb:builder:some-bp' is not a valid identifier\",\n\t\t},\n\t\t{\n\t\t\tlocator:     
\"urn:cnb:builder:some-bp@some-other-version\",\n\t\t\tbuilderBPs:  []dist.ModuleInfo{{ID: \"some-bp\", Version: \"some-version\"}},\n\t\t\texpectedErr: \"'urn:cnb:builder:some-bp@some-other-version' is not a valid identifier\",\n\t\t},\n\t\t{\n\t\t\tlocator:      \"some-bp\",\n\t\t\tbuilderBPs:   []dist.ModuleInfo{{ID: \"some-bp\", Version: \"any-version\"}},\n\t\t\texpectedType: buildpack.IDLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      localPath(\"buildpack\"),\n\t\t\tbuilderBPs:   []dist.ModuleInfo{{ID: \"bp.one\", Version: \"1.2.3\"}},\n\t\t\texpectedType: buildpack.URILocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"https://example.com/buildpack.tgz\",\n\t\t\texpectedType: buildpack.URILocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"localhost:1234/example/package-cnb\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"cnbs/some-bp:latest\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"docker://cnbs/some-bp\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"docker://cnbs/some-bp@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"docker://cnbs/some-bp:some-tag\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"docker://cnbs/some-bp:some-tag@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"docker://registry.com/cnbs/some-bp\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"docker://registry.com/cnbs/some-bp@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"docker://registry.com/cnbs/some-bp:some-tag\",\n\t\t\texpectedType: 
buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"docker://registry.com/cnbs/some-bp:some-tag@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"cnbs/some-bp@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"cnbs/some-bp:some-tag@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"registry.com/cnbs/some-bp\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"registry.com/cnbs/some-bp@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"registry.com/cnbs/some-bp:some-tag\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"registry.com/cnbs/some-bp:some-tag@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"urn:cnb:registry:example/foo@1.0.0\",\n\t\t\texpectedType: buildpack.RegistryLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"example/foo@1.0.0\",\n\t\t\texpectedType: buildpack.RegistryLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"example/registry-cnb\",\n\t\t\texpectedType: buildpack.RegistryLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"cnbs/sample-package@hello-universe\",\n\t\t\texpectedType: buildpack.InvalidLocator,\n\t\t},\n\t\t{\n\t\t\tlocator:      \"dev.local/http-go-fn:latest\",\n\t\t\texpectedType: buildpack.PackageLocator,\n\t\t},\n\t} {\n\t\ttc := tc\n\n\t\tdesc := fmt.Sprintf(\"locator is %s\", tc.locator)\n\t\tif len(tc.builderBPs) > 0 {\n\t\t\tvar names []string\n\t\t\tfor _, bp := range tc.builderBPs {\n\t\t\t\tnames = 
append(names, bp.FullName())\n\t\t\t}\n\t\t\tdesc += fmt.Sprintf(\" and builder has buildpacks %s\", names)\n\t\t}\n\n\t\twhen(desc, func() {\n\t\t\tit(fmt.Sprintf(\"should return %s\", tc.expectedType), func() {\n\t\t\t\tactualType, actualErr := buildpack.GetLocatorType(tc.locator, \"\", tc.builderBPs)\n\n\t\t\t\tif tc.expectedErr == \"\" {\n\t\t\t\t\th.AssertNil(t, actualErr)\n\t\t\t\t} else {\n\t\t\t\t\th.AssertError(t, actualErr, tc.expectedErr)\n\t\t\t\t}\n\n\t\t\t\th.AssertEq(t, actualType, tc.expectedType)\n\t\t\t})\n\t\t})\n\t}\n}\n"
  },
  {
    "path": "pkg/buildpack/managed_collection.go",
    "content": "package buildpack\n\n// ManagedCollection keeps track of build modules and the manner in which they should be added to an OCI image (as flattened or exploded).\ntype ManagedCollection interface {\n\t// AllModules returns all build modules handled by the manager.\n\tAllModules() []BuildModule\n\n\t// ExplodedModules returns all build modules that will be added to the output artifact as a single layer\n\t// containing a single module.\n\tExplodedModules() []BuildModule\n\n\t// AddModules adds module information to the collection as flattened or not, depending on how the collection is configured.\n\tAddModules(main BuildModule, deps ...BuildModule)\n\n\t// FlattenedModules returns all build modules that will be added to the output artifact as a single layer\n\t// containing multiple modules.\n\tFlattenedModules() [][]BuildModule\n\n\t// ShouldFlatten returns true if the given module should be flattened.\n\tShouldFlatten(module BuildModule) bool\n}\n\ntype managedCollection struct {\n\texplodedModules  []BuildModule\n\tflattenedModules [][]BuildModule\n}\n\nfunc (f *managedCollection) ExplodedModules() []BuildModule {\n\treturn f.explodedModules\n}\n\nfunc (f *managedCollection) FlattenedModules() [][]BuildModule {\n\treturn f.flattenedModules\n}\n\nfunc (f *managedCollection) AllModules() []BuildModule {\n\tall := f.explodedModules\n\tfor _, modules := range f.flattenedModules {\n\t\tall = append(all, modules...)\n\t}\n\treturn all\n}\n\nfunc (f *managedCollection) ShouldFlatten(module BuildModule) bool {\n\tfor _, modules := range f.flattenedModules {\n\t\tfor _, v := range modules {\n\t\t\tif v == module {\n\t\t\t\treturn true\n\t\t\t}\n\t\t}\n\t}\n\treturn false\n}\n\n// managedCollectionV1 can be used to flatten all the flattenModuleInfos or none of them.\ntype managedCollectionV1 struct {\n\tmanagedCollection\n\tflattenAll bool\n}\n\n// NewManagedCollectionV1 will create a manager instance responsible for flattening Buildpack Packages.\nfunc 
NewManagedCollectionV1(flattenAll bool) ManagedCollection {\n\treturn &managedCollectionV1{\n\t\tflattenAll: flattenAll,\n\t\tmanagedCollection: managedCollection{\n\t\t\texplodedModules:  []BuildModule{},\n\t\t\tflattenedModules: [][]BuildModule{},\n\t\t},\n\t}\n}\n\nfunc (f *managedCollectionV1) AddModules(main BuildModule, deps ...BuildModule) {\n\tif !f.flattenAll {\n\t\t// default behavior\n\t\tf.explodedModules = append(f.explodedModules, append([]BuildModule{main}, deps...)...)\n\t} else {\n\t\t// flatten all\n\t\tif len(f.flattenedModules) == 1 {\n\t\t\t// we already have data in the array, append to the first element\n\t\t\tf.flattenedModules[0] = append(f.flattenedModules[0], append([]BuildModule{main}, deps...)...)\n\t\t} else {\n\t\t\t// the array is empty, create the first element\n\t\t\tf.flattenedModules = append(f.flattenedModules, append([]BuildModule{main}, deps...))\n\t\t}\n\t}\n}\n\n// NewManagedCollectionV2 will create a manager instance responsible for flattening buildpacks inside a Builder.\n// The flattened build modules provided are the groups of buildpacks that must be put together in a single layer; the manager\n// will take care of keeping them in the correct group (flattened or exploded) once they are added.\nfunc NewManagedCollectionV2(modules FlattenModuleInfos) ManagedCollection {\n\tflattenGroups := 0\n\tif modules != nil {\n\t\tflattenGroups = len(modules.FlattenModules())\n\t}\n\n\treturn &managedCollectionV2{\n\t\tflattenModuleInfos: modules,\n\t\tmanagedCollection: managedCollection{\n\t\t\texplodedModules:  []BuildModule{},\n\t\t\tflattenedModules: make([][]BuildModule, flattenGroups),\n\t\t},\n\t}\n}\n\n// managedCollectionV2 can be used when the build modules to be flattened are known at the point of initialization.\n// The flattened build modules are provided when the collection is initialized and the collection will take care of\n// keeping them in the correct group (flattened or exploded) once they are added.\ntype 
managedCollectionV2 struct {\n\tmanagedCollection\n\tflattenModuleInfos FlattenModuleInfos\n}\n\nfunc (ff *managedCollectionV2) flattenGroups() []ModuleInfos {\n\treturn ff.flattenModuleInfos.FlattenModules()\n}\n\nfunc (ff *managedCollectionV2) AddModules(main BuildModule, deps ...BuildModule) {\n\tvar allModules []BuildModule\n\tallModules = append(allModules, append([]BuildModule{main}, deps...)...)\n\tfor _, module := range allModules {\n\t\tif ff.flattenModuleInfos != nil && len(ff.flattenGroups()) > 0 {\n\t\t\tpos := ff.flattenedLayerFor(module)\n\t\t\tif pos >= 0 {\n\t\t\t\tff.flattenedModules[pos] = append(ff.flattenedModules[pos], module)\n\t\t\t} else {\n\t\t\t\t// this module must not be flattened\n\t\t\t\tff.explodedModules = append(ff.explodedModules, module)\n\t\t\t}\n\t\t} else {\n\t\t\t// we don't want to flatten anything\n\t\t\tff.explodedModules = append(ff.explodedModules, module)\n\t\t}\n\t}\n}\n\n// flattenedLayerFor given a module will try to determine which row (layer) this module must be added to in order to be flattened.\n// If the layer is not found, it means the module must not be flattened at all.\nfunc (ff *managedCollectionV2) flattenedLayerFor(module BuildModule) int {\n\t// flattenGroups is a two-dimensional array, where each row represents\n\t// a group of module infos that must be flattened together in the same layer.\n\tfor i, flattenGroup := range ff.flattenGroups() {\n\t\tfor _, buildModuleInfo := range flattenGroup.BuildModule() {\n\t\t\tif buildModuleInfo.FullName() == module.Descriptor().Info().FullName() {\n\t\t\t\treturn i\n\t\t\t}\n\t\t}\n\t}\n\treturn -1\n}\n"
  },
  {
    "path": "pkg/buildpack/managed_collection_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestModuleManager(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ManagedCollection\", testModuleManager, spec.Report(report.Terminal{}))\n}\n\nfunc testModuleManager(t *testing.T, when spec.G, it spec.S) {\n\t/* compositeBP1\n\t *    /    \\\n\t *   bp1   compositeBP2\n\t *           /   |    \\\n\t *\t      bp21 bp22 compositeBP3\n\t *\t\t\t          |\n\t *\t\t            bp31\n\t */\n\tvar (\n\t\tmoduleManager       buildpack.ManagedCollection\n\t\tcompositeBP1        buildpack.BuildModule\n\t\tbp1                 buildpack.BuildModule\n\t\tcompositeBP2        buildpack.BuildModule\n\t\tbp21                buildpack.BuildModule\n\t\tbp22                buildpack.BuildModule\n\t\tcompositeBP3        buildpack.BuildModule\n\t\tbp31                buildpack.BuildModule\n\t\tflattenBuildModules buildpack.FlattenModuleInfos\n\t\terr                 error\n\t)\n\n\tit.Before(func() {\n\t\tbp1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-1-id\",\n\t\t\t\tVersion: \"buildpack-1-version\",\n\t\t\t},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbp21, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-21-id\",\n\t\t\t\tVersion: \"buildpack-21-version\",\n\t\t\t},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbp22, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: 
api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-22-id\",\n\t\t\t\tVersion: \"buildpack-22-version\",\n\t\t\t},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tbp31, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"buildpack-31-id\",\n\t\t\t\tVersion: \"buildpack-31-version\",\n\t\t\t},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tcompositeBP3, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"composite-buildpack-3-id\",\n\t\t\t\tVersion: \"composite-buildpack-3-version\",\n\t\t\t},\n\t\t\tWithOrder: []dist.OrderEntry{{\n\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t{\n\t\t\t\t\t\tModuleInfo: bp31.Descriptor().Info(),\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tcompositeBP2, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"composite-buildpack-2-id\",\n\t\t\t\tVersion: \"composite-buildpack-2-version\",\n\t\t\t},\n\t\t\tWithOrder: []dist.OrderEntry{{\n\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t{\n\t\t\t\t\t\tModuleInfo: bp21.Descriptor().Info(),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tModuleInfo: bp22.Descriptor().Info(),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tModuleInfo: compositeBP3.Descriptor().Info(),\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\n\t\tcompositeBP1, err = ifakes.NewFakeBuildpack(dist.BuildpackDescriptor{\n\t\t\tWithAPI: api.MustParse(\"0.2\"),\n\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\tID:      \"composite-buildpack-1-id\",\n\t\t\t\tVersion: \"composite-buildpack-1-version\",\n\t\t\t},\n\t\t\tWithOrder: []dist.OrderEntry{{\n\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t{\n\t\t\t\t\t\tModuleInfo: 
bp1.Descriptor().Info(),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tModuleInfo: compositeBP2.Descriptor().Info(),\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}},\n\t\t}, 0644)\n\t\th.AssertNil(t, err)\n\t})\n\n\twhen(\"manager is configured in flatten mode\", func() {\n\t\twhen(\"V1 is used\", func() {\n\t\t\twhen(\"flatten all\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmoduleManager = buildpack.NewManagedCollectionV1(true)\n\t\t\t\t\tmoduleManager.AddModules(compositeBP1, []buildpack.BuildModule{bp1, compositeBP2, bp21, bp22, compositeBP3, bp31}...)\n\t\t\t\t})\n\n\t\t\t\twhen(\"#FlattenedModules\", func() {\n\t\t\t\t\tit(\"returns one flattened group (1 layer)\", func() {\n\t\t\t\t\t\tmodules := moduleManager.FlattenedModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 1)\n\t\t\t\t\t\th.AssertEq(t, len(modules[0]), 7)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"#ExplodedModules\", func() {\n\t\t\t\t\tit(\"returns empty\", func() {\n\t\t\t\t\t\tmodules := moduleManager.ExplodedModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 0)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"#AllModules\", func() {\n\t\t\t\t\tit(\"returns all modules\", func() {\n\t\t\t\t\t\tmodules := moduleManager.AllModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 7)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"#ShouldFlatten\", func() {\n\t\t\t\t\tit(\"returns true for all modules\", func() {\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(compositeBP1))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(bp1))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(compositeBP2))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(bp21))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(bp22))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(compositeBP3))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(bp31))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"V2 is used\", func() {\n\t\t\twhen(\"flattened build modules are 
provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tflattenBuildModules, err = buildpack.ParseFlattenBuildModules([]string{\"composite-buildpack-3-id@composite-buildpack-3-version,buildpack-31-id@buildpack-31-version\", \"composite-buildpack-2-id@composite-buildpack-2-version,buildpack-21-id@buildpack-21-version,buildpack-22-id@buildpack-22-version\"})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tmoduleManager = buildpack.NewManagedCollectionV2(flattenBuildModules)\n\t\t\t\t\tmoduleManager.AddModules(compositeBP1, []buildpack.BuildModule{bp1, compositeBP2, bp21, bp22, compositeBP3, bp31}...)\n\t\t\t\t})\n\n\t\t\t\twhen(\"#FlattenedModules\", func() {\n\t\t\t\t\tit(\"returns two flattened modules (2 layers)\", func() {\n\t\t\t\t\t\tmodules := moduleManager.FlattenedModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 2)\n\t\t\t\t\t\th.AssertTrue(t, len(modules[0]) == 2 || len(modules[0]) == 3)\n\t\t\t\t\t\tif len(modules[0]) == 2 {\n\t\t\t\t\t\t\th.AssertEq(t, len(modules[1]), 3)\n\t\t\t\t\t\t} else if len(modules[0]) == 3 {\n\t\t\t\t\t\t\th.AssertEq(t, len(modules[1]), 2)\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"#ExplodedModules\", func() {\n\t\t\t\t\tit(\"returns two exploded modules: compositeBP1 and bp1\", func() {\n\t\t\t\t\t\tmodules := moduleManager.ExplodedModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 2)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"#AllModules\", func() {\n\t\t\t\t\tit(\"returns all modules\", func() {\n\t\t\t\t\t\tmodules := moduleManager.AllModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 7)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"#ShouldFlatten\", func() {\n\t\t\t\t\tit(\"returns true for flattened modules\", func() {\n\t\t\t\t\t\t// exploded modules\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(compositeBP1))\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp1))\n\n\t\t\t\t\t\t// flattened modules\n\t\t\t\t\t\th.AssertTrue(t, 
moduleManager.ShouldFlatten(compositeBP2))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(bp21))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(bp22))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(compositeBP3))\n\t\t\t\t\t\th.AssertTrue(t, moduleManager.ShouldFlatten(bp31))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"manager is not configured in flatten mode\", func() {\n\t\twhen(\"V1 is used\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmoduleManager = buildpack.NewManagedCollectionV1(false)\n\t\t\t})\n\n\t\t\twhen(\"#ExplodedModules\", func() {\n\t\t\t\tit(\"returns nil when no explodedModules are added\", func() {\n\t\t\t\t\tmodules := moduleManager.ExplodedModules()\n\t\t\t\t\th.AssertEq(t, len(modules), 0)\n\t\t\t\t})\n\n\t\t\t\twhen(\"explodedModules are added\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmoduleManager.AddModules(compositeBP1, []buildpack.BuildModule{bp1, compositeBP2, bp21, bp22, compositeBP3, bp31}...)\n\t\t\t\t\t})\n\t\t\t\t\tit(\"returns all explodedModules added\", func() {\n\t\t\t\t\t\tmodules := moduleManager.ExplodedModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 7)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#FlattenedModules\", func() {\n\t\t\t\tit(\"returns nil when no explodedModules are added\", func() {\n\t\t\t\t\tmodules := moduleManager.FlattenedModules()\n\t\t\t\t\th.AssertEq(t, len(modules), 0)\n\t\t\t\t})\n\n\t\t\t\twhen(\"explodedModules are added\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmoduleManager.AddModules(compositeBP1, []buildpack.BuildModule{bp1, compositeBP2, bp21, bp22, compositeBP3, bp31}...)\n\t\t\t\t\t})\n\t\t\t\t\tit(\"returns nil\", func() {\n\t\t\t\t\t\tmodules := moduleManager.FlattenedModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 0)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#ShouldFlatten\", func() {\n\t\t\t\tit(\"returns false when no explodedModules are added\", func() {\n\t\t\t\t\th.AssertFalse(t, 
moduleManager.ShouldFlatten(bp1))\n\t\t\t\t})\n\n\t\t\t\twhen(\"explodedModules are added\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmoduleManager.AddModules(compositeBP1, []buildpack.BuildModule{bp1, compositeBP2, bp21, bp22, compositeBP3, bp31}...)\n\t\t\t\t\t})\n\t\t\t\t\tit(\"returns false\", func() {\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp1))\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp21))\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp22))\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp31))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"V2 is used\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmoduleManager = buildpack.NewManagedCollectionV2(nil)\n\t\t\t})\n\n\t\t\twhen(\"#ExplodedModules\", func() {\n\t\t\t\tit(\"returns nil when no explodedModules are added\", func() {\n\t\t\t\t\tmodules := moduleManager.ExplodedModules()\n\t\t\t\t\th.AssertEq(t, len(modules), 0)\n\t\t\t\t})\n\n\t\t\t\twhen(\"explodedModules are added\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmoduleManager.AddModules(compositeBP1, []buildpack.BuildModule{bp1, compositeBP2, bp21, bp22, compositeBP3, bp31}...)\n\t\t\t\t\t})\n\t\t\t\t\tit(\"returns all explodedModules added\", func() {\n\t\t\t\t\t\tmodules := moduleManager.ExplodedModules()\n\t\t\t\t\t\th.AssertEq(t, len(modules), 7)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#FlattenedModules\", func() {\n\t\t\t\tit(\"returns nil when no explodedModules are added\", func() {\n\t\t\t\t\tmodules := moduleManager.FlattenedModules()\n\t\t\t\t\th.AssertEq(t, len(modules), 0)\n\t\t\t\t})\n\n\t\t\t\twhen(\"explodedModules are added\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmoduleManager.AddModules(compositeBP1, []buildpack.BuildModule{bp1, compositeBP2, bp21, bp22, compositeBP3, bp31}...)\n\t\t\t\t\t})\n\t\t\t\t\tit(\"returns nil\", func() {\n\t\t\t\t\t\tmodules := moduleManager.FlattenedModules()\n\t\t\t\t\t\th.AssertEq(t, 
len(modules), 0)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"#ShouldFlatten\", func() {\n\t\t\t\tit(\"returns false when no explodedModules are added\", func() {\n\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp1))\n\t\t\t\t})\n\n\t\t\t\twhen(\"explodedModules are added\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmoduleManager.AddModules(compositeBP1, []buildpack.BuildModule{bp1, compositeBP2, bp21, bp22, compositeBP3, bp31}...)\n\t\t\t\t\t})\n\t\t\t\t\tit(\"returns false\", func() {\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp1))\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp21))\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp22))\n\t\t\t\t\t\th.AssertFalse(t, moduleManager.ShouldFlatten(bp31))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/buildpack/multi_architecture_helper.go",
    "content": "package buildpack\n\nimport (\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// MultiArchConfig targets can be defined in .toml files or can be overridden by end-users via the command line; this structure offers\n// utility methods to determine the expected final targets configuration.\ntype MultiArchConfig struct {\n\t// Targets defined in .toml files\n\tbuildpackTargets []dist.Target\n\n\t// Targets defined by end-users to override configuration files\n\texpectedTargets []dist.Target\n\tlogger          logging.Logger\n}\n\nfunc NewMultiArchConfig(targets []dist.Target, expected []dist.Target, logger logging.Logger) (*MultiArchConfig, error) {\n\treturn &MultiArchConfig{\n\t\tbuildpackTargets: targets,\n\t\texpectedTargets:  expected,\n\t\tlogger:           logger,\n\t}, nil\n}\n\nfunc (m *MultiArchConfig) Targets() []dist.Target {\n\tif len(m.expectedTargets) == 0 {\n\t\treturn m.buildpackTargets\n\t}\n\treturn m.expectedTargets\n}\n\n// CopyConfigFiles will, given a base directory (which is expected to be the root folder of a single buildpack or an extension),\n// copy the buildpack.toml or the extension.toml file from the base directory into the corresponding platform root folder for each target.\n// It will return an array with all the platform root folders where the buildpack.toml or the extension.toml file was copied.\n// Whether to copy the buildpack or the extension TOML file is determined by the buildpackType parameter.\nfunc (m *MultiArchConfig) CopyConfigFiles(baseDir string, buildpackType string) ([]string, error) {\n\tvar filesToClean []string\n\tif buildpackType == \"\" {\n\t\tbuildpackType = KindBuildpack\n\t}\n\ttargets := dist.ExpandTargetsDistributions(m.Targets()...)\n\tfor _, target := range targets {\n\t\tpath, err := CopyConfigFile(baseDir, target, buildpackType)\n\t\tif err != nil 
{\n\t\t\treturn nil, err\n\t\t}\n\t\tif path != \"\" {\n\t\t\tfilesToClean = append(filesToClean, path)\n\t\t}\n\t}\n\treturn filesToClean, nil\n}\n\n// CopyConfigFile will copy the buildpack.toml or the extension.toml file, based on the buildpackType parameter,\n// from the base directory into the corresponding platform folder\n// for the specified target and desired distribution version.\nfunc CopyConfigFile(baseDir string, target dist.Target, buildpackType string) (string, error) {\n\tvar path string\n\tvar err error\n\n\tif ok, platformRootFolder := PlatformRootFolder(baseDir, target); ok {\n\t\tif buildpackType == KindExtension {\n\t\t\tpath, err = copyExtensionTOML(baseDir, platformRootFolder)\n\t\t} else {\n\t\t\tpath, err = copyBuildpackTOML(baseDir, platformRootFolder)\n\t\t}\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\treturn path, nil\n\t}\n\treturn \"\", nil\n}\n\n// PlatformRootFolder finds the top-most directory that identifies a target in a given buildpack <root> folder.\n// Let's define a target with the following format: [os][/arch][/variant]:[name@version], and consider the following examples:\n//   - Given a target linux/amd64 the platform root folder will be <root>/linux/amd64 if the folder exists\n//   - Given a target windows/amd64:windows@10.0.20348.1970 the platform root folder will be <root>/windows/amd64/windows@10.0.20348.1970 if the folder exists\n//   - When no target folder exists, the root folder will be equal to <root> folder\n//\n// Note: If the given target has more than 1 distribution, it is recommended to use `ExpandTargetsDistributions` before\n// calling this method.\nfunc PlatformRootFolder(bpPathURI string, target dist.Target) (bool, string) {\n\tvar (\n\t\tpRootFolder string\n\t\terr         error\n\t)\n\n\tif paths.IsURI(bpPathURI) {\n\t\tif pRootFolder, err = paths.URIToFilePath(bpPathURI); err != nil {\n\t\t\treturn false, \"\"\n\t\t}\n\t} else {\n\t\tpRootFolder = bpPathURI\n\t}\n\n\ttargets := 
target.ValuesAsSlice()\n\tfound := false\n\tcurrent := false\n\tfor _, t := range targets {\n\t\tcurrent, pRootFolder = targetExists(pRootFolder, t)\n\t\tif current {\n\t\t\tfound = current\n\t\t} else {\n\t\t\t// No need to keep looking\n\t\t\tbreak\n\t\t}\n\t}\n\t// We will return the last matching folder\n\treturn found, pRootFolder\n}\n\nfunc targetExists(root, expected string) (bool, string) {\n\tif expected == \"\" {\n\t\treturn false, root\n\t}\n\tpath := filepath.Join(root, expected)\n\tif exists, _ := paths.IsDir(path); exists {\n\t\treturn true, path\n\t}\n\treturn false, root\n}\n\nfunc copyBuildpackTOML(src string, dest string) (string, error) {\n\treturn copyFile(src, dest, \"buildpack.toml\")\n}\n\nfunc copyExtensionTOML(src string, dest string) (string, error) {\n\treturn copyFile(src, dest, \"extension.toml\")\n}\n\nfunc copyFile(src, dest, fileName string) (string, error) {\n\tfilePath := filepath.Join(dest, fileName)\n\tfileToCopy, err := os.Create(filePath)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer fileToCopy.Close()\n\n\tfileCopyFrom, err := os.Open(filepath.Join(src, fileName))\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer fileCopyFrom.Close()\n\n\t_, err = io.Copy(fileToCopy, fileCopyFrom)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tif err = fileToCopy.Sync(); err != nil {\n\t\treturn \"\", err\n\t}\n\n\treturn filePath, nil\n}\n"
  },
  {
    "path": "pkg/buildpack/multi_architecture_helper_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestMultiArchConfig(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testMultiArchConfig\", testMultiArchConfig, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testMultiArchConfig(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\terr                  error\n\t\toutBuf               bytes.Buffer\n\t\tlogger               *logging.LogWithWriters\n\t\tmultiArchConfig      *buildpack.MultiArchConfig\n\t\ttargetsFromBuildpack []dist.Target\n\t\ttargetsFromExtension []dist.Target\n\t\ttargetsFromFlags     []dist.Target\n\t\ttmpDir               string\n\t)\n\n\tit.Before(func() {\n\t\ttargetsFromBuildpack = []dist.Target{{OS: \"linux\", Arch: \"amd64\"}}\n\t\ttargetsFromFlags = []dist.Target{{OS: \"linux\", Arch: \"arm64\", ArchVariant: \"v6\"}}\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"test-multi-arch\")\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\tos.RemoveAll(tmpDir)\n\t})\n\n\twhen(\"#Targets\", func() {\n\t\twhen(\"buildpack targets are defined\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmultiArchConfig, err = buildpack.NewMultiArchConfig(targetsFromBuildpack, []dist.Target{}, logger)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"returns buildpack targets\", func() {\n\t\t\t\th.AssertEq(t, len(multiArchConfig.Targets()), 1)\n\t\t\t\th.AssertEq(t, multiArchConfig.Targets()[0].OS, \"linux\")\n\t\t\t\th.AssertEq(t, multiArchConfig.Targets()[0].Arch, 
\"amd64\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpack targets are not defined, but flags are provided\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmultiArchConfig, err = buildpack.NewMultiArchConfig([]dist.Target{}, targetsFromFlags, logger)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"returns targets from flags\", func() {\n\t\t\t\th.AssertEq(t, len(multiArchConfig.Targets()), 1)\n\t\t\t\th.AssertEq(t, multiArchConfig.Targets()[0].OS, \"linux\")\n\t\t\t\th.AssertEq(t, multiArchConfig.Targets()[0].Arch, \"arm64\")\n\t\t\t\th.AssertEq(t, multiArchConfig.Targets()[0].ArchVariant, \"v6\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpack targets are defined and flags are provided\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmultiArchConfig, err = buildpack.NewMultiArchConfig(targetsFromBuildpack, targetsFromFlags, logger)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"returns targets from flags\", func() {\n\t\t\t\t// flags overrides the targets in the configuration files\n\t\t\t\th.AssertEq(t, len(multiArchConfig.Targets()), 1)\n\t\t\t\th.AssertEq(t, multiArchConfig.Targets()[0].OS, \"linux\")\n\t\t\t\th.AssertEq(t, multiArchConfig.Targets()[0].Arch, \"arm64\")\n\t\t\t\th.AssertEq(t, multiArchConfig.Targets()[0].ArchVariant, \"v6\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#CopyConfigFiles\", func() {\n\t\twhen(\"buildpack root folder exists\", func() {\n\t\t\tvar rootFolder string\n\n\t\t\tit.Before(func() {\n\t\t\t\trootFolder = filepath.Join(tmpDir, \"some-buildpack\")\n\t\t\t\ttargetsFromBuildpack = []dist.Target{{OS: \"linux\", Arch: \"amd64\"}, {OS: \"linux\", Arch: \"arm64\", ArchVariant: \"v8\"}}\n\t\t\t\tmultiArchConfig, err = buildpack.NewMultiArchConfig(targetsFromBuildpack, []dist.Target{}, logger)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t// dummy multi-platform buildpack structure\n\t\t\t\tos.MkdirAll(filepath.Join(rootFolder, \"linux\", \"amd64\"), 0755)\n\t\t\t\tos.MkdirAll(filepath.Join(rootFolder, \"linux\", \"arm64\", \"v8\"), 0755)\n\t\t\t\t_, 
err = os.Create(filepath.Join(rootFolder, \"buildpack.toml\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"copies the buildpack.toml to each target platform folder\", func() {\n\t\t\t\tpaths, err := multiArchConfig.CopyConfigFiles(rootFolder, \"buildpack\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(paths), 2)\n\t\t\t\th.AssertPathExists(t, filepath.Join(rootFolder, \"linux\", \"amd64\", \"buildpack.toml\"))\n\t\t\t\th.AssertPathExists(t, filepath.Join(rootFolder, \"linux\", \"arm64\", \"v8\", \"buildpack.toml\"))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"extension root folder exists\", func() {\n\t\t\tvar rootFolder string\n\n\t\t\tit.Before(func() {\n\t\t\t\trootFolder = filepath.Join(tmpDir, \"some-extension\")\n\t\t\t\ttargetsFromExtension = []dist.Target{{OS: \"linux\", Arch: \"amd64\"}, {OS: \"linux\", Arch: \"arm64\", ArchVariant: \"v8\"}}\n\t\t\t\tmultiArchConfig, err = buildpack.NewMultiArchConfig(targetsFromExtension, []dist.Target{}, logger)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t// dummy multi-platform extension structure\n\t\t\t\tos.MkdirAll(filepath.Join(rootFolder, \"linux\", \"amd64\"), 0755)\n\t\t\t\tos.MkdirAll(filepath.Join(rootFolder, \"linux\", \"arm64\", \"v8\"), 0755)\n\t\t\t\t_, err = os.Create(filepath.Join(rootFolder, \"extension.toml\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"copies the extension.toml to each target platform folder\", func() {\n\t\t\t\tpaths, err := multiArchConfig.CopyConfigFiles(rootFolder, \"extension\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(paths), 2)\n\t\t\t\th.AssertPathExists(t, filepath.Join(rootFolder, \"linux\", \"amd64\", \"extension.toml\"))\n\t\t\t\th.AssertPathExists(t, filepath.Join(rootFolder, \"linux\", \"arm64\", \"v8\", \"extension.toml\"))\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#PlatformRootFolder\", func() {\n\t\tvar target dist.Target\n\n\t\twhen(\"root folder exists\", func() {\n\t\t\tvar bpURI string\n\n\t\t\tit.Before(func() 
{\n\t\t\t\tos.MkdirAll(filepath.Join(tmpDir, \"linux\", \"arm64\", \"v8\"), 0755)\n\t\t\t\tos.MkdirAll(filepath.Join(tmpDir, \"windows\", \"amd64\", \"v2\", \"windows@10.0.20348.1970\"), 0755)\n\t\t\t\tbpURI, err = paths.FilePathToURI(tmpDir, \"\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\twhen(\"target has 'os'\", func() {\n\t\t\t\twhen(\"'os' directory exists\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\ttarget = dist.Target{OS: \"linux\"}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns <root>/<os directory>\", func() {\n\t\t\t\t\t\tfound, path := buildpack.PlatformRootFolder(bpURI, target)\n\t\t\t\t\t\th.AssertTrue(t, found)\n\t\t\t\t\t\th.AssertEq(t, path, filepath.Join(tmpDir, \"linux\"))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"'os' directory doesn't exist\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\ttarget = dist.Target{OS: \"darwin\"}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns not found\", func() {\n\t\t\t\t\t\tfound, _ := buildpack.PlatformRootFolder(bpURI, target)\n\t\t\t\t\t\th.AssertFalse(t, found)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"target has 'os' and 'arch'\", func() {\n\t\t\t\twhen(\"'arch' directory exists\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\ttarget = dist.Target{OS: \"linux\", Arch: \"arm64\"}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns <root>/<os directory>/<arch directory>\", func() {\n\t\t\t\t\t\tfound, path := buildpack.PlatformRootFolder(bpURI, target)\n\t\t\t\t\t\th.AssertTrue(t, found)\n\t\t\t\t\t\th.AssertEq(t, path, filepath.Join(tmpDir, \"linux\", \"arm64\"))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"'arch' directory doesn't exist\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\ttarget = dist.Target{OS: \"linux\", Arch: \"amd64\"}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns <root>/<os directory>\", func() {\n\t\t\t\t\t\tfound, path := buildpack.PlatformRootFolder(bpURI, target)\n\t\t\t\t\t\th.AssertTrue(t, found)\n\t\t\t\t\t\th.AssertEq(t, path, filepath.Join(tmpDir, 
\"linux\"))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"target has 'os', 'arch' and 'variant'\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\ttarget = dist.Target{OS: \"linux\", Arch: \"arm64\", ArchVariant: \"v8\"}\n\t\t\t\t})\n\n\t\t\t\tit(\"returns <root>/<os directory>/<arch directory>/<variant directory>\", func() {\n\t\t\t\t\tfound, path := buildpack.PlatformRootFolder(bpURI, target)\n\t\t\t\t\th.AssertTrue(t, found)\n\t\t\t\t\th.AssertEq(t, path, filepath.Join(tmpDir, \"linux\", \"arm64\", \"v8\"))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"target has 'os', 'arch', 'variant' and name@version\", func() {\n\t\t\t\twhen(\"all directories exist\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\ttarget = dist.Target{OS: \"windows\", Arch: \"amd64\", ArchVariant: \"v2\", Distributions: []dist.Distribution{{Name: \"windows\", Version: \"10.0.20348.1970\"}}}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns <root>/<os directory>/<arch directory>/<variant directory>/<distro name directory>@<distro version directory>\", func() {\n\t\t\t\t\t\tfound, path := buildpack.PlatformRootFolder(bpURI, target)\n\t\t\t\t\t\th.AssertTrue(t, found)\n\t\t\t\t\t\th.AssertEq(t, path, filepath.Join(tmpDir, \"windows\", \"amd64\", \"v2\", \"windows@10.0.20348.1970\"))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"version doesn't exist\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\ttarget = dist.Target{OS: \"windows\", Arch: \"amd64\", ArchVariant: \"v2\", Distributions: []dist.Distribution{{Name: \"windows\", Version: \"foo\"}}}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the most specific matching directory (<root>/<os directory>/<arch directory>/<variant directory>)\", func() {\n\t\t\t\t\t\tfound, path := buildpack.PlatformRootFolder(bpURI, target)\n\t\t\t\t\t\th.AssertTrue(t, found)\n\t\t\t\t\t\th.AssertEq(t, path, filepath.Join(tmpDir, \"windows\", \"amd64\", \"v2\"))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/buildpack/oci_layout_package.go",
    "content": "package buildpack\n\nimport (\n\t\"archive/tar\"\n\t\"compress/gzip\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"path\"\n\t\"strings\"\n\n\t\"github.com/docker/docker/pkg/ioutils\"\n\tv1 \"github.com/opencontainers/image-spec/specs-go/v1\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\tblob2 \"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\n// IsOCILayoutBlob checks whether a blob is in OCI layout format.\nfunc IsOCILayoutBlob(blob blob2.Blob) (bool, error) {\n\treadCloser, err := blob.Open()\n\tif err != nil {\n\t\treturn false, err\n\t}\n\tdefer readCloser.Close()\n\n\t_, _, err = archive.ReadTarEntry(readCloser, v1.ImageLayoutFile)\n\tif err != nil {\n\t\tif archive.IsEntryNotExist(err) {\n\t\t\treturn false, nil\n\t\t}\n\n\t\treturn false, err\n\t}\n\n\treturn true, nil\n}\n\n// BuildpacksFromOCILayoutBlob constructs buildpacks from a blob in OCI layout format.\nfunc BuildpacksFromOCILayoutBlob(blob Blob) (mainBP BuildModule, dependencies []BuildModule, err error) {\n\tlayoutPackage, err := newOCILayoutPackage(blob, KindBuildpack)\n\tif err != nil {\n\t\treturn nil, nil, err\n\t}\n\n\treturn extractBuildpacks(layoutPackage)\n}\n\n// ExtensionsFromOCILayoutBlob constructs extensions from a blob in OCI layout format.\nfunc ExtensionsFromOCILayoutBlob(blob Blob) (mainExt BuildModule, err error) {\n\tlayoutPackage, err := newOCILayoutPackage(blob, KindExtension)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn extractExtensions(layoutPackage)\n}\n\nfunc ConfigFromOCILayoutBlob(blob Blob) (config v1.ImageConfig, err error) {\n\tlayoutPackage, err := newOCILayoutPackage(blob, KindBuildpack)\n\tif err != nil {\n\t\treturn v1.ImageConfig{}, err\n\t}\n\treturn layoutPackage.imageInfo.Config, nil\n}\n\ntype ociLayoutPackage struct {\n\timageInfo v1.Image\n\tmanifest  
v1.Manifest\n\tblob      Blob\n}\n\nfunc newOCILayoutPackage(blob Blob, kind string) (*ociLayoutPackage, error) {\n\tindex := &v1.Index{}\n\n\tif err := unmarshalJSONFromBlob(blob, v1.ImageIndexFile, index); err != nil {\n\t\treturn nil, err\n\t}\n\n\tvar manifestDescriptor *v1.Descriptor\n\tfor _, m := range index.Manifests {\n\t\tif m.MediaType == \"application/vnd.docker.distribution.manifest.v2+json\" || m.MediaType == v1.MediaTypeImageManifest {\n\t\t\tmanifestDescriptor = &m // nolint:exportloopref\n\t\t\tbreak\n\t\t}\n\t}\n\n\tif manifestDescriptor == nil {\n\t\treturn nil, errors.New(\"unable to find manifest\")\n\t}\n\n\tmanifest := &v1.Manifest{}\n\tif err := unmarshalJSONFromBlob(blob, pathFromDescriptor(*manifestDescriptor), manifest); err != nil {\n\t\treturn nil, err\n\t}\n\n\timageInfo := &v1.Image{}\n\tif err := unmarshalJSONFromBlob(blob, pathFromDescriptor(manifest.Config), imageInfo); err != nil {\n\t\treturn nil, err\n\t}\n\tvar layersLabel string\n\tswitch kind {\n\tcase KindBuildpack:\n\t\tlayersLabel = imageInfo.Config.Labels[dist.BuildpackLayersLabel]\n\t\tif layersLabel == \"\" {\n\t\t\treturn nil, errors.Errorf(\"label %s not found\", style.Symbol(dist.BuildpackLayersLabel))\n\t\t}\n\tcase KindExtension:\n\t\tlayersLabel = imageInfo.Config.Labels[dist.ExtensionLayersLabel]\n\t\tif layersLabel == \"\" {\n\t\t\treturn nil, errors.Errorf(\"label %s not found\", style.Symbol(dist.ExtensionLayersLabel))\n\t\t}\n\tdefault:\n\t\treturn nil, fmt.Errorf(\"unknown module kind: %s\", kind)\n\t}\n\n\tbpLayers := dist.ModuleLayers{}\n\tif err := json.Unmarshal([]byte(layersLabel), &bpLayers); err != nil {\n\t\treturn nil, errors.Wrap(err, \"unmarshaling layers label\")\n\t}\n\n\treturn &ociLayoutPackage{\n\t\timageInfo: *imageInfo,\n\t\tmanifest:  *manifest,\n\t\tblob:      blob,\n\t}, nil\n}\n\nfunc (o *ociLayoutPackage) Label(name string) (value string, err error) {\n\treturn o.imageInfo.Config.Labels[name], nil\n}\n\nfunc (o *ociLayoutPackage) 
GetLayer(diffID string) (io.ReadCloser, error) {\n\tindex := -1\n\tfor i, dID := range o.imageInfo.RootFS.DiffIDs {\n\t\tif dID.String() == diffID {\n\t\t\tindex = i\n\t\t\tbreak\n\t\t}\n\t}\n\tif index == -1 {\n\t\treturn nil, errors.Errorf(\"layer %s not found in rootfs\", style.Symbol(diffID))\n\t}\n\n\tlayerDescriptor := o.manifest.Layers[index]\n\tlayerPath := paths.CanonicalTarPath(pathFromDescriptor(layerDescriptor))\n\n\tblobReader, err := o.blob.Open()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\ttr := tar.NewReader(blobReader)\n\tfor {\n\t\theader, err := tr.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrap(err, \"failed to get next tar entry\")\n\t\t}\n\n\t\tif paths.CanonicalTarPath(header.Name) == layerPath {\n\t\t\tfinalReader := blobReader\n\n\t\t\tif strings.HasSuffix(layerDescriptor.MediaType, \"gzip\") {\n\t\t\t\tfinalReader, err = gzip.NewReader(tr)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn nil, err\n\t\t\t\t}\n\t\t\t}\n\n\t\t\treturn ioutils.NewReadCloserWrapper(finalReader, func() error {\n\t\t\t\tif err := finalReader.Close(); err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\n\t\t\t\treturn blobReader.Close()\n\t\t\t}), nil\n\t\t}\n\t}\n\n\tif err := blobReader.Close(); err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn nil, errors.Errorf(\"layer blob %s not found\", style.Symbol(layerPath))\n}\n\nfunc pathFromDescriptor(descriptor v1.Descriptor) string {\n\treturn path.Join(\"/blobs\", descriptor.Digest.Algorithm().String(), descriptor.Digest.Encoded())\n}\n\nfunc unmarshalJSONFromBlob(blob Blob, path string, obj interface{}) error {\n\treader, err := blob.Open()\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer reader.Close()\n\n\t_, contents, err := archive.ReadTarEntry(reader, path)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif err = json.Unmarshal(contents, obj); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/buildpack/oci_layout_package_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"fmt\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\tv1 \"github.com/opencontainers/image-spec/specs-go/v1\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestOCILayoutPackage(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Extract\", testOCILayoutPackage, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\ntype testCase struct {\n\tmediatype string\n\tfile      string\n}\n\nfunc testOCILayoutPackage(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#BuildpacksFromOCILayoutBlob\", func() {\n\t\tfor _, test := range []testCase{\n\t\t\t{\n\t\t\t\tmediatype: \"application/vnd.docker.distribution.manifest.v2+json\",\n\t\t\t\tfile:      \"hello-universe.cnb\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tmediatype: v1.MediaTypeImageManifest,\n\t\t\t\tfile:      \"hello-universe-oci.cnb\",\n\t\t\t},\n\t\t} {\n\t\t\tit(fmt.Sprintf(\"extracts buildpacks, media type: %s\", test.mediatype), func() {\n\t\t\t\tmainBP, depBPs, err := buildpack.BuildpacksFromOCILayoutBlob(blob.NewBlob(filepath.Join(\"testdata\", test.file)))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().ID, \"io.buildpacks.samples.hello-universe\")\n\t\t\t\th.AssertEq(t, mainBP.Descriptor().Info().Version, \"0.0.1\")\n\t\t\t\th.AssertEq(t, len(depBPs), 2)\n\t\t\t})\n\n\t\t\tit(fmt.Sprintf(\"provides readable blobs, media type: %s\", test.mediatype), func() {\n\t\t\t\tmainBP, depBPs, err := buildpack.BuildpacksFromOCILayoutBlob(blob.NewBlob(filepath.Join(\"testdata\", test.file)))\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tfor 
_, bp := range append([]buildpack.BuildModule{mainBP}, depBPs...) {\n\t\t\t\t\treader, err := bp.Open()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t_, contents, err := archive.ReadTarEntry(\n\t\t\t\t\t\treader,\n\t\t\t\t\t\tfmt.Sprintf(\"/cnb/buildpacks/%s/%s/buildpack.toml\",\n\t\t\t\t\t\t\tbp.Descriptor().Info().ID,\n\t\t\t\t\t\t\tbp.Descriptor().Info().Version,\n\t\t\t\t\t\t),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertContains(t, string(contents), bp.Descriptor().Info().ID)\n\t\t\t\t\th.AssertContains(t, string(contents), bp.Descriptor().Info().Version)\n\t\t\t\t}\n\t\t\t})\n\t\t}\n\t})\n\n\twhen(\"#ExtensionsFromOCILayoutBlob\", func() {\n\t\tit(\"extracts extensions\", func() {\n\t\t\text, err := buildpack.ExtensionsFromOCILayoutBlob(blob.NewBlob(filepath.Join(\"testdata\", \"tree-extension.cnb\")))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, ext.Descriptor().Info().ID, \"samples-tree\")\n\t\t\th.AssertEq(t, ext.Descriptor().Info().Version, \"0.0.1\")\n\t\t})\n\n\t\tit(\"provides readable blobs\", func() {\n\t\t\text, err := buildpack.ExtensionsFromOCILayoutBlob(blob.NewBlob(filepath.Join(\"testdata\", \"tree-extension.cnb\")))\n\t\t\th.AssertNil(t, err)\n\t\t\treader, err := ext.Open()\n\t\t\th.AssertNil(t, err)\n\n\t\t\t_, contents, err := archive.ReadTarEntry(\n\t\t\t\treader,\n\t\t\t\tfmt.Sprintf(\"/cnb/extensions/%s/%s/extension.toml\",\n\t\t\t\t\text.Descriptor().Info().ID,\n\t\t\t\t\text.Descriptor().Info().Version,\n\t\t\t\t),\n\t\t\t)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertContains(t, string(contents), ext.Descriptor().Info().ID)\n\t\t\th.AssertContains(t, string(contents), ext.Descriptor().Info().Version)\n\t\t})\n\t})\n\n\twhen(\"#IsOCILayoutBlob\", func() {\n\t\twhen(\"is an OCI layout blob\", func() {\n\t\t\tit(\"returns true\", func() {\n\t\t\t\tisOCILayoutBlob, err := buildpack.IsOCILayoutBlob(blob.NewBlob(filepath.Join(\"testdata\", \"hello-universe.cnb\")))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, 
isOCILayoutBlob, true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"is NOT an OCI layout blob\", func() {\n\t\t\tit(\"returns false\", func() {\n\t\t\t\tbuildpackBlob, err := fakes.NewFakeBuildpackBlob(&dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"bp.id\",\n\t\t\t\t\t\tVersion: \"bp.version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithStacks: []dist.Stack{{}},\n\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t}, 0755)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tisOCILayoutBlob, err := buildpack.IsOCILayoutBlob(buildpackBlob)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, isOCILayoutBlob, false)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/buildpack/package.go",
    "content": "package buildpack\n\nimport (\n\t\"io\"\n\t\"strings\"\n\t\"sync\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\ntype Package interface {\n\tLabel(name string) (value string, err error)\n\tGetLayer(diffID string) (io.ReadCloser, error)\n}\n\ntype syncPkg struct {\n\tmu  sync.Mutex\n\tpkg Package\n}\n\nfunc (s *syncPkg) Label(name string) (value string, err error) {\n\ts.mu.Lock()\n\tdefer s.mu.Unlock()\n\treturn s.pkg.Label(name)\n}\n\nfunc (s *syncPkg) GetLayer(diffID string) (io.ReadCloser, error) {\n\ts.mu.Lock()\n\tdefer s.mu.Unlock()\n\treturn s.pkg.GetLayer(diffID)\n}\n\n// extractBuildpacks when provided a flattened buildpack package containing N buildpacks,\n// will return N modules: 1 module with a single tar containing ALL N buildpacks, and N-1 modules with empty tar files.\nfunc extractBuildpacks(pkg Package) (mainBP BuildModule, depBPs []BuildModule, err error) {\n\tpkg = &syncPkg{pkg: pkg}\n\tmd := &Metadata{}\n\tif found, err := dist.GetLabel(pkg, MetadataLabel, md); err != nil {\n\t\treturn nil, nil, err\n\t} else if !found {\n\t\treturn nil, nil, errors.Errorf(\n\t\t\t\"could not find label %s\",\n\t\t\tstyle.Symbol(MetadataLabel),\n\t\t)\n\t}\n\n\tpkgLayers := dist.ModuleLayers{}\n\tok, err := dist.GetLabel(pkg, dist.BuildpackLayersLabel, &pkgLayers)\n\tif err != nil {\n\t\treturn nil, nil, err\n\t}\n\n\tif !ok {\n\t\treturn nil, nil, errors.Errorf(\n\t\t\t\"could not find label %s\",\n\t\t\tstyle.Symbol(dist.BuildpackLayersLabel),\n\t\t)\n\t}\n\n\t// Example `dist.ModuleLayers{}`:\n\t//\n\t//{\n\t//  \"samples/hello-moon\": {\n\t//    \"0.0.1\": {\n\t//      \"api\": \"0.2\",\n\t//      \"stacks\": [\n\t//        {\n\t//          \"id\": \"*\"\n\t//        }\n\t//      ],\n\t//      \"layerDiffID\": \"sha256:37ab46923c181aa5fb27c9a23479a38aec2679237f35a0ea4115e5ae81a17bba\",\n\t//      \"homepage\": 
\"https://github.com/buildpacks/samples/tree/main/buildpacks/hello-moon\",\n\t//      \"name\": \"Hello Moon Buildpack\"\n\t//    }\n\t//  }\n\t//}\n\n\t// If the package is a flattened buildpack, the first buildpack in the package returns all the tar content,\n\t// and subsequent buildpacks return an empty tar.\n\tvar processedDiffIDs = make(map[string]bool)\n\tfor bpID, v := range pkgLayers {\n\t\tfor bpVersion, bpInfo := range v {\n\t\t\tdesc := dist.BuildpackDescriptor{\n\t\t\t\tWithAPI: bpInfo.API,\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:       bpID,\n\t\t\t\t\tVersion:  bpVersion,\n\t\t\t\t\tHomepage: bpInfo.Homepage,\n\t\t\t\t\tName:     bpInfo.Name,\n\t\t\t\t},\n\t\t\t\tWithStacks:  bpInfo.Stacks,\n\t\t\t\tWithTargets: bpInfo.Targets,\n\t\t\t\tWithOrder:   bpInfo.Order,\n\t\t\t}\n\n\t\t\tdiffID := bpInfo.LayerDiffID // Allow use in closure\n\n\t\t\tvar openerFunc func() (io.ReadCloser, error)\n\t\t\tif _, ok := processedDiffIDs[diffID]; ok {\n\t\t\t\t// We already processed a layer with this diffID, so the module must be flattened;\n\t\t\t\t// return an empty reader to avoid multiple tars with the same content.\n\t\t\t\topenerFunc = func() (io.ReadCloser, error) {\n\t\t\t\t\treturn io.NopCloser(strings.NewReader(\"\")), nil\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\topenerFunc = func() (io.ReadCloser, error) {\n\t\t\t\t\trc, err := pkg.GetLayer(diffID)\n\t\t\t\t\tif err != nil {\n\t\t\t\t\t\treturn nil, errors.Wrapf(err,\n\t\t\t\t\t\t\t\"extracting buildpack %s layer (diffID %s)\",\n\t\t\t\t\t\t\tstyle.Symbol(desc.Info().FullName()),\n\t\t\t\t\t\t\tstyle.Symbol(diffID),\n\t\t\t\t\t\t)\n\t\t\t\t\t}\n\t\t\t\t\treturn rc, nil\n\t\t\t\t}\n\t\t\t\tprocessedDiffIDs[diffID] = true\n\t\t\t}\n\n\t\t\tb := &openerBlob{\n\t\t\t\topener: openerFunc,\n\t\t\t}\n\n\t\t\tif desc.Info().Match(md.ModuleInfo) { // Current module is the order buildpack of the package\n\t\t\t\tmainBP = FromBlob(&desc, b)\n\t\t\t} else {\n\t\t\t\tdepBPs = append(depBPs, FromBlob(&desc, 
b))\n\t\t\t}\n\t\t}\n\t}\n\n\treturn mainBP, depBPs, nil\n}\n\nfunc extractExtensions(pkg Package) (mainExt BuildModule, err error) {\n\tpkg = &syncPkg{pkg: pkg}\n\tmd := &Metadata{}\n\tif found, err := dist.GetLabel(pkg, MetadataLabel, md); err != nil {\n\t\treturn nil, err\n\t} else if !found {\n\t\treturn nil, errors.Errorf(\n\t\t\t\"could not find label %s\",\n\t\t\tstyle.Symbol(MetadataLabel),\n\t\t)\n\t}\n\n\tpkgLayers := dist.ModuleLayers{}\n\tok, err := dist.GetLabel(pkg, dist.ExtensionLayersLabel, &pkgLayers)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif !ok {\n\t\treturn nil, errors.Errorf(\n\t\t\t\"could not find label %s\",\n\t\t\tstyle.Symbol(dist.ExtensionLayersLabel),\n\t\t)\n\t}\n\tfor extID, v := range pkgLayers {\n\t\tfor extVersion, extInfo := range v {\n\t\t\tdesc := dist.ExtensionDescriptor{\n\t\t\t\tWithAPI: extInfo.API,\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:       extID,\n\t\t\t\t\tVersion:  extVersion,\n\t\t\t\t\tHomepage: extInfo.Homepage,\n\t\t\t\t\tName:     extInfo.Name,\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tdiffID := extInfo.LayerDiffID // Allow use in closure\n\t\t\tb := &openerBlob{\n\t\t\t\topener: func() (io.ReadCloser, error) {\n\t\t\t\t\trc, err := pkg.GetLayer(diffID)\n\t\t\t\t\tif err != nil {\n\t\t\t\t\t\treturn nil, errors.Wrapf(err,\n\t\t\t\t\t\t\t\"extracting extension %s layer (diffID %s)\",\n\t\t\t\t\t\t\tstyle.Symbol(desc.Info().FullName()),\n\t\t\t\t\t\t\tstyle.Symbol(diffID),\n\t\t\t\t\t\t)\n\t\t\t\t\t}\n\t\t\t\t\treturn rc, nil\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tmainExt = FromBlob(&desc, b)\n\t\t}\n\t}\n\treturn mainExt, nil\n}\n\ntype openerBlob struct {\n\topener func() (io.ReadCloser, error)\n}\n\nfunc (b *openerBlob) Open() (io.ReadCloser, error) {\n\treturn b.opener()\n}\n"
  },
  {
    "path": "pkg/buildpack/parse_name.go",
    "content": "package buildpack\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n)\n\n// ParseIDLocator parses a buildpack locator in the following formats into its ID and version.\n//\n//   - <id>[@<version>]\n//   - urn:cnb:builder:<id>[@<version>]\n//   - urn:cnb:registry:<id>[@<version>]\n//   - from=builder:<id>[@<version>] (deprecated)\n//\n// If version is omitted, the version returned will be empty. Any \"from=builder:\" or \"urn:cnb\" prefix will be ignored.\nfunc ParseIDLocator(locator string) (id string, version string) {\n\tnakedLocator := parseRegistryLocator(parseBuilderLocator(locator))\n\n\tparts := strings.Split(nakedLocator, \"@\")\n\tif len(parts) == 2 {\n\t\treturn parts[0], parts[1]\n\t}\n\treturn parts[0], \"\"\n}\n\n// ParsePackageLocator parses a locator (in format `[docker://][<host>/]<path>[:<tag>⏐@<digest>]`) to image name (`[<host>/]<path>[:<tag>⏐@<digest>]`)\nfunc ParsePackageLocator(locator string) (imageName string) {\n\treturn strings.TrimPrefix(\n\t\tstrings.TrimPrefix(\n\t\t\tstrings.TrimPrefix(locator, fromDockerPrefix+\"//\"),\n\t\t\tfromDockerPrefix+\"/\"),\n\t\tfromDockerPrefix)\n}\n\n// ParseRegistryID parses a registry ID (i.e. 
`<namespace>/<name>@<version>`) into namespace, name and version components.\n//\n// Supported formats:\n//   - <ns>/<name>[@<version>]\n//   - urn:cnb:registry:<ns>/<name>[@<version>]\nfunc ParseRegistryID(registryID string) (namespace string, name string, version string, err error) {\n\tid, version := ParseIDLocator(registryID)\n\n\tparts := strings.Split(id, \"/\")\n\tif len(parts) != 2 {\n\t\treturn \"\", \"\", \"\", fmt.Errorf(\"invalid registry ID: %s\", registryID)\n\t}\n\n\treturn parts[0], parts[1], version, nil\n}\n\nfunc parseRegistryLocator(locator string) (path string) {\n\treturn strings.TrimPrefix(locator, fromRegistryPrefix+\":\")\n}\n\nfunc parseBuilderLocator(locator string) (path string) {\n\treturn strings.TrimPrefix(\n\t\tstrings.TrimPrefix(locator, deprecatedFromBuilderPrefix+\":\"),\n\t\tfromBuilderPrefix+\":\")\n}\n"
  },
  {
    "path": "pkg/buildpack/parse_name_test.go",
    "content": "package buildpack_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestParseName(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"ParseName\", testParseName, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testParseName(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t)\n\n\twhen(\"#ParseIDLocator\", func() {\n\t\ttype testParams struct {\n\t\t\tdesc            string\n\t\t\tlocator         string\n\t\t\texpectedID      string\n\t\t\texpectedVersion string\n\t\t}\n\n\t\tfor _, params := range []testParams{\n\t\t\t{\n\t\t\t\tdesc:            \"naked id+version\",\n\t\t\t\tlocator:         \"ns/name@0.0.1\",\n\t\t\t\texpectedID:      \"ns/name\",\n\t\t\t\texpectedVersion: \"0.0.1\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:            \"naked only id\",\n\t\t\t\tlocator:         \"ns/name\",\n\t\t\t\texpectedID:      \"ns/name\",\n\t\t\t\texpectedVersion: \"\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:            \"from=builder id+version\",\n\t\t\t\tlocator:         \"from=builder:ns/name@1.2.3\",\n\t\t\t\texpectedID:      \"ns/name\",\n\t\t\t\texpectedVersion: \"1.2.3\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:            \"urn:cnb:builder id+version\",\n\t\t\t\tlocator:         \"urn:cnb:builder:ns/name@1.2.3\",\n\t\t\t\texpectedID:      \"ns/name\",\n\t\t\t\texpectedVersion: \"1.2.3\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:            \"urn:cnb:registry id+version\",\n\t\t\t\tlocator:         \"urn:cnb:registry:ns/name@1.2.3\",\n\t\t\t\texpectedID:      \"ns/name\",\n\t\t\t\texpectedVersion: \"1.2.3\",\n\t\t\t},\n\t\t} {\n\t\t\tparams := params\n\t\t\twhen(params.desc+\" \"+params.locator, func() {\n\t\t\t\tit(\"should parse as id=\"+params.expectedID+\" and version=\"+params.expectedVersion, 
func() {\n\t\t\t\t\tid, version := buildpack.ParseIDLocator(params.locator)\n\t\t\t\t\tassert.Equal(id, params.expectedID)\n\t\t\t\t\tassert.Equal(version, params.expectedVersion)\n\t\t\t\t})\n\t\t\t})\n\t\t}\n\t})\n\n\twhen(\"#ParsePackageLocator\", func() {\n\t\ttype testParams struct {\n\t\t\tdesc              string\n\t\t\tlocator           string\n\t\t\texpectedImageName string\n\t\t}\n\n\t\tfor _, params := range []testParams{\n\t\t\t{\n\t\t\t\tdesc:              \"docker scheme (missing host)\",\n\t\t\t\tlocator:           \"docker:///ns/name:latest\",\n\t\t\t\texpectedImageName: \"ns/name:latest\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:              \"docker scheme (missing host shorthand)\",\n\t\t\t\tlocator:           \"docker:/ns/name:latest\",\n\t\t\t\texpectedImageName: \"ns/name:latest\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:              \"docker scheme\",\n\t\t\t\tlocator:           \"docker://docker.io/ns/name:latest\",\n\t\t\t\texpectedImageName: \"docker.io/ns/name:latest\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:              \"schemaless w/ host\",\n\t\t\t\tlocator:           \"docker.io/ns/name:latest\",\n\t\t\t\texpectedImageName: \"docker.io/ns/name:latest\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:              \"schemaless w/o host\",\n\t\t\t\tlocator:           \"ns/name:latest\",\n\t\t\t\texpectedImageName: \"ns/name:latest\",\n\t\t\t},\n\t\t} {\n\t\t\tparams := params\n\t\t\twhen(params.desc+\" \"+params.locator, func() {\n\t\t\t\tit(\"should parse as \"+params.expectedImageName, func() {\n\t\t\t\t\timageName := buildpack.ParsePackageLocator(params.locator)\n\t\t\t\t\tassert.Equal(imageName, params.expectedImageName)\n\t\t\t\t})\n\t\t\t})\n\t\t}\n\t})\n\n\twhen(\"#ParseRegistryID\", func() {\n\t\ttype testParams struct {\n\t\t\tdesc,\n\t\t\tlocator,\n\t\t\texpectedNS,\n\t\t\texpectedName,\n\t\t\texpectedVersion,\n\t\t\texpectedErr string\n\t\t}\n\n\t\tfor _, params := range []testParams{\n\t\t\t{\n\t\t\t\tdesc:            \"naked 
id+version\",\n\t\t\t\tlocator:         \"ns/name@0.1.2\",\n\t\t\t\texpectedNS:      \"ns\",\n\t\t\t\texpectedName:    \"name\",\n\t\t\t\texpectedVersion: \"0.1.2\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:            \"naked id\",\n\t\t\t\tlocator:         \"ns/name\",\n\t\t\t\texpectedNS:      \"ns\",\n\t\t\t\texpectedName:    \"name\",\n\t\t\t\texpectedVersion: \"\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:            \"urn:cnb:registry ref\",\n\t\t\t\tlocator:         \"urn:cnb:registry:ns/name@1.2.3\",\n\t\t\t\texpectedNS:      \"ns\",\n\t\t\t\texpectedName:    \"name\",\n\t\t\t\texpectedVersion: \"1.2.3\",\n\t\t\t},\n\t\t\t{\n\t\t\t\tdesc:        \"invalid id\",\n\t\t\t\tlocator:     \"invalid/id/name@1.2.3\",\n\t\t\t\texpectedErr: \"invalid registry ID: invalid/id/name@1.2.3\",\n\t\t\t},\n\t\t} {\n\t\t\tparams := params\n\t\t\twhen(params.desc, func() {\n\t\t\t\tif params.expectedErr != \"\" {\n\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\t_, _, _, err := buildpack.ParseRegistryID(params.locator)\n\t\t\t\t\t\tassert.ErrorWithMessage(err, params.expectedErr)\n\t\t\t\t\t})\n\t\t\t\t} else {\n\t\t\t\t\tit(\"parses\", func() {\n\t\t\t\t\t\tns, name, version, err := buildpack.ParseRegistryID(params.locator)\n\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\tassert.Equal(ns, params.expectedNS)\n\t\t\t\t\t\tassert.Equal(name, params.expectedName)\n\t\t\t\t\t\tassert.Equal(version, params.expectedVersion)\n\t\t\t\t\t})\n\t\t\t\t}\n\t\t\t})\n\t\t}\n\t})\n}\n"
  },
  {
    "path": "pkg/buildpack/testdata/buildpack/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/buildpack/testdata/buildpack/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/buildpack/testdata/buildpack/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.buildpack\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "pkg/buildpack/testdata/buildpack-with-hardlink/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/buildpack/testdata/buildpack-with-hardlink/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/buildpack/testdata/buildpack-with-hardlink/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.buildpack\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "pkg/buildpack/testdata/buildpack-with-hardlink/original-file",
    "content": "foo\n"
  },
  {
    "path": "pkg/buildpack/testdata/extension/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/buildpack/testdata/extension/bin/generate",
    "content": "generate-contents"
  },
  {
    "path": "pkg/buildpack/testdata/extension/extension.toml",
    "content": "api = \"0.9\"\n\n[extension]\nid = \"ext.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.extension\"\n"
  },
  {
    "path": "pkg/buildpack/testdata/package.toml",
    "content": "[buildpack]\nuri = \"https://example.com/bp/a.tgz\"\n\n[[dependencies]]\nuri = \"https://example.com/bp/b.tgz\"\n\n[[dependencies]]\nuri = \"bp/c\"\n\n[[dependencies]]\nimage = \"registry.example.com/bp/d\"\n"
  },
  {
    "path": "pkg/cache/bind_cache.go",
    "content": "package cache\n\nimport (\n\t\"context\"\n\t\"os\"\n)\n\ntype BindCache struct {\n\tdocker DockerClient\n\tbind   string\n}\n\nfunc NewBindCache(cacheType CacheInfo, dockerClient DockerClient) *BindCache {\n\treturn &BindCache{\n\t\tbind:   cacheType.Source,\n\t\tdocker: dockerClient,\n\t}\n}\n\nfunc (c *BindCache) Name() string {\n\treturn c.bind\n}\n\nfunc (c *BindCache) Clear(ctx context.Context) error {\n\terr := os.RemoveAll(c.bind)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn nil\n}\n\nfunc (c *BindCache) Type() Type {\n\treturn Bind\n}\n"
  },
  {
    "path": "pkg/cache/cache_opts.go",
    "content": "package cache\n\nimport (\n\t\"encoding/csv\"\n\t\"fmt\"\n\t\"path/filepath\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n)\n\ntype Format int\ntype CacheInfo struct {\n\tFormat Format\n\tSource string\n}\n\ntype CacheOpts struct {\n\tBuild  CacheInfo\n\tLaunch CacheInfo\n\tKaniko CacheInfo\n}\n\nconst (\n\tCacheVolume Format = iota\n\tCacheImage\n\tCacheBind\n)\n\nfunc (f Format) String() string {\n\tswitch f {\n\tcase CacheImage:\n\t\treturn \"image\"\n\tcase CacheVolume:\n\t\treturn \"volume\"\n\tcase CacheBind:\n\t\treturn \"bind\"\n\t}\n\treturn \"\"\n}\n\nfunc (c *CacheInfo) SourceName() string {\n\tswitch c.Format {\n\tcase CacheImage:\n\t\tfallthrough\n\tcase CacheVolume:\n\t\treturn \"name\"\n\tcase CacheBind:\n\t\treturn \"source\"\n\t}\n\treturn \"\"\n}\n\nfunc (c *CacheOpts) Set(value string) error {\n\tcsvReader := csv.NewReader(strings.NewReader(value))\n\tcsvReader.Comma = ';'\n\tfields, err := csvReader.Read()\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tcache := &c.Build\n\tfor _, field := range fields {\n\t\tparts := strings.SplitN(field, \"=\", 2)\n\t\tif len(parts) != 2 {\n\t\t\treturn errors.Errorf(\"invalid field '%s' must be a key=value pair\", field)\n\t\t}\n\t\tkey := strings.ToLower(parts[0])\n\t\tvalue := parts[1]\n\t\tif key == \"type\" {\n\t\t\tswitch value {\n\t\t\tcase \"build\":\n\t\t\t\tcache = &c.Build\n\t\t\tcase \"launch\":\n\t\t\t\tcache = &c.Launch\n\t\t\tdefault:\n\t\t\t\treturn errors.Errorf(\"invalid cache type '%s'\", value)\n\t\t\t}\n\t\t\tbreak\n\t\t}\n\t}\n\n\tfor _, field := range fields {\n\t\tparts := strings.SplitN(field, \"=\", 2)\n\t\tif len(parts) != 2 {\n\t\t\treturn errors.Errorf(\"invalid field '%s' must be a key=value pair\", field)\n\t\t}\n\t\tkey := strings.ToLower(parts[0])\n\t\tvalue := parts[1]\n\t\tswitch key {\n\t\tcase \"format\":\n\t\t\tswitch value {\n\t\t\tcase \"image\":\n\t\t\t\tcache.Format = CacheImage\n\t\t\tcase \"volume\":\n\t\t\t\tcache.Format = CacheVolume\n\t\t\tcase 
\"bind\":\n\t\t\t\tcache.Format = CacheBind\n\t\t\tdefault:\n\t\t\t\treturn errors.Errorf(\"invalid cache format '%s'\", value)\n\t\t\t}\n\t\tcase \"name\":\n\t\t\tcache.Source = value\n\t\tcase \"source\":\n\t\t\tcache.Source = value\n\t\t}\n\t}\n\n\terr = sanitize(c)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn nil\n}\n\nfunc (c *CacheOpts) String() string {\n\tvar cacheFlag string\n\tcacheFlag = fmt.Sprintf(\"type=build;format=%s;\", c.Build.Format.String())\n\tif c.Build.Source != \"\" {\n\t\tcacheFlag += fmt.Sprintf(\"%s=%s;\", c.Build.SourceName(), c.Build.Source)\n\t}\n\n\tcacheFlag += fmt.Sprintf(\"type=launch;format=%s;\", c.Launch.Format.String())\n\tif c.Launch.Source != \"\" {\n\t\tcacheFlag += fmt.Sprintf(\"%s=%s;\", c.Launch.SourceName(), c.Launch.Source)\n\t}\n\n\treturn cacheFlag\n}\n\nfunc (c *CacheOpts) Type() string {\n\treturn \"cache\"\n}\n\nfunc sanitize(c *CacheOpts) error {\n\tfor _, v := range []CacheInfo{c.Build, c.Launch} {\n\t\t// volume cache name can be auto-generated\n\t\tif v.Format != CacheVolume && v.Source == \"\" {\n\t\t\treturn errors.Errorf(\"cache '%s' is required\", v.SourceName())\n\t\t}\n\t}\n\n\tvar (\n\t\tresolvedPath string\n\t\terr          error\n\t)\n\tif c.Build.Format == CacheBind {\n\t\tif resolvedPath, err = filepath.Abs(c.Build.Source); err != nil {\n\t\t\treturn errors.Wrap(err, \"resolve absolute path\")\n\t\t}\n\t\tc.Build.Source = filepath.Join(resolvedPath, \"build-cache\")\n\t}\n\tif c.Launch.Format == CacheBind {\n\t\tif resolvedPath, err = filepath.Abs(c.Launch.Source); err != nil {\n\t\t\treturn errors.Wrap(err, \"resolve absolute path\")\n\t\t}\n\t\tc.Launch.Source = filepath.Join(resolvedPath, \"launch-cache\")\n\t}\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/cache/cache_opts_test.go",
    "content": "package cache\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"runtime\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\ntype CacheOptTestCase struct {\n\tname       string\n\tinput      string\n\toutput     string\n\tshouldFail bool\n}\n\nfunc TestMetadata(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Metadata\", testCacheOpts, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testCacheOpts(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"image cache format options are passed\", func() {\n\t\tit(\"with complete options\", func() {\n\t\t\ttestcases := []CacheOptTestCase{\n\t\t\t\t{\n\t\t\t\t\tname:   \"Build cache as Image\",\n\t\t\t\t\tinput:  \"type=build;format=image;name=io.test.io/myorg/my-cache:build\",\n\t\t\t\t\toutput: \"type=build;format=image;name=io.test.io/myorg/my-cache:build;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:   \"Launch cache as Image\",\n\t\t\t\t\tinput:  \"type=launch;format=image;name=io.test.io/myorg/my-cache:build\",\n\t\t\t\t\toutput: \"type=build;format=volume;type=launch;format=image;name=io.test.io/myorg/my-cache:build;\",\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tfor _, testcase := range testcases {\n\t\t\t\tvar cacheFlags CacheOpts\n\t\t\t\tt.Logf(\"Testing cache type: %s\", testcase.name)\n\t\t\t\terr := cacheFlags.Set(testcase.input)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, testcase.output, cacheFlags.String())\n\t\t\t}\n\t\t})\n\n\t\tit(\"with missing options\", func() {\n\t\t\tsuccessTestCases := []CacheOptTestCase{\n\t\t\t\t{\n\t\t\t\t\tname:   \"Build cache as Image missing: type\",\n\t\t\t\t\tinput:  \"format=image;name=io.test.io/myorg/my-cache:build\",\n\t\t\t\t\toutput: \"type=build;format=image;name=io.test.io/myorg/my-cache:build;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname: 
  \"Build cache as Image missing: format\",\n\t\t\t\t\tinput:  \"type=build;name=io.test.io/myorg/my-cache:build\",\n\t\t\t\t\toutput: \"type=build;format=volume;name=io.test.io/myorg/my-cache:build;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:       \"Build cache as Image missing: name\",\n\t\t\t\t\tinput:      \"type=build;format=image\",\n\t\t\t\t\toutput:     \"cache 'name' is required\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:   \"Build cache as Image missing: type, format\",\n\t\t\t\t\tinput:  \"name=io.test.io/myorg/my-cache:build\",\n\t\t\t\t\toutput: \"type=build;format=volume;name=io.test.io/myorg/my-cache:build;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:   \"Build cache as Image missing: format, name\",\n\t\t\t\t\tinput:  \"type=build\",\n\t\t\t\t\toutput: \"type=build;format=volume;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:       \"Build cache as Image missing: type, name\",\n\t\t\t\t\tinput:      \"format=image\",\n\t\t\t\t\toutput:     \"cache 'name' is required\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:       \"Launch cache as Image missing: name\",\n\t\t\t\t\tinput:      \"type=launch;format=image\",\n\t\t\t\t\toutput:     \"cache 'name' is required\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tfor _, testcase := range successTestCases {\n\t\t\t\tvar cacheFlags CacheOpts\n\t\t\t\tt.Logf(\"Testing cache type: %s\", testcase.name)\n\t\t\t\terr := cacheFlags.Set(testcase.input)\n\n\t\t\t\tif testcase.shouldFail {\n\t\t\t\t\th.AssertError(t, err, testcase.output)\n\t\t\t\t} else {\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\toutput := cacheFlags.String()\n\t\t\t\t\th.AssertEq(t, testcase.output, output)\n\t\t\t\t}\n\t\t\t}\n\t\t})\n\n\t\tit(\"with invalid options\", func() {\n\t\t\ttestcases := []CacheOptTestCase{\n\t\t\t\t{\n\t\t\t\t\tname:       \"Invalid cache type\",\n\t\t\t\t\tinput:      
\"type=invalid_cache;format=image;name=io.test.io/myorg/my-cache:build\",\n\t\t\t\t\toutput:     \"invalid cache type 'invalid_cache'\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:       \"Invalid cache format\",\n\t\t\t\t\tinput:      \"type=launch;format=invalid_format;name=io.test.io/myorg/my-cache:build\",\n\t\t\t\t\toutput:     \"invalid cache format 'invalid_format'\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:       \"Not a key=value pair\",\n\t\t\t\t\tinput:      \"launch;format=image;name=io.test.io/myorg/my-cache:build\",\n\t\t\t\t\toutput:     \"invalid field 'launch' must be a key=value pair\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:       \"Extra semicolon\",\n\t\t\t\t\tinput:      \"type=launch;format=image;name=io.test.io/myorg/my-cache:build;\",\n\t\t\t\t\toutput:     \"invalid field '' must be a key=value pair\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tfor _, testcase := range testcases {\n\t\t\t\tvar cacheFlags CacheOpts\n\t\t\t\tt.Logf(\"Testing cache type: %s\", testcase.name)\n\t\t\t\terr := cacheFlags.Set(testcase.input)\n\t\t\t\th.AssertError(t, err, testcase.output)\n\t\t\t}\n\t\t})\n\t})\n\n\twhen(\"volume cache format options are passed\", func() {\n\t\tit(\"with complete options\", func() {\n\t\t\ttestcases := []CacheOptTestCase{\n\t\t\t\t{\n\t\t\t\t\tname:   \"Build cache as Volume\",\n\t\t\t\t\tinput:  \"type=build;format=volume;name=test-build-volume-cache\",\n\t\t\t\t\toutput: \"type=build;format=volume;name=test-build-volume-cache;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:   \"Launch cache as Volume\",\n\t\t\t\t\tinput:  \"type=launch;format=volume;name=test-launch-volume-cache\",\n\t\t\t\t\toutput: \"type=build;format=volume;type=launch;format=volume;name=test-launch-volume-cache;\",\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tfor _, testcase := range testcases {\n\t\t\t\tvar cacheFlags CacheOpts\n\t\t\t\tt.Logf(\"Testing 
cache type: %s\", testcase.name)\n\t\t\t\terr := cacheFlags.Set(testcase.input)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, testcase.output, cacheFlags.String())\n\t\t\t}\n\t\t})\n\n\t\tit(\"with missing options\", func() {\n\t\t\tsuccessTestCases := []CacheOptTestCase{\n\t\t\t\t{\n\t\t\t\t\tname:   \"Launch cache as Volume missing: format\",\n\t\t\t\t\tinput:  \"type=launch;name=test-launch-volume\",\n\t\t\t\t\toutput: \"type=build;format=volume;type=launch;format=volume;name=test-launch-volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:   \"Launch cache as Volume missing: name\",\n\t\t\t\t\tinput:  \"type=launch;format=volume\",\n\t\t\t\t\toutput: \"type=build;format=volume;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:   \"Launch cache as Volume missing: format, name\",\n\t\t\t\t\tinput:  \"type=launch\",\n\t\t\t\t\toutput: \"type=build;format=volume;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:   \"Launch cache as Volume missing: type, name\",\n\t\t\t\t\tinput:  \"format=volume\",\n\t\t\t\t\toutput: \"type=build;format=volume;type=launch;format=volume;\",\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tfor _, testcase := range successTestCases {\n\t\t\t\tvar cacheFlags CacheOpts\n\t\t\t\tt.Logf(\"Testing cache type: %s\", testcase.name)\n\t\t\t\terr := cacheFlags.Set(testcase.input)\n\n\t\t\t\tif testcase.shouldFail {\n\t\t\t\t\th.AssertError(t, err, testcase.output)\n\t\t\t\t} else {\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\toutput := cacheFlags.String()\n\t\t\t\t\th.AssertEq(t, testcase.output, output)\n\t\t\t\t}\n\t\t\t}\n\t\t})\n\t})\n\n\twhen(\"bind cache format options are passed\", func() {\n\t\tit(\"with complete options\", func() {\n\t\t\tvar testcases []CacheOptTestCase\n\t\t\thomeDir, err := os.UserHomeDir()\n\t\t\th.AssertNil(t, err)\n\t\t\tcwd, err := os.Getwd()\n\t\t\th.AssertNil(t, err)\n\n\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\ttestcases = []CacheOptTestCase{\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Build 
cache as bind\",\n\t\t\t\t\t\tinput:  fmt.Sprintf(\"type=build;format=bind;source=%s/test-bind-build-cache\", homeDir),\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=bind;source=%s/test-bind-build-cache/build-cache;type=launch;format=volume;\", homeDir),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Build cache as bind with relative path\",\n\t\t\t\t\t\tinput:  \"type=build;format=bind;source=./test-bind-build-cache-relative\",\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=bind;source=%s/test-bind-build-cache-relative/build-cache;type=launch;format=volume;\", cwd),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Launch cache as bind\",\n\t\t\t\t\t\tinput:  fmt.Sprintf(\"type=launch;format=bind;source=%s/test-bind-volume-cache\", homeDir),\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=volume;type=launch;format=bind;source=%s/test-bind-volume-cache/launch-cache;\", homeDir),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Case sensitivity test with uppercase path\",\n\t\t\t\t\t\tinput:  fmt.Sprintf(\"type=build;format=bind;source=%s/TestBindBuildCache\", homeDir),\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=bind;source=%s/TestBindBuildCache/build-cache;type=launch;format=volume;\", homeDir),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Case sensitivity test with mixed case path\",\n\t\t\t\t\t\tinput:  fmt.Sprintf(\"type=build;format=bind;source=%s/TeStBiNdBuildCaChe\", homeDir),\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=bind;source=%s/TeStBiNdBuildCaChe/build-cache;type=launch;format=volume;\", homeDir),\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t} else {\n\t\t\t\ttestcases = []CacheOptTestCase{\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Build cache as bind\",\n\t\t\t\t\t\tinput:  fmt.Sprintf(\"type=build;format=bind;source=%s\\\\test-bind-build-cache\", homeDir),\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=bind;source=%s\\\\test-bind-build-cache\\\\build-cache;type=launch;format=volume;\", 
homeDir),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Build cache as bind with relative path\",\n\t\t\t\t\t\tinput:  \"type=build;format=bind;source=.\\\\test-bind-build-cache-relative\",\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=bind;source=%s\\\\test-bind-build-cache-relative\\\\build-cache;type=launch;format=volume;\", cwd),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Launch cache as bind\",\n\t\t\t\t\t\tinput:  fmt.Sprintf(\"type=launch;format=bind;source=%s\\\\test-bind-volume-cache\", homeDir),\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=volume;type=launch;format=bind;source=%s\\\\test-bind-volume-cache\\\\launch-cache;\", homeDir),\n\t\t\t\t\t},\n\t\t\t\t\t// Case sensitivity test cases for Windows\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Case sensitivity test with uppercase path\",\n\t\t\t\t\t\tinput:  fmt.Sprintf(\"type=build;format=bind;source=%s\\\\TestBindBuildCache\", homeDir),\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=bind;source=%s\\\\TestBindBuildCache\\\\build-cache;type=launch;format=volume;\", homeDir),\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\tname:   \"Case sensitivity test with mixed case path\",\n\t\t\t\t\t\tinput:  fmt.Sprintf(\"type=build;format=bind;source=%s\\\\TeStBiNdBuildCaChe\", homeDir),\n\t\t\t\t\t\toutput: fmt.Sprintf(\"type=build;format=bind;source=%s\\\\TeStBiNdBuildCaChe\\\\build-cache;type=launch;format=volume;\", homeDir),\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t}\n\n\t\t\tfor _, testcase := range testcases {\n\t\t\t\tvar cacheFlags CacheOpts\n\t\t\t\tt.Logf(\"Testing cache type: %s\", testcase.name)\n\t\t\t\terr := cacheFlags.Set(testcase.input)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, strings.ToLower(testcase.output), strings.ToLower(cacheFlags.String()))\n\t\t\t}\n\t\t})\n\n\t\tit(\"with missing options\", func() {\n\t\t\tsuccessTestCases := []CacheOptTestCase{\n\t\t\t\t{\n\t\t\t\t\tname:       \"Launch cache as bind missing: source\",\n\t\t\t\t\tinput:      
\"type=launch;format=bind\",\n\t\t\t\t\toutput:     \"cache 'source' is required\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tname:       \"Launch cache as Volume missing: type, source\",\n\t\t\t\t\tinput:      \"format=bind\",\n\t\t\t\t\toutput:     \"cache 'source' is required\",\n\t\t\t\t\tshouldFail: true,\n\t\t\t\t},\n\t\t\t}\n\n\t\t\tfor _, testcase := range successTestCases {\n\t\t\t\tvar cacheFlags CacheOpts\n\t\t\t\tt.Logf(\"Testing cache type: %s\", testcase.name)\n\t\t\t\terr := cacheFlags.Set(testcase.input)\n\n\t\t\t\tif testcase.shouldFail {\n\t\t\t\t\th.AssertError(t, err, testcase.output)\n\t\t\t\t} else {\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\toutput := cacheFlags.String()\n\t\t\t\t\th.AssertEq(t, testcase.output, output)\n\t\t\t\t}\n\t\t\t}\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/cache/consts.go",
    "content": "package cache\n\nconst (\n\tImage Type = iota\n\tVolume\n\tBind\n)\n\ntype Type int\n"
  },
  {
    "path": "pkg/cache/image_cache.go",
    "content": "package cache\n\nimport (\n\t\"context\"\n\n\tcerrdefs \"github.com/containerd/errdefs\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\tdockerClient \"github.com/moby/moby/client\"\n)\n\ntype ImageCache struct {\n\tdocker DockerClient\n\timage  string\n}\n\ntype DockerClient interface {\n\tImageRemove(ctx context.Context, image string, options dockerClient.ImageRemoveOptions) (dockerClient.ImageRemoveResult, error)\n\tVolumeRemove(ctx context.Context, volumeID string, options dockerClient.VolumeRemoveOptions) (dockerClient.VolumeRemoveResult, error)\n}\n\nfunc NewImageCache(imageRef name.Reference, dockerClient DockerClient) *ImageCache {\n\treturn &ImageCache{\n\t\timage:  imageRef.Name(),\n\t\tdocker: dockerClient,\n\t}\n}\n\nfunc (c *ImageCache) Name() string {\n\treturn c.image\n}\n\nfunc (c *ImageCache) Clear(ctx context.Context) error {\n\t_, err := c.docker.ImageRemove(ctx, c.Name(), dockerClient.ImageRemoveOptions{\n\t\tForce: true,\n\t})\n\tif err != nil && !cerrdefs.IsNotFound(err) {\n\t\treturn err\n\t}\n\treturn nil\n}\n\nfunc (c *ImageCache) Type() Type {\n\treturn Image\n}\n"
  },
  {
    "path": "pkg/cache/image_cache_test.go",
    "content": "package cache_test\n\nimport (\n\t\"context\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/pkg/cache\"\n\n\t\"github.com/buildpacks/imgutil/local\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/heroku/color\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestImageCache(t *testing.T) {\n\th.RequireDocker(t)\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"ImageCache\", testImageCache, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testImageCache(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#NewImageCache\", func() {\n\t\tvar dockerClient *client.Client\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\tdockerClient, err = client.New(client.FromEnv)\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\twhen(\"#Name\", func() {\n\t\t\tit(\"should return the image reference used in initialization\", func() {\n\t\t\t\trefName := \"gcr.io/my/repo:tag\"\n\t\t\t\tref, err := name.ParseReference(refName, name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tsubject := cache.NewImageCache(ref, dockerClient)\n\t\t\t\tactual := subject.Name()\n\t\t\t\tif actual != refName {\n\t\t\t\t\tt.Fatalf(\"Incorrect cache name expected %s, got %s\", refName, actual)\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\n\t\tit(\"resolves implied tag\", func() {\n\t\t\tref, err := name.ParseReference(\"my/repo:latest\", name.WeakValidation)\n\t\t\th.AssertNil(t, err)\n\t\t\tsubject := cache.NewImageCache(ref, dockerClient)\n\n\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\th.AssertNil(t, err)\n\t\t\texpected := cache.NewImageCache(ref, dockerClient)\n\n\t\t\th.AssertEq(t, subject.Name(), expected.Name())\n\t\t})\n\n\t\tit(\"resolves implied registry\", func() {\n\t\t\tref, err := name.ParseReference(\"index.docker.io/my/repo\", name.WeakValidation)\n\t\t\th.AssertNil(t, 
err)\n\t\t\tsubject := cache.NewImageCache(ref, dockerClient)\n\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\th.AssertNil(t, err)\n\t\t\texpected := cache.NewImageCache(ref, dockerClient)\n\t\t\tif subject.Name() != expected.Name() {\n\t\t\t\tt.Fatalf(\"The same repo name should result in the same image\")\n\t\t\t}\n\t\t})\n\t})\n\n\twhen(\"#Type\", func() {\n\t\tvar (\n\t\t\tdockerClient client.APIClient\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\tdockerClient, err = client.New(client.FromEnv)\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit(\"returns the cache type\", func() {\n\t\t\tref, err := name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\th.AssertNil(t, err)\n\t\t\tsubject := cache.NewImageCache(ref, dockerClient)\n\t\t\texpected := cache.Image\n\t\t\th.AssertEq(t, subject.Type(), expected)\n\t\t})\n\t})\n\n\twhen(\"#Clear\", func() {\n\t\tvar (\n\t\t\timageName    string\n\t\t\tdockerClient client.APIClient\n\t\t\tsubject      *cache.ImageCache\n\t\t\tctx          context.Context\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\tdockerClient, err = client.New(client.FromEnv)\n\t\t\th.AssertNil(t, err)\n\t\t\tctx = context.TODO()\n\n\t\t\tref, err := name.ParseReference(h.RandString(10), name.WeakValidation)\n\t\t\th.AssertNil(t, err)\n\t\t\tsubject = cache.NewImageCache(ref, dockerClient)\n\t\t\th.AssertNil(t, err)\n\t\t\timageName = subject.Name()\n\t\t})\n\n\t\twhen(\"there is a cache image\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\timg, err := local.NewImage(imageName, dockerClient)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t})\n\n\t\t\tit(\"removes the image\", func() {\n\t\t\t\terr := subject.Clear(ctx)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tresult, err := dockerClient.ImageList(context.TODO(), client.ImageListOptions{\n\t\t\t\t\tFilters: client.Filters{\n\t\t\t\t\t\t\"reference\": {imageName: true},\n\t\t\t\t\t},\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, 
err)\n\t\t\t\th.AssertEq(t, len(result.Items), 0)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there is no cache image\", func() {\n\t\t\tit(\"does not fail\", func() {\n\t\t\t\terr := subject.Clear(ctx)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/cache/volume_cache.go",
    "content": "package cache\n\nimport (\n\t\"context\"\n\t\"crypto/rand\"\n\t\"crypto/sha256\"\n\t\"fmt\"\n\t\"os\"\n\t\"strings\"\n\n\t\"github.com/chainguard-dev/kaniko/pkg/util/proc\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\tdockerClient \"github.com/moby/moby/client\"\n\n\tcerrdefs \"github.com/containerd/errdefs\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst EnvVolumeKey = \"PACK_VOLUME_KEY\"\n\ntype VolumeCache struct {\n\tdocker DockerClient\n\tvolume string\n}\n\nfunc NewVolumeCache(imageRef name.Reference, cacheType CacheInfo, suffix string, dockerClient DockerClient, logger logging.Logger) (*VolumeCache, error) {\n\tvar volumeName string\n\tif cacheType.Source == \"\" {\n\t\tvolumeKey, err := getVolumeKey(imageRef, logger)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\tsum := sha256.Sum256([]byte(imageRef.Name() + volumeKey))\n\t\tvol := paths.FilterReservedNames(fmt.Sprintf(\"%s-%x\", sanitizedRef(imageRef), sum[:6]))\n\t\tvolumeName = fmt.Sprintf(\"pack-cache-%s.%s\", vol, suffix)\n\t} else {\n\t\tvolumeName = paths.FilterReservedNames(cacheType.Source)\n\t}\n\n\treturn &VolumeCache{\n\t\tvolume: volumeName,\n\t\tdocker: dockerClient,\n\t}, nil\n}\n\nfunc getVolumeKey(imageRef name.Reference, logger logging.Logger) (string, error) {\n\tvar foundKey string\n\n\t// first, look for key in env\n\n\tfoundKey = os.Getenv(EnvVolumeKey)\n\tif foundKey != \"\" {\n\t\treturn foundKey, nil\n\t}\n\n\t// then, look for key in existing config\n\n\tvolumeKeysPath, err := config.DefaultVolumeKeysPath()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tcfg, err := config.ReadVolumeKeys(volumeKeysPath)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tfoundKey = cfg.VolumeKeys[imageRef.Name()]\n\tif foundKey != \"\" {\n\t\treturn foundKey, nil\n\t}\n\n\t// finally, create new key and store it in config\n\n\t// if we're running in a 
container, we should log a warning\n\t// so that we don't always re-create the cache\n\tif RunningInContainer() {\n\t\tlogger.Warnf(\"%s is unset; set this environment variable to a secret value to avoid creating a new volume cache on every build\", EnvVolumeKey)\n\t}\n\n\tnewKey := randString(20)\n\tif cfg.VolumeKeys == nil {\n\t\tcfg.VolumeKeys = make(map[string]string)\n\t}\n\tcfg.VolumeKeys[imageRef.Name()] = newKey\n\tif err = config.Write(cfg, volumeKeysPath); err != nil {\n\t\treturn \"\", err\n\t}\n\n\treturn newKey, nil\n}\n\n// Returns a string with lowercase a-z, of length n\nfunc randString(n int) string {\n\tb := make([]byte, n)\n\t_, err := rand.Read(b)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\tfor i := range b {\n\t\tb[i] = 'a' + (b[i] % 26)\n\t}\n\treturn string(b)\n}\n\nfunc (c *VolumeCache) Name() string {\n\treturn c.volume\n}\n\nfunc (c *VolumeCache) Clear(ctx context.Context) error {\n\t_, err := c.docker.VolumeRemove(ctx, c.Name(), dockerClient.VolumeRemoveOptions{Force: true})\n\tif err != nil && !cerrdefs.IsNotFound(err) {\n\t\treturn err\n\t}\n\treturn nil\n}\n\nfunc (c *VolumeCache) Type() Type {\n\treturn Volume\n}\n\n// note image names and volume names are validated using the same restrictions:\n// see https://github.com/moby/moby/blob/f266f13965d5bfb1825afa181fe6c32f3a597fa3/daemon/names/names.go#L5\nfunc sanitizedRef(ref name.Reference) string {\n\tresult := strings.TrimPrefix(ref.Context().String(), ref.Context().RegistryStr()+\"/\")\n\tresult = strings.ReplaceAll(result, \"/\", \"_\")\n\treturn fmt.Sprintf(\"%s_%s\", result, ref.Identifier())\n}\n\nvar RunningInContainer = func() bool {\n\treturn proc.GetContainerRuntime(0, 0) != proc.RuntimeNotFound\n}\n"
  },
  {
    "path": "pkg/cache/volume_cache_test.go",
    "content": "package cache_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/cache\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\n\t\"github.com/docker/docker/daemon/names\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/heroku/color\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestVolumeCache(t *testing.T) {\n\th.RequireDocker(t)\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"VolumeCache\", testCache, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testCache(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tdockerClient client.APIClient\n\t\toutBuf       bytes.Buffer\n\t\tlogger       logging.Logger\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\t\tdockerClient, err = client.New(client.FromEnv)\n\t\th.AssertNil(t, err)\n\t\tlogger = logging.NewSimpleLogger(&outBuf)\n\t})\n\n\twhen(\"#NewVolumeCache\", func() {\n\t\twhen(\"volume cache name is empty\", func() {\n\t\t\tit(\"adds suffix to calculated name\", func() {\n\t\t\t\tref, err := name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\tif !strings.HasSuffix(subject.Name(), \".some-suffix\") {\n\t\t\t\t\tt.Fatalf(\"Calculated volume name '%s' should end with '.some-suffix'\", subject.Name())\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"reusing the same cache for the same repo name\", func() {\n\t\t\t\tref, err := name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\texpected, _ := 
cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\tif subject.Name() != expected.Name() {\n\t\t\t\t\tt.Fatalf(\"The same repo name should result in the same volume\")\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"supplies different volumes for different tags\", func() {\n\t\t\t\tref, err := name.ParseReference(\"my/repo:other-tag\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tnotExpected, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\tif subject.Name() == notExpected.Name() {\n\t\t\t\t\tt.Fatalf(\"Different image tags should result in different volumes\")\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"supplies different volumes for different registries\", func() {\n\t\t\t\tref, err := name.ParseReference(\"registry.com/my/repo:other-tag\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tnotExpected, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\tif subject.Name() == notExpected.Name() {\n\t\t\t\t\tt.Fatalf(\"Different image registries should result in different volumes\")\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"resolves implied tag\", func() {\n\t\t\t\tref, err := name.ParseReference(\"my/repo:latest\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\texpected, _ := 
cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\th.AssertEq(t, subject.Name(), expected.Name())\n\t\t\t})\n\n\t\t\tit(\"resolves implied registry\", func() {\n\t\t\t\tref, err := name.ParseReference(\"index.docker.io/my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\texpected, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\th.AssertEq(t, subject.Name(), expected.Name())\n\t\t\t})\n\n\t\t\tit(\"includes human readable information\", func() {\n\t\t\t\tref, err := name.ParseReference(\"myregistryhost:5000/fedora/httpd:version1.0\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\th.AssertContains(t, subject.Name(), \"fedora_httpd_version1.0\")\n\t\t\t\th.AssertTrue(t, names.RestrictedNamePattern.MatchString(subject.Name()))\n\t\t\t})\n\n\t\t\twhen(\"PACK_VOLUME_KEY\", func() {\n\t\t\t\twhen(\"is set\", func() {\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.AssertNil(t, os.Unsetenv(\"PACK_VOLUME_KEY\"))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"uses it to construct the volume name\", func() {\n\t\t\t\t\t\tref, err := name.ParseReference(\"my/repo:some-tag\", name.WeakValidation)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tnameFromNewKey, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger) // sources a new key\n\t\t\t\t\t\th.AssertNil(t, os.Setenv(\"PACK_VOLUME_KEY\", \"some-volume-key\"))\n\t\t\t\t\t\tnameFromEnvKey, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger) // sources key from env\n\t\t\t\t\t\th.AssertNotEq(t, nameFromNewKey.Name(), 
nameFromEnvKey.Name())\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"is unset\", func() {\n\t\t\t\t\tvar tmpPackHome string\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tvar err error\n\t\t\t\t\t\ttmpPackHome, err = os.MkdirTemp(\"\", \"\")\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNil(t, os.Setenv(\"PACK_HOME\", tmpPackHome))\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.AssertNil(t, os.RemoveAll(tmpPackHome))\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"~/.pack/volume-keys.toml contains key for repo name\", func() {\n\t\t\t\t\t\tit(\"sources the key from ~/.pack/volume-keys.toml\", func() {\n\t\t\t\t\t\t\tref, err := name.ParseReference(\"my/repo:some-tag\", name.WeakValidation)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\tnameFromNewKey, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger) // sources a new key\n\n\t\t\t\t\t\t\tcfgContents := `\n[volume-keys]\n\"index.docker.io/my/repo:some-tag\" = \"SOME_VOLUME_KEY\"\n`\n\t\t\t\t\t\t\th.AssertNil(t, os.WriteFile(filepath.Join(tmpPackHome, \"volume-keys.toml\"), []byte(cfgContents), 0755)) // overrides the key that was set\n\n\t\t\t\t\t\t\tnameFromConfigKey, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger) // sources key from config\n\t\t\t\t\t\t\th.AssertNotEq(t, nameFromNewKey.Name(), nameFromConfigKey.Name())\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"~/.pack/volume-keys.toml missing key for repo name\", func() {\n\t\t\t\t\t\tit(\"generates a new key and saves it to ~/.pack/volume-keys.toml\", func() {\n\t\t\t\t\t\t\tref, err := name.ParseReference(\"my/repo:some-tag\", name.WeakValidation)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\tnameFromNewKey, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)    // sources a new key\n\t\t\t\t\t\t\tnameFromConfigKey, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger) // sources same key 
from config\n\t\t\t\t\t\t\th.AssertEq(t, nameFromNewKey.Name(), nameFromConfigKey.Name())\n\n\t\t\t\t\t\t\tcfg, err := config.ReadVolumeKeys(filepath.Join(tmpPackHome, \"volume-keys.toml\"))\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertNotNil(t, cfg.VolumeKeys[\"index.docker.io/my/repo:some-tag\"])\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"containerized pack\", func() {\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tcache.RunningInContainer = func() bool {\n\t\t\t\t\t\t\t\t\treturn true\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"logs a warning\", func() {\n\t\t\t\t\t\t\t\tref, err := name.ParseReference(\"my/repo:some-tag\", name.WeakValidation)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t_, _ = cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger) // sources a new key\n\t\t\t\t\t\t\t\t_, _ = cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger) // sources same key from config\n\t\t\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"PACK_VOLUME_KEY is unset; set this environment variable to a secret value to avoid creating a new volume cache on every build\")\n\t\t\t\t\t\t\t\th.AssertEq(t, strings.Count(outBuf.String(), \"PACK_VOLUME_KEY is unset\"), 1) // the second call to NewVolumeCache reads from the config\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"volume cache name is not empty\", func() {\n\t\t\tvolumeName := \"test-volume-name\"\n\t\t\tcacheInfo := cache.CacheInfo{\n\t\t\t\tFormat: cache.CacheVolume,\n\t\t\t\tSource: volumeName,\n\t\t\t}\n\n\t\t\tit(\"named volume created without suffix\", func() {\n\t\t\t\tref, err := name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cacheInfo, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\tif volumeName != subject.Name() {\n\t\t\t\t\tt.Fatalf(\"Volume name '%s' should be same as the name 
specified '%s'\", subject.Name(), volumeName)\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"reusing the same cache for the same repo name\", func() {\n\t\t\t\tref, err := name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cacheInfo, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\texpected, _ := cache.NewVolumeCache(ref, cacheInfo, \"some-suffix\", dockerClient, logger)\n\t\t\t\tif subject.Name() != expected.Name() {\n\t\t\t\t\tt.Fatalf(\"The same repo name should result in the same volume\")\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"supplies different volumes for different registries\", func() {\n\t\t\t\tref, err := name.ParseReference(\"registry.com/my/repo:other-tag\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tnotExpected, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\tif subject.Name() == notExpected.Name() {\n\t\t\t\t\tt.Fatalf(\"Different image registries should result in different volumes\")\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"resolves implied tag\", func() {\n\t\t\t\tref, err := name.ParseReference(\"my/repo:latest\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\texpected, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\th.AssertEq(t, subject.Name(), expected.Name())\n\t\t\t})\n\n\t\t\tit(\"resolves implied registry\", func() {\n\t\t\t\tref, err := name.ParseReference(\"index.docker.io/my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, 
err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\tref, err = name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\texpected, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\t\th.AssertEq(t, subject.Name(), expected.Name())\n\t\t\t})\n\n\t\t\tit(\"includes human readable information\", func() {\n\t\t\t\tref, err := name.ParseReference(\"myregistryhost:5000/fedora/httpd:version1.0\", name.WeakValidation)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\n\t\t\t\th.AssertContains(t, subject.Name(), \"fedora_httpd_version1.0\")\n\t\t\t\th.AssertTrue(t, names.RestrictedNamePattern.MatchString(subject.Name()))\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Clear\", func() {\n\t\tvar (\n\t\t\tvolumeName   string\n\t\t\tdockerClient client.APIClient\n\t\t\tsubject      *cache.VolumeCache\n\t\t\tctx          context.Context\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\tdockerClient, err = client.New(client.FromEnv)\n\t\t\th.AssertNil(t, err)\n\t\t\tctx = context.TODO()\n\n\t\t\tref, err := name.ParseReference(h.RandString(10), name.WeakValidation)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tsubject, _ = cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\tvolumeName = subject.Name()\n\t\t})\n\n\t\twhen(\"there is a cache volume\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tdockerClient.VolumeCreate(context.TODO(), client.VolumeCreateOptions{\n\t\t\t\t\tName: volumeName,\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"removes the volume\", func() {\n\t\t\t\terr := subject.Clear(ctx)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvolumesResult, err := dockerClient.VolumeList(context.TODO(), client.VolumeListOptions{\n\t\t\t\t\tFilters: client.Filters{\n\t\t\t\t\t\t\"name\": {volumeName: 
true},\n\t\t\t\t\t},\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(volumesResult.Items), 0)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there is no cache volume\", func() {\n\t\t\tit(\"does not fail\", func() {\n\t\t\t\terr := subject.Clear(ctx)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Type\", func() {\n\t\tit(\"returns the cache type\", func() {\n\t\t\tref, err := name.ParseReference(\"my/repo\", name.WeakValidation)\n\t\t\th.AssertNil(t, err)\n\t\t\tsubject, _ := cache.NewVolumeCache(ref, cache.CacheInfo{}, \"some-suffix\", dockerClient, logger)\n\t\t\texpected := cache.Volume\n\t\t\th.AssertEq(t, subject.Type(), expected)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/build.go",
    "content": "package client\n\nimport (\n\t\"archive/tar\"\n\t\"context\"\n\t\"crypto/rand\"\n\t\"crypto/sha256\"\n\t\"encoding/hex\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"sort\"\n\t\"strconv\"\n\t\"strings\"\n\t\"time\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/layout\"\n\t\"github.com/buildpacks/imgutil/local\"\n\t\"github.com/buildpacks/imgutil/remote\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/chainguard-dev/kaniko/pkg/util/proc\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\tignore \"github.com/sabhiram/go-gitignore\"\n\n\t\"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\tinternalConfig \"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/layer\"\n\tpname \"github.com/buildpacks/pack/internal/name\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/stack\"\n\t\"github.com/buildpacks/pack/internal/stringset\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/internal/termui\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/cache\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\tprojectTypes \"github.com/buildpacks/pack/pkg/project/types\"\n\tv02 \"github.com/buildpacks/pack/pkg/project/v02\"\n)\n\nconst (\n\tminLifecycleVersionSupportingCreator               = \"0.7.4\"\n\tprevLifecycleVersionSupportingImage                = \"0.6.1\"\n\tminLifecycleVersionSupportingImage                 = \"0.7.5\"\n\tminLifecycleVersionSupportingCreatorWithExtensions = \"0.19.0\"\n)\n\nvar 
RunningInContainer = func() bool {\n\treturn proc.GetContainerRuntime(0, 0) != proc.RuntimeNotFound\n}\n\n// LifecycleExecutor executes the lifecycle which satisfies the Cloud Native Buildpacks Lifecycle specification.\n// Implementations of the Lifecycle must execute the following phases by calling the\n// phase-specific lifecycle binary in order:\n//\n//\tDetection:         /cnb/lifecycle/detector\n//\tAnalysis:          /cnb/lifecycle/analyzer\n//\tCache Restoration: /cnb/lifecycle/restorer\n//\tBuild:             /cnb/lifecycle/builder\n//\tExport:            /cnb/lifecycle/exporter\n//\n// or invoke the single creator binary:\n//\n//\tCreator:            /cnb/lifecycle/creator\ntype LifecycleExecutor interface {\n\t// Execute is responsible for invoking each of these binaries\n\t// with the desired configuration.\n\tExecute(ctx context.Context, opts build.LifecycleOptions) error\n}\n\ntype IsTrustedBuilder func(string) bool\n\n// BuildOptions defines configuration settings for a Build.\ntype BuildOptions struct {\n\t// The base directory to use to resolve relative assets\n\tRelativeBaseDir string\n\n\t// required. Name of output image.\n\tImage string\n\n\t// required. Builder image name.\n\tBuilder string\n\n\t// Name of the buildpack registry. Used to\n\t// add buildpacks to a build.\n\tRegistry string\n\n\t// AppPath is the path to application bits.\n\t// If unset it defaults to current working directory.\n\tAppPath string\n\n\t// Specify the run image the Image will be\n\t// built atop.\n\tRunImage string\n\n\t// Address of docker daemon exposed to build container\n\t// e.g. tcp://example.com:1234, unix:///run/user/1000/podman/podman.sock\n\tDockerHost string\n\n\t// the target environment the OCI image is expected to be run in, i.e. 
production, test, development.\n\tCNBExecutionEnv string\n\n\t// Used to determine a run-image mirror if Run Image is empty.\n\t// Used in combination with Builder metadata to determine the 'best' mirror.\n\t// 'best' is defined as:\n\t//  - if Publish is true, the best mirror matches the registry we are publishing to.\n\t//  - if Publish is false, the best mirror matches a registry specified in Image.\n\t//  - otherwise if both of the above did not match, use the mirror specified in\n\t//    the builder metadata\n\tAdditionalMirrors map[string][]string\n\n\t// User provided environment variables to the buildpacks.\n\t// Buildpacks may both read and overwrite these values.\n\tEnv map[string]string\n\n\t// Used to configure the available cache options\n\tCache cache.CacheOpts\n\n\t// Option only valid if Publish is true\n\t// Create an additional image that contains cache=true layers and push it to the registry.\n\tCacheImage string\n\n\t// Option passed directly to the lifecycle.\n\t// If true, publishes Image directly to a registry.\n\t// Assumes Image contains a valid registry with credentials\n\t// provided by the docker client.\n\tPublish bool\n\n\t// Clear the build cache from previous builds.\n\tClearCache bool\n\n\t// Launch a terminal UI to depict the build process\n\tInteractive bool\n\n\t// Disable System Buildpacks present in the builder\n\tDisableSystemBuildpacks bool\n\n\t// List of buildpack images or archives to add to a builder.\n\t// These buildpacks may overwrite those on the builder if they\n\t// share both an ID and Version with a buildpack on the builder.\n\tBuildpacks []string\n\n\t// List of extension images or archives to add to a builder.\n\t// These extensions may overwrite those on the builder if they\n\t// share both an ID and Version with an extension on the builder.\n\tExtensions []string\n\n\t// Additional image tags to push to; each will contain contents identical to Image\n\tAdditionalTags []string\n\n\t// Configure the proxy 
environment variables.\n\t// These variables will only be set in the build image\n\t// and will not be used if proxy env vars are already set.\n\tProxyConfig *ProxyConfig\n\n\t// Configure network and volume mounts for the build containers.\n\tContainerConfig ContainerConfig\n\n\t// Process type that will be used when setting container start command.\n\tDefaultProcessType string\n\n\t// Platform is the desired platform to build on (e.g., linux/amd64)\n\tPlatform string\n\n\t// Strategy for updating local images before a build.\n\tPullPolicy image.PullPolicy\n\n\t// ProjectDescriptorBaseDir is the base directory to find relative resources referenced by the ProjectDescriptor\n\tProjectDescriptorBaseDir string\n\n\t// ProjectDescriptor describes the project and any configuration specific to the project\n\tProjectDescriptor projectTypes.Descriptor\n\n\t// List of buildpack images or archives to add to a builder.\n\t// These buildpacks will be prepended to the builder's order\n\tPreBuildpacks []string\n\n\t// List of buildpack images or archives to add to a builder.\n\t// These buildpacks will be appended to the builder's order\n\tPostBuildpacks []string\n\n\t// The lifecycle image that will be used for the analysis, restore and export phases\n\t// when using an untrusted builder.\n\tLifecycleImage string\n\n\t// The location at which to mount the AppDir in the build image.\n\tWorkspace string\n\n\t// User's group id used to build the image\n\tGroupID int\n\n\t// User's user id used to build the image\n\tUserID int\n\n\t// A previous image to set to a particular tag reference, digest reference, or (when performing a daemon build) image ID.\n\tPreviousImage string\n\n\t// TrustBuilder when true optimizes builds by running\n\t// all lifecycle phases in a single container.\n\t// This places registry credentials on the builder's build image.\n\t// Only trust builders from reputable sources.  
The optimized\n\t// build happens only when both builder and buildpacks are\n\t// trusted.\n\tTrustBuilder IsTrustedBuilder\n\n\t// TrustExtraBuildpacks when true optimizes builds by running\n\t// all lifecycle phases in a single container.  The optimized\n\t// build happens only when both builder and buildpacks are\n\t// trusted.\n\tTrustExtraBuildpacks bool\n\n\t// Directory to output any SBOM artifacts\n\tSBOMDestinationDir string\n\n\t// Directory to output the report.toml metadata artifact\n\tReportDestinationDir string\n\n\t// Desired create time in the output image config\n\tCreationTime *time.Time\n\n\t// Configuration to export to OCI layout format\n\tLayoutConfig *LayoutConfig\n\n\t// Enable user namespace isolation for the build containers\n\tEnableUsernsHost bool\n\n\tInsecureRegistries []string\n}\n\nfunc (b *BuildOptions) Layout() bool {\n\tif b.LayoutConfig != nil {\n\t\treturn b.LayoutConfig.Enable()\n\t}\n\treturn false\n}\n\n// ProxyConfig specifies proxy settings to be set as environment variables in a container.\ntype ProxyConfig struct {\n\tHTTPProxy  string // Used to set HTTP_PROXY env var.\n\tHTTPSProxy string // Used to set HTTPS_PROXY env var.\n\tNoProxy    string // Used to set NO_PROXY env var.\n}\n\n// ContainerConfig is additional configuration of the docker container that all build steps\n// occur within.\ntype ContainerConfig struct {\n\t// Configure network settings of the build containers.\n\t// The value of Network is handed directly to the docker client.\n\t// For valid values of this field see:\n\t// https://docs.docker.com/network/#network-drivers\n\tNetwork string\n\n\t// Volumes are accessible during both the detect and build phases\n\t// and should have the form: /path/in/host:/path/in/container.\n\t// For more about volume mounts and their permissions, see:\n\t// https://docs.docker.com/storage/volumes/\n\t//\n\t// It is strongly recommended you do not override any of the\n\t// paths with volume mounts at the following locations:\n\t// - 
/cnb\n\t// - /layers\n\t// - anything below /cnb/**\n\tVolumes []string\n}\n\ntype LayoutConfig struct {\n\t// Application image reference provided by the user\n\tInputImage InputImageReference\n\n\t// Previous image reference provided by the user\n\tPreviousInputImage InputImageReference\n\n\t// Local root path to save the run-image in OCI layout format\n\tLayoutRepoDir string\n\n\t// Configure the OCI layout fetch mode to avoid saving layers on disk\n\tSparse bool\n}\n\nfunc (l *LayoutConfig) Enable() bool {\n\treturn l.InputImage.Layout()\n}\n\ntype layoutPathConfig struct {\n\thostImagePath           string\n\thostPreviousImagePath   string\n\thostRunImagePath        string\n\ttargetImagePath         string\n\ttargetPreviousImagePath string\n\ttargetRunImagePath      string\n}\n\n// Build configures settings for the build container(s) and lifecycle.\n// It then invokes the lifecycle to build an app image.\n// If any configuration is deemed invalid, or if any lifecycle phases fail,\n// an error will be returned and no image produced.\nfunc (c *Client) Build(ctx context.Context, opts BuildOptions) error {\n\tvar pathsConfig layoutPathConfig\n\n\tif RunningInContainer() && (opts.PullPolicy != image.PullAlways) {\n\t\tc.logger.Warnf(\"Detected pack is running in a container; if using a shared docker host, failing to pull build inputs from a remote registry is insecure - \" +\n\t\t\t\"other tenants may have compromised build inputs stored in the daemon. \" +\n\t\t\t\"This configuration is insecure and may become unsupported in the future. \" +\n\t\t\t\"Re-run with '--pull-policy=always' to silence this warning.\")\n\t}\n\n\tif !opts.Publish && usesContainerdStorage(c.docker) {\n\t\tc.logger.Warnf(\"Exporting to docker daemon (building without --publish) and daemon uses containerd storage; performance may be significantly degraded.\\n\" +\n\t\t\t\"For more information, see https://github.com/buildpacks/pack/issues/2272.\")\n\t}\n\n\timageRef, err := 
c.parseReference(opts)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"invalid image name '%s'\", opts.Image)\n\t}\n\timgRegistry := imageRef.Context().RegistryStr()\n\timageName := imageRef.Name()\n\n\tif opts.Layout() {\n\t\tpathsConfig, err = c.processLayoutPath(opts.LayoutConfig.InputImage, opts.LayoutConfig.PreviousInputImage)\n\t\tif err != nil {\n\t\t\tif opts.LayoutConfig.PreviousInputImage != nil {\n\t\t\t\treturn errors.Wrapf(err, \"invalid layout paths image name '%s' or previous-image name '%s'\", opts.LayoutConfig.InputImage.Name(),\n\t\t\t\t\topts.LayoutConfig.PreviousInputImage.Name())\n\t\t\t}\n\t\t\treturn errors.Wrapf(err, \"invalid layout paths image name '%s'\", opts.LayoutConfig.InputImage.Name())\n\t\t}\n\t}\n\n\tappPath, err := c.processAppPath(opts.AppPath)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"invalid app path '%s'\", opts.AppPath)\n\t}\n\n\tproxyConfig := c.processProxyConfig(opts.ProxyConfig)\n\n\tbuilderRef, err := c.processBuilderName(opts.Builder)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"invalid builder '%s'\", opts.Builder)\n\t}\n\n\trequestedTarget := func() *dist.Target {\n\t\tif opts.Platform == \"\" {\n\t\t\treturn nil\n\t\t}\n\t\tparts := strings.Split(opts.Platform, \"/\")\n\t\tswitch len(parts) {\n\t\tcase 0:\n\t\t\treturn nil\n\t\tcase 1:\n\t\t\treturn &dist.Target{OS: parts[0]}\n\t\tcase 2:\n\t\t\treturn &dist.Target{OS: parts[0], Arch: parts[1]}\n\t\tdefault:\n\t\t\treturn &dist.Target{OS: parts[0], Arch: parts[1], ArchVariant: parts[2]}\n\t\t}\n\t}()\n\n\trawBuilderImage, err := c.imageFetcher.Fetch(\n\t\tctx,\n\t\tbuilderRef.Name(),\n\t\timage.FetchOptions{\n\t\t\tDaemon:             true,\n\t\t\tTarget:             requestedTarget,\n\t\t\tPullPolicy:         opts.PullPolicy,\n\t\t\tInsecureRegistries: opts.InsecureRegistries,\n\t\t},\n\t)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"failed to fetch builder image '%s'\", builderRef.Name())\n\t}\n\n\tvar targetToUse *dist.Target\n\tif 
requestedTarget != nil {\n\t\ttargetToUse = requestedTarget\n\t} else {\n\t\ttargetToUse, err = getTargetFromBuilder(rawBuilderImage)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\tbldr, err := c.getBuilder(rawBuilderImage)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"invalid builder %s\", style.Symbol(opts.Builder))\n\t}\n\n\tfetchOptions := image.FetchOptions{\n\t\tDaemon:             !opts.Publish,\n\t\tPullPolicy:         opts.PullPolicy,\n\t\tTarget:             targetToUse,\n\t\tInsecureRegistries: opts.InsecureRegistries,\n\t}\n\trunImageName := c.resolveRunImage(opts.RunImage, imgRegistry, builderRef.Context().RegistryStr(), bldr.DefaultRunImage(), opts.AdditionalMirrors, opts.Publish, fetchOptions)\n\n\tif opts.Layout() {\n\t\ttargetRunImagePath, err := layout.ParseRefToPath(runImageName)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\thostRunImagePath := filepath.Join(opts.LayoutConfig.LayoutRepoDir, targetRunImagePath)\n\t\ttargetRunImagePath = filepath.Join(paths.RootDir, \"layout-repo\", targetRunImagePath)\n\t\tfetchOptions.LayoutOption = image.LayoutOption{\n\t\t\tPath:   hostRunImagePath,\n\t\t\tSparse: opts.LayoutConfig.Sparse,\n\t\t}\n\t\tfetchOptions.Daemon = false\n\t\tpathsConfig.targetRunImagePath = targetRunImagePath\n\t\tpathsConfig.hostRunImagePath = hostRunImagePath\n\t}\n\n\trunImage, warnings, err := c.validateRunImage(ctx, runImageName, fetchOptions, bldr.StackID)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"invalid run-image '%s'\", runImageName)\n\t}\n\tfor _, warning := range warnings {\n\t\tc.logger.Warn(warning)\n\t}\n\n\tvar runMixins []string\n\tif _, err := dist.GetLabel(runImage, stack.MixinsLabel, &runMixins); err != nil {\n\t\treturn err\n\t}\n\n\tfetchedBPs, nInlineBPs, order, err := c.processBuildpacks(ctx, bldr.Buildpacks(), bldr.Order(), bldr.StackID, opts, targetToUse)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tfetchedExs, orderExtensions, err := c.processExtensions(ctx, bldr.Extensions(), opts, 
targetToUse)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tsystem, err := c.processSystem(bldr.System(), fetchedBPs, opts.DisableSystemBuildpacks)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\t// Default mode: if the TrustBuilder option is not set, trust the known trusted builders.\n\tif opts.TrustBuilder == nil {\n\t\topts.TrustBuilder = builder.IsKnownTrustedBuilder\n\t}\n\n\t// Ensure the builder's platform APIs are supported\n\tvar builderPlatformAPIs builder.APISet\n\tbuilderPlatformAPIs = append(builderPlatformAPIs, bldr.LifecycleDescriptor().APIs.Platform.Deprecated...)\n\tbuilderPlatformAPIs = append(builderPlatformAPIs, bldr.LifecycleDescriptor().APIs.Platform.Supported...)\n\tif !supportsPlatformAPI(builderPlatformAPIs) {\n\t\tc.logger.Debugf(\"pack %s supports Platform API(s): %s\", c.version, strings.Join(build.SupportedPlatformAPIVersions.AsStrings(), \", \"))\n\t\tc.logger.Debugf(\"Builder %s supports Platform API(s): %s\", style.Symbol(opts.Builder), strings.Join(builderPlatformAPIs.AsStrings(), \", \"))\n\t\treturn errors.Errorf(\"Builder %s is incompatible with this version of pack\", style.Symbol(opts.Builder))\n\t}\n\n\t// Get the platform API version to use\n\tlifecycleVersion := bldr.LifecycleDescriptor().Info.Version\n\tuseCreator := supportsCreator(lifecycleVersion) && opts.TrustBuilder(opts.Builder)\n\thasAdditionalBuildpacks := func() bool {\n\t\treturn len(fetchedBPs) != nInlineBPs\n\t}()\n\thasExtensions := func() bool {\n\t\treturn len(fetchedExs) != 0\n\t}()\n\tif hasExtensions {\n\t\tc.logger.Warnf(\"Builder is trusted but additional modules were added; using the untrusted (5 phases) build flow\")\n\t\tuseCreator = false\n\t}\n\tif hasAdditionalBuildpacks && !opts.TrustExtraBuildpacks {\n\t\tc.logger.Warnf(\"Builder is trusted but additional modules were added; using the untrusted (5 phases) build flow\")\n\t\tuseCreator = false\n\t}\n\tvar (\n\t\tlifecycleOptsLifecycleImage string\n\t\tlifecycleAPIs               []string\n\t)\n\tif 
!(useCreator) {\n\t\t// fetch the lifecycle image\n\t\tif supportsLifecycleImage(lifecycleVersion) {\n\t\t\tlifecycleImageName := opts.LifecycleImage\n\t\t\tif lifecycleImageName == \"\" {\n\t\t\t\tlifecycleImageName = fmt.Sprintf(\"%s:%s\", internalConfig.DefaultLifecycleImageRepo, lifecycleVersion.String())\n\t\t\t}\n\n\t\t\tlifecycleImage, err := c.imageFetcher.FetchForPlatform(\n\t\t\t\tctx,\n\t\t\t\tlifecycleImageName,\n\t\t\t\timage.FetchOptions{\n\t\t\t\t\tDaemon:             true,\n\t\t\t\t\tPullPolicy:         opts.PullPolicy,\n\t\t\t\t\tTarget:             targetToUse,\n\t\t\t\t\tInsecureRegistries: opts.InsecureRegistries,\n\t\t\t\t},\n\t\t\t)\n\t\t\tif err != nil {\n\t\t\t\treturn fmt.Errorf(\"fetching lifecycle image: %w\", err)\n\t\t\t}\n\n\t\t\t// if the lifecycle container OS isn't windows, use an ephemeral lifecycle to add /workspace with correct ownership\n\t\t\timageOS, err := lifecycleImage.OS()\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"getting lifecycle image OS\")\n\t\t\t}\n\t\t\tif imageOS != \"windows\" {\n\t\t\t\t// obtain uid/gid from builder to use when extending lifecycle image\n\t\t\t\tuid, gid, err := userAndGroupIDs(rawBuilderImage)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn fmt.Errorf(\"obtaining build uid/gid from builder image: %w\", err)\n\t\t\t\t}\n\n\t\t\t\tc.logger.Debugf(\"Creating ephemeral lifecycle from %s with uid %d and gid %d. 
With workspace dir %s\", lifecycleImage.Name(), uid, gid, opts.Workspace)\n\t\t\t\t// extend lifecycle image with mountpoints, and use it instead of current lifecycle image\n\t\t\t\tlifecycleImage, err = c.createEphemeralLifecycle(lifecycleImage, opts.Workspace, uid, gid)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn err\n\t\t\t\t}\n\t\t\t\tc.logger.Debugf(\"Selecting ephemeral lifecycle image %s for build\", lifecycleImage.Name())\n\t\t\t\t// cleanup the extended lifecycle image when done\n\t\t\t\tdefer c.docker.ImageRemove(context.Background(), lifecycleImage.Name(), client.ImageRemoveOptions{Force: true})\n\t\t\t}\n\n\t\t\tlifecycleOptsLifecycleImage = lifecycleImage.Name()\n\t\t\tlabels, err := lifecycleImage.Labels()\n\t\t\tif err != nil {\n\t\t\t\treturn fmt.Errorf(\"reading labels of lifecycle image: %w\", err)\n\t\t\t}\n\n\t\t\tlifecycleAPIs, err = extractSupportedLifecycleApis(labels)\n\t\t\tif err != nil {\n\t\t\t\treturn fmt.Errorf(\"reading api versions of lifecycle image: %w\", err)\n\t\t\t}\n\t\t}\n\t}\n\n\tusingPlatformAPI, err := build.FindLatestSupported(append(\n\t\tbldr.LifecycleDescriptor().APIs.Platform.Deprecated,\n\t\tbldr.LifecycleDescriptor().APIs.Platform.Supported...),\n\t\tlifecycleAPIs)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"finding latest supported Platform API: %w\", err)\n\t}\n\tif usingPlatformAPI.LessThan(\"0.12\") {\n\t\tif err = c.validateMixins(fetchedBPs, bldr, runImageName, runMixins); err != nil {\n\t\t\treturn fmt.Errorf(\"validating stack mixins: %w\", err)\n\t\t}\n\t}\n\n\tbuildEnvs := map[string]string{}\n\tfor _, envVar := range opts.ProjectDescriptor.Build.Env {\n\t\tbuildEnvs[envVar.Name] = envVar.Value\n\t}\n\n\tfor k, v := range opts.Env {\n\t\tbuildEnvs[k] = v\n\t}\n\n\torigBuilderName := rawBuilderImage.Name()\n\tephemeralBuilder, err := 
c.createEphemeralBuilder(\n\t\trawBuilderImage,\n\t\tbuildEnvs,\n\t\torder,\n\t\tfetchedBPs,\n\t\torderExtensions,\n\t\tfetchedExs,\n\t\tusingPlatformAPI.LessThan(\"0.12\"),\n\t\topts.RunImage,\n\t\tsystem,\n\t\topts.DisableSystemBuildpacks,\n\t)\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer func() {\n\t\tif ephemeralBuilder.Name() == origBuilderName {\n\t\t\treturn\n\t\t}\n\t\t_, _ = c.docker.ImageRemove(context.Background(), ephemeralBuilder.Name(), client.ImageRemoveOptions{Force: true})\n\t}()\n\n\tif len(bldr.OrderExtensions()) > 0 || len(ephemeralBuilder.OrderExtensions()) > 0 {\n\t\tif targetToUse.OS == \"windows\" {\n\t\t\treturn fmt.Errorf(\"builder contains image extensions which are not supported for Windows builds\")\n\t\t}\n\t\tif opts.PullPolicy != image.PullAlways {\n\t\t\treturn fmt.Errorf(\"pull policy must be 'always' when builder contains image extensions\")\n\t\t}\n\t}\n\n\tif opts.Layout() {\n\t\topts.ContainerConfig.Volumes = appendLayoutVolumes(opts.ContainerConfig.Volumes, pathsConfig)\n\t}\n\n\tprocessedVolumes, warnings, err := processVolumes(targetToUse.OS, opts.ContainerConfig.Volumes)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tfor _, warning := range warnings {\n\t\tc.logger.Warn(warning)\n\t}\n\n\tfileFilter, err := getFileFilter(opts.ProjectDescriptor)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\trunImageName, err = pname.TranslateRegistry(runImageName, c.registryMirrors, c.logger)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tprojectMetadata := files.ProjectMetadata{}\n\tif c.experimental {\n\t\tversion := opts.ProjectDescriptor.Project.Version\n\t\tsourceURL := opts.ProjectDescriptor.Project.SourceURL\n\t\tif version != \"\" || sourceURL != \"\" {\n\t\t\tprojectMetadata.Source = &files.ProjectSource{\n\t\t\t\tType:     \"project\",\n\t\t\t\tVersion:  map[string]interface{}{\"declared\": version},\n\t\t\t\tMetadata: map[string]interface{}{\"url\": sourceURL},\n\t\t\t}\n\t\t} else {\n\t\t\tprojectMetadata.Source = 
v02.GitMetadata(opts.AppPath)\n\t\t}\n\t}\n\n\tlifecycleOpts := build.LifecycleOptions{\n\t\tAppPath:                  appPath,\n\t\tImage:                    imageRef,\n\t\tBuilder:                  ephemeralBuilder,\n\t\tBuilderImage:             builderRef.Name(),\n\t\tLifecycleImage:           ephemeralBuilder.Name(),\n\t\tRunImage:                 runImageName,\n\t\tProjectMetadata:          projectMetadata,\n\t\tClearCache:               opts.ClearCache,\n\t\tPublish:                  opts.Publish,\n\t\tTrustBuilder:             opts.TrustBuilder(opts.Builder),\n\t\tUseCreator:               useCreator,\n\t\tUseCreatorWithExtensions: supportsCreatorWithExtensions(lifecycleVersion),\n\t\tDockerHost:               opts.DockerHost,\n\t\tCache:                    opts.Cache,\n\t\tCacheImage:               opts.CacheImage,\n\t\tHTTPProxy:                proxyConfig.HTTPProxy,\n\t\tHTTPSProxy:               proxyConfig.HTTPSProxy,\n\t\tNoProxy:                  proxyConfig.NoProxy,\n\t\tNetwork:                  opts.ContainerConfig.Network,\n\t\tAdditionalTags:           opts.AdditionalTags,\n\t\tVolumes:                  processedVolumes,\n\t\tDefaultProcessType:       opts.DefaultProcessType,\n\t\tFileFilter:               fileFilter,\n\t\tWorkspace:                opts.Workspace,\n\t\tGID:                      opts.GroupID,\n\t\tUID:                      opts.UserID,\n\t\tPreviousImage:            opts.PreviousImage,\n\t\tInteractive:              opts.Interactive,\n\t\tTermui:                   termui.NewTermui(imageName, ephemeralBuilder, runImageName),\n\t\tReportDestinationDir:     opts.ReportDestinationDir,\n\t\tSBOMDestinationDir:       opts.SBOMDestinationDir,\n\t\tCreationTime:             opts.CreationTime,\n\t\tLayout:                   opts.Layout(),\n\t\tKeychain:                 c.keychain,\n\t\tEnableUsernsHost:         opts.EnableUsernsHost,\n\t\tExecutionEnvironment:     opts.CNBExecutionEnv,\n\t\tInsecureRegistries:       
opts.InsecureRegistries,\n\t}\n\n\tswitch {\n\tcase useCreator:\n\t\tlifecycleOpts.UseCreator = true\n\tcase supportsLifecycleImage(lifecycleVersion):\n\t\tlifecycleOpts.LifecycleImage = lifecycleOptsLifecycleImage\n\t\tlifecycleOpts.LifecycleApis = lifecycleAPIs\n\tcase !opts.TrustBuilder(opts.Builder):\n\t\treturn errors.Errorf(\"Lifecycle %s does not have an associated lifecycle image. Builder must be trusted.\", lifecycleVersion.String())\n\t}\n\n\tlifecycleOpts.FetchRunImageWithLifecycleLayer = func(runImageName string) (string, error) {\n\t\tephemeralRunImageName := fmt.Sprintf(\"pack.local/run-image/%x:latest\", randString(10))\n\t\trunImage, err := c.imageFetcher.Fetch(ctx, runImageName, fetchOptions)\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tephemeralRunImage, err := local.NewImage(ephemeralRunImageName, c.docker, local.FromBaseImage(runImage.Name()))\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\ttmpDir, err := os.MkdirTemp(\"\", \"extend-run-image-scratch\") // we need to write to disk because manifest.json is last in the tar\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tdefer os.RemoveAll(tmpDir)\n\t\tlifecycleImageTar, err := func() (string, error) {\n\t\t\tlifecycleImageTar := filepath.Join(tmpDir, \"lifecycle-image.tar\")\n\t\t\tlifecycleImageReader, err := c.docker.ImageSave(context.Background(), []string{lifecycleOpts.LifecycleImage}) // this is fast because the lifecycle image is based on distroless static\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tdefer lifecycleImageReader.Close()\n\t\t\tlifecycleImageWriter, err := os.Create(lifecycleImageTar)\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tdefer lifecycleImageWriter.Close()\n\t\t\tif _, err = io.Copy(lifecycleImageWriter, lifecycleImageReader); err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\treturn lifecycleImageTar, nil\n\t\t}()\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tadvanceTarToEntryWithName 
:= func(tarReader *tar.Reader, wantName string) (*tar.Header, error) {\n\t\t\tvar (\n\t\t\t\theader *tar.Header\n\t\t\t\terr    error\n\t\t\t)\n\t\t\tfor {\n\t\t\t\theader, err = tarReader.Next()\n\t\t\t\tif err == io.EOF {\n\t\t\t\t\tbreak\n\t\t\t\t}\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn nil, err\n\t\t\t\t}\n\t\t\t\tif header.Name != wantName {\n\t\t\t\t\tcontinue\n\t\t\t\t}\n\t\t\t\treturn header, nil\n\t\t\t}\n\t\t\treturn nil, fmt.Errorf(\"failed to find header with name: %s\", wantName)\n\t\t}\n\t\tlifecycleLayerName, err := func() (string, error) {\n\t\t\tlifecycleImageReader, err := os.Open(lifecycleImageTar)\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tdefer lifecycleImageReader.Close()\n\t\t\ttarReader := tar.NewReader(lifecycleImageReader)\n\t\t\tif _, err = advanceTarToEntryWithName(tarReader, \"manifest.json\"); err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\ttype descriptor struct {\n\t\t\t\tLayers []string\n\t\t\t}\n\t\t\ttype manifestJSON []descriptor\n\t\t\tvar manifestContents manifestJSON\n\t\t\tif err = json.NewDecoder(tarReader).Decode(&manifestContents); err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tif len(manifestContents) < 1 {\n\t\t\t\treturn \"\", errors.New(\"missing manifest entries\")\n\t\t\t}\n\t\t\t// we can assume the lifecycle layer is the last in the tar, except if the lifecycle has been extended as an ephemeral lifecycle\n\t\t\tlayerOffset := 1\n\t\t\tif strings.Contains(lifecycleOpts.LifecycleImage, \"pack.local/lifecycle\") {\n\t\t\t\tlayerOffset = 2\n\t\t\t}\n\n\t\t\tif (len(manifestContents[0].Layers) - layerOffset) < 0 {\n\t\t\t\treturn \"\", errors.New(\"Lifecycle image did not contain expected layer count\")\n\t\t\t}\n\n\t\t\treturn manifestContents[0].Layers[len(manifestContents[0].Layers)-layerOffset], nil\n\t\t}()\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tif lifecycleLayerName == \"\" {\n\t\t\treturn \"\", errors.New(\"failed to find lifecycle 
layer\")\n\t\t}\n\t\tlifecycleLayerTar, err := func() (string, error) {\n\t\t\tlifecycleImageReader, err := os.Open(lifecycleImageTar)\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tdefer lifecycleImageReader.Close()\n\t\t\ttarReader := tar.NewReader(lifecycleImageReader)\n\t\t\tvar header *tar.Header\n\t\t\tif header, err = advanceTarToEntryWithName(tarReader, lifecycleLayerName); err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tlifecycleLayerTar := filepath.Join(filepath.Dir(lifecycleImageTar), filepath.Dir(lifecycleLayerName)+\".tar\") // this will be either <s0m3d1g3st>/layer.tar (docker < 25.x) OR blobs/sha256.tar (docker 25.x and later OR containerd storage enabled)\n\t\t\tif err = os.MkdirAll(filepath.Dir(lifecycleLayerTar), 0755); err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tlifecycleLayerWriter, err := os.OpenFile(lifecycleLayerTar, os.O_CREATE|os.O_RDWR, os.FileMode(header.Mode))\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tdefer lifecycleLayerWriter.Close()\n\t\t\tif _, err = io.Copy(lifecycleLayerWriter, tarReader); err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\treturn lifecycleLayerTar, nil\n\t\t}()\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tdiffID, err := func() (string, error) {\n\t\t\tlifecycleLayerReader, err := os.Open(lifecycleLayerTar)\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tdefer lifecycleLayerReader.Close()\n\t\t\thasher := sha256.New()\n\t\t\tif _, err = io.Copy(hasher, lifecycleLayerReader); err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\t// it's weird that this doesn't match lifecycleLayerTar\n\t\t\treturn hex.EncodeToString(hasher.Sum(nil)), nil\n\t\t}()\n\t\tif err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tif err = ephemeralRunImage.AddLayerWithDiffID(lifecycleLayerTar, \"sha256:\"+diffID); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\tif err = ephemeralRunImage.Save(); err != nil {\n\t\t\treturn \"\", err\n\t\t}\n\t\treturn 
ephemeralRunImageName, nil\n\t}\n\n\tif err = c.lifecycleExecutor.Execute(ctx, lifecycleOpts); err != nil {\n\t\treturn fmt.Errorf(\"executing lifecycle: %w\", err)\n\t}\n\treturn c.logImageNameAndSha(ctx, opts.Publish, imageRef, opts.InsecureRegistries)\n}\n\nfunc usesContainerdStorage(docker DockerClient) bool {\n\tresult, err := docker.Info(context.Background(), client.InfoOptions{})\n\tif err != nil {\n\t\treturn false\n\t}\n\n\tfor _, driverStatus := range result.Info.DriverStatus {\n\t\tif driverStatus[0] == \"driver-type\" && driverStatus[1] == \"io.containerd.snapshotter.v1\" {\n\t\t\treturn true\n\t\t}\n\t}\n\n\treturn false\n}\n\nfunc getTargetFromBuilder(builderImage imgutil.Image) (*dist.Target, error) {\n\tbuilderOS, err := builderImage.OS()\n\tif err != nil {\n\t\treturn nil, fmt.Errorf(\"failed to get builder OS: %w\", err)\n\t}\n\tbuilderArch, err := builderImage.Architecture()\n\tif err != nil {\n\t\treturn nil, fmt.Errorf(\"failed to get builder architecture: %w\", err)\n\t}\n\tbuilderArchVariant, err := builderImage.Variant()\n\tif err != nil {\n\t\treturn nil, fmt.Errorf(\"failed to get builder architecture variant: %w\", err)\n\t}\n\treturn &dist.Target{\n\t\tOS:          builderOS,\n\t\tArch:        builderArch,\n\t\tArchVariant: builderArchVariant,\n\t}, nil\n}\n\nfunc extractSupportedLifecycleApis(labels map[string]string) ([]string, error) {\n\t// sample contents of labels:\n\t//    {io.buildpacks.builder.metadata:\\\"{\\\"lifecycle\\\":{\\\"version\\\":\\\"0.15.3\\\"},\\\"api\\\":{\\\"buildpack\\\":\\\"0.2\\\",\\\"platform\\\":\\\"0.3\\\"}}\",\n\t//     
io.buildpacks.lifecycle.apis\":\"{\\\"buildpack\\\":{\\\"deprecated\\\":[],\\\"supported\\\":[\\\"0.2\\\",\\\"0.3\\\",\\\"0.4\\\",\\\"0.5\\\",\\\"0.6\\\",\\\"0.7\\\",\\\"0.8\\\",\\\"0.9\\\"]},\\\"platform\\\":{\\\"deprecated\\\":[],\\\"supported\\\":[\\\"0.3\\\",\\\"0.4\\\",\\\"0.5\\\",\\\"0.6\\\",\\\"0.7\\\",\\\"0.8\\\",\\\"0.9\\\",\\\"0.10\\\"]}}\\\",\\\"io.buildpacks.lifecycle.version\\\":\\\"0.15.3\\\"}\")\n\n\t// This struct is defined in lifecycle-repository/tools/image/main.go#Descriptor -- we could consider moving it from the main package to an importable location.\n\tvar bpPlatformAPI struct {\n\t\tPlatform struct {\n\t\t\tDeprecated []string\n\t\t\tSupported  []string\n\t\t}\n\t}\n\tif len(labels[\"io.buildpacks.lifecycle.apis\"]) > 0 {\n\t\terr := json.Unmarshal([]byte(labels[\"io.buildpacks.lifecycle.apis\"]), &bpPlatformAPI)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t\treturn append(bpPlatformAPI.Platform.Deprecated, bpPlatformAPI.Platform.Supported...), nil\n\t}\n\treturn []string{}, nil\n}\n\nfunc getFileFilter(descriptor projectTypes.Descriptor) (func(string) bool, error) {\n\tif len(descriptor.Build.Exclude) > 0 {\n\t\texcludes := ignore.CompileIgnoreLines(descriptor.Build.Exclude...)\n\t\treturn func(fileName string) bool {\n\t\t\treturn !excludes.MatchesPath(fileName)\n\t\t}, nil\n\t}\n\tif len(descriptor.Build.Include) > 0 {\n\t\tincludes := ignore.CompileIgnoreLines(descriptor.Build.Include...)\n\t\treturn includes.MatchesPath, nil\n\t}\n\n\treturn nil, nil\n}\n\nfunc supportsCreator(lifecycleVersion *builder.Version) bool {\n\t// Technically the creator is supported as of platform API version 0.3 (lifecycle version 0.7.0+) but earlier versions\n\t// have bugs that make using the creator problematic.\n\treturn !lifecycleVersion.LessThan(semver.MustParse(minLifecycleVersionSupportingCreator))\n}\n\nfunc supportsCreatorWithExtensions(lifecycleVersion *builder.Version) bool {\n\treturn 
!lifecycleVersion.LessThan(semver.MustParse(minLifecycleVersionSupportingCreatorWithExtensions))\n}\n\nfunc supportsLifecycleImage(lifecycleVersion *builder.Version) bool {\n\treturn lifecycleVersion.Equal(builder.VersionMustParse(prevLifecycleVersionSupportingImage)) ||\n\t\t!lifecycleVersion.LessThan(semver.MustParse(minLifecycleVersionSupportingImage))\n}\n\n// supportsPlatformAPI determines whether pack can build using the builder based on the builder's supported Platform API versions.\nfunc supportsPlatformAPI(builderPlatformAPIs builder.APISet) bool {\n\tfor _, packSupportedAPI := range build.SupportedPlatformAPIVersions {\n\t\tfor _, builderSupportedAPI := range builderPlatformAPIs {\n\t\t\tsupportsPlatform := packSupportedAPI.Compare(builderSupportedAPI) == 0\n\t\t\tif supportsPlatform {\n\t\t\t\treturn true\n\t\t\t}\n\t\t}\n\t}\n\n\treturn false\n}\n\nfunc (c *Client) processBuilderName(builderName string) (name.Reference, error) {\n\tif builderName == \"\" {\n\t\treturn nil, errors.New(\"builder is a required parameter if the client has no default builder\")\n\t}\n\treturn name.ParseReference(builderName, name.WeakValidation)\n}\n\nfunc (c *Client) getBuilder(img imgutil.Image) (*builder.Builder, error) {\n\tbldr, err := builder.FromImage(img)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tif bldr.Stack().RunImage.Image == \"\" && len(bldr.RunImages()) == 0 {\n\t\treturn nil, errors.New(\"builder metadata is missing run-image\")\n\t}\n\n\tlifecycleDescriptor := bldr.LifecycleDescriptor()\n\tif lifecycleDescriptor.Info.Version == nil {\n\t\treturn nil, errors.New(\"lifecycle version must be specified in builder\")\n\t}\n\tif len(lifecycleDescriptor.APIs.Buildpack.Supported) == 0 {\n\t\treturn nil, errors.New(\"supported Lifecycle Buildpack APIs not specified\")\n\t}\n\tif len(lifecycleDescriptor.APIs.Platform.Supported) == 0 {\n\t\treturn nil, errors.New(\"supported Lifecycle Platform APIs not specified\")\n\t}\n\n\treturn bldr, nil\n}\n\nfunc (c *Client) 
validateRunImage(context context.Context, name string, opts image.FetchOptions, expectedStack string) (runImage imgutil.Image, warnings []string, err error) {\n\tif name == \"\" {\n\t\treturn nil, nil, errors.New(\"run image must be specified\")\n\t}\n\timg, err := c.imageFetcher.Fetch(context, name, opts)\n\tif err != nil {\n\t\treturn nil, nil, err\n\t}\n\tstackID, err := img.Label(\"io.buildpacks.stack.id\")\n\tif err != nil {\n\t\treturn nil, nil, err\n\t}\n\n\tif stackID != expectedStack {\n\t\twarnings = append(warnings, \"deprecated usage of stack\")\n\t}\n\n\treturn img, warnings, err\n}\n\nfunc (c *Client) validateMixins(additionalBuildpacks []buildpack.BuildModule, bldr *builder.Builder, runImageName string, runMixins []string) error {\n\tif err := stack.ValidateMixins(bldr.Image().Name(), bldr.Mixins(), runImageName, runMixins); err != nil {\n\t\treturn err\n\t}\n\n\tbps, err := allBuildpacks(bldr.Image(), additionalBuildpacks)\n\tif err != nil {\n\t\treturn err\n\t}\n\tmixins := assembleAvailableMixins(bldr.Mixins(), runMixins)\n\n\tfor _, bp := range bps {\n\t\tif err := bp.EnsureStackSupport(bldr.StackID, mixins, true); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\treturn nil\n}\n\n// assembleAvailableMixins returns the set of mixins that are common between the two provided sets, plus build-only mixins and run-only mixins.\nfunc assembleAvailableMixins(buildMixins, runMixins []string) []string {\n\t// NOTE: We cannot simply union the two mixin sets, as this could introduce a mixin that is only present on one stack\n\t// image but not the other. A buildpack that happens to require the mixin would fail to run properly, even though validation\n\t// would pass.\n\t//\n\t// For example:\n\t//\n\t//  Incorrect:\n\t//    Run image mixins:   [A, B]\n\t//    Build image mixins: [A]\n\t//    Merged: [A, B]\n\t//    Buildpack requires: [A, B]\n\t//    Match? 
Yes\n\t//\n\t//  Correct:\n\t//    Run image mixins:   [A, B]\n\t//    Build image mixins: [A]\n\t//    Merged: [A]\n\t//    Buildpack requires: [A, B]\n\t//    Match? No\n\n\tbuildOnly := stack.FindStageMixins(buildMixins, \"build\")\n\trunOnly := stack.FindStageMixins(runMixins, \"run\")\n\t_, _, common := stringset.Compare(buildMixins, runMixins)\n\n\treturn append(common, append(buildOnly, runOnly...)...)\n}\n\n// allBuildpacks aggregates all buildpacks declared on the image with additional buildpacks passed in. They are sorted\n// by ID then Version.\nfunc allBuildpacks(builderImage imgutil.Image, additionalBuildpacks []buildpack.BuildModule) ([]buildpack.Descriptor, error) {\n\tvar all []buildpack.Descriptor\n\tvar bpLayers dist.ModuleLayers\n\tif _, err := dist.GetLabel(builderImage, dist.BuildpackLayersLabel, &bpLayers); err != nil {\n\t\treturn nil, err\n\t}\n\tfor id, bps := range bpLayers {\n\t\tfor ver, bp := range bps {\n\t\t\tdesc := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      id,\n\t\t\t\t\tVersion: ver,\n\t\t\t\t},\n\t\t\t\tWithStacks:  bp.Stacks,\n\t\t\t\tWithTargets: bp.Targets,\n\t\t\t\tWithOrder:   bp.Order,\n\t\t\t}\n\t\t\tall = append(all, &desc)\n\t\t}\n\t}\n\tfor _, bp := range additionalBuildpacks {\n\t\tall = append(all, bp.Descriptor())\n\t}\n\n\tsort.Slice(all, func(i, j int) bool {\n\t\tif all[i].Info().ID != all[j].Info().ID {\n\t\t\treturn all[i].Info().ID < all[j].Info().ID\n\t\t}\n\t\treturn all[i].Info().Version < all[j].Info().Version\n\t})\n\n\treturn all, nil\n}\n\nfunc (c *Client) processAppPath(appPath string) (string, error) {\n\tvar (\n\t\tresolvedAppPath string\n\t\terr             error\n\t)\n\n\tif appPath == \"\" {\n\t\tif appPath, err = os.Getwd(); err != nil {\n\t\t\treturn \"\", errors.Wrap(err, \"get working dir\")\n\t\t}\n\t}\n\n\tif resolvedAppPath, err = filepath.EvalSymlinks(appPath); err != nil {\n\t\treturn \"\", errors.Wrap(err, \"evaluate symlink\")\n\t}\n\n\tif 
resolvedAppPath, err = filepath.Abs(resolvedAppPath); err != nil {\n\t\treturn \"\", errors.Wrap(err, \"resolve absolute path\")\n\t}\n\n\tfi, err := os.Stat(resolvedAppPath)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"stat file\")\n\t}\n\n\tif !fi.IsDir() {\n\t\tisZip, err := archive.IsZip(filepath.Clean(resolvedAppPath))\n\t\tif err != nil {\n\t\t\treturn \"\", errors.Wrap(err, \"check zip\")\n\t\t}\n\n\t\tif !isZip {\n\t\t\treturn \"\", errors.New(\"app path must be a directory or zip\")\n\t\t}\n\t}\n\n\treturn resolvedAppPath, nil\n}\n\n// processLayoutPath, given an image reference and a previous image reference, calculates the local full path\n// and the expected path in the lifecycle container for both provided images. Those values can be used to\n// mount the correct volumes\nfunc (c *Client) processLayoutPath(inputImageRef, previousImageRef InputImageReference) (layoutPathConfig, error) {\n\tvar (\n\t\thostImagePath, hostPreviousImagePath, targetImagePath, targetPreviousImagePath string\n\t\terr                                                                            error\n\t)\n\thostImagePath, err = fullImagePath(inputImageRef, true)\n\tif err != nil {\n\t\treturn layoutPathConfig{}, err\n\t}\n\ttargetImagePath, err = layout.ParseRefToPath(inputImageRef.Name())\n\tif err != nil {\n\t\treturn layoutPathConfig{}, err\n\t}\n\ttargetImagePath = filepath.Join(paths.RootDir, \"layout-repo\", targetImagePath)\n\tc.logger.Debugf(\"local image path %s will be mounted into the container at path %s\", hostImagePath, targetImagePath)\n\n\tif previousImageRef != nil && previousImageRef.Name() != \"\" {\n\t\thostPreviousImagePath, err = fullImagePath(previousImageRef, false)\n\t\tif err != nil {\n\t\t\treturn layoutPathConfig{}, err\n\t\t}\n\t\ttargetPreviousImagePath, err = layout.ParseRefToPath(previousImageRef.Name())\n\t\tif err != nil {\n\t\t\treturn layoutPathConfig{}, err\n\t\t}\n\t\ttargetPreviousImagePath = filepath.Join(paths.RootDir, 
\"layout-repo\", targetPreviousImagePath)\n\t\tc.logger.Debugf(\"local previous image path %s will be mounted into the container at path %s\", hostPreviousImagePath, targetPreviousImagePath)\n\t}\n\treturn layoutPathConfig{\n\t\thostImagePath:           hostImagePath,\n\t\ttargetImagePath:         targetImagePath,\n\t\thostPreviousImagePath:   hostPreviousImagePath,\n\t\ttargetPreviousImagePath: targetPreviousImagePath,\n\t}, nil\n}\n\nfunc (c *Client) parseReference(opts BuildOptions) (name.Reference, error) {\n\tif !opts.Layout() {\n\t\treturn c.parseTagReference(opts.Image)\n\t}\n\tbase := filepath.Base(opts.Image)\n\treturn c.parseTagReference(base)\n}\n\nfunc (c *Client) processProxyConfig(config *ProxyConfig) ProxyConfig {\n\tvar (\n\t\thttpProxy, httpsProxy, noProxy string\n\t\tok                             bool\n\t)\n\tif config != nil {\n\t\treturn *config\n\t}\n\tif httpProxy, ok = os.LookupEnv(\"HTTP_PROXY\"); !ok {\n\t\thttpProxy = os.Getenv(\"http_proxy\")\n\t}\n\tif httpsProxy, ok = os.LookupEnv(\"HTTPS_PROXY\"); !ok {\n\t\thttpsProxy = os.Getenv(\"https_proxy\")\n\t}\n\tif noProxy, ok = os.LookupEnv(\"NO_PROXY\"); !ok {\n\t\tnoProxy = os.Getenv(\"no_proxy\")\n\t}\n\treturn ProxyConfig{\n\t\tHTTPProxy:  httpProxy,\n\t\tHTTPSProxy: httpsProxy,\n\t\tNoProxy:    noProxy,\n\t}\n}\n\n// processBuildpacks computes an order group based on the existing builder order and declared buildpacks. 
Additionally,\n// it returns buildpacks that should be added to the builder.\n//\n// Visual examples:\n//\n//\t\tBUILDER ORDER\n//\t\t----------\n//\t - group:\n//\t\t\t- A\n//\t\t\t- B\n//\t - group:\n//\t\t\t- A\n//\n//\t\tWITH DECLARED: \"from=builder\", X\n//\t\t----------\n//\t\t- group:\n//\t\t\t- A\n//\t\t\t- B\n//\t\t\t- X\n//\t\t - group:\n//\t\t\t- A\n//\t\t\t- X\n//\n//\t\tWITH DECLARED: X, \"from=builder\", Y\n//\t\t----------\n//\t\t- group:\n//\t\t\t- X\n//\t\t\t- A\n//\t\t\t- B\n//\t     - Y\n//\t\t- group:\n//\t\t\t- X\n//\t\t\t- A\n//\t     - Y\n//\n//\t\tWITH DECLARED: X\n//\t\t----------\n//\t\t- group:\n//\t\t\t- X\n//\n//\t\tWITH DECLARED: A\n//\t\t----------\n//\t\t- group:\n//\t\t\t- A\nfunc (c *Client) processBuildpacks(ctx context.Context, builderBPs []dist.ModuleInfo, builderOrder dist.Order, stackID string, opts BuildOptions, targetToUse *dist.Target) (fetchedBPs []buildpack.BuildModule, nInlineBPs int, order dist.Order, err error) {\n\trelativeBaseDir := opts.RelativeBaseDir\n\tdeclaredBPs := opts.Buildpacks\n\n\t// Buildpacks from --buildpack override buildpacks from project descriptor\n\tif len(declaredBPs) == 0 && len(opts.ProjectDescriptor.Build.Buildpacks) != 0 {\n\t\trelativeBaseDir = opts.ProjectDescriptorBaseDir\n\n\t\tfor _, bp := range opts.ProjectDescriptor.Build.Buildpacks {\n\t\t\tbuildpackLocator, isInline, err := getBuildpackLocator(bp, stackID)\n\t\t\tif err != nil {\n\t\t\t\treturn nil, 0, nil, err\n\t\t\t}\n\t\t\tif isInline {\n\t\t\t\tnInlineBPs++\n\t\t\t}\n\t\t\tdeclaredBPs = append(declaredBPs, buildpackLocator)\n\t\t}\n\t}\n\n\torder = dist.Order{{Group: []dist.ModuleRef{}}}\n\tfor _, bp := range declaredBPs {\n\t\tlocatorType, err := buildpack.GetLocatorType(bp, relativeBaseDir, builderBPs)\n\t\tif err != nil {\n\t\t\treturn nil, 0, nil, err\n\t\t}\n\n\t\tswitch locatorType {\n\t\tcase buildpack.FromBuilderLocator:\n\t\t\tswitch {\n\t\t\tcase len(order) == 0 || len(order[0].Group) == 0:\n\t\t\t\torder = 
builderOrder\n\t\t\tcase len(order) > 1:\n\t\t\t\t// This should only ever be possible if they are using from=builder twice which we don't allow\n\t\t\t\treturn nil, 0, nil, errors.New(\"buildpacks from builder can only be defined once\")\n\t\t\tdefault:\n\t\t\t\tnewOrder := dist.Order{}\n\t\t\t\tgroupToAdd := order[0].Group\n\t\t\t\tfor _, bOrderEntry := range builderOrder {\n\t\t\t\t\tnewEntry := dist.OrderEntry{Group: append(groupToAdd, bOrderEntry.Group...)}\n\t\t\t\t\tnewOrder = append(newOrder, newEntry)\n\t\t\t\t}\n\n\t\t\t\torder = newOrder\n\t\t\t}\n\t\tdefault:\n\t\t\tnewFetchedBPs, moduleInfo, err := c.fetchBuildpack(ctx, bp, relativeBaseDir, builderBPs, opts, buildpack.KindBuildpack, targetToUse)\n\t\t\tif err != nil {\n\t\t\t\treturn fetchedBPs, 0, order, err\n\t\t\t}\n\t\t\tfetchedBPs = append(fetchedBPs, newFetchedBPs...)\n\t\t\torder = appendBuildpackToOrder(order, *moduleInfo)\n\t\t}\n\t}\n\n\tif (len(order) == 0 || len(order[0].Group) == 0) && len(builderOrder) > 0 {\n\t\tpreBuildpacks := opts.PreBuildpacks\n\t\tpostBuildpacks := opts.PostBuildpacks\n\t\t// Pre-buildpacks from --pre-buildpack override pre-buildpacks from project descriptor\n\t\tif len(preBuildpacks) == 0 && len(opts.ProjectDescriptor.Build.Pre.Buildpacks) > 0 {\n\t\t\tfor _, bp := range opts.ProjectDescriptor.Build.Pre.Buildpacks {\n\t\t\t\tbuildpackLocator, isInline, err := getBuildpackLocator(bp, stackID)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn nil, 0, nil, errors.Wrap(err, \"get pre-buildpack locator\")\n\t\t\t\t}\n\t\t\t\tif isInline {\n\t\t\t\t\tnInlineBPs++\n\t\t\t\t}\n\t\t\t\tpreBuildpacks = append(preBuildpacks, buildpackLocator)\n\t\t\t}\n\t\t}\n\t\t// Post-buildpacks from --post-buildpack override post-buildpacks from project descriptor\n\t\tif len(postBuildpacks) == 0 && len(opts.ProjectDescriptor.Build.Post.Buildpacks) > 0 {\n\t\t\tfor _, bp := range opts.ProjectDescriptor.Build.Post.Buildpacks {\n\t\t\t\tbuildpackLocator, isInline, err := getBuildpackLocator(bp, 
stackID)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn nil, 0, nil, errors.Wrap(err, \"get post-buildpack locator\")\n\t\t\t\t}\n\t\t\t\tif isInline {\n\t\t\t\t\tnInlineBPs++\n\t\t\t\t}\n\t\t\t\tpostBuildpacks = append(postBuildpacks, buildpackLocator)\n\t\t\t}\n\t\t}\n\n\t\tif len(preBuildpacks) > 0 || len(postBuildpacks) > 0 {\n\t\t\torder = builderOrder\n\t\t\tfor _, bp := range preBuildpacks {\n\t\t\t\tnewFetchedBPs, moduleInfo, err := c.fetchBuildpack(ctx, bp, relativeBaseDir, builderBPs, opts, buildpack.KindBuildpack, targetToUse)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn fetchedBPs, 0, order, err\n\t\t\t\t}\n\t\t\t\tfetchedBPs = append(fetchedBPs, newFetchedBPs...)\n\t\t\t\torder = prependBuildpackToOrder(order, *moduleInfo)\n\t\t\t}\n\n\t\t\tfor _, bp := range postBuildpacks {\n\t\t\t\tnewFetchedBPs, moduleInfo, err := c.fetchBuildpack(ctx, bp, relativeBaseDir, builderBPs, opts, buildpack.KindBuildpack, targetToUse)\n\t\t\t\tif err != nil {\n\t\t\t\t\treturn fetchedBPs, 0, order, err\n\t\t\t\t}\n\t\t\t\tfetchedBPs = append(fetchedBPs, newFetchedBPs...)\n\t\t\t\torder = appendBuildpackToOrder(order, *moduleInfo)\n\t\t\t}\n\t\t}\n\t}\n\n\treturn fetchedBPs, nInlineBPs, order, nil\n}\n\nfunc (c *Client) fetchBuildpack(ctx context.Context, bp string, relativeBaseDir string, builderBPs []dist.ModuleInfo, opts BuildOptions, kind string, targetToUse *dist.Target) ([]buildpack.BuildModule, *dist.ModuleInfo, error) {\n\tpullPolicy := opts.PullPolicy\n\tpublish := opts.Publish\n\tregistry := opts.Registry\n\n\tlocatorType, err := buildpack.GetLocatorType(bp, relativeBaseDir, builderBPs)\n\tif err != nil {\n\t\treturn nil, nil, err\n\t}\n\n\tfetchedBPs := []buildpack.BuildModule{}\n\tvar moduleInfo *dist.ModuleInfo\n\tswitch locatorType {\n\tcase buildpack.IDLocator:\n\t\tid, version := buildpack.ParseIDLocator(bp)\n\t\tmoduleInfo = &dist.ModuleInfo{\n\t\t\tID:      id,\n\t\t\tVersion: version,\n\t\t}\n\tdefault:\n\t\tdownloadOptions := 
buildpack.DownloadOptions{\n\t\t\tRegistryName:    registry,\n\t\t\tTarget:          targetToUse,\n\t\t\tRelativeBaseDir: relativeBaseDir,\n\t\t\tDaemon:          !publish,\n\t\t\tPullPolicy:      pullPolicy,\n\t\t}\n\t\tif kind == buildpack.KindExtension {\n\t\t\tdownloadOptions.ModuleKind = kind\n\t\t}\n\t\tmainBP, depBPs, err := c.buildpackDownloader.Download(ctx, bp, downloadOptions)\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrap(err, \"downloading buildpack\")\n\t\t}\n\t\tfetchedBPs = append(append(fetchedBPs, mainBP), depBPs...)\n\t\tmainBPInfo := mainBP.Descriptor().Info()\n\t\tmoduleInfo = &mainBPInfo\n\n\t\tpackageCfgPath := filepath.Join(bp, \"package.toml\")\n\t\t_, err = os.Stat(packageCfgPath)\n\t\tif err == nil {\n\t\t\tfetchedDeps, err := c.fetchBuildpackDependencies(ctx, bp, packageCfgPath, downloadOptions)\n\t\t\tif err != nil {\n\t\t\t\treturn nil, nil, errors.Wrapf(err, \"fetching package.toml dependencies (path=%s)\", style.Symbol(packageCfgPath))\n\t\t\t}\n\t\t\tfetchedBPs = append(fetchedBPs, fetchedDeps...)\n\t\t}\n\t}\n\treturn fetchedBPs, moduleInfo, nil\n}\n\nfunc (c *Client) fetchBuildpackDependencies(ctx context.Context, bp string, packageCfgPath string, downloadOptions buildpack.DownloadOptions) ([]buildpack.BuildModule, error) {\n\tpackageReader := buildpackage.NewConfigReader()\n\tpackageCfg, err := packageReader.Read(packageCfgPath)\n\tif err == nil {\n\t\tfetchedBPs := []buildpack.BuildModule{}\n\t\tfor _, dep := range packageCfg.Dependencies {\n\t\t\tmainBP, deps, err := c.buildpackDownloader.Download(ctx, dep.URI, buildpack.DownloadOptions{\n\t\t\t\tRegistryName:    downloadOptions.RegistryName,\n\t\t\t\tTarget:          downloadOptions.Target,\n\t\t\t\tDaemon:          downloadOptions.Daemon,\n\t\t\t\tPullPolicy:      downloadOptions.PullPolicy,\n\t\t\t\tRelativeBaseDir: filepath.Join(bp, packageCfg.Buildpack.URI),\n\t\t\t})\n\n\t\t\tif err != nil {\n\t\t\t\treturn nil, errors.Wrapf(err, \"fetching dependencies 
(uri=%s,image=%s)\", style.Symbol(dep.URI), style.Symbol(dep.ImageName))\n\t\t\t}\n\n\t\t\tfetchedBPs = append(append(fetchedBPs, mainBP), deps...)\n\t\t}\n\t\treturn fetchedBPs, nil\n\t}\n\treturn nil, err\n}\n\nfunc getBuildpackLocator(bp projectTypes.Buildpack, stackID string) (locator string, isInline bool, err error) {\n\tswitch {\n\tcase bp.ID != \"\" && bp.Script.Inline != \"\" && bp.URI == \"\":\n\t\tif bp.Script.API == \"\" {\n\t\t\treturn \"\", false, errors.New(\"Missing API version for inline buildpack\")\n\t\t}\n\n\t\tpathToInlineBuildpack, err := createInlineBuildpack(bp, stackID)\n\t\tif err != nil {\n\t\t\treturn \"\", false, errors.Wrap(err, \"Could not create temporary inline buildpack\")\n\t\t}\n\t\treturn pathToInlineBuildpack, true, nil\n\tcase bp.URI != \"\":\n\t\treturn bp.URI, false, nil\n\tcase bp.ID != \"\" && bp.Version != \"\":\n\t\treturn fmt.Sprintf(\"%s@%s\", bp.ID, bp.Version), false, nil\n\tcase bp.ID != \"\" && bp.Version == \"\":\n\t\treturn bp.ID, false, nil\n\tdefault:\n\t\treturn \"\", false, errors.New(\"Invalid buildpack definition\")\n\t}\n}\n\nfunc appendBuildpackToOrder(order dist.Order, bpInfo dist.ModuleInfo) (newOrder dist.Order) {\n\tfor _, orderEntry := range order {\n\t\tnewEntry := orderEntry\n\t\tnewEntry.Group = append(newEntry.Group, dist.ModuleRef{\n\t\t\tModuleInfo: bpInfo,\n\t\t\tOptional:   false,\n\t\t})\n\t\tnewOrder = append(newOrder, newEntry)\n\t}\n\n\treturn newOrder\n}\n\nfunc prependBuildpackToOrder(order dist.Order, bpInfo dist.ModuleInfo) (newOrder dist.Order) {\n\tfor _, orderEntry := range order {\n\t\tnewEntry := orderEntry\n\t\tnewGroup := []dist.ModuleRef{{\n\t\t\tModuleInfo: bpInfo,\n\t\t\tOptional:   false,\n\t\t}}\n\t\tnewEntry.Group = append(newGroup, newEntry.Group...)\n\t\tnewOrder = append(newOrder, newEntry)\n\t}\n\n\treturn newOrder\n}\n\nfunc (c *Client) processExtensions(ctx context.Context, builderExs []dist.ModuleInfo, opts BuildOptions, targetToUse *dist.Target) (fetchedExs 
[]buildpack.BuildModule, orderExtensions dist.Order, err error) {\n\trelativeBaseDir := opts.RelativeBaseDir\n\tdeclaredExs := opts.Extensions\n\n\torderExtensions = dist.Order{{Group: []dist.ModuleRef{}}}\n\tfor _, ex := range declaredExs {\n\t\tlocatorType, err := buildpack.GetLocatorType(ex, relativeBaseDir, builderExs)\n\t\tif err != nil {\n\t\t\treturn nil, nil, err\n\t\t}\n\n\t\tswitch locatorType {\n\t\tcase buildpack.RegistryLocator:\n\t\t\treturn nil, nil, errors.New(\"RegistryLocator type is not valid for extensions\")\n\t\tcase buildpack.FromBuilderLocator:\n\t\t\treturn nil, nil, errors.New(\"from builder is not supported for extensions\")\n\t\tdefault:\n\t\t\tnewFetchedExs, moduleInfo, err := c.fetchBuildpack(ctx, ex, relativeBaseDir, builderExs, opts, buildpack.KindExtension, targetToUse)\n\t\t\tif err != nil {\n\t\t\t\treturn fetchedExs, orderExtensions, err\n\t\t\t}\n\t\t\tfetchedExs = append(fetchedExs, newFetchedExs...)\n\t\t\torderExtensions = prependBuildpackToOrder(orderExtensions, *moduleInfo)\n\t\t}\n\t}\n\n\treturn fetchedExs, orderExtensions, nil\n}\n\nfunc userAndGroupIDs(img imgutil.Image) (int, int, error) {\n\tsUID, err := img.Env(builder.EnvUID)\n\tif err != nil {\n\t\treturn 0, 0, errors.Wrap(err, \"reading builder env variables\")\n\t} else if sUID == \"\" {\n\t\treturn 0, 0, fmt.Errorf(\"image %s missing required env var %s\", style.Symbol(img.Name()), style.Symbol(builder.EnvUID))\n\t}\n\n\tsGID, err := img.Env(builder.EnvGID)\n\tif err != nil {\n\t\treturn 0, 0, errors.Wrap(err, \"reading builder env variables\")\n\t} else if sGID == \"\" {\n\t\treturn 0, 0, fmt.Errorf(\"image %s missing required env var %s\", style.Symbol(img.Name()), style.Symbol(builder.EnvGID))\n\t}\n\n\tvar uid, gid int\n\tuid, err = strconv.Atoi(sUID)\n\tif err != nil {\n\t\treturn 0, 0, fmt.Errorf(\"failed to parse %s, value %s should be an integer\", style.Symbol(builder.EnvUID), style.Symbol(sUID))\n\t}\n\n\tgid, err = strconv.Atoi(sGID)\n\tif err != nil 
{\n\t\treturn 0, 0, fmt.Errorf(\"failed to parse %s, value %s should be an integer\", style.Symbol(builder.EnvGID), style.Symbol(sGID))\n\t}\n\n\treturn uid, gid, nil\n}\n\nfunc workspacePathForOS(os, workspace string) string {\n\tif workspace == \"\" {\n\t\tworkspace = \"workspace\"\n\t}\n\tif os == \"windows\" {\n\t\t// note: the ephemeral lifecycle is not used when the OS is Windows\n\t\treturn \"c:\\\\\" + workspace\n\t}\n\treturn \"/\" + workspace\n}\n\nfunc (c *Client) addUserMountpoints(lifecycleImage imgutil.Image, dest string, workspace string, uid int, gid int) (string, error) {\n\t// today only the workspace dir needs to be added; future dirs can easily be added here if required.\n\n\timageOS, err := lifecycleImage.OS()\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"getting image OS\")\n\t}\n\tlayerWriterFactory, err := layer.NewWriterFactory(imageOS)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tworkspace = workspacePathForOS(imageOS, workspace)\n\n\tfh, err := os.Create(filepath.Join(dest, \"dirs.tar\"))\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer fh.Close()\n\n\tlw := layerWriterFactory.NewWriter(fh)\n\tdefer lw.Close()\n\n\tfor _, path := range []string{workspace} {\n\t\tif err := lw.WriteHeader(&tar.Header{\n\t\t\tTypeflag: tar.TypeDir,\n\t\t\tName:     path,\n\t\t\tMode:     0755,\n\t\t\tModTime:  archive.NormalizedDateTime,\n\t\t\tUid:      uid,\n\t\t\tGid:      gid,\n\t\t}); err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"creating %s mountpoint dir in layer\", style.Symbol(path))\n\t\t}\n\t}\n\n\treturn fh.Name(), nil\n}\n\nfunc (c *Client) createEphemeralLifecycle(lifecycleImage imgutil.Image, workspace string, uid int, gid int) (imgutil.Image, error) {\n\tlifecycleImage.Rename(fmt.Sprintf(\"pack.local/lifecycle/%x:latest\", randString(10)))\n\n\ttmpDir, err := os.MkdirTemp(\"\", \"create-lifecycle-scratch\")\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer os.RemoveAll(tmpDir)\n\tdirsTar, err := c.addUserMountpoints(lifecycleImage, 
tmpDir, workspace, uid, gid)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tif err := lifecycleImage.AddLayer(dirsTar); err != nil {\n\t\treturn nil, errors.Wrap(err, \"adding mountpoint dirs layer\")\n\t}\n\n\terr = lifecycleImage.Save()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn lifecycleImage, nil\n}\n\nfunc (c *Client) createEphemeralBuilder(\n\trawBuilderImage imgutil.Image,\n\tenv map[string]string,\n\torder dist.Order,\n\tbuildpacks []buildpack.BuildModule,\n\torderExtensions dist.Order,\n\textensions []buildpack.BuildModule,\n\tvalidateMixins bool,\n\trunImage string,\n\tsystem dist.System,\n\tdisableSystem bool,\n) (*builder.Builder, error) {\n\tif !ephemeralBuilderNeeded(env, order, buildpacks, orderExtensions, extensions, runImage) && !disableSystem {\n\t\treturn builder.New(rawBuilderImage, rawBuilderImage.Name(), builder.WithoutSave())\n\t}\n\n\torigBuilderName := rawBuilderImage.Name()\n\tbldr, err := builder.New(rawBuilderImage, fmt.Sprintf(\"pack.local/builder/%x:latest\", randString(10)), builder.WithRunImage(runImage))\n\tif err != nil {\n\t\treturn nil, errors.Wrapf(err, \"invalid builder %s\", style.Symbol(origBuilderName))\n\t}\n\n\tbldr.SetEnv(env)\n\tfor _, bp := range buildpacks {\n\t\tbpInfo := bp.Descriptor().Info()\n\t\tc.logger.Debugf(\"Adding buildpack %s version %s to builder\", style.Symbol(bpInfo.ID), style.Symbol(bpInfo.Version))\n\t\tbldr.AddBuildpack(bp)\n\t}\n\tif len(order) > 0 && len(order[0].Group) > 0 {\n\t\tc.logger.Debug(\"Setting custom order\")\n\t\tbldr.SetOrder(order)\n\t}\n\n\tfor _, ex := range extensions {\n\t\texInfo := ex.Descriptor().Info()\n\t\tc.logger.Debugf(\"Adding extension %s version %s to builder\", style.Symbol(exInfo.ID), style.Symbol(exInfo.Version))\n\t\tbldr.AddExtension(ex)\n\t}\n\tif len(orderExtensions) > 0 && len(orderExtensions[0].Group) > 0 {\n\t\tc.logger.Debug(\"Setting custom order for 
extensions\")\n\t\tbldr.SetOrderExtensions(orderExtensions)\n\t}\n\n\tbldr.SetValidateMixins(validateMixins)\n\tbldr.SetSystem(system)\n\n\tif err := bldr.Save(c.logger, builder.CreatorMetadata{Version: c.version}); err != nil {\n\t\treturn nil, err\n\t}\n\treturn bldr, nil\n}\n\nfunc ephemeralBuilderNeeded(\n\tenv map[string]string,\n\torder dist.Order,\n\tbuildpacks []buildpack.BuildModule,\n\torderExtensions dist.Order,\n\textensions []buildpack.BuildModule,\n\trunImage string,\n) bool {\n\tif len(env) > 0 {\n\t\treturn true\n\t}\n\tif len(order) > 0 && len(order[0].Group) > 0 {\n\t\treturn true\n\t}\n\tif len(buildpacks) > 0 {\n\t\treturn true\n\t}\n\tif len(orderExtensions) > 0 && len(orderExtensions[0].Group) > 0 {\n\t\treturn true\n\t}\n\tif len(extensions) > 0 {\n\t\treturn true\n\t}\n\tif runImage != \"\" {\n\t\treturn true\n\t}\n\treturn false\n}\n\n// Returns a string with lowercase a-z, of length n\nfunc randString(n int) string {\n\tb := make([]byte, n)\n\t_, err := rand.Read(b)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\tfor i := range b {\n\t\tb[i] = 'a' + (b[i] % 26)\n\t}\n\treturn string(b)\n}\n\nfunc (c *Client) logImageNameAndSha(ctx context.Context, publish bool, imageRef name.Reference, insecureRegistries []string) error {\n\t// The image name and sha are printed in the lifecycle logs, and there is no need to print it again, unless output is suppressed.\n\tif !logging.IsQuiet(c.logger) {\n\t\treturn nil\n\t}\n\n\timg, err := c.imageFetcher.Fetch(ctx, imageRef.Name(), image.FetchOptions{Daemon: !publish, PullPolicy: image.PullNever, InsecureRegistries: insecureRegistries})\n\tif err != nil {\n\t\treturn fmt.Errorf(\"fetching built image: %w\", err)\n\t}\n\n\tid, err := img.Identifier()\n\tif err != nil {\n\t\treturn fmt.Errorf(\"reading image sha: %w\", err)\n\t}\n\n\t// Remove tag, if it exists, from the image name\n\timgName := strings.TrimSuffix(imageRef.String(), imageRef.Identifier())\n\timgNameAndSha := fmt.Sprintf(\"%s@%s\\n\", imgName, 
parseDigestFromImageID(id))\n\n\t// Access the logger's Writer directly to bypass ReportSuccessfulQuietBuild mode\n\t_, err = c.logger.Writer().Write([]byte(imgNameAndSha))\n\treturn err\n}\n\nfunc parseDigestFromImageID(id imgutil.Identifier) string {\n\tvar digest string\n\tswitch v := id.(type) {\n\tcase local.IDIdentifier:\n\t\tdigest = v.String()\n\tcase remote.DigestIdentifier:\n\t\tdigest = v.Digest.DigestStr()\n\t}\n\n\tdigest = strings.TrimPrefix(digest, \"sha256:\")\n\treturn fmt.Sprintf(\"sha256:%s\", digest)\n}\n\nfunc createInlineBuildpack(bp projectTypes.Buildpack, stackID string) (string, error) {\n\tpathToInlineBuildpack, err := os.MkdirTemp(\"\", \"inline-cnb\")\n\tif err != nil {\n\t\treturn pathToInlineBuildpack, err\n\t}\n\n\tif bp.Version == \"\" {\n\t\tbp.Version = \"0.0.0\"\n\t}\n\n\tif err = createBuildpackTOML(pathToInlineBuildpack, bp.ID, bp.Version, bp.Script.API, []dist.Stack{{ID: stackID}}, []dist.Target{}, nil); err != nil {\n\t\treturn pathToInlineBuildpack, err\n\t}\n\n\tshell := bp.Script.Shell\n\tif shell == \"\" {\n\t\tshell = \"/bin/sh\"\n\t}\n\n\tbinBuild := fmt.Sprintf(`#!%s\n\n%s\n`, shell, bp.Script.Inline)\n\n\tbinDetect := fmt.Sprintf(`#!%s\n\nexit 0\n`, shell)\n\n\tif err = createBinScript(pathToInlineBuildpack, \"build\", binBuild, nil); err != nil {\n\t\treturn pathToInlineBuildpack, err\n\t}\n\n\tif err = createBinScript(pathToInlineBuildpack, \"build.bat\", bp.Script.Inline, nil); err != nil {\n\t\treturn pathToInlineBuildpack, err\n\t}\n\n\tif err = createBinScript(pathToInlineBuildpack, \"detect\", binDetect, nil); err != nil {\n\t\treturn pathToInlineBuildpack, err\n\t}\n\n\tif err = createBinScript(pathToInlineBuildpack, \"detect.bat\", bp.Script.Inline, nil); err != nil {\n\t\treturn pathToInlineBuildpack, err\n\t}\n\n\treturn pathToInlineBuildpack, nil\n}\n\n// fullImagePath parses the inputImageReference provided by the user and creates the directory\n// structure if create is true\nfunc fullImagePath(inputImageRef 
InputImageReference, create bool) (string, error) {\n\timagePath, err := inputImageRef.FullName()\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"evaluating image %s destination path\", inputImageRef.Name())\n\t}\n\n\tif create {\n\t\tif err := os.MkdirAll(imagePath, os.ModePerm); err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"creating %s layout application destination\", imagePath)\n\t\t}\n\t}\n\n\treturn imagePath, nil\n}\n\n// appendLayoutVolumes mounts host volumes into the build container, in the form '<host path>:<target path>[:<options>]'.\n// The volumes mounted are:\n// - The path where the user wants the image to be exported in OCI layout format\n// - The previous image path, if it exists\n// - The run-image path\nfunc appendLayoutVolumes(volumes []string, config layoutPathConfig) []string {\n\tif config.hostPreviousImagePath != \"\" {\n\t\tvolumes = append(volumes, readOnlyVolume(config.hostPreviousImagePath, config.targetPreviousImagePath),\n\t\t\treadOnlyVolume(config.hostRunImagePath, config.targetRunImagePath),\n\t\t\twritableVolume(config.hostImagePath, config.targetImagePath))\n\t} else {\n\t\tvolumes = append(volumes, readOnlyVolume(config.hostRunImagePath, config.targetRunImagePath),\n\t\t\twritableVolume(config.hostImagePath, config.targetImagePath))\n\t}\n\treturn volumes\n}\n\nfunc writableVolume(hostPath, targetPath string) string {\n\ttp := targetPath\n\tif !filepath.IsAbs(targetPath) {\n\t\ttp = filepath.Join(string(filepath.Separator), targetPath)\n\t}\n\treturn fmt.Sprintf(\"%s:%s:rw\", hostPath, tp)\n}\n\nfunc readOnlyVolume(hostPath, targetPath string) string {\n\ttp := targetPath\n\tif !filepath.IsAbs(targetPath) {\n\t\ttp = filepath.Join(string(filepath.Separator), targetPath)\n\t}\n\treturn fmt.Sprintf(\"%s:%s\", hostPath, tp)\n}\n"
  },
  {
    "path": "pkg/client/build_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"crypto/sha256\"\n\t\"encoding/hex\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"net/http\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/imgutil/local\"\n\t\"github.com/buildpacks/imgutil/remote\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/heroku/color\"\n\tdockerclient \"github.com/moby/moby/client\"\n\t\"github.com/onsi/gomega/ghttp\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\tcfg \"github.com/buildpacks/pack/internal/config\"\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\trg \"github.com/buildpacks/pack/internal/registry\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\tprojectTypes \"github.com/buildpacks/pack/pkg/project/types\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuild(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"build\", testBuild, spec.Report(report.Terminal{}))\n}\n\nfunc testBuild(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject                      *Client\n\t\tfakeImageFetcher             *ifakes.FakeImageFetcher\n\t\tfakeLifecycle                *ifakes.FakeLifecycle\n\t\tdefaultBuilderStackID        = \"some.stack.id\"\n\t\tdefaultWindowsBuilderStackID = \"some.windows.stack.id\"\n\t\tbuilderImageWithSystem       *fakes.Image\n\t\tdefaultBuilderImage          
*fakes.Image\n\t\tdefaultWindowsBuilderImage   *fakes.Image\n\t\tbuilderImageWithSystemName   = \"example.com/default/builder-with-system:tag\"\n\t\tdefaultBuilderName           = \"example.com/default/builder:tag\"\n\t\tdefaultWindowsBuilderName    = \"example.com/windows-default/builder:tag\"\n\t\tdefaultRunImageName          = \"default/run\"\n\t\tdefaultWindowsRunImageName   = \"default/win-run\"\n\t\tfakeDefaultRunImage          *fakes.Image\n\t\tfakeDefaultWindowsRunImage   *fakes.Image\n\t\tfakeMirror1                  *fakes.Image\n\t\tfakeMirror2                  *fakes.Image\n\t\ttmpDir                       string\n\t\toutBuf                       bytes.Buffer\n\t\tlogger                       *logging.LogWithWriters\n\t\tfakeLifecycleImage           *fakes.Image\n\n\t\twithExtensionsLabel bool\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\tfakeImageFetcher = ifakes.NewFakeImageFetcher()\n\t\tfakeLifecycle = &ifakes.FakeLifecycle{}\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"build-test\")\n\t\th.AssertNil(t, err)\n\n\t\tdefaultBuilderImage = newFakeBuilderImage(t, tmpDir, defaultBuilderName, defaultBuilderStackID, defaultRunImageName, builder.DefaultLifecycleVersion, newLinuxImage, false)\n\t\th.AssertNil(t, defaultBuilderImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"build:mixinB\", \"mixinX\", \"build:mixinY\"]`))\n\t\tfakeImageFetcher.LocalImages[defaultBuilderImage.Name()] = defaultBuilderImage\n\t\tif withExtensionsLabel {\n\t\t\th.AssertNil(t, defaultBuilderImage.SetLabel(\"io.buildpacks.buildpack.order-extensions\", `[{\"group\":[{\"id\":\"some-extension-id\",\"version\":\"some-extension-version\"}]}]`))\n\t\t}\n\n\t\tbuilderImageWithSystem = newFakeBuilderImage(t, tmpDir, builderImageWithSystemName, defaultBuilderStackID, defaultRunImageName, builder.DefaultLifecycleVersion, newLinuxImage, true)\n\t\th.AssertNil(t, builderImageWithSystem.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"build:mixinB\", \"mixinX\", 
\"build:mixinY\"]`))\n\t\tfakeImageFetcher.LocalImages[builderImageWithSystem.Name()] = builderImageWithSystem\n\n\t\tdefaultWindowsBuilderImage = newFakeBuilderImage(t, tmpDir, defaultWindowsBuilderName, defaultWindowsBuilderStackID, defaultWindowsRunImageName, builder.DefaultLifecycleVersion, newWindowsImage, false)\n\t\th.AssertNil(t, defaultWindowsBuilderImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"build:mixinB\", \"mixinX\", \"build:mixinY\"]`))\n\t\tfakeImageFetcher.LocalImages[defaultWindowsBuilderImage.Name()] = defaultWindowsBuilderImage\n\t\tif withExtensionsLabel {\n\t\t\th.AssertNil(t, defaultWindowsBuilderImage.SetLabel(\"io.buildpacks.buildpack.order-extensions\", `[{\"group\":[{\"id\":\"some-extension-id\",\"version\":\"some-extension-version\"}]}]`))\n\t\t}\n\n\t\tfakeDefaultWindowsRunImage = newWindowsImage(\"default/win-run\", \"\", nil)\n\t\th.AssertNil(t, fakeDefaultWindowsRunImage.SetLabel(\"io.buildpacks.stack.id\", defaultWindowsBuilderStackID))\n\t\th.AssertNil(t, fakeDefaultWindowsRunImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"run:mixinC\", \"mixinX\", \"run:mixinZ\"]`))\n\t\tfakeImageFetcher.LocalImages[fakeDefaultWindowsRunImage.Name()] = fakeDefaultWindowsRunImage\n\n\t\tfakeDefaultRunImage = newLinuxImage(\"default/run\", \"\", nil)\n\t\th.AssertNil(t, fakeDefaultRunImage.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\th.AssertNil(t, fakeDefaultRunImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"run:mixinC\", \"mixinX\", \"run:mixinZ\"]`))\n\t\tfakeImageFetcher.LocalImages[fakeDefaultRunImage.Name()] = fakeDefaultRunImage\n\n\t\tfakeMirror1 = newLinuxImage(\"registry1.example.com/run/mirror\", \"\", nil)\n\t\th.AssertNil(t, fakeMirror1.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\th.AssertNil(t, fakeMirror1.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"mixinX\", \"run:mixinZ\"]`))\n\t\tfakeImageFetcher.LocalImages[fakeMirror1.Name()] = 
fakeMirror1\n\n\t\tfakeMirror2 = newLinuxImage(\"registry2.example.com/run/mirror\", \"\", nil)\n\t\th.AssertNil(t, fakeMirror2.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\th.AssertNil(t, fakeMirror2.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"mixinX\", \"run:mixinZ\"]`))\n\t\tfakeImageFetcher.LocalImages[fakeMirror2.Name()] = fakeMirror2\n\n\t\tfakeLifecycleImage = newLinuxImage(fmt.Sprintf(\"%s:%s\", cfg.DefaultLifecycleImageRepo, builder.DefaultLifecycleVersion), \"\", nil)\n\t\tfakeImageFetcher.LocalImages[fakeLifecycleImage.Name()] = fakeLifecycleImage\n\n\t\tdocker, err := dockerclient.New(dockerclient.FromEnv)\n\t\th.AssertNil(t, err)\n\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\tdlCacheDir, err := os.MkdirTemp(tmpDir, \"dl-cache\")\n\t\th.AssertNil(t, err)\n\n\t\tblobDownloader := blob.NewDownloader(logger, dlCacheDir)\n\t\tbuildpackDownloader := buildpack.NewDownloader(logger, fakeImageFetcher, blobDownloader, &registryResolver{logger: logger})\n\t\tsubject = &Client{\n\t\t\tlogger:              logger,\n\t\t\timageFetcher:        fakeImageFetcher,\n\t\t\tdownloader:          blobDownloader,\n\t\t\tlifecycleExecutor:   fakeLifecycle,\n\t\t\tdocker:              docker,\n\t\t\tbuildpackDownloader: buildpackDownloader,\n\t\t}\n\t})\n\n\tit.After(func() {\n\t\th.AssertNilE(t, defaultBuilderImage.Cleanup())\n\t\th.AssertNilE(t, builderImageWithSystem.Cleanup())\n\t\th.AssertNilE(t, fakeDefaultRunImage.Cleanup())\n\t\th.AssertNilE(t, fakeMirror1.Cleanup())\n\t\th.AssertNilE(t, fakeMirror2.Cleanup())\n\t\tos.RemoveAll(tmpDir)\n\t\th.AssertNilE(t, fakeLifecycleImage.Cleanup())\n\t})\n\n\twhen(\"#Build\", func() {\n\t\twhen(\"ephemeral builder is not needed\", func() {\n\t\t\tit(\"does not create one\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\tImage:   \"example.com/some/repo:tag\",\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, 
fakeLifecycle.Opts.Builder.Name(), defaultBuilderName)\n\t\t\t\tbldr := fakeLifecycle.Opts.Builder.(*builder.Builder)\n\t\t\t\th.AssertNotNil(t, bldr.Save(logger, builder.CreatorMetadata{})) // it shouldn't be possible to save this builder, as that would overwrite the original builder\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Workspace option\", func() {\n\t\t\tit(\"uses the specified dir\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tWorkspace: \"app\",\n\t\t\t\t\tBuilder:   defaultBuilderName,\n\t\t\t\t\tImage:     \"example.com/some/repo:tag\",\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Workspace, \"app\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Image option\", func() {\n\t\t\tit(\"is required\", func() {\n\t\t\t\th.AssertError(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t}),\n\t\t\t\t\t\"invalid image name ''\",\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"must be a valid image reference\", func() {\n\t\t\t\th.AssertError(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"not@valid\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t}),\n\t\t\t\t\t\"invalid image name 'not@valid'\",\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"must be a valid tag reference\", func() {\n\t\t\t\th.AssertError(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"registry.com/my/image@sha256:954e1f01e80ce09d0887ff6ea10b13a812cb01932a0781d6b0cc23f743a874fd\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t}),\n\t\t\t\t\t\"invalid image name 'registry.com/my/image@sha256:954e1f01e80ce09d0887ff6ea10b13a812cb01932a0781d6b0cc23f743a874fd'\",\n\t\t\t\t)\n\t\t\t})\n\n\t\t\tit(\"lifecycle receives resolved reference\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\tImage:   \"example.com/some/repo:tag\",\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, 
fakeLifecycle.Opts.Image.Context().RegistryStr(), \"example.com\")\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Image.Context().RepositoryStr(), \"some/repo\")\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Image.Identifier(), \"tag\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Quiet mode\", func() {\n\t\t\tvar builtImage *fakes.Image\n\n\t\t\tit.After(func() {\n\t\t\t\tlogger.WantQuiet(false)\n\t\t\t})\n\n\t\t\twhen(\"publish\", func() {\n\t\t\t\tvar remoteRunImage, builderWithoutLifecycleImageOrCreator *fakes.Image\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tremoteRunImage = fakes.NewImage(\"default/run\", \"\", nil)\n\t\t\t\t\th.AssertNil(t, remoteRunImage.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\t\t\t\th.AssertNil(t, remoteRunImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"mixinX\", \"run:mixinZ\"]`))\n\t\t\t\t\tfakeImageFetcher.RemoteImages[remoteRunImage.Name()] = remoteRunImage\n\n\t\t\t\t\tbuilderWithoutLifecycleImageOrCreator = newFakeBuilderImage(\n\t\t\t\t\t\tt,\n\t\t\t\t\t\ttmpDir,\n\t\t\t\t\t\t\"example.com/supportscreator/builder:tag\",\n\t\t\t\t\t\t\"some.stack.id\",\n\t\t\t\t\t\tdefaultRunImageName,\n\t\t\t\t\t\t\"0.3.0\",\n\t\t\t\t\t\tnewLinuxImage,\n\t\t\t\t\t\tfalse,\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, builderWithoutLifecycleImageOrCreator.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"build:mixinB\", \"mixinX\", \"build:mixinY\"]`))\n\t\t\t\t\tfakeImageFetcher.LocalImages[builderWithoutLifecycleImageOrCreator.Name()] = builderWithoutLifecycleImageOrCreator\n\n\t\t\t\t\tdigest, err := name.NewDigest(\"example.io/some/app@sha256:363c754893f0efe22480b4359a5956cf3bd3ce22742fc576973c61348308c2e4\", name.WeakValidation)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tbuiltImage = fakes.NewImage(\"example.io/some/app:latest\", \"\", remote.DigestIdentifier{Digest: digest})\n\t\t\t\t\tfakeImageFetcher.RemoteImages[builtImage.Name()] = builtImage\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNilE(t, 
remoteRunImage.Cleanup())\n\t\t\t\t\th.AssertNilE(t, builderWithoutLifecycleImageOrCreator.Cleanup())\n\t\t\t\t\th.AssertNilE(t, builtImage.Cleanup())\n\t\t\t\t})\n\n\t\t\t\tit(\"only prints app name and sha\", func() {\n\t\t\t\t\tlogger.WantQuiet(true)\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"example.io/some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tAppPath: filepath.Join(\"testdata\", \"some-app\"),\n\t\t\t\t\t\tPublish: true,\n\t\t\t\t\t}))\n\n\t\t\t\t\th.AssertEq(t, strings.TrimSpace(outBuf.String()), \"example.io/some/app@sha256:363c754893f0efe22480b4359a5956cf3bd3ce22742fc576973c61348308c2e4\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"local\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tbuiltImage = fakes.NewImage(\"index.docker.io/some/app:latest\", \"\", local.IDIdentifier{\n\t\t\t\t\t\tImageID: \"363c754893f0efe22480b4359a5956cf3bd3ce22742fc576973c61348308c2e4\",\n\t\t\t\t\t})\n\t\t\t\t\tfakeImageFetcher.LocalImages[builtImage.Name()] = builtImage\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNilE(t, builtImage.Cleanup())\n\t\t\t\t})\n\n\t\t\t\tit(\"only prints app name and sha\", func() {\n\t\t\t\t\tlogger.WantQuiet(true)\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tAppPath: filepath.Join(\"testdata\", \"some-app\"),\n\t\t\t\t\t}))\n\n\t\t\t\t\th.AssertEq(t, strings.TrimSpace(outBuf.String()), \"some/app@sha256:363c754893f0efe22480b4359a5956cf3bd3ce22742fc576973c61348308c2e4\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"AppDir option\", func() {\n\t\t\tit(\"defaults to the current working directory\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t}))\n\n\t\t\t\twd, err := os.Getwd()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tresolvedWd, err := 
filepath.EvalSymlinks(wd)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.AppPath, resolvedWd)\n\t\t\t})\n\t\t\tfor fileDesc, appPath := range map[string]string{\n\t\t\t\t\"zip\": filepath.Join(\"testdata\", \"zip-file.zip\"),\n\t\t\t\t\"jar\": filepath.Join(\"testdata\", \"jar-file.jar\"),\n\t\t\t} {\n\t\t\t\tfileDesc := fileDesc\n\t\t\t\tappPath := appPath\n\n\t\t\t\tit(fmt.Sprintf(\"supports %s files\", fileDesc), func() {\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tAppPath: appPath,\n\t\t\t\t\t})\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t}\n\n\t\t\tfor fileDesc, testData := range map[string][]string{\n\t\t\t\t\"non-existent\": {\"not/exist/path\", \"does not exist\"},\n\t\t\t\t\"empty\":        {filepath.Join(\"testdata\", \"empty-file\"), \"app path must be a directory or zip\"},\n\t\t\t\t\"non-zip\":      {filepath.Join(\"testdata\", \"non-zip-file\"), \"app path must be a directory or zip\"},\n\t\t\t} {\n\t\t\t\tfileDesc := fileDesc\n\t\t\t\tappPath := testData[0]\n\t\t\t\terrMessage := testData[1] // second element is the expected error message\n\n\t\t\t\tit(fmt.Sprintf(\"does NOT support %s files\", fileDesc), func() {\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tAppPath: appPath,\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertError(t, err, errMessage)\n\t\t\t\t})\n\t\t\t}\n\n\t\t\tit(\"resolves the absolute path\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\tAppPath: filepath.Join(\"testdata\", \"some-app\"),\n\t\t\t\t}))\n\t\t\t\tabsPath, err := filepath.Abs(filepath.Join(\"testdata\", \"some-app\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.AppPath, absPath)\n\t\t\t})\n\n\t\t\twhen(\"appDir is a symlink\", func() 
{\n\t\t\t\tvar (\n\t\t\t\t\tappDirName     = \"some-app\"\n\t\t\t\t\tabsoluteAppDir string\n\t\t\t\t\ttmpDir         string\n\t\t\t\t\terr            error\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"build-symlink-test\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tappDirPath := filepath.Join(tmpDir, appDirName)\n\t\t\t\t\th.AssertNil(t, os.MkdirAll(filepath.Join(tmpDir, appDirName), 0666))\n\n\t\t\t\t\tabsoluteAppDir, err = filepath.Abs(appDirPath)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tabsoluteAppDir, err = filepath.EvalSymlinks(appDirPath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\tos.RemoveAll(tmpDir)\n\t\t\t\t})\n\n\t\t\t\tit(\"resolves relative symbolic links\", func() {\n\t\t\t\t\trelLink := filepath.Join(tmpDir, \"some-app.link\")\n\t\t\t\t\th.AssertNil(t, os.Symlink(filepath.Join(\".\", appDirName), relLink))\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tAppPath: relLink,\n\t\t\t\t\t}))\n\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.AppPath, absoluteAppDir)\n\t\t\t\t})\n\n\t\t\t\tit(\"resolves absolute symbolic links\", func() {\n\t\t\t\t\trelLink := filepath.Join(tmpDir, \"some-app.link\")\n\t\t\t\t\th.AssertNil(t, os.Symlink(absoluteAppDir, relLink))\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tAppPath: relLink,\n\t\t\t\t\t}))\n\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.AppPath, absoluteAppDir)\n\t\t\t\t})\n\n\t\t\t\tit(\"resolves symbolic links recursively\", func() {\n\t\t\t\t\tlinkRef1 := absoluteAppDir\n\t\t\t\t\tabsoluteLink1 := filepath.Join(tmpDir, \"some-app-abs-1.link\")\n\n\t\t\t\t\tlinkRef2 := \"some-app-abs-1.link\"\n\t\t\t\t\tsymbolicLink := filepath.Join(tmpDir, \"some-app-rel-2.link\")\n\n\t\t\t\t\th.AssertNil(t, 
os.Symlink(linkRef1, absoluteLink1))\n\t\t\t\t\th.AssertNil(t, os.Symlink(linkRef2, symbolicLink))\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tAppPath: symbolicLink,\n\t\t\t\t\t}))\n\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.AppPath, absoluteAppDir)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Builder option\", func() {\n\t\t\tit(\"builder is required\", func() {\n\t\t\t\th.AssertError(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage: \"some/app\",\n\t\t\t\t}),\n\t\t\t\t\t\"invalid builder ''\",\n\t\t\t\t)\n\t\t\t})\n\n\t\t\twhen(\"the builder name is provided\", func() {\n\t\t\t\tvar (\n\t\t\t\t\tcustomBuilderImage *fakes.Image\n\t\t\t\t\tfakeRunImage       *fakes.Image\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tcustomBuilderImage = ifakes.NewFakeBuilderImage(t,\n\t\t\t\t\t\ttmpDir,\n\t\t\t\t\t\tdefaultBuilderName,\n\t\t\t\t\t\t\"some.stack.id\",\n\t\t\t\t\t\t\"1234\",\n\t\t\t\t\t\t\"5678\",\n\t\t\t\t\t\tbuilder.Metadata{\n\t\t\t\t\t\t\tStack: builder.StackMetadata{\n\t\t\t\t\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\t\t\t\t\tImage: \"some/run\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tLifecycle: builder.LifecycleMetadata{\n\t\t\t\t\t\t\t\tLifecycleInfo: builder.LifecycleInfo{\n\t\t\t\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\t\t\t\tVersion: *semver.MustParse(builder.DefaultLifecycleVersion),\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\t\t\t\tSupported: builder.APISet{api.MustParse(\"0.2\"), api.MustParse(\"0.3\"), api.MustParse(\"0.4\")},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\t\t\t\tSupported: builder.APISet{api.MustParse(\"0.3\"), 
api.MustParse(\"0.4\")},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tnil,\n\t\t\t\t\t\tnil,\n\t\t\t\t\t\tnil,\n\t\t\t\t\t\tnil,\n\t\t\t\t\t\tdist.System{},\n\t\t\t\t\t\tnewLinuxImage,\n\t\t\t\t\t)\n\n\t\t\t\t\tfakeImageFetcher.LocalImages[customBuilderImage.Name()] = customBuilderImage\n\n\t\t\t\t\tfakeRunImage = fakes.NewImage(\"some/run\", \"\", nil)\n\t\t\t\t\th.AssertNil(t, fakeRunImage.SetLabel(\"io.buildpacks.stack.id\", \"some.stack.id\"))\n\t\t\t\t\tfakeImageFetcher.LocalImages[fakeRunImage.Name()] = fakeRunImage\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNilE(t, customBuilderImage.Cleanup())\n\t\t\t\t\th.AssertNilE(t, fakeRunImage.Cleanup())\n\t\t\t\t})\n\n\t\t\t\tit(\"it uses the provided builder\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), customBuilderImage.Name())\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"RunImage option\", func() {\n\t\t\tvar (\n\t\t\t\tfakeRunImage *fakes.Image\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\tfakeRunImage = fakes.NewImage(\"custom/run\", \"\", nil)\n\t\t\t\th.AssertNil(t, fakeRunImage.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\t\t\th.AssertNil(t, fakeRunImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"mixinX\", \"run:mixinZ\"]`))\n\t\t\t\tfakeImageFetcher.LocalImages[fakeRunImage.Name()] = fakeRunImage\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNilE(t, fakeRunImage.Cleanup())\n\t\t\t})\n\n\t\t\twhen(\"run image stack matches the builder stack\", func() {\n\t\t\t\tit(\"uses the provided image\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:    \"some/app\",\n\t\t\t\t\t\tBuilder:  defaultBuilderName,\n\t\t\t\t\t\tRunImage: 
\"custom/run\",\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.RunImage, \"custom/run\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"run image stack does not match the builder stack\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, fakeRunImage.SetLabel(\"io.buildpacks.stack.id\", \"other.stack\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"warning\", func() {\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:    \"some/app\",\n\t\t\t\t\t\tBuilder:  defaultBuilderName,\n\t\t\t\t\t\tRunImage: \"custom/run\",\n\t\t\t\t\t})\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Warning: deprecated usage of stack\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"run image is not supplied\", func() {\n\t\t\t\twhen(\"there are no locally configured mirrors\", func() {\n\t\t\t\t\twhen(\"Publish is true\", func() {\n\t\t\t\t\t\tit(\"chooses the run image mirror matching the local image\", func() {\n\t\t\t\t\t\t\tfakeImageFetcher.RemoteImages[fakeDefaultRunImage.Name()] = fakeDefaultRunImage\n\n\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\t\tPublish: true,\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.RunImage, \"default/run\")\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tfor _, registry := range []string{\"registry1.example.com\", \"registry2.example.com\"} {\n\t\t\t\t\t\t\ttestRegistry := registry\n\t\t\t\t\t\t\tit(\"chooses the run image mirror matching the built image\", func() {\n\t\t\t\t\t\t\t\trunImg := testRegistry + \"/run/mirror\"\n\t\t\t\t\t\t\t\tfakeImageFetcher.RemoteImages[runImg] = fakeDefaultRunImage\n\t\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\tImage:   testRegistry + \"/some/app\",\n\t\t\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\t\t\tPublish: true,\n\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\th.AssertEq(t, 
fakeLifecycle.Opts.RunImage, runImg)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"Publish is false\", func() {\n\t\t\t\t\t\tfor _, img := range []string{\"some/app\",\n\t\t\t\t\t\t\t\"registry1.example.com/some/app\",\n\t\t\t\t\t\t\t\"registry2.example.com/some/app\"} {\n\t\t\t\t\t\t\ttestImg := img\n\t\t\t\t\t\t\tit(\"chooses a mirror on the builder registry\", func() {\n\t\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\tImage:   testImg,\n\t\t\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.RunImage, \"default/run\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"there are locally configured mirrors\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\tfakeLocalMirror  *fakes.Image\n\t\t\t\t\t\tfakeLocalMirror1 *fakes.Image\n\t\t\t\t\t\tmirrors          = map[string][]string{\n\t\t\t\t\t\t\t\"default/run\": {\"local/mirror\", \"registry1.example.com/local/mirror\"},\n\t\t\t\t\t\t}\n\t\t\t\t\t)\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tfakeLocalMirror = fakes.NewImage(\"local/mirror\", \"\", nil)\n\t\t\t\t\t\th.AssertNil(t, fakeLocalMirror.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\t\t\t\t\th.AssertNil(t, fakeLocalMirror.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"mixinX\", \"run:mixinZ\"]`))\n\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[fakeLocalMirror.Name()] = fakeLocalMirror\n\n\t\t\t\t\t\tfakeLocalMirror1 = fakes.NewImage(\"registry1.example.com/local/mirror\", \"\", nil)\n\t\t\t\t\t\th.AssertNil(t, fakeLocalMirror1.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\t\t\t\t\th.AssertNil(t, fakeLocalMirror1.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"mixinX\", \"run:mixinZ\"]`))\n\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[fakeLocalMirror1.Name()] = fakeLocalMirror1\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.AssertNilE(t, 
fakeLocalMirror.Cleanup())\n\t\t\t\t\t\th.AssertNilE(t, fakeLocalMirror1.Cleanup())\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"Publish is true\", func() {\n\t\t\t\t\t\tfor _, registry := range []string{\"\", \"registry1.example.com\"} {\n\t\t\t\t\t\t\ttestRegistry := registry\n\t\t\t\t\t\t\tit(\"prefers user provided mirrors for registry \"+testRegistry, func() {\n\t\t\t\t\t\t\t\tif testRegistry != \"\" {\n\t\t\t\t\t\t\t\t\ttestRegistry += \"/\"\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\trunImg := testRegistry + \"local/mirror\"\n\t\t\t\t\t\t\t\tfakeImageFetcher.RemoteImages[runImg] = fakeDefaultRunImage\n\n\t\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\tImage:             testRegistry + \"some/app\",\n\t\t\t\t\t\t\t\t\tBuilder:           defaultBuilderName,\n\t\t\t\t\t\t\t\t\tAdditionalMirrors: mirrors,\n\t\t\t\t\t\t\t\t\tPublish:           true,\n\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.RunImage, runImg)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"Publish is false\", func() {\n\t\t\t\t\t\tfor _, registry := range []string{\"\", \"registry1.example.com\", \"registry2.example.com\"} {\n\t\t\t\t\t\t\ttestRegistry := registry\n\t\t\t\t\t\t\tit(\"prefers user provided mirrors\", func() {\n\t\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\tImage:             testRegistry + \"some/app\",\n\t\t\t\t\t\t\t\t\tBuilder:           defaultBuilderName,\n\t\t\t\t\t\t\t\t\tAdditionalMirrors: mirrors,\n\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.RunImage, \"local/mirror\")\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"ClearCache option\", func() {\n\t\t\tit(\"passes it through to lifecycle\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\tClearCache: 
true,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.ClearCache, true)\n\t\t\t})\n\n\t\t\tit(\"defaults to false\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.ClearCache, false)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"ImageCache option\", func() {\n\t\t\tit(\"passes it through to lifecycle\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\tCacheImage: \"some-cache-image\",\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.CacheImage, \"some-cache-image\")\n\t\t\t})\n\n\t\t\tit(\"defaults to empty\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.CacheImage, \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Buildpacks option\", func() {\n\t\t\tassertOrderEquals := func(content string) {\n\t\t\t\tt.Helper()\n\n\t\t\t\torderLayer, err := defaultBuilderImage.FindLayerWithPath(\"/cnb/order.toml\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, orderLayer, \"/cnb/order.toml\", h.ContentEquals(content))\n\t\t\t}\n\n\t\t\tit(\"builder order is overwritten\", func() {\n\t\t\t\tadditionalBP := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"buildpack.add.1.id\",\n\t\t\t\t\t\tVersion: \"buildpack.add.1.version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithStacks: []dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t})\n\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\tClearCache: 
true,\n\t\t\t\t\tBuildpacks: []string{additionalBP},\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\n\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.add.1.id\"\n    version = \"buildpack.add.1.version\"\n`)\n\t\t\t})\n\n\t\t\twhen(\"id - no version is provided\", func() {\n\t\t\t\tit(\"resolves version\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{\"buildpack.1.id\"},\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\n\t\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.1.id\"\n    version = \"buildpack.1.version\"\n`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"from=builder:id@version\", func() {\n\t\t\t\tit(\"builder order is prepended\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\"from=builder:buildpack.1.id@buildpack.1.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\n\t\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.1.id\"\n    version = \"buildpack.1.version\"\n`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"from=builder is set first\", func() {\n\t\t\t\tit(\"builder order is prepended\", func() {\n\t\t\t\t\tadditionalBP1 := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack.add.1.id\",\n\t\t\t\t\t\t\tVersion: \"buildpack.add.1.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tWithStacks: 
[]dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t})\n\n\t\t\t\t\tadditionalBP2 := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack.add.2.id\",\n\t\t\t\t\t\t\tVersion: \"buildpack.add.2.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\"from=builder\",\n\t\t\t\t\t\t\tadditionalBP1,\n\t\t\t\t\t\t\tadditionalBP2,\n\t\t\t\t\t\t},\n\t\t\t\t\t}))\n\n\t\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.1.id\"\n    version = \"buildpack.1.version\"\n\n  [[order.group]]\n    id = \"buildpack.add.1.id\"\n    version = \"buildpack.add.1.version\"\n\n  [[order.group]]\n    id = \"buildpack.add.2.id\"\n    version = \"buildpack.add.2.version\"\n\n[[order]]\n\n  [[order.group]]\n    id = \"buildpack.2.id\"\n    version = \"buildpack.2.version\"\n\n  [[order.group]]\n    id = \"buildpack.add.1.id\"\n    version = \"buildpack.add.1.version\"\n\n  [[order.group]]\n    id = \"buildpack.add.2.id\"\n    version = \"buildpack.add.2.version\"\n`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"from=builder is set in middle\", func() {\n\t\t\t\tit(\"builder order is appended\", func() {\n\t\t\t\t\tadditionalBP1 := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack.add.1.id\",\n\t\t\t\t\t\t\tVersion: \"buildpack.add.1.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\tWithOrder:  
nil,\n\t\t\t\t\t})\n\n\t\t\t\t\tadditionalBP2 := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack.add.2.id\",\n\t\t\t\t\t\t\tVersion: \"buildpack.add.2.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\tadditionalBP1,\n\t\t\t\t\t\t\t\"from=builder\",\n\t\t\t\t\t\t\tadditionalBP2,\n\t\t\t\t\t\t},\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\n\t\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.add.1.id\"\n    version = \"buildpack.add.1.version\"\n\n  [[order.group]]\n    id = \"buildpack.1.id\"\n    version = \"buildpack.1.version\"\n\n  [[order.group]]\n    id = \"buildpack.add.2.id\"\n    version = \"buildpack.add.2.version\"\n\n[[order]]\n\n  [[order.group]]\n    id = \"buildpack.add.1.id\"\n    version = \"buildpack.add.1.version\"\n\n  [[order.group]]\n    id = \"buildpack.2.id\"\n    version = \"buildpack.2.version\"\n\n  [[order.group]]\n    id = \"buildpack.add.2.id\"\n    version = \"buildpack.add.2.version\"\n`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"from=builder is set last\", func() {\n\t\t\t\tit(\"builder order is appended\", func() {\n\t\t\t\t\tadditionalBP1 := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack.add.1.id\",\n\t\t\t\t\t\t\tVersion: \"buildpack.add.1.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\tWithOrder:  
nil,\n\t\t\t\t\t})\n\n\t\t\t\t\tadditionalBP2 := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"buildpack.add.2.id\",\n\t\t\t\t\t\t\tVersion: \"buildpack.add.2.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\tadditionalBP1,\n\t\t\t\t\t\t\tadditionalBP2,\n\t\t\t\t\t\t\t\"from=builder\",\n\t\t\t\t\t\t},\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\n\t\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.add.1.id\"\n    version = \"buildpack.add.1.version\"\n\n  [[order.group]]\n    id = \"buildpack.add.2.id\"\n    version = \"buildpack.add.2.version\"\n\n  [[order.group]]\n    id = \"buildpack.1.id\"\n    version = \"buildpack.1.version\"\n\n[[order]]\n\n  [[order.group]]\n    id = \"buildpack.add.1.id\"\n    version = \"buildpack.add.1.version\"\n\n  [[order.group]]\n    id = \"buildpack.add.2.id\"\n    version = \"buildpack.add.2.version\"\n\n  [[order.group]]\n    id = \"buildpack.2.id\"\n    version = \"buildpack.2.version\"\n`)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"meta-buildpack is used\", func() {\n\t\t\t\tit(\"resolves buildpack from builder\", func() {\n\t\t\t\t\tbuildpackTar := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"metabuildpack.id\",\n\t\t\t\t\t\t\tVersion: \"metabuildpack.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tWithStacks: nil,\n\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\tGroup: 
[]dist.ModuleRef{{\n\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\tID:      \"buildpack.1.id\",\n\t\t\t\t\t\t\t\t\tVersion: \"buildpack.1.version\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t}, {\n\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\tID:      \"buildpack.2.id\",\n\t\t\t\t\t\t\t\t\tVersion: \"buildpack.2.version\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t}},\n\t\t\t\t\t})\n\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{buildpackTar},\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"meta-buildpack folder is used\", func() {\n\t\t\t\tit(\"resolves buildpack\", func() {\n\t\t\t\t\tmetaBuildpackFolder := filepath.Join(tmpDir, \"meta-buildpack\")\n\t\t\t\t\terr := os.Mkdir(metaBuildpackFolder, os.ModePerm)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = os.WriteFile(filepath.Join(metaBuildpackFolder, \"buildpack.toml\"), []byte(`\napi = \"0.2\"\n\n[buildpack]\n  id = \"local/meta-bp\"\n  version = \"local-meta-bp-version\"\n  name = \"Local Meta-Buildpack\"\n\n[[order]]\n[[order.group]]\nid = \"local/meta-bp-dep\"\nversion = \"local-meta-bp-version\"\n\t\t\t\t\t`), 0644)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = os.WriteFile(filepath.Join(metaBuildpackFolder, \"package.toml\"), []byte(`\n[buildpack]\nuri = \".\"\n\n[[dependencies]]\nuri = \"../meta-buildpack-dependency\"\n\t\t\t\t\t`), 0644)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tmetaBuildpackDependencyFolder := filepath.Join(tmpDir, \"meta-buildpack-dependency\")\n\t\t\t\t\terr = os.Mkdir(metaBuildpackDependencyFolder, os.ModePerm)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = os.WriteFile(filepath.Join(metaBuildpackDependencyFolder, \"buildpack.toml\"), []byte(`\napi = 
\"0.2\"\n\n[buildpack]\n  id = \"local/meta-bp-dep\"\n  version = \"local-meta-bp-version\"\n  name = \"Local Meta-Buildpack Dependency\"\n\n[[stacks]]\n  id = \"*\"\n\t\t\t\t\t`), 0644)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{metaBuildpackFolder},\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\n\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tbuildpack1Info := dist.ModuleInfo{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"}\n\t\t\t\t\tbuildpack2Info := dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}\n\t\t\t\t\tmetaBuildpackInfo := dist.ModuleInfo{ID: \"local/meta-bp\", Version: \"local-meta-bp-version\", Name: \"Local Meta-Buildpack\"}\n\t\t\t\t\tmetaBuildpackDependencyInfo := dist.ModuleInfo{ID: \"local/meta-bp-dep\", Version: \"local-meta-bp-version\", Name: \"Local Meta-Buildpack Dependency\"}\n\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\tbuildpack1Info,\n\t\t\t\t\t\tbuildpack2Info,\n\t\t\t\t\t\tmetaBuildpackInfo,\n\t\t\t\t\t\tmetaBuildpackDependencyInfo,\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\tit(\"fails if buildpack dependency could not be fetched\", func() {\n\t\t\t\t\tmetaBuildpackFolder := filepath.Join(tmpDir, \"meta-buildpack\")\n\t\t\t\t\terr := os.Mkdir(metaBuildpackFolder, os.ModePerm)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = os.WriteFile(filepath.Join(metaBuildpackFolder, \"buildpack.toml\"), []byte(`\napi = \"0.2\"\n\n[buildpack]\n  id = \"local/meta-bp\"\n  version = \"local-meta-bp-version\"\n  name = \"Local Meta-Buildpack\"\n\n[[order]]\n[[order.group]]\nid = \"local/meta-bp-dep\"\nversion = \"local-meta-bp-version\"\n\t\t\t\t\t`), 
0644)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = os.WriteFile(filepath.Join(metaBuildpackFolder, \"package.toml\"), []byte(`\n[buildpack]\nuri = \".\"\n\n[[dependencies]]\nuri = \"../meta-buildpack-dependency\"\n\n[[dependencies]]\nuri = \"../not-a-valid-dependency\"\n\t\t\t\t\t`), 0644)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tmetaBuildpackDependencyFolder := filepath.Join(tmpDir, \"meta-buildpack-dependency\")\n\t\t\t\t\terr = os.Mkdir(metaBuildpackDependencyFolder, os.ModePerm)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = os.WriteFile(filepath.Join(metaBuildpackDependencyFolder, \"buildpack.toml\"), []byte(`\napi = \"0.2\"\n\n[buildpack]\n  id = \"local/meta-bp-dep\"\n  version = \"local-meta-bp-version\"\n  name = \"Local Meta-Buildpack Dependency\"\n\n[[stacks]]\n  id = \"*\"\n\t\t\t\t\t`), 0644)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\terr = subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{metaBuildpackFolder},\n\t\t\t\t\t})\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"fetching package.toml dependencies (path='%s')\", filepath.Join(metaBuildpackFolder, \"package.toml\")))\n\t\t\t\t\th.AssertError(t, err, \"fetching dependencies (uri='../not-a-valid-dependency',image='')\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"buildpackage image is used\", func() {\n\t\t\t\tvar fakePackage *fakes.Image\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tfakePackage = makeFakePackage(t, tmpDir, defaultBuilderStackID)\n\t\t\t\t\tfakeImageFetcher.LocalImages[fakePackage.Name()] = fakePackage\n\t\t\t\t})\n\n\t\t\t\tit(\"all buildpacks are added to ephemeral builder\", func() {\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: 
[]string{\n\t\t\t\t\t\t\t\"example.com/some/package\",\n\t\t\t\t\t\t},\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"meta.buildpack.id\", Version: \"meta.buildpack.version\"}},\n\t\t\t\t\t\t}},\n\t\t\t\t\t\t// Child buildpacks should not be added to order\n\t\t\t\t\t})\n\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tID:      \"buildpack.1.id\",\n\t\t\t\t\t\t\tVersion: \"buildpack.1.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tID:      \"buildpack.2.id\",\n\t\t\t\t\t\t\tVersion: \"buildpack.2.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tID:      \"meta.buildpack.id\",\n\t\t\t\t\t\t\tVersion: \"meta.buildpack.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tID:      \"child.buildpack.id\",\n\t\t\t\t\t\t\tVersion: \"child.buildpack.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t})\n\t\t\t\t\targs := fakeImageFetcher.FetchCalls[fakePackage.Name()]\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t})\n\n\t\t\t\tit(\"fails when no metadata label on package\", func() {\n\t\t\t\t\th.AssertNil(t, fakePackage.SetLabel(\"io.buildpacks.buildpackage.metadata\", \"\"))\n\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\"example.com/some/package\",\n\t\t\t\t\t\t},\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertError(t, err, \"extracting buildpacks from 'example.com/some/package': could not find label 'io.buildpacks.buildpackage.metadata'\")\n\t\t\t\t})\n\n\t\t\t\tit(\"fails when no bp layers label is 
on package\", func() {\n\t\t\t\t\th.AssertNil(t, fakePackage.SetLabel(\"io.buildpacks.buildpack.layers\", \"\"))\n\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\"example.com/some/package\",\n\t\t\t\t\t\t},\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertError(t, err, \"extracting buildpacks from 'example.com/some/package': could not find label 'io.buildpacks.buildpack.layers'\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\tit(\"ensures buildpacks exist on builder\", func() {\n\t\t\t\th.AssertError(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\tClearCache: true,\n\t\t\t\t\tBuildpacks: []string{\"missing.bp@version\"},\n\t\t\t\t}),\n\t\t\t\t\t\"downloading buildpack: error reading missing.bp@version: invalid locator: InvalidLocator\",\n\t\t\t\t)\n\t\t\t})\n\n\t\t\twhen(\"from project descriptor\", func() {\n\t\t\t\twhen(\"id - no version is provided\", func() {\n\t\t\t\t\tit(\"resolves version\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{Buildpacks: []projectTypes.Buildpack{{ID: \"buildpack.1.id\"}}},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}))\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\n\t\t\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.1.id\"\n    version = \"buildpack.1.version\"\n`)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"buildpacks include URIs\", func() {\n\t\t\t\tvar buildpackTgz string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tbuildpackTgz = h.CreateTGZ(t, 
filepath.Join(\"testdata\", \"buildpack2\"), \"./\", 0755)\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNilE(t, os.Remove(buildpackTgz))\n\t\t\t\t})\n\n\t\t\t\tit(\"buildpacks are added to ephemeral builder\", func() {\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\"buildpack.1.id@buildpack.1.version\",\n\t\t\t\t\t\t\t\"buildpack.2.id@buildpack.2.version\",\n\t\t\t\t\t\t\tfilepath.Join(\"testdata\", \"buildpack\"),\n\t\t\t\t\t\t\tbuildpackTgz,\n\t\t\t\t\t\t},\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tbuildpack1Info := dist.ModuleInfo{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"}\n\t\t\t\t\tbuildpack2Info := dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}\n\t\t\t\t\tdirBuildpackInfo := dist.ModuleInfo{ID: \"bp.one\", Version: \"1.2.3\", Homepage: \"http://one.buildpack\"}\n\t\t\t\t\ttgzBuildpackInfo := dist.ModuleInfo{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"}\n\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: buildpack1Info},\n\t\t\t\t\t\t\t{ModuleInfo: buildpack2Info},\n\t\t\t\t\t\t\t{ModuleInfo: dirBuildpackInfo},\n\t\t\t\t\t\t\t{ModuleInfo: tgzBuildpackInfo},\n\t\t\t\t\t\t}},\n\t\t\t\t\t})\n\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\tbuildpack1Info,\n\t\t\t\t\t\tbuildpack2Info,\n\t\t\t\t\t\tdirBuildpackInfo,\n\t\t\t\t\t\ttgzBuildpackInfo,\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"uri is an http url\", func() {\n\t\t\t\t\tvar server *ghttp.Server\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tserver = 
ghttp.NewServer()\n\t\t\t\t\t\tserver.AppendHandlers(func(w http.ResponseWriter, r *http.Request) {\n\t\t\t\t\t\t\thttp.ServeFile(w, r, buildpackTgz)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\tserver.Close()\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"adds the buildpack\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\t\"buildpack.1.id@buildpack.1.version\",\n\t\t\t\t\t\t\t\t\"buildpack.2.id@buildpack.2.version\",\n\t\t\t\t\t\t\t\tserver.URL(),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t\t{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"adds the buildpack from the project descriptor\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: 
true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\tURI: server.URL(),\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t\t{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"adds the pre buildpack from the project descriptor\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tPre: projectTypes.GroupAddition{\n\t\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\t\tURI: server.URL(),\n\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), 
dist.Order{\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t\t{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"adds the post buildpack from the project descriptor\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tPost: projectTypes.GroupAddition{\n\t\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\t\tURI: server.URL(),\n\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.1.id\", Version: 
\"buildpack.1.version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t\t{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"pre and post buildpacks\", func() {\n\t\t\t\t\tit(\"added from the project descriptor\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tPre: projectTypes.GroupAddition{\n\t\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tPost: projectTypes.GroupAddition{\n\t\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t\t{Group: 
[]dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"not added from the project descriptor\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\t\"buildpack.1.id@buildpack.1.version\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tPre: projectTypes.GroupAddition{\n\t\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{ID: \"some-other-buildpack-id\", Version: \"some-other-buildpack-version\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tPost: projectTypes.GroupAddition{\n\t\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{ID: \"yet-other-buildpack-id\", Version: \"yet-other-buildpack-version\"}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, 
fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"added buildpack's mixins are not satisfied\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.AssertNil(t, defaultBuilderImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinX\", \"build:mixinY\"]`))\n\t\t\t\t\t\th.AssertNil(t, fakeDefaultRunImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinX\", \"run:mixinZ\"]`))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\tbuildpackTgz, // requires mixinA, build:mixinB, run:mixinC\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}))\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"platform API < 0.12\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tsetAPIs(t, defaultBuilderImage, []string{\"0.8\"}, []string{\"0.11\"})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\t\tbuildpackTgz, // requires mixinA, build:mixinB, run:mixinC\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\th.AssertError(t, err, \"validating stack mixins: buildpack 
'some-other-buildpack-id@some-other-buildpack-version' requires missing mixin(s): build:mixinB, mixinA, run:mixinC\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"buildpack is inline\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\ttmpDir string\n\t\t\t\t\t)\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tvar err error\n\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"project-desc\")\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\terr := os.RemoveAll(tmpDir)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"all buildpacks are added to ephemeral builder\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\tID: \"my/inline\",\n\t\t\t\t\t\t\t\t\t\tScript: projectTypes.Script{\n\t\t\t\t\t\t\t\t\t\t\tAPI:    \"0.4\",\n\t\t\t\t\t\t\t\t\t\t\tInline: \"touch foo.txt\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tProjectDescriptorBaseDir: tmpDir,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"my/inline\", Version: \"0.0.0\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t\t{ID: \"my/inline\", 
Version: \"0.0.0\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"sets version if version is set\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\tID:      \"my/inline\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0-my-version\",\n\t\t\t\t\t\t\t\t\t\tScript: projectTypes.Script{\n\t\t\t\t\t\t\t\t\t\t\tAPI:    \"0.4\",\n\t\t\t\t\t\t\t\t\t\t\tInline: \"touch foo.txt\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tProjectDescriptorBaseDir: tmpDir,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"my/inline\", Version: \"1.0.0-my-version\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t\t{ID: \"my/inline\", Version: \"1.0.0-my-version\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"fails if there is no API\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tBuildpacks: 
[]projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\tID: \"my/inline\",\n\t\t\t\t\t\t\t\t\t\tScript: projectTypes.Script{\n\t\t\t\t\t\t\t\t\t\t\tInline: \"touch foo.txt\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tProjectDescriptorBaseDir: tmpDir,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertEq(t, \"Missing API version for inline buildpack\", err.Error())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"fails if there is no ID\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\tScript: projectTypes.Script{\n\t\t\t\t\t\t\t\t\t\t\tAPI:    \"0.4\",\n\t\t\t\t\t\t\t\t\t\t\tInline: \"touch foo.txt\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tProjectDescriptorBaseDir: tmpDir,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertEq(t, \"Invalid buildpack definition\", err.Error())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"ignores script if there is a URI\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tBuild: projectTypes.Build{\n\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\tID:      \"buildpack.1.id\",\n\t\t\t\t\t\t\t\t\t\tURI:     \"some-uri\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"buildpack.1.version\",\n\t\t\t\t\t\t\t\t\t\tScript: projectTypes.Script{\n\t\t\t\t\t\t\t\t\t\t\tInline: \"touch foo.txt\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tProjectDescriptorBaseDir: 
tmpDir,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertContains(t, err.Error(), \"extracting from registry 'some-uri'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"buildpack is from a registry\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\tfakePackage     *fakes.Image\n\t\t\t\t\t\ttmpDir          string\n\t\t\t\t\t\tregistryFixture string\n\t\t\t\t\t\tpackHome        string\n\n\t\t\t\t\t\tconfigPath string\n\t\t\t\t\t)\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tvar err error\n\t\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"registry\")\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tpackHome = filepath.Join(tmpDir, \".pack\")\n\t\t\t\t\t\terr = os.MkdirAll(packHome, 0755)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tos.Setenv(\"PACK_HOME\", packHome)\n\n\t\t\t\t\t\tregistryFixture = h.CreateRegistryFixture(t, tmpDir, filepath.Join(\"testdata\", \"registry\"))\n\n\t\t\t\t\t\tconfigPath = filepath.Join(packHome, \"config.toml\")\n\t\t\t\t\t\th.AssertNil(t, cfg.Write(cfg.Config{\n\t\t\t\t\t\t\tRegistries: []cfg.Registry{\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tName: \"some-registry\",\n\t\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\t\tURL:  registryFixture,\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}, configPath))\n\n\t\t\t\t\t\t_, err = rg.NewRegistryCache(logger, tmpDir, registryFixture)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tchildBuildpackTar := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"example/foo\",\n\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t{ID: defaultBuilderStackID},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tbpLayers := dist.ModuleLayers{\n\t\t\t\t\t\t\t\"example/foo\": {\n\t\t\t\t\t\t\t\t\"1.0.0\": {\n\t\t\t\t\t\t\t\t\tAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t{ID: 
defaultBuilderStackID},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tLayerDiffID: diffIDForFile(t, childBuildpackTar),\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tmd := buildpack.Metadata{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:      \"example/foo\",\n\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t{ID: defaultBuilderStackID},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tfakePackage = fakes.NewImage(\"example.com/some/package@sha256:8c27fe111c11b722081701dfed3bd55e039b9ce92865473cf4cdfa918071c566\", \"\", nil)\n\t\t\t\t\t\th.AssertNil(t, dist.SetLabel(fakePackage, \"io.buildpacks.buildpack.layers\", bpLayers))\n\t\t\t\t\t\th.AssertNil(t, dist.SetLabel(fakePackage, \"io.buildpacks.buildpackage.metadata\", md))\n\n\t\t\t\t\t\th.AssertNil(t, fakePackage.AddLayer(childBuildpackTar))\n\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[fakePackage.Name()] = fakePackage\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\tos.Unsetenv(\"PACK_HOME\")\n\t\t\t\t\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"all buildpacks are added to ephemeral builder\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\t\"urn:cnb:registry:example/foo@1.0.0\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tRegistry: \"some-registry\",\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\t\t\t\t\t\tbldr, err := builder.FromImage(defaultBuilderImage)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{\n\t\t\t\t\t\t\t{Group: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/foo\", Version: 
\"1.0.0\"}},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{\n\t\t\t\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t\t\t\t\t{ID: \"example/foo\", Version: \"1.0.0\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Extensions option\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsubject.experimental = true\n\t\t\t\tdefaultBuilderImage.SetLabel(\"io.buildpacks.buildpack.order-extensions\", `[{\"group\":[{\"id\":\"extension.1.id\",\"version\":\"extension.1.version\"}]}, {\"group\":[{\"id\":\"extension.2.id\",\"version\":\"extension.2.version\"}]}]`)\n\t\t\t\tdefaultWindowsBuilderImage.SetLabel(\"io.buildpacks.buildpack.order-extensions\", `[{\"group\":[{\"id\":\"extension.1.id\",\"version\":\"extension.1.version\"}]}, {\"group\":[{\"id\":\"extension.2.id\",\"version\":\"extension.2.version\"}]}]`)\n\t\t\t})\n\n\t\t\tassertOrderEquals := func(content string) {\n\t\t\t\tt.Helper()\n\n\t\t\t\torderLayer, err := defaultBuilderImage.FindLayerWithPath(\"/cnb/order.toml\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, orderLayer, \"/cnb/order.toml\", h.ContentEquals(content))\n\t\t\t}\n\n\t\t\tit(\"builder order-extensions is overwritten\", func() {\n\t\t\t\tadditionalEx := ifakes.CreateExtensionTar(t, tmpDir, dist.ExtensionDescriptor{\n\t\t\t\t\tWithAPI: api.MustParse(\"0.7\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"extension.add.1.id\",\n\t\t\t\t\t\tVersion: \"extension.add.1.version\",\n\t\t\t\t\t},\n\t\t\t\t})\n\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\tClearCache: true,\n\t\t\t\t\tExtensions: []string{additionalEx},\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), 
defaultBuilderImage.Name())\n\n\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.1.id\"\n    version = \"buildpack.1.version\"\n\n[[order]]\n\n  [[order.group]]\n    id = \"buildpack.2.id\"\n    version = \"buildpack.2.version\"\n\n[[order-extensions]]\n\n  [[order-extensions.group]]\n    id = \"extension.add.1.id\"\n    version = \"extension.add.1.version\"\n`)\n\t\t\t})\n\n\t\t\twhen(\"id - no version is provided\", func() {\n\t\t\t\tit(\"resolves version\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\tExtensions: []string{\"extension.1.id\"},\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), defaultBuilderImage.Name())\n\n\t\t\t\t\tassertOrderEquals(`[[order]]\n\n  [[order.group]]\n    id = \"buildpack.1.id\"\n    version = \"buildpack.1.version\"\n\n[[order]]\n\n  [[order.group]]\n    id = \"buildpack.2.id\"\n    version = \"buildpack.2.version\"\n\n[[order-extensions]]\n\n  [[order-extensions.group]]\n    id = \"extension.1.id\"\n    version = \"extension.1.version\"\n`)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\t//TODO: \"all buildpacks are added to ephemeral builder\" test after extractPackaged() is completed.\n\n\t\twhen(\"ProjectDescriptor\", func() {\n\t\t\twhen(\"project metadata\", func() {\n\t\t\t\twhen(\"not experimental\", func() {\n\t\t\t\t\tit(\"does not set project source\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tProject: projectTypes.Project{\n\t\t\t\t\t\t\t\t\tVersion:   \"1.2.3\",\n\t\t\t\t\t\t\t\t\tSourceURL: 
\"https://example.com\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNil(t, fakeLifecycle.Opts.ProjectMetadata.Source)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"is experimental\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsubject.experimental = true\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"missing information\", func() {\n\t\t\t\t\t\tit(\"does not set project source\", func() {\n\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:             \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:           defaultBuilderName,\n\t\t\t\t\t\t\t\tClearCache:        true,\n\t\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{},\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertNil(t, fakeLifecycle.Opts.ProjectMetadata.Source)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"sets project source\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tClearCache: true,\n\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{\n\t\t\t\t\t\t\t\tProject: projectTypes.Project{\n\t\t\t\t\t\t\t\t\tVersion:   \"1.2.3\",\n\t\t\t\t\t\t\t\t\tSourceURL: \"https://example.com\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNotNil(t, fakeLifecycle.Opts.ProjectMetadata.Source)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.ProjectMetadata.Source, &files.ProjectSource{\n\t\t\t\t\t\t\tType:     \"project\",\n\t\t\t\t\t\t\tVersion:  map[string]interface{}{\"declared\": \"1.2.3\"},\n\t\t\t\t\t\t\tMetadata: map[string]interface{}{\"url\": \"https://example.com\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Env option\", func() {\n\t\t\tit(\"should set the env on the ephemeral builder\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), 
BuildOptions{\n\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\tEnv: map[string]string{\n\t\t\t\t\t\t\"key1\": \"value1\",\n\t\t\t\t\t\t\"key2\": \"value2\",\n\t\t\t\t\t},\n\t\t\t\t}))\n\t\t\t\tlayerTar, err := defaultBuilderImage.FindLayerWithPath(\"/platform/env/key1\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertTarFileContents(t, layerTar, \"/platform/env/key1\", `value1`)\n\t\t\t\th.AssertTarFileContents(t, layerTar, \"/platform/env/key2\", `value2`)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Publish option\", func() {\n\t\t\tvar remoteRunImage, builderWithoutLifecycleImageOrCreator *fakes.Image\n\n\t\t\tit.Before(func() {\n\t\t\t\tremoteRunImage = fakes.NewImage(\"default/run\", \"\", nil)\n\t\t\t\th.AssertNil(t, remoteRunImage.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\t\t\th.AssertNil(t, remoteRunImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"mixinX\", \"run:mixinZ\"]`))\n\t\t\t\tfakeImageFetcher.RemoteImages[remoteRunImage.Name()] = remoteRunImage\n\n\t\t\t\tbuilderWithoutLifecycleImageOrCreator = newFakeBuilderImage(\n\t\t\t\t\tt,\n\t\t\t\t\ttmpDir,\n\t\t\t\t\t\"example.com/supportscreator/builder:tag\",\n\t\t\t\t\t\"some.stack.id\",\n\t\t\t\t\tdefaultRunImageName,\n\t\t\t\t\t\"0.3.0\",\n\t\t\t\t\tnewLinuxImage,\n\t\t\t\t\tfalse,\n\t\t\t\t)\n\t\t\t\th.AssertNil(t, builderWithoutLifecycleImageOrCreator.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"build:mixinB\", \"mixinX\", \"build:mixinY\"]`))\n\t\t\t\tfakeImageFetcher.LocalImages[builderWithoutLifecycleImageOrCreator.Name()] = builderWithoutLifecycleImageOrCreator\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\tremoteRunImage.Cleanup()\n\t\t\t\tbuilderWithoutLifecycleImageOrCreator.Cleanup()\n\t\t\t})\n\n\t\t\twhen(\"true\", func() {\n\t\t\t\tit(\"uses a remote run image\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: 
defaultBuilderName,\n\t\t\t\t\t\tPublish: true,\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Publish, true)\n\n\t\t\t\t\targs := fakeImageFetcher.FetchCalls[defaultBuilderName]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[\"default/run\"]\n\t\t\t\t\th.AssertEq(t, args.Daemon, false)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t})\n\n\t\t\t\twhen(\"builder is untrusted\", func() {\n\t\t\t\t\twhen(\"lifecycle image is available\", func() {\n\t\t\t\t\t\tit(\"uses the 5 phases with the lifecycle image\", func() {\n\t\t\t\t\t\t\torigLifecycleName := fakeLifecycleImage.Name()\n\n\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\t\tPublish:      true,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return false },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, false)\n\t\t\t\t\t\t\th.AssertContains(t, fakeLifecycle.Opts.LifecycleImage, \"pack.local/lifecycle\")\n\t\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[origLifecycleName]\n\t\t\t\t\t\t\th.AssertNotNil(t, args)\n\t\t\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t\t\t})\n\t\t\t\t\t\tit(\"parses the versions correctly\", func() {\n\t\t\t\t\t\t\tfakeLifecycleImage.SetLabel(\"io.buildpacks.lifecycle.apis\", \"{\\\"platform\\\":{\\\"deprecated\\\":[\\\"0.1\\\",\\\"0.2\\\",\\\"0.3\\\",\\\"0.4\\\",\\\"0.5\\\",\\\"0.6\\\"],\\\"supported\\\":[\\\"0.7\\\",\\\"0.8\\\",\\\"0.9\\\",\\\"0.10\\\",\\\"0.11\\\",\\\"0.12\\\"]}}\")\n\n\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\t\tPublish:     
 true,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return false },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertSliceContainsInOrder(t, fakeLifecycle.Opts.LifecycleApis, \"0.1\", \"0.2\", \"0.3\", \"0.4\", \"0.5\", \"0.6\", \"0.7\", \"0.8\", \"0.9\", \"0.10\", \"0.11\", \"0.12\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"lifecycle image is not available\", func() {\n\t\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\t\th.AssertNotNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      builderWithoutLifecycleImageOrCreator.Name(),\n\t\t\t\t\t\t\t\tPublish:      true,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return false },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"builder is trusted\", func() {\n\t\t\t\t\twhen(\"lifecycle supports creator\", func() {\n\t\t\t\t\t\tit(\"uses the creator with the provided builder\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\t\tPublish:      true,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, true)\n\n\t\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[fakeLifecycleImage.Name()]\n\t\t\t\t\t\t\th.AssertNil(t, args)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"additional buildpacks were added\", func() {\n\t\t\t\t\t\t\tit(\"uses creator when additional buildpacks are provided and TrustExtraBuildpacks is set\", func() {\n\t\t\t\t\t\t\t\tadditionalBP := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"buildpack.add.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"buildpack.add.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: 
[]dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\tImage:                \"some/app\",\n\t\t\t\t\t\t\t\t\tBuilder:              defaultBuilderName,\n\t\t\t\t\t\t\t\t\tPublish:              true,\n\t\t\t\t\t\t\t\t\tTrustBuilder:         func(string) bool { return true },\n\t\t\t\t\t\t\t\t\tTrustExtraBuildpacks: true,\n\t\t\t\t\t\t\t\t\tBuildpacks:           []string{additionalBP},\n\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, true)\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"uses the 5 phases with the lifecycle image\", func() {\n\t\t\t\t\t\t\t\tadditionalBP := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\tID:      \"buildpack.add.1.id\",\n\t\t\t\t\t\t\t\t\t\tVersion: \"buildpack.add.1.version\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\t\t\tPublish:      true,\n\t\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t\t\tBuildpacks:   []string{additionalBP},\n\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, false)\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.LifecycleImage, fakeLifecycleImage.Name())\n\n\t\t\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Builder is trusted but additional modules were added; using the untrusted (5 phases) build flow\")\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"from project descriptor\", func() {\n\t\t\t\t\t\t\t\tit(\"uses the 5 phases with the lifecycle image\", 
func() {\n\t\t\t\t\t\t\t\t\tadditionalBP := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\t\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"buildpack.add.1.id\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"buildpack.add.1.version\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: defaultBuilderStackID}},\n\t\t\t\t\t\t\t\t\t\tWithOrder:  nil,\n\t\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\t\t\t\tPublish:      true,\n\t\t\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{Build: projectTypes.Build{\n\t\t\t\t\t\t\t\t\t\t\tBuildpacks: []projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\t\t\tURI: additionalBP,\n\t\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, false)\n\t\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.LifecycleImage, fakeLifecycleImage.Name())\n\n\t\t\t\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"Builder is trusted but additional modules were added; using the untrusted (5 phases) build flow\")\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\twhen(\"inline buildpack\", func() {\n\t\t\t\t\t\t\t\t\tit(\"uses the creator with the provided builder\", func() {\n\t\t\t\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\t\t\t\t\tPublish:      true,\n\t\t\t\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t\t\t\t\tProjectDescriptor: projectTypes.Descriptor{Build: projectTypes.Build{\n\t\t\t\t\t\t\t\t\t\t\t\tBuildpacks: 
[]projectTypes.Buildpack{{\n\t\t\t\t\t\t\t\t\t\t\t\t\tID:      \"buildpack.add.1.id\",\n\t\t\t\t\t\t\t\t\t\t\t\t\tVersion: \"buildpack.add.1.version\",\n\t\t\t\t\t\t\t\t\t\t\t\t\tScript: projectTypes.Script{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tAPI:    \"0.10\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tInline: \"echo hello\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, true)\n\n\t\t\t\t\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[fakeLifecycleImage.Name()]\n\t\t\t\t\t\t\t\t\t\th.AssertNil(t, args)\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"lifecycle doesn't support creator\", func() {\n\t\t\t\t\t\t// the default test builder (example.com/default/builder:tag) has lifecycle version 0.3.0, so creator is not supported\n\t\t\t\t\t\tit(\"uses the 5 phases with the provided builder\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      builderWithoutLifecycleImageOrCreator.Name(),\n\t\t\t\t\t\t\t\tPublish:      true,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, false)\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.LifecycleImage, builderWithoutLifecycleImageOrCreator.Name())\n\n\t\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[fakeLifecycleImage.Name()]\n\t\t\t\t\t\t\th.AssertNil(t, args)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"false\", func() {\n\t\t\t\tit(\"uses a local run image\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tPublish: false,\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Publish, false)\n\n\t\t\t\t\targs := 
fakeImageFetcher.FetchCalls[\"default/run\"]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[defaultBuilderName]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t})\n\n\t\t\t\twhen(\"builder is untrusted\", func() {\n\t\t\t\t\twhen(\"lifecycle image is available\", func() {\n\t\t\t\t\t\tit(\"uses the 5 phases with the lifecycle image\", func() {\n\t\t\t\t\t\t\torigLifecycleName := fakeLifecycleImage.Name()\n\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return false },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, false)\n\t\t\t\t\t\t\th.AssertContains(t, fakeLifecycle.Opts.LifecycleImage, \"pack.local/lifecycle\")\n\t\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[origLifecycleName]\n\t\t\t\t\t\t\th.AssertNotNil(t, args)\n\t\t\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"lifecycle image is not available\", func() {\n\t\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\t\th.AssertNotNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      builderWithoutLifecycleImageOrCreator.Name(),\n\t\t\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return false },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"builder is trusted\", func() {\n\t\t\t\t\twhen(\"lifecycle supports creator\", func() {\n\t\t\t\t\t\tit(\"uses the creator with the provided builder\", 
func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, true)\n\n\t\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[fakeLifecycleImage.Name()]\n\t\t\t\t\t\t\th.AssertNil(t, args)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"lifecycle doesn't support creator\", func() {\n\t\t\t\t\t\t// the default test builder (example.com/default/builder:tag) has lifecycle version 0.3.0, so creator is not supported\n\t\t\t\t\t\tit(\"uses the 5 phases with the provided builder\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\t\tBuilder:      builderWithoutLifecycleImageOrCreator.Name(),\n\t\t\t\t\t\t\t\tPublish:      false,\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreator, false)\n\t\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.LifecycleImage, builderWithoutLifecycleImageOrCreator.Name())\n\n\t\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[fakeLifecycleImage.Name()]\n\t\t\t\t\t\t\th.AssertNil(t, args)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Platform option\", func() {\n\t\t\tvar fakePackage imgutil.Image\n\n\t\t\tit.Before(func() {\n\t\t\t\tfakePackage = makeFakePackage(t, tmpDir, defaultBuilderStackID)\n\t\t\t\tfakeImageFetcher.LocalImages[fakePackage.Name()] = fakePackage\n\t\t\t})\n\n\t\t\twhen(\"provided\", func() {\n\t\t\t\tit(\"uses the provided platform to pull the builder, run image, packages, and lifecycle image\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: 
defaultBuilderName,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\"example.com/some/package\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPlatform:   \"linux/arm64\",\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}))\n\n\t\t\t\t\targs := fakeImageFetcher.FetchCalls[defaultBuilderName]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/arm64\")\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[\"default/run\"]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/arm64\")\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[fakePackage.Name()]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/arm64\")\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[fmt.Sprintf(\"%s:%s\", cfg.DefaultLifecycleImageRepo, builder.DefaultLifecycleVersion)]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/arm64\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"not provided\", func() {\n\t\t\t\tit(\"defaults to builder os/arch\", func() {\n\t\t\t\t\t// defaultBuilderImage has linux/amd64\n\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tBuildpacks: []string{\n\t\t\t\t\t\t\t\"example.com/some/package\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}))\n\n\t\t\t\t\targs := fakeImageFetcher.FetchCalls[defaultBuilderName]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\th.AssertEq(t, args.Target, (*dist.Target)(nil))\n\n\t\t\t\t\targs = 
fakeImageFetcher.FetchCalls[\"default/run\"]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[fakePackage.Name()]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[fmt.Sprintf(\"%s:%s\", cfg.DefaultLifecycleImageRepo, builder.DefaultLifecycleVersion)]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"PullPolicy\", func() {\n\t\t\twhen(\"never\", func() {\n\t\t\t\tit(\"uses the local builder and run images without updating\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}))\n\n\t\t\t\t\targs := fakeImageFetcher.FetchCalls[\"default/run\"]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullNever)\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[defaultBuilderName]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullNever)\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[fmt.Sprintf(\"%s:%s\", cfg.DefaultLifecycleImageRepo, builder.DefaultLifecycleVersion)]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullNever)\n\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"containerized pack\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tRunningInContainer = func() bool {\n\t\t\t\t\t\treturn 
true\n\t\t\t\t\t}\n\t\t\t\t})\n\n\t\t\t\twhen(\"--pull-policy=always\", func() {\n\t\t\t\t\tit(\"does not warn\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t}))\n\n\t\t\t\t\t\th.AssertNotContains(t, outBuf.String(), \"failing to pull build inputs from a remote registry is insecure\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"not --pull-policy=always\", func() {\n\t\t\t\t\tit(\"warns\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t}))\n\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"failing to pull build inputs from a remote registry is insecure\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"always\", func() {\n\t\t\t\tit(\"pulls the builder and run image before using them\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}))\n\n\t\t\t\t\targs := fakeImageFetcher.FetchCalls[\"default/run\"]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\n\t\t\t\t\targs = fakeImageFetcher.FetchCalls[defaultBuilderName]\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t\th.AssertEq(t, args.PullPolicy, image.PullAlways)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"ProxyConfig option\", func() {\n\t\t\twhen(\"ProxyConfig is nil\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, os.Setenv(\"http_proxy\", \"other-http-proxy\"))\n\t\t\t\t\th.AssertNil(t, os.Setenv(\"https_proxy\", \"other-https-proxy\"))\n\t\t\t\t\th.AssertNil(t, os.Setenv(\"no_proxy\", 
\"other-no-proxy\"))\n\t\t\t\t})\n\n\t\t\t\twhen(\"*_PROXY env vars are set\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.AssertNil(t, os.Setenv(\"HTTP_PROXY\", \"some-http-proxy\"))\n\t\t\t\t\t\th.AssertNil(t, os.Setenv(\"HTTPS_PROXY\", \"some-https-proxy\"))\n\t\t\t\t\t\th.AssertNil(t, os.Setenv(\"NO_PROXY\", \"some-no-proxy\"))\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.AssertNilE(t, os.Unsetenv(\"HTTP_PROXY\"))\n\t\t\t\t\t\th.AssertNilE(t, os.Unsetenv(\"HTTPS_PROXY\"))\n\t\t\t\t\t\th.AssertNilE(t, os.Unsetenv(\"NO_PROXY\"))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"defaults to the *_PROXY environment variables\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t}))\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.HTTPProxy, \"some-http-proxy\")\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.HTTPSProxy, \"some-https-proxy\")\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.NoProxy, \"some-no-proxy\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\tit(\"falls back to the *_proxy environment variables\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.HTTPProxy, \"other-http-proxy\")\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.HTTPSProxy, \"other-https-proxy\")\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.NoProxy, \"other-no-proxy\")\n\t\t\t\t})\n\t\t\t}, spec.Sequential())\n\n\t\t\twhen(\"ProxyConfig is not nil\", func() {\n\t\t\t\tit(\"passes the values through\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\tProxyConfig: &ProxyConfig{\n\t\t\t\t\t\t\tHTTPProxy:  \"custom-http-proxy\",\n\t\t\t\t\t\t\tHTTPSProxy: 
\"custom-https-proxy\",\n\t\t\t\t\t\t\tNoProxy:    \"custom-no-proxy\",\n\t\t\t\t\t\t},\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.HTTPProxy, \"custom-http-proxy\")\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.HTTPSProxy, \"custom-https-proxy\")\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.NoProxy, \"custom-no-proxy\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Network option\", func() {\n\t\t\tit(\"passes the value through\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\tNetwork: \"some-network\",\n\t\t\t\t\t},\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Network, \"some-network\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Lifecycle option\", func() {\n\t\t\twhen(\"Platform API\", func() {\n\t\t\t\tfor _, supportedPlatformAPI := range []string{\"0.3\", \"0.4\"} {\n\t\t\t\t\tvar (\n\t\t\t\t\t\tsupportedPlatformAPI = supportedPlatformAPI\n\t\t\t\t\t\tcompatibleBuilder    *fakes.Image\n\t\t\t\t\t)\n\n\t\t\t\t\twhen(fmt.Sprintf(\"lifecycle platform API is compatible (%s)\", supportedPlatformAPI), func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tcompatibleBuilder = ifakes.NewFakeBuilderImage(t,\n\t\t\t\t\t\t\t\ttmpDir,\n\t\t\t\t\t\t\t\t\"compatible-\"+defaultBuilderName,\n\t\t\t\t\t\t\t\tdefaultBuilderStackID,\n\t\t\t\t\t\t\t\t\"1234\",\n\t\t\t\t\t\t\t\t\"5678\",\n\t\t\t\t\t\t\t\tbuilder.Metadata{\n\t\t\t\t\t\t\t\t\tStack: builder.StackMetadata{\n\t\t\t\t\t\t\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\t\t\t\t\t\t\tImage: \"default/run\",\n\t\t\t\t\t\t\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\t\t\t\t\t\t\"registry1.example.com/run/mirror\",\n\t\t\t\t\t\t\t\t\t\t\t\t\"registry2.example.com/run/mirror\",\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tLifecycle: builder.LifecycleMetadata{\n\t\t\t\t\t\t\t\t\t\tLifecycleInfo: 
builder.LifecycleInfo{\n\t\t\t\t\t\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\t\t\t\t\t\tVersion: *semver.MustParse(builder.DefaultLifecycleVersion),\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\t\t\t\t\t\tSupported: builder.APISet{api.MustParse(\"0.2\"), api.MustParse(\"0.3\"), api.MustParse(\"0.4\")},\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\t\t\t\t\t\tSupported: builder.APISet{api.MustParse(supportedPlatformAPI)},\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\t\tdist.System{},\n\t\t\t\t\t\t\t\tnewLinuxImage,\n\t\t\t\t\t\t\t)\n\n\t\t\t\t\t\t\tfakeImageFetcher.LocalImages[compatibleBuilder.Name()] = compatibleBuilder\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"should succeed\", func() {\n\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\tBuilder: compatibleBuilder.Name(),\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t}\n\n\t\t\t\twhen(\"lifecycle Platform API is not compatible\", func() {\n\t\t\t\t\tvar incompatibleBuilderImage *fakes.Image\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tincompatibleBuilderImage = ifakes.NewFakeBuilderImage(t,\n\t\t\t\t\t\t\ttmpDir,\n\t\t\t\t\t\t\t\"incompatible-\"+defaultBuilderName,\n\t\t\t\t\t\t\tdefaultBuilderStackID,\n\t\t\t\t\t\t\t\"1234\",\n\t\t\t\t\t\t\t\"5678\",\n\t\t\t\t\t\t\tbuilder.Metadata{\n\t\t\t\t\t\t\t\tStack: builder.StackMetadata{\n\t\t\t\t\t\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\t\t\t\t\t\tImage: \"default/run\",\n\t\t\t\t\t\t\t\t\t\tMirrors: 
[]string{\n\t\t\t\t\t\t\t\t\t\t\t\"registry1.example.com/run/mirror\",\n\t\t\t\t\t\t\t\t\t\t\t\"registry2.example.com/run/mirror\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tLifecycle: builder.LifecycleMetadata{\n\t\t\t\t\t\t\t\t\tLifecycleInfo: builder.LifecycleInfo{\n\t\t\t\t\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\t\t\t\t\tVersion: *semver.MustParse(builder.DefaultLifecycleVersion),\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tAPI: builder.LifecycleAPI{\n\t\t\t\t\t\t\t\t\t\tBuildpackVersion: api.MustParse(\"0.3\"),\n\t\t\t\t\t\t\t\t\t\tPlatformVersion:  api.MustParse(\"0.1\"),\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tdist.System{},\n\t\t\t\t\t\t\tnewLinuxImage,\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[incompatibleBuilderImage.Name()] = incompatibleBuilderImage\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\tincompatibleBuilderImage.Cleanup()\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\tbuilderName := incompatibleBuilderImage.Name()\n\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: builderName,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"Builder %s is incompatible with this version of pack\", style.Symbol(builderName)))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"supported Platform APIs not specified\", func() {\n\t\t\t\t\tvar badBuilderImage *fakes.Image\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tbadBuilderImage = ifakes.NewFakeBuilderImage(t,\n\t\t\t\t\t\t\ttmpDir,\n\t\t\t\t\t\t\t\"incompatible-\"+defaultBuilderName,\n\t\t\t\t\t\t\tdefaultBuilderStackID,\n\t\t\t\t\t\t\t\"1234\",\n\t\t\t\t\t\t\t\"5678\",\n\t\t\t\t\t\t\tbuilder.Metadata{\n\t\t\t\t\t\t\t\tStack: builder.StackMetadata{\n\t\t\t\t\t\t\t\t\tRunImage: 
builder.RunImageMetadata{\n\t\t\t\t\t\t\t\t\t\tImage: \"default/run\",\n\t\t\t\t\t\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\t\t\t\t\t\"registry1.example.com/run/mirror\",\n\t\t\t\t\t\t\t\t\t\t\t\"registry2.example.com/run/mirror\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tLifecycle: builder.LifecycleMetadata{\n\t\t\t\t\t\t\t\t\tLifecycleInfo: builder.LifecycleInfo{\n\t\t\t\t\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\t\t\t\t\tVersion: *semver.MustParse(builder.DefaultLifecycleVersion),\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\t\t\t\t\tBuildpack: builder.APIVersions{Supported: builder.APISet{api.MustParse(\"0.2\")}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tdist.System{},\n\t\t\t\t\t\t\tnewLinuxImage,\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[badBuilderImage.Name()] = badBuilderImage\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\tbadBuilderImage.Cleanup()\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\tbuilderName := badBuilderImage.Name()\n\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: builderName,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertError(t, err, \"supported Lifecycle Platform APIs not specified\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"Buildpack API\", func() {\n\t\t\t\twhen(\"supported Buildpack APIs not specified\", func() {\n\t\t\t\t\tvar badBuilderImage *fakes.Image\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tbadBuilderImage = ifakes.NewFakeBuilderImage(t,\n\t\t\t\t\t\t\ttmpDir,\n\t\t\t\t\t\t\t\"incompatible-\"+defaultBuilderName,\n\t\t\t\t\t\t\tdefaultBuilderStackID,\n\t\t\t\t\t\t\t\"1234\",\n\t\t\t\t\t\t\t\"5678\",\n\t\t\t\t\t\t\tbuilder.Metadata{\n\t\t\t\t\t\t\t\tStack: 
builder.StackMetadata{\n\t\t\t\t\t\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\t\t\t\t\t\tImage: \"default/run\",\n\t\t\t\t\t\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\t\t\t\t\t\"registry1.example.com/run/mirror\",\n\t\t\t\t\t\t\t\t\t\t\t\"registry2.example.com/run/mirror\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tLifecycle: builder.LifecycleMetadata{\n\t\t\t\t\t\t\t\t\tLifecycleInfo: builder.LifecycleInfo{\n\t\t\t\t\t\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\t\t\t\t\t\tVersion: *semver.MustParse(builder.DefaultLifecycleVersion),\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\t\t\t\t\tPlatform: builder.APIVersions{Supported: builder.APISet{api.MustParse(\"0.4\")}},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tnil,\n\t\t\t\t\t\t\tdist.System{},\n\t\t\t\t\t\t\tnewLinuxImage,\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[badBuilderImage.Name()] = badBuilderImage\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\tbadBuilderImage.Cleanup()\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\tbuilderName := badBuilderImage.Name()\n\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: builderName,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertError(t, err, \"supported Lifecycle Buildpack APIs not specified\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"use creator with extensions\", func() {\n\t\t\t\twhen(\"lifecycle is old\", func() {\n\t\t\t\t\tit(\"false\", func() {\n\t\t\t\t\t\toldLifecycleBuilder := newFakeBuilderImage(t, tmpDir, \"example.com/old-lifecycle-builder:tag\", defaultBuilderStackID, defaultRunImageName, \"0.18.0\", newLinuxImage, false)\n\t\t\t\t\t\tdefer 
oldLifecycleBuilder.Cleanup()\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[oldLifecycleBuilder.Name()] = oldLifecycleBuilder\n\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\tBuilder:      oldLifecycleBuilder.Name(),\n\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t}))\n\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreatorWithExtensions, false)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"lifecycle is new\", func() {\n\t\t\t\t\tit(\"true\", func() {\n\t\t\t\t\t\tnewLifecycleBuilder := newFakeBuilderImage(t, tmpDir, \"example.com/new-lifecycle-builder:tag\", defaultBuilderStackID, defaultRunImageName, \"0.19.0\", newLinuxImage, false)\n\t\t\t\t\t\tdefer newLifecycleBuilder.Cleanup()\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[newLifecycleBuilder.Name()] = newLifecycleBuilder\n\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:        \"some/app\",\n\t\t\t\t\t\t\tBuilder:      newLifecycleBuilder.Name(),\n\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t}))\n\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.UseCreatorWithExtensions, true)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"validating mixins\", func() {\n\t\t\twhen(\"stack image mixins disagree\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, defaultBuilderImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\"]`))\n\t\t\t\t\th.AssertNil(t, fakeDefaultRunImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinB\"]`))\n\t\t\t\t})\n\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t}))\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform API < 0.12\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsetAPIs(t, defaultBuilderImage, []string{\"0.8\"}, 
[]string{\"0.11\"})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertError(t, err, \"validating stack mixins: 'default/run' missing required mixin(s): mixinA\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"builder buildpack mixins are not satisfied\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.AssertNil(t, defaultBuilderImage.SetLabel(\"io.buildpacks.stack.mixins\", \"\"))\n\t\t\t\t\th.AssertNil(t, fakeDefaultRunImage.SetLabel(\"io.buildpacks.stack.mixins\", \"\"))\n\t\t\t\t})\n\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t}))\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform API < 0.12\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsetAPIs(t, defaultBuilderImage, []string{\"0.8\"}, []string{\"0.11\"})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertError(t, err, \"validating stack mixins: buildpack 'buildpack.1.id@buildpack.1.version' requires missing mixin(s): build:mixinY, mixinX, run:mixinZ\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Volumes option\", func() {\n\t\t\twhen(\"on posix\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"Skipped on windows\")\n\t\t\t\t})\n\n\t\t\t\tfor _, test := range []struct {\n\t\t\t\t\tname        string\n\t\t\t\t\tvolume      string\n\t\t\t\t\texpectation string\n\t\t\t\t}{\n\t\t\t\t\t{\"defaults to read-only\", \"/a:/x\", \"/a:/x:ro\"},\n\t\t\t\t\t{\"defaults to read-only (nested)\", \"/a:/some/path/y\", 
\"/a:/some/path/y:ro\"},\n\t\t\t\t\t{\"supports rw mode\", \"/a:/x:rw\", \"/a:/x:rw\"},\n\t\t\t\t} {\n\t\t\t\t\tvolume := test.volume\n\t\t\t\t\texpectation := test.expectation\n\n\t\t\t\t\tit(test.name, func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\tVolumes: []string{volume},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Volumes, []string{expectation})\n\t\t\t\t\t})\n\t\t\t\t}\n\n\t\t\t\twhen(\"volume mode is invalid\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\tVolumes: []string{\"/a:/x:invalid\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertError(t, err, `platform volume \"/a:/x:invalid\" has invalid format: invalid mode: invalid`)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"volume specification is invalid\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\tVolumes: []string{\":::\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\t\t\t\t\t\tif runtime.GOOS == \"darwin\" {\n\t\t\t\t\t\t\th.AssertError(t, err, `platform volume \":::\" has invalid format: invalid spec: :::: empty section between colons`)\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\th.AssertError(t, err, `platform volume \":::\" has invalid format: invalid volume specification: ':::'`)\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"mounting onto cnb spec'd dir\", func() {\n\t\t\t\t\tfor _, p := range 
[]string{\n\t\t\t\t\t\t\"/cnb/buildpacks\",\n\t\t\t\t\t\t\"/cnb/buildpacks/nested\",\n\t\t\t\t\t\t\"/cnb\",\n\t\t\t\t\t\t\"/cnb/nested\",\n\t\t\t\t\t\t\"/layers\",\n\t\t\t\t\t\t\"/layers/nested\",\n\t\t\t\t\t\t\"/workspace\",\n\t\t\t\t\t\t\"/workspace/bindings\",\n\t\t\t\t\t} {\n\t\t\t\t\t\tp := p\n\t\t\t\t\t\tit(fmt.Sprintf(\"warns when mounting to '%s'\", p), func() {\n\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\t\tVolumes: []string{fmt.Sprintf(\"/tmp/path:%s\", p)},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"Warning: Mounting to a sensitive directory '%s'\", p))\n\t\t\t\t\t\t})\n\t\t\t\t\t}\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"on windows\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\th.SkipIf(t, runtime.GOOS != \"windows\", \"Skipped on non-windows\")\n\t\t\t\t})\n\t\t\t\twhen(\"linux container\", func() {\n\t\t\t\t\tit(\"drive is transformed\", func() {\n\t\t\t\t\t\tdir, _ := os.MkdirTemp(\"\", \"pack-test-mount\")\n\t\t\t\t\t\tvolume := fmt.Sprintf(\"%v:/x\", dir)\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\tVolumes: []string{volume},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t})\n\t\t\t\t\t\texpected := []string{\n\t\t\t\t\t\t\tfmt.Sprintf(\"%s:/x:ro\", strings.ToLower(dir)),\n\t\t\t\t\t\t}\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Volumes, expected)\n\t\t\t\t\t})\n\n\t\t\t\t\t// May not fail as mode is not used on Windows\n\t\t\t\t\twhen(\"volume mode is invalid\", func() {\n\t\t\t\t\t\tit(\"returns an error\", func() 
{\n\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\t\tVolumes: []string{\"/a:/x:invalid\"},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\th.AssertError(t, err, `platform volume \"/a:/x:invalid\" has invalid format: invalid volume specification: '/a:/x:invalid'`)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"volume specification is invalid\", func() {\n\t\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\t\tVolumes: []string{\":::\"},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\th.AssertError(t, err, `platform volume \":::\" has invalid format: invalid volume specification: ':::'`)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"mounting onto cnb spec'd dir\", func() {\n\t\t\t\t\t\tfor _, p := range []string{\n\t\t\t\t\t\t\t`/cnb`, `/cnb/buildpacks`, `/layers`, `/workspace`,\n\t\t\t\t\t\t} {\n\t\t\t\t\t\t\tp := p\n\t\t\t\t\t\t\tit(fmt.Sprintf(\"warns when mounting to '%s'\", p), func() {\n\t\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\t\t\tVolumes: []string{fmt.Sprintf(\"c:/Users:%s\", p)},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"Warning: Mounting to a sensitive directory '%s'\", 
p))\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"windows container\", func() {\n\t\t\t\t\tit(\"drive is mounted\", func() {\n\t\t\t\t\t\tdir, _ := os.MkdirTemp(\"\", \"pack-test-mount\")\n\t\t\t\t\t\tvolume := fmt.Sprintf(\"%v:c:\\\\x\", dir)\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultWindowsBuilderName,\n\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\tVolumes: []string{volume},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t})\n\t\t\t\t\t\texpected := []string{\n\t\t\t\t\t\t\tfmt.Sprintf(\"%s:c:\\\\x:ro\", strings.ToLower(dir)),\n\t\t\t\t\t\t}\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Volumes, expected)\n\t\t\t\t\t})\n\n\t\t\t\t\t// May not fail as mode is not used on Windows\n\t\t\t\t\twhen(\"volume mode is invalid\", func() {\n\t\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\tBuilder: defaultWindowsBuilderName,\n\t\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\t\tVolumes: []string{\"/a:/x:invalid\"},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\th.AssertError(t, err, `platform volume \"/a:/x:invalid\" has invalid format: invalid volume specification: '/a:/x:invalid'`)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\t// Should fail even on windows\n\t\t\t\t\twhen(\"volume specification is invalid\", func() {\n\t\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\tBuilder: defaultWindowsBuilderName,\n\t\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\t\tVolumes: 
[]string{\":::\"},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\th.AssertError(t, err, `platform volume \":::\" has invalid format: invalid volume specification: ':::'`)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"mounting onto cnb spec'd dir\", func() {\n\t\t\t\t\t\tfor _, p := range []string{\n\t\t\t\t\t\t\t`c:\\cnb`, `c:\\cnb\\buildpacks`, `c:\\layers`, `c:\\workspace`,\n\t\t\t\t\t\t} {\n\t\t\t\t\t\t\tp := p\n\t\t\t\t\t\t\tit(fmt.Sprintf(\"warns when mounting to '%s'\", p), func() {\n\t\t\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\t\t\tBuilder: defaultWindowsBuilderName,\n\t\t\t\t\t\t\t\t\tContainerConfig: ContainerConfig{\n\t\t\t\t\t\t\t\t\t\tVolumes: []string{fmt.Sprintf(\"c:/Users:%s\", p)},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tTrustBuilder: func(string) bool { return true },\n\t\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"Warning: Mounting to a sensitive directory '%s'\", p))\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"gid option\", func() {\n\t\t\tit(\"gid is passed through to lifecycle\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tWorkspace: \"app\",\n\t\t\t\t\tBuilder:   defaultBuilderName,\n\t\t\t\t\tImage:     \"example.com/some/repo:tag\",\n\t\t\t\t\tGroupID:   2,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.GID, 2)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"RegistryMirrors option\", func() {\n\t\t\tit(\"translates run image before passing to lifecycle\", func() {\n\t\t\t\tsubject.registryMirrors = map[string]string{\n\t\t\t\t\t\"index.docker.io\": \"10.0.0.1\",\n\t\t\t\t}\n\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\tImage:   
\"example.com/some/repo:tag\",\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.RunImage, \"10.0.0.1/default/run:latest\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"previous-image option\", func() {\n\t\t\tit(\"previous-image is passed to lifecycle\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tWorkspace:     \"app\",\n\t\t\t\t\tBuilder:       defaultBuilderName,\n\t\t\t\t\tImage:         \"example.com/some/repo:tag\",\n\t\t\t\t\tPreviousImage: \"example.com/some/new:tag\",\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.PreviousImage, \"example.com/some/new:tag\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"interactive option\", func() {\n\t\t\tit(\"passes through to lifecycle\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tBuilder:     defaultBuilderName,\n\t\t\t\t\tImage:       \"example.com/some/repo:tag\",\n\t\t\t\t\tInteractive: true,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Interactive, true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"sbom destination dir option\", func() {\n\t\t\tit(\"passes through to lifecycle\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tBuilder:            defaultBuilderName,\n\t\t\t\t\tImage:              \"example.com/some/repo:tag\",\n\t\t\t\t\tSBOMDestinationDir: \"some-destination-dir\",\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.SBOMDestinationDir, \"some-destination-dir\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"report destination dir option\", func() {\n\t\t\tit(\"passes through to lifecycle\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tBuilder:              defaultBuilderName,\n\t\t\t\t\tImage:                \"example.com/some/repo:tag\",\n\t\t\t\t\tReportDestinationDir: \"a-destination-dir\",\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.ReportDestinationDir, \"a-destination-dir\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there are extensions\", func() 
{\n\t\t\twithExtensionsLabel = true\n\n\t\t\twhen(\"default configuration\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t})\n\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.BuilderImage, defaultBuilderName)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"os\", func() {\n\t\t\t\twhen(\"windows\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.SkipIf(t, runtime.GOOS != \"windows\", \"Skipped on non-windows\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultWindowsBuilderName,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"linux\", func() {\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\t\t\tBuilder: defaultBuilderName,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.BuilderImage, defaultBuilderName)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"pull policy\", func() {\n\t\t\t\twhen(\"always\", func() {\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.BuilderImage, defaultBuilderName)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"other\", func() {\n\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\terr := subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:      \"some/app\",\n\t\t\t\t\t\t\tBuilder:    defaultBuilderName,\n\t\t\t\t\t\t\tPullPolicy: 
image.PullNever,\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"export to OCI layout\", func() {\n\t\t\tvar (\n\t\t\t\tinputImageReference, inputPreviousImageReference       InputImageReference\n\t\t\t\tlayoutConfig                                           *LayoutConfig\n\t\t\t\thostImagePath, hostPreviousImagePath, hostRunImagePath string\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\th.SkipIf(t, runtime.GOOS == \"windows\", \"skip on windows\")\n\n\t\t\t\tremoteRunImage := fakes.NewImage(\"default/run\", \"\", nil)\n\t\t\t\th.AssertNil(t, remoteRunImage.SetLabel(\"io.buildpacks.stack.id\", defaultBuilderStackID))\n\t\t\t\th.AssertNil(t, remoteRunImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinA\", \"mixinX\", \"run:mixinZ\"]`))\n\t\t\t\tfakeImageFetcher.RemoteImages[remoteRunImage.Name()] = remoteRunImage\n\n\t\t\t\thostImagePath = filepath.Join(tmpDir, \"my-app\")\n\t\t\t\tinputImageReference = ParseInputImageReference(fmt.Sprintf(\"oci:%s\", hostImagePath))\n\t\t\t\tlayoutConfig = &LayoutConfig{\n\t\t\t\t\tInputImage:    inputImageReference,\n\t\t\t\t\tLayoutRepoDir: filepath.Join(tmpDir, \"local-repo\"),\n\t\t\t\t}\n\t\t\t})\n\n\t\t\twhen(\"previous image is not provided\", func() {\n\t\t\t\twhen(\"sparse is false\", func() {\n\t\t\t\t\tit(\"saves run-image locally in oci layout and mounts volumes\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:        inputImageReference.Name(),\n\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\tLayoutConfig: layoutConfig,\n\t\t\t\t\t\t}))\n\n\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[\"default/run\"]\n\t\t\t\t\t\th.AssertEq(t, args.LayoutOption.Sparse, false)\n\t\t\t\t\t\th.AssertContains(t, args.LayoutOption.Path, layoutConfig.LayoutRepoDir)\n\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Layout, true)\n\t\t\t\t\t\t// verify the host paths are mounted as 
volumes\n\t\t\t\t\t\th.AssertSliceContainsMatch(t, fakeLifecycle.Opts.Volumes, hostImagePath, hostRunImagePath)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"sparse is true\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tlayoutConfig.Sparse = true\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"saves run-image locally (no layers) in oci layout and mounts volumes\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\t\tImage:        inputImageReference.Name(),\n\t\t\t\t\t\t\tBuilder:      defaultBuilderName,\n\t\t\t\t\t\t\tLayoutConfig: layoutConfig,\n\t\t\t\t\t\t}))\n\n\t\t\t\t\t\targs := fakeImageFetcher.FetchCalls[\"default/run\"]\n\t\t\t\t\t\th.AssertEq(t, args.LayoutOption.Sparse, true)\n\t\t\t\t\t\th.AssertContains(t, args.LayoutOption.Path, layoutConfig.LayoutRepoDir)\n\n\t\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Layout, true)\n\t\t\t\t\t\t// verify the host paths are mounted as volumes\n\t\t\t\t\t\th.AssertSliceContainsMatch(t, fakeLifecycle.Opts.Volumes, hostImagePath, hostRunImagePath)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"previous image is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\thostPreviousImagePath = filepath.Join(tmpDir, \"my-previous-app\")\n\t\t\t\t\tinputPreviousImageReference = ParseInputImageReference(fmt.Sprintf(\"oci:%s\", hostPreviousImagePath))\n\t\t\t\t\tlayoutConfig.PreviousInputImage = inputPreviousImageReference\n\t\t\t\t})\n\n\t\t\t\tit(\"mounts previous image volume\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\t\tImage:         inputImageReference.Name(),\n\t\t\t\t\t\tPreviousImage: inputPreviousImageReference.Name(),\n\t\t\t\t\t\tBuilder:       defaultBuilderName,\n\t\t\t\t\t\tLayoutConfig:  layoutConfig,\n\t\t\t\t\t}))\n\n\t\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Layout, true)\n\t\t\t\t\t// verify the host paths are mounted as volumes\n\t\t\t\t\th.AssertSliceContainsMatch(t, fakeLifecycle.Opts.Volumes, hostImagePath, 
hostPreviousImagePath, hostRunImagePath)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"there are system buildpacks\", func() {\n\t\t\tassertSystemEquals := func(content string) {\n\t\t\t\tt.Helper()\n\n\t\t\t\tsystemLayer, err := builderImageWithSystem.FindLayerWithPath(\"/cnb/system.toml\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertOnTarEntry(t, systemLayer, \"/cnb/system.toml\", h.ContentEquals(content))\n\t\t\t}\n\n\t\t\tit(\"uses the system buildpacks defined in the builder\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:   \"some/app\",\n\t\t\t\t\tBuilder: builderImageWithSystemName,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), builderImageWithSystem.Name())\n\t\t\t\th.AssertTrue(t, len(fakeLifecycle.Opts.Builder.System().Pre.Buildpacks) == 1)\n\t\t\t\th.AssertTrue(t, len(fakeLifecycle.Opts.Builder.System().Post.Buildpacks) == 1)\n\t\t\t\tassertSystemEquals(`[system]\n  [system.pre]\n\n    [[system.pre.buildpacks]]\n      id = \"buildpack.1.id\"\n      version = \"buildpack.1.version\"\n  [system.post]\n\n    [[system.post.buildpacks]]\n      id = \"buildpack.2.id\"\n      version = \"buildpack.2.version\"\n`)\n\t\t\t})\n\n\t\t\tit(\"removes system buildpacks from builder when --disable-system-buildpacks\", func() {\n\t\t\t\th.AssertNil(t, subject.Build(context.TODO(), BuildOptions{\n\t\t\t\t\tImage:                   \"some/app\",\n\t\t\t\t\tBuilder:                 builderImageWithSystemName,\n\t\t\t\t\tDisableSystemBuildpacks: true,\n\t\t\t\t}))\n\t\t\t\th.AssertEq(t, fakeLifecycle.Opts.Builder.Name(), builderImageWithSystem.Name())\n\t\t\t\th.AssertTrue(t, len(fakeLifecycle.Opts.Builder.System().Pre.Buildpacks) == 0)\n\t\t\t\th.AssertTrue(t, len(fakeLifecycle.Opts.Builder.System().Post.Buildpacks) == 0)\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc makeFakePackage(t *testing.T, tmpDir string, stackID string) *fakes.Image {\n\tmetaBuildpackTar := ifakes.CreateBuildpackTar(t, tmpDir, 
dist.BuildpackDescriptor{\n\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\tWithInfo: dist.ModuleInfo{\n\t\t\tID:       \"meta.buildpack.id\",\n\t\t\tVersion:  \"meta.buildpack.version\",\n\t\t\tHomepage: \"http://meta.buildpack\",\n\t\t},\n\t\tWithStacks: nil,\n\t\tWithOrder: dist.Order{{\n\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"child.buildpack.id\",\n\t\t\t\t\tVersion: \"child.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tOptional: false,\n\t\t\t}},\n\t\t}},\n\t})\n\n\tchildBuildpackTar := ifakes.CreateBuildpackTar(t, tmpDir, dist.BuildpackDescriptor{\n\t\tWithAPI: api.MustParse(\"0.3\"),\n\t\tWithInfo: dist.ModuleInfo{\n\t\t\tID:       \"child.buildpack.id\",\n\t\t\tVersion:  \"child.buildpack.version\",\n\t\t\tHomepage: \"http://child.buildpack\",\n\t\t},\n\t\tWithStacks: []dist.Stack{\n\t\t\t{ID: stackID},\n\t\t},\n\t})\n\n\tbpLayers := dist.ModuleLayers{\n\t\t\"meta.buildpack.id\": {\n\t\t\t\"meta.buildpack.version\": {\n\t\t\t\tAPI: api.MustParse(\"0.3\"),\n\t\t\t\tOrder: dist.Order{{\n\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\tID:      \"child.buildpack.id\",\n\t\t\t\t\t\t\tVersion: \"child.buildpack.version\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t}},\n\t\t\t\t}},\n\t\t\t\tLayerDiffID: diffIDForFile(t, metaBuildpackTar),\n\t\t\t},\n\t\t},\n\t\t\"child.buildpack.id\": {\n\t\t\t\"child.buildpack.version\": {\n\t\t\t\tAPI: api.MustParse(\"0.3\"),\n\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t{ID: stackID},\n\t\t\t\t},\n\t\t\t\tLayerDiffID: diffIDForFile(t, childBuildpackTar),\n\t\t\t},\n\t\t},\n\t}\n\n\tmd := buildpack.Metadata{\n\t\tModuleInfo: dist.ModuleInfo{\n\t\t\tID:      \"meta.buildpack.id\",\n\t\t\tVersion: \"meta.buildpack.version\",\n\t\t},\n\t\tStacks: []dist.Stack{\n\t\t\t{ID: stackID},\n\t\t},\n\t}\n\n\tfakePackage := fakes.NewImage(\"example.com/some/package\", \"\", nil)\n\th.AssertNil(t, dist.SetLabel(fakePackage, 
\"io.buildpacks.buildpack.layers\", bpLayers))\n\th.AssertNil(t, dist.SetLabel(fakePackage, \"io.buildpacks.buildpackage.metadata\", md))\n\n\th.AssertNil(t, fakePackage.AddLayer(metaBuildpackTar))\n\th.AssertNil(t, fakePackage.AddLayer(childBuildpackTar))\n\n\treturn fakePackage\n}\n\nfunc diffIDForFile(t *testing.T, path string) string {\n\tfile, err := os.Open(path)\n\th.AssertNil(t, err)\n\n\thasher := sha256.New()\n\t_, err = io.Copy(hasher, file)\n\th.AssertNil(t, err)\n\n\treturn \"sha256:\" + hex.EncodeToString(hasher.Sum(make([]byte, 0, hasher.Size())))\n}\n\nfunc newLinuxImage(name, topLayerSha string, identifier imgutil.Identifier) *fakes.Image {\n\treturn fakes.NewImage(name, topLayerSha, identifier)\n}\n\nfunc newWindowsImage(name, topLayerSha string, identifier imgutil.Identifier) *fakes.Image {\n\tresult := fakes.NewImage(name, topLayerSha, identifier)\n\tarch, _ := result.Architecture()\n\tosVersion, _ := result.OSVersion()\n\tresult.SetOS(\"windows\")\n\tresult.SetOSVersion(osVersion)\n\tresult.SetArchitecture(arch)\n\treturn result\n}\n\nfunc newFakeBuilderImage(t *testing.T, tmpDir, builderName, defaultBuilderStackID, runImageName, lifecycleVersion string, osImageCreator ifakes.FakeImageCreator, withSystem bool) *fakes.Image {\n\tvar supportedBuildpackAPIs builder.APISet\n\tfor _, v := range api.Buildpack.Supported {\n\t\tsupportedBuildpackAPIs = append(supportedBuildpackAPIs, v)\n\t}\n\tvar supportedPlatformAPIs builder.APISet\n\tfor _, v := range api.Platform.Supported {\n\t\tsupportedPlatformAPIs = append(supportedPlatformAPIs, v)\n\t}\n\n\tsystem := dist.System{}\n\tif withSystem {\n\t\tsystem.Pre.Buildpacks = append(system.Pre.Buildpacks, []dist.ModuleRef{\n\t\t\t{\n\t\t\t\tOptional: false,\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"buildpack.1.id\",\n\t\t\t\t\tVersion: \"buildpack.1.version\",\n\t\t\t\t},\n\t\t\t},\n\t\t}...)\n\t\tsystem.Post.Buildpacks = append(system.Post.Buildpacks, 
[]dist.ModuleRef{\n\t\t\t{\n\t\t\t\tOptional: false,\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"buildpack.2.id\",\n\t\t\t\t\tVersion: \"buildpack.2.version\",\n\t\t\t\t},\n\t\t\t},\n\t\t}...)\n\t}\n\n\treturn ifakes.NewFakeBuilderImage(t,\n\t\ttmpDir,\n\t\tbuilderName,\n\t\tdefaultBuilderStackID,\n\t\t\"1234\",\n\t\t\"5678\",\n\t\tbuilder.Metadata{\n\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t{ID: \"buildpack.1.id\", Version: \"buildpack.1.version\"},\n\t\t\t\t{ID: \"buildpack.2.id\", Version: \"buildpack.2.version\"},\n\t\t\t},\n\t\t\tExtensions: []dist.ModuleInfo{\n\t\t\t\t{ID: \"extension.1.id\", Version: \"extension.1.version\"},\n\t\t\t\t{ID: \"extension.2.id\", Version: \"extension.2.version\"},\n\t\t\t},\n\t\t\tStack: builder.StackMetadata{\n\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\tImage: runImageName,\n\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\"registry1.example.com/run/mirror\",\n\t\t\t\t\t\t\"registry2.example.com/run/mirror\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t\tLifecycle: builder.LifecycleMetadata{\n\t\t\t\tLifecycleInfo: builder.LifecycleInfo{\n\t\t\t\t\tVersion: &builder.Version{\n\t\t\t\t\t\tVersion: *semver.MustParse(lifecycleVersion),\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\tSupported: supportedBuildpackAPIs,\n\t\t\t\t\t},\n\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\tSupported: supportedPlatformAPIs,\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t},\n\t\tdist.ModuleLayers{\n\t\t\t\"buildpack.1.id\": {\n\t\t\t\t\"buildpack.1.version\": {\n\t\t\t\t\tAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tID:     defaultBuilderStackID,\n\t\t\t\t\t\t\tMixins: []string{\"mixinX\", \"build:mixinY\", \"run:mixinZ\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t\t\"buildpack.2.id\": {\n\t\t\t\t\"buildpack.2.version\": {\n\t\t\t\t\tAPI: api.MustParse(\"0.3\"),\n\t\t\t\t\tStacks: 
[]dist.Stack{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tID:     defaultBuilderStackID,\n\t\t\t\t\t\t\tMixins: []string{\"mixinX\", \"build:mixinY\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t},\n\t\tdist.Order{{\n\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"buildpack.1.id\",\n\t\t\t\t\tVersion: \"buildpack.1.version\",\n\t\t\t\t},\n\t\t\t}},\n\t\t}, {\n\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"buildpack.2.id\",\n\t\t\t\t\tVersion: \"buildpack.2.version\",\n\t\t\t\t},\n\t\t\t}},\n\t\t}},\n\t\tdist.ModuleLayers{\n\t\t\t\"extension.1.id\": {\n\t\t\t\t\"extension.1.version\": {\n\t\t\t\t\tAPI: api.MustParse(\"0.3\"),\n\t\t\t\t},\n\t\t\t},\n\t\t\t\"extension.2.id\": {\n\t\t\t\t\"extension.2.version\": {\n\t\t\t\t\tAPI: api.MustParse(\"0.3\"),\n\t\t\t\t},\n\t\t\t},\n\t\t},\n\t\tdist.Order{{\n\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"extension.1.id\",\n\t\t\t\t\tVersion: \"extension.1.version\",\n\t\t\t\t},\n\t\t\t}},\n\t\t}, {\n\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"extension.2.id\",\n\t\t\t\t\tVersion: \"extension.2.version\",\n\t\t\t\t},\n\t\t\t}},\n\t\t}},\n\t\tsystem,\n\t\tosImageCreator,\n\t)\n}\n\nfunc setAPIs(t *testing.T, image *fakes.Image, buildpackAPIs []string, platformAPIs []string) {\n\tbuilderMDLabelName := \"io.buildpacks.builder.metadata\"\n\tvar supportedBuildpackAPIs builder.APISet\n\tfor _, v := range buildpackAPIs {\n\t\tsupportedBuildpackAPIs = append(supportedBuildpackAPIs, api.MustParse(v))\n\t}\n\tvar supportedPlatformAPIs builder.APISet\n\tfor _, v := range platformAPIs {\n\t\tsupportedPlatformAPIs = append(supportedPlatformAPIs, api.MustParse(v))\n\t}\n\tbuilderMDLabel, err := image.Label(builderMDLabelName)\n\th.AssertNil(t, err)\n\tvar builderMD builder.Metadata\n\th.AssertNil(t, json.Unmarshal([]byte(builderMDLabel), 
&builderMD))\n\tbuilderMD.Lifecycle.APIs = builder.LifecycleAPIs{\n\t\tBuildpack: builder.APIVersions{\n\t\t\tSupported: supportedBuildpackAPIs,\n\t\t},\n\t\tPlatform: builder.APIVersions{\n\t\t\tSupported: supportedPlatformAPIs,\n\t\t},\n\t}\n\tbuilderMDLabelBytes, err := json.Marshal(&builderMD)\n\th.AssertNil(t, err)\n\th.AssertNil(t, image.SetLabel(builderMDLabelName, string(builderMDLabelBytes)))\n}\n"
  },
  {
    "path": "pkg/client/client.go",
    "content": "/*\nPackage client provides all the functionality provided by pack as a library through a Go API.\n\n# Prerequisites\n\nIn order to use most functionality, you will need an OCI runtime such as Docker or podman installed.\n\n# References\n\nThis package provides functionality to create and manipulate all artifacts outlined in the Cloud Native Buildpacks specification.\nAn introduction to these artifacts and their usage can be found at https://buildpacks.io/docs/.\n\nThe formal specification of the platform that pack provides can be found at: https://github.com/buildpacks/spec.\n*/\npackage client\n\nimport (\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/local\"\n\t\"github.com/buildpacks/imgutil/remote\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\tdockerClient \"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\ticonfig \"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/index\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst (\n\t// Env variable to set the root folder for manifest list local storage\n\txdgRuntimePath = \"XDG_RUNTIME_DIR\"\n)\n\nvar (\n\t// Version is the version of `pack`. 
It is injected at compile time.\n\tVersion = \"0.0.0\"\n)\n\n//go:generate mockgen -package testmocks -destination ../testmocks/mock_docker_client.go github.com/moby/moby/client APIClient\n\n//go:generate mockgen -package testmocks -destination ../testmocks/mock_image_fetcher.go github.com/buildpacks/pack/pkg/client ImageFetcher\n\n// ImageFetcher is an interface representing the ability to fetch local and remote images.\ntype ImageFetcher interface {\n\t// Fetch fetches an image by resolving it both remotely and locally depending on provided parameters.\n\t// The pull behavior is dictated by the pullPolicy, which can have the following behavior:\n\t//   - PullNever: try to use the daemon to return a `local.Image`.\n\t//   - PullIfNotPresent: try to use the daemon to return a `local.Image`; if none is found, fetch a remote image.\n\t//   - PullAlways: it will only try to fetch a remote image.\n\t//\n\t// These pull policies interact with the daemon argument.\n\t// PullIfNotPresent with daemon = false gives the same behavior as PullAlways.\n\t// There is a single invalid configuration: PullNever with daemon = false will always fail.\n\tFetch(ctx context.Context, name string, options image.FetchOptions) (imgutil.Image, error)\n\n\t// CheckReadAccess verifies if an image is accessible with read permissions.\n\t// When FetchOptions.Daemon is true and the image doesn't exist in the daemon,\n\t// the behavior is dictated by the pull policy, which can have the following behavior:\n\t//   - PullNever: returns false.\n\t//   - PullAlways or PullIfNotPresent: it will check read access for the remote image.\n\t// When FetchOptions.Daemon is false, it will check read access for the remote image.\n\tCheckReadAccess(repo string, options image.FetchOptions) bool\n\n\t// FetchForPlatform fetches an image and resolves it to a platform-specific digest before fetching.\n\t// This ensures that multi-platform images are always resolved to the correct platform-specific 
manifest.\n\tFetchForPlatform(ctx context.Context, name string, options image.FetchOptions) (imgutil.Image, error)\n}\n\n//go:generate mockgen -package testmocks -destination ../testmocks/mock_blob_downloader.go github.com/buildpacks/pack/pkg/client BlobDownloader\n\n// BlobDownloader is an interface for collecting both remote and local assets as blobs.\ntype BlobDownloader interface {\n\t// Download collects both local and remote assets and provides a blob object\n\t// used to read asset contents.\n\tDownload(ctx context.Context, pathOrURI string) (blob.Blob, error)\n}\n\n//go:generate mockgen -package testmocks -destination ../testmocks/mock_image_factory.go github.com/buildpacks/pack/pkg/client ImageFactory\n\n// ImageFactory is an interface representing the ability to create a new OCI image.\ntype ImageFactory interface {\n\t// NewImage initializes an image object with required settings so that it\n\t// can be written either locally or to a registry.\n\tNewImage(repoName string, local bool, target dist.Target) (imgutil.Image, error)\n}\n\n//go:generate mockgen -package testmocks -destination ../testmocks/mock_index_factory.go github.com/buildpacks/pack/pkg/client IndexFactory\n\n// IndexFactory is an interface representing the ability to create an ImageIndex/ManifestList.\ntype IndexFactory interface {\n\t// Exists returns true if the given index exists in local storage\n\tExists(repoName string) bool\n\t// CreateIndex creates a ManifestList locally\n\tCreateIndex(repoName string, opts ...imgutil.IndexOption) (imgutil.ImageIndex, error)\n\t// LoadIndex loads a ManifestList from local storage with the given name\n\tLoadIndex(repoName string, opts ...imgutil.IndexOption) (imgutil.ImageIndex, error)\n\t// FetchIndex fetches a ManifestList from a registry with the given name\n\tFetchIndex(name string, opts ...imgutil.IndexOption) (imgutil.ImageIndex, error)\n\t// FindIndex finds an index locally first, then on the remote registry\n\tFindIndex(name string, opts ...imgutil.IndexOption) 
(imgutil.ImageIndex, error)\n}\n\n//go:generate mockgen -package testmocks -destination ../testmocks/mock_buildpack_downloader.go github.com/buildpacks/pack/pkg/client BuildpackDownloader\n\n// BuildpackDownloader is an interface for downloading and extracting buildpacks from various sources.\ntype BuildpackDownloader interface {\n\t// Download parses a buildpack URI and downloads the buildpack and any dependency buildpacks from the appropriate source.\n\tDownload(ctx context.Context, buildpackURI string, opts buildpack.DownloadOptions) (buildpack.BuildModule, []buildpack.BuildModule, error)\n}\n\n// Client is an orchestration object; it contains all parameters needed to\n// build an app image using Cloud Native Buildpacks.\n// All settings on this object should be changed through Option functions.\ntype Client struct {\n\tlogger logging.Logger\n\tdocker DockerClient\n\n\tkeychain            authn.Keychain\n\timageFactory        ImageFactory\n\timageFetcher        ImageFetcher\n\tindexFactory        IndexFactory\n\tdownloader          BlobDownloader\n\tlifecycleExecutor   LifecycleExecutor\n\tbuildpackDownloader BuildpackDownloader\n\n\texperimental    bool\n\tregistryMirrors map[string]string\n\tversion         string\n}\n\nfunc (c *Client) processSystem(system dist.System, buildpacks []buildpack.BuildModule, disableSystem bool) (dist.System, error) {\n\tif disableSystem {\n\t\treturn dist.System{}, nil\n\t}\n\n\tif len(buildpacks) == 0 {\n\t\treturn system, nil\n\t}\n\n\tresolved := dist.System{}\n\n\t// Create a map of available buildpacks for faster lookup\n\tavailableBPs := make(map[string]bool)\n\tfor _, bp := range buildpacks {\n\t\tbpInfo := bp.Descriptor().Info()\n\t\tavailableBPs[bpInfo.ID+\"@\"+bpInfo.Version] = true\n\t}\n\n\t// Process pre-buildpacks\n\tfor _, preBp := range system.Pre.Buildpacks {\n\t\tkey := preBp.ID + \"@\" + preBp.Version\n\t\tif availableBPs[key] {\n\t\t\tresolved.Pre.Buildpacks = append(resolved.Pre.Buildpacks, 
preBp)\n\t\t} else if !preBp.Optional {\n\t\t\treturn dist.System{}, errors.Errorf(\"required system buildpack %s@%s is not available\", preBp.ID, preBp.Version)\n\t\t}\n\t}\n\n\t// Process post-buildpacks\n\tfor _, postBp := range system.Post.Buildpacks {\n\t\tkey := postBp.ID + \"@\" + postBp.Version\n\t\tif availableBPs[key] {\n\t\t\tresolved.Post.Buildpacks = append(resolved.Post.Buildpacks, postBp)\n\t\t} else if !postBp.Optional {\n\t\t\treturn dist.System{}, errors.Errorf(\"required system buildpack %s@%s is not available\", postBp.ID, postBp.Version)\n\t\t}\n\t}\n\n\treturn resolved, nil\n}\n\n// Option is a type of function that mutates settings on the client.\n// Values in these functions are set through currying.\ntype Option func(c *Client)\n\n// WithLogger supplies your own logger.\nfunc WithLogger(l logging.Logger) Option {\n\treturn func(c *Client) {\n\t\tc.logger = l\n\t}\n}\n\n// WithImageFactory supplies your own image factory.\nfunc WithImageFactory(f ImageFactory) Option {\n\treturn func(c *Client) {\n\t\tc.imageFactory = f\n\t}\n}\n\n// WithIndexFactory supplies your own index factory.\nfunc WithIndexFactory(f IndexFactory) Option {\n\treturn func(c *Client) {\n\t\tc.indexFactory = f\n\t}\n}\n\n// WithFetcher supplies your own Fetcher.\n// A Fetcher retrieves both local and remote images to make them available.\nfunc WithFetcher(f ImageFetcher) Option {\n\treturn func(c *Client) {\n\t\tc.imageFetcher = f\n\t}\n}\n\n// WithDownloader supplies your own downloader.\n// A Downloader is used to gather buildpacks from remote URLs or local sources.\nfunc WithDownloader(d BlobDownloader) Option {\n\treturn func(c *Client) {\n\t\tc.downloader = d\n\t}\n}\n\n// WithBuildpackDownloader supplies your own BuildpackDownloader.\n// A BuildpackDownloader is used to gather buildpacks from remote URLs or local sources.\nfunc WithBuildpackDownloader(d BuildpackDownloader) Option {\n\treturn func(c *Client) {\n\t\tc.buildpackDownloader = d\n\t}\n}\n\n// Deprecated: 
use WithDownloader instead.\n//\n// WithCacheDir supplies your own cache directory.\nfunc WithCacheDir(path string) Option {\n\treturn func(c *Client) {\n\t\tc.downloader = blob.NewDownloader(c.logger, path)\n\t}\n}\n\n// WithDockerClient supplies your own Docker client.\nfunc WithDockerClient(docker DockerClient) Option {\n\treturn func(c *Client) {\n\t\tc.docker = docker\n\t}\n}\n\n// WithExperimental sets whether experimental features should be enabled.\nfunc WithExperimental(experimental bool) Option {\n\treturn func(c *Client) {\n\t\tc.experimental = experimental\n\t}\n}\n\n// WithRegistryMirrors sets mirrors to pull images from.\nfunc WithRegistryMirrors(registryMirrors map[string]string) Option {\n\treturn func(c *Client) {\n\t\tc.registryMirrors = registryMirrors\n\t}\n}\n\n// WithKeychain sets the keychain of credentials for image registries.\nfunc WithKeychain(keychain authn.Keychain) Option {\n\treturn func(c *Client) {\n\t\tc.keychain = keychain\n\t}\n}\n\nconst DockerAPIVersion = \"1.38\"\n\n// NewClient allocates and returns a Client configured with the specified options.\nfunc NewClient(opts ...Option) (*Client, error) {\n\tclient := &Client{\n\t\tversion:  Version,\n\t\tkeychain: authn.DefaultKeychain,\n\t}\n\n\tfor _, opt := range opts {\n\t\topt(client)\n\t}\n\n\tif client.logger == nil {\n\t\tclient.logger = logging.NewSimpleLogger(os.Stderr)\n\t}\n\n\tif client.docker == nil {\n\t\tvar err error\n\t\tclient.docker, err = dockerClient.New(\n\t\t\tdockerClient.FromEnv,\n\t\t)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrap(err, \"creating docker client\")\n\t\t}\n\t}\n\n\tif client.downloader == nil {\n\t\tpackHome, err := iconfig.PackHome()\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrap(err, \"getting pack home\")\n\t\t}\n\t\tclient.downloader = blob.NewDownloader(client.logger, filepath.Join(packHome, \"download-cache\"))\n\t}\n\n\tif client.imageFetcher == nil {\n\t\tclient.imageFetcher = image.NewFetcher(client.logger, client.docker, 
image.WithRegistryMirrors(client.registryMirrors), image.WithKeychain(client.keychain))\n\t}\n\n\tif client.imageFactory == nil {\n\t\tclient.imageFactory = &imageFactory{\n\t\t\tdockerClient: client.docker,\n\t\t\tkeychain:     client.keychain,\n\t\t}\n\t}\n\n\tif client.indexFactory == nil {\n\t\tpackHome, err := iconfig.PackHome()\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrap(err, \"getting pack home\")\n\t\t}\n\t\tindexRootStoragePath := filepath.Join(packHome, \"manifests\")\n\t\tif xdgPath, ok := os.LookupEnv(xdgRuntimePath); ok {\n\t\t\tindexRootStoragePath = xdgPath\n\t\t}\n\t\tclient.indexFactory = index.NewIndexFactory(client.keychain, indexRootStoragePath)\n\t}\n\n\tif client.buildpackDownloader == nil {\n\t\tclient.buildpackDownloader = buildpack.NewDownloader(\n\t\t\tclient.logger,\n\t\t\tclient.imageFetcher,\n\t\t\tclient.downloader,\n\t\t\t&registryResolver{\n\t\t\t\tlogger: client.logger,\n\t\t\t},\n\t\t)\n\t}\n\n\tclient.lifecycleExecutor = build.NewLifecycleExecutor(client.logger, client.docker)\n\n\treturn client, nil\n}\n\ntype registryResolver struct {\n\tlogger logging.Logger\n}\n\nfunc (r *registryResolver) Resolve(registryName, bpName string) (string, error) {\n\tcache, err := getRegistry(r.logger, registryName)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"lookup registry %s\", style.Symbol(registryName))\n\t}\n\n\tregBuildpack, err := cache.LocateBuildpack(bpName)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"lookup buildpack %s\", style.Symbol(bpName))\n\t}\n\n\treturn regBuildpack.Address, nil\n}\n\ntype imageFactory struct {\n\tdockerClient local.DockerClient\n\tkeychain     authn.Keychain\n}\n\nfunc (f *imageFactory) NewImage(repoName string, daemon bool, target dist.Target) (imgutil.Image, error) {\n\tplatform := imgutil.Platform{OS: target.OS, Architecture: target.Arch, Variant: target.ArchVariant}\n\n\tif len(target.Distributions) > 0 {\n\t\t// We need to set platform distribution information so that it 
will be reflected in the image config.\n\t\t// We assume the given target's distributions were already expanded, so we should be dealing with just 1 distribution name and version.\n\t\tplatform.OSVersion = target.Distributions[0].Version\n\t}\n\n\tif daemon {\n\t\treturn local.NewImage(repoName, f.dockerClient, local.WithDefaultPlatform(platform))\n\t}\n\n\treturn remote.NewImage(repoName, f.keychain, remote.WithDefaultPlatform(platform))\n}\n"
  },
  {
    "path": "pkg/client/client_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"io\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/golang/mock/gomock\"\n\tdockerClient \"github.com/moby/moby/client\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestClient(t *testing.T) {\n\tspec.Run(t, \"Client\", testClient, spec.Report(report.Terminal{}))\n}\n\nfunc testClient(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#NewClient\", func() {\n\t\tit(\"default works\", func() {\n\t\t\t_, err := NewClient()\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\twhen(\"docker env is messed up\", func() {\n\t\t\tvar dockerHost string\n\t\t\tvar dockerHostKey = \"DOCKER_HOST\"\n\t\t\tit.Before(func() {\n\t\t\t\tdockerHost = os.Getenv(dockerHostKey)\n\t\t\t\th.AssertNil(t, os.Setenv(dockerHostKey, \"fake-value\"))\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNil(t, os.Setenv(dockerHostKey, dockerHost))\n\t\t\t})\n\n\t\t\tit(\"returns errors\", func() {\n\t\t\t\t_, err := NewClient()\n\t\t\t\th.AssertError(t, err, \"docker client\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#WithLogger\", func() {\n\t\tit(\"uses logger provided\", func() {\n\t\t\tvar w bytes.Buffer\n\t\t\tlogger := logging.NewSimpleLogger(&w)\n\t\t\tcl, err := NewClient(WithLogger(logger))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertSameInstance(t, cl.logger, logger)\n\t\t})\n\t})\n\n\twhen(\"#WithImageFactory\", func() {\n\t\tit(\"uses image factory provided\", func() {\n\t\t\tmockController := gomock.NewController(t)\n\t\t\tmockImageFactory := testmocks.NewMockImageFactory(mockController)\n\t\t\tcl, err := NewClient(WithImageFactory(mockImageFactory))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertSameInstance(t, cl.imageFactory, 
mockImageFactory)\n\t\t})\n\t})\n\n\twhen(\"#WithFetcher\", func() {\n\t\tit(\"uses image fetcher provided\", func() {\n\t\t\tmockController := gomock.NewController(t)\n\t\t\tmockFetcher := testmocks.NewMockImageFetcher(mockController)\n\t\t\tcl, err := NewClient(WithFetcher(mockFetcher))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertSameInstance(t, cl.imageFetcher, mockFetcher)\n\t\t})\n\t})\n\n\twhen(\"#WithDownloader\", func() {\n\t\tit(\"uses downloader provided\", func() {\n\t\t\tmockController := gomock.NewController(t)\n\t\t\tmockDownloader := testmocks.NewMockBlobDownloader(mockController)\n\t\t\tcl, err := NewClient(WithDownloader(mockDownloader))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertSameInstance(t, cl.downloader, mockDownloader)\n\t\t})\n\t})\n\n\twhen(\"#WithDockerClient\", func() {\n\t\tit(\"uses docker client provided\", func() {\n\t\t\tdocker, err := dockerClient.New(\n\t\t\t\tdockerClient.FromEnv,\n\t\t\t)\n\t\t\th.AssertNil(t, err)\n\t\t\tcl, err := NewClient(WithDockerClient(docker))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertSameInstance(t, cl.docker, docker)\n\t\t})\n\t})\n\n\twhen(\"#WithExperimental\", func() {\n\t\tit(\"sets experimental = true\", func() {\n\t\t\tcl, err := NewClient(WithExperimental(true))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, cl.experimental, true)\n\t\t})\n\n\t\tit(\"sets experimental = false\", func() {\n\t\t\tcl, err := NewClient(WithExperimental(false))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, cl.experimental, false)\n\t\t})\n\t})\n\n\twhen(\"#WithRegistryMirror\", func() {\n\t\tit(\"uses registry mirrors provided\", func() {\n\t\t\tregistryMirrors := map[string]string{\n\t\t\t\t\"index.docker.io\": \"10.0.0.1\",\n\t\t\t}\n\n\t\t\tcl, err := NewClient(WithRegistryMirrors(registryMirrors))\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, cl.registryMirrors, registryMirrors)\n\t\t})\n\t})\n\n\twhen(\"#processSystem\", func() {\n\t\tvar (\n\t\t\tsubject          *Client\n\t\t\tmockController   
*gomock.Controller\n\t\t\tavailableBPs     []buildpack.BuildModule\n\t\t\tsystemBuildpacks dist.System\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tmockController = gomock.NewController(t)\n\t\t\tsubject = &Client{}\n\n\t\t\t// Create mock buildpack modules\n\t\t\tavailableBPs = []buildpack.BuildModule{\n\t\t\t\t&mockBuildModule{id: \"example/pre-bp\", version: \"1.0.0\"},\n\t\t\t\t&mockBuildModule{id: \"example/post-bp\", version: \"2.0.0\"},\n\t\t\t\t&mockBuildModule{id: \"example/optional-bp\", version: \"3.0.0\"},\n\t\t\t}\n\t\t})\n\n\t\tit.After(func() {\n\t\t\tmockController.Finish()\n\t\t})\n\n\t\twhen(\"disableSystem is true\", func() {\n\t\t\tit(\"returns empty system\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPre: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/pre-bp\", Version: \"1.0.0\"}, Optional: false},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tresult, err := subject.processSystem(systemBuildpacks, availableBPs, true)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(result.Pre.Buildpacks), 0)\n\t\t\t\th.AssertEq(t, len(result.Post.Buildpacks), 0)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no buildpacks are available\", func() {\n\t\t\tit(\"returns the original system\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPre: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/pre-bp\", Version: \"1.0.0\"}, Optional: false},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tresult, err := subject.processSystem(systemBuildpacks, []buildpack.BuildModule{}, false)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, result, systemBuildpacks)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"all required system buildpacks are available\", func() {\n\t\t\tit(\"returns resolved system with all buildpacks\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPre: 
dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/pre-bp\", Version: \"1.0.0\"}, Optional: false},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tPost: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/post-bp\", Version: \"2.0.0\"}, Optional: false},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tresult, err := subject.processSystem(systemBuildpacks, availableBPs, false)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(result.Pre.Buildpacks), 1)\n\t\t\t\th.AssertEq(t, result.Pre.Buildpacks[0].ID, \"example/pre-bp\")\n\t\t\t\th.AssertEq(t, len(result.Post.Buildpacks), 1)\n\t\t\t\th.AssertEq(t, result.Post.Buildpacks[0].ID, \"example/post-bp\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"required system buildpack is missing\", func() {\n\t\t\tit(\"returns an error for missing pre-buildpack\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPre: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"missing/pre-bp\", Version: \"1.0.0\"}, Optional: false},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\t_, err := subject.processSystem(systemBuildpacks, availableBPs, false)\n\t\t\t\th.AssertError(t, err, \"required system buildpack missing/pre-bp@1.0.0 is not available\")\n\t\t\t})\n\n\t\t\tit(\"returns an error for missing post-buildpack\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPost: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"missing/post-bp\", Version: \"1.0.0\"}, Optional: false},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\t_, err := subject.processSystem(systemBuildpacks, availableBPs, false)\n\t\t\t\th.AssertError(t, err, \"required system buildpack missing/post-bp@1.0.0 is not available\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"optional system 
buildpack is missing\", func() {\n\t\t\tit(\"ignores missing optional pre-buildpack\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPre: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/pre-bp\", Version: \"1.0.0\"}, Optional: false},\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"missing/optional-bp\", Version: \"1.0.0\"}, Optional: true},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tresult, err := subject.processSystem(systemBuildpacks, availableBPs, false)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(result.Pre.Buildpacks), 1)\n\t\t\t\th.AssertEq(t, result.Pre.Buildpacks[0].ID, \"example/pre-bp\")\n\t\t\t})\n\n\t\t\tit(\"ignores missing optional post-buildpack\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPost: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/post-bp\", Version: \"2.0.0\"}, Optional: false},\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"missing/optional-bp\", Version: \"1.0.0\"}, Optional: true},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\tresult, err := subject.processSystem(systemBuildpacks, availableBPs, false)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, len(result.Post.Buildpacks), 1)\n\t\t\t\th.AssertEq(t, result.Post.Buildpacks[0].ID, \"example/post-bp\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"mix of available and missing buildpacks\", func() {\n\t\t\tit(\"includes available buildpacks and reports error for required missing ones\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPre: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/pre-bp\", Version: \"1.0.0\"}, Optional: false},\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"missing/required-bp\", Version: \"1.0.0\"}, Optional: false},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tPost: 
dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/post-bp\", Version: \"2.0.0\"}, Optional: true},\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"missing/optional-bp\", Version: \"1.0.0\"}, Optional: true},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\t_, err := subject.processSystem(systemBuildpacks, availableBPs, false)\n\t\t\t\th.AssertError(t, err, \"required system buildpack missing/required-bp@1.0.0 is not available\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpack version mismatch\", func() {\n\t\t\tit(\"requires exact version match\", func() {\n\t\t\t\tsystemBuildpacks = dist.System{\n\t\t\t\t\tPre: dist.SystemBuildpacks{\n\t\t\t\t\t\tBuildpacks: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"example/pre-bp\", Version: \"2.0.0\"}, Optional: false}, // wrong version\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\n\t\t\t\t_, err := subject.processSystem(systemBuildpacks, availableBPs, false)\n\t\t\t\th.AssertError(t, err, \"required system buildpack example/pre-bp@2.0.0 is not available\")\n\t\t\t})\n\t\t})\n\t})\n}\n\n// Mock implementations for testing purposes\n\n// mockDescriptor is a mock implementation of buildpack.Descriptor\ntype mockDescriptor struct {\n\tinfo dist.ModuleInfo\n}\n\nfunc (m *mockDescriptor) API() *api.Version {\n\treturn nil\n}\n\nfunc (m *mockDescriptor) EnsureStackSupport(stackID string, providedMixins []string, validateRunStageMixins bool) error {\n\treturn nil\n}\n\nfunc (m *mockDescriptor) EnsureTargetSupport(os, arch, distroName, distroVersion string) error {\n\treturn nil\n}\n\nfunc (m *mockDescriptor) EscapedID() string {\n\treturn m.info.ID\n}\n\nfunc (m *mockDescriptor) Info() dist.ModuleInfo {\n\treturn m.info\n}\n\nfunc (m *mockDescriptor) Kind() string {\n\treturn buildpack.KindBuildpack\n}\n\nfunc (m *mockDescriptor) Order() dist.Order {\n\treturn nil\n}\n\nfunc (m *mockDescriptor) Stacks() []dist.Stack {\n\treturn 
nil\n}\n\nfunc (m *mockDescriptor) Targets() []dist.Target {\n\treturn nil\n}\n\n// mockBuildModule is a mock implementation of buildpack.BuildModule for testing\ntype mockBuildModule struct {\n\tid      string\n\tversion string\n}\n\nfunc (m *mockBuildModule) Descriptor() buildpack.Descriptor {\n\treturn &mockDescriptor{\n\t\tinfo: dist.ModuleInfo{\n\t\t\tID:      m.id,\n\t\t\tVersion: m.version,\n\t\t},\n\t}\n}\n\nfunc (m *mockBuildModule) Open() (io.ReadCloser, error) {\n\treturn nil, nil\n}\n"
  },
  {
    "path": "pkg/client/common.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"errors\"\n\t\"fmt\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/registry\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nfunc (c *Client) addManifestToIndex(ctx context.Context, repoName string, index imgutil.ImageIndex) error {\n\timageRef, err := name.ParseReference(repoName, name.WeakValidation)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"'%s' is not a valid manifest reference: %s\", style.Symbol(repoName), err)\n\t}\n\n\timageToAdd, err := c.imageFetcher.Fetch(ctx, imageRef.Name(), image.FetchOptions{Daemon: false})\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tindex.AddManifest(imageToAdd.UnderlyingImage())\n\treturn nil\n}\n\nfunc (c *Client) parseTagReference(imageName string) (name.Reference, error) {\n\tif imageName == \"\" {\n\t\treturn nil, errors.New(\"image is a required parameter\")\n\t}\n\tif _, err := name.ParseReference(imageName, name.WeakValidation); err != nil {\n\t\treturn nil, fmt.Errorf(\"'%s' is not a valid tag reference: %s\", imageName, err)\n\t}\n\tref, err := name.NewTag(imageName, name.WeakValidation)\n\tif err != nil {\n\t\treturn nil, fmt.Errorf(\"'%s' is not a tag reference\", imageName)\n\t}\n\n\treturn ref, nil\n}\n\nfunc (c *Client) resolveRunImage(runImage, imgRegistry, bldrRegistry string, runImageMetadata builder.RunImageMetadata, additionalMirrors map[string][]string, publish bool, options image.FetchOptions) string {\n\tif runImage != \"\" {\n\t\tc.logger.Debugf(\"Using provided run-image %s\", style.Symbol(runImage))\n\t\treturn runImage\n\t}\n\n\tpreferredRegistry := bldrRegistry\n\tif publish || bldrRegistry == \"\" {\n\t\tpreferredRegistry = 
imgRegistry\n\t}\n\n\trunImageName := getBestRunMirror(\n\t\tpreferredRegistry,\n\t\trunImageMetadata.Image,\n\t\trunImageMetadata.Mirrors,\n\t\tadditionalMirrors[runImageMetadata.Image],\n\t\tc.imageFetcher,\n\t\toptions,\n\t)\n\n\tswitch {\n\tcase runImageName == runImageMetadata.Image:\n\t\tc.logger.Debugf(\"Selected run image %s\", style.Symbol(runImageName))\n\tcase contains(runImageMetadata.Mirrors, runImageName):\n\t\tc.logger.Debugf(\"Selected run image mirror %s\", style.Symbol(runImageName))\n\tdefault:\n\t\tc.logger.Debugf(\"Selected run image mirror %s from local config\", style.Symbol(runImageName))\n\t}\n\treturn runImageName\n}\n\nfunc getRegistry(logger logging.Logger, registryName string) (registry.Cache, error) {\n\thome, err := config.PackHome()\n\tif err != nil {\n\t\treturn registry.Cache{}, err\n\t}\n\n\tif err := config.MkdirAll(home); err != nil {\n\t\treturn registry.Cache{}, err\n\t}\n\n\tcfg, err := getConfig()\n\tif err != nil {\n\t\treturn registry.Cache{}, err\n\t}\n\n\tif registryName == \"\" {\n\t\treturn registry.NewDefaultRegistryCache(logger, home)\n\t}\n\n\tfor _, reg := range config.GetRegistries(cfg) {\n\t\tif reg.Name == registryName {\n\t\t\treturn registry.NewRegistryCache(logger, home, reg.URL)\n\t\t}\n\t}\n\n\treturn registry.Cache{}, fmt.Errorf(\"registry %s is not defined in your config file\", style.Symbol(registryName))\n}\n\nfunc getConfig() (config.Config, error) {\n\tpath, err := config.DefaultConfigPath()\n\tif err != nil {\n\t\treturn config.Config{}, err\n\t}\n\n\tcfg, err := config.Read(path)\n\tif err != nil {\n\t\treturn config.Config{}, err\n\t}\n\treturn cfg, nil\n}\n\nfunc contains(slc []string, v string) bool {\n\tfor _, s := range slc {\n\t\tif s == v {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc getBestRunMirror(registry string, runImage string, mirrors []string, preferredMirrors []string, fetcher ImageFetcher, options image.FetchOptions) string {\n\trunImageList := 
filterImageList(append(append(append([]string{}, preferredMirrors...), runImage), mirrors...), fetcher, options)\n\tfor _, img := range runImageList {\n\t\tref, err := name.ParseReference(img, name.WeakValidation)\n\t\tif err != nil {\n\t\t\tcontinue\n\t\t}\n\t\tif reg := ref.Context().RegistryStr(); reg == registry {\n\t\t\treturn img\n\t\t}\n\t}\n\n\tif len(runImageList) > 0 {\n\t\treturn runImageList[0]\n\t}\n\n\treturn runImage\n}\n\nfunc filterImageList(imageList []string, fetcher ImageFetcher, options image.FetchOptions) []string {\n\tvar accessibleImages []string\n\n\tfor i, img := range imageList {\n\t\tif fetcher.CheckReadAccess(img, options) {\n\t\t\taccessibleImages = append(accessibleImages, imageList[i])\n\t\t}\n\t}\n\n\treturn accessibleImages\n}\n"
  },
  {
    "path": "pkg/client/common_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/auth\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestCommon(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"build\", testCommon, spec.Report(report.Terminal{}))\n}\n\nfunc testCommon(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#resolveRunImage\", func() {\n\t\tvar (\n\t\t\tsubject         *Client\n\t\t\toutBuf          bytes.Buffer\n\t\t\tlogger          logging.Logger\n\t\t\tkeychain        authn.Keychain\n\t\t\trunImageName    string\n\t\t\tdefaultRegistry string\n\t\t\tdefaultMirror   string\n\t\t\tgcrRegistry     string\n\t\t\tgcrRunMirror    string\n\t\t\tstackInfo       builder.StackMetadata\n\t\t\tassert          = h.NewAssertionManager(t)\n\t\t\tpublish         bool\n\t\t\terr             error\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf)\n\n\t\t\tkeychain, err = auth.DefaultKeychain(\"pack-test/dummy\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\tsubject, err = NewClient(WithLogger(logger), WithKeychain(keychain))\n\t\t\tassert.Nil(err)\n\n\t\t\tdefaultRegistry = \"default.registry.io\"\n\t\t\trunImageName = \"stack/run\"\n\t\t\tdefaultMirror = defaultRegistry + \"/\" + runImageName\n\t\t\tgcrRegistry = \"gcr.io\"\n\t\t\tgcrRunMirror = gcrRegistry + \"/\" + runImageName\n\t\t\tstackInfo = builder.StackMetadata{\n\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\tImage: runImageName,\n\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\tdefaultMirror, 
gcrRunMirror,\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t})\n\n\t\twhen(\"passed specific run image\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tpublish = false\n\t\t\t})\n\n\t\t\tit(\"selects that run image\", func() {\n\t\t\t\trunImgFlag := \"flag/passed-run-image\"\n\t\t\t\trunImageName = subject.resolveRunImage(runImgFlag, defaultRegistry, \"\", stackInfo.RunImage, nil, publish, image.FetchOptions{Daemon: !publish, PullPolicy: image.PullAlways})\n\t\t\t\tassert.Equal(runImageName, runImgFlag)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"desirable run-images are accessible\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tpublish = true\n\t\t\t\tmockController := gomock.NewController(t)\n\t\t\t\tmockFetcher := testmocks.NewMockImageFetcher(mockController)\n\t\t\t\tmockFetcher.EXPECT().CheckReadAccess(gomock.Any(), gomock.Any()).Return(true).AnyTimes()\n\t\t\t\tsubject, err = NewClient(WithLogger(logger), WithKeychain(keychain), WithFetcher(mockFetcher))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"defaults to run-image in registry publishing to\", func() {\n\t\t\t\trunImageName = subject.resolveRunImage(\"\", gcrRegistry, defaultRegistry, stackInfo.RunImage, nil, publish, image.FetchOptions{})\n\t\t\t\tassert.Equal(runImageName, gcrRunMirror)\n\t\t\t})\n\n\t\t\tit(\"prefers config defined run image mirror to stack defined run image mirror\", func() {\n\t\t\t\tconfigMirrors := map[string][]string{\n\t\t\t\t\trunImageName: {defaultRegistry + \"/unique-run-img\"},\n\t\t\t\t}\n\t\t\t\trunImageName = subject.resolveRunImage(\"\", defaultRegistry, \"\", stackInfo.RunImage, configMirrors, publish, image.FetchOptions{})\n\t\t\t\tassert.NotEqual(runImageName, defaultMirror)\n\t\t\t\tassert.Equal(runImageName, defaultRegistry+\"/unique-run-img\")\n\t\t\t})\n\n\t\t\tit(\"returns a config mirror if no match to target registry\", func() {\n\t\t\t\tconfigMirrors := map[string][]string{\n\t\t\t\t\trunImageName: {defaultRegistry + \"/unique-run-img\"},\n\t\t\t\t}\n\t\t\t\trunImageName = 
subject.resolveRunImage(\"\", \"test.registry.io\", \"\", stackInfo.RunImage, configMirrors, publish, image.FetchOptions{})\n\t\t\t\tassert.NotEqual(runImageName, defaultMirror)\n\t\t\t\tassert.Equal(runImageName, defaultRegistry+\"/unique-run-img\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"desirable run-images are not accessible\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tpublish = true\n\n\t\t\t\tmockController := gomock.NewController(t)\n\t\t\t\tmockFetcher := testmocks.NewMockImageFetcher(mockController)\n\t\t\t\tmockFetcher.EXPECT().CheckReadAccess(gcrRunMirror, gomock.Any()).Return(false)\n\t\t\t\tmockFetcher.EXPECT().CheckReadAccess(stackInfo.RunImage.Image, gomock.Any()).Return(false)\n\t\t\t\tmockFetcher.EXPECT().CheckReadAccess(defaultMirror, gomock.Any()).Return(true)\n\n\t\t\t\tsubject, err = NewClient(WithLogger(logger), WithKeychain(keychain), WithFetcher(mockFetcher))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"selects the first accessible run-image\", func() {\n\t\t\t\trunImageName = subject.resolveRunImage(\"\", gcrRegistry, defaultRegistry, stackInfo.RunImage, nil, publish, image.FetchOptions{})\n\t\t\t\tassert.Equal(runImageName, defaultMirror)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"desirable run-images are empty\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tpublish = false\n\t\t\t\tstackInfo = builder.StackMetadata{\n\t\t\t\t\tRunImage: builder.RunImageMetadata{\n\t\t\t\t\t\tImage: \"stack/run-image\",\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"selects the builder run-image\", func() {\n\t\t\t\t// issue: https://github.com/buildpacks/pack/issues/2078\n\t\t\t\trunImageName = subject.resolveRunImage(\"\", \"\", \"\", stackInfo.RunImage, nil, publish, image.FetchOptions{})\n\t\t\t\tassert.Equal(runImageName, \"stack/run-image\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/create_builder.go",
    "content": "package client\n\nimport (\n\t\"archive/tar\"\n\t\"context\"\n\t\"fmt\"\n\t\"io\"\n\tOS \"os\"\n\t\"path/filepath\"\n\t\"sort\"\n\t\"strings\"\n\n\t\"github.com/buildpacks/pack/internal/name\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/pkg/errors\"\n\t\"golang.org/x/text/cases\"\n\t\"golang.org/x/text/language\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\n// CreateBuilderOptions is a configuration object used to change the behavior of\n// CreateBuilder.\ntype CreateBuilderOptions struct {\n\t// The base directory to use to resolve relative assets\n\tRelativeBaseDir string\n\n\t// Name of the builder.\n\tBuilderName string\n\n\t// BuildConfigEnv for Builder\n\tBuildConfigEnv map[string]string\n\n\t// Map of labels to add to the Buildpack\n\tLabels map[string]string\n\n\t// Configuration that defines the functionality a builder provides.\n\tConfig pubbldr.Config\n\n\t// Skip building image locally, directly publish to a registry.\n\t// Requires BuilderName to be a valid registry location.\n\tPublish bool\n\n\t// Append [os]-[arch] suffix to the image tag when publishing a multi-arch to a registry\n\t// Requires Publish to be true\n\tAppendImageNameSuffix bool\n\n\t// Buildpack registry name. 
Defines where all registry buildpacks will be pulled from.\n\tRegistry string\n\n\t// Strategy for updating images before a build.\n\tPullPolicy image.PullPolicy\n\n\t// List of modules to be flattened\n\tFlatten buildpack.FlattenModuleInfos\n\n\t// Target platforms to build builder images for\n\tTargets []dist.Target\n\n\t// Temporary directory to use for downloading lifecycle images.\n\tTempDirectory string\n\n\t// Additional image tags to push to, each will contain contents identical to Image\n\tAdditionalTags []string\n}\n\n// CreateBuilder creates and saves a builder image to a registry with the provided options.\n// If any configuration is invalid, it will error and exit without creating any images.\nfunc (c *Client) CreateBuilder(ctx context.Context, opts CreateBuilderOptions) error {\n\ttargets, err := c.processBuilderCreateTargets(ctx, opts)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif len(targets) == 0 {\n\t\t_, err = c.createBuilderTarget(ctx, opts, nil, false)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t} else {\n\t\tvar digests []string\n\t\tmultiArch := len(targets) > 1 && opts.Publish\n\n\t\tfor _, target := range targets {\n\t\t\tdigest, err := c.createBuilderTarget(ctx, opts, &target, multiArch)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\tdigests = append(digests, digest)\n\t\t}\n\n\t\tif multiArch && len(digests) > 1 {\n\t\t\treturn c.CreateManifest(ctx, CreateManifestOptions{\n\t\t\t\tIndexRepoName: opts.BuilderName,\n\t\t\t\tRepoNames:     digests,\n\t\t\t\tPublish:       true,\n\t\t\t})\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc (c *Client) createBuilderTarget(ctx context.Context, opts CreateBuilderOptions, target *dist.Target, multiArch bool) (string, error) {\n\tif err := c.validateConfig(ctx, opts, target); err != nil {\n\t\treturn \"\", err\n\t}\n\n\tbldr, err := c.createBaseBuilder(ctx, opts, target, multiArch)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"failed to create builder\")\n\t}\n\n\tif err := 
c.addBuildpacksToBuilder(ctx, opts, bldr); err != nil {\n\t\treturn \"\", errors.Wrap(err, \"failed to add buildpacks to builder\")\n\t}\n\n\tif err := c.addExtensionsToBuilder(ctx, opts, bldr); err != nil {\n\t\treturn \"\", errors.Wrap(err, \"failed to add extensions to builder\")\n\t}\n\n\tbldr.SetOrder(opts.Config.Order)\n\tbldr.SetOrderExtensions(opts.Config.OrderExtensions)\n\tbldr.SetSystem(opts.Config.System)\n\n\tif opts.Config.Stack.ID != \"\" {\n\t\tbldr.SetStack(opts.Config.Stack)\n\t}\n\tbldr.SetRunImage(opts.Config.Run)\n\tbldr.SetBuildConfigEnv(opts.BuildConfigEnv)\n\n\terr = bldr.Save(c.logger, builder.CreatorMetadata{Version: c.version}, opts.AdditionalTags...)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tif multiArch {\n\t\t// We need to keep the identifier to create the image index\n\t\tid, err := bldr.Image().Identifier()\n\t\tif err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"determining image manifest digest\")\n\t\t}\n\t\treturn id.String(), nil\n\t}\n\treturn \"\", nil\n}\n\nfunc (c *Client) validateConfig(ctx context.Context, opts CreateBuilderOptions, target *dist.Target) error {\n\tif err := pubbldr.ValidateConfig(opts.Config); err != nil {\n\t\treturn errors.Wrap(err, \"invalid builder config\")\n\t}\n\n\tif err := c.validateRunImageConfig(ctx, opts, target); err != nil {\n\t\treturn errors.Wrap(err, \"invalid run image config\")\n\t}\n\n\treturn nil\n}\n\nfunc (c *Client) validateRunImageConfig(ctx context.Context, opts CreateBuilderOptions, target *dist.Target) error {\n\tvar runImages []imgutil.Image\n\tfor _, r := range opts.Config.Run.Images {\n\t\tfor _, i := range append([]string{r.Image}, r.Mirrors...) 
{\n\t\t\tif !opts.Publish {\n\t\t\t\timg, err := c.imageFetcher.Fetch(ctx, i, image.FetchOptions{Daemon: true, PullPolicy: opts.PullPolicy, Target: target})\n\t\t\t\tif err != nil {\n\t\t\t\t\tif errors.Cause(err) != image.ErrNotFound {\n\t\t\t\t\t\treturn errors.Wrap(err, \"failed to fetch image\")\n\t\t\t\t\t}\n\t\t\t\t} else {\n\t\t\t\t\trunImages = append(runImages, img)\n\t\t\t\t\tcontinue\n\t\t\t\t}\n\t\t\t}\n\n\t\t\timg, err := c.imageFetcher.Fetch(ctx, i, image.FetchOptions{Daemon: false, PullPolicy: opts.PullPolicy, Target: target})\n\t\t\tif err != nil {\n\t\t\t\tif errors.Cause(err) != image.ErrNotFound {\n\t\t\t\t\treturn errors.Wrap(err, \"failed to fetch image\")\n\t\t\t\t}\n\t\t\t\tc.logger.Warnf(\"run image %s is not accessible\", style.Symbol(i))\n\t\t\t} else {\n\t\t\t\trunImages = append(runImages, img)\n\t\t\t}\n\t\t}\n\t}\n\n\tfor _, img := range runImages {\n\t\tif opts.Config.Stack.ID != \"\" {\n\t\t\tstackID, err := img.Label(\"io.buildpacks.stack.id\")\n\t\t\tif err != nil {\n\t\t\t\treturn errors.Wrap(err, \"failed to label image\")\n\t\t\t}\n\n\t\t\tif stackID != opts.Config.Stack.ID {\n\t\t\t\treturn fmt.Errorf(\n\t\t\t\t\t\"stack %s from builder config is incompatible with stack %s from run image %s\",\n\t\t\t\t\tstyle.Symbol(opts.Config.Stack.ID),\n\t\t\t\t\tstyle.Symbol(stackID),\n\t\t\t\t\tstyle.Symbol(img.Name()),\n\t\t\t\t)\n\t\t\t}\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc (c *Client) createBaseBuilder(ctx context.Context, opts CreateBuilderOptions, target *dist.Target, multiArch bool) (*builder.Builder, error) {\n\tbaseImage, err := c.imageFetcher.Fetch(ctx, opts.Config.Build.Image, image.FetchOptions{Daemon: !opts.Publish, PullPolicy: opts.PullPolicy, Target: target})\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"fetch build image\")\n\t}\n\n\tc.logger.Debugf(\"Creating builder %s from build-image %s\", style.Symbol(opts.BuilderName), style.Symbol(baseImage.Name()))\n\n\tvar builderOpts []builder.BuilderOption\n\tif 
opts.Flatten != nil && len(opts.Flatten.FlattenModules()) > 0 {\n\t\tbuilderOpts = append(builderOpts, builder.WithFlattened(opts.Flatten))\n\t}\n\tif len(opts.Labels) > 0 {\n\t\tbuilderOpts = append(builderOpts, builder.WithLabels(opts.Labels))\n\t}\n\n\tbuilderName := opts.BuilderName\n\tif multiArch && opts.AppendImageNameSuffix {\n\t\tbuilderName, err = name.AppendSuffix(builderName, *target)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrap(err, \"invalid image name\")\n\t\t}\n\t}\n\n\tbldr, err := builder.New(baseImage, builderName, builderOpts...)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"invalid build-image\")\n\t}\n\n\tarchitecture, err := baseImage.Architecture()\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"lookup image Architecture\")\n\t}\n\n\tos, err := baseImage.OS()\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"lookup image OS\")\n\t}\n\n\tif os == \"windows\" && !c.experimental {\n\t\treturn nil, NewExperimentError(\"Windows containers support is currently experimental.\")\n\t}\n\n\tbldr.SetDescription(opts.Config.Description)\n\n\tif opts.Config.Stack.ID != \"\" && bldr.StackID != opts.Config.Stack.ID {\n\t\treturn nil, fmt.Errorf(\n\t\t\t\"stack %s from builder config is incompatible with stack %s from build image\",\n\t\t\tstyle.Symbol(opts.Config.Stack.ID),\n\t\t\tstyle.Symbol(bldr.StackID),\n\t\t)\n\t}\n\n\tlifecycle, err := c.fetchLifecycle(ctx, opts, os, architecture)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"fetch lifecycle\")\n\t}\n\n\t// Validate lifecycle version for image extensions\n\tif err := c.validateLifecycleVersion(opts.Config, lifecycle); err != nil {\n\t\treturn nil, err\n\t}\n\tbldr.SetLifecycle(lifecycle)\n\tbldr.SetBuildConfigEnv(opts.BuildConfigEnv)\n\n\treturn bldr, nil\n}\n\nfunc (c *Client) fetchLifecycle(ctx context.Context, opts CreateBuilderOptions, os string, architecture string) (builder.Lifecycle, error) {\n\tconfig := opts.Config.Lifecycle\n\tif config.Version != \"\" 
&& config.URI != \"\" {\n\t\treturn nil, errors.Errorf(\n\t\t\t\"%s can only declare %s or %s, not both\",\n\t\t\tstyle.Symbol(\"lifecycle\"), style.Symbol(\"version\"), style.Symbol(\"uri\"),\n\t\t)\n\t}\n\n\tvar uri string\n\tvar err error\n\tswitch {\n\tcase buildpack.HasDockerLocator(config.URI):\n\t\turi, err = c.uriFromLifecycleImage(ctx, opts.TempDirectory, config)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrap(err, \"Could not parse uri from lifecycle image\")\n\t\t}\n\tcase config.Version != \"\":\n\t\tv, err := semver.NewVersion(config.Version)\n\t\tif err != nil {\n\t\t\treturn nil, errors.Wrapf(err, \"%s must be a valid semver\", style.Symbol(\"lifecycle.version\"))\n\t\t}\n\n\t\turi = c.uriFromLifecycleVersion(*v, os, architecture)\n\tcase config.URI != \"\":\n\t\turi, err = paths.FilePathToURI(config.URI, opts.RelativeBaseDir)\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\tdefault:\n\t\turi = c.uriFromLifecycleVersion(*semver.MustParse(builder.DefaultLifecycleVersion), os, architecture)\n\t}\n\n\tblob, err := c.downloader.Download(ctx, uri)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"downloading lifecycle\")\n\t}\n\n\tlifecycle, err := builder.NewLifecycle(blob)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"invalid lifecycle\")\n\t}\n\n\treturn lifecycle, nil\n}\n\nfunc (c *Client) addBuildpacksToBuilder(ctx context.Context, opts CreateBuilderOptions, bldr *builder.Builder) error {\n\tfor _, b := range opts.Config.Buildpacks {\n\t\tif err := c.addConfig(ctx, buildpack.KindBuildpack, b, opts, bldr); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\treturn nil\n}\n\nfunc (c *Client) addExtensionsToBuilder(ctx context.Context, opts CreateBuilderOptions, bldr *builder.Builder) error {\n\tfor _, e := range opts.Config.Extensions {\n\t\tif err := c.addConfig(ctx, buildpack.KindExtension, e, opts, bldr); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\treturn nil\n}\n\nfunc (c *Client) addConfig(ctx context.Context, kind string, 
config pubbldr.ModuleConfig, opts CreateBuilderOptions, bldr *builder.Builder) error {\n\tc.logger.Debugf(\"Looking up %s %s\", kind, style.Symbol(config.DisplayString()))\n\n\tbuilderOS, err := bldr.Image().OS()\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"getting builder OS\")\n\t}\n\tbuilderArch, err := bldr.Image().Architecture()\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"getting builder architecture\")\n\t}\n\n\ttarget := &dist.Target{OS: builderOS, Arch: builderArch}\n\tc.logger.Debugf(\"Downloading buildpack for platform: %s\", target.ValuesAsPlatform())\n\n\tmainBP, depBPs, err := c.buildpackDownloader.Download(ctx, config.URI, buildpack.DownloadOptions{\n\t\tDaemon:          !opts.Publish,\n\t\tImageName:       config.ImageName,\n\t\tModuleKind:      kind,\n\t\tPullPolicy:      opts.PullPolicy,\n\t\tRegistryName:    opts.Registry,\n\t\tRelativeBaseDir: opts.RelativeBaseDir,\n\t\tTarget:          target,\n\t})\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"downloading %s\", kind)\n\t}\n\terr = validateModule(kind, mainBP, config.URI, config.ID, config.Version)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"invalid %s\", kind)\n\t}\n\n\tbpDesc := mainBP.Descriptor()\n\tfor _, deprecatedAPI := range bldr.LifecycleDescriptor().APIs.Buildpack.Deprecated {\n\t\tif deprecatedAPI.Equal(bpDesc.API()) {\n\t\t\tc.logger.Warnf(\n\t\t\t\t\"%s %s is using deprecated Buildpacks API version %s\",\n\t\t\t\tcases.Title(language.AmericanEnglish).String(kind),\n\t\t\t\tstyle.Symbol(bpDesc.Info().FullName()),\n\t\t\t\tstyle.Symbol(bpDesc.API().String()),\n\t\t\t)\n\t\t\tbreak\n\t\t}\n\t}\n\n\t// Fixes 1453\n\tsort.Slice(depBPs, func(i, j int) bool {\n\t\tcompareID := strings.Compare(depBPs[i].Descriptor().Info().ID, depBPs[j].Descriptor().Info().ID)\n\t\tif compareID == 0 {\n\t\t\treturn strings.Compare(depBPs[i].Descriptor().Info().Version, depBPs[j].Descriptor().Info().Version) <= 0\n\t\t}\n\t\treturn compareID < 0\n\t})\n\n\tswitch kind {\n\tcase 
buildpack.KindBuildpack:\n\t\tbldr.AddBuildpacks(mainBP, depBPs)\n\tcase buildpack.KindExtension:\n\t\t// Extensions can't be composite\n\t\tbldr.AddExtension(mainBP)\n\tdefault:\n\t\treturn fmt.Errorf(\"unknown module kind: %s\", kind)\n\t}\n\treturn nil\n}\n\nfunc (c *Client) processBuilderCreateTargets(ctx context.Context, opts CreateBuilderOptions) ([]dist.Target, error) {\n\tvar targets []dist.Target\n\n\tif len(opts.Targets) > 0 {\n\t\tif opts.Publish {\n\t\t\ttargets = opts.Targets\n\t\t} else {\n\t\t\t// find a target that matches the daemon\n\t\t\tdaemonTarget, err := c.daemonTarget(ctx, opts.Targets)\n\t\t\tif err != nil {\n\t\t\t\treturn targets, err\n\t\t\t}\n\t\t\ttargets = append(targets, daemonTarget)\n\t\t}\n\t}\n\treturn targets, nil\n}\n\nfunc validateModule(kind string, module buildpack.BuildModule, source, expectedID, expectedVersion string) error {\n\tinfo := module.Descriptor().Info()\n\tif expectedID != \"\" && info.ID != expectedID {\n\t\treturn fmt.Errorf(\n\t\t\t\"%s from URI %s has ID %s which does not match ID %s from builder config\",\n\t\t\tkind,\n\t\t\tstyle.Symbol(source),\n\t\t\tstyle.Symbol(info.ID),\n\t\t\tstyle.Symbol(expectedID),\n\t\t)\n\t}\n\n\tif expectedVersion != \"\" && info.Version != expectedVersion {\n\t\treturn fmt.Errorf(\n\t\t\t\"%s from URI %s has version %s which does not match version %s from builder config\",\n\t\t\tkind,\n\t\t\tstyle.Symbol(source),\n\t\t\tstyle.Symbol(info.Version),\n\t\t\tstyle.Symbol(expectedVersion),\n\t\t)\n\t}\n\n\treturn nil\n}\n\nfunc (c *Client) uriFromLifecycleVersion(version semver.Version, os string, architecture string) string {\n\tarch := \"x86-64\"\n\n\tif os == \"windows\" {\n\t\treturn fmt.Sprintf(\"https://github.com/buildpacks/lifecycle/releases/download/v%s/lifecycle-v%s+windows.%s.tgz\", version.String(), version.String(), arch)\n\t}\n\n\tif architecture == \"amd64\" {\n\t\tarchitecture = \"x86-64\"\n\t}\n\n\tif builder.SupportedLinuxArchitecture(architecture) {\n\t\tarch = 
architecture\n\t} else {\n\t\t// FIXME: this should probably be an error case in the future, see https://github.com/buildpacks/pack/issues/2163\n\t\tc.logger.Warnf(\"failed to find a lifecycle binary for requested architecture %s, defaulting to %s\", style.Symbol(architecture), style.Symbol(arch))\n\t}\n\n\treturn fmt.Sprintf(\"https://github.com/buildpacks/lifecycle/releases/download/v%s/lifecycle-v%s+linux.%s.tgz\", version.String(), version.String(), arch)\n}\n\nfunc stripTopDirAndWrite(layerReader io.ReadCloser, outputPath string) (*OS.File, error) {\n\tfile, err := OS.Create(outputPath)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\ttarWriter := tar.NewWriter(file)\n\ttarReader := tar.NewReader(layerReader)\n\ttarReader.Next()\n\n\tfor {\n\t\theader, err := tarReader.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tif err != nil {\n\t\t\treturn nil, err\n\t\t}\n\n\t\tpathSep := string(OS.PathSeparator)\n\t\tcnbPrefix := fmt.Sprintf(\"%scnb%s\", pathSep, pathSep)\n\t\tnewHeader := *header\n\t\tnewHeader.Name = strings.TrimPrefix(header.Name, cnbPrefix)\n\n\t\tif err := tarWriter.WriteHeader(&newHeader); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\n\t\tif _, err := io.Copy(tarWriter, tarReader); err != nil {\n\t\t\treturn nil, err\n\t\t}\n\t}\n\n\treturn file, nil\n}\n\nfunc (c *Client) uriFromLifecycleImage(ctx context.Context, basePath string, config pubbldr.LifecycleConfig) (uri string, err error) {\n\tvar lifecycleImage imgutil.Image\n\timageName := buildpack.ParsePackageLocator(config.URI)\n\tc.logger.Debugf(\"Downloading lifecycle image: %s\", style.Symbol(imageName))\n\n\tlifecycleImage, err = c.imageFetcher.Fetch(ctx, imageName, image.FetchOptions{Daemon: false})\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tlifecyclePath := filepath.Join(basePath, \"lifecycle.tar\")\n\tunderlyingImage := lifecycleImage.UnderlyingImage()\n\tif underlyingImage == nil {\n\t\treturn \"\", errors.New(\"lifecycle image has no underlying 
image\")\n\t}\n\n\tlayers, err := underlyingImage.Layers()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tif len(layers) == 0 {\n\t\treturn \"\", errors.New(\"lifecycle image has no layers\")\n\t}\n\n\t// Assume the last layer has the lifecycle\n\tlifecycleLayer := layers[len(layers)-1]\n\n\tlayerReader, err := lifecycleLayer.Uncompressed()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\tdefer layerReader.Close()\n\n\tfile, err := stripTopDirAndWrite(layerReader, lifecyclePath)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tdefer file.Close()\n\n\turi, err = paths.FilePathToURI(lifecyclePath, \"\")\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\treturn uri, err\n}\n\nfunc hasExtensions(builderConfig pubbldr.Config) bool {\n\treturn len(builderConfig.Extensions) > 0 || len(builderConfig.OrderExtensions) > 0\n}\n\nfunc (c *Client) validateLifecycleVersion(builderConfig pubbldr.Config, lifecycle builder.Lifecycle) error {\n\tif !hasExtensions(builderConfig) {\n\t\treturn nil\n\t}\n\n\tdescriptor := lifecycle.Descriptor()\n\n\t// Extensions are stable starting from Platform API 0.13\n\t// Check the latest supported Platform API version\n\tif len(descriptor.APIs.Platform.Supported) == 0 {\n\t\t// No Platform API information available, skip validation\n\t\treturn nil\n\t}\n\n\tplatformAPI := descriptor.APIs.Platform.Supported.Latest()\n\tif platformAPI.LessThan(\"0.13\") {\n\t\tif !c.experimental {\n\t\t\treturn errors.Errorf(\n\t\t\t\t\"builder config contains image extensions, but the lifecycle Platform API version (%s) is older than 0.13; \"+\n\t\t\t\t\t\"support for image extensions with Platform API < 0.13 is currently experimental\",\n\t\t\t\tplatformAPI.String(),\n\t\t\t)\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/client/create_builder_test.go",
    "content": "package client_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/google/go-containerregistry/pkg/v1/tarball\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\tmobysystem \"github.com/moby/moby/api/types/system\"\n\tdockerclient \"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestCreateBuilder(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"create_builder\", testCreateBuilder, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testCreateBuilder(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#CreateBuilder\", func() {\n\t\tvar (\n\t\t\tmockController          *gomock.Controller\n\t\t\tmockDownloader          *testmocks.MockBlobDownloader\n\t\t\tmockBuildpackDownloader *testmocks.MockBuildpackDownloader\n\t\t\tmockImageFactory        *testmocks.MockImageFactory\n\t\t\tmockImageFetcher        *testmocks.MockImageFetcher\n\t\t\tmockDockerClient        
*testmocks.MockAPIClient\n\t\t\tfakeBuildImage          *fakes.Image\n\t\t\tfakeRunImage            *fakes.Image\n\t\t\tfakeRunImageMirror      *fakes.Image\n\t\t\topts                    client.CreateBuilderOptions\n\t\t\tsubject                 *client.Client\n\t\t\tlogger                  logging.Logger\n\t\t\tout                     bytes.Buffer\n\t\t\ttmpDir                  string\n\t\t)\n\t\tvar prepareFetcherWithRunImages = func() {\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/run-image\", gomock.Any()).Return(fakeRunImage, nil).AnyTimes()\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"localhost:5000/some/run-image\", gomock.Any()).Return(fakeRunImageMirror, nil).AnyTimes()\n\t\t}\n\n\t\tvar prepareFetcherWithBuildImage = func() {\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", gomock.Any()).Return(fakeBuildImage, nil)\n\t\t}\n\n\t\tvar prepareExtensions = func() {\n\t\t\t// Extensions require Platform API >= 0.13\n\t\t\topts.Config.Lifecycle.URI = \"file:///some-lifecycle-platform-0-13\"\n\t\t\topts.Config.Extensions = []pubbldr.ModuleConfig{\n\t\t\t\t{\n\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"ext.one\", Version: \"1.2.3\", Homepage: \"http://one.extension\"},\n\t\t\t\t\tImageOrURI: dist.ImageOrURI{\n\t\t\t\t\t\tBuildpackURI: dist.BuildpackURI{\n\t\t\t\t\t\t\tURI: \"https://example.fake/ext-one.tgz\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}\n\t\t\topts.Config.OrderExtensions = []dist.OrderEntry{{\n\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"ext.one\", Version: \"1.2.3\"}, Optional: true},\n\t\t\t\t}},\n\t\t\t}\n\t\t}\n\n\t\tvar createBuildpack = func(descriptor dist.BuildpackDescriptor) buildpack.BuildModule {\n\t\t\tbuildpack, err := ifakes.NewFakeBuildpack(descriptor, 0644)\n\t\t\th.AssertNil(t, err)\n\t\t\treturn buildpack\n\t\t}\n\n\t\tvar shouldCallBuildpackDownloaderWith = func(uri string, buildpackDownloadOptions buildpack.DownloadOptions) 
{\n\t\t\tbuildpack := createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\tWithAPI:    api.MustParse(\"0.3\"),\n\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"example/foo\", Version: \"1.1.0\"},\n\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t})\n\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), uri, gomock.Any()).Return(buildpack, nil, nil)\n\t\t}\n\n\t\tit.Before(func() {\n\t\t\tlogger = logging.NewLogWithWriters(&out, &out, logging.WithVerbose())\n\t\t\tmockController = gomock.NewController(t)\n\t\t\tmockDownloader = testmocks.NewMockBlobDownloader(mockController)\n\t\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\t\t\tmockImageFactory = testmocks.NewMockImageFactory(mockController)\n\t\t\tmockDockerClient = testmocks.NewMockAPIClient(mockController)\n\t\t\tmockBuildpackDownloader = testmocks.NewMockBuildpackDownloader(mockController)\n\n\t\t\tfakeBuildImage = fakes.NewImage(\"some/build-image\", \"\", nil)\n\t\t\th.AssertNil(t, fakeBuildImage.SetLabel(\"io.buildpacks.stack.id\", \"some.stack.id\"))\n\t\t\th.AssertNil(t, fakeBuildImage.SetLabel(\"io.buildpacks.stack.mixins\", `[\"mixinX\", \"build:mixinY\"]`))\n\t\t\th.AssertNil(t, fakeBuildImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\t\th.AssertNil(t, fakeBuildImage.SetEnv(\"CNB_GROUP_ID\", \"4321\"))\n\n\t\t\tfakeRunImage = fakes.NewImage(\"some/run-image\", \"\", nil)\n\t\t\th.AssertNil(t, fakeRunImage.SetLabel(\"io.buildpacks.stack.id\", \"some.stack.id\"))\n\n\t\t\tfakeRunImageMirror = fakes.NewImage(\"localhost:5000/some/run-image\", \"\", nil)\n\t\t\th.AssertNil(t, fakeRunImageMirror.SetLabel(\"io.buildpacks.stack.id\", \"some.stack.id\"))\n\n\t\t\texampleBuildpackBlob := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack\"))\n\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/bp-one.tgz\").Return(exampleBuildpackBlob, nil).AnyTimes()\n\t\t\texampleExtensionBlob := blob.NewBlob(filepath.Join(\"testdata\", 
\"extension\"))\n\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/ext-one.tgz\").Return(exampleExtensionBlob, nil).AnyTimes()\n\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"some/buildpack/dir\").Return(blob.NewBlob(filepath.Join(\"testdata\", \"buildpack\")), nil).AnyTimes()\n\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"file:///some-lifecycle\").Return(blob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")), nil).AnyTimes()\n\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"file:///some-lifecycle-platform-0-1\").Return(blob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.3\")), nil).AnyTimes()\n\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"file:///some-lifecycle-platform-0-13\").Return(blob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.13\")), nil).AnyTimes()\n\n\t\t\tbp, err := buildpack.FromBuildpackRootBlob(exampleBuildpackBlob, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, err)\n\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/bp-one.tgz\", gomock.Any()).Return(bp, nil, nil).AnyTimes()\n\t\t\text, err := buildpack.FromExtensionRootBlob(exampleExtensionBlob, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, err)\n\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/ext-one.tgz\", gomock.Any()).Return(ext, nil, nil).AnyTimes()\n\n\t\t\tsubject, err = client.NewClient(\n\t\t\t\tclient.WithLogger(logger),\n\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\t\tclient.WithDockerClient(mockDockerClient),\n\t\t\t\tclient.WithBuildpackDownloader(mockBuildpackDownloader),\n\t\t\t)\n\t\t\th.AssertNil(t, err)\n\n\t\t\tmockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, 
nil).AnyTimes()\n\n\t\t\topts = client.CreateBuilderOptions{\n\t\t\t\tRelativeBaseDir: \"/\",\n\t\t\t\tBuilderName:     \"some/builder\",\n\t\t\t\tConfig: pubbldr.Config{\n\t\t\t\t\tDescription: \"Some description\",\n\t\t\t\t\tBuildpacks: []pubbldr.ModuleConfig{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.one\", Version: \"1.2.3\", Homepage: \"http://one.buildpack\"},\n\t\t\t\t\t\t\tImageOrURI: dist.ImageOrURI{\n\t\t\t\t\t\t\t\tBuildpackURI: dist.BuildpackURI{\n\t\t\t\t\t\t\t\t\tURI: \"https://example.fake/bp-one.tgz\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tOrder: []dist.OrderEntry{{\n\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"bp.one\", Version: \"1.2.3\"}, Optional: false},\n\t\t\t\t\t\t}},\n\t\t\t\t\t},\n\t\t\t\t\tStack: pubbldr.StackConfig{\n\t\t\t\t\t\tID: \"some.stack.id\",\n\t\t\t\t\t},\n\t\t\t\t\tRun: pubbldr.RunConfig{\n\t\t\t\t\t\tImages: []pubbldr.RunImageConfig{{\n\t\t\t\t\t\t\tImage:   \"some/run-image\",\n\t\t\t\t\t\t\tMirrors: []string{\"localhost:5000/some/run-image\"},\n\t\t\t\t\t\t}},\n\t\t\t\t\t},\n\t\t\t\t\tBuild: pubbldr.BuildConfig{\n\t\t\t\t\t\tImage: \"some/build-image\",\n\t\t\t\t\t},\n\t\t\t\t\tLifecycle: pubbldr.LifecycleConfig{URI: \"file:///some-lifecycle\"},\n\t\t\t\t},\n\t\t\t\tPublish:    false,\n\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t}\n\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"create-builder-test\")\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\tmockController.Finish()\n\t\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t\t})\n\n\t\tvar successfullyCreateBuilder = func() *builder.Builder {\n\t\t\tt.Helper()\n\n\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, fakeBuildImage.IsSaved(), true)\n\t\t\tbldr, err := builder.FromImage(fakeBuildImage)\n\t\t\th.AssertNil(t, err)\n\n\t\t\treturn bldr\n\t\t}\n\n\t\twhen(\"validating the builder config\", 
func() {\n\t\t\tit(\"should not fail when the stack ID is empty\", func() {\n\t\t\t\topts.Config.Stack.ID = \"\"\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"should fail when the stack ID from the builder config does not match the stack ID from the build image\", func() {\n\t\t\t\th.AssertNil(t, fakeBuildImage.SetLabel(\"io.buildpacks.stack.id\", \"other.stack.id\"))\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"stack 'some.stack.id' from builder config is incompatible with stack 'other.stack.id' from build image\")\n\t\t\t})\n\n\t\t\tit(\"should not fail when the stack is empty\", func() {\n\t\t\t\topts.Config.Stack.ID = \"\"\n\t\t\t\topts.Config.Stack.BuildImage = \"\"\n\t\t\t\topts.Config.Stack.RunImage = \"\"\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"should fail when the run images and stack are empty\", func() {\n\t\t\t\topts.Config.Stack.BuildImage = \"\"\n\t\t\t\topts.Config.Stack.RunImage = \"\"\n\n\t\t\t\topts.Config.Run = pubbldr.RunConfig{}\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"run.images are required\")\n\t\t\t})\n\n\t\t\tit(\"should fail when the run images image and stack are empty\", func() {\n\t\t\t\topts.Config.Stack.BuildImage = \"\"\n\t\t\t\topts.Config.Stack.RunImage = \"\"\n\n\t\t\t\topts.Config.Run = pubbldr.RunConfig{\n\t\t\t\t\tImages: []pubbldr.RunImageConfig{{}},\n\t\t\t\t}\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"run.images.image is required\")\n\t\t\t})\n\n\t\t\tit(\"should fail if stack and run 
image are different\", func() {\n\t\t\t\topts.Config.Stack.RunImage = \"some-other-stack-run-image\"\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"run.images and stack.run-image do not match\")\n\t\t\t})\n\n\t\t\tit(\"should fail if stack and build image are different\", func() {\n\t\t\t\topts.Config.Stack.BuildImage = \"some-other-stack-build-image\"\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"build.image and stack.build-image do not match\")\n\t\t\t})\n\n\t\t\tit(\"should fail when lifecycle version is not a semver\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\topts.Config.Lifecycle.Version = \"not-semver\"\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"'lifecycle.version' must be a valid semver\")\n\t\t\t})\n\n\t\t\tit(\"should fail when both lifecycle version and uri are present\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Lifecycle.URI = \"file://some-lifecycle\"\n\t\t\t\topts.Config.Lifecycle.Version = \"something\"\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"'lifecycle' can only declare 'version' or 'uri', not both\")\n\t\t\t})\n\n\t\t\tit(\"should fail when buildpack ID does not match downloaded buildpack\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Buildpacks[0].ID = \"does.not.match\"\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"buildpack from URI 'https://example.fake/bp-one.tgz' has ID 'bp.one' which does not match ID 'does.not.match' from builder config\")\n\t\t\t})\n\n\t\t\tit(\"should fail when buildpack version does not match downloaded buildpack\", func() 
{\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Buildpacks[0].Version = \"0.0.0\"\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"buildpack from URI 'https://example.fake/bp-one.tgz' has version '1.2.3' which does not match version '0.0.0' from builder config\")\n\t\t\t})\n\n\t\t\tit(\"should fail when extension ID does not match downloaded extension\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\tprepareExtensions()\n\t\t\t\topts.Config.Extensions[0].ID = \"does.not.match\"\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"extension from URI 'https://example.fake/ext-one.tgz' has ID 'ext.one' which does not match ID 'does.not.match' from builder config\")\n\t\t\t})\n\n\t\t\tit(\"should fail when extension version does not match downloaded extension\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\tprepareExtensions()\n\t\t\t\topts.Config.Extensions[0].Version = \"0.0.0\"\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"extension from URI 'https://example.fake/ext-one.tgz' has version '1.2.3' which does not match version '0.0.0' from builder config\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"validating the run image config\", func() {\n\t\t\tit(\"should fail when the stack ID from the builder config does not match the stack ID from the run image\", func() {\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\th.AssertNil(t, fakeRunImage.SetLabel(\"io.buildpacks.stack.id\", \"other.stack.id\"))\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"stack 'some.stack.id' from builder config is incompatible with stack 'other.stack.id' from run image 'some/run-image'\")\n\t\t\t})\n\n\t\t\tit(\"should fail when the stack ID from the builder config 
does not match the stack ID from the run image mirrors\", func() {\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\th.AssertNil(t, fakeRunImageMirror.SetLabel(\"io.buildpacks.stack.id\", \"other.stack.id\"))\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"stack 'some.stack.id' from builder config is incompatible with stack 'other.stack.id' from run image 'localhost:5000/some/run-image'\")\n\t\t\t})\n\n\t\t\tit(\"should warn when the run image cannot be found\", func() {\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(fakeBuildImage, nil)\n\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/run-image\", image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways}).Return(nil, errors.Wrap(image.ErrNotFound, \"yikes\"))\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/run-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(nil, errors.Wrap(image.ErrNotFound, \"yikes\"))\n\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"localhost:5000/some/run-image\", image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways}).Return(nil, errors.Wrap(image.ErrNotFound, \"yikes\"))\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"localhost:5000/some/run-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(nil, errors.Wrap(image.ErrNotFound, \"yikes\"))\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertContains(t, out.String(), \"Warning: run image 'some/run-image' is not accessible\")\n\t\t\t})\n\n\t\t\tit(\"should fail when not publish and the run image cannot be fetched\", func() {\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/run-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(nil, errors.New(\"yikes\"))\n\n\t\t\t\terr := 
subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertError(t, err, \"failed to fetch image: yikes\")\n\t\t\t})\n\n\t\t\tit(\"should fail when publish and the run image cannot be fetched\", func() {\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/run-image\", image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways}).Return(nil, errors.New(\"yikes\"))\n\n\t\t\t\topts.Publish = true\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertError(t, err, \"failed to fetch image: yikes\")\n\t\t\t})\n\n\t\t\tit(\"should fail when the run image isn't a valid image\", func() {\n\t\t\t\tfakeImage := fakeBadImageStruct{}\n\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/run-image\", gomock.Any()).Return(fakeImage, nil).AnyTimes()\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"localhost:5000/some/run-image\", gomock.Any()).Return(fakeImage, nil).AnyTimes()\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertError(t, err, \"failed to label image\")\n\t\t\t})\n\n\t\t\twhen(\"publish is true\", func() {\n\t\t\t\tit(\"should only try to validate the remote run image\", func() {\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", image.FetchOptions{Daemon: true}).Times(0)\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/run-image\", image.FetchOptions{Daemon: true}).Times(0)\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"localhost:5000/some/run-image\", image.FetchOptions{Daemon: true}).Times(0)\n\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", image.FetchOptions{Daemon: false}).Return(fakeBuildImage, nil)\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/run-image\", image.FetchOptions{Daemon: false}).Return(fakeRunImage, nil)\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"localhost:5000/some/run-image\", image.FetchOptions{Daemon: false}).Return(fakeRunImageMirror, 
nil)\n\n\t\t\t\t\topts.Publish = true\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"creating the base builder\", func() {\n\t\t\twhen(\"build image not found\", func() {\n\t\t\t\tit(\"should fail\", func() {\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(nil, image.ErrNotFound)\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertError(t, err, \"fetch build image: not found\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"build image isn't a valid image\", func() {\n\t\t\t\tit(\"should fail\", func() {\n\t\t\t\t\tfakeImage := fakeBadImageStruct{}\n\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(fakeImage, nil)\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertError(t, err, \"failed to create builder: invalid build-image\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"windows containers\", func() {\n\t\t\t\twhen(\"experimental enabled\", func() {\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\topts.Config.Extensions = nil      // TODO: downloading extensions doesn't work yet; to be implemented in https://github.com/buildpacks/pack/issues/1489\n\t\t\t\t\t\topts.Config.OrderExtensions = nil // TODO: downloading extensions doesn't work yet; to be implemented in https://github.com/buildpacks/pack/issues/1489\n\t\t\t\t\t\tpackClientWithExperimental, err := 
client.NewClient(\n\t\t\t\t\t\t\tclient.WithLogger(logger),\n\t\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\t\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\t\th.AssertNil(t, fakeBuildImage.SetOS(\"windows\"))\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(fakeBuildImage, nil)\n\n\t\t\t\t\t\terr = packClientWithExperimental.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"experimental disabled\", func() {\n\t\t\t\t\tit(\"fails\", func() {\n\t\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\t\th.AssertNil(t, fakeBuildImage.SetOS(\"windows\"))\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", gomock.Any()).Return(fakeBuildImage, nil)\n\n\t\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\t\th.AssertError(t, err, \"failed to create builder: Windows containers support is currently experimental.\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"error downloading lifecycle\", func() {\n\t\t\t\tit(\"should fail\", func() {\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\topts.Config.Lifecycle.URI = \"fake\"\n\n\t\t\t\t\turi, err := paths.FilePathToURI(opts.Config.Lifecycle.URI, opts.RelativeBaseDir)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), uri).Return(nil, errors.New(\"error here\")).AnyTimes()\n\n\t\t\t\t\terr = subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertError(t, err, \"downloading lifecycle\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"lifecycle isn't a valid lifecycle\", func() {\n\t\t\t\tit(\"should fail\", func() 
{\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\topts.Config.Lifecycle.URI = \"fake\"\n\n\t\t\t\t\turi, err := paths.FilePathToURI(opts.Config.Lifecycle.URI, opts.RelativeBaseDir)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), uri).Return(blob.NewBlob(filepath.Join(\"testdata\", \"empty-file\")), nil).AnyTimes()\n\n\t\t\t\t\terr = subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertError(t, err, \"invalid lifecycle\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"validating lifecycle Platform API for extensions\", func() {\n\t\t\twhen(\"lifecycle supports Platform API >= 0.13\", func() {\n\t\t\t\tit(\"should allow extensions without experimental flag\", func() {\n\t\t\t\t\t// Uses default lifecycle which has Platform API 0.13\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\topts.Config.Lifecycle.URI = \"file:///some-lifecycle-platform-0-13\"\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"lifecycle supports Platform API < 0.13\", func() {\n\t\t\t\twhen(\"experimental flag is not set\", func() {\n\t\t\t\t\tit(\"should fail when builder has extensions\", func() {\n\t\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\t\tprepareExtensions()\n\t\t\t\t\t\t// Override to use lifecycle with Platform API 0.3 (< 0.13) for this test\n\t\t\t\t\t\topts.Config.Lifecycle.URI = \"file:///some-lifecycle\"\n\n\t\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\t\th.AssertError(t, err, \"support for image extensions with Platform API < 0.13 is currently experimental\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"experimental flag is set\", func() {\n\t\t\t\t\tit(\"should succeed when builder has extensions\", func() {\n\t\t\t\t\t\tpackClientWithExperimental, err := 
client.NewClient(\n\t\t\t\t\t\t\tclient.WithLogger(logger),\n\t\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\t\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\t\t\t\t\tclient.WithDockerClient(mockDockerClient),\n\t\t\t\t\t\t\tclient.WithBuildpackDownloader(mockBuildpackDownloader),\n\t\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\t\tprepareExtensions()\n\t\t\t\t\t\t// Remove buildpacks to avoid API compatibility issues\n\t\t\t\t\t\topts.Config.Buildpacks = nil\n\t\t\t\t\t\topts.Config.Order = nil\n\t\t\t\t\t\t// Override to use lifecycle with Platform API 0.3 (< 0.13) for this test\n\t\t\t\t\t\topts.Config.Lifecycle.URI = \"file:///some-lifecycle\"\n\n\t\t\t\t\t\terr = packClientWithExperimental.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"builder has no extensions\", func() {\n\t\t\t\tit(\"should succeed regardless of Platform API version\", func() {\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\t// Remove extensions from config\n\t\t\t\t\topts.Config.Extensions = nil\n\t\t\t\t\topts.Config.OrderExtensions = nil\n\t\t\t\t\t// Use lifecycle with Platform API 0.3 (< 0.13)\n\t\t\t\t\topts.Config.Lifecycle.URI = \"file:///some-lifecycle\"\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only lifecycle version is provided\", func() {\n\t\t\tit(\"should download from predetermined uri\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\topts.Config.Lifecycle.Version = 
\"3.4.5\"\n\n\t\t\t\tmockDownloader.EXPECT().Download(\n\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\"https://github.com/buildpacks/lifecycle/releases/download/v3.4.5/lifecycle-v3.4.5+linux.x86-64.tgz\",\n\t\t\t\t).Return(\n\t\t\t\t\tblob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")), nil,\n\t\t\t\t)\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"should download from predetermined uri for arm64\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\topts.Config.Lifecycle.Version = \"3.4.5\"\n\t\t\t\th.AssertNil(t, fakeBuildImage.SetArchitecture(\"arm64\"))\n\n\t\t\t\tmockDownloader.EXPECT().Download(\n\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\"https://github.com/buildpacks/lifecycle/releases/download/v3.4.5/lifecycle-v3.4.5+linux.arm64.tgz\",\n\t\t\t\t).Return(\n\t\t\t\t\tblob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")), nil,\n\t\t\t\t)\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"should download x86-64 lifecycle when architecture is amd64\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\topts.Config.Lifecycle.Version = \"3.4.5\"\n\t\t\t\th.AssertNil(t, fakeBuildImage.SetArchitecture(\"amd64\"))\n\n\t\t\t\tmockDownloader.EXPECT().Download(\n\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\"https://github.com/buildpacks/lifecycle/releases/download/v3.4.5/lifecycle-v3.4.5+linux.x86-64.tgz\",\n\t\t\t\t).Return(\n\t\t\t\t\tblob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")), nil,\n\t\t\t\t)\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotContains(t, out.String(), \"failed to find a lifecycle binary\")\n\t\t\t})\n\n\t\t\tit(\"should warn and default to 
x86-64 for unknown architecture\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\topts.Config.Lifecycle.Version = \"3.4.5\"\n\t\t\t\th.AssertNil(t, fakeBuildImage.SetArchitecture(\"riscv64\"))\n\n\t\t\t\tmockDownloader.EXPECT().Download(\n\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\"https://github.com/buildpacks/lifecycle/releases/download/v3.4.5/lifecycle-v3.4.5+linux.x86-64.tgz\",\n\t\t\t\t).Return(\n\t\t\t\t\tblob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")), nil,\n\t\t\t\t)\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, out.String(), \"failed to find a lifecycle binary\")\n\t\t\t})\n\n\t\t\twhen(\"windows\", func() {\n\t\t\t\tit(\"should download from predetermined uri\", func() {\n\t\t\t\t\topts.Config.Extensions = nil      // TODO: downloading extensions doesn't work yet; to be implemented in https://github.com/buildpacks/pack/issues/1489\n\t\t\t\t\topts.Config.OrderExtensions = nil // TODO: downloading extensions doesn't work yet; to be implemented in https://github.com/buildpacks/pack/issues/1489\n\t\t\t\t\tpackClientWithExperimental, err := client.NewClient(\n\t\t\t\t\t\tclient.WithLogger(logger),\n\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\t\topts.Config.Lifecycle.Version = \"3.4.5\"\n\t\t\t\t\th.AssertNil(t, 
fakeBuildImage.SetOS(\"windows\"))\n\n\t\t\t\t\tmockDownloader.EXPECT().Download(\n\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\t\"https://github.com/buildpacks/lifecycle/releases/download/v3.4.5/lifecycle-v3.4.5+windows.x86-64.tgz\",\n\t\t\t\t\t).Return(\n\t\t\t\t\t\tblob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")), nil,\n\t\t\t\t\t)\n\n\t\t\t\t\terr = packClientWithExperimental.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"no lifecycle version or URI is provided\", func() {\n\t\t\tit(\"should download default lifecycle\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\topts.Config.Lifecycle.Version = \"\"\n\n\t\t\t\tmockDownloader.EXPECT().Download(\n\t\t\t\t\tgomock.Any(),\n\t\t\t\t\tfmt.Sprintf(\n\t\t\t\t\t\t\"https://github.com/buildpacks/lifecycle/releases/download/v%s/lifecycle-v%s+linux.x86-64.tgz\",\n\t\t\t\t\t\tbuilder.DefaultLifecycleVersion,\n\t\t\t\t\t\tbuilder.DefaultLifecycleVersion,\n\t\t\t\t\t),\n\t\t\t\t).Return(\n\t\t\t\t\tblob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")), nil,\n\t\t\t\t)\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"should download default lifecycle on arm64\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\topts.Config.Lifecycle.Version = \"\"\n\t\t\t\th.AssertNil(t, 
fakeBuildImage.SetArchitecture(\"arm64\"))\n\n\t\t\t\tmockDownloader.EXPECT().Download(\n\t\t\t\t\tgomock.Any(),\n\t\t\t\t\tfmt.Sprintf(\n\t\t\t\t\t\t\"https://github.com/buildpacks/lifecycle/releases/download/v%s/lifecycle-v%s+linux.arm64.tgz\",\n\t\t\t\t\t\tbuilder.DefaultLifecycleVersion,\n\t\t\t\t\t\tbuilder.DefaultLifecycleVersion,\n\t\t\t\t\t),\n\t\t\t\t).Return(\n\t\t\t\t\tblob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\")), nil,\n\t\t\t\t)\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\twhen(\"windows\", func() {\n\t\t\t\tit(\"should download default lifecycle\", func() {\n\t\t\t\t\topts.Config.Extensions = nil      // TODO: downloading extensions doesn't work yet; to be implemented in https://github.com/buildpacks/pack/issues/1489\n\t\t\t\t\topts.Config.OrderExtensions = nil // TODO: downloading extensions doesn't work yet; to be implemented in https://github.com/buildpacks/pack/issues/1489\n\t\t\t\t\tpackClientWithExperimental, err := client.NewClient(\n\t\t\t\t\t\tclient.WithLogger(logger),\n\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\topts.Config.Lifecycle.URI = \"\"\n\t\t\t\t\topts.Config.Lifecycle.Version = \"\"\n\t\t\t\t\th.AssertNil(t, fakeBuildImage.SetOS(\"windows\"))\n\n\t\t\t\t\tmockDownloader.EXPECT().Download(\n\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\tfmt.Sprintf(\n\t\t\t\t\t\t\t\"https://github.com/buildpacks/lifecycle/releases/download/v%s/lifecycle-v%s+windows.x86-64.tgz\",\n\t\t\t\t\t\t\tbuilder.DefaultLifecycleVersion,\n\t\t\t\t\t\t\tbuilder.DefaultLifecycleVersion,\n\t\t\t\t\t\t),\n\t\t\t\t\t).Return(\n\t\t\t\t\t\tblob.NewBlob(filepath.Join(\"testdata\", 
\"lifecycle\", \"platform-0.4\")), nil,\n\t\t\t\t\t)\n\n\t\t\t\t\terr = packClientWithExperimental.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"lifecycle URI is a docker image\", func() {\n\t\t\tvar lifecycleImageName = \"buildpacksio/lifecycle:latest\"\n\n\t\t\tsetupFakeLifecycleImage := func() *h.FakeWithUnderlyingImage {\n\t\t\t\t// Write the tar content to a file in tmpDir\n\t\t\t\tlifecycleLayerPath := filepath.Join(\"testdata\", \"lifecycle\", \"lifecycle.tar\")\n\t\t\t\tlifecycleTag, err := name.NewTag(lifecycleImageName)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tv1LifecycleImage, err := tarball.ImageFromPath(lifecycleLayerPath, &lifecycleTag)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\treturn h.NewFakeWithUnderlyingV1Image(lifecycleImageName, nil, v1LifecycleImage)\n\t\t\t}\n\n\t\t\tit(\"should download lifecycle from docker registry\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\tfakeLifecycleImage := setupFakeLifecycleImage()\n\n\t\t\t\topts.Config.Lifecycle.URI = \"docker://\" + lifecycleImageName\n\t\t\t\topts.Config.Lifecycle.Version = \"\"\n\t\t\t\topts.RelativeBaseDir = tmpDir\n\n\t\t\t\t// Expect the image fetcher to fetch the lifecycle image\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), lifecycleImageName, image.FetchOptions{Daemon: false}).Return(fakeLifecycleImage, nil)\n\n\t\t\t\t// Create the expected lifecycle.tar file that will be referenced\n\t\t\t\tlifecyclePath := filepath.Join(tmpDir, \"lifecycle.tar\")\n\n\t\t\t\t// The downloader will be called with the extracted lifecycle tar\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), gomock.Any()).DoAndReturn(func(ctx context.Context, uri string) (blob.Blob, error) {\n\t\t\t\t\t// The URI should be a file:// URI pointing to lifecycle.tar\n\t\t\t\t\th.AssertTrue(t, strings.Contains(uri, \"lifecycle.tar\"))\n\n\t\t\t\t\t// Write a minimal lifecycle tar for the 
test\n\t\t\t\t\tf, err := os.Create(lifecyclePath)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdefer f.Close()\n\n\t\t\t\t\t// Copy the test lifecycle content\n\t\t\t\t\ttestLifecycle := blob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\"))\n\t\t\t\t\treturn testLifecycle, nil\n\t\t\t\t})\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"should handle docker URI without docker:// prefix\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\tfakeLifecycleImage := setupFakeLifecycleImage()\n\n\t\t\t\topts.Config.Lifecycle.URI = \"docker:/\" + lifecycleImageName\n\t\t\t\topts.Config.Lifecycle.Version = \"\"\n\t\t\t\topts.RelativeBaseDir = tmpDir\n\n\t\t\t\t// Expect the image fetcher to fetch the lifecycle image\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), lifecycleImageName, image.FetchOptions{Daemon: false}).Return(fakeLifecycleImage, nil)\n\n\t\t\t\t// The downloader will be called with the extracted lifecycle tar\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), gomock.Any()).DoAndReturn(func(ctx context.Context, uri string) (blob.Blob, error) {\n\t\t\t\t\ttestLifecycle := blob.NewBlob(filepath.Join(\"testdata\", \"lifecycle\", \"platform-0.4\"))\n\t\t\t\t\treturn testLifecycle, nil\n\t\t\t\t})\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\twhen(\"fetching lifecycle image fails\", func() {\n\t\t\t\tit(\"should return an error\", func() {\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\topts.Config.Lifecycle.URI = \"docker://\" + lifecycleImageName\n\t\t\t\t\topts.Config.Lifecycle.Version = \"\"\n\t\t\t\t\topts.RelativeBaseDir = tmpDir\n\n\t\t\t\t\t// Expect the image fetcher to fail\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), lifecycleImageName, image.FetchOptions{Daemon: false}).Return(nil, 
errors.New(\"failed to fetch image\"))\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertError(t, err, \"Could not parse uri from lifecycle image\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"lifecycle image has no layers\", func() {\n\t\t\t\tit(\"should return an error\", func() {\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\topts.Config.Lifecycle.URI = \"docker://\" + lifecycleImageName\n\t\t\t\t\topts.Config.Lifecycle.Version = \"\"\n\t\t\t\t\topts.RelativeBaseDir = tmpDir\n\n\t\t\t\t\t// Create an image with no layers\n\t\t\t\t\temptyLifecycleImage := fakes.NewImage(lifecycleImageName, \"\", nil)\n\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), lifecycleImageName, image.FetchOptions{Daemon: false}).Return(emptyLifecycleImage, nil)\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertError(t, err, \"Could not parse uri from lifecycle image\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"both lifecycle URI and version are provided\", func() {\n\t\t\t\tit(\"should return an error\", func() {\n\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\topts.Config.Lifecycle.URI = \"docker://\" + lifecycleImageName\n\t\t\t\t\topts.Config.Lifecycle.Version = \"1.2.3\"\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertError(t, err, \"'lifecycle' can only declare 'version' or 'uri', not both\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpack mixins are not satisfied\", func() {\n\t\t\tit(\"should return an error\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\th.AssertNil(t, fakeBuildImage.SetLabel(\"io.buildpacks.stack.mixins\", \"\"))\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\n\t\t\t\th.AssertError(t, err, \"validating buildpacks: buildpack 'bp.one@1.2.3' requires missing mixin(s): build:mixinY, 
mixinX\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"creation succeeds\", func() {\n\t\t\tit(\"should set basic metadata\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\tbldr := successfullyCreateBuilder()\n\n\t\t\t\th.AssertEq(t, bldr.Name(), \"some/builder\")\n\t\t\t\th.AssertEq(t, bldr.Description(), \"Some description\")\n\t\t\t\th.AssertEq(t, bldr.UID(), 1234)\n\t\t\t\th.AssertEq(t, bldr.GID(), 4321)\n\t\t\t\th.AssertEq(t, bldr.StackID, \"some.stack.id\")\n\t\t\t})\n\n\t\t\tit(\"should set buildpack and order metadata\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\tbldr := successfullyCreateBuilder()\n\n\t\t\t\tbpInfo := dist.ModuleInfo{\n\t\t\t\t\tID:       \"bp.one\",\n\t\t\t\t\tVersion:  \"1.2.3\",\n\t\t\t\t\tHomepage: \"http://one.buildpack\",\n\t\t\t\t}\n\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{bpInfo})\n\t\t\t\tbpInfo.Homepage = \"\"\n\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{{\n\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\tModuleInfo: bpInfo,\n\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t}},\n\t\t\t\t}})\n\t\t\t})\n\n\t\t\tit(\"should set extensions and order-extensions metadata\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\tprepareExtensions()\n\n\t\t\t\tbldr := successfullyCreateBuilder()\n\n\t\t\t\textInfo := dist.ModuleInfo{\n\t\t\t\t\tID:       \"ext.one\",\n\t\t\t\t\tVersion:  \"1.2.3\",\n\t\t\t\t\tHomepage: \"http://one.extension\",\n\t\t\t\t}\n\t\t\t\th.AssertEq(t, bldr.Extensions(), []dist.ModuleInfo{extInfo})\n\t\t\t\textInfo.Homepage = \"\"\n\t\t\t\th.AssertEq(t, bldr.OrderExtensions(), dist.Order{{\n\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\tModuleInfo: extInfo,\n\t\t\t\t\t\tOptional:   false, // extensions are always optional\n\t\t\t\t\t}},\n\t\t\t\t}})\n\t\t\t})\n\n\t\t\tit(\"should embed the lifecycle\", func() 
{\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\tsuccessfullyCreateBuilder()\n\n\t\t\t\tlayerTar, err := fakeBuildImage.FindLayerWithPath(\"/cnb/lifecycle\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertTarHasFile(t, layerTar, \"/cnb/lifecycle/detector\")\n\t\t\t\th.AssertTarHasFile(t, layerTar, \"/cnb/lifecycle/restorer\")\n\t\t\t\th.AssertTarHasFile(t, layerTar, \"/cnb/lifecycle/analyzer\")\n\t\t\t\th.AssertTarHasFile(t, layerTar, \"/cnb/lifecycle/builder\")\n\t\t\t\th.AssertTarHasFile(t, layerTar, \"/cnb/lifecycle/exporter\")\n\t\t\t\th.AssertTarHasFile(t, layerTar, \"/cnb/lifecycle/launcher\")\n\t\t\t})\n\n\t\t\tit(\"should set lifecycle descriptor\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\tbldr := successfullyCreateBuilder()\n\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().Info.Version.String(), \"0.0.0\")\n\t\t\t\t//nolint:staticcheck\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().API.BuildpackVersion.String(), \"0.2\")\n\t\t\t\t//nolint:staticcheck\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().API.PlatformVersion.String(), \"0.2\")\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().APIs.Buildpack.Deprecated.AsStrings(), []string{\"0.2\", \"0.3\"})\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().APIs.Buildpack.Supported.AsStrings(), []string{\"0.2\", \"0.3\", \"0.4\", \"0.9\"})\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().APIs.Platform.Deprecated.AsStrings(), []string{\"0.2\"})\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().APIs.Platform.Supported.AsStrings(), []string{\"0.3\", \"0.4\"})\n\t\t\t})\n\n\t\t\tit(\"should warn when deprecated Buildpack API version is used\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\tprepareExtensions()\n\t\t\t\tbldr := successfullyCreateBuilder()\n\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().APIs.Buildpack.Deprecated.AsStrings(), []string{\"0.2\", 
\"0.3\"})\n\t\t\t\th.AssertContains(t, out.String(), fmt.Sprintf(\"Buildpack %s is using deprecated Buildpacks API version %s\", style.Symbol(\"bp.one@1.2.3\"), style.Symbol(\"0.3\")))\n\t\t\t\th.AssertContains(t, out.String(), fmt.Sprintf(\"Extension %s is using deprecated Buildpacks API version %s\", style.Symbol(\"ext.one@1.2.3\"), style.Symbol(\"0.3\")))\n\t\t\t})\n\n\t\t\tit(\"shouldn't warn when Buildpack API version used isn't deprecated\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\tprepareExtensions()\n\t\t\t\topts.Config.Buildpacks[0].URI = \"https://example.fake/bp-one-with-api-4.tgz\"\n\t\t\t\topts.Config.Extensions[0].URI = \"https://example.fake/ext-one-with-api-9.tgz\"\n\n\t\t\t\tbuildpackBlob := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-api-0.4\"))\n\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(buildpackBlob, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/bp-one-with-api-4.tgz\", gomock.Any()).Return(bp, nil, nil)\n\n\t\t\t\textensionBlob := blob.NewBlob(filepath.Join(\"testdata\", \"extension-api-0.9\"))\n\t\t\t\textension, err := buildpack.FromExtensionRootBlob(extensionBlob, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/ext-one-with-api-9.tgz\", gomock.Any()).Return(extension, nil, nil)\n\n\t\t\t\tbldr := successfullyCreateBuilder()\n\n\t\t\t\th.AssertEq(t, bldr.LifecycleDescriptor().APIs.Buildpack.Deprecated.AsStrings(), []string{\"0.2\", \"0.3\"})\n\t\t\t\th.AssertNotContains(t, out.String(), \"is using deprecated Buildpacks API version\")\n\t\t\t})\n\n\t\t\tit(\"should set labels\", func() {\n\t\t\t\topts.Labels = map[string]string{\"test.label.one\": \"1\", \"test.label.two\": 
\"2\"}\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\timageLabels, err := fakeBuildImage.Labels()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, imageLabels[\"test.label.one\"], \"1\")\n\t\t\t\th.AssertEq(t, imageLabels[\"test.label.two\"], \"2\")\n\t\t\t})\n\n\t\t\twhen(\"Buildpack dependencies are provided\", func() {\n\t\t\t\tvar (\n\t\t\t\t\tbp1v1          buildpack.BuildModule\n\t\t\t\t\tbp1v2          buildpack.BuildModule\n\t\t\t\t\tbp2v1          buildpack.BuildModule\n\t\t\t\t\tbp2v2          buildpack.BuildModule\n\t\t\t\t\tfakeLayerImage *h.FakeAddedLayerImage\n\t\t\t\t\terr            error\n\t\t\t\t)\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tfakeLayerImage = &h.FakeAddedLayerImage{Image: fakeBuildImage}\n\t\t\t\t})\n\n\t\t\t\tvar prepareBuildpackDependencies = func() []buildpack.BuildModule {\n\t\t\t\t\tbp1v1Blob := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-non-deterministic\", \"buildpack-1-version-1\"))\n\t\t\t\t\tbp1v2Blob := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-non-deterministic\", \"buildpack-1-version-2\"))\n\t\t\t\t\tbp2v1Blob := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-non-deterministic\", \"buildpack-2-version-1\"))\n\t\t\t\t\tbp2v2Blob := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-non-deterministic\", \"buildpack-2-version-2\"))\n\n\t\t\t\t\tbp1v1, err = buildpack.FromBuildpackRootBlob(bp1v1Blob, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tbp1v2, err = buildpack.FromBuildpackRootBlob(bp1v2Blob, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tbp2v1, err = buildpack.FromBuildpackRootBlob(bp2v1Blob, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tbp2v2, err = buildpack.FromBuildpackRootBlob(bp2v2Blob, archive.DefaultTarWriterFactory(), 
nil)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\treturn []buildpack.BuildModule{bp2v2, bp2v1, bp1v1, bp1v2}\n\t\t\t\t}\n\n\t\t\t\tvar successfullyCreateDeterministicBuilder = func() {\n\t\t\t\t\tt.Helper()\n\n\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, fakeLayerImage.IsSaved(), true)\n\t\t\t\t}\n\n\t\t\t\tit(\"should add dependencies buildpacks layers order by ID and version\", func() {\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", gomock.Any()).Return(fakeLayerImage, nil)\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\tprepareExtensions()\n\t\t\t\t\topts.Config.Buildpacks[0].URI = \"https://example.fake/bp-one-with-api-4.tgz\"\n\t\t\t\t\topts.Config.Extensions[0].URI = \"https://example.fake/ext-one-with-api-9.tgz\"\n\t\t\t\t\tbpDependencies := prepareBuildpackDependencies()\n\n\t\t\t\t\tbuildpackBlob := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-api-0.4\"))\n\t\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(buildpackBlob, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/bp-one-with-api-4.tgz\", gomock.Any()).DoAndReturn(\n\t\t\t\t\t\tfunc(ctx context.Context, buildpackURI string, opts buildpack.DownloadOptions) (buildpack.BuildModule, []buildpack.BuildModule, error) {\n\t\t\t\t\t\t\t// test options\n\t\t\t\t\t\t\th.AssertEq(t, opts.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t\t\t\treturn bp, bpDependencies, nil\n\t\t\t\t\t\t})\n\n\t\t\t\t\textensionBlob := blob.NewBlob(filepath.Join(\"testdata\", \"extension-api-0.9\"))\n\t\t\t\t\textension, err := buildpack.FromExtensionRootBlob(extensionBlob, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/ext-one-with-api-9.tgz\", 
gomock.Any()).DoAndReturn(\n\t\t\t\t\t\tfunc(ctx context.Context, buildpackURI string, opts buildpack.DownloadOptions) (buildpack.BuildModule, []buildpack.BuildModule, error) {\n\t\t\t\t\t\t\t// test options\n\t\t\t\t\t\t\th.AssertEq(t, opts.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t\t\t\treturn extension, nil, nil\n\t\t\t\t\t\t})\n\n\t\t\t\t\tsuccessfullyCreateDeterministicBuilder()\n\n\t\t\t\t\tlayers := fakeLayerImage.AddedLayersOrder()\n\t\t\t\t\t// Main buildpack + 4 dependencies + 1 extension\n\t\t\t\t\th.AssertEq(t, len(layers), 6)\n\n\t\t\t\t\t// [0] bp.one.1.2.3.tar - main buildpack\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[1], h.LayerFileName(bp1v1)))\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[2], h.LayerFileName(bp1v2)))\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[3], h.LayerFileName(bp2v1)))\n\t\t\t\t\th.AssertTrue(t, strings.Contains(layers[4], h.LayerFileName(bp2v2)))\n\t\t\t\t\t// [5] ext.one.1.2.3.tar - extension\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\tit(\"supports directory buildpacks\", func() {\n\t\t\tprepareFetcherWithBuildImage()\n\t\t\tprepareFetcherWithRunImages()\n\t\t\topts.RelativeBaseDir = \"\"\n\t\t\tdirectoryPath := \"testdata/buildpack\"\n\t\t\topts.Config.Buildpacks[0].URI = directoryPath\n\n\t\t\tbuildpackBlob := blob.NewBlob(directoryPath)\n\t\t\tbuildpack, err := buildpack.FromBuildpackRootBlob(buildpackBlob, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, err)\n\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), directoryPath, gomock.Any()).Return(buildpack, nil, nil)\n\n\t\t\terr = subject.CreateBuilder(context.TODO(), opts)\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\tit(\"supports directory extensions\", func() {\n\t\t\tprepareFetcherWithBuildImage()\n\t\t\tprepareFetcherWithRunImages()\n\t\t\tprepareExtensions()\n\t\t\topts.RelativeBaseDir = \"\"\n\t\t\tdirectoryPath := \"testdata/extension\"\n\t\t\topts.Config.Extensions[0].URI = directoryPath\n\n\t\t\textensionBlob := 
blob.NewBlob(directoryPath)\n\t\t\textension, err := buildpack.FromExtensionRootBlob(extensionBlob, archive.DefaultTarWriterFactory(), nil)\n\t\t\th.AssertNil(t, err)\n\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), directoryPath, gomock.Any()).Return(extension, nil, nil)\n\n\t\t\terr = subject.CreateBuilder(context.TODO(), opts)\n\t\t\th.AssertNil(t, err)\n\t\t})\n\n\t\twhen(\"package file\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tfileURI := func(path string) (original, uri string) {\n\t\t\t\t\tabsPath, err := paths.FilePathToURI(path, \"\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\treturn path, absPath\n\t\t\t\t}\n\n\t\t\t\tcnbFile, _ := fileURI(filepath.Join(tmpDir, \"bp_one1.cnb\"))\n\t\t\t\tbuildpackPath, buildpackPathURI := fileURI(filepath.Join(\"testdata\", \"buildpack\"))\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), buildpackPathURI).Return(blob.NewBlob(buildpackPath), nil)\n\n\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: cnbFile,\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: buildpackPath},\n\t\t\t\t\t},\n\t\t\t\t\tFormat: \"file\",\n\t\t\t\t}))\n\n\t\t\t\tbuildpack, _, err := buildpack.BuildpacksFromOCILayoutBlob(blob.NewBlob(cnbFile))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), cnbFile, gomock.Any()).Return(buildpack, nil, nil).AnyTimes()\n\t\t\t\topts.Config.Buildpacks = []pubbldr.ModuleConfig{{\n\t\t\t\t\tImageOrURI: dist.ImageOrURI{BuildpackURI: dist.BuildpackURI{URI: cnbFile}},\n\t\t\t\t}}\n\t\t\t})\n\n\t\t\tit(\"package file is valid\", func() {\n\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\tbldr := successfullyCreateBuilder()\n\n\t\t\t\tbpInfo := dist.ModuleInfo{\n\t\t\t\t\tID:       \"bp.one\",\n\t\t\t\t\tVersion:  \"1.2.3\",\n\t\t\t\t\tHomepage: 
\"http://one.buildpack\",\n\t\t\t\t}\n\t\t\t\th.AssertEq(t, bldr.Buildpacks(), []dist.ModuleInfo{bpInfo})\n\t\t\t\tbpInfo.Homepage = \"\"\n\t\t\t\th.AssertEq(t, bldr.Order(), dist.Order{{\n\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\tModuleInfo: bpInfo,\n\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t}},\n\t\t\t\t}})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"packages\", func() {\n\t\t\twhen(\"package image lives in cnb registry\", func() {\n\t\t\t\twhen(\"publish=false and pull-policy=always\", func() {\n\t\t\t\t\tit(\"should call BuildpackDownloader with the proper arguments\", func() {\n\t\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\t\topts.BuilderName = \"some/builder\"\n\t\t\t\t\t\topts.Publish = false\n\t\t\t\t\t\topts.PullPolicy = image.PullAlways\n\t\t\t\t\t\topts.Registry = \"some-registry\"\n\t\t\t\t\t\topts.Config.Buildpacks = append(\n\t\t\t\t\t\t\topts.Config.Buildpacks,\n\t\t\t\t\t\t\tpubbldr.ModuleConfig{\n\t\t\t\t\t\t\t\tImageOrURI: dist.ImageOrURI{\n\t\t\t\t\t\t\t\t\tBuildpackURI: dist.BuildpackURI{\n\t\t\t\t\t\t\t\t\t\tURI: \"urn:cnb:registry:example/foo@1.1.0\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tshouldCallBuildpackDownloaderWith(\"urn:cnb:registry:example/foo@1.1.0\", buildpack.DownloadOptions{Daemon: true, PullPolicy: image.PullAlways, RegistryName: \"some-registry\"})\n\t\t\t\t\t\th.AssertNil(t, subject.CreateBuilder(context.TODO(), opts))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"flatten option is set\", func() {\n\t\t\t/*       1\n\t\t\t *    /    \\\n\t\t\t *   2      3\n\t\t\t *         /  \\\n\t\t\t *        4     5\n\t\t\t *\t          /  \\\n\t\t\t *           6   7\n\t\t\t */\n\t\t\tvar (\n\t\t\t\tfakeLayerImage *h.FakeAddedLayerImage\n\t\t\t\terr            error\n\t\t\t)\n\n\t\t\tvar successfullyCreateFlattenBuilder = func() {\n\t\t\t\tt.Helper()\n\n\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, 
err)\n\t\t\t\th.AssertEq(t, fakeLayerImage.IsSaved(), true)\n\t\t\t}\n\n\t\t\tit.Before(func() {\n\t\t\t\tfakeLayerImage = &h.FakeAddedLayerImage{Image: fakeBuildImage}\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/build-image\", gomock.Any()).Return(fakeLayerImage, nil)\n\n\t\t\t\tvar depBPs []buildpack.BuildModule\n\t\t\t\tblob1 := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-flatten\", \"buildpack-1\"))\n\t\t\t\tfor i := 2; i <= 7; i++ {\n\t\t\t\t\tb := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-flatten\", fmt.Sprintf(\"buildpack-%d\", i)))\n\t\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(b, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdepBPs = append(depBPs, bp)\n\t\t\t\t}\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/flatten-bp-1.tgz\").Return(blob1, nil).AnyTimes()\n\n\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(blob1, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/flatten-bp-1.tgz\", gomock.Any()).Return(bp, depBPs, nil).AnyTimes()\n\n\t\t\t\topts = client.CreateBuilderOptions{\n\t\t\t\t\tRelativeBaseDir: \"/\",\n\t\t\t\t\tBuilderName:     \"some/builder\",\n\t\t\t\t\tConfig: pubbldr.Config{\n\t\t\t\t\t\tDescription: \"Some description\",\n\t\t\t\t\t\tBuildpacks: []pubbldr.ModuleConfig{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"flatten/bp-1\", Version: \"1\", Homepage: \"http://buildpack-1\"},\n\t\t\t\t\t\t\t\tImageOrURI: dist.ImageOrURI{\n\t\t\t\t\t\t\t\t\tBuildpackURI: dist.BuildpackURI{\n\t\t\t\t\t\t\t\t\t\tURI: \"https://example.fake/flatten-bp-1.tgz\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tOrder: []dist.OrderEntry{{\n\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"flatten/bp-2\", Version: \"2\"}, Optional: 
false},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"flatten/bp-4\", Version: \"4\"}, Optional: false},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"flatten/bp-6\", Version: \"6\"}, Optional: false},\n\t\t\t\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{ID: \"flatten/bp-7\", Version: \"7\"}, Optional: false},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tStack: pubbldr.StackConfig{\n\t\t\t\t\t\t\tID: \"some.stack.id\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tRun: pubbldr.RunConfig{\n\t\t\t\t\t\t\tImages: []pubbldr.RunImageConfig{{\n\t\t\t\t\t\t\t\tImage:   \"some/run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\"localhost:5000/some/run-image\"},\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tBuild: pubbldr.BuildConfig{\n\t\t\t\t\t\t\tImage: \"some/build-image\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLifecycle: pubbldr.LifecycleConfig{URI: \"file:///some-lifecycle\"},\n\t\t\t\t\t},\n\t\t\t\t\tPublish:    false,\n\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t}\n\t\t\t})\n\n\t\t\twhen(\"flatten all\", func() {\n\t\t\t\tit(\"creates 1 layer for all buildpacks\", func() {\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\topts.Flatten, err = buildpack.ParseFlattenBuildModules([]string{\"flatten/bp-1@1,flatten/bp-2@2,flatten/bp-4@4,flatten/bp-6@6,flatten/bp-7@7,flatten/bp-3@3,flatten/bp-5@5\"})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tsuccessfullyCreateFlattenBuilder()\n\n\t\t\t\t\tlayers := fakeLayerImage.AddedLayersOrder()\n\n\t\t\t\t\th.AssertEq(t, len(layers), 1)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"only some modules are flattened\", func() {\n\t\t\t\tit(\"creates 1 layer for buildpacks [1,2,3,4,5,6] and 1 layer for buildpack [7]\", func() {\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\topts.Flatten, err = buildpack.ParseFlattenBuildModules([]string{\"flatten/bp-1@1,flatten/bp-2@2,flatten/bp-4@4,flatten/bp-6@6,flatten/bp-3@3,flatten/bp-5@5\"})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tsuccessfullyCreateFlattenBuilder()\n\n\t\t\t\t\tlayers := 
fakeLayerImage.AddedLayersOrder()\n\t\t\t\t\th.AssertEq(t, len(layers), 2)\n\t\t\t\t})\n\n\t\t\t\tit(\"creates 1 layer for buildpacks [1,2,3] and 1 layer for [4,5,6] and 1 layer for [7]\", func() {\n\t\t\t\t\tprepareFetcherWithRunImages()\n\t\t\t\t\topts.Flatten, err = buildpack.ParseFlattenBuildModules([]string{\"flatten/bp-1@1,flatten/bp-2@2,flatten/bp-3@3\", \"flatten/bp-4@4,flatten/bp-6@6,flatten/bp-5@5\"})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tsuccessfullyCreateFlattenBuilder()\n\n\t\t\t\t\tlayers := fakeLayerImage.AddedLayersOrder()\n\t\t\t\t\th.AssertEq(t, len(layers), 3)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"daemon target selection for multi-platform builders\", func() {\n\t\t\twhen(\"publish is false\", func() {\n\t\t\t\twhen(\"daemon is linux/amd64\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmockDockerClient.EXPECT().ServerVersion(gomock.Any(), gomock.Any()).Return(dockerclient.ServerVersionResult{\n\t\t\t\t\t\t\tOs:   \"linux\",\n\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t}, nil).AnyTimes()\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"multiple targets are provided\", func() {\n\t\t\t\t\t\tit(\"selects the matching OS and architecture\", func() {\n\t\t\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\t\t\topts.Targets = []dist.Target{\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"}, // should match\n\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\th.AssertNil(t, subject.CreateBuilder(context.TODO(), opts))\n\n\t\t\t\t\t\t\t// Verify that only one image was created (for the matching target)\n\t\t\t\t\t\t\th.AssertEq(t, fakeBuildImage.IsSaved(), true)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"no exact architecture match exists\", func() {\n\t\t\t\t\t\tit(\"returns error\", func() {\n\t\t\t\t\t\t\topts.Targets = []dist.Target{\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t\t\t{OS: 
\"linux\", Arch: \"arm\"},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\t\t\th.AssertError(t, err, \"could not find a target that matches daemon os=linux and architecture=amd64\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"target with empty architecture exists\", func() {\n\t\t\t\t\t\tit(\"selects the OS-only match\", func() {\n\t\t\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\t\t\topts.Targets = []dist.Target{\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"\"}, // should match\n\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\th.AssertNil(t, subject.CreateBuilder(context.TODO(), opts))\n\n\t\t\t\t\t\t\t// Verify that the builder was created\n\t\t\t\t\t\t\th.AssertEq(t, fakeBuildImage.IsSaved(), true)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"daemon is linux/arm64\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tmockDockerClient.EXPECT().ServerVersion(gomock.Any(), gomock.Any()).Return(dockerclient.ServerVersionResult{\n\t\t\t\t\t\t\tOs:   \"linux\",\n\t\t\t\t\t\t\tArch: \"arm64\",\n\t\t\t\t\t\t}, nil).AnyTimes()\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"targets are ordered with amd64 first\", func() {\n\t\t\t\t\t\tit(\"selects arm64 even when amd64 appears first\", func() {\n\t\t\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\t\t\topts.Targets = []dist.Target{\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"}, // appears first\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"}, // should match\n\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"arm64\"},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\th.AssertNil(t, subject.CreateBuilder(context.TODO(), opts))\n\n\t\t\t\t\t\t\t// Verify that the builder was created\n\t\t\t\t\t\t\th.AssertEq(t, fakeBuildImage.IsSaved(), true)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"only amd64 
targets available\", func() {\n\t\t\t\t\t\tit(\"returns error\", func() {\n\t\t\t\t\t\t\topts.Targets = []dist.Target{\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\terr := subject.CreateBuilder(context.TODO(), opts)\n\t\t\t\t\t\t\th.AssertError(t, err, \"could not find a target that matches daemon os=linux and architecture=arm64\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"empty targets list\", func() {\n\t\t\t\t\tit(\"creates builder without calling daemonTarget\", func() {\n\t\t\t\t\t\tprepareFetcherWithBuildImage()\n\t\t\t\t\t\tprepareFetcherWithRunImages()\n\n\t\t\t\t\t\t// Empty targets should use the default behavior\n\t\t\t\t\t\topts.Targets = []dist.Target{}\n\n\t\t\t\t\t\t// ServerVersion should NOT be called for empty targets\n\t\t\t\t\t\tmockDockerClient.EXPECT().ServerVersion(gomock.Any(), gomock.Any()).Times(0)\n\n\t\t\t\t\t\th.AssertNil(t, subject.CreateBuilder(context.TODO(), opts))\n\n\t\t\t\t\t\t// Verify that the builder was created\n\t\t\t\t\t\th.AssertEq(t, fakeBuildImage.IsSaved(), true)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\ntype fakeBadImageStruct struct {\n\t*fakes.Image\n}\n\nfunc (i fakeBadImageStruct) Name() string {\n\treturn \"fake image\"\n}\n\nfunc (i fakeBadImageStruct) Label(str string) (string, error) {\n\treturn \"\", errors.New(\"error here\")\n}\n"
  },
  {
    "path": "pkg/client/docker.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"io\"\n\n\tdockerClient \"github.com/moby/moby/client\"\n)\n\n// DockerClient is the subset of client.APIClient which is required by this package\ntype DockerClient interface {\n\tImageHistory(ctx context.Context, image string, opts ...dockerClient.ImageHistoryOption) (dockerClient.ImageHistoryResult, error)\n\tImageInspect(ctx context.Context, image string, opts ...dockerClient.ImageInspectOption) (dockerClient.ImageInspectResult, error)\n\tImageTag(ctx context.Context, options dockerClient.ImageTagOptions) (dockerClient.ImageTagResult, error)\n\tImageLoad(ctx context.Context, input io.Reader, opts ...dockerClient.ImageLoadOption) (dockerClient.ImageLoadResult, error)\n\tImageSave(ctx context.Context, images []string, opts ...dockerClient.ImageSaveOption) (dockerClient.ImageSaveResult, error)\n\tImageRemove(ctx context.Context, image string, options dockerClient.ImageRemoveOptions) (dockerClient.ImageRemoveResult, error)\n\tImagePull(ctx context.Context, ref string, options dockerClient.ImagePullOptions) (dockerClient.ImagePullResponse, error)\n\tInfo(ctx context.Context, options dockerClient.InfoOptions) (dockerClient.SystemInfoResult, error)\n\tServerVersion(ctx context.Context, options dockerClient.ServerVersionOptions) (dockerClient.ServerVersionResult, error)\n\tVolumeRemove(ctx context.Context, volumeID string, options dockerClient.VolumeRemoveOptions) (dockerClient.VolumeRemoveResult, error)\n\tContainerCreate(ctx context.Context, options dockerClient.ContainerCreateOptions) (dockerClient.ContainerCreateResult, error)\n\tCopyFromContainer(ctx context.Context, containerID string, options dockerClient.CopyFromContainerOptions) (dockerClient.CopyFromContainerResult, error)\n\tContainerInspect(ctx context.Context, containerID string, options dockerClient.ContainerInspectOptions) (dockerClient.ContainerInspectResult, error)\n\tContainerRemove(ctx context.Context, container string, options 
dockerClient.ContainerRemoveOptions) (dockerClient.ContainerRemoveResult, error)\n\tCopyToContainer(ctx context.Context, container string, options dockerClient.CopyToContainerOptions) (dockerClient.CopyToContainerResult, error)\n\tContainerWait(ctx context.Context, containerID string, options dockerClient.ContainerWaitOptions) dockerClient.ContainerWaitResult\n\tContainerAttach(ctx context.Context, container string, options dockerClient.ContainerAttachOptions) (dockerClient.ContainerAttachResult, error)\n\tContainerStart(ctx context.Context, container string, options dockerClient.ContainerStartOptions) (dockerClient.ContainerStartResult, error)\n\tNetworkCreate(ctx context.Context, name string, options dockerClient.NetworkCreateOptions) (dockerClient.NetworkCreateResult, error)\n\tNetworkRemove(ctx context.Context, networkID string, options dockerClient.NetworkRemoveOptions) (dockerClient.NetworkRemoveResult, error)\n}\n"
  },
  {
    "path": "pkg/client/docker_context.go",
    "content": "package client\n\nimport (\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/opencontainers/go-digest\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\nconst (\n\tdockerHostEnvVar            = \"DOCKER_HOST\"\n\tdockerConfigEnvVar          = \"DOCKER_CONFIG\"\n\tdefaultDockerRootConfigDir  = \".docker\"\n\tdefaultDockerConfigFileName = \"config.json\"\n\n\tdockerContextDirName      = \"contexts\"\n\tdockerContextMetaDirName  = \"meta\"\n\tdockerContextMetaFileName = \"meta.json\"\n\tdockerContextEndpoint     = \"docker\"\n\tdefaultDockerContext      = \"default\"\n)\n\ntype configFile struct {\n\tCurrentContext string `json:\"currentContext,omitempty\"`\n}\n\ntype endpoint struct {\n\tHost string `json:\",omitempty\"`\n}\n\n/*\n\t Example Docker context file\n\t {\n\t  \"Name\": \"desktop-linux\",\n\t  \"dockerConfigMetadata\": {\n\t    \"Description\": \"Docker Desktop\"\n\t  },\n\t  \"Endpoints\": {\n\t    \"docker\": {\n\t      \"Host\": \"unix:///Users/jbustamante/.docker/run/docker.sock\",\n\t      \"SkipTLSVerify\": false\n\t    }\n\t  }\n\t}\n*/\ntype dockerConfigMetadata struct {\n\tName      string              `json:\",omitempty\"`\n\tEndpoints map[string]endpoint `json:\"endpoints,omitempty\"`\n}\n\nfunc ProcessDockerContext(logger logging.Logger) error {\n\tdockerHost := os.Getenv(dockerHostEnvVar)\n\tif dockerHost == \"\" {\n\t\tdockerConfigDir, err := configDir()\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tlogger.Debugf(\"looking for docker configuration file at: %s\", dockerConfigDir)\n\t\tconfiguration, err := readConfigFile(dockerConfigDir)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"reading configuration file at '%s'\", dockerConfigDir)\n\t\t}\n\n\t\tif skip(configuration) {\n\t\t\tlogger.Debug(\"docker context is default or empty, skipping it\")\n\t\t\treturn nil\n\t\t}\n\n\t\tconfigMetaData, err := readConfigMetadata(dockerConfigDir, 
configuration.CurrentContext)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"reading metadata for current context '%s' at '%s'\", configuration.CurrentContext, dockerConfigDir)\n\t\t}\n\n\t\tif dockerEndpoint, ok := configMetaData.Endpoints[dockerContextEndpoint]; ok {\n\t\t\tos.Setenv(dockerHostEnvVar, dockerEndpoint.Host)\n\t\t\tlogger.Debugf(\"using docker context '%s' with endpoint = '%s'\", configuration.CurrentContext, dockerEndpoint.Host)\n\t\t} else {\n\t\t\tlogger.Warnf(\"docker endpoint doesn't exist for context '%s'\", configuration.CurrentContext)\n\t\t}\n\t} else {\n\t\tlogger.Debugf(\"'%s=%s' environment variable is being used\", dockerHostEnvVar, dockerHost)\n\t}\n\treturn nil\n}\n\nfunc configDir() (string, error) {\n\tdir := os.Getenv(dockerConfigEnvVar)\n\tif dir == \"\" {\n\t\thome, err := os.UserHomeDir()\n\t\tif err != nil {\n\t\t\treturn \"\", errors.Wrap(err, \"determining user home directory\")\n\t\t}\n\t\tdir = filepath.Join(home, defaultDockerRootConfigDir)\n\t}\n\treturn dir, nil\n}\n\nfunc readConfigFile(configDir string) (*configFile, error) {\n\tfilename := filepath.Join(configDir, defaultDockerConfigFileName)\n\tconfig := &configFile{}\n\tfile, err := os.Open(filename)\n\tif err != nil {\n\t\tif os.IsNotExist(err) {\n\t\t\treturn &configFile{}, nil\n\t\t}\n\t\treturn &configFile{}, err\n\t}\n\tdefer file.Close()\n\tif err := json.NewDecoder(file).Decode(config); err != nil && !errors.Is(err, io.EOF) {\n\t\treturn &configFile{}, err\n\t}\n\treturn config, nil\n}\n\nfunc readConfigMetadata(configDir string, context string) (dockerConfigMetadata, error) {\n\tdockerContextDir := filepath.Join(configDir, dockerContextDirName)\n\tmetaFileName := filepath.Join(dockerContextDir, dockerContextMetaDirName, digest.FromString(context).Encoded(), dockerContextMetaFileName)\n\tbytes, err := os.ReadFile(metaFileName)\n\tif err != nil {\n\t\tif errors.Is(err, os.ErrNotExist) {\n\t\t\treturn dockerConfigMetadata{}, fmt.Errorf(\"docker context 
'%s' not found\", context)\n\t\t}\n\t\treturn dockerConfigMetadata{}, err\n\t}\n\tvar meta dockerConfigMetadata\n\tif err := json.Unmarshal(bytes, &meta); err != nil {\n\t\treturn dockerConfigMetadata{}, fmt.Errorf(\"parsing %s: %v\", metaFileName, err)\n\t}\n\tif meta.Name != context {\n\t\treturn dockerConfigMetadata{}, fmt.Errorf(\"context '%s' doesn't match metadata name '%s' at '%s'\", context, meta.Name, metaFileName)\n\t}\n\n\treturn meta, nil\n}\n\nfunc skip(configuration *configFile) bool {\n\treturn configuration == nil || configuration.CurrentContext == defaultDockerContext || configuration.CurrentContext == \"\"\n}\n"
  },
  {
    "path": "pkg/client/docker_context_test.go",
    "content": "package client_test\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestProcessDockerContext(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"processDockerContext\", testProcessDockerContext, spec.Report(report.Terminal{}))\n}\n\nconst (\n\trootFolder = \"docker-context\"\n\thappyCase  = \"happy-cases\"\n\terrorCase  = \"error-cases\"\n)\n\nfunc testProcessDockerContext(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\toutBuf bytes.Buffer\n\t\tlogger logging.Logger\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose())\n\t})\n\n\twhen(\"env DOCKER_HOST is set\", func() {\n\t\tit.Before(func() {\n\t\t\tos.Setenv(\"DOCKER_HOST\", \"some-value\")\n\t\t})\n\n\t\tit(\"docker context process is skipped\", func() {\n\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertContains(t, strings.TrimSpace(outBuf.String()), \"'DOCKER_HOST=some-value' environment variable is being used\")\n\t\t})\n\t})\n\n\twhen(\"env DOCKER_HOST is empty\", func() {\n\t\tit.Before(func() {\n\t\t\tos.Setenv(\"DOCKER_HOST\", \"\")\n\t\t})\n\n\t\twhen(\"config.json has currentContext\", func() {\n\t\t\twhen(\"currentContext is default\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsetDockerConfig(t, happyCase, \"default-context\")\n\t\t\t\t})\n\n\t\t\t\tit(\"docker context process is skipped\", func() {\n\t\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertContains(t, strings.TrimSpace(outBuf.String()), \"docker context is default or empty, skipping it\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"currentContext is 
default but config doesn't exist\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsetDockerConfig(t, errorCase, \"empty-context\")\n\t\t\t\t})\n\n\t\t\t\tit(\"throws an error\", func() {\n\t\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertError(t, err, \"docker context 'some-bad-context' not found\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"currentContext is not default\", func() {\n\t\t\t\twhen(\"metadata has one endpoint\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsetDockerConfig(t, happyCase, \"custom-context\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"docker endpoint host is being used\", func() {\n\t\t\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"using docker context 'desktop-linux' with endpoint = 'unix:///Users/user/.docker/run/docker.sock'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"metadata has more than one endpoint\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsetDockerConfig(t, happyCase, \"two-endpoints-context\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"docker endpoint host is being used\", func() {\n\t\t\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"using docker context 'desktop-linux' with endpoint = 'unix:///Users/user/.docker/run/docker.sock'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"currentContext doesn't match metadata name\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsetDockerConfig(t, errorCase, \"current-context-does-not-match\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"throws an error\", func() {\n\t\t\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t\th.AssertError(t, err, \"context 'desktop-linux' doesn't match metadata name 'bad-name'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"metadata doesn't contain a docker endpoint\", func() 
{\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsetDockerConfig(t, errorCase, \"docker-endpoint-does-not-exist\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"writes a warning message into the log\", func() {\n\t\t\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"docker endpoint doesn't exist for context 'desktop-linux'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"metadata is invalid\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tsetDockerConfig(t, errorCase, \"invalid-metadata\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"throws an error\", func() {\n\t\t\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t\th.AssertError(t, err, \"reading metadata for current context 'desktop-linux'\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config.json is invalid\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsetDockerConfig(t, errorCase, \"invalid-config\")\n\t\t\t})\n\n\t\t\tit(\"throws an error\", func() {\n\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\th.AssertError(t, err, \"reading configuration file\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config.json doesn't have current context\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsetDockerConfig(t, happyCase, \"current-context-not-defined\")\n\t\t\t})\n\n\t\t\tit(\"docker context process is skipped\", func() {\n\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, strings.TrimSpace(outBuf.String()), \"docker context is default or empty, skipping it\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"docker config folder doesn't exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsetDockerConfig(t, errorCase, \"no-docker-folder\")\n\t\t\t})\n\n\t\t\tit(\"docker context process is skipped\", func() {\n\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, 
strings.TrimSpace(outBuf.String()), \"docker context is default or empty, skipping it\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"config.json doesn't exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tsetDockerConfig(t, errorCase, \"config-does-not-exist\")\n\t\t\t})\n\n\t\t\tit(\"docker context process is skipped\", func() {\n\t\t\t\terr := client.ProcessDockerContext(logger)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, strings.TrimSpace(outBuf.String()), \"docker context is default or empty, skipping it\")\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc setDockerConfig(t *testing.T, test, context string) {\n\tt.Helper()\n\tcontextDir, err := filepath.Abs(filepath.Join(\"testdata\", rootFolder, test, context))\n\th.AssertNil(t, err)\n\terr = os.Setenv(\"DOCKER_CONFIG\", contextDir)\n\th.AssertNil(t, err)\n}\n"
  },
  {
    "path": "pkg/client/download_sbom.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\n\t\"github.com/buildpacks/lifecycle/layers\"\n\t\"github.com/buildpacks/lifecycle/platform\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype DownloadSBOMOptions struct {\n\tDaemon         bool\n\tDestinationDir string\n}\n\n// Deserialize just the subset of fields we need to avoid breaking changes\ntype sbomMetadata struct {\n\tBOM *files.LayerMetadata `json:\"sbom\" toml:\"sbom\"`\n}\n\nfunc (s *sbomMetadata) isMissing() bool {\n\treturn s == nil ||\n\t\ts.BOM == nil ||\n\t\ts.BOM.SHA == \"\"\n}\n\nconst (\n\tLocal = iota\n\tRemote\n)\n\n// DownloadSBOM pulls the SBOM layer from an image.\n// It reads the SBOM metadata of an image and then\n// pulls the layer with the corresponding diffId, if it exists.\nfunc (c *Client) DownloadSBOM(name string, options DownloadSBOMOptions) error {\n\timg, err := c.imageFetcher.Fetch(context.Background(), name, image.FetchOptions{Daemon: options.Daemon, PullPolicy: image.PullNever})\n\tif err != nil {\n\t\tif errors.Cause(err) == image.ErrNotFound {\n\t\t\tc.logger.Warnf(\"if the image is saved on a registry run with the flag '--remote', for example: 'pack sbom download --remote %s'\", name)\n\t\t\treturn errors.Wrapf(image.ErrNotFound, \"image '%s' cannot be found\", name)\n\t\t}\n\t\treturn err\n\t}\n\n\tvar sbomMD sbomMetadata\n\tif _, err := dist.GetLabel(img, platform.LifecycleMetadataLabel, &sbomMD); err != nil {\n\t\treturn err\n\t}\n\n\tif sbomMD.isMissing() {\n\t\treturn errors.Errorf(\"could not find SBoM information on '%s'\", name)\n\t}\n\n\trc, err := img.GetLayer(sbomMD.BOM.SHA)\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer rc.Close()\n\n\treturn layers.Extract(rc, options.DestinationDir)\n}\n"
  },
  {
    "path": "pkg/client/download_sbom_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"crypto/sha256\"\n\t\"encoding/hex\"\n\t\"errors\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestDownloadSBOM(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"DownloadSBOM\", testDownloadSBOM, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testDownloadSBOM(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject          *Client\n\t\tmockImageFetcher *testmocks.MockImageFetcher\n\t\tmockDockerClient *testmocks.MockAPIClient\n\t\tmockController   *gomock.Controller\n\t\tout              bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\t\tmockDockerClient = testmocks.NewMockAPIClient(mockController)\n\n\t\tvar err error\n\t\tsubject, err = NewClient(WithLogger(logging.NewLogWithWriters(&out, &out)), WithFetcher(mockImageFetcher), WithDockerClient(mockDockerClient))\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"the image exists\", func() {\n\t\tvar (\n\t\t\tmockImage *testmocks.MockImage\n\t\t\ttmpDir    string\n\t\t\ttmpFile   string\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack.download.sbom.test.\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\tf, err := os.CreateTemp(\"\", \"pack.download.sbom.test.\")\n\t\t\th.AssertNil(t, err)\n\t\t\ttmpFile = f.Name()\n\n\t\t\terr = archive.CreateSingleFileTar(tmpFile, 
\"sbom\", \"some-sbom-content\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\tdata, err := os.ReadFile(tmpFile)\n\t\t\th.AssertNil(t, err)\n\n\t\t\thsh := sha256.New()\n\t\t\thsh.Write(data)\n\t\t\tshasum := hex.EncodeToString(hsh.Sum(nil))\n\n\t\t\tmockImage = testmocks.NewImage(\"some/image\", \"\", nil)\n\t\t\tmockImage.AddLayerWithDiffID(tmpFile, fmt.Sprintf(\"sha256:%s\", shasum))\n\t\t\th.AssertNil(t, mockImage.SetLabel(\n\t\t\t\t\"io.buildpacks.lifecycle.metadata\",\n\t\t\t\tfmt.Sprintf(\n\t\t\t\t\t`{\n  \"sbom\": {\n    \"sha\": \"sha256:%s\"\n  }\n}`, shasum)))\n\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(mockImage, nil)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\tos.RemoveAll(tmpDir)\n\t\t\tos.RemoveAll(tmpFile)\n\t\t})\n\n\t\tit(\"downloads the SBOM layer\", func() {\n\t\t\terr := subject.DownloadSBOM(\"some/image\", DownloadSBOMOptions{Daemon: true, DestinationDir: tmpDir})\n\t\t\th.AssertNil(t, err)\n\n\t\t\tcontents, err := os.ReadFile(filepath.Join(tmpDir, \"sbom\"))\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertEq(t, string(contents), \"some-sbom-content\")\n\t\t})\n\t})\n\n\twhen(\"the image doesn't exist\", func() {\n\t\tit(\"returns an error\", func() {\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/non-existent-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(nil, image.ErrNotFound)\n\n\t\t\terr := subject.DownloadSBOM(\"some/non-existent-image\", DownloadSBOMOptions{Daemon: true, DestinationDir: \"\"})\n\t\t\texpectedError := fmt.Sprintf(\"image '%s' cannot be found\", \"some/non-existent-image\")\n\t\t\th.AssertError(t, err, expectedError)\n\n\t\t\texpectedMessage := fmt.Sprintf(\"Warning: if the image is saved on a registry run with the flag '--remote', for example: 'pack sbom download --remote %s'\", \"some/non-existent-image\")\n\t\t\th.AssertContains(t, out.String(), expectedMessage)\n\t\t})\n\t})\n\n\twhen(\"there is an error 
fetching the image\", func() {\n\t\tit(\"returns the error\", func() {\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(nil, errors.New(\"some-error\"))\n\n\t\t\terr := subject.DownloadSBOM(\"some/image\", DownloadSBOMOptions{Daemon: true, DestinationDir: \"\"})\n\t\t\th.AssertError(t, err, \"some-error\")\n\t\t})\n\t})\n\n\twhen(\"the image is missing SBOM metadata\", func() {\n\t\tit(\"returns an error\", func() {\n\t\t\tmockImageFetcher.EXPECT().\n\t\t\t\tFetch(gomock.Any(), \"some/image-without-labels\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).\n\t\t\t\tReturn(fakes.NewImage(\"some/image-without-labels\", \"\", nil), nil)\n\n\t\t\terr := subject.DownloadSBOM(\"some/image-without-labels\", DownloadSBOMOptions{Daemon: true, DestinationDir: \"\"})\n\t\t\th.AssertError(t, err, \"could not find SBoM information on 'some/image-without-labels'\")\n\t\t})\n\t})\n\n\twhen(\"the image has malformed metadata\", func() {\n\t\tvar badImage *fakes.Image\n\n\t\tit.Before(func() {\n\t\t\tbadImage = fakes.NewImage(\"some/image-with-malformed-metadata\", \"\", nil)\n\t\t\tmockImageFetcher.EXPECT().\n\t\t\t\tFetch(gomock.Any(), \"some/image-with-malformed-metadata\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).\n\t\t\t\tReturn(badImage, nil)\n\t\t})\n\n\t\tit(\"returns an error when layer metadata cannot be parsed\", func() {\n\t\t\th.AssertNil(t, badImage.SetLabel(\"io.buildpacks.lifecycle.metadata\", \"not   ----  json\"))\n\n\t\t\terr := subject.DownloadSBOM(\"some/image-with-malformed-metadata\", DownloadSBOMOptions{Daemon: true, DestinationDir: \"\"})\n\t\t\th.AssertError(t, err, \"unmarshalling label 'io.buildpacks.lifecycle.metadata'\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/errors.go",
    "content": "package client\n\n// ExperimentError denotes that an experimental feature was used without experimental features being enabled.\ntype ExperimentError struct {\n\tmsg string\n}\n\nfunc NewExperimentError(msg string) ExperimentError {\n\treturn ExperimentError{msg}\n}\n\nfunc (ee ExperimentError) Error() string {\n\treturn ee.msg\n}\n\n// SoftError is an error that is not intended to be displayed.\ntype SoftError struct{}\n\nfunc NewSoftError() SoftError {\n\treturn SoftError{}\n}\n\nfunc (se SoftError) Error() string {\n\treturn \"\"\n}\n"
  },
  {
    "path": "pkg/client/example_build_test.go",
    "content": "//go:build !windows && example\n\npackage client_test\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"path/filepath\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n)\n\n// This example shows the basic usage of the package: create a client,\n// create a configuration object, and call the client's Build function.\nfunc Example_build() {\n\t// create a context object\n\tcontext := context.Background()\n\n\t// initialize a pack client\n\tpack, err := client.NewClient()\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\n\t// replace this with the location of a sample application\n\t// For a list of prepared samples see the 'apps' folder at\n\t// https://github.com/buildpacks/samples.\n\tappPath := filepath.Join(\"testdata\", \"some-app\")\n\n\t// initialize our options\n\tbuildOpts := client.BuildOptions{\n\t\tImage:        \"pack-lib-test-image:0.0.1\",\n\t\tBuilder:      \"cnbs/sample-builder:noble\",\n\t\tAppPath:      appPath,\n\t\tTrustBuilder: func(string) bool { return true },\n\t}\n\n\t// build an image\n\terr = pack.Build(context, buildOpts)\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\n\tfmt.Println(\"build completed\")\n\t// Output: build completed\n}\n"
  },
  {
    "path": "pkg/client/example_buildpack_downloader_test.go",
    "content": "//go:build !windows && example\n\npackage client_test\n\nimport (\n\t\"context\"\n\t\"errors\"\n\t\"fmt\"\n\t\"path/filepath\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n)\n\n// This example shows how to replace the buildpack downloader component\nfunc Example_buildpack_downloader() {\n\t// create a context object\n\tcontext := context.Background()\n\n\t// initialize a pack client\n\tpack, err := client.NewClient(client.WithBuildpackDownloader(&bpDownloader{}))\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\n\t// replace this with the location of a sample application\n\t// For a list of prepared samples see the 'apps' folder at\n\t// https://github.com/buildpacks/samples.\n\tappPath := filepath.Join(\"testdata\", \"some-app\")\n\n\t// initialize our options\n\tbuildOpts := client.BuildOptions{\n\t\tImage:        \"pack-lib-test-image:0.0.1\",\n\t\tBuilder:      \"cnbs/sample-builder:bionic\",\n\t\tAppPath:      appPath,\n\t\tBuildpacks:   []string{\"some-buildpack:1.2.3\"},\n\t\tTrustBuilder: func(string) bool { return true },\n\t}\n\n\t// build an image\n\t_ = pack.Build(context, buildOpts)\n\n\t// Output: custom buildpack downloader called\n}\n\nvar _ client.BuildpackDownloader = (*bpDownloader)(nil)\n\ntype bpDownloader struct{}\n\nfunc (f *bpDownloader) Download(ctx context.Context, buildpackURI string, opts buildpack.DownloadOptions) (buildpack.BuildModule, []buildpack.BuildModule, error) {\n\tfmt.Println(\"custom buildpack downloader called\")\n\treturn nil, nil, errors.New(\"not implemented\")\n}\n"
  },
  {
    "path": "pkg/client/example_fetcher_test.go",
    "content": "//go:build !windows && example\n\npackage client_test\n\nimport (\n\t\"context\"\n\t\"errors\"\n\t\"fmt\"\n\t\"path/filepath\"\n\n\t\"github.com/buildpacks/imgutil\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\n// This example shows how to replace the image fetcher component\nfunc Example_fetcher() {\n\t// create a context object\n\tcontext := context.Background()\n\n\t// initialize a pack client\n\tpack, err := client.NewClient(client.WithFetcher(&fetcher{}))\n\tif err != nil {\n\t\tpanic(err)\n\t}\n\n\t// replace this with the location of a sample application\n\t// For a list of prepared samples see the 'apps' folder at\n\t// https://github.com/buildpacks/samples.\n\tappPath := filepath.Join(\"testdata\", \"some-app\")\n\n\t// initialize our options\n\tbuildOpts := client.BuildOptions{\n\t\tImage:        \"pack-lib-test-image:0.0.1\",\n\t\tBuilder:      \"cnbs/sample-builder:bionic\",\n\t\tAppPath:      appPath,\n\t\tTrustBuilder: func(string) bool { return true },\n\t}\n\n\t// build an image\n\t_ = pack.Build(context, buildOpts)\n\n\t// Output: custom fetcher called\n}\n\nvar _ client.ImageFetcher = (*fetcher)(nil)\n\ntype fetcher struct{}\n\nfunc (f *fetcher) Fetch(_ context.Context, imageName string, _ image.FetchOptions) (imgutil.Image, error) {\n\tfmt.Println(\"custom fetcher called\")\n\treturn nil, errors.New(\"not implemented\")\n}\n\nfunc (f *fetcher) FetchForPlatform(_ context.Context, imageName string, _ image.FetchOptions) (imgutil.Image, error) {\n\tfmt.Println(\"custom fetcher called\")\n\treturn nil, errors.New(\"not implemented\")\n}\n\nfunc (f *fetcher) CheckReadAccess(_ string, _ image.FetchOptions) bool {\n\treturn true\n}\n"
  },
  {
    "path": "pkg/client/input_image_reference.go",
    "content": "package client\n\nimport (\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n)\n\ntype InputImageReference interface {\n\tName() string\n\tLayout() bool\n\tFullName() (string, error)\n}\n\ntype defaultInputImageReference struct {\n\tname string\n}\n\ntype layoutInputImageReference struct {\n\tname string\n}\n\nfunc ParseInputImageReference(input string) InputImageReference {\n\tif strings.HasPrefix(input, \"oci:\") {\n\t\timageNameParsed := strings.SplitN(input, \":\", 2)\n\t\treturn &layoutInputImageReference{\n\t\t\tname: imageNameParsed[1],\n\t\t}\n\t}\n\treturn &defaultInputImageReference{\n\t\tname: input,\n\t}\n}\n\nfunc (d *defaultInputImageReference) Name() string {\n\treturn d.name\n}\n\nfunc (d *defaultInputImageReference) Layout() bool {\n\treturn false\n}\n\nfunc (d *defaultInputImageReference) FullName() (string, error) {\n\treturn d.name, nil\n}\n\nfunc (l *layoutInputImageReference) Name() string {\n\treturn filepath.Base(l.name)\n}\n\nfunc (l *layoutInputImageReference) Layout() bool {\n\treturn true\n}\n\nfunc (l *layoutInputImageReference) FullName() (string, error) {\n\tvar (\n\t\tfullImagePath string\n\t\terr           error\n\t)\n\n\tpath := parsePath(l.name)\n\n\tif fullImagePath, err = filepath.EvalSymlinks(path); err != nil {\n\t\tif !os.IsNotExist(err) {\n\t\t\treturn \"\", errors.Wrap(err, \"evaluate symlink\")\n\t\t} else {\n\t\t\tfullImagePath = path\n\t\t}\n\t}\n\n\tif fullImagePath, err = filepath.Abs(fullImagePath); err != nil {\n\t\treturn \"\", errors.Wrap(err, \"resolve absolute path\")\n\t}\n\n\treturn fullImagePath, nil\n}\n\nfunc parsePath(path string) string {\n\tvar result string\n\tif filepath.IsAbs(path) && runtime.GOOS == \"windows\" {\n\t\tdir, fileWithTag := filepath.Split(path)\n\t\tfile := removeTag(fileWithTag)\n\t\tresult = filepath.Join(dir, file)\n\t} else {\n\t\tresult = removeTag(path)\n\t}\n\treturn result\n}\n\nfunc removeTag(path string) string 
{\n\tresult := path\n\tif strings.Contains(path, \":\") {\n\t\tsplit := strings.SplitN(path, \":\", 2)\n\t\t// do not include the tag in the path\n\t\tresult = split[0]\n\t}\n\treturn result\n}\n"
  },
  {
    "path": "pkg/client/input_image_reference_test.go",
    "content": "package client\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestInputImageReference(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"InputImageReference\", testInputImageReference, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testInputImageReference(t *testing.T, when spec.G, it spec.S) {\n\tvar defaultImageReference, layoutImageReference InputImageReference\n\n\tit.Before(func() {\n\t\tdefaultImageReference = ParseInputImageReference(\"busybox\")\n\t\tlayoutImageReference = ParseInputImageReference(\"oci:my-app\")\n\t})\n\n\twhen(\"#ParseInputImageReference\", func() {\n\t\twhen(\"oci layout image reference is not provided\", func() {\n\t\t\tit(\"default implementation is returned\", func() {\n\t\t\t\th.AssertEq(t, defaultImageReference.Layout(), false)\n\t\t\t\th.AssertEq(t, defaultImageReference.Name(), \"busybox\")\n\n\t\t\t\tfullName, err := defaultImageReference.FullName()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, fullName, \"busybox\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"oci layout image reference is provided\", func() {\n\t\t\tit(\"layout implementation is returned\", func() {\n\t\t\t\th.AssertTrue(t, layoutImageReference.Layout())\n\t\t\t\th.AssertEq(t, layoutImageReference.Name(), \"my-app\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#FullName\", func() {\n\t\twhen(\"oci layout image reference is provided\", func() {\n\t\t\twhen(\"not absolute path provided\", func() {\n\t\t\t\tit(\"it will be joined with the current working directory\", func() {\n\t\t\t\t\tfullPath, err := layoutImageReference.FullName()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tcurrentWorkingDir, err := os.Getwd()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\texpectedPath := filepath.Join(currentWorkingDir, 
layoutImageReference.Name())\n\t\t\t\t\th.AssertEq(t, fullPath, expectedPath)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"absolute path provided\", func() {\n\t\t\t\tvar (\n\t\t\t\t\tfullPath, expectedFullPath, tmpDir string\n\t\t\t\t\terr                                error\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack.input.image.reference.test\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\texpectedFullPath = filepath.Join(tmpDir, \"my-app\")\n\t\t\t\t\tlayoutImageReference = ParseInputImageReference(fmt.Sprintf(\"oci:%s\", expectedFullPath))\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\terr = os.RemoveAll(tmpDir)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the path provided\", func() {\n\t\t\t\t\tfullPath, err = layoutImageReference.FullName()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, fullPath, expectedFullPath)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/inspect_builder.go",
    "content": "package client\n\nimport (\n\t\"errors\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\n// BuilderInfo is a collection of metadata describing a builder created using client.\ntype BuilderInfo struct {\n\t// Human readable, description of a builder.\n\tDescription string\n\n\t// Stack name used by the builder.\n\tStack string\n\n\t// List of Stack mixins, this information is provided by Stack variable.\n\tMixins []string\n\n\t// RunImage provided by the builder.\n\tRunImages []pubbldr.RunImageConfig\n\n\t// All buildpacks included within the builder.\n\tBuildpacks []dist.ModuleInfo\n\n\t// Detailed ordering of buildpacks and nested buildpacks where depth is specified.\n\tOrder pubbldr.DetectionOrder\n\n\t// Listing of all buildpack layers in a builder.\n\t// All elements in the Buildpacks variable are represented in this\n\t// object.\n\tBuildpackLayers dist.ModuleLayers\n\n\t// Lifecycle provides the following API versioning information for a builder:\n\t// - Lifecycle Version used in this builder,\n\t// - Platform API,\n\t// - Buildpack API.\n\tLifecycle builder.LifecycleDescriptor\n\n\t// Name and Version information from tooling used\n\t// to produce this builder.\n\tCreatedBy builder.CreatorMetadata\n\n\t// All extensions included within the builder.\n\tExtensions []dist.ModuleInfo\n\n\t// Detailed ordering of extensions.\n\tOrderExtensions pubbldr.DetectionOrder\n}\n\n// BuildpackInfoKey contains all information needed to determine buildpack equivalence.\ntype BuildpackInfoKey struct {\n\tID      string\n\tVersion string\n}\n\ntype BuilderInspectionConfig struct {\n\tOrderDetectionDepth int\n}\n\ntype BuilderInspectionModifier func(config *BuilderInspectionConfig)\n\nfunc WithDetectionOrderDepth(depth int) BuilderInspectionModifier {\n\treturn func(config *BuilderInspectionConfig) 
{\n\t\tconfig.OrderDetectionDepth = depth\n\t}\n}\n\n// InspectBuilder reads label metadata of a local or remote builder image. It initializes a BuilderInfo\n// object with this metadata, and returns it. This method will error if the named image cannot be found\n// both locally and remotely, or if the found image does not contain the proper labels.\nfunc (c *Client) InspectBuilder(name string, daemon bool, modifiers ...BuilderInspectionModifier) (*BuilderInfo, error) {\n\tinspector := builder.NewInspector(\n\t\tbuilder.NewImageFetcherWrapper(c.imageFetcher),\n\t\tbuilder.NewLabelManagerProvider(),\n\t\tbuilder.NewDetectionOrderCalculator(),\n\t)\n\n\tinspectionConfig := BuilderInspectionConfig{OrderDetectionDepth: pubbldr.OrderDetectionNone}\n\tfor _, mod := range modifiers {\n\t\tmod(&inspectionConfig)\n\t}\n\n\tinfo, err := inspector.Inspect(name, daemon, inspectionConfig.OrderDetectionDepth)\n\tif err != nil {\n\t\tif errors.Is(err, image.ErrNotFound) {\n\t\t\treturn nil, nil\n\t\t}\n\t\treturn nil, err\n\t}\n\n\treturn &BuilderInfo{\n\t\tDescription:     info.Description,\n\t\tStack:           info.StackID,\n\t\tMixins:          info.Mixins,\n\t\tRunImages:       info.RunImages,\n\t\tBuildpacks:      info.Buildpacks,\n\t\tOrder:           info.Order,\n\t\tBuildpackLayers: info.BuildpackLayers,\n\t\tLifecycle:       info.Lifecycle,\n\t\tCreatedBy:       info.CreatedBy,\n\t\tExtensions:      info.Extensions,\n\t\tOrderExtensions: info.OrderExtensions,\n\t}, nil\n}\n"
  },
  {
    "path": "pkg/client/inspect_builder_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"testing\"\n\n\tpubbldr \"github.com/buildpacks/pack/builder\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-cmp/cmp\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestInspectBuilder(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"InspectBuilder\", testInspectBuilder, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testInspectBuilder(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject          *Client\n\t\tmockImageFetcher *testmocks.MockImageFetcher\n\t\tmockController   *gomock.Controller\n\t\tbuilderImage     *fakes.Image\n\t\tout              bytes.Buffer\n\t\tassert           = h.NewAssertionManager(t)\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\n\t\tsubject = &Client{\n\t\t\tlogger:       logging.NewLogWithWriters(&out, &out),\n\t\t\timageFetcher: mockImageFetcher,\n\t\t}\n\n\t\tbuilderImage = fakes.NewImage(\"some/builder\", \"\", nil)\n\t\tassert.Succeeds(builderImage.SetLabel(\"io.buildpacks.stack.id\", \"test.stack.id\"))\n\t\tassert.Succeeds(builderImage.SetLabel(\n\t\t\t\"io.buildpacks.stack.mixins\",\n\t\t\t`[\"mixinOne\", \"build:mixinTwo\", \"mixinThree\", \"build:mixinFour\"]`,\n\t\t))\n\t\tassert.Succeeds(builderImage.SetEnv(\"CNB_USER_ID\", \"1234\"))\n\t\tassert.Succeeds(builderImage.SetEnv(\"CNB_GROUP_ID\", 
\"4321\"))\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"the image exists\", func() {\n\t\tfor _, useDaemon := range []bool{true, false} {\n\t\t\tuseDaemon := useDaemon\n\t\t\twhen(fmt.Sprintf(\"daemon is %t\", useDaemon), func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tif useDaemon {\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/builder\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(builderImage, nil)\n\t\t\t\t\t} else {\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/builder\", image.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(builderImage, nil)\n\t\t\t\t\t}\n\t\t\t\t})\n\n\t\t\t\twhen(\"only deprecated lifecycle apis are present\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tassert.Succeeds(builderImage.SetLabel(\n\t\t\t\t\t\t\t\"io.buildpacks.builder.metadata\",\n\t\t\t\t\t\t\t`{\"lifecycle\": {\"version\": \"1.2.3\", \"api\": {\"buildpack\": \"1.2\",\"platform\": \"2.3\"}}}`,\n\t\t\t\t\t\t))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns has both deprecated and new fields\", func() {\n\t\t\t\t\t\tbuilderInfo, err := subject.InspectBuilder(\"some/builder\", useDaemon)\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\tassert.Equal(builderInfo.Lifecycle, builder.LifecycleDescriptor{\n\t\t\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\t\t\tVersion: builder.VersionMustParse(\"1.2.3\"),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tAPI: builder.LifecycleAPI{\n\t\t\t\t\t\t\t\tBuildpackVersion: api.MustParse(\"1.2\"),\n\t\t\t\t\t\t\t\tPlatformVersion:  api.MustParse(\"2.3\"),\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\t\t\tBuildpack: builder.APIVersions{Supported: builder.APISet{api.MustParse(\"1.2\")}},\n\t\t\t\t\t\t\t\tPlatform:  builder.APIVersions{Supported: builder.APISet{api.MustParse(\"2.3\")}},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"the builder image has appropriate metadata labels\", func() 
{\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tassert.Succeeds(builderImage.SetLabel(\"io.buildpacks.builder.metadata\", `{\n  \"description\": \"Some description\",\n  \"stack\": {\n    \"runImage\": {\n      \"image\": \"some/run-image\",\n      \"mirrors\": [\n        \"gcr.io/some/default\"\n      ]\n    }\n  },\n  \"buildpacks\": [\n    {\n      \"id\": \"test.nested\",\n\t  \"version\": \"test.nested.version\",\n\t  \"homepage\": \"http://geocities.com/top-bp\"\n\t},\n\t{\n      \"id\": \"test.bp.one\",\n\t  \"version\": \"test.bp.one.version\",\n\t  \"homepage\": \"http://geocities.com/cool-bp\",\n\t  \"name\": \"one\"\n    },\n\t{\n      \"id\": \"test.bp.two\",\n\t  \"version\": \"test.bp.two.version\"\n    },\n\t{\n      \"id\": \"test.bp.two\",\n\t  \"version\": \"test.bp.two.version\"\n    }\n  ],\n  \"lifecycle\": {\"version\": \"1.2.3\", \"api\": {\"buildpack\": \"0.1\",\"platform\": \"2.3\"}, \"apis\":  {\n\t\"buildpack\": {\"deprecated\": [\"0.1\"], \"supported\": [\"1.2\", \"1.3\"]},\n\t\"platform\": {\"deprecated\": [], \"supported\": [\"2.3\", \"2.4\"]}\n  }},\n  \"createdBy\": {\"name\": \"pack\", \"version\": \"1.2.3\"},\n  \"images\": [\n    {\n      \"image\": \"some/run-image\",\n      \"mirrors\": [\n        \"gcr.io/some/default\"\n      ]\n    }\n  ]\n}`))\n\n\t\t\t\t\t\tassert.Succeeds(builderImage.SetLabel(\n\t\t\t\t\t\t\t\"io.buildpacks.buildpack.order\",\n\t\t\t\t\t\t\t`[\n\t{\n\t  \"group\": \n\t\t[\n\t\t  {\n\t\t\t\"id\": \"test.nested\",\n\t\t\t\"version\": \"test.nested.version\",\n\t\t\t\"optional\": false\n\t\t  },\n\t\t  {\n\t\t\t\"id\": \"test.bp.two\",\n\t\t\t\"optional\": true\n\t\t  }\n\t\t]\n\t}\n]`,\n\t\t\t\t\t\t))\n\n\t\t\t\t\t\tassert.Succeeds(builderImage.SetLabel(\n\t\t\t\t\t\t\t\"io.buildpacks.buildpack.layers\",\n\t\t\t\t\t\t\t`{\n  \"test.nested\": {\n    \"test.nested.version\": {\n      \"api\": \"0.2\",\n      \"order\": [\n        {\n          \"group\": [\n            {\n              \"id\": \"test.bp.one\",\n  
            \"version\": \"test.bp.one.version\"\n            },\n            {\n              \"id\": \"test.bp.two\",\n              \"version\": \"test.bp.two.version\"\n            }\n          ]\n        }\n      ],\n      \"layerDiffID\": \"sha256:test.nested.sha256\",\n\t  \"homepage\": \"http://geocities.com/top-bp\"\n    }\n  },\n  \"test.bp.one\": {\n    \"test.bp.one.version\": {\n      \"api\": \"0.2\",\n      \"stacks\": [\n        {\n          \"id\": \"test.stack.id\"\n        }\n      ],\n      \"layerDiffID\": \"sha256:test.bp.one.sha256\",\n\t  \"homepage\": \"http://geocities.com/cool-bp\",\n\t  \"name\": \"one\"\n    }\n  },\n \"test.bp.two\": {\n    \"test.bp.two.version\": {\n      \"api\": \"0.2\",\n      \"stacks\": [\n        {\n          \"id\": \"test.stack.id\"\n        }\n      ],\n      \"layerDiffID\": \"sha256:test.bp.two.sha256\"\n    }\n  }\n}`))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the builder with the given name with information from the label\", func() {\n\t\t\t\t\t\tbuilderInfo, err := subject.InspectBuilder(\"some/builder\", useDaemon)\n\t\t\t\t\t\tassert.Nil(err)\n\t\t\t\t\t\tapiVersion, err := api.NewVersion(\"0.2\")\n\t\t\t\t\t\tassert.Nil(err)\n\n\t\t\t\t\t\twant := BuilderInfo{\n\t\t\t\t\t\t\tDescription: \"Some description\",\n\t\t\t\t\t\t\tStack:       \"test.stack.id\",\n\t\t\t\t\t\t\tMixins:      []string{\"mixinOne\", \"mixinThree\", \"build:mixinTwo\", \"build:mixinFour\"},\n\t\t\t\t\t\t\tRunImages:   []pubbldr.RunImageConfig{{Image: \"some/run-image\", Mirrors: []string{\"gcr.io/some/default\"}}},\n\t\t\t\t\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tID:       \"test.bp.one\",\n\t\t\t\t\t\t\t\t\tVersion:  \"test.bp.one.version\",\n\t\t\t\t\t\t\t\t\tName:     \"one\",\n\t\t\t\t\t\t\t\t\tHomepage: \"http://geocities.com/cool-bp\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tID:      \"test.bp.two\",\n\t\t\t\t\t\t\t\t\tVersion: 
\"test.bp.two.version\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tID:       \"test.nested\",\n\t\t\t\t\t\t\t\t\tVersion:  \"test.nested.version\",\n\t\t\t\t\t\t\t\t\tHomepage: \"http://geocities.com/top-bp\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"test.nested\", Version: \"test.nested.version\"},\n\t\t\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"test.bp.two\"},\n\t\t\t\t\t\t\t\t\t\t\t\tOptional:   true,\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tBuildpackLayers: map[string]map[string]dist.ModuleLayerInfo{\n\t\t\t\t\t\t\t\t\"test.nested\": {\n\t\t\t\t\t\t\t\t\t\"test.nested.version\": {\n\t\t\t\t\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tID:      \"test.bp.one\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tVersion: \"test.bp.one.version\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tID:      \"test.bp.two\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tVersion: \"test.bp.two.version\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tOptional: 
false,\n\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tLayerDiffID: \"sha256:test.nested.sha256\",\n\t\t\t\t\t\t\t\t\t\tHomepage:    \"http://geocities.com/top-bp\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\"test.bp.one\": {\n\t\t\t\t\t\t\t\t\t\"test.bp.one.version\": {\n\t\t\t\t\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tID: \"test.stack.id\",\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tLayerDiffID: \"sha256:test.bp.one.sha256\",\n\t\t\t\t\t\t\t\t\t\tHomepage:    \"http://geocities.com/cool-bp\",\n\t\t\t\t\t\t\t\t\t\tName:        \"one\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\"test.bp.two\": {\n\t\t\t\t\t\t\t\t\t\"test.bp.two.version\": {\n\t\t\t\t\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tID: \"test.stack.id\",\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tLayerDiffID: \"sha256:test.bp.two.sha256\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tLifecycle: builder.LifecycleDescriptor{\n\t\t\t\t\t\t\t\tInfo: builder.LifecycleInfo{\n\t\t\t\t\t\t\t\t\tVersion: builder.VersionMustParse(\"1.2.3\"),\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tAPI: builder.LifecycleAPI{\n\t\t\t\t\t\t\t\t\tBuildpackVersion: api.MustParse(\"0.1\"),\n\t\t\t\t\t\t\t\t\tPlatformVersion:  api.MustParse(\"2.3\"),\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tAPIs: builder.LifecycleAPIs{\n\t\t\t\t\t\t\t\t\tBuildpack: builder.APIVersions{\n\t\t\t\t\t\t\t\t\t\tDeprecated: builder.APISet{api.MustParse(\"0.1\")},\n\t\t\t\t\t\t\t\t\t\tSupported:  builder.APISet{api.MustParse(\"1.2\"), api.MustParse(\"1.3\")},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tPlatform: builder.APIVersions{\n\t\t\t\t\t\t\t\t\t\tDeprecated: builder.APISet{},\n\t\t\t\t\t\t\t\t\t\tSupported:  
builder.APISet{api.MustParse(\"2.3\"), api.MustParse(\"2.4\")},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tCreatedBy: builder.CreatorMetadata{\n\t\t\t\t\t\t\t\tName:    \"pack\",\n\t\t\t\t\t\t\t\tVersion: \"1.2.3\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tif diff := cmp.Diff(want, *builderInfo); diff != \"\" {\n\t\t\t\t\t\t\tt.Errorf(\"InspectBuilder() mismatch (-want +got):\\n%s\", diff)\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"order detection depth is higher than None\", func() {\n\t\t\t\t\t\tit(\"shows subgroup order as part of order\", func() {\n\t\t\t\t\t\t\tbuilderInfo, err := subject.InspectBuilder(\n\t\t\t\t\t\t\t\t\"some/builder\",\n\t\t\t\t\t\t\t\tuseDaemon,\n\t\t\t\t\t\t\t\tWithDetectionOrderDepth(pubbldr.OrderDetectionMaxDepth),\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\twant := pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"test.nested\", Version: \"test.nested.version\"},\n\t\t\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\tGroupDetectionOrder: pubbldr.DetectionOrder{\n\t\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tID:      \"test.bp.one\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tVersion: \"test.bp.one.version\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tID:      \"test.bp.two\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tVersion: 
\"test.bp.two.version\",\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tModuleRef: dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"test.bp.two\"},\n\t\t\t\t\t\t\t\t\t\t\t\tOptional:   true,\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\tif diff := cmp.Diff(want, builderInfo.Order); diff != \"\" {\n\t\t\t\t\t\t\t\tt.Errorf(\"\\\"InspectBuilder() mismatch (-want +got):\\b%s\", diff)\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\t// TODO add test case when builder is flattened\n\t\t\t})\n\t\t}\n\t})\n\n\twhen(\"the image does not exist\", func() {\n\t\tit.Before(func() {\n\t\t\tnotFoundImage := fakes.NewImage(\"\", \"\", nil)\n\t\t\tnotFoundImage.Delete()\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/builder\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(nil, errors.Wrap(image.ErrNotFound, \"some-error\"))\n\t\t})\n\n\t\tit(\"return nil metadata\", func() {\n\t\t\tmetadata, err := subject.InspectBuilder(\"some/builder\", true)\n\t\t\tassert.Nil(err)\n\t\t\tassert.Nil(metadata)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/inspect_buildpack.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"sort\"\n\n\tv1 \"github.com/opencontainers/image-spec/specs-go/v1\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype BuildpackInfo struct {\n\tBuildpackMetadata buildpack.Metadata\n\tBuildpacks        []dist.ModuleInfo\n\tOrder             dist.Order\n\tBuildpackLayers   dist.ModuleLayers\n\tLocation          buildpack.LocatorType\n}\n\ntype InspectBuildpackOptions struct {\n\tBuildpackName string\n\tDaemon        bool\n\tRegistry      string\n}\n\ntype ImgWrapper struct {\n\tv1.ImageConfig\n}\n\nfunc (iw ImgWrapper) Label(name string) (string, error) {\n\treturn iw.Labels[name], nil\n}\n\nfunc (c *Client) InspectBuildpack(opts InspectBuildpackOptions) (*BuildpackInfo, error) {\n\tlocatorType, err := buildpack.GetLocatorType(opts.BuildpackName, \"\", []dist.ModuleInfo{})\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tvar layersMd dist.ModuleLayers\n\tvar buildpackMd buildpack.Metadata\n\n\tswitch locatorType {\n\tcase buildpack.RegistryLocator:\n\t\tbuildpackMd, layersMd, err = metadataFromRegistry(c, opts.BuildpackName, opts.Registry)\n\tcase buildpack.PackageLocator:\n\t\tbuildpackMd, layersMd, err = metadataFromImage(c, opts.BuildpackName, opts.Daemon)\n\tcase buildpack.URILocator:\n\t\tbuildpackMd, layersMd, err = metadataFromArchive(c.downloader, opts.BuildpackName)\n\tdefault:\n\t\treturn nil, fmt.Errorf(\"unable to handle locator %q: for buildpack %q\", locatorType, opts.BuildpackName)\n\t}\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn &BuildpackInfo{\n\t\tBuildpackMetadata: buildpackMd,\n\t\tBuildpackLayers:   layersMd,\n\t\tOrder:             extractOrder(buildpackMd),\n\t\tBuildpacks:        extractBuildpacks(layersMd),\n\t\tLocation:          locatorType,\n\t}, nil\n}\n\nfunc metadataFromRegistry(client *Client, name, 
registry string) (buildpackMd buildpack.Metadata, layersMd dist.ModuleLayers, err error) {\n\tregistryCache, err := getRegistry(client.logger, registry)\n\tif err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, fmt.Errorf(\"invalid registry %s: %q\", registry, err)\n\t}\n\n\tregistryBp, err := registryCache.LocateBuildpack(name)\n\tif err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, fmt.Errorf(\"unable to find %s in registry: %q\", style.Symbol(name), err)\n\t}\n\tbuildpackMd, layersMd, err = metadataFromImage(client, registryBp.Address, false)\n\tif err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, fmt.Errorf(\"error pulling registry specified image: %s\", err)\n\t}\n\treturn buildpackMd, layersMd, nil\n}\n\nfunc metadataFromArchive(downloader BlobDownloader, path string) (buildpackMd buildpack.Metadata, layersMd dist.ModuleLayers, err error) {\n\timgBlob, err := downloader.Download(context.Background(), path)\n\tif err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, fmt.Errorf(\"unable to download archive: %q\", err)\n\t}\n\n\tconfig, err := buildpack.ConfigFromOCILayoutBlob(imgBlob)\n\tif err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, fmt.Errorf(\"unable to fetch config from buildpack blob: %q\", err)\n\t}\n\twrapper := ImgWrapper{config}\n\n\tif _, err := dist.GetLabel(wrapper, dist.BuildpackLayersLabel, &layersMd); err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, err\n\t}\n\n\tif _, err := dist.GetLabel(wrapper, buildpack.MetadataLabel, &buildpackMd); err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, err\n\t}\n\treturn buildpackMd, layersMd, nil\n}\n\nfunc metadataFromImage(client *Client, name string, daemon bool) (buildpackMd buildpack.Metadata, layersMd dist.ModuleLayers, err error) {\n\timageName := buildpack.ParsePackageLocator(name)\n\timg, err := client.imageFetcher.Fetch(context.Background(), imageName, 
image.FetchOptions{Daemon: daemon, PullPolicy: image.PullNever})\n\tif err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, err\n\t}\n\tif _, err := dist.GetLabel(img, dist.BuildpackLayersLabel, &layersMd); err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, fmt.Errorf(\"unable to get image label %s: %q\", dist.BuildpackLayersLabel, err)\n\t}\n\n\tif _, err := dist.GetLabel(img, buildpack.MetadataLabel, &buildpackMd); err != nil {\n\t\treturn buildpack.Metadata{}, dist.ModuleLayers{}, fmt.Errorf(\"unable to get image label %s: %q\", buildpack.MetadataLabel, err)\n\t}\n\treturn buildpackMd, layersMd, nil\n}\n\nfunc extractOrder(buildpackMd buildpack.Metadata) dist.Order {\n\treturn dist.Order{\n\t\t{\n\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t{\n\t\t\t\t\tModuleInfo: buildpackMd.ModuleInfo,\n\t\t\t\t},\n\t\t\t},\n\t\t},\n\t}\n}\n\nfunc extractBuildpacks(layersMd dist.ModuleLayers) []dist.ModuleInfo {\n\tresult := []dist.ModuleInfo{}\n\tbuildpackSet := map[*dist.ModuleInfo]bool{}\n\n\tfor buildpackID, buildpackMap := range layersMd {\n\t\tfor version, layerInfo := range buildpackMap {\n\t\t\tbp := dist.ModuleInfo{\n\t\t\t\tID:       buildpackID,\n\t\t\t\tName:     layerInfo.Name,\n\t\t\t\tVersion:  version,\n\t\t\t\tHomepage: layerInfo.Homepage,\n\t\t\t}\n\t\t\tbuildpackSet[&bp] = true\n\t\t}\n\t}\n\n\tfor currentBuildpack := range buildpackSet {\n\t\tresult = append(result, *currentBuildpack)\n\t}\n\n\tsort.Slice(result, func(i int, j int) bool {\n\t\tswitch {\n\t\tcase result[i].ID < result[j].ID:\n\t\t\treturn true\n\t\tcase result[i].ID == result[j].ID:\n\t\t\treturn result[i].Version < result[j].Version\n\t\tdefault:\n\t\t\treturn false\n\t\t}\n\t})\n\treturn result\n}\n"
  },
  {
    "path": "pkg/client/inspect_buildpack_test.go",
    "content": "package client_test\n\nimport (\n\t\"archive/tar\"\n\t\"bytes\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/v1/empty\"\n\t\"github.com/google/go-containerregistry/pkg/v1/layout\"\n\t\"github.com/google/go-containerregistry/pkg/v1/mutate\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tcfg \"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst buildpackageMetadataTag = `{\n  \"id\": \"some/top-buildpack\",\n  \"version\": \"0.0.1\",\n  \"name\": \"top\",\n  \"homepage\": \"top-buildpack-homepage\",\n  \"stacks\": [\n    {\n      \"id\": \"io.buildpacks.stacks.first-stack\"\n    },\n    {\n      \"id\": \"io.buildpacks.stacks.second-stack\"\n    }\n  ]\n}`\n\nconst buildpackLayersTag = `{\n   \"some/first-inner-buildpack\":{\n      \"1.0.0\":{\n         \"api\":\"0.2\",\n         \"order\":[\n            {\n               \"group\":[\n                  {\n                     \"id\":\"some/first-inner-buildpack\",\n                     \"version\":\"1.0.0\"\n                  },\n                  {\n                     \"id\":\"some/second-inner-buildpack\",\n                     \"version\":\"3.0.0\"\n                  }\n               ]\n            },\n            {\n               \"group\":[\n                  {\n                     
\"id\":\"some/second-inner-buildpack\",\n                     \"version\":\"3.0.0\"\n                  }\n               ]\n            }\n         ],\n         \"stacks\":[\n            {\n               \"id\":\"io.buildpacks.stacks.first-stack\"\n            },\n            {\n               \"id\":\"io.buildpacks.stacks.second-stack\"\n            }\n         ],\n         \"layerDiffID\":\"sha256:first-inner-buildpack-diff-id\",\n         \"homepage\":\"first-inner-buildpack-homepage\"\n      }\n   },\n   \"some/second-inner-buildpack\":{\n      \"2.0.0\":{\n         \"api\":\"0.2\",\n         \"stacks\":[\n            {\n               \"id\":\"io.buildpacks.stacks.first-stack\"\n            },\n            {\n               \"id\":\"io.buildpacks.stacks.second-stack\"\n            }\n         ],\n         \"layerDiffID\":\"sha256:second-inner-buildpack-diff-id\",\n         \"homepage\":\"second-inner-buildpack-homepage\"\n      },\n      \"3.0.0\":{\n         \"api\":\"0.2\",\n         \"stacks\":[\n            {\n               \"id\":\"io.buildpacks.stacks.first-stack\"\n            },\n            {\n               \"id\":\"io.buildpacks.stacks.second-stack\"\n            }\n         ],\n         \"layerDiffID\":\"sha256:third-inner-buildpack-diff-id\",\n         \"homepage\":\"third-inner-buildpack-homepage\"\n      }\n   },\n   \"some/top-buildpack\":{\n      \"0.0.1\":{\n         \"api\":\"0.2\",\n         \"order\":[\n            {\n               \"group\":[\n                  {\n                     \"id\":\"some/first-inner-buildpack\",\n                     \"version\":\"1.0.0\"\n                  },\n                  {\n                     \"id\":\"some/second-inner-buildpack\",\n                     \"version\":\"2.0.0\"\n                  }\n               ]\n            },\n            {\n               \"group\":[\n                  {\n                     \"id\":\"some/first-inner-buildpack\",\n                     \"version\":\"1.0.0\"\n   
               }\n               ]\n            }\n         ],\n         \"layerDiffID\":\"sha256:top-buildpack-diff-id\",\n         \"homepage\":\"top-buildpack-homepage\",\n\t\t \"name\": \"top\"\n      }\n   }\n}`\n\nfunc TestInspectBuildpack(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"InspectBuildpack\", testInspectBuildpack, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testInspectBuildpack(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject          *client.Client\n\t\tmockImageFetcher *testmocks.MockImageFetcher\n\t\tmockController   *gomock.Controller\n\t\tout              bytes.Buffer\n\t\tbuildpackImage   *fakes.Image\n\t\tapiVersion       *api.Version\n\t\texpectedInfo     *client.BuildpackInfo\n\t\tmockDownloader   *testmocks.MockBlobDownloader\n\n\t\ttmpDir        string\n\t\tbuildpackPath string\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\t\tmockDownloader = testmocks.NewMockBlobDownloader(mockController)\n\n\t\tsubject = &client.Client{}\n\t\tclient.WithLogger(logging.NewLogWithWriters(&out, &out))(subject)\n\t\tclient.WithFetcher(mockImageFetcher)(subject)\n\t\tclient.WithDownloader(mockDownloader)(subject)\n\n\t\tbuildpackImage = fakes.NewImage(\"some/buildpack\", \"\", nil)\n\t\th.AssertNil(t, buildpackImage.SetLabel(buildpack.MetadataLabel, buildpackageMetadataTag))\n\t\th.AssertNil(t, buildpackImage.SetLabel(dist.BuildpackLayersLabel, buildpackLayersTag))\n\n\t\tvar err error\n\t\tapiVersion, err = api.NewVersion(\"0.2\")\n\t\th.AssertNil(t, err)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"inspectBuildpack\")\n\t\th.AssertNil(t, err)\n\n\t\tbuildpackPath = filepath.Join(tmpDir, \"buildpackTarFile.tar\")\n\n\t\texpectedInfo = &client.BuildpackInfo{\n\t\t\tBuildpackMetadata: buildpack.Metadata{\n\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\tID:       
\"some/top-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tName:     \"top\",\n\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\"},\n\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\"},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpacks: []dist.ModuleInfo{\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/first-inner-buildpack\",\n\t\t\t\t\tVersion:  \"1.0.0\",\n\t\t\t\t\tHomepage: \"first-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/second-inner-buildpack\",\n\t\t\t\t\tVersion:  \"2.0.0\",\n\t\t\t\t\tHomepage: \"second-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/second-inner-buildpack\",\n\t\t\t\t\tVersion:  \"3.0.0\",\n\t\t\t\t\tHomepage: \"third-inner-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\tID:       \"some/top-buildpack\",\n\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\tName:     \"top\",\n\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t},\n\t\t\t},\n\t\t\tOrder: dist.Order{\n\t\t\t\t{\n\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\tID:       \"some/top-buildpack\",\n\t\t\t\t\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\t\t\t\t\tName:     \"top\",\n\t\t\t\t\t\t\t\tHomepage: \"top-buildpack-homepage\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t\tBuildpackLayers: dist.ModuleLayers{\n\t\t\t\t\"some/first-inner-buildpack\": {\n\t\t\t\t\t\"1.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\"},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      
\"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/second-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"3.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/second-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"3.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:first-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"first-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"some/second-inner-buildpack\": {\n\t\t\t\t\t\"2.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\"},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:second-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"second-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t\t\"3.0.0\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.first-stack\"},\n\t\t\t\t\t\t\t{ID: \"io.buildpacks.stacks.second-stack\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:third-inner-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"third-inner-buildpack-homepage\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t\t\"some/top-buildpack\": {\n\t\t\t\t\t\"0.0.1\": {\n\t\t\t\t\t\tAPI: apiVersion,\n\t\t\t\t\t\tOrder: dist.Order{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: 
[]dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/second-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"2.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\t\t\t\t\t\tID:      \"some/first-inner-buildpack\",\n\t\t\t\t\t\t\t\t\t\t\tVersion: \"1.0.0\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOptional: false,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tLayerDiffID: \"sha256:top-buildpack-diff-id\",\n\t\t\t\t\t\tHomepage:    \"top-buildpack-homepage\",\n\t\t\t\t\t\tName:        \"top\",\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t},\n\t\t}\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\terr := os.RemoveAll(tmpDir)\n\t\tif runtime.GOOS != \"windows\" {\n\t\t\th.AssertNil(t, err)\n\t\t}\n\t})\n\n\twhen(\"inspect-buildpack\", func() {\n\t\twhen(\"inspecting a registry buildpack\", func() {\n\t\t\tvar registryFixture string\n\t\t\tvar configPath string\n\t\t\tit.Before(func() {\n\t\t\t\texpectedInfo.Location = buildpack.RegistryLocator\n\n\t\t\t\tregistryFixture = h.CreateRegistryFixture(t, tmpDir, filepath.Join(\"testdata\", \"registry\"))\n\t\t\t\tpackHome := filepath.Join(tmpDir, \"packHome\")\n\t\t\t\th.AssertNil(t, os.Setenv(\"PACK_HOME\", packHome))\n\n\t\t\t\tconfigPath = filepath.Join(packHome, \"config.toml\")\n\t\t\t\th.AssertNil(t, cfg.Write(cfg.Config{\n\t\t\t\t\tRegistries: []cfg.Registry{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName: 
\"some-registry\",\n\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\tURL:  registryFixture,\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}, configPath))\n\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(\n\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\"example.com/some/package@sha256:8c27fe111c11b722081701dfed3bd55e039b9ce92865473cf4cdfa918071c566\",\n\t\t\t\t\timage.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(buildpackImage, nil)\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNil(t, os.Unsetenv(\"PACK_HOME\"))\n\t\t\t})\n\n\t\t\tit(\"succeeds\", func() {\n\t\t\t\tregistryBuildpack := \"urn:cnb:registry:example/java\"\n\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: registryBuildpack,\n\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t}\n\t\t\t\tinfo, err := subject.InspectBuildpack(inspectOptions)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, info, expectedInfo)\n\t\t\t})\n\n\t\t\t// TODO add test case when buildpack is flattened\n\t\t})\n\n\t\twhen(\"inspecting local buildpack archive\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\texpectedInfo.Location = buildpack.URILocator\n\n\t\t\t\tassert := h.NewAssertionManager(t)\n\t\t\t\twriteBuildpackArchive(buildpackPath, tmpDir, assert)\n\t\t\t})\n\n\t\t\tit(\"succeeds\", func() {\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), buildpackPath).Return(blob.NewBlob(buildpackPath), nil)\n\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: buildpackPath,\n\t\t\t\t\tDaemon:        false,\n\t\t\t\t}\n\t\t\t\tinfo, err := subject.InspectBuildpack(inspectOptions)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\th.AssertEq(t, info, expectedInfo)\n\t\t\t})\n\n\t\t\t// TODO add test case when buildpack is flattened\n\t\t})\n\n\t\twhen(\"inspecting an image\", func() {\n\t\t\tfor _, useDaemon := range []bool{true, false} {\n\t\t\t\tuseDaemon := useDaemon\n\t\t\t\twhen(fmt.Sprintf(\"daemon is %t\", useDaemon), func() {\n\t\t\t\t\tit.Before(func() 
{\n\t\t\t\t\t\texpectedInfo.Location = buildpack.PackageLocator\n\t\t\t\t\t\tif useDaemon {\n\t\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/buildpack\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(buildpackImage, nil)\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/buildpack\", image.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(buildpackImage, nil)\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\t\t\tBuildpackName: \"docker://some/buildpack\",\n\t\t\t\t\t\t\tDaemon:        useDaemon,\n\t\t\t\t\t\t}\n\t\t\t\t\t\tinfo, err := subject.InspectBuildpack(inspectOptions)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, info, expectedInfo)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t}\n\t\t})\n\t})\n\twhen(\"failure cases\", func() {\n\t\twhen(\"invalid buildpack name\", func() {\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\tinvalidBuildpackName := \"\"\n\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\tBuildpackName: invalidBuildpackName,\n\t\t\t\t}\n\t\t\t\t_, err := subject.InspectBuildpack(inspectOptions)\n\n\t\t\t\th.AssertError(t, err, \"unable to handle locator \")\n\t\t\t\th.AssertFalse(t, errors.Is(err, image.ErrNotFound))\n\t\t\t})\n\t\t})\n\t\twhen(\"buildpack image\", func() {\n\t\t\twhen(\"unable to fetch buildpack image\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"missing/buildpack\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(nil, errors.Wrapf(image.ErrNotFound, \"big bad error\"))\n\t\t\t\t})\n\t\t\t\tit(\"returns an ErrNotFound error\", func() {\n\t\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"docker://missing/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}\n\t\t\t\t\t_, err := 
subject.InspectBuildpack(inspectOptions)\n\t\t\t\t\th.AssertTrue(t, errors.Is(err, image.ErrNotFound))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"image does not have buildpackage metadata\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tfakeImage := fakes.NewImage(\"empty\", \"\", nil)\n\t\t\t\t\th.AssertNil(t, fakeImage.SetLabel(dist.BuildpackLayersLabel, \":::\"))\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"missing-metadata/buildpack\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(fakeImage, nil)\n\t\t\t\t})\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"docker://missing-metadata/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}\n\t\t\t\t\t_, err := subject.InspectBuildpack(inspectOptions)\n\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"unable to get image label %s\", dist.BuildpackLayersLabel))\n\t\t\t\t\th.AssertFalse(t, errors.Is(err, image.ErrNotFound))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t\twhen(\"buildpack archive\", func() {\n\t\t\twhen(\"archive is not a buildpack\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tinvalidBuildpackPath := filepath.Join(tmpDir, \"fake-buildpack-path\")\n\t\t\t\t\th.AssertNil(t, os.WriteFile(invalidBuildpackPath, []byte(\"not a buildpack\"), os.ModePerm))\n\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"https://invalid/buildpack\").Return(blob.NewBlob(invalidBuildpackPath), nil)\n\t\t\t\t})\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"https://invalid/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}\n\n\t\t\t\t\t_, err := subject.InspectBuildpack(inspectOptions)\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertFalse(t, errors.Is(err, image.ErrNotFound))\n\t\t\t\t\th.AssertError(t, err, \"unable to fetch config from buildpack blob:\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"unable to 
download buildpack archive\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"https://missing/buildpack\").Return(nil, errors.New(\"unable to download archive\"))\n\t\t\t\t})\n\t\t\t\tit(\"returns an untyped error\", func() {\n\t\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: \"https://missing/buildpack\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}\n\n\t\t\t\t\t_, err := subject.InspectBuildpack(inspectOptions)\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertFalse(t, errors.Is(err, image.ErrNotFound))\n\t\t\t\t\th.AssertError(t, err, \"unable to download archive\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpack on registry\", func() {\n\t\t\twhen(\"unable to get registry\", func() {\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tregistryBuildpack := \"urn:cnb:registry:example/foo\"\n\t\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: registryBuildpack,\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \":::\",\n\t\t\t\t\t}\n\n\t\t\t\t\t_, err := subject.InspectBuildpack(inspectOptions)\n\n\t\t\t\t\th.AssertError(t, err, \"invalid registry :::\")\n\t\t\t\t\th.AssertFalse(t, errors.Is(err, image.ErrNotFound))\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"buildpack is not on registry\", func() {\n\t\t\t\tvar registryFixture string\n\t\t\t\tvar configPath string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tregistryFixture = h.CreateRegistryFixture(t, tmpDir, filepath.Join(\"testdata\", \"registry\"))\n\t\t\t\t\tpackHome := filepath.Join(tmpDir, \"packHome\")\n\t\t\t\t\th.AssertNil(t, os.Setenv(\"PACK_HOME\", packHome))\n\t\t\t\t\tconfigPath = filepath.Join(packHome, \"config.toml\")\n\t\t\t\t\th.AssertNil(t, cfg.Write(cfg.Config{\n\t\t\t\t\t\tRegistries: []cfg.Registry{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"some-registry\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  
registryFixture,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}, configPath))\n\t\t\t\t})\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tregistryBuildpack := \"urn:cnb:registry:example/not-present\"\n\t\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: registryBuildpack,\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t\t}\n\n\t\t\t\t\t_, err := subject.InspectBuildpack(inspectOptions)\n\n\t\t\t\t\th.AssertError(t, err, \"unable to find 'urn:cnb:registry:example/not-present' in registry:\")\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"unable to fetch buildpack from registry\", func() {\n\t\t\t\tvar registryFixture string\n\t\t\t\tvar configPath string\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tregistryFixture = h.CreateRegistryFixture(t, tmpDir, filepath.Join(\"testdata\", \"registry\"))\n\t\t\t\t\tpackHome := filepath.Join(tmpDir, \"packHome\")\n\t\t\t\t\th.AssertNil(t, os.Setenv(\"PACK_HOME\", packHome))\n\n\t\t\t\t\tconfigPath = filepath.Join(packHome, \"config.toml\")\n\t\t\t\t\th.AssertNil(t, cfg.Write(cfg.Config{\n\t\t\t\t\t\tRegistries: []cfg.Registry{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"some-registry\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  registryFixture,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}, configPath))\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(\n\t\t\t\t\t\tgomock.Any(),\n\t\t\t\t\t\t\"example.com/some/package@sha256:2560f05307e8de9d830f144d09556e19dd1eb7d928aee900ed02208ae9727e7a\",\n\t\t\t\t\t\timage.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(nil, image.ErrNotFound)\n\t\t\t\t})\n\t\t\t\tit(\"returns an untyped error\", func() {\n\t\t\t\t\tregistryBuildpack := \"urn:cnb:registry:example/foo\"\n\t\t\t\t\tinspectOptions := client.InspectBuildpackOptions{\n\t\t\t\t\t\tBuildpackName: registryBuildpack,\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t\tRegistry:      \"some-registry\",\n\t\t\t\t\t}\n\n\t\t\t\t\t_, err := 
subject.InspectBuildpack(inspectOptions)\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertFalse(t, errors.Is(err, image.ErrNotFound))\n\t\t\t\t\th.AssertError(t, err, \"error pulling registry specified image\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\n// writeBuildpackArchive writes an OCI image layout as a tar archive using the GGCR library\nfunc writeBuildpackArchive(buildpackPath, tmpDir string, assert h.AssertionManager) {\n\tlayoutDir := filepath.Join(tmpDir, \"layout\")\n\timgIndex := empty.Index\n\timg := empty.Image\n\tc, err := img.ConfigFile()\n\tassert.Nil(err)\n\n\tc.Config.Labels = map[string]string{}\n\tc.Config.Labels[buildpack.MetadataLabel] = buildpackageMetadataTag\n\tc.Config.Labels[dist.BuildpackLayersLabel] = buildpackLayersTag\n\timg, err = mutate.Config(img, c.Config)\n\tassert.Nil(err)\n\n\tp, err := layout.Write(layoutDir, imgIndex)\n\tassert.Nil(err)\n\n\tassert.Nil(p.AppendImage(img))\n\n\tbuildpackWriter, err := os.Create(buildpackPath)\n\tassert.Nil(err)\n\tdefer buildpackWriter.Close()\n\n\ttw := tar.NewWriter(buildpackWriter)\n\tdefer tw.Close()\n\n\tassert.Nil(archive.WriteDirToTar(tw, layoutDir, \"/\", 0, 0, 0755, true, false, nil))\n}\n"
  },
  {
    "path": "pkg/client/inspect_extension.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype ExtensionInfo struct {\n\tExtension dist.ModuleInfo\n\tLocation  buildpack.LocatorType\n}\n\ntype InspectExtensionOptions struct {\n\tExtensionName string\n\tDaemon        bool\n}\n\nfunc (c *Client) InspectExtension(opts InspectExtensionOptions) (*ExtensionInfo, error) {\n\tlocatorType, err := buildpack.GetLocatorType(opts.ExtensionName, \"\", []dist.ModuleInfo{})\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tlayerMd, err := metadataOfExtensionFromImage(c, opts.ExtensionName, opts.Daemon)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif len(layerMd) != 1 {\n\t\treturn nil, fmt.Errorf(\"expected 1 extension, got %d\", len(layerMd))\n\t}\n\n\treturn &ExtensionInfo{\n\t\tExtension: extractExtension(layerMd),\n\t\tLocation:  locatorType,\n\t}, nil\n}\n\nfunc metadataOfExtensionFromImage(client *Client, name string, daemon bool) (layerMd dist.ModuleLayers, err error) {\n\timageName := buildpack.ParsePackageLocator(name)\n\timg, err := client.imageFetcher.Fetch(context.Background(), imageName, image.FetchOptions{Daemon: daemon, PullPolicy: image.PullNever})\n\tif err != nil {\n\t\treturn dist.ModuleLayers{}, err\n\t}\n\n\tif _, err := dist.GetLabel(img, dist.ExtensionLayersLabel, &layerMd); err != nil {\n\t\treturn dist.ModuleLayers{}, fmt.Errorf(\"unable to get image label %s: %q\", dist.ExtensionLayersLabel, err)\n\t}\n\n\treturn layerMd, nil\n}\n\nfunc extractExtension(layerMd dist.ModuleLayers) dist.ModuleInfo {\n\tresult := dist.ModuleInfo{}\n\tfor extensionID, extensionMap := range layerMd {\n\t\tfor version, layerInfo := range extensionMap {\n\t\t\tex := dist.ModuleInfo{\n\t\t\t\tID:       extensionID,\n\t\t\t\tName:     layerInfo.Name,\n\t\t\t\tVersion:  version,\n\t\t\t\tHomepage: 
layerInfo.Homepage,\n\t\t\t}\n\t\t\tresult = ex\n\t\t}\n\t}\n\treturn result\n}\n"
  },
  {
    "path": "pkg/client/inspect_extension_test.go",
    "content": "package client_test\n\nimport (\n\t\"bytes\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst extensionMetadataTag = `{\n  \"id\": \"some/top-extension\",\n  \"version\": \"0.0.1\",\n  \"name\": \"top\",\n  \"homepage\": \"top-extension-homepage\"\n}`\n\nconst extensionLayersTag = `{\n   \"some/top-extension\":{\n      \"0.0.1\":{\n         \"api\":\"0.2\",\n         \"homepage\":\"top-extension-homepage\",\n\t\t \"name\": \"top\"\n      }\n   }\n}`\n\nfunc TestInspectExtension(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"InspectExtension\", testInspectExtension, spec.Sequential(), spec.Report(report.Terminal{}))\n}\nfunc testInspectExtension(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject          *client.Client\n\t\tmockImageFetcher *testmocks.MockImageFetcher\n\t\tmockController   *gomock.Controller\n\t\tout              bytes.Buffer\n\t\textensionImage   *fakes.Image\n\t\texpectedInfo     *client.ExtensionInfo\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\n\t\tsubject = &client.Client{}\n\t\tclient.WithLogger(logging.NewLogWithWriters(&out, &out))(subject)\n\t\tclient.WithFetcher(mockImageFetcher)(subject)\n\n\t\textensionImage = fakes.NewImage(\"some/extension\", \"\", nil)\n\t\th.AssertNil(t, extensionImage.SetLabel(dist.ExtensionMetadataLabel, 
extensionMetadataTag))\n\t\th.AssertNil(t, extensionImage.SetLabel(dist.ExtensionLayersLabel, extensionLayersTag))\n\n\t\texpectedInfo = &client.ExtensionInfo{\n\t\t\tExtension: dist.ModuleInfo{\n\t\t\t\tID:       \"some/top-extension\",\n\t\t\t\tVersion:  \"0.0.1\",\n\t\t\t\tName:     \"top\",\n\t\t\t\tHomepage: \"top-extension-homepage\",\n\t\t\t},\n\t\t}\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"inspect-extension\", func() {\n\t\twhen(\"inspecting an image\", func() {\n\t\t\tfor _, useDaemon := range []bool{true, false} {\n\t\t\t\tuseDaemon := useDaemon\n\t\t\t\twhen(fmt.Sprintf(\"daemon is %t\", useDaemon), func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\texpectedInfo.Location = buildpack.PackageLocator\n\t\t\t\t\t\tif useDaemon {\n\t\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/extension\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(extensionImage, nil)\n\t\t\t\t\t\t} else {\n\t\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/extension\", image.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(extensionImage, nil)\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\t\tinspectOptions := client.InspectExtensionOptions{\n\t\t\t\t\t\t\tExtensionName: \"docker://some/extension\",\n\t\t\t\t\t\t\tDaemon:        useDaemon,\n\t\t\t\t\t\t}\n\t\t\t\t\t\tinfo, err := subject.InspectExtension(inspectOptions)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, info, expectedInfo)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t}\n\t\t})\n\t})\n\twhen(\"failure cases\", func() {\n\t\twhen(\"invalid extension name\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"\", image.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(nil, errors.Wrapf(image.ErrNotFound, \"unable to handle locator\"))\n\t\t\t})\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\tinvalidExtensionName := 
\"\"\n\t\t\t\tinspectOptions := client.InspectExtensionOptions{\n\t\t\t\t\tExtensionName: invalidExtensionName,\n\t\t\t\t}\n\t\t\t\t_, err := subject.InspectExtension(inspectOptions)\n\n\t\t\t\th.AssertError(t, err, \"unable to handle locator\")\n\t\t\t\th.AssertTrue(t, errors.Is(err, image.ErrNotFound))\n\t\t\t})\n\t\t})\n\t\twhen(\"extension image\", func() {\n\t\t\twhen(\"unable to fetch extension image\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"missing/extension\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(nil, errors.Wrapf(image.ErrNotFound, \"big bad error\"))\n\t\t\t\t})\n\t\t\t\tit(\"returns an ErrNotFound error\", func() {\n\t\t\t\t\tinspectOptions := client.InspectExtensionOptions{\n\t\t\t\t\t\tExtensionName: \"docker://missing/extension\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}\n\t\t\t\t\t_, err := subject.InspectExtension(inspectOptions)\n\t\t\t\t\th.AssertTrue(t, errors.Is(err, image.ErrNotFound))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"image does not have extension metadata\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tfakeImage := fakes.NewImage(\"empty\", \"\", nil)\n\t\t\t\t\th.AssertNil(t, fakeImage.SetLabel(dist.ExtensionLayersLabel, \":::\"))\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"missing-metadata/extension\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(fakeImage, nil)\n\t\t\t\t})\n\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\tinspectOptions := client.InspectExtensionOptions{\n\t\t\t\t\t\tExtensionName: \"docker://missing-metadata/extension\",\n\t\t\t\t\t\tDaemon:        true,\n\t\t\t\t\t}\n\t\t\t\t\t_, err := subject.InspectExtension(inspectOptions)\n\n\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"unable to get image label %s\", dist.ExtensionLayersLabel))\n\t\t\t\t\th.AssertFalse(t, errors.Is(err, image.ErrNotFound))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/inspect_image.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"strings\"\n\n\t\"github.com/Masterminds/semver\"\n\t\"github.com/buildpacks/lifecycle/buildpack\"\n\t\"github.com/buildpacks/lifecycle/launch\"\n\t\"github.com/buildpacks/lifecycle/platform\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\n// ImageInfo is a collection of metadata describing\n// an app image built using Cloud Native Buildpacks.\ntype ImageInfo struct {\n\t// Stack Identifier used when building this image\n\tStackID string\n\n\t// List of buildpacks that passed detection, ran their build\n\t// phases and made a contribution to this image.\n\tBuildpacks []buildpack.GroupElement\n\n\t// List of extensions that passed detection, ran their generate\n\t// phases and made a contribution to this image.\n\tExtensions []buildpack.GroupElement\n\n\t// Base includes two references to the run image,\n\t// - the Run Image ID,\n\t// - the hash of the last layer in the app image that belongs to the run image.\n\t// A way to visualize this is given an image with n layers:\n\t//\n\t// last layer in run image\n\t//          v\n\t// [1, ..., k, k+1, ..., n]\n\t//              ^\n\t//   first layer added by buildpacks\n\t//\n\t// the first 1 to k layers all belong to the run image,\n\t// the last k+1 to n layers are added by buildpacks.\n\t// the sum of all of these is our app image.\n\tBase files.RunImageForRebase\n\n\t// BOM or Bill of materials, contains dependency and\n\t// version information provided by each buildpack.\n\tBOM []buildpack.BOMEntry\n\n\t// Stack includes the run image name, and a list of image mirrors,\n\t// where the run image is hosted.\n\tStack files.Stack\n\n\t// Processes lists all processes contributed by buildpacks.\n\tProcesses ProcessDetails\n\n\t// If the image can be rebased\n\tRebasable bool\n}\n\n// ProcessDetails is a collection of all start 
command metadata\n// on an image.\ntype ProcessDetails struct {\n\t// An image's default start command.\n\tDefaultProcess *launch.Process\n\n\t// List of all start commands contributed by buildpacks.\n\tOtherProcesses []launch.Process\n}\n\n// Deserialize just the subset of fields we need to avoid breaking changes\ntype layersMetadata struct {\n\tRunImage files.RunImageForRebase `json:\"runImage\" toml:\"run-image\"`\n\tStack    files.Stack             `json:\"stack\" toml:\"stack\"`\n}\n\nconst (\n\tplatformAPIEnv            = \"CNB_PLATFORM_API\"\n\tcnbProcessEnv             = \"CNB_PROCESS_TYPE\"\n\tlauncherEntrypoint        = \"/cnb/lifecycle/launcher\"\n\twindowsLauncherEntrypoint = `c:\\cnb\\lifecycle\\launcher.exe`\n\tentrypointPrefix          = \"/cnb/process/\"\n\twindowsEntrypointPrefix   = `c:\\cnb\\process\\`\n\tdefaultProcess            = \"web\"\n\tfallbackPlatformAPI       = \"0.3\"\n\twindowsPrefix             = \"c:\"\n)\n\n// InspectImage reads the Label metadata of an image. 
It initializes an ImageInfo object\n// using this metadata, and returns it.\n// If daemon is true, the local Docker daemon will be searched for the image.\n// Otherwise it assumes the image is remote.\nfunc (c *Client) InspectImage(name string, daemon bool) (*ImageInfo, error) {\n\timg, err := c.imageFetcher.Fetch(context.Background(), name, image.FetchOptions{Daemon: daemon, PullPolicy: image.PullNever})\n\tif err != nil {\n\t\tif errors.Cause(err) == image.ErrNotFound {\n\t\t\treturn nil, nil\n\t\t}\n\t\treturn nil, err\n\t}\n\n\tvar layersMd layersMetadata\n\tif _, err := dist.GetLabel(img, platform.LifecycleMetadataLabel, &layersMd); err != nil {\n\t\treturn nil, err\n\t}\n\n\tvar buildMD files.BuildMetadata\n\tif _, err := dist.GetLabel(img, platform.BuildMetadataLabel, &buildMD); err != nil {\n\t\treturn nil, err\n\t}\n\n\tminimumBaseImageReferenceVersion := semver.MustParse(\"0.5.0\")\n\tactualLauncherVersion, err := semver.NewVersion(buildMD.Launcher.Version)\n\n\tif err == nil && actualLauncherVersion.LessThan(minimumBaseImageReferenceVersion) {\n\t\tlayersMd.RunImage.Reference = \"\"\n\t}\n\n\tstackID, err := img.Label(platform.StackIDLabel)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\trebasable, err := getRebasableLabel(img)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tplatformAPI, err := img.Env(platformAPIEnv)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"reading platform api\")\n\t}\n\n\tif platformAPI == \"\" {\n\t\tplatformAPI = fallbackPlatformAPI\n\t}\n\n\tplatformAPIVersion, err := semver.NewVersion(platformAPI)\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"parsing platform api version\")\n\t}\n\n\tvar defaultProcessType string\n\tif platformAPIVersion.LessThan(semver.MustParse(\"0.4\")) {\n\t\tdefaultProcessType, err = img.Env(cnbProcessEnv)\n\t\tif err != nil || defaultProcessType == \"\" {\n\t\t\tdefaultProcessType = defaultProcess\n\t\t}\n\t} else {\n\t\tentrypoint, err := img.Entrypoint()\n\t\tif err != nil 
{\n\t\t\treturn nil, errors.Wrap(err, \"reading entrypoint\")\n\t\t}\n\n\t\tif len(entrypoint) > 0 && entrypoint[0] != launcherEntrypoint && entrypoint[0] != windowsLauncherEntrypoint {\n\t\t\tprocess := entrypoint[0]\n\t\t\tif strings.HasPrefix(process, windowsPrefix) {\n\t\t\t\tprocess = strings.TrimPrefix(process, windowsEntrypointPrefix)\n\t\t\t\tprocess = strings.TrimSuffix(process, \".exe\") // Trim .exe for Windows support\n\t\t\t} else {\n\t\t\t\tprocess = strings.TrimPrefix(process, entrypointPrefix)\n\t\t\t}\n\n\t\t\tdefaultProcessType = process\n\t\t}\n\t}\n\n\tworkingDir, err := img.WorkingDir()\n\tif err != nil {\n\t\treturn nil, errors.Wrap(err, \"reading WorkingDir\")\n\t}\n\n\tvar processDetails ProcessDetails\n\tfor _, proc := range buildMD.Processes {\n\t\tproc := proc\n\t\tif proc.WorkingDirectory == \"\" {\n\t\t\tproc.WorkingDirectory = workingDir\n\t\t}\n\t\tif proc.Type == defaultProcessType {\n\t\t\tprocessDetails.DefaultProcess = &proc\n\t\t\tcontinue\n\t\t}\n\t\tprocessDetails.OtherProcesses = append(processDetails.OtherProcesses, proc)\n\t}\n\n\tvar stackCompat files.Stack\n\tif layersMd.RunImage.Image != \"\" {\n\t\tstackCompat = layersMd.RunImage.ToStack()\n\t} else {\n\t\tstackCompat = layersMd.Stack\n\t}\n\n\tif buildMD.Extensions != nil {\n\t\treturn &ImageInfo{\n\t\t\tStackID:    stackID,\n\t\t\tStack:      stackCompat,\n\t\t\tBase:       layersMd.RunImage,\n\t\t\tBOM:        buildMD.BOM,\n\t\t\tBuildpacks: buildMD.Buildpacks,\n\t\t\tExtensions: buildMD.Extensions,\n\t\t\tProcesses:  processDetails,\n\t\t\tRebasable:  rebasable,\n\t\t}, nil\n\t}\n\n\treturn &ImageInfo{\n\t\tStackID:    stackID,\n\t\tStack:      stackCompat,\n\t\tBase:       layersMd.RunImage,\n\t\tBOM:        buildMD.BOM,\n\t\tBuildpacks: buildMD.Buildpacks,\n\t\tProcesses:  processDetails,\n\t\tRebasable:  rebasable,\n\t}, nil\n}\n\nfunc getRebasableLabel(labeled dist.Labeled) (bool, error) {\n\tvar rebasableOutput bool\n\tisPresent, err := dist.GetLabel(labeled, 
platform.RebasableLabel, &rebasableOutput)\n\tif err != nil {\n\t\treturn false, err\n\t}\n\n\tif !isPresent {\n\t\trebasableOutput = true\n\t}\n\n\treturn rebasableOutput, nil\n}\n"
  },
  {
    "path": "pkg/client/inspect_image_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"encoding/json\"\n\t\"errors\"\n\t\"fmt\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/launch\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-cmp/cmp\"\n\t\"github.com/google/go-cmp/cmp/cmpopts\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestInspectImage(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"InspectImage\", testInspectImage, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\n// PlatformAPI should be ignored because it is not set in the metadata label\nvar ignorePlatformAPI = []cmp.Option{\n\tcmpopts.IgnoreFields(launch.Process{}, \"PlatformAPI\"),\n\tcmpopts.IgnoreFields(launch.RawCommand{}, \"PlatformAPI\"),\n}\n\nfunc testInspectImage(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject                        *Client\n\t\tmockImageFetcher               *testmocks.MockImageFetcher\n\t\tmockDockerClient               *testmocks.MockAPIClient\n\t\tmockController                 *gomock.Controller\n\t\tmockImage                      *testmocks.MockImage\n\t\tmockImageNoRebasable           *testmocks.MockImage\n\t\tmockImageRebasableWithoutLabel *testmocks.MockImage\n\t\tmockImageWithExtension         *testmocks.MockImage\n\t\tout                            bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\t\tmockDockerClient = testmocks.NewMockAPIClient(mockController)\n\n\t\tvar err error\n\t\tsubject, err = NewClient(WithLogger(logging.NewLogWithWriters(&out, 
&out)), WithFetcher(mockImageFetcher), WithDockerClient(mockDockerClient))\n\t\th.AssertNil(t, err)\n\n\t\tmockImage = testmocks.NewImage(\"some/image\", \"\", nil)\n\t\th.AssertNil(t, mockImage.SetWorkingDir(\"/test-workdir\"))\n\t\th.AssertNil(t, mockImage.SetLabel(\"io.buildpacks.stack.id\", \"test.stack.id\"))\n\t\th.AssertNil(t, mockImage.SetLabel(\"io.buildpacks.rebasable\", \"true\"))\n\t\th.AssertNil(t, mockImage.SetLabel(\n\t\t\t\"io.buildpacks.lifecycle.metadata\",\n\t\t\t`{\n  \"stack\": {\n    \"runImage\": {\n      \"image\": \"some-run-image\",\n      \"mirrors\": [\n        \"some-mirror\",\n        \"other-mirror\"\n      ]\n    }\n  },\n  \"runImage\": {\n    \"topLayer\": \"some-top-layer\",\n    \"reference\": \"some-run-image-reference\"\n  }\n}`,\n\t\t))\n\t\th.AssertNil(t, mockImage.SetLabel(\n\t\t\t\"io.buildpacks.build.metadata\",\n\t\t\t`{\n  \"bom\": [\n    {\n      \"name\": \"some-bom-element\"\n    }\n  ],\n  \"buildpacks\": [\n    {\n      \"id\": \"some-buildpack\",\n      \"version\": \"some-version\"\n    },\n    {\n      \"id\": \"other-buildpack\",\n      \"version\": \"other-version\"\n    }\n  ],\n  \"processes\": [\n    {\n      \"type\": \"other-process\",\n      \"command\": \"/other/process\",\n      \"args\": [\"opt\", \"1\"],\n      \"direct\": true\n    },\n    {\n      \"type\": \"web\",\n      \"command\": \"/start/web-process\",\n      \"args\": [\"-p\", \"1234\"],\n      \"direct\": false\n    }\n  ],\n  \"launcher\": {\n    \"version\": \"0.5.0\"\n  }\n}`,\n\t\t))\n\n\t\tmockImageNoRebasable = testmocks.NewImage(\"some/imageNoRebasable\", \"\", nil)\n\t\th.AssertNil(t, mockImageNoRebasable.SetWorkingDir(\"/test-workdir\"))\n\t\th.AssertNil(t, mockImageNoRebasable.SetLabel(\"io.buildpacks.stack.id\", \"test.stack.id\"))\n\t\th.AssertNil(t, mockImageNoRebasable.SetLabel(\"io.buildpacks.rebasable\", \"false\"))\n\t\th.AssertNil(t, mockImageNoRebasable.SetLabel(\n\t\t\t\"io.buildpacks.lifecycle.metadata\",\n\t\t\t`{\n  
\"stack\": {\n    \"runImage\": {\n      \"image\": \"some-run-image-no-rebasable\",\n      \"mirrors\": [\n        \"some-mirror\",\n        \"other-mirror\"\n      ]\n    }\n  },\n  \"runImage\": {\n    \"topLayer\": \"some-top-layer\",\n    \"reference\": \"some-run-image-reference\"\n  }\n}`,\n\t\t))\n\t\th.AssertNil(t, mockImageNoRebasable.SetLabel(\n\t\t\t\"io.buildpacks.build.metadata\",\n\t\t\t`{\n  \"bom\": [\n    {\n      \"name\": \"some-bom-element\"\n    }\n  ],\n  \"buildpacks\": [\n    {\n      \"id\": \"some-buildpack\",\n      \"version\": \"some-version\"\n    },\n    {\n      \"id\": \"other-buildpack\",\n      \"version\": \"other-version\"\n    }\n  ],\n  \"processes\": [\n    {\n      \"type\": \"other-process\",\n      \"command\": \"/other/process\",\n      \"args\": [\"opt\", \"1\"],\n      \"direct\": true\n    },\n    {\n      \"type\": \"web\",\n      \"command\": \"/start/web-process\",\n      \"args\": [\"-p\", \"1234\"],\n      \"direct\": false\n    }\n  ],\n  \"launcher\": {\n    \"version\": \"0.5.0\"\n  }\n}`,\n\t\t))\n\n\t\tmockImageRebasableWithoutLabel = testmocks.NewImage(\"some/imageRebasableWithoutLabel\", \"\", nil)\n\t\th.AssertNil(t, mockImageNoRebasable.SetWorkingDir(\"/test-workdir\"))\n\t\th.AssertNil(t, mockImageNoRebasable.SetLabel(\"io.buildpacks.stack.id\", \"test.stack.id\"))\n\t\th.AssertNil(t, mockImageNoRebasable.SetLabel(\n\t\t\t\"io.buildpacks.lifecycle.metadata\",\n\t\t\t`{\n  \"stack\": {\n    \"runImage\": {\n      \"image\": \"some-run-image-no-rebasable\",\n      \"mirrors\": [\n        \"some-mirror\",\n        \"other-mirror\"\n      ]\n    }\n  },\n  \"runImage\": {\n    \"topLayer\": \"some-top-layer\",\n    \"reference\": \"some-run-image-reference\"\n  }\n}`,\n\t\t))\n\t\th.AssertNil(t, mockImageNoRebasable.SetLabel(\n\t\t\t\"io.buildpacks.build.metadata\",\n\t\t\t`{\n  \"bom\": [\n    {\n      \"name\": \"some-bom-element\"\n    }\n  ],\n  \"buildpacks\": [\n    {\n      \"id\": 
\"some-buildpack\",\n      \"version\": \"some-version\"\n    },\n    {\n      \"id\": \"other-buildpack\",\n      \"version\": \"other-version\"\n    }\n  ],\n  \"processes\": [\n    {\n      \"type\": \"other-process\",\n      \"command\": \"/other/process\",\n      \"args\": [\"opt\", \"1\"],\n      \"direct\": true\n    },\n    {\n      \"type\": \"web\",\n      \"command\": \"/start/web-process\",\n      \"args\": [\"-p\", \"1234\"],\n      \"direct\": false\n    }\n  ],\n  \"launcher\": {\n    \"version\": \"0.5.0\"\n  }\n}`,\n\t\t))\n\n\t\tmockImageWithExtension = testmocks.NewImage(\"some/imageWithExtension\", \"\", nil)\n\t\th.AssertNil(t, mockImageWithExtension.SetWorkingDir(\"/test-workdir\"))\n\t\th.AssertNil(t, mockImageWithExtension.SetLabel(\"io.buildpacks.stack.id\", \"test.stack.id\"))\n\t\th.AssertNil(t, mockImageWithExtension.SetLabel(\"io.buildpacks.rebasable\", \"true\"))\n\t\th.AssertNil(t, mockImageWithExtension.SetLabel(\n\t\t\t\"io.buildpacks.lifecycle.metadata\",\n\t\t\t`{\n  \"stack\": {\n    \"runImage\": {\n      \"image\": \"some-run-image\",\n      \"mirrors\": [\n        \"some-mirror\",\n        \"other-mirror\"\n      ]\n    }\n  },\n  \"runImage\": {\n    \"topLayer\": \"some-top-layer\",\n    \"reference\": \"some-run-image-reference\"\n  }\n}`,\n\t\t))\n\t\th.AssertNil(t, mockImageWithExtension.SetLabel(\n\t\t\t\"io.buildpacks.build.metadata\",\n\t\t\t`{\n  \"bom\": [\n    {\n      \"name\": \"some-bom-element\"\n    }\n  ],\n  \"buildpacks\": [\n    {\n      \"id\": \"some-buildpack\",\n      \"version\": \"some-version\"\n    },\n    {\n      \"id\": \"other-buildpack\",\n      \"version\": \"other-version\"\n    }\n  ],\n    \"extensions\": [\n    {\n      \"id\": \"some-extension\",\n      \"version\": \"some-version\"\n    },\n    {\n      \"id\": \"other-extension\",\n      \"version\": \"other-version\"\n    }\n  ],\n  \"processes\": [\n    {\n      \"type\": \"other-process\",\n      \"command\": \"/other/process\",\n    
  \"args\": [\"opt\", \"1\"],\n      \"direct\": true\n    },\n    {\n      \"type\": \"web\",\n      \"command\": \"/start/web-process\",\n      \"args\": [\"-p\", \"1234\"],\n      \"direct\": false\n    }\n  ],\n  \"launcher\": {\n    \"version\": \"0.5.0\"\n  }\n}`,\n\t\t))\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"the image exists\", func() {\n\t\tfor _, useDaemon := range []bool{true, false} {\n\t\t\tuseDaemon := useDaemon\n\t\t\twhen(fmt.Sprintf(\"daemon is %t\", useDaemon), func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tif useDaemon {\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(mockImage, nil).AnyTimes()\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/imageNoRebasable\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(mockImageNoRebasable, nil).AnyTimes()\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/imageRebasableWithoutLabel\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(mockImageRebasableWithoutLabel, nil).AnyTimes()\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/imageWithExtension\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(mockImageWithExtension, nil).AnyTimes()\n\t\t\t\t\t} else {\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/image\", image.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(mockImage, nil).AnyTimes()\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/imageNoRebasable\", image.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(mockImageNoRebasable, nil).AnyTimes()\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/imageRebasableWithoutLabel\", image.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(mockImageRebasableWithoutLabel, 
nil).AnyTimes()\n\t\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"some/imageWithExtension\", image.FetchOptions{Daemon: false, PullPolicy: image.PullNever}).Return(mockImageWithExtension, nil).AnyTimes()\n\t\t\t\t\t}\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the stack ID\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, info.StackID, \"test.stack.id\")\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the stack ID with extension\", func() {\n\t\t\t\t\tinfoWithExtension, err := subject.InspectImage(\"some/imageWithExtension\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, infoWithExtension.StackID, \"test.stack.id\")\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the stack from runImage.Image if set\", func() {\n\t\t\t\t\th.AssertNil(t, mockImage.SetLabel(\n\t\t\t\t\t\t\"io.buildpacks.lifecycle.metadata\",\n\t\t\t\t\t\t`{\n  \"runImage\": {\n    \"topLayer\": \"some-top-layer\",\n    \"reference\": \"some-run-image-reference\",\n    \"image\":  \"is everything\"\n  }\n}`,\n\t\t\t\t\t))\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, info.Stack,\n\t\t\t\t\t\tfiles.Stack{RunImage: files.RunImageForExport{Image: \"is everything\"}})\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the stack\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, info.Stack,\n\t\t\t\t\t\tfiles.Stack{\n\t\t\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\t\t\tImage: \"some-run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\t\t\t\"some-mirror\",\n\t\t\t\t\t\t\t\t\t\"other-mirror\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the stack with extension\", func() {\n\t\t\t\t\tinfoWithExtension, err := subject.InspectImage(\"some/imageWithExtension\", 
useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Stack,\n\t\t\t\t\t\tfiles.Stack{\n\t\t\t\t\t\t\tRunImage: files.RunImageForExport{\n\t\t\t\t\t\t\t\tImage: \"some-run-image\",\n\t\t\t\t\t\t\t\tMirrors: []string{\n\t\t\t\t\t\t\t\t\t\"some-mirror\",\n\t\t\t\t\t\t\t\t\t\"other-mirror\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the base image\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, info.Base,\n\t\t\t\t\t\tfiles.RunImageForRebase{\n\t\t\t\t\t\t\tTopLayer:  \"some-top-layer\",\n\t\t\t\t\t\t\tReference: \"some-run-image-reference\",\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the base image with extension\", func() {\n\t\t\t\t\tinfoWithExtension, err := subject.InspectImage(\"some/imageWithExtension\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Base,\n\t\t\t\t\t\tfiles.RunImageForRebase{\n\t\t\t\t\t\t\tTopLayer:  \"some-top-layer\",\n\t\t\t\t\t\t\tReference: \"some-run-image-reference\",\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the rebasable image\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, info.Rebasable, true)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the rebasable image true if the label has not been set\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/imageRebasableWithoutLabel\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, info.Rebasable, true)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the non-rebasable image\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/imageNoRebasable\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, info.Rebasable, false)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the rebasable image with extension\", func() 
{\n\t\t\t\t\tinfoRebasableWithExtension, err := subject.InspectImage(\"some/imageWithExtension\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, infoRebasableWithExtension.Rebasable, true)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the BOM\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\trawBOM, err := json.Marshal(info.BOM)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertContains(t, string(rawBOM), `[{\"name\":\"some-bom-element\"`)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the BOM with extension\", func() {\n\t\t\t\t\tinfoWithExtension, err := subject.InspectImage(\"some/imageWithExtension\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\trawBOM, err := json.Marshal(infoWithExtension.BOM)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertContains(t, string(rawBOM), `[{\"name\":\"some-bom-element\"`)\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the buildpacks\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, len(info.Buildpacks), 2)\n\t\t\t\t\th.AssertEq(t, info.Buildpacks[0].ID, \"some-buildpack\")\n\t\t\t\t\th.AssertEq(t, info.Buildpacks[0].Version, \"some-version\")\n\t\t\t\t\th.AssertEq(t, info.Buildpacks[1].ID, \"other-buildpack\")\n\t\t\t\t\th.AssertEq(t, info.Buildpacks[1].Version, \"other-version\")\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the buildpacks with extension\", func() {\n\t\t\t\t\tinfoWithExtension, err := subject.InspectImage(\"some/imageWithExtension\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, len(infoWithExtension.Buildpacks), 2)\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Buildpacks[0].ID, \"some-buildpack\")\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Buildpacks[0].Version, \"some-version\")\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Buildpacks[1].ID, \"other-buildpack\")\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Buildpacks[1].Version, 
\"other-version\")\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the extensions\", func() {\n\t\t\t\t\tinfoWithExtension, err := subject.InspectImage(\"some/imageWithExtension\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, len(infoWithExtension.Extensions), 2)\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Extensions[0].ID, \"some-extension\")\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Extensions[0].Version, \"some-version\")\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Extensions[1].ID, \"other-extension\")\n\t\t\t\t\th.AssertEq(t, infoWithExtension.Extensions[1].Version, \"other-version\")\n\t\t\t\t})\n\n\t\t\t\tit(\"returns the processes setting the web process as default\", func() {\n\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t})\n\n\t\t\t\twhen(\"Platform API < 0.4\", func() {\n\t\t\t\t\twhen(\"CNB_PROCESS_TYPE is set\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.AssertNil(t, mockImage.SetEnv(\"CNB_PROCESS_TYPE\", \"other-process\"))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns 
processes setting the correct default process\", func() {\n\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"CNB_PROCESS_TYPE is set, but doesn't match an existing process\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.AssertNil(t, mockImage.SetEnv(\"CNB_PROCESS_TYPE\", \"missing-process\"))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns a nil default process\", func() {\n\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\tDefaultProcess: nil,\n\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             
[]string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns a nil default process when CNB_PROCESS_TYPE is not set and there is no web process\", func() {\n\t\t\t\t\t\th.AssertNil(t, mockImage.SetLabel(\n\t\t\t\t\t\t\t\"io.buildpacks.build.metadata\",\n\t\t\t\t\t\t\t`{\n  \"processes\": [\n    {\n      \"type\": \"other-process\",\n      \"command\": \"/other/process\",\n      \"args\": [\"opt\", \"1\"],\n      \"direct\": true\n    }\n  ]\n}`,\n\t\t\t\t\t\t))\n\n\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\tDefaultProcess: nil,\n\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"Platform API >= 0.4 and <= 0.8\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\th.AssertNil(t, mockImage.SetEnv(\"CNB_PLATFORM_API\", 
\"0.4\"))\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"CNB_PLATFORM_API set to bad value\", func() {\n\t\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, mockImage.SetEnv(\"CNB_PLATFORM_API\", \"not-semver\"))\n\t\t\t\t\t\t\t_, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertError(t, err, \"parsing platform api version\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"Can't inspect Image entrypoint\", func() {\n\t\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\t\tmockImage.EntrypointCall.Returns.Error = errors.New(\"some-error\")\n\n\t\t\t\t\t\t\t_, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertError(t, err, \"reading entrypoint\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"ENTRYPOINT is empty\", func() {\n\t\t\t\t\t\tit(\"sets nil default process\", func() {\n\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\tDefaultProcess: nil,\n\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: 
\"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"CNB_PROCESS_TYPE is set\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\th.AssertNil(t, mockImage.SetEnv(\"CNB_PROCESS_TYPE\", \"other-process\"))\n\n\t\t\t\t\t\t\tmockImage.EntrypointCall.Returns.StringArr = []string{\"/cnb/process/web\"}\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"ignores it and sets the correct default process\", func() {\n\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"ENTRYPOINT is set, but doesn't match an existing process\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tmockImage.EntrypointCall.Returns.StringArr = []string{\"/cnb/process/unknown-process\"}\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns nil default default process\", func() {\n\t\t\t\t\t\t\tinfo, err := 
subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\tDefaultProcess: nil,\n\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"ENTRYPOINT set to /cnb/lifecycle/launcher\", func() {\n\t\t\t\t\t\tit(\"returns a nil default process\", func() {\n\t\t\t\t\t\t\tmockImage.EntrypointCall.Returns.StringArr = []string{\"/cnb/lifecycle/launcher\"}\n\n\t\t\t\t\t\t\th.AssertNil(t, mockImage.SetLabel(\n\t\t\t\t\t\t\t\t\"io.buildpacks.build.metadata\",\n\t\t\t\t\t\t\t\t`{\n\t\t\t\t\t \"processes\": [\n\t\t\t\t\t   {\n\t\t\t\t\t     \"type\": \"other-process\",\n\t\t\t\t\t     \"command\": \"/other/process\",\n\t\t\t\t\t     \"args\": [\"opt\", \"1\"],\n\t\t\t\t\t     \"direct\": true\n\t\t\t\t\t   }\n\t\t\t\t\t ]\n\t\t\t\t\t}`,\n\t\t\t\t\t\t\t))\n\n\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\tDefaultProcess: 
nil,\n\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"Inspecting Windows images\", func() {\n\t\t\t\t\t\twhen(`ENTRYPOINT set to c:\\cnb\\lifecycle\\launcher.exe`, func() {\n\t\t\t\t\t\t\tit(\"sets default process to nil\", func() {\n\t\t\t\t\t\t\t\tmockImage.EntrypointCall.Returns.StringArr = []string{`c:\\cnb\\lifecycle\\launcher.exe`}\n\n\t\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\t\tDefaultProcess: nil,\n\t\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: 
\"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"ENTRYPOINT is set, but doesn't match an existing process\", func() {\n\t\t\t\t\t\t\tit(\"sets default process to nil\", func() {\n\t\t\t\t\t\t\t\tmockImage.EntrypointCall.Returns.StringArr = []string{`c:\\cnb\\process\\unknown-process.exe`}\n\n\t\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\t\tDefaultProcess: nil,\n\t\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"ENTRYPOINT is set, and matches an existing process\", func() {\n\t\t\t\t\t\t\tit(\"sets default process to defined process\", func() {\n\t\t\t\t\t\t\t\tmockImage.EntrypointCall.Returns.StringArr = []string{`c:\\cnb\\process\\other-process.exe`}\n\n\t\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", 
useDaemon)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"Platform API > 0.8\", func() {\n\t\t\t\t\twhen(\"working-dir is set\", func() {\n\t\t\t\t\t\tit(\"returns process with working directory if available\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, mockImage.SetLabel(\n\t\t\t\t\t\t\t\t\"io.buildpacks.build.metadata\",\n\t\t\t\t\t\t\t\t`{\n\t\t\t\t\t \"processes\": [\n\t\t\t\t\t   {\n\t\t\t\t\t     \"type\": \"other-process\",\n\t\t\t\t\t     \"command\": \"/other/process\",\n\t\t\t\t\t     \"args\": [\"opt\", \"1\"],\n\t\t\t\t\t     \"direct\": true,\n\t\t\t\t\t\t \"working-dir\": \"/other-workdir\"\n\t\t\t\t\t   }\n\t\t\t\t\t ]\n\t\t\t\t\t}`,\n\t\t\t\t\t\t\t))\n\n\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, 
info.Processes,\n\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\tDefaultProcess: nil,\n\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/other-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"working-dir is not set\", func() {\n\t\t\t\t\t\tit(\"returns process with working directory from image\", func() {\n\t\t\t\t\t\t\tinfo, err := subject.InspectImage(\"some/image\", useDaemon)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, info.Processes,\n\t\t\t\t\t\t\t\tProcessDetails{\n\t\t\t\t\t\t\t\t\tDefaultProcess: &launch.Process{\n\t\t\t\t\t\t\t\t\t\tType:             \"web\",\n\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/start/web-process\"}},\n\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"-p\", \"1234\"},\n\t\t\t\t\t\t\t\t\t\tDirect:           false,\n\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tOtherProcesses: []launch.Process{\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\tType:             \"other-process\",\n\t\t\t\t\t\t\t\t\t\t\tCommand:          launch.RawCommand{Entries: []string{\"/other/process\"}},\n\t\t\t\t\t\t\t\t\t\t\tArgs:             []string{\"opt\", \"1\"},\n\t\t\t\t\t\t\t\t\t\t\tDirect:           true,\n\t\t\t\t\t\t\t\t\t\t\tWorkingDirectory: \"/test-workdir\",\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tignorePlatformAPI...)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t}\n\t})\n\n\twhen(\"the image doesn't exist\", func() {\n\t\tit(\"returns 
nil\", func() {\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"not/some-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(nil, image.ErrNotFound)\n\n\t\t\tinfo, err := subject.InspectImage(\"not/some-image\", true)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertNil(t, info)\n\t\t})\n\t})\n\n\twhen(\"there is an error fetching the image\", func() {\n\t\tit(\"returns the error\", func() {\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"not/some-image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(nil, errors.New(\"some-error\"))\n\n\t\t\t_, err := subject.InspectImage(\"not/some-image\", true)\n\t\t\th.AssertError(t, err, \"some-error\")\n\t\t})\n\t})\n\n\twhen(\"the image is missing labels\", func() {\n\t\tit(\"returns empty data\", func() {\n\t\t\tmockImageFetcher.EXPECT().\n\t\t\t\tFetch(gomock.Any(), \"missing/labels\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).\n\t\t\t\tReturn(fakes.NewImage(\"missing/labels\", \"\", nil), nil)\n\t\t\tinfo, err := subject.InspectImage(\"missing/labels\", true)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, info, &ImageInfo{Rebasable: true}, ignorePlatformAPI...)\n\t\t})\n\t})\n\n\twhen(\"the image has malformed labels\", func() {\n\t\tvar badImage *fakes.Image\n\n\t\tit.Before(func() {\n\t\t\tbadImage = fakes.NewImage(\"bad/image\", \"\", nil)\n\t\t\tmockImageFetcher.EXPECT().\n\t\t\t\tFetch(gomock.Any(), \"bad/image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).\n\t\t\t\tReturn(badImage, nil)\n\t\t})\n\n\t\tit(\"returns an error when the layers metadata cannot be parsed\", func() {\n\t\t\th.AssertNil(t, badImage.SetLabel(\"io.buildpacks.lifecycle.metadata\", \"not   ----  json\"))\n\t\t\t_, err := subject.InspectImage(\"bad/image\", true)\n\t\t\th.AssertError(t, err, \"unmarshalling label 'io.buildpacks.lifecycle.metadata'\")\n\t\t})\n\n\t\tit(\"returns an error when the build metadata cannot be parsed\", func() {\n\t\t\th.AssertNil(t, 
badImage.SetLabel(\"io.buildpacks.build.metadata\", \"not   ----  json\"))\n\t\t\t_, err := subject.InspectImage(\"bad/image\", true)\n\t\t\th.AssertError(t, err, \"unmarshalling label 'io.buildpacks.build.metadata'\")\n\t\t})\n\t})\n\n\twhen(\"lifecycle version is 0.4.x or earlier\", func() {\n\t\tit(\"includes an empty base image reference\", func() {\n\t\t\toldImage := fakes.NewImage(\"old/image\", \"\", nil)\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), \"old/image\", image.FetchOptions{Daemon: true, PullPolicy: image.PullNever}).Return(oldImage, nil)\n\n\t\t\th.AssertNil(t, oldImage.SetLabel(\n\t\t\t\t\"io.buildpacks.lifecycle.metadata\",\n\t\t\t\t`{\n  \"runImage\": {\n    \"topLayer\": \"some-top-layer\",\n    \"reference\": \"some-run-image-reference\"\n  }\n}`,\n\t\t\t))\n\t\t\th.AssertNil(t, oldImage.SetLabel(\n\t\t\t\t\"io.buildpacks.build.metadata\",\n\t\t\t\t`{\n  \"launcher\": {\n    \"version\": \"0.4.0\"\n  }\n}`,\n\t\t\t))\n\n\t\t\tinfo, err := subject.InspectImage(\"old/image\", true)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, info.Base,\n\t\t\t\tfiles.RunImageForRebase{\n\t\t\t\t\tTopLayer:  \"some-top-layer\",\n\t\t\t\t\tReference: \"\",\n\t\t\t\t},\n\t\t\t)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/manifest_add.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype ManifestAddOptions struct {\n\t// Image index we want to update\n\tIndexRepoName string\n\n\t// Name of image we wish to add into the image index\n\tRepoName string\n}\n\n// AddManifest implements commands.PackClient.\nfunc (c *Client) AddManifest(ctx context.Context, opts ManifestAddOptions) (err error) {\n\tidx, err := c.indexFactory.LoadIndex(opts.IndexRepoName)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif err = c.addManifestToIndex(ctx, opts.RepoName, idx); err != nil {\n\t\treturn err\n\t}\n\n\tif err = idx.SaveDir(); err != nil {\n\t\treturn fmt.Errorf(\"failed to save manifest list %s to local storage: %w\", style.Symbol(opts.IndexRepoName), err)\n\t}\n\n\tc.logger.Infof(\"Successfully added image %s to index\", style.Symbol(opts.RepoName))\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/client/manifest_add_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"errors\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestAddManifest(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"build\", testAddManifest, spec.Report(report.Terminal{}))\n}\n\nfunc testAddManifest(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController   *gomock.Controller\n\t\tmockIndexFactory *testmocks.MockIndexFactory\n\t\tfakeImageFetcher *ifakes.FakeImageFetcher\n\t\tout              bytes.Buffer\n\t\tlogger           logging.Logger\n\t\tsubject          *Client\n\t\terr              error\n\t\ttmpDir           string\n\t)\n\n\tit.Before(func() {\n\t\tfakeImageFetcher = ifakes.NewFakeImageFetcher()\n\t\tlogger = logging.NewLogWithWriters(&out, &out, logging.WithVerbose())\n\t\tmockController = gomock.NewController(t)\n\t\tmockIndexFactory = testmocks.NewMockIndexFactory(mockController)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"add-manifest-test\")\n\t\th.AssertNil(t, err)\n\t\tos.Setenv(\"XDG_RUNTIME_DIR\", tmpDir)\n\n\t\tsubject, err = NewClient(\n\t\t\tWithLogger(logger),\n\t\t\tWithFetcher(fakeImageFetcher),\n\t\t\tWithIndexFactory(mockIndexFactory),\n\t\t\tWithExperimental(true),\n\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t)\n\t\th.AssertSameInstance(t, mockIndexFactory, subject.indexFactory)\n\t\th.AssertNil(t, err)\n\n\t\t// Create a remote image to be fetched when adding to the image index\n\t\tfakeImage := h.NewFakeWithRandomUnderlyingV1Image(t, \"pack/image\", 
nil)\n\t\tfakeImageFetcher.RemoteImages[\"index.docker.io/pack/image:latest\"] = fakeImage\n\t})\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\tos.RemoveAll(tmpDir)\n\t})\n\n\twhen(\"#AddManifest\", func() {\n\t\twhen(\"index doesn't exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Any(), gomock.Any()).Return(nil, errors.New(\"index not found locally\"))\n\t\t\t})\n\n\t\t\tit(\"should return an error\", func() {\n\t\t\t\terr = subject.AddManifest(\n\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\tManifestAddOptions{\n\t\t\t\t\t\tIndexRepoName: \"pack/none-existent-index\",\n\t\t\t\t\t\tRepoName:      \"pack/image\",\n\t\t\t\t\t},\n\t\t\t\t)\n\t\t\t\th.AssertError(t, err, \"index not found locally\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\tvar (\n\t\t\t\tindexPath     string\n\t\t\t\tindexRepoName string\n\t\t\t)\n\n\t\t\twhen(\"no errors on save\", func() {\n\t\t\t\twhen(\"valid manifest is provided\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\tindexPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\t\t\t\t\t\t// Initialize the Index with 2 image manifest\n\t\t\t\t\t\tidx := h.RandomCNBIndex(t, indexRepoName, 1, 2)\n\t\t\t\t\t\th.AssertNil(t, idx.SaveDir())\n\t\t\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"adds the given image\", func() {\n\t\t\t\t\t\terr = subject.AddManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tManifestAddOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      \"pack/image\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertContains(t, out.String(), \"Successfully added image 'pack/image' to index\")\n\n\t\t\t\t\t\t// We expect one more manifest to be added\n\t\t\t\t\t\tindex := h.ReadIndexManifest(t, 
indexPath)\n\t\t\t\t\t\th.AssertEq(t, len(index.Manifests), 3)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"invalid manifest reference name is used\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\tindexPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\t\t\t\t\t\t// Initialize the index with 2 image manifests\n\t\t\t\t\t\tidx := h.RandomCNBIndex(t, indexRepoName, 1, 2)\n\t\t\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\terr = subject.AddManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tManifestAddOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      \"pack@@image\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t\th.AssertError(t, err, \"is not a valid manifest reference\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"manifest reference doesn't exist in the registry\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\tindexPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\t\t\t\t\t\t// Initialize the index with 2 image manifests\n\t\t\t\t\t\tidx := h.RandomCNBIndex(t, indexRepoName, 1, 2)\n\t\t\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\terr = subject.AddManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tManifestAddOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      \"pack/image-not-found\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t\th.AssertError(t, err, \"does not exist in registry\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"errors on save\", func() {\n\t\t\t\tit.Before(func() 
{\n\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\tcnbIdx := h.NewMockImageIndex(t, indexRepoName, 1, 2)\n\t\t\t\t\tcnbIdx.ErrorOnSave = true\n\t\t\t\t\tmockIndexFactory.\n\t\t\t\t\t\tEXPECT().\n\t\t\t\t\t\tLoadIndex(gomock.Eq(indexRepoName), gomock.Any()).\n\t\t\t\t\t\tReturn(cnbIdx, nil).\n\t\t\t\t\t\tAnyTimes()\n\t\t\t\t})\n\n\t\t\t\tit(\"errors when the manifest list couldn't be saved locally\", func() {\n\t\t\t\t\terr = subject.AddManifest(\n\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\tManifestAddOptions{\n\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\tRepoName:      \"pack/image\",\n\t\t\t\t\t\t},\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\th.AssertError(t, err, \"failed to save manifest list\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/manifest_annotate.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\ntype ManifestAnnotateOptions struct {\n\t// Image index we want to update\n\tIndexRepoName string\n\n\t// Name of image within the index that we wish to update\n\tRepoName string\n\n\t// 'os' of the image we wish to update in the image index\n\tOS string\n\n\t// 'architecture' of the image we wish to update in the image index\n\tOSArch string\n\n\t// 'os variant' of the image we wish to update in the image index\n\tOSVariant string\n\n\t// 'annotations' of the image we wish to update in the image index\n\tAnnotations map[string]string\n}\n\n// AnnotateManifest implements commands.PackClient.\nfunc (c *Client) AnnotateManifest(ctx context.Context, opts ManifestAnnotateOptions) error {\n\tidx, err := c.indexFactory.LoadIndex(opts.IndexRepoName)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\timageRef, err := name.ParseReference(opts.RepoName, name.WeakValidation)\n\tif err != nil {\n\t\treturn fmt.Errorf(\"'%s' is not a valid image reference: %s\", opts.RepoName, err)\n\t}\n\n\timageToAnnotate, err := c.imageFetcher.Fetch(ctx, imageRef.Name(), image.FetchOptions{Daemon: false})\n\tif err != nil {\n\t\treturn err\n\t}\n\n\thash, err := imageToAnnotate.Identifier()\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tdigest, err := name.NewDigest(hash.String())\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif opts.OS != \"\" {\n\t\tif err = idx.SetOS(digest, opts.OS); err != nil {\n\t\t\treturn fmt.Errorf(\"failed to set the 'os' for %s: %w\", style.Symbol(opts.RepoName), err)\n\t\t}\n\t}\n\tif opts.OSArch != \"\" {\n\t\tif err = idx.SetArchitecture(digest, opts.OSArch); err != nil {\n\t\t\treturn fmt.Errorf(\"failed to set the 'arch' for %s: %w\", style.Symbol(opts.RepoName), err)\n\t\t}\n\t}\n\tif opts.OSVariant != \"\" {\n\t\tif err = 
idx.SetVariant(digest, opts.OSVariant); err != nil {\n\t\t\treturn fmt.Errorf(\"failed to set the 'os variant' for %s: %w\", style.Symbol(opts.RepoName), err)\n\t\t}\n\t}\n\tif len(opts.Annotations) != 0 {\n\t\tif err = idx.SetAnnotations(digest, opts.Annotations); err != nil {\n\t\t\treturn fmt.Errorf(\"failed to set the 'annotations' for %s: %w\", style.Symbol(opts.RepoName), err)\n\t\t}\n\t}\n\n\tif err = idx.SaveDir(); err != nil {\n\t\treturn fmt.Errorf(\"failed to save manifest list %s to local storage: %w\", style.Symbol(opts.IndexRepoName), err)\n\t}\n\n\tc.logger.Infof(\"Successfully annotated image %s in index %s\", style.Symbol(opts.RepoName), style.Symbol(opts.IndexRepoName))\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/client/manifest_annotate_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst invalidDigest = \"sha256:d4707523ce6e12afdbe9a3be5ad69027150a834870ca0933baf7516dd1fe0f56\"\n\nfunc TestAnnotateManifest(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"build\", testAnnotateManifest, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testAnnotateManifest(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController   *gomock.Controller\n\t\tmockIndexFactory *testmocks.MockIndexFactory\n\t\tfakeImageFetcher *ifakes.FakeImageFetcher\n\t\tout              bytes.Buffer\n\t\tlogger           logging.Logger\n\t\tsubject          *Client\n\t\terr              error\n\t\ttmpDir           string\n\t)\n\n\tit.Before(func() {\n\t\tfakeImageFetcher = ifakes.NewFakeImageFetcher()\n\t\tlogger = logging.NewLogWithWriters(&out, &out, logging.WithVerbose())\n\t\tmockController = gomock.NewController(t)\n\t\tmockIndexFactory = testmocks.NewMockIndexFactory(mockController)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"annotate-manifest-test\")\n\t\th.AssertNil(t, err)\n\t\tos.Setenv(\"XDG_RUNTIME_DIR\", tmpDir)\n\n\t\tsubject, err = NewClient(\n\t\t\tWithLogger(logger),\n\t\t\tWithFetcher(fakeImageFetcher),\n\t\t\tWithIndexFactory(mockIndexFactory),\n\t\t\tWithExperimental(true),\n\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t)\n\t\th.AssertSameInstance(t, mockIndexFactory, 
subject.indexFactory)\n\t\th.AssertNil(t, err)\n\t})\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\tos.RemoveAll(tmpDir)\n\t})\n\n\twhen(\"#AnnotateManifest\", func() {\n\t\tvar (\n\t\t\tdigest        name.Digest\n\t\t\tidx           imgutil.ImageIndex\n\t\t\tindexRepoName string\n\t\t)\n\t\twhen(\"index doesn't exist\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Any(), gomock.Any()).Return(nil, errors.New(\"index not found locally\"))\n\t\t\t})\n\n\t\t\tit(\"should return an error\", func() {\n\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\tRepoName:      \"pack/image\",\n\t\t\t\t\t},\n\t\t\t\t)\n\t\t\t\th.AssertEq(t, err.Error(), \"index not found locally\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\twhen(\"no errors on save\", func() {\n\t\t\t\twhen(\"OS is given\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\tidx, digest = h.RandomCNBIndexAndDigest(t, indexRepoName, 1, 2)\n\t\t\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t\t\t\tfakeImage := h.NewFakeWithRandomUnderlyingV1Image(t, \"pack/image\", digest)\n\t\t\t\t\t\tfakeImageFetcher.RemoteImages[digest.Name()] = fakeImage\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should set OS for given image\", func() {\n\t\t\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      digest.Name(),\n\t\t\t\t\t\t\t\tOS:            \"some-os\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tos, err := idx.OS(digest)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, os, 
\"some-os\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"Arch is given\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\tidx, digest = h.RandomCNBIndexAndDigest(t, indexRepoName, 1, 2)\n\t\t\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t\t\t\tfakeImage := h.NewFakeWithRandomUnderlyingV1Image(t, \"pack/image\", digest)\n\t\t\t\t\t\tfakeImageFetcher.RemoteImages[digest.Name()] = fakeImage\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should set Arch for given image\", func() {\n\t\t\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      digest.Name(),\n\t\t\t\t\t\t\t\tOSArch:        \"some-arch\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tarch, err := idx.Architecture(digest)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, arch, \"some-arch\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"OS Variant is given\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\tidx, digest = h.RandomCNBIndexAndDigest(t, indexRepoName, 1, 2)\n\t\t\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t\t\t\tfakeImage := h.NewFakeWithRandomUnderlyingV1Image(t, \"pack/image\", digest)\n\t\t\t\t\t\tfakeImageFetcher.RemoteImages[digest.Name()] = fakeImage\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should set Variant for given image\", func() {\n\t\t\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      digest.Name(),\n\t\t\t\t\t\t\t\tOSVariant:     \"some-variant\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tvariant, err := 
idx.Variant(digest)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, variant, \"some-variant\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t\twhen(\"Annotations are given\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\tidx, digest = h.RandomCNBIndexAndDigest(t, indexRepoName, 1, 2)\n\t\t\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t\t\t\tfakeImage := h.NewFakeWithRandomUnderlyingV1Image(t, \"pack/image\", digest)\n\t\t\t\t\t\tfakeImageFetcher.RemoteImages[digest.Name()] = fakeImage\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should set Annotations for given image\", func() {\n\t\t\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      digest.Name(),\n\t\t\t\t\t\t\t\tAnnotations:   map[string]string{\"some-key\": \"some-value\"},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tannos, err := idx.Annotations(digest)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, annos, map[string]string{\"some-key\": \"some-value\"})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"should save the annotated index\", func() {\n\t\t\t\t\t\tvar (\n\t\t\t\t\t\t\tfakeOS          = \"some-os\"\n\t\t\t\t\t\t\tfakeArch        = \"some-arch\"\n\t\t\t\t\t\t\tfakeVariant     = \"some-variant\"\n\t\t\t\t\t\t\tfakeAnnotations = map[string]string{\"some-key\": \"some-value\"}\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tRepoName:      digest.Name(),\n\t\t\t\t\t\t\t\tOS:            fakeOS,\n\t\t\t\t\t\t\t\tOSArch:        fakeArch,\n\t\t\t\t\t\t\t\tOSVariant:     fakeVariant,\n\t\t\t\t\t\t\t\tAnnotations:   
fakeAnnotations,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\terr = idx.SaveDir()\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tos, err := idx.OS(digest)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, os, fakeOS)\n\n\t\t\t\t\t\tarch, err := idx.Architecture(digest)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, arch, fakeArch)\n\n\t\t\t\t\t\tvariant, err := idx.Variant(digest)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, variant, fakeVariant)\n\n\t\t\t\t\t\tannos, err := idx.Annotations(digest)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, annos, fakeAnnotations)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"image does not exist with given digest\", func() {\n\t\t\tvar nonExistentDigest string\n\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\tidx = h.RandomCNBIndex(t, indexRepoName, 1, 2)\n\t\t\t\tnonExistentDigest = \"busybox@\" + invalidDigest\n\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t})\n\n\t\t\tit(\"errors for Arch\", func() {\n\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\tRepoName:      nonExistentDigest,\n\t\t\t\t\t\tOSArch:        \"some-arch\",\n\t\t\t\t\t},\n\t\t\t\t)\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t\tit(\"errors for Variant\", func() {\n\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\tRepoName:      nonExistentDigest,\n\t\t\t\t\t\tOSVariant:     \"some-variant\",\n\t\t\t\t\t},\n\t\t\t\t)\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t\tit(\"errors for Annotations\", func() {\n\t\t\t\terr = subject.AnnotateManifest(\n\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\tManifestAnnotateOptions{\n\t\t\t\t\t\tIndexRepoName: 
indexRepoName,\n\t\t\t\t\t\tRepoName:      nonExistentDigest,\n\t\t\t\t\t\tAnnotations:   map[string]string{\"some-key\": \"some-value\"},\n\t\t\t\t\t},\n\t\t\t\t)\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/manifest_create.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype CreateManifestOptions struct {\n\t// Image index we want to create\n\tIndexRepoName string\n\n\t// Name of images we wish to add into the image index\n\tRepoNames []string\n\n\t// Media type of the index\n\tFormat types.MediaType\n\n\t// true if we want to publish to an insecure registry\n\tInsecure bool\n\n\t// true if we want to push the index to a registry after creating\n\tPublish bool\n}\n\n// CreateManifest implements commands.PackClient.\nfunc (c *Client) CreateManifest(ctx context.Context, opts CreateManifestOptions) (err error) {\n\tops := parseOptsToIndexOptions(opts)\n\n\tif c.indexFactory.Exists(opts.IndexRepoName) {\n\t\treturn fmt.Errorf(\"manifest list %s already exists in local storage; use 'pack manifest remove' to \"+\n\t\t\t\"remove it before creating a new manifest list with the same name\", style.Symbol(opts.IndexRepoName))\n\t}\n\n\tindex, err := c.indexFactory.CreateIndex(opts.IndexRepoName, ops...)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tfor _, repoName := range opts.RepoNames {\n\t\tif err = c.addManifestToIndex(ctx, repoName, index); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\tif opts.Publish {\n\t\t// push to a registry without saving a local copy\n\t\tops = append(ops, imgutil.WithPurge(true))\n\t\tif err = index.Push(ops...); err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tc.logger.Infof(\"Successfully pushed manifest list %s to registry\", style.Symbol(opts.IndexRepoName))\n\t\treturn nil\n\t}\n\n\tif err = index.SaveDir(); err != nil {\n\t\treturn fmt.Errorf(\"manifest list %s could not be saved to local storage: %w\", style.Symbol(opts.IndexRepoName), err)\n\t}\n\n\tc.logger.Infof(\"Successfully created manifest list %s\", style.Symbol(opts.IndexRepoName))\n\treturn nil\n}\n\n
func parseOptsToIndexOptions(opts CreateManifestOptions) (idxOpts []imgutil.IndexOption) {\n\t// Apply the default media type before branching so insecure indexes also get it\n\tif opts.Format == \"\" {\n\t\topts.Format = types.OCIImageIndex\n\t}\n\tif opts.Insecure {\n\t\treturn []imgutil.IndexOption{\n\t\t\timgutil.WithMediaType(opts.Format),\n\t\t\timgutil.WithInsecure(),\n\t\t}\n\t}\n\treturn []imgutil.IndexOption{\n\t\timgutil.WithMediaType(opts.Format),\n\t}\n}\n"
  },
  {
    "path": "pkg/client/manifest_create_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestCreateManifest(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"build\", testCreateManifest, spec.Report(report.Terminal{}))\n}\n\nfunc testCreateManifest(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController   *gomock.Controller\n\t\tmockIndexFactory *testmocks.MockIndexFactory\n\t\tfakeImageFetcher *ifakes.FakeImageFetcher\n\t\tout              bytes.Buffer\n\t\tlogger           logging.Logger\n\t\tsubject          *Client\n\t\terr              error\n\t\ttmpDir           string\n\t)\n\n\tit.Before(func() {\n\t\tfakeImageFetcher = ifakes.NewFakeImageFetcher()\n\t\tlogger = logging.NewLogWithWriters(&out, &out, logging.WithVerbose())\n\t\tmockController = gomock.NewController(t)\n\t\tmockIndexFactory = testmocks.NewMockIndexFactory(mockController)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"add-manifest-test\")\n\t\th.AssertNil(t, err)\n\t\tos.Setenv(\"XDG_RUNTIME_DIR\", tmpDir)\n\n\t\tsubject, err = NewClient(\n\t\t\tWithLogger(logger),\n\t\t\tWithFetcher(fakeImageFetcher),\n\t\t\tWithIndexFactory(mockIndexFactory),\n\t\t\tWithExperimental(true),\n\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t)\n\t\th.AssertSameInstance(t, mockIndexFactory, subject.indexFactory)\n\t\th.AssertNil(t, err)\n\t})\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\th.AssertNil(t, 
os.RemoveAll(tmpDir))\n\t})\n\n\twhen(\"#CreateManifest\", func() {\n\t\tvar indexRepoName string\n\t\twhen(\"index doesn't exist\", func() {\n\t\t\tvar indexLocalPath string\n\n\t\t\twhen(\"remote manifest is provided\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tfakeImage := h.NewFakeWithRandomUnderlyingV1Image(t, \"pack/image\", nil)\n\t\t\t\t\tfakeImageFetcher.RemoteImages[\"index.docker.io/library/busybox:1.36-musl\"] = fakeImage\n\t\t\t\t})\n\n\t\t\t\twhen(\"publish is false\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// We want to actually create an index, so no need to mock the index factory\n\t\t\t\t\t\tsubject, err = NewClient(\n\t\t\t\t\t\t\tWithLogger(logger),\n\t\t\t\t\t\t\tWithFetcher(fakeImageFetcher),\n\t\t\t\t\t\t\tWithExperimental(true),\n\t\t\t\t\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t\t\t\t\t)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"no errors on save\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\t\tindexLocalPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"no media type is provided\", func() {\n\t\t\t\t\t\t\tit(\"creates the index adding the manifest\", func() {\n\t\t\t\t\t\t\t\terr = subject.CreateManifest(\n\t\t\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\t\t\tCreateManifestOptions{\n\t\t\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\t\t\tRepoNames:     []string{\"busybox:1.36-musl\"},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tindex := h.ReadIndexManifest(t, indexLocalPath)\n\t\t\t\t\t\t\t\th.AssertEq(t, len(index.Manifests), 1)\n\t\t\t\t\t\t\t\t// By default uses OCI media-types\n\t\t\t\t\t\t\t\th.AssertEq(t, index.MediaType, types.OCIImageIndex)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"media type is provided\", func() {\n\t\t\t\t\t\t\tit(\"creates the index adding the manifest\", func() {\n\t\t\t\t\t\t\t\terr = 
subject.CreateManifest(\n\t\t\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\t\t\tCreateManifestOptions{\n\t\t\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\t\t\tRepoNames:     []string{\"busybox:1.36-musl\"},\n\t\t\t\t\t\t\t\t\t\tFormat:        types.DockerManifestList,\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tindex := h.ReadIndexManifest(t, indexLocalPath)\n\t\t\t\t\t\t\t\th.AssertEq(t, len(index.Manifests), 1)\n\t\t\t\t\t\t\t\th.AssertEq(t, index.MediaType, types.DockerManifestList)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"publish is true\", func() {\n\t\t\t\t\tvar index *h.MockImageIndex\n\n\t\t\t\t\twhen(\"no errors on save\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\t\tindexLocalPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\n\t\t\t\t\t\t\t// index stub returned to check if the push operation was called\n\t\t\t\t\t\t\tindex = h.NewMockImageIndex(t, indexRepoName, 0, 0)\n\n\t\t\t\t\t\t\t// We need to mock the index factory to inject a stub index to be pushed.\n\t\t\t\t\t\t\tmockIndexFactory.EXPECT().Exists(gomock.Eq(indexRepoName)).Return(false)\n\t\t\t\t\t\t\tmockIndexFactory.EXPECT().CreateIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(index, nil)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"creates the index adding the manifest and pushes it to the registry\", func() {\n\t\t\t\t\t\t\terr = subject.CreateManifest(\n\t\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\t\tCreateManifestOptions{\n\t\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\t\tRepoNames:     []string{\"busybox:1.36-musl\"},\n\t\t\t\t\t\t\t\t\tPublish:       true,\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t// index is not saved locally; it is pushed to the registry\n\t\t\t\t\t\t\th.AssertPathDoesNotExists(t, indexLocalPath)\n\t\t\t\t\t\t\th.AssertTrue(t, 
index.PushCalled)\n\t\t\t\t\t\t\th.AssertTrue(t, index.PurgeOption)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"no manifest is provided\", func() {\n\t\t\t\twhen(\"no errors on save\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// We want to actually create an index, so no need to mock the index factory\n\t\t\t\t\t\tsubject, err = NewClient(\n\t\t\t\t\t\t\tWithLogger(logger),\n\t\t\t\t\t\t\tWithFetcher(fakeImageFetcher),\n\t\t\t\t\t\t\tWithExperimental(true),\n\t\t\t\t\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t\t\t\t\t)\n\n\t\t\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\t\t\tindexLocalPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates an empty index with OCI media-type\", func() {\n\t\t\t\t\t\terr = subject.CreateManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tCreateManifestOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tFormat:        types.OCIImageIndex,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tindex := h.ReadIndexManifest(t, indexLocalPath)\n\t\t\t\t\t\th.AssertEq(t, len(index.Manifests), 0)\n\t\t\t\t\t\th.AssertEq(t, index.MediaType, types.OCIImageIndex)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates an empty index with Docker media-type\", func() {\n\t\t\t\t\t\terr = subject.CreateManifest(\n\t\t\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\t\t\tCreateManifestOptions{\n\t\t\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t\t\t\tFormat:        types.DockerManifestList,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tindex := h.ReadIndexManifest(t, indexLocalPath)\n\t\t\t\t\t\th.AssertEq(t, len(index.Manifests), 0)\n\t\t\t\t\t\th.AssertEq(t, index.MediaType, types.DockerManifestList)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\n\t\t\t\t// mock the 
index factory to simulate that the index exists\n\t\t\t\tmockIndexFactory.EXPECT().Exists(gomock.Eq(indexRepoName)).AnyTimes().Return(true)\n\t\t\t})\n\n\t\t\tit(\"returns an error when index already exists\", func() {\n\t\t\t\terr = subject.CreateManifest(\n\t\t\t\t\tcontext.TODO(),\n\t\t\t\t\tCreateManifestOptions{\n\t\t\t\t\t\tIndexRepoName: indexRepoName,\n\t\t\t\t\t},\n\t\t\t\t)\n\t\t\t\th.AssertError(t, err, \"already exists in local storage; use 'pack manifest remove' to remove it before creating a new manifest list with the same name\")\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/manifest_inspect.go",
    "content": "package client\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/buildpacks/imgutil\"\n)\n\n// InspectManifest implements commands.PackClient.\nfunc (c *Client) InspectManifest(indexRepoName string) error {\n\tvar (\n\t\tindex    imgutil.ImageIndex\n\t\tindexStr string\n\t\terr      error\n\t)\n\n\tindex, err = c.indexFactory.FindIndex(indexRepoName)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif indexStr, err = index.Inspect(); err != nil {\n\t\treturn fmt.Errorf(\"failed to inspect manifest list '%s': %w\", indexRepoName, err)\n\t}\n\n\tc.logger.Info(indexStr)\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/client/manifest_inspect_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"encoding/json\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\tv1 \"github.com/google/go-containerregistry/pkg/v1\"\n\t\"github.com/google/go-containerregistry/pkg/v1/random\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestInspectManifest(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"build\", testInspectManifest, spec.Report(report.Terminal{}))\n}\n\nfunc testInspectManifest(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController   *gomock.Controller\n\t\tmockIndexFactory *testmocks.MockIndexFactory\n\t\tstdout           bytes.Buffer\n\t\tstderr           bytes.Buffer\n\t\tlogger           logging.Logger\n\t\tsubject          *Client\n\t\terr              error\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&stdout, &stderr, logging.WithVerbose())\n\t\tmockController = gomock.NewController(t)\n\t\tmockIndexFactory = testmocks.NewMockIndexFactory(mockController)\n\n\t\tsubject, err = NewClient(\n\t\t\tWithLogger(logger),\n\t\t\tWithIndexFactory(mockIndexFactory),\n\t\t\tWithExperimental(true),\n\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t)\n\t\th.AssertSameInstance(t, mockIndexFactory, subject.indexFactory)\n\t\th.AssertSameInstance(t, subject.logger, logger)\n\t\th.AssertNil(t, err)\n\t})\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"#InspectManifest\", func() {\n\t\tvar indexRepoName string\n\n\t\twhen(\"index doesn't exits\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = 
h.NewRandomIndexRepoName()\n\t\t\t\tmockIndexFactory.\n\t\t\t\t\tEXPECT().\n\t\t\t\t\tFindIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(nil, errors.New(\"index not found\"))\n\t\t\t})\n\n\t\t\tit(\"should return an error when index not found\", func() {\n\t\t\t\terr = subject.InspectManifest(indexRepoName)\n\t\t\t\th.AssertEq(t, err.Error(), \"index not found\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\tvar indexManifest *v1.IndexManifest\n\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\tidx := setUpIndex(t, indexRepoName, *mockIndexFactory)\n\t\t\t\tindexManifest, err = idx.IndexManifest()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"should return formatted IndexManifest\", func() {\n\t\t\t\terr = subject.InspectManifest(indexRepoName)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tprintedIndex := &v1.IndexManifest{}\n\t\t\t\terr = json.Unmarshal(stdout.Bytes(), printedIndex)\n\t\t\t\th.AssertEq(t, indexManifest, printedIndex)\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc setUpIndex(t *testing.T, indexRepoName string, mockIndexFactory testmocks.MockIndexFactory) v1.ImageIndex {\n\trandomUnderlyingIndex, err := random.Index(1024, 1, 2)\n\th.AssertNil(t, err)\n\n\toptions := &imgutil.IndexOptions{\n\t\tBaseIndex: randomUnderlyingIndex,\n\t}\n\tidx, err := imgutil.NewCNBIndex(indexRepoName, *options)\n\th.AssertNil(t, err)\n\n\tmockIndexFactory.EXPECT().FindIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\treturn randomUnderlyingIndex\n}\n"
  },
  {
    "path": "pkg/client/manifest_push.go",
    "content": "package client\n\nimport (\n\t\"fmt\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype PushManifestOptions struct {\n\t// Image index we want to update\n\tIndexRepoName string\n\n\t// Index media-type\n\tFormat types.MediaType\n\n\t// true if we want to publish to an insecure registry\n\tInsecure bool\n\n\t// true if we want the index to be deleted from local storage after pushing it\n\tPurge bool\n}\n\n// PushManifest implements commands.PackClient.\nfunc (c *Client) PushManifest(opts PushManifestOptions) (err error) {\n\tif opts.Format == \"\" {\n\t\topts.Format = types.OCIImageIndex\n\t}\n\tops := parseOptions(opts)\n\n\tidx, err := c.indexFactory.LoadIndex(opts.IndexRepoName)\n\tif err != nil {\n\t\treturn\n\t}\n\n\tif err = idx.Push(ops...); err != nil {\n\t\treturn fmt.Errorf(\"failed to push manifest list %s: %w\", style.Symbol(opts.IndexRepoName), err)\n\t}\n\n\tif !opts.Purge {\n\t\tc.logger.Infof(\"Successfully pushed manifest list %s to registry\", style.Symbol(opts.IndexRepoName))\n\t\treturn nil\n\t}\n\n\treturn idx.DeleteDir()\n}\n\nfunc parseOptions(opts PushManifestOptions) (idxOptions []imgutil.IndexOption) {\n\tif opts.Insecure {\n\t\tidxOptions = append(idxOptions, imgutil.WithInsecure())\n\t}\n\n\tif opts.Purge {\n\t\tidxOptions = append(idxOptions, imgutil.WithPurge(true))\n\t}\n\n\treturn append(idxOptions, imgutil.WithMediaType(opts.Format))\n}\n"
  },
  {
    "path": "pkg/client/manifest_push_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPushManifest(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"build\", testPushManifest, spec.Report(report.Terminal{}))\n}\n\nfunc testPushManifest(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController   *gomock.Controller\n\t\tmockIndexFactory *testmocks.MockIndexFactory\n\t\tout              bytes.Buffer\n\t\tlogger           logging.Logger\n\t\tsubject          *Client\n\t\terr              error\n\t\ttmpDir           string\n\t)\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&out, &out, logging.WithVerbose())\n\t\tmockController = gomock.NewController(t)\n\t\tmockIndexFactory = testmocks.NewMockIndexFactory(mockController)\n\n\t\tsubject, err = NewClient(\n\t\t\tWithLogger(logger),\n\t\t\tWithIndexFactory(mockIndexFactory),\n\t\t\tWithExperimental(true),\n\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t)\n\t\th.AssertSameInstance(t, mockIndexFactory, subject.indexFactory)\n\t\th.AssertNil(t, err)\n\t})\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t})\n\n\twhen(\"#PushManifest\", func() {\n\t\twhen(\"index exists locally\", func() {\n\t\t\tvar index *h.MockImageIndex\n\n\t\t\tit.Before(func() {\n\t\t\t\tindex = h.NewMockImageIndex(t, \"some-index\", 1, 2)\n\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(\"some-index\"), gomock.Any()).Return(index, nil)\n\t\t\t})\n\t\t\tit(\"pushes the index to the registry\", func() {\n\t\t\t\terr = 
subject.PushManifest(PushManifestOptions{\n\t\t\t\t\tIndexRepoName: \"some-index\",\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertTrue(t, index.PushCalled)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index doesn't exist locally\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Any(), gomock.Any()).Return(nil, errors.New(\"ErrNoImageOrIndexFoundWithGivenDigest\"))\n\t\t\t})\n\n\t\t\tit(\"errors with a message\", func() {\n\t\t\t\terr = subject.PushManifest(PushManifestOptions{\n\t\t\t\t\tIndexRepoName: \"some-index\",\n\t\t\t\t})\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/manifest_remove.go",
    "content": "package client\n\nimport \"errors\"\n\n// DeleteManifest implements commands.PackClient.\nfunc (c *Client) DeleteManifest(names []string) error {\n\tvar allErrors error\n\tfor _, name := range names {\n\t\timgIndex, err := c.indexFactory.LoadIndex(name)\n\t\tif err != nil {\n\t\t\tallErrors = errors.Join(allErrors, err)\n\t\t\tcontinue\n\t\t}\n\n\t\tif err := imgIndex.DeleteDir(); err != nil {\n\t\t\tallErrors = errors.Join(allErrors, err)\n\t\t}\n\t}\n\n\tif allErrors == nil {\n\t\tc.logger.Info(\"Successfully deleted manifest list(s) from local storage\")\n\t}\n\treturn allErrors\n}\n"
  },
  {
    "path": "pkg/client/manifest_remove_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestDeleteManifest(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"build\", testDeleteManifest, spec.Report(report.Terminal{}))\n}\n\nfunc testDeleteManifest(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController   *gomock.Controller\n\t\tmockIndexFactory *testmocks.MockIndexFactory\n\t\tout              bytes.Buffer\n\t\tlogger           logging.Logger\n\t\tsubject          *Client\n\t\terr              error\n\t\ttmpDir           string\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&out, &out, logging.WithVerbose())\n\t\tmockController = gomock.NewController(t)\n\t\tmockIndexFactory = testmocks.NewMockIndexFactory(mockController)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"remove-manifest-test\")\n\t\th.AssertNil(t, err)\n\t\tos.Setenv(\"XDG_RUNTIME_DIR\", tmpDir)\n\n\t\tsubject, err = NewClient(\n\t\t\tWithLogger(logger),\n\t\t\tWithIndexFactory(mockIndexFactory),\n\t\t\tWithExperimental(true),\n\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t)\n\t\th.AssertSameInstance(t, mockIndexFactory, subject.indexFactory)\n\t\th.AssertNil(t, err)\n\t})\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t})\n\n\twhen(\"#DeleteManifest\", func() {\n\t\tvar (\n\t\t\tindexPath     string\n\t\t\tindexRepoName string\n\t\t)\n\n\t\twhen(\"index doesn't exists\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Any(), 
gomock.Any()).Return(nil, errors.New(\"index not found locally\"))\n\t\t\t})\n\t\t\tit(\"should return an error when index is already deleted\", func() {\n\t\t\t\terr = subject.DeleteManifest([]string{\"pack/non-existent-index\"})\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index exists\", func() {\n\t\t\tvar idx imgutil.ImageIndex\n\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\tindexPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\t\t\t\tidx = h.RandomCNBIndex(t, indexRepoName, 1, 1)\n\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\n\t\t\t\t// Let's write the index on disk\n\t\t\t\th.AssertNil(t, idx.SaveDir())\n\t\t\t})\n\n\t\t\tit(\"should delete local index\", func() {\n\t\t\t\terr = subject.DeleteManifest([]string{indexRepoName})\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertContains(t, out.String(), \"Successfully deleted manifest list(s) from local storage\")\n\t\t\t\th.AssertPathDoesNotExists(t, indexPath)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/manifest_rm.go",
    "content": "package client\n\nimport (\n\t\"errors\"\n\t\"fmt\"\n\n\tgccrName \"github.com/google/go-containerregistry/pkg/name\"\n)\n\n// RemoveManifest implements commands.PackClient.\nfunc (c *Client) RemoveManifest(name string, images []string) error {\n\tvar allErrors error\n\n\timgIndex, err := c.indexFactory.LoadIndex(name)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tfor _, image := range images {\n\t\tref, err := gccrName.NewDigest(image, gccrName.WeakValidation, gccrName.Insecure)\n\t\tif err != nil {\n\t\t\tallErrors = errors.Join(allErrors, fmt.Errorf(\"invalid instance '%s': %w\", image, err))\n\t\t}\n\n\t\tif err = imgIndex.RemoveManifest(ref); err != nil {\n\t\t\tallErrors = errors.Join(allErrors, err)\n\t\t}\n\n\t\tif err = imgIndex.SaveDir(); err != nil {\n\t\t\tallErrors = errors.Join(allErrors, err)\n\t\t}\n\t}\n\n\tif allErrors == nil {\n\t\tc.logger.Infof(\"Successfully removed image(s) from index: '%s'\", name)\n\t}\n\n\treturn allErrors\n}\n"
  },
  {
    "path": "pkg/client/manifest_rm_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRemoveManifest(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"build\", testRemoveManifest, spec.Report(report.Terminal{}))\n}\n\nfunc testRemoveManifest(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tmockController   *gomock.Controller\n\t\tmockIndexFactory *testmocks.MockIndexFactory\n\t\tout              bytes.Buffer\n\t\tlogger           logging.Logger\n\t\tsubject          *Client\n\t\terr              error\n\t\ttmpDir           string\n\t)\n\n\tit.Before(func() {\n\t\tlogger = logging.NewLogWithWriters(&out, &out, logging.WithVerbose())\n\t\tmockController = gomock.NewController(t)\n\t\tmockIndexFactory = testmocks.NewMockIndexFactory(mockController)\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"rm-manifest-test\")\n\t\th.AssertNil(t, err)\n\t\tos.Setenv(\"XDG_RUNTIME_DIR\", tmpDir)\n\n\t\tsubject, err = NewClient(\n\t\t\tWithLogger(logger),\n\t\t\tWithIndexFactory(mockIndexFactory),\n\t\t\tWithExperimental(true),\n\t\t\tWithKeychain(authn.DefaultKeychain),\n\t\t)\n\t\th.AssertSameInstance(t, mockIndexFactory, subject.indexFactory)\n\t\th.AssertNil(t, err)\n\t})\n\tit.After(func() {\n\t\tmockController.Finish()\n\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t})\n\n\twhen(\"#RemoveManifest\", func() {\n\t\tvar (\n\t\t\tindexPath     string\n\t\t\tindexRepoName string\n\t\t)\n\n\t\twhen(\"index exists\", func() {\n\t\t\tvar digest name.Digest\n\t\t\tvar idx 
imgutil.ImageIndex\n\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\tindexPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(indexRepoName))\n\n\t\t\t\t// Initialize the index with 2 image manifests\n\t\t\t\tidx, digest = h.RandomCNBIndexAndDigest(t, indexRepoName, 1, 2)\n\t\t\t\tmockIndexFactory.EXPECT().LoadIndex(gomock.Eq(indexRepoName), gomock.Any()).Return(idx, nil)\n\t\t\t})\n\n\t\t\tit(\"should remove local index\", func() {\n\t\t\t\terr = subject.RemoveManifest(indexRepoName, []string{digest.Name()})\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t// We expect one manifest after removing one of them\n\t\t\t\tindex := h.ReadIndexManifest(t, indexPath)\n\t\t\t\th.AssertEq(t, len(index.Manifests), 1)\n\t\t\t\th.AssertNotEq(t, index.Manifests[0].Digest.String(), digest.Name())\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/new_buildpack.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/BurntSushi/toml\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\nvar (\n\tbashBinBuild = `#!/usr/bin/env bash\n\nset -euo pipefail\n\nlayers_dir=\"$1\"\nenv_dir=\"$2/env\"\nplan_path=\"$3\"\n\nexit 0\n`\n\tbashBinDetect = `#!/usr/bin/env bash\n\nexit 0\n`\n)\n\ntype NewBuildpackOptions struct {\n\t// api compat version of the output buildpack artifact.\n\tAPI string\n\n\t// The base directory to generate assets\n\tPath string\n\n\t// The ID of the output buildpack artifact.\n\tID string\n\n\t// version of the output buildpack artifact.\n\tVersion string\n\n\t// Deprecated: The stacks this buildpack will work with\n\tStacks []dist.Stack\n\n\t// the targets this buildpack will work with\n\tTargets []dist.Target\n}\n\nfunc (c *Client) NewBuildpack(ctx context.Context, opts NewBuildpackOptions) error {\n\terr := createBuildpackTOML(opts.Path, opts.ID, opts.Version, opts.API, opts.Stacks, opts.Targets, c)\n\tif err != nil {\n\t\treturn err\n\t}\n\treturn createBashBuildpack(opts.Path, c)\n}\n\nfunc createBashBuildpack(path string, c *Client) error {\n\tif err := createBinScript(path, \"build\", bashBinBuild, c); err != nil {\n\t\treturn err\n\t}\n\n\tif err := createBinScript(path, \"detect\", bashBinDetect, c); err != nil {\n\t\treturn err\n\t}\n\n\treturn nil\n}\n\nfunc createBinScript(path, name, contents string, c *Client) error {\n\tbinDir := filepath.Join(path, \"bin\")\n\tbinFile := filepath.Join(binDir, name)\n\n\t_, err := os.Stat(binFile)\n\tif os.IsNotExist(err) {\n\t\t// The following line's comment is for gosec, it will ignore rule 301 in this case\n\t\t// G301: Expect directory permissions to be 0750 or less\n\t\t/* #nosec G301 */\n\t\tif err := os.MkdirAll(binDir, 0755); err != nil {\n\t\t\treturn err\n\t\t}\n\t\t// The following line's comment is for 
gosec, it will ignore rule 306 in this case\n\t\t// G306: Expect WriteFile permissions to be 0600 or less\n\t\t/* #nosec G306 */\n\t\terr = os.WriteFile(binFile, []byte(contents), 0755)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif c != nil {\n\t\t\tc.logger.Infof(\"    %s  bin/%s\", style.Symbol(\"create\"), name)\n\t\t}\n\t}\n\treturn nil\n}\n\nfunc createBuildpackTOML(path, id, version, apiStr string, stacks []dist.Stack, targets []dist.Target, c *Client) error {\n\tapi, err := api.NewVersion(apiStr)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tbuildpackTOML := dist.BuildpackDescriptor{\n\t\tWithAPI:     api,\n\t\tWithStacks:  stacks,\n\t\tWithTargets: targets,\n\t\tWithInfo: dist.ModuleInfo{\n\t\t\tID:      id,\n\t\t\tVersion: version,\n\t\t},\n\t}\n\n\t// The following line's comment is for gosec, it will ignore rule 301 in this case\n\t// G301: Expect directory permissions to be 0750 or less\n\t/* #nosec G301 */\n\tif err := os.MkdirAll(path, 0755); err != nil {\n\t\treturn err\n\t}\n\n\tbuildpackTOMLPath := filepath.Join(path, \"buildpack.toml\")\n\t_, err = os.Stat(buildpackTOMLPath)\n\tif os.IsNotExist(err) {\n\t\tf, err := os.Create(buildpackTOMLPath)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif err := toml.NewEncoder(f).Encode(buildpackTOML); err != nil {\n\t\t\treturn err\n\t\t}\n\t\tdefer f.Close()\n\t\tif c != nil {\n\t\t\tc.logger.Infof(\"    %s  buildpack.toml\", style.Symbol(\"create\"))\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/client/new_buildpack_test.go",
    "content": "package client_test\n\nimport (\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/pelletier/go-toml\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestNewBuildpack(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"NewBuildpack\", testNewBuildpack, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testNewBuildpack(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject *client.Client\n\t\ttmpDir  string\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"new-buildpack-test\")\n\t\th.AssertNil(t, err)\n\n\t\tsubject, err = client.NewClient()\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t})\n\n\twhen(\"#NewBuildpack\", func() {\n\t\tit(\"should create bash scripts\", func() {\n\t\t\terr := subject.NewBuildpack(context.TODO(), client.NewBuildpackOptions{\n\t\t\t\tAPI:     \"0.4\",\n\t\t\t\tPath:    tmpDir,\n\t\t\t\tID:      \"example/my-cnb\",\n\t\t\t\tVersion: \"0.0.0\",\n\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t{\n\t\t\t\t\t\tID:     \"some-stack\",\n\t\t\t\t\t\tMixins: []string{\"some-mixin\"},\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t})\n\t\t\th.AssertNil(t, err)\n\n\t\t\tinfo, err := os.Stat(filepath.Join(tmpDir, \"bin/build\"))\n\t\t\th.AssertFalse(t, os.IsNotExist(err))\n\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\th.AssertTrue(t, info.Mode()&0100 != 0)\n\t\t\t}\n\n\t\t\tinfo, err = os.Stat(filepath.Join(tmpDir, \"bin/detect\"))\n\t\t\th.AssertFalse(t, os.IsNotExist(err))\n\t\t\tif runtime.GOOS != \"windows\" {\n\t\t\t\th.AssertTrue(t, info.Mode()&0100 != 0)\n\t\t\t}\n\n\t\t\tassertBuildpackToml(t, tmpDir, \"example/my-cnb\")\n\t\t})\n\n\t\twhen(\"files exist\", 
func() {\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\n\t\t\t\terr = os.MkdirAll(filepath.Join(tmpDir, \"bin\"), 0755)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\terr = os.WriteFile(filepath.Join(tmpDir, \"buildpack.toml\"), []byte(\"expected value\"), 0655)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\terr = os.WriteFile(filepath.Join(tmpDir, \"bin\", \"build\"), []byte(\"expected value\"), 0755)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\terr = os.WriteFile(filepath.Join(tmpDir, \"bin\", \"detect\"), []byte(\"expected value\"), 0755)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit(\"should not clobber files that exist\", func() {\n\t\t\t\terr := subject.NewBuildpack(context.TODO(), client.NewBuildpackOptions{\n\t\t\t\t\tAPI:     \"0.4\",\n\t\t\t\t\tPath:    tmpDir,\n\t\t\t\t\tID:      \"example/my-cnb\",\n\t\t\t\t\tVersion: \"0.0.0\",\n\t\t\t\t\tStacks: []dist.Stack{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tID:     \"some-stack\",\n\t\t\t\t\t\t\tMixins: []string{\"some-mixin\"},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tcontent, err := os.ReadFile(filepath.Join(tmpDir, \"buildpack.toml\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, content, []byte(\"expected value\"))\n\n\t\t\t\tcontent, err = os.ReadFile(filepath.Join(tmpDir, \"bin\", \"build\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, content, []byte(\"expected value\"))\n\n\t\t\t\tcontent, err = os.ReadFile(filepath.Join(tmpDir, \"bin\", \"detect\"))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, content, []byte(\"expected value\"))\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc assertBuildpackToml(t *testing.T, path string, id string) {\n\tbuildpackTOML := filepath.Join(path, \"buildpack.toml\")\n\t_, err := os.Stat(buildpackTOML)\n\th.AssertFalse(t, os.IsNotExist(err))\n\n\tf, err := os.Open(buildpackTOML)\n\th.AssertNil(t, err)\n\tvar buildpackDescriptor dist.BuildpackDescriptor\n\terr = toml.NewDecoder(f).Decode(&buildpackDescriptor)\n\th.AssertNil(t, err)\n\tdefer 
f.Close()\n\n\th.AssertEq(t, buildpackDescriptor.Info().ID, \"example/my-cnb\")\n}\n"
  },
  {
    "path": "pkg/client/package_buildpack.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"path/filepath\"\n\n\t\"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/name\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\t\"github.com/buildpacks/pack/internal/layer\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\nconst (\n\t// Packaging indicator that format of inputs/outputs will be an OCI image on the registry.\n\tFormatImage = \"image\"\n\n\t// Packaging indicator that format of output will be a file on the host filesystem.\n\tFormatFile = \"file\"\n\n\t// CNBExtension is the file extension for a cloud native buildpack tar archive\n\tCNBExtension = \".cnb\"\n)\n\n// PackageBuildpackOptions is a configuration object used to define\n// the behavior of PackageBuildpack.\ntype PackageBuildpackOptions struct {\n\t// The base director to resolve relative assest from\n\tRelativeBaseDir string\n\n\t// The name of the output buildpack artifact.\n\tName string\n\n\t// Type of output format, The options are the either the const FormatImage, or FormatFile.\n\tFormat string\n\n\t// Defines the Buildpacks configuration.\n\tConfig pubbldpkg.Config\n\n\t// Push resulting builder image up to a registry\n\t// specified in the Name variable.\n\tPublish bool\n\n\t// Append [os]-[arch] suffix to the image tag when publishing a multi-arch to a registry\n\t// Requires Publish to be true\n\tAppendImageNameSuffix bool\n\n\t// Strategy for updating images before packaging.\n\tPullPolicy image.PullPolicy\n\n\t// Name of the buildpack registry. 
Used to\n\t// add buildpacks to a package.\n\tRegistry string\n\n\t// Flatten layers\n\tFlatten bool\n\n\t// List of buildpack images to exclude from being flattened.\n\tFlattenExclude []string\n\n\t// Map of labels to add to the Buildpack\n\tLabels map[string]string\n\n\t// Target platforms to build packages for\n\tTargets []dist.Target\n\n\t// Additional image tags to push to, each will contain contents identical to Image\n\tAdditionalTags []string\n}\n\n// PackageBuildpack packages buildpack(s) into either an image or file.\nfunc (c *Client) PackageBuildpack(ctx context.Context, opts PackageBuildpackOptions) error {\n\tif opts.Format == \"\" {\n\t\topts.Format = FormatImage\n\t}\n\n\ttargets, err := c.processPackageBuildpackTargets(ctx, opts)\n\tif err != nil {\n\t\treturn err\n\t}\n\tmultiArch := len(targets) > 1 && (opts.Publish || opts.Format == FormatFile)\n\n\tvar digests []string\n\ttargets = dist.ExpandTargetsDistributions(targets...)\n\tfor _, target := range targets {\n\t\tdigest, err := c.packageBuildpackTarget(ctx, opts, target, multiArch)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tdigests = append(digests, digest)\n\t}\n\n\tif opts.Publish && len(digests) > 1 {\n\t\t// Image Index must be created only when we pushed to registry\n\t\treturn c.CreateManifest(ctx, CreateManifestOptions{\n\t\t\tIndexRepoName: opts.Name,\n\t\t\tRepoNames:     digests,\n\t\t\tPublish:       true,\n\t\t})\n\t}\n\n\treturn nil\n}\n\nfunc (c *Client) packageBuildpackTarget(ctx context.Context, opts PackageBuildpackOptions, target dist.Target, multiArch bool) (string, error) {\n\tvar digest string\n\tif target.OS == \"windows\" && !c.experimental {\n\t\treturn \"\", NewExperimentError(\"Windows buildpackage support is currently experimental.\")\n\t}\n\n\terr := c.validateOSPlatform(ctx, target.OS, opts.Publish, opts.Format)\n\tif err != nil {\n\t\treturn digest, err\n\t}\n\n\twriterFactory, err := layer.NewWriterFactory(target.OS)\n\tif err != nil {\n\t\treturn digest, 
errors.Wrap(err, \"creating layer writer factory\")\n\t}\n\n\tvar packageBuilderOpts []buildpack.PackageBuilderOption\n\tif opts.Flatten {\n\t\tpackageBuilderOpts = append(packageBuilderOpts, buildpack.DoNotFlatten(opts.FlattenExclude),\n\t\t\tbuildpack.WithLayerWriterFactory(writerFactory), buildpack.WithLogger(c.logger))\n\t}\n\tpackageBuilder := buildpack.NewBuilder(c.imageFactory, packageBuilderOpts...)\n\n\tbpURI := opts.Config.Buildpack.URI\n\tif bpURI == \"\" {\n\t\treturn digest, errors.New(\"buildpack URI must be provided\")\n\t}\n\n\tif ok, platformRootFolder := buildpack.PlatformRootFolder(bpURI, target); ok {\n\t\tbpURI = platformRootFolder\n\t}\n\n\tmainBlob, err := c.downloadBuildpackFromURI(ctx, bpURI, opts.RelativeBaseDir)\n\tif err != nil {\n\t\treturn digest, err\n\t}\n\n\tbp, err := buildpack.FromBuildpackRootBlob(mainBlob, writerFactory, c.logger)\n\tif err != nil {\n\t\treturn digest, errors.Wrapf(err, \"creating buildpack from %s\", style.Symbol(bpURI))\n\t}\n\n\tpackageBuilder.SetBuildpack(bp)\n\n\tplatform := target.ValuesAsPlatform()\n\n\tfor _, dep := range opts.Config.Dependencies {\n\t\tif multiArch {\n\t\t\tlocatorType, err := buildpack.GetLocatorType(dep.URI, opts.RelativeBaseDir, []dist.ModuleInfo{})\n\t\t\tif err != nil {\n\t\t\t\treturn digest, err\n\t\t\t}\n\t\t\tif locatorType == buildpack.URILocator {\n\t\t\t\t// When building a composite multi-platform buildpack all the dependencies must be pushed to a registry\n\t\t\t\treturn digest, errors.New(fmt.Sprintf(\"uri %s is not allowed when creating a composite multi-platform buildpack; push your dependencies to a registry and use 'docker://<image>' instead\", style.Symbol(dep.URI)))\n\t\t\t}\n\t\t}\n\n\t\tc.logger.Debugf(\"Downloading buildpack dependency for platform %s\", platform)\n\t\tmainBP, deps, err := c.buildpackDownloader.Download(ctx, dep.URI, buildpack.DownloadOptions{\n\t\t\tRegistryName:    opts.Registry,\n\t\t\tRelativeBaseDir: opts.RelativeBaseDir,\n\t\t\tImageName:   
    dep.ImageName,\n\t\t\tDaemon:          !opts.Publish,\n\t\t\tPullPolicy:      opts.PullPolicy,\n\t\t\tTarget:          &target,\n\t\t})\n\t\tif err != nil {\n\t\t\treturn digest, errors.Wrapf(err, \"packaging dependencies (uri=%s,image=%s)\", style.Symbol(dep.URI), style.Symbol(dep.ImageName))\n\t\t}\n\n\t\tpackageBuilder.AddDependencies(mainBP, deps)\n\t}\n\n\tswitch opts.Format {\n\tcase FormatFile:\n\t\tname := opts.Name\n\t\tif multiArch {\n\t\t\textension := filepath.Ext(name)\n\t\t\torigFileName := name[:len(name)-len(filepath.Ext(name))]\n\t\t\tif target.Arch != \"\" {\n\t\t\t\tname = fmt.Sprintf(\"%s-%s-%s%s\", origFileName, target.OS, target.Arch, extension)\n\t\t\t} else {\n\t\t\t\tname = fmt.Sprintf(\"%s-%s%s\", origFileName, target.OS, extension)\n\t\t\t}\n\t\t}\n\t\terr = packageBuilder.SaveAsFile(name, target, opts.Labels)\n\t\tif err != nil {\n\t\t\treturn digest, err\n\t\t}\n\tcase FormatImage:\n\t\tpackageName := opts.Name\n\t\tif multiArch && opts.AppendImageNameSuffix {\n\t\t\tpackageName, err = name.AppendSuffix(packageName, target)\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", errors.Wrap(err, \"invalid image name\")\n\t\t\t}\n\t\t}\n\t\timg, err := packageBuilder.SaveAsImage(packageName, opts.Publish, target, opts.Labels, opts.AdditionalTags...)\n\t\tif err != nil {\n\t\t\treturn digest, errors.Wrapf(err, \"saving image\")\n\t\t}\n\t\tif multiArch {\n\t\t\t// We need to keep the identifier to create the image index\n\t\t\tid, err := img.Identifier()\n\t\t\tif err != nil {\n\t\t\t\treturn digest, errors.Wrapf(err, \"determining image manifest digest\")\n\t\t\t}\n\t\t\tdigest = id.String()\n\t\t}\n\tdefault:\n\t\treturn digest, errors.Errorf(\"unknown format: %s\", style.Symbol(opts.Format))\n\t}\n\treturn digest, nil\n}\n\nfunc (c *Client) downloadBuildpackFromURI(ctx context.Context, uri, relativeBaseDir string) (blob.Blob, error) {\n\tabsPath, err := paths.FilePathToURI(uri, relativeBaseDir)\n\tif err != nil {\n\t\treturn nil, 
errors.Wrapf(err, \"making absolute: %s\", style.Symbol(uri))\n\t}\n\turi = absPath\n\n\tc.logger.Debugf(\"Downloading buildpack from URI: %s\", style.Symbol(uri))\n\tblob, err := c.downloader.Download(ctx, uri)\n\tif err != nil {\n\t\treturn nil, errors.Wrapf(err, \"downloading buildpack from %s\", style.Symbol(uri))\n\t}\n\n\treturn blob, nil\n}\n\nfunc (c *Client) processPackageBuildpackTargets(ctx context.Context, opts PackageBuildpackOptions) ([]dist.Target, error) {\n\tvar targets []dist.Target\n\tif len(opts.Targets) > 0 {\n\t\t// when exporting to the daemon, we need to select just one target\n\t\tif !opts.Publish && opts.Format == FormatImage {\n\t\t\tdaemonTarget, err := c.daemonTarget(ctx, opts.Targets)\n\t\t\tif err != nil {\n\t\t\t\treturn targets, err\n\t\t\t}\n\t\t\ttargets = append(targets, daemonTarget)\n\t\t} else {\n\t\t\ttargets = opts.Targets\n\t\t}\n\t} else {\n\t\ttargets = append(targets, dist.Target{OS: opts.Config.Platform.OS})\n\t}\n\treturn targets, nil\n}\n\nfunc (c *Client) validateOSPlatform(ctx context.Context, os string, publish bool, format string) error {\n\tif publish || format == FormatFile {\n\t\treturn nil\n\t}\n\n\tresult, err := c.docker.Info(ctx, client.InfoOptions{})\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif result.Info.OSType != os {\n\t\treturn errors.Errorf(\"invalid %s specified: DOCKER_OS is %s\", style.Symbol(\"platform.os\"), style.Symbol(result.Info.OSType))\n\t}\n\n\treturn nil\n}\n\n// daemonTarget returns a target that matches with the given daemon os/arch\nfunc (c *Client) daemonTarget(ctx context.Context, targets []dist.Target) (dist.Target, error) {\n\tserverResult, err := c.docker.ServerVersion(ctx, client.ServerVersionOptions{})\n\tif err != nil {\n\t\treturn dist.Target{}, err\n\t}\n\n\tfor _, t := range targets {\n\t\tif t.Arch != \"\" && t.OS == serverResult.Os && t.Arch == serverResult.Arch {\n\t\t\treturn t, nil\n\t\t} else if t.Arch == \"\" && t.OS == serverResult.Os {\n\t\t\treturn t, 
nil\n\t\t}\n\t}\n\treturn dist.Target{}, errors.Errorf(\"could not find a target that matches daemon os=%s and architecture=%s\", serverResult.Os, serverResult.Arch)\n}\n"
  },
  {
    "path": "pkg/client/package_buildpack_test.go",
    "content": "package client_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/heroku/color\"\n\tmobysystem \"github.com/moby/moby/api/types/system\"\n\tdockerclient \"github.com/moby/moby/client\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\tcfg \"github.com/buildpacks/pack/internal/config\"\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/internal/paths\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPackageBuildpack(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"PackageBuildpack\", testPackageBuildpack, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPackageBuildpack(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject          *client.Client\n\t\tmockController   *gomock.Controller\n\t\tmockDownloader   *testmocks.MockBlobDownloader\n\t\tmockImageFactory *testmocks.MockImageFactory\n\t\tmockImageFetcher *testmocks.MockImageFetcher\n\t\tmockDockerClient *testmocks.MockAPIClient\n\t\tmockIndexFactory *testmocks.MockIndexFactory\n\t\tout              bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockDownloader = 
testmocks.NewMockBlobDownloader(mockController)\n\t\tmockImageFactory = testmocks.NewMockImageFactory(mockController)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\t\tmockDockerClient = testmocks.NewMockAPIClient(mockController)\n\t\tmockIndexFactory = testmocks.NewMockIndexFactory(mockController)\n\n\t\tvar err error\n\t\tsubject, err = client.NewClient(\n\t\t\tclient.WithLogger(logging.NewLogWithWriters(&out, &out)),\n\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\tclient.WithDockerClient(mockDockerClient),\n\t\t\tclient.WithIndexFactory(mockIndexFactory),\n\t\t)\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\tcreateBuildpack := func(descriptor dist.BuildpackDescriptor) string {\n\t\tbp, err := ifakes.NewFakeBuildpackBlob(&descriptor, 0644)\n\t\th.AssertNil(t, err)\n\t\turl := fmt.Sprintf(\"https://example.com/bp.%s.tgz\", h.RandString(12))\n\t\tmockDownloader.EXPECT().Download(gomock.Any(), url).Return(bp, nil).AnyTimes()\n\t\treturn url\n\t}\n\n\twhen(\"buildpack has issues\", func() {\n\t\twhen(\"buildpack has no URI\", func() {\n\t\t\tit(\"should fail\", func() {\n\t\t\t\terr := subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: \"Fake-Name\",\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: \"\"},\n\t\t\t\t\t},\n\t\t\t\t\tPublish: true,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"buildpack URI must be provided\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"can't download buildpack\", func() {\n\t\t\tit(\"should fail\", func() {\n\t\t\t\tbpURL := fmt.Sprintf(\"https://example.com/bp.%s.tgz\", h.RandString(12))\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), bpURL).Return(nil, image.ErrNotFound).AnyTimes()\n\n\t\t\t\terr := subject.PackageBuildpack(context.TODO(), 
client.PackageBuildpackOptions{\n\t\t\t\t\tName: \"Fake-Name\",\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpURL},\n\t\t\t\t\t},\n\t\t\t\t\tPublish: true,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"downloading buildpack\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"buildpack isn't a valid buildpack\", func() {\n\t\t\tit(\"should fail\", func() {\n\t\t\t\tfakeBlob := blob.NewBlob(filepath.Join(\"testdata\", \"empty-file\"))\n\t\t\t\tbpURL := fmt.Sprintf(\"https://example.com/bp.%s.tgz\", h.RandString(12))\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), bpURL).Return(fakeBlob, nil).AnyTimes()\n\n\t\t\t\terr := subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: \"Fake-Name\",\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpURL},\n\t\t\t\t\t},\n\t\t\t\t\tPublish: true,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"creating buildpack\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"dependencies have issues\", func() {\n\t\twhen(\"dependencies include a flawed packaged buildpack file\", func() {\n\t\t\tit(\"should fail\", func() {\n\t\t\t\tdependencyPath := \"http://example.com/flawed.file\"\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), dependencyPath).Return(blob.NewBlob(\"no-file.txt\"), nil).AnyTimes()\n\n\t\t\t\tmockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil).AnyTimes()\n\n\t\t\t\tpackageDescriptor := dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.nested\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\tOptional:   
false,\n\t\t\t\t\t\t}},\n\t\t\t\t\t}},\n\t\t\t\t}\n\n\t\t\t\terr := subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: \"test\",\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:     dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tBuildpack:    dist.BuildpackURI{URI: createBuildpack(packageDescriptor)},\n\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{BuildpackURI: dist.BuildpackURI{URI: dependencyPath}}},\n\t\t\t\t\t},\n\t\t\t\t\tPublish:    false,\n\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t})\n\n\t\t\t\th.AssertError(t, err, \"inspecting buildpack blob\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"FormatImage\", func() {\n\t\twhen(\"simple package for both OS formats (experimental only)\", func() {\n\t\t\tit(\"creates package image based on daemon OS\", func() {\n\t\t\t\tfor _, daemonOS := range []string{\"linux\", \"windows\"} {\n\t\t\t\t\tlocalMockDockerClient := testmocks.NewMockAPIClient(mockController)\n\t\t\t\t\tlocalMockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: daemonOS}}, nil).AnyTimes()\n\n\t\t\t\t\tpackClientWithExperimental, err := client.NewClient(\n\t\t\t\t\t\tclient.WithDockerClient(localMockDockerClient),\n\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tfakeImage := fakes.NewImage(\"basic/package-\"+h.RandString(12), \"\", nil)\n\t\t\t\t\tmockImageFactory.EXPECT().NewImage(fakeImage.Name(), true, dist.Target{OS: daemonOS}).Return(fakeImage, nil)\n\n\t\t\t\t\tfakeBlob := blob.NewBlob(filepath.Join(\"testdata\", \"empty-file\"))\n\t\t\t\t\tbpURL := fmt.Sprintf(\"https://example.com/bp.%s.tgz\", h.RandString(12))\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), bpURL).Return(fakeBlob, nil).AnyTimes()\n\n\t\t\t\t\th.AssertNil(t, 
packClientWithExperimental.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tFormat: client.FormatImage,\n\t\t\t\t\t\tName:   fakeImage.Name(),\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform: dist.Platform{OS: daemonOS},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.basic\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t\t\t})},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}))\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"fails without experimental on Windows daemons\", func() {\n\t\t\t\twindowsMockDockerClient := testmocks.NewMockAPIClient(mockController)\n\n\t\t\t\tpackClientWithoutExperimental, err := client.NewClient(\n\t\t\t\t\tclient.WithDockerClient(windowsMockDockerClient),\n\t\t\t\t\tclient.WithExperimental(false),\n\t\t\t\t)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = packClientWithoutExperimental.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform: dist.Platform{\n\t\t\t\t\t\t\tOS: \"windows\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"Windows buildpackage support is currently experimental.\")\n\t\t\t})\n\n\t\t\tit(\"fails for mismatched platform and daemon os\", func() {\n\t\t\t\twindowsMockDockerClient := testmocks.NewMockAPIClient(mockController)\n\t\t\t\twindowsMockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"windows\"}}, nil).AnyTimes()\n\n\t\t\t\tpackClientWithoutExperimental, err := client.NewClient(\n\t\t\t\t\tclient.WithDockerClient(windowsMockDockerClient),\n\t\t\t\t\tclient.WithExperimental(false),\n\t\t\t\t)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = 
packClientWithoutExperimental.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform: dist.Platform{\n\t\t\t\t\t\t\tOS: \"linux\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t})\n\n\t\t\t\th.AssertError(t, err, \"invalid 'platform.os' specified: DOCKER_OS is 'windows'\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"nested package lives in registry\", func() {\n\t\t\tvar nestedPackage *fakes.Image\n\n\t\t\tit.Before(func() {\n\t\t\t\tnestedPackage = fakes.NewImage(\"nested/package-\"+h.RandString(12), \"\", nil)\n\t\t\t\tmockImageFactory.EXPECT().NewImage(nestedPackage.Name(), false, dist.Target{OS: \"linux\"}).Return(nestedPackage, nil)\n\n\t\t\t\tmockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil).AnyTimes()\n\n\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: nestedPackage.Name(),\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.nested\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t\t})},\n\t\t\t\t\t},\n\t\t\t\t\tPublish:    true,\n\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t}))\n\t\t\t})\n\n\t\t\tshouldFetchNestedPackage := func(demon bool, pull image.PullPolicy) {\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), nestedPackage.Name(), image.FetchOptions{Daemon: demon, PullPolicy: pull, Target: &dist.Target{OS: \"linux\"}}).Return(nestedPackage, nil)\n\t\t\t}\n\n\t\t\tshouldNotFindNestedPackageWhenCallingImageFetcherWith := func(demon bool, pull image.PullPolicy) {\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), nestedPackage.Name(), image.FetchOptions{Daemon: 
demon, PullPolicy: pull, Target: &dist.Target{OS: \"linux\"}}).Return(nil, image.ErrNotFound)\n\t\t\t}\n\n\t\t\tshouldCreateLocalPackage := func() imgutil.Image {\n\t\t\t\timg := fakes.NewImage(\"some/package-\"+h.RandString(12), \"\", nil)\n\t\t\t\tmockImageFactory.EXPECT().NewImage(img.Name(), true, dist.Target{OS: \"linux\"}).Return(img, nil)\n\t\t\t\treturn img\n\t\t\t}\n\n\t\t\tshouldCreateRemotePackage := func() *fakes.Image {\n\t\t\t\timg := fakes.NewImage(\"some/package-\"+h.RandString(12), \"\", nil)\n\t\t\t\tmockImageFactory.EXPECT().NewImage(img.Name(), false, dist.Target{OS: \"linux\"}).Return(img, nil)\n\t\t\t\treturn img\n\t\t\t}\n\n\t\t\twhen(\"publish=false and pull-policy=always\", func() {\n\t\t\t\tit(\"should pull and use local nested package image\", func() {\n\t\t\t\t\tshouldFetchNestedPackage(true, image.PullAlways)\n\t\t\t\t\tpackageImage := shouldCreateLocalPackage()\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: packageImage.Name(),\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.nested\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t})},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{ImageRef: dist.ImageRef{ImageName: nestedPackage.Name()}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"publish=true and pull-policy=always\", func() {\n\t\t\t\tit(\"should use remote nested package image\", 
func() {\n\t\t\t\t\tshouldFetchNestedPackage(false, image.PullAlways)\n\t\t\t\t\tpackageImage := shouldCreateRemotePackage()\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: packageImage.Name(),\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.nested\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t})},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{ImageRef: dist.ImageRef{ImageName: nestedPackage.Name()}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    true,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"publish=true and pull-policy=never\", func() {\n\t\t\t\tit(\"should push to registry and not pull nested package image\", func() {\n\t\t\t\t\tshouldFetchNestedPackage(false, image.PullNever)\n\t\t\t\t\tpackageImage := shouldCreateRemotePackage()\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: packageImage.Name(),\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.nested\", Version: 
\"2.3.4\"},\n\t\t\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t\t})},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{ImageRef: dist.ImageRef{ImageName: nestedPackage.Name()}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    true,\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}))\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"publish=false pull-policy=never and there is no local image\", func() {\n\t\t\t\tit(\"should fail without trying to retrieve nested image from registry\", func() {\n\t\t\t\t\tshouldNotFindNestedPackageWhenCallingImageFetcherWith(true, image.PullNever)\n\n\t\t\t\t\th.AssertError(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: \"some/package\",\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t\t\t})},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{ImageRef: dist.ImageRef{ImageName: nestedPackage.Name()}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}), \"not found\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"nested package is not a valid package\", func() {\n\t\t\tit(\"should error\", func() {\n\t\t\t\tnotPackageImage := fakes.NewImage(\"not/package\", \"\", nil)\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), notPackageImage.Name(), image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways, Target: &dist.Target{OS: \"linux\"}}).Return(notPackageImage, nil)\n\n\t\t\t\tmockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil).AnyTimes()\n\n\t\t\t\th.AssertError(t, 
subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: \"some/package\",\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t\t})},\n\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{ImageRef: dist.ImageRef{ImageName: notPackageImage.Name()}}},\n\t\t\t\t\t},\n\t\t\t\t\tPublish:    false,\n\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t}), \"extracting buildpacks from 'not/package': could not find label 'io.buildpacks.buildpackage.metadata'\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"flatten option is set\", func() {\n\t\t\t/*       1\n\t\t\t *    /    \\\n\t\t\t *   2      3\n\t\t\t *         /  \\\n\t\t\t *        4     5\n\t\t\t *\t          /  \\\n\t\t\t *           6   7\n\t\t\t */\n\t\t\tvar (\n\t\t\t\tfakeLayerImage          *h.FakeAddedLayerImage\n\t\t\t\topts                    client.PackageBuildpackOptions\n\t\t\t\tmockBuildpackDownloader *testmocks.MockBuildpackDownloader\n\t\t\t)\n\n\t\t\tvar successfullyCreateFlattenPackage = func() {\n\t\t\t\tt.Helper()\n\t\t\t\terr := subject.PackageBuildpack(context.TODO(), opts)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertEq(t, fakeLayerImage.IsSaved(), true)\n\t\t\t}\n\n\t\t\tit.Before(func() {\n\t\t\t\tmockBuildpackDownloader = testmocks.NewMockBuildpackDownloader(mockController)\n\n\t\t\t\tvar err error\n\t\t\t\tsubject, err = client.NewClient(\n\t\t\t\t\tclient.WithLogger(logging.NewLogWithWriters(&out, 
&out)),\n\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\t\t\tclient.WithDockerClient(mockDockerClient),\n\t\t\t\t\tclient.WithBuildpackDownloader(mockBuildpackDownloader),\n\t\t\t\t)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tmockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil).AnyTimes()\n\n\t\t\t\tname := \"basic/package-\" + h.RandString(12)\n\t\t\t\tfakeImage := fakes.NewImage(name, \"\", nil)\n\t\t\t\tfakeLayerImage = &h.FakeAddedLayerImage{Image: fakeImage}\n\t\t\t\tmockImageFactory.EXPECT().NewImage(fakeLayerImage.Name(), true, dist.Target{OS: \"linux\"}).Return(fakeLayerImage, nil)\n\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), name, gomock.Any()).Return(fakeLayerImage, nil).AnyTimes()\n\n\t\t\t\tblob1 := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-flatten\", \"buildpack-1\"))\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/flatten-bp-1.tgz\").Return(blob1, nil).AnyTimes()\n\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(blob1, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/flatten-bp-1.tgz\", gomock.Any()).Return(bp, nil, nil).AnyTimes()\n\n\t\t\t\t// flatten buildpack 2\n\t\t\t\tblob2 := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-flatten\", \"buildpack-2\"))\n\t\t\t\tbp2, err := buildpack.FromBuildpackRootBlob(blob2, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/flatten-bp-2.tgz\", gomock.Any()).Return(bp2, nil, nil).AnyTimes()\n\n\t\t\t\t// flatten buildpack 3\n\t\t\t\tblob3 := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-flatten\", \"buildpack-3\"))\n\t\t\t\tbp3, err := 
buildpack.FromBuildpackRootBlob(blob3, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tvar depBPs []buildpack.BuildModule\n\t\t\t\tfor i := 4; i <= 7; i++ {\n\t\t\t\t\tb := blob.NewBlob(filepath.Join(\"testdata\", \"buildpack-flatten\", fmt.Sprintf(\"buildpack-%d\", i)))\n\t\t\t\t\tbp, err := buildpack.FromBuildpackRootBlob(b, archive.DefaultTarWriterFactory(), nil)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tdepBPs = append(depBPs, bp)\n\t\t\t\t}\n\t\t\t\tmockBuildpackDownloader.EXPECT().Download(gomock.Any(), \"https://example.fake/flatten-bp-3.tgz\", gomock.Any()).Return(bp3, depBPs, nil).AnyTimes()\n\n\t\t\t\topts = client.PackageBuildpackOptions{\n\t\t\t\t\tFormat: client.FormatImage,\n\t\t\t\t\tName:   fakeLayerImage.Name(),\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: \"https://example.fake/flatten-bp-1.tgz\"},\n\t\t\t\t\t\tDependencies: []dist.ImageOrURI{\n\t\t\t\t\t\t\t{BuildpackURI: dist.BuildpackURI{URI: \"https://example.fake/flatten-bp-2.tgz\"}},\n\t\t\t\t\t\t\t{BuildpackURI: dist.BuildpackURI{URI: \"https://example.fake/flatten-bp-3.tgz\"}},\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\tFlatten:    true,\n\t\t\t\t}\n\t\t\t})\n\n\t\t\twhen(\"flatten all\", func() {\n\t\t\t\tit(\"creates package image with all dependencies\", func() {\n\t\t\t\t\tsuccessfullyCreateFlattenPackage()\n\n\t\t\t\t\tlayers := fakeLayerImage.AddedLayersOrder()\n\t\t\t\t\th.AssertEq(t, len(layers), 1)\n\t\t\t\t})\n\n\t\t\t\t// TODO add test case for flatten all with --flatten-exclude\n\t\t\t})\n\t\t})\n\n\t\twhen(\"multi-platform\", func() {\n\t\t\tvar (\n\t\t\t\tindex          *h.MockImageIndex\n\t\t\t\tindexLocalPath string\n\t\t\t\ttargets        []dist.Target\n\t\t\t\tbpPathURI      string\n\t\t\t\trepoName       string\n\t\t\t\ttmpDir         string\n\t\t\t\terr            error\n\t\t\t)\n\n\t\t\tit.Before(func() 
{\n\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"package-buildpack-multi-platform\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, os.Setenv(\"XDG_RUNTIME_DIR\", tmpDir))\n\n\t\t\t\trepoName = \"basic/multi-platform-package-\" + h.RandString(12)\n\t\t\t\tindexLocalPath = filepath.Join(tmpDir, imgutil.MakeFileSafeName(repoName))\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\tos.Remove(tmpDir)\n\t\t\t})\n\n\t\t\twhen(\"simple buildpack\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\t// index stub returned to check if push operation was called\n\t\t\t\t\tindex = h.NewMockImageIndex(t, repoName, 0, 0)\n\n\t\t\t\t\t// We need to mock the index factory to inject a stub index to be pushed.\n\t\t\t\t\tmockIndexFactory.EXPECT().Exists(gomock.Eq(repoName)).Return(false)\n\t\t\t\t\tmockIndexFactory.EXPECT().CreateIndex(gomock.Eq(repoName), gomock.Any()).Return(index, nil)\n\t\t\t\t})\n\n\t\t\t\twhen(\"folder structure doesn't follow multi-platform convention\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-old-format\")\n\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\n\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, destBpPath)\n\t\t\t\t\t\tprepareExpectedMultiPlaformImages(t, mockImageFactory, mockImageFetcher, repoName, dist.Target{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\texpectedMultiPlatformImage{digest: newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda34\")})\n\t\t\t\t\t\tprepareExpectedMultiPlaformImages(t, mockImageFactory, mockImageFetcher, repoName, dist.Target{OS: \"linux\", Arch: \"arm\"},\n\t\t\t\t\t\t\texpectedMultiPlatformImage{digest: newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda35\")})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates a multi-platform buildpack and pushes it to a registry\", func() {\n\t\t\t\t\t\t// Define targets we 
want to package\n\t\t\t\t\t\ttargets = []dist.Target{{OS: \"linux\", Arch: \"amd64\"}, {OS: \"linux\", Arch: \"arm\"}}\n\n\t\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\tPublish:         true,\n\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\tTargets:   []dist.Target{},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t}))\n\n\t\t\t\t\t\t// index is not saved locally\n\t\t\t\t\t\th.AssertPathDoesNotExists(t, indexLocalPath)\n\n\t\t\t\t\t\t// Push operation was done\n\t\t\t\t\t\th.AssertTrue(t, index.PushCalled)\n\t\t\t\t\t\th.AssertTrue(t, index.PurgeOption)\n\n\t\t\t\t\t\t// index has the two expected manifests amd64 and arm\n\t\t\t\t\t\tindexManifest, err := index.IndexManifest()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, len(indexManifest.Manifests), 2)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"folder structure follows multi-platform convention\", func() {\n\t\t\t\t\twhen(\"os/arch is used\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-new-format\")\n\n\t\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"amd64\"))\n\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"arm\"))\n\n\t\t\t\t\t\t\tprepareExpectedMultiPlaformImages(t, mockImageFactory, mockImageFetcher, repoName, dist.Target{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\texpectedMultiPlatformImage{digest: newDigest(t, repoName, 
\"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda34\")})\n\n\t\t\t\t\t\t\tprepareExpectedMultiPlaformImages(t, mockImageFactory, mockImageFetcher, repoName, dist.Target{OS: \"linux\", Arch: \"arm\"},\n\t\t\t\t\t\t\t\texpectedMultiPlatformImage{digest: newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda35\")})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"creates a multi-platform buildpack and pushes it to a registry\", func() {\n\t\t\t\t\t\t\t// Define targets we want to package\n\t\t\t\t\t\t\ttargets = []dist.Target{{OS: \"linux\", Arch: \"amd64\"}, {OS: \"linux\", Arch: \"arm\"}}\n\n\t\t\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\tPublish:         true,\n\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\tTargets:   []dist.Target{},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t}))\n\n\t\t\t\t\t\t\t// index is not saved locally\n\t\t\t\t\t\t\th.AssertPathDoesNotExists(t, indexLocalPath)\n\n\t\t\t\t\t\t\t// Push operation was done\n\t\t\t\t\t\t\th.AssertTrue(t, index.PushCalled)\n\t\t\t\t\t\t\th.AssertTrue(t, index.PurgeOption)\n\n\t\t\t\t\t\t\t// index has the two expected manifests amd64 and arm\n\t\t\t\t\t\t\tindexManifest, err := index.IndexManifest()\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertEq(t, len(indexManifest.Manifests), 2)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"os/arch/variant/name@version is used\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-new-format-with-versions\")\n\n\t\t\t\t\t\t\tbpPathURI, err = 
paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"amd64\", \"v5\", \"ubuntu@18.01\"))\n\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"amd64\", \"v5\", \"ubuntu@21.01\"))\n\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"arm\", \"v6\", \"ubuntu@18.01\"))\n\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"arm\", \"v6\", \"ubuntu@21.01\"))\n\n\t\t\t\t\t\t\tprepareExpectedMultiPlaformImages(t, mockImageFactory, mockImageFetcher, repoName, dist.Target{OS: \"linux\", Arch: \"amd64\", ArchVariant: \"v5\", Distributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t{Name: \"ubuntu\", Version: \"21.01\"}}}, expectedMultiPlatformImage{digest: newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda34\")})\n\n\t\t\t\t\t\t\tprepareExpectedMultiPlaformImages(t, mockImageFactory, mockImageFetcher, repoName, dist.Target{OS: \"linux\", Arch: \"amd64\", ArchVariant: \"v5\", Distributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t{Name: \"ubuntu\", Version: \"18.01\"}}}, expectedMultiPlatformImage{digest: newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda35\")})\n\n\t\t\t\t\t\t\tprepareExpectedMultiPlaformImages(t, mockImageFactory, mockImageFetcher, repoName, dist.Target{OS: \"linux\", Arch: \"arm\", ArchVariant: \"v6\", Distributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t{Name: \"ubuntu\", Version: \"18.01\"}}}, expectedMultiPlatformImage{digest: newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda36\")})\n\n\t\t\t\t\t\t\tprepareExpectedMultiPlaformImages(t, mockImageFactory, mockImageFetcher, repoName, dist.Target{OS: \"linux\", Arch: \"arm\", ArchVariant: 
\"v6\", Distributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t{Name: \"ubuntu\", Version: \"21.01\"}}}, expectedMultiPlatformImage{digest: newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda37\")})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"creates a multi-platform buildpack and pushes it to a registry\", func() {\n\t\t\t\t\t\t\t// Define targets we want to package\n\t\t\t\t\t\t\ttargets = []dist.Target{{OS: \"linux\", Arch: \"amd64\", ArchVariant: \"v5\",\n\t\t\t\t\t\t\t\tDistributions: []dist.Distribution{{Name: \"ubuntu\", Version: \"18.01\"}, {Name: \"ubuntu\", Version: \"21.01\"}}},\n\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm\", ArchVariant: \"v6\", Distributions: []dist.Distribution{{Name: \"ubuntu\", Version: \"18.01\"}, {Name: \"ubuntu\", Version: \"21.01\"}}}}\n\n\t\t\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\tPublish:         true,\n\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\tTargets:   []dist.Target{},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t}))\n\n\t\t\t\t\t\t\t// index is not saved locally\n\t\t\t\t\t\t\th.AssertPathDoesNotExists(t, indexLocalPath)\n\n\t\t\t\t\t\t\t// Push operation was done\n\t\t\t\t\t\t\th.AssertTrue(t, index.PushCalled)\n\t\t\t\t\t\t\th.AssertTrue(t, index.PurgeOption)\n\n\t\t\t\t\t\t\t// index has the four expected manifests, two for each architecture\n\t\t\t\t\t\t\tindexManifest, err := index.IndexManifest()\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertEq(t, len(indexManifest.Manifests), 4)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"composite buildpack\", func() {\n\t\t\t\tvar 
(\n\t\t\t\t\ttarget1 dist.Target\n\t\t\t\t\tbp1URI  string\n\t\t\t\t\ttarget2 dist.Target\n\t\t\t\t\tbp2URI  string\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tbp1URI = \"localhost:3333/bp-1\"\n\t\t\t\t\ttarget1 = dist.Target{OS: \"linux\", Arch: \"amd64\"}\n\n\t\t\t\t\tbp2URI = \"localhost:3333/bp-2\"\n\t\t\t\t\ttarget2 = dist.Target{OS: \"linux\", Arch: \"arm\"}\n\t\t\t\t})\n\n\t\t\t\twhen(\"dependencies are saved on a registry\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// Check testdata/buildpack-multi-platform/buildpack-composite for configuration details\n\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-composite\")\n\n\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, destBpPath)\n\n\t\t\t\t\t\tindexAMD64Digest := newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda40\")\n\t\t\t\t\t\tprepareRemoteMultiPlatformBuildpackPackage(t, mockImageFactory, mockImageFetcher, repoName, indexAMD64Digest, target1, []expectedMultiPlatformImage{\n\t\t\t\t\t\t\t{digest: newDigest(t, bp1URI, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda34\"), id: \"samples/bp-1\", version: \"0.0.1\", bpURI: bp1URI},\n\t\t\t\t\t\t\t{digest: newDigest(t, bp2URI, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda35\"), id: \"samples/bp-2\", version: \"0.0.1\", bpURI: bp2URI},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tindexARMDigest := newDigest(t, repoName, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda41\")\n\t\t\t\t\t\tprepareRemoteMultiPlatformBuildpackPackage(t, mockImageFactory, mockImageFetcher, repoName, indexARMDigest, target2, []expectedMultiPlatformImage{\n\t\t\t\t\t\t\t{digest: newDigest(t, bp1URI, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda36\"), id: \"samples/bp-1\", 
version: \"0.0.1\", bpURI: bp1URI},\n\t\t\t\t\t\t\t{digest: newDigest(t, bp2URI, \"sha256:b9d056b83bb6446fee29e89a7fcf10203c562c1f59586a6e2f39c903597bda37\"), id: \"samples/bp-2\", version: \"0.0.1\", bpURI: bp2URI},\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\t// Define expected targets to package\n\t\t\t\t\t\ttargets = []dist.Target{target1, target2}\n\n\t\t\t\t\t\t// index stub returned to check if push operation was called\n\t\t\t\t\t\tindex = h.NewMockImageIndex(t, repoName, 0, 0)\n\n\t\t\t\t\t\t// We need to mock the index factory to inject a stub index to be pushed.\n\t\t\t\t\t\tmockIndexFactory.EXPECT().Exists(gomock.Eq(repoName)).Return(false)\n\t\t\t\t\t\tmockIndexFactory.EXPECT().CreateIndex(gomock.Eq(repoName), gomock.Any()).Return(index, nil)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"creates a multi-platform buildpack and pushes it to a registry\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\tPublish:         true,\n\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{\n\t\t\t\t\t\t\t\t\t{BuildpackURI: dist.BuildpackURI{URI: bp1URI}},\n\t\t\t\t\t\t\t\t\t{BuildpackURI: dist.BuildpackURI{URI: bp2URI}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tTargets: []dist.Target{},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tTargets: targets,\n\t\t\t\t\t\t}))\n\n\t\t\t\t\t\t// index is not saved locally\n\t\t\t\t\t\th.AssertPathDoesNotExists(t, indexLocalPath)\n\n\t\t\t\t\t\t// Push operation was done\n\t\t\t\t\t\th.AssertTrue(t, index.PushCalled)\n\t\t\t\t\t\th.AssertTrue(t, index.PurgeOption)\n\n\t\t\t\t\t\t// index has the two expected manifests amd64 and arm\n\t\t\t\t\t\tindexManifest, err := index.IndexManifest()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, 
len(indexManifest.Manifests), 2)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"dependencies are on disk\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// Check testdata/buildpack-multi-platform/buildpack-composite for configuration details\n\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-composite-with-dependencies-on-disk\")\n\n\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, destBpPath)\n\n\t\t\t\t\t\tbp1URI = filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-new-format\")\n\n\t\t\t\t\t\t// Define expected targets to package\n\t\t\t\t\t\ttargets = []dist.Target{target1, target2}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"errors with a message\", func() {\n\t\t\t\t\t\t// If dependencies point to a file or a URL like https://example.com/buildpack.tgz\n\t\t\t\t\t\t// we will need to define some conventions to fetch by target\n\t\t\t\t\t\t// The OCI registry already solved the problem, that's why we do not allow this path for now\n\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\tPublish:         true,\n\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{\n\t\t\t\t\t\t\t\t\t{BuildpackURI: dist.BuildpackURI{URI: bp1URI}},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\tTargets: []dist.Target{},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tTargets: targets,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t\t\th.AssertError(t, err, \"is not allowed when creating a composite multi-platform buildpack; push your dependencies to a registry and use 'docker://<image>' 
instead\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"daemon target selection\", func() {\n\t\t\t\t\twhen(\"publish is false\", func() {\n\t\t\t\t\t\twhen(\"daemon is linux/amd64\", func() {\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().ServerVersion(gomock.Any(), gomock.Any()).Return(dockerclient.ServerVersionResult{\n\t\t\t\t\t\t\t\t\tOs:   \"linux\",\n\t\t\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t\t\t}, nil).AnyTimes()\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"targets include exact match\", func() {\n\t\t\t\t\t\t\t\tit(\"selects the exact OS and architecture match\", func() {\n\t\t\t\t\t\t\t\t\t// Prepare buildpack\n\t\t\t\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-new-format\")\n\t\t\t\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\t\t// The code will check for platform-specific folder and download from there\n\t\t\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"amd64\"))\n\n\t\t\t\t\t\t\t\t\t// Mock docker info for validateOSPlatform\n\t\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().Info(gomock.Any(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil)\n\n\t\t\t\t\t\t\t\t\t// Mock expectations for the selected target\n\t\t\t\t\t\t\t\t\tfakeImage := fakes.NewImage(repoName, \"\", nil)\n\t\t\t\t\t\t\t\t\tmockImageFactory.EXPECT().NewImage(repoName, true, dist.Target{OS: \"linux\", Arch: \"amd64\"}).Return(fakeImage, nil)\n\n\t\t\t\t\t\t\t\t\ttargets := []dist.Target{\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"}, // exact match\n\t\t\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\t\tFormat:          
client.FormatImage,\n\t\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\t// Verify the image was saved (indicates successful packaging)\n\t\t\t\t\t\t\t\t\th.AssertEq(t, fakeImage.IsSaved(), true)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"targets only have OS match with different architectures\", func() {\n\t\t\t\t\t\t\t\tit(\"returns error when no architecture matches\", func() {\n\t\t\t\t\t\t\t\t\ttargets := []dist.Target{\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm\"},\n\t\t\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\terr := subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: \"some-bp-uri\"},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertError(t, err, \"could not find a target that matches daemon os=linux and architecture=amd64\")\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"targets have OS match with empty architecture\", func() {\n\t\t\t\t\t\t\t\tit(\"selects the target with matching OS and empty architecture\", func() {\n\t\t\t\t\t\t\t\t\t// Prepare buildpack\n\t\t\t\t\t\t\t\t\tdestBpPath := 
filepath.Join(\"testdata\", \"buildpack\")\n\t\t\t\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, destBpPath)\n\n\t\t\t\t\t\t\t\t\t// Mock docker info for validateOSPlatform\n\t\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().Info(gomock.Any(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil)\n\n\t\t\t\t\t\t\t\t\t// Mock expectations for the selected target\n\t\t\t\t\t\t\t\t\tfakeImage := fakes.NewImage(repoName, \"\", nil)\n\t\t\t\t\t\t\t\t\tmockImageFactory.EXPECT().NewImage(repoName, true, dist.Target{OS: \"linux\", Arch: \"\"}).Return(fakeImage, nil)\n\n\t\t\t\t\t\t\t\t\ttargets := []dist.Target{\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"\"}, // OS match with empty arch\n\t\t\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\t// Verify the image was saved\n\t\t\t\t\t\t\t\t\th.AssertEq(t, fakeImage.IsSaved(), true)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"multiple targets match\", func() {\n\t\t\t\t\t\t\t\tit(\"selects the first exact match\", func() {\n\t\t\t\t\t\t\t\t\t// Prepare buildpack\n\t\t\t\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", 
\"buildpack-new-format\")\n\t\t\t\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\t\t// The code will check for platform-specific folder and download from there\n\t\t\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"amd64\"))\n\n\t\t\t\t\t\t\t\t\t// Mock docker info for validateOSPlatform\n\t\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().Info(gomock.Any(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil)\n\n\t\t\t\t\t\t\t\t\t// Mock expectations for the selected target\n\t\t\t\t\t\t\t\t\tfakeImage := fakes.NewImage(repoName, \"\", nil)\n\t\t\t\t\t\t\t\t\tmockImageFactory.EXPECT().NewImage(repoName, true, dist.Target{OS: \"linux\", Arch: \"amd64\", ArchVariant: \"v1\"}).Return(fakeImage, nil)\n\n\t\t\t\t\t\t\t\t\ttargets := []dist.Target{\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\", ArchVariant: \"v1\"}, // first exact match\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\", ArchVariant: \"v2\"}, // second exact match\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"\"},\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\t// Verify the image was saved\n\t\t\t\t\t\t\t\t\th.AssertEq(t, fakeImage.IsSaved(), 
true)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"daemon is linux/arm64\", func() {\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().ServerVersion(gomock.Any(), gomock.Any()).Return(dockerclient.ServerVersionResult{\n\t\t\t\t\t\t\t\t\tOs:   \"linux\",\n\t\t\t\t\t\t\t\t\tArch: \"arm64\",\n\t\t\t\t\t\t\t\t}, nil).AnyTimes()\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"targets are ordered with amd64 first\", func() {\n\t\t\t\t\t\t\t\tit(\"selects arm64 even when amd64 appears first\", func() {\n\t\t\t\t\t\t\t\t\t// Prepare buildpack\n\t\t\t\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-new-format\")\n\t\t\t\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\t\t// The code will check for platform-specific folder and download from there\n\t\t\t\t\t\t\t\t\t// Mock both paths as PlatformRootFolder returns /linux when it exists\n\t\t\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\"))\n\t\t\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"arm\"))\n\n\t\t\t\t\t\t\t\t\t// Mock docker info for validateOSPlatform\n\t\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().Info(gomock.Any(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil)\n\n\t\t\t\t\t\t\t\t\t// Mock expectations for the selected target\n\t\t\t\t\t\t\t\t\tfakeImage := fakes.NewImage(repoName, \"\", nil)\n\t\t\t\t\t\t\t\t\tmockImageFactory.EXPECT().NewImage(repoName, true, dist.Target{OS: \"linux\", Arch: \"arm64\"}).Return(fakeImage, nil)\n\n\t\t\t\t\t\t\t\t\ttargets := []dist.Target{\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"}, // appears first but wrong arch\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"}, // exact match\n\t\t\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: 
\"arm64\"},\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\t// Verify the image was saved\n\t\t\t\t\t\t\t\t\th.AssertEq(t, fakeImage.IsSaved(), true)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"only amd64 targets available\", func() {\n\t\t\t\t\t\t\t\tit(\"returns error\", func() {\n\t\t\t\t\t\t\t\t\ttargets := []dist.Target{\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\terr := subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: \"some-bp-uri\"},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertError(t, err, \"could not find a target that matches daemon os=linux and architecture=arm64\")\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"daemon is windows/amd64\", func() {\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().ServerVersion(gomock.Any(), 
gomock.Any()).Return(dockerclient.ServerVersionResult{\n\t\t\t\t\t\t\t\t\tOs:   \"windows\",\n\t\t\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t\t\t}, nil).AnyTimes()\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\twhen(\"targets include windows\", func() {\n\t\t\t\t\t\t\t\tit(\"selects windows/amd64\", func() {\n\t\t\t\t\t\t\t\t\t// Create a Windows-compatible client\n\t\t\t\t\t\t\t\t\twindowsClient, err := client.NewClient(\n\t\t\t\t\t\t\t\t\t\tclient.WithDockerClient(mockDockerClient),\n\t\t\t\t\t\t\t\t\t\tclient.WithLogger(logging.NewLogWithWriters(&out, &out)),\n\t\t\t\t\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\t\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\t\t\t\t\t\t\tclient.WithIndexFactory(mockIndexFactory),\n\t\t\t\t\t\t\t\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\t\t\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t\t\t\t\t)\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\t// Prepare buildpack\n\t\t\t\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack\")\n\t\t\t\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, destBpPath)\n\n\t\t\t\t\t\t\t\t\t// Mock docker info for validateOSPlatform\n\t\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().Info(gomock.Any(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"windows\"}}, nil)\n\n\t\t\t\t\t\t\t\t\t// Mock expectations for the selected target\n\t\t\t\t\t\t\t\t\tfakeImage := fakes.NewImage(repoName, \"\", nil)\n\t\t\t\t\t\t\t\t\tmockImageFactory.EXPECT().NewImage(repoName, true, dist.Target{OS: \"windows\", Arch: \"amd64\"}).Return(fakeImage, nil)\n\n\t\t\t\t\t\t\t\t\ttargets := []dist.Target{\n\t\t\t\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t\t\t\t{OS: \"windows\", Arch: \"amd64\"}, // exact match\n\t\t\t\t\t\t\t\t\t\t{OS: \"darwin\", Arch: 
\"amd64\"},\n\t\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\t\terr = windowsClient.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t\t// Verify the image was saved\n\t\t\t\t\t\t\t\t\th.AssertEq(t, fakeImage.IsSaved(), true)\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"targets with distributions\", func() {\n\t\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().ServerVersion(gomock.Any(), gomock.Any()).Return(dockerclient.ServerVersionResult{\n\t\t\t\t\t\t\t\t\tOs:   \"linux\",\n\t\t\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t\t\t}, nil).AnyTimes()\n\t\t\t\t\t\t\t})\n\n\t\t\t\t\t\t\tit(\"selects target ignoring distributions\", func() {\n\t\t\t\t\t\t\t\t// Prepare buildpack\n\t\t\t\t\t\t\t\tdestBpPath := filepath.Join(\"testdata\", \"buildpack-multi-platform\", \"buildpack-new-format\")\n\t\t\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(destBpPath, \"\")\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(destBpPath, \"linux\", \"amd64\"))\n\n\t\t\t\t\t\t\t\t// Mock docker info for validateOSPlatform\n\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().Info(gomock.Any(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil)\n\n\t\t\t\t\t\t\t\t// Mock expectations for the selected target\n\t\t\t\t\t\t\t\tfakeImage := fakes.NewImage(repoName, \"\", 
nil)\n\t\t\t\t\t\t\t\tmockImageFactory.EXPECT().NewImage(repoName, true, dist.Target{\n\t\t\t\t\t\t\t\t\tOS:   \"linux\",\n\t\t\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t\t\t\tDistributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t\t\t{Name: \"ubuntu\", Version: \"22.04\"},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t}).Return(fakeImage, nil)\n\n\t\t\t\t\t\t\t\ttargets := []dist.Target{\n\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\tOS:   \"linux\",\n\t\t\t\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t\t\t\t\tDistributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t\t\t\t{Name: \"ubuntu\", Version: \"22.04\"},\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tTargets:    targets,\n\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t// Verify the image was saved\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeImage.IsSaved(), true)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\twhen(\"empty targets list\", func() {\n\t\t\t\t\t\t\tit(\"uses default behavior without calling daemonTarget\", func() {\n\t\t\t\t\t\t\t\t// Prepare buildpack\n\t\t\t\t\t\t\t\tbpPathURI, err = paths.FilePathToURI(filepath.Join(\"testdata\", \"buildpack\"), \"\")\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\t\tprepareDownloadedBuildpackBlobAtURI(t, mockDownloader, filepath.Join(\"testdata\", \"buildpack\"))\n\n\t\t\t\t\t\t\t\t// Mock expectations - ServerVersion should NOT be called\n\t\t\t\t\t\t\t\t// as daemonTarget is not invoked for empty 
targets\n\t\t\t\t\t\t\t\tmockDockerClient.EXPECT().Info(gomock.Any(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil)\n\t\t\t\t\t\t\t\tfakeImage := fakes.NewImage(repoName, \"\", nil)\n\t\t\t\t\t\t\t\tmockImageFactory.EXPECT().NewImage(repoName, true, dist.Target{OS: \"linux\"}).Return(fakeImage, nil)\n\n\t\t\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\t\t\tFormat:          client.FormatImage,\n\t\t\t\t\t\t\t\t\tPublish:         false,\n\t\t\t\t\t\t\t\t\tRelativeBaseDir: \"\",\n\t\t\t\t\t\t\t\t\tName:            repoName,\n\t\t\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: bpPathURI},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tTargets:    []dist.Target{}, // empty targets\n\t\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t\t})\n\t\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\t\t// Verify the image was saved\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeImage.IsSaved(), true)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"FormatFile\", func() {\n\t\twhen(\"simple package for both OS formats (experimental only)\", func() {\n\t\t\tit(\"creates package image in either OS format\", func() {\n\t\t\t\ttmpDir, err := os.MkdirTemp(\"\", \"package-buildpack\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer os.RemoveAll(tmpDir)\n\n\t\t\t\tfor _, imageOS := range []string{\"linux\", \"windows\"} {\n\t\t\t\t\tlocalMockDockerClient := testmocks.NewMockAPIClient(mockController)\n\t\t\t\t\tlocalMockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: imageOS}}, nil).AnyTimes()\n\n\t\t\t\t\tpackClientWithExperimental, err := 
client.NewClient(\n\t\t\t\t\t\tclient.WithDockerClient(localMockDockerClient),\n\t\t\t\t\t\tclient.WithLogger(logging.NewLogWithWriters(&out, &out)),\n\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tfakeBlob := blob.NewBlob(filepath.Join(\"testdata\", \"empty-file\"))\n\t\t\t\t\tbpURL := fmt.Sprintf(\"https://example.com/bp.%s.tgz\", h.RandString(12))\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), bpURL).Return(fakeBlob, nil).AnyTimes()\n\n\t\t\t\t\tpackagePath := filepath.Join(tmpDir, h.RandString(12)+\"-test.cnb\")\n\t\t\t\t\th.AssertNil(t, packClientWithExperimental.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tFormat: client.FormatFile,\n\t\t\t\t\t\tName:   packagePath,\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform: dist.Platform{OS: imageOS},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.basic\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t\t\t})},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}))\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\n\t\twhen(\"nested package\", func() {\n\t\t\tvar (\n\t\t\t\tnestedPackage     *fakes.Image\n\t\t\t\tchildDescriptor   dist.BuildpackDescriptor\n\t\t\t\tpackageDescriptor dist.BuildpackDescriptor\n\t\t\t\ttmpDir            string\n\t\t\t\terr               error\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\tchildDescriptor = dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.nested\", Version: \"2.3.4\"},\n\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t}\n\n\t\t\t\tpackageDescriptor = dist.BuildpackDescriptor{\n\t\t\t\t\tWithAPI:  
api.MustParse(\"0.2\"),\n\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"bp.nested\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t}},\n\t\t\t\t\t}},\n\t\t\t\t}\n\n\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"package-buildpack\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.AssertNil(t, os.RemoveAll(tmpDir))\n\t\t\t})\n\n\t\t\twhen(\"dependencies are packaged buildpack image\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tnestedPackage = fakes.NewImage(\"nested/package-\"+h.RandString(12), \"\", nil)\n\t\t\t\t\tmockImageFactory.EXPECT().NewImage(nestedPackage.Name(), false, dist.Target{OS: \"linux\"}).Return(nestedPackage, nil)\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: nestedPackage.Name(),\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(childDescriptor)},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    true,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}))\n\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), nestedPackage.Name(), image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways, Target: &dist.Target{OS: \"linux\"}}).Return(nestedPackage, nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should pull and use local nested package image\", func() {\n\t\t\t\t\tpackagePath := filepath.Join(tmpDir, \"test.cnb\")\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: packagePath,\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform:     dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack:    dist.BuildpackURI{URI: createBuildpack(packageDescriptor)},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{ImageRef: 
dist.ImageRef{ImageName: nestedPackage.Name()}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\tFormat:     client.FormatFile,\n\t\t\t\t\t}))\n\n\t\t\t\t\tassertPackageBPFileHasBuildpacks(t, packagePath, []dist.BuildpackDescriptor{packageDescriptor, childDescriptor})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"dependencies are unpackaged buildpack\", func() {\n\t\t\t\tit(\"should work\", func() {\n\t\t\t\t\tpackagePath := filepath.Join(tmpDir, \"test.cnb\")\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: packagePath,\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform:     dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack:    dist.BuildpackURI{URI: createBuildpack(packageDescriptor)},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{BuildpackURI: dist.BuildpackURI{URI: createBuildpack(childDescriptor)}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\tFormat:     client.FormatFile,\n\t\t\t\t\t}))\n\n\t\t\t\t\tassertPackageBPFileHasBuildpacks(t, packagePath, []dist.BuildpackDescriptor{packageDescriptor, childDescriptor})\n\t\t\t\t})\n\n\t\t\t\twhen(\"dependency download fails\", func() {\n\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\tbpURL := fmt.Sprintf(\"https://example.com/bp.%s.tgz\", h.RandString(12))\n\t\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), bpURL).Return(nil, image.ErrNotFound).AnyTimes()\n\n\t\t\t\t\t\tpackagePath := filepath.Join(tmpDir, \"test.cnb\")\n\n\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\tName: packagePath,\n\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\tPlatform:     dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\t\tBuildpack:    dist.BuildpackURI{URI: createBuildpack(packageDescriptor)},\n\t\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{BuildpackURI: 
dist.BuildpackURI{URI: bpURL}}},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t\tFormat:     client.FormatFile,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertError(t, err, \"downloading buildpack\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"dependency isn't a valid buildpack\", func() {\n\t\t\t\t\tit(\"should error\", func() {\n\t\t\t\t\t\tfakeBlob := blob.NewBlob(filepath.Join(\"testdata\", \"empty-file\"))\n\t\t\t\t\t\tbpURL := fmt.Sprintf(\"https://example.com/bp.%s.tgz\", h.RandString(12))\n\t\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), bpURL).Return(fakeBlob, nil).AnyTimes()\n\n\t\t\t\t\t\tpackagePath := filepath.Join(tmpDir, \"test.cnb\")\n\n\t\t\t\t\t\terr = subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\t\tName: packagePath,\n\t\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\t\tPlatform:     dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\t\tBuildpack:    dist.BuildpackURI{URI: createBuildpack(packageDescriptor)},\n\t\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{BuildpackURI: dist.BuildpackURI{URI: bpURL}}},\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t\tFormat:     client.FormatFile,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertError(t, err, \"packaging dependencies\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"dependencies include packaged buildpack image and unpacked buildpack\", func() {\n\t\t\t\tvar secondChildDescriptor dist.BuildpackDescriptor\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tsecondChildDescriptor = dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.nested1\", Version: \"2.3.4\"},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t}\n\n\t\t\t\t\tpackageDescriptor.WithOrder = append(packageDescriptor.Order(), dist.OrderEntry{Group: []dist.ModuleRef{{\n\t\t\t\t\t\tModuleInfo: 
dist.ModuleInfo{ID: secondChildDescriptor.Info().ID, Version: secondChildDescriptor.Info().Version},\n\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t}}})\n\n\t\t\t\t\tnestedPackage = fakes.NewImage(\"nested/package-\"+h.RandString(12), \"\", nil)\n\t\t\t\t\tmockImageFactory.EXPECT().NewImage(nestedPackage.Name(), false, dist.Target{OS: \"linux\"}).Return(nestedPackage, nil)\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: nestedPackage.Name(),\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(childDescriptor)},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    true,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t}))\n\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), nestedPackage.Name(), image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways, Target: &dist.Target{OS: \"linux\"}}).Return(nestedPackage, nil)\n\t\t\t\t})\n\n\t\t\t\tit(\"should include both of them\", func() {\n\t\t\t\t\tpackagePath := filepath.Join(tmpDir, \"test.cnb\")\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: packagePath,\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(packageDescriptor)},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{ImageRef: dist.ImageRef{ImageName: nestedPackage.Name()}},\n\t\t\t\t\t\t\t\t{BuildpackURI: dist.BuildpackURI{URI: createBuildpack(secondChildDescriptor)}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\tFormat:     client.FormatFile,\n\t\t\t\t\t}))\n\n\t\t\t\t\tassertPackageBPFileHasBuildpacks(t, packagePath, []dist.BuildpackDescriptor{packageDescriptor, childDescriptor, 
secondChildDescriptor})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"dependencies include a packaged buildpack file\", func() {\n\t\t\t\tvar (\n\t\t\t\t\tdependencyPackagePath string\n\t\t\t\t)\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tdependencyPackagePath = filepath.Join(tmpDir, \"dep.cnb\")\n\t\t\t\t\tdependencyPackageURI, err := paths.FilePathToURI(dependencyPackagePath, \"\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: dependencyPackagePath,\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(childDescriptor)},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\tFormat:     client.FormatFile,\n\t\t\t\t\t}))\n\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), dependencyPackageURI).Return(blob.NewBlob(dependencyPackagePath), nil).AnyTimes()\n\t\t\t\t})\n\n\t\t\t\tit(\"should open file and correctly add buildpacks\", func() {\n\t\t\t\t\tpackagePath := filepath.Join(tmpDir, \"test.cnb\")\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: packagePath,\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform:     dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack:    dist.BuildpackURI{URI: createBuildpack(packageDescriptor)},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{BuildpackURI: dist.BuildpackURI{URI: dependencyPackagePath}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\tFormat:     client.FormatFile,\n\t\t\t\t\t}))\n\n\t\t\t\t\tassertPackageBPFileHasBuildpacks(t, packagePath, []dist.BuildpackDescriptor{packageDescriptor, childDescriptor})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"dependencies include a buildpack registry urn file\", func() {\n\t\t\t\tvar (\n\t\t\t\t\ttmpDir          
string\n\t\t\t\t\tregistryFixture string\n\t\t\t\t\tpackHome        string\n\t\t\t\t)\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tvar err error\n\n\t\t\t\t\tchildDescriptor = dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"example/foo\", Version: \"1.1.0\"},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t}\n\n\t\t\t\t\tpackageDescriptor = dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\t\tWithOrder: dist.Order{{\n\t\t\t\t\t\t\tGroup: []dist.ModuleRef{{\n\t\t\t\t\t\t\t\tModuleInfo: dist.ModuleInfo{ID: \"example/foo\", Version: \"1.1.0\"},\n\t\t\t\t\t\t\t\tOptional:   false,\n\t\t\t\t\t\t\t}},\n\t\t\t\t\t\t}},\n\t\t\t\t\t}\n\n\t\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"registry\")\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tpackHome = filepath.Join(tmpDir, \".pack\")\n\t\t\t\t\terr = os.MkdirAll(packHome, 0755)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tos.Setenv(\"PACK_HOME\", packHome)\n\n\t\t\t\t\tregistryFixture = h.CreateRegistryFixture(t, tmpDir, filepath.Join(\"testdata\", \"registry\"))\n\t\t\t\t\th.AssertNotNil(t, registryFixture)\n\n\t\t\t\t\tpackageImage := fakes.NewImage(\"example.com/some/package@sha256:74eb48882e835d8767f62940d453eb96ed2737de3a16573881dcea7dea769df7\", \"\", nil)\n\t\t\t\t\terr = packageImage.AddLayerWithDiffID(\"testdata/empty-file\", \"sha256:xxx\")\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\terr = packageImage.SetLabel(\"io.buildpacks.buildpackage.metadata\", `{\"id\":\"example/foo\", \"version\":\"1.1.0\", \"stacks\":[{\"id\":\"some.stack.id\"}]}`)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\terr = packageImage.SetLabel(\"io.buildpacks.buildpack.layers\", `{\"example/foo\":{\"1.1.0\":{\"api\": \"0.2\", \"layerDiffID\":\"sha256:xxx\", \"stacks\":[{\"id\":\"some.stack.id\"}]}}}`)\n\t\t\t\t\th.AssertNil(t, 
err)\n\t\t\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), packageImage.Name(), image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways, Target: &dist.Target{OS: \"linux\"}}).Return(packageImage, nil)\n\n\t\t\t\t\tpackHome := filepath.Join(tmpDir, \"packHome\")\n\t\t\t\t\th.AssertNil(t, os.Setenv(\"PACK_HOME\", packHome))\n\t\t\t\t\tconfigPath := filepath.Join(packHome, \"config.toml\")\n\t\t\t\t\th.AssertNil(t, cfg.Write(cfg.Config{\n\t\t\t\t\t\tRegistries: []cfg.Registry{\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\tName: \"some-registry\",\n\t\t\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\t\t\tURL:  registryFixture,\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t},\n\t\t\t\t\t}, configPath))\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\tos.Unsetenv(\"PACK_HOME\")\n\t\t\t\t\terr := os.RemoveAll(tmpDir)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t})\n\n\t\t\t\tit(\"should open file and correctly add buildpacks\", func() {\n\t\t\t\t\tpackagePath := filepath.Join(tmpDir, \"test.cnb\")\n\n\t\t\t\t\th.AssertNil(t, subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tName: packagePath,\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform:     dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\t\tBuildpack:    dist.BuildpackURI{URI: createBuildpack(packageDescriptor)},\n\t\t\t\t\t\t\tDependencies: []dist.ImageOrURI{{BuildpackURI: dist.BuildpackURI{URI: \"urn:cnb:registry:example/foo@1.1.0\"}}},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPublish:    false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\tFormat:     client.FormatFile,\n\t\t\t\t\t\tRegistry:   \"some-registry\",\n\t\t\t\t\t}))\n\n\t\t\t\t\tassertPackageBPFileHasBuildpacks(t, packagePath, []dist.BuildpackDescriptor{packageDescriptor, childDescriptor})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"unknown format is provided\", func() {\n\t\tit(\"should error\", func() {\n\t\t\tmockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: 
mobysystem.Info{OSType: \"linux\"}}, nil).AnyTimes()\n\n\t\t\terr := subject.PackageBuildpack(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\tName:   \"some-buildpack\",\n\t\t\t\tFormat: \"invalid-format\",\n\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\t\tBuildpack: dist.BuildpackURI{URI: createBuildpack(dist.BuildpackDescriptor{\n\t\t\t\t\t\tWithAPI:    api.MustParse(\"0.2\"),\n\t\t\t\t\t\tWithInfo:   dist.ModuleInfo{ID: \"bp.1\", Version: \"1.2.3\"},\n\t\t\t\t\t\tWithStacks: []dist.Stack{{ID: \"some.stack.id\"}},\n\t\t\t\t\t})},\n\t\t\t\t},\n\t\t\t\tPublish:    false,\n\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t})\n\t\t\th.AssertError(t, err, \"unknown format: 'invalid-format'\")\n\t\t})\n\t})\n}\n\nfunc assertPackageBPFileHasBuildpacks(t *testing.T, path string, descriptors []dist.BuildpackDescriptor) {\n\tpackageBlob := blob.NewBlob(path)\n\tmainBP, depBPs, err := buildpack.BuildpacksFromOCILayoutBlob(packageBlob)\n\th.AssertNil(t, err)\n\th.AssertBuildpacksHaveDescriptors(t, append([]buildpack.BuildModule{mainBP}, depBPs...), descriptors)\n}\n\nfunc prepareDownloadedBuildpackBlobAtURI(t *testing.T, mockDownloader *testmocks.MockBlobDownloader, path string) {\n\tblob := blob.NewBlob(path)\n\turi, err := paths.FilePathToURI(path, \"\")\n\th.AssertNil(t, err)\n\tmockDownloader.EXPECT().Download(gomock.Any(), uri).Return(blob, nil).AnyTimes()\n}\n\n// prepareExpectedMultiPlaformImages creates a fake CNBImage that will be fetched from a registry\nfunc prepareExpectedMultiPlaformImages(t *testing.T, mockImageFactory *testmocks.MockImageFactory, mockImageFetcher *testmocks.MockImageFetcher, repoName string, target dist.Target, expected expectedMultiPlatformImage) {\n\tfakeImage := h.NewFakeWithRandomUnderlyingV1Image(t, repoName, expected.digest)\n\tmockImageFactory.EXPECT().NewImage(repoName, false, gomock.Eq(target)).Return(fakeImage, nil)\n\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), 
expected.digest.Name(), gomock.Any()).Return(fakeImage, nil)\n}\n\n// prepareRemoteMultiPlatformBuildpackPackage creates remote buildpack packages required to create a composite buildpack\n// repoName: image index reference name\n// digest: manifest digest for the given target\n// target: os/arch for the given manifest\nfunc prepareRemoteMultiPlatformBuildpackPackage(t *testing.T, mockImageFactory *testmocks.MockImageFactory, mockImageFetcher *testmocks.MockImageFetcher, repoName string, digest name.Digest, target dist.Target, expected []expectedMultiPlatformImage) {\n\t// creates each remote buildpack package for the given target\n\tfor _, v := range expected {\n\t\t// each package must already exist in a registry; pack will pull it and write its content to disk to create a .tar\n\t\tfakeImage := h.NewFakeWithRandomUnderlyingV1Image(t, v.bpURI, v.digest)\n\t\t// Each buildpack package is expected to have some labels\n\t\th.AssertNil(t, fakeImage.SetLabel(\"io.buildpacks.buildpackage.metadata\", fmt.Sprintf(`{\"id\":\"%s\",\"version\":\"%s\",\"stacks\":[{\"id\":\"*\"}]}`, v.id, v.version)))\n\t\tlayers, err := fakeImage.UnderlyingImage().Layers()\n\t\th.AssertNil(t, err)\n\t\tdiffID, err := layers[0].DiffID()\n\t\th.AssertNil(t, err)\n\t\th.AssertNil(t, fakeImage.SetLabel(\"io.buildpacks.buildpack.layers\", fmt.Sprintf(`{\"%s\":{\"%s\":{\"api\":\"0.10\",\"stacks\":[{\"id\":\"*\"}],\"layerDiffID\":\"%s\"}}}`, v.id, v.version, diffID)))\n\n\t\t// pack will fetch the buildpack package from the registry by target\n\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), v.bpURI, gomock.Eq(image.FetchOptions{Daemon: false, Target: &target})).Return(fakeImage, nil)\n\t}\n\n\t// Once all the buildpacks have been written to disk as .tar files,\n\t// pack will create a new OCI image adding all the .tar files as layers\n\tcompositeBuildpackImage := h.NewFakeWithRandomUnderlyingV1Image(t, repoName, digest)\n\tmockImageFactory.EXPECT().NewImage(repoName, false, 
gomock.Eq(target)).Return(compositeBuildpackImage, nil)\n\n\t// Once the composite buildpack image was pushed to the registry, pack will create an Image Index adding\n\t// each manifest by digest\n\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), digest.Name(), gomock.Any()).Return(compositeBuildpackImage, nil)\n}\n\nfunc newDigest(t *testing.T, repoName, sha string) name.Digest {\n\tdigest, err := name.NewDigest(fmt.Sprintf(\"%s@%s\", repoName, sha))\n\th.AssertNil(t, err)\n\treturn digest\n}\n\n// expectedMultiPlatformImage is a helper struct with the data needed to prepare a mock remote buildpack package\ntype expectedMultiPlatformImage struct {\n\tid      string\n\tversion string\n\tbpURI   string\n\tdigest  name.Digest\n}\n"
  },
  {
    "path": "pkg/client/package_extension.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\t\"path/filepath\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/layer\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\n// PackageExtension packages extension(s) into either an image or file.\nfunc (c *Client) PackageExtension(ctx context.Context, opts PackageBuildpackOptions) error {\n\tif opts.Format == \"\" {\n\t\topts.Format = FormatImage\n\t}\n\n\ttargets, err := c.processPackageBuildpackTargets(ctx, opts)\n\tif err != nil {\n\t\treturn err\n\t}\n\tmultiArch := len(targets) > 1 && (opts.Publish || opts.Format == FormatFile)\n\n\tvar digests []string\n\ttargets = dist.ExpandTargetsDistributions(targets...)\n\tfor _, target := range targets {\n\t\tdigest, err := c.packageExtensionTarget(ctx, opts, target, multiArch)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tdigests = append(digests, digest)\n\t}\n\n\tif opts.Publish && len(digests) > 1 {\n\t\t// Image Index must be created only when we pushed to registry\n\t\treturn c.CreateManifest(ctx, CreateManifestOptions{\n\t\t\tIndexRepoName: opts.Name,\n\t\t\tRepoNames:     digests,\n\t\t\tPublish:       true,\n\t\t})\n\t}\n\n\treturn nil\n}\n\nfunc (c *Client) packageExtensionTarget(ctx context.Context, opts PackageBuildpackOptions, target dist.Target, multiArch bool) (string, error) {\n\tvar digest string\n\tif target.OS == \"windows\" && !c.experimental {\n\t\treturn \"\", NewExperimentError(\"Windows extensionpackage support is currently experimental.\")\n\t}\n\n\terr := c.validateOSPlatform(ctx, target.OS, opts.Publish, opts.Format)\n\tif err != nil {\n\t\treturn digest, err\n\t}\n\n\twriterFactory, err := layer.NewWriterFactory(target.OS)\n\tif err != nil {\n\t\treturn digest, errors.Wrap(err, \"creating layer writer factory\")\n\t}\n\n\tpackageBuilder := buildpack.NewBuilder(c.imageFactory)\n\n\texURI := 
opts.Config.Extension.URI\n\tif exURI == \"\" {\n\t\treturn digest, errors.New(\"extension URI must be provided\")\n\t}\n\n\tif ok, platformRootFolder := buildpack.PlatformRootFolder(exURI, target); ok {\n\t\texURI = platformRootFolder\n\t}\n\n\tmainBlob, err := c.downloadBuildpackFromURI(ctx, exURI, opts.RelativeBaseDir)\n\tif err != nil {\n\t\treturn digest, err\n\t}\n\n\tex, err := buildpack.FromExtensionRootBlob(mainBlob, writerFactory, c.logger)\n\tif err != nil {\n\t\treturn digest, errors.Wrapf(err, \"creating extension from %s\", style.Symbol(exURI))\n\t}\n\n\tpackageBuilder.SetExtension(ex)\n\n\tswitch opts.Format {\n\tcase FormatFile:\n\t\tname := opts.Name\n\t\tif multiArch {\n\t\t\tfileExtension := filepath.Ext(name)\n\t\t\torigFileName := name[:len(name)-len(filepath.Ext(name))]\n\t\t\tif target.Arch != \"\" {\n\t\t\t\tname = fmt.Sprintf(\"%s-%s-%s%s\", origFileName, target.OS, target.Arch, fileExtension)\n\t\t\t} else {\n\t\t\t\tname = fmt.Sprintf(\"%s-%s%s\", origFileName, target.OS, fileExtension)\n\t\t\t}\n\t\t}\n\t\terr = packageBuilder.SaveAsFile(name, target, opts.Labels)\n\t\tif err != nil {\n\t\t\treturn digest, err\n\t\t}\n\tcase FormatImage:\n\t\timg, err := packageBuilder.SaveAsImage(opts.Name, opts.Publish, target, opts.Labels, opts.AdditionalTags...)\n\t\tif err != nil {\n\t\t\treturn digest, errors.Wrapf(err, \"saving image\")\n\t\t}\n\t\tif multiArch {\n\t\t\t// We need to keep the identifier to create the image index\n\t\t\tid, err := img.Identifier()\n\t\t\tif err != nil {\n\t\t\t\treturn digest, errors.Wrapf(err, \"determining image manifest digest\")\n\t\t\t}\n\t\t\tdigest = id.String()\n\t\t}\n\tdefault:\n\t\treturn digest, errors.Errorf(\"unknown format: %s\", style.Symbol(opts.Format))\n\t}\n\treturn digest, nil\n}\n"
  },
  {
    "path": "pkg/client/package_extension_test.go",
    "content": "package client_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\tmobysystem \"github.com/moby/moby/api/types/system\"\n\tdockerclient \"github.com/moby/moby/client\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tpubbldpkg \"github.com/buildpacks/pack/buildpackage\"\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/blob\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPackageExtension(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"PackageExtension\", testPackageExtension, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPackageExtension(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject          *client.Client\n\t\tmockController   *gomock.Controller\n\t\tmockDownloader   *testmocks.MockBlobDownloader\n\t\tmockImageFactory *testmocks.MockImageFactory\n\t\tmockImageFetcher *testmocks.MockImageFetcher\n\t\tmockDockerClient *testmocks.MockAPIClient\n\t\tout              bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockDownloader = testmocks.NewMockBlobDownloader(mockController)\n\t\tmockImageFactory = testmocks.NewMockImageFactory(mockController)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\t\tmockDockerClient = testmocks.NewMockAPIClient(mockController)\n\n\t\tvar err error\n\t\tsubject, err = client.NewClient(\n\t\t\tclient.WithLogger(logging.NewLogWithWriters(&out, 
&out)),\n\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\tclient.WithDockerClient(mockDockerClient),\n\t\t)\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() {\n\t\tmockController.Finish()\n\t})\n\n\tcreateExtension := func(descriptor dist.ExtensionDescriptor) string {\n\t\tex, err := ifakes.NewFakeExtensionBlob(&descriptor, 0644)\n\t\th.AssertNil(t, err)\n\t\turl := fmt.Sprintf(\"https://example.com/ex.%s.tgz\", h.RandString(12))\n\t\tmockDownloader.EXPECT().Download(gomock.Any(), url).Return(ex, nil).AnyTimes()\n\t\treturn url\n\t}\n\n\twhen(\"extension has issues\", func() {\n\t\twhen(\"extension has no URI\", func() {\n\t\t\tit(\"should fail\", func() {\n\t\t\t\terr := subject.PackageExtension(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: \"Fake-Name\",\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tExtension: dist.BuildpackURI{URI: \"\"},\n\t\t\t\t\t},\n\t\t\t\t\tPublish: true,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"extension URI must be provided\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"can't download extension\", func() {\n\t\t\tit(\"should fail\", func() {\n\t\t\t\texURL := fmt.Sprintf(\"https://example.com/ex.%s.tgz\", h.RandString(12))\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), exURL).Return(nil, image.ErrNotFound).AnyTimes()\n\n\t\t\t\terr := subject.PackageExtension(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: \"Fake-Name\",\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tExtension: dist.BuildpackURI{URI: exURL},\n\t\t\t\t\t},\n\t\t\t\t\tPublish: true,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"downloading buildpack\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"extension isn't a valid extension\", func() {\n\t\t\tit(\"should fail\", func() {\n\t\t\t\tfakeBlob := blob.NewBlob(filepath.Join(\"testdata\", 
\"empty-file\"))\n\t\t\t\texURL := fmt.Sprintf(\"https://example.com/ex.%s.tgz\", h.RandString(12))\n\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), exURL).Return(fakeBlob, nil).AnyTimes()\n\n\t\t\t\terr := subject.PackageExtension(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tName: \"Fake-Name\",\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform:  dist.Platform{OS: \"linux\"},\n\t\t\t\t\t\tExtension: dist.BuildpackURI{URI: exURL},\n\t\t\t\t\t},\n\t\t\t\t\tPublish: true,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"creating extension\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"FormatImage\", func() {\n\t\twhen(\"simple package for both OS formats (experimental only)\", func() {\n\t\t\tit(\"creates package image based on daemon OS\", func() {\n\t\t\t\tfor _, daemonOS := range []string{\"linux\", \"windows\"} {\n\t\t\t\t\tlocalMockDockerClient := testmocks.NewMockAPIClient(mockController)\n\t\t\t\t\tlocalMockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: daemonOS}}, nil).AnyTimes()\n\n\t\t\t\t\tpackClientWithExperimental, err := client.NewClient(\n\t\t\t\t\t\tclient.WithDockerClient(localMockDockerClient),\n\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tfakeImage := fakes.NewImage(\"basic/package-\"+h.RandString(12), \"\", nil)\n\t\t\t\t\tmockImageFactory.EXPECT().NewImage(fakeImage.Name(), true, dist.Target{OS: daemonOS}).Return(fakeImage, nil)\n\n\t\t\t\t\tfakeBlob := blob.NewBlob(filepath.Join(\"testdata\", \"empty-file\"))\n\t\t\t\t\texURL := fmt.Sprintf(\"https://example.com/ex.%s.tgz\", h.RandString(12))\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), exURL).Return(fakeBlob, nil).AnyTimes()\n\n\t\t\t\t\th.AssertNil(t, packClientWithExperimental.PackageExtension(context.TODO(), 
client.PackageBuildpackOptions{\n\t\t\t\t\t\tFormat: client.FormatImage,\n\t\t\t\t\t\tName:   fakeImage.Name(),\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform: dist.Platform{OS: daemonOS},\n\t\t\t\t\t\t\tExtension: dist.BuildpackURI{URI: createExtension(dist.ExtensionDescriptor{\n\t\t\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"ex.basic\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\t})},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}))\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit(\"fails without experimental on Windows daemons\", func() {\n\t\t\t\twindowsMockDockerClient := testmocks.NewMockAPIClient(mockController)\n\n\t\t\t\tpackClientWithoutExperimental, err := client.NewClient(\n\t\t\t\t\tclient.WithDockerClient(windowsMockDockerClient),\n\t\t\t\t\tclient.WithExperimental(false),\n\t\t\t\t)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = packClientWithoutExperimental.PackageExtension(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform: dist.Platform{\n\t\t\t\t\t\t\tOS: \"windows\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"Windows extensionpackage support is currently experimental.\")\n\t\t\t})\n\n\t\t\tit(\"fails for mismatched platform and daemon os\", func() {\n\t\t\t\twindowsMockDockerClient := testmocks.NewMockAPIClient(mockController)\n\t\t\t\twindowsMockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"windows\"}}, nil).AnyTimes()\n\n\t\t\t\tpackClientWithoutExperimental, err := client.NewClient(\n\t\t\t\t\tclient.WithDockerClient(windowsMockDockerClient),\n\t\t\t\t\tclient.WithExperimental(false),\n\t\t\t\t)\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\terr = packClientWithoutExperimental.PackageExtension(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\tPlatform: 
dist.Platform{\n\t\t\t\t\t\t\tOS: \"linux\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t})\n\n\t\t\t\th.AssertError(t, err, \"invalid 'platform.os' specified: DOCKER_OS is 'windows'\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"FormatFile\", func() {\n\t\twhen(\"simple package for both OS formats (experimental only)\", func() {\n\t\t\tit(\"creates package image in either OS format\", func() {\n\t\t\t\ttmpDir, err := os.MkdirTemp(\"\", \"package-extension\")\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tdefer os.Remove(tmpDir)\n\n\t\t\t\tfor _, imageOS := range []string{\"linux\", \"windows\"} {\n\t\t\t\t\tlocalMockDockerClient := testmocks.NewMockAPIClient(mockController)\n\t\t\t\t\tlocalMockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: imageOS}}, nil).AnyTimes()\n\n\t\t\t\t\tpackClientWithExperimental, err := client.NewClient(\n\t\t\t\t\t\tclient.WithDockerClient(localMockDockerClient),\n\t\t\t\t\t\tclient.WithLogger(logging.NewLogWithWriters(&out, &out)),\n\t\t\t\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\t\t\t\tclient.WithExperimental(true),\n\t\t\t\t\t)\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\tfakeBlob := blob.NewBlob(filepath.Join(\"testdata\", \"empty-file\"))\n\t\t\t\t\texURL := fmt.Sprintf(\"https://example.com/ex.%s.tgz\", h.RandString(12))\n\t\t\t\t\tmockDownloader.EXPECT().Download(gomock.Any(), exURL).Return(fakeBlob, nil).AnyTimes()\n\n\t\t\t\t\tpackagePath := filepath.Join(tmpDir, h.RandString(12)+\"-test.cnb\")\n\t\t\t\t\th.AssertNil(t, packClientWithExperimental.PackageExtension(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\t\t\tFormat: client.FormatFile,\n\t\t\t\t\t\tName:   packagePath,\n\t\t\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\t\t\tPlatform: dist.Platform{OS: imageOS},\n\t\t\t\t\t\t\tExtension: dist.BuildpackURI{URI: createExtension(dist.ExtensionDescriptor{\n\t\t\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: 
\"ex.basic\", Version: \"2.3.4\"},\n\t\t\t\t\t\t\t})},\n\t\t\t\t\t\t},\n\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t}))\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"unknown format is provided\", func() {\n\t\tit(\"should error\", func() {\n\t\t\tmockDockerClient.EXPECT().Info(context.TODO(), gomock.Any()).Return(dockerclient.SystemInfoResult{Info: mobysystem.Info{OSType: \"linux\"}}, nil).AnyTimes()\n\n\t\t\terr := subject.PackageExtension(context.TODO(), client.PackageBuildpackOptions{\n\t\t\t\tName:   \"some-extension\",\n\t\t\t\tFormat: \"invalid-format\",\n\t\t\t\tConfig: pubbldpkg.Config{\n\t\t\t\t\tPlatform: dist.Platform{OS: \"linux\"},\n\t\t\t\t\tExtension: dist.BuildpackURI{URI: createExtension(dist.ExtensionDescriptor{\n\t\t\t\t\t\tWithAPI:  api.MustParse(\"0.2\"),\n\t\t\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"ex.1\", Version: \"1.2.3\"},\n\t\t\t\t\t})},\n\t\t\t\t},\n\t\t\t\tPublish:    false,\n\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t})\n\t\t\th.AssertError(t, err, \"unknown format: 'invalid-format'\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/process_volumes.go",
    "content": "//go:build linux || windows\n\npackage client\n\nimport (\n\t\"fmt\"\n\t\"runtime\"\n\t\"strings\"\n\n\t\"github.com/docker/docker/volume/mounts\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nfunc processVolumes(imgOS string, volumes []string) (processed []string, warnings []string, err error) {\n\tvar parser mounts.Parser\n\tswitch \"windows\" {\n\tcase imgOS:\n\t\tparser = mounts.NewWindowsParser()\n\tcase runtime.GOOS:\n\t\tparser = mounts.NewLCOWParser()\n\tdefault:\n\t\tparser = mounts.NewLinuxParser()\n\t}\n\tfor _, v := range volumes {\n\t\tvolume, err := parser.ParseMountRaw(v, \"\")\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrapf(err, \"platform volume %q has invalid format\", v)\n\t\t}\n\n\t\tsensitiveDirs := []string{\"/cnb\", \"/layers\", \"/workspace\"}\n\t\tif imgOS == \"windows\" {\n\t\t\tsensitiveDirs = []string{`c:/cnb`, `c:\\cnb`, `c:/layers`, `c:\\layers`, `c:/workspace`, `c:\\workspace`}\n\t\t}\n\t\tfor _, p := range sensitiveDirs {\n\t\t\tif strings.HasPrefix(strings.ToLower(volume.Spec.Target), p) {\n\t\t\t\twarnings = append(warnings, fmt.Sprintf(\"Mounting to a sensitive directory %s\", style.Symbol(volume.Spec.Target)))\n\t\t\t}\n\t\t}\n\n\t\tprocessed = append(processed, fmt.Sprintf(\"%s:%s:%s\", volume.Spec.Source, volume.Spec.Target, processMode(volume.Mode)))\n\t}\n\treturn processed, warnings, nil\n}\n\nfunc processMode(mode string) string {\n\tif mode == \"\" {\n\t\treturn \"ro\"\n\t}\n\n\treturn mode\n}\n"
  },
  {
    "path": "pkg/client/process_volumes_unix.go",
    "content": "//go:build unix && !linux\n\npackage client\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/docker/cli/cli/compose/loader\"\n\t\"github.com/docker/cli/cli/compose/types\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nfunc processVolumes(imgOS string, volumes []string) (processed []string, warnings []string, err error) {\n\tfor _, v := range volumes {\n\t\tvolume, err := parseVolume(v)\n\t\tif err != nil {\n\t\t\treturn nil, nil, err\n\t\t}\n\t\tsensitiveDirs := []string{\"/cnb\", \"/layers\", \"/workspace\"}\n\t\tif imgOS == \"windows\" {\n\t\t\tsensitiveDirs = []string{`c:/cnb`, `c:\\cnb`, `c:/layers`, `c:\\layers`}\n\t\t}\n\t\tfor _, p := range sensitiveDirs {\n\t\t\tif strings.HasPrefix(strings.ToLower(volume.Target), p) {\n\t\t\t\twarnings = append(warnings, fmt.Sprintf(\"Mounting to a sensitive directory %s\", style.Symbol(volume.Target)))\n\t\t\t}\n\t\t}\n\t\tmode := \"ro\"\n\t\tif strings.HasSuffix(v, \":rw\") && !volume.ReadOnly {\n\t\t\tmode = \"rw\"\n\t\t}\n\t\tprocessed = append(processed, fmt.Sprintf(\"%s:%s:%s\", volume.Source, volume.Target, mode))\n\t}\n\treturn processed, warnings, nil\n}\n\nfunc parseVolume(volume string) (types.ServiceVolumeConfig, error) {\n\t// volume format: '<host path>:<target path>[:<options>]'\n\tsplit := strings.Split(volume, \":\")\n\tif len(split) == 3 {\n\t\tif split[2] != \"ro\" && split[2] != \"rw\" && !strings.Contains(split[2], \"volume-opt\") {\n\t\t\treturn types.ServiceVolumeConfig{}, errors.New(fmt.Sprintf(\"platform volume %q has invalid format: invalid mode: %s\", volume, split[2]))\n\t\t}\n\t}\n\tconfig, err := loader.ParseVolume(volume)\n\tif err != nil {\n\t\treturn config, errors.Wrapf(err, \"platform volume %q has invalid format\", volume)\n\t}\n\treturn config, nil\n}\n"
  },
  {
    "path": "pkg/client/pull_buildpack.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"fmt\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\n// PullBuildpackOptions are options available for PullBuildpack\ntype PullBuildpackOptions struct {\n\t// URI of the buildpack to retrieve.\n\tURI string\n\t// RegistryName to search for buildpacks from.\n\tRegistryName string\n\t// RelativeBaseDir to resolve relative assets from.\n\tRelativeBaseDir string\n}\n\n// PullBuildpack pulls the given buildpack and stores it locally\nfunc (c *Client) PullBuildpack(ctx context.Context, opts PullBuildpackOptions) error {\n\tlocatorType, err := buildpack.GetLocatorType(opts.URI, \"\", []dist.ModuleInfo{})\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tswitch locatorType {\n\tcase buildpack.PackageLocator:\n\t\timageName := buildpack.ParsePackageLocator(opts.URI)\n\t\tc.logger.Debugf(\"Pulling buildpack from image: %s\", imageName)\n\n\t\t_, err = c.imageFetcher.Fetch(ctx, imageName, image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways})\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"fetching image %s\", style.Symbol(opts.URI))\n\t\t}\n\tcase buildpack.RegistryLocator:\n\t\tc.logger.Debugf(\"Pulling buildpack from registry: %s\", style.Symbol(opts.URI))\n\t\tregistryCache, err := getRegistry(c.logger, opts.RegistryName)\n\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"invalid registry '%s'\", opts.RegistryName)\n\t\t}\n\n\t\tregistryBp, err := registryCache.LocateBuildpack(opts.URI)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"locating in registry %s\", style.Symbol(opts.URI))\n\t\t}\n\n\t\t_, err = c.imageFetcher.Fetch(ctx, registryBp.Address, image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways})\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"fetching image %s\", 
style.Symbol(opts.URI))\n\t\t}\n\tcase buildpack.InvalidLocator:\n\t\treturn fmt.Errorf(\"invalid buildpack URI %s\", style.Symbol(opts.URI))\n\tdefault:\n\t\treturn fmt.Errorf(\"unsupported buildpack URI type: %s\", style.Symbol(locatorType.String()))\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/client/pull_buildpack_test.go",
    "content": "package client_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tcfg \"github.com/buildpacks/pack/internal/config\"\n\t\"github.com/buildpacks/pack/internal/registry\"\n\t\"github.com/buildpacks/pack/pkg/client\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPullBuildpack(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"PackageBuildpack\", testPullBuildpack, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPullBuildpack(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tsubject          *client.Client\n\t\tmockController   *gomock.Controller\n\t\tmockDownloader   *testmocks.MockBlobDownloader\n\t\tmockImageFactory *testmocks.MockImageFactory\n\t\tmockImageFetcher *testmocks.MockImageFetcher\n\t\tmockDockerClient *testmocks.MockAPIClient\n\t\tout              bytes.Buffer\n\t)\n\n\tit.Before(func() {\n\t\tmockController = gomock.NewController(t)\n\t\tmockDownloader = testmocks.NewMockBlobDownloader(mockController)\n\t\tmockImageFactory = testmocks.NewMockImageFactory(mockController)\n\t\tmockImageFetcher = testmocks.NewMockImageFetcher(mockController)\n\t\tmockDockerClient = testmocks.NewMockAPIClient(mockController)\n\n\t\tvar err error\n\t\tsubject, err = client.NewClient(\n\t\t\tclient.WithLogger(logging.NewLogWithWriters(&out, &out)),\n\t\t\tclient.WithDownloader(mockDownloader),\n\t\t\tclient.WithImageFactory(mockImageFactory),\n\t\t\tclient.WithFetcher(mockImageFetcher),\n\t\t\tclient.WithDockerClient(mockDockerClient),\n\t\t)\n\t\th.AssertNil(t, err)\n\t})\n\n\tit.After(func() 
{\n\t\tmockController.Finish()\n\t})\n\n\twhen(\"buildpack has issues\", func() {\n\t\tit(\"should fail if not in the registry\", func() {\n\t\t\terr := subject.PullBuildpack(context.TODO(), client.PullBuildpackOptions{\n\t\t\t\tURI:          \"invalid/image\",\n\t\t\t\tRegistryName: registry.DefaultRegistryName,\n\t\t\t})\n\t\t\th.AssertError(t, err, \"locating in registry\")\n\t\t})\n\n\t\tit(\"should fail if it's a URI type\", func() {\n\t\t\terr := subject.PullBuildpack(context.TODO(), client.PullBuildpackOptions{\n\t\t\t\tURI: \"file://some-file\",\n\t\t\t})\n\t\t\th.AssertError(t, err, \"unsupported buildpack URI type: 'URILocator'\")\n\t\t})\n\n\t\tit(\"should fail if not a valid URI\", func() {\n\t\t\terr := subject.PullBuildpack(context.TODO(), client.PullBuildpackOptions{\n\t\t\t\tURI: \"G@Rb*g3_\",\n\t\t\t})\n\t\t\th.AssertError(t, err, \"invalid buildpack URI\")\n\t\t})\n\t})\n\n\twhen(\"pulling from a docker registry\", func() {\n\t\tit(\"should fetch the image\", func() {\n\t\t\tpackageImage := fakes.NewImage(\"example.com/some/package:1.0.0\", \"\", nil)\n\t\t\th.AssertNil(t, packageImage.SetLabel(\"io.buildpacks.buildpackage.metadata\", `{}`))\n\t\t\th.AssertNil(t, packageImage.SetLabel(\"io.buildpacks.buildpack.layers\", `{}`))\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), packageImage.Name(), image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(packageImage, nil)\n\n\t\t\th.AssertNil(t, subject.PullBuildpack(context.TODO(), client.PullBuildpackOptions{\n\t\t\t\tURI: \"example.com/some/package:1.0.0\",\n\t\t\t}))\n\t\t})\n\t})\n\n\twhen(\"pulling from a buildpack registry\", func() {\n\t\tvar (\n\t\t\ttmpDir          string\n\t\t\tregistryFixture string\n\t\t\tpackHome        string\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tvar err error\n\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"registry\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\tpackHome = filepath.Join(tmpDir, \".pack\")\n\t\t\terr = os.MkdirAll(packHome, 
0755)\n\t\t\th.AssertNil(t, err)\n\t\t\tos.Setenv(\"PACK_HOME\", packHome)\n\n\t\t\tregistryFixture = h.CreateRegistryFixture(t, tmpDir, filepath.Join(\"testdata\", \"registry\"))\n\n\t\t\tpackageImage := fakes.NewImage(\"example.com/some/package@sha256:74eb48882e835d8767f62940d453eb96ed2737de3a16573881dcea7dea769df7\", \"\", nil)\n\t\t\tpackageImage.SetLabel(\"io.buildpacks.buildpackage.metadata\", `{}`)\n\t\t\tpackageImage.SetLabel(\"io.buildpacks.buildpack.layers\", `{}`)\n\t\t\tmockImageFetcher.EXPECT().Fetch(gomock.Any(), packageImage.Name(), image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways}).Return(packageImage, nil)\n\n\t\t\tpackHome := filepath.Join(tmpDir, \"packHome\")\n\t\t\th.AssertNil(t, os.Setenv(\"PACK_HOME\", packHome))\n\t\t\tconfigPath := filepath.Join(packHome, \"config.toml\")\n\t\t\th.AssertNil(t, cfg.Write(cfg.Config{\n\t\t\t\tRegistries: []cfg.Registry{\n\t\t\t\t\t{\n\t\t\t\t\t\tName: \"some-registry\",\n\t\t\t\t\t\tType: \"github\",\n\t\t\t\t\t\tURL:  registryFixture,\n\t\t\t\t\t},\n\t\t\t\t},\n\t\t\t}, configPath))\n\t\t})\n\n\t\tit.After(func() {\n\t\t\tos.Unsetenv(\"PACK_HOME\")\n\t\t\terr := os.RemoveAll(tmpDir)\n\t\t\tif runtime.GOOS != \"windows\" && err != nil && strings.Contains(err.Error(), \"The process cannot access the file because it is being used by another process.\") {\n\t\t\t\th.AssertNil(t, err)\n\t\t\t}\n\t\t})\n\n\t\tit(\"should fetch the image\", func() {\n\t\t\th.AssertNil(t, subject.PullBuildpack(context.TODO(), client.PullBuildpackOptions{\n\t\t\t\tURI:          \"example/foo@1.1.0\",\n\t\t\t\tRegistryName: \"some-registry\",\n\t\t\t}))\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/rebase.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/lifecycle/phase\"\n\t\"github.com/buildpacks/lifecycle/platform\"\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/build\"\n\t\"github.com/buildpacks/pack/internal/builder\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\n// RebaseOptions is a configuration struct that controls image rebase behavior.\ntype RebaseOptions struct {\n\t// Name of the image to rebase.\n\tRepoName string\n\n\t// Flag to publish the image to a remote registry after rebase completion.\n\tPublish bool\n\n\t// Strategy for pulling images during rebase.\n\tPullPolicy image.PullPolicy\n\n\t// Image to rebase against. This image must have\n\t// the same StackID as the previous run image.\n\tRunImage string\n\n\t// A mapping from StackID to an array of mirrors.\n\t// This mapping is used only when RunImage is omitted and Publish is true.\n\t// AdditionalMirrors gives us inputs to recalculate the 'best' run image\n\t// based on the registry we are publishing to.\n\tAdditionalMirrors map[string][]string\n\n\t// If provided, directory to which report.toml will be copied.\n\tReportDestinationDir string\n\n\t// Pass-through force flag to the lifecycle rebase command to skip target data\n\t// validation (will not have any effect if API < 0.12).\n\tForce bool\n\n\tInsecureRegistries []string\n\n\t// Image reference to use as the previous image for rebase.\n\tPreviousImage string\n}\n\n// Rebase updates the run image layers in an app image.\n// This operation mutates the image specified in opts.\nfunc (c *Client) Rebase(ctx context.Context, opts RebaseOptions) error {\n\tvar flags = []string{\"rebase\"}\n\timageRef, err := c.parseTagReference(opts.RepoName)\n\tif err != nil {\n\t\treturn 
errors.Wrapf(err, \"invalid image name '%s'\", opts.RepoName)\n\t}\n\n\trepoName := opts.RepoName\n\n\tif opts.PreviousImage != \"\" {\n\t\trepoName = opts.PreviousImage\n\t}\n\n\tappImage, err := c.imageFetcher.Fetch(ctx, repoName, image.FetchOptions{Daemon: !opts.Publish, PullPolicy: opts.PullPolicy, InsecureRegistries: opts.InsecureRegistries})\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tappOS, err := appImage.OS()\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"getting app OS\")\n\t}\n\n\tappArch, err := appImage.Architecture()\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"getting app architecture\")\n\t}\n\n\tvar md files.LayersMetadataCompat\n\tif ok, err := dist.GetLabel(appImage, platform.LifecycleMetadataLabel, &md); err != nil {\n\t\treturn err\n\t} else if !ok {\n\t\treturn errors.Errorf(\"could not find label %s on image\", style.Symbol(platform.LifecycleMetadataLabel))\n\t}\n\tvar runImageMD builder.RunImageMetadata\n\tif md.RunImage.Image != \"\" {\n\t\trunImageMD = builder.RunImageMetadata{\n\t\t\tImage:   md.RunImage.Image,\n\t\t\tMirrors: md.RunImage.Mirrors,\n\t\t}\n\t} else if md.Stack != nil {\n\t\trunImageMD = builder.RunImageMetadata{\n\t\t\tImage:   md.Stack.RunImage.Image,\n\t\t\tMirrors: md.Stack.RunImage.Mirrors,\n\t\t}\n\t}\n\n\ttarget := &dist.Target{OS: appOS, Arch: appArch}\n\tfetchOptions := image.FetchOptions{\n\t\tDaemon:             !opts.Publish,\n\t\tPullPolicy:         opts.PullPolicy,\n\t\tTarget:             target,\n\t\tInsecureRegistries: opts.InsecureRegistries,\n\t}\n\n\trunImageName := c.resolveRunImage(\n\t\topts.RunImage,\n\t\timageRef.Context().RegistryStr(),\n\t\t\"\",\n\t\trunImageMD,\n\t\topts.AdditionalMirrors,\n\t\topts.Publish,\n\t\tfetchOptions,\n\t)\n\n\tif runImageName == \"\" {\n\t\treturn errors.New(\"run image must be specified\")\n\t}\n\n\tbaseImage, err := c.imageFetcher.Fetch(ctx, runImageName, fetchOptions)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tfor _, reg := range opts.InsecureRegistries 
{\n\t\tflags = append(flags, \"-insecure-registry\", reg)\n\t}\n\n\tc.logger.Infof(\"Rebasing %s on run image %s\", style.Symbol(appImage.Name()), style.Symbol(baseImage.Name()))\n\trebaser := &phase.Rebaser{Logger: c.logger, PlatformAPI: build.SupportedPlatformAPIVersions.Latest(), Force: opts.Force}\n\treport, err := rebaser.Rebase(appImage, baseImage, opts.RepoName, nil)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tappImageIdentifier, err := appImage.Identifier()\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tc.logger.Infof(\"Rebased Image: %s\", style.Symbol(appImageIdentifier.String()))\n\n\tif opts.ReportDestinationDir != \"\" {\n\t\treportPath := filepath.Join(opts.ReportDestinationDir, \"report.toml\")\n\t\treportFile, err := os.OpenFile(reportPath, os.O_RDWR|os.O_CREATE, 0644)\n\t\tif err != nil {\n\t\t\tc.logger.Warnf(\"unable to open %s for writing rebase report\", reportPath)\n\t\t\treturn err\n\t\t}\n\n\t\tdefer reportFile.Close()\n\t\terr = toml.NewEncoder(reportFile).Encode(report)\n\t\tif err != nil {\n\t\t\tc.logger.Warnf(\"unable to write rebase report to %s\", reportPath)\n\t\t\treturn err\n\t\t}\n\t}\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/client/rebase_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/buildpacks/lifecycle/auth\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRebase(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"rebase_factory\", testRebase, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testRebase(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#Rebase\", func() {\n\t\tvar (\n\t\t\tfakeImageFetcher   *ifakes.FakeImageFetcher\n\t\t\tsubject            *Client\n\t\t\tfakeAppImage       *fakes.Image\n\t\t\tfakeRunImage       *fakes.Image\n\t\t\tfakeRunImageMirror *fakes.Image\n\t\t\tout                bytes.Buffer\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tfakeImageFetcher = ifakes.NewFakeImageFetcher()\n\n\t\t\tfakeAppImage = fakes.NewImage(\"some/app\", \"\", &fakeIdentifier{name: \"app-image\"})\n\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.lifecycle.metadata\",\n\t\t\t\t`{\"stack\":{\"runImage\":{\"image\":\"some/run\", \"mirrors\":[\"example.com/some/run\"]}}}`))\n\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.stack.id\", \"io.buildpacks.stacks.jammy\"))\n\t\t\tfakeImageFetcher.LocalImages[\"some/app\"] = fakeAppImage\n\n\t\t\tfakeRunImage = fakes.NewImage(\"some/run\", \"run-image-top-layer-sha\", &fakeIdentifier{name: \"run-image-digest\"})\n\t\t\th.AssertNil(t, fakeRunImage.SetLabel(\"io.buildpacks.stack.id\", \"io.buildpacks.stacks.jammy\"))\n\t\t\tfakeImageFetcher.LocalImages[\"some/run\"] = fakeRunImage\n\n\t\t\tfakeRunImageMirror = fakes.NewImage(\"example.com/some/run\", \"mirror-top-layer-sha\", &fakeIdentifier{name: 
\"mirror-digest\"})\n\t\t\th.AssertNil(t, fakeRunImageMirror.SetLabel(\"io.buildpacks.stack.id\", \"io.buildpacks.stacks.jammy\"))\n\t\t\tfakeImageFetcher.LocalImages[\"example.com/some/run\"] = fakeRunImageMirror\n\n\t\t\tkeychain, err := auth.DefaultKeychain(\"pack-test/dummy\")\n\t\t\th.AssertNil(t, err)\n\n\t\t\tfakeLogger := logging.NewLogWithWriters(&out, &out)\n\t\t\tsubject = &Client{\n\t\t\t\tlogger:       fakeLogger,\n\t\t\t\timageFetcher: fakeImageFetcher,\n\t\t\t\tkeychain:     keychain,\n\t\t\t}\n\t\t})\n\n\t\tit.After(func() {\n\t\t\th.AssertNilE(t, fakeAppImage.Cleanup())\n\t\t\th.AssertNilE(t, fakeRunImage.Cleanup())\n\t\t\th.AssertNilE(t, fakeRunImageMirror.Cleanup())\n\t\t})\n\n\t\twhen(\"#Rebase\", func() {\n\t\t\twhen(\"run image is provided by the user\", func() {\n\t\t\t\twhen(\"the image has a label with a run image specified\", func() {\n\t\t\t\t\tvar fakeCustomRunImage *fakes.Image\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tfakeCustomRunImage = fakes.NewImage(\"custom/run\", \"custom-base-top-layer-sha\", &fakeIdentifier{name: \"custom-base-digest\"})\n\t\t\t\t\t\th.AssertNil(t, fakeCustomRunImage.SetLabel(\"io.buildpacks.stack.id\", \"io.buildpacks.stacks.jammy\"))\n\t\t\t\t\t\tfakeImageFetcher.LocalImages[\"custom/run\"] = fakeCustomRunImage\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.AssertNilE(t, fakeCustomRunImage.Cleanup())\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"--force\", func() {\n\t\t\t\t\t\tit(\"uses the run image provided by the user\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(),\n\t\t\t\t\t\t\t\tRebaseOptions{\n\t\t\t\t\t\t\t\t\tRunImage: \"custom/run\",\n\t\t\t\t\t\t\t\t\tRepoName: \"some/app\",\n\t\t\t\t\t\t\t\t\tForce:    true,\n\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeAppImage.Base(), \"custom/run\")\n\t\t\t\t\t\t\tlbl, _ := fakeAppImage.Label(\"io.buildpacks.lifecycle.metadata\")\n\t\t\t\t\t\t\th.AssertContains(t, lbl, 
`\"runImage\":{\"topLayer\":\"custom-base-top-layer-sha\",\"reference\":\"custom-base-digest\"`)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t\th.AssertError(t, subject.Rebase(context.TODO(),\n\t\t\t\t\t\t\tRebaseOptions{\n\t\t\t\t\t\t\t\tRunImage: \"custom/run\",\n\t\t\t\t\t\t\t\tRepoName: \"some/app\",\n\t\t\t\t\t\t\t}), \"new base image 'custom/run' not found in existing run image metadata\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"run image is NOT provided by the user\", func() {\n\t\t\t\twhen(\"the image has a label with a run image specified\", func() {\n\t\t\t\t\tit(\"uses the run image provided in the App image label\", func() {\n\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\tRepoName: \"some/app\",\n\t\t\t\t\t\t}))\n\t\t\t\t\t\th.AssertEq(t, fakeAppImage.Base(), \"some/run\")\n\t\t\t\t\t\tlbl, _ := fakeAppImage.Label(\"io.buildpacks.lifecycle.metadata\")\n\t\t\t\t\t\th.AssertContains(t, lbl, `\"runImage\":{\"topLayer\":\"run-image-top-layer-sha\",\"reference\":\"run-image-digest\"`)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"the image has a label with a run image mirrors specified\", func() {\n\t\t\t\t\twhen(\"there are no user provided mirrors\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tfakeImageFetcher.LocalImages[\"example.com/some/app\"] = fakeAppImage\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"chooses a matching mirror from the app image label\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\t\tRepoName: \"example.com/some/app\",\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeAppImage.Base(), \"example.com/some/run\")\n\t\t\t\t\t\t\tlbl, _ := fakeAppImage.Label(\"io.buildpacks.lifecycle.metadata\")\n\t\t\t\t\t\t\th.AssertContains(t, lbl, `\"runImage\":{\"topLayer\":\"mirror-top-layer-sha\",\"reference\":\"mirror-digest\"`)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"there are user provided 
mirrors\", func() {\n\t\t\t\t\t\tvar (\n\t\t\t\t\t\t\tfakeLocalMirror *fakes.Image\n\t\t\t\t\t\t)\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tfakeImageFetcher.LocalImages[\"example.com/some/app\"] = fakeAppImage\n\t\t\t\t\t\t\tfakeLocalMirror = fakes.NewImage(\"example.com/some/local-run\", \"local-mirror-top-layer-sha\", &fakeIdentifier{name: \"local-mirror-digest\"})\n\t\t\t\t\t\t\th.AssertNil(t, fakeLocalMirror.SetLabel(\"io.buildpacks.stack.id\", \"io.buildpacks.stacks.jammy\"))\n\t\t\t\t\t\t\tfakeImageFetcher.LocalImages[\"example.com/some/local-run\"] = fakeLocalMirror\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\th.AssertNilE(t, fakeLocalMirror.Cleanup())\n\t\t\t\t\t\t})\n\t\t\t\t\t\twhen(\"--force\", func() {\n\t\t\t\t\t\t\tit(\"chooses a matching local mirror first\", func() {\n\t\t\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\t\t\tRepoName: \"example.com/some/app\",\n\t\t\t\t\t\t\t\t\tAdditionalMirrors: map[string][]string{\n\t\t\t\t\t\t\t\t\t\t\"some/run\": {\"example.com/some/local-run\"},\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\tForce: true,\n\t\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\t\th.AssertEq(t, fakeAppImage.Base(), \"example.com/some/local-run\")\n\t\t\t\t\t\t\t\tlbl, _ := fakeAppImage.Label(\"io.buildpacks.lifecycle.metadata\")\n\t\t\t\t\t\t\t\th.AssertContains(t, lbl, `\"runImage\":{\"topLayer\":\"local-mirror-top-layer-sha\",\"reference\":\"local-mirror-digest\"`)\n\t\t\t\t\t\t\t})\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t\twhen(\"there is a label and it has a run image and no stack\", func() {\n\t\t\t\t\t\tit(\"reads the run image from the label\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.lifecycle.metadata\",\n\t\t\t\t\t\t\t\t`{\"runImage\":{\"image\":\"some/run\", \"mirrors\":[\"example.com/some/run\"]}}`))\n\t\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\t\tRepoName: 
\"some/app\",\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeAppImage.Base(), \"some/run\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t\twhen(\"there is neither runImage nor stack\", func() {\n\t\t\t\t\t\tit(\"fails gracefully\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.lifecycle.metadata\", `{}`))\n\t\t\t\t\t\t\th.AssertError(t, subject.Rebase(context.TODO(), RebaseOptions{RepoName: \"some/app\"}),\n\t\t\t\t\t\t\t\t\"run image must be specified\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"the image does not have a label with a run image specified\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.lifecycle.metadata\", \"{}\"))\n\t\t\t\t\t\terr := subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\tRepoName: \"some/app\",\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertError(t, err, \"run image must be specified\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"publish\", func() {\n\t\t\t\tvar (\n\t\t\t\t\tfakeRemoteRunImage *fakes.Image\n\t\t\t\t)\n\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tfakeRemoteRunImage = fakes.NewImage(\"some/run\", \"remote-top-layer-sha\", &fakeIdentifier{name: \"remote-digest\"})\n\t\t\t\t\th.AssertNil(t, fakeRemoteRunImage.SetLabel(\"io.buildpacks.stack.id\", \"io.buildpacks.stacks.jammy\"))\n\t\t\t\t\tfakeImageFetcher.RemoteImages[\"some/run\"] = fakeRemoteRunImage\n\t\t\t\t})\n\n\t\t\t\tit.After(func() {\n\t\t\t\t\th.AssertNilE(t, fakeRemoteRunImage.Cleanup())\n\t\t\t\t})\n\n\t\t\t\twhen(\"is false\", func() {\n\t\t\t\t\twhen(\"pull policy is always\", func() {\n\t\t\t\t\t\tit(\"updates the local image\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\t\tRepoName:   \"some/app\",\n\t\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeAppImage.Base(), \"some/run\")\n\t\t\t\t\t\t\tlbl, _ := 
fakeAppImage.Label(\"io.buildpacks.lifecycle.metadata\")\n\t\t\t\t\t\t\th.AssertContains(t, lbl, `\"runImage\":{\"topLayer\":\"remote-top-layer-sha\",\"reference\":\"remote-digest\"`)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"pull policy is never\", func() {\n\t\t\t\t\t\tit(\"uses local image\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\t\tRepoName:   \"some/app\",\n\t\t\t\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeAppImage.Base(), \"some/run\")\n\t\t\t\t\t\t\tlbl, _ := fakeAppImage.Label(\"io.buildpacks.lifecycle.metadata\")\n\t\t\t\t\t\t\th.AssertContains(t, lbl, `\"runImage\":{\"topLayer\":\"run-image-top-layer-sha\",\"reference\":\"run-image-digest\"`)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"report directory is set\", func() {\n\t\t\t\t\tit(\"writes the report\", func() {\n\t\t\t\t\t\ttmpdir := t.TempDir()\n\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\tRepoName:             \"some/app\",\n\t\t\t\t\t\t\tReportDestinationDir: tmpdir,\n\t\t\t\t\t\t}))\n\t\t\t\t\t\t_, err := os.Stat(filepath.Join(tmpdir, \"report.toml\"))\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"is true\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\tfakeImageFetcher.RemoteImages[\"some/app\"] = fakeAppImage\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"skip pull is anything\", func() {\n\t\t\t\t\t\tit(\"uses remote image\", func() {\n\t\t\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\t\t\tRepoName: \"some/app\",\n\t\t\t\t\t\t\t\tPublish:  true,\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertEq(t, fakeAppImage.Base(), \"some/run\")\n\t\t\t\t\t\t\tlbl, _ := fakeAppImage.Label(\"io.buildpacks.lifecycle.metadata\")\n\t\t\t\t\t\t\th.AssertContains(t, lbl, `\"runImage\":{\"topLayer\":\"remote-top-layer-sha\",\"reference\":\"remote-digest\"`)\n\t\t\t\t\t\t\targs := 
fakeImageFetcher.FetchCalls[\"some/run\"]\n\t\t\t\t\t\t\th.AssertEq(t, args.Target.ValuesAsPlatform(), \"linux/amd64\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"previous image is provided\", func() {\n\t\t\t\tit(\"fetches the image using the previous image name\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\tRepoName:      \"new/app\",\n\t\t\t\t\t\tPreviousImage: \"some/app\",\n\t\t\t\t\t}))\n\t\t\t\t\targs := fakeImageFetcher.FetchCalls[\"some/app\"]\n\t\t\t\t\th.AssertNotNil(t, args)\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"previous image is set to new image name\", func() {\n\t\t\t\tit(\"returns error if Fetch function fails\", func() {\n\t\t\t\t\terr := subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\tRepoName:      \"some/app\",\n\t\t\t\t\t\tPreviousImage: \"new/app\",\n\t\t\t\t\t})\n\t\t\t\t\th.AssertError(t, err, \"image 'new/app' does not exist on the daemon: not found\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"previous image is not provided\", func() {\n\t\t\t\tit(\"fetches the image using the repo name\", func() {\n\t\t\t\t\th.AssertNil(t, subject.Rebase(context.TODO(), RebaseOptions{\n\t\t\t\t\t\tRepoName: \"some/app\",\n\t\t\t\t\t}))\n\t\t\t\t\targs := fakeImageFetcher.FetchCalls[\"some/app\"]\n\t\t\t\t\th.AssertNotNil(t, args)\n\t\t\t\t\th.AssertEq(t, args.Daemon, true)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n\ntype fakeIdentifier struct {\n\tname string\n}\n\nfunc (f *fakeIdentifier) String() string {\n\treturn f.name\n}\n"
  },
  {
    "path": "pkg/client/register_buildpack.go",
    "content": "package client\n\nimport (\n\t\"context\"\n\t\"errors\"\n\t\"net/url\"\n\t\"runtime\"\n\t\"strings\"\n\n\t\"github.com/buildpacks/pack/internal/registry\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n)\n\n// RegisterBuildpackOptions is a configuration struct that controls the\n// behavior of the RegisterBuildpack function.\ntype RegisterBuildpackOptions struct {\n\tImageName string\n\tType      string\n\tURL       string\n\tName      string\n}\n\n// RegisterBuildpack updates the Buildpack Registry to include the new buildpack specified in\n// the opts argument.\nfunc (c *Client) RegisterBuildpack(ctx context.Context, opts RegisterBuildpackOptions) error {\n\tappImage, err := c.imageFetcher.Fetch(ctx, opts.ImageName, image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways})\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tvar buildpackInfo dist.ModuleInfo\n\tif _, err := dist.GetLabel(appImage, buildpack.MetadataLabel, &buildpackInfo); err != nil {\n\t\treturn err\n\t}\n\n\tnamespace, name, err := parseID(buildpackInfo.ID)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tid, err := appImage.Identifier()\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tbuildpack := registry.Buildpack{\n\t\tNamespace: namespace,\n\t\tName:      name,\n\t\tVersion:   buildpackInfo.Version,\n\t\tAddress:   id.String(),\n\t\tYanked:    false,\n\t}\n\n\tswitch opts.Type {\n\tcase \"github\":\n\t\tissueURL, err := registry.GetIssueURL(opts.URL)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tissue, err := registry.CreateGithubIssue(buildpack)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tparams := url.Values{}\n\t\tparams.Add(\"title\", issue.Title)\n\t\tparams.Add(\"body\", issue.Body)\n\t\tparams.Add(\"template\", \"add-buildpack.md\")\n\t\tissueURL.RawQuery = params.Encode()\n\n\t\tc.logger.Debugf(\"Open URL in browser: %s\", issueURL)\n\t\tcmd, err := 
registry.CreateBrowserCmd(issueURL.String(), runtime.GOOS)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\treturn cmd.Start()\ncase \"git\":\n\t\tregistryCache, err := getRegistry(c.logger, opts.Name)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tusername, err := parseUsernameFromURL(opts.URL)\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif err := registry.GitCommit(buildpack, username, registryCache); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc parseUsernameFromURL(url string) (string, error) {\n\tparts := strings.Split(url, \"/\")\n\t// a URL like \"https://host/username/...\" splits into at least 4 parts;\n\t// require that many before reading parts[3] to avoid an index panic\n\tif len(parts) < 4 {\n\t\treturn \"\", errors.New(\"invalid url: cannot parse username from url\")\n\t}\n\tif parts[3] == \"\" {\n\t\treturn \"\", errors.New(\"invalid url: username is empty\")\n\t}\n\n\treturn parts[3], nil\n}\n\nfunc parseID(id string) (string, string, error) {\n\tparts := strings.Split(id, \"/\")\n\tif len(parts) < 2 {\n\t\treturn \"\", \"\", errors.New(\"invalid id: does not contain a namespace\")\n\t} else if len(parts) > 2 {\n\t\treturn \"\", \"\", errors.New(\"invalid id: contains unexpected characters\")\n\t}\n\n\treturn parts[0], parts[1], nil\n}\n"
  },
  {
    "path": "pkg/client/register_buildpack_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/internal/registry\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestRegisterBuildpack(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"register_buildpack\", testRegisterBuildpack, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testRegisterBuildpack(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#RegisterBuildpack\", func() {\n\t\tvar (\n\t\t\tfakeImageFetcher *ifakes.FakeImageFetcher\n\t\t\tfakeAppImage     *fakes.Image\n\t\t\tsubject          *Client\n\t\t\tout              bytes.Buffer\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tfakeImageFetcher = ifakes.NewFakeImageFetcher()\n\t\t\tfakeAppImage = fakes.NewImage(\"buildpack/image\", \"\", &fakeIdentifier{name: \"buildpack-image\"})\n\n\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.buildpackage.metadata\",\n\t\t\t\t`{\"id\":\"heroku/java-function\",\"version\":\"1.1.1\",\"stacks\":[{\"id\":\"heroku-18\"},{\"id\":\"io.buildpacks.stacks.jammy\"},{\"id\":\"org.cloudfoundry.stacks.cflinuxfs3\"}]}`))\n\t\t\tfakeImageFetcher.RemoteImages[\"buildpack/image\"] = fakeAppImage\n\n\t\t\tfakeLogger := logging.NewLogWithWriters(&out, &out)\n\t\t\tsubject = &Client{\n\t\t\t\tlogger:       fakeLogger,\n\t\t\t\timageFetcher: fakeImageFetcher,\n\t\t\t}\n\t\t})\n\n\t\tit.After(func() {\n\t\t\t_ = fakeAppImage.Cleanup()\n\t\t})\n\n\t\tit(\"should return error for an invalid image (github)\", func() {\n\t\t\tfakeAppImage = fakes.NewImage(\"invalid/image\", \"\", &fakeIdentifier{name: \"buildpack-image\"})\n\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.buildpackage.metadata\", 
`{}`))\n\n\t\t\th.AssertNotNil(t, subject.RegisterBuildpack(context.TODO(),\n\t\t\t\tRegisterBuildpackOptions{\n\t\t\t\t\tImageName: \"invalid/image\",\n\t\t\t\t\tType:      \"github\",\n\t\t\t\t\tURL:       registry.DefaultRegistryURL,\n\t\t\t\t\tName:      registry.DefaultRegistryName,\n\t\t\t\t}))\n\t\t})\n\n\t\tit(\"should return error for missing image label (github)\", func() {\n\t\t\tfakeAppImage = fakes.NewImage(\"missinglabel/image\", \"\", &fakeIdentifier{name: \"buildpack-image\"})\n\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.buildpackage.metadata\", `{}`))\n\t\t\tfakeImageFetcher.RemoteImages[\"missinglabel/image\"] = fakeAppImage\n\n\t\t\th.AssertNotNil(t, subject.RegisterBuildpack(context.TODO(),\n\t\t\t\tRegisterBuildpackOptions{\n\t\t\t\t\tImageName: \"missinglabel/image\",\n\t\t\t\t\tType:      \"github\",\n\t\t\t\t\tURL:       registry.DefaultRegistryURL,\n\t\t\t\t\tName:      registry.DefaultRegistryName,\n\t\t\t\t}))\n\t\t})\n\n\t\tit(\"should throw error if missing URL (github)\", func() {\n\t\t\th.AssertError(t, subject.RegisterBuildpack(context.TODO(),\n\t\t\t\tRegisterBuildpackOptions{\n\t\t\t\t\tImageName: \"buildpack/image\",\n\t\t\t\t\tType:      \"github\",\n\t\t\t\t\tURL:       \"\",\n\t\t\t\t\tName:      \"official\",\n\t\t\t\t}), \"missing github URL\")\n\t\t})\n\n\t\tit(\"should throw error if missing URL (git)\", func() {\n\t\t\th.AssertError(t, subject.RegisterBuildpack(context.TODO(),\n\t\t\t\tRegisterBuildpackOptions{\n\t\t\t\t\tImageName: \"buildpack/image\",\n\t\t\t\t\tType:      \"git\",\n\t\t\t\t\tURL:       \"\",\n\t\t\t\t\tName:      \"official\",\n\t\t\t\t}), \"invalid url: cannot parse username from url\")\n\t\t})\n\n\t\tit(\"should throw error if using malformed URL (git)\", func() {\n\t\t\th.AssertError(t, subject.RegisterBuildpack(context.TODO(),\n\t\t\t\tRegisterBuildpackOptions{\n\t\t\t\t\tImageName: \"buildpack/image\",\n\t\t\t\t\tType:      \"git\",\n\t\t\t\t\tURL:       
\"https://github.com//buildpack-registry/\",\n\t\t\t\t\tName:      \"official\",\n\t\t\t\t}), \"invalid url: username is empty\")\n\t\t})\n\n\t\tit(\"should return error for an invalid image (git)\", func() {\n\t\t\tfakeAppImage = fakes.NewImage(\"invalid/image\", \"\", &fakeIdentifier{name: \"buildpack-image\"})\n\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.buildpackage.metadata\", `{}`))\n\n\t\t\th.AssertNotNil(t, subject.RegisterBuildpack(context.TODO(),\n\t\t\t\tRegisterBuildpackOptions{\n\t\t\t\t\tImageName: \"invalid/image\",\n\t\t\t\t\tType:      \"git\",\n\t\t\t\t\tURL:       registry.DefaultRegistryURL,\n\t\t\t\t\tName:      registry.DefaultRegistryName,\n\t\t\t\t}))\n\t\t})\n\n\t\tit(\"should return error for missing image label (git)\", func() {\n\t\t\tfakeAppImage = fakes.NewImage(\"missinglabel/image\", \"\", &fakeIdentifier{name: \"buildpack-image\"})\n\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.buildpackage.metadata\", `{}`))\n\t\t\tfakeImageFetcher.RemoteImages[\"missinglabel/image\"] = fakeAppImage\n\n\t\t\th.AssertNotNil(t, subject.RegisterBuildpack(context.TODO(),\n\t\t\t\tRegisterBuildpackOptions{\n\t\t\t\t\tImageName: \"missinglabel/image\",\n\t\t\t\t\tType:      \"git\",\n\t\t\t\t\tURL:       registry.DefaultRegistryURL,\n\t\t\t\t\tName:      registry.DefaultRegistryName,\n\t\t\t\t}))\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/client/testdata/builder.toml",
    "content": "[[buildpacks]]\nid = \"some.bp1\"\nuri = \"some-path-1\"\n\n[[buildpacks]]\nid = \"some/bp2\"\nuri = \"some-path-2\"\n\n[[buildpacks]]\nid = \"some/bp2\"\nuri = \"some-path-3\"\n\n[[order]]\n[[order.group]]\n  id = \"some.bp1\"\n  version = \"1.2.3\"\n\n[[order.group]]\n  id = \"some/bp2\"\n  version = \"1.2.4\"\n\n[[order]]\n[[order.group]]\n  id = \"some.bp1\"\n  version = \"1.2.3\"\n\n[stack]\nid = \"com.example.stack\"\nbuild-image = \"some/build\"\nrun-image = \"some/run\"\nrun-image-mirrors = [\"gcr.io/some/run2\"]"
  },
  {
    "path": "pkg/client/testdata/buildpack/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.buildpack\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-api-0.4/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-api-0.4/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-api-0.4/buildpack.toml",
    "content": "api = \"0.4\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.buildpack\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-1/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"flatten/bp-1\"\nversion = \"1\"\nhomepage = \"http://buildpack-1\"\n\n[[order]]\n[[order.group]]\nid = \"flatten/bp-2\"\nversion = \"2\"\n\n[[order.group]]\nid = \"flatten/bp-3\"\nversion = \"3\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-2/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-2/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-2/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"flatten/bp-2\"\nversion = \"2\"\nhomepage = \"http://buildpack-2\"\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-3/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"flatten/bp-3\"\nversion = \"3\"\nhomepage = \"http://buildpack-3\"\n\n[[order]]\n[[order.group]]\nid = \"flatten/bp-4\"\nversion = \"4\"\n\n[[order.group]]\nid = \"flatten/bp-5\"\nversion = \"5\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-4/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-4/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-4/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"flatten/bp-4\"\nversion = \"4\"\nhomepage = \"http://buildpack-4\"\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-5/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"flatten/bp-5\"\nversion = \"5\"\nhomepage = \"http://buildpack-5\"\n\n[[order]]\n[[order.group]]\nid = \"flatten/bp-6\"\nversion = \"6\"\n\n[[order.group]]\nid = \"flatten/bp-7\"\nversion = \"7\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-6/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-6/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-6/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"flatten/bp-6\"\nversion = \"6\"\nhomepage = \"http://buildpack-6\"\n\n[[stacks]]\nid = \"*\"\n\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-7/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-7/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-flatten/buildpack-7/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"flatten/bp-7\"\nversion = \"7\"\nhomepage = \"http://buildpack-7\"\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/README.md",
    "content": "When creating multi-platform buildpacks, the root buildpack.toml file must be copied into each\nplatform root folder; this operation must be done by the caller of the method:\n\n`PackageBuildpack(ctx context.Context, opts PackageBuildpackOptions) error`\n\nTo simplify the tests, the buildpack.toml has already been copied into each buildpack folder.\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-composite/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/composite-buildpack\"\nversion = \"0.0.1\"\n\n# Order used for detection\n[[order]]\n[[order.group]]\nid = \"samples/bp-1\"\nversion = \"0.0.1\"\n\n[[order.group]]\nid = \"samples/bp-2\"\nversion = \"0.0.1\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-composite/package.toml",
    "content": "[buildpack]\nuri = \".\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[dependencies]]\nuri = \"localhost:3333/bp-1\"\n\n[[dependencies]]\nuri = \"localhost:3333/bp-2\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-composite-with-dependencies-on-disk/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/composite-buildpack\"\nversion = \"0.0.1\"\n\n# Order used for detection\n[[order]]\n[[order.group]]\nid = \"samples/bp-1\"\nversion = \"0.0.1\"\n\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-composite-with-dependencies-on-disk/package.toml",
    "content": "[buildpack]\nuri = \".\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[dependencies]]\nuri = \"../samples/bp-1\"\n\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format/linux/amd64/bin/build",
    "content": "build-amd64-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format/linux/amd64/bin/detect",
    "content": "detect-amd64-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format/linux/amd64/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/multi-platform\"\nversion = \"0.0.1\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format/linux/arm/bin/build",
    "content": "build-arm-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format/linux/arm/bin/detect",
    "content": "detect-arm-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format/linux/arm/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/multi-platform\"\nversion = \"0.0.1\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format/linux/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/multi-platform\"\nversion = \"0.0.1\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[stacks]]\nid = \"*\"\n\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/amd64/v5/ubuntu@18.01/bin/build",
    "content": "build-amd64-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/amd64/v5/ubuntu@18.01/bin/detect",
    "content": "detect-amd64-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/amd64/v5/ubuntu@18.01/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/multi-platform\"\nversion = \"0.0.1\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n[[targets.distributions]]\nname = \"ubuntu\"\nversions = [\"18.01\", \"21.01\"]\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\nvariant = \"v6\"\n[[targets.distributions]]\nname = \"ubuntu\"\nversions = [\"18.01\", \"21.01\"]\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/amd64/v5/ubuntu@21.01/bin/build",
    "content": "build-amd64-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/amd64/v5/ubuntu@21.01/bin/detect",
    "content": "detect-amd64-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/amd64/v5/ubuntu@21.01/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/multi-platform\"\nversion = \"0.0.1\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/arm/v6/ubuntu@18.01/bin/build",
    "content": "build-arm-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/arm/v6/ubuntu@18.01/bin/detect",
    "content": "detect-arm-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/arm/v6/ubuntu@18.01/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/multi-platform\"\nversion = \"0.0.1\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/arm/v6/ubuntu@21.01/bin/build",
    "content": "build-arm-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/arm/v6/ubuntu@21.01/bin/detect",
    "content": "detect-arm-contents\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-new-format-with-versions/linux/arm/v6/ubuntu@21.01/buildpack.toml",
    "content": "api = \"0.10\"\n\n[buildpack]\nid = \"samples/multi-platform\"\nversion = \"0.0.1\"\n\n[[targets]]\nos = \"linux\"\narch = \"amd64\"\n\n[[targets]]\nos = \"linux\"\narch = \"arm64\"\n\n[[stacks]]\nid = \"*\"\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-old-format/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-old-format/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-multi-platform/buildpack-old-format/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.buildpack\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-1-version-1/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-1-version-1/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-1-version-1/buildpack.toml",
    "content": "api = \"0.4\"\n\n[buildpack]\nid = \"buildpack-1-id\"\nversion = \"buildpack-1-version-1\"\nhomepage = \"http://non-deterministic.buildpack-1\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-1-version-2/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-1-version-2/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-1-version-2/buildpack.toml",
    "content": "api = \"0.4\"\n\n[buildpack]\nid = \"buildpack-1-id\"\nversion = \"buildpack-1-version-2\"\nhomepage = \"http://non-deterministic.buildpack-1\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-2-version-1/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-2-version-1/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-2-version-1/buildpack.toml",
    "content": "api = \"0.4\"\n\n[buildpack]\nid = \"buildpack-2-id\"\nversion = \"buildpack-2-version-1\"\nhomepage = \"http://non-deterministic.buildpack-2\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-2-version-2/bin/build",
    "content": "build-contents"
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-2-version-2/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack-non-deterministic/buildpack-2-version-2/buildpack.toml",
    "content": "api = \"0.4\"\n\n[buildpack]\nid = \"buildpack-2-id\"\nversion = \"buildpack-2-version-2\"\nhomepage = \"http://non-deterministic.buildpack-2\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "pkg/client/testdata/buildpack2/bin/build",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack2/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/buildpack2/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"some-other-buildpack-id\"\nversion = \"some-other-buildpack-version\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinA\", \"build:mixinB\", \"run:mixinC\"]\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/config-does-not-exist/README",
    "content": "This folder is intentionally empty to test the scenario when the docker config.json file doesn't exist\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/current-context-does-not-match/config.json",
    "content": "{\n  \"currentContext\": \"desktop-linux\"\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/current-context-does-not-match/contexts/meta/fe9c6bd7a66301f49ca9b6a70b217107cd1284598bfc254700c989b916da791e/meta.json",
    "content": "{\n  \"Name\": \"bad-name\",\n  \"Endpoints\": {\n    \"docker\": {\n      \"Host\": \"unix:///Users/user/.docker/run/docker.sock\"\n    }\n  }\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/docker-endpoint-does-not-exist/config.json",
    "content": "{\n  \"currentContext\": \"desktop-linux\"\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/docker-endpoint-does-not-exist/contexts/meta/fe9c6bd7a66301f49ca9b6a70b217107cd1284598bfc254700c989b916da791e/meta.json",
    "content": "{\n  \"Name\": \"desktop-linux\",\n  \"Endpoints\": {\n    \"foo\": {\n      \"Host\": \"unix:///Users/user/.docker/run/docker.sock\"\n    }\n  }\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/empty-context/config.json",
    "content": "{\n  \"currentContext\": \"some-bad-context\"\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/invalid-config/config.json",
    "content": "{\n  \"currentContext\": \"some-bad-context\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/invalid-metadata/config.json",
    "content": "{\n  \"currentContext\": \"desktop-linux\"\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/error-cases/invalid-metadata/contexts/meta/fe9c6bd7a66301f49ca9b6a70b217107cd1284598bfc254700c989b916da791e/meta.json",
    "content": "{\n  \"Name\": \"desktop-linux\",\n  \"Endpoints\": {\n    \"docker\": {\n      \"Host\": \"unix:///Users/user/.docker/run/docker.sock\n    }\n  }\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/happy-cases/current-context-not-defined/config.json",
    "content": "{\n  \"auths\": {\n    \"https://index.docker.io/v1/\": {}\n  },\n  \"credsStore\": \"desktop\",\n  \"experimental\": \"disabled\"\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/happy-cases/custom-context/config.json",
    "content": "{\n  \"currentContext\": \"desktop-linux\"\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/happy-cases/custom-context/contexts/meta/fe9c6bd7a66301f49ca9b6a70b217107cd1284598bfc254700c989b916da791e/meta.json",
    "content": "{\n  \"Name\": \"desktop-linux\",\n  \"Endpoints\": {\n    \"docker\": {\n      \"Host\": \"unix:///Users/user/.docker/run/docker.sock\"\n    }\n  }\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/happy-cases/default-context/config.json",
    "content": "{\n  \"currentContext\": \"default\"\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/happy-cases/two-endpoints-context/config.json",
    "content": "{\n  \"currentContext\": \"desktop-linux\"\n}\n"
  },
  {
    "path": "pkg/client/testdata/docker-context/happy-cases/two-endpoints-context/contexts/meta/fe9c6bd7a66301f49ca9b6a70b217107cd1284598bfc254700c989b916da791e/meta.json",
    "content": "{\n  \"Name\": \"desktop-linux\",\n  \"Endpoints\": {\n    \"docker\": {\n      \"Host\": \"unix:///Users/user/.docker/run/docker.sock\"\n    },\n    \"foo\": {\n      \"Host\": \"something else\"\n    }\n  }\n}\n"
  },
  {
    "path": "pkg/client/testdata/downloader/dirA/file.txt",
    "content": "some file contents"
  },
  {
    "path": "pkg/client/testdata/empty-file",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/extension/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/extension/bin/generate",
    "content": "generate-contents"
  },
  {
    "path": "pkg/client/testdata/extension/extension.toml",
    "content": "api = \"0.3\"\n\n[extension]\nid = \"ext.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.extension\"\n"
  },
  {
    "path": "pkg/client/testdata/extension-api-0.9/bin/detect",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/extension-api-0.9/bin/generate",
    "content": "generate-contents"
  },
  {
    "path": "pkg/client/testdata/extension-api-0.9/extension.toml",
    "content": "api = \"0.9\"\n\n[extension]\nid = \"ext.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.extension\"\n"
  },
  {
    "path": "pkg/client/testdata/just-a-file.txt",
    "content": ""
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.13/lifecycle-v0.0.0-arch/analyzer",
    "content": "analyzer"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.13/lifecycle-v0.0.0-arch/builder",
    "content": "builder"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.13/lifecycle-v0.0.0-arch/creator",
    "content": "creator"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.13/lifecycle-v0.0.0-arch/detector",
    "content": "detector"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.13/lifecycle-v0.0.0-arch/exporter",
    "content": "exporter"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.13/lifecycle-v0.0.0-arch/launcher",
    "content": "launcher"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.13/lifecycle-v0.0.0-arch/restorer",
    "content": "restorer"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.13/lifecycle.toml",
    "content": "[lifecycle]\nversion = \"0.0.0\"\n\n[apis]\n[apis.buildpack]\ndeprecated = [\"0.2\", \"0.3\"]\nsupported = [\"0.2\", \"0.3\", \"0.4\", \"0.9\"]\n\n[apis.platform]\ndeprecated = [\"0.2\"]\nsupported = [\"0.3\", \"0.4\", \"0.5\", \"0.6\", \"0.7\", \"0.8\", \"0.9\", \"0.10\", \"0.11\", \"0.12\", \"0.13\"]"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/analyzer",
    "content": "analyzer"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/builder",
    "content": "builder"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/creator",
    "content": "creator"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/detector",
    "content": "detector"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/exporter",
    "content": "exporter"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/launcher",
    "content": "launcher"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/restorer",
    "content": "restorer"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.3/lifecycle.toml",
    "content": "[lifecycle]\nversion = \"0.0.0\"\n\n[api]\nbuildpack = \"0.2\"\nplatform = \"0.3\""
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/analyzer",
    "content": "analyzer"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/builder",
    "content": "builder"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/creator",
    "content": "creator"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/detector",
    "content": "detector"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/exporter",
    "content": "exporter"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/launcher",
    "content": "launcher"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/restorer",
    "content": "restorer"
  },
  {
    "path": "pkg/client/testdata/lifecycle/platform-0.4/lifecycle.toml",
    "content": "[lifecycle]\nversion = \"0.0.0\"\n\n[apis]\n[apis.buildpack]\ndeprecated = [\"0.2\", \"0.3\"]\nsupported = [\"0.2\", \"0.3\", \"0.4\", \"0.9\"]\n\n[apis.platform]\ndeprecated = [\"0.2\"]\nsupported = [\"0.3\", \"0.4\"]"
  },
  {
    "path": "pkg/client/testdata/non-zip-file",
    "content": "some-content"
  },
  {
    "path": "pkg/client/testdata/registry/3/fo/example_foo",
    "content": "{\"ns\":\"example\",\"name\":\"foo\",\"version\":\"1.0.0\",\"yanked\":false,\"addr\":\"example.com/some/package@sha256:8c27fe111c11b722081701dfed3bd55e039b9ce92865473cf4cdfa918071c566\"}\n{\"ns\":\"example\",\"name\":\"foo\",\"version\":\"1.1.0\",\"yanked\":false,\"addr\":\"example.com/some/package@sha256:74eb48882e835d8767f62940d453eb96ed2737de3a16573881dcea7dea769df7\"}\n{\"ns\":\"example\",\"name\":\"foo\",\"version\":\"1.2.0\",\"yanked\":false,\"addr\":\"example.com/some/package@sha256:2560f05307e8de9d830f144d09556e19dd1eb7d928aee900ed02208ae9727e7a\"}\n"
  },
  {
    "path": "pkg/client/testdata/registry/ja/va/example_java",
    "content": "{\"ns\":\"example\",\"name\":\"java\",\"version\":\"1.0.0\",\"yanked\":false,\"addr\":\"example.com/some/package@sha256:8c27fe111c11b722081701dfed3bd55e039b9ce92865473cf4cdfa918071c566\"}\n"
  },
  {
    "path": "pkg/client/testdata/some-app/.gitignore",
    "content": "!.gitignore"
  },
  {
    "path": "pkg/client/version.go",
    "content": "package client\n\n// Version returns the version of the client\nfunc (c *Client) Version() string {\n\treturn c.version\n}\n"
  },
  {
    "path": "pkg/client/yank_buildpack.go",
    "content": "package client\n\nimport (\n\t\"net/url\"\n\t\"runtime\"\n\n\t\"github.com/buildpacks/pack/internal/registry\"\n)\n\n// YankBuildpackOptions is a configuration struct that controls the Yanking a buildpack\n// from the Buildpack Registry.\ntype YankBuildpackOptions struct {\n\tID      string\n\tVersion string\n\tType    string\n\tURL     string\n\tYank    bool\n}\n\n// YankBuildpack marks a buildpack on the Buildpack Registry as 'yanked'. This forbids future\n// builds from using it.\nfunc (c *Client) YankBuildpack(opts YankBuildpackOptions) error {\n\tnamespace, name, err := registry.ParseNamespaceName(opts.ID)\n\tif err != nil {\n\t\treturn err\n\t}\n\tissueURL, err := registry.GetIssueURL(opts.URL)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tbuildpack := registry.Buildpack{\n\t\tNamespace: namespace,\n\t\tName:      name,\n\t\tVersion:   opts.Version,\n\t\tYanked:    opts.Yank,\n\t}\n\n\tissue, err := registry.CreateGithubIssue(buildpack)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tparams := url.Values{}\n\tparams.Add(\"title\", issue.Title)\n\tparams.Add(\"body\", issue.Body)\n\tissueURL.RawQuery = params.Encode()\n\n\tc.logger.Debugf(\"Open URL in browser: %s\", issueURL)\n\tcmd, err := registry.CreateBrowserCmd(issueURL.String(), runtime.GOOS)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\treturn cmd.Start()\n}\n"
  },
  {
    "path": "pkg/client/yank_buildpack_test.go",
    "content": "package client\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\tifakes \"github.com/buildpacks/pack/internal/fakes\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestYankBuildpack(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"yank_buildpack\", testYankBuildpack, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testYankBuildpack(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#YankBuildpack\", func() {\n\t\tvar (\n\t\t\tfakeImageFetcher *ifakes.FakeImageFetcher\n\t\t\tfakeAppImage     *fakes.Image\n\t\t\tsubject          *Client\n\t\t\tout              bytes.Buffer\n\t\t)\n\n\t\tit.Before(func() {\n\t\t\tfakeImageFetcher = ifakes.NewFakeImageFetcher()\n\t\t\tfakeAppImage = fakes.NewImage(\"buildpack/image\", \"\", &fakeIdentifier{name: \"buildpack-image\"})\n\n\t\t\th.AssertNil(t, fakeAppImage.SetLabel(\"io.buildpacks.buildpackage.metadata\",\n\t\t\t\t`{\"id\":\"heroku/java-function\",\"version\":\"1.1.1\",\"stacks\":[{\"id\":\"heroku-18\"},{\"id\":\"io.buildpacks.stacks.jammy\"},{\"id\":\"org.cloudfoundry.stacks.cflinuxfs3\"}]}`))\n\t\t\tfakeImageFetcher.RemoteImages[\"buildpack/image\"] = fakeAppImage\n\n\t\t\tfakeLogger := logging.NewLogWithWriters(&out, &out)\n\t\t\tsubject = &Client{\n\t\t\t\tlogger:       fakeLogger,\n\t\t\t\timageFetcher: fakeImageFetcher,\n\t\t\t}\n\t\t})\n\n\t\tit.After(func() {\n\t\t\t_ = fakeAppImage.Cleanup()\n\t\t})\n\n\t\tit(\"should return error for missing namespace id\", func() {\n\t\t\terr := subject.YankBuildpack(YankBuildpackOptions{\n\t\t\t\tID: \"hello\",\n\t\t\t})\n\t\t\th.AssertError(t, err, \"invalid id 'hello' does not contain a namespace\")\n\t\t})\n\n\t\tit(\"should return error for invalid id\", func() {\n\t\t\terr := 
subject.YankBuildpack(YankBuildpackOptions{\n\t\t\t\tID: \"bad/id/name\",\n\t\t\t})\n\t\t\th.AssertError(t, err, \"invalid id 'bad/id/name' contains unexpected characters\")\n\t\t})\n\n\t\tit(\"should return error when URL is missing\", func() {\n\t\t\terr := subject.YankBuildpack(YankBuildpackOptions{\n\t\t\t\tID:      \"heroku/java\",\n\t\t\t\tVersion: \"0.2.1\",\n\t\t\t\tType:    \"github\",\n\t\t\t\tURL:     \"\",\n\t\t\t})\n\t\t\th.AssertError(t, err, \"missing github URL\")\n\t\t})\n\n\t\tit(\"should return error when URL is invalid\", func() {\n\t\t\terr := subject.YankBuildpack(YankBuildpackOptions{\n\t\t\t\tID:      \"heroku/java\",\n\t\t\t\tVersion: \"0.2.1\",\n\t\t\t\tType:    \"github\",\n\t\t\t\tURL:     \"bad url\",\n\t\t\t})\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertContains(t, err.Error(), \"invalid URI for request\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/dist/buildmodule.go",
    "content": "package dist\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nconst AssumedBuildpackAPIVersion = \"0.1\"\nconst BuildpacksDir = \"/cnb/buildpacks\"\nconst ExtensionsDir = \"/cnb/extensions\"\n\ntype ModuleInfo struct {\n\tID          string    `toml:\"id,omitempty\" json:\"id,omitempty\" yaml:\"id,omitempty\"`\n\tName        string    `toml:\"name,omitempty\" json:\"name,omitempty\" yaml:\"name,omitempty\"`\n\tVersion     string    `toml:\"version,omitempty\" json:\"version,omitempty\" yaml:\"version,omitempty\"`\n\tDescription string    `toml:\"description,omitempty\" json:\"description,omitempty\" yaml:\"description,omitempty\"`\n\tHomepage    string    `toml:\"homepage,omitempty\" json:\"homepage,omitempty\" yaml:\"homepage,omitempty\"`\n\tKeywords    []string  `toml:\"keywords,omitempty\" json:\"keywords,omitempty\" yaml:\"keywords,omitempty\"`\n\tExecEnv     []string  `toml:\"exec-env,omitempty\" json:\"exec-env,omitempty\" yaml:\"exec-env,omitempty\"`\n\tLicenses    []License `toml:\"licenses,omitempty\" json:\"licenses,omitempty\" yaml:\"licenses,omitempty\"`\n\tClearEnv    bool      `toml:\"clear-env,omitempty\" json:\"clear-env,omitempty\" yaml:\"clear-env,omitempty\"`\n}\n\nfunc (b ModuleInfo) FullName() string {\n\tif b.Version != \"\" {\n\t\treturn b.ID + \"@\" + b.Version\n\t}\n\treturn b.ID\n}\n\nfunc (b ModuleInfo) FullNameWithVersion() (string, error) {\n\tif b.Version == \"\" {\n\t\treturn b.ID, errors.Errorf(\"buildpack %s does not have a version defined\", style.Symbol(b.ID))\n\t}\n\treturn b.ID + \"@\" + b.Version, nil\n}\n\n// Satisfy stringer\nfunc (b ModuleInfo) String() string { return b.FullName() }\n\n// Match compares two buildpacks by ID and Version\nfunc (b ModuleInfo) Match(o ModuleInfo) bool {\n\treturn b.ID == o.ID && b.Version == o.Version\n}\n\ntype License struct {\n\tType string `toml:\"type\"`\n\tURI  string `toml:\"uri\"`\n}\n\ntype 
Stack struct {\n\tID     string   `json:\"id\" toml:\"id\"`\n\tMixins []string `json:\"mixins,omitempty\" toml:\"mixins,omitempty\"`\n}\n\ntype Target struct {\n\tOS            string         `json:\"os\" toml:\"os\"`\n\tArch          string         `json:\"arch\" toml:\"arch\"`\n\tArchVariant   string         `json:\"variant,omitempty\" toml:\"variant,omitempty\"`\n\tDistributions []Distribution `json:\"distros,omitempty\" toml:\"distros,omitempty\"`\n}\n\n// ValuesAsSlice converts the internal representation of a target (os, arch, variant, etc.) into a string slice,\n// where only non-empty values are included in the final slice.\nfunc (t *Target) ValuesAsSlice() []string {\n\tvar targets []string\n\tif t.OS != \"\" {\n\t\ttargets = append(targets, t.OS)\n\t}\n\tif t.Arch != \"\" {\n\t\ttargets = append(targets, t.Arch)\n\t}\n\tif t.ArchVariant != \"\" {\n\t\ttargets = append(targets, t.ArchVariant)\n\t}\n\n\tfor _, d := range t.Distributions {\n\t\ttargets = append(targets, fmt.Sprintf(\"%s@%s\", d.Name, d.Version))\n\t}\n\treturn targets\n}\n\nfunc (t *Target) ValuesAsPlatform() string {\n\treturn strings.Join(t.ValuesAsSlice(), \"/\")\n}\n\n// ExpandTargetsDistributions expands each provided target (with multiple distribution versions) to multiple targets (each with a single distribution version).\n// For example, given an array with ONE target with the format:\n//\n//\t[\n//\t  {OS:\"linux\", Distributions: []dist.Distribution{{Name: \"ubuntu\", Version: \"18.01\"},{Name: \"ubuntu\", Version: \"21.01\"}}}\n//\t]\n//\n// it returns an array with TWO targets each with the format:\n//\n//\t[\n//\t {OS:\"linux\",Distributions: []dist.Distribution{{Name: \"ubuntu\", Version: \"18.01\"}}},\n//\t {OS:\"linux\",Distributions: []dist.Distribution{{Name: \"ubuntu\", Version: \"21.01\"}}}\n//\t]\nfunc ExpandTargetsDistributions(targets ...Target) []Target {\n\tvar expandedTargets []Target\n\tfor _, target := range targets {\n\t\texpandedTargets = 
append(expandedTargets, expandTargetDistributions(target)...)\n\t}\n\treturn expandedTargets\n}\n\nfunc expandTargetDistributions(target Target) []Target {\n\tvar expandedTargets []Target\n\tif (len(target.Distributions)) > 1 {\n\t\toriginalDistros := target.Distributions\n\t\tfor _, distro := range originalDistros {\n\t\t\tcopyTarget := target\n\t\t\tcopyTarget.Distributions = []Distribution{distro}\n\t\t\texpandedTargets = append(expandedTargets, copyTarget)\n\t\t}\n\t} else {\n\t\texpandedTargets = append(expandedTargets, target)\n\t}\n\treturn expandedTargets\n}\n\ntype Distribution struct {\n\tName    string `json:\"name,omitempty\" toml:\"name,omitempty\"`\n\tVersion string `json:\"version,omitempty\" toml:\"version,omitempty\"`\n}\n"
  },
  {
    "path": "pkg/dist/buildmodule_test.go",
    "content": "package dist_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n\n\t\"github.com/heroku/color\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n)\n\nfunc TestBuildModule(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testBuildModule\", testBuildModule, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildModule(t *testing.T, when spec.G, it spec.S) {\n\tvar info dist.ModuleInfo\n\n\tit.Before(func() {\n\t\tinfo = dist.ModuleInfo{\n\t\t\tID:      \"some-id\",\n\t\t\tName:    \"some-name\",\n\t\t\tVersion: \"some-version\",\n\t\t}\n\t})\n\n\twhen(\"#FullName\", func() {\n\t\twhen(\"version\", func() {\n\t\t\twhen(\"blank\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tinfo.Version = \"\"\n\t\t\t\t})\n\n\t\t\t\tit(\"prints ID\", func() {\n\t\t\t\t\th.AssertEq(t, info.FullName(), \"some-id\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"not blank\", func() {\n\t\t\t\tit(\"prints ID and version\", func() {\n\t\t\t\t\th.AssertEq(t, info.FullName(), \"some-id@some-version\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#FullNameWithVersion\", func() {\n\t\twhen(\"version\", func() {\n\t\t\twhen(\"blank\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tinfo.Version = \"\"\n\t\t\t\t})\n\n\t\t\t\tit(\"errors\", func() {\n\t\t\t\t\t_, err := info.FullNameWithVersion()\n\t\t\t\t\th.AssertNotNil(t, err)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"not blank\", func() {\n\t\t\t\tit(\"prints ID and version\", func() {\n\t\t\t\t\tactual, err := info.FullNameWithVersion()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, actual, \"some-id@some-version\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#String\", func() {\n\t\tit(\"returns #FullName\", func() {\n\t\t\tinfo.Version = \"\"\n\t\t\th.AssertEq(t, info.String(), info.FullName())\n\t\t})\n\t})\n\n\twhen(\"#Match\", func() {\n\t\twhen(\"IDs and versions 
match\", func() {\n\t\t\tit(\"returns true\", func() {\n\t\t\t\tother := dist.ModuleInfo{\n\t\t\t\t\tID:      \"some-id\",\n\t\t\t\t\tVersion: \"some-version\",\n\t\t\t\t}\n\t\t\t\th.AssertEq(t, info.Match(other), true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"only IDs match\", func() {\n\t\t\tit(\"returns false\", func() {\n\t\t\t\tother := dist.ModuleInfo{\n\t\t\t\t\tID:      \"some-id\",\n\t\t\t\t\tVersion: \"some-other-version\",\n\t\t\t\t}\n\t\t\t\th.AssertEq(t, info.Match(other), false)\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/dist/buildpack_descriptor.go",
    "content": "package dist\n\nimport (\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"sort\"\n\t\"strings\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/buildpacks/pack/internal/stringset\"\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype BuildpackDescriptor struct {\n\tWithAPI          *api.Version `toml:\"api\"`\n\tWithInfo         ModuleInfo   `toml:\"buildpack\"`\n\tWithStacks       []Stack      `toml:\"stacks,omitempty\"`\n\tWithTargets      []Target     `toml:\"targets,omitempty\"`\n\tWithOrder        Order        `toml:\"order\"`\n\tWithWindowsBuild bool\n\tWithLinuxBuild   bool\n}\n\nfunc (b *BuildpackDescriptor) EscapedID() string {\n\treturn strings.ReplaceAll(b.Info().ID, \"/\", \"_\")\n}\n\nfunc (b *BuildpackDescriptor) EnsureStackSupport(stackID string, providedMixins []string, validateRunStageMixins bool) error {\n\tif len(b.Stacks()) == 0 {\n\t\treturn nil // Order buildpack or a buildpack using Targets, no validation required\n\t}\n\n\tbpMixins, err := b.findMixinsForStack(stackID)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tif !validateRunStageMixins {\n\t\tvar filtered []string\n\t\tfor _, m := range bpMixins {\n\t\t\tif !strings.HasPrefix(m, \"run:\") {\n\t\t\t\tfiltered = append(filtered, m)\n\t\t\t}\n\t\t}\n\t\tbpMixins = filtered\n\t}\n\n\t_, missing, _ := stringset.Compare(providedMixins, bpMixins)\n\tif len(missing) > 0 {\n\t\tsort.Strings(missing)\n\t\treturn fmt.Errorf(\"buildpack %s requires missing mixin(s): %s\", style.Symbol(b.Info().FullName()), strings.Join(missing, \", \"))\n\t}\n\treturn nil\n}\n\nfunc (b *BuildpackDescriptor) EnsureTargetSupport(givenOS, givenArch, givenDistroName, givenDistroVersion string) error {\n\tif len(b.Targets()) == 0 {\n\t\tif (!b.WithLinuxBuild && !b.WithWindowsBuild) || len(b.Stacks()) > 0 { // nolint\n\t\t\treturn nil // Order buildpack or stack buildpack, no validation required\n\t\t} else if b.WithLinuxBuild && givenOS == DefaultTargetOSLinux && givenArch == DefaultTargetArch 
{\n\t\t\treturn nil\n\t\t} else if b.WithWindowsBuild && givenOS == DefaultTargetOSWindows && givenArch == DefaultTargetArch {\n\t\t\treturn nil\n\t\t}\n\t}\n\tfor _, bpTarget := range b.Targets() {\n\t\tif bpTarget.OS == givenOS {\n\t\t\tif bpTarget.Arch == \"\" || givenArch == \"\" || bpTarget.Arch == givenArch {\n\t\t\t\tif len(bpTarget.Distributions) == 0 || givenDistroName == \"\" || givenDistroVersion == \"\" {\n\t\t\t\t\treturn nil\n\t\t\t\t}\n\t\t\t\tfor _, bpDistro := range bpTarget.Distributions {\n\t\t\t\t\tif bpDistro.Name == givenDistroName && bpDistro.Version == givenDistroVersion {\n\t\t\t\t\t\treturn nil\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\t}\n\ttype osDistribution struct {\n\t\tName    string `json:\"name,omitempty\"`\n\t\tVersion string `json:\"version,omitempty\"`\n\t}\n\ttype target struct {\n\t\tOS           string         `json:\"os\"`\n\t\tArch         string         `json:\"arch\"`\n\t\tDistribution osDistribution `json:\"distribution\"`\n\t}\n\treturn fmt.Errorf(\n\t\t\"unable to satisfy target os/arch constraints; build image: %s, buildpack %s: %s\",\n\t\ttoJSONMaybe(target{\n\t\t\tOS:           givenOS,\n\t\t\tArch:         givenArch,\n\t\t\tDistribution: osDistribution{Name: givenDistroName, Version: givenDistroVersion},\n\t\t}),\n\t\tstyle.Symbol(b.Info().FullName()),\n\t\ttoJSONMaybe(b.Targets()),\n\t)\n}\n\nfunc toJSONMaybe(v interface{}) string {\n\tb, err := json.Marshal(v)\n\tif err != nil {\n\t\treturn fmt.Sprintf(\"%s\", v) // hopefully v is a Stringer\n\t}\n\treturn string(b)\n}\n\nfunc (b *BuildpackDescriptor) Kind() string {\n\treturn \"buildpack\"\n}\n\nfunc (b *BuildpackDescriptor) API() *api.Version {\n\treturn b.WithAPI\n}\n\nfunc (b *BuildpackDescriptor) Info() ModuleInfo {\n\treturn b.WithInfo\n}\n\nfunc (b *BuildpackDescriptor) Order() Order {\n\treturn b.WithOrder\n}\n\nfunc (b *BuildpackDescriptor) Stacks() []Stack {\n\treturn b.WithStacks\n}\n\nfunc (b *BuildpackDescriptor) Targets() []Target {\n\treturn 
b.WithTargets\n}\n\nfunc (b *BuildpackDescriptor) findMixinsForStack(stackID string) ([]string, error) {\n\tfor _, s := range b.Stacks() {\n\t\tif s.ID == stackID || s.ID == \"*\" {\n\t\t\treturn s.Mixins, nil\n\t\t}\n\t}\n\treturn nil, fmt.Errorf(\"buildpack %s does not support stack %s\", style.Symbol(b.Info().FullName()), style.Symbol(stackID))\n}\n"
  },
  {
    "path": "pkg/dist/buildpack_descriptor_test.go",
    "content": "package dist_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestBuildpackDescriptor(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testBuildpackDescriptor\", testBuildpackDescriptor, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testBuildpackDescriptor(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#EscapedID\", func() {\n\t\tit(\"returns escaped ID\", func() {\n\t\t\tbpDesc := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"some/id\"},\n\t\t\t}\n\t\t\th.AssertEq(t, bpDesc.EscapedID(), \"some_id\")\n\t\t})\n\t})\n\n\twhen(\"#EnsureStackSupport\", func() {\n\t\twhen(\"not validating against run image mixins\", func() {\n\t\t\tit(\"ignores run-only mixins\", func() {\n\t\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\t\tMixins: []string{\"mixinA\", \"build:mixinB\", \"run:mixinD\"},\n\t\t\t\t\t}},\n\t\t\t\t}\n\n\t\t\t\tprovidedMixins := []string{\"mixinA\", \"build:mixinB\", \"mixinC\"}\n\t\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", providedMixins, false))\n\t\t\t})\n\n\t\t\tit(\"works with wildcard stack\", func() {\n\t\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\tID:     \"*\",\n\t\t\t\t\t\tMixins: []string{\"mixinA\", \"build:mixinB\", 
\"run:mixinD\"},\n\t\t\t\t\t}},\n\t\t\t\t}\n\n\t\t\t\tprovidedMixins := []string{\"mixinA\", \"build:mixinB\", \"mixinC\"}\n\t\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", providedMixins, false))\n\t\t\t})\n\n\t\t\tit(\"returns an error with any missing (and non-ignored) mixins\", func() {\n\t\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\t\tMixins: []string{\"mixinX\", \"mixinY\", \"run:mixinZ\"},\n\t\t\t\t\t}},\n\t\t\t\t}\n\n\t\t\t\tprovidedMixins := []string{\"mixinA\", \"mixinB\"}\n\t\t\t\terr := bp.EnsureStackSupport(\"some.stack.id\", providedMixins, false)\n\n\t\t\t\th.AssertError(t, err, \"buildpack 'some.buildpack.id@some.buildpack.version' requires missing mixin(s): mixinX, mixinY\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"validating against run image mixins\", func() {\n\t\t\tit(\"requires run-only mixins\", func() {\n\t\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\t\tMixins: []string{\"mixinA\", \"build:mixinB\", \"run:mixinD\"},\n\t\t\t\t\t}},\n\t\t\t\t}\n\n\t\t\t\tprovidedMixins := []string{\"mixinA\", \"build:mixinB\", \"mixinC\", \"run:mixinD\"}\n\n\t\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", providedMixins, true))\n\t\t\t})\n\n\t\t\tit(\"returns an error with any missing mixins\", func() {\n\t\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t\t},\n\t\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\t\tMixins: 
[]string{\"mixinX\", \"mixinY\", \"run:mixinZ\"},\n\t\t\t\t\t}},\n\t\t\t\t}\n\n\t\t\t\tprovidedMixins := []string{\"mixinA\", \"mixinB\"}\n\n\t\t\t\terr := bp.EnsureStackSupport(\"some.stack.id\", providedMixins, true)\n\n\t\t\t\th.AssertError(t, err, \"buildpack 'some.buildpack.id@some.buildpack.version' requires missing mixin(s): mixinX, mixinY, run:mixinZ\")\n\t\t\t})\n\t\t})\n\n\t\tit(\"returns an error when buildpack does not support stack\", func() {\n\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithStacks: []dist.Stack{{\n\t\t\t\t\tID:     \"some.stack.id\",\n\t\t\t\t\tMixins: []string{\"mixinX\", \"mixinY\"},\n\t\t\t\t}},\n\t\t\t}\n\n\t\t\terr := bp.EnsureStackSupport(\"some.nonexistent.stack.id\", []string{\"mixinA\"}, true)\n\n\t\t\th.AssertError(t, err, \"buildpack 'some.buildpack.id@some.buildpack.version' does not support stack 'some.nonexistent.stack.id'\")\n\t\t})\n\n\t\tit(\"skips validating order buildpack\", func() {\n\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithStacks: []dist.Stack{},\n\t\t\t}\n\n\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", []string{\"mixinA\"}, true))\n\t\t})\n\t})\n\n\twhen(\"validating against run image target\", func() {\n\t\tit(\"succeeds with no distribution\", func() {\n\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithTargets: []dist.Target{{\n\t\t\t\t\tOS:   \"fake-os\",\n\t\t\t\t\tArch: \"fake-arch\",\n\t\t\t\t}},\n\t\t\t}\n\n\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", []string{}, true))\n\t\t\th.AssertNil(t, bp.EnsureTargetSupport(\"fake-os\", \"fake-arch\", 
\"fake-distro\", \"0.0\"))\n\t\t})\n\n\t\tit(\"succeeds with no target and bin/build.exe\", func() {\n\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithWindowsBuild: true,\n\t\t\t}\n\n\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", []string{}, true))\n\t\t\th.AssertNil(t, bp.EnsureTargetSupport(\"windows\", \"amd64\", \"fake-distro\", \"0.0\"))\n\t\t})\n\n\t\tit(\"succeeds with no target and bin/build\", func() {\n\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithLinuxBuild: true,\n\t\t\t}\n\n\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", []string{}, true))\n\t\t\th.AssertNil(t, bp.EnsureTargetSupport(\"linux\", \"amd64\", \"fake-distro\", \"0.0\"))\n\t\t})\n\n\t\tit(\"returns an error when no match\", func() {\n\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithTargets: []dist.Target{{\n\t\t\t\t\tOS:   \"fake-os\",\n\t\t\t\t\tArch: \"fake-arch\",\n\t\t\t\t}},\n\t\t\t}\n\n\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", []string{}, true))\n\t\t\th.AssertError(t, bp.EnsureTargetSupport(\"some-other-os\", \"fake-arch\", \"fake-distro\", \"0.0\"),\n\t\t\t\t`unable to satisfy target os/arch constraints; build image: {\"os\":\"some-other-os\",\"arch\":\"fake-arch\",\"distribution\":{\"name\":\"fake-distro\",\"version\":\"0.0\"}}, buildpack 'some.buildpack.id@some.buildpack.version': [{\"os\":\"fake-os\",\"arch\":\"fake-arch\"}]`)\n\t\t})\n\n\t\tit(\"succeeds with distribution\", func() {\n\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      
\"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithTargets: []dist.Target{{\n\t\t\t\t\tOS:   \"fake-os\",\n\t\t\t\t\tArch: \"fake-arch\",\n\t\t\t\t\tDistributions: []dist.Distribution{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName:    \"fake-distro\",\n\t\t\t\t\t\t\tVersion: \"0.1\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName:    \"another-distro\",\n\t\t\t\t\t\t\tVersion: \"0.22\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}},\n\t\t\t}\n\n\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", []string{}, true))\n\t\t\th.AssertNil(t, bp.EnsureTargetSupport(\"fake-os\", \"fake-arch\", \"fake-distro\", \"0.1\"))\n\t\t})\n\n\t\tit(\"returns an error when no distribution matches\", func() {\n\t\t\tbp := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithTargets: []dist.Target{{\n\t\t\t\t\tOS:   \"fake-os\",\n\t\t\t\t\tArch: \"fake-arch\",\n\t\t\t\t\tDistributions: []dist.Distribution{\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName:    \"fake-distro\",\n\t\t\t\t\t\t\tVersion: \"0.1\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t{\n\t\t\t\t\t\t\tName:    \"another-distro\",\n\t\t\t\t\t\t\tVersion: \"0.22\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}},\n\t\t\t}\n\n\t\t\th.AssertNil(t, bp.EnsureStackSupport(\"some.stack.id\", []string{}, true))\n\t\t\th.AssertError(t, bp.EnsureTargetSupport(\"some-other-os\", \"fake-arch\", \"fake-distro\", \"0.0\"),\n\t\t\t\t`unable to satisfy target os/arch constraints; build image: {\"os\":\"some-other-os\",\"arch\":\"fake-arch\",\"distribution\":{\"name\":\"fake-distro\",\"version\":\"0.0\"}}, buildpack 'some.buildpack.id@some.buildpack.version': [{\"os\":\"fake-os\",\"arch\":\"fake-arch\",\"distros\":[{\"name\":\"fake-distro\",\"version\":\"0.1\"},{\"name\":\"another-distro\",\"version\":\"0.22\"}]}]`)\n\t\t})\n\n\t\tit(\"succeeds with missing arch\", func() {\n\t\t\tbp := 
dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{\n\t\t\t\t\tID:      \"some.buildpack.id\",\n\t\t\t\t\tVersion: \"some.buildpack.version\",\n\t\t\t\t},\n\t\t\t\tWithTargets: []dist.Target{{\n\t\t\t\t\tOS: \"fake-os\",\n\t\t\t\t}},\n\t\t\t}\n\n\t\t\th.AssertNil(t, bp.EnsureTargetSupport(\"fake-os\", \"fake-arch\", \"fake-distro\", \"0.1\"))\n\t\t})\n\t})\n\n\twhen(\"#Kind\", func() {\n\t\tit(\"returns 'buildpack'\", func() {\n\t\t\tbpDesc := dist.BuildpackDescriptor{}\n\t\t\th.AssertEq(t, bpDesc.Kind(), buildpack.KindBuildpack)\n\t\t})\n\t})\n\n\twhen(\"#API\", func() {\n\t\tit(\"returns the api\", func() {\n\t\t\tbpDesc := dist.BuildpackDescriptor{\n\t\t\t\tWithAPI: api.MustParse(\"0.99\"),\n\t\t\t}\n\t\t\th.AssertEq(t, bpDesc.API().String(), \"0.99\")\n\t\t})\n\t})\n\n\twhen(\"#Info\", func() {\n\t\tit(\"returns the module info\", func() {\n\t\t\tinfo := dist.ModuleInfo{\n\t\t\t\tID:      \"some-id\",\n\t\t\t\tName:    \"some-name\",\n\t\t\t\tVersion: \"some-version\",\n\t\t\t}\n\t\t\tbpDesc := dist.BuildpackDescriptor{\n\t\t\t\tWithInfo: info,\n\t\t\t}\n\t\t\th.AssertEq(t, bpDesc.Info(), info)\n\t\t})\n\t})\n\n\twhen(\"#Order\", func() {\n\t\tit(\"returns the order\", func() {\n\t\t\torder := dist.Order{\n\t\t\t\tdist.OrderEntry{Group: []dist.ModuleRef{\n\t\t\t\t\t{ModuleInfo: dist.ModuleInfo{\n\t\t\t\t\t\tID: \"some-id\", Name: \"some-name\", Version: \"some-version\",\n\t\t\t\t\t}},\n\t\t\t\t}},\n\t\t\t}\n\t\t\tbpDesc := dist.BuildpackDescriptor{\n\t\t\t\tWithOrder: order,\n\t\t\t}\n\t\t\th.AssertEq(t, bpDesc.Order(), order)\n\t\t})\n\t})\n\n\twhen(\"#Stacks\", func() {\n\t\tit(\"returns the stacks\", func() {\n\t\t\tstacks := []dist.Stack{\n\t\t\t\t{ID: \"some-id\", Mixins: []string{\"some-mixin\"}},\n\t\t\t}\n\t\t\tbpDesc := dist.BuildpackDescriptor{\n\t\t\t\tWithStacks: stacks,\n\t\t\t}\n\t\t\th.AssertEq(t, bpDesc.Stacks(), stacks)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/dist/dist.go",
    "content": "package dist\n\nimport (\n\t\"github.com/buildpacks/lifecycle/api\"\n)\n\nconst (\n\tBuildpackLayersLabel   = \"io.buildpacks.buildpack.layers\"\n\tExtensionLayersLabel   = \"io.buildpacks.extension.layers\"\n\tExtensionMetadataLabel = \"io.buildpacks.extension.metadata\"\n\tDefaultTargetOSLinux   = \"linux\"\n\tDefaultTargetOSWindows = \"windows\"\n\tDefaultTargetArch      = \"amd64\"\n)\n\ntype BuildpackURI struct {\n\tURI string `toml:\"uri\"`\n}\n\ntype ImageRef struct {\n\tImageName string `toml:\"image\"`\n}\n\ntype ImageOrURI struct {\n\tBuildpackURI\n\tImageRef\n}\n\nfunc (c *ImageOrURI) DisplayString() string {\n\tif c.URI != \"\" {\n\t\treturn c.URI\n\t}\n\n\treturn c.ImageName\n}\n\ntype Platform struct {\n\tOS string `toml:\"os\"`\n}\n\ntype Order []OrderEntry\n\ntype OrderEntry struct {\n\tGroup []ModuleRef `toml:\"group\" json:\"group\"`\n}\n\ntype System struct {\n\tPre  SystemBuildpacks `toml:\"pre,omitempty\" json:\"pre,omitempty\"`\n\tPost SystemBuildpacks `toml:\"post,omitempty\" json:\"post,omitempty\"`\n}\n\ntype SystemBuildpacks struct {\n\tBuildpacks []ModuleRef `toml:\"buildpacks,omitempty\" json:\"buildpacks,omitempty\"`\n}\n\ntype ModuleRef struct {\n\tModuleInfo `yaml:\"buildpackinfo,inline\"`\n\tOptional   bool `toml:\"optional,omitempty\" json:\"optional,omitempty\" yaml:\"optional,omitempty\"`\n}\n\ntype ModuleLayers map[string]map[string]ModuleLayerInfo\n\ntype ModuleLayerInfo struct {\n\tAPI         *api.Version `json:\"api\"`\n\tStacks      []Stack      `json:\"stacks,omitempty\"`\n\tTargets     []Target     `json:\"targets,omitempty\"`\n\tOrder       Order        `json:\"order,omitempty\"`\n\tLayerDiffID string       `json:\"layerDiffID\"`\n\tHomepage    string       `json:\"homepage,omitempty\"`\n\tName        string       `json:\"name,omitempty\"`\n}\n\nfunc (b ModuleLayers) Get(id, version string) (ModuleLayerInfo, bool) {\n\tbuildpackLayerEntries, ok := b[id]\n\tif !ok {\n\t\treturn ModuleLayerInfo{}, 
false\n\t}\n\tif len(buildpackLayerEntries) == 1 && version == \"\" {\n\t\tfor key := range buildpackLayerEntries {\n\t\t\tversion = key\n\t\t}\n\t}\n\n\tresult, ok := buildpackLayerEntries[version]\n\treturn result, ok\n}\n"
  },
  {
    "path": "pkg/dist/dist_test.go",
    "content": "package dist_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestDist(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testDist\", testDist, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testDist(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"ModuleLayers\", func() {\n\t\twhen(\"Get\", func() {\n\t\t\tvar (\n\t\t\t\tbuildpackLayers dist.ModuleLayers\n\t\t\t\tapiVersion      *api.Version\n\t\t\t)\n\t\t\tit.Before(func() {\n\t\t\t\tvar err error\n\t\t\t\tapiVersion, err = api.NewVersion(\"0.0\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\tbuildpackLayers = dist.ModuleLayers{\n\t\t\t\t\t\"buildpack\": {\n\t\t\t\t\t\t\"version1\": {\n\t\t\t\t\t\t\tAPI:         apiVersion,\n\t\t\t\t\t\t\tLayerDiffID: \"buildpack-v1-diff\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t\t\"other-buildpack\": {\n\t\t\t\t\t\t\"version1\": {\n\t\t\t\t\t\t\tAPI:         apiVersion,\n\t\t\t\t\t\t\tLayerDiffID: \"other-buildpack-v2-diff\",\n\t\t\t\t\t\t},\n\t\t\t\t\t\t\"version2\": {\n\t\t\t\t\t\t\tAPI:         apiVersion,\n\t\t\t\t\t\t\tLayerDiffID: \"other-buildpack-v2-diff\",\n\t\t\t\t\t\t},\n\t\t\t\t\t},\n\t\t\t\t}\n\t\t\t})\n\n\t\t\twhen(\"ID and Version are provided and present\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tout, ok := buildpackLayers.Get(\"buildpack\", \"version1\")\n\t\t\t\t\th.AssertEq(t, ok, true)\n\t\t\t\t\th.AssertEq(t, out, dist.ModuleLayerInfo{\n\t\t\t\t\t\tAPI:         apiVersion,\n\t\t\t\t\t\tLayerDiffID: \"buildpack-v1-diff\",\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"ID is present, Version is left empty, but can be inferred\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tout, ok := buildpackLayers.Get(\"buildpack\", 
\"\")\n\t\t\t\t\th.AssertEq(t, ok, true)\n\t\t\t\t\th.AssertEq(t, out, dist.ModuleLayerInfo{\n\t\t\t\t\t\tAPI:         apiVersion,\n\t\t\t\t\t\tLayerDiffID: \"buildpack-v1-diff\",\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"ID is present, Version is left empty and cannot be inferred\", func() {\n\t\t\t\tit(\"fails\", func() {\n\t\t\t\t\t_, ok := buildpackLayers.Get(\"other-buildpack\", \"\")\n\t\t\t\t\th.AssertEq(t, ok, false)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"ID is NOT provided\", func() {\n\t\t\t\tit(\"fails\", func() {\n\t\t\t\t\t_, ok := buildpackLayers.Get(\"missing-buildpack\", \"\")\n\t\t\t\t\th.AssertEq(t, ok, false)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Add\", func() {\n\t\t\twhen(\"a new buildpack is added\", func() {\n\t\t\t\tit(\"succeeds\", func() {\n\t\t\t\t\tlayers := dist.ModuleLayers{}\n\t\t\t\t\tapiVersion, _ := api.NewVersion(\"0.0\")\n\t\t\t\t\tdescriptor := dist.BuildpackDescriptor{WithAPI: apiVersion, WithInfo: dist.ModuleInfo{ID: \"test\", Name: \"test\", Version: \"1.0\"}}\n\t\t\t\t\tdist.AddToLayersMD(layers, &descriptor, \"\")\n\t\t\t\t\tlayerInfo, ok := layers.Get(descriptor.Info().ID, descriptor.Info().Version)\n\t\t\t\t\th.AssertEq(t, ok, true)\n\t\t\t\t\th.AssertEq(t, layerInfo.Name, descriptor.Info().Name)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"ImageOrURI\", func() {\n\t\twhen(\"DisplayString\", func() {\n\t\t\twhen(\"uri\", func() {\n\t\t\t\twhen(\"blank\", func() {\n\t\t\t\t\tit(\"returns image\", func() {\n\t\t\t\t\t\ttoTest := dist.ImageOrURI{\n\t\t\t\t\t\t\tImageRef: dist.ImageRef{\n\t\t\t\t\t\t\t\tImageName: \"some-image-name\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\t\t\t\t\t\th.AssertEq(t, toTest.DisplayString(), \"some-image-name\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"not blank\", func() {\n\t\t\t\t\tit(\"returns uri\", func() {\n\t\t\t\t\t\ttoTest := dist.ImageOrURI{\n\t\t\t\t\t\t\tBuildpackURI: dist.BuildpackURI{\n\t\t\t\t\t\t\t\tURI: \"some-uri\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\tImageRef: 
dist.ImageRef{\n\t\t\t\t\t\t\t\tImageName: \"some-image-name\",\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t}\n\t\t\t\t\t\th.AssertEq(t, toTest.DisplayString(), \"some-uri\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/dist/distribution.go",
    "content": "// Package dist is responsible for cataloging all data types in relation\n// to distributing Cloud Native Buildpack components.\npackage dist\n"
  },
  {
    "path": "pkg/dist/extension_descriptor.go",
    "content": "package dist\n\nimport (\n\t\"strings\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n)\n\ntype ExtensionDescriptor struct {\n\tWithAPI     *api.Version `toml:\"api\"`\n\tWithInfo    ModuleInfo   `toml:\"extension\"`\n\tWithTargets []Target     `toml:\"targets,omitempty\"`\n}\n\nfunc (e *ExtensionDescriptor) EnsureStackSupport(_ string, _ []string, _ bool) error {\n\treturn nil\n}\n\nfunc (e *ExtensionDescriptor) EnsureTargetSupport(_, _, _, _ string) error {\n\treturn nil\n}\n\nfunc (e *ExtensionDescriptor) EscapedID() string {\n\treturn strings.ReplaceAll(e.Info().ID, \"/\", \"_\")\n}\n\nfunc (e *ExtensionDescriptor) Kind() string {\n\treturn \"extension\"\n}\n\nfunc (e *ExtensionDescriptor) API() *api.Version {\n\treturn e.WithAPI\n}\n\nfunc (e *ExtensionDescriptor) Info() ModuleInfo {\n\treturn e.WithInfo\n}\n\nfunc (e *ExtensionDescriptor) Order() Order {\n\treturn nil\n}\n\nfunc (e *ExtensionDescriptor) Stacks() []Stack {\n\treturn nil\n}\n\nfunc (e *ExtensionDescriptor) Targets() []Target {\n\treturn e.WithTargets\n}\n"
  },
  {
    "path": "pkg/dist/extension_descriptor_test.go",
    "content": "package dist_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestExtensionDescriptor(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testExtensionDescriptor\", testExtensionDescriptor, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testExtensionDescriptor(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#EscapedID\", func() {\n\t\tit(\"returns escaped ID\", func() {\n\t\t\textDesc := dist.ExtensionDescriptor{\n\t\t\t\tWithInfo: dist.ModuleInfo{ID: \"some/id\"},\n\t\t\t}\n\t\t\th.AssertEq(t, extDesc.EscapedID(), \"some_id\")\n\t\t})\n\t})\n\n\twhen(\"#Kind\", func() {\n\t\tit(\"returns 'extension'\", func() {\n\t\t\textDesc := dist.ExtensionDescriptor{}\n\t\t\th.AssertEq(t, extDesc.Kind(), buildpack.KindExtension)\n\t\t})\n\t})\n\n\twhen(\"#API\", func() {\n\t\tit(\"returns the api\", func() {\n\t\t\textDesc := dist.ExtensionDescriptor{\n\t\t\t\tWithAPI: api.MustParse(\"0.99\"),\n\t\t\t}\n\t\t\th.AssertEq(t, extDesc.API().String(), \"0.99\")\n\t\t})\n\t})\n\n\twhen(\"#Info\", func() {\n\t\tit(\"returns the module info\", func() {\n\t\t\tinfo := dist.ModuleInfo{\n\t\t\t\tID:      \"some-id\",\n\t\t\t\tName:    \"some-name\",\n\t\t\t\tVersion: \"some-version\",\n\t\t\t}\n\t\t\textDesc := dist.ExtensionDescriptor{\n\t\t\t\tWithInfo: info,\n\t\t\t}\n\t\t\th.AssertEq(t, extDesc.Info(), info)\n\t\t})\n\t})\n\n\twhen(\"#Order\", func() {\n\t\tit(\"returns empty\", func() {\n\t\t\tvar empty dist.Order\n\t\t\textDesc := dist.ExtensionDescriptor{}\n\t\t\th.AssertEq(t, extDesc.Order(), empty)\n\t\t})\n\t})\n\n\twhen(\"#Stacks\", func() {\n\t\tit(\"returns empty\", func() {\n\t\t\tvar empty []dist.Stack\n\t\t\textDesc := 
dist.ExtensionDescriptor{}\n\t\t\th.AssertEq(t, extDesc.Stacks(), empty)\n\t\t})\n\t})\n\n\twhen(\"#Targets\", func() {\n\t\tit(\"returns the targets\", func() {\n\t\t\ttargets := []dist.Target{{\n\t\t\t\tOS:   \"fake-os\",\n\t\t\t\tArch: \"fake-arch\",\n\t\t\t}}\n\t\t\textDesc := dist.ExtensionDescriptor{\n\t\t\t\tWithTargets: targets,\n\t\t\t}\n\t\t\th.AssertEq(t, extDesc.Targets(), targets)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/dist/image.go",
    "content": "package dist\n\nimport (\n\t\"encoding/json\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype Labeled interface {\n\tLabel(name string) (value string, err error)\n}\n\ntype Labelable interface {\n\tSetLabel(name string, value string) error\n}\n\nfunc SetLabel(labelable Labelable, label string, data interface{}) error {\n\tdataBytes, err := json.Marshal(data)\n\tif err != nil {\n\t\treturn errors.Wrapf(err, \"marshalling data to JSON for label %s\", style.Symbol(label))\n\t}\n\tif err := labelable.SetLabel(label, string(dataBytes)); err != nil {\n\t\treturn errors.Wrapf(err, \"setting label %s\", style.Symbol(label))\n\t}\n\treturn nil\n}\n\nfunc GetLabel(labeled Labeled, label string, obj interface{}) (ok bool, err error) {\n\tlabelData, err := labeled.Label(label)\n\tif err != nil {\n\t\treturn false, errors.Wrapf(err, \"retrieving label %s\", style.Symbol(label))\n\t}\n\tif labelData != \"\" {\n\t\tif err := json.Unmarshal([]byte(labelData), obj); err != nil {\n\t\t\treturn false, errors.Wrapf(err, \"unmarshalling label %s\", style.Symbol(label))\n\t\t}\n\t\treturn true, nil\n\t}\n\treturn false, nil\n}\n"
  },
  {
    "path": "pkg/dist/image_test.go",
    "content": "package dist_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/builder/fakes\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestImage(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"testImage\", testImage, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testImage(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"A label needs to be retrieved\", func() {\n\t\tit(\"gets a label successfully\", func() {\n\t\t\tvar outputLabel bool\n\t\t\tmockInspectable := fakes.FakeInspectable{ReturnForLabel: \"true\", ErrorForLabel: nil}\n\n\t\t\tisPresent, err := dist.GetLabel(&mockInspectable, \"random-label\", &outputLabel)\n\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, isPresent, true)\n\t\t\th.AssertEq(t, outputLabel, true)\n\t\t})\n\n\t\tit(\"returns an error\", func() {\n\t\t\tvar outputLabel bool\n\t\t\tmockInspectable := fakes.FakeInspectable{ReturnForLabel: \"\", ErrorForLabel: errors.New(\"random-error\")}\n\n\t\t\tisPresent, err := dist.GetLabel(&mockInspectable, \"random-label\", &outputLabel)\n\n\t\t\th.AssertNotNil(t, err)\n\t\t\th.AssertEq(t, isPresent, false)\n\t\t\th.AssertEq(t, outputLabel, false)\n\t\t})\n\t})\n\n\twhen(\"Try to get an empty label\", func() {\n\t\tit(\"returns false for isPresent and doesn't set the label\", func() {\n\t\t\tvar outputLabel bool\n\t\t\tmockInspectable := fakes.FakeInspectable{ReturnForLabel: \"\", ErrorForLabel: nil}\n\n\t\t\tisPresent, err := dist.GetLabel(&mockInspectable, \"random-label\", &outputLabel)\n\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, isPresent, false)\n\t\t\th.AssertEq(t, outputLabel, false)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/dist/layers.go",
    "content": "package dist\n\nimport (\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\tv1 \"github.com/google/go-containerregistry/pkg/v1\"\n\t\"github.com/google/go-containerregistry/pkg/v1/tarball\"\n\t\"github.com/pkg/errors\"\n)\n\ntype Descriptor interface {\n\tAPI() *api.Version\n\tInfo() ModuleInfo\n\tOrder() Order\n\tStacks() []Stack\n\tTargets() []Target\n}\n\nfunc LayerDiffID(layerTarPath string) (v1.Hash, error) {\n\tfh, err := os.Open(filepath.Clean(layerTarPath))\n\tif err != nil {\n\t\treturn v1.Hash{}, errors.Wrap(err, \"opening tar file\")\n\t}\n\tdefer fh.Close()\n\n\tlayer, err := tarball.LayerFromFile(layerTarPath)\n\tif err != nil {\n\t\treturn v1.Hash{}, errors.Wrap(err, \"reading layer tar\")\n\t}\n\n\thash, err := layer.DiffID()\n\tif err != nil {\n\t\treturn v1.Hash{}, errors.Wrap(err, \"generating diff id\")\n\t}\n\n\treturn hash, nil\n}\n\nfunc AddToLayersMD(layerMD ModuleLayers, descriptor Descriptor, diffID string) {\n\tinfo := descriptor.Info()\n\tif _, ok := layerMD[info.ID]; !ok {\n\t\tlayerMD[info.ID] = map[string]ModuleLayerInfo{}\n\t}\n\tlayerMD[info.ID][info.Version] = ModuleLayerInfo{\n\t\tAPI:         descriptor.API(),\n\t\tStacks:      descriptor.Stacks(),\n\t\tTargets:     descriptor.Targets(),\n\t\tOrder:       descriptor.Order(),\n\t\tLayerDiffID: diffID,\n\t\tHomepage:    info.Homepage,\n\t\tName:        info.Name,\n\t}\n}\n"
  },
  {
    "path": "pkg/image/fetcher.go",
    "content": "package image\n\nimport (\n\t\"context\"\n\t\"encoding/base64\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"strings\"\n\n\t\"github.com/buildpacks/imgutil/layout\"\n\t\"github.com/buildpacks/imgutil/layout/sparse\"\n\tcerrdefs \"github.com/containerd/errdefs\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/local\"\n\t\"github.com/buildpacks/imgutil/remote\"\n\t\"github.com/buildpacks/lifecycle/auth\"\n\t\"github.com/docker/docker/pkg/jsonmessage\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/moby/moby/client\"\n\tocispec \"github.com/opencontainers/image-spec/specs-go/v1\"\n\t\"github.com/pkg/errors\"\n\n\tpname \"github.com/buildpacks/pack/internal/name\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/internal/term\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n)\n\n// FetcherOption is a type of function that mutate settings on the client.\n// Values in these functions are set through currying.\ntype FetcherOption func(c *Fetcher)\n\ntype LayoutOption struct {\n\tPath   string\n\tSparse bool\n}\n\n// WithRegistryMirrors supply your own mirrors for registry.\nfunc WithRegistryMirrors(registryMirrors map[string]string) FetcherOption {\n\treturn func(c *Fetcher) {\n\t\tc.registryMirrors = registryMirrors\n\t}\n}\n\nfunc WithKeychain(keychain authn.Keychain) FetcherOption {\n\treturn func(c *Fetcher) {\n\t\tc.keychain = keychain\n\t}\n}\n\ntype DockerClient interface {\n\tlocal.DockerClient\n\tImagePull(ctx context.Context, ref string, options client.ImagePullOptions) (client.ImagePullResponse, error)\n}\n\ntype Fetcher struct {\n\tdocker          DockerClient\n\tlogger          logging.Logger\n\tregistryMirrors map[string]string\n\tkeychain        authn.Keychain\n}\n\ntype FetchOptions struct {\n\tDaemon             bool\n\tTarget             *dist.Target\n\tPullPolicy         PullPolicy\n\tLayoutOption       
LayoutOption\n\tInsecureRegistries []string\n}\n\nfunc NewFetcher(logger logging.Logger, docker DockerClient, opts ...FetcherOption) *Fetcher {\n\tfetcher := &Fetcher{\n\t\tlogger:   logger,\n\t\tdocker:   docker,\n\t\tkeychain: authn.DefaultKeychain,\n\t}\n\n\tfor _, opt := range opts {\n\t\topt(fetcher)\n\t}\n\n\treturn fetcher\n}\n\nvar ErrNotFound = errors.New(\"not found\")\n\nfunc (f *Fetcher) Fetch(ctx context.Context, name string, options FetchOptions) (imgutil.Image, error) {\n\tname, err := pname.TranslateRegistry(name, f.registryMirrors, f.logger)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif (options.LayoutOption != LayoutOption{}) {\n\t\treturn f.fetchLayoutImage(name, options.LayoutOption)\n\t}\n\n\tif !options.Daemon {\n\t\treturn f.fetchRemoteImage(name, options.Target, options.InsecureRegistries)\n\t}\n\n\tswitch options.PullPolicy {\n\tcase PullNever:\n\t\timg, err := f.fetchDaemonImage(name)\n\t\treturn img, err\n\tcase PullIfNotPresent:\n\t\timg, err := f.fetchDaemonImage(name)\n\t\tif err == nil || !errors.Is(err, ErrNotFound) {\n\t\t\treturn img, err\n\t\t}\n\t}\n\n\tmsg := fmt.Sprintf(\"Pulling image %s\", style.Symbol(name))\n\tif options.Target != nil {\n\t\tmsg = fmt.Sprintf(\"Pulling image %s with platform %s\", style.Symbol(name), style.Symbol(options.Target.ValuesAsPlatform()))\n\t}\n\tf.logger.Debug(msg)\n\tif err = f.pullImage(ctx, name, options.Target); err != nil {\n\t\t// FIXME: this matching is brittle and the fallback should be removed when https://github.com/buildpacks/pack/issues/2079\n\t\t// has been fixed for a sufficient amount of time.\n\t\t// Sample error from docker engine:\n\t\t// `image with reference <image> was found but does not match the specified platform: wanted linux/amd64, actual: linux` or\n\t\t// `image with reference <image> was found but its platform (linux) does not match the specified platform (linux/amd64)`\n\t\tif strings.Contains(err.Error(), \"does not match the specified platform\") 
{\n\t\t\tf.logger.Debugf(fmt.Sprintf(\"Pulling image %s\", style.Symbol(name)))\n\t\t\terr = f.pullImage(ctx, name, nil)\n\t\t}\n\t}\n\tif err != nil && !errors.Is(err, ErrNotFound) {\n\t\treturn nil, err\n\t}\n\n\treturn f.fetchDaemonImage(name)\n}\n\nfunc (f *Fetcher) CheckReadAccess(repo string, options FetchOptions) bool {\n\tif !options.Daemon || options.PullPolicy == PullAlways {\n\t\treturn f.checkRemoteReadAccess(repo)\n\t}\n\tif _, err := f.fetchDaemonImage(repo); err != nil {\n\t\tif errors.Is(err, ErrNotFound) {\n\t\t\t// Image doesn't exist in the daemon\n\t\t\t// \tPull Never: should fail\n\t\t\t// \tPull If Not Present: need to check the registry\n\t\t\tif options.PullPolicy == PullNever {\n\t\t\t\treturn false\n\t\t\t}\n\t\t\treturn f.checkRemoteReadAccess(repo)\n\t\t}\n\t\tf.logger.Debugf(\"failed reading image '%s' from the daemon, error: %s\", repo, err.Error())\n\t\treturn false\n\t}\n\treturn true\n}\n\nfunc (f *Fetcher) checkRemoteReadAccess(repo string) bool {\n\timg, err := remote.NewImage(repo, f.keychain)\n\tif err != nil {\n\t\tf.logger.Debugf(\"failed accessing remote image %s, error: %s\", repo, err.Error())\n\t\treturn false\n\t}\n\tif ok, err := img.CheckReadAccess(); ok {\n\t\tf.logger.Debugf(\"CheckReadAccess succeeded for the run image %s\", repo)\n\t\treturn true\n\t} else {\n\t\tf.logger.Debugf(\"CheckReadAccess failed for the run image %s, error: %s\", repo, err.Error())\n\t\treturn false\n\t}\n}\n\nfunc (f *Fetcher) fetchDaemonImage(name string) (imgutil.Image, error) {\n\timage, err := local.NewImage(name, f.docker, local.FromBaseImage(name))\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif !image.Found() {\n\t\treturn nil, errors.Wrapf(ErrNotFound, \"image %s does not exist on the daemon\", style.Symbol(name))\n\t}\n\n\treturn image, nil\n}\n\nfunc (f *Fetcher) fetchRemoteImage(name string, target *dist.Target, insecureRegistries []string) (imgutil.Image, error) {\n\tvar (\n\t\timage   imgutil.Image\n\t\toptions 
[]imgutil.ImageOption\n\t\terr     error\n\t)\n\n\tif len(insecureRegistries) > 0 {\n\t\tfor _, registry := range insecureRegistries {\n\t\t\toptions = append(options, remote.WithRegistrySetting(registry, true))\n\t\t}\n\t}\n\n\tif target == nil {\n\t\timage, err = remote.NewImage(name, f.keychain, append(options, remote.FromBaseImage(name))...)\n\t} else {\n\t\tplatform := imgutil.Platform{OS: target.OS, Architecture: target.Arch, Variant: target.ArchVariant}\n\t\timage, err = remote.NewImage(name, f.keychain, append(append(options, remote.FromBaseImage(name)), remote.WithDefaultPlatform(platform))...)\n\t}\n\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif !image.Found() {\n\t\treturn nil, errors.Wrapf(ErrNotFound, \"image %s does not exist in registry\", style.Symbol(name))\n\t}\n\n\treturn image, nil\n}\n\nfunc (f *Fetcher) fetchLayoutImage(name string, options LayoutOption) (imgutil.Image, error) {\n\tvar (\n\t\timage imgutil.Image\n\t\terr   error\n\t)\n\n\tv1Image, err := remote.NewV1Image(name, f.keychain)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tif options.Sparse {\n\t\timage, err = sparse.NewImage(options.Path, v1Image)\n\t} else {\n\t\timage, err = layout.NewImage(options.Path, layout.FromBaseImageInstance(v1Image))\n\t}\n\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\terr = image.Save()\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\treturn image, nil\n}\n\n// FetchForPlatform fetches an image and resolves it to a platform-specific digest before fetching.\n// This ensures that multi-platform images are always resolved to the correct platform-specific manifest.\nfunc (f *Fetcher) FetchForPlatform(ctx context.Context, name string, options FetchOptions) (imgutil.Image, error) {\n\t// If no target is specified, fall back to regular fetch\n\tif options.Target == nil {\n\t\treturn f.Fetch(ctx, name, options)\n\t}\n\n\tname, err := pname.TranslateRegistry(name, f.registryMirrors, f.logger)\n\tif err != nil {\n\t\treturn nil, 
err\n\t}\n\n\tplatformStr := options.Target.ValuesAsPlatform()\n\n\t// When PullPolicy is PullNever, skip platform-specific digest resolution as it requires\n\t// network access to fetch the manifest list. Instead, use the image as-is from the daemon.\n\t// Note: This may cause issues with containerd storage. Users should pre-pull the platform-specific\n\t// digest if they encounter errors.\n\tif options.Daemon && options.PullPolicy == PullNever {\n\t\tf.logger.Debugf(\"Using lifecycle %s with platform %s (skipping digest resolution due to --pull-policy never)\", name, platformStr)\n\t\treturn f.Fetch(ctx, name, options)\n\t}\n\n\t// Build platform and registry settings from options\n\tplatform := imgutil.Platform{\n\t\tOS:           options.Target.OS,\n\t\tArchitecture: options.Target.Arch,\n\t\tVariant:      options.Target.ArchVariant,\n\t}\n\tregistrySettings := make(map[string]imgutil.RegistrySetting)\n\tfor _, registry := range options.InsecureRegistries {\n\t\tregistrySettings[registry] = imgutil.RegistrySetting{Insecure: true}\n\t}\n\n\t// Resolve to platform-specific digest\n\tresolvedName, err := resolvePlatformSpecificDigest(name, &platform, f.keychain, registrySettings)\n\tif err != nil {\n\t\treturn nil, errors.Wrapf(err, \"resolving image %s to platform-specific digest\", style.Symbol(name))\n\t}\n\n\t// Log the resolution for visibility\n\tf.logger.Debugf(\"Using lifecycle %s; pulling digest %s for platform %s\", name, resolvedName, platformStr)\n\n\treturn f.Fetch(ctx, resolvedName, options)\n}\n\nfunc (f *Fetcher) pullImage(ctx context.Context, imageID string, target *dist.Target) error {\n\tregAuth, err := f.registryAuth(imageID)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tpullOpts := client.ImagePullOptions{RegistryAuth: regAuth}\n\tif target != nil {\n\t\tpullOpts.Platforms = []ocispec.Platform{{\n\t\t\tOS:           target.OS,\n\t\t\tArchitecture: target.Arch,\n\t\t\tVariant:      target.ArchVariant,\n\t\t}}\n\t}\n\tpullResult, err := 
f.docker.ImagePull(ctx, imageID, pullOpts)\n\tif err != nil {\n\t\tif cerrdefs.IsNotFound(err) {\n\t\t\treturn errors.Wrapf(ErrNotFound, \"image %s does not exist on the daemon\", style.Symbol(imageID))\n\t\t}\n\n\t\treturn err\n\t}\n\n\twriter := logging.GetWriterForLevel(f.logger, logging.InfoLevel)\n\ttermFd, isTerm := term.IsTerminal(writer)\n\n\terr = jsonmessage.DisplayJSONMessagesStream(pullResult, &colorizedWriter{writer}, termFd, isTerm, nil)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\treturn pullResult.Close()\n}\n\nfunc (f *Fetcher) registryAuth(ref string) (string, error) {\n\t_, a, err := auth.ReferenceForRepoName(f.keychain, ref)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"resolve auth for ref %s\", ref)\n\t}\n\tauthConfig, err := a.Authorization()\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\tdataJSON, err := json.Marshal(authConfig)\n\tif err != nil {\n\t\treturn \"\", err\n\t}\n\n\treturn base64.StdEncoding.EncodeToString(dataJSON), nil\n}\n\ntype colorizedWriter struct {\n\twriter io.Writer\n}\n\ntype colorFunc = func(string, ...interface{}) string\n\nfunc (w *colorizedWriter) Write(p []byte) (n int, err error) {\n\tmsg := string(p)\n\tcolorizers := map[string]colorFunc{\n\t\t\"Waiting\":           style.Waiting,\n\t\t\"Pulling fs layer\":  style.Waiting,\n\t\t\"Downloading\":       style.Working,\n\t\t\"Download complete\": style.Working,\n\t\t\"Extracting\":        style.Working,\n\t\t\"Pull complete\":     style.Complete,\n\t\t\"Already exists\":    style.Complete,\n\t\t\"=\":                 style.ProgressBar,\n\t\t\">\":                 style.ProgressBar,\n\t}\n\tfor pattern, colorize := range colorizers {\n\t\tmsg = strings.ReplaceAll(msg, pattern, colorize(pattern))\n\t}\n\treturn w.writer.Write([]byte(msg))\n}\n\n// WrapDockerClient wraps a moby docker client to match our DockerClient interface\n"
  },
  {
    "path": "pkg/image/fetcher_test.go",
    "content": "package image_test\n\nimport (\n\t\"bytes\"\n\t\"context\"\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"runtime\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/local\"\n\t\"github.com/buildpacks/imgutil/remote\"\n\t\"github.com/golang/mock/gomock\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/heroku/color\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/dist\"\n\t\"github.com/buildpacks/pack/pkg/image\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/testmocks\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nvar docker *client.Client\nvar registryConfig *h.TestRegistryConfig\n\nfunc TestFetcher(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\th.RequireDocker(t)\n\n\tregistryConfig = h.RunRegistry(t)\n\tdefer registryConfig.StopRegistry(t)\n\n\t// TODO: is there a better solution to the auth problem?\n\tos.Setenv(\"DOCKER_CONFIG\", registryConfig.DockerConfigDir)\n\n\tvar err error\n\tdocker, err = client.New(client.FromEnv)\n\th.AssertNil(t, err)\n\tspec.Run(t, \"Fetcher\", testFetcher, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testFetcher(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\timageFetcher *image.Fetcher\n\t\trepoName     string\n\t\trepo         string\n\t\toutBuf       bytes.Buffer\n\t\tosType       string\n\t)\n\n\tit.Before(func() {\n\t\trepo = \"some-org/\" + h.RandString(10)\n\t\trepoName = registryConfig.RepoName(repo)\n\t\timageFetcher = image.NewFetcher(logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose()), docker)\n\n\t\tinfoResult, err := docker.Info(context.TODO(), client.InfoOptions{})\n\t\th.AssertNil(t, err)\n\t\tosType = infoResult.Info.OSType\n\t})\n\n\twhen(\"#Fetch\", func() {\n\t\twhen(\"daemon is false\", func() 
{\n\t\t\twhen(\"PullAlways\", func() {\n\t\t\t\twhen(\"there is a remote image\", func() {\n\t\t\t\t\twhen(\"default platform\", func() {\n\t\t\t\t\t\t// default is linux/runtime.GOARCH\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns the remote image\", func() {\n\t\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns the remote image when insecure registry\", func() {\n\t\t\t\t\t\t\tinsecureRegistry := fmt.Sprintf(\"%s:%s\", registryConfig.RunRegistryHost, registryConfig.RunRegistryPort)\n\t\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways, InsecureRegistries: []string{insecureRegistry}})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"platform with variant and version\", func() {\n\t\t\t\t\t\tvar target dist.Target\n\n\t\t\t\t\t\t// default is linux/runtime.GOARCH\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain, remote.WithDefaultPlatform(imgutil.Platform{\n\t\t\t\t\t\t\t\tOS:           runtime.GOOS,\n\t\t\t\t\t\t\t\tArchitecture: runtime.GOARCH,\n\t\t\t\t\t\t\t\tVariant:      \"v1\",\n\t\t\t\t\t\t\t\tOSVersion:    \"my-version\",\n\t\t\t\t\t\t\t}))\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns the remote image\", func() {\n\t\t\t\t\t\t\ttarget = dist.Target{\n\t\t\t\t\t\t\t\tOS:          runtime.GOOS,\n\t\t\t\t\t\t\t\tArch:        runtime.GOARCH,\n\t\t\t\t\t\t\t\tArchVariant: \"v1\",\n\t\t\t\t\t\t\t\tDistributions: []dist.Distribution{\n\t\t\t\t\t\t\t\t\t{Name: \"some-name\", 
Version: \"my-version\"},\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t}\n\n\t\t\t\t\t\t\timg, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways, Target: &target})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\tvariant, err := img.Variant()\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertEq(t, variant, \"v1\")\n\n\t\t\t\t\t\t\tosVersion, err := img.OSVersion()\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertEq(t, osVersion, \"my-version\")\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"there is no remote image\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: false, PullPolicy: image.PullAlways})\n\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"image '%s' does not exist in registry\", repoName))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"PullIfNotPresent\", func() {\n\t\t\t\twhen(\"there is a remote image\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the remote image\", func() {\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: false, PullPolicy: image.PullIfNotPresent})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"there is no remote image\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: false, PullPolicy: image.PullIfNotPresent})\n\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"image '%s' does not exist in registry\", repoName))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"daemon is true\", func() {\n\t\t\twhen(\"PullNever\", func() {\n\t\t\t\twhen(\"there is a local image\", func() 
{\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// Make sure the repoName is not a valid remote repo.\n\t\t\t\t\t\t// This is to verify that no remote check is made\n\t\t\t\t\t\t// when there's a valid local image.\n\t\t\t\t\t\trepoName = \"invalidhost\" + repoName\n\n\t\t\t\t\t\timg, err := local.NewImage(repoName, docker)\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.DockerRmi(docker, repoName)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the local image\", func() {\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullNever})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"there is no local image\", func() {\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullNever})\n\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"image '%s' does not exist on the daemon\", repoName))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"PullAlways\", func() {\n\t\t\t\twhen(\"there is a remote image\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\tlogger *logging.LogWithWriters\n\t\t\t\t\t\toutput func() string\n\t\t\t\t\t)\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// Instantiate a pull-able local image\n\t\t\t\t\t\t// as opposed to a remote image so that the image\n\t\t\t\t\t\t// is created with the OS of the docker daemon\n\t\t\t\t\t\timg, err := local.NewImage(repoName, docker)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tdefer h.DockerRmi(docker, repoName)\n\n\t\t\t\t\t\th.AssertNil(t, img.Save())\n\n\t\t\t\t\t\th.AssertNil(t, h.PushImage(docker, img.Name(), registryConfig))\n\n\t\t\t\t\t\tvar outCons *color.Console\n\t\t\t\t\t\toutCons, output = h.MockWriterAndOutput()\n\t\t\t\t\t\tlogger = logging.NewLogWithWriters(outCons, outCons)\n\t\t\t\t\t\timageFetcher = image.NewFetcher(logger, 
docker)\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.DockerRmi(docker, repoName)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"pulls the image and returns the local copy\", func() {\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNotEq(t, output(), \"\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"doesn't log anything in quiet mode\", func() {\n\t\t\t\t\t\tlogger.WantQuiet(true)\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, output(), \"\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"there is no remote image\", func() {\n\t\t\t\t\twhen(\"there is a local image\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\timg, err := local.NewImage(repoName, docker)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\th.DockerRmi(docker, repoName)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns the local image\", func() {\n\t\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"there is no local image\", func() {\n\t\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways})\n\t\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"image '%s' does not exist on the daemon\", repoName))\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"image platform is specified\", func() {\n\t\t\t\t\tit(\"passes the platform argument to the daemon\", func() {\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, 
PullPolicy: image.PullAlways, Target: &dist.Target{OS: \"some-unsupported-platform\"}})\n\t\t\t\t\t\th.AssertError(t, err, \"unknown operating system or architecture\")\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"remote platform does not match\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain, remote.WithDefaultPlatform(imgutil.Platform{OS: osType, Architecture: \"\"}))\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"retries without setting platform\", func() {\n\t\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullAlways, Target: &dist.Target{OS: osType, Arch: runtime.GOARCH}})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"PullIfNotPresent\", func() {\n\t\t\t\twhen(\"there is a remote image\", func() {\n\t\t\t\t\tvar (\n\t\t\t\t\t\tlabel          = \"label\"\n\t\t\t\t\t\tremoteImgLabel string\n\t\t\t\t\t)\n\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// Instantiate a pull-able local image\n\t\t\t\t\t\t// as opposed to a remote image so that the image\n\t\t\t\t\t\t// is created with the OS of the docker daemon\n\t\t\t\t\t\tremoteImg, err := local.NewImage(repoName, docker)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tdefer h.DockerRmi(docker, repoName)\n\n\t\t\t\t\t\th.AssertNil(t, remoteImg.SetLabel(label, \"1\"))\n\t\t\t\t\t\th.AssertNil(t, remoteImg.Save())\n\n\t\t\t\t\t\th.AssertNil(t, h.PushImage(docker, remoteImg.Name(), registryConfig))\n\n\t\t\t\t\t\tremoteImgLabel, err = remoteImg.Label(label)\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t})\n\n\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\th.DockerRmi(docker, repoName)\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"there is a local image\", func() {\n\t\t\t\t\t\tvar localImgLabel string\n\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\tlocalImg, err := local.NewImage(repoName, 
docker)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertNil(t, localImg.SetLabel(label, \"2\"))\n\n\t\t\t\t\t\t\th.AssertNil(t, localImg.Save())\n\n\t\t\t\t\t\t\tlocalImgLabel, err = localImg.Label(label)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\th.DockerRmi(docker, repoName)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns the local image\", func() {\n\t\t\t\t\t\t\tfetchedImg, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullIfNotPresent})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertNotContains(t, outBuf.String(), \"Pulling image\")\n\n\t\t\t\t\t\t\tfetchedImgLabel, err := fetchedImg.Label(label)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertEq(t, fetchedImgLabel, localImgLabel)\n\t\t\t\t\t\t\th.AssertNotEq(t, fetchedImgLabel, remoteImgLabel)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"there is no local image\", func() {\n\t\t\t\t\t\tit(\"returns the remote image\", func() {\n\t\t\t\t\t\t\tfetchedImg, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullIfNotPresent})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\tfetchedImgLabel, err := fetchedImg.Label(label)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t\th.AssertEq(t, fetchedImgLabel, remoteImgLabel)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"there is no remote image\", func() {\n\t\t\t\t\twhen(\"there is a local image\", func() {\n\t\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t\timg, err := local.NewImage(repoName, docker)\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit.After(func() {\n\t\t\t\t\t\t\th.DockerRmi(docker, repoName)\n\t\t\t\t\t\t})\n\n\t\t\t\t\t\tit(\"returns the local image\", func() {\n\t\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: 
image.PullIfNotPresent})\n\t\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\twhen(\"there is no local image\", func() {\n\t\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullIfNotPresent})\n\t\t\t\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"image '%s' does not exist on the daemon\", repoName))\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"image platform is specified\", func() {\n\t\t\t\t\tit(\"passes the platform argument to the daemon\", func() {\n\t\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{Daemon: true, PullPolicy: image.PullIfNotPresent, Target: &dist.Target{OS: \"some-unsupported-platform\"}})\n\t\t\t\t\t\th.AssertError(t, err, \"unknown operating system or architecture\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"layout option is provided\", func() {\n\t\t\tvar (\n\t\t\t\tlayoutOption image.LayoutOption\n\t\t\t\timagePath    string\n\t\t\t\ttmpDir       string\n\t\t\t\terr          error\n\t\t\t)\n\n\t\t\tit.Before(func() {\n\t\t\t\t// set up local layout repo\n\t\t\t\ttmpDir, err = os.MkdirTemp(\"\", \"pack.fetcher.test\")\n\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t// dummy layer to validate sparse behavior\n\t\t\t\ttarDir := filepath.Join(tmpDir, \"layer\")\n\t\t\t\terr = os.MkdirAll(tarDir, os.ModePerm)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\tlayerPath := h.CreateTAR(t, tarDir, \".\", -1)\n\n\t\t\t\t// set up the remote image to be used\n\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, img.AddLayer(layerPath))\n\t\t\t\th.AssertNil(t, img.Save())\n\n\t\t\t\t// set up layout options for the tests\n\t\t\t\timagePath = filepath.Join(tmpDir, repo)\n\t\t\t\tlayoutOption = image.LayoutOption{\n\t\t\t\t\tPath:   imagePath,\n\t\t\t\t\tSparse: false,\n\t\t\t\t}\n\t\t\t})\n\n\t\t\tit.After(func() 
{\n\t\t\t\terr = os.RemoveAll(tmpDir)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\n\t\t\twhen(\"sparse is false\", func() {\n\t\t\t\tit(\"returns a layout image on disk\", func() {\n\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{LayoutOption: layoutOption})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t// all layers were written\n\t\t\t\t\th.AssertBlobsLen(t, imagePath, 3)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"sparse is true\", func() {\n\t\t\t\tit(\"returns a layout image on disk\", func() {\n\t\t\t\t\tlayoutOption.Sparse = true\n\t\t\t\t\t_, err := imageFetcher.Fetch(context.TODO(), repoName, image.FetchOptions{LayoutOption: layoutOption})\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t// only the manifest and config were written\n\t\t\t\t\th.AssertBlobsLen(t, imagePath, 2)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#FetchForPlatform\", func() {\n\t\twhen(\"target is nil\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t})\n\n\t\t\tit(\"delegates to regular Fetch method\", func() {\n\t\t\t\tfetchedImg, err := imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\tDaemon:     false,\n\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\tTarget:     nil,\n\t\t\t\t})\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotNil(t, fetchedImg)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"target is specified\", func() {\n\t\t\twhen(\"multi-platform image\", func() {\n\t\t\t\twhen(\"matching platform exists\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// Create a multi-platform image by creating an index\n\t\t\t\t\t\t// For testing purposes, we'll create a single-platform image with the current architecture\n\t\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain, remote.WithDefaultPlatform(imgutil.Platform{\n\t\t\t\t\t\t\tOS:           
runtime.GOOS,\n\t\t\t\t\t\t\tArchitecture: runtime.GOARCH,\n\t\t\t\t\t\t}))\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"successfully fetches the platform-specific image\", func() {\n\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\tOS:   runtime.GOOS,\n\t\t\t\t\t\t\tArch: runtime.GOARCH,\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tfetchedImg, err := imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t\tTarget:     &target,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNotNil(t, fetchedImg)\n\n\t\t\t\t\t\t// Verify the platform matches\n\t\t\t\t\t\tos, err := fetchedImg.OS()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, os, runtime.GOOS)\n\n\t\t\t\t\t\tarch, err := fetchedImg.Architecture()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, arch, runtime.GOARCH)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"true manifest list with multiple platforms\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// Create a random image index with platform annotations\n\t\t\t\t\t\th.SetUpRandomRemoteIndexWithPlatforms(t, repoName, []struct{ OS, Arch string }{\n\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"amd64\"},\n\t\t\t\t\t\t\t{OS: \"linux\", Arch: \"arm64\"},\n\t\t\t\t\t\t})\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"resolves to the correct platform-specific digest for amd64\", func() {\n\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\tOS:   \"linux\",\n\t\t\t\t\t\t\tArch: \"amd64\",\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tfetchedImg, err := imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t\tTarget:     &target,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNotNil(t, fetchedImg)\n\n\t\t\t\t\t\t// Verify the platform matches\n\t\t\t\t\t\tarch, err := 
fetchedImg.Architecture()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, arch, \"amd64\")\n\n\t\t\t\t\t\tos, err := fetchedImg.OS()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, os, \"linux\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"resolves to the correct platform-specific digest for arm64\", func() {\n\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\tOS:   \"linux\",\n\t\t\t\t\t\t\tArch: \"arm64\",\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tfetchedImg, err := imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t\tTarget:     &target,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNotNil(t, fetchedImg)\n\n\t\t\t\t\t\t// Verify the platform matches\n\t\t\t\t\t\tarch, err := fetchedImg.Architecture()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, arch, \"arm64\")\n\n\t\t\t\t\t\tos, err := fetchedImg.OS()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertEq(t, os, \"linux\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"matching platform does not exist\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\t// Create an image with a specific platform\n\t\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain, remote.WithDefaultPlatform(imgutil.Platform{\n\t\t\t\t\t\t\tOS:           runtime.GOOS,\n\t\t\t\t\t\t\tArchitecture: runtime.GOARCH,\n\t\t\t\t\t\t}))\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns an error\", func() {\n\t\t\t\t\t\t// Request a different platform that doesn't exist\n\t\t\t\t\t\tdifferentArch := \"nonexistent-arch\"\n\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\tOS:   runtime.GOOS,\n\t\t\t\t\t\t\tArch: differentArch,\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\t_, err := imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\t\tPullPolicy: 
image.PullAlways,\n\t\t\t\t\t\t\tTarget:     &target,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertError(t, err, \"does not match requested platform\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"single-platform image\", func() {\n\t\t\t\twhen(\"platform matches\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain, remote.WithDefaultPlatform(imgutil.Platform{\n\t\t\t\t\t\t\tOS:           runtime.GOOS,\n\t\t\t\t\t\t\tArchitecture: runtime.GOARCH,\n\t\t\t\t\t\t}))\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"successfully fetches the image\", func() {\n\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\tOS:   runtime.GOOS,\n\t\t\t\t\t\t\tArch: runtime.GOARCH,\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\tfetchedImg, err := imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\t\tTarget:     &target,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNotNil(t, fetchedImg)\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"platform does not match\", func() {\n\t\t\t\t\tit.Before(func() {\n\t\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain, remote.WithDefaultPlatform(imgutil.Platform{\n\t\t\t\t\t\t\tOS:           runtime.GOOS,\n\t\t\t\t\t\t\tArchitecture: runtime.GOARCH,\n\t\t\t\t\t\t}))\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns a platform mismatch error\", func() {\n\t\t\t\t\t\t// Use a different OS to ensure mismatch\n\t\t\t\t\t\tdifferentOS := \"nonexistent-os\"\n\t\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\t\tOS:   differentOS,\n\t\t\t\t\t\t\tArch: runtime.GOARCH,\n\t\t\t\t\t\t}\n\n\t\t\t\t\t\t_, err := imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\t\tPullPolicy: 
image.PullAlways,\n\t\t\t\t\t\t\tTarget:     &target,\n\t\t\t\t\t\t})\n\t\t\t\t\t\th.AssertError(t, err, \"does not match requested platform\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"with insecure registries\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain, remote.WithDefaultPlatform(imgutil.Platform{\n\t\t\t\t\t\tOS:           runtime.GOOS,\n\t\t\t\t\t\tArchitecture: runtime.GOARCH,\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t})\n\n\t\t\t\tit(\"successfully fetches using insecure registry settings\", func() {\n\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\tOS:   runtime.GOOS,\n\t\t\t\t\t\tArch: runtime.GOARCH,\n\t\t\t\t\t}\n\t\t\t\t\tinsecureRegistry := fmt.Sprintf(\"%s:%s\", registryConfig.RunRegistryHost, registryConfig.RunRegistryPort)\n\n\t\t\t\t\tfetchedImg, err := imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\t\tDaemon:             false,\n\t\t\t\t\t\tPullPolicy:         image.PullAlways,\n\t\t\t\t\t\tTarget:             &target,\n\t\t\t\t\t\tInsecureRegistries: []string{insecureRegistry},\n\t\t\t\t\t})\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNotNil(t, fetchedImg)\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"with platform variant\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain, remote.WithDefaultPlatform(imgutil.Platform{\n\t\t\t\t\t\tOS:           runtime.GOOS,\n\t\t\t\t\t\tArchitecture: runtime.GOARCH,\n\t\t\t\t\t\tVariant:      \"v7\",\n\t\t\t\t\t}))\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t})\n\n\t\t\t\tit(\"successfully fetches the image with matching variant\", func() {\n\t\t\t\t\ttarget := dist.Target{\n\t\t\t\t\t\tOS:          runtime.GOOS,\n\t\t\t\t\t\tArch:        runtime.GOARCH,\n\t\t\t\t\t\tArchVariant: \"v7\",\n\t\t\t\t\t}\n\n\t\t\t\t\tfetchedImg, err := 
imageFetcher.FetchForPlatform(context.TODO(), repoName, image.FetchOptions{\n\t\t\t\t\t\tDaemon:     false,\n\t\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\t\tTarget:     &target,\n\t\t\t\t\t})\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNotNil(t, fetchedImg)\n\n\t\t\t\t\tvariant, err := fetchedImg.Variant()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertEq(t, variant, \"v7\")\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"image does not exist\", func() {\n\t\t\tit(\"returns an error\", func() {\n\t\t\t\ttarget := dist.Target{\n\t\t\t\t\tOS:   runtime.GOOS,\n\t\t\t\t\tArch: runtime.GOARCH,\n\t\t\t\t}\n\n\t\t\t\tnonExistentImage := registryConfig.RepoName(\"nonexistent/\" + h.RandString(10))\n\t\t\t\t_, err := imageFetcher.FetchForPlatform(context.TODO(), nonExistentImage, image.FetchOptions{\n\t\t\t\t\tDaemon:     false,\n\t\t\t\t\tPullPolicy: image.PullAlways,\n\t\t\t\t\tTarget:     &target,\n\t\t\t\t})\n\t\t\t\th.AssertError(t, err, \"\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"pull policy is PullNever with daemon\", func() {\n\t\t\tvar localImageName string\n\n\t\t\tit.Before(func() {\n\t\t\t\t// Use a different name for the local image to avoid conflicts\n\t\t\t\tlocalImageName = \"pack.local/test-\" + h.RandString(10)\n\n\t\t\t\t// Create a local daemon image with platform information\n\t\t\t\t// Use osType (daemon OS) instead of runtime.GOOS to handle cases where\n\t\t\t\t// Windows runner is running Linux containers\n\t\t\t\timg, err := local.NewImage(localImageName, docker, local.WithDefaultPlatform(imgutil.Platform{\n\t\t\t\t\tOS:           osType,\n\t\t\t\t\tArchitecture: runtime.GOARCH,\n\t\t\t\t}))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t})\n\n\t\t\tit.After(func() {\n\t\t\t\th.DockerRmi(docker, localImageName)\n\t\t\t})\n\n\t\t\tit(\"skips platform-specific digest resolution and uses tag directly\", func() {\n\t\t\t\ttarget := dist.Target{\n\t\t\t\t\tOS:   osType,\n\t\t\t\t\tArch: 
runtime.GOARCH,\n\t\t\t\t}\n\n\t\t\t\tfetchedImg, err := imageFetcher.FetchForPlatform(context.TODO(), localImageName, image.FetchOptions{\n\t\t\t\t\tDaemon:     true,\n\t\t\t\t\tPullPolicy: image.PullNever,\n\t\t\t\t\tTarget:     &target,\n\t\t\t\t})\n\n\t\t\t\t// Should succeed without network access (digest resolution skipped)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotNil(t, fetchedImg)\n\n\t\t\t\t// Verify debug message about skipping digest resolution\n\t\t\t\th.AssertContains(t, outBuf.String(), \"skipping digest resolution due to --pull-policy never\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#CheckReadAccess\", func() {\n\t\tvar daemon bool\n\n\t\twhen(\"Daemon is true\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tdaemon = true\n\t\t\t})\n\n\t\t\twhen(\"an error is thrown by the daemon\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\tmockController := gomock.NewController(t)\n\t\t\t\t\tmockDockerClient := testmocks.NewMockAPIClient(mockController)\n\t\t\t\t\tmockDockerClient.EXPECT().ServerVersion(gomock.Any(), gomock.Any()).Return(client.ServerVersionResult{}, errors.New(\"something wrong happened\"))\n\t\t\t\t\timageFetcher = image.NewFetcher(logging.NewLogWithWriters(&outBuf, &outBuf, logging.WithVerbose()), mockDockerClient)\n\t\t\t\t})\n\t\t\t\twhen(\"PullNever\", func() {\n\t\t\t\t\tit(\"read access must be false\", func() {\n\t\t\t\t\t\th.AssertFalse(t, imageFetcher.CheckReadAccess(\"pack.test/dummy\", image.FetchOptions{Daemon: daemon, PullPolicy: image.PullNever}))\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"failed reading image 'pack.test/dummy' from the daemon\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"PullIfNotPresent\", func() {\n\t\t\t\t\tit(\"read access must be false\", func() {\n\t\t\t\t\t\th.AssertFalse(t, imageFetcher.CheckReadAccess(\"pack.test/dummy\", image.FetchOptions{Daemon: daemon, PullPolicy: image.PullIfNotPresent}))\n\t\t\t\t\t\th.AssertContains(t, outBuf.String(), \"failed reading image 'pack.test/dummy' from 
the daemon\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"image exists only in the daemon\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\timg, err := local.NewImage(\"pack.test/dummy\", docker)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t})\n\t\t\t\twhen(\"PullAlways\", func() {\n\t\t\t\t\tit(\"read access must be false\", func() {\n\t\t\t\t\t\th.AssertFalse(t, imageFetcher.CheckReadAccess(\"pack.test/dummy\", image.FetchOptions{Daemon: daemon, PullPolicy: image.PullAlways}))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"PullNever\", func() {\n\t\t\t\t\tit(\"read access must be true\", func() {\n\t\t\t\t\t\th.AssertTrue(t, imageFetcher.CheckReadAccess(\"pack.test/dummy\", image.FetchOptions{Daemon: daemon, PullPolicy: image.PullNever}))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"PullIfNotPresent\", func() {\n\t\t\t\t\tit(\"read access must be true\", func() {\n\t\t\t\t\t\th.AssertTrue(t, imageFetcher.CheckReadAccess(\"pack.test/dummy\", image.FetchOptions{Daemon: daemon, PullPolicy: image.PullIfNotPresent}))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"image doesn't exist in the daemon but in remote\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t})\n\t\t\t\twhen(\"PullAlways\", func() {\n\t\t\t\t\tit(\"read access must be true\", func() {\n\t\t\t\t\t\th.AssertTrue(t, imageFetcher.CheckReadAccess(repoName, image.FetchOptions{Daemon: daemon, PullPolicy: image.PullAlways}))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"PullNever\", func() {\n\t\t\t\t\tit(\"read access must be false\", func() {\n\t\t\t\t\t\th.AssertFalse(t, imageFetcher.CheckReadAccess(repoName, image.FetchOptions{Daemon: daemon, PullPolicy: image.PullNever}))\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"PullIfNotPresent\", func() {\n\t\t\t\t\tit(\"read access must be true\", func() {\n\t\t\t\t\t\th.AssertTrue(t, 
imageFetcher.CheckReadAccess(repoName, image.FetchOptions{Daemon: daemon, PullPolicy: image.PullIfNotPresent}))\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"Daemon is false\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tdaemon = false\n\t\t\t})\n\n\t\t\twhen(\"remote image doesn't exist\", func() {\n\t\t\t\tit(\"fails when checking dummy image\", func() {\n\t\t\t\t\th.AssertFalse(t, imageFetcher.CheckReadAccess(\"pack.test/dummy\", image.FetchOptions{Daemon: daemon}))\n\t\t\t\t\th.AssertContains(t, outBuf.String(), \"CheckReadAccess failed for the run image pack.test/dummy\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"remote image exists\", func() {\n\t\t\t\tit.Before(func() {\n\t\t\t\t\timg, err := remote.NewImage(repoName, authn.DefaultKeychain)\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\th.AssertNil(t, img.Save())\n\t\t\t\t})\n\n\t\t\t\tit(\"read access is valid\", func() {\n\t\t\t\t\th.AssertTrue(t, imageFetcher.CheckReadAccess(repoName, image.FetchOptions{Daemon: daemon}))\n\t\t\t\t\th.AssertContains(t, outBuf.String(), fmt.Sprintf(\"CheckReadAccess succeeded for the run image %s\", repoName))\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/image/platform.go",
    "content": "package image\n\nimport (\n\t\"fmt\"\n\t\"strings\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\t\"github.com/google/go-containerregistry/pkg/v1/remote\"\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n\t\"github.com/pkg/errors\"\n)\n\n// resolvePlatformSpecificDigest resolves a multi-platform image reference to a platform-specific digest.\n// If the image is a manifest list, it finds the manifest for the specified platform and returns its digest.\n// If the image is a single-platform image, it validates the platform matches and returns a digest reference.\nfunc resolvePlatformSpecificDigest(imageRef string, platform *imgutil.Platform, keychain authn.Keychain, registrySettings map[string]imgutil.RegistrySetting) (string, error) {\n\t// If platform is nil, return the reference unchanged\n\tif platform == nil {\n\t\treturn imageRef, nil\n\t}\n\n\t// Parse the reference (could be digest or tag)\n\tref, err := name.ParseReference(imageRef, name.WeakValidation)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"parsing image reference %q\", imageRef)\n\t}\n\n\t// Get registry settings for the reference\n\treg := getRegistrySetting(imageRef, registrySettings)\n\n\t// Get authentication\n\tauth, err := keychain.Resolve(ref.Context().Registry)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"resolving authentication for registry %q\", ref.Context().Registry)\n\t}\n\n\t// Fetch the descriptor\n\tdesc, err := remote.Get(ref, remote.WithAuth(auth), remote.WithTransport(imgutil.GetTransport(reg.Insecure)))\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"fetching descriptor for %q\", imageRef)\n\t}\n\n\t// Check if it's a manifest list\n\tif desc.MediaType == types.OCIImageIndex || desc.MediaType == types.DockerManifestList {\n\t\t// Get the index\n\t\tindex, err := desc.ImageIndex()\n\t\tif err != nil {\n\t\t\treturn 
\"\", errors.Wrapf(err, \"getting image index for %q\", imageRef)\n\t\t}\n\n\t\t// Get the manifest list\n\t\tmanifestList, err := index.IndexManifest()\n\t\tif err != nil {\n\t\t\treturn \"\", errors.Wrapf(err, \"getting manifest list for %q\", imageRef)\n\t\t}\n\n\t\t// Find the platform-specific manifest\n\t\tfor _, manifest := range manifestList.Manifests {\n\t\t\tif manifest.Platform != nil {\n\t\t\t\tmanifestPlatform := &imgutil.Platform{\n\t\t\t\t\tOS:           manifest.Platform.OS,\n\t\t\t\t\tArchitecture: manifest.Platform.Architecture,\n\t\t\t\t\tVariant:      manifest.Platform.Variant,\n\t\t\t\t\tOSVersion:    manifest.Platform.OSVersion,\n\t\t\t\t}\n\n\t\t\t\tif platformsMatch(platform, manifestPlatform) {\n\t\t\t\t\t// Create a new digest reference for the platform-specific manifest\n\t\t\t\t\tplatformDigestRef, err := name.NewDigest(\n\t\t\t\t\t\tfmt.Sprintf(\"%s@%s\", ref.Context().Name(), manifest.Digest.String()),\n\t\t\t\t\t\tname.WeakValidation,\n\t\t\t\t\t)\n\t\t\t\t\tif err != nil {\n\t\t\t\t\t\treturn \"\", errors.Wrapf(err, \"creating platform-specific digest reference\")\n\t\t\t\t\t}\n\t\t\t\t\treturn platformDigestRef.String(), nil\n\t\t\t\t}\n\t\t\t}\n\t\t}\n\n\t\treturn \"\", errors.Errorf(\"no manifest found for platform %s/%s%s in manifest list %q\",\n\t\t\tplatform.OS,\n\t\t\tplatform.Architecture,\n\t\t\tplatformString(platform),\n\t\t\timageRef)\n\t}\n\n\t// If it's a single manifest, validate that the platform matches\n\timg, err := desc.Image()\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"getting image for %q\", imageRef)\n\t}\n\n\tconfigFile, err := img.ConfigFile()\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"getting config file for %q\", imageRef)\n\t}\n\n\t// Create platform from image config\n\timagePlatform := &imgutil.Platform{\n\t\tOS:           configFile.OS,\n\t\tArchitecture: configFile.Architecture,\n\t\tVariant:      configFile.Variant,\n\t\tOSVersion:    configFile.OSVersion,\n\t}\n\n\t// 
Check if the image's platform matches the requested platform\n\tif !platformsMatch(platform, imagePlatform) {\n\t\treturn \"\", errors.Errorf(\"image platform %s/%s%s does not match requested platform %s/%s%s for %q\",\n\t\t\tconfigFile.OS,\n\t\t\tconfigFile.Architecture,\n\t\t\tplatformString(imagePlatform),\n\t\t\tplatform.OS,\n\t\t\tplatform.Architecture,\n\t\t\tplatformString(platform),\n\t\t\timageRef)\n\t}\n\n\t// Platform matches - if input was a digest reference, return it unchanged\n\t// If input was a tag reference, return the digest reference for consistency\n\tif _, ok := ref.(name.Digest); ok {\n\t\treturn imageRef, nil\n\t}\n\n\t// Convert tag reference to digest reference\n\tdigest, err := img.Digest()\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"getting digest for image %q\", imageRef)\n\t}\n\n\tdigestRef, err := name.NewDigest(\n\t\tfmt.Sprintf(\"%s@%s\", ref.Context().Name(), digest.String()),\n\t\tname.WeakValidation,\n\t)\n\tif err != nil {\n\t\treturn \"\", errors.Wrapf(err, \"creating digest reference for %q\", imageRef)\n\t}\n\n\treturn digestRef.String(), nil\n}\n\n// platformsMatch checks if two platforms match.\n// OS and Architecture must match exactly.\n// For Variant and OSVersion, if either is blank, it's considered a match.\nfunc platformsMatch(p1, p2 *imgutil.Platform) bool {\n\tif p1 == nil || p2 == nil {\n\t\treturn false\n\t}\n\n\t// OS and Architecture must match exactly\n\tif p1.OS != p2.OS || p1.Architecture != p2.Architecture {\n\t\treturn false\n\t}\n\n\t// For Variant and OSVersion, if either is blank, consider it a match\n\tvariantMatch := p1.Variant == \"\" || p2.Variant == \"\" || p1.Variant == p2.Variant\n\tosVersionMatch := p1.OSVersion == \"\" || p2.OSVersion == \"\" || p1.OSVersion == p2.OSVersion\n\n\treturn variantMatch && osVersionMatch\n}\n\n// platformString returns a pretty-printed string representation of a platform's variant and OS version.\n// Returns empty string if both are blank, otherwise 
returns \"/variant:osversion\" format.\nfunc platformString(platform *imgutil.Platform) string {\n\tif platform == nil {\n\t\treturn \"\"\n\t}\n\n\tvar parts []string\n\n\tif platform.Variant != \"\" {\n\t\tparts = append(parts, platform.Variant)\n\t}\n\n\tif platform.OSVersion != \"\" {\n\t\tparts = append(parts, platform.OSVersion)\n\t}\n\n\tif len(parts) == 0 {\n\t\treturn \"\"\n\t}\n\n\tresult := \"/\" + parts[0]\n\tif len(parts) > 1 {\n\t\tresult += \":\" + parts[1]\n\t}\n\n\treturn result\n}\n\n// getRegistrySetting returns the registry setting for a given repository name.\n// It checks if any prefix in the settings map matches the repository name.\nfunc getRegistrySetting(forRepoName string, givenSettings map[string]imgutil.RegistrySetting) imgutil.RegistrySetting {\n\tif givenSettings == nil {\n\t\treturn imgutil.RegistrySetting{}\n\t}\n\tfor prefix, r := range givenSettings {\n\t\tif strings.HasPrefix(forRepoName, prefix) {\n\t\t\treturn r\n\t\t}\n\t}\n\treturn imgutil.RegistrySetting{}\n}\n"
  },
  {
    "path": "pkg/image/pull_policy.go",
    "content": "package image\n\nimport (\n\t\"github.com/pkg/errors\"\n)\n\n// PullPolicy defines a policy for how to manage images\ntype PullPolicy int\n\nconst (\n\t// PullAlways images, even if they are present\n\tPullAlways PullPolicy = iota\n\t// PullNever images, even if they are not present\n\tPullNever\n\t// PullIfNotPresent pulls images if they aren't present\n\tPullIfNotPresent\n)\n\nvar nameMap = map[string]PullPolicy{\"always\": PullAlways, \"never\": PullNever, \"if-not-present\": PullIfNotPresent, \"\": PullAlways}\n\n// ParsePullPolicy from string\nfunc ParsePullPolicy(policy string) (PullPolicy, error) {\n\tif val, ok := nameMap[policy]; ok {\n\t\treturn val, nil\n\t}\n\n\treturn PullAlways, errors.Errorf(\"invalid pull policy %s\", policy)\n}\n\nfunc (p PullPolicy) String() string {\n\tswitch p {\n\tcase PullAlways:\n\t\treturn \"always\"\n\tcase PullNever:\n\t\treturn \"never\"\n\tcase PullIfNotPresent:\n\t\treturn \"if-not-present\"\n\t}\n\n\treturn \"\"\n}\n"
  },
  {
    "path": "pkg/image/pull_policy_test.go",
    "content": "package image_test\n\nimport (\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/image\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPullPolicy(t *testing.T) {\n\tspec.Run(t, \"PullPolicy\", testPullPolicy, spec.Report(report.Terminal{}))\n}\n\nfunc testPullPolicy(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#ParsePullPolicy\", func() {\n\t\tit(\"returns PullNever for never\", func() {\n\t\t\tpolicy, err := image.ParsePullPolicy(\"never\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, policy, image.PullNever)\n\t\t})\n\n\t\tit(\"returns PullAlways for always\", func() {\n\t\t\tpolicy, err := image.ParsePullPolicy(\"always\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, policy, image.PullAlways)\n\t\t})\n\n\t\tit(\"returns PullIfNotPresent for if-not-present\", func() {\n\t\t\tpolicy, err := image.ParsePullPolicy(\"if-not-present\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, policy, image.PullIfNotPresent)\n\t\t})\n\n\t\tit(\"defaults to PullAlways, if empty string\", func() {\n\t\t\tpolicy, err := image.ParsePullPolicy(\"\")\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertEq(t, policy, image.PullAlways)\n\t\t})\n\n\t\tit(\"returns error for unknown string\", func() {\n\t\t\t_, err := image.ParsePullPolicy(\"fake-policy-here\")\n\t\t\th.AssertError(t, err, \"invalid pull policy\")\n\t\t})\n\t})\n\n\twhen(\"#String\", func() {\n\t\tit(\"returns the right String value\", func() {\n\t\t\th.AssertEq(t, image.PullAlways.String(), \"always\")\n\t\t\th.AssertEq(t, image.PullNever.String(), \"never\")\n\t\t\th.AssertEq(t, image.PullIfNotPresent.String(), \"if-not-present\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/index/index_factory.go",
    "content": "package index\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/layout\"\n\t\"github.com/buildpacks/imgutil/remote\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/pkg/errors\"\n)\n\ntype IndexFactory struct {\n\tkeychain authn.Keychain\n\tpath     string\n}\n\nfunc NewIndexFactory(keychain authn.Keychain, path string) *IndexFactory {\n\treturn &IndexFactory{\n\t\tkeychain: keychain,\n\t\tpath:     path,\n\t}\n}\n\nfunc (f *IndexFactory) Exists(repoName string) bool {\n\treturn layoutImageExists(f.localPath(repoName))\n}\n\nfunc (f *IndexFactory) LoadIndex(repoName string, opts ...imgutil.IndexOption) (index imgutil.ImageIndex, err error) {\n\tif !f.Exists(repoName) {\n\t\treturn nil, errors.New(fmt.Sprintf(\"Image: '%s' not found\", repoName))\n\t}\n\topts = appendOption(opts, imgutil.FromBaseIndex(f.localPath(repoName)))\n\treturn layout.NewIndex(repoName, appendDefaultOptions(opts, f.keychain, f.path)...)\n}\n\nfunc (f *IndexFactory) FetchIndex(name string, opts ...imgutil.IndexOption) (idx imgutil.ImageIndex, err error) {\n\treturn remote.NewIndex(name, appendDefaultOptions(opts, f.keychain, f.path)...)\n}\n\nfunc (f *IndexFactory) FindIndex(repoName string, opts ...imgutil.IndexOption) (idx imgutil.ImageIndex, err error) {\n\tif f.Exists(repoName) {\n\t\treturn f.LoadIndex(repoName, opts...)\n\t}\n\treturn f.FetchIndex(repoName, opts...)\n}\n\nfunc (f *IndexFactory) CreateIndex(repoName string, opts ...imgutil.IndexOption) (idx imgutil.ImageIndex, err error) {\n\treturn layout.NewIndex(repoName, appendDefaultOptions(opts, f.keychain, f.path)...)\n}\n\nfunc (f *IndexFactory) localPath(repoName string) string {\n\treturn filepath.Join(f.path, imgutil.MakeFileSafeName(repoName))\n}\n\nfunc layoutImageExists(path string) bool {\n\tif !pathExists(path) {\n\t\treturn false\n\t}\n\tindex := filepath.Join(path, \"index.json\")\n\tif _, err := 
os.Stat(index); os.IsNotExist(err) {\n\t\treturn false\n\t}\n\treturn true\n}\n\nfunc pathExists(path string) bool {\n\tif path != \"\" {\n\t\tif _, err := os.Stat(path); !os.IsNotExist(err) {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc appendOption(ops []imgutil.IndexOption, op imgutil.IndexOption) []imgutil.IndexOption {\n\treturn append(ops, op)\n}\n\nfunc appendDefaultOptions(ops []imgutil.IndexOption, keychain authn.Keychain, path string) []imgutil.IndexOption {\n\treturn append(ops, imgutil.WithKeychain(keychain), imgutil.WithXDGRuntimePath(path))\n}\n"
  },
  {
    "path": "pkg/index/index_factory_test.go",
    "content": "package index_test\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\tv1 \"github.com/google/go-containerregistry/pkg/v1\"\n\t\"github.com/google/go-containerregistry/pkg/v1/random\"\n\t\"github.com/google/go-containerregistry/pkg/v1/remote\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/index\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nvar dockerRegistry *h.TestRegistryConfig\n\nfunc TestIndexFactory(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\th.RequireDocker(t)\n\n\tdockerRegistry = h.RunRegistry(t)\n\tdefer dockerRegistry.StopRegistry(t)\n\n\tos.Setenv(\"DOCKER_CONFIG\", dockerRegistry.DockerConfigDir)\n\tspec.Run(t, \"Fetcher\", testIndexFactory, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testIndexFactory(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tindexFactory  *index.IndexFactory\n\t\timageIndex    imgutil.ImageIndex\n\t\tindexRepoName string\n\t\terr           error\n\t\ttmpDir        string\n\t)\n\n\tit.Before(func() {\n\t\ttmpDir, err = os.MkdirTemp(\"\", \"index-factory-test\")\n\t\th.AssertNil(t, err)\n\t\tindexFactory = index.NewIndexFactory(authn.DefaultKeychain, tmpDir)\n\t})\n\n\tit.After(func() {\n\t\tos.RemoveAll(tmpDir)\n\t})\n\n\twhen(\"#CreateIndex\", func() {\n\t\tit.Before(func() {\n\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t})\n\n\t\twhen(\"no options are provided\", func() {\n\t\t\tit(\"creates an image index\", func() {\n\t\t\t\timageIndex, err = indexFactory.CreateIndex(indexRepoName)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotNil(t, imageIndex)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Exists\", func() {\n\t\twhen(\"index exists on disk\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = 
h.NewRandomIndexRepoName()\n\t\t\t\tsetUpLocalIndex(t, indexFactory, indexRepoName)\n\t\t\t})\n\n\t\t\tit(\"returns true\", func() {\n\t\t\t\th.AssertTrue(t, indexFactory.Exists(indexRepoName))\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index does not exist on disk\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t})\n\n\t\t\tit(\"returns false\", func() {\n\t\t\t\th.AssertFalse(t, indexFactory.Exists(indexRepoName))\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#LoadIndex\", func() {\n\t\twhen(\"index exists on disk\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\tsetUpLocalIndex(t, indexFactory, indexRepoName)\n\t\t\t})\n\n\t\t\tit(\"loads the index from disk\", func() {\n\t\t\t\timageIndex, err = indexFactory.LoadIndex(indexRepoName)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotNil(t, imageIndex)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index does not exist on disk\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t})\n\n\t\t\tit(\"errors with a message\", func() {\n\t\t\t\t_, err = indexFactory.LoadIndex(indexRepoName)\n\t\t\t\th.AssertError(t, err, fmt.Sprintf(\"Image: '%s' not found\", indexRepoName))\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#FetchIndex\", func() {\n\t\twhen(\"index exists in a remote registry\", func() {\n\t\t\tvar remoteIndexRepoName string\n\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\tremoteIndexRepoName = newTestImageIndexName(\"fetch-remote\")\n\t\t\t\tsetUpRandomRemoteIndex(t, remoteIndexRepoName, 1, 1)\n\t\t\t})\n\n\t\t\tit(\"creates an index with the underlying remote index\", func() {\n\t\t\t\t_, err = indexFactory.FetchIndex(indexRepoName, imgutil.FromBaseIndex(remoteIndexRepoName))\n\t\t\t\th.AssertNil(t, err)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index does not exist in a remote registry\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = 
h.NewRandomIndexRepoName()\n\t\t\t})\n\n\t\t\tit(\"errors with a message\", func() {\n\t\t\t\t_, err = indexFactory.FetchIndex(indexRepoName, imgutil.FromBaseIndex(indexRepoName))\n\t\t\t\th.AssertNotNil(t, err)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#FindIndex\", func() {\n\t\twhen(\"index exists on disk\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = h.NewRandomIndexRepoName()\n\t\t\t\tsetUpLocalIndex(t, indexFactory, indexRepoName)\n\t\t\t})\n\n\t\t\tit(\"finds the index on disk\", func() {\n\t\t\t\timageIndex, err = indexFactory.FindIndex(indexRepoName)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotNil(t, imageIndex)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"index exists in a remote registry\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tindexRepoName = newTestImageIndexName(\"find-remote\")\n\t\t\t\tsetUpRandomRemoteIndex(t, indexRepoName, 1, 1)\n\t\t\t})\n\n\t\t\tit(\"finds the index in the remote registry\", func() {\n\t\t\t\timageIndex, err = indexFactory.FindIndex(indexRepoName)\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\th.AssertNotNil(t, imageIndex)\n\t\t\t})\n\t\t})\n\t})\n}\n\nfunc setUpLocalIndex(t *testing.T, indexFactory *index.IndexFactory, indexRepoName string) {\n\timageIndex, err := indexFactory.CreateIndex(indexRepoName)\n\th.AssertNil(t, err)\n\th.AssertNil(t, imageIndex.SaveDir())\n}\n\nfunc newTestImageIndexName(name string) string {\n\treturn dockerRegistry.RepoName(name + \"-\" + h.RandString(10))\n}\n\n// setUpRandomRemoteIndex creates a random image index with the provided (count) number of manifests;\n// each manifest will have the provided number of layers\nfunc setUpRandomRemoteIndex(t *testing.T, repoName string, layers, count int64) v1.ImageIndex {\n\tref, err := name.ParseReference(repoName, name.WeakValidation)\n\th.AssertNil(t, err)\n\n\trandomIndex, err := random.Index(1024, layers, count)\n\th.AssertNil(t, err)\n\n\terr = remote.WriteIndex(ref, randomIndex, remote.WithAuthFromKeychain(authn.DefaultKeychain))\n\th.AssertNil(t, 
err)\n\n\treturn randomIndex\n}\n"
  },
  {
    "path": "pkg/logging/logger_simple.go",
    "content": "package logging\n\nimport (\n\t\"fmt\"\n\t\"io\"\n\t\"log\"\n)\n\n// NewSimpleLogger creates a simple logger for the pack library.\nfunc NewSimpleLogger(w io.Writer) Logger {\n\treturn &simpleLogger{\n\t\tout: log.New(w, \"\", log.LstdFlags|log.Lmicroseconds),\n\t}\n}\n\ntype simpleLogger struct {\n\tout *log.Logger\n}\n\nconst (\n\tdebugPrefix = \"DEBUG:\"\n\tinfoPrefix  = \"INFO:\"\n\twarnPrefix  = \"WARN:\"\n\terrorPrefix = \"ERROR:\"\n\tprefixFmt   = \"%-7s %s\"\n)\n\nfunc (l *simpleLogger) Debug(msg string) {\n\tl.out.Printf(prefixFmt, debugPrefix, msg)\n}\n\nfunc (l *simpleLogger) Debugf(format string, v ...interface{}) {\n\tl.out.Printf(prefixFmt, debugPrefix, fmt.Sprintf(format, v...))\n}\n\nfunc (l *simpleLogger) Info(msg string) {\n\tl.out.Printf(prefixFmt, infoPrefix, msg)\n}\n\nfunc (l *simpleLogger) Infof(format string, v ...interface{}) {\n\tl.out.Printf(prefixFmt, infoPrefix, fmt.Sprintf(format, v...))\n}\n\nfunc (l *simpleLogger) Warn(msg string) {\n\tl.out.Printf(prefixFmt, warnPrefix, msg)\n}\n\nfunc (l *simpleLogger) Warnf(format string, v ...interface{}) {\n\tl.out.Printf(prefixFmt, warnPrefix, fmt.Sprintf(format, v...))\n}\n\nfunc (l *simpleLogger) Error(msg string) {\n\tl.out.Printf(prefixFmt, errorPrefix, msg)\n}\n\nfunc (l *simpleLogger) Errorf(format string, v ...interface{}) {\n\tl.out.Printf(prefixFmt, errorPrefix, fmt.Sprintf(format, v...))\n}\n\nfunc (l *simpleLogger) Writer() io.Writer {\n\treturn l.out.Writer()\n}\n\nfunc (l *simpleLogger) IsVerbose() bool {\n\treturn false\n}\n"
  },
  {
    "path": "pkg/logging/logger_simple_test.go",
    "content": "package logging_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/sclevine/spec\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nconst (\n\tdebugMatcher = `^\\d{4}\\/\\d{2}\\/\\d{2} \\d{2}:\\d{2}:\\d{2}\\.\\d{6} DEBUG:  \\w*\\n$`\n\tinfoMatcher  = `^\\d{4}\\/\\d{2}\\/\\d{2} \\d{2}:\\d{2}:\\d{2}\\.\\d{6} INFO:   \\w*\\n$`\n\twarnMatcher  = `^\\d{4}\\/\\d{2}\\/\\d{2} \\d{2}:\\d{2}:\\d{2}\\.\\d{6} WARN:   \\w*\\n$`\n\terrorMatcher = `^\\d{4}\\/\\d{2}\\/\\d{2} \\d{2}:\\d{2}:\\d{2}\\.\\d{6} ERROR:  \\w*\\n$`\n)\n\nfunc TestSimpleLogger(t *testing.T) {\n\tspec.Run(t, \"SimpleLogger\", func(t *testing.T, when spec.G, it spec.S) {\n\t\tvar w bytes.Buffer\n\t\tvar logger logging.Logger\n\n\t\tit.Before(func() {\n\t\t\tlogger = logging.NewSimpleLogger(&w)\n\t\t})\n\n\t\tit.After(func() {\n\t\t\tw.Reset()\n\t\t})\n\n\t\tit(\"should print debug messages properly\", func() {\n\t\t\tlogger.Debug(\"test\")\n\t\t\th.AssertMatch(t, w.String(), debugMatcher)\n\t\t})\n\n\t\tit(\"should format debug messages properly\", func() {\n\t\t\tlogger.Debugf(\"test%s\", \"foo\")\n\t\t\th.AssertMatch(t, w.String(), debugMatcher)\n\t\t})\n\n\t\tit(\"should print info messages properly\", func() {\n\t\t\tlogger.Info(\"test\")\n\t\t\th.AssertMatch(t, w.String(), infoMatcher)\n\t\t})\n\n\t\tit(\"should format info messages properly\", func() {\n\t\t\tlogger.Infof(\"test%s\", \"foo\")\n\t\t\th.AssertMatch(t, w.String(), infoMatcher)\n\t\t})\n\n\t\tit(\"should print error messages properly\", func() {\n\t\t\tlogger.Error(\"test\")\n\t\t\th.AssertMatch(t, w.String(), errorMatcher)\n\t\t})\n\n\t\tit(\"should format error messages properly\", func() {\n\t\t\tlogger.Errorf(\"test%s\", \"foo\")\n\t\t\th.AssertMatch(t, w.String(), errorMatcher)\n\t\t})\n\n\t\tit(\"should print warn messages properly\", func() {\n\t\t\tlogger.Warn(\"test\")\n\t\t\th.AssertMatch(t, w.String(), warnMatcher)\n\t\t})\n\n\t\tit(\"should format 
warn messages properly\", func() {\n\t\t\tlogger.Warnf(\"test%s\", \"foo\")\n\t\t\th.AssertMatch(t, w.String(), warnMatcher)\n\t\t})\n\n\t\tit(\"shouldn't be verbose by default\", func() {\n\t\t\th.AssertFalse(t, logger.IsVerbose())\n\t\t})\n\n\t\tit(\"should not format writer messages\", func() {\n\t\t\t_, _ = logger.Writer().Write([]byte(\"test\"))\n\t\t\th.AssertEq(t, w.String(), \"test\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/logging/logger_writers.go",
    "content": "// Package logging implements the logger for the pack CLI.\npackage logging\n\nimport (\n\t\"fmt\"\n\t\"io\"\n\t\"regexp\"\n\t\"sync\"\n\t\"time\"\n\n\t\"github.com/apex/log\"\n\t\"github.com/heroku/color\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\nconst (\n\terrorLevelText = \"ERROR: \"\n\twarnLevelText  = \"Warning: \"\n\tlineFeed       = '\\n'\n\t// log level to use when quiet is true\n\tquietLevel = log.WarnLevel\n\t// log level to use when debug is true\n\tverboseLevel = log.DebugLevel\n\t// time format the out logging uses\n\ttimeFmt = \"2006/01/02 15:04:05.000000\"\n\t// InvalidFileDescriptor based on https://golang.org/src/os/file_unix.go?s=2183:2210#L57\n\tInvalidFileDescriptor = ^(uintptr(0))\n)\n\nvar colorCodeMatcher = regexp.MustCompile(`\\x1b\\[[0-9;]*m`)\n\nvar _ Logger = (*LogWithWriters)(nil)\n\n// LogWithWriters is a logger used with the pack CLI, allowing users to print logs for various levels, including Info, Debug and Error\ntype LogWithWriters struct {\n\tsync.Mutex\n\tlog.Logger\n\twantTime bool\n\tclock    func() time.Time\n\tout      io.Writer\n\terrOut   io.Writer\n}\n\n// NewLogWithWriters creates a logger to be used with pack CLI.\nfunc NewLogWithWriters(stdout, stderr io.Writer, opts ...func(*LogWithWriters)) *LogWithWriters {\n\tlw := &LogWithWriters{\n\t\tLogger: log.Logger{\n\t\t\tLevel: log.InfoLevel,\n\t\t},\n\t\twantTime: false,\n\t\tclock:    time.Now,\n\t\tout:      stdout,\n\t\terrOut:   stderr,\n\t}\n\tlw.Handler = lw\n\n\tfor _, opt := range opts {\n\t\topt(lw)\n\t}\n\n\treturn lw\n}\n\n// WithClock is an option used to initialize a LogWithWriters with a given clock function\nfunc WithClock(clock func() time.Time) func(writers *LogWithWriters) {\n\treturn func(logger *LogWithWriters) {\n\t\tlogger.clock = clock\n\t}\n}\n\n// WithVerbose is an option used to initialize a LogWithWriters with Verbose turned on\nfunc WithVerbose() func(writers *LogWithWriters) {\n\treturn func(logger 
*LogWithWriters) {\n\t\tlogger.Level = log.DebugLevel\n\t}\n}\n\n// HandleLog handles log events, printing entries appropriately\nfunc (lw *LogWithWriters) HandleLog(e *log.Entry) error {\n\tlw.Lock()\n\tdefer lw.Unlock()\n\n\twriter := lw.WriterForLevel(Level(e.Level))\n\t_, err := fmt.Fprint(writer, appendMissingLineFeed(fmt.Sprintf(\"%s%s\", formatLevel(e.Level), e.Message)))\n\n\treturn err\n}\n\n// WriterForLevel returns a Writer for the given Level\nfunc (lw *LogWithWriters) WriterForLevel(level Level) io.Writer {\n\tif lw.Level > log.Level(level) {\n\t\treturn io.Discard\n\t}\n\n\tif level == ErrorLevel {\n\t\treturn newLogWriter(lw.errOut, lw.clock, lw.wantTime)\n\t}\n\n\treturn newLogWriter(lw.out, lw.clock, lw.wantTime)\n}\n\n// Writer returns the base Writer for the LogWithWriters\nfunc (lw *LogWithWriters) Writer() io.Writer {\n\treturn lw.out\n}\n\n// WantTime turns timestamps on in log entries\nfunc (lw *LogWithWriters) WantTime(f bool) {\n\tlw.wantTime = f\n}\n\n// WantQuiet reduces the number of logs returned\nfunc (lw *LogWithWriters) WantQuiet(f bool) {\n\tif f {\n\t\tlw.Level = quietLevel\n\t}\n}\n\n// WantVerbose increases the number of logs returned\nfunc (lw *LogWithWriters) WantVerbose(f bool) {\n\tif f {\n\t\tlw.Level = verboseLevel\n\t}\n}\n\n// IsVerbose returns whether verbose logging is on\nfunc (lw *LogWithWriters) IsVerbose() bool {\n\treturn lw.Level == log.DebugLevel\n}\n\nfunc formatLevel(ll log.Level) string {\n\tswitch ll {\n\tcase log.ErrorLevel:\n\t\treturn style.Error(errorLevelText)\n\tcase log.WarnLevel:\n\t\treturn style.Warn(warnLevelText)\n\t}\n\n\treturn \"\"\n}\n\n// Preserve behavior of other loggers\nfunc appendMissingLineFeed(msg string) string {\n\tbuff := []byte(msg)\n\tif len(buff) == 0 || buff[len(buff)-1] != lineFeed {\n\t\tbuff = append(buff, lineFeed)\n\t}\n\treturn string(buff)\n}\n\n// logWriter is a writer used for logs\ntype logWriter struct {\n\tsync.Mutex\n\tout         io.Writer\n\tclock       func() 
time.Time\n\twantTime    bool\n\twantNoColor bool\n}\n\nfunc newLogWriter(writer io.Writer, clock func() time.Time, wantTime bool) *logWriter {\n\twantNoColor := !color.Enabled()\n\treturn &logWriter{\n\t\tout:         writer,\n\t\tclock:       clock,\n\t\twantTime:    wantTime,\n\t\twantNoColor: wantNoColor,\n\t}\n}\n\n// Write writes a message prepended by the time to the set io.Writer\nfunc (lw *logWriter) Write(buf []byte) (n int, err error) {\n\tlw.Lock()\n\tdefer lw.Unlock()\n\n\tlength := len(buf)\n\tif lw.wantNoColor {\n\t\tbuf = stripColor(buf)\n\t}\n\n\tprefix := \"\"\n\tif lw.wantTime {\n\t\tprefix = fmt.Sprintf(\"%s \", lw.clock().Format(timeFmt))\n\t}\n\n\t_, err = fmt.Fprintf(lw.out, \"%s%s\", prefix, buf)\n\treturn length, err\n}\n\n// Writer returns the base Writer for the logWriter\nfunc (lw *logWriter) Writer() io.Writer {\n\treturn lw.out\n}\n\n// Fd returns the file descriptor of the writer. This is used to ensure it is a Console, and can therefore display streams of text\nfunc (lw *logWriter) Fd() uintptr {\n\tlw.Lock()\n\tdefer lw.Unlock()\n\n\tif file, ok := lw.out.(hasDescriptor); ok {\n\t\treturn file.Fd()\n\t}\n\n\treturn InvalidFileDescriptor\n}\n\n// Remove all ANSI color information.\nfunc stripColor(b []byte) []byte {\n\treturn colorCodeMatcher.ReplaceAll(b, []byte(\"\"))\n}\n\ntype hasDescriptor interface {\n\tFd() uintptr\n}\n"
  },
  {
    "path": "pkg/logging/logger_writers_test.go",
    "content": "package logging_test\n\nimport (\n\t\"fmt\"\n\t\"io\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/apex/log\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestLogWithWriters(t *testing.T) {\n\tspec.Run(t, \"LogWithWriters\", testLogWithWriters, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testLogWithWriters(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger           *logging.LogWithWriters\n\t\toutCons, errCons *color.Console\n\t\tfOut, fErr       func() string\n\t\ttimeFmt          = \"2006/01/02 15:04:05.000000\"\n\t\ttestTime         = \"2019/05/15 01:01:01.000000\"\n\t)\n\n\tit.Before(func() {\n\t\toutCons, fOut = h.MockWriterAndOutput()\n\t\terrCons, fErr = h.MockWriterAndOutput()\n\t\tlogger = logging.NewLogWithWriters(outCons, errCons, logging.WithClock(func() time.Time {\n\t\t\tclock, _ := time.Parse(timeFmt, testTime)\n\t\t\treturn clock\n\t\t}))\n\t})\n\n\twhen(\"default\", func() {\n\t\tit(\"has no time and color\", func() {\n\t\t\tlogger.Info(color.HiBlueString(\"test\"))\n\t\t\th.AssertEq(t, fOut(), \"\\x1b[94mtest\\x1b[0m\\n\")\n\t\t})\n\n\t\tit(\"will not log debug messages\", func() {\n\t\t\tlogger.Debug(\"debug_\")\n\t\t\tlogger.Debugf(\"debugf\")\n\n\t\t\toutput := fOut()\n\t\t\th.AssertNotContains(t, output, \"debug_\\n\")\n\t\t\th.AssertNotContains(t, output, \"debugf\\n\")\n\t\t})\n\n\t\tit(\"logs info and warning messages to standard writer\", func() {\n\t\t\tlogger.Info(\"info_\")\n\t\t\tlogger.Infof(\"infof\")\n\t\t\tlogger.Warn(\"warn_\")\n\t\t\tlogger.Warnf(\"warnf\")\n\n\t\t\toutput := fOut()\n\t\t\th.AssertContains(t, output, \"info_\\n\")\n\t\t\th.AssertContains(t, output, \"infof\\n\")\n\t\t\th.AssertContains(t, output, \"warn_\\n\")\n\t\t\th.AssertContains(t, output, 
\"warnf\\n\")\n\t\t})\n\n\t\tit(\"logs error to error writer\", func() {\n\t\t\tlogger.Error(\"error_\")\n\t\t\tlogger.Errorf(\"errorf\")\n\n\t\t\toutput := fErr()\n\t\t\th.AssertContains(t, output, \"error_\\n\")\n\t\t\th.AssertContains(t, output, \"errorf\\n\")\n\t\t})\n\n\t\tit(\"will return correct writers\", func() {\n\t\t\th.AssertSameInstance(t, logger.Writer(), outCons)\n\t\t\th.AssertSameInstance(t, logger.WriterForLevel(logging.DebugLevel), io.Discard)\n\t\t})\n\n\t\tit(\"is only verbose for debug level\", func() {\n\t\t\th.AssertFalse(t, logger.IsVerbose())\n\n\t\t\tlogger.Level = log.DebugLevel\n\t\t\th.AssertTrue(t, logger.IsVerbose())\n\t\t})\n\t})\n\n\twhen(\"time is set to true\", func() {\n\t\tit(\"time is logged in info\", func() {\n\t\t\tlogger.WantTime(true)\n\t\t\tlogger.Info(\"test\")\n\t\t\th.AssertEq(t, fOut(), \"2019/05/15 01:01:01.000000 test\\n\")\n\t\t})\n\n\t\tit(\"time is logged in error\", func() {\n\t\t\tlogger.WantTime(true)\n\t\t\tlogger.Error(\"test\")\n\t\t\th.AssertEq(t, fErr(), fmt.Sprintf(\"2019/05/15 01:01:01.000000 %stest\\n\", style.Error(\"ERROR: \")))\n\t\t})\n\n\t\twhen(\"WriterForLevel\", func() {\n\t\t\tit(\"time is logged in info\", func() {\n\t\t\t\tlogger.WantTime(true)\n\t\t\t\twriter := logger.WriterForLevel(logging.InfoLevel)\n\t\t\t\twriter.Write([]byte(\"test\\n\"))\n\t\t\t\th.AssertEq(t, fOut(), \"2019/05/15 01:01:01.000000 test\\n\")\n\t\t\t})\n\n\t\t\tit(\"time is logged in error\", func() {\n\t\t\t\tlogger.WantTime(true)\n\t\t\t\twriter := logger.WriterForLevel(logging.ErrorLevel)\n\t\t\t\twriter.Write([]byte(\"test\\n\"))\n\t\t\t\t// The writer doesn't prepend the level\n\t\t\t\th.AssertEq(t, fErr(), \"2019/05/15 01:01:01.000000 test\\n\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"colors are disabled\", func() {\n\t\tit(\"don't display colors\", func() {\n\t\t\toutCons.DisableColors(true)\n\t\t\tlogger.Info(color.HiBlueString(\"test\"))\n\t\t\th.AssertEq(t, fOut(), \"test\\n\")\n\t\t})\n\t})\n\n\twhen(\"quiet is 
set to true\", func() {\n\t\tit.Before(func() {\n\t\t\tlogger.WantQuiet(true)\n\t\t})\n\n\t\tit(\"will not log debug or info messages\", func() {\n\t\t\tlogger.Debug(\"debug_\")\n\t\t\tlogger.Debugf(\"debugf\")\n\t\t\tlogger.Info(\"info_\")\n\t\t\tlogger.Infof(\"infof\")\n\n\t\t\toutput := fOut()\n\t\t\th.AssertNotContains(t, output, \"debug_\\n\")\n\t\t\th.AssertNotContains(t, output, \"debugf\\n\")\n\t\t\th.AssertNotContains(t, output, \"info_\\n\")\n\t\t\th.AssertNotContains(t, output, \"infof\\n\")\n\t\t})\n\n\t\tit(\"logs warnings to standard writer\", func() {\n\t\t\tlogger.Warn(\"warn_\")\n\t\t\tlogger.Warnf(\"warnf\")\n\n\t\t\toutput := fOut()\n\t\t\th.AssertContains(t, output, \"warn_\\n\")\n\t\t\th.AssertContains(t, output, \"warnf\\n\")\n\t\t})\n\n\t\tit(\"logs error to error writer\", func() {\n\t\t\tlogger.Error(\"error_\")\n\t\t\tlogger.Errorf(\"errorf\")\n\n\t\t\toutput := fErr()\n\t\t\th.AssertContains(t, output, \"error_\\n\")\n\t\t\th.AssertContains(t, output, \"errorf\\n\")\n\t\t})\n\n\t\tit(\"will return correct writers\", func() {\n\t\t\th.AssertSameInstance(t, logger.Writer(), outCons)\n\t\t\th.AssertSameInstance(t, logger.WriterForLevel(logging.DebugLevel), io.Discard)\n\t\t\th.AssertSameInstance(t, logger.WriterForLevel(logging.InfoLevel), io.Discard)\n\t\t})\n\t})\n\n\twhen(\"verbose is set to true\", func() {\n\t\tit.Before(func() {\n\t\t\tlogger.WantVerbose(true)\n\t\t})\n\n\t\tit(\"all messages are logged\", func() {\n\t\t\tlogger.Debug(\"debug_\")\n\t\t\tlogger.Debugf(\"debugf\")\n\t\t\tlogger.Info(\"info_\")\n\t\t\tlogger.Infof(\"infof\")\n\t\t\tlogger.Warn(\"warn_\")\n\t\t\tlogger.Warnf(\"warnf\")\n\n\t\t\toutput := fOut()\n\t\t\th.AssertContains(t, output, \"debug_\")\n\t\t\th.AssertContains(t, output, \"debugf\")\n\t\t\th.AssertContains(t, output, \"info_\")\n\t\t\th.AssertContains(t, output, \"infof\")\n\t\t\th.AssertContains(t, output, \"warn_\")\n\t\t\th.AssertContains(t, output, \"warnf\")\n\t\t})\n\n\t\tit(\"logs error to error 
writer\", func() {\n\t\t\tlogger.Error(\"error_\")\n\t\t\tlogger.Errorf(\"errorf\")\n\n\t\t\toutput := fErr()\n\t\t\th.AssertContains(t, output, \"error_\\n\")\n\t\t\th.AssertContains(t, output, \"errorf\\n\")\n\t\t})\n\n\t\tit(\"will return correct writers\", func() {\n\t\t\th.AssertSameInstance(t, logger.Writer(), outCons)\n\t\t\tassertLogWriterHasOut(t, logger.WriterForLevel(logging.DebugLevel), outCons)\n\t\t\tassertLogWriterHasOut(t, logger.WriterForLevel(logging.InfoLevel), outCons)\n\t\t\tassertLogWriterHasOut(t, logger.WriterForLevel(logging.WarnLevel), outCons)\n\t\t\tassertLogWriterHasOut(t, logger.WriterForLevel(logging.ErrorLevel), errCons)\n\t\t})\n\t})\n\n\tit(\"will convert an empty string to a line feed\", func() {\n\t\tlogger.Info(\"\")\n\t\texpected := \"\\n\"\n\t\th.AssertEq(t, fOut(), expected)\n\t})\n}\n\nfunc assertLogWriterHasOut(t *testing.T, writer io.Writer, out io.Writer) {\n\tlogWriter, ok := writer.(hasWriter)\n\th.AssertTrue(t, ok)\n\th.AssertSameInstance(t, logWriter.Writer(), out)\n}\n\ntype hasWriter interface {\n\tWriter() io.Writer\n}\n"
  },
  {
    "path": "pkg/logging/logging.go",
    "content": "// Package logging defines the minimal interface that loggers must support to be used by client.\npackage logging\n\nimport (\n\t\"io\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\ntype Level int\n\nconst (\n\tDebugLevel Level = iota\n\tInfoLevel\n\tWarnLevel\n\tErrorLevel\n)\n\n// Logger defines behavior required by a logging package used by pack libraries\ntype Logger interface {\n\tDebug(msg string)\n\tDebugf(fmt string, v ...interface{})\n\n\tInfo(msg string)\n\tInfof(fmt string, v ...interface{})\n\n\tWarn(msg string)\n\tWarnf(fmt string, v ...interface{})\n\n\tError(msg string)\n\tErrorf(fmt string, v ...interface{})\n\n\tWriter() io.Writer\n\n\tIsVerbose() bool\n}\n\ntype isSelectableWriter interface {\n\tWriterForLevel(level Level) io.Writer\n}\n\n// GetWriterForLevel retrieves the appropriate Writer for the log level provided.\n//\n// See isSelectableWriter\nfunc GetWriterForLevel(logger Logger, level Level) io.Writer {\n\tif w, ok := logger.(isSelectableWriter); ok {\n\t\treturn w.WriterForLevel(level)\n\t}\n\n\treturn logger.Writer()\n}\n\n// IsQuiet defines whether a pack logger is set to quiet mode\nfunc IsQuiet(logger Logger) bool {\n\tif writer := GetWriterForLevel(logger, InfoLevel); writer == io.Discard {\n\t\treturn true\n\t}\n\n\treturn false\n}\n\n// Tip logs a tip.\nfunc Tip(l Logger, format string, v ...interface{}) {\n\tl.Infof(style.Tip(\"Tip: \")+format, v...)\n}\n"
  },
  {
    "path": "pkg/logging/logging_test.go",
    "content": "package logging_test\n\nimport (\n\t\"bytes\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestLogging(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Logging\", testLogging, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testLogging(t *testing.T, when spec.G, it spec.S) {\n\twhen(\"#GetWriterForLevel\", func() {\n\t\twhen(\"isSelectableWriter\", func() {\n\t\t\tit(\"returns Logger for appropriate level\", func() {\n\t\t\t\toutCons, output := h.MockWriterAndOutput()\n\t\t\t\terrCons, errOutput := h.MockWriterAndOutput()\n\t\t\t\tlogger := logging.NewLogWithWriters(outCons, errCons)\n\n\t\t\t\tinfoLogger := logging.GetWriterForLevel(logger, logging.InfoLevel)\n\t\t\t\t_, _ = infoLogger.Write([]byte(\"info test\"))\n\t\t\t\th.AssertEq(t, output(), \"info test\")\n\n\t\t\t\terrorLogger := logging.GetWriterForLevel(logger, logging.ErrorLevel)\n\t\t\t\t_, _ = errorLogger.Write([]byte(\"error test\"))\n\t\t\t\th.AssertEq(t, errOutput(), \"error test\")\n\t\t\t})\n\t\t})\n\n\t\twhen(\"doesn't implement isSelectableWriter\", func() {\n\t\t\tit(\"returns one Writer for all levels\", func() {\n\t\t\t\tvar w bytes.Buffer\n\t\t\t\tlogger := logging.NewSimpleLogger(&w)\n\t\t\t\twriter := logging.GetWriterForLevel(logger, logging.InfoLevel)\n\t\t\t\t_, _ = writer.Write([]byte(\"info test\\n\"))\n\t\t\t\th.AssertEq(t, w.String(), \"info test\\n\")\n\n\t\t\t\twriter = logging.GetWriterForLevel(logger, logging.ErrorLevel)\n\t\t\t\t_, _ = writer.Write([]byte(\"error test\\n\"))\n\t\t\t\th.AssertEq(t, w.String(), \"info test\\nerror test\\n\")\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"IsQuiet\", func() {\n\t\twhen(\"implements isSelectableWriter\", func() {\n\t\t\tit(\"return true for quiet mode\", func() {\n\t\t\t\tvar w 
bytes.Buffer\n\t\t\t\tlogger := logging.NewLogWithWriters(&w, &w)\n\t\t\t\th.AssertEq(t, logging.IsQuiet(logger), false)\n\n\t\t\t\tlogger.WantQuiet(true)\n\t\t\t\th.AssertEq(t, logging.IsQuiet(logger), true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"doesn't implement isSelectableWriter\", func() {\n\t\t\tit(\"always returns false\", func() {\n\t\t\t\tvar w bytes.Buffer\n\t\t\t\tlogger := logging.NewSimpleLogger(&w)\n\t\t\t\th.AssertEq(t, logging.IsQuiet(logger), false)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#Tip\", func() {\n\t\tit(\"prepends `Tip:` to string\", func() {\n\t\t\tvar w bytes.Buffer\n\t\t\tlogger := logging.NewSimpleLogger(&w)\n\t\t\tlogging.Tip(logger, \"test\")\n\t\t\th.AssertContains(t, w.String(), \"Tip: \"+\"test\")\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "pkg/logging/prefix_writer.go",
    "content": "package logging\n\nimport (\n\t\"bufio\"\n\t\"bytes\"\n\t\"fmt\"\n\t\"io\"\n\n\t\"github.com/buildpacks/pack/internal/style\"\n)\n\n// PrefixWriter is a buffering writer that prefixes each new line. Close should be called to properly flush the buffer.\ntype PrefixWriter struct {\n\tout           io.Writer\n\tbuf           *bytes.Buffer\n\tprefix        string\n\treaderFactory func(data []byte) io.Reader\n}\n\ntype PrefixWriterOption func(c *PrefixWriter)\n\nfunc WithReaderFactory(factory func(data []byte) io.Reader) PrefixWriterOption {\n\treturn func(writer *PrefixWriter) {\n\t\twriter.readerFactory = factory\n\t}\n}\n\n// NewPrefixWriter writes by w will be prefixed\nfunc NewPrefixWriter(w io.Writer, prefix string, opts ...PrefixWriterOption) *PrefixWriter {\n\twriter := &PrefixWriter{\n\t\tout:    w,\n\t\tprefix: fmt.Sprintf(\"[%s] \", style.Prefix(prefix)),\n\t\tbuf:    &bytes.Buffer{},\n\t\treaderFactory: func(data []byte) io.Reader {\n\t\t\treturn bytes.NewReader(data)\n\t\t},\n\t}\n\n\tfor _, opt := range opts {\n\t\topt(writer)\n\t}\n\n\treturn writer\n}\n\n// Write writes bytes to the embedded log function\nfunc (w *PrefixWriter) Write(data []byte) (int, error) {\n\tscanner := bufio.NewScanner(w.readerFactory(data))\n\tscanner.Split(ScanLinesKeepNewLine)\n\tfor scanner.Scan() {\n\t\tnewBits := scanner.Bytes()\n\t\tif len(newBits) > 0 && newBits[len(newBits)-1] != '\\n' { // just append if we don't have a new line\n\t\t\t_, err := w.buf.Write(newBits)\n\t\t\tif err != nil {\n\t\t\t\treturn 0, err\n\t\t\t}\n\t\t} else { // write our complete message\n\t\t\t_, err := w.buf.Write(bytes.TrimRight(newBits, \"\\n\"))\n\t\t\tif err != nil {\n\t\t\t\treturn 0, err\n\t\t\t}\n\n\t\t\terr = w.flush()\n\t\t\tif err != nil {\n\t\t\t\treturn 0, err\n\t\t\t}\n\t\t}\n\t}\n\n\tif err := scanner.Err(); err != nil {\n\t\treturn 0, err\n\t}\n\n\treturn len(data), nil\n}\n\n// Close writes any pending data in the buffer\nfunc (w *PrefixWriter) Close() error 
{\n\tif w.buf.Len() > 0 {\n\t\treturn w.flush()\n\t}\n\n\treturn nil\n}\n\nfunc (w *PrefixWriter) flush() error {\n\tbits := w.buf.Bytes()\n\tw.buf.Reset()\n\n\t// process any CR in message\n\tif i := bytes.LastIndexByte(bits, '\\r'); i >= 0 {\n\t\tbits = bits[i+1:]\n\t}\n\n\t_, err := fmt.Fprint(w.out, w.prefix+string(bits)+\"\\n\")\n\treturn err\n}\n\n// ScanLinesKeepNewLine is a customized implementation of bufio.ScanLines that preserves newline characters.\nfunc ScanLinesKeepNewLine(data []byte, atEOF bool) (advance int, token []byte, err error) {\n\tif atEOF && len(data) == 0 {\n\t\treturn 0, nil, nil\n\t}\n\n\t// first we'll split by LF (\\n)\n\t// then remove any preceding CR (\\r) [due to CR+LF]\n\tif i := bytes.IndexByte(data, '\\n'); i >= 0 {\n\t\t// We have a full newline-terminated line.\n\t\treturn i + 1, append(dropCR(data[0:i]), '\\n'), nil\n\t}\n\n\t// If we're at EOF, we have a final, non-terminated line. Return it.\n\tif atEOF {\n\t\treturn len(data), data, nil\n\t}\n\t// Request more data.\n\treturn 0, nil, nil\n}\n\n// dropCR drops a terminal \\r from the data.\nfunc dropCR(data []byte) []byte {\n\tif len(data) > 0 && data[len(data)-1] == '\\r' {\n\t\treturn data[0 : len(data)-1]\n\t}\n\treturn data\n}\n"
  },
  {
    "path": "pkg/logging/prefix_writer_test.go",
    "content": "package logging_test\n\nimport (\n\t\"bytes\"\n\t\"errors\"\n\t\"io\"\n\t\"testing\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestPrefixWriter(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"PrefixWriter\", testPrefixWriter, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testPrefixWriter(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = h.NewAssertionManager(t)\n\t)\n\n\twhen(\"#Write\", func() {\n\t\tit(\"prepends prefix to string\", func() {\n\t\t\tvar w bytes.Buffer\n\n\t\t\twriter := logging.NewPrefixWriter(&w, \"prefix\")\n\t\t\t_, err := writer.Write([]byte(\"test\"))\n\t\t\tassert.Nil(err)\n\t\t\terr = writer.Close()\n\t\t\tassert.Nil(err)\n\n\t\t\th.AssertEq(t, w.String(), \"[prefix] test\\n\")\n\t\t})\n\n\t\tit(\"prepends prefix to multi-line string\", func() {\n\t\t\tvar w bytes.Buffer\n\n\t\t\twriter := logging.NewPrefixWriter(&w, \"prefix\")\n\t\t\t_, err := writer.Write([]byte(\"line 1\\nline 2\\nline 3\"))\n\t\t\tassert.Nil(err)\n\t\t\terr = writer.Close()\n\t\t\tassert.Nil(err)\n\n\t\t\th.AssertEq(t, w.String(), \"[prefix] line 1\\n[prefix] line 2\\n[prefix] line 3\\n\")\n\t\t})\n\n\t\tit(\"buffers mid-line calls\", func() {\n\t\t\tvar buf bytes.Buffer\n\n\t\t\twriter := logging.NewPrefixWriter(&buf, \"prefix\")\n\t\t\t_, err := writer.Write([]byte(\"word 1, \"))\n\t\t\tassert.Nil(err)\n\t\t\t_, err = writer.Write([]byte(\"word 2, \"))\n\t\t\tassert.Nil(err)\n\t\t\t_, err = writer.Write([]byte(\"word 3.\"))\n\t\t\tassert.Nil(err)\n\t\t\terr = writer.Close()\n\t\t\tassert.Nil(err)\n\n\t\t\th.AssertEq(t, buf.String(), \"[prefix] word 1, word 2, word 3.\\n\")\n\t\t})\n\n\t\tit(\"handles empty lines\", func() {\n\t\t\tvar buf bytes.Buffer\n\n\t\t\twriter := logging.NewPrefixWriter(&buf, \"prefix\")\n\t\t\t_, 
err := writer.Write([]byte(\"\\n\"))\n\t\t\tassert.Nil(err)\n\t\t\terr = writer.Close()\n\t\t\tassert.Nil(err)\n\n\t\t\th.AssertEq(t, buf.String(), \"[prefix] \\n\")\n\t\t})\n\n\t\tit(\"handles empty input\", func() {\n\t\t\tvar buf bytes.Buffer\n\n\t\t\twriter := logging.NewPrefixWriter(&buf, \"prefix\")\n\t\t\t_, err := writer.Write([]byte(\"\"))\n\t\t\tassert.Nil(err)\n\t\t\terr = writer.Close()\n\t\t\tassert.Nil(err)\n\n\t\t\tassert.Equal(buf.String(), \"\")\n\t\t})\n\n\t\tit(\"propagates reader errors\", func() {\n\t\t\tvar buf bytes.Buffer\n\n\t\t\tfactory := &boobyTrapReaderFactory{failAtCallNumber: 2}\n\t\t\twriter := logging.NewPrefixWriter(&buf, \"prefix\", logging.WithReaderFactory(factory.NewReader))\n\t\t\t_, err := writer.Write([]byte(\"word 1,\"))\n\t\t\tassert.Nil(err)\n\t\t\t_, err = writer.Write([]byte(\"word 2.\"))\n\t\t\tassert.ErrorContains(err, \"some error\")\n\t\t})\n\n\t\tit(\"handles requests to clear line\", func() {\n\t\t\tvar buf bytes.Buffer\n\n\t\t\twriter := logging.NewPrefixWriter(&buf, \"prefix\")\n\t\t\t_, err := writer.Write([]byte(\"progress 1\\rprogress 2\\rprogress 3\\rcomplete!\"))\n\t\t\tassert.Nil(err)\n\t\t\terr = writer.Close()\n\t\t\tassert.Nil(err)\n\n\t\t\th.AssertEq(t, buf.String(), \"[prefix] complete!\\n\")\n\t\t})\n\n\t\tit(\"handles requests clear line (amidst content)\", func() {\n\t\t\tvar buf bytes.Buffer\n\n\t\t\twriter := logging.NewPrefixWriter(&buf, \"prefix\")\n\t\t\t_, err := writer.Write([]byte(\"downloading\\rcompleted!      \\r\\nall done!\\nnevermind\\r\"))\n\t\t\tassert.Nil(err)\n\t\t\terr = writer.Close()\n\t\t\tassert.Nil(err)\n\n\t\t\th.AssertEq(t, buf.String(), \"[prefix] completed!      
\\n[prefix] all done!\\n[prefix] \\n\")\n\t\t})\n\t})\n}\n\ntype boobyTrapReaderFactory struct {\n\tnumberOfCalls    int\n\tfailAtCallNumber int\n}\n\nfunc (b *boobyTrapReaderFactory) NewReader(data []byte) io.Reader {\n\tb.numberOfCalls++\n\tif b.numberOfCalls >= b.failAtCallNumber {\n\t\treturn &faultyReader{}\n\t}\n\n\treturn bytes.NewReader(data)\n}\n\ntype faultyReader struct {\n}\n\nfunc (f faultyReader) Read(b []byte) (n int, err error) {\n\treturn 0, errors.New(\"some error\")\n}\n"
  },
  {
    "path": "pkg/project/project.go",
    "content": "package project\n\nimport (\n\t\"fmt\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n\n\tv03 \"github.com/buildpacks/pack/pkg/project/v03\"\n\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\t\"github.com/buildpacks/pack/pkg/project/types\"\n\tv01 \"github.com/buildpacks/pack/pkg/project/v01\"\n\tv02 \"github.com/buildpacks/pack/pkg/project/v02\"\n)\n\ntype Project struct {\n\tVersion string `toml:\"schema-version\"`\n}\n\ntype VersionDescriptor struct {\n\tProject Project `toml:\"_\"`\n}\n\nvar parsers = map[string]func(string) (types.Descriptor, toml.MetaData, error){\n\t\"0.1\": v01.NewDescriptor,\n\t\"0.2\": v02.NewDescriptor,\n\t\"0.3\": v03.NewDescriptor,\n}\n\nfunc ReadProjectDescriptor(pathToFile string, logger logging.Logger) (types.Descriptor, error) {\n\tprojectTomlContents, err := os.ReadFile(filepath.Clean(pathToFile))\n\tif err != nil {\n\t\treturn types.Descriptor{}, err\n\t}\n\n\tvar versionDescriptor struct {\n\t\tProject struct {\n\t\t\tVersion string `toml:\"schema-version\"`\n\t\t} `toml:\"_\"`\n\t}\n\n\t_, err = toml.Decode(string(projectTomlContents), &versionDescriptor)\n\tif err != nil {\n\t\treturn types.Descriptor{}, errors.Wrapf(err, \"parsing schema version\")\n\t}\n\n\tversion := versionDescriptor.Project.Version\n\tif version == \"\" {\n\t\tlogger.Warn(\"No schema version declared in project.toml, defaulting to schema version 0.1\")\n\t\tversion = \"0.1\"\n\t}\n\n\tif _, ok := parsers[version]; !ok {\n\t\treturn types.Descriptor{}, fmt.Errorf(\"unknown project descriptor schema version %s\", version)\n\t}\n\n\tdescriptor, tomlMetaData, err := parsers[version](string(projectTomlContents))\n\tif err != nil {\n\t\treturn types.Descriptor{}, err\n\t}\n\n\twarnIfTomlContainsKeysNotSupportedBySchema(version, tomlMetaData, logger)\n\n\treturn descriptor, validate(descriptor)\n}\n\nfunc warnIfTomlContainsKeysNotSupportedBySchema(schemaVersion string, tomlMetaData 
toml.MetaData, logger logging.Logger) {\n\tunsupportedKeys := []string{}\n\n\tfor _, undecoded := range tomlMetaData.Undecoded() {\n\t\tkeyName := undecoded.String()\n\t\tif unsupportedKey(keyName, schemaVersion) {\n\t\t\tunsupportedKeys = append(unsupportedKeys, keyName)\n\t\t}\n\t}\n\n\tif len(unsupportedKeys) != 0 {\n\t\tlogger.Warnf(\"The following keys declared in project.toml are not supported in schema version %s:\\n\", schemaVersion)\n\t\tfor _, unsupported := range unsupportedKeys {\n\t\t\tlogger.Warnf(\"- %s\\n\", unsupported)\n\t\t}\n\t\tlogger.Warn(\"The above keys will be ignored. If this is not intentional, try updating your schema version.\\n\")\n\t}\n}\n\nfunc unsupportedKey(keyName, schemaVersion string) bool {\n\tswitch schemaVersion {\n\tcase \"0.1\":\n\t\t// filter out any keys from [metadata] and any other custom table defined by end-users\n\t\treturn strings.HasPrefix(keyName, \"project.\") || strings.HasPrefix(keyName, \"build.\") || strings.Contains(keyName, \"io.buildpacks\")\n\tcase \"0.2\":\n\t\t// filter out any keys from [_.metadata] and any other custom table defined by end-users\n\t\treturn strings.Contains(keyName, \"io.buildpacks\") || (strings.HasPrefix(keyName, \"_.\") && !strings.HasPrefix(keyName, \"_.metadata\"))\n\t}\n\treturn true\n}\n\nfunc validate(p types.Descriptor) error {\n\tif p.Build.Exclude != nil && p.Build.Include != nil {\n\t\treturn errors.New(\"project.toml: cannot have both include and exclude defined\")\n\t}\n\n\tif len(p.Project.Licenses) > 0 {\n\t\tfor _, license := range p.Project.Licenses {\n\t\t\tif license.Type == \"\" && license.URI == \"\" {\n\t\t\t\treturn errors.New(\"project.toml: must have a type or uri defined for each license\")\n\t\t\t}\n\t\t}\n\t}\n\n\tfor _, bp := range p.Build.Buildpacks {\n\t\tif bp.ID == \"\" && bp.URI == \"\" {\n\t\t\treturn errors.New(\"project.toml: buildpacks must have an id or uri defined\")\n\t\t}\n\t\tif bp.URI != \"\" && bp.Version != \"\" {\n\t\t\treturn 
errors.New(\"project.toml: buildpacks cannot have both uri and version defined\")\n\t\t}\n\t}\n\n\treturn nil\n}\n"
  },
  {
    "path": "pkg/project/project_test.go",
    "content": "package project\n\nimport (\n\t\"log\"\n\t\"os\"\n\t\"reflect\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/pkg/project/types\"\n\n\t\"github.com/buildpacks/lifecycle/api\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\t\"github.com/buildpacks/pack/pkg/logging\"\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestProject(t *testing.T) {\n\th.RequireDocker(t)\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\n\tspec.Run(t, \"Provider\", testProject, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testProject(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tlogger     *logging.LogWithWriters\n\t\treadStdout func() string\n\t)\n\n\tit.Before(func() {\n\t\tvar stdout *color.Console\n\t\tstdout, readStdout = h.MockWriterAndOutput()\n\t\tstderr, _ := h.MockWriterAndOutput()\n\t\tlogger = logging.NewLogWithWriters(stdout, stderr)\n\t})\n\n\twhen(\"#ReadProjectDescriptor\", func() {\n\t\twhen(\"valid 0.3 project.toml file is provided\", func() {\n\t\t\tit(\"should parse exec-env on [[io.buildpacks.group]]\", func() {\n\t\t\t\tprojectToml := `\n[_]\nname = \"gallant 0.3\"\nschema-version = \"0.3\"\n\n[[io.buildpacks.group]]\nid = \"buildpacks/metrics-agent\"\nversion = \"latest\"\nexec-env = [\"production\"]\n`\n\t\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\t\tif err != nil {\n\t\t\t\t\tt.Fatal(err)\n\t\t\t\t}\n\n\t\t\t\tprojectDescriptor, err := ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\t\tif err != nil {\n\t\t\t\t\tt.Fatal(err)\n\t\t\t\t}\n\n\t\t\t\tassertProjectName(t, \"gallant 0.3\", projectDescriptor)\n\t\t\t\tassertSchemaVersion(t, api.MustParse(\"0.3\"), projectDescriptor)\n\n\t\t\t\texpectedNumberOfBuildPacks := 1\n\t\t\t\texpectedNumberOfExecEnvs := 1\n\t\t\t\tatIndex := 0\n\t\t\t\tassertBuildPackGroupExecEnv(t, \"production\", expectedNumberOfBuildPacks, expectedNumberOfExecEnvs, atIndex, 
projectDescriptor)\n\t\t\t})\n\n\t\t\tit(\"should parse exec-env on [[io.buildpacks.pre.group]]\", func() {\n\t\t\t\tprojectToml := `\n[_]\nname = \"gallant 0.3\"\nschema-version = \"0.3\"\n\n[[io.buildpacks.pre.group]]\nid = \"buildpacks/procfile\"\nversion = \"latest\"\nexec-env = [\"test\"]\n`\n\t\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\t\tif err != nil {\n\t\t\t\t\tt.Fatal(err)\n\t\t\t\t}\n\n\t\t\t\tprojectDescriptor, err := ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\t\tif err != nil {\n\t\t\t\t\tt.Fatal(err)\n\t\t\t\t}\n\n\t\t\t\tassertProjectName(t, \"gallant 0.3\", projectDescriptor)\n\t\t\t\tassertSchemaVersion(t, api.MustParse(\"0.3\"), projectDescriptor)\n\n\t\t\t\texpectedNumberOfBuildPacks := 1\n\t\t\t\texpectedNumberOfExecEnvs := 1\n\t\t\t\tatIndex := 0\n\t\t\t\tassertBuildPackPreGroupExecEnv(t, \"test\", expectedNumberOfBuildPacks, expectedNumberOfExecEnvs, atIndex, projectDescriptor)\n\t\t\t})\n\t\t})\n\n\t\tit(\"should parse exec-env on [[io.buildpacks.post.group]]\", func() {\n\t\t\tprojectToml := `\n[_]\nname = \"gallant 0.3\"\nschema-version = \"0.3\"\n\n[[io.buildpacks.post.group]]\nid = \"buildpacks/headless-chrome\"\nversion = \"latest\"\nexec-env = [\"test-1\"]\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tprojectDescriptor, err := ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tassertProjectName(t, \"gallant 0.3\", projectDescriptor)\n\t\t\tassertSchemaVersion(t, api.MustParse(\"0.3\"), projectDescriptor)\n\n\t\t\texpectedNumberOfBuildPacks := 1\n\t\t\texpectedNumberOfExecEnvs := 1\n\t\t\tatIndex := 0\n\t\t\tassertBuildPackPostGroupExecEnv(t, \"test-1\", expectedNumberOfBuildPacks, expectedNumberOfExecEnvs, atIndex, projectDescriptor)\n\t\t})\n\n\t\tit(\"should parse exec-env on [[io.buildpacks.build.env]]\", func() {\n\t\t\tprojectToml := `\n[_]\nname = 
\"gallant 0.3\"\nschema-version = \"0.3\"\n\n[[io.buildpacks.build.env]]\nname = \"RAILS_ENV\"\nvalue = \"test\"\nexec-env = [\"test-1.1\"]\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tprojectDescriptor, err := ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tassertProjectName(t, \"gallant 0.3\", projectDescriptor)\n\t\t\tassertSchemaVersion(t, api.MustParse(\"0.3\"), projectDescriptor)\n\n\t\t\texpectedNumberOfBuildPacks := 1\n\t\t\texpectedNumberOfExecEnvs := 1\n\t\t\tatIndex := 0\n\t\t\tassertBuildPackBuildExecEnv(t, \"test-1.1\", expectedNumberOfBuildPacks, expectedNumberOfExecEnvs, atIndex, projectDescriptor)\n\t\t})\n\n\t\tit(\"should parse a valid v0.2 project.toml file\", func() {\n\t\t\tprojectToml := `\n[_]\nname = \"gallant 0.2\"\nschema-version=\"0.2\"\n[[_.licenses]]\ntype = \"MIT\"\n[_.metadata]\npipeline = \"Lucerne\"\n[io.buildpacks]\nexclude = [ \"*.jar\" ]\n[[io.buildpacks.pre.group]]\nuri = \"https://example.com/buildpack/pre\"\n[[io.buildpacks.post.group]]\nuri = \"https://example.com/buildpack/post\"\n[[io.buildpacks.group]]\nid = \"example/lua\"\nversion = \"1.0\"\n[[io.buildpacks.group]]\nuri = \"https://example.com/buildpack\"\n[[io.buildpacks.build.env]]\nname = \"JAVA_OPTS\"\nvalue = \"-Xmx300m\"\n[[io.buildpacks.env.build]]\nname = \"JAVA_OPTS\"\nvalue = \"this-should-get-overridden-because-its-deprecated\"\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tprojectDescriptor, err := ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tvar expected string\n\n\t\t\texpected = \"gallant 0.2\"\n\t\t\tif projectDescriptor.Project.Name != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut 
got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Project.Name)\n\t\t\t}\n\n\t\t\texpectedVersion := api.MustParse(\"0.2\")\n\t\t\tif !reflect.DeepEqual(expectedVersion, projectDescriptor.SchemaVersion) {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpectedVersion, projectDescriptor.SchemaVersion)\n\t\t\t}\n\n\t\t\texpected = \"example/lua\"\n\t\t\tif projectDescriptor.Build.Buildpacks[0].ID != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Buildpacks[0].ID)\n\t\t\t}\n\n\t\t\texpected = \"1.0\"\n\t\t\tif projectDescriptor.Build.Buildpacks[0].Version != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Buildpacks[0].Version)\n\t\t\t}\n\n\t\t\texpected = \"https://example.com/buildpack\"\n\t\t\tif projectDescriptor.Build.Buildpacks[1].URI != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Buildpacks[1].URI)\n\t\t\t}\n\n\t\t\texpected = \"https://example.com/buildpack/pre\"\n\t\t\tif projectDescriptor.Build.Pre.Buildpacks[0].URI != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Pre.Buildpacks[0].URI)\n\t\t\t}\n\n\t\t\texpected = \"https://example.com/buildpack/post\"\n\t\t\tif projectDescriptor.Build.Post.Buildpacks[0].URI != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Post.Buildpacks[0].URI)\n\t\t\t}\n\n\t\t\texpected = \"JAVA_OPTS\"\n\t\t\tif projectDescriptor.Build.Env[0].Name != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Env[0].Name)\n\t\t\t}\n\n\t\t\texpected = 
\"-Xmx300m\"\n\t\t\tif projectDescriptor.Build.Env[0].Value != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Env[0].Value)\n\t\t\t}\n\n\t\t\texpected = \"MIT\"\n\t\t\tif projectDescriptor.Project.Licenses[0].Type != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Project.Licenses[0].Type)\n\t\t\t}\n\n\t\t\texpected = \"Lucerne\"\n\t\t\tif projectDescriptor.Metadata[\"pipeline\"] != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Metadata[\"pipeline\"])\n\t\t\t}\n\t\t})\n\n\t\tit(\"should be backwards compatible with older v0.2 project.toml file\", func() {\n\t\t\tprojectToml := `\n[_]\nname = \"gallant 0.2\"\nschema-version=\"0.2\"\n[[io.buildpacks.env.build]]\nname = \"JAVA_OPTS\"\nvalue = \"-Xmx300m\"\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tprojectDescriptor, err := ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tvar expected string\n\n\t\t\texpected = \"JAVA_OPTS\"\n\t\t\tif projectDescriptor.Build.Env[0].Name != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Env[0].Name)\n\t\t\t}\n\n\t\t\texpected = \"-Xmx300m\"\n\t\t\tif projectDescriptor.Build.Env[0].Value != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Env[0].Value)\n\t\t\t}\n\t\t})\n\n\t\tit(\"should parse a valid v0.1 project.toml file\", func() {\n\t\t\tprojectToml := `\n[project]\nname = \"gallant\"\nversion = \"1.0.2\"\nsource-url = \"https://github.com/buildpacks/pack\"\n[[project.licenses]]\ntype = 
\"MIT\"\n[build]\nexclude = [ \"*.jar\" ]\n[[build.buildpacks]]\nid = \"example/lua\"\nversion = \"1.0\"\n[[build.buildpacks]]\nuri = \"https://example.com/buildpack\"\n[[build.env]]\nname = \"JAVA_OPTS\"\nvalue = \"-Xmx300m\"\n[metadata]\npipeline = \"Lucerne\"\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tprojectDescriptor, err := ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tvar expected string\n\n\t\t\texpected = \"gallant\"\n\t\t\tif projectDescriptor.Project.Name != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Project.Name)\n\t\t\t}\n\n\t\t\texpectedVersion := api.MustParse(\"0.1\")\n\t\t\tif !reflect.DeepEqual(expectedVersion, projectDescriptor.SchemaVersion) {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpectedVersion, projectDescriptor.SchemaVersion)\n\t\t\t}\n\n\t\t\texpected = \"1.0.2\"\n\t\t\tif projectDescriptor.Project.Version != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Project.Version)\n\t\t\t}\n\n\t\t\texpected = \"https://github.com/buildpacks/pack\"\n\t\t\tif projectDescriptor.Project.SourceURL != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Project.SourceURL)\n\t\t\t}\n\n\t\t\texpected = \"example/lua\"\n\t\t\tif projectDescriptor.Build.Buildpacks[0].ID != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Buildpacks[0].ID)\n\t\t\t}\n\n\t\t\texpected = \"1.0\"\n\t\t\tif projectDescriptor.Build.Buildpacks[0].Version != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut 
got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Buildpacks[0].Version)\n\t\t\t}\n\n\t\t\texpected = \"https://example.com/buildpack\"\n\t\t\tif projectDescriptor.Build.Buildpacks[1].URI != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Buildpacks[1].URI)\n\t\t\t}\n\n\t\t\texpected = \"JAVA_OPTS\"\n\t\t\tif projectDescriptor.Build.Env[0].Name != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Env[0].Name)\n\t\t\t}\n\n\t\t\texpected = \"-Xmx300m\"\n\t\t\tif projectDescriptor.Build.Env[0].Value != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Build.Env[0].Value)\n\t\t\t}\n\n\t\t\texpected = \"MIT\"\n\t\t\tif projectDescriptor.Project.Licenses[0].Type != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Project.Licenses[0].Type)\n\t\t\t}\n\n\t\t\texpected = \"Lucerne\"\n\t\t\tif projectDescriptor.Metadata[\"pipeline\"] != expected {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\texpected, projectDescriptor.Metadata[\"pipeline\"])\n\t\t\t}\n\t\t})\n\n\t\tit(\"should create empty build ENV\", func() {\n\t\t\tprojectToml := `\n[project]\nname = \"gallant\"\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\tprojectDescriptor, err := ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\texpected := 0\n\t\t\tif len(projectDescriptor.Build.Env) != 0 {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%d\\n-----\\nbut got\\n-----\\n%d\\n\",\n\t\t\t\t\texpected, len(projectDescriptor.Build.Env))\n\t\t\t}\n\n\t\t\tfor _, envVar := range 
projectDescriptor.Build.Env {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\t\"[]\", envVar)\n\t\t\t}\n\t\t})\n\n\t\tit(\"should fail for an invalid project.toml path\", func() {\n\t\t\t_, err := ReadProjectDescriptor(\"/path/that/does/not/exist/project.toml\", logger)\n\n\t\t\tif !os.IsNotExist(err) {\n\t\t\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\t\t\t\"project.toml does not exist error\", \"no error\")\n\t\t\t}\n\t\t})\n\n\t\tit(\"should enforce mutual exclusivity between exclude and include\", func() {\n\t\t\tprojectToml := `\n[project]\nname = \"bad excludes and includes\"\n\n[build]\nexclude = [ \"*.jar\" ]\ninclude = [ \"*.jpg\" ]\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\t\t\t_, err = ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err == nil {\n\t\t\t\tt.Fatalf(\n\t\t\t\t\t\"Expected error for having both exclude and include defined\")\n\t\t\t}\n\t\t})\n\n\t\tit(\"should have an id or uri defined for buildpacks\", func() {\n\t\t\tprojectToml := `\n[project]\nname = \"missing buildpacks id and uri\"\n\n[[build.buildpacks]]\nversion = \"1.2.3\"\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\t_, err = ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err == nil {\n\t\t\t\tt.Fatalf(\"Expected error for NOT having id or uri defined for buildpacks\")\n\t\t\t}\n\t\t})\n\n\t\tit(\"should not allow both uri and version\", func() {\n\t\t\tprojectToml := `\n[project]\nname = \"cannot have both uri and version defined\"\n\n[[build.buildpacks]]\nuri = \"https://example.com/buildpack\"\nversion = \"1.2.3\"\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\t_, err = ReadProjectDescriptor(tmpProjectToml.Name(), 
logger)\n\t\t\tif err == nil {\n\t\t\t\tt.Fatal(\"Expected error for having both uri and version defined for a buildpack\")\n\t\t\t}\n\t\t})\n\n\t\tit(\"should require either a type or uri for licenses\", func() {\n\t\t\tprojectToml := `\n[project]\nname = \"licenses should have either a type or uri defined\"\n\n[[project.licenses]]\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\t_, err = ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\tif err == nil {\n\t\t\t\tt.Fatal(\"Expected error for having neither type nor uri defined for licenses\")\n\t\t\t}\n\t\t})\n\n\t\tit(\"should warn when no schema version is declared\", func() {\n\t\t\tprojectToml := ``\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\t_, err = ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\th.AssertNil(t, err)\n\n\t\t\th.AssertContains(t, readStdout(), \"Warning: No schema version declared in project.toml, defaulting to schema version 0.1\\n\")\n\t\t})\n\n\t\tit(\"should warn when unsupported keys, on tables the project owns, are declared with schema v0.1\", func() {\n\t\t\tprojectToml := `\n[project]\nauthors = [\"foo\", \"bar\"]\n\n# try to use the io.buildpacks table with version 0.1 - warning message expected\n[[io.buildpacks.build.env]]\nname = \"JAVA_OPTS\"\nvalue = \"-Xmx1g\"\n\n# something else defined by end-users - no warning message expected\n[io.docker]\nfile = \"./Dockerfile\"\n\n# some metadata - no warning message expected\n[metadata]\nfoo = \"bar\"\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\t_, err = ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\th.AssertNil(t, err)\n\t\t\th.AssertContains(\n\t\t\t\tt,\n\t\t\t\treadStdout(),\n\t\t\t\t\"Warning: The following keys declared in project.toml are not 
supported in schema version 0.1:\\n\"+\n\t\t\t\t\t\"Warning: - io.buildpacks.build.env\\n\"+\n\t\t\t\t\t\"Warning: - io.buildpacks.build.env.name\\n\"+\n\t\t\t\t\t\"Warning: - io.buildpacks.build.env.value\\n\"+\n\t\t\t\t\t\"Warning: The above keys will be ignored. If this is not intentional, try updating your schema version.\\n\",\n\t\t\t)\n\t\t})\n\n\t\tit(\"should warn when unsupported keys, on tables the project owns, are declared with schema v0.2\", func() {\n\t\t\tprojectToml := `\n[_]\nschema-version = \"0.2\"\nid = \"foo\"\nversion = \"bar\"\n# typo in a key under a valid table - warning message expected\nversions = \"0.1\"\n\n[[_.licenses]]\ntype = \"foo\"\n# invalid key under a valid table - warning message expected\nfoo = \"bar\"\n\n# try to use an invalid key under io.buildpacks - warning message expected\n[[io.buildpacks.build.foo]]\nname = \"something\"\n\n# something else defined by end-users - no warning message expected\n[io.docker]\nfile = \"./Dockerfile\"\n\n# some metadata defined by the end-user - no warning message expected\n[_.metadata]\nfoo = \"bar\"\n\n# more metadata defined by the end-user - no warning message expected\n[_.metadata.fizz]\nbuzz = [\"a\", \"b\", \"c\"]\n`\n\t\t\ttmpProjectToml, err := createTmpProjectTomlFile(projectToml)\n\t\t\tif err != nil {\n\t\t\t\tt.Fatal(err)\n\t\t\t}\n\n\t\t\t_, err = ReadProjectDescriptor(tmpProjectToml.Name(), logger)\n\t\t\th.AssertNil(t, err)\n\n\t\t\t// Assert we only warn\n\t\t\th.AssertContains(\n\t\t\t\tt,\n\t\t\t\treadStdout(),\n\t\t\t\t\"Warning: The following keys declared in project.toml are not supported in schema version 0.2:\\n\"+\n\t\t\t\t\t\"Warning: - _.versions\\n\"+\n\t\t\t\t\t\"Warning: - _.licenses.foo\\n\"+\n\t\t\t\t\t\"Warning: - io.buildpacks.build.foo\\n\"+\n\t\t\t\t\t\"Warning: - io.buildpacks.build.foo.name\\n\"+\n\t\t\t\t\t\"Warning: The above keys will be ignored. 
If this is not intentional, try updating your schema version.\\n\",\n\t\t\t)\n\t\t})\n\t})\n}\n\nfunc assertProjectName(t *testing.T, expected string, projectDescriptor types.Descriptor) {\n\tif projectDescriptor.Project.Name != expected {\n\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\texpected, projectDescriptor.Project.Name)\n\t}\n}\n\nfunc assertSchemaVersion(t *testing.T, expected *api.Version, projectDescriptor types.Descriptor) {\n\tif !reflect.DeepEqual(expected, projectDescriptor.SchemaVersion) {\n\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\texpected, projectDescriptor.SchemaVersion)\n\t}\n}\n\nfunc assertBuildPackGroupExecEnv(t *testing.T, expected string, bpLength int, execEnvLength int, atIndex int, projectDescriptor types.Descriptor) {\n\th.AssertTrue(t, len(projectDescriptor.Build.Buildpacks) == bpLength)\n\th.AssertTrue(t, len(projectDescriptor.Build.Buildpacks[atIndex].ExecEnv) == execEnvLength)\n\tif !reflect.DeepEqual(expected, projectDescriptor.Build.Buildpacks[atIndex].ExecEnv[atIndex]) {\n\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\texpected, projectDescriptor.Build.Buildpacks[atIndex].ExecEnv[atIndex])\n\t}\n}\n\nfunc assertBuildPackPreGroupExecEnv(t *testing.T, expected string, bpLength int, execEnvLength int, atIndex int, projectDescriptor types.Descriptor) {\n\th.AssertTrue(t, len(projectDescriptor.Build.Pre.Buildpacks) == bpLength)\n\th.AssertTrue(t, len(projectDescriptor.Build.Pre.Buildpacks[atIndex].ExecEnv) == execEnvLength)\n\tif !reflect.DeepEqual(expected, projectDescriptor.Build.Pre.Buildpacks[atIndex].ExecEnv[atIndex]) {\n\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\texpected, projectDescriptor.Build.Pre.Buildpacks[atIndex].ExecEnv[atIndex])\n\t}\n}\n\nfunc assertBuildPackPostGroupExecEnv(t *testing.T, expected string, bpLength int, execEnvLength int, atIndex int, projectDescriptor 
types.Descriptor) {\n\th.AssertTrue(t, len(projectDescriptor.Build.Post.Buildpacks) == bpLength)\n\th.AssertTrue(t, len(projectDescriptor.Build.Post.Buildpacks[atIndex].ExecEnv) == execEnvLength)\n\tif !reflect.DeepEqual(expected, projectDescriptor.Build.Post.Buildpacks[atIndex].ExecEnv[atIndex]) {\n\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\texpected, projectDescriptor.Build.Post.Buildpacks[atIndex].ExecEnv[atIndex])\n\t}\n}\n\nfunc assertBuildPackBuildExecEnv(t *testing.T, expected string, bpLength int, execEnvLength int, atIndex int, projectDescriptor types.Descriptor) {\n\th.AssertTrue(t, len(projectDescriptor.Build.Env) == bpLength)\n\th.AssertTrue(t, len(projectDescriptor.Build.Env[atIndex].ExecEnv) == execEnvLength)\n\tif !reflect.DeepEqual(expected, projectDescriptor.Build.Env[atIndex].ExecEnv[atIndex]) {\n\t\tt.Fatalf(\"Expected\\n-----\\n%#v\\n-----\\nbut got\\n-----\\n%#v\\n\",\n\t\t\texpected, projectDescriptor.Build.Env[atIndex].ExecEnv[atIndex])\n\t}\n}\n\nfunc createTmpProjectTomlFile(projectToml string) (*os.File, error) {\n\ttmpProjectToml, err := os.CreateTemp(os.TempDir(), \"project-\")\n\tif err != nil {\n\t\tlog.Fatal(\"Failed to create temporary project toml file\", err)\n\t}\n\n\tif _, err := tmpProjectToml.Write([]byte(projectToml)); err != nil {\n\t\tlog.Fatal(\"Failed to write to temporary file\", err)\n\t}\n\treturn tmpProjectToml, err\n}\n"
  },
  {
    "path": "pkg/project/types/types.go",
    "content": "package types\n\nimport (\n\t\"github.com/buildpacks/lifecycle/api\"\n)\n\ntype Script struct {\n\tAPI    string `toml:\"api\"`\n\tInline string `toml:\"inline\"`\n\tShell  string `toml:\"shell\"`\n}\n\ntype Buildpack struct {\n\tID      string   `toml:\"id\"`\n\tVersion string   `toml:\"version\"`\n\tURI     string   `toml:\"uri\"`\n\tScript  Script   `toml:\"script\"`\n\tExecEnv []string `toml:\"exec-env\"`\n}\n\ntype EnvVar struct {\n\tName    string   `toml:\"name\"`\n\tValue   string   `toml:\"value\"`\n\tExecEnv []string `toml:\"exec-env\"`\n}\n\ntype Build struct {\n\tInclude    []string    `toml:\"include\"`\n\tExclude    []string    `toml:\"exclude\"`\n\tBuildpacks []Buildpack `toml:\"buildpacks\"`\n\tEnv        []EnvVar    `toml:\"env\"`\n\tBuilder    string      `toml:\"builder\"`\n\tPre        GroupAddition\n\tPost       GroupAddition\n}\n\ntype Project struct {\n\tID               string    `toml:\"id\"`\n\tName             string    `toml:\"name\"`\n\tVersion          string    `toml:\"version\"`\n\tAuthors          []string  `toml:\"authors\"`\n\tDocumentationURL string    `toml:\"documentation-url\"`\n\tSourceURL        string    `toml:\"source-url\"`\n\tLicenses         []License `toml:\"licenses\"`\n}\n\ntype License struct {\n\tType string `toml:\"type\"`\n\tURI  string `toml:\"uri\"`\n}\n\ntype Descriptor struct {\n\tProject       Project                `toml:\"project\"`\n\tBuild         Build                  `toml:\"build\"`\n\tMetadata      map[string]interface{} `toml:\"metadata\"`\n\tSchemaVersion *api.Version\n}\n\ntype GroupAddition struct {\n\tBuildpacks []Buildpack `toml:\"group\"`\n}\n"
  },
  {
    "path": "pkg/project/v01/project.go",
    "content": "package v01\n\nimport (\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/buildpacks/pack/pkg/project/types\"\n)\n\ntype Descriptor struct {\n\tProject  types.Project          `toml:\"project\"`\n\tBuild    types.Build            `toml:\"build\"`\n\tMetadata map[string]interface{} `toml:\"metadata\"`\n}\n\nfunc NewDescriptor(projectTomlContents string) (types.Descriptor, toml.MetaData, error) {\n\tversionedDescriptor := &Descriptor{}\n\n\ttomlMetaData, err := toml.Decode(projectTomlContents, versionedDescriptor)\n\tif err != nil {\n\t\treturn types.Descriptor{}, tomlMetaData, err\n\t}\n\n\treturn types.Descriptor{\n\t\tProject:       versionedDescriptor.Project,\n\t\tBuild:         versionedDescriptor.Build,\n\t\tMetadata:      versionedDescriptor.Metadata,\n\t\tSchemaVersion: api.MustParse(\"0.1\"),\n\t}, tomlMetaData, nil\n}\n"
  },
  {
    "path": "pkg/project/v02/metadata.go",
    "content": "package v02\n\nimport (\n\t\"fmt\"\n\t\"sort\"\n\t\"strings\"\n\t\"time\"\n\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/go-git/go-git/v5\"\n\t\"github.com/go-git/go-git/v5/plumbing\"\n)\n\ntype TagInfo struct {\n\tName    string\n\tMessage string\n\tType    string\n\tTagHash string\n\tTagTime time.Time\n}\n\nfunc GitMetadata(appPath string) *files.ProjectSource {\n\trepo, err := git.PlainOpen(appPath)\n\tif err != nil {\n\t\treturn nil\n\t}\n\theadRef, err := repo.Head()\n\tif err != nil {\n\t\treturn nil\n\t}\n\tcommitTagMap := generateTagsMap(repo)\n\n\tdescribe := parseGitDescribe(repo, headRef, commitTagMap)\n\trefs := parseGitRefs(repo, headRef, commitTagMap)\n\tremote := parseGitRemote(repo)\n\n\tprojectSource := &files.ProjectSource{\n\t\tType: \"git\",\n\t\tVersion: map[string]interface{}{\n\t\t\t\"commit\":   headRef.Hash().String(),\n\t\t\t\"describe\": describe,\n\t\t},\n\t\tMetadata: map[string]interface{}{\n\t\t\t\"refs\": refs,\n\t\t\t\"url\":  remote,\n\t\t},\n\t}\n\treturn projectSource\n}\n\nfunc generateTagsMap(repo *git.Repository) map[string][]TagInfo {\n\tcommitTagMap := make(map[string][]TagInfo)\n\ttags, err := repo.Tags()\n\tif err != nil {\n\t\treturn commitTagMap\n\t}\n\n\ttags.ForEach(func(ref *plumbing.Reference) error {\n\t\ttagObj, err := repo.TagObject(ref.Hash())\n\t\tswitch err {\n\t\tcase nil:\n\t\t\tcommitTagMap[tagObj.Target.String()] = append(\n\t\t\t\tcommitTagMap[tagObj.Target.String()],\n\t\t\t\tTagInfo{Name: tagObj.Name, Message: tagObj.Message, Type: \"annotated\", TagHash: ref.Hash().String(), TagTime: tagObj.Tagger.When},\n\t\t\t)\n\t\tcase plumbing.ErrObjectNotFound:\n\t\t\tcommitTagMap[ref.Hash().String()] = append(\n\t\t\t\tcommitTagMap[ref.Hash().String()],\n\t\t\t\tTagInfo{Name: getRefName(ref.Name().String()), Message: \"\", Type: \"unannotated\", TagHash: ref.Hash().String(), TagTime: time.Now()},\n\t\t\t)\n\t\tdefault:\n\t\t\treturn err\n\t\t}\n\t\treturn 
nil\n\t})\n\n\tfor _, tagRefs := range commitTagMap {\n\t\tsort.Slice(tagRefs, func(i, j int) bool {\n\t\t\tif tagRefs[i].Type == \"annotated\" && tagRefs[j].Type == \"annotated\" {\n\t\t\t\treturn tagRefs[i].TagTime.After(tagRefs[j].TagTime)\n\t\t\t}\n\t\t\tif tagRefs[i].Type == \"unannotated\" && tagRefs[j].Type == \"unannotated\" {\n\t\t\t\treturn tagRefs[i].Name < tagRefs[j].Name\n\t\t\t}\n\t\t\tif tagRefs[i].Type == \"annotated\" && tagRefs[j].Type == \"unannotated\" {\n\t\t\t\treturn true\n\t\t\t}\n\t\t\treturn false\n\t\t})\n\t}\n\treturn commitTagMap\n}\n\nfunc generateBranchMap(repo *git.Repository) map[string][]string {\n\tcommitBranchMap := make(map[string][]string)\n\tbranches, err := repo.Branches()\n\tif err != nil {\n\t\treturn commitBranchMap\n\t}\n\tbranches.ForEach(func(ref *plumbing.Reference) error {\n\t\tcommitBranchMap[ref.Hash().String()] = append(commitBranchMap[ref.Hash().String()], getRefName(ref.Name().String()))\n\t\treturn nil\n\t})\n\treturn commitBranchMap\n}\n\n// `git describe --tags --always`\nfunc parseGitDescribe(repo *git.Repository, headRef *plumbing.Reference, commitTagMap map[string][]TagInfo) string {\n\tlogOpts := &git.LogOptions{\n\t\tFrom:  headRef.Hash(),\n\t\tOrder: git.LogOrderCommitterTime,\n\t}\n\tcommits, err := repo.Log(logOpts)\n\tif err != nil {\n\t\treturn \"\"\n\t}\n\n\tlatestTag := headRef.Hash().String()\n\tcommitsFromHEAD := 0\n\tcommitBranchMap := generateBranchMap(repo)\n\tbranchAtHEAD := getRefName(headRef.String())\n\tcurrentBranch := branchAtHEAD\n\tfor {\n\t\tcommitInfo, err := commits.Next()\n\t\tif err != nil {\n\t\t\tbreak\n\t\t}\n\n\t\tif branchesAtCommit, exists := commitBranchMap[commitInfo.Hash.String()]; exists {\n\t\t\tcurrentBranch = branchesAtCommit[0]\n\t\t}\n\t\tif refs, exists := commitTagMap[commitInfo.Hash.String()]; exists {\n\t\t\tif branchAtHEAD != currentBranch && commitsFromHEAD != 0 {\n\t\t\t\t// https://git-scm.com/docs/git-describe#_examples\n\t\t\t\tlatestTag = 
fmt.Sprintf(\"%s-%d-g%s\", refs[0].Name, commitsFromHEAD, headRef.Hash().String())\n\t\t\t} else {\n\t\t\t\tlatestTag = refs[0].Name\n\t\t\t}\n\t\t\tbreak\n\t\t}\n\t\tcommitsFromHEAD += 1\n\t}\n\treturn latestTag\n}\n\nfunc parseGitRefs(repo *git.Repository, headRef *plumbing.Reference, commitTagMap map[string][]TagInfo) []string {\n\tvar parsedRefs []string\n\tparsedRefs = append(parsedRefs, getRefName(headRef.Name().String()))\n\tif refs, exists := commitTagMap[headRef.Hash().String()]; exists {\n\t\tfor _, ref := range refs {\n\t\t\tparsedRefs = append(parsedRefs, ref.Name)\n\t\t}\n\t}\n\treturn parsedRefs\n}\n\nfunc parseGitRemote(repo *git.Repository) string {\n\tremotes, err := repo.Remotes()\n\tif err != nil || len(remotes) == 0 {\n\t\treturn \"\"\n\t}\n\n\tfor _, remote := range remotes {\n\t\tif remote.Config().Name == \"origin\" {\n\t\t\treturn remote.Config().URLs[0]\n\t\t}\n\t}\n\treturn remotes[0].Config().URLs[0]\n}\n\n// Parse ref name from refs/tags/<ref_name>\nfunc getRefName(ref string) string {\n\tif refSplit := strings.SplitN(ref, \"/\", 3); len(refSplit) == 3 {\n\t\treturn refSplit[2]\n\t}\n\treturn \"\"\n}\n"
  },
  {
    "path": "pkg/project/v02/metadata_test.go",
    "content": "package v02\n\nimport (\n\t\"fmt\"\n\t\"math/rand\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"sort\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/buildpacks/lifecycle/platform/files\"\n\t\"github.com/go-git/go-git/v5\"\n\t\"github.com/go-git/go-git/v5/config\"\n\t\"github.com/go-git/go-git/v5/plumbing\"\n\t\"github.com/go-git/go-git/v5/plumbing/object\"\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n\n\th \"github.com/buildpacks/pack/testhelpers\"\n)\n\nfunc TestMetadata(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Metadata\", testMetadata, spec.Sequential(), spec.Report(report.Terminal{}))\n}\n\nfunc testMetadata(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\trepoPath string\n\t\trepo     *git.Repository\n\t\tcommits  []plumbing.Hash\n\t)\n\n\tit.Before(func() {\n\t\tvar err error\n\n\t\trepoPath, err = os.MkdirTemp(\"\", \"test-repo\")\n\t\th.AssertNil(t, err)\n\n\t\trepo, err = git.PlainInit(repoPath, false)\n\t\th.AssertNil(t, err)\n\n\t\tcommits = createCommits(t, repo, repoPath, 5)\n\t})\n\n\tit.After(func() {\n\t\th.AssertNil(t, os.RemoveAll(repoPath))\n\t})\n\n\twhen(\"#GitMetadata\", func() {\n\t\tit(\"returns proper metadata format\", func() {\n\t\t\tassert := h.NewAssertionManager(t)\n\t\t\tremoteOpts := &config.RemoteConfig{\n\t\t\t\tName: \"origin\",\n\t\t\t\tURLs: []string{\"git@github.com:testorg/testproj.git\", \"git@github.com:testorg/testproj.git\"},\n\t\t\t}\n\t\t\trepo.CreateRemote(remoteOpts)\n\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"testTag\")\n\n\t\t\toutput := GitMetadata(repoPath)\n\t\t\texpectedOutput := &files.ProjectSource{\n\t\t\t\tType: \"git\",\n\t\t\t\tVersion: map[string]interface{}{\n\t\t\t\t\t\"commit\":   commits[len(commits)-1].String(),\n\t\t\t\t\t\"describe\": \"testTag\",\n\t\t\t\t},\n\t\t\t\tMetadata: map[string]interface{}{\n\t\t\t\t\t\"refs\": []string{\"master\", 
\"testTag\"},\n\t\t\t\t\t\"url\":  \"git@github.com:testorg/testproj.git\",\n\t\t\t\t},\n\t\t\t}\n\t\t\tassert.Equal(output, expectedOutput)\n\t\t})\n\n\t\tit(\"returns nil if error occurs while fetching metadata\", func() {\n\t\t\toutput := GitMetadata(\"/git-path-not-found-ok\")\n\t\t\th.AssertNil(t, output)\n\t\t})\n\t})\n\n\twhen(\"#generateTagsMap\", func() {\n\t\twhen(\"repository has no tags\", func() {\n\t\t\tit(\"returns empty map\", func() {\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\th.AssertEq(t, len(commitTagsMap), 0)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"repository has only unannotated tags\", func() {\n\t\t\tit(\"returns correct map if commits only have one tag\", func() {\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[i], \"\")\n\t\t\t\t}\n\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\th.AssertEq(t, len(commitTagsMap), 4)\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\ttagsInfo, shouldExist := commitTagsMap[commits[i].String()]\n\t\t\t\t\th.AssertEq(t, shouldExist, true)\n\t\t\t\t\th.AssertNotEq(t, tagsInfo[0].Name, \"\")\n\t\t\t\t\th.AssertEq(t, tagsInfo[0].Type, \"unannotated\")\n\t\t\t\t\th.AssertEq(t, tagsInfo[0].Message, \"\")\n\t\t\t\t}\n\t\t\t\t_, shouldNotExist := commitTagsMap[commits[3].String()]\n\t\t\t\th.AssertEq(t, shouldNotExist, true)\n\t\t\t})\n\n\t\t\tit(\"returns map sorted by ascending tag name if commits have multiple tags\", func() {\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\tfor j := 0; j <= rand.Intn(10); j++ {\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[i], \"\")\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\th.AssertEq(t, len(commitTagsMap), 4)\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\ttagsInfo, shouldExist := commitTagsMap[commits[i].String()]\n\t\t\t\t\th.AssertEq(t, shouldExist, true)\n\n\t\t\t\t\ttagsSortedByName := sort.SliceIsSorted(tagsInfo, func(i, j int) bool {\n\t\t\t\t\t\treturn tagsInfo[i].Name < 
tagsInfo[j].Name\n\t\t\t\t\t})\n\t\t\t\t\th.AssertEq(t, tagsSortedByName, true)\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\n\t\twhen(\"repository has only annotated tags\", func() {\n\t\t\tit(\"returns correct map if commits only have one tag\", func() {\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[i], \"\")\n\t\t\t\t}\n\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\th.AssertEq(t, len(commitTagsMap), 4)\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\ttagsInfo, shouldExist := commitTagsMap[commits[i].String()]\n\t\t\t\t\th.AssertEq(t, shouldExist, true)\n\t\t\t\t\th.AssertNotEq(t, tagsInfo[0].Name, \"\")\n\t\t\t\t\th.AssertEq(t, tagsInfo[0].Type, \"annotated\")\n\t\t\t\t\th.AssertNotEq(t, tagsInfo[0].Message, \"\")\n\t\t\t\t}\n\t\t\t\t_, shouldNotExist := commitTagsMap[commits[3].String()]\n\t\t\t\th.AssertEq(t, shouldNotExist, true)\n\t\t\t})\n\n\t\t\tit(\"returns map sorted by descending tag creation time if commits have multiple tags\", func() {\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\tfor j := 0; j <= rand.Intn(10); j++ {\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[i], \"\")\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\th.AssertEq(t, len(commitTagsMap), 4)\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\ttagsInfo, shouldExist := commitTagsMap[commits[i].String()]\n\t\t\t\t\th.AssertEq(t, shouldExist, true)\n\n\t\t\t\t\ttagsSortedByTime := sort.SliceIsSorted(tagsInfo, func(i, j int) bool {\n\t\t\t\t\t\treturn tagsInfo[i].TagTime.After(tagsInfo[j].TagTime)\n\t\t\t\t\t})\n\t\t\t\t\th.AssertEq(t, tagsSortedByTime, true)\n\t\t\t\t}\n\t\t\t\t_, shouldNotExist := commitTagsMap[commits[3].String()]\n\t\t\t\th.AssertEq(t, shouldNotExist, true)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"repository has both annotated and unannotated tags\", func() {\n\t\t\tit(\"returns map where annotated tags exist prior to unannotated if commits have multiple tags\", func() {\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\tfor j := 0; j <= rand.Intn(10); j++ {\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[i], \"\")\n\t\t\t\t\t}\n\t\t\t\t\tfor j := 0; j <= rand.Intn(10); j++ {\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[i], \"\")\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\th.AssertEq(t, len(commitTagsMap), 4)\n\t\t\t\tfor i := 0; i < 4; i++ {\n\t\t\t\t\ttagsInfo, shouldExist := commitTagsMap[commits[i].String()]\n\t\t\t\t\th.AssertEq(t, shouldExist, true)\n\n\t\t\t\t\ttagsSortedByType := sort.SliceIsSorted(tagsInfo, func(i, j int) bool {\n\t\t\t\t\t\tif tagsInfo[i].Type == \"annotated\" && tagsInfo[j].Type == \"unannotated\" {\n\t\t\t\t\t\t\treturn true\n\t\t\t\t\t\t}\n\t\t\t\t\t\treturn false\n\t\t\t\t\t})\n\t\t\t\t\th.AssertEq(t, tagsSortedByType, true)\n\t\t\t\t}\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#generateBranchMap\", func() {\n\t\tit(\"returns map with latest commit of the `master` branch\", func() {\n\t\t\tbranchMap := generateBranchMap(repo)\n\t\t\th.AssertEq(t, branchMap[commits[len(commits)-1].String()][0], \"master\")\n\t\t})\n\n\t\tit(\"returns map with latest commit of all the branches\", func() {\n\t\t\tcheckoutBranch(t, repo, \"newbranch-1\", true)\n\t\t\tnewBranchCommits := createCommits(t, repo, repoPath, 3)\n\t\t\tcheckoutBranch(t, repo, \"master\", false)\n\t\t\tcheckoutBranch(t, repo, \"newbranch-2\", true)\n\n\t\t\tbranchMap := generateBranchMap(repo)\n\t\t\th.AssertEq(t, branchMap[commits[len(commits)-1].String()][0], \"master\")\n\t\t\th.AssertEq(t, branchMap[commits[len(commits)-1].String()][1], \"newbranch-2\")\n\t\t\th.AssertEq(t, branchMap[newBranchCommits[len(newBranchCommits)-1].String()][0], \"newbranch-1\")\n\t\t})\n\t})\n\n\twhen(\"#parseGitDescribe\", func() {\n\t\twhen(\"all tags are defined in a single branch\", func() {\n\t\t\twhen(\"repository has no tags\", func() {\n\t\t\t\tit(\"returns latest commit hash\", func() {\n\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\theadRef, err := 
repo.Head()\n\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\th.AssertEq(t, output, commits[len(commits)-1].String())\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"repository has only unannotated tags\", func() {\n\t\t\t\tit(\"returns first tag encountered from HEAD\", func() {\n\t\t\t\t\tfor i := 0; i < 3; i++ {\n\t\t\t\t\t\ttagName := fmt.Sprintf(\"v0.%d-lw\", i+1)\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[i], tagName)\n\t\t\t\t\t}\n\n\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\th.AssertEq(t, output, \"v0.3-lw\")\n\t\t\t\t})\n\n\t\t\t\tit(\"returns proper tag name for tags containing `/`\", func() {\n\t\t\t\t\ttagName := \"v0.1/testing\"\n\t\t\t\t\tt.Logf(\"Checking output for tag name: %s\", tagName)\n\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[0], tagName)\n\n\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\th.AssertContains(t, output, \"v0.1/testing\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"repository has only annotated tags\", func() {\n\t\t\t\tit(\"returns first tag encountered from HEAD\", func() {\n\t\t\t\t\tfor i := 0; i < 3; i++ {\n\t\t\t\t\t\ttagName := fmt.Sprintf(\"v0.%d\", i+1)\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[i], tagName)\n\t\t\t\t\t}\n\n\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\th.AssertEq(t, output, \"v0.3\")\n\t\t\t\t})\n\t\t\t})\n\n\t\t\twhen(\"repository has both annotated and unannotated tags\", func() {\n\t\t\t\twhen(\"each commit has only one tag\", func() {\n\t\t\t\t\tit(\"returns the first tag encountered from 
HEAD if unannotated tag comes first\", func() {\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[0], \"ann-tag-at-commit-0\")\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[1], \"unann-tag-at-commit-1\")\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[2], \"ann-tag-at-commit-2\")\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[3], \"unann-tag-at-commit-3\")\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[4], \"unann-tag-at-commit-4\")\n\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\th.AssertEq(t, output, \"unann-tag-at-commit-4\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the first tag encountered from HEAD if annotated tag comes first\", func() {\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[0], \"ann-tag-at-commit-0\")\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[1], \"unann-tag-at-commit-1\")\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[2], \"ann-tag-at-commit-2\")\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[3], \"ann-tag-at-commit-3\")\n\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\th.AssertEq(t, output, \"ann-tag-at-commit-3\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the tag at HEAD if annotated tag exists at HEAD\", func() {\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[4], \"ann-tag-at-HEAD\")\n\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\th.AssertEq(t, output, \"ann-tag-at-HEAD\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the tag at HEAD if unannotated tag exists at HEAD\", func() {\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[4], 
\"unann-tag-at-HEAD\")\n\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\th.AssertEq(t, output, \"unann-tag-at-HEAD\")\n\t\t\t\t\t})\n\t\t\t\t})\n\n\t\t\t\twhen(\"commits have multiple tags\", func() {\n\t\t\t\t\tit(\"returns most recently created tag if a commit has multiple annotated tags\", func() {\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[1], \"ann-tag-1-at-commit-1\")\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[2], \"ann-tag-1-at-commit-2\")\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[2], \"ann-tag-2-at-commit-2\")\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[2], \"ann-tag-3-at-commit-2\")\n\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\ttagsAtCommit := commitTagsMap[commits[2].String()]\n\t\t\t\t\t\th.AssertEq(t, output, tagsAtCommit[0].Name)\n\t\t\t\t\t\tfor i := 1; i < len(tagsAtCommit); i++ {\n\t\t\t\t\t\t\th.AssertEq(t, tagsAtCommit[i].TagTime.Before(tagsAtCommit[0].TagTime), true)\n\t\t\t\t\t\t}\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the tag name that comes first when sorted alphabetically if a commit has multiple unannotated tags\", func() {\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[1], \"ann-tag-1-at-commit-1\")\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[2], \"v0.000002-lw\")\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[2], \"v0.0002-lw\")\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[2], \"v1.0002-lw\")\n\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\th.AssertEq(t, output, 
\"v0.000002-lw\")\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns annotated tag if a commit has both annotated and unannotated tags\", func() {\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[1], \"ann-tag-1-at-commit-1\")\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[2], \"ann-tag-1-at-commit-2\")\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[2], \"unann-tag-1-at-commit-2\")\n\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\th.AssertEq(t, output, \"ann-tag-1-at-commit-2\")\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\n\t\twhen(\"tags are defined in multiple branches\", func() {\n\t\t\twhen(\"tag is defined in the latest commit of `master` branch and HEAD is at a different branch\", func() {\n\t\t\t\tit(\"returns the tag if HEAD, master, and a different branch are at the tag\", func() {\n\t\t\t\t\tcheckoutBranch(t, repo, \"new-branch\", true)\n\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-1], \"ann-tag-at-HEAD\")\n\n\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\th.AssertEq(t, output, \"ann-tag-at-HEAD\")\n\t\t\t\t})\n\n\t\t\t\twhen(\"branch is multiple commits ahead of master\", func() {\n\t\t\t\t\tit(\"returns git generated version of annotated tag if branch is 2 commits ahead of `master`\", func() {\n\t\t\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-1], \"testTag\")\n\t\t\t\t\t\tcheckoutBranch(t, repo, \"new-branch\", true)\n\t\t\t\t\t\tnewCommits := createCommits(t, repo, repoPath, 2)\n\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\texpectedOutput := 
fmt.Sprintf(\"testTag-2-g%s\", newCommits[len(newCommits)-1].String())\n\t\t\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns git generated version of unannotated tag if branch is 5 commits ahead of `master`\", func() {\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"testTag\")\n\t\t\t\t\t\tcheckoutBranch(t, repo, \"new-branch\", true)\n\t\t\t\t\t\tnewCommits := createCommits(t, repo, repoPath, 5)\n\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\texpectedOutput := fmt.Sprintf(\"testTag-5-g%s\", newCommits[len(newCommits)-1].String())\n\t\t\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t\t\t})\n\n\t\t\t\t\tit(\"returns the commit hash if only the diverged tree of `master` branch has a tag\", func() {\n\t\t\t\t\t\tcheckoutBranch(t, repo, \"new-branch\", true)\n\t\t\t\t\t\tcheckoutBranch(t, repo, \"master\", false)\n\t\t\t\t\t\tnewCommits := createCommits(t, repo, repoPath, 3)\n\t\t\t\t\t\tcreateUnannotatedTag(t, repo, newCommits[len(newCommits)-1], \"testTagAtMaster\")\n\t\t\t\t\t\tcheckoutBranch(t, repo, \"new-branch\", false)\n\n\t\t\t\t\t\theadRef, err := repo.Head()\n\t\t\t\t\t\th.AssertNil(t, err)\n\t\t\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\t\t\toutput := parseGitDescribe(repo, headRef, commitTagsMap)\n\t\t\t\t\t\texpectedOutput := commits[len(commits)-1].String()\n\t\t\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t\t\t})\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#parseGitRefs\", func() {\n\t\twhen(\"HEAD is not at a tag\", func() {\n\t\t\tit(\"returns branch name if checked out branch is `master`\", func() {\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\theadRef, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\toutput := parseGitRefs(repo, headRef, commitTagsMap)\n\t\t\t\texpectedOutput := 
[]string{\"master\"}\n\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t})\n\n\t\t\tit(\"returns branch name if checked out branch is not `master`\", func() {\n\t\t\t\tcheckoutBranch(t, repo, \"tests/05-05/test-branch\", true)\n\t\t\t\tcreateCommits(t, repo, repoPath, 1)\n\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\theadRef, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\toutput := parseGitRefs(repo, headRef, commitTagsMap)\n\t\t\t\texpectedOutput := []string{\"tests/05-05/test-branch\"}\n\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"HEAD is at a commit with single tag\", func() {\n\t\t\tit(\"returns annotated tag and branch name\", func() {\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-1], \"test-tag\")\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\theadRef, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\toutput := parseGitRefs(repo, headRef, commitTagsMap)\n\t\t\t\texpectedOutput := []string{\"master\", \"test-tag\"}\n\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t})\n\n\t\t\tit(\"returns unannotated tag and branch name\", func() {\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"test-tag\")\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\theadRef, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\toutput := parseGitRefs(repo, headRef, commitTagsMap)\n\t\t\t\texpectedOutput := []string{\"master\", \"test-tag\"}\n\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t})\n\t\t})\n\n\t\twhen(\"HEAD is at a commit with multiple tags\", func() {\n\t\t\tit(\"returns correct tag names if all tags are unannotated\", func() {\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-2], \"v0.01-testtag-lw\")\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-lw-1\")\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-lw-2\")\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\theadRef, err 
:= repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\toutput := parseGitRefs(repo, headRef, commitTagsMap)\n\t\t\t\texpectedOutput := []string{\"master\", \"v0.02-testtag-lw-1\", \"v0.02-testtag-lw-2\"}\n\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t})\n\n\t\t\tit(\"returns correct tag names if all tags are annotated\", func() {\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-2], \"v0.01-testtag\")\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag\")\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-1], \"v0.03-testtag\")\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\theadRef, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\toutput := parseGitRefs(repo, headRef, commitTagsMap)\n\t\t\t\texpectedOutput := []string{\"master\", \"v0.02-testtag\", \"v0.03-testtag\"}\n\t\t\t\tsort.Strings(output)\n\t\t\t\tsort.Strings(expectedOutput)\n\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t})\n\n\t\t\tit(\"returns correct tag names for both tag types\", func() {\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-3], \"v0.001-testtag-lw\")\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-2], \"v0.01-testtag\")\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-lw-1\")\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-lw-2\")\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-1\")\n\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\theadRef, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\toutput := parseGitRefs(repo, headRef, commitTagsMap)\n\t\t\t\texpectedOutput := []string{\"master\", \"v0.02-testtag-1\", \"v0.02-testtag-lw-1\", \"v0.02-testtag-lw-2\"}\n\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t})\n\n\t\t\tit(\"returns correct tag names for both tag types when branch is not `master`\", func() {\n\t\t\t\tcheckoutBranch(t, repo, \"test-branch\", 
true)\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-3], \"v0.001-testtag-lw\")\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-2], \"v0.01-testtag\")\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-lw-1\")\n\t\t\t\tcreateUnannotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-lw-2\")\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-1\")\n\t\t\t\tcreateAnnotatedTag(t, repo, commits[len(commits)-1], \"v0.02-testtag-2\")\n\n\t\t\t\tcommitTagsMap := generateTagsMap(repo)\n\t\t\t\theadRef, err := repo.Head()\n\t\t\t\th.AssertNil(t, err)\n\t\t\t\toutput := parseGitRefs(repo, headRef, commitTagsMap)\n\t\t\t\texpectedOutput := []string{\"test-branch\", \"v0.02-testtag-1\", \"v0.02-testtag-2\", \"v0.02-testtag-lw-1\", \"v0.02-testtag-lw-2\"}\n\t\t\t\tsort.Strings(output)\n\t\t\t\tsort.Strings(expectedOutput)\n\t\t\t\th.AssertEq(t, output, expectedOutput)\n\t\t\t})\n\t\t})\n\t})\n\n\twhen(\"#parseGitRemote\", func() {\n\t\tit(\"returns fetch url if remote `origin` exists\", func() {\n\t\t\tremoteOpts := &config.RemoteConfig{\n\t\t\t\tName: \"origin\",\n\t\t\t\tURLs: []string{\"git@github.com:testorg/testproj.git\", \"git@github.com:testorg/testproj.git\"},\n\t\t\t}\n\t\t\trepo.CreateRemote(remoteOpts)\n\n\t\t\toutput := parseGitRemote(repo)\n\t\t\th.AssertEq(t, output, \"git@github.com:testorg/testproj.git\")\n\t\t})\n\n\t\tit(\"returns empty string if no remote exists\", func() {\n\t\t\toutput := parseGitRemote(repo)\n\t\t\th.AssertEq(t, output, \"\")\n\t\t})\n\n\t\tit(\"returns fetch url if fetch and push URLs are different\", func() {\n\t\t\tremoteOpts := &config.RemoteConfig{\n\t\t\t\tName: \"origin\",\n\t\t\t\tURLs: []string{\"git@fetch.com:testorg/testproj.git\", \"git@pushing-p-github.com:testorg/testproj.git\"},\n\t\t\t}\n\t\t\trepo.CreateRemote(remoteOpts)\n\n\t\t\toutput := parseGitRemote(repo)\n\t\t\th.AssertEq(t, output, 
\"git@fetch.com:testorg/testproj.git\")\n\t\t})\n\t})\n\n\twhen(\"#getRefName\", func() {\n\t\tit(\"returns proper ref for refs with `/`\", func() {\n\t\t\toutput := getRefName(\"refs/tags/this/is/a/tag/with/slashes\")\n\t\t\th.AssertEq(t, output, \"this/is/a/tag/with/slashes\")\n\t\t})\n\t})\n}\n\nfunc createCommits(t *testing.T, repo *git.Repository, repoPath string, numberOfCommits int) []plumbing.Hash {\n\tworktree, err := repo.Worktree()\n\th.AssertNil(t, err)\n\n\tvar commitHashes []plumbing.Hash\n\tfor i := 0; i < numberOfCommits; i++ {\n\t\tfile, err := os.CreateTemp(repoPath, h.RandString(10))\n\t\th.AssertNil(t, err)\n\t\tdefer file.Close()\n\n\t\t_, err = worktree.Add(filepath.Base(file.Name()))\n\t\th.AssertNil(t, err)\n\n\t\tcommitMsg := fmt.Sprintf(\"%s %d\", \"test commit number\", i)\n\t\tcommitOpts := git.CommitOptions{\n\t\t\tAll: true,\n\t\t\tAuthor: &object.Signature{\n\t\t\t\tName:  \"Test Author\",\n\t\t\t\tEmail: \"testauthor@test.com\",\n\t\t\t\tWhen:  time.Now(),\n\t\t\t},\n\t\t\tCommitter: &object.Signature{\n\t\t\t\tName:  \"Test Committer\",\n\t\t\t\tEmail: \"testcommitter@test.com\",\n\t\t\t\tWhen:  time.Now(),\n\t\t\t},\n\t\t}\n\t\tcommitHash, err := worktree.Commit(commitMsg, &commitOpts)\n\t\th.AssertNil(t, err)\n\t\tcommitHashes = append(commitHashes, commitHash)\n\t}\n\treturn commitHashes\n}\n\nfunc createUnannotatedTag(t *testing.T, repo *git.Repository, commitHash plumbing.Hash, tagName string) {\n\tif tagName == \"\" {\n\t\tversion := rand.Float32()*10 + float32(rand.Intn(20))\n\t\ttagName = fmt.Sprintf(\"v%f-lw\", version)\n\t}\n\t_, err := repo.CreateTag(tagName, commitHash, nil)\n\th.AssertNil(t, err)\n}\n\nfunc createAnnotatedTag(t *testing.T, repo *git.Repository, commitHash plumbing.Hash, tagName string) {\n\tif tagName == \"\" {\n\t\tversion := rand.Float32()*10 + float32(rand.Intn(20))\n\t\ttagName = fmt.Sprintf(\"v%f-%s\", version, h.RandString(5))\n\t}\n\ttagMessage := fmt.Sprintf(\"This is an annotated tag for version 
- %s\", tagName)\n\ttagOpts := &git.CreateTagOptions{\n\t\tMessage: tagMessage,\n\t\tTagger: &object.Signature{\n\t\t\tName:  \"Test Tagger\",\n\t\t\tEmail: \"testtagger@test.com\",\n\t\t\tWhen:  time.Now().Add(time.Hour*time.Duration(rand.Intn(100)) + time.Minute*time.Duration(rand.Intn(100))),\n\t\t},\n\t}\n\t_, err := repo.CreateTag(tagName, commitHash, tagOpts)\n\th.AssertNil(t, err)\n}\n\nfunc checkoutBranch(t *testing.T, repo *git.Repository, branchName string, newBranch bool) {\n\tworktree, err := repo.Worktree()\n\th.AssertNil(t, err)\n\n\tvar fullBranchName string\n\tif branchName == \"\" {\n\t\tfullBranchName = \"refs/heads/\" + h.RandString(10)\n\t} else {\n\t\tfullBranchName = \"refs/heads/\" + branchName\n\t}\n\n\tcheckoutOpts := &git.CheckoutOptions{\n\t\tBranch: plumbing.ReferenceName(fullBranchName),\n\t\tCreate: newBranch,\n\t}\n\terr = worktree.Checkout(checkoutOpts)\n\th.AssertNil(t, err)\n}\n"
  },
  {
    "path": "pkg/project/v02/project.go",
    "content": "package v02\n\nimport (\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/buildpacks/pack/pkg/project/types\"\n)\n\ntype Buildpacks struct {\n\tInclude []string      `toml:\"include\"`\n\tExclude []string      `toml:\"exclude\"`\n\tGroup   []Buildpack   `toml:\"group\"`\n\tEnv     Env           `toml:\"env\"`\n\tBuild   Build         `toml:\"build\"`\n\tBuilder string        `toml:\"builder\"`\n\tPre     GroupAddition `toml:\"pre\"`\n\tPost    GroupAddition `toml:\"post\"`\n}\n\ntype Build struct {\n\tEnv []EnvVar `toml:\"env\"`\n}\n\n// Deprecated: use `[[io.buildpacks.build.env]]` instead. see https://github.com/buildpacks/pack/pull/1479\ntype Env struct {\n\tBuild []EnvVar `toml:\"build\"`\n}\n\ntype Project struct {\n\tSchemaVersion    string                 `toml:\"schema-version\"`\n\tID               string                 `toml:\"id\"`\n\tName             string                 `toml:\"name\"`\n\tVersion          string                 `toml:\"version\"`\n\tAuthors          []string               `toml:\"authors\"`\n\tLicenses         []types.License        `toml:\"licenses\"`\n\tDocumentationURL string                 `toml:\"documentation-url\"`\n\tSourceURL        string                 `toml:\"source-url\"`\n\tMetadata         map[string]interface{} `toml:\"metadata\"`\n}\n\ntype IO struct {\n\tBuildpacks Buildpacks `toml:\"buildpacks\"`\n}\n\ntype Descriptor struct {\n\tProject Project `toml:\"_\"`\n\tIO      IO      `toml:\"io\"`\n}\n\ntype Buildpack struct {\n\tID      string       `toml:\"id\"`\n\tVersion string       `toml:\"version\"`\n\tURI     string       `toml:\"uri\"`\n\tScript  types.Script `toml:\"script\"`\n}\n\ntype EnvVar struct {\n\tName  string `toml:\"name\"`\n\tValue string `toml:\"value\"`\n}\n\ntype GroupAddition struct {\n\tBuildpacks []Buildpack `toml:\"group\"`\n}\n\nfunc NewDescriptor(projectTomlContents string) (types.Descriptor, toml.MetaData, error) 
{\n\tversionedDescriptor := &Descriptor{}\n\ttomlMetaData, err := toml.Decode(projectTomlContents, &versionedDescriptor)\n\tif err != nil {\n\t\treturn types.Descriptor{}, tomlMetaData, err\n\t}\n\n\t// backward compatibility for incorrect key\n\tenv := versionedDescriptor.IO.Buildpacks.Build.Env\n\tif env == nil {\n\t\tenv = versionedDescriptor.IO.Buildpacks.Env.Build\n\t}\n\n\treturn types.Descriptor{\n\t\tProject: types.Project{\n\t\t\tName:     versionedDescriptor.Project.Name,\n\t\t\tLicenses: versionedDescriptor.Project.Licenses,\n\t\t},\n\t\tBuild: types.Build{\n\t\t\tInclude:    versionedDescriptor.IO.Buildpacks.Include,\n\t\t\tExclude:    versionedDescriptor.IO.Buildpacks.Exclude,\n\t\t\tBuildpacks: mapToBuildPacksDescriptor(versionedDescriptor.IO.Buildpacks.Group),\n\t\t\tEnv:        mapToEnvVarsDescriptor(env),\n\t\t\tBuilder:    versionedDescriptor.IO.Buildpacks.Builder,\n\t\t\tPre: types.GroupAddition{\n\t\t\t\tBuildpacks: mapToBuildPacksDescriptor(versionedDescriptor.IO.Buildpacks.Pre.Buildpacks),\n\t\t\t},\n\t\t\tPost: types.GroupAddition{\n\t\t\t\tBuildpacks: mapToBuildPacksDescriptor(versionedDescriptor.IO.Buildpacks.Post.Buildpacks),\n\t\t\t},\n\t\t},\n\t\tMetadata:      versionedDescriptor.Project.Metadata,\n\t\tSchemaVersion: api.MustParse(\"0.2\"),\n\t}, tomlMetaData, nil\n}\n\nfunc mapToBuildPacksDescriptor(v2BuildPacks []Buildpack) []types.Buildpack {\n\tvar buildPacks []types.Buildpack\n\tfor _, v2BuildPack := range v2BuildPacks {\n\t\tbuildPacks = append(buildPacks, mapToBuildPackDescriptor(v2BuildPack))\n\t}\n\treturn buildPacks\n}\n\nfunc mapToBuildPackDescriptor(v2BuildPack Buildpack) types.Buildpack {\n\treturn types.Buildpack{\n\t\tID:      v2BuildPack.ID,\n\t\tVersion: v2BuildPack.Version,\n\t\tURI:     v2BuildPack.URI,\n\t\tScript:  v2BuildPack.Script,\n\t\tExecEnv: []string{}, // schema v2 doesn't handle execution environment variables\n\t}\n}\n\nfunc mapToEnvVarsDescriptor(v2EnvVars []EnvVar) []types.EnvVar {\n\tvar envVars []types.EnvVar\n\tfor _, v2EnvVar := range v2EnvVars {\n\t\tenvVars = append(envVars, mapToEnvVarDescriptor(v2EnvVar))\n\t}\n\treturn envVars\n}\n\nfunc mapToEnvVarDescriptor(v2EnvVar EnvVar) types.EnvVar {\n\treturn types.EnvVar{\n\t\tName:    v2EnvVar.Name,\n\t\tValue:   v2EnvVar.Value,\n\t\tExecEnv: []string{}, // schema v2 doesn't handle execution environment variables\n\t}\n}\n"
  },
  {
    "path": "pkg/project/v03/project.go",
    "content": "package v03\n\nimport (\n\t\"github.com/BurntSushi/toml\"\n\t\"github.com/buildpacks/lifecycle/api\"\n\n\t\"github.com/buildpacks/pack/pkg/project/types\"\n)\n\ntype Buildpacks struct {\n\tInclude []string            `toml:\"include\"`\n\tExclude []string            `toml:\"exclude\"`\n\tGroup   []types.Buildpack   `toml:\"group\"`\n\tBuild   types.Build         `toml:\"build\"`\n\tBuilder string              `toml:\"builder\"`\n\tPre     types.GroupAddition `toml:\"pre\"`\n\tPost    types.GroupAddition `toml:\"post\"`\n}\n\ntype Project struct {\n\tSchemaVersion    string                 `toml:\"schema-version\"`\n\tID               string                 `toml:\"id\"`\n\tName             string                 `toml:\"name\"`\n\tVersion          string                 `toml:\"version\"`\n\tAuthors          []string               `toml:\"authors\"`\n\tLicenses         []types.License        `toml:\"licenses\"`\n\tDocumentationURL string                 `toml:\"documentation-url\"`\n\tSourceURL        string                 `toml:\"source-url\"`\n\tMetadata         map[string]interface{} `toml:\"metadata\"`\n}\n\ntype IO struct {\n\tBuildpacks Buildpacks `toml:\"buildpacks\"`\n}\n\ntype Descriptor struct {\n\tProject Project `toml:\"_\"`\n\tIO      IO      `toml:\"io\"`\n}\n\nfunc NewDescriptor(projectTomlContents string) (types.Descriptor, toml.MetaData, error) {\n\tversionedDescriptor := &Descriptor{}\n\ttomlMetaData, err := toml.Decode(projectTomlContents, &versionedDescriptor)\n\tif err != nil {\n\t\treturn types.Descriptor{}, tomlMetaData, err\n\t}\n\n\treturn types.Descriptor{\n\t\tProject: types.Project{\n\t\t\tName:     versionedDescriptor.Project.Name,\n\t\t\tLicenses: versionedDescriptor.Project.Licenses,\n\t\t},\n\t\tBuild: types.Build{\n\t\t\tInclude:    versionedDescriptor.IO.Buildpacks.Include,\n\t\t\tExclude:    versionedDescriptor.IO.Buildpacks.Exclude,\n\t\t\tBuildpacks: versionedDescriptor.IO.Buildpacks.Group,\n\t\t\tEnv:        
versionedDescriptor.IO.Buildpacks.Build.Env,\n\t\t\tBuilder:    versionedDescriptor.IO.Buildpacks.Builder,\n\t\t\tPre:        versionedDescriptor.IO.Buildpacks.Pre,\n\t\t\tPost:       versionedDescriptor.IO.Buildpacks.Post,\n\t\t},\n\t\tMetadata:      versionedDescriptor.Project.Metadata,\n\t\tSchemaVersion: api.MustParse(\"0.3\"),\n\t}, tomlMetaData, nil\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_access_checker.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/pkg/client (interfaces: AccessChecker)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n)\n\n// MockAccessChecker is a mock of AccessChecker interface.\ntype MockAccessChecker struct {\n\tctrl     *gomock.Controller\n\trecorder *MockAccessCheckerMockRecorder\n}\n\n// MockAccessCheckerMockRecorder is the mock recorder for MockAccessChecker.\ntype MockAccessCheckerMockRecorder struct {\n\tmock *MockAccessChecker\n}\n\n// NewMockAccessChecker creates a new mock instance.\nfunc NewMockAccessChecker(ctrl *gomock.Controller) *MockAccessChecker {\n\tmock := &MockAccessChecker{ctrl: ctrl}\n\tmock.recorder = &MockAccessCheckerMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockAccessChecker) EXPECT() *MockAccessCheckerMockRecorder {\n\treturn m.recorder\n}\n\n// Check mocks base method.\nfunc (m *MockAccessChecker) Check(arg0 string) bool {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Check\", arg0)\n\tret0, _ := ret[0].(bool)\n\treturn ret0\n}\n\n// Check indicates an expected call of Check.\nfunc (mr *MockAccessCheckerMockRecorder) Check(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Check\", reflect.TypeOf((*MockAccessChecker)(nil).Check), arg0)\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_blob_downloader.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/pkg/client (interfaces: BlobDownloader)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\tcontext \"context\"\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n\n\tblob \"github.com/buildpacks/pack/pkg/blob\"\n)\n\n// MockBlobDownloader is a mock of BlobDownloader interface.\ntype MockBlobDownloader struct {\n\tctrl     *gomock.Controller\n\trecorder *MockBlobDownloaderMockRecorder\n}\n\n// MockBlobDownloaderMockRecorder is the mock recorder for MockBlobDownloader.\ntype MockBlobDownloaderMockRecorder struct {\n\tmock *MockBlobDownloader\n}\n\n// NewMockBlobDownloader creates a new mock instance.\nfunc NewMockBlobDownloader(ctrl *gomock.Controller) *MockBlobDownloader {\n\tmock := &MockBlobDownloader{ctrl: ctrl}\n\tmock.recorder = &MockBlobDownloaderMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockBlobDownloader) EXPECT() *MockBlobDownloaderMockRecorder {\n\treturn m.recorder\n}\n\n// Download mocks base method.\nfunc (m *MockBlobDownloader) Download(arg0 context.Context, arg1 string) (blob.Blob, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Download\", arg0, arg1)\n\tret0, _ := ret[0].(blob.Blob)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// Download indicates an expected call of Download.\nfunc (mr *MockBlobDownloaderMockRecorder) Download(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Download\", reflect.TypeOf((*MockBlobDownloader)(nil).Download), arg0, arg1)\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_build_module.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/pkg/buildpack (interfaces: BuildModule)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\tio \"io\"\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n\n\tbuildpack \"github.com/buildpacks/pack/pkg/buildpack\"\n)\n\n// MockBuildModule is a mock of BuildModule interface.\ntype MockBuildModule struct {\n\tctrl     *gomock.Controller\n\trecorder *MockBuildModuleMockRecorder\n}\n\n// MockBuildModuleMockRecorder is the mock recorder for MockBuildModule.\ntype MockBuildModuleMockRecorder struct {\n\tmock *MockBuildModule\n}\n\n// NewMockBuildModule creates a new mock instance.\nfunc NewMockBuildModule(ctrl *gomock.Controller) *MockBuildModule {\n\tmock := &MockBuildModule{ctrl: ctrl}\n\tmock.recorder = &MockBuildModuleMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockBuildModule) EXPECT() *MockBuildModuleMockRecorder {\n\treturn m.recorder\n}\n\n// Descriptor mocks base method.\nfunc (m *MockBuildModule) Descriptor() buildpack.Descriptor {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Descriptor\")\n\tret0, _ := ret[0].(buildpack.Descriptor)\n\treturn ret0\n}\n\n// Descriptor indicates an expected call of Descriptor.\nfunc (mr *MockBuildModuleMockRecorder) Descriptor() *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Descriptor\", reflect.TypeOf((*MockBuildModule)(nil).Descriptor))\n}\n\n// Open mocks base method.\nfunc (m *MockBuildModule) Open() (io.ReadCloser, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Open\")\n\tret0, _ := ret[0].(io.ReadCloser)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// Open indicates an expected call of Open.\nfunc (mr *MockBuildModuleMockRecorder) Open() *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn 
mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Open\", reflect.TypeOf((*MockBuildModule)(nil).Open))\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_buildpack_downloader.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/pkg/client (interfaces: BuildpackDownloader)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\tcontext \"context\"\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n\n\tbuildpack \"github.com/buildpacks/pack/pkg/buildpack\"\n)\n\n// MockBuildpackDownloader is a mock of BuildpackDownloader interface.\ntype MockBuildpackDownloader struct {\n\tctrl     *gomock.Controller\n\trecorder *MockBuildpackDownloaderMockRecorder\n}\n\n// MockBuildpackDownloaderMockRecorder is the mock recorder for MockBuildpackDownloader.\ntype MockBuildpackDownloaderMockRecorder struct {\n\tmock *MockBuildpackDownloader\n}\n\n// NewMockBuildpackDownloader creates a new mock instance.\nfunc NewMockBuildpackDownloader(ctrl *gomock.Controller) *MockBuildpackDownloader {\n\tmock := &MockBuildpackDownloader{ctrl: ctrl}\n\tmock.recorder = &MockBuildpackDownloaderMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockBuildpackDownloader) EXPECT() *MockBuildpackDownloaderMockRecorder {\n\treturn m.recorder\n}\n\n// Download mocks base method.\nfunc (m *MockBuildpackDownloader) Download(arg0 context.Context, arg1 string, arg2 buildpack.DownloadOptions) (buildpack.BuildModule, []buildpack.BuildModule, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Download\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(buildpack.BuildModule)\n\tret1, _ := ret[1].([]buildpack.BuildModule)\n\tret2, _ := ret[2].(error)\n\treturn ret0, ret1, ret2\n}\n\n// Download indicates an expected call of Download.\nfunc (mr *MockBuildpackDownloaderMockRecorder) Download(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Download\", reflect.TypeOf((*MockBuildpackDownloader)(nil).Download), arg0, arg1, arg2)\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_docker_client.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/moby/moby/client (interfaces: APIClient)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\tcontext \"context\"\n\tio \"io\"\n\tnet \"net\"\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n\tclient \"github.com/moby/moby/client\"\n)\n\n// MockAPIClient is a mock of APIClient interface.\ntype MockAPIClient struct {\n\tctrl     *gomock.Controller\n\trecorder *MockAPIClientMockRecorder\n}\n\n// MockAPIClientMockRecorder is the mock recorder for MockAPIClient.\ntype MockAPIClientMockRecorder struct {\n\tmock *MockAPIClient\n}\n\n// NewMockAPIClient creates a new mock instance.\nfunc NewMockAPIClient(ctrl *gomock.Controller) *MockAPIClient {\n\tmock := &MockAPIClient{ctrl: ctrl}\n\tmock.recorder = &MockAPIClientMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockAPIClient) EXPECT() *MockAPIClientMockRecorder {\n\treturn m.recorder\n}\n\n// BuildCachePrune mocks base method.\nfunc (m *MockAPIClient) BuildCachePrune(arg0 context.Context, arg1 client.BuildCachePruneOptions) (client.BuildCachePruneResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"BuildCachePrune\", arg0, arg1)\n\tret0, _ := ret[0].(client.BuildCachePruneResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// BuildCachePrune indicates an expected call of BuildCachePrune.\nfunc (mr *MockAPIClientMockRecorder) BuildCachePrune(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"BuildCachePrune\", reflect.TypeOf((*MockAPIClient)(nil).BuildCachePrune), arg0, arg1)\n}\n\n// BuildCancel mocks base method.\nfunc (m *MockAPIClient) BuildCancel(arg0 context.Context, arg1 string, arg2 client.BuildCancelOptions) (client.BuildCancelResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, 
\"BuildCancel\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.BuildCancelResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// BuildCancel indicates an expected call of BuildCancel.\nfunc (mr *MockAPIClientMockRecorder) BuildCancel(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"BuildCancel\", reflect.TypeOf((*MockAPIClient)(nil).BuildCancel), arg0, arg1, arg2)\n}\n\n// CheckpointCreate mocks base method.\nfunc (m *MockAPIClient) CheckpointCreate(arg0 context.Context, arg1 string, arg2 client.CheckpointCreateOptions) (client.CheckpointCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"CheckpointCreate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.CheckpointCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// CheckpointCreate indicates an expected call of CheckpointCreate.\nfunc (mr *MockAPIClientMockRecorder) CheckpointCreate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CheckpointCreate\", reflect.TypeOf((*MockAPIClient)(nil).CheckpointCreate), arg0, arg1, arg2)\n}\n\n// CheckpointList mocks base method.\nfunc (m *MockAPIClient) CheckpointList(arg0 context.Context, arg1 string, arg2 client.CheckpointListOptions) (client.CheckpointListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"CheckpointList\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.CheckpointListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// CheckpointList indicates an expected call of CheckpointList.\nfunc (mr *MockAPIClientMockRecorder) CheckpointList(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CheckpointList\", reflect.TypeOf((*MockAPIClient)(nil).CheckpointList), arg0, arg1, arg2)\n}\n\n// CheckpointRemove mocks base method.\nfunc (m *MockAPIClient) 
CheckpointRemove(arg0 context.Context, arg1 string, arg2 client.CheckpointRemoveOptions) (client.CheckpointRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"CheckpointRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.CheckpointRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// CheckpointRemove indicates an expected call of CheckpointRemove.\nfunc (mr *MockAPIClientMockRecorder) CheckpointRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CheckpointRemove\", reflect.TypeOf((*MockAPIClient)(nil).CheckpointRemove), arg0, arg1, arg2)\n}\n\n// ClientVersion mocks base method.\nfunc (m *MockAPIClient) ClientVersion() string {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ClientVersion\")\n\tret0, _ := ret[0].(string)\n\treturn ret0\n}\n\n// ClientVersion indicates an expected call of ClientVersion.\nfunc (mr *MockAPIClientMockRecorder) ClientVersion() *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ClientVersion\", reflect.TypeOf((*MockAPIClient)(nil).ClientVersion))\n}\n\n// Close mocks base method.\nfunc (m *MockAPIClient) Close() error {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Close\")\n\tret0, _ := ret[0].(error)\n\treturn ret0\n}\n\n// Close indicates an expected call of Close.\nfunc (mr *MockAPIClientMockRecorder) Close() *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Close\", reflect.TypeOf((*MockAPIClient)(nil).Close))\n}\n\n// ConfigCreate mocks base method.\nfunc (m *MockAPIClient) ConfigCreate(arg0 context.Context, arg1 client.ConfigCreateOptions) (client.ConfigCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ConfigCreate\", arg0, arg1)\n\tret0, _ := ret[0].(client.ConfigCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ConfigCreate indicates an expected call of 
ConfigCreate.\nfunc (mr *MockAPIClientMockRecorder) ConfigCreate(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ConfigCreate\", reflect.TypeOf((*MockAPIClient)(nil).ConfigCreate), arg0, arg1)\n}\n\n// ConfigInspect mocks base method.\nfunc (m *MockAPIClient) ConfigInspect(arg0 context.Context, arg1 string, arg2 client.ConfigInspectOptions) (client.ConfigInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ConfigInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ConfigInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ConfigInspect indicates an expected call of ConfigInspect.\nfunc (mr *MockAPIClientMockRecorder) ConfigInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ConfigInspect\", reflect.TypeOf((*MockAPIClient)(nil).ConfigInspect), arg0, arg1, arg2)\n}\n\n// ConfigList mocks base method.\nfunc (m *MockAPIClient) ConfigList(arg0 context.Context, arg1 client.ConfigListOptions) (client.ConfigListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ConfigList\", arg0, arg1)\n\tret0, _ := ret[0].(client.ConfigListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ConfigList indicates an expected call of ConfigList.\nfunc (mr *MockAPIClientMockRecorder) ConfigList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ConfigList\", reflect.TypeOf((*MockAPIClient)(nil).ConfigList), arg0, arg1)\n}\n\n// ConfigRemove mocks base method.\nfunc (m *MockAPIClient) ConfigRemove(arg0 context.Context, arg1 string, arg2 client.ConfigRemoveOptions) (client.ConfigRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ConfigRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ConfigRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// 
ConfigRemove indicates an expected call of ConfigRemove.\nfunc (mr *MockAPIClientMockRecorder) ConfigRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ConfigRemove\", reflect.TypeOf((*MockAPIClient)(nil).ConfigRemove), arg0, arg1, arg2)\n}\n\n// ConfigUpdate mocks base method.\nfunc (m *MockAPIClient) ConfigUpdate(arg0 context.Context, arg1 string, arg2 client.ConfigUpdateOptions) (client.ConfigUpdateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ConfigUpdate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ConfigUpdateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ConfigUpdate indicates an expected call of ConfigUpdate.\nfunc (mr *MockAPIClientMockRecorder) ConfigUpdate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ConfigUpdate\", reflect.TypeOf((*MockAPIClient)(nil).ConfigUpdate), arg0, arg1, arg2)\n}\n\n// ContainerAttach mocks base method.\nfunc (m *MockAPIClient) ContainerAttach(arg0 context.Context, arg1 string, arg2 client.ContainerAttachOptions) (client.ContainerAttachResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerAttach\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerAttachResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerAttach indicates an expected call of ContainerAttach.\nfunc (mr *MockAPIClientMockRecorder) ContainerAttach(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerAttach\", reflect.TypeOf((*MockAPIClient)(nil).ContainerAttach), arg0, arg1, arg2)\n}\n\n// ContainerCommit mocks base method.\nfunc (m *MockAPIClient) ContainerCommit(arg0 context.Context, arg1 string, arg2 client.ContainerCommitOptions) (client.ContainerCommitResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, 
\"ContainerCommit\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerCommitResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerCommit indicates an expected call of ContainerCommit.\nfunc (mr *MockAPIClientMockRecorder) ContainerCommit(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerCommit\", reflect.TypeOf((*MockAPIClient)(nil).ContainerCommit), arg0, arg1, arg2)\n}\n\n// ContainerCreate mocks base method.\nfunc (m *MockAPIClient) ContainerCreate(arg0 context.Context, arg1 client.ContainerCreateOptions) (client.ContainerCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerCreate\", arg0, arg1)\n\tret0, _ := ret[0].(client.ContainerCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerCreate indicates an expected call of ContainerCreate.\nfunc (mr *MockAPIClientMockRecorder) ContainerCreate(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerCreate\", reflect.TypeOf((*MockAPIClient)(nil).ContainerCreate), arg0, arg1)\n}\n\n// ContainerDiff mocks base method.\nfunc (m *MockAPIClient) ContainerDiff(arg0 context.Context, arg1 string, arg2 client.ContainerDiffOptions) (client.ContainerDiffResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerDiff\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerDiffResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerDiff indicates an expected call of ContainerDiff.\nfunc (mr *MockAPIClientMockRecorder) ContainerDiff(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerDiff\", reflect.TypeOf((*MockAPIClient)(nil).ContainerDiff), arg0, arg1, arg2)\n}\n\n// ContainerExport mocks base method.\nfunc (m *MockAPIClient) ContainerExport(arg0 
context.Context, arg1 string, arg2 client.ContainerExportOptions) (client.ContainerExportResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerExport\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerExportResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerExport indicates an expected call of ContainerExport.\nfunc (mr *MockAPIClientMockRecorder) ContainerExport(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerExport\", reflect.TypeOf((*MockAPIClient)(nil).ContainerExport), arg0, arg1, arg2)\n}\n\n// ContainerInspect mocks base method.\nfunc (m *MockAPIClient) ContainerInspect(arg0 context.Context, arg1 string, arg2 client.ContainerInspectOptions) (client.ContainerInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerInspect indicates an expected call of ContainerInspect.\nfunc (mr *MockAPIClientMockRecorder) ContainerInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerInspect\", reflect.TypeOf((*MockAPIClient)(nil).ContainerInspect), arg0, arg1, arg2)\n}\n\n// ContainerKill mocks base method.\nfunc (m *MockAPIClient) ContainerKill(arg0 context.Context, arg1 string, arg2 client.ContainerKillOptions) (client.ContainerKillResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerKill\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerKillResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerKill indicates an expected call of ContainerKill.\nfunc (mr *MockAPIClientMockRecorder) ContainerKill(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn 
mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerKill\", reflect.TypeOf((*MockAPIClient)(nil).ContainerKill), arg0, arg1, arg2)\n}\n\n// ContainerList mocks base method.\nfunc (m *MockAPIClient) ContainerList(arg0 context.Context, arg1 client.ContainerListOptions) (client.ContainerListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerList\", arg0, arg1)\n\tret0, _ := ret[0].(client.ContainerListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerList indicates an expected call of ContainerList.\nfunc (mr *MockAPIClientMockRecorder) ContainerList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerList\", reflect.TypeOf((*MockAPIClient)(nil).ContainerList), arg0, arg1)\n}\n\n// ContainerLogs mocks base method.\nfunc (m *MockAPIClient) ContainerLogs(arg0 context.Context, arg1 string, arg2 client.ContainerLogsOptions) (client.ContainerLogsResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerLogs\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerLogsResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerLogs indicates an expected call of ContainerLogs.\nfunc (mr *MockAPIClientMockRecorder) ContainerLogs(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerLogs\", reflect.TypeOf((*MockAPIClient)(nil).ContainerLogs), arg0, arg1, arg2)\n}\n\n// ContainerPause mocks base method.\nfunc (m *MockAPIClient) ContainerPause(arg0 context.Context, arg1 string, arg2 client.ContainerPauseOptions) (client.ContainerPauseResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerPause\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerPauseResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerPause indicates an expected call of ContainerPause.\nfunc (mr 
*MockAPIClientMockRecorder) ContainerPause(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerPause\", reflect.TypeOf((*MockAPIClient)(nil).ContainerPause), arg0, arg1, arg2)\n}\n\n// ContainerPrune mocks base method.\nfunc (m *MockAPIClient) ContainerPrune(arg0 context.Context, arg1 client.ContainerPruneOptions) (client.ContainerPruneResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerPrune\", arg0, arg1)\n\tret0, _ := ret[0].(client.ContainerPruneResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerPrune indicates an expected call of ContainerPrune.\nfunc (mr *MockAPIClientMockRecorder) ContainerPrune(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerPrune\", reflect.TypeOf((*MockAPIClient)(nil).ContainerPrune), arg0, arg1)\n}\n\n// ContainerRemove mocks base method.\nfunc (m *MockAPIClient) ContainerRemove(arg0 context.Context, arg1 string, arg2 client.ContainerRemoveOptions) (client.ContainerRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerRemove indicates an expected call of ContainerRemove.\nfunc (mr *MockAPIClientMockRecorder) ContainerRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerRemove\", reflect.TypeOf((*MockAPIClient)(nil).ContainerRemove), arg0, arg1, arg2)\n}\n\n// ContainerRename mocks base method.\nfunc (m *MockAPIClient) ContainerRename(arg0 context.Context, arg1 string, arg2 client.ContainerRenameOptions) (client.ContainerRenameResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerRename\", arg0, arg1, arg2)\n\tret0, _ := 
ret[0].(client.ContainerRenameResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerRename indicates an expected call of ContainerRename.\nfunc (mr *MockAPIClientMockRecorder) ContainerRename(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerRename\", reflect.TypeOf((*MockAPIClient)(nil).ContainerRename), arg0, arg1, arg2)\n}\n\n// ContainerResize mocks base method.\nfunc (m *MockAPIClient) ContainerResize(arg0 context.Context, arg1 string, arg2 client.ContainerResizeOptions) (client.ContainerResizeResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerResize\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerResizeResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerResize indicates an expected call of ContainerResize.\nfunc (mr *MockAPIClientMockRecorder) ContainerResize(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerResize\", reflect.TypeOf((*MockAPIClient)(nil).ContainerResize), arg0, arg1, arg2)\n}\n\n// ContainerRestart mocks base method.\nfunc (m *MockAPIClient) ContainerRestart(arg0 context.Context, arg1 string, arg2 client.ContainerRestartOptions) (client.ContainerRestartResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerRestart\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerRestartResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerRestart indicates an expected call of ContainerRestart.\nfunc (mr *MockAPIClientMockRecorder) ContainerRestart(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerRestart\", reflect.TypeOf((*MockAPIClient)(nil).ContainerRestart), arg0, arg1, arg2)\n}\n\n// ContainerStart mocks base method.\nfunc (m *MockAPIClient) ContainerStart(arg0 
context.Context, arg1 string, arg2 client.ContainerStartOptions) (client.ContainerStartResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerStart\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerStartResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerStart indicates an expected call of ContainerStart.\nfunc (mr *MockAPIClientMockRecorder) ContainerStart(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerStart\", reflect.TypeOf((*MockAPIClient)(nil).ContainerStart), arg0, arg1, arg2)\n}\n\n// ContainerStatPath mocks base method.\nfunc (m *MockAPIClient) ContainerStatPath(arg0 context.Context, arg1 string, arg2 client.ContainerStatPathOptions) (client.ContainerStatPathResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerStatPath\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerStatPathResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerStatPath indicates an expected call of ContainerStatPath.\nfunc (mr *MockAPIClientMockRecorder) ContainerStatPath(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerStatPath\", reflect.TypeOf((*MockAPIClient)(nil).ContainerStatPath), arg0, arg1, arg2)\n}\n\n// ContainerStats mocks base method.\nfunc (m *MockAPIClient) ContainerStats(arg0 context.Context, arg1 string, arg2 client.ContainerStatsOptions) (client.ContainerStatsResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerStats\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerStatsResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerStats indicates an expected call of ContainerStats.\nfunc (mr *MockAPIClientMockRecorder) ContainerStats(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn 
mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerStats\", reflect.TypeOf((*MockAPIClient)(nil).ContainerStats), arg0, arg1, arg2)\n}\n\n// ContainerStop mocks base method.\nfunc (m *MockAPIClient) ContainerStop(arg0 context.Context, arg1 string, arg2 client.ContainerStopOptions) (client.ContainerStopResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerStop\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerStopResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerStop indicates an expected call of ContainerStop.\nfunc (mr *MockAPIClientMockRecorder) ContainerStop(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerStop\", reflect.TypeOf((*MockAPIClient)(nil).ContainerStop), arg0, arg1, arg2)\n}\n\n// ContainerTop mocks base method.\nfunc (m *MockAPIClient) ContainerTop(arg0 context.Context, arg1 string, arg2 client.ContainerTopOptions) (client.ContainerTopResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerTop\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerTopResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerTop indicates an expected call of ContainerTop.\nfunc (mr *MockAPIClientMockRecorder) ContainerTop(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerTop\", reflect.TypeOf((*MockAPIClient)(nil).ContainerTop), arg0, arg1, arg2)\n}\n\n// ContainerUnpause mocks base method.\nfunc (m *MockAPIClient) ContainerUnpause(arg0 context.Context, arg1 string, arg2 client.ContainerUnpauseOptions) (client.ContainerUnpauseResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerUnpause\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerUnpauseResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerUnpause indicates an expected call of 
ContainerUnpause.\nfunc (mr *MockAPIClientMockRecorder) ContainerUnpause(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerUnpause\", reflect.TypeOf((*MockAPIClient)(nil).ContainerUnpause), arg0, arg1, arg2)\n}\n\n// ContainerUpdate mocks base method.\nfunc (m *MockAPIClient) ContainerUpdate(arg0 context.Context, arg1 string, arg2 client.ContainerUpdateOptions) (client.ContainerUpdateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerUpdate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerUpdateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ContainerUpdate indicates an expected call of ContainerUpdate.\nfunc (mr *MockAPIClientMockRecorder) ContainerUpdate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerUpdate\", reflect.TypeOf((*MockAPIClient)(nil).ContainerUpdate), arg0, arg1, arg2)\n}\n\n// ContainerWait mocks base method.\nfunc (m *MockAPIClient) ContainerWait(arg0 context.Context, arg1 string, arg2 client.ContainerWaitOptions) client.ContainerWaitResult {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ContainerWait\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ContainerWaitResult)\n\treturn ret0\n}\n\n// ContainerWait indicates an expected call of ContainerWait.\nfunc (mr *MockAPIClientMockRecorder) ContainerWait(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ContainerWait\", reflect.TypeOf((*MockAPIClient)(nil).ContainerWait), arg0, arg1, arg2)\n}\n\n// CopyFromContainer mocks base method.\nfunc (m *MockAPIClient) CopyFromContainer(arg0 context.Context, arg1 string, arg2 client.CopyFromContainerOptions) (client.CopyFromContainerResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"CopyFromContainer\", arg0, arg1, arg2)\n\tret0, _ 
:= ret[0].(client.CopyFromContainerResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// CopyFromContainer indicates an expected call of CopyFromContainer.\nfunc (mr *MockAPIClientMockRecorder) CopyFromContainer(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CopyFromContainer\", reflect.TypeOf((*MockAPIClient)(nil).CopyFromContainer), arg0, arg1, arg2)\n}\n\n// CopyToContainer mocks base method.\nfunc (m *MockAPIClient) CopyToContainer(arg0 context.Context, arg1 string, arg2 client.CopyToContainerOptions) (client.CopyToContainerResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"CopyToContainer\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.CopyToContainerResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// CopyToContainer indicates an expected call of CopyToContainer.\nfunc (mr *MockAPIClientMockRecorder) CopyToContainer(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CopyToContainer\", reflect.TypeOf((*MockAPIClient)(nil).CopyToContainer), arg0, arg1, arg2)\n}\n\n// DaemonHost mocks base method.\nfunc (m *MockAPIClient) DaemonHost() string {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"DaemonHost\")\n\tret0, _ := ret[0].(string)\n\treturn ret0\n}\n\n// DaemonHost indicates an expected call of DaemonHost.\nfunc (mr *MockAPIClientMockRecorder) DaemonHost() *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"DaemonHost\", reflect.TypeOf((*MockAPIClient)(nil).DaemonHost))\n}\n\n// DialHijack mocks base method.\nfunc (m *MockAPIClient) DialHijack(arg0 context.Context, arg1, arg2 string, arg3 map[string][]string) (net.Conn, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"DialHijack\", arg0, arg1, arg2, arg3)\n\tret0, _ := ret[0].(net.Conn)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// 
DialHijack indicates an expected call of DialHijack.\nfunc (mr *MockAPIClientMockRecorder) DialHijack(arg0, arg1, arg2, arg3 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"DialHijack\", reflect.TypeOf((*MockAPIClient)(nil).DialHijack), arg0, arg1, arg2, arg3)\n}\n\n// Dialer mocks base method.\nfunc (m *MockAPIClient) Dialer() func(context.Context) (net.Conn, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Dialer\")\n\tret0, _ := ret[0].(func(context.Context) (net.Conn, error))\n\treturn ret0\n}\n\n// Dialer indicates an expected call of Dialer.\nfunc (mr *MockAPIClientMockRecorder) Dialer() *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Dialer\", reflect.TypeOf((*MockAPIClient)(nil).Dialer))\n}\n\n// DiskUsage mocks base method.\nfunc (m *MockAPIClient) DiskUsage(arg0 context.Context, arg1 client.DiskUsageOptions) (client.DiskUsageResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"DiskUsage\", arg0, arg1)\n\tret0, _ := ret[0].(client.DiskUsageResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// DiskUsage indicates an expected call of DiskUsage.\nfunc (mr *MockAPIClientMockRecorder) DiskUsage(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"DiskUsage\", reflect.TypeOf((*MockAPIClient)(nil).DiskUsage), arg0, arg1)\n}\n\n// DistributionInspect mocks base method.\nfunc (m *MockAPIClient) DistributionInspect(arg0 context.Context, arg1 string, arg2 client.DistributionInspectOptions) (client.DistributionInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"DistributionInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.DistributionInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// DistributionInspect indicates an expected call of DistributionInspect.\nfunc (mr *MockAPIClientMockRecorder) 
DistributionInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"DistributionInspect\", reflect.TypeOf((*MockAPIClient)(nil).DistributionInspect), arg0, arg1, arg2)\n}\n\n// Events mocks base method.\nfunc (m *MockAPIClient) Events(arg0 context.Context, arg1 client.EventsListOptions) client.EventsResult {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Events\", arg0, arg1)\n\tret0, _ := ret[0].(client.EventsResult)\n\treturn ret0\n}\n\n// Events indicates an expected call of Events.\nfunc (mr *MockAPIClientMockRecorder) Events(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Events\", reflect.TypeOf((*MockAPIClient)(nil).Events), arg0, arg1)\n}\n\n// ExecAttach mocks base method.\nfunc (m *MockAPIClient) ExecAttach(arg0 context.Context, arg1 string, arg2 client.ExecAttachOptions) (client.ExecAttachResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ExecAttach\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ExecAttachResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ExecAttach indicates an expected call of ExecAttach.\nfunc (mr *MockAPIClientMockRecorder) ExecAttach(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ExecAttach\", reflect.TypeOf((*MockAPIClient)(nil).ExecAttach), arg0, arg1, arg2)\n}\n\n// ExecCreate mocks base method.\nfunc (m *MockAPIClient) ExecCreate(arg0 context.Context, arg1 string, arg2 client.ExecCreateOptions) (client.ExecCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ExecCreate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ExecCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ExecCreate indicates an expected call of ExecCreate.\nfunc (mr *MockAPIClientMockRecorder) ExecCreate(arg0, arg1, arg2 interface{}) *gomock.Call 
{\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ExecCreate\", reflect.TypeOf((*MockAPIClient)(nil).ExecCreate), arg0, arg1, arg2)\n}\n\n// ExecInspect mocks base method.\nfunc (m *MockAPIClient) ExecInspect(arg0 context.Context, arg1 string, arg2 client.ExecInspectOptions) (client.ExecInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ExecInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ExecInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ExecInspect indicates an expected call of ExecInspect.\nfunc (mr *MockAPIClientMockRecorder) ExecInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ExecInspect\", reflect.TypeOf((*MockAPIClient)(nil).ExecInspect), arg0, arg1, arg2)\n}\n\n// ExecResize mocks base method.\nfunc (m *MockAPIClient) ExecResize(arg0 context.Context, arg1 string, arg2 client.ExecResizeOptions) (client.ExecResizeResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ExecResize\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ExecResizeResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ExecResize indicates an expected call of ExecResize.\nfunc (mr *MockAPIClientMockRecorder) ExecResize(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ExecResize\", reflect.TypeOf((*MockAPIClient)(nil).ExecResize), arg0, arg1, arg2)\n}\n\n// ExecStart mocks base method.\nfunc (m *MockAPIClient) ExecStart(arg0 context.Context, arg1 string, arg2 client.ExecStartOptions) (client.ExecStartResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ExecStart\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ExecStartResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ExecStart indicates an expected call of ExecStart.\nfunc (mr *MockAPIClientMockRecorder) ExecStart(arg0, 
arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ExecStart\", reflect.TypeOf((*MockAPIClient)(nil).ExecStart), arg0, arg1, arg2)\n}\n\n// ImageBuild mocks base method.\nfunc (m *MockAPIClient) ImageBuild(arg0 context.Context, arg1 io.Reader, arg2 client.ImageBuildOptions) (client.ImageBuildResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ImageBuild\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ImageBuildResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageBuild indicates an expected call of ImageBuild.\nfunc (mr *MockAPIClientMockRecorder) ImageBuild(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageBuild\", reflect.TypeOf((*MockAPIClient)(nil).ImageBuild), arg0, arg1, arg2)\n}\n\n// ImageHistory mocks base method.\nfunc (m *MockAPIClient) ImageHistory(arg0 context.Context, arg1 string, arg2 ...client.ImageHistoryOption) (client.ImageHistoryResult, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0, arg1}\n\tfor _, a := range arg2 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"ImageHistory\", varargs...)\n\tret0, _ := ret[0].(client.ImageHistoryResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageHistory indicates an expected call of ImageHistory.\nfunc (mr *MockAPIClientMockRecorder) ImageHistory(arg0, arg1 interface{}, arg2 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0, arg1}, arg2...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageHistory\", reflect.TypeOf((*MockAPIClient)(nil).ImageHistory), varargs...)\n}\n\n// ImageImport mocks base method.\nfunc (m *MockAPIClient) ImageImport(arg0 context.Context, arg1 client.ImageImportSource, arg2 string, arg3 client.ImageImportOptions) (client.ImageImportResult, error) {\n\tm.ctrl.T.Helper()\n\tret := 
m.ctrl.Call(m, \"ImageImport\", arg0, arg1, arg2, arg3)\n\tret0, _ := ret[0].(client.ImageImportResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageImport indicates an expected call of ImageImport.\nfunc (mr *MockAPIClientMockRecorder) ImageImport(arg0, arg1, arg2, arg3 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageImport\", reflect.TypeOf((*MockAPIClient)(nil).ImageImport), arg0, arg1, arg2, arg3)\n}\n\n// ImageInspect mocks base method.\nfunc (m *MockAPIClient) ImageInspect(arg0 context.Context, arg1 string, arg2 ...client.ImageInspectOption) (client.ImageInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0, arg1}\n\tfor _, a := range arg2 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"ImageInspect\", varargs...)\n\tret0, _ := ret[0].(client.ImageInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageInspect indicates an expected call of ImageInspect.\nfunc (mr *MockAPIClientMockRecorder) ImageInspect(arg0, arg1 interface{}, arg2 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0, arg1}, arg2...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageInspect\", reflect.TypeOf((*MockAPIClient)(nil).ImageInspect), varargs...)\n}\n\n// ImageList mocks base method.\nfunc (m *MockAPIClient) ImageList(arg0 context.Context, arg1 client.ImageListOptions) (client.ImageListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ImageList\", arg0, arg1)\n\tret0, _ := ret[0].(client.ImageListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageList indicates an expected call of ImageList.\nfunc (mr *MockAPIClientMockRecorder) ImageList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageList\", reflect.TypeOf((*MockAPIClient)(nil).ImageList), arg0, 
arg1)\n}\n\n// ImageLoad mocks base method.\nfunc (m *MockAPIClient) ImageLoad(arg0 context.Context, arg1 io.Reader, arg2 ...client.ImageLoadOption) (client.ImageLoadResult, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0, arg1}\n\tfor _, a := range arg2 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"ImageLoad\", varargs...)\n\tret0, _ := ret[0].(client.ImageLoadResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageLoad indicates an expected call of ImageLoad.\nfunc (mr *MockAPIClientMockRecorder) ImageLoad(arg0, arg1 interface{}, arg2 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0, arg1}, arg2...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageLoad\", reflect.TypeOf((*MockAPIClient)(nil).ImageLoad), varargs...)\n}\n\n// ImagePrune mocks base method.\nfunc (m *MockAPIClient) ImagePrune(arg0 context.Context, arg1 client.ImagePruneOptions) (client.ImagePruneResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ImagePrune\", arg0, arg1)\n\tret0, _ := ret[0].(client.ImagePruneResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImagePrune indicates an expected call of ImagePrune.\nfunc (mr *MockAPIClientMockRecorder) ImagePrune(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImagePrune\", reflect.TypeOf((*MockAPIClient)(nil).ImagePrune), arg0, arg1)\n}\n\n// ImagePull mocks base method.\nfunc (m *MockAPIClient) ImagePull(arg0 context.Context, arg1 string, arg2 client.ImagePullOptions) (client.ImagePullResponse, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ImagePull\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ImagePullResponse)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImagePull indicates an expected call of ImagePull.\nfunc (mr *MockAPIClientMockRecorder) ImagePull(arg0, arg1, arg2 interface{}) *gomock.Call 
{\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImagePull\", reflect.TypeOf((*MockAPIClient)(nil).ImagePull), arg0, arg1, arg2)\n}\n\n// ImagePush mocks base method.\nfunc (m *MockAPIClient) ImagePush(arg0 context.Context, arg1 string, arg2 client.ImagePushOptions) (client.ImagePushResponse, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ImagePush\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ImagePushResponse)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImagePush indicates an expected call of ImagePush.\nfunc (mr *MockAPIClientMockRecorder) ImagePush(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImagePush\", reflect.TypeOf((*MockAPIClient)(nil).ImagePush), arg0, arg1, arg2)\n}\n\n// ImageRemove mocks base method.\nfunc (m *MockAPIClient) ImageRemove(arg0 context.Context, arg1 string, arg2 client.ImageRemoveOptions) (client.ImageRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ImageRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ImageRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageRemove indicates an expected call of ImageRemove.\nfunc (mr *MockAPIClientMockRecorder) ImageRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageRemove\", reflect.TypeOf((*MockAPIClient)(nil).ImageRemove), arg0, arg1, arg2)\n}\n\n// ImageSave mocks base method.\nfunc (m *MockAPIClient) ImageSave(arg0 context.Context, arg1 []string, arg2 ...client.ImageSaveOption) (client.ImageSaveResult, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0, arg1}\n\tfor _, a := range arg2 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"ImageSave\", varargs...)\n\tret0, _ := ret[0].(client.ImageSaveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageSave 
indicates an expected call of ImageSave.\nfunc (mr *MockAPIClientMockRecorder) ImageSave(arg0, arg1 interface{}, arg2 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0, arg1}, arg2...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageSave\", reflect.TypeOf((*MockAPIClient)(nil).ImageSave), varargs...)\n}\n\n// ImageSearch mocks base method.\nfunc (m *MockAPIClient) ImageSearch(arg0 context.Context, arg1 string, arg2 client.ImageSearchOptions) (client.ImageSearchResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ImageSearch\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ImageSearchResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageSearch indicates an expected call of ImageSearch.\nfunc (mr *MockAPIClientMockRecorder) ImageSearch(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageSearch\", reflect.TypeOf((*MockAPIClient)(nil).ImageSearch), arg0, arg1, arg2)\n}\n\n// ImageTag mocks base method.\nfunc (m *MockAPIClient) ImageTag(arg0 context.Context, arg1 client.ImageTagOptions) (client.ImageTagResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ImageTag\", arg0, arg1)\n\tret0, _ := ret[0].(client.ImageTagResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ImageTag indicates an expected call of ImageTag.\nfunc (mr *MockAPIClientMockRecorder) ImageTag(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ImageTag\", reflect.TypeOf((*MockAPIClient)(nil).ImageTag), arg0, arg1)\n}\n\n// Info mocks base method.\nfunc (m *MockAPIClient) Info(arg0 context.Context, arg1 client.InfoOptions) (client.SystemInfoResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Info\", arg0, arg1)\n\tret0, _ := ret[0].(client.SystemInfoResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// 
Info indicates an expected call of Info.\nfunc (mr *MockAPIClientMockRecorder) Info(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Info\", reflect.TypeOf((*MockAPIClient)(nil).Info), arg0, arg1)\n}\n\n// NetworkConnect mocks base method.\nfunc (m *MockAPIClient) NetworkConnect(arg0 context.Context, arg1 string, arg2 client.NetworkConnectOptions) (client.NetworkConnectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NetworkConnect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.NetworkConnectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NetworkConnect indicates an expected call of NetworkConnect.\nfunc (mr *MockAPIClientMockRecorder) NetworkConnect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NetworkConnect\", reflect.TypeOf((*MockAPIClient)(nil).NetworkConnect), arg0, arg1, arg2)\n}\n\n// NetworkCreate mocks base method.\nfunc (m *MockAPIClient) NetworkCreate(arg0 context.Context, arg1 string, arg2 client.NetworkCreateOptions) (client.NetworkCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NetworkCreate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.NetworkCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NetworkCreate indicates an expected call of NetworkCreate.\nfunc (mr *MockAPIClientMockRecorder) NetworkCreate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NetworkCreate\", reflect.TypeOf((*MockAPIClient)(nil).NetworkCreate), arg0, arg1, arg2)\n}\n\n// NetworkDisconnect mocks base method.\nfunc (m *MockAPIClient) NetworkDisconnect(arg0 context.Context, arg1 string, arg2 client.NetworkDisconnectOptions) (client.NetworkDisconnectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NetworkDisconnect\", arg0, arg1, 
arg2)\n\tret0, _ := ret[0].(client.NetworkDisconnectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NetworkDisconnect indicates an expected call of NetworkDisconnect.\nfunc (mr *MockAPIClientMockRecorder) NetworkDisconnect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NetworkDisconnect\", reflect.TypeOf((*MockAPIClient)(nil).NetworkDisconnect), arg0, arg1, arg2)\n}\n\n// NetworkInspect mocks base method.\nfunc (m *MockAPIClient) NetworkInspect(arg0 context.Context, arg1 string, arg2 client.NetworkInspectOptions) (client.NetworkInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NetworkInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.NetworkInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NetworkInspect indicates an expected call of NetworkInspect.\nfunc (mr *MockAPIClientMockRecorder) NetworkInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NetworkInspect\", reflect.TypeOf((*MockAPIClient)(nil).NetworkInspect), arg0, arg1, arg2)\n}\n\n// NetworkList mocks base method.\nfunc (m *MockAPIClient) NetworkList(arg0 context.Context, arg1 client.NetworkListOptions) (client.NetworkListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NetworkList\", arg0, arg1)\n\tret0, _ := ret[0].(client.NetworkListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NetworkList indicates an expected call of NetworkList.\nfunc (mr *MockAPIClientMockRecorder) NetworkList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NetworkList\", reflect.TypeOf((*MockAPIClient)(nil).NetworkList), arg0, arg1)\n}\n\n// NetworkPrune mocks base method.\nfunc (m *MockAPIClient) NetworkPrune(arg0 context.Context, arg1 client.NetworkPruneOptions) 
(client.NetworkPruneResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NetworkPrune\", arg0, arg1)\n\tret0, _ := ret[0].(client.NetworkPruneResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NetworkPrune indicates an expected call of NetworkPrune.\nfunc (mr *MockAPIClientMockRecorder) NetworkPrune(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NetworkPrune\", reflect.TypeOf((*MockAPIClient)(nil).NetworkPrune), arg0, arg1)\n}\n\n// NetworkRemove mocks base method.\nfunc (m *MockAPIClient) NetworkRemove(arg0 context.Context, arg1 string, arg2 client.NetworkRemoveOptions) (client.NetworkRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NetworkRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.NetworkRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NetworkRemove indicates an expected call of NetworkRemove.\nfunc (mr *MockAPIClientMockRecorder) NetworkRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NetworkRemove\", reflect.TypeOf((*MockAPIClient)(nil).NetworkRemove), arg0, arg1, arg2)\n}\n\n// NodeInspect mocks base method.\nfunc (m *MockAPIClient) NodeInspect(arg0 context.Context, arg1 string, arg2 client.NodeInspectOptions) (client.NodeInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NodeInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.NodeInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NodeInspect indicates an expected call of NodeInspect.\nfunc (mr *MockAPIClientMockRecorder) NodeInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NodeInspect\", reflect.TypeOf((*MockAPIClient)(nil).NodeInspect), arg0, arg1, arg2)\n}\n\n// NodeList mocks base method.\nfunc (m *MockAPIClient) 
NodeList(arg0 context.Context, arg1 client.NodeListOptions) (client.NodeListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NodeList\", arg0, arg1)\n\tret0, _ := ret[0].(client.NodeListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NodeList indicates an expected call of NodeList.\nfunc (mr *MockAPIClientMockRecorder) NodeList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NodeList\", reflect.TypeOf((*MockAPIClient)(nil).NodeList), arg0, arg1)\n}\n\n// NodeRemove mocks base method.\nfunc (m *MockAPIClient) NodeRemove(arg0 context.Context, arg1 string, arg2 client.NodeRemoveOptions) (client.NodeRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NodeRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.NodeRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NodeRemove indicates an expected call of NodeRemove.\nfunc (mr *MockAPIClientMockRecorder) NodeRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NodeRemove\", reflect.TypeOf((*MockAPIClient)(nil).NodeRemove), arg0, arg1, arg2)\n}\n\n// NodeUpdate mocks base method.\nfunc (m *MockAPIClient) NodeUpdate(arg0 context.Context, arg1 string, arg2 client.NodeUpdateOptions) (client.NodeUpdateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NodeUpdate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.NodeUpdateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NodeUpdate indicates an expected call of NodeUpdate.\nfunc (mr *MockAPIClientMockRecorder) NodeUpdate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NodeUpdate\", reflect.TypeOf((*MockAPIClient)(nil).NodeUpdate), arg0, arg1, arg2)\n}\n\n// Ping mocks base method.\nfunc (m *MockAPIClient) Ping(arg0 
context.Context, arg1 client.PingOptions) (client.PingResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Ping\", arg0, arg1)\n\tret0, _ := ret[0].(client.PingResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// Ping indicates an expected call of Ping.\nfunc (mr *MockAPIClientMockRecorder) Ping(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Ping\", reflect.TypeOf((*MockAPIClient)(nil).Ping), arg0, arg1)\n}\n\n// PluginCreate mocks base method.\nfunc (m *MockAPIClient) PluginCreate(arg0 context.Context, arg1 io.Reader, arg2 client.PluginCreateOptions) (client.PluginCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginCreate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginCreate indicates an expected call of PluginCreate.\nfunc (mr *MockAPIClientMockRecorder) PluginCreate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginCreate\", reflect.TypeOf((*MockAPIClient)(nil).PluginCreate), arg0, arg1, arg2)\n}\n\n// PluginDisable mocks base method.\nfunc (m *MockAPIClient) PluginDisable(arg0 context.Context, arg1 string, arg2 client.PluginDisableOptions) (client.PluginDisableResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginDisable\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginDisableResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginDisable indicates an expected call of PluginDisable.\nfunc (mr *MockAPIClientMockRecorder) PluginDisable(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginDisable\", reflect.TypeOf((*MockAPIClient)(nil).PluginDisable), arg0, arg1, arg2)\n}\n\n// PluginEnable mocks base method.\nfunc (m *MockAPIClient) 
PluginEnable(arg0 context.Context, arg1 string, arg2 client.PluginEnableOptions) (client.PluginEnableResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginEnable\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginEnableResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginEnable indicates an expected call of PluginEnable.\nfunc (mr *MockAPIClientMockRecorder) PluginEnable(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginEnable\", reflect.TypeOf((*MockAPIClient)(nil).PluginEnable), arg0, arg1, arg2)\n}\n\n// PluginInspect mocks base method.\nfunc (m *MockAPIClient) PluginInspect(arg0 context.Context, arg1 string, arg2 client.PluginInspectOptions) (client.PluginInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginInspect indicates an expected call of PluginInspect.\nfunc (mr *MockAPIClientMockRecorder) PluginInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginInspect\", reflect.TypeOf((*MockAPIClient)(nil).PluginInspect), arg0, arg1, arg2)\n}\n\n// PluginInstall mocks base method.\nfunc (m *MockAPIClient) PluginInstall(arg0 context.Context, arg1 string, arg2 client.PluginInstallOptions) (client.PluginInstallResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginInstall\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginInstallResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginInstall indicates an expected call of PluginInstall.\nfunc (mr *MockAPIClientMockRecorder) PluginInstall(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginInstall\", 
reflect.TypeOf((*MockAPIClient)(nil).PluginInstall), arg0, arg1, arg2)\n}\n\n// PluginList mocks base method.\nfunc (m *MockAPIClient) PluginList(arg0 context.Context, arg1 client.PluginListOptions) (client.PluginListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginList\", arg0, arg1)\n\tret0, _ := ret[0].(client.PluginListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginList indicates an expected call of PluginList.\nfunc (mr *MockAPIClientMockRecorder) PluginList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginList\", reflect.TypeOf((*MockAPIClient)(nil).PluginList), arg0, arg1)\n}\n\n// PluginPush mocks base method.\nfunc (m *MockAPIClient) PluginPush(arg0 context.Context, arg1 string, arg2 client.PluginPushOptions) (client.PluginPushResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginPush\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginPushResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginPush indicates an expected call of PluginPush.\nfunc (mr *MockAPIClientMockRecorder) PluginPush(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginPush\", reflect.TypeOf((*MockAPIClient)(nil).PluginPush), arg0, arg1, arg2)\n}\n\n// PluginRemove mocks base method.\nfunc (m *MockAPIClient) PluginRemove(arg0 context.Context, arg1 string, arg2 client.PluginRemoveOptions) (client.PluginRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginRemove indicates an expected call of PluginRemove.\nfunc (mr *MockAPIClientMockRecorder) PluginRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn 
mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginRemove\", reflect.TypeOf((*MockAPIClient)(nil).PluginRemove), arg0, arg1, arg2)\n}\n\n// PluginSet mocks base method.\nfunc (m *MockAPIClient) PluginSet(arg0 context.Context, arg1 string, arg2 client.PluginSetOptions) (client.PluginSetResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginSet\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginSetResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginSet indicates an expected call of PluginSet.\nfunc (mr *MockAPIClientMockRecorder) PluginSet(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginSet\", reflect.TypeOf((*MockAPIClient)(nil).PluginSet), arg0, arg1, arg2)\n}\n\n// PluginUpgrade mocks base method.\nfunc (m *MockAPIClient) PluginUpgrade(arg0 context.Context, arg1 string, arg2 client.PluginUpgradeOptions) (client.PluginUpgradeResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"PluginUpgrade\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.PluginUpgradeResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// PluginUpgrade indicates an expected call of PluginUpgrade.\nfunc (mr *MockAPIClientMockRecorder) PluginUpgrade(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"PluginUpgrade\", reflect.TypeOf((*MockAPIClient)(nil).PluginUpgrade), arg0, arg1, arg2)\n}\n\n// RegistryLogin mocks base method.\nfunc (m *MockAPIClient) RegistryLogin(arg0 context.Context, arg1 client.RegistryLoginOptions) (client.RegistryLoginResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"RegistryLogin\", arg0, arg1)\n\tret0, _ := ret[0].(client.RegistryLoginResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// RegistryLogin indicates an expected call of RegistryLogin.\nfunc (mr *MockAPIClientMockRecorder) RegistryLogin(arg0, arg1 
interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"RegistryLogin\", reflect.TypeOf((*MockAPIClient)(nil).RegistryLogin), arg0, arg1)\n}\n\n// SecretCreate mocks base method.\nfunc (m *MockAPIClient) SecretCreate(arg0 context.Context, arg1 client.SecretCreateOptions) (client.SecretCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SecretCreate\", arg0, arg1)\n\tret0, _ := ret[0].(client.SecretCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SecretCreate indicates an expected call of SecretCreate.\nfunc (mr *MockAPIClientMockRecorder) SecretCreate(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SecretCreate\", reflect.TypeOf((*MockAPIClient)(nil).SecretCreate), arg0, arg1)\n}\n\n// SecretInspect mocks base method.\nfunc (m *MockAPIClient) SecretInspect(arg0 context.Context, arg1 string, arg2 client.SecretInspectOptions) (client.SecretInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SecretInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.SecretInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SecretInspect indicates an expected call of SecretInspect.\nfunc (mr *MockAPIClientMockRecorder) SecretInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SecretInspect\", reflect.TypeOf((*MockAPIClient)(nil).SecretInspect), arg0, arg1, arg2)\n}\n\n// SecretList mocks base method.\nfunc (m *MockAPIClient) SecretList(arg0 context.Context, arg1 client.SecretListOptions) (client.SecretListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SecretList\", arg0, arg1)\n\tret0, _ := ret[0].(client.SecretListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SecretList indicates an expected call of SecretList.\nfunc (mr 
*MockAPIClientMockRecorder) SecretList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SecretList\", reflect.TypeOf((*MockAPIClient)(nil).SecretList), arg0, arg1)\n}\n\n// SecretRemove mocks base method.\nfunc (m *MockAPIClient) SecretRemove(arg0 context.Context, arg1 string, arg2 client.SecretRemoveOptions) (client.SecretRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SecretRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.SecretRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SecretRemove indicates an expected call of SecretRemove.\nfunc (mr *MockAPIClientMockRecorder) SecretRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SecretRemove\", reflect.TypeOf((*MockAPIClient)(nil).SecretRemove), arg0, arg1, arg2)\n}\n\n// SecretUpdate mocks base method.\nfunc (m *MockAPIClient) SecretUpdate(arg0 context.Context, arg1 string, arg2 client.SecretUpdateOptions) (client.SecretUpdateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SecretUpdate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.SecretUpdateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SecretUpdate indicates an expected call of SecretUpdate.\nfunc (mr *MockAPIClientMockRecorder) SecretUpdate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SecretUpdate\", reflect.TypeOf((*MockAPIClient)(nil).SecretUpdate), arg0, arg1, arg2)\n}\n\n// ServerVersion mocks base method.\nfunc (m *MockAPIClient) ServerVersion(arg0 context.Context, arg1 client.ServerVersionOptions) (client.ServerVersionResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ServerVersion\", arg0, arg1)\n\tret0, _ := ret[0].(client.ServerVersionResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// 
ServerVersion indicates an expected call of ServerVersion.\nfunc (mr *MockAPIClientMockRecorder) ServerVersion(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ServerVersion\", reflect.TypeOf((*MockAPIClient)(nil).ServerVersion), arg0, arg1)\n}\n\n// ServiceCreate mocks base method.\nfunc (m *MockAPIClient) ServiceCreate(arg0 context.Context, arg1 client.ServiceCreateOptions) (client.ServiceCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ServiceCreate\", arg0, arg1)\n\tret0, _ := ret[0].(client.ServiceCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ServiceCreate indicates an expected call of ServiceCreate.\nfunc (mr *MockAPIClientMockRecorder) ServiceCreate(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ServiceCreate\", reflect.TypeOf((*MockAPIClient)(nil).ServiceCreate), arg0, arg1)\n}\n\n// ServiceInspect mocks base method.\nfunc (m *MockAPIClient) ServiceInspect(arg0 context.Context, arg1 string, arg2 client.ServiceInspectOptions) (client.ServiceInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ServiceInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ServiceInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ServiceInspect indicates an expected call of ServiceInspect.\nfunc (mr *MockAPIClientMockRecorder) ServiceInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ServiceInspect\", reflect.TypeOf((*MockAPIClient)(nil).ServiceInspect), arg0, arg1, arg2)\n}\n\n// ServiceList mocks base method.\nfunc (m *MockAPIClient) ServiceList(arg0 context.Context, arg1 client.ServiceListOptions) (client.ServiceListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ServiceList\", arg0, arg1)\n\tret0, _ := 
ret[0].(client.ServiceListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ServiceList indicates an expected call of ServiceList.\nfunc (mr *MockAPIClientMockRecorder) ServiceList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ServiceList\", reflect.TypeOf((*MockAPIClient)(nil).ServiceList), arg0, arg1)\n}\n\n// ServiceLogs mocks base method.\nfunc (m *MockAPIClient) ServiceLogs(arg0 context.Context, arg1 string, arg2 client.ServiceLogsOptions) (client.ServiceLogsResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ServiceLogs\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ServiceLogsResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ServiceLogs indicates an expected call of ServiceLogs.\nfunc (mr *MockAPIClientMockRecorder) ServiceLogs(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ServiceLogs\", reflect.TypeOf((*MockAPIClient)(nil).ServiceLogs), arg0, arg1, arg2)\n}\n\n// ServiceRemove mocks base method.\nfunc (m *MockAPIClient) ServiceRemove(arg0 context.Context, arg1 string, arg2 client.ServiceRemoveOptions) (client.ServiceRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ServiceRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ServiceRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ServiceRemove indicates an expected call of ServiceRemove.\nfunc (mr *MockAPIClientMockRecorder) ServiceRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ServiceRemove\", reflect.TypeOf((*MockAPIClient)(nil).ServiceRemove), arg0, arg1, arg2)\n}\n\n// ServiceUpdate mocks base method.\nfunc (m *MockAPIClient) ServiceUpdate(arg0 context.Context, arg1 string, arg2 client.ServiceUpdateOptions) (client.ServiceUpdateResult, error) 
{\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"ServiceUpdate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.ServiceUpdateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// ServiceUpdate indicates an expected call of ServiceUpdate.\nfunc (mr *MockAPIClientMockRecorder) ServiceUpdate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"ServiceUpdate\", reflect.TypeOf((*MockAPIClient)(nil).ServiceUpdate), arg0, arg1, arg2)\n}\n\n// SwarmGetUnlockKey mocks base method.\nfunc (m *MockAPIClient) SwarmGetUnlockKey(arg0 context.Context) (client.SwarmGetUnlockKeyResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SwarmGetUnlockKey\", arg0)\n\tret0, _ := ret[0].(client.SwarmGetUnlockKeyResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SwarmGetUnlockKey indicates an expected call of SwarmGetUnlockKey.\nfunc (mr *MockAPIClientMockRecorder) SwarmGetUnlockKey(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SwarmGetUnlockKey\", reflect.TypeOf((*MockAPIClient)(nil).SwarmGetUnlockKey), arg0)\n}\n\n// SwarmInit mocks base method.\nfunc (m *MockAPIClient) SwarmInit(arg0 context.Context, arg1 client.SwarmInitOptions) (client.SwarmInitResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SwarmInit\", arg0, arg1)\n\tret0, _ := ret[0].(client.SwarmInitResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SwarmInit indicates an expected call of SwarmInit.\nfunc (mr *MockAPIClientMockRecorder) SwarmInit(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SwarmInit\", reflect.TypeOf((*MockAPIClient)(nil).SwarmInit), arg0, arg1)\n}\n\n// SwarmInspect mocks base method.\nfunc (m *MockAPIClient) SwarmInspect(arg0 context.Context, arg1 client.SwarmInspectOptions) (client.SwarmInspectResult, error) 
{\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SwarmInspect\", arg0, arg1)\n\tret0, _ := ret[0].(client.SwarmInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SwarmInspect indicates an expected call of SwarmInspect.\nfunc (mr *MockAPIClientMockRecorder) SwarmInspect(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SwarmInspect\", reflect.TypeOf((*MockAPIClient)(nil).SwarmInspect), arg0, arg1)\n}\n\n// SwarmJoin mocks base method.\nfunc (m *MockAPIClient) SwarmJoin(arg0 context.Context, arg1 client.SwarmJoinOptions) (client.SwarmJoinResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SwarmJoin\", arg0, arg1)\n\tret0, _ := ret[0].(client.SwarmJoinResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SwarmJoin indicates an expected call of SwarmJoin.\nfunc (mr *MockAPIClientMockRecorder) SwarmJoin(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SwarmJoin\", reflect.TypeOf((*MockAPIClient)(nil).SwarmJoin), arg0, arg1)\n}\n\n// SwarmLeave mocks base method.\nfunc (m *MockAPIClient) SwarmLeave(arg0 context.Context, arg1 client.SwarmLeaveOptions) (client.SwarmLeaveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SwarmLeave\", arg0, arg1)\n\tret0, _ := ret[0].(client.SwarmLeaveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SwarmLeave indicates an expected call of SwarmLeave.\nfunc (mr *MockAPIClientMockRecorder) SwarmLeave(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SwarmLeave\", reflect.TypeOf((*MockAPIClient)(nil).SwarmLeave), arg0, arg1)\n}\n\n// SwarmUnlock mocks base method.\nfunc (m *MockAPIClient) SwarmUnlock(arg0 context.Context, arg1 client.SwarmUnlockOptions) (client.SwarmUnlockResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, 
\"SwarmUnlock\", arg0, arg1)\n\tret0, _ := ret[0].(client.SwarmUnlockResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SwarmUnlock indicates an expected call of SwarmUnlock.\nfunc (mr *MockAPIClientMockRecorder) SwarmUnlock(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SwarmUnlock\", reflect.TypeOf((*MockAPIClient)(nil).SwarmUnlock), arg0, arg1)\n}\n\n// SwarmUpdate mocks base method.\nfunc (m *MockAPIClient) SwarmUpdate(arg0 context.Context, arg1 client.SwarmUpdateOptions) (client.SwarmUpdateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"SwarmUpdate\", arg0, arg1)\n\tret0, _ := ret[0].(client.SwarmUpdateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// SwarmUpdate indicates an expected call of SwarmUpdate.\nfunc (mr *MockAPIClientMockRecorder) SwarmUpdate(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"SwarmUpdate\", reflect.TypeOf((*MockAPIClient)(nil).SwarmUpdate), arg0, arg1)\n}\n\n// TaskInspect mocks base method.\nfunc (m *MockAPIClient) TaskInspect(arg0 context.Context, arg1 string, arg2 client.TaskInspectOptions) (client.TaskInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"TaskInspect\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.TaskInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// TaskInspect indicates an expected call of TaskInspect.\nfunc (mr *MockAPIClientMockRecorder) TaskInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"TaskInspect\", reflect.TypeOf((*MockAPIClient)(nil).TaskInspect), arg0, arg1, arg2)\n}\n\n// TaskList mocks base method.\nfunc (m *MockAPIClient) TaskList(arg0 context.Context, arg1 client.TaskListOptions) (client.TaskListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, 
\"TaskList\", arg0, arg1)\n\tret0, _ := ret[0].(client.TaskListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// TaskList indicates an expected call of TaskList.\nfunc (mr *MockAPIClientMockRecorder) TaskList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"TaskList\", reflect.TypeOf((*MockAPIClient)(nil).TaskList), arg0, arg1)\n}\n\n// TaskLogs mocks base method.\nfunc (m *MockAPIClient) TaskLogs(arg0 context.Context, arg1 string, arg2 client.TaskLogsOptions) (client.TaskLogsResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"TaskLogs\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.TaskLogsResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// TaskLogs indicates an expected call of TaskLogs.\nfunc (mr *MockAPIClientMockRecorder) TaskLogs(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"TaskLogs\", reflect.TypeOf((*MockAPIClient)(nil).TaskLogs), arg0, arg1, arg2)\n}\n\n// VolumeCreate mocks base method.\nfunc (m *MockAPIClient) VolumeCreate(arg0 context.Context, arg1 client.VolumeCreateOptions) (client.VolumeCreateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"VolumeCreate\", arg0, arg1)\n\tret0, _ := ret[0].(client.VolumeCreateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// VolumeCreate indicates an expected call of VolumeCreate.\nfunc (mr *MockAPIClientMockRecorder) VolumeCreate(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"VolumeCreate\", reflect.TypeOf((*MockAPIClient)(nil).VolumeCreate), arg0, arg1)\n}\n\n// VolumeInspect mocks base method.\nfunc (m *MockAPIClient) VolumeInspect(arg0 context.Context, arg1 string, arg2 client.VolumeInspectOptions) (client.VolumeInspectResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"VolumeInspect\", 
arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.VolumeInspectResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// VolumeInspect indicates an expected call of VolumeInspect.\nfunc (mr *MockAPIClientMockRecorder) VolumeInspect(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"VolumeInspect\", reflect.TypeOf((*MockAPIClient)(nil).VolumeInspect), arg0, arg1, arg2)\n}\n\n// VolumeList mocks base method.\nfunc (m *MockAPIClient) VolumeList(arg0 context.Context, arg1 client.VolumeListOptions) (client.VolumeListResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"VolumeList\", arg0, arg1)\n\tret0, _ := ret[0].(client.VolumeListResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// VolumeList indicates an expected call of VolumeList.\nfunc (mr *MockAPIClientMockRecorder) VolumeList(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"VolumeList\", reflect.TypeOf((*MockAPIClient)(nil).VolumeList), arg0, arg1)\n}\n\n// VolumePrune mocks base method.\nfunc (m *MockAPIClient) VolumePrune(arg0 context.Context, arg1 client.VolumePruneOptions) (client.VolumePruneResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"VolumePrune\", arg0, arg1)\n\tret0, _ := ret[0].(client.VolumePruneResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// VolumePrune indicates an expected call of VolumePrune.\nfunc (mr *MockAPIClientMockRecorder) VolumePrune(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"VolumePrune\", reflect.TypeOf((*MockAPIClient)(nil).VolumePrune), arg0, arg1)\n}\n\n// VolumeRemove mocks base method.\nfunc (m *MockAPIClient) VolumeRemove(arg0 context.Context, arg1 string, arg2 client.VolumeRemoveOptions) (client.VolumeRemoveResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, 
\"VolumeRemove\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.VolumeRemoveResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// VolumeRemove indicates an expected call of VolumeRemove.\nfunc (mr *MockAPIClientMockRecorder) VolumeRemove(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"VolumeRemove\", reflect.TypeOf((*MockAPIClient)(nil).VolumeRemove), arg0, arg1, arg2)\n}\n\n// VolumeUpdate mocks base method.\nfunc (m *MockAPIClient) VolumeUpdate(arg0 context.Context, arg1 string, arg2 client.VolumeUpdateOptions) (client.VolumeUpdateResult, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"VolumeUpdate\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(client.VolumeUpdateResult)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// VolumeUpdate indicates an expected call of VolumeUpdate.\nfunc (mr *MockAPIClientMockRecorder) VolumeUpdate(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"VolumeUpdate\", reflect.TypeOf((*MockAPIClient)(nil).VolumeUpdate), arg0, arg1, arg2)\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_image.go",
    "content": "package testmocks\n\nimport (\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/fakes\"\n)\n\ntype MockImage struct {\n\t*fakes.Image\n\tEntrypointCall struct {\n\t\tcallCount int\n\t\tReceived  struct{}\n\t\tReturns   struct {\n\t\t\tStringArr []string\n\t\t\tError     error\n\t\t}\n\t\tStub func() ([]string, error)\n\t}\n}\n\nfunc NewImage(name, topLayerSha string, identifier imgutil.Identifier) *MockImage {\n\treturn &MockImage{\n\t\tImage: fakes.NewImage(name, topLayerSha, identifier),\n\t}\n}\n\nfunc (m *MockImage) Entrypoint() ([]string, error) {\n\tif m.EntrypointCall.Stub != nil {\n\t\treturn m.EntrypointCall.Stub()\n\t}\n\treturn m.EntrypointCall.Returns.StringArr, m.EntrypointCall.Returns.Error\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_image_factory.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/pkg/client (interfaces: ImageFactory)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\treflect \"reflect\"\n\n\timgutil \"github.com/buildpacks/imgutil\"\n\tgomock \"github.com/golang/mock/gomock\"\n\n\tdist \"github.com/buildpacks/pack/pkg/dist\"\n)\n\n// MockImageFactory is a mock of ImageFactory interface.\ntype MockImageFactory struct {\n\tctrl     *gomock.Controller\n\trecorder *MockImageFactoryMockRecorder\n}\n\n// MockImageFactoryMockRecorder is the mock recorder for MockImageFactory.\ntype MockImageFactoryMockRecorder struct {\n\tmock *MockImageFactory\n}\n\n// NewMockImageFactory creates a new mock instance.\nfunc NewMockImageFactory(ctrl *gomock.Controller) *MockImageFactory {\n\tmock := &MockImageFactory{ctrl: ctrl}\n\tmock.recorder = &MockImageFactoryMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockImageFactory) EXPECT() *MockImageFactoryMockRecorder {\n\treturn m.recorder\n}\n\n// NewImage mocks base method.\nfunc (m *MockImageFactory) NewImage(arg0 string, arg1 bool, arg2 dist.Target) (imgutil.Image, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"NewImage\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(imgutil.Image)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// NewImage indicates an expected call of NewImage.\nfunc (mr *MockImageFactoryMockRecorder) NewImage(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"NewImage\", reflect.TypeOf((*MockImageFactory)(nil).NewImage), arg0, arg1, arg2)\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_image_fetcher.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/pkg/client (interfaces: ImageFetcher)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\tcontext \"context\"\n\treflect \"reflect\"\n\n\timgutil \"github.com/buildpacks/imgutil\"\n\tgomock \"github.com/golang/mock/gomock\"\n\n\timage \"github.com/buildpacks/pack/pkg/image\"\n)\n\n// MockImageFetcher is a mock of ImageFetcher interface.\ntype MockImageFetcher struct {\n\tctrl     *gomock.Controller\n\trecorder *MockImageFetcherMockRecorder\n}\n\n// MockImageFetcherMockRecorder is the mock recorder for MockImageFetcher.\ntype MockImageFetcherMockRecorder struct {\n\tmock *MockImageFetcher\n}\n\n// NewMockImageFetcher creates a new mock instance.\nfunc NewMockImageFetcher(ctrl *gomock.Controller) *MockImageFetcher {\n\tmock := &MockImageFetcher{ctrl: ctrl}\n\tmock.recorder = &MockImageFetcherMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockImageFetcher) EXPECT() *MockImageFetcherMockRecorder {\n\treturn m.recorder\n}\n\n// CheckReadAccess mocks base method.\nfunc (m *MockImageFetcher) CheckReadAccess(arg0 string, arg1 image.FetchOptions) bool {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"CheckReadAccess\", arg0, arg1)\n\tret0, _ := ret[0].(bool)\n\treturn ret0\n}\n\n// CheckReadAccess indicates an expected call of CheckReadAccess.\nfunc (mr *MockImageFetcherMockRecorder) CheckReadAccess(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CheckReadAccess\", reflect.TypeOf((*MockImageFetcher)(nil).CheckReadAccess), arg0, arg1)\n}\n\n// Fetch mocks base method.\nfunc (m *MockImageFetcher) Fetch(arg0 context.Context, arg1 string, arg2 image.FetchOptions) (imgutil.Image, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Fetch\", arg0, arg1, arg2)\n\tret0, _ := 
ret[0].(imgutil.Image)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// Fetch indicates an expected call of Fetch.\nfunc (mr *MockImageFetcherMockRecorder) Fetch(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Fetch\", reflect.TypeOf((*MockImageFetcher)(nil).Fetch), arg0, arg1, arg2)\n}\n\n// FetchForPlatform mocks base method.\nfunc (m *MockImageFetcher) FetchForPlatform(arg0 context.Context, arg1 string, arg2 image.FetchOptions) (imgutil.Image, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"FetchForPlatform\", arg0, arg1, arg2)\n\tret0, _ := ret[0].(imgutil.Image)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// FetchForPlatform indicates an expected call of FetchForPlatform.\nfunc (mr *MockImageFetcherMockRecorder) FetchForPlatform(arg0, arg1, arg2 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"FetchForPlatform\", reflect.TypeOf((*MockImageFetcher)(nil).FetchForPlatform), arg0, arg1, arg2)\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_index_factory.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/pkg/client (interfaces: IndexFactory)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\treflect \"reflect\"\n\n\timgutil \"github.com/buildpacks/imgutil\"\n\tgomock \"github.com/golang/mock/gomock\"\n)\n\n// MockIndexFactory is a mock of IndexFactory interface.\ntype MockIndexFactory struct {\n\tctrl     *gomock.Controller\n\trecorder *MockIndexFactoryMockRecorder\n}\n\n// MockIndexFactoryMockRecorder is the mock recorder for MockIndexFactory.\ntype MockIndexFactoryMockRecorder struct {\n\tmock *MockIndexFactory\n}\n\n// NewMockIndexFactory creates a new mock instance.\nfunc NewMockIndexFactory(ctrl *gomock.Controller) *MockIndexFactory {\n\tmock := &MockIndexFactory{ctrl: ctrl}\n\tmock.recorder = &MockIndexFactoryMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockIndexFactory) EXPECT() *MockIndexFactoryMockRecorder {\n\treturn m.recorder\n}\n\n// CreateIndex mocks base method.\nfunc (m *MockIndexFactory) CreateIndex(arg0 string, arg1 ...imgutil.IndexOption) (imgutil.ImageIndex, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0}\n\tfor _, a := range arg1 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"CreateIndex\", varargs...)\n\tret0, _ := ret[0].(imgutil.ImageIndex)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// CreateIndex indicates an expected call of CreateIndex.\nfunc (mr *MockIndexFactoryMockRecorder) CreateIndex(arg0 interface{}, arg1 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0}, arg1...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"CreateIndex\", reflect.TypeOf((*MockIndexFactory)(nil).CreateIndex), varargs...)\n}\n\n// Exists mocks base method.\nfunc (m *MockIndexFactory) Exists(arg0 string) bool {\n\tm.ctrl.T.Helper()\n\tret := 
m.ctrl.Call(m, \"Exists\", arg0)\n\tret0, _ := ret[0].(bool)\n\treturn ret0\n}\n\n// Exists indicates an expected call of Exists.\nfunc (mr *MockIndexFactoryMockRecorder) Exists(arg0 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Exists\", reflect.TypeOf((*MockIndexFactory)(nil).Exists), arg0)\n}\n\n// FetchIndex mocks base method.\nfunc (m *MockIndexFactory) FetchIndex(arg0 string, arg1 ...imgutil.IndexOption) (imgutil.ImageIndex, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0}\n\tfor _, a := range arg1 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"FetchIndex\", varargs...)\n\tret0, _ := ret[0].(imgutil.ImageIndex)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// FetchIndex indicates an expected call of FetchIndex.\nfunc (mr *MockIndexFactoryMockRecorder) FetchIndex(arg0 interface{}, arg1 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0}, arg1...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"FetchIndex\", reflect.TypeOf((*MockIndexFactory)(nil).FetchIndex), varargs...)\n}\n\n// FindIndex mocks base method.\nfunc (m *MockIndexFactory) FindIndex(arg0 string, arg1 ...imgutil.IndexOption) (imgutil.ImageIndex, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0}\n\tfor _, a := range arg1 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"FindIndex\", varargs...)\n\tret0, _ := ret[0].(imgutil.ImageIndex)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// FindIndex indicates an expected call of FindIndex.\nfunc (mr *MockIndexFactoryMockRecorder) FindIndex(arg0 interface{}, arg1 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0}, arg1...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"FindIndex\", reflect.TypeOf((*MockIndexFactory)(nil).FindIndex), varargs...)\n}\n\n// LoadIndex mocks base method.\nfunc 
(m *MockIndexFactory) LoadIndex(arg0 string, arg1 ...imgutil.IndexOption) (imgutil.ImageIndex, error) {\n\tm.ctrl.T.Helper()\n\tvarargs := []interface{}{arg0}\n\tfor _, a := range arg1 {\n\t\tvarargs = append(varargs, a)\n\t}\n\tret := m.ctrl.Call(m, \"LoadIndex\", varargs...)\n\tret0, _ := ret[0].(imgutil.ImageIndex)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// LoadIndex indicates an expected call of LoadIndex.\nfunc (mr *MockIndexFactoryMockRecorder) LoadIndex(arg0 interface{}, arg1 ...interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\tvarargs := append([]interface{}{arg0}, arg1...)\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"LoadIndex\", reflect.TypeOf((*MockIndexFactory)(nil).LoadIndex), varargs...)\n}\n"
  },
  {
    "path": "pkg/testmocks/mock_registry_resolver.go",
    "content": "// Code generated by MockGen. DO NOT EDIT.\n// Source: github.com/buildpacks/pack/pkg/buildpack (interfaces: RegistryResolver)\n\n// Package testmocks is a generated GoMock package.\npackage testmocks\n\nimport (\n\treflect \"reflect\"\n\n\tgomock \"github.com/golang/mock/gomock\"\n)\n\n// MockRegistryResolver is a mock of RegistryResolver interface.\ntype MockRegistryResolver struct {\n\tctrl     *gomock.Controller\n\trecorder *MockRegistryResolverMockRecorder\n}\n\n// MockRegistryResolverMockRecorder is the mock recorder for MockRegistryResolver.\ntype MockRegistryResolverMockRecorder struct {\n\tmock *MockRegistryResolver\n}\n\n// NewMockRegistryResolver creates a new mock instance.\nfunc NewMockRegistryResolver(ctrl *gomock.Controller) *MockRegistryResolver {\n\tmock := &MockRegistryResolver{ctrl: ctrl}\n\tmock.recorder = &MockRegistryResolverMockRecorder{mock}\n\treturn mock\n}\n\n// EXPECT returns an object that allows the caller to indicate expected use.\nfunc (m *MockRegistryResolver) EXPECT() *MockRegistryResolverMockRecorder {\n\treturn m.recorder\n}\n\n// Resolve mocks base method.\nfunc (m *MockRegistryResolver) Resolve(arg0, arg1 string) (string, error) {\n\tm.ctrl.T.Helper()\n\tret := m.ctrl.Call(m, \"Resolve\", arg0, arg1)\n\tret0, _ := ret[0].(string)\n\tret1, _ := ret[1].(error)\n\treturn ret0, ret1\n}\n\n// Resolve indicates an expected call of Resolve.\nfunc (mr *MockRegistryResolverMockRecorder) Resolve(arg0, arg1 interface{}) *gomock.Call {\n\tmr.mock.ctrl.T.Helper()\n\treturn mr.mock.ctrl.RecordCallWithMethodType(mr.mock, \"Resolve\", reflect.TypeOf((*MockRegistryResolver)(nil).Resolve), arg0, arg1)\n}\n"
  },
  {
    "path": "project.toml",
    "content": "[project]\nversion = \"1.0.2\"\nsource-url = \"https://github.com/buildpacks/pack\"\n\n[[build.env]]\nname = \"BP_GO_TARGETS\"\nvalue = \"./cmd/pack\"\n"
  },
  {
    "path": "registry/type.go",
    "content": "package registry\n\nconst (\n\tTypeGit    = \"git\"\n\tTypeGitHub = \"github\"\n)\n\nvar Types = []string{\n\tTypeGit,\n\tTypeGitHub,\n}\n"
  },
  {
    "path": "testdata/builder.toml",
    "content": "[[buildpacks]]\nid = \"some.bp1\"\nuri = \"some-path-1\"\n\n[[buildpacks]]\nid = \"some/bp2\"\nuri = \"some-path-2\"\n\n[[buildpacks]]\nid = \"some/bp2\"\nuri = \"some-path-3\"\n\n[[order]]\n[[order.group]]\n  id = \"some.bp1\"\n  version = \"1.2.3\"\n\n[[order.group]]\n  id = \"some/bp2\"\n  version = \"1.2.4\"\n\n[[order]]\n[[order.group]]\n  id = \"some.bp1\"\n  version = \"1.2.3\"\n\n[stack]\nid = \"com.example.stack\"\nbuild-image = \"some/build\"\nrun-image = \"some/run\"\nrun-image-mirrors = [\"gcr.io/some/run2\"]"
  },
  {
    "path": "testdata/buildpack/bin/build",
    "content": "build-contents"
  },
  {
    "path": "testdata/buildpack/bin/detect",
    "content": ""
  },
  {
    "path": "testdata/buildpack/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.buildpack\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "testdata/buildpack-api-0.4/bin/build",
    "content": "build-contents"
  },
  {
    "path": "testdata/buildpack-api-0.4/bin/detect",
    "content": ""
  },
  {
    "path": "testdata/buildpack-api-0.4/buildpack.toml",
    "content": "api = \"0.4\"\n\n[buildpack]\nid = \"bp.one\"\nversion = \"1.2.3\"\nhomepage = \"http://one.buildpack\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinX\", \"build:mixinY\", \"run:mixinZ\"]\n"
  },
  {
    "path": "testdata/buildpack2/bin/build",
    "content": ""
  },
  {
    "path": "testdata/buildpack2/bin/detect",
    "content": ""
  },
  {
    "path": "testdata/buildpack2/buildpack.toml",
    "content": "api = \"0.3\"\n\n[buildpack]\nid = \"some-other-buildpack-id\"\nversion = \"some-other-buildpack-version\"\n\n[[stacks]]\nid = \"some.stack.id\"\nmixins = [\"mixinA\", \"build:mixinB\", \"run:mixinC\"]\n"
  },
  {
    "path": "testdata/downloader/dirA/file.txt",
    "content": "some file contents"
  },
  {
    "path": "testdata/empty-file",
    "content": ""
  },
  {
    "path": "testdata/just-a-file.txt",
    "content": ""
  },
  {
    "path": "testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/analyzer",
    "content": "analyzer"
  },
  {
    "path": "testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/builder",
    "content": "builder"
  },
  {
    "path": "testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/creator",
    "content": "creator"
  },
  {
    "path": "testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/detector",
    "content": "detector"
  },
  {
    "path": "testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/exporter",
    "content": "exporter"
  },
  {
    "path": "testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/launcher",
    "content": "launcher"
  },
  {
    "path": "testdata/lifecycle/platform-0.3/lifecycle-v0.0.0-arch/restorer",
    "content": "restorer"
  },
  {
    "path": "testdata/lifecycle/platform-0.3/lifecycle.toml",
    "content": "[lifecycle]\nversion = \"0.0.0\"\n\n[api]\nbuildpack = \"0.2\"\nplatform = \"0.3\""
  },
  {
    "path": "testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/analyzer",
    "content": "analyzer"
  },
  {
    "path": "testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/builder",
    "content": "builder"
  },
  {
    "path": "testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/creator",
    "content": "creator"
  },
  {
    "path": "testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/detector",
    "content": "detector"
  },
  {
    "path": "testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/exporter",
    "content": "exporter"
  },
  {
    "path": "testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/launcher",
    "content": "launcher"
  },
  {
    "path": "testdata/lifecycle/platform-0.4/lifecycle-v0.0.0-arch/restorer",
    "content": "restorer"
  },
  {
    "path": "testdata/lifecycle/platform-0.4/lifecycle.toml",
    "content": "[lifecycle]\nversion = \"0.0.0\"\n\n[apis]\n[apis.buildpack]\ndeprecated = [\"0.2\", \"0.3\"]\nsupported = [\"0.2\", \"0.3\", \"0.4\"]\n\n[apis.platform]\ndeprecated = [\"0.2\"]\nsupported = [\"0.3\", \"0.4\"]"
  },
  {
    "path": "testdata/non-zip-file",
    "content": "some-content"
  },
  {
    "path": "testdata/registry/3/fo/example_foo",
    "content": "{\"ns\":\"example\",\"name\":\"foo\",\"version\":\"1.0.0\",\"yanked\":false,\"addr\":\"example.com/some/package@sha256:8c27fe111c11b722081701dfed3bd55e039b9ce92865473cf4cdfa918071c566\"}\n{\"ns\":\"example\",\"name\":\"foo\",\"version\":\"1.1.0\",\"yanked\":false,\"addr\":\"example.com/some/package@sha256:74eb48882e835d8767f62940d453eb96ed2737de3a16573881dcea7dea769df7\"}\n{\"ns\":\"example\",\"name\":\"foo\",\"version\":\"1.2.0\",\"yanked\":false,\"addr\":\"example.com/some/package@sha256:2560f05307e8de9d830f144d09556e19dd1eb7d928aee900ed02208ae9727e7a\"}\n"
  },
  {
    "path": "testdata/registry/ja/va/example_java",
    "content": "{\"ns\":\"example\",\"name\":\"java\",\"version\":\"1.0.0\",\"yanked\":false,\"addr\":\"example.com/some/package@sha256:8c27fe111c11b722081701dfed3bd55e039b9ce92865473cf4cdfa918071c566\"}\n"
  },
  {
    "path": "testdata/some-app/.gitignore",
    "content": "!.gitignore"
  },
  {
    "path": "testhelpers/arg_patterns.go",
    "content": "package testhelpers\n\nimport (\n\t\"fmt\"\n\t\"reflect\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/internal/stringset\"\n)\n\nfunc AssertIncludeAllExpectedPatterns(t *testing.T, receivedArgs []string, expectedPatterns ...[]string) {\n\tt.Helper()\n\n\tmissingPatterns := [][]string{}\n\n\tfor _, expectedPattern := range expectedPatterns {\n\t\tif !patternExists(expectedPattern, receivedArgs) {\n\t\t\tmissingPatterns = append(missingPatterns, expectedPattern)\n\t\t}\n\t}\n\n\tassertSliceEmpty(t,\n\t\tmissingPatterns,\n\t\t\"Expected the patterns %s to exist in [%s]\",\n\t\tmissingPatterns,\n\t\tstrings.Join(receivedArgs, \" \"),\n\t)\n}\n\nfunc patternExists(expectedPattern []string, receivedArgs []string) bool {\n\t_, missing, _ := stringset.Compare(receivedArgs, expectedPattern)\n\tif len(missing) > 0 {\n\t\treturn false\n\t}\n\n\tif len(expectedPattern) == 1 {\n\t\treturn true\n\t}\n\n\tfor _, loc := range matchLocations(expectedPattern[0], receivedArgs) {\n\t\tfinalElementLoc := intMin(loc+len(expectedPattern), len(receivedArgs))\n\n\t\treceivedSubSlice := receivedArgs[loc:finalElementLoc]\n\n\t\tif reflect.DeepEqual(receivedSubSlice, expectedPattern) {\n\t\t\treturn true\n\t\t}\n\t}\n\n\treturn false\n}\n\nfunc matchLocations(expectedArg string, receivedArgs []string) []int {\n\tindices := []int{}\n\n\tfor i, receivedArg := range receivedArgs {\n\t\tif receivedArg == expectedArg {\n\t\t\tindices = append(indices, i)\n\t\t}\n\t}\n\n\treturn indices\n}\n\nfunc assertSliceEmpty(t *testing.T, actual [][]string, msg string, msgArgs ...interface{}) {\n\tt.Helper()\n\n\tempty, err := sliceEmpty(actual)\n\n\tif err != nil {\n\t\tt.Fatalf(\"assertSliceEmpty error: %s\", err.Error())\n\t}\n\n\tif !empty {\n\t\tt.Fatalf(msg, msgArgs...)\n\t}\n}\n\nfunc sliceEmpty(slice [][]string) (bool, error) {\n\tswitch reflect.TypeOf(slice).Kind() {\n\tcase reflect.Slice:\n\t\treturn reflect.ValueOf(slice).Len() == 0, 
nil\n\tdefault:\n\t\treturn true, fmt.Errorf(\"invoked with non slice actual: %v\", slice)\n\t}\n}\n\nfunc intMin(a, b int) int {\n\tif a < b {\n\t\treturn a\n\t}\n\treturn b\n}\n"
  },
  {
    "path": "testhelpers/assert_file.go",
    "content": "package testhelpers\n\nimport (\n\t\"os\"\n)\n\nfunc (a *AssertionManager) FileExists(filePath string) {\n\ta.testObject.Helper()\n\t_, err := os.Stat(filePath)\n\tif err != nil {\n\t\tif os.IsNotExist(err) {\n\t\t\ta.testObject.Fatalf(\"Expected file %s to exist.\", filePath)\n\t\t} else {\n\t\t\ta.testObject.Fatal(\"Unexpected error:\", err.Error())\n\t\t}\n\t}\n}\n\nfunc (a *AssertionManager) FileIsNotEmpty(filePath string) {\n\ta.testObject.Helper()\n\tinfo, err := os.Stat(filePath)\n\tif err != nil {\n\t\ta.testObject.Fatal(\"Unexpected error:\", err.Error())\n\t}\n\n\tif info.IsDir() {\n\t\ta.testObject.Fatalf(\"File %s is a directory.\", filePath)\n\t}\n\n\tif info.Size() == 0 {\n\t\ta.testObject.Fatalf(\"Expected file %s to not be empty.\", filePath)\n\t}\n}\n"
  },
  {
    "path": "testhelpers/assertions.go",
    "content": "package testhelpers\n\nimport (\n\t\"bytes\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"regexp\"\n\t\"strings\"\n\t\"testing\"\n\n\t\"github.com/pelletier/go-toml\"\n\t\"gopkg.in/yaml.v3\"\n\n\t\"github.com/buildpacks/pack/testhelpers/comparehelpers\"\n\n\t\"github.com/google/go-cmp/cmp\"\n)\n\ntype AssertionManager struct {\n\ttestObject *testing.T\n}\n\nfunc NewAssertionManager(testObject *testing.T) AssertionManager {\n\treturn AssertionManager{\n\t\ttestObject: testObject,\n\t}\n}\n\nfunc (a AssertionManager) TrimmedEq(actual, expected string) {\n\ta.testObject.Helper()\n\n\tactualLines := strings.Split(actual, \"\\n\")\n\texpectedLines := strings.Split(expected, \"\\n\")\n\tfor lineIdx, line := range actualLines {\n\t\tactualLines[lineIdx] = strings.TrimRight(line, \"\\t \\n\")\n\t}\n\n\tfor lineIdx, line := range expectedLines {\n\t\texpectedLines[lineIdx] = strings.TrimRight(line, \"\\t \\n\")\n\t}\n\n\tactualTrimmed := strings.Join(actualLines, \"\\n\")\n\texpectedTrimmed := strings.Join(expectedLines, \"\\n\")\n\n\ta.Equal(actualTrimmed, expectedTrimmed)\n}\n\nfunc (a AssertionManager) AssertTrimmedContains(actual, expected string) {\n\ta.testObject.Helper()\n\n\tactualLines := strings.Split(actual, \"\\n\")\n\texpectedLines := strings.Split(expected, \"\\n\")\n\tfor lineIdx, line := range actualLines {\n\t\tactualLines[lineIdx] = strings.TrimRight(line, \"\\t \\n\")\n\t}\n\n\tfor lineIdx, line := range expectedLines {\n\t\texpectedLines[lineIdx] = strings.TrimRight(line, \"\\t \\n\")\n\t}\n\n\tactualTrimmed := strings.Join(actualLines, \"\\n\")\n\texpectedTrimmed := strings.Join(expectedLines, \"\\n\")\n\n\ta.Contains(actualTrimmed, expectedTrimmed)\n}\n\nfunc (a AssertionManager) Equal(actual, expected interface{}) {\n\ta.testObject.Helper()\n\n\tif diff := cmp.Diff(actual, expected); diff != \"\" {\n\t\ta.testObject.Fatal(diff)\n\t}\n}\n\nfunc (a AssertionManager) NotEqual(actual, expected interface{}) {\n\ta.testObject.Helper()\n\n\tif diff 
:= cmp.Diff(actual, expected); diff == \"\" {\n\t\ta.testObject.Fatalf(\"expected values to differ, but both were: %v\", actual)\n\t}\n}\n\nfunc (a AssertionManager) Nil(actual interface{}) {\n\ta.testObject.Helper()\n\n\tif !isNil(actual) {\n\t\ta.testObject.Fatalf(\"expected nil: %v\", actual)\n\t}\n}\n\nfunc (a AssertionManager) Succeeds(actual interface{}) {\n\ta.testObject.Helper()\n\n\ta.Nil(actual)\n}\n\nfunc (a AssertionManager) Fails(actual interface{}) {\n\ta.testObject.Helper()\n\n\ta.NotNil(actual)\n}\n\nfunc (a AssertionManager) NilWithMessage(actual interface{}, message string) {\n\ta.testObject.Helper()\n\n\tif !isNil(actual) {\n\t\ta.testObject.Fatalf(\"expected nil: %s: %s\", actual, message)\n\t}\n}\n\nfunc (a AssertionManager) TrueWithMessage(actual bool, message string) {\n\ta.testObject.Helper()\n\n\tif !actual {\n\t\ta.testObject.Fatalf(\"expected true: %s\", message)\n\t}\n}\n\nfunc (a AssertionManager) NotNil(actual interface{}) {\n\ta.testObject.Helper()\n\n\tif isNil(actual) {\n\t\ta.testObject.Fatal(\"expected not nil\")\n\t}\n}\n\nfunc (a AssertionManager) Contains(actual, expected string) {\n\ta.testObject.Helper()\n\n\tif !strings.Contains(actual, expected) {\n\t\ta.testObject.Fatalf(\n\t\t\t\"Expected '%s' to contain '%s'\\n\\nDiff:%s\",\n\t\t\tactual,\n\t\t\texpected,\n\t\t\tcmp.Diff(expected, actual),\n\t\t)\n\t}\n}\n\nfunc (a AssertionManager) EqualJSON(actualJSON, expectedJSON string) {\n\ta.ContainsJSON(actualJSON, expectedJSON)\n\ta.ContainsJSON(expectedJSON, actualJSON)\n}\n\nfunc (a AssertionManager) ContainsJSON(actualJSON, expectedJSON string) {\n\ta.testObject.Helper()\n\n\tvar actual interface{}\n\terr := json.Unmarshal([]byte(actualJSON), &actual)\n\tif err != nil {\n\t\ta.testObject.Fatalf(\n\t\t\t\"Unable to unmarshal 'actualJSON': %q\", err,\n\t\t)\n\t}\n\n\tvar expected interface{}\n\terr = json.Unmarshal([]byte(expectedJSON), &expected)\n\tif err != nil {\n\t\ta.testObject.Fatalf(\n\t\t\t\"Unable to unmarshal 'expectedJSON': %q\", err,\n\t\t)\n\t}\n\n\tif 
!comparehelpers.DeepContains(actual, expected) {\n\t\texpectedJSONDebug, err := json.Marshal(expected)\n\t\tif err != nil {\n\t\t\ta.testObject.Fatalf(\"unable to render expected failure expectation: %q\", err)\n\t\t}\n\n\t\tactualJSONDebug, err := json.Marshal(actual)\n\t\tif err != nil {\n\t\t\ta.testObject.Fatalf(\"unable to render actual failure expectation: %q\", err)\n\t\t}\n\n\t\tvar prettifiedExpected bytes.Buffer\n\t\terr = json.Indent(&prettifiedExpected, expectedJSONDebug, \"\", \"  \")\n\t\tif err != nil {\n\t\t\ta.testObject.Fatal(\"failed to format expected JSON output\")\n\t\t}\n\n\t\tvar prettifiedActual bytes.Buffer\n\t\terr = json.Indent(&prettifiedActual, actualJSONDebug, \"\", \"  \")\n\t\tif err != nil {\n\t\t\ta.testObject.Fatal(\"failed to format actual JSON output\")\n\t\t}\n\n\t\tactualJSONDiffArray := strings.Split(prettifiedActual.String(), \"\\n\")\n\t\texpectedJSONDiffArray := strings.Split(prettifiedExpected.String(), \"\\n\")\n\n\t\ta.testObject.Fatalf(\n\t\t\t\"Expected '%s' to contain '%s'\\n\\nJSON Diff:%s\",\n\t\t\tprettifiedActual.String(),\n\t\t\tprettifiedExpected.String(),\n\t\t\tcmp.Diff(actualJSONDiffArray, expectedJSONDiffArray),\n\t\t)\n\t}\n}\n\nfunc (a AssertionManager) EqualYAML(actualYAML, expectedYAML string) {\n\ta.ContainsYAML(actualYAML, expectedYAML)\n\ta.ContainsYAML(expectedYAML, actualYAML)\n}\n\nfunc (a AssertionManager) ContainsYAML(actualYAML, expectedYAML string) {\n\ta.testObject.Helper()\n\n\tvar actual interface{}\n\terr := yaml.Unmarshal([]byte(actualYAML), &actual)\n\tif err != nil {\n\t\ta.testObject.Fatalf(\n\t\t\t\"Unable to unmarshal 'actualYAML': %q\", err,\n\t\t)\n\t}\n\n\tvar expected interface{}\n\terr = yaml.Unmarshal([]byte(expectedYAML), &expected)\n\tif err != nil {\n\t\ta.testObject.Fatalf(\n\t\t\t\"Unable to unmarshal 'expectedYAML': %q\", err,\n\t\t)\n\t}\n\n\tif !comparehelpers.DeepContains(actual, expected) {\n\t\texpectedYAMLDebug, err := yaml.Marshal(expected)\n\t\tif 
err != nil {\n\t\t\ta.testObject.Fatalf(\"unable to render expected failure expectation: %q\", err)\n\t\t}\n\n\t\tactualYAMLDebug, err := yaml.Marshal(actual)\n\t\tif err != nil {\n\t\t\ta.testObject.Fatalf(\"unable to render actual failure expectation: %q\", err)\n\t\t}\n\n\t\tactualYAMLDiffArray := strings.Split(string(actualYAMLDebug), \"\\n\")\n\t\texpectedYAMLDiffArray := strings.Split(string(expectedYAMLDebug), \"\\n\")\n\n\t\ta.testObject.Fatalf(\n\t\t\t\"Expected '%s' to contain '%s'\\n\\nDiff:%s\",\n\t\t\tstring(actualYAMLDebug),\n\t\t\tstring(expectedYAMLDebug),\n\t\t\tcmp.Diff(actualYAMLDiffArray, expectedYAMLDiffArray),\n\t\t)\n\t}\n}\n\nfunc (a AssertionManager) EqualTOML(actualTOML, expectedTOML string) {\n\ta.ContainsTOML(actualTOML, expectedTOML)\n\ta.ContainsTOML(expectedTOML, actualTOML)\n}\n\nfunc (a AssertionManager) ContainsTOML(actualTOML, expectedTOML string) {\n\ta.testObject.Helper()\n\n\tvar actual interface{}\n\terr := toml.Unmarshal([]byte(actualTOML), &actual)\n\tif err != nil {\n\t\ta.testObject.Fatalf(\n\t\t\t\"Unable to unmarshal 'actualTOML': %q\", err,\n\t\t)\n\t}\n\n\tvar expected interface{}\n\terr = toml.Unmarshal([]byte(expectedTOML), &expected)\n\tif err != nil {\n\t\ta.testObject.Fatalf(\n\t\t\t\"Unable to unmarshal 'expectedTOML': %q\", err,\n\t\t)\n\t}\n\n\tif !comparehelpers.DeepContains(actual, expected) {\n\t\texpectedJSONDebug, err := json.Marshal(expected)\n\t\tif err != nil {\n\t\t\ta.testObject.Fatalf(\"unable to render expected failure expectation: %q\", err)\n\t\t}\n\n\t\tactualJSONDebug, err := json.Marshal(actual)\n\t\tif err != nil {\n\t\t\ta.testObject.Fatalf(\"unable to render actual failure expectation: %q\", err)\n\t\t}\n\n\t\tvar prettifiedExpected bytes.Buffer\n\t\terr = json.Indent(&prettifiedExpected, expectedJSONDebug, \"\", \"  \")\n\t\tif err != nil {\n\t\t\ta.testObject.Fatal(\"failed to format expected TOML output as JSON\")\n\t\t}\n\n\t\tvar prettifiedActual bytes.Buffer\n\t\terr = 
json.Indent(&prettifiedActual, actualJSONDebug, \"\", \"  \")\n\t\tif err != nil {\n\t\t\ta.testObject.Fatal(\"failed to format actual TOML output as JSON\")\n\t\t}\n\n\t\ta.testObject.Fatalf(\n\t\t\t\"Expected '%s' to contain '%s'\\n\\nJSON Diff:%s\",\n\t\t\tprettifiedActual.String(),\n\t\t\tprettifiedExpected.String(),\n\t\t\tcmp.Diff(prettifiedActual.String(), prettifiedExpected.String()),\n\t\t)\n\t}\n}\n\nfunc (a AssertionManager) ContainsF(actual, expected string, formatArgs ...interface{}) {\n\ta.testObject.Helper()\n\n\ta.Contains(actual, fmt.Sprintf(expected, formatArgs...))\n}\n\n// ContainsWithMessage will fail if expected is not contained within actual, messageFormat will be printed as the\n// failure message, with actual interpolated in the message\nfunc (a AssertionManager) ContainsWithMessage(actual, expected, messageFormat string) {\n\ta.testObject.Helper()\n\n\tif !strings.Contains(actual, expected) {\n\t\ta.testObject.Fatalf(messageFormat, actual)\n\t}\n}\n\nfunc (a AssertionManager) ContainsAll(actual string, expected ...string) {\n\ta.testObject.Helper()\n\n\tfor _, e := range expected {\n\t\ta.Contains(actual, e)\n\t}\n}\n\nfunc (a AssertionManager) Matches(actual string, pattern *regexp.Regexp) {\n\ta.testObject.Helper()\n\n\tif !pattern.MatchString(actual) {\n\t\ta.testObject.Fatalf(\"Expected '%s' to match regex '%s'\", actual, pattern)\n\t}\n}\n\nfunc (a AssertionManager) NoMatches(actual string, pattern *regexp.Regexp) {\n\ta.testObject.Helper()\n\n\tif pattern.MatchString(actual) {\n\t\ta.testObject.Fatalf(\"Expected '%s' not to match regex '%s'\", actual, pattern)\n\t}\n}\n\nfunc (a AssertionManager) MatchesAll(actual string, patterns ...*regexp.Regexp) {\n\ta.testObject.Helper()\n\n\tfor _, pattern := range patterns {\n\t\ta.Matches(actual, pattern)\n\t}\n}\n\nfunc (a AssertionManager) NotContains(actual, expected string) {\n\ta.testObject.Helper()\n\n\tif strings.Contains(actual, expected) {\n\t\ta.testObject.Fatalf(\"Expected '%s' not 
to be in '%s'\", expected, actual)\n\t}\n}\n\n// NotContainWithMessage will fail if expected is contained within actual, messageFormat will be printed as the failure\n// message, with actual interpolated in the message\nfunc (a AssertionManager) NotContainWithMessage(actual, expected, messageFormat string) {\n\ta.testObject.Helper()\n\n\tif strings.Contains(actual, expected) {\n\t\ta.testObject.Fatalf(messageFormat, actual)\n\t}\n}\n\n// Error checks that the provided value is an error (non-nil)\nfunc (a AssertionManager) Error(actual error) {\n\ta.testObject.Helper()\n\n\tif actual == nil {\n\t\ta.testObject.Fatal(\"Expected an error but got nil\")\n\t}\n}\n\nfunc (a AssertionManager) ErrorContains(actual error, expected string) {\n\ta.testObject.Helper()\n\n\tif actual == nil {\n\t\ta.testObject.Fatalf(\"Expected an error containing %q but got nil\", expected)\n\t}\n\n\ta.Contains(actual.Error(), expected)\n}\n\nfunc (a AssertionManager) ErrorWithMessage(actual error, message string) {\n\ta.testObject.Helper()\n\n\ta.Error(actual)\n\ta.Equal(actual.Error(), message)\n}\n\nfunc (a AssertionManager) ErrorWithMessageF(actual error, format string, args ...interface{}) {\n\ta.testObject.Helper()\n\n\ta.ErrorWithMessage(actual, fmt.Sprintf(format, args...))\n}\n"
  },
  {
    "path": "testhelpers/comparehelpers/deep_compare.go",
    "content": "package comparehelpers\n\nimport (\n\t\"fmt\"\n\t\"reflect\"\n)\n\n// DeepContains returns true if 'containee' is contained in 'container'\n// Note this method searches all objects in 'container' for containee\n// Contains is defined by the following relationship\n// basic data types (string, float, int,...):\n//\n//\tcontainer == containee\n//\n// maps:\n//\n//\tevery key-value pair from containee is in container\n//\tEx: {\"a\": 1, \"b\": 2, \"c\": 3} contains {\"a\": 1, \"c\": 3}\n//\n// arrays:\n//\n//\tevery element in containee is present and ordered in an array in container\n//\tEx: [1, 1, 4, 3, 10, 4] contains [1, 3, 4]\n//\n// Limitations:\n// Cannot handle the following types: Pointers, Func\n// Assumes we are comparing structs generated from JSON, YAML, or TOML.\nfunc DeepContains(container, containee interface{}) bool {\n\tif container == nil || containee == nil {\n\t\treturn container == containee\n\t}\n\tv1 := reflect.ValueOf(container)\n\tv2 := reflect.ValueOf(containee)\n\n\treturn deepContains(v1, v2, 0)\n}\n\nfunc deepContains(v1, v2 reflect.Value, depth int) bool {\n\tif depth > 200 {\n\t\tpanic(\"DeepContains depth exceeded, likely a circular reference\")\n\t}\n\tif !v1.IsValid() || !v2.IsValid() {\n\t\treturn v1.IsValid() == v2.IsValid()\n\t}\n\n\tswitch v1.Kind() {\n\tcase reflect.Array, reflect.Slice:\n\t\t// check for subset matches in arrays\n\t\treturn arrayLikeContains(v1, v2, depth+1)\n\tcase reflect.Map:\n\t\treturn mapContains(v1, v2, depth+1)\n\tcase reflect.Interface:\n\t\treturn deepContains(v1.Elem(), v2, depth+1)\n\tcase reflect.Ptr, reflect.Struct, reflect.Func:\n\t\tpanic(fmt.Sprintf(\"unimplemented comparison for type: %s\", v1.Kind().String()))\n\tdefault: // assume it is an atomic datatype\n\t\treturn reflect.DeepEqual(v1, v2)\n\t}\n}\n\nfunc mapContains(v1, v2 reflect.Value, depth int) bool {\n\tt2 := v2.Kind()\n\tif t2 == reflect.Interface {\n\t\treturn mapContains(v1, v2.Elem(), depth+1)\n\t} else if t2 == reflect.Map 
{\n\t\tresult := true\n\t\tfor _, k := range v2.MapKeys() {\n\t\t\tk2Val := v2.MapIndex(k)\n\t\t\tk1Val := v1.MapIndex(k)\n\t\t\tif !k1Val.IsValid() || !reflect.DeepEqual(k1Val.Interface(), k2Val.Interface()) {\n\t\t\t\tresult = false\n\t\t\t\tbreak\n\t\t\t}\n\t\t}\n\t\tif result {\n\t\t\treturn true\n\t\t}\n\t}\n\tfor _, k := range v1.MapKeys() {\n\t\tval := v1.MapIndex(k)\n\t\tif deepContains(val, v2, depth+1) {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n\nfunc arrayLikeContains(v1, v2 reflect.Value, depth int) bool {\n\tswitch v2.Kind() {\n\tcase reflect.Interface:\n\t\treturn mapContains(v1, v2.Elem(), depth+1)\n\tcase reflect.Array, reflect.Slice:\n\t\tv1Index := 0\n\t\tv2Index := 0\n\t\tfor v1Index < v1.Len() && v2Index < v2.Len() {\n\t\t\tif reflect.DeepEqual(v1.Index(v1Index).Interface(), v2.Index(v2Index).Interface()) {\n\t\t\t\tv2Index++\n\t\t\t}\n\t\t\tv1Index++\n\t\t}\n\t\tif v2Index == v2.Len() {\n\t\t\treturn true\n\t\t}\n\t}\n\tfor i := 0; i < v1.Len(); i++ {\n\t\tif deepContains(v1.Index(i), v2, depth+1) {\n\t\t\treturn true\n\t\t}\n\t}\n\treturn false\n}\n"
  },
  {
    "path": "testhelpers/comparehelpers/deep_compare_test.go",
    "content": "package comparehelpers_test\n\nimport (\n\t\"encoding/json\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/pack/testhelpers\"\n\t\"github.com/buildpacks/pack/testhelpers/comparehelpers\"\n\n\t\"github.com/heroku/color\"\n\t\"github.com/sclevine/spec\"\n\t\"github.com/sclevine/spec/report\"\n)\n\nfunc TestDeepContains(t *testing.T) {\n\tcolor.Disable(true)\n\tdefer color.Disable(false)\n\tspec.Run(t, \"Builder Writer\", testDeepContains, spec.Parallel(), spec.Report(report.Terminal{}))\n}\n\nfunc testDeepContains(t *testing.T, when spec.G, it spec.S) {\n\tvar (\n\t\tassert = testhelpers.NewAssertionManager(t)\n\t)\n\twhen(\"DeepContains\", func() {\n\t\tvar (\n\t\t\tcontainerJSON string\n\t\t\tcontainer     interface{}\n\t\t)\n\t\twhen(\"Searching for array containment\", func() {\n\t\t\tit.Before(func() {\n\t\t\t\tcontainerJSON = `[\n\t{\n\t  \"Name\": \"Platypus\",\n\t  \"Order\": \"Monotremata\",\n      \"Info\":  [\n\t\t{\n\t\t\t\"Population\": 5000,\n\t\t\t\"Habitat\": [\"splish-spash\", \"waters\"]\n\t\t},\n\t\t{\n\t\t\t\"Geography\" : \"Moon\"\n\t\t},\n\t\t{\n\t\t\t\"Discography\": \"My records are all platynum\"\n\t\t}\n\t  ]\n\t},\n\t{\n\t  \"Name\": \"Quoll\",\n\t  \"Order\": \"Dasyuromorphia\",\n\t  \"Info\": []\n\t}\n]`\n\n\t\t\t\tassert.Succeeds(json.Unmarshal([]byte(containerJSON), &container))\n\t\t\t})\n\t\t\twhen(\"subarray is contained\", func() {\n\t\t\t\tit(\"return true\", func() {\n\t\t\t\t\tcontainedJSON := `[{ \"Geography\":\"Moon\" }, {\"Discography\": \"My records are all platynum\"}]`\n\n\t\t\t\t\tvar contained interface{}\n\t\t\t\t\tassert.Succeeds(json.Unmarshal([]byte(containedJSON), &contained))\n\n\t\t\t\t\tout := comparehelpers.DeepContains(container, contained)\n\t\t\t\t\tassert.Equal(out, true)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"subarray is not contained\", func() {\n\t\t\t\tit(\"returns false\", func() {\n\t\t\t\t\tcontainedJSON := `[{ \"Geography\":\"Moon\" }, {\"Discography\": \"Splish-splash Cash 
III\"}]`\n\n\t\t\t\t\tvar contained interface{}\n\t\t\t\t\tassert.Succeeds(json.Unmarshal([]byte(containedJSON), &contained))\n\n\t\t\t\t\tout := comparehelpers.DeepContains(container, contained)\n\t\t\t\t\tassert.Equal(out, false)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t\twhen(\"Searching for map containment\", func() {\n\t\t\tvar (\n\t\t\t\tcontainerJSON string\n\t\t\t\tcontainer     interface{}\n\t\t\t)\n\t\t\tit.Before(func() {\n\t\t\t\tcontainerJSON = `[\n\t{\n\t  \"Name\": \"Platypus\",\n\t  \"Order\": \"Monotremata\",\n      \"Info\":  [\n\t\t{\n\t\t\t\"Population\": 5000,\n\t\t\t\"Size\": \"smol\",\n\t\t\t\"Habitat\": [\"shallow\", \"waters\"]\n\t\t},\n\t\t{\n\t\t\t\"Geography\" : \"Moon\"\n\t\t},\n\t\t{\n\t\t\t\"Discography\": \"My records are all platynum\"\n\t\t}\n\t  ]\n\t},\n\t{\n\t  \"Name\": \"Quoll\",\n\t  \"Order\": \"Dasyuromorphia\",\n\t  \"Info\": []\n\t}\n]`\n\t\t\t\tassert.Succeeds(json.Unmarshal([]byte(containerJSON), &container))\n\t\t\t})\n\t\t\twhen(\"map is contained\", func() {\n\t\t\t\tit(\"returns true\", func() {\n\t\t\t\t\tcontainedJSON := `{\"Population\": 5000, \"Size\": \"smol\"}`\n\t\t\t\t\tvar contained interface{}\n\t\t\t\t\tassert.Succeeds(json.Unmarshal([]byte(containedJSON), &contained))\n\n\t\t\t\t\tout := comparehelpers.DeepContains(container, contained)\n\t\t\t\t\tassert.Equal(out, true)\n\t\t\t\t})\n\t\t\t})\n\t\t\twhen(\"map is not contained\", func() {\n\t\t\t\tit(\"returns false\", func() {\n\t\t\t\t\tcontainedJSON := `{\"Order\": \"Nemotode\"}`\n\t\t\t\t\tvar contained interface{}\n\t\t\t\t\tassert.Succeeds(json.Unmarshal([]byte(containedJSON), &contained))\n\n\t\t\t\t\tout := comparehelpers.DeepContains(container, contained)\n\t\t\t\t\tassert.Equal(out, false)\n\t\t\t\t})\n\t\t\t})\n\t\t})\n\t})\n\twhen(\"json is not contained\", func() {\n\t\tit(\"return false\", func() {\n\t\t\tcontainerJSON := `[\n\t{\"Name\": \"Platypus\", \"Order\": \"Monotremata\"},\n\t{\"Name\": \"Quoll\",    \"Order\": 
\"Dasyuromorphia\"}\n]`\n\t\t\tvar container interface{}\n\t\t\tassert.Succeeds(json.Unmarshal([]byte(containerJSON), &container))\n\n\t\t\tcontainedJSON := `[{\"Name\": \"Notapus\", \"Order\": \"Monotremata\"}]`\n\n\t\t\tvar contained interface{}\n\t\t\tassert.Succeeds(json.Unmarshal([]byte(containedJSON), &contained))\n\n\t\t\tout := comparehelpers.DeepContains(container, contained)\n\t\t\tassert.Equal(out, false)\n\t\t})\n\t})\n}\n"
  },
  {
    "path": "testhelpers/image_index.go",
    "content": "package testhelpers\n\nimport (\n\t\"encoding/json\"\n\t\"errors\"\n\t\"fmt\"\n\t\"io\"\n\t\"net/http\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\n\t\"github.com/buildpacks/imgutil\"\n\t\"github.com/buildpacks/imgutil/fakes\"\n\timgutilRemote \"github.com/buildpacks/imgutil/remote\"\n\t\"github.com/google/go-containerregistry/pkg/authn\"\n\t\"github.com/google/go-containerregistry/pkg/name\"\n\tv1 \"github.com/google/go-containerregistry/pkg/v1\"\n\t\"github.com/google/go-containerregistry/pkg/v1/random\"\n\t\"github.com/google/go-containerregistry/pkg/v1/remote\"\n\t\"github.com/google/go-containerregistry/pkg/v1/types\"\n)\n\nfunc NewRandomIndexRepoName() string {\n\treturn \"test-index-\" + RandString(10)\n}\n\nfunc AssertPathExists(t *testing.T, path string) {\n\tt.Helper()\n\t_, err := os.Stat(path)\n\tif os.IsNotExist(err) {\n\t\tt.Errorf(\"Expected %q to exist\", path)\n\t} else if err != nil {\n\t\tt.Fatalf(\"Error stating %q: %v\", path, err)\n\t}\n}\n\nfunc AssertPathDoesNotExists(t *testing.T, path string) {\n\tt.Helper()\n\t_, err := os.Stat(path)\n\tif err == nil {\n\t\tt.Errorf(\"Expected %q to not exist\", path)\n\t}\n}\n\nfunc FetchImageIndexDescriptor(t *testing.T, repoName string) v1.ImageIndex {\n\tt.Helper()\n\n\tr, err := name.ParseReference(repoName, name.WeakValidation)\n\tAssertNil(t, err)\n\n\tauth, err := authn.DefaultKeychain.Resolve(r.Context().Registry)\n\tAssertNil(t, err)\n\n\tindex, err := remote.Index(r, remote.WithTransport(http.DefaultTransport), remote.WithAuth(auth))\n\tAssertNil(t, err)\n\n\treturn index\n}\n\nfunc AssertRemoteImageIndex(t *testing.T, repoName string, mediaType types.MediaType, expectedNumberOfManifests int) {\n\tt.Helper()\n\n\tremoteIndex := FetchImageIndexDescriptor(t, repoName)\n\tAssertNotNil(t, remoteIndex)\n\tremoteIndexMediaType, err := remoteIndex.MediaType()\n\tAssertNil(t, err)\n\tAssertEq(t, remoteIndexMediaType, mediaType)\n\tremoteIndexManifest, err := 
remoteIndex.IndexManifest()\n\tAssertNil(t, err)\n\tAssertNotNil(t, remoteIndexManifest)\n\tAssertEq(t, len(remoteIndexManifest.Manifests), expectedNumberOfManifests)\n}\n\nfunc CreateRemoteImage(t *testing.T, repoName, tag, baseImage string) *imgutilRemote.Image {\n\timg1RepoName := fmt.Sprintf(\"%s:%s\", repoName, tag)\n\timg1, err := imgutilRemote.NewImage(img1RepoName, authn.DefaultKeychain, imgutilRemote.FromBaseImage(baseImage))\n\tAssertNil(t, err)\n\terr = img1.Save()\n\tAssertNil(t, err)\n\treturn img1\n}\n\nfunc ReadIndexManifest(t *testing.T, path string) *v1.IndexManifest {\n\tt.Helper()\n\n\tindexPath := filepath.Join(path, \"index.json\")\n\tAssertPathExists(t, filepath.Join(path, \"oci-layout\"))\n\tAssertPathExists(t, indexPath)\n\n\t// check index file\n\tdata, err := os.ReadFile(indexPath)\n\tAssertNil(t, err)\n\n\tindex := &v1.IndexManifest{}\n\terr = json.Unmarshal(data, index)\n\tAssertNil(t, err)\n\treturn index\n}\n\nfunc RandomCNBIndex(t *testing.T, repoName string, layers, count int64) *imgutil.CNBIndex {\n\tt.Helper()\n\n\trandomIndex, err := random.Index(1024, layers, count)\n\tAssertNil(t, err)\n\toptions := &imgutil.IndexOptions{\n\t\tBaseIndex: randomIndex,\n\t\tLayoutIndexOptions: imgutil.LayoutIndexOptions{\n\t\t\tXdgPath: os.Getenv(\"XDG_RUNTIME_DIR\"),\n\t\t},\n\t}\n\tidx, err := imgutil.NewCNBIndex(repoName, *options)\n\tAssertNil(t, err)\n\treturn idx\n}\n\nfunc RandomCNBIndexAndDigest(t *testing.T, repoName string, layers, count int64) (idx imgutil.ImageIndex, digest name.Digest) {\n\tidx = RandomCNBIndex(t, repoName, layers, count)\n\n\timgIdx, ok := idx.(*imgutil.CNBIndex)\n\tAssertEq(t, ok, true)\n\n\tmfest, err := imgIdx.IndexManifest()\n\tAssertNil(t, err)\n\n\tdigest, err = name.NewDigest(fmt.Sprintf(\"%s@%s\", repoName, mfest.Manifests[0].Digest.String()))\n\tAssertNil(t, err)\n\n\treturn idx, digest\n}\n\n// MockImageIndex wraps a real CNBIndex to record whether some key methods are invoked\ntype MockImageIndex struct 
{\n\timgutil.CNBIndex\n\tErrorOnSave     bool\n\tPushCalled      bool\n\tPurgeOption     bool\n\tDeleteDirCalled bool\n}\n\n// NewMockImageIndex creates a random index with the given number of layers and manifests count\nfunc NewMockImageIndex(t *testing.T, repoName string, layers, count int64) *MockImageIndex {\n\tcnbIdx := RandomCNBIndex(t, repoName, layers, count)\n\tidx := &MockImageIndex{\n\t\tCNBIndex: *cnbIdx,\n\t}\n\treturn idx\n}\n\nfunc (i *MockImageIndex) SaveDir() error {\n\tif i.ErrorOnSave {\n\t\treturn errors.New(\"something failed writing the index on disk\")\n\t}\n\treturn i.CNBIndex.SaveDir()\n}\n\nfunc (i *MockImageIndex) Push(ops ...imgutil.IndexOption) error {\n\tvar pushOps = &imgutil.IndexOptions{}\n\tfor _, op := range ops {\n\t\tif err := op(pushOps); err != nil {\n\t\t\treturn err\n\t\t}\n\t}\n\n\ti.PushCalled = true\n\ti.PurgeOption = pushOps.Purge\n\treturn nil\n}\n\nfunc (i *MockImageIndex) DeleteDir() error {\n\ti.DeleteDirCalled = true\n\treturn nil\n}\n\nfunc NewFakeWithRandomUnderlyingV1Image(t *testing.T, repoName string, identifier imgutil.Identifier) *FakeWithRandomUnderlyingImage {\n\tfakeCNBImage := fakes.NewImage(repoName, \"\", identifier)\n\tunderlyingImage, err := random.Image(1024, 1)\n\tAssertNil(t, err)\n\treturn &FakeWithRandomUnderlyingImage{\n\t\tImage:           fakeCNBImage,\n\t\tunderlyingImage: underlyingImage,\n\t}\n}\n\ntype FakeWithRandomUnderlyingImage struct {\n\t*fakes.Image\n\tunderlyingImage v1.Image\n}\n\nfunc (t *FakeWithRandomUnderlyingImage) UnderlyingImage() v1.Image {\n\treturn t.underlyingImage\n}\n\nfunc (t *FakeWithRandomUnderlyingImage) GetLayer(sha string) (io.ReadCloser, error) {\n\thash, err := v1.NewHash(sha)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\n\tlayer, err := t.UnderlyingImage().LayerByDiffID(hash)\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\treturn layer.Uncompressed()\n}\n\n// SetUpRandomRemoteIndexWithPlatforms creates an image index with platform-specific images and pushes 
it to the registry\n// Uses imgutil for both image creation and index management, following the pattern from manifest_create.go\nfunc SetUpRandomRemoteIndexWithPlatforms(t *testing.T, indexRepoName string, platforms []struct{ OS, Arch string }) {\n\tt.Helper()\n\n\t// Create platform-specific images using imgutil and collect their identifiers\n\tvar imageDigests []string\n\tfor _, platform := range platforms {\n\t\tplatformTag := fmt.Sprintf(\"%s-%s\", platform.OS, platform.Arch)\n\t\tplatformImageName := fmt.Sprintf(\"%s:%s\", indexRepoName, platformTag)\n\n\t\t// Use imgutil to create image with proper platform\n\t\timg, err := imgutilRemote.NewImage(platformImageName, authn.DefaultKeychain, imgutilRemote.WithDefaultPlatform(imgutil.Platform{\n\t\t\tOS:           platform.OS,\n\t\t\tArchitecture: platform.Arch,\n\t\t}))\n\t\tAssertNil(t, err)\n\t\tAssertNil(t, img.Save())\n\n\t\t// Extract the digest identifier\n\t\tid, err := img.Identifier()\n\t\tAssertNil(t, err)\n\t\timageDigests = append(imageDigests, id.String())\n\t}\n\n\t// Create a CNBIndex (similar to indexFactory.CreateIndex in manifest_create.go)\n\ttmpDir, err := os.MkdirTemp(\"\", \"index-test\")\n\tAssertNil(t, err)\n\tdefer os.RemoveAll(tmpDir)\n\n\tidx, err := imgutil.NewCNBIndex(indexRepoName, imgutil.IndexOptions{\n\t\tRemoteIndexOptions: imgutil.RemoteIndexOptions{\n\t\t\tKeychain: authn.DefaultKeychain,\n\t\t},\n\t\tLayoutIndexOptions: imgutil.LayoutIndexOptions{\n\t\t\tXdgPath: tmpDir,\n\t\t},\n\t})\n\tAssertNil(t, err)\n\n\t// Add each image to the index (similar to addManifestToIndex in common.go)\n\tfor i, digestStr := range imageDigests {\n\t\t// Fetch the image using the digest\n\t\timageToAdd, err := imgutilRemote.NewImage(digestStr, authn.DefaultKeychain, imgutilRemote.FromBaseImage(digestStr))\n\t\tAssertNil(t, err)\n\n\t\t// Add the underlying v1.Image to the index\n\t\tidx.AddManifest(imageToAdd.UnderlyingImage())\n\n\t\t// Set platform metadata for the manifest\n\t\tdigestRef, err 
:= name.NewDigest(digestStr)\n\t\tAssertNil(t, err)\n\n\t\terr = idx.SetOS(digestRef, platforms[i].OS)\n\t\tAssertNil(t, err)\n\t\terr = idx.SetArchitecture(digestRef, platforms[i].Arch)\n\t\tAssertNil(t, err)\n\t}\n\n\t// Push the index to the registry (similar to manifest_create.go with Publish option)\n\terr = idx.Push(imgutil.WithPurge(true), imgutil.WithMediaType(types.OCIImageIndex))\n\tAssertNil(t, err)\n}\n"
  },
  {
    "path": "testhelpers/registry.go",
    "content": "package testhelpers\n\nimport (\n\t\"context\"\n\t\"encoding/base64\"\n\t\"fmt\"\n\t\"io\"\n\t\"net\"\n\t\"net/url\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/go-git/go-git/v5\"\n\t\"github.com/go-git/go-git/v5/plumbing/object\"\n\tdockercontainer \"github.com/moby/moby/api/types/container\"\n\tdockernetwork \"github.com/moby/moby/api/types/network\"\n\tdockerregistry \"github.com/moby/moby/api/types/registry\"\n\t\"github.com/moby/moby/client\"\n\t\"golang.org/x/crypto/bcrypt\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n)\n\nvar registryContainerNames = map[string]string{\n\t\"linux\":   \"library/registry:2\",\n\t\"windows\": \"micahyoung/registry:latest\",\n}\n\ntype TestRegistryConfig struct {\n\trunRegistryName       string\n\tregistryContainerName string\n\tRunRegistryHost       string\n\tRunRegistryPort       string\n\tDockerConfigDir       string\n\tusername              string\n\tpassword              string\n}\n\nfunc RegistryHost(host, port string) string {\n\treturn fmt.Sprintf(\"%s:%s\", host, port)\n}\n\nfunc CreateRegistryFixture(t *testing.T, tmpDir, fixturePath string) string {\n\tt.Helper()\n\t// copy fixture to temp dir\n\tregistryFixtureCopy := filepath.Join(tmpDir, \"registryCopy\")\n\n\tRecursiveCopyNow(t, fixturePath, registryFixtureCopy)\n\n\t// git init that dir\n\trepository, err := git.PlainInit(registryFixtureCopy, false)\n\tAssertNil(t, err)\n\n\t// git add . 
that dir\n\tworktree, err := repository.Worktree()\n\tAssertNil(t, err)\n\n\t_, err = worktree.Add(\".\")\n\tAssertNil(t, err)\n\n\t// git commit that dir\n\tcommit, err := worktree.Commit(\"first\", &git.CommitOptions{\n\t\tAuthor: &object.Signature{\n\t\t\tName:  \"John Doe\",\n\t\t\tEmail: \"john@doe.org\",\n\t\t\tWhen:  time.Now(),\n\t\t},\n\t})\n\tAssertNil(t, err)\n\n\t_, err = repository.CommitObject(commit)\n\tAssertNil(t, err)\n\n\treturn registryFixtureCopy\n}\n\nfunc RunRegistry(t *testing.T) *TestRegistryConfig {\n\tt.Log(\"run registry\")\n\tt.Helper()\n\n\trunRegistryName := \"test-registry-\" + RandString(10)\n\tusername := RandString(10)\n\tpassword := RandString(10)\n\n\trunRegistryHost, runRegistryPort, registryCtnrName := startRegistry(t, runRegistryName, username, password)\n\tdockerConfigDir := setupDockerConfigWithAuth(t, username, password, runRegistryHost, runRegistryPort)\n\n\tregistryConfig := &TestRegistryConfig{\n\t\trunRegistryName:       runRegistryName,\n\t\tregistryContainerName: registryCtnrName,\n\t\tRunRegistryHost:       runRegistryHost,\n\t\tRunRegistryPort:       runRegistryPort,\n\t\tDockerConfigDir:       dockerConfigDir,\n\t\tusername:              username,\n\t\tpassword:              password,\n\t}\n\n\twaitForRegistryToBeAvailable(t, registryConfig)\n\n\treturn registryConfig\n}\n\nfunc waitForRegistryToBeAvailable(t *testing.T, registryConfig *TestRegistryConfig) {\n\tctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)\n\tdefer cancel()\n\tfor {\n\t\t_, err := registryConfig.RegistryCatalog()\n\t\tif err == nil {\n\t\t\tbreak\n\t\t}\n\n\t\tctxErr := ctx.Err()\n\t\tif ctxErr != nil {\n\t\t\tt.Fatal(\"registry not ready:\", ctxErr.Error(), \":\", err.Error())\n\t\t}\n\n\t\t// poll every 500ms until the catalog endpoint responds or the context times out\n\t\ttime.Sleep(500 * time.Millisecond)\n\t}\n}\n\nfunc (rc *TestRegistryConfig) AuthConfig() dockerregistry.AuthConfig {\n\treturn dockerregistry.AuthConfig{\n\t\tUsername:      rc.username,\n\t\tPassword:      
rc.password,\n\t\tServerAddress: RegistryHost(rc.RunRegistryHost, rc.RunRegistryPort),\n\t}\n}\n\nfunc (rc *TestRegistryConfig) Login(t *testing.T, username string, password string) {\n\tEventually(t, func() bool {\n\t\t_, err := dockerCli(t).RegistryLogin(context.Background(), client.RegistryLoginOptions{\n\t\t\tUsername:      username,\n\t\t\tPassword:      password,\n\t\t\tServerAddress: RegistryHost(rc.RunRegistryHost, rc.RunRegistryPort),\n\t\t})\n\t\treturn err == nil\n\t}, 100*time.Millisecond, 10*time.Second)\n}\n\nfunc startRegistry(t *testing.T, runRegistryName, username, password string) (string, string, string) {\n\tctx := context.Background()\n\n\tdaemonInfoResult, err := dockerCli(t).Info(ctx, client.InfoOptions{})\n\tAssertNil(t, err)\n\n\tregistryContainerName := registryContainerNames[daemonInfoResult.Info.OSType]\n\tAssertNil(t, PullImageWithAuth(dockerCli(t), registryContainerName, \"\"))\n\n\thtpasswdTar := generateHtpasswd(t, username, password)\n\tdefer htpasswdTar.Close()\n\n\tctrResult, err := dockerCli(t).ContainerCreate(ctx, client.ContainerCreateOptions{\n\t\tName: runRegistryName,\n\t\tConfig: &dockercontainer.Config{\n\t\t\tImage:  registryContainerName,\n\t\t\tLabels: map[string]string{\"author\": \"pack\"},\n\t\t\tEnv: []string{\n\t\t\t\t\"REGISTRY_AUTH=htpasswd\",\n\t\t\t\t\"REGISTRY_AUTH_HTPASSWD_REALM=Registry Realm\",\n\t\t\t\t\"REGISTRY_AUTH_HTPASSWD_PATH=/registry_test_htpasswd\",\n\t\t\t},\n\t\t},\n\t\tHostConfig: &dockercontainer.HostConfig{\n\t\t\tAutoRemove: true,\n\t\t\tPortBindings: dockernetwork.PortMap{\n\t\t\t\tdockernetwork.MustParsePort(\"5000/tcp\"): []dockernetwork.PortBinding{{HostPort: \"0\"}},\n\t\t\t},\n\t\t},\n\t})\n\tAssertNil(t, err)\n\n\t_, err = dockerCli(t).CopyToContainer(ctx, ctrResult.ID, client.CopyToContainerOptions{\n\t\tDestinationPath: \"/\",\n\t\tContent:         htpasswdTar,\n\t})\n\tAssertNil(t, err)\n\n\t_, err = dockerCli(t).ContainerStart(ctx, ctrResult.ID, 
client.ContainerStartOptions{})\n\tAssertNil(t, err)\n\n\trunRegistryPort, err := waitForPortBinding(t, ctrResult.ID, \"5000/tcp\", 30*time.Second)\n\tAssertNil(t, err)\n\n\trunRegistryHost := DockerHostname(t)\n\treturn runRegistryHost, runRegistryPort, registryContainerName\n}\n\nfunc waitForPortBinding(t *testing.T, containerID, portSpec string, duration time.Duration) (binding string, err error) {\n\tt.Helper()\n\tticker := time.NewTicker(500 * time.Millisecond)\n\tdefer ticker.Stop()\n\ttimer := time.NewTimer(duration)\n\tdefer timer.Stop()\n\n\tfor {\n\t\tselect {\n\t\tcase <-ticker.C:\n\t\t\tinspectResult, err := dockerCli(t).ContainerInspect(context.TODO(), containerID, client.ContainerInspectOptions{})\n\t\t\tif err != nil {\n\t\t\t\treturn \"\", err\n\t\t\t}\n\t\t\tinspect := inspectResult.Container\n\n\t\t\tportPort, _ := dockernetwork.ParsePort(portSpec)\n\t\t\tportBindings := inspect.NetworkSettings.Ports[portPort]\n\t\t\tif len(portBindings) > 0 {\n\t\t\t\treturn portBindings[0].HostPort, nil\n\t\t\t}\n\t\tcase <-timer.C:\n\t\t\tt.Fatalf(\"timeout waiting for port binding: %v\", duration)\n\t\t}\n\t}\n}\n\nfunc DockerHostname(t *testing.T) string {\n\tdockerCli := dockerCli(t)\n\n\tdaemonHost := dockerCli.DaemonHost()\n\tu, err := url.Parse(daemonHost)\n\tif err != nil {\n\t\tt.Fatalf(\"unable to parse URI client.DaemonHost: %s\", err)\n\t}\n\n\tswitch u.Scheme {\n\t// DOCKER_HOST is usually remote so always use its hostname/IP\n\t// Note: requires \"insecure-registries\" CIDR entry on Daemon config\n\tcase \"tcp\":\n\t\treturn u.Hostname()\n\n\t// if DOCKER_HOST is non-tcp, we assume that we are\n\t// talking to the daemon over a local pipe.\n\tdefault:\n\t\tdaemonInfoResult, err := dockerCli.Info(context.TODO(), client.InfoOptions{})\n\t\tif err != nil {\n\t\t\tt.Fatalf(\"unable to fetch client.DockerInfo: %s\", err)\n\t\t}\n\n\t\tif daemonInfoResult.Info.OSType == \"windows\" {\n\t\t\t// try to lookup the host IP by helper domain name 
(https://docs.docker.com/docker-for-windows/networking/#use-cases-and-workarounds)\n\t\t\t// Note: pack appears to not support /etc/hosts-based insecure-registries\n\t\t\taddrs, err := net.LookupHost(\"host.docker.internal\")\n\t\t\tif err != nil {\n\t\t\t\tt.Fatalf(\"unknown address response: %+v %s\", addrs, err)\n\t\t\t}\n\t\t\tif len(addrs) != 1 {\n\t\t\t\tt.Fatalf(\"ambiguous address response: %v\", addrs)\n\t\t\t}\n\t\t\treturn addrs[0]\n\t\t}\n\n\t\t// Linux can use --network=host so always use \"localhost\"\n\t\treturn \"localhost\"\n\t}\n}\n\nfunc generateHtpasswd(t *testing.T, username string, password string) io.ReadCloser {\n\t// https://docs.docker.com/registry/deploying/#restricting-access\n\t// HTPASSWD format: https://github.com/foomo/htpasswd/blob/e3a90e78da9cff06a83a78861847aa9092cbebdd/hashing.go#L23\n\tpasswordBytes, _ := bcrypt.GenerateFromPassword([]byte(password), bcrypt.DefaultCost)\n\treader := archive.CreateSingleFileTarReader(\"/registry_test_htpasswd\", username+\":\"+string(passwordBytes))\n\treturn reader\n}\n\nfunc setupDockerConfigWithAuth(t *testing.T, username string, password string, runRegistryHost string, runRegistryPort string) string {\n\tdockerConfigDir, err := os.MkdirTemp(\"\", \"pack.test.docker.config.dir\")\n\tAssertNil(t, err)\n\n\tAssertNil(t, os.WriteFile(filepath.Join(dockerConfigDir, \"config.json\"), []byte(fmt.Sprintf(`{\n\t\t\t  \"auths\": {\n\t\t\t    \"%s\": {\n\t\t\t      \"auth\": \"%s\"\n\t\t\t    }\n\t\t\t  }\n\t\t\t}\n\t\t\t`, RegistryHost(runRegistryHost, runRegistryPort), encodedUserPass(username, password))), 0666))\n\treturn dockerConfigDir\n}\n\nfunc encodedUserPass(username string, password string) string {\n\treturn base64.StdEncoding.EncodeToString([]byte(fmt.Sprintf(\"%s:%s\", username, password)))\n}\n\nfunc (rc *TestRegistryConfig) RmRegistry(t *testing.T) {\n\trc.StopRegistry(t)\n\n\tt.Log(\"remove registry\")\n\tt.Helper()\n\n\tid := ImageID(t, 
rc.registryContainerName)\n\tDockerRmi(dockerCli(t), id)\n}\n\nfunc (rc *TestRegistryConfig) StopRegistry(t *testing.T) {\n\tt.Log(\"stop registry\")\n\tt.Helper()\n\tdockerCli(t).ContainerKill(context.Background(), rc.runRegistryName, client.ContainerKillOptions{Signal: \"SIGKILL\"})\n\n\terr := os.RemoveAll(rc.DockerConfigDir)\n\tAssertNil(t, err)\n}\n\nfunc (rc *TestRegistryConfig) RepoName(name string) string {\n\treturn RegistryHost(rc.RunRegistryHost, rc.RunRegistryPort) + \"/\" + name\n}\n\nfunc (rc *TestRegistryConfig) RegistryAuth() string {\n\treturn base64.StdEncoding.EncodeToString([]byte(fmt.Sprintf(`{\"username\":\"%s\",\"password\":\"%s\"}`, rc.username, rc.password)))\n}\n\nfunc (rc *TestRegistryConfig) RegistryCatalog() (string, error) {\n\treturn HTTPGetE(fmt.Sprintf(\"http://%s/v2/_catalog\", RegistryHost(rc.RunRegistryHost, rc.RunRegistryPort)), map[string]string{\n\t\t\"Authorization\": \"Basic \" + encodedUserPass(rc.username, rc.password),\n\t})\n}\n"
  },
  {
    "path": "testhelpers/tar_assertions.go",
    "content": "package testhelpers\n\nimport (\n\t\"archive/tar\"\n\t\"bytes\"\n\t\"compress/gzip\"\n\t\"encoding/json\"\n\t\"io\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/pkg/archive\"\n)\n\nvar gzipMagicHeader = []byte{'\\x1f', '\\x8b'}\n\ntype TarEntryAssertion func(t *testing.T, header *tar.Header, data []byte)\n\ntype TarEntriesAssertion func(t *testing.T, header1 *tar.Header, data1 []byte, header2 *tar.Header, data2 []byte)\n\nfunc AssertOnTarEntry(t *testing.T, tarPath, entryPath string, assertFns ...TarEntryAssertion) {\n\tt.Helper()\n\n\ttarFile, err := os.Open(filepath.Clean(tarPath))\n\tAssertNil(t, err)\n\tdefer tarFile.Close()\n\n\theader, data, err := readTarFileEntry(tarFile, entryPath)\n\tAssertNil(t, err)\n\n\tfor _, fn := range assertFns {\n\t\tfn(t, header, data)\n\t}\n}\n\nfunc AssertOnNestedTar(nestedEntryPath string, assertions ...TarEntryAssertion) TarEntryAssertion {\n\treturn func(t *testing.T, _ *tar.Header, data []byte) {\n\t\tt.Helper()\n\n\t\theader, data, err := readTarFileEntry(bytes.NewReader(data), nestedEntryPath)\n\t\tAssertNil(t, err)\n\n\t\tfor _, assertion := range assertions {\n\t\t\tassertion(t, header, data)\n\t\t}\n\t}\n}\n\nfunc AssertOnTarEntries(t *testing.T, tarPath string, entryPath1, entryPath2 string, assertFns ...TarEntriesAssertion) {\n\tt.Helper()\n\n\ttarFile, err := os.Open(filepath.Clean(tarPath))\n\tAssertNil(t, err)\n\tdefer tarFile.Close()\n\n\theader1, data1, err := readTarFileEntry(tarFile, entryPath1)\n\tAssertNil(t, err)\n\n\t_, err = tarFile.Seek(0, io.SeekStart)\n\tAssertNil(t, err)\n\n\theader2, data2, err := readTarFileEntry(tarFile, entryPath2)\n\tAssertNil(t, err)\n\n\tfor _, fn := range assertFns {\n\t\tfn(t, header1, data1, header2, data2)\n\t}\n}\n\nfunc readTarFileEntry(reader io.Reader, entryPath string) (*tar.Header, []byte, error) {\n\tvar (\n\t\tgzipReader *gzip.Reader\n\t\terr        
error\n\t)\n\n\theaderBytes, isGzipped, err := isGzipped(reader)\n\tif err != nil {\n\t\treturn nil, nil, errors.Wrap(err, \"checking if reader\")\n\t}\n\treader = io.MultiReader(bytes.NewReader(headerBytes), reader)\n\n\tif isGzipped {\n\t\tgzipReader, err = gzip.NewReader(reader)\n\t\tif err != nil {\n\t\t\treturn nil, nil, errors.Wrap(err, \"failed to create gzip reader\")\n\t\t}\n\t\treader = gzipReader\n\t\tdefer gzipReader.Close()\n\t}\n\n\treturn archive.ReadTarEntry(reader, entryPath)\n}\n\nfunc isGzipped(reader io.Reader) (headerBytes []byte, isGzipped bool, err error) {\n\tmagicHeader := make([]byte, 2)\n\tn, err := reader.Read(magicHeader)\n\tif n == 0 && err == io.EOF {\n\t\treturn magicHeader, false, nil\n\t}\n\tif err != nil {\n\t\treturn magicHeader, false, err\n\t}\n\t// This assertion is based on https://stackoverflow.com/a/28332019. It checks whether the two header bytes of\n\t// the file match the expected headers for a gzip file; the first one is 0x1f and the second is 0x8b\n\treturn magicHeader, bytes.Equal(magicHeader, gzipMagicHeader), nil\n}\n\nfunc ContentContains(expected string) TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, contents []byte) {\n\t\tt.Helper()\n\t\tAssertContains(t, string(contents), expected)\n\t}\n}\n\nfunc ContentEquals(expected string) TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, contents []byte) {\n\t\tt.Helper()\n\t\tAssertEq(t, string(contents), expected)\n\t}\n}\n\nfunc SymlinksTo(expectedTarget string) TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, _ []byte) {\n\t\tt.Helper()\n\t\tif header.Typeflag != tar.TypeSymlink {\n\t\t\tt.Fatalf(\"path '%s' is not a symlink, type flag is '%c'\", header.Name, header.Typeflag)\n\t\t}\n\n\t\tif header.Linkname != expectedTarget {\n\t\t\tt.Fatalf(\"symlink '%s' does not point to '%s', instead it points to '%s'\", header.Name, expectedTarget, header.Linkname)\n\t\t}\n\t}\n}\n\nfunc AreEquivalentHardLinks() 
TarEntriesAssertion {\n\treturn func(t *testing.T, header1 *tar.Header, _ []byte, header2 *tar.Header, _ []byte) {\n\t\tt.Helper()\n\t\tif header1.Typeflag != tar.TypeLink && header2.Typeflag != tar.TypeLink {\n\t\t\tt.Fatalf(\"path '%s' and '%s' are not hardlinks, type flags are '%c' and '%c'\", header1.Name, header2.Name, header1.Typeflag, header2.Typeflag)\n\t\t}\n\n\t\tif header1.Linkname != header2.Name && header2.Linkname != header1.Name {\n\t\t\tt.Fatalf(\"'%s' and '%s' are not the same file\", header1.Name, header2.Name)\n\t\t}\n\t}\n}\n\nfunc HasOwnerAndGroup(expectedUID int, expectedGID int) TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, _ []byte) {\n\t\tt.Helper()\n\t\tif header.Uid != expectedUID {\n\t\t\tt.Fatalf(\"expected '%s' to have uid '%d', but got '%d'\", header.Name, expectedUID, header.Uid)\n\t\t}\n\t\tif header.Gid != expectedGID {\n\t\t\tt.Fatalf(\"expected '%s' to have gid '%d', but got '%d'\", header.Name, expectedGID, header.Gid)\n\t\t}\n\t}\n}\n\nfunc IsJSON() TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, data []byte) {\n\t\tif !json.Valid(data) {\n\t\t\tt.Fatal(\"not valid JSON\")\n\t\t}\n\t}\n}\n\nfunc IsGzipped() TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, data []byte) {\n\t\t_, isGzipped, err := isGzipped(bytes.NewReader(data))\n\t\tAssertNil(t, err)\n\t\tif !isGzipped {\n\t\t\tt.Fatal(\"is not gzipped\")\n\t\t}\n\t}\n}\n\nfunc HasFileMode(expectedMode int64) TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, _ []byte) {\n\t\tt.Helper()\n\t\tif header.Mode != expectedMode {\n\t\t\tt.Fatalf(\"expected '%s' to have mode '%o', but got '%o'\", header.Name, expectedMode, header.Mode)\n\t\t}\n\t}\n}\n\nfunc HasModTime(expectedTime time.Time) TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, _ []byte) {\n\t\tt.Helper()\n\t\tif header.ModTime.UnixNano() != expectedTime.UnixNano() {\n\t\t\tt.Fatalf(\"expected '%s' to have mod time '%s', 
but got '%s'\", header.Name, expectedTime, header.ModTime)\n\t\t}\n\t}\n}\n\nfunc DoesNotHaveModTime(expectedTime time.Time) TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, _ []byte) {\n\t\tt.Helper()\n\t\tif header.ModTime.UnixNano() == expectedTime.UnixNano() {\n\t\t\tt.Fatalf(\"expected '%s' to not have mod time '%s'\", header.Name, expectedTime)\n\t\t}\n\t}\n}\n\nfunc IsDirectory() TarEntryAssertion {\n\treturn func(t *testing.T, header *tar.Header, _ []byte) {\n\t\tt.Helper()\n\t\tif header.Typeflag != tar.TypeDir {\n\t\t\tt.Fatalf(\"expected '%s' to be a directory but was '%d'\", header.Name, header.Typeflag)\n\t\t}\n\t}\n}\n"
  },
  {
    "path": "testhelpers/tar_verifier.go",
"content": "package testhelpers\n\nimport (\n\t\"archive/tar\"\n\t\"io\"\n\t\"testing\"\n\t\"time\"\n)\n\ntype TarVerifier struct {\n\tt   *testing.T\n\ttr  *tar.Reader\n\tuid int\n\tgid int\n}\n\nfunc NewTarVerifier(t *testing.T, tr *tar.Reader, uid, gid int) *TarVerifier {\n\treturn &TarVerifier{\n\t\tt:   t,\n\t\ttr:  tr,\n\t\tuid: uid,\n\t\tgid: gid,\n\t}\n}\n\nfunc (v *TarVerifier) NextDirectory(name string, mode int64) {\n\tv.t.Helper()\n\theader, err := v.tr.Next()\n\tif err != nil {\n\t\tv.t.Fatalf(\"Failed to get next file: %s\", err)\n\t}\n\n\tif header.Name != name {\n\t\tv.t.Fatalf(`expected dir with name %s, got %s`, name, header.Name)\n\t}\n\tif header.Typeflag != tar.TypeDir {\n\t\tv.t.Fatalf(`expected %s to be a Directory`, header.Name)\n\t}\n\tif header.Uid != v.uid {\n\t\tv.t.Fatalf(`expected %s to have Uid %d but, got: %d`, header.Name, v.uid, header.Uid)\n\t}\n\tif header.Gid != v.gid {\n\t\tv.t.Fatalf(`expected %s to have Gid %d but, got: %d`, header.Name, v.gid, header.Gid)\n\t}\n\tif header.Mode != mode {\n\t\tv.t.Fatalf(`expected %s to have mode %o but, got: %o`, header.Name, mode, header.Mode)\n\t}\n\tif !header.ModTime.Equal(time.Date(1980, time.January, 1, 0, 0, 1, 0, time.UTC)) {\n\t\tv.t.Fatalf(`expected %s to have been normalized, got: %s`, header.Name, header.ModTime.String())\n\t}\n}\n\nfunc (v *TarVerifier) NoMoreFilesExist() {\n\tv.t.Helper()\n\theader, err := v.tr.Next()\n\tif err == nil {\n\t\tv.t.Fatalf(`expected no more files but found: %s`, header.Name)\n\t} else if err != io.EOF {\n\t\tv.t.Error(err.Error())\n\t}\n}\n\nfunc (v *TarVerifier) NextFile(name, expectedFileContents string, expectedFileMode int64) {\n\tv.t.Helper()\n\theader, err := v.tr.Next()\n\tif err != nil {\n\t\tv.t.Fatalf(\"Failed to get next file: %s\", err)\n\t}\n\n\tif header.Name != name {\n\t\tv.t.Fatalf(`expected file with name %s, got %s`, name, header.Name)\n\t}\n\tif header.Typeflag != tar.TypeReg {\n\t\tv.t.Fatalf(`expected %s to be a file`, 
header.Name)\n\t}\n\tif header.Uid != v.uid {\n\t\tv.t.Fatalf(`expected %s to have Uid %d but, got: %d`, header.Name, v.uid, header.Uid)\n\t}\n\tif header.Gid != v.gid {\n\t\tv.t.Fatalf(`expected %s to have Gid %d but, got: %d`, header.Name, v.gid, header.Gid)\n\t}\n\n\t// read the full entry; a bare Read may return fewer than header.Size bytes\n\tfileContents := make([]byte, header.Size)\n\tif _, err := io.ReadFull(v.tr, fileContents); err != nil {\n\t\tv.t.Fatalf(\"Failed to read contents of %s: %s\", header.Name, err)\n\t}\n\tif string(fileContents) != expectedFileContents {\n\t\tv.t.Fatalf(`expected %s to have contents %s, got: %s`, header.Name, expectedFileContents, string(fileContents))\n\t}\n\n\tif !header.ModTime.Equal(time.Date(1980, time.January, 1, 0, 0, 1, 0, time.UTC)) {\n\t\tv.t.Fatalf(`expected %s to have been normalized, got: %s`, header.Name, header.ModTime.String())\n\t}\n\n\tif header.Mode != expectedFileMode {\n\t\tv.t.Fatalf(\"files should have mode %o, got: %o\", expectedFileMode, header.Mode)\n\t}\n}\n\nfunc (v *TarVerifier) NextSymLink(name, link string) {\n\tv.t.Helper()\n\theader, err := v.tr.Next()\n\tif err != nil {\n\t\tv.t.Fatalf(\"Failed to get next file: %s\", err)\n\t}\n\n\tif header.Name != name {\n\t\tv.t.Fatalf(`expected symlink with name %s, got %s`, name, header.Name)\n\t}\n\tif header.Typeflag != tar.TypeSymlink {\n\t\tv.t.Fatalf(`expected %s to be a link got %s`, header.Name, string(header.Typeflag))\n\t}\n\tif header.Uid != v.uid {\n\t\tv.t.Fatalf(`expected %s to have Uid %d but, got: %d`, header.Name, v.uid, header.Uid)\n\t}\n\tif header.Gid != v.gid {\n\t\tv.t.Fatalf(`expected %s to have Gid %d but, got: %d`, header.Name, v.gid, header.Gid)\n\t}\n\n\t// tar names and linknames should be Linux formatted paths, regardless of OS\n\tif header.Linkname != link {\n\t\tv.t.Fatalf(`expected %s to have target %s, got: %s`, header.Name, link, header.Linkname)\n\t}\n\tif !header.ModTime.Equal(time.Date(1980, time.January, 1, 0, 0, 1, 0, time.UTC)) {\n\t\tv.t.Fatalf(`expected %s to have been normalized, got: %s`, header.Name, header.ModTime.String())\n\t}\n}\n"
  },
  {
    "path": "testhelpers/testhelpers.go",
    "content": "package testhelpers\n\nimport (\n\t\"archive/tar\"\n\t\"bytes\"\n\t\"compress/gzip\"\n\t\"context\"\n\t\"encoding/json\"\n\t\"fmt\"\n\t\"io\"\n\t\"math/rand\"\n\t\"net/http\"\n\t\"os\"\n\t\"os/exec\"\n\t\"path/filepath\"\n\t\"reflect\"\n\t\"regexp\"\n\t\"runtime\"\n\t\"strings\"\n\t\"sync\"\n\t\"testing\"\n\t\"time\"\n\n\t\"github.com/buildpacks/imgutil\"\n\tv1 \"github.com/google/go-containerregistry/pkg/v1\"\n\n\t\"github.com/buildpacks/imgutil/fakes\"\n\n\t\"github.com/docker/docker/pkg/jsonmessage\"\n\t\"github.com/docker/docker/pkg/stdcopy\"\n\t\"github.com/go-git/go-git/v5\"\n\t\"github.com/google/go-cmp/cmp\"\n\t\"github.com/heroku/color\"\n\tdcontainer \"github.com/moby/moby/api/types/container\"\n\t\"github.com/moby/moby/client\"\n\t\"github.com/pkg/errors\"\n\n\t\"github.com/buildpacks/pack/internal/container\"\n\t\"github.com/buildpacks/pack/internal/stringset\"\n\t\"github.com/buildpacks/pack/internal/style\"\n\t\"github.com/buildpacks/pack/pkg/archive\"\n\t\"github.com/buildpacks/pack/pkg/buildpack\"\n\t\"github.com/buildpacks/pack/pkg/dist\"\n)\n\nfunc RandString(n int) string {\n\tb := make([]byte, n)\n\tfor i := range b {\n\t\tb[i] = 'a' + byte(rand.Intn(26))\n\t}\n\treturn string(b)\n}\n\n// Assert deep equality (and provide useful difference as a test failure)\nfunc AssertEq(t *testing.T, actual, expected interface{}, opts ...cmp.Option) {\n\tt.Helper()\n\tif diff := cmp.Diff(expected, actual, opts...); diff != \"\" {\n\t\tt.Fatal(diff)\n\t}\n}\n\nfunc AssertFunctionName(t *testing.T, fn interface{}, expected string) {\n\tt.Helper()\n\tname := runtime.FuncForPC(reflect.ValueOf(fn).Pointer()).Name()\n\tif name == \"\" {\n\t\tt.Fatalf(\"Unable to retrieve function name for %#v. 
Is it a function?\", fn)\n\t}\n\n\tif !hasMatches(name, fmt.Sprintf(`\\.(%s)\\.func[\\d]+$`, expected)) {\n\t\tt.Fatalf(\"Expected func name '%s' to contain '%s'\", name, expected)\n\t}\n}\n\n// Assert deep inequality (fail when the two values are deeply equal)\nfunc AssertNotEq(t *testing.T, actual, expected interface{}) {\n\tt.Helper()\n\tif diff := cmp.Diff(expected, actual); diff == \"\" {\n\t\tt.Fatalf(\"Expected values to differ, but both were: %v\", actual)\n\t}\n}\n\nfunc AssertTrue(t *testing.T, actual interface{}) {\n\tt.Helper()\n\tAssertEq(t, actual, true)\n}\n\nfunc AssertFalse(t *testing.T, actual interface{}) {\n\tt.Helper()\n\tAssertEq(t, actual, false)\n}\n\nfunc AssertUnique(t *testing.T, items ...interface{}) {\n\tt.Helper()\n\titemMap := map[interface{}]interface{}{}\n\tfor _, item := range items {\n\t\titemMap[item] = nil\n\t}\n\tif len(itemMap) != len(items) {\n\t\tt.Fatalf(\"Expected items in %v to be unique\", items)\n\t}\n}\n\n// Assert the simplistic pointer (or literal value) equality\nfunc AssertSameInstance(t *testing.T, actual, expected interface{}) {\n\tt.Helper()\n\tif actual != expected {\n\t\tt.Fatalf(\"Expected %s and %s to be the same instance\", actual, expected)\n\t}\n}\n\nfunc AssertError(t *testing.T, actual error, expected string) {\n\tt.Helper()\n\tif actual == nil {\n\t\tt.Fatalf(\"Expected an error but got nil\")\n\t}\n\tif !strings.Contains(actual.Error(), expected) {\n\t\tt.Fatalf(`Expected error to contain \"%s\", got \"%s\"`, expected, actual.Error())\n\t}\n}\n\nfunc AssertContains(t *testing.T, actual, expected string) {\n\tt.Helper()\n\tif !strings.Contains(actual, expected) {\n\t\tt.Fatalf(\n\t\t\t\"Expected '%s' to contain '%s'\\n\\nDiff:%s\",\n\t\t\tactual,\n\t\t\texpected,\n\t\t\tcmp.Diff(expected, actual),\n\t\t)\n\t}\n}\n\nfunc AssertContainsAllInOrder(t *testing.T, actual bytes.Buffer, expected ...string) {\n\tt.Helper()\n\n\tvar tested []byte\n\n\tfor _, exp := range expected {\n\t\tb, found := readUntilString(&actual, exp)\n\t\ttested = append(tested, 
b...)\n\n\t\tif !found {\n\t\t\tt.Fatalf(\"Expected '%s' to include all of '%s' in order\", string(tested), strings.Join(expected, \", \"))\n\t\t}\n\t}\n}\n\nfunc readUntilString(b *bytes.Buffer, expected string) (read []byte, found bool) {\n\tfor {\n\t\ts, err := b.ReadBytes(expected[len(expected)-1])\n\t\tif err != nil {\n\t\t\treturn append(read, s...), false\n\t\t}\n\n\t\tread = append(read, s...)\n\t\tif bytes.HasSuffix(read, []byte(expected)) {\n\t\t\treturn read, true\n\t\t}\n\t}\n}\n\n// AssertContainsMatch matches on content by regular expression\nfunc AssertContainsMatch(t *testing.T, actual, exp string) {\n\tt.Helper()\n\tif !hasMatches(actual, exp) {\n\t\tt.Fatalf(\"Expected '%s' to match expression '%s'\", actual, exp)\n\t}\n}\n\nfunc AssertNotContainsMatch(t *testing.T, actual, exp string) {\n\tt.Helper()\n\tif hasMatches(actual, exp) {\n\t\tt.Fatalf(\"Expected '%s' not to match expression '%s'\", actual, exp)\n\t}\n}\n\nfunc AssertNotContains(t *testing.T, actual, expected string) {\n\tt.Helper()\n\tif strings.Contains(actual, expected) {\n\t\tt.Fatalf(\"Expected '%s' to not contain '%s'\", actual, expected)\n\t}\n}\n\ntype KeyValue[k comparable, v any] struct {\n\tkey   k\n\tvalue v\n}\n\nfunc NewKeyValue[k comparable, v any](key k, value v) KeyValue[k, v] {\n\treturn KeyValue[k, v]{key: key, value: value}\n}\n\nfunc AssertMapContains[key comparable, value any](t *testing.T, actual map[key]value, expected ...KeyValue[key, value]) {\n\tt.Helper()\n\tfor _, i := range expected {\n\t\tif v, ok := actual[i.key]; !ok || !reflect.DeepEqual(v, i.value) {\n\t\t\tt.Fatalf(\"Expected %s to contain elements %s\", reflect.ValueOf(actual), reflect.ValueOf(expected))\n\t\t}\n\t}\n}\n\nfunc AssertMapNotContains[key comparable, value any](t *testing.T, actual map[key]value, expected ...KeyValue[key, value]) {\n\tt.Helper()\n\tfor _, i := range expected {\n\t\tif v, ok := actual[i.key]; ok && reflect.DeepEqual(v, i.value) {\n\t\t\tt.Fatalf(\"Expected %s to not 
contain elements %s\", reflect.ValueOf(actual), reflect.ValueOf(expected))\n\t\t}\n\t}\n}\n\nfunc AssertSliceContains(t *testing.T, slice []string, expected ...string) {\n\tt.Helper()\n\t_, missing, _ := stringset.Compare(slice, expected)\n\tif len(missing) > 0 {\n\t\tt.Fatalf(\"Expected %s to contain elements %s\", slice, missing)\n\t}\n}\n\nfunc AssertSliceContainsInOrder(t *testing.T, slice []string, expected ...string) {\n\tt.Helper()\n\n\tAssertSliceContains(t, slice, expected...)\n\n\tvar common []string\n\texpectedSet := stringset.FromSlice(expected)\n\tfor _, sliceV := range slice {\n\t\tif _, ok := expectedSet[sliceV]; ok {\n\t\t\tcommon = append(common, sliceV)\n\t\t}\n\t}\n\n\tlastFoundI := -1\n\tfor _, expectedV := range expected {\n\t\tfor foundI, foundV := range common {\n\t\t\tif expectedV == foundV && lastFoundI < foundI {\n\t\t\t\tlastFoundI = foundI\n\t\t\t} else if expectedV == foundV {\n\t\t\t\tt.Fatalf(\"Expected '%s' come earlier in the slice.\\nslice: %v\\nexpected order: %v\", expectedV, slice, expected)\n\t\t\t}\n\t\t}\n\t}\n}\n\nfunc AssertSliceNotContains(t *testing.T, slice []string, expected ...string) {\n\tt.Helper()\n\t_, missing, _ := stringset.Compare(slice, expected)\n\tif len(missing) != len(expected) {\n\t\tt.Fatalf(\"Expected %s not to contain elements %s\", slice, expected)\n\t}\n}\n\nfunc AssertSliceContainsMatch(t *testing.T, slice []string, expected ...string) {\n\tt.Helper()\n\n\tvar missing []string\n\n\tfor _, expectedStr := range expected {\n\t\tvar found bool\n\t\tfor _, actualStr := range slice {\n\t\t\tif regexp.MustCompile(expectedStr).MatchString(actualStr) {\n\t\t\t\tfound = true\n\t\t\t\tbreak\n\t\t\t}\n\t\t}\n\t\tif !found {\n\t\t\tmissing = append(missing, expectedStr)\n\t\t}\n\t}\n\n\tif len(missing) > 0 {\n\t\tt.Fatalf(\"Expected %s to contain elements %s\", slice, missing)\n\t}\n}\n\nfunc AssertSliceContainsOnly(t *testing.T, slice []string, expected ...string) {\n\tt.Helper()\n\textra, missing, _ := 
stringset.Compare(slice, expected)\n\tif len(missing) > 0 {\n\t\tt.Fatalf(\"Expected %s to contain elements %s\", slice, missing)\n\t}\n\tif len(extra) > 0 {\n\t\tt.Fatalf(\"Expected %s to not contain elements %s\", slice, extra)\n\t}\n}\n\nfunc AssertMatch(t *testing.T, actual string, expected string) {\n\tt.Helper()\n\tif !regexp.MustCompile(expected).MatchString(actual) {\n\t\tt.Fatalf(\"Expected '%s' to match regex '%s'\", actual, expected)\n\t}\n}\n\n// AssertNilE checks for nil value, if not nil it sets test as failed without stopping execution.\nfunc AssertNilE(t *testing.T, actual interface{}) {\n\tt.Helper()\n\tif !isNil(actual) {\n\t\tt.Errorf(\"Expected nil: %s\", actual)\n\t}\n}\n\n// AssertNil checks for nil value, if not nil it fails the test and stops execution immediately.\nfunc AssertNil(t *testing.T, actual interface{}) {\n\tt.Helper()\n\tif !isNil(actual) {\n\t\tt.Fatalf(\"Expected nil: %s\", actual)\n\t}\n}\n\nfunc AssertNotNil(t *testing.T, actual interface{}) {\n\tt.Helper()\n\tif isNil(actual) {\n\t\tt.Fatal(\"Expected not nil\")\n\t}\n}\n\nfunc AssertTarball(t *testing.T, path string) {\n\tt.Helper()\n\tf, err := os.Open(filepath.Clean(path))\n\tAssertNil(t, err)\n\tdefer f.Close()\n\n\treader := tar.NewReader(f)\n\t_, err = reader.Next()\n\tAssertNil(t, err)\n}\n\nfunc isNil(value interface{}) bool {\n\treturn value == nil || (reflect.TypeOf(value).Kind() == reflect.Ptr && reflect.ValueOf(value).IsNil())\n}\n\nfunc hasMatches(actual, exp string) bool {\n\tregex := regexp.MustCompile(exp)\n\tmatches := regex.FindAll([]byte(actual), -1)\n\treturn len(matches) > 0\n}\n\n// IndexOf returns the index of the first occurrence of substr in s, or -1 if not found\nfunc IndexOf(s, substr string) int {\n\treturn strings.Index(s, substr)\n}\n\nvar dockerCliVal *client.Client\nvar dockerCliOnce sync.Once\nvar dockerCliErr error\n\nfunc dockerCli(t *testing.T) *client.Client {\n\tdockerCliOnce.Do(func() {\n\t\tdockerCliVal, dockerCliErr = 
client.New(client.FromEnv)\n\t})\n\tAssertNil(t, dockerCliErr)\n\treturn dockerCliVal\n}\n\nfunc Eventually(t *testing.T, test func() bool, every time.Duration, timeout time.Duration) {\n\tt.Helper()\n\n\tticker := time.NewTicker(every)\n\tdefer ticker.Stop()\n\ttimer := time.NewTimer(timeout)\n\tdefer timer.Stop()\n\n\tfor {\n\t\tselect {\n\t\tcase <-ticker.C:\n\t\t\tif test() {\n\t\t\t\treturn\n\t\t\t}\n\t\tcase <-timer.C:\n\t\t\tt.Fatalf(\"timeout on eventually: %v\", timeout)\n\t\t}\n\t}\n}\n\nfunc CreateImage(t *testing.T, dockerCli *client.Client, repoName, dockerFile string) {\n\tt.Helper()\n\n\tbuildContext := archive.CreateSingleFileTarReader(\"Dockerfile\", dockerFile)\n\tdefer buildContext.Close()\n\n\tresp, err := dockerCli.ImageBuild(context.Background(), buildContext, client.ImageBuildOptions{\n\t\tTags:           []string{repoName},\n\t\tSuppressOutput: true,\n\t\tRemove:         true,\n\t\tForceRemove:    true,\n\t})\n\tAssertNil(t, err)\n\n\tdefer resp.Body.Close()\n\terr = checkResponse(resp.Body)\n\tAssertNil(t, errors.Wrapf(err, \"building image %s\", style.Symbol(repoName)))\n}\n\nfunc CreateImageFromDir(t *testing.T, dockerCli *client.Client, repoName string, dir string) {\n\tt.Helper()\n\n\tbuildContext := archive.ReadDirAsTar(dir, \"/\", 0, 0, -1, true, false, nil)\n\tresp, err := dockerCli.ImageBuild(context.Background(), buildContext, client.ImageBuildOptions{\n\t\tTags:           []string{repoName},\n\t\tRemove:         true,\n\t\tForceRemove:    true,\n\t\tSuppressOutput: false,\n\t})\n\tAssertNil(t, err)\n\n\tdefer resp.Body.Close()\n\terr = checkResponse(resp.Body)\n\tAssertNil(t, errors.Wrapf(err, \"building image %s\", style.Symbol(repoName)))\n}\n\nfunc CheckImageBuildResult(response client.ImageBuildResult, err error) error {\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tdefer response.Body.Close()\n\treturn checkResponse(response.Body)\n}\n\nfunc checkResponse(responseBody io.Reader) error {\n\tbody, err := 
io.ReadAll(responseBody)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"reading body\")\n\t}\n\n\tmessages := strings.Builder{}\n\tfor _, line := range bytes.Split(body, []byte(\"\\n\")) {\n\t\tif len(line) == 0 {\n\t\t\tcontinue\n\t\t}\n\n\t\tvar msg jsonmessage.JSONMessage\n\t\terr := json.Unmarshal(line, &msg)\n\t\tif err != nil {\n\t\t\treturn errors.Wrapf(err, \"expected JSON: %s\", string(line))\n\t\t}\n\n\t\tif msg.Stream != \"\" {\n\t\t\tmessages.WriteString(msg.Stream)\n\t\t}\n\n\t\tif msg.Error != nil {\n\t\t\treturn errors.WithMessage(msg.Error, messages.String())\n\t\t}\n\t}\n\n\treturn nil\n}\n\nfunc CreateImageOnRemote(t *testing.T, dockerCli *client.Client, registryConfig *TestRegistryConfig, repoName, dockerFile string) string {\n\tt.Helper()\n\timageName := registryConfig.RepoName(repoName)\n\tCreateImage(t, dockerCli, imageName, dockerFile)\n\tAssertNil(t, PushImage(dockerCli, imageName, registryConfig))\n\treturn imageName\n}\n\nfunc DockerRmi(dockerCli *client.Client, repoNames ...string) error {\n\tvar err error\n\tctx := context.Background()\n\tfor _, name := range repoNames {\n\t\t_, e := dockerCli.ImageRemove(\n\t\t\tctx,\n\t\t\tname,\n\t\t\tclient.ImageRemoveOptions{Force: true, PruneChildren: true},\n\t\t)\n\t\tif e != nil && err == nil {\n\t\t\terr = e\n\t\t}\n\t}\n\treturn err\n}\n\nfunc PushImage(dockerCli *client.Client, ref string, registryConfig *TestRegistryConfig) error {\n\trc, err := dockerCli.ImagePush(context.Background(), ref, client.ImagePushOptions{RegistryAuth: registryConfig.RegistryAuth()})\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"pushing image\")\n\t}\n\n\tdefer rc.Close()\n\terr = checkResponse(rc)\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"push response\")\n\t}\n\n\treturn nil\n}\n\nfunc HTTPGetE(url string, headers map[string]string) (string, error) {\n\tclient := http.DefaultClient\n\n\trequest, err := http.NewRequest(\"GET\", url, nil)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"making 
new request\")\n\t}\n\n\tfor key, val := range headers {\n\t\trequest.Header.Set(key, val)\n\t}\n\n\tresp, err := client.Do(request)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"doing request\")\n\t}\n\tdefer resp.Body.Close()\n\tif resp.StatusCode >= 300 {\n\t\treturn \"\", fmt.Errorf(\"HTTP Status was bad: %s => %d\", url, resp.StatusCode)\n\t}\n\tb, err := io.ReadAll(resp.Body)\n\tif err != nil {\n\t\treturn \"\", errors.Wrap(err, \"reading body\")\n\t}\n\treturn string(b), nil\n}\n\nfunc ImageID(t *testing.T, repoName string) string {\n\tt.Helper()\n\tinspect, err := dockerCli(t).ImageInspect(context.Background(), repoName)\n\tAssertNil(t, err)\n\treturn strings.TrimPrefix(inspect.ID, \"sha256:\")\n}\n\nfunc Digest(t *testing.T, repoName string) string {\n\tt.Helper()\n\tinspect, err := dockerCli(t).ImageInspect(context.Background(), repoName)\n\tAssertNil(t, err)\n\tif len(inspect.RepoDigests) < 1 {\n\t\tt.Fatalf(\"image '%s' has no repo digests\", repoName)\n\t}\n\tparts := strings.Split(inspect.RepoDigests[0], \"@\")\n\tif len(parts) < 2 {\n\t\tt.Fatalf(\"repo digest '%s' malformed\", inspect.RepoDigests[0])\n\t}\n\treturn parts[1]\n}\n\nfunc TopLayerDiffID(t *testing.T, repoName string) string {\n\tt.Helper()\n\tinspect, err := dockerCli(t).ImageInspect(context.Background(), repoName)\n\tAssertNil(t, err)\n\tif len(inspect.RootFS.Layers) < 1 {\n\t\tt.Fatalf(\"image '%s' has no layers\", repoName)\n\t}\n\treturn inspect.RootFS.Layers[len(inspect.RootFS.Layers)-1]\n}\n\nfunc Run(t *testing.T, cmd *exec.Cmd) string {\n\tt.Helper()\n\ttxt, err := RunE(cmd)\n\tAssertNil(t, err)\n\treturn txt\n}\n\nfunc RunE(cmd *exec.Cmd) (string, error) {\n\toutput, err := cmd.CombinedOutput()\n\tif err != nil {\n\t\treturn string(output), fmt.Errorf(\"failed to execute command: %v, %s, %s\", cmd.Args, err, output)\n\t}\n\n\treturn string(output), nil\n}\n\nfunc PullImageWithAuth(dockerCli *client.Client, ref, registryAuth string) error {\n\tpullResult, err := 
dockerCli.ImagePull(context.Background(), ref, client.ImagePullOptions{RegistryAuth: registryAuth})\n\tif err != nil {\n\t\treturn err\n\t}\n\tif _, err := io.Copy(io.Discard, pullResult); err != nil {\n\t\treturn err\n\t}\n\treturn pullResult.Close()\n}\n\nfunc CopyFile(t *testing.T, src, dst string) {\n\tt.Helper()\n\n\terr := CopyFileE(src, dst)\n\tAssertNil(t, err)\n}\n\nfunc CopyFileE(src, dst string) error {\n\tfi, err := os.Stat(src)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tsrcFile, err := os.Open(filepath.Clean(src))\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer srcFile.Close()\n\n\tdstFile, err := os.OpenFile(filepath.Clean(dst), os.O_RDWR|os.O_CREATE|os.O_TRUNC, fi.Mode())\n\tif err != nil {\n\t\treturn err\n\t}\n\tdefer dstFile.Close()\n\n\t_, err = io.Copy(dstFile, srcFile)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tmodifiedtime := time.Time{}\n\treturn os.Chtimes(dst, modifiedtime, modifiedtime)\n}\n\nfunc RecursiveCopy(t *testing.T, src, dst string) {\n\tt.Helper()\n\n\terr := RecursiveCopyE(src, dst)\n\tAssertNil(t, err)\n}\n\nfunc RecursiveCopyE(src, dst string) error {\n\tfis, err := os.ReadDir(src)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\tfor _, entry := range fis {\n\t\tfi, err := entry.Info()\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\t\tif fi.Mode().IsRegular() {\n\t\t\terr = CopyFileE(filepath.Join(src, fi.Name()), filepath.Join(dst, fi.Name()))\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\t\tif fi.IsDir() {\n\t\t\terr = os.Mkdir(filepath.Join(dst, fi.Name()), fi.Mode())\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t\terr = RecursiveCopyE(filepath.Join(src, fi.Name()), filepath.Join(dst, fi.Name()))\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\t\t}\n\t}\n\n\tmodifiedtime := time.Time{}\n\terr = os.Chtimes(dst, modifiedtime, modifiedtime)\n\tif err != nil {\n\t\treturn err\n\t}\n\n\treturn os.Chmod(dst, 0775)\n}\n\nfunc RequireDocker(t *testing.T) {\n\tnoDocker := 
os.Getenv(\"NO_DOCKER\")\n\tSkipIf(t, strings.ToLower(noDocker) == \"true\" || noDocker == \"1\", \"Skipping because docker daemon unavailable\")\n}\n\nfunc SkipIf(t *testing.T, expression bool, reason string) {\n\tt.Helper()\n\tif expression {\n\t\tt.Skip(reason)\n\t}\n}\n\nfunc SkipUnless(t *testing.T, expression bool, reason string) {\n\tt.Helper()\n\tif !expression {\n\t\tt.Skip(reason)\n\t}\n}\n\n// dockerClientAdapter adapts moby client to internal/container.DockerClient interface\ntype dockerClientAdapter struct {\n\t*client.Client\n}\n\nfunc (a *dockerClientAdapter) ContainerWait(ctx context.Context, containerID string, options client.ContainerWaitOptions) client.ContainerWaitResult {\n\treturn a.Client.ContainerWait(ctx, containerID, options)\n}\n\nfunc (a *dockerClientAdapter) ContainerAttach(ctx context.Context, container string, options client.ContainerAttachOptions) (client.ContainerAttachResult, error) {\n\treturn a.Client.ContainerAttach(ctx, container, options)\n}\n\nfunc (a *dockerClientAdapter) ContainerStart(ctx context.Context, container string, options client.ContainerStartOptions) (client.ContainerStartResult, error) {\n\treturn a.Client.ContainerStart(ctx, container, options)\n}\n\nfunc RunContainer(ctx context.Context, dockerCli *client.Client, id string, stdout io.Writer, stderr io.Writer) error {\n\tadapter := &dockerClientAdapter{Client: dockerCli}\n\tbodyChan, errChan := container.ContainerWaitWrapper(ctx, adapter, id, dcontainer.WaitConditionNextExit)\n\n\tlogsResult, err := dockerCli.ContainerAttach(ctx, id, client.ContainerAttachOptions{\n\t\tStream: true,\n\t\tStdout: true,\n\t\tStderr: true,\n\t})\n\tif err != nil {\n\t\treturn err\n\t}\n\n\t_, err = dockerCli.ContainerStart(ctx, id, client.ContainerStartOptions{})\n\tif err != nil {\n\t\treturn errors.Wrap(err, \"container start\")\n\t}\n\n\tcopyErr := make(chan error)\n\tgo func() {\n\t\t_, err := stdcopy.StdCopy(stdout, stderr, logsResult.Reader)\n\t\tcopyErr <- 
err\n\t}()\n\n\tselect {\n\tcase body := <-bodyChan:\n\t\tif body.StatusCode != 0 {\n\t\t\treturn fmt.Errorf(\"failed with status code: %d\", body.StatusCode)\n\t\t}\n\tcase err := <-errChan:\n\t\treturn err\n\t}\n\treturn <-copyErr\n}\n\nfunc CreateTGZ(t *testing.T, srcDir, tarDir string, mode int64) string {\n\tt.Helper()\n\n\tfh, err := os.CreateTemp(\"\", \"*.tgz\")\n\tAssertNil(t, err)\n\tdefer fh.Close()\n\n\tgw := gzip.NewWriter(fh)\n\tdefer gw.Close()\n\n\twriteTAR(t, srcDir, tarDir, mode, gw)\n\n\treturn fh.Name()\n}\n\nfunc CreateTAR(t *testing.T, srcDir, tarDir string, mode int64) string {\n\tt.Helper()\n\n\tfh, err := os.CreateTemp(\"\", \"*.tgz\")\n\tAssertNil(t, err)\n\tdefer fh.Close()\n\n\twriteTAR(t, srcDir, tarDir, mode, fh)\n\n\treturn fh.Name()\n}\n\nfunc writeTAR(t *testing.T, srcDir, tarDir string, mode int64, w io.Writer) {\n\tt.Helper()\n\ttw := tar.NewWriter(w)\n\tdefer tw.Close()\n\n\terr := archive.WriteDirToTar(tw, srcDir, tarDir, 0, 0, mode, true, false, nil)\n\tAssertNil(t, err)\n}\n\nfunc RecursiveCopyNow(t *testing.T, src, dst string) {\n\tt.Helper()\n\terr := os.MkdirAll(dst, 0750)\n\tAssertNil(t, err)\n\n\tfis, err := os.ReadDir(src)\n\tAssertNil(t, err)\n\tfor _, entry := range fis {\n\t\tfi, err := entry.Info()\n\t\tAssertNil(t, err)\n\t\tif fi.Mode().IsRegular() {\n\t\t\tsrcFile, err := os.Open(filepath.Join(filepath.Clean(src), fi.Name()))\n\t\t\tAssertNil(t, err)\n\t\t\tdstFile, err := os.Create(filepath.Join(dst, fi.Name()))\n\t\t\tAssertNil(t, err)\n\t\t\t_, err = io.Copy(dstFile, srcFile)\n\t\t\tAssertNil(t, err)\n\t\t\tmodifiedTime := time.Now().Local()\n\t\t\terr = os.Chtimes(filepath.Join(dst, fi.Name()), modifiedTime, modifiedTime)\n\t\t\tAssertNil(t, err)\n\t\t\terr = os.Chmod(filepath.Join(dst, fi.Name()), 0664)\n\t\t\tAssertNil(t, err)\n\t\t}\n\t\tif fi.IsDir() {\n\t\t\terr = os.Mkdir(filepath.Join(dst, fi.Name()), fi.Mode())\n\t\t\tAssertNil(t, err)\n\t\t\tRecursiveCopyNow(t, filepath.Join(src, fi.Name()), 
filepath.Join(dst, fi.Name()))\n\t\t}\n\t}\n\tmodifiedTime := time.Now().Local()\n\terr = os.Chtimes(dst, modifiedTime, modifiedTime)\n\tAssertNil(t, err)\n\terr = os.Chmod(dst, 0775)\n\tAssertNil(t, err)\n}\n\nfunc AssertTarFileContents(t *testing.T, tarfile, path, expected string) {\n\tt.Helper()\n\texist, contents := tarFileContents(t, tarfile, path)\n\tif !exist {\n\t\tt.Fatalf(\"%s does not exist in %s\", path, tarfile)\n\t}\n\tAssertEq(t, contents, expected)\n}\n\nfunc tarFileContents(t *testing.T, tarfile, path string) (exist bool, contents string) {\n\tt.Helper()\n\tr, err := os.Open(filepath.Clean(tarfile))\n\tAssertNil(t, err)\n\tdefer r.Close()\n\n\ttr := tar.NewReader(r)\n\tfor {\n\t\theader, err := tr.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tAssertNil(t, err)\n\n\t\tif header.Name == path {\n\t\t\tbuf, err := io.ReadAll(tr)\n\t\t\tAssertNil(t, err)\n\t\t\treturn true, string(buf)\n\t\t}\n\t}\n\treturn false, \"\"\n}\n\nfunc AssertTarHasFile(t *testing.T, tarFile, path string) {\n\tt.Helper()\n\n\texist := tarHasFile(t, tarFile, path)\n\tif !exist {\n\t\tt.Fatalf(\"%s does not exist in %s\", path, tarFile)\n\t}\n}\n\nfunc tarHasFile(t *testing.T, tarFile, path string) (exist bool) {\n\tt.Helper()\n\n\tr, err := os.Open(filepath.Clean(tarFile))\n\tAssertNil(t, err)\n\tdefer r.Close()\n\n\ttr := tar.NewReader(r)\n\tfor {\n\t\theader, err := tr.Next()\n\t\tif err == io.EOF {\n\t\t\tbreak\n\t\t}\n\t\tAssertNil(t, err)\n\n\t\tif header.Name == path {\n\t\t\treturn true\n\t\t}\n\t}\n\n\treturn false\n}\n\nfunc AssertBuildpacksHaveDescriptors(t *testing.T, modules []buildpack.BuildModule, descriptors []dist.BuildpackDescriptor) {\n\tAssertEq(t, len(modules), len(descriptors))\n\tfor _, mod := range modules {\n\t\tfound := false\n\t\tmodDesc, ok := mod.Descriptor().(*dist.BuildpackDescriptor)\n\t\tAssertEq(t, ok, true)\n\t\tfor _, descriptor := range descriptors {\n\t\t\tif diff := cmp.Diff(*modDesc, descriptor); diff == \"\" {\n\t\t\t\tfound = 
true\n\t\t\t\tbreak\n\t\t\t}\n\t\t}\n\t\tAssertTrue(t, found)\n\t}\n}\n\nfunc AssertGitHeadEq(t *testing.T, path1, path2 string) {\n\tr1, err := git.PlainOpen(path1)\n\tAssertNil(t, err)\n\n\tr2, err := git.PlainOpen(path2)\n\tAssertNil(t, err)\n\n\th1, err := r1.Head()\n\tAssertNil(t, err)\n\n\th2, err := r2.Head()\n\tAssertNil(t, err)\n\n\tAssertEq(t, h1.Hash().String(), h2.Hash().String())\n}\n\nfunc AssertBlobsLen(t *testing.T, path string, expected int) {\n\tt.Helper()\n\tfis, err := os.ReadDir(filepath.Join(path, \"blobs\", \"sha256\"))\n\tAssertNil(t, err)\n\tAssertEq(t, len(fis), expected)\n}\n\nfunc MockWriterAndOutput() (*color.Console, func() string) {\n\tr, w, _ := os.Pipe()\n\tconsole := color.NewConsole(w)\n\treturn console, func() string {\n\t\t_ = w.Close()\n\t\tvar b bytes.Buffer\n\t\t_, _ = io.Copy(&b, r)\n\t\t_ = r.Close()\n\t\treturn b.String()\n\t}\n}\n\nfunc LayerFileName(bp buildpack.BuildModule) string {\n\treturn fmt.Sprintf(\"%s.%s.tar\", bp.Descriptor().Info().ID, bp.Descriptor().Info().Version)\n}\n\ntype FakeAddedLayerImage struct {\n\t*fakes.Image\n\taddedLayersOrder []string\n}\n\nfunc (f *FakeAddedLayerImage) AddedLayersOrder() []string {\n\treturn f.addedLayersOrder\n}\n\nfunc (f *FakeAddedLayerImage) AddLayerWithDiffID(path, diffID string) error {\n\tf.addedLayersOrder = append(f.addedLayersOrder, path)\n\treturn f.Image.AddLayerWithDiffID(path, diffID)\n}\n\ntype FakeWithUnderlyingImage struct {\n\t*fakes.Image\n\tunderlyingImage v1.Image\n}\n\nfunc (t *FakeWithUnderlyingImage) UnderlyingImage() v1.Image {\n\treturn t.underlyingImage\n}\n\nfunc NewFakeWithUnderlyingV1Image(repoName string, identifier imgutil.Identifier, underlyingImage v1.Image) *FakeWithUnderlyingImage {\n\tfakeCNBImage := fakes.NewImage(repoName, \"\", identifier)\n\treturn &FakeWithUnderlyingImage{\n\t\tImage:           fakeCNBImage,\n\t\tunderlyingImage: underlyingImage,\n\t}\n}\n"
  },
  {
    "path": "tools/go.mod",
    "content": "module github.com/buildpacks/pack/tools\n\ngo 1.25.0\n\ntoolchain go1.25.5\n\nrequire (\n\tgithub.com/golang/mock v1.6.0\n\tgithub.com/golangci/golangci-lint/v2 v2.0.2\n\tgolang.org/x/tools v0.31.0\n)\n\nrequire (\n\t4d63.com/gocheckcompilerdirectives v1.3.0 // indirect\n\t4d63.com/gochecknoglobals v0.2.2 // indirect\n\tgithub.com/4meepo/tagalign v1.4.2 // indirect\n\tgithub.com/Abirdcfly/dupword v0.1.3 // indirect\n\tgithub.com/Antonboom/errname v1.1.0 // indirect\n\tgithub.com/Antonboom/nilnil v1.1.0 // indirect\n\tgithub.com/Antonboom/testifylint v1.6.0 // indirect\n\tgithub.com/BurntSushi/toml v1.5.0 // indirect\n\tgithub.com/Crocmagnon/fatcontext v0.7.1 // indirect\n\tgithub.com/Djarvur/go-err113 v0.0.0-20210108212216-aea10b59be24 // indirect\n\tgithub.com/GaijinEntertainment/go-exhaustruct/v3 v3.3.1 // indirect\n\tgithub.com/Masterminds/semver/v3 v3.3.1 // indirect\n\tgithub.com/OpenPeeDeeP/depguard/v2 v2.2.1 // indirect\n\tgithub.com/alecthomas/go-check-sumtype v0.3.1 // indirect\n\tgithub.com/alexkohler/nakedret/v2 v2.0.5 // indirect\n\tgithub.com/alexkohler/prealloc v1.0.0 // indirect\n\tgithub.com/alingse/asasalint v0.0.11 // indirect\n\tgithub.com/alingse/nilnesserr v0.1.2 // indirect\n\tgithub.com/ashanbrown/forbidigo v1.6.0 // indirect\n\tgithub.com/ashanbrown/makezero v1.2.0 // indirect\n\tgithub.com/aymanbagabas/go-osc52/v2 v2.0.1 // indirect\n\tgithub.com/beorn7/perks v1.0.1 // indirect\n\tgithub.com/bkielbasa/cyclop v1.2.3 // indirect\n\tgithub.com/blizzy78/varnamelen v0.8.0 // indirect\n\tgithub.com/bombsimon/wsl/v4 v4.6.0 // indirect\n\tgithub.com/breml/bidichk v0.3.3 // indirect\n\tgithub.com/breml/errchkjson v0.4.1 // indirect\n\tgithub.com/butuzov/ireturn v0.3.1 // indirect\n\tgithub.com/butuzov/mirror v1.3.0 // indirect\n\tgithub.com/catenacyber/perfsprint v0.9.1 // indirect\n\tgithub.com/ccojocar/zxcvbn-go v1.0.2 // indirect\n\tgithub.com/cespare/xxhash/v2 v2.3.0 // indirect\n\tgithub.com/charithe/durationcheck v0.0.10 // 
indirect\n\tgithub.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc // indirect\n\tgithub.com/charmbracelet/lipgloss v1.1.0 // indirect\n\tgithub.com/charmbracelet/x/ansi v0.8.0 // indirect\n\tgithub.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd // indirect\n\tgithub.com/charmbracelet/x/term v0.2.1 // indirect\n\tgithub.com/chavacava/garif v0.1.0 // indirect\n\tgithub.com/ckaznocha/intrange v0.3.1 // indirect\n\tgithub.com/curioswitch/go-reassign v0.3.0 // indirect\n\tgithub.com/daixiang0/gci v0.13.6 // indirect\n\tgithub.com/dave/dst v0.27.3 // indirect\n\tgithub.com/davecgh/go-spew v1.1.1 // indirect\n\tgithub.com/denis-tingaikin/go-header v0.5.0 // indirect\n\tgithub.com/ettle/strcase v0.2.0 // indirect\n\tgithub.com/fatih/color v1.18.0 // indirect\n\tgithub.com/fatih/structtag v1.2.0 // indirect\n\tgithub.com/firefart/nonamedreturns v1.0.5 // indirect\n\tgithub.com/fsnotify/fsnotify v1.5.4 // indirect\n\tgithub.com/fzipp/gocyclo v0.6.0 // indirect\n\tgithub.com/ghostiam/protogetter v0.3.12 // indirect\n\tgithub.com/go-critic/go-critic v0.13.0 // indirect\n\tgithub.com/go-toolsmith/astcast v1.1.0 // indirect\n\tgithub.com/go-toolsmith/astcopy v1.1.0 // indirect\n\tgithub.com/go-toolsmith/astequal v1.2.0 // indirect\n\tgithub.com/go-toolsmith/astfmt v1.1.0 // indirect\n\tgithub.com/go-toolsmith/astp v1.1.0 // indirect\n\tgithub.com/go-toolsmith/strparse v1.1.0 // indirect\n\tgithub.com/go-toolsmith/typep v1.1.0 // indirect\n\tgithub.com/go-viper/mapstructure/v2 v2.4.0 // indirect\n\tgithub.com/go-xmlfmt/xmlfmt v1.1.3 // indirect\n\tgithub.com/gobwas/glob v0.2.3 // indirect\n\tgithub.com/gofrs/flock v0.12.1 // indirect\n\tgithub.com/golang/protobuf v1.5.3 // indirect\n\tgithub.com/golangci/dupl v0.0.0-20250308024227-f665c8d69b32 // indirect\n\tgithub.com/golangci/go-printf-func-name v0.1.0 // indirect\n\tgithub.com/golangci/gofmt v0.0.0-20250106114630-d62b90e6713d // indirect\n\tgithub.com/golangci/golines 
v0.0.0-20250217134842-442fd0091d95 // indirect\n\tgithub.com/golangci/misspell v0.6.0 // indirect\n\tgithub.com/golangci/plugin-module-register v0.1.1 // indirect\n\tgithub.com/golangci/revgrep v0.8.0 // indirect\n\tgithub.com/golangci/unconvert v0.0.0-20240309020433-c5143eacb3ed // indirect\n\tgithub.com/google/go-cmp v0.7.0 // indirect\n\tgithub.com/gordonklaus/ineffassign v0.1.0 // indirect\n\tgithub.com/gostaticanalysis/analysisutil v0.7.1 // indirect\n\tgithub.com/gostaticanalysis/comment v1.5.0 // indirect\n\tgithub.com/gostaticanalysis/forcetypeassert v0.2.0 // indirect\n\tgithub.com/gostaticanalysis/nilerr v0.1.1 // indirect\n\tgithub.com/hashicorp/go-immutable-radix/v2 v2.1.0 // indirect\n\tgithub.com/hashicorp/go-version v1.7.0 // indirect\n\tgithub.com/hashicorp/golang-lru/v2 v2.0.7 // indirect\n\tgithub.com/hashicorp/hcl v1.0.0 // indirect\n\tgithub.com/hexops/gotextdiff v1.0.3 // indirect\n\tgithub.com/inconshreveable/mousetrap v1.1.0 // indirect\n\tgithub.com/jgautheron/goconst v1.7.1 // indirect\n\tgithub.com/jingyugao/rowserrcheck v1.1.1 // indirect\n\tgithub.com/jjti/go-spancheck v0.6.4 // indirect\n\tgithub.com/julz/importas v0.2.0 // indirect\n\tgithub.com/karamaru-alpha/copyloopvar v1.2.1 // indirect\n\tgithub.com/kisielk/errcheck v1.9.0 // indirect\n\tgithub.com/kkHAIKE/contextcheck v1.1.6 // indirect\n\tgithub.com/kulti/thelper v0.6.3 // indirect\n\tgithub.com/kunwardeep/paralleltest v1.0.10 // indirect\n\tgithub.com/lasiar/canonicalheader v1.1.2 // indirect\n\tgithub.com/ldez/exptostd v0.4.2 // indirect\n\tgithub.com/ldez/gomoddirectives v0.6.1 // indirect\n\tgithub.com/ldez/grignotin v0.9.0 // indirect\n\tgithub.com/ldez/tagliatelle v0.7.1 // indirect\n\tgithub.com/ldez/usetesting v0.4.2 // indirect\n\tgithub.com/leonklingele/grouper v1.1.2 // indirect\n\tgithub.com/lucasb-eyer/go-colorful v1.2.0 // indirect\n\tgithub.com/macabu/inamedparam v0.2.0 // indirect\n\tgithub.com/magiconair/properties v1.8.6 // 
indirect\n\tgithub.com/maratori/testableexamples v1.0.0 // indirect\n\tgithub.com/maratori/testpackage v1.1.1 // indirect\n\tgithub.com/matoous/godox v1.1.0 // indirect\n\tgithub.com/mattn/go-colorable v0.1.14 // indirect\n\tgithub.com/mattn/go-isatty v0.0.20 // indirect\n\tgithub.com/mattn/go-runewidth v0.0.16 // indirect\n\tgithub.com/matttproud/golang_protobuf_extensions v1.0.1 // indirect\n\tgithub.com/mgechev/revive v1.7.0 // indirect\n\tgithub.com/mitchellh/go-homedir v1.1.0 // indirect\n\tgithub.com/mitchellh/mapstructure v1.5.0 // indirect\n\tgithub.com/moricho/tparallel v0.3.2 // indirect\n\tgithub.com/muesli/termenv v0.16.0 // indirect\n\tgithub.com/nakabonne/nestif v0.3.1 // indirect\n\tgithub.com/nishanths/exhaustive v0.12.0 // indirect\n\tgithub.com/nishanths/predeclared v0.2.2 // indirect\n\tgithub.com/nunnatsa/ginkgolinter v0.19.1 // indirect\n\tgithub.com/olekukonko/tablewriter v0.0.5 // indirect\n\tgithub.com/pelletier/go-toml v1.9.5 // indirect\n\tgithub.com/pelletier/go-toml/v2 v2.2.3 // indirect\n\tgithub.com/pmezard/go-difflib v1.0.0 // indirect\n\tgithub.com/polyfloyd/go-errorlint v1.7.1 // indirect\n\tgithub.com/prometheus/client_golang v1.12.1 // indirect\n\tgithub.com/prometheus/client_model v0.2.0 // indirect\n\tgithub.com/prometheus/common v0.32.1 // indirect\n\tgithub.com/prometheus/procfs v0.7.3 // indirect\n\tgithub.com/quasilyte/go-ruleguard v0.4.4 // indirect\n\tgithub.com/quasilyte/go-ruleguard/dsl v0.3.22 // indirect\n\tgithub.com/quasilyte/gogrep v0.5.0 // indirect\n\tgithub.com/quasilyte/regex/syntax v0.0.0-20210819130434-b3f0c404a727 // indirect\n\tgithub.com/quasilyte/stdinfo v0.0.0-20220114132959-f7386bf02567 // indirect\n\tgithub.com/raeperd/recvcheck v0.2.0 // indirect\n\tgithub.com/rivo/uniseg v0.4.7 // indirect\n\tgithub.com/rogpeppe/go-internal v1.14.1 // indirect\n\tgithub.com/ryancurrah/gomodguard v1.4.1 // indirect\n\tgithub.com/ryanrolds/sqlclosecheck v0.5.1 // indirect\n\tgithub.com/sanposhiho/wastedassign/v2 v2.1.0 
// indirect\n\tgithub.com/santhosh-tekuri/jsonschema/v6 v6.0.1 // indirect\n\tgithub.com/sashamelentyev/interfacebloat v1.1.0 // indirect\n\tgithub.com/sashamelentyev/usestdlibvars v1.28.0 // indirect\n\tgithub.com/securego/gosec/v2 v2.22.2 // indirect\n\tgithub.com/sirupsen/logrus v1.9.3 // indirect\n\tgithub.com/sivchari/containedctx v1.0.3 // indirect\n\tgithub.com/sonatard/noctx v0.1.0 // indirect\n\tgithub.com/sourcegraph/go-diff v0.7.0 // indirect\n\tgithub.com/spf13/afero v1.12.0 // indirect\n\tgithub.com/spf13/cast v1.5.0 // indirect\n\tgithub.com/spf13/cobra v1.9.1 // indirect\n\tgithub.com/spf13/jwalterweatherman v1.1.0 // indirect\n\tgithub.com/spf13/pflag v1.0.6 // indirect\n\tgithub.com/spf13/viper v1.12.0 // indirect\n\tgithub.com/ssgreg/nlreturn/v2 v2.2.1 // indirect\n\tgithub.com/stbenjam/no-sprintf-host-port v0.2.0 // indirect\n\tgithub.com/stretchr/objx v0.5.2 // indirect\n\tgithub.com/stretchr/testify v1.10.0 // indirect\n\tgithub.com/subosito/gotenv v1.4.1 // indirect\n\tgithub.com/tdakkota/asciicheck v0.4.1 // indirect\n\tgithub.com/tetafro/godot v1.5.0 // indirect\n\tgithub.com/timakin/bodyclose v0.0.0-20241222091800-1db5c5ca4d67 // indirect\n\tgithub.com/timonwong/loggercheck v0.10.1 // indirect\n\tgithub.com/tomarrell/wrapcheck/v2 v2.10.0 // indirect\n\tgithub.com/tommy-muehle/go-mnd/v2 v2.5.1 // indirect\n\tgithub.com/ultraware/funlen v0.2.0 // indirect\n\tgithub.com/ultraware/whitespace v0.2.0 // indirect\n\tgithub.com/uudashr/gocognit v1.2.0 // indirect\n\tgithub.com/uudashr/iface v1.3.1 // indirect\n\tgithub.com/xen0n/gosmopolitan v1.3.0 // indirect\n\tgithub.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e // indirect\n\tgithub.com/yagipy/maintidx v1.0.0 // indirect\n\tgithub.com/yeya24/promlinter v0.3.0 // indirect\n\tgithub.com/ykadowak/zerologlint v0.1.5 // indirect\n\tgitlab.com/bosi/decorder v0.4.2 // indirect\n\tgo-simpler.org/musttag v0.13.0 // indirect\n\tgo-simpler.org/sloglint v0.9.0 // indirect\n\tgo.uber.org/atomic v1.7.0 
// indirect\n\tgo.uber.org/automaxprocs v1.6.0 // indirect\n\tgo.uber.org/multierr v1.6.0 // indirect\n\tgo.uber.org/zap v1.24.0 // indirect\n\tgolang.org/x/exp/typeparams v0.0.0-20250210185358-939b2ce775ac // indirect\n\tgolang.org/x/mod v0.24.0 // indirect\n\tgolang.org/x/sync v0.12.0 // indirect\n\tgolang.org/x/sys v0.31.0 // indirect\n\tgolang.org/x/text v0.23.0 // indirect\n\tgoogle.golang.org/protobuf v1.36.5 // indirect\n\tgopkg.in/ini.v1 v1.67.0 // indirect\n\tgopkg.in/yaml.v2 v2.4.0 // indirect\n\tgopkg.in/yaml.v3 v3.0.1 // indirect\n\thonnef.co/go/tools v0.6.1 // indirect\n\tmvdan.cc/gofumpt v0.7.0 // indirect\n\tmvdan.cc/unparam v0.0.0-20250301125049-0df0534333a4 // indirect\n)\n"
  },
  {
    "path": "tools/go.sum",
    "content": "4d63.com/gocheckcompilerdirectives v1.3.0 h1:Ew5y5CtcAAQeTVKUVFrE7EwHMrTO6BggtEj8BZSjZ3A=\n4d63.com/gocheckcompilerdirectives v1.3.0/go.mod h1:ofsJ4zx2QAuIP/NO/NAh1ig6R1Fb18/GI7RVMwz7kAY=\n4d63.com/gochecknoglobals v0.2.2 h1:H1vdnwnMaZdQW/N+NrkT1SZMTBmcwHe9Vq8lJcYYTtU=\n4d63.com/gochecknoglobals v0.2.2/go.mod h1:lLxwTQjL5eIesRbvnzIP3jZtG140FnTdz+AlMa+ogt0=\ncloud.google.com/go v0.26.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=\ncloud.google.com/go v0.34.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=\ncloud.google.com/go v0.38.0/go.mod h1:990N+gfupTy94rShfmMCWGDn0LpTmnzTp2qbd1dvSRU=\ncloud.google.com/go v0.44.1/go.mod h1:iSa0KzasP4Uvy3f1mN/7PiObzGgflwredwwASm/v6AU=\ncloud.google.com/go v0.44.2/go.mod h1:60680Gw3Yr4ikxnPRS/oxxkBccT6SA1yMk63TGekxKY=\ncloud.google.com/go v0.45.1/go.mod h1:RpBamKRgapWJb87xiFSdk4g1CME7QZg3uwTez+TSTjc=\ncloud.google.com/go v0.46.3/go.mod h1:a6bKKbmY7er1mI7TEI4lsAkts/mkhTSZK8w33B4RAg0=\ncloud.google.com/go v0.50.0/go.mod h1:r9sluTvynVuxRIOHXQEHMFffphuXHOMZMycpNR5e6To=\ncloud.google.com/go v0.52.0/go.mod h1:pXajvRH/6o3+F9jDHZWQ5PbGhn+o8w9qiu/CffaVdO4=\ncloud.google.com/go v0.53.0/go.mod h1:fp/UouUEsRkN6ryDKNW/Upv/JBKnv6WDthjR6+vze6M=\ncloud.google.com/go v0.54.0/go.mod h1:1rq2OEkV3YMf6n/9ZvGWI3GWw0VoqH/1x2nd8Is/bPc=\ncloud.google.com/go v0.56.0/go.mod h1:jr7tqZxxKOVYizybht9+26Z/gUq7tiRzu+ACVAMbKVk=\ncloud.google.com/go v0.57.0/go.mod h1:oXiQ6Rzq3RAkkY7N6t3TcE6jE+CIBBbA36lwQ1JyzZs=\ncloud.google.com/go v0.62.0/go.mod h1:jmCYTdRCQuc1PHIIJ/maLInMho30T/Y0M4hTdTShOYc=\ncloud.google.com/go v0.65.0/go.mod h1:O5N8zS7uWy9vkA9vayVHs65eM1ubvY4h553ofrNHObY=\ncloud.google.com/go/bigquery v1.0.1/go.mod h1:i/xbL2UlR5RvWAURpBYZTtm/cXjCha9lbfbpx4poX+o=\ncloud.google.com/go/bigquery v1.3.0/go.mod h1:PjpwJnslEMmckchkHFfq+HTD2DmtT67aNFKH1/VBDHE=\ncloud.google.com/go/bigquery v1.4.0/go.mod h1:S8dzgnTigyfTmLBfrtrhyYhwRxG72rYxvftPBK2Dvzc=\ncloud.google.com/go/bigquery v1.5.0/go.mod 
h1:snEHRnqQbz117VIFhE8bmtwIDY80NLUZUMb4Nv6dBIg=\ncloud.google.com/go/bigquery v1.7.0/go.mod h1://okPTzCYNXSlb24MZs83e2Do+h+VXtc4gLoIoXIAPc=\ncloud.google.com/go/bigquery v1.8.0/go.mod h1:J5hqkt3O0uAFnINi6JXValWIb1v0goeZM77hZzJN/fQ=\ncloud.google.com/go/datastore v1.0.0/go.mod h1:LXYbyblFSglQ5pkeyhO+Qmw7ukd3C+pD7TKLgZqpHYE=\ncloud.google.com/go/datastore v1.1.0/go.mod h1:umbIZjpQpHh4hmRpGhH4tLFup+FVzqBi1b3c64qFpCk=\ncloud.google.com/go/pubsub v1.0.1/go.mod h1:R0Gpsv3s54REJCy4fxDixWD93lHJMoZTyQ2kNxGRt3I=\ncloud.google.com/go/pubsub v1.1.0/go.mod h1:EwwdRX2sKPjnvnqCa270oGRyludottCI76h+R3AArQw=\ncloud.google.com/go/pubsub v1.2.0/go.mod h1:jhfEVHT8odbXTkndysNHCcx0awwzvfOlguIAii9o8iA=\ncloud.google.com/go/pubsub v1.3.1/go.mod h1:i+ucay31+CNRpDW4Lu78I4xXG+O1r/MAHgjpRVR+TSU=\ncloud.google.com/go/storage v1.0.0/go.mod h1:IhtSnM/ZTZV8YYJWCY8RULGVqBDmpoyjwiyrjsg+URw=\ncloud.google.com/go/storage v1.5.0/go.mod h1:tpKbwo567HUNpVclU5sGELwQWBDZ8gh0ZeosJ0Rtdos=\ncloud.google.com/go/storage v1.6.0/go.mod h1:N7U0C8pVQ/+NIKOBQyamJIeKQKkZ+mxpohlUTyfDhBk=\ncloud.google.com/go/storage v1.8.0/go.mod h1:Wv1Oy7z6Yz3DshWRJFhqM/UCfaWIRTdp0RXyy7KQOVs=\ncloud.google.com/go/storage v1.10.0/go.mod h1:FLPqc6j+Ki4BU591ie1oL6qBQGu2Bl/tZ9ullr3+Kg0=\ndmitri.shuralyov.com/gpu/mtl v0.0.0-20190408044501-666a987793e9/go.mod h1:H6x//7gZCb22OMCxBHrMx7a5I7Hp++hsVxbQ4BYO7hU=\ngithub.com/4meepo/tagalign v1.4.2 h1:0hcLHPGMjDyM1gHG58cS73aQF8J4TdVR96TZViorO9E=\ngithub.com/4meepo/tagalign v1.4.2/go.mod h1:+p4aMyFM+ra7nb41CnFG6aSDXqRxU/w1VQqScKqDARI=\ngithub.com/Abirdcfly/dupword v0.1.3 h1:9Pa1NuAsZvpFPi9Pqkd93I7LIYRURj+A//dFd5tgBeE=\ngithub.com/Abirdcfly/dupword v0.1.3/go.mod h1:8VbB2t7e10KRNdwTVoxdBaxla6avbhGzb8sCTygUMhw=\ngithub.com/Antonboom/errname v1.1.0 h1:A+ucvdpMwlo/myWrkHEUEBWc/xuXdud23S8tmTb/oAE=\ngithub.com/Antonboom/errname v1.1.0/go.mod h1:O1NMrzgUcVBGIfi3xlVuvX8Q/VP/73sseCaAppfjqZw=\ngithub.com/Antonboom/nilnil v1.1.0 h1:jGxJxjgYS3VUUtOTNk8Z1icwT5ESpLH/426fjmQG+ng=\ngithub.com/Antonboom/nilnil 
v1.1.0/go.mod h1:b7sAlogQjFa1wV8jUW3o4PMzDVFLbTux+xnQdvzdcIE=
github.com/Antonboom/testifylint v1.6.0 h1:6rdILVPt4+rqcvhid8w9wJNynKLUgqHNpFyM67UeXyc=
github.com/Antonboom/testifylint v1.6.0/go.mod h1:k+nEkathI2NFjKO6HvwmSrbzUcQ6FAnbZV+ZRrnXPLI=
github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=
github.com/BurntSushi/toml v1.5.0 h1:W5quZX/G/csjUnuI8SUYlsHs9M38FC7znL0lIO+DvMg=
github.com/BurntSushi/toml v1.5.0/go.mod h1:ukJfTF/6rtPPRCnwkur4qwRxa8vTRFBF0uk2lLoLwho=
github.com/BurntSushi/xgb v0.0.0-20160522181843-27f122750802/go.mod h1:IVnqGOEym/WlBOVXweHU+Q+/VP0lqqI8lqeDx9IjBqo=
github.com/Crocmagnon/fatcontext v0.7.1 h1:SC/VIbRRZQeQWj/TcQBS6JmrXcfA+BU4OGSVUt54PjM=
github.com/Crocmagnon/fatcontext v0.7.1/go.mod h1:1wMvv3NXEBJucFGfwOJBxSVWcoIO6emV215SMkW9MFU=
github.com/Djarvur/go-err113 v0.0.0-20210108212216-aea10b59be24 h1:sHglBQTwgx+rWPdisA5ynNEsoARbiCBOyGcJM4/OzsM=
github.com/Djarvur/go-err113 v0.0.0-20210108212216-aea10b59be24/go.mod h1:4UJr5HIiMZrwgkSPdsjy2uOQExX/WEILpIrO9UPGuXs=
github.com/GaijinEntertainment/go-exhaustruct/v3 v3.3.1 h1:Sz1JIXEcSfhz7fUi7xHnhpIE0thVASYjvosApmHuD2k=
github.com/GaijinEntertainment/go-exhaustruct/v3 v3.3.1/go.mod h1:n/LSCXNuIYqVfBlVXyHfMQkZDdp1/mmxfSjADd3z1Zg=
github.com/Masterminds/semver/v3 v3.3.1 h1:QtNSWtVZ3nBfk8mAOu/B6v7FMJ+NHTIgUPi7rj+4nv4=
github.com/Masterminds/semver/v3 v3.3.1/go.mod h1:4V+yj/TJE1HU9XfppCwVMZq3I84lprf4nC11bSS5beM=
github.com/OpenPeeDeeP/depguard/v2 v2.2.1 h1:vckeWVESWp6Qog7UZSARNqfu/cZqvki8zsuj3piCMx4=
github.com/OpenPeeDeeP/depguard/v2 v2.2.1/go.mod h1:q4DKzC4UcVaAvcfd41CZh0PWpGgzrVxUYBlgKNGquUo=
github.com/alecthomas/assert/v2 v2.11.0 h1:2Q9r3ki8+JYXvGsDyBXwH3LcJ+WK5D0gc5E8vS6K3D0=
github.com/alecthomas/assert/v2 v2.11.0/go.mod h1:Bze95FyfUr7x34QZrjL+XP+0qgp/zg8yS+TtBj1WA3k=
github.com/alecthomas/go-check-sumtype v0.3.1 h1:u9aUvbGINJxLVXiFvHUlPEaD7VDULsrxJb4Aq31NLkU=
github.com/alecthomas/go-check-sumtype v0.3.1/go.mod h1:A8TSiN3UPRw3laIgWEUOHHLPa6/r9MtoigdlP5h3K/E=
github.com/alecthomas/repr v0.4.0 h1:GhI2A8MACjfegCPVq9f1FLvIBS+DrQ2KQBFZP1iFzXc=
github.com/alecthomas/repr v0.4.0/go.mod h1:Fr0507jx4eOXV7AlPV6AVZLYrLIuIeSOWtW57eE/O/4=
github.com/alecthomas/template v0.0.0-20160405071501-a0175ee3bccc/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/template v0.0.0-20190718012654-fb15b899a751/go.mod h1:LOuyumcjzFXgccqObfd/Ljyb9UuFJ6TxHnclSeseNhc=
github.com/alecthomas/units v0.0.0-20151022065526-2efee857e7cf/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
github.com/alecthomas/units v0.0.0-20190717042225-c3de453c63f4/go.mod h1:ybxpYRFXyAe+OPACYpWeL0wqObRcbAqCMya13uyzqw0=
github.com/alecthomas/units v0.0.0-20190924025748-f65c72e2690d/go.mod h1:rBZYJk541a8SKzHPHnH3zbiI+7dagKZ0cgpgrD7Fyho=
github.com/alexkohler/nakedret/v2 v2.0.5 h1:fP5qLgtwbx9EJE8dGEERT02YwS8En4r9nnZ71RK+EVU=
github.com/alexkohler/nakedret/v2 v2.0.5/go.mod h1:bF5i0zF2Wo2o4X4USt9ntUWve6JbFv02Ff4vlkmS/VU=
github.com/alexkohler/prealloc v1.0.0 h1:Hbq0/3fJPQhNkN0dR95AVrr6R7tou91y0uHG5pOcUuw=
github.com/alexkohler/prealloc v1.0.0/go.mod h1:VetnK3dIgFBBKmg0YnD9F9x6Icjd+9cvfHR56wJVlKE=
github.com/alingse/asasalint v0.0.11 h1:SFwnQXJ49Kx/1GghOFz1XGqHYKp21Kq1nHad/0WQRnw=
github.com/alingse/asasalint v0.0.11/go.mod h1:nCaoMhw7a9kSJObvQyVzNTPBDbNpdocqrSP7t/cW5+I=
github.com/alingse/nilnesserr v0.1.2 h1:Yf8Iwm3z2hUUrP4muWfW83DF4nE3r1xZ26fGWUKCZlo=
github.com/alingse/nilnesserr v0.1.2/go.mod h1:1xJPrXonEtX7wyTq8Dytns5P2hNzoWymVUIaKm4HNFg=
github.com/ashanbrown/forbidigo v1.6.0 h1:D3aewfM37Yb3pxHujIPSpTf6oQk9sc9WZi8gerOIVIY=
github.com/ashanbrown/forbidigo v1.6.0/go.mod h1:Y8j9jy9ZYAEHXdu723cUlraTqbzjKF1MUyfOKL+AjcU=
github.com/ashanbrown/makezero v1.2.0 h1:/2Lp1bypdmK9wDIq7uWBlDF1iMUpIIS4A+pF6C9IEUU=
github.com/ashanbrown/makezero v1.2.0/go.mod h1:dxlPhHbDMC6N6xICzFBSK+4njQDdK8euNO0qjQMtGY4=
github.com/aymanbagabas/go-osc52/v2 v2.0.1 h1:HwpRHbFMcZLEVr42D4p7XBqjyuxQH5SMiErDT4WkJ2k=
github.com/aymanbagabas/go-osc52/v2 v2.0.1/go.mod h1:uYgXzlJ7ZpABp8OJ+exZzJJhRNQ2ASbcXHWsFqH8hp8=
github.com/benbjohnson/clock v1.1.0 h1:Q92kusRqC1XV2MjkWETPvjJVqKetz1OzxZB7mHJLju8=
github.com/benbjohnson/clock v1.1.0/go.mod h1:J11/hYXuz8f4ySSvYwY0FKfm+ezbsZBKZxNJlLklBHA=
github.com/beorn7/perks v0.0.0-20180321164747-3a771d992973/go.mod h1:Dwedo/Wpr24TaqPxmxbtue+5NUziq4I4S80YR8gNf3Q=
github.com/beorn7/perks v1.0.0/go.mod h1:KWe93zE9D1o94FZ5RNwFwVgaQK1VOXiVxmqh+CedLV8=
github.com/beorn7/perks v1.0.1 h1:VlbKKnNfV8bJzeqoa4cOKqO6bYr3WgKZxO8Z16+hsOM=
github.com/beorn7/perks v1.0.1/go.mod h1:G2ZrVWU2WbWT9wwq4/hrbKbnv/1ERSJQ0ibhJ6rlkpw=
github.com/bkielbasa/cyclop v1.2.3 h1:faIVMIGDIANuGPWH031CZJTi2ymOQBULs9H21HSMa5w=
github.com/bkielbasa/cyclop v1.2.3/go.mod h1:kHTwA9Q0uZqOADdupvcFJQtp/ksSnytRMe8ztxG8Fuo=
github.com/blizzy78/varnamelen v0.8.0 h1:oqSblyuQvFsW1hbBHh1zfwrKe3kcSj0rnXkKzsQ089M=
github.com/blizzy78/varnamelen v0.8.0/go.mod h1:V9TzQZ4fLJ1DSrjVDfl89H7aMnTvKkApdHeyESmyR7k=
github.com/bombsimon/wsl/v4 v4.6.0 h1:ew2R/N42su553DKTYqt3HSxaQN+uHQPv4xZ2MBmwaW4=
github.com/bombsimon/wsl/v4 v4.6.0/go.mod h1:uV/+6BkffuzSAVYD+yGyld1AChO7/EuLrCF/8xTiapg=
github.com/breml/bidichk v0.3.3 h1:WSM67ztRusf1sMoqH6/c4OBCUlRVTKq+CbSeo0R17sE=
github.com/breml/bidichk v0.3.3/go.mod h1:ISbsut8OnjB367j5NseXEGGgO/th206dVa427kR8YTE=
github.com/breml/errchkjson v0.4.1 h1:keFSS8D7A2T0haP9kzZTi7o26r7kE3vymjZNeNDRDwg=
github.com/breml/errchkjson v0.4.1/go.mod h1:a23OvR6Qvcl7DG/Z4o0el6BRAjKnaReoPQFciAl9U3s=
github.com/butuzov/ireturn v0.3.1 h1:mFgbEI6m+9W8oP/oDdfA34dLisRFCj2G6o/yiI1yZrY=
github.com/butuzov/ireturn v0.3.1/go.mod h1:ZfRp+E7eJLC0NQmk1Nrm1LOrn/gQlOykv+cVPdiXH5M=
github.com/butuzov/mirror v1.3.0 h1:HdWCXzmwlQHdVhwvsfBb2Au0r3HyINry3bDWLYXiKoc=
github.com/butuzov/mirror v1.3.0/go.mod h1:AEij0Z8YMALaq4yQj9CPPVYOyJQyiexpQEQgihajRfI=
github.com/catenacyber/perfsprint v0.9.1 h1:5LlTp4RwTooQjJCvGEFV6XksZvWE7wCOUvjD2z0vls0=
github.com/catenacyber/perfsprint v0.9.1/go.mod h1:q//VWC2fWbcdSLEY1R3l8n0zQCDPdE4IjZwyY1HMunM=
github.com/ccojocar/zxcvbn-go v1.0.2 h1:na/czXU8RrhXO4EZme6eQJLR4PzcGsahsBOAwU6I3Vg=
github.com/ccojocar/zxcvbn-go v1.0.2/go.mod h1:g1qkXtUSvHP8lhHp5GrSmTz6uWALGRMQdw6Qnz/hi60=
github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
github.com/cespare/xxhash/v2 v2.1.1/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/cespare/xxhash/v2 v2.1.2/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/cespare/xxhash/v2 v2.3.0 h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=
github.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/charithe/durationcheck v0.0.10 h1:wgw73BiocdBDQPik+zcEoBG/ob8uyBHf2iyoHGPf5w4=
github.com/charithe/durationcheck v0.0.10/go.mod h1:bCWXb7gYRysD1CU3C+u4ceO49LoGOY1C1L6uouGNreQ=
github.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc h1:4pZI35227imm7yK2bGPcfpFEmuY1gc2YSTShr4iJBfs=
github.com/charmbracelet/colorprofile v0.2.3-0.20250311203215-f60798e515dc/go.mod h1:X4/0JoqgTIPSFcRA/P6INZzIuyqdFY5rm8tb41s9okk=
github.com/charmbracelet/lipgloss v1.1.0 h1:vYXsiLHVkK7fp74RkV7b2kq9+zDLoEU4MZoFqR/noCY=
github.com/charmbracelet/lipgloss v1.1.0/go.mod h1:/6Q8FR2o+kj8rz4Dq0zQc3vYf7X+B0binUUBwA0aL30=
github.com/charmbracelet/x/ansi v0.8.0 h1:9GTq3xq9caJW8ZrBTe0LIe2fvfLR/bYXKTx2llXn7xE=
github.com/charmbracelet/x/ansi v0.8.0/go.mod h1:wdYl/ONOLHLIVmQaxbIYEC/cRKOQyjTkowiI4blgS9Q=
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd h1:vy0GVL4jeHEwG5YOXDmi86oYw2yuYUGqz6a8sLwg0X8=
github.com/charmbracelet/x/cellbuf v0.0.13-0.20250311204145-2c3ea96c31dd/go.mod h1:xe0nKWGd3eJgtqZRaN9RjMtK7xUYchjzPr7q6kcvCCs=
github.com/charmbracelet/x/term v0.2.1 h1:AQeHeLZ1OqSXhrAWpYUtZyX1T3zVxfpZuEQMIQaGIAQ=
github.com/charmbracelet/x/term v0.2.1/go.mod h1:oQ4enTYFV7QN4m0i9mzHrViD7TQKvNEEkHUMCmsxdUg=
github.com/chavacava/garif v0.1.0 h1:2JHa3hbYf5D9dsgseMKAmc/MZ109otzgNFk5s87H9Pc=
github.com/chavacava/garif v0.1.0/go.mod h1:XMyYCkEL58DF0oyW4qDjjnPWONs2HBqYKI+UIPD+Gww=
github.com/chzyer/logex v1.1.10/go.mod h1:+Ywpsq7O8HXn0nuIou7OrIPyXbp3wmkHB+jjWRnGsAI=
github.com/chzyer/readline v0.0.0-20180603132655-2972be24d48e/go.mod h1:nSuG5e5PlCu98SY8svDHJxuZscDgtXS6KTTbou5AhLI=
github.com/chzyer/test v0.0.0-20180213035817-a1ea475d72b1/go.mod h1:Q3SI9o4m/ZMnBNeIyt5eFwwo7qiLfzFZmjNmxjkiQlU=
github.com/ckaznocha/intrange v0.3.1 h1:j1onQyXvHUsPWujDH6WIjhyH26gkRt/txNlV7LspvJs=
github.com/ckaznocha/intrange v0.3.1/go.mod h1:QVepyz1AkUoFQkpEqksSYpNpUo3c5W7nWh/s6SHIJJk=
github.com/client9/misspell v0.3.4/go.mod h1:qj6jICC3Q7zFZvVWo7KLAzC3yx5G7kyvSDkc90ppPyw=
github.com/cncf/udpa/go v0.0.0-20191209042840-269d4d468f6f/go.mod h1:M8M6+tZqaGXZJjfX53e64911xZQV5JYwmTeXPW+k8Sc=
github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
github.com/curioswitch/go-reassign v0.3.0 h1:dh3kpQHuADL3cobV/sSGETA8DOv457dwl+fbBAhrQPs=
github.com/curioswitch/go-reassign v0.3.0/go.mod h1:nApPCCTtqLJN/s8HfItCcKV0jIPwluBOvZP+dsJGA88=
github.com/daixiang0/gci v0.13.6 h1:RKuEOSkGpSadkGbvZ6hJ4ddItT3cVZ9Vn9Rybk6xjl8=
github.com/daixiang0/gci v0.13.6/go.mod h1:12etP2OniiIdP4q+kjUGrC/rUagga7ODbqsom5Eo5Yk=
github.com/dave/dst v0.27.3 h1:P1HPoMza3cMEquVf9kKy8yXsFirry4zEnWOdYPOoIzY=
github.com/dave/dst v0.27.3/go.mod h1:jHh6EOibnHgcUW3WjKHisiooEkYwqpHLBSX1iOBhEyc=
github.com/dave/jennifer v1.7.1 h1:B4jJJDHelWcDhlRQxWeo0Npa/pYKBLrirAQoTN45txo=
github.com/dave/jennifer v1.7.1/go.mod h1:nXbxhEmQfOZhWml3D1cDK5M1FLnMSozpbFN/m3RmGZc=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/denis-tingaikin/go-header v0.5.0 h1:SRdnP5ZKvcO9KKRP1KJrhFR3RrlGuD+42t4429eC9k8=
github.com/denis-tingaikin/go-header v0.5.0/go.mod h1:mMenU5bWrok6Wl2UsZjy+1okegmwQ3UgWl4V1D8gjlY=
github.com/dlclark/regexp2 v1.11.0 h1:G/nrcoOa7ZXlpoa/91N3X7mM3r8eIlMBBJZvsz/mxKI=
github.com/dlclark/regexp2 v1.11.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/envoyproxy/go-control-plane v0.9.0/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.1-0.20191026205805-5f8ba28d4473/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.4/go.mod h1:6rpuAdCZL397s3pYoYcLgu1mIlRU8Am5FuJP05cCM98=
github.com/envoyproxy/protoc-gen-validate v0.1.0/go.mod h1:iSmxcyjqTsJpI2R4NaDN7+kN2VEUnK/pcBlmesArF7c=
github.com/ettle/strcase v0.2.0 h1:fGNiVF21fHXpX1niBgk0aROov1LagYsOwV/xqKDKR/Q=
github.com/ettle/strcase v0.2.0/go.mod h1:DajmHElDSaX76ITe3/VHVyMin4LWSJN5Z909Wp+ED1A=
github.com/fatih/color v1.18.0 h1:S8gINlzdQ840/4pfAwic/ZE0djQEH3wM94VfqLTZcOM=
github.com/fatih/color v1.18.0/go.mod h1:4FelSpRwEGDpQ12mAdzqdOukCy4u8WUtOY6lkT/6HfU=
github.com/fatih/structtag v1.2.0 h1:/OdNE99OxoI/PqaW/SuSK9uxxT3f/tcSZgon/ssNSx4=
github.com/fatih/structtag v1.2.0/go.mod h1:mBJUNpUnHmRKrKlQQlmCrh5PuhftFbNv8Ys4/aAZl94=
github.com/firefart/nonamedreturns v1.0.5 h1:tM+Me2ZaXs8tfdDw3X6DOX++wMCOqzYUho6tUTYIdRA=
github.com/firefart/nonamedreturns v1.0.5/go.mod h1:gHJjDqhGM4WyPt639SOZs+G89Ko7QKH5R5BhnO6xJhw=
github.com/frankban/quicktest v1.14.3 h1:FJKSZTDHjyhriyC81FLQ0LY93eSai0ZyR/ZIkd3ZUKE=
github.com/frankban/quicktest v1.14.3/go.mod h1:mgiwOwqx65TmIk1wJ6Q7wvnVMocbUorkibMOrVTHZps=
github.com/fsnotify/fsnotify v1.5.4 h1:jRbGcIw6P2Meqdwuo0H1p6JVLbL5DHKAKlYndzMwVZI=
github.com/fsnotify/fsnotify v1.5.4/go.mod h1:OVB6XrOHzAwXMpEM7uPOzcehqUV2UqJxmVXmkdnm1bU=
github.com/fzipp/gocyclo v0.6.0 h1:lsblElZG7d3ALtGMx9fmxeTKZaLLpU8mET09yN4BBLo=
github.com/fzipp/gocyclo v0.6.0/go.mod h1:rXPyn8fnlpa0R2csP/31uerbiVBugk5whMdlyaLkLoA=
github.com/ghostiam/protogetter v0.3.12 h1:xTPjH97iKph27vXRRKV0OCke5sAMoHPbVeVstdzmCLE=
github.com/ghostiam/protogetter v0.3.12/go.mod h1:WZ0nw9pfzsgxuRsPOFQomgDVSWtDLJRfQJEhsGbmQMA=
github.com/go-critic/go-critic v0.13.0 h1:kJzM7wzltQasSUXtYyTl6UaPVySO6GkaR1thFnJ6afY=
github.com/go-critic/go-critic v0.13.0/go.mod h1:M/YeuJ3vOCQDnP2SU+ZhjgRzwzcBW87JqLpMJLrZDLI=
github.com/go-gl/glfw v0.0.0-20190409004039-e6da0acd62b1/go.mod h1:vR7hzQXu2zJy9AVAgeJqvqgH9Q5CA+iKCZ2gyEVpxRU=
github.com/go-gl/glfw/v3.3/glfw v0.0.0-20191125211704-12ad95a8df72/go.mod h1:tQ2UAYgL5IevRw8kRxooKSPJfGvJ9fJQFa0TUsXzTg8=
github.com/go-gl/glfw/v3.3/glfw v0.0.0-20200222043503-6f7a984d4dc4/go.mod h1:tQ2UAYgL5IevRw8kRxooKSPJfGvJ9fJQFa0TUsXzTg8=
github.com/go-kit/kit v0.8.0/go.mod h1:xBxKIO96dXMWWy0MnWVtmwkA9/13aqxPnvrjFYMA2as=
github.com/go-kit/kit v0.9.0/go.mod h1:xBxKIO96dXMWWy0MnWVtmwkA9/13aqxPnvrjFYMA2as=
github.com/go-kit/log v0.1.0/go.mod h1:zbhenjAZHb184qTLMA9ZjW7ThYL0H2mk7Q6pNt4vbaY=
github.com/go-logfmt/logfmt v0.3.0/go.mod h1:Qt1PoO58o5twSAckw1HlFXLmHsOX5/0LbT9GBnD5lWE=
github.com/go-logfmt/logfmt v0.4.0/go.mod h1:3RMwSq7FuexP4Kalkev3ejPJsZTpXXBr9+V4qmtdjCk=
github.com/go-logfmt/logfmt v0.5.0/go.mod h1:wCYkCAKZfumFQihp8CzCvQ3paCTfi41vtzG1KdI/P7A=
github.com/go-logr/logr v1.4.2 h1:6pFjapn8bFcIbiKo3XT4j/BhANplGihG6tvd+8rYgrY=
github.com/go-logr/logr v1.4.2/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=
github.com/go-quicktest/qt v1.101.0 h1:O1K29Txy5P2OK0dGo59b7b0LR6wKfIhttaAhHUyn7eI=
github.com/go-quicktest/qt v1.101.0/go.mod h1:14Bz/f7NwaXPtdYEgzsx46kqSxVwTbzVZsDC26tQJow=
github.com/go-stack/stack v1.8.0/go.mod h1:v0f6uXyyMGvRgIKkXu+yp6POWl0qKG85gN/melR3HDY=
github.com/go-task/slim-sprig/v3 v3.0.0 h1:sUs3vkvUymDpBKi3qH1YSqBQk9+9D/8M2mN1vB6EwHI=
github.com/go-task/slim-sprig/v3 v3.0.0/go.mod h1:W848ghGpv3Qj3dhTPRyJypKRiqCdHZiAzKg9hl15HA8=
github.com/go-toolsmith/astcast v1.1.0 h1:+JN9xZV1A+Re+95pgnMgDboWNVnIMMQXwfBwLRPgSC8=
github.com/go-toolsmith/astcast v1.1.0/go.mod h1:qdcuFWeGGS2xX5bLM/c3U9lewg7+Zu4mr+xPwZIB4ZU=
github.com/go-toolsmith/astcopy v1.1.0 h1:YGwBN0WM+ekI/6SS6+52zLDEf8Yvp3n2seZITCUBt5s=
github.com/go-toolsmith/astcopy v1.1.0/go.mod h1:hXM6gan18VA1T/daUEHCFcYiW8Ai1tIwIzHY6srfEAw=
github.com/go-toolsmith/astequal v1.0.3/go.mod h1:9Ai4UglvtR+4up+bAD4+hCj7iTo4m/OXVTSLnCyTAx4=
github.com/go-toolsmith/astequal v1.1.0/go.mod h1:sedf7VIdCL22LD8qIvv7Nn9MuWJruQA/ysswh64lffQ=
github.com/go-toolsmith/astequal v1.2.0 h1:3Fs3CYZ1k9Vo4FzFhwwewC3CHISHDnVUPC4x0bI2+Cw=
github.com/go-toolsmith/astequal v1.2.0/go.mod h1:c8NZ3+kSFtFY/8lPso4v8LuJjdJiUFVnSuU3s0qrrDY=
github.com/go-toolsmith/astfmt v1.1.0 h1:iJVPDPp6/7AaeLJEruMsBUlOYCmvg0MoCfJprsOmcco=
github.com/go-toolsmith/astfmt v1.1.0/go.mod h1:OrcLlRwu0CuiIBp/8b5PYF9ktGVZUjlNMV634mhwuQ4=
github.com/go-toolsmith/astp v1.1.0 h1:dXPuCl6u2llURjdPLLDxJeZInAeZ0/eZwFJmqZMnpQA=
github.com/go-toolsmith/astp v1.1.0/go.mod h1:0T1xFGz9hicKs8Z5MfAqSUitoUYS30pDMsRVIDHs8CA=
github.com/go-toolsmith/pkgload v1.2.2 h1:0CtmHq/02QhxcF7E9N5LIFcYFsMR5rdovfqTtRKkgIk=
github.com/go-toolsmith/pkgload v1.2.2/go.mod h1:R2hxLNRKuAsiXCo2i5J6ZQPhnPMOVtU+f0arbFPWCus=
github.com/go-toolsmith/strparse v1.0.0/go.mod h1:YI2nUKP9YGZnL/L1/DLFBfixrcjslWct4wyljWhSRy8=
github.com/go-toolsmith/strparse v1.1.0 h1:GAioeZUK9TGxnLS+qfdqNbA4z0SSm5zVNtCQiyP2Bvw=
github.com/go-toolsmith/strparse v1.1.0/go.mod h1:7ksGy58fsaQkGQlY8WVoBFNyEPMGuJin1rfoPS4lBSQ=
github.com/go-toolsmith/typep v1.1.0 h1:fIRYDyF+JywLfqzyhdiHzRop/GQDxxNhLGQ6gFUNHus=
github.com/go-toolsmith/typep v1.1.0/go.mod h1:fVIw+7zjdsMxDA3ITWnH1yOiw1rnTQKCsF/sk2H/qig=
github.com/go-viper/mapstructure/v2 v2.4.0 h1:EBsztssimR/CONLSZZ04E8qAkxNYq4Qp9LvH92wZUgs=
github.com/go-viper/mapstructure/v2 v2.4.0/go.mod h1:oJDH3BJKyqBA2TXFhDsKDGDTlndYOZ6rGS0BRZIxGhM=
github.com/go-xmlfmt/xmlfmt v1.1.3 h1:t8Ey3Uy7jDSEisW2K3somuMKIpzktkWptA0iFCnRUWY=
github.com/go-xmlfmt/xmlfmt v1.1.3/go.mod h1:aUCEOzzezBEjDBbFBoSiya/gduyIiWYRP6CnSFIV8AM=
github.com/gobwas/glob v0.2.3 h1:A4xDbljILXROh+kObIiy5kIaPYD8e96x1tgBhUI5J+Y=
github.com/gobwas/glob v0.2.3/go.mod h1:d3Ez4x06l9bZtSvzIay5+Yzi0fmZzPgnTbPcKjJAkT8=
github.com/gofrs/flock v0.12.1 h1:MTLVXXHf8ekldpJk3AKicLij9MdwOWkZ+a/jHHZby9E=
github.com/gofrs/flock v0.12.1/go.mod h1:9zxTsyu5xtJ9DK+1tFZyibEV7y3uwDxPPfbxeeHCoD0=
github.com/gogo/protobuf v1.1.1/go.mod h1:r8qH/GZQm5c6nD/R0oafs1akxWv10x8SbQlK7atdtwQ=
github.com/golang/glog v0.0.0-20160126235308-23def4e6c14b/go.mod h1:SBH7ygxi8pfUlaOkMMuAQtPIUF8ecWP5IEl/CR7VP2Q=
github.com/golang/groupcache v0.0.0-20190702054246-869f871628b6/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
github.com/golang/groupcache v0.0.0-20191227052852-215e87163ea7/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
github.com/golang/groupcache v0.0.0-20200121045136-8c9f03a8e57e/go.mod h1:cIg4eruTrX1D+g88fzRXU5OdNfaM+9IcxsU14FzY7Hc=
github.com/golang/mock v1.1.1/go.mod h1:oTYuIxOrZwtPieC+H1uAHpcLFnEyAGVDL/k47Jfbm0A=
github.com/golang/mock v1.2.0/go.mod h1:oTYuIxOrZwtPieC+H1uAHpcLFnEyAGVDL/k47Jfbm0A=
github.com/golang/mock v1.3.1/go.mod h1:sBzyDLLjw3U8JLTeZvSv8jJB+tU5PVekmnlKIyFUx0Y=
github.com/golang/mock v1.4.0/go.mod h1:UOMv5ysSaYNkG+OFQykRIcU/QvvxJf3p21QfJ2Bt3cw=
github.com/golang/mock v1.4.1/go.mod h1:UOMv5ysSaYNkG+OFQykRIcU/QvvxJf3p21QfJ2Bt3cw=
github.com/golang/mock v1.4.3/go.mod h1:UOMv5ysSaYNkG+OFQykRIcU/QvvxJf3p21QfJ2Bt3cw=
github.com/golang/mock v1.4.4/go.mod h1:l3mdAwkq5BuhzHwde/uurv3sEJeZMXNpwsxVWU71h+4=
github.com/golang/mock v1.6.0 h1:ErTB+efbowRARo13NNdxyJji2egdxLGQhRaY+DUumQc=
github.com/golang/mock v1.6.0/go.mod h1:p6yTPP+5HYm5mzsMV8JkE6ZKdX+/wYM6Hr+LicevLPs=
github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.3.2/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.3.3/go.mod h1:vzj43D7+SQXF/4pzW/hwtAqwc6iTitCiVSaWz5lYuqw=
github.com/golang/protobuf v1.3.4/go.mod h1:vzj43D7+SQXF/4pzW/hwtAqwc6iTitCiVSaWz5lYuqw=
github.com/golang/protobuf v1.3.5/go.mod h1:6O5/vntMXwX2lRkT1hjjk0nAC1IDOTvTlVgjlRvqsdk=
github.com/golang/protobuf v1.4.0-rc.1/go.mod h1:ceaxUfeHdC40wWswd/P6IGgMaK3YpKi5j83Wpe3EHw8=
github.com/golang/protobuf v1.4.0-rc.1.0.20200221234624-67d41d38c208/go.mod h1:xKAWHe0F5eneWXFV3EuXVDTCmh+JuBKY0li0aMyXATA=
github.com/golang/protobuf v1.4.0-rc.2/go.mod h1:LlEzMj4AhA7rCAGe4KMBDvJI+AwstrUpVNzEA03Pprs=
github.com/golang/protobuf v1.4.0-rc.4.0.20200313231945-b860323f09d0/go.mod h1:WU3c8KckQ9AFe+yFwt9sWVRKCVIyN9cPHBJSNnbL67w=
github.com/golang/protobuf v1.4.0/go.mod h1:jodUvKwWbYaEsadDk5Fwe5c77LiNKVO9IDvqG2KuDX0=
github.com/golang/protobuf v1.4.1/go.mod h1:U8fpvMrcmy5pZrNK1lt4xCsGvpyWQ/VVv6QDs8UjoX8=
github.com/golang/protobuf v1.4.2/go.mod h1:oDoupMAO8OvCJWAcko0GGGIgR6R6ocIYbsSw735rRwI=
github.com/golang/protobuf v1.4.3/go.mod h1:oDoupMAO8OvCJWAcko0GGGIgR6R6ocIYbsSw735rRwI=
github.com/golang/protobuf v1.5.0/go.mod h1:FsONVRAS9T7sI+LIUmWTfcYkHO4aIWwzhcaSAoJOfIk=
github.com/golang/protobuf v1.5.2/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY=
github.com/golang/protobuf v1.5.3 h1:KhyjKVUg7Usr/dYsdSqoFveMYd5ko72D+zANwlG1mmg=
github.com/golang/protobuf v1.5.3/go.mod h1:XVQd3VNwM+JqD3oG2Ue2ip4fOMUkwXdXDdiuN0vRsmY=
github.com/golangci/dupl v0.0.0-20250308024227-f665c8d69b32 h1:WUvBfQL6EW/40l6OmeSBYQJNSif4O11+bmWEz+C7FYw=
github.com/golangci/dupl v0.0.0-20250308024227-f665c8d69b32/go.mod h1:NUw9Zr2Sy7+HxzdjIULge71wI6yEg1lWQr7Evcu8K0E=
github.com/golangci/go-printf-func-name v0.1.0 h1:dVokQP+NMTO7jwO4bwsRwLWeudOVUPPyAKJuzv8pEJU=
github.com/golangci/go-printf-func-name v0.1.0/go.mod h1:wqhWFH5mUdJQhweRnldEywnR5021wTdZSNgwYceV14s=
github.com/golangci/gofmt v0.0.0-20250106114630-d62b90e6713d h1:viFft9sS/dxoYY0aiOTsLKO2aZQAPT4nlQCsimGcSGE=
github.com/golangci/gofmt v0.0.0-20250106114630-d62b90e6713d/go.mod h1:ivJ9QDg0XucIkmwhzCDsqcnxxlDStoTl89jDMIoNxKY=
github.com/golangci/golangci-lint/v2 v2.0.2 h1:dMCC8ikPiLDvHMFy3+XypSAuGDBOLzwWqqamer+bWsY=
github.com/golangci/golangci-lint/v2 v2.0.2/go.mod h1:ptNNMeGBQrbves0Qq38xvfdJg18PzxmT+7KRCOpm6i8=
github.com/golangci/golines v0.0.0-20250217134842-442fd0091d95 h1:AkK+w9FZBXlU/xUmBtSJN1+tAI4FIvy5WtnUnY8e4p8=
github.com/golangci/golines v0.0.0-20250217134842-442fd0091d95/go.mod h1:k9mmcyWKSTMcPPvQUCfRWWQ9VHJ1U9Dc0R7kaXAgtnQ=
github.com/golangci/misspell v0.6.0 h1:JCle2HUTNWirNlDIAUO44hUsKhOFqGPoC4LZxlaSXDs=
github.com/golangci/misspell v0.6.0/go.mod h1:keMNyY6R9isGaSAu+4Q8NMBwMPkh15Gtc8UCVoDtAWo=
github.com/golangci/plugin-module-register v0.1.1 h1:TCmesur25LnyJkpsVrupv1Cdzo+2f7zX0H6Jkw1Ol6c=
github.com/golangci/plugin-module-register v0.1.1/go.mod h1:TTpqoB6KkwOJMV8u7+NyXMrkwwESJLOkfl9TxR1DGFc=
github.com/golangci/revgrep v0.8.0 h1:EZBctwbVd0aMeRnNUsFogoyayvKHyxlV3CdUA46FX2s=
github.com/golangci/revgrep v0.8.0/go.mod h1:U4R/s9dlXZsg8uJmaR1GrloUr14D7qDl8gi2iPXJH8k=
github.com/golangci/unconvert v0.0.0-20240309020433-c5143eacb3ed h1:IURFTjxeTfNFP0hTEi1YKjB/ub8zkpaOqFFMApi2EAs=
github.com/golangci/unconvert v0.0.0-20240309020433-c5143eacb3ed/go.mod h1:XLXN8bNw4CGRPaqgl3bv/lhz7bsGPh4/xSaMTbo2vkQ=
github.com/google/btree v0.0.0-20180813153112-4030bb1f1f0c/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/google/btree v1.0.0/go.mod h1:lNA+9X1NB3Zf8V7Ke586lFgjr2dZNuvo3lPJSGZ5JPQ=
github.com/google/go-cmp v0.2.0/go.mod h1:oXzfMopK8JAjlY9xF4vHSVASa0yLyX7SntLO5aqRK0M=
github.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.4.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.4.1/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.1/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.2/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.4/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.6/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.8/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/google/martian v2.1.0+incompatible/go.mod h1:9I4somxYTbIHy5NJKHRl3wXiIaQGbYVAs8BPL6v8lEs=
github.com/google/martian/v3 v3.0.0/go.mod h1:y5Zk1BBys9G+gd6Jrk0W3cC1+ELVxBWuIGO+w/tUAp0=
github.com/google/pprof v0.0.0-20181206194817-3ea8567a2e57/go.mod h1:zfwlbNMJ+OItoe0UupaVj+oy1omPYYDuagoSzA8v9mc=
github.com/google/pprof v0.0.0-20190515194954-54271f7e092f/go.mod h1:zfwlbNMJ+OItoe0UupaVj+oy1omPYYDuagoSzA8v9mc=
github.com/google/pprof v0.0.0-20191218002539-d4f498aebedc/go.mod h1:ZgVRPoUq/hfqzAqh7sHMqb3I9Rq5C59dIz2SbBwJ4eM=
github.com/google/pprof v0.0.0-20200212024743-f11f1df84d12/go.mod h1:ZgVRPoUq/hfqzAqh7sHMqb3I9Rq5C59dIz2SbBwJ4eM=
github.com/google/pprof v0.0.0-20200229191704-1ebb73c60ed3/go.mod h1:ZgVRPoUq/hfqzAqh7sHMqb3I9Rq5C59dIz2SbBwJ4eM=
github.com/google/pprof v0.0.0-20200430221834-fc25d7d30c6d/go.mod h1:ZgVRPoUq/hfqzAqh7sHMqb3I9Rq5C59dIz2SbBwJ4eM=
github.com/google/pprof v0.0.0-20200708004538-1a94d8640e99/go.mod h1:ZgVRPoUq/hfqzAqh7sHMqb3I9Rq5C59dIz2SbBwJ4eM=
github.com/google/pprof v0.0.0-20241210010833-40e02aabc2ad h1:a6HEuzUHeKH6hwfN/ZoQgRgVIWFJljSWa/zetS2WTvg=
github.com/google/pprof v0.0.0-20241210010833-40e02aabc2ad/go.mod h1:vavhavw2zAxS5dIdcRluK6cSGGPlZynqzFM8NdvU144=
github.com/google/renameio v0.1.0/go.mod h1:KWCgfxg9yswjAJkECMjeO8J8rahYeXnNhOm40UhjYkI=
github.com/googleapis/gax-go/v2 v2.0.4/go.mod h1:0Wqv26UfaUD9n4G6kQubkQ+KchISgw+vpHVxEJEs9eg=
github.com/googleapis/gax-go/v2 v2.0.5/go.mod h1:DWXyrwAJ9X0FpwwEdw+IPEYBICEFu5mhpdKc/us6bOk=
github.com/gordonklaus/ineffassign v0.1.0 h1:y2Gd/9I7MdY1oEIt+n+rowjBNDcLQq3RsH5hwJd0f9s=
github.com/gordonklaus/ineffassign v0.1.0/go.mod h1:Qcp2HIAYhR7mNUVSIxZww3Guk4it82ghYcEXIAk+QT0=
github.com/gostaticanalysis/analysisutil v0.7.1 h1:ZMCjoue3DtDWQ5WyU16YbjbQEQ3VuzwxALrpYd+HeKk=
github.com/gostaticanalysis/analysisutil v0.7.1/go.mod h1:v21E3hY37WKMGSnbsw2S/ojApNWb6C1//mXO48CXbVc=
github.com/gostaticanalysis/comment v1.4.1/go.mod h1:ih6ZxzTHLdadaiSnF5WY3dxUoXfXAlTaRzuaNDlSado=
github.com/gostaticanalysis/comment v1.4.2/go.mod h1:KLUTGDv6HOCotCH8h2erHKmpci2ZoR8VPu34YA2uzdM=
github.com/gostaticanalysis/comment v1.5.0 h1:X82FLl+TswsUMpMh17srGRuKaaXprTaytmEpgnKIDu8=
github.com/gostaticanalysis/comment v1.5.0/go.mod h1:V6eb3gpCv9GNVqb6amXzEUX3jXLVK/AdA+IrAMSqvEc=
github.com/gostaticanalysis/forcetypeassert v0.2.0 h1:uSnWrrUEYDr86OCxWa4/Tp2jeYDlogZiZHzGkWFefTk=
github.com/gostaticanalysis/forcetypeassert v0.2.0/go.mod h1:M5iPavzE9pPqWyeiVXSFghQjljW1+l/Uke3PXHS6ILY=
github.com/gostaticanalysis/nilerr v0.1.1 h1:ThE+hJP0fEp4zWLkWHWcRyI2Od0p7DlgYG3Uqrmrcpk=
github.com/gostaticanalysis/nilerr v0.1.1/go.mod h1:wZYb6YI5YAxxq0i1+VJbY0s2YONW0HU0GPE3+5PWN4A=
github.com/gostaticanalysis/testutil v0.3.1-0.20210208050101-bfb5c8eec0e4/go.mod h1:D+FIZ+7OahH3ePw/izIEeH5I06eKs1IKI4Xr64/Am3M=
github.com/gostaticanalysis/testutil v0.5.0 h1:Dq4wT1DdTwTGCQQv3rl3IvD5Ld0E6HiY+3Zh0sUGqw8=
github.com/gostaticanalysis/testutil v0.5.0/go.mod h1:OLQSbuM6zw2EvCcXTz1lVq5unyoNft372msDY0nY5Hs=
github.com/hashicorp/go-immutable-radix/v2 v2.1.0 h1:CUW5RYIcysz+D3B+l1mDeXrQ7fUvGGCwJfdASSzbrfo=
github.com/hashicorp/go-immutable-radix/v2 v2.1.0/go.mod h1:hgdqLXA4f6NIjRVisM1TJ9aOJVNRqKZj+xDGF6m7PBw=
github.com/hashicorp/go-uuid v1.0.3 h1:2gKiV6YVmrJ1i2CKKa9obLvRieoRGviZFL26PcT/Co8=
github.com/hashicorp/go-uuid v1.0.3/go.mod h1:6SBZvOh/SIDV7/2o3Jml5SYk/TvGqwFJ/bN7x4byOro=
github.com/hashicorp/go-version v1.2.1/go.mod h1:fltr4n8CU8Ke44wwGCBoEymUuxUHl09ZGVZPK5anwXA=
github.com/hashicorp/go-version v1.7.0 h1:5tqGy27NaOTB8yJKUZELlFAS/LTKJkrmONwQKeRZfjY=
github.com/hashicorp/go-version v1.7.0/go.mod h1:fltr4n8CU8Ke44wwGCBoEymUuxUHl09ZGVZPK5anwXA=
github.com/hashicorp/golang-lru v0.5.0/go.mod h1:/m3WP610KZHVQ1SGc6re/UDhFvYD7pJ4Ao+sR/qLZy8=
github.com/hashicorp/golang-lru v0.5.1/go.mod h1:/m3WP610KZHVQ1SGc6re/UDhFvYD7pJ4Ao+sR/qLZy8=
github.com/hashicorp/golang-lru/v2 v2.0.7 h1:a+bsQ5rvGLjzHuww6tVxozPZFVghXaHOwFs4luLUK2k=
github.com/hashicorp/golang-lru/v2 v2.0.7/go.mod h1:QeFd9opnmA6QUJc5vARoKUSoFhyfM2/ZepoAG6RGpeM=
github.com/hashicorp/hcl v1.0.0 h1:0Anlzjpi4vEasTeNFn2mLJgTSwt0+6sfsiTG8qcWGx4=
github.com/hashicorp/hcl v1.0.0/go.mod h1:E5yfLk+7swimpb2L/Alb/PJmXilQ/rhwaUYs4T20WEQ=
github.com/hexops/gotextdiff v1.0.3 h1:gitA9+qJrrTCsiCl7+kh75nPqQt1cx4ZkudSTLoUqJM=
github.com/hexops/gotextdiff v1.0.3/go.mod h1:pSWU5MAI3yDq+fZBTazCSJysOMbxWL1BSow5/V2vxeg=
github.com/ianlancetaylor/demangle v0.0.0-20181102032728-5e5cf60278f6/go.mod h1:aSSvb/t6k1mPoxDqO4vJh6VOCGPwU4O0C2/Eqndh1Sc=
github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=
github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
github.com/jgautheron/goconst v1.7.1 h1:VpdAG7Ca7yvvJk5n8dMwQhfEZJh95kl/Hl9S1OI5Jkk=
github.com/jgautheron/goconst v1.7.1/go.mod h1:aAosetZ5zaeC/2EfMeRswtxUFBpe2Hr7HzkgX4fanO4=
github.com/jingyugao/rowserrcheck v1.1.1 h1:zibz55j/MJtLsjP1OF4bSdgXxwL1b+Vn7Tjzq7gFzUs=
github.com/jingyugao/rowserrcheck v1.1.1/go.mod h1:4yvlZSDb3IyDTUZJUmpZfm2Hwok+Dtp+nu2qOq+er9c=
github.com/jjti/go-spancheck v0.6.4 h1:Tl7gQpYf4/TMU7AT84MN83/6PutY21Nb9fuQjFTpRRc=
github.com/jjti/go-spancheck v0.6.4/go.mod h1:yAEYdKJ2lRkDA8g7X+oKUHXOWVAXSBJRv04OhF+QUjk=
github.com/jpillora/backoff v1.0.0/go.mod h1:J/6gKK9jxlEcS3zixgDgUAsiuZ7yrSoa/FX5e0EB2j4=
github.com/json-iterator/go v1.1.6/go.mod h1:+SdeFBvtyEkXs7REEP0seUULqWtbJapLOCVDaaPEHmU=
github.com/json-iterator/go v1.1.10/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=
github.com/json-iterator/go v1.1.11/go.mod h1:KdQUCv79m/52Kvf8AW2vK1V8akMuk1QjK/uOdHXbAo4=
github.com/json-iterator/go v1.1.12/go.mod h1:e30LSqwooZae/UwlEbR2852Gd8hjQvJoHmT4TnhNGBo=
github.com/jstemmer/go-junit-report v0.0.0-20190106144839-af01ea7f8024/go.mod h1:6v2b51hI/fHJwM22ozAgKL4VKDeJcHhJFhtBdhmNjmU=
github.com/jstemmer/go-junit-report v0.9.1/go.mod h1:Brl9GWCQeLvo8nXZwPNNblvFj/XSXhF0NWZEnDohbsk=
github.com/julienschmidt/httprouter v1.2.0/go.mod h1:SYymIcj16QtmaHHD7aYtjjsJG7VTCxuUUipMqKk8s4w=
github.com/julienschmidt/httprouter v1.3.0/go.mod h1:JR6WtHb+2LUe8TCKY3cZOxFyyO8IZAc4RVcycCCAKdM=
github.com/julz/importas v0.2.0 h1:y+MJN/UdL63QbFJHws9BVC5RpA2iq0kpjrFajTGivjQ=
github.com/julz/importas v0.2.0/go.mod h1:pThlt589EnCYtMnmhmRYY/qn9lCf/frPOK+WMx3xiJY=
github.com/karamaru-alpha/copyloopvar v1.2.1 h1:wmZaZYIjnJ0b5UoKDjUHrikcV0zuPyyxI4SVplLd2CI=
github.com/karamaru-alpha/copyloopvar v1.2.1/go.mod h1:nFmMlFNlClC2BPvNaHMdkirmTJxVCY0lhxBtlfOypMM=
github.com/kisielk/errcheck v1.9.0 h1:9xt1zI9EBfcYBvdU1nVrzMzzUPUtPKs9bVSIM3TAb3M=
github.com/kisielk/errcheck v1.9.0/go.mod h1:kQxWMMVZgIkDq7U8xtG/n2juOjbLgZtedi0D+/VL/i8=
github.com/kisielk/gotool v1.0.0/go.mod h1:XhKaO+MFFWcvkIS/tQcRk01m1F5IRFswLeQ+oQHNcck=
github.com/kkHAIKE/contextcheck v1.1.6 h1:7HIyRcnyzxL9Lz06NGhiKvenXq7Zw6Q0UQu/ttjfJCE=
github.com/kkHAIKE/contextcheck v1.1.6/go.mod h1:3dDbMRNBFaq8HFXWC1JyvDSPm43CmE6IuHam8Wr0rkg=
github.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/konsorten/go-windows-terminal-sequences v1.0.3/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/kr/logfmt v0.0.0-20140226030751-b84e30acd515/go.mod h1:+0opPa2QZZtGFBFZlji/RkVcI2GknAs/DXo4wKdlNEc=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/kulti/thelper v0.6.3 h1:ElhKf+AlItIu+xGnI990no4cE2+XaSu1ULymV2Yulxs=
github.com/kulti/thelper v0.6.3/go.mod h1:DsqKShOvP40epevkFrvIwkCMNYxMeTNjdWL4dqWHZ6I=
github.com/kunwardeep/paralleltest v1.0.10 h1:wrodoaKYzS2mdNVnc4/w31YaXFtsc21PCTdvWJ/lDDs=
github.com/kunwardeep/paralleltest v1.0.10/go.mod h1:2C7s65hONVqY7Q5Efj5aLzRCNLjw2h4eMc9EcypGjcY=
github.com/lasiar/canonicalheader v1.1.2 h1:vZ5uqwvDbyJCnMhmFYimgMZnJMjwljN5VGY0VKbMXb4=
github.com/lasiar/canonicalheader v1.1.2/go.mod h1:qJCeLFS0G/QlLQ506T+Fk/fWMa2VmBUiEI2cuMK4djI=
github.com/ldez/exptostd v0.4.2 h1:l5pOzHBz8mFOlbcifTxzfyYbgEmoUqjxLFHZkjlbHXs=
github.com/ldez/exptostd v0.4.2/go.mod h1:iZBRYaUmcW5jwCR3KROEZ1KivQQp6PHXbDPk9hqJKCQ=
github.com/ldez/gomoddirectives v0.6.1 h1:Z+PxGAY+217f/bSGjNZr/b2KTXcyYLgiWI6geMBN2Qc=
github.com/ldez/gomoddirectives v0.6.1/go.mod h1:cVBiu3AHR9V31em9u2kwfMKD43ayN5/XDgr+cdaFaKs=
github.com/ldez/grignotin v0.9.0 h1:MgOEmjZIVNn6p5wPaGp/0OKWyvq42KnzAt/DAb8O4Ow=
github.com/ldez/grignotin v0.9.0/go.mod h1:uaVTr0SoZ1KBii33c47O1M8Jp3OP3YDwhZCmzT9GHEk=
github.com/ldez/tagliatelle v0.7.1 h1:bTgKjjc2sQcsgPiT902+aadvMjCeMHrY7ly2XKFORIk=
github.com/ldez/tagliatelle v0.7.1/go.mod h1:3zjxUpsNB2aEZScWiZTHrAXOl1x25t3cRmzfK1mlo2I=
github.com/ldez/usetesting v0.4.2 h1:J2WwbrFGk3wx4cZwSMiCQQ00kjGR0+tuuyW0Lqm4lwA=
github.com/ldez/usetesting v0.4.2/go.mod h1:eEs46T3PpQ+9RgN9VjpY6qWdiw2/QmfiDeWmdZdrjIQ=
github.com/leonklingele/grouper v1.1.2 h1:o1ARBDLOmmasUaNDesWqWCIFH3u7hoFlM84YrjT3mIY=
github.com/leonklingele/grouper v1.1.2/go.mod h1:6D0M/HVkhs2yRKRFZUoGjeDy7EZTfFBE9gl4kjmIGkA=
github.com/lucasb-eyer/go-colorful v1.2.0 h1:1nnpGOrhyZZuNyfu1QjKiUICQ74+3FNCN69Aj6K7nkY=
github.com/lucasb-eyer/go-colorful v1.2.0/go.mod h1:R4dSotOR9KMtayYi1e77YzuveK+i7ruzyGqttikkLy0=
github.com/macabu/inamedparam v0.2.0 h1:VyPYpOc10nkhI2qeNUdh3Zket4fcZjEWe35poddBCpE=
github.com/macabu/inamedparam v0.2.0/go.mod h1:+Pee9/YfGe5LJ62pYXqB89lJ+0k5bsR8Wgz/C0Zlq3U=
github.com/magiconair/properties v1.8.6 h1:5ibWZ6iY0NctNGWo87LalDlEZ6R41TqbbDamhfG/Qzo=
github.com/magiconair/properties v1.8.6/go.mod h1:y3VJvCyxH9uVvJTWEGAELF3aiYNyPKd5NZ3oSwXrF60=
github.com/maratori/testableexamples v1.0.0 h1:dU5alXRrD8WKSjOUnmJZuzdxWOEQ57+7s93SLMxb2vI=
github.com/maratori/testableexamples v1.0.0/go.mod h1:4rhjL1n20TUTT4vdh3RDqSizKLyXp7K2u6HgraZCGzE=
github.com/maratori/testpackage v1.1.1 h1:S58XVV5AD7HADMmD0fNnziNHqKvSdDuEKdPD1rNTU04=
github.com/maratori/testpackage v1.1.1/go.mod h1:s4gRK/ym6AMrqpOa/kEbQTV4Q4jb7WeLZzVhVVVOQMc=
github.com/matoous/godox v1.1.0 h1:W5mqwbyWrwZv6OQ5Z1a/DHGMOvXYCBP3+Ht7KMoJhq4=
github.com/matoous/godox v1.1.0/go.mod h1:jgE/3fUXiTurkdHOLT5WEkThTSuE7yxHv5iWPa80afs=
github.com/matryer/is v1.4.0 h1:sosSmIWwkYITGrxZ25ULNDeKiMNzFSr4V/eqBQP0PeE=
github.com/matryer/is v1.4.0/go.mod h1:8I/i5uYgLzgsgEloJE1U6xx5HkBQpAZvepWuujKwMRU=
github.com/mattn/go-colorable v0.1.14 h1:9A9LHSqF/7dyVVX6g0U9cwm9pG3kP9gSzcuIPHPsaIE=
github.com/mattn/go-colorable v0.1.14/go.mod h1:6LmQG8QLFO4G5z1gPvYEzlUgJ2wF+stgPZH1UqBm1s8=
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/mattn/go-runewidth v0.0.9/go.mod h1:H031xJmbD/WCDINGzjvQ9THkh0rPKHF+m2gUSrubnMI=
github.com/mattn/go-runewidth v0.0.16 h1:E5ScNMtiwvlvB5paMFdw9p4kSQzbXFikJ5SQO6TULQc=
github.com/mattn/go-runewidth v0.0.16/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
github.com/matttproud/golang_protobuf_extensions v1.0.1 h1:4hp9jkHxhMHkqkrB3Ix0jegS5sx/RkqARlsWZ6pIwiU=
github.com/matttproud/golang_protobuf_extensions v1.0.1/go.mod h1:D8He9yQNgCq6Z5Ld7szi9bcBfOoFv/3dc6xSMkL2PC0=
github.com/mgechev/revive v1.7.0 h1:JyeQ4yO5K8aZhIKf5rec56u0376h8AlKNQEmjfkjKlY=
github.com/mgechev/revive v1.7.0/go.mod h1:qZnwcNhoguE58dfi96IJeSTPeZQejNeoMQLUZGi4SW4=
github.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y=
github.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=
github.com/mitchellh/mapstructure v1.5.0 h1:jeMsZIYE/09sWLaz43PL7Gy6RuMjD2eJVyuac5Z2hdY=
github.com/mitchellh/mapstructure v1.5.0/go.mod h1:bFUtVrKA4DC2yAKiSyO/QUcy7e+RRV2QTWOzhPopBRo=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
github.com/modern-go/reflect2 v0.0.0-20180701023420-4b7aa43c6742/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=
github.com/modern-go/reflect2 v1.0.1/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=
github.com/modern-go/reflect2 v1.0.2/go.mod h1:yWuevngMOJpCy52FWWMvUC8ws7m/LJsjYzDa0/r8luk=
github.com/moricho/tparallel v0.3.2 h1:odr8aZVFA3NZrNybggMkYO3rgPRcqjeQUlBBFVxKHTI=
github.com/moricho/tparallel v0.3.2/go.mod h1:OQ+K3b4Ln3l2TZveGCywybl68glfLEwFGqvnjok8b+U=
github.com/muesli/termenv v0.16.0 h1:S5AlUN9dENB57rsbnkPyfdGuWIlkmzJjbFf0Tf5FWUc=
github.com/muesli/termenv v0.16.0/go.mod h1:ZRfOIKPFDYQoDFF4Olj7/QJbW60Ol/kL1pU3VfY/Cnk=
github.com/mwitkow/go-conntrack v0.0.0-20161129095857-cc309e4a2223/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=
github.com/mwitkow/go-conntrack v0.0.0-20190716064945-2f068394615f/go.mod h1:qRWi+5nqEBWmkhHvq77mSJWrCKwh8bxhgT7d/eI7P4U=
github.com/nakabonne/nestif v0.3.1 h1:wm28nZjhQY5HyYPx+weN3Q65k6ilSBxDb8v5S81B81U=
github.com/nakabonne/nestif v0.3.1/go.mod h1:9EtoZochLn5iUprVDmDjqGKPofoUEBL8U4Ngq6aY7OE=
github.com/nishanths/exhaustive v0.12.0 h1:vIY9sALmw6T/yxiASewa4TQcFsVYZQQRUQJhKRf3Swg=
github.com/nishanths/exhaustive v0.12.0/go.mod h1:mEZ95wPIZW+x8kC4TgC+9YCUgiST7ecevsVDTgc2obs=
github.com/nishanths/predeclared v0.2.2 h1:V2EPdZPliZymNAn79T8RkNApBjMmVKh5XRpLm/w98Vk=
github.com/nishanths/predeclared v0.2.2/go.mod h1:RROzoN6TnGQupbC+lqggsOlcgysk3LMK/HI84Mp280c=
github.com/nunnatsa/ginkgolinter v0.19.1 h1:mjwbOlDQxZi9Cal+KfbEJTCz327OLNfwNvoZ70NJ+c4=
github.com/nunnatsa/ginkgolinter v0.19.1/go.mod h1:jkQ3naZDmxaZMXPWaS9rblH+i+GWXQCaS/JFIWcOH2s=
github.com/olekukonko/tablewriter v0.0.5 h1:P2Ga83D34wi1o9J6Wh1mRuqd4mF/x/lgBS7N7AbDhec=
github.com/olekukonko/tablewriter v0.0.5/go.mod h1:hPp6KlRPjbx+hW8ykQs1w3UBbZlj6HuIJcUGPhkA7kY=
github.com/onsi/ginkgo/v2 v2.22.2 h1:/3X8Panh8/WwhU/3Ssa6rCKqPLuAkVY2I0RoyDLySlU=
github.com/onsi/ginkgo/v2 v2.22.2/go.mod h1:oeMosUL+8LtarXBHu/c0bx2D/K9zyQ6uX3cTyztHwsk=
github.com/onsi/gomega v1.36.2 h1:koNYke6TVk6ZmnyHrCXba/T/MoLBXFjeC1PtvYgw0A8=
github.com/onsi/gomega v1.36.2/go.mod h1:DdwyADRjrc825LhMEkD76cHR5+pUnjhUN8GlHlRPHzY=
github.com/otiai10/copy v1.2.0/go.mod h1:rrF5dJ5F0t/EWSYODDu4j9/vEeYHMkc8jt0zJChqQWw=
github.com/otiai10/copy v1.14.0 h1:dCI/t1iTdYGtkvCuBG2BgR6KZa83PTclw4U5n2wAllU=
github.com/otiai10/copy v1.14.0/go.mod h1:ECfuL02W+/FkTWZWgQqXPWZgW9oeKCSQ5qVfSc4qc4w=
github.com/otiai10/curr v0.0.0-20150429015615-9b4961190c95/go.mod h1:9qAhocn7zKJG+0mI8eUu6xqkFDYS2kb2saOteoSB3cE=
github.com/otiai10/curr v1.0.0/go.mod h1:LskTG5wDwr8Rs+nNQ+1LlxRjAtTZZjtJW4rMXl6j4vs=
github.com/otiai10/mint v1.3.0/go.mod h1:F5AjcsTsWUqX+Na9fpHb52P8pcRX2CI6A3ctIT91xUo=
github.com/otiai10/mint v1.3.1/go.mod h1:/yxELlJQ0ufhjUwhshSj+wFjZ78CnZ48/1wtmBH1OTc=
github.com/pelletier/go-toml v1.9.5 h1:4yBQzkHv+7BHq2PQUZF3Mx0IYxG7LsP222s7Agd3ve8=
github.com/pelletier/go-toml v1.9.5/go.mod h1:u1nR/EPcESfeI/szUZKdtJ0xRNbUoANCkoOuaOx1Y+c=
github.com/pelletier/go-toml/v2 v2.2.3 h1:YmeHyLY8mFWbdkNWwpr+qIL2bEqT0o95WSdkNHvL12M=
github.com/pelletier/go-toml/v2 v2.2.3/go.mod h1:MfCQTFTvCcUyyvvwm1+G6H/jORL20Xlb6rzQu9GuUkc=
github.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/polyfloyd/go-errorlint v1.7.1 h1:RyLVXIbosq1gBdk/pChWA8zWYLsq9UEw7a1L5TVMCnA=
github.com/polyfloyd/go-errorlint v1.7.1/go.mod h1:aXjNb1x2TNhoLsk26iv1yl7a+zTnXPhwEMtEXukiLR8=
github.com/prashantv/gostub v1.1.0 h1:BTyx3RfQjRHnUWaGF9oQos79AlQ5k8WNktv7VGvVH4g=
github.com/prashantv/gostub v1.1.0/go.mod h1:A5zLQHz7ieHGG7is6LLXLz7I8+3LZzsrV0P1IAHhP5U=
github.com/prometheus/client_golang v0.9.1/go.mod h1:7SWBe2y4D6OKWSNQJUaRYU/AaXPKyh/dDVn+NZz0KFw=
github.com/prometheus/client_golang v1.0.0/go.mod h1:db9x61etRT2tGnBNRi70OPL5FsnadC4Ky3P0J6CfImo=
github.com/prometheus/client_golang v1.7.1/go.mod h1:PY5Wy2awLA44sXw4AOSfFBetzPP4j5+D6mVACh+pe2M=
github.com/prometheus/client_golang v1.11.0/go.mod h1:Z6t4BnS23TR94PD6BsDNk8yVqroYurpAkEiz0P2BEV0=
github.com/prometheus/client_golang v1.12.1 h1:ZiaPsmm9uiBeaSMRznKsCDNtPCS0T3JVDGF+06gjBzk=
github.com/prometheus/client_golang v1.12.1/go.mod h1:3Z9XVyYiZYEO+YQWt3RD2R3jrbd179Rt297l4aS6nDY=
github.com/prometheus/client_model v0.0.0-20180712105110-5c3871d89910/go.mod h1:MbSGuTsp3dbXC40dX6PRTWyKYBIrTGTE9sqQNg2J8bo=
github.com/prometheus/client_model v0.0.0-20190129233127-fd36f4220a90/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/client_model v0.0.0-20190812154241-14fe0d1b01d4/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/client_model v0.2.0 h1:uq5h0d+GuxiXLJLNABMgp2qUWDPiLvgCzz2dUR+/W/M=
github.com/prometheus/client_model v0.2.0/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/prometheus/common v0.4.1/go.mod h1:TNfzLD0ON7rHzMJeJkieUDPYmFC7Snx/y86RQel1bk4=
github.com/prometheus/common v0.10.0/go.mod h1:Tlit/dnDKsSWFlCLTWaA1cyBgKHSMdTB80sz/V91rCo=
github.com/prometheus/common v0.26.0/go.mod h1:M7rCNAaPfAosfx8veZJCuw84e35h3Cfd9VFqTh1DIvc=
github.com/prometheus/common v0.32.1 h1:hWIdL3N2HoUx3B8j3YN9mWor0qhY/NlEKZEaXxuIRh4=
github.com/prometheus/common v0.32.1/go.mod h1:vu+V0TpY+O6vW9J44gczi3Ap/oXXR10b+M/gUGO4Hls=
github.com/prometheus/procfs v0.0.0-20181005140218-185b4288413d/go.mod h1:c3At6R/oaqEKCNdg8wHV1ftS6bRYblBhIjjI8uT2IGk=
github.com/prometheus/procfs v0.0.2/go.mod h1:TjEm7ze935MbeOT/UhFTIMYKhuLP4wbCsTZCD3I8kEA=
github.com/prometheus/procfs v0.1.3/go.mod h1:lV6e/gmhEcM9IjHGsFOCxxuZ+z1YqCvr4OA4YeYWdaU=
github.com/prometheus/procfs v0.6.0/go.mod h1:cz+aTbrPOrUb4q7XlbU9ygM+/jj0fzG6c1xBZuNvfVA=
github.com/prometheus/procfs v0.7.3 h1:4jVXhlkAyzOScmCkXBTOLRLTz8EeU+eyjrwB/EPq0VU=
github.com/prometheus/procfs v0.7.3/go.mod 
h1:cz+aTbrPOrUb4q7XlbU9ygM+/jj0fzG6c1xBZuNvfVA=\ngithub.com/quasilyte/go-ruleguard v0.4.4 h1:53DncefIeLX3qEpjzlS1lyUmQoUEeOWPFWqaTJq9eAQ=\ngithub.com/quasilyte/go-ruleguard v0.4.4/go.mod h1:Vl05zJ538vcEEwu16V/Hdu7IYZWyKSwIy4c88Ro1kRE=\ngithub.com/quasilyte/go-ruleguard/dsl v0.3.22 h1:wd8zkOhSNr+I+8Qeciml08ivDt1pSXe60+5DqOpCjPE=\ngithub.com/quasilyte/go-ruleguard/dsl v0.3.22/go.mod h1:KeCP03KrjuSO0H1kTuZQCWlQPulDV6YMIXmpQss17rU=\ngithub.com/quasilyte/gogrep v0.5.0 h1:eTKODPXbI8ffJMN+W2aE0+oL0z/nh8/5eNdiO34SOAo=\ngithub.com/quasilyte/gogrep v0.5.0/go.mod h1:Cm9lpz9NZjEoL1tgZ2OgeUKPIxL1meE7eo60Z6Sk+Ng=\ngithub.com/quasilyte/regex/syntax v0.0.0-20210819130434-b3f0c404a727 h1:TCg2WBOl980XxGFEZSS6KlBGIV0diGdySzxATTWoqaU=\ngithub.com/quasilyte/regex/syntax v0.0.0-20210819130434-b3f0c404a727/go.mod h1:rlzQ04UMyJXu/aOvhd8qT+hvDrFpiwqp8MRXDY9szc0=\ngithub.com/quasilyte/stdinfo v0.0.0-20220114132959-f7386bf02567 h1:M8mH9eK4OUR4lu7Gd+PU1fV2/qnDNfzT635KRSObncs=\ngithub.com/quasilyte/stdinfo v0.0.0-20220114132959-f7386bf02567/go.mod h1:DWNGW8A4Y+GyBgPuaQJuWiy0XYftx4Xm/y5Jqk9I6VQ=\ngithub.com/raeperd/recvcheck v0.2.0 h1:GnU+NsbiCqdC2XX5+vMZzP+jAJC5fht7rcVTAhX74UI=\ngithub.com/raeperd/recvcheck v0.2.0/go.mod h1:n04eYkwIR0JbgD73wT8wL4JjPC3wm0nFtzBnWNocnYU=\ngithub.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=\ngithub.com/rivo/uniseg v0.4.7 h1:WUdvkW8uEhrYfLC4ZzdpI2ztxP1I582+49Oc5Mq64VQ=\ngithub.com/rivo/uniseg v0.4.7/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=\ngithub.com/rogpeppe/go-internal v1.3.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=\ngithub.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=\ngithub.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=\ngithub.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=\ngithub.com/ryancurrah/gomodguard v1.4.1 
h1:eWC8eUMNZ/wM/PWuZBv7JxxqT5fiIKSIyTvjb7Elr+g=\ngithub.com/ryancurrah/gomodguard v1.4.1/go.mod h1:qnMJwV1hX9m+YJseXEBhd2s90+1Xn6x9dLz11ualI1I=\ngithub.com/ryanrolds/sqlclosecheck v0.5.1 h1:dibWW826u0P8jNLsLN+En7+RqWWTYrjCB9fJfSfdyCU=\ngithub.com/ryanrolds/sqlclosecheck v0.5.1/go.mod h1:2g3dUjoS6AL4huFdv6wn55WpLIDjY7ZgUR4J8HOO/XQ=\ngithub.com/sanposhiho/wastedassign/v2 v2.1.0 h1:crurBF7fJKIORrV85u9UUpePDYGWnwvv3+A96WvwXT0=\ngithub.com/sanposhiho/wastedassign/v2 v2.1.0/go.mod h1:+oSmSC+9bQ+VUAxA66nBb0Z7N8CK7mscKTDYC6aIek4=\ngithub.com/santhosh-tekuri/jsonschema/v6 v6.0.1 h1:PKK9DyHxif4LZo+uQSgXNqs0jj5+xZwwfKHgph2lxBw=\ngithub.com/santhosh-tekuri/jsonschema/v6 v6.0.1/go.mod h1:JXeL+ps8p7/KNMjDQk3TCwPpBy0wYklyWTfbkIzdIFU=\ngithub.com/sashamelentyev/interfacebloat v1.1.0 h1:xdRdJp0irL086OyW1H/RTZTr1h/tMEOsumirXcOJqAw=\ngithub.com/sashamelentyev/interfacebloat v1.1.0/go.mod h1:+Y9yU5YdTkrNvoX0xHc84dxiN1iBi9+G8zZIhPVoNjQ=\ngithub.com/sashamelentyev/usestdlibvars v1.28.0 h1:jZnudE2zKCtYlGzLVreNp5pmCdOxXUzwsMDBkR21cyQ=\ngithub.com/sashamelentyev/usestdlibvars v1.28.0/go.mod h1:9nl0jgOfHKWNFS43Ojw0i7aRoS4j6EBye3YBhmAIRF8=\ngithub.com/securego/gosec/v2 v2.22.2 h1:IXbuI7cJninj0nRpZSLCUlotsj8jGusohfONMrHoF6g=\ngithub.com/securego/gosec/v2 v2.22.2/go.mod h1:UEBGA+dSKb+VqM6TdehR7lnQtIIMorYJ4/9CW1KVQBE=\ngithub.com/sergi/go-diff v1.2.0 h1:XU+rvMAioB0UC3q1MFrIQy4Vo5/4VsRDQQXHsEya6xQ=\ngithub.com/sergi/go-diff v1.2.0/go.mod h1:STckp+ISIX8hZLjrqAeVduY0gWCT9IjLuqbuNXdaHfM=\ngithub.com/shurcooL/go v0.0.0-20180423040247-9e1955d9fb6e/go.mod h1:TDJrrUr11Vxrven61rcy3hJMUqaf/CLWYhHNPmT14Lk=\ngithub.com/shurcooL/go-goon v0.0.0-20170922171312-37c2f522c041/go.mod h1:N5mDOmsrJOB+vfqUK+7DmDyjhSLIIBnXo9lvZJj3MWQ=\ngithub.com/sirupsen/logrus v1.2.0/go.mod h1:LxeOpSwHxABJmUn/MG1IvRgCAasNZTLOkJPxbbu5VWo=\ngithub.com/sirupsen/logrus v1.4.2/go.mod h1:tLMulIdttU9McNUspp0xgXVQah82FyeX6MwdIuYE2rE=\ngithub.com/sirupsen/logrus v1.6.0/go.mod 
h1:7uNnSEd1DgxDLC74fIahvMZmmYsHGZGEOFrfsX/uA88=\ngithub.com/sirupsen/logrus v1.9.3 h1:dueUQJ1C2q9oE3F7wvmSGAaVtTmUizReu6fjN8uqzbQ=\ngithub.com/sirupsen/logrus v1.9.3/go.mod h1:naHLuLoDiP4jHNo9R0sCBMtWGeIprob74mVsIT4qYEQ=\ngithub.com/sivchari/containedctx v1.0.3 h1:x+etemjbsh2fB5ewm5FeLNi5bUjK0V8n0RB+Wwfd0XE=\ngithub.com/sivchari/containedctx v1.0.3/go.mod h1:c1RDvCbnJLtH4lLcYD/GqwiBSSf4F5Qk0xld2rBqzJ4=\ngithub.com/sonatard/noctx v0.1.0 h1:JjqOc2WN16ISWAjAk8M5ej0RfExEXtkEyExl2hLW+OM=\ngithub.com/sonatard/noctx v0.1.0/go.mod h1:0RvBxqY8D4j9cTTTWE8ylt2vqj2EPI8fHmrxHdsaZ2c=\ngithub.com/sourcegraph/go-diff v0.7.0 h1:9uLlrd5T46OXs5qpp8L/MTltk0zikUGi0sNNyCpA8G0=\ngithub.com/sourcegraph/go-diff v0.7.0/go.mod h1:iBszgVvyxdc8SFZ7gm69go2KDdt3ag071iBaWPF6cjs=\ngithub.com/spf13/afero v1.12.0 h1:UcOPyRBYczmFn6yvphxkn9ZEOY65cpwGKb5mL36mrqs=\ngithub.com/spf13/afero v1.12.0/go.mod h1:ZTlWwG4/ahT8W7T0WQ5uYmjI9duaLQGy3Q2OAl4sk/4=\ngithub.com/spf13/cast v1.5.0 h1:rj3WzYc11XZaIZMPKmwP96zkFEnnAmV8s6XbB2aY32w=\ngithub.com/spf13/cast v1.5.0/go.mod h1:SpXXQ5YoyJw6s3/6cMTQuxvgRl3PCJiyaX9p6b155UU=\ngithub.com/spf13/cobra v1.9.1 h1:CXSaggrXdbHK9CF+8ywj8Amf7PBRmPCOJugH954Nnlo=\ngithub.com/spf13/cobra v1.9.1/go.mod h1:nDyEzZ8ogv936Cinf6g1RU9MRY64Ir93oCnqb9wxYW0=\ngithub.com/spf13/jwalterweatherman v1.1.0 h1:ue6voC5bR5F8YxI5S67j9i582FU4Qvo2bmqnqMYADFk=\ngithub.com/spf13/jwalterweatherman v1.1.0/go.mod h1:aNWZUN0dPAAO/Ljvb5BEdw96iTZ0EXowPYD95IqWIGo=\ngithub.com/spf13/pflag v1.0.5/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=\ngithub.com/spf13/pflag v1.0.6 h1:jFzHGLGAlb3ruxLB8MhbI6A8+AQX/2eW4qeyNZXNp2o=\ngithub.com/spf13/pflag v1.0.6/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=\ngithub.com/spf13/viper v1.12.0 h1:CZ7eSOd3kZoaYDLbXnmzgQI5RlciuXBMA+18HwHRfZQ=\ngithub.com/spf13/viper v1.12.0/go.mod h1:b6COn30jlNxbm/V2IqWiNWkJ+vZNiMNksliPCiuKtSI=\ngithub.com/ssgreg/nlreturn/v2 v2.2.1 h1:X4XDI7jstt3ySqGU86YGAURbxw3oTDPK9sPEi6YEwQ0=\ngithub.com/ssgreg/nlreturn/v2 v2.2.1/go.mod 
h1:E/iiPB78hV7Szg2YfRgyIrk1AD6JVMTRkkxBiELzh2I=\ngithub.com/stbenjam/no-sprintf-host-port v0.2.0 h1:i8pxvGrt1+4G0czLr/WnmyH7zbZ8Bg8etvARQ1rpyl4=\ngithub.com/stbenjam/no-sprintf-host-port v0.2.0/go.mod h1:eL0bQ9PasS0hsyTyfTjjG+E80QIyPnBVQbYZyv20Jfk=\ngithub.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=\ngithub.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=\ngithub.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=\ngithub.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=\ngithub.com/stretchr/objx v0.5.2 h1:xuMeJ0Sdp5ZMRXx/aWO6RZxdr3beISkG5/G/aIRr3pY=\ngithub.com/stretchr/objx v0.5.2/go.mod h1:FRsXN1f5AsAjCGJKqEizvkpNtU+EGNCLh3NxZ/8L+MA=\ngithub.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=\ngithub.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=\ngithub.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=\ngithub.com/stretchr/testify v1.5.1/go.mod h1:5W2xD1RspED5o8YsWQXVCued0rvSQ+mT+I5cxcmMvtA=\ngithub.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=\ngithub.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=\ngithub.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=\ngithub.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=\ngithub.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=\ngithub.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=\ngithub.com/subosito/gotenv v1.4.1 h1:jyEFiXpy21Wm81FBN71l9VoMMV8H8jG+qIK3GCpY6Qs=\ngithub.com/subosito/gotenv v1.4.1/go.mod h1:ayKnFf/c6rvx/2iiLrJUk1e6plDbT3edrFNGqEflhK0=\ngithub.com/tdakkota/asciicheck v0.4.1 h1:bm0tbcmi0jezRA2b5kg4ozmMuGAFotKI3RZfrhfovg8=\ngithub.com/tdakkota/asciicheck v0.4.1/go.mod 
h1:0k7M3rCfRXb0Z6bwgvkEIMleKH3kXNz9UqJ9Xuqopr8=\ngithub.com/tenntenn/modver v1.0.1 h1:2klLppGhDgzJrScMpkj9Ujy3rXPUspSjAcev9tSEBgA=\ngithub.com/tenntenn/modver v1.0.1/go.mod h1:bePIyQPb7UeioSRkw3Q0XeMhYZSMx9B8ePqg6SAMGH0=\ngithub.com/tenntenn/text/transform v0.0.0-20200319021203-7eef512accb3 h1:f+jULpRQGxTSkNYKJ51yaw6ChIqO+Je8UqsTKN/cDag=\ngithub.com/tenntenn/text/transform v0.0.0-20200319021203-7eef512accb3/go.mod h1:ON8b8w4BN/kE1EOhwT0o+d62W65a6aPw1nouo9LMgyY=\ngithub.com/tetafro/godot v1.5.0 h1:aNwfVI4I3+gdxjMgYPus9eHmoBeJIbnajOyqZYStzuw=\ngithub.com/tetafro/godot v1.5.0/go.mod h1:2oVxTBSftRTh4+MVfUaUXR6bn2GDXCaMcOG4Dk3rfio=\ngithub.com/timakin/bodyclose v0.0.0-20241222091800-1db5c5ca4d67 h1:9LPGD+jzxMlnk5r6+hJnar67cgpDIz/iyD+rfl5r2Vk=\ngithub.com/timakin/bodyclose v0.0.0-20241222091800-1db5c5ca4d67/go.mod h1:mkjARE7Yr8qU23YcGMSALbIxTQ9r9QBVahQOBRfU460=\ngithub.com/timonwong/loggercheck v0.10.1 h1:uVZYClxQFpw55eh+PIoqM7uAOHMrhVcDoWDery9R8Lg=\ngithub.com/timonwong/loggercheck v0.10.1/go.mod h1:HEAWU8djynujaAVX7QI65Myb8qgfcZ1uKbdpg3ZzKl8=\ngithub.com/tomarrell/wrapcheck/v2 v2.10.0 h1:SzRCryzy4IrAH7bVGG4cK40tNUhmVmMDuJujy4XwYDg=\ngithub.com/tomarrell/wrapcheck/v2 v2.10.0/go.mod h1:g9vNIyhb5/9TQgumxQyOEqDHsmGYcGsVMOx/xGkqdMo=\ngithub.com/tommy-muehle/go-mnd/v2 v2.5.1 h1:NowYhSdyE/1zwK9QCLeRb6USWdoif80Ie+v+yU8u1Zw=\ngithub.com/tommy-muehle/go-mnd/v2 v2.5.1/go.mod h1:WsUAkMJMYww6l/ufffCD3m+P7LEvr8TnZn9lwVDlgzw=\ngithub.com/ultraware/funlen v0.2.0 h1:gCHmCn+d2/1SemTdYMiKLAHFYxTYz7z9VIDRaTGyLkI=\ngithub.com/ultraware/funlen v0.2.0/go.mod h1:ZE0q4TsJ8T1SQcjmkhN/w+MceuatI6pBFSxxyteHIJA=\ngithub.com/ultraware/whitespace v0.2.0 h1:TYowo2m9Nfj1baEQBjuHzvMRbp19i+RCcRYrSWoFa+g=\ngithub.com/ultraware/whitespace v0.2.0/go.mod h1:XcP1RLD81eV4BW8UhQlpaR+SDc2givTvyI8a586WjW8=\ngithub.com/uudashr/gocognit v1.2.0 h1:3BU9aMr1xbhPlvJLSydKwdLN3tEUUrzPSSM8S4hDYRA=\ngithub.com/uudashr/gocognit v1.2.0/go.mod h1:k/DdKPI6XBZO1q7HgoV2juESI2/Ofj9AcHPZhBBdrTU=\ngithub.com/uudashr/iface v1.3.1 
h1:bA51vmVx1UIhiIsQFSNq6GZ6VPTk3WNMZgRiCe9R29U=\ngithub.com/uudashr/iface v1.3.1/go.mod h1:4QvspiRd3JLPAEXBQ9AiZpLbJlrWWgRChOKDJEuQTdg=\ngithub.com/xen0n/gosmopolitan v1.3.0 h1:zAZI1zefvo7gcpbCOrPSHJZJYA9ZgLfJqtKzZ5pHqQM=\ngithub.com/xen0n/gosmopolitan v1.3.0/go.mod h1:rckfr5T6o4lBtM1ga7mLGKZmLxswUoH1zxHgNXOsEt4=\ngithub.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e h1:JVG44RsyaB9T2KIHavMF/ppJZNG9ZpyihvCd0w101no=\ngithub.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e/go.mod h1:RbqR21r5mrJuqunuUZ/Dhy/avygyECGrLceyNeo4LiM=\ngithub.com/yagipy/maintidx v1.0.0 h1:h5NvIsCz+nRDapQ0exNv4aJ0yXSI0420omVANTv3GJM=\ngithub.com/yagipy/maintidx v1.0.0/go.mod h1:0qNf/I/CCZXSMhsRsrEPDZ+DkekpKLXAJfsTACwgXLk=\ngithub.com/yeya24/promlinter v0.3.0 h1:JVDbMp08lVCP7Y6NP3qHroGAO6z2yGKQtS5JsjqtoFs=\ngithub.com/yeya24/promlinter v0.3.0/go.mod h1:cDfJQQYv9uYciW60QT0eeHlFodotkYZlL+YcPQN+mW4=\ngithub.com/ykadowak/zerologlint v0.1.5 h1:Gy/fMz1dFQN9JZTPjv1hxEk+sRWm05row04Yoolgdiw=\ngithub.com/ykadowak/zerologlint v0.1.5/go.mod h1:KaUskqF3e/v59oPmdq1U1DnKcuHokl2/K1U4pmIELKg=\ngithub.com/yuin/goldmark v1.1.25/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=\ngithub.com/yuin/goldmark v1.1.27/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=\ngithub.com/yuin/goldmark v1.1.32/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=\ngithub.com/yuin/goldmark v1.2.1/go.mod h1:3hX8gzYuyVAZsxl0MRgGTJEmQBFcNTphYh9decYSb74=\ngithub.com/yuin/goldmark v1.3.5/go.mod h1:mwnBkeHKe2W/ZEtQ+71ViKU8L12m81fl3OWwC1Zlc8k=\ngithub.com/yuin/goldmark v1.4.1/go.mod h1:mwnBkeHKe2W/ZEtQ+71ViKU8L12m81fl3OWwC1Zlc8k=\ngithub.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=\ngitlab.com/bosi/decorder v0.4.2 h1:qbQaV3zgwnBZ4zPMhGLW4KZe7A7NwxEhJx39R3shffo=\ngitlab.com/bosi/decorder v0.4.2/go.mod h1:muuhHoaJkA9QLcYHq4Mj8FJUwDZ+EirSHRiaTcTf6T8=\ngo-simpler.org/assert v0.9.0 h1:PfpmcSvL7yAnWyChSjOz6Sp6m9j5lyK8Ok9pEL31YkQ=\ngo-simpler.org/assert v0.9.0/go.mod 
h1:74Eqh5eI6vCK6Y5l3PI8ZYFXG4Sa+tkr70OIPJAUr28=\ngo-simpler.org/musttag v0.13.0 h1:Q/YAW0AHvaoaIbsPj3bvEI5/QFP7w696IMUpnKXQfCE=\ngo-simpler.org/musttag v0.13.0/go.mod h1:FTzIGeK6OkKlUDVpj0iQUXZLUO1Js9+mvykDQy9C5yM=\ngo-simpler.org/sloglint v0.9.0 h1:/40NQtjRx9txvsB/RN022KsUJU+zaaSb/9q9BSefSrE=\ngo-simpler.org/sloglint v0.9.0/go.mod h1:G/OrAF6uxj48sHahCzrbarVMptL2kjWTaUeC8+fOGww=\ngo.opencensus.io v0.21.0/go.mod h1:mSImk1erAIZhrmZN+AvHh14ztQfjbGwt4TtuofqLduU=\ngo.opencensus.io v0.22.0/go.mod h1:+kGneAE2xo2IficOXnaByMWTGM9T73dGwxeWcUqIpI8=\ngo.opencensus.io v0.22.2/go.mod h1:yxeiOL68Rb0Xd1ddK5vPZ/oVn4vY4Ynel7k9FzqtOIw=\ngo.opencensus.io v0.22.3/go.mod h1:yxeiOL68Rb0Xd1ddK5vPZ/oVn4vY4Ynel7k9FzqtOIw=\ngo.opencensus.io v0.22.4/go.mod h1:yxeiOL68Rb0Xd1ddK5vPZ/oVn4vY4Ynel7k9FzqtOIw=\ngo.uber.org/atomic v1.7.0 h1:ADUqmZGgLDDfbSL9ZmPxKTybcoEYHgpYfELNoN+7hsw=\ngo.uber.org/atomic v1.7.0/go.mod h1:fEN4uk6kAWBTFdckzkM89CLk9XfWZrxpCo0nPH17wJc=\ngo.uber.org/automaxprocs v1.6.0 h1:O3y2/QNTOdbF+e/dpXNNW7Rx2hZ4sTIPyybbxyNqTUs=\ngo.uber.org/automaxprocs v1.6.0/go.mod h1:ifeIMSnPZuznNm6jmdzmU3/bfk01Fe2fotchwEFJ8r8=\ngo.uber.org/goleak v1.1.11 h1:wy28qYRKZgnJTxGxvye5/wgWr1EKjmUDGYox5mGlRlI=\ngo.uber.org/goleak v1.1.11/go.mod h1:cwTWslyiVhfpKIDGSZEM2HlOvcqm+tG4zioyIeLoqMQ=\ngo.uber.org/multierr v1.6.0 h1:y6IPFStTAIT5Ytl7/XYmHvzXQ7S3g/IeZW9hyZ5thw4=\ngo.uber.org/multierr v1.6.0/go.mod h1:cdWPpRnG4AhwMwsgIHip0KRBQjJy5kYEpYjJxpXp9iU=\ngo.uber.org/zap v1.24.0 h1:FiJd5l1UOLj0wCgbSE0rwwXHzEdAZS6hiiSnxJN/D60=\ngo.uber.org/zap v1.24.0/go.mod h1:2kMP+WWQ8aoFoedH3T2sq6iJ2yDWpHbP0f6MQbS9Gkg=\ngolang.org/x/crypto v0.0.0-20180904163835-0709b304e793/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=\ngolang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=\ngolang.org/x/crypto v0.0.0-20190510104115-cbcb75029529/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=\ngolang.org/x/crypto v0.0.0-20190605123033-f99c8df09eb5/go.mod 
h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=\ngolang.org/x/crypto v0.0.0-20191011191535-87dc89f01550/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=\ngolang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=\ngolang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=\ngolang.org/x/crypto v0.13.0/go.mod h1:y6Z2r+Rw4iayiXXAIxJIDAJ1zMW4yaTpebo8fPOliYc=\ngolang.org/x/crypto v0.14.0/go.mod h1:MVFd36DqK4CsrnJYDkBA3VC4m2GkXAM0PvzMCn4JQf4=\ngolang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=\ngolang.org/x/exp v0.0.0-20190306152737-a1d7652674e8/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=\ngolang.org/x/exp v0.0.0-20190510132918-efd6b22b2522/go.mod h1:ZjyILWgesfNpC6sMxTJOJm9Kp84zZh5NQWvqDGG3Qr8=\ngolang.org/x/exp v0.0.0-20190829153037-c13cbed26979/go.mod h1:86+5VVa7VpoJ4kLfm080zCjGlMRFzhUhsZKEZO7MGek=\ngolang.org/x/exp v0.0.0-20191030013958-a1ab85dbe136/go.mod h1:JXzH8nQsPlswgeRAPE3MuO9GYsAcnJvJ4vnMwN/5qkY=\ngolang.org/x/exp v0.0.0-20191129062945-2f5052295587/go.mod h1:2RIsYlXP63K8oxa1u096TMicItID8zy7Y6sNkU49FU4=\ngolang.org/x/exp v0.0.0-20191227195350-da58074b4299/go.mod h1:2RIsYlXP63K8oxa1u096TMicItID8zy7Y6sNkU49FU4=\ngolang.org/x/exp v0.0.0-20200119233911-0405dc783f0a/go.mod h1:2RIsYlXP63K8oxa1u096TMicItID8zy7Y6sNkU49FU4=\ngolang.org/x/exp v0.0.0-20200207192155-f17229e696bd/go.mod h1:J/WKrq2StrnmMY6+EHIKF9dgMWnmCNThgcyBT1FY9mM=\ngolang.org/x/exp v0.0.0-20200224162631-6cc2880d07d6/go.mod h1:3jZMyOhIsHpP37uCMkUooju7aAi5cS1Q23tOzKc+0MU=\ngolang.org/x/exp v0.0.0-20240909161429-701f63a606c0 h1:e66Fs6Z+fZTbFBAxKfP3PALWBtpfqks2bwGcexMxgtk=\ngolang.org/x/exp v0.0.0-20240909161429-701f63a606c0/go.mod h1:2TbTHSBQa924w8M6Xs1QcRcFwyucIwBGpK1p2f1YFFY=\ngolang.org/x/exp/typeparams v0.0.0-20220428152302-39d4317da171/go.mod h1:AbB0pIl9nAr9wVwH+Z2ZpaocVmF5I4GyWCDIsVjR0bk=\ngolang.org/x/exp/typeparams 
v0.0.0-20230203172020-98cc5a0785f9/go.mod h1:AbB0pIl9nAr9wVwH+Z2ZpaocVmF5I4GyWCDIsVjR0bk=\ngolang.org/x/exp/typeparams v0.0.0-20250210185358-939b2ce775ac h1:TSSpLIG4v+p0rPv1pNOQtl1I8knsO4S9trOxNMOLVP4=\ngolang.org/x/exp/typeparams v0.0.0-20250210185358-939b2ce775ac/go.mod h1:AbB0pIl9nAr9wVwH+Z2ZpaocVmF5I4GyWCDIsVjR0bk=\ngolang.org/x/image v0.0.0-20190227222117-0694c2d4d067/go.mod h1:kZ7UVZpmo3dzQBMxlp+ypCbDeSB+sBbTgSJuh5dn5js=\ngolang.org/x/image v0.0.0-20190802002840-cff245a6509b/go.mod h1:FeLwcggjj3mMvU+oOTbSwawSJRM1uh48EjtB4UJZlP0=\ngolang.org/x/lint v0.0.0-20181026193005-c67002cb31c3/go.mod h1:UVdnD1Gm6xHRNCYTkRU2/jEulfH38KcIWyp/GAMgvoE=\ngolang.org/x/lint v0.0.0-20190227174305-5b3e6a55c961/go.mod h1:wehouNa3lNwaWXcvxsM5YxQ5yQlVC4a0KAMCusXpPoU=\ngolang.org/x/lint v0.0.0-20190301231843-5614ed5bae6f/go.mod h1:UVdnD1Gm6xHRNCYTkRU2/jEulfH38KcIWyp/GAMgvoE=\ngolang.org/x/lint v0.0.0-20190313153728-d0100b6bd8b3/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=\ngolang.org/x/lint v0.0.0-20190409202823-959b441ac422/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=\ngolang.org/x/lint v0.0.0-20190909230951-414d861bb4ac/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=\ngolang.org/x/lint v0.0.0-20190930215403-16217165b5de/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=\ngolang.org/x/lint v0.0.0-20191125180803-fdd1cda4f05f/go.mod h1:5qLYkcX4OjUUV8bRuDixDT3tpyyb+LUpUlRWLxfhWrs=\ngolang.org/x/lint v0.0.0-20200130185559-910be7a94367/go.mod h1:3xt1FjdF8hUf6vQPIChWIBhFzV8gjjsPE/fR3IyQdNY=\ngolang.org/x/lint v0.0.0-20200302205851-738671d3881b/go.mod h1:3xt1FjdF8hUf6vQPIChWIBhFzV8gjjsPE/fR3IyQdNY=\ngolang.org/x/mobile v0.0.0-20190312151609-d3739f865fa6/go.mod h1:z+o9i4GpDbdi3rU15maQ/Ox0txvL9dWGYEHz965HBQE=\ngolang.org/x/mobile v0.0.0-20190719004257-d2bd2a29d028/go.mod h1:E/iHnbuqvinMTCcRqshq8CkpyQDoeVncDDYHnLhea+o=\ngolang.org/x/mod v0.0.0-20190513183733-4bf6d317e70e/go.mod h1:mXi4GBBbnImb6dmsKGUJ2LatrhH/nqhxcFungHvyanc=\ngolang.org/x/mod v0.1.0/go.mod 
h1:0QHyrYULN0/3qlju5TqG8bIK38QM8yzMo5ekMj3DlcY=\ngolang.org/x/mod v0.1.1-0.20191105210325-c90efee705ee/go.mod h1:QqPTAvyqsEbceGzBzNggFXnrqF1CaUcvgkdR5Ot7KZg=\ngolang.org/x/mod v0.1.1-0.20191107180719-034126e5016b/go.mod h1:QqPTAvyqsEbceGzBzNggFXnrqF1CaUcvgkdR5Ot7KZg=\ngolang.org/x/mod v0.2.0/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=\ngolang.org/x/mod v0.3.0/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=\ngolang.org/x/mod v0.4.1/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=\ngolang.org/x/mod v0.4.2/go.mod h1:s0Qsj1ACt9ePp/hMypM3fl4fZqREWJwdYDEqhRiZZUA=\ngolang.org/x/mod v0.6.0-dev.0.20220106191415-9b9b3d81d5e3/go.mod h1:3p9vT2HGsQu2K1YbXdKPJLVgG5VJdoTa1poYQBtP1AY=\ngolang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=\ngolang.org/x/mod v0.7.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=\ngolang.org/x/mod v0.8.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=\ngolang.org/x/mod v0.9.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=\ngolang.org/x/mod v0.12.0/go.mod h1:iBbtSCu2XBx23ZKBPSOrRkjjQPZFPuis4dIYUhu/chs=\ngolang.org/x/mod v0.13.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=\ngolang.org/x/mod v0.24.0 h1:ZfthKaKaT4NrhGVZHO1/WDTwGES4De8KtWO0SIbNJMU=\ngolang.org/x/mod v0.24.0/go.mod h1:IXM97Txy2VM4PJ3gI61r1YEk/gAj6zAHN3AdZt6S9Ww=\ngolang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=\ngolang.org/x/net v0.0.0-20180826012351-8a410e7b638d/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=\ngolang.org/x/net v0.0.0-20181114220301-adae6a3d119a/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=\ngolang.org/x/net v0.0.0-20190108225652-1e06a53dbb7e/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=\ngolang.org/x/net v0.0.0-20190213061140-3a22650c66bd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=\ngolang.org/x/net v0.0.0-20190311183353-d8887717615a/go.mod 
h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=\ngolang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=\ngolang.org/x/net v0.0.0-20190501004415-9ce7a6920f09/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=\ngolang.org/x/net v0.0.0-20190503192946-f4e77d36d62c/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=\ngolang.org/x/net v0.0.0-20190603091049-60506f45cf65/go.mod h1:HSz+uSET+XFnRR8LxR5pz3Of3rY3CfYBVs4xY44aLks=\ngolang.org/x/net v0.0.0-20190613194153-d28f0bde5980/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20190620200207-3b0461eec859/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20190628185345-da137c7871d7/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20190724013045-ca1201d0de80/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20191209160850-c0dbc17a3553/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20200114155413-6afb5195e5aa/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20200202094626-16171245cfb2/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20200222125558-5a598a2470a0/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20200226121028-0de0cce0169b/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20200301022130-244492dfa37a/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=\ngolang.org/x/net v0.0.0-20200324143707-d3edc9973b7e/go.mod h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=\ngolang.org/x/net v0.0.0-20200501053045-e0ff5e5a1de5/go.mod h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=\ngolang.org/x/net v0.0.0-20200506145744-7e3656a0809f/go.mod h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=\ngolang.org/x/net v0.0.0-20200513185701-a91f0712d120/go.mod 
h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=\ngolang.org/x/net v0.0.0-20200520182314-0ba52f642ac2/go.mod h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=\ngolang.org/x/net v0.0.0-20200625001655-4c5254603344/go.mod h1:/O7V0waA8r7cgGh81Ro3o1hOxt32SMVPicZroKQ2sZA=\ngolang.org/x/net v0.0.0-20200707034311-ab3426394381/go.mod h1:/O7V0waA8r7cgGh81Ro3o1hOxt32SMVPicZroKQ2sZA=\ngolang.org/x/net v0.0.0-20200822124328-c89045814202/go.mod h1:/O7V0waA8r7cgGh81Ro3o1hOxt32SMVPicZroKQ2sZA=\ngolang.org/x/net v0.0.0-20201021035429-f5854403a974/go.mod h1:sp8m0HH+o8qH0wwXwYZr8TS3Oi6o0r6Gce1SSxlDquU=\ngolang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=\ngolang.org/x/net v0.0.0-20210405180319-a5a99cb37ef4/go.mod h1:p54w0d4576C0XHj96bSt6lcn1PtDYWL6XObtHCRCNQM=\ngolang.org/x/net v0.0.0-20210525063256-abc453219eb5/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=\ngolang.org/x/net v0.0.0-20211015210444-4f30a5c0130f/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=\ngolang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=\ngolang.org/x/net v0.2.0/go.mod h1:KqCZLdyyvdV855qA2rE3GC2aiw5xGR5TEjj8smXukLY=\ngolang.org/x/net v0.6.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=\ngolang.org/x/net v0.8.0/go.mod h1:QVkue5JL9kW//ek3r6jTKnTFis1tRmNAW2P1shuFdJc=\ngolang.org/x/net v0.10.0/go.mod h1:0qNGK6F8kojg2nk9dLZ2mShWaEBan6FAoqfSigmmuDg=\ngolang.org/x/net v0.15.0/go.mod h1:idbUs1IY1+zTqbi8yxTbhexhEEk5ur9LInksu6HrEpk=\ngolang.org/x/net v0.16.0/go.mod h1:NxSsAGuq816PNPmqtQdLE42eU2Fs7NoRIZrHJAlaCOE=\ngolang.org/x/net v0.37.0 h1:1zLorHbz+LYj7MQlSf1+2tPIIgibq2eL5xkrGk6f+2c=\ngolang.org/x/net v0.37.0/go.mod h1:ivrbrMbzFq5J41QOQh0siUuly180yBYtLp+CKbEaFx8=\ngolang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=\ngolang.org/x/oauth2 v0.0.0-20190226205417-e64efc72b421/go.mod 
h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=\ngolang.org/x/oauth2 v0.0.0-20190604053449-0f29369cfe45/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=\ngolang.org/x/oauth2 v0.0.0-20191202225959-858c2ad4c8b6/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=\ngolang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=\ngolang.org/x/oauth2 v0.0.0-20210514164344-f6687ab2804c/go.mod h1:KelEdhl1UZF7XfJ4dDtk6s++YSgaE7mD/BuKKDLBl4A=\ngolang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20190227155943-e225da77a7e6/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20190911185100-cd5d95a43a6e/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20200317015054-43a5402ce75a/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20200625203802-6e8e738ad208/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20201020160332-67f06af15bc9/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20201207232520-09787c993a3a/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20210220032951-036812b2e83c/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=\ngolang.org/x/sync v0.3.0/go.mod h1:FU7BRWz2tNW+3quACPkgCx/L+uEAv1htQ0V83Z9Rj+Y=\ngolang.org/x/sync v0.4.0/go.mod 
h1:FU7BRWz2tNW+3quACPkgCx/L+uEAv1htQ0V83Z9Rj+Y=\ngolang.org/x/sync v0.12.0 h1:MHc5BpPuC30uJk597Ri8TV3CNZcTLu6B6z4lJy+g6Jw=\ngolang.org/x/sync v0.12.0/go.mod h1:1dzgHSNfp02xaA81J2MS99Qcpr2w7fw1gpm99rleRqA=\ngolang.org/x/sys v0.0.0-20180830151530-49385e6e1522/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20180905080454-ebe1bf3edb33/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20181116152217-5ac8a444bdc5/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=\ngolang.org/x/sys v0.0.0-20190312061237-fead79001313/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190422165155-953cdadca894/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190502145724-3ef323f4f1fd/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190507160741-ecd444e8653b/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190606165138-5da285871e9c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190624142023-c5567b49c5d0/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20190726091711-fc99dfbffb4e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20191001151750-bb3f8db39f24/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20191204072324-ce4227a45e2e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20191228213918-04cbcbbfeed8/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200106162015-b016eb3dc98e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200113162924-86b910548bc1/go.mod 
h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200122134326-e047566fdf82/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200202164722-d101bd2416d5/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200212091648-12a6c2dcc1e4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200223170610-d5e6a3e2c0ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200302150141-5c8b2ff67527/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200323222414-85ca7c5b95cd/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200331124033-c3d80250170d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200501052902-10377860bb8e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200511232937-7e40ca221e25/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200515095857-1151b9dac4a9/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200523222454-059865788121/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200615200032-f1bc736245b1/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200625212154-ddb9806d33ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200803210538-64077c9b5642/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20200930185726-fdedc70b468f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210124154548-22da62e12c0c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210330210617-4fbd30eecc44/go.mod 
h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=\ngolang.org/x/sys v0.0.0-20210510120138-977fb7262007/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20210603081109-ebe580a85c40/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20211019181941-9d821ace8654/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20211105183446-c75c47738b0c/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220114195835-da31bd327af9/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220412211240-33da011f77ad/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.2.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.8.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.12.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.13.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=\ngolang.org/x/sys v0.31.0 h1:ioabZlmFYtWhL+TRYpcnNlLwhyxaM9kWTDEmfnprqik=\ngolang.org/x/sys v0.31.0/go.mod h1:BJP2sWEmIv4KK5OTEluFJCKSidICx8ciO85XgH3Ak8k=\ngolang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod 
h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=\ngolang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=\ngolang.org/x/term v0.2.0/go.mod h1:TVmDHMZPmdnySmBfhjOoOdhjzdE1h4u1VwSiw2l1Nuc=\ngolang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=\ngolang.org/x/term v0.6.0/go.mod h1:m6U89DPEgQRMq3DNkDClhWw02AUbt2daBVO4cn4Hv9U=\ngolang.org/x/term v0.8.0/go.mod h1:xPskH00ivmX89bAKVGSKKtLOWNx2+17Eiy94tnKShWo=\ngolang.org/x/term v0.12.0/go.mod h1:owVbMEjm3cBLCHdkQu9b1opXd4ETQWc3BhuQGKgXgvU=\ngolang.org/x/term v0.13.0/go.mod h1:LTmsnFJwVN6bCy1rVCoS+qHT1HhALEFxKncY3WNNh4U=\ngolang.org/x/text v0.0.0-20170915032832-14c0d48ead0c/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=\ngolang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=\ngolang.org/x/text v0.3.1-0.20180807135948-17ff2d5776d2/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=\ngolang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=\ngolang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=\ngolang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=\ngolang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=\ngolang.org/x/text v0.4.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=\ngolang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=\ngolang.org/x/text v0.8.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=\ngolang.org/x/text v0.9.0/go.mod h1:e1OnstbJyHTd6l/uOt8jFFHp6TRDWZR/bV3emEE/zU8=\ngolang.org/x/text v0.13.0/go.mod h1:TvPlkZtksWOMsz7fbANvkp4WM8x/WCo/om8BMLbz+aE=\ngolang.org/x/text v0.23.0 h1:D71I7dUrlY+VX0gQShAThNGHFxZ13dGLBHQLVl1mJlY=\ngolang.org/x/text v0.23.0/go.mod h1:/BLNzu4aZCJ1+kcD0DNRotWKage4q2rGVAg4o22unh4=\ngolang.org/x/time v0.0.0-20181108054448-85acf8d2951c/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=\ngolang.org/x/time 
v0.0.0-20190308202827-9d24e82272b4/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=\ngolang.org/x/time v0.0.0-20191024005414-555d28b269f0/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=\ngolang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=\ngolang.org/x/tools v0.0.0-20190114222345-bf090417da8b/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=\ngolang.org/x/tools v0.0.0-20190226205152-f727befe758c/go.mod h1:9Yl7xja0Znq3iFh3HoIrodX9oNMXvdceNzlUR8zjMvY=\ngolang.org/x/tools v0.0.0-20190311212946-11955173bddd/go.mod h1:LCzVGOaR6xXOjkQ3onu1FJEFr0SW1gC7cKk1uF8kGRs=\ngolang.org/x/tools v0.0.0-20190312151545-0bb0c0a6e846/go.mod h1:LCzVGOaR6xXOjkQ3onu1FJEFr0SW1gC7cKk1uF8kGRs=\ngolang.org/x/tools v0.0.0-20190312170243-e65039ee4138/go.mod h1:LCzVGOaR6xXOjkQ3onu1FJEFr0SW1gC7cKk1uF8kGRs=\ngolang.org/x/tools v0.0.0-20190425150028-36563e24a262/go.mod h1:RgjU9mgBXZiqYHBnxXauZ1Gv1EHHAz9KjViQ78xBX0Q=\ngolang.org/x/tools v0.0.0-20190506145303-2d16b83fe98c/go.mod h1:RgjU9mgBXZiqYHBnxXauZ1Gv1EHHAz9KjViQ78xBX0Q=\ngolang.org/x/tools v0.0.0-20190524140312-2c0ae7006135/go.mod h1:RgjU9mgBXZiqYHBnxXauZ1Gv1EHHAz9KjViQ78xBX0Q=\ngolang.org/x/tools v0.0.0-20190606124116-d0a3d012864b/go.mod h1:/rFqwRUd4F7ZHNgwSSTFct+R/Kf4OFW1sUzUTQQTgfc=\ngolang.org/x/tools v0.0.0-20190621195816-6e04913cbbac/go.mod h1:/rFqwRUd4F7ZHNgwSSTFct+R/Kf4OFW1sUzUTQQTgfc=\ngolang.org/x/tools v0.0.0-20190628153133-6cdbf07be9d0/go.mod h1:/rFqwRUd4F7ZHNgwSSTFct+R/Kf4OFW1sUzUTQQTgfc=\ngolang.org/x/tools v0.0.0-20190816200558-6889da9d5479/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.0.0-20190911174233-4f2ddba30aff/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.0.0-20191012152004-8de300cfc20a/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.0.0-20191113191852-77e3bb0ad9e7/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools 
v0.0.0-20191115202509-3a792d9c32b2/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.0.0-20191125144606-a911d9008d1f/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.0.0-20191130070609-6e064ea0cf2d/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=\ngolang.org/x/tools v0.0.0-20191216173652-a0e659d51361/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20191227053925-7b8e75db28f4/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200117161641-43d50277825c/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200122220014-bf1340f18c4a/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200130002326-2f3ba24bd6e7/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200204074204-1cc6d1ef6c74/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200207183749-b753a1ba74fa/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200212150539-ea181f53ac56/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200224181240-023911ca70b2/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200227222343-706bc42d1f0d/go.mod h1:TB2adYChydJhpapKDTa4BR/hXlZSLoq2Wpct/0txZ28=\ngolang.org/x/tools v0.0.0-20200304193943-95d2e580d8eb/go.mod h1:o4KQGtdN14AW+yjsvvwRTJJuXz8XRtIHtEnmAXLyFUw=\ngolang.org/x/tools v0.0.0-20200312045724-11d5b4c81c7d/go.mod h1:o4KQGtdN14AW+yjsvvwRTJJuXz8XRtIHtEnmAXLyFUw=\ngolang.org/x/tools v0.0.0-20200324003944-a576cf524670/go.mod h1:Sl4aGygMT6LrqrWclx+PTx3U+LnKx/seiNR+3G19Ar8=\ngolang.org/x/tools v0.0.0-20200329025819-fd4102a86c65/go.mod h1:Sl4aGygMT6LrqrWclx+PTx3U+LnKx/seiNR+3G19Ar8=\ngolang.org/x/tools 
v0.0.0-20200331025713-a30bf2db82d4/go.mod h1:Sl4aGygMT6LrqrWclx+PTx3U+LnKx/seiNR+3G19Ar8=\ngolang.org/x/tools v0.0.0-20200501065659-ab2804fb9c9d/go.mod h1:EkVYQZoAsY45+roYkvgYkIh4xh/qjgUK9TdY2XT94GE=\ngolang.org/x/tools v0.0.0-20200512131952-2bc93b1c0c88/go.mod h1:EkVYQZoAsY45+roYkvgYkIh4xh/qjgUK9TdY2XT94GE=\ngolang.org/x/tools v0.0.0-20200515010526-7d3b6ebf133d/go.mod h1:EkVYQZoAsY45+roYkvgYkIh4xh/qjgUK9TdY2XT94GE=\ngolang.org/x/tools v0.0.0-20200618134242-20370b0cb4b2/go.mod h1:EkVYQZoAsY45+roYkvgYkIh4xh/qjgUK9TdY2XT94GE=\ngolang.org/x/tools v0.0.0-20200724022722-7017fd6b1305/go.mod h1:njjCfa9FT2d7l9Bc6FUM5FLjQPp3cFF28FI3qnDFljA=\ngolang.org/x/tools v0.0.0-20200729194436-6467de6f59a7/go.mod h1:njjCfa9FT2d7l9Bc6FUM5FLjQPp3cFF28FI3qnDFljA=\ngolang.org/x/tools v0.0.0-20200804011535-6c149bb5ef0d/go.mod h1:njjCfa9FT2d7l9Bc6FUM5FLjQPp3cFF28FI3qnDFljA=\ngolang.org/x/tools v0.0.0-20200820010801-b793a1359eac/go.mod h1:njjCfa9FT2d7l9Bc6FUM5FLjQPp3cFF28FI3qnDFljA=\ngolang.org/x/tools v0.0.0-20200825202427-b303f430e36d/go.mod h1:njjCfa9FT2d7l9Bc6FUM5FLjQPp3cFF28FI3qnDFljA=\ngolang.org/x/tools v0.0.0-20201023174141-c8cfbd0f21e6/go.mod h1:emZCQorbCU4vsT4fOWvOPXz4eW1wZW4PmDk9uLelYpA=\ngolang.org/x/tools v0.1.1-0.20210205202024-ef80cdb6ec6d/go.mod h1:9bzcO0MWcOuT0tm1iBGzDVPshzfwoVvREIui8C+MHqU=\ngolang.org/x/tools v0.1.1-0.20210302220138-2ac05c832e1a/go.mod h1:9bzcO0MWcOuT0tm1iBGzDVPshzfwoVvREIui8C+MHqU=\ngolang.org/x/tools v0.1.1/go.mod h1:o0xws9oXOQQZyjljx8fwUC0k7L1pTE6eaCbjGeHmOkk=\ngolang.org/x/tools v0.1.5/go.mod h1:o0xws9oXOQQZyjljx8fwUC0k7L1pTE6eaCbjGeHmOkk=\ngolang.org/x/tools v0.1.10/go.mod h1:Uh6Zz+xoGYZom868N8YTex3t7RhtHDBrE8Gzo9bV56E=\ngolang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=\ngolang.org/x/tools v0.3.0/go.mod h1:/rWhSS2+zyEVwoJf8YAX6L2f0ntZ7Kn/mGgAWcipA5k=\ngolang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=\ngolang.org/x/tools v0.7.0/go.mod 
h1:4pg6aUX35JBAogB10C9AtvVL+qowtN4pT3CGSQex14s=\ngolang.org/x/tools v0.13.0/go.mod h1:HvlwmtVNQAhOuCjW7xxvovg8wbNq7LwfXh/k7wXUl58=\ngolang.org/x/tools v0.14.0/go.mod h1:uYBEerGOWcJyEORxN+Ek8+TT266gXkNlHdJBwexUsBg=\ngolang.org/x/tools v0.31.0 h1:0EedkvKDbh+qistFTd0Bcwe/YLh4vHwWEkiI0toFIBU=\ngolang.org/x/tools v0.31.0/go.mod h1:naFTU+Cev749tSJRXJlna0T3WxKvb1kWEx15xA4SdmQ=\ngolang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngolang.org/x/xerrors v0.0.0-20191011141410-1b5146add898/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngolang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngolang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=\ngoogle.golang.org/api v0.4.0/go.mod h1:8k5glujaEP+g9n7WNsDg8QP6cUVNI86fCNMcbazEtwE=\ngoogle.golang.org/api v0.7.0/go.mod h1:WtwebWUNSVBH/HAw79HIFXZNqEvBhG+Ra+ax0hx3E3M=\ngoogle.golang.org/api v0.8.0/go.mod h1:o4eAsZoiT+ibD93RtjEohWalFOjRDx6CVaqeizhEnKg=\ngoogle.golang.org/api v0.9.0/go.mod h1:o4eAsZoiT+ibD93RtjEohWalFOjRDx6CVaqeizhEnKg=\ngoogle.golang.org/api v0.13.0/go.mod h1:iLdEw5Ide6rF15KTC1Kkl0iskquN2gFfn9o9XIsbkAI=\ngoogle.golang.org/api v0.14.0/go.mod h1:iLdEw5Ide6rF15KTC1Kkl0iskquN2gFfn9o9XIsbkAI=\ngoogle.golang.org/api v0.15.0/go.mod h1:iLdEw5Ide6rF15KTC1Kkl0iskquN2gFfn9o9XIsbkAI=\ngoogle.golang.org/api v0.17.0/go.mod h1:BwFmGc8tA3vsd7r/7kR8DY7iEEGSU04BFxCo5jP/sfE=\ngoogle.golang.org/api v0.18.0/go.mod h1:BwFmGc8tA3vsd7r/7kR8DY7iEEGSU04BFxCo5jP/sfE=\ngoogle.golang.org/api v0.19.0/go.mod h1:BwFmGc8tA3vsd7r/7kR8DY7iEEGSU04BFxCo5jP/sfE=\ngoogle.golang.org/api v0.20.0/go.mod h1:BwFmGc8tA3vsd7r/7kR8DY7iEEGSU04BFxCo5jP/sfE=\ngoogle.golang.org/api v0.22.0/go.mod h1:BwFmGc8tA3vsd7r/7kR8DY7iEEGSU04BFxCo5jP/sfE=\ngoogle.golang.org/api v0.24.0/go.mod h1:lIXQywCXRcnZPGlsd8NbLnOjtAoL6em04bJ9+z0MncE=\ngoogle.golang.org/api v0.28.0/go.mod 
h1:lIXQywCXRcnZPGlsd8NbLnOjtAoL6em04bJ9+z0MncE=\ngoogle.golang.org/api v0.29.0/go.mod h1:Lcubydp8VUV7KeIHD9z2Bys/sm/vGKnG1UHuDBSrHWM=\ngoogle.golang.org/api v0.30.0/go.mod h1:QGmEvQ87FHZNiUVJkT14jQNYJ4ZJjdRF23ZXz5138Fc=\ngoogle.golang.org/appengine v1.1.0/go.mod h1:EbEs0AVv82hx2wNQdGPgUI5lhzA/G0D9YwlJXL52JkM=\ngoogle.golang.org/appengine v1.4.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=\ngoogle.golang.org/appengine v1.5.0/go.mod h1:xpcJRLb0r/rnEns0DIKYYv+WjYCduHsrkT7/EB5XEv4=\ngoogle.golang.org/appengine v1.6.1/go.mod h1:i06prIuMbXzDqacNJfV5OdTW448YApPu5ww/cMBSeb0=\ngoogle.golang.org/appengine v1.6.5/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=\ngoogle.golang.org/appengine v1.6.6/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=\ngoogle.golang.org/genproto v0.0.0-20180817151627-c66870c02cf8/go.mod h1:JiN7NxoALGmiZfu7CAH4rXhgtRTLTxftemlI0sWmxmc=\ngoogle.golang.org/genproto v0.0.0-20190307195333-5fe7a883aa19/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=\ngoogle.golang.org/genproto v0.0.0-20190418145605-e7d98fc518a7/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=\ngoogle.golang.org/genproto v0.0.0-20190425155659-357c62f0e4bb/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=\ngoogle.golang.org/genproto v0.0.0-20190502173448-54afdca5d873/go.mod h1:VzzqZJRnGkLBvHegQrXjBqPurQTc5/KpmUdxsrq26oE=\ngoogle.golang.org/genproto v0.0.0-20190801165951-fa694d86fc64/go.mod h1:DMBHOl98Agz4BDEuKkezgsaosCRResVns1a3J2ZsMNc=\ngoogle.golang.org/genproto v0.0.0-20190819201941-24fa4b261c55/go.mod h1:DMBHOl98Agz4BDEuKkezgsaosCRResVns1a3J2ZsMNc=\ngoogle.golang.org/genproto v0.0.0-20190911173649-1774047e7e51/go.mod h1:IbNlFCBrqXvoKpeg0TB2l7cyZUmoaFKYIwrEpbDKLA8=\ngoogle.golang.org/genproto v0.0.0-20191108220845-16a3f7862a1a/go.mod h1:n3cpQtvxv34hfy77yVDNjmbRyujviMdxYliBSkLhpCc=\ngoogle.golang.org/genproto v0.0.0-20191115194625-c23dd37a84c9/go.mod h1:n3cpQtvxv34hfy77yVDNjmbRyujviMdxYliBSkLhpCc=\ngoogle.golang.org/genproto 
v0.0.0-20191216164720-4f79533eabd1/go.mod h1:n3cpQtvxv34hfy77yVDNjmbRyujviMdxYliBSkLhpCc=\ngoogle.golang.org/genproto v0.0.0-20191230161307-f3c370f40bfb/go.mod h1:n3cpQtvxv34hfy77yVDNjmbRyujviMdxYliBSkLhpCc=\ngoogle.golang.org/genproto v0.0.0-20200115191322-ca5a22157cba/go.mod h1:n3cpQtvxv34hfy77yVDNjmbRyujviMdxYliBSkLhpCc=\ngoogle.golang.org/genproto v0.0.0-20200122232147-0452cf42e150/go.mod h1:n3cpQtvxv34hfy77yVDNjmbRyujviMdxYliBSkLhpCc=\ngoogle.golang.org/genproto v0.0.0-20200204135345-fa8e72b47b90/go.mod h1:GmwEX6Z4W5gMy59cAlVYjN9JhxgbQH6Gn+gFDQe2lzA=\ngoogle.golang.org/genproto v0.0.0-20200212174721-66ed5ce911ce/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=\ngoogle.golang.org/genproto v0.0.0-20200224152610-e50cd9704f63/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=\ngoogle.golang.org/genproto v0.0.0-20200228133532-8c2c7df3a383/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=\ngoogle.golang.org/genproto v0.0.0-20200305110556-506484158171/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=\ngoogle.golang.org/genproto v0.0.0-20200312145019-da6875a35672/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=\ngoogle.golang.org/genproto v0.0.0-20200331122359-1ee6d9798940/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=\ngoogle.golang.org/genproto v0.0.0-20200430143042-b979b6f78d84/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=\ngoogle.golang.org/genproto v0.0.0-20200511104702-f5ebc3bea380/go.mod h1:55QSHmfGQM9UVYDPBsyGGes0y52j32PQ3BqQfXhyH3c=\ngoogle.golang.org/genproto v0.0.0-20200515170657-fc4c6c6a6587/go.mod h1:YsZOwe1myG/8QRHRsmBRE1LrgQY60beZKjly0O1fX9U=\ngoogle.golang.org/genproto v0.0.0-20200526211855-cb27e3aa2013/go.mod h1:NbSheEEYHJ7i3ixzK3sjbqSGDJWnxyFXZblF3eUsNvo=\ngoogle.golang.org/genproto v0.0.0-20200618031413-b414f8b61790/go.mod h1:jDfRM7FcilCzHH/e9qn6dsT145K34l5v+OpcnNgKAAA=\ngoogle.golang.org/genproto v0.0.0-20200729003335-053ba62fc06f/go.mod 
h1:FWY/as6DDZQgahTzZj3fqbO1CbirC29ZNUFHwi0/+no=\ngoogle.golang.org/genproto v0.0.0-20200804131852-c06518451d9c/go.mod h1:FWY/as6DDZQgahTzZj3fqbO1CbirC29ZNUFHwi0/+no=\ngoogle.golang.org/genproto v0.0.0-20200825200019-8632dd797987/go.mod h1:FWY/as6DDZQgahTzZj3fqbO1CbirC29ZNUFHwi0/+no=\ngoogle.golang.org/grpc v1.19.0/go.mod h1:mqu4LbDTu4XGKhr4mRzUsmM4RtVoemTSY81AxZiDr8c=\ngoogle.golang.org/grpc v1.20.1/go.mod h1:10oTOabMzJvdu6/UiuZezV6QK5dSlG84ov/aaiqXj38=\ngoogle.golang.org/grpc v1.21.1/go.mod h1:oYelfM1adQP15Ek0mdvEgi9Df8B9CZIaU1084ijfRaM=\ngoogle.golang.org/grpc v1.23.0/go.mod h1:Y5yQAOtifL1yxbo5wqy6BxZv8vAUGQwXBOALyacEbxg=\ngoogle.golang.org/grpc v1.25.1/go.mod h1:c3i+UQWmh7LiEpx4sFZnkU36qjEYZ0imhYfXVyQciAY=\ngoogle.golang.org/grpc v1.26.0/go.mod h1:qbnxyOmOxrQa7FizSgH+ReBfzJrCY1pSN7KXBS8abTk=\ngoogle.golang.org/grpc v1.27.0/go.mod h1:qbnxyOmOxrQa7FizSgH+ReBfzJrCY1pSN7KXBS8abTk=\ngoogle.golang.org/grpc v1.27.1/go.mod h1:qbnxyOmOxrQa7FizSgH+ReBfzJrCY1pSN7KXBS8abTk=\ngoogle.golang.org/grpc v1.28.0/go.mod h1:rpkK4SK4GF4Ach/+MFLZUBavHOvF2JJB5uozKKal+60=\ngoogle.golang.org/grpc v1.29.1/go.mod h1:itym6AZVZYACWQqET3MqgPpjcuV5QH3BxFS3IjizoKk=\ngoogle.golang.org/grpc v1.30.0/go.mod h1:N36X2cJ7JwdamYAgDz+s+rVMFjt3numwzf/HckM8pak=\ngoogle.golang.org/grpc v1.31.0/go.mod h1:N36X2cJ7JwdamYAgDz+s+rVMFjt3numwzf/HckM8pak=\ngoogle.golang.org/protobuf v0.0.0-20200109180630-ec00e32a8dfd/go.mod h1:DFci5gLYBciE7Vtevhsrf46CRTquxDuWsQurQQe4oz8=\ngoogle.golang.org/protobuf v0.0.0-20200221191635-4d8936d0db64/go.mod h1:kwYJMbMJ01Woi6D6+Kah6886xMZcty6N08ah7+eCXa0=\ngoogle.golang.org/protobuf v0.0.0-20200228230310-ab0ca4ff8a60/go.mod h1:cfTl7dwQJ+fmap5saPgwCLgHXTUD7jkjRqWcaiX5VyM=\ngoogle.golang.org/protobuf v1.20.1-0.20200309200217-e05f789c0967/go.mod h1:A+miEFZTKqfCUM6K7xSMQL9OKL/b6hQv+e19PK+JZNE=\ngoogle.golang.org/protobuf v1.21.0/go.mod h1:47Nbq4nVaFHyn7ilMalzfO3qCViNmqZ2kzikPIcrTAo=\ngoogle.golang.org/protobuf v1.22.0/go.mod 
h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=\ngoogle.golang.org/protobuf v1.23.0/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=\ngoogle.golang.org/protobuf v1.23.1-0.20200526195155-81db48ad09cc/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=\ngoogle.golang.org/protobuf v1.24.0/go.mod h1:r/3tXBNzIEhYS9I1OUVjXDlt8tc493IdKGjtUeSXeh4=\ngoogle.golang.org/protobuf v1.25.0/go.mod h1:9JNX74DMeImyA3h4bdi1ymwjUzf21/xIlbajtzgsN7c=\ngoogle.golang.org/protobuf v1.26.0-rc.1/go.mod h1:jlhhOSvTdKEhbULTjvd4ARK9grFBp09yW+WbY/TyQbw=\ngoogle.golang.org/protobuf v1.26.0/go.mod h1:9q0QmTI4eRPtz6boOQmLYwt+qCgq0jsYwAQnmE0givc=\ngoogle.golang.org/protobuf v1.36.5 h1:tPhr+woSbjfYvY6/GPufUoYizxw1cF/yFoxJ2fmpwlM=\ngoogle.golang.org/protobuf v1.36.5/go.mod h1:9fA7Ob0pmnwhb644+1+CVWFRbNajQ6iRojtC/QF5bRE=\ngopkg.in/alecthomas/kingpin.v2 v2.2.6/go.mod h1:FMv+mEhP44yOT+4EoQTLFTRgOQ1FBLkstjWtayDeSgw=\ngopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=\ngopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=\ngopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=\ngopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=\ngopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=\ngopkg.in/errgo.v2 v2.1.0/go.mod h1:hNsd1EY+bozCKY1Ytp96fpM3vjJbqLJn88ws8XvfDNI=\ngopkg.in/ini.v1 v1.67.0 h1:Dgnx+6+nfE+IfzjUEISNeydPJh9AXNNsWbGP9KzCsOA=\ngopkg.in/ini.v1 v1.67.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=\ngopkg.in/yaml.v2 v2.2.1/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=\ngopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=\ngopkg.in/yaml.v2 v2.2.4/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=\ngopkg.in/yaml.v2 v2.2.5/go.mod 
h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=\ngopkg.in/yaml.v2 v2.3.0/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=\ngopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=\ngopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=\ngopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=\ngopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=\ngopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=\nhonnef.co/go/tools v0.0.0-20190102054323-c2f93a96b099/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=\nhonnef.co/go/tools v0.0.0-20190106161140-3f1c8253044a/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=\nhonnef.co/go/tools v0.0.0-20190418001031-e561f6794a2a/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=\nhonnef.co/go/tools v0.0.0-20190523083050-ea95bdfd59fc/go.mod h1:rf3lG4BRIbNafJWhAfAdb/ePZxsR/4RtNHQocxwk9r4=\nhonnef.co/go/tools v0.0.1-2019.2.3/go.mod h1:a3bituU0lyd329TUQxRnasdCoJDkEUEAqEt0JzvZhAg=\nhonnef.co/go/tools v0.0.1-2020.1.3/go.mod h1:X/FiERA/W4tHapMX5mGpAtMSVEeEUOyHaw9vFzvIQ3k=\nhonnef.co/go/tools v0.0.1-2020.1.4/go.mod h1:X/FiERA/W4tHapMX5mGpAtMSVEeEUOyHaw9vFzvIQ3k=\nhonnef.co/go/tools v0.6.1 h1:R094WgE8K4JirYjBaOpz/AvTyUu/3wbmAoskKN/pxTI=\nhonnef.co/go/tools v0.6.1/go.mod h1:3puzxxljPCe8RGJX7BIy1plGbxEOZni5mR2aXe3/uk4=\nmvdan.cc/gofumpt v0.7.0 h1:bg91ttqXmi9y2xawvkuMXyvAA/1ZGJqYAEGjXuP0JXU=\nmvdan.cc/gofumpt v0.7.0/go.mod h1:txVFJy/Sc/mvaycET54pV8SW8gWxTlUuGHVEcncmNUo=\nmvdan.cc/unparam v0.0.0-20250301125049-0df0534333a4 h1:WjUu4yQoT5BHT1w8Zu56SP8367OuBV5jvo+4Ulppyf8=\nmvdan.cc/unparam v0.0.0-20250301125049-0df0534333a4/go.mod h1:rthT7OuvRbaGcd5ginj6dA2oLE7YNlta9qhBNNdCaLE=\nrsc.io/binaryregexp v0.2.0/go.mod h1:qTv7/COck+e2FymRvadv62gMdZztPaShugOCi3I+8D8=\nrsc.io/quote/v3 v3.1.0/go.mod h1:yEA65RcK8LyAZtP9Kv3t0HmxON59tX3rD+tICJqUlj0=\nrsc.io/sampler v1.3.0/go.mod 
h1:T1hPZKmBbMNahiBKFy5HrXp6adAjACjK9JXDnKaTXpA=\n"
  },
  {
    "path": "tools/pedantic_imports/main.go",
    "content": "package main\n\nimport (\n\t\"bufio\"\n\t\"fmt\"\n\t\"log\"\n\t\"os\"\n\t\"path/filepath\"\n\t\"strings\"\n)\n\nfunc main() {\n\tbasePackage := os.Args[1]\n\tsrc := os.Args[2]\n\n\tmessages, err := collectImportErrors(src, basePackage)\n\tif err != nil {\n\t\tlog.Fatal(err)\n\t}\n\n\tfor _, message := range messages {\n\t\tfmt.Println(message)\n\t}\n\n\tif len(messages) > 0 {\n\t\tos.Exit(1)\n\t}\n}\n\n// collectImportErrors runs in addition to `gofmt` to check that imports are properly organized in groups:\n// + there are at most 2 blank lines between imports,\n// + the `github.com/buildpacks/pack` imports must come in the last import group.\nfunc collectImportErrors(root, basePackage string) ([]string, error) {\n\tvar list []string\n\n\terr := filepath.Walk(root, func(path string, info os.FileInfo, err error) error {\n\t\tif err != nil {\n\t\t\treturn err\n\t\t}\n\n\t\tif info.IsDir() {\n\t\t\tif isIgnoredDir(info.Name()) {\n\t\t\t\treturn filepath.SkipDir\n\t\t\t}\n\t\t\treturn nil\n\t\t}\n\n\t\tif strings.HasSuffix(info.Name(), \".go\") {\n\t\t\tmessages, err := checkImports(path, basePackage)\n\t\t\tif err != nil {\n\t\t\t\treturn err\n\t\t\t}\n\n\t\t\tlist = append(list, messages...)\n\t\t}\n\n\t\treturn nil\n\t})\n\n\treturn list, err\n}\n\n// isIgnoredDir reports whether the directory should be skipped: the vendor\n// directory and hidden directories (other than \".\" itself).\nfunc isIgnoredDir(name string) bool {\n\treturn name == \"vendor\" ||\n\t\t(strings.HasPrefix(name, \".\") && name != \".\")\n}\n\nfunc checkImports(path, basePackage string) ([]string, error) {\n\tfile, err := os.Open(filepath.Clean(path))\n\tif err != nil {\n\t\treturn nil, err\n\t}\n\tdefer file.Close()\n\n\tvar (\n\t\tinImport   bool\n\t\tblankLines int\n\t\tlast       string\n\t)\n\n\tscanner := bufio.NewScanner(file)\n\tfor scanner.Scan() {\n\t\tline := scanner.Text()\n\n\t\tif strings.HasPrefix(line, \"import (\") {\n\t\t\tinImport = true\n\t\t} else if inImport {\n\t\t\tif line == \"\" {\n\t\t\t\tblankLines++\n\t\t\t} else if line == \")\" {\n\t\t\t\tbreak\n\t\t\t} else {\n\t\t\t\tlast = 
strings.TrimSpace(line)\n\t\t\t}\n\t\t}\n\t}\n\n\tif err := scanner.Err(); err != nil {\n\t\treturn nil, err\n\t}\n\n\tvar messages []string\n\n\tif blankLines == 2 {\n\t\tif !strings.Contains(last, basePackage) {\n\t\t\tmessages = append(messages, fmt.Sprintf(\"%q must have pack imports last\", path))\n\t\t}\n\t} else if blankLines > 2 {\n\t\tmessages = append(messages, fmt.Sprintf(\"%q contains more than 3 groups of imports\", path))\n\t}\n\n\treturn messages, nil\n}\n"
  },
  {
    "path": "tools/test-fork.sh",
    "content": "#!/usr/bin/env bash\n\nreadonly wfdir=\".github/workflows\"\n\n# $1 - registry repo name\n\necho \"Parse registry: $1\"\nfirstPart=$(echo \"$1\" | cut -d/ -f1)\nsecondPart=$(echo \"$1\" | cut -d/ -f2)\nthirdPart=$(echo \"$1\" | cut -d/ -f3)\n\nregistry=\"\"\nusername=\"\"\nreponame=\"\"\nif [[ -z $thirdPart ]]; then # assume Docker Hub\n  registry=\"index.docker.io\"\n  username=$firstPart\n  reponame=$secondPart\nelse\n  registry=$firstPart\n  username=$secondPart\n  reponame=$thirdPart\nfi\n\necho \"Using registry $registry and username $username\"\nif [[ $reponame != \"pack\" ]]; then\n  echo \"Repo name must be 'pack'\"\n  exit 1\nfi\n\necho \"Disabling workflows that should not run on the forked repository\"\ndisable=(\n  delivery-archlinux-git.yml\n  delivery-archlinux.yml\n  delivery-chocolatey.yml\n  delivery-homebrew.yml\n  delivery-release-dispatch.yml\n  delivery-ubuntu.yml\n  privileged-pr-process.yml\n)\nfor d in \"${disable[@]}\"; do\n  if [ -e \"$wfdir/$d\" ]; then\n    mv \"$wfdir/$d\" \"$wfdir/$d.disabled\"\n  fi\ndone\n\necho \"Removing upstream maintainers from the benchmark alert CC\"\nsed -i '' \"/alert-comment-cc-users:/d\" $wfdir/benchmark.yml\n\necho \"Removing the architectures that require self-hosted runner from the build strategies.\"\nsed -i '' \"/config: \\[.*\\]/ s/windows-lcow, //g\" $wfdir/build.yml\nsed -i '' \"/- config: windows-lcow/,+4d\" $wfdir/build.yml\n\necho \"Replacing the registry account with owned one (assumes DOCKER_PASSWORD and DOCKER_USERNAME have been added to GitHub secrets, if not using ghcr.io)\"\nsed -i '' \"s/buildpacksio\\/pack/$registry\\/$username\\/$reponame/g\" $wfdir/check-latest-release.yml\nsed -i '' \"/REGISTRY_NAME: 'index.docker.io'/ s/index.docker.io/$registry/g\" $wfdir/delivery-docker.yml\nsed -i '' \"/USER_NAME: 'buildpacksio'/ s/buildpacksio/$username/g\" $wfdir/delivery-docker.yml\n\nif [[ $registry != \"index.docker.io\" ]]; then\n  echo \"Updating login action to specify the 
registry\"\n  sed -i '' \"s/username: \\${{ secrets.DOCKER_USERNAME }}/registry: $registry\\n          username: $username/g\" $wfdir/delivery-docker.yml\nfi\n\nif [[ $registry == *\"ghcr.io\"* ]]; then\n  echo \"Updating login action to use GitHub token for ghcr.io\"\n  sed -i '' \"s/secrets.DOCKER_PASSWORD/secrets.GITHUB_TOKEN/g\" $wfdir/delivery-docker.yml\n\n  echo \"Adding workflow permissions to push images to ghcr.io\"\n  LF=$'\\n'\n  sed -i '' \"/runs-on: ubuntu-latest/ a\\\\\n    permissions:\\\\\n      contents: read\\\\\n      packages: write\\\\\n      attestations: write\\\\\n      id-token: write${LF}\" $wfdir/delivery-docker.yml\n  LF=\"\"\nfi\n"
  },
  {
    "path": "tools/tools.go",
    "content": "//go:build tools\n\npackage tools\n\nimport (\n\t_ \"github.com/golang/mock/mockgen\"\n\t_ \"github.com/golangci/golangci-lint/v2/cmd/golangci-lint\"\n\t_ \"golang.org/x/tools/cmd/goimports\"\n)\n"
  }
]