[
  {
    "path": ".github/dependabot.yml",
    "content": "# To get started with Dependabot version updates, you'll need to specify which\n# package ecosystems to update and where the package manifests are located.\n# Please see the documentation for all configuration options:\n# https://docs.github.com/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file\n\nversion: 2\nupdates:\n  - package-ecosystem: \"npm\" # See documentation for possible values\n    directory: \"/\" # Location of package manifests\n    schedule:\n      interval: \"daily\"\n"
  },
  {
    "path": ".github/workflows/build-docker.yml",
    "content": "name: Docker Build\n\non:\n  push:\n    branches:\n      - \"**\"\n    tags:\n      - \"**\"\njobs:\n  docker:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v3\n      - uses: actions/setup-node@v3\n        with:\n          node-version: 18.x\n      - run: npm install\n      - run: npx tsc\n      - name: Docker meta\n        id: meta\n        uses: docker/metadata-action@v3\n        with:\n          images: humanmade/tachyon\n          tags: |\n            type=edge,branch=master\n            type=ref,event=tag\n      - uses: docker/setup-qemu-action@v1\n      - name: Set up Docker Buildx\n        uses: docker/setup-buildx-action@v1\n      - name: Login to DockerHub\n        uses: docker/login-action@v1\n        with:\n          username: ${{ secrets.DOCKERHUB_USERNAME }}\n          password: ${{ secrets.DOCKERHUB_TOKEN }}\n      - name: Build and push latest\n        if: github.ref == 'refs/heads/master' || startsWith(github.ref, 'refs/tags/')\n        uses: docker/build-push-action@v2\n        with:\n          file: Dockerfile\n          context: .\n          platforms: linux/amd64,linux/arm64\n          push: true\n          tags: ${{ steps.meta.outputs.tags }}\n"
  },
  {
    "path": ".github/workflows/release.yml",
    "content": "name: Release\n\non:\n  push:\n    tags:\n    - \"**\"\n    branches:\n    - '**'\n  pull_request:\n    branches:\n    - '**'\n\njobs:\n  build:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v4\n      - uses: actions/setup-node@v4\n        with:\n          node-version: 18.x\n      - run: npm install\n      - run: npx tsc\n      - uses: aws-actions/setup-sam@v2\n        with:\n          use-installer: true\n      - run: npm run build\n      - run: npm run build-zip\n      - uses: softprops/action-gh-release@v1\n        if: startsWith(github.ref, 'refs/tags/')\n        with:\n          files: lambda.zip\n      - uses: actions/upload-artifact@v4\n        if: github.event_name == 'pull_request'\n        with:\n          path: lambda.zip\n          name: lambda\n"
  },
  {
    "path": ".github/workflows/test.yml",
    "content": "name: Test\n\non:\n  push:\n    branches:\n      - '**'\n\njobs:\n  test:\n    runs-on: ubuntu-latest\n    steps:\n      - uses: actions/checkout@v3\n      - uses: actions/setup-node@v3\n        with:\n          node-version: 18.x\n      - run: npm install\n      - run: npx jest\n        env:\n          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}\n          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}\n          S3_BUCKET: ${{ vars.S3_BUCKET }}\n          S3_REGION: ${{ vars.S3_REGION }}\n\n"
  },
  {
    "path": ".gitignore",
    "content": "node_modules/\n.idea\nlambda.zip\n.aws-sam/\n.DS_Store\ndist/\n/tests/test-filesize/output/\n"
  },
  {
    "path": "CONTRIBUTING.md",
    "content": "# Contributing\n\n## Building for Lambda\n\nYou'll need to [install the AWS SAM CLI](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/install-sam-cli.html) as AWS SAM.\n\n```\nnpm install\nnpm run build // Builds the function for use in SAM\n```\n\n### Building locally\n\nTachyon is written in TypeScript. All TypeScript files are in `.src` and running `npx tsc` will build everything to `./dist`. You can run `npx tsc -w` to watch for file changes to update `./dist`. This is needed if you are running the server locally (see below) or running the Lambda environment via the SAM cli (see below.)\n\n### Running a server locally\n\nInvoking the function via Lambda locally is somewhat slow (see below), in many cases you may want to start a local Node server which maps the Node request into a Lambda-like request. `./src/server.ts` exists for that reason. The local server will still connect to the S3 bucket (set with the `S3_BUCKET` env var) for files.\n\n\n### Running Lambda Locally\n\nBefore testing any of the Lambda function calls via the `sam` CLI, you must run `sam build -u` to build the NPM deps via the Lambda docker container. This will also build the `./dist/` into the SAM environment, so any subsequent changes to files in `./src` but be first built (which updates `./dist`), and then `sam build -u` must be run.\n\nTo run Tachyon in a Lambda local environment via docker, use the `sam local invoke -e events/animated-gif.json` CLI command. This will call the function via the `src/lambda-handler.handler` function.\n\n### Writing tests\n\nTests should be written using Jest. Files matching `./tests/**/test-*.ts` will automatically be included in the Jest testsuite. For tests, you don't need to run `npx tsc` to compile TypeScript files to `./dist`, as this is integrated automatically via the `ts-jest` package.\n\nRun `npm test` to run the tests.\n"
  },
  {
    "path": "Dockerfile",
    "content": "FROM public.ecr.aws/lambda/nodejs:18\nCOPY package.json /var/task/\nCOPY package-lock.json /var/task/\nRUN npm install --omit=dev\nCOPY dist /var/task/dist\n\n# Set environment variables, backwards compat with Tachyon 2x.\nARG S3_REGION\nARG S3_BUCKET\nARG S3_ENDPOINT\nARG PORT\n\n# Start the reactor\nEXPOSE ${PORT:-8080}\nENTRYPOINT /var/lang/bin/node dist/server.js ${PORT:-8080}\n"
  },
  {
    "path": "LICENSE",
    "content": "ISC License\n\nCopyright (c) 2023 Human Made\n\nPermission to use, copy, modify, and/or distribute this software for any\npurpose with or without fee is hereby granted, provided that the above\ncopyright notice and this permission notice appear in all copies.\n\nTHE SOFTWARE IS PROVIDED “AS IS” AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH\nREGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY\nAND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,\nINDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM\nLOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR\nOTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR\nPERFORMANCE OF THIS SOFTWARE.\n"
  },
  {
    "path": "README.md",
    "content": "<table width=\"100%\">\n\t<tr>\n\t\t<td align=\"left\" colspan=\"2\">\n\t\t\t<strong>Tachyon</strong><br />\n\t\t\tFaster than light image resizing service that runs on AWS. Super simple to set up, highly available and very performant.\n\t\t</td>\n\t</tr>\n\t<tr>\n\t\t<td>\n\t\t\tA <strong><a href=\"https://hmn.md/\">Human Made</a></strong> project. Maintained by @joehoyle.\n\t\t</td>\n\t\t<td align=\"center\">\n\t\t\t<img src=\"https://hmn.md/content/themes/hmnmd/assets/images/hm-logo.svg\" width=\"100\" />\n\t\t</td>\n\t</tr>\n</table>\n\nTachyon is a faster than light image resizing service that runs on AWS. Super simple to set up, highly available and very performant.\n\n\n## Setup\n\nTachyon comes in two parts: the server to serve images and the [plugin to use it](./docs/plugin.md). To use Tachyon, you need to run at least one server, as well as the plugin on all sites you want to use it.\n\n## Installation on AWS Lambda\n\nWe require using Tachyon on [AWS Lambda](https://aws.amazon.com/lambda/details/) to offload image processing task in a serverless configuration. This ensures you don't need lots of hardware to handle thousands of image resize requests, and can scale essentially infinitely. One Tachyon stack is required per S3 bucket, so we recommend using a common region bucket for all sites, which then only requires a single Tachyon stack per region.\n\nTachyon requires the following Lambda Function spec:\n\n- Runtime: Node JS 18\n- Function URL activated\n- Env vars:\n  - S3_BUCKET=my-bucket\n  - S3_REGION=my-bucket-region\n  - S3_ENDPOINT=http://my-custom-endpoint (optional)\n  - S3_FORCE_PATH_STYLE=1 (optional)\n\nTake the `lambda.zip` from the latest release and upload it to your function.\n\nFor routing web traffic to the Lambda function, we recommend using [Lambda Function URLs](https://docs.aws.amazon.com/lambda/latest/dg/urls-configuration.html). 
These should be configured as:\n\n- Auth type: None\n- Invoke mode: `RESPONSE_STREAM`\n\nAlternatively, you can use API Gateway; this should be set to route all GET requests (i.e. `/{proxy+}`) to invoke your Tachyon Lambda function.\n\nWe also recommend running an aggressive caching proxy/CDN in front of Tachyon, such as CloudFront. (An expiration time of 1 year is typical for a production configuration.)\n\n## Documentation\n\n* [Plugin Setup](./docs/plugin.md)\n* [Using Tachyon](./docs/using.md)\n* [Hints and Tips](./docs/tips.md)\n\n\n## Credits\n\nCreated by Human Made for high volume and large-scale sites. We run Tachyon on sites with millions of monthly page views, and thousands of sites.\n\nWritten and maintained by [Joe Hoyle](https://github.com/joehoyle).\n\nTachyon is inspired by Photon by Automattic. As Tachyon is not an all-purpose image resizer, rather it uses a media library in Amazon S3, it has a different use case to [Photon](https://jetpack.com/support/photon/).\n\nTachyon uses the [Sharp](https://github.com/lovell/sharp) (Used under the license Apache License 2.0) Node.js library for the resizing operations, which in turn uses the great libvips library.\n\nInterested in joining in on the fun? [Join us, and become human!](https://hmn.md/is/hiring/)\n\n\n## Looking for a different Tachyon?\n\nTachyon by Human Made provides image resizing services for the web, and is specifically designed for WordPress. \"Tachyon\" is named after the [hypothetical faster-than-light particle](https://en.wikipedia.org/wiki/Tachyon).\n\nOther software named Tachyon include:\n\n* [Tachyon by VYV](https://tachyon.video/) - Video playback and media server.\n* [Tachyon by Cinnafilm](https://cinnafilm.com/product/tachyon/) - Video processing for the cinema industry.\n* [TACHYONS](https://tachyons.io/) - CSS framework.\n"
  },
  {
    "path": "SECURITY.md",
    "content": "# Security Policy\n\n## Reporting a Vulnerability\n\nEmail security [at] humanmade.com\n"
  },
  {
    "path": "docs/plugin.md",
    "content": "# Plugin Setup\n\nThe Tachyon plugin is responsible for replacing WordPress' default thumbnail handling with dynamic Tachyon URLs.\n\n[Download from GitHub →](https://github.com/humanmade/tachyon-plugin)\n\n\n## Installation\n\nInstall the Tachyon plugin as a regular plugin in your WordPress install (mu-plugins also supported).\n\nYou also need to point the plugin to your [Tachyon server](server.md). Add the following to your `wp-config-local.php`:\n\n```php\ndefine( 'TACHYON_URL', 'http://localhost:8080/<bucket name>/uploads' );\n```\n\n\n## Credits\n\nThe Tachyon plugin is based on the Photon plugin code by Automattic, part of [Jetpack](https://github.com/Automattic/jetpack/blob/master/class.photon.php). Used under the GPL.\n"
  },
  {
    "path": "docs/tips.md",
    "content": "# Hints and Tips\n\n## Regions\n\nWhen running Tachyon in production, we recommend running one Tachyon instance per region. This instance should connect to the S3 bucket for the region, which can then be shared across all stacks in that region.\n\nWhile S3 buckets can be accessed from any region, running Lambda from the same region as the bucket is recommended. This reduces latency and improves image serving speed.\n"
  },
  {
    "path": "docs/using.md",
    "content": "# Using\n\nTachyon provides a simple HTTP interface in the form of:\n\n`https://{tachyon-domain}/my/image/path/on/s3.png?w=100&h=80`\n\nIt's really that simple!\n\n## Args Reference\n\n| URL Arg | Type | Description |\n|---|----|---|\n|`w`|Number|Max width of the image.|\n|`h`|Number|Max height of the image.|\n|`quality`|Number, 0-100|Image quality.|\n|`resize`|String, \"w,h\"|A comma separated string of the target width and height in pixels. Crops the image.|\n|`crop_strategy`|String, \"smart\", \"entropy\", \"attention\"|There are 3 automatic cropping strategies for use with `resize`: <ul><li>`attention`: good results, ~70% slower</li><li>`entropy`: mediocre results, ~30% slower</li><li>`smart`: best results, ~50% slower</li>|\n|`gravity`|String|Alternative to `crop_strategy`. Crops are made from the center of the image by default, passing one of \"north\", \"northeast\", \"east\", \"southeast\", \"south\", \"southwest\", \"west\", \"northwest\" or \"center\" will crop from that edge.|\n|`fit`|String, \"w,h\"|A comma separated string of the target maximum width and height. Does not crop the image.|\n|`crop`|Boolean\\|String, \"x,y,w,h\"|Crop an image by percentages x-offset, y-offset, width and height (x,y,w,h). Percentages are used so that you don’t need to recalculate the cropping when transforming the image in other ways such as resizing it. You can crop by pixel values too by appending `px` to the values. `crop=160px,160px,788px,788px` takes a 788 by 788 pixel square starting at 160 by 160.|\n|`zoom`|Number|Zooms the image by the specified amount for high DPI displays. `zoom=2` produces an image twice the size specified in `w`, `h`, `fit` or `resize`. 
The quality is automatically reduced to keep file sizes roughly equivalent to the non-zoomed image unless the `quality` argument is passed.|\n|`webp`|Boolean, 1|Force WebP format.|\n|`lb`|String, \"w,h\"|Add letterboxing effect to images, by scaling them to width, height while maintaining the aspect ratio and filling the rest with black or `background`.|\n|`background`|String|Add background color via name (red) or hex value (%23ff0000). Don't forget to escape # as `%23`.|\n"
  },
  {
    "path": "global.d.ts",
    "content": "declare type ResponseStream = {\n\tsetContentType( type: string ): void;\n\twrite( stream: string | Buffer ): void;\n\tend(): void;\n\tmetadata?: any;\n};\n\ndeclare type StreamifyHandler = ( event: APIGatewayProxyEventV2, response: ResponseStream ) => Promise<any>;\n\ndeclare var awslambda: {\n\tstreamifyResponse: (\n\t\thandler: StreamifyHandler\n\t) => ( event: APIGatewayProxyEventV2, context: ResponseStream ) => void,\n\tHttpResponseStream: {\n\t\tfrom( response: ResponseStream, metadata: {\n\t\t\theaders?: Record<string, string>,\n\t\t\tstatusCode: number,\n\t\t\tcookies?: string[],\n\t\t} ): ResponseStream\n\t}\n};\n\n"
  },
  {
    "path": "jest.config.js",
    "content": "/** @type {import('ts-jest').JestConfigWithTsJest} */\nexport default {\n\tpreset: 'ts-jest',\n\ttestEnvironment: 'node',\n\ttestMatch: ['<rootDir>/tests/**/test-*.ts'],\n\tsetupFiles: ['<rootDir>/tests/setup.ts'],\n\textensionsToTreatAsEsm: ['.ts'],\n\ttransform: {\n\t\t'^.+\\\\.tsx?$': [\n\t\t\t'ts-jest',\n\t\t\t{\n\t\t\t\tuseESM: true,\n\t\t\t\ttsconfig: './tsconfig.test.json',\n\t\t\t},\n\t\t],\n\t},\n\tmoduleNameMapper: {\n\t\t\"(.+)\\\\.js\": \"$1\"\n\t},\n};\n"
  },
  {
    "path": "package.json",
    "content": "{\n\t\"name\": \"tachyon\",\n\t\"version\": \"3.0.0\",\n\t\"type\": \"module\",\n\t\"repository\": {\n\t\t\"type\": \"git\",\n\t\t\"url\": \"https://github.com/humanmade/tachyon.git\"\n\t},\n\t\"description\": \"Human Made Tachyon in node\",\n\t\"main\": \"dist/lambda-handler.js\",\n\t\"config\": {\n\t\t\"bucket\": \"\",\n\t\t\"path\": \"\",\n\t\t\"region\": \"us-east-1\",\n\t\t\"function-name\": \"\"\n\t},\n\t\"scripts\": {\n\t\t\"build\": \"sam build -u\",\n\t\t\"start\": \"tsc -w & nodemon --watch dist/lambda-handler.js --exec 'node dist/lambda-handler.js'\",\n\t\t\"test\": \"AWS_PROFILE=hmn-test S3_BUCKET=testtachyonbucket S3_REGION=us-east-1 jest\",\n\t\t\"build-zip\": \"rm lambda.zip ; cd .aws-sam/build/Tachyon && zip -r --exclude='node_modules/animated-gif-detector/test/*' ../../../lambda.zip ./node_modules/ package.json ./dist/\",\n\t\t\"upload-zip\": \"aws s3 --region=$npm_config_region cp ./lambda.zip s3://$npm_config_bucket/$npm_config_path\",\n\t\t\"update-function-code\": \"aws lambda update-function-code --region $npm_config_region --function-name $npm_config_function_name --zip-file fileb://`pwd`/lambda.zip\",\n\t\t\"lint\": \"npx eslint ./*.ts ./**/*.ts\"\n\t},\n\t\"author\": \"Joe Hoyle\",\n\t\"license\": \"ISC\",\n\t\"dependencies\": {\n\t\t\"@aws-sdk/client-s3\": \"^3.712.0\",\n\t\t\"eslint-config-react-app\": \"^7.0.1\",\n\t\t\"sharp\": \"^0.34.5\",\n\t\t\"smartcrop-sharp\": \"^2.0.6\"\n\t},\n\t\"devDependencies\": {\n\t\t\"@aws-sdk/s3-request-presigner\": \"^3.709.0\",\n\t\t\"@humanmade/eslint-config\": \"^1.1.3\",\n\t\t\"@types/aws-lambda\": \"^8.10.161\",\n\t\t\"@types/cli-table\": \"^0.3.4\",\n\t\t\"@types/jest\": \"^30.0.0\",\n\t\t\"@types/node\": \"^25.6.0\",\n\t\t\"@typescript-eslint/eslint-plugin\": \"^6.21.0\",\n\t\t\"@typescript-eslint/parser\": \"^6.3.0\",\n\t\t\"aws-lambda\": \"^1.0.7\",\n\t\t\"cli-table\": \"^0.3.1\",\n\t\t\"eslint\": \"^8.46.0\",\n\t\t\"eslint-config\": \"^0.3.0\",\n\t\t\"eslint-plugin-flowtype\": 
\"^8.0.3\",\n\t\t\"eslint-plugin-jsdoc\": \"^54.5.0\",\n\t\t\"filesize\": \"^11.0.16\",\n\t\t\"jest\": \"^30.3.0\",\n\t\t\"lambda-stream\": \"^0.6.0\",\n\t\t\"nodemon\": \"^3.1.14\",\n\t\t\"ts-jest\": \"^29.4.9\",\n\t\t\"typescript\": \"^6.0.3\"\n\t},\n\t\"eslintConfig\": {\n\t\t\"extends\": \"@humanmade/eslint-config\",\n\t\t\"parser\": \"@typescript-eslint/parser\",\n\t\t\"parserOptions\": {\n\t\t\t\"project\": [\n\t\t\t\t\"./tsconfig.json\"\n\t\t\t]\n\t\t},\n\t\t\"overrides\": [\n\t\t\t{\n\t\t\t\t\"files\": [\n\t\t\t\t\t\"*.ts\"\n\t\t\t\t],\n\t\t\t\t\"rules\": {\n\t\t\t\t\t\"jsdoc/require-param-description\": \"off\",\n\t\t\t\t\t\"jsdoc/require-returns\": \"off\",\n\t\t\t\t\t\"jsdoc/require-param-type\": \"off\",\n\t\t\t\t\t\"jsdoc/require-param\": \"off\",\n\t\t\t\t\t\"no-undef\": \"off\",\n\t\t\t\t\t\"import/named\": \"off\"\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t\"files\": [\n\t\t\t\t\t\"*.d.ts\"\n\t\t\t\t],\n\t\t\t\t\"rules\": {\n\t\t\t\t\t\"no-unused-vars\": \"off\",\n\t\t\t\t\t\"no-var\": \"off\"\n\t\t\t\t}\n\t\t\t}\n\t\t]\n\t},\n\t\"overrides\": {\n\t\t\"smartcrop-sharp\": {\n\t\t\t\"sharp\": \"$sharp\"\n\t\t},\n\t\t\"@humanmade/eslint-config\": {\n\t\t\t\"eslint\": \"$eslint\",\n\t\t\t\"eslint-plugin-flowtype\": \"$eslint-plugin-flowtype\",\n\t\t\t\"eslint-plugin-jsdoc\": \"$eslint-plugin-jsdoc\",\n\t\t\t\"eslint-config-react-app\": \"$eslint-config-react-app\"\n\t\t}\n\t},\n\t\"engines\": {\n\t\t\"node\": \"18\"\n\t}\n}\n"
  },
  {
    "path": "src/lambda-handler.ts",
    "content": "import { Args, getS3File, resizeBuffer, Config } from './lib.js';\n/**\n *\n * @param event\n * @param response\n */\nconst streamify_handler: StreamifyHandler = async ( event, response ) => {\n\tconst region = process.env.S3_REGION!;\n\tconst bucket = process.env.S3_BUCKET!;\n\tconst config: Config = {\n\t\tregion: region,\n\t\tbucket: bucket,\n\t};\n\tif ( process.env.S3_ENDPOINT ) {\n\t\tconfig.endpoint = process.env.S3_ENDPOINT;\n\t}\n\n\tif ( process.env.S3_FORCE_PATH_STYLE ) {\n\t\tconfig.forcePathStyle = true;\n\t}\n\n\tconst key = decodeURIComponent( event.rawPath.substring( 1 ) ).replace( '/tachyon/', '/' );\n\tconst args = ( event.queryStringParameters || {} ) as unknown as Args & {\n\t\t'X-Amz-Expires'?: string;\n\t\t'presign'?: string,\n\t\tkey: string;\n\t\treferer?: string;\n\t};\n\targs.key = key;\n\tif ( typeof args.webp === 'undefined' ) {\n\t\targs.webp = !! ( event.headers && Object.keys( event.headers ).find( key => key.toLowerCase() === 'x-webp' ) );\n\t}\n\tconst refererHeaderKey = Object.keys(event.headers || {}).find(h => h.toLowerCase() === 'referer');\n\tif (refererHeaderKey) {\n\t\t\targs.referer = event.headers[refererHeaderKey];\n\t}\n\t// If there is a presign param, we need to decode it and add it to the args. 
This is to provide a secondary way to pass pre-sign params,\n\t// as using them in a Lambda function URL invocation will trigger a Lambda error.\n\tif ( args.presign ) {\n\t\tconst presignArgs = new URLSearchParams( args.presign );\n\t\tfor ( const [ key, value ] of presignArgs.entries() ) {\n\t\t\targs[ key as keyof Args ] = value;\n\t\t}\n\t\tdelete args.presign;\n\t}\n\n\tlet s3_response;\n\n\ttry {\n\t\ts3_response = await getS3File( config, key, args );\n\t} catch ( e: any ) {\n\t\t// An AccessDenied error means the file is either protected, or doesn't exist.\n\t\tif ( e.Code === 'AccessDenied' ) {\n\t\t\tconst metadata = {\n\t\t\t\tstatusCode: 404,\n\t\t\t\theaders: {\n\t\t\t\t\t'Content-Type': 'text/html',\n\t\t\t\t},\n\t\t\t};\n\t\t\tresponse = awslambda.HttpResponseStream.from( response, metadata );\n\t\t\tresponse.write( 'File not found.' );\n\t\t\tresponse.end();\n\t\t\treturn;\n\t\t}\n\t\tthrow e;\n\t}\n\n\tif ( ! s3_response.Body ) {\n\t\tthrow new Error( 'No body in file.' );\n\t}\n\n\tlet buffer = Buffer.from( await s3_response.Body.transformToByteArray() );\n\n\tlet { info, data } = await resizeBuffer( buffer, args );\n\t// If this is a signed URL, we need to calculate the max-age of the image.\n\tlet maxAge = 31536000;\n\tif ( args['X-Amz-Expires'] ) {\n\t\t// Date format of X-Amz-Date is YYYYMMDDTHHMMSSZ, which is not parsable by Date.\n\t\tconst dateString = args['X-Amz-Date']!.replace(\n\t\t\t/(\\d{4})(\\d{2})(\\d{2})T(\\d{2})(\\d{2})(\\d{2})Z/,\n\t\t\t'$1-$2-$3T$4:$5:$6Z'\n\t\t);\n\t\tconst date = new Date( dateString );\n\n\t\t// Calculate when the signed URL will expire, as we'll set the max-age\n\t\t// cache control to this value.\n\t\tconst expires = date.getTime() / 1000 + Number( args['X-Amz-Expires'] );\n\n\t\t// Max age is the date the URL expires minus the current time.\n\t\tmaxAge = Math.round( expires - new Date().getTime() / 1000 ); // eslint-disable-line no-unused-vars\n\t}\n\n\t// Somewhat undocumented API on how to pass headers 
to a stream response.\n\tresponse = awslambda.HttpResponseStream.from( response, {\n\t\tstatusCode: 200,\n\t\theaders: {\n\t\t\t'Cache-Control': `max-age=${ maxAge }`,\n\t\t\t'Last-Modified': ( new Date() ).toUTCString(),\n\t\t\t'Content-Type': 'image/' + info.format,\n\t\t},\n\t} );\n\n\tresponse.write( data );\n\tresponse.end();\n};\n\nif ( typeof awslambda === 'undefined' ) {\n\tglobal.awslambda = {\n\t\t/**\n\t\t *\n\t\t * @param handler\n\t\t */\n\t\tstreamifyResponse( handler: StreamifyHandler ): StreamifyHandler {\n\t\t\treturn handler;\n\t\t},\n\t\tHttpResponseStream: {\n\t\t\t/**\n\t\t\t * @param response The response stream object\n\t\t\t * @param metadata The metadata object\n\t\t\t * @param metadata.headers\n\t\t\t */\n\t\t\tfrom( response: ResponseStream, metadata: {\n\t\t\t\theaders?: Record<string, string>,\n\t\t\t} ): ResponseStream {\n\t\t\t\tresponse.metadata = metadata;\n\t\t\t\treturn response;\n\t\t\t},\n\t\t},\n\t};\n}\n\nexport const handler = awslambda.streamifyResponse( streamify_handler );\n"
  },
  {
    "path": "src/lib.ts",
    "content": "import { S3Client, S3ClientConfig, GetObjectCommand, GetObjectCommandOutput } from '@aws-sdk/client-s3';\nimport sharp from 'sharp';\nimport smartcrop from 'smartcrop-sharp';\n\nexport interface Args {\n\t// Optional args.\n\tbackground?: string;\n\tcrop?: string | string[];\n\tcrop_strategy?: string;\n\tfit?: string;\n\tgravity?: string;\n\th?: string;\n\tlb?: string;\n\tresize?: string | number[];\n\tquality?: string | number;\n\tw?: string;\n\twebp?: string | boolean;\n\tzoom?: string;\n\t'X-Amz-Algorithm'?: string;\n\t'X-Amz-Content-Sha256'?: string;\n\t'X-Amz-Credential'?: string;\n\t'X-Amz-SignedHeaders'?: string;\n\t'X-Amz-Expires'?: string;\n\t'X-Amz-Signature'?: string;\n\t'X-Amz-Date'?: string;\n\t'X-Amz-Security-Token'?: string;\n\treferer?: string;\n}\n\nexport type Config = S3ClientConfig & { bucket: string };\n\n/**\n * Get the dimensions from a string or array of strings.\n */\nfunction getDimArray( dims: string | number[], zoom: number = 1 ): ( number | null )[] {\n\tlet dimArr = typeof dims === 'string' ? dims.split( ',' ) : dims;\n\treturn dimArr.map( v => Math.round( Number( v ) * zoom ) || null );\n}\n\n/**\n * Clamp a value between a min and max.\n */\nfunction clamp( val: number | string, min: number, max: number ): number {\n\treturn Math.min( Math.max( Number( val ), min ), max );\n}\n\n/**\n * Get a file from S3/\n */\nexport async function getS3File( config: Config, key: string, args: Args ): Promise<GetObjectCommandOutput> {\n\tconst s3 = new S3Client( {\n\t\t...config,\n\t\tsigner: {\n\t\t\t/**\n\t\t\t *\n\t\t\t * @param request\n\t\t\t */\n\t\t\tsign: async request => {\n\t\t\t\tif ( ! 
args['X-Amz-Algorithm'] ) {\n\t\t\t\t\t// Add referer to the request headers on non-presigned URLs\n\t\t\t\t\t// Presigned URLs works without the referer header\n\t\t\t\t\tif (args.referer) {\n\t\t\t\t\t\trequest.headers = request.headers || {};\n\t\t\t\t\t\trequest.headers['referer'] = args.referer;\n\t\t\t\t\t}\n\t\t\t\t\treturn request;\n\t\t\t\t}\n\t\t\t\tconst presignedParamNames = [\n\t\t\t\t\t'X-Amz-Algorithm',\n\t\t\t\t\t'X-Amz-Content-Sha256',\n\t\t\t\t\t'X-Amz-Credential',\n\t\t\t\t\t'X-Amz-SignedHeaders',\n\t\t\t\t\t'X-Amz-Expires',\n\t\t\t\t\t'X-Amz-Signature',\n\t\t\t\t\t'X-Amz-Date',\n\t\t\t\t\t'X-Amz-Security-Token',\n\t\t\t\t] as const;\n\t\t\t\tconst presignedParams: { [K in ( typeof presignedParamNames )[number]]?: string } = {}; // eslint-disable-line no-unused-vars\n\t\t\t\tconst signedHeaders = ( args['X-Amz-SignedHeaders']?.split( ';' ) || [] ).map( header => header.toLowerCase().trim() );\n\n\t\t\t\tfor ( const paramName of presignedParamNames ) {\n\t\t\t\t\tif ( args[paramName] ) {\n\t\t\t\t\t\tpresignedParams[paramName] = args[paramName];\n\t\t\t\t\t}\n\t\t\t\t}\n\n\t\t\t\tconst headers: typeof request.headers = {};\n\t\t\t\tfor ( const header in request.headers ) {\n\t\t\t\t\tif ( signedHeaders.includes( header.toLowerCase() ) ) {\n\t\t\t\t\t\theaders[header] = request.headers[header];\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t\trequest.query = presignedParams;\n\n\t\t\t\trequest.headers = headers;\n\t\t\t\treturn request;\n\t\t\t},\n\t\t},\n\t} );\n\n\tconst command = new GetObjectCommand( {\n\t\tBucket: config.bucket,\n\t\tKey: key,\n\t} );\n\n\treturn s3.send( command );\n}\n/**\n * Apply a logarithmic compression to a value based on a zoom level.\n * return a default compression value based on a logarithmic scale\n * defaultValue = 100, zoom = 2; = 65\n * defaultValue = 80, zoom = 2; = 50\n * defaultValue = 100, zoom = 1.5; = 86\n * defaultValue = 80, zoom = 1.5; = 68\n */\nfunction applyZoomCompression( defaultValue: number, zoom: number ): 
number {\n\tconst value = Math.round( defaultValue - ( Math.log( zoom ) / Math.log( defaultValue / zoom ) ) * ( defaultValue * zoom ) );\n\tconst min = Math.round( defaultValue / zoom );\n\treturn clamp( value, min, defaultValue );\n}\n\ntype ResizeBufferResult = {\n\tdata: Buffer;\n\tinfo: sharp.OutputInfo & {\n\t\terrors: string;\n\t}\n};\n\n/**\n * Resize a buffer of an image.\n */\nexport async function resizeBuffer(\n\tbuffer: Buffer | Uint8Array,\n\targs: Args\n): Promise<ResizeBufferResult> {\n\tconst image = sharp( buffer, {\n\t\tfailOnError: false,\n\t\tanimated: true,\n\t} ).withMetadata();\n\n\t// check we can get valid metadata\n\tconst metadata = await image.metadata();\n\n\t// auto rotate based on orientation EXIF data.\n\timage.rotate();\n\n\t// validate args, remove from the object if not valid\n\tconst errors: string[] = [];\n\n\tif ( args.w ) {\n\t\tif ( ! /^[1-9]\\d*$/.test( args.w ) ) {\n\t\t\tdelete args.w;\n\t\t\terrors.push( 'w arg is not valid' );\n\t\t}\n\t}\n\tif ( args.h ) {\n\t\tif ( ! /^[1-9]\\d*$/.test( args.h ) ) {\n\t\t\tdelete args.h;\n\t\t\terrors.push( 'h arg is not valid' );\n\t\t}\n\t}\n\tif ( args.quality ) {\n\t\tif (\n\t\t\t! /^[0-9]{1,3}$/.test( args.quality as string ) ||\n\t\t\t( args.quality as number ) < 0 ||\n\t\t\t( args.quality as number ) > 100\n\t\t) {\n\t\t\tdelete args.quality;\n\t\t\terrors.push( 'quality arg is not valid' );\n\t\t}\n\t}\n\tif ( args.resize ) {\n\t\tif ( ! /^\\d+(px)?,\\d+(px)?$/.test( args.resize as string ) ) {\n\t\t\tdelete args.resize;\n\t\t\terrors.push( 'resize arg is not valid' );\n\t\t}\n\t}\n\tif ( args.crop_strategy ) {\n\t\tif ( ! /^(smart|entropy|attention)$/.test( args.crop_strategy ) ) {\n\t\t\tdelete args.crop_strategy;\n\t\t\terrors.push( 'crop_strategy arg is not valid' );\n\t\t}\n\t}\n\tif ( args.gravity ) {\n\t\tif ( ! 
/^(north|northeast|east|southeast|south|southwest|west|northwest|center)$/.test( args.gravity ) ) {\n\t\t\tdelete args.gravity;\n\t\t\terrors.push( 'gravity arg is not valid' );\n\t\t}\n\t}\n\tif ( args.fit ) {\n\t\tif ( ! /^\\d+(px)?,\\d+(px)?$/.test( args.fit as string ) ) {\n\t\t\tdelete args.fit;\n\t\t\terrors.push( 'fit arg is not valid' );\n\t\t}\n\t}\n\tif ( args.crop ) {\n\t\tif ( ! /^\\d+(px)?,\\d+(px)?,\\d+(px)?,\\d+(px)?$/.test( args.crop as string ) ) {\n\t\t\tdelete args.crop;\n\t\t\terrors.push( 'crop arg is not valid' );\n\t\t}\n\t}\n\tif ( args.zoom ) {\n\t\tif ( ! /^\\d+(\\.\\d+)?$/.test( args.zoom ) ) {\n\t\t\tdelete args.zoom;\n\t\t\terrors.push( 'zoom arg is not valid' );\n\t\t}\n\t}\n\tif ( args.webp ) {\n\t\tif ( ! /^0|1|true|false$/.test( args.webp as string ) ) {\n\t\t\tdelete args.webp;\n\t\t\terrors.push( 'webp arg is not valid' );\n\t\t}\n\t}\n\tif ( args.lb ) {\n\t\tif ( ! /^\\d+(px)?,\\d+(px)?$/.test( args.lb ) ) {\n\t\t\tdelete args.lb;\n\t\t\terrors.push( 'lb arg is not valid' );\n\t\t}\n\t}\n\tif ( args.background ) {\n\t\tif ( ! /^#[a-f0-9]{3}[a-f0-9]{3}?$/.test( args.background ) ) {\n\t\t\tdelete args.background;\n\t\t\terrors.push( 'background arg is not valid' );\n\t\t}\n\t}\n\n\t// crop (assumes crop data from original)\n\tif ( args.crop ) {\n\t\tconst cropValuesString = typeof args.crop === 'string' ? args.crop.split( ',' ) : args.crop;\n\n\t\t// convert percentages to px values\n\t\tconst cropValues = cropValuesString.map( function ( value, index ) {\n\t\t\tif ( value.indexOf( 'px' ) > -1 ) {\n\t\t\t\treturn Number( value.substring( 0, value.length - 2 ) );\n\t\t\t} else {\n\t\t\t\treturn Number(\n\t\t\t\t\tNumber( ( metadata[index % 2 ? 
'height' : 'width'] as number ) * ( Number( value ) / 100 ) ).toFixed( 0 )\n\t\t\t\t);\n\t\t\t}\n\t\t} );\n\n\t\timage.extract( {\n\t\t\tleft: cropValues[0],\n\t\t\ttop: cropValues[1],\n\t\t\twidth: cropValues[2],\n\t\t\theight: cropValues[3],\n\t\t} );\n\t}\n\n\t// get zoom value\n\tconst zoom = parseFloat( args.zoom || '1' ) || 1;\n\n\t// resize\n\tif ( args.resize ) {\n\t\t// apply smart crop if available\n\t\tif ( args.crop_strategy === 'smart' && ! args.crop ) {\n\t\t\tconst cropResize = getDimArray( args.resize );\n\t\t\tconst rotatedImage = await image.toBuffer();\n\t\t\tconst result = await smartcrop.crop( rotatedImage, {\n\t\t\t\twidth: cropResize[0] as number,\n\t\t\t\theight: cropResize[1] as number,\n\t\t\t} );\n\n\t\t\tif ( result && result.topCrop ) {\n\t\t\t\timage.extract( {\n\t\t\t\t\tleft: result.topCrop.x,\n\t\t\t\t\ttop: result.topCrop.y,\n\t\t\t\t\twidth: result.topCrop.width,\n\t\t\t\t\theight: result.topCrop.height,\n\t\t\t\t} );\n\t\t\t}\n\t\t}\n\n\t\t// apply the resize\n\t\targs.resize = getDimArray( args.resize, zoom ) as number[];\n\t\timage.resize( {\n\t\t\twidth: args.resize[0],\n\t\t\theight: args.resize[1],\n\t\t\twithoutEnlargement: true,\n\t\t\tposition: ( args.crop_strategy !== 'smart' && args.crop_strategy ) || args.gravity || 'centre',\n\t\t} );\n\t} else if ( args.fit ) {\n\t\tconst fit = getDimArray( args.fit, zoom ) as number[];\n\t\timage.resize( {\n\t\t\twidth: fit[0],\n\t\t\theight: fit[1],\n\t\t\tfit: 'inside',\n\t\t\twithoutEnlargement: true,\n\t\t} );\n\t} else if ( args.lb ) {\n\t\tconst lb = getDimArray( args.lb, zoom ) as number[];\n\t\timage.resize( {\n\t\t\twidth: lb[0],\n\t\t\theight: lb[1],\n\t\t\tfit: 'contain',\n\t\t\t// default to a black background to replicate Photon API behavior\n\t\t\t// when no background colour specified\n\t\t\tbackground: args.background || 'black',\n\t\t\twithoutEnlargement: true,\n\t\t} );\n\t} else if ( args.w || args.h ) {\n\t\timage.resize( {\n\t\t\twidth: Number( args.w ) * zoom 
|| undefined,\n\t\t\theight: Number( args.h ) * zoom || undefined,\n\t\t\tfit: args.crop ? 'cover' : 'inside',\n\t\t\twithoutEnlargement: true,\n\t\t} );\n\t}\n\n\t// set default quality slightly higher than sharp's default\n\tif ( ! args.quality ) {\n\t\targs.quality = applyZoomCompression( 82, zoom );\n\t}\n\n\t// allow override of compression quality\n\tif ( args.webp ) {\n\t\timage.webp( {\n\t\t\tquality: Math.round( clamp( args.quality, 0, 100 ) ),\n\t\t} );\n\t} else if ( metadata.format === 'jpeg' ) {\n\t\timage.jpeg( {\n\t\t\tquality: Math.round( clamp( args.quality, 0, 100 ) ),\n\t\t} );\n\t} else if ( metadata.format === 'png' ) {\n\t\t// Compress the PNG.\n\t\timage.png( {\n\t\t\tpalette: true,\n\t\t} );\n\t}\n\n\t// send image\n\treturn new Promise( ( resolve, reject ) => {\n\t\timage.toBuffer( async ( err, data, info ) => {\n\t\t\tif ( err ) {\n\t\t\t\treject( err );\n\t\t\t}\n\n\t\t\t// add invalid args\n\t\t\tresolve( {\n\t\t\t\tdata,\n\t\t\t\tinfo: {\n\t\t\t\t\t...info,\n\t\t\t\t\terrors: errors.join( ';' ),\n\t\t\t\t},\n\t\t\t} );\n\t\t} );\n\t} );\n}\n"
  },
  {
    "path": "src/server.ts",
    "content": "import { createServer, IncomingMessage, ServerResponse } from 'http';\n\nimport { handler } from './lambda-handler.js';\n\n// Define the server\nconst server = createServer( async ( req: IncomingMessage, res: ServerResponse ) => {\n\t// Construct the API Gateway event\n\tconst url = new URL( req.url!, `http://${req.headers.host}` );\n\tconst apiGatewayEvent = {\n\t\tversion: '2.0',\n\t\trouteKey: req.url!,\n\t\trawPath: url.pathname,\n\t\trawQueryString: url.searchParams.toString(),\n\t\theaders: req.headers,\n\t\trequestContext: {\n\t\t\taccountId: '123456789012',\n\t\t\tstage: 'default',\n\t\t\thttp: {\n\t\t\t\tmethod: req.method!,\n\t\t\t\tpath: req.url!,\n\t\t\t\tprotocol: 'HTTP/1.1',\n\t\t\t\tsourceIp: req.socket.remoteAddress!,\n\t\t\t\tuserAgent: req.headers['user-agent']!,\n\t\t\t},\n\t\t\trequestId: 'c6af9ac6-7b61-11e6-9a41-93e8deadbeef',\n\t\t\trouteKey: req.url!,\n\t\t},\n\t\tqueryStringParameters: Array.from( url.searchParams ).reduce(\n\t\t\t( acc, [ key, value ] ) => ( {\n\t\t\t\t...acc,\n\t\t\t\t[key]: value,\n\t\t\t} ),\n\t\t\t{}\n\t\t),\n\t};\n\n\ttry {\n\t\tawait handler( apiGatewayEvent, {\n\t\t\t/**\n\t\t\t * Set the content type for the response.\n\t\t\t */\n\t\t\tsetContentType( type: string ): void {\n\t\t\t\tres.setHeader( 'Content-Type', type );\n\t\t\t},\n\t\t\t/**\n\t\t\t * Write data to the response.\n\t\t\t */\n\t\t\twrite( stream: string | Buffer ): void {\n\t\t\t\tres.write( stream );\n\t\t\t},\n\t\t\t/**\n\t\t\t * End the response.\n\t\t\t */\n\t\t\tend(): void {\n\t\t\t\tres.end();\n\t\t\t},\n\t\t} );\n\t} catch ( e ) {\n\t\t// set the status code before writing, as headers are flushed on first write\n\t\tres.statusCode = 500;\n\t\tres.write( JSON.stringify( e ) );\n\t\tres.end();\n\t}\n} );\n\n// Start the server\nconst port = process.argv.slice( 2 )[0] || 8080;\n\nserver.listen( port, () => {\n\tconsole.log( `Server running at http://localhost:${port}/` ); // eslint-disable-line no-console\n} );\n"
  },
  {
    "path": "template.yaml",
    "content": "AWSTemplateFormatVersion: '2010-09-09'\nTransform: AWS::Serverless-2016-10-31\nResources:\n  Tachyon:\n    Type: AWS::Serverless::Function\n    Properties:\n      CodeUri: ./\n      Handler: dist/lambda-handler.handler\n      Environment:\n        Variables:\n          S3_BUCKET: hmn-uploads\n          S3_REGION: us-east-1\n      Runtime: nodejs18.x\n      Architectures:\n      - x86_64\n      Events:\n        Api:\n          Type: Api\n          Properties:\n            Path: \"/{proxy+}\"\n            Method: get\n      Timeout: 60\n      FunctionUrlConfig:\n        AuthType: NONE\n        InvokeMode: RESPONSE_STREAM\n\n"
  },
  {
    "path": "tests/events/animated-gif.json",
    "content": "{\n\t\"version\": \"2.0\",\n\t\"routeKey\": \"$default\",\n\t\"rawPath\": \"/s3-uploads-unit-tests/tachyon/pen.gif\",\n\t\"rawQueryString\": \"\",\n\t\"headers\": {\n\t\t\"sec-fetch-mode\": \"navigate\",\n\t\t\"x-amzn-tls-version\": \"TLSv1.2\",\n\t\t\"sec-fetch-site\": \"none\",\n\t\t\"accept-language\": \"en-US,en;q=0.5\",\n\t\t\"x-forwarded-proto\": \"https\",\n\t\t\"x-forwarded-port\": \"443\",\n\t\t\"x-forwarded-for\": \"212.59.69.208\",\n\t\t\"sec-fetch-user\": \"?1\",\n\t\t\"accept\": \"text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8\",\n\t\t\"x-amzn-tls-cipher-suite\": \"ECDHE-RSA-AES128-GCM-SHA256\",\n\t\t\"x-amzn-trace-id\": \"Root=1-64c21fa5-2e18c6c333bff5bc70a1bd9d\",\n\t\t\"host\": \"opphilvfiwbi2ouykkz57ccs6q0khsad.lambda-url.eu-central-1.on.aws\",\n\t\t\"upgrade-insecure-requests\": \"1\",\n\t\t\"accept-encoding\": \"gzip, deflate, br\",\n\t\t\"sec-fetch-dest\": \"document\",\n\t\t\"user-agent\": \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/115.0\"\n\t},\n\t\"requestContext\": {\n\t\t\"accountId\": \"anonymous\",\n\t\t\"apiId\": \"opphilvfiwbi2ouykkz57ccs6q0khsad\",\n\t\t\"domainName\": \"opphilvfiwbi2ouykkz57ccs6q0khsad.lambda-url.eu-central-1.on.aws\",\n\t\t\"domainPrefix\": \"opphilvfiwbi2ouykkz57ccs6q0khsad\",\n\t\t\"http\": {\n\t\t\t\"method\": \"GET\",\n\t\t\t\"path\": \"/s3-uploads-unit-tests/tachyon/pen.gif\",\n\t\t\t\"protocol\": \"HTTP/1.1\",\n\t\t\t\"sourceIp\": \"212.59.69.208\",\n\t\t\t\"userAgent\": \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/115.0\"\n\t\t},\n\t\t\"requestId\": \"c5c172a9-7a3c-4c6f-a66a-3422d9031e59\",\n\t\t\"routeKey\": \"$default\",\n\t\t\"stage\": \"$default\",\n\t\t\"time\": \"27/Jul/2023:07:41:25 +0000\",\n\t\t\"timeEpoch\": 1690443685529\n\t},\n\t\"isBase64Encoded\": false\n}\n"
  },
  {
    "path": "tests/events/original.json",
    "content": "{\n\t\"version\": \"2.0\",\n\t\"routeKey\": \"$default\",\n\t\"rawPath\": \"/s3-uploads-unit-tests/tachyon/canola.jpg\",\n\t\"rawQueryString\": \"\",\n\t\"headers\": {\n\t\t\"sec-fetch-mode\": \"navigate\",\n\t\t\"x-amzn-tls-version\": \"TLSv1.2\",\n\t\t\"sec-fetch-site\": \"none\",\n\t\t\"accept-language\": \"en-US,en;q=0.5\",\n\t\t\"x-forwarded-proto\": \"https\",\n\t\t\"x-forwarded-port\": \"443\",\n\t\t\"x-forwarded-for\": \"212.59.69.208\",\n\t\t\"sec-fetch-user\": \"?1\",\n\t\t\"accept\": \"text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8\",\n\t\t\"x-amzn-tls-cipher-suite\": \"ECDHE-RSA-AES128-GCM-SHA256\",\n\t\t\"x-amzn-trace-id\": \"Root=1-64c21fa5-2e18c6c333bff5bc70a1bd9d\",\n\t\t\"host\": \"opphilvfiwbi2ouykkz57ccs6q0khsad.lambda-url.eu-central-1.on.aws\",\n\t\t\"upgrade-insecure-requests\": \"1\",\n\t\t\"accept-encoding\": \"gzip, deflate, br\",\n\t\t\"sec-fetch-dest\": \"document\",\n\t\t\"user-agent\": \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/115.0\"\n\t},\n\t\"requestContext\": {\n\t\t\"accountId\": \"anonymous\",\n\t\t\"apiId\": \"opphilvfiwbi2ouykkz57ccs6q0khsad\",\n\t\t\"domainName\": \"opphilvfiwbi2ouykkz57ccs6q0khsad.lambda-url.eu-central-1.on.aws\",\n\t\t\"domainPrefix\": \"opphilvfiwbi2ouykkz57ccs6q0khsad\",\n\t\t\"http\": {\n\t\t\t\"method\": \"GET\",\n\t\t\t\"path\": \"/s3-uploads-unit-tests/tachyon/canola.jpg\",\n\t\t\t\"protocol\": \"HTTP/1.1\",\n\t\t\t\"sourceIp\": \"212.59.69.208\",\n\t\t\t\"userAgent\": \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/115.0\"\n\t\t},\n\t\t\"requestId\": \"c5c172a9-7a3c-4c6f-a66a-3422d9031e59\",\n\t\t\"routeKey\": \"$default\",\n\t\t\"stage\": \"$default\",\n\t\t\"time\": \"27/Jul/2023:07:41:25 +0000\",\n\t\t\"timeEpoch\": 1690443685529\n\t},\n\t\"isBase64Encoded\": false\n}\n"
  },
  {
    "path": "tests/events/signed-url.json",
    "content": "{\n\t\"version\": \"2.0\",\n\t\"routeKey\": \"$default\",\n\t\"rawPath\": \"/s3-uploads-unit-tests/private.png\",\n\t\"rawQueryString\": \"X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAYM4GX6NWSKCAHDX4%2F20230727%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20230727T184113Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=656a3e562cfe284651fa47ff38261581713350effe7a830dc81d789996b4b808\",\n\t\"headers\": {\n\t\t\"sec-fetch-mode\": \"navigate\",\n\t\t\"x-amzn-tls-version\": \"TLSv1.2\",\n\t\t\"sec-fetch-site\": \"none\",\n\t\t\"accept-language\": \"en-US,en;q=0.5\",\n\t\t\"x-forwarded-proto\": \"https\",\n\t\t\"x-forwarded-port\": \"443\",\n\t\t\"x-forwarded-for\": \"212.59.69.208\",\n\t\t\"sec-fetch-user\": \"?1\",\n\t\t\"accept\": \"text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8\",\n\t\t\"x-amzn-tls-cipher-suite\": \"ECDHE-RSA-AES128-GCM-SHA256\",\n\t\t\"x-amzn-trace-id\": \"Root=1-64c21fa5-2e18c6c333bff5bc70a1bd9d\",\n\t\t\"host\": \"opphilvfiwbi2ouykkz57ccs6q0khsad.lambda-url.eu-central-1.on.aws\",\n\t\t\"upgrade-insecure-requests\": \"1\",\n\t\t\"accept-encoding\": \"gzip, deflate, br\",\n\t\t\"sec-fetch-dest\": \"document\",\n\t\t\"user-agent\": \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/115.0\"\n\t},\n\t\"queryStringParameters\": {\n\t\t\"X-Amz-Algorithm\": \"AWS4-HMAC-SHA256\",\n\t\t\"X-Amz-Credential\": \"AKIAYM4GX6NWSKCAHDX4/20230727/us-east-1/s3/aws4_request\",\n\t\t\"X-Amz-Date\": \"20230727T184113Z\",\n\t\t\"X-Amz-Expires\": \"604800\",\n\t\t\"X-Amz-SignedHeaders\": \"host\",\n\t\t\"X-Amz-Signature\": \"656a3e562cfe284651fa47ff38261581713350effe7a830dc81d789996b4b808\"\n\t},\n\t\"requestContext\": {\n\t\t\"accountId\": \"anonymous\",\n\t\t\"apiId\": \"opphilvfiwbi2ouykkz57ccs6q0khsad\",\n\t\t\"domainName\": \"opphilvfiwbi2ouykkz57ccs6q0khsad.lambda-url.eu-central-1.on.aws\",\n\t\t\"domainPrefix\": 
\"opphilvfiwbi2ouykkz57ccs6q0khsad\",\n\t\t\"http\": {\n\t\t\t\"method\": \"GET\",\n\t\t\t\"path\": \"/s3-uploads-unit-tests/private.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAYM4GX6NWSKCAHDX4%2F20230727%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20230727T184113Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=656a3e562cfe284651fa47ff38261581713350effe7a830dc81d789996b4b808\",\n\t\t\t\"protocol\": \"HTTP/1.1\",\n\t\t\t\"sourceIp\": \"212.59.69.208\",\n\t\t\t\"userAgent\": \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/115.0\"\n\t\t},\n\t\t\"requestId\": \"c5c172a9-7a3c-4c6f-a66a-3422d9031e59\",\n\t\t\"routeKey\": \"$default\",\n\t\t\"stage\": \"$default\",\n\t\t\"time\": \"27/Jul/2023:07:41:25 +0000\",\n\t\t\"timeEpoch\": 1690443685529\n\t},\n\t\"isBase64Encoded\": false\n}\n"
  },
  {
    "path": "tests/setup.ts",
    "content": "/**\n * Jest setup file to mock the AWS Lambda runtime global `awslambda`.\n * This must run before test modules are imported so that\n * lambda-handler.ts can call awslambda.streamifyResponse() at module scope.\n */\nglobal.awslambda = {\n\t/**\n\t * Return the handler unchanged; no streaming wrapper is needed in tests.\n\t *\n\t * @param handler The response-streaming handler to wrap.\n\t */\n\tstreamifyResponse( handler: StreamifyHandler ): StreamifyHandler {\n\t\treturn handler;\n\t},\n\tHttpResponseStream: {\n\t\t/**\n\t\t * @param stream The response stream.\n\t\t * @param metadata The metadata for the response.\n\t\t */\n\t\tfrom( stream: ResponseStream, metadata: any ): ResponseStream {\n\t\t\tstream.metadata = metadata;\n\t\t\treturn stream;\n\t\t},\n\t},\n};\n"
  },
  {
    "path": "tests/test-animated-files.ts",
    "content": "import fs from 'fs';\n\nimport { test, expect } from '@jest/globals';\nimport sharp from 'sharp';\n\nimport { resizeBuffer } from '../src/lib';\n\ntest.failing( 'Test animated png resize', async () => {\n\tlet file = fs.readFileSync( __dirname + '/images/animated.png', {} );\n\tconst result = await resizeBuffer( file, { w: '20' } );\n\texpect( result.info.format ).toBe( 'png' );\n\texpect( result.info.width ).toBe( 20 );\n\n\tlet image = sharp( file );\n\tlet metadata = await image.metadata();\n\texpect( metadata.pages ).toBe( 20 );\n} );\n\ntest( 'Test animated gif resize', async () => {\n\tlet file = fs.readFileSync( __dirname + '/images/animated.gif', {} );\n\tconst result = await resizeBuffer( file, { w: '20' } );\n\texpect( result.info.format ).toBe( 'gif' );\n\texpect( result.info.width ).toBe( 20 );\n\n\tlet image = sharp( file );\n\tlet metadata = await image.metadata();\n\texpect( metadata.pages ).toBe( 48 );\n} );\n\ntest( 'Test animated gif resize webp', async () => {\n\tlet file = fs.readFileSync( __dirname + '/images/animated.gif', {} );\n\tconst result = await resizeBuffer( file, {\n\t\tw: '20',\n\t\twebp: true,\n\t} );\n\texpect( result.info.format ).toBe( 'webp' );\n\texpect( result.info.width ).toBe( 20 );\n\n\tlet image = sharp( file );\n\tlet metadata = await image.metadata();\n\texpect( metadata.pages ).toBe( 48 );\n} );\n\ntest( 'Test animated webp resize', async () => {\n\tlet file = fs.readFileSync( __dirname + '/images/animated.webp', {} );\n\tconst result = await resizeBuffer( file, {\n\t\tw: '20',\n\t} );\n\texpect( result.info.format ).toBe( 'webp' );\n\texpect( result.info.width ).toBe( 20 );\n\n\tlet image = sharp( file );\n\tlet metadata = await image.metadata();\n\texpect( metadata.pages ).toBe( 12 );\n} );\n\ntest( 'Test animated webp resize webp', async () => {\n\tlet file = fs.readFileSync( __dirname + '/images/animated.webp', {} );\n\tconst result = await resizeBuffer( file, {\n\t\tw: '20',\n\t\twebp: true,\n\t} 
);\n\texpect( result.info.format ).toBe( 'webp' );\n\texpect( result.info.width ).toBe( 20 );\n\n\tlet image = sharp( file );\n\tlet metadata = await image.metadata();\n\texpect( metadata.pages ).toBe( 12 );\n} );\n"
  },
  {
    "path": "tests/test-filesize/fixtures.json",
    "content": "{\n    \"Website.png-original.png\": 40173,\n    \"Website.png-small.png\": 5850,\n    \"Website.png-medium.png\": 16594,\n    \"Website.png-large.png\": 40173,\n    \"Website.png-webp.webp\": 27218,\n    \"animated.png-original.png\": 4929,\n    \"animated.png-small.png\": 4929,\n    \"animated.png-medium.png\": 4929,\n    \"animated.png-large.png\": 4929,\n    \"animated.png-webp.webp\": 8762,\n    \"animated.gif-original.gif\": 370705,\n    \"animated.gif-small.gif\": 99089,\n    \"animated.gif-medium.gif\": 370705,\n    \"animated.gif-large.gif\": 370705,\n    \"animated.gif-webp.webp\": 208888,\n    \"briefing-copywriting.jpg-original.jpeg\": 122312,\n    \"briefing-copywriting.jpg-small.jpeg\": 10007,\n    \"briefing-copywriting.jpg-medium.jpeg\": 16484,\n    \"briefing-copywriting.jpg-large.jpeg\": 36253,\n    \"briefing-copywriting.jpg-webp.webp\": 22722,\n    \"animated.webp-original.webp\": 111636,\n    \"animated.webp-small.webp\": 26868,\n    \"animated.webp-medium.webp\": 76902,\n    \"animated.webp-large.webp\": 111636,\n    \"animated.webp-webp.webp\": 115972,\n    \"hdr.jpg-original.jpeg\": 148964,\n    \"hdr.jpg-small.jpeg\": 10642,\n    \"hdr.jpg-medium.jpeg\": 24341,\n    \"hdr.jpg-large.jpeg\": 87548,\n    \"hdr.jpg-webp.webp\": 82806,\n    \"icons.png-original.png\": 31735,\n    \"icons.png-small.png\": 6351,\n    \"icons.png-medium.png\": 13431,\n    \"icons.png-large.png\": 27323,\n    \"icons.png-webp.webp\": 30694,\n    \"humans.png-original.png\": 856689,\n    \"humans.png-small.png\": 16050,\n    \"humans.png-medium.png\": 63874,\n    \"humans.png-large.png\": 283843,\n    \"humans.png-webp.webp\": 142800\n}"
  },
  {
    "path": "tests/test-filesize/test-filesize.ts",
    "content": "import fs from 'fs';\n\nimport { test, expect } from '@jest/globals';\nimport Table from 'cli-table';\nimport { filesize } from 'filesize';\n\nimport { Args, resizeBuffer } from '../../src/lib';\n\nlet images = fs.readdirSync( __dirname + '/../images' );\n\nconst args = process.argv.slice( 2 );\n\nif ( args[0] && args[0].indexOf( '--' ) !== 0 ) {\n\timages = images.filter( file => args[0] === file );\n}\n\n// Manually change to true when you are intentionally changing files.\nconst saveFixtured = false;\n\nconst table = new Table( {\n\thead: [ 'Image', 'Original Size', 'Tachyon Size', '100px', '300px', '700px', '700px webp' ],\n\tcolWidths: [ 30, 15, 20, 15, 15, 15, 20 ],\n} );\n\n// Read in the existing fixtures for resized images, so we can detect if image\n// resizing has led to a change in file size from previous runs.\nconst oldFixtures = JSON.parse( fs.readFileSync( __dirname + '/fixtures.json' ).toString() );\nconst fixtures: { [key: string]: number } = {};\n\n/**\n * Resize each test image at several sizes and compare the output file sizes\n * against the stored fixtures.\n */\ntest( 'Test file sizes', async () => {\n\tawait Promise.all(\n\t\timages.map( async imageName => {\n\t\t\tconst image = `${__dirname}/../images/${imageName}`;\n\t\t\tconst imageData = fs.readFileSync( image );\n\t\t\tconst sizes = {\n\t\t\t\toriginal: {},\n\t\t\t\tsmall: { w: 100 },\n\t\t\t\tmedium: { w: 300 },\n\t\t\t\tlarge: { w: 700 },\n\t\t\t\twebp: {\n\t\t\t\t\tw: 700,\n\t\t\t\t\twebp: true,\n\t\t\t\t},\n\t\t\t};\n\t\t\tconst promises = await Promise.all(\n\t\t\t\tObject.entries( sizes ).map( async ( [ _size, args ] ) => {\n\t\t\t\t\treturn resizeBuffer( imageData, args as Args );\n\t\t\t\t} )\n\t\t\t);\n\n\t\t\t// Zip them back into a size => image map.\n\t\t\tconst initial : { [key: string]: any } = {};\n\t\t\tconst resized = promises.reduce( ( images, image, index ) => {\n\t\t\t\timages[ Object.keys( sizes )[index] ] = image;\n\t\t\t\treturn images;\n\t\t\t}, initial );\n\n\t\t\t// Save each one to the file system for viewing.\n\t\t\tObject.entries( resized ).forEach( ( [ 
size, image ] ) => {\n\t\t\t\tconst imageKey = `${imageName}-${size}.${image.info.format}`;\n\t\t\t\tfixtures[imageKey] = image.data.length;\n\t\t\t\tfs.writeFile( `${__dirname}/output/${imageKey}`, image.data, () => {} );\n\t\t\t} );\n\n\t\t\ttable.push( [\n\t\t\t\timageName,\n\t\t\t\tfilesize( imageData.length ),\n\t\t\t\tfilesize( resized.original.info.size ) +\n\t\t\t\t\t' (' +\n\t\t\t\t\tMath.floor( ( resized.original.info.size / imageData.length ) * 100 ) +\n\t\t\t\t\t'%)',\n\t\t\t\tfilesize( resized.small.info.size ),\n\t\t\t\tfilesize( resized.medium.info.size ),\n\t\t\t\tfilesize( resized.large.info.size ),\n\t\t\t\tfilesize( resized.webp.info.size ) +\n\t\t\t\t\t' (' +\n\t\t\t\t\tMath.floor( ( resized.webp.info.size / resized.large.info.size ) * 100 ) +\n\t\t\t\t\t'%)',\n\t\t\t] );\n\t\t} )\n\t);\n\n\tif ( saveFixtured ) {\n\t\tfs.writeFileSync( __dirname + '/fixtures.json', JSON.stringify( fixtures, null, 4 ) );\n\t}\n\n\tconsole.log( table.toString() );\n\n\tfor ( const key in fixtures ) {\n\t\tif ( ! oldFixtures[key] ) {\n\t\t\tcontinue;\n\t\t}\n\n\t\t// File sizes after resizing are not 100% deterministic across systems and\n\t\t// architectures, so allow a small drift from the old image size.\n\t\t// See https://github.com/lovell/sharp/issues/3783\n\t\tlet increasedPercent = 100 - Math.round( oldFixtures[key] / fixtures[key] * 100 );\n\t\tlet increasedSize = fixtures[key] - oldFixtures[key];\n\n\t\tif ( fixtures[key] !== oldFixtures[key] ) {\n\t\t\tconst diff = Math.abs( 100 - ( oldFixtures[key] / fixtures[key] * 100 ) );\n\t\t\tconsole.log(\n\t\t\t\t`${key} differs from the fixture by ${\n\t\t\t\t\tfilesize( oldFixtures[key] - fixtures[key] )\n\t\t\t\t} (${diff}%). 
New ${ filesize( fixtures[key] ) }, old ${ filesize( oldFixtures[key] ) }.`\n\t\t\t);\n\t\t}\n\n\t\t// If the file has changed by more than 5 KB, then we expect it to be within 3% of the old size.\n\t\tif ( increasedSize > 1024 * 5 ) {\n\t\t\texpect( increasedPercent ).toBeLessThanOrEqual( 3 );\n\t\t}\n\t}\n} );\n"
  },
  {
    "path": "tests/test-lambda.ts",
    "content": "import { test, expect } from '@jest/globals';\n\nimport { handler } from '../src/lambda-handler';\n\nimport animatedGifLambdaEvent from './events/animated-gif.json';\n\nprocess.env.S3_REGION = 'us-east-1';\nprocess.env.S3_BUCKET = 'hmn-uploads';\n\ntest( 'Test content type headers', async () => {\n\tconst testResponseStream = new TestResponseStream();\n\tawait handler( animatedGifLambdaEvent, testResponseStream );\n\n\texpect( testResponseStream.contentType ).toBe( 'image/gif' );\n} );\n\ntest( 'Test image not found', async () => {\n\tconst testResponseStream = new TestResponseStream();\n\tanimatedGifLambdaEvent.rawPath = '/tachyon/does-not-exist.gif';\n\n\tawait handler( animatedGifLambdaEvent, testResponseStream );\n\n\texpect( testResponseStream.metadata.statusCode ).toBe( 404 );\n\texpect( testResponseStream.contentType ).toBe( 'text/html' );\n} );\n\n/**\n * A test response stream.\n */\nclass TestResponseStream {\n\tcontentType: string | undefined;\n\tbody: string | Buffer | undefined;\n\theaders: { [key: string]: string } = {};\n\tmetadata: any;\n\n\tsetContentType( type: string ): void {\n\t\tthis.contentType = type;\n\t}\n\twrite( stream: string | Buffer ): void {\n\t\tif ( typeof this.body === 'string' ) {\n\t\t\tthis.body += stream;\n\t\t} else if ( this.body instanceof Buffer ) {\n\t\t\tthis.body = this.body.toString().concat( stream.toString() );\n\t\t} else {\n\t\t\tthis.body = stream;\n\t\t}\n\t}\n\tend(): void {\n\t\tif ( this.metadata.headers['Content-Type'] ) {\n\t\t\tthis.contentType = this.metadata.headers['Content-Type'];\n\t\t}\n\t}\n}\n\n"
  },
  {
    "path": "tests/test-private-upload.ts",
    "content": "import { GetObjectCommand, S3Client } from '@aws-sdk/client-s3';\nimport { getSignedUrl } from '@aws-sdk/s3-request-presigner';\nimport { test, expect } from '@jest/globals';\nimport { MiddlewareType } from '@smithy/types';\n\nimport { handler } from '../src/lambda-handler';\nimport { Args } from '../src/lib';\n\n/**\n * Presign a URL for a given key.\n *\n * @param key The S3 object key to presign.\n * @returns {Promise<Args>} The presigned query string params.\n */\nasync function getPresignedUrlParams( key: string ) : Promise<Args> {\n\tconst client = new S3Client( {\n\t\tregion: process.env.S3_REGION,\n\t} );\n\tconst command = new GetObjectCommand( {\n\t\tBucket: process.env.S3_BUCKET,\n\t\tKey: key,\n\t} );\n\n\t/**\n\t * Middleware to remove the x-id query string param from the GetObject call.\n\t */\n\tconst middleware: MiddlewareType<any, any> = ( next: any, context: any ) => async ( args: any ) => {\n\t\tconst { request } = args;\n\t\tdelete request.query['x-id'];\n\t\treturn next( args );\n\t};\n\n\tclient.middlewareStack.addRelativeTo( middleware, {\n\t\tname: 'tests',\n\t\trelation: 'before',\n\t\ttoMiddleware: 'awsAuthMiddleware',\n\t\toverride: true,\n\t} );\n\n\tconst presignedUrl = new URL( await getSignedUrl( client, command, {\n\t\texpiresIn: 60,\n\t} ) );\n\n\tconst queryStringParameters: Args = Object.fromEntries( presignedUrl.searchParams.entries() );\n\n\treturn queryStringParameters;\n}\n\ntest( 'Test get private upload', async () => {\n\tconst event = {\n\t\t'version': '2.0',\n\t\t'routeKey': '$default',\n\t\t'rawPath': '/private.png',\n\t\t'headers': {\n\t\t},\n\t\tqueryStringParameters: await getPresignedUrlParams( 'private.png' ),\n\t\t'isBase64Encoded': false,\n\t};\n\n\tlet contentType;\n\n\tawait handler( event, {\n\t\t/**\n\t\t * Set the content type for the response.\n\t\t */\n\t\tsetContentType( type: string ): void {\n\t\t\tcontentType = type;\n\t\t},\n\t\t/**\n\t\t * Write data to the response.\n\t\t */\n\t\twrite( stream: string | Buffer ): void 
{\n\t\t},\n\t\t/**\n\t\t * End the response.\n\t\t */\n\t\tend(): void {\n\t\t\tif ( this.metadata.headers['Cache-Control'] ) {\n\t\t\t\tcontentType = this.metadata.headers['Content-Type'];\n\t\t\t}\n\t\t},\n\t} );\n\n\texpect( contentType ).toBe( 'image/png' );\n} );\n\ntest( 'Test get private upload with presign params', async () => {\n\tconst presignParams = await getPresignedUrlParams( 'private.png' ) as Record<string, string>;\n\n\t// Pass the presign params via a single `presign` query arg rather than as\n\t// individual query string parameters.\n\tconst event = {\n\t\t'version': '2.0',\n\t\t'routeKey': '$default',\n\t\t'rawPath': '/private.png',\n\t\t'headers': {\n\t\t},\n\t\t'queryStringParameters': {\n\t\t\tpresign: new URLSearchParams( presignParams ).toString(),\n\t\t},\n\t\t'isBase64Encoded': false,\n\t};\n\n\tlet contentType;\n\n\tawait handler( event, {\n\t\t/**\n\t\t * Set the content type for the response.\n\t\t */\n\t\tsetContentType( type: string ): void {\n\t\t\tcontentType = type;\n\t\t},\n\t\t/**\n\t\t * Write data to the response.\n\t\t */\n\t\twrite( stream: string | Buffer ): void {\n\t\t},\n\t\t/**\n\t\t * End the response.\n\t\t */\n\t\tend(): void {\n\t\t\tif ( this.metadata.headers['Cache-Control'] ) {\n\t\t\t\tcontentType = this.metadata.headers['Content-Type'];\n\t\t\t}\n\t\t},\n\t} );\n\n\texpect( contentType ).toBe( 'image/png' );\n} );\n"
  },
  {
    "path": "tsconfig.json",
    "content": "{\n\t\"compilerOptions\": {\n\t\t\"target\": \"es2020\",\n\t\t\"strict\": true,\n\t\t\"preserveConstEnums\": true,\n\t\t\"sourceMap\": false,\n\t\t\"module\": \"Node16\",\n\t\t\"moduleResolution\": \"node16\",\n\t\t\"esModuleInterop\": true,\n\t\t\"skipLibCheck\": true,\n\t\t\"forceConsistentCasingInFileNames\": true,\n\t\t\"isolatedModules\": true,\n\t\t\"types\": [\"node\"],\n\t\t\"resolveJsonModule\": true,\n\t\t\"outDir\": \"dist\",\n\t\t\"rootDir\": \"./src\"\n\t},\n\t\"exclude\": [\"node_modules\"],\n\t\"include\": [\"./src/**/*.ts\", \"./*.d.ts\"],\n}\n"
  },
  {
    "path": "tsconfig.test.json",
    "content": "{\n\t\"compilerOptions\": {\n\t\t\"target\": \"es2020\",\n\t\t\"strict\": true,\n\t\t\"preserveConstEnums\": true,\n\t\t\"noEmit\": false,\n\t\t\"sourceMap\": true,\n\t\t\"module\": \"commonjs\",\n\t\t\"moduleResolution\": \"node10\",\n\t\t\"ignoreDeprecations\": \"6.0\",\n\t\t\"esModuleInterop\": true,\n\t\t\"skipLibCheck\": true,\n\t\t\"forceConsistentCasingInFileNames\": true,\n\t\t\"isolatedModules\": true,\n\t\t\"outDir\": \"./dist\",\n\t\t\"types\": [\"node\"],\n\t\t\"resolveJsonModule\": true,\n\t\t\"rootDir\": \".\"\n\t},\n\t\"exclude\": [\"node_modules\", \"**/*.test.ts\"],\n\t\"include\": [\"./src/**/*.ts\", \"./*.d.ts\", \"./tests/**/*.ts\"],\n}\n"
  }
]