Full Code of contentful/contentful-export for AI

Repository: contentful/contentful-export
Branch: main
Commit: 3a4fb235090b
Files: 60
Total size: 157.2 KB

Directory structure:
contentful-export/

├── .bito/
│   └── guidelines/
│       ├── domain-invariants.txt
│       ├── repo-truth-and-boundaries.txt
│       └── review-posture.txt
├── .bito.yaml
├── .contentful/
│   └── vault-secrets.yaml
├── .eslintrc
├── .github/
│   ├── CODEOWNERS
│   ├── ISSUE_TEMPLATE/
│   │   ├── bug_report.md
│   │   ├── config.yml
│   │   └── feature_request.md
│   ├── PULL_REQUEST_TEMPLATE.md
│   ├── dependabot.yml
│   └── workflows/
│       ├── build.yaml
│       ├── check.yaml
│       ├── codeql.yaml
│       ├── dependabot-approve-and-request-merge.yaml
│       ├── main.yaml
│       └── release.yaml
├── .gitignore
├── .npmrc
├── .nvmrc
├── AGENTS.md
├── ARCHITECTURE.md
├── CONTRIBUTING.md
├── LICENSE
├── README.md
├── babel.config.json
├── bin/
│   └── contentful-export
├── catalog-info.yaml
├── docs/
│   ├── ADRs/
│   │   ├── 2016-06-20-babel-cjs-build-pipeline.md
│   │   ├── 2016-06-20-semantic-release.md
│   │   ├── 2024-12-02-typescript-check-only.md
│   │   ├── 2025-11-14-circleci-to-github-actions.md
│   │   ├── 2026-04-09-cma-v12-node-22-minimum.md
│   │   └── README.md
│   └── specs/
│       ├── .gitkeep
│       └── README.md
├── example-config.json
├── example-config.test.json
├── lib/
│   ├── index.js
│   ├── parseOptions.js
│   ├── tasks/
│   │   ├── download-assets.js
│   │   ├── get-space-data.js
│   │   └── init-client.js
│   ├── usageParams.js
│   └── utils/
│       ├── embargoedAssets.js
│       └── headers.js
├── package.json
├── test/
│   ├── integration/
│   │   └── export-lib.test.js
│   └── unit/
│       ├── index.test.js
│       ├── mocks/
│       │   ├── download-assets.js
│       │   └── get-space-data.js
│       ├── parseOptions.test.js
│       ├── tasks/
│       │   ├── download-assets.test.js
│       │   ├── get-space-data.test.js
│       │   └── init-client.test.js
│       └── utils/
│           ├── embargoedAssets.test.js
│           └── headers.test.js
├── tsconfig.json
└── types.d.ts

================================================
FILE CONTENTS
================================================

================================================
FILE: .bito/guidelines/domain-invariants.txt
================================================
Critical domain invariants for contentful-export:

1. types.d.ts and lib/usageParams.js must stay in sync with the actual Options interface.
   Any new option added to parseOptions.js must be reflected in both files.

2. dist/ is a build artifact compiled from lib/ via Babel. Never edit dist/ files directly.
   The build command is: npm run build (which runs clean + tsc check + babel compile).

3. package.json version is always "0.0.0-determined-by-semantic-release". Never set a
   version number manually. semantic-release handles all versioning based on conventional
   commit messages.

4. Webhooks and roles can only be exported from the master environment. The code in
   get-space-data.js explicitly skips these when environmentId !== 'master'. This is a
   Contentful API constraint, not a bug.

5. When deliveryToken is provided and includeDrafts is false, entries and assets are
   fetched from the CDA (published-only). Tags are CMA-only and will not be exported
   via CDA. Do not change this dual-client logic without understanding the implications
   for downstream consumers.

6. Asset downloads and editor interface fetching both use concurrency of 6 via
   Bluebird Promise.map. Increasing concurrency risks hitting Contentful API rate limits.

7. The contentOnly flag is a convenience shorthand that sets skipRoles, skipContentModel,
   and skipWebhooks to true. Do not duplicate this logic elsewhere.

8. bfj (Big-Friendly JSON) is used for writing export files to handle large spaces
   without exhausting memory. Do not replace with JSON.stringify for the export write path.

9. Embargoed assets (URLs where the subdomain matches {images|assets|downloads|videos}.secure.*)
   require JWT-signed URLs via the asset_keys API. The signing cache key is
   host:spaceId:environmentId with a 6-hour expiry window. Changes to this logic are
   security-sensitive.

10. The pretest script runs lint + build + clean test artifacts. Running npm test always
    triggers a full build. For faster iteration during development, use npm run test:unit
    directly (after an initial build).
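
To make invariant 9 concrete, here is a minimal sketch of the described cache-key scheme, assuming keys of the form `host:spaceId:environmentId` with a 6-hour expiry. The function and variable names are hypothetical illustrations, not the actual code in lib/utils/embargoedAssets.js.

```javascript
// Illustrative sketch only -- NOT the real lib/utils/embargoedAssets.js.
// Invariant 9: asset keys are cached under "host:spaceId:environmentId"
// and expire after 6 hours.
const SIX_HOURS_MS = 6 * 60 * 60 * 1000

const assetKeyCache = new Map()

function cacheKey (host, spaceId, environmentId) {
  return `${host}:${spaceId}:${environmentId}`
}

function storeAssetKey (host, spaceId, environmentId, assetKey, now = Date.now()) {
  assetKeyCache.set(cacheKey(host, spaceId, environmentId), {
    assetKey,
    expiresAt: now + SIX_HOURS_MS
  })
}

function getCachedAssetKey (host, spaceId, environmentId, now = Date.now()) {
  const entry = assetKeyCache.get(cacheKey(host, spaceId, environmentId))
  // Expired entries count as a miss, forcing a fresh asset_keys request
  if (!entry || entry.expiresAt <= now) return null
  return entry.assetKey
}
```

The 6-hour window means a long-running export reuses one signed key per space/environment instead of hammering the asset_keys API once per asset.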


================================================
FILE: .bito/guidelines/repo-truth-and-boundaries.txt
================================================
Use the repository's written documentation as review context and check whether
the change matches the documented intent.

- Start from README.md, ARCHITECTURE.md, AGENTS.md, CONTRIBUTING.md, and docs/ADRs/
  for architectural context.
- Check whether code, tests, and documentation all tell the same story. Flag
  mismatches between implementation and the documented architecture or ADRs.
- Treat AGENTS.md as the authoritative guide for sharp edges and invariants. If a
  change violates an invariant documented there, flag it.
- If CI or another required check already enforces a merge rule, do not ask for
  duplicate PR template sections or manual checklists.
- Ask for an ADR update when a change is architecture-significant (new module, new
  dependency, new persistence strategy, new integration).
- Distinguish the current public API surface from internal implementation details.
  Public type/interface changes in types.d.ts require extra scrutiny.
- The standalone CLI (bin/contentful-export) prints a redirection notice pointing
  users to contentful-cli. Changes should focus on the library API, not CLI enhancements.


================================================
FILE: .bito/guidelines/review-posture.txt
================================================
Review this pull request like the tech lead of the contentful-export project.

- Prefer a few high-signal findings to a long list of minor or style-only comments.
- Prefer behavior, contract, runtime, and documentation issues over process-only
  suggestions. Do not ask for duplicate PR template sections, checklists, or manual
  validation acknowledgements when CI or required checks already enforce that policy.
- Keep feedback actionable: explain why it matters, how it would surface in practice,
  and the clearest next step.
- If a concern is only a risk or assumption rather than a confirmed bug, say that
  clearly and explain what evidence would confirm it.
- If you find no issues, say so explicitly and call out any residual uncertainty
  that still deserves human attention.

Key areas to focus on for this repo:
- Public API surface changes (Options interface, return types) -- check that types.d.ts
  is updated to match any changes in lib/.
- Backward compatibility for npm consumers -- this is a library, not just a CLI.
- Pagination correctness in get-space-data.js -- off-by-one errors or missed edge cases
  can silently drop content during export.
- Embargoed asset handling -- JWT signing, cache invalidation, and error handling are
  security-sensitive.
- contentful-batch-libs compatibility -- ensure imported utilities match the expected API.


================================================
FILE: .bito.yaml
================================================
suggestion_mode: comprehensive
post_description: true
post_changelist: true
exclude_files: 'package-lock.json'
exclude_draft_pr: false
secret_scanner_feedback: true
linters_feedback: true
repo_level_guidelines_enabled: true
sequence_diagram_enabled: true
custom_guidelines:
  general:
    - name: 'Review Posture'
      path: './.bito/guidelines/review-posture.txt'
    - name: 'Repo Truth And Alignment'
      path: './.bito/guidelines/repo-truth-and-boundaries.txt'
    - name: 'Domain Invariants'
      path: './.bito/guidelines/domain-invariants.txt'


================================================
FILE: .contentful/vault-secrets.yaml
================================================
version: 1
services:
  github-action:
    policies:
      - dependabot
      - semantic-release
      - packages-read


================================================
FILE: .eslintrc
================================================
{
  "extends": [
    "standard",
    "eslint:recommended",
    "plugin:@typescript-eslint/eslint-recommended",
    "plugin:@typescript-eslint/recommended"
  ],
  "plugins": [
    "standard",
    "promise",
    "jest"
  ],
  "env": {
    "jest/globals": true
  }
}


================================================
FILE: .github/CODEOWNERS
================================================
* @contentful/team-developer-experience

package.json
package-lock.json


================================================
FILE: .github/ISSUE_TEMPLATE/bug_report.md
================================================
---
name: Bug Report
about: Create a report to help us improve
title: '[BUG] '
labels: bug
assignees: ''
---

## Bug Description

A clear and concise description of what the bug is.

## Steps to Reproduce

1. Go to '...'
2. Execute '...'
3. See error

## Expected Behavior

A clear and concise description of what you expected to happen.

## Actual Behavior

A clear and concise description of what actually happened.

## Code Sample

```javascript
// Minimal code to reproduce the issue
```

## Environment

- OS: [e.g. macOS 13.0, Windows 11, Ubuntu 22.04]
- Package Version: [e.g. 1.2.3]
- Node Version: [e.g. 18.0.0]
- Package Manager: [e.g. npm 9.0.0, yarn 1.22.0]

## Error Messages/Logs

```
Paste any error messages or relevant logs here
```

## Screenshots

If applicable, add screenshots to help explain your problem.

## Additional Context

Add any other context about the problem here.

## Possible Solution

If you have suggestions on how to fix the bug, please describe them here.


================================================
FILE: .github/ISSUE_TEMPLATE/config.yml
================================================
blank_issues_enabled: true
contact_links:
  - name: Contentful Support & Help Center
    url: https://support.contentful.com/hc/en-us/
    about: Submit a support ticket or browse the help center.
  - name: Contentful Developer Portal
    url: https://www.contentful.com/developers/
    about: Browse the developer portal for documentation, tutorials, and more.
  - name: Contentful Community Discord
    url: https://www.contentful.com/discord/
    about: Get peer support on the Contentful Community Discord.


================================================
FILE: .github/ISSUE_TEMPLATE/feature_request.md
================================================
---
name: Feature Request
about: Suggest an idea for this project
title: '[FEATURE] '
labels: enhancement
assignees: ''
---

## Feature Description

A clear and concise description of the feature you'd like to see.

## Problem Statement

Is your feature request related to a problem? Please describe.
Example: I'm always frustrated when [...]

## Proposed Solution

A clear and concise description of what you want to happen.

## Use Case

Describe the use case for this feature. How would you use it?

```javascript
// Example of how the feature would be used
const example = new Feature({
  option: 'value',
});
```

## Alternatives Considered

A clear and concise description of any alternative solutions or features you've considered.

## Benefits

What are the benefits of implementing this feature?

- Benefit 1
- Benefit 2
- Benefit 3

## Potential Drawbacks

Are there any potential drawbacks or challenges with this feature?

## Additional Context

Add any other context, screenshots, or examples about the feature request here.

## Implementation Suggestions

If you have ideas about how this could be implemented, please share them here.

## Priority

How important is this feature to you?

- [ ] Critical - Blocking my usage
- [ ] High - Important for my use case
- [ ] Medium - Would be nice to have
- [ ] Low - Just a suggestion

## Willingness to Contribute

- [ ] I'd be willing to submit a PR for this feature
- [ ] I can help test this feature
- [ ] I can help with documentation


================================================
FILE: .github/PULL_REQUEST_TEMPLATE.md
================================================
<!--
Thank you for opening a pull request.

Please fill in as much of the template below as you're able. Feel free to remove
any section you want to skip.
-->

## Summary

<!-- Give a short summary what your PR is introducing/fixing. -->

## Description

<!-- Describe your changes in detail -->

## Motivation and Context

<!--
Why is this change required? What problem does it solve?
If it fixes an open issue, please link to the issue here.
-->

## PR Checklist

- [ ] I have read the `CONTRIBUTING.md` file
- [ ] All commits follow [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/)
- [ ] Documentation is updated (if necessary)
- [ ] PR doesn't contain any sensitive information
- [ ] There are no breaking changes


================================================
FILE: .github/dependabot.yml
================================================
version: 2

updates:
  - package-ecosystem: npm
    directory: "/"
    schedule:
      interval: daily
      time: "00:00"
      timezone: UTC
    open-pull-requests-limit: 10
    ignore:
    - dependency-name: husky
      versions:
        - ">=5.0.0"
    - dependency-name: figures # Pure ESM module. Remove when supporting ESM
      versions:
        - ">=4.0.0"
    - dependency-name: bfj
      versions:
        - ">=9.0.0"
    - dependency-name: semantic-release
      versions:
        - ">=23.0.0"
    commit-message:
      prefix: build
      include: scope
    groups:
      production-dependencies:
        applies-to: version-updates
        dependency-type: production
        update-types:
          - minor
          - patch
        patterns:
          - '*'
      dev-dependencies:
        applies-to: version-updates
        dependency-type: development
        update-types:
          - minor
          - patch
        patterns:
          - '*'

    cooldown:
      default-days: 15

================================================
FILE: .github/workflows/build.yaml
================================================
name: Build

on:
  workflow_call:

jobs:
  build:
    runs-on: ubuntu-latest

    permissions:
      contents: read

    steps:
      - name: Checkout code
        uses: actions/checkout@v5

      - name: Setup Node.js
        uses: actions/setup-node@v6
        with:
          node-version: '24'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Build
        run: npm run build

      - name: Check build artifacts
        run: ls -la dist/

      - name: Save Build folders
        uses: actions/cache/save@v4
        with:
          path: |
            dist
          key: build-cache-${{ github.run_id }}-${{ github.run_attempt }}


================================================
FILE: .github/workflows/check.yaml
================================================
name: Run Checks

on:
  workflow_call:
    secrets:
      MANAGEMENT_TOKEN:
        required: true
      DELIVERY_TOKEN:
        required: true
      EXPORT_SPACE_ID:
        required: true
      EXPORT_SPACE_ID_EMBARGOED_ASSETS:
        required: true

jobs:
  check:
    runs-on: ubuntu-latest

    permissions:
      contents: read

    steps:
      - name: Checkout code
        uses: actions/checkout@v5

      - name: Setup Node.js
        uses: actions/setup-node@v6
        with:
          node-version: '24'
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Restore the build folders
        uses: actions/cache/restore@v4
        with:
          path: |
            dist
          key: build-cache-${{ github.run_id }}-${{ github.run_attempt }}
          fail-on-cache-miss: true

      # Prior CI pipeline did not lint or format, skipping for now
      # - name: Run linter
      #   run: npm run lint

      # - name: Check formatting
      #   run: npm run format:check

      - name: Run unit tests
        run: npm run test:unit

      - name: Run integration tests
        run: npm run test:integration
        env:
          MANAGEMENT_TOKEN: ${{ secrets.MANAGEMENT_TOKEN }}
          DELIVERY_TOKEN: ${{ secrets.DELIVERY_TOKEN }}
          EXPORT_SPACE_ID: ${{ secrets.EXPORT_SPACE_ID }}
          EXPORT_SPACE_ID_EMBARGOED_ASSETS: ${{ secrets.EXPORT_SPACE_ID_EMBARGOED_ASSETS }}

================================================
FILE: .github/workflows/codeql.yaml
================================================
---
name: "CodeQL Scan for GitHub Actions Workflows"

on:
  push:
    branches: [main]
    paths: [".github/workflows/**"]

jobs:
  analyze:
    name: Analyze GitHub Actions workflows
    runs-on: ubuntu-latest
    permissions:
      actions: read
      contents: read
      security-events: write

    steps:
      - uses: actions/checkout@v5

      - name: Initialize CodeQL
        uses: github/codeql-action/init@v4
        with:
          languages: actions

      - name: Run CodeQL Analysis
        uses: github/codeql-action/analyze@v4
        with:
          category: actions


================================================
FILE: .github/workflows/dependabot-approve-and-request-merge.yaml
================================================
name: "dependabot approve-and-request-merge"

on: pull_request_target

jobs:
  worker:
    permissions:
      contents: write
      id-token: write
    runs-on: ubuntu-latest
    if: github.event.pull_request.user.login == 'dependabot[bot]' && github.repository == github.event.pull_request.head.repo.full_name
    steps:
      - uses: contentful/github-auto-merge@v1
        with:
          VAULT_URL: ${{ secrets.VAULT_URL }}


================================================
FILE: .github/workflows/main.yaml
================================================
name: CI
permissions:
  contents: read

on:
  push:
    branches: ['**']
  pull_request:
    branches: ['**']

jobs:
  build:
    uses: ./.github/workflows/build.yaml

  check:
    needs: build
    uses: ./.github/workflows/check.yaml
    secrets: inherit

  release:
    if: github.event_name == 'push' && (github.ref == 'refs/heads/main' || github.ref == 'refs/heads/beta')
    needs: [build, check]
    permissions:
      contents: write
      id-token: write
      actions: read
    uses: ./.github/workflows/release.yaml
    secrets:
      VAULT_URL: ${{ secrets.VAULT_URL }}



================================================
FILE: .github/workflows/release.yaml
================================================
name: Release

on:
  workflow_call:
    secrets:
      VAULT_URL:
        required: true

jobs:
  release:
    runs-on: ubuntu-latest

    permissions:
      contents: write
      id-token: write
      actions: read

    steps:
      - name: 'Retrieve Secrets from Vault'
        id: vault
        uses: hashicorp/vault-action@v3.4.0
        with:
          url: ${{ secrets.VAULT_URL }}
          role: ${{ github.event.repository.name }}-github-action
          method: jwt
          path: github-actions
          exportEnv: false
          secrets: |
            github/token/${{ github.event.repository.name }}-semantic-release token | GITHUB_TOKEN ;

      - name: Get Automation Bot User ID
        id: get-user-id
        run: echo "user-id=$(gh api "/users/contentful-automation[bot]" --jq .id)" >> "$GITHUB_OUTPUT"
        env:
          GITHUB_TOKEN: ${{ steps.vault.outputs.GITHUB_TOKEN }}

      - name: Setting up Git User Credentials
        run: |
          git config --global user.name 'contentful-automation[bot]'
          git config --global user.email '${{ steps.get-user-id.outputs.user-id }}+contentful-automation[bot]@users.noreply.github.com'

      - name: Checkout code
        uses: actions/checkout@v5
        with:
          fetch-depth: 0

      - name: Setup Node.js
        uses: actions/setup-node@v6
        with:
          node-version: '24'
          cache: 'npm'

      - name: Install latest npm
        run: npm install -g npm@latest

      - name: Install dependencies
        run: npm ci

      - name: Restore the build folders
        uses: actions/cache/restore@v4
        with:
          path: |
            dist
          key: build-cache-${{ github.run_id }}-${{ github.run_attempt }}
          fail-on-cache-miss: true

      - name: Run Release
        run: |
          echo "Starting Semantic Release Process"
          echo "npm version: $(npm -v)"
          npm run semantic-release
        env:
          GITHUB_TOKEN: ${{ steps.vault.outputs.GITHUB_TOKEN }}

      - name: Get latest release tag
        id: get-tag
        run: |
          TAG=$(gh api repos/${{ github.repository }}/releases/latest --jq .tag_name)
          echo "tag=$TAG" >> $GITHUB_OUTPUT
        env:
          GITHUB_TOKEN: ${{ steps.vault.outputs.GITHUB_TOKEN }}

      - name: Summary
        run: |
          echo "## Release Summary" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "- **Version**: ${{ steps.get-tag.outputs.tag }}" >> $GITHUB_STEP_SUMMARY
          echo "- **GitHub Release**: https://github.com/${{ github.repository }}/releases/tag/${{ steps.get-tag.outputs.tag }}" >> $GITHUB_STEP_SUMMARY


================================================
FILE: .gitignore
================================================
contentful-export-*

dist
gh-pages

# Docker
*.dockerfile
*.dockerignore

# Node package managers
yarn.lock

# Created by https://www.gitignore.io/api/vim,code,linux,macos,windows,sublimetext,node

### Code ###
# Visual Studio Code - https://code.visualstudio.com/
.settings/
.vscode/
jsconfig.json

# Export config
config.json

### Linux ###
*~

# temporary files which can be created if a process still has a handle open of a deleted file
.fuse_hidden*

# KDE directory preferences
.directory

# Linux trash folder which might appear on any partition or disk
.Trash-*

# .nfs files are created when an open file is removed but is still being accessed
.nfs*

### macOS ###
*.DS_Store
.AppleDouble
.LSOverride

# Icon must end with two \r
Icon

# Thumbnails
._*

# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent

# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk

### Node ###
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov

# Coverage directory used by tools like istanbul
coverage

# nyc test coverage
.nyc_output

# Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files)
.grunt

# Bower dependency directory (https://bower.io/)
bower_components

# node-waf configuration
.lock-wscript

# Compiled binary addons (http://nodejs.org/api/addons.html)
build/Release

# Dependency directories
node_modules/
jspm_packages/

# Typescript v1 declaration files
typings/

# Optional npm cache directory
.npm

# Optional eslint cache
.eslintcache

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# Yarn Integrity file
.yarn-integrity

# dotenv environment variables file
.env
.envrc

### SublimeText ###
# cache files for sublime text
*.tmlanguage.cache
*.tmPreferences.cache
*.stTheme.cache

# workspace files are user-specific
*.sublime-workspace

# project files should be checked into the repository, unless a significant
# proportion of contributors will probably not be using SublimeText
# *.sublime-project

# sftp configuration file
sftp-config.json

# Package control specific files
Package Control.last-run
Package Control.ca-list
Package Control.ca-bundle
Package Control.system-ca-bundle
Package Control.cache/
Package Control.ca-certs/
Package Control.merged-ca-bundle
Package Control.user-ca-bundle
oscrypto-ca-bundle.crt
bh_unicode_properties.cache

# Sublime-github package stores a github token in this file
# https://packagecontrol.io/packages/sublime-github
GitHub.sublime-settings

### Vim ###
# swap
.sw[a-p]
.*.sw[a-p]
# session
Session.vim
# temporary
.netrwhist
# auto-generated tag files
tags

### Windows ###
# Windows thumbnail cache files
Thumbs.db
ehthumbs.db
ehthumbs_vista.db

# Folder config file
Desktop.ini

# Recycle Bin used on file shares
$RECYCLE.BIN/

# Windows Installer files
*.cab
*.msi
*.msm
*.msp

# Windows shortcuts
*.lnk

.idea
.tool-versions

# End of https://www.gitignore.io/api/vim,code,linux,macos,windows,sublimetext,node


================================================
FILE: .npmrc
================================================
ignore-scripts=true


================================================
FILE: .nvmrc
================================================
24


================================================
FILE: AGENTS.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Agent Guide

Read this file first. It tells you where to find context in this repo.

## Quick Reference

| What you need | Where to look |
|---|---|
| How this repo is structured | [ARCHITECTURE.md](./ARCHITECTURE.md) |
| How to build/test/run | [CONTRIBUTING.md](./CONTRIBUTING.md) |
| Why decisions were made | [docs/ADRs/](./docs/ADRs/) |
| What this repo does | [README.md](./README.md) |
| PR review rules | [.bito/guidelines/](./.bito/guidelines/) |
| Active specs/work | [docs/specs/](./docs/specs/) |

## Sharp Edges & Invariants

- **Never edit `dist/` directly.** It is a Babel build artifact compiled from `lib/`. Always edit source in `lib/` and run `npm run build`.
- **`types.d.ts` is hand-maintained.** It is NOT auto-generated. When changing the public API (options, return type), you must manually update `types.d.ts` to match.
- **`usageParams.js` and `types.d.ts` must stay in sync.** CLI options (yargs definitions in `lib/usageParams.js`) and the TypeScript `Options` interface (`types.d.ts`) define the same option set. Changes to one must be reflected in the other.
- **Webhooks and roles are master-only.** The code explicitly skips webhook and role export when `environmentId !== 'master'`. Do not change this -- it reflects a Contentful API constraint.
- **The standalone CLI redirects to `contentful-cli`.** `bin/contentful-export` prints a notice that the CLI has moved to `contentful-cli`, then runs the export. Do not add new CLI-only features here.
- **Integration tests require real Contentful spaces.** They are not mocked. CI provides the required secrets via environment variables. Do not commit tokens.
- **`package.json` version is `0.0.0-determined-by-semantic-release`.** Never set a version manually. `semantic-release` handles all versioning.
- **Babel target (Node 12) is lower than `engines.node` (>=22).** This is a known inconsistency in `babel.config.json`. The low target is harmless but confusing.
- **`contentOnly` flag is a shorthand.** When set, it internally enables `skipRoles`, `skipContentModel`, and `skipWebhooks`. Do not duplicate this logic.
- **Asset downloads use concurrency of 6.** Both `download-assets.js` and `get-space-data.js` (editor interfaces) use Bluebird `Promise.map` with `{ concurrency: 6 }`. Be careful about changing this -- it affects API rate limiting.
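
The `contentOnly` shorthand above can be sketched as follows. The option names match the documented invariant; the helper function itself is a hypothetical illustration, not the actual code in `lib/parseOptions.js`.

```javascript
// Hypothetical sketch of the documented contentOnly shorthand.
// The real logic lives in lib/parseOptions.js.
function applyContentOnly (options) {
  if (!options.contentOnly) return options
  return {
    ...options,
    // contentOnly is sugar for exporting only content + assets
    skipRoles: true,
    skipContentModel: true,
    skipWebhooks: true
  }
}
```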

## High-Traffic Areas

These paths are the most critical and frequently exercised — changes here carry outsized risk:

| Path | Why it's sensitive | What to watch |
|---|---|---|
| `lib/tasks/get-space-data.js` | Core export logic; every export invokes it | Pagination ordering (`sys.createdAt,sys.id`) is load-bearing for deterministic exports. Changing page size or concurrency affects API rate limits. |
| `lib/parseOptions.js` | Validates and merges all user input | Adding/removing options here cascades to `usageParams.js`, `types.d.ts`, and all downstream consumers. |
| `lib/index.js` | Listr task orchestration | Task ordering matters — e.g., `get-space-data` must complete before `download-assets` can reference fetched asset URLs. |
| `lib/tasks/download-assets.js` | Network-heavy, concurrency-limited | Changing concurrency (currently 6) can trigger CMA rate limiting or exhaust memory on large spaces. |

## Key Conventions

- **Commit format:** Conventional Commits enforced by Commitizen + Husky pre-commit hook
- **Branch strategy:** `main` (stable releases) + `beta` (pre-releases), feature branches, squash merge
- **Test location:** `test/unit/` mirrors `lib/` structure; `test/integration/` for end-to-end
- **Module system:** ES modules in source, compiled to CJS via Babel for npm distribution
- **Build before test:** `npm test` runs `pretest` which includes lint + build

## Integration Points

**Upstream (this repo consumes):**
- Contentful Management API (`api.contentful.com`) -- all entity CRUD
- Contentful Delivery API (`cdn.contentful.com`) -- published-only content
- `contentful-batch-libs` -- shared utilities for export/import tools

**Downstream (consumes this repo):**
- `contentful-cli` -- wraps this library as `contentful space export`
- `contentful-mcp-server` -- uses this library for space-to-space migration
- Direct npm consumers

## Build & Quality

```bash
# Quick verification loop
npm install && npm run build && npm run test:unit && npm run lint
```


================================================
FILE: ARCHITECTURE.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Architecture

## Overview

`contentful-export` is a Node.js library and CLI tool that exports the full content model, content, and assets from a Contentful space (or environment) to a JSON file. It orchestrates paginated reads against the Contentful Management API (CMA) and optionally the Content Delivery API (CDA), downloads asset files, and writes the aggregated result to disk. It is the export half of the Contentful export/import toolchain.

## System Context

```mermaid
graph TD
    CLI["contentful-cli<br/>(space export command)"] --> LIB["contentful-export<br/>(this repo)"]
    MCP["contentful-mcp-server<br/>(space-to-space migration)"] --> LIB
    USER["Direct library consumers<br/>(npm package)"] --> LIB
    LIB --> CMA["Contentful Management API<br/>(api.contentful.com)"]
    LIB --> CDA["Contentful Delivery API<br/>(cdn.contentful.com)"]
    LIB --> FS["Local filesystem<br/>(JSON export file + asset downloads)"]
```

## Internal Structure

| Directory / File | Purpose |
|---|---|
| `lib/index.js` | Main entry point. Orchestrates the export pipeline using Listr tasks: init client, fetch space data, download assets, write export file. |
| `lib/parseOptions.js` | Merges defaults, config file, and user-supplied params. Validates required fields (`spaceId`, `managementToken`). Processes proxy settings, query strings, and file paths. |
| `lib/tasks/init-client.js` | Creates CMA and/or CDA client instances using `contentful-management` and `contentful` SDKs. |
| `lib/tasks/get-space-data.js` | Paginated fetching of all entity types (content types, entries, assets, locales, tags, webhooks, roles, editor interfaces). Handles draft/archived filtering and tag stripping. |
| `lib/tasks/download-assets.js` | Downloads asset binary files to disk with concurrency of 6. Handles embargoed (signed URL) assets. |
| `lib/usageParams.js` | Yargs CLI argument definitions. Consumed by the `bin/contentful-export` CLI entry point. |
| `lib/utils/embargoedAssets.js` | JWT-based URL signing for embargoed (secure) assets. Caches asset keys per space/environment. |
| `lib/utils/headers.js` | Parses custom HTTP header strings (`-H "Key: Value"`) into an object for API requests. |
| `bin/contentful-export` | CLI entry point. Requires the built `dist/` output, prints a redirection notice pointing users to `contentful-cli`, then runs the export. |
| `types.d.ts` | Hand-maintained TypeScript type declarations for the public API (`Options` interface and default export). |
| `dist/` | Babel-compiled output (CJS). Generated by `npm run build`, not checked into git. |

## Data Flow

1. **Option parsing** -- User passes options (programmatic or CLI). `parseOptions` merges defaults, config file, and params. Validates required fields.
2. **Client initialization** -- Creates a CMA client. If `deliveryToken` is provided (and `includeDrafts` is false), also creates a CDA client for fetching published-only entries/assets.
3. **Paginated fetching** -- `get-space-data.js` connects to the space/environment, then fetches each entity type in sequence using `pagedGet` (page size = `maxAllowedLimit`, default 1000, ordered by `sys.createdAt,sys.id`). Entries and assets are post-filtered for drafts/archived status and optionally have tags stripped.
4. **Asset download** (optional) -- If `downloadAssets` is true, asset files are streamed to disk under `exportDir/<host>/<path>`. Embargoed assets are signed via the asset_keys API with a 6-hour expiry window before download.
5. **JSON export** -- The aggregated data object is written to disk using `bfj` (Big-Friendly JSON) for streaming large JSON writes without exhausting memory.
6. **Summary** -- A table of exported entity counts is printed, along with duration and file path.
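
The paged fetch in step 3 can be sketched as a simple loop. This is illustrative only: the real implementation in `lib/tasks/get-space-data.js` uses bluebird and differs in detail, and `getEntries` here is a stand-in for any CMA collection call.

```javascript
// Sketch of the pagedGet pattern: walk a collection endpoint in pages of
// `maxAllowedLimit`, with a stable sort order so pages don't shift.
async function pagedGet (client, method, maxAllowedLimit = 1000) {
  const items = []
  let skip = 0
  let total = Infinity
  while (skip < total) {
    const page = await client[method]({
      skip,
      limit: maxAllowedLimit,
      order: 'sys.createdAt,sys.id' // stable ordering across pages
    })
    items.push(...page.items)
    total = page.total
    skip += maxAllowedLimit
  }
  return items
}
```

The `order` clause matters: without a deterministic sort, items created during the export could cause entries to be skipped or duplicated between pages.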

## Domain Concepts

| Concept | Description |
|---|---|
| **Space** | Top-level container in Contentful. Export targets one space at a time. |
| **Environment** | A branch of a space's content. Defaults to `master`. Webhooks and roles can only be exported from the `master` environment. |
| **Content Model** | The set of content types, locales, and editor interfaces that define the structure of content. |
| **Embargoed Assets** | Assets hosted on `*.secure.*` domains that require JWT-signed URLs for access. The tool creates short-lived signed URLs via the `asset_keys` API. |
| **Draft / Archived** | Entries and assets can be in draft (no `publishedVersion`) or archived (`archivedVersion` set) states. By default, only published items are exported. |
| **CDA vs CMA export** | CMA returns all versions (latest, including unpublished changes). CDA returns only the published version. Providing a `deliveryToken` switches entry/asset fetching to CDA. Tags are CMA-only and will not be exported via CDA. |

## Key Dependencies

| Dependency | Why it's here |
|---|---|
| `contentful-management` (v12) | CMA client for fetching space data. v12 requires Node >=22. |
| `contentful` (v11) | CDA client, used when `deliveryToken` is provided for published-only export. |
| `contentful-batch-libs` (v11) | Shared utility library for Contentful export/import tools: logging, error handling, task wrapping, proxy utilities, sequence headers. |
| `bfj` (v9) | Big-Friendly JSON -- streaming JSON serializer for writing large export files without memory exhaustion. |
| `listr` | Task runner that provides structured progress output (spinner or verbose renderer for CI). |
| `bluebird` | Promise library used for `Promise.map` with concurrency control (pagination, asset downloads). |
| `yargs` (v18) | CLI argument parsing. |
| `axios` (v1) | HTTP client for downloading asset files. |
| `jsonwebtoken` | JWT signing for embargoed asset URL generation. |
| `date-fns` | Date formatting and duration calculation for export file naming and summary. |

## Configuration

| Variable / Flag | Purpose | Default |
|---|---|---|
| `spaceId` | Space to export (required) | -- |
| `managementToken` | CMA API token (required) | -- |
| `environmentId` | Environment within the space | `master` |
| `deliveryToken` | CDA token; switches entry/asset fetching to published-only | -- |
| `exportDir` | Directory for output files | `process.cwd()` |
| `saveFile` | Whether to write JSON to disk | `true` |
| `maxAllowedLimit` | Items per API page request | `1000` |
| `downloadAssets` | Download asset binary files to disk | `false` |
| `includeDrafts` | Include draft entries/assets | `false` |
| `includeArchived` | Include archived entries/assets | `false` |
| `contentOnly` | Only export entries and assets (sets `skipRoles`, `skipContentModel`, `skipWebhooks` to true) | `false` |
| `skipContentModel` / `skipContent` / `skipRoles` / `skipWebhooks` / `skipTags` / `skipEditorInterfaces` | Granular skip flags | all `false` |
| `stripTags` | Remove tags from exported entries/assets | `false` |
| `host` | CMA API host | `api.contentful.com` |
| `hostDelivery` | CDA host | `cdn.contentful.com` |
| `proxy` / `rawProxy` | HTTP proxy configuration | -- / `false` |
| `useVerboseRenderer` | Line-by-line output instead of spinner (useful for CI) | `false` |
| `config` | Path to JSON config file with all options | -- |
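
The merge order for these options can be sketched as follows, assuming later sources win (defaults, then config file, then user-supplied params). The real `parseOptions` also normalizes paths, proxy settings, and query strings; only a few defaults from the table are shown.

```javascript
// Illustrative precedence sketch: defaults < config file < params.
const defaults = { environmentId: 'master', saveFile: true, maxAllowedLimit: 1000 }

function mergeOptions (configFile = {}, params = {}) {
  const opts = { ...defaults, ...configFile, ...params }
  // The two required fields from the table above
  if (!opts.spaceId || !opts.managementToken) {
    throw new Error('spaceId and managementToken are required')
  }
  return opts
}
```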

### CI Environment Variables

These are used in the GitHub Actions check workflow for integration tests:

| Variable | Purpose |
|---|---|
| `MANAGEMENT_TOKEN` | CMA token for test space |
| `DELIVERY_TOKEN` | CDA token for test space |
| `EXPORT_SPACE_ID` | Space ID for integration tests |
| `EXPORT_SPACE_ID_EMBARGOED_ASSETS` | Space ID for embargoed asset tests |

## Operational Knowledge

### Deployment

This is an npm library, not a deployed service. Releases happen automatically via `semantic-release` when commits are pushed to `main` (stable) or `beta` (prerelease) branches. The release workflow retrieves credentials from HashiCorp Vault (visible in `.github/workflows/release.yaml`).

- **Rollback:** Unpublish or publish a patched version to npm. There is no service to roll back.
- **Beta channel:** Pushing to the `beta` branch publishes a prerelease version on the `beta` npm dist-tag.

### Failure Modes

| Failure | Cause | Mitigation |
|---|---|---|
| `400 - Response size too big` | Contentful API response size limits exceeded | Reduce `maxAllowedLimit` (e.g., to 50) |
| Integration test failures in CI | Missing or expired test space credentials (secrets) | Ensure `MANAGEMENT_TOKEN`, `DELIVERY_TOKEN`, `EXPORT_SPACE_ID`, `EXPORT_SPACE_ID_EMBARGOED_ASSETS` secrets are valid |
| Embargoed asset download failure | Asset key creation fails or JWT signing error | Check that the space has embargoed assets enabled and the management token has permissions |
| `ContentfulMultiError` | Aggregated errors during export (partial failure) | Check the error log file at the path printed in output |

### Dependency Failure Behavior

This library depends on the Contentful Management and Delivery APIs at runtime. If those APIs are unavailable:

| Scenario | Behavior |
|---|---|
| CMA unreachable (network failure, DNS, timeout) | Export fails immediately with an Axios network error. No partial output is written. |
| CMA returns 5xx errors | The SDK retries with exponential backoff (built into `contentful-management`). After retries are exhausted, the export fails with the error aggregated into `ContentfulMultiError`. |
| CDA unreachable (when `deliveryToken` is provided) | Same as CMA — network error or retry exhaustion leads to export failure. |
| Rate-limited (429) | The SDK handles 429 responses with automatic retry after the `X-Contentful-RateLimit-Reset` header delay. Large spaces may see slow exports but will eventually complete unless the rate limit is persistently exceeded. |
| Asset CDN unreachable (during `downloadAssets`) | Individual asset downloads fail after Axios timeout. The export completes but reports failed asset downloads in the error log. |

There is no partial-export resume capability — a failed export must be retried from scratch.
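
The retry behavior referenced above can be illustrated with a generic retry-with-exponential-backoff loop. This is a sketch for intuition only; the real retry logic lives inside the `contentful-management` SDK, not this repo.

```javascript
// Illustrative exponential-backoff retry: delay doubles on each attempt.
async function withRetries (fn, { retries = 3, baseDelayMs = 100 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      if (attempt >= retries) throw err // retries exhausted: surface the error
      await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** attempt))
    }
  }
}
```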


## Integration Points

### Upstream (this repo consumes)

- **Contentful Management API** (`api.contentful.com`) -- Primary data source for all entity types
- **Contentful Delivery API** (`cdn.contentful.com`) -- Optional, for published-only entry/asset export
- **Contentful Asset Keys API** -- For signing embargoed asset download URLs
- **contentful-batch-libs** -- Shared logging, error handling, and utility functions

### Downstream (consumes this repo)

- **contentful-cli** (`contentful space export` command) -- Primary CLI consumer; the standalone CLI in this repo redirects users here
- **contentful-mcp-server** -- Uses this library for space-to-space migration export step
- **Direct npm consumers** -- Anyone importing `contentful-export` as a library


================================================
FILE: CONTRIBUTING.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Contributing

## Prerequisites

| Tool | Version | Notes |
|---|---|---|
| Node.js | >=22 (see `.nvmrc` for exact: 24) | Use `nvm use` to switch automatically |
| npm | Bundled with Node | Lockfile is `package-lock.json` |

No additional tokens or Docker setup is needed for local development. Integration tests require Contentful API tokens (provided via environment variables in CI).

## Getting Started

```bash
# Clone and install
git clone git@github.com:contentful/contentful-export.git
cd contentful-export
npm install                     # source: package-lock.json

# Build (clean, type-check, then babel compile)
npm run build                   # source: package.json -> scripts.build

# Run tests (lint + build + unit + integration)
npm test                        # source: package.json -> scripts.test
```

**Note:** `npm test` runs `pretest` first (lint + build + clean test artifacts), then unit tests with coverage, then integration tests. Integration tests require environment variables -- see the CI section below.

## Development Workflow

```bash
# Watch mode for incremental builds
npm run build:watch             # source: package.json -> scripts.build:watch

# Run only unit tests
npm run test:unit               # source: package.json -> scripts.test:unit

# Run unit tests in watch mode
npm run test:unit:watch         # source: package.json -> scripts.test:unit:watch

# Run only integration tests (requires env vars)
npm run test:integration        # source: package.json -> scripts.test:integration
```

## Commands

### Build

| Command | What it does | Source |
|---|---|---|
| `npm run build` | Clean dist/, type-check with tsc, compile with Babel to dist/ | `package.json` -> `scripts.build` |
| `npm run build:watch` | Babel compile in watch mode | `package.json` -> `scripts.build:watch` |
| `npm run clean` | Remove dist/ and coverage/ | `package.json` -> `scripts.clean` |
| `npm run check` | TypeScript type checking (no emit) | `package.json` -> `scripts.check` |

### Test

| Command | What it does | Source |
|---|---|---|
| `npm test` | Lint + build + unit tests + integration tests | `package.json` -> `scripts.test` |
| `npm run test:unit` | Jest unit tests with coverage | `package.json` -> `scripts.test:unit` |
| `npm run test:unit:watch` | Unit tests in watch mode | `package.json` -> `scripts.test:unit:watch` |
| `npm run test:unit:debug` | Unit tests with Node inspector for debugging | `package.json` -> `scripts.test:unit:debug` |
| `npm run test:integration` | Jest integration tests (requires env vars) | `package.json` -> `scripts.test:integration` |
| `npm run test:integration:watch` | Integration tests in watch mode | `package.json` -> `scripts.test:integration:watch` |
| `npm run test:integration:debug` | Integration tests with Node inspector | `package.json` -> `scripts.test:integration:debug` |

### Lint

| Command | What it does | Source |
|---|---|---|
| `npm run lint` | ESLint on lib/, bin/, and types.d.ts | `package.json` -> `scripts.lint` |
| `npm run lint:fix` | ESLint with auto-fix | `package.json` -> `scripts.lint:fix` |

### Release

| Command | What it does | Source |
|---|---|---|
| `npm run semantic-release` | Run semantic-release (CI only) | `package.json` -> `scripts.semantic-release` |

## Testing

- **Framework:** Jest (v29)
- **Config:** Inline in `package.json` under the `jest` key
- **Unit tests:** `test/unit/` -- mirrors `lib/` structure
- **Integration tests:** `test/integration/` -- runs against a real Contentful space (requires environment variables)
- **Run all:** `npm test`
- **Run single:** `npx jest --testPathPattern=test/unit/tasks/init-client`
- **Coverage:** Collected from `lib/**/*.js`, excludes `usageParams.js`

Integration tests require these environment variables:
- `MANAGEMENT_TOKEN` -- CMA API token
- `DELIVERY_TOKEN` -- CDA API token
- `EXPORT_SPACE_ID` -- Space ID for standard tests
- `EXPORT_SPACE_ID_EMBARGOED_ASSETS` -- Space ID for embargoed asset tests

## Code Style & Conventions

- **Linting:** ESLint with `standard` + `@typescript-eslint` rules (config: `.eslintrc`)
- **Module system:** Source is ES modules (import/export), compiled to CJS via Babel for distribution
- **Type checking:** TypeScript (`tsc --noEmit`) via `tsconfig.json` -- checks `.js` files with `allowJs` and `checkJs` enabled. Strict mode is off.
- **No Prettier:** This repo does not use Prettier. Follow the existing code style and ESLint rules.
- **Note:** The CI pipeline currently has linting commented out in the check workflow (see `.github/workflows/check.yaml`). Linting still runs locally via `npm test` (which calls `pretest`).

## Commit Convention

This repo uses [Conventional Commits](https://www.conventionalcommits.org/) via [Commitizen](https://github.com/commitizen/cz-cli) with `cz-conventional-changelog`:

```
type(scope): description
```

Valid types: `feat`, `fix`, `chore`, `docs`, `refactor`, `test`, `perf`, `ci`, `build`, `revert`

Examples:
```
feat: add support for exporting taxonomies
fix: handle embargoed asset download timeout
build(deps): bump contentful-management to v12
chore(ci): update Node version in workflow
```

`semantic-release` uses `@semantic-release/commit-analyzer` to determine version bumps:
- `feat:` -> minor version bump
- `fix:` -> patch version bump
- `build(deps):` -> patch version bump (custom rule)
- `feat!:` or `fix!:` or `BREAKING CHANGE:` in body -> major version bump
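
The bump rules above can be summarized as a small lookup. This is an illustrative sketch, not the actual `@semantic-release/commit-analyzer` implementation.

```javascript
// Maps a conventional-commit message to the version bump it triggers.
function bumpFor (message, body = '') {
  if (/^(feat|fix)(\(.+\))?!:/.test(message) || /BREAKING CHANGE:/.test(body)) {
    return 'major'
  }
  if (/^feat(\(.+\))?:/.test(message)) return 'minor'
  if (/^fix(\(.+\))?:/.test(message)) return 'patch'
  if (/^build\(deps\):/.test(message)) return 'patch' // custom repo rule
  return null // e.g. chore:, docs: -- no release
}
```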

Git hooks are configured via npm lifecycle scripts: `precommit` runs `npm run lint` and `prepush` runs `npm run test`. Husky v4 is a devDependency but hook configuration relies on npm script naming conventions.

## Branch Strategy

- `main` -- Production. Merges trigger a stable npm release via semantic-release.
- `beta` -- Pre-release channel. Merges trigger a beta npm release (`npm install contentful-export@beta`).
- Feature branches -- No enforced naming pattern. Use descriptive names (e.g., `feat/add-taxonomy-export`, `fix/embargoed-download`).

## Release Process

Fully automated via `semantic-release` on GitHub Actions:

1. Push or merge to `main` or `beta`
2. CI runs build + check jobs
3. If checks pass, the release job runs `semantic-release`
4. `semantic-release` analyzes commit messages, determines version, publishes to npm, creates a GitHub release

Release credentials (GitHub token) are retrieved from HashiCorp Vault during CI. The npm publish mechanism is handled by semantic-release.

## Pull Requests

- No enforced PR title format
- Required checks: Build job + Check job (unit tests + integration tests)
- Dependabot PRs are auto-approved and auto-merged via the `dependabot-approve-and-request-merge` workflow
- CI runs on all branches (push and PR events)

## CI/CD

| Job | Trigger | What it does |
|---|---|---|
| `build` (Build) | Push to any branch, PR to any branch | Checkout, setup Node 24, npm ci, babel build, cache dist/ | 
| `check` (Run Checks) | After build succeeds | Restore build cache, run unit tests, run integration tests (with secrets) |
| `release` (Release) | Push to `main` or `beta`, after build + check pass | Retrieve Vault secrets, run `semantic-release` to publish to npm + create GitHub release |
| `codeql` (CodeQL Scan) | Push to `main` changing `.github/workflows/` | Static analysis of GitHub Actions workflows |
| `dependabot-approve-and-request-merge` | `pull_request_target` from dependabot | Auto-approve and request merge for Dependabot PRs |

## Adding a New Component

| What you're adding | Copy this as a template | Keep in sync |
|---|---|---|
| New entity type in export | Any entity block in `lib/tasks/get-space-data.js` (e.g., tags ~line 63) | `parseOptions.js` (default + `contentOnly`), `usageParams.js`, `types.d.ts` |
| New top-level task | `lib/tasks/download-assets.js` (options closure pattern) or `get-space-data.js` (sub-list pattern) | Wire into `lib/index.js` tasks array |
| New utility | `lib/utils/headers.js` | — |

Tests mirror `lib/` structure: `test/unit/tasks/<name>.test.js` or `test/unit/utils/<name>.test.js`.

## File-Level Guidance

| Path | Notes |
|---|---|
| `dist/` | Generated by Babel build. Never edit directly. Not committed to git. |
| `types.d.ts` | Hand-maintained TypeScript declarations for the public API. Update when changing the `Options` interface or export signature. |
| `bin/contentful-export` | CLI entry point. Requires compiled `dist/` output. Prints a redirection notice pointing to `contentful-cli`, then runs the export. |
| `package-lock.json` | Lockfile. Do not edit manually. Regenerate with `npm install`. |
| `.npmrc` | Contains `ignore-scripts=true` for security. |
| `example-config.json` | Example configuration file shipped with the package. Keep in sync with supported options. |
| `lib/usageParams.js` | CLI argument definitions. Keep in sync with `types.d.ts` when adding/removing options. |


================================================
FILE: LICENSE
================================================
The MIT License (MIT)

Copyright (c) 2016 Contentful

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


================================================
FILE: README.md
================================================
# Contentful export tool

[![CI](https://github.com/contentful/contentful-export/actions/workflows/main.yaml/badge.svg)](https://github.com/contentful/contentful-export/actions/workflows/main.yaml)
[![npm](https://img.shields.io/npm/v/contentful-export.svg)](https://www.npmjs.com/package/contentful-export) [![semantic-release](https://img.shields.io/badge/%20%20%F0%9F%93%A6%F0%9F%9A%80-semantic--release-e10079.svg)](https://github.com/semantic-release/semantic-release)

[Contentful](https://www.contentful.com) provides a content infrastructure for digital teams to power content in websites, apps, and devices. Unlike a CMS, Contentful was built to integrate with the modern software stack. It offers a central hub for structured content, powerful management and delivery APIs, and a customizable web app that enables developers and content creators to ship digital products faster.

This is a library that helps you back up your Content Model, Content and Assets, or move them to a new Contentful space.

To import your exported data, please refer to the [contentful-import](https://github.com/contentful/contentful-import) repository.

## :exclamation: Usage as CLI

> We moved the CLI version of this tool into our [Contentful CLI](https://github.com/contentful/contentful-cli). This allows our users to use and install only one single CLI tool to get the full Contentful experience.
>
> Please have a look at the [Contentful CLI export command documentation](https://github.com/contentful/contentful-cli/tree/master/docs/space/export) to learn more about how to use this as command line tool.

## :cloud: Pre-requisites & Installation

### Pre-requisites

- Node LTS

### :cloud: Installation

```bash
npm install contentful-export
```

## :hand: Usage

### CommonJS

```javascript
const contentfulExport = require('contentful-export')

const options = {
  spaceId: '<space_id>',
  managementToken: '<content_management_api_key>',
  ...
}

contentfulExport(options)
  .then((result) => {
    console.log('Your space data:', result)
  })
  .catch((err) => {
    console.log('Oh no! Some errors occurred!', err)
  })
```

### ESM

```javascript
import contentfulExport from 'contentful-export'

const options = {
  spaceId: '<space_id>',
  managementToken: '<content_management_api_key>',
  ...
}

// contentfulExport returns a Promise so you can use async/await, etc.
await contentfulExport(options)
```

### Querying

To scope your export, you can pass query parameters. All search parameters of our API are supported, as documented in our [API documentation](https://www.contentful.com/developers/docs/references/content-delivery-api/#/reference/search-parameters).

```javascript
const contentfulExport = require('contentful-export')

const options = {
  spaceId: '<space_id>',
  managementToken: '<content_management_api_key>',
  queryEntries: ['content_type=<content_type_id>']
}

contentfulExport(options)
...
```

The export tool also supports multiple inline queries.

```javascript
const contentfulExport = require('contentful-export')

const options = {
  spaceId: '<space_id>',
  managementToken: '<content_management_api_key>',
  queryEntries: [
    'content_type=<content_type_id>',
    'sys.id=<entry_id>'
  ]
}

contentfulExport(options)
...
```

`queryAssets` uses the same syntax as `queryEntries`.
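
Under the hood, inline query strings like these can be merged into a single query object. A minimal sketch (the real parsing lives in `lib/parseOptions.js` and may differ in detail):

```javascript
// Join the inline queries and parse them like a URL query string.
function parseQueries (queries) {
  const params = new URLSearchParams(queries.join('&'))
  return Object.fromEntries(params)
}

parseQueries(['content_type=blogPost', 'sys.id=abc123'])
// -> { content_type: 'blogPost', 'sys.id': 'abc123' }
```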

### Export an environment

```javascript
const contentfulExport = require('contentful-export')

const options = {
  spaceId: '<space_id>',
  managementToken: '<content_management_api_key>',
  environmentId: '<environment_id>'
}

contentfulExport(options)
...
```

## :gear: Configuration options

### Basics

#### `spaceId` [string] [required]

ID of the space with source data

#### `environmentId` [string] [default: 'master']

ID of the environment in the source space

#### `managementToken` [string] [required]

Contentful management API token for the space to be exported

#### `deliveryToken` [string]

Contentful Content Delivery API (CDA) token for the space to be exported.

Providing `deliveryToken` will export both entries and assets from the
Content Delivery API instead of the Content Management API.
This is useful if you want only the latest _published_ versions: the
Management API always returns the most recent version of every item,
including unpublished changes.

To clarify: where the Content Management API returns the latest version (50 in this example):

```
  "createdAt": "2020-01-06T12:00:00.000Z",
  "updatedAt": "2020-04-07T11:00:00.000Z",
  "publishedVersion": 23,
  "publishedAt": "2020-04-05T14:00:00.000Z",
  "publishedCounter": 1,
  "version": 50,
```

the Content Delivery API would return the published version (23). Note that CDA
responses do not include a version number.

Note: Tags are only available on the Contentful Management API, so they will not be exported if you provide a Contentful Delivery Token. Tags are a newer feature that not all users have access to.

### Output

#### `exportDir` [string] [default: current process working directory]

Defines the path for storing the export JSON file

#### `saveFile` [boolean] [default: true]

Save the export as a JSON file

#### `contentFile` [string]

The filename for the exported data

### Filtering

#### `includeDrafts` [boolean] [default: false]

Include drafts in the exported entries.

The `deliveryToken` option is ignored when `includeDrafts` is set to `true`:
the Content Delivery API only serves published content, so drafts cannot be
fetched through it.

#### `includeArchived` [boolean] [default: false]

Include archived entries in the exported entries
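
The default filtering implied by `includeDrafts` and `includeArchived` can be sketched using the `sys` fields that mark these states (no `publishedVersion` means draft; `archivedVersion` set means archived). This is illustrative, not the repo's exact code.

```javascript
// Keep only published, non-archived items unless the flags say otherwise.
function filterEntities (items, { includeDrafts = false, includeArchived = false } = {}) {
  return items.filter(({ sys }) => {
    if (sys.archivedVersion) return includeArchived // archived item
    if (!sys.publishedVersion) return includeDrafts // draft item
    return true // published item: always kept
  })
}
```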

#### `skipContentModel` [boolean] [default: false]

Skip exporting content models

#### `skipEditorInterfaces` [boolean] [default: false]

Skip exporting editor interfaces

#### `skipContent` [boolean] [default: false]

Skip exporting assets and entries.

#### `skipRoles` [boolean] [default: false]

Skip exporting roles and permissions

#### `skipTags` [boolean] [default: false]

Skip exporting tags

#### `skipWebhooks` [boolean] [default: false]

Skip exporting webhooks

#### `stripTags` [boolean] [default: false]

Untag assets and entries

#### `contentOnly` [boolean] [default: false]

Only export entries and assets

#### `queryEntries` [array]

Only export entries that match these queries

#### `queryAssets` [array]

Only export assets that match these queries

#### `downloadAssets` [boolean] [default: false]

Download actual asset files

### Connection

#### `host` [string] [default: 'api.contentful.com']

The Management API host

#### `hostDelivery` [string] [default: 'cdn.contentful.com']

The Delivery API host

#### `proxy` [string]

Proxy configuration in HTTP auth format: `host:port` or `user:password@host:port`
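
A sketch of parsing those proxy formats (the actual parsing is delegated to `contentful-batch-libs` proxy utilities; the returned field names here are illustrative):

```javascript
// Accepts `host:port` or `user:password@host:port`, with or without a scheme.
function parseProxy (proxyString) {
  // Prepend a scheme so the WHATWG URL parser accepts a bare host:port.
  const url = new URL(/^https?:\/\//.test(proxyString) ? proxyString : `http://${proxyString}`)
  return {
    host: url.hostname,
    port: Number(url.port),
    auth: url.username
      ? { username: url.username, password: url.password }
      : undefined
  }
}
```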

#### `rawProxy` [boolean]

Pass proxy config to Axios instead of creating a custom httpsAgent

#### `maxAllowedLimit` [number] [default: 1000]

The number of items per page per request

#### `headers` [object]

Additional headers to attach to the requests.
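
On the CLI, these arrive as `-H "Key: Value"` strings. A sketch of turning them into a headers object, in the spirit of `lib/utils/headers.js` (details may differ):

```javascript
// Split each string on the first colon; ignore malformed entries.
function parseHeaders (rawHeaders = []) {
  return rawHeaders.reduce((headers, header) => {
    const separatorIndex = header.indexOf(':')
    if (separatorIndex === -1) return headers
    const key = header.slice(0, separatorIndex).trim()
    const value = header.slice(separatorIndex + 1).trim()
    return { ...headers, [key]: value }
  }, {})
}
```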

### Other

#### `errorLogFile` [string]

Full path to the error log file

#### `useVerboseRenderer` [boolean] [default: false]

Display progress in new lines instead of displaying a busy spinner and the status in the same line. Useful for CI.

## :rescue_worker_helmet: Troubleshooting

### Proxy

Unable to connect to Contentful through your proxy? Try to set the `rawProxy` option to `true`.

```javascript
contentfulExport({
  proxy: 'https://cat:dog@example.com:1234',
  rawProxy: true,
  ...
})
```

### Error: 400 - Bad Request - Response size too big.

Contentful response sizes are limited (find more info in our [technical limit docs](https://www.contentful.com/developers/docs/technical-limits/)). In order to resolve this issue, limit the amount of entities received within a single request by setting the [`maxAllowedLimit`](#maxallowedlimit-number-default-1000) option:

```javascript
contentfulExport({
  maxAllowedLimit: 50,
  ...
})
```

### Embargoed Assets

If a space is configured to use the [embargoed assets feature](https://www.contentful.com/help/media/embargoed-assets/), certain options will need to be set to use the export/import tooling. When exporting content, the `downloadAssets` option must be set to `true`. This will download the asset files to your local machine. Then, when importing content ([using `contentful-import`](https://github.com/contentful/contentful-import)), the `uploadAssets` option must be set to `true` and the `assetsDirectory` must be set to the directory that contains all of the exported asset folders.

```javascript
const contentfulExport = require('contentful-export')

const options = {
  spaceId: '<space_id>',
  managementToken: '<content_management_api_key>',
  downloadAssets: true
}

contentfulExport(options)

## :card_file_box: Exported data structure

This is an overview of the exported data:

```json
{
  "contentTypes": [],
  "entries": [],
  "assets": [],
  "locales": [],
  "tags": [],
  "webhooks": [],
  "roles": [],
  "editorInterfaces": []
}
```

_Note:_ The Tags feature is not available to all users. If you do not have access to this feature, the tags array will always be empty.
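
Given the structure above, a quick way to inspect an export file is to count the entities under each top-level key (a minimal sketch):

```javascript
// Produce { contentTypes: n, entries: n, ... } from a parsed export file.
function summarize (exportData) {
  return Object.fromEntries(
    Object.entries(exportData).map(([type, items]) => [type, items.length])
  )
}
```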

## :warning: Limitations

- This tool currently does **not** support the export of space memberships.
- Webhooks with credentials are exported as normal webhooks; credentials must be re-added manually after import.
- If you have custom UI extensions, you need to reinstall them manually in the new space.

## :memo: Changelog

Read the [releases](https://github.com/contentful/contentful-export/releases) page for more information.

## :scroll: License

This project is licensed under the MIT license

## For AI Agents

<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
If you are an AI coding agent working in this repository, read [AGENTS.md](./AGENTS.md) first. It tells you where to find architectural context, development setup, decision records, and repo-specific rules.



================================================
FILE: babel.config.json
================================================
{
  "presets": [
    [
      "@babel/preset-env",
      {
        "targets": {
          "node": "12"
        }
      }
    ]
  ],
  "plugins": [
    "@babel/plugin-proposal-object-rest-spread",
    "add-module-exports"
  ]
}


================================================
FILE: bin/contentful-export
================================================
#!/usr/bin/env node

// eslint-disable-next-line
const runContentfulExport = require('../dist/index')
// eslint-disable-next-line
const usageParams = require('../dist/usageParams')

console.log('We moved the CLI version of this tool into our Contentful CLI.\nThis allows our users to use and install only one single CLI tool to get the full Contentful experience.\nFor more info please visit https://github.com/contentful/contentful-cli/tree/master/docs/space/export')

runContentfulExport(usageParams)
  .then(() => {
    process.exit(0)
  })
  .catch((err) => {
    if (err.name !== 'ContentfulMultiError') {
      console.error(err)
    }
    process.exit(1)
  })


================================================
FILE: catalog-info.yaml
================================================
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: contentful-export
  description: |
    This tool allows you to export a Contentful space to a JSON dump.
  annotations:
    circleci.com/project-slug: github/contentful/contentful-export
    github.com/project-slug: contentful/contentful-export
    contentful.com/ci-alert-slack: prd-ecosystem-dx-bots
    contentful.com/service-tier: "4"
  tags:
    - tier-4
spec:
  type: cli
  lifecycle: production
  owner: group:team-developer-experience


================================================
FILE: docs/ADRs/2016-06-20-babel-cjs-build-pipeline.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Use Babel to Transpile ES2015+ to CommonJS

## Status

Accepted

## Context

The project was created in June 2016, when ES module support in Node.js was not yet available. The codebase is written using ES2015+ syntax (import/export, arrow functions, object spread) but needed to produce CommonJS output compatible with the Node.js ecosystem and npm consumers.

## Decision

Use Babel (`@babel/preset-env` targeting Node 12, with `@babel/plugin-proposal-object-rest-spread` and `babel-plugin-add-module-exports`) to transpile `lib/` source to `dist/` as CommonJS modules. The `main` field in `package.json` points to `dist/index.js`.

The `add-module-exports` Babel plugin ensures that `module.exports = exports.default` is added, allowing CommonJS consumers to `require('contentful-export')` without `.default`.

## Consequences

- Source files use ES module syntax (`import`/`export`), but the published package is CJS-only.
- The `dist/` directory is a build artifact and must be compiled before testing or publishing.
- Babel config has not been updated to target modern Node versions (still targets Node 12 in `babel.config.json` despite `engines.node >= 22` in `package.json`). This is a known inconsistency -- the low target is harmless since all output runs on Node 22+.
- A future migration to native ESM or dual CJS/ESM publishing would require updating the build pipeline.


================================================
FILE: docs/ADRs/2016-06-20-semantic-release.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Use semantic-release for Automated Versioning and Publishing

## Status

Accepted

## Context

The project needed an automated release process that derives version numbers from commit messages and publishes to npm without manual intervention. Conventional Commits were already adopted as the commit message standard.

Context not found for the original decision rationale -- likely a default/inherited choice from Contentful's ecosystem tooling practices. The project has used semantic-release since its early days.

## Decision

Use `semantic-release` with the following plugins:
1. `@semantic-release/commit-analyzer` -- Determines version bump from commit messages. Custom rule: `build(deps)` commits trigger a patch release.
2. `@semantic-release/release-notes-generator` -- Generates changelog from commits.
3. `@semantic-release/npm` -- Publishes to npm.
4. `@semantic-release/github` -- Creates GitHub releases.

Two release branches are configured:
- `main` -- Stable releases
- `beta` -- Pre-release channel with `beta` dist-tag
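
The setup above can be sketched as a `release` configuration. Field names follow the semantic-release plugin APIs; the exact shape and values in this repo's `package.json` may differ:

```json
{
  "release": {
    "branches": [
      "main",
      { "name": "beta", "prerelease": true }
    ],
    "plugins": [
      ["@semantic-release/commit-analyzer", {
        "releaseRules": [
          { "type": "build", "scope": "deps", "release": "patch" }
        ]
      }],
      "@semantic-release/release-notes-generator",
      "@semantic-release/npm",
      "@semantic-release/github"
    ]
  }
}
```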

## Consequences

- Version numbers are never manually set (`"version": "0.0.0-determined-by-semantic-release"` in `package.json`).
- Every merge to `main` or `beta` that includes a `feat:`, `fix:`, or `build(deps):` commit triggers a release.
- Contributors must follow conventional commit format for their changes to appear in releases.
- Commitizen (`cz-conventional-changelog`) is configured to guide commit message formatting.


================================================
FILE: docs/ADRs/2024-12-02-typescript-check-only.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Add TypeScript Type Checking (No Emit)

## Status

Accepted

## Context

The codebase is written in JavaScript but lacked type safety. TypeScript's `checkJs` feature allows type-checking existing `.js` files without rewriting them in `.ts`. A `types.d.ts` file was already maintained for the public API surface.

Source: git commit `61c42cd` (2024-12-02) -- "chore: use tsconfig for type checking"

## Decision

Add a `tsconfig.json` with `"noEmit": true`, `"allowJs": true`, and `"checkJs": true` to enable TypeScript type-checking on the existing JavaScript source without changing the Babel build pipeline. The `npm run check` script runs `tsc`, and `npm run build` calls `check` before the Babel compilation step.

Strict mode is deliberately left off (`"strict": false`) to avoid a large-scale type fix effort on the legacy codebase.
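
A minimal sketch of such a check-only `tsconfig.json`. The `include` paths are illustrative; the repo's actual file may set additional options:

```json
{
  "compilerOptions": {
    "allowJs": true,
    "checkJs": true,
    "noEmit": true,
    "strict": false
  },
  "include": ["lib/**/*.js", "types.d.ts"]
}
```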

## Consequences

- Type errors are caught at build time without requiring a full TypeScript migration.
- The Babel build pipeline remains the sole compiler for producing `dist/` output.
- `types.d.ts` continues to be hand-maintained separately from the source.
- Future contributors can incrementally add JSDoc type annotations to improve type coverage.


================================================
FILE: docs/ADRs/2025-11-14-circleci-to-github-actions.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Migrate CI/CD from CircleCI to GitHub Actions

## Status

Accepted

## Context

The project originally used CircleCI for CI/CD. As part of a coordinated initiative (DX-541), publishing pipelines were migrated to GitHub Actions for consistency, improved secret management via HashiCorp Vault, and reduced tooling fragmentation.

Source: git commit `00b792d` (2025-11-14) -- "chore: migrate publishing pipeline to GHA from CircleCI [DX-541]"

## Decision

Replace the CircleCI pipeline with GitHub Actions workflows:
- `main.yaml` -- Orchestrates build, check, and release jobs
- `build.yaml` -- Installs dependencies, compiles with Babel, caches `dist/`
- `check.yaml` -- Runs unit and integration tests using cached build artifacts
- `release.yaml` -- Retrieves credentials from HashiCorp Vault and runs `semantic-release`
- `codeql.yaml` -- CodeQL scanning for GitHub Actions workflows
- `dependabot-approve-and-request-merge.yaml` -- Auto-approves Dependabot PRs

Secrets (GitHub token) are managed via Vault rather than repository-level GitHub secrets.

## Consequences

- CI/CD is fully GitHub-native, consistent with other Contentful ecosystem repositories.
- Release credentials are fetched at runtime from Vault, reducing secret sprawl.
- Build artifacts are cached between jobs using `actions/cache`.
- The `catalog-info.yaml` annotation still references the old CircleCI slug (`circleci.com/project-slug`), which is a stale reference.
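
The artifact caching between jobs can be sketched as an `actions/cache` step. Step name and cache key here are illustrative, not copied from the actual workflows:

```yaml
- name: Cache build output
  uses: actions/cache@v4
  with:
    path: dist
    key: dist-${{ github.sha }}
```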


================================================
FILE: docs/ADRs/2026-04-09-cma-v12-node-22-minimum.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Update to CMA.js v12 and Drop Node <22 Support

## Status

Accepted

## Context

The `contentful-management` SDK released v12 with breaking changes that required Node.js 22 or later. This was a coordinated update across multiple Contentful ecosystem repositories (contentful-export, contentful-import, contentful-cli, field-editors).

Source: git commit `6b9c061` (2026-04-09) -- "fix!: update to CMA.js v12 and drop Node <22 support [DX-781]"

## Decision

Upgrade `contentful-management` to v12, bump `contentful-batch-libs` to v11 (compatible with CMA v12), and set `engines.node` to `>=22`. The `.nvmrc` was updated to `24` to match the CI environment. This was a breaking change for consumers running older Node.js versions.
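
The corresponding `package.json` changes can be sketched as the following fragment; the exact semver ranges are assumptions:

```json
{
  "engines": {
    "node": ">=22"
  },
  "dependencies": {
    "contentful-management": "^12.0.0",
    "contentful-batch-libs": "^11.0.0"
  }
}
```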

## Consequences

- Node.js 22 is the minimum supported version for this package.
- Consumers on Node <22 must stay on the previous major version.
- CI workflows use Node 24 (matching `.nvmrc`).
- The `contentful-batch-libs` v11 update was a necessary companion change.
- This aligns the package with the rest of the Contentful ecosystem's Node.js support matrix.


================================================
FILE: docs/ADRs/README.md
================================================
<!-- Generated by seed-golden-context | Last updated: 2026-05-04 -->
# Architecture Decision Records

| ADR | Date | Status | Title |
|---|---|---|---|
| [001](./2016-06-20-babel-cjs-build-pipeline.md) | 2016-06-20 | Accepted | Use Babel to transpile ES2015+ to CommonJS |
| [002](./2016-06-20-semantic-release.md) | 2016-06-20 | Accepted | Use semantic-release for automated versioning and publishing |
| [003](./2024-12-02-typescript-check-only.md) | 2024-12-02 | Accepted | Add TypeScript type checking (no emit) |
| [004](./2025-11-14-circleci-to-github-actions.md) | 2025-11-14 | Accepted | Migrate CI/CD from CircleCI to GitHub Actions |
| [005](./2026-04-09-cma-v12-node-22-minimum.md) | 2026-04-09 | Accepted | Update to CMA.js v12 and drop Node <22 support |


================================================
FILE: docs/specs/.gitkeep
================================================


================================================
FILE: docs/specs/README.md
================================================
# Specs

Implementation-level specs for active work in this repo. Format: `YYYY-MM-<spec_name>.md`.
Specs here represent current intent -- archive or remove when work ships.


================================================
FILE: example-config.json
================================================
{
  "spaceId": "source space id",
  "environmentId": "master",
  "managementToken": "destination space management token",
  "deliveryToken": "token to export both entries and assets from the Contentful Delivery API",
  "exportDir": "/path/to/export/directory",
  "saveFile": true,
  "contentFile": "export.json",
  "includeDrafts": false,
  "includeArchived": false,
  "skipContentModel": false,
  "skipEditorInterfaces": false,
  "skipContent": false,
  "skipRoles": false,
  "skipWebhooks": false,
  "contentOnly": false,
  "queryEntries": [
    "content_type=<content_type_id>",
    "sys.id=<entry_id>",
    "limit=1000"
  ],
  "queryAssets": [
    "fields.title=Example"
  ],
  "downloadAssets": false,
  "host": "api.contentful.com",
  "proxy": "https://user:password@host:port",
  "rawProxy": false,
  "maxAllowedLimit": 1000,
  "errorLogFile": "/path/to/error.log",
  "useVerboseRenderer": false
}

================================================
FILE: example-config.test.json
================================================
{
  "spaceId": "source space id",
  "managementToken": "destination space management token"
}


================================================
FILE: lib/index.js
================================================
import { access } from 'fs'

import bfj from 'bfj'
import Promise from 'bluebird'
import Table from 'cli-table3'
import Listr from 'listr'
import UpdateRenderer from 'listr-update-renderer'
import VerboseRenderer from 'listr-verbose-renderer'
import startCase from 'lodash.startcase'
import mkdirp from 'mkdirp'
import { differenceInSeconds } from 'date-fns/differenceInSeconds'
import { formatDistance } from 'date-fns/formatDistance'

import {
  setupLogging,
  displayErrorLog,
  wrapTask,
  writeErrorLogFile
} from 'contentful-batch-libs'

import downloadAssets from './tasks/download-assets'
import getSpaceData from './tasks/get-space-data'
import initClient from './tasks/init-client'

import parseOptions from './parseOptions'

const accessP = Promise.promisify(access)

const tableOptions = {
  // remove ANSI color codes for better CI/CD compatibility
  style: { head: [], border: [] }
}

function createListrOptions (options) {
  if (options.useVerboseRenderer) {
    return {
      renderer: VerboseRenderer
    }
  }
  return {
    renderer: UpdateRenderer,
    collapse: false
  }
}

export default function runContentfulExport (params) {
  const log = []
  const options = parseOptions(params)

  const listrOptions = createListrOptions(options)

  // Setup custom error listener to store errors for later
  setupLogging(log)

  const tasks = new Listr(
    [
      {
        title: 'Initialize client',
        task: wrapTask((ctx) => {
          try {
            // CMA client
            ctx.client = initClient(options)
            if (options.deliveryToken && !options.includeDrafts) {
              // CDA client for fetching only public entries
              ctx.cdaClient = initClient(options, true)
            }
            return Promise.resolve()
          } catch (err) {
            return Promise.reject(err)
          }
        })
      },
      {
        title: 'Fetching data from space',
        task: (ctx) => {
          return getSpaceData({
            client: ctx.client,
            cdaClient: ctx.cdaClient,
            spaceId: options.spaceId,
            environmentId: options.environmentId,
            maxAllowedLimit: options.maxAllowedLimit,
            includeDrafts: options.includeDrafts,
            includeArchived: options.includeArchived,
            skipContentModel: options.skipContentModel,
            skipEditorInterfaces: options.skipEditorInterfaces,
            skipContent: options.skipContent,
            skipWebhooks: options.skipWebhooks,
            skipRoles: options.skipRoles,
            skipTags: options.skipTags,
            stripTags: options.stripTags,
            listrOptions,
            queryEntries: options.queryEntries,
            queryAssets: options.queryAssets
          })
        }
      },
      {
        title: 'Download assets',
        task: wrapTask(downloadAssets(options)),
        skip: (ctx) =>
          !options.downloadAssets ||
          !Object.prototype.hasOwnProperty.call(ctx.data, 'assets')
      },
      {
        title: 'Write export log file',
        task: () => {
          return new Listr([
            {
              title: 'Lookup directory to store the logs',
              task: (ctx) => {
                return accessP(options.exportDir)
                  .then(() => {
                    ctx.logDirectoryExists = true
                  })
                  .catch(() => {
                    ctx.logDirectoryExists = false
                  })
              }
            },
            {
              title: 'Create log directory',
              task: () => {
                return mkdirp(options.exportDir)
              },
              // only create the directory when the lookup above found it missing
              skip: (ctx) => ctx.logDirectoryExists
            },
            {
              title: 'Writing data to file',
              task: (ctx) => {
                return bfj.write(options.logFilePath, ctx.data, {
                  circular: 'ignore',
                  space: 2
                })
              }
            }
          ])
        },
        skip: () => !options.saveFile
      }
    ],
    listrOptions
  )

  return tasks
    .run({
      data: {}
    })
    .then((ctx) => {
      const resultTypes = Object.keys(ctx.data)
      if (resultTypes.length) {
        const resultTable = new Table(tableOptions)

        resultTable.push([{ colSpan: 2, content: 'Exported entities' }])

        resultTypes.forEach((type) => {
          resultTable.push([startCase(type), ctx.data[type].length])
        })

        console.log(resultTable.toString())
      } else {
        console.log('No data was exported')
      }

      if ('assetDownloads' in ctx) {
        const downloadsTable = new Table(tableOptions)
        downloadsTable.push([
          { colSpan: 2, content: 'Asset file download results' }
        ])
        downloadsTable.push(['Successful', ctx.assetDownloads.successCount])
        downloadsTable.push(['Warnings ', ctx.assetDownloads.warningCount])
        downloadsTable.push(['Errors ', ctx.assetDownloads.errorCount])
        console.log(downloadsTable.toString())
      }

      const endTime = new Date()
      const durationHuman = formatDistance(endTime, options.startTime)
      const durationSeconds = differenceInSeconds(endTime, options.startTime)

      console.log(`The export took ${durationHuman} (${durationSeconds}s)`)
      if (options.saveFile) {
        console.log(
          `\nStored space data to json file at: ${options.logFilePath}`
        )
      }
      return ctx.data
    })
    .catch((err) => {
      log.push({
        ts: new Date().toJSON(),
        level: 'error',
        error: err
      })
    })
    .then((data) => {
      // @todo this should live in batch libs
      const errorLog = log.filter(
        (logMessage) =>
          logMessage.level !== 'info' && logMessage.level !== 'warning'
      )
      const displayLog = log.filter(
        (logMessage) => logMessage.level !== 'info'
      )
      displayErrorLog(displayLog)

      if (errorLog.length) {
        return writeErrorLogFile(options.errorLogFile, errorLog).then(() => {
          const multiError = new Error('Errors occurred')
          multiError.name = 'ContentfulMultiError'
          Object.assign(multiError, { errors: errorLog })
          throw multiError
        })
      }

      console.log('The export was successful.')

      return data
    })
}


================================================
FILE: lib/parseOptions.js
================================================
import { addSequenceHeader, agentFromProxy, proxyStringToObject } from 'contentful-batch-libs'
import { format } from 'date-fns/format'
import { resolve } from 'path'
import qs from 'querystring'

import { version } from '../package.json'
import { getHeadersConfig } from './utils/headers'

export default function parseOptions (params) {
  const defaultOptions = {
    environmentId: 'master',
    exportDir: process.cwd(),
    includeDrafts: false,
    includeArchived: false,
    skipRoles: false,
    skipContentModel: false,
    skipEditorInterfaces: false,
    skipContent: false,
    skipWebhooks: false,
    skipTags: false,
    stripTags: false,
    maxAllowedLimit: 1000,
    saveFile: true,
    useVerboseRenderer: false,
    rawProxy: false
  }

  const configFile = params.config
    // eslint-disable-next-line @typescript-eslint/no-require-imports
    ? require(resolve(process.cwd(), params.config))
    : {}

  const options = {
    ...defaultOptions,
    ...configFile,
    ...params,
    headers: addSequenceHeader(params.headers || getHeadersConfig(params.header))
  }

  // Validation
  if (!options.spaceId) {
    throw new Error('The `spaceId` option is required.')
  }

  if (!options.managementToken) {
    throw new Error('The `managementToken` option is required.')
  }

  options.startTime = new Date()
  options.contentFile = options.contentFile || `contentful-export-${options.spaceId}-${options.environmentId}-${format(options.startTime, "yyyy-MM-dd'T'HH-mm-ss")}.json`

  options.logFilePath = resolve(options.exportDir, options.contentFile)

  if (!options.errorLogFile) {
    options.errorLogFile = resolve(options.exportDir, `contentful-export-error-log-${options.spaceId}-${options.environmentId}-${format(options.startTime, "yyyy-MM-dd'T'HH-mm-ss")}.json`)
  } else {
    options.errorLogFile = resolve(process.cwd(), options.errorLogFile)
  }

  // Further processing
  options.accessToken = options.managementToken

  if (options.proxy) {
    if (typeof options.proxy === 'string') {
      const proxySimpleExp = /.+:\d+/
      const proxyAuthExp = /.+:.+@.+:\d+/
      if (!(proxySimpleExp.test(options.proxy) || proxyAuthExp.test(options.proxy))) {
        throw new Error('Please provide the proxy config in the following format:\nhost:port or user:password@host:port')
      }
      options.proxy = proxyStringToObject(options.proxy)
    }

    if (!options.rawProxy) {
      options.httpsAgent = agentFromProxy(options.proxy)
      delete options.proxy
    }
  }

  if (options.queryEntries && options.queryEntries.length > 0) {
    const querystr = options.queryEntries.join('&')
    options.queryEntries = qs.parse(querystr)
  }

  if (options.queryAssets && options.queryAssets.length > 0) {
    const querystr = options.queryAssets.join('&')
    options.queryAssets = qs.parse(querystr)
  }

  if (options.contentOnly) {
    options.skipRoles = true
    options.skipContentModel = true
    options.skipWebhooks = true
  }

  options.application = options.managementApplication || `contentful.export/${version}`
  options.feature = options.managementFeature || 'library-export'
  return options
}


================================================
FILE: lib/tasks/download-assets.js
================================================
import Promise from 'bluebird'
import { getEntityName } from 'contentful-batch-libs'
import figures from 'figures'
import { createWriteStream, promises as fs } from 'fs'
import path from 'path'
import { pipeline } from 'stream'
import { promisify } from 'util'
import { calculateExpiryTimestamp, isEmbargoedAsset, signUrl } from '../utils/embargoedAssets'
import axios from 'axios'

const streamPipeline = promisify(pipeline)

/**
 * @param {Object} options - The options for downloading the asset.
 * @param {string} options.url - The URL of the asset to download.
 * @param {string} options.directory - The directory where the asset should be saved.
 * @param {import('axios').AxiosInstance} options.httpClient - The HTTP client to use for downloading the asset.
 */
async function downloadAsset ({ url, directory, httpClient }) {
  // handle URLs without a protocol
  if (url.startsWith('//')) {
    url = 'https:' + url
  }

  // build local file path from the url for the download
  const parsedUrl = new URL(url)
  const decodedPathname = decodeURIComponent(parsedUrl.pathname)
  const localFile = path.join(directory, parsedUrl.host, decodedPathname)

  // ensure directory exists and create file stream
  await fs.mkdir(path.dirname(localFile), { recursive: true })
  const file = createWriteStream(localFile)

  try {
    // download asset
    const assetRequest = await httpClient.get(url, {
      responseType: 'stream',
      transformResponse: [(data) => data]
    })

    // Wait for stream to be consumed before returning local file
    await streamPipeline(assetRequest.data, file)
    return localFile
  } catch (e) {
    /**
     * @type {import('axios').AxiosError}
     */
    const axiosError = e
    // network-level failures (timeouts, DNS errors) carry no response object
    if (axiosError.response) {
      throw new Error(`error response status: ${axiosError.response.status}`)
    }
    throw new Error(`error downloading asset: ${axiosError.message}`)
  }
}

export default function downloadAssets (options) {
  return (ctx, task) => {
    let successCount = 0
    let warningCount = 0
    let errorCount = 0

    const httpClient = axios.create({
      headers: options.headers,
      timeout: options.timeout,
      httpAgent: options.httpAgent,
      httpsAgent: options.httpsAgent,
      proxy: options.proxy
    })

    return Promise.map(ctx.data.assets, (asset) => {
      const entityName = getEntityName(asset)
      if (!asset.fields.file) {
        task.output = `${figures.warning} asset ${entityName} has no file(s)`
        warningCount++
        return
      }
      const locales = Object.keys(asset.fields.file)
      return Promise.mapSeries(locales, (locale) => {
        const url = asset.fields.file[locale].url
        if (!url) {
          task.output = `${figures.cross} asset '${entityName}' does not contain a URL at path asset.fields.file[${locale}].url`
          errorCount++

          return Promise.resolve()
        }

        let startingPromise = Promise.resolve({ url, directory: options.exportDir, httpClient })

        if (isEmbargoedAsset(url)) {
          const { host, accessToken, spaceId, environmentId } = options
          const expiresAtMs = calculateExpiryTimestamp()

          startingPromise = signUrl(host, accessToken, spaceId, environmentId, url, expiresAtMs, httpClient)
            .then((signedUrl) => ({ url: signedUrl, directory: options.exportDir, httpClient }))
        }

        return startingPromise
          .then(downloadAsset)
          .then(() => {
            task.output = `${figures.tick} downloaded ${entityName} (${url})`
            successCount++
          })
          .catch((error) => {
            task.output = `${figures.cross} error downloading ${url}: ${error.message}`
            errorCount++
          })
      })
    }, {
      concurrency: 6
    })
      .then(() => {
        ctx.assetDownloads = {
          successCount,
          warningCount,
          errorCount
        }
      })
  }
}


================================================
FILE: lib/tasks/get-space-data.js
================================================
import Promise from 'bluebird'
import { logEmitter, wrapTask } from 'contentful-batch-libs'
import Listr from 'listr'
import verboseRenderer from 'listr-verbose-renderer'

const MAX_ALLOWED_LIMIT = 1000
let pageLimit = MAX_ALLOWED_LIMIT

/**
 * Gets all the content from a space via the management API. This includes
 * content in draft state.
 */
export default function getFullSourceSpace ({
  client,
  cdaClient,
  spaceId,
  environmentId = 'master',
  skipContentModel,
  skipContent,
  skipWebhooks,
  skipRoles,
  skipEditorInterfaces,
  skipTags,
  stripTags,
  includeDrafts,
  includeArchived,
  maxAllowedLimit,
  listrOptions,
  queryEntries,
  queryAssets
}) {
  pageLimit = maxAllowedLimit || MAX_ALLOWED_LIMIT
  listrOptions = listrOptions || {
    renderer: verboseRenderer
  }

  return new Listr([
    {
      title: 'Connecting to space',
      task: wrapTask((ctx) => {
        return client.getSpace(spaceId)
          .then((space) => {
            ctx.space = space
            return space.getEnvironment(environmentId)
          })
          .then((environment) => {
            ctx.environment = environment
          })
      })
    },
    {
      title: 'Fetching content types data',
      task: wrapTask((ctx) => {
        return pagedGet({ source: ctx.environment, method: 'getContentTypes' })
          .then(extractItems)
          .then((items) => {
            ctx.data.contentTypes = items
          })
      }),
      skip: () => skipContentModel
    },
    {
      title: 'Fetching tags data',
      task: wrapTask((ctx) => {
        return pagedGet({ source: ctx.environment, method: 'getTags' })
          .then(extractItems)
          .then((items) => {
            ctx.data.tags = items
          })
          .catch(() => {
            ctx.data.tags = []
          })
      }),
      skip: () => skipTags
    },
    {
      title: 'Fetching editor interfaces data',
      task: wrapTask((ctx) => {
        return getEditorInterfaces(ctx.data.contentTypes)
          .then((editorInterfaces) => {
            ctx.data.editorInterfaces = editorInterfaces.filter((editorInterface) => {
              return editorInterface !== null
            })
          })
      }),
      skip: (ctx) => skipContentModel || skipEditorInterfaces || (ctx.data.contentTypes.length === 0 && 'Skipped since no content types downloaded')
    },
    {
      title: 'Fetching content entries data',
      task: wrapTask((ctx) => {
        const source = cdaClient?.withAllLocales || ctx.environment
        if (cdaClient) {
          // let's not fetch children when using Content Delivery API
          queryEntries = queryEntries || {}
          queryEntries.include = 0
        }
        return pagedGet({ source, method: 'getEntries', query: queryEntries })
          .then(extractItems)
          .then((items) => filterDrafts(items, includeDrafts, cdaClient))
          .then((items) => filterArchived(items, includeArchived))
          .then((items) => removeTags(items, stripTags))
          .then((items) => {
            ctx.data.entries = items
          })
      }),
      skip: () => skipContent
    },
    {
      title: 'Fetching assets data',
      task: wrapTask((ctx) => {
        const source = cdaClient?.withAllLocales || ctx.environment
        queryAssets = queryAssets || {}
        return pagedGet({ source, method: 'getAssets', query: queryAssets })
          .then(extractItems)
          .then((items) => filterDrafts(items, includeDrafts, cdaClient))
          .then((items) => filterArchived(items, includeArchived))
          .then((items) => removeTags(items, stripTags))
          .then((items) => {
            ctx.data.assets = items
          })
      }),
      skip: () => skipContent
    },
    {
      title: 'Fetching locales data',
      task: wrapTask((ctx) => {
        return pagedGet({ source: ctx.environment, method: 'getLocales' })
          .then(extractItems)
          .then((items) => {
            ctx.data.locales = items
          })
      }),
      skip: () => skipContentModel
    },
    {
      title: 'Fetching webhooks data',
      task: wrapTask((ctx) => {
        return pagedGet({ source: ctx.space, method: 'getWebhooks' })
          .then(extractItems)
          .then((items) => {
            ctx.data.webhooks = items
          })
      }),
      skip: () => skipWebhooks || (environmentId !== 'master' && 'Webhooks can only be exported from master environment')
    },
    {
      title: 'Fetching roles data',
      task: wrapTask((ctx) => {
        return pagedGet({ source: ctx.space, method: 'getRoles' })
          .then(extractItems)
          .then((items) => {
            ctx.data.roles = items
          })
      }),
      skip: () => skipRoles || (environmentId !== 'master' && 'Roles can only be exported from master environment')
    }
  ], listrOptions)
}

function getEditorInterfaces (contentTypes) {
  return Promise.map(contentTypes, (contentType) => {
    return contentType.getEditorInterface()
      .then((editorInterface) => {
        logEmitter.emit('info', `Fetched editor interface for ${contentType.name}`)
        return editorInterface
      })
      .catch(() => {
        // Old content types may not have an editor interface; this is handled
        // at a later stage and should not stop the data fetching process.
        logEmitter.emit('warning', `No editor interface found for ${contentType.name}`)
        return Promise.resolve(null)
      })
  }, {
    concurrency: 6
  })
}

/**
 * Gets all the existing entities based on pagination parameters.
 * The first call will have no aggregated response. Subsequent calls will
 * concatenate the new responses to the original one.
 */
function pagedGet ({ source, method, skip = 0, aggregatedResponse = null, query = null }) {
  const userQueryLimit = query && query.limit
  const fetchedTotal = aggregatedResponse && aggregatedResponse.items.length
  const limit = userQueryLimit ? Math.min(pageLimit, userQueryLimit - fetchedTotal) : pageLimit

  const requestQuery = Object.assign({},
    {
      skip,
      order: 'sys.createdAt,sys.id'
    },
    query,
    {
      limit
    }
  )

  return source[method](requestQuery)
    .then((response) => {
      if (!aggregatedResponse) {
        aggregatedResponse = response
      } else {
        aggregatedResponse.items = aggregatedResponse.items.concat(response.items)
      }

      const totalItemsLength = aggregatedResponse.items.length
      const total = response.total

      logPagingStatus(response, requestQuery, userQueryLimit)

      const gotAllQueryLimitedItems = userQueryLimit && totalItemsLength >= userQueryLimit
      const gotAllItems = totalItemsLength >= total
      const gotNoItems = totalItemsLength <= 0
      if (gotAllQueryLimitedItems || gotAllItems || gotNoItems) {
        return aggregatedResponse
      }
      return pagedGet({ source, method, skip: skip + response.items.length, aggregatedResponse, query })
    })
}

function logPagingStatus (response, requestQuery, userLimit) {
  const { total, limit, items } = response
  const pagedItemsLength = items.length

  // the API may impose a lower limit than our pageLimit/queryLimit of 1000 (e.g. for locales)
  const imposedLimit = limit || requestQuery.limit
  const limitedTotal = userLimit ? Math.min(userLimit, total) : total
  const page = Math.ceil(requestQuery.skip / imposedLimit) + 1
  const pages = Math.ceil(limitedTotal / imposedLimit)
  logEmitter.emit('info', `Fetched ${pagedItemsLength} of ${total} items (Page ${page}/${pages})`)
}

function extractItems (response) {
  return response.items
}

function filterDrafts (items, includeDrafts, cdaClient) {
  // CDA filters drafts based on host, no need to do filtering here
  return (includeDrafts || cdaClient) ? items : items.filter((item) => !!item.sys.publishedVersion || !!item.sys.archivedVersion)
}

function filterArchived (items, includeArchived) {
  return includeArchived ? items : items.filter((item) => !item.sys.archivedVersion)
}

function removeTags (items, stripTags) {
  if (stripTags) {
    items.forEach(item => {
      if (item.metadata?.tags) {
        item.metadata.tags = []
      }
    })
  }
  return items
}


================================================
FILE: lib/tasks/init-client.js
================================================
import { createClient as createCdaClient } from 'contentful'
import { logEmitter } from 'contentful-batch-libs'
import { createClient as createCmaClient } from 'contentful-management'

function logHandler (level, data) {
  logEmitter.emit(level, data)
}

export default function initClient (opts, useCda = false) {
  const defaultOpts = {
    timeout: 10000,
    logHandler
  }
  const config = {
    ...defaultOpts,
    ...opts
  }
  if (useCda) {
    const cdaConfig = {
      ...config,
      space: config.spaceId,
      accessToken: config.deliveryToken,
      environment: config.environmentId,
      host: config.hostDelivery
    }
    return createCdaClient(cdaConfig).withoutLinkResolution
  }
  return createCmaClient(config, { type: 'legacy' })
}
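The CDA branch of `initClient` above mostly remaps option names before handing them to the SDK. A sketch of just that mapping, with the SDK call omitted and sample values hypothetical:

```javascript
// Option remapping performed by the CDA branch of initClient above
// (createCdaClient call omitted; sample values are hypothetical).
const opts = {
  spaceId: 'someSpaceId',
  deliveryToken: 'someDeliveryToken',
  environmentId: 'master',
  hostDelivery: 'cdn.contentful.com'
}
const cdaConfig = {
  timeout: 10000, // default applied before user options are spread in
  ...opts,
  space: opts.spaceId,
  accessToken: opts.deliveryToken,
  environment: opts.environmentId,
  host: opts.hostDelivery
}
console.log(cdaConfig.space, cdaConfig.host) // → someSpaceId cdn.contentful.com
```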


================================================
FILE: lib/usageParams.js
================================================
import yargs from 'yargs'
import packageFile from '../package.json'

export default yargs
  .version(packageFile.version || 'Version only available on installed package')
  .usage('Usage: $0 [options]')
  .option('space-id', {
    describe: 'ID of Space with source data',
    type: 'string',
    demand: true
  })
  .option('environment-id', {
    describe: 'ID of Environment with source data',
    type: 'string',
    default: 'master'
  })
  .option('management-token', {
    describe: 'Contentful Management API token for the space to be exported',
    type: 'string',
    demand: true
  })
  .option('delivery-token', {
    describe: 'Contentful Content Delivery API token for the space to be exported',
    type: 'string'
  })
  .option('export-dir', {
    describe: 'Defines the path for storing the export JSON file (default path is the current directory)',
    type: 'string'
  })
  .option('include-drafts', {
    describe: 'Include drafts in the exported entries',
    type: 'boolean',
    default: false
  })
  .option('include-archived', {
    describe: 'Include archived entries in the exported entries',
    type: 'boolean',
    default: false
  })
  .option('skip-content-model', {
    describe: 'Skip exporting content models',
    type: 'boolean',
    default: false
  })
  .option('skip-content', {
    describe: 'Skip exporting assets and entries',
    type: 'boolean',
    default: false
  })
  .option('skip-roles', {
    describe: 'Skip exporting roles and permissions',
    type: 'boolean',
    default: false
  })
  .option('skip-tags', {
    describe: 'Skip exporting tags',
    type: 'boolean',
    default: false
  })
  .option('skip-webhooks', {
    describe: 'Skip exporting webhooks',
    type: 'boolean',
    default: false
  })
  .option('strip-tags', {
    describe: 'Untag assets and entries',
    type: 'boolean',
    default: false
  })
  .option('content-only', {
    describe: 'Only export entries and assets',
    type: 'boolean',
    default: false
  })
  .option('download-assets', {
    describe: 'With this flag, asset files will also be downloaded',
    type: 'boolean'
  })
  .option('max-allowed-limit', {
    describe: 'How many items per page per request',
    type: 'number',
    default: 1000
  })
  .option('host', {
    describe: 'Management API host',
    type: 'string',
    default: 'api.contentful.com'
  })
  .option('host-delivery', {
    describe: 'Delivery API host',
    type: 'string',
    default: 'cdn.contentful.com'
  })
  .option('proxy', {
    describe: 'Proxy configuration in HTTP auth format: [http|https]://host:port or [http|https]://user:password@host:port',
    type: 'string'
  })
  .option('raw-proxy', {
    describe: 'Pass proxy config to Axios instead of creating a custom httpsAgent',
    type: 'boolean',
    default: false
  })
  .option('error-log-file', {
    describe: 'Full path to the error log file',
    type: 'string'
  })
  .option('query-entries', {
    describe: 'Exports only entries that match these queries',
    type: 'array'
  })
  .option('query-assets', {
    describe: 'Exports only assets that match these queries',
    type: 'array'
  })
  .option('content-file', {
    describe: 'The filename for the exported data',
    type: 'string'
  })
  .option('save-file', {
    describe: 'Save the export as a JSON file',
    type: 'boolean',
    default: true
  })
  .option('use-verbose-renderer', {
    describe: 'Display progress in new lines instead of displaying a busy spinner and the status in the same line. Useful for CI.',
    type: 'boolean',
    default: false
  })
  .option('header', {
    alias: 'H',
    type: 'string',
    describe: 'Pass an additional HTTP header'
  })
  .config('config', 'An optional configuration JSON file containing all the options for a single run')
  .argv


================================================
FILE: lib/utils/embargoedAssets.js
================================================
import jwt from 'jsonwebtoken'

const SIX_HOURS_IN_MS = 6 * 60 * 60 * 1000
const assetKeyCache = new Map()

/**
 * @param {string} host - The Contentful API host.
 * @param {string} accessToken - The access token for the Contentful API.
 * @param {string} spaceId - The ID of the Contentful space.
 * @param {string} environmentId - The ID of the Contentful environment.
 * @param {number} expiresAtMs - The expiration time in milliseconds.
 * @param {import('axios').AxiosInstance} httpClient - The HTTP client to use for requests.
 */
function createAssetKey (host, accessToken, spaceId, environmentId, expiresAtMs, httpClient) {
  return httpClient(`https://${host}/spaces/${spaceId}/environments/${environmentId}/asset_keys`, {
    method: 'POST',
    data: JSON.stringify({
      expiresAt: Math.floor(expiresAtMs / 1000) // in seconds
    }),
    headers: {
      Authorization: `Bearer ${accessToken}`,
      'Content-Type': 'application/json'
    }
  })
}

export const shouldCreateNewCacheItem = (cacheItem, currentExpiresAtMs) =>
  !cacheItem || currentExpiresAtMs - cacheItem.expiresAtMs > SIX_HOURS_IN_MS
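The cache-refresh predicate above in a runnable form: a cached asset key is reused unless there is none, or the requested expiry is more than six hours past the cached one. The name `needsNew` and the timestamps are hypothetical:

```javascript
// Same decision logic as shouldCreateNewCacheItem above, inlined so the
// sketch is self-contained. Timestamps are hypothetical.
const SIX_HOURS = 6 * 60 * 60 * 1000
const needsNew = (cacheItem, wantedExpiryMs) =>
  !cacheItem || wantedExpiryMs - cacheItem.expiresAtMs > SIX_HOURS

console.log(needsNew(undefined, Date.now()))             // → true (cache miss)
console.log(needsNew({ expiresAtMs: 0 }, SIX_HOURS))     // → false (within window)
console.log(needsNew({ expiresAtMs: 0 }, SIX_HOURS + 1)) // → true (too far ahead)
```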

async function createCachedAssetKey (host, accessToken, spaceId, environmentId, minExpiresAtMs, httpClient) {
  const cacheKey = `${host}:${spaceId}:${environmentId}`
  let cacheItem = assetKeyCache.get(cacheKey)

  if (shouldCreateNewCacheItem(cacheItem, minExpiresAtMs)) {
    const expiresAtMs = calculateExpiryTimestamp()

    if (minExpiresAtMs > expiresAtMs) {
      throw new Error(`Cannot fetch an asset key so far in the future: ${minExpiresAtMs} > ${expiresAtMs}`)
    }

    try {
      const assetKeyResponse = await createAssetKey(host, accessToken, spaceId, environmentId, expiresAtMs, httpClient)
      cacheItem = { expiresAtMs, result: assetKeyResponse.data }
      assetKeyCache.set(cacheKey, cacheItem)
    } catch (err) {
      // If we encounter an error, make sure to clear the cache item if this is the most recent fetch.
      const curCacheItem = assetKeyCache.get(cacheKey)
      if (curCacheItem === cacheItem) {
        assetKeyCache.delete(cacheKey)
      }

      return Promise.reject(err)
    }
  }

  return cacheItem.result
}

function generateSignedToken (secret, urlWithoutQueryParams, expiresAtMs) {
  // Convert expiresAtMs to seconds, if defined
  const exp = expiresAtMs ? Math.floor(expiresAtMs / 1000) : undefined
  return jwt.sign({
    sub: urlWithoutQueryParams,
    exp
  }, secret, { algorithm: 'HS256' })
}

function generateSignedUrl (policy, secret, url, expiresAtMs) {
  const parsedUrl = new URL(url)

  const urlWithoutQueryParams = parsedUrl.origin + parsedUrl.pathname
  const token = generateSignedToken(secret, urlWithoutQueryParams, expiresAtMs)

  parsedUrl.searchParams.set('token', token)
  parsedUrl.searchParams.set('policy', policy)

  return parsedUrl.toString()
}

export function isEmbargoedAsset (url) {
  const pattern = /((images)|(assets)|(downloads)|(videos))\.secure\./
  return pattern.test(url)
}
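The host test above boils down to checking for a `*.secure.*` subdomain. A runnable sketch using the same pattern (the sample URLs are hypothetical):

```javascript
// Same host test as isEmbargoedAsset above: embargoed assets are served
// from *.secure.* subdomains. Sample URLs are hypothetical.
const pattern = /((images)|(assets)|(downloads)|(videos))\.secure\./

console.log(pattern.test('https://images.secure.contentful.com/space/asset.png')) // → true
console.log(pattern.test('https://images.contentful.com/space/asset.png'))        // → false
```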

export function calculateExpiryTimestamp () {
  return Date.now() + SIX_HOURS_IN_MS
}

/**
 * @param {string} host - The Contentful API host.
 * @param {string} accessToken - The access token for the Contentful API.
 * @param {string} spaceId - The ID of the Contentful space.
 * @param {string} environmentId - The ID of the Contentful environment.
 * @param {string} url - The URL to be signed.
 * @param {number} expiresAtMs - The expiration time in milliseconds.
 * @param {import('axios').AxiosInstance} httpClient - The HTTP client to use for requests.
 */
export function signUrl (host, accessToken, spaceId, environmentId, url, expiresAtMs, httpClient) {
  // handle urls without protocol
  if (url.startsWith('//')) {
    url = 'https:' + url
  }

  return createCachedAssetKey(host, accessToken, spaceId, environmentId, expiresAtMs, httpClient)
    .then(({ policy, secret }) => generateSignedUrl(policy, secret, url, expiresAtMs))
}


================================================
FILE: lib/utils/headers.js
================================================
/**
 * Turn header option into an object. Invalid header values
 * are ignored.
 *
 * @example
 * getHeadersConfig('Accept: Any')
 * // -> {Accept: 'Any'}
 *
 * @example
 * getHeadersConfig(['Accept: Any', 'X-Version: 1'])
 * // -> {Accept: 'Any', 'X-Version': '1'}
 *
 * @param value {string|string[]}
 */
export function getHeadersConfig (value) {
  if (!value) {
    return {}
  }

  const values = Array.isArray(value) ? value : [value]

  return values.reduce((headers, value) => {
    value = value.trim()

    const separatorIndex = value.indexOf(':')

    // Invalid header format
    if (separatorIndex === -1) {
      return headers
    }

    const headerKey = value.slice(0, separatorIndex).trim()
    const headerValue = value.slice(separatorIndex + 1).trim()

    return {
      ...headers,
      [headerKey]: headerValue
    }
  }, {})
}
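A condensed, runnable version of the parsing above (the function name `parseHeaders` is hypothetical; the behavior mirrors `getHeadersConfig`, including dropping strings with no colon):

```javascript
// Condensed version of getHeadersConfig above: split on the first colon,
// trim both halves, ignore values that contain no colon at all.
function parseHeaders (value) {
  if (!value) return {}
  const values = Array.isArray(value) ? value : [value]
  return values.reduce((headers, raw) => {
    const trimmed = raw.trim()
    const separatorIndex = trimmed.indexOf(':')
    if (separatorIndex === -1) return headers // invalid header format, ignore
    return {
      ...headers,
      [trimmed.slice(0, separatorIndex).trim()]: trimmed.slice(separatorIndex + 1).trim()
    }
  }, {})
}

console.log(parseHeaders(['Accept: Any', 'X-Version: 1', 'broken']))
// → { Accept: 'Any', 'X-Version': '1' }
```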


================================================
FILE: package.json
================================================
{
  "name": "contentful-export",
  "version": "0.0.0-determined-by-semantic-release",
  "description": "this tool allows you to export a space to a JSON dump",
  "main": "dist/index.js",
  "types": "types.d.ts",
  "engines": {
    "node": ">=22"
  },
  "bin": {
    "contentful-export": "./bin/contentful-export"
  },
  "scripts": {
    "build": "npm run clean && npm run check && babel lib --out-dir dist",
    "build:watch": "babel lib --out-dir dist --watch",
    "check": "tsc",
    "clean": "rimraf dist && rimraf coverage",
    "lint": "eslint lib bin/* types.d.ts",
    "lint:fix": "npm run lint -- --fix",
    "pretest": "npm run lint && npm run build && rimraf ./test/integration/tmp",
    "test": "npm run test:unit && npm run test:integration",
    "test:unit": "jest --testPathPattern=test/unit --coverage",
    "test:unit:debug": "node --inspect-brk ./node_modules/.bin/jest --runInBand --watch --testPathPattern=test/unit",
    "test:unit:watch": "npm run test:unit -- --watch",
    "test:integration": "jest --testPathPattern=test/integration",
    "test:integration:debug": "node --inspect-brk ./node_modules/.bin/jest --runInBand --watch --testPathPattern=test/integration",
    "test:integration:watch": "npm run test:integration -- --watch",
    "semantic-release": "semantic-release",
    "prepublishOnly": "npm run build",
    "postpublish": "npm run clean",
    "precommit": "npm run lint",
    "prepush": "npm run test"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/contentful/contentful-export.git"
  },
  "keywords": [
    "contentful",
    "contentful-export"
  ],
  "author": "Contentful <opensource@contentful.com>",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/contentful/contentful-export/issues"
  },
  "dependencies": {
    "axios": "^1.13.5",
    "bfj": "^9.1.3",
    "bluebird": "^3.3.3",
    "cli-table3": "^0.6.0",
    "contentful": "^11.5.10",
    "contentful-batch-libs": "^11.0.0",
    "contentful-management": "^12.2.0",
    "date-fns": "^4.1.0",
    "figures": "^3.2.0",
    "jsonwebtoken": "^9.0.0",
    "listr": "^0.14.1",
    "listr-update-renderer": "^0.5.0",
    "listr-verbose-renderer": "^0.6.0",
    "lodash.startcase": "^4.4.0",
    "mkdirp": "^2.0.0",
    "yargs": "^18.0.0"
  },
  "devDependencies": {
    "@babel/cli": "^7.0.0",
    "@babel/core": "^7.0.0",
    "@babel/plugin-proposal-object-rest-spread": "^7.0.0",
    "@babel/preset-env": "^7.0.0",
    "@babel/template": "^7.0.0",
    "@babel/types": "^7.0.0",
    "@types/jest": "^29.0.0",
    "@typescript-eslint/eslint-plugin": "^8.2.0",
    "babel-jest": "^30.0.0",
    "babel-plugin-add-module-exports": "^1.0.2",
    "cz-conventional-changelog": "^3.3.0",
    "eslint": "^8.27.0",
    "eslint-config-standard": "^17.0.0",
    "eslint-plugin-import": "^2.12.0",
    "eslint-plugin-jest": "^29.0.1",
    "eslint-plugin-node": "^11.1.0",
    "eslint-plugin-promise": "^6.0.0",
    "eslint-plugin-standard": "^5.0.0",
    "https-proxy-agent": "^7.0.0",
    "husky": "^4.3.8",
    "jest": "^29.0.0",
    "nixt": "^0.5.0",
    "nock": "^15.0.0",
    "opener": "^1.4.1",
    "rimraf": "^4.0.7",
    "semantic-release": "^25.0.2"
  },
  "files": [
    "bin",
    "dist",
    "example-config.json",
    "index.js",
    "types.d.ts"
  ],
  "config": {
    "commitizen": {
      "path": "./node_modules/cz-conventional-changelog"
    }
  },
  "release": {
    "branches": [
      "main",
      {
        "name": "beta",
        "channel": "beta",
        "prerelease": true
      }
    ],
    "plugins": [
      [
        "@semantic-release/commit-analyzer",
        {
          "releaseRules": [
            {
              "type": "build",
              "scope": "deps",
              "release": "patch"
            }
          ]
        }
      ],
      "@semantic-release/release-notes-generator",
      "@semantic-release/npm",
      "@semantic-release/github"
    ]
  },
  "jest": {
    "testEnvironment": "node",
    "collectCoverageFrom": [
      "lib/**/*.js"
    ],
    "coveragePathIgnorePatterns": [
      "usageParams.js"
    ]
  },
  "overrides": {
    "cross-spawn": "^7.0.6"
  }
}


================================================
FILE: test/integration/export-lib.test.js
================================================
import { join } from 'path'
import fs from 'fs'

import mkdirp from 'mkdirp'
import rimraf from 'rimraf'

import runContentfulExport from '../../dist/index'

const fsPromises = fs.promises

jest.setTimeout(15000)

const tmpFolder = join(__dirname, 'tmp-lib')
const spaceId = process.env.EXPORT_SPACE_ID
const managementToken = process.env.MANAGEMENT_TOKEN
const deliveryToken = process.env.DELIVERY_TOKEN

const spaceIdEmbargoedAssets = process.env.EXPORT_SPACE_ID_EMBARGOED_ASSETS

beforeAll(() => {
  mkdirp.sync(tmpFolder)
})

afterAll(() => {
  rimraf.sync(tmpFolder)
})

test('It should export space when used as a library', () => {
  return runContentfulExport({ spaceId, managementToken, saveFile: false, exportDir: tmpFolder })
    .catch((multierror) => {
      const errors = multierror.errors.filter((error) => Object.prototype.hasOwnProperty.call(error, 'error'))
      expect(errors).toHaveLength(0)
    })
    .then((content) => {
      expect(content).toBeTruthy()
      expect(content.contentTypes).toHaveLength(2)
      expect(content.editorInterfaces).toHaveLength(2)
      expect(content.entries).toHaveLength(4)
      expect(content.assets).toHaveLength(4)
      expect(content.locales).toHaveLength(1)
      expect(content.tags).toHaveLength(4)
      expect(content.webhooks).toHaveLength(0)
      expect(content.roles).toHaveLength(7)
      // make sure entries are delivered from CMA
      expect(content.entries[0].sys).toHaveProperty('publishedVersion')
    })
})

test('It should export environment when used as a library', () => {
  return runContentfulExport({ spaceId, environmentId: 'staging', managementToken, saveFile: false, exportDir: tmpFolder })
    .catch((multierror) => {
      const errors = multierror.errors.filter((error) => Object.prototype.hasOwnProperty.call(error, 'error'))
      expect(errors).toHaveLength(0)
    })
    .then((content) => {
      expect(content).toBeTruthy()
      expect(content.contentTypes).toHaveLength(2)
      expect(content.editorInterfaces).toHaveLength(2)
      expect(content.entries).toHaveLength(4)
      expect(content.assets).toHaveLength(4)
      expect(content.locales).toHaveLength(1)
      expect(content.tags).toHaveLength(2)
      expect(content).not.toHaveProperty('webhooks')
      expect(content).not.toHaveProperty('roles')
    })
})

test('It should export space when used as a library, with deliveryToken', () => {
  return runContentfulExport({ spaceId, managementToken, deliveryToken, saveFile: false, exportDir: tmpFolder })
    .catch((multierror) => {
      const errors = multierror.errors.filter((error) => Object.prototype.hasOwnProperty.call(error, 'error'))
      expect(errors).toHaveLength(0)
    })
    .then((content) => {
      expect(content).toBeTruthy()
      expect(content.contentTypes).toHaveLength(2)
      expect(content.editorInterfaces).toHaveLength(2)
      expect(content.entries).toHaveLength(4)
      expect(content.assets).toHaveLength(4)
      expect(content.locales).toHaveLength(1)
      expect(content.tags).toHaveLength(4)
      expect(content.webhooks).toHaveLength(0)
      expect(content.roles).toHaveLength(7)
    })
})

test('It should export embargoed assets space when used as a library', () => {
  return runContentfulExport({
    spaceId: spaceIdEmbargoedAssets,
    managementToken,
    saveFile: true,
    downloadAssets: true,
    exportDir: tmpFolder,
    host: 'api.contentful.com'
  })
    .catch((multierror) => {
      const errors = multierror.errors.filter((error) => Object.prototype.hasOwnProperty.call(error, 'error'))
      expect(errors).toHaveLength(0)
    })
    .then(async (content) => {
      expect(content.assets).toHaveLength(1)

      // This code ensures that the protected/embargoed asset has actually been downloaded
      const files = await fsPromises.readdir(tmpFolder, { withFileTypes: true })
      const directories = files.filter(f => f.isDirectory())
      expect(directories).toHaveLength(1)
    })
})


================================================
FILE: test/unit/index.test.js
================================================
import { resolve } from 'path'

import bfj from 'bfj'
import fs from 'fs'
import mkdirp from 'mkdirp'

import {
  setupLogging,
  displayErrorLog,
  writeErrorLogFile
} from 'contentful-batch-libs'

import { mockDownloadAssets } from './mocks/download-assets'
import { mockGetSpaceData } from './mocks/get-space-data'

import downloadAssets from '../../lib/tasks/download-assets'
import getSpaceData from '../../lib/tasks/get-space-data'
import initClient from '../../lib/tasks/init-client'
import runContentfulExport from '../../lib/index'

jest.spyOn(global.console, 'log')
jest.mock('../../lib/tasks/init-client')
jest.mock('../../lib/tasks/download-assets', () => jest.fn(() => mockDownloadAssets))
jest.mock('../../lib/tasks/get-space-data', () => jest.fn(mockGetSpaceData))

jest.mock('fs', () => ({ access: jest.fn((path, cb) => cb()) }))
jest.mock('mkdirp', () => jest.fn())
jest.mock('bfj', () => ({ write: jest.fn().mockResolvedValue() }))
jest.mock('contentful-batch-libs/dist/logging', () => ({
  setupLogging: jest.fn(),
  displayErrorLog: jest.fn(),
  logToTaskOutput: () => jest.fn(),
  writeErrorLogFile: jest.fn((destination, errorLog) => {
    const multiError = new Error('Errors occurred')
    multiError.name = 'ContentfulMultiError'
    multiError.errors = errorLog
    throw multiError
  })
}))

afterEach(() => {
  initClient.mockClear()
  getSpaceData.mockClear()
  setupLogging.mockClear()
  displayErrorLog.mockClear()
  fs.access.mockClear()
  mkdirp.mockClear()
  bfj.write.mockClear()
  writeErrorLogFile.mockClear()
  downloadAssets.mockClear()
  global.console.log.mockClear()
})

test('Runs Contentful Export with default config', async () => {
  await runContentfulExport({
    errorLogFile: 'errorlogfile',
    spaceId: 'someSpaceId',
    managementToken: 'someManagementToken'
  })

  expect(initClient.mock.calls).toHaveLength(1)
  expect(getSpaceData.mock.calls).toHaveLength(1)
  expect(setupLogging.mock.calls).toHaveLength(1)
  expect(downloadAssets.mock.calls).toHaveLength(1)
  expect(displayErrorLog.mock.calls).toHaveLength(1)
  expect(fs.access.mock.calls).toHaveLength(1)
  expect(mkdirp.mock.calls).toHaveLength(1)
  expect(bfj.write.mock.calls).toHaveLength(1)
  expect(writeErrorLogFile.mock.calls).toHaveLength(0)
  const exportedTable = global.console.log.mock.calls.find((call) => call[0].match(/Exported entities/))
  expect(exportedTable).not.toBeUndefined()
  expect(exportedTable[0]).toMatch(/Exported entities/)
  expect(exportedTable[0]).toMatch(/Content Types.+0/)
  expect(exportedTable[0]).toMatch(/Entries.+0/)
  expect(exportedTable[0]).toMatch(/Assets.+2/)
  expect(exportedTable[0]).toMatch(/Locales.+0/)
  const assetsTable = global.console.log.mock.calls.find((call) => call[0].match(/Asset file download results/))
  expect(assetsTable).toBeUndefined()
})

test('Runs Contentful Export and downloads assets', async () => {
  await runContentfulExport({
    errorLogFile: 'errorlogfile',
    spaceId: 'someSpaceId',
    managementToken: 'someManagementToken',
    downloadAssets: true
  })

  expect(initClient.mock.calls).toHaveLength(1)
  expect(getSpaceData.mock.calls).toHaveLength(1)
  expect(setupLogging.mock.calls).toHaveLength(1)
  expect(downloadAssets.mock.calls).toHaveLength(1)
  expect(displayErrorLog.mock.calls).toHaveLength(1)
  expect(fs.access.mock.calls).toHaveLength(1)
  expect(mkdirp.mock.calls).toHaveLength(1)
  expect(bfj.write.mock.calls).toHaveLength(1)
  expect(writeErrorLogFile.mock.calls).toHaveLength(0)
  const exportedTable = global.console.log.mock.calls.find((call) => call[0].match(/Exported entities/))
  expect(exportedTable).not.toBeUndefined()
  expect(exportedTable[0]).toMatch(/Exported entities/)
  expect(exportedTable[0]).toMatch(/Content Types.+0/)
  expect(exportedTable[0]).toMatch(/Entries.+0/)
  expect(exportedTable[0]).toMatch(/Assets.+2/)
  expect(exportedTable[0]).toMatch(/Locales.+0/)
  const assetsTable = global.console.log.mock.calls.find((call) => call[0].match(/Asset file download results/))
  expect(assetsTable).not.toBeUndefined()
  expect(assetsTable[0]).toMatch(/Asset file download results/)
  expect(assetsTable[0]).toMatch(/Successful.+3/)
  expect(assetsTable[0]).toMatch(/Warnings.+2/)
  expect(assetsTable[0]).toMatch(/Errors.+1/)
})

test('Creates a valid and correct opts object', async () => {
  const errorLogFile = 'errorlogfile'
  const { default: exampleConfig } = await import('../../example-config.test.json')

  await runContentfulExport({
    errorLogFile,
    config: resolve(__dirname, '..', '..', 'example-config.test.json')
  })

  expect(initClient.mock.calls[0][0].skipContentModel).toBeFalsy()
  expect(initClient.mock.calls[0][0].skipEditorInterfaces).toBeFalsy()
  expect(initClient.mock.calls[0][0].skipTags).toBeFalsy()
  expect(initClient.mock.calls[0][0].stripTags).toBeFalsy()
  expect(initClient.mock.calls[0][0].errorLogFile).toBe(resolve(process.cwd(), errorLogFile))
  expect(initClient.mock.calls[0][0].spaceId).toBe(exampleConfig.spaceId)
  expect(initClient.mock.calls).toHaveLength(1)
  expect(getSpaceData.mock.calls).toHaveLength(1)
  expect(setupLogging.mock.calls).toHaveLength(1)
  expect(downloadAssets.mock.calls).toHaveLength(1)
  expect(displayErrorLog.mock.calls).toHaveLength(1)
  expect(fs.access.mock.calls).toHaveLength(1)
  expect(mkdirp.mock.calls).toHaveLength(1)
  expect(bfj.write.mock.calls).toHaveLength(1)
  expect(writeErrorLogFile.mock.calls).toHaveLength(0)
  const exportedTable = global.console.log.mock.calls.find((call) => call[0].match(/Exported entities/))
  expect(exportedTable).not.toBeUndefined()
  expect(exportedTable[0]).toMatch(/Exported entities/)
  expect(exportedTable[0]).toMatch(/Content Types.+0/)
  expect(exportedTable[0]).toMatch(/Entries.+0/)
  expect(exportedTable[0]).toMatch(/Assets.+2/)
  expect(exportedTable[0]).toMatch(/Locales.+0/)
})

test('Runs Contentful Export and fails due to rejection', async () => {
  const rejectError = new Error()
  rejectError.request = { uri: 'erroruri' }
  getSpaceData.mockImplementation(() => Promise.reject(rejectError))

  await expect(runContentfulExport({
    errorLogFile: 'errorlogfile',
    spaceId: 'someSpaceId',
    managementToken: 'someManagementToken'
  })).rejects.toThrow()

  expect(initClient.mock.calls).toHaveLength(1)
  expect(getSpaceData.mock.calls).toHaveLength(1)
  expect(setupLogging.mock.calls).toHaveLength(1)
  expect(downloadAssets.mock.calls).toHaveLength(1)
  expect(displayErrorLog.mock.calls).toHaveLength(1)
  expect(fs.access.mock.calls).toHaveLength(0)
  expect(mkdirp.mock.calls).toHaveLength(0)
  expect(bfj.write.mock.calls).toHaveLength(0)
  expect(writeErrorLogFile.mock.calls).toHaveLength(1)
})


================================================
FILE: test/unit/mocks/download-assets.js
================================================
export const mockDownloadAssets = async (ctx) => {
  ctx.assetDownloads = {
    successCount: 3,
    warningCount: 2,
    errorCount: 1
  }
}


================================================
FILE: test/unit/mocks/get-space-data.js
================================================
import Listr from 'listr'

export const mockGetSpaceData = () => {
  return new Listr([
    {
      title: 'mocked get full source space',
      task: (ctx) => {
        ctx.data = {
          contentTypes: [],
          entries: [],
          assets: [
            {
              sys: { id: 'someValidAsset' },
              fields: {
                file: {
                  'en-US': {
                    url: '//images.contentful.com/kq9lln4hyr8s/2MTd2wBirYikEYkIIc0YSw/7aa4c06f3054996e45bb3f13964cb254/rocka-nutrition.png'
                  }
                }
              }
            },
            {
              sys: { id: 'someBrokenAsset' },
              fields: {}
            }
          ],
          locales: []
        }
      }
    }
  ])
}


================================================
FILE: test/unit/parseOptions.test.js
================================================
import { HttpsProxyAgent } from 'https-proxy-agent'
import { basename, isAbsolute, resolve, sep } from 'path'
import parseOptions from '../../lib/parseOptions'

const spaceId = 'foo'
const managementToken = 'someManagementToken'
const basePath = resolve(__dirname, '..', '..')

const toBeAbsolutePathWithPattern = (received, pattern) => {
  const escapedPattern = [basename(basePath), pattern].join(`\\${sep}`)

  return isAbsolute(received) && new RegExp(`${escapedPattern}$`).test(received)
}

test('parseOptions requires spaceId', () => {
  expect(
    () => parseOptions({})
  ).toThrow('The `spaceId` option is required.')
})

test('parseOptions requires managementToken', () => {
  expect(
    () => parseOptions({
      spaceId: 'someSpaceId'
    })
  ).toThrow('The `managementToken` option is required.')
})

test('parseOptions sets correct default options', async () => {
  const { default: packageJson } = await import(resolve(basePath, 'package.json'))
  const version = packageJson.version

  const options = parseOptions({ spaceId, managementToken })

  const contentFileNamePattern = `contentful-export-${spaceId}-master-[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}-[0-9]{2}-[0-9]{2}\\.json`
  const errorFileNamePattern = `contentful-export-error-log-${spaceId}-master-[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}-[0-9]{2}-[0-9]{2}\\.json`

  expect(options.contentFile).toMatch(new RegExp(`^${contentFileNamePattern}$`))
  expect(toBeAbsolutePathWithPattern(options.errorLogFile, errorFileNamePattern)).toBe(true)
  expect(options.exportDir).toBe(basePath)
  expect(options.includeDrafts).toBe(false)
  expect(options.includeArchived).toBe(false)
  expect(toBeAbsolutePathWithPattern(options.logFilePath, contentFileNamePattern)).toBe(true)
  expect(options.application).toBe(`contentful.export/${version}`)
  expect(options.feature).toBe('library-export')
  expect(options.accessToken).toBe(managementToken)
  expect(options.maxAllowedLimit).toBe(1000)
  expect(options.saveFile).toBe(true)
  expect(options.skipContent).toBe(false)
  expect(options.skipContentModel).toBe(false)
  expect(options.skipEditorInterfaces).toBe(false)
  expect(options.skipRoles).toBe(false)
  expect(options.skipWebhooks).toBe(false)
  expect(options.skipTags).toBe(false)
  expect(options.stripTags).toBe(false)
  expect(options.spaceId).toBe(spaceId)
  expect(options.startTime).toBeInstanceOf(Date)
  expect(options.useVerboseRenderer).toBe(false)
  expect(options.deliveryToken).toBeUndefined()
})

test('parseOptions accepts config file', async () => {
  const configFileName = 'example-config.test.json'
  const { default: config } = await import(resolve(basePath, configFileName))

  const options = parseOptions({ config: configFileName })
  Object.keys(config).forEach((key) => {
    expect(options[key]).toBe(config[key])
  })
})

test('parseOptions overwrites errorLogFile', () => {
  const errorLogFile = 'error.log'
  const options = parseOptions({
    spaceId,
    managementToken,
    errorLogFile
  })
  expect(options.errorLogFile).toBe(resolve(basePath, errorLogFile))
})

test('parseOptions throws with invalid proxy', () => {
  expect(() => parseOptions({
    spaceId,
    managementToken,
    proxy: 'invalid'
  })).toThrow('Please provide the proxy config in the following format:\nhost:port or user:password@host:port')
})

test('parseOptions accepts proxy config as string', () => {
  const options = parseOptions({
    spaceId,
    managementToken,
    proxy: 'localhost:1234'
  })
  expect(options).not.toHaveProperty('proxy')
  expect(options.httpsAgent).toBeInstanceOf(HttpsProxyAgent)
})

test('parseOptions accepts proxy config as object', () => {
  const options = parseOptions({
    spaceId,
    managementToken,
    proxy: {
      host: 'localhost',
      port: 1234,
      user: 'foo',
      password: 'bar'
    }
  })
  expect(options).not.toHaveProperty('proxy')
  expect(options.httpsAgent).toBeInstanceOf(HttpsProxyAgent)
})

test('parseOptions parses queryEntries option', () => {
  const options = parseOptions({
    spaceId,
    managementToken,
    queryEntries: [
      'someParam=someValue',
      'someOtherParam=someOtherValue'
    ]
  })
  expect(options.queryEntries).toMatchObject({
    someParam: 'someValue',
    someOtherParam: 'someOtherValue'
  })
})

test('parseOptions parses queryAssets option', () => {
  const options = parseOptions({
    spaceId,
    managementToken,
    queryAssets: [
      'someParam=someValue',
      'someOtherParam=someOtherValue'
    ]
  })
  expect(options.queryAssets).toMatchObject({
    someParam: 'someValue',
    someOtherParam: 'someOtherValue'
  })
})

test('parseOptions sets correct options given contentOnly', () => {
  const options = parseOptions({
    spaceId,
    managementToken,
    contentOnly: true
  })
  expect(options.skipRoles).toBe(true)
  expect(options.skipContentModel).toBe(true)
  expect(options.skipWebhooks).toBe(true)
})

test('parseOptions accepts custom application & feature', () => {
  const managementApplication = 'managementApplicationMock'
  const managementFeature = 'managementFeatureMock'

  const options = parseOptions({
    spaceId,
    managementToken,
    managementApplication,
    managementFeature
  })

  expect(options.application).toBe(managementApplication)
  expect(options.feature).toBe(managementFeature)
})

test('parseOptions parses deliveryToken option', () => {
  const options = parseOptions({
    spaceId,
    managementToken,
    deliveryToken: 'testDeliveryToken'
  })
  expect(options.accessToken).toBe(managementToken)
  expect(options.spaceId).toBe(spaceId)
  expect(options.deliveryToken).toBe('testDeliveryToken')
})

test('parseOptions parses headers option', () => {
  const options = parseOptions({
    spaceId,
    managementToken,
    headers: {
      header1: '1',
      header2: '2'
    }
  })
  expect(options.headers).toEqual({
    header1: '1',
    header2: '2',
    'CF-Sequence': expect.any(String)
  })
})

test('parses params.header if provided', function () {
  const config = parseOptions({
    spaceId,
    managementToken,
    header: ['Accept   : application/json ', ' X-Header: 1']
  })
  expect(config.headers).toEqual({ Accept: 'application/json', 'X-Header': '1', 'CF-Sequence': expect.any(String) })
})


================================================
FILE: test/unit/tasks/download-assets.test.js
================================================
import { promises as fs, rmSync } from 'fs'
import { tmpdir } from 'os'
import { resolve } from 'path'

import nock from 'nock'

import downloadAssets from '../../../lib/tasks/download-assets'

const tmpDirectory = resolve(tmpdir(), 'contentful-import-test')

const BASE_PATH = '//images.contentful.com'
const BASE_PATH_SECURE = '//images.secure.contentful.com'
const ASSET_PATH = '/kq9lln4hyr8s/2MTd2wBirYikEYkIIc0YSw/7aa4c06f3054996e45bb3f13964cb254'
const EXISTING_ASSET_FILENAME = 'rocka-nutrition.png'
const EXISTING_ASSET_URL = `${ASSET_PATH}/${EXISTING_ASSET_FILENAME}`
const EMBARGOED_ASSET_FILENAME = 'space-dog.png'
const EMBARGOED_ASSET_URL = `${ASSET_PATH}/${EMBARGOED_ASSET_FILENAME}`
const NON_EXISTING_URL = '/does-not-exist.png'
const UNICODE_SHORT_FILENAME = '测试文件.jpg'
const UNICODE_SHORT_URL = `${ASSET_PATH}/${encodeURIComponent(UNICODE_SHORT_FILENAME)}`
const UNICODE_LONG_FILENAME = `${'测试文件'.repeat(10)}.jpg`
const UNICODE_LONG_URL = `${ASSET_PATH}/${encodeURIComponent(UNICODE_LONG_FILENAME)}`
const DIFFERENT_FILENAME = 'different filename.jpg'
const UPLOAD_URL = '//file-stack-url-do-not-use-me.png'

const API_HOST = 'api.contentful.com'
const SPACE_ID = 'kq9lln4hyr8s'
const ACCESS_TOKEN = 'abc'
const ENVIRONMENT_ID = 'master'
const POLICY = 'eyJhbG.eyJMDIyfQ.SflKx5c'
const SECRET = 's3cr3t'

let taskProxy
let output

nock(`https:${BASE_PATH}`)
  .get(EXISTING_ASSET_URL)
  .times(8)
  .reply(200)

nock(`https:${BASE_PATH}`)
  .get(NON_EXISTING_URL)
  .reply(404)

nock(`https:${BASE_PATH}`)
  .get(UNICODE_SHORT_URL)
  .times(2)
  .reply(200)

nock(`https:${BASE_PATH}`)
  .get(UNICODE_LONG_URL)
  .times(2)
  .reply(200)

// Mock downloading assets using signed URLs
nock(`https:${BASE_PATH_SECURE}`)
  .get(EMBARGOED_ASSET_URL)
  .query({ policy: POLICY, token: /.+/i })
  .times(2)
  .reply(200)

// Mock asset-key creation for embargoed assets
nock(`https://${API_HOST}`)
  .post(`/spaces/${SPACE_ID}/environments/${ENVIRONMENT_ID}/asset_keys`, {
    expiresAt: /.+/i
  })
  .times(1)
  .reply(200, { policy: POLICY, secret: SECRET })

function getAssets ({ existing = 0, nonExisting = 0, missingUrl = 0, embargoed = 0, unicodeShort = 0, unicodeLong = 0, differentFilename = 0 } = {}) {
  const existingUrl = `${BASE_PATH}${EXISTING_ASSET_URL}`
  const embargoedUrl = `${BASE_PATH_SECURE}${EMBARGOED_ASSET_URL}`
  const nonExistingUrl = `${BASE_PATH}${NON_EXISTING_URL}`
  const unicodeShortUrl = `${BASE_PATH}${UNICODE_SHORT_URL}`
  const unicodeLongUrl = `${BASE_PATH}${UNICODE_LONG_URL}`
  const assets = []
  for (let i = 0; i < nonExisting; i++) {
    assets.push({
      sys: {
        id: `Non existing asset ${i}`
      },
      fields: {
        file: {
          'en-US': {
            url: nonExistingUrl,
            fileName: NON_EXISTING_URL,
            upload: UPLOAD_URL
          },
          'de-DE': {
            url: nonExistingUrl,
            fileName: NON_EXISTING_URL,
            upload: UPLOAD_URL
          }
        }
      }
    })
  }
  for (let i = 0; i < existing; i++) {
    assets.push({
      sys: {
        id: `existing asset ${i}`
      },
      fields: {
        file: {
          'en-US': {
            url: existingUrl,
            fileName: EXISTING_ASSET_FILENAME,
            upload: UPLOAD_URL
          },
          'de-DE': {
            url: existingUrl,
            fileName: EXISTING_ASSET_FILENAME,
            upload: UPLOAD_URL
          }
        }
      }
    })
  }
  for (let i = 0; i < embargoed; i++) {
    assets.push({
      sys: {
        id: `embargoed asset ${i}`
      },
      fields: {
        file: {
          'en-US': {
            url: embargoedUrl,
            fileName: EMBARGOED_ASSET_FILENAME,
            upload: UPLOAD_URL
          },
          'de-DE': {
            url: embargoedUrl,
            fileName: EMBARGOED_ASSET_FILENAME,
            upload: UPLOAD_URL
          }
        }
      }
    })
  }
  for (let i = 0; i < missingUrl; i++) {
    assets.push({
      sys: {
        id: `missing file url ${i}`
      },
      fields: {
        file: {
          'en-US': {
            upload: UPLOAD_URL,
            fileName: DIFFERENT_FILENAME
          },
          'de-DE': {
            upload: UPLOAD_URL,
            fileName: DIFFERENT_FILENAME
          }
        }
      }
    })
  }
  for (let i = 0; i < unicodeShort; i++) {
    assets.push({
      sys: {
        id: `unicode short asset ${i}`
      },
      fields: {
        file: {
          'en-US': {
            url: unicodeShortUrl,
            fileName: UNICODE_SHORT_FILENAME,
            upload: UPLOAD_URL
          },
          'de-DE': {
            url: unicodeShortUrl,
            fileName: UNICODE_SHORT_FILENAME,
            upload: UPLOAD_URL
          }
        }
      }
    })
  }
  for (let i = 0; i < unicodeLong; i++) {
    assets.push({
      sys: {
        id: `unicode long asset ${i}`
      },
      fields: {
        file: {
          'en-US': {
            url: unicodeLongUrl,
            fileName: UNICODE_LONG_FILENAME,
            upload: UPLOAD_URL
          },
          'de-DE': {
            url: unicodeLongUrl,
            fileName: UNICODE_LONG_FILENAME,
            upload: UPLOAD_URL
          }
        }
      }
    })
  }
  for (let i = 0; i < differentFilename; i++) {
    assets.push({
      sys: {
        id: `different filename asset ${i}`
      },
      fields: {
        file: {
          'en-US': {
            url: existingUrl,
            fileName: DIFFERENT_FILENAME,
            upload: UPLOAD_URL
          },
          'de-DE': {
            url: existingUrl,
            fileName: DIFFERENT_FILENAME,
            upload: UPLOAD_URL
          }
        }
      }
    })
  }
  return assets
}

beforeEach(() => {
  output = jest.fn()
  taskProxy = new Proxy({}, {
    set: (obj, prop, value) => {
      if (prop === 'output') {
        output(value)
        // A Proxy set trap must return a truthy value to signal success;
        // returning `value` would throw in strict mode for falsy writes.
        return true
      }
      throw new Error(`It should not access task property ${String(prop)} (value: ${value})`)
    }
  })
})
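The `taskProxy` above relies on a Proxy `set` trap to capture every write to the task's `output` property while rejecting all other property writes. A minimal standalone sketch of that pattern (names here are illustrative, not taken from the library under test):

```javascript
// Record writes to one allowed property; reject everything else.
const recorded = []
const proxy = new Proxy({}, {
  set (target, prop, value) {
    if (prop === 'output') {
      recorded.push(value)
      return true // signal to the runtime that the assignment succeeded
    }
    throw new Error(`unexpected property write: ${String(prop)}`)
  }
})

proxy.output = 'downloading asset 1 of 2'
// recorded now holds ['downloading asset 1 of 2'];
// writing any other property (e.g. proxy.title = 'x') throws.
```

This lets the test assert on everything a task reports without stubbing the full Listr task interface.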
beforeAll(async () => {
  await fs.mkdir(tmpDirectory, { recursive: true })
})

afterAll(() => {
  // Couldn't get `fs.promises.rm` to work without permissions issues
  rmSync(tmpDirectory, { recursive: true, force: true })

  if (!nock.isDone()) {
    throw new Error(`pending mocks: ${nock.pendingMocks().join(', ')}`)
  }

  nock.cleanAll()
  nock.restore()
})

test('Downloads assets and properly counts failed attempts', () => {
  const task = downloadAssets({
    exportDir: tmpDirectory
  })
  const ctx = {
    data: {
      assets: [
        ...getAssets({ existing: 1, nonExisting: 1 }),
        {
          sys: {
            id: 'corrupt asset [warning]'
          },
          fields: {}
        }
      ]
    }
  }

  return task(ctx, taskProxy)
    .then(() => {
      expect(ctx.assetDownloads).toEqual({
        successCount: 2,
        warningCount: 1,
        errorCount: 2
      })
      expect(output.mock.calls).toHaveLength(5)
    })
})

test('Downloads embargoed assets', () => {
  const task = downloadAssets({
    exportDir: tmpDirectory,
    host: API_HOST,
    accessToken: ACCESS_TOKEN,
    spaceId: SPACE_ID,
    environmentId: ENVIRONMENT_ID
  })
  const ctx = {
    data: {
      assets: [
        ...getAssets({ embargoed: 1 })
      ]
    }
  }

  return task(ctx, taskProxy)
    .then(() => {
      expect(ctx.assetDownloads).toEqual({
        successCount: 2,
        warningCount: 0,
        errorCount: 0
      })
      expect(output.mock.calls).toHaveLength(2)
    })
})

test('it doesn\'t use fileStack url as fallback for the file url and throws a warning output', () => {
  const task = downloadAssets({
    exportDir: tmpDirectory
  })
  const ctx = {
    data: {
      assets: [
        ...getAssets({ existing: 2, missingUrl: 1 })
      ]
    }
  }

  return task(ctx, taskProxy)
    .then(() => {
      expect(ctx.assetDownloads).toEqual({
        successCount: 4,
        warningCount: 0,
        errorCount: 2
      })
      expect(output.mock.calls).toHaveLength(6)

      const missingUrlOutputs = output.mock.calls.filter(call =>
        call[0]?.endsWith('asset.fields.file[en-US].url') ||
          call[0]?.endsWith('asset.fields.file[de-DE].url'))

      expect(missingUrlOutputs).toHaveLength(2)
    })
})

test('Downloads assets with short Unicode filenames', () => {
  const task = downloadAssets({
    exportDir: tmpDirectory
  })
  const ctx = {
    data: {
      assets: [
        ...getAssets({ unicodeShort: 1 })
      ]
    }
  }

  return task(ctx, taskProxy)
    .then(() => {
      expect(ctx.assetDownloads).toEqual({
        successCount: 2,
        warningCount: 0,
        errorCount: 0
      })
      expect(output.mock.calls).toHaveLength(2)

      const unicodeShortAsset = ctx.data.assets.find(asset => asset.sys.id === 'unicode short asset 0')
      expect(unicodeShortAsset.fields.file['en-US'].fileName).toBe(UNICODE_SHORT_FILENAME)
      expect(unicodeShortAsset.fields.file['de-DE'].fileName).toBe(UNICODE_SHORT_FILENAME)
    })
})

test('Downloads assets with long Unicode filenames', () => {
  const task = downloadAssets({
    exportDir: tmpDirectory
  })
  const ctx = {
    data: {
      assets: [
        ...getAssets({ unicodeLong: 1 })
      ]
    }
  }

  return task(ctx, taskProxy)
    .then(() => {
      expect(ctx.assetDownloads).toEqual({
        successCount: 2,
        warningCount: 0,
        errorCount: 0
      })
      expect(output.mock.calls).toHaveLength(2)

      const unicodeLongAsset = ctx.data.assets.find(asset => asset.sys.id === 'unicode long asset 0')
      expect(unicodeLongAsset.fields.file['en-US'].fileName).toBe(UNICODE_LONG_FILENAME)
      expect(unicodeLongAsset.fields.file['de-DE'].fileName).toBe(UNICODE_LONG_FILENAME)
    })
})

test('Downloads assets with different filename than URL path', () => {
  const task = downloadAssets({
    exportDir: tmpDirectory
  })
  const ctx = {
    data: {
      assets: [
        ...getAssets({ differentFilename: 1 })
      ]
    }
  }

  return task(ctx, taskProxy)
    .then(() => {
      expect(ctx.assetDownloads).toEqual({
        successCount: 2,
        warningCount: 0,
        errorCount: 0
      })
      expect(output.mock.calls).toHaveLength(2)

      const differentFilenameAsset = ctx.data.assets.find(asset => asset.sys.id === 'different filename asset 0')
      expect(differentFilenameAsset.fields.file['en-US'].fileName).toBe(DIFFERENT_FILENAME)
      expect(differentFilenameAsset.fields.file['en-US'].url).toBe(`${BASE_PATH}${EXISTING_ASSET_URL}`)
      expect(differentFilenameAsset.fields.file['de-DE'].fileName).toBe(DIFFERENT_FILENAME)
      expect(differentFilenameAsset.fields.file['de-DE'].url).toBe(`${BASE_PATH}${EXISTING_ASSET_URL}`)
    })
})


================================================
FILE: test/unit/tasks/get-space-data.test.js
================================================
import getSpaceData from '../../../lib/tasks/get-space-data'

const maxAllowedLimit = 100
const resultItemCount = 420

function pagedResult (query, maxItems, mock = {}) {
  const { skip, limit } = query
  const cnt = maxItems - skip > limit ? limit : maxItems - skip
  return {
    // Array.from passes (element, index) to its mapper; the index drives
    // unique ids across pages.
    items: Array.from({ length: cnt }, (_, n) => {
      const id = skip + n + 1
      return Object.assign({ sys: { id } }, mock)
    }),
    total: maxItems
  }
}
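The paging arithmetic in `pagedResult` (full pages of `limit` items, with the final page holding the remainder of `maxItems`) can be sketched in isolation. This helper is illustrative only and not part of the test file:

```javascript
// Return the item count of each page when `total` items are fetched
// `limit` at a time, mirroring pagedResult's cnt computation above.
function pageCounts (total, limit) {
  const pages = []
  for (let skip = 0; skip < total; skip += limit) {
    pages.push(Math.min(limit, total - skip))
  }
  return pages
}

// 420 items at 100 per page: four full pages plus a final page of 20,
// matching the Math.ceil(resultItemCount / maxAllowedLimit) call-count
// assertions used throughout these tests.
const pages = pageCounts(420, 100) // [100, 100, 100, 100, 20]
```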

function pagedContentResult (query, maxItems, mock = {}) {
  const result = pagedResult(query, maxItems, mock)
  // Mark every other item as published, so runs that filter out drafts
  // see exactly half of the items.
  result.items.forEach((item, index) => {
    item.sys.publishedVersion = index % 2
  })
  return result
}

const mockSpace = {}

const mockEnvironment = {}

const mockClient = {}

const getEditorInterface = jest.fn()

const mockAsset = { metadata: { tags: [{}] } }

const mockEntry = { metadata: { tags: [{}] } }

function setupMocks () {
  mockClient.getSpace = jest.fn(() => Promise.resolve(mockSpace))
  mockSpace.getEnvironment = jest.fn(() => Promise.resolve(mockEnvironment))
  mockEnvironment.getContentTypes = jest.fn((query) => {
    return Promise.resolve(pagedResult(query, resultItemCount, {
      getEditorInterface
    }))
  })
  mockEnvironment.getEntries = jest.fn((query) => {
    return Promise.resolve(pagedContentResult(query, resultItemCount, mockEntry))
  })
  mockEnvironment.getAssets = jest.fn((query) => {
    return Promise.resolve(pagedContentResult(query, resultItemCount, mockAsset))
  })
  mockEnvironment.getLocales = jest.fn((query) => {
    return Promise.resolve(pagedResult(query, resultItemCount))
  })
  mockEnvironment.getTags = jest.fn((query) => {
    return Promise.resolve(pagedResult(query, resultItemCount))
  })
  mockSpace.getWebhooks = jest.fn((query) => {
    return Promise.resolve(pagedResult(query, resultItemCount))
  })
  mockSpace.getRoles = jest.fn((query) => {
    return Promise.resolve(pagedResult(query, resultItemCount))
  })
  getEditorInterface.mockImplementation(() => Promise.resolve({}))
}

beforeEach(setupMocks)

afterEach(() => {
  mockClient.getSpace.mockClear()
  mockEnvironment.getContentTypes.mockClear()
  mockEnvironment.getEntries.mockClear()
  mockEnvironment.getAssets.mockClear()
  mockEnvironment.getLocales.mockClear()
  mockEnvironment.getTags.mockClear()
  mockSpace.getWebhooks.mockClear()
  mockSpace.getRoles.mockClear()
  getEditorInterface.mockClear()
})

test('Gets whole destination content', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getRoles.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toHaveLength(resultItemCount / 2)
      expect(response.data.assets).toHaveLength(resultItemCount / 2)
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data.webhooks).toHaveLength(resultItemCount)
      expect(response.data.roles).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Gets whole destination content without content model', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    skipContentModel: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(0)
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(0)
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getRoles.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(0)
      expect(response.data.contentTypes).toBeUndefined()
      expect(response.data.entries).toHaveLength(resultItemCount / 2)
      expect(response.data.assets).toHaveLength(resultItemCount / 2)
      expect(response.data.locales).toBeUndefined()
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data.webhooks).toHaveLength(resultItemCount)
      expect(response.data.roles).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toBeUndefined()
    })
})

test('Gets whole destination content without content', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    skipContent: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(0)
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(0)
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getRoles.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toBeUndefined()
      expect(response.data.assets).toBeUndefined()
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data.webhooks).toHaveLength(resultItemCount)
      expect(response.data.roles).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Gets whole destination content without webhooks', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    skipWebhooks: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(0)
      expect(mockSpace.getRoles.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toHaveLength(resultItemCount / 2)
      expect(response.data.assets).toHaveLength(resultItemCount / 2)
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data.webhooks).toBeUndefined()
      expect(response.data.roles).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Gets whole destination content without roles', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    skipRoles: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getRoles.mock.calls).toHaveLength(0)
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toHaveLength(resultItemCount / 2)
      expect(response.data.assets).toHaveLength(resultItemCount / 2)
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data.webhooks).toHaveLength(resultItemCount)
      expect(response.data.roles).toBeUndefined()
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Gets whole destination content without editor interfaces', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    skipEditorInterfaces: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getRoles.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(0)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toHaveLength(resultItemCount / 2)
      expect(response.data.assets).toHaveLength(resultItemCount / 2)
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data.webhooks).toHaveLength(resultItemCount)
      expect(response.data.roles).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toBeUndefined()
    })
})

test('Gets whole destination content without tags', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    skipTags: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(0)
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getRoles.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toHaveLength(resultItemCount / 2)
      expect(response.data.assets).toHaveLength(resultItemCount / 2)
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toBeUndefined()
      expect(response.data.webhooks).toHaveLength(resultItemCount)
      expect(response.data.roles).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Gets whole destination content with drafts', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    includeDrafts: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getRoles.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toHaveLength(resultItemCount)
      expect(response.data.assets).toHaveLength(resultItemCount)
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data.webhooks).toHaveLength(resultItemCount)
      expect(response.data.roles).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Gets whole destination content with archived entries', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    includeDrafts: true,
    includeArchived: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getRoles.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toHaveLength(resultItemCount)
      expect(response.data.assets).toHaveLength(resultItemCount)
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data.webhooks).toHaveLength(resultItemCount)
      expect(response.data.roles).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Skips webhooks & roles for non-master environments', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    environmentId: 'staging',
    maxAllowedLimit,
    includeDrafts: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getEntries.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getAssets.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockEnvironment.getTags.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(mockSpace.getWebhooks.mock.calls).toHaveLength(0)
      expect(mockSpace.getRoles.mock.calls).toHaveLength(0)
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.entries).toHaveLength(resultItemCount)
      expect(response.data.assets).toHaveLength(resultItemCount)
      expect(response.data.locales).toHaveLength(resultItemCount)
      expect(response.data.tags).toHaveLength(resultItemCount)
      expect(response.data).not.toHaveProperty('webhooks')
      expect(response.data).not.toHaveProperty('roles')
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Gets whole destination content and detects missing editor interfaces', () => {
  getEditorInterface.mockImplementation(() => Promise.reject(new Error('No editor interface found')))

  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    skipContent: true,
    skipWebhooks: true,
    skipRoles: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(Math.ceil(resultItemCount / maxAllowedLimit))
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toHaveLength(0)
    })
})

test('Skips editor interfaces since no content types are found', () => {
  mockEnvironment.getContentTypes.mockImplementation(() => Promise.resolve({
    items: [],
    total: 0
  }))

  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    skipContent: true,
    skipWebhooks: true,
    skipRoles: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(1)
      expect(getEditorInterface.mock.calls).toHaveLength(0)
      expect(response.data.contentTypes).toHaveLength(0)
      expect(response.data.editorInterfaces).toBeUndefined()
    })
})

test('Loads 1000 items per page by default', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    skipContent: true,
    skipWebhooks: true,
    skipRoles: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getContentTypes.mock.calls[0][0].limit).toBe(1000)
      expect(getEditorInterface.mock.calls).toHaveLength(resultItemCount)
      expect(response.data.contentTypes).toHaveLength(resultItemCount)
      expect(response.data.editorInterfaces).toHaveLength(resultItemCount)
    })
})

test('Query entry/asset respect limit query param', () => {
  // overwrite the getAssets mock so the item count (2000) exceeds the default page size used by pagedGet (get-space-data.js)
  mockEnvironment.getAssets = jest.fn((query) => {
    return Promise.resolve(pagedContentResult(query, 2000, mockEntry))
  })
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    skipContentModel: true,
    skipWebhooks: true,
    skipRoles: true,
    includeDrafts: true,
    queryEntries: { limit: 20 }, // test limit < pageSize
    queryAssets: { limit: 1001 } // test limit > pageSize
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getEntries.mock.calls[0][0].limit).toBe(20)
      expect(mockEnvironment.getAssets.mock.calls[0][0].limit).toBe(1000) // assets should be called 2x
      expect(mockEnvironment.getAssets.mock.calls[1][0].limit).toBe(1) // because it has to fetch the final item in the second page
      expect(response.data.assets).toHaveLength(1001)
      expect(response.data.entries).toHaveLength(20)
    })
})

test('only skips fetched items', () => {
  // overwrite getLocales so it returns items in pages of 10; the first page
  // reports a total of 20, the second returns only 7 items and a total of 17
  mockEnvironment.getLocales = jest.fn()
    .mockResolvedValueOnce({
      // Array.from's mapper receives (element, index); use the index for ids
      items: Array.from({ length: 10 }, (_, index) => ({ sys: { id: index + 1 } })),
      total: 20
    })
    .mockResolvedValueOnce({
      items: Array.from({ length: 7 }, (_, index) => ({ sys: { id: index + 11 } })),
      total: 17
    })
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    skipContent: true,
    skipWebhooks: true,
    skipRoles: true
  })
    .run({
      data: {}
    })
    .then(() => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(2)
      expect(mockEnvironment.getLocales.mock.calls[0][0].limit).toBe(1000)
      expect(mockEnvironment.getLocales.mock.calls[0][0].skip).toBe(0)
      expect(mockEnvironment.getLocales.mock.calls[1][0].limit).toBe(1000)
      expect(mockEnvironment.getLocales.mock.calls[1][0].skip).toBe(10)
    })
})

test('halts fetching when no items in page', () => {
  // overwrite getLocales so it returns an empty page
  mockEnvironment.getLocales = jest.fn()
    .mockResolvedValueOnce({
      items: [],
      total: 20
    })
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    skipContent: true,
    skipWebhooks: true,
    skipRoles: true
  })
    .run({
      data: {}
    })
    .then(() => {
      expect(mockClient.getSpace.mock.calls).toHaveLength(1)
      expect(mockSpace.getEnvironment.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getLocales.mock.calls).toHaveLength(1)
      expect(mockEnvironment.getLocales.mock.calls[0][0].limit).toBe(1000)
      expect(mockEnvironment.getLocales.mock.calls[0][0].skip).toBe(0)
    })
})
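The paging tests above pin down the behaviour of `pagedGet` in `lib/tasks/get-space-data.js`: pages are capped at 1000 items, `skip` advances by the number of items actually returned, and fetching halts on an empty page. A minimal sketch of such a loop, consistent with those assertions; `fetchAll`, `fetchPage`, and `PAGE_SIZE` are illustrative names, not the module's real API:

```javascript
// Hypothetical re-implementation of the paging loop these tests describe.
// PAGE_SIZE mirrors MAX_ALLOWED_LIMIT (1000) from get-space-data.js.
const PAGE_SIZE = 1000

async function fetchAll (fetchPage, userLimit) {
  const items = []
  let skip = 0
  while (true) {
    const remaining = userLimit != null ? userLimit - items.length : Infinity
    if (remaining <= 0) break // user-supplied limit reached
    const limit = Math.min(PAGE_SIZE, remaining)
    const page = await fetchPage({ skip, limit })
    if (page.items.length === 0) break // halt when a page comes back empty
    items.push(...page.items)
    skip += page.items.length // only skip items actually fetched
    if (items.length >= page.total) break
  }
  return items
}
```

With `userLimit = 1001` this issues two requests, `{ skip: 0, limit: 1000 }` then `{ skip: 1000, limit: 1 }`, matching the asserted call arguments above.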

test('Strips tags from entries and assets', () => {
  return getSpaceData({
    client: mockClient,
    spaceId: 'spaceid',
    maxAllowedLimit,
    stripTags: true
  })
    .run({
      data: {}
    })
    .then((response) => {
      expect(response.data.entries).toHaveLength(resultItemCount / 2)
      const hasAssetsWithTags = response.data.assets.some(asset => asset.metadata?.tags?.length > 0)
      expect(hasAssetsWithTags).toBe(false)
      const hasEntryWithTags = response.data.entries.some(entry => entry.metadata?.tags?.length > 0)
      expect(hasEntryWithTags).toBe(false)
    })
})
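The assertions above only require that no exported entry or asset ends up with a non-empty `metadata.tags` array. A hedged sketch of tag stripping that satisfies them; whether the real `removeTags` empties or deletes the array is not visible from this test, and `stripTagsFromItems` is an illustrative name:

```javascript
// Illustrative sketch of the tag stripping asserted above: when stripTags
// is set, any metadata.tags array on an item is emptied; items without
// metadata are passed through untouched.
function stripTagsFromItems (items) {
  return items.map((item) => {
    if (item.metadata && Array.isArray(item.metadata.tags)) {
      return { ...item, metadata: { ...item.metadata, tags: [] } }
    }
    return item
  })
}
```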


================================================
FILE: test/unit/tasks/init-client.test.js
================================================
import initClient from '../../../lib/tasks/init-client'

import contentfulManagement from 'contentful-management'
import contentful from 'contentful'
import { logEmitter } from 'contentful-batch-libs'

jest.mock('contentful-management', () => {
  return {
    createClient: jest.fn(() => 'cmaClient')
  }
})

jest.mock('contentful', () => {
  return {
    createClient: jest.fn(() => 'cdaClient')
  }
})

jest.mock('contentful-batch-libs', () => {
  return {
    logEmitter: {
      emit: jest.fn()
    }
  }
})

test('does create clients and passes custom logHandler', () => {
  const opts = {
    httpAgent: 'httpAgent',
    httpsAgent: 'httpsAgent',
    application: 'application',
    headers: 'headers',
    host: 'host',
    insecure: 'insecure',
    integration: 'integration',
    port: 'port',
    proxy: 'proxy',
    accessToken: 'accessToken',
    spaceId: 'spaceId'
  }

  initClient(opts)

  expect(contentfulManagement.createClient.mock.calls[0][0]).toMatchObject({
    accessToken: opts.accessToken,
    host: opts.host,
    port: opts.port,
    headers: opts.headers,
    insecure: opts.insecure,
    proxy: opts.proxy,
    httpAgent: opts.httpAgent,
    httpsAgent: opts.httpsAgent,
    application: opts.application,
    integration: opts.integration
  })
  expect(contentfulManagement.createClient.mock.calls[0][0]).toHaveProperty('logHandler')
  expect(contentfulManagement.createClient.mock.calls[0][0].timeout).toEqual(10000)
  expect(contentfulManagement.createClient.mock.calls).toHaveLength(1)
  expect(contentful.createClient.mock.calls).toHaveLength(0)

  // Call passed log handler
  contentfulManagement.createClient.mock.calls[0][0].logHandler('level', 'logMessage')

  expect(logEmitter.emit.mock.calls[0][0]).toBe('level')
  expect(logEmitter.emit.mock.calls[0][1]).toBe('logMessage')
})

test('does create both clients when deliveryToken is set', () => {
  const opts = {
    httpAgent: 'httpAgent',
    httpsAgent: 'httpsAgent',
    application: 'application',
    headers: 'headers',
    host: 'host',
    insecure: 'insecure',
    integration: 'integration',
    port: 'port',
    proxy: 'proxy',
    accessToken: 'accessToken',
    spaceId: 'spaceId',
    deliveryToken: 'deliveryToken',
    hostDelivery: 'hostDelivery'
  }

  initClient(opts, true)

  expect(contentfulManagement.createClient.mock.calls[0][0]).toMatchObject({
    accessToken: opts.accessToken,
    host: opts.host,
    port: opts.port,
    headers: opts.headers,
    insecure: opts.insecure,
    proxy: opts.proxy,
    httpAgent: opts.httpAgent,
    httpsAgent: opts.httpsAgent,
    application: opts.application,
    integration: opts.integration
  })
  expect(contentful.createClient.mock.calls[0][0]).toMatchObject({
    space: opts.spaceId,
    accessToken: opts.deliveryToken,
    host: opts.hostDelivery,
    port: opts.port,
    headers: opts.headers,
    insecure: opts.insecure,
    proxy: opts.proxy,
    httpAgent: opts.httpAgent,
    httpsAgent: opts.httpsAgent,
    application: opts.application,
    integration: opts.integration
  })
  expect(contentfulManagement.createClient.mock.calls).toHaveLength(1)
  expect(contentful.createClient.mock.calls).toHaveLength(1)
})
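Taken together, these two tests describe the contract of `initClient`: always create a CMA client (with a 10-second timeout and a `logHandler` that forwards to `logEmitter`), and create a CDA client only when the second argument is truthy. A sketch under those assumptions; `createCmaClient`, `createCdaClient`, and `initClientSketch` are stand-ins, not the real imports from `contentful-management`/`contentful`:

```javascript
// Hypothetical factories standing in for the mocked createClient functions.
const createCmaClient = (config) => ({ kind: 'cma', config })
const createCdaClient = (config) => ({ kind: 'cda', config })
const logEmitter = { emit: (level, message) => {} }

function initClientSketch (opts, useCda = false) {
  // Connection options shared by both clients, as asserted above.
  const shared = {
    port: opts.port,
    headers: opts.headers,
    insecure: opts.insecure,
    proxy: opts.proxy,
    httpAgent: opts.httpAgent,
    httpsAgent: opts.httpsAgent,
    application: opts.application,
    integration: opts.integration
  }
  const cmaClient = createCmaClient({
    ...shared,
    accessToken: opts.accessToken,
    host: opts.host,
    timeout: 10000, // matches the 10s timeout asserted above
    logHandler: (level, data) => logEmitter.emit(level, data)
  })
  const cdaClient = useCda
    ? createCdaClient({
        ...shared,
        space: opts.spaceId,
        accessToken: opts.deliveryToken,
        host: opts.hostDelivery
      })
    : null
  return { cmaClient, cdaClient }
}
```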


================================================
FILE: test/unit/utils/embargoedAssets.test.js
================================================
import { shouldCreateNewCacheItem } from '../../../lib/utils/embargoedAssets'
const SIX_HOURS_IN_MS = 6 * 60 * 60 * 1000

test('only returns true for expiry time difference greater than 6 hours', () => {
  expect(shouldCreateNewCacheItem({ expiresAtMs: 1 }, SIX_HOURS_IN_MS - 2)).toBe(false)
  expect(shouldCreateNewCacheItem({ expiresAtMs: 1 }, SIX_HOURS_IN_MS + 2)).toBe(true)
})
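The two assertions bracket the boundary they probe: a new cache item is warranted once the requested expiry lies more than six hours past the cached one. A one-line sketch consistent with them; the real `shouldCreateNewCacheItem` in `lib/utils/embargoedAssets.js` may differ in details, and the parameter names here are illustrative:

```javascript
const SIX_HOURS_IN_MS = 6 * 60 * 60 * 1000

// Sketch consistent with the assertions above: replace the cached asset
// key when the new expiry is more than six hours past the cached one.
function shouldCreateNewCacheItemSketch (cachedItem, newExpiresAtMs) {
  return newExpiresAtMs - cachedItem.expiresAtMs > SIX_HOURS_IN_MS
}
```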


================================================
FILE: test/unit/utils/headers.test.js
================================================
import { getHeadersConfig } from '../../../lib/utils/headers'

test('getHeadersConfig returns empty object when value is undefined', () => {
  expect(getHeadersConfig(undefined)).toEqual({})
})

test('getHeadersConfig accepts single or multiple values', () => {
  expect(getHeadersConfig('Accept: Any')).toEqual({ Accept: 'Any' })
  expect(getHeadersConfig(['Accept: Any', 'X-Version: 1'])).toEqual({
    Accept: 'Any',
    'X-Version': '1'
  })
})

test('getHeadersConfig ignores invalid headers', () => {
  expect(
    getHeadersConfig(['Accept: Any', 'X-Version: 1', 'invalid'])
  ).toEqual({
    Accept: 'Any',
    'X-Version': '1'
  })
})

test('getHeadersConfig trims spacing around keys & values', () => {
  expect(
    getHeadersConfig([
      '  Accept:   Any   ',
      '   X-Version   :1 ',
      'invalid'
    ])
  ).toEqual({
    Accept: 'Any',
    'X-Version': '1'
  })
})
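Together, these tests specify `getHeadersConfig` precisely: accept a single `"Key: Value"` string or an array of them, split each entry on the first colon, trim both sides, and silently drop entries without a colon. An illustrative re-implementation matching that behaviour; `parseHeaders` is a hypothetical name, not the exported function:

```javascript
// Re-implementation of the header parsing pinned down by the tests above.
function parseHeaders (value) {
  if (value === undefined) return {}
  const entries = Array.isArray(value) ? value : [value]
  return entries.reduce((headers, entry) => {
    const separatorIndex = entry.indexOf(':')
    if (separatorIndex === -1) return headers // ignore invalid headers
    const key = entry.slice(0, separatorIndex).trim()
    headers[key] = entry.slice(separatorIndex + 1).trim()
    return headers
  }, {})
}
```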


================================================
FILE: tsconfig.json
================================================
{
  "compilerOptions": {
    "target": "ESNext",
    "module": "commonjs",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": false,
    "strict": false,
    "skipLibCheck": true,
    "allowJs": true,
    "checkJs": true,
    "resolveJsonModule": true,
    "noEmit": true
  },
  "include": ["./lib", "./bin"]
}


================================================
FILE: types.d.ts
================================================
export interface Options {
  managementToken: string;
  spaceId: string;
  contentFile?: string;
  contentOnly?: boolean;
  deliveryToken?: string;
  downloadAssets?: boolean;
  environmentId?: string;
  errorLogFile?: string;
  exportDir?: string;
  headers?: string[];
  host?: string;
  includeArchived?: boolean;
  includeDrafts?: boolean;
  limit?: number;
  managementApplication?: string;
  managementFeature?: string;
  maxAllowedLimit?: number;
  proxy?: string;
  queryEntries?: string[];
  queryAssets?: string[];
  rawProxy?: boolean;
  saveFile?: boolean;
  skipContent?: boolean;
  skipContentModel?: boolean;
  skipEditorInterfaces?: boolean;
  skipRoles?: boolean;
  skipWebhooks?: boolean;
  skipTags?: boolean;
  useVerboseRenderer?: boolean;
}

type ContentfulExportField = 'contentTypes' | 'entries' | 'assets' | 'locales' | 'tags' | 'webhooks' | 'roles' | 'editorInterfaces';

declare const runContentfulExport: (params: Options) => Promise<Record<ContentfulExportField, unknown[]>>
export default runContentfulExport
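The `Options` interface above marks only `managementToken` and `spaceId` as required; every other field is optional. A small runtime guard mirroring that shape; `validateOptions` is a hypothetical helper for illustration, not part of the package:

```javascript
// Minimal runtime check mirroring the Options interface: only
// managementToken and spaceId are required non-empty strings.
function validateOptions (options) {
  const missing = ['managementToken', 'spaceId'].filter(
    (key) => typeof options[key] !== 'string' || options[key].length === 0
  )
  if (missing.length > 0) {
    throw new Error(`Missing required option(s): ${missing.join(', ')}`)
  }
  return options
}
```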
SYMBOL INDEX (52 symbols across 11 files)

FILE: lib/index.js
  function createListrOptions (line 34) | function createListrOptions (options) {
  function runContentfulExport (line 46) | function runContentfulExport (params) {

FILE: lib/parseOptions.js
  function parseOptions (line 9) | function parseOptions (params) {

FILE: lib/tasks/download-assets.js
  function downloadAsset (line 19) | async function downloadAsset ({ url, directory, httpClient }) {
  function downloadAssets (line 53) | function downloadAssets (options) {

FILE: lib/tasks/get-space-data.js
  constant MAX_ALLOWED_LIMIT (line 6) | const MAX_ALLOWED_LIMIT = 1000
  function getFullSourceSpace (line 13) | function getFullSourceSpace ({
  function getEditorInterfaces (line 160) | function getEditorInterfaces (contentTypes) {
  function pagedGet (line 183) | function pagedGet ({ source, method, skip = 0, aggregatedResponse = null...
  function logPagingStatus (line 222) | function logPagingStatus (response, requestQuery, userLimit) {
  function extractItems (line 234) | function extractItems (response) {
  function filterDrafts (line 238) | function filterDrafts (items, includeDrafts, cdaClient) {
  function filterArchived (line 243) | function filterArchived (items, includeArchived) {
  function removeTags (line 247) | function removeTags (items, stripTags) {

FILE: lib/tasks/init-client.js
  function logHandler (line 5) | function logHandler (level, data) {
  function initClient (line 9) | function initClient (opts, useCda = false) {

FILE: lib/utils/embargoedAssets.js
  constant SIX_HOURS_IN_MS (line 3) | const SIX_HOURS_IN_MS = 6 * 60 * 60 * 1000
  function createAssetKey (line 14) | function createAssetKey (host, accessToken, spaceId, environmentId, expi...
  function createCachedAssetKey (line 30) | async function createCachedAssetKey (host, accessToken, spaceId, environ...
  function generateSignedToken (line 59) | function generateSignedToken (secret, urlWithoutQueryParams, expiresAtMs) {
  function generateSignedUrl (line 68) | function generateSignedUrl (policy, secret, url, expiresAtMs) {
  function isEmbargoedAsset (line 80) | function isEmbargoedAsset (url) {
  function calculateExpiryTimestamp (line 85) | function calculateExpiryTimestamp () {
  function signUrl (line 98) | function signUrl (host, accessToken, spaceId, environmentId, url, expire...

FILE: lib/utils/headers.js
  function getHeadersConfig (line 15) | function getHeadersConfig (value) {

FILE: test/unit/tasks/download-assets.test.js
  constant BASE_PATH (line 11) | const BASE_PATH = '//images.contentful.com'
  constant BASE_PATH_SECURE (line 12) | const BASE_PATH_SECURE = '//images.secure.contentful.com'
  constant ASSET_PATH (line 13) | const ASSET_PATH = '/kq9lln4hyr8s/2MTd2wBirYikEYkIIc0YSw/7aa4c06f3054996...
  constant EXISTING_ASSET_FILENAME (line 14) | const EXISTING_ASSET_FILENAME = 'rocka-nutrition.png'
  constant EXISTING_ASSET_URL (line 15) | const EXISTING_ASSET_URL = `${ASSET_PATH}/${EXISTING_ASSET_FILENAME}`
  constant EMBARGOED_ASSET_FILENAME (line 16) | const EMBARGOED_ASSET_FILENAME = 'space-dog.png'
  constant EMBARGOED_ASSET_URL (line 17) | const EMBARGOED_ASSET_URL = `${ASSET_PATH}/${EMBARGOED_ASSET_FILENAME}`
  constant NON_EXISTING_URL (line 18) | const NON_EXISTING_URL = '/does-not-exist.png'
  constant UNICODE_SHORT_FILENAME (line 19) | const UNICODE_SHORT_FILENAME = '测试文件.jpg'
  constant UNICODE_SHORT_URL (line 20) | const UNICODE_SHORT_URL = `${ASSET_PATH}/${encodeURIComponent(UNICODE_SH...
  constant UNICODE_LONG_FILENAME (line 21) | const UNICODE_LONG_FILENAME = `${'测试文件'.repeat(10)}.jpg`
  constant UNICODE_LONG_URL (line 22) | const UNICODE_LONG_URL = `${ASSET_PATH}/${encodeURIComponent(UNICODE_LON...
  constant DIFFERENT_FILENAME (line 23) | const DIFFERENT_FILENAME = 'different filename.jpg'
  constant UPLOAD_URL (line 24) | const UPLOAD_URL = '//file-stack-url-do-not-use-me.png'
  constant API_HOST (line 26) | const API_HOST = 'api.contentful.com'
  constant SPACE_ID (line 27) | const SPACE_ID = 'kq9lln4hyr8s'
  constant ACCESS_TOKEN (line 28) | const ACCESS_TOKEN = 'abc'
  constant ENVIRONMENT_ID (line 29) | const ENVIRONMENT_ID = 'master'
  constant POLICY (line 30) | const POLICY = 'eyJhbG.eyJMDIyfQ.SflKx5c'
  constant SECRET (line 31) | const SECRET = 's3cr3t'
  function getAssets (line 70) | function getAssets ({ existing = 0, nonExisting = 0, missingUrl = 0, emb...

FILE: test/unit/tasks/get-space-data.test.js
  function pagedResult (line 6) | function pagedResult (query, maxItems, mock = {}) {
  function pagedContentResult (line 18) | function pagedContentResult (query, maxItems, mock = {}) {
  function setupMocks (line 39) | function setupMocks () {

FILE: test/unit/utils/embargoedAssets.test.js
  constant SIX_HOURS_IN_MS (line 2) | const SIX_HOURS_IN_MS = 6 * 60 * 60 * 1000

FILE: types.d.ts
  type Options (line 1) | interface Options {
  type ContentfulExportField (line 33) | type ContentfulExportField = 'contentTypes' | 'entries' | 'assets' | 'lo...

About this extraction

This page contains the full source code of the contentful/contentful-export GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 60 files (157.2 KB), approximately 40.8k tokens, and a symbol index with 52 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.
