Repository: voidzero-dev/vite-plus
Branch: main
Commit: a5b2e1f2b425
Files: 1701
Total size: 4.3 MB

Directory structure:
gitextract_0d0lq36w/
├── .cargo/
│   └── config.toml
├── .claude/
│   ├── agents/
│   │   ├── cargo-workspace-merger.md
│   │   └── monorepo-architect.md
│   └── skills/
│       ├── add-ecosystem-ci/
│       │   └── SKILL.md
│       ├── bump-vite-task/
│       │   └── SKILL.md
│       ├── spawn-process/
│       │   └── SKILL.md
│       └── sync-tsdown-cli/
│           └── SKILL.md
├── .clippy.toml
├── .devcontainer/
│   └── devcontainer.json
├── .gitattributes
├── .github/
│   ├── ISSUE_TEMPLATE/
│   │   ├── bug_report.yml
│   │   ├── config.yml
│   │   ├── docs.yml
│   │   └── feature_request.yml
│   ├── actions/
│   │   ├── build-upstream/
│   │   │   └── action.yml
│   │   ├── clone/
│   │   │   └── action.yml
│   │   ├── download-rolldown-binaries/
│   │   │   └── action.yml
│   │   └── set-snapshot-version/
│   │       ├── action.yml
│   │       ├── compute-version.mjs
│   │       └── package.json
│   ├── renovate.json
│   ├── scripts/
│   │   └── upgrade-deps.mjs
│   └── workflows/
│       ├── ci.yml
│       ├── claude.yml
│       ├── cleanup-cache.yml
│       ├── deny.yml
│       ├── e2e-test.yml
│       ├── issue-close-require.yml
│       ├── issue-labeled.yml
│       ├── release.yml
│       ├── test-standalone-install.yml
│       ├── upgrade-deps.yml
│       └── zizmor.yml
├── .gitignore
├── .husky/
│   └── pre-commit
├── .node-version
├── .rustfmt.toml
├── .typos.toml
├── .vscode/
│   ├── extensions.json
│   └── settings.json
├── CLAUDE.md
├── CONTRIBUTING.md
├── Cargo.toml
├── LICENSE
├── README.md
├── bench/
│   ├── .gitignore
│   ├── Cargo.toml
│   ├── benches/
│   │   └── workspace_load.rs
│   ├── fixtures/
│   │   └── monorepo/
│   │       ├── package.json
│   │       ├── pnpm-workspace.yaml
│   │       └── vite-plus.json
│   ├── generate-monorepo.ts
│   ├── package.json
│   └── tsconfig.json
├── crates/
│   ├── vite_command/
│   │   ├── Cargo.toml
│   │   └── src/
│   │       └── lib.rs
│   ├── vite_error/
│   │   ├── Cargo.toml
│   │   └── src/
│   │       └── lib.rs
│   ├── vite_global_cli/
│   │   ├── Cargo.toml
│   │   └── src/
│   │       ├── cli.rs
│   │       ├── command_picker.rs
│   │       ├── commands/
│   │       │   ├── add.rs
│   │       │   ├── config.rs
│   │       │   ├── create.rs
│   │       │   ├── dedupe.rs
│   │       │   ├── delegate.rs
│   │       │   ├── dlx.rs
│   │       │   ├── env/
│   │       │   │   ├── bin_config.rs
│   │       │   │   ├── config.rs
│   │       │   │   ├── current.rs
│   │       │   │   ├── default.rs
│   │       │   │   ├── doctor.rs
│   │       │   │   ├── exec.rs
│   │       │   │   ├── global_install.rs
│   │       │   │   ├── list.rs
│   │       │   │   ├── list_remote.rs
│   │       │   │   ├── mod.rs
│   │       │   │   ├── off.rs
│   │       │   │   ├── on.rs
│   │       │   │   ├── package_metadata.rs
│   │       │   │   ├── packages.rs
│   │       │   │   ├── pin.rs
│   │       │   │   ├── setup.rs
│   │       │   │   ├── unpin.rs
│   │       │   │   ├── use.rs
│   │       │   │   └── which.rs
│   │       │   ├── implode.rs
│   │       │   ├── install.rs
│   │       │   ├── link.rs
│   │       │   ├── migrate.rs
│   │       │   ├── mod.rs
│   │       │   ├── outdated.rs
│   │       │   ├── pm.rs
│   │       │   ├── remove.rs
│   │       │   ├── run_or_delegate.rs
│   │       │   ├── staged.rs
│   │       │   ├── unlink.rs
│   │       │   ├── update.rs
│   │       │   ├── upgrade/
│   │       │   │   ├── install.rs
│   │       │   │   ├── integrity.rs
│   │       │   │   ├── mod.rs
│   │       │   │   ├── platform.rs
│   │       │   │   └── registry.rs
│   │       │   ├── version.rs
│   │       │   ├── vpx.rs
│   │       │   └── why.rs
│   │       ├── error.rs
│   │       ├── help.rs
│   │       ├── js_executor.rs
│   │       ├── main.rs
│   │       ├── shim/
│   │       │   ├── cache.rs
│   │       │   ├── dispatch.rs
│   │       │   ├── exec.rs
│   │       │   └── mod.rs
│   │       └── tips/
│   │           ├── mod.rs
│   │           ├── short_aliases.rs
│   │           └── use_vpx_or_run.rs
│   ├── vite_install/
│   │   ├── Cargo.toml
│   │   ├── README.md
│   │   └── src/
│   │       ├── commands/
│   │       │   ├── add.rs
│   │       │   ├── audit.rs
│   │       │   ├── cache.rs
│   │       │   ├── config.rs
│   │       │   ├── dedupe.rs
│   │       │   ├── deprecate.rs
│   │       │   ├── dist_tag.rs
│   │       │   ├── dlx.rs
│   │       │   ├── fund.rs
│   │       │   ├── install.rs
│   │       │   ├── link.rs
│   │       │   ├── list.rs
│   │       │   ├── login.rs
│   │       │   ├── logout.rs
│   │       │   ├── mod.rs
│   │       │   ├── outdated.rs
│   │       │   ├── owner.rs
│   │       │   ├── pack.rs
│   │       │   ├── ping.rs
│   │       │   ├── prune.rs
│   │       │   ├── publish.rs
│   │       │   ├── rebuild.rs
│   │       │   ├── remove.rs
│   │       │   ├── run.rs
│   │       │   ├── search.rs
│   │       │   ├── token.rs
│   │       │   ├── unlink.rs
│   │       │   ├── update.rs
│   │       │   ├── view.rs
│   │       │   ├── whoami.rs
│   │       │   └── why.rs
│   │       ├── config.rs
│   │       ├── lib.rs
│   │       ├── main.rs
│   │       ├── package_manager.rs
│   │       ├── request.rs
│   │       └── shim.rs
│   ├── vite_js_runtime/
│   │   ├── Cargo.toml
│   │   └── src/
│   │       ├── cache.rs
│   │       ├── dev_engines.rs
│   │       ├── download.rs
│   │       ├── error.rs
│   │       ├── lib.rs
│   │       ├── platform.rs
│   │       ├── provider.rs
│   │       ├── providers/
│   │       │   ├── mod.rs
│   │       │   └── node.rs
│   │       └── runtime.rs
│   ├── vite_migration/
│   │   ├── Cargo.toml
│   │   └── src/
│   │       ├── ast_grep.rs
│   │       ├── eslint.rs
│   │       ├── file_walker.rs
│   │       ├── import_rewriter.rs
│   │       ├── lib.rs
│   │       ├── package.rs
│   │       ├── prettier.rs
│   │       ├── script_rewrite.rs
│   │       └── vite_config.rs
│   ├── vite_shared/
│   │   ├── Cargo.toml
│   │   └── src/
│   │       ├── env_config.rs
│   │       ├── env_vars.rs
│   │       ├── header.rs
│   │       ├── home.rs
│   │       ├── lib.rs
│   │       ├── output.rs
│   │       ├── package_json.rs
│   │       ├── path_env.rs
│   │       ├── string_similarity.rs
│   │       └── tracing.rs
│   ├── vite_static_config/
│   │   ├── Cargo.toml
│   │   ├── README.md
│   │   └── src/
│   │       └── lib.rs
│   └── vite_trampoline/
│       ├── Cargo.toml
│       └── src/
│           └── main.rs
├── deny.toml
├── docs/
│   ├── .gitignore
│   ├── .vitepress/
│   │   ├── config.mts
│   │   ├── env.d.ts
│   │   ├── theme/
│   │   │   ├── Layout.vue
│   │   │   ├── assets/
│   │   │   │   └── animations/
│   │   │   │       ├── 1280_x_580_vite+_masthead.riv
│   │   │   │       ├── 253_x_268_vite+_masthead_mobile.riv
│   │   │   │       ├── 514_x_246_focus_on_shipping_v2.riv
│   │   │   │       └── 561_x_273_stay_fast_at_scale.riv
│   │   │   ├── components/
│   │   │   │   ├── Footer.vue
│   │   │   │   └── home/
│   │   │   │       ├── CoreFeature3Col.vue
│   │   │   │       ├── FeatureCheck.vue
│   │   │   │       ├── FeatureDevBuild.vue
│   │   │   │       ├── FeaturePack.vue
│   │   │   │       ├── FeatureRun.vue
│   │   │   │       ├── FeatureRunTerminal.vue
│   │   │   │       ├── FeatureTest.vue
│   │   │   │       ├── FeatureToolbar.vue
│   │   │   │       ├── Fullstack2Col.vue
│   │   │   │       ├── HeadingSection2.vue
│   │   │   │       ├── HeadingSection3.vue
│   │   │   │       ├── HeadingSection4.vue
│   │   │   │       ├── Hero.vue
│   │   │   │       ├── HeroRive.vue
│   │   │   │       ├── InstallCommand.vue
│   │   │   │       ├── PartnerLogos.vue
│   │   │   │       ├── ProductivityGrid.vue
│   │   │   │       ├── StackedBlock.vue
│   │   │   │       ├── Terminal.vue
│   │   │   │       ├── TerminalTranscript.vue
│   │   │   │       └── Testimonials.vue
│   │   │   ├── data/
│   │   │   │   ├── feature-run-transcripts.ts
│   │   │   │   ├── performance.ts
│   │   │   │   ├── terminal-transcripts.ts
│   │   │   │   └── testimonials.ts
│   │   │   ├── index.ts
│   │   │   ├── layouts/
│   │   │   │   ├── Error404.vue
│   │   │   │   └── Home.vue
│   │   │   └── styles.css
│   │   └── tsconfig.json
│   ├── config/
│   │   ├── build.md
│   │   ├── fmt.md
│   │   ├── index.md
│   │   ├── lint.md
│   │   ├── pack.md
│   │   ├── run.md
│   │   ├── staged.md
│   │   └── test.md
│   ├── guide/
│   │   ├── build.md
│   │   ├── cache.md
│   │   ├── check.md
│   │   ├── ci.md
│   │   ├── commit-hooks.md
│   │   ├── create.md
│   │   ├── dev.md
│   │   ├── env.md
│   │   ├── fmt.md
│   │   ├── ide-integration.md
│   │   ├── implode.md
│   │   ├── index.md
│   │   ├── install.md
│   │   ├── lint.md
│   │   ├── migrate.md
│   │   ├── pack.md
│   │   ├── run.md
│   │   ├── test.md
│   │   ├── troubleshooting.md
│   │   ├── upgrade.md
│   │   ├── vpx.md
│   │   └── why.md
│   ├── index.md
│   ├── package.json
│   ├── pnpm-workspace.yaml
│   └── public/
│       └── _redirects
├── ecosystem-ci/
│   ├── clone.ts
│   ├── patch-project.ts
│   ├── paths.ts
│   ├── repo.json
│   └── verify-install.ts
├── justfile
├── netlify.toml
├── package.json
├── packages/
│   ├── cli/
│   │   ├── .gitignore
│   │   ├── AGENTS.md
│   │   ├── BUNDLING.md
│   │   ├── README.md
│   │   ├── bin/
│   │   │   ├── oxfmt
│   │   │   ├── oxlint
│   │   │   └── vp
│   │   ├── binding/
│   │   │   ├── .gitignore
│   │   │   ├── Cargo.toml
│   │   │   ├── build.rs
│   │   │   ├── index.cjs
│   │   │   ├── index.d.cts
│   │   │   ├── index.d.ts
│   │   │   ├── index.js
│   │   │   └── src/
│   │   │       ├── cli.rs
│   │   │       ├── exec/
│   │   │       │   ├── args.rs
│   │   │       │   ├── mod.rs
│   │   │       │   └── workspace.rs
│   │   │       ├── lib.rs
│   │   │       ├── migration.rs
│   │   │       ├── package_manager.rs
│   │   │       └── utils.rs
│   │   ├── build.ts
│   │   ├── install.ps1
│   │   ├── install.sh
│   │   ├── package.json
│   │   ├── publish-native-addons.ts
│   │   ├── rolldown.config.ts
│   │   ├── rules/
│   │   │   ├── vite-prepare.yml
│   │   │   └── vite-tools.yml
│   │   ├── skills/
│   │   │   └── vite-plus/
│   │   │       └── SKILL.md
│   │   ├── snap-tests/
│   │   │   ├── bin-oxfmt-wrapper/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── bin-oxlint-wrapper/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── build-vite-env/
│   │   │   │   ├── index.html
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── cache-clean/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.ts
│   │   │   │   ├── steps.json
│   │   │   │   ├── subfolder/
│   │   │   │   │   └── .gitkeep
│   │   │   │   └── vite.config.ts
│   │   │   ├── cache-scripts-default/
│   │   │   │   ├── hello.mjs
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── cache-scripts-enabled/
│   │   │   │   ├── hello.mjs
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── change-passthrough-env-config/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── check-all-skipped/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── check-fail-fast/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-fix/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-fix-missing-stderr/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── check-fix-paths/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-fix-reformat/
│   │   │   │   ├── .oxlintrc.json
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-fmt-fail/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-lint-fail/
│   │   │   │   ├── .oxlintrc.json
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-lint-fail-no-typecheck/
│   │   │   │   ├── .oxlintrc.json
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── check-lint-fail-typecheck/
│   │   │   │   ├── .oxlintrc.json
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── check-lint-warn/
│   │   │   │   ├── .oxlintrc.json
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-no-fmt/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-no-lint/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-oxlint-env/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── check-pass/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   └── steps.json
│   │   │   ├── check-pass-no-typecheck/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── check-pass-typecheck/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── check-pass-typecheck-github-actions/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── cli-helper-message/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-dev-with-port/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-doc/
│   │   │   │   ├── api-examples.md
│   │   │   │   ├── index.md
│   │   │   │   ├── markdown-examples.md
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── command-exec/
│   │   │   │   ├── package.json
│   │   │   │   ├── setup-bin.js
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-exec-cwd/
│   │   │   │   ├── package.json
│   │   │   │   ├── setup.js
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-exec-monorepo/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app-a/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── app-b/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── lib-c/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-exec-monorepo-filter-v2/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app-a/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── app-b/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── lib-c/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-exec-monorepo-order/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app-mobile/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── app-web/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── cycle-a/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── cycle-b/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── cycle-c/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── cycle-d/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── cycle-e/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── lib-core/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── lib-ui/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── lib-utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-helper/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-init-inline-config/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-init-inline-config-existing/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── command-install-shortcut/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── command-pack/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   ├── hello.ts
│   │   │   │   │   └── index.ts
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── command-pack-external/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.ts
│   │   │   │   └── steps.json
│   │   │   ├── command-pack-monorepo/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── array-config/
│   │   │   │   │   │   ├── package.json
│   │   │   │   │   │   ├── src/
│   │   │   │   │   │   │   └── sub/
│   │   │   │   │   │   │       ├── hello.ts
│   │   │   │   │   │   │       └── index.ts
│   │   │   │   │   │   └── vite.config.ts
│   │   │   │   │   ├── default-config/
│   │   │   │   │   │   ├── package.json
│   │   │   │   │   │   └── src/
│   │   │   │   │   │       ├── hello.ts
│   │   │   │   │   │       └── index.ts
│   │   │   │   │   └── hello/
│   │   │   │   │       ├── package.json
│   │   │   │   │       ├── src/
│   │   │   │   │       │   ├── hello.ts
│   │   │   │   │       │   └── index.ts
│   │   │   │   │       └── vite.config.ts
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── command-pack-no-input/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-preview/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-run-with-vp-config/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-version/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-vp-alias/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── exit-code/
│   │   │   │   ├── failure.js
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── fingerprint-ignore-test/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── fmt-check-with-vite-config/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── valid.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── fmt-ignore-patterns/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   ├── ignored/
│   │   │   │   │   │   └── badly-formatted.js
│   │   │   │   │   └── valid.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── ignore_dist/
│   │   │   │   ├── .gitignore
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── lint-ignore-patterns/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   ├── ignored/
│   │   │   │   │   │   └── has-error.js
│   │   │   │   │   └── valid.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── lint-vite-config-rules/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   ├── has-console.js
│   │   │   │   │   └── valid.js
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── npm-install-with-options/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── oxlint-typeaware/
│   │   │   │   ├── .gitignore
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.ts
│   │   │   │   ├── steps.json
│   │   │   │   ├── types.ts
│   │   │   │   └── vite.config.ts
│   │   │   ├── pass-no-color-env/
│   │   │   │   ├── check.js
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── plain-terminal-ui/
│   │   │   │   ├── hello.mjs
│   │   │   │   ├── input.txt
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   ├── subfolder/
│   │   │   │   │   └── hello.mjs
│   │   │   │   └── vite.config.ts
│   │   │   ├── plain-terminal-ui-nested/
│   │   │   │   ├── .gitignore
│   │   │   │   ├── a.ts
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.ts
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── run-task-command-conflict/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── synthetic-build-cache-disabled/
│   │   │   │   ├── index.html
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── synthetic-dev-cache-disabled/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── task-config-cwd/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   ├── subfolder/
│   │   │   │   │   └── a.js
│   │   │   │   └── vite.config.ts
│   │   │   ├── test-nested-tasks/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── vite-config-task/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── vite-task-path-env-include-pm/
│   │   │   │   ├── main.js
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── vitest-browser-mode/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   ├── bar.js
│   │   │   │   │   ├── foo.js
│   │   │   │   │   └── foo.test.js
│   │   │   │   ├── steps.json
│   │   │   │   ├── vite.config.ts
│   │   │   │   └── vitest.config.ts
│   │   │   ├── vp-run-expansion/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── workspace-lint-subpackage/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   └── app-a/
│   │   │   │   │       ├── package.json
│   │   │   │   │       ├── src/
│   │   │   │   │       │   └── index.js
│   │   │   │   │       └── vite.config.ts
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── workspace-root-vite-config/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app-a/
│   │   │   │   │   │   ├── index.js
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── app-b/
│   │   │   │   │       ├── index.js
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   └── yarn-install-with-options/
│   │   │       ├── package.json
│   │   │       ├── snap.txt
│   │   │       ├── steps.json
│   │   │       └── vite.config.ts
│   │   ├── snap-tests-global/
│   │   │   ├── cli-helper-message/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-npm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-npm11/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-npm11-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-pnpm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-pnpm9/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-pnpm9-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-add-yarn4-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── admin/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-cache-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-cache-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-cache-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-check-help/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-custom-dir-hook-path/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-help/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-no-agent-writes/
│   │   │   │   ├── CLAUDE.md
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-npm10/
│   │   │   │   ├── .npmrc
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-prepare-auto-hooks/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-replace-husky-hookspath/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-update-agents/
│   │   │   │   ├── AGENTS.md
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-yarn1/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-config-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-create-help/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-dedupe-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-dedupe-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-dedupe-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-dlx-no-package-json/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-dlx-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-dlx-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-dlx-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-exec/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-exec-shim-mode/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-install-conflict/
│   │   │   │   ├── conflict-pkg/
│   │   │   │   │   ├── cli.js
│   │   │   │   │   └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-install-fail/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-install-no-arg/
│   │   │   │   ├── .node-version
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-install-no-arg-fail/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-install-node-version/
│   │   │   │   ├── command-env-install-node-version-pkg/
│   │   │   │   │   ├── cli.js
│   │   │   │   │   └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-install-version-alias/
│   │   │   │   ├── command-env-install-version-alias-pkg/
│   │   │   │   │   ├── cli.js
│   │   │   │   │   └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-use/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-env-which/
│   │   │   │   ├── .node-version
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-exec/
│   │   │   │   ├── package.json
│   │   │   │   ├── setup-bin.js
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-fmt-help/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-install-auto-create-package-json/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-install-bug-31/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-link-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-link-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-link-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-lint-help/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-list-no-package-json/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-list-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-list-npm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-list-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-list-pnpm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-list-yarn1/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-list-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-outdated-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-outdated-npm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-outdated-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-outdated-pnpm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-outdated-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-owner-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-owner-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-owner-yarn1/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-owner-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-pack-exe/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.ts
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── command-pack-exe-error/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.ts
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── command-pack-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-pack-npm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-pack-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-pack-pnpm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-pack-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-pack-yarn4-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-pm-no-package-json/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-prune-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-prune-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-prune-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-publish-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-publish-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-publish-yarn1/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-publish-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-remove-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-remove-npm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-remove-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-remove-pnpm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-remove-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-remove-yarn4-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── admin/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-run-without-vite-plus/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-staged-broken-config/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-staged-help/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-staged-no-config/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-staged-with-config/
│   │   │   │   ├── .oxlintrc.json
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.ts
│   │   │   │   ├── steps.json
│   │   │   │   └── vite.config.ts
│   │   │   ├── command-unlink-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-unlink-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-unlink-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-update-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-update-npm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-update-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-update-pnpm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-update-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-update-yarn4-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-upgrade-check/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-upgrade-rollback/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-version-no-side-effects/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-version-with-env/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-view-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-view-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-view-yarn1/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-view-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-vpx-no-package-json/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-vpx-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-why-npm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-why-npm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-why-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-why-pnpm10-with-workspace/
│   │   │   │   ├── package.json
│   │   │   │   ├── packages/
│   │   │   │   │   ├── app/
│   │   │   │   │   │   └── package.json
│   │   │   │   │   └── utils/
│   │   │   │   │       └── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── command-why-yarn4/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── create-from-monorepo-subdir/
│   │   │   │   ├── apps/
│   │   │   │   │   └── website/
│   │   │   │   │       └── package.json
│   │   │   │   ├── package.json
│   │   │   │   ├── pnpm-workspace.yaml
│   │   │   │   ├── scripts/
│   │   │   │   │   └── helper/
│   │   │   │   │       └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── create-from-nonworkspace-subdir/
│   │   │   │   ├── package.json
│   │   │   │   ├── scripts/
│   │   │   │   │   └── .keep
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── create-generator-outside-monorepo/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── create-missing-typecheck/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── delegate-respects-default-node-version/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── dev-engines-runtime-pnpm10/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── env-install-binary-conflict/
│   │   │   │   ├── .node-version
│   │   │   │   ├── env-binary-conflict-pkg-a/
│   │   │   │   │   ├── cli.js
│   │   │   │   │   └── package.json
│   │   │   │   ├── env-binary-conflict-pkg-b/
│   │   │   │   │   ├── cli.js
│   │   │   │   │   └── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── fallback-all-invalid-to-user-default/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── fallback-invalid-engines-to-dev-engines/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── global-cli-fallback/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-add-git-hooks/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-agent-claude/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── src/
│   │   │   │   │   └── index.ts
│   │   │   │   └── steps.json
│   │   │   ├── migration-already-vite-plus/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-already-vite-plus-with-husky-hookspath/
│   │   │   │   ├── .husky/
│   │   │   │   │   └── pre-commit
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-already-vite-plus-with-husky-lint-staged/
│   │   │   │   ├── .husky/
│   │   │   │   │   └── pre-commit
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-auto-create-vite-config/
│   │   │   │   ├── .oxfmtrc.json
│   │   │   │   ├── .oxlintrc.json
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-baseurl-tsconfig/
│   │   │   │   ├── .oxlintrc.json
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   ├── steps.json
│   │   │   │   └── tsconfig.json
│   │   │   ├── migration-chained-lint-staged-pre-commit/
│   │   │   │   ├── .husky/
│   │   │   │   │   └── pre-commit
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-check/
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-composed-husky-custom-dir/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-composed-husky-prepare/
│   │   │   │   ├── package.json
│   │   │   │   ├── snap.txt
│   │   │   │   └── steps.json
│   │   │   ├── migration-env-prefix-lint-staged/
│   │   │   │   ├── .husky/
│   │   │   │   │   └── pre-commit
│   │   │   │   ├── package.json
│   │   │   │   ├──
snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint/ │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-legacy/ │ │ │ │ ├── .eslintrc │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-legacy-already-vite-plus/ │ │ │ │ ├── .eslintrc │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-lint-staged/ │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-lint-staged-mjs/ │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── lint-staged.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-lintstagedrc/ │ │ │ │ ├── .lintstagedrc.json │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-monorepo/ │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ ├── app/ │ │ │ │ │ │ └── package.json │ │ │ │ │ └── utils/ │ │ │ │ │ └── package.json │ │ │ │ ├── pnpm-workspace.yaml │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-monorepo-package-only/ │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ └── app/ │ │ │ │ │ ├── eslint.config.mjs │ │ │ │ │ └── package.json │ │ │ │ ├── pnpm-workspace.yaml │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-npx-wrapper/ │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-rerun/ │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-rerun-dual-config/ │ │ │ │ ├── .eslintrc │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-eslint-rerun-mjs/ │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── 
snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.mjs │ │ │ ├── migration-existing-husky/ │ │ │ │ ├── .husky/ │ │ │ │ │ └── pre-commit │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-existing-husky-lint-staged/ │ │ │ │ ├── .husky/ │ │ │ │ │ └── pre-commit │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-existing-husky-v8-hooks/ │ │ │ │ ├── .husky/ │ │ │ │ │ └── pre-commit │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-existing-husky-v8-multi-hooks/ │ │ │ │ ├── .husky/ │ │ │ │ │ ├── commit-msg │ │ │ │ │ └── pre-commit │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-existing-lint-staged-config/ │ │ │ │ ├── .lintstagedrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-existing-pnpm-exec-lint-staged/ │ │ │ │ ├── .husky/ │ │ │ │ │ └── pre-commit │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-existing-pre-commit/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-existing-prepare-script/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-from-tsdown/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── src/ │ │ │ │ │ └── index.ts │ │ │ │ ├── steps.json │ │ │ │ └── tsdown.config.ts │ │ │ ├── migration-from-tsdown-json-config/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── src/ │ │ │ │ │ └── index.ts │ │ │ │ ├── steps.json │ │ │ │ ├── tsdown.config.json │ │ │ │ └── vite.config.ts │ │ │ ├── migration-from-vitest-config/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vitest.config.ts │ │ │ ├── migration-from-vitest-files/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── test/ │ │ │ │ └── hello.ts │ │ │ ├── migration-hooks-skip-on-existing-hookspath/ │ │ │ │ ├── 
package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-husky-env-skip/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-husky-or-prepare/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-husky-semicolon-prepare/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-husky-v8-preserves-lint-staged/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-lint-staged-in-scripts/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-lint-staged-merge-fail/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.ts │ │ │ ├── migration-lint-staged-ts-config/ │ │ │ │ ├── lint-staged.config.ts │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-lintstagedrc-json/ │ │ │ │ ├── .lintstagedrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-lintstagedrc-merge-fail/ │ │ │ │ ├── .lintstagedrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.ts │ │ │ ├── migration-lintstagedrc-not-support/ │ │ │ │ ├── .lintstagedrc │ │ │ │ ├── .lintstagedrc.yaml │ │ │ │ ├── lint-staged.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-lintstagedrc-staged-exists/ │ │ │ │ ├── .lintstagedrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.ts │ │ │ ├── migration-merge-vite-config-js/ │ │ │ │ ├── .oxlintrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.js │ │ │ ├── migration-merge-vite-config-ts/ │ │ │ │ ├── .oxfmtrc.json │ │ │ │ ├── .oxlintrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.ts │ │ │ ├── 
migration-monorepo-husky-v8-preserves-lint-staged/ │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ └── app/ │ │ │ │ │ └── package.json │ │ │ │ ├── pnpm-workspace.yaml │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-monorepo-pnpm/ │ │ │ │ ├── .oxfmtrc.json │ │ │ │ ├── .oxlintrc.json │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ ├── app/ │ │ │ │ │ │ └── package.json │ │ │ │ │ ├── only-oxlint/ │ │ │ │ │ │ ├── .oxlintrc.json │ │ │ │ │ │ └── package.json │ │ │ │ │ └── utils/ │ │ │ │ │ └── package.json │ │ │ │ ├── pnpm-workspace.yaml │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.ts │ │ │ ├── migration-monorepo-pnpm-overrides-dependency-selector/ │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ └── app/ │ │ │ │ │ └── package.json │ │ │ │ ├── pnpm-workspace.yaml │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.ts │ │ │ ├── migration-monorepo-skip-vite-peer-dependency/ │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ └── vite-plugin/ │ │ │ │ │ ├── package.json │ │ │ │ │ └── src/ │ │ │ │ │ └── index.ts │ │ │ │ ├── pnpm-workspace.yaml │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-monorepo-yarn4/ │ │ │ │ ├── .oxlintrc.json │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ ├── app/ │ │ │ │ │ │ └── package.json │ │ │ │ │ └── utils/ │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── steps.json │ │ │ │ └── vite.config.ts │ │ │ ├── migration-no-agent/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── src/ │ │ │ │ │ └── index.ts │ │ │ │ └── steps.json │ │ │ ├── migration-no-git-repo/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-no-hooks/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-no-hooks-with-husky/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-not-supported-npm8.2/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ 
└── steps.json │ │ │ ├── migration-not-supported-pnpm9.4/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-not-supported-vite6/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-not-supported-vitest3/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-other-hook-tool/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-partially-migrated-pre-commit/ │ │ │ │ ├── .husky/ │ │ │ │ │ └── pre-commit │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-pre-commit-env-setup/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-prettier/ │ │ │ │ ├── .prettierrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-prettier-eslint-combo/ │ │ │ │ ├── .prettierrc.json │ │ │ │ ├── eslint.config.mjs │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-prettier-ignore-unknown/ │ │ │ │ ├── .prettierrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-prettier-lint-staged/ │ │ │ │ ├── .prettierrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-prettier-pkg-json/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-prettier-rerun/ │ │ │ │ ├── .prettierrc.json │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-rewrite-declare-module/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── src/ │ │ │ │ │ └── index.ts │ │ │ │ └── steps.json │ │ │ ├── migration-rewrite-reference-types/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── src/ │ │ │ │ │ └── env.d.ts │ │ │ │ └── steps.json │ │ │ ├── migration-skip-vite-dependency/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── src/ │ │ │ │ │ └── index.ts │ │ │ │ 
└── steps.json │ │ │ ├── migration-skip-vite-peer-dependency/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ ├── src/ │ │ │ │ │ └── index.ts │ │ │ │ └── steps.json │ │ │ ├── migration-standalone-npm/ │ │ │ │ ├── .gitignore │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-subpath/ │ │ │ │ ├── foo/ │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── migration-vite-version/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── new-check/ │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── new-create-vite/ │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── new-create-vite-directory-dot/ │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── new-create-vite-with-scope-name/ │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── new-vite-monorepo/ │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-install-already-linked/ │ │ │ │ ├── .node-version │ │ │ │ ├── npm-global-linked-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-install-custom-prefix/ │ │ │ │ ├── npm-global-custom-prefix-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-install-custom-prefix-on-path/ │ │ │ │ ├── npm-global-on-path-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-install-dot/ │ │ │ │ ├── npm-global-dot-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-install-hint/ │ │ │ │ ├── npm-global-hint-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-uninstall-link-cleanup/ │ │ │ │ ├── npm-global-uninstall-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-uninstall-preexisting-binary/ 
│ │ │ │ ├── npm-global-preexist-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-uninstall-prefix/ │ │ │ │ ├── npm-global-prefix-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-uninstall-shared-bin-name/ │ │ │ │ ├── pkg-a/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── pkg-b/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── npm-global-uninstall-vp-managed/ │ │ │ │ ├── npm-global-vp-managed-pkg/ │ │ │ │ │ ├── cli.js │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── shim-inherits-parent-dev-engines-runtime/ │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ └── app/ │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── shim-inherits-parent-engines-node/ │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ └── app/ │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── shim-inherits-parent-node-version/ │ │ │ │ ├── .node-version │ │ │ │ ├── package.json │ │ │ │ ├── packages/ │ │ │ │ │ └── app/ │ │ │ │ │ └── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── shim-pnpm-uses-project-node-version/ │ │ │ │ ├── .node-version │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── shim-recursive-npm-run/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ └── shim-recursive-package-binary/ │ │ │ ├── .node-version │ │ │ ├── recursive-cli-pkg/ │ │ │ │ ├── cli.js │ │ │ │ └── package.json │ │ │ ├── snap.txt │ │ │ └── steps.json │ │ ├── snap-tests-todo/ │ │ │ ├── command-pack-watch-restart/ │ │ │ │ ├── kill-watch.sh │ │ │ │ ├── package.json │ │ │ │ ├── run-watch.sh │ │ │ │ ├── snap.txt │ │ │ │ ├── src/ │ │ │ │ │ └── index.ts │ │ │ │ ├── steps.json │ │ │ │ ├── vite.config.ts │ │ │ │ ├── wait-for-dist.sh │ │ │ │ └── 
wait-for-dist2.sh │ │ │ ├── exit-non-zero-on-cmd-not-exists/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ ├── pnpm-install-with-options/ │ │ │ │ ├── package.json │ │ │ │ ├── snap.txt │ │ │ │ └── steps.json │ │ │ └── test-panicked-fix/ │ │ │ ├── package.json │ │ │ ├── snap.txt │ │ │ └── steps.json │ │ ├── src/ │ │ │ ├── __tests__/ │ │ │ │ ├── index.spec.ts │ │ │ │ ├── init-config.spec.ts │ │ │ │ ├── pack.spec.ts │ │ │ │ └── resolve-vite-config.spec.ts │ │ │ ├── bin.ts │ │ │ ├── config/ │ │ │ │ ├── __tests__/ │ │ │ │ │ └── hooks.spec.ts │ │ │ │ ├── bin.ts │ │ │ │ └── hooks.ts │ │ │ ├── create/ │ │ │ │ ├── __tests__/ │ │ │ │ │ ├── __snapshots__/ │ │ │ │ │ │ └── utils.spec.ts.snap │ │ │ │ │ ├── discovery.spec.ts │ │ │ │ │ ├── initial-template-options.spec.ts │ │ │ │ │ ├── prompts.spec.ts │ │ │ │ │ └── utils.spec.ts │ │ │ │ ├── bin.ts │ │ │ │ ├── command.ts │ │ │ │ ├── discovery.ts │ │ │ │ ├── initial-template-options.ts │ │ │ │ ├── prompts.ts │ │ │ │ ├── random-name.ts │ │ │ │ ├── templates/ │ │ │ │ │ ├── builtin.ts │ │ │ │ │ ├── generator.ts │ │ │ │ │ ├── index.ts │ │ │ │ │ ├── monorepo.ts │ │ │ │ │ ├── remote.ts │ │ │ │ │ └── types.ts │ │ │ │ └── utils.ts │ │ │ ├── define-config.ts │ │ │ ├── index.cts │ │ │ ├── index.ts │ │ │ ├── init-config.ts │ │ │ ├── lint.ts │ │ │ ├── mcp/ │ │ │ │ └── bin.ts │ │ │ ├── migration/ │ │ │ │ ├── __tests__/ │ │ │ │ │ ├── __snapshots__/ │ │ │ │ │ │ └── migrator.spec.ts.snap │ │ │ │ │ ├── compat.spec.ts │ │ │ │ │ └── migrator.spec.ts │ │ │ │ ├── bin.ts │ │ │ │ ├── compat.ts │ │ │ │ ├── detector.ts │ │ │ │ ├── migrator.ts │ │ │ │ └── report.ts │ │ │ ├── pack-bin.ts │ │ │ ├── pack.ts │ │ │ ├── resolve-doc.ts │ │ │ ├── resolve-fmt.ts │ │ │ ├── resolve-lint.ts │ │ │ ├── resolve-pack.ts │ │ │ ├── resolve-test.ts │ │ │ ├── resolve-vite-config.ts │ │ │ ├── resolve-vite.ts │ │ │ ├── run-config.ts │ │ │ ├── staged/ │ │ │ │ └── bin.ts │ │ │ ├── staged-config.ts │ │ │ ├── types/ │ │ │ │ ├── index.ts │ │ │ │ ├── 
package.ts │ │ │ │ └── workspace.ts │ │ │ ├── utils/ │ │ │ │ ├── __tests__/ │ │ │ │ │ ├── agent.spec.ts │ │ │ │ │ ├── editor.spec.ts │ │ │ │ │ ├── help.spec.ts │ │ │ │ │ └── package.spec.ts │ │ │ │ ├── agent.ts │ │ │ │ ├── command.ts │ │ │ │ ├── constants.ts │ │ │ │ ├── editor.ts │ │ │ │ ├── help.ts │ │ │ │ ├── json.ts │ │ │ │ ├── package.ts │ │ │ │ ├── path.ts │ │ │ │ ├── prompts.ts │ │ │ │ ├── skills.ts │ │ │ │ ├── terminal.ts │ │ │ │ ├── tsconfig.ts │ │ │ │ ├── types.ts │ │ │ │ ├── workspace.ts │ │ │ │ └── yaml.ts │ │ │ └── version.ts │ │ ├── templates/ │ │ │ ├── generator/ │ │ │ │ ├── README.md │ │ │ │ ├── bin/ │ │ │ │ │ └── index.ts │ │ │ │ ├── package.json │ │ │ │ ├── src/ │ │ │ │ │ └── template.ts │ │ │ │ └── tsconfig.json │ │ │ └── monorepo/ │ │ │ ├── README.md │ │ │ ├── _gitignore │ │ │ ├── _yarnrc.yml │ │ │ ├── package.json │ │ │ ├── pnpm-workspace.yaml │ │ │ ├── tsconfig.json │ │ │ └── vite.config.ts │ │ └── tsconfig.json │ ├── core/ │ │ ├── .gitignore │ │ ├── BUNDLING.md │ │ ├── __tests__/ │ │ │ └── build-artifacts.spec.ts │ │ ├── build-support/ │ │ │ ├── build-cjs-deps.ts │ │ │ ├── find-create-require.ts │ │ │ ├── rewrite-imports.ts │ │ │ └── rewrite-module-specifiers.ts │ │ ├── build.ts │ │ ├── package.json │ │ └── tsconfig.json │ ├── prompts/ │ │ ├── LICENSE │ │ ├── package.json │ │ ├── src/ │ │ │ ├── __tests__/ │ │ │ │ ├── __snapshots__/ │ │ │ │ │ └── render.spec.ts.snap │ │ │ │ └── render.spec.ts │ │ │ ├── autocomplete.ts │ │ │ ├── box.ts │ │ │ ├── common.ts │ │ │ ├── confirm.ts │ │ │ ├── group-multi-select.ts │ │ │ ├── group.ts │ │ │ ├── index.ts │ │ │ ├── limit-options.ts │ │ │ ├── log.ts │ │ │ ├── messages.ts │ │ │ ├── multi-select.ts │ │ │ ├── note.ts │ │ │ ├── password.ts │ │ │ ├── path.ts │ │ │ ├── progress-bar.ts │ │ │ ├── select-key.ts │ │ │ ├── select.ts │ │ │ ├── spinner.ts │ │ │ ├── stream.ts │ │ │ ├── task-log.ts │ │ │ ├── task.ts │ │ │ └── text.ts │ │ └── tsdown.config.ts │ ├── test/ │ │ ├── .gitignore │ │ ├── BUNDLING.md │ │ ├── 
__tests__/ │ │ │ └── build-artifacts.spec.ts │ │ ├── build.ts │ │ ├── package.json │ │ └── tsconfig.json │ └── tools/ │ ├── .upstream-versions.json │ ├── README.md │ ├── package.json │ ├── snap-tests/ │ │ ├── json-sort/ │ │ │ ├── array.json │ │ │ ├── snap.txt │ │ │ └── steps.json │ │ └── replace-file-content/ │ │ ├── foo/ │ │ │ └── example.toml │ │ ├── snap.txt │ │ └── steps.json │ └── src/ │ ├── __tests__/ │ │ ├── __snapshots__/ │ │ │ └── utils.spec.ts.snap │ │ └── utils.spec.ts │ ├── bin.js │ ├── brand-vite.ts │ ├── index.ts │ ├── install-global-cli.ts │ ├── json-edit.ts │ ├── json-sort.ts │ ├── merge-peer-deps.ts │ ├── replace-file-content.ts │ ├── snap-test.ts │ ├── sync-remote-deps.ts │ └── utils.ts ├── pnpm-workspace.yaml ├── rfcs/ │ ├── add-remove-package-commands.md │ ├── check-command.md │ ├── cli-output-polish.md │ ├── cli-tips.md │ ├── code-generator.md │ ├── config-and-staged-commands.md │ ├── dedupe-package-command.md │ ├── dlx-command.md │ ├── env-command.md │ ├── exec-command.md │ ├── global-cli-rust-binary.md │ ├── implode-command.md │ ├── init-editor-configs.md │ ├── install-command.md │ ├── js-runtime.md │ ├── link-unlink-package-commands.md │ ├── merge-global-and-local-cli.md │ ├── migration-command.md │ ├── outdated-package-command.md │ ├── pack-command.md │ ├── pm-command-group.md │ ├── run-without-vite-plus-dependency.md │ ├── split-global-cli.md │ ├── trampoline-exe-for-shims.md │ ├── update-package-command.md │ ├── upgrade-command.md │ ├── vpx-command.md │ └── why-package-command.md ├── rust-toolchain.toml ├── scripts/ │ └── generate-license.ts ├── tmp/ │ └── .gitignore ├── tsconfig.json └── vite.config.ts ================================================ FILE CONTENTS ================================================ ================================================ FILE: .cargo/config.toml ================================================ [env] # Required by rolldown_workspace crate - points to the rolldown subproject root WORKSPACE_DIR = { 
value = "rolldown", relative = true }

[build]
rustflags = ["--cfg", "tokio_unstable"] # also update .github/workflows/ci.yml

# fix sqlite build error on linux
[target.'cfg(target_os = "linux")']
rustflags = ["--cfg", "tokio_unstable", "-C", "link-args=-Wl,--warn-unresolved-symbols"]

# Increase stack size on Windows to avoid stack overflow
[target.'cfg(all(windows, target_env = "msvc"))']
rustflags = ["--cfg", "tokio_unstable", "-C", "link-arg=/STACK:8388608"]

[target.'cfg(all(windows, target_env = "gnu"))']
rustflags = ["--cfg", "tokio_unstable", "-C", "link-arg=-Wl,--stack,8388608"]

[unstable]
bindeps = true

[net]
git-fetch-with-cli = true # use git CLI to authenticate for vite-task git dependencies

================================================
FILE: .claude/agents/cargo-workspace-merger.md
================================================
---
name: cargo-workspace-merger
description: "Use this agent when you need to merge one Cargo workspace into another, specifically when integrating a subproject's crates and dependencies into a root workspace. This includes tasks like: adding crate path references to workspace members, merging workspace dependency definitions while avoiding duplicates, and ensuring only production dependencies (not unnecessary dev dependencies) are included.\\n\\n\\nContext: The user wants to integrate the rolldown project into their existing Cargo workspace.\\nuser: \"I need to merge the rolldown Cargo workspace into our root workspace\"\\nassistant: \"I'll use the cargo-workspace-merger agent to handle this integration. 
This involves analyzing both Cargo.toml files, identifying the crates to add, and merging the necessary dependencies.\"\\n\\n\\n\\n\\nContext: The user has cloned a Rust project as a subdirectory and wants to integrate it.\\nuser: \"Can you add all the crates from ./external-lib into our workspace?\"\\nassistant: \"I'll launch the cargo-workspace-merger agent to analyze the external library's workspace structure and merge it into your root Cargo.toml.\"\\n\\n"
model: opus
color: yellow
---

You are an expert Rust build system engineer specializing in Cargo workspace management and dependency resolution. You have deep knowledge of Cargo.toml structure, workspace inheritance, and dependency deduplication strategies.

## Your Primary Mission

Merge a child Cargo workspace (located in a subdirectory) into a parent root Cargo workspace. This involves two main tasks:

1. **Adding crate references**: Add all crates from the child workspace to the root workspace's `[workspace.dependencies]` section with proper path references.
2. **Merging workspace dependencies**: Combine the child workspace's `[workspace.dependencies]` with the root's dependencies, ensuring no duplicates and only including dependencies actually used by the crates being merged.
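A minimal sketch of the intended end state, using the rolldown child workspace from the example above (the crate names, paths, and versions here are illustrative, not taken from the actual workspaces):

```toml
[workspace.dependencies]
# Task 1: crates from the child workspace, referenced by path
rolldown_common = { path = "./rolldown/crates/rolldown_common" }
rolldown_workspace = { path = "./rolldown/crates/rolldown_workspace" }

# Task 2: third-party dependencies carried over from the child's
# [workspace.dependencies]; entries the root already defines at a
# compatible version are skipped rather than duplicated
serde = { version = "1.0", features = ["derive"] }
tokio = { version = "1", features = ["rt-multi-thread"] }
```

Member crates then pull these in with `serde = { workspace = true }` in their own `[dependencies]` tables, so version and feature choices stay centralized at the root.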
## Step-by-Step Process

### Step 1: Analyze the Child Workspace

- Read the child workspace's `Cargo.toml` (e.g., `./rolldown/Cargo.toml`)
- Identify all workspace members from the `[workspace.members]` section
- Extract all `[workspace.dependencies]` definitions

### Step 2: Identify Crates to Add

- For each workspace member, locate its `Cargo.toml`
- Extract the crate name from `[package].name`
- Build a list of path references in the format: `crate_name = { path = "./child/crates/crate_name" }`

### Step 3: Analyze Dependency Usage

- For each crate in the child workspace, read its `Cargo.toml`
- Collect all dependencies from `[dependencies]`, `[dev-dependencies]`, and `[build-dependencies]`
- Focus on dependencies that reference `workspace = true` - these need the workspace-level definition
- Create a set of actually-used workspace dependencies

### Step 4: Filter and Merge Dependencies

- From the child's `[workspace.dependencies]`, only include those that are actually used by the crates
- Check for conflicts with existing root workspace dependencies:
  - Same dependency, same version: Skip (already exists)
  - Same dependency, different version: Flag for manual resolution and suggest keeping the newer version
- Exclude dev-only dependencies that aren't needed for the merged crates

### Step 5: Update Root Cargo.toml

- Add all crate path references to `[workspace.dependencies]`
- Add filtered workspace dependencies to `[workspace.dependencies]`
- Maintain alphabetical ordering within sections for cleanliness
- Preserve any existing comments and formatting

## Output Format

Provide:

1. A summary of crates being added
2. A summary of dependencies being merged
3. Any conflicts or issues requiring manual attention
4.
The exact additions to make to the root `Cargo.toml`

## Quality Checks

- Verify all paths exist before adding references
- Ensure no duplicate entries are created
- Validate that merged dependencies don't break existing crates
- After modifications, suggest running `cargo check --workspace` to verify the merge
- Prefer the highest compatible semver version (when not pinned) and merge feature lists across crates

## Important Considerations

- Use `vite_path` types for path operations as per project conventions
- Dependencies with `path` references in the child workspace may need path adjustments
- Feature flags on dependencies must be preserved
- Optional dependencies must maintain their optional status
- If a dependency exists in both workspaces with different features, merge the feature lists

### Workspace Package Inheritance

Child crates may inherit fields from `[workspace.package]` using `field.workspace = true`. Common inherited fields include:

- `homepage`
- `repository`
- `license`
- `edition`
- `authors`
- `rust-version`

**Important**: If the child workspace's `[workspace.package]` defines fields that the root workspace does not, you must add those fields to the root workspace's `[workspace.package]` section. Otherwise, crates that inherit these fields will fail to build with errors like:

```
error inheriting `homepage` from workspace root manifest's `workspace.package.homepage`

Caused by:
  `workspace.package.homepage` was not defined
```

**Steps to handle this**:

1. Read the child workspace's `[workspace.package]` section
2. Compare with the root workspace's `[workspace.package]` section
3.
Add any missing fields to the root workspace (use the root project's own values, not the child's)

## Error Handling

- If a crate path doesn't exist, report it clearly and skip
- If Cargo.toml parsing fails, provide the specific error
- If version conflicts exist, list all conflicts before proceeding and ask for guidance

### Crates with Compile-Time Environment Variables

Some crates use `env!()` macros that require compile-time environment variables set via `.cargo/config.toml`. These crates often have `relative = true` paths that only work when building from their original workspace root.

**Example**: `rolldown_workspace` uses `env!("WORKSPACE_DIR")` which is set in `rolldown/.cargo/config.toml`.

**How to handle**:

1. Check child workspace's `.cargo/config.toml` for `[env]` section
2. If crates use these env vars with `relative = true`, copy those env vars to root `.cargo/config.toml` with paths adjusted to point to the child workspace directory
3. Example: If child has `WORKSPACE_DIR = { value = "", relative = true }`, root should have `WORKSPACE_DIR = { value = "child-dir", relative = true }`

================================================
FILE: .claude/agents/monorepo-architect.md
================================================
---
name: monorepo-architect
description: Use this agent when you need architectural guidance for monorepo tooling, particularly for reviewing code organization, module boundaries, and ensuring proper separation of concerns in Rust/Node.js projects. 
This agent should be invoked after implementing new features or refactoring existing code to validate architectural decisions and placement of functionality.\n\nExamples:\n- \n Context: The user has just implemented a new caching mechanism for the monorepo task runner.\n user: "I've added a new caching system to handle task outputs"\n assistant: "I'll use the monorepo-architect agent to review the architectural decisions and ensure the caching logic is properly placed within the module structure."\n \n Since new functionality was added, use the monorepo-architect agent to review the code architecture and module boundaries.\n \n\n- \n Context: The user is refactoring the task dependency resolution system.\n user: "I've refactored how we resolve task dependencies across packages"\n assistant: "Let me invoke the monorepo-architect agent to review the refactored code and ensure proper separation of concerns."\n \n After refactoring core functionality, use the monorepo-architect agent to validate architectural decisions.\n \n\n- \n Context: The user is adding cross-package communication features.\n user: "I've implemented a new IPC mechanism for packages to communicate during builds"\n assistant: "I'll use the monorepo-architect agent to review where this IPC logic lives and ensure it doesn't create inappropriate cross-module dependencies."\n \n When adding features that span multiple modules, use the monorepo-architect agent to prevent architectural violations.\n \n
model: opus
color: purple
---

You are a senior software architect with deep expertise in Rust and Node.js ecosystems, specializing in monorepo tooling and build systems. You have extensively studied and analyzed the architectures of nx, Turborepo, Rush, and Lage, understanding their design decisions, trade-offs, and implementation patterns.

Your primary responsibility is to review code architecture and ensure that functionality is properly organized within the codebase.
You focus on:

**Core Architectural Principles:**

- Single Responsibility: Each module, file, and function should have one clear purpose
- Separation of Concerns: Business logic, I/O operations, and configuration should be clearly separated
- Module Boundaries: Enforce clean interfaces between modules, preventing tight coupling
- Dependency Direction: Dependencies should flow in one direction, typically from high-level to low-level modules

**When reviewing code, you will:**

1. **Analyze Module Structure**: Examine where new functionality has been placed and determine whether it belongs there based on the module's responsibility. Look for code that crosses logical boundaries or mixes concerns.

2. **Identify Architectural Violations**:
   - Cross-module responsibilities where one module is doing work that belongs to another
   - Circular dependencies or bidirectional coupling
   - Business logic mixed with I/O operations
   - Configuration logic scattered across multiple modules
   - Violation of the dependency inversion principle

3. **Suggest Proper Placement**: When you identify misplaced functionality, provide specific recommendations:
   - Identify the correct module/file where the code should reside
   - Explain why the current placement violates architectural principles
   - Suggest how to refactor without breaking existing functionality
   - Consider the impact on testing and maintainability

4. **Reference Industry Standards**: Draw from your knowledge of nx, Turborepo, Rush, and Lage to:
   - Compare architectural decisions with proven patterns from these tools
   - Highlight when a different approach might be more scalable or maintainable
   - Suggest battle-tested patterns for common monorepo challenges

5. **Focus on Rust/Node.js Best Practices**:
   - In Rust: Ensure proper use of ownership, traits for abstraction, and module organization
   - In Node.js: Validate CommonJS/ESM module patterns, async patterns, and package boundaries
   - For interop: Review FFI boundaries and data serialization approaches

**Review Methodology:**

1. Start by understanding the intent of the recent changes
2. Map out the affected modules and their responsibilities
3. Identify any code that seems out of place or creates inappropriate coupling
4. Provide a prioritized list of architectural concerns (critical, important, minor)
5. For each concern, explain the principle being violated and suggest a concrete fix

**Output Format:**

Structure your review as:

- **Summary**: Brief overview of architectural health
- **Critical Issues**: Must-fix architectural violations that will cause problems
- **Recommendations**: Suggested improvements with rationale
- **Positive Patterns**: Acknowledge well-architected decisions
- **Comparison Notes**: When relevant, note how similar problems are solved in nx/Turborepo/Rush/Lage

You are pragmatic and understand that perfect architecture must be balanced with delivery speed. Focus on issues that will genuinely impact maintainability, testability, or scalability. Avoid nitpicking and recognize when 'good enough' is appropriate for the current stage of the project. When you lack context about the broader system, ask clarifying questions rather than making assumptions.

Your goal is to ensure the codebase remains maintainable and follows established architectural patterns while evolving to meet new requirements.
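The dependency-direction and dependency-inversion principles this agent enforces can be illustrated with a minimal Rust sketch. All module and type names below are invented for illustration; nothing here is from the vite-plus codebase:

```rust
// Illustrative sketch only: the high-level module owns the trait it needs,
// and the low-level module implements it, so the dependency arrow points
// from low-level code toward the high-level abstraction.
mod scheduler {
    // High-level module defines the abstraction it consumes.
    pub trait TaskCache {
        fn lookup(&self, key: &str) -> Option<String>;
    }

    pub fn run_task(cache: &dyn TaskCache, key: &str) -> String {
        match cache.lookup(key) {
            Some(output) => ["cache hit: ", output.as_str()].concat(),
            None => ["cache miss: running ", key].concat(),
        }
    }
}

mod local_cache {
    use super::scheduler::TaskCache;

    // Low-level module depends on the high-level trait, never the reverse.
    pub struct StaticCache;

    impl TaskCache for StaticCache {
        fn lookup(&self, key: &str) -> Option<String> {
            (key == "build").then(|| "dist/".to_string())
        }
    }
}

fn main() {
    let cache = local_cache::StaticCache;
    assert_eq!(scheduler::run_task(&cache, "build"), "cache hit: dist/");
    assert_eq!(scheduler::run_task(&cache, "test"), "cache miss: running test");
}
```

Because `scheduler` never names `local_cache`, swapping in a different cache implementation (or a test double) requires no change to the high-level module.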
================================================
FILE: .claude/skills/add-ecosystem-ci/SKILL.md
================================================
---
name: add-ecosystem-ci
description: Add a new ecosystem-ci test case for testing real-world projects against vite-plus
allowed-tools: Bash, Read, Edit, Write, WebFetch, AskUserQuestion
---

# Add Ecosystem-CI Test Case

Add a new ecosystem-ci test case following this process:

## Step 1: Get Repository Information

Ask the user for the GitHub repository URL if not provided as an argument: $ARGUMENTS

Use the GitHub CLI to get repository info:

```bash
gh api repos/OWNER/REPO --jq '.default_branch'
gh api repos/OWNER/REPO/commits/BRANCH --jq '.sha'
```

## Step 2: Auto-detect Project Configuration

### 2.1 Check for Subdirectory

Fetch the repository's root listing to check whether the main package.json is in a subdirectory (like `web/`, `app/`, `frontend/`).

### 2.2 Check if Project Already Uses Vite-Plus

Check the project's root `package.json` for `vite-plus` in `dependencies` or `devDependencies`. If the project already uses vite-plus, set `forceFreshMigration: true` in `repo.json`. This tells `patch-project.ts` to set `VITE_PLUS_FORCE_MIGRATE=1` so `vp migrate` forces full dependency rewriting instead of skipping with "already using Vite+".

### 2.3 Auto-detect Commands from GitHub Workflows

Fetch the project's GitHub workflow files to detect available commands:

```bash
# List workflow files
gh api repos/OWNER/REPO/contents/.github/workflows --jq '.[].name'

# Fetch workflow content (for each .yml/.yaml file)
gh api repos/OWNER/REPO/contents/.github/workflows/ci.yml --jq '.content' | base64 -d
```

Look for common patterns in the workflow files:

- `pnpm run ` / `npm run ` / `yarn `
- Commands like: `lint`, `build`, `test`, `type-check`, `typecheck`, `format`, `format:check`
- Map detected commands to `vp` equivalents: `vp run lint`, `vp run build`, etc.
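The detection step above can be sketched as follows. This is a hypothetical helper, not code from this repository; real detection would operate on the workflow YAML fetched via `gh api`:

```rust
// Hypothetical helper: scan workflow text for `pnpm run <script>`,
// `npm run <script>`, or `yarn <script>` and map each detected script
// to its `vp run <script>` equivalent.
fn detect_vp_commands(workflow: &str) -> Vec<String> {
    let mut commands: Vec<String> = Vec::new();
    for line in workflow.lines() {
        for prefix in ["pnpm run ", "npm run ", "yarn "] {
            if let Some(pos) = line.find(prefix) {
                let rest = &line[pos + prefix.len()..];
                // Take the first whitespace-delimited token as the script name.
                if let Some(script) = rest.split_whitespace().next() {
                    let cmd = format!("vp run {script}");
                    if !commands.contains(&cmd) {
                        commands.push(cmd); // de-duplicate repeated invocations
                    }
                }
            }
        }
    }
    commands
}

fn main() {
    let ci = "steps:\n  - run: pnpm run lint\n  - run: pnpm run build\n  - run: pnpm run lint";
    assert_eq!(detect_vp_commands(ci), ["vp run lint", "vp run build"]);
}
```

The de-duplicated list is what you would present to the user as multi-select options in step 2.4.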
### 2.4 Ask User to Confirm

Present the auto-detected configuration and ask the user to confirm or modify:

- Which directory contains the main package.json? (auto-detected or manual)
- What Node.js version to use? (22 or 24, try to detect from workflow)
- Which commands to run? (show detected commands as multi-select options)
- Which OS to run on? (both, ubuntu-only, windows-only) - default: both

## Step 3: Update Files

1. **Add to `ecosystem-ci/repo.json`**:

```json
{
  "project-name": {
    "repository": "https://github.com/owner/repo.git",
    "branch": "main",
    "hash": "full-commit-sha",
    "directory": "web", // only if subdirectory is needed
    "forceFreshMigration": true // only if project already uses vite-plus
  }
}
```

2. **Add to `.github/workflows/e2e-test.yml`** matrix:

```yaml
- name: project-name
  node-version: 24
  directory: web # only if subdirectory is needed
  command: |
    vp run lint
    vp run build
```

3. **Add OS exclusion to `.github/workflows/e2e-test.yml`** (if not running on both):

For ubuntu-only:

```yaml
exclude:
  - os: windows-latest
    project:
      name: project-name
```

For windows-only:

```yaml
exclude:
  - os: ubuntu-latest
    project:
      name: project-name
```

## Step 4: Verify

Test the clone locally:

```bash
node ecosystem-ci/clone.ts project-name
```

## Important Notes

- The `directory` field is optional - only add it if the package.json is not in the project root
- If `directory` is specified in repo.json, it must also be specified in the workflow matrix
- `patch-project.ts` automatically handles running `vp migrate` in the correct directory
- `forceFreshMigration` is required for projects that already have `vite-plus` in their package.json — it sets `VITE_PLUS_FORCE_MIGRATE=1` so `vp migrate` forces full dependency rewriting instead of skipping
- OS exclusions are added to the existing `exclude` section in the workflow matrix

================================================
FILE: .claude/skills/bump-vite-task/SKILL.md
================================================
---
name: bump-vite-task
description: Bump vite-task git dependency to the latest main commit. Use when you need to update the vite-task crates (fspy, vite_glob, vite_path, vite_str, vite_task, vite_workspace) in vite-plus.
allowed-tools: Read, Grep, Glob, Edit, Bash, Agent, WebFetch
---

# Bump vite-task to Latest Main

Update the vite-task git dependency in `Cargo.toml` to the latest commit on the vite-task main branch, fix any breaking changes, and create a PR.

## Steps

### 1. Get current and target commits

- Read `Cargo.toml` and find the current `rev = "..."` for any vite-task git dependency (e.g., `vite_task`, `vite_path`, `fspy`, `vite_glob`, `vite_str`, `vite_workspace`). They all share the same revision.
- Get the latest commit hash on vite-task's main branch:

```bash
git ls-remote https://github.com/voidzero-dev/vite-task.git refs/heads/main
```

### 2. Update Cargo.toml

- Replace **all** occurrences of the old commit hash with the new one in `Cargo.toml`.
There are 6 crate entries that reference the same vite-task revision: `fspy`, `vite_glob`, `vite_path`, `vite_str`, `vite_task`, `vite_workspace`.

### 3. Ensure upstream dependencies are cloned

- `cargo check` requires the `./rolldown` and `./vite` directories to exist (many workspace path dependencies point to `./rolldown/crates/...`).
- Locally, clone them using the commit hashes from `packages/tools/.upstream-versions.json`.
- CI handles this automatically via the `.github/actions/clone` action.

### 4. Verify compilation

- Run `cargo check` to ensure the new vite-task compiles without errors.
- If there are compilation errors, these are **breaking changes** from vite-task. Fix them in the vite-plus codebase (the consuming side), not in vite-task.
- Common breaking changes include: renamed functions/methods, changed function signatures, new required fields in structs, removed public APIs.

### 5. Run tests

- Run `cargo test -p vite_command -p vite_error -p vite_install -p vite_js_runtime -p vite_migration -p vite_shared -p vite_static_config -p vite-plus-cli -p vite_global_cli` to run the vite-plus crate tests.
- Note: Some tests require network access (e.g., `vite_install::package_manager` tests, `vite_global_cli::commands::env` tests). These may fail in sandboxed environments. Verify they also fail on the main branch before dismissing them.
- Note: `cargo test -p vite_task` will NOT work because vite_task is a git dependency, not a workspace member.

### 6. Update snap tests

vite-task changes often affect CLI output, which means snap tests need updating. Common output changes:

- **Status icons**: e.g., cache hit/miss indicators may change
- **New CLI options**: e.g., new flags added to `vp run` that show up in help output
- **Cache behavior messages**: e.g., new summary lines about cache status
- **Task output formatting**: e.g., step numbering, separator lines

To update snap tests:

1. Push your changes and let CI run the snap tests.
2. CI will show the diff in the E2E test logs if snap tests fail.
3. Extract the diff from the CI logs and apply it locally.
4. Check all three platforms (Linux, Mac, Windows), since they may have slightly different snap test coverage.
5. Watch for trailing newline issues - ensure snap files end consistently.

Snap test files are at `packages/cli/snap-tests/*/snap.txt` and `packages/cli/snap-tests-global/*/snap.txt`.

### 7. Create the PR

- Commit message: `chore: bump vite-task to `
- PR title: `chore: bump vite-task to `
- PR body: Link to the vite-task CHANGELOG.md diff between the old and new commits:

```
https://github.com/voidzero-dev/vite-task/compare/...#diff-06572a96a58dc510037d5efa622f9bec8519bc1beab13c9f251e97e657a9d4ed
```

### 8. Verify CI

Wait for CI and ensure the `done` check passes. Key checks to monitor:

- **Lint**: Clippy and format checks
- **Test** (Linux, Mac, Windows): Rust unit tests
- **CLI E2E test** (Linux, Mac, Windows): Snap tests - most likely to fail on a vite-task bump
- **Run task**: Task runner integration tests
- **Cargo Deny**: License/advisory checks (may have pre-existing failures unrelated to the bump)

The only **required** status check for merging is `done`, which aggregates the other checks (excluding Cargo Deny).

## Notes

- Building the full CLI locally (`pnpm bootstrap-cli`) requires the rolldown Node.js package to be built first, which is complex. Prefer relying on CI for snap test generation.
- `Cargo.lock` is automatically updated by cargo when you change the revision in `Cargo.toml`.
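The whole-file replacement in step 2 can be sketched as below. This is illustrative only: `str::replace` is used for a self-contained example, while the repo's own `.clippy.toml` disallows it in the codebase in favor of `cow_utils::CowUtils::cow_replace`:

```rust
// Illustrative sketch of step 2: swap every occurrence of the old vite-task
// rev for the new one. All six crate entries share one revision, so a global
// replace keeps them in sync. (The vite-plus clippy config would require
// `cow_utils` here; plain `str::replace` keeps this example dependency-free.)
fn bump_rev(cargo_toml: &str, old_rev: &str, new_rev: &str) -> String {
    cargo_toml.replace(old_rev, new_rev)
}

fn main() {
    let before = concat!(
        "vite_task = { git = \"https://github.com/voidzero-dev/vite-task.git\", rev = \"aaa111\" }\n",
        "vite_path = { git = \"https://github.com/voidzero-dev/vite-task.git\", rev = \"aaa111\" }\n",
    );
    let after = bump_rev(before, "aaa111", "bbb222");
    assert_eq!(after.matches("bbb222").count(), 2);
    assert!(!after.contains("aaa111"));
}
```

After the swap, `cargo check` re-resolves the git dependency and updates `Cargo.lock` automatically (step 4).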
================================================
FILE: .claude/skills/spawn-process/SKILL.md
================================================
---
name: spawn-process
description: Guide for writing subprocess execution code using the vite_command crate
allowed-tools: Read, Grep, Glob, Edit, Write, Bash
---

# Add Subprocess Execution Code

When writing Rust code that needs to spawn subprocesses (resolve binaries, build commands, execute programs), always use the `vite_command` crate. Never use `which`, `tokio::process::Command::new`, or `std::process::Command::new` directly.

## Available APIs

### `vite_command::resolve_bin(name, path_env, cwd)` — Resolve a binary name to an absolute path

Handles PATHEXT (`.cmd`/`.bat`) on Windows. Pass `None` for `path_env` to search the current process PATH.

```rust
// Resolve using the current PATH
let bin = vite_command::resolve_bin("node", None, &cwd)?;

// Resolve using a custom PATH
let custom_path = std::ffi::OsString::from(&path_env_str);
let bin = vite_command::resolve_bin("eslint", Some(&custom_path), &cwd)?;
```

### `vite_command::build_command(bin_path, cwd)` — Build a command for a pre-resolved binary

Returns a `tokio::process::Command` with cwd, inherited stdio, and `fix_stdio_streams` on Unix already configured. Add args, envs, or override stdio as needed.

```rust
let bin = vite_command::resolve_bin("eslint", None, &cwd)?;
let mut cmd = vite_command::build_command(&bin, &cwd);
cmd.args(&[".", "--fix"]);
cmd.env("NODE_ENV", "production");
let mut child = cmd.spawn()?;
let status = child.wait().await?;
```

### `vite_command::build_shell_command(shell_cmd, cwd)` — Build a shell command

Uses `/bin/sh -c` on Unix, `cmd.exe /C` on Windows. Same stdio and `fix_stdio_streams` setup as `build_command`.

```rust
let mut cmd = vite_command::build_shell_command("echo hello && ls", &cwd);
let mut child = cmd.spawn()?;
let status = child.wait().await?;
```

### `vite_command::run_command(bin_name, args, envs, cwd)` — Resolve + build + run in one call

Combines `resolve_bin`, `build_command`, and `status().await`. The `envs` HashMap must include `"PATH"` if you want custom PATH resolution.

```rust
let envs = HashMap::from([("PATH".to_string(), path_value)]);
let status = vite_command::run_command("node", &["--version"], &envs, &cwd).await?;
```

## Dependency Setup

Add `vite_command` to the crate's `Cargo.toml`:

```toml
[dependencies]
vite_command = { workspace = true }
```

Do NOT add `which` as a direct dependency — binary resolution goes through `vite_command::resolve_bin`.

## Exception

`crates/vite_global_cli/src/shim/exec.rs` uses synchronous `std::process::Command` with Unix `exec()` for process replacement. This is the only place that bypasses `vite_command`.

================================================
FILE: .claude/skills/sync-tsdown-cli/SKILL.md
================================================
---
name: sync-tsdown-cli
description: Compare tsdown CLI options with vp pack and sync any new or removed options. Use when tsdown is upgraded or when you need to check for CLI option drift between tsdown and vp pack.
allowed-tools: Read, Grep, Glob, Edit, Bash
---

# Sync tsdown CLI Options with vp pack

Compare the upstream `tsdown` CLI options with `vp pack` (defined in `packages/cli/src/pack-bin.ts`) and sync any differences.

## Steps

1. Run `npx tsdown --help` from `packages/cli/` to get tsdown's current CLI options
2. Read `packages/cli/src/pack-bin.ts` to see vp pack's current options
3. Compare and add any new tsdown options to `pack-bin.ts` using the existing cac `.option()` pattern
4. If tsdown removed options, do NOT remove them from `pack-bin.ts` -- instead add a code comment like `// NOTE: removed from tsdown CLI in vX.Y.Z` above the option so reviewers can decide whether to follow up
5. Preserve intentional differences:
   - `-c, --config` is intentionally commented out (vp pack uses vite.config.ts)
   - `--env-prefix` has a different default (`['VITE_PACK_', 'TSDOWN_']`)
6. Verify with `pnpm --filter vite-plus build-ts` and `vp pack -h`
7. If new parameters were added, add a corresponding snap test under `packages/cli/snap-tests/` to verify that the new option works correctly

================================================
FILE: .clippy.toml
================================================
avoid-breaking-exported-api = false

disallowed-methods = [
  { path = "str::to_ascii_lowercase", reason = "To avoid memory allocation, use `cow_utils::CowUtils::cow_to_ascii_lowercase` instead." },
  { path = "str::to_ascii_uppercase", reason = "To avoid memory allocation, use `cow_utils::CowUtils::cow_to_ascii_uppercase` instead." },
  { path = "str::to_lowercase", reason = "To avoid memory allocation, use `cow_utils::CowUtils::cow_to_lowercase` instead." },
  { path = "str::to_uppercase", reason = "To avoid memory allocation, use `cow_utils::CowUtils::cow_to_uppercase` instead." },
  { path = "str::replace", reason = "To avoid memory allocation, use `cow_utils::CowUtils::cow_replace` instead." },
  { path = "str::replacen", reason = "To avoid memory allocation, use `cow_utils::CowUtils::cow_replacen` instead." },
  { path = "std::env::current_dir", reason = "To get an `AbsolutePathBuf`, use `vite_path::current_dir` instead." },
]

disallowed-types = [
  { path = "std::collections::HashMap", reason = "Use `rustc_hash::FxHashMap` instead, which is typically faster." },
  { path = "std::collections::HashSet", reason = "Use `rustc_hash::FxHashSet` instead, which is typically faster."
},
  { path = "std::path::Path", reason = "Use `vite_path::RelativePath` or `vite_path::AbsolutePath` instead" },
  { path = "std::path::PathBuf", reason = "Use `vite_path::RelativePathBuf` or `vite_path::AbsolutePathBuf` instead" },
  { path = "std::string::String", reason = "Use `vite_str::Str` for small strings. For large strings, prefer `Box/Rc/Arc` if mutation is not needed." },
]

disallowed-macros = [
  { path = "std::format", reason = "Use `vite_str::format` for small strings." },
  { path = "std::println", reason = "Use `vite_shared::output` functions (`info`, `note`, `success`) instead." },
  { path = "std::print", reason = "Use `vite_shared::output` functions (`info`, `note`, `success`) instead." },
  { path = "std::eprintln", reason = "Use `vite_shared::output` functions (`warn`, `error`) instead." },
  { path = "std::eprint", reason = "Use `vite_shared::output` functions (`warn`, `error`) instead." },
]

================================================
FILE: .devcontainer/devcontainer.json
================================================
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/rust
{
  "name": "Rust",
  // Or use a Dockerfile or Docker Compose file.
  // More info: https://containers.dev/guide/dockerfile
  "image": "mcr.microsoft.com/vscode/devcontainers/base:ubuntu-22.04",
  "updateContentCommand": { "rustToolchain": "rustup show" },
  "containerEnv": { "CARGO_TARGET_DIR": "/tmp/target" },
  "features": {
    "ghcr.io/devcontainers/features/rust:1": {},
    "ghcr.io/devcontainers-extra/features/fish-apt-get:1": {}
  },
  "customizations": {
    "vscode": {
      "extensions": ["rust-lang.rust-analyzer", "tamasfe.even-better-toml", "fill-labs.dependi"],
      "settings": {
        "terminal.integrated.defaultProfile.linux": "fish",
        "terminal.integrated.profiles.linux": { "fish": { "path": "/usr/bin/fish" } }
      }
    }
  },
  "postCreateCommand": "curl -fsSL https://vite.plus | bash"

  // Use 'mounts' to make the cargo cache persistent in a Docker Volume.
  // "mounts": [
  //   {
  //     "source": "devcontainer-cargo-cache-${devcontainerId}",
  //     "target": "/usr/local/cargo",
  //     "type": "volume"
  //   }
  // ]

  // Features to add to the dev container. More info: https://containers.dev/features.
  // "features": {},

  // Use 'forwardPorts' to make a list of ports inside the container available locally.
  // "forwardPorts": [],

  // Use 'postCreateCommand' to run commands after the container is created.
  // "postCreateCommand": "rustc --version",

  // Configure tool-specific properties.
  // "customizations": {},

  // Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
  // "remoteUser": "root"
}

================================================
FILE: .gitattributes
================================================
* text=auto eol=lf

================================================
FILE: .github/ISSUE_TEMPLATE/bug_report.yml
================================================
name: "\U0001F41E Bug report" description: Report an issue with Vite+ labels: [pending triage] type: Bug body: - type: markdown attributes: value: | Thanks for taking the time to fill out this bug report. Please only file issues here if the bug is in Vite+ itself.
If the bug belongs to an underlying tool, report it in that project's tracker (linked in the "Create new issue" page). - type: textarea id: bug-description attributes: label: Describe the bug description: A clear and concise description of what the bug is. If you intend to submit a PR for this issue, tell us in the description. placeholder: I am doing ... What I expect is ... What is actually happening is ... validations: required: true - type: input id: reproduction attributes: label: Reproduction description: Please provide a link to a minimal reproduction repository or a runnable stackblitz/sandbox. If a report is vague and has no reproduction, it may be closed. placeholder: Reproduction URL validations: required: true - type: textarea id: reproduction-steps attributes: label: Steps to reproduce description: Provide any required steps, commands, or setup details. placeholder: Run `pnpm install` followed by `vp dev` - type: textarea id: system-info attributes: label: System Info description: | Paste the full output of both commands: - `vp env current` - `vp --version` render: shell placeholder: Paste `vp env current` and `vp --version` output here validations: required: true - type: dropdown id: package-manager attributes: label: Used Package Manager description: Select the package manager used in your project options: - npm - yarn - pnpm - bun validations: required: true - type: textarea id: logs attributes: label: Logs description: | Optional when a reproduction is provided. Please copy-paste text logs instead of screenshots. If relevant, run your command with `--debug` and include the output. render: shell - type: checkboxes id: checkboxes attributes: label: Validations description: Before submitting the issue, please confirm the following options: - label: Read the [Contributing Guidelines](https://github.com/voidzero-dev/vite-plus/blob/main/CONTRIBUTING.md). 
required: true - label: Check that there isn't [already an issue](https://github.com/voidzero-dev/vite-plus/issues) for the same bug. required: true - label: Confirm this is a Vite+ issue and not an upstream issue (Vite, Vitest, tsdown, Rolldown, or Oxc). required: true - label: The provided reproduction is a [minimal reproducible example](https://stackoverflow.com/help/minimal-reproducible-example). required: true ================================================ FILE: .github/ISSUE_TEMPLATE/config.yml ================================================ blank_issues_enabled: false contact_links: - name: Vite Issues url: https://github.com/vitejs/vite/issues/new/choose about: Report issues specific to Vite core in the Vite repository. - name: Vitest Issues url: https://github.com/vitest-dev/vitest/issues/new/choose about: Report issues specific to Vitest in the Vitest repository. - name: tsdown Issues url: https://github.com/rolldown/tsdown/issues/new/choose about: Report issues specific to tsdown in the tsdown repository. - name: Rolldown Issues url: https://github.com/rolldown/rolldown/issues/new/choose about: Report issues specific to Rolldown in the Rolldown repository. - name: Oxc (Oxlint/Oxfmt) Issues url: https://github.com/oxc-project/oxc/issues/new/choose about: Report Oxlint/Oxfmt issues in the Oxc repository. ================================================ FILE: .github/ISSUE_TEMPLATE/docs.yml ================================================ name: "\U0001F4DA Documentation" description: Suggest a documentation improvement for Vite+ labels: [documentation] body: - type: markdown attributes: value: | Thanks for taking the time to fill out this issue. - type: checkboxes id: documentation_is attributes: label: Documentation is options: - label: Missing - label: Outdated - label: Confusing - label: Not sure - type: textarea id: description attributes: label: Explain in Detail description: A clear and concise description of your suggestion. 
If you intend to submit a PR for this issue, mention it here. placeholder: The description of ... is not clear. I thought it meant ... but it wasn't. validations: required: true - type: textarea id: suggestion attributes: label: Your Suggestion for Changes validations: required: true - type: input id: reference attributes: label: Relevant Page description: Link to the relevant doc page, section, or file. placeholder: https://github.com/voidzero-dev/vite-plus/blob/main/docs/... - type: input id: reproduction attributes: label: Reproduction (Optional) description: If the docs issue is tied to behavior, share a minimal reproduction link. placeholder: Reproduction URL - type: textarea id: reproduction-steps attributes: label: Steps to reproduce (Optional) description: Add steps if the docs issue is about incorrect behavior guidance. placeholder: Run `pnpm install` followed by `vp dev` ================================================ FILE: .github/ISSUE_TEMPLATE/feature_request.yml ================================================ name: "\U0001F680 New feature proposal" description: Propose a new feature to be added to Vite+ labels: [pending triage] type: Feature body: - type: markdown attributes: value: | Thanks for your interest in Vite+ and taking the time to fill out this feature report. Please only open feature proposals here for Vite+ itself. If the proposal belongs to an underlying tool, use that project's tracker (linked in the "Create new issue" page). - type: textarea id: feature-description attributes: label: Description description: 'Clear and concise description of the problem. Explain use cases and motivation. If you intend to submit a PR for this issue, mention it here.' placeholder: As a developer using Vite+ I want [goal / wish] so that [benefit]. validations: required: true - type: textarea id: suggested-solution attributes: label: Suggested solution description: 'Describe a possible API, behavior, or implementation direction.' 
validations: required: true - type: textarea id: alternative attributes: label: Alternative description: Describe any alternative solutions or features you've considered. - type: textarea id: additional-context attributes: label: Additional context description: Any other context, links, or screenshots about the feature request. - type: checkboxes id: checkboxes attributes: label: Validations description: Before submitting the issue, please confirm the following options: - label: Read the [Contributing Guidelines](https://github.com/voidzero-dev/vite-plus/blob/main/CONTRIBUTING.md). required: true - label: Confirm this request is for Vite+ itself and not for Vite, Vitest, tsdown, Rolldown, or Oxc. required: true - label: Check that there isn't already an issue requesting the same feature. required: true ================================================ FILE: .github/actions/build-upstream/action.yml ================================================ name: 'Build with Upstream Repositories' description: 'Builds Vite+ with the upstream repositories' inputs: target: description: 'The target platform' required: true print-after-build: description: 'Print the output after the build' required: false default: 'false' runs: using: 'composite' steps: - uses: ./.github/actions/download-rolldown-binaries with: github-token: ${{ github.token }} target: ${{ inputs.target }} upload: 'false' # Compute cache key once before any builds modify files # (packages/cli/package.json is modified by syncTestPackageExports during build-ts) # Include env vars (RELEASE_BUILD, DEBUG, VERSION) to ensure cache miss on release builds - name: Compute NAPI binding cache key id: cache-key shell: bash run: | echo "key=napi-binding-v3-${{ inputs.target }}-${{ env.RELEASE_BUILD }}-${{ env.DEBUG }}-${{ env.VERSION }}-${{ env.NPM_TAG }}-${{ hashFiles('packages/tools/.upstream-versions.json', 'Cargo.lock', 'crates/**/*.rs', 'crates/*/Cargo.toml', 'packages/cli/binding/**/*.rs', 
'packages/cli/binding/Cargo.toml', 'Cargo.toml', '.cargo/config.toml', 'packages/cli/package.json', 'packages/cli/build.ts') }}" >> $GITHUB_OUTPUT # Cache NAPI bindings and Rust CLI binary (the slow parts, especially on Windows) - name: Restore NAPI binding cache id: cache-restore uses: actions/cache/restore@94b89442628ad1d101e352b7ee38f30e1bef108e # v5 with: path: | packages/cli/binding/*.node packages/cli/binding/index.js packages/cli/binding/index.d.ts packages/cli/binding/index.cjs packages/cli/binding/index.d.cts target/${{ inputs.target }}/release/vp target/${{ inputs.target }}/release/vp.exe target/${{ inputs.target }}/release/vp-shim.exe key: ${{ steps.cache-key.outputs.key }} # Apply Vite+ branding patches to vite source (CI checks out # upstream vite which doesn't have branding patches) - name: Brand vite shell: bash run: pnpm exec tool brand-vite # Build upstream TypeScript packages first (don't depend on native bindings) - name: Build upstream TypeScript packages shell: bash run: | pnpm --filter @rolldown/pluginutils build pnpm --filter rolldown build-node pnpm --filter vite build-types pnpm --filter "@voidzero-dev/*" build pnpm --filter vite-plus build-ts # NAPI builds - only run on cache miss (slow, especially on Windows) # Must run before vite-plus TypeScript builds which depend on the bindings - name: Build NAPI bindings (x86_64-linux) shell: bash if: steps.cache-restore.outputs.cache-hit != 'true' && inputs.target == 'x86_64-unknown-linux-gnu' run: | pnpm --filter=vite-plus build-native --target ${{ inputs.target }} --use-napi-cross env: TARGET_CC: clang DEBUG: napi:* - name: Build NAPI bindings (aarch64-linux) shell: bash if: steps.cache-restore.outputs.cache-hit != 'true' && inputs.target == 'aarch64-unknown-linux-gnu' run: | pnpm --filter=vite-plus build-native --target ${{ inputs.target }} --use-napi-cross env: TARGET_CC: clang TARGET_CFLAGS: '-D_BSD_SOURCE' DEBUG: napi:* - name: Build NAPI bindings (non-Linux targets) shell: bash if: 
          steps.cache-restore.outputs.cache-hit != 'true' && !contains(inputs.target, 'linux')
        run: |
          pnpm --filter=vite-plus build-native --target ${{ inputs.target }}
        env:
          DEBUG: napi:*

    - name: Build Rust CLI binary (x86_64-linux)
      if: steps.cache-restore.outputs.cache-hit != 'true' && inputs.target == 'x86_64-unknown-linux-gnu'
      shell: bash
      run: |
        pnpm exec napi build --use-napi-cross --target ${{ inputs.target }} --release -p vite_global_cli
      env:
        TARGET_CC: clang
        DEBUG: napi:*

    - name: Build Rust CLI binary (aarch64-linux)
      if: steps.cache-restore.outputs.cache-hit != 'true' && inputs.target == 'aarch64-unknown-linux-gnu'
      shell: bash
      run: |
        pnpm exec napi build --use-napi-cross --target ${{ inputs.target }} --release -p vite_global_cli
      env:
        TARGET_CC: clang
        TARGET_CFLAGS: '-D_BSD_SOURCE'
        DEBUG: napi:*

    - name: Build Rust CLI binary (non-Linux targets)
      if: steps.cache-restore.outputs.cache-hit != 'true' && !contains(inputs.target, 'linux')
      shell: bash
      run: cargo build --release --target ${{ inputs.target }} -p vite_global_cli

    - name: Build trampoline shim binary (Windows only)
      if: steps.cache-restore.outputs.cache-hit != 'true' && contains(inputs.target, 'windows')
      shell: bash
      run: cargo build --release --target ${{ inputs.target }} -p vite_trampoline

    - name: Save NAPI binding cache
      if: steps.cache-restore.outputs.cache-hit != 'true'
      uses: actions/cache/save@94b89442628ad1d101e352b7ee38f30e1bef108e # v5
      with:
        path: |
          packages/cli/binding/*.node
          packages/cli/binding/index.js
          packages/cli/binding/index.d.ts
          packages/cli/binding/index.cjs
          packages/cli/binding/index.d.cts
          target/${{ inputs.target }}/release/vp
          target/${{ inputs.target }}/release/vp.exe
          target/${{ inputs.target }}/release/vp-shim.exe
        key: ${{ steps.cache-key.outputs.key }}

    # Build vite-plus TypeScript after native bindings are ready
    - name: Build vite-plus TypeScript packages
      shell: bash
      run: |
        pnpm --filter=vite-plus build-ts

    - name: Print output after build
      shell: bash
      if: inputs.print-after-build == 'true'
      run: |
        pnpm vp -h
        pnpm vp run -h
        pnpm vp lint -h
        pnpm vp test -h
        pnpm vp build -h
        pnpm vp fmt -h

================================================
FILE: .github/actions/clone/action.yml
================================================
name: 'Clone Repositories'
description: 'Clone self and upstream repositories'

inputs:
  ecosystem-ci-project:
    description: 'The ecosystem ci project to clone'
    required: false
    default: ''

outputs:
  ecosystem-ci-project-path:
    description: 'The path where the ecosystem ci project was cloned'
    value: ${{ steps.ecosystem-ci-project-hash.outputs.ECOSYSTEM_CI_PROJECT_PATH }}

runs:
  using: 'composite'
  steps:
    - name: Output rolldown and vite hash
      shell: bash
      id: upstream-versions
      run: |
        node -e "console.log('ROLLDOWN_HASH=' + require('./packages/tools/.upstream-versions.json').rolldown.hash)" >> $GITHUB_OUTPUT
        node -e "console.log('ROLLDOWN_VITE_HASH=' + require('./packages/tools/.upstream-versions.json')['vite'].hash)" >> $GITHUB_OUTPUT

    - name: Output ecosystem ci project hash
      shell: bash
      id: ecosystem-ci-project-hash
      if: ${{ inputs.ecosystem-ci-project != '' }}
      run: |
        node -e "console.log('ECOSYSTEM_CI_PROJECT_HASH=' + require('./ecosystem-ci/repo.json')['${{ inputs.ecosystem-ci-project }}'].hash)" >> $GITHUB_OUTPUT
        node -e "console.log('ECOSYSTEM_CI_PROJECT_REPOSITORY=' + require('./ecosystem-ci/repo.json')['${{ inputs.ecosystem-ci-project }}'].repository.replace('https://github.com/', '').replace('.git', ''))" >> $GITHUB_OUTPUT
        echo "ECOSYSTEM_CI_PROJECT_PATH=${{ runner.temp }}/vite-plus-ecosystem-ci/${{ inputs.ecosystem-ci-project }}" >> $GITHUB_OUTPUT

    - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      with:
        repository: rolldown/rolldown
        path: rolldown
        ref: ${{ steps.upstream-versions.outputs.ROLLDOWN_HASH }}

    - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      with:
        repository: vitejs/vite
        path: vite
        ref: ${{ steps.upstream-versions.outputs.ROLLDOWN_VITE_HASH }}

    # Disable autocrlf to preserve LF line endings on Windows
    # This prevents prettier/eslint from failing with "Delete ␍" errors
    - name: Configure git for LF line endings
      if: ${{ inputs.ecosystem-ci-project != '' }}
      shell: bash
      run: git config --global core.autocrlf false

    - name: Clone ecosystem ci project
      if: ${{ inputs.ecosystem-ci-project != '' }}
      shell: bash
      run: npx tsx ecosystem-ci/clone.ts ${{ inputs.ecosystem-ci-project }}

================================================
FILE: .github/actions/download-rolldown-binaries/action.yml
================================================
name: 'Download Rolldown Binaries'
description: 'Download previous release rolldown binaries and upload as artifact'

inputs:
  github-token:
    description: 'GitHub token for accessing GitHub Package Registry'
    required: true
  target:
    description: 'The target platform'
    default: 'x86_64-unknown-linux-gnu'
    required: false
  upload:
    description: 'Upload the rolldown binaries as artifact'
    required: false
    default: 'true'

runs:
  using: 'composite'
  steps:
    - name: Install previous release
      shell: bash
      run: |
        if ${{ runner.os == 'Windows' }}; then
          export TARGET="win32-x64-msvc"
        elif ${{ runner.os == 'Linux' }}; then
          export TARGET="linux-x64-gnu"
        elif ${{ runner.os == 'macOS' }}; then
          export TARGET="darwin-arm64"
        fi

        # Pin to the version from checked-out rolldown source to avoid mismatch
        # between JS code (built from source) and native binary (downloaded from npm).
        # Falls back to npm latest only when rolldown source isn't cloned yet
        # (e.g., the standalone download-previous-rolldown-binaries job).
        if [ -f "./rolldown/packages/rolldown/package.json" ]; then
          export VERSION=$(node -p "require('./rolldown/packages/rolldown/package.json').version")
          echo "Using rolldown version from source: ${VERSION}"
        else
          export VERSION=$(npm view --json rolldown | jq -r '.version')
          echo "Warning: rolldown source not found, using npm latest: ${VERSION}"
        fi

        npm pack "@rolldown/binding-${TARGET}@${VERSION}"
        tar -xzf "rolldown-binding-${TARGET}-${VERSION}.tgz"
        if [ -d "./rolldown/packages/rolldown/src" ]; then
          cp "./package/rolldown-binding.${TARGET}.node" ./rolldown/packages/rolldown/src
          ls ./rolldown/packages/rolldown/src
        fi
      env:
        GITHUB_TOKEN: ${{ inputs.github-token }}

    - uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
      if: ${{ inputs.upload == 'true' }}
      with:
        name: rolldown-binaries
        path: ./package/rolldown-binding.*.node
        if-no-files-found: error

    - name: Clean up
      shell: bash
      run: |
        rm -rf package
        rm *.tgz

================================================
FILE: .github/actions/set-snapshot-version/action.yml
================================================
name: Compute Release Version
description: Get latest tag from GitHub and increment the patch version

inputs:
  npm_tag:
    description: 'npm tag (latest or alpha)'
    required: true
    default: 'latest'

outputs:
  version:
    description: The computed version string
    value: ${{ steps.version.outputs.version }}

runs:
  using: composite
  steps:
    - name: Compute next patch version
      id: version
      shell: bash
      run: |
        git fetch --tags --quiet
        npm install --prefix ${{ github.action_path }} semver > /dev/null 2>&1
        VERSION_OUTPUT=$(node ${{ github.action_path }}/compute-version.mjs "${{ inputs.npm_tag }}")
        echo "$VERSION_OUTPUT"
        echo "$VERSION_OUTPUT" | tail -n 1 >> $GITHUB_OUTPUT

================================================
FILE: .github/actions/set-snapshot-version/compute-version.mjs
================================================
import { execSync } from 'node:child_process';
import semver from 'semver';

const npmTag = process.argv[2] || 'latest';

// Get all version tags
const tagsOutput = execSync('git tag -l "v*"', { encoding: 'utf-8' }).trim();
const tags = tagsOutput ? tagsOutput.split('\n') : [];

// Parse and filter to valid semver, then find latest stable (no prerelease)
const stableTags = tags
  .map((tag) => semver.parse(tag.replace(/^v/, '')))
  .filter((v) => v !== null && v.prerelease.length === 0);

let nextVersion;
if (stableTags.length === 0) {
  nextVersion = '0.1.0';
} else {
  stableTags.sort(semver.rcompare);
  const latest = stableTags[0];
  nextVersion = semver.inc(latest, 'patch');
}

let version;
if (npmTag === 'alpha') {
  // Find existing alpha tags for this version
  const alphaPrefix = `v${nextVersion}-alpha.`;
  const alphaTags = tags
    .filter((tag) => tag.startsWith(alphaPrefix))
    .map((tag) => semver.parse(tag.replace(/^v/, '')))
    .filter((v) => v !== null);
  let alphaNum = 0;
  if (alphaTags.length > 0) {
    alphaTags.sort(semver.rcompare);
    alphaNum = alphaTags[0].prerelease[1] + 1;
  }
  version = `${nextVersion}-alpha.${alphaNum}`;
} else {
  version = nextVersion;
}

const latestStable = stableTags.length > 0 ? `v${stableTags[0].version}` : 'none';
console.log(`Computed version: ${version} (latest stable tag: ${latestStable})`);
console.log(`version=${version}`);

================================================
FILE: .github/actions/set-snapshot-version/package.json
================================================
{
  "private": true,
  "type": "module"
}

================================================
FILE: .github/renovate.json
================================================
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["github>Boshen/renovate"],
  "ignorePaths": [
    "packages/cli/snap-tests/**",
    "packages/cli/snap-tests-global/**",
    "packages/cli/snap-tests-todo/**",
    "bench/fixtures/**",
    "rolldown/**",
    "vite/**"
  ],
  "packageRules": [
    { "matchPackageNames": ["vitest-dev"], "enabled": false },
    {
      "matchPackageNames": [
        "fspy",
        "vite_glob",
        "vite_path",
        "vite_str",
        "vite_task",
        "vite_workspace",
        "https://github.com/voidzero-dev/vite-task"
      ],
      "enabled": false
    }
  ]
}

================================================
FILE: .github/scripts/upgrade-deps.mjs
================================================
import fs from 'node:fs';
import path from 'node:path';

const ROOT = process.cwd();

// ============ GitHub API ============
async function getLatestTagCommit(owner, repo) {
  const res = await fetch(`https://api.github.com/repos/${owner}/${repo}/tags`, {
    headers: {
      Authorization: `token ${process.env.GITHUB_TOKEN}`,
      Accept: 'application/vnd.github.v3+json',
    },
  });
  if (!res.ok) {
    throw new Error(`Failed to fetch tags for ${owner}/${repo}: ${res.status} ${res.statusText}`);
  }
  const tags = await res.json();
  if (!Array.isArray(tags) || !tags.length) {
    throw new Error(`No tags found for ${owner}/${repo}`);
  }
  if (!tags[0]?.commit?.sha) {
    throw new Error(`Invalid tag structure for ${owner}/${repo}: missing commit SHA`);
  }
  console.log(`${repo} -> ${tags[0].name}`);
  return tags[0].commit.sha;
}

// ============ npm Registry ============
async function getLatestNpmVersion(packageName) {
  const res = await fetch(`https://registry.npmjs.org/${packageName}/latest`);
  if (!res.ok) {
    throw new Error(
      `Failed to fetch npm version for ${packageName}: ${res.status} ${res.statusText}`,
    );
  }
  const data = await res.json();
  if (!data?.version) {
    throw new Error(`Invalid npm response for ${packageName}: missing version field`);
  }
  return data.version;
}

// ============ Update .upstream-versions.json ============
async function updateUpstreamVersions() {
  const filePath = path.join(ROOT, 'packages/tools/.upstream-versions.json');
  const data = JSON.parse(fs.readFileSync(filePath, 'utf8'));
  // rolldown -> rolldown/rolldown
  data.rolldown.hash = await getLatestTagCommit('rolldown', 'rolldown');
  // vite -> vitejs/vite
  data['vite'].hash = await getLatestTagCommit('vitejs', 'vite');
  fs.writeFileSync(filePath, JSON.stringify(data, null, 2) + '\n');
  console.log('Updated .upstream-versions.json');
}

// ============ Update pnpm-workspace.yaml ============
async function updatePnpmWorkspace(versions) {
  const filePath = path.join(ROOT, 'pnpm-workspace.yaml');
  let content = fs.readFileSync(filePath, 'utf8');
  // Update vitest-dev override (handle pre-release versions like -beta.1, -rc.0)
  // Handle both quoted ('npm:vitest@^...') and unquoted (npm:vitest@^...) forms
  content = content.replace(
    /vitest-dev: '?npm:vitest@\^[\d.]+(-[\w.]+)?'?/,
    `vitest-dev: 'npm:vitest@^${versions.vitest}'`,
  );
  // Update tsdown in catalog (handle pre-release versions)
  content = content.replace(/tsdown: \^[\d.]+(-[\w.]+)?/, `tsdown: ^${versions.tsdown}`);
  // Update @oxc-node/cli in catalog
  content = content.replace(
    /'@oxc-node\/cli': \^[\d.]+(-[\w.]+)?/,
    `'@oxc-node/cli': ^${versions.oxcNodeCli}`,
  );
  // Update @oxc-node/core in catalog
  content = content.replace(
    /'@oxc-node\/core': \^[\d.]+(-[\w.]+)?/,
    `'@oxc-node/core': ^${versions.oxcNodeCore}`,
  );
  // Update oxfmt in catalog
  content = content.replace(/oxfmt: =[\d.]+(-[\w.]+)?/, `oxfmt: =${versions.oxfmt}`);
  // Update oxlint in catalog (but not oxlint-tsgolint)
  content = content.replace(/oxlint: =[\d.]+(-[\w.]+)?\n/, `oxlint: =${versions.oxlint}\n`);
  // Update oxlint-tsgolint in catalog
  content = content.replace(
    /oxlint-tsgolint: =[\d.]+(-[\w.]+)?/,
    `oxlint-tsgolint: =${versions.oxlintTsgolint}`,
  );
  fs.writeFileSync(filePath, content);
  console.log('Updated pnpm-workspace.yaml');
}

// ============ Update packages/test/package.json ============
async function updateTestPackage(vitestVersion) {
  const filePath = path.join(ROOT, 'packages/test/package.json');
  const pkg = JSON.parse(fs.readFileSync(filePath, 'utf8'));
  // Update all @vitest/* devDependencies
  for (const dep of Object.keys(pkg.devDependencies)) {
    if (dep.startsWith('@vitest/')) {
      pkg.devDependencies[dep] = vitestVersion;
    }
  }
  // Update vitest-dev devDependency
  if (pkg.devDependencies['vitest-dev']) {
    pkg.devDependencies['vitest-dev'] = `^${vitestVersion}`;
  }
  // Update @vitest/ui peerDependency if present
  if (pkg.peerDependencies?.['@vitest/ui']) {
    pkg.peerDependencies['@vitest/ui'] = vitestVersion;
  }
  fs.writeFileSync(filePath, JSON.stringify(pkg, null, 2) + '\n');
  console.log('Updated packages/test/package.json');
}

// ============ Update packages/core/package.json ============
async function updateCorePackage(devtoolsVersion) {
  const filePath = path.join(ROOT, 'packages/core/package.json');
  const pkg = JSON.parse(fs.readFileSync(filePath, 'utf8'));
  // Update @vitejs/devtools in devDependencies
  if (pkg.devDependencies?.['@vitejs/devtools']) {
    pkg.devDependencies['@vitejs/devtools'] = `^${devtoolsVersion}`;
  }
  fs.writeFileSync(filePath, JSON.stringify(pkg, null, 2) + '\n');
  console.log('Updated packages/core/package.json');
}

console.log('Fetching latest versions…');
const [
  vitestVersion,
  tsdownVersion,
  devtoolsVersion,
  oxcNodeCliVersion,
  oxcNodeCoreVersion,
  oxfmtVersion,
  oxlintVersion,
  oxlintTsgolintVersion,
] = await Promise.all([
  getLatestNpmVersion('vitest'),
  getLatestNpmVersion('tsdown'),
  getLatestNpmVersion('@vitejs/devtools'),
  getLatestNpmVersion('@oxc-node/cli'),
  getLatestNpmVersion('@oxc-node/core'),
  getLatestNpmVersion('oxfmt'),
  getLatestNpmVersion('oxlint'),
  getLatestNpmVersion('oxlint-tsgolint'),
]);

console.log(`vitest: ${vitestVersion}`);
console.log(`tsdown: ${tsdownVersion}`);
console.log(`@vitejs/devtools: ${devtoolsVersion}`);
console.log(`@oxc-node/cli: ${oxcNodeCliVersion}`);
console.log(`@oxc-node/core: ${oxcNodeCoreVersion}`);
console.log(`oxfmt: ${oxfmtVersion}`);
console.log(`oxlint: ${oxlintVersion}`);
console.log(`oxlint-tsgolint: ${oxlintTsgolintVersion}`);

await updateUpstreamVersions();
await updatePnpmWorkspace({
  vitest: vitestVersion,
  tsdown: tsdownVersion,
  oxcNodeCli: oxcNodeCliVersion,
  oxcNodeCore: oxcNodeCoreVersion,
  oxfmt: oxfmtVersion,
  oxlint: oxlintVersion,
  oxlintTsgolint: oxlintTsgolintVersion,
});
await updateTestPackage(vitestVersion);
await updateCorePackage(devtoolsVersion);
console.log('Done!');

================================================
FILE: .github/workflows/ci.yml
================================================
name: CI

permissions:
  # Doing it explicitly because the default permission only includes metadata: read.
  contents: read

on:
  workflow_dispatch:
  pull_request:
    types: [opened, synchronize, labeled]
  push:
    branches:
      - main
    paths-ignore:
      - '**/*.md'

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }}
  cancel-in-progress: ${{ github.ref_name != 'main' }}

defaults:
  run:
    shell: bash

jobs:
  optimize-ci:
    runs-on: ubuntu-latest
    outputs:
      skip: ${{ steps.check_skip.outputs.skip }}
    steps:
      - name: Optimize CI
        id: check_skip
        uses: withgraphite/graphite-ci-action@ee395f3a78254c006d11339669c6cabddf196f72
        with:
          graphite_token: ${{ secrets.GRAPHITE_CI_OPTIMIZER_TOKEN }}

  detect-changes:
    runs-on: ubuntu-latest
    needs: optimize-ci
    if: needs.optimize-ci.outputs.skip == 'false'
    permissions:
      contents: read
      pull-requests: read
    outputs:
      code-changed: ${{ steps.filter.outputs.code }}
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
        id: filter
        with:
          filters: |
            code:
              - '!**/*.md'

  download-previous-rolldown-binaries:
    needs: detect-changes
    if: needs.detect-changes.outputs.code-changed == 'true'
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: read
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/download-rolldown-binaries
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}

  test:
    needs: detect-changes
    if: needs.detect-changes.outputs.code-changed == 'true'
    name: Test
    strategy:
      fail-fast: false
      matrix:
        include:
          - os: namespace-profile-linux-x64-default
            target: x86_64-unknown-linux-gnu
          - os: windows-latest
            target: x86_64-pc-windows-msvc
          - os: namespace-profile-mac-default
            target: aarch64-apple-darwin
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
      - name: Setup Dev Drive
        if: runner.os == 'Windows'
        uses: samypr100/setup-dev-drive@30f0f98ae5636b2b6501e181dfb3631b9974818d # v4.0.0
        with:
          drive-size: 12GB
          drive-format: ReFS
          env-mapping: |
            CARGO_HOME,{{ DEV_DRIVE }}/.cargo
            RUSTUP_HOME,{{ DEV_DRIVE }}/.rustup
      - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.0
        with:
          save-cache: ${{ github.ref_name == 'main' }}
          cache-key: test
          target-dir: ${{ runner.os == 'Windows' && format('{0}/target', env.DEV_DRIVE) || '' }}
      - run: rustup target add x86_64-unknown-linux-musl
        if: ${{ matrix.target == 'x86_64-unknown-linux-gnu' }}
      - run: cargo check --all-targets --all-features
        env:
          RUSTFLAGS: '-D warnings --cfg tokio_unstable' # also update .cargo/config.toml
      # Test all crates/* packages. New crates are automatically included.
      # Also test vite-plus-cli (lives outside crates/) to catch type sync issues.
      - run: cargo test $(for d in crates/*/; do echo -n "-p $(basename $d) "; done) -p vite-plus-cli
        env:
          RUST_MIN_STACK: 8388608

  lint:
    needs: detect-changes
    if: needs.detect-changes.outputs.code-changed == 'true'
    name: Lint
    runs-on: namespace-profile-linux-x64-default
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
      - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.0
        with:
          save-cache: ${{ github.ref_name == 'main' }}
          cache-key: lint
          tools: cargo-shear
          components: clippy rust-docs rustfmt
      - run: |
          cargo shear
          cargo fmt --check
          # cargo clippy --all-targets --all-features -- -D warnings
          # RUSTDOCFLAGS='-D warnings' cargo doc --no-deps --document-private-items
      - uses: crate-ci/typos@631208b7aac2daa8b707f55e7331f9112b0e062d # v1.44.0
        with:
          files: .
      - uses: oxc-project/setup-node@fdbf0dfd334c4e6d56ceeb77d91c76339c2a0885 # v1.0.4
      - name: Install docs dependencies
        run: pnpm -C docs install --frozen-lockfile
      - name: Deduplicate dependencies
        run: pnpm dedupe --check

  run:
    name: Run task
    runs-on: namespace-profile-linux-x64-default
    needs:
      - download-previous-rolldown-binaries
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
      - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.0
        with:
          save-cache: ${{ github.ref_name == 'main' }}
          cache-key: run
      - uses: oxc-project/setup-node@fdbf0dfd334c4e6d56ceeb77d91c76339c2a0885 # v1.0.4
      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          name: rolldown-binaries
          path: ./rolldown/packages/rolldown/src
          merge-multiple: true
      - name: Build with upstream
        uses: ./.github/actions/build-upstream
        with:
          target: x86_64-unknown-linux-gnu
      - name: Install Global CLI vp
        run: |
          pnpm bootstrap-cli:ci
          echo "$HOME/.vite-plus/bin" >> $GITHUB_PATH
      - name: Print help for built-in commands
        run: |
          which vp
          vp -h
          vp run -h
          vp lint -h
          vp test -h
          vp build -h
          vp fmt -h

  cli-e2e-test:
    name: CLI E2E test
    needs:
      - download-previous-rolldown-binaries
    strategy:
      fail-fast: false
      matrix:
        include:
          - os: namespace-profile-linux-x64-default
          - os: namespace-profile-mac-default
          - os: windows-latest
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
      - name: Setup Dev Drive
        if: runner.os == 'Windows'
        uses: samypr100/setup-dev-drive@30f0f98ae5636b2b6501e181dfb3631b9974818d # v4.0.0
        with:
          drive-size: 12GB
          drive-format: ReFS
          env-mapping: |
            CARGO_HOME,{{ DEV_DRIVE }}/.cargo
            RUSTUP_HOME,{{ DEV_DRIVE }}/.rustup
      - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.0
        with:
          save-cache: ${{ github.ref_name == 'main' }}
          cache-key: cli-e2e-test
          target-dir: ${{ runner.os == 'Windows' &&
            format('{0}/target', env.DEV_DRIVE) || '' }}
      - uses: oxc-project/setup-node@fdbf0dfd334c4e6d56ceeb77d91c76339c2a0885 # v1.0.4
      - name: Install docs dependencies
        run: pnpm -C docs install --frozen-lockfile
      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          name: rolldown-binaries
          path: ./rolldown/packages/rolldown/src
          merge-multiple: true
      - name: Build with upstream
        uses: ./.github/actions/build-upstream
        with:
          target: ${{ matrix.os == 'namespace-profile-linux-x64-default' && 'x86_64-unknown-linux-gnu' || matrix.os == 'windows-latest' && 'x86_64-pc-windows-msvc' || 'aarch64-apple-darwin' }}
      - name: Check TypeScript types
        if: ${{ matrix.os == 'namespace-profile-linux-x64-default' }}
        run: pnpm tsgo
      - name: Install Global CLI vp
        run: |
          pnpm bootstrap-cli:ci
          if [[ "$RUNNER_OS" == "Windows" ]]; then
            echo "$USERPROFILE\.vite-plus\bin" >> $GITHUB_PATH
          else
            echo "$HOME/.vite-plus/bin" >> $GITHUB_PATH
          fi
      - name: Verify vp installation
        run: |
          which vp
          vp --version
          vp -h
      - name: Run vp check
        run: vp check
      - name: Test global package install (powershell)
        if: ${{ matrix.os == 'windows-latest' }}
        shell: pwsh
        run: |
          echo "PATH: $env:Path"
          where.exe node
          where.exe npm
          where.exe npx
          where.exe vp
          vp env doctor

          # Test 1: Install a JS-based CLI (typescript)
          vp install -g typescript
          tsc --version
          where.exe tsc

          # Test 2: Verify the package was installed correctly
          Get-ChildItem "$env:USERPROFILE\.vite-plus\packages\typescript\"
          Get-ChildItem "$env:USERPROFILE\.vite-plus\bin\"

          # Test 3: Uninstall
          vp uninstall -g typescript

          # Test 4: Verify uninstall removed shim
          Write-Host "Checking bin dir after uninstall:"
          Get-ChildItem "$env:USERPROFILE\.vite-plus\bin\"
          $shimPath = "$env:USERPROFILE\.vite-plus\bin\tsc.cmd"
          if (Test-Path $shimPath) {
            Write-Error "tsc shim file still exists at $shimPath"
            exit 1
          }
          Write-Host "tsc shim removed successfully"

          # Test 5: use session
          vp env use 18
          node --version
          vp env doctor
          vp env use --unset
          node --version
      - name: Test global package install (cmd)
        if: ${{ matrix.os == 'windows-latest' }}
        shell: cmd
        run: |
          echo "PATH: %PATH%"
          where.exe node
          where.exe npm
          where.exe npx
          where.exe vp
          vp env use 18
          node --version
          vp env use --unset
          node --version
          vp env doctor

          REM Test 1: Install a JS-based CLI (typescript)
          vp install -g typescript
          tsc --version
          where.exe tsc

          REM Test 2: Verify the package was installed correctly
          dir "%USERPROFILE%\.vite-plus\packages\typescript\"
          dir "%USERPROFILE%\.vite-plus\bin\"

          REM Test 3: Uninstall
          vp uninstall -g typescript

          REM Test 4: Verify uninstall removed shim (.cmd wrapper)
          echo Checking bin dir after uninstall:
          dir "%USERPROFILE%\.vite-plus\bin\"
          if exist "%USERPROFILE%\.vite-plus\bin\tsc.cmd" (
            echo Error: tsc.cmd shim file still exists
            exit /b 1
          )
          echo tsc.cmd shim removed successfully

          REM Test 5: Verify shell script was also removed (for Git Bash)
          if exist "%USERPROFILE%\.vite-plus\bin\tsc" (
            echo Error: tsc shell script still exists
            exit /b 1
          )
          echo tsc shell script removed successfully

          REM Test 6: use session
          vp env use 18
          node --version
          vp env doctor
          vp env use --unset
          node --version
      - name: Test global package install (bash)
        run: |
          echo "PATH: $PATH"
          ls -la ~/.vite-plus/
          ls -la ~/.vite-plus/bin/
          which node
          which npm
          which npx
          which vp
          vp env doctor

          # Test 1: Install a JS-based CLI (typescript)
          vp install -g typescript
          tsc --version
          which tsc

          # Test 2: Verify the package was installed correctly
          ls -la ~/.vite-plus/packages/typescript/
          ls -la ~/.vite-plus/bin/

          # Test 3: Uninstall
          vp uninstall -g typescript

          # Test 4: Verify uninstall removed shim
          echo "Checking bin dir after uninstall:"
          ls -la ~/.vite-plus/bin/
          if [ -f ~/.vite-plus/bin/tsc ]; then
            echo "Error: tsc shim file still exists at ~/.vite-plus/bin/tsc"
            exit 1
          fi
          echo "tsc shim removed successfully"

          # Test 5: use session
          vp env use 18
          node --version
          vp env doctor
          vp env use --unset
          node --version
      - name: Install Playwright browsers
        run: pnpx playwright install chromium
      - name: Run CLI snapshot tests
        run: |
          RUST_BACKTRACE=1 pnpm test
          if ! git diff --exit-code; then
            echo "::error::Snapshot diff detected. Run 'pnpm -F vite-plus snap-test' locally and commit the updated snap.txt files."
            git diff --stat
            git diff
            exit 1
          fi
        env:
          RUST_MIN_STACK: 8388608
      # Upgrade tests (merged from separate job to avoid duplicate build)
      - name: Test upgrade (bash)
        shell: bash
        run: |
          # Helper to read the installed CLI version from package.json
          get_cli_version() {
            node -p "require(require('path').resolve(process.env.USERPROFILE || process.env.HOME, '.vite-plus', 'current', 'node_modules', 'vite-plus', 'package.json')).version"
          }

          # Save initial (dev build) version
          INITIAL_VERSION=$(get_cli_version)
          echo "Initial version: $INITIAL_VERSION"

          # --check queries npm registry and prints update status
          vp upgrade --check

          # full upgrade: download, extract, swap
          vp upgrade --force
          vp --version
          vp env doctor
          ls -la ~/.vite-plus/

          # Verify version changed after update
          UPDATED_VERSION=$(get_cli_version)
          echo "Updated version: $UPDATED_VERSION"
          if [ "$UPDATED_VERSION" == "$INITIAL_VERSION" ]; then
            echo "Error: version should have changed after upgrade (still $INITIAL_VERSION)"
            exit 1
          fi

          # rollback to the previous version
          vp upgrade --rollback
          vp --version
          vp env doctor

          # Verify version restored after rollback
          ROLLBACK_VERSION=$(get_cli_version)
          echo "Rollback version: $ROLLBACK_VERSION"
          if [ "$ROLLBACK_VERSION" != "$INITIAL_VERSION" ]; then
            echo "Error: version should have been restored after rollback (expected $INITIAL_VERSION, got $ROLLBACK_VERSION)"
            exit 1
          fi
      - name: Test upgrade (powershell)
        if: ${{ matrix.os == 'windows-latest' }}
        shell: pwsh
        run: |
          Get-ChildItem "$env:USERPROFILE\.vite-plus\"

          # Helper to read the installed CLI version from package.json
          function Get-CliVersion {
            node -p "require(require('path').resolve(process.env.USERPROFILE, '.vite-plus', 'current', 'node_modules', 'vite-plus', 'package.json')).version"
          }

          # Save initial (dev build) version
          $initialVersion = Get-CliVersion
          Write-Host "Initial version: $initialVersion"

          # --check queries npm registry and prints update status
          vp upgrade --check

          # full upgrade: download, extract, swap
          vp upgrade --force
          vp --version
          vp env doctor
          Get-ChildItem "$env:USERPROFILE\.vite-plus\"

          # Verify version changed after update
          $updatedVersion = Get-CliVersion
          Write-Host "Updated version: $updatedVersion"
          if ($updatedVersion -eq $initialVersion) {
            Write-Error "Error: version should have changed after upgrade (still $initialVersion)"
            exit 1
          }

          # rollback to the previous version
          vp upgrade --rollback
          vp --version
          vp env doctor

          # Verify version restored after rollback
          $rollbackVersion = Get-CliVersion
          Write-Host "Rollback version: $rollbackVersion"
          if ($rollbackVersion -ne $initialVersion) {
            Write-Error "Error: version should have been restored after rollback (expected $initialVersion, got $rollbackVersion)"
            exit 1
          }
      - name: Test upgrade (cmd)
        if: ${{ matrix.os == 'windows-latest' }}
        shell: cmd
        run: |
          REM Save initial (dev build) version
          for /f "usebackq delims=" %%v in (`node -p "require(require('path').resolve(process.env.USERPROFILE, '.vite-plus', 'current', 'node_modules', 'vite-plus', 'package.json')).version"`) do set INITIAL_VERSION=%%v
          echo Initial version: %INITIAL_VERSION%

          REM --check queries npm registry and prints update status
          vp upgrade --check

          REM full upgrade: download, extract, swap
          vp upgrade --force
          vp --version
          vp env doctor
          dir "%USERPROFILE%\.vite-plus\"

          REM Verify version changed after update
          for /f "usebackq delims=" %%v in (`node -p "require(require('path').resolve(process.env.USERPROFILE, '.vite-plus', 'current', 'node_modules', 'vite-plus', 'package.json')).version"`) do set UPDATED_VERSION=%%v
          echo Updated version: %UPDATED_VERSION%
          if "%UPDATED_VERSION%"=="%INITIAL_VERSION%" (
            echo Error: version should have changed after upgrade, still %INITIAL_VERSION%
            exit /b 1
          )

          REM rollback to the previous version
          vp upgrade --rollback
          vp --version
          vp env doctor

          REM Verify version restored after rollback
          for /f "usebackq delims=" %%v in (`node -p "require(require('path').resolve(process.env.USERPROFILE, '.vite-plus', 'current', 'node_modules', 'vite-plus', 'package.json')).version"`) do set ROLLBACK_VERSION=%%v
          echo Rollback version: %ROLLBACK_VERSION%
          if not "%ROLLBACK_VERSION%"=="%INITIAL_VERSION%" (
            echo Error: version should have been restored after rollback, expected %INITIAL_VERSION%, got %ROLLBACK_VERSION%
            exit /b 1
          )
      - name: Test implode (bash)
        shell: bash
        run: |
          vp implode --yes
          ls -la ~/
          VP_HOME="${USERPROFILE:-$HOME}/.vite-plus"
          if [ -d "$VP_HOME" ]; then
            echo "Error: $VP_HOME still exists after implode"
            exit 1
          fi
          # Reinstall
          pnpm bootstrap-cli:ci
          vp --version
      - name: Test implode (powershell)
        if: ${{ matrix.os == 'windows-latest' }}
        shell: pwsh
        run: |
          vp implode --yes
          Start-Sleep -Seconds 5
          dir "$env:USERPROFILE\"
          if (Test-Path "$env:USERPROFILE\.vite-plus") {
            Write-Error "~/.vite-plus still exists after implode"
            exit 1
          }
          pnpm bootstrap-cli:ci
          vp --version
      - name: Test implode (cmd)
        if: ${{ matrix.os == 'windows-latest' }}
        shell: cmd
        run: |
          REM vp.exe renames its own parent directory; cmd.exe may report
          REM "The system cannot find the path specified" on exit — ignore it.
          vp implode --yes || ver >NUL
          timeout /T 5 /NOBREAK >NUL
          dir "%USERPROFILE%\"
          if exist "%USERPROFILE%\.vite-plus" (
            echo Error: .vite-plus still exists after implode
            exit /b 1
          )
          pnpm bootstrap-cli:ci
          vp --version

  install-e2e-test:
    name: Local CLI `vp install` E2E test
    needs:
      - download-previous-rolldown-binaries
    runs-on: namespace-profile-linux-x64-default
    # Run if: not a PR, OR PR has 'test: install-e2e' label
    if: >-
      github.event_name != 'pull_request' ||
      contains(github.event.pull_request.labels.*.name, 'test: install-e2e')
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
      - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.0
        with:
          save-cache: ${{ github.ref_name == 'main' }}
          cache-key: install-e2e-test
      - uses: oxc-project/setup-node@fdbf0dfd334c4e6d56ceeb77d91c76339c2a0885 # v1.0.4
      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          name: rolldown-binaries
          path: ./rolldown/packages/rolldown/src
          merge-multiple: true
      - name: Build with upstream
        uses: ./.github/actions/build-upstream
        with:
          target: x86_64-unknown-linux-gnu
      - name: Build CLI
        run: |
          pnpm bootstrap-cli:ci
          echo "$HOME/.vite-plus/bin" >> $GITHUB_PATH
      - name: Run local CLI `vp install`
        run: |
          export PATH=$PWD/node_modules/.bin:$PATH
          vp -h

          # Test vp install on various repositories with different package managers
          repos=(
            # pnpm workspace
            "pnpm/pnpm:pnpm"
            "vitejs/vite:vite"
            # yarn workspace
            "napi-rs/napi-rs:napi-rs"
            "toeverything/AFFiNE:AFFiNE"
            # npm workspace
            "npm/cli:npm"
            "redhat-developer/vscode-extension-tester:vscode-extension-tester"
          )

          for repo_info in "${repos[@]}"; do
            IFS=':' read -r repo dir_name <<< "$repo_info"
            echo "Testing vp install on $repo…"
            # remove the directory if it exists
            if [ -d "$RUNNER_TEMP/$dir_name" ]; then
              rm -rf "$RUNNER_TEMP/$dir_name"
            fi
            git clone --depth 1 "https://github.com/$repo.git" "$RUNNER_TEMP/$dir_name"
            cd "$RUNNER_TEMP/$dir_name"
            vp install
            # run again to show the install-cache speedup; time the second run
            time vp install
            echo "✓ Successfully installed dependencies for $repo"
            echo ""
          done

  done:
    runs-on: ubuntu-latest
    if: always()
    needs:
      - test
      - lint
      - run
      - cli-e2e-test
    steps:
      - run: exit 1 # Thank you, next https://github.com/vercel/next.js/blob/canary/.github/workflows/build_and_test.yml#L379
        if: ${{ contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled') }}

================================================
FILE: .github/workflows/claude.yml
================================================
name: Claude Code

on:
  issues:
    types: [assigned]

jobs:
  analyze:
    if: github.repository == 'voidzero-dev/vite-plus' && github.event.action == 'assigned' && github.event.assignee.login == 'boshen'
    runs-on: ubuntu-slim
    permissions:
      contents: read
      issues: write
      id-token: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
        with:
          fetch-depth: 100
          persist-credentials: true
      - name: Run Claude Code
        id: claude
        uses: anthropics/claude-code-action@26ec041249acb0a944c0a47b6c0c13f05dbc5b44 # v1.0.70
        with:
          claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}
          assignee_trigger: 'boshen'
          claude_args: --allowedTools "Edit,Write,Read,Glob,Grep,Bash(gh:*),Bash(cargo:*),Bash(git:*),Bash(just:*),WebFetch,TodoWrite"
          prompt: |
            Analyze issue #${{ github.event.issue.number }} in ${{ github.repository }} and determine if it can be fixed.

            First, use `gh issue view ${{ github.event.issue.number }}` to read the issue details. Then:
            1. Search the codebase to gather relevant context (related files, existing implementations, tests)
            2.
Determine if the issue is fixable and estimate the complexity Finally, post a comment on the issue with: - A brief summary of your understanding of the issue - Relevant files/code you found - Whether this issue is fixable (yes/no/needs clarification) - If the issue is unclear, ask for more context - If fixable, provide a concrete implementation plan with specific steps - Any potential concerns or blockers - name: Unassign boshen if: always() env: GH_TOKEN: ${{ github.token }} run: gh issue edit ${{ github.event.issue.number }} --remove-assignee Boshen ================================================ FILE: .github/workflows/cleanup-cache.yml ================================================ name: Cleanup github runner caches on closed pull requests on: pull_request: types: - closed jobs: cleanup: runs-on: ubuntu-latest permissions: actions: write steps: - name: Cleanup run: | echo "Fetching list of cache keys" cacheKeysForPR=$(gh cache list --ref $BRANCH --limit 100 --json id --jq '.[].id') ## Setting this to not fail the workflow while deleting cache keys. 
set +e echo "Deleting caches…" for cacheKey in $cacheKeysForPR do gh cache delete $cacheKey done echo "Done" env: GH_TOKEN: ${{ github.token }} GH_REPO: ${{ github.repository }} BRANCH: refs/pull/${{ github.event.pull_request.number }}/merge ================================================ FILE: .github/workflows/deny.yml ================================================ name: Cargo Deny permissions: {} on: workflow_dispatch: pull_request: types: [opened, synchronize] paths: - 'Cargo.lock' - 'deny.toml' - '.github/workflows/deny.yml' push: branches: - main paths: - 'Cargo.lock' - 'deny.toml' - '.github/workflows/deny.yml' concurrency: group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.sha }} cancel-in-progress: ${{ github.ref_name != 'main' }} jobs: deny: name: Cargo Deny runs-on: ubuntu-latest steps: - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1 with: persist-credentials: false - name: Output rolldown hash id: upstream-versions run: node -e "console.log('ROLLDOWN_HASH=' + require('./packages/tools/.upstream-versions.json').rolldown.hash)" >> $GITHUB_OUTPUT - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1 with: repository: rolldown/rolldown path: rolldown ref: ${{ steps.upstream-versions.outputs.ROLLDOWN_HASH }} - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.0 with: restore-cache: false # Pinned to 0.18.6+ for CVSS 4.0 support (EmbarkStudios/cargo-deny#805) tools: cargo-deny@0.19.0 - run: cargo deny check ================================================ FILE: .github/workflows/e2e-test.yml ================================================ name: E2E Test permissions: {} on: workflow_dispatch: schedule: # Run every day at 0:00 GMT (8:00 AM Singapore time) - cron: '0 0 * * *' push: branches: - main paths-ignore: - '**/*.md' pull_request: types: [opened, synchronize, labeled] concurrency: group: ${{ github.workflow }}-${{ github.event.pull_request.number || 
      github.sha }}
  cancel-in-progress: ${{ github.ref_name != 'main' }}

defaults:
  run:
    shell: bash

jobs:
  detect-changes:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
    outputs:
      related-files-changed: ${{ steps.filter.outputs.related-files }}
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
        id: filter
        with:
          filters: |
            related-files:
              - 'packages/**/build.ts'
              - .github/workflows/e2e-test.yml
              - 'ecosystem-ci/*'

  download-previous-rolldown-binaries:
    needs: detect-changes
    runs-on: ubuntu-latest
    # Run if: not a PR, OR PR has 'test: e2e' label, OR PR is from deps/upstream-update branch, OR build.ts files changed
    if: >-
      github.event_name != 'pull_request' ||
      contains(github.event.pull_request.labels.*.name, 'test: e2e') ||
      github.head_ref == 'deps/upstream-update' ||
      needs.detect-changes.outputs.related-files-changed == 'true'
    permissions:
      contents: read
      packages: read
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/download-rolldown-binaries
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}

  build:
    name: Build vite-plus packages (${{ matrix.os }})
    runs-on: ${{ matrix.os }}
    permissions:
      contents: read
      packages: read
    needs:
      - download-previous-rolldown-binaries
    strategy:
      fail-fast: false
      matrix:
        include:
          - os: ubuntu-latest
            target: x86_64-unknown-linux-gnu
          - os: windows-latest
            target: x86_64-pc-windows-msvc
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
      # Disable Windows Defender real-time scanning to speed up I/O-heavy builds (~30-50% faster)
      - name: Disable Windows Defender
        if: runner.os == 'Windows'
        shell: powershell
        run: Set-MpPreference -DisableRealtimeMonitoring $true
      - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.0
        with:
          save-cache: ${{ github.ref_name == 'main' }}
          cache-key: e2e-build-${{ matrix.os }}
      - uses: oxc-project/setup-node@fdbf0dfd334c4e6d56ceeb77d91c76339c2a0885 # v1.0.4
      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          name: rolldown-binaries
          path: ./rolldown/packages/rolldown/src
          merge-multiple: true
      - name: Build with upstream
        uses: ./.github/actions/build-upstream
        with:
          target: ${{ matrix.target }}
      - name: Pack packages into tgz
        run: |
          mkdir -p tmp/tgz
          cd packages/core && pnpm pack --pack-destination ../../tmp/tgz && cd ../..
          cd packages/test && pnpm pack --pack-destination ../../tmp/tgz && cd ../..
          cd packages/cli && pnpm pack --pack-destination ../../tmp/tgz && cd ../..
          # Copy vp binary for e2e-test job (findVpBinary expects it in target/)
          cp target/${{ matrix.target }}/release/vp tmp/tgz/vp 2>/dev/null || cp target/${{ matrix.target }}/release/vp.exe tmp/tgz/vp.exe 2>/dev/null || true
          cp target/${{ matrix.target }}/release/vp-shim.exe tmp/tgz/vp-shim.exe 2>/dev/null || true
          ls -la tmp/tgz
      - name: Upload tgz artifacts
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        with:
          name: vite-plus-packages-${{ matrix.os }}
          path: tmp/tgz/
          retention-days: 1

  e2e-test:
    name: ${{ matrix.project.name }} E2E test (${{ matrix.os }})
    env:
      # For package manager install from github package registry
      GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    permissions:
      contents: read
      packages: read
    needs:
      - build
    runs-on: ${{ matrix.os }}
    timeout-minutes: 10
    strategy:
      fail-fast: false
      matrix:
        os:
          - ubuntu-latest
          - windows-latest
        project:
          - name: vibe-dashboard
            node-version: 24
            command: |
              npx playwright install chromium
              # FIXME: Failed to load JS plugin: ./plugins/debugger.js
              # vp run ready
              vp fmt
              vp test
              vp run build
          # FIXME: TypeError: Failed to fetch dynamically imported module
          # - name: skeleton
          #   node-version: 24
          #   command: |
          #     vp run format
          #     vp run lint:check
          #     vp run check
          #     npx playwright install chromium
          #     vp run test
          - name: rollipop
            node-version: 22
            command: |
              vp run -r build
              # FIXME: typescript-eslint(no-redundant-type-constituents): 'rolldownExperimental.DevEngine' is an 'error' type that acts as 'any' and overrides all other types in this union type.
              vp run lint || true
              # FIXME: src/bundler-pool.ts(8,8): error TS2307: Cannot find module '@rollipop/core' or its corresponding type declarations.
              vp run -r typecheck || true
              vp run format
              vp run @rollipop/common#test
              vp run @rollipop/core#test
              vp run @rollipop/dev-server#test
          - name: frm-stack
            node-version: 24
            command: |
              vp run lint:check
              vp run format:check
              vp run typecheck
              vp run @yourcompany/api#test
              vp run @yourcompany/backend-core#test
          - name: vue-mini
            node-version: 24
            command: |
              # FIXME: skip format for now, will re-enable after prettier migration support
              # vp run format
              vp run lint
              vp run type
              vp run test -- --coverage
          # SKIP: vite-plugin-react - vite-task config loading incompatibility
          # vite-task needs to load vite.config.js for all workspace packages to build the task graph,
          # but the vite-plus process starts with workspace root as cwd.
          # The plugin-react-swc playgrounds use SWC plugins (e.g., @swc/plugin-emotion) which
          # cannot be resolved when loading the config from workspace root.
          #
          # Minimal reproduction:
          #   git clone https://github.com/vitejs/vite-plugin-react /tmp/vite-plugin-react-test
          #   cd /tmp/vite-plugin-react-test && pnpm install && pnpm run build
          #   node packages/plugin-react-swc/playground/emotion-plugin/vite.config.js
          #
          # Error: Cannot find module '@swc/plugin-emotion'
          #
          # This works when running from within the playground directory (pnpm run build)
          # because pnpm's symlink structure allows resolution, but fails when loading from workspace root.
          # - name: vite-plugin-react
          #   node-version: 22
          #   command: |
          #     vp run format
          #     vp run lint -- --fix
          #     # TODO(fengmk2): run all builds and tests after tsdown version upgrade
          #     vp run @vitejs/plugin-rsc#build
          #     vp run @vitejs/plugin-rsc#test
          - name: vitepress
            node-version: 24
            command: |
              npx playwright install chromium
              vp run format
              vp run build
              vp test run -r __tests__/unit
              vp run tests-e2e#test
              VITE_TEST_BUILD=1 vp run tests-e2e#test
              vp run tests-init#test
          - name: tanstack-start-helloworld
            node-version: 24
            command: |
              npx playwright install chromium
              vp run test
              vp run build
          - name: oxlint-plugin-complexity
            node-version: 22
            command: |
              vp run format
              vp run format:check
              vp run build
              vp run lint
              vp run test:run
              npx tsc --noEmit
          - name: vite-vue-vercel
            node-version: 24
            command: |
              npx playwright install chromium
              vp run test
              vp run build
          - name: dify
            node-version: 24
            directory: web
            command: |
              vp run type-check:tsgo
              vp run build
              vp run test navigation-utils.test.ts real-browser-flicker.test.tsx workflow-parallel-limit.test.tsx
          - name: viteplus-ws-repro
            node-version: 24
            command: |
              vp test run
          - name: vp-config
            node-version: 22
            command: |
              vp check
              vp pack
              vp test
          - name: vinext
            node-version: 24
            command: |
              vp run build
              vp check --fix
              vp run check
              vp run test
          - name: reactive-resume
            node-version: 24
            command: |
              vp fmt
              vp lint --type-aware
              vp build
              vp test
          - name: yaak
            node-version: 24
            command: |
              vp fmt --ignore-path .oxfmtignore
              # FIXME: type-aware lint fails with "Invalid tsconfig" without full Rust/wasm bootstrap
              vp lint || true
              vp test
          - name: npmx.dev
            node-version: 24
            command: |
              vp fmt
              vp run lint
              vp run test:types
              vp test --project unit
          - name: vite-plus-jest-dom-repro
            node-version: 24
            command: |
              vp test run
        exclude:
          # frm-stack uses Docker (testcontainers) which doesn't work the same way on Windows
          - os: windows-latest
            project:
              name: frm-stack
          # dify only runs on Linux for now
          - os: windows-latest
            project:
              name: dify
          # vinext uses workerd native deps that don't build on Windows
          - os: windows-latest
            project:
              name: vinext
          # yaak is a Tauri app with Rust/wasm deps
          - os: windows-latest
            project:
              name: yaak
          # npmx.dev is a Nuxt app, ubuntu-only for now
          - os: windows-latest
            project:
              name: npmx.dev
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
        with:
          ecosystem-ci-project: ${{ matrix.project.name }}
      # Disable Windows Defender real-time scanning to speed up I/O-heavy operations
      - name: Disable Windows Defender
        if: runner.os == 'Windows'
        shell: powershell
        run: Set-MpPreference -DisableRealtimeMonitoring $true
      - uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5
        with:
          node-version: ${{ matrix.project.node-version }}
          package-manager-cache: false
      - name: Download vite-plus packages
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          name: vite-plus-packages-${{ matrix.os }}
          path: tmp/tgz
      - name: Install vp CLI
        shell: bash
        run: |
          # Place vp binary where install-global-cli.ts expects it (target/release/)
          mkdir -p target/release
          cp tmp/tgz/vp target/release/vp 2>/dev/null || cp tmp/tgz/vp.exe target/release/vp.exe 2>/dev/null || true
          cp tmp/tgz/vp-shim.exe target/release/vp-shim.exe 2>/dev/null || true
          chmod +x target/release/vp 2>/dev/null || true
          node $GITHUB_WORKSPACE/packages/tools/src/install-global-cli.ts --tgz $GITHUB_WORKSPACE/tmp/tgz/vite-plus-0.0.0.tgz
          # Use USERPROFILE (native Windows path) instead of HOME (Git Bash path /c/Users/...)
          # so cmd.exe and Node.js execSync can resolve binaries in PATH
          echo "${USERPROFILE:-$HOME}/.vite-plus/bin" >> $GITHUB_PATH
      - name: Migrate in ${{ matrix.project.name }}
        working-directory: ${{ runner.temp }}/vite-plus-ecosystem-ci/${{ matrix.project.name }}${{ matrix.project.directory && format('/{0}', matrix.project.directory) || '' }}
        shell: bash
        run: |
          node $GITHUB_WORKSPACE/ecosystem-ci/patch-project.ts ${{ matrix.project.name }}
          vp install --no-frozen-lockfile
      - name: Verify local tgz packages installed
        working-directory: ${{ runner.temp }}/vite-plus-ecosystem-ci/${{ matrix.project.name }}${{ matrix.project.directory && format('/{0}', matrix.project.directory) || '' }}
        shell: bash
        run: node $GITHUB_WORKSPACE/ecosystem-ci/verify-install.ts
      - name: Run vite-plus commands in ${{ matrix.project.name }}
        working-directory: ${{ runner.temp }}/vite-plus-ecosystem-ci/${{ matrix.project.name }}${{ matrix.project.directory && format('/{0}', matrix.project.directory) || '' }}
        run: ${{ matrix.project.command }}

  notify-failure:
    name: Notify on failure
    runs-on: ubuntu-latest
    needs: e2e-test
    if: ${{ failure() && github.event_name == 'schedule' }}
    permissions:
      contents: read
      issues: write
    steps:
      - name: Create or update GitHub issue on failure
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GH_REPO: ${{ github.repository }}
        run: |
          ISSUE_TITLE="E2E Test Scheduled Run Failed"
          ISSUE_LABEL="e2e-failure"
          RUN_URL="${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}"

          # Create label if it doesn't exist
          if ! gh label list --json name --jq '.[].name' | grep -q "^${ISSUE_LABEL}$"; then
            CREATE_LABEL_OUTPUT=$(gh label create "$ISSUE_LABEL" --color "d73a4a" --description "E2E test scheduled run failure" 2>&1)
            if [ $? -eq 0 ]; then
              echo "Created label: $ISSUE_LABEL"
            elif echo "$CREATE_LABEL_OUTPUT" | grep -qi "already exists"; then
              echo "Label '$ISSUE_LABEL' already exists, continuing."
            else
              echo "Error: Failed to create label '$ISSUE_LABEL':"
              echo "$CREATE_LABEL_OUTPUT" >&2
              exit 1
            fi
          fi

          # Search for existing open issue with the label
          EXISTING_ISSUE=$(gh issue list --label "$ISSUE_LABEL" --state open --json number --jq '.[0].number')

          if [ -z "$EXISTING_ISSUE" ]; then
            # Create new issue if none exists
            gh issue create \
              --title "$ISSUE_TITLE" \
              --label "$ISSUE_LABEL" \
              --body "The scheduled E2E test run has failed.

          **Failed Run:** $RUN_URL
          **Time:** $(date -u '+%Y-%m-%d %H:%M:%S UTC')

          Please investigate the failure and fix any issues."
            echo "Created new issue"
          else
            # Add comment to existing issue
            gh issue comment "$EXISTING_ISSUE" \
              --body "The scheduled E2E test run has failed again.

          **Failed Run:** $RUN_URL
          **Time:** $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
            echo "Added comment to issue #$EXISTING_ISSUE"
          fi


================================================
FILE: .github/workflows/issue-close-require.yml
================================================
name: Issue Close Require

on:
  schedule:
    - cron: '0 0 * * *'

jobs:
  close-issues:
    if: github.repository == 'voidzero-dev/vite-plus'
    runs-on: ubuntu-slim
    permissions:
      issues: write # for actions-cool/issues-helper to update issues
      pull-requests: write # for actions-cool/issues-helper to update PRs
    steps:
      - name: needs reproduction
        uses: actions-cool/issues-helper@71b62d7da76e59ff7b193904feb6e77d4dbb2777 # v3
        with:
          actions: 'close-issues'
          token: ${{ secrets.GITHUB_TOKEN }}
          labels: 'needs reproduction'
          inactive-day: 3


================================================
FILE: .github/workflows/issue-labeled.yml
================================================
name: Issue Labeled

on:
  issues:
    types: [labeled]

jobs:
  reply-labeled:
    if: github.repository == 'voidzero-dev/vite-plus'
    runs-on: ubuntu-slim
    permissions:
      issues: write # for actions-cool/issues-helper to update issues
      pull-requests: write # for actions-cool/issues-helper to update PRs
    steps:
      - name: contribution welcome
        if: github.event.label.name == 'contribution welcome' || github.event.label.name == 'help wanted'
        uses: actions-cool/issues-helper@71b62d7da76e59ff7b193904feb6e77d4dbb2777 # v3
        with:
          actions: 'remove-labels'
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.issue.number }}
          labels: 'pending triage, needs reproduction'
      - name: needs reproduction
        if: github.event.label.name == 'needs reproduction'
        uses: actions-cool/issues-helper@71b62d7da76e59ff7b193904feb6e77d4dbb2777 # v3
        with:
          actions: 'create-comment, remove-labels'
          token: ${{ secrets.GITHUB_TOKEN }}
          issue-number: ${{ github.event.issue.number }}
          labels: 'pending triage'
          body: |
            Hello @${{ github.event.issue.user.login }} 👋

            Please provide a [minimal reproduction](https://stackoverflow.com/help/minimal-reproducible-example) using a GitHub repository. This helps us understand and resolve your issue much faster.

            **A good reproduction should be:**
            - **Minimal** – include only the code necessary to demonstrate the issue
            - **Complete** – contain everything needed to run and observe the problem
            - **Reproducible** – consistently show the issue with clear steps

            If no reproduction is provided, issues labeled `needs reproduction` will be closed after 3 days of inactivity.

            For more context on why this is required, please read: https://antfu.me/posts/why-reproductions-are-required


================================================
FILE: .github/workflows/release.yml
================================================
name: Release

on:
  workflow_dispatch:
    inputs:
      npm_tag:
        description: 'npm tag for publish'
        required: true
        default: 'latest'
        type: choice
        options:
          - latest
          - alpha
      version:
        description: 'Override version (leave empty to auto-compute). Use when retrying a failed publish.'
        required: false
        default: ''
        type: string

permissions: {}

env:
  RELEASE_BUILD: 'true'
  DEBUG: 'napi:*'
  NPM_TAG: ${{ inputs.npm_tag }}

jobs:
  prepare:
    runs-on: ubuntu-latest
    permissions:
      contents: read
    outputs:
      version: ${{ steps.version.outputs.version }}
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          fetch-depth: 0
          fetch-tags: true
      - uses: ./.github/actions/set-snapshot-version
        if: ${{ inputs.version == '' }}
        id: computed
        with:
          npm_tag: ${{ inputs.npm_tag }}
      - name: Set final version
        id: version
        run: echo "version=${{ inputs.version || steps.computed.outputs.version }}" >> $GITHUB_OUTPUT

  build-rust:
    runs-on: ${{ matrix.settings.os }}
    needs: prepare
    permissions:
      contents: read
    env:
      VERSION: ${{ needs.prepare.outputs.version }}
    strategy:
      fail-fast: false
      matrix:
        settings:
          - target: aarch64-apple-darwin
            os: macos-latest
          - target: x86_64-apple-darwin
            os: macos-latest
          - target: aarch64-unknown-linux-gnu
            os: ubuntu-latest
          - target: x86_64-unknown-linux-gnu
            os: ubuntu-latest
          - target: x86_64-pc-windows-msvc
            os: windows-latest
          - target: aarch64-pc-windows-msvc
            os: windows-latest
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
      - uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
      - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.2
        with:
          save-cache: ${{ github.ref_name == 'main' }}
          cache-key: release
      - name: Rustup Adds Target
        run: rustup target add ${{ matrix.settings.target }}
      - uses: oxc-project/setup-node@fdbf0dfd334c4e6d56ceeb77d91c76339c2a0885 # v1.0.4
      - name: Set binding version
        shell: bash
        run: |
          pnpm exec tool replace-file-content packages/cli/binding/Cargo.toml 'version = "0.0.0"' 'version = "${{ env.VERSION }}"'
          pnpm exec tool replace-file-content crates/vite_global_cli/Cargo.toml 'version = "0.0.0"' 'version = "${{ env.VERSION }}"'
          cat crates/vite_global_cli/Cargo.toml
      - name: Verify version replacement
        shell: bash
        run: |
          if grep -q 'version = "0.0.0"' crates/vite_global_cli/Cargo.toml; then
            echo "ERROR: Version replacement failed for crates/vite_global_cli/Cargo.toml"
            head -5 crates/vite_global_cli/Cargo.toml
            exit 1
          fi
          if grep -q 'version = "0.0.0"' packages/cli/binding/Cargo.toml; then
            echo "ERROR: Version replacement failed for packages/cli/binding/Cargo.toml"
            head -5 packages/cli/binding/Cargo.toml
            exit 1
          fi
          echo "Version replacement verified successfully"
      - name: Build
        uses: ./.github/actions/build-upstream
        with:
          target: ${{ matrix.settings.target }}
      - name: Upload Vite+ native artifact
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        with:
          name: vite-plus-native-${{ matrix.settings.target }}
          path: ./packages/cli/binding/*.node
          if-no-files-found: error
      - name: Upload Rust CLI binary artifact
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        with:
          name: vite-global-cli-${{ matrix.settings.target }}
          path: |
            ./target/${{ matrix.settings.target }}/release/vp
            ./target/${{ matrix.settings.target }}/release/vp.exe
            ./target/${{ matrix.settings.target }}/release/vp-shim.exe
          if-no-files-found: error
      - name: Remove .node files before upload dist
        if: ${{ matrix.settings.target == 'x86_64-unknown-linux-gnu' }}
        run: |
          rm ./packages/core/dist/**/*.node
      - name: Upload core dist
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        if: ${{ matrix.settings.target == 'x86_64-unknown-linux-gnu' }}
        with:
          name: core
          path: ./packages/core/dist
          if-no-files-found: error
      - name: Upload cli dist
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        if: ${{ matrix.settings.target == 'x86_64-unknown-linux-gnu' }}
        with:
          name: cli
          path: ./packages/cli/dist
          if-no-files-found: error
      - name: Upload cli skills (docs for agent integration)
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
        if: ${{ matrix.settings.target == 'x86_64-unknown-linux-gnu' }}
        with:
          name: cli-skills
          path: ./packages/cli/skills
          if-no-files-found: error

  Release:
    runs-on: ubuntu-latest
    needs: [prepare, build-rust]
    permissions:
      contents: write
      packages: write
      id-token: write # Required for OIDC
    env:
      VERSION: ${{ needs.prepare.outputs.version }}
    steps:
      - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: ./.github/actions/clone
      - uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
      - uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6.2.0
        with:
          node-version-file: .node-version
          package-manager-cache: false
          registry-url: 'https://registry.npmjs.org'
          cache: 'pnpm'
      - name: Install dependencies
        run: pnpm install
      - name: Download cli dist
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          path: packages/cli/dist
          pattern: cli
          merge-multiple: true
      - name: Download cli skills
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          path: packages/cli/skills
          pattern: cli-skills
          merge-multiple: true
      - name: Download cli binding
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          path: packages/cli/artifacts
          pattern: vite-plus-native-*
      - name: Download core dist
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          path: packages/core/dist
          pattern: core
          merge-multiple: true
      - uses: ./.github/actions/download-rolldown-binaries
        with:
          github-token: ${{ github.token }}
          target: x86_64-unknown-linux-gnu
          upload: 'false'
      - name: Download Rust CLI binaries
        uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
        with:
          path: rust-cli-artifacts
          pattern: vite-global-cli-*
      - name: Move Rust CLI binaries to target directories
        run: |
          # Move each artifact's binary to the correct target directory
          for artifact_dir in rust-cli-artifacts/vite-global-cli-*/; do
            if [ -d "$artifact_dir" ]; then
              # Extract target name from directory (e.g.,
vite-global-cli-x86_64-unknown-linux-gnu -> x86_64-unknown-linux-gnu) dir_name=$(basename "$artifact_dir") target_name=${dir_name#vite-global-cli-} # Create target directory and copy binary mkdir -p "target/${target_name}/release" cp -r "$artifact_dir"* "target/${target_name}/release/" fi done # Show what we have (fail if no binaries found) vp_files=$(find target -name "vp*" -type f 2>/dev/null || echo "") if [ -z "$vp_files" ]; then echo "Error: No vp binaries found in target directory" echo "Artifact contents:" find rust-cli-artifacts -type f || true exit 1 fi echo "Found binaries:" echo "$vp_files" - name: Set npm packages version run: | sed -i 's/"version": "0.0.0"/"version": "${{ env.VERSION }}"/' packages/core/package.json sed -i 's/"version": "0.0.0"/"version": "${{ env.VERSION }}"/' packages/test/package.json sed -i 's/"version": "0.0.0"/"version": "${{ env.VERSION }}"/' packages/cli/package.json - name: Build test run: pnpm --filter=@voidzero-dev/vite-plus-test build - name: 'Setup npm' run: | npm install -g npm@latest - name: Publish native addons run: | node ./packages/cli/publish-native-addons.ts - name: Publish run: | pnpm publish --filter=./packages/core --tag ${{ inputs.npm_tag }} --access public --no-git-checks pnpm publish --filter=./packages/test --tag ${{ inputs.npm_tag }} --access public --no-git-checks pnpm publish --filter=./packages/cli --tag ${{ inputs.npm_tag }} --access public --no-git-checks - name: Create release body run: | if [[ "${{ inputs.npm_tag }}" == "latest" ]]; then INSTALL_BASH="curl -fsSL https://vite.plus | bash" INSTALL_PS1="irm https://vite.plus/ps1 | iex" else INSTALL_BASH="curl -fsSL https://vite.plus | VITE_PLUS_VERSION=${{ env.VERSION }} bash" INSTALL_PS1="\\\$env:VITE_PLUS_VERSION=\\\"${{ env.VERSION }}\\\"; irm https://vite.plus/ps1 | iex" fi cat > ./RELEASE_BODY.md <> $GITHUB_PATH - name: Verify bin setup run: | # Verify bin directory was created by vp env --setup BIN_PATH="$HOME/.vite-plus/bin" ls -al "$BIN_PATH" if 
[ ! -d "$BIN_PATH" ]; then echo "Error: Bin directory not found: $BIN_PATH" exit 1 fi # Verify shim executables exist for shim in node npm npx; do if [ ! -f "$BIN_PATH/$shim" ]; then echo "Error: Shim not found: $BIN_PATH/$shim" exit 1 fi echo "Found shim: $BIN_PATH/$shim" done # Verify vp env doctor works vp env doctor vp env run --node 24 -- node -p "process.versions" which node which npm which npx which vp - name: Verify upgrade run: | # --check queries npm registry and prints update status vp upgrade --check vp upgrade 0.0.0-gbe8891a5.20260227-1615 vp --version # rollback to the previous version (should succeed after a real update) vp upgrade --rollback vp --version test-install-sh-readonly-config: name: Test install.sh (readonly shell config) runs-on: ubuntu-latest permissions: contents: read steps: - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1 - name: Make shell config files read-only run: | # Simulate Nix-managed or read-only shell configs touch ~/.bashrc ~/.bash_profile ~/.profile chmod 444 ~/.bashrc ~/.bash_profile ~/.profile - name: Run install.sh run: | output=$(cat packages/cli/install.sh | bash 2>&1) || { echo "$output" echo "Install script exited with non-zero status" exit 1 } echo "$output" # Verify installation succeeds (not a fatal error) echo "$output" | grep -q "successfully installed" # Verify fallback message shows binary location echo "$output" | grep -q "vp was installed to:" # Verify fallback message shows manual instructions echo "$output" | grep -q "Or run vp directly:" # Verify the permission warning was shown echo "$output" | grep -qi "permission denied" - name: Verify vp works via direct path run: | ~/.vite-plus/bin/vp --version test-install-sh-arm64: name: Test install.sh (Linux ARM64 glibc via QEMU) runs-on: ubuntu-latest permissions: contents: read steps: - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1 - name: Set up QEMU uses: 
docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392 # v3.6.0 with: platforms: arm64 - name: Run install.sh in ARM64 container run: | docker run --rm --platform linux/arm64 \ -v "${{ github.workspace }}:/workspace" \ -e VITE_PLUS_VERSION=alpha \ ubuntu:20.04 bash -c " ls -al ~/ apt-get update && apt-get install -y curl ca-certificates cat /workspace/packages/cli/install.sh | bash if [ -f ~/.profile ]; then source ~/.profile elif [ -f ~/.bashrc ]; then source ~/.bashrc else export PATH="$HOME/.vite-plus/bin:$PATH" fi vp --version vp --help vp dlx print-current-version # Verify bin setup BIN_PATH=\"\$HOME/.vite-plus/bin\" if [ ! -d \"\$BIN_PATH\" ]; then echo \"Error: Bin directory not found: \$BIN_PATH\" exit 1 fi for shim in node npm npx; do if [ ! -f \"\$BIN_PATH/\$shim\" ]; then echo \"Error: Shim not found: \$BIN_PATH/\$shim\" exit 1 fi echo \"Found shim: \$BIN_PATH/\$shim\" done vp env doctor export VITE_LOG=trace vp env run --node 24 -- node -p \"process.versions\" # Verify upgrade vp upgrade --check vp upgrade 0.0.0-gbe8891a5.20260227-1615 vp --version vp upgrade --rollback vp --version # FIXME: qemu: uncaught target signal 11 (Segmentation fault) - core dumped # vp create vite --no-interactive --no-agent -- hello --no-interactive -t vanilla # cd hello && vp run build " test-install-ps1-v5: name: Test install.ps1 (Windows x64, PowerShell 5.1) runs-on: windows-latest permissions: contents: read steps: - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1 - name: Assert PowerShell 5.x shell: powershell run: | Write-Host "PowerShell version: $($PSVersionTable.PSVersion)" if ($PSVersionTable.PSVersion.Major -ne 5) { Write-Error "Expected PowerShell 5.x but got $($PSVersionTable.PSVersion)" exit 1 } - name: Run install.ps1 shell: powershell run: | & ./packages/cli/install.ps1 - name: Run install.ps1 via irm simulation (catches BOM issues) shell: powershell run: | $ErrorActionPreference = "Stop" Get-Content 
./packages/cli/install.ps1 -Raw | Invoke-Expression - name: Set PATH shell: bash run: | echo "$USERPROFILE\.vite-plus\bin" >> $GITHUB_PATH - name: Verify installation shell: powershell working-directory: ${{ runner.temp }} run: | Write-Host "PATH: $env:Path" vp --version vp --help vp create vite --no-interactive --no-agent -- hello --no-interactive -t vanilla cd hello vp run build vp --version - name: Verify bin setup shell: powershell run: | $binPath = "$env:USERPROFILE\.vite-plus\bin" Get-ChildItem -Force $binPath if (-not (Test-Path $binPath)) { Write-Error "Bin directory not found: $binPath" exit 1 } $expectedShims = @("node.exe", "npm.exe", "npx.exe") foreach ($shim in $expectedShims) { $shimFile = Join-Path $binPath $shim if (-not (Test-Path $shimFile)) { Write-Error "Shim not found: $shimFile" exit 1 } Write-Host "Found shim: $shimFile" } where.exe node where.exe npm where.exe npx where.exe vp $env:Path = "$env:USERPROFILE\.vite-plus\bin;$env:Path" vp env doctor vp env run --node 24 -- node -p "process.versions" - name: Verify upgrade shell: powershell run: | vp upgrade --check vp upgrade 0.0.0-gbe8891a5.20260227-1615 vp --version vp upgrade --rollback vp --version test-install-ps1-arm64: name: Test install.ps1 (Windows ARM64) runs-on: windows-11-arm permissions: contents: read steps: - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1 - name: Run install.ps1 shell: pwsh run: | & ./packages/cli/install.ps1 - name: Set PATH shell: bash run: | echo "$USERPROFILE\.vite-plus\bin" >> $GITHUB_PATH - name: Verify installation shell: pwsh working-directory: ${{ runner.temp }} run: | Write-Host "PATH: $env:Path" vp --version vp --help vp create vite --no-interactive --no-agent -- hello --no-interactive -t vanilla cd hello vp run build vp --version - name: Verify bin setup shell: pwsh run: | $binPath = "$env:USERPROFILE\.vite-plus\bin" Get-ChildItem -Force $binPath if (-not (Test-Path $binPath)) { Write-Error "Bin directory not found: $binPath" 
exit 1 } $expectedShims = @("node.exe", "npm.exe", "npx.exe") foreach ($shim in $expectedShims) { $shimFile = Join-Path $binPath $shim if (-not (Test-Path $shimFile)) { Write-Error "Shim not found: $shimFile" exit 1 } Write-Host "Found shim: $shimFile" } where.exe node where.exe npm where.exe npx where.exe vp $env:Path = "$env:USERPROFILE\.vite-plus\bin;$env:Path" vp env doctor vp env run --node 24 -- node -p "process.versions" - name: Verify upgrade shell: pwsh run: | vp upgrade --check vp upgrade 0.0.0-gbe8891a5.20260227-1615 vp --version vp upgrade --rollback vp --version test-install-ps1: name: Test install.ps1 (Windows x64) runs-on: windows-latest permissions: contents: read steps: - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1 - name: Run install.ps1 shell: pwsh run: | & ./packages/cli/install.ps1 - name: Set PATH shell: bash run: | echo "$USERPROFILE\.vite-plus\bin" >> $GITHUB_PATH - name: Verify upgrade shell: pwsh run: | # --check queries npm registry and prints update status vp upgrade --check vp upgrade 0.0.0-gbe8891a5.20260227-1615 vp --version # rollback to the previous version (should succeed after a real update) vp upgrade --rollback vp --version - name: Verify installation on powershell shell: pwsh working-directory: ${{ runner.temp }} run: | # Print PATH from environment echo "PATH: $env:Path" vp --version vp --help # $env:VITE_LOG = "trace" # test create command vp create vite --no-interactive --no-agent -- hello --no-interactive -t vanilla cd hello && vp run build && vp --version - name: Verify bin setup on powershell shell: pwsh run: | # Verify bin directory was created by vp env --setup $binPath = "$env:USERPROFILE\.vite-plus\bin" Get-ChildItem -Force $binPath if (-not (Test-Path $binPath)) { Write-Error "Bin directory not found: $binPath" exit 1 } # Verify shim executables exist (trampoline .exe files on Windows) $expectedShims = @("node.exe", "npm.exe", "npx.exe") foreach ($shim in $expectedShims) { $shimFile = 
Join-Path $binPath $shim if (-not (Test-Path $shimFile)) { Write-Error "Shim not found: $shimFile" exit 1 } Write-Host "Found shim: $shimFile" } where.exe node where.exe npm where.exe npx where.exe vp # Verify vp env doctor works $env:Path = "$env:USERPROFILE\.vite-plus\bin;$env:Path" vp env doctor vp env run --node 24 -- node -p "process.versions" - name: Verify installation on cmd shell: cmd working-directory: ${{ runner.temp }} run: | echo PATH: %PATH% dir "%USERPROFILE%\.vite-plus" dir "%USERPROFILE%\.vite-plus\bin" REM test create command vp create vite --no-interactive --no-agent -- hello-cmd --no-interactive -t vanilla cd hello-cmd && vp run build && vp --version - name: Verify bin setup on cmd shell: cmd run: | REM Verify bin directory was created by vp env --setup set "BIN_PATH=%USERPROFILE%\.vite-plus\bin" dir "%BIN_PATH%" REM Verify shim executables exist (Windows uses trampoline .exe files) for %%s in (node.exe npm.exe npx.exe vp.exe) do ( if not exist "%BIN_PATH%\%%s" ( echo Error: Shim not found: %BIN_PATH%\%%s exit /b 1 ) echo Found shim: %BIN_PATH%\%%s ) where node where npm where npx where vp REM Verify vp env doctor works vp env doctor vp env run --node 24 -- node -p "process.versions" - name: Verify installation on bash shell: bash working-directory: ${{ runner.temp }} run: | echo "PATH: $PATH" ls -al ~/.vite-plus ls -al ~/.vite-plus/bin vp --version vp --help # test create command vp create vite --no-interactive --no-agent -- hello-bash --no-interactive -t vanilla cd hello-bash && vp run build && vp --version - name: Verify bin setup on bash shell: bash run: | # Verify bin directory was created by vp env --setup BIN_PATH="$HOME/.vite-plus/bin" ls -al "$BIN_PATH" if [ ! -d "$BIN_PATH" ]; then echo "Error: Bin directory not found: $BIN_PATH" exit 1 fi # Verify trampoline .exe files exist for shim in node.exe npm.exe npx.exe vp.exe; do if [ ! 
-f "$BIN_PATH/$shim" ]; then echo "Error: Trampoline shim not found: $BIN_PATH/$shim" exit 1 fi echo "Found trampoline shim: $BIN_PATH/$shim" done # Verify vp env doctor works vp env doctor vp env run --node 24 -- node -p "process.versions" which node which npm which npx which vp ================================================ FILE: .github/workflows/upgrade-deps.yml ================================================ name: Upgrade Upstream Dependencies on: schedule: - cron: '0 0 * * *' # Daily at midnight UTC workflow_dispatch: # Manual trigger permissions: {} jobs: upgrade: runs-on: ubuntu-latest permissions: contents: write pull-requests: write actions: read id-token: write steps: - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1 - uses: ./.github/actions/clone - uses: oxc-project/setup-rust@d286d43bc1f606abbd98096666ff8be68c8d5f57 # v1.0.0 with: save-cache: ${{ github.ref_name == 'main' }} cache-key: upgrade-deps tools: just,cargo-shear - uses: oxc-project/setup-node@fdbf0dfd334c4e6d56ceeb77d91c76339c2a0885 # v1.0.4 - name: Rustup Adds Target run: rustup target add x86_64-unknown-linux-gnu - name: Rustup Adds Target for rolldown working-directory: rolldown run: rustup target add x86_64-unknown-linux-gnu - name: Upgrade dependencies id: upgrade env: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} run: node .github/scripts/upgrade-deps.mjs - name: Sync remote and build id: build continue-on-error: true # Create PR even if build fails run: | pnpm install --no-frozen-lockfile pnpm tool sync-remote pnpm install --no-frozen-lockfile - name: Build uses: ./.github/actions/build-upstream id: build-upstream continue-on-error: true with: target: x86_64-unknown-linux-gnu print-after-build: 'true' env: RELEASE_BUILD: 'true' - uses: anthropics/claude-code-action@eb99fb38f09dedf69f423f1315d6c0272ace56a0 # Claude Code to 2.1.72 env: RELEASE_BUILD: 'true' with: claude_code_oauth_token: ${{ secrets.ANTHROPIC_API_KEY }} github_token: ${{ secrets.GITHUB_TOKEN }} 
show_full_output: 'true' prompt: | Check if the build-upstream steps failed and fix them. ### Background - The build-upstream steps are at ./.github/actions/build-upstream/action.yml - The deps upgrade script is at ./.github/scripts/upgrade-deps.mjs ### Instructions - We are using `pnpm` as the package manager - We are aiming to upgrade all dependencies to the latest versions in this workflow, so don't downgrade any dependencies. - Compare tsdown CLI options with `vp pack` and sync any new or removed options. Follow the instructions in `.claude/skills/sync-tsdown-cli/SKILL.md`. - Check `.claude/agents/cargo-workspace-merger.md` if the rolldown hash has changed. - Run the steps in the `build-upstream` action.yml after your fixes. If no errors are found, it is safe to exit. - Install the global CLI after the build-upstream steps succeed, by running the following commands: - `pnpm bootstrap-cli:ci` - `echo "$HOME/.vite-plus/bin" >> $GITHUB_PATH` - Run `pnpm run lint` to check whether there are any issues after the build; if there are, investigate them deeply and fix them. You need to run `just build` before you can run `pnpm run lint`. - Run `pnpm run test` after `just build` to ensure all tests pass. - The snapshot tests in `pnpm run test` always exit successfully, so you need to check the snapshot diffs in git to see whether anything is wrong after our deps upgrade. - If deps in our `Cargo.toml` need to be upgraded, refer to `./.claude/agents/cargo-workspace-merger.md` - If `Cargo.toml` has been modified, you need to run `cargo shear` to ensure there is nothing wrong with our dependencies. - Run `cargo check --all-targets --all-features` to ensure everything works if any Rust-related code is modified. - Run the following commands to ensure everything works fine: vp -h vp run -h vp lint -h vp test -h vp build -h vp fmt -h vp pack -h - Your final step is to run `just build` to ensure all builds are successful. 
Help me fix the errors in the `build-upstream` steps if they exist. There is no need to commit changes after your fixes; a following step commits all file changes. claude_args: | --model opus --allowedTools "Bash,Edit,Replace,NotebookEditCell" additional_permissions: | actions: read - name: Update lockfile run: | pnpm install --no-frozen-lockfile pnpm dedupe - name: Checkout binding files run: | git checkout packages/cli/binding/index.cjs git checkout packages/cli/binding/index.d.cts - name: Format code run: pnpm fmt - name: Close and delete previous PR env: GH_TOKEN: ${{ secrets.AUTO_UPDATE_BRANCH_TOKEN }} run: | # Find PR with the deps/upstream-update branch PR_NUMBER=$(gh pr list --head deps/upstream-update --json number --jq '.[0].number') if [ -n "$PR_NUMBER" ]; then echo "Found existing PR #$PR_NUMBER, closing and deleting branch…" gh pr close "$PR_NUMBER" --delete-branch else echo "No existing PR found with branch deps/upstream-update" fi - name: Create/Update PR uses: peter-evans/create-pull-request@22a9089034f40e5a961c8808d113e2c98fb63676 # v7.0.11 with: base: main branch: deps/upstream-update title: 'feat(deps): upgrade upstream dependencies' sign-commits: true token: ${{ secrets.AUTO_UPDATE_BRANCH_TOKEN }} branch-token: ${{ secrets.GITHUB_TOKEN }} body: | Automated daily upgrade of upstream dependencies: - rolldown (latest tag) - vite (latest tag) - vitest (latest npm version) - tsdown (latest npm version) Build status: ${{ steps.build.outcome }} commit-message: 'feat(deps): upgrade upstream dependencies' ================================================ FILE: .github/workflows/zizmor.yml ================================================ name: Zizmor permissions: {} on: workflow_dispatch: pull_request: types: [opened, synchronize] paths: - '.github/workflows/**' push: branches: - main - 'renovate/**' paths: - '.github/workflows/**' jobs: zizmor: name: zizmor runs-on: ubuntu-latest permissions: security-events: write steps: - name: Checkout repository uses: 
actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4 with: persist-credentials: false submodules: true - uses: taiki-e/install-action@ae97ff9daf1cd2e216671a047d80ff48461e30bb # v2.49.1 with: tool: zizmor - name: Run zizmor run: zizmor --format sarif . > results.sarif env: GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} - name: Upload SARIF file uses: github/codeql-action/upload-sarif@b56ba49b26e50535fa1e7f7db0f4f7b4bf65d80d # v3 with: sarif_file: results.sarif category: zizmor ================================================ FILE: .gitignore ================================================ /target node_modules dist .claude/settings.local.json *.tsbuildinfo .DS_Store rolldown rolldown-vite vite /crates/vite_global_cli/vp ================================================ FILE: .husky/pre-commit ================================================ pnpm lint-staged ================================================ FILE: .node-version ================================================ 22.18.0 ================================================ FILE: .rustfmt.toml ================================================ style_edition = "2024" # Make Rust more readable given most people have wide screens nowadays. 
# This is also the setting used by [rustc](https://github.com/rust-lang/rust/blob/master/rustfmt.toml) use_small_heuristics = "Max" # Use field initialize shorthand if possible use_field_init_shorthand = true reorder_modules = true # All unstable features that we wish for # unstable_features = true # Provide a cleaner impl order reorder_impl_items = true # Provide a cleaner import sort order group_imports = "StdExternalCrate" # Group "use" statements by crate imports_granularity = "Crate" ================================================ FILE: .typos.toml ================================================ [default.extend-words] ratatui = "ratatui" PUNICODE = "PUNICODE" Jod = "Jod" # Node.js v22 LTS codename [files] extend-exclude = [ "**/snap-tests/**/snap.txt", "crates/fspy_detours_sys/detours", "crates/fspy_detours_sys/src/generated_bindings.rs", "packages/cli/src/oxfmt-config.ts", ] ================================================ FILE: .vscode/extensions.json ================================================ { "recommendations": ["VoidZero.vite-plus-extension-pack"] } ================================================ FILE: .vscode/settings.json ================================================ { "editor.formatOnSave": true, "files.insertFinalNewline": true, "files.trimFinalNewlines": true, "[javascript]": { "editor.defaultFormatter": "oxc.oxc-vscode" }, "[typescriptreact]": { "editor.defaultFormatter": "oxc.oxc-vscode" }, "[typescript]": { "editor.defaultFormatter": "oxc.oxc-vscode" }, "[json]": { "editor.defaultFormatter": "oxc.oxc-vscode" }, "typescript.preferences.importModuleSpecifierEnding": "js", "typescript.reportStyleChecksAsWarnings": false, "typescript.updateImportsOnFileMove.enabled": "always", "typescript.experimental.useTsgo": true } ================================================ FILE: CLAUDE.md ================================================ # Vite-Plus A monorepo task runner (like nx/turbo) with intelligent caching and dependency resolution. 
## Core Concept **Task Execution**: Run tasks across monorepo packages with automatic dependency ordering. ```bash # Built-in commands vp build # Run Vite build (dedicated command) vp test # Run Vitest (dedicated command) vp lint # Run oxlint (dedicated command) # Run tasks across packages (explicit mode) vp run build -r # recursive with topological ordering vp run app#build web#build # specific packages vp run build -r --no-topological # recursive without implicit deps # Run task in current package (implicit mode - for non-built-in tasks) vp run dev # runs dev script from package.json ``` ## Key Architecture - **Entry**: `crates/vite_task/src/lib.rs` - CLI parsing and main logic - **Workspace**: `crates/vite_task/src/config/workspace.rs` - Loads packages, creates task graph - **Task Graph**: `crates/vite_task/src/config/task_graph_builder.rs` - Builds dependency graph - **Execution**: `crates/vite_task/src/schedule.rs` - Executes tasks in dependency order ## Task Dependencies 1. **Explicit** (always applied): Defined in `vite-task.json` ```json { "tasks": { "test": { "command": "jest", "dependsOn": ["build", "lint"] } } } ``` 2. 
**Implicit** (when `--topological`): Based on package.json dependencies - If A depends on B, then A#build depends on B#build automatically ## Key Features - **Topological Flag**: Controls implicit dependencies from package relationships - Default: ON for `--recursive`, OFF otherwise - Toggle with `--no-topological` to disable - **Boolean Flags**: All support `--no-*` pattern for explicit disable - Example: `--recursive` vs `--no-recursive` - Conflicts handled by clap - If you want to add a new boolean flag, follow this pattern ## Path Type System - **Type Safety**: All paths use typed `vite_path` instead of `std::path` for better safety - **Absolute Paths**: `vite_path::AbsolutePath` / `AbsolutePathBuf` - **Relative Paths**: `vite_path::RelativePath` / `RelativePathBuf` - **Usage Guidelines**: - Use methods such as `strip_prefix`/`join` provided in `vite_path` for path operations instead of converting to std paths - Only convert to std paths when interfacing with std library functions, and this should be implicit in most cases thanks to `AsRef` implementations - Add necessary methods in `vite_path` instead of falling back to std path types - **Converting from std paths** (e.g., `TempDir::path()`): ```rust let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); ``` - **Function signatures**: Prefer `&AbsolutePath` over `&std::path::Path` - **Passing to std functions**: `AbsolutePath` implements `AsRef`, use `.as_path()` when explicit `&Path` is required ## Clippy Rules All **new** Rust code must follow the custom clippy rules defined in `.clippy.toml` (disallowed types, macros, and methods). Existing code may not fully comply due to historical reasons. ## CLI Output All user-facing output must go through shared output modules instead of raw print calls. 
- **Rust**: Use `vite_shared::output` functions (`info`, `warn`, `error`, `note`, `success`) — never raw `println!`/`eprintln!` (enforced by clippy `disallowed-macros`) - **TypeScript**: Use `packages/cli/src/utils/terminal.ts` functions (`infoMsg`, `warnMsg`, `errorMsg`, `noteMsg`, `log`) — never raw `console.log`/`console.error` ## Git Workflow - Run `vp check --fix` before committing to format and lint code ## Quick Reference - **Compound Commands**: `"build": "tsc && rollup"` splits into subtasks - **Task Format**: `package#task` (e.g., `app#build`) - **Path Types**: Use `vite_path` types instead of `std::path` types for type safety - **Tests**: Run `cargo test -p vite_task` to verify changes - **Debug**: Use `--debug` to see cache operations ## Tests - Run `cargo test` to execute all tests - You never need to run `pnpm install` in the test fixtures dir; vite-plus should be able to load and parse the workspace without `pnpm install`. ## Build - Run `pnpm bootstrap-cli` from the project root to build all packages and install the global CLI - This builds all `@voidzero-dev/*` and `vite-plus` packages - Compiles the Rust NAPI bindings and the `vp` Rust binary - Installs the CLI globally to `~/.vite-plus/` ## Snap Tests Snap tests are located in `packages/cli/snap-tests/` (local CLI) and `packages/cli/snap-tests-global/` (global CLI). Each test case is a directory containing: - `package.json` - Package configuration for the test - `steps.json` - Commands to run and environment variables - `src/` - Source files for the test - `snap.txt` - Expected output (generated/updated by running the test) ```bash # Run all snap tests (local + global) pnpm -F vite-plus snap-test # Run only local CLI snap tests pnpm -F vite-plus snap-test-local # Run only global CLI snap tests pnpm -F vite-plus snap-test-global ``` The snap test will automatically generate/update the `snap.txt` file with the command outputs. 
It exits with zero status even if there are output differences; you need to manually check the diffs (`git diff`) to verify correctness. ================================================ FILE: CONTRIBUTING.md ================================================ # Contributing Guide ## Initial Setup ### macOS / Linux You'll need the following tools installed on your system: ``` brew install pnpm node just cmake ``` Install Rust & Cargo using rustup: ``` curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh cargo install cargo-binstall ``` Initial setup to install dependencies for Vite+: ``` just init ``` ### Windows You'll need the following tools installed on your system. You can use [winget](https://learn.microsoft.com/en-us/windows/package-manager/). ```powershell winget install pnpm.pnpm OpenJS.NodeJS.LTS Casey.Just Kitware.CMake ``` Install Rust & Cargo from [rustup.rs](https://rustup.rs/), then install `cargo-binstall`: ```powershell cargo install cargo-binstall ``` Initial setup to install dependencies for Vite+: ```powershell just init ``` **Note:** Run commands in PowerShell or Windows Terminal. Some commands may require elevated permissions. ## Build Vite+ and upstream dependencies To create a release build of Vite+ and all upstream dependencies, run: ``` just build ``` ## Install the Vite+ Global CLI from source code ``` pnpm bootstrap-cli vp --version ``` This builds all packages, compiles the Rust `vp` binary, and installs the CLI to `~/.vite-plus`. ## Workflow for build and test You can run this command to build, test, and check if there are any snapshot changes: ``` pnpm bootstrap-cli && pnpm test && git status ``` ## Running Snap Tests Snap tests verify CLI output. They are located in `packages/cli/snap-tests/` (local CLI) and `packages/cli/snap-tests-global/` (global CLI). 
```bash # Run all snap tests (local + global) pnpm -F vite-plus snap-test # Run only local CLI snap tests pnpm -F vite-plus snap-test-local # Run only global CLI snap tests pnpm -F vite-plus snap-test-global ``` Snap tests auto-generate `snap.txt` files. Check `git diff` to verify output changes are correct. ## Verified Commits All commits in PR branches should be GitHub-verified so reviewers can confirm commit authenticity. Set up local commit signing and GitHub verification first: - Follow GitHub's guide for GPG commit signature verification: https://docs.github.com/en/authentication/managing-commit-signature-verification/about-commit-signature-verification#gpg-commit-signature-verification - If you use Graphite, add the Graphite GPG key to your GitHub account from the Graphite UI as well; otherwise, commits updated by Graphite won't show as verified. After setup, re-sign any existing commits in your branch so the full branch is verified: ```bash # Re-sign each commit on your branch (replace origin/main with your branch base if needed) git rebase -i origin/main # At each stop: git commit --amend --date=now --no-edit -S # Then continue: git rebase --continue ``` When done, force-push the updated branch history: ```bash git push --force-with-lease ``` ## Pull upstream dependencies > [!NOTE] > > Upstream dependencies only need to be updated when an ["upgrade upstream dependencies"](https://github.com/voidzero-dev/vite-plus/pulls?q=is%3Apr+feat%28deps%29%3A+upgrade+upstream+dependencies+merged) pull request is merged. To sync the latest upstream dependencies such as Rolldown and Vite, run: ``` pnpm tool sync-remote just build ``` ## macOS Performance Tip If you are using macOS, add your terminal app (Ghostty, iTerm2, Terminal, …) to the approved "Developer Tools" apps in the Privacy panel of System Settings and restart your terminal app. Your Rust builds will be about 30% faster. 
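The interactive re-sign loop in the Verified Commits section can also be done in one non-interactive pass with `git rebase --exec`, which runs a command after each replayed commit. This is a sketch, not part of the documented workflow; the demo builds a throwaway repo and omits `-S` so it runs without a GPG key configured (add `-S` back, as in the interactive loop above, for real use):

```shell
# Build a throwaway repo with one commit on top of a base branch.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"
echo a > a.txt && git add a.txt && git commit -qm "base commit"
git branch base                      # stands in for origin/main
echo b > b.txt && git add b.txt && git commit -qm "feature commit"

# --exec amends each commit right after it is replayed, rewriting the
# whole branch in one pass (with -S added, each amend also re-signs).
git rebase --exec 'git commit --amend --no-edit --date=now' base

git log --oneline
```

After this, a `git push --force-with-lease` publishes the rewritten branch, exactly as in the interactive variant.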
================================================ FILE: Cargo.toml ================================================ [workspace] resolver = "3" members = ["bench", "crates/*", "packages/cli/binding"] [workspace.metadata.cargo-shear] ignored = [ # These workspace dependencies are used by rolldown crates, not our local crates "css-module-lexer", "html5gum", "rolldown_filter_analyzer", "rolldown_plugin_vite_asset", "rolldown_plugin_vite_asset_import_meta_url", "rolldown_plugin_vite_css", "rolldown_plugin_vite_css_post", "rolldown_plugin_vite_html", "rolldown_plugin_vite_html_inline_proxy", "string_cache", ] [workspace.package] authors = ["Vite+ Authors"] edition = "2024" homepage = "https://github.com/voidzero-dev/vite-plus" license = "MIT" repository = "https://github.com/voidzero-dev/vite-plus" rust-version = "1.92.0" [workspace.lints.rust] absolute_paths_not_starting_with_crate = "warn" non_ascii_idents = "warn" unit-bindings = "warn" unexpected_cfgs = { level = "warn", check-cfg = ['cfg(coverage)', 'cfg(coverage_nightly)'] } unsafe_op_in_unsafe_fn = "warn" unused_unsafe = "warn" [workspace.lints.clippy] all = { level = "warn", priority = -1 } # restriction dbg_macro = "warn" todo = "warn" unimplemented = "warn" print_stdout = "warn" print_stderr = "warn" allow_attributes = "warn" pedantic = { level = "warn", priority = -1 } nursery = { level = "warn", priority = -1 } cargo = { level = "warn", priority = -1 } cargo_common_metadata = "allow" [workspace.dependencies] anyhow = "1.0.98" append-only-vec = "0.1.7" arcstr = { version = "1.2.0", default-features = false } ariadne = { package = "rolldown-ariadne", version = "0.5.3" } ast-grep-config = "0.40.1" ast-grep-core = "0.40.1" ast-grep-language = { version = "0.40.1", default-features = false, features = [ "tree-sitter-bash", "tree-sitter-typescript", ] } async-channel = "2.3.1" async-scoped = "0.9.0" async-trait = "0.1.89" backon = "1.3.0" base-encode = "0.3.1" base64-simd = "0.8.0" bincode = "2.0.1" bstr = { version 
= "1.12.0", default-features = false, features = ["alloc", "std"] } bitflags = "2.9.1" brush-parser = "0.3.0" blake3 = "1.8.2" chrono = { version = "0.4", features = ["serde"] } clap = "4.5.40" clap_complete = "4.6.0" commondir = "1.0.0" cow-utils = "0.1.3" criterion = { version = "0.7", features = ["html_reports"] } criterion2 = { version = "3.0.0", default-features = false } crossterm = { version = "0.29.0", features = ["event-stream"] } css-module-lexer = "0.0.15" dashmap = "6.1.0" derive_more = { version = "2.0.1", features = ["debug"] } directories = "6.0.0" dunce = "1.0.5" fast-glob = "1.0.0" flate2 = { version = "=1.1.9", features = ["zlib-rs"] } form_urlencoded = "1.2.1" fspy = { git = "https://github.com/voidzero-dev/vite-task.git", rev = "69cc6eba95a3b7f25f7d4d32c3f29b1386995907" } futures = "0.3.31" futures-util = "0.3.31" glob = "0.3.2" heck = "0.5.0" hex = "0.4.3" html5gum = "0.8.1" httpmock = "0.7" ignore = "0.4" indicatif = "0.18" indexmap = "2.9.0" indoc = "2.0.5" infer = "0.19.0" insta = "1.43.1" itertools = "0.14.0" itoa = "1.0.15" json-escape-simd = "3" json-strip-comments = "3" jsonschema = { version = "0.45.0", default-features = false } junction = "1.4.1" memchr = "2.7.4" mimalloc-safe = "0.1.52" mime = "0.3.17" napi = { version = "3.0.0", default-features = false, features = [ "async", "error_anyhow", "anyhow", "tracing", "object_indexmap", ] } napi-build = "2" napi-derive = { version = "3.0.0", default-features = false, features = [ "type-def", "strict", "tracing", ] } nix = { version = "0.30.1", features = ["dir"] } nodejs-built-in-modules = "1.0.0" nom = "8.0.0" num-bigint = "0.4.6" num-format = "0.4" num_cpus = "1.17" owo-colors = "4.2.2" parking_lot = "0.12.5" pathdiff = "0.2.3" pnp = "0.12.7" percent-encoding = "2.3.1" petgraph = "0.8.2" pretty_assertions = "1.4.1" phf = "0.13.0" rayon = "1.10.0" regex = "1.11.1" regress = "0.11.0" reqwest = { version = "0.12", default-features = false } rolldown-notify = "10.2.0" 
rolldown-notify-debouncer-full = "0.7.5" ropey = "1.6.1" rusqlite = { version = "0.37.0", features = ["bundled"] } rustc-hash = "2.1.1" schemars = "1.0.0" self_cell = "1.2.0" node-semver = "2.2.0" semver = "1.0.26" serde = { version = "1.0.219", features = ["derive"] } serde_json = "1.0.140" serde_yaml = "0.9.34" serde_yml = "0.0.12" serial_test = "3.2.0" sha1 = "0.10.6" sha2 = "0.10.9" simdutf8 = "0.1.5" smallvec = "1.15.0" string_cache = "0.9.0" sugar_path = { version = "2.0.1", features = ["cached_current_dir"] } tar = "0.4.43" tempfile = "3.14.0" terminal_size = "0.4.2" test-log = { version = "0.2.18", features = ["trace"] } testing_macros = "1.0.0" thiserror = "2" tokio = { version = "1.48.0", default-features = false } tracing = "0.1.41" tracing-chrome = "0.7.2" tracing-subscriber = { version = "0.3.19", default-features = false, features = [ "env-filter", "fmt", "json", "serde", "std", ] } ts-rs = "12.0" typedmap = "0.6.0" url = "2.5.4" urlencoding = "2.1.3" uuid = "1.17.0" vfs = "0.13.0" vite_command = { path = "crates/vite_command" } vite_error = { path = "crates/vite_error" } vite_js_runtime = { path = "crates/vite_js_runtime" } vite_glob = { git = "https://github.com/voidzero-dev/vite-task.git", rev = "69cc6eba95a3b7f25f7d4d32c3f29b1386995907" } vite_install = { path = "crates/vite_install" } vite_migration = { path = "crates/vite_migration" } vite_shared = { path = "crates/vite_shared" } vite_static_config = { path = "crates/vite_static_config" } vite_path = { git = "https://github.com/voidzero-dev/vite-task.git", rev = "69cc6eba95a3b7f25f7d4d32c3f29b1386995907" } vite_str = { git = "https://github.com/voidzero-dev/vite-task.git", rev = "69cc6eba95a3b7f25f7d4d32c3f29b1386995907" } vite_task = { git = "https://github.com/voidzero-dev/vite-task.git", rev = "69cc6eba95a3b7f25f7d4d32c3f29b1386995907" } vite_workspace = { git = "https://github.com/voidzero-dev/vite-task.git", rev = "69cc6eba95a3b7f25f7d4d32c3f29b1386995907" } walkdir = "2.5.0" wax = "0.6.0" 
which = "8.0.0" xxhash-rust = "0.8.15" zip = "7.2" # oxc crates with the same version oxc = { version = "0.121.0", features = [ "ast_visit", "transformer", "minifier", "mangler", "semantic", "codegen", "serialize", "isolated_declarations", "regular_expression", "cfg", ] } oxc_allocator = { version = "0.121.0", features = ["pool"] } oxc_ast = "0.121.0" oxc_ecmascript = "0.121.0" oxc_parser = "0.121.0" oxc_span = "0.121.0" oxc_napi = "0.121.0" oxc_minify_napi = "0.121.0" oxc_parser_napi = "0.121.0" oxc_transform_napi = "0.121.0" oxc_traverse = "0.121.0" # oxc crates in their own repos oxc_index = { version = "4", features = ["rayon", "serde"] } oxc_resolver = { version = "11.19.1", features = ["yarn_pnp"] } oxc_resolver_napi = { version = "11.19.1", default-features = false, features = ["yarn_pnp"] } oxc_sourcemap = "6" # rolldown crates rolldown = { path = "./rolldown/crates/rolldown" } rolldown_binding = { path = "./rolldown/crates/rolldown_binding" } rolldown_common = { path = "./rolldown/crates/rolldown_common" } rolldown_dev = { path = "./rolldown/crates/rolldown_dev" } rolldown_dev_common = { path = "./rolldown/crates/rolldown_dev_common" } rolldown_devtools = { path = "./rolldown/crates/rolldown_devtools" } rolldown_devtools_action = { path = "./rolldown/crates/rolldown_devtools_action" } rolldown_ecmascript = { path = "./rolldown/crates/rolldown_ecmascript" } rolldown_ecmascript_utils = { path = "./rolldown/crates/rolldown_ecmascript_utils" } rolldown_error = { path = "./rolldown/crates/rolldown_error" } rolldown_filter_analyzer = { path = "./rolldown/crates/rolldown_filter_analyzer" } rolldown_fs = { path = "./rolldown/crates/rolldown_fs" } rolldown_fs_watcher = { path = "./rolldown/crates/rolldown_fs_watcher" } rolldown_plugin = { path = "./rolldown/crates/rolldown_plugin" } rolldown_plugin_asset_module = { path = "./rolldown/crates/rolldown_plugin_asset_module" } rolldown_plugin_bundle_analyzer = { path = "./rolldown/crates/rolldown_plugin_bundle_analyzer" 
} rolldown_plugin_chunk_import_map = { path = "./rolldown/crates/rolldown_plugin_chunk_import_map" } rolldown_plugin_copy_module = { path = "./rolldown/crates/rolldown_plugin_copy_module" } rolldown_plugin_data_url = { path = "./rolldown/crates/rolldown_plugin_data_url" } rolldown_plugin_esm_external_require = { path = "./rolldown/crates/rolldown_plugin_esm_external_require" } rolldown_plugin_hmr = { path = "./rolldown/crates/rolldown_plugin_hmr" } rolldown_plugin_isolated_declaration = { path = "./rolldown/crates/rolldown_plugin_isolated_declaration" } rolldown_plugin_lazy_compilation = { path = "./rolldown/crates/rolldown_plugin_lazy_compilation" } rolldown_plugin_oxc_runtime = { path = "./rolldown/crates/rolldown_plugin_oxc_runtime" } rolldown_plugin_replace = { path = "./rolldown/crates/rolldown_plugin_replace" } rolldown_plugin_utils = { path = "./rolldown/crates/rolldown_plugin_utils" } rolldown_plugin_vite_alias = { path = "./rolldown/crates/rolldown_plugin_vite_alias" } rolldown_plugin_vite_asset = { path = "./rolldown/crates/rolldown_plugin_vite_asset" } rolldown_plugin_vite_asset_import_meta_url = { path = "./rolldown/crates/rolldown_plugin_vite_asset_import_meta_url" } rolldown_plugin_vite_build_import_analysis = { path = "./rolldown/crates/rolldown_plugin_vite_build_import_analysis" } rolldown_plugin_vite_css = { path = "./rolldown/crates/rolldown_plugin_vite_css" } rolldown_plugin_vite_css_post = { path = "./rolldown/crates/rolldown_plugin_vite_css_post" } rolldown_plugin_vite_dynamic_import_vars = { path = "./rolldown/crates/rolldown_plugin_vite_dynamic_import_vars" } rolldown_plugin_vite_html = { path = "./rolldown/crates/rolldown_plugin_vite_html" } rolldown_plugin_vite_html_inline_proxy = { path = "./rolldown/crates/rolldown_plugin_vite_html_inline_proxy" } rolldown_plugin_vite_import_glob = { path = "./rolldown/crates/rolldown_plugin_vite_import_glob" } rolldown_plugin_vite_json = { path = "./rolldown/crates/rolldown_plugin_vite_json" } 
rolldown_plugin_vite_load_fallback = { path = "./rolldown/crates/rolldown_plugin_vite_load_fallback" } rolldown_plugin_vite_manifest = { path = "./rolldown/crates/rolldown_plugin_vite_manifest" } rolldown_plugin_vite_module_preload_polyfill = { path = "./rolldown/crates/rolldown_plugin_vite_module_preload_polyfill" } rolldown_plugin_vite_react_refresh_wrapper = { path = "./rolldown/crates/rolldown_plugin_vite_react_refresh_wrapper" } rolldown_plugin_vite_reporter = { path = "./rolldown/crates/rolldown_plugin_vite_reporter" } rolldown_plugin_vite_resolve = { path = "./rolldown/crates/rolldown_plugin_vite_resolve" } rolldown_plugin_vite_transform = { path = "./rolldown/crates/rolldown_plugin_vite_transform" } rolldown_plugin_vite_wasm_fallback = { path = "./rolldown/crates/rolldown_plugin_vite_wasm_fallback" } rolldown_plugin_vite_web_worker_post = { path = "./rolldown/crates/rolldown_plugin_vite_web_worker_post" } rolldown_resolver = { path = "./rolldown/crates/rolldown_resolver" } rolldown_sourcemap = { path = "./rolldown/crates/rolldown_sourcemap" } rolldown_std_utils = { path = "./rolldown/crates/rolldown_std_utils" } rolldown_testing = { path = "./rolldown/crates/rolldown_testing" } rolldown_testing_config = { path = "./rolldown/crates/rolldown_testing_config" } rolldown_tracing = { path = "./rolldown/crates/rolldown_tracing" } rolldown_utils = { path = "./rolldown/crates/rolldown_utils" } rolldown_watcher = { path = "./rolldown/crates/rolldown_watcher" } rolldown_workspace = { path = "./rolldown/crates/rolldown_workspace" } string_wizard = { path = "./rolldown/crates/string_wizard", features = ["serde"] } # ============================================================================= # Local Development Patches # ============================================================================= # This section patches vite-task crates to use local paths for simultaneous # vite-task and vite-plus development. 
When making changes to vite-task that # affect vite-plus, this allows testing without publishing or pushing commits. # # To use: Ensure vite-task is cloned at ../vite-task relative to vite-plus. # Comment out this section before committing. # ============================================================================= # [patch."https://github.com/voidzero-dev/vite-task.git"] # fspy = { path = "../vite-task/crates/fspy" } # vite_glob = { path = "../vite-task/crates/vite_glob" } # vite_path = { path = "../vite-task/crates/vite_path" } # vite_str = { path = "../vite-task/crates/vite_str" } # vite_task = { path = "../vite-task/crates/vite_task" } # vite_workspace = { path = "../vite-task/crates/vite_workspace" } [profile.dev] # Disabling debug info speeds up local and CI builds, # and we don't rely on it for debugging that much. debug = false [profile.release] # Configurations explicitly listed here for clarity. # Using the best options for performance. opt-level = 3 lto = "fat" codegen-units = 1 strip = "symbols" # set to `false` for debug information debug = false # set to `true` for debug information panic = "abort" # Let it crash and force ourselves to write safe Rust. # The trampoline binary is copied per shim tool (~5-10 copies), so optimize for # size instead of speed. This reduces it from ~200KB to ~100KB on Windows. [profile.release.package.vite_trampoline] opt-level = "z" ================================================ FILE: LICENSE ================================================ MIT License Copyright (c) 2026-present, VoidZero Inc. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ================================================ FILE: README.md ================================================ Vite+ **The Unified Toolchain for the Web** _runtime and package management, create, dev, check, test, build, pack, and monorepo task caching in a single dependency_ --- Vite+ is the unified entry point for local web development. 
It combines [Vite](https://vite.dev/), [Vitest](https://vitest.dev/), [Oxlint](https://oxc.rs/docs/guide/usage/linter.html), [Oxfmt](https://oxc.rs/docs/guide/usage/formatter.html), [Rolldown](https://rolldown.rs/), [tsdown](https://tsdown.dev/), and [Vite Task](https://github.com/voidzero-dev/vite-task) into one zero-config toolchain that also manages runtime and package manager workflows: - **`vp env`:** Manage Node.js globally and per project - **`vp install`:** Install dependencies with automatic package manager detection - **`vp dev`:** Run Vite's fast native ESM dev server with instant HMR - **`vp check`:** Run formatting, linting, and type checks in one command - **`vp test`:** Run tests through bundled Vitest - **`vp build`:** Build applications for production with Vite + Rolldown - **`vp run`:** Execute monorepo tasks with caching and dependency-aware scheduling - **`vp pack`:** Build libraries for npm publishing or standalone app binaries - **`vp create` / `vp migrate`:** Scaffold new projects and migrate existing ones All of this is configured from your project root and works across Vite's framework ecosystem. Vite+ is fully open-source under the MIT license. ## Getting Started Install Vite+ globally as `vp`: For Linux or macOS: ```bash curl -fsSL https://vite.plus | bash ``` For Windows: ```bash irm https://viteplus.dev/install.ps1 | iex ``` `vp` handles the full development lifecycle, including package management, development servers, linting, formatting, testing, and building for production. ## Configuring Vite+ Vite+ can be configured using a single `vite.config.ts` at the root of your project: ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ // Standard Vite configuration for dev/build/preview. plugins: [], // Vitest configuration. test: { include: ['src/**/*.test.ts'], }, // Oxlint configuration. lint: { ignorePatterns: ['dist/**'], }, // Oxfmt configuration.
fmt: { semi: true, singleQuote: true, }, // Vite Task configuration. run: { tasks: { 'generate:icons': { command: 'node scripts/generate-icons.js', envs: ['ICON_THEME'], }, }, }, // `vp staged` configuration. staged: { '*': 'vp check --fix', }, }); ``` This lets you keep the configuration for your development server, build, test, lint, format, task runner, and staged-file workflow in one place with type-safe config and shared defaults. Use `vp migrate` to migrate to Vite+. It merges tool-specific config files such as `.oxlintrc*`, `.oxfmtrc*`, and lint-staged config into `vite.config.ts`. ### CLI Workflows (`vp help`) #### Start - **create** - Create a new project from a template - **migrate** - Migrate an existing project to Vite+ - **config** - Configure hooks and agent integration - **staged** - Run linters on staged files - **install** (`i`) - Install dependencies - **env** - Manage Node.js versions #### Develop - **dev** - Run the development server - **check** - Run format, lint, and type checks - **lint** - Lint code - **fmt** - Format code - **test** - Run tests #### Execute - **run** - Run monorepo tasks - **exec** - Execute a command from local `node_modules/.bin` - **dlx** - Execute a package binary without installing it as a dependency - **cache** - Manage the task cache #### Build - **build** - Build for production - **pack** - Build libraries - **preview** - Preview production build #### Manage Dependencies Vite+ automatically wraps your package manager (pnpm, npm, or Yarn) based on `packageManager` and lockfiles: - **add** - Add packages to dependencies - **remove** (`rm`, `un`, `uninstall`) - Remove packages from dependencies - **update** (`up`) - Update packages to latest versions - **dedupe** - Deduplicate dependencies - **outdated** - Check outdated packages - **list** (`ls`) - List installed packages - **why** (`explain`) - Show why a package is installed - **info** (`view`, `show`) - View package metadata from the registry - **link** (`ln`) / 
**unlink** - Manage local package links - **pm** - Forward a command to the package manager #### Maintain - **upgrade** - Update `vp` itself to the latest version - **implode** - Remove `vp` and all related data ### Scaffolding your first Vite+ project Use `vp create` to create a new project: ```bash vp create ``` You can run `vp create` inside of a project to add new apps or libraries to your project. ### Migrating an existing project You can migrate an existing project to Vite+: ```bash vp migrate ``` ### GitHub Actions Use the official [`setup-vp`](https://github.com/voidzero-dev/setup-vp) action to install Vite+ in GitHub Actions: ```yaml - uses: voidzero-dev/setup-vp@v1 with: node-version: '22' cache: true ``` #### Manual Installation & Migration If you are manually migrating a project to Vite+, install these dev dependencies first: ```bash npm install -D vite-plus @voidzero-dev/vite-plus-core@latest ``` You need to add overrides to your package manager for `vite` and `vitest` so that other packages depending on Vite and Vitest will use the Vite+ versions: ```json "overrides": { "vite": "npm:@voidzero-dev/vite-plus-core@latest", "vitest": "npm:@voidzero-dev/vite-plus-test@latest" } ``` If you are using `pnpm`, add this to your `pnpm-workspace.yaml`: ```yaml overrides: vite: npm:@voidzero-dev/vite-plus-core@latest vitest: npm:@voidzero-dev/vite-plus-test@latest ``` Or, if you are using Yarn: ```json "resolutions": { "vite": "npm:@voidzero-dev/vite-plus-core@latest", "vitest": "npm:@voidzero-dev/vite-plus-test@latest" } ``` ## Sponsors Thanks to [namespace.so](https://namespace.so) for powering our CI/CD pipelines with fast, free macOS and Linux runners. 
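For monorepo task caching, this repository's bench fixtures also show a standalone `vite-plus.json` task config: a root file declares shared task options, and individual packages can layer their own settings on top via `extends`. A per-package sketch, mirroring the options used in `bench/generate-monorepo.ts` (`cache`, `parallel`, `env`; the relative path is illustrative):

```json
{
  "extends": "../../vite-plus.json",
  "tasks": {
    "build": {
      "cache": true,
      "env": { "NODE_ENV": "production" }
    }
  }
}
```

With this layering, `vp run build` can cache each package's build output while per-package overrides (such as extra environment inputs) stay local to the package.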
================================================ FILE: bench/.gitignore ================================================ fixtures/monorepo ================================================ FILE: bench/Cargo.toml ================================================ [package] name = "vite-plus-benches" version = "0.1.0" edition = "2024" [dev-dependencies] anyhow = { workspace = true } async-trait = { workspace = true } criterion = { workspace = true } rustc-hash = { workspace = true } tokio = { workspace = true, features = ["rt"] } vite_path = { workspace = true } vite_str = { workspace = true } vite_task = { workspace = true } [[bench]] name = "workspace_load" harness = false ================================================ FILE: bench/benches/workspace_load.rs ================================================ use std::{ffi::OsStr, hint::black_box, path::PathBuf, sync::Arc}; use criterion::{BenchmarkId, Criterion, criterion_group, criterion_main}; use rustc_hash::FxHashMap; use tokio::runtime::Runtime; use vite_path::{AbsolutePath, AbsolutePathBuf}; use vite_str::Str; use vite_task::{ CommandHandler, HandledCommand, Session, SessionConfig, plan_request::ScriptCommand, }; /// A no-op command handler for benchmarking purposes. #[derive(Debug, Default)] struct NoOpCommandHandler; #[async_trait::async_trait(?Send)] impl CommandHandler for NoOpCommandHandler { async fn handle_command( &mut self, _command: &mut ScriptCommand, ) -> anyhow::Result { Ok(HandledCommand::Verbatim) } } /// A no-op user config loader for benchmarking. #[derive(Debug, Default)] struct NoOpUserConfigLoader; #[async_trait::async_trait(?Send)] impl vite_task::loader::UserConfigLoader for NoOpUserConfigLoader { async fn load_user_config_file( &self, _package_path: &AbsolutePath, ) -> anyhow::Result> { Ok(None) } } /// Owned session callbacks for benchmarking. 
#[derive(Default)] struct BenchSessionConfig { command_handler: NoOpCommandHandler, user_config_loader: NoOpUserConfigLoader, } impl BenchSessionConfig { fn as_callbacks(&mut self) -> SessionConfig<'_> { SessionConfig { command_handler: &mut self.command_handler, user_config_loader: &mut self.user_config_loader, program_name: Str::from("vp"), } } } fn bench_workspace_load(c: &mut Criterion) { let fixture_path = AbsolutePathBuf::new(PathBuf::from(env!("CARGO_MANIFEST_DIR"))) .unwrap() .join("fixtures") .join("monorepo"); let runtime = Runtime::new().unwrap(); // Session::ensure_task_graph_loaded benchmark let mut session_group = c.benchmark_group("session_task_graph_load"); session_group.measurement_time(std::time::Duration::from_secs(10)); session_group.bench_function("ensure_task_graph_loaded", |b| { b.iter(|| { runtime.block_on(async { let mut owned_callbacks = BenchSessionConfig::default(); let envs: FxHashMap, Arc> = FxHashMap::default(); let mut session = Session::init_with( envs, fixture_path.clone().into(), owned_callbacks.as_callbacks(), ) .expect("Failed to create session"); black_box( session.ensure_task_graph_loaded().await.expect("Failed to load task graph"), ); }); }); }); session_group.bench_with_input(BenchmarkId::new("packages", 100), &fixture_path, |b, path| { b.iter(|| { runtime.block_on(async { let mut owned_callbacks = BenchSessionConfig::default(); let envs: FxHashMap, Arc> = FxHashMap::default(); let mut session = Session::init_with(envs, path.clone().into(), owned_callbacks.as_callbacks()) .expect("Failed to create session"); black_box( session.ensure_task_graph_loaded().await.expect("Failed to load task graph"), ); }); }); }); session_group.finish(); } criterion_group!(benches, bench_workspace_load); criterion_main!(benches); ================================================ FILE: bench/fixtures/monorepo/package.json ================================================ { "name": "monorepo-benchmark", "version": "1.0.0", "private": true, 
"workspaces": [ "packages/*" ], "scripts": { "build:all": "vite-plus run build", "test:all": "vite-plus run test", "lint:all": "vite-plus run lint" }, "devDependencies": { "vite-plus": "*" } } ================================================ FILE: bench/fixtures/monorepo/pnpm-workspace.yaml ================================================ packages: - 'packages/*' ================================================ FILE: bench/fixtures/monorepo/vite-plus.json ================================================ { "tasks": { "build": { "cache": true, "parallel": true }, "test": { "cache": true, "parallel": true }, "lint": { "cache": false, "parallel": true } } } ================================================ FILE: bench/generate-monorepo.ts ================================================ import * as fs from 'node:fs'; import * as path from 'node:path'; import { fileURLToPath } from 'node:url'; interface Package { name: string; dependencies: string[]; scripts: Record<string, string>; hasVitePlusConfig: boolean; } const __dirname = path.join(fileURLToPath(import.meta.url), '..'); class MonorepoGenerator { private packages: Map<string, Package> = new Map(); private readonly PACKAGE_COUNT = 1000; private readonly MAX_DEPS_PER_PACKAGE = 8; private readonly MIN_DEPS_PER_PACKAGE = 2; private readonly SCRIPT_NAMES = ['build', 'test', 'lint', 'dev', 'start', 'prepare', 'compile']; private readonly CATEGORIES = ['core', 'util', 'feature', 'service', 'app']; constructor(private rootDir: string) {} private getRandomInt(min: number, max: number): number { return Math.floor(Math.random() * (max - min + 1)) + min; } private getRandomElement<T>(arr: T[]): T { return arr[Math.floor(Math.random() * arr.length)]; } private generatePackageName(index: number): string { const category = this.getRandomElement(this.CATEGORIES); const paddedIndex = index.toString().padStart(2, '0'); return `${category}-${paddedIndex}`; } private generateScriptCommand(scriptName: string, packageName: string): string { const commands = [ `echo
"Running ${scriptName} for ${packageName}"`, `node scripts/${scriptName}.js`, `tsc --build`, `webpack build`, `rollup -c`, `esbuild src/index.js --bundle`, `npm run pre${scriptName}`, `node tasks/${scriptName}`, ]; // Generate command with 0-3 && concatenations const numCommands = this.getRandomInt(1, 4); const selectedCommands: string[] = []; for (let i = 0; i < numCommands; i++) { selectedCommands.push(this.getRandomElement(commands)); } return selectedCommands.join(' && '); } private generateScripts(packageName: string): Record<string, string> { const scripts: Record<string, string> = {}; // Each package has 2-3 scripts const numScripts = this.getRandomInt(2, 3); const selectedScripts = new Set<string>(); while (selectedScripts.size < numScripts) { selectedScripts.add(this.getRandomElement(this.SCRIPT_NAMES)); } for (const scriptName of selectedScripts) { scripts[scriptName] = this.generateScriptCommand(scriptName, packageName); } return scripts; } private selectDependencies(currentIndex: number, availablePackages: string[]): string[] { const numDeps = this.getRandomInt(this.MIN_DEPS_PER_PACKAGE, this.MAX_DEPS_PER_PACKAGE); const dependencies = new Set<string>(); // Create a complex graph by selecting dependencies from different layers // Prefer packages with lower indices (creates deeper dependency chains) const eligiblePackages = availablePackages.filter((pkg) => { const pkgIndex = parseInt(pkg.split('-')[1]); return pkgIndex < currentIndex; }); if (eligiblePackages.length === 0) { return []; } while (dependencies.size < numDeps && dependencies.size < eligiblePackages.length) { const dep = this.getRandomElement(eligiblePackages); dependencies.add(dep); } // Add some cross-category dependencies for complexity if (Math.random() > 0.3) { const crossCategoryDeps = availablePackages.filter((pkg) => { const category = pkg.split('-')[0]; return category !== currentIndex.toString().split('-')[0]; }); if (crossCategoryDeps.length > 0) { dependencies.add(this.getRandomElement(crossCategoryDeps)); } } return
Array.from(dependencies); } private generatePackages(): void { // First, create all package names const allPackageNames: string[] = []; for (let i = 0; i < this.PACKAGE_COUNT; i++) { allPackageNames.push(this.generatePackageName(i)); } // Generate packages with dependencies for (let i = 0; i < this.PACKAGE_COUNT; i++) { const packageName = allPackageNames[i]; const scripts = this.generateScripts(packageName); // 70% chance to have vite-plus.json config const hasVitePlusConfig = Math.random() > 0.3; // Select dependencies from packages created before this one const dependencies = i === 0 ? [] : this.selectDependencies(i, allPackageNames.slice(0, i)); this.packages.set(packageName, { name: packageName, dependencies, scripts, hasVitePlusConfig, }); } // Ensure complex transitive dependencies for script resolution testing this.addTransitiveScriptDependencies(); } private addTransitiveScriptDependencies(): void { // Create specific patterns for testing transitive script dependencies const packagesArray = Array.from(this.packages.entries()); for (let i = 0; i < 50; i++) { const [nameA, pkgA] = this.getRandomElement(packagesArray); const [nameB, pkgB] = this.getRandomElement(packagesArray); const [nameC, pkgC] = this.getRandomElement(packagesArray); if (nameA !== nameB && nameB !== nameC && nameA !== nameC) { // Setup: A depends on B, B depends on C if (!pkgA.dependencies.includes(nameB)) { pkgA.dependencies.push(nameB); } if (!pkgB.dependencies.includes(nameC)) { pkgB.dependencies.push(nameC); } // Create the scenario: A has build, B doesn't, C has build const scriptName = this.getRandomElement(this.SCRIPT_NAMES); pkgA.scripts[scriptName] = this.generateScriptCommand(scriptName, nameA); delete pkgB.scripts[scriptName]; // B doesn't have the script pkgC.scripts[scriptName] = this.generateScriptCommand(scriptName, nameC); } } } private writePackage(pkg: Package): void { const packageDir = path.join(this.rootDir, 'packages', pkg.name); // Create directory structure 
fs.mkdirSync(packageDir, { recursive: true }); fs.mkdirSync(path.join(packageDir, 'src'), { recursive: true }); // Write package.json const packageJson = { name: `@monorepo/${pkg.name}`, version: '1.0.0', main: 'src/index.js', scripts: pkg.scripts, dependencies: pkg.dependencies.reduce( (deps, dep) => { deps[`@monorepo/${dep}`] = 'workspace:*'; return deps; }, {} as Record<string, string>, ), }; fs.writeFileSync(path.join(packageDir, 'package.json'), JSON.stringify(packageJson, null, 2)); // Write source file const indexContent = `// ${pkg.name} module export function ${pkg.name.replace('-', '_')}() { console.log('Executing ${pkg.name}'); ${pkg.dependencies.map((dep) => ` require('@monorepo/${dep}');`).join('\n')} } module.exports = { ${pkg.name.replace('-', '_')} }; `; fs.writeFileSync(path.join(packageDir, 'src', 'index.js'), indexContent); // Write vite-plus.json if needed if (pkg.hasVitePlusConfig) { const vitePlusConfig = { extends: '../../vite-plus.json', tasks: { build: { cache: true, env: { NODE_ENV: 'production', }, }, }, }; fs.writeFileSync( path.join(packageDir, 'vite-plus.json'), JSON.stringify(vitePlusConfig, null, 2), ); } } public generate(): void { console.log('Generating monorepo structure…'); // Clean and create root directory if (fs.existsSync(this.rootDir)) { fs.rmSync(this.rootDir, { recursive: true, force: true }); } fs.mkdirSync(this.rootDir, { recursive: true }); fs.mkdirSync(path.join(this.rootDir, 'packages'), { recursive: true }); // Generate packages this.generatePackages(); // Write all packages let count = 0; for (const [_, pkg] of this.packages) { this.writePackage(pkg); count++; if (count % 100 === 0) { console.log(`Generated ${count} packages…`); } } // Write root package.json const rootPackageJson = { name: 'monorepo-benchmark', version: '1.0.0', private: true, workspaces: ['packages/*'], scripts: { 'build:all': 'vp run build', 'test:all': 'vp run test', 'lint:all': 'vp run lint', }, devDependencies: { 'vite-plus': '*', }, }; fs.writeFileSync(
path.join(this.rootDir, 'package.json'), JSON.stringify(rootPackageJson, null, 2), ); // Write pnpm-workspace.yaml for pnpm support const pnpmWorkspace = `packages: - 'packages/*' `; fs.writeFileSync(path.join(this.rootDir, 'pnpm-workspace.yaml'), pnpmWorkspace); // Write root vite-plus.json const rootVitePlusConfig = { tasks: { build: { cache: true, parallel: true, }, test: { cache: true, parallel: true, }, lint: { cache: false, parallel: true, }, }, }; fs.writeFileSync( path.join(this.rootDir, 'vite-plus.json'), JSON.stringify(rootVitePlusConfig, null, 2), ); console.log(`Successfully generated monorepo with ${this.PACKAGE_COUNT} packages!`); console.log(`Location: ${this.rootDir}`); // Print some statistics this.printStatistics(); } private printStatistics(): void { let totalDeps = 0; let maxDeps = 0; let packagesWithVitePlus = 0; const scriptCounts = new Map(); for (const [_, pkg] of this.packages) { totalDeps += pkg.dependencies.length; maxDeps = Math.max(maxDeps, pkg.dependencies.length); if (pkg.hasVitePlusConfig) { packagesWithVitePlus++; } for (const script of Object.keys(pkg.scripts)) { scriptCounts.set(script, (scriptCounts.get(script) || 0) + 1); } } console.log('\nStatistics:'); console.log(`- Total packages: ${this.packages.size}`); console.log( `- Average dependencies per package: ${(totalDeps / this.packages.size).toFixed(2)}`, ); console.log(`- Max dependencies in a package: ${maxDeps}`); console.log(`- Packages with vite-plus.json: ${packagesWithVitePlus}`); console.log('- Script distribution:'); for (const [script, count] of scriptCounts) { console.log(` - ${script}: ${count} packages`); } } } // Main execution const outputDir = path.join(__dirname, 'fixtures', 'monorepo'); const generator = new MonorepoGenerator(outputDir); generator.generate(); ================================================ FILE: bench/package.json ================================================ { "type": "module" } ================================================ FILE: 
bench/tsconfig.json ================================================ { "extends": "../tsconfig.json", "compilerOptions": { "outDir": "./dist", "noEmit": true, "erasableSyntaxOnly": false }, "include": ["**/*.ts"] } ================================================ FILE: crates/vite_command/Cargo.toml ================================================ [package] name = "vite_command" version = "0.0.0" authors.workspace = true edition.workspace = true license.workspace = true rust-version.workspace = true [dependencies] fspy = { workspace = true } tokio = { workspace = true } tracing = { workspace = true } vite_error = { workspace = true } vite_path = { workspace = true } which = { workspace = true, features = ["tracing"] } [target.'cfg(not(target_os = "windows"))'.dependencies] nix = { workspace = true } [dev-dependencies] tempfile = { workspace = true } tokio = { workspace = true, features = ["macros", "test-util"] } [lints] workspace = true [lib] doctest = false ================================================ FILE: crates/vite_command/src/lib.rs ================================================ use std::{ collections::HashMap, ffi::OsStr, process::{ExitStatus, Stdio}, }; use fspy::AccessMode; use tokio::process::Command; use vite_error::Error; use vite_path::{AbsolutePath, AbsolutePathBuf, RelativePathBuf}; /// Result of running a command with fspy tracking. #[derive(Debug)] pub struct FspyCommandResult { /// The termination status of the command. pub status: ExitStatus, /// The path accesses of the command. pub path_accesses: HashMap<RelativePathBuf, AccessMode>, } /// Resolve a binary name to a full path using the `which` crate. /// Handles PATHEXT (`.cmd`/`.bat`) resolution natively on Windows. /// /// If `path_env` is `None`, searches the process's current `PATH`.
pub fn resolve_bin( bin_name: &str, path_env: Option<&OsStr>, cwd: impl AsRef<AbsolutePath>, ) -> Result<AbsolutePathBuf, Error> { let current_path; let path_env = match path_env { Some(p) => p, None => { current_path = std::env::var_os("PATH").unwrap_or_default(); &current_path } }; let path = which::which_in(bin_name, Some(path_env), cwd.as_ref()) .map_err(|_| Error::CannotFindBinaryPath(bin_name.into()))?; AbsolutePathBuf::new(path).ok_or_else(|| Error::CannotFindBinaryPath(bin_name.into())) } /// Build a `tokio::process::Command` for a pre-resolved binary path. /// Sets inherited stdio and `fix_stdio_streams` (Unix pre_exec). /// Callers can further customize (add args, envs, override stdio, etc.). pub fn build_command(bin_path: &AbsolutePath, cwd: &AbsolutePath) -> Command { let mut cmd = Command::new(bin_path.as_path()); cmd.current_dir(cwd).stdin(Stdio::inherit()).stdout(Stdio::inherit()).stderr(Stdio::inherit()); #[cfg(unix)] unsafe { cmd.pre_exec(|| { fix_stdio_streams(); Ok(()) }); } cmd } /// Build a `tokio::process::Command` for shell execution. /// Uses `/bin/sh -c` on Unix, `cmd.exe /C` on Windows. pub fn build_shell_command(shell_cmd: &str, cwd: &AbsolutePath) -> Command { #[cfg(unix)] let mut cmd = { let mut cmd = Command::new("/bin/sh"); cmd.arg("-c").arg(shell_cmd); cmd }; #[cfg(windows)] let mut cmd = { let mut cmd = Command::new("cmd.exe"); cmd.arg("/C").arg(shell_cmd); cmd }; cmd.current_dir(cwd).stdin(Stdio::inherit()).stdout(Stdio::inherit()).stderr(Stdio::inherit()); #[cfg(unix)] unsafe { cmd.pre_exec(|| { fix_stdio_streams(); Ok(()) }); } cmd } /// Run a command with the given bin name, arguments, environment variables, and current working directory. /// /// # Arguments /// /// * `bin_name`: The name of the binary to run. /// * `args`: The arguments to pass to the binary. /// * `envs`: The custom environment variables to set for the command, will be merged with the system environment variables. /// * `cwd`: The current working directory for the command.
/// /// # Returns /// /// Returns the exit status of the command. pub async fn run_command<I, S>( bin_name: &str, args: I, envs: &HashMap<String, String>, cwd: impl AsRef<AbsolutePath>, ) -> Result<ExitStatus, Error> where I: IntoIterator<Item = S>, S: AsRef<OsStr>, { let cwd = cwd.as_ref(); let paths = envs.get("PATH"); let bin_path = resolve_bin(bin_name, paths.map(|p| OsStr::new(p.as_str())), cwd)?; let mut cmd = build_command(&bin_path, cwd); cmd.args(args).envs(envs); let status = cmd.status().await?; Ok(status) } /// Run a command with fspy tracking. /// /// # Arguments /// /// * `bin_name`: The name of the binary to run. /// * `args`: The arguments to pass to the binary. /// * `envs`: The custom environment variables to set for the command. /// * `cwd`: The current working directory for the command. /// /// # Returns /// /// Returns a FspyCommandResult containing the exit status and path accesses. pub async fn run_command_with_fspy<I, S>( bin_name: &str, args: I, envs: &HashMap<String, String>, cwd: impl AsRef<AbsolutePath>, ) -> Result<FspyCommandResult, Error> where I: IntoIterator<Item = S>, S: AsRef<OsStr>, { let cwd = cwd.as_ref(); let mut cmd = fspy::Command::new(bin_name); cmd.args(args) // set system environment variables first .envs(std::env::vars_os()) // then set custom environment variables .envs(envs) .current_dir(cwd) .stdin(Stdio::inherit()) .stdout(Stdio::inherit()) .stderr(Stdio::inherit()); // fix stdio streams on unix #[cfg(unix)] unsafe { cmd.pre_exec(|| { fix_stdio_streams(); Ok(()) }); } let child = cmd.spawn().await.map_err(|e| Error::Anyhow(e.into()))?; let termination = child.wait_handle.await?; let mut path_accesses = HashMap::<RelativePathBuf, AccessMode>::new(); for access in termination.path_accesses.iter() { tracing::debug!("Path access: {:?}", access); let relative_path = access .path .strip_path_prefix(cwd, |strip_result| { let Ok(stripped_path) = strip_result else { return None; }; if stripped_path.as_os_str().is_empty() { return None; } tracing::debug!("stripped_path: {:?}", stripped_path); Some(RelativePathBuf::new(stripped_path).map_err(|err| { Error::InvalidRelativePath { path:
stripped_path.into(), reason: err } })) }) .transpose()?; let Some(relative_path) = relative_path else { continue; }; path_accesses .entry(relative_path) .and_modify(|mode| *mode |= access.mode) .or_insert(access.mode); } Ok(FspyCommandResult { status: termination.status, path_accesses }) } #[cfg(unix)] pub fn fix_stdio_streams() { // libuv may mark stdin/stdout/stderr as close-on-exec, which interferes with Rust's subprocess spawning. // As a workaround, we clear the FD_CLOEXEC flag on these file descriptors to prevent them from being closed when spawning child processes. // // For details see https://github.com/libuv/libuv/issues/2062 // Fixed by reference from https://github.com/electron/electron/pull/15555 use std::os::fd::BorrowedFd; use nix::{ fcntl::{FcntlArg, FdFlag, fcntl}, libc::{STDERR_FILENO, STDIN_FILENO, STDOUT_FILENO}, }; // Safe function to clear FD_CLOEXEC flag fn clear_cloexec(fd: BorrowedFd<'_>) { // Borrow RawFd as BorrowedFd to satisfy AsFd constraint if let Ok(flags) = fcntl(fd, FcntlArg::F_GETFD) { let mut fd_flags = FdFlag::from_bits_retain(flags); if fd_flags.contains(FdFlag::FD_CLOEXEC) { fd_flags.remove(FdFlag::FD_CLOEXEC); // Ignore errors: some fd may be closed let _ = fcntl(fd, FcntlArg::F_SETFD(fd_flags)); } } } // Clear FD_CLOEXEC on stdin, stdout, stderr clear_cloexec(unsafe { BorrowedFd::borrow_raw(STDIN_FILENO) }); clear_cloexec(unsafe { BorrowedFd::borrow_raw(STDOUT_FILENO) }); clear_cloexec(unsafe { BorrowedFd::borrow_raw(STDERR_FILENO) }); } #[cfg(test)] mod tests { use tempfile::{TempDir, tempdir}; use vite_path::AbsolutePathBuf; use super::*; fn create_temp_dir() -> TempDir { tempdir().expect("Failed to create temp directory") } mod run_command_tests { use super::*; #[tokio::test] async fn test_run_command_and_find_binary_path() { let temp_dir = create_temp_dir(); let temp_dir_path = AbsolutePathBuf::new(temp_dir.path().canonicalize().unwrap().to_path_buf()) .unwrap(); let envs = HashMap::from([( "PATH".to_string(), 
std::env::var_os("PATH").unwrap_or_default().into_string().unwrap(), )]); let result = run_command("npm", &["--version"], &envs, &temp_dir_path).await; assert!(result.is_ok(), "Should run command successfully, but got error: {:?}", result); } #[tokio::test] async fn test_run_command_and_not_find_binary_path() { let temp_dir = create_temp_dir(); let temp_dir_path = AbsolutePathBuf::new(temp_dir.path().canonicalize().unwrap().to_path_buf()) .unwrap(); let envs = HashMap::from([( "PATH".to_string(), std::env::var_os("PATH").unwrap_or_default().into_string().unwrap(), )]); let result = run_command("npm-not-exists", &["--version"], &envs, &temp_dir_path).await; assert!(result.is_err(), "Should not find binary path, but got: {:?}", result); assert_eq!( result.unwrap_err().to_string(), "Cannot find binary path for command 'npm-not-exists'" ); } } mod run_command_with_fspy_tests { use super::*; #[tokio::test] async fn test_run_command_with_fspy() { let temp_dir = create_temp_dir(); let temp_dir_path = AbsolutePathBuf::new(temp_dir.path().canonicalize().unwrap().to_path_buf()) .unwrap(); let envs = HashMap::from([( "PATH".to_string(), std::env::var_os("PATH").unwrap_or_default().into_string().unwrap(), )]); let result = run_command_with_fspy("node", &["-p", "process.cwd()"], &envs, &temp_dir_path) .await; assert!(result.is_ok(), "Should run command successfully, but got error: {:?}", result); let cmd_result = result.unwrap(); assert!(cmd_result.status.success()); } #[tokio::test] async fn test_run_command_with_fspy_and_capture_path_accesses_write_file() { let temp_dir = create_temp_dir(); let temp_dir_path = AbsolutePathBuf::new(temp_dir.path().canonicalize().unwrap().to_path_buf()) .unwrap(); let envs = HashMap::from([( "PATH".to_string(), std::env::var_os("PATH").unwrap_or_default().into_string().unwrap(), )]); let result = run_command_with_fspy( "node", &["-p", "fs.writeFileSync(path.join(process.cwd(), 'package.json'), '{}');'done'"], &envs, &temp_dir_path, ) .await; 
assert!(result.is_ok(), "Should run command successfully, but got error: {:?}", result); let cmd_result = result.unwrap(); assert!(cmd_result.status.success()); eprintln!("cmd_result: {:?}", cmd_result); // Verify package.json is in path accesses with WRITE mode. // Note: We don't assert exact count of path accesses because `node` may be a shim // from tool version managers (volta, mise, fnm, etc.) that read additional config // files (e.g., .tool-versions, .mise.toml, .nvmrc) to determine which Node version // to use. let path_access = cmd_result .path_accesses .get(&RelativePathBuf::new("package.json").unwrap()) .expect("package.json should be in path accesses"); assert!(path_access.contains(AccessMode::WRITE)); // Note: We don't assert !READ because writeFileSync may trigger reads // depending on Node.js internals and OS filesystem behavior } #[tokio::test] async fn test_run_command_with_fspy_and_capture_path_accesses_write_and_read_file() { let temp_dir = create_temp_dir(); let temp_dir_path = AbsolutePathBuf::new(temp_dir.path().canonicalize().unwrap().to_path_buf()) .unwrap(); let envs = HashMap::from([( "PATH".to_string(), std::env::var_os("PATH").unwrap_or_default().into_string().unwrap(), )]); let result = run_command_with_fspy( "node", &["-p", "fs.writeFileSync(path.join(process.cwd(), 'package.json'), '{}'); fs.readFileSync(path.join(process.cwd(), 'package.json'), 'utf8'); 'done'"], &envs, &temp_dir_path, ) .await; assert!(result.is_ok(), "Should run command successfully, but got error: {:?}", result); let cmd_result = result.unwrap(); assert!(cmd_result.status.success()); eprintln!("cmd_result: {:?}", cmd_result); // Verify package.json is in path accesses with WRITE and READ modes. // Note: We don't assert exact count of path accesses because `node` may be a shim // from tool version managers (volta, mise, fnm, etc.) that read additional config // files (e.g., .tool-versions, .mise.toml, .nvmrc) to determine which Node version // to use. 
let path_access = cmd_result .path_accesses .get(&RelativePathBuf::new("package.json").unwrap()) .expect("package.json should be in path accesses"); assert!(path_access.contains(AccessMode::WRITE)); assert!(path_access.contains(AccessMode::READ)); } #[tokio::test] async fn test_run_command_with_fspy_and_not_find_binary_path() { let temp_dir = create_temp_dir(); let temp_dir_path = AbsolutePathBuf::new(temp_dir.path().canonicalize().unwrap().to_path_buf()) .unwrap(); let envs = HashMap::from([( "PATH".to_string(), std::env::var_os("PATH").unwrap_or_default().into_string().unwrap(), )]); let result = run_command_with_fspy("npm-not-exists", &["--version"], &envs, &temp_dir_path) .await; assert!(result.is_err(), "Should not find binary path, but got: {:?}", result); assert!( result .err() .unwrap() .to_string() .contains("could not resolve the full path of program '\"npm-not-exists\"'") ); } } } ================================================ FILE: crates/vite_error/Cargo.toml ================================================ [package] name = "vite_error" version = "0.0.0" authors.workspace = true edition.workspace = true license.workspace = true publish = false rust-version.workspace = true [dependencies] anyhow = { workspace = true } ast-grep-config = { workspace = true } bincode = { workspace = true } bstr = { workspace = true } ignore = { workspace = true } nix = { workspace = true } rusqlite = { workspace = true } semver = { workspace = true } serde_json = { workspace = true } serde_yml = { workspace = true } thiserror = { workspace = true } tokio = { workspace = true } vite_path = { workspace = true } vite_str = { workspace = true } vite_workspace = { workspace = true } wax = { workspace = true } [target.'cfg(target_os = "windows")'.dependencies] reqwest = { workspace = true, features = ["stream", "native-tls-vendored", "json"] } [target.'cfg(not(target_os = "windows"))'.dependencies] reqwest = { workspace = true, features = ["stream", "rustls-tls", "json"] } 
[lib]
test = false
doctest = false


================================================
FILE: crates/vite_error/src/lib.rs
================================================
use std::{ffi::OsString, path::Path, sync::Arc};

use thiserror::Error;
use vite_path::{AbsolutePath, AbsolutePathBuf, relative::FromPathError};
use vite_str::Str;

#[derive(Error, Debug)]
pub enum Error {
    #[error(transparent)]
    Sqlite(#[from] rusqlite::Error),
    #[error(transparent)]
    BincodeEncode(#[from] bincode::error::EncodeError),
    #[error(transparent)]
    BincodeDecode(#[from] bincode::error::DecodeError),
    #[error("Unrecognized db version: {0}")]
    UnrecognizedDbVersion(u32),
    #[error(transparent)]
    Io(#[from] std::io::Error),
    #[error("IO error: {err} at {path:?}")]
    IoWithPath { err: std::io::Error, path: Arc<AbsolutePath> },
    #[error(transparent)]
    JoinPathsError(#[from] std::env::JoinPathsError),
    #[cfg(unix)]
    #[error(transparent)]
    Nix(#[from] nix::Error),
    #[error(transparent)]
    Serde(#[from] serde_json::Error),
    #[error("Env value is not valid unicode: {key} = {value:?}")]
    EnvValueIsNotValidUnicode { key: Str, value: OsString },
    #[cfg(unix)]
    #[error("Unsupported file type: {0:?}")]
    UnsupportedFileType(nix::dir::Type),
    #[cfg(windows)]
    #[error("Unsupported file type: {0:?}")]
    UnsupportedFileType(std::fs::FileType),
    #[error(transparent)]
    Utf8Error(#[from] bstr::Utf8Error),
    #[error(transparent)]
    WaxBuild(#[from] wax::BuildError),
    #[error(transparent)]
    WaxWalk(#[from] wax::WalkError),
    #[error(transparent)]
    IgnoreError(#[from] ignore::Error),
    #[error(transparent)]
    SerdeYml(#[from] serde_yml::Error),
    #[error(transparent)]
    WorkspaceError(#[from] vite_workspace::Error),
    #[error("Lint failed, reason: {reason}")]
    LintFailed { status: Str, reason: Str },
    #[error("Fmt failed")]
    FmtFailed { status: Str, reason: Str },
    #[error("Vite failed")]
    Vite { status: Str, reason: Str },
    #[error("Test failed")]
    TestFailed { status: Str, reason: Str },
    #[error("Lib failed")]
    LibFailed { status: Str, reason: Str },
    #[error("Doc failed, reason:
{reason}")]
    DocFailed { status: Str, reason: Str },
    #[error("Resolve universal vite config failed")]
    ResolveUniversalViteConfigFailed { status: Str, reason: Str },
    #[error("The path ({path:?}) is not a valid relative path because: {reason}")]
    InvalidRelativePath { path: Box<Path>, reason: FromPathError },
    #[error("Unsupported package manager: {0}")]
    UnsupportedPackageManager(Str),
    #[error("Unable to recognize any package manager, please specify one explicitly")]
    UnrecognizedPackageManager,
    #[error(
        "Package manager {name}@{version} in {package_json_path:?} is invalid, expected format: 'package-manager-name@major.minor.patch'"
    )]
    PackageManagerVersionInvalid { name: Str, version: Str, package_json_path: AbsolutePathBuf },
    #[error("Package manager {name}@{version} not found on {url}")]
    PackageManagerVersionNotFound { name: Str, version: Str, url: Str },
    #[error(transparent)]
    Semver(#[from] semver::Error),
    #[error(transparent)]
    Reqwest(#[from] reqwest::Error),
    #[error(transparent)]
    JoinError(#[from] tokio::task::JoinError),
    #[error("User cancelled by Ctrl+C")]
    UserCancelled,
    #[error("Hash mismatch: expected {expected}, got {actual}")]
    HashMismatch { expected: Str, actual: Str },
    #[error("Invalid hash format: {0}")]
    InvalidHashFormat(Str),
    #[error("Unsupported hash algorithm: {0}")]
    UnsupportedHashAlgorithm(Str),
    #[error("Cannot find binary path for command '{0}'")]
    CannotFindBinaryPath(Str),
    #[error("Invalid argument: {0}")]
    InvalidArgument(Str),
    #[error(transparent)]
    AstGrepConfigError(#[from] ast_grep_config::RuleConfigError),
    #[error(transparent)]
    Anyhow(#[from] anyhow::Error),
}


================================================
FILE: crates/vite_global_cli/Cargo.toml
================================================
[package]
name = "vite_global_cli"
version = "0.0.0"
authors.workspace = true
edition.workspace = true
license.workspace = true
publish = false
rust-version.workspace = true

[[bin]]
name = "vp"
path = "src/main.rs"

[dependencies]
base64-simd = { workspace = true
} chrono = { workspace = true } clap = { workspace = true, features = ["derive"] } clap_complete = { workspace = true } directories = { workspace = true } flate2 = { workspace = true } serde = { workspace = true } serde_json = { workspace = true } node-semver = { workspace = true } sha2 = { workspace = true } tar = { workspace = true } thiserror = { workspace = true } tokio = { workspace = true, features = ["full"] } tracing = { workspace = true } owo-colors = { workspace = true } oxc_resolver = { workspace = true } crossterm = { workspace = true } vite_error = { workspace = true } vite_install = { workspace = true } vite_js_runtime = { workspace = true } vite_path = { workspace = true } vite_command = { workspace = true } vite_shared = { workspace = true } vite_str = { workspace = true } vite_workspace = { workspace = true } [target.'cfg(windows)'.dependencies] junction = { workspace = true } [dev-dependencies] serial_test = { workspace = true } tempfile = { workspace = true } [lints] workspace = true ================================================ FILE: crates/vite_global_cli/src/cli.rs ================================================ //! CLI argument parsing and command routing. //! //! This module defines the CLI structure using clap and routes commands //! to their appropriate handlers. 
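//!
//! A minimal, hypothetical sketch of how these definitions are typically
//! consumed (the actual entry point lives in `src/main.rs` and may differ;
//! `Args::parse` comes from clap's derive `Parser` trait):
//!
//! ```ignore
//! let args = Args::parse();
//! match args.command {
//!     // Each variant carries its parsed flags, e.g. `Install { prod, dev, .. }`.
//!     Some(command) => { /* route `command` to its handler */ }
//!     // No subcommand given: fall back to help (or version when `-V` was set).
//!     None => { /* print help */ }
//! }
//! ```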
use std::process::ExitStatus;

use clap::{CommandFactory, FromArgMatches, Parser, Subcommand};
use vite_install::commands::{
    add::SaveDependencyType, install::InstallCommandOptions, outdated::Format,
};
use vite_path::AbsolutePathBuf;

use crate::{
    commands::{
        self, AddCommand, DedupeCommand, DlxCommand, InstallCommand, LinkCommand,
        OutdatedCommand, RemoveCommand, UnlinkCommand, UpdateCommand, WhyCommand,
    },
    error::Error,
    help,
};

#[derive(Clone, Copy, Debug)]
pub struct RenderOptions {
    pub show_header: bool,
}

impl Default for RenderOptions {
    fn default() -> Self {
        Self { show_header: true }
    }
}

/// Vite+ Global CLI
#[derive(Parser, Debug)]
#[clap(
    name = "vp",
    bin_name = "vp",
    author,
    about = "Vite+ - A next-generation build tool",
    long_about = None
)]
#[command(disable_help_subcommand = true, disable_version_flag = true)]
pub struct Args {
    /// Print version
    #[arg(short = 'V', long = "version")]
    pub version: bool,

    #[clap(subcommand)]
    pub command: Option<Commands>,
}

/// Available commands
#[derive(Subcommand, Debug)]
pub enum Commands {
    // =========================================================================
    // Category A: Package Manager Commands
    // =========================================================================
    /// Install all dependencies, or add packages if package names are provided
    #[command(visible_alias = "i")]
    Install {
        /// Do not install devDependencies
        #[arg(short = 'P', long)]
        prod: bool,
        /// Only install devDependencies (install) / Save to devDependencies (add)
        #[arg(short = 'D', long)]
        dev: bool,
        /// Do not install optionalDependencies
        #[arg(long)]
        no_optional: bool,
        /// Fail if lockfile needs to be updated (CI mode)
        #[arg(long, overrides_with = "no_frozen_lockfile")]
        frozen_lockfile: bool,
        /// Allow lockfile updates (opposite of --frozen-lockfile)
        #[arg(long, overrides_with = "frozen_lockfile")]
        no_frozen_lockfile: bool,
        /// Only update lockfile, don't install
        #[arg(long)]
        lockfile_only: bool,
        /// Use cached packages when available
        #[arg(long)]
prefer_offline: bool, /// Only use packages already in cache #[arg(long)] offline: bool, /// Force reinstall all dependencies #[arg(short = 'f', long)] force: bool, /// Do not run lifecycle scripts #[arg(long)] ignore_scripts: bool, /// Don't read or generate lockfile #[arg(long)] no_lockfile: bool, /// Fix broken lockfile entries (pnpm and yarn@2+ only) #[arg(long)] fix_lockfile: bool, /// Create flat `node_modules` (pnpm only) #[arg(long)] shamefully_hoist: bool, /// Re-run resolution for peer dependency analysis (pnpm only) #[arg(long)] resolution_only: bool, /// Suppress output (silent mode) #[arg(long)] silent: bool, /// Filter packages in monorepo (can be used multiple times) #[arg(long, value_name = "PATTERN")] filter: Option>, /// Install in workspace root only #[arg(short = 'w', long)] workspace_root: bool, /// Save exact version (only when adding packages) #[arg(short = 'E', long)] save_exact: bool, /// Save to peerDependencies (only when adding packages) #[arg(long)] save_peer: bool, /// Save to optionalDependencies (only when adding packages) #[arg(short = 'O', long)] save_optional: bool, /// Save the new dependency to the default catalog (only when adding packages) #[arg(long)] save_catalog: bool, /// Install globally (only when adding packages) #[arg(short = 'g', long)] global: bool, /// Node.js version to use for global installation (only with -g) #[arg(long, requires = "global")] node: Option, /// Packages to add (if provided, acts as `vp add`) #[arg(required = false)] packages: Option>, /// Additional arguments to pass through to the package manager #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Add packages to dependencies Add { /// Save to `dependencies` (default) #[arg(short = 'P', long)] save_prod: bool, /// Save to `devDependencies` #[arg(short = 'D', long)] save_dev: bool, /// Save to `peerDependencies` and `devDependencies` #[arg(long)] save_peer: bool, /// Save to `optionalDependencies` #[arg(short = 'O', 
long)] save_optional: bool, /// Save exact version rather than semver range #[arg(short = 'E', long)] save_exact: bool, /// Save the new dependency to the specified catalog name #[arg(long, value_name = "CATALOG_NAME")] save_catalog_name: Option, /// Save the new dependency to the default catalog #[arg(long)] save_catalog: bool, /// A list of package names allowed to run postinstall #[arg(long, value_name = "NAMES")] allow_build: Option, /// Filter packages in monorepo (can be used multiple times) #[arg(long, value_name = "PATTERN")] filter: Option>, /// Add to workspace root #[arg(short = 'w', long)] workspace_root: bool, /// Only add if package exists in workspace (pnpm-specific) #[arg(long)] workspace: bool, /// Install globally #[arg(short = 'g', long)] global: bool, /// Node.js version to use for global installation (only with -g) #[arg(long, requires = "global")] node: Option, /// Packages to add #[arg(required = true)] packages: Vec, /// Additional arguments to pass through to the package manager #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Remove packages from dependencies #[command(visible_alias = "rm", visible_alias = "un", visible_alias = "uninstall")] Remove { /// Only remove from `devDependencies` (pnpm-specific) #[arg(short = 'D', long)] save_dev: bool, /// Only remove from `optionalDependencies` (pnpm-specific) #[arg(short = 'O', long)] save_optional: bool, /// Only remove from `dependencies` (pnpm-specific) #[arg(short = 'P', long)] save_prod: bool, /// Filter packages in monorepo (can be used multiple times) #[arg(long, value_name = "PATTERN")] filter: Option>, /// Remove from workspace root #[arg(short = 'w', long)] workspace_root: bool, /// Remove recursively from all workspace packages #[arg(short = 'r', long)] recursive: bool, /// Remove global packages #[arg(short = 'g', long)] global: bool, /// Preview what would be removed without actually removing (only with -g) #[arg(long, requires = "global")] 
dry_run: bool, /// Packages to remove #[arg(required = true)] packages: Vec, /// Additional arguments to pass through to the package manager #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Update packages to their latest versions #[command(visible_alias = "up")] Update { /// Update to latest version (ignore semver range) #[arg(short = 'L', long)] latest: bool, /// Update global packages #[arg(short = 'g', long)] global: bool, /// Update recursively in all workspace packages #[arg(short = 'r', long)] recursive: bool, /// Filter packages in monorepo (can be used multiple times) #[arg(long, value_name = "PATTERN")] filter: Option>, /// Include workspace root #[arg(short = 'w', long)] workspace_root: bool, /// Update only devDependencies #[arg(short = 'D', long)] dev: bool, /// Update only dependencies (production) #[arg(short = 'P', long)] prod: bool, /// Interactive mode #[arg(short = 'i', long)] interactive: bool, /// Don't update optionalDependencies #[arg(long)] no_optional: bool, /// Update lockfile only, don't modify package.json #[arg(long)] no_save: bool, /// Only update if package exists in workspace (pnpm-specific) #[arg(long)] workspace: bool, /// Packages to update (optional - updates all if omitted) packages: Vec, /// Additional arguments to pass through to the package manager #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Deduplicate dependencies Dedupe { /// Check if deduplication would make changes #[arg(long)] check: bool, /// Additional arguments to pass through to the package manager #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Check for outdated packages Outdated { /// Package name(s) to check packages: Vec, /// Show extended information #[arg(long)] long: bool, /// Output format: table (default), list, or json #[arg(long, value_name = "FORMAT", value_parser = clap::value_parser!(Format))] format: Option, /// Check recursively across all 
workspaces #[arg(short = 'r', long)] recursive: bool, /// Filter packages in monorepo #[arg(long, value_name = "PATTERN")] filter: Option>, /// Include workspace root #[arg(short = 'w', long)] workspace_root: bool, /// Only production and optional dependencies #[arg(short = 'P', long)] prod: bool, /// Only dev dependencies #[arg(short = 'D', long)] dev: bool, /// Exclude optional dependencies #[arg(long)] no_optional: bool, /// Only show compatible versions #[arg(long)] compatible: bool, /// Sort results by field #[arg(long, value_name = "FIELD")] sort_by: Option, /// Check globally installed packages #[arg(short = 'g', long)] global: bool, /// Additional arguments to pass through to the package manager #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Show why a package is installed #[command(visible_alias = "explain")] Why { /// Package(s) to check #[arg(required = true)] packages: Vec, /// Output in JSON format #[arg(long)] json: bool, /// Show extended information #[arg(long)] long: bool, /// Show parseable output #[arg(long)] parseable: bool, /// Check recursively across all workspaces #[arg(short = 'r', long)] recursive: bool, /// Filter packages in monorepo #[arg(long, value_name = "PATTERN")] filter: Option>, /// Check in workspace root #[arg(short = 'w', long)] workspace_root: bool, /// Only production dependencies #[arg(short = 'P', long)] prod: bool, /// Only dev dependencies #[arg(short = 'D', long)] dev: bool, /// Limit tree depth #[arg(long)] depth: Option, /// Exclude optional dependencies #[arg(long)] no_optional: bool, /// Check globally installed packages #[arg(short = 'g', long)] global: bool, /// Exclude peer dependencies #[arg(long)] exclude_peers: bool, /// Use a finder function defined in .pnpmfile.cjs #[arg(long, value_name = "FINDER_NAME")] find_by: Option, /// Additional arguments to pass through to the package manager #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// View 
package information from the registry #[command(visible_alias = "view", visible_alias = "show")] Info { /// Package name with optional version #[arg(required = true)] package: String, /// Specific field to view field: Option, /// Output in JSON format #[arg(long)] json: bool, /// Additional arguments to pass through to the package manager #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Link packages for local development #[command(visible_alias = "ln")] Link { /// Package name or directory to link #[arg(value_name = "PACKAGE|DIR")] package: Option, /// Arguments to pass to package manager #[arg(allow_hyphen_values = true, trailing_var_arg = true)] args: Vec, }, /// Unlink packages Unlink { /// Package name to unlink #[arg(value_name = "PACKAGE|DIR")] package: Option, /// Unlink in every workspace package #[arg(short = 'r', long)] recursive: bool, /// Arguments to pass to package manager #[arg(allow_hyphen_values = true, trailing_var_arg = true)] args: Vec, }, /// Execute a package binary without installing it Dlx { /// Package(s) to install before running #[arg(long, short = 'p', value_name = "NAME")] package: Vec, /// Execute within a shell environment #[arg(long = "shell-mode", short = 'c')] shell_mode: bool, /// Suppress all output except the executed command's output #[arg(long, short = 's')] silent: bool, /// Package to execute and arguments #[arg(required = true, trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Forward a command to the package manager #[command(subcommand)] Pm(PmCommands), // ========================================================================= // Category B: JS Script Commands // These commands are implemented in JavaScript and executed via managed Node.js // ========================================================================= /// Create a new project from a template (delegates to JS) #[command(disable_help_flag = true)] Create { #[arg(trailing_var_arg = true, 
allow_hyphen_values = true)] args: Vec, }, /// Migrate an existing project to Vite+ (delegates to JS) #[command(disable_help_flag = true)] Migrate { #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// In-repo configuration (hooks, agent integration) #[command(disable_help_flag = true)] Config { #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Run vite-staged on Git staged files #[command(disable_help_flag = true, name = "staged")] Staged { #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, // ========================================================================= // Category C: Local CLI Delegation (stubs for now) // ========================================================================= /// Run the development server #[command(disable_help_flag = true)] Dev { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Build application #[command(disable_help_flag = true)] Build { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Run tests #[command(disable_help_flag = true)] Test { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Lint code #[command(disable_help_flag = true)] Lint { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Format code #[command(disable_help_flag = true)] Fmt { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Run format, lint, and type checks #[command(disable_help_flag = true)] Check { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Build library #[command(disable_help_flag = true)] Pack { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Run tasks #[command(disable_help_flag = true)] Run { /// 
Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Execute a command from local node_modules/.bin #[command(disable_help_flag = true)] Exec { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Preview production build #[command(disable_help_flag = true)] Preview { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Manage the task cache #[command(disable_help_flag = true)] Cache { /// Additional arguments #[arg(trailing_var_arg = true, allow_hyphen_values = true)] args: Vec, }, /// Manage Node.js versions Env(EnvArgs), // ========================================================================= // Self-Management // ========================================================================= /// Update vp itself to the latest version #[command(name = "upgrade")] Upgrade { /// Target version (e.g., "0.2.0"). Defaults to latest. version: Option, /// npm dist-tag to install (default: "latest", also: "alpha") #[arg(long, default_value = "latest")] tag: String, /// Check for updates without installing #[arg(long)] check: bool, /// Revert to the previously active version #[arg(long)] rollback: bool, /// Force reinstall even if already on the target version #[arg(long)] force: bool, /// Suppress output #[arg(long)] silent: bool, /// Custom npm registry URL #[arg(long)] registry: Option, }, /// Remove vp and all related data Implode { /// Skip confirmation prompt #[arg(long, short = 'y')] yes: bool, }, } /// Arguments for the `env` command #[derive(clap::Args, Debug)] #[command(after_help = "\ Examples: Setup: vp env setup # Create shims for node, npm, npx vp env on # Use vite-plus managed Node.js vp env print # Print shell snippet for this session Manage: vp env pin lts # Pin to latest LTS version vp env install # Install version from .node-version / package.json vp env use 20 # Use Node.js 20 for this shell session vp env use 
 --unset    # Remove session override

Inspect:
  vp env current                  # Show current resolved environment
  vp env current --json           # JSON output for automation
  vp env doctor                   # Check environment configuration
  vp env which node               # Show which node binary will be used
  vp env list-remote --lts        # List only LTS versions

Execute:
  vp env exec --node lts npm i    # Execute 'npm i' with latest LTS
  vp env exec node -v             # Shim mode (version auto-resolved)

Related Commands:
  vp install -g <package>         # Install a package globally
  vp uninstall -g <package>       # Uninstall a package globally
  vp update -g [package]          # Update global packages
  vp list -g [package]            # List global packages")]
pub struct EnvArgs {
    /// Subcommand (e.g., 'default', 'setup', 'doctor', 'which')
    #[command(subcommand)]
    pub command: Option<EnvSubcommands>,
}

/// Subcommands for the `env` command
#[derive(clap::Subcommand, Debug)]
pub enum EnvSubcommands {
    /// Show current environment information
    Current {
        /// Output in JSON format
        #[arg(long)]
        json: bool,
    },
    /// Print shell snippet to set environment for current session
    Print,
    /// Set or show the global default Node.js version
    Default {
        /// Version to set as default (e.g., "20.18.0", "lts", "latest")
        /// If not provided, shows the current default
        version: Option<String>,
    },
    /// Enable managed mode - shims always use vite-plus managed Node.js
    On,
    /// Enable system-first mode - shims prefer system Node.js, fallback to managed
    Off,
    /// Create or update shims in VITE_PLUS_HOME/bin
    Setup {
        /// Force refresh shims even if they exist
        #[arg(long)]
        refresh: bool,
        /// Only create env files (skip shims and instructions)
        #[arg(long)]
        env_only: bool,
    },
    /// Run diagnostics and show environment status
    Doctor,
    /// Show path to the tool that would be executed
    Which {
        /// Tool name (node, npm, or npx)
        tool: String,
    },
    /// Pin a Node.js version in the current directory (creates .node-version)
    Pin {
        /// Version to pin (e.g., "20.18.0", "lts", "latest", "^20.0.0")
        /// If not provided, shows the current pinned version
        version: Option<String>,
        /// Remove the
.node-version file from current directory #[arg(long)] unpin: bool, /// Skip pre-downloading the pinned version #[arg(long)] no_install: bool, /// Overwrite existing .node-version without confirmation #[arg(long)] force: bool, }, /// Remove the .node-version file from current directory (alias for `pin --unpin`) Unpin, /// List locally installed Node.js versions #[command(visible_alias = "ls")] List { /// Output as JSON #[arg(long)] json: bool, }, /// List available Node.js versions from the registry #[command(name = "list-remote", visible_alias = "ls-remote")] ListRemote { /// Filter versions by pattern (e.g., "20" for 20.x versions) pattern: Option, /// Show only LTS versions #[arg(long)] lts: bool, /// Show all versions (not just recent) #[arg(long)] all: bool, /// Output as JSON #[arg(long)] json: bool, /// Version sorting order #[arg(long, value_enum, default_value_t = SortingMethod::Asc)] sort: SortingMethod, }, /// Execute a command with a specific Node.js version #[command(visible_alias = "run")] Exec { /// Node.js version to use (e.g., "20.18.0", "lts", "^20.0.0") /// If not provided and command is node/npm/npx or a global package binary, /// version is resolved automatically (same as shim behavior) #[arg(long)] node: Option, /// npm version to use (optional, defaults to bundled) #[arg(long)] npm: Option, /// Command and arguments to run #[arg(trailing_var_arg = true, allow_hyphen_values = true)] command: Vec, }, /// Uninstall a Node.js version #[command(visible_alias = "uni")] Uninstall { /// Version to uninstall (e.g., "20.18.0") #[arg(required = true)] version: String, }, /// Install a Node.js version #[command(visible_alias = "i")] Install { /// Version to install (e.g., "20", "20.18.0", "lts", "latest") /// If not provided, installs the version from .node-version or package.json version: Option, }, /// Use a specific Node.js version for this shell session Use { /// Version to use (e.g., "20", "20.18.0", "lts", "latest") /// If not provided, reads from 
.node-version or package.json version: Option, /// Remove session override (revert to file-based resolution) #[arg(long)] unset: bool, /// Skip auto-installation if version not present #[arg(long)] no_install: bool, /// Suppress output if version is already active #[arg(long)] silent_if_unchanged: bool, }, } /// Version sorting order for list-remote command #[derive(clap::ValueEnum, Clone, Debug, Default)] pub enum SortingMethod { /// Sort versions in ascending order (earliest to latest) #[default] Asc, /// Sort versions in descending order (latest to earliest) Desc, } /// Package manager subcommands #[derive(Subcommand, Debug, Clone)] pub enum PmCommands { /// Remove unnecessary packages Prune { /// Remove devDependencies #[arg(long)] prod: bool, /// Remove optional dependencies #[arg(long)] no_optional: bool, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Create a tarball of the package Pack { /// Pack all workspace packages #[arg(short = 'r', long)] recursive: bool, /// Filter packages to pack #[arg(long, value_name = "PATTERN")] filter: Option>, /// Output path for the tarball #[arg(long)] out: Option, /// Directory where the tarball will be saved #[arg(long)] pack_destination: Option, /// Gzip compression level (0-9) #[arg(long)] pack_gzip_level: Option, /// Output in JSON format #[arg(long)] json: bool, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// List installed packages #[command(visible_alias = "ls")] List { /// Package pattern to filter pattern: Option, /// Maximum depth of dependency tree #[arg(long)] depth: Option, /// Output in JSON format #[arg(long)] json: bool, /// Show extended information #[arg(long)] long: bool, /// Parseable output format #[arg(long)] parseable: bool, /// Only production dependencies #[arg(short = 'P', long)] prod: bool, /// Only dev dependencies #[arg(short = 'D', long)] dev: bool, /// Exclude optional 
dependencies #[arg(long)] no_optional: bool, /// Exclude peer dependencies #[arg(long)] exclude_peers: bool, /// Show only project packages #[arg(long)] only_projects: bool, /// Use a finder function #[arg(long, value_name = "FINDER_NAME")] find_by: Option, /// List across all workspaces #[arg(short = 'r', long)] recursive: bool, /// Filter packages in monorepo #[arg(long, value_name = "PATTERN")] filter: Vec, /// List global packages #[arg(short = 'g', long)] global: bool, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// View package information from the registry #[command(visible_alias = "info", visible_alias = "show")] View { /// Package name with optional version #[arg(required = true)] package: String, /// Specific field to view field: Option, /// Output in JSON format #[arg(long)] json: bool, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Publish package to registry Publish { /// Tarball or folder to publish #[arg(value_name = "TARBALL|FOLDER")] target: Option, /// Preview without publishing #[arg(long)] dry_run: bool, /// Publish tag #[arg(long)] tag: Option, /// Access level (public/restricted) #[arg(long)] access: Option, /// One-time password for authentication #[arg(long, value_name = "OTP")] otp: Option, /// Skip git checks #[arg(long)] no_git_checks: bool, /// Set the branch name to publish from #[arg(long, value_name = "BRANCH")] publish_branch: Option, /// Save publish summary #[arg(long)] report_summary: bool, /// Force publish #[arg(long)] force: bool, /// Output in JSON format #[arg(long)] json: bool, /// Publish all workspace packages #[arg(short = 'r', long)] recursive: bool, /// Filter packages in monorepo #[arg(long, value_name = "PATTERN")] filter: Option>, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Manage package owners #[command(subcommand, visible_alias = 
"author")] Owner(OwnerCommands), /// Manage package cache Cache { /// Subcommand: dir, path, clean #[arg(required = true)] subcommand: String, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Manage package manager configuration #[command(subcommand, visible_alias = "c")] Config(ConfigCommands), /// Log in to a registry #[command(visible_alias = "adduser")] Login { /// Registry URL #[arg(long, value_name = "URL")] registry: Option, /// Scope for the login #[arg(long, value_name = "SCOPE")] scope: Option, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Log out from a registry Logout { /// Registry URL #[arg(long, value_name = "URL")] registry: Option, /// Scope for the logout #[arg(long, value_name = "SCOPE")] scope: Option, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Show the current logged-in user Whoami { /// Registry URL #[arg(long, value_name = "URL")] registry: Option, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Manage authentication tokens #[command(subcommand)] Token(TokenCommands), /// Run a security audit Audit { /// Automatically fix vulnerabilities #[arg(long)] fix: bool, /// Output in JSON format #[arg(long)] json: bool, /// Minimum vulnerability level to report #[arg(long, value_name = "LEVEL")] level: Option, /// Only audit production dependencies #[arg(long)] production: bool, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option>, }, /// Manage distribution tags #[command(name = "dist-tag", subcommand)] DistTag(DistTagCommands), /// Deprecate a package version Deprecate { /// Package name with version (e.g., "my-pkg@1.0.0") package: String, /// Deprecation message message: String, /// One-time password for authentication #[arg(long, value_name = "OTP")] otp: 
Option<String>, /// Registry URL #[arg(long, value_name = "URL")] registry: Option<String>, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option<Vec<String>>, }, /// Search for packages in the registry Search { /// Search terms #[arg(required = true, num_args = 1..)] terms: Vec<String>, /// Output in JSON format #[arg(long)] json: bool, /// Show extended information #[arg(long)] long: bool, /// Registry URL #[arg(long, value_name = "URL")] registry: Option<String>, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option<Vec<String>>, }, /// Rebuild native modules #[command(visible_alias = "rb")] Rebuild { /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option<Vec<String>>, }, /// Show funding information for installed packages Fund { /// Output in JSON format #[arg(long)] json: bool, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option<Vec<String>>, }, /// Ping the registry Ping { /// Registry URL #[arg(long, value_name = "URL")] registry: Option<String>, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option<Vec<String>>, }, } /// Configuration subcommands #[derive(Subcommand, Debug, Clone)] pub enum ConfigCommands { /// List all configuration List { /// Output in JSON format #[arg(long)] json: bool, /// Use global config #[arg(short = 'g', long)] global: bool, /// Config location: project (default) or global #[arg(long, value_name = "LOCATION")] location: Option<String>, }, /// Get configuration value Get { /// Config key key: String, /// Output in JSON format #[arg(long)] json: bool, /// Use global config #[arg(short = 'g', long)] global: bool, /// Config location #[arg(long, value_name = "LOCATION")] location: Option<String>, }, /// Set configuration value Set { /// Config key key: String, /// Config value value: String, /// Output in JSON format #[arg(long)] json: bool, /// Use global config #[arg(short = 'g', long)] global: bool, /// Config location
#[arg(long, value_name = "LOCATION")] location: Option<String>, }, /// Delete configuration key Delete { /// Config key key: String, /// Use global config #[arg(short = 'g', long)] global: bool, /// Config location #[arg(long, value_name = "LOCATION")] location: Option<String>, }, } /// Owner subcommands #[derive(Subcommand, Debug, Clone)] pub enum OwnerCommands { /// List package owners #[command(visible_alias = "ls")] List { /// Package name package: String, /// One-time password for authentication #[arg(long, value_name = "OTP")] otp: Option<String>, }, /// Add package owner Add { /// Username user: String, /// Package name package: String, /// One-time password for authentication #[arg(long, value_name = "OTP")] otp: Option<String>, }, /// Remove package owner Rm { /// Username user: String, /// Package name package: String, /// One-time password for authentication #[arg(long, value_name = "OTP")] otp: Option<String>, }, } /// Token subcommands #[derive(Subcommand, Debug, Clone)] pub enum TokenCommands { /// List all known tokens #[command(visible_alias = "ls")] List { /// Output in JSON format #[arg(long)] json: bool, /// Registry URL #[arg(long, value_name = "URL")] registry: Option<String>, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option<Vec<String>>, }, /// Create a new authentication token Create { /// Output in JSON format #[arg(long)] json: bool, /// Registry URL #[arg(long, value_name = "URL")] registry: Option<String>, /// CIDR ranges to restrict the token to #[arg(long, value_name = "CIDR")] cidr: Option<Vec<String>>, /// Create a read-only token #[arg(long)] readonly: bool, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option<Vec<String>>, }, /// Revoke an authentication token Revoke { /// Token or token ID to revoke token: String, /// Registry URL #[arg(long, value_name = "URL")] registry: Option<String>, /// Additional arguments #[arg(last = true, allow_hyphen_values = true)] pass_through_args: Option<Vec<String>>, }, } /// Distribution tag subcommands
#[derive(Subcommand, Debug, Clone)] pub enum DistTagCommands { /// List distribution tags for a package #[command(visible_alias = "ls")] List { /// Package name package: Option<String>, }, /// Add a distribution tag Add { /// Package name with version (e.g., "my-pkg@1.0.0") package_at_version: String, /// Tag name tag: String, }, /// Remove a distribution tag Rm { /// Package name package: String, /// Tag name tag: String, }, } /// Determine the save dependency type from CLI flags. fn determine_save_dependency_type( save_dev: bool, save_peer: bool, save_optional: bool, save_prod: bool, ) -> Option<SaveDependencyType> { if save_dev { Some(SaveDependencyType::Dev) } else if save_peer { Some(SaveDependencyType::Peer) } else if save_optional { Some(SaveDependencyType::Optional) } else if save_prod { Some(SaveDependencyType::Production) } else { None } } fn has_flag_before_terminator(args: &[String], flag: &str) -> bool { for arg in args { if arg == "--" { break; } if arg == flag || arg.starts_with(&format!("{flag}=")) { return true; } } false } fn should_force_global_delegate(command: &str, args: &[String]) -> bool { match command { "lint" => has_flag_before_terminator(args, "--init"), "fmt" => { has_flag_before_terminator(args, "--init") || has_flag_before_terminator(args, "--migrate") } _ => false, } } /// Run the CLI command. pub async fn run_command(cwd: AbsolutePathBuf, args: Args) -> Result<ExitStatus> { run_command_with_options(cwd, args, RenderOptions::default()).await } /// Run the CLI command with rendering options.
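Editor's aside: the terminator-aware flag scan in `has_flag_before_terminator` above is what lets `vp lint --init` run via the global delegate while leaving anything after `--` to the delegated tool, and it also matches the `--flag=value` form. A standalone sketch of the same rule, reimplemented here for illustration (not the crate's API):

```rust
// Sketch: detect a flag only before the `--` option terminator,
// accepting both the bare `--flag` and the `--flag=value` forms.
fn flag_before_terminator(args: &[&str], flag: &str) -> bool {
    for arg in args {
        if *arg == "--" {
            break; // everything after `--` is pass-through for the delegated tool
        }
        if *arg == flag || arg.starts_with(&format!("{flag}=")) {
            return true;
        }
    }
    false
}

fn main() {
    // Flag before the terminator is seen, in both forms.
    assert!(flag_before_terminator(&["--init", "src/main.rs"], "--init"));
    assert!(flag_before_terminator(&["--migrate=prettier"], "--migrate"));
    // After `--` the flag is ignored by the scan.
    assert!(!flag_before_terminator(&["src/main.rs", "--", "--init"], "--init"));
    println!("ok");
}
```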
pub async fn run_command_with_options( cwd: AbsolutePathBuf, args: Args, render_options: RenderOptions, ) -> Result<ExitStatus> { // Handle --version flag (Category B: delegates to JS) if args.version { return commands::version::execute(cwd).await; } // If no command provided, show help and exit let Some(command) = args.command else { // Use custom help formatting to match the JS CLI output if render_options.show_header { command_with_help().print_help().ok(); } else { command_with_help_with_options(render_options).print_help().ok(); } println!(); // Return a successful exit status since help was requested implicitly return Ok(std::process::ExitStatus::default()); }; match command { // Category A: Package Manager Commands Commands::Install { prod, dev, no_optional, frozen_lockfile, no_frozen_lockfile, lockfile_only, prefer_offline, offline, force, ignore_scripts, no_lockfile, fix_lockfile, shamefully_hoist, resolution_only, silent, filter, workspace_root, save_exact, save_peer, save_optional, save_catalog, global, node, packages, pass_through_args, } => { print_runtime_header(render_options.show_header && !silent); // If packages are provided, redirect to Add command if let Some(pkgs) = packages && !pkgs.is_empty() { // Handle global install via vite-plus managed global install if global { use crate::commands::env::global_install; for package in &pkgs { if let Err(e) = global_install::install(package, node.as_deref(), force).await { eprintln!("Failed to install {}: {}", package, e); return Ok(exit_status(1)); } } return Ok(ExitStatus::default()); } let save_dependency_type = determine_save_dependency_type(dev, save_peer, save_optional, prod); return AddCommand::new(cwd) .execute( &pkgs, save_dependency_type, save_exact, if save_catalog { Some("default") } else { None }, filter.as_deref(), workspace_root, false, // workspace_only global, None, // allow_build pass_through_args.as_deref(), ) .await; } // No packages provided, run regular install let options =
InstallCommandOptions { prod, dev, no_optional, frozen_lockfile, no_frozen_lockfile, lockfile_only, prefer_offline, offline, force, ignore_scripts, no_lockfile, fix_lockfile, shamefully_hoist, resolution_only, silent, filters: filter.as_deref(), workspace_root, pass_through_args: pass_through_args.as_deref(), }; InstallCommand::new(cwd).execute(&options).await } Commands::Add { save_prod, save_dev, save_peer, save_optional, save_exact, save_catalog_name, save_catalog, allow_build, filter, workspace_root, workspace, global, node, packages, pass_through_args, } => { // Handle global install via vite-plus managed global install if global { use crate::commands::env::global_install; for package in &packages { if let Err(e) = global_install::install(package, node.as_deref(), false).await { eprintln!("Failed to install {}: {}", package, e); return Ok(exit_status(1)); } } return Ok(ExitStatus::default()); } let save_dependency_type = determine_save_dependency_type(save_dev, save_peer, save_optional, save_prod); let catalog_name = if save_catalog { Some("default") } else { save_catalog_name.as_deref() }; AddCommand::new(cwd) .execute( &packages, save_dependency_type, save_exact, catalog_name, filter.as_deref(), workspace_root, workspace, global, allow_build.as_deref(), pass_through_args.as_deref(), ) .await } Commands::Remove { save_dev, save_optional, save_prod, filter, workspace_root, recursive, global, dry_run, packages, pass_through_args, } => { // Handle global uninstall via vite-plus managed global install if global { use crate::commands::env::global_install; for package in &packages { if let Err(e) = global_install::uninstall(package, dry_run).await { eprintln!("Failed to uninstall {}: {}", package, e); return Ok(exit_status(1)); } } return Ok(ExitStatus::default()); } RemoveCommand::new(cwd) .execute( &packages, save_dev, save_optional, save_prod, filter.as_deref(), workspace_root, recursive, global, pass_through_args.as_deref(), ) .await } Commands::Update { 
latest, global, recursive, filter, workspace_root, dev, prod, interactive, no_optional, no_save, workspace, packages, pass_through_args, } => { // Handle global update via vite-plus managed global install if global { use crate::commands::env::{global_install, package_metadata::PackageMetadata}; let packages_to_update = if packages.is_empty() { let all = PackageMetadata::list_all().await?; if all.is_empty() { println!("No global packages installed."); return Ok(ExitStatus::default()); } all.iter().map(|p| p.name.clone()).collect::<Vec<_>>() } else { packages.clone() }; for package in &packages_to_update { if let Err(e) = global_install::install(package, None, false).await { eprintln!("Failed to update {}: {}", package, e); return Ok(exit_status(1)); } } return Ok(ExitStatus::default()); } UpdateCommand::new(cwd) .execute( &packages, latest, global, recursive, filter.as_deref(), workspace_root, dev, prod, interactive, no_optional, no_save, workspace, pass_through_args.as_deref(), ) .await } Commands::Dedupe { check, pass_through_args } => { DedupeCommand::new(cwd).execute(check, pass_through_args.as_deref()).await } Commands::Outdated { packages, long, format, recursive, filter, workspace_root, prod, dev, no_optional, compatible, sort_by, global, pass_through_args, } => { OutdatedCommand::new(cwd) .execute( &packages, long, format, recursive, filter.as_deref(), workspace_root, prod, dev, no_optional, compatible, sort_by.as_deref(), global, pass_through_args.as_deref(), ) .await } Commands::Why { packages, json, long, parseable, recursive, filter, workspace_root, prod, dev, depth, no_optional, global, exclude_peers, find_by, pass_through_args, } => { WhyCommand::new(cwd) .execute( &packages, json, long, parseable, recursive, filter.as_deref(), workspace_root, prod, dev, depth, no_optional, global, exclude_peers, find_by.as_deref(), pass_through_args.as_deref(), ) .await } Commands::Info { package, field, json, pass_through_args } => { commands::pm::execute_info( cwd,
&package, field.as_deref(), json, pass_through_args.as_deref(), ) .await } Commands::Link { package, args } => { let pass_through = if args.is_empty() { None } else { Some(args.as_slice()) }; LinkCommand::new(cwd).execute(package.as_deref(), pass_through).await } Commands::Unlink { package, recursive, args } => { let pass_through = if args.is_empty() { None } else { Some(args.as_slice()) }; UnlinkCommand::new(cwd).execute(package.as_deref(), recursive, pass_through).await } Commands::Dlx { package, shell_mode, silent, args } => { DlxCommand::new(cwd).execute(package, shell_mode, silent, args).await } Commands::Pm(pm_command) => commands::pm::execute_pm_subcommand(cwd, pm_command).await, // Category B: JS Script Commands Commands::Create { args } => commands::create::execute(cwd, &args).await, Commands::Migrate { args } => commands::migrate::execute(cwd, &args).await, Commands::Config { args } => commands::config::execute(cwd, &args).await, Commands::Staged { args } => commands::staged::execute(cwd, &args).await, // Category C: Local CLI Delegation (stubs) Commands::Dev { args } => { if help::maybe_print_unified_delegate_help("dev", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::delegate::execute(cwd, "dev", &args).await } Commands::Build { args } => { if help::maybe_print_unified_delegate_help("build", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::delegate::execute(cwd, "build", &args).await } Commands::Test { args } => { if help::maybe_print_unified_delegate_help("test", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::delegate::execute(cwd, "test", &args).await } Commands::Lint { args } => { if help::maybe_print_unified_delegate_help("lint", &args, render_options.show_header) { return Ok(ExitStatus::default()); 
} print_runtime_header(render_options.show_header); if should_force_global_delegate("lint", &args) { commands::delegate::execute_global(cwd, "lint", &args).await } else { commands::delegate::execute(cwd, "lint", &args).await } } Commands::Fmt { args } => { if help::maybe_print_unified_delegate_help("fmt", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); if should_force_global_delegate("fmt", &args) { commands::delegate::execute_global(cwd, "fmt", &args).await } else { commands::delegate::execute(cwd, "fmt", &args).await } } Commands::Check { args } => { if help::maybe_print_unified_delegate_help("check", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::delegate::execute(cwd, "check", &args).await } Commands::Pack { args } => { if help::maybe_print_unified_delegate_help("pack", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::delegate::execute(cwd, "pack", &args).await } Commands::Run { args } => { if help::maybe_print_unified_delegate_help("run", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::run_or_delegate::execute(cwd, &args).await } Commands::Exec { args } => { if help::maybe_print_unified_delegate_help("exec", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::delegate::execute(cwd, "exec", &args).await } Commands::Preview { args } => { if help::maybe_print_unified_delegate_help("preview", &args, render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::delegate::execute(cwd, "preview", &args).await } Commands::Cache { args } => { if help::maybe_print_unified_delegate_help("cache", &args, 
render_options.show_header) { return Ok(ExitStatus::default()); } print_runtime_header(render_options.show_header); commands::delegate::execute(cwd, "cache", &args).await } Commands::Env(args) => commands::env::execute(cwd, args).await, // Self-Management Commands::Upgrade { version, tag, check, rollback, force, silent, registry } => { commands::upgrade::execute(commands::upgrade::UpgradeOptions { version, tag, check, rollback, force, silent, registry, }) .await } Commands::Implode { yes } => commands::implode::execute(yes), } } /// Create an exit status with the given code. pub(crate) fn exit_status(code: i32) -> ExitStatus { #[cfg(unix)] { use std::os::unix::process::ExitStatusExt; ExitStatus::from_raw(code << 8) } #[cfg(windows)] { use std::os::windows::process::ExitStatusExt; ExitStatus::from_raw(code as u32) } } fn print_runtime_header(show_header: bool) { if !show_header { return; } println!("{}", vite_shared::header::vite_plus_header()); println!(); } /// Build a clap Command with custom help formatting matching the JS CLI output. pub fn command_with_help() -> clap::Command { command_with_help_with_options(RenderOptions::default()) } /// Build a clap Command with custom help formatting and rendering options. pub fn command_with_help_with_options(render_options: RenderOptions) -> clap::Command { apply_custom_help(Args::command(), render_options) } /// Apply custom help formatting to a clap Command to match the JS CLI output. 
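Editor's aside: the `exit_status` helper above shifts the code left by 8 bits on Unix because `ExitStatus::from_raw` expects a raw wait status, not a bare exit code. A portable sketch of that bit layout, using illustrative helpers modeled on the POSIX `WIFEXITED`/`WEXITSTATUS` macros (the names here are for demonstration only):

```rust
// Sketch of the Unix wait-status layout that motivates `code << 8`:
// WIFEXITED is true when the low 7 bits are zero, and WEXITSTATUS
// then reads the exit code from bits 8..16.
fn wifexited(status: i32) -> bool {
    status & 0x7f == 0
}

fn wexitstatus(status: i32) -> i32 {
    (status >> 8) & 0xff
}

fn main() {
    // What `exit_status(1)` passes to `from_raw` on Unix.
    let raw = 1 << 8;
    assert!(wifexited(raw));
    assert_eq!(wexitstatus(raw), 1);
    // Passing the code unshifted would be read as death-by-signal
    // (low bits non-zero), not as a normal exit.
    assert!(!wifexited(1));
    println!("ok");
}
```

On Windows no shift is needed, since the raw value there is the exit code itself, which is why the helper branches on `cfg(unix)` / `cfg(windows)`.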
fn apply_custom_help(cmd: clap::Command, render_options: RenderOptions) -> clap::Command { let after_help = help::render_help_doc(&help::top_level_help_doc()); let options_heading = help::render_heading("Options"); let header = if render_options.show_header { vite_shared::header::vite_plus_header() } else { String::new() }; let help_template = format!("{header}{{after-help}}\n{options_heading}\n{{options}}\n"); cmd.after_help(after_help).help_template(help_template) } /// Parse CLI arguments from a custom args iterator with custom help formatting. /// Returns `Err` with the clap error if parsing fails (e.g., unknown command). pub fn try_parse_args_from( args: impl IntoIterator<Item = String>, ) -> Result<Args, clap::Error> { try_parse_args_from_with_options(args, RenderOptions::default()) } /// Parse CLI arguments from a custom args iterator with rendering options. /// Returns `Err` with the clap error if parsing fails (e.g., unknown command). pub fn try_parse_args_from_with_options( args: impl IntoIterator<Item = String>, render_options: RenderOptions, ) -> Result<Args, clap::Error> { let cmd = apply_custom_help(Args::command(), render_options); let matches = cmd.try_get_matches_from(args)?; Args::from_arg_matches(&matches).map_err(|e| e.into()) } #[cfg(test)] mod tests { use super::{has_flag_before_terminator, should_force_global_delegate}; #[test] fn detects_flag_before_option_terminator() { assert!(has_flag_before_terminator( &["--init".to_string(), "src/index.ts".to_string()], "--init" )); } #[test] fn ignores_flag_after_option_terminator() { assert!(!has_flag_before_terminator( &["src/index.ts".to_string(), "--".to_string(), "--init".to_string(),], "--init" )); } #[test] fn lint_init_forces_global_delegate() { assert!(should_force_global_delegate("lint", &["--init".to_string()])); } #[test] fn fmt_migrate_forces_global_delegate() { assert!(should_force_global_delegate("fmt", &["--migrate=prettier".to_string()])); } #[test] fn non_init_does_not_force_global_delegate() { assert!(!should_force_global_delegate("lint",
&["src/index.ts".to_string()])); assert!(!should_force_global_delegate("fmt", &["--check".to_string()])); } } ================================================ FILE: crates/vite_global_cli/src/command_picker.rs ================================================ //! Interactive top-level command picker for `vp`. use std::{ io::{self, IsTerminal, Write}, ops::ControlFlow, }; use crossterm::{ cursor, event::{self, Event, KeyCode, KeyEvent, KeyEventKind, KeyModifiers}, execute, style::{Attribute, Print, ResetColor, SetAttribute, SetForegroundColor}, terminal::{self, ClearType}, }; use vite_path::AbsolutePath; use crate::commands::has_vite_plus_dependency; const NEWLINE: &str = "\r\n"; const SELECTED_COLOR: crossterm::style::Color = crossterm::style::Color::Blue; const SELECTED_MARKER: &str = "›"; const UNSELECTED_MARKER: &str = " "; #[derive(Clone, Copy, Debug, PartialEq, Eq)] pub struct PickedCommand { pub command: &'static str, pub append_help: bool, } #[derive(Clone, Copy, Debug, PartialEq, Eq)] pub enum TopLevelCommandPick { Skipped, Selected(PickedCommand), Cancelled, } #[derive(Clone, Copy)] struct CommandEntry { label: &'static str, command: &'static str, summary: &'static str, append_help: bool, } const COMMANDS: &[CommandEntry] = &[ CommandEntry { label: "create", command: "create", summary: "Create a new project from a template.", append_help: false, }, CommandEntry { label: "migrate", command: "migrate", summary: "Migrate an existing project to Vite+.", append_help: false, }, CommandEntry { label: "dev", command: "dev", summary: "Run the development server.", append_help: false, }, CommandEntry { label: "check", command: "check", summary: "Run format, lint, and type checks.", append_help: false, }, CommandEntry { label: "test", command: "test", summary: "Run tests.", append_help: false }, CommandEntry { label: "install", command: "install", summary: "Install dependencies, or add packages when names are provided.", append_help: false, }, CommandEntry { label: "run", command: "run", summary: "Run tasks.", append_help: false }, CommandEntry { label: "build", command: "build", summary: "Build for production.", append_help: false, }, CommandEntry { label: "pack", command: "pack", summary: "Build library.", append_help: false }, CommandEntry { label: "preview", command: "preview", summary: "Preview production build.", append_help: false, }, CommandEntry { label: "config", command: "config", summary: "Configure hooks and agent integration.", append_help: false, }, CommandEntry { label: "outdated", command: "outdated", summary: "Check for outdated packages.", append_help: false, }, CommandEntry { label: "env", command: "env", summary: "Manage Node.js versions.", append_help: false, }, CommandEntry { label: "help", command: "help", summary: "View all commands and details", append_help: false, }, ]; const CI_ENV_VARS: &[&str] = &[ "CI", "CONTINUOUS_INTEGRATION", "GITHUB_ACTIONS", "GITLAB_CI", "CIRCLECI", "TRAVIS", "JENKINS_URL", "BUILDKITE", "DRONE", "CODEBUILD_BUILD_ID", "TF_BUILD", ]; pub fn pick_top_level_command_if_interactive( cwd: &AbsolutePath, ) -> io::Result<TopLevelCommandPick> { if !should_enable_picker() { return Ok(TopLevelCommandPick::Skipped); } let command_order = default_command_order(has_vite_plus_dependency(cwd)); Ok(match run_picker(&command_order)? { Some(selection) => TopLevelCommandPick::Selected(selection), None => TopLevelCommandPick::Cancelled, }) } fn should_enable_picker() -> bool { std::io::stdin().is_terminal() && std::io::stdout().is_terminal() && std::env::var("TERM").map_or(true, |term| term != "dumb") && !is_ci_environment() } fn is_ci_environment() -> bool { CI_ENV_VARS.iter().any(|key| std::env::var_os(key).is_some()) } fn run_picker(command_order: &[usize]) -> io::Result<Option<PickedCommand>> { let mut stdout = io::stdout(); let mut selected_position = 0usize; let mut viewport_start = 0usize; let mut query = String::new(); let is_warp = vite_shared::header::is_warp_terminal(); let header_overhead = if is_warp { 10 } else { 9 }; terminal::enable_raw_mode()?; execute!(stdout, terminal::EnterAlternateScreen, cursor::Hide)?; let pick_result = loop { let filtered_indices = filtered_command_indices(&query, command_order); if filtered_indices.is_empty() { selected_position = 0; viewport_start = 0; } else { if selected_position >= filtered_indices.len() { selected_position = 0; } viewport_start = viewport_start.min(filtered_indices.len().saturating_sub(1)); } let (_, rows) = terminal::size().unwrap_or((80, 24)); let rows = if rows == 0 { 24 } else { rows }; let viewport_size = compute_viewport_size(rows.into(), filtered_indices.len(), header_overhead); viewport_start = align_viewport(viewport_start, selected_position, viewport_size); match render_picker( &mut stdout, &query, &filtered_indices, selected_position, viewport_start, viewport_size, ) { Ok(()) => {} Err(err) => break Err(err), } match event::read() { Ok(Event::Key(KeyEvent { code, modifiers, kind, .. })) => { if kind == KeyEventKind::Press { match handle_key_event( code, modifiers, &mut query, &mut selected_position, filtered_indices.len(), ) { ControlFlow::Continue(()) => continue, ControlFlow::Break(Some(())) => { let Some(index) = filtered_indices.get(selected_position).copied() else { continue; }; break Ok(Some(PickedCommand { command: COMMANDS[index].command, append_help: COMMANDS[index].append_help, })); } ControlFlow::Break(None) => break Ok(None), } } } Ok(_) => continue, Err(err) => break Err(err), } }; let cleanup_result = cleanup_picker(&mut stdout); match (pick_result, cleanup_result) { (Ok(picked), Ok(())) => Ok(picked), (Err(err), _) => Err(err), (Ok(_), Err(err)) => Err(err), } } fn cleanup_picker(stdout: &mut io::Stdout) -> io::Result<()> { terminal::disable_raw_mode()?; execute!(stdout, cursor::Show, terminal::LeaveAlternateScreen, ResetColor)?; Ok(()) } fn handle_key_event( code: KeyCode, modifiers: KeyModifiers, query: &mut String, selected_position: &mut usize, filtered_len: usize, ) -> ControlFlow<Option<()>> { match code { KeyCode::Char('c') if modifiers.contains(KeyModifiers::CONTROL) => ControlFlow::Break(None), KeyCode::Esc => ControlFlow::Break(None), KeyCode::Backspace => { if !query.is_empty() { query.pop(); *selected_position = 0; } ControlFlow::Continue(()) } KeyCode::Up => { *selected_position = selected_position.saturating_sub(1); ControlFlow::Continue(()) } KeyCode::Down => { if *selected_position + 1 < filtered_len { *selected_position += 1; } ControlFlow::Continue(()) } KeyCode::Home => { *selected_position = 0; ControlFlow::Continue(()) } KeyCode::End => { *selected_position = filtered_len.saturating_sub(1); ControlFlow::Continue(()) } KeyCode::Enter => { if filtered_len == 0 { ControlFlow::Continue(()) } else { ControlFlow::Break(Some(())) } } KeyCode::Char(ch) if modifiers.is_empty() || modifiers == KeyModifiers::SHIFT => { if !ch.is_control() { query.push(ch); *selected_position = 0; } ControlFlow::Continue(()) } _ =>
ControlFlow::Continue(()), } } fn render_picker( stdout: &mut io::Stdout, query: &str, filtered_indices: &[usize], selected_position: usize, viewport_start: usize, viewport_size: usize, ) -> io::Result<()> { let (columns, _) = terminal::size().unwrap_or((80, 24)); let columns = if columns == 0 { 80 } else { columns }; // Warp terminal needs extra padding since it renders alternate screen // content flush against the edges of its block-mode renderer. let pad = if vite_shared::header::is_warp_terminal() { " " } else { "" }; let max_width = usize::from(columns).saturating_sub(4 + pad.len()); let viewport_end = (viewport_start + viewport_size).min(filtered_indices.len()); let instruction = truncate_line(&picker_instruction(query), max_width); execute!(stdout, cursor::MoveTo(0, 0), terminal::Clear(ClearType::All),)?; if vite_shared::header::is_warp_terminal() { execute!(stdout, Print(NEWLINE))?; } execute!( stdout, Print(format!("{pad}{}", vite_shared::header::vite_plus_header())), Print(NEWLINE), Print(NEWLINE), Print(format!("{pad}{instruction}")), Print(NEWLINE), Print(NEWLINE) )?; if viewport_start > 0 { execute!( stdout, SetForegroundColor(crossterm::style::Color::DarkGrey), Print(format!("{pad} ↑ more")), Print(NEWLINE), ResetColor )?; } for (index, command_index) in filtered_indices[viewport_start..viewport_end].iter().enumerate() { let actual_position = viewport_start + index; let is_selected = actual_position == selected_position; let entry = &COMMANDS[*command_index]; let marker = if is_selected { SELECTED_MARKER } else { UNSELECTED_MARKER }; let label = truncate_line(entry.label, max_width); if entry.command == "help" { let (help_label, help_summary) = selected_command_parts(entry.command, entry.summary, max_width); execute!( stdout, SetForegroundColor(crossterm::style::Color::DarkGrey), Print(format!("{pad} {marker} ")), ResetColor )?; if is_selected { execute!( stdout, SetForegroundColor(SELECTED_COLOR), SetAttribute(Attribute::Bold), Print(help_label), 
SetAttribute(Attribute::Reset), ResetColor )?; } else { execute!(stdout, Print(help_label))?; } if let Some(summary) = help_summary { execute!( stdout, SetForegroundColor(crossterm::style::Color::DarkGrey), Print(" "), Print(summary), ResetColor )?; } execute!(stdout, Print(NEWLINE))?; continue; } if is_selected { let (selected_label, selected_summary) = selected_command_parts(&label, entry.summary, max_width); execute!( stdout, SetForegroundColor(crossterm::style::Color::DarkGrey), Print(format!("{pad} {marker} ")), ResetColor )?; execute!(stdout, SetForegroundColor(SELECTED_COLOR), SetAttribute(Attribute::Bold),)?; execute!(stdout, Print(selected_label))?; execute!(stdout, SetAttribute(Attribute::Reset), ResetColor)?; if let Some(summary) = selected_summary { execute!( stdout, SetForegroundColor(crossterm::style::Color::DarkGrey), Print(" "), Print(summary), ResetColor )?; } execute!(stdout, Print(NEWLINE))?; } else { execute!( stdout, SetForegroundColor(crossterm::style::Color::DarkGrey), Print(format!("{pad} {marker} ")), ResetColor, Print(label), )?; execute!(stdout, Print(NEWLINE))?; } } if viewport_end < filtered_indices.len() { execute!( stdout, SetForegroundColor(crossterm::style::Color::DarkGrey), Print(format!("{pad} ↓ more")), Print(NEWLINE), ResetColor )?; } if filtered_indices.is_empty() { let no_match = if query.is_empty() { "No common commands available. Run `vp help`.".to_string() } else { format!("No common command matches '{query}'. 
Run `vp help`.") }; let no_match = truncate_line(&no_match, max_width); execute!( stdout, Print(NEWLINE), SetForegroundColor(crossterm::style::Color::DarkGrey), Print(format!("{pad} ")), Print(no_match), Print(NEWLINE), ResetColor )?; } stdout.flush() } fn picker_instruction(query: &str) -> String { format!("Select a command (↑/↓, Enter to run, type to search): {query}") } fn compute_viewport_size( terminal_rows: usize, total_commands: usize, header_overhead: usize, ) -> usize { terminal_rows.saturating_sub(header_overhead).clamp(6, total_commands.max(6)) } fn align_viewport(current_start: usize, selected_index: usize, viewport_size: usize) -> usize { if selected_index < current_start { selected_index } else if selected_index >= current_start + viewport_size { selected_index + 1 - viewport_size } else { current_start } } fn truncate_line(line: &str, max_chars: usize) -> String { if max_chars == 0 { return String::new(); } let char_count = line.chars().count(); if char_count <= max_chars { return line.to_string(); } if max_chars == 1 { return "…".to_string(); } line.chars().take(max_chars - 1).collect::<String>() + "…" } fn selected_command_parts( command: &str, summary: &str, max_chars: usize, ) -> (String, Option<String>) { let selected_label = format!("{command}:"); let selected_label_width = selected_label.chars().count(); if max_chars <= selected_label_width { return (truncate_line(&selected_label, max_chars), None); } let summary_width = max_chars - selected_label_width - 1; if summary_width == 0 { return (selected_label, None); } (selected_label, Some(truncate_line(summary, summary_width))) } fn default_command_order(prioritize_run: bool) -> Vec<usize> { let indices = (0..COMMANDS.len()).collect::<Vec<_>>(); if !prioritize_run { return indices; } let migrate_index = COMMANDS .iter() .position(|command| command.command == "migrate") .expect("migrate command should exist"); let run_index = COMMANDS .iter() .position(|command| command.command == "run") .expect("run command should exist");
let mut ordered = Vec::with_capacity(indices.len()); ordered.push(run_index); ordered .extend(indices.into_iter().filter(|index| *index != run_index && *index != migrate_index)); ordered } fn filtered_command_indices(query: &str, command_order: &[usize]) -> Vec<usize> { let query = query.trim(); if query.is_empty() { return command_order.to_vec(); } let query = query.to_ascii_lowercase(); let starts_with_matches = command_order .iter() .copied() .filter(|index| { let command = &COMMANDS[*index]; let command_name = command.command.to_ascii_lowercase(); command_name.starts_with(&query) }) .collect::<Vec<_>>(); if !starts_with_matches.is_empty() { return starts_with_matches; } command_order .iter() .copied() .filter(|index| { let command = &COMMANDS[*index]; let command_name = command.command.to_ascii_lowercase(); command_name.contains(&query) }) .collect::<Vec<_>>() } #[cfg(test)] mod tests { use super::{ COMMANDS, align_viewport, compute_viewport_size, default_command_order, filtered_command_indices, picker_instruction, selected_command_parts, }; #[test] fn commands_are_unique() { let mut names = COMMANDS.iter().map(|command| command.command).collect::<Vec<_>>(); names.sort_unstable(); names.dedup(); assert_eq!(names.len(), COMMANDS.len()); } #[test] fn commands_with_required_args_default_to_help() { let expected: [&str; 0] = []; let mut actual = COMMANDS .iter() .filter(|command| command.append_help) .map(|command| command.command) .collect::<Vec<_>>(); actual.sort_unstable(); assert_eq!(actual, expected); } #[test] fn viewport_aligns_to_selected_row() { assert_eq!(align_viewport(0, 0, 8), 0); assert_eq!(align_viewport(0, 6, 8), 0); assert_eq!(align_viewport(0, 8, 8), 1); assert_eq!(align_viewport(5, 2, 8), 2); } #[test] fn viewport_size_is_clamped() { assert_eq!(compute_viewport_size(12, 30, 9), 6); assert_eq!(compute_viewport_size(24, 30, 9), 15); assert_eq!(compute_viewport_size(100, 8, 9), 8); // Warp adds 1 extra row of overhead assert_eq!(compute_viewport_size(12, 30, 10), 6);
assert_eq!(compute_viewport_size(24, 30, 10), 14); } #[test] fn filtering_is_case_insensitive_and_returns_matching_commands_only() { let order = default_command_order(false); let run = filtered_command_indices("Ru", &order); assert_eq!(run.len(), 1); assert_eq!(COMMANDS[run[0]].command, "run"); let build = filtered_command_indices("b", &order); let build_commands = build.iter().map(|index| COMMANDS[*index].command).collect::<Vec<_>>(); assert!(build_commands.contains(&"build")); } #[test] fn filtering_with_no_matches_returns_empty() { let order = default_command_order(false); let no_match = filtered_command_indices("xyz123", &order); assert!(no_match.is_empty()); } #[test] fn filtering_prefers_prefix_matches() { let order = default_command_order(false); let help = filtered_command_indices("he", &order); assert_eq!(help.len(), 1); assert_eq!(COMMANDS[help[0]].command, "help"); } #[test] fn default_order_puts_create_first_for_non_vite_plus_projects() { let order = default_command_order(false); assert_eq!(COMMANDS[order[0]].command, "create"); } #[test] fn default_order_puts_run_first_for_vite_plus_projects() { let order = default_command_order(true); assert_eq!(COMMANDS[order[0]].command, "run"); } #[test] fn default_order_hides_migrate_for_vite_plus_projects() { let order = default_command_order(true); let ordered_commands = order.iter().map(|index| COMMANDS[*index].command).collect::<Vec<_>>(); assert!(!ordered_commands.contains(&"migrate")); } #[test] fn selected_command_parts_appends_summary() { let (label, summary) = selected_command_parts("create", "Create a new project.", 80); assert_eq!(label, "create:"); assert_eq!(summary, Some("Create a new project.".to_string())); } #[test] fn selected_command_parts_truncates_summary_to_fit_width() { let (label, summary) = selected_command_parts("create", "Create a new project.", 18); assert_eq!(label, "create:"); assert_eq!(summary, Some("Create a …".to_string())); } #[test] fn
selected_command_parts_truncates_label_when_width_is_tight() { let (label, summary) = selected_command_parts("create", "Create a new project.", 4); assert_eq!(label, "cre…"); assert_eq!(summary, None); } #[test] fn help_entry_uses_static_inline_description() { let help = COMMANDS .iter() .find(|entry| entry.command == "help") .expect("help command should exist"); assert_eq!(help.label, "help"); assert_eq!(help.summary, "View all commands and details"); } #[test] fn picker_instruction_mentions_search() { assert_eq!( picker_instruction(""), "Select a command (↑/↓, Enter to run, type to search): " ); } } ================================================ FILE: crates/vite_global_cli/src/commands/add.rs ================================================ use std::process::ExitStatus; use vite_install::{ commands::add::{AddCommandOptions, SaveDependencyType}, package_manager::PackageManager, }; use vite_path::AbsolutePathBuf; use super::prepend_js_runtime_to_path_env; use crate::error::Error; /// Add command for adding packages to dependencies. /// /// This command automatically detects the package manager and translates /// the add command to the appropriate package manager-specific syntax. 
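/// /// # Example /// /// A minimal usage sketch (doc example marked `ignore`; the argument values /// below are illustrative assumptions, not CLI defaults): /// /// ```ignore /// let cmd = AddCommand::new(cwd); /// // Add "react" as a regular dependency, with all optional flags left plain. /// let status = cmd /// .execute(&["react".to_string()], None, false, None, None, false, false, false, None, None) /// .await?; /// ```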
pub struct AddCommand { cwd: AbsolutePathBuf, } impl AddCommand { pub fn new(cwd: AbsolutePathBuf) -> Self { Self { cwd } } pub async fn execute( self, packages: &[String], save_dependency_type: Option<SaveDependencyType>, save_exact: bool, save_catalog_name: Option<&str>, filters: Option<&[String]>, workspace_root: bool, workspace_only: bool, global: bool, allow_build: Option<&str>, pass_through_args: Option<&[String]>, ) -> Result<ExitStatus, Error> { prepend_js_runtime_to_path_env(&self.cwd).await?; super::ensure_package_json(&self.cwd).await?; let add_command_options = AddCommandOptions { packages, save_dependency_type, save_exact, filters, workspace_root, workspace_only, global, save_catalog_name, allow_build, pass_through_args, }; // Detect package manager let package_manager = PackageManager::builder(&self.cwd).build_with_default().await?; Ok(package_manager.run_add_command(&add_command_options, &self.cwd).await?) } } #[cfg(test)] mod tests { use super::*; #[test] fn test_add_command_new() { let workspace_root = if cfg!(windows) { AbsolutePathBuf::new("C:\\test".into()).unwrap() } else { AbsolutePathBuf::new("/test".into()).unwrap() }; let cmd = AddCommand::new(workspace_root.clone()); assert_eq!(cmd.cwd, workspace_root); } } ================================================ FILE: crates/vite_global_cli/src/commands/config.rs ================================================ //! In-repo configuration command (Category B: JavaScript Command). use std::process::ExitStatus; use vite_path::AbsolutePathBuf; use crate::error::Error; /// Execute the `config` command by delegating to local or global vite-plus. pub async fn execute(cwd: AbsolutePathBuf, args: &[String]) -> Result<ExitStatus, Error> { super::delegate::execute(cwd, "config", args).await } ================================================ FILE: crates/vite_global_cli/src/commands/create.rs ================================================ //! Project scaffolding command (Category B: JavaScript Command).
use std::process::ExitStatus; use vite_path::AbsolutePathBuf; use crate::error::Error; /// Execute the `create` command by delegating to local or global vite-plus. pub async fn execute(cwd: AbsolutePathBuf, args: &[String]) -> Result<ExitStatus, Error> { super::delegate::execute(cwd, "create", args).await } #[cfg(test)] mod tests { #[test] fn test_create_command_module_exists() { // Basic test to ensure the module compiles assert!(true); } } ================================================ FILE: crates/vite_global_cli/src/commands/dedupe.rs ================================================ use std::process::ExitStatus; use vite_install::commands::dedupe::DedupeCommandOptions; use vite_path::AbsolutePathBuf; use super::{build_package_manager, prepend_js_runtime_to_path_env}; use crate::error::Error; /// Dedupe command for deduplicating dependencies by removing older versions. /// /// This command automatically detects the package manager and translates /// the dedupe command to the appropriate package manager-specific syntax. pub struct DedupeCommand { cwd: AbsolutePathBuf, } impl DedupeCommand { pub fn new(cwd: AbsolutePathBuf) -> Self { Self { cwd } } pub async fn execute( self, check: bool, pass_through_args: Option<&[String]>, ) -> Result<ExitStatus, Error> { prepend_js_runtime_to_path_env(&self.cwd).await?; let package_manager = build_package_manager(&self.cwd).await?; let dedupe_command_options = DedupeCommandOptions { check, pass_through_args }; Ok(package_manager.run_dedupe_command(&dedupe_command_options, &self.cwd).await?) } } #[cfg(test)] mod tests { use super::*; #[test] fn test_dedupe_command_new() { let workspace_root = if cfg!(windows) { AbsolutePathBuf::new("C:\\test".into()).unwrap() } else { AbsolutePathBuf::new("/test".into()).unwrap() }; let cmd = DedupeCommand::new(workspace_root.clone()); assert_eq!(cmd.cwd, workspace_root); } } ================================================ FILE: crates/vite_global_cli/src/commands/delegate.rs ================================================ //!
JavaScript command delegation — resolves local vite-plus first, falls back to global. use std::process::ExitStatus; use vite_path::AbsolutePathBuf; use crate::{error::Error, js_executor::JsExecutor}; /// Execute a command by delegating to the local `vite-plus` CLI. pub async fn execute( cwd: AbsolutePathBuf, command: &str, args: &[String], ) -> Result<ExitStatus, Error> { let mut executor = JsExecutor::new(None); let mut full_args = vec![command.to_string()]; full_args.extend(args.iter().cloned()); executor.delegate_to_local_cli(&cwd, &full_args).await } /// Execute a command by delegating to the global `vite-plus` CLI. pub async fn execute_global( cwd: AbsolutePathBuf, command: &str, args: &[String], ) -> Result<ExitStatus, Error> { let mut executor = JsExecutor::new(None); let mut full_args = vec![command.to_string()]; full_args.extend(args.iter().cloned()); executor.delegate_to_global_cli(&cwd, &full_args).await } #[cfg(test)] mod tests { #[test] fn test_delegate_command_module_exists() { // Basic test to ensure the module compiles assert!(true); } } ================================================ FILE: crates/vite_global_cli/src/commands/dlx.rs ================================================ use std::{collections::HashMap, process::ExitStatus}; use vite_command::run_command; use vite_install::{ commands::dlx::{DlxCommandOptions, build_npx_args}, package_manager::PackageManager, }; use vite_path::AbsolutePathBuf; use super::prepend_js_runtime_to_path_env; use crate::error::Error; /// Dlx command for executing packages without installing them as dependencies. /// /// This command automatically detects the package manager and translates /// the dlx command to the appropriate package manager-specific syntax: /// - pnpm: pnpm dlx /// - npm: npm exec /// - yarn@2+: yarn dlx /// - yarn@1: falls back to npx /// /// When no package.json is found, falls back to npx directly.
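/// /// # Example /// /// An illustrative translation (assuming a project whose detected package /// manager is pnpm; the `vp dlx` invocation shown is an assumption): /// /// ```text /// vp dlx cowsay hello -> pnpm dlx cowsay hello /// ```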
pub struct DlxCommand { cwd: AbsolutePathBuf, } impl DlxCommand { pub fn new(cwd: AbsolutePathBuf) -> Self { Self { cwd } } pub async fn execute( self, packages: Vec<String>, shell_mode: bool, silent: bool, args: Vec<String>, ) -> Result<ExitStatus, Error> { if args.is_empty() { return Err(Error::Other("dlx requires a package name".into())); } prepend_js_runtime_to_path_env(&self.cwd).await?; // First arg is the package spec, rest are command args let package_spec = &args[0]; let command_args: Vec<String> = args[1..].to_vec(); let dlx_command_options = DlxCommandOptions { packages: &packages, package_spec, args: &command_args, shell_mode, silent, }; match PackageManager::builder(&self.cwd).build_with_default().await { Ok(pm) => Ok(pm.run_dlx_command(&dlx_command_options, &self.cwd).await?), Err(vite_error::Error::WorkspaceError(vite_workspace::Error::PackageJsonNotFound( _, ))) => { // No package.json found — fall back to npx directly let args = build_npx_args(&dlx_command_options); let envs = HashMap::new(); Ok(run_command("npx", &args, &envs, &self.cwd).await?) } Err(e) => Err(e.into()), } } } #[cfg(test)] mod tests { use super::*; #[test] fn test_dlx_command_new() { let workspace_root = if cfg!(windows) { AbsolutePathBuf::new("C:\\test".into()).unwrap() } else { AbsolutePathBuf::new("/test".into()).unwrap() }; let cmd = DlxCommand::new(workspace_root.clone()); assert_eq!(cmd.cwd, workspace_root); } } ================================================ FILE: crates/vite_global_cli/src/commands/env/bin_config.rs ================================================ //! Per-binary configuration storage for global packages. //! //! Each binary installed via `vp install -g` gets a config file at //! `~/.vite-plus/bins/{name}.json` that tracks which package owns it. //! This enables: //! - Deterministic binary-to-package resolution //! - Conflict detection when installing packages with overlapping binaries //!
- Safe uninstall (only removes binaries owned by the package) use serde::{Deserialize, Serialize}; use vite_path::AbsolutePathBuf; use super::config::get_vite_plus_home; use crate::error::Error; /// Source that installed a binary. #[derive(Debug, Clone, Default, PartialEq, Eq, Serialize, Deserialize)] #[serde(rename_all = "lowercase")] pub enum BinSource { /// Installed via `vp install -g` (managed shim) #[default] Vp, /// Installed via `npm install -g` shim interception (direct symlink) Npm, } /// Config for a single binary, stored at ~/.vite-plus/bins/{name}.json #[derive(Debug, Clone, Serialize, Deserialize)] #[serde(rename_all = "camelCase")] pub struct BinConfig { /// Binary name pub name: String, /// Package that installed this binary pub package: String, /// Package version pub version: String, /// Node.js version used pub node_version: String, /// How this binary was installed #[serde(default)] pub source: BinSource, } impl BinConfig { /// Create a new BinConfig with `Vp` source (used by `vp install -g`). pub fn new(name: String, package: String, version: String, node_version: String) -> Self { Self { name, package, version, node_version, source: BinSource::Vp } } /// Create a new BinConfig with `Npm` source (used by npm install -g interception). pub fn new_npm(name: String, package: String, node_version: String) -> Self { Self { name, package, version: String::new(), node_version, source: BinSource::Npm } } /// Get the bins directory path (~/.vite-plus/bins/). pub fn bins_dir() -> Result<AbsolutePathBuf, Error> { Ok(get_vite_plus_home()?.join("bins")) } /// Get the path to a binary's config file. pub fn path(bin_name: &str) -> Result<AbsolutePathBuf, Error> { Ok(Self::bins_dir()?.join(format!("{bin_name}.json"))) } /// Load config for a binary (synchronous).
pub fn load_sync(bin_name: &str) -> Result<Option<Self>, Error> { let path = Self::path(bin_name)?; match std::fs::read_to_string(path.as_path()) { Ok(content) => { let config: Self = serde_json::from_str(&content).map_err(|e| { Error::ConfigError(format!("Failed to parse bin config: {e}").into()) })?; Ok(Some(config)) } Err(e) if e.kind() == std::io::ErrorKind::NotFound => Ok(None), Err(e) => Err(e.into()), } } /// Save config for a binary (synchronous). pub fn save_sync(&self) -> Result<(), Error> { let path = Self::path(&self.name)?; if let Some(parent) = path.parent() { std::fs::create_dir_all(parent)?; } let content = serde_json::to_string_pretty(self).map_err(|e| { Error::ConfigError(format!("Failed to serialize bin config: {e}").into()) })?; std::fs::write(path.as_path(), content)?; Ok(()) } /// Delete config for a binary (synchronous). pub fn delete_sync(bin_name: &str) -> Result<(), Error> { let path = Self::path(bin_name)?; match std::fs::remove_file(path.as_path()) { Ok(()) => Ok(()), Err(e) if e.kind() == std::io::ErrorKind::NotFound => Ok(()), Err(e) => Err(e.into()), } } /// Load config for a binary. pub async fn load(bin_name: &str) -> Result<Option<Self>, Error> { let path = Self::path(bin_name)?; if !tokio::fs::try_exists(&path).await.unwrap_or(false) { return Ok(None); } let content = tokio::fs::read_to_string(&path).await?; let config: Self = serde_json::from_str(&content) .map_err(|e| Error::ConfigError(format!("Failed to parse bin config: {e}").into()))?; Ok(Some(config)) } /// Save config for a binary. pub async fn save(&self) -> Result<(), Error> { let path = Self::path(&self.name)?; // Ensure bins directory exists if let Some(parent) = path.parent() { tokio::fs::create_dir_all(parent).await?; } let content = serde_json::to_string_pretty(self).map_err(|e| { Error::ConfigError(format!("Failed to serialize bin config: {e}").into()) })?; tokio::fs::write(&path, content).await?; Ok(()) } /// Delete config for a binary.
pub async fn delete(bin_name: &str) -> Result<(), Error> { let path = Self::path(bin_name)?; if tokio::fs::try_exists(&path).await.unwrap_or(false) { tokio::fs::remove_file(&path).await?; } Ok(()) } /// Find all binaries installed by a package. /// /// This is used as a fallback during uninstall when PackageMetadata is missing /// (orphan recovery). pub async fn find_by_package(package_name: &str) -> Result<Vec<String>, Error> { let bins_dir = Self::bins_dir()?; if !tokio::fs::try_exists(&bins_dir).await.unwrap_or(false) { return Ok(Vec::new()); } let mut bins = Vec::new(); let mut entries = tokio::fs::read_dir(&bins_dir).await?; while let Some(entry) = entries.next_entry().await? { let path = entry.path(); if path.extension().is_some_and(|e| e == "json") { if let Ok(content) = tokio::fs::read_to_string(&path).await { if let Ok(config) = serde_json::from_str::<BinConfig>(&content) { if config.package == package_name { bins.push(config.name); } } } } } Ok(bins) } } #[cfg(test)] mod tests { use tempfile::TempDir; use super::*; #[tokio::test] async fn test_save_and_load() { let temp_dir = TempDir::new().unwrap(); let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(temp_dir.path()), ); let config = BinConfig::new( "tsc".to_string(), "typescript".to_string(), "5.0.0".to_string(), "20.18.0".to_string(), ); config.save().await.unwrap(); let loaded = BinConfig::load("tsc").await.unwrap(); assert!(loaded.is_some()); let loaded = loaded.unwrap(); assert_eq!(loaded.name, "tsc"); assert_eq!(loaded.package, "typescript"); assert_eq!(loaded.version, "5.0.0"); assert_eq!(loaded.node_version, "20.18.0"); } #[tokio::test] async fn test_find_by_package() { let temp_dir = TempDir::new().unwrap(); let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(temp_dir.path()), ); // Create configs for typescript (tsc, tsserver) let tsc = BinConfig::new( "tsc".to_string(), "typescript".to_string(), "5.0.0".to_string(),
"20.18.0".to_string(), ); tsc.save().await.unwrap(); let tsserver = BinConfig::new( "tsserver".to_string(), "typescript".to_string(), "5.0.0".to_string(), "20.18.0".to_string(), ); tsserver.save().await.unwrap(); // Create config for eslint let eslint = BinConfig::new( "eslint".to_string(), "eslint".to_string(), "9.0.0".to_string(), "22.0.0".to_string(), ); eslint.save().await.unwrap(); // Find by package let ts_bins = BinConfig::find_by_package("typescript").await.unwrap(); assert_eq!(ts_bins.len(), 2); assert!(ts_bins.contains(&"tsc".to_string())); assert!(ts_bins.contains(&"tsserver".to_string())); let eslint_bins = BinConfig::find_by_package("eslint").await.unwrap(); assert_eq!(eslint_bins.len(), 1); assert!(eslint_bins.contains(&"eslint".to_string())); let nonexistent_bins = BinConfig::find_by_package("nonexistent").await.unwrap(); assert!(nonexistent_bins.is_empty()); } #[tokio::test] async fn test_delete() { let temp_dir = TempDir::new().unwrap(); let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(temp_dir.path()), ); let config = BinConfig::new( "tsc".to_string(), "typescript".to_string(), "5.0.0".to_string(), "20.18.0".to_string(), ); config.save().await.unwrap(); // Verify it exists let loaded = BinConfig::load("tsc").await.unwrap(); assert!(loaded.is_some()); // Delete BinConfig::delete("tsc").await.unwrap(); // Verify it's gone let loaded = BinConfig::load("tsc").await.unwrap(); assert!(loaded.is_none()); // Delete again should not error BinConfig::delete("tsc").await.unwrap(); } #[tokio::test] async fn test_load_nonexistent() { let temp_dir = TempDir::new().unwrap(); let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(temp_dir.path()), ); let loaded = BinConfig::load("nonexistent").await.unwrap(); assert!(loaded.is_none()); } #[test] fn test_source_defaults_to_vp() { let config = BinConfig::new( "tsc".to_string(), "typescript".to_string(), "5.0.0".to_string(), 
"20.18.0".to_string(), ); assert_eq!(config.source, BinSource::Vp); } #[test] fn test_new_npm_source() { let config = BinConfig::new_npm( "codex".to_string(), "@openai/codex".to_string(), "22.22.0".to_string(), ); assert_eq!(config.source, BinSource::Npm); assert_eq!(config.name, "codex"); assert_eq!(config.package, "@openai/codex"); assert!(config.version.is_empty()); assert_eq!(config.node_version, "22.22.0"); } #[test] fn test_source_backward_compat_deserialize() { // Old BinConfig files without "source" field should default to "vp" let json = r#"{"name":"tsc","package":"typescript","version":"5.0.0","nodeVersion":"20.18.0"}"#; let config: BinConfig = serde_json::from_str(json).unwrap(); assert_eq!(config.source, BinSource::Vp); } #[test] fn test_sync_save_load_delete() { let temp_dir = TempDir::new().unwrap(); let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(temp_dir.path()), ); let config = BinConfig::new_npm( "codex".to_string(), "@openai/codex".to_string(), "22.22.0".to_string(), ); config.save_sync().unwrap(); let loaded = BinConfig::load_sync("codex").unwrap(); assert!(loaded.is_some()); let loaded = loaded.unwrap(); assert_eq!(loaded.source, BinSource::Npm); assert_eq!(loaded.package, "@openai/codex"); BinConfig::delete_sync("codex").unwrap(); let loaded = BinConfig::load_sync("codex").unwrap(); assert!(loaded.is_none()); // Delete again should not error BinConfig::delete_sync("codex").unwrap(); } } ================================================ FILE: crates/vite_global_cli/src/commands/env/config.rs ================================================ //! Configuration and version resolution for the env command. //! //! This module provides: //! - VITE_PLUS_HOME path resolution //! - Version resolution with priority order //! 
- Config file management use serde::{Deserialize, Serialize}; use vite_js_runtime::{ NodeProvider, VersionSource, normalize_version, read_package_json, resolve_node_version, }; use vite_path::{AbsolutePath, AbsolutePathBuf}; use crate::error::Error; /// Config file name const CONFIG_FILE: &str = "config.json"; /// Shim mode determines how shims resolve tools. #[derive(Serialize, Deserialize, Clone, Copy, Debug, Default, PartialEq, Eq)] #[serde(rename_all = "snake_case")] pub enum ShimMode { /// Shims always use vite-plus managed Node.js #[default] Managed, /// Shims prefer system Node.js, fallback to managed if not found SystemFirst, } /// User configuration stored in VITE_PLUS_HOME/config.json #[derive(Serialize, Deserialize, Default, Debug)] #[serde(rename_all = "camelCase")] pub struct Config { /// Default Node.js version when no project version file is found #[serde(default, skip_serializing_if = "Option::is_none")] pub default_node_version: Option<String>, /// Shim mode for tool resolution #[serde(default, skip_serializing_if = "is_default_shim_mode")] pub shim_mode: ShimMode, } /// Check if shim mode is the default (for skip_serializing_if) fn is_default_shim_mode(mode: &ShimMode) -> bool { *mode == ShimMode::Managed } /// Version resolution result #[derive(Debug)] pub struct VersionResolution { /// The resolved version string (e.g., "20.18.0") pub version: String, /// The source of the version (e.g., ".node-version", "engines.node", "default") pub source: String, /// Path to the source file (if applicable) pub source_path: Option<AbsolutePathBuf>, /// Project root directory (if version came from a project file) pub project_root: Option<AbsolutePathBuf>, /// Whether the original version spec was a range (e.g., "20", "^20.0.0", "lts/*") /// Range versions should use time-based cache expiry instead of mtime-only validation pub is_range: bool, } /// Get the VITE_PLUS_HOME directory path. /// /// Uses `VITE_PLUS_HOME` environment variable if set, otherwise defaults to `~/.vite-plus`.
pub fn get_vite_plus_home() -> Result<AbsolutePathBuf, Error> { Ok(vite_shared::get_vite_plus_home()?) } /// Get the bin directory path (~/.vite-plus/bin/). pub fn get_bin_dir() -> Result<AbsolutePathBuf, Error> { Ok(get_vite_plus_home()?.join("bin")) } /// Get the packages directory path (~/.vite-plus/packages/). pub fn get_packages_dir() -> Result<AbsolutePathBuf, Error> { Ok(get_vite_plus_home()?.join("packages")) } /// Get the tmp directory path for staging (~/.vite-plus/tmp/). pub fn get_tmp_dir() -> Result<AbsolutePathBuf, Error> { Ok(get_vite_plus_home()?.join("tmp")) } /// Get the node_modules directory path for a package. /// /// npm uses different layouts on Unix vs Windows: /// - Unix: `<prefix>/lib/node_modules/<package>` /// - Windows: `<prefix>/node_modules/<package>` /// /// This function probes both paths and returns the one that exists, /// falling back to the platform default if neither exists. pub fn get_node_modules_dir(prefix: &AbsolutePath, package_name: &str) -> AbsolutePathBuf { // Try Unix layout first (lib/node_modules) let unix_path = prefix.join("lib").join("node_modules").join(package_name); if unix_path.as_path().exists() { return unix_path; } // Try Windows layout (node_modules) let win_path = prefix.join("node_modules").join(package_name); if win_path.as_path().exists() { return win_path; } // Neither exists - return platform default (for pre-creation checks) #[cfg(windows)] { win_path } #[cfg(not(windows))] { unix_path } } /// Get the config file path. pub fn get_config_path() -> Result<AbsolutePathBuf, Error> { Ok(get_vite_plus_home()?.join(CONFIG_FILE)) } /// Load configuration from disk. pub async fn load_config() -> Result<Config, Error> { let config_path = get_config_path()?; if !tokio::fs::try_exists(&config_path).await.unwrap_or(false) { return Ok(Config::default()); } let content = tokio::fs::read_to_string(&config_path).await?; let config: Config = serde_json::from_str(&content)?; Ok(config) } /// Save configuration to disk.
pub async fn save_config(config: &Config) -> Result<(), Error> { let config_path = get_config_path()?; let vite_plus_home = get_vite_plus_home()?; // Ensure directory exists tokio::fs::create_dir_all(&vite_plus_home).await?; let content = serde_json::to_string_pretty(config)?; tokio::fs::write(&config_path, content).await?; Ok(()) } /// Environment variable for per-shell session Node.js version override. /// Set by `vp env use` command. pub const VERSION_ENV_VAR: &str = vite_shared::env_vars::VITE_PLUS_NODE_VERSION; /// Session version file name, written by `vp env use` so shims work without the shell eval wrapper. pub const SESSION_VERSION_FILE: &str = ".session-node-version"; /// Get the path to the session version file (~/.vite-plus/.session-node-version). pub fn get_session_version_path() -> Result<AbsolutePathBuf, Error> { Ok(get_vite_plus_home()?.join(SESSION_VERSION_FILE)) } /// Read the session version file. Returns `None` if the file is missing or empty. pub async fn read_session_version() -> Option<String> { let path = get_session_version_path().ok()?; let content = tokio::fs::read_to_string(&path).await.ok()?; let trimmed = content.trim().to_string(); if trimmed.is_empty() { None } else { Some(trimmed) } } /// Read the session version file synchronously. Returns `None` if the file is missing or empty. pub fn read_session_version_sync() -> Option<String> { let path = get_session_version_path().ok()?; let content = std::fs::read_to_string(path.as_path()).ok()?; let trimmed = content.trim().to_string(); if trimmed.is_empty() { None } else { Some(trimmed) } } /// Write the resolved version to the session version file. pub async fn write_session_version(version: &str) -> Result<(), Error> { let path = get_session_version_path()?; // Ensure parent directory exists if let Some(parent) = path.parent() { tokio::fs::create_dir_all(parent).await?; } tokio::fs::write(&path, version).await?; Ok(()) } /// Delete the session version file. Ignores "not found" errors.
pub async fn delete_session_version() -> Result<(), Error> { let path = get_session_version_path()?; match tokio::fs::remove_file(&path).await { Ok(()) => Ok(()), Err(e) if e.kind() == std::io::ErrorKind::NotFound => Ok(()), Err(e) => Err(e.into()), } } /// Resolve Node.js version for a directory. /// /// Resolution order: /// 0. `VITE_PLUS_NODE_VERSION` env var (session override from `vp env use`) /// 1. `.session-node-version` file (session override written by `vp env use` for shell-wrapper-less environments) /// 2. `.node-version` file in current or parent directories /// 3. `package.json#engines.node` in current or parent directories /// 4. `package.json#devEngines.runtime` in current or parent directories /// 5. User default from config.json /// 6. Latest LTS version pub async fn resolve_version(cwd: &AbsolutePath) -> Result<VersionResolution, Error> { // Session override via environment variable (set by `vp env use`) if let Some(env_version) = vite_shared::EnvConfig::get().node_version { let env_version = env_version.trim(); if !env_version.is_empty() { return Ok(VersionResolution { version: env_version.to_string(), source: VERSION_ENV_VAR.into(), source_path: None, project_root: None, is_range: false, }); } } // Session override via file (written by `vp env use` for shell-wrapper-less environments) if let Some(session_version) = read_session_version().await { return Ok(VersionResolution { version: session_version, source: SESSION_VERSION_FILE.into(), source_path: get_session_version_path().ok(), project_root: None, is_range: false, }); } resolve_version_from_files(cwd).await } /// Resolve Node.js version from project files only (skipping session overrides). /// /// This is used by `vp env use` without arguments to revert to file-based resolution.
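/// /// # Example /// /// A sketch of the result shape (doc example marked `ignore`; the /// `.node-version` content is an assumption): /// /// ```ignore /// // With a `.node-version` file containing "20" somewhere above `cwd`: /// let res = resolve_version_from_files(&cwd).await?; /// assert!(res.is_range); // "20" is a partial version, not an exact one /// ```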
pub async fn resolve_version_from_files(cwd: &AbsolutePath) -> Result<VersionResolution, Error> { let provider = NodeProvider::new(); // Use shared version resolution with directory walking let resolution = resolve_node_version(cwd, true) .await .map_err(|e| Error::ConfigError(e.to_string().into()))?; if let Some(resolution) = resolution { // Validate version before attempting resolution // If invalid, warning is printed by normalize_version and we fall through to defaults if let Some(validated) = normalize_version(&resolution.version.clone().into(), &resolution.source.to_string()) { // Detect if the original version spec was a range (not exact) // This includes partial versions (20, 20.18), semver ranges (^20.0.0), LTS aliases, and "latest" let is_range = NodeProvider::is_version_alias(&validated) || !NodeProvider::is_exact_version(&validated); let resolved = resolve_version_string(&validated, &provider).await?; return Ok(VersionResolution { version: resolved, source: resolution.source.to_string(), source_path: resolution.source_path, project_root: resolution.project_root, is_range, }); } // Invalid version from a project source - try lower-priority sources in the same directory. // This mirrors the fallback logic in download_runtime_for_project().
// - NodeVersionFile: try engines.node, then devEngines.runtime // - EnginesNode: try devEngines.runtime if matches!(resolution.source, VersionSource::NodeVersionFile | VersionSource::EnginesNode) { if let Some(project_root) = &resolution.project_root { let package_json_path = project_root.join("package.json"); if let Ok(Some(pkg)) = read_package_json(&package_json_path).await { // Try engines.node (only when falling back from .node-version) if matches!(resolution.source, VersionSource::NodeVersionFile) { if let Some(engines_node) = pkg .engines .as_ref() .and_then(|e| e.node.clone()) .and_then(|v| normalize_version(&v, "engines.node")) { let resolved = resolve_version_string(&engines_node, &provider).await?; let is_range = NodeProvider::is_lts_alias(&engines_node) || !NodeProvider::is_exact_version(&engines_node); return Ok(VersionResolution { version: resolved, source: "engines.node".into(), source_path: Some(package_json_path), project_root: Some(project_root.clone()), is_range, }); } } // Try devEngines.runtime if let Some(dev_engines) = pkg .dev_engines .as_ref() .and_then(|de| de.runtime.as_ref()) .and_then(|rt| rt.find_by_name("node")) .map(|r| r.version.clone()) .filter(|v| !v.is_empty()) .and_then(|v| normalize_version(&v, "devEngines.runtime")) { let resolved = resolve_version_string(&dev_engines, &provider).await?; let is_range = NodeProvider::is_lts_alias(&dev_engines) || !NodeProvider::is_exact_version(&dev_engines); return Ok(VersionResolution { version: resolved, source: "devEngines.runtime".into(), source_path: Some(package_json_path), project_root: Some(project_root.clone()), is_range, }); } } } } // Invalid version and no valid package.json sources - fall through to user default or LTS } // CLI-specific: Check user default from config let config = load_config().await?; if let Some(default_version) = config.default_node_version { let resolved = resolve_version_alias(&default_version, &provider).await?; // Check if default is an alias or range let 
is_alias = matches!(default_version.to_lowercase().as_str(), "lts" | "latest"); let is_range = is_alias || NodeProvider::is_lts_alias(&default_version) || !NodeProvider::is_exact_version(&default_version); return Ok(VersionResolution { version: resolved, source: "default".into(), // Don't set source_path for aliases (lts, latest) so cache can refresh source_path: if is_alias { None } else { Some(get_config_path()?) }, project_root: None, is_range, }); } // CLI-specific: Fall back to latest LTS let version = provider.resolve_latest_version().await?; Ok(VersionResolution { version: version.to_string(), source: "lts".into(), source_path: None, project_root: None, is_range: true, // LTS fallback is always a range (re-resolve periodically) }) } /// Resolve a version string to an exact version. async fn resolve_version_string(version: &str, provider: &NodeProvider) -> Result<String, Error> { // Check for LTS alias first (lts/*, lts/iron, lts/-1) if NodeProvider::is_lts_alias(version) { let resolved = provider.resolve_lts_alias(version).await?; return Ok(resolved.to_string()); } // Check for "latest" alias - resolves to absolute latest version (including non-LTS) if NodeProvider::is_latest_alias(version) { let resolved = provider.resolve_version("*").await?; return Ok(resolved.to_string()); } // If it's already an exact version, use it directly if NodeProvider::is_exact_version(version) { // Strip v prefix if present (e.g., "v20.18.0" -> "20.18.0") let normalized = version.strip_prefix('v').unwrap_or(version); return Ok(normalized.to_string()); } // Resolve from network (semver ranges) let resolved = provider.resolve_version(version).await?; Ok(resolved.to_string()) } /// Resolve version alias (lts, latest) to an exact version. /// /// Wraps resolution errors with a user-friendly message showing valid examples.
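/// /// # Example /// /// Illustrative only (doc example marked `ignore`; resolving an alias /// requires network access): /// /// ```ignore /// let provider = NodeProvider::new(); /// let exact = resolve_version_alias("lts", &provider).await?; /// assert!(NodeProvider::is_exact_version(&exact)); /// ```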
pub async fn resolve_version_alias(
    version: &str,
    provider: &NodeProvider,
) -> Result<String, Error> {
    let result = match version.to_lowercase().as_str() {
        "lts" => {
            let resolved = provider.resolve_latest_version().await?;
            Ok(resolved.to_string())
        }
        "latest" => {
            // Resolve * to get the absolute latest version.
            let resolved = provider.resolve_version("*").await?;
            Ok(resolved.to_string())
        }
        _ => resolve_version_string(version, provider).await,
    };
    result.map_err(|e| match e {
        Error::RuntimeDownload(
            vite_js_runtime::Error::SemverRange(_)
            | vite_js_runtime::Error::NoMatchingVersion { .. },
        ) => Error::Other(
            format!(
                "Invalid Node.js version: \"{version}\"\n\n\
                 Valid examples:\n  \
                 vp env use 20       # Latest Node.js 20.x\n  \
                 vp env use 20.18.0  # Exact version\n  \
                 vp env use lts      # Latest LTS version\n  \
                 vp env use latest   # Latest version"
            )
            .into(),
        ),
        other => other,
    })
}

#[cfg(test)]
mod tests {
    use tempfile::TempDir;
    use vite_js_runtime::VersionSource;
    use vite_path::AbsolutePathBuf;

    use super::*;

    #[test]
    fn test_get_node_modules_dir_probes_unix_layout() {
        let temp_dir = TempDir::new().unwrap();
        let prefix = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create the Unix layout.
        let unix_path = temp_dir.path().join("lib").join("node_modules").join("test-pkg");
        std::fs::create_dir_all(&unix_path).unwrap();

        let result = get_node_modules_dir(&prefix, "test-pkg");
        assert!(
            result.as_path().ends_with("lib/node_modules/test-pkg"),
            "Should find Unix layout: {}",
            result.as_path().display()
        );
    }

    #[test]
    fn test_get_node_modules_dir_probes_windows_layout() {
        let temp_dir = TempDir::new().unwrap();
        let prefix = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create the Windows layout (no lib/).
        let win_path = temp_dir.path().join("node_modules").join("test-pkg");
        std::fs::create_dir_all(&win_path).unwrap();

        let result = get_node_modules_dir(&prefix, "test-pkg");
        assert!(
            result.as_path().ends_with("node_modules/test-pkg")
                && !result.as_path().to_string_lossy().contains("lib/node_modules"),
            "Should find Windows layout: {}",
            result.as_path().display()
        );
    }

    #[test]
    fn test_get_node_modules_dir_prefers_unix_layout_when_both_exist() {
        let temp_dir = TempDir::new().unwrap();
        let prefix = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create both layouts.
        let unix_path = temp_dir.path().join("lib").join("node_modules").join("test-pkg");
        let win_path = temp_dir.path().join("node_modules").join("test-pkg");
        std::fs::create_dir_all(&unix_path).unwrap();
        std::fs::create_dir_all(&win_path).unwrap();

        let result = get_node_modules_dir(&prefix, "test-pkg");
        // The Unix layout is checked first.
        assert!(
            result.as_path().ends_with("lib/node_modules/test-pkg"),
            "Should prefer Unix layout when both exist: {}",
            result.as_path().display()
        );
    }

    #[test]
    fn test_get_node_modules_dir_returns_platform_default_when_neither_exists() {
        let temp_dir = TempDir::new().unwrap();
        let prefix = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Don't create any directories.
        let result = get_node_modules_dir(&prefix, "test-pkg");

        #[cfg(windows)]
        assert!(
            result.as_path().ends_with("node_modules/test-pkg")
                && !result.as_path().to_string_lossy().contains("lib/node_modules"),
            "Should return Windows default: {}",
            result.as_path().display()
        );

        #[cfg(not(windows))]
        assert!(
            result.as_path().ends_with("lib/node_modules/test-pkg"),
            "Should return Unix default: {}",
            result.as_path().display()
        );
    }

    #[test]
    fn test_get_node_modules_dir_handles_scoped_packages() {
        let temp_dir = TempDir::new().unwrap();
        let prefix = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create the Unix layout for a scoped package.
        let unix_path =
            temp_dir.path().join("lib").join("node_modules").join("@scope").join("pkg");
        std::fs::create_dir_all(&unix_path).unwrap();

        let result = get_node_modules_dir(&prefix, "@scope/pkg");
        assert!(
            result.as_path().ends_with("lib/node_modules/@scope/pkg"),
            "Should find scoped package: {}",
            result.as_path().display()
        );
    }

    #[tokio::test]
    async fn test_resolve_version_from_node_version_file() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig::for_test());

        // Create a .node-version file.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();
        assert_eq!(resolution.version, "20.18.0");
        assert_eq!(resolution.source, ".node-version");
        assert!(resolution.source_path.is_some());
    }

    #[tokio::test]
    async fn test_resolve_version_walks_up_directory() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig::for_test());

        // Create .node-version in the parent.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        // Create a subdirectory.
        let subdir = temp_path.join("subdir");
        tokio::fs::create_dir(&subdir).await.unwrap();

        let resolution = resolve_version(&subdir).await.unwrap();
        assert_eq!(resolution.version, "20.18.0");
        assert_eq!(resolution.source, ".node-version");
    }

    #[tokio::test]
    async fn test_resolve_version_from_engines_node() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create package.json with engines.node.
        let package_json = r#"{"engines":{"node":"20.18.0"}}"#;
        tokio::fs::write(temp_path.join("package.json"), package_json).await.unwrap();

        // Use resolve_node_version directly with walk_up=false so a parent project's
        // .node-version cannot shadow engines.node.
        let resolution = resolve_node_version(&temp_path, false)
            .await
            .map_err(|e| Error::ConfigError(e.to_string().into()))
            .unwrap()
            .unwrap();
        assert_eq!(&*resolution.version, "20.18.0");
        assert_eq!(resolution.source, VersionSource::EnginesNode);
    }

    #[tokio::test]
    async fn test_resolve_version_from_dev_engines() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create package.json with devEngines.runtime.
        let package_json = r#"{"devEngines":{"runtime":{"name":"node","version":"20.18.0"}}}"#;
        tokio::fs::write(temp_path.join("package.json"), package_json).await.unwrap();

        // Use resolve_node_version directly with walk_up=false to test devEngines specifically.
        let resolution = resolve_node_version(&temp_path, false)
            .await
            .map_err(|e| Error::ConfigError(e.to_string().into()))
            .unwrap()
            .unwrap();
        assert_eq!(&*resolution.version, "20.18.0");
        assert_eq!(resolution.source, VersionSource::DevEnginesRuntime);
    }

    #[tokio::test]
    async fn test_resolve_version_node_version_takes_priority() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig::for_test());

        // Create both .node-version and package.json with engines.node.
        tokio::fs::write(temp_path.join(".node-version"), "22.0.0\n").await.unwrap();
        let package_json = r#"{"engines":{"node":"20.18.0"}}"#;
        tokio::fs::write(temp_path.join("package.json"), package_json).await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();
        // .node-version should take priority.
        assert_eq!(resolution.version, "22.0.0");
        assert_eq!(resolution.source, ".node-version");
    }

    #[tokio::test]
    async fn test_resolve_version_string_strips_v_prefix() {
        let provider = NodeProvider::new();
        // Test that v-prefixed exact versions are normalized.
        let result = resolve_version_string("v20.18.0", &provider).await.unwrap();
        assert_eq!(result, "20.18.0", "v prefix should be stripped from exact versions");
    }

    #[tokio::test]
    #[ignore] // Requires running outside of any Node.js project (walk-up finds .node-version)
    async fn test_resolve_version_alias_default_no_source_path() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        let config = Config { default_node_version: Some("lts".to_string()), ..Default::default() };
        save_config(&config).await.unwrap();

        // Create an empty dir to resolve the version in (no .node-version).
        let test_dir = temp_path.join("test-project");
        tokio::fs::create_dir_all(&test_dir).await.unwrap();

        let resolution = resolve_version(&test_dir).await.unwrap();
        assert_eq!(resolution.source, "default");
        assert!(resolution.source_path.is_none(), "Alias defaults should not have source_path");
    }

    #[tokio::test]
    #[ignore] // Requires running outside of any Node.js project (walk-up finds .node-version)
    async fn test_resolve_version_exact_default_has_source_path() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        let config =
            Config { default_node_version: Some("20.18.0".to_string()), ..Default::default() };
        save_config(&config).await.unwrap();

        // Create an empty dir to resolve the version in (no .node-version).
        let test_dir = temp_path.join("test-project");
        tokio::fs::create_dir_all(&test_dir).await.unwrap();

        let resolution = resolve_version(&test_dir).await.unwrap();
        assert_eq!(resolution.source, "default");
        assert!(resolution.source_path.is_some(), "Exact version defaults should have source_path");
    }

    #[tokio::test]
    async fn test_resolve_version_invalid_node_version_falls_through_to_lts() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Create a .node-version file with an invalid version.
        tokio::fs::write(temp_path.join(".node-version"), "invalid-version\n").await.unwrap();

        // resolve_version should NOT fail - it should fall through to LTS.
        let resolution = resolve_version(&temp_path).await.unwrap();

        // Should fall through to LTS since the .node-version is invalid
        // and no user default is configured.
        assert_eq!(resolution.source, "lts");
        assert!(resolution.source_path.is_none());
        assert!(resolution.is_range, "LTS fallback should be marked as range");
    }

    #[tokio::test]
    async fn test_resolve_version_invalid_node_version_falls_through_to_default() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Create a .node-version file with an invalid version.
        tokio::fs::write(temp_path.join(".node-version"), "not-a-version\n").await.unwrap();

        // Create a config with a default version.
        let config =
            Config { default_node_version: Some("20.18.0".to_string()), ..Default::default() };
        save_config(&config).await.unwrap();

        // resolve_version should NOT fail - it should fall through to the user default.
        let resolution = resolve_version(&temp_path).await.unwrap();

        // Should fall through to the user default since .node-version is invalid.
        assert_eq!(resolution.source, "default");
        assert_eq!(resolution.version, "20.18.0");
    }

    #[tokio::test]
    async fn test_resolve_version_invalid_node_version_falls_through_to_engines_node() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Create a .node-version file with an invalid version (typo or unsupported alias).
        tokio::fs::write(temp_path.join(".node-version"), "laetst\n").await.unwrap();

        // Create package.json with a valid engines.node.
        let package_json = r#"{"engines":{"node":"^20.18.0"}}"#;
        tokio::fs::write(temp_path.join("package.json"), package_json).await.unwrap();

        // resolve_version should NOT fail - it should fall through to engines.node.
        let resolution = resolve_version(&temp_path).await.unwrap();

        // Should fall through to engines.node since .node-version is invalid.
        assert_eq!(resolution.source, "engines.node");
        // Version should be resolved from ^20.18.0 (a 20.x version).
        assert!(
            resolution.version.starts_with("20."),
            "Expected version to start with '20.', got: {}",
            resolution.version
        );
    }

    #[tokio::test]
    async fn test_resolve_version_invalid_node_version_falls_through_to_dev_engines() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Create a .node-version file with an invalid version.
        tokio::fs::write(temp_path.join(".node-version"), "invalid\n").await.unwrap();

        // Create package.json with devEngines.runtime but no engines.node.
        let package_json = r#"{"devEngines":{"runtime":{"name":"node","version":"^20.18.0"}}}"#;
        tokio::fs::write(temp_path.join("package.json"), package_json).await.unwrap();

        // resolve_version should NOT fail - it should fall through to devEngines.runtime.
        let resolution = resolve_version(&temp_path).await.unwrap();

        // Should fall through to devEngines.runtime since .node-version is invalid.
        assert_eq!(resolution.source, "devEngines.runtime");
        // Version should be resolved from ^20.18.0 (a 20.x version).
        assert!(
            resolution.version.starts_with("20."),
            "Expected version to start with '20.', got: {}",
            resolution.version
        );
    }

    #[tokio::test]
    async fn test_resolve_version_invalid_engines_node_falls_through_to_dev_engines() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Create package.json with an invalid engines.node but a valid devEngines.runtime.
        // No .node-version file — resolve_node_version returns the EnginesNode source.
        let package_json = r#"{"engines":{"node":"invalid"},"devEngines":{"runtime":{"name":"node","version":"^20.18.0"}}}"#;
        tokio::fs::write(temp_path.join("package.json"), package_json).await.unwrap();

        // resolve_version should fall through from the invalid engines.node to devEngines.runtime.
        let resolution = resolve_version(&temp_path).await.unwrap();
        assert_eq!(resolution.source, "devEngines.runtime");
        assert!(
            resolution.version.starts_with("20."),
            "Expected version to start with '20.', got: {}",
            resolution.version
        );
    }

    #[tokio::test]
    async fn test_resolve_version_latest_alias_in_node_version() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig::for_test());

        // Create a .node-version file with the "latest" alias.
        tokio::fs::write(temp_path.join(".node-version"), "latest\n").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();

        // Should resolve from .node-version.
        assert_eq!(resolution.source, ".node-version");
        // "latest" is a range (should be re-resolved periodically).
        assert!(resolution.is_range, "'latest' should be marked as a range");
        // Version should be at least v20.x.
        assert!(
            resolution.version.starts_with("2") || resolution.version.starts_with("3"),
            "Expected version to be at least v20.x, got: {}",
            resolution.version
        );
    }

    #[tokio::test]
    async fn test_resolve_version_env_var_takes_priority() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig {
            node_version: Some("22.0.0".into()),
            ..vite_shared::EnvConfig::for_test()
        });

        // Create a .node-version file.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();
        // VITE_PLUS_NODE_VERSION should take priority over .node-version.
        assert_eq!(resolution.version, "22.0.0");
        assert_eq!(resolution.source, VERSION_ENV_VAR);
        assert!(resolution.source_path.is_none());
        assert!(!resolution.is_range);
    }

    /// Verify that the env var source is accepted by `vp env install` (no-arg) source validation.
    /// This is a regression test for a bug where `vp env use 24` followed by `vp env install`
    /// would fail with "No Node.js version found in current project."
    #[tokio::test]
    async fn test_env_var_source_accepted_by_install_validation() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig {
            node_version: Some("22.0.0".into()),
            ..vite_shared::EnvConfig::for_test()
        });

        let resolution = resolve_version(&temp_path).await.unwrap();

        // The install command uses this match to validate sources.
        // VERSION_ENV_VAR must be accepted alongside project-file sources.
        let accepted = matches!(
            resolution.source.as_str(),
            ".node-version" | "engines.node" | "devEngines.runtime" | VERSION_ENV_VAR
        );
        assert!(
            accepted,
            "Install source validation should accept '{}' but it was rejected",
            resolution.source
        );
        assert_eq!(resolution.version, "22.0.0");
    }

    // ── Session version file tests ──

    #[tokio::test]
    async fn test_write_and_read_session_version() {
        let temp_dir = TempDir::new().unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Write a session version.
        write_session_version("22.0.0").await.unwrap();

        // Read it back (async).
        let version = read_session_version().await;
        assert_eq!(version.as_deref(), Some("22.0.0"));

        // Read it back (sync).
        let version_sync = read_session_version_sync();
        assert_eq!(version_sync.as_deref(), Some("22.0.0"));
    }

    #[tokio::test]
    async fn test_read_session_version_returns_none_when_missing() {
        let temp_dir = TempDir::new().unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        assert!(read_session_version().await.is_none());
        assert!(read_session_version_sync().is_none());
    }

    #[tokio::test]
    async fn test_read_session_version_returns_none_for_empty_file() {
        let temp_dir = TempDir::new().unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Write empty content.
        let path = get_session_version_path().unwrap();
        tokio::fs::create_dir_all(path.parent().unwrap()).await.unwrap();
        tokio::fs::write(&path, "").await.unwrap();

        assert!(read_session_version().await.is_none());
        assert!(read_session_version_sync().is_none());

        // Also test whitespace-only content.
        tokio::fs::write(&path, " \n ").await.unwrap();
        assert!(read_session_version().await.is_none());
        assert!(read_session_version_sync().is_none());
    }

    #[tokio::test]
    async fn test_read_session_version_trims_whitespace() {
        let temp_dir = TempDir::new().unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        write_session_version("20.18.0").await.unwrap();

        // Overwrite with whitespace-padded content.
        let path = get_session_version_path().unwrap();
        tokio::fs::write(&path, " 20.18.0 \n").await.unwrap();

        assert_eq!(read_session_version().await.as_deref(), Some("20.18.0"));
        assert_eq!(read_session_version_sync().as_deref(), Some("20.18.0"));
    }

    #[tokio::test]
    async fn test_delete_session_version() {
        let temp_dir = TempDir::new().unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Write then delete.
        write_session_version("22.0.0").await.unwrap();
        assert!(read_session_version().await.is_some());

        delete_session_version().await.unwrap();
        assert!(read_session_version().await.is_none());
    }

    #[tokio::test]
    async fn test_delete_session_version_ignores_missing_file() {
        let temp_dir = TempDir::new().unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Deleting a non-existent file should succeed.
        let result = delete_session_version().await;
        assert!(result.is_ok());
    }

    #[tokio::test]
    async fn test_resolve_version_session_file_takes_priority_over_node_version() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Create a .node-version file.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();
        // Write the session version file.
        write_session_version("22.0.0").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();
        // The session file should take priority over .node-version.
        assert_eq!(resolution.version, "22.0.0");
        assert_eq!(resolution.source, SESSION_VERSION_FILE);
        assert!(resolution.source_path.is_some());
        assert!(!resolution.is_range);

        // Clean up.
        delete_session_version().await.unwrap();
    }

    #[tokio::test]
    async fn test_resolve_version_env_var_takes_priority_over_session_file() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig {
            node_version: Some("24.0.0".into()),
            vite_plus_home: Some(temp_dir.path().into()),
            ..vite_shared::EnvConfig::for_test()
        });

        // Write the session version file with a different version.
        write_session_version("22.0.0").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();
        // The env var should take priority over the session file.
        assert_eq!(resolution.version, "24.0.0");
        assert_eq!(resolution.source, VERSION_ENV_VAR);

        // Clean up.
        delete_session_version().await.unwrap();
    }

    #[tokio::test]
    async fn test_resolve_version_falls_through_when_no_session_file() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Create a .node-version file.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();
        // Should fall through to .node-version since no session file exists.
        assert_eq!(resolution.version, "20.18.0");
        assert_eq!(resolution.source, ".node-version");
    }

    /// Verify that the session file source is accepted by `vp env install` (no-arg) source
    /// validation. This is a regression test ensuring `vp env use 24` followed by
    /// `vp env install` works when the session file is the resolution source.
    #[tokio::test]
    async fn test_session_file_source_accepted_by_install_validation() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Write the session version file.
        write_session_version("22.0.0").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();

        // The install command uses this match to validate sources.
        // SESSION_VERSION_FILE must be accepted alongside project-file sources.
        let accepted = matches!(
            resolution.source.as_str(),
            ".node-version"
                | "engines.node"
                | "devEngines.runtime"
                | VERSION_ENV_VAR
                | SESSION_VERSION_FILE
        );
        assert!(
            accepted,
            "Install source validation should accept '{}' but it was rejected",
            resolution.source
        );
        assert_eq!(resolution.version, "22.0.0");
        assert_eq!(resolution.source, SESSION_VERSION_FILE);

        // Clean up.
        delete_session_version().await.unwrap();
    }

    #[tokio::test]
    async fn test_resolve_version_empty_env_var_is_ignored() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig {
            node_version: Some("".into()),
            ..vite_shared::EnvConfig::for_test()
        });

        // Create a .node-version file.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();
        // An empty env var should be ignored; fall through to .node-version.
        assert_eq!(resolution.version, "20.18.0");
        assert_eq!(resolution.source, ".node-version");
    }

    #[tokio::test]
    async fn test_resolve_version_whitespace_env_var_is_ignored() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig {
            node_version: Some(" ".into()),
            ..vite_shared::EnvConfig::for_test()
        });

        // Create a .node-version file.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        let resolution = resolve_version(&temp_path).await.unwrap();
        // A whitespace-only env var should be ignored; fall through to .node-version.
        assert_eq!(resolution.version, "20.18.0");
        assert_eq!(resolution.source, ".node-version");
    }

    // ── resolve_version_from_files tests ──

    /// Verify that `resolve_version_from_files` ignores the session env var override.
    /// This is the key behavior for `vp env use` without arguments.
    #[tokio::test]
    async fn test_resolve_version_from_files_ignores_env_var() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig {
            node_version: Some("22.0.0".into()),
            ..vite_shared::EnvConfig::for_test()
        });

        // Create a .node-version file with a different version.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        // resolve_version_from_files should skip the env var and use .node-version.
        let resolution = resolve_version_from_files(&temp_path).await.unwrap();
        assert_eq!(resolution.version, "20.18.0");
        assert_eq!(resolution.source, ".node-version");
    }

    /// Verify that `resolve_version_from_files` ignores the session file override.
    #[tokio::test]
    async fn test_resolve_version_from_files_ignores_session_file() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(
            vite_shared::EnvConfig::for_test_with_home(temp_dir.path()),
        );

        // Write the session version file.
        write_session_version("22.0.0").await.unwrap();
        // Create a .node-version file with a different version.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        // resolve_version_from_files should skip the session file and use .node-version.
        let resolution = resolve_version_from_files(&temp_path).await.unwrap();
        assert_eq!(resolution.version, "20.18.0");
        assert_eq!(resolution.source, ".node-version");

        // Clean up.
        delete_session_version().await.unwrap();
    }

    /// Verify that `resolve_version` still respects both the env var and the session file,
    /// while `resolve_version_from_files` skips them.
    #[tokio::test]
    async fn test_resolve_version_still_respects_overrides() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig {
            node_version: Some("22.0.0".into()),
            ..vite_shared::EnvConfig::for_test_with_home(temp_dir.path())
        });

        // Create a .node-version file.
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        // resolve_version should still use the env var (existing behavior).
        let resolution = resolve_version(&temp_path).await.unwrap();
        assert_eq!(resolution.version, "22.0.0");
        assert_eq!(resolution.source, VERSION_ENV_VAR);

        // But resolve_version_from_files should skip it.
        let resolution_from_files = resolve_version_from_files(&temp_path).await.unwrap();
        assert_eq!(resolution_from_files.version, "20.18.0");
        assert_eq!(resolution_from_files.source, ".node-version");
    }
}


================================================
FILE: crates/vite_global_cli/src/commands/env/current.rs
================================================
//! Current environment information command.
//!
//! Shows information about the current Node.js environment.

use std::process::ExitStatus;

use owo_colors::OwoColorize;
use serde::Serialize;
use vite_path::AbsolutePathBuf;

use super::config::resolve_version;
use crate::{error::Error, help};

/// JSON output structure for `vp env current --json`
#[derive(Serialize)]
struct CurrentEnvInfo {
    version: String,
    source: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    project_root: Option<String>,
    node_path: String,
    tool_paths: ToolPaths,
}

#[derive(Serialize)]
struct ToolPaths {
    node: String,
    npm: String,
    npx: String,
}

fn accent(text: &str) -> String {
    if help::should_style_help() { text.bright_blue().to_string() } else { text.to_string() }
}

fn print_rows(title: &str, rows: &[(&str, String)]) {
    println!("{}", help::render_heading(title));
    let label_width = rows.iter().map(|(label, _)| label.chars().count()).max().unwrap_or(0);
    for (label, value) in rows {
        let padding = " ".repeat(label_width.saturating_sub(label.chars().count()));
        println!(" {}{} {value}", accent(label), padding);
    }
}

/// Execute the current command.
pub async fn execute(cwd: AbsolutePathBuf, json: bool) -> Result<ExitStatus, Error> {
    let resolution = resolve_version(&cwd).await?;

    // Get the home directory for this version.
    let home_dir = vite_shared::get_vite_plus_home()?
        .join("js_runtime")
        .join("node")
        .join(&resolution.version);

    #[cfg(windows)]
    let (node_path, npm_path, npx_path) =
        { (home_dir.join("node.exe"), home_dir.join("npm.cmd"), home_dir.join("npx.cmd")) };

    #[cfg(not(windows))]
    let (node_path, npm_path, npx_path) = {
        (
            home_dir.join("bin").join("node"),
            home_dir.join("bin").join("npm"),
            home_dir.join("bin").join("npx"),
        )
    };

    if json {
        let info = CurrentEnvInfo {
            version: resolution.version.clone(),
            source: resolution.source.clone(),
            project_root: resolution
                .project_root
                .as_ref()
                .map(|p| p.as_path().display().to_string()),
            node_path: node_path.as_path().display().to_string(),
            tool_paths: ToolPaths {
                node: node_path.as_path().display().to_string(),
                npm: npm_path.as_path().display().to_string(),
                npx: npx_path.as_path().display().to_string(),
            },
        };
        let json_str = serde_json::to_string_pretty(&info)?;
        println!("{json_str}");
    } else {
        let mut environment_rows =
            vec![("Version", resolution.version.clone()), ("Source", resolution.source.clone())];
        if let Some(path) = &resolution.source_path {
            environment_rows.push(("Source Path", path.as_path().display().to_string()));
        }
        if let Some(root) = &resolution.project_root {
            environment_rows.push(("Project Root", root.as_path().display().to_string()));
        }
        print_rows("Environment", &environment_rows);
        println!();
        print_rows(
            "Tool Paths",
            &[
                ("node", node_path.as_path().display().to_string()),
                ("npm", npm_path.as_path().display().to_string()),
                ("npx", npx_path.as_path().display().to_string()),
            ],
        );
    }

    Ok(ExitStatus::default())
}


================================================
FILE: crates/vite_global_cli/src/commands/env/default.rs
================================================
//! Default version management command.
//!
//! Handles `vp env default [VERSION]` to set or show the global default Node.js version.
use std::process::ExitStatus; use vite_path::AbsolutePathBuf; use super::config::{get_config_path, load_config, save_config}; use crate::error::Error; /// Execute the default command. pub async fn execute(_cwd: AbsolutePathBuf, version: Option) -> Result { match version { Some(v) => set_default(&v).await, None => show_default().await, } } /// Show the current default version. async fn show_default() -> Result { let config = load_config().await?; match config.default_node_version { Some(version) => { println!("Default Node.js version: {version}"); let config_path = get_config_path()?; println!(" Set via: {}", config_path.as_path().display()); // If it's an alias, also show the resolved version if version == "lts" || version == "latest" { let provider = vite_js_runtime::NodeProvider::new(); match resolve_alias(&version, &provider).await { Ok(resolved) => println!(" Currently resolves to: {resolved}"), Err(_) => {} } } } None => { // No default configured - show what would be used let provider = vite_js_runtime::NodeProvider::new(); match provider.resolve_latest_version().await { Ok(lts_version) => { println!("No default version configured. Using latest LTS ({lts_version})."); println!(" Run 'vp env default ' to set a default."); } Err(_) => { println!("No default version configured."); println!(" Run 'vp env default ' to set a default."); } } } } Ok(ExitStatus::default()) } /// Set the default version. 
async fn set_default(version: &str) -> Result<ExitStatus, Error> {
    let provider = vite_js_runtime::NodeProvider::new();

    // Validate the version
    let (display_version, store_version) = match version.to_lowercase().as_str() {
        "lts" => {
            // Resolve to show current value, but store "lts" as alias
            let current_lts = provider.resolve_latest_version().await?;
            (format!("lts (currently {})", current_lts), "lts".to_string())
        }
        "latest" => {
            // Resolve to show current value, but store "latest" as alias
            let current_latest = provider.resolve_version("*").await?;
            (format!("latest (currently {})", current_latest), "latest".to_string())
        }
        _ => {
            // Validate that the version exists
            let resolved = if vite_js_runtime::NodeProvider::is_exact_version(version) {
                version.to_string()
            } else {
                provider.resolve_version(version).await?.to_string()
            };
            (resolved.clone(), resolved)
        }
    };

    // Save to config
    let mut config = load_config().await?;
    config.default_node_version = Some(store_version);
    save_config(&config).await?;

    // Invalidate resolve cache so the new default takes effect immediately
    crate::shim::invalidate_cache();

    println!("\u{2713} Default Node.js version set to {display_version}");
    Ok(ExitStatus::default())
}

/// Resolve a version alias to an actual version.
async fn resolve_alias(
    alias: &str,
    provider: &vite_js_runtime::NodeProvider,
) -> Result<String, Error> {
    match alias {
        "lts" => Ok(provider.resolve_latest_version().await?.to_string()),
        "latest" => Ok(provider.resolve_version("*").await?.to_string()),
        _ => Ok(alias.to_string()),
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/env/doctor.rs
================================================
//! Doctor command implementation for environment diagnostics.
use std::process::ExitStatus; use owo_colors::OwoColorize; use vite_path::{AbsolutePathBuf, current_dir}; use vite_shared::{env_vars, output}; use super::config::{ self, ShimMode, get_bin_dir, get_vite_plus_home, load_config, resolve_version, }; use crate::error::Error; /// IDE-relevant profile files that GUI-launched applications can see. /// GUI apps don't run through an interactive shell, so only login/environment /// files reliably affect them. /// - macOS: `.zshenv` is sourced for all zsh invocations (including IDE env resolution) /// - Linux: `.profile` is sourced by X11 display managers; `.zshenv` covers Wayland + zsh #[cfg(not(windows))] #[cfg(target_os = "macos")] const IDE_PROFILES: &[(&str, bool)] = &[(".zshenv", false), (".profile", false)]; #[cfg(not(windows))] #[cfg(target_os = "linux")] const IDE_PROFILES: &[(&str, bool)] = &[(".profile", false), (".zshenv", false)]; #[cfg(not(windows))] #[cfg(not(any(target_os = "macos", target_os = "linux")))] const IDE_PROFILES: &[(&str, bool)] = &[(".profile", false)]; /// All shell profile files that interactive terminal sessions may source. /// This matches the files that `install.sh` writes to and `vp implode` cleans. /// The bool flag indicates whether the file uses fish-style sourcing (`env.fish` /// instead of `env`). #[cfg(not(windows))] const ALL_SHELL_PROFILES: &[(&str, bool)] = &[ (".zshenv", false), (".zshrc", false), (".bash_profile", false), (".bashrc", false), (".profile", false), (".config/fish/config.fish", true), (".config/fish/conf.d/vite-plus.fish", true), ]; /// Result of checking profile files for env sourcing. #[cfg(not(windows))] enum EnvSourcingStatus { /// Found in an IDE-relevant profile (e.g., .zshenv, .profile). IdeFound, /// Found only in an interactive shell profile (e.g., .bashrc, .zshrc). ShellOnly, /// Not found in any profile. 
    NotFound,
}

/// Known version managers that might conflict.
const KNOWN_VERSION_MANAGERS: &[(&str, &str)] = &[
    ("nvm", "NVM_DIR"),
    ("fnm", "FNM_DIR"),
    ("volta", "VOLTA_HOME"),
    ("asdf", "ASDF_DIR"),
    ("mise", "MISE_DIR"),
    ("n", "N_PREFIX"),
];

/// Tools that should have shims.
const SHIM_TOOLS: &[&str] = &["node", "npm", "npx", "vpx"];

/// Column width for left-side keys in aligned output.
const KEY_WIDTH: usize = 18;

/// Print a section header (bold, with blank line before).
fn print_section(name: &str) {
    println!();
    println!("{}", name.bold());
}

/// Print an aligned key-value line with a status indicator.
///
/// `status` should be a colored string like "✓".green(), "✗".red(), etc.
/// Use `" "` for informational lines with no status.
fn print_check(status: &str, key: &str, value: &str) {
    if status.trim().is_empty() {
        println!("  {key:<KEY_WIDTH$}{value}");
    } else {
        println!("{status} {key:<KEY_WIDTH$}{value}");
    }
}

/// Print an indented hint line below a check.
fn print_hint(hint: &str) {
    println!("    {hint}");
}

/// Abbreviate a leading `$HOME` prefix to `~` for display.
fn abbreviate_home(path: &str) -> String {
    if let Ok(home) = std::env::var("HOME") {
        if let Some(suffix) = path.strip_prefix(&home) {
            return format!("~{suffix}");
        }
    }
    path.to_string()
}

/// Execute the doctor command.
pub async fn execute(cwd: AbsolutePathBuf) -> Result<ExitStatus, Error> {
    let mut has_errors = false;

    // Section: Installation
    println!("{}", "Installation".bold());
    has_errors |= !check_vite_plus_home().await;
    has_errors |= !check_bin_dir().await;

    // Section: Configuration
    print_section("Configuration");
    check_shim_mode().await;
    // Check env sourcing: IDE-relevant profiles first, then all shell profiles
    #[cfg(not(windows))]
    let env_status = check_env_sourcing();
    check_session_override();

    // Section: PATH
    print_section("PATH");
    has_errors |= !check_path().await;

    // Section: Version Resolution
    print_section("Version Resolution");
    check_current_resolution(&cwd).await;

    // Section: Conflicts (conditional)
    check_conflicts();

    // Section: IDE Setup (conditional - when env not found in IDE-relevant profiles)
    #[cfg(not(windows))]
    {
        match &env_status {
            EnvSourcingStatus::IdeFound => {} // All good, no guidance needed
            EnvSourcingStatus::ShellOnly | EnvSourcingStatus::NotFound => {
                // Show IDE setup guidance when env is not in IDE-relevant profiles
                if let Ok(bin_dir) = get_bin_dir() {
                    print_ide_setup_guidance(&bin_dir);
                }
            }
        }
    }

    // Summary
    println!();
    if has_errors {
        println!(
            "{}",
            "\u{2717} Some issues found. Run the suggested commands to fix them.".red().bold()
        );
        Ok(super::exit_status(1))
    } else {
        println!("{}", "\u{2713} All checks passed".green().bold());
        Ok(ExitStatus::default())
    }
}

/// Check VITE_PLUS_HOME directory.
async fn check_vite_plus_home() -> bool { let home = match get_vite_plus_home() { Ok(h) => h, Err(e) => { print_check( &output::CROSS.red().to_string(), env_vars::VITE_PLUS_HOME, &format!("{e}").red().to_string(), ); return false; } }; let display = abbreviate_home(&home.as_path().display().to_string()); if tokio::fs::try_exists(&home).await.unwrap_or(false) { print_check(&output::CHECK.green().to_string(), env_vars::VITE_PLUS_HOME, &display); true } else { print_check( &output::CROSS.red().to_string(), env_vars::VITE_PLUS_HOME, &"does not exist".red().to_string(), ); print_hint("Run 'vp env setup' to create it."); false } } /// Check bin directory and shim files. async fn check_bin_dir() -> bool { let bin_dir = match get_bin_dir() { Ok(d) => d, Err(_) => return false, }; if !tokio::fs::try_exists(&bin_dir).await.unwrap_or(false) { print_check( &output::CROSS.red().to_string(), "Bin directory", &"does not exist".red().to_string(), ); print_hint("Run 'vp env setup' to create bin directory and shims."); return false; } print_check(&output::CHECK.green().to_string(), "Bin directory", "exists"); let mut missing = Vec::new(); for tool in SHIM_TOOLS { let shim_path = bin_dir.join(shim_filename(tool)); if !tokio::fs::try_exists(&shim_path).await.unwrap_or(false) { missing.push(*tool); } } if missing.is_empty() { print_check(&output::CHECK.green().to_string(), "Shims", &SHIM_TOOLS.join(", ")); true } else { print_check( &output::CROSS.red().to_string(), "Missing shims", &missing.join(", ").red().to_string(), ); print_hint("Run 'vp env setup' to create missing shims."); false } } /// Get the filename for a shim (platform-specific). fn shim_filename(tool: &str) -> String { #[cfg(windows)] { // All tools use trampoline .exe files on Windows format!("{tool}.exe") } #[cfg(not(windows))] { tool.to_string() } } /// Check and display shim mode. 
async fn check_shim_mode() { let config = match load_config().await { Ok(c) => c, Err(e) => { print_check( &output::WARN_SIGN.yellow().to_string(), "Shim mode", &format!("config error: {e}").yellow().to_string(), ); return; } }; match config.shim_mode { ShimMode::Managed => { print_check(&output::CHECK.green().to_string(), "Shim mode", "managed"); } ShimMode::SystemFirst => { print_check( &output::CHECK.green().to_string(), "Shim mode", &"system-first".bright_blue().to_string(), ); // Check if system Node.js is available if let Some(system_node) = find_system_node() { print_check(" ", "System Node.js", &system_node.display().to_string()); } else { print_check( &output::WARN_SIGN.yellow().to_string(), "System Node.js", &"not found (will use managed)".yellow().to_string(), ); } } } } /// Check profile files for env sourcing and classify where it was found. /// /// Tries IDE-relevant profiles first, then falls back to all shell profiles. /// Returns `EnvSourcingStatus` indicating where (if anywhere) the sourcing was found. 
#[cfg(not(windows))] fn check_env_sourcing() -> EnvSourcingStatus { let bin_dir = match get_bin_dir() { Ok(d) => d, Err(_) => return EnvSourcingStatus::NotFound, }; let home_path = bin_dir .parent() .map(|p| p.as_path().display().to_string()) .unwrap_or_else(|| bin_dir.as_path().display().to_string()); let home_path = if let Ok(home_dir) = std::env::var("HOME") { if let Some(suffix) = home_path.strip_prefix(&home_dir) { format!("$HOME{suffix}") } else { home_path } } else { home_path }; // First: check IDE-relevant profiles (login/environment files visible to GUI apps) if let Some(file) = check_profile_files(&home_path, IDE_PROFILES) { print_check( &output::CHECK.green().to_string(), "IDE integration", &format!("env sourced in {file}"), ); return EnvSourcingStatus::IdeFound; } // Second: check all shell profiles (interactive terminal sessions) if let Some(file) = check_profile_files(&home_path, ALL_SHELL_PROFILES) { print_check( &output::WARN_SIGN.yellow().to_string(), "IDE integration", &format!( "{} {}", format!("env sourced in {file}").yellow(), "(may not be visible to GUI apps)".dimmed(), ), ); return EnvSourcingStatus::ShellOnly; } EnvSourcingStatus::NotFound } /// Find system Node.js, skipping vite-plus bin directory and any /// directories listed in `VITE_PLUS_BYPASS`. 
fn find_system_node() -> Option<std::path::PathBuf> {
    let bin_dir = get_bin_dir().ok();
    let path_var = std::env::var_os("PATH")?;

    // Parse VITE_PLUS_BYPASS as a PATH-style list of additional directories to skip
    let bypass_paths: Vec<std::path::PathBuf> = std::env::var_os(env_vars::VITE_PLUS_BYPASS)
        .map(|v| std::env::split_paths(&v).collect())
        .unwrap_or_default();

    // Filter PATH to exclude our bin directory and any bypass directories
    let filtered_paths: Vec<_> = std::env::split_paths(&path_var)
        .filter(|p| {
            if let Some(ref bin) = bin_dir {
                if p == bin.as_path() {
                    return false;
                }
            }
            !bypass_paths.iter().any(|bp| p == bp)
        })
        .collect();
    let filtered_path = std::env::join_paths(filtered_paths).ok()?;

    // Use vite_command::resolve_bin with the filtered PATH - stops at the first match
    let cwd = current_dir().ok()?;
    vite_command::resolve_bin("node", Some(&filtered_path), &cwd).ok().map(|p| p.into_path_buf())
}

/// Check for an active session override via VITE_PLUS_NODE_VERSION or the session file.
fn check_session_override() {
    if let Ok(version) = std::env::var(config::VERSION_ENV_VAR) {
        let version = version.trim();
        if !version.is_empty() {
            print_check(
                &output::WARN_SIGN.yellow().to_string(),
                "Session override",
                &format!("{}={version}", env_vars::VITE_PLUS_NODE_VERSION).yellow().to_string(),
            );
            print_hint("Overrides all file-based resolution.");
            print_hint("Run 'vp env use --unset' to remove.");
        }
    }
    // Also check the session version file
    if let Some(version) = config::read_session_version_sync() {
        print_check(
            &output::WARN_SIGN.yellow().to_string(),
            "Session override (file)",
            &format!("{}={version}", config::SESSION_VERSION_FILE).yellow().to_string(),
        );
        print_hint("Written by 'vp env use'. Run 'vp env use --unset' to remove.");
    }
}

/// Check PATH configuration.
async fn check_path() -> bool {
    let bin_dir = match get_bin_dir() {
        Ok(d) => d,
        Err(_) => return false,
    };
    let path_var = std::env::var_os("PATH").unwrap_or_default();
    let paths: Vec<_> = std::env::split_paths(&path_var).collect();

    // Check if the bin directory is in PATH
    let bin_path = bin_dir.as_path();
    let bin_position = paths.iter().position(|p| p == bin_path);
    let bin_display = abbreviate_home(&bin_dir.as_path().display().to_string());

    match bin_position {
        Some(0) => {
            print_check(&output::CHECK.green().to_string(), "vp", "first in PATH");
        }
        Some(pos) => {
            print_check(
                &output::WARN_SIGN.yellow().to_string(),
                "vp",
                &format!("in PATH at position {pos}").yellow().to_string(),
            );
            print_hint("For best results, bin should be first in PATH.");
        }
        None => {
            print_check(&output::CROSS.red().to_string(), "vp", &"not in PATH".red().to_string());
            print_hint(&format!("Expected: {bin_display}"));
            println!();
            print_path_fix(&bin_dir);
            return false;
        }
    }

    // Show which tool would be executed for each shim
    for tool in SHIM_TOOLS {
        if let Some(tool_path) = find_in_path(tool) {
            let expected = bin_dir.join(shim_filename(tool));
            let display = abbreviate_home(&tool_path.display().to_string());
            if tool_path == expected.as_path() {
                print_check(
                    &output::CHECK.green().to_string(),
                    tool,
                    &format!("{display} {}", "(vp shim)".dimmed()),
                );
            } else {
                print_check(
                    &output::WARN_SIGN.yellow().to_string(),
                    tool,
                    &format!("{} {}", display.yellow(), "(not vp shim)".dimmed()),
                );
            }
        } else {
            print_check(" ", tool, "not found");
        }
    }

    true
}

/// Find an executable in PATH.
fn find_in_path(name: &str) -> Option<std::path::PathBuf> {
    let cwd = current_dir().ok()?;
    vite_command::resolve_bin(name, None, &cwd).ok().map(|p| p.into_path_buf())
}

/// Print PATH fix instructions for shell setup.
fn print_path_fix(bin_dir: &vite_path::AbsolutePath) {
    #[cfg(not(windows))]
    {
        // Derive vite_plus_home from bin_dir (parent), using $HOME prefix for readability
        let home_path = bin_dir
            .parent()
            .map(|p| p.as_path().display().to_string())
            .unwrap_or_else(|| bin_dir.as_path().display().to_string());
        let home_path = if let Ok(home_dir) = std::env::var("HOME") {
            if let Some(suffix) = home_path.strip_prefix(&home_dir) {
                format!("$HOME{suffix}")
            } else {
                home_path
            }
        } else {
            home_path
        };
        println!("  {}", "Add to your shell profile (~/.zshrc, ~/.bashrc, etc.):".dimmed());
        println!();
        println!("    . \"{home_path}/env\"");
        println!();
        println!("  {}", "For fish shell, add to ~/.config/fish/config.fish:".dimmed());
        println!();
        println!("    source \"{home_path}/env.fish\"");
        println!();
        println!("  {}", "Then restart your terminal.".dimmed());
    }
    #[cfg(windows)]
    {
        let _ = bin_dir;
        println!("  {}", "Add the bin directory to your PATH via:".dimmed());
        println!("    System Properties -> Environment Variables -> Path");
        println!();
        println!("  {}", "Then restart your terminal.".dimmed());
    }
}

/// Search for the vite-plus env sourcing line in the given profile files.
///
/// Each entry in `profile_files` is `(filename, is_fish)`. When `is_fish` is true,
/// searches for the `env.fish` pattern instead of `env`.
///
/// Returns `Some(display_path)` if any profile file contains a reference
/// to the vite-plus env file, `None` otherwise.
#[cfg(not(windows))]
fn check_profile_files(vite_plus_home: &str, profile_files: &[(&str, bool)]) -> Option<String> {
    let home_dir = std::env::var("HOME").ok()?;
    for &(file, is_fish) in profile_files {
        let full_path = format!("{home_dir}/{file}");
        if let Ok(content) = std::fs::read_to_string(&full_path) {
            // Build candidate strings: both $HOME/... and /absolute/...
let env_suffix = if is_fish { "/env.fish" } else { "/env" }; let mut search_strings = vec![format!("{vite_plus_home}{env_suffix}")]; if let Some(suffix) = vite_plus_home.strip_prefix("$HOME") { search_strings.push(format!("{home_dir}{suffix}{env_suffix}")); } if search_strings.iter().any(|s| content.contains(s)) { return Some(format!("~/{file}")); } } } // If ZDOTDIR is set and differs from $HOME, also check $ZDOTDIR/.zshenv and .zshrc if let Ok(zdotdir) = std::env::var("ZDOTDIR") { if !zdotdir.is_empty() && zdotdir != home_dir { let env_suffix = "/env"; let mut search_strings = vec![format!("{vite_plus_home}{env_suffix}")]; if let Some(suffix) = vite_plus_home.strip_prefix("$HOME") { search_strings.push(format!("{home_dir}{suffix}{env_suffix}")); } for file in [".zshenv", ".zshrc"] { let path = format!("{zdotdir}/{file}"); if let Ok(content) = std::fs::read_to_string(&path) { if search_strings.iter().any(|s| content.contains(s)) { return Some(abbreviate_home(&path)); } } } } } // If XDG_CONFIG_HOME is set and differs from default, also check fish conf.d if let Ok(xdg_config) = std::env::var("XDG_CONFIG_HOME") { let default_config = format!("{home_dir}/.config"); if !xdg_config.is_empty() && xdg_config != default_config { let fish_suffix = "/env.fish"; let mut search_strings = vec![format!("{vite_plus_home}{fish_suffix}")]; if let Some(suffix) = vite_plus_home.strip_prefix("$HOME") { search_strings.push(format!("{home_dir}{suffix}{fish_suffix}")); } let path = format!("{xdg_config}/fish/conf.d/vite-plus.fish"); if let Ok(content) = std::fs::read_to_string(&path) { if search_strings.iter().any(|s| content.contains(s)) { return Some(abbreviate_home(&path)); } } } } None } /// Print IDE setup guidance for GUI applications. 
#[cfg(not(windows))] fn print_ide_setup_guidance(bin_dir: &vite_path::AbsolutePath) { // Derive vite_plus_home display path from bin_dir.parent(), using $HOME prefix let home_path = bin_dir .parent() .map(|p| p.as_path().display().to_string()) .unwrap_or_else(|| bin_dir.as_path().display().to_string()); let home_path = if let Ok(home_dir) = std::env::var("HOME") { if let Some(suffix) = home_path.strip_prefix(&home_dir) { format!("$HOME{suffix}") } else { home_path } } else { home_path }; print_section("IDE Setup"); print_check( &output::WARN_SIGN.yellow().to_string(), "", &"GUI applications may not see shell PATH changes.".yellow().to_string(), ); println!(); #[cfg(target_os = "macos")] { println!(" {}", "macOS:".dimmed()); println!(" {}", "Add to ~/.zshenv or ~/.profile:".dimmed()); println!(" . \"{home_path}/env\""); println!(" {}", "Then restart your IDE to apply changes.".dimmed()); } #[cfg(target_os = "linux")] { println!(" {}", "Linux:".dimmed()); println!(" {}", "Add to ~/.profile:".dimmed()); println!(" . \"{home_path}/env\""); println!(" {}", "Then log out and log back in for changes to take effect.".dimmed()); } // Fallback for other Unix platforms #[cfg(not(any(target_os = "macos", target_os = "linux")))] { println!(" {}", "Add to your shell profile:".dimmed()); println!(" . \"{home_path}/env\""); println!(" {}", "Then restart your IDE to apply changes.".dimmed()); } } /// Check current directory version resolution. 
async fn check_current_resolution(cwd: &AbsolutePathBuf) { print_check(" ", "Directory", &cwd.as_path().display().to_string()); match resolve_version(cwd).await { Ok(resolution) => { let source_display = resolution .source_path .as_ref() .map(|p| p.as_path().display().to_string()) .unwrap_or(resolution.source); print_check(" ", "Source", &source_display); print_check(" ", "Version", &resolution.version.bright_green().to_string()); // Check if Node.js is installed let home_dir = match vite_shared::get_vite_plus_home() { Ok(d) => d.join("js_runtime").join("node").join(&resolution.version), Err(_) => return, }; #[cfg(windows)] let binary_path = home_dir.join("node.exe"); #[cfg(not(windows))] let binary_path = home_dir.join("bin").join("node"); if tokio::fs::try_exists(&binary_path).await.unwrap_or(false) { print_check(&output::CHECK.green().to_string(), "Node binary", "installed"); } else { print_check( &output::WARN_SIGN.yellow().to_string(), "Node binary", &"not installed".yellow().to_string(), ); print_hint("Version will be downloaded on first use."); } } Err(e) => { print_check( &output::CROSS.red().to_string(), "Resolution", &format!("failed: {e}").red().to_string(), ); } } } /// Check for conflicts with other version managers. 
fn check_conflicts() {
    let mut conflicts = Vec::new();
    for (name, env_var) in KNOWN_VERSION_MANAGERS {
        if std::env::var(env_var).is_ok() {
            conflicts.push(*name);
        }
    }

    // Also check for common shims in PATH
    if let Some(node_path) = find_in_path("node") {
        let path_str = node_path.to_string_lossy();
        if path_str.contains(".nvm") {
            if !conflicts.contains(&"nvm") {
                conflicts.push("nvm");
            }
        } else if path_str.contains(".fnm") {
            if !conflicts.contains(&"fnm") {
                conflicts.push("fnm");
            }
        } else if path_str.contains(".volta") {
            if !conflicts.contains(&"volta") {
                conflicts.push("volta");
            }
        }
    }

    if !conflicts.is_empty() {
        print_section("Conflicts");
        for manager in &conflicts {
            print_check(
                &output::WARN_SIGN.yellow().to_string(),
                manager,
                &format!(
                    "detected ({} is set)",
                    KNOWN_VERSION_MANAGERS
                        .iter()
                        .find(|(n, _)| n == manager)
                        .map(|(_, e)| *e)
                        .unwrap_or("in PATH")
                )
                .yellow()
                .to_string(),
            );
        }
        print_hint("Consider removing other version managers from your PATH");
        print_hint("to avoid version conflicts.");
    }
}

#[cfg(test)]
mod tests {
    use serial_test::serial;
    use tempfile::TempDir;

    use super::*;

    #[test]
    fn test_shim_filename_consistency() {
        // All tools should use the same extension pattern.
        // On Windows: all .exe, on Unix: all without extension.
        let node = shim_filename("node");
        let npm = shim_filename("npm");
        let npx = shim_filename("npx");
        #[cfg(windows)]
        {
            // All shims should use .exe on Windows (trampoline executables)
            assert_eq!(node, "node.exe");
            assert_eq!(npm, "npm.exe");
            assert_eq!(npx, "npx.exe");
        }
        #[cfg(not(windows))]
        {
            assert_eq!(node, "node");
            assert_eq!(npm, "npm");
            assert_eq!(npx, "npx");
        }
    }

    /// Create a fake executable file in the given directory.
    #[cfg(unix)]
    fn create_fake_executable(dir: &std::path::Path, name: &str) -> std::path::PathBuf {
        use std::os::unix::fs::PermissionsExt;
        let path = dir.join(name);
        std::fs::write(&path, "#!/bin/sh\n").unwrap();
        std::fs::set_permissions(&path, std::fs::Permissions::from_mode(0o755)).unwrap();
        path
    }

    #[cfg(windows)]
    fn create_fake_executable(dir: &std::path::Path, name: &str) -> std::path::PathBuf {
        let path = dir.join(format!("{name}.exe"));
        std::fs::write(&path, "fake").unwrap();
        path
    }

    /// Helper to save and restore PATH and VITE_PLUS_BYPASS around a test.
    struct EnvGuard {
        original_path: Option<std::ffi::OsString>,
        original_bypass: Option<std::ffi::OsString>,
    }

    impl EnvGuard {
        fn new() -> Self {
            Self {
                original_path: std::env::var_os("PATH"),
                original_bypass: std::env::var_os(env_vars::VITE_PLUS_BYPASS),
            }
        }
    }

    impl Drop for EnvGuard {
        fn drop(&mut self) {
            unsafe {
                match &self.original_path {
                    Some(v) => std::env::set_var("PATH", v),
                    None => std::env::remove_var("PATH"),
                }
                match &self.original_bypass {
                    Some(v) => std::env::set_var(env_vars::VITE_PLUS_BYPASS, v),
                    None => std::env::remove_var(env_vars::VITE_PLUS_BYPASS),
                }
            }
        }
    }

    #[test]
    #[serial]
    fn test_find_system_node_skips_bypass_paths() {
        let _guard = EnvGuard::new();
        let temp = TempDir::new().unwrap();
        let dir_a = temp.path().join("bin_a");
        let dir_b = temp.path().join("bin_b");
        std::fs::create_dir_all(&dir_a).unwrap();
        std::fs::create_dir_all(&dir_b).unwrap();
        create_fake_executable(&dir_a, "node");
        create_fake_executable(&dir_b, "node");
        let path = std::env::join_paths([dir_a.as_path(), dir_b.as_path()]).unwrap();
        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PATH", &path);
            std::env::set_var(env_vars::VITE_PLUS_BYPASS, dir_a.as_os_str());
        }
        let result = find_system_node();
        assert!(result.is_some(), "Should find node in non-bypassed directory");
        assert!(result.unwrap().starts_with(&dir_b), "Should find node in dir_b, not dir_a");
    }

    #[test]
    #[serial]
    fn test_find_system_node_returns_none_when_all_paths_bypassed() {
        let _guard = EnvGuard::new();
        let temp = TempDir::new().unwrap();
        let dir_a = temp.path().join("bin_a");
        std::fs::create_dir_all(&dir_a).unwrap();
        create_fake_executable(&dir_a, "node");
        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PATH", dir_a.as_os_str());
            std::env::set_var(env_vars::VITE_PLUS_BYPASS, dir_a.as_os_str());
        }
        let result = find_system_node();
        assert!(result.is_none(), "Should return None when all paths are bypassed");
    }

    #[test]
    fn test_abbreviate_home() {
        if let Ok(home) = std::env::var("HOME") {
            let path = format!("{home}/.vite-plus");
            assert_eq!(abbreviate_home(&path), "~/.vite-plus");
            // Non-home path should be unchanged
            assert_eq!(abbreviate_home("/usr/local/bin"), "/usr/local/bin");
        }
    }

    /// Guard for env vars used by profile file tests.
    #[cfg(not(windows))]
    struct ProfileEnvGuard {
        original_home: Option<std::ffi::OsString>,
        original_zdotdir: Option<std::ffi::OsString>,
        original_xdg_config: Option<std::ffi::OsString>,
    }

    #[cfg(not(windows))]
    impl ProfileEnvGuard {
        fn new(
            home: &std::path::Path,
            zdotdir: Option<&std::path::Path>,
            xdg_config: Option<&std::path::Path>,
        ) -> Self {
            let guard = Self {
                original_home: std::env::var_os("HOME"),
                original_zdotdir: std::env::var_os("ZDOTDIR"),
                original_xdg_config: std::env::var_os("XDG_CONFIG_HOME"),
            };
            unsafe {
                std::env::set_var("HOME", home);
                match zdotdir {
                    Some(v) => std::env::set_var("ZDOTDIR", v),
                    None => std::env::remove_var("ZDOTDIR"),
                }
                match xdg_config {
                    Some(v) => std::env::set_var("XDG_CONFIG_HOME", v),
                    None => std::env::remove_var("XDG_CONFIG_HOME"),
                }
            }
            guard
        }
    }

    #[cfg(not(windows))]
    impl Drop for ProfileEnvGuard {
        fn drop(&mut self) {
            unsafe {
                match &self.original_home {
                    Some(v) => std::env::set_var("HOME", v),
                    None => std::env::remove_var("HOME"),
                }
                match &self.original_zdotdir {
                    Some(v) => std::env::set_var("ZDOTDIR", v),
                    None => std::env::remove_var("ZDOTDIR"),
                }
                match &self.original_xdg_config {
                    Some(v) => std::env::set_var("XDG_CONFIG_HOME", v),
                    None => std::env::remove_var("XDG_CONFIG_HOME"),
                }
            }
        }
    }
    #[test]
    #[serial]
    #[cfg(not(windows))]
    fn test_check_profile_files_finds_zdotdir() {
        let temp = TempDir::new().unwrap();
        let fake_home = temp.path().join("home");
        let zdotdir = temp.path().join("zdotdir");
        std::fs::create_dir_all(&fake_home).unwrap();
        std::fs::create_dir_all(&zdotdir).unwrap();
        std::fs::write(zdotdir.join(".zshenv"), ". \"$HOME/.vite-plus/env\"\n").unwrap();
        let _guard = ProfileEnvGuard::new(&fake_home, Some(&zdotdir), None);
        // Pass an empty base list so only the ZDOTDIR fallback is triggered
        let result = check_profile_files("$HOME/.vite-plus", &[]);
        assert!(result.is_some(), "Should find .zshenv in ZDOTDIR");
        assert!(result.unwrap().ends_with(".zshenv"));
    }

    #[test]
    #[serial]
    #[cfg(not(windows))]
    fn test_check_profile_files_finds_xdg_fish() {
        let temp = TempDir::new().unwrap();
        let fake_home = temp.path().join("home");
        let xdg_config = temp.path().join("xdg_config");
        let fish_dir = xdg_config.join("fish/conf.d");
        std::fs::create_dir_all(&fake_home).unwrap();
        std::fs::create_dir_all(&fish_dir).unwrap();
        std::fs::write(fish_dir.join("vite-plus.fish"), "source \"$HOME/.vite-plus/env.fish\"\n")
            .unwrap();
        let _guard = ProfileEnvGuard::new(&fake_home, None, Some(&xdg_config));
        // Pass an empty base list so only the XDG fallback is triggered
        let result = check_profile_files("$HOME/.vite-plus", &[]);
        assert!(result.is_some(), "Should find vite-plus.fish in XDG_CONFIG_HOME");
        assert!(result.unwrap().contains("vite-plus.fish"));
    }

    #[test]
    #[serial]
    #[cfg(not(windows))]
    fn test_check_profile_files_finds_posix_env_in_bashrc() {
        let temp = TempDir::new().unwrap();
        let fake_home = temp.path().join("home");
        std::fs::create_dir_all(&fake_home).unwrap();
        std::fs::write(fake_home.join(".bashrc"), "# some config\n. \"$HOME/.vite-plus/env\"\n")
            .unwrap();
        let _guard = ProfileEnvGuard::new(&fake_home, None, None);
        let result =
            check_profile_files("$HOME/.vite-plus", &[(".bashrc", false), (".profile", false)]);
        assert!(result.is_some(), "Should find env sourcing in .bashrc");
        assert_eq!(result.unwrap(), "~/.bashrc");
    }

    #[test]
    #[serial]
    #[cfg(not(windows))]
    fn test_check_profile_files_finds_fish_env() {
        let temp = TempDir::new().unwrap();
        let fake_home = temp.path().join("home");
        let fish_dir = fake_home.join(".config/fish");
        std::fs::create_dir_all(&fish_dir).unwrap();
        std::fs::write(fish_dir.join("config.fish"), "source \"$HOME/.vite-plus/env.fish\"\n")
            .unwrap();
        let _guard = ProfileEnvGuard::new(&fake_home, None, None);
        let result =
            check_profile_files("$HOME/.vite-plus", &[(".config/fish/config.fish", true)]);
        assert!(result.is_some(), "Should find env.fish sourcing in fish config");
        assert_eq!(result.unwrap(), "~/.config/fish/config.fish");
    }

    #[test]
    #[serial]
    #[cfg(not(windows))]
    fn test_check_profile_files_returns_none_when_not_found() {
        let temp = TempDir::new().unwrap();
        let fake_home = temp.path().join("home");
        std::fs::create_dir_all(&fake_home).unwrap();
        // Create a .bashrc without vite-plus sourcing
        std::fs::write(fake_home.join(".bashrc"), "# no vite-plus here\nexport FOO=bar\n").unwrap();
        let _guard = ProfileEnvGuard::new(&fake_home, None, None);
        let result =
            check_profile_files("$HOME/.vite-plus", &[(".bashrc", false), (".profile", false)]);
        assert!(result.is_none(), "Should return None when env sourcing not found");
    }

    #[test]
    #[serial]
    #[cfg(not(windows))]
    fn test_check_profile_files_finds_absolute_path() {
        let temp = TempDir::new().unwrap();
        let fake_home = temp.path().join("home");
        std::fs::create_dir_all(&fake_home).unwrap();
        // Use the absolute path form instead of $HOME
        let abs_path = format!(". \"{}/home/.vite-plus/env\"\n", temp.path().display());
        std::fs::write(fake_home.join(".zshenv"), &abs_path).unwrap();
        let _guard = ProfileEnvGuard::new(&fake_home, None, None);
        let result = check_profile_files("$HOME/.vite-plus", &[(".zshenv", false)]);
        assert!(result.is_some(), "Should find absolute path form of env sourcing");
        assert_eq!(result.unwrap(), "~/.zshenv");
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/env/exec.rs
================================================
//! Exec command for executing commands with a specific Node.js version.
//!
//! Handles two modes:
//! 1. Explicit version: `vp env exec --node <version> [--npm <version>] <command>`
//! 2. Shim mode: `vp env exec <tool> [args...]` where tool is node/npm/npx or a global package binary
//!
//! The shim mode uses the same dispatch logic as Unix symlinks, ensuring identical behavior
//! across platforms (used by Windows .cmd wrappers and Git Bash shell scripts).

use std::process::ExitStatus;

use vite_js_runtime::NodeProvider;
use vite_shared::{env_vars, format_path_prepended};

use crate::{
    error::Error,
    shim::{dispatch as shim_dispatch, is_shim_tool},
};

/// Execute the exec command.
///
/// When `--node` is provided, runs a command with the specified Node.js version.
/// When `--node` is not provided and the command is a shim tool (node/npm/npx or a global package),
/// uses the same shim dispatch logic as Unix symlinks.
pub async fn execute(
    node_version: Option<&str>,
    npm_version: Option<&str>,
    command: &[String],
) -> Result<ExitStatus, Error> {
    let command = normalize_wrapper_command(command);
    if command.is_empty() {
        eprintln!("vp env exec: missing command to execute");
        eprintln!("Usage: vp env exec [--node <version>] <command> [args...]");
        return Ok(exit_status(1));
    }

    // If --node is provided, use explicit version mode (existing behavior)
    if let Some(version) = node_version {
        return execute_with_version(version, npm_version, &command).await;
    }

    // No --node provided - check if the first command is a shim tool.
    // This includes:
    // - Core tools (node, npm, npx)
    // - Globally installed package binaries (tsc, eslint, etc.)
    let tool = &command[0];
    if is_shim_tool(tool) {
        // Clear recursion env var to force fresh version resolution.
        // This is needed because `vp env exec` may be invoked from within a context
        // where VITE_PLUS_TOOL_RECURSION is already set (e.g., when pnpm runs through
        // the vite-plus shim). Without clearing it, shim_dispatch would pass through
        // to the system node instead of resolving the version.
        // SAFETY: This is safe because we're about to spawn a child process and we want
        // fresh version resolution, not passthrough behavior.
        unsafe {
            std::env::remove_var(env_vars::VITE_PLUS_TOOL_RECURSION);
        }

        // Use the SAME shim dispatch as Unix symlinks - this ensures:
        // - Core tools: Version resolved from .node-version/package.json/default
        // - Package binaries: Uses Node.js version from package metadata
        // - Automatic Node.js download if needed
        // - Recursion prevention via VITE_PLUS_TOOL_RECURSION
        // - Shim mode checking (managed vs system-first)
        let args: Vec<String> = command[1..].to_vec();
        let exit_code = shim_dispatch(tool, &args).await;
        return Ok(exit_status(exit_code));
    }

    // Not a shim tool and no --node - error
    eprintln!("vp env exec: --node is required when running non-shim commands");
    eprintln!("Usage: vp env exec --node <version> <command> [args...]");
    eprintln!();
    eprintln!("For shim tools, --node is optional (version resolved automatically):");
    eprintln!("  vp env exec node script.js   # Core tool");
    eprintln!("  vp env exec npm install      # Core tool");
    eprintln!("  vp env exec tsc --version    # Global package");
    Ok(exit_status(1))
}

/// Normalize arguments when invoked via Windows shim wrappers.
///
/// Wrappers insert `--` after the tool name so flags like `--help` aren't
/// consumed by clap while parsing `vp env exec`. Remove only that inserted
/// separator before forwarding args to the target tool.
fn normalize_wrapper_command(command: &[String]) -> Vec<String> {
    let from_wrapper = std::env::var_os(env_vars::VITE_PLUS_SHIM_WRAPPER).is_some();
    let normalized = normalize_wrapper_command_inner(command, from_wrapper);
    if from_wrapper {
        // SAFETY: We're in a short-lived CLI process and clearing a wrapper-only
        // marker before tool execution avoids leaking it to child processes.
unsafe { std::env::remove_var(env_vars::VITE_PLUS_SHIM_WRAPPER); } } normalized } fn normalize_wrapper_command_inner(command: &[String], from_wrapper: bool) -> Vec { let mut normalized = command.to_vec(); if from_wrapper && normalized.len() >= 2 && normalized[1] == "--" { normalized.remove(1); } normalized } /// Execute a command with an explicitly specified Node.js version. async fn execute_with_version( node_version: &str, npm_version: Option<&str>, command: &[String], ) -> Result { // Warn about unsupported --npm flag if npm_version.is_some() { eprintln!("Warning: --npm flag is not yet implemented, using bundled npm"); } // 1. Resolve version let provider = NodeProvider::new(); let resolved_version = resolve_version(node_version, &provider).await?; // 2. Ensure installed (download if needed) let runtime = vite_js_runtime::download_runtime(vite_js_runtime::JsRuntimeType::Node, &resolved_version) .await?; // 3. Clear recursion env var to force re-evaluation in child processes // SAFETY: This is safe because we're about to spawn a child process and we want // to ensure the env var is not inherited. We're not reading this env var in other // threads at this point. unsafe { std::env::remove_var(env_vars::VITE_PLUS_TOOL_RECURSION); } // 4. Build PATH with node bin dir first (uses platform-specific separator) // Always prepend to ensure the requested Node version is first in PATH let node_bin_dir = runtime.get_bin_prefix(); let new_path = format_path_prepended(node_bin_dir.as_path()); // 5. Execute command let (cmd, args) = command.split_first().unwrap(); let status = tokio::process::Command::new(cmd).args(args).env("PATH", new_path).status().await?; Ok(status) } /// Resolve version to an exact version. /// /// Handles aliases (lts, latest) and version ranges. 
async fn resolve_version(version: &str, provider: &NodeProvider) -> Result { match version.to_lowercase().as_str() { "lts" => { let resolved = provider.resolve_latest_version().await?; Ok(resolved.to_string()) } "latest" => { let resolved = provider.resolve_version("*").await?; Ok(resolved.to_string()) } _ => { // For exact versions, use directly if NodeProvider::is_exact_version(version) { // Strip v prefix if present let normalized = version.strip_prefix('v').unwrap_or(version); Ok(normalized.to_string()) } else { // For ranges/partial versions, resolve to exact let resolved = provider.resolve_version(version).await?; Ok(resolved.to_string()) } } } } /// Create an exit status with the given code. fn exit_status(code: i32) -> ExitStatus { #[cfg(unix)] { use std::os::unix::process::ExitStatusExt; ExitStatus::from_raw(code << 8) } #[cfg(windows)] { use std::os::windows::process::ExitStatusExt; ExitStatus::from_raw(code as u32) } } #[cfg(test)] mod tests { use serial_test::serial; use super::*; #[tokio::test] async fn test_execute_missing_command() { let result = execute(Some("20.18.0"), None, &[]).await; assert!(result.is_ok()); let status = result.unwrap(); assert!(!status.success()); } #[tokio::test] #[serial] async fn test_execute_node_version() { // Run 'node --version' with a specific Node.js version let command = vec!["node".to_string(), "--version".to_string()]; let result = execute(Some("20.18.0"), None, &command).await; assert!(result.is_ok()); let status = result.unwrap(); assert!(status.success()); } #[tokio::test] async fn test_resolve_version_exact() { let provider = NodeProvider::new(); let version = resolve_version("20.18.0", &provider).await.unwrap(); assert_eq!(version, "20.18.0"); } #[tokio::test] async fn test_resolve_version_with_v_prefix() { let provider = NodeProvider::new(); let version = resolve_version("v20.18.0", &provider).await.unwrap(); assert_eq!(version, "20.18.0"); } #[tokio::test] async fn test_resolve_version_partial() { let 
provider = NodeProvider::new(); let version = resolve_version("20", &provider).await.unwrap(); // Should resolve to a 20.x.x version - check starts with "20." assert!(version.starts_with("20."), "Expected version starting with '20.', got: {version}"); } #[tokio::test] async fn test_resolve_version_range() { let provider = NodeProvider::new(); let version = resolve_version("^20.0.0", &provider).await.unwrap(); // Should resolve to a 20.x.x version - check starts with "20." assert!(version.starts_with("20."), "Expected version starting with '20.', got: {version}"); } #[tokio::test] async fn test_resolve_version_lts() { let provider = NodeProvider::new(); let version = resolve_version("lts", &provider).await.unwrap(); // Should resolve to a valid version (format: x.y.z) let parts: Vec<&str> = version.split('.').collect(); assert_eq!(parts.len(), 3, "Expected version format x.y.z, got: {version}"); // Major version should be >= 20 (current LTS line) let major: u32 = parts[0].parse().expect("Major version should be a number"); assert!(major >= 20, "Expected major version >= 20, got: {major}"); } #[tokio::test] async fn test_shim_mode_error_for_non_shim_command() { // Running a non-shim command without --node should error let command = vec!["python".to_string(), "--version".to_string()]; let result = execute(None, None, &command).await; assert!(result.is_ok()); let status = result.unwrap(); // Should fail because python is not a shim tool and --node was not provided assert!(!status.success(), "Non-shim command without --node should fail"); } #[test] fn test_normalize_wrapper_command_strips_only_wrapper_separator() { let command = vec!["node".to_string(), "--".to_string(), "--version".to_string()]; let normalized = normalize_wrapper_command_inner(&command, true); assert_eq!(normalized, vec!["node", "--version"]); } #[test] fn test_normalize_wrapper_command_no_wrapper_keeps_separator() { let command = vec!["node".to_string(), "--".to_string(), "--version".to_string()]; let 
normalized = normalize_wrapper_command_inner(&command, false); assert_eq!(normalized, command); } } ================================================ FILE: crates/vite_global_cli/src/commands/env/global_install.rs ================================================ //! Global package installation handling. use std::{ collections::HashSet, io::{Read, Write}, process::Stdio, }; use tokio::process::Command; use vite_js_runtime::NodeProvider; use vite_path::{AbsolutePath, current_dir}; use vite_shared::{format_path_prepended, output}; use super::{ bin_config::BinConfig, config::{ get_bin_dir, get_node_modules_dir, get_packages_dir, get_tmp_dir, resolve_version, resolve_version_alias, }, package_metadata::PackageMetadata, }; use crate::error::Error; /// Install a global package. /// /// If `node_version` is provided, uses that version. Otherwise, resolves from current directory. /// If `force` is true, auto-uninstalls conflicting packages. pub async fn install( package_spec: &str, node_version: Option<&str>, force: bool, ) -> Result<(), Error> { // Parse package spec (e.g., "typescript", "typescript@5.0.0", "@scope/pkg") let (package_name, _version_spec) = parse_package_spec(package_spec); output::raw(&format!("Installing {} globally...", package_spec)); // 1. Resolve Node.js version let version = if let Some(v) = node_version { let provider = NodeProvider::new(); resolve_version_alias(v, &provider).await? } else { // Resolve from current directory let cwd = current_dir().map_err(|e| { Error::ConfigError(format!("Cannot get current directory: {}", e).into()) })?; let resolution = resolve_version(&cwd).await?; resolution.version }; // 2. Ensure Node.js is installed let runtime = vite_js_runtime::download_runtime(vite_js_runtime::JsRuntimeType::Node, &version).await?; let node_bin_dir = runtime.get_bin_prefix(); let npm_path = if cfg!(windows) { node_bin_dir.join("npm.cmd") } else { node_bin_dir.join("npm") }; // 3. 
Create staging directory let tmp_dir = get_tmp_dir()?; let staging_dir = tmp_dir.join("packages").join(&package_name); // Clean up any previous failed install if tokio::fs::try_exists(&staging_dir).await.unwrap_or(false) { tokio::fs::remove_dir_all(&staging_dir).await?; } tokio::fs::create_dir_all(&staging_dir).await?; // 4. Run npm install with prefix set to staging directory // Pipe stdout/stderr so npm output is hidden on success, shown on failure let output = Command::new(npm_path.as_path()) .args(["install", "-g", "--no-fund", package_spec]) .env("npm_config_prefix", staging_dir.as_path()) .env("PATH", format_path_prepended(node_bin_dir.as_path())) .stdout(Stdio::piped()) .stderr(Stdio::piped()) .output() .await?; if !output.status.success() { // Clean up staging directory let _ = tokio::fs::remove_dir_all(&staging_dir).await; // Show captured output to help debug the failure let _ = std::io::stdout().write_all(&output.stdout); let _ = std::io::stderr().write_all(&output.stderr); return Err(Error::ConfigError( format!("npm install failed with exit code: {:?}", output.status.code()).into(), )); } // 5. 
Find installed package and extract metadata let node_modules_dir = get_node_modules_dir(&staging_dir, &package_name); let package_json_path = node_modules_dir.join("package.json"); if !tokio::fs::try_exists(&package_json_path).await.unwrap_or(false) { let _ = tokio::fs::remove_dir_all(&staging_dir).await; return Err(Error::ConfigError( format!( "Package {} was not installed correctly, package.json not found at {}", package_name, package_json_path.as_path().display() ) .into(), )); } // Read package.json to get version and binaries let package_json_content = tokio::fs::read_to_string(&package_json_path).await?; let package_json: serde_json::Value = serde_json::from_str(&package_json_content) .map_err(|e| Error::ConfigError(format!("Failed to parse package.json: {}", e).into()))?; let installed_version = package_json["version"].as_str().unwrap_or("unknown").to_string(); let binary_infos = extract_binaries(&package_json); // Detect which binaries are JavaScript files let mut bin_names = Vec::new(); let mut js_bins = HashSet::new(); for info in &binary_infos { bin_names.push(info.name.clone()); let binary_path = node_modules_dir.join(&info.path); if is_javascript_binary(&binary_path) { js_bins.insert(info.name.clone()); } } // 5b. Check for binary conflicts (before moving staging to final location) let mut conflicts: Vec<(String, String)> = Vec::new(); // (bin_name, existing_package) for bin_name in &bin_names { if let Some(config) = BinConfig::load(bin_name).await? 
{ // Only conflict if owned by a different package if config.package != package_name { conflicts.push((bin_name.clone(), config.package.clone())); } } } if !conflicts.is_empty() { if force { // Auto-uninstall conflicting packages let packages_to_remove: HashSet<_> = conflicts.iter().map(|(_, pkg)| pkg.clone()).collect(); for pkg in packages_to_remove { output::raw(&format!("Uninstalling {} (conflicts with {})...", pkg, package_name)); // Use Box::pin to avoid recursive async type issues Box::pin(uninstall(&pkg, false)).await?; } } else { // Hard fail with clear error // Clean up staging directory let _ = tokio::fs::remove_dir_all(&staging_dir).await; return Err(Error::BinaryConflict { bin_name: conflicts[0].0.clone(), existing_package: conflicts[0].1.clone(), new_package: package_name.clone(), }); } } // 6. Move staging to final location let packages_dir = get_packages_dir()?; let final_dir = packages_dir.join(&package_name); // Remove existing installation if present if tokio::fs::try_exists(&final_dir).await.unwrap_or(false) { tokio::fs::remove_dir_all(&final_dir).await?; } // Create parent directory (handles scoped packages like @scope/pkg) if let Some(parent) = final_dir.parent() { tokio::fs::create_dir_all(parent).await?; } tokio::fs::rename(&staging_dir, &final_dir).await?; // 7. Save package metadata let metadata = PackageMetadata::new( package_name.clone(), installed_version.clone(), version.clone(), None, // npm version - could extract from runtime bin_names.clone(), js_bins, "npm".to_string(), ); metadata.save().await?; // 8. 
Create shims for binaries and save per-binary configs let bin_dir = get_bin_dir()?; for bin_name in &bin_names { create_package_shim(&bin_dir, bin_name, &package_name).await?; // Write per-binary config let bin_config = BinConfig::new( bin_name.clone(), package_name.clone(), installed_version.clone(), version.clone(), ); bin_config.save().await?; } output::raw(&format!("Installed {} v{}", package_name, installed_version)); if !bin_names.is_empty() { output::raw(&format!("Binaries: {}", bin_names.join(", "))); } Ok(()) } /// Uninstall a global package. /// /// Uses two-phase uninstall: /// 1. Try to use PackageMetadata for binary list /// 2. Fallback to scanning BinConfig files for orphaned binaries pub async fn uninstall(package_name: &str, dry_run: bool) -> Result<(), Error> { let (package_name, _) = parse_package_spec(package_name); // Phase 1: Try to use PackageMetadata for binary list let bins = if let Some(metadata) = PackageMetadata::load(&package_name).await? { metadata.bins.clone() } else { // Phase 2: Fallback - scan BinConfig files for orphaned binaries let orphan_bins = BinConfig::find_by_package(&package_name).await?; if orphan_bins.is_empty() { return Err(Error::ConfigError( format!("Package {} is not installed", package_name).into(), )); } orphan_bins }; if dry_run { let bin_dir = get_bin_dir()?; let packages_dir = get_packages_dir()?; let package_dir = packages_dir.join(&package_name); let metadata_path = PackageMetadata::metadata_path(&package_name)?; output::raw(&format!("Would uninstall {}:", package_name)); for bin_name in &bins { output::raw(&format!(" - shim: {}", bin_dir.join(bin_name).as_path().display())); } output::raw(&format!(" - package dir: {}", package_dir.as_path().display())); output::raw(&format!(" - metadata: {}", metadata_path.as_path().display())); return Ok(()); } // Remove shims and bin configs let bin_dir = get_bin_dir()?; for bin_name in &bins { remove_package_shim(&bin_dir, bin_name).await?; 
        BinConfig::delete(bin_name).await?;
    }

    // Remove package directory
    let packages_dir = get_packages_dir()?;
    let package_dir = packages_dir.join(&package_name);
    if tokio::fs::try_exists(&package_dir).await.unwrap_or(false) {
        tokio::fs::remove_dir_all(&package_dir).await?;
    }

    // Remove metadata file
    PackageMetadata::delete(&package_name).await?;

    output::raw(&format!("Uninstalled {}", package_name));
    Ok(())
}

/// Parse package spec into name and optional version.
fn parse_package_spec(spec: &str) -> (String, Option<String>) {
    // Handle scoped packages: @scope/name@version
    if spec.starts_with('@') {
        // Find the second @ for version
        if let Some(idx) = spec[1..].find('@') {
            let idx = idx + 1; // Adjust for the skipped first char
            return (spec[..idx].to_string(), Some(spec[idx + 1..].to_string()));
        }
        return (spec.to_string(), None);
    }
    // Handle regular packages: name@version
    if let Some(idx) = spec.find('@') {
        return (spec[..idx].to_string(), Some(spec[idx + 1..].to_string()));
    }
    (spec.to_string(), None)
}

/// Binary info extracted from package.json.
struct BinaryInfo {
    /// Binary name (the command users will run)
    name: String,
    /// Relative path to the binary file from package root
    path: String,
}

/// Extract binary names and paths from package.json.
fn extract_binaries(package_json: &serde_json::Value) -> Vec<BinaryInfo> {
    let mut bins = Vec::new();
    if let Some(bin) = package_json.get("bin") {
        match bin {
            serde_json::Value::String(path) => {
                // Single binary with package name
                if let Some(name) = package_json["name"].as_str() {
                    // Get just the package name without scope
                    let bin_name = name.split('/').last().unwrap_or(name);
                    bins.push(BinaryInfo { name: bin_name.to_string(), path: path.clone() });
                }
            }
            serde_json::Value::Object(map) => {
                // Multiple binaries
                for (name, path) in map {
                    if let serde_json::Value::String(path) = path {
                        bins.push(BinaryInfo { name: name.clone(), path: path.clone() });
                    }
                }
            }
            _ => {}
        }
    }
    bins
}

/// Check if a file is a JavaScript file that should be run with Node.
/// /// Returns true if: /// - The file has a .js, .mjs, or .cjs extension /// - The file has a shebang containing "node" /// /// This function safely reads only the first 256 bytes to check the shebang, /// avoiding issues with binary files that may not have newlines. fn is_javascript_binary(path: &AbsolutePath) -> bool { // Check extension first (fast path, no file I/O) if let Some(ext) = path.as_path().extension() { let ext = ext.to_string_lossy().to_lowercase(); if ext == "js" || ext == "mjs" || ext == "cjs" { return true; } } // For extensionless files, read only first 256 bytes to check shebang // This is safe even for binary files if let Ok(mut file) = std::fs::File::open(path.as_path()) { let mut buffer = [0u8; 256]; if let Ok(n) = file.read(&mut buffer) { if n >= 2 && buffer[0] == b'#' && buffer[1] == b'!' { // Found shebang, check for "node" in the first line // Find newline or use entire buffer let end = buffer[..n].iter().position(|&b| b == b'\n').unwrap_or(n); if let Ok(shebang) = std::str::from_utf8(&buffer[..end]) { if shebang.contains("node") { return true; } } } } } false } /// Core shims that should not be overwritten by package binaries. pub(crate) const CORE_SHIMS: &[&str] = &["node", "npm", "npx", "vp"]; /// Create a shim for a package binary. /// /// On Unix: Creates a symlink to ../current/bin/vp /// On Windows: Creates a trampoline .exe that forwards to vp.exe async fn create_package_shim( bin_dir: &vite_path::AbsolutePath, bin_name: &str, package_name: &str, ) -> Result<(), Error> { // Check for conflicts with core shims if CORE_SHIMS.contains(&bin_name) { output::warn(&format!( "Package '{}' provides '{}' binary, but it conflicts with a core shim. 
Skipping.", package_name, bin_name )); return Ok(()); } // Ensure bin directory exists tokio::fs::create_dir_all(bin_dir).await?; #[cfg(unix)] { let shim_path = bin_dir.join(bin_name); // Check if already a managed shim (symlink to ../current/bin/vp) if let Ok(target) = tokio::fs::read_link(&shim_path).await { if target == std::path::Path::new("../current/bin/vp") { return Ok(()); } // Exists but points elsewhere (e.g., npm-installed direct symlink) — replace it tokio::fs::remove_file(&shim_path).await?; } // Create symlink to ../current/bin/vp tokio::fs::symlink("../current/bin/vp", &shim_path).await?; tracing::debug!("Created package shim symlink {:?} -> ../current/bin/vp", shim_path); } #[cfg(windows)] { let shim_path = bin_dir.join(format!("{}.exe", bin_name)); // Skip if already exists (e.g., re-installing the same package) if tokio::fs::try_exists(&shim_path).await.unwrap_or(false) { return Ok(()); } // Copy the trampoline binary as .exe. // The trampoline detects the tool name from its own filename and sets // VITE_PLUS_SHIM_TOOL env var before spawning vp.exe. let trampoline_src = super::setup::get_trampoline_path()?; tokio::fs::copy(trampoline_src.as_path(), &shim_path).await?; // Remove legacy .cmd and shell script wrappers from previous versions. // In Git Bash/MSYS, the extensionless script takes precedence over .exe, // so leftover wrappers would bypass the trampoline. super::setup::cleanup_legacy_windows_shim(bin_dir, bin_name).await; tracing::debug!("Created package trampoline shim {:?}", shim_path); } Ok(()) } /// Remove a shim for a package binary. 
async fn remove_package_shim( bin_dir: &vite_path::AbsolutePath, bin_name: &str, ) -> Result<(), Error> { // Don't remove core shims if CORE_SHIMS.contains(&bin_name) { return Ok(()); } #[cfg(unix)] { let shim_path = bin_dir.join(bin_name); // Use symlink_metadata to detect symlinks (even broken ones) if tokio::fs::symlink_metadata(&shim_path).await.is_ok() { tokio::fs::remove_file(&shim_path).await?; } } #[cfg(windows)] { // Remove trampoline .exe shim and legacy .cmd / shell script wrappers. // Best-effort: ignore NotFound errors for files that don't exist. for suffix in &[".exe", ".cmd", ""] { let path = if suffix.is_empty() { bin_dir.join(bin_name) } else { bin_dir.join(format!("{bin_name}{suffix}")) }; let _ = tokio::fs::remove_file(&path).await; } } Ok(()) } #[cfg(test)] mod tests { use super::*; /// RAII guard that sets `VITE_PLUS_TRAMPOLINE_PATH` to a fake binary on creation /// and clears it on drop. Ensures cleanup even on test panics. #[cfg(windows)] struct FakeTrampolineGuard; #[cfg(windows)] impl FakeTrampolineGuard { fn new(dir: &std::path::Path) -> Self { let trampoline = dir.join("vp-shim.exe"); std::fs::write(&trampoline, b"fake-trampoline").unwrap(); unsafe { std::env::set_var(vite_shared::env_vars::VITE_PLUS_TRAMPOLINE_PATH, &trampoline); } Self } } #[cfg(windows)] impl Drop for FakeTrampolineGuard { fn drop(&mut self) { unsafe { std::env::remove_var(vite_shared::env_vars::VITE_PLUS_TRAMPOLINE_PATH); } } } #[tokio::test] #[cfg_attr(windows, serial_test::serial)] async fn test_create_package_shim_creates_bin_dir() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; // Create a temp directory but don't create the bin subdirectory let temp_dir = TempDir::new().unwrap(); #[cfg(windows)] let _guard = FakeTrampolineGuard::new(temp_dir.path()); let bin_dir = temp_dir.path().join("bin"); let bin_dir = AbsolutePathBuf::new(bin_dir).unwrap(); // Verify bin directory doesn't exist assert!(!bin_dir.as_path().exists()); // Create a shim - this should 
create the bin directory create_package_shim(&bin_dir, "test-shim", "test-package").await.unwrap(); // Verify bin directory was created assert!(bin_dir.as_path().exists()); // Verify shim file was created (on Windows, shims have .exe extension) // On Unix, symlinks may be broken (target doesn't exist), so use symlink_metadata #[cfg(unix)] { let shim_path = bin_dir.join("test-shim"); assert!( std::fs::symlink_metadata(shim_path.as_path()).is_ok(), "Symlink shim should exist" ); } #[cfg(windows)] { let shim_path = bin_dir.join("test-shim.exe"); assert!(shim_path.as_path().exists()); } } #[tokio::test] async fn test_create_package_shim_skips_core_shims() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let bin_dir = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); // Try to create a shim for "node" which is a core shim create_package_shim(&bin_dir, "node", "some-package").await.unwrap(); // Verify the shim was NOT created (core shims should be skipped) #[cfg(unix)] let shim_path = bin_dir.join("node"); #[cfg(windows)] let shim_path = bin_dir.join("node.exe"); assert!(!shim_path.as_path().exists()); } #[tokio::test] #[cfg_attr(windows, serial_test::serial)] async fn test_remove_package_shim_removes_shim() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); #[cfg(windows)] let _guard = FakeTrampolineGuard::new(temp_dir.path()); let bin_dir = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); // Create a shim create_package_shim(&bin_dir, "tsc", "typescript").await.unwrap(); // Verify the shim was created // On Unix, symlinks may be broken (target doesn't exist), so use symlink_metadata #[cfg(unix)] { let shim_path = bin_dir.join("tsc"); assert!( std::fs::symlink_metadata(shim_path.as_path()).is_ok(), "Shim should exist after creation" ); // Remove the shim remove_package_shim(&bin_dir, "tsc").await.unwrap(); // Verify the shim was removed assert!( 
std::fs::symlink_metadata(shim_path.as_path()).is_err(), "Shim should be removed" ); } #[cfg(windows)] { let shim_path = bin_dir.join("tsc.exe"); assert!(shim_path.as_path().exists(), "Shim should exist after creation"); // Remove the shim remove_package_shim(&bin_dir, "tsc").await.unwrap(); // Verify the shim was removed assert!(!shim_path.as_path().exists(), "Shim should be removed"); } } #[tokio::test] async fn test_remove_package_shim_handles_missing_shim() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let bin_dir = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); // Remove a shim that doesn't exist - should not error remove_package_shim(&bin_dir, "nonexistent").await.unwrap(); } #[tokio::test] #[cfg_attr(windows, serial_test::serial)] async fn test_uninstall_removes_shims_from_metadata() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let temp_path = temp_dir.path().to_path_buf(); #[cfg(windows)] let _trampoline_guard = FakeTrampolineGuard::new(&temp_path); let _env_guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(&temp_path), ); // Create bin directory let bin_dir = AbsolutePathBuf::new(temp_path.join("bin")).unwrap(); tokio::fs::create_dir_all(&bin_dir).await.unwrap(); // Create shims for "tsc" and "tsserver" create_package_shim(&bin_dir, "tsc", "typescript").await.unwrap(); create_package_shim(&bin_dir, "tsserver", "typescript").await.unwrap(); // Verify shims exist // On Unix, symlinks may be broken (target doesn't exist), so use symlink_metadata #[cfg(unix)] { assert!( std::fs::symlink_metadata(bin_dir.join("tsc").as_path()).is_ok(), "tsc shim should exist" ); assert!( std::fs::symlink_metadata(bin_dir.join("tsserver").as_path()).is_ok(), "tsserver shim should exist" ); } #[cfg(windows)] { assert!(bin_dir.join("tsc.exe").as_path().exists(), "tsc.exe shim should exist"); assert!( 
bin_dir.join("tsserver.exe").as_path().exists(), "tsserver.exe shim should exist" ); } // Create metadata with bins let metadata = PackageMetadata::new( "typescript".to_string(), "5.9.3".to_string(), "20.18.0".to_string(), None, vec!["tsc".to_string(), "tsserver".to_string()], HashSet::from(["tsc".to_string(), "tsserver".to_string()]), "npm".to_string(), ); metadata.save().await.unwrap(); // Create package directory (needed for uninstall) let packages_dir = AbsolutePathBuf::new(temp_path.join("packages")).unwrap(); let package_dir = packages_dir.join("typescript"); tokio::fs::create_dir_all(&package_dir).await.unwrap(); // Verify metadata was saved let loaded = PackageMetadata::load("typescript").await.unwrap(); assert!(loaded.is_some(), "Metadata should be loaded"); let loaded = loaded.unwrap(); assert_eq!(loaded.bins, vec!["tsc", "tsserver"], "bins should match"); // Run uninstall uninstall("typescript", false).await.unwrap(); // Verify shims were removed #[cfg(unix)] { assert!(!bin_dir.join("tsc").as_path().exists(), "tsc shim should be removed"); assert!( !bin_dir.join("tsserver").as_path().exists(), "tsserver shim should be removed" ); } #[cfg(windows)] { assert!(!bin_dir.join("tsc.exe").as_path().exists(), "tsc.exe shim should be removed"); assert!( !bin_dir.join("tsserver.exe").as_path().exists(), "tsserver.exe shim should be removed" ); } } #[test] fn test_parse_package_spec_simple() { let (name, version) = parse_package_spec("typescript"); assert_eq!(name, "typescript"); assert_eq!(version, None); } #[test] fn test_parse_package_spec_with_version() { let (name, version) = parse_package_spec("typescript@5.0.0"); assert_eq!(name, "typescript"); assert_eq!(version, Some("5.0.0".to_string())); } #[test] fn test_parse_package_spec_scoped() { let (name, version) = parse_package_spec("@types/node"); assert_eq!(name, "@types/node"); assert_eq!(version, None); } #[test] fn test_parse_package_spec_scoped_with_version() { let (name, version) = 
parse_package_spec("@types/node@20.0.0"); assert_eq!(name, "@types/node"); assert_eq!(version, Some("20.0.0".to_string())); } #[test] fn test_is_javascript_binary_with_js_extension() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let js_file = temp_dir.path().join("cli.js"); std::fs::write(&js_file, "console.log('hello')").unwrap(); let path = AbsolutePathBuf::new(js_file).unwrap(); assert!(is_javascript_binary(&path)); } #[test] fn test_is_javascript_binary_with_mjs_extension() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let mjs_file = temp_dir.path().join("cli.mjs"); std::fs::write(&mjs_file, "export default 'hello'").unwrap(); let path = AbsolutePathBuf::new(mjs_file).unwrap(); assert!(is_javascript_binary(&path)); } #[test] fn test_is_javascript_binary_with_cjs_extension() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let cjs_file = temp_dir.path().join("cli.cjs"); std::fs::write(&cjs_file, "module.exports = 'hello'").unwrap(); let path = AbsolutePathBuf::new(cjs_file).unwrap(); assert!(is_javascript_binary(&path)); } #[test] fn test_is_javascript_binary_with_node_shebang() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let cli_file = temp_dir.path().join("cli"); std::fs::write(&cli_file, "#!/usr/bin/env node\nconsole.log('hello')").unwrap(); let path = AbsolutePathBuf::new(cli_file).unwrap(); assert!(is_javascript_binary(&path)); } #[test] fn test_is_javascript_binary_with_direct_node_shebang() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let cli_file = temp_dir.path().join("cli"); std::fs::write(&cli_file, "#!/usr/bin/node\nconsole.log('hello')").unwrap(); let path = AbsolutePathBuf::new(cli_file).unwrap(); assert!(is_javascript_binary(&path)); } #[test] fn test_is_javascript_binary_native_executable() { use 
tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); // Simulate a native binary (ELF header) let native_file = temp_dir.path().join("native-cli"); std::fs::write(&native_file, b"\x7fELF").unwrap(); let path = AbsolutePathBuf::new(native_file).unwrap(); assert!(!is_javascript_binary(&path)); } #[test] fn test_is_javascript_binary_shell_script() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let shell_file = temp_dir.path().join("script.sh"); std::fs::write(&shell_file, "#!/bin/bash\necho hello").unwrap(); let path = AbsolutePathBuf::new(shell_file).unwrap(); assert!(!is_javascript_binary(&path)); } #[test] fn test_is_javascript_binary_python_script() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let python_file = temp_dir.path().join("script.py"); std::fs::write(&python_file, "#!/usr/bin/env python3\nprint('hello')").unwrap(); let path = AbsolutePathBuf::new(python_file).unwrap(); assert!(!is_javascript_binary(&path)); } #[test] fn test_is_javascript_binary_empty_file() { use tempfile::TempDir; use vite_path::AbsolutePathBuf; let temp_dir = TempDir::new().unwrap(); let empty_file = temp_dir.path().join("empty"); std::fs::write(&empty_file, "").unwrap(); let path = AbsolutePathBuf::new(empty_file).unwrap(); assert!(!is_javascript_binary(&path)); } } ================================================ FILE: crates/vite_global_cli/src/commands/env/list.rs ================================================ //! List command for displaying locally installed Node.js versions. //! //! Handles `vp env list` to show Node.js versions installed in VITE_PLUS_HOME/js_runtime/node/. 
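// Illustrative note (not part of the original source): given the `Serialize`
// derive on `InstalledVersionJson` below and the `serde_json::to_string_pretty`
// call in `print_json`, a machine-readable listing from `vp env list --json`
// would look roughly like the following (version numbers and flags here are
// hypothetical examples):
//
//   [
//     {
//       "version": "20.18.0",
//       "current": true,
//       "default": false
//     },
//     {
//       "version": "22.13.0",
//       "current": false,
//       "default": true
//     }
//   ]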
use std::{cmp::Ordering, process::ExitStatus};

use owo_colors::OwoColorize;
use serde::Serialize;
use vite_path::AbsolutePathBuf;

use super::config;
use crate::error::Error;

/// JSON output format for a single installed version
#[derive(Serialize)]
struct InstalledVersionJson {
    version: String,
    current: bool,
    default: bool,
}

/// Scan the node versions directory and return sorted version strings.
fn list_installed_versions(node_dir: &std::path::Path) -> Vec<String> {
    let entries = match std::fs::read_dir(node_dir) {
        Ok(entries) => entries,
        Err(_) => return Vec::new(),
    };

    let mut versions: Vec<String> = entries
        .filter_map(|entry| {
            let entry = entry.ok()?;
            let name = entry.file_name().into_string().ok()?;
            // Skip hidden directories and non-directories
            if name.starts_with('.') || !entry.path().is_dir() {
                return None;
            }
            Some(name)
        })
        .collect();

    versions.sort_by(|a, b| compare_versions(a, b));
    versions
}

/// Compare two version strings numerically (e.g., "20.18.0" vs "22.13.0").
fn compare_versions(a: &str, b: &str) -> Ordering {
    let parse = |v: &str| -> Vec<u32> { v.split('.').filter_map(|p| p.parse().ok()).collect() };
    let a_parts = parse(a);
    let b_parts = parse(b);
    a_parts.cmp(&b_parts)
}

/// Execute the list command (local installed versions).
pub async fn execute(cwd: AbsolutePathBuf, json_output: bool) -> Result<ExitStatus, Error> { let home_dir = vite_shared::get_vite_plus_home().map_err(|e| Error::ConfigError(format!("{e}").into()))?; let node_dir = home_dir.join("js_runtime").join("node"); let versions = list_installed_versions(node_dir.as_path()); if versions.is_empty() { if json_output { println!("[]"); } else { println!("No Node.js versions installed."); println!(); println!("Install a version with: vp env install <version>"); } return Ok(ExitStatus::default()); } // Resolve current version (gracefully handle errors) let current_version = config::resolve_version(&cwd).await.ok().map(|r| r.version); // Load default version let default_version = config::load_config().await.ok().and_then(|c| c.default_node_version); if json_output { print_json(&versions, current_version.as_deref(), default_version.as_deref()); } else { print_human(&versions, current_version.as_deref(), default_version.as_deref()); } Ok(ExitStatus::default()) } /// Print installed versions as JSON. fn print_json(versions: &[String], current: Option<&str>, default: Option<&str>) { let entries: Vec<InstalledVersionJson> = versions .iter() .map(|v| InstalledVersionJson { version: v.clone(), current: current.is_some_and(|c| c == v), default: default.is_some_and(|d| d == v), }) .collect(); // unwrap is safe here since we're serializing simple structs println!("{}", serde_json::to_string_pretty(&entries).unwrap()); } /// Print installed versions in human-readable format.
fn print_human(versions: &[String], current: Option<&str>, default: Option<&str>) { for v in versions { let is_current = current.is_some_and(|c| c == v); let is_default = default.is_some_and(|d| d == v); let mut markers = Vec::new(); if is_current { markers.push("current"); } if is_default { markers.push("default"); } let marker_str = if markers.is_empty() { String::new() } else { format!(" {}", markers.join(" ").dimmed()) }; let line = format!("* v{v}{marker_str}"); if is_current { println!("{}", line.bright_blue()); } else { println!("{line}"); } } } #[cfg(test)] mod tests { use super::*; #[test] fn test_version_cmp() { assert_eq!(compare_versions("18.20.0", "20.18.0"), Ordering::Less); assert_eq!(compare_versions("22.13.0", "20.18.0"), Ordering::Greater); assert_eq!(compare_versions("20.18.0", "20.18.0"), Ordering::Equal); assert_eq!(compare_versions("20.9.0", "20.18.0"), Ordering::Less); } #[test] fn test_list_installed_versions_nonexistent_dir() { let versions = list_installed_versions(std::path::Path::new("/nonexistent/path")); assert!(versions.is_empty()); } #[test] fn test_list_installed_versions_empty_dir() { let dir = tempfile::tempdir().unwrap(); let versions = list_installed_versions(dir.path()); assert!(versions.is_empty()); } #[test] fn test_list_installed_versions_with_versions() { let dir = tempfile::tempdir().unwrap(); // Create version directories std::fs::create_dir(dir.path().join("20.18.0")).unwrap(); std::fs::create_dir(dir.path().join("22.13.0")).unwrap(); std::fs::create_dir(dir.path().join("18.20.0")).unwrap(); // Create a hidden dir that should be skipped std::fs::create_dir(dir.path().join(".tmp")).unwrap(); // Create a file that should be skipped std::fs::write(dir.path().join("some-file"), "").unwrap(); let versions = list_installed_versions(dir.path()); assert_eq!(versions, vec!["18.20.0", "20.18.0", "22.13.0"]); } } ================================================ FILE: crates/vite_global_cli/src/commands/env/list_remote.rs 
================================================ //! List-remote command for displaying available Node.js versions from the registry. //! //! Handles `vp env list-remote` to show available Node.js versions from the Node.js distribution. use std::process::ExitStatus; use owo_colors::OwoColorize; use serde::Serialize; use vite_js_runtime::{LtsInfo, NodeProvider, NodeVersionEntry}; use crate::{cli::SortingMethod, error::Error}; /// Default number of major versions to show const DEFAULT_MAJOR_VERSIONS: usize = 10; /// JSON output format for version list #[derive(Serialize)] struct VersionListJson { versions: Vec<VersionJson>, } /// JSON format for a single version entry #[derive(Serialize)] struct VersionJson { version: String, lts: Option<String>, latest: bool, latest_lts: bool, } /// Execute the list-remote command. pub async fn execute( pattern: Option<String>, lts_only: bool, show_all: bool, json_output: bool, sort: SortingMethod, ) -> Result<ExitStatus, Error> { let provider = NodeProvider::new(); let versions = provider.fetch_version_index().await?; if versions.is_empty() { println!("No versions found."); return Ok(ExitStatus::default()); } // Filter versions based on options let mut filtered = filter_versions(&versions, pattern.as_deref(), lts_only, show_all); // fetch_version_index() returns newest-first (desc). // For asc (default), reverse to show oldest-first. if matches!(sort, SortingMethod::Asc) { filtered.reverse(); } if json_output { print_json(&filtered, &versions)?; } else { print_human(&filtered); } Ok(ExitStatus::default()) } /// Filter versions based on criteria.
fn filter_versions<'a>( versions: &'a [NodeVersionEntry], pattern: Option<&str>, lts_only: bool, show_all: bool, ) -> Vec<&'a NodeVersionEntry> { let mut filtered: Vec<&'a NodeVersionEntry> = versions.iter().collect(); // Filter by LTS if requested if lts_only { filtered.retain(|v| v.is_lts()); } // Filter by pattern (major version) if let Some(pattern) = pattern { filtered.retain(|v| { let version_str = v.version.strip_prefix('v').unwrap_or(&v.version); version_str.starts_with(pattern) || version_str.starts_with(&format!("{pattern}.")) }); } // Limit to recent major versions unless --all is specified if !show_all && pattern.is_none() { filtered = limit_to_recent_majors(filtered, DEFAULT_MAJOR_VERSIONS); } filtered } /// Extract major version from a version string like "v20.18.0" or "20.18.0" fn extract_major(version: &str) -> Option<u32> { let version_str = version.strip_prefix('v').unwrap_or(version); version_str.split('.').next()?.parse().ok() } /// Limit versions to the N most recent major versions. fn limit_to_recent_majors( versions: Vec<&NodeVersionEntry>, max_majors: usize, ) -> Vec<&NodeVersionEntry> { // Get unique major versions let mut majors: Vec<u32> = versions.iter().filter_map(|v| extract_major(&v.version)).collect(); majors.sort_unstable(); majors.dedup(); majors.reverse(); // Keep only the most recent N majors let recent_majors: std::collections::HashSet<u32> = majors.into_iter().take(max_majors).collect(); versions .into_iter() .filter(|v| extract_major(&v.version).is_some_and(|m| recent_majors.contains(&m))) .collect() } /// Print versions as JSON.
fn print_json( versions: &[&NodeVersionEntry], all_versions: &[NodeVersionEntry], ) -> Result<(), Error> { // Find the latest version and latest LTS let latest_version = all_versions.first().map(|v| &v.version); let latest_lts_version = all_versions.iter().find(|v| v.is_lts()).map(|v| &v.version); let version_list: Vec<VersionJson> = versions .iter() .map(|v| { let lts = match &v.lts { LtsInfo::Codename(name) => Some(name.to_string()), _ => None, }; let is_latest = latest_version.is_some_and(|lv| lv == &v.version); let is_latest_lts = latest_lts_version.is_some_and(|llv| llv == &v.version); VersionJson { version: v.version.strip_prefix('v').unwrap_or(&v.version).to_string(), lts, latest: is_latest, latest_lts: is_latest_lts, } }) .collect(); let output = VersionListJson { versions: version_list }; println!("{}", serde_json::to_string_pretty(&output)?); Ok(()) } /// Print versions in human-readable format (fnm-style). fn print_human(versions: &[&NodeVersionEntry]) { if versions.is_empty() { eprintln!("{}", "No versions were found!".red()); return; } for version in versions { let version_str = &version.version; // Ensure v prefix let display = if version_str.starts_with('v') { version_str.to_string() } else { format!("v{version_str}") }; if let LtsInfo::Codename(name) = &version.lts { println!("{}{}", display, format!(" ({name})").bright_blue()); } else { println!("{display}"); } } } #[cfg(test)] mod tests { use super::*; fn make_version(version: &str, lts: Option<&str>) -> NodeVersionEntry { NodeVersionEntry { version: version.into(), lts: match lts { Some(name) => LtsInfo::Codename(name.into()), None => LtsInfo::Boolean(false), }, } } #[test] fn test_filter_versions_lts_only() { let versions = vec![ make_version("v24.0.0", None), make_version("v22.13.0", Some("Jod")), make_version("v20.18.0", Some("Iron")), ]; let filtered = filter_versions(&versions, None, true, false); assert_eq!(filtered.len(), 2); assert!(filtered.iter().all(|v| v.is_lts())); } #[test] fn
test_filter_versions_by_pattern() { let versions = vec![ make_version("v24.0.0", None), make_version("v22.13.0", Some("Jod")), make_version("v22.12.0", Some("Jod")), make_version("v20.18.0", Some("Iron")), ]; let filtered = filter_versions(&versions, Some("22"), false, true); assert_eq!(filtered.len(), 2); assert!(filtered.iter().all(|v| v.version.starts_with("v22."))); } #[test] fn test_limit_to_recent_majors() { let versions = vec![ make_version("v24.0.0", None), make_version("v23.0.0", None), make_version("v22.13.0", Some("Jod")), make_version("v21.0.0", None), make_version("v20.18.0", Some("Iron")), ]; let refs: Vec<&NodeVersionEntry> = versions.iter().collect(); let limited = limit_to_recent_majors(refs, 2); // Should only have v24 and v23 assert_eq!(limited.len(), 2); assert!(limited.iter().any(|v| v.version.starts_with("v24."))); assert!(limited.iter().any(|v| v.version.starts_with("v23."))); } #[test] fn test_filter_versions_show_all_returns_all_versions() { // Create versions spanning many major versions (more than DEFAULT_MAJOR_VERSIONS) let versions = vec![ make_version("v25.0.0", None), make_version("v24.0.0", None), make_version("v23.0.0", None), make_version("v22.13.0", Some("Jod")), make_version("v21.0.0", None), make_version("v20.18.0", Some("Iron")), make_version("v19.0.0", None), make_version("v18.20.0", Some("Hydrogen")), make_version("v17.0.0", None), make_version("v16.20.0", Some("Gallium")), make_version("v15.0.0", None), make_version("v14.0.0", None), ]; // Without show_all, should be limited to DEFAULT_MAJOR_VERSIONS (10) let filtered_limited = filter_versions(&versions, None, false, false); assert_eq!(filtered_limited.len(), 10); // With show_all=true, should return all versions let filtered_all = filter_versions(&versions, None, false, true); assert_eq!(filtered_all.len(), 12); } #[test] fn test_filter_versions_show_all_with_lts_filter() { let versions = vec![ make_version("v25.0.0", None), make_version("v22.13.0", Some("Jod")), 
make_version("v20.18.0", Some("Iron")), make_version("v18.20.0", Some("Hydrogen")), ]; // With lts_only and show_all, should return all LTS versions let filtered = filter_versions(&versions, None, true, true); assert_eq!(filtered.len(), 3); assert!(filtered.iter().all(|v| v.is_lts())); } } ================================================ FILE: crates/vite_global_cli/src/commands/env/mod.rs ================================================ //! Environment management commands. //! //! This module provides the `vp env` command for managing Node.js environments //! through shim-based version management. pub mod bin_config; pub mod config; mod current; mod default; mod doctor; mod exec; pub mod global_install; mod list; mod list_remote; mod off; mod on; pub mod package_metadata; pub mod packages; mod pin; mod setup; mod unpin; mod r#use; mod which; use std::process::ExitStatus; use vite_path::AbsolutePathBuf; use crate::{ cli::{EnvArgs, EnvSubcommands}, error::Error, }; fn print_env_header() { println!("{}", vite_shared::header::vite_plus_header()); println!(); } fn should_print_env_header(subcommand: &EnvSubcommands) -> bool { match subcommand { EnvSubcommands::Current { json } => !json, EnvSubcommands::List { json } => !json, EnvSubcommands::ListRemote { json, .. } => !json, // Keep these machine-consumable / passthrough commands header-free. EnvSubcommands::Use { .. } | EnvSubcommands::Exec { .. } => false, _ => true, } } /// Execute the env command based on the provided arguments. 
pub async fn execute(cwd: AbsolutePathBuf, args: EnvArgs) -> Result<ExitStatus, Error> { // Handle subcommands first if let Some(subcommand) = args.command { if should_print_env_header(&subcommand) { print_env_header(); } return match subcommand { crate::cli::EnvSubcommands::Current { json } => current::execute(cwd, json).await, crate::cli::EnvSubcommands::Print => print_env(cwd).await, crate::cli::EnvSubcommands::Default { version } => default::execute(cwd, version).await, crate::cli::EnvSubcommands::On => on::execute().await, crate::cli::EnvSubcommands::Off => off::execute().await, crate::cli::EnvSubcommands::Setup { refresh, env_only } => { setup::execute(refresh, env_only).await } crate::cli::EnvSubcommands::Doctor => doctor::execute(cwd).await, crate::cli::EnvSubcommands::Which { tool } => which::execute(cwd, &tool).await, crate::cli::EnvSubcommands::Pin { version, unpin, no_install, force } => { pin::execute(cwd, version, unpin, no_install, force).await } crate::cli::EnvSubcommands::Unpin => unpin::execute(cwd).await, crate::cli::EnvSubcommands::List { json } => list::execute(cwd, json).await, crate::cli::EnvSubcommands::ListRemote { pattern, lts, all, json, sort } => { list_remote::execute(pattern, lts, all, json, sort).await } crate::cli::EnvSubcommands::Exec { node, npm, command } => { exec::execute(node.as_deref(), npm.as_deref(), &command).await } crate::cli::EnvSubcommands::Uninstall { version } => { let provider = vite_js_runtime::NodeProvider::new(); let resolved = config::resolve_version_alias(&version, &provider).await?; let home_dir = vite_shared::get_vite_plus_home() .map_err(|e| crate::error::Error::ConfigError(format!("{e}").into()))?; let version_dir = home_dir.join("js_runtime").join("node").join(&resolved); if !version_dir.as_path().exists() { eprintln!("Node.js v{} is not installed", resolved); return Ok(exit_status(1)); } tokio::fs::remove_dir_all(version_dir.as_path()).await.map_err(|e| { crate::error::Error::ConfigError( format!("Failed to remove Node.js 
v{}: {}", resolved, e).into(), ) })?; println!("Uninstalled Node.js v{}", resolved); Ok(ExitStatus::default()) } crate::cli::EnvSubcommands::Use { version, unset, no_install, silent_if_unchanged } => { r#use::execute(cwd, version, unset, no_install, silent_if_unchanged).await } crate::cli::EnvSubcommands::Install { version } => { let (resolved, from_session_override) = if let Some(version) = version { let provider = vite_js_runtime::NodeProvider::new(); (config::resolve_version_alias(&version, &provider).await?, false) } else { let resolution = config::resolve_version(&cwd).await?; let from_session_override = matches!( resolution.source.as_str(), config::VERSION_ENV_VAR | config::SESSION_VERSION_FILE ); match resolution.source.as_str() { ".node-version" | "engines.node" | "devEngines.runtime" | config::VERSION_ENV_VAR | config::SESSION_VERSION_FILE => {} _ => { eprintln!("No Node.js version found in current project."); eprintln!("Specify a version: vp env install <version>"); eprintln!("Or pin one: vp env pin <version>"); return Ok(exit_status(1)); } } (resolution.version, from_session_override) }; println!("Installing Node.js v{}...", resolved); vite_js_runtime::download_runtime(vite_js_runtime::JsRuntimeType::Node, &resolved) .await?; println!("Installed Node.js v{}", resolved); if from_session_override { eprintln!("Note: Installed from session override."); eprintln!("Run `vp env use --unset` to revert to project version resolution."); } Ok(ExitStatus::default()) } }; } // No flags provided - show unified help to match `vp env --help`. if !crate::help::print_unified_clap_help_for_path(&["env"]) { // Fallback to clap's built-in help printer if unified rendering fails.
use clap::CommandFactory; println!("{}", vite_shared::header::vite_plus_header()); println!(); crate::cli::Args::command() .find_subcommand("env") .unwrap() .clone() .disable_help_subcommand(true) .print_help() .ok(); } Ok(ExitStatus::default()) } /// Print shell snippet for setting environment (`vp env print`) async fn print_env(cwd: AbsolutePathBuf) -> Result<ExitStatus, Error> { // Resolve the Node.js version for the current directory let resolution = config::resolve_version(&cwd).await?; // Get the node bin directory let runtime = vite_js_runtime::download_runtime( vite_js_runtime::JsRuntimeType::Node, &resolution.version, ) .await?; let bin_dir = runtime.get_bin_prefix(); // Print shell snippet println!("# Add to your shell to use this Node.js version for this session:"); println!("export PATH=\"{}:$PATH\"", bin_dir.as_path().display()); Ok(ExitStatus::default()) } /// Create an exit status with the given code. fn exit_status(code: i32) -> ExitStatus { #[cfg(unix)] { use std::os::unix::process::ExitStatusExt; ExitStatus::from_raw(code << 8) } #[cfg(windows)] { use std::os::windows::process::ExitStatusExt; ExitStatus::from_raw(code as u32) } } ================================================ FILE: crates/vite_global_cli/src/commands/env/off.rs ================================================ //! Enable system-first mode command. //! //! Handles `vp env off` to set shim mode to "system_first" - //! shims prefer system Node.js, fallback to managed if not found. use std::process::ExitStatus; use owo_colors::OwoColorize; use super::config::{ShimMode, load_config, save_config}; use crate::{error::Error, help}; fn accent_command(command: &str) -> String { if help::should_style_help() { format!("`{}`", command.bright_blue()) } else { format!("`{command}`") } } /// Execute the `vp env off` command.
pub async fn execute() -> Result<ExitStatus, Error> { let mut config = load_config().await?; if config.shim_mode == ShimMode::SystemFirst { println!("Shim mode is already set to system-first."); println!( "Shims will prefer system Node.js, falling back to Vite+ managed Node.js if not found." ); return Ok(ExitStatus::default()); } config.shim_mode = ShimMode::SystemFirst; save_config(&config).await?; println!("\u{2713} Shim mode set to system-first."); println!(); println!( "Shims will now prefer system Node.js, falling back to Vite+ managed Node.js if not found." ); println!(); println!("Run {} to always use the Vite+ managed Node.js.", accent_command("vp env on")); Ok(ExitStatus::default()) } ================================================ FILE: crates/vite_global_cli/src/commands/env/on.rs ================================================ //! Enable managed mode command. //! //! Handles `vp env on` to set shim mode to "managed" - shims always use vite-plus Node.js. use std::process::ExitStatus; use owo_colors::OwoColorize; use super::config::{ShimMode, load_config, save_config}; use crate::{error::Error, help}; fn accent_command(command: &str) -> String { if help::should_style_help() { format!("`{}`", command.bright_blue()) } else { format!("`{command}`") } } /// Execute the `vp env on` command.
pub async fn execute() -> Result<ExitStatus, Error> { let mut config = load_config().await?; if config.shim_mode == ShimMode::Managed { println!("Shim mode is already set to managed."); println!("Shims will always use the Vite+ managed Node.js."); return Ok(ExitStatus::default()); } config.shim_mode = ShimMode::Managed; save_config(&config).await?; println!("\u{2713} Shim mode set to managed."); println!(); println!("Shims will now always use the Vite+ managed Node.js."); println!(); println!("Run {} to prefer system Node.js instead.", accent_command("vp env off")); Ok(ExitStatus::default()) } ================================================ FILE: crates/vite_global_cli/src/commands/env/package_metadata.rs ================================================ //! Package metadata storage for global packages. use std::collections::HashSet; use chrono::{DateTime, Utc}; use serde::{Deserialize, Serialize}; use vite_path::AbsolutePathBuf; use super::config::get_packages_dir; use crate::error::Error; /// Metadata for a globally installed package. #[derive(Debug, Clone, Serialize, Deserialize)] #[serde(rename_all = "camelCase")] pub struct PackageMetadata { /// Package name pub name: String, /// Package version pub version: String, /// Platform versions used during installation pub platform: Platform, /// Binary names provided by this package pub bins: Vec<String>, /// Binary names that are JavaScript files (need Node.js to run). #[serde(default)] pub js_bins: HashSet<String>, /// Package manager used for installation (npm, yarn, pnpm) pub manager: String, /// Installation timestamp pub installed_at: DateTime<Utc>, } /// Platform versions pinned to this package. #[derive(Debug, Clone, Serialize, Deserialize)] pub struct Platform { /// Node.js version pub node: String, /// npm version (if applicable) #[serde(skip_serializing_if = "Option::is_none")] pub npm: Option<String>, } impl PackageMetadata { /// Create new package metadata.
pub fn new( name: String, version: String, node_version: String, npm_version: Option<String>, bins: Vec<String>, js_bins: HashSet<String>, manager: String, ) -> Self { Self { name, version, platform: Platform { node: node_version, npm: npm_version }, bins, js_bins, manager, installed_at: Utc::now(), } } /// Check if a binary requires Node.js to run. pub fn is_js_binary(&self, bin_name: &str) -> bool { self.js_bins.contains(bin_name) } /// Get the metadata file path for a package. pub fn metadata_path(package_name: &str) -> Result<AbsolutePathBuf, Error> { let packages_dir = get_packages_dir()?; Ok(packages_dir.join(format!("{package_name}.json"))) } /// Load metadata for a package. pub async fn load(package_name: &str) -> Result<Option<Self>, Error> { let path = Self::metadata_path(package_name)?; if !tokio::fs::try_exists(&path).await.unwrap_or(false) { return Ok(None); } let content = tokio::fs::read_to_string(&path).await?; let metadata: Self = serde_json::from_str(&content).map_err(|e| { Error::ConfigError(format!("Failed to parse package metadata: {e}").into()) })?; Ok(Some(metadata)) } /// Save metadata for a package. pub async fn save(&self) -> Result<(), Error> { let path = Self::metadata_path(&self.name)?; // Create parent directory (handles scoped packages like @scope/pkg.json) if let Some(parent) = path.parent() { tokio::fs::create_dir_all(parent).await?; } let content = serde_json::to_string_pretty(self).map_err(|e| { Error::ConfigError(format!("Failed to serialize package metadata: {e}").into()) })?; tokio::fs::write(&path, content).await?; Ok(()) } /// Delete metadata for a package. pub async fn delete(package_name: &str) -> Result<(), Error> { let path = Self::metadata_path(package_name)?; if tokio::fs::try_exists(&path).await.unwrap_or(false) { tokio::fs::remove_file(&path).await?; } Ok(()) } /// List all installed packages.
pub async fn list_all() -> Result<Vec<Self>, Error> { let packages_dir = get_packages_dir()?; if !tokio::fs::try_exists(&packages_dir).await.unwrap_or(false) { return Ok(Vec::new()); } let mut packages = Vec::new(); list_packages_recursive(&packages_dir, &mut packages).await?; Ok(packages) } /// Find the package that provides a given binary. /// /// Returns the package metadata if found, None otherwise. pub async fn find_by_binary(binary_name: &str) -> Result<Option<Self>, Error> { let packages = Self::list_all().await?; for package in packages { if package.bins.contains(&binary_name.to_string()) { return Ok(Some(package)); } } Ok(None) } } /// Recursively list packages in a directory (handles scoped packages in subdirs). async fn list_packages_recursive( dir: &vite_path::AbsolutePath, packages: &mut Vec<PackageMetadata>, ) -> Result<(), Error> { let mut entries = tokio::fs::read_dir(dir).await?; while let Some(entry) = entries.next_entry().await? { let path = entry.path(); let file_type = entry.file_type().await?; if file_type.is_dir() { // Only recurse into scoped package directories (@scope/) // Skip package installation directories (typescript/, projj/) if let Some(name) = entry.file_name().to_str() { if name.starts_with('@') { if let Some(abs_path) = AbsolutePathBuf::new(path) { Box::pin(list_packages_recursive(&abs_path, packages)).await?; } } } } else if path.extension().is_some_and(|e| e == "json") { // Read JSON metadata files if let Ok(content) = tokio::fs::read_to_string(&path).await { if let Ok(metadata) = serde_json::from_str::<PackageMetadata>(&content) { packages.push(metadata); } } } } Ok(()) } #[cfg(test)] mod tests { use super::*; #[test] fn test_metadata_path_regular_package() { // Regular package: typescript.json let path = PackageMetadata::metadata_path("typescript").unwrap(); assert!(path.as_path().ends_with("typescript.json")); } #[test] fn test_metadata_path_scoped_package() { // Scoped package: @types/node.json (inside @types directory) let path = 
PackageMetadata::metadata_path("@types/node").unwrap(); let path_str = path.as_path().to_string_lossy(); assert!( path_str.ends_with("@types/node.json"), "Expected path ending with @types/node.json, got: {}", path_str ); } #[tokio::test] async fn test_save_scoped_package_metadata() { use tempfile::TempDir; let temp_dir = TempDir::new().unwrap(); let temp_path = temp_dir.path().to_path_buf(); let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(&temp_path), ); let metadata = PackageMetadata::new( "@scope/test-pkg".to_string(), "1.0.0".to_string(), "20.18.0".to_string(), None, vec!["test-bin".to_string()], HashSet::from(["test-bin".to_string()]), "npm".to_string(), ); // This should not fail with "No such file or directory" // because save() should create the @scope parent directory let result = metadata.save().await; assert!(result.is_ok(), "Failed to save scoped package metadata: {:?}", result.err()); // Verify the file exists at the correct location let expected_path = temp_path.join("packages").join("@scope").join("test-pkg.json"); assert!(expected_path.exists(), "Metadata file not found at {:?}", expected_path); } #[tokio::test] async fn test_list_all_includes_scoped_packages() { use tempfile::TempDir; let temp_dir = TempDir::new().unwrap(); let temp_path = temp_dir.path().to_path_buf(); let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(&temp_path), ); // Create regular package metadata let regular = PackageMetadata::new( "typescript".to_string(), "5.0.0".to_string(), "20.18.0".to_string(), None, vec!["tsc".to_string()], HashSet::from(["tsc".to_string()]), "npm".to_string(), ); regular.save().await.unwrap(); // Create scoped package metadata let scoped = PackageMetadata::new( "@types/node".to_string(), "20.0.0".to_string(), "20.18.0".to_string(), None, vec![], HashSet::new(), "npm".to_string(), ); scoped.save().await.unwrap(); // list_all should find both let all = 
PackageMetadata::list_all().await.unwrap(); assert_eq!(all.len(), 2, "Expected 2 packages, got {}", all.len()); let names: Vec<_> = all.iter().map(|p| p.name.as_str()).collect(); assert!(names.contains(&"typescript"), "Missing typescript package"); assert!(names.contains(&"@types/node"), "Missing @types/node package"); } #[tokio::test] async fn test_find_by_binary() { use tempfile::TempDir; let temp_dir = TempDir::new().unwrap(); let temp_path = temp_dir.path().to_path_buf(); let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(&temp_path), ); // Create typescript package with tsc and tsserver binaries let typescript = PackageMetadata::new( "typescript".to_string(), "5.0.0".to_string(), "20.18.0".to_string(), None, vec!["tsc".to_string(), "tsserver".to_string()], HashSet::from(["tsc".to_string(), "tsserver".to_string()]), "npm".to_string(), ); typescript.save().await.unwrap(); // Create eslint package with eslint binary let eslint = PackageMetadata::new( "eslint".to_string(), "9.0.0".to_string(), "22.13.0".to_string(), None, vec!["eslint".to_string()], HashSet::from(["eslint".to_string()]), "npm".to_string(), ); eslint.save().await.unwrap(); // Find by binary should return the correct package let found = PackageMetadata::find_by_binary("tsc").await.unwrap(); assert!(found.is_some(), "Should find package providing tsc"); assert_eq!(found.unwrap().name, "typescript"); let found = PackageMetadata::find_by_binary("tsserver").await.unwrap(); assert!(found.is_some(), "Should find package providing tsserver"); assert_eq!(found.unwrap().name, "typescript"); let found = PackageMetadata::find_by_binary("eslint").await.unwrap(); assert!(found.is_some(), "Should find package providing eslint"); assert_eq!(found.unwrap().name, "eslint"); // Non-existent binary should return None let found = PackageMetadata::find_by_binary("nonexistent").await.unwrap(); assert!(found.is_none(), "Should not find package for nonexistent binary"); } } 
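// Editor's note: the scoped-package tests above hinge on one detail of
// `metadata_path`: it simply appends `.json` to the package name, so a scoped
// name like `@types/node` yields a nested path and `save()` must create the
// `@scope` parent directory first. A minimal standalone sketch of that path
// behavior follows; the `metadata_path` helper here is a simplified stand-in
// that takes the packages directory as an argument instead of resolving it
// from VITE_PLUS_HOME as the real code does.

```rust
use std::path::{Path, PathBuf};

// Simplified stand-in for `PackageMetadata::metadata_path` (hypothetical:
// the real helper resolves the packages dir via `get_packages_dir()`).
fn metadata_path(packages_dir: &Path, package_name: &str) -> PathBuf {
    // Joining "@scope/pkg.json" produces a nested path, so scoped packages
    // land in an "@scope/" subdirectory automatically.
    packages_dir.join(format!("{package_name}.json"))
}

fn main() {
    let base = Path::new("/home/user/.vite-plus/packages");
    // Regular package: a flat metadata file.
    assert_eq!(
        metadata_path(base, "typescript"),
        PathBuf::from("/home/user/.vite-plus/packages/typescript.json")
    );
    // Scoped package: nested under the scope directory, which explains the
    // `create_dir_all(parent)` call in `save()`.
    assert_eq!(
        metadata_path(base, "@types/node"),
        PathBuf::from("/home/user/.vite-plus/packages/@types/node.json")
    );
}
```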
================================================ FILE: crates/vite_global_cli/src/commands/env/packages.rs ================================================ //! List installed global packages. use std::process::ExitStatus; use owo_colors::OwoColorize; use super::package_metadata::PackageMetadata; use crate::error::Error; /// Execute the packages command. pub async fn execute(json: bool, pattern: Option<&str>) -> Result<ExitStatus, Error> { let all_packages = PackageMetadata::list_all().await?; let packages: Vec<_> = if let Some(pat) = pattern { let pat_lower = pat.to_lowercase(); all_packages.into_iter().filter(|p| p.name.to_lowercase().contains(&pat_lower)).collect() } else { all_packages }; if packages.is_empty() { if json { println!("[]"); } else if pattern.is_some() { println!("No global packages matching '{}'.", pattern.unwrap()); println!(); println!("Run 'vp list -g' to see all installed global packages."); } else { println!("No global packages installed."); println!(); println!("Install packages with: vp install -g <package>"); } return Ok(ExitStatus::default()); } if json { let json_output = serde_json::to_string_pretty(&packages) .map_err(|e| Error::ConfigError(format!("Failed to serialize: {e}").into()))?; println!("{json_output}"); } else { let col_pkg = "Package"; let col_node = "Node version"; let col_bins = "Binaries"; let mut w_pkg = col_pkg.len(); let mut w_node = col_node.len(); for pkg in &packages { let name = format!("{}@{}", pkg.name, pkg.version); w_pkg = w_pkg.max(name.len()); w_node = w_node.max(pkg.platform.node.len()); } let gap = 3; println!("{:<w_pkg$}{:gap$}{:<w_node$}{:gap$}{}", col_pkg, "", col_node, "", col_bins); println!("{:<w_pkg$}{:gap$}{:<w_node$}{:gap$}{}", "---", "", "---", "", "---"); for pkg in &packages { let name = format!("{}@{}", pkg.name, pkg.version); let bins = pkg.bins.join(", "); println!( "{:<w_pkg$}{:gap$}{:<w_node$}{:gap$}{}", name.bright_blue(), "", pkg.platform.node, "", bins ); } } Ok(ExitStatus::default()) } ================================================ FILE: crates/vite_global_cli/src/commands/env/pin.rs ================================================ //!
Pin command for per-directory Node.js version management. //! //! Handles `vp env pin [VERSION]` to pin a Node.js version in the current directory //! by creating or updating a `.node-version` file. use std::{io::Write, process::ExitStatus}; use vite_js_runtime::NodeProvider; use vite_path::AbsolutePathBuf; use vite_shared::output; use super::config::{get_config_path, load_config}; use crate::error::Error; /// Node version file name const NODE_VERSION_FILE: &str = ".node-version"; /// Execute the pin command. pub async fn execute( cwd: AbsolutePathBuf, version: Option<String>, unpin: bool, no_install: bool, force: bool, ) -> Result<ExitStatus, Error> { // Handle --unpin flag if unpin { return do_unpin(&cwd).await; } match version { Some(v) => do_pin(&cwd, &v, no_install, force).await, None => show_pinned(&cwd).await, } } /// Show the current pinned version. async fn show_pinned(cwd: &AbsolutePathBuf) -> Result<ExitStatus, Error> { let node_version_path = cwd.join(NODE_VERSION_FILE); // Check if .node-version exists in current directory if tokio::fs::try_exists(&node_version_path).await.unwrap_or(false) { let content = tokio::fs::read_to_string(&node_version_path).await?; let version = content.trim(); println!("Pinned version: {version}"); println!(" Source: {}", node_version_path.as_path().display()); return Ok(ExitStatus::default()); } // Check for inherited version from parent directories if let Some((version, source_path)) = find_inherited_version(cwd).await?
{ println!("No version pinned in current directory."); println!(" Inherited: {version} from {}", source_path.as_path().display()); return Ok(ExitStatus::default()); } // No .node-version anywhere - show default let config = load_config().await?; match config.default_node_version { Some(version) => { let config_path = get_config_path()?; println!("No version pinned."); println!(" Using default: {version} (from {})", config_path.as_path().display()); } None => { println!("No version pinned."); println!(" Run 'vp env pin <version>' to pin a version."); } } Ok(ExitStatus::default()) } /// Find .node-version in parent directories. async fn find_inherited_version( cwd: &AbsolutePathBuf, ) -> Result<Option<(String, AbsolutePathBuf)>, Error> { let mut current: Option<AbsolutePathBuf> = cwd.parent().map(|p| p.to_absolute_path_buf()); while let Some(dir) = current { let node_version_path = dir.join(NODE_VERSION_FILE); if tokio::fs::try_exists(&node_version_path).await.unwrap_or(false) { let content = tokio::fs::read_to_string(&node_version_path).await?; return Ok(Some((content.trim().to_string(), node_version_path))); } current = dir.parent().map(|p| p.to_absolute_path_buf()); } Ok(None) } /// Pin a version to the current directory.
async fn do_pin(
    cwd: &AbsolutePathBuf,
    version: &str,
    no_install: bool,
    force: bool,
) -> Result<ExitStatus, Error> {
    let provider = NodeProvider::new();
    let node_version_path = cwd.join(NODE_VERSION_FILE);

    // Resolve the version (aliases like lts/latest are resolved to exact versions)
    let (resolved_version, was_alias) = resolve_version_for_pin(version, &provider).await?;

    // Check if .node-version already exists
    if !force && tokio::fs::try_exists(&node_version_path).await.unwrap_or(false) {
        let existing_content = tokio::fs::read_to_string(&node_version_path).await?;
        let existing_version = existing_content.trim();
        if existing_version == resolved_version {
            println!("Already pinned to {resolved_version}");
            return Ok(ExitStatus::default());
        }

        // Prompt for confirmation
        println!(".node-version already exists with version {existing_version}");
        print!("Overwrite with {resolved_version}? (y/n): ");
        std::io::stdout().flush()?;
        let mut input = String::new();
        std::io::stdin().read_line(&mut input)?;
        if !input.trim().eq_ignore_ascii_case("y") {
            println!("Cancelled.");
            return Ok(ExitStatus::default());
        }
    }

    // Write the version to .node-version
    tokio::fs::write(&node_version_path, format!("{resolved_version}\n")).await?;

    // Invalidate resolve cache so the pinned version takes effect immediately
    crate::shim::invalidate_cache();

    // Print success message
    if was_alias {
        output::success(&format!(
            "Pinned Node.js version to {resolved_version} (resolved from {version})"
        ));
    } else {
        output::success(&format!("Pinned Node.js version to {resolved_version}"));
    }
    println!("  Created {} in {}", NODE_VERSION_FILE, cwd.as_path().display());

    // Pre-download the version unless --no-install is specified
    if no_install {
        output::note("Version will be downloaded on first use.");
    } else {
        // Download the runtime
        match vite_js_runtime::download_runtime(
            vite_js_runtime::JsRuntimeType::Node,
            &resolved_version,
        )
        .await
        {
            Ok(_) => {
                output::success(&format!("Node.js {resolved_version} installed"));
            }
            Err(e) => {
                output::warn(&format!("Failed to download Node.js {resolved_version}: {e}"));
                output::note("Version will be downloaded on first use.");
            }
        }
    }

    Ok(ExitStatus::default())
}

/// Resolve version for pinning.
///
/// Aliases (lts, latest) are resolved to exact versions.
/// Returns (resolved_version, was_alias).
async fn resolve_version_for_pin(
    version: &str,
    provider: &NodeProvider,
) -> Result<(String, bool), Error> {
    match version.to_lowercase().as_str() {
        "lts" => {
            let resolved = provider.resolve_latest_version().await?;
            Ok((resolved.to_string(), true))
        }
        "latest" => {
            let resolved = provider.resolve_version("*").await?;
            Ok((resolved.to_string(), true))
        }
        _ => {
            // For exact versions, validate they exist
            if NodeProvider::is_exact_version(version) {
                // Validate the version exists by trying to resolve it
                provider.resolve_version(version).await?;
                Ok((version.to_string(), false))
            } else {
                // For ranges/partial versions, resolve to exact version
                let resolved = provider.resolve_version(version).await?;
                Ok((resolved.to_string(), true))
            }
        }
    }
}

/// Remove the .node-version file from current directory.
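The alias-vs-exact split performed by `resolve_version_for_pin` can be sketched in isolation. The following is an illustrative, hypothetical reimplementation of the classification only (it is not the crate's `NodeProvider::is_exact_version`, and it assumes "exact" means three numeric dot-separated parts): aliases are matched case-insensitively, exact versions are pinned verbatim, and everything else is treated as a range that must be resolved, setting `was_alias = true`.

```rust
/// Hypothetical sketch of the pin-resolution classification; the real
/// implementation lives in `NodeProvider` and also validates versions online.
fn classify_pin_request(version: &str) -> &'static str {
    // Aliases are matched case-insensitively, mirroring `resolve_version_for_pin`.
    match version.to_lowercase().as_str() {
        "lts" | "latest" => "alias",
        v => {
            // Assumed semantics of `is_exact_version`: major.minor.patch, all numeric.
            let parts: Vec<&str> = v.split('.').collect();
            let exact = parts.len() == 3 && parts.iter().all(|p| p.parse::<u32>().is_ok());
            if exact { "exact" } else { "range" }
        }
    }
}

fn main() {
    assert_eq!(classify_pin_request("20.18.0"), "exact"); // pinned verbatim
    assert_eq!(classify_pin_request("20"), "range"); // resolved, was_alias = true
    assert_eq!(classify_pin_request("LTS"), "alias"); // resolved to an exact version
    println!("ok");
}
```

Only "exact" requests keep `was_alias = false`, which is why the success message omits the "(resolved from …)" suffix in that case.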
pub async fn do_unpin(cwd: &AbsolutePathBuf) -> Result<ExitStatus, Error> {
    let node_version_path = cwd.join(NODE_VERSION_FILE);

    if !tokio::fs::try_exists(&node_version_path).await.unwrap_or(false) {
        println!("No {} file in current directory.", NODE_VERSION_FILE);
        return Ok(ExitStatus::default());
    }

    tokio::fs::remove_file(&node_version_path).await?;

    // Invalidate resolve cache so the unpinned version falls back correctly
    crate::shim::invalidate_cache();

    output::success(&format!("Removed {} from {}", NODE_VERSION_FILE, cwd.as_path().display()));
    Ok(ExitStatus::default())
}

#[cfg(test)]
mod tests {
    use serial_test::serial;
    use tempfile::TempDir;
    use vite_path::AbsolutePathBuf;

    use super::*;

    #[tokio::test]
    async fn test_show_pinned_no_file() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Should not error when no .node-version exists
        let result = show_pinned(&temp_path).await;
        assert!(result.is_ok());
    }

    #[tokio::test]
    async fn test_show_pinned_with_file() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create .node-version
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        let result = show_pinned(&temp_path).await;
        assert!(result.is_ok());
    }

    #[tokio::test]
    async fn test_find_inherited_version() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create .node-version in parent
        tokio::fs::write(temp_path.join(".node-version"), "20.18.0\n").await.unwrap();

        // Create subdirectory
        let subdir = temp_path.join("subdir");
        tokio::fs::create_dir(&subdir).await.unwrap();

        let result = find_inherited_version(&subdir).await.unwrap();
        assert!(result.is_some());
        let (version, _) = result.unwrap();
        assert_eq!(version, "20.18.0");
    }

    #[tokio::test]
    async fn test_do_unpin() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path =
            AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create .node-version
        let node_version_path = temp_path.join(".node-version");
        tokio::fs::write(&node_version_path, "20.18.0\n").await.unwrap();

        // Unpin
        let result = do_unpin(&temp_path).await;
        assert!(result.is_ok());

        // File should be gone
        assert!(!tokio::fs::try_exists(&node_version_path).await.unwrap());
    }

    #[tokio::test]
    // Run serially: mutates VITE_PLUS_HOME env var which affects invalidate_cache()
    #[serial]
    async fn test_do_unpin_invalidates_cache() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Point VITE_PLUS_HOME to temp dir
        unsafe {
            std::env::set_var(vite_shared::env_vars::VITE_PLUS_HOME, temp_path.as_path());
        }

        // Create cache file manually
        let cache_dir = temp_path.join("cache");
        std::fs::create_dir_all(&cache_dir).unwrap();
        let cache_file = cache_dir.join("resolve_cache.json");
        std::fs::write(&cache_file, r#"{"version":2,"entries":{}}"#).unwrap();
        assert!(
            std::fs::metadata(cache_file.as_path()).is_ok(),
            "Cache file should exist before unpin"
        );

        // Create .node-version and unpin
        let node_version_path = temp_path.join(".node-version");
        tokio::fs::write(&node_version_path, "20.18.0\n").await.unwrap();
        let result = do_unpin(&temp_path).await;
        assert!(result.is_ok());

        // Cache file should be removed by invalidate_cache()
        assert!(
            std::fs::metadata(cache_file.as_path()).is_err(),
            "Cache file should be removed after unpin"
        );

        // Cleanup
        unsafe {
            std::env::remove_var(vite_shared::env_vars::VITE_PLUS_HOME);
        }
    }

    // Run serially: mutates VITE_PLUS_HOME env var which affects invalidate_cache()
    #[tokio::test]
    #[serial]
    async fn test_do_pin_invalidates_cache() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Point VITE_PLUS_HOME to temp dir
        unsafe {
            std::env::set_var(vite_shared::env_vars::VITE_PLUS_HOME, temp_path.as_path());
        }

        // Create cache file manually
        let cache_dir = temp_path.join("cache");
        std::fs::create_dir_all(&cache_dir).unwrap();
        let cache_file = cache_dir.join("resolve_cache.json");
        std::fs::write(&cache_file, r#"{"version":2,"entries":{}}"#).unwrap();
        assert!(
            std::fs::metadata(cache_file.as_path()).is_ok(),
            "Cache file should exist before pin"
        );

        // Pin an exact version (no_install=true to skip download, force=true to skip prompt)
        let result = do_pin(&temp_path, "20.18.0", true, true).await;
        assert!(result.is_ok());

        // .node-version should be created
        let node_version_path = temp_path.join(".node-version");
        assert!(tokio::fs::try_exists(&node_version_path).await.unwrap());
        let content = tokio::fs::read_to_string(&node_version_path).await.unwrap();
        assert_eq!(content.trim(), "20.18.0");

        // Cache file should be removed by invalidate_cache()
        assert!(
            std::fs::metadata(cache_file.as_path()).is_err(),
            "Cache file should be removed after pin"
        );

        // Cleanup
        unsafe {
            std::env::remove_var(vite_shared::env_vars::VITE_PLUS_HOME);
        }
    }

    #[tokio::test]
    async fn test_do_unpin_no_file() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Should not error when no file exists
        let result = do_unpin(&temp_path).await;
        assert!(result.is_ok());
    }

    #[tokio::test]
    async fn test_resolve_version_for_pin_partial_version() {
        let provider = NodeProvider::new();

        // Partial version "20" should resolve to an exact version like "20.x.y"
        let (resolved, was_alias) = resolve_version_for_pin("20", &provider).await.unwrap();
        assert!(was_alias, "partial version should be treated as alias");

        // The resolved version should be a full semver version starting with "20."
        assert!(
            resolved.starts_with("20."),
            "expected resolved version to start with '20.', got: {resolved}"
        );

        // Should be a valid exact version (major.minor.patch)
        let parts: Vec<&str> = resolved.split('.').collect();
        assert_eq!(parts.len(), 3, "expected 3 version parts, got: {resolved}");
        assert!(parts.iter().all(|p| p.parse::<u32>().is_ok()), "all parts should be numeric");
    }

    #[tokio::test]
    async fn test_resolve_version_for_pin_exact_version() {
        let provider = NodeProvider::new();

        // Exact version should be returned as-is
        let (resolved, was_alias) = resolve_version_for_pin("20.18.0", &provider).await.unwrap();
        assert!(!was_alias, "exact version should not be treated as alias");
        assert_eq!(resolved, "20.18.0");
    }
}


================================================
FILE: crates/vite_global_cli/src/commands/env/setup.rs
================================================
//! Setup command implementation for creating bin directory and shims.
//!
//! Creates the following structure:
//! - ~/.vite-plus/bin/ - Contains vp symlink and node/npm/npx shims
//! - ~/.vite-plus/current/ - Contains the actual vp CLI binary
//!
//! On Unix:
//! - bin/vp is a symlink to ../current/bin/vp
//! - bin/node, bin/npm, bin/npx are symlinks to ../current/bin/vp
//! - Symlinks preserve argv[0], allowing tool detection via the symlink name
//!
//! On Windows:
//! - bin/vp.exe, bin/node.exe, bin/npm.exe, bin/npx.exe are trampoline executables
//! - Each trampoline detects its tool name from its own filename and spawns
//!   current\bin\vp.exe with VITE_PLUS_SHIM_TOOL env var set
//! - This avoids the "Terminate batch job (Y/N)?" prompt from .cmd wrappers

use std::process::ExitStatus;

use clap::CommandFactory;
use owo_colors::OwoColorize;

use super::config::{get_bin_dir, get_vite_plus_home};
use crate::{cli::Args, error::Error, help};

/// Tools to create shims for (node, npm, npx, vpx)
const SHIM_TOOLS: &[&str] = &["node", "npm", "npx", "vpx"];

fn accent_command(command: &str) -> String {
    if help::should_style_help() {
        format!("`{}`", command.bright_blue())
    } else {
        format!("`{command}`")
    }
}

/// Execute the setup command.
pub async fn execute(refresh: bool, env_only: bool) -> Result<ExitStatus, Error> {
    let vite_plus_home = get_vite_plus_home()?;

    // Ensure home directory exists (env files are written here)
    tokio::fs::create_dir_all(&vite_plus_home).await?;

    // Generate completion scripts
    generate_completion_scripts(&vite_plus_home).await?;

    // Create env files with PATH guard (prevents duplicate PATH entries)
    create_env_files(&vite_plus_home).await?;

    if env_only {
        println!("{}", help::render_heading("Setup"));
        println!("  Updated shell environment files.");
        println!("  Run {} to verify setup.", accent_command("vp env doctor"));
        return Ok(ExitStatus::default());
    }

    let bin_dir = get_bin_dir()?;

    println!("{}", help::render_heading("Setup"));
    println!("  Preparing vite-plus environment.");
    println!();

    // Ensure bin directory exists
    tokio::fs::create_dir_all(&bin_dir).await?;

    // Get the current executable path (for shims)
    let current_exe = std::env::current_exe()
        .map_err(|e| Error::ConfigError(format!("Cannot find current executable: {e}").into()))?;

    // Create wrapper script in bin/
    setup_vp_wrapper(&bin_dir, refresh).await?;

    // Create shims for node, npm, npx
    let mut created = Vec::new();
    let mut skipped = Vec::new();
    for tool in SHIM_TOOLS {
        let result = create_shim(&current_exe, &bin_dir, tool, refresh).await?;
        if result {
            created.push(*tool);
        } else {
            skipped.push(*tool);
        }
    }

    // Best-effort cleanup of .old files from rename-before-copy on Windows
    #[cfg(windows)]
    if refresh {
        cleanup_old_files(&bin_dir).await;
    }

    // Print results
    if !created.is_empty() {
        println!("{}", help::render_heading("Created Shims"));
        for tool in &created {
            let shim_path = bin_dir.join(shim_filename(tool));
            println!("  {}", shim_path.as_path().display());
        }
    }

    if !skipped.is_empty() && !refresh {
        if !created.is_empty() {
            println!();
        }
        println!("{}", help::render_heading("Skipped Shims"));
        for tool in &skipped {
            let shim_path = bin_dir.join(shim_filename(tool));
            println!("  {}", shim_path.as_path().display());
        }
        println!();
        println!("  Use --refresh to update existing shims.");
    }

    println!();
    print_path_instructions(&bin_dir);

    Ok(ExitStatus::default())
}

/// Create symlink in bin/ that points to current/bin/vp.
async fn setup_vp_wrapper(bin_dir: &vite_path::AbsolutePath, refresh: bool) -> Result<(), Error> {
    #[cfg(unix)]
    {
        let bin_vp = bin_dir.join("vp");

        // Create symlink bin/vp -> ../current/bin/vp
        let should_create_symlink = refresh
            || !tokio::fs::try_exists(&bin_vp).await.unwrap_or(false)
            || !is_symlink(&bin_vp).await;

        // Replace non-symlink with symlink
        if should_create_symlink {
            // Remove existing if present (could be old wrapper script or file)
            if tokio::fs::try_exists(&bin_vp).await.unwrap_or(false) {
                tokio::fs::remove_file(&bin_vp).await?;
            }
            // Create relative symlink
            tokio::fs::symlink("../current/bin/vp", &bin_vp).await?;
            tracing::debug!("Created symlink {:?} -> ../current/bin/vp", bin_vp);
        }
    }

    #[cfg(windows)]
    {
        let bin_vp_exe = bin_dir.join("vp.exe");

        // Create trampoline bin/vp.exe that forwards to current\bin\vp.exe
        let should_create = refresh || !tokio::fs::try_exists(&bin_vp_exe).await.unwrap_or(false);
        if should_create {
            let trampoline_src = get_trampoline_path()?;
            // On refresh, the existing vp.exe may still be running (the trampoline
            // that launched us). Windows prevents overwriting a running exe, so we
            // rename it to a timestamped .old file first, then copy the new one.
            if tokio::fs::try_exists(&bin_vp_exe).await.unwrap_or(false) {
                rename_to_old(&bin_vp_exe).await;
            }
            tokio::fs::copy(trampoline_src.as_path(), &bin_vp_exe).await?;
            tracing::debug!("Created trampoline {:?}", bin_vp_exe);
        }

        // Clean up legacy .cmd and shell script wrappers from previous versions
        if refresh {
            cleanup_legacy_windows_shim(bin_dir, "vp").await;
        }
    }

    Ok(())
}

/// Check if a path is a symlink.
#[cfg(unix)]
async fn is_symlink(path: &vite_path::AbsolutePath) -> bool {
    match tokio::fs::symlink_metadata(path).await {
        Ok(m) => m.file_type().is_symlink(),
        Err(_) => false,
    }
}

/// Create a single shim for node/npm/npx.
///
/// Returns `true` if the shim was created, `false` if it already exists.
async fn create_shim(
    source: &std::path::Path,
    bin_dir: &vite_path::AbsolutePath,
    tool: &str,
    refresh: bool,
) -> Result<bool, Error> {
    let shim_path = bin_dir.join(shim_filename(tool));

    // Check if shim already exists
    if tokio::fs::try_exists(&shim_path).await.unwrap_or(false) {
        if !refresh {
            return Ok(false);
        }
        // Remove existing shim for refresh.
        // On Windows, .exe files may be locked (by antivirus, indexer, or
        // still-running processes), so rename to .old first instead of deleting.
        #[cfg(windows)]
        rename_to_old(&shim_path).await;
        #[cfg(not(windows))]
        {
            tokio::fs::remove_file(&shim_path).await?;
        }
    }

    #[cfg(unix)]
    {
        create_unix_shim(source, &shim_path, tool).await?;
    }
    #[cfg(windows)]
    {
        create_windows_shim(source, bin_dir, tool).await?;
    }

    Ok(true)
}

/// Get the filename for a shim (platform-specific).
fn shim_filename(tool: &str) -> String {
    #[cfg(windows)]
    {
        // All tools use trampoline .exe files on Windows
        format!("{tool}.exe")
    }
    #[cfg(not(windows))]
    {
        tool.to_string()
    }
}

/// Create a Unix shim using symlink to ../current/bin/vp.
///
/// Symlinks preserve argv[0], allowing the vp binary to detect which tool
/// was invoked. This is the same pattern used by Volta.
#[cfg(unix)]
async fn create_unix_shim(
    _source: &std::path::Path,
    shim_path: &vite_path::AbsolutePath,
    _tool: &str,
) -> Result<(), Error> {
    // Create symlink to ../current/bin/vp (relative path)
    tokio::fs::symlink("../current/bin/vp", shim_path).await?;
    tracing::debug!("Created symlink shim at {:?} -> ../current/bin/vp", shim_path);
    Ok(())
}

/// Create Windows shims using trampoline `.exe` files.
///
/// Each tool gets a copy of the trampoline binary renamed to `.exe`.
/// The trampoline detects its tool name from its own filename and spawns
/// vp.exe with `VITE_PLUS_SHIM_TOOL` set, avoiding the "Terminate batch job?"
/// prompt that `.cmd` wrappers cause on Ctrl+C.
///
/// See:
#[cfg(windows)]
async fn create_windows_shim(
    _source: &std::path::Path,
    bin_dir: &vite_path::AbsolutePath,
    tool: &str,
) -> Result<(), Error> {
    let trampoline_src = get_trampoline_path()?;
    let shim_path = bin_dir.join(format!("{tool}.exe"));
    tokio::fs::copy(trampoline_src.as_path(), &shim_path).await?;

    // Clean up legacy .cmd and shell script wrappers from previous versions
    cleanup_legacy_windows_shim(bin_dir, tool).await;

    tracing::debug!("Created trampoline shim {:?}", shim_path);
    Ok(())
}

/// Creates completion scripts in `~/.vite-plus/completion/`:
/// - `vp.bash` (bash)
/// - `_vp` (zsh, following zsh convention)
/// - `vp.fish` (fish shell)
/// - `vp.ps1` (PowerShell)
async fn generate_completion_scripts(
    vite_plus_home: &vite_path::AbsolutePath,
) -> Result<(), Error> {
    let mut cmd = Args::command();

    // Create completion directory
    let completion_dir = vite_plus_home.join("completion");
    tokio::fs::create_dir_all(&completion_dir).await?;

    // Generate shell completion scripts
    let completions = [
        (clap_complete::Shell::Bash, "vp.bash"),
        (clap_complete::Shell::Zsh, "_vp"),
        (clap_complete::Shell::Fish, "vp.fish"),
        (clap_complete::Shell::PowerShell, "vp.ps1"),
    ];
    for (shell, filename) in completions {
        let path = completion_dir.join(filename);
        let mut file = std::fs::File::create(&path)?;
        clap_complete::generate(shell, &mut cmd, "vp", &mut file);
    }

    tracing::debug!("Generated completion scripts in {:?}", completion_dir);
    Ok(())
}

/// Get the path to the trampoline template binary (vp-shim.exe).
///
/// The trampoline binary is distributed alongside vp.exe in the same directory.
/// In tests, `VITE_PLUS_TRAMPOLINE_PATH` can override the resolved path.
#[cfg(windows)]
pub(crate) fn get_trampoline_path() -> Result<vite_path::AbsolutePathBuf, Error> {
    // Allow tests to override the trampoline path
    if let Ok(override_path) = std::env::var(vite_shared::env_vars::VITE_PLUS_TRAMPOLINE_PATH) {
        let path = std::path::PathBuf::from(override_path);
        if path.exists() {
            return vite_path::AbsolutePathBuf::new(path)
                .ok_or_else(|| Error::ConfigError("Invalid trampoline override path".into()));
        }
    }

    let current_exe = std::env::current_exe()
        .map_err(|e| Error::ConfigError(format!("Cannot find current executable: {e}").into()))?;
    let bin_dir = current_exe
        .parent()
        .ok_or_else(|| Error::ConfigError("Cannot find parent directory of vp.exe".into()))?;
    let trampoline = bin_dir.join("vp-shim.exe");

    if !trampoline.exists() {
        return Err(Error::ConfigError(
            format!(
                "Trampoline binary not found at {}. Re-install vite-plus to fix this.",
                trampoline.display()
            )
            .into(),
        ));
    }

    vite_path::AbsolutePathBuf::new(trampoline)
        .ok_or_else(|| Error::ConfigError("Invalid trampoline path".into()))
}

/// Rename an existing `.exe` to a timestamped `.old` file instead of deleting.
///
/// On Windows, running `.exe` files can't be deleted or overwritten, but they can
/// be renamed. The `.old` files are cleaned up by `cleanup_old_files()`.
#[cfg(windows)]
async fn rename_to_old(path: &vite_path::AbsolutePath) {
    let timestamp = std::time::SystemTime::now()
        .duration_since(std::time::UNIX_EPOCH)
        .unwrap_or_default()
        .as_secs();
    if let Some(name) = path.as_path().file_name().and_then(|n| n.to_str()) {
        let old_name = format!("{name}.{timestamp}.old");
        let old_path = path.as_path().with_file_name(&old_name);
        if let Err(e) = tokio::fs::rename(path, &old_path).await {
            tracing::warn!("Failed to rename {} to {}: {}", name, old_name, e);
        }
    }
}

/// Best-effort cleanup of accumulated `.old` files from previous rename-before-copy operations.
///
/// When refreshing `bin/vp.exe` on Windows, the running trampoline is renamed to a
/// timestamped `.old` file. This function tries to delete all such files. Files still
/// in use by a running process will silently fail to delete and be cleaned up next time.
#[cfg(windows)]
async fn cleanup_old_files(bin_dir: &vite_path::AbsolutePath) {
    let Ok(mut entries) = tokio::fs::read_dir(bin_dir).await else {
        return;
    };
    while let Ok(Some(entry)) = entries.next_entry().await {
        let file_name = entry.file_name();
        let name = file_name.to_string_lossy();
        if name.ends_with(".old") {
            let _ = tokio::fs::remove_file(entry.path()).await;
        }
    }
}

/// Remove legacy `.cmd` and shell script wrappers from previous versions.
#[cfg(windows)]
pub(crate) async fn cleanup_legacy_windows_shim(bin_dir: &vite_path::AbsolutePath, tool: &str) {
    // Remove old .cmd wrapper (best-effort, ignore NotFound)
    let cmd_path = bin_dir.join(format!("{tool}.cmd"));
    let _ = tokio::fs::remove_file(&cmd_path).await;

    // Remove old shell script wrapper (extensionless, for Git Bash)
    // Only remove if it starts with #!/bin/sh (not a binary or other file)
    // Read only the first 9 bytes to avoid loading large files into memory
    let sh_path = bin_dir.join(tool);
    let is_shell_script = async {
        use tokio::io::AsyncReadExt;
        let mut file = tokio::fs::File::open(&sh_path).await.ok()?;
        let mut buf = [0u8; 9]; // b"#!/bin/sh".len()
        let n = file.read(&mut buf).await.ok()?;
        Some(buf[..n].starts_with(b"#!/bin/sh"))
        // file handle dropped here before remove_file
    }
    .await;
    if is_shell_script == Some(true) {
        let _ = tokio::fs::remove_file(&sh_path).await;
    }
}

/// Create env files with PATH guard (prevents duplicate PATH entries).
///
/// Creates:
/// - `~/.vite-plus/env` (POSIX shell — bash/zsh) with `vp()` wrapper function
/// - `~/.vite-plus/env.fish` (fish shell) with `vp` wrapper function
/// - `~/.vite-plus/env.ps1` (PowerShell) with PATH setup + `vp` function
/// - `~/.vite-plus/bin/vp-use.cmd` (cmd.exe wrapper for `vp env use`)
async fn create_env_files(vite_plus_home: &vite_path::AbsolutePath) -> Result<(), Error> {
    let bin_path = vite_plus_home.join("bin");
    let completion_path = vite_plus_home.join("completion");

    // Use $HOME-relative path if install dir is under HOME (like rustup's ~/.cargo/env)
    // This makes the env file portable across sessions where HOME may differ
    let home_dir = vite_shared::EnvConfig::get().user_home;
    let to_ref = |path: &vite_path::AbsolutePath| -> String {
        home_dir
            .as_ref()
            .and_then(|h| path.as_path().strip_prefix(h).ok())
            .map(|s| {
                // Normalize to forward slashes for $HOME/... paths (POSIX-style)
                format!("$HOME/{}", s.display().to_string().replace('\\', "/"))
            })
            .unwrap_or_else(|| path.as_path().display().to_string())
    };
    let bin_path_ref = to_ref(&bin_path);

    // POSIX env file (bash/zsh)
    // When sourced multiple times, removes existing entry and re-prepends to front
    // Uses parameter expansion to split PATH around the bin entry in O(1) operations
    // Includes vp() shell function wrapper for `vp env use` (evals stdout)
    // Includes shell completion support
    let env_content = r#"#!/bin/sh
# Vite+ environment setup (https://viteplus.dev)
__vp_bin="__VP_BIN__"
case ":${PATH}:" in
    *":${__vp_bin}:"*)
        __vp_tmp=":${PATH}:"
        __vp_before="${__vp_tmp%%":${__vp_bin}:"*}"
        __vp_before="${__vp_before#:}"
        __vp_after="${__vp_tmp#*":${__vp_bin}:"}"
        __vp_after="${__vp_after%:}"
        export PATH="${__vp_bin}${__vp_before:+:${__vp_before}}${__vp_after:+:${__vp_after}}"
        unset __vp_tmp __vp_before __vp_after
        ;;
    *)
        export PATH="$__vp_bin:$PATH"
        ;;
esac
unset __vp_bin

# Shell function wrapper: intercepts `vp env use` to eval its stdout,
# which sets/unsets VITE_PLUS_NODE_VERSION in the current shell session.
vp() {
    if [ "$1" = "env" ] && [ "$2" = "use" ]; then
        case " $* " in *" -h "*|*" --help "*) command vp "$@"; return; esac
        __vp_out="$(VITE_PLUS_ENV_USE_EVAL_ENABLE=1 command vp "$@")" || return $?
        eval "$__vp_out"
    else
        command vp "$@"
    fi
}

# Shell completion for bash/zsh
# Source appropriate completion script based on current shell
# Only load completion in interactive shells with required builtins
if [ -n "$BASH_VERSION" ] && type complete >/dev/null 2>&1; then
    # Bash shell with completion support
    __vp_completion="__VP_COMPLETION_BASH__"
    if [ -f "$__vp_completion" ]; then
        . "$__vp_completion"
    fi
    unset __vp_completion
elif [ -n "$ZSH_VERSION" ] && type compdef >/dev/null 2>&1; then
    # Zsh shell with completion support
    __vp_completion="__VP_COMPLETION_ZSH__"
    if [ -f "$__vp_completion" ]; then
        . "$__vp_completion"
    fi
    unset __vp_completion
fi
"#
    .replace("__VP_BIN__", &bin_path_ref)
    .replace("__VP_COMPLETION_BASH__", &to_ref(&completion_path.join("vp.bash")))
    .replace("__VP_COMPLETION_ZSH__", &to_ref(&completion_path.join("_vp")));

    let env_file = vite_plus_home.join("env");
    tokio::fs::write(&env_file, env_content).await?;

    // Fish env file with vp wrapper function
    let env_fish_content = r#"# Vite+ environment setup (https://viteplus.dev)
set -l __vp_idx (contains -i -- __VP_BIN__ $PATH)
and set -e PATH[$__vp_idx]
set -gx PATH __VP_BIN__ $PATH

# Shell function wrapper: intercepts `vp env use` to eval its stdout,
# which sets/unsets VITE_PLUS_NODE_VERSION in the current shell session.
function vp
    if test (count $argv) -ge 2; and test "$argv[1]" = "env"; and test "$argv[2]" = "use"
        if contains -- -h $argv; or contains -- --help $argv
            command vp $argv; return
        end
        set -lx VITE_PLUS_ENV_USE_EVAL_ENABLE 1
        set -l __vp_out (command vp $argv); or return $status
        eval $__vp_out
    else
        command vp $argv
    end
end

# Shell completion for fish
if not set -q __vp_completion_sourced
    set -l __vp_completion "__VP_COMPLETION_FISH__"
    if test -f "$__vp_completion"
        source "$__vp_completion"
        set -g __vp_completion_sourced 1
    end
end
"#
    .replace("__VP_BIN__", &bin_path_ref)
    .replace("__VP_COMPLETION_FISH__", &to_ref(&completion_path.join("vp.fish")));

    let env_fish_file = vite_plus_home.join("env.fish");
    tokio::fs::write(&env_fish_file, env_fish_content).await?;

    // PowerShell env file
    let env_ps1_content = r#"# Vite+ environment setup (https://viteplus.dev)
$__vp_bin = "__VP_BIN_WIN__"
if ($env:Path -split ';' -notcontains $__vp_bin) {
    $env:Path = "$__vp_bin;$env:Path"
}

# Shell function wrapper: intercepts `vp env use` to eval its stdout,
# which sets/unsets VITE_PLUS_NODE_VERSION in the current shell session.
function vp {
    if ($args.Count -ge 2 -and $args[0] -eq "env" -and $args[1] -eq "use") {
        if ($args -contains "-h" -or $args -contains "--help") {
            & (Join-Path $__vp_bin "vp.exe") @args; return
        }
        $env:VITE_PLUS_ENV_USE_EVAL_ENABLE = "1"
        $output = & (Join-Path $__vp_bin "vp.exe") @args 2>&1 | ForEach-Object {
            if ($_ -is [System.Management.Automation.ErrorRecord]) { Write-Host $_.Exception.Message } else { $_ }
        }
        Remove-Item Env:VITE_PLUS_ENV_USE_EVAL_ENABLE -ErrorAction SilentlyContinue
        if ($LASTEXITCODE -eq 0 -and $output) {
            Invoke-Expression ($output -join "`n")
        }
    } else {
        & (Join-Path $__vp_bin "vp.exe") @args
    }
}

# Shell completion for PowerShell
$__vp_completion = "__VP_COMPLETION_PS1__"
if (Test-Path $__vp_completion) { . $__vp_completion }
"#;

    // For PowerShell, use the actual absolute path (not $HOME-relative)
    let bin_path_win = bin_path.as_path().display().to_string();
    let completion_ps1_win = completion_path.join("vp.ps1").as_path().display().to_string();
    let env_ps1_content = env_ps1_content
        .replace("__VP_BIN_WIN__", &bin_path_win)
        .replace("__VP_COMPLETION_PS1__", &completion_ps1_win);

    let env_ps1_file = vite_plus_home.join("env.ps1");
    tokio::fs::write(&env_ps1_file, env_ps1_content).await?;

    // cmd.exe wrapper for `vp env use` (cmd.exe cannot define shell functions)
    // Users run `vp-use 24` in cmd.exe instead of `vp env use 24`
    let vp_use_cmd_content = "@echo off\r\nset VITE_PLUS_ENV_USE_EVAL_ENABLE=1\r\nfor /f \"delims=\" %%i in ('%~dp0..\\current\\bin\\vp.exe env use %*') do %%i\r\nset VITE_PLUS_ENV_USE_EVAL_ENABLE=\r\n";

    // Only write if bin directory exists (it may not during --env-only)
    if tokio::fs::try_exists(&bin_path).await.unwrap_or(false) {
        let vp_use_cmd_file = bin_path.join("vp-use.cmd");
        tokio::fs::write(&vp_use_cmd_file, vp_use_cmd_content).await?;
    }

    Ok(())
}

/// Print instructions for adding bin directory to PATH.
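The PATH guard written into the POSIX env file removes any existing bin entry and re-prepends it to the front, so sourcing the file repeatedly never duplicates entries. The same invariant can be modeled in plain Rust for clarity. This is an illustrative sketch only; `prepend_unique` is a hypothetical helper, not part of the crate, and the real logic lives in the shell parameter expansions above.

```rust
/// Illustrative model of the env file's PATH guard: ensure `bin` is the first
/// entry of a colon-separated PATH without duplicating it (hypothetical helper).
fn prepend_unique(path: &str, bin: &str) -> String {
    // Drop any existing occurrence of `bin` (and empty segments) before re-prepending.
    let rest: Vec<&str> = path.split(':').filter(|p| *p != bin && !p.is_empty()).collect();
    if rest.is_empty() {
        bin.to_string()
    } else {
        format!("{bin}:{}", rest.join(":"))
    }
}

fn main() {
    // New entry: simple prepend.
    assert_eq!(
        prepend_unique("/usr/bin:/bin", "/h/.vite-plus/bin"),
        "/h/.vite-plus/bin:/usr/bin:/bin"
    );
    // Already present mid-PATH: moved to the front, not duplicated.
    assert_eq!(
        prepend_unique("/usr/bin:/h/.vite-plus/bin:/bin", "/h/.vite-plus/bin"),
        "/h/.vite-plus/bin:/usr/bin:/bin"
    );
    println!("ok");
}
```

Running the function twice with its own output as input is a fixed point, which mirrors why `test_create_env_files_is_idempotent` below can assert byte-identical files after a second write.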
fn print_path_instructions(bin_dir: &vite_path::AbsolutePath) {
    // Derive vite_plus_home from bin_dir (parent), using $HOME prefix for readability
    let home_path = bin_dir
        .parent()
        .map(|p| p.as_path().display().to_string())
        .unwrap_or_else(|| bin_dir.as_path().display().to_string());
    let home_path = if let Ok(home_dir) = std::env::var("HOME") {
        if let Some(suffix) = home_path.strip_prefix(&home_dir) {
            format!("$HOME{suffix}")
        } else {
            home_path
        }
    } else {
        home_path
    };

    println!("{}", help::render_heading("Next Steps"));
    println!("  Add to your shell profile (~/.zshrc, ~/.bashrc, etc.):");
    println!();
    println!("    . \"{home_path}/env\"");
    println!();
    println!("  For fish shell, add to ~/.config/fish/config.fish:");
    println!();
    println!("    source \"{home_path}/env.fish\"");
    println!();
    println!("  For PowerShell, add to your $PROFILE:");
    println!();
    println!("    . \"{home_path}/env.ps1\"");
    println!();
    println!("  For IDE support (VS Code, Cursor), ensure bin directory is in system PATH:");
    #[cfg(target_os = "macos")]
    {
        println!("  - macOS: Add to ~/.profile or use launchd");
    }
    #[cfg(target_os = "linux")]
    {
        println!("  - Linux: Add to ~/.profile for display manager integration");
    }
    #[cfg(target_os = "windows")]
    {
        println!("  - Windows: System Properties -> Environment Variables -> Path");
    }
    println!();
    println!(
        "  Restart your terminal and IDE, then run {} to verify.",
        accent_command("vp env doctor")
    );
}

#[cfg(test)]
mod tests {
    use tempfile::TempDir;
    use vite_path::AbsolutePathBuf;

    use super::*;

    /// Helper: create a test_guard with user_home set to the given path.
    fn home_guard(home: impl Into<std::path::PathBuf>) -> vite_shared::TestEnvGuard {
        vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig {
            user_home: Some(home.into()),
            ..vite_shared::EnvConfig::for_test()
        })
    }

    #[tokio::test]
    async fn test_create_env_files_creates_all_files() {
        let temp_dir = TempDir::new().unwrap();
        let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = home_guard(temp_dir.path());

        create_env_files(&home).await.unwrap();

        let env_path = home.join("env");
        let env_fish_path = home.join("env.fish");
        let env_ps1_path = home.join("env.ps1");
        assert!(env_path.as_path().exists(), "env file should be created");
        assert!(env_fish_path.as_path().exists(), "env.fish file should be created");
        assert!(env_ps1_path.as_path().exists(), "env.ps1 file should be created");
    }

    #[tokio::test]
    async fn test_create_env_files_replaces_placeholder_with_home_relative_path() {
        let temp_dir = TempDir::new().unwrap();
        let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = home_guard(temp_dir.path());

        create_env_files(&home).await.unwrap();

        let env_content = tokio::fs::read_to_string(home.join("env")).await.unwrap();
        let fish_content = tokio::fs::read_to_string(home.join("env.fish")).await.unwrap();

        // Placeholder should be fully replaced
        assert!(
            !env_content.contains("__VP_BIN__"),
            "env file should not contain __VP_BIN__ placeholder"
        );
        assert!(
            !fish_content.contains("__VP_BIN__"),
            "env.fish file should not contain __VP_BIN__ placeholder"
        );

        // Should use $HOME-relative path since install dir is under HOME
        assert!(
            env_content.contains("$HOME/bin"),
            "env file should reference $HOME/bin, got: {env_content}"
        );
        assert!(
            fish_content.contains("$HOME/bin"),
            "env.fish file should reference $HOME/bin, got: {fish_content}"
        );
    }

    #[tokio::test]
    async fn test_create_env_files_uses_absolute_path_when_not_under_home() {
        let temp_dir = TempDir::new().unwrap();
        let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        // Set user_home to a different path so install dir is NOT under HOME
        let _guard = home_guard("/nonexistent-home-dir");

        create_env_files(&home).await.unwrap();

        let env_content = tokio::fs::read_to_string(home.join("env")).await.unwrap();
        let fish_content = tokio::fs::read_to_string(home.join("env.fish")).await.unwrap();

        // Should use absolute path since install dir is not under HOME
        let expected_bin = home.join("bin");
        let expected_str = expected_bin.as_path().display().to_string();
        assert!(
            env_content.contains(&expected_str),
            "env file should use absolute path {expected_str}, got: {env_content}"
        );
        assert!(
            fish_content.contains(&expected_str),
            "env.fish file should use absolute path {expected_str}, got: {fish_content}"
        );

        // Should NOT use $HOME-relative path
        assert!(!env_content.contains("$HOME/bin"), "env file should not reference $HOME/bin");
    }

    #[tokio::test]
    async fn test_create_env_files_posix_contains_path_guard() {
        let temp_dir = TempDir::new().unwrap();
        let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = home_guard(temp_dir.path());

        create_env_files(&home).await.unwrap();

        let env_content = tokio::fs::read_to_string(home.join("env")).await.unwrap();

        // Verify PATH guard structure: case statement checks for duplicate
        assert!(
            env_content.contains("case \":${PATH}:\" in"),
            "env file should contain PATH guard case statement"
        );
        assert!(
            env_content.contains("*\":${__vp_bin}:\"*)"),
            "env file should check for existing bin in PATH"
        );
        // Verify it re-prepends to front when already present
        assert!(
            env_content.contains("export PATH=\"${__vp_bin}"),
            "env file should re-prepend bin to front of PATH"
        );
        // Verify simple prepend for new entry
        assert!(
            env_content.contains("export PATH=\"$__vp_bin:$PATH\""),
            "env file should prepend bin to PATH for new entry"
        );
    }

    #[tokio::test]
    async fn test_create_env_files_fish_contains_path_guard() {
        let temp_dir = TempDir::new().unwrap();
        let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = home_guard(temp_dir.path());

        create_env_files(&home).await.unwrap();

        let fish_content = tokio::fs::read_to_string(home.join("env.fish")).await.unwrap();

        // Verify fish PATH guard: remove existing entry before prepending
        assert!(
            fish_content.contains("contains -i --"),
            "env.fish should check for existing bin in PATH"
        );
        assert!(
            fish_content.contains("set -e PATH[$__vp_idx]"),
            "env.fish should remove existing entry"
        );
        assert!(fish_content.contains("set -gx PATH"), "env.fish should set PATH globally");
    }

    #[tokio::test]
    async fn test_create_env_files_is_idempotent() {
        let temp_dir = TempDir::new().unwrap();
        let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = home_guard(temp_dir.path());

        // Create env files twice
        create_env_files(&home).await.unwrap();
        let first_env = tokio::fs::read_to_string(home.join("env")).await.unwrap();
        let first_fish = tokio::fs::read_to_string(home.join("env.fish")).await.unwrap();
        let first_ps1 = tokio::fs::read_to_string(home.join("env.ps1")).await.unwrap();

        create_env_files(&home).await.unwrap();
        let second_env = tokio::fs::read_to_string(home.join("env")).await.unwrap();
        let second_fish = tokio::fs::read_to_string(home.join("env.fish")).await.unwrap();
        let second_ps1 = tokio::fs::read_to_string(home.join("env.ps1")).await.unwrap();

        assert_eq!(first_env, second_env, "env file should be identical after second write");
        assert_eq!(first_fish, second_fish, "env.fish file should be identical after second write");
        assert_eq!(first_ps1, second_ps1, "env.ps1 file should be identical after second write");
    }

    #[tokio::test]
    async fn test_create_env_files_posix_contains_vp_shell_function() {
        let temp_dir = TempDir::new().unwrap();
        let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let _guard = home_guard(temp_dir.path());

        create_env_files(&home).await.unwrap();

        let env_content =
tokio::fs::read_to_string(home.join("env")).await.unwrap(); // Verify vp() shell function wrapper is present assert!(env_content.contains("vp() {"), "env file should contain vp() shell function"); assert!( env_content.contains("\"$1\" = \"env\""), "env file should check for 'env' subcommand" ); assert!( env_content.contains("\"$2\" = \"use\""), "env file should check for 'use' subcommand" ); assert!(env_content.contains("eval \"$__vp_out\""), "env file should eval the output"); assert!( env_content.contains("command vp \"$@\""), "env file should use 'command vp' for passthrough" ); } #[tokio::test] async fn test_create_env_files_fish_contains_vp_function() { let temp_dir = TempDir::new().unwrap(); let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let _guard = home_guard(temp_dir.path()); create_env_files(&home).await.unwrap(); let fish_content = tokio::fs::read_to_string(home.join("env.fish")).await.unwrap(); // Verify fish vp function wrapper is present assert!(fish_content.contains("function vp"), "env.fish file should contain vp function"); assert!( fish_content.contains("\"$argv[1]\" = \"env\""), "env.fish should check for 'env' subcommand" ); assert!( fish_content.contains("\"$argv[2]\" = \"use\""), "env.fish should check for 'use' subcommand" ); assert!( fish_content.contains("command vp $argv"), "env.fish should use 'command vp' for passthrough" ); } #[tokio::test] async fn test_create_env_files_ps1_contains_vp_function() { let temp_dir = TempDir::new().unwrap(); let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let _guard = home_guard(temp_dir.path()); create_env_files(&home).await.unwrap(); let ps1_content = tokio::fs::read_to_string(home.join("env.ps1")).await.unwrap(); // Verify PowerShell function is present assert!(ps1_content.contains("function vp {"), "env.ps1 should contain vp function"); assert!(ps1_content.contains("Invoke-Expression"), "env.ps1 should use Invoke-Expression"); // Should not contain 
placeholders assert!( !ps1_content.contains("__VP_BIN_WIN__"), "env.ps1 should not contain __VP_BIN_WIN__ placeholder" ); } #[tokio::test] async fn test_execute_env_only_creates_home_dir_and_env_files() { let temp_dir = TempDir::new().unwrap(); let fresh_home = temp_dir.path().join("new-vite-plus"); // Directory does NOT exist yet — execute should create it let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig { vite_plus_home: Some(fresh_home.clone()), user_home: Some(temp_dir.path().to_path_buf()), ..vite_shared::EnvConfig::for_test() }); let status = execute(false, true).await.unwrap(); assert!(status.success(), "execute --env-only should succeed"); // Directory should now exist assert!(fresh_home.exists(), "VITE_PLUS_HOME directory should be created"); // Env files should be written assert!(fresh_home.join("env").exists(), "env file should be created"); assert!(fresh_home.join("env.fish").exists(), "env.fish file should be created"); assert!(fresh_home.join("env.ps1").exists(), "env.ps1 file should be created"); } #[tokio::test] async fn test_generate_completion_scripts_creates_all_files() { let temp_dir = TempDir::new().unwrap(); let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); generate_completion_scripts(&home).await.unwrap(); let completion_dir = home.join("completion"); // Verify all completion scripts are created let bash_completion = completion_dir.join("vp.bash"); let zsh_completion = completion_dir.join("_vp"); let fish_completion = completion_dir.join("vp.fish"); let ps1_completion = completion_dir.join("vp.ps1"); assert!(bash_completion.as_path().exists(), "bash completion (vp.bash) should be created"); assert!(zsh_completion.as_path().exists(), "zsh completion (_vp) should be created"); assert!(fish_completion.as_path().exists(), "fish completion (vp.fish) should be created"); assert!( ps1_completion.as_path().exists(), "PowerShell completion (vp.ps1) should be created" ); } #[tokio::test] async fn 
test_create_env_files_contains_completion() { let temp_dir = TempDir::new().unwrap(); let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let _guard = home_guard(temp_dir.path()); create_env_files(&home).await.unwrap(); let env_content = tokio::fs::read_to_string(home.join("env")).await.unwrap(); let fish_content = tokio::fs::read_to_string(home.join("env.fish")).await.unwrap(); let ps1_content = tokio::fs::read_to_string(home.join("env.ps1")).await.unwrap(); assert!( env_content.contains("Shell completion") && env_content.contains("/completion/vp.bash\""), "env file should contain bash completion" ); assert!( fish_content.contains("Shell completion") && fish_content.contains("/completion/vp.fish\""), "env.fish file should contain fish completion" ); assert!( ps1_content.contains("Shell completion") && ps1_content.contains(&format!( "{}completion{}vp.ps1\"", std::path::MAIN_SEPARATOR_STR, std::path::MAIN_SEPARATOR_STR )), "env.ps1 file should contain PowerShell completion" ); // Verify placeholders are replaced assert!( !env_content.contains("__VP_COMPLETION_BASH__") && !env_content.contains("__VP_COMPLETION_ZSH__"), "env file should not contain __VP_COMPLETION_* placeholders" ); assert!( !fish_content.contains("__VP_COMPLETION_FISH__"), "env.fish file should not contain __VP_COMPLETION_FISH__ placeholder" ); assert!( !ps1_content.contains("__VP_COMPLETION_PS1__"), "env.ps1 file should not contain __VP_COMPLETION_PS1__ placeholder" ); } } ================================================ FILE: crates/vite_global_cli/src/commands/env/unpin.rs ================================================ //! Unpin command - alias for `pin --unpin`. //! //! Handles `vp env unpin` to remove the `.node-version` file from the current directory. use std::process::ExitStatus; use vite_path::AbsolutePathBuf; use crate::error::Error; /// Execute the unpin command. 
pub async fn execute(cwd: AbsolutePathBuf) -> Result<ExitStatus, Error> { super::pin::do_unpin(&cwd).await } ================================================ FILE: crates/vite_global_cli/src/commands/env/use.rs ================================================ //! Implementation of `vp env use` command. //! //! Outputs shell-appropriate commands to stdout that set (or unset) //! the `VITE_PLUS_NODE_VERSION` environment variable. The shell function //! wrapper in `~/.vite-plus/env` evals this output to modify the current //! shell session. //! //! All user-facing status messages go to stderr so they don't interfere //! with the eval'd output. use std::process::ExitStatus; use vite_path::AbsolutePathBuf; use super::config::{self, VERSION_ENV_VAR}; use crate::error::Error; /// Detected shell type for output formatting. enum Shell { /// POSIX shell (bash, zsh, sh) Posix, /// Fish shell Fish, /// PowerShell PowerShell, /// Windows cmd.exe Cmd, } /// Detect the current shell from environment variables. fn detect_shell() -> Shell { let config = vite_shared::EnvConfig::get(); if config.fish_version.is_some() { Shell::Fish } else if cfg!(windows) && config.ps_module_path.is_some() { Shell::PowerShell } else if cfg!(windows) { Shell::Cmd } else { Shell::Posix } } /// Format a shell export command for the detected shell. fn format_export(shell: &Shell, value: &str) -> String { match shell { Shell::Posix => format!("export {VERSION_ENV_VAR}={value}"), Shell::Fish => format!("set -gx {VERSION_ENV_VAR} {value}"), Shell::PowerShell => format!("$env:{VERSION_ENV_VAR} = \"{value}\""), Shell::Cmd => format!("set {VERSION_ENV_VAR}={value}"), } } /// Format a shell unset command for the detected shell.
fn format_unset(shell: &Shell) -> String { match shell { Shell::Posix => format!("unset {VERSION_ENV_VAR}"), Shell::Fish => format!("set -e {VERSION_ENV_VAR}"), Shell::PowerShell => { format!("Remove-Item Env:{VERSION_ENV_VAR} -ErrorAction SilentlyContinue") } Shell::Cmd => format!("set {VERSION_ENV_VAR}="), } } /// Whether the shell eval wrapper is active. /// When true, the wrapper will eval our stdout to set env vars — no session file needed. /// When false (CI, direct invocation), we write a session file so shims can read it. fn has_eval_wrapper() -> bool { vite_shared::EnvConfig::get().env_use_eval_enable } /// Execute the `vp env use` command. pub async fn execute( cwd: AbsolutePathBuf, version: Option<vite_str::Str>, unset: bool, no_install: bool, silent_if_unchanged: bool, ) -> Result<ExitStatus, Error> { let shell = detect_shell(); // Handle --unset: remove session override if unset { if has_eval_wrapper() { println!("{}", format_unset(&shell)); } else { config::delete_session_version().await?; } eprintln!("Reverted to file-based Node.js version resolution"); return Ok(ExitStatus::default()); } let provider = vite_js_runtime::NodeProvider::new(); // Resolve version: explicit argument or from project files // When no argument provided, unset session override and resolve from project files let (resolved_version, source_desc) = if let Some(ref ver) = version { let resolved = config::resolve_version_alias(ver, &provider).await?; (resolved, format!("{ver}")) } else { // No version argument - unset session override first if has_eval_wrapper() { println!("{}", format_unset(&shell)); } else { config::delete_session_version().await?; } // Now resolve from project files (not from session override) let resolution = config::resolve_version_from_files(&cwd).await?; let source = resolution.source.clone(); (resolution.version, source) }; // Check if already active and suppress output if requested if silent_if_unchanged { let current_env = vite_shared::EnvConfig::get().node_version.map(|v| 
v.trim().to_string()); let current = if !has_eval_wrapper() { current_env.or(config::read_session_version().await) } else { current_env }; if current.as_deref() == Some(&resolved_version) { // Already active — idempotent, skip stderr status message if has_eval_wrapper() { println!("{}", format_export(&shell, &resolved_version)); } else { config::write_session_version(&resolved_version).await?; } return Ok(ExitStatus::default()); } } // Ensure version is installed (unless --no-install) if !no_install { let home_dir = vite_shared::get_vite_plus_home() .map_err(|e| Error::ConfigError(format!("{e}").into()))? .join("js_runtime") .join("node") .join(&resolved_version); #[cfg(windows)] let binary_path = home_dir.join("node.exe"); #[cfg(not(windows))] let binary_path = home_dir.join("bin").join("node"); if !binary_path.as_path().exists() { eprintln!("Installing Node.js v{}...", resolved_version); vite_js_runtime::download_runtime( vite_js_runtime::JsRuntimeType::Node, &resolved_version, ) .await?; } } if has_eval_wrapper() { // Output the shell command to stdout (consumed by shell wrapper's eval) println!("{}", format_export(&shell, &resolved_version)); } else { // No eval wrapper (CI or direct invocation) — write session file so shims can read it config::write_session_version(&resolved_version).await?; } // Status message to stderr (visible to user) eprintln!("Using Node.js v{} (resolved from {})", resolved_version, source_desc); Ok(ExitStatus::default()) } #[cfg(test)] mod tests { use super::*; #[test] fn test_detect_shell_posix_even_with_psmodulepath() { let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig { ps_module_path: Some("/some/path".into()), ..vite_shared::EnvConfig::for_test() }); let shell = detect_shell(); #[cfg(not(windows))] assert!(matches!(shell, Shell::Posix)); #[cfg(windows)] assert!(matches!(shell, Shell::PowerShell)); } #[test] fn test_detect_shell_fish() { let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig { 
fish_version: Some("3.7.0".into()), ..vite_shared::EnvConfig::for_test() }); let shell = detect_shell(); assert!(matches!(shell, Shell::Fish)); } #[test] fn test_detect_shell_posix_default() { // All shell detection fields None → defaults let _guard = vite_shared::EnvConfig::test_guard(vite_shared::EnvConfig::for_test()); let shell = detect_shell(); #[cfg(not(windows))] assert!(matches!(shell, Shell::Posix)); #[cfg(windows)] assert!(matches!(shell, Shell::Cmd)); } #[test] fn test_format_export_posix() { let result = format_export(&Shell::Posix, "20.18.0"); assert_eq!(result, "export VITE_PLUS_NODE_VERSION=20.18.0"); } #[test] fn test_format_export_fish() { let result = format_export(&Shell::Fish, "20.18.0"); assert_eq!(result, "set -gx VITE_PLUS_NODE_VERSION 20.18.0"); } #[test] fn test_format_export_powershell() { let result = format_export(&Shell::PowerShell, "20.18.0"); assert_eq!(result, "$env:VITE_PLUS_NODE_VERSION = \"20.18.0\""); } #[test] fn test_format_export_cmd() { let result = format_export(&Shell::Cmd, "20.18.0"); assert_eq!(result, "set VITE_PLUS_NODE_VERSION=20.18.0"); } #[test] fn test_format_unset_posix() { let result = format_unset(&Shell::Posix); assert_eq!(result, "unset VITE_PLUS_NODE_VERSION"); } #[test] fn test_format_unset_fish() { let result = format_unset(&Shell::Fish); assert_eq!(result, "set -e VITE_PLUS_NODE_VERSION"); } #[test] fn test_format_unset_powershell() { let result = format_unset(&Shell::PowerShell); assert_eq!(result, "Remove-Item Env:VITE_PLUS_NODE_VERSION -ErrorAction SilentlyContinue"); } #[test] fn test_format_unset_cmd() { let result = format_unset(&Shell::Cmd); assert_eq!(result, "set VITE_PLUS_NODE_VERSION="); } } ================================================ FILE: crates/vite_global_cli/src/commands/env/which.rs ================================================ //! Which command implementation. //! //! Shows the path to the tool binary that would be executed. //! //! 
For core tools (node, npm, npx), shows the resolved Node.js binary path //! along with version and resolution source. //! For global packages, shows the binary path plus package metadata. use std::process::ExitStatus; use chrono::Local; use owo_colors::OwoColorize; use vite_path::{AbsolutePath, AbsolutePathBuf}; use vite_shared::output; use super::{ config::{VERSION_ENV_VAR, get_node_modules_dir, get_packages_dir, resolve_version}, package_metadata::PackageMetadata, }; use crate::error::Error; /// Core tools (node, npm, npx) const CORE_TOOLS: &[&str] = &["node", "npm", "npx"]; /// Column width for left-side labels in aligned metadata output const LABEL_WIDTH: usize = 10; /// Execute the which command. pub async fn execute(cwd: AbsolutePathBuf, tool: &str) -> Result<ExitStatus, Error> { // Check if this is a core tool if CORE_TOOLS.contains(&tool) { return execute_core_tool(cwd, tool).await; } // Check if this is a global package binary if let Some(metadata) = PackageMetadata::find_by_binary(tool).await? { return execute_package_binary(tool, &metadata).await; } // Unknown tool output::error(&format!("tool '{}' not found", tool.bold())); eprintln!("Not a core tool (node, npm, npx) or installed global package."); eprintln!("Run 'vp list -g' to see installed packages."); Ok(exit_status(1)) } /// Execute which for a core tool (node, npm, npx). async fn execute_core_tool(cwd: AbsolutePathBuf, tool: &str) -> Result<ExitStatus, Error> { // Resolve version for current directory let resolution = resolve_version(&cwd).await?; // Get the tool path let home_dir = vite_shared::get_vite_plus_home()?
.join("js_runtime") .join("node") .join(&resolution.version); #[cfg(windows)] let tool_path = if tool == "node" { home_dir.join("node.exe") } else { home_dir.join(format!("{tool}.cmd")) }; #[cfg(not(windows))] let tool_path = home_dir.join("bin").join(tool); // Check if the tool exists if !tokio::fs::try_exists(&tool_path).await.unwrap_or(false) { output::error(&format!("{} not found", tool.bold())); eprintln!("Node.js {} is not installed.", resolution.version); eprintln!("Run 'vp env install {}' to install it.", resolution.version); return Ok(exit_status(1)); } // Print binary path (first line, uncolored, pipe-friendly) println!("{}", tool_path.as_path().display()); // Print metadata let source_display = format_source(&resolution.source, resolution.source_path.as_deref()); println!(" {:<LABEL_WIDTH$}{}", "version:", resolution.version); println!(" {:<LABEL_WIDTH$}{}", "source:", source_display); Ok(exit_status(0)) } /// Format the resolution source for display. fn format_source(source: &str, source_path: Option<&AbsolutePath>) -> String { match source { s if s == VERSION_ENV_VAR => format!("{s} (session)"), "lts" => "lts (fallback)".to_string(), _ => match source_path { Some(path) => path.as_path().display().to_string(), None => source.to_string(), }, } } /// Execute which for a global package binary.
async fn execute_package_binary( tool: &str, metadata: &PackageMetadata, ) -> Result<ExitStatus, Error> { // Locate the binary path let binary_path = locate_package_binary(&metadata.name, tool)?; // Check if binary exists if !tokio::fs::try_exists(&binary_path).await.unwrap_or(false) { output::error(&format!("binary '{}' not found", tool.bold())); eprintln!("Package {} may need to be reinstalled.", metadata.name); eprintln!("Run 'vp install -g {}' to reinstall.", metadata.name); return Ok(exit_status(1)); } // Format installation timestamp (date only) let installed_local = metadata.installed_at.with_timezone(&Local); let installed_str = installed_local.format("%Y-%m-%d").to_string(); // Print binary path (first line, uncolored, pipe-friendly) println!("{}", binary_path.as_path().display()); // Print metadata println!(" {:<LABEL_WIDTH$}{}", "package:", metadata.name); println!(" {:<LABEL_WIDTH$}{}", "installed:", installed_str); Ok(exit_status(0)) } /// Locate the binary path for a global package by reading the package.json `bin` field. fn locate_package_binary(package_name: &str, binary_name: &str) -> Result<AbsolutePathBuf, Error> { let packages_dir = get_packages_dir()?; let package_dir = packages_dir.join(package_name); // The binary is referenced in package.json's bin field // npm uses different layouts: Unix=lib/node_modules, Windows=node_modules let node_modules_dir = get_node_modules_dir(&package_dir, package_name); let package_json_path = node_modules_dir.join("package.json"); if !package_json_path.as_path().exists() { return Err(Error::ConfigError(format!("Package {} not found", package_name).into())); } // Read package.json to find the binary path let content = std::fs::read_to_string(package_json_path.as_path())?; let package_json: serde_json::Value = serde_json::from_str(&content) .map_err(|e| Error::ConfigError(format!("Failed to parse package.json: {e}").into()))?; let binary_path = match package_json.get("bin") { Some(serde_json::Value::String(path)) => { // Single binary - check if it matches the name let pkg_name = package_json["name"].as_str().unwrap_or(""); let expected_name = pkg_name.split('/').last().unwrap_or(pkg_name); if expected_name == binary_name { node_modules_dir.join(path) } else { return Err(Error::ConfigError( format!("Binary {} not found in package",
binary_name).into(), )); } } Some(serde_json::Value::Object(map)) => { // Multiple binaries - find the one we need if let Some(serde_json::Value::String(path)) = map.get(binary_name) { node_modules_dir.join(path) } else { return Err(Error::ConfigError( format!("Binary {} not found in package", binary_name).into(), )); } } _ => { return Err(Error::ConfigError( format!("No bin field in package.json for {}", package_name).into(), )); } }; Ok(binary_path) } /// Create an exit status with the given code. fn exit_status(code: i32) -> ExitStatus { #[cfg(unix)] { use std::os::unix::process::ExitStatusExt; ExitStatus::from_raw(code << 8) } #[cfg(windows)] { use std::os::windows::process::ExitStatusExt; ExitStatus::from_raw(code as u32) } } ================================================ FILE: crates/vite_global_cli/src/commands/implode.rs ================================================ //! `vp implode` — completely remove vp and all its data from this system. use std::{ io::{IsTerminal, Write}, process::ExitStatus, }; use directories::BaseDirs; use owo_colors::OwoColorize; use vite_path::{AbsolutePath, AbsolutePathBuf}; use vite_shared::output; use vite_str::Str; use crate::{cli::exit_status, error::Error}; /// All shell profile paths to check, with `is_snippet` flag. const SHELL_PROFILES: &[(&str, bool)] = &[ (".zshenv", false), (".zshrc", false), (".bash_profile", false), (".bashrc", false), (".profile", false), (".config/fish/conf.d/vite-plus.fish", true), ]; /// Abbreviate a path for display: replace `$HOME` prefix with `~`. fn abbreviate_home_path(path: &AbsolutePath, user_home: &AbsolutePath) -> Str { match path.strip_prefix(user_home) { Ok(Some(suffix)) => vite_str::format!("~/{suffix}"), _ => Str::from(path.to_string()), } } /// Comment marker written by the install script above the sourcing line. 
const VITE_PLUS_COMMENT: &str = "# Vite+ bin"; pub fn execute(yes: bool) -> Result<ExitStatus, Error> { let Ok(home_dir) = vite_shared::get_vite_plus_home() else { output::info("vite-plus is not installed (could not determine home directory)"); return Ok(exit_status(0)); }; if !home_dir.as_path().exists() { output::info("vite-plus is not installed (directory does not exist)"); return Ok(exit_status(0)); } // Resolve user home for shell profile paths let base_dirs = BaseDirs::new() .ok_or_else(|| Error::Other("Could not determine user home directory".into()))?; let user_home = AbsolutePathBuf::new(base_dirs.home_dir().to_path_buf()).unwrap(); // Collect shell profiles that contain Vite+ lines (content cached for cleaning) let affected_profiles = collect_affected_profiles(&user_home); // Confirmation if !yes && !confirm_implode(&home_dir, &affected_profiles)? { return Ok(exit_status(0)); } // Clean shell profiles using cached content (no re-read) clean_affected_profiles(&affected_profiles); // Remove Windows PATH entry #[cfg(windows)] { let bin_path = home_dir.join("bin"); if let Err(e) = remove_windows_path_entry(&bin_path) { output::warn(&vite_str::format!("Failed to clean Windows PATH: {e}")); } else { output::success("Removed vite-plus from Windows PATH"); } } // Remove the directory remove_vite_plus_dir(&home_dir)?; output::raw(""); output::success("vite-plus has been removed from your system."); output::note("Restart your terminal to apply shell changes."); Ok(exit_status(0)) } /// A shell profile that contains Vite+ sourcing lines. struct AffectedProfile { /// Display name (e.g. ".zshrc", ".config/fish/conf.d/vite-plus.fish"). name: Str, /// Absolute path to the file. path: AbsolutePathBuf, kind: AffectedProfileKind, } /// Indicates whether it's a snippet (remove the file) or a main profile (remove lines). enum AffectedProfileKind { /// A snippet; uninstalling is as easy as removing the file. Snippet, Main { /// File content read during detection (reused for cleaning).
content: Str, }, } /// Collect shell profiles that contain Vite+ sourcing lines. /// Content is cached so we don't need to re-read during cleaning. fn collect_affected_profiles(user_home: &AbsolutePathBuf) -> Vec<AffectedProfile> { let mut affected = Vec::new(); // Build full list of (display_name, path, is_snippet) from the base set let mut profiles: Vec<(Str, AbsolutePathBuf, bool)> = SHELL_PROFILES .iter() .map(|&(name, is_snippet)| { (vite_str::format!("~/{name}"), user_home.join(name), is_snippet) }) .collect(); // If ZDOTDIR is set and differs from $HOME, also check there. if let Ok(zdotdir) = std::env::var("ZDOTDIR") && let Some(zdotdir_path) = AbsolutePathBuf::new(zdotdir.into()) && zdotdir_path != *user_home { for name in [".zshenv", ".zshrc"] { let path = zdotdir_path.join(name); let display = abbreviate_home_path(&path, user_home); profiles.push((display, path, false)); } } // If XDG_CONFIG_HOME is set and differs from $HOME/.config, also check there. if let Ok(xdg_config) = std::env::var("XDG_CONFIG_HOME") && let Some(xdg_path) = AbsolutePathBuf::new(xdg_config.into()) && xdg_path != user_home.join(".config") { let path = xdg_path.join("fish/conf.d/vite-plus.fish"); let display = abbreviate_home_path(&path, user_home); profiles.push((display, path, true)); } for (name, path, is_snippet) in profiles { // For snippets, only check whether the file exists if is_snippet { if let Ok(true) = std::fs::exists(&path) { affected.push(AffectedProfile { name, path, kind: AffectedProfileKind::Snippet }) } continue; } // Read directly — if the file doesn't exist, read_to_string returns Err // which .ok().filter() handles gracefully (no redundant exists() check). if let Some(content) = std::fs::read_to_string(&path).ok().filter(|c| has_vite_plus_lines(c)) { affected.push(AffectedProfile { name, path, kind: AffectedProfileKind::Main { content: Str::from(content) }, }); } } affected } /// Show confirmation prompt and require the user to type "uninstall".
/// Returns `Ok(true)` if confirmed, `Ok(false)` if aborted. fn confirm_implode( home_dir: &AbsolutePathBuf, affected_profiles: &[AffectedProfile], ) -> Result<bool, Error> { if !std::io::stdin().is_terminal() { return Err(Error::UserMessage( "Cannot prompt for confirmation: stdin is not a TTY. Use --yes to skip confirmation." .into(), )); } output::warn("This will completely remove vite-plus from your system!"); output::raw(""); output::raw(&vite_str::format!(" Directory: {}", home_dir.as_path().display())); if !affected_profiles.is_empty() { output::raw(" Shell profiles to clean:"); for profile in affected_profiles { output::raw(&vite_str::format!(" - {}", profile.name)); } } output::raw(""); output::raw(&vite_str::format!("Type {} to confirm:", "uninstall".bold())); // String is needed here for read_line #[expect(clippy::disallowed_types)] let mut input = String::new(); std::io::stdout().flush()?; std::io::stdin().read_line(&mut input)?; if input.trim() != "uninstall" { output::info("Aborted."); return Ok(false); } Ok(true) } /// Clean all affected shell profiles using cached content (no re-read). fn clean_affected_profiles(affected_profiles: &[AffectedProfile]) { for profile in affected_profiles { match &profile.kind { AffectedProfileKind::Main { content } => { let cleaned = remove_vite_plus_lines(content); match std::fs::write(&profile.path, cleaned.as_bytes()) { Ok(()) => output::success(&vite_str::format!("Cleaned {}", profile.name)), Err(e) => { output::warn(&vite_str::format!("Failed to clean {}: {e}", profile.name)); } } } AffectedProfileKind::Snippet => match std::fs::remove_file(&profile.path) { Ok(()) => output::success(&vite_str::format!("Removed {}", profile.name)), Err(e) => { output::warn(&vite_str::format!("Failed to remove {}: {e}", profile.name)); } }, } } } /// Remove the ~/.vite-plus directory.
fn remove_vite_plus_dir(home_dir: &AbsolutePathBuf) -> Result<(), Error> { #[cfg(unix)] { match std::fs::remove_dir_all(home_dir) { Ok(()) => { output::success(&vite_str::format!("Removed {}", home_dir.as_path().display())); Ok(()) } Err(e) => { output::error(&vite_str::format!( "Failed to remove {}: {e}", home_dir.as_path().display() )); Err(Error::CommandExecution(e)) } } } #[cfg(windows)] { // On Windows, the running `vp` binary is always locked, so direct // removal will fail. Rename the directory first so the original path // is immediately free for reinstall, then schedule deletion of the // renamed directory via a detached process. let trash_path = home_dir.as_path().with_extension(vite_str::format!("removing-{}", std::process::id())); if let Err(e) = std::fs::rename(home_dir, &trash_path) { output::error(&vite_str::format!( "Failed to rename {} for removal: {e}", home_dir.as_path().display() )); return Err(Error::CommandExecution(e)); } match spawn_deferred_delete(&trash_path) { Ok(_) => { output::success(&vite_str::format!( "Scheduled removal of {} (will complete shortly)", home_dir.as_path().display() )); } Err(e) => { output::error(&vite_str::format!( "Failed to schedule removal of {}: {e}", home_dir.as_path().display() )); return Err(Error::CommandExecution(e)); } } Ok(()) } } /// Build a `cmd.exe` script that retries `rmdir /S /Q` up to 10 times with /// 1-second pauses, exiting as soon as the directory is gone. #[cfg(windows)] fn build_deferred_delete_script(trash_path: &std::path::Path) -> Str { let p = trash_path.to_string_lossy(); vite_str::format!( "for /L %i in (1,1,10) do @(\ if not exist \"{p}\" exit /B 0 & \ rmdir /S /Q \"{p}\" 2>NUL & \ if not exist \"{p}\" exit /B 0 & \ timeout /T 1 /NOBREAK >NUL\ )" ) } /// Spawn a detached `cmd.exe` process that retries deletion of `trash_path`. 
#[cfg(windows)] fn spawn_deferred_delete(trash_path: &std::path::Path) -> std::io::Result<std::process::Child> { let script = build_deferred_delete_script(trash_path); std::process::Command::new("cmd.exe") .args(["/C", &script]) .stdin(std::process::Stdio::null()) .stdout(std::process::Stdio::null()) .stderr(std::process::Stdio::null()) .spawn() } /// Check if file content contains Vite+ sourcing lines. fn has_vite_plus_lines(content: &str) -> bool { let pattern = ".vite-plus/env\""; content.lines().any(|line| line.contains(pattern)) } /// Remove Vite+ lines from content, returning the cleaned string. fn remove_vite_plus_lines(content: &str) -> Str { let pattern = ".vite-plus/env\""; let lines: Vec<&str> = content.lines().collect(); let mut remove_indices = Vec::new(); for (i, line) in lines.iter().enumerate() { if line.contains(pattern) { remove_indices.push(i); // Also remove the comment line above if i > 0 && lines[i - 1].contains(VITE_PLUS_COMMENT) { remove_indices.push(i - 1); // Also remove the blank line before the comment if i > 1 && lines[i - 2].trim().is_empty() { remove_indices.push(i - 2); } } } } if remove_indices.is_empty() { return Str::from(content); } #[expect(clippy::disallowed_types)] let mut result = String::with_capacity(content.len()); for (i, line) in lines.iter().enumerate() { if !remove_indices.contains(&i) { result.push_str(line); result.push('\n'); } } // Preserve trailing newline behavior of original if !content.ends_with('\n') && result.ends_with('\n') { result.pop(); } Str::from(result) } /// Remove `.vite-plus\bin` from the Windows User PATH via PowerShell.
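Because `remove_vite_plus_lines` above rebuilds the file from `lines()` (which discards whether the input ended in a newline), it has to restore the original trailing-newline behavior explicitly. A reduced sketch of just that rebuild step (the `rebuild` helper is illustrative, not part of the crate):

```rust
// Reduced model of the rebuild loop in remove_vite_plus_lines: lines()
// never yields a trailing empty string, so the loop always emits a final
// '\n' and a missing trailing newline in the input must be restored.
fn rebuild(content: &str, keep: impl Fn(&str) -> bool) -> String {
    let mut result = String::with_capacity(content.len());
    for line in content.lines().filter(|l| keep(l)) {
        result.push_str(line);
        result.push('\n');
    }
    // Preserve trailing-newline behavior of the original input.
    if !content.ends_with('\n') && result.ends_with('\n') {
        result.pop();
    }
    result
}

fn main() {
    // Input with a trailing newline keeps it...
    assert_eq!(rebuild("a\nb\n", |l| l != "b"), "a\n");
    // ...and input without one stays without one.
    assert_eq!(rebuild("a\nb", |l| l != "b"), "a");
}
```

Without the final `pop()`, cleaning a profile that lacked a trailing newline would introduce one, breaking the byte-for-byte "only Vite+ lines removed" guarantee the tests below rely on.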
#[cfg(windows)] fn remove_windows_path_entry(bin_path: &vite_path::AbsolutePath) -> std::io::Result<()> { let bin_str = bin_path.as_path().to_string_lossy(); let script = vite_str::format!( "[Environment]::SetEnvironmentVariable('Path', \ ([Environment]::GetEnvironmentVariable('Path', 'User') -split ';' | \ Where-Object {{ $_ -ne '{bin_str}' }}) -join ';', 'User')" ); let status = std::process::Command::new("powershell") .args(["-NoProfile", "-Command", &script]) .status()?; if status.success() { Ok(()) } else { Err(std::io::Error::new(std::io::ErrorKind::Other, "PowerShell command failed")) } } #[cfg(test)] mod tests { #[cfg(not(windows))] use serial_test::serial; use super::*; #[test] fn test_remove_vite_plus_lines_posix() { let content = "# existing config\nexport FOO=bar\n\n# Vite+ bin (https://viteplus.dev)\n. \"$HOME/.vite-plus/env\"\n"; let result = remove_vite_plus_lines(content); assert_eq!(&*result, "# existing config\nexport FOO=bar\n"); } #[test] fn test_remove_vite_plus_lines_no_match() { let content = "# just a normal config\nexport PATH=/usr/bin\n"; let result = remove_vite_plus_lines(content); assert_eq!(&*result, content); } #[test] fn test_remove_vite_plus_lines_absolute_path() { let content = "# existing\n. \"/home/user/.vite-plus/env\"\n"; let result = remove_vite_plus_lines(content); assert_eq!(&*result, "# existing\n"); } #[test] fn test_remove_vite_plus_lines_preserves_surrounding() { let content = "# before\nexport A=1\n\n# Vite+ bin (https://viteplus.dev)\n. \"$HOME/.vite-plus/env\"\n# after\nexport B=2\n"; let result = remove_vite_plus_lines(content); assert_eq!(&*result, "# before\nexport A=1\n# after\nexport B=2\n"); } #[test] fn test_clean_affected_profiles_integration() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let profile_path = temp_path.join(".zshrc"); let original = "# my config\nexport FOO=bar\n\n# Vite+ bin (https://viteplus.dev)\n. 
\"$HOME/.vite-plus/env\"\n"; std::fs::write(&profile_path, original).unwrap(); let profiles = vec![AffectedProfile { name: Str::from(".zshrc"), path: profile_path.clone(), kind: AffectedProfileKind::Main { content: Str::from(original) }, }]; clean_affected_profiles(&profiles); let result = std::fs::read_to_string(&profile_path).unwrap(); assert_eq!(result, "# my config\nexport FOO=bar\n"); assert!(!result.contains(".vite-plus/env")); } #[test] fn test_remove_vite_plus_dir_success() { let temp_dir = tempfile::tempdir().unwrap(); let dir = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let target = dir.join("to-remove"); std::fs::create_dir_all(&target).unwrap(); std::fs::write(target.join("file.txt"), "data").unwrap(); let result = remove_vite_plus_dir(&target); assert!(result.is_ok()); assert!(!target.as_path().exists()); } #[test] fn test_remove_vite_plus_dir_nonexistent() { let temp_dir = tempfile::tempdir().unwrap(); let dir = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let target = dir.join("does-not-exist"); let result = remove_vite_plus_dir(&target); assert!(result.is_err()); } #[test] #[cfg(windows)] fn test_build_deferred_delete_script() { let path = std::path::Path::new(r"C:\Users\test\.vite-plus.removing-1234"); let script = build_deferred_delete_script(path); assert!(script.contains("rmdir /S /Q")); assert!(script.contains(r"C:\Users\test\.vite-plus.removing-1234")); assert!(script.contains("for /L %i in (1,1,10)")); assert!(script.contains("timeout /T 1 /NOBREAK")); } #[test] #[cfg(not(windows))] fn test_abbreviate_home_path() { let home = AbsolutePathBuf::new("/home/user".into()).unwrap(); // Under home → ~/... 
let under = AbsolutePathBuf::new("/home/user/.zshrc".into()).unwrap(); assert_eq!(&*abbreviate_home_path(&under, &home), "~/.zshrc"); // Outside home → absolute path as-is let outside = AbsolutePathBuf::new("/opt/zdotdir/.zshenv".into()).unwrap(); assert_eq!(&*abbreviate_home_path(&outside, &home), "/opt/zdotdir/.zshenv"); } #[test] #[serial] #[cfg(not(windows))] fn test_collect_affected_profiles() { let temp_dir = tempfile::tempdir().unwrap(); let home = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); // Clear ZDOTDIR/XDG_CONFIG_HOME so the test environment doesn't affect results let _guard = ProfileEnvGuard::new(None, None); // Main profile with vite-plus line std::fs::write(home.join(".zshrc"), ". \"$HOME/.vite-plus/env\"\n").unwrap(); // Unrelated profile (should be ignored) std::fs::write(home.join(".bashrc"), "export PATH=/usr/bin\n").unwrap(); // Snippet file (just needs to exist) let fish_dir = home.join(".config/fish/conf.d"); std::fs::create_dir_all(&fish_dir).unwrap(); std::fs::write(fish_dir.join("vite-plus.fish"), "source ~/.vite-plus/env.fish\n").unwrap(); let profiles = collect_affected_profiles(&home); assert_eq!(profiles.len(), 2); assert!(matches!(&profiles[0].kind, AffectedProfileKind::Main { .. })); assert!(matches!(&profiles[1].kind, AffectedProfileKind::Snippet)); } /// Guard that saves and restores ZDOTDIR and XDG_CONFIG_HOME env vars. 
#[cfg(not(windows))]
struct ProfileEnvGuard {
    original_zdotdir: Option<std::ffi::OsString>,
    original_xdg_config: Option<std::ffi::OsString>,
}

#[cfg(not(windows))]
impl ProfileEnvGuard {
    fn new(zdotdir: Option<&std::path::Path>, xdg_config: Option<&std::path::Path>) -> Self {
        let guard = Self {
            original_zdotdir: std::env::var_os("ZDOTDIR"),
            original_xdg_config: std::env::var_os("XDG_CONFIG_HOME"),
        };
        unsafe {
            match zdotdir {
                Some(v) => std::env::set_var("ZDOTDIR", v),
                None => std::env::remove_var("ZDOTDIR"),
            }
            match xdg_config {
                Some(v) => std::env::set_var("XDG_CONFIG_HOME", v),
                None => std::env::remove_var("XDG_CONFIG_HOME"),
            }
        }
        guard
    }
}

#[cfg(not(windows))]
impl Drop for ProfileEnvGuard {
    fn drop(&mut self) {
        unsafe {
            match &self.original_zdotdir {
                Some(v) => std::env::set_var("ZDOTDIR", v),
                None => std::env::remove_var("ZDOTDIR"),
            }
            match &self.original_xdg_config {
                Some(v) => std::env::set_var("XDG_CONFIG_HOME", v),
                None => std::env::remove_var("XDG_CONFIG_HOME"),
            }
        }
    }
}

#[test]
#[serial]
#[cfg(not(windows))]
fn test_collect_affected_profiles_zdotdir() {
    let temp_dir = tempfile::tempdir().unwrap();
    let home = AbsolutePathBuf::new(temp_dir.path().join("home")).unwrap();
    let zdotdir = temp_dir.path().join("zdotdir");
    std::fs::create_dir_all(&home).unwrap();
    std::fs::create_dir_all(&zdotdir).unwrap();
    std::fs::write(zdotdir.join(".zshenv"), ". \"$HOME/.vite-plus/env\"\n").unwrap();
    let _guard = ProfileEnvGuard::new(Some(&zdotdir), None);
    let profiles = collect_affected_profiles(&home);
    let zdotdir_profiles: Vec<_> =
        profiles.iter().filter(|p| p.path.as_path().starts_with(&zdotdir)).collect();
    assert_eq!(zdotdir_profiles.len(), 1);
    assert!(matches!(&zdotdir_profiles[0].kind, AffectedProfileKind::Main { ..
})); } #[test] #[serial] #[cfg(not(windows))] fn test_collect_affected_profiles_xdg_config() { let temp_dir = tempfile::tempdir().unwrap(); let home = AbsolutePathBuf::new(temp_dir.path().join("home")).unwrap(); let xdg_config = temp_dir.path().join("xdg_config"); let fish_dir = xdg_config.join("fish/conf.d"); std::fs::create_dir_all(&home).unwrap(); std::fs::create_dir_all(&fish_dir).unwrap(); std::fs::write(fish_dir.join("vite-plus.fish"), "").unwrap(); let _guard = ProfileEnvGuard::new(None, Some(&xdg_config)); let profiles = collect_affected_profiles(&home); let xdg_profiles: Vec<_> = profiles.iter().filter(|p| p.path.as_path().starts_with(&xdg_config)).collect(); assert_eq!(xdg_profiles.len(), 1); assert!(matches!(&xdg_profiles[0].kind, AffectedProfileKind::Snippet)); } #[test] fn test_execute_not_installed() { let temp_dir = tempfile::tempdir().unwrap(); let non_existent = temp_dir.path().join("does-not-exist"); // Use thread-local test guard instead of mutating process-global env let _guard = vite_shared::EnvConfig::test_guard( vite_shared::EnvConfig::for_test_with_home(&non_existent), ); let result = execute(true); assert!(result.is_ok()); assert!(result.unwrap().success()); } } ================================================ FILE: crates/vite_global_cli/src/commands/install.rs ================================================ use std::process::ExitStatus; use vite_install::{PackageManager, commands::install::InstallCommandOptions}; use vite_path::AbsolutePathBuf; use super::prepend_js_runtime_to_path_env; use crate::error::Error; /// Install command. 
pub struct InstallCommand {
    cwd: AbsolutePathBuf,
}

impl InstallCommand {
    pub fn new(cwd: AbsolutePathBuf) -> Self {
        Self { cwd }
    }

    pub async fn execute(
        self,
        options: &InstallCommandOptions<'_>,
    ) -> Result<ExitStatus, Error> {
        prepend_js_runtime_to_path_env(&self.cwd).await?;
        super::ensure_package_json(&self.cwd).await?;
        let package_manager = PackageManager::builder(&self.cwd).build_with_default().await?;
        Ok(package_manager.run_install_command(options, &self.cwd).await?)
    }
}

#[cfg(test)]
mod tests {
    use std::{fs, path::PathBuf};

    use tempfile::TempDir;

    use super::*;

    #[test]
    fn test_install_command_new() {
        let workspace_root = AbsolutePathBuf::new(PathBuf::from(if cfg!(windows) {
            "C:\\test\\workspace"
        } else {
            "/test/workspace"
        }))
        .unwrap();
        let command = InstallCommand::new(workspace_root.clone());
        assert_eq!(command.cwd, workspace_root);
    }

    #[ignore = "skip this test for auto run, should be run manually, because it will prompt for user selection"]
    #[tokio::test]
    async fn test_install_command_with_package_json_without_package_manager() {
        let temp_dir = TempDir::new().unwrap();
        let workspace_root = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        // Create a minimal package.json
        let package_json = r#"{ "name": "test-package", "version": "1.0.0" }"#;
        fs::write(workspace_root.join("package.json"), package_json).unwrap();
        let command = InstallCommand::new(workspace_root);
        assert!(command.execute(&InstallCommandOptions::default()).await.is_ok());
    }

    #[tokio::test]
    #[serial_test::serial]
    async fn test_install_command_with_package_json_with_package_manager() {
        let temp_dir = TempDir::new().unwrap();
        let workspace_root = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        // Create a minimal package.json
        let package_json = r#"{ "name": "test-package", "version": "1.0.0", "packageManager": "pnpm@10.15.0" }"#;
        fs::write(workspace_root.join("package.json"), package_json).unwrap();
        let command = InstallCommand::new(workspace_root);
        let result =
command.execute(&InstallCommandOptions::default()).await; println!("result: {result:?}"); assert!(result.is_ok()); } #[tokio::test] async fn test_ensure_package_json_creates_when_missing() { let temp_dir = TempDir::new().unwrap(); let dir_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let package_json_path = dir_path.join("package.json"); // Verify no package.json exists assert!(!package_json_path.as_path().exists()); // Call ensure_package_json crate::commands::ensure_package_json(&dir_path).await.unwrap(); // Verify package.json was created with correct content let content = fs::read_to_string(&package_json_path).unwrap(); let parsed: serde_json::Value = serde_json::from_str(&content).unwrap(); assert_eq!(parsed["type"], "module"); } #[tokio::test] async fn test_ensure_package_json_does_not_overwrite_existing() { let temp_dir = TempDir::new().unwrap(); let dir_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let package_json_path = dir_path.join("package.json"); // Create an existing package.json let existing_content = r#"{"name": "existing-package"}"#; fs::write(&package_json_path, existing_content).unwrap(); // Call ensure_package_json crate::commands::ensure_package_json(&dir_path).await.unwrap(); // Verify existing package.json was NOT overwritten let content = fs::read_to_string(&package_json_path).unwrap(); assert_eq!(content, existing_content); } #[tokio::test] async fn test_install_command_execute_with_invalid_workspace() { let temp_dir = TempDir::new().unwrap(); let workspace_root = AbsolutePathBuf::new(temp_dir.path().join("nonexistent")).unwrap(); let command = InstallCommand::new(workspace_root); let result = command.execute(&InstallCommandOptions::default()).await; assert!(result.is_err()); } } ================================================ FILE: crates/vite_global_cli/src/commands/link.rs ================================================ use std::process::ExitStatus; use 
vite_install::commands::link::LinkCommandOptions;
use vite_path::AbsolutePathBuf;

use super::{build_package_manager, prepend_js_runtime_to_path_env};
use crate::error::Error;

/// Link command for local package development.
///
/// This command automatically detects the package manager and translates
/// the link command to the appropriate package manager-specific syntax.
pub struct LinkCommand {
    cwd: AbsolutePathBuf,
}

impl LinkCommand {
    pub fn new(cwd: AbsolutePathBuf) -> Self {
        Self { cwd }
    }

    pub async fn execute(
        self,
        package: Option<&str>,
        pass_through_args: Option<&[String]>,
    ) -> Result<ExitStatus, Error> {
        prepend_js_runtime_to_path_env(&self.cwd).await?;
        let package_manager = build_package_manager(&self.cwd).await?;
        let link_command_options = LinkCommandOptions { package, pass_through_args };
        Ok(package_manager.run_link_command(&link_command_options, &self.cwd).await?)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_link_command_new() {
        let workspace_root = if cfg!(windows) {
            AbsolutePathBuf::new("C:\\test".into()).unwrap()
        } else {
            AbsolutePathBuf::new("/test".into()).unwrap()
        };
        let cmd = LinkCommand::new(workspace_root.clone());
        assert_eq!(cmd.cwd, workspace_root);
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/migrate.rs
================================================

//! Migration command (Category B: JavaScript Command).

use std::process::ExitStatus;

use vite_path::AbsolutePathBuf;

use crate::error::Error;

/// Execute the `migrate` command by delegating to local or global vite-plus.
pub async fn execute(cwd: AbsolutePathBuf, args: &[String]) -> Result<ExitStatus, Error> {
    super::delegate::execute(cwd, "migrate", args).await
}

#[cfg(test)]
mod tests {
    #[test]
    fn test_migrate_command_module_exists() {
        // Basic test to ensure the module compiles
        assert!(true);
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/mod.rs
================================================

//!
Command implementations for the global CLI.
//!
//! Commands are organized by category:
//!
//! Category A - Package manager commands:
//! - `add`: Add packages to dependencies
//! - `install`: Install all dependencies
//! - `remove`: Remove packages from dependencies
//! - `update`: Update packages to their latest versions
//! - `dedupe`: Deduplicate dependencies
//! - `outdated`: Check for outdated packages
//! - `why`: Show why a package is installed
//! - `link`: Link packages for local development
//! - `unlink`: Unlink packages
//! - `dlx`: Execute a package binary without installing it
//! - `pm`: Forward commands to the package manager
//!
//! Category B - JS Script Commands:
//! - `create`: Project scaffolding
//! - `migrate`: Migration command
//! - `version`: Version display
//!
//! Category C - Local CLI Delegation:
//! - `delegate`: Local CLI delegation

use std::{collections::HashMap, io::BufReader};

use vite_install::package_manager::PackageManager;
use vite_path::AbsolutePath;
use vite_shared::{PrependOptions, prepend_to_path_env};

use crate::{error::Error, js_executor::JsExecutor};

#[derive(serde::Deserialize, Default)]
#[serde(rename_all = "camelCase")]
struct DepCheckPackageJson {
    #[serde(default)]
    dependencies: HashMap<String, String>,
    #[serde(default)]
    dev_dependencies: HashMap<String, String>,
}

/// Check if vite-plus is listed in the nearest package.json's
/// dependencies or devDependencies.
///
/// Returns `true` if vite-plus is found, `false` if not found
/// or if no package.json exists.
pub fn has_vite_plus_dependency(cwd: &AbsolutePath) -> bool { let mut current = cwd; loop { let package_json_path = current.join("package.json"); if package_json_path.as_path().exists() { if let Ok(file) = std::fs::File::open(&package_json_path) { if let Ok(pkg) = serde_json::from_reader::<_, DepCheckPackageJson>(BufReader::new(file)) { return pkg.dependencies.contains_key("vite-plus") || pkg.dev_dependencies.contains_key("vite-plus"); } } return false; // Found package.json but couldn't parse deps → treat as no dependency } match current.parent() { Some(parent) if parent != current => current = parent, _ => return false, // Reached filesystem root } } } /// Ensure a package.json exists in the given directory. /// If it doesn't exist, create a minimal one with `{ "type": "module" }`. pub async fn ensure_package_json(project_path: &AbsolutePath) -> Result<(), Error> { let package_json_path = project_path.join("package.json"); if !package_json_path.as_path().exists() { let content = serde_json::to_string_pretty(&serde_json::json!({ "type": "module" }))?; tokio::fs::write(&package_json_path, format!("{content}\n")).await?; tracing::info!("Created package.json in {:?}", project_path); } Ok(()) } /// Ensure the JS runtime is downloaded and prepend its bin directory to PATH. /// This should be called before executing any package manager command. /// /// If `project_path` contains a package.json, uses the project's runtime /// (based on devEngines.runtime). Otherwise, falls back to the CLI's runtime. pub async fn prepend_js_runtime_to_path_env(project_path: &AbsolutePath) -> Result<(), Error> { let mut executor = JsExecutor::new(None); // Use project runtime if package.json exists, otherwise use CLI runtime let package_json_path = project_path.join("package.json"); let runtime = if package_json_path.as_path().exists() { executor.ensure_project_runtime(project_path).await? } else { executor.ensure_cli_runtime().await? 
};
    let node_bin_prefix = runtime.get_bin_prefix();
    // Use dedupe_anywhere=true to check if node bin already exists anywhere in PATH
    let options = PrependOptions { dedupe_anywhere: true };
    if prepend_to_path_env(&node_bin_prefix, options) {
        tracing::debug!("Set PATH to include {:?}", node_bin_prefix);
    }
    Ok(())
}

/// Build a PackageManager, converting PackageJsonNotFound into a friendly error message.
pub async fn build_package_manager(cwd: &AbsolutePath) -> Result<PackageManager, Error> {
    match PackageManager::builder(cwd).build_with_default().await {
        Ok(pm) => Ok(pm),
        Err(vite_error::Error::WorkspaceError(vite_workspace::Error::PackageJsonNotFound(_))) => {
            Err(Error::UserMessage("No package.json found.".into()))
        }
        Err(e) => Err(e.into()),
    }
}

// Category A: Package manager commands
pub mod add;
pub mod dedupe;
pub mod dlx;
pub mod install;
pub mod link;
pub mod outdated;
pub mod pm;
pub mod remove;
pub mod unlink;
pub mod update;
pub mod why;

// Category B: JS Script Commands
pub mod config;
pub mod create;
pub mod migrate;
pub mod staged;
pub mod version;

// Category D: Environment Management
pub mod env;

// Standalone binary command
pub mod vpx;

// Self-Management
pub mod implode;
pub mod upgrade;

// Category C: Local CLI Delegation
pub mod delegate;
pub mod run_or_delegate;

// Re-export command structs for convenient access
pub use add::AddCommand;
pub use dedupe::DedupeCommand;
pub use dlx::DlxCommand;
pub use install::InstallCommand;
pub use link::LinkCommand;
pub use outdated::OutdatedCommand;
pub use remove::RemoveCommand;
pub use unlink::UnlinkCommand;
pub use update::UpdateCommand;
pub use why::WhyCommand;

#[cfg(test)]
mod tests {
    use vite_path::AbsolutePathBuf;

    use super::*;

    #[test]
    fn test_has_vite_plus_in_dev_dependencies() {
        let temp_dir = tempfile::tempdir().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        std::fs::write(
            temp_path.join("package.json"),
            r#"{ "devDependencies": { "vite-plus": "^1.0.0" } }"#,
        )
        .unwrap();
assert!(has_vite_plus_dependency(&temp_path)); } #[test] fn test_has_vite_plus_in_dependencies() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); std::fs::write( temp_path.join("package.json"), r#"{ "dependencies": { "vite-plus": "^1.0.0" } }"#, ) .unwrap(); assert!(has_vite_plus_dependency(&temp_path)); } #[test] fn test_no_vite_plus_dependency() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); std::fs::write( temp_path.join("package.json"), r#"{ "devDependencies": { "vite": "^6.0.0" } }"#, ) .unwrap(); assert!(!has_vite_plus_dependency(&temp_path)); } #[test] fn test_no_package_json() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); assert!(!has_vite_plus_dependency(&temp_path)); } #[test] fn test_nested_directory_walks_up() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); std::fs::write( temp_path.join("package.json"), r#"{ "devDependencies": { "vite-plus": "^1.0.0" } }"#, ) .unwrap(); let child_dir = temp_path.join("child"); std::fs::create_dir(&child_dir).unwrap(); let child_path = AbsolutePathBuf::new(child_dir.as_path().to_path_buf()).unwrap(); assert!(has_vite_plus_dependency(&child_path)); } #[test] fn test_empty_package_json() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); std::fs::write(temp_path.join("package.json"), r#"{}"#).unwrap(); assert!(!has_vite_plus_dependency(&temp_path)); } #[test] fn test_nested_dir_stops_at_nearest_package_json() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); // Parent has vite-plus std::fs::write( temp_path.join("package.json"), r#"{ "devDependencies": { "vite-plus": 
"^1.0.0" } }"#, ) .unwrap(); // Child has its own package.json without vite-plus let child_dir = temp_path.join("child"); std::fs::create_dir(&child_dir).unwrap(); std::fs::write( child_dir.join("package.json"), r#"{ "devDependencies": { "vite": "^6.0.0" } }"#, ) .unwrap(); let child_path = AbsolutePathBuf::new(child_dir.as_path().to_path_buf()).unwrap(); // Should find the child's package.json first and return false assert!(!has_vite_plus_dependency(&child_path)); } } ================================================ FILE: crates/vite_global_cli/src/commands/outdated.rs ================================================ use std::process::ExitStatus; use vite_install::commands::outdated::{Format, OutdatedCommandOptions}; use vite_path::AbsolutePathBuf; use super::{build_package_manager, prepend_js_runtime_to_path_env}; use crate::error::Error; /// Outdated command for checking outdated packages. /// /// This command automatically detects the package manager and translates /// the outdated command to the appropriate package manager-specific syntax. pub struct OutdatedCommand { cwd: AbsolutePathBuf, } impl OutdatedCommand { pub fn new(cwd: AbsolutePathBuf) -> Self { Self { cwd } } #[allow(clippy::too_many_arguments)] pub async fn execute( self, packages: &[String], long: bool, format: Option, recursive: bool, filters: Option<&[String]>, workspace_root: bool, prod: bool, dev: bool, no_optional: bool, compatible: bool, sort_by: Option<&str>, global: bool, pass_through_args: Option<&[String]>, ) -> Result { prepend_js_runtime_to_path_env(&self.cwd).await?; let package_manager = build_package_manager(&self.cwd).await?; let outdated_command_options = OutdatedCommandOptions { packages, long, format, recursive, filters, workspace_root, prod, dev, no_optional, compatible, sort_by, global, pass_through_args, }; Ok(package_manager.run_outdated_command(&outdated_command_options, &self.cwd).await?) 
} } #[cfg(test)] mod tests { use super::*; #[test] fn test_outdated_command_new() { let workspace_root = if cfg!(windows) { AbsolutePathBuf::new("C:\\test".into()).unwrap() } else { AbsolutePathBuf::new("/test".into()).unwrap() }; let cmd = OutdatedCommand::new(workspace_root.clone()); assert_eq!(cmd.cwd, workspace_root); } } ================================================ FILE: crates/vite_global_cli/src/commands/pm.rs ================================================ //! Package manager commands (Category A). //! //! This module handles the `pm` subcommand and the `info` command which are //! routed through helper functions. Other PM commands (add, install, remove, etc.) //! are implemented as separate command modules with struct-based patterns. use std::process::ExitStatus; use vite_install::commands::{ audit::AuditCommandOptions, cache::CacheCommandOptions, config::ConfigCommandOptions, deprecate::DeprecateCommandOptions, dist_tag::{DistTagCommandOptions, DistTagSubcommand}, fund::FundCommandOptions, list::ListCommandOptions, login::LoginCommandOptions, logout::LogoutCommandOptions, owner::OwnerSubcommand, pack::PackCommandOptions, ping::PingCommandOptions, prune::PruneCommandOptions, publish::PublishCommandOptions, rebuild::RebuildCommandOptions, search::SearchCommandOptions, token::TokenSubcommand, view::ViewCommandOptions, whoami::WhoamiCommandOptions, }; use vite_path::AbsolutePathBuf; use super::{build_package_manager, prepend_js_runtime_to_path_env}; use crate::{ cli::{ConfigCommands, DistTagCommands, OwnerCommands, PmCommands, TokenCommands}, error::Error, }; /// Execute the info command. 
pub async fn execute_info(
    cwd: AbsolutePathBuf,
    package: &str,
    field: Option<&str>,
    json: bool,
    pass_through_args: Option<&[String]>,
) -> Result<ExitStatus, Error> {
    prepend_js_runtime_to_path_env(&cwd).await?;
    let package_manager = build_package_manager(&cwd).await?;
    let options = ViewCommandOptions { package, field, json, pass_through_args };
    Ok(package_manager.run_view_command(&options, &cwd).await?)
}

/// Execute a pm subcommand.
pub async fn execute_pm_subcommand(
    cwd: AbsolutePathBuf,
    command: PmCommands,
) -> Result<ExitStatus, Error> {
    // Intercept `pm list -g` to use vite-plus managed global packages listing
    if let PmCommands::List { global: true, json, ref pattern, .. } = command {
        return crate::commands::env::packages::execute(json, pattern.as_deref()).await;
    }
    prepend_js_runtime_to_path_env(&cwd).await?;
    let package_manager = build_package_manager(&cwd).await?;
    match command {
        PmCommands::Prune { prod, no_optional, pass_through_args } => {
            let options = PruneCommandOptions {
                prod,
                no_optional,
                pass_through_args: pass_through_args.as_deref(),
            };
            Ok(package_manager.run_prune_command(&options, &cwd).await?)
        }
        PmCommands::Pack {
            recursive,
            filter,
            out,
            pack_destination,
            pack_gzip_level,
            json,
            pass_through_args,
        } => {
            let options = PackCommandOptions {
                recursive,
                filters: filter.as_deref(),
                out: out.as_deref(),
                pack_destination: pack_destination.as_deref(),
                pack_gzip_level,
                json,
                pass_through_args: pass_through_args.as_deref(),
            };
            Ok(package_manager.run_pack_command(&options, &cwd).await?)
} PmCommands::List { pattern, depth, json, long, parseable, prod, dev, no_optional, exclude_peers, only_projects, find_by, recursive, filter, global, pass_through_args, } => { let options = ListCommandOptions { pattern: pattern.as_deref(), depth, json, long, parseable, prod, dev, no_optional, exclude_peers, only_projects, find_by: find_by.as_deref(), recursive, filters: if filter.is_empty() { None } else { Some(&filter) }, global, pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_list_command(&options, &cwd).await?) } PmCommands::View { package, field, json, pass_through_args } => { let options = ViewCommandOptions { package: &package, field: field.as_deref(), json, pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_view_command(&options, &cwd).await?) } PmCommands::Publish { target, dry_run, tag, access, otp, no_git_checks, publish_branch, report_summary, force, json, recursive, filter, pass_through_args, } => { let options = PublishCommandOptions { target: target.as_deref(), dry_run, tag: tag.as_deref(), access: access.as_deref(), otp: otp.as_deref(), no_git_checks, publish_branch: publish_branch.as_deref(), report_summary, force, json, recursive, filters: filter.as_deref(), pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_publish_command(&options, &cwd).await?) } PmCommands::Owner(owner_command) => { let subcommand = match owner_command { OwnerCommands::List { package, otp } => OwnerSubcommand::List { package, otp }, OwnerCommands::Add { user, package, otp } => { OwnerSubcommand::Add { user, package, otp } } OwnerCommands::Rm { user, package, otp } => { OwnerSubcommand::Rm { user, package, otp } } }; Ok(package_manager.run_owner_command(&subcommand, &cwd).await?) 
} PmCommands::Cache { subcommand, pass_through_args } => { let options = CacheCommandOptions { subcommand: &subcommand, pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_cache_command(&options, &cwd).await?) } PmCommands::Config(config_command) => match config_command { ConfigCommands::List { json, global, location } => { let options = ConfigCommandOptions { subcommand: "list", key: None, value: None, json, location: if global { Some("global") } else { location.as_deref() }, pass_through_args: None, }; Ok(package_manager.run_config_command(&options, &cwd).await?) } ConfigCommands::Get { key, json, global, location } => { let options = ConfigCommandOptions { subcommand: "get", key: Some(key.as_str()), value: None, json, location: if global { Some("global") } else { location.as_deref() }, pass_through_args: None, }; Ok(package_manager.run_config_command(&options, &cwd).await?) } ConfigCommands::Set { key, value, json, global, location } => { let options = ConfigCommandOptions { subcommand: "set", key: Some(key.as_str()), value: Some(value.as_str()), json, location: if global { Some("global") } else { location.as_deref() }, pass_through_args: None, }; Ok(package_manager.run_config_command(&options, &cwd).await?) } ConfigCommands::Delete { key, global, location } => { let options = ConfigCommandOptions { subcommand: "delete", key: Some(key.as_str()), value: None, json: false, location: if global { Some("global") } else { location.as_deref() }, pass_through_args: None, }; Ok(package_manager.run_config_command(&options, &cwd).await?) } }, PmCommands::Login { registry, scope, pass_through_args } => { let options = LoginCommandOptions { registry: registry.as_deref(), scope: scope.as_deref(), pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_login_command(&options, &cwd).await?) 
} PmCommands::Logout { registry, scope, pass_through_args } => { let options = LogoutCommandOptions { registry: registry.as_deref(), scope: scope.as_deref(), pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_logout_command(&options, &cwd).await?) } PmCommands::Whoami { registry, pass_through_args } => { let options = WhoamiCommandOptions { registry: registry.as_deref(), pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_whoami_command(&options, &cwd).await?) } PmCommands::Token(token_command) => { let subcommand = match token_command { TokenCommands::List { json, registry, pass_through_args } => { TokenSubcommand::List { json, registry, pass_through_args } } TokenCommands::Create { json, registry, cidr, readonly, pass_through_args } => { TokenSubcommand::Create { json, registry, cidr, readonly, pass_through_args } } TokenCommands::Revoke { token, registry, pass_through_args } => { TokenSubcommand::Revoke { token, registry, pass_through_args } } }; Ok(package_manager.run_token_command(&subcommand, &cwd).await?) } PmCommands::Audit { fix, json, level, production, pass_through_args } => { let options = AuditCommandOptions { fix, json, level: level.as_deref(), production, pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_audit_command(&options, &cwd).await?) } PmCommands::DistTag(dist_tag_command) => { let subcommand = match dist_tag_command { DistTagCommands::List { package } => DistTagSubcommand::List { package }, DistTagCommands::Add { package_at_version, tag } => { DistTagSubcommand::Add { package_at_version, tag } } DistTagCommands::Rm { package, tag } => DistTagSubcommand::Rm { package, tag }, }; let options = DistTagCommandOptions { subcommand, pass_through_args: None }; Ok(package_manager.run_dist_tag_command(&options, &cwd).await?) 
} PmCommands::Deprecate { package, message, otp, registry, pass_through_args } => { let options = DeprecateCommandOptions { package: &package, message: &message, otp: otp.as_deref(), registry: registry.as_deref(), pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_deprecate_command(&options, &cwd).await?) } PmCommands::Search { terms, json, long, registry, pass_through_args } => { let options = SearchCommandOptions { terms: &terms, json, long, registry: registry.as_deref(), pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_search_command(&options, &cwd).await?) } PmCommands::Rebuild { pass_through_args } => { let options = RebuildCommandOptions { pass_through_args: pass_through_args.as_deref() }; Ok(package_manager.run_rebuild_command(&options, &cwd).await?) } PmCommands::Fund { json, pass_through_args } => { let options = FundCommandOptions { json, pass_through_args: pass_through_args.as_deref() }; Ok(package_manager.run_fund_command(&options, &cwd).await?) } PmCommands::Ping { registry, pass_through_args } => { let options = PingCommandOptions { registry: registry.as_deref(), pass_through_args: pass_through_args.as_deref(), }; Ok(package_manager.run_ping_command(&options, &cwd).await?) } } } #[cfg(test)] mod tests { use vite_install::commands::add::SaveDependencyType; #[test] fn test_save_dependency_type() { assert!(matches!(SaveDependencyType::Dev, SaveDependencyType::Dev)); assert!(matches!(SaveDependencyType::Production, SaveDependencyType::Production)); } } ================================================ FILE: crates/vite_global_cli/src/commands/remove.rs ================================================ use std::process::ExitStatus; use vite_install::commands::remove::RemoveCommandOptions; use vite_path::AbsolutePathBuf; use super::{build_package_manager, prepend_js_runtime_to_path_env}; use crate::error::Error; /// Remove command for removing packages from dependencies. 
///
/// This command automatically detects the package manager and translates
/// the remove command to the appropriate package manager-specific syntax.
pub struct RemoveCommand {
    cwd: AbsolutePathBuf,
}

impl RemoveCommand {
    pub fn new(cwd: AbsolutePathBuf) -> Self {
        Self { cwd }
    }

    pub async fn execute(
        self,
        packages: &[String],
        save_dev: bool,
        save_optional: bool,
        save_prod: bool,
        filters: Option<&[String]>,
        workspace_root: bool,
        recursive: bool,
        global: bool,
        pass_through_args: Option<&[String]>,
    ) -> Result<ExitStatus, Error> {
        prepend_js_runtime_to_path_env(&self.cwd).await?;
        let package_manager = build_package_manager(&self.cwd).await?;
        let remove_command_options = RemoveCommandOptions {
            packages,
            filters,
            workspace_root,
            recursive,
            global,
            save_dev,
            save_optional,
            save_prod,
            pass_through_args,
        };
        Ok(package_manager.run_remove_command(&remove_command_options, &self.cwd).await?)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_remove_command_new() {
        let workspace_root = if cfg!(windows) {
            AbsolutePathBuf::new("C:\\test".into()).unwrap()
        } else {
            AbsolutePathBuf::new("/test".into()).unwrap()
        };
        let cmd = RemoveCommand::new(workspace_root.clone());
        assert_eq!(cmd.cwd, workspace_root);
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/run_or_delegate.rs
================================================
//! Run command with fallback to package manager when vite-plus is not a dependency.

use std::process::ExitStatus;

use vite_path::AbsolutePathBuf;

use crate::error::Error;

/// Execute `vp run <script>`.
///
/// If vite-plus is a dependency, delegate to the local CLI.
/// If not, fall back to `<package manager> run <script>`.
pub async fn execute(cwd: AbsolutePathBuf, args: &[String]) -> Result<ExitStatus, Error> {
    if super::has_vite_plus_dependency(&cwd) {
        tracing::debug!("vite-plus is a dependency, delegating to local CLI");
        super::delegate::execute(cwd, "run", args).await
    } else {
        tracing::debug!("vite-plus is not a dependency, falling back to package manager run");
        super::prepend_js_runtime_to_path_env(&cwd).await?;
        let package_manager = super::build_package_manager(&cwd).await?;
        Ok(package_manager.run_script_command(args, &cwd).await?)
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/staged.rs
================================================
//! Staged command (Category B: JavaScript Command).

use std::process::ExitStatus;

use vite_path::AbsolutePathBuf;

use crate::error::Error;

/// Execute the `staged` command by delegating to local or global vite-plus.
pub async fn execute(cwd: AbsolutePathBuf, args: &[String]) -> Result<ExitStatus, Error> {
    super::delegate::execute(cwd, "staged", args).await
}

================================================
FILE: crates/vite_global_cli/src/commands/unlink.rs
================================================
use std::process::ExitStatus;

use vite_install::commands::unlink::UnlinkCommandOptions;
use vite_path::AbsolutePathBuf;

use super::{build_package_manager, prepend_js_runtime_to_path_env};
use crate::error::Error;

/// Unlink command for removing package links.
///
/// This command automatically detects the package manager and translates
/// the unlink command to the appropriate package manager-specific syntax.
pub struct UnlinkCommand {
    cwd: AbsolutePathBuf,
}

impl UnlinkCommand {
    pub fn new(cwd: AbsolutePathBuf) -> Self {
        Self { cwd }
    }

    pub async fn execute(
        self,
        package: Option<&str>,
        recursive: bool,
        pass_through_args: Option<&[String]>,
    ) -> Result<ExitStatus, Error> {
        prepend_js_runtime_to_path_env(&self.cwd).await?;
        let package_manager = build_package_manager(&self.cwd).await?;
        let unlink_command_options =
            UnlinkCommandOptions { package, recursive, pass_through_args };
        Ok(package_manager.run_unlink_command(&unlink_command_options, &self.cwd).await?)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_unlink_command_new() {
        let workspace_root = if cfg!(windows) {
            AbsolutePathBuf::new("C:\\test".into()).unwrap()
        } else {
            AbsolutePathBuf::new("/test".into()).unwrap()
        };
        let cmd = UnlinkCommand::new(workspace_root.clone());
        assert_eq!(cmd.cwd, workspace_root);
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/update.rs
================================================
use std::process::ExitStatus;

use vite_install::commands::update::UpdateCommandOptions;
use vite_path::AbsolutePathBuf;

use super::{build_package_manager, prepend_js_runtime_to_path_env};
use crate::error::Error;

/// Update command for updating packages to their latest versions.
///
/// This command automatically detects the package manager and translates
/// the update command to the appropriate package manager-specific syntax.
pub struct UpdateCommand {
    cwd: AbsolutePathBuf,
}

impl UpdateCommand {
    pub fn new(cwd: AbsolutePathBuf) -> Self {
        Self { cwd }
    }

    #[allow(clippy::too_many_arguments)]
    pub async fn execute(
        self,
        packages: &[String],
        latest: bool,
        global: bool,
        recursive: bool,
        filters: Option<&[String]>,
        workspace_root: bool,
        dev: bool,
        prod: bool,
        interactive: bool,
        no_optional: bool,
        no_save: bool,
        workspace_only: bool,
        pass_through_args: Option<&[String]>,
    ) -> Result<ExitStatus, Error> {
        prepend_js_runtime_to_path_env(&self.cwd).await?;
        let package_manager = build_package_manager(&self.cwd).await?;
        let update_command_options = UpdateCommandOptions {
            packages,
            latest,
            global,
            recursive,
            filters,
            workspace_root,
            dev,
            prod,
            interactive,
            no_optional,
            no_save,
            workspace_only,
            pass_through_args,
        };
        Ok(package_manager.run_update_command(&update_command_options, &self.cwd).await?)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_update_command_new() {
        let workspace_root = if cfg!(windows) {
            AbsolutePathBuf::new("C:\\test".into()).unwrap()
        } else {
            AbsolutePathBuf::new("/test".into()).unwrap()
        };
        let cmd = UpdateCommand::new(workspace_root.clone());
        assert_eq!(cmd.cwd, workspace_root);
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/upgrade/install.rs
================================================
//! Installation logic for upgrade.
//!
//! Handles tarball extraction, dependency installation, symlink swapping,
//! and version cleanup.

use std::{
    io::{Cursor, Read as _},
    path::Path,
};

use flate2::read::GzDecoder;
use tar::Archive;
use vite_path::{AbsolutePath, AbsolutePathBuf};

use crate::error::Error;

/// Validate that a path from a tarball entry is safe (no path traversal).
///
/// Returns `false` if the path contains `..` components or is absolute.
fn is_safe_tar_path(path: &Path) -> bool {
    // Also check for Unix-style absolute paths, since tar archives always use forward
    // slashes and `Path::is_absolute()` on Windows only recognizes `C:\...` style paths.
    let starts_with_slash = path.to_string_lossy().starts_with('/');
    !path.is_absolute()
        && !starts_with_slash
        && !path.components().any(|c| matches!(c, std::path::Component::ParentDir))
}

/// Extract the platform-specific package (binary only).
///
/// From the platform tarball, extracts:
/// - The `vp` binary → `{version_dir}/bin/vp`
/// - The `vp-shim.exe` trampoline → `{version_dir}/bin/vp-shim.exe` (Windows only)
///
/// `.node` files are no longer extracted here — npm installs them
/// via the platform package's optionalDependencies.
pub async fn extract_platform_package(
    tgz_data: &[u8],
    version_dir: &AbsolutePath,
) -> Result<(), Error> {
    let bin_dir = version_dir.join("bin");
    tokio::fs::create_dir_all(&bin_dir).await?;

    let data = tgz_data.to_vec();
    let bin_dir_clone = bin_dir.clone();
    tokio::task::spawn_blocking(move || {
        let cursor = Cursor::new(data);
        let decoder = GzDecoder::new(cursor);
        let mut archive = Archive::new(decoder);

        for entry_result in archive.entries()? {
            let mut entry = entry_result?;
            let path = entry.path()?.to_path_buf();

            // Strip the leading `package/` prefix that npm tarballs have
            let relative = path.strip_prefix("package").unwrap_or(&path).to_path_buf();

            // Reject paths with traversal components (security)
            if !is_safe_tar_path(&relative) {
                continue;
            }

            let file_name = relative.file_name().and_then(|n| n.to_str()).unwrap_or("");
            if file_name == "vp" || file_name == "vp.exe" || file_name == "vp-shim.exe" {
                // Binary goes to bin/
                let target = bin_dir_clone.join(file_name);
                let mut buf = Vec::new();
                entry.read_to_end(&mut buf)?;
                std::fs::write(&target, &buf)?;

                // Set executable permission on Unix
                #[cfg(unix)]
                {
                    use std::os::unix::fs::PermissionsExt;
                    std::fs::set_permissions(&target, std::fs::Permissions::from_mode(0o755))?;
                }
            }
        }
        Ok::<(), Error>(())
    })
    .await
    .map_err(|e| Error::Upgrade(format!("Task join error: {e}").into()))??;

    Ok(())
}

/// Generate a wrapper `package.json` that declares `vite-plus` as a dependency.
///
/// This replaces the old approach of extracting the main package tarball.
/// npm will install `vite-plus` and all its transitive deps via `vp install`.
pub async fn generate_wrapper_package_json(
    version_dir: &AbsolutePath,
    version: &str,
) -> Result<(), Error> {
    let json = serde_json::json!({
        "name": "vp-global",
        "version": version,
        "private": true,
        "dependencies": {
            "vite-plus": version
        }
    });
    let content = serde_json::to_string_pretty(&json)? + "\n";
    tokio::fs::write(version_dir.join("package.json"), content).await?;
    Ok(())
}

/// Install production dependencies using the new version's binary.
///
/// Spawns: `{version_dir}/bin/vp install --silent [--registry <url>]` with `CI=true`.
pub async fn install_production_deps(
    version_dir: &AbsolutePath,
    registry: Option<&str>,
) -> Result<(), Error> {
    let vp_binary = version_dir.join("bin").join(if cfg!(windows) { "vp.exe" } else { "vp" });
    if !tokio::fs::try_exists(&vp_binary).await.unwrap_or(false) {
        return Err(Error::Upgrade(
            format!("New binary not found at {}", vp_binary.as_path().display()).into(),
        ));
    }

    tracing::debug!("Running vp install in {}", version_dir.as_path().display());
    let mut args = vec!["install", "--silent"];
    if let Some(registry_url) = registry {
        args.push("--");
        args.push("--registry");
        args.push(registry_url);
    }
    let output = tokio::process::Command::new(vp_binary.as_path())
        .args(&args)
        .current_dir(version_dir)
        .env("CI", "true")
        .output()
        .await?;

    if !output.status.success() {
        let stderr = String::from_utf8_lossy(&output.stderr);
        return Err(Error::Upgrade(
            format!(
                "Failed to install production dependencies (exit code: {})\n{}",
                output.status.code().unwrap_or(-1),
                stderr.trim()
            )
            .into(),
        ));
    }

    Ok(())
}

/// Save the current version before swapping, for rollback support.
///
/// Reads the `current` symlink target and writes the version to `.previous-version`.
pub async fn save_previous_version(install_dir: &AbsolutePath) -> Result<Option<String>, Error> {
    let current_link = install_dir.join("current");
    if !tokio::fs::try_exists(&current_link).await.unwrap_or(false) {
        return Ok(None);
    }
    let target = tokio::fs::read_link(&current_link).await?;
    let version = target.file_name().and_then(|n| n.to_str()).map(String::from);
    if let Some(ref v) = version {
        let prev_file = install_dir.join(".previous-version");
        tokio::fs::write(&prev_file, v).await?;
        tracing::debug!("Saved previous version: {}", v);
    }
    Ok(version)
}

/// Atomically swap the `current` symlink to point to a new version.
///
/// On Unix: creates a temp symlink then renames (atomic).
/// On Windows: removes junction and creates a new one.
pub async fn swap_current_link(install_dir: &AbsolutePath, version: &str) -> Result<(), Error> {
    let current_link = install_dir.join("current");
    let version_dir = install_dir.join(version);

    // Verify the version directory exists
    if !tokio::fs::try_exists(&version_dir).await.unwrap_or(false) {
        return Err(Error::Upgrade(
            format!("Version directory does not exist: {}", version_dir.as_path().display()).into(),
        ));
    }

    #[cfg(unix)]
    {
        // Atomic symlink swap: create temp link, then rename over current
        let temp_link = install_dir.join("current.new");
        // Remove temp link if it exists from a previous failed attempt
        let _ = tokio::fs::remove_file(&temp_link).await;
        tokio::fs::symlink(version, &temp_link).await?;
        tokio::fs::rename(&temp_link, &current_link).await?;
    }

    #[cfg(windows)]
    {
        // Windows: junction swap (not atomic)
        // Remove whatever exists at current_link — could be a junction, symlink, or directory.
        // We don't rely on junction::exists() since it may not detect junctions created by
        // cmd /c mklink /J (used by install.ps1).
        if current_link.as_path().exists() {
            // std::fs::remove_dir works on junctions/symlinks without removing target contents
            if let Err(e) = std::fs::remove_dir(&current_link) {
                tracing::debug!("remove_dir failed ({}), trying junction::delete", e);
                junction::delete(&current_link).map_err(|e| {
                    Error::Upgrade(
                        format!(
                            "Failed to remove existing junction at {}: {e}",
                            current_link.as_path().display()
                        )
                        .into(),
                    )
                })?;
            }
        }
        junction::create(&version_dir, &current_link).map_err(|e| {
            Error::Upgrade(
                format!(
                    "Failed to create junction at {}: {e}\nTry removing it manually and run again.",
                    current_link.as_path().display()
                )
                .into(),
            )
        })?;
    }

    tracing::debug!("Swapped current → {}", version);
    Ok(())
}

/// Refresh shims by running `vp env setup --refresh` with the new binary.
pub async fn refresh_shims(install_dir: &AbsolutePath) -> Result<(), Error> { let vp_binary = install_dir.join("current").join("bin").join(if cfg!(windows) { "vp.exe" } else { "vp" }); if !tokio::fs::try_exists(&vp_binary).await.unwrap_or(false) { tracing::warn!( "New binary not found at {}, skipping shim refresh", vp_binary.as_path().display() ); return Ok(()); } tracing::debug!("Refreshing shims..."); let output = tokio::process::Command::new(vp_binary.as_path()) .args(["env", "setup", "--refresh"]) .output() .await?; if !output.status.success() { let stderr = String::from_utf8_lossy(&output.stderr); tracing::warn!( "Shim refresh exited with code {}, continuing anyway\n{}", output.status.code().unwrap_or(-1), stderr.trim() ); } Ok(()) } /// Clean up old version directories, keeping at most `max_keep` versions. /// /// Sorts by creation time (newest first, matching install.sh behavior) and removes /// the oldest beyond the limit. Protected versions are never removed, even if they /// fall outside the keep limit (e.g., the active version after a downgrade). pub async fn cleanup_old_versions( install_dir: &AbsolutePath, max_keep: usize, protected_versions: &[&str], ) -> Result<(), Error> { let mut versions: Vec<(std::time::SystemTime, AbsolutePathBuf)> = Vec::new(); let mut entries = tokio::fs::read_dir(install_dir).await?; while let Some(entry) = entries.next_entry().await? 
{ let name = entry.file_name(); let name_str = name.to_string_lossy(); // Only consider entries that parse as semver if node_semver::Version::parse(&name_str).is_ok() { let metadata = entry.metadata().await?; // Use creation time (birth time), fallback to modified time let time = metadata.created().unwrap_or_else(|_| { metadata.modified().unwrap_or(std::time::SystemTime::UNIX_EPOCH) }); let path = AbsolutePathBuf::new(entry.path()).ok_or_else(|| { Error::Upgrade(format!("Invalid absolute path: {}", entry.path().display()).into()) })?; versions.push((time, path)); } } // Sort newest first (by creation time, matching install.sh) versions.sort_by(|a, b| b.0.cmp(&a.0)); // Remove versions beyond the keep limit, but never remove protected versions for (_time, path) in versions.into_iter().skip(max_keep) { let name = path.as_path().file_name().and_then(|n| n.to_str()).unwrap_or(""); if protected_versions.contains(&name) { tracing::debug!("Skipping protected version: {}", name); continue; } tracing::debug!("Cleaning up old version: {}", path.as_path().display()); if let Err(e) = tokio::fs::remove_dir_all(&path).await { tracing::warn!("Failed to remove {}: {}", path.as_path().display(), e); } } Ok(()) } /// Read the previous version from `.previous-version` file. 
pub async fn read_previous_version(install_dir: &AbsolutePath) -> Result<Option<String>, Error> {
    let prev_file = install_dir.join(".previous-version");
    if !tokio::fs::try_exists(&prev_file).await.unwrap_or(false) {
        return Ok(None);
    }
    let content = tokio::fs::read_to_string(&prev_file).await?;
    let version = content.trim().to_string();
    if version.is_empty() { Ok(None) } else { Ok(Some(version)) }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_is_safe_tar_path_normal() {
        assert!(is_safe_tar_path(Path::new("dist/index.js")));
        assert!(is_safe_tar_path(Path::new("bin/vp")));
        assert!(is_safe_tar_path(Path::new("package.json")));
        assert!(is_safe_tar_path(Path::new("templates/react/index.ts")));
    }

    #[test]
    fn test_is_safe_tar_path_traversal() {
        assert!(!is_safe_tar_path(Path::new("../etc/passwd")));
        assert!(!is_safe_tar_path(Path::new("dist/../../etc/passwd")));
        assert!(!is_safe_tar_path(Path::new("..")));
    }

    #[test]
    fn test_is_safe_tar_path_absolute() {
        assert!(!is_safe_tar_path(Path::new("/etc/passwd")));
        assert!(!is_safe_tar_path(Path::new("/usr/bin/vp")));
    }

    #[tokio::test]
    async fn test_cleanup_preserves_active_downgraded_version() {
        let temp = tempfile::tempdir().unwrap();
        let install_dir = AbsolutePathBuf::new(temp.path().to_path_buf()).unwrap();

        // Create 7 version directories with staggered creation times.
// Simulate: installed 0.1-0.7 in order, then rolled back to 0.2.0 for v in ["0.1.0", "0.2.0", "0.3.0", "0.4.0", "0.5.0", "0.6.0", "0.7.0"] { tokio::fs::create_dir(install_dir.join(v)).await.unwrap(); // Small delay to ensure distinct creation times tokio::time::sleep(std::time::Duration::from_millis(10)).await; } // Simulate rollback: current points to 0.2.0 (low semver rank) #[cfg(unix)] tokio::fs::symlink("0.2.0", install_dir.join("current")).await.unwrap(); // Cleanup keeping top 5, with 0.2.0 protected (the active version) cleanup_old_versions(&install_dir, 5, &["0.2.0"]).await.unwrap(); // 0.2.0 is the active version — it MUST survive cleanup assert!( tokio::fs::try_exists(install_dir.join("0.2.0")).await.unwrap(), "Active version 0.2.0 was deleted by cleanup" ); } #[tokio::test] async fn test_cleanup_sorts_by_creation_time_not_semver() { let temp = tempfile::tempdir().unwrap(); let install_dir = AbsolutePathBuf::new(temp.path().to_path_buf()).unwrap(); // Create versions in non-semver order with creation times: // 0.5.0 (oldest), 0.1.0, 0.3.0, 0.7.0, 0.2.0, 0.6.0 (newest) for v in ["0.5.0", "0.1.0", "0.3.0", "0.7.0", "0.2.0", "0.6.0"] { tokio::fs::create_dir(install_dir.join(v)).await.unwrap(); tokio::time::sleep(std::time::Duration::from_millis(10)).await; } // Keep top 4 by creation time → keep 0.6.0, 0.2.0, 0.7.0, 0.3.0 // Remove 0.1.0 and 0.5.0 (oldest by creation time) cleanup_old_versions(&install_dir, 4, &[]).await.unwrap(); // The 4 newest by creation time should survive assert!(tokio::fs::try_exists(install_dir.join("0.6.0")).await.unwrap()); assert!(tokio::fs::try_exists(install_dir.join("0.2.0")).await.unwrap()); assert!(tokio::fs::try_exists(install_dir.join("0.7.0")).await.unwrap()); assert!(tokio::fs::try_exists(install_dir.join("0.3.0")).await.unwrap()); // The 2 oldest by creation time should be removed assert!( !tokio::fs::try_exists(install_dir.join("0.5.0")).await.unwrap(), "0.5.0 (oldest by creation time) should have been removed" ); 
assert!( !tokio::fs::try_exists(install_dir.join("0.1.0")).await.unwrap(), "0.1.0 (second oldest by creation time) should have been removed" ); } #[tokio::test] async fn test_cleanup_old_versions_with_nonexistent_dir() { // Verifies that cleanup_old_versions propagates errors on non-existent dir. // In the real flow, such errors from post-swap operations should be non-fatal. let non_existent = AbsolutePathBuf::new(std::env::temp_dir().join("non-existent-upgrade-test-dir")) .unwrap(); let result = cleanup_old_versions(&non_existent, 5, &[]).await; assert!(result.is_err(), "cleanup_old_versions should error on non-existent dir"); } } ================================================ FILE: crates/vite_global_cli/src/commands/upgrade/integrity.rs ================================================ //! Integrity verification for downloaded tarballs. //! //! Verifies SHA-512 integrity using the Subresource Integrity (SRI) format //! that npm registries provide: `sha512-{base64}`. use sha2::{Digest, Sha512}; use crate::error::Error; /// Verify the integrity of data against an SRI hash. /// /// Parses the SRI format `sha512-{base64}`, computes the SHA-512 hash /// of the data, base64-encodes it, and compares. 
pub fn verify_integrity(data: &[u8], expected_sri: &str) -> Result<(), Error> { let expected_b64 = expected_sri .strip_prefix("sha512-") .ok_or_else(|| Error::UnsupportedIntegrity(expected_sri.into()))?; let mut hasher = Sha512::new(); hasher.update(data); let actual_b64 = base64_simd::STANDARD.encode_to_string(hasher.finalize()); if actual_b64 != expected_b64 { return Err(Error::IntegrityMismatch { expected: expected_sri.into(), actual: format!("sha512-{actual_b64}").into(), }); } tracing::debug!("Integrity verification successful"); Ok(()) } #[cfg(test)] mod tests { use super::*; #[test] fn test_verify_integrity_valid() { let data = b"Hello, World!"; let mut hasher = Sha512::new(); hasher.update(data); let hash = base64_simd::STANDARD.encode_to_string(hasher.finalize()); let sri = format!("sha512-{hash}"); assert!(verify_integrity(data, &sri).is_ok()); } #[test] fn test_verify_integrity_mismatch() { let data = b"Hello, World!"; let sri = "sha512-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"; let err = verify_integrity(data, sri).unwrap_err(); assert!(matches!(err, Error::IntegrityMismatch { .. })); } #[test] fn test_verify_integrity_unsupported_format() { let data = b"Hello, World!"; let sri = "sha256-abc123"; let err = verify_integrity(data, sri).unwrap_err(); assert!(matches!(err, Error::UnsupportedIntegrity(_))); } #[test] fn test_verify_integrity_no_prefix() { let data = b"Hello, World!"; let sri = "not-a-valid-sri"; let err = verify_integrity(data, sri).unwrap_err(); assert!(matches!(err, Error::UnsupportedIntegrity(_))); } } ================================================ FILE: crates/vite_global_cli/src/commands/upgrade/mod.rs ================================================ //! Upgrade command for the vp CLI. //! //! Downloads and installs a new version of the CLI from the npm registry //! with SHA-512 integrity verification. 
mod install;
mod integrity;
mod platform;
mod registry;

use std::process::ExitStatus;

use owo_colors::OwoColorize;
use vite_install::request::HttpClient;
use vite_path::AbsolutePathBuf;
use vite_shared::output;

use crate::{commands::env::config::get_vite_plus_home, error::Error};

/// Options for the upgrade command.
pub struct UpgradeOptions {
    /// Target version (e.g., "0.2.0"). None means use the tag.
    pub version: Option<String>,
    /// npm dist-tag (default: "latest")
    pub tag: String,
    /// Check for updates without installing
    pub check: bool,
    /// Revert to previous version
    pub rollback: bool,
    /// Force reinstall even if already on the target version
    pub force: bool,
    /// Suppress output
    pub silent: bool,
    /// Custom npm registry URL
    pub registry: Option<String>,
}

/// Maximum number of old versions to keep.
const MAX_VERSIONS_KEEP: usize = 5;

/// Execute the upgrade command.
#[allow(clippy::print_stdout, clippy::print_stderr)]
pub async fn execute(options: UpgradeOptions) -> Result<ExitStatus, Error> {
    let install_dir = get_vite_plus_home()?;

    // Handle --rollback
    if options.rollback {
        return execute_rollback(&install_dir, options.silent).await;
    }

    // Step 1: Detect platform
    let platform_suffix = platform::detect_platform_suffix()?;
    tracing::debug!("Platform: {}", platform_suffix);

    // Step 2: Determine version to resolve
    let version_or_tag = options.version.as_deref().unwrap_or(&options.tag);
    if !options.silent {
        output::info("checking for updates...");
    }

    // Step 3: Resolve version from npm registry
    let resolved =
        registry::resolve_version(version_or_tag, &platform_suffix, options.registry.as_deref())
            .await?;
    let current_version = env!("CARGO_PKG_VERSION");
    if !options.silent {
        output::info(&format!(
            "found vite-plus@{} (current: {})",
            resolved.version, current_version
        ));
    }

    // Step 4: Handle --check (report and exit)
    if options.check {
        if resolved.version == current_version {
            println!("\n{} Already up to date ({})", output::CHECK.green(), current_version);
        } else {
            println!("Update available: {} \u{2192} {}", current_version, resolved.version);
            println!("Run `vp upgrade` to update.");
        }
        return Ok(ExitStatus::default());
    }

    // Step 5: Handle already up-to-date
    if resolved.version == current_version && !options.force {
        if !options.silent {
            println!("\n{} Already up to date ({})", output::CHECK.green(), current_version);
        }
        return Ok(ExitStatus::default());
    }

    if !options.silent {
        output::info(&format!(
            "downloading vite-plus@{} for {}...",
            resolved.version, platform_suffix
        ));
    }

    // Step 6: Download platform tarball (main package is installed via npm)
    let client = HttpClient::new();
    let platform_data = client
        .get_bytes(&resolved.platform_tarball_url)
        .await
        .map_err(|e| Error::Upgrade(format!("Failed to download platform package: {e}").into()))?;

    // Step 7: Verify integrity
    integrity::verify_integrity(&platform_data, &resolved.platform_integrity)?;

    if !options.silent {
        output::info("installing...");
    }

    // Step 8: Create version directory
    let version_dir = install_dir.join(&resolved.version);
    tokio::fs::create_dir_all(&version_dir).await?;

    // Step 9: Extract platform binary and install via npm
    let result = install_platform_and_main(
        &platform_data,
        &version_dir,
        &install_dir,
        &resolved.version,
        current_version,
        options.silent,
        options.registry.as_deref(),
    )
    .await;

    // On failure, clean up the version directory
    if result.is_err() {
        tracing::debug!("Cleaning up failed install at {}", version_dir.as_path().display());
        let _ = tokio::fs::remove_dir_all(&version_dir).await;
    }

    result
}

/// Core installation logic, separated for error cleanup.
#[allow(clippy::print_stdout, clippy::print_stderr)]
async fn install_platform_and_main(
    platform_data: &[u8],
    version_dir: &AbsolutePathBuf,
    install_dir: &AbsolutePathBuf,
    new_version: &str,
    current_version: &str,
    silent: bool,
    registry: Option<&str>,
) -> Result<ExitStatus, Error> {
    // Extract platform package (binary only; .node files installed via npm optionalDeps)
    install::extract_platform_package(platform_data, version_dir).await?;

    // Verify binary was extracted
    let binary_name = if cfg!(windows) { "vp.exe" } else { "vp" };
    let binary_path = version_dir.join("bin").join(binary_name);
    if !tokio::fs::try_exists(&binary_path).await.unwrap_or(false) {
        return Err(Error::Upgrade(
            "Binary not found after extraction. The download may be corrupted.".into(),
        ));
    }

    // Generate wrapper package.json that declares vite-plus as a dependency
    install::generate_wrapper_package_json(version_dir, new_version).await?;

    // Install production dependencies (npm installs vite-plus + all transitive deps)
    install::install_production_deps(version_dir, registry).await?;

    // Save previous version for rollback
    let previous_version = install::save_previous_version(install_dir).await?;
    tracing::debug!("Previous version: {:?}", previous_version);

    // Swap current link — POINT OF NO RETURN
    install::swap_current_link(install_dir, new_version).await?;

    // Post-swap operations: non-fatal (the update already succeeded)
    if let Err(e) = install::refresh_shims(install_dir).await {
        output::warn(&format!("Shim refresh failed (non-fatal): {e}"));
    }
    let mut protected = vec![new_version];
    if let Some(ref prev) = previous_version {
        protected.push(prev.as_str());
    }
    if let Err(e) = install::cleanup_old_versions(install_dir, MAX_VERSIONS_KEEP, &protected).await
    {
        output::warn(&format!("Old version cleanup failed (non-fatal): {e}"));
    }

    if !silent {
        println!(
            "\n{} Updated vite-plus from {} {} {}",
            output::CHECK.green(),
            current_version,
            output::ARROW,
            new_version
        );
        println!(
            "\n  Release notes: https://github.com/voidzero-dev/vite-plus/releases/tag/v{}",
            new_version
        );
    }

    Ok(ExitStatus::default())
}

/// Execute rollback to the previous version.
#[allow(clippy::print_stdout, clippy::print_stderr)]
async fn execute_rollback(
    install_dir: &AbsolutePathBuf,
    silent: bool,
) -> Result<ExitStatus, Error> {
    let previous = install::read_previous_version(install_dir)
        .await?
        .ok_or_else(|| Error::Upgrade("No previous version found. Cannot rollback.".into()))?;

    // Verify the version directory still exists
    let prev_dir = install_dir.join(&previous);
    if !tokio::fs::try_exists(&prev_dir).await.unwrap_or(false) {
        return Err(Error::Upgrade(
            format!("Previous version directory ({}) no longer exists. Cannot rollback.", previous)
                .into(),
        ));
    }

    if !silent {
        let current_version = env!("CARGO_PKG_VERSION");
        output::info("rolling back to previous version...");
        output::info(&format!("switching from {} {} {}", current_version, output::ARROW, previous));
    }

    // Save the current version as the new "previous" before swapping
    install::save_previous_version(install_dir).await?;

    // Swap to the previous version
    install::swap_current_link(install_dir, &previous).await?;

    // Refresh shims
    install::refresh_shims(install_dir).await?;

    if !silent {
        println!("\n{} Rolled back to {}", output::CHECK.green(), previous);
    }

    Ok(ExitStatus::default())
}

================================================
FILE: crates/vite_global_cli/src/commands/upgrade/platform.rs
================================================
//! Platform detection for upgrade.
//!
//! Detects the current platform and returns the npm package suffix
//! used to find the correct platform-specific binary package.

use crate::error::Error;

/// Detect the current platform suffix for npm package naming.
///
/// Returns strings like `darwin-arm64`, `linux-x64-gnu`, `linux-arm64-musl`, `win32-x64-msvc`.
pub fn detect_platform_suffix() -> Result { let os_name = if cfg!(target_os = "macos") { "darwin" } else if cfg!(target_os = "linux") { "linux" } else if cfg!(target_os = "windows") { "win32" } else { return Err(Error::Upgrade( format!("Unsupported operating system: {}", std::env::consts::OS).into(), )); }; let arch_name = if cfg!(target_arch = "x86_64") { "x64" } else if cfg!(target_arch = "aarch64") { "arm64" } else { return Err(Error::Upgrade( format!("Unsupported architecture: {}", std::env::consts::ARCH).into(), )); }; if os_name == "linux" { let libc = if cfg!(target_env = "musl") { "musl" } else { "gnu" }; Ok(format!("{os_name}-{arch_name}-{libc}")) } else if os_name == "win32" { Ok(format!("{os_name}-{arch_name}-msvc")) } else { Ok(format!("{os_name}-{arch_name}")) } } #[cfg(test)] mod tests { use super::*; #[test] fn test_detect_platform_suffix() { let suffix = detect_platform_suffix().unwrap(); // Should be non-empty and contain a dash assert!(!suffix.is_empty()); assert!(suffix.contains('-')); // Should match the current platform #[cfg(all(target_os = "macos", target_arch = "aarch64"))] assert_eq!(suffix, "darwin-arm64"); #[cfg(all(target_os = "macos", target_arch = "x86_64"))] assert_eq!(suffix, "darwin-x64"); #[cfg(all(target_os = "linux", target_arch = "x86_64", not(target_env = "musl")))] assert_eq!(suffix, "linux-x64-gnu"); #[cfg(all(target_os = "linux", target_arch = "x86_64", target_env = "musl"))] assert_eq!(suffix, "linux-x64-musl"); #[cfg(all(target_os = "linux", target_arch = "aarch64", not(target_env = "musl")))] assert_eq!(suffix, "linux-arm64-gnu"); #[cfg(all(target_os = "windows", target_arch = "x86_64"))] assert_eq!(suffix, "win32-x64-msvc"); } } ================================================ FILE: crates/vite_global_cli/src/commands/upgrade/registry.rs ================================================ //! npm registry client for version resolution. //! //! Queries the npm registry to resolve versions and get tarball URLs //! 
//! with integrity hashes for both the main package and platform-specific package.

use serde::Deserialize;
use vite_install::{config::npm_registry, request::HttpClient};

use crate::error::Error;

/// npm package version metadata (subset of fields we need).
#[derive(Debug, Deserialize)]
pub struct PackageVersionMetadata {
    pub version: String,
    pub dist: DistInfo,
}

/// Distribution info from npm registry.
#[derive(Debug, Deserialize)]
pub struct DistInfo {
    pub tarball: String,
    pub integrity: String,
}

/// Resolved version info with URLs and integrity for the platform package.
#[derive(Debug)]
pub struct ResolvedVersion {
    pub version: String,
    pub platform_tarball_url: String,
    pub platform_integrity: String,
}

const MAIN_PACKAGE_NAME: &str = "vite-plus";
const PLATFORM_PACKAGE_SCOPE: &str = "@voidzero-dev";
const CLI_PACKAGE_NAME_PREFIX: &str = "vite-plus-cli";

/// Resolve a version from the npm registry.
///
/// Makes two HTTP calls:
/// 1. Main package metadata to resolve version tags (e.g., "latest" → "1.2.3")
/// 2. CLI platform package metadata to get tarball URL and integrity
pub async fn resolve_version(
    version_or_tag: &str,
    platform_suffix: &str,
    registry_override: Option<&str>,
) -> Result<ResolvedVersion, Error> {
    let default_registry = npm_registry();
    let registry_raw = registry_override.unwrap_or(&default_registry);
    let registry = registry_raw.trim_end_matches('/');
    let client = HttpClient::new();

    // Step 1: Fetch main package metadata to resolve version
    let main_url = format!("{registry}/{MAIN_PACKAGE_NAME}/{version_or_tag}");
    tracing::debug!("Fetching main package metadata: {}", main_url);
    let main_meta: PackageVersionMetadata = client.get_json(&main_url).await.map_err(|e| {
        Error::Upgrade(format!("Failed to fetch package metadata from {main_url}: {e}").into())
    })?;

    // Step 2: Query CLI platform package directly
    let cli_package_name =
        format!("{PLATFORM_PACKAGE_SCOPE}/{CLI_PACKAGE_NAME_PREFIX}-{platform_suffix}");
    let cli_url = format!("{registry}/{cli_package_name}/{}", main_meta.version);
    tracing::debug!("Fetching CLI package metadata: {}", cli_url);
    let cli_meta: PackageVersionMetadata = client.get_json(&cli_url).await.map_err(|e| {
        Error::Upgrade(
            format!(
                "Failed to fetch CLI package metadata from {cli_url}: {e}. \
                 Your platform ({platform_suffix}) may not be supported."
            )
            .into(),
        )
    })?;

    Ok(ResolvedVersion {
        version: main_meta.version,
        platform_tarball_url: cli_meta.dist.tarball,
        platform_integrity: cli_meta.dist.integrity,
    })
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_cli_package_name_construction() {
        let suffix = "darwin-arm64";
        let name = format!("{PLATFORM_PACKAGE_SCOPE}/{CLI_PACKAGE_NAME_PREFIX}-{suffix}");
        assert_eq!(name, "@voidzero-dev/vite-plus-cli-darwin-arm64");
    }

    #[test]
    fn test_all_platform_suffixes_match_published_cli_packages() {
        // These are the actual published CLI package suffixes
        // (from packages/cli/publish-native-addons.ts RUST_TARGETS keys)
        let published_suffixes = [
            "darwin-arm64",
            "darwin-x64",
            "linux-arm64-gnu",
            "linux-x64-gnu",
            "win32-arm64-msvc",
            "win32-x64-msvc",
        ];
        let published_packages: Vec<String> = published_suffixes
            .iter()
            .map(|s| format!("{PLATFORM_PACKAGE_SCOPE}/{CLI_PACKAGE_NAME_PREFIX}-{s}"))
            .collect();

        // All known platform suffixes that detect_platform_suffix() can return
        let detection_suffixes = [
            "darwin-arm64",
            "darwin-x64",
            "linux-arm64-gnu",
            "linux-x64-gnu",
            "linux-arm64-musl",
            "linux-x64-musl",
            "win32-arm64-msvc",
            "win32-x64-msvc",
        ];

        for suffix in &detection_suffixes {
            let package_name =
                format!("{PLATFORM_PACKAGE_SCOPE}/{CLI_PACKAGE_NAME_PREFIX}-{suffix}");
            // musl variants are not published, so skip them
            if suffix.contains("musl") {
                continue;
            }
            assert!(
                published_packages.contains(&package_name),
                "Platform suffix '{suffix}' produces CLI package name '{package_name}' \
                 which does not match any published CLI package"
            );
        }
    }
}

================================================
FILE: crates/vite_global_cli/src/commands/version.rs
================================================
//! Version command.
use std::{
    collections::BTreeMap,
    fs,
    path::{Path, PathBuf},
    process::ExitStatus,
};

use owo_colors::OwoColorize;
use serde::Deserialize;
use vite_install::get_package_manager_type_and_version;
use vite_path::AbsolutePathBuf;
use vite_workspace::find_workspace_root;

use crate::{commands::env::config::resolve_version, error::Error, help};

#[derive(Debug, Deserialize)]
#[serde(rename_all = "camelCase")]
struct PackageJson {
    version: String,
    #[serde(default)]
    bundled_versions: BTreeMap<String, String>,
}

#[derive(Debug)]
struct LocalVitePlus {
    version: String,
    package_dir: PathBuf,
}

#[derive(Debug, Clone, Copy)]
struct ToolSpec {
    display_name: &'static str,
    package_name: &'static str,
    bundled_version_key: Option<&'static str>,
}

const TOOL_SPECS: [ToolSpec; 7] = [
    ToolSpec {
        display_name: "vite",
        package_name: "@voidzero-dev/vite-plus-core",
        bundled_version_key: Some("vite"),
    },
    ToolSpec {
        display_name: "rolldown",
        package_name: "@voidzero-dev/vite-plus-core",
        bundled_version_key: Some("rolldown"),
    },
    ToolSpec {
        display_name: "vitest",
        package_name: "@voidzero-dev/vite-plus-test",
        bundled_version_key: Some("vitest"),
    },
    ToolSpec { display_name: "oxfmt", package_name: "oxfmt", bundled_version_key: None },
    ToolSpec { display_name: "oxlint", package_name: "oxlint", bundled_version_key: None },
    ToolSpec {
        display_name: "oxlint-tsgolint",
        package_name: "oxlint-tsgolint",
        bundled_version_key: None,
    },
    ToolSpec {
        display_name: "tsdown",
        package_name: "@voidzero-dev/vite-plus-core",
        bundled_version_key: Some("tsdown"),
    },
];

const NOT_FOUND: &str = "Not found";

fn read_package_json(package_json_path: &Path) -> Option<PackageJson> {
    let content = fs::read_to_string(package_json_path).ok()?;
    serde_json::from_str(&content).ok()
}

fn find_local_vite_plus(start: &Path) -> Option<LocalVitePlus> {
    let mut current = Some(start);
    while let Some(dir) = current {
        let package_json_path = dir.join("node_modules").join("vite-plus").join("package.json");
        if let Some(pkg) = read_package_json(&package_json_path) {
            let package_dir = package_json_path.parent()?.to_path_buf();
            // Follow symlinks (pnpm links node_modules/vite-plus -> node_modules/.pnpm/.../vite-plus)
            // so parent traversal can discover colocated dependency links.
            let package_dir = fs::canonicalize(&package_dir).unwrap_or(package_dir);
            return Some(LocalVitePlus { version: pkg.version, package_dir });
        }
        current = dir.parent();
    }
    None
}

fn resolve_package_json(base_dir: &Path, package_name: &str) -> Option<PackageJson> {
    let mut current = Some(base_dir);
    while let Some(dir) = current {
        let package_json_path = dir.join("node_modules").join(package_name).join("package.json");
        if let Some(pkg) = read_package_json(&package_json_path) {
            return Some(pkg);
        }
        current = dir.parent();
    }
    None
}

fn resolve_tool_version(local: &LocalVitePlus, tool: ToolSpec) -> Option<String> {
    let pkg = resolve_package_json(&local.package_dir, tool.package_name)?;
    if let Some(key) = tool.bundled_version_key
        && let Some(version) = pkg.bundled_versions.get(key)
    {
        return Some(version.clone());
    }
    Some(pkg.version)
}

fn accent(text: &str) -> String {
    if help::should_style_help() { text.bright_blue().to_string() } else { text.to_string() }
}

fn print_rows(title: &str, rows: &[(&str, String)]) {
    println!("{}", help::render_heading(title));
    let label_width = rows.iter().map(|(label, _)| label.chars().count()).max().unwrap_or(0);
    for (label, value) in rows {
        let padding = " ".repeat(label_width.saturating_sub(label.chars().count()));
        println!(" {}{} {value}", accent(label), padding);
    }
}

fn format_version(version: Option<String>) -> String {
    match version {
        Some(v) => format!("v{v}"),
        None => NOT_FOUND.to_string(),
    }
}

async fn get_node_version_info(cwd: &AbsolutePathBuf) -> Option<(String, String)> {
    // Try the full managed resolution chain
    if let Ok(resolution) = resolve_version(cwd).await {
        return Some((resolution.version, resolution.source));
    }
    // Fallback: detect system Node version (with VITE_PLUS_BYPASS to avoid hitting the shim)
    let version = detect_system_node_version()?;
    Some((version, "system".to_string()))
}

fn detect_system_node_version() -> Option<String> {
    let output = std::process::Command::new("node")
        .arg("--version")
        .env(vite_shared::env_vars::VITE_PLUS_BYPASS, "1")
        .output()
        .ok()?;
    if !output.status.success() {
        return None;
    }
    let version = String::from_utf8(output.stdout).ok()?;
    let version = version.trim().strip_prefix('v').unwrap_or(version.trim());
    if version.is_empty() {
        return None;
    }
    Some(version.to_string())
}

/// Execute the `--version` command.
pub async fn execute(cwd: AbsolutePathBuf) -> Result<ExitStatus, Error> {
    println!("{}", vite_shared::header::vite_plus_header());
    println!();
    println!("vp v{}", env!("CARGO_PKG_VERSION"));
    println!();

    // Local vite-plus and tools
    let local = find_local_vite_plus(cwd.as_path());
    print_rows(
        "Local vite-plus",
        &[("vite-plus", format_version(local.as_ref().map(|pkg| pkg.version.clone())))],
    );
    println!();

    let tool_rows = TOOL_SPECS
        .iter()
        .map(|tool| {
            let version =
                local.as_ref().and_then(|local_pkg| resolve_tool_version(local_pkg, *tool));
            (tool.display_name, format_version(version))
        })
        .collect::<Vec<_>>();
    print_rows("Tools", &tool_rows);
    println!();

    // Environment info
    let package_manager_info = find_workspace_root(&cwd)
        .ok()
        .and_then(|(root, _)| {
            get_package_manager_type_and_version(&root, None)
                .ok()
                .map(|(pm, v, _)| format!("{pm} v{v}"))
        })
        .unwrap_or(NOT_FOUND.to_string());

    let node_info = get_node_version_info(&cwd)
        .await
        .map(|(v, s)| match s.as_str() {
            "lts" | "default" | "system" => format!("v{v}"),
            _ => format!("v{v} ({s})"),
        })
        .unwrap_or(NOT_FOUND.to_string());

    let env_rows = [("Package manager", package_manager_info), ("Node.js", node_info)];
    print_rows("Environment", &env_rows);

    Ok(ExitStatus::default())
}

#[cfg(test)]
mod tests {
    #[cfg(unix)]
    use std::{fs, path::Path};

    #[cfg(unix)]
    use super::{ToolSpec, find_local_vite_plus, resolve_tool_version};
    use super::{detect_system_node_version, format_version};

    #[cfg(unix)]
    fn symlink_dir(src: &Path, dst: &Path) {
        std::os::unix::fs::symlink(src,
dst).unwrap(); } #[test] fn format_version_values() { assert_eq!(format_version(Some("1.2.3".to_string())), "v1.2.3"); assert_eq!(format_version(None), "Not found"); } #[test] fn detect_system_node_version_returns_version() { let version = detect_system_node_version(); assert!(version.is_some(), "expected node to be installed"); let version = version.unwrap(); assert!(!version.starts_with('v'), "version should not have v prefix"); assert!(version.contains('.'), "expected semver-like version, got: {version}"); } #[cfg(unix)] #[test] fn resolves_tool_versions_from_pnpm_symlink_layout() { let temp = tempfile::tempdir().unwrap(); let project = temp.path(); let pnpm_pkg_dir = project.join("node_modules/.pnpm/vite-plus@1.0.0/node_modules/vite-plus"); fs::create_dir_all(&pnpm_pkg_dir).unwrap(); fs::write(pnpm_pkg_dir.join("package.json"), r#"{"version":"1.0.0"}"#).unwrap(); let core_pkg_dir = project .join("node_modules/.pnpm/vite-plus@1.0.0/node_modules/@voidzero-dev/vite-plus-core"); fs::create_dir_all(&core_pkg_dir).unwrap(); fs::write( core_pkg_dir.join("package.json"), r#"{"version":"1.0.0","bundledVersions":{"vite":"8.0.0"}}"#, ) .unwrap(); let node_modules_dir = project.join("node_modules"); fs::create_dir_all(&node_modules_dir).unwrap(); symlink_dir( Path::new(".pnpm/vite-plus@1.0.0/node_modules/vite-plus"), &node_modules_dir.join("vite-plus"), ); let local = find_local_vite_plus(project).expect("expected local vite-plus to resolve"); let tool = ToolSpec { display_name: "vite", package_name: "@voidzero-dev/vite-plus-core", bundled_version_key: Some("vite"), }; let resolved = resolve_tool_version(&local, tool); assert_eq!(resolved.as_deref(), Some("8.0.0")); } } ================================================ FILE: crates/vite_global_cli/src/commands/vpx.rs ================================================ //! `vpx` command implementation. //! //! Executes a command from a local or remote npm package (like `npx`). //! Resolution order: //! 1. 
//! Local `node_modules/.bin` (walk up from cwd)
//! 2. Global vp packages (installed via `vp install -g`)
//! 3. System PATH (excluding vite-plus bin directory)
//! 4. Remote download via `vp dlx`

use vite_path::{AbsolutePath, AbsolutePathBuf};
use vite_shared::{PrependOptions, output, prepend_to_path_env};

use super::DlxCommand;
use crate::{commands::env::config, shim::dispatch};

/// Parsed vpx flags.
#[derive(Debug, Default)]
pub struct VpxFlags {
    /// Packages to install (from --package/-p)
    pub packages: Vec<String>,
    /// Execute within a shell environment (-c/--shell-mode)
    pub shell_mode: bool,
    /// Suppress output (-s/--silent)
    pub silent: bool,
    /// Show help (-h/--help)
    pub help: bool,
}

/// Help text for vpx.
const VPX_HELP: &str = "\
Execute a command from a local or remote npm package

Usage: vpx [OPTIONS] <command> [args...]

Arguments:
  <command>  Package binary to execute
  [args...]  Arguments to pass to the command

Options:
  -p, --package <package>  Package(s) to install if not found locally
  -c, --shell-mode         Execute the command within a shell environment
  -s, --silent             Suppress all output except the command's output
  -h, --help               Print help

Examples:
  vpx eslint .                            # Run local eslint (or download)
  vpx create-vue my-app                   # Download and run create-vue
  vpx typescript@5.5.4 tsc --version      # Run specific version
  vpx -p cowsay -c 'echo \"hi\" | cowsay'   # Shell mode with package";

/// A globally installed binary found via `vp install -g`.
struct GlobalBinary {
    path: AbsolutePathBuf,
    is_js: bool,
    node_version: String,
}

/// Main entry point for vpx execution.
///
/// Called from shim dispatch when `argv[0]` is `vpx`.
pub async fn execute_vpx(args: &[String], cwd: &AbsolutePath) -> i32 {
    let (flags, positional) = parse_vpx_args(args);

    // Show help
    if flags.help {
        println!("{VPX_HELP}");
        return 0;
    }

    // No command specified
    if positional.is_empty() {
        output::error("vpx requires a command to run");
        eprintln!();
        eprintln!("Usage: vpx <command> [args...]");
        eprintln!();
        eprintln!("Examples:");
        eprintln!("  vpx eslint .");
        eprintln!("  vpx create-vue my-app");
        return 1;
    }

    let cmd_spec = &positional[0];

    // Extract the command name (binary to look for in node_modules/.bin)
    let cmd_name = extract_command_name(cmd_spec);

    // If no version spec and no --package flag, try local → global → PATH lookup
    if !has_version_spec(cmd_spec) && flags.packages.is_empty() && !flags.shell_mode {
        // 1. Try local node_modules/.bin
        if let Some(local_bin) = find_local_binary(cwd, &cmd_name) {
            tracing::debug!("vpx: found local binary at {}", local_bin.as_path().display());
            prepend_node_modules_bin_to_path(cwd);
            let cmd_args: Vec<String> = positional[1..].to_vec();
            return crate::shim::exec::exec_tool(&local_bin, &cmd_args);
        }

        // 2. Try global vp packages
        if let Some(global_bin) = find_global_binary(&cmd_name).await {
            tracing::debug!("vpx: found global binary at {}", global_bin.path.as_path().display());
            return execute_global_binary(global_bin, &positional[1..], cwd).await;
        }

        // 3. Try system PATH (excluding vite-plus bin dir)
        if let Some(path_bin) = find_on_path(&cmd_name) {
            tracing::debug!("vpx: found on PATH at {}", path_bin.as_path().display());
            prepend_node_modules_bin_to_path(cwd);
            let cmd_args: Vec<String> = positional[1..].to_vec();
            return crate::shim::exec::exec_tool(&path_bin, &cmd_args);
        }
    }

    // 4. Fall back to dlx (remote download)
    let cwd_buf = cwd.to_absolute_path_buf();
    match DlxCommand::new(cwd_buf)
        .execute(flags.packages, flags.shell_mode, flags.silent, positional)
        .await
    {
        Ok(status) => status.code().unwrap_or(1),
        Err(e) => {
            output::error(&format!("vpx: {e}"));
            1
        }
    }
}

/// Find a binary in globally installed vp packages.
///
/// Uses the dispatch helpers to look up BinConfig and PackageMetadata.
async fn find_global_binary(cmd: &str) -> Option<GlobalBinary> {
    let metadata = match dispatch::find_package_for_binary(cmd).await {
        Ok(Some(m)) => m,
        _ => return None,
    };
    let path = match dispatch::locate_package_binary(&metadata.name, cmd) {
        Ok(p) => p,
        Err(_) => return None,
    };
    Some(GlobalBinary {
        is_js: metadata.is_js_binary(cmd),
        node_version: metadata.platform.node.clone(),
        path,
    })
}

/// Execute a globally installed binary.
///
/// Ensures the required Node.js version is installed, prepends its bin dir
/// and local node_modules/.bin dirs to PATH, then executes.
async fn execute_global_binary(bin: GlobalBinary, args: &[String], cwd: &AbsolutePath) -> i32 {
    // Ensure Node.js is installed
    if let Err(e) = dispatch::ensure_installed(&bin.node_version).await {
        output::error(&format!("vpx: Failed to install Node {}: {e}", bin.node_version));
        return 1;
    }

    // Locate node binary for this version
    let node_path = match dispatch::locate_tool(&bin.node_version, "node") {
        Ok(p) => p,
        Err(e) => {
            output::error(&format!("vpx: Node not found: {e}"));
            return 1;
        }
    };

    // Prepend Node.js bin dir to PATH
    let node_bin_dir = node_path.parent().expect("Node has no parent directory");
    prepend_to_path_env(node_bin_dir, PrependOptions::default());

    // Prepend local node_modules/.bin dirs to PATH
    prepend_node_modules_bin_to_path(cwd);

    if bin.is_js {
        // Execute: node <script> <args>
        let mut full_args = vec![bin.path.as_path().display().to_string()];
        full_args.extend(args.iter().cloned());
        crate::shim::exec::exec_tool(&node_path, &full_args)
    } else {
        crate::shim::exec::exec_tool(&bin.path, args)
    }
}

/// Find a command on system PATH, excluding the vite-plus bin directory.
///
/// This prevents vpx from finding itself (or other vite-plus shims) on PATH.
fn find_on_path(cmd: &str) -> Option<AbsolutePathBuf> {
    let bin_dir = config::get_bin_dir().ok();
    let path_var = std::env::var_os("PATH")?;

    // Filter PATH to exclude vite-plus bin directory
    let filtered_paths: Vec<_> = std::env::split_paths(&path_var)
        .filter(|p| {
            if let Some(ref bin) = bin_dir {
                if p == bin.as_path() {
                    return false;
                }
            }
            true
        })
        .collect();
    let filtered_path = std::env::join_paths(filtered_paths).ok()?;

    let cwd = vite_path::current_dir().ok()?;
    vite_command::resolve_bin(cmd, Some(&filtered_path), &cwd).ok()
}

/// Prepend all `node_modules/.bin` directories from cwd upward to PATH.
///
/// Walks up from cwd and prepends each existing `node_modules/.bin` directory
/// to PATH so that sub-processes also resolve local binaries first.
fn prepend_node_modules_bin_to_path(cwd: &AbsolutePath) {
    // Collect dirs bottom-up, then prepend in reverse so nearest is first
    let mut bin_dirs = Vec::new();
    let mut current = cwd;
    loop {
        let bin_dir = current.join("node_modules").join(".bin");
        if bin_dir.as_path().is_dir() {
            bin_dirs.push(bin_dir);
        }
        match current.parent() {
            Some(parent) if parent != current => current = parent,
            _ => break,
        }
    }
    // Prepend in reverse order so the nearest (deepest) directory ends up first
    for dir in bin_dirs.iter().rev() {
        prepend_to_path_env(dir, PrependOptions { dedupe_anywhere: true });
    }
}

/// Walk up from `cwd` looking for `node_modules/.bin/<cmd>`.
///
/// On Windows, also checks for `.cmd` extension.
/// Returns the absolute path to the binary if found.
pub fn find_local_binary(cwd: &AbsolutePath, cmd: &str) -> Option<AbsolutePathBuf> {
    let mut current = cwd;
    loop {
        let bin_dir = current.join("node_modules").join(".bin");
        let bin_path = bin_dir.join(cmd);
        if bin_path.as_path().exists() {
            return Some(bin_path);
        }

        // On Windows, check for .cmd extension
        #[cfg(windows)]
        {
            let cmd_path = bin_dir.join(format!("{cmd}.cmd"));
            if cmd_path.as_path().exists() {
                return Some(cmd_path);
            }
        }

        // Move to parent directory
        match current.parent() {
            Some(parent) if parent != current => current = parent,
            _ => return None, // Reached filesystem root
        }
    }
}

/// Check if a package spec includes a version (e.g., `eslint@9`).
///
/// Scoped packages like `@vue/cli` are not version specs, but
/// `@vue/cli@5.0.0` is.
pub fn has_version_spec(spec: &str) -> bool {
    if spec.starts_with('@') {
        // Scoped package: @scope/pkg@version
        if let Some(slash_pos) = spec.find('/') {
            return spec[slash_pos + 1..].contains('@');
        }
        // Just "@scope" with no slash — not a valid spec, no version
        return false;
    }
    spec.contains('@')
}

/// Extract the command/binary name from a package spec.
///
/// Examples:
/// - `eslint` → `eslint`
/// - `eslint@9` → `eslint`
/// - `@vue/cli` → `cli`
/// - `@vue/cli@5.0.0` → `cli`
fn extract_command_name(spec: &str) -> String {
    if spec.starts_with('@') {
        // Scoped package: @scope/pkg or @scope/pkg@version
        if let Some(slash_pos) = spec.find('/') {
            let after_slash = &spec[slash_pos + 1..];
            // Strip version if present
            if let Some(at_pos) = after_slash.find('@') {
                return after_slash[..at_pos].to_string();
            }
            return after_slash.to_string();
        }
        // Just "@scope" — use as-is (unusual case)
        return spec.to_string();
    }
    // Unscoped: pkg or pkg@version
    if let Some(at_pos) = spec.find('@') {
        spec[..at_pos].to_string()
    } else {
        spec.to_string()
    }
}

/// Parse vpx flags from the argument slice.
///
/// All flags must come before the first positional argument (npx-style).
/// Returns the parsed flags and remaining positional arguments.
pub fn parse_vpx_args(args: &[String]) -> (VpxFlags, Vec<String>) {
    let mut flags = VpxFlags::default();
    let mut positional = Vec::new();

    let mut i = 0;
    while i < args.len() {
        let arg = &args[i];

        // Once we see a non-flag argument, everything else is positional
        if !arg.starts_with('-') {
            positional.extend_from_slice(&args[i..]);
            break;
        }

        match arg.as_str() {
            "-p" | "--package" => {
                i += 1;
                if i < args.len() {
                    flags.packages.push(args[i].clone());
                }
            }
            "-c" | "--shell-mode" => {
                flags.shell_mode = true;
            }
            "-s" | "--silent" => {
                flags.silent = true;
            }
            "-h" | "--help" => {
                flags.help = true;
            }
            other => {
                // Handle --package=VALUE
                if let Some(value) = other.strip_prefix("--package=") {
                    flags.packages.push(value.to_string());
                } else if let Some(value) = other.strip_prefix("-p=") {
                    flags.packages.push(value.to_string());
                } else {
                    // Unknown flag — treat as start of positional args
                    positional.extend_from_slice(&args[i..]);
                    break;
                }
            }
        }
        i += 1;
    }

    (flags, positional)
}

#[cfg(test)]
mod tests {
    use serial_test::serial;

    use super::*;

    // =========================================================================
    // has_version_spec tests
    // =========================================================================

    #[test]
    fn test_has_version_spec_simple_package() {
        assert!(!has_version_spec("eslint"));
    }

    #[test]
    fn test_has_version_spec_with_version() {
        assert!(has_version_spec("eslint@9"));
    }

    #[test]
    fn test_has_version_spec_with_full_version() {
        assert!(has_version_spec("typescript@5.5.4"));
    }

    #[test]
    fn test_has_version_spec_scoped_package_no_version() {
        assert!(!has_version_spec("@vue/cli"));
    }

    #[test]
    fn test_has_version_spec_scoped_package_with_version() {
        assert!(has_version_spec("@vue/cli@5.0.0"));
    }

    #[test]
    fn test_has_version_spec_scoped_no_slash() {
        assert!(!has_version_spec("@vue"));
    }

    #[test]
    fn test_has_version_spec_with_tag() {
        assert!(has_version_spec("eslint@latest"));
    }

    // =========================================================================
    // extract_command_name
    // tests
    // =========================================================================

    #[test]
    fn test_extract_command_name_simple() {
        assert_eq!(extract_command_name("eslint"), "eslint");
    }

    #[test]
    fn test_extract_command_name_with_version() {
        assert_eq!(extract_command_name("eslint@9"), "eslint");
    }

    #[test]
    fn test_extract_command_name_scoped() {
        assert_eq!(extract_command_name("@vue/cli"), "cli");
    }

    #[test]
    fn test_extract_command_name_scoped_with_version() {
        assert_eq!(extract_command_name("@vue/cli@5.0.0"), "cli");
    }

    #[test]
    fn test_extract_command_name_create_vue() {
        assert_eq!(extract_command_name("create-vue"), "create-vue");
    }

    // =========================================================================
    // parse_vpx_args tests
    // =========================================================================

    #[test]
    fn test_parse_vpx_args_simple_command() {
        let args: Vec<String> = vec!["eslint".into(), ".".into()];
        let (flags, positional) = parse_vpx_args(&args);
        assert!(flags.packages.is_empty());
        assert!(!flags.shell_mode);
        assert!(!flags.silent);
        assert!(!flags.help);
        assert_eq!(positional, vec!["eslint", "."]);
    }

    #[test]
    fn test_parse_vpx_args_with_package_flag() {
        let args: Vec<String> =
            vec!["-p".into(), "cowsay".into(), "-c".into(), "echo hi | cowsay".into()];
        let (flags, positional) = parse_vpx_args(&args);
        assert_eq!(flags.packages, vec!["cowsay"]);
        assert!(flags.shell_mode);
        assert_eq!(positional, vec!["echo hi | cowsay"]);
    }

    #[test]
    fn test_parse_vpx_args_with_long_package_flag() {
        let args: Vec<String> = vec!["--package".into(), "yo".into(), "yo".into(), "webapp".into()];
        let (flags, positional) = parse_vpx_args(&args);
        assert_eq!(flags.packages, vec!["yo"]);
        assert_eq!(positional, vec!["yo", "webapp"]);
    }

    #[test]
    fn test_parse_vpx_args_with_package_equals() {
        let args: Vec<String> = vec!["--package=cowsay".into(), "cowsay".into(), "hello".into()];
        let (flags, positional) = parse_vpx_args(&args);
        assert_eq!(flags.packages, vec!["cowsay"]);
        assert_eq!(positional, vec!["cowsay", "hello"]);
    }

    #[test]
    fn test_parse_vpx_args_multiple_packages() {
        let args: Vec<String> = vec![
            "-p".into(),
            "cowsay".into(),
            "-p".into(),
            "lolcatjs".into(),
            "-c".into(),
            "echo hi | cowsay | lolcatjs".into(),
        ];
        let (flags, positional) = parse_vpx_args(&args);
        assert_eq!(flags.packages, vec!["cowsay", "lolcatjs"]);
        assert!(flags.shell_mode);
        assert_eq!(positional, vec!["echo hi | cowsay | lolcatjs"]);
    }

    #[test]
    fn test_parse_vpx_args_silent() {
        let args: Vec<String> = vec!["-s".into(), "create-vue".into(), "my-app".into()];
        let (flags, positional) = parse_vpx_args(&args);
        assert!(flags.silent);
        assert_eq!(positional, vec!["create-vue", "my-app"]);
    }

    #[test]
    fn test_parse_vpx_args_help() {
        let args: Vec<String> = vec!["--help".into()];
        let (flags, positional) = parse_vpx_args(&args);
        assert!(flags.help);
        assert!(positional.is_empty());
    }

    #[test]
    fn test_parse_vpx_args_no_args() {
        let args: Vec<String> = vec![];
        let (flags, positional) = parse_vpx_args(&args);
        assert!(flags.packages.is_empty());
        assert!(!flags.shell_mode);
        assert!(!flags.silent);
        assert!(!flags.help);
        assert!(positional.is_empty());
    }

    #[test]
    fn test_parse_vpx_args_unknown_flag_becomes_positional() {
        let args: Vec<String> = vec!["--version".into()];
        let (flags, positional) = parse_vpx_args(&args);
        assert!(!flags.help);
        assert_eq!(positional, vec!["--version"]);
    }

    // =========================================================================
    // find_local_binary tests
    // =========================================================================

    #[test]
    fn test_find_local_binary_in_cwd() {
        let temp_dir = tempfile::tempdir().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create node_modules/.bin/eslint
        let bin_dir = temp_path.join("node_modules").join(".bin");
        std::fs::create_dir_all(&bin_dir).unwrap();
        let eslint_path = bin_dir.join("eslint");
        std::fs::write(&eslint_path, "#!/bin/sh\n").unwrap();

        let result = find_local_binary(&temp_path, "eslint");
        assert!(result.is_some());
assert_eq!(result.unwrap().as_path(), eslint_path.as_path()); } #[test] fn test_find_local_binary_walks_up() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); // Create node_modules/.bin/eslint at root let bin_dir = temp_path.join("node_modules").join(".bin"); std::fs::create_dir_all(&bin_dir).unwrap(); let eslint_path = bin_dir.join("eslint"); std::fs::write(&eslint_path, "#!/bin/sh\n").unwrap(); // Create nested directory let nested_dir = temp_path.join("packages").join("app"); std::fs::create_dir_all(&nested_dir).unwrap(); let nested_abs = AbsolutePathBuf::new(nested_dir.as_path().to_path_buf()).unwrap(); let result = find_local_binary(&nested_abs, "eslint"); assert!(result.is_some()); assert_eq!(result.unwrap().as_path(), eslint_path.as_path()); } #[test] fn test_find_local_binary_not_found() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); let result = find_local_binary(&temp_path, "nonexistent-tool"); assert!(result.is_none()); } #[test] fn test_find_local_binary_prefers_nearest() { let temp_dir = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap(); // Create eslint at root let root_bin = temp_path.join("node_modules").join(".bin"); std::fs::create_dir_all(&root_bin).unwrap(); std::fs::write(root_bin.join("eslint"), "root").unwrap(); // Create eslint in nested package let nested = temp_path.join("packages").join("app"); let nested_bin = nested.join("node_modules").join(".bin"); std::fs::create_dir_all(&nested_bin).unwrap(); std::fs::write(nested_bin.join("eslint"), "nested").unwrap(); let nested_abs = AbsolutePathBuf::new(nested.as_path().to_path_buf()).unwrap(); let result = find_local_binary(&nested_abs, "eslint"); assert!(result.is_some()); // Should find the nested one first let found = result.unwrap(); assert_eq!(found.as_path(), 
nested_bin.join("eslint").as_path()); } // ========================================================================= // find_global_binary tests // ========================================================================= #[tokio::test] async fn test_find_global_binary_not_installed() { // A binary that doesn't exist in any global package should return None let result = find_global_binary("nonexistent-vpx-test-binary-xyz").await; assert!(result.is_none()); } // ========================================================================= // find_on_path tests // ========================================================================= #[cfg(unix)] fn create_fake_executable(dir: &std::path::Path, name: &str) -> std::path::PathBuf { use std::os::unix::fs::PermissionsExt; let path = dir.join(name); std::fs::write(&path, "#!/bin/sh\n").unwrap(); std::fs::set_permissions(&path, std::fs::Permissions::from_mode(0o755)).unwrap(); path } #[cfg(windows)] fn create_fake_executable(dir: &std::path::Path, name: &str) -> std::path::PathBuf { let path = dir.join(format!("{name}.exe")); std::fs::write(&path, "fake").unwrap(); path } #[test] #[serial] fn test_find_on_path_finds_tool() { let original_path = std::env::var_os("PATH"); let temp = tempfile::tempdir().unwrap(); let dir = temp.path().join("bin_test"); std::fs::create_dir_all(&dir).unwrap(); create_fake_executable(&dir, "vpx-test-tool-abc"); // SAFETY: serial test unsafe { std::env::set_var("PATH", &dir); } let result = find_on_path("vpx-test-tool-abc"); assert!(result.is_some()); unsafe { match &original_path { Some(v) => std::env::set_var("PATH", v), None => std::env::remove_var("PATH"), } } } #[test] #[serial] fn test_find_on_path_excludes_vp_bin_dir() { let original_path = std::env::var_os("PATH"); let original_home = std::env::var_os("VITE_PLUS_HOME"); let temp = tempfile::tempdir().unwrap(); // Set up a fake vite-plus home with bin dir let fake_home = temp.path().join("vite-plus-home"); let fake_bin = 
fake_home.join("bin"); std::fs::create_dir_all(&fake_bin).unwrap(); create_fake_executable(&fake_bin, "vpx-excluded-tool"); // Set up another directory with the same tool let other_dir = temp.path().join("other_bin"); std::fs::create_dir_all(&other_dir).unwrap(); create_fake_executable(&other_dir, "vpx-excluded-tool"); let path = std::env::join_paths([fake_bin.as_path(), other_dir.as_path()]).unwrap(); // SAFETY: serial test unsafe { std::env::set_var("PATH", &path); std::env::set_var("VITE_PLUS_HOME", fake_home.as_os_str()); } let result = find_on_path("vpx-excluded-tool"); assert!(result.is_some()); // Should find the one in other_dir, not fake_bin assert!( result.unwrap().as_path().starts_with(&other_dir), "Should skip vite-plus bin dir and find tool in other directory" ); unsafe { match &original_path { Some(v) => std::env::set_var("PATH", v), None => std::env::remove_var("PATH"), } match &original_home { Some(v) => std::env::set_var("VITE_PLUS_HOME", v), None => std::env::remove_var("VITE_PLUS_HOME"), } } } // ========================================================================= // prepend_node_modules_bin_to_path tests // ========================================================================= #[test] #[serial] fn test_prepend_node_modules_bin_to_path() { let original_path = std::env::var_os("PATH"); let temp = tempfile::tempdir().unwrap(); let temp_path = AbsolutePathBuf::new(temp.path().to_path_buf()).unwrap(); // Create node_modules/.bin at root let root_bin = temp_path.join("node_modules").join(".bin"); std::fs::create_dir_all(&root_bin).unwrap(); // Create node_modules/.bin in nested package let nested = temp_path.join("packages").join("app"); let nested_bin = nested.join("node_modules").join(".bin"); std::fs::create_dir_all(&nested_bin).unwrap(); // SAFETY: serial test unsafe { std::env::set_var("PATH", "/usr/bin"); } prepend_node_modules_bin_to_path(&nested); let new_path = std::env::var_os("PATH").unwrap(); let paths: Vec<_> = 
std::env::split_paths(&new_path).collect(); // Nearest (nested) should be first assert_eq!(paths[0], nested_bin.as_path().to_path_buf()); // Root should be second assert_eq!(paths[1], root_bin.as_path().to_path_buf()); unsafe { match &original_path { Some(v) => std::env::set_var("PATH", v), None => std::env::remove_var("PATH"), } } } } ================================================ FILE: crates/vite_global_cli/src/commands/why.rs ================================================ use std::process::ExitStatus; use vite_install::commands::why::WhyCommandOptions; use vite_path::AbsolutePathBuf; use super::{build_package_manager, prepend_js_runtime_to_path_env}; use crate::error::Error; /// Why command for showing why a package is installed. /// /// This command automatically detects the package manager and translates /// the why command to the appropriate package manager-specific syntax. pub struct WhyCommand { cwd: AbsolutePathBuf, } impl WhyCommand { pub fn new(cwd: AbsolutePathBuf) -> Self { Self { cwd } } #[allow(clippy::too_many_arguments)] pub async fn execute( self, packages: &[String], json: bool, long: bool, parseable: bool, recursive: bool, filters: Option<&[String]>, workspace_root: bool, prod: bool, dev: bool, depth: Option<usize>, no_optional: bool, global: bool, exclude_peers: bool, find_by: Option<&str>, pass_through_args: Option<&[String]>, ) -> Result<ExitStatus, Error> { prepend_js_runtime_to_path_env(&self.cwd).await?; let package_manager = build_package_manager(&self.cwd).await?; let why_command_options = WhyCommandOptions { packages, json, long, parseable, recursive, filters, workspace_root, prod, dev, depth, no_optional, global, exclude_peers, find_by, pass_through_args, }; Ok(package_manager.run_why_command(&why_command_options, &self.cwd).await?)
} } #[cfg(test)] mod tests { use super::*; #[test] fn test_why_command_new() { let workspace_root = if cfg!(windows) { AbsolutePathBuf::new("C:\\test".into()).unwrap() } else { AbsolutePathBuf::new("/test".into()).unwrap() }; let cmd = WhyCommand::new(workspace_root.clone()); assert_eq!(cmd.cwd, workspace_root); } } ================================================ FILE: crates/vite_global_cli/src/error.rs ================================================ //! Error types for the global CLI. use std::io; use vite_str::Str; /// Error type for the global CLI. #[derive(Debug, thiserror::Error)] pub enum Error { #[allow(dead_code)] // Will be used for better error messages #[error("No package manager detected. Please run in a project directory with a package.json.")] NoPackageManager, #[error("Failed to download Node.js runtime: {0}")] RuntimeDownload(#[from] vite_js_runtime::Error), #[error("Command execution failed: {0}")] CommandExecution(#[from] io::Error), #[error( "JS scripts directory not found. Set VITE_GLOBAL_CLI_JS_SCRIPTS_DIR or ensure scripts are bundled." )] JsScriptsDirNotFound, #[error("Failed to determine CLI binary path")] CliBinaryNotFound, #[error("Workspace error: {0}")] Workspace(#[from] vite_workspace::Error), #[error("Install error: {0}")] Install(#[from] vite_error::Error), #[error("Configuration error: {0}")] ConfigError(Str), #[error("JSON error: {0}")] JsonError(#[from] serde_json::Error), #[error("{0}")] Other(Str), /// User-facing message printed without "Error: " prefix. 
#[error("{0}")] UserMessage(Str), #[error( "Executable '{bin_name}' is already installed by {existing_package}\n\nPlease remove {existing_package} before installing {new_package}, or use --force to auto-replace" )] BinaryConflict { bin_name: String, existing_package: String, new_package: String }, #[error("Upgrade error: {0}")] Upgrade(Str), #[error("Integrity mismatch: expected {expected}, got {actual}")] IntegrityMismatch { expected: Str, actual: Str }, #[error("Unsupported integrity format: {0} (only sha512 is supported)")] UnsupportedIntegrity(Str), } ================================================ FILE: crates/vite_global_cli/src/help.rs ================================================ //! Unified help rendering for the global CLI. use std::{fmt::Write as _, io::IsTerminal}; use clap::{CommandFactory, error::ErrorKind}; use owo_colors::OwoColorize; #[derive(Clone, Debug)] pub struct HelpDoc { pub usage: &'static str, pub summary: Vec<&'static str>, pub sections: Vec<HelpSection>, pub documentation_url: Option<&'static str>, } #[derive(Clone, Debug)] pub enum HelpSection { Rows { title: &'static str, rows: Vec<HelpRow> }, Lines { title: &'static str, lines: Vec<&'static str> }, } #[derive(Clone, Debug)] pub struct HelpRow { pub label: &'static str, pub description: Vec<&'static str>, } #[derive(Clone, Debug)] struct OwnedHelpDoc { usage: String, summary: Vec<String>, sections: Vec<OwnedHelpSection>, documentation_url: Option<String>, } #[derive(Clone, Debug)] enum OwnedHelpSection { Rows { title: String, rows: Vec<OwnedHelpRow> }, Lines { title: String, lines: Vec<String> }, } #[derive(Clone, Debug)] struct OwnedHelpRow { label: String, description: Vec<String>, } fn row(label: &'static str, description: &'static str) -> HelpRow { HelpRow { label, description: vec![description] } } fn section_rows(title: &'static str, rows: Vec<HelpRow>) -> HelpSection { HelpSection::Rows { title, rows } } fn section_lines(title: &'static str, lines: Vec<&'static str>) -> HelpSection { HelpSection::Lines { title, lines } } fn
documentation_url_for_command_path(command_path: &[&str]) -> Option<&'static str> { match command_path { [] => Some("https://viteplus.dev/guide/"), ["create"] => Some("https://viteplus.dev/guide/create"), ["migrate"] => Some("https://viteplus.dev/guide/migrate"), ["config"] | ["staged"] => Some("https://viteplus.dev/guide/commit-hooks"), [ "install" | "add" | "remove" | "update" | "dedupe" | "outdated" | "list" | "ls" | "why" | "info" | "view" | "show" | "link" | "unlink" | "pm", .., ] => Some("https://viteplus.dev/guide/install"), ["dev"] => Some("https://viteplus.dev/guide/dev"), ["check"] => Some("https://viteplus.dev/guide/check"), ["lint"] => Some("https://viteplus.dev/guide/lint"), ["fmt"] => Some("https://viteplus.dev/guide/fmt"), ["test"] => Some("https://viteplus.dev/guide/test"), ["run"] => Some("https://viteplus.dev/guide/run"), ["exec" | "dlx"] => Some("https://viteplus.dev/guide/vpx"), ["cache"] => Some("https://viteplus.dev/guide/cache"), ["build" | "preview"] => Some("https://viteplus.dev/guide/build"), ["pack"] => Some("https://viteplus.dev/guide/pack"), ["env", ..] 
=> Some("https://viteplus.dev/guide/env"), ["upgrade"] => Some("https://viteplus.dev/guide/upgrade"), _ => None, } } pub fn render_heading(title: &str) -> String { let heading = format!("{title}:"); if !should_style_help() { return heading; } if should_accent_heading(title) { heading.bold().bright_blue().to_string() } else { heading.bold().to_string() } } fn render_usage_value(usage: &str) -> String { if should_style_help() { usage.bold().to_string() } else { usage.to_string() } } fn should_accent_heading(title: &str) -> bool { title != "Usage" } fn write_documentation_footer(output: &mut String, documentation_url: &str) { let _ = writeln!(output); let _ = writeln!(output, "{} {documentation_url}", render_heading("Documentation")); } pub fn should_style_help() -> bool { std::io::stdout().is_terminal() && std::env::var_os("NO_COLOR").is_none() && std::env::var("CLICOLOR").map_or(true, |value| value != "0") && std::env::var("TERM").map_or(true, |term| term != "dumb") } fn render_rows(rows: &[HelpRow]) -> Vec<String> { if rows.is_empty() { return vec![]; } let label_width = rows.iter().map(|row| row.label.len()).max().unwrap_or(0); let mut output = Vec::new(); for row in rows { let mut description_iter = row.description.iter(); if let Some(first) = description_iter.next() { output.push(format!(" {:label_width$} {}", row.label, first)); for line in description_iter { output.push(format!(" {:label_width$} {}", "", line)); } } else { output.push(format!(" {}", row.label)); } } output } fn render_owned_rows(rows: &[OwnedHelpRow]) -> Vec<String> { if rows.is_empty() { return vec![]; } let label_width = rows.iter().map(|row| row.label.chars().count()).max().unwrap_or(0); let mut output = Vec::new(); for row in rows { let mut description_iter = row.description.iter(); if let Some(first) = description_iter.next() { output.push(format!(" {:label_width$} {}", row.label, first)); for line in description_iter { output.push(format!(" {:label_width$} {}", "", line)); } } else {
output.push(format!(" {}", row.label)); } } output } fn split_comment_suffix(line: &str) -> Option<(&str, &str)> { line.find(" #").map(|index| line.split_at(index)) } fn render_muted_comment_suffix(line: &str) -> String { if !should_style_help() { return line.to_string(); } if let Some((prefix, suffix)) = split_comment_suffix(line) { return format!("{}{}", prefix, suffix.bright_black()); } line.to_string() } pub fn render_help_doc(doc: &HelpDoc) -> String { let mut output = String::new(); let _ = writeln!(output, "{} {}", render_heading("Usage"), render_usage_value(doc.usage)); if !doc.summary.is_empty() { let _ = writeln!(output); for line in &doc.summary { let _ = writeln!(output, "{line}"); } } for section in &doc.sections { let _ = writeln!(output); match section { HelpSection::Rows { title, rows } => { let _ = writeln!(output, "{}", render_heading(title)); for line in render_rows(rows) { let _ = writeln!(output, "{line}"); } } HelpSection::Lines { title, lines } => { let _ = writeln!(output, "{}", render_heading(title)); for line in lines { let _ = writeln!(output, "{}", render_muted_comment_suffix(line)); } } } } if let Some(documentation_url) = doc.documentation_url { write_documentation_footer(&mut output, documentation_url); } output } fn render_owned_help_doc(doc: &OwnedHelpDoc) -> String { let mut output = String::new(); let _ = writeln!(output, "{} {}", render_heading("Usage"), render_usage_value(&doc.usage)); if !doc.summary.is_empty() { let _ = writeln!(output); for line in &doc.summary { let _ = writeln!(output, "{line}"); } } for section in &doc.sections { let _ = writeln!(output); match section { OwnedHelpSection::Rows { title, rows } => { let _ = writeln!(output, "{}", render_heading(title)); for line in render_owned_rows(rows) { let _ = writeln!(output, "{line}"); } } OwnedHelpSection::Lines { title, lines } => { let _ = writeln!(output, "{}", render_heading(title)); for line in lines { let _ = writeln!(output, "{}", 
render_muted_comment_suffix(line)); } } } } if let Some(documentation_url) = &doc.documentation_url { write_documentation_footer(&mut output, documentation_url); } output } fn is_section_heading(line: &str) -> bool { let trimmed = line.trim_end(); !trimmed.is_empty() && !trimmed.starts_with(' ') && trimmed.ends_with(':') } fn split_label_and_description(content: &str) -> Option<(String, String)> { let bytes = content.as_bytes(); let mut i = 0; while i + 1 < bytes.len() { if bytes[i] == b' ' && bytes[i + 1] == b' ' { let mut j = i + 2; while j < bytes.len() && bytes[j] == b' ' { j += 1; } let label = content[..i].trim_end(); let description = content[j..].trim_start(); if !label.is_empty() && !description.is_empty() { return Some((label.to_string(), description.to_string())); } i = j; continue; } i += 1; } None } fn parse_rows(lines: &[String]) -> Vec<OwnedHelpRow> { let mut rows = Vec::new(); for line in lines { if line.trim().is_empty() { continue; } let leading = line.chars().take_while(|c| *c == ' ').count(); let content = line.trim_start(); if content.is_empty() { continue; } if let Some((label, description)) = split_label_and_description(content) { rows.push(OwnedHelpRow { label, description: vec![description] }); continue; } if leading >= 4 && content.starts_with('-') { rows.push(OwnedHelpRow { label: content.to_string(), description: vec![] }); continue; } if leading >= 4 { if let Some(last) = rows.last_mut() { last.description.push(content.to_string()); continue; } } rows.push(OwnedHelpRow { label: content.to_string(), description: vec![] }); } rows } fn strip_ansi(value: &str) -> String { let mut output = String::with_capacity(value.len()); let mut chars = value.chars().peekable(); while let Some(ch) = chars.next() { if ch == '\u{1b}' { match chars.peek().copied() { // CSI sequence (for example: \x1b[1m) Some('[') => { let _ = chars.next(); for c in chars.by_ref() { if ('@'..='~').contains(&c) { break; } } } // OSC sequence (for example: hyperlinks) Some(']') => { let _
= chars.next(); let mut prev = '\0'; for c in chars.by_ref() { if c == '\u{7}' || (prev == '\u{1b}' && c == '\\') { break; } prev = c; } } _ => {} } continue; } output.push(ch); } output } fn parse_clap_help_to_doc(raw_help: &str) -> Option<OwnedHelpDoc> { let normalized = raw_help.replace("\r\n", "\n"); let lines: Vec<String> = normalized.lines().map(strip_ansi).collect(); let usage_index = lines.iter().position(|line| line.starts_with("Usage: "))?; let usage = lines[usage_index].trim_start_matches("Usage: ").trim().to_string(); let summary = lines[..usage_index] .iter() .map(|line| line.trim_end()) .filter(|line| !line.trim().is_empty()) .map(str::to_string) .collect::<Vec<_>>(); let mut sections = Vec::new(); let mut i = usage_index + 1; while i < lines.len() { if lines[i].trim().is_empty() { i += 1; continue; } if !is_section_heading(&lines[i]) { i += 1; continue; } let title = lines[i].trim_end().trim_end_matches(':').to_string(); i += 1; let mut body = Vec::new(); while i < lines.len() { if is_section_heading(&lines[i]) { break; } body.push(lines[i].trim_end().to_string()); i += 1; } let first_non_empty = body.iter().position(|line| !line.trim().is_empty()); let last_non_empty = body.iter().rposition(|line| !line.trim().is_empty()); let body = match (first_non_empty, last_non_empty) { (Some(start), Some(end)) if start <= end => body[start..=end].to_vec(), _ => vec![], }; let row_sections = matches!(title.as_str(), "Arguments" | "Options" | "Commands" | "Subcommands"); if row_sections { let rows = parse_rows(&body); sections.push(OwnedHelpSection::Rows { title, rows }); } else { let lines = body.into_iter().filter(|line| !line.trim().is_empty()).collect::<Vec<_>>(); sections.push(OwnedHelpSection::Lines { title, lines }); } } Some(OwnedHelpDoc { usage, summary, sections, documentation_url: None }) } pub fn top_level_help_doc() -> HelpDoc { HelpDoc { usage: "vp [COMMAND]", summary: Vec::new(), sections: vec![ section_rows( "Start", vec![ row("create", "Create a new project from a template"),
row("migrate", "Migrate an existing project to Vite+"), row("config", "Configure hooks and agent integration"), row("staged", "Run linters on staged files"), row( "install, i", "Install all dependencies, or add packages if package names are provided", ), row("env", "Manage Node.js versions"), ], ), section_rows( "Develop", vec![ row("dev", "Run the development server"), row("check", "Run format, lint, and type checks"), row("lint", "Lint code"), row("fmt", "Format code"), row("test", "Run tests"), ], ), section_rows( "Execute", vec![ row("run", "Run tasks"), row("exec", "Execute a command from local node_modules/.bin"), row("dlx", "Execute a package binary without installing it as a dependency"), row("cache", "Manage the task cache"), ], ), section_rows( "Build", vec![ row("build", "Build for production"), row("pack", "Build library"), row("preview", "Preview production build"), ], ), section_rows( "Manage Dependencies", vec![ row("add", "Add packages to dependencies"), row("remove, rm, un, uninstall", "Remove packages from dependencies"), row("update, up", "Update packages to their latest versions"), row("dedupe", "Deduplicate dependencies by removing older versions"), row("outdated", "Check for outdated packages"), row("list, ls", "List installed packages"), row("why, explain", "Show why a package is installed"), row("info, view, show", "View package information from the registry"), row("link, ln", "Link packages for local development"), row("unlink", "Unlink packages"), row("pm", "Forward a command to the package manager"), ], ), section_rows( "Maintain", vec![ row("upgrade", "Update vp itself to the latest version"), row("implode", "Remove vp and all related data"), ], ), ], documentation_url: documentation_url_for_command_path(&[]), } } fn env_help_doc() -> HelpDoc { HelpDoc { usage: "vp env [COMMAND]", summary: vec!["Manage Node.js versions"], sections: vec![ section_rows( "Setup", vec![ row("setup", "Create or update shims in VITE_PLUS_HOME/bin"), row("on", 
"Enable managed mode - shims always use vite-plus managed Node.js"), row( "off", "Enable system-first mode - shims prefer system Node.js, fallback to managed", ), row("print", "Print shell snippet to set environment for current session"), ], ), section_rows( "Manage", vec![ row("default", "Set or show the global default Node.js version"), row( "pin", "Pin a Node.js version in the current directory (creates .node-version)", ), row( "unpin", "Remove the .node-version file from current directory (alias for `pin --unpin`)", ), row("use", "Use a specific Node.js version for this shell session"), row("install", "Install a Node.js version [aliases: i]"), row("uninstall", "Uninstall a Node.js version [aliases: uni]"), row("exec", "Execute a command with a specific Node.js version [aliases: run]"), ], ), section_rows( "Inspect", vec![ row("current", "Show current environment information"), row("doctor", "Run diagnostics and show environment status"), row("which", "Show path to the tool that would be executed"), row("list", "List locally installed Node.js versions [aliases: ls]"), row( "list-remote", "List available Node.js versions from the registry [aliases: ls-remote]", ), ], ), section_lines( "Examples", vec![ " Setup:", " vp env setup # Create shims for node, npm, npx", " vp env on # Use vite-plus managed Node.js", " vp env print # Print shell snippet for this session", "", " Manage:", " vp env pin lts # Pin to latest LTS version", " vp env install # Install version from .node-version / package.json", " vp env use 20 # Use Node.js 20 for this shell session", " vp env use --unset # Remove session override", "", " Inspect:", " vp env current # Show current resolved environment", " vp env current --json # JSON output for automation", " vp env doctor # Check environment configuration", " vp env which node # Show which node binary will be used", " vp env list-remote --lts # List only LTS versions", "", " Execute:", " vp env exec --node lts npm i # Execute 'npm i' with latest 
LTS", " vp env exec node -v # Shim mode (version auto-resolved)", ], ), section_lines( "Related Commands", vec![ " vp install -g <package> # Install a package globally", " vp uninstall -g <package> # Uninstall a package globally", " vp update -g [package] # Update global packages", " vp list -g [package] # List global packages", ], ), ], documentation_url: documentation_url_for_command_path(&["env"]), } } fn delegated_help_doc(command: &str) -> Option<HelpDoc> { match command { "dev" => Some(HelpDoc { usage: "vp dev [ROOT] [OPTIONS]", summary: vec!["Run the development server.", "Options are forwarded to Vite."], sections: vec![ section_rows( "Arguments", vec![row("[ROOT]", "Project root directory (default: current directory)")], ), section_rows( "Options", vec![ row("--host [HOST]", "Specify hostname"), row("--port <PORT>", "Specify port"), row("--open [PATH]", "Open browser on startup"), row("--strictPort", "Exit if specified port is already in use"), row("-c, --config <FILE>", "Use specified config file"), row("--base <PATH>", "Public base path"), row("-m, --mode <MODE>", "Set env mode"), row("-h, --help", "Print help"), ], ), section_lines( "Examples", vec![" vp dev", " vp dev --open", " vp dev --host localhost --port 5173"], ), ], documentation_url: documentation_url_for_command_path(&["dev"]), }), "build" => Some(HelpDoc { usage: "vp build [ROOT] [OPTIONS]", summary: vec!["Build for production.", "Options are forwarded to Vite."], sections: vec![ section_rows( "Arguments", vec![row("[ROOT]", "Project root directory (default: current directory)")], ), section_rows( "Options", vec![ row("--target <TARGET>", "Transpile target"), row("--outDir <DIR>", "Output directory"), row("--sourcemap [MODE]", "Output source maps"), row("--minify [MINIFIER]", "Enable/disable minification"), row("-w, --watch", "Rebuild when files change"), row("-c, --config <FILE>", "Use specified config file"), row("-m, --mode <MODE>", "Set env mode"), row("-h, --help", "Print help"), ], ), section_lines( "Examples", vec![" vp build", " vp build --watch", " vp build
--sourcemap"], ), ], documentation_url: documentation_url_for_command_path(&["build"]), }), "preview" => Some(HelpDoc { usage: "vp preview [ROOT] [OPTIONS]", summary: vec!["Preview production build.", "Options are forwarded to Vite."], sections: vec![ section_rows( "Arguments", vec![row("[ROOT]", "Project root directory (default: current directory)")], ), section_rows( "Options", vec![ row("--host [HOST]", "Specify hostname"), row("--port <PORT>", "Specify port"), row("--strictPort", "Exit if specified port is already in use"), row("--open [PATH]", "Open browser on startup"), row("--outDir <DIR>", "Output directory to preview"), row("-c, --config <FILE>", "Use specified config file"), row("-m, --mode <MODE>", "Set env mode"), row("-h, --help", "Print help"), ], ), section_lines("Examples", vec![" vp preview", " vp preview --port 4173"]), ], documentation_url: documentation_url_for_command_path(&["preview"]), }), "test" => Some(HelpDoc { usage: "vp test [COMMAND] [FILTERS] [OPTIONS]", summary: vec!["Run tests.", "Options are forwarded to Vitest."], sections: vec![ section_rows( "Commands", vec![ row("run", "Run tests once"), row("watch", "Run tests in watch mode"), row("dev", "Run tests in development mode"), row("related", "Run tests related to changed files"), row("bench", "Run benchmarks"), row("init", "Initialize Vitest config"), row("list", "List matching tests"), ], ), section_rows( "Options", vec![ row("-c, --config <PATH>", "Path to config file"), row("-w, --watch", "Enable watch mode"), row("-t, --testNamePattern <PATTERN>", "Run tests matching regexp"), row("--ui", "Enable UI"), row("--coverage", "Enable coverage"), row("--reporter <NAME>", "Specify reporter"), row("-h, --help", "Print help"), ], ), section_lines( "Examples", vec![ " vp test", " vp test run src/foo.test.ts", " vp test watch --coverage", ], ), ], documentation_url: documentation_url_for_command_path(&["test"]), }), "lint" => Some(HelpDoc { usage: "vp lint [PATH]...
[OPTIONS]", summary: vec!["Lint code.", "Options are forwarded to Oxlint."], sections: vec![ section_rows( "Options", vec![ row("--tsconfig <PATH>", "TypeScript tsconfig path"), row("--fix", "Fix issues when possible"), row("--type-aware", "Enable rules requiring type information"), row("--import-plugin", "Enable import plugin"), row("--rules", "List registered rules"), row("-h, --help", "Print help"), ], ), section_lines( "Examples", vec![ " vp lint", " vp lint src --fix", " vp lint --type-aware --tsconfig ./tsconfig.json", ], ), ], documentation_url: documentation_url_for_command_path(&["lint"]), }), "fmt" => Some(HelpDoc { usage: "vp fmt [PATH]... [OPTIONS]", summary: vec!["Format code.", "Options are forwarded to Oxfmt."], sections: vec![ section_rows( "Options", vec![ row("--write", "Format and write files in place"), row("--check", "Check if files are formatted"), row("--list-different", "List files that would be changed"), row("--ignore-path <PATH>", "Path to ignore file(s)"), row("--threads <NUM>", "Number of threads to use"), row("-h, --help", "Print help"), ], ), section_lines( "Examples", vec![" vp fmt", " vp fmt src --check", " vp fmt .
--write"], ), ], documentation_url: documentation_url_for_command_path(&["fmt"]), }), "check" => Some(HelpDoc { usage: "vp check [OPTIONS] [PATHS]...", summary: vec!["Run format, lint, and type checks."], sections: vec![ section_rows( "Options", vec![ row("--fix", "Auto-fix format and lint issues"), row("--no-fmt", "Skip format check"), row("--no-lint", "Skip lint check"), row("-h, --help", "Print help"), ], ), section_lines( "Examples", vec![" vp check", " vp check --fix", " vp check --no-lint src/index.ts"], ), ], documentation_url: documentation_url_for_command_path(&["check"]), }), "pack" => Some(HelpDoc { usage: "vp pack [...FILES] [OPTIONS]", summary: vec!["Build library.", "Options are forwarded to tsdown."], sections: vec![ section_rows( "Options", vec![ row("-f, --format <FORMAT>", "Bundle format: esm, cjs, iife, umd"), row("-d, --out-dir <DIR>", "Output directory"), row("--sourcemap", "Generate source map"), row("--dts", "Generate dts files"), row("--minify", "Minify output"), row("-w, --watch [PATH]", "Watch mode"), row("-h, --help", "Print help"), ], ), section_lines( "Examples", vec![" vp pack", " vp pack src/index.ts --dts", " vp pack --watch"], ), ], documentation_url: documentation_url_for_command_path(&["pack"]), }), "run" => Some(HelpDoc { usage: "vp run [OPTIONS] [TASK_SPECIFIER] [ADDITIONAL_ARGS]...", summary: vec!["Run tasks."], sections: vec![ section_rows( "Arguments", vec![ row( "[TASK_SPECIFIER]", "`packageName#taskName` or `taskName`.
If omitted, lists all available tasks", ), row("[ADDITIONAL_ARGS]...", "Additional arguments to pass to the tasks"), ], ), section_rows( "Options", vec![ row("-r, --recursive", "Select all packages in the workspace"), row( "-t, --transitive", "Select the current package and its transitive dependencies", ), row("-w, --workspace-root", "Select the workspace root package"), row( "-F, --filter <PATTERN>", "Match packages by name, directory, or glob pattern", ), row( "--ignore-depends-on", "Do not run dependencies specified in `dependsOn` fields", ), row("-v, --verbose", "Show full detailed summary after execution"), row("--last-details", "Display the detailed summary of the last run"), row("-h, --help", "Print help (see more with '--help')"), ], ), section_lines( "Filter Patterns", vec![ " --filter <name>       Select by package name (e.g. foo, @scope/*)", " --filter ./<dir>      Select packages under a directory", " --filter {<dir>}      Same as ./<dir>, but allows traversal suffixes", " --filter <name>...    Select package and its dependencies", " --filter ...<name>    Select package and its dependents", " --filter <name>^...   Select only the dependencies (exclude the package itself)", " --filter !<pattern>
Exclude packages matching the pattern", ], ), ], documentation_url: documentation_url_for_command_path(&["run"]), }), "exec" => Some(HelpDoc { usage: "vp exec [OPTIONS] [COMMAND]...", summary: vec!["Execute a command from local node_modules/.bin."], sections: vec![ section_rows( "Arguments", vec![row("[COMMAND]...", "Command and arguments to execute")], ), section_rows( "Options", vec![ row("-r, --recursive", "Select all packages in the workspace"), row( "-t, --transitive", "Select the current package and its transitive dependencies", ), row("-w, --workspace-root", "Select the workspace root package"), row( "-F, --filter <PATTERN>", "Match packages by name, directory, or glob pattern", ), row("-c, --shell-mode", "Execute the command within a shell environment"), row("--parallel", "Run concurrently without topological ordering"), row("--reverse", "Reverse execution order"), row("--resume-from <NAME>", "Resume from a specific package"), row("--report-summary", "Save results to vp-exec-summary.json"), row("-h, --help", "Print help (see more with '--help')"), ], ), section_lines( "Filter Patterns", vec![ " --filter <name>       Select by package name (e.g. foo, @scope/*)", " --filter ./<dir>      Select packages under a directory", " --filter {<dir>}      Same as ./<dir>, but allows traversal suffixes", " --filter <name>...    Select package and its dependencies", " --filter ...<name>    Select package and its dependents", " --filter <name>^...   Select only the dependencies (exclude the package itself)", " --filter !<pattern>   Exclude packages matching the pattern", ], ), section_lines( "Examples", vec![ " vp exec node --version # Run local node", " vp exec tsc --noEmit # Run local TypeScript compiler", " vp exec -c 'tsc --noEmit && prettier --check .' # Shell mode", " vp exec -r -- tsc --noEmit # Run in all workspace packages", " vp exec --filter 'app...'
-- tsc # Run in filtered packages", ], ), ], documentation_url: documentation_url_for_command_path(&["exec"]), }), "cache" => Some(HelpDoc { usage: "vp cache <COMMAND>", summary: vec!["Manage the task cache."], sections: vec![ section_rows("Commands", vec![row("clean", "Clean up all the cache")]), section_rows("Options", vec![row("-h, --help", "Print help")]), ], documentation_url: documentation_url_for_command_path(&["cache"]), }), _ => None, } } fn is_help_flag(arg: &str) -> bool { matches!(arg, "-h" | "--help") } fn has_help_flag_before_terminator(args: &[String]) -> bool { args.iter().take_while(|arg| arg.as_str() != "--").any(|arg| is_help_flag(arg)) } fn skip_clap_unified_help(command: &str) -> bool { matches!( command, "create" | "migrate" | "dev" | "build" | "preview" | "test" | "lint" | "fmt" | "check" | "pack" | "run" | "exec" | "cache" ) } pub fn maybe_print_unified_clap_subcommand_help(argv: &[String]) -> bool { if argv.len() < 3 { return false; } let command = crate::cli::Args::command(); let mut current = &command; let mut path_len = 0; let mut index = 1; let mut first_command_name: Option<String> = None; let mut command_path = Vec::new(); while index < argv.len() { let arg = &argv[index]; if arg.starts_with('-') { break; } let Some(next) = current.find_subcommand(arg) else { break; }; if first_command_name.is_none() { first_command_name = Some(next.get_name().to_string()); } command_path.push(next.get_name().to_string()); current = next; path_len += 1; index += 1; } if path_len == 0 { return false; } let Some(first_command_name) = first_command_name else { return false; }; if skip_clap_unified_help(&first_command_name) { return false; } // Respect `--` option terminator: flags after `--` belong to the wrapped // command and should not trigger CLI help rewriting.
if !has_help_flag_before_terminator(&argv[index..]) { return false; } if command_path.len() == 1 && command_path[0] == "env" { println!("{}", vite_shared::header::vite_plus_header()); println!(); println!("{}", render_help_doc(&env_help_doc())); return true; } let mut command_path_refs = Vec::with_capacity(command_path.len()); for segment in &command_path { command_path_refs.push(segment.as_str()); } print_unified_clap_help_for_path(&command_path_refs) } pub fn should_print_unified_delegate_help(args: &[String]) -> bool { matches!(args, [arg] if is_help_flag(arg)) } pub fn maybe_print_unified_delegate_help( command: &str, args: &[String], show_header: bool, ) -> bool { if !should_print_unified_delegate_help(args) { return false; } let Some(doc) = delegated_help_doc(command) else { return false; }; if show_header { println!("{}", vite_shared::header::vite_plus_header()); println!(); } println!("{}", render_help_doc(&doc)); true } pub fn print_unified_clap_help_for_path(command_path: &[&str]) -> bool { if command_path == ["env"] { println!("{}", vite_shared::header::vite_plus_header()); println!(); println!("{}", render_help_doc(&env_help_doc())); return true; } let mut help_args = vec!["vp".to_string()]; help_args.extend(command_path.iter().map(ToString::to_string)); help_args.push("--help".to_string()); let raw_help = match crate::cli::try_parse_args_from(help_args) { Err(error) if matches!(error.kind(), ErrorKind::DisplayHelp) => error.to_string(), _ => return false, }; let Some(doc) = parse_clap_help_to_doc(&raw_help) else { return false; }; let doc = OwnedHelpDoc { documentation_url: documentation_url_for_command_path(command_path) .map(ToString::to_string), ..doc }; println!("{}", vite_shared::header::vite_plus_header()); println!(); println!("{}", render_owned_help_doc(&doc)); true } #[cfg(test)] mod tests { use super::{ HelpDoc, documentation_url_for_command_path, has_help_flag_before_terminator, parse_clap_help_to_doc, parse_rows, render_help_doc, 
split_comment_suffix, strip_ansi, }; #[test] fn parse_rows_supports_wrapped_option_labels() { let lines = vec![ "  -P, --prod  Do not install devDependencies".to_string(), "  --no-optional".to_string(), "          Do not install optionalDependencies".to_string(), ]; let rows = parse_rows(&lines); assert_eq!(rows.len(), 2); assert_eq!(rows[0].label, "-P, --prod"); assert_eq!(rows[0].description, vec!["Do not install devDependencies"]); assert_eq!(rows[1].label, "--no-optional"); assert_eq!(rows[1].description, vec!["Do not install optionalDependencies"]); } #[test] fn parse_clap_help_extracts_usage_summary_and_sections() { let raw_help = "\
Add packages to dependencies

Usage: vp add [OPTIONS] <PACKAGES>...

Arguments:
  <PACKAGES>...  Packages to add

Options:
  -h, --help  Print help
"; let doc = parse_clap_help_to_doc(raw_help).expect("should parse clap help text"); assert_eq!(doc.usage, "vp add [OPTIONS] <PACKAGES>..."); assert_eq!(doc.summary, vec!["Add packages to dependencies"]); assert_eq!(doc.sections.len(), 2); } #[test] fn help_flag_before_terminator_is_detected() { let args = vec!["vpx".to_string(), "--help".to_string()]; assert!(has_help_flag_before_terminator(&args)); } #[test] fn help_flag_after_terminator_is_ignored() { let args = vec!["vpx".to_string(), "--".to_string(), "--help".to_string()]; assert!(!has_help_flag_before_terminator(&args)); } #[test] fn strip_ansi_removes_csi_sequences() { let input = "\u{1b}[1mOptions:\u{1b}[0m"; assert_eq!(strip_ansi(input), "Options:"); } #[test] fn parse_clap_help_with_ansi_sequences() { let raw_help = "\
\u{1b}[1mAdd packages to dependencies\u{1b}[0m

\u{1b}[1mUsage:\u{1b}[0m vp add [OPTIONS] <PACKAGES>...

\u{1b}[1mArguments:\u{1b}[0m
  <PACKAGES>...
Packages to add \u{1b}[1mOptions:\u{1b}[0m -h, --help Print help "; let doc = parse_clap_help_to_doc(raw_help).expect("should parse clap help text"); assert_eq!(doc.usage, "vp add [OPTIONS] ..."); assert_eq!(doc.summary, vec!["Add packages to dependencies"]); assert_eq!(doc.sections.len(), 2); } #[test] fn split_comment_suffix_extracts_command_comment() { let line = " vp env list-remote 20 # List Node.js 20.x versions"; let (prefix, suffix) = split_comment_suffix(line).expect("expected comment suffix"); assert_eq!(prefix, " vp env list-remote 20 "); assert_eq!(suffix, " # List Node.js 20.x versions"); } #[test] fn split_comment_suffix_returns_none_without_comment() { assert!(split_comment_suffix(" vp env list").is_none()); } #[test] fn docs_url_is_mapped_for_grouped_commands() { assert_eq!( documentation_url_for_command_path(&["add"]), Some("https://viteplus.dev/guide/install") ); assert_eq!( documentation_url_for_command_path(&["env", "list"]), Some("https://viteplus.dev/guide/env") ); assert_eq!( documentation_url_for_command_path(&["config"]), Some("https://viteplus.dev/guide/commit-hooks") ); } #[test] fn render_help_doc_appends_documentation_footer() { let output = render_help_doc(&HelpDoc { usage: "vp demo", summary: vec![], sections: vec![], documentation_url: Some("https://viteplus.dev/guide/demo"), }); assert!(output.contains("Documentation: https://viteplus.dev/guide/demo")); } } ================================================ FILE: crates/vite_global_cli/src/js_executor.rs ================================================ //! JavaScript execution via managed Node.js runtime. //! //! This module handles downloading and caching Node.js via `vite_js_runtime`, //! and executing JavaScript scripts using the managed runtime. 
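The executor below memoizes each resolved runtime so the expensive download/resolution cost is paid once per process. A minimal standalone sketch of that `Option`-based caching pattern, using hypothetical `Runtime`/`Executor` stand-ins rather than the real `vite_js_runtime` types:

```rust
// Hypothetical stand-ins (NOT the crate's real types): illustrates the
// memoization shape used by ensure_cli_runtime / ensure_project_runtime.
struct Runtime {
    version: String,
}

struct Executor {
    cached: Option<Runtime>,
    resolve_calls: u32, // counts how often the expensive resolution runs
}

impl Executor {
    fn new() -> Self {
        Self { cached: None, resolve_calls: 0 }
    }

    fn ensure_runtime(&mut self) -> &Runtime {
        if self.cached.is_none() {
            // The expensive download/resolution happens only on first call.
            self.resolve_calls += 1;
            self.cached = Some(Runtime { version: "20.18.0".to_string() });
        }
        self.cached.as_ref().unwrap()
    }
}
```

Repeated calls return the cached runtime without re-resolving, which is why the real struct keeps separate `cli_runtime` and `project_runtime` slots.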
use std::process::ExitStatus;

use tokio::process::Command;
use vite_js_runtime::{
    JsRuntime, JsRuntimeType, download_runtime, download_runtime_for_project, is_valid_version,
    read_package_json, resolve_node_version,
};
use vite_path::{AbsolutePath, AbsolutePathBuf};
use vite_shared::{PrependOptions, PrependResult, env_vars, format_path_with_prepend};

use crate::{commands::env::config, error::Error};

/// JavaScript executor using managed Node.js runtime.
///
/// Handles two runtime resolution strategies:
/// - CLI runtime: For package manager commands and bundled JS scripts (Categories A & B)
/// - Project runtime: For delegating to local vite-plus CLI (Category C)
pub struct JsExecutor {
    /// Cached runtime for CLI commands (Categories A & B)
    cli_runtime: Option<JsRuntime>,
    /// Cached runtime for project delegation (Category C)
    project_runtime: Option<JsRuntime>,
    /// Directory containing JS scripts (from `VITE_GLOBAL_CLI_JS_SCRIPTS_DIR`)
    scripts_dir: Option<AbsolutePathBuf>,
}

impl JsExecutor {
    /// Create a new JS executor.
    ///
    /// # Arguments
    /// * `scripts_dir` - Optional path to the JS scripts directory.
    ///   If not provided, will be auto-detected from the binary location.
    #[must_use]
    pub const fn new(scripts_dir: Option<AbsolutePathBuf>) -> Self {
        Self { cli_runtime: None, project_runtime: None, scripts_dir }
    }

    /// Get the JS scripts directory.
    ///
    /// Resolution order:
    /// 1. Explicitly provided `scripts_dir`
    /// 2. `VITE_GLOBAL_CLI_JS_SCRIPTS_DIR` environment variable
    /// 3. Auto-detect from binary location (../dist relative to binary)
    pub fn get_scripts_dir(&self) -> Result<AbsolutePathBuf, Error> {
        // 1. Use explicitly provided scripts_dir
        if let Some(dir) = &self.scripts_dir {
            return Ok(dir.clone());
        }

        // 2. Check environment variable
        if let Ok(dir) = std::env::var(env_vars::VITE_GLOBAL_CLI_JS_SCRIPTS_DIR) {
            return AbsolutePathBuf::new(dir.into()).ok_or(Error::JsScriptsDirNotFound);
        }

        // 3. Auto-detect from binary location
        // JS scripts are at ../node_modules/vite-plus/dist relative to the binary directory
        // e.g., ~/.vite-plus/<version>/bin/vp -> ~/.vite-plus/<version>/node_modules/vite-plus/dist/
        let exe_path = std::env::current_exe().map_err(|_| Error::JsScriptsDirNotFound)?;

        // Resolve symlinks to get the real binary path (Unix only)
        // Skip on Windows to avoid path resolution issues
        #[cfg(unix)]
        let exe_path =
            std::fs::canonicalize(&exe_path).map_err(|_| Error::JsScriptsDirNotFound)?;

        let bin_dir = exe_path.parent().ok_or(Error::JsScriptsDirNotFound)?;
        let version_dir = bin_dir.parent().ok_or(Error::JsScriptsDirNotFound)?;
        let scripts_dir = version_dir.join("node_modules").join("vite-plus").join("dist");
        AbsolutePathBuf::new(scripts_dir).ok_or(Error::JsScriptsDirNotFound)
    }

    /// Get the path to the current Rust binary (vp).
    ///
    /// This is passed to JS scripts via `VITE_PLUS_CLI_BIN` environment variable
    /// so they can invoke vp commands when needed.
    fn get_bin_path() -> Result<AbsolutePathBuf, Error> {
        let exe_path = std::env::current_exe().map_err(|_| Error::CliBinaryNotFound)?;
        AbsolutePathBuf::new(exe_path).ok_or(Error::CliBinaryNotFound)
    }

    /// Create a JS runtime command with common environment variables set.
    ///
    /// Sets up:
    /// - `VITE_PLUS_CLI_BIN`: So JS scripts can invoke vp commands
    /// - `PATH`: Prepends the runtime bin directory so child processes can find the JS runtime
    fn create_js_command(
        runtime_binary: &AbsolutePath,
        runtime_bin_prefix: &AbsolutePath,
    ) -> Command {
        let mut cmd = Command::new(runtime_binary.as_path());

        if let Ok(bin_path) = Self::get_bin_path() {
            tracing::debug!("Set VITE_PLUS_CLI_BIN to {:?}", bin_path);
            cmd.env(env_vars::VITE_PLUS_CLI_BIN, bin_path.as_path());
        }

        // Prepend runtime bin to PATH so child processes can find the JS runtime
        let options = PrependOptions { dedupe_anywhere: true };
        if let PrependResult::Prepended(new_path) =
            format_path_with_prepend(runtime_bin_prefix.as_path(), options)
        {
            tracing::debug!("Set PATH to {:?}", new_path);
            cmd.env("PATH", new_path);
        }

        cmd
    }

    /// Get the CLI's package.json directory (parent of `scripts_dir`).
    ///
    /// This is used for resolving the CLI's default Node.js version
    /// from `devEngines.runtime` in the CLI's package.json.
    fn get_cli_package_dir(&self) -> Result<AbsolutePathBuf, Error> {
        let scripts_dir = self.get_scripts_dir()?;
        // scripts_dir is typically packages/cli/dist, so parent is packages/cli
        scripts_dir
            .parent()
            .map(vite_path::AbsolutePath::to_absolute_path_buf)
            .ok_or(Error::JsScriptsDirNotFound)
    }

    /// Ensure the CLI runtime is downloaded and cached.
    ///
    /// Uses the CLI's package.json `devEngines.runtime` configuration
    /// to determine which Node.js version to use.
    pub async fn ensure_cli_runtime(&mut self) -> Result<&JsRuntime, Error> {
        if self.cli_runtime.is_none() {
            let cli_dir = self.get_cli_package_dir()?;
            tracing::debug!("Resolving CLI runtime from {:?}", cli_dir);
            let runtime = download_runtime_for_project(&cli_dir).await?;
            self.cli_runtime = Some(runtime);
        }
        Ok(self.cli_runtime.as_ref().unwrap())
    }

    /// Ensure the project runtime is downloaded and cached.
    ///
    /// Resolution order:
    /// 1. Session override (env var from `vp env use`)
    /// 2. Session override (file from `vp env use`)
    /// 3. Project sources (.node-version, engines.node, devEngines.runtime) —
    ///    delegates to `download_runtime_for_project()` for cache-aware resolution
    /// 4. User default from config.json
    /// 5. Latest LTS
    pub async fn ensure_project_runtime(
        &mut self,
        project_path: &AbsolutePath,
    ) -> Result<&JsRuntime, Error> {
        if self.project_runtime.is_none() {
            tracing::debug!("Resolving project runtime from {:?}", project_path);

            // 1–2. Session overrides: env var (from `vp env use`), then file
            let session_version = vite_shared::EnvConfig::get()
                .node_version
                .map(|v| v.trim().to_string())
                .filter(|v| !v.is_empty());
            let session_version = if session_version.is_some() {
                session_version
            } else {
                config::read_session_version().await
            };
            if let Some(version) = session_version {
                let runtime = download_runtime(JsRuntimeType::Node, &version).await?;
                return Ok(self.project_runtime.insert(runtime));
            }

            // 3. Check if project has any *valid* version source.
            // resolve_node_version returns Some for any non-empty value,
            // even invalid ones. We must validate before routing to
            // download_runtime_for_project, which falls to LTS on all-invalid
            // and would skip the user's configured default.
            let has_valid_project_source = has_valid_version_source(project_path).await?;
            let runtime = if has_valid_project_source {
                // At least one valid project source exists — delegate to
                // download_runtime_for_project for cache-aware range resolution
                // and intra-project fallback chain
                download_runtime_for_project(project_path).await?
            } else {
                // No valid project source — check user default from config, then LTS
                let resolution = config::resolve_version(project_path).await?;
                download_runtime(JsRuntimeType::Node, &resolution.version).await?
            };
            self.project_runtime = Some(runtime);
        }
        Ok(self.project_runtime.as_ref().unwrap())
    }

    /// Download a specific Node.js version.
    ///
    /// This is used when we need a specific version regardless of
    /// package.json configuration.
    #[allow(dead_code)] // Will be used in future phases
    pub async fn download_node(&self, version: &str) -> Result<JsRuntime, Error> {
        Ok(download_runtime(JsRuntimeType::Node, version).await?)
    }

    /// Delegate to local or global vite-plus CLI.
    ///
    /// Uses `oxc_resolver` to find the project's local vite-plus installation.
    /// If found, runs the local `dist/bin.js` directly. Otherwise, falls back
    /// to the global installation's `dist/bin.js`.
    ///
    /// Uses the project's runtime resolved via `config::resolve_version()`.
    /// For side-effect-free commands like `--version`, use [`delegate_with_cli_runtime`] instead.
    ///
    /// # Arguments
    /// * `project_path` - Path to the project directory
    /// * `args` - Arguments to pass to the local CLI
    pub async fn delegate_to_local_cli(
        &mut self,
        project_path: &AbsolutePath,
        args: &[String],
    ) -> Result<ExitStatus, Error> {
        // Use project's runtime based on its devEngines.runtime configuration
        let runtime = self.ensure_project_runtime(project_path).await?;
        let node_binary = runtime.get_binary_path();
        let bin_prefix = runtime.get_bin_prefix();
        self.run_js_entry(project_path, &node_binary, &bin_prefix, args).await
    }

    /// Delegate to the global vite-plus CLI entrypoint directly.
    ///
    /// Unlike [`delegate_to_local_cli`], this bypasses project-local resolution and always runs
    /// the global installation's `dist/bin.js`.
    pub async fn delegate_to_global_cli(
        &mut self,
        project_path: &AbsolutePath,
        args: &[String],
    ) -> Result<ExitStatus, Error> {
        let runtime = self.ensure_cli_runtime().await?;
        let node_binary = runtime.get_binary_path();
        let bin_prefix = runtime.get_bin_prefix();
        let scripts_dir = self.get_scripts_dir()?;
        let entry_point = scripts_dir.join("bin.js");
        let mut cmd = Self::create_js_command(&node_binary, &bin_prefix);
        cmd.arg(entry_point.as_path()).args(args).current_dir(project_path.as_path());
        let status = cmd.status().await?;
        Ok(status)
    }

    /// Delegate to local or global vite-plus CLI using the CLI's own runtime.
    ///
    /// Like [`delegate_to_local_cli`], but uses the CLI's bundled runtime
    /// (from its own `devEngines.runtime` in `package.json`) instead of the
    /// project's runtime. This avoids side effects like writing `.node-version`
    /// when no version source exists in the project directory.
    ///
    /// Use this for read-only / side-effect-free commands like `--version`.
    #[allow(dead_code)] // kept for future read-only delegations
    pub async fn delegate_with_cli_runtime(
        &mut self,
        project_path: &AbsolutePath,
        args: &[String],
    ) -> Result<ExitStatus, Error> {
        let runtime = self.ensure_cli_runtime().await?;
        let node_binary = runtime.get_binary_path();
        let bin_prefix = runtime.get_bin_prefix();
        self.run_js_entry(project_path, &node_binary, &bin_prefix, args).await
    }

    /// Run a JS entry point with the given runtime, resolving local vite-plus first.
    async fn run_js_entry(
        &self,
        project_path: &AbsolutePath,
        node_binary: &AbsolutePath,
        bin_prefix: &AbsolutePath,
        args: &[String],
    ) -> Result<ExitStatus, Error> {
        // Try to resolve vite-plus from the project directory using oxc_resolver
        let entry_point = match Self::resolve_local_vite_plus(project_path) {
            Some(path) => path,
            None => {
                // Fall back to the global installation's bin.js
                let scripts_dir = self.get_scripts_dir()?;
                scripts_dir.join("bin.js")
            }
        };
        tracing::debug!("Delegating to CLI via JS entry point: {:?} {:?}", entry_point, args);
        let mut cmd = Self::create_js_command(node_binary, bin_prefix);
        cmd.arg(entry_point.as_path()).args(args).current_dir(project_path.as_path());
        let status = cmd.status().await?;
        Ok(status)
    }

    /// Resolve the local vite-plus package's `dist/bin.js` from the project directory.
    fn resolve_local_vite_plus(project_path: &AbsolutePath) -> Option<AbsolutePathBuf> {
        use oxc_resolver::{ResolveOptions, Resolver};

        let resolver = Resolver::new(ResolveOptions {
            condition_names: vec!["import".into(), "node".into()],
            ..ResolveOptions::default()
        });

        // Resolve vite-plus/package.json from the project directory to find the package root
        let resolved = resolver.resolve(project_path, "vite-plus/package.json").ok()?;
        let pkg_dir = resolved.path().parent()?;
        let bin_js = pkg_dir.join("dist").join("bin.js");
        if bin_js.exists() {
            tracing::debug!("Found local vite-plus at {:?}", bin_js);
            AbsolutePathBuf::new(bin_js)
        } else {
            tracing::debug!("Local vite-plus found but dist/bin.js missing at {:?}", bin_js);
            None
        }
    }
}

/// Check whether a project directory has at least one valid version source.
///
/// Uses `is_valid_version` (no warning side effects) to avoid duplicate
/// warnings when `download_runtime_for_project` or `config::resolve_version`
/// later call `normalize_version` on the same values.
///
/// Returns `false` when all sources are missing or invalid, so the caller
/// can fall through to the user's configured default instead of LTS.
async fn has_valid_version_source(project_path: &AbsolutePath) -> Result<bool, Error> {
    let resolution = resolve_node_version(project_path, true).await?;
    let Some(ref r) = resolution else {
        return Ok(false);
    };

    // Primary source is a valid version?
    if is_valid_version(&r.version) {
        return Ok(true);
    }

    // Primary source invalid — check package.json for valid fallbacks
    let pkg_path = project_path.join("package.json");
    let Ok(Some(pkg)) = read_package_json(&pkg_path).await else {
        return Ok(false);
    };
    let engines_valid =
        pkg.engines.as_ref().and_then(|e| e.node.as_ref()).is_some_and(|v| is_valid_version(v));
    let dev_engines_valid = !engines_valid
        && pkg
            .dev_engines
            .as_ref()
            .and_then(|de| de.runtime.as_ref())
            .and_then(|rt| rt.find_by_name("node"))
            .filter(|r| !r.version.is_empty())
            .is_some_and(|r| is_valid_version(&r.version));
    Ok(engines_valid || dev_engines_valid)
}

#[cfg(test)]
mod tests {
    use serial_test::serial;

    use super::*;

    #[test]
    fn test_js_executor_new() {
        let executor = JsExecutor::new(None);
        assert!(executor.cli_runtime.is_none());
        assert!(executor.project_runtime.is_none());
        assert!(executor.scripts_dir.is_none());
    }

    #[test]
    fn test_js_executor_with_scripts_dir() {
        let scripts_dir = if cfg!(windows) {
            AbsolutePathBuf::new("C:\\test\\scripts".into()).unwrap()
        } else {
            AbsolutePathBuf::new("/test/scripts".into()).unwrap()
        };
        let executor = JsExecutor::new(Some(scripts_dir.clone()));
        assert_eq!(executor.get_scripts_dir().unwrap(), scripts_dir);
    }

    #[test]
    fn test_create_js_command_uses_direct_binary() {
        use std::ffi::OsStr;

        let (runtime_binary, runtime_bin_prefix, expected_program) = if cfg!(windows) {
            (
                AbsolutePathBuf::new("C:\\node\\node.exe".into()).unwrap(),
                AbsolutePathBuf::new("C:\\node".into()).unwrap(),
                "C:\\node\\node.exe",
            )
        } else {
            (
                AbsolutePathBuf::new("/usr/local/bin/node".into()).unwrap(),
                AbsolutePathBuf::new("/usr/local/bin".into()).unwrap(),
                "/usr/local/bin/node",
            )
        };
        let cmd = JsExecutor::create_js_command(&runtime_binary, &runtime_bin_prefix);
        // The command should use the node binary directly
        assert_eq!(cmd.as_std().get_program(), OsStr::new(expected_program));
    }

    #[tokio::test]
    #[serial]
    async fn test_delegate_to_local_cli_prints_node_version() {
        use std::io::Write;

        use tempfile::TempDir;

        // Create a temporary directory for the scripts (used as fallback global dir)
        let temp_dir = TempDir::new().unwrap();
        let scripts_dir = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Create a bin.js that prints process.version
        let script_path = temp_dir.path().join("bin.js");
        let mut file = std::fs::File::create(&script_path).unwrap();
        writeln!(file, "console.log(process.version);").unwrap();

        // Create executor with the temp scripts directory as global fallback
        let mut executor = JsExecutor::new(Some(scripts_dir.clone()));

        // Delegate — no local vite-plus will be found, so it falls back to global bin.js
        let status = executor.delegate_to_local_cli(&scripts_dir, &[]).await.unwrap();
        assert!(status.success(), "Script should execute successfully");
    }
}

================================================
FILE: crates/vite_global_cli/src/main.rs
================================================
//! Vite+ Global CLI
//!
//! A standalone Rust binary for the vite+ global CLI that can run without
//! pre-installed Node.js. Uses managed Node.js from `vite_js_runtime` for
//! package manager commands and JS script execution.
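Before clap parsing, the CLI rewrites a few argv shapes (the `vp list`/`vp ls` alias and the `vp help [command]` form handled by `normalize_args`). A standalone sketch of those rewrite rules, using a hypothetical `normalize` helper rather than the crate's real function:

```rust
// Hypothetical sketch of the argv rewrites described for `normalize_args`:
// `vp list ...` / `vp ls ...` -> `vp pm list ...`
// `vp help`                   -> `vp --help`
// `vp help <cmd> [args...]`   -> `vp <cmd> --help [args...]`
fn normalize(args: Vec<String>) -> Vec<String> {
    match args.get(1).map(String::as_str) {
        Some("list" | "ls") => {
            let mut out = vec![args[0].clone(), "pm".to_string(), "list".to_string()];
            out.extend(args[2..].iter().cloned());
            out
        }
        Some("help") if args.len() == 2 => vec![args[0].clone(), "--help".to_string()],
        Some("help") => {
            let mut out = vec![args[0].clone(), args[2].clone(), "--help".to_string()];
            out.extend(args[3..].iter().cloned());
            out
        }
        // All other argv shapes pass through unchanged.
        _ => args,
    }
}
```

Keeping the rewrite as a pure `Vec<String> -> Vec<String>` function makes each alias rule trivially unit-testable before clap ever sees the arguments.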
// Allow printing to stderr for CLI error messages
#![allow(clippy::print_stderr)]

mod cli;
mod command_picker;
mod commands;
mod error;
mod help;
mod js_executor;
mod shim;
mod tips;

use std::{
    io::{IsTerminal, Write},
    process::{ExitCode, ExitStatus},
};

use clap::error::{ContextKind, ContextValue};
use owo_colors::OwoColorize;
use vite_shared::output;

pub use crate::cli::try_parse_args_from;
use crate::cli::{
    RenderOptions, run_command, run_command_with_options, try_parse_args_from_with_options,
};

/// Normalize CLI arguments:
/// - `vp list ...` / `vp ls ...` → `vp pm list ...`
/// - `vp help [command]` → `vp [command] --help`
fn normalize_args(args: Vec<String>) -> Vec<String> {
    match args.get(1).map(String::as_str) {
        // `vp list ...` → `vp pm list ...`
        // `vp ls ...` → `vp pm list ...`
        Some("list" | "ls") => {
            let mut normalized = Vec::with_capacity(args.len() + 1);
            normalized.push(args[0].clone());
            normalized.push("pm".to_string());
            normalized.push("list".to_string());
            normalized.extend(args[2..].iter().cloned());
            normalized
        }
        // `vp help` alone -> show main help
        Some("help") if args.len() == 2 => vec![args[0].clone(), "--help".to_string()],
        // `vp help [command] [args...]` -> `vp [command] --help [args...]`
        Some("help") if args.len() > 2 => {
            let mut normalized = Vec::with_capacity(args.len());
            normalized.push(args[0].clone());
            normalized.push(args[2].clone());
            normalized.push("--help".to_string());
            normalized.extend(args[3..].iter().cloned());
            normalized
        }
        // No transformation needed
        _ => args,
    }
}

struct InvalidSubcommandDetails {
    invalid_subcommand: String,
    suggestion: Option<String>,
}

fn extract_invalid_subcommand_details(error: &clap::Error) -> Option<InvalidSubcommandDetails> {
    let invalid_subcommand = match error.get(ContextKind::InvalidSubcommand) {
        Some(ContextValue::String(value)) => value.as_str(),
        _ => return None,
    };
    let suggestion = match error.get(ContextKind::SuggestedSubcommand) {
        Some(ContextValue::String(value)) => Some(value.to_owned()),
        Some(ContextValue::Strings(values)) => {
            vite_shared::string_similarity::pick_best_suggestion(invalid_subcommand, values)
        }
        _ => None,
    };
    Some(InvalidSubcommandDetails {
        invalid_subcommand: invalid_subcommand.to_owned(),
        suggestion,
    })
}

fn print_invalid_subcommand_error(details: &InvalidSubcommandDetails) {
    println!("{}", vite_shared::header::vite_plus_header());
    println!();
    let highlighted_subcommand = details.invalid_subcommand.bright_blue().to_string();
    output::error(&format!("Command '{highlighted_subcommand}' not found"));
}

fn is_affirmative_response(input: &str) -> bool {
    matches!(input.trim().to_ascii_lowercase().as_str(), "y" | "yes" | "ok" | "true" | "1")
}

fn should_prompt_for_correction() -> bool {
    std::io::stdin().is_terminal() && std::io::stderr().is_terminal()
}

fn prompt_to_run_suggested_command(suggestion: &str) -> bool {
    if !should_prompt_for_correction() {
        return false;
    }
    eprintln!();
    let highlighted_suggestion = format!("`vp {suggestion}`").bright_blue().to_string();
    eprint!("Do you want to run {highlighted_suggestion}? (y/N): ");
    if std::io::stderr().flush().is_err() {
        return false;
    }
    let Some(input) = read_confirmation_input() else {
        return false;
    };
    is_affirmative_response(input.trim())
}

fn read_confirmation_input() -> Option<String> {
    let mut input = String::new();
    std::io::stdin().read_line(&mut input).ok()?;
    Some(input)
}

fn replace_top_level_typoed_subcommand(
    raw_args: &[String],
    invalid_subcommand: &str,
    suggestion: &str,
) -> Option<Vec<String>> {
    let index = raw_args.iter().position(|arg| !arg.starts_with('-'))?;
    if raw_args.get(index)? != invalid_subcommand {
        return None;
    }
    let mut corrected = raw_args.to_vec();
    corrected[index] = suggestion.to_owned();
    Some(corrected)
}

fn exit_status_to_exit_code(exit_status: ExitStatus) -> ExitCode {
    if exit_status.success() {
        ExitCode::SUCCESS
    } else {
        #[allow(clippy::cast_sign_loss, clippy::cast_possible_truncation)]
        exit_status.code().map_or(ExitCode::FAILURE, |c| ExitCode::from(c as u8))
    }
}

async fn run_corrected_args(cwd: &vite_path::AbsolutePathBuf, raw_args: &[String]) -> ExitCode {
    let render_options = RenderOptions { show_header: false };
    let args_with_program = std::iter::once("vp".to_string()).chain(raw_args.iter().cloned());
    let normalized_args = normalize_args(args_with_program.collect());
    let parsed = match try_parse_args_from_with_options(normalized_args, render_options) {
        Ok(args) => args,
        Err(e) => {
            e.print().ok();
            #[allow(clippy::cast_sign_loss)]
            return ExitCode::from(e.exit_code() as u8);
        }
    };
    match run_command_with_options(cwd.clone(), parsed, render_options).await {
        Ok(exit_status) => exit_status_to_exit_code(exit_status),
        Err(e) => {
            if matches!(&e, error::Error::UserMessage(_)) {
                eprintln!("{e}");
            } else {
                output::error(&format!("{e}"));
            }
            ExitCode::FAILURE
        }
    }
}

fn extract_unknown_argument(error: &clap::Error) -> Option<String> {
    match error.get(ContextKind::InvalidArg) {
        Some(ContextValue::String(value)) => Some(value.to_owned()),
        _ => None,
    }
}

fn has_pass_as_value_suggestion(error: &clap::Error) -> bool {
    let contains_pass_as_value = |suggestion: &str| suggestion.contains("as a value");
    match error.get(ContextKind::Suggested) {
        Some(ContextValue::String(suggestion)) => contains_pass_as_value(suggestion),
        Some(ContextValue::Strings(suggestions)) => {
            suggestions.iter().any(|suggestion| contains_pass_as_value(suggestion))
        }
        Some(ContextValue::StyledStr(suggestion)) => {
            contains_pass_as_value(&suggestion.to_string())
        }
        Some(ContextValue::StyledStrs(suggestions)) => {
            suggestions.iter().any(|suggestion|
contains_pass_as_value(&suggestion.to_string())) } _ => false, } } fn print_unknown_argument_error(error: &clap::Error) -> bool { let Some(invalid_argument) = extract_unknown_argument(error) else { return false; }; println!("{}", vite_shared::header::vite_plus_header()); println!(); let highlighted_argument = invalid_argument.bright_blue().to_string(); output::error(&format!("Unexpected argument '{highlighted_argument}'")); if has_pass_as_value_suggestion(error) { eprintln!(); let pass_through_argument = format!("-- {invalid_argument}"); let highlighted_pass_through_argument = format!("`{}`", pass_through_argument.bright_blue()); eprintln!("Use {highlighted_pass_through_argument} to pass the argument as a value"); } true } #[tokio::main] async fn main() -> ExitCode { // Initialize tracing vite_shared::init_tracing(); // Check for shim mode (invoked as node, npm, or npx) let mut args: Vec = std::env::args().collect(); let argv0 = args.first().map(|s| s.as_str()).unwrap_or("vp"); tracing::debug!("argv0: {argv0}"); if let Some(tool) = shim::detect_shim_tool(argv0) { // Shim mode - dispatch to the appropriate tool let exit_code = shim::dispatch(&tool, &args[1..]).await; return ExitCode::from(exit_code as u8); } // Normal CLI mode - get current working directory let cwd = match vite_path::current_dir() { Ok(path) => path, Err(e) => { output::error(&format!("Failed to get current directory: {e}")); return ExitCode::FAILURE; } }; if args.len() == 1 { match command_picker::pick_top_level_command_if_interactive(&cwd) { Ok(command_picker::TopLevelCommandPick::Selected(selection)) => { args.push(selection.command.to_string()); if selection.append_help { args.push("--help".to_string()); } } Ok(command_picker::TopLevelCommandPick::Cancelled) => { return ExitCode::SUCCESS; } Ok(command_picker::TopLevelCommandPick::Skipped) => {} Err(err) => { tracing::debug!("Failed to run top-level command picker: {err}"); } } } let mut tip_context = tips::TipContext { // Capture user args 
(excluding argv0) before normalization raw_args: args[1..].to_vec(), ..Default::default() }; // Normalize arguments (list/ls aliases, help rewriting) let normalized_args = normalize_args(args); // Print unified subcommand help for clap-managed commands before clap handles help output. if help::maybe_print_unified_clap_subcommand_help(&normalized_args) { return ExitCode::SUCCESS; } // Parse CLI arguments (using custom help formatting) let exit_code = match try_parse_args_from(normalized_args) { Err(e) => { use clap::error::ErrorKind; // --help and --version are clap "errors" but should exit successfully. if matches!(e.kind(), ErrorKind::DisplayHelp | ErrorKind::DisplayVersion) { e.print().ok(); ExitCode::SUCCESS } else if matches!(e.kind(), ErrorKind::InvalidSubcommand) { if let Some(details) = extract_invalid_subcommand_details(&e) { print_invalid_subcommand_error(&details); if let Some(suggestion) = &details.suggestion { if let Some(corrected_raw_args) = replace_top_level_typoed_subcommand( &tip_context.raw_args, &details.invalid_subcommand, suggestion, ) { if prompt_to_run_suggested_command(suggestion) { tip_context.raw_args = corrected_raw_args.clone(); run_corrected_args(&cwd, &corrected_raw_args).await } else { let code = e.exit_code(); tip_context.clap_error = Some(e); #[allow(clippy::cast_sign_loss)] ExitCode::from(code as u8) } } else { let code = e.exit_code(); tip_context.clap_error = Some(e); #[allow(clippy::cast_sign_loss)] ExitCode::from(code as u8) } } else { let code = e.exit_code(); tip_context.clap_error = Some(e); #[allow(clippy::cast_sign_loss)] ExitCode::from(code as u8) } } else { e.print().ok(); let code = e.exit_code(); tip_context.clap_error = Some(e); #[allow(clippy::cast_sign_loss)] ExitCode::from(code as u8) } } else if matches!(e.kind(), ErrorKind::UnknownArgument) { if !print_unknown_argument_error(&e) { e.print().ok(); } let code = e.exit_code(); tip_context.clap_error = Some(e); #[allow(clippy::cast_sign_loss)] ExitCode::from(code as 
u8) } else { e.print().ok(); let code = e.exit_code(); tip_context.clap_error = Some(e); #[allow(clippy::cast_sign_loss)] ExitCode::from(code as u8) } } Ok(args) => match run_command(cwd.clone(), args).await { Ok(exit_status) => exit_status_to_exit_code(exit_status), Err(e) => { if matches!(&e, error::Error::UserMessage(_)) { eprintln!("{e}"); } else { output::error(&format!("{e}")); } ExitCode::FAILURE } }, }; tip_context.exit_code = if exit_code == ExitCode::SUCCESS { 0 } else { 1 }; if let Some(tip) = tips::get_tip(&tip_context) { eprintln!("\n{} {}", "tip:".bright_black().bold(), tip.bright_black()); } exit_code } #[cfg(test)] mod tests { use clap::error::ErrorKind; use super::{ extract_unknown_argument, has_pass_as_value_suggestion, is_affirmative_response, replace_top_level_typoed_subcommand, try_parse_args_from, }; #[test] fn unknown_argument_detected_without_pass_as_value_hint() { let error = try_parse_args_from(["vp".to_string(), "--cache".to_string()]) .expect_err("Expected parse error"); assert_eq!(error.kind(), ErrorKind::UnknownArgument); assert_eq!(extract_unknown_argument(&error).as_deref(), Some("--cache")); assert!(!has_pass_as_value_suggestion(&error)); } #[test] fn unknown_argument_detected_with_pass_as_value_hint() { let error = try_parse_args_from([ "vp".to_string(), "remove".to_string(), "--stream".to_string(), "foo".to_string(), ]) .expect_err("Expected parse error"); assert_eq!(error.kind(), ErrorKind::UnknownArgument); assert_eq!(extract_unknown_argument(&error).as_deref(), Some("--stream")); assert!(has_pass_as_value_suggestion(&error)); } #[test] fn affirmative_response_detection() { assert!(is_affirmative_response("y")); assert!(is_affirmative_response("yes")); assert!(is_affirmative_response("Y")); assert!(!is_affirmative_response("Sure")); assert!(!is_affirmative_response("n")); assert!(!is_affirmative_response("")); } #[test] fn replace_top_level_typoed_subcommand_preserves_trailing_args() { let raw_args = vec!["fnt".to_string(), 
"--write".to_string(), "src".to_string()]; let corrected = replace_top_level_typoed_subcommand(&raw_args, "fnt", "fmt") .expect("Expected typoed command to be replaced"); assert_eq!(corrected, vec!["fmt".to_string(), "--write".to_string(), "src".to_string()]); } #[test] fn replace_top_level_typoed_subcommand_skips_nested_subcommands() { let raw_args = vec!["env".to_string(), "typo".to_string()]; let corrected = replace_top_level_typoed_subcommand(&raw_args, "typo", "on"); assert!(corrected.is_none()); } } ================================================ FILE: crates/vite_global_cli/src/shim/cache.rs ================================================ //! Resolution cache for shim operations. //! //! Caches version resolution results to avoid re-resolving on every invocation. //! Uses mtime-based invalidation to detect changes in version source files. use std::{ collections::HashMap, time::{SystemTime, UNIX_EPOCH}, }; use serde::{Deserialize, Serialize}; use vite_path::{AbsolutePath, AbsolutePathBuf}; /// Cache format version for upgrade compatibility /// v2: Added `is_range` field to track range vs exact version for cache expiry const CACHE_VERSION: u32 = 2; /// Default maximum cache entries (LRU eviction) const DEFAULT_MAX_ENTRIES: usize = 4096; /// A single cache entry for a resolved version. 
#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct ResolveCacheEntry {
    /// The resolved version string (e.g., "20.18.0")
    pub version: String,
    /// The source of the version (e.g., ".node-version", "engines.node")
    pub source: String,
    /// Project root directory (if applicable)
    pub project_root: Option<String>,
    /// Unix timestamp when this entry was resolved
    pub resolved_at: u64,
    /// Mtime of the version source file (for invalidation)
    pub version_file_mtime: u64,
    /// Path to the version source file
    pub source_path: Option<String>,
    /// Whether the original version spec was a range (e.g., "20", "^20.0.0", "lts/*")
    /// Range versions use time-based expiry (1 hour) instead of mtime-only validation
    #[serde(default)]
    pub is_range: bool,
}

/// Resolution cache stored in VITE_PLUS_HOME/cache/resolve_cache.json.
#[derive(Serialize, Deserialize, Debug)]
pub struct ResolveCache {
    /// Cache format version for upgrade compatibility
    version: u32,
    /// Cache entries keyed by current working directory
    entries: HashMap<String, ResolveCacheEntry>,
}

impl Default for ResolveCache {
    fn default() -> Self {
        Self { version: CACHE_VERSION, entries: HashMap::new() }
    }
}

impl ResolveCache {
    /// Load cache from disk.
    pub fn load(cache_path: &AbsolutePath) -> Self {
        match std::fs::read_to_string(cache_path) {
            Ok(content) => match serde_json::from_str::<Self>(&content) {
                Ok(cache) if cache.version == CACHE_VERSION => cache,
                Ok(_) => {
                    // Version mismatch, reset cache
                    tracing::debug!("Cache version mismatch, resetting");
                    Self::default()
                }
                Err(e) => {
                    tracing::debug!("Failed to parse cache: {e}");
                    Self::default()
                }
            },
            Err(_) => Self::default(),
        }
    }

    /// Save cache to disk.
    pub fn save(&self, cache_path: &AbsolutePath) {
        // Ensure parent directory exists
        if let Some(parent) = cache_path.parent() {
            std::fs::create_dir_all(parent).ok();
        }
        if let Ok(content) = serde_json::to_string(self) {
            std::fs::write(cache_path, content).ok();
        }
    }

    /// Get a cache entry if valid.
    pub fn get(&self, cwd: &AbsolutePath) -> Option<&ResolveCacheEntry> {
        let key = cwd.as_path().to_string_lossy().to_string();
        let entry = self.entries.get(&key)?;
        // Validate mtime of source file
        if !self.is_entry_valid(entry) {
            return None;
        }
        Some(entry)
    }

    /// Insert a cache entry.
    pub fn insert(&mut self, cwd: &AbsolutePath, entry: ResolveCacheEntry) {
        let key = cwd.as_path().to_string_lossy().to_string();
        // LRU eviction if needed
        if self.entries.len() >= DEFAULT_MAX_ENTRIES {
            self.evict_oldest();
        }
        self.entries.insert(key, entry);
    }

    /// Check if an entry is still valid based on source file mtime and range status.
    ///
    /// For exact versions: Uses mtime-based validation only (cache valid until file changes)
    /// For range versions: Uses both mtime AND time-based expiry (1 hour TTL)
    ///
    /// This ensures range versions like "20" or "^20.0.0" are periodically re-resolved
    /// to pick up new releases, while exact versions like "20.18.0" only re-resolve
    /// when the source file is modified.
    fn is_entry_valid(&self, entry: &ResolveCacheEntry) -> bool {
        // For range versions (including LTS aliases), always apply time-based expiry
        // This ensures we periodically re-resolve to pick up new releases
        if entry.is_range {
            let now =
                SystemTime::now().duration_since(UNIX_EPOCH).map(|d| d.as_secs()).unwrap_or(0);
            if now.saturating_sub(entry.resolved_at) >= 3600 {
                // Range cache expired (> 1 hour)
                return false;
            }
            // Range cache still within TTL, but also check mtime if source_path exists
            if let Some(source_path) = &entry.source_path {
                let path = std::path::Path::new(source_path);
                if let Ok(metadata) = std::fs::metadata(path) {
                    if let Ok(mtime) = metadata.modified() {
                        let mtime_secs =
                            mtime.duration_since(UNIX_EPOCH).map(|d| d.as_secs()).unwrap_or(0);
                        return mtime_secs == entry.version_file_mtime;
                    }
                }
                return false; // Source file missing or can't read mtime
            }
            return true; // No source file, within TTL
        }

        // For exact versions, check source file
        let Some(source_path) = &entry.source_path else {
            // No source file to validate (e.g., "lts" default)
            // Consider valid if resolved recently (within 1 hour)
            let now =
                SystemTime::now().duration_since(UNIX_EPOCH).map(|d| d.as_secs()).unwrap_or(0);
            return now.saturating_sub(entry.resolved_at) < 3600;
        };
        let path = std::path::Path::new(source_path);
        let Ok(metadata) = std::fs::metadata(path) else {
            return false;
        };
        let Ok(mtime) = metadata.modified() else {
            return false;
        };
        let mtime_secs = mtime.duration_since(UNIX_EPOCH).map(|d| d.as_secs()).unwrap_or(0);
        mtime_secs == entry.version_file_mtime
    }

    /// Evict the oldest entry (by resolved_at timestamp).
    fn evict_oldest(&mut self) {
        if let Some((oldest_key, _)) = self
            .entries
            .iter()
            .min_by_key(|(_, entry)| entry.resolved_at)
            .map(|(k, v)| (k.clone(), v.clone()))
        {
            self.entries.remove(&oldest_key);
        }
    }
}

/// Get the cache file path.
pub fn get_cache_path() -> Option<AbsolutePathBuf> {
    let home = crate::commands::env::config::get_vite_plus_home().ok()?;
    Some(home.join("cache").join("resolve_cache.json"))
}

/// Invalidate the entire resolve cache by deleting the cache file.
/// Called after version configuration changes (e.g., `vp env default`, `vp env pin`, `vp env unpin`).
pub fn invalidate_cache() {
    if let Some(cache_path) = get_cache_path() {
        std::fs::remove_file(cache_path.as_path()).ok();
    }
}

/// Get the mtime of a file as Unix timestamp.
pub fn get_file_mtime(path: &AbsolutePath) -> Option<u64> {
    let metadata = std::fs::metadata(path).ok()?;
    let mtime = metadata.modified().ok()?;
    mtime.duration_since(UNIX_EPOCH).map(|d| d.as_secs()).ok()
}

/// Get the current Unix timestamp.
pub fn now_timestamp() -> u64 {
    SystemTime::now().duration_since(UNIX_EPOCH).map(|d| d.as_secs()).unwrap_or(0)
}

#[cfg(test)]
mod tests {
    use tempfile::TempDir;

    use super::*;

    #[test]
    fn test_range_version_cache_should_expire_after_ttl() {
        // BUG: Currently, range versions with source_path use mtime-only validation
        // and never expire. They should use time-based expiry like aliases.
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let cache_file = temp_path.join("cache.json");

        // Create a .node-version file
        let version_file = temp_path.join(".node-version");
        std::fs::write(&version_file, "20\n").unwrap();
        let mtime =
            get_file_mtime(&version_file).expect("Should be able to get mtime of created file");

        let mut cache = ResolveCache::default();

        // Create an entry for a range version (e.g., "20" resolved to "20.20.0")
        // with source_path set (from .node-version file) and resolved 2 hours ago
        let entry = ResolveCacheEntry {
            version: "20.20.0".to_string(),
            source: ".node-version".to_string(),
            project_root: None,
            resolved_at: now_timestamp() - 7200, // 2 hours ago (> 1 hour TTL)
            version_file_mtime: mtime,
            source_path: Some(version_file.as_path().display().to_string()),
            // BUG FIX: need to add is_range field
            is_range: true,
        };

        // Save entry to cache
        cache.insert(&temp_path, entry.clone());
        cache.save(&cache_file);

        // Reload cache
        let loaded_cache = ResolveCache::load(&cache_file);

        // BUG: This entry is still considered valid because mtime hasn't changed
        // but it SHOULD be invalid because it's a range and TTL has expired
        // After fix: is_entry_valid should return false for expired range entries
        let cached_entry = loaded_cache.get(&temp_path);

        // The cache entry should be INVALID (None) because:
        // 1. is_range is true
        // 2. resolved_at is > 1 hour ago
        // Even though the mtime hasn't changed
        assert!(
            cached_entry.is_none(),
            "Range version cache should expire after 1 hour TTL, \
             but mtime-only validation is returning the stale entry"
        );
    }

    #[test]
    fn test_exact_version_cache_uses_mtime_validation() {
        // Exact versions should use mtime-based validation, not time-based expiry
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let cache_file = temp_path.join("cache.json");

        // Create a .node-version file
        let version_file = temp_path.join(".node-version");
        std::fs::write(&version_file, "20.18.0\n").unwrap();
        let mtime = get_file_mtime(&version_file).unwrap();

        let mut cache = ResolveCache::default();

        // Create an entry for an exact version resolved 2 hours ago
        let entry = ResolveCacheEntry {
            version: "20.18.0".to_string(),
            source: ".node-version".to_string(),
            project_root: None,
            resolved_at: now_timestamp() - 7200, // 2 hours ago
            version_file_mtime: mtime,
            source_path: Some(version_file.as_path().display().to_string()),
            is_range: false, // Exact version, not a range
        };

        cache.insert(&temp_path, entry);
        cache.save(&cache_file);

        // Reload cache
        let loaded_cache = ResolveCache::load(&cache_file);
        let cached_entry = loaded_cache.get(&temp_path);

        // Exact version cache should still be valid as long as mtime hasn't changed
        assert!(
            cached_entry.is_some(),
            "Exact version cache should use mtime validation, not time-based expiry"
        );
        assert_eq!(cached_entry.unwrap().version, "20.18.0");
    }

    #[test]
    fn test_range_cache_valid_within_ttl() {
        // Range version cache should be valid within the 1 hour TTL
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();
        let cache_file = temp_path.join("cache.json");

        // Create a .node-version file
        let version_file = temp_path.join(".node-version");
        std::fs::write(&version_file, "20\n").unwrap();
        let mtime = get_file_mtime(&version_file).unwrap();

        let mut cache = ResolveCache::default();

        // Create an entry for a range version resolved recently (30 minutes ago)
        let entry = ResolveCacheEntry {
            version: "20.20.0".to_string(),
            source: ".node-version".to_string(),
            project_root: None,
            resolved_at: now_timestamp() - 1800, // 30 minutes ago (< 1 hour TTL)
            version_file_mtime: mtime,
            source_path: Some(version_file.as_path().display().to_string()),
            is_range: true,
        };

        cache.insert(&temp_path, entry);
        cache.save(&cache_file);

        // Reload cache
        let loaded_cache = ResolveCache::load(&cache_file);
        let cached_entry = loaded_cache.get(&temp_path);

        // Range version cache should still be valid within TTL
        assert!(cached_entry.is_some(), "Range version cache should be valid within TTL");
        assert_eq!(cached_entry.unwrap().version, "20.20.0");
    }

    // Run serially: mutates VITE_PLUS_HOME env var which affects get_cache_path()
    #[test]
    #[serial_test::serial]
    fn test_invalidate_cache_removes_file() {
        let temp_dir = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp_dir.path().to_path_buf()).unwrap();

        // Set VITE_PLUS_HOME to temp dir so invalidate_cache() targets our test file
        let cache_dir = temp_path.join("cache");
        std::fs::create_dir_all(&cache_dir).unwrap();
        let cache_file = cache_dir.join("resolve_cache.json");

        // Create a cache with an entry and save it
        let mut cache = ResolveCache::default();
        cache.insert(
            &temp_path,
            ResolveCacheEntry {
                version: "20.18.0".to_string(),
                source: ".node-version".to_string(),
                project_root: None,
                resolved_at: now_timestamp(),
                version_file_mtime: 0,
                source_path: None,
                is_range: false,
            },
        );
        cache.save(&cache_file);
        assert!(std::fs::metadata(cache_file.as_path()).is_ok(), "Cache file should exist");

        // Point VITE_PLUS_HOME to our temp dir and call invalidate_cache
        unsafe {
            std::env::set_var(vite_shared::env_vars::VITE_PLUS_HOME, temp_path.as_path());
        }
        invalidate_cache();
        unsafe {
            std::env::remove_var(vite_shared::env_vars::VITE_PLUS_HOME);
        }

        // Cache file should be removed
        assert!(
            std::fs::metadata(cache_file.as_path()).is_err(),
            "Cache file should be removed after invalidation"
        );

        // Loading from removed file should return empty default cache
        let loaded_cache = ResolveCache::load(&cache_file);
        assert!(loaded_cache.get(&temp_path).is_none(), "Cache should be empty after invalidation");
    }
}


================================================
FILE: crates/vite_global_cli/src/shim/dispatch.rs
================================================
//! Main dispatch logic for shim operations.
//!
//! This module handles the core shim functionality:
//! 1. Version resolution (with caching)
//! 2. Node.js installation (if needed)
//! 3. Tool execution (core tools and package binaries)

use vite_path::{AbsolutePath, AbsolutePathBuf, current_dir};
use vite_shared::{PrependOptions, env_vars, output, prepend_to_path_env};

use super::{
    cache::{self, ResolveCache, ResolveCacheEntry},
    exec, is_core_shim_tool,
};
use crate::commands::env::{
    bin_config::{BinConfig, BinSource},
    config::{self, ShimMode},
    global_install::CORE_SHIMS,
    package_metadata::PackageMetadata,
};

/// Environment variable used to prevent infinite recursion in shim dispatch.
///
/// When set, the shim will skip version resolution and execute the tool
/// directly using the current PATH (passthrough mode).
const RECURSION_ENV_VAR: &str = env_vars::VITE_PLUS_TOOL_RECURSION;

/// Package manager tools that should resolve Node.js version from the project context
/// rather than using the install-time version.
const PACKAGE_MANAGER_TOOLS: &[&str] = &["pnpm", "yarn"];

fn is_package_manager_tool(tool: &str) -> bool {
    PACKAGE_MANAGER_TOOLS.contains(&tool)
}

/// Parsed npm global command (install or uninstall).
struct NpmGlobalCommand {
    /// Package names/specs extracted from args (e.g., ["codex", "typescript@5"])
    packages: Vec<String>,
    /// Explicit `--prefix <dir>` from the CLI args, if present.
    explicit_prefix: Option<String>,
}

/// Value-bearing npm flags whose next arg should be skipped during package extraction.
/// Note: `--prefix` is handled separately to capture its value.
const NPM_VALUE_FLAGS: &[&str] = &["--registry", "--tag", "--cache", "--tmp"];

/// Install subcommands recognized by npm.
const NPM_INSTALL_SUBCOMMANDS: &[&str] = &["install", "i", "add"];

/// Uninstall subcommands recognized by npm.
const NPM_UNINSTALL_SUBCOMMANDS: &[&str] = &["uninstall", "un", "remove", "rm"];

/// Parse npm args to detect a global command (`npm <subcommand> -g <packages>`).
/// Returns None if the args don't match the expected pattern.
fn parse_npm_global_command(args: &[String], subcommands: &[&str]) -> Option<NpmGlobalCommand> {
    let mut has_global = false;
    let mut has_subcommand = false;
    let mut packages = Vec::new();
    let mut skip_next = false;
    let mut prefix_next = false;
    let mut explicit_prefix = None;
    // The npm subcommand must be the first positional (non-flag) arg.
    // Once we see a positional that isn't a recognized subcommand, no later
    // positional can be the subcommand (e.g. `npm run install -g` → not install).
    let mut seen_positional = false;

    for arg in args {
        // Capture the value after --prefix
        if prefix_next {
            prefix_next = false;
            explicit_prefix = Some(arg.clone());
            continue;
        }
        if skip_next {
            skip_next = false;
            continue;
        }
        if arg == "-g" || arg == "--global" {
            has_global = true;
            continue;
        }
        // Capture --prefix specially (its value is needed for prefix resolution)
        if arg == "--prefix" {
            prefix_next = true;
            continue;
        }
        if let Some(value) = arg.strip_prefix("--prefix=") {
            explicit_prefix = Some(value.to_string());
            continue;
        }
        // Check for value-bearing flags (skip their values)
        if NPM_VALUE_FLAGS.contains(&arg.as_str()) {
            skip_next = true;
            continue;
        }
        // Skip flags
        if arg.starts_with('-') {
            continue;
        }
        // Subcommand must be the first positional arg
        if !seen_positional && subcommands.contains(&arg.as_str()) && !has_subcommand {
            has_subcommand = true;
            seen_positional = true;
            continue;
        }
        seen_positional = true;
        // This is a positional arg (package spec)
        packages.push(arg.clone());
    }

    if !has_global || !has_subcommand || packages.is_empty() {
        return None;
    }
    Some(NpmGlobalCommand { packages, explicit_prefix })
}

/// Parse npm args to detect `npm install -g <packages>`.
fn parse_npm_global_install(args: &[String]) -> Option<NpmGlobalCommand> {
    let mut parsed = parse_npm_global_command(args, NPM_INSTALL_SUBCOMMANDS)?;
    // Filter out URLs and git+ prefixes (too complex to resolve package names)
    parsed.packages.retain(|pkg| !pkg.contains("://") && !pkg.starts_with("git+"));
    if parsed.packages.is_empty() { None } else { Some(parsed) }
}

/// Parse npm args to detect `npm uninstall -g <packages>`.
fn parse_npm_global_uninstall(args: &[String]) -> Option<NpmGlobalCommand> {
    parse_npm_global_command(args, NPM_UNINSTALL_SUBCOMMANDS)
}

/// Resolve package name from a spec string.
///
/// Handles:
/// - Regular specs: "codex" → "codex", "typescript@5" → "typescript"
/// - Scoped specs: "@scope/pkg" → "@scope/pkg", "@scope/pkg@1.0" → "@scope/pkg"
/// - Local paths: "./foo" → reads foo/package.json → name field
fn is_local_path(spec: &str) -> bool {
    spec == "."
        || spec == ".."
        || spec.starts_with("./")
        || spec.starts_with("../")
        || spec.starts_with('/')
        || (cfg!(windows)
            && spec.len() >= 3
            && spec.as_bytes()[1] == b':'
            && (spec.as_bytes()[2] == b'\\' || spec.as_bytes()[2] == b'/'))
}

fn resolve_package_name(spec: &str) -> Option<String> {
    // Local path — read package.json to get the actual name
    if is_local_path(spec) {
        let pkg_json_path = current_dir().ok()?.join(spec).join("package.json");
        let content = std::fs::read_to_string(pkg_json_path.as_path()).ok()?;
        let json: serde_json::Value = serde_json::from_str(&content).ok()?;
        return json.get("name").and_then(|n| n.as_str()).map(str::to_string);
    }
    // Scoped package: @scope/name or @scope/name@version
    if let Some(rest) = spec.strip_prefix('@') {
        if let Some(idx) = rest.find('@') {
            return Some(spec[..=idx].to_string());
        }
        return Some(spec.to_string());
    }
    // Regular package: name or name@version
    if let Some(idx) = spec.find('@') {
        return Some(spec[..idx].to_string());
    }
    Some(spec.to_string())
}

/// Get the actual npm global prefix directory.
///
/// Runs `npm config get prefix` to determine the global prefix, which respects
/// `NPM_CONFIG_PREFIX` env var and `.npmrc` settings. Falls back to `node_dir`.
#[allow(clippy::disallowed_types)]
fn get_npm_global_prefix(npm_path: &AbsolutePath, node_dir: &AbsolutePathBuf) -> AbsolutePathBuf {
    // `npm config get prefix` respects NPM_CONFIG_PREFIX, .npmrc, and other
    // npm config mechanisms.
    if let Ok(output) =
        std::process::Command::new(npm_path.as_path()).args(["config", "get", "prefix"]).output()
    {
        if output.status.success() {
            if let Ok(prefix) = std::str::from_utf8(&output.stdout) {
                let prefix = prefix.trim();
                if let Some(prefix_path) = AbsolutePathBuf::new(prefix.into()) {
                    return prefix_path;
                }
            }
        }
    }
    // Fallback: default npm prefix is the Node install dir
    node_dir.clone()
}

/// After npm install -g completes, check if installed binaries are on PATH.
///
/// First determines the actual npm global bin directory (which may differ from the
/// default if the user has set a custom prefix). If that directory is already on the
/// user's original PATH, binaries are reachable and no action is needed.
///
/// Otherwise, in interactive mode, prompt user to create bin links.
/// In non-interactive mode, create links automatically.
/// Always print a tip suggesting `vp install -g`.
#[allow(clippy::disallowed_macros, clippy::disallowed_types)]
fn check_npm_global_install_result(
    packages: &[String],
    original_path: Option<&std::ffi::OsStr>,
    npm_prefix: &AbsolutePath,
    node_dir: &AbsolutePath,
    node_version: &str,
) {
    use std::io::IsTerminal;

    let Ok(bin_dir) = config::get_bin_dir() else { return };

    // Derive bin dir from prefix (Unix: prefix/bin, Windows: prefix itself)
    #[cfg(unix)]
    let npm_bin_dir = npm_prefix.join("bin");
    #[cfg(windows)]
    let npm_bin_dir = npm_prefix.to_absolute_path_buf();

    // If the npm global bin dir is already on the user's original PATH,
    // binaries are reachable without shims — no action needed.
    if let Some(orig) = original_path {
        if std::env::split_paths(orig).any(|p| p == npm_bin_dir.as_path()) {
            return;
        }
    }

    let is_interactive = std::io::stdin().is_terminal();

    // (bin_name, source_path, package_name)
    let mut missing_bins: Vec<(String, AbsolutePathBuf, String)> = Vec::new();
    let mut managed_conflicts: Vec<(String, String)> = Vec::new();

    for spec in packages {
        let Some(package_name) = resolve_package_name(spec) else { continue };
        let Some(content) = read_npm_package_json(npm_prefix, node_dir, &package_name) else {
            continue;
        };
        let Ok(package_json) = serde_json::from_str::<serde_json::Value>(&content) else {
            continue;
        };
        let bin_names = extract_bin_names(&package_json);

        for bin_name in bin_names {
            // Skip core shims
            if CORE_SHIMS.contains(&bin_name.as_str()) {
                continue;
            }

            // Check if binary already exists in bin_dir (vite-plus bin)
            // On Unix: symlinks (bin/tsc)
            // On Windows: trampoline .exe (bin/tsc.exe) or legacy .cmd (bin/tsc.cmd)
            let shim_path = bin_dir.join(&bin_name);
            let shim_exists = std::fs::symlink_metadata(shim_path.as_path()).is_ok() || {
                #[cfg(windows)]
                {
                    let exe_path = bin_dir.join(vite_str::format!("{bin_name}.exe"));
                    std::fs::symlink_metadata(exe_path.as_path()).is_ok()
                }
                #[cfg(not(windows))]
                false
            };

            if shim_exists {
                if let Ok(Some(config)) = BinConfig::load_sync(&bin_name) {
                    if config.source == BinSource::Vp {
                        // Managed by vp install -g — warn about the conflict
                        managed_conflicts.push((bin_name, config.package.clone()));
                    } else if config.source == BinSource::Npm && config.package != package_name {
                        // Link exists from a different npm package — recreate link for new owner.
                        // The old symlink points at the previous package's binary; we must
                        // replace it so it resolves to the new package's binary in npm's bin dir.
                        #[cfg(unix)]
                        let source_path = npm_bin_dir.join(&bin_name);
                        #[cfg(windows)]
                        let source_path = npm_bin_dir.join(vite_str::format!("{bin_name}.cmd"));
                        if source_path.as_path().exists() {
                            let _ = std::fs::remove_file(shim_path.as_path());
                            create_bin_link(
                                &bin_dir,
                                &bin_name,
                                &source_path,
                                &package_name,
                                node_version,
                            );
                        }
                    }
                }
                continue;
            }

            // Also check .cmd on Windows
            #[cfg(windows)]
            {
                let cmd_path = bin_dir.join(format!("{bin_name}.cmd"));
                if cmd_path.as_path().exists() {
                    continue;
                }
            }

            // Binary source in actual npm global bin dir
            #[cfg(unix)]
            let source_path = npm_bin_dir.join(&bin_name);
            #[cfg(windows)]
            let source_path = npm_bin_dir.join(format!("{bin_name}.cmd"));

            if source_path.as_path().exists() {
                missing_bins.push((bin_name, source_path, package_name.clone()));
            }
        }
    }

    // Deduplicate by bin_name so that when two packages declare the same binary,
    // only the last one is linked (matching npm's "last writer wins" behavior).
    let missing_bins = dedup_missing_bins(missing_bins);

    if !managed_conflicts.is_empty() {
        for (bin_name, pkg) in &managed_conflicts {
            output::raw(&vite_str::format!(
                "Skipped '{bin_name}': managed by `vp install -g {pkg}`. Run `vp uninstall -g {pkg}` to remove it first."
            ));
        }
    }

    if missing_bins.is_empty() {
        return;
    }

    let should_link = if is_interactive {
        // Prompt user
        let bin_list: Vec<&str> = missing_bins.iter().map(|(name, _, _)| name.as_str()).collect();
        let bin_display = bin_list.join(", ");
        output::raw(&vite_str::format!("'{bin_display}' is not available on your PATH."));
        output::raw_inline("Create a link in ~/.vite-plus/bin/ to make it available? [Y/n] ");
        let _ = std::io::Write::flush(&mut std::io::stdout());
        let mut input = String::new();
        let confirmed = std::io::stdin().read_line(&mut input).is_ok();
        let trimmed = input.trim();
        confirmed
            && (trimmed.is_empty()
                || trimmed.eq_ignore_ascii_case("y")
                || trimmed.eq_ignore_ascii_case("yes"))
    } else {
        // Non-interactive: auto-link
        true
    };

    if should_link {
        for (bin_name, source_path, package_name) in &missing_bins {
            create_bin_link(&bin_dir, bin_name, source_path, package_name, node_version);
        }
    }

    // Always print the tip
    let pkg_names: Vec<&str> = packages.iter().map(String::as_str).collect();
    let pkg_display = pkg_names.join(" ");
    output::raw(&vite_str::format!(
        "\ntip: Use `vp install -g {pkg_display}` for managed shims that persist across Node.js version changes."
    ));
}

/// Extract binary names from a package.json value.
fn extract_bin_names(package_json: &serde_json::Value) -> Vec<String> {
    let mut bins = Vec::new();
    if let Some(bin) = package_json.get("bin") {
        match bin {
            serde_json::Value::String(_) => {
                // Single binary with package name
                if let Some(name) = package_json["name"].as_str() {
                    let bin_name = name.split('/').last().unwrap_or(name);
                    bins.push(bin_name.to_string());
                }
            }
            serde_json::Value::Object(map) => {
                for name in map.keys() {
                    bins.push(name.clone());
                }
            }
            _ => {}
        }
    }
    bins
}

/// Extract the relative path for a specific bin name from a package.json "bin" field.
fn extract_bin_path(package_json: &serde_json::Value, bin_name: &str) -> Option<String> {
    match package_json.get("bin")? {
        serde_json::Value::String(path) => {
            // Single binary — matches if the package name's last segment equals bin_name
            let pkg_name = package_json["name"].as_str()?;
            let expected = pkg_name.split('/').last().unwrap_or(pkg_name);
            if expected == bin_name { Some(path.clone()) } else { None }
        }
        serde_json::Value::Object(map) => {
            map.get(bin_name).and_then(|v| v.as_str()).map(str::to_string)
        }
        _ => None,
    }
}

/// Create a bin link for a binary and record it via BinConfig.
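As a standalone illustration of the two `package.json` "bin" shapes described above, here is a minimal sketch. It uses a hypothetical stand-in enum (`BinField`, `bin_names`) instead of the crate's real `serde_json` handling, so the names and types here are illustrative only:

```rust
// Hypothetical stand-in for the two package.json "bin" shapes:
//   "bin": "./cli.js"            -> one binary named after the package
//   "bin": { "tsc": "...", .. }  -> one binary per object key
enum BinField<'a> {
    Single,            // "bin" is a string; the name derives from the package name
    Map(Vec<&'a str>), // "bin" is an object; its keys are the binary names
}

fn bin_names(package_name: &str, bin: &BinField) -> Vec<String> {
    match bin {
        // For a string "bin", the last path segment of the package name is
        // used, so "@scope/pkg" yields a binary called "pkg".
        BinField::Single => {
            let name = package_name.rsplit('/').next().unwrap_or(package_name);
            vec![name.to_string()]
        }
        BinField::Map(keys) => keys.iter().map(|k| k.to_string()).collect(),
    }
}

fn main() {
    assert_eq!(bin_names("@scope/pkg", &BinField::Single), vec!["pkg"]);
    assert_eq!(
        bin_names("typescript", &BinField::Map(vec!["tsc", "tsserver"])),
        vec!["tsc", "tsserver"]
    );
}
```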
fn create_bin_link(
    bin_dir: &AbsolutePath,
    bin_name: &str,
    source_path: &AbsolutePath,
    package_name: &str,
    node_version: &str,
) {
    let mut linked = false;

    #[cfg(unix)]
    {
        let link_path = bin_dir.join(bin_name);
        if std::os::unix::fs::symlink(source_path.as_path(), link_path.as_path()).is_ok() {
            output::raw(&vite_str::format!(
                "Linked '{bin_name}' to {}",
                link_path.as_path().display()
            ));
            linked = true;
        } else {
            output::error(&vite_str::format!("Failed to create link for '{bin_name}'"));
        }
    }

    #[cfg(windows)]
    {
        // npm-installed packages use .cmd wrappers pointing to npm's generated script.
        // Unlike vp-installed packages, these don't have PackageMetadata, so the
        // trampoline approach won't work (dispatch_package_binary would fail).
        let cmd_path = bin_dir.join(vite_str::format!("{bin_name}.cmd"));
        let wrapper_content = vite_str::format!(
            "@echo off\r\n\"{source}\" %*\r\nexit /b %ERRORLEVEL%\r\n",
            source = source_path.as_path().display()
        );
        if std::fs::write(cmd_path.as_path(), &*wrapper_content).is_ok() {
            output::raw(&vite_str::format!(
                "Linked '{bin_name}' to {}",
                cmd_path.as_path().display()
            ));
            linked = true;
        } else {
            output::error(&vite_str::format!("Failed to create link for '{bin_name}'"));
        }

        // Also create shell script for Git Bash
        let sh_path = bin_dir.join(bin_name);
        let sh_content =
            format!("#!/bin/sh\nexec \"{}\" \"$@\"\n", source_path.as_path().display());
        let _ = std::fs::write(sh_path.as_path(), sh_content);
    }

    // Record the link in BinConfig so we can identify it during uninstall
    if linked {
        let _ = BinConfig::new_npm(
            bin_name.to_string(),
            package_name.to_string(),
            node_version.to_string(),
        )
        .save_sync();
    }
}

/// Deduplicate missing_bins by bin_name, keeping the last entry (npm's "last writer wins").
///
/// When `npm install -g pkg-a pkg-b` and both declare the same binary name, we get
/// duplicate entries.
/// Without dedup, `create_bin_link` would fail on the second entry
/// because the symlink already exists, leaving stale BinConfig for the first package.
#[allow(clippy::disallowed_types)]
fn dedup_missing_bins(
    missing_bins: Vec<(String, AbsolutePathBuf, String)>,
) -> Vec<(String, AbsolutePathBuf, String)> {
    let mut seen: std::collections::HashSet<String> = std::collections::HashSet::new();
    let mut deduped = Vec::new();
    for entry in missing_bins.into_iter().rev() {
        if seen.insert(entry.0.clone()) {
            deduped.push(entry);
        }
    }
    deduped.reverse();
    deduped
}

/// After npm uninstall -g completes, remove bin links that were created during install.
///
/// Each entry is `(bin_name, package_name)`. We only remove a link if its BinConfig
/// has `source: Npm` AND `package` matches the package being uninstalled. This prevents
/// removing a link that was overwritten by a later install of a different package.
///
/// When a bin is owned by a **different** npm package (not being uninstalled), npm may
/// still delete its binary from `npm_bin_dir`, leaving our symlink dangling. In that
/// case we repair the link by pointing directly at the surviving package's binary.
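The "last writer wins" dedup strategy described above can be sketched in isolation. This is a simplified stand-in: plain `(bin_name, package_name)` string pairs replace the real three-element tuple, and `dedup_last_wins` is an illustrative name, not the crate's function:

```rust
use std::collections::HashSet;

// Keep only the LAST entry per bin name: iterate in reverse, keep the first
// occurrence seen (i.e. the last in original order), then restore order.
fn dedup_last_wins(entries: Vec<(String, String)>) -> Vec<(String, String)> {
    let mut seen: HashSet<String> = HashSet::new();
    let mut out = Vec::new();
    for entry in entries.into_iter().rev() {
        if seen.insert(entry.0.clone()) {
            out.push(entry);
        }
    }
    out.reverse();
    out
}

fn main() {
    let bins = vec![
        ("tsc".to_string(), "pkg-a".to_string()),
        ("tsc".to_string(), "pkg-b".to_string()),
        ("eslint".to_string(), "pkg-c".to_string()),
    ];
    let deduped = dedup_last_wins(bins);
    // "tsc" from pkg-b (the later install) survives, matching npm's behavior.
    assert_eq!(
        deduped,
        vec![
            ("tsc".to_string(), "pkg-b".to_string()),
            ("eslint".to_string(), "pkg-c".to_string()),
        ]
    );
}
```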
#[allow(clippy::disallowed_types)]
fn remove_npm_global_uninstall_links(bin_entries: &[(String, String)], npm_prefix: &AbsolutePath) {
    let Ok(bin_dir) = config::get_bin_dir() else { return };

    for (bin_name, package_name) in bin_entries {
        // Skip core shims
        if CORE_SHIMS.contains(&bin_name.as_str()) {
            continue;
        }

        let config = match BinConfig::load_sync(bin_name) {
            Ok(Some(c)) if c.source == BinSource::Npm => c,
            _ => continue,
        };

        if config.package == *package_name {
            // Owned by the package being uninstalled — remove the link
            let link_path = bin_dir.join(bin_name);
            if std::fs::symlink_metadata(link_path.as_path()).is_ok() {
                if std::fs::remove_file(link_path.as_path()).is_ok() {
                    output::raw(&vite_str::format!(
                        "Removed link '{bin_name}' from {}",
                        link_path.as_path().display()
                    ));
                }
            }
            // Clean up the BinConfig
            let _ = BinConfig::delete_sync(bin_name);

            // Also remove .cmd and .exe on Windows
            #[cfg(windows)]
            {
                let cmd_path = bin_dir.join(vite_str::format!("{bin_name}.cmd"));
                let _ = std::fs::remove_file(cmd_path.as_path());
                let exe_path = bin_dir.join(vite_str::format!("{bin_name}.exe"));
                let _ = std::fs::remove_file(exe_path.as_path());
            }
        } else {
            // Owned by a different npm package — check if our link target is now broken
            // (npm may have deleted the binary from npm_bin_dir when uninstalling)
            let link_path = bin_dir.join(bin_name);
            // On Unix, exists() follows the symlink — if target is gone, it returns false.
            // On Windows, the shim files are regular files that always "exist",
            // so we always fall through to the repair check below.
            #[cfg(unix)]
            if link_path.as_path().exists() {
                // Target still accessible — nothing to repair
                continue;
            }

            // Target is broken — repair by pointing to the surviving package's binary
            let surviving_pkg = &config.package;
            let node_modules_dir = config::get_node_modules_dir(npm_prefix, surviving_pkg);
            let pkg_json_path = node_modules_dir.join("package.json");
            let content = match std::fs::read_to_string(pkg_json_path.as_path()) {
                Ok(c) => c,
                Err(_) => continue,
            };
            let package_json = match serde_json::from_str::<serde_json::Value>(&content) {
                Ok(v) => v,
                Err(_) => continue,
            };
            let Some(bin_rel_path) = extract_bin_path(&package_json, bin_name) else {
                continue;
            };
            let source_path = node_modules_dir.join(&bin_rel_path);
            if source_path.as_path().exists() {
                let _ = std::fs::remove_file(link_path.as_path());
                #[cfg(windows)]
                {
                    let cmd_path = bin_dir.join(vite_str::format!("{bin_name}.cmd"));
                    let _ = std::fs::remove_file(cmd_path.as_path());
                }
                create_bin_link(
                    &bin_dir,
                    bin_name,
                    &source_path,
                    surviving_pkg,
                    &config.node_version,
                );
            }
        }
    }
}

/// Read the installed package.json from npm's node_modules directory.
/// Tries the npm prefix first (handles custom prefix), then falls back to node_dir.
#[allow(clippy::disallowed_types)]
fn read_npm_package_json(
    npm_prefix: &AbsolutePath,
    node_dir: &AbsolutePath,
    package_name: &str,
) -> Option<String> {
    std::fs::read_to_string(
        config::get_node_modules_dir(npm_prefix, package_name).join("package.json").as_path(),
    )
    .ok()
    .or_else(|| {
        let dir = config::get_node_modules_dir(node_dir, package_name);
        std::fs::read_to_string(dir.join("package.json").as_path()).ok()
    })
}

/// Collect (bin_name, package_name) pairs from packages by reading their installed package.json files.
#[allow(clippy::disallowed_types)]
fn collect_bin_names_from_npm(
    packages: &[String],
    npm_prefix: &AbsolutePath,
    node_dir: &AbsolutePath,
) -> Vec<(String, String)> {
    let mut all_bins = Vec::new();
    for spec in packages {
        let Some(package_name) = resolve_package_name(spec) else { continue };
        let Some(content) = read_npm_package_json(npm_prefix, node_dir, &package_name) else {
            continue;
        };
        let Ok(package_json) = serde_json::from_str::<serde_json::Value>(&content) else {
            continue;
        };
        for bin_name in extract_bin_names(&package_json) {
            all_bins.push((bin_name, package_name.clone()));
        }
    }
    all_bins
}

/// Resolve the npm prefix, preferring an explicit `--prefix` from CLI args.
///
/// Handles both absolute and relative `--prefix` values by resolving against cwd.
/// `AbsolutePathBuf::join` replaces the base when the argument is absolute (like
/// `PathBuf::join`), so `cwd.join("/abs")` → `/abs` and `cwd.join("./rel")` → `/cwd/./rel`.
fn resolve_npm_prefix(
    parsed: &NpmGlobalCommand,
    npm_path: &AbsolutePath,
    node_dir: &AbsolutePathBuf,
) -> AbsolutePathBuf {
    if let Some(ref prefix) = parsed.explicit_prefix {
        if let Ok(cwd) = current_dir() {
            return cwd.join(prefix);
        }
    }
    get_npm_global_prefix(npm_path, node_dir)
}

/// Main shim dispatch entry point.
///
/// Called when the binary is invoked as node, npm, npx, or a package binary.
/// Returns an exit code to be used with std::process::exit.
pub async fn dispatch(tool: &str, args: &[String]) -> i32 {
    tracing::debug!("dispatch: tool: {tool}, args: {:?}", args);

    // Handle vpx — standalone command, doesn't need recursion/bypass/shim-mode checks
    if tool == "vpx" {
        let cwd = match current_dir() {
            Ok(path) => path,
            Err(e) => {
                eprintln!("vp: Failed to get current directory: {e}");
                return 1;
            }
        };
        return crate::commands::vpx::execute_vpx(args, &cwd).await;
    }

    // Check recursion prevention - if already in a shim context, passthrough directly
    // Only applies to core tools (node/npm/npx) whose bin dir is prepended to PATH.
    // Package binaries are always resolved via metadata lookup, so they can't loop.
    if std::env::var(RECURSION_ENV_VAR).is_ok() && is_core_shim_tool(tool) {
        tracing::debug!("recursion prevention enabled for core tool");
        return passthrough_to_system(tool, args);
    }

    // Check bypass mode (explicit environment variable)
    if std::env::var(env_vars::VITE_PLUS_BYPASS).is_ok() {
        tracing::debug!("bypass mode enabled");
        return bypass_to_system(tool, args);
    }

    // Check shim mode from config
    let shim_mode = load_shim_mode().await;
    if shim_mode == ShimMode::SystemFirst {
        tracing::debug!("system-first mode enabled");
        // In system-first mode, try to find system tool first
        if let Some(system_path) = find_system_tool(tool) {
            // Append current bin_dir to VITE_PLUS_BYPASS to prevent infinite loops
            // when multiple vite-plus installations exist in PATH.
            // The next installation will filter all accumulated paths.
            if let Ok(bin_dir) = config::get_bin_dir() {
                let bypass_val = match std::env::var_os(env_vars::VITE_PLUS_BYPASS) {
                    Some(existing) => {
                        let mut paths: Vec<_> = std::env::split_paths(&existing).collect();
                        paths.push(bin_dir.as_path().to_path_buf());
                        std::env::join_paths(paths).unwrap_or(existing)
                    }
                    None => std::ffi::OsString::from(bin_dir.as_path()),
                };
                // SAFETY: Setting env vars before exec (which replaces the process) is safe
                unsafe {
                    std::env::set_var(env_vars::VITE_PLUS_BYPASS, bypass_val);
                }
            }
            return exec::exec_tool(&system_path, args);
        }
        // Fall through to managed if system not found
    }

    // Check if this is a package binary (not node/npm/npx)
    if !is_core_shim_tool(tool) {
        return dispatch_package_binary(tool, args).await;
    }

    // Get current working directory
    let cwd = match current_dir() {
        Ok(path) => path,
        Err(e) => {
            eprintln!("vp: Failed to get current directory: {e}");
            return 1;
        }
    };

    // Resolve version (with caching)
    let resolution = match resolve_with_cache(&cwd).await {
        Ok(r) => r,
        Err(e) => {
            eprintln!("vp: Failed to resolve Node version: {e}");
            eprintln!("vp: Run 'vp env doctor' for diagnostics");
            return 1;
        }
    };

    // Ensure Node.js is installed
    if let Err(e) = ensure_installed(&resolution.version).await {
        eprintln!("vp: Failed to install Node {}: {e}", resolution.version);
        return 1;
    }

    // Locate tool binary
    let tool_path = match locate_tool(&resolution.version, tool) {
        Ok(p) => p,
        Err(e) => {
            eprintln!("vp: Tool '{tool}' not found: {e}");
            return 1;
        }
    };

    // Save original PATH before we modify it — needed for npm global install check.
    // Only captured for npm to avoid unnecessary work on node/npx hot path.
    let original_path = if tool == "npm" { std::env::var_os("PATH") } else { None };

    // Prepare environment for recursive invocations
    // Prepend real node bin dir to PATH so child processes use the correct version
    let node_bin_dir = tool_path.parent().expect("Tool has no parent directory");
    // Use dedupe_anywhere=false to only check if it's first in PATH (original behavior)
    prepend_to_path_env(node_bin_dir, PrependOptions::default());

    // Optional debug env vars
    if std::env::var(env_vars::VITE_PLUS_DEBUG_SHIM).is_ok() {
        // SAFETY: Setting env vars at this point before exec is safe
        unsafe {
            std::env::set_var(env_vars::VITE_PLUS_ACTIVE_NODE, &resolution.version);
            std::env::set_var(env_vars::VITE_PLUS_RESOLVE_SOURCE, &resolution.source);
        }
    }

    // Set recursion prevention marker before executing
    // This prevents infinite loops when the executed tool invokes another shim
    // SAFETY: Setting env vars at this point before exec is safe
    unsafe {
        std::env::set_var(RECURSION_ENV_VAR, "1");
    }

    // For npm install/uninstall -g, use spawn+wait so we can post-check/cleanup binaries
    if tool == "npm" {
        if let Some(parsed) = parse_npm_global_install(args) {
            let exit_code = exec::spawn_tool(&tool_path, args);
            if exit_code == 0 {
                if let Ok(home_dir) = vite_shared::get_vite_plus_home() {
                    let node_dir =
                        home_dir.join("js_runtime").join("node").join(&*resolution.version);
                    let npm_prefix = resolve_npm_prefix(&parsed, &tool_path, &node_dir);
                    check_npm_global_install_result(
                        &parsed.packages,
                        original_path.as_deref(),
                        &npm_prefix,
                        &node_dir,
                        &resolution.version,
                    );
                }
            }
            return exit_code;
        }

        if let Some(parsed) = parse_npm_global_uninstall(args) {
            // Collect bin names before uninstall (package.json will be gone after)
            let context = if let Ok(home_dir) = vite_shared::get_vite_plus_home() {
                let node_dir = home_dir.join("js_runtime").join("node").join(&*resolution.version);
                let npm_prefix = resolve_npm_prefix(&parsed, &tool_path, &node_dir);
                let bins = collect_bin_names_from_npm(&parsed.packages, &npm_prefix, &node_dir);
                Some((bins, npm_prefix))
            } else {
                None
            };

            let exit_code = exec::spawn_tool(&tool_path, args);
            if exit_code == 0 {
                if let Some((bin_names, npm_prefix)) = context {
                    remove_npm_global_uninstall_links(&bin_names, &npm_prefix);
                }
            }
            return exit_code;
        }
    }

    // Execute the tool (normal path — exec replaces process on Unix)
    exec::exec_tool(&tool_path, args)
}

/// Dispatch a package binary shim.
///
/// Finds the package that provides this binary and executes it with the
/// Node.js version that was used to install the package.
async fn dispatch_package_binary(tool: &str, args: &[String]) -> i32 {
    // Find which package provides this binary
    let package_metadata = match find_package_for_binary(tool).await {
        Ok(Some(metadata)) => metadata,
        Ok(None) => {
            eprintln!("vp: Binary '{tool}' not found in any installed package");
            eprintln!("vp: Run 'vp install -g <package>' to install");
            return 1;
        }
        Err(e) => {
            eprintln!("vp: Failed to find package for '{tool}': {e}");
            return 1;
        }
    };

    // Determine Node.js version to use:
    // - Package managers (pnpm, yarn): resolve from project context so they respect
    //   the project's engines.node / .node-version, falling back to install-time version
    // - Other package binaries: use the install-time version (original behavior)
    let node_version = if is_package_manager_tool(tool) {
        let cwd = match current_dir() {
            Ok(path) => path,
            Err(e) => {
                eprintln!("vp: Failed to get current directory: {e}");
                return 1;
            }
        };
        match resolve_with_cache(&cwd).await {
            Ok(resolution) => resolution.version,
            Err(_) => {
                // Fall back to install-time version if project resolution fails
                package_metadata.platform.node.clone()
            }
        }
    } else {
        package_metadata.platform.node.clone()
    };

    // Ensure Node.js is installed
    if let Err(e) = ensure_installed(&node_version).await {
        eprintln!("vp: Failed to install Node {}: {e}", node_version);
        return 1;
    }

    // Locate the actual binary in the package directory
    let binary_path = match locate_package_binary(&package_metadata.name, tool) {
        Ok(p) => p,
        Err(e) => {
            eprintln!("vp: Binary '{tool}' not found: {e}");
            return 1;
        }
    };

    // Locate node binary for this version
    let node_path = match locate_tool(&node_version, "node") {
        Ok(p) => p,
        Err(e) => {
            eprintln!("vp: Node not found: {e}");
            return 1;
        }
    };

    // Prepare environment for recursive invocations
    let node_bin_dir = node_path.parent().expect("Node has no parent directory");
    prepend_to_path_env(node_bin_dir, PrependOptions::default());

    // Check if the binary is a JavaScript file that needs Node.js
    // This info was determined at install time and stored in metadata
    if package_metadata.is_js_binary(tool) {
        // Execute: node <binary_path> <args>
        let mut full_args = vec![binary_path.as_path().display().to_string()];
        full_args.extend(args.iter().cloned());
        exec::exec_tool(&node_path, &full_args)
    } else {
        // Execute the binary directly (native executable or non-Node script)
        exec::exec_tool(&binary_path, args)
    }
}

/// Find the package that provides a given binary.
///
/// Uses BinConfig for deterministic O(1) lookup instead of scanning all packages.
pub(crate) async fn find_package_for_binary(
    binary_name: &str,
) -> Result<Option<PackageMetadata>, String> {
    // Use BinConfig for deterministic lookup
    if let Some(bin_config) = BinConfig::load(binary_name).await.map_err(|e| format!("{e}"))? {
        return PackageMetadata::load(&bin_config.package).await.map_err(|e| format!("{e}"));
    }
    // Binary not installed
    Ok(None)
}

/// Locate a binary within a package's installation directory.
pub(crate) fn locate_package_binary(
    package_name: &str,
    binary_name: &str,
) -> Result<AbsolutePathBuf, String> {
    let packages_dir = config::get_packages_dir().map_err(|e| format!("{e}"))?;
    let package_dir = packages_dir.join(package_name);

    // The binary is referenced in package.json's bin field
    // npm uses different layouts: Unix=lib/node_modules, Windows=node_modules
    let node_modules_dir = config::get_node_modules_dir(&package_dir, package_name);
    let package_json_path = node_modules_dir.join("package.json");
    if !package_json_path.as_path().exists() {
        return Err(format!("Package {} not found", package_name));
    }

    // Read package.json to find the binary path
    let content = std::fs::read_to_string(package_json_path.as_path())
        .map_err(|e| format!("Failed to read package.json: {e}"))?;
    let package_json: serde_json::Value = serde_json::from_str(&content)
        .map_err(|e| format!("Failed to parse package.json: {e}"))?;

    let binary_path = match package_json.get("bin") {
        Some(serde_json::Value::String(path)) => {
            // Single binary - check if it matches the name
            let pkg_name = package_json["name"].as_str().unwrap_or("");
            let expected_name = pkg_name.split('/').last().unwrap_or(pkg_name);
            if expected_name == binary_name {
                node_modules_dir.join(path)
            } else {
                return Err(format!("Binary {} not found in package", binary_name));
            }
        }
        Some(serde_json::Value::Object(map)) => {
            // Multiple binaries - find the one we need
            if let Some(serde_json::Value::String(path)) = map.get(binary_name) {
                node_modules_dir.join(path)
            } else {
                return Err(format!("Binary {} not found in package", binary_name));
            }
        }
        _ => {
            return Err(format!("No bin field in package.json for {}", package_name));
        }
    };

    if !binary_path.as_path().exists() {
        return Err(format!(
            "Binary {} not found at {}",
            binary_name,
            binary_path.as_path().display()
        ));
    }

    Ok(binary_path)
}

/// Bypass shim and use system tool.
fn bypass_to_system(tool: &str, args: &[String]) -> i32 {
    match find_system_tool(tool) {
        Some(system_path) => exec::exec_tool(&system_path, args),
        None => {
            eprintln!("vp: VITE_PLUS_BYPASS is set but no system '{tool}' found in PATH");
            1
        }
    }
}

/// Passthrough mode for recursion prevention.
///
/// When VITE_PLUS_TOOL_RECURSION is set, we skip version resolution
/// and execute the tool directly using the current PATH.
/// This prevents infinite loops when a managed tool invokes another shim.
fn passthrough_to_system(tool: &str, args: &[String]) -> i32 {
    match find_system_tool(tool) {
        Some(system_path) => exec::exec_tool(&system_path, args),
        None => {
            eprintln!("vp: Recursion detected but no '{tool}' found in PATH (excluding shims)");
            1
        }
    }
}

/// Resolve version with caching.
async fn resolve_with_cache(cwd: &AbsolutePathBuf) -> Result<ResolveCacheEntry, String> {
    // Fast-path: VITE_PLUS_NODE_VERSION env var set by `vp env use`
    // Skip all disk I/O for cache when session override is active
    if let Ok(env_version) = std::env::var(config::VERSION_ENV_VAR) {
        let env_version = env_version.trim().to_string();
        if !env_version.is_empty() {
            return Ok(ResolveCacheEntry {
                version: env_version,
                source: config::VERSION_ENV_VAR.to_string(),
                project_root: None,
                resolved_at: cache::now_timestamp(),
                version_file_mtime: 0,
                source_path: None,
                is_range: false,
            });
        }
    }

    // Fast-path: session version file written by `vp env use`
    if let Some(session_version) = config::read_session_version().await {
        return Ok(ResolveCacheEntry {
            version: session_version,
            source: config::SESSION_VERSION_FILE.to_string(),
            project_root: None,
            resolved_at: cache::now_timestamp(),
            version_file_mtime: 0,
            source_path: None,
            is_range: false,
        });
    }

    // Load cache
    let cache_path = cache::get_cache_path();
    let mut cache = cache_path.as_ref().map(|p| ResolveCache::load(p)).unwrap_or_default();

    // Check cache hit
    if let Some(entry) = cache.get(cwd) {
        tracing::debug!(
            "Cache hit for {}: {} (from {})",
            cwd.as_path().display(),
            entry.version,
            entry.source
        );
        return Ok(entry.clone());
    }

    // Cache miss - resolve version
    let resolution = config::resolve_version(cwd).await.map_err(|e| format!("{e}"))?;

    // Create cache entry
    let mtime =
        resolution.source_path.as_ref().and_then(|p| cache::get_file_mtime(p)).unwrap_or(0);
    let entry = ResolveCacheEntry {
        version: resolution.version.clone(),
        source: resolution.source.clone(),
        project_root: resolution
            .project_root
            .as_ref()
            .map(|p: &AbsolutePathBuf| p.as_path().display().to_string()),
        resolved_at: cache::now_timestamp(),
        version_file_mtime: mtime,
        source_path: resolution
            .source_path
            .as_ref()
            .map(|p: &AbsolutePathBuf| p.as_path().display().to_string()),
        is_range: resolution.is_range,
    };

    // Save to cache
    cache.insert(cwd, entry.clone());
    if let Some(ref path) = cache_path {
        cache.save(path);
    }

    Ok(entry)
}

/// Ensure Node.js is installed.
pub(crate) async fn ensure_installed(version: &str) -> Result<(), String> {
    let home_dir = vite_shared::get_vite_plus_home()
        .map_err(|e| format!("Failed to get vite-plus home dir: {e}"))?
        .join("js_runtime")
        .join("node")
        .join(version);

    #[cfg(windows)]
    let binary_path = home_dir.join("node.exe");
    #[cfg(not(windows))]
    let binary_path = home_dir.join("bin").join("node");

    // Check if already installed
    if binary_path.as_path().exists() {
        return Ok(());
    }

    // Download the runtime
    vite_js_runtime::download_runtime(vite_js_runtime::JsRuntimeType::Node, version)
        .await
        .map_err(|e| format!("{e}"))?;

    Ok(())
}

/// Locate a tool binary within the Node.js installation.
pub(crate) fn locate_tool(version: &str, tool: &str) -> Result<AbsolutePathBuf, String> {
    let home_dir = vite_shared::get_vite_plus_home()
        .map_err(|e| format!("Failed to get vite-plus home dir: {e}"))?
        .join("js_runtime")
        .join("node")
        .join(version);

    #[cfg(windows)]
    let tool_path = if tool == "node" {
        home_dir.join("node.exe")
    } else {
        // npm and npx are .cmd scripts on Windows
        home_dir.join(format!("{tool}.cmd"))
    };
    #[cfg(not(windows))]
    let tool_path = home_dir.join("bin").join(tool);

    if !tool_path.as_path().exists() {
        return Err(format!("Tool '{}' not found at {}", tool, tool_path.as_path().display()));
    }

    Ok(tool_path)
}

/// Load shim mode from config.
///
/// Returns the default (Managed) if config cannot be read.
async fn load_shim_mode() -> ShimMode {
    config::load_config().await.map(|c| c.shim_mode).unwrap_or_default()
}

/// Find a system tool in PATH, skipping the vite-plus bin directory and any
/// directories listed in `VITE_PLUS_BYPASS`.
///
/// Returns the absolute path to the tool if found, None otherwise.
fn find_system_tool(tool: &str) -> Option<AbsolutePathBuf> {
    let bin_dir = config::get_bin_dir().ok();
    let path_var = std::env::var_os("PATH")?;
    tracing::debug!("path_var: {:?}", path_var);

    // Parse VITE_PLUS_BYPASS as a PATH-style list of additional directories to skip.
    // This prevents infinite loops when multiple vite-plus installations exist in PATH.
    let bypass_paths: Vec<std::path::PathBuf> = std::env::var_os(env_vars::VITE_PLUS_BYPASS)
        .map(|v| std::env::split_paths(&v).collect())
        .unwrap_or_default();
    tracing::debug!("bypass_paths: {:?}", bypass_paths);

    // Filter PATH to exclude our bin directory and any bypass directories
    let filtered_paths: Vec<_> = std::env::split_paths(&path_var)
        .filter(|p| {
            if let Some(ref bin) = bin_dir {
                if p == bin.as_path() {
                    return false;
                }
            }
            !bypass_paths.iter().any(|bp| p == bp)
        })
        .collect();
    let filtered_path = std::env::join_paths(filtered_paths).ok()?;

    // Use vite_command::resolve_bin with filtered PATH - stops at first match
    let cwd = current_dir().ok()?;
    vite_command::resolve_bin(tool, Some(&filtered_path), &cwd).ok()
}

#[cfg(test)]
mod tests {
    use serial_test::serial;
    use tempfile::TempDir;

    use super::*;

    /// Create a fake executable file in the given directory.
    #[cfg(unix)]
    fn create_fake_executable(dir: &std::path::Path, name: &str) -> std::path::PathBuf {
        use std::os::unix::fs::PermissionsExt;
        let path = dir.join(name);
        std::fs::write(&path, "#!/bin/sh\n").unwrap();
        std::fs::set_permissions(&path, std::fs::Permissions::from_mode(0o755)).unwrap();
        path
    }

    #[cfg(windows)]
    fn create_fake_executable(dir: &std::path::Path, name: &str) -> std::path::PathBuf {
        let path = dir.join(format!("{name}.exe"));
        std::fs::write(&path, "fake").unwrap();
        path
    }

    /// Helper to save and restore PATH and VITE_PLUS_BYPASS around a test.
    struct EnvGuard {
        original_path: Option<std::ffi::OsString>,
        original_bypass: Option<std::ffi::OsString>,
    }

    impl EnvGuard {
        fn new() -> Self {
            Self {
                original_path: std::env::var_os("PATH"),
                original_bypass: std::env::var_os(env_vars::VITE_PLUS_BYPASS),
            }
        }
    }

    impl Drop for EnvGuard {
        fn drop(&mut self) {
            unsafe {
                match &self.original_path {
                    Some(v) => std::env::set_var("PATH", v),
                    None => std::env::remove_var("PATH"),
                }
                match &self.original_bypass {
                    Some(v) => std::env::set_var(env_vars::VITE_PLUS_BYPASS, v),
                    None => std::env::remove_var(env_vars::VITE_PLUS_BYPASS),
                }
            }
        }
    }

    #[test]
    #[serial]
    fn test_find_system_tool_works_without_bypass() {
        let _guard = EnvGuard::new();
        let temp = TempDir::new().unwrap();
        let dir = temp.path().join("bin_a");
        std::fs::create_dir_all(&dir).unwrap();
        create_fake_executable(&dir, "mytesttool");

        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PATH", &dir);
            std::env::remove_var(env_vars::VITE_PLUS_BYPASS);
        }

        let result = find_system_tool("mytesttool");
        assert!(result.is_some(), "Should find tool when no bypass is set");
        assert!(result.unwrap().as_path().starts_with(&dir));
    }

    #[test]
    #[serial]
    fn test_find_system_tool_skips_single_bypass_path() {
        let _guard = EnvGuard::new();
        let temp = TempDir::new().unwrap();
        let dir_a = temp.path().join("bin_a");
        let dir_b = temp.path().join("bin_b");
        std::fs::create_dir_all(&dir_a).unwrap();
        std::fs::create_dir_all(&dir_b).unwrap();
        create_fake_executable(&dir_a, "mytesttool");
        create_fake_executable(&dir_b, "mytesttool");

        let path = std::env::join_paths([dir_a.as_path(), dir_b.as_path()]).unwrap();

        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PATH", &path);
            // Bypass dir_a — should skip it and find dir_b's tool
            std::env::set_var(env_vars::VITE_PLUS_BYPASS, dir_a.as_os_str());
        }

        let result = find_system_tool("mytesttool");
        assert!(result.is_some(), "Should find tool in non-bypassed directory");
        assert!(
            result.unwrap().as_path().starts_with(&dir_b),
            "Should find tool in dir_b, not dir_a"
        );
    }

    #[test]
    #[serial]
    fn test_find_system_tool_filters_multiple_bypass_paths() {
        let _guard = EnvGuard::new();
        let temp = TempDir::new().unwrap();
        let dir_a = temp.path().join("bin_a");
        let dir_b = temp.path().join("bin_b");
        let dir_c = temp.path().join("bin_c");
        std::fs::create_dir_all(&dir_a).unwrap();
        std::fs::create_dir_all(&dir_b).unwrap();
        std::fs::create_dir_all(&dir_c).unwrap();
        create_fake_executable(&dir_a, "mytesttool");
        create_fake_executable(&dir_b, "mytesttool");
        create_fake_executable(&dir_c, "mytesttool");

        let path =
            std::env::join_paths([dir_a.as_path(), dir_b.as_path(), dir_c.as_path()]).unwrap();
        let bypass = std::env::join_paths([dir_a.as_path(), dir_b.as_path()]).unwrap();

        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PATH", &path);
            std::env::set_var(env_vars::VITE_PLUS_BYPASS, &bypass);
        }

        let result = find_system_tool("mytesttool");
        assert!(result.is_some(), "Should find tool in dir_c");
        assert!(
            result.unwrap().as_path().starts_with(&dir_c),
            "Should find tool in dir_c since dir_a and dir_b are bypassed"
        );
    }

    #[test]
    #[serial]
    fn test_find_system_tool_returns_none_when_all_paths_bypassed() {
        let _guard = EnvGuard::new();
        let temp = TempDir::new().unwrap();
        let dir_a = temp.path().join("bin_a");
        std::fs::create_dir_all(&dir_a).unwrap();
        create_fake_executable(&dir_a, "mytesttool");

        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PATH", dir_a.as_os_str());
            std::env::set_var(env_vars::VITE_PLUS_BYPASS, dir_a.as_os_str());
        }

        let result = find_system_tool("mytesttool");
        assert!(result.is_none(), "Should return None when all paths are bypassed");
    }

    /// Simulates the SystemFirst loop prevention: Installation A sets VITE_PLUS_BYPASS
    /// with its own bin dir, then Installation B (seeing VITE_PLUS_BYPASS) should filter
    /// both A's dir (from bypass) and its own dir (from get_bin_dir), finding the real tool
    /// in a third directory or returning None.
    #[test]
    #[serial]
    fn test_find_system_tool_cumulative_bypass_prevents_loop() {
        let _guard = EnvGuard::new();
        let temp = TempDir::new().unwrap();
        let install_a_bin = temp.path().join("install_a_bin");
        let install_b_bin = temp.path().join("install_b_bin");
        let real_system_bin = temp.path().join("real_system");
        std::fs::create_dir_all(&install_a_bin).unwrap();
        std::fs::create_dir_all(&install_b_bin).unwrap();
        std::fs::create_dir_all(&real_system_bin).unwrap();
        create_fake_executable(&install_a_bin, "mytesttool");
        create_fake_executable(&install_b_bin, "mytesttool");
        create_fake_executable(&real_system_bin, "mytesttool");

        // PATH has all three dirs: install_a, install_b, real_system
        let path = std::env::join_paths([
            install_a_bin.as_path(),
            install_b_bin.as_path(),
            real_system_bin.as_path(),
        ])
        .unwrap();

        // Simulate: Installation A already set VITE_PLUS_BYPASS=<install_a_bin>
        // Installation B also needs to filter install_b_bin (via get_bin_dir),
        // but get_bin_dir returns the real vite-plus home. So we test by putting
        // install_b_bin in the bypass as well (simulating cumulative append).
        let bypass =
            std::env::join_paths([install_a_bin.as_path(), install_b_bin.as_path()]).unwrap();

        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PATH", &path);
            std::env::set_var(env_vars::VITE_PLUS_BYPASS, &bypass);
        }

        let result = find_system_tool("mytesttool");
        assert!(result.is_some(), "Should find tool in real_system directory");
        assert!(
            result.unwrap().as_path().starts_with(&real_system_bin),
            "Should find the real system tool, not any vite-plus installation"
        );
    }

    /// When both installations are bypassed and no real system tool exists, should return None.
    #[test]
    #[serial]
    fn test_find_system_tool_returns_none_with_no_real_system_tool() {
        let _guard = EnvGuard::new();
        let temp = TempDir::new().unwrap();
        let install_a_bin = temp.path().join("install_a_bin");
        let install_b_bin = temp.path().join("install_b_bin");
        std::fs::create_dir_all(&install_a_bin).unwrap();
        std::fs::create_dir_all(&install_b_bin).unwrap();
        create_fake_executable(&install_a_bin, "mytesttool");
        create_fake_executable(&install_b_bin, "mytesttool");

        let path =
            std::env::join_paths([install_a_bin.as_path(), install_b_bin.as_path()]).unwrap();
        let bypass =
            std::env::join_paths([install_a_bin.as_path(), install_b_bin.as_path()]).unwrap();

        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PATH", &path);
            std::env::set_var(env_vars::VITE_PLUS_BYPASS, &bypass);
        }

        let result = find_system_tool("mytesttool");
        assert!(
            result.is_none(),
            "Should return None when all dirs are bypassed and no real system tool exists"
        );
    }

    // --- parse_npm_global_install tests ---

    fn s(strs: &[&str]) -> Vec<String> {
        strs.iter().map(|s| s.to_string()).collect()
    }

    #[test]
    fn test_parse_npm_global_install_basic() {
        let result = parse_npm_global_install(&s(&["install", "-g", "typescript"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["typescript"]);
    }

    #[test]
    fn test_parse_npm_global_install_shorthand() {
        let result = parse_npm_global_install(&s(&["i", "-g", "typescript@5.0.0"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["typescript@5.0.0"]);
    }

    #[test]
    fn test_parse_npm_global_install_global_first() {
        let result = parse_npm_global_install(&s(&["-g", "install", "pkg1", "pkg2"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["pkg1", "pkg2"]);
    }

    #[test]
    fn test_parse_npm_global_install_long_global() {
        let result = parse_npm_global_install(&s(&["install", "--global", "@scope/pkg"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["@scope/pkg"]);
    }

    #[test]
    fn test_parse_npm_global_install_not_uninstall() {
        let result = parse_npm_global_install(&s(&["uninstall", "-g", "typescript"]));
        assert!(result.is_none(), "uninstall should not be detected");
    }

    #[test]
    fn test_parse_npm_global_install_no_global_flag() {
        let result = parse_npm_global_install(&s(&["install", "typescript"]));
        assert!(result.is_none(), "no -g flag should return None");
    }

    #[test]
    fn test_parse_npm_global_install_no_packages() {
        let result = parse_npm_global_install(&s(&["install", "-g"]));
        assert!(result.is_none(), "no packages should return None");
    }

    #[test]
    fn test_parse_npm_global_install_local_path() {
        // Local paths are supported (read package.json to resolve name)
        let result = parse_npm_global_install(&s(&["install", "-g", "./local"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["./local"]);
    }

    #[test]
    fn test_parse_npm_global_install_skip_registry() {
        let result =
            parse_npm_global_install(&s(&["install", "-g", "--registry", "https://x", "pkg"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["pkg"]);
    }

    #[test]
    fn test_parse_npm_global_install_not_run_subcommand() {
        let result = parse_npm_global_install(&s(&["run", "build", "-g"]));
        assert!(result.is_none(), "run is not an install subcommand");
    }

    #[test]
    fn test_parse_npm_global_install_git_url() {
        let result = parse_npm_global_install(&s(&["install", "-g", "git+https://repo"]));
        assert!(result.is_none(), "git+ URLs should be filtered");
    }

    #[test]
    fn test_parse_npm_global_install_url() {
        let result =
            parse_npm_global_install(&s(&["install", "-g", "https://example.com/pkg.tgz"]));
        assert!(result.is_none(), "URLs should be filtered");
    }

    // --- parse_npm_global_uninstall tests ---

    #[test]
    fn test_parse_npm_global_uninstall_basic() {
        let result = parse_npm_global_uninstall(&s(&["uninstall", "-g", "typescript"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["typescript"]);
    }

    #[test]
    fn test_parse_npm_global_uninstall_shorthand_un() {
        let result = parse_npm_global_uninstall(&s(&["un", "-g", "typescript"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["typescript"]);
    }

    #[test]
    fn test_parse_npm_global_uninstall_shorthand_rm() {
        let result = parse_npm_global_uninstall(&s(&["rm", "--global", "pkg1", "pkg2"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["pkg1", "pkg2"]);
    }

    #[test]
    fn test_parse_npm_global_uninstall_remove() {
        let result = parse_npm_global_uninstall(&s(&["remove", "-g", "@scope/pkg"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["@scope/pkg"]);
    }

    #[test]
    fn test_parse_npm_global_uninstall_not_install() {
        let result = parse_npm_global_uninstall(&s(&["install", "-g", "typescript"]));
        assert!(result.is_none(), "install should not be detected as uninstall");
    }

    #[test]
    fn test_parse_npm_global_uninstall_no_global_flag() {
        let result = parse_npm_global_uninstall(&s(&["uninstall", "typescript"]));
        assert!(result.is_none(), "no -g flag should return None");
    }

    #[test]
    fn test_parse_npm_global_uninstall_no_packages() {
        let result = parse_npm_global_uninstall(&s(&["uninstall", "-g"]));
        assert!(result.is_none(), "no packages should return None");
    }

    #[test]
    fn test_parse_npm_global_install_run_subcommand_with_install_arg() {
        // `npm run install -g` — "run" is the first positional, so "install" is NOT the subcommand
        let result = parse_npm_global_install(&s(&["run", "install", "-g"]));
        assert!(result.is_none(), "install after run should not be detected as npm install");
    }

    #[test]
    fn test_parse_npm_global_uninstall_run_subcommand_with_uninstall_arg() {
        // `npm run uninstall -g foo` — "run" is first positional, "uninstall" is a script arg
        let result = parse_npm_global_uninstall(&s(&["run", "uninstall", "-g", "foo"]));
        assert!(result.is_none(), "uninstall after run should not be detected as npm uninstall");
    }

    #[test]
    fn test_parse_npm_global_install_flag_before_subcommand() {
        // `npm -g install pkg` — flags don't consume the positional slot
        let result = parse_npm_global_install(&s(&["-g", "install", "pkg"]));
        assert!(result.is_some());
        assert_eq!(result.unwrap().packages, vec!["pkg"]);
    }

    // --- resolve_package_name tests ---

    #[test]
    fn test_resolve_package_name_simple() {
        assert_eq!(resolve_package_name("codex"), Some("codex".to_string()));
    }

    #[test]
    fn test_resolve_package_name_with_version() {
        assert_eq!(resolve_package_name("typescript@5.0.0"), Some("typescript".to_string()));
    }

    #[test]
    fn test_resolve_package_name_scoped() {
        assert_eq!(resolve_package_name("@scope/pkg"), Some("@scope/pkg".to_string()));
    }

    #[test]
    fn test_resolve_package_name_scoped_with_version() {
        assert_eq!(resolve_package_name("@scope/pkg@1.0.0"), Some("@scope/pkg".to_string()));
    }

    #[test]
    fn test_resolve_package_name_local_path_with_package_json() {
        let temp = TempDir::new().unwrap();
        let pkg_dir = temp.path().join("my-pkg");
        std::fs::create_dir_all(&pkg_dir).unwrap();
        std::fs::write(pkg_dir.join("package.json"), r#"{"name": "my-actual-pkg"}"#).unwrap();

        // Use absolute path starting with /
        let spec = pkg_dir.to_str().unwrap();
        assert_eq!(resolve_package_name(spec), Some("my-actual-pkg".to_string()));
    }

    #[test]
    fn test_resolve_package_name_local_path_no_package_json() {
        assert_eq!(resolve_package_name("./nonexistent"), None);
    }

    // --- extract_bin_names tests ---

    #[test]
    fn test_extract_bin_names_single() {
        let json: serde_json::Value =
            serde_json::from_str(r#"{"name": "my-pkg", "bin": "./cli.js"}"#).unwrap();
        assert_eq!(extract_bin_names(&json), vec!["my-pkg"]);
    }

    #[test]
    fn test_extract_bin_names_scoped_single() {
        let json: serde_json::Value =
            serde_json::from_str(r#"{"name": "@scope/my-pkg", "bin": "./cli.js"}"#).unwrap();
        assert_eq!(extract_bin_names(&json), vec!["my-pkg"]);
    }

    #[test]
    fn test_extract_bin_names_object() {
        let json: serde_json::Value = serde_json::from_str(
            r#"{"name": "pkg", "bin": {"cli-a": "./a.js", "cli-b": "./b.js"}}"#,
        )
        .unwrap();
        let mut names = extract_bin_names(&json);
        names.sort();
        assert_eq!(names, vec!["cli-a", "cli-b"]);
    }

    #[test]
    fn test_extract_bin_names_no_bin() {
        let json: serde_json::Value = serde_json::from_str(r#"{"name": "pkg"}"#).unwrap();
        assert!(extract_bin_names(&json).is_empty());
    }

    // --- is_local_path tests ---

    #[test]
    fn test_is_local_path_bare_dot() {
        assert!(is_local_path("."));
    }

    #[test]
    fn test_is_local_path_bare_dotdot() {
        assert!(is_local_path(".."));
    }

    #[test]
    fn test_is_local_path_relative_dot() {
        assert!(is_local_path("./foo"));
        assert!(is_local_path("../bar"));
    }

    #[test]
    fn test_is_local_path_absolute() {
        assert!(is_local_path("/usr/local/pkg"));
    }

    #[test]
    fn test_is_local_path_package_name() {
        assert!(!is_local_path("typescript"));
        assert!(!is_local_path("@scope/pkg"));
        assert!(!is_local_path("pkg@1.0.0"));
    }

    #[cfg(windows)]
    #[test]
    fn test_is_local_path_windows_drive() {
        assert!(is_local_path("C:\\pkg"));
        assert!(is_local_path("D:/projects/my-pkg"));
        assert!(!is_local_path("C")); // too short
    }

    // --- dedup missing_bins tests ---

    #[test]
    fn test_dedup_missing_bins_keeps_last_entry() {
        // Simulates: `npm install -g pkg-a pkg-b` where both declare bin "shared-cli".
        // After dedup, only the last entry (pkg-b) should survive — npm's "last writer wins".
        let temp = TempDir::new().unwrap();
        let source_a =
            AbsolutePathBuf::new(temp.path().join("node_modules/.bin/shared-cli")).unwrap();
        let source_b =
            AbsolutePathBuf::new(temp.path().join("node_modules/.bin/shared-cli")).unwrap();
        let missing_bins: Vec<(String, AbsolutePathBuf, String)> = vec![
            ("shared-cli".to_string(), source_a, "pkg-a".to_string()),
            ("shared-cli".to_string(), source_b, "pkg-b".to_string()),
        ];

        // Apply the same dedup logic used in check_npm_global_install_result
        let deduped = dedup_missing_bins(missing_bins);

        assert_eq!(deduped.len(), 1, "Should have exactly one entry after dedup");
        assert_eq!(deduped[0].0, "shared-cli");
        assert_eq!(deduped[0].2, "pkg-b", "Last writer (pkg-b) should win");
    }

    #[test]
    fn test_dedup_missing_bins_preserves_unique_entries() {
        let temp = TempDir::new().unwrap();
        let source_a = AbsolutePathBuf::new(temp.path().join("bin/cli-a")).unwrap();
        let source_b = AbsolutePathBuf::new(temp.path().join("bin/cli-b")).unwrap();
        let missing_bins: Vec<(String, AbsolutePathBuf, String)> = vec![
            ("cli-a".to_string(), source_a, "pkg-a".to_string()),
            ("cli-b".to_string(), source_b, "pkg-b".to_string()),
        ];

        let deduped = dedup_missing_bins(missing_bins);

        assert_eq!(deduped.len(), 2, "Unique entries should be preserved");
        assert_eq!(deduped[0].0, "cli-a");
        assert_eq!(deduped[1].0, "cli-b");
    }

    #[test]
    fn test_dedup_missing_bins_multiple_dupes() {
        // Three packages all declare "shared" and two packages declare "other"
        let temp = TempDir::new().unwrap();
        let src = |name: &str| AbsolutePathBuf::new(temp.path().join(name)).unwrap();
        let missing_bins: Vec<(String, AbsolutePathBuf, String)> = vec![
            ("shared".to_string(), src("s1"), "pkg-a".to_string()),
            ("other".to_string(), src("o1"), "pkg-a".to_string()),
            ("shared".to_string(), src("s2"), "pkg-b".to_string()),
            ("shared".to_string(), src("s3"), "pkg-c".to_string()),
            ("other".to_string(), src("o2"), "pkg-c".to_string()),
        ];

        let deduped = dedup_missing_bins(missing_bins);

        assert_eq!(deduped.len(), 2);
        // "shared" last writer is pkg-c, "other" last writer is pkg-c
        assert_eq!(deduped[0].0, "shared");
        assert_eq!(deduped[0].2, "pkg-c");
        assert_eq!(deduped[1].0, "other");
        assert_eq!(deduped[1].2, "pkg-c");
    }

    // --- parse_npm_global_command --prefix tests ---

    #[test]
    fn test_parse_npm_global_install_with_prefix() {
        let result =
            parse_npm_global_install(&s(&["install", "-g", "--prefix", "/tmp/test", "pkg"]));
        assert!(result.is_some());
        let parsed = result.unwrap();
        assert_eq!(parsed.packages, vec!["pkg"]);
        assert_eq!(parsed.explicit_prefix.as_deref(), Some("/tmp/test"));
    }

    #[test]
    fn test_parse_npm_global_install_with_prefix_equals() {
        let result = parse_npm_global_install(&s(&["install", "-g", "--prefix=/tmp/test", "pkg"]));
        assert!(result.is_some());
        let parsed = result.unwrap();
        assert_eq!(parsed.packages, vec!["pkg"]);
        assert_eq!(parsed.explicit_prefix.as_deref(), Some("/tmp/test"));
    }

    #[test]
    fn test_parse_npm_global_install_without_prefix() {
        let result = parse_npm_global_install(&s(&["install", "-g", "pkg"]));
        assert!(result.is_some());
        let parsed = result.unwrap();
        assert_eq!(parsed.packages, vec!["pkg"]);
        assert!(parsed.explicit_prefix.is_none());
    }

    #[test]
    fn test_parse_npm_global_uninstall_with_prefix() {
        let result =
            parse_npm_global_uninstall(&s(&["uninstall", "-g", "--prefix", "/custom/dir", "pkg"]));
        assert!(result.is_some());
        let parsed = result.unwrap();
        assert_eq!(parsed.packages, vec!["pkg"]);
        assert_eq!(parsed.explicit_prefix.as_deref(), Some("/custom/dir"));
    }

    // --- resolve_npm_prefix tests ---

    #[test]
    #[serial]
    fn test_resolve_npm_prefix_relative() {
        let temp = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp.path().to_path_buf()).unwrap();

        // SAFETY: This test runs in isolation with serial_test
        unsafe {
            std::env::set_var("PWD", temp_path.as_path());
        }

        let parsed = NpmGlobalCommand {
            packages: vec!["pkg".to_string()],
            explicit_prefix: Some("./custom".to_string()),
        };

        // Use a dummy npm_path and node_dir (should not be reached)
        let dummy_dir = temp_path.join("dummy");
        let result = resolve_npm_prefix(&parsed, &dummy_dir, &dummy_dir);

        // Should resolve relative to cwd, not fall back to get_npm_global_prefix
        assert!(
            result.as_path().ends_with("custom"),
            "Expected path ending with 'custom', got: {}",
            result.as_path().display()
        );
    }

    #[test]
    #[serial]
    fn test_resolve_npm_prefix_absolute() {
        let temp = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp.path().to_path_buf()).unwrap();
        let abs_prefix = temp_path.join("abs-prefix");

        let parsed = NpmGlobalCommand {
            packages: vec!["pkg".to_string()],
            explicit_prefix: Some(abs_prefix.as_path().display().to_string()),
        };

        let dummy_dir = temp_path.join("dummy");
        let result = resolve_npm_prefix(&parsed, &dummy_dir, &dummy_dir);

        assert_eq!(
            result.as_path(),
            abs_prefix.as_path(),
            "Absolute prefix should be returned as-is"
        );
    }

    #[test]
    fn test_resolve_npm_prefix_none_fallback() {
        // When no explicit prefix, resolve_npm_prefix calls get_npm_global_prefix.
        // We can't easily test that fallback without a real npm, so just verify
        // it doesn't panic and returns some path.
        let temp = TempDir::new().unwrap();
        let temp_path = AbsolutePathBuf::new(temp.path().to_path_buf()).unwrap();

        let parsed = NpmGlobalCommand { packages: vec![], explicit_prefix: None };
        let dummy_dir = temp_path.join("dummy");

        // This will fall back to get_npm_global_prefix, which may fail but should
        // ultimately return node_dir as the final fallback
        let result = resolve_npm_prefix(&parsed, &dummy_dir, &dummy_dir);
        assert!(!result.as_path().as_os_str().is_empty());
    }
}


================================================
FILE: crates/vite_global_cli/src/shim/exec.rs
================================================
//! Platform-specific execution for shim operations.
//!
//! On Unix, uses execve to replace the current process.
//! On Windows, spawns the process and waits for completion.
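The helpers in this file rely on the shell convention for exit codes: a normal exit passes its code through, while a child killed by signal N maps to 128 + N. That mapping can be checked in isolation with a standalone sketch; `shell_exit_code` is an illustrative name, not part of this crate, and a Unix host with `/bin/sh` is assumed.

```rust
use std::process::{Command, ExitStatus};

/// Map an ExitStatus to a shell-style exit code (illustrative helper):
/// a normal exit keeps its code, death by signal N becomes 128 + N.
fn shell_exit_code(status: ExitStatus) -> i32 {
    #[cfg(unix)]
    {
        use std::os::unix::process::ExitStatusExt;
        if let Some(signal) = status.signal() {
            return 128 + signal;
        }
    }
    status.code().unwrap_or(1)
}

fn main() {
    // Normal exit: the child's code passes through unchanged.
    let ok = Command::new("/bin/sh").args(["-c", "exit 7"]).status().unwrap();
    assert_eq!(shell_exit_code(ok), 7);

    // Signal death: the shell kills itself with SIGTERM (15), so 128 + 15 = 143.
    let killed = Command::new("/bin/sh").args(["-c", "kill -TERM $$"]).status().unwrap();
    assert_eq!(shell_exit_code(killed), 143);
}
```

This is the same convention shells themselves report in `$?`, which is why `spawn_tool` callers can treat the returned value as a process exit code directly.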
use vite_path::AbsolutePath; use vite_shared::output; /// Convert a process ExitStatus to an exit code. /// On Unix, if the process was killed by a signal, returns 128 + signal_number. fn exit_code_from_status(status: std::process::ExitStatus) -> i32 { #[cfg(unix)] { use std::os::unix::process::ExitStatusExt; if let Some(signal) = status.signal() { return 128 + signal; } } status.code().unwrap_or(1) } /// Spawn a tool as a child process and wait for completion. /// /// Unlike `exec_tool()`, this does NOT replace the current process on Unix, /// allowing the caller to run code after the tool exits. pub fn spawn_tool(path: &AbsolutePath, args: &[String]) -> i32 { match std::process::Command::new(path.as_path()).args(args).status() { Ok(status) => exit_code_from_status(status), Err(e) => { output::error(&format!("Failed to execute {}: {}", path.as_path().display(), e)); 1 } } } /// Execute a tool, replacing the current process on Unix. /// /// Returns an exit code on Windows or if exec fails on Unix. pub fn exec_tool(path: &AbsolutePath, args: &[String]) -> i32 { #[cfg(unix)] { exec_unix(path, args) } #[cfg(windows)] { exec_windows(path, args) } } /// Unix: Use exec to replace the current process. #[cfg(unix)] fn exec_unix(path: &AbsolutePath, args: &[String]) -> i32 { use std::os::unix::process::CommandExt; let mut cmd = std::process::Command::new(path.as_path()); cmd.args(args); // exec replaces the current process - this only returns on error let err = cmd.exec(); output::error(&format!("Failed to exec {}: {}", path.as_path().display(), err)); 1 } /// Windows: Spawn the process and wait for completion. 
#[cfg(windows)] fn exec_windows(path: &AbsolutePath, args: &[String]) -> i32 { spawn_tool(path, args) } #[cfg(test)] mod tests { use super::*; #[cfg(unix)] #[test] fn test_exit_code_from_status_normal() { let status = std::process::Command::new("/bin/sh").arg("-c").arg("exit 42").status().unwrap(); assert_eq!(exit_code_from_status(status), 42); } #[cfg(windows)] #[test] fn test_exit_code_from_status_normal() { let status = std::process::Command::new("cmd").args(["/C", "exit 42"]).status().unwrap(); assert_eq!(exit_code_from_status(status), 42); } #[cfg(unix)] #[test] fn test_exit_code_from_status_signal() { // Process kills itself with SIGINT (signal 2), expected exit code: 128 + 2 = 130 let status = std::process::Command::new("/bin/sh").arg("-c").arg("kill -INT $$").status().unwrap(); assert_eq!(exit_code_from_status(status), 130); } } ================================================ FILE: crates/vite_global_cli/src/shim/mod.rs ================================================ //! Shim module for intercepting node, npm, npx, and package binary commands. //! //! This module provides the functionality for the vp binary to act as a shim //! when invoked as `node`, `npm`, `npx`, or any globally installed package binary. //! //! Detection methods: //! - Unix: Symlinks to vp binary preserve argv[0], allowing tool detection //! - Windows: Trampoline `.exe` files set `VITE_PLUS_SHIM_TOOL` env var and spawn vp.exe //! - Legacy: `.cmd` wrappers call `vp env exec <tool>` directly (deprecated) mod cache; pub(crate) mod dispatch; pub(crate) mod exec; pub(crate) use cache::invalidate_cache; pub use dispatch::dispatch; use vite_shared::env_vars; /// Core shim tools (node, npm, npx) pub const CORE_SHIM_TOOLS: &[&str] = &["node", "npm", "npx"]; /// Extract the tool name from argv[0].
/// /// Handles various formats: /// - `node` (Unix) /// - `/usr/bin/node` (Unix full path) /// - `node.exe` (Windows) /// - `C:\path\node.exe` (Windows full path) pub fn extract_tool_name(argv0: &str) -> String { let path = std::path::Path::new(argv0); let stem = path.file_stem().unwrap_or_default().to_string_lossy(); // Handle Windows: strip .exe, .cmd extensions if present in stem // (file_stem already strips the extension) stem.to_lowercase() } /// Check if the given tool name is a core shim tool (node/npm/npx). #[must_use] pub fn is_core_shim_tool(tool: &str) -> bool { CORE_SHIM_TOOLS.contains(&tool) } /// Check if the given tool name is a shim tool (core or package binary). /// /// This is a quick check that returns true if: /// 1. The tool is a core shim (node/npm/npx), OR /// 2. The tool name is not "vp" (package binaries are detected later via metadata) #[must_use] pub fn is_shim_tool(tool: &str) -> bool { // Core tools are always shims if is_core_shim_tool(tool) { return true; } // "vp" is not a shim - it's the main CLI if tool == "vp" { return false; } // For other tools, we need to check if they're package binaries // This is a heuristic - we'll check metadata in dispatch // We assume anything invoked from the bin directory is a shim is_potential_package_binary(tool) } /// Check if the tool could be a package binary shim. /// /// Returns true if a shim for the tool exists in the configured bin directory. /// This check respects the VITE_PLUS_HOME environment variable for custom home directories. /// /// Note: We check the configured bin directory directly instead of using current_exe() /// because when running through a wrapper script (e.g., current/bin/vp), the current_exe() /// returns the wrapper's location, not the original shim's location. 
fn is_potential_package_binary(tool: &str) -> bool { use crate::commands::env::config; // Get the configured bin directory (respects VITE_PLUS_HOME env var) let Ok(configured_bin) = config::get_bin_dir() else { return false; }; // Check if the shim exists in the configured bin directory. // Use symlink_metadata to detect symlinks (even broken ones). // On Windows, check .exe first (trampoline shims, the common case), // then fall back to extensionless (Unix symlinks or legacy). #[cfg(windows)] { let exe_path = configured_bin.join(format!("{tool}.exe")); if std::fs::symlink_metadata(&exe_path).is_ok() { return true; } } let shim_path = configured_bin.join(tool); if std::fs::symlink_metadata(&shim_path).is_ok() { return true; } false } /// Environment variable used for shim tool detection via shell wrapper scripts. const SHIM_TOOL_ENV_VAR: &str = env_vars::VITE_PLUS_SHIM_TOOL; /// Detect the shim tool from environment and argv. /// /// Detection priority: /// 1. Check `VITE_PLUS_SHIM_TOOL` env var (set by trampoline exe on Windows) /// 2. If argv[0] is "vp" or "vp.exe", this is a direct CLI invocation - NOT shim mode /// 3. Fall back to argv[0] detection (primary method on Unix with symlinks) /// /// IMPORTANT: This function clears `VITE_PLUS_SHIM_TOOL` after reading it to /// prevent the env var from leaking to child processes. pub fn detect_shim_tool(argv0: &str) -> Option<String> { // Always clear the env var to prevent it from leaking to child processes. // We read it first, then clear it immediately. // SAFETY: We're at program startup before any threads are spawned. let env_tool = std::env::var(SHIM_TOOL_ENV_VAR).ok(); unsafe { std::env::remove_var(SHIM_TOOL_ENV_VAR); } // Check VITE_PLUS_SHIM_TOOL env var first (set by trampoline exe on Windows). // This takes priority over argv[0] because the trampoline spawns vp.exe // (so argv[0] would be "vp"), but the env var carries the real tool name.
if let Some(tool) = env_tool { if !tool.is_empty() { let tool_lower = tool.to_lowercase(); // Accept any tool from env var (could be core or package binary) if tool_lower != "vp" { return Some(tool_lower); } } } // If argv[0] is explicitly "vp" or "vp.exe", this is a direct CLI invocation. let argv0_tool = extract_tool_name(argv0); if argv0_tool == "vp" { return None; // Direct vp invocation, not shim mode } if argv0_tool == "vpx" { return Some("vpx".to_string()); } // Fall back to argv[0] detection (Unix symlinks) if is_shim_tool(&argv0_tool) { Some(argv0_tool) } else { None } } #[cfg(test)] mod tests { use super::*; #[test] fn test_extract_tool_name() { assert_eq!(extract_tool_name("node"), "node"); assert_eq!(extract_tool_name("/usr/bin/node"), "node"); assert_eq!(extract_tool_name("/home/user/.vite-plus/bin/node"), "node"); assert_eq!(extract_tool_name("npm"), "npm"); assert_eq!(extract_tool_name("npx"), "npx"); assert_eq!(extract_tool_name("vp"), "vp"); // Files with extensions (works on all platforms) assert_eq!(extract_tool_name("node.exe"), "node"); assert_eq!(extract_tool_name("npm.cmd"), "npm"); // Windows paths - only test on Windows #[cfg(windows)] { assert_eq!(extract_tool_name("C:\\Users\\user\\.vite-plus\\bin\\node.exe"), "node"); } } #[test] fn test_is_shim_tool() { // Core shim tools are always recognized assert!(is_core_shim_tool("node")); assert!(is_core_shim_tool("npm")); assert!(is_core_shim_tool("npx")); assert!(!is_core_shim_tool("yarn")); // yarn is not a core shim tool assert!(!is_core_shim_tool("vp")); assert!(!is_core_shim_tool("cargo")); assert!(!is_core_shim_tool("tsc")); // Package binary, not core // is_shim_tool includes core tools assert!(is_shim_tool("node")); assert!(is_shim_tool("npm")); assert!(is_shim_tool("npx")); assert!(!is_shim_tool("vp")); // vp is never a shim } /// Test that is_potential_package_binary checks the configured bin directory. 
/// /// The function now checks if a shim exists in the configured bin directory /// (from VITE_PLUS_HOME/bin) instead of relying on current_exe(). /// This allows it to work correctly with wrapper scripts. #[test] fn test_is_potential_package_binary_checks_configured_bin() { // The function checks config::get_bin_dir() which respects VITE_PLUS_HOME. // Without setting VITE_PLUS_HOME, it defaults to ~/.vite-plus/bin. // // Since we can't easily create test shims in the actual bin directory, // we just verify the function doesn't panic and returns false for // non-existent tools. assert!(!is_potential_package_binary("nonexistent-tool-12345")); assert!(!is_potential_package_binary("another-fake-tool")); } #[test] fn test_detect_shim_tool_vpx() { // vpx should be detected via the argv0 check, before the env var check // and before is_shim_tool (which would incorrectly match it as a package binary) // SAFETY: We're in a test unsafe { std::env::remove_var(SHIM_TOOL_ENV_VAR); } let result = detect_shim_tool("vpx"); assert_eq!(result, Some("vpx".to_string())); // Also works with full path let result = detect_shim_tool("/home/user/.vite-plus/bin/vpx"); assert_eq!(result, Some("vpx".to_string())); // Also works with .exe extension (Windows) let result = detect_shim_tool("vpx.exe"); assert_eq!(result, Some("vpx".to_string())); } } ================================================ FILE: crates/vite_global_cli/src/tips/mod.rs ================================================ //! CLI tips system for providing helpful suggestions to users. //! //! Tips are shown after command execution to help users discover features //! and shortcuts. mod short_aliases; mod use_vpx_or_run; use clap::error::ErrorKind as ClapErrorKind; use self::{short_aliases::ShortAliases, use_vpx_or_run::UseVpxOrRun}; /// Execution context passed in from the CLI entry point. pub struct TipContext { /// CLI arguments as typed by the user, excluding the program name (`vp`). 
pub raw_args: Vec<String>, /// The exit code of the command (0 = success, non-zero = failure). pub exit_code: i32, /// The clap error if parsing failed. pub clap_error: Option<clap::Error>, } impl Default for TipContext { fn default() -> Self { TipContext { raw_args: Vec::new(), exit_code: 0, clap_error: None } } } impl TipContext { /// Whether the command completed successfully. #[expect(dead_code)] pub fn success(&self) -> bool { self.exit_code == 0 } pub fn is_unknown_command_error(&self) -> bool { if let Some(err) = &self.clap_error { matches!(err.kind(), ClapErrorKind::InvalidSubcommand) } else { false } } /// Iterate positional args (skipping flags starting with `-`). fn positionals(&self) -> impl Iterator<Item = &str> { self.raw_args.iter().map(String::as_str).filter(|a| !a.starts_with('-')) } /// The subcommand (first positional arg, e.g., "ls", "build"). pub fn subcommand(&self) -> Option<&str> { self.positionals().next() } /// Whether the positional args start with the given command pattern. /// Pattern is space-separated: "pm list" matches even if flags are interspersed. #[expect(dead_code)] pub fn is_subcommand(&self, pattern: &str) -> bool { let mut positionals = self.positionals(); pattern.split_whitespace().all(|expected| positionals.next() == Some(expected)) } } /// A tip that can be shown to the user after command execution. pub trait Tip { /// Whether this tip is relevant given the current execution context. fn matches(&self, ctx: &TipContext) -> bool; /// The tip text shown to the user. fn message(&self) -> &'static str; } /// Returns all registered tips. fn all() -> &'static [&'static dyn Tip] { &[&ShortAliases, &UseVpxOrRun] } /// Pick a random tip from those matching the current context.
/// /// Returns `None` if: /// - The `VITE_PLUS_CLI_TEST` or `CI` env var is set (test/CI mode) /// - No tips match the given context pub fn get_tip(context: &TipContext) -> Option<&'static str> { if std::env::var_os("VITE_PLUS_CLI_TEST").is_some() || std::env::var_os("CI").is_some() { return None; } let now = std::time::SystemTime::now().duration_since(std::time::UNIX_EPOCH).unwrap_or_default(); let all = all(); let matching: Vec<&&dyn Tip> = all.iter().filter(|t| t.matches(context)).collect(); if matching.is_empty() { return None; } // Use subsec_nanos for random tip selection let nanos = now.subsec_nanos() as usize; Some(matching[nanos % matching.len()].message()) } /// Create a `TipContext` from a command string using real clap parsing. /// /// `command` is exactly what the user types in the terminal (e.g. `"vp list --flag"`). /// The first arg is treated as the program name and excluded from `raw_args`, /// matching how the real CLI uses `std::env::args()`. #[cfg(test)] pub fn tip_context_from_command(command: &str) -> TipContext { // Split simulates what the OS does with command line args let args: Vec<String> = command.split_whitespace().map(String::from).collect(); let (exit_code, clap_error) = match crate::try_parse_args_from(args.iter().cloned()) { Ok(_) => (0, None), Err(e) => (e.exit_code(), Some(e)), }; // raw_args excludes program name (args[0]), same as real CLI: args[1..].to_vec() let raw_args = args.get(1..).map(<[String]>::to_vec).unwrap_or_default(); TipContext { raw_args, exit_code, clap_error } } ================================================ FILE: crates/vite_global_cli/src/tips/short_aliases.rs ================================================ //! Tip suggesting short aliases for long-form commands. use super::{Tip, TipContext}; /// Long-form commands that have short aliases. const LONG_FORMS: &[&str] = &["install", "remove", "uninstall", "update", "list", "link"]; /// Suggest short aliases when user runs a long-form command.
pub struct ShortAliases; impl Tip for ShortAliases { fn matches(&self, ctx: &TipContext) -> bool { ctx.subcommand().is_some_and(|cmd| LONG_FORMS.contains(&cmd)) } fn message(&self) -> &'static str { "Available short aliases: i = install, rm = remove, un = uninstall, up = update, ls = list, ln = link" } } #[cfg(test)] mod tests { use super::*; use crate::tips::tip_context_from_command; #[test] fn matches_long_form_commands() { for cmd in LONG_FORMS { let ctx = tip_context_from_command(&format!("vp {cmd}")); assert!(ShortAliases.matches(&ctx), "should match {cmd}"); } } #[test] fn does_not_match_short_form_commands() { let short_forms = ["i", "rm", "un", "up", "ln"]; for cmd in short_forms { let ctx = tip_context_from_command(&format!("vp {cmd}")); assert!(!ShortAliases.matches(&ctx), "should not match {cmd}"); } } #[test] fn does_not_match_other_commands() { let other_commands = ["build", "test", "lint", "run", "pack"]; for cmd in other_commands { let ctx = tip_context_from_command(&format!("vp {cmd}")); assert!(!ShortAliases.matches(&ctx), "should not match {cmd}"); } } #[test] fn install_shows_short_alias_tip() { let ctx = tip_context_from_command("vp install"); assert!(ShortAliases.matches(&ctx)); } #[test] fn short_form_does_not_show_tip() { let ctx = tip_context_from_command("vp i"); assert!(!ShortAliases.matches(&ctx)); } } ================================================ FILE: crates/vite_global_cli/src/tips/use_vpx_or_run.rs ================================================ //! Tip suggesting vpx or vp run for unknown commands. 
use super::{Tip, TipContext}; /// Suggest `vpx ` or `vp run ================================================ FILE: docs/.vitepress/theme/components/Footer.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/CoreFeature3Col.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/FeatureCheck.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/FeatureDevBuild.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/FeaturePack.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/FeatureRun.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/FeatureRunTerminal.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/FeatureTest.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/FeatureToolbar.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/Fullstack2Col.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/HeadingSection2.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/HeadingSection3.vue ================================================ 
================================================ FILE: docs/.vitepress/theme/components/home/HeadingSection4.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/Hero.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/HeroRive.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/InstallCommand.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/PartnerLogos.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/ProductivityGrid.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/StackedBlock.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/Terminal.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/TerminalTranscript.vue ================================================ ================================================ FILE: docs/.vitepress/theme/components/home/Testimonials.vue ================================================ ================================================ FILE: docs/.vitepress/theme/data/feature-run-transcripts.ts ================================================ import type { TerminalTranscript } from './terminal-transcripts'; export const featureRunTranscripts: TerminalTranscript[] = [ { id: 'cold', label: 'Cold Cache', title: 'First run builds the shared library and app', command: 'vp run --cache build', 
lineDelay: 180, completionDelay: 1200, lines: [ { segments: [{ text: '# First run builds the shared library and app', tone: 'muted' }], }, { segments: [{ text: '$ vp pack', tone: 'muted' }], }, { segments: [{ text: '$ vp build', tone: 'muted' }], }, { segments: [ { text: 'vp run:', tone: 'brand', bold: true }, { text: ' 0/2 cache hit (0%).', tone: 'muted' }, ], }, ], }, { id: 'no-changes', label: 'Full Replay', title: 'No changes replay both tasks from cache', command: 'vp run --cache build', lineDelay: 180, completionDelay: 1200, lines: [ { segments: [{ text: '# No changes replay both tasks from cache', tone: 'muted' }], }, { segments: [ { text: '$ vp pack ', tone: 'muted' }, { text: '✓ ', tone: 'success' }, { text: 'cache hit, replaying', tone: 'base' }, ], }, { segments: [ { text: '$ vp build ', tone: 'muted' }, { text: '✓ ', tone: 'success' }, { text: 'cache hit, replaying', tone: 'base' }, ], }, { segments: [ { text: 'vp run:', tone: 'brand', bold: true }, { text: ' 2/2 cache hit (100%), 1.24s saved.', tone: 'muted' }, ], }, ], }, { id: 'app-change', label: 'Partial Replay', title: 'App changes rerun only the app build', command: 'vp run --cache build', lineDelay: 180, completionDelay: 1200, lines: [ { segments: [{ text: '# App changes rerun only the app build', tone: 'muted' }], }, { segments: [ { text: '$ vp pack ', tone: 'muted' }, { text: '✓ ', tone: 'success' }, { text: 'cache hit, replaying', tone: 'base' }, ], }, { segments: [ { text: '$ vp build ', tone: 'muted' }, { text: '✗ ', tone: 'base' }, { text: 'cache miss: ', tone: 'muted' }, { text: "'src/main.ts'", tone: 'base' }, { text: ' modified, executing', tone: 'muted' }, ], }, { segments: [ { text: 'vp run:', tone: 'brand', bold: true }, { text: ' 1/2 cache hit (50%), 528ms saved.', tone: 'muted' }, ], }, ], }, { id: 'shared-change', label: 'Full Rebuild', title: 'Shared API changes rebuild the library and app', command: 'vp run --cache build', lineDelay: 180, completionDelay: 1200, lines: [ { 
segments: [{ text: '# Shared API changes rebuild the library and app', tone: 'muted' }], }, { segments: [ { text: '$ vp pack ', tone: 'muted' }, { text: '✗ ', tone: 'base' }, { text: 'cache miss: ', tone: 'muted' }, { text: "'src/index.ts'", tone: 'base' }, { text: ' modified, executing', tone: 'muted' }, ], }, { segments: [ { text: '$ vp build ', tone: 'muted' }, { text: '✗ ', tone: 'base' }, { text: 'cache miss: ', tone: 'muted' }, { text: "'src/routes.ts'", tone: 'base' }, { text: ' modified, executing', tone: 'muted' }, ], }, { segments: [ { text: 'vp run:', tone: 'brand', bold: true }, { text: ' 0/2 cache hit (0%).', tone: 'muted' }, ], }, ], }, ]; ================================================ FILE: docs/.vitepress/theme/data/performance.ts ================================================ export interface PerformanceData { name: string; percentage: number; time: string; isPrimary?: boolean; } export const devPerformance: PerformanceData[] = [ { name: 'Vite Dev', percentage: 15, time: '102MS', isPrimary: true, }, { name: 'Webpack', percentage: 50, time: '2.38S', }, { name: 'Rspack', percentage: 60, time: '2.38S', }, { name: 'Vite 7', percentage: 90, time: '2.38S', }, { name: 'NextJS', percentage: 100, time: '2.38S', }, ]; export const buildPerformance: PerformanceData[] = [ { name: 'Vite Build', percentage: 20, time: '1.2S', isPrimary: true, }, { name: 'Webpack', percentage: 75, time: '8.4S', }, { name: 'Rspack', percentage: 45, time: '3.1S', }, { name: 'Vite 7', percentage: 85, time: '7.2S', }, { name: 'NextJS', percentage: 100, time: '9.8S', }, ]; export const testPerformance: PerformanceData[] = [ { name: 'Vite Test', percentage: 15, time: '102MS', isPrimary: true, }, { name: 'Jest+SWC', percentage: 45, time: '2.38S', }, { name: 'Jest+TS-Jest', percentage: 55, time: '2.38S', }, { name: 'Jest+Babel', percentage: 75, time: '2.38S', }, ]; export const lintSyntaticPerformance: PerformanceData[] = [ { name: 'Syntactic Mode', percentage: 10, time: '102MS',
isPrimary: true, }, { name: 'ESLint', percentage: 50, time: '2.38S', }, { name: 'Biome', percentage: 45, time: '2.38S', }, ]; export const lintTypeAwarePerformance: PerformanceData[] = [ { name: 'Type-Aware Mode', percentage: 25, time: '380MS', isPrimary: true, }, { name: 'ESLint', percentage: 85, time: '4.2S', }, { name: 'TypeScript', percentage: 70, time: '3.1S', }, { name: 'Biome', percentage: 60, time: '2.8S', }, ]; export const formatPerformance: PerformanceData[] = [ { name: 'Vite Format', percentage: 10, time: '102MS', isPrimary: true, }, { name: 'ESLint', percentage: 50, time: '2.38S', }, { name: 'Biome', percentage: 45, time: '2.38S', }, ]; ================================================ FILE: docs/.vitepress/theme/data/terminal-transcripts.ts ================================================ export type TerminalTone = 'base' | 'muted' | 'brand' | 'accent' | 'success' | 'warning'; export interface TerminalSegment { text: string; tone?: TerminalTone; bold?: boolean; } export interface TerminalLine { segments: TerminalSegment[]; tone?: TerminalTone; } export interface TerminalTranscript { id: string; label: string; title: string; command: string; prompt?: string; lineDelay?: number; completionDelay?: number; lines: TerminalLine[]; } export const terminalTranscripts: TerminalTranscript[] = [ { id: 'create', label: 'create', title: 'Scaffold a project', command: 'vp create', lineDelay: 220, completionDelay: 900, lines: [ { segments: [ { text: '◇ ', tone: 'accent' }, { text: 'Select a template ', tone: 'muted' }, { text: 'vite:application', tone: 'brand' }, ], }, { segments: [ { text: '◇ ', tone: 'accent' }, { text: 'Project directory ', tone: 'muted' }, { text: 'vite-app', tone: 'brand' }, ], }, { segments: [ { text: '• ', tone: 'muted' }, { text: 'Node ', tone: 'muted' }, { text: '24.14.0', tone: 'brand' }, { text: ' pnpm ', tone: 'muted' }, { text: '10.28.0', tone: 'accent' }, ], }, { segments: [ { text: '✓ ', tone: 'success' }, { text: 'Dependencies 
installed', tone: 'base' }, { text: ' in 1.1s', tone: 'muted' }, ], }, { segments: [ { text: '→ ', tone: 'brand' }, { text: 'Next: ', tone: 'muted' }, { text: 'cd vite-app && vp dev', tone: 'accent' }, ], }, ], }, { id: 'dev', label: 'dev', title: 'Start local development', command: 'vp dev', lineDelay: 220, completionDelay: 1100, lines: [ { segments: [ { text: 'VITE+ ', tone: 'brand' }, { text: 'ready in ', tone: 'muted' }, { text: '68ms', tone: 'base' }, ], }, { segments: [ { text: '→ ', tone: 'brand' }, { text: 'Local ', tone: 'muted' }, { text: 'http://localhost:5173/', tone: 'accent' }, ], }, { segments: [ { text: '→ ', tone: 'muted' }, { text: 'Network ', tone: 'muted' }, { text: '--host', tone: 'base' }, { text: ' to expose', tone: 'muted' }, ], }, { segments: [ { text: '[hmr] ', tone: 'accent' }, { text: 'updated ', tone: 'muted' }, { text: 'src/App.tsx', tone: 'brand' }, { text: ' in 14ms', tone: 'muted' }, ], }, ], }, { id: 'check', label: 'check', title: 'Check the whole project', command: 'vp check', lineDelay: 220, completionDelay: 1100, lines: [ { segments: [ { text: 'pass: ', tone: 'accent' }, { text: 'All 42 files are correctly formatted', tone: 'base' }, { text: ' (88ms, 16 threads)', tone: 'muted' }, ], }, { segments: [ { text: 'pass: ', tone: 'accent' }, { text: 'Found no warnings, lint errors, or type errors', tone: 'base' }, { text: ' in 42 files', tone: 'muted' }, { text: ' (184ms, 16 threads)', tone: 'muted' }, ], }, ], }, { id: 'test', label: 'test', title: 'Run tests with fast feedback', command: 'vp test', lineDelay: 220, completionDelay: 1100, lines: [ { segments: [ { text: 'RUN ', tone: 'muted' }, { text: 'test/button.spec.ts', tone: 'brand' }, { text: ' (3 tests)', tone: 'muted' }, ], }, { segments: [ { text: '✓ ', tone: 'success' }, { text: 'button renders loading state', tone: 'base' }, ], }, { segments: [ { text: '✓ ', tone: 'success' }, { text: '12 tests passed', tone: 'base' }, { text: ' across 4 files', tone: 'muted' }, ], }, { 
segments: [ { text: 'Duration ', tone: 'muted' }, { text: '312ms', tone: 'accent' }, { text: ' (transform 22ms, tests 31ms)', tone: 'muted' }, ], }, ], }, { id: 'build', label: 'build', title: 'Ship a production build', command: 'vp build', lineDelay: 220, completionDelay: 1100, lines: [ { segments: [ { text: 'Rolldown ', tone: 'brand' }, { text: 'building for production', tone: 'muted' }, ], }, { segments: [ { text: '✓ ', tone: 'success' }, { text: '128 modules transformed', tone: 'base' }, ], }, { segments: [ { text: 'dist/assets/index-B6h2Q8.js', tone: 'accent' }, { text: ' 46.2 kB gzip: 14.9 kB', tone: 'muted' }, ], }, { segments: [ { text: 'dist/assets/index-H3a8K2.css', tone: 'brand' }, { text: ' 5.1 kB gzip: 1.6 kB', tone: 'muted' }, ], }, { segments: [ { text: '✓ ', tone: 'success' }, { text: 'Built in ', tone: 'muted' }, { text: '421ms', tone: 'base' }, ], }, ], }, ]; ================================================ FILE: docs/.vitepress/theme/data/testimonials.ts ================================================ export interface TestimonialData { quote: string; logo: string; logoAlt: string; name: string; title: string; company: string; image: string; } export const testimonials: TestimonialData[] = []; ================================================ FILE: docs/.vitepress/theme/index.ts ================================================ // note: import the specific variant directly! 
import BaseTheme from '@voidzero-dev/vitepress-theme/src/viteplus'; import type { Theme } from 'vitepress'; import Layout from './Layout.vue'; import './styles.css'; export default { extends: BaseTheme, Layout, } satisfies Theme; ================================================ FILE: docs/.vitepress/theme/layouts/Error404.vue ================================================ ================================================ FILE: docs/.vitepress/theme/layouts/Home.vue ================================================ ================================================ FILE: docs/.vitepress/theme/styles.css ================================================ /* styles.css */ @import '@voidzero-dev/vitepress-theme/src/styles/index.css'; @source "./**/*.vue"; /* Viteplus */ :root[data-variant='viteplus'] { --color-brand: #4f30e8; } :root.dark:not([data-theme])[data-variant='viteplus'], :root[data-theme='dark'][data-variant='viteplus'] { --color-brand: #6b77f8; } :root[data-theme='light'][data-variant='viteplus'] { --color-brand: #4f30e8; } /* Update fonts for marketing pages */ .marketing-layout { --font-sans: 'APK Protocol', sans-serif; } :root[data-variant='viteplus'] { --terminal-blue: color-mix(in srgb, var(--vp-c-indigo-2) 72%, white); } .terminal-copy { max-width: 100%; font-family: var(--font-mono); min-height: 22rem; padding: 0; font-size: 0.875rem; line-height: 1.5rem; color: var(--color-white); } .terminal-prompt, .terminal-line { white-space: pre-wrap; word-break: break-word; } .terminal-spacer { height: 1rem; } .terminal-line + .terminal-line { margin-top: 0.35rem; } .terminal-tone-base { color: var(--color-white); } .terminal-tone-muted { color: color-mix(in srgb, var(--vp-c-text-2) 52%, white 48%); } .terminal-tone-brand { color: var(--terminal-blue); } .terminal-tone-accent { color: var(--terminal-blue); } .terminal-tone-success { color: var(--color-zest); } .terminal-tone-warning { color: var(--color-fire); } .terminal-cursor { display: inline-block; width: 
0.62rem; height: 1.05rem; margin-left: 0.1rem; vertical-align: -0.12rem; border-radius: 2px; background: var(--color-white); animation: terminal-blink 0.95s steps(1, end) infinite; } .terminal-line-enter-active, .terminal-line-leave-active { transition: opacity 220ms ease, transform 220ms ease; } .terminal-line-enter-from, .terminal-line-leave-to { opacity: 0; transform: translateY(0.35rem); } @keyframes terminal-blink { 0%, 49% { opacity: 1; } 50%, 100% { opacity: 0; } } @media (max-width: 640px) { .terminal-copy { font-size: 0.82rem; line-height: 1.5rem; } } .terminal-blue { color: var(--terminal-blue); } ================================================ FILE: docs/.vitepress/tsconfig.json ================================================ { "compilerOptions": { "target": "ES2020", "module": "ESNext", "lib": ["ES2024", "DOM", "DOM.Iterable"], "moduleResolution": "bundler", "resolveJsonModule": true, "noEmit": false, "emitDeclarationOnly": true, "outDir": "dist", "allowImportingTsExtensions": true, "strict": true, "declaration": true, "skipLibCheck": true, "esModuleInterop": true, "allowSyntheticDefaultImports": true, "forceConsistentCasingInFileNames": true, "paths": { "@local-assets/*": ["./theme/assets/*"], "@assets/*": ["../node_modules/@voidzero-dev/vitepress-theme/src/assets/*"], "@components/*": ["../node_modules/@voidzero-dev/vitepress-theme/src/components/*"] } }, "include": ["**/*.ts", "**/*.d.ts", "**/*.vue"] } ================================================ FILE: docs/config/build.md ================================================ # Build Config `vp dev`, `vp build`, and `vp preview` use the standard [Vite configuration](https://vite.dev/config/), including [plugins](https://vite.dev/guide/using-plugins), [aliases](https://vite.dev/config/shared-options#resolve-alias), [`server`](https://vite.dev/config/server-options), [`build`](https://vite.dev/config/build-options) and [`preview`](https://vite.dev/config/preview-options) fields. 
## Example ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ server: { port: 3000, }, build: { sourcemap: true, }, preview: { port: 4173, }, }); ``` ================================================ FILE: docs/config/fmt.md ================================================ # Format Config `vp fmt` and `vp check` read Oxfmt settings from the `fmt` block in `vite.config.ts`. See [Oxfmt's configuration](https://oxc.rs/docs/guide/usage/formatter/config.html) for details. ## Example ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ fmt: { ignorePatterns: ['dist/**'], singleQuote: true, semi: true, sortPackageJson: true, }, }); ``` ================================================ FILE: docs/config/index.md ================================================ # Configuring Vite+ Vite+ keeps project configuration in one place: `vite.config.ts`, allowing you to consolidate many top-level configuration files in a single file. You can keep using your Vite configuration such as `server` or `build`, and add Vite+ blocks for the rest of your workflow: ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ server: {}, build: {}, preview: {}, test: {}, lint: {}, fmt: {}, run: {}, pack: {}, staged: {}, }); ``` ## Vite+ Specific Configuration Vite+ extends the basic Vite configuration with these additions: - [`lint`](/config/lint) for Oxlint - [`fmt`](/config/fmt) for Oxfmt - [`test`](/config/test) for Vitest - [`run`](/config/run) for Vite Task - [`pack`](/config/pack) for tsdown - [`staged`](/config/staged) for staged-file checks ================================================ FILE: docs/config/lint.md ================================================ # Lint Config `vp lint` and `vp check` read Oxlint settings from the `lint` block in `vite.config.ts`. See [Oxlint's configuration](https://oxc.rs/docs/guide/usage/linter/config.html) for details. 
## Example ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ lint: { ignorePatterns: ['dist/**'], options: { typeAware: true, typeCheck: true, }, rules: { 'no-console': ['error', { allow: ['error'] }], }, }, }); ``` We recommend enabling both `options.typeAware` and `options.typeCheck` so `vp lint` and `vp check` can use the full type-aware path. ================================================ FILE: docs/config/pack.md ================================================ # Pack Config `vp pack` reads tsdown settings from the `pack` block in `vite.config.ts`. See [tsdown's configuration](https://tsdown.dev/options/config-file) for details. ## Example ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ pack: { dts: true, format: ['esm', 'cjs'], sourcemap: true, }, }); ``` ================================================ FILE: docs/config/run.md ================================================ # Run Config You can configure Vite Task under the `run` field in `vite.config.ts`. Check out [`vp run`](/guide/run) to learn more about running scripts and tasks with Vite+. ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ run: { enablePrePostScripts: true, cache: { /* ... */ }, tasks: { /* ... */ }, }, }); ``` ## `run.enablePrePostScripts` - **Type:** `boolean` - **Default:** `true` Whether to automatically run `preX`/`postX` package.json scripts as lifecycle hooks when script `X` is executed. When enabled (the default), running a script like `test` will automatically run `pretest` before it and `posttest` after it, if they exist in `package.json`. ```ts export default defineConfig({ run: { enablePrePostScripts: false, // Disable pre/post lifecycle hooks }, }); ``` ::: warning This option can only be set in the workspace root's `vite.config.ts`. Setting it in a package's config will result in an error. 
:::

## `run.cache`

- **Type:** `boolean | { scripts?: boolean, tasks?: boolean }`
- **Default:** `{ scripts: false, tasks: true }`

Controls whether task results are cached and replayed on subsequent runs.

```ts
export default defineConfig({
  run: {
    cache: {
      scripts: true, // Cache package.json scripts (default: false)
      tasks: true, // Cache task definitions (default: true)
    },
  },
});
```

`cache: true` enables both task and script caching; `cache: false` disables both.

## `run.tasks`

- **Type:** `Record<string, Task>`

Defines tasks that can be run with `vp run <task>`.

### `command`

- **Type:** `string`

Defines the shell command to run for the task.

```ts
tasks: {
  build: {
    command: 'vp build',
  },
}
```

Each task defined in `vite.config.ts` must include its own `command`. You cannot define a task in both `vite.config.ts` and `package.json` with the same task name.

Commands joined with `&&` are automatically split into independently cached sub-tasks. See [Compound Commands](/guide/run#compound-commands).

### `dependsOn`

- **Type:** `string[]`
- **Default:** `[]`

Tasks that must complete successfully before this one starts.

```ts
tasks: {
  deploy: {
    command: 'deploy-script --prod',
    dependsOn: ['build', 'test'],
  },
}
```

Dependencies can reference tasks in other packages using the `package#task` format:

```ts
dependsOn: ['@my/core#build', '@my/utils#lint'];
```

See [Task Dependencies](/guide/run#task-dependencies) for details on how explicit and topological dependencies interact.

### `cache`

- **Type:** `boolean`
- **Default:** `true`

Whether to cache this task's output. Set to `false` for tasks that should never be cached, like dev servers:

```ts
tasks: {
  dev: {
    command: 'vp dev',
    cache: false,
  },
}
```

### `env`

- **Type:** `string[]`
- **Default:** `[]`

Environment variables included in the cache fingerprint. When any listed variable's value changes, the cache is invalidated.
```ts tasks: { build: { command: 'vp build', env: ['NODE_ENV'], }, } ``` Wildcard patterns are supported: `VITE_*` matches all variables starting with `VITE_`. ```bash $ NODE_ENV=development vp run build # first run $ NODE_ENV=production vp run build # cache miss: variable changed ``` ### `untrackedEnv` - **Type:** `string[]` - **Default:** see below Environment variables passed to the task process but **not** included in the cache fingerprint. Changing these values won't invalidate the cache. ```ts tasks: { build: { command: 'vp build', untrackedEnv: ['CI', 'GITHUB_ACTIONS'], }, } ``` A set of common environment variables are automatically passed through to all tasks: - **System:** `HOME`, `USER`, `PATH`, `SHELL`, `LANG`, `TZ` - **Node.js:** `NODE_OPTIONS`, `COREPACK_HOME`, `PNPM_HOME` - **CI/CD:** `CI`, `VERCEL_*`, `NEXT_*` - **Terminal:** `TERM`, `COLORTERM`, `FORCE_COLOR`, `NO_COLOR` ### `input` - **Type:** `Array` - **Default:** `[{ auto: true }]` (auto-inferred) Vite Task automatically detects which files are used by a command (see [Automatic File Tracking](/guide/cache#automatic-file-tracking)). The `input` option can be used to explicitly include or exclude certain files. **Exclude files** from automatic tracking: ```ts tasks: { build: { command: 'vp build', // Use `{ auto: true }` to use automatic fingerprinting (default). input: [{ auto: true }, '!**/*.tsbuildinfo', '!dist/**'], }, } ``` **Specify explicit files** only without automatic tracking: ```ts tasks: { build: { command: 'vp build', input: ['src/**/*.ts', 'vite.config.ts'], }, } ``` **Disable file tracking** entirely and cache only on command/env changes: ```ts tasks: { greet: { command: 'node greet.mjs', input: [], }, } ``` ::: tip Glob patterns are resolved relative to the package directory, not the task's `cwd`. ::: ### `cwd` - **Type:** `string` - **Default:** package root Working directory for the task, relative to the package root. 
```ts tasks: { 'test-e2e': { command: 'vp test', cwd: 'tests/e2e', }, } ``` ================================================ FILE: docs/config/staged.md ================================================ # Staged Config `vp staged` and `vp config` read staged-file rules from the `staged` block in `vite.config.ts`. See the [Commit hooks guide](/guide/commit-hooks). ## Example ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ staged: { '*.{js,ts,tsx,vue,svelte}': 'vp check --fix', }, }); ``` ================================================ FILE: docs/config/test.md ================================================ # Test Config `vp test` reads Vitest settings from the `test` block in `vite.config.ts`. See [Vitest's configuration](https://vitest.dev/config/) for details. ## Example ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ test: { include: ['src/**/*.test.ts'], coverage: { reporter: ['text', 'html'], }, }, }); ``` ================================================ FILE: docs/guide/build.md ================================================ # Build `vp build` builds Vite applications for production. ## Overview `vp build` runs the standard Vite production build through Vite+. Since it is directly based on Vite, the build pipeline and configuration model are the same as Vite. For more information about how Vite production builds work, see the [Vite guide](https://vite.dev/guide/build). Note that Vite+ uses Vite 8 and [Rolldown](https://rolldown.rs/) for builds. ::: info `vp build` always runs the built-in Vite production build. If your project also has a `build` script in `package.json`, run `vp run build` when you want to run that script instead. ::: ## Usage ```bash vp build vp build --watch vp build --sourcemap ``` ## Configuration Use standard Vite configuration in `vite.config.ts`. For the full configuration reference, see the [Vite config docs](https://vite.dev/config/). 
Use it for: - [plugins](https://vite.dev/guide/using-plugins) - [aliases](https://vite.dev/config/shared-options#resolve-alias) - [`build`](https://vite.dev/config/build-options) - [`preview`](https://vite.dev/config/preview-options) - [environment modes](https://vite.dev/guide/env-and-mode) ## Preview Use `vp preview` to serve the production build locally after `vp build`. ```bash vp build vp preview ``` ================================================ FILE: docs/guide/cache.md ================================================ # Task Caching Vite Task can automatically track dependencies and cache tasks run through `vp run`. ## Overview When a task runs successfully (exit code 0), its terminal output (stdout/stderr) is saved. On the next run, Vite Task checks if anything changed: 1. **Arguments:** did the [additional arguments](/guide/run#additional-arguments) passed to the task change? 2. **Environment variables:** did any [fingerprinted env vars](/config/run#env) change? 3. **Input files:** did any file that the command reads change? If everything matches, the cached output is replayed instantly, and the command does not run. ::: info Currently, only terminal output is cached and replayed. Output files such as `dist/` are not cached. If you delete them, use `--no-cache` to force a re-run. Output file caching is planned for a future release. ::: When a cache miss occurs, Vite Task tells you exactly why: ``` $ vp lint ✗ cache miss: 'src/utils.ts' modified, executing $ vp build ✗ cache miss: env changed, executing $ vp test ✗ cache miss: args changed, executing ``` ## When Is Caching Enabled? A command run by `vp run` is either a **task** defined in `vite.config.ts` or a **script** defined in `package.json`. Task names and script names cannot overlap. By default, **tasks are cached and scripts are not.** There are three types of controls for task caching, in order: ### 1. Per-task `cache: false` A task can set [`cache: false`](/config/run#cache) to opt out. 
This cannot be overridden by any other cache control flag.

### 2. CLI flags

`--no-cache` disables caching for everything. `--cache` enables caching for both tasks and scripts, which is equivalent to setting [`run.cache: true`](/config/run#run-cache) for that invocation.

### 3. Workspace config

The [`run.cache`](/config/run#run-cache) option in your root `vite.config.ts` controls the default for each category:

| Setting         | Default | Effect                                  |
| --------------- | ------- | --------------------------------------- |
| `cache.tasks`   | `true`  | Cache tasks defined in `vite.config.ts` |
| `cache.scripts` | `false` | Cache `package.json` scripts            |

## Automatic File Tracking

Vite Task tracks which files each command reads during execution. When a task runs, it records which files the process opens, such as your `.ts` source files, `vite.config.ts`, and `package.json`, along with their content hashes. On the next run, it re-checks those hashes to determine whether anything changed. This means caching works out of the box for most commands without any configuration.

Vite Task also records:

- **Missing files:** if a command probes for a file that doesn't exist, such as `utils.ts` during module resolution, creating that file later correctly invalidates the cache.
- **Directory listings:** if a command scans a directory, such as a test runner looking for `*.test.ts`, adding or removing files in that directory invalidates the cache.

### Avoiding Overly Broad Input Tracking

Automatic tracking can sometimes include more files than necessary, causing unnecessary cache misses:

- **Tool cache files:** some tools maintain their own cache, such as TypeScript's `.tsbuildinfo` or Cargo's `target/`. These files may change between runs even when your source code has not, invalidating the cache for no good reason.
- **Directory listings:** when a command scans a directory, such as when globbing for `**/*.js`, Vite Task sees the directory read but not the glob pattern.
Any file added or removed in that directory, even unrelated ones, invalidates the cache. Use the [`input`](/config/run#input) option to exclude files or to replace automatic tracking with explicit file patterns: ```ts tasks: { build: { command: 'tsc', input: [{ auto: true }, '!**/*.tsbuildinfo'], }, } ``` ## Environment Variables By default, tasks run in a clean environment. Only a small set of common variables, such as `PATH`, `HOME`, and `CI`, are passed through. Other environment variables are neither visible to the task nor included in the cache fingerprint. To add an environment variable to the cache key, add it to [`env`](/config/run#env). Changing its value then invalidates the cache: ```ts tasks: { build: { command: 'webpack --mode production', env: ['NODE_ENV'], }, } ``` To pass a variable to the task **without** affecting cache behavior, use [`untrackedEnv`](/config/run#untracked-env). This is useful for variables like `CI` or `GITHUB_ACTIONS` that should be available in the task, but do not generally affect caching behavior. See [Run Config](/config/run#env) for details on wildcard patterns and the full list of automatically passed-through variables. ## Cache Sharing Vite Task's cache is content-based. If two tasks run the same command with the same inputs, they share the cache entry. This happens naturally when multiple tasks include a common step, either as standalone tasks or as parts of [compound commands](/guide/run#compound-commands): ```json [package.json] { "scripts": { "check": "vp lint && vp build", "release": "vp lint && deploy-script" } } ``` With caching enabled, for example through `--cache` or [`run.cache.scripts: true`](/config/run#run-cache), running `check` first means the `vp lint` step in `release` is an instant cache hit, since both run the same command against the same files. 
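If you want the shared `vp lint` script step above to be cache-eligible by default, instead of passing `--cache` on every invocation, you can opt in to script caching once in the workspace root config. A minimal sketch using the documented `run.cache` option:

```typescript
// vite.config.ts (workspace root): a minimal sketch.
// Opts package.json scripts in to caching so the shared `vp lint`
// step in `check` and `release` can hit the same cache entry.
import { defineConfig } from 'vite-plus';

export default defineConfig({
  run: {
    cache: {
      scripts: true, // default is false
      tasks: true, // default is true
    },
  },
});
```

Note that a per-task `cache: false` still wins over this setting (see [Run Config](/config/run#run-cache)).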
## Cache Commands

Use `vp cache clean` when you need to clear cached task results:

```bash
vp cache clean
```

The task cache is stored in `node_modules/.vite/task-cache` at the project root. `vp cache clean` deletes that cache directory.

================================================
FILE: docs/guide/check.md
================================================

# Check

`vp check` runs format, lint, and type checks together.

## Overview

`vp check` is the default command for fast static checks in Vite+. It brings together formatting through [Oxfmt](https://oxc.rs/docs/guide/usage/formatter.html), linting through [Oxlint](https://oxc.rs/docs/guide/usage/linter.html), and TypeScript type checks through [tsgolint](https://github.com/oxc-project/tsgolint). By merging all of these checks into a single command, `vp check` is faster than running each tool separately.

When `typeCheck` is enabled in the `lint.options` block in `vite.config.ts`, `vp check` also runs TypeScript type checks through the Oxlint type-aware path, powered by the TypeScript Go toolchain and tsgolint. `vp create` and `vp migrate` enable both `typeAware` and `typeCheck` by default. We recommend turning `typeCheck` on so `vp check` becomes the single command for static checks during development.

## Usage

```bash
vp check
vp check --fix # Format and run autofixers.
```

## Configuration

`vp check` uses the same configuration you already define for linting and formatting:

- [`lint`](/guide/lint#configuration) block in `vite.config.ts`
- [`fmt`](/guide/fmt#configuration) block in `vite.config.ts`
- TypeScript project structure and tsconfig files for type-aware linting

Recommended base `lint` config:

```ts
import { defineConfig } from 'vite-plus';

export default defineConfig({
  lint: {
    options: {
      typeAware: true,
      typeCheck: true,
    },
  },
});
```

================================================
FILE: docs/guide/ci.md
================================================

# Continuous Integration

You can use `voidzero-dev/setup-vp` to run Vite+ in CI environments.

## Overview

For GitHub Actions, the recommended setup is [`voidzero-dev/setup-vp`](https://github.com/voidzero-dev/setup-vp). It installs Vite+, sets up the required Node.js version and package manager, and can cache package installs automatically. That means you usually do not need separate `setup-node`, package-manager setup, and manual dependency-cache steps in your workflow.

## GitHub Actions

```yaml
- uses: voidzero-dev/setup-vp@v1
  with:
    node-version: '22'
    cache: true
- run: vp install
- run: vp check
- run: vp test
- run: vp build
```

With `cache: true`, `setup-vp` handles dependency caching for you automatically.

## Simplifying Existing Workflows

If you are migrating an existing GitHub Actions workflow, you can often replace large blocks of Node, package-manager, and cache setup with a single `setup-vp` step.
#### Before: ```yaml - uses: actions/setup-node@v4 with: node-version: '24' - uses: pnpm/action-setup@v4 with: version: 10 - name: Get pnpm store path run: pnpm store path - uses: actions/cache@v4 with: path: ~/.pnpm-store key: ${{ runner.os }}-pnpm-${{ hashFiles('pnpm-lock.yaml') }} - run: pnpm install && pnpm dev:setup - run: pnpm test ``` #### After: ```yaml - uses: voidzero-dev/setup-vp@v1 with: node-version: '24' cache: true - run: vp install && vp run dev:setup - run: vp check - run: vp test ``` ================================================ FILE: docs/guide/commit-hooks.md ================================================ # Commit Hooks Use `vp config` to install commit hooks, and `vp staged` to run checks on staged files. ## Overview Vite+ supports commit hooks and staged-file checks without additional tooling. Use: - `vp config` to set up project hooks and related integrations - `vp staged` to run checks against the files currently staged in Git If you use [`vp create`](/guide/create) or [`vp migrate`](/guide/migrate), Vite+ prompts you to set this up for your project automatically. ## Commands ### `vp config` `vp config` configures Vite+ for the current project. It installs Git hooks, sets up the hook directory, and can also handle related project integration such as agent setup. By default, hooks are written to `.vite-hooks`: ```bash vp config vp config --hooks-dir .vite-hooks ``` ### `vp staged` `vp staged` runs staged-file checks using the `staged` config from `vite.config.ts`. If you set up Vite+ to handle your commit hooks, it will automatically run when you commit your local changes. 
```bash vp staged vp staged --verbose vp staged --fail-on-changes ``` ## Configuration Define staged-file checks in the `staged` block in `vite.config.ts`: ```ts import { defineConfig } from 'vite-plus'; export default defineConfig({ staged: { '*.{js,ts,tsx,vue,svelte}': 'vp check --fix', }, }); ``` This is the default Vite+ approach and should replace separate `lint-staged` configuration in most projects. Because `vp staged` reads from `vite.config.ts`, your staged-file checks stay in the same place as your lint, format, test, build, and task-runner config. ================================================ FILE: docs/guide/create.md ================================================ # Creating a Project `vp create` interactively scaffolds new Vite+ projects, monorepos, and apps inside existing workspaces. ## Overview The `create` command is the fastest way to start with Vite+. It can be used in a few different ways: - Start a new Vite+ monorepo - Create a new standalone application or library - Add a new app or library inside an existing project This command can be used with built-in templates, community templates, or remote GitHub templates. ## Usage ```bash vp create vp create