Repository: caderek/gramma
Branch: master
Commit: 1c5de79042a1
Files: 93
Total size: 510.2 KB
Directory structure:
gitextract_6at5w77x/
├── .circleci/
│   └── config.yml
├── .eslintignore
├── .eslintrc.json
├── .github/
│   └── ISSUE_TEMPLATE/
│       ├── bug_report.md
│       ├── documentation.md
│       ├── feature_request.md
│       └── question.md
├── .gitignore
├── .gramma.json
├── .husky/
│   ├── commit-msg
│   ├── post-commit
│   └── pre-commit
├── .npmignore
├── .prettierrc
├── CHANGELOG.md
├── LICENSE.md
├── README.md
├── _config.yml
├── _layouts/
│   └── default.html
├── assets/
│   └── css/
│       └── style.scss
├── bundle/
│   └── gramma.esm.js
├── data/
│   ├── languages.json
│   └── rules.json
├── examples/
│   ├── api-markdown.js
│   ├── api-plain.js
│   └── api-simple.js
├── hello.md
├── lib/
│   ├── findUpSync.mjs
│   ├── package.json
│   └── prepareMarkdown.mjs
├── package.json
├── scripts/
│   ├── checkLanguagesSupport.js
│   └── zipBinaries.js
├── src/
│   ├── actions/
│   │   ├── checkInteractively.js
│   │   ├── checkNonInteractively.js
│   │   ├── configure.js
│   │   ├── save.js
│   │   └── saveNow.js
│   ├── boot/
│   │   ├── load.js
│   │   └── prepareConfig.js
│   ├── cli.js
│   ├── cli.test.js
│   ├── commands/
│   │   ├── check.js
│   │   ├── commit.js
│   │   ├── config.js
│   │   ├── debug.js
│   │   ├── hook.js
│   │   ├── init.js
│   │   ├── listen.js
│   │   ├── paths.js
│   │   └── server.js
│   ├── components/
│   │   ├── FixMenu.js
│   │   ├── FixMenu.test.js
│   │   ├── Mistake.js
│   │   └── Mistake.test.js
│   ├── context.js
│   ├── index.d.ts
│   ├── index.js
│   ├── initialConfig.js
│   ├── prompts/
│   │   ├── confirmConfig.js
│   │   ├── confirmInit.js
│   │   ├── confirmPort.js
│   │   ├── confirmServerReinstall.js
│   │   ├── handleMistake.js
│   │   ├── handleSave.js
│   │   └── mainMenu.js
│   ├── requests/
│   │   ├── checkViaAPI.d.ts
│   │   ├── checkViaAPI.js
│   │   ├── checkViaCmd.js
│   │   ├── checkWithFallback.js
│   │   └── updates.js
│   ├── server/
│   │   ├── getServerInfo.js
│   │   ├── getServerPID.js
│   │   ├── installServer.js
│   │   ├── showServerGUI.js
│   │   ├── startServer.js
│   │   └── stopServer.js
│   ├── text-manipulation/
│   │   ├── replace.js
│   │   ├── replace.test.js
│   │   ├── replaceAll.d.ts
│   │   ├── replaceAll.js
│   │   └── replaceAll.test.js
│   ├── utils/
│   │   ├── appLocation.js
│   │   ├── downloadFile.js
│   │   ├── equal.js
│   │   ├── findUpSync.js
│   │   ├── prepareMarkdown.js
│   │   ├── stripStyles.js
│   │   ├── stripStyles.test.js
│   │   └── unzipFile.js
│   └── validators/
│       ├── languages.js
│       └── rules.js
└── tsconfig.json
================================================
FILE CONTENTS
================================================
================================================
FILE: .circleci/config.yml
================================================
version: 2
jobs:
  build:
    docker:
      - image: cimg/node:14.18.0
    working_directory: ~/repo
    steps:
      - checkout
      - restore_cache:
          keys:
            - v1-dependencies-{{ checksum "package.json" }}
            - v1-dependencies-
      - run: yarn install
      - save_cache:
          paths:
            - node_modules
          key: v1-dependencies-{{ checksum "package.json" }}
      - run: yarn run test:ci
      - run: yarn run lint
================================================
FILE: .eslintignore
================================================
lib/prepareMarkdown.mjs
src/utils/prepareMarkdown.js
src/utils/findUpSync.js
*/**/*.d.ts
================================================
FILE: .eslintrc.json
================================================
{
  "extends": ["airbnb", "prettier"],
  "rules": {
    "no-console": 0,
    "arrow-body-style": 0,
    "no-restricted-syntax": 0,
    "no-await-in-loop": 0,
    "camelcase": 0
  },
  "env": {
    "jest": true,
    "node": true
  },
  "globals": {
    "fetch": true
  }
}
================================================
FILE: .github/ISSUE_TEMPLATE/bug_report.md
================================================
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug
assignees: caderek
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. Linux, macOS, Windows]
- Version [e.g. Ubuntu 18.04, Mojave, 10]
**Additional context**
Add any other context about the problem here.
================================================
FILE: .github/ISSUE_TEMPLATE/documentation.md
================================================
---
name: Documentation
about: All docs related issues
title: ''
labels: documentation
assignees: caderek
---
**Describe what is missing, unclear or incorrect**
A clear and concise description of what you want us to change/add.
================================================
FILE: .github/ISSUE_TEMPLATE/feature_request.md
================================================
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: enhancement
assignees: caderek
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
================================================
FILE: .github/ISSUE_TEMPLATE/question.md
================================================
---
name: Question
about: All questions that do not require changes to the codebase
title: ''
labels: help wanted
assignees: caderek
---
**How can I help you?**
================================================
FILE: .gitignore
================================================
node_modules
.history
.vscode
example-responses
example.txt
coverage
bin
assets/css/*.css
assets/css/*.css.map
================================================
FILE: .gramma.json
================================================
{
  "api_url": "https://api.languagetool.org/v2/check",
  "api_key": "",
  "dictionary": [
    "Asturian",
    "Bugfix",
    "CHANGELOG",
    "CircleCI",
    "Codacy",
    "CommonJS",
    "Config",
    "Github",
    "Gramma",
    "Grammarbot",
    "IIFE",
    "JS",
    "Moçambique",
    "NPM",
    "README",
    "XXXXXXXX",
    "YYYYYYYY",
    "_blank",
    "api",
    "api_key",
    "ast-ES",
    "async",
    "backend",
    "boolean",
    "br-FR",
    "chmod",
    "config",
    "confused_words",
    "const",
    "correctText",
    "da-DK",
    "de",
    "de-AT",
    "de-CH",
    "de-DE",
    "dev",
    "el-GR",
    "eo",
    "eslintignore",
    "esm",
    "esm-min",
    "exampleReplacements",
    "false_friends",
    "foo",
    "fr",
    "gender_neutrality",
    "gl-ES",
    "gramma",
    "grammarbot",
    "gui",
    "href",
    "iife",
    "img",
    "init",
    "io",
    "ja-JP",
    "js",
    "json",
    "km-KH",
    "linter",
    "nl",
    "npm",
    "npmignore",
    "pid",
    "preAO",
    "prepareReplacements",
    "replaceAll",
    "rimraf",
    "ro-RO",
    "ru-RU",
    "signup",
    "sk-SK",
    "sl-SI",
    "src",
    "stdin",
    "stdout",
    "stylesheet",
    "sv",
    "symlink",
    "tl-PH",
    "uk-UA",
    "url",
    "usedCfg",
    "zh-CN"
  ],
  "language": "en-US",
  "rules": {
    "casing": true,
    "colloquialisms": true,
    "compounding": true,
    "confused_words": true,
    "false_friends": true,
    "gender_neutrality": true,
    "grammar": true,
    "misc": true,
    "punctuation": true,
    "redundancy": true,
    "regionalisms": true,
    "repetitions": true,
    "semantics": true,
    "style": true,
    "typography": false,
    "typos": true
  }
}
================================================
FILE: .husky/commit-msg
================================================
#!/bin/sh
exec < /dev/tty
npx gramma hook $1
================================================
FILE: .husky/post-commit
================================================
#!/bin/sh
npx gramma hook cleanup
================================================
FILE: .husky/pre-commit
================================================
#!/bin/sh
. "$(dirname "$0")/_/husky.sh"
npm test
================================================
FILE: .npmignore
================================================
_layouts
node_modules
.history
.circleci
example-responses
example.txt
coverage
bin
.husky
.github
.vscode
lib
assets/css
assets/gramma-logo.svg
assets/gramma-text.svg
assets/banner.png
assets/banner-small.png
scripts
examples
================================================
FILE: .prettierrc
================================================
{
  "trailingComma": "all",
  "tabWidth": 2,
  "semi": false,
  "singleQuote": false,
  "arrowParens": "always",
  "printWidth": 80
}
================================================
FILE: CHANGELOG.md
================================================
# CHANGELOG
## 1.0.0
First stable release.
## 1.1.0
- Added Git hook integration
- Updated dependencies and documentation
- Improved error handling
## 1.2.0
- Added Markdown support
- Used api.languagetool.org as the default API
## 1.3.0
- Support for environment variables in config files
- Local config works in subdirectories
- Automatic markdown support for .md files
- Better error handling
- Improved documentation
## 1.4.0
- Automatically include changes to .gramma.json when executing Git hook
- Standalone binaries migrated to Node 16
## 1.4.1
- Fixed JS API, added type definitions
- Fixed hooks behavior with commit --verbose flag
## 1.4.2 - 1.4.4
- Isomorphic JS API (works on browser)
## 1.4.5
- Fixed CORS in JS API (browser)
## 1.4.6 - 1.4.7
- Bundles (esm, esm-min, iife)
## 1.4.8
- Fixed links in README
## 1.5.0
- When the local server is installed but not running, Gramma will now try to use the command-line interface for LanguageTool communication instead of spawning an HTTP server (if possible).
- Gramma will now automatically check for updates once a day.
- Added validation for languages and rules parameters.
## 1.6.0
- Added `gramma server info` command.
- Added option to set custom port when managing local server manually.
================================================
FILE: LICENSE.md
================================================
Copyright 2021 Maciej Cąderek
Permission to use, copy, modify, and/or distribute this software
for any purpose with or without fee is hereby granted,
provided that the above copyright notice
and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES
OF MERCHANTABILITY AND FITNESS.
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT,
OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE,
DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT,
NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
================================================
FILE: README.md
================================================
<div align="center">
<img src="assets/gramma-logo.png" alt="gramma-logo" />
</div>
<div align="center">
<img src="assets/gramma-text.png" alt="gramma-title" />
</div>
<!-- Gramma is an interactive tool that helps you find and fix grammatical mistakes in files and text strings. You can also use it in a non-interactive way, as a simple linter for automation processes.
Gramma works on Linux, Windows, and macOS.
Gramma supports many languages. You can find a full list <a href="https://languagetool.org/languages">here</a>.
Gramma works out-of-the-box, communicating with [languagetool.org](https://languagetool.org), but can also be easily configured to work with other compatible APIs, including local or remote [LanguageTool server](https://dev.languagetool.org/http-server). -->
<div> </div>
<div align="center">
<a href="https://circleci.com/gh/caderek/gramma/tree/master" target="_blank"><img src="https://img.shields.io/circleci/build/github/caderek/gramma.svg?labelColor=024160" alt="CircleCI"></a>
<img src="https://img.shields.io/npm/v/gramma.svg?labelColor=024160&color=0081B8" alt="npm version">
<img src="https://img.shields.io/node/v/gramma.svg?labelColor=024160&color=0081B8" alt="node version">
<img src="https://img.shields.io/npm/l/gramma.svg?labelColor=024160&color=0081B8" alt="npm license">
</div>
<div> </div>
<div align="center">
<img src="docs/example.gif" alt="Example" style="border-radius: 10px;">
</div>
<div> </div>
<div><img src="assets/divider.png" width="838" alt="---" class="divider" /></div>
## Features
- Provides advanced grammar checks via LanguageTool (remote API or local server).
- Supports global and local (per-project) configuration.
- Supports plain text and markdown.
- Git integration!
- Fully interactive!
<div><img src="assets/divider.png" width="838" alt="---" class="divider" /></div>
## Contents
1. [Installation](#installation)
- [Via NPM (global)](#installation-npm)
- [Standalone binary](#installation-binary)
- [Dev tool for JS/TS projects](#installation-dev)
- [Local LanguageTool server (optional)](#installation-server)
1. [Usage](#usage)
- [Check file](#usage-check)
- [Check string](#usage-listen)
- [Git commit with grammar check](#usage-commit)
- [Command-line options](#usage-options)
- [Usage inside VIM](#usage-vim)
1. [Configuration](#config)
- [Introduction](#config-intro)
- [Local config](#config-local)
- [Git integration](#config-git)
- [Checker settings](#config-checker)
- [Customizing API server](#config-server)
- [Security](#config-security)
1. [Managing a local server](#server)
1. [JS API](#js)
1. [License](#license)
<a id='installation'></a>
<div><img src="assets/divider.png" width="838" alt="---" class="divider" /></div>
## Installation
<a id='installation-npm'></a>
### Via NPM
This is the recommended method if you already have Node.js installed (or are willing to install it).
```
npm i gramma -g
```
<hr/>
<a id='installation-binary'></a>
### Standalone binary
If you prefer a single binary file, you can download it for the most popular platforms:
- [gramma-linux64-v1.6.0.zip](https://github.com/caderek/gramma/releases/download/v1.6.0/gramma-linux64-v1.6.0.zip)
- [gramma-macos-v1.6.0.zip](https://github.com/caderek/gramma/releases/download/v1.6.0/gramma-macos-v1.6.0.zip)
- [gramma-windows64-v1.6.0.zip](https://github.com/caderek/gramma/releases/download/v1.6.0/gramma-windows64-v1.6.0.zip)
After downloading and unpacking the binary, add it to your PATH or create a symlink to your executable directory (depending on the platform).
<hr/>
<a id='installation-dev'></a>
### Dev tool for JS/TS projects
You can install Gramma locally in your JS/TS project - this method gives you a separate, project-specific config.
```
npm i gramma -D
```
or
```
yarn add gramma -D
```
Then create the local config file:
```
npx gramma init
```
You will be asked if you want to integrate Gramma with Git (via a hook). You can later toggle the Git hook manually via the `npx gramma hook` command.
Git hook also works with a non-default hooks path (Husky, etc.).
<hr/>
<a id='installation-server'></a>
### Local LanguageTool server (optional)
For this to work, you have to install Java 1.8 or higher (you can find it [here](https://adoptium.net)). You can check if you have it installed already by running:
```
java -version
```
To install the local server, use:
```
gramma server install
```
That's it - Gramma will now use and manage the local server automatically.
<a id='usage'></a>
<div><img src="assets/divider.png" width="838" alt="---" class="divider" /></div>
## Usage
<a id='usage-check'></a>
### Check file
Interactive fix:
```
gramma check [file]
```
Just print potential mistakes and return status code:
```
gramma check -p [file]
```
Examples:
```
gramma check path/to/my_file.txt
```
```
gramma check -p path/to/other/file.txt
```
<hr/>
<a id='usage-listen'></a>
### Check string
Interactive fix:
```
gramma listen [text]
```
Just print potential mistakes and return status code:
```
gramma listen -p [text]
```
Examples:
```
gramma listen "This sentence will be checked interactively."
```
```
gramma listen -p "Suggestions for this sentence will be printed."
```
<hr/>
<a id='usage-commit'></a>
### Git commit with grammar check
_**TIP:** Instead of the commands below, you can use [Git integration](#config-git)._
Equivalent to `git commit -m [message]`:
```
gramma commit [text]
```
Equivalent to `git commit -am [message]`:
```
gramma commit -a [text]
```
Examples:
```
gramma commit "My commit message"
```
```
gramma commit -a "Another commit message (files added)"
```
<hr/>
<a id='usage-options'></a>
### Command-line options
_Note: This section describes options for grammar-checking commands only. Other command-specific options are described in their specific sections of this document._
- `-p / --print` - check text in the non-interactive mode
- `-n / --no-colors` - when paired with the `-p` flag, removes colors from the output
- `-d / --disable <rule>` - disable a specific [rule](#available-rules)
- `-e / --enable <rule>` - enable a specific [rule](#available-rules)
- `-l / --language <language_code>` - mark the text as written in the provided [language](#available-languages)
- `-m / --markdown` - treat the input as markdown (removes some false positives)
You can enable or disable multiple rules in one command by using the corresponding option multiple times. You can also combine boolean options if you use their short versions.
Example:
```
gramma listen "I like making mistkaes!" -pn -d typos -d typography -e casing -l en-GB
```
<hr/>
<a id='usage-vim'></a>
### Usage inside VIM
If you are a VIM/Neovim user, you can use Gramma directly inside the editor:
Print the potential mistakes:
```
:w !gramma check /dev/stdin -pn
```
Interactive fix of the current file:
```
:terminal gramma check %
```
It will open the interactive terminal inside VIM - to handle Gramma suggestions, enter the interactive mode (`a` or `i`) and use Gramma as usual. After you fix the mistakes and replace a file, press `Enter` to return to the editor.
<details>
<summary style="outline: none; cursor: pointer">Example GIF (click to expand)</summary>
<img src="https://raw.githubusercontent.com/caderek/gramma/master/docs/gramma-vim.gif" alt="Gramma VIM example" />
</details>
<a id='config'></a>
<div><img src="assets/divider.png" width="838" alt="---" class="divider" /></div>
## Configuration
<a id='config-intro'></a>
### Introduction
Gramma supports both global and local configuration files. Settings are resolved in the following order of priority:
1. Command-line options
2. Local config
3. Global config
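The priority order above can be illustrated with a plain object merge - a minimal sketch of the idea, not Gramma's actual implementation (`resolveConfig` is an invented name):

```js
// Hypothetical illustration of the priority order (not Gramma's actual code).
// Later spreads win, so CLI options override the local config,
// which in turn overrides the global config.
const resolveConfig = (globalCfg, localCfg, cliOptions) => ({
  ...globalCfg,
  ...localCfg,
  ...cliOptions,
})

const merged = resolveConfig(
  { language: "en-US", api_key: "" }, // global config
  { language: "en-GB" }, // local .gramma.json
  { language: "pl-PL" }, // command-line options
)

console.log(merged.language) // "pl-PL" - the command-line option wins
```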
Gramma will automatically generate a global configuration file on the first run.
You can check the path to the global configuration file (as well as other paths used by Gramma) via the following command:
```
gramma paths
```
You can change your settings by manually editing configuration files or running:
```
gramma config <setting> <value> [-g]
```
_Note: `-g` (`--global`) flag should be used when you want to alter the global config._
<hr/>
<a id='config-local'></a>
### Local config
You can initialize local config by running the following command in your project's root directory:
```
gramma init
```
Gramma creates the local configuration file in your working directory under the name `.gramma.json`.
<hr/>
<a id='config-git'></a>
### Git integration
You can toggle Git hook via:
```
gramma hook
```
It will add/remove an entry in the `commit-msg` hook.
Gramma reads the Git configuration, so it also works with a non-standard hooks location.
<hr/>
<a id='config-checker'></a>
### Checker settings
#### Adding a word to the dictionary
Usually, you will add custom words to the local or global dictionary via the interactive menu during the fix process, but you can also do it via a separate command:
```
gramma config dictionary <your_word> [-g]
```
Examples:
```
gramma config dictionary aws
gramma config dictionary figma -g
```
#### Changing default language
```
gramma config language <language_code> [-g]
```
Examples:
```
gramma config language en-GB
gramma config language pl-PL -g
```
<a id="available-languages"></a>
<details>
<summary style="outline: none; cursor: pointer">Available languages (click to expand)</summary>
<table>
<tr>
<th>Code</th>
<th>Name</th>
<th>languagetool.org</th>
<th>grammarbot.io</th>
<th>local</th>
</tr>
<tr>
<td><tt style="white-space: pre;">auto</tt></td>
<td>automatic language detection</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<!--LANG-->
<tr>
<td><tt style="white-space: pre;">ar</tt></td>
<td>Arabic</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">ast-ES</tt></td>
<td>Asturian</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">be-BY</tt></td>
<td>Belarusian</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">br-FR</tt></td>
<td>Breton</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">ca-ES</tt></td>
<td>Catalan</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">ca-ES-valencia</tt></td>
<td>Catalan (Valencian)</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">zh-CN</tt></td>
<td>Chinese</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">da-DK</tt></td>
<td>Danish</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">nl</tt></td>
<td>Dutch</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">nl-BE</tt></td>
<td>Dutch (Belgium)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">en</tt></td>
<td>English</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">en-AU</tt></td>
<td>English (Australian)</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">en-CA</tt></td>
<td>English (Canadian)</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">en-GB</tt></td>
<td>English (GB)</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">en-NZ</tt></td>
<td>English (New Zealand)</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">en-ZA</tt></td>
<td>English (South African)</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">en-US</tt></td>
<td>English (US)</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">eo</tt></td>
<td>Esperanto</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">fr</tt></td>
<td>French</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">gl-ES</tt></td>
<td>Galician</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">de</tt></td>
<td>German</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">de-AT</tt></td>
<td>German (Austria)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">de-DE</tt></td>
<td>German (Germany)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">de-CH</tt></td>
<td>German (Swiss)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">el-GR</tt></td>
<td>Greek</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">ga-IE</tt></td>
<td>Irish</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">it</tt></td>
<td>Italian</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">ja-JP</tt></td>
<td>Japanese</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">km-KH</tt></td>
<td>Khmer</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">fa</tt></td>
<td>Persian</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">pl-PL</tt></td>
<td>Polish</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">pt</tt></td>
<td>Portuguese</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">pt-AO</tt></td>
<td>Portuguese (Angola preAO)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">pt-BR</tt></td>
<td>Portuguese (Brazil)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">pt-MZ</tt></td>
<td>Portuguese (Moçambique preAO)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">pt-PT</tt></td>
<td>Portuguese (Portugal)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">ro-RO</tt></td>
<td>Romanian</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">ru-RU</tt></td>
<td>Russian</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">de-DE-x-simple-language</tt></td>
<td>Simple German</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">sk-SK</tt></td>
<td>Slovak</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">sl-SI</tt></td>
<td>Slovenian</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">es</tt></td>
<td>Spanish</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">es-AR</tt></td>
<td>Spanish (voseo)</td>
<td>✔</td>
<td>-</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">sv</tt></td>
<td>Swedish</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">tl-PH</tt></td>
<td>Tagalog</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">ta-IN</tt></td>
<td>Tamil</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<tr>
<td><tt style="white-space: pre;">uk-UA</tt></td>
<td>Ukrainian</td>
<td>✔</td>
<td>✔</td>
<td>✔</td>
</tr>
<!--/LANG-->
</table>
</details>
_Note: By default, Gramma uses US English (`en-US`)._
#### Enabling and disabling rules
Enabling a specific rule:
```
gramma config enable <rule_name> [-g]
```
Disabling a specific rule:
```
gramma config disable <rule_name> [-g]
```
Examples:
```
gramma config enable punctuation
gramma config enable casing -g
gramma config disable typography
gramma config disable style -g
```
<a id="available-rules"></a>
<details>
<summary style="outline: none; cursor: pointer">Available rules (click to expand)</summary>
<table>
<tr><th>Rule</th><th>Description</th></tr>
<tr><td><tt style="white-space: pre;">casing</tt></td><td>Rules about detecting uppercase words where lowercase is required and vice versa.</td></tr>
<tr><td><tt style="white-space: pre;">colloquialisms</tt></td><td>Colloquial style.</td></tr>
<tr><td><tt style="white-space: pre;">compounding</tt></td><td>Rules about spelling terms as one word or as separate words.</td></tr>
<tr><td><tt style="white-space: pre;">confused_words</tt></td><td>Words that are easily confused, like 'there' and 'their' in English.</td></tr>
<tr><td><tt style="white-space: pre;">false_friends</tt></td><td>False friends: words easily confused by language learners because a similar word exists in their native language.</td></tr>
<tr><td><tt style="white-space: pre;">gender_neutrality</tt></td><td>Helps to ensure gender-neutral terms.</td></tr>
<tr><td><tt style="white-space: pre;">grammar</tt></td><td>Basic grammar check.</td></tr>
<tr><td><tt style="white-space: pre;">misc</tt></td><td>Miscellaneous rules that don't fit elsewhere.</td></tr>
<tr><td><tt style="white-space: pre;">punctuation</tt></td><td>Punctuation mistakes.</td></tr>
<tr><td><tt style="white-space: pre;">redundancy</tt></td><td>Redundant words.</td></tr>
<tr><td><tt style="white-space: pre;">regionalisms</tt></td><td>Regionalisms: words used only in another language variant or used with different meanings.</td></tr>
<tr><td><tt style="white-space: pre;">repetitions</tt></td><td>Repeated words.</td></tr>
<tr><td><tt style="white-space: pre;">semantics</tt></td><td>Logic, content, and consistency problems.</td></tr>
<tr><td><tt style="white-space: pre;">style</tt></td><td>General style issues not covered by other categories, like overly verbose wording.</td></tr>
<tr><td><tt style="white-space: pre;">typography</tt></td><td>Problems like incorrectly used dash or quote characters.</td></tr>
<tr><td><tt style="white-space: pre;">typos</tt></td><td>Spelling issues.</td></tr>
</table>
</details>
_Note: By default, all rules are enabled._
<hr/>
<a id='config-server'></a>
### Customizing API server
#### Defining custom API endpoint
If you want to use a remote LanguageTool server, or one already installed on your system (not installed via `gramma server install`), you can define a custom API endpoint:
```
gramma config api_url <custom_api_endpoint> [-g]
```
Examples:
```
gramma config api_url https://my-custom-api-url.xyz/v2/check
gramma config api_url http://localhost:8081/v2/check -g
```
#### Running local server only when needed
If you do not want the local server to run all the time, you can configure Gramma to run it only when needed (`run → check → close`). This is useful when you run Gramma only occasionally and want to lower memory consumption:
```
gramma config server_once true -g
```
Revert:
```
gramma config server_once false -g
```
#### Adding API key
If you use a paid option on [grammarbot.io](https://www.grammarbot.io/) or [languagetool.org](https://languagetool.org), you will receive an API key that you can use in Gramma:
```
gramma config api_key <your_api_key> [-g]
```
<hr/>
<a id='config-security'></a>
### Security
If you need to store sensitive data in your local config file (an API key, etc.), you can use environment variables directly in the config file (`.env` files are supported).
Example:
```json
{
  "api_url": "https://my-language-tool-api.com/v2/check",
  "api_key": "${MY_ENV_VARIABLE}",
  ...other_settings
}
```
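A minimal sketch of how such `${...}` placeholders can be expanded from the environment (an assumed illustration; `expandEnv` is an invented name, not Gramma's actual implementation):

```js
// Hypothetical sketch: replace ${VAR_NAME} placeholders in config values
// with values from process.env (empty string when the variable is unset).
const expandEnv = (value) =>
  value.replace(/\$\{(\w+)\}/g, (_, name) => process.env[name] ?? "")

process.env.MY_ENV_VARIABLE = "secret-key"
console.log(expandEnv("${MY_ENV_VARIABLE}")) // "secret-key"
```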
_Note: The default API (`api.languagetool.org`) is generally [safe and does not store your texts](https://languagetool.org/pl/legal/privacy), but if you want to be extra careful, you should use a [local server](#installation-server) or custom API endpoint._
<a id='server'></a>
<div><img src="assets/divider.png" width="838" alt="---" class="divider" /></div>
## Managing a local server
If you have [configured a local server](#installation-server), Gramma will manage it automatically - nevertheless, there may be situations where you want to manage the server manually. Gramma simplifies this by exposing basic server commands:
#### Starting the server
```
gramma server start
```
You can also specify a custom port:
```
gramma server start --port <port_number>
```
_Note: When you use this command, Gramma will ignore the `server_once` config option. This is expected behavior - I assume that if you use this command, you want the server to actually run, not stop after the first check._
#### Stopping the server
```
gramma server stop
```
#### Getting the server info
```
gramma server info
```
#### Getting the server PID
```
gramma server pid
```
_Note: You can use `gramma server info` instead - this command is kept for backward compatibility._
#### Opening the built-in GUI
```
gramma server gui
```
<a id='js'></a>
<div><img src="assets/divider.png" width="838" alt="---" class="divider" /></div>
## JS API
In addition to command-line usage, you can use the two exposed methods if you want to handle mistakes yourself.
#### Imports
If you use Node.js or a bundler for your browser build, you can use CommonJS or ESM imports:
```js
const gramma = require("gramma")
```
```js
import gramma from "gramma"
```
If you don't use a bundler and want to use gramma in the browser, there are prebuilt bundles in the [/bundle](https://github.com/caderek/gramma/tree/master/bundle) directory:
- `gramma.esm.js` - ES Modules bundle
- `gramma.esm.min.js` - minified ES Modules bundle
- `gramma.min.js` - IIFE bundle exposing global `gramma` variable
You can also import the ESM bundle directly from a CDN:
```html
<script type="module">
  import gramma from "https://cdn.skypack.dev/gramma"
</script>
```
<hr/>
#### check() method
Returns a promise with a check result.
```js
const gramma = require("gramma")
gramma.check("Some text to check.").then(console.log)
```
You can also pass a second argument - an options object. Available options:
- `api_url` - URL of a non-default API server
- `api_key` - server API key
- `dictionary` - an array of words that should be whitelisted
- `language` - language code to specify the text language
- `rules` - object defining which rules should be disabled
<details>
<summary style="outline: none; cursor: pointer">Default options object (click to expand)</summary>
<pre>
{
"api_url": "https://api.languagetool.org/v2/check",
"api_key": "",
"dictionary": [],
"language": "en-US",
"rules": {
"casing": true,
"colloquialisms": true,
"compounding": true,
"confused_words": true,
"false_friends": true,
"gender_neutrality": true,
"grammar": true,
"misc": true,
"punctuation": true,
"redundancy": true,
"regionalisms": true,
"repetitions": true,
"semantics": true,
"style": true,
"typography": true,
"typos": true
}
}
</pre>
</details>
You can find all available values for each setting in the [configuration section](#config) of this document.
Example with all options set:
```js
const gramma = require("gramma")
gramma
.check("Some text to check.", {
api_url: "http://my-custom-language-tool-server.xyz/v2/check",
api_key: "SOME_API_KEY",
dictionary: ["npm", "gramma"],
language: "pl-PL",
rules: {
typography: false,
casing: false,
},
})
.then(console.log)
```
<hr/>
#### replaceAll() method
Replaces fragments of the text with the provided ones. It takes the text and an array of replacement objects in the following format:
```js
const exampleReplacements = [
{ offset: 6, length: 3, change: "correct phrase" },
{ offset: 20, length: 7, change: "another phrase" },
]
```
You can find proper `offset` and `length` values in the object returned by the `check()` method.
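To make the `offset`/`length` semantics concrete, here is a minimal, self-contained sketch (not Gramma's actual implementation) of how such replacements can be applied - processing them from the end of the text backwards, so earlier offsets stay valid after the string's length changes:

```javascript
// Applies { offset, length, change } replacements to a string.
// Sorting by descending offset means a replacement never shifts
// the offsets of the replacements applied after it.
const applyReplacements = (text, replacements) =>
  [...replacements]
    .sort((a, b) => b.offset - a.offset)
    .reduce(
      (result, { offset, length, change }) =>
        result.slice(0, offset) + change + result.slice(offset + length),
      text,
    )

const fixed = applyReplacements("Wrong word here", [
  { offset: 0, length: 5, change: "Right" },
  { offset: 11, length: 4, change: "there" },
])

console.log(fixed) // "Right word there"
```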
Example usage:
```js
const gramma = require("gramma")
/* Your custom function */
const prepareReplacements = (matches) => {
// your code...
}
const fix = async (text) => {
const { matches } = await gramma.check(text)
const replacements = prepareReplacements(matches)
return gramma.replaceAll(text, replacements)
}
const main = async () => {
const correctText = await fix("Some text to check")
console.log(correctText)
}
main()
```
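As a starting point for `prepareReplacements`, here is one possible sketch that simply picks the first suggestion of each match. The match shape below (`offset`, `length`, and a `replacements` array of `{ value }` objects) follows the LanguageTool API that `check()` wraps; skipping matches without suggestions is this sketch's own choice, not something Gramma requires:

```javascript
// One possible strategy: take the first suggestion for each match,
// and skip matches that have no suggestions at all.
const prepareReplacements = (matches) =>
  matches
    .filter((match) => match.replacements.length > 0)
    .map((match) => ({
      offset: match.offset,
      length: match.length,
      change: match.replacements[0].value,
    }))

// Example with hand-written matches in the LanguageTool shape:
const replacements = prepareReplacements([
  { offset: 6, length: 3, replacements: [{ value: "cat" }, { value: "hat" }] },
  { offset: 20, length: 7, replacements: [] },
])

console.log(replacements) // [{ offset: 6, length: 3, change: "cat" }]
```

Picking the first suggestion blindly is fine for a demo, but for real text you may want to rank suggestions or ask the user, as Gramma's interactive mode does.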
<a id='license'></a>
<div><img src="assets/divider.png" width="838" alt="---" class="divider" /></div>
## License
The project is under open, non-restrictive [ISC license](https://github.com/caderek/gramma/blob/master/LICENSE.md).
================================================
FILE: _config.yml
================================================
theme: jekyll-theme-cayman
================================================
FILE: _layouts/default.html
================================================
<!DOCTYPE html>
<html lang="{{ site.lang | default: "en-US" }}">
<head>
{% if site.google_analytics %}
<script async src="https://www.googletagmanager.com/gtag/js?id={{ site.google_analytics }}"></script>
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', '{{ site.google_analytics }}');
</script>
{% endif %}
<meta charset="UTF-8">
<!-- Primary Meta Tags -->
<title>Gramma - command-line grammar checker</title>
<meta name="title" content="Gramma - command-line grammar checker">
<meta name="description" content="Advanced, multilingual grammar checks interactively in your terminal!">
<!-- Open Graph / Facebook -->
<meta property="og:type" content="website">
<meta property="og:url" content="https://caderek.github.io/gramma/">
<meta property="og:title" content="Gramma - command-line grammar checker">
<meta property="og:description" content="Advanced, multilingual grammar checks interactively in your terminal!">
<meta property="og:image" content="https://caderek.github.io/gramma/assets/banner.png">
<!-- Twitter -->
<meta property="twitter:card" content="summary_large_image">
<meta property="twitter:url" content="https://caderek.github.io/gramma/">
<meta property="twitter:title" content="Gramma - command-line grammar checker">
<meta property="twitter:description" content="Advanced, multilingual grammar checks interactively in your terminal!">
<meta property="twitter:image" content="https://caderek.github.io/gramma/assets/banner.png">
<meta name="viewport" content="width=device-width, initial-scale=1">
<meta name="theme-color" content="#157878">
<meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
<link rel="stylesheet" href="{{ '/assets/css/style.css?v=' | append: site.github.build_revision | relative_url }}">
<!-- <link rel="stylesheet" href="/assets/css/style.css"> -->
<link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/devicons/devicon@v2.14.0/devicon.min.css">
</head>
<body>
<div class="version">v1.6.0</div>
<header class="page-header" role="banner">
<a href="https://github.com/caderek/gramma" class="download__link">
<div class="download">
<i class="devicon-github-original"></i>
<p>View on GitHub</p>
</div>
</a>
<a href="https://www.npmjs.com/package/gramma" class="download__link">
<div class="download">
<i class="devicon-npm-original-wordmark"></i>
<p>View on NPM</p>
</div>
</a>
<a href="https://github.com/caderek/gramma/releases/download/v1.6.0/gramma-linux64-v1.6.0.zip" class="download__link">
<div class="download">
<i class="devicon-linux-plain"></i>
<p>Download for Linux</p>
</div>
</a>
<a href="https://github.com/caderek/gramma/releases/download/v1.6.0/gramma-windows64-v1.6.0.zip" class="download__link">
<div class="download">
<i class="devicon-windows8-original"></i>
<p>Download for Windows</p>
</div>
</a>
<a href="https://github.com/caderek/gramma/releases/download/v1.6.0/gramma-macos-v1.6.0.zip" class="download__link">
<div class="download">
<i class="devicon-apple-original"></i>
<p>Download for macOS</p>
</div>
</a>
</header>
<main id="content" class="main-content" role="main">
<div class="actions">
<a class="github-button" href="https://github.com/caderek/gramma" data-icon="octicon-star" data-size="large" data-show-count="true" aria-label="Star caderek/gramma on GitHub">Star</a>
<a class="github-button" href="https://github.com/caderek/gramma/subscription" data-icon="octicon-eye" data-size="large" data-show-count="true" aria-label="Watch caderek/gramma on GitHub">Watch</a>
<a class="github-button" href="https://github.com/caderek/gramma/issues" data-icon="octicon-issue-opened" data-size="large" data-show-count="true" aria-label="Issue caderek/gramma on GitHub">Issue</a>
</div>
{{ content }}
<div class="actions actions--bottom">
<a class="github-button" href="https://github.com/caderek/gramma" data-icon="octicon-star" data-size="large" data-show-count="true" aria-label="Star caderek/gramma on GitHub">Star</a>
<a class="github-button" href="https://github.com/caderek/gramma/subscription" data-icon="octicon-eye" data-size="large" data-show-count="true" aria-label="Watch caderek/gramma on GitHub">Watch</a>
<a class="github-button" href="https://github.com/caderek/gramma/issues" data-icon="octicon-issue-opened" data-size="large" data-show-count="true" aria-label="Issue caderek/gramma on GitHub">Issue</a>
</div>
<footer class="site-footer">
{% if site.github.is_project_page %}
<span class="site-footer-owner"><a href="{{ site.github.repository_url }}">{{ site.github.repository_name }}</a> is maintained by <a href="{{ site.github.owner_url }}">{{ site.github.owner_name }}</a>.</span>
{% endif %}
<span class="site-footer-credits">This page was generated by <a href="https://pages.github.com">GitHub Pages</a>.</span>
</footer>
</main>
<script async defer src="https://buttons.github.io/buttons.js"></script>
</body>
</html>
================================================
FILE: assets/css/style.scss
================================================
---
---
@import "{{ site.theme }}";
body {
margin: 0;
}
.page-header {
color: #fff;
text-align: center;
background-color: #0081b8;
background-image: linear-gradient(120deg, #0081b8, #b8e045);
padding: 15px;
}
.main-content h1,
.main-content h4,
.main-content h5,
.main-content h6 {
color: #0f9250;
}
.main-content h2 {
color: white;
background: linear-gradient(to right, #0081b8, #b8e045);
padding: 5px 10px;
margin-top: 50px;
}
.main-content h3 {
color: black;
background: linear-gradient(to right, #b5ddee, #e8f3c7);
padding: 5px 10px;
margin-top: 30px;
}
.project-tagline {
font-family: monospace;
}
.download {
display: inline-block;
padding: 10px;
opacity: 0.8;
width: 200px;
text-align: center;
margin-top: 10px;
cursor: pointer;
transition-duration: 0.2s;
}
.download:hover {
opacity: 1;
}
.download i {
font-size: 50px;
margin: 10px;
}
.download__link,
.download__link:visited {
text-decoration: none;
color: white;
outline: none;
}
.download__link:hover {
text-decoration: none;
color: white;
}
@media only screen and (max-width: 640px) {
.download {
display: block;
width: auto;
height: auto;
text-align: left;
margin: 0;
}
.download i {
font-size: 30px;
margin: 0 10px;
}
p {
display: initial;
position: relative;
top: -8px;
}
}
.divider {
display: none;
}
.version {
color: white;
position: absolute;
top: 10px;
right: 15px;
font-size: 20px;
opacity: 0.8;
}
.actions {
text-align: center;
padding: 0 10px 30px 10px;
}
.actions--bottom {
padding: 30px 10px 0 10px;
}
.site-footer {
text-align: center;
}
================================================
FILE: bundle/gramma.esm.js
================================================
var __commonJS = (cb, mod) => function __require() {
return mod || (0, cb[Object.keys(cb)[0]])((mod = { exports: {} }).exports, mod), mod.exports;
};
// node_modules/whatwg-fetch/dist/fetch.umd.js
var require_fetch_umd = __commonJS({
"node_modules/whatwg-fetch/dist/fetch.umd.js"(exports, module) {
(function(global, factory) {
typeof exports === "object" && typeof module !== "undefined" ? factory(exports) : typeof define === "function" && define.amd ? define(["exports"], factory) : factory(global.WHATWGFetch = {});
})(exports, function(exports2) {
"use strict";
var global = typeof globalThis !== "undefined" && globalThis || typeof self !== "undefined" && self || typeof global !== "undefined" && global;
var support = {
searchParams: "URLSearchParams" in global,
iterable: "Symbol" in global && "iterator" in Symbol,
blob: "FileReader" in global && "Blob" in global && function() {
try {
new Blob();
return true;
} catch (e) {
return false;
}
}(),
formData: "FormData" in global,
arrayBuffer: "ArrayBuffer" in global
};
function isDataView(obj) {
return obj && DataView.prototype.isPrototypeOf(obj);
}
if (support.arrayBuffer) {
var viewClasses = [
"[object Int8Array]",
"[object Uint8Array]",
"[object Uint8ClampedArray]",
"[object Int16Array]",
"[object Uint16Array]",
"[object Int32Array]",
"[object Uint32Array]",
"[object Float32Array]",
"[object Float64Array]"
];
var isArrayBufferView = ArrayBuffer.isView || function(obj) {
return obj && viewClasses.indexOf(Object.prototype.toString.call(obj)) > -1;
};
}
function normalizeName(name) {
if (typeof name !== "string") {
name = String(name);
}
if (/[^a-z0-9\-#$%&'*+.^_`|~!]/i.test(name) || name === "") {
throw new TypeError('Invalid character in header field name: "' + name + '"');
}
return name.toLowerCase();
}
function normalizeValue(value) {
if (typeof value !== "string") {
value = String(value);
}
return value;
}
function iteratorFor(items) {
var iterator = {
next: function() {
var value = items.shift();
return { done: value === void 0, value };
}
};
if (support.iterable) {
iterator[Symbol.iterator] = function() {
return iterator;
};
}
return iterator;
}
function Headers(headers) {
this.map = {};
if (headers instanceof Headers) {
headers.forEach(function(value, name) {
this.append(name, value);
}, this);
} else if (Array.isArray(headers)) {
headers.forEach(function(header) {
this.append(header[0], header[1]);
}, this);
} else if (headers) {
Object.getOwnPropertyNames(headers).forEach(function(name) {
this.append(name, headers[name]);
}, this);
}
}
Headers.prototype.append = function(name, value) {
name = normalizeName(name);
value = normalizeValue(value);
var oldValue = this.map[name];
this.map[name] = oldValue ? oldValue + ", " + value : value;
};
Headers.prototype["delete"] = function(name) {
delete this.map[normalizeName(name)];
};
Headers.prototype.get = function(name) {
name = normalizeName(name);
return this.has(name) ? this.map[name] : null;
};
Headers.prototype.has = function(name) {
return this.map.hasOwnProperty(normalizeName(name));
};
Headers.prototype.set = function(name, value) {
this.map[normalizeName(name)] = normalizeValue(value);
};
Headers.prototype.forEach = function(callback, thisArg) {
for (var name in this.map) {
if (this.map.hasOwnProperty(name)) {
callback.call(thisArg, this.map[name], name, this);
}
}
};
Headers.prototype.keys = function() {
var items = [];
this.forEach(function(value, name) {
items.push(name);
});
return iteratorFor(items);
};
Headers.prototype.values = function() {
var items = [];
this.forEach(function(value) {
items.push(value);
});
return iteratorFor(items);
};
Headers.prototype.entries = function() {
var items = [];
this.forEach(function(value, name) {
items.push([name, value]);
});
return iteratorFor(items);
};
if (support.iterable) {
Headers.prototype[Symbol.iterator] = Headers.prototype.entries;
}
function consumed(body) {
if (body.bodyUsed) {
return Promise.reject(new TypeError("Already read"));
}
body.bodyUsed = true;
}
function fileReaderReady(reader) {
return new Promise(function(resolve, reject) {
reader.onload = function() {
resolve(reader.result);
};
reader.onerror = function() {
reject(reader.error);
};
});
}
function readBlobAsArrayBuffer(blob) {
var reader = new FileReader();
var promise = fileReaderReady(reader);
reader.readAsArrayBuffer(blob);
return promise;
}
function readBlobAsText(blob) {
var reader = new FileReader();
var promise = fileReaderReady(reader);
reader.readAsText(blob);
return promise;
}
function readArrayBufferAsText(buf) {
var view = new Uint8Array(buf);
var chars = new Array(view.length);
for (var i = 0; i < view.length; i++) {
chars[i] = String.fromCharCode(view[i]);
}
return chars.join("");
}
function bufferClone(buf) {
if (buf.slice) {
return buf.slice(0);
} else {
var view = new Uint8Array(buf.byteLength);
view.set(new Uint8Array(buf));
return view.buffer;
}
}
function Body() {
this.bodyUsed = false;
this._initBody = function(body) {
this.bodyUsed = this.bodyUsed;
this._bodyInit = body;
if (!body) {
this._bodyText = "";
} else if (typeof body === "string") {
this._bodyText = body;
} else if (support.blob && Blob.prototype.isPrototypeOf(body)) {
this._bodyBlob = body;
} else if (support.formData && FormData.prototype.isPrototypeOf(body)) {
this._bodyFormData = body;
} else if (support.searchParams && URLSearchParams.prototype.isPrototypeOf(body)) {
this._bodyText = body.toString();
} else if (support.arrayBuffer && support.blob && isDataView(body)) {
this._bodyArrayBuffer = bufferClone(body.buffer);
this._bodyInit = new Blob([this._bodyArrayBuffer]);
} else if (support.arrayBuffer && (ArrayBuffer.prototype.isPrototypeOf(body) || isArrayBufferView(body))) {
this._bodyArrayBuffer = bufferClone(body);
} else {
this._bodyText = body = Object.prototype.toString.call(body);
}
if (!this.headers.get("content-type")) {
if (typeof body === "string") {
this.headers.set("content-type", "text/plain;charset=UTF-8");
} else if (this._bodyBlob && this._bodyBlob.type) {
this.headers.set("content-type", this._bodyBlob.type);
} else if (support.searchParams && URLSearchParams.prototype.isPrototypeOf(body)) {
this.headers.set("content-type", "application/x-www-form-urlencoded;charset=UTF-8");
}
}
};
if (support.blob) {
this.blob = function() {
var rejected = consumed(this);
if (rejected) {
return rejected;
}
if (this._bodyBlob) {
return Promise.resolve(this._bodyBlob);
} else if (this._bodyArrayBuffer) {
return Promise.resolve(new Blob([this._bodyArrayBuffer]));
} else if (this._bodyFormData) {
throw new Error("could not read FormData body as blob");
} else {
return Promise.resolve(new Blob([this._bodyText]));
}
};
this.arrayBuffer = function() {
if (this._bodyArrayBuffer) {
var isConsumed = consumed(this);
if (isConsumed) {
return isConsumed;
}
if (ArrayBuffer.isView(this._bodyArrayBuffer)) {
return Promise.resolve(this._bodyArrayBuffer.buffer.slice(this._bodyArrayBuffer.byteOffset, this._bodyArrayBuffer.byteOffset + this._bodyArrayBuffer.byteLength));
} else {
return Promise.resolve(this._bodyArrayBuffer);
}
} else {
return this.blob().then(readBlobAsArrayBuffer);
}
};
}
this.text = function() {
var rejected = consumed(this);
if (rejected) {
return rejected;
}
if (this._bodyBlob) {
return readBlobAsText(this._bodyBlob);
} else if (this._bodyArrayBuffer) {
return Promise.resolve(readArrayBufferAsText(this._bodyArrayBuffer));
} else if (this._bodyFormData) {
throw new Error("could not read FormData body as text");
} else {
return Promise.resolve(this._bodyText);
}
};
if (support.formData) {
this.formData = function() {
return this.text().then(decode);
};
}
this.json = function() {
return this.text().then(JSON.parse);
};
return this;
}
var methods = ["DELETE", "GET", "HEAD", "OPTIONS", "POST", "PUT"];
function normalizeMethod(method) {
var upcased = method.toUpperCase();
return methods.indexOf(upcased) > -1 ? upcased : method;
}
function Request(input, options) {
if (!(this instanceof Request)) {
throw new TypeError('Please use the "new" operator, this DOM object constructor cannot be called as a function.');
}
options = options || {};
var body = options.body;
if (input instanceof Request) {
if (input.bodyUsed) {
throw new TypeError("Already read");
}
this.url = input.url;
this.credentials = input.credentials;
if (!options.headers) {
this.headers = new Headers(input.headers);
}
this.method = input.method;
this.mode = input.mode;
this.signal = input.signal;
if (!body && input._bodyInit != null) {
body = input._bodyInit;
input.bodyUsed = true;
}
} else {
this.url = String(input);
}
this.credentials = options.credentials || this.credentials || "same-origin";
if (options.headers || !this.headers) {
this.headers = new Headers(options.headers);
}
this.method = normalizeMethod(options.method || this.method || "GET");
this.mode = options.mode || this.mode || null;
this.signal = options.signal || this.signal;
this.referrer = null;
if ((this.method === "GET" || this.method === "HEAD") && body) {
throw new TypeError("Body not allowed for GET or HEAD requests");
}
this._initBody(body);
if (this.method === "GET" || this.method === "HEAD") {
if (options.cache === "no-store" || options.cache === "no-cache") {
var reParamSearch = /([?&])_=[^&]*/;
if (reParamSearch.test(this.url)) {
this.url = this.url.replace(reParamSearch, "$1_=" + new Date().getTime());
} else {
var reQueryString = /\?/;
this.url += (reQueryString.test(this.url) ? "&" : "?") + "_=" + new Date().getTime();
}
}
}
}
Request.prototype.clone = function() {
return new Request(this, { body: this._bodyInit });
};
function decode(body) {
var form = new FormData();
body.trim().split("&").forEach(function(bytes) {
if (bytes) {
var split = bytes.split("=");
var name = split.shift().replace(/\+/g, " ");
var value = split.join("=").replace(/\+/g, " ");
form.append(decodeURIComponent(name), decodeURIComponent(value));
}
});
return form;
}
function parseHeaders(rawHeaders) {
var headers = new Headers();
var preProcessedHeaders = rawHeaders.replace(/\r?\n[\t ]+/g, " ");
preProcessedHeaders.split("\r").map(function(header) {
return header.indexOf("\n") === 0 ? header.substr(1, header.length) : header;
}).forEach(function(line) {
var parts = line.split(":");
var key = parts.shift().trim();
if (key) {
var value = parts.join(":").trim();
headers.append(key, value);
}
});
return headers;
}
Body.call(Request.prototype);
function Response(bodyInit, options) {
if (!(this instanceof Response)) {
throw new TypeError('Please use the "new" operator, this DOM object constructor cannot be called as a function.');
}
if (!options) {
options = {};
}
this.type = "default";
this.status = options.status === void 0 ? 200 : options.status;
this.ok = this.status >= 200 && this.status < 300;
this.statusText = options.statusText === void 0 ? "" : "" + options.statusText;
this.headers = new Headers(options.headers);
this.url = options.url || "";
this._initBody(bodyInit);
}
Body.call(Response.prototype);
Response.prototype.clone = function() {
return new Response(this._bodyInit, {
status: this.status,
statusText: this.statusText,
headers: new Headers(this.headers),
url: this.url
});
};
Response.error = function() {
var response = new Response(null, { status: 0, statusText: "" });
response.type = "error";
return response;
};
var redirectStatuses = [301, 302, 303, 307, 308];
Response.redirect = function(url, status) {
if (redirectStatuses.indexOf(status) === -1) {
throw new RangeError("Invalid status code");
}
return new Response(null, { status, headers: { location: url } });
};
exports2.DOMException = global.DOMException;
try {
new exports2.DOMException();
} catch (err) {
exports2.DOMException = function(message, name) {
this.message = message;
this.name = name;
var error = Error(message);
this.stack = error.stack;
};
exports2.DOMException.prototype = Object.create(Error.prototype);
exports2.DOMException.prototype.constructor = exports2.DOMException;
}
function fetch2(input, init) {
return new Promise(function(resolve, reject) {
var request = new Request(input, init);
if (request.signal && request.signal.aborted) {
return reject(new exports2.DOMException("Aborted", "AbortError"));
}
var xhr = new XMLHttpRequest();
function abortXhr() {
xhr.abort();
}
xhr.onload = function() {
var options = {
status: xhr.status,
statusText: xhr.statusText,
headers: parseHeaders(xhr.getAllResponseHeaders() || "")
};
options.url = "responseURL" in xhr ? xhr.responseURL : options.headers.get("X-Request-URL");
var body = "response" in xhr ? xhr.response : xhr.responseText;
setTimeout(function() {
resolve(new Response(body, options));
}, 0);
};
xhr.onerror = function() {
setTimeout(function() {
reject(new TypeError("Network request failed"));
}, 0);
};
xhr.ontimeout = function() {
setTimeout(function() {
reject(new TypeError("Network request failed"));
}, 0);
};
xhr.onabort = function() {
setTimeout(function() {
reject(new exports2.DOMException("Aborted", "AbortError"));
}, 0);
};
function fixUrl(url) {
try {
return url === "" && global.location.href ? global.location.href : url;
} catch (e) {
return url;
}
}
xhr.open(request.method, fixUrl(request.url), true);
if (request.credentials === "include") {
xhr.withCredentials = true;
} else if (request.credentials === "omit") {
xhr.withCredentials = false;
}
if ("responseType" in xhr) {
if (support.blob) {
xhr.responseType = "blob";
} else if (support.arrayBuffer && request.headers.get("Content-Type") && request.headers.get("Content-Type").indexOf("application/octet-stream") !== -1) {
xhr.responseType = "arraybuffer";
}
}
if (init && typeof init.headers === "object" && !(init.headers instanceof Headers)) {
Object.getOwnPropertyNames(init.headers).forEach(function(name) {
xhr.setRequestHeader(name, normalizeValue(init.headers[name]));
});
} else {
request.headers.forEach(function(value, name) {
xhr.setRequestHeader(name, value);
});
}
if (request.signal) {
request.signal.addEventListener("abort", abortXhr);
xhr.onreadystatechange = function() {
if (xhr.readyState === 4) {
request.signal.removeEventListener("abort", abortXhr);
}
};
}
xhr.send(typeof request._bodyInit === "undefined" ? null : request._bodyInit);
});
}
fetch2.polyfill = true;
if (!global.fetch) {
global.fetch = fetch2;
global.Headers = Headers;
global.Request = Request;
global.Response = Response;
}
exports2.Headers = Headers;
exports2.Request = Request;
exports2.Response = Response;
exports2.fetch = fetch2;
Object.defineProperty(exports2, "__esModule", { value: true });
});
}
});
// node_modules/isomorphic-fetch/fetch-npm-browserify.js
var require_fetch_npm_browserify = __commonJS({
"node_modules/isomorphic-fetch/fetch-npm-browserify.js"(exports, module) {
require_fetch_umd();
module.exports = self.fetch.bind(self);
}
});
// node_modules/strict-uri-encode/index.js
var require_strict_uri_encode = __commonJS({
"node_modules/strict-uri-encode/index.js"(exports, module) {
"use strict";
module.exports = (str) => encodeURIComponent(str).replace(/[!'()*]/g, (x) => `%${x.charCodeAt(0).toString(16).toUpperCase()}`);
}
});
// node_modules/decode-uri-component/index.js
var require_decode_uri_component = __commonJS({
"node_modules/decode-uri-component/index.js"(exports, module) {
"use strict";
var token = "%[a-f0-9]{2}";
var singleMatcher = new RegExp(token, "gi");
var multiMatcher = new RegExp("(" + token + ")+", "gi");
function decodeComponents(components, split) {
try {
return decodeURIComponent(components.join(""));
} catch (err) {
}
if (components.length === 1) {
return components;
}
split = split || 1;
var left = components.slice(0, split);
var right = components.slice(split);
return Array.prototype.concat.call([], decodeComponents(left), decodeComponents(right));
}
function decode(input) {
try {
return decodeURIComponent(input);
} catch (err) {
var tokens = input.match(singleMatcher);
for (var i = 1; i < tokens.length; i++) {
input = decodeComponents(tokens, i).join("");
tokens = input.match(singleMatcher);
}
return input;
}
}
function customDecodeURIComponent(input) {
var replaceMap = {
"%FE%FF": "\uFFFD\uFFFD",
"%FF%FE": "\uFFFD\uFFFD"
};
var match = multiMatcher.exec(input);
while (match) {
try {
replaceMap[match[0]] = decodeURIComponent(match[0]);
} catch (err) {
var result = decode(match[0]);
if (result !== match[0]) {
replaceMap[match[0]] = result;
}
}
match = multiMatcher.exec(input);
}
replaceMap["%C2"] = "\uFFFD";
var entries = Object.keys(replaceMap);
for (var i = 0; i < entries.length; i++) {
var key = entries[i];
input = input.replace(new RegExp(key, "g"), replaceMap[key]);
}
return input;
}
module.exports = function(encodedURI) {
if (typeof encodedURI !== "string") {
throw new TypeError("Expected `encodedURI` to be of type `string`, got `" + typeof encodedURI + "`");
}
try {
encodedURI = encodedURI.replace(/\+/g, " ");
return decodeURIComponent(encodedURI);
} catch (err) {
return customDecodeURIComponent(encodedURI);
}
};
}
});
// node_modules/split-on-first/index.js
var require_split_on_first = __commonJS({
"node_modules/split-on-first/index.js"(exports, module) {
"use strict";
module.exports = (string, separator) => {
if (!(typeof string === "string" && typeof separator === "string")) {
throw new TypeError("Expected the arguments to be of type `string`");
}
if (separator === "") {
return [string];
}
const separatorIndex = string.indexOf(separator);
if (separatorIndex === -1) {
return [string];
}
return [
string.slice(0, separatorIndex),
string.slice(separatorIndex + separator.length)
];
};
}
});
// node_modules/filter-obj/index.js
var require_filter_obj = __commonJS({
"node_modules/filter-obj/index.js"(exports, module) {
"use strict";
module.exports = function(obj, predicate) {
var ret = {};
var keys = Object.keys(obj);
var isArr = Array.isArray(predicate);
for (var i = 0; i < keys.length; i++) {
var key = keys[i];
var val = obj[key];
if (isArr ? predicate.indexOf(key) !== -1 : predicate(key, val, obj)) {
ret[key] = val;
}
}
return ret;
};
}
});
// node_modules/query-string/index.js
var require_query_string = __commonJS({
"node_modules/query-string/index.js"(exports) {
"use strict";
var strictUriEncode = require_strict_uri_encode();
var decodeComponent = require_decode_uri_component();
var splitOnFirst = require_split_on_first();
var filterObject = require_filter_obj();
var isNullOrUndefined = (value) => value === null || value === void 0;
var encodeFragmentIdentifier = Symbol("encodeFragmentIdentifier");
function encoderForArrayFormat(options) {
switch (options.arrayFormat) {
case "index":
return (key) => (result, value) => {
const index = result.length;
if (value === void 0 || options.skipNull && value === null || options.skipEmptyString && value === "") {
return result;
}
if (value === null) {
return [...result, [encode(key, options), "[", index, "]"].join("")];
}
return [
...result,
[encode(key, options), "[", encode(index, options), "]=", encode(value, options)].join("")
];
};
case "bracket":
return (key) => (result, value) => {
if (value === void 0 || options.skipNull && value === null || options.skipEmptyString && value === "") {
return result;
}
if (value === null) {
return [...result, [encode(key, options), "[]"].join("")];
}
return [...result, [encode(key, options), "[]=", encode(value, options)].join("")];
};
case "comma":
case "separator":
case "bracket-separator": {
const keyValueSep = options.arrayFormat === "bracket-separator" ? "[]=" : "=";
return (key) => (result, value) => {
if (value === void 0 || options.skipNull && value === null || options.skipEmptyString && value === "") {
return result;
}
value = value === null ? "" : value;
if (result.length === 0) {
return [[encode(key, options), keyValueSep, encode(value, options)].join("")];
}
return [[result, encode(value, options)].join(options.arrayFormatSeparator)];
};
}
default:
return (key) => (result, value) => {
if (value === void 0 || options.skipNull && value === null || options.skipEmptyString && value === "") {
return result;
}
if (value === null) {
return [...result, encode(key, options)];
}
return [...result, [encode(key, options), "=", encode(value, options)].join("")];
};
}
}
function parserForArrayFormat(options) {
let result;
switch (options.arrayFormat) {
case "index":
return (key, value, accumulator) => {
result = /\[(\d*)\]$/.exec(key);
key = key.replace(/\[\d*\]$/, "");
if (!result) {
accumulator[key] = value;
return;
}
if (accumulator[key] === void 0) {
accumulator[key] = {};
}
accumulator[key][result[1]] = value;
};
case "bracket":
return (key, value, accumulator) => {
result = /(\[\])$/.exec(key);
key = key.replace(/\[\]$/, "");
if (!result) {
accumulator[key] = value;
return;
}
if (accumulator[key] === void 0) {
accumulator[key] = [value];
return;
}
accumulator[key] = [].concat(accumulator[key], value);
};
case "comma":
case "separator":
return (key, value, accumulator) => {
const isArray = typeof value === "string" && value.includes(options.arrayFormatSeparator);
const isEncodedArray = typeof value === "string" && !isArray && decode(value, options).includes(options.arrayFormatSeparator);
value = isEncodedArray ? decode(value, options) : value;
const newValue = isArray || isEncodedArray ? value.split(options.arrayFormatSeparator).map((item) => decode(item, options)) : value === null ? value : decode(value, options);
accumulator[key] = newValue;
};
case "bracket-separator":
return (key, value, accumulator) => {
const isArray = /(\[\])$/.test(key);
key = key.replace(/\[\]$/, "");
if (!isArray) {
accumulator[key] = value ? decode(value, options) : value;
return;
}
const arrayValue = value === null ? [] : value.split(options.arrayFormatSeparator).map((item) => decode(item, options));
if (accumulator[key] === void 0) {
accumulator[key] = arrayValue;
return;
}
accumulator[key] = [].concat(accumulator[key], arrayValue);
};
default:
return (key, value, accumulator) => {
if (accumulator[key] === void 0) {
accumulator[key] = value;
return;
}
accumulator[key] = [].concat(accumulator[key], value);
};
}
}
function validateArrayFormatSeparator(value) {
if (typeof value !== "string" || value.length !== 1) {
throw new TypeError("arrayFormatSeparator must be single character string");
}
}
function encode(value, options) {
if (options.encode) {
return options.strict ? strictUriEncode(value) : encodeURIComponent(value);
}
return value;
}
function decode(value, options) {
if (options.decode) {
return decodeComponent(value);
}
return value;
}
function keysSorter(input) {
if (Array.isArray(input)) {
return input.sort();
}
if (typeof input === "object") {
return keysSorter(Object.keys(input)).sort((a, b) => Number(a) - Number(b)).map((key) => input[key]);
}
return input;
}
function removeHash(input) {
const hashStart = input.indexOf("#");
if (hashStart !== -1) {
input = input.slice(0, hashStart);
}
return input;
}
function getHash(url) {
let hash = "";
const hashStart = url.indexOf("#");
if (hashStart !== -1) {
hash = url.slice(hashStart);
}
return hash;
}
function extract(input) {
input = removeHash(input);
const queryStart = input.indexOf("?");
if (queryStart === -1) {
return "";
}
return input.slice(queryStart + 1);
}
function parseValue(value, options) {
if (options.parseNumbers && !Number.isNaN(Number(value)) && (typeof value === "string" && value.trim() !== "")) {
value = Number(value);
} else if (options.parseBooleans && value !== null && (value.toLowerCase() === "true" || value.toLowerCase() === "false")) {
value = value.toLowerCase() === "true";
}
return value;
}
function parse(query, options) {
options = Object.assign({
decode: true,
sort: true,
arrayFormat: "none",
arrayFormatSeparator: ",",
parseNumbers: false,
parseBooleans: false
}, options);
validateArrayFormatSeparator(options.arrayFormatSeparator);
const formatter = parserForArrayFormat(options);
const ret = Object.create(null);
if (typeof query !== "string") {
return ret;
}
query = query.trim().replace(/^[?#&]/, "");
if (!query) {
return ret;
}
for (const param of query.split("&")) {
if (param === "") {
continue;
}
let [key, value] = splitOnFirst(options.decode ? param.replace(/\+/g, " ") : param, "=");
value = value === void 0 ? null : ["comma", "separator", "bracket-separator"].includes(options.arrayFormat) ? value : decode(value, options);
formatter(decode(key, options), value, ret);
}
for (const key of Object.keys(ret)) {
const value = ret[key];
if (typeof value === "object" && value !== null) {
for (const k of Object.keys(value)) {
value[k] = parseValue(value[k], options);
}
} else {
ret[key] = parseValue(value, options);
}
}
if (options.sort === false) {
return ret;
}
return (options.sort === true ? Object.keys(ret).sort() : Object.keys(ret).sort(options.sort)).reduce((result, key) => {
const value = ret[key];
if (Boolean(value) && typeof value === "object" && !Array.isArray(value)) {
result[key] = keysSorter(value);
} else {
result[key] = value;
}
return result;
}, Object.create(null));
}
exports.extract = extract;
exports.parse = parse;
exports.stringify = (object, options) => {
if (!object) {
return "";
}
options = Object.assign({
encode: true,
strict: true,
arrayFormat: "none",
arrayFormatSeparator: ","
}, options);
validateArrayFormatSeparator(options.arrayFormatSeparator);
const shouldFilter = (key) => options.skipNull && isNullOrUndefined(object[key]) || options.skipEmptyString && object[key] === "";
const formatter = encoderForArrayFormat(options);
const objectCopy = {};
for (const key of Object.keys(object)) {
if (!shouldFilter(key)) {
objectCopy[key] = object[key];
}
}
const keys = Object.keys(objectCopy);
if (options.sort !== false) {
keys.sort(options.sort);
}
return keys.map((key) => {
const value = object[key];
if (value === void 0) {
return "";
}
if (value === null) {
return encode(key, options);
}
if (Array.isArray(value)) {
if (value.length === 0 && options.arrayFormat === "bracket-separator") {
return encode(key, options) + "[]";
}
return value.reduce(formatter(key), []).join("&");
}
return encode(key, options) + "=" + encode(value, options);
}).filter((x) => x.length > 0).join("&");
};
exports.parseUrl = (url, options) => {
options = Object.assign({
decode: true
}, options);
const [url_, hash] = splitOnFirst(url, "#");
return Object.assign({
url: url_.split("?")[0] || "",
query: parse(extract(url), options)
}, options && options.parseFragmentIdentifier && hash ? { fragmentIdentifier: decode(hash, options) } : {});
};
exports.stringifyUrl = (object, options) => {
options = Object.assign({
encode: true,
strict: true,
[encodeFragmentIdentifier]: true
}, options);
const url = removeHash(object.url).split("?")[0] || "";
const queryFromUrl = exports.extract(object.url);
const parsedQueryFromUrl = exports.parse(queryFromUrl, { sort: false });
const query = Object.assign(parsedQueryFromUrl, object.query);
let queryString = exports.stringify(query, options);
if (queryString) {
queryString = `?${queryString}`;
}
let hash = getHash(object.url);
if (object.fragmentIdentifier) {
hash = `#${options[encodeFragmentIdentifier] ? encode(object.fragmentIdentifier, options) : object.fragmentIdentifier}`;
}
return `${url}${queryString}${hash}`;
};
exports.pick = (input, filter, options) => {
options = Object.assign({
parseFragmentIdentifier: true,
[encodeFragmentIdentifier]: false
}, options);
const { url, query, fragmentIdentifier } = exports.parseUrl(input, options);
return exports.stringifyUrl({
url,
query: filterObject(query, filter),
fragmentIdentifier
}, options);
};
exports.exclude = (input, filter, options) => {
const exclusionFilter = Array.isArray(filter) ? (key) => !filter.includes(key) : (key, value) => !filter(key, value);
return exports.pick(input, exclusionFilter, options);
};
}
});
// data/rules.json
var require_rules = __commonJS({
"data/rules.json"(exports, module) {
module.exports = [
{
id: "CASING",
description: "Detecting uppercase words where lowercase is required and vice versa."
},
{
id: "COLLOQUIALISMS",
description: "Colloquial style."
},
{
id: "COMPOUNDING",
description: "Rules about spelling terms as one word or as separate words."
},
{
id: "CONFUSED_WORDS",
description: "Words that are easily confused, like 'there' and 'their' in English."
},
{
id: "FALSE_FRIENDS",
description: "Words easily confused by language learners because a similar word exists in their native language."
},
{
id: "GENDER_NEUTRALITY",
description: ""
},
{
id: "GRAMMAR",
description: ""
},
{
id: "MISC",
description: "Miscellaneous rules that don't fit elsewhere."
},
{
id: "PUNCTUATION",
description: ""
},
{
id: "REDUNDANCY",
description: ""
},
{
id: "REGIONALISMS",
description: "Words used only in another language variant or used with different meanings."
},
{
id: "REPETITIONS",
description: ""
},
{
id: "SEMANTICS",
description: "Logic, content, and consistency problems."
},
{
id: "STYLE",
description: "General style issues not covered by other categories, like overly verbose wording."
},
{
id: "TYPOGRAPHY",
description: "Problems like incorrectly used dash or quote characters."
},
{
id: "TYPOS",
description: "Spelling issues."
}
];
}
});
// src/validators/rules.js
var require_rules2 = __commonJS({
"src/validators/rules.js"(exports, module) {
var rules = require_rules();
var ruleOptions = rules.map((rule) => rule.id.toLowerCase());
var isRule = (value) => {
return ruleOptions.includes(value);
};
module.exports = {
ruleOptions,
isRule
};
}
});
// src/initialConfig.js
var require_initialConfig = __commonJS({
"src/initialConfig.js"(exports, module) {
var { ruleOptions } = require_rules2();
var rules = {};
ruleOptions.forEach((rule) => {
rules[rule] = true;
});
var initialConfig = {
api_url: "https://api.languagetool.org/v2/check",
api_key: "",
dictionary: [],
language: "en-US",
rules
};
module.exports = initialConfig;
}
});
// src/utils/prepareMarkdown.js
var require_prepareMarkdown = __commonJS({
"src/utils/prepareMarkdown.js"(exports) {
var __create = Object.create;
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __markAsModule = (target) => __defProp(target, "__esModule", { value: true });
var __commonJS2 = (cb, mod) => function __require() {
return mod || (0, cb[Object.keys(cb)[0]])((mod = { exports: {} }).exports, mod), mod.exports;
};
var __export = (target, all2) => {
__markAsModule(target);
for (var name in all2)
__defProp(target, name, { get: all2[name], enumerable: true });
};
var __reExport = (target, module2, desc) => {
if (module2 && typeof module2 === "object" || typeof module2 === "function") {
for (let key of __getOwnPropNames(module2))
if (!__hasOwnProp.call(target, key) && key !== "default")
__defProp(target, key, {
get: () => module2[key],
enumerable: !(desc = __getOwnPropDesc(module2, key)) || desc.enumerable
});
}
return target;
};
var __toModule = (module2) => {
return __reExport(__markAsModule(__defProp(module2 != null ? __create(__getProtoOf(module2)) : {}, "default", module2 && module2.__esModule && "default" in module2 ? { get: () => module2.default, enumerable: true } : { value: module2, enumerable: true })), module2);
};
var require_format = __commonJS2({
"node_modules/format/format.js"(exports2, module2) {
;
(function() {
var namespace;
if (typeof module2 !== "undefined") {
namespace = module2.exports = format;
} else {
namespace = function() {
return this || (1, eval)("this");
}();
}
namespace.format = format;
namespace.vsprintf = vsprintf;
if (typeof console !== "undefined" && typeof console.log === "function") {
namespace.printf = printf;
}
function printf() {
console.log(format.apply(null, arguments));
}
function vsprintf(fmt, replacements) {
return format.apply(null, [fmt].concat(replacements));
}
function format(fmt) {
var argIndex = 1, args = [].slice.call(arguments), i = 0, n = fmt.length, result = "", c, escaped = false, arg, tmp, leadingZero = false, precision, nextArg = function() {
return args[argIndex++];
}, slurpNumber = function() {
var digits = "";
while (/\d/.test(fmt[i])) {
digits += fmt[i++];
c = fmt[i];
}
return digits.length > 0 ? parseInt(digits) : null;
};
for (; i < n; ++i) {
c = fmt[i];
if (escaped) {
escaped = false;
if (c == ".") {
leadingZero = false;
c = fmt[++i];
} else if (c == "0" && fmt[i + 1] == ".") {
leadingZero = true;
i += 2;
c = fmt[i];
} else {
leadingZero = true;
}
precision = slurpNumber();
switch (c) {
case "b":
result += parseInt(nextArg(), 10).toString(2);
break;
case "c":
arg = nextArg();
if (typeof arg === "string" || arg instanceof String)
result += arg;
else
result += String.fromCharCode(parseInt(arg, 10));
break;
case "d":
result += parseInt(nextArg(), 10);
break;
case "f":
tmp = String(parseFloat(nextArg()).toFixed(precision || 6));
result += leadingZero ? tmp : tmp.replace(/^0/, "");
break;
case "j":
result += JSON.stringify(nextArg());
break;
case "o":
result += "0" + parseInt(nextArg(), 10).toString(8);
break;
case "s":
result += nextArg();
break;
case "x":
result += "0x" + parseInt(nextArg(), 10).toString(16);
break;
case "X":
result += "0x" + parseInt(nextArg(), 10).toString(16).toUpperCase();
break;
default:
result += c;
break;
}
} else if (c === "%") {
escaped = true;
} else {
result += c;
}
}
return result;
}
})();
}
});
var require_is_buffer = __commonJS2({
"node_modules/is-buffer/index.js"(exports2, module2) {
module2.exports = function isBuffer2(obj) {
return obj != null && obj.constructor != null && typeof obj.constructor.isBuffer === "function" && obj.constructor.isBuffer(obj);
};
}
});
var require_extend = __commonJS2({
"node_modules/extend/index.js"(exports2, module2) {
"use strict";
var hasOwn = Object.prototype.hasOwnProperty;
var toStr = Object.prototype.toString;
var defineProperty = Object.defineProperty;
var gOPD = Object.getOwnPropertyDescriptor;
var isArray = function isArray2(arr) {
if (typeof Array.isArray === "function") {
return Array.isArray(arr);
}
return toStr.call(arr) === "[object Array]";
};
var isPlainObject2 = function isPlainObject3(obj) {
if (!obj || toStr.call(obj) !== "[object Object]") {
return false;
}
var hasOwnConstructor = hasOwn.call(obj, "constructor");
var hasIsPrototypeOf = obj.constructor && obj.constructor.prototype && hasOwn.call(obj.constructor.prototype, "isPrototypeOf");
if (obj.constructor && !hasOwnConstructor && !hasIsPrototypeOf) {
return false;
}
var key;
for (key in obj) {
}
return typeof key === "undefined" || hasOwn.call(obj, key);
};
var setProperty = function setProperty2(target, options) {
if (defineProperty && options.name === "__proto__") {
defineProperty(target, options.name, {
enumerable: true,
configurable: true,
value: options.newValue,
writable: true
});
} else {
target[options.name] = options.newValue;
}
};
var getProperty = function getProperty2(obj, name) {
if (name === "__proto__") {
if (!hasOwn.call(obj, name)) {
return void 0;
} else if (gOPD) {
return gOPD(obj, name).value;
}
}
return obj[name];
};
module2.exports = function extend2() {
var options, name, src, copy, copyIsArray, clone;
var target = arguments[0];
var i = 1;
var length = arguments.length;
var deep = false;
if (typeof target === "boolean") {
deep = target;
target = arguments[1] || {};
i = 2;
}
if (target == null || typeof target !== "object" && typeof target !== "function") {
target = {};
}
for (; i < length; ++i) {
options = arguments[i];
if (options != null) {
for (name in options) {
src = getProperty(target, name);
copy = getProperty(options, name);
if (target !== copy) {
if (deep && copy && (isPlainObject2(copy) || (copyIsArray = isArray(copy)))) {
if (copyIsArray) {
copyIsArray = false;
clone = src && isArray(src) ? src : [];
} else {
clone = src && isPlainObject2(src) ? src : {};
}
setProperty(target, {
name,
newValue: extend2(deep, clone, copy)
});
} else if (typeof copy !== "undefined") {
setProperty(target, { name, newValue: copy });
}
}
}
}
}
return target;
};
}
});
__export(exports, {
default: () => prepareMarkdown_default
});
var defaults = {
children(node) {
return node.children;
},
annotatetextnode(node, text3) {
if (node.type === "text") {
return {
offset: {
end: node.position.end.offset,
start: node.position.start.offset
},
text: text3.substring(node.position.start.offset, node.position.end.offset)
};
} else {
return null;
}
},
interpretmarkup(text3 = "") {
return text3;
}
};
function collecttextnodes(ast, text3, options = defaults) {
const textannotations = [];
function recurse(node) {
const annotation = options.annotatetextnode(node, text3);
if (annotation !== null) {
textannotations.push(annotation);
}
const children = options.children(node);
if (children !== null && Array.isArray(children)) {
children.forEach(recurse);
}
}
recurse(ast);
return textannotations;
}
function composeannotation(text3, annotatedtextnodes, options = defaults) {
const annotations = [];
let prior = {
offset: {
end: 0,
start: 0
}
};
for (const current of annotatedtextnodes) {
const currenttext = text3.substring(prior.offset.end, current.offset.start);
annotations.push({
interpretAs: options.interpretmarkup(currenttext),
markup: currenttext,
offset: {
end: current.offset.start,
start: prior.offset.end
}
});
annotations.push(current);
prior = current;
}
const finaltext = text3.substring(prior.offset.end, text3.length);
annotations.push({
interpretAs: options.interpretmarkup(finaltext),
markup: finaltext,
offset: {
end: text3.length,
start: prior.offset.end
}
});
return { annotation: annotations };
}
function build(text3, parse3, options = defaults) {
const nodes = parse3(text3);
const textnodes = collecttextnodes(nodes, text3, options);
return composeannotation(text3, textnodes, options);
}
var import_format = __toModule(require_format());
var fault = Object.assign(create(Error), {
eval: create(EvalError),
range: create(RangeError),
reference: create(ReferenceError),
syntax: create(SyntaxError),
type: create(TypeError),
uri: create(URIError)
});
function create(Constructor) {
FormattedError.displayName = Constructor.displayName || Constructor.name;
return FormattedError;
function FormattedError(format, ...values) {
var reason = format ? (0, import_format.default)(format, ...values) : format;
return new Constructor(reason);
}
}
var own = {}.hasOwnProperty;
var markers = {
yaml: "-",
toml: "+"
};
function matters(options = "yaml") {
const results = [];
let index2 = -1;
if (!Array.isArray(options)) {
options = [options];
}
while (++index2 < options.length) {
results[index2] = matter(options[index2]);
}
return results;
}
function matter(option) {
let result = option;
if (typeof result === "string") {
if (!own.call(markers, result)) {
throw fault("Missing matter definition for `%s`", result);
}
result = {
type: result,
marker: markers[result]
};
} else if (typeof result !== "object") {
throw fault("Expected matter to be an object, not `%j`", result);
}
if (!own.call(result, "type")) {
throw fault("Missing `type` in matter `%j`", result);
}
if (!own.call(result, "fence") && !own.call(result, "marker")) {
throw fault("Missing `marker` or `fence` in matter `%j`", result);
}
return result;
}
var unicodePunctuationRegex = /[!-/:-@[-`{-~\u00A1\u00A7\u00AB\u00B6\u00B7\u00BB\u00BF\u037E\u0387\u055A-\u055F\u0589\u058A\u05BE\u05C0\u05C3\u05C6\u05F3\u05F4\u0609\u060A\u060C\u060D\u061B\u061E\u061F\u066A-\u066D\u06D4\u0700-\u070D\u07F7-\u07F9\u0830-\u083E\u085E\u0964\u0965\u0970\u09FD\u0A76\u0AF0\u0C77\u0C84\u0DF4\u0E4F\u0E5A\u0E5B\u0F04-\u0F12\u0F14\u0F3A-\u0F3D\u0F85\u0FD0-\u0FD4\u0FD9\u0FDA\u104A-\u104F\u10FB\u1360-\u1368\u1400\u166E\u169B\u169C\u16EB-\u16ED\u1735\u1736\u17D4-\u17D6\u17D8-\u17DA\u1800-\u180A\u1944\u1945\u1A1E\u1A1F\u1AA0-\u1AA6\u1AA8-\u1AAD\u1B5A-\u1B60\u1BFC-\u1BFF\u1C3B-\u1C3F\u1C7E\u1C7F\u1CC0-\u1CC7\u1CD3\u2010-\u2027\u2030-\u2043\u2045-\u2051\u2053-\u205E\u207D\u207E\u208D\u208E\u2308-\u230B\u2329\u232A\u2768-\u2775\u27C5\u27C6\u27E6-\u27EF\u2983-\u2998\u29D8-\u29DB\u29FC\u29FD\u2CF9-\u2CFC\u2CFE\u2CFF\u2D70\u2E00-\u2E2E\u2E30-\u2E4F\u2E52\u3001-\u3003\u3008-\u3011\u3014-\u301F\u3030\u303D\u30A0\u30FB\uA4FE\uA4FF\uA60D-\uA60F\uA673\uA67E\uA6F2-\uA6F7\uA874-\uA877\uA8CE\uA8CF\uA8F8-\uA8FA\uA8FC\uA92E\uA92F\uA95F\uA9C1-\uA9CD\uA9DE\uA9DF\uAA5C-\uAA5F\uAADE\uAADF\uAAF0\uAAF1\uABEB\uFD3E\uFD3F\uFE10-\uFE19\uFE30-\uFE52\uFE54-\uFE61\uFE63\uFE68\uFE6A\uFE6B\uFF01-\uFF03\uFF05-\uFF0A\uFF0C-\uFF0F\uFF1A\uFF1B\uFF1F\uFF20\uFF3B-\uFF3D\uFF3F\uFF5B\uFF5D\uFF5F-\uFF65]/;
var asciiAlpha = regexCheck(/[A-Za-z]/);
var asciiDigit = regexCheck(/\d/);
var asciiHexDigit = regexCheck(/[\dA-Fa-f]/);
var asciiAlphanumeric = regexCheck(/[\dA-Za-z]/);
var asciiPunctuation = regexCheck(/[!-/:-@[-`{-~]/);
var asciiAtext = regexCheck(/[#-'*+\--9=?A-Z^-~]/);
function asciiControl(code) {
return code !== null && (code < 32 || code === 127);
}
function markdownLineEndingOrSpace(code) {
return code !== null && (code < 0 || code === 32);
}
function markdownLineEnding(code) {
return code !== null && code < -2;
}
function markdownSpace(code) {
return code === -2 || code === -1 || code === 32;
}
var unicodeWhitespace = regexCheck(/\s/);
var unicodePunctuation = regexCheck(unicodePunctuationRegex);
function regexCheck(regex) {
return check;
function check(code) {
return code !== null && regex.test(String.fromCharCode(code));
}
}
function frontmatter(options) {
const settings = matters(options);
const flow3 = {};
let index2 = -1;
let matter2;
let code;
while (++index2 < settings.length) {
matter2 = settings[index2];
code = fence(matter2, "open").charCodeAt(0);
if (code in flow3) {
flow3[code].push(parse(matter2));
} else {
flow3[code] = [parse(matter2)];
}
}
return {
flow: flow3
};
}
function parse(matter2) {
const name = matter2.type;
const anywhere = matter2.anywhere;
const valueType = name + "Value";
const fenceType = name + "Fence";
const sequenceType = fenceType + "Sequence";
const fenceConstruct = {
tokenize: tokenizeFence,
partial: true
};
let buffer2;
return {
tokenize: tokenizeFrontmatter,
concrete: true
};
function tokenizeFrontmatter(effects, ok, nok) {
const self2 = this;
return start;
function start(code) {
const position2 = self2.now();
if (position2.column !== 1 || !anywhere && position2.line !== 1) {
return nok(code);
}
effects.enter(name);
buffer2 = fence(matter2, "open");
return effects.attempt(fenceConstruct, afterOpeningFence, nok)(code);
}
function afterOpeningFence(code) {
buffer2 = fence(matter2, "close");
return lineEnd(code);
}
function lineStart(code) {
if (code === null || markdownLineEnding(code)) {
return lineEnd(code);
}
effects.enter(valueType);
return lineData(code);
}
function lineData(code) {
if (code === null || markdownLineEnding(code)) {
effects.exit(valueType);
return lineEnd(code);
}
effects.consume(code);
return lineData;
}
function lineEnd(code) {
if (code === null) {
return nok(code);
}
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return effects.attempt(fenceConstruct, after, lineStart);
}
function after(code) {
effects.exit(name);
return ok(code);
}
}
function tokenizeFence(effects, ok, nok) {
let bufferIndex = 0;
return start;
function start(code) {
if (code === buffer2.charCodeAt(bufferIndex)) {
effects.enter(fenceType);
effects.enter(sequenceType);
return insideSequence(code);
}
return nok(code);
}
function insideSequence(code) {
if (bufferIndex === buffer2.length) {
effects.exit(sequenceType);
if (markdownSpace(code)) {
effects.enter("whitespace");
return insideWhitespace(code);
}
return fenceEnd(code);
}
if (code === buffer2.charCodeAt(bufferIndex++)) {
effects.consume(code);
return insideSequence;
}
return nok(code);
}
function insideWhitespace(code) {
if (markdownSpace(code)) {
effects.consume(code);
return insideWhitespace;
}
effects.exit("whitespace");
return fenceEnd(code);
}
function fenceEnd(code) {
if (code === null || markdownLineEnding(code)) {
effects.exit(fenceType);
return ok(code);
}
return nok(code);
}
}
}
function fence(matter2, prop) {
return matter2.marker ? pick(matter2.marker, prop).repeat(3) : pick(matter2.fence, prop);
}
function pick(schema, prop) {
return typeof schema === "string" ? schema : schema[prop];
}
function frontmatterFromMarkdown(options) {
const settings = matters(options);
const enter = {};
const exit2 = {};
let index2 = -1;
while (++index2 < settings.length) {
const matter2 = settings[index2];
enter[matter2.type] = opener(matter2);
exit2[matter2.type] = close;
exit2[matter2.type + "Value"] = value;
}
return { enter, exit: exit2 };
}
function opener(matter2) {
return open;
function open(token) {
this.enter({ type: matter2.type, value: "" }, token);
this.buffer();
}
}
function close(token) {
const data = this.resume();
this.exit(token).value = data.replace(/^(\r?\n|\r)|(\r?\n|\r)$/g, "");
}
function value(token) {
this.config.enter.data.call(this, token);
this.config.exit.data.call(this, token);
}
function frontmatterToMarkdown(options) {
const unsafe = [];
const handlers = {};
const settings = matters(options);
let index2 = -1;
while (++index2 < settings.length) {
const matter2 = settings[index2];
handlers[matter2.type] = handler(matter2);
unsafe.push({ atBreak: true, character: fence2(matter2, "open").charAt(0) });
}
return { unsafe, handlers };
}
function handler(matter2) {
const open = fence2(matter2, "open");
const close2 = fence2(matter2, "close");
return handle;
function handle(node) {
return open + (node.value ? "\n" + node.value : "") + "\n" + close2;
}
}
function fence2(matter2, prop) {
return matter2.marker ? pick2(matter2.marker, prop).repeat(3) : pick2(matter2.fence, prop);
}
function pick2(schema, prop) {
return typeof schema === "string" ? schema : schema[prop];
}
function remarkFrontmatter(options = "yaml") {
const data = this.data();
add("micromarkExtensions", frontmatter(options));
add("fromMarkdownExtensions", frontmatterFromMarkdown(options));
add("toMarkdownExtensions", frontmatterToMarkdown(options));
function add(field, value2) {
const list2 = data[field] ? data[field] : data[field] = [];
list2.push(value2);
}
}
function toString(node, options) {
var { includeImageAlt = true } = options || {};
return one(node, includeImageAlt);
}
function one(node, includeImageAlt) {
return node && typeof node === "object" && (node.value || (includeImageAlt ? node.alt : "") || "children" in node && all(node.children, includeImageAlt) || Array.isArray(node) && all(node, includeImageAlt)) || "";
}
function all(values, includeImageAlt) {
var result = [];
var index2 = -1;
while (++index2 < values.length) {
result[index2] = one(values[index2], includeImageAlt);
}
return result.join("");
}
function splice(list2, start, remove, items) {
const end = list2.length;
let chunkStart = 0;
let parameters;
if (start < 0) {
start = -start > end ? 0 : end + start;
} else {
start = start > end ? end : start;
}
remove = remove > 0 ? remove : 0;
if (items.length < 1e4) {
parameters = Array.from(items);
parameters.unshift(start, remove);
[].splice.apply(list2, parameters);
} else {
if (remove)
[].splice.apply(list2, [start, remove]);
while (chunkStart < items.length) {
parameters = items.slice(chunkStart, chunkStart + 1e4);
parameters.unshift(start, 0);
[].splice.apply(list2, parameters);
chunkStart += 1e4;
start += 1e4;
}
}
}
function push(list2, items) {
if (list2.length > 0) {
splice(list2, list2.length, 0, items);
return list2;
}
return items;
}
var hasOwnProperty = {}.hasOwnProperty;
function combineExtensions(extensions) {
const all2 = {};
let index2 = -1;
while (++index2 < extensions.length) {
syntaxExtension(all2, extensions[index2]);
}
return all2;
}
function syntaxExtension(all2, extension2) {
let hook;
for (hook in extension2) {
const maybe = hasOwnProperty.call(all2, hook) ? all2[hook] : void 0;
const left = maybe || (all2[hook] = {});
const right = extension2[hook];
let code;
for (code in right) {
if (!hasOwnProperty.call(left, code))
left[code] = [];
const value2 = right[code];
constructs(left[code], Array.isArray(value2) ? value2 : value2 ? [value2] : []);
}
}
}
function constructs(existing, list2) {
let index2 = -1;
const before = [];
while (++index2 < list2.length) {
;
(list2[index2].add === "after" ? existing : before).push(list2[index2]);
}
splice(existing, 0, 0, before);
}
function factorySpace(effects, ok, type, max) {
const limit = max ? max - 1 : Number.POSITIVE_INFINITY;
let size = 0;
return start;
function start(code) {
if (markdownSpace(code)) {
effects.enter(type);
return prefix(code);
}
return ok(code);
}
function prefix(code) {
if (markdownSpace(code) && size++ < limit) {
effects.consume(code);
return prefix;
}
effects.exit(type);
return ok(code);
}
}
var content = {
tokenize: initializeContent
};
function initializeContent(effects) {
const contentStart = effects.attempt(this.parser.constructs.contentInitial, afterContentStartConstruct, paragraphInitial);
let previous2;
return contentStart;
function afterContentStartConstruct(code) {
if (code === null) {
effects.consume(code);
return;
}
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return factorySpace(effects, contentStart, "linePrefix");
}
function paragraphInitial(code) {
effects.enter("paragraph");
return lineStart(code);
}
function lineStart(code) {
const token = effects.enter("chunkText", {
contentType: "text",
previous: previous2
});
if (previous2) {
previous2.next = token;
}
previous2 = token;
return data(code);
}
function data(code) {
if (code === null) {
effects.exit("chunkText");
effects.exit("paragraph");
effects.consume(code);
return;
}
if (markdownLineEnding(code)) {
effects.consume(code);
effects.exit("chunkText");
return lineStart;
}
effects.consume(code);
return data;
}
}
var document2 = {
tokenize: initializeDocument
};
var containerConstruct = {
tokenize: tokenizeContainer
};
function initializeDocument(effects) {
const self2 = this;
const stack = [];
let continued = 0;
let childFlow;
let childToken;
let lineStartOffset;
return start;
function start(code) {
if (continued < stack.length) {
const item = stack[continued];
self2.containerState = item[1];
return effects.attempt(item[0].continuation, documentContinue, checkNewContainers)(code);
}
return checkNewContainers(code);
}
function documentContinue(code) {
continued++;
if (self2.containerState._closeFlow) {
self2.containerState._closeFlow = void 0;
if (childFlow) {
closeFlow();
}
const indexBeforeExits = self2.events.length;
let indexBeforeFlow = indexBeforeExits;
let point2;
while (indexBeforeFlow--) {
if (self2.events[indexBeforeFlow][0] === "exit" && self2.events[indexBeforeFlow][1].type === "chunkFlow") {
point2 = self2.events[indexBeforeFlow][1].end;
break;
}
}
exitContainers(continued);
let index2 = indexBeforeExits;
while (index2 < self2.events.length) {
self2.events[index2][1].end = Object.assign({}, point2);
index2++;
}
splice(self2.events, indexBeforeFlow + 1, 0, self2.events.slice(indexBeforeExits));
self2.events.length = index2;
return checkNewContainers(code);
}
return start(code);
}
function checkNewContainers(code) {
if (continued === stack.length) {
if (!childFlow) {
return documentContinued(code);
}
if (childFlow.currentConstruct && childFlow.currentConstruct.concrete) {
return flowStart(code);
}
self2.interrupt = Boolean(childFlow.currentConstruct);
}
self2.containerState = {};
return effects.check(containerConstruct, thereIsANewContainer, thereIsNoNewContainer)(code);
}
function thereIsANewContainer(code) {
if (childFlow)
closeFlow();
exitContainers(continued);
return documentContinued(code);
}
function thereIsNoNewContainer(code) {
self2.parser.lazy[self2.now().line] = continued !== stack.length;
lineStartOffset = self2.now().offset;
return flowStart(code);
}
function documentContinued(code) {
self2.containerState = {};
return effects.attempt(containerConstruct, containerContinue, flowStart)(code);
}
function containerContinue(code) {
continued++;
stack.push([self2.currentConstruct, self2.containerState]);
return documentContinued(code);
}
function flowStart(code) {
if (code === null) {
if (childFlow)
closeFlow();
exitContainers(0);
effects.consume(code);
return;
}
childFlow = childFlow || self2.parser.flow(self2.now());
effects.enter("chunkFlow", {
contentType: "flow",
previous: childToken,
_tokenizer: childFlow
});
return flowContinue(code);
}
function flowContinue(code) {
if (code === null) {
writeToChild(effects.exit("chunkFlow"), true);
exitContainers(0);
effects.consume(code);
return;
}
if (markdownLineEnding(code)) {
effects.consume(code);
writeToChild(effects.exit("chunkFlow"));
continued = 0;
self2.interrupt = void 0;
return start;
}
effects.consume(code);
return flowContinue;
}
function writeToChild(token, eof) {
const stream = self2.sliceStream(token);
if (eof)
stream.push(null);
token.previous = childToken;
if (childToken)
childToken.next = token;
childToken = token;
childFlow.defineSkip(token.start);
childFlow.write(stream);
if (self2.parser.lazy[token.start.line]) {
let index2 = childFlow.events.length;
while (index2--) {
if (childFlow.events[index2][1].start.offset < lineStartOffset && (!childFlow.events[index2][1].end || childFlow.events[index2][1].end.offset > lineStartOffset)) {
return;
}
}
const indexBeforeExits = self2.events.length;
let indexBeforeFlow = indexBeforeExits;
let seen;
let point2;
while (indexBeforeFlow--) {
if (self2.events[indexBeforeFlow][0] === "exit" && self2.events[indexBeforeFlow][1].type === "chunkFlow") {
if (seen) {
point2 = self2.events[indexBeforeFlow][1].end;
break;
}
seen = true;
}
}
exitContainers(continued);
index2 = indexBeforeExits;
while (index2 < self2.events.length) {
self2.events[index2][1].end = Object.assign({}, point2);
index2++;
}
splice(self2.events, indexBeforeFlow + 1, 0, self2.events.slice(indexBeforeExits));
self2.events.length = index2;
}
}
function exitContainers(size) {
let index2 = stack.length;
while (index2-- > size) {
const entry = stack[index2];
self2.containerState = entry[1];
entry[0].exit.call(self2, effects);
}
stack.length = size;
}
function closeFlow() {
childFlow.write([null]);
childToken = void 0;
childFlow = void 0;
self2.containerState._closeFlow = void 0;
}
}
function tokenizeContainer(effects, ok, nok) {
return factorySpace(effects, effects.attempt(this.parser.constructs.document, ok, nok), "linePrefix", this.parser.constructs.disable.null.includes("codeIndented") ? void 0 : 4);
}
function classifyCharacter(code) {
if (code === null || markdownLineEndingOrSpace(code) || unicodeWhitespace(code)) {
return 1;
}
if (unicodePunctuation(code)) {
return 2;
}
}
function resolveAll(constructs2, events, context) {
const called = [];
let index2 = -1;
while (++index2 < constructs2.length) {
const resolve = constructs2[index2].resolveAll;
if (resolve && !called.includes(resolve)) {
events = resolve(events, context);
called.push(resolve);
}
}
return events;
}
var attention = {
name: "attention",
tokenize: tokenizeAttention,
resolveAll: resolveAllAttention
};
function resolveAllAttention(events, context) {
let index2 = -1;
let open;
let group;
let text3;
let openingSequence;
let closingSequence;
let use;
let nextEvents;
let offset;
while (++index2 < events.length) {
if (events[index2][0] === "enter" && events[index2][1].type === "attentionSequence" && events[index2][1]._close) {
open = index2;
while (open--) {
if (events[open][0] === "exit" && events[open][1].type === "attentionSequence" && events[open][1]._open && context.sliceSerialize(events[open][1]).charCodeAt(0) === context.sliceSerialize(events[index2][1]).charCodeAt(0)) {
if ((events[open][1]._close || events[index2][1]._open) && (events[index2][1].end.offset - events[index2][1].start.offset) % 3 && !((events[open][1].end.offset - events[open][1].start.offset + events[index2][1].end.offset - events[index2][1].start.offset) % 3)) {
continue;
}
use = events[open][1].end.offset - events[open][1].start.offset > 1 && events[index2][1].end.offset - events[index2][1].start.offset > 1 ? 2 : 1;
const start = Object.assign({}, events[open][1].end);
const end = Object.assign({}, events[index2][1].start);
movePoint(start, -use);
movePoint(end, use);
openingSequence = {
type: use > 1 ? "strongSequence" : "emphasisSequence",
start,
end: Object.assign({}, events[open][1].end)
};
closingSequence = {
type: use > 1 ? "strongSequence" : "emphasisSequence",
start: Object.assign({}, events[index2][1].start),
end
};
text3 = {
type: use > 1 ? "strongText" : "emphasisText",
start: Object.assign({}, events[open][1].end),
end: Object.assign({}, events[index2][1].start)
};
group = {
type: use > 1 ? "strong" : "emphasis",
start: Object.assign({}, openingSequence.start),
end: Object.assign({}, closingSequence.end)
};
events[open][1].end = Object.assign({}, openingSequence.start);
events[index2][1].start = Object.assign({}, closingSequence.end);
nextEvents = [];
if (events[open][1].end.offset - events[open][1].start.offset) {
nextEvents = push(nextEvents, [
["enter", events[open][1], context],
["exit", events[open][1], context]
]);
}
nextEvents = push(nextEvents, [
["enter", group, context],
["enter", openingSequence, context],
["exit", openingSequence, context],
["enter", text3, context]
]);
nextEvents = push(nextEvents, resolveAll(context.parser.constructs.insideSpan.null, events.slice(open + 1, index2), context));
nextEvents = push(nextEvents, [
["exit", text3, context],
["enter", closingSequence, context],
["exit", closingSequence, context],
["exit", group, context]
]);
if (events[index2][1].end.offset - events[index2][1].start.offset) {
offset = 2;
nextEvents = push(nextEvents, [
["enter", events[index2][1], context],
["exit", events[index2][1], context]
]);
} else {
offset = 0;
}
splice(events, open - 1, index2 - open + 3, nextEvents);
index2 = open + nextEvents.length - offset - 2;
break;
}
}
}
}
index2 = -1;
while (++index2 < events.length) {
if (events[index2][1].type === "attentionSequence") {
events[index2][1].type = "data";
}
}
return events;
}
function tokenizeAttention(effects, ok) {
const attentionMarkers2 = this.parser.constructs.attentionMarkers.null;
const previous2 = this.previous;
const before = classifyCharacter(previous2);
let marker;
return start;
function start(code) {
effects.enter("attentionSequence");
marker = code;
return sequence(code);
}
function sequence(code) {
if (code === marker) {
effects.consume(code);
return sequence;
}
const token = effects.exit("attentionSequence");
const after = classifyCharacter(code);
const open = !after || after === 2 && before || attentionMarkers2.includes(code);
const close2 = !before || before === 2 && after || attentionMarkers2.includes(previous2);
token._open = Boolean(marker === 42 ? open : open && (before || !close2));
token._close = Boolean(marker === 42 ? close2 : close2 && (after || !open));
return ok(code);
}
}
function movePoint(point2, offset) {
point2.column += offset;
point2.offset += offset;
point2._bufferIndex += offset;
}
var autolink = {
name: "autolink",
tokenize: tokenizeAutolink
};
function tokenizeAutolink(effects, ok, nok) {
let size = 1;
return start;
function start(code) {
effects.enter("autolink");
effects.enter("autolinkMarker");
effects.consume(code);
effects.exit("autolinkMarker");
effects.enter("autolinkProtocol");
return open;
}
function open(code) {
if (asciiAlpha(code)) {
effects.consume(code);
return schemeOrEmailAtext;
}
return asciiAtext(code) ? emailAtext(code) : nok(code);
}
function schemeOrEmailAtext(code) {
return code === 43 || code === 45 || code === 46 || asciiAlphanumeric(code) ? schemeInsideOrEmailAtext(code) : emailAtext(code);
}
function schemeInsideOrEmailAtext(code) {
if (code === 58) {
effects.consume(code);
return urlInside;
}
if ((code === 43 || code === 45 || code === 46 || asciiAlphanumeric(code)) && size++ < 32) {
effects.consume(code);
return schemeInsideOrEmailAtext;
}
return emailAtext(code);
}
function urlInside(code) {
if (code === 62) {
effects.exit("autolinkProtocol");
return end(code);
}
if (code === null || code === 32 || code === 60 || asciiControl(code)) {
return nok(code);
}
effects.consume(code);
return urlInside;
}
function emailAtext(code) {
if (code === 64) {
effects.consume(code);
size = 0;
return emailAtSignOrDot;
}
if (asciiAtext(code)) {
effects.consume(code);
return emailAtext;
}
return nok(code);
}
function emailAtSignOrDot(code) {
return asciiAlphanumeric(code) ? emailLabel(code) : nok(code);
}
function emailLabel(code) {
if (code === 46) {
effects.consume(code);
size = 0;
return emailAtSignOrDot;
}
if (code === 62) {
effects.exit("autolinkProtocol").type = "autolinkEmail";
return end(code);
}
return emailValue(code);
}
function emailValue(code) {
if ((code === 45 || asciiAlphanumeric(code)) && size++ < 63) {
effects.consume(code);
return code === 45 ? emailValue : emailLabel;
}
return nok(code);
}
function end(code) {
effects.enter("autolinkMarker");
effects.consume(code);
effects.exit("autolinkMarker");
effects.exit("autolink");
return ok;
}
}
var blankLine = {
tokenize: tokenizeBlankLine,
partial: true
};
function tokenizeBlankLine(effects, ok, nok) {
return factorySpace(effects, afterWhitespace, "linePrefix");
function afterWhitespace(code) {
return code === null || markdownLineEnding(code) ? ok(code) : nok(code);
}
}
var blockQuote = {
name: "blockQuote",
tokenize: tokenizeBlockQuoteStart,
continuation: {
tokenize: tokenizeBlockQuoteContinuation
},
exit
};
function tokenizeBlockQuoteStart(effects, ok, nok) {
const self2 = this;
return start;
function start(code) {
if (code === 62) {
const state = self2.containerState;
if (!state.open) {
effects.enter("blockQuote", {
_container: true
});
state.open = true;
}
effects.enter("blockQuotePrefix");
effects.enter("blockQuoteMarker");
effects.consume(code);
effects.exit("blockQuoteMarker");
return after;
}
return nok(code);
}
function after(code) {
if (markdownSpace(code)) {
effects.enter("blockQuotePrefixWhitespace");
effects.consume(code);
effects.exit("blockQuotePrefixWhitespace");
effects.exit("blockQuotePrefix");
return ok;
}
effects.exit("blockQuotePrefix");
return ok(code);
}
}
function tokenizeBlockQuoteContinuation(effects, ok, nok) {
return factorySpace(effects, effects.attempt(blockQuote, ok, nok), "linePrefix", this.parser.constructs.disable.null.includes("codeIndented") ? void 0 : 4);
}
function exit(effects) {
effects.exit("blockQuote");
}
var characterEscape = {
name: "characterEscape",
tokenize: tokenizeCharacterEscape
};
function tokenizeCharacterEscape(effects, ok, nok) {
return start;
function start(code) {
effects.enter("characterEscape");
effects.enter("escapeMarker");
effects.consume(code);
effects.exit("escapeMarker");
return open;
}
function open(code) {
if (asciiPunctuation(code)) {
effects.enter("characterEscapeValue");
effects.consume(code);
effects.exit("characterEscapeValue");
effects.exit("characterEscape");
return ok;
}
return nok(code);
}
}
var semicolon = 59;
var element;
function decodeEntity(characters) {
var entity = "&" + characters + ";";
var char;
element = element || document.createElement("i");
element.innerHTML = entity;
char = element.textContent;
if (char.charCodeAt(char.length - 1) === semicolon && characters !== "semi") {
return false;
}
return char === entity ? false : char;
}
var characterReference = {
name: "characterReference",
tokenize: tokenizeCharacterReference
};
function tokenizeCharacterReference(effects, ok, nok) {
const self2 = this;
let size = 0;
let max;
let test;
return start;
function start(code) {
effects.enter("characterReference");
effects.enter("characterReferenceMarker");
effects.consume(code);
effects.exit("characterReferenceMarker");
return open;
}
function open(code) {
if (code === 35) {
effects.enter("characterReferenceMarkerNumeric");
effects.consume(code);
effects.exit("characterReferenceMarkerNumeric");
return numeric;
}
effects.enter("characterReferenceValue");
max = 31;
test = asciiAlphanumeric;
return value2(code);
}
function numeric(code) {
if (code === 88 || code === 120) {
effects.enter("characterReferenceMarkerHexadecimal");
effects.consume(code);
effects.exit("characterReferenceMarkerHexadecimal");
effects.enter("characterReferenceValue");
max = 6;
test = asciiHexDigit;
return value2;
}
effects.enter("characterReferenceValue");
max = 7;
test = asciiDigit;
return value2(code);
}
function value2(code) {
let token;
if (code === 59 && size) {
token = effects.exit("characterReferenceValue");
if (test === asciiAlphanumeric && !decodeEntity(self2.sliceSerialize(token))) {
return nok(code);
}
effects.enter("characterReferenceMarker");
effects.consume(code);
effects.exit("characterReferenceMarker");
effects.exit("characterReference");
return ok;
}
if (test(code) && size++ < max) {
effects.consume(code);
return value2;
}
return nok(code);
}
}
var codeFenced = {
name: "codeFenced",
tokenize: tokenizeCodeFenced,
concrete: true
};
function tokenizeCodeFenced(effects, ok, nok) {
const self2 = this;
const closingFenceConstruct = {
tokenize: tokenizeClosingFence,
partial: true
};
const nonLazyLine = {
tokenize: tokenizeNonLazyLine,
partial: true
};
const tail = this.events[this.events.length - 1];
const initialPrefix = tail && tail[1].type === "linePrefix" ? tail[2].sliceSerialize(tail[1], true).length : 0;
let sizeOpen = 0;
let marker;
return start;
function start(code) {
effects.enter("codeFenced");
effects.enter("codeFencedFence");
effects.enter("codeFencedFenceSequence");
marker = code;
return sequenceOpen(code);
}
function sequenceOpen(code) {
if (code === marker) {
effects.consume(code);
sizeOpen++;
return sequenceOpen;
}
effects.exit("codeFencedFenceSequence");
return sizeOpen < 3 ? nok(code) : factorySpace(effects, infoOpen, "whitespace")(code);
}
function infoOpen(code) {
if (code === null || markdownLineEnding(code)) {
return openAfter(code);
}
effects.enter("codeFencedFenceInfo");
effects.enter("chunkString", {
contentType: "string"
});
return info(code);
}
function info(code) {
if (code === null || markdownLineEndingOrSpace(code)) {
effects.exit("chunkString");
effects.exit("codeFencedFenceInfo");
return factorySpace(effects, infoAfter, "whitespace")(code);
}
if (code === 96 && code === marker)
return nok(code);
effects.consume(code);
return info;
}
function infoAfter(code) {
if (code === null || markdownLineEnding(code)) {
return openAfter(code);
}
effects.enter("codeFencedFenceMeta");
effects.enter("chunkString", {
contentType: "string"
});
return meta(code);
}
function meta(code) {
if (code === null || markdownLineEnding(code)) {
effects.exit("chunkString");
effects.exit("codeFencedFenceMeta");
return openAfter(code);
}
if (code === 96 && code === marker)
return nok(code);
effects.consume(code);
return meta;
}
function openAfter(code) {
effects.exit("codeFencedFence");
return self2.interrupt ? ok(code) : contentStart(code);
}
function contentStart(code) {
if (code === null) {
return after(code);
}
if (markdownLineEnding(code)) {
return effects.attempt(nonLazyLine, effects.attempt(closingFenceConstruct, after, initialPrefix ? factorySpace(effects, contentStart, "linePrefix", initialPrefix + 1) : contentStart), after)(code);
}
effects.enter("codeFlowValue");
return contentContinue(code);
}
function contentContinue(code) {
if (code === null || markdownLineEnding(code)) {
effects.exit("codeFlowValue");
return contentStart(code);
}
effects.consume(code);
return contentContinue;
}
function after(code) {
effects.exit("codeFenced");
return ok(code);
}
function tokenizeNonLazyLine(effects2, ok2, nok2) {
const self22 = this;
return start2;
function start2(code) {
effects2.enter("lineEnding");
effects2.consume(code);
effects2.exit("lineEnding");
return lineStart;
}
function lineStart(code) {
return self22.parser.lazy[self22.now().line] ? nok2(code) : ok2(code);
}
}
function tokenizeClosingFence(effects2, ok2, nok2) {
let size = 0;
return factorySpace(effects2, closingSequenceStart, "linePrefix", this.parser.constructs.disable.null.includes("codeIndented") ? void 0 : 4);
function closingSequenceStart(code) {
effects2.enter("codeFencedFence");
effects2.enter("codeFencedFenceSequence");
return closingSequence(code);
}
function closingSequence(code) {
if (code === marker) {
effects2.consume(code);
size++;
return closingSequence;
}
if (size < sizeOpen)
return nok2(code);
effects2.exit("codeFencedFenceSequence");
return factorySpace(effects2, closingSequenceEnd, "whitespace")(code);
}
function closingSequenceEnd(code) {
if (code === null || markdownLineEnding(code)) {
effects2.exit("codeFencedFence");
return ok2(code);
}
return nok2(code);
}
}
}
var codeIndented = {
name: "codeIndented",
tokenize: tokenizeCodeIndented
};
var indentedContent = {
tokenize: tokenizeIndentedContent,
partial: true
};
function tokenizeCodeIndented(effects, ok, nok) {
const self2 = this;
return start;
function start(code) {
effects.enter("codeIndented");
return factorySpace(effects, afterStartPrefix, "linePrefix", 4 + 1)(code);
}
function afterStartPrefix(code) {
const tail = self2.events[self2.events.length - 1];
return tail && tail[1].type === "linePrefix" && tail[2].sliceSerialize(tail[1], true).length >= 4 ? afterPrefix(code) : nok(code);
}
function afterPrefix(code) {
if (code === null) {
return after(code);
}
if (markdownLineEnding(code)) {
return effects.attempt(indentedContent, afterPrefix, after)(code);
}
effects.enter("codeFlowValue");
return content3(code);
}
function content3(code) {
if (code === null || markdownLineEnding(code)) {
effects.exit("codeFlowValue");
return afterPrefix(code);
}
effects.consume(code);
return content3;
}
function after(code) {
effects.exit("codeIndented");
return ok(code);
}
}
function tokenizeIndentedContent(effects, ok, nok) {
const self2 = this;
return start;
function start(code) {
if (self2.parser.lazy[self2.now().line]) {
return nok(code);
}
if (markdownLineEnding(code)) {
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return start;
}
return factorySpace(effects, afterPrefix, "linePrefix", 4 + 1)(code);
}
function afterPrefix(code) {
const tail = self2.events[self2.events.length - 1];
return tail && tail[1].type === "linePrefix" && tail[2].sliceSerialize(tail[1], true).length >= 4 ? ok(code) : markdownLineEnding(code) ? start(code) : nok(code);
}
}
var codeText = {
name: "codeText",
tokenize: tokenizeCodeText,
resolve: resolveCodeText,
previous
};
function resolveCodeText(events) {
let tailExitIndex = events.length - 4;
let headEnterIndex = 3;
let index2;
let enter;
if ((events[headEnterIndex][1].type === "lineEnding" || events[headEnterIndex][1].type === "space") && (events[tailExitIndex][1].type === "lineEnding" || events[tailExitIndex][1].type === "space")) {
index2 = headEnterIndex;
while (++index2 < tailExitIndex) {
if (events[index2][1].type === "codeTextData") {
events[headEnterIndex][1].type = "codeTextPadding";
events[tailExitIndex][1].type = "codeTextPadding";
headEnterIndex += 2;
tailExitIndex -= 2;
break;
}
}
}
index2 = headEnterIndex - 1;
tailExitIndex++;
while (++index2 <= tailExitIndex) {
if (enter === void 0) {
if (index2 !== tailExitIndex && events[index2][1].type !== "lineEnding") {
enter = index2;
}
} else if (index2 === tailExitIndex || events[index2][1].type === "lineEnding") {
events[enter][1].type = "codeTextData";
if (index2 !== enter + 2) {
events[enter][1].end = events[index2 - 1][1].end;
events.splice(enter + 2, index2 - enter - 2);
tailExitIndex -= index2 - enter - 2;
index2 = enter + 2;
}
enter = void 0;
}
}
return events;
}
function previous(code) {
return code !== 96 || this.events[this.events.length - 1][1].type === "characterEscape";
}
function tokenizeCodeText(effects, ok, nok) {
const self2 = this;
let sizeOpen = 0;
let size;
let token;
return start;
function start(code) {
effects.enter("codeText");
effects.enter("codeTextSequence");
return openingSequence(code);
}
function openingSequence(code) {
if (code === 96) {
effects.consume(code);
sizeOpen++;
return openingSequence;
}
effects.exit("codeTextSequence");
return gap(code);
}
function gap(code) {
if (code === null) {
return nok(code);
}
if (code === 96) {
token = effects.enter("codeTextSequence");
size = 0;
return closingSequence(code);
}
if (code === 32) {
effects.enter("space");
effects.consume(code);
effects.exit("space");
return gap;
}
if (markdownLineEnding(code)) {
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return gap;
}
effects.enter("codeTextData");
return data(code);
}
function data(code) {
if (code === null || code === 32 || code === 96 || markdownLineEnding(code)) {
effects.exit("codeTextData");
return gap(code);
}
effects.consume(code);
return data;
}
function closingSequence(code) {
if (code === 96) {
effects.consume(code);
size++;
return closingSequence;
}
if (size === sizeOpen) {
effects.exit("codeTextSequence");
effects.exit("codeText");
return ok(code);
}
token.type = "codeTextData";
return data(code);
}
}
function subtokenize(events) {
const jumps = {};
let index2 = -1;
let event;
let lineIndex;
let otherIndex;
let otherEvent;
let parameters;
let subevents;
let more;
while (++index2 < events.length) {
while (index2 in jumps) {
index2 = jumps[index2];
}
event = events[index2];
if (index2 && event[1].type === "chunkFlow" && events[index2 - 1][1].type === "listItemPrefix") {
subevents = event[1]._tokenizer.events;
otherIndex = 0;
if (otherIndex < subevents.length && subevents[otherIndex][1].type === "lineEndingBlank") {
otherIndex += 2;
}
if (otherIndex < subevents.length && subevents[otherIndex][1].type === "content") {
while (++otherIndex < subevents.length) {
if (subevents[otherIndex][1].type === "content") {
break;
}
if (subevents[otherIndex][1].type === "chunkText") {
subevents[otherIndex][1]._isInFirstContentOfListItem = true;
otherIndex++;
}
}
}
}
if (event[0] === "enter") {
if (event[1].contentType) {
Object.assign(jumps, subcontent(events, index2));
index2 = jumps[index2];
more = true;
}
} else if (event[1]._container) {
otherIndex = index2;
lineIndex = void 0;
while (otherIndex--) {
otherEvent = events[otherIndex];
if (otherEvent[1].type === "lineEnding" || otherEvent[1].type === "lineEndingBlank") {
if (otherEvent[0] === "enter") {
if (lineIndex) {
events[lineIndex][1].type = "lineEndingBlank";
}
otherEvent[1].type = "lineEnding";
lineIndex = otherIndex;
}
} else {
break;
}
}
if (lineIndex) {
event[1].end = Object.assign({}, events[lineIndex][1].start);
parameters = events.slice(lineIndex, index2);
parameters.unshift(event);
splice(events, lineIndex, index2 - lineIndex + 1, parameters);
}
}
}
return !more;
}
function subcontent(events, eventIndex) {
const token = events[eventIndex][1];
const context = events[eventIndex][2];
let startPosition = eventIndex - 1;
const startPositions = [];
const tokenizer = token._tokenizer || context.parser[token.contentType](token.start);
const childEvents = tokenizer.events;
const jumps = [];
const gaps = {};
let stream;
let previous2;
let index2 = -1;
let current = token;
let adjust = 0;
let start = 0;
const breaks = [start];
while (current) {
while (events[++startPosition][1] !== current) {
}
startPositions.push(startPosition);
if (!current._tokenizer) {
stream = context.sliceStream(current);
if (!current.next) {
stream.push(null);
}
if (previous2) {
tokenizer.defineSkip(current.start);
}
if (current._isInFirstContentOfListItem) {
tokenizer._gfmTasklistFirstContentOfListItem = true;
}
tokenizer.write(stream);
if (current._isInFirstContentOfListItem) {
tokenizer._gfmTasklistFirstContentOfListItem = void 0;
}
}
previous2 = current;
current = current.next;
}
current = token;
while (++index2 < childEvents.length) {
if (childEvents[index2][0] === "exit" && childEvents[index2 - 1][0] === "enter" && childEvents[index2][1].type === childEvents[index2 - 1][1].type && childEvents[index2][1].start.line !== childEvents[index2][1].end.line) {
start = index2 + 1;
breaks.push(start);
current._tokenizer = void 0;
current.previous = void 0;
current = current.next;
}
}
tokenizer.events = [];
if (current) {
current._tokenizer = void 0;
current.previous = void 0;
} else {
breaks.pop();
}
index2 = breaks.length;
while (index2--) {
const slice = childEvents.slice(breaks[index2], breaks[index2 + 1]);
const start2 = startPositions.pop();
jumps.unshift([start2, start2 + slice.length - 1]);
splice(events, start2, 2, slice);
}
index2 = -1;
while (++index2 < jumps.length) {
gaps[adjust + jumps[index2][0]] = adjust + jumps[index2][1];
adjust += jumps[index2][1] - jumps[index2][0] - 1;
}
return gaps;
}
var content2 = {
tokenize: tokenizeContent,
resolve: resolveContent
};
var continuationConstruct = {
tokenize: tokenizeContinuation,
partial: true
};
function resolveContent(events) {
subtokenize(events);
return events;
}
function tokenizeContent(effects, ok) {
let previous2;
return start;
function start(code) {
effects.enter("content");
previous2 = effects.enter("chunkContent", {
contentType: "content"
});
return data(code);
}
function data(code) {
if (code === null) {
return contentEnd(code);
}
if (markdownLineEnding(code)) {
return effects.check(continuationConstruct, contentContinue, contentEnd)(code);
}
effects.consume(code);
return data;
}
function contentEnd(code) {
effects.exit("chunkContent");
effects.exit("content");
return ok(code);
}
function contentContinue(code) {
effects.consume(code);
effects.exit("chunkContent");
previous2.next = effects.enter("chunkContent", {
contentType: "content",
previous: previous2
});
previous2 = previous2.next;
return data;
}
}
function tokenizeContinuation(effects, ok, nok) {
const self2 = this;
return startLookahead;
function startLookahead(code) {
effects.exit("chunkContent");
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return factorySpace(effects, prefixed, "linePrefix");
}
function prefixed(code) {
if (code === null || markdownLineEnding(code)) {
return nok(code);
}
const tail = self2.events[self2.events.length - 1];
if (!self2.parser.constructs.disable.null.includes("codeIndented") && tail && tail[1].type === "linePrefix" && tail[2].sliceSerialize(tail[1], true).length >= 4) {
return ok(code);
}
return effects.interrupt(self2.parser.constructs.flow, nok, ok)(code);
}
}
function factoryDestination(effects, ok, nok, type, literalType, literalMarkerType, rawType, stringType, max) {
const limit = max || Number.POSITIVE_INFINITY;
let balance = 0;
return start;
function start(code) {
if (code === 60) {
effects.enter(type);
effects.enter(literalType);
effects.enter(literalMarkerType);
effects.consume(code);
effects.exit(literalMarkerType);
return destinationEnclosedBefore;
}
if (code === null || code === 41 || asciiControl(code)) {
return nok(code);
}
effects.enter(type);
effects.enter(rawType);
effects.enter(stringType);
effects.enter("chunkString", {
contentType: "string"
});
return destinationRaw(code);
}
function destinationEnclosedBefore(code) {
if (code === 62) {
effects.enter(literalMarkerType);
effects.consume(code);
effects.exit(literalMarkerType);
effects.exit(literalType);
effects.exit(type);
return ok;
}
effects.enter(stringType);
effects.enter("chunkString", {
contentType: "string"
});
return destinationEnclosed(code);
}
function destinationEnclosed(code) {
if (code === 62) {
effects.exit("chunkString");
effects.exit(stringType);
return destinationEnclosedBefore(code);
}
if (code === null || code === 60 || markdownLineEnding(code)) {
return nok(code);
}
effects.consume(code);
return code === 92 ? destinationEnclosedEscape : destinationEnclosed;
}
function destinationEnclosedEscape(code) {
if (code === 60 || code === 62 || code === 92) {
effects.consume(code);
return destinationEnclosed;
}
return destinationEnclosed(code);
}
function destinationRaw(code) {
if (code === 40) {
if (++balance > limit)
return nok(code);
effects.consume(code);
return destinationRaw;
}
if (code === 41) {
if (!balance--) {
effects.exit("chunkString");
effects.exit(stringType);
effects.exit(rawType);
effects.exit(type);
return ok(code);
}
effects.consume(code);
return destinationRaw;
}
if (code === null || markdownLineEndingOrSpace(code)) {
if (balance)
return nok(code);
effects.exit("chunkString");
effects.exit(stringType);
effects.exit(rawType);
effects.exit(type);
return ok(code);
}
if (asciiControl(code))
return nok(code);
effects.consume(code);
return code === 92 ? destinationRawEscape : destinationRaw;
}
function destinationRawEscape(code) {
if (code === 40 || code === 41 || code === 92) {
effects.consume(code);
return destinationRaw;
}
return destinationRaw(code);
}
}
function factoryLabel(effects, ok, nok, type, markerType, stringType) {
const self2 = this;
let size = 0;
let data;
return start;
function start(code) {
effects.enter(type);
effects.enter(markerType);
effects.consume(code);
effects.exit(markerType);
effects.enter(stringType);
return atBreak;
}
function atBreak(code) {
if (code === null || code === 91 || code === 93 && !data || code === 94 && !size && "_hiddenFootnoteSupport" in self2.parser.constructs || size > 999) {
return nok(code);
}
if (code === 93) {
effects.exit(stringType);
effects.enter(markerType);
effects.consume(code);
effects.exit(markerType);
effects.exit(type);
return ok;
}
if (markdownLineEnding(code)) {
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return atBreak;
}
effects.enter("chunkString", {
contentType: "string"
});
return label(code);
}
function label(code) {
if (code === null || code === 91 || code === 93 || markdownLineEnding(code) || size++ > 999) {
effects.exit("chunkString");
return atBreak(code);
}
effects.consume(code);
data = data || !markdownSpace(code);
return code === 92 ? labelEscape : label;
}
function labelEscape(code) {
if (code === 91 || code === 92 || code === 93) {
effects.consume(code);
size++;
return label;
}
return label(code);
}
}
function factoryTitle(effects, ok, nok, type, markerType, stringType) {
let marker;
return start;
function start(code) {
effects.enter(type);
effects.enter(markerType);
effects.consume(code);
effects.exit(markerType);
marker = code === 40 ? 41 : code;
return atFirstTitleBreak;
}
function atFirstTitleBreak(code) {
if (code === marker) {
effects.enter(markerType);
effects.consume(code);
effects.exit(markerType);
effects.exit(type);
return ok;
}
effects.enter(stringType);
return atTitleBreak(code);
}
function atTitleBreak(code) {
if (code === marker) {
effects.exit(stringType);
return atFirstTitleBreak(marker);
}
if (code === null) {
return nok(code);
}
if (markdownLineEnding(code)) {
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return factorySpace(effects, atTitleBreak, "linePrefix");
}
effects.enter("chunkString", {
contentType: "string"
});
return title(code);
}
function title(code) {
if (code === marker || code === null || markdownLineEnding(code)) {
effects.exit("chunkString");
return atTitleBreak(code);
}
effects.consume(code);
return code === 92 ? titleEscape : title;
}
function titleEscape(code) {
if (code === marker || code === 92) {
effects.consume(code);
return title;
}
return title(code);
}
}
function factoryWhitespace(effects, ok) {
let seen;
return start;
function start(code) {
if (markdownLineEnding(code)) {
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
seen = true;
return start;
}
if (markdownSpace(code)) {
return factorySpace(effects, start, seen ? "linePrefix" : "lineSuffix")(code);
}
return ok(code);
}
}
function normalizeIdentifier(value2) {
return value2.replace(/[\t\n\r ]+/g, " ").replace(/^ | $/g, "").toLowerCase().toUpperCase();
}
var definition = {
name: "definition",
tokenize: tokenizeDefinition
};
var titleConstruct = {
tokenize: tokenizeTitle,
partial: true
};
function tokenizeDefinition(effects, ok, nok) {
const self2 = this;
let identifier;
return start;
function start(code) {
effects.enter("definition");
return factoryLabel.call(self2, effects, labelAfter, nok, "definitionLabel", "definitionLabelMarker", "definitionLabelString")(code);
}
function labelAfter(code) {
identifier = normalizeIdentifier(self2.sliceSerialize(self2.events[self2.events.length - 1][1]).slice(1, -1));
if (code === 58) {
effects.enter("definitionMarker");
effects.consume(code);
effects.exit("definitionMarker");
return factoryWhitespace(effects, factoryDestination(effects, effects.attempt(titleConstruct, factorySpace(effects, after, "whitespace"), factorySpace(effects, after, "whitespace")), nok, "definitionDestination", "definitionDestinationLiteral", "definitionDestinationLiteralMarker", "definitionDestinationRaw", "definitionDestinationString"));
}
return nok(code);
}
function after(code) {
if (code === null || markdownLineEnding(code)) {
effects.exit("definition");
if (!self2.parser.defined.includes(identifier)) {
self2.parser.defined.push(identifier);
}
return ok(code);
}
return nok(code);
}
}
function tokenizeTitle(effects, ok, nok) {
return start;
function start(code) {
return markdownLineEndingOrSpace(code) ? factoryWhitespace(effects, before)(code) : nok(code);
}
function before(code) {
if (code === 34 || code === 39 || code === 40) {
return factoryTitle(effects, factorySpace(effects, after, "whitespace"), nok, "definitionTitle", "definitionTitleMarker", "definitionTitleString")(code);
}
return nok(code);
}
function after(code) {
return code === null || markdownLineEnding(code) ? ok(code) : nok(code);
}
}
var hardBreakEscape = {
name: "hardBreakEscape",
tokenize: tokenizeHardBreakEscape
};
function tokenizeHardBreakEscape(effects, ok, nok) {
return start;
function start(code) {
effects.enter("hardBreakEscape");
effects.enter("escapeMarker");
effects.consume(code);
return open;
}
function open(code) {
if (markdownLineEnding(code)) {
effects.exit("escapeMarker");
effects.exit("hardBreakEscape");
return ok(code);
}
return nok(code);
}
}
var headingAtx = {
name: "headingAtx",
tokenize: tokenizeHeadingAtx,
resolve: resolveHeadingAtx
};
function resolveHeadingAtx(events, context) {
let contentEnd = events.length - 2;
let contentStart = 3;
let content3;
let text3;
if (events[contentStart][1].type === "whitespace") {
contentStart += 2;
}
if (contentEnd - 2 > contentStart && events[contentEnd][1].type === "whitespace") {
contentEnd -= 2;
}
if (events[contentEnd][1].type === "atxHeadingSequence" && (contentStart === contentEnd - 1 || contentEnd - 4 > contentStart && events[contentEnd - 2][1].type === "whitespace")) {
contentEnd -= contentStart + 1 === contentEnd ? 2 : 4;
}
if (contentEnd > contentStart) {
content3 = {
type: "atxHeadingText",
start: events[contentStart][1].start,
end: events[contentEnd][1].end
};
text3 = {
type: "chunkText",
start: events[contentStart][1].start,
end: events[contentEnd][1].end,
contentType: "text"
};
splice(events, contentStart, contentEnd - contentStart + 1, [
["enter", content3, context],
["enter", text3, context],
["exit", text3, context],
["exit", content3, context]
]);
}
return events;
}
function tokenizeHeadingAtx(effects, ok, nok) {
const self2 = this;
let size = 0;
return start;
function start(code) {
effects.enter("atxHeading");
effects.enter("atxHeadingSequence");
return fenceOpenInside(code);
}
function fenceOpenInside(code) {
if (code === 35 && size++ < 6) {
effects.consume(code);
return fenceOpenInside;
}
if (code === null || markdownLineEndingOrSpace(code)) {
effects.exit("atxHeadingSequence");
return self2.interrupt ? ok(code) : headingBreak(code);
}
return nok(code);
}
function headingBreak(code) {
if (code === 35) {
effects.enter("atxHeadingSequence");
return sequence(code);
}
if (code === null || markdownLineEnding(code)) {
effects.exit("atxHeading");
return ok(code);
}
if (markdownSpace(code)) {
return factorySpace(effects, headingBreak, "whitespace")(code);
}
effects.enter("atxHeadingText");
return data(code);
}
function sequence(code) {
if (code === 35) {
effects.consume(code);
return sequence;
}
effects.exit("atxHeadingSequence");
return headingBreak(code);
}
function data(code) {
if (code === null || code === 35 || markdownLineEndingOrSpace(code)) {
effects.exit("atxHeadingText");
return headingBreak(code);
}
effects.consume(code);
return data;
}
}
var htmlBlockNames = [
"address",
"article",
"aside",
"base",
"basefont",
"blockquote",
"body",
"caption",
"center",
"col",
"colgroup",
"dd",
"details",
"dialog",
"dir",
"div",
"dl",
"dt",
"fieldset",
"figcaption",
"figure",
"footer",
"form",
"frame",
"frameset",
"h1",
"h2",
"h3",
"h4",
"h5",
"h6",
"head",
"header",
"hr",
"html",
"iframe",
"legend",
"li",
"link",
"main",
"menu",
"menuitem",
"nav",
"noframes",
"ol",
"optgroup",
"option",
"p",
"param",
"section",
"source",
"summary",
"table",
"tbody",
"td",
"tfoot",
"th",
"thead",
"title",
"tr",
"track",
"ul"
];
var htmlRawNames = ["pre", "script", "style", "textarea"];
var htmlFlow = {
name: "htmlFlow",
tokenize: tokenizeHtmlFlow,
resolveTo: resolveToHtmlFlow,
concrete: true
};
var nextBlankConstruct = {
tokenize: tokenizeNextBlank,
partial: true
};
function resolveToHtmlFlow(events) {
let index2 = events.length;
while (index2--) {
if (events[index2][0] === "enter" && events[index2][1].type === "htmlFlow") {
break;
}
}
if (index2 > 1 && events[index2 - 2][1].type === "linePrefix") {
events[index2][1].start = events[index2 - 2][1].start;
events[index2 + 1][1].start = events[index2 - 2][1].start;
events.splice(index2 - 2, 2);
}
return events;
}
function tokenizeHtmlFlow(effects, ok, nok) {
const self2 = this;
let kind;
let startTag;
let buffer2;
let index2;
let marker;
return start;
function start(code) {
effects.enter("htmlFlow");
effects.enter("htmlFlowData");
effects.consume(code);
return open;
}
function open(code) {
if (code === 33) {
effects.consume(code);
return declarationStart;
}
if (code === 47) {
effects.consume(code);
return tagCloseStart;
}
if (code === 63) {
effects.consume(code);
kind = 3;
return self2.interrupt ? ok : continuationDeclarationInside;
}
if (asciiAlpha(code)) {
effects.consume(code);
buffer2 = String.fromCharCode(code);
startTag = true;
return tagName;
}
return nok(code);
}
function declarationStart(code) {
if (code === 45) {
effects.consume(code);
kind = 2;
return commentOpenInside;
}
if (code === 91) {
effects.consume(code);
kind = 5;
buffer2 = "CDATA[";
index2 = 0;
return cdataOpenInside;
}
if (asciiAlpha(code)) {
effects.consume(code);
kind = 4;
return self2.interrupt ? ok : continuationDeclarationInside;
}
return nok(code);
}
function commentOpenInside(code) {
if (code === 45) {
effects.consume(code);
return self2.interrupt ? ok : continuationDeclarationInside;
}
return nok(code);
}
function cdataOpenInside(code) {
if (code === buffer2.charCodeAt(index2++)) {
effects.consume(code);
return index2 === buffer2.length ? self2.interrupt ? ok : continuation : cdataOpenInside;
}
return nok(code);
}
function tagCloseStart(code) {
if (asciiAlpha(code)) {
effects.consume(code);
buffer2 = String.fromCharCode(code);
return tagName;
}
return nok(code);
}
function tagName(code) {
if (code === null || code === 47 || code === 62 || markdownLineEndingOrSpace(code)) {
if (code !== 47 && startTag && htmlRawNames.includes(buffer2.toLowerCase())) {
kind = 1;
return self2.interrupt ? ok(code) : continuation(code);
}
if (htmlBlockNames.includes(buffer2.toLowerCase())) {
kind = 6;
if (code === 47) {
effects.consume(code);
return basicSelfClosing;
}
return self2.interrupt ? ok(code) : continuation(code);
}
kind = 7;
return self2.interrupt && !self2.parser.lazy[self2.now().line] ? nok(code) : startTag ? completeAttributeNameBefore(code) : completeClosingTagAfter(code);
}
if (code === 45 || asciiAlphanumeric(code)) {
effects.consume(code);
buffer2 += String.fromCharCode(code);
return tagName;
}
return nok(code);
}
function basicSelfClosing(code) {
if (code === 62) {
effects.consume(code);
return self2.interrupt ? ok : continuation;
}
return nok(code);
}
function completeClosingTagAfter(code) {
if (markdownSpace(code)) {
effects.consume(code);
return completeClosingTagAfter;
}
return completeEnd(code);
}
function completeAttributeNameBefore(code) {
if (code === 47) {
effects.consume(code);
return completeEnd;
}
if (code === 58 || code === 95 || asciiAlpha(code)) {
effects.consume(code);
return completeAttributeName;
}
if (markdownSpace(code)) {
effects.consume(code);
return completeAttributeNameBefore;
}
return completeEnd(code);
}
function completeAttributeName(code) {
if (code === 45 || code === 46 || code === 58 || code === 95 || asciiAlphanumeric(code)) {
effects.consume(code);
return completeAttributeName;
}
return completeAttributeNameAfter(code);
}
function completeAttributeNameAfter(code) {
if (code === 61) {
effects.consume(code);
return completeAttributeValueBefore;
}
if (markdownSpace(code)) {
effects.consume(code);
return completeAttributeNameAfter;
}
return completeAttributeNameBefore(code);
}
function completeAttributeValueBefore(code) {
if (code === null || code === 60 || code === 61 || code === 62 || code === 96) {
return nok(code);
}
if (code === 34 || code === 39) {
effects.consume(code);
marker = code;
return completeAttributeValueQuoted;
}
if (markdownSpace(code)) {
effects.consume(code);
return completeAttributeValueBefore;
}
marker = null;
return completeAttributeValueUnquoted(code);
}
function completeAttributeValueQuoted(code) {
if (code === null || markdownLineEnding(code)) {
return nok(code);
}
if (code === marker) {
effects.consume(code);
return completeAttributeValueQuotedAfter;
}
effects.consume(code);
return completeAttributeValueQuoted;
}
function completeAttributeValueUnquoted(code) {
if (code === null || code === 34 || code === 39 || code === 60 || code === 61 || code === 62 || code === 96 || markdownLineEndingOrSpace(code)) {
return completeAttributeNameAfter(code);
}
effects.consume(code);
return completeAttributeValueUnquoted;
}
function completeAttributeValueQuotedAfter(code) {
if (code === 47 || code === 62 || markdownSpace(code)) {
return completeAttributeNameBefore(code);
}
return nok(code);
}
function completeEnd(code) {
if (code === 62) {
effects.consume(code);
return completeAfter;
}
return nok(code);
}
function completeAfter(code) {
if (markdownSpace(code)) {
effects.consume(code);
return completeAfter;
}
return code === null || markdownLineEnding(code) ? continuation(code) : nok(code);
}
function continuation(code) {
if (code === 45 && kind === 2) {
effects.consume(code);
return continuationCommentInside;
}
if (code === 60 && kind === 1) {
effects.consume(code);
return continuationRawTagOpen;
}
if (code === 62 && kind === 4) {
effects.consume(code);
return continuationClose;
}
if (code === 63 && kind === 3) {
effects.consume(code);
return continuationDeclarationInside;
}
if (code === 93 && kind === 5) {
effects.consume(code);
return continuationCharacterDataInside;
}
if (markdownLineEnding(code) && (kind === 6 || kind === 7)) {
return effects.check(nextBlankConstruct, continuationClose, continuationAtLineEnding)(code);
}
if (code === null || markdownLineEnding(code)) {
return continuationAtLineEnding(code);
}
effects.consume(code);
return continuation;
}
function continuationAtLineEnding(code) {
effects.exit("htmlFlowData");
return htmlContinueStart(code);
}
function htmlContinueStart(code) {
if (code === null) {
return done(code);
}
if (markdownLineEnding(code)) {
return effects.attempt({
tokenize: htmlLineEnd,
partial: true
}, htmlContinueStart, done)(code);
}
effects.enter("htmlFlowData");
return continuation(code);
}
function htmlLineEnd(effects2, ok2, nok2) {
return start2;
function start2(code) {
effects2.enter("lineEnding");
effects2.consume(code);
effects2.exit("lineEnding");
return lineStart;
}
function lineStart(code) {
return self2.parser.lazy[self2.now().line] ? nok2(code) : ok2(code);
}
}
function continuationCommentInside(code) {
if (code === 45) {
effects.consume(code);
return continuationDeclarationInside;
}
return continuation(code);
}
function continuationRawTagOpen(code) {
if (code === 47) {
effects.consume(code);
buffer2 = "";
return continuationRawEndTag;
}
return continuation(code);
}
function continuationRawEndTag(code) {
if (code === 62 && htmlRawNames.includes(buffer2.toLowerCase())) {
effects.consume(code);
return continuationClose;
}
if (asciiAlpha(code) && buffer2.length < 8) {
effects.consume(code);
buffer2 += String.fromCharCode(code);
return continuationRawEndTag;
}
return continuation(code);
}
function continuationCharacterDataInside(code) {
if (code === 93) {
effects.consume(code);
return continuationDeclarationInside;
}
return continuation(code);
}
function continuationDeclarationInside(code) {
if (code === 62) {
effects.consume(code);
return continuationClose;
}
return continuation(code);
}
function continuationClose(code) {
if (code === null || markdownLineEnding(code)) {
effects.exit("htmlFlowData");
return done(code);
}
effects.consume(code);
return continuationClose;
}
function done(code) {
effects.exit("htmlFlow");
return ok(code);
}
}
function tokenizeNextBlank(effects, ok, nok) {
return start;
function start(code) {
effects.exit("htmlFlowData");
effects.enter("lineEndingBlank");
effects.consume(code);
effects.exit("lineEndingBlank");
return effects.attempt(blankLine, ok, nok);
}
}
var htmlText = {
name: "htmlText",
tokenize: tokenizeHtmlText
};
function tokenizeHtmlText(effects, ok, nok) {
const self2 = this;
let marker;
let buffer2;
let index2;
let returnState;
return start;
function start(code) {
effects.enter("htmlText");
effects.enter("htmlTextData");
effects.consume(code);
return open;
}
function open(code) {
if (code === 33) {
effects.consume(code);
return declarationOpen;
}
if (code === 47) {
effects.consume(code);
return tagCloseStart;
}
if (code === 63) {
effects.consume(code);
return instruction;
}
if (asciiAlpha(code)) {
effects.consume(code);
return tagOpen;
}
return nok(code);
}
function declarationOpen(code) {
if (code === 45) {
effects.consume(code);
return commentOpen;
}
if (code === 91) {
effects.consume(code);
buffer2 = "CDATA[";
index2 = 0;
return cdataOpen;
}
if (asciiAlpha(code)) {
effects.consume(code);
return declaration;
}
return nok(code);
}
function commentOpen(code) {
if (code === 45) {
effects.consume(code);
return commentStart;
}
return nok(code);
}
function commentStart(code) {
if (code === null || code === 62) {
return nok(code);
}
if (code === 45) {
effects.consume(code);
return commentStartDash;
}
return comment(code);
}
function commentStartDash(code) {
if (code === null || code === 62) {
return nok(code);
}
return comment(code);
}
function comment(code) {
if (code === null) {
return nok(code);
}
if (code === 45) {
effects.consume(code);
return commentClose;
}
if (markdownLineEnding(code)) {
returnState = comment;
return atLineEnding(code);
}
effects.consume(code);
return comment;
}
function commentClose(code) {
if (code === 45) {
effects.consume(code);
return end;
}
return comment(code);
}
function cdataOpen(code) {
if (code === buffer2.charCodeAt(index2++)) {
effects.consume(code);
return index2 === buffer2.length ? cdata : cdataOpen;
}
return nok(code);
}
function cdata(code) {
if (code === null) {
return nok(code);
}
if (code === 93) {
effects.consume(code);
return cdataClose;
}
if (markdownLineEnding(code)) {
returnState = cdata;
return atLineEnding(code);
}
effects.consume(code);
return cdata;
}
function cdataClose(code) {
if (code === 93) {
effects.consume(code);
return cdataEnd;
}
return cdata(code);
}
function cdataEnd(code) {
if (code === 62) {
return end(code);
}
if (code === 93) {
effects.consume(code);
return cdataEnd;
}
return cdata(code);
}
function declaration(code) {
if (code === null || code === 62) {
return end(code);
}
if (markdownLineEnding(code)) {
returnState = declaration;
return atLineEnding(code);
}
effects.consume(code);
return declaration;
}
function instruction(code) {
if (code === null) {
return nok(code);
}
if (code === 63) {
effects.consume(code);
return instructionClose;
}
if (markdownLineEnding(code)) {
returnState = instruction;
return atLineEnding(code);
}
effects.consume(code);
return instruction;
}
function instructionClose(code) {
return code === 62 ? end(code) : instruction(code);
}
function tagCloseStart(code) {
if (asciiAlpha(code)) {
effects.consume(code);
return tagClose;
}
return nok(code);
}
function tagClose(code) {
if (code === 45 || asciiAlphanumeric(code)) {
effects.consume(code);
return tagClose;
}
return tagCloseBetween(code);
}
function tagCloseBetween(code) {
if (markdownLineEnding(code)) {
returnState = tagCloseBetween;
return atLineEnding(code);
}
if (markdownSpace(code)) {
effects.consume(code);
return tagCloseBetween;
}
return end(code);
}
function tagOpen(code) {
if (code === 45 || asciiAlphanumeric(code)) {
effects.consume(code);
return tagOpen;
}
if (code === 47 || code === 62 || markdownLineEndingOrSpace(code)) {
return tagOpenBetween(code);
}
return nok(code);
}
function tagOpenBetween(code) {
if (code === 47) {
effects.consume(code);
return end;
}
if (code === 58 || code === 95 || asciiAlpha(code)) {
effects.consume(code);
return tagOpenAttributeName;
}
if (markdownLineEnding(code)) {
returnState = tagOpenBetween;
return atLineEnding(code);
}
if (markdownSpace(code)) {
effects.consume(code);
return tagOpenBetween;
}
return end(code);
}
function tagOpenAttributeName(code) {
if (code === 45 || code === 46 || code === 58 || code === 95 || asciiAlphanumeric(code)) {
effects.consume(code);
return tagOpenAttributeName;
}
return tagOpenAttributeNameAfter(code);
}
function tagOpenAttributeNameAfter(code) {
if (code === 61) {
effects.consume(code);
return tagOpenAttributeValueBefore;
}
if (markdownLineEnding(code)) {
returnState = tagOpenAttributeNameAfter;
return atLineEnding(code);
}
if (markdownSpace(code)) {
effects.consume(code);
return tagOpenAttributeNameAfter;
}
return tagOpenBetween(code);
}
function tagOpenAttributeValueBefore(code) {
if (code === null || code === 60 || code === 61 || code === 62 || code === 96) {
return nok(code);
}
if (code === 34 || code === 39) {
effects.consume(code);
marker = code;
return tagOpenAttributeValueQuoted;
}
if (markdownLineEnding(code)) {
returnState = tagOpenAttributeValueBefore;
return atLineEnding(code);
}
if (markdownSpace(code)) {
effects.consume(code);
return tagOpenAttributeValueBefore;
}
effects.consume(code);
marker = void 0;
return tagOpenAttributeValueUnquoted;
}
function tagOpenAttributeValueQuoted(code) {
if (code === marker) {
effects.consume(code);
return tagOpenAttributeValueQuotedAfter;
}
if (code === null) {
return nok(code);
}
if (markdownLineEnding(code)) {
returnState = tagOpenAttributeValueQuoted;
return atLineEnding(code);
}
effects.consume(code);
return tagOpenAttributeValueQuoted;
}
function tagOpenAttributeValueQuotedAfter(code) {
if (code === 62 || code === 47 || markdownLineEndingOrSpace(code)) {
return tagOpenBetween(code);
}
return nok(code);
}
function tagOpenAttributeValueUnquoted(code) {
if (code === null || code === 34 || code === 39 || code === 60 || code === 61 || code === 96) {
return nok(code);
}
if (code === 62 || markdownLineEndingOrSpace(code)) {
return tagOpenBetween(code);
}
effects.consume(code);
return tagOpenAttributeValueUnquoted;
}
function atLineEnding(code) {
effects.exit("htmlTextData");
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return factorySpace(effects, afterPrefix, "linePrefix", self2.parser.constructs.disable.null.includes("codeIndented") ? void 0 : 4);
}
function afterPrefix(code) {
effects.enter("htmlTextData");
return returnState(code);
}
function end(code) {
if (code === 62) {
effects.consume(code);
effects.exit("htmlTextData");
effects.exit("htmlText");
return ok;
}
return nok(code);
}
}
var labelEnd = {
name: "labelEnd",
tokenize: tokenizeLabelEnd,
resolveTo: resolveToLabelEnd,
resolveAll: resolveAllLabelEnd
};
var resourceConstruct = {
tokenize: tokenizeResource
};
var fullReferenceConstruct = {
tokenize: tokenizeFullReference
};
var collapsedReferenceConstruct = {
tokenize: tokenizeCollapsedReference
};
function resolveAllLabelEnd(events) {
let index2 = -1;
let token;
while (++index2 < events.length) {
token = events[index2][1];
if (token.type === "labelImage" || token.type === "labelLink" || token.type === "labelEnd") {
events.splice(index2 + 1, token.type === "labelImage" ? 4 : 2);
token.type = "data";
index2++;
}
}
return events;
}
function resolveToLabelEnd(events, context) {
let index2 = events.length;
let offset = 0;
let token;
let open;
let close2;
let media;
while (index2--) {
token = events[index2][1];
if (open) {
if (token.type === "link" || token.type === "labelLink" && token._inactive) {
break;
}
if (events[index2][0] === "enter" && token.type === "labelLink") {
token._inactive = true;
}
} else if (close2) {
if (events[index2][0] === "enter" && (token.type === "labelImage" || token.type === "labelLink") && !token._balanced) {
open = index2;
if (token.type !== "labelLink") {
offset = 2;
break;
}
}
} else if (token.type === "labelEnd") {
close2 = index2;
}
}
const group = {
type: events[open][1].type === "labelLink" ? "link" : "image",
start: Object.assign({}, events[open][1].start),
end: Object.assign({}, events[events.length - 1][1].end)
};
const label = {
type: "label",
start: Object.assign({}, events[open][1].start),
end: Object.assign({}, events[close2][1].end)
};
const text3 = {
type: "labelText",
start: Object.assign({}, events[open + offset + 2][1].end),
end: Object.assign({}, events[close2 - 2][1].start)
};
media = [
["enter", group, context],
["enter", label, context]
];
media = push(media, events.slice(open + 1, open + offset + 3));
media = push(media, [["enter", text3, context]]);
media = push(media, resolveAll(context.parser.constructs.insideSpan.null, events.slice(open + offset + 4, close2 - 3), context));
media = push(media, [
["exit", text3, context],
events[close2 - 2],
events[close2 - 1],
["exit", label, context]
]);
media = push(media, events.slice(close2 + 1));
media = push(media, [["exit", group, context]]);
splice(events, open, events.length, media);
return events;
}
function tokenizeLabelEnd(effects, ok, nok) {
const self2 = this;
let index2 = self2.events.length;
let labelStart;
let defined;
while (index2--) {
if ((self2.events[index2][1].type === "labelImage" || self2.events[index2][1].type === "labelLink") && !self2.events[index2][1]._balanced) {
labelStart = self2.events[index2][1];
break;
}
}
return start;
function start(code) {
if (!labelStart) {
return nok(code);
}
if (labelStart._inactive)
return balanced(code);
defined = self2.parser.defined.includes(normalizeIdentifier(self2.sliceSerialize({
start: labelStart.end,
end: self2.now()
})));
effects.enter("labelEnd");
effects.enter("labelMarker");
effects.consume(code);
effects.exit("labelMarker");
effects.exit("labelEnd");
return afterLabelEnd;
}
function afterLabelEnd(code) {
if (code === 40) {
return effects.attempt(resourceConstruct, ok, defined ? ok : balanced)(code);
}
if (code === 91) {
return effects.attempt(fullReferenceConstruct, ok, defined ? effects.attempt(collapsedReferenceConstruct, ok, balanced) : balanced)(code);
}
return defined ? ok(code) : balanced(code);
}
function balanced(code) {
labelStart._balanced = true;
return nok(code);
}
}
function tokenizeResource(effects, ok, nok) {
return start;
function start(code) {
effects.enter("resource");
effects.enter("resourceMarker");
effects.consume(code);
effects.exit("resourceMarker");
return factoryWhitespace(effects, open);
}
function open(code) {
if (code === 41) {
return end(code);
}
return factoryDestination(effects, destinationAfter, nok, "resourceDestination", "resourceDestinationLiteral", "resourceDestinationLiteralMarker", "resourceDestinationRaw", "resourceDestinationString", 3)(code);
}
function destinationAfter(code) {
return markdownLineEndingOrSpace(code) ? factoryWhitespace(effects, between)(code) : end(code);
}
function between(code) {
if (code === 34 || code === 39 || code === 40) {
return factoryTitle(effects, factoryWhitespace(effects, end), nok, "resourceTitle", "resourceTitleMarker", "resourceTitleString")(code);
}
return end(code);
}
function end(code) {
if (code === 41) {
effects.enter("resourceMarker");
effects.consume(code);
effects.exit("resourceMarker");
effects.exit("resource");
return ok;
}
return nok(code);
}
}
function tokenizeFullReference(effects, ok, nok) {
const self2 = this;
return start;
function start(code) {
return factoryLabel.call(self2, effects, afterLabel, nok, "reference", "referenceMarker", "referenceString")(code);
}
function afterLabel(code) {
return self2.parser.defined.includes(normalizeIdentifier(self2.sliceSerialize(self2.events[self2.events.length - 1][1]).slice(1, -1))) ? ok(code) : nok(code);
}
}
function tokenizeCollapsedReference(effects, ok, nok) {
return start;
function start(code) {
effects.enter("reference");
effects.enter("referenceMarker");
effects.consume(code);
effects.exit("referenceMarker");
return open;
}
function open(code) {
if (code === 93) {
effects.enter("referenceMarker");
effects.consume(code);
effects.exit("referenceMarker");
effects.exit("reference");
return ok;
}
return nok(code);
}
}
var labelStartImage = {
name: "labelStartImage",
tokenize: tokenizeLabelStartImage,
resolveAll: labelEnd.resolveAll
};
function tokenizeLabelStartImage(effects, ok, nok) {
const self2 = this;
return start;
function start(code) {
effects.enter("labelImage");
effects.enter("labelImageMarker");
effects.consume(code);
effects.exit("labelImageMarker");
return open;
}
function open(code) {
if (code === 91) {
effects.enter("labelMarker");
effects.consume(code);
effects.exit("labelMarker");
effects.exit("labelImage");
return after;
}
return nok(code);
}
function after(code) {
return code === 94 && "_hiddenFootnoteSupport" in self2.parser.constructs ? nok(code) : ok(code);
}
}
var labelStartLink = {
name: "labelStartLink",
tokenize: tokenizeLabelStartLink,
resolveAll: labelEnd.resolveAll
};
function tokenizeLabelStartLink(effects, ok, nok) {
const self2 = this;
return start;
function start(code) {
effects.enter("labelLink");
effects.enter("labelMarker");
effects.consume(code);
effects.exit("labelMarker");
effects.exit("labelLink");
return after;
}
function after(code) {
return code === 94 && "_hiddenFootnoteSupport" in self2.parser.constructs ? nok(code) : ok(code);
}
}
var lineEnding = {
name: "lineEnding",
tokenize: tokenizeLineEnding
};
function tokenizeLineEnding(effects, ok) {
return start;
function start(code) {
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
return factorySpace(effects, ok, "linePrefix");
}
}
var thematicBreak = {
name: "thematicBreak",
tokenize: tokenizeThematicBreak
};
function tokenizeThematicBreak(effects, ok, nok) {
let size = 0;
let marker;
return start;
function start(code) {
effects.enter("thematicBreak");
marker = code;
return atBreak(code);
}
function atBreak(code) {
if (code === marker) {
effects.enter("thematicBreakSequence");
return sequence(code);
}
if (markdownSpace(code)) {
return factorySpace(effects, atBreak, "whitespace")(code);
}
if (size < 3 || code !== null && !markdownLineEnding(code)) {
return nok(code);
}
effects.exit("thematicBreak");
return ok(code);
}
function sequence(code) {
if (code === marker) {
effects.consume(code);
size++;
return sequence;
}
effects.exit("thematicBreakSequence");
return atBreak(code);
}
}
var list = {
name: "list",
tokenize: tokenizeListStart,
continuation: {
tokenize: tokenizeListContinuation
},
exit: tokenizeListEnd
};
var listItemPrefixWhitespaceConstruct = {
tokenize: tokenizeListItemPrefixWhitespace,
partial: true
};
var indentConstruct = {
tokenize: tokenizeIndent,
partial: true
};
function tokenizeListStart(effects, ok, nok) {
const self2 = this;
const tail = self2.events[self2.events.length - 1];
let initialSize = tail && tail[1].type === "linePrefix" ? tail[2].sliceSerialize(tail[1], true).length : 0;
let size = 0;
return start;
function start(code) {
const kind = self2.containerState.type || (code === 42 || code === 43 || code === 45 ? "listUnordered" : "listOrdered");
if (kind === "listUnordered" ? !self2.containerState.marker || code === self2.containerState.marker : asciiDigit(code)) {
if (!self2.containerState.type) {
self2.containerState.type = kind;
effects.enter(kind, {
_container: true
});
}
if (kind === "listUnordered") {
effects.enter("listItemPrefix");
return code === 42 || code === 45 ? effects.check(thematicBreak, nok, atMarker)(code) : atMarker(code);
}
if (!self2.interrupt || code === 49) {
effects.enter("listItemPrefix");
effects.enter("listItemValue");
return inside(code);
}
}
return nok(code);
}
function inside(code) {
if (asciiDigit(code) && ++size < 10) {
effects.consume(code);
return inside;
}
if ((!self2.interrupt || size < 2) && (self2.containerState.marker ? code === self2.containerState.marker : code === 41 || code === 46)) {
effects.exit("listItemValue");
return atMarker(code);
}
return nok(code);
}
function atMarker(code) {
effects.enter("listItemMarker");
effects.consume(code);
effects.exit("listItemMarker");
self2.containerState.marker = self2.containerState.marker || code;
return effects.check(blankLine, self2.interrupt ? nok : onBlank, effects.attempt(listItemPrefixWhitespaceConstruct, endOfPrefix, otherPrefix));
}
function onBlank(code) {
self2.containerState.initialBlankLine = true;
initialSize++;
return endOfPrefix(code);
}
function otherPrefix(code) {
if (markdownSpace(code)) {
effects.enter("listItemPrefixWhitespace");
effects.consume(code);
effects.exit("listItemPrefixWhitespace");
return endOfPrefix;
}
return nok(code);
}
function endOfPrefix(code) {
self2.containerState.size = initialSize + self2.sliceSerialize(effects.exit("listItemPrefix"), true).length;
return ok(code);
}
}
function tokenizeListContinuation(effects, ok, nok) {
const self2 = this;
self2.containerState._closeFlow = void 0;
return effects.check(blankLine, onBlank, notBlank);
function onBlank(code) {
self2.containerState.furtherBlankLines = self2.containerState.furtherBlankLines || self2.containerState.initialBlankLine;
return factorySpace(effects, ok, "listItemIndent", self2.containerState.size + 1)(code);
}
function notBlank(code) {
if (self2.containerState.furtherBlankLines || !markdownSpace(code)) {
self2.containerState.furtherBlankLines = void 0;
self2.containerState.initialBlankLine = void 0;
return notInCurrentItem(code);
}
self2.containerState.furtherBlankLines = void 0;
self2.containerState.initialBlankLine = void 0;
return effects.attempt(indentConstruct, ok, notInCurrentItem)(code);
}
function notInCurrentItem(code) {
self2.containerState._closeFlow = true;
self2.interrupt = void 0;
return factorySpace(effects, effects.attempt(list, ok, nok), "linePrefix", self2.parser.constructs.disable.null.includes("codeIndented") ? void 0 : 4)(code);
}
}
function tokenizeIndent(effects, ok, nok) {
const self2 = this;
return factorySpace(effects, afterPrefix, "listItemIndent", self2.containerState.size + 1);
function afterPrefix(code) {
const tail = self2.events[self2.events.length - 1];
return tail && tail[1].type === "listItemIndent" && tail[2].sliceSerialize(tail[1], true).length === self2.containerState.size ? ok(code) : nok(code);
}
}
function tokenizeListEnd(effects) {
effects.exit(this.containerState.type);
}
function tokenizeListItemPrefixWhitespace(effects, ok, nok) {
const self2 = this;
return factorySpace(effects, afterPrefix, "listItemPrefixWhitespace", self2.parser.constructs.disable.null.includes("codeIndented") ? void 0 : 4 + 1);
function afterPrefix(code) {
const tail = self2.events[self2.events.length - 1];
return !markdownSpace(code) && tail && tail[1].type === "listItemPrefixWhitespace" ? ok(code) : nok(code);
}
}
var setextUnderline = {
name: "setextUnderline",
tokenize: tokenizeSetextUnderline,
resolveTo: resolveToSetextUnderline
};
function resolveToSetextUnderline(events, context) {
let index2 = events.length;
let content3;
let text3;
let definition2;
while (index2--) {
if (events[index2][0] === "enter") {
if (events[index2][1].type === "content") {
content3 = index2;
break;
}
if (events[index2][1].type === "paragraph") {
text3 = index2;
}
} else {
if (events[index2][1].type === "content") {
events.splice(index2, 1);
}
if (!definition2 && events[index2][1].type === "definition") {
definition2 = index2;
}
}
}
const heading = {
type: "setextHeading",
start: Object.assign({}, events[text3][1].start),
end: Object.assign({}, events[events.length - 1][1].end)
};
events[text3][1].type = "setextHeadingText";
if (definition2) {
events.splice(text3, 0, ["enter", heading, context]);
events.splice(definition2 + 1, 0, ["exit", events[content3][1], context]);
events[content3][1].end = Object.assign({}, events[definition2][1].end);
} else {
events[content3][1] = heading;
}
events.push(["exit", heading, context]);
return events;
}
function tokenizeSetextUnderline(effects, ok, nok) {
const self2 = this;
let index2 = self2.events.length;
let marker;
let paragraph;
while (index2--) {
if (self2.events[index2][1].type !== "lineEnding" && self2.events[index2][1].type !== "linePrefix" && self2.events[index2][1].type !== "content") {
paragraph = self2.events[index2][1].type === "paragraph";
break;
}
}
return start;
function start(code) {
if (!self2.parser.lazy[self2.now().line] && (self2.interrupt || paragraph)) {
effects.enter("setextHeadingLine");
effects.enter("setextHeadingLineSequence");
marker = code;
return closingSequence(code);
}
return nok(code);
}
function closingSequence(code) {
if (code === marker) {
effects.consume(code);
return closingSequence;
}
effects.exit("setextHeadingLineSequence");
return factorySpace(effects, closingSequenceEnd, "lineSuffix")(code);
}
function closingSequenceEnd(code) {
if (code === null || markdownLineEnding(code)) {
effects.exit("setextHeadingLine");
return ok(code);
}
return nok(code);
}
}
var flow = {
tokenize: initializeFlow
};
function initializeFlow(effects) {
const self2 = this;
const initial = effects.attempt(blankLine, atBlankEnding, effects.attempt(this.parser.constructs.flowInitial, afterConstruct, factorySpace(effects, effects.attempt(this.parser.constructs.flow, afterConstruct, effects.attempt(content2, afterConstruct)), "linePrefix")));
return initial;
function atBlankEnding(code) {
if (code === null) {
effects.consume(code);
return;
}
effects.enter("lineEndingBlank");
effects.consume(code);
effects.exit("lineEndingBlank");
self2.currentConstruct = void 0;
return initial;
}
function afterConstruct(code) {
if (code === null) {
effects.consume(code);
return;
}
effects.enter("lineEnding");
effects.consume(code);
effects.exit("lineEnding");
self2.currentConstruct = void 0;
return initial;
}
}
var resolver = {
resolveAll: createResolver()
};
var string = initializeFactory("string");
var text = initializeFactory("text");
function initializeFactory(field) {
return {
tokenize: initializeText,
resolveAll: createResolver(field === "text" ? resolveAllLineSuffixes : void 0)
};
function initializeText(effects) {
const self2 = this;
const constructs2 = this.parser.constructs[field];
const text3 = effects.attempt(constructs2, start, notText);
return start;
function start(code) {
return atBreak(code) ? text3(code) : notText(code);
}
function notText(code) {
if (code === null) {
effects.consume(code);
return;
}
effects.enter("data");
effects.consume(code);
return data;
}
function data(code) {
if (atBreak(code)) {
effects.exit("data");
return text3(code);
}
effects.consume(code);
return data;
}
function atBreak(code) {
if (code === null) {
return true;
}
const list2 = constructs2[code];
let index2 = -1;
if (list2) {
while (++index2 < list2.length) {
const item = list2[index2];
if (!item.previous || item.previous.call(self2, self2.previous)) {
return true;
}
}
}
return false;
}
}
}
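Each tokenizer above is a small state machine: a state function receives one character code, drives `effects.enter`/`consume`/`exit`, and returns the next state (or hands off via `ok`/`nok`). The following standalone sketch imitates that pattern for a setext-style underline; it is illustrative only and shares no code with the bundle (the `tokenize` name and the event shape are made up here).

```javascript
// Illustrative sketch of the state-function pattern used by the bundled
// micromark code: each state inspects one character code and returns the
// next state; `null` marks the end of input.
const tokenize = (input) => {
  const codes = [...input].map((c) => c.codePointAt(0))
  const events = []
  let pos = 0

  const effects = {
    enter: (type) => events.push(["enter", type, pos]),
    exit: (type) => events.push(["exit", type, pos]),
    consume: () => pos++,
  }

  // States: accept one or more identical `=` or `-` characters
  // followed by end of input, like a setext heading underline.
  let marker
  function start(code) {
    if (code === 61 /* `=` */ || code === 45 /* `-` */) {
      effects.enter("setextHeadingLineSequence")
      marker = code
      return sequence(code)
    }
    return null // nok
  }
  function sequence(code) {
    if (code === marker) {
      effects.consume(code)
      return sequence
    }
    if (code === null) {
      effects.exit("setextHeadingLineSequence")
      return events // ok
    }
    return null // nok
  }

  // Drive the machine: feed one code per step until a state yields a result.
  let state = start
  while (typeof state === "function") {
    state = state(pos < codes.length ? codes[pos] : null)
  }
  return state
}
```

Running `tokenize("===")` yields one enter/exit event pair; any non-marker character rejects the line, mirroring how `nok` bails out in the bundled code.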
function createResolver(extraResolver) {
return resolveAllText;
function resolveAllText(events, context) {
let index2 = -1;
let enter;
while (++index2 <= events.length) {
if (enter === void 0) {
if (events[index2] && events[index2][1].type === "data") {
enter = index2;
index2++;
}
} else if (!events[index2] || events[index2][1].type !== "data") {
if (index2 !== enter + 2) {
events[enter][1].end = events[index2 - 1][1].end;
events.splice(enter + 2, index2 - 2 - enter);
index2 = enter + 2;
}
enter = void 0;
}
}
return extraResolver ? extraResolver(events, context) : events;
}
}

gitextract_6at5w77x/
├── .circleci/
│ └── config.yml
├── .eslintignore
├── .eslintrc.json
├── .github/
│ └── ISSUE_TEMPLATE/
│ ├── bug_report.md
│ ├── documentation.md
│ ├── feature_request.md
│ └── question.md
├── .gitignore
├── .gramma.json
├── .husky/
│ ├── commit-msg
│ ├── post-commit
│ └── pre-commit
├── .npmignore
├── .prettierrc
├── CHANGELOG.md
├── LICENSE.md
├── README.md
├── _config.yml
├── _layouts/
│ └── default.html
├── assets/
│ └── css/
│ └── style.scss
├── bundle/
│ └── gramma.esm.js
├── data/
│ ├── languages.json
│ └── rules.json
├── examples/
│ ├── api-markdown.js
│ ├── api-plain.js
│ └── api-simple.js
├── hello.md
├── lib/
│ ├── findUpSync.mjs
│ ├── package.json
│ └── prepareMarkdown.mjs
├── package.json
├── scripts/
│ ├── checkLanguagesSupport.js
│ └── zipBinaries.js
├── src/
│ ├── actions/
│ │ ├── checkInteractively.js
│ │ ├── checkNonInteractively.js
│ │ ├── configure.js
│ │ ├── save.js
│ │ └── saveNow.js
│ ├── boot/
│ │ ├── load.js
│ │ └── prepareConfig.js
│ ├── cli.js
│ ├── cli.test.js
│ ├── commands/
│ │ ├── check.js
│ │ ├── commit.js
│ │ ├── config.js
│ │ ├── debug.js
│ │ ├── hook.js
│ │ ├── init.js
│ │ ├── listen.js
│ │ ├── paths.js
│ │ └── server.js
│ ├── components/
│ │ ├── FixMenu.js
│ │ ├── FixMenu.test.js
│ │ ├── Mistake.js
│ │ └── Mistake.test.js
│ ├── context.js
│ ├── index.d.ts
│ ├── index.js
│ ├── initialConfig.js
│ ├── prompts/
│ │ ├── confirmConfig.js
│ │ ├── confirmInit.js
│ │ ├── confirmPort.js
│ │ ├── confirmServerReinstall.js
│ │ ├── handleMistake.js
│ │ ├── handleSave.js
│ │ └── mainMenu.js
│ ├── requests/
│ │ ├── checkViaAPI.d.ts
│ │ ├── checkViaAPI.js
│ │ ├── checkViaCmd.js
│ │ ├── checkWithFallback.js
│ │ └── updates.js
│ ├── server/
│ │ ├── getServerInfo.js
│ │ ├── getServerPID.js
│ │ ├── installServer.js
│ │ ├── showServerGUI.js
│ │ ├── startServer.js
│ │ └── stopServer.js
│ ├── text-manipulation/
│ │ ├── replace.js
│ │ ├── replace.test.js
│ │ ├── replaceAll.d.ts
│ │ ├── replaceAll.js
│ │ └── replaceAll.test.js
│ ├── utils/
│ │ ├── appLocation.js
│ │ ├── downloadFile.js
│ │ ├── equal.js
│ │ ├── findUpSync.js
│ │ ├── prepareMarkdown.js
│ │ ├── stripStyles.js
│ │ ├── stripStyles.test.js
│ │ └── unzipFile.js
│ └── validators/
│ ├── languages.js
│ └── rules.js
└── tsconfig.json
SYMBOL INDEX (199 symbols across 8 files)
FILE: bundle/gramma.esm.js
method "node_modules/whatwg-fetch/dist/fetch.umd.js" (line 7) | "node_modules/whatwg-fetch/dist/fetch.umd.js"(exports, module) {
method "node_modules/isomorphic-fetch/fetch-npm-browserify.js" (line 505) | "node_modules/isomorphic-fetch/fetch-npm-browserify.js"(exports, module) {
method "node_modules/strict-uri-encode/index.js" (line 513) | "node_modules/strict-uri-encode/index.js"(exports, module) {
method "node_modules/decode-uri-component/index.js" (line 521) | "node_modules/decode-uri-component/index.js"(exports, module) {
method "node_modules/split-on-first/index.js" (line 592) | "node_modules/split-on-first/index.js"(exports, module) {
method "node_modules/filter-obj/index.js" (line 615) | "node_modules/filter-obj/index.js"(exports, module) {
method "node_modules/query-string/index.js" (line 635) | "node_modules/query-string/index.js"(exports) {
method "data/rules.json" (line 960) | "data/rules.json"(exports, module) {
method "src/validators/rules.js" (line 1032) | "src/validators/rules.js"(exports, module) {
method "src/initialConfig.js" (line 1047) | "src/initialConfig.js"(exports, module) {
method "src/utils/prepareMarkdown.js" (line 1066) | "src/utils/prepareMarkdown.js"(exports) {
method "src/requests/checkViaAPI.js" (line 6693) | "src/requests/checkViaAPI.js"(exports, module) {
method "src/text-manipulation/replace.js" (line 6753) | "src/text-manipulation/replace.js"(exports, module) {
method "src/text-manipulation/replaceAll.js" (line 6767) | "src/text-manipulation/replaceAll.js"(exports, module) {
method "src/index.js" (line 6780) | "src/index.js"(exports, module) {
FILE: scripts/checkLanguagesSupport.js
constant LOCAL_API_URL (line 5) | const LOCAL_API_URL = "http://localhost:8082/v2/languages"
constant LANGUAGETOOL_ORG_LIMIT (line 34) | const LANGUAGETOOL_ORG_LIMIT = 20 // req/min
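The script stays under that 20 req/min cap when querying languagetool.org. A generic way to enforce such a cap is to space calls by `60000 / limit` milliseconds; the `throttled` helper below is a hypothetical sketch of that idea, not code from the script.

```javascript
// Hypothetical rate limiter: wraps an async function so consecutive calls
// are spaced at least (60000 / limitPerMinute) ms apart.
const throttled = (fn, limitPerMinute) => {
  const gapMs = 60000 / limitPerMinute
  let next = 0 // earliest timestamp the next call may start

  return async (...args) => {
    const now = Date.now()
    const wait = Math.max(0, next - now)
    next = Math.max(now, next) + gapMs
    await new Promise((resolve) => setTimeout(resolve, wait))
    return fn(...args)
  }
}
```

With `limitPerMinute = 20`, each call is scheduled at least three seconds after the previous one.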
FILE: src/commands/hook.js
constant REDIRECT_STDIN (line 12) | const REDIRECT_STDIN = "\n\nexec < /dev/tty"
FILE: src/prompts/handleMistake.js
method validate (line 24) | validate(input) {
FILE: src/requests/checkViaAPI.js
constant MAX_REPLACEMENTS (line 28) | const MAX_REPLACEMENTS = 30
FILE: src/requests/checkViaCmd.js
constant MAX_REPLACEMENTS (line 37) | const MAX_REPLACEMENTS = 30
FILE: src/utils/findUpSync.js
method constructor (line 61) | constructor(value) {
method constructor (line 69) | constructor() {
method enqueue (line 72) | enqueue(value) {
method dequeue (line 83) | dequeue() {
method clear (line 92) | clear() {
method size (line 97) | get size() {
method [Symbol.iterator] (line 100) | *[Symbol.iterator]() {
function checkType (line 114) | function checkType(type) {
function locatePathSync (line 121) | function locatePathSync(
function findUpMultipleSync (line 148) | function findUpMultipleSync(name, options = {}) {
function findUpSync (line 180) | function findUpSync(name, options = {}) {
FILE: src/utils/prepareMarkdown.js
method "node_modules/format/format.js" (line 61) | "node_modules/format/format.js"(exports, module2) {
method "node_modules/is-buffer/index.js" (line 171) | "node_modules/is-buffer/index.js"(exports, module2) {
method "node_modules/extend/index.js" (line 185) | "node_modules/extend/index.js"(exports, module2) {
method children (line 294) | children(node) {
method annotatetextnode (line 297) | annotatetextnode(node, text3) {
method interpretmarkup (line 313) | interpretmarkup(text3 = "") {
function collecttextnodes (line 317) | function collecttextnodes(ast, text3, options = defaults) {
function composeannotation (line 332) | function composeannotation(text3, annotatedtextnodes, options = defaults) {
function build (line 364) | function build(text3, parse3, options = defaults) {
function create (line 380) | function create(Constructor) {
function matters (line 395) | function matters(options = "yaml") {
function matter (line 406) | function matter(option) {
function asciiControl (line 439) | function asciiControl(code) {
function markdownLineEndingOrSpace (line 442) | function markdownLineEndingOrSpace(code) {
function markdownLineEnding (line 445) | function markdownLineEnding(code) {
function markdownSpace (line 448) | function markdownSpace(code) {
function regexCheck (line 453) | function regexCheck(regex) {
function frontmatter (line 461) | function frontmatter(options) {
function parse (line 480) | function parse(matter2) {
function fence (line 583) | function fence(matter2, prop) {
function pick (line 588) | function pick(schema, prop) {
function frontmatterFromMarkdown (line 593) | function frontmatterFromMarkdown(options) {
function opener (line 606) | function opener(matter2) {
function close (line 613) | function close(token) {
function value (line 617) | function value(token) {
function frontmatterToMarkdown (line 621) | function frontmatterToMarkdown(options) {
function handler (line 633) | function handler(matter2) {
function fence2 (line 641) | function fence2(matter2, prop) {
function pick2 (line 646) | function pick2(schema, prop) {
function remarkFrontmatter (line 651) | function remarkFrontmatter(options = "yaml") {
function toString (line 663) | function toString(node, options) {
function one (line 667) | function one(node, includeImageAlt) {
function all (line 678) | function all(values, includeImageAlt) {
function splice (line 688) | function splice(list2, start, remove, items) {
function push (line 713) | function push(list2, items) {
function combineExtensions (line 723) | function combineExtensions(extensions) {
function syntaxExtension (line 731) | function syntaxExtension(all2, extension2) {
function constructs (line 748) | function constructs(existing, list2) {
function factorySpace (line 758) | function factorySpace(effects, ok, type, max) {
function initializeContent (line 783) | function initializeContent(effects) {
function initializeDocument (line 840) | function initializeDocument(effects) {
function tokenizeContainer (line 1034) | function tokenizeContainer(effects, ok, nok) {
function classifyCharacter (line 1044) | function classifyCharacter(code) {
function resolveAll (line 1058) | function resolveAll(constructs2, events, context) {
function resolveAllAttention (line 1077) | function resolveAllAttention(events, context) {
function tokenizeAttention (line 1198) | function tokenizeAttention(effects, ok) {
function movePoint (line 1227) | function movePoint(point2, offset) {
function tokenizeAutolink (line 1238) | function tokenizeAutolink(effects, ok, nok) {
function tokenizeBlankLine (line 1334) | function tokenizeBlankLine(effects, ok, nok) {
function tokenizeBlockQuoteStart (line 1350) | function tokenizeBlockQuoteStart(effects, ok, nok) {
function tokenizeBlockQuoteContinuation (line 1382) | function tokenizeBlockQuoteContinuation(effects, ok, nok) {
function exit (line 1390) | function exit(effects) {
function tokenizeCharacterEscape (line 1399) | function tokenizeCharacterEscape(effects, ok, nok) {
function decodeEntity (line 1423) | function decodeEntity(characters) {
function tokenizeCharacterReference (line 1440) | function tokenizeCharacterReference(effects, ok, nok) {
function tokenizeCodeFenced (line 1510) | function tokenizeCodeFenced(effects, ok, nok) {
function tokenizeCodeIndented (line 1682) | function tokenizeCodeIndented(effects, ok, nok) {
function tokenizeIndentedContent (line 1720) | function tokenizeIndentedContent(effects, ok, nok) {
function resolveCodeText (line 1754) | function resolveCodeText(events) {
function previous (line 1799) | function previous(code) {
function tokenizeCodeText (line 1805) | function tokenizeCodeText(effects, ok, nok) {
function subtokenize (line 1879) | function subtokenize(events) {
function subcontent (line 1958) | function subcontent(events, eventIndex) {
function resolveContent (line 2043) | function resolveContent(events) {
function tokenizeContent (line 2047) | function tokenizeContent(effects, ok) {
function tokenizeContinuation (line 2087) | function tokenizeContinuation(effects, ok, nok) {
function factoryDestination (line 2115) | function factoryDestination(
function factoryLabel (line 2222) | function factoryLabel(effects, ok, nok, type, markerType, stringType) {
function factoryTitle (line 2292) | function factoryTitle(effects, ok, nok, type, markerType, stringType) {
function factoryWhitespace (line 2351) | function factoryWhitespace(effects, ok) {
function normalizeIdentifier (line 2374) | function normalizeIdentifier(value2) {
function tokenizeDefinition (line 2391) | function tokenizeDefinition(effects, ok, nok) {
function tokenizeTitle (line 2446) | function tokenizeTitle(effects, ok, nok) {
function tokenizeHardBreakEscape (line 2476) | function tokenizeHardBreakEscape(effects, ok, nok) {
function resolveHeadingAtx (line 2500) | function resolveHeadingAtx(events, context) {
function tokenizeHeadingAtx (line 2543) | function tokenizeHeadingAtx(effects, ok, nok) {
function resolveToHtmlFlow (line 2674) | function resolveToHtmlFlow(events) {
function tokenizeHtmlFlow (line 2691) | function tokenizeHtmlFlow(effects, ok, nok) {
function tokenizeNextBlank (line 3055) | function tokenizeNextBlank(effects, ok, nok) {
function tokenizeHtmlText (line 3071) | function tokenizeHtmlText(effects, ok, nok) {
function resolveAllLabelEnd (line 3423) | function resolveAllLabelEnd(events) {
function resolveToLabelEnd (line 3440) | function resolveToLabelEnd(events, context) {
function tokenizeLabelEnd (line 3515) | function tokenizeLabelEnd(effects, ok, nok) {
function tokenizeResource (line 3575) | function tokenizeResource(effects, ok, nok) {
function tokenizeFullReference (line 3629) | function tokenizeFullReference(effects, ok, nok) {
function tokenizeCollapsedReference (line 3655) | function tokenizeCollapsedReference(effects, ok, nok) {
function tokenizeLabelStartImage (line 3682) | function tokenizeLabelStartImage(effects, ok, nok) {
function tokenizeLabelStartLink (line 3715) | function tokenizeLabelStartLink(effects, ok, nok) {
function tokenizeLineEnding (line 3738) | function tokenizeLineEnding(effects, ok) {
function tokenizeThematicBreak (line 3753) | function tokenizeThematicBreak(effects, ok, nok) {
function tokenizeListStart (line 3804) | function tokenizeListStart(effects, ok, nok) {
function tokenizeListContinuation (line 3896) | function tokenizeListContinuation(effects, ok, nok) {
function tokenizeIndent (line 3932) | function tokenizeIndent(effects, ok, nok) {
function tokenizeListEnd (line 3949) | function tokenizeListEnd(effects) {
function tokenizeListItemPrefixWhitespace (line 3952) | function tokenizeListItemPrefixWhitespace(effects, ok, nok) {
function resolveToSetextUnderline (line 3978) | function resolveToSetextUnderline(events, context) {
function tokenizeSetextUnderline (line 4017) | function tokenizeSetextUnderline(effects, ok, nok) {
function initializeFlow (line 4063) | function initializeFlow(effects) {
function initializeFactory (line 4113) | function initializeFactory(field) {
function createResolver (line 4163) | function createResolver(extraResolver) {
function resolveAllLineSuffixes (line 4186) | function resolveAllLineSuffixes(events, context) {
function createTokenizer (line 4256) | function createTokenizer(parser, initialize, from) {
function sliceChunks (line 4504) | function sliceChunks(chunks, token) {
function serializeChunks (line 4523) | function serializeChunks(chunks, expandTabs) {
function parse2 (line 4641) | function parse2(options = {}) {
function preprocess (line 4666) | function preprocess() {
function postprocess (line 4745) | function postprocess(events) {
function decodeNumericCharacterReference (line 4751) | function decodeNumericCharacterReference(value2, base2) {
function decodeString (line 4772) | function decodeString(value2) {
function decode (line 4775) | function decode($0, $1, $2) {
function stringifyPosition (line 4790) | function stringifyPosition(value2) {
function point (line 4805) | function point(point2) {
function position (line 4808) | function position(pos) {
function index (line 4811) | function index(value2) {
function compiler (line 4828) | function compiler(options = {}) {
function configure (line 5545) | function configure(combined, extensions) {
function extension (line 5557) | function extension(combined, extension2) {
function remarkParse (line 5577) | function remarkParse(options) {
function bail (line 5595) | function bail(error) {
function isPlainObject (line 5606) | function isPlainObject(value2) {
function trough (line 5615) | function trough() {
function wrap (line 5656) | function wrap(middleware, callback) {
method constructor (line 5700) | constructor(reason, place, origin) {
function basename (line 5761) | function basename(path2, ext) {
function dirname (line 5819) | function dirname(path2) {
function extname (line 5845) | function extname(path2) {
function join (line 5886) | function join(...segments) {
function normalize (line 5898) | function normalize(path2) {
function normalizeString (line 5910) | function normalizeString(path2, allowAboveRoot) {
function assertPath (line 5979) | function assertPath(path2) {
function cwd (line 5989) | function cwd() {
function isUrl (line 5994) | function isUrl(fileURLOrPath) {
function urlToPath (line 6004) | function urlToPath(path2) {
function getPathFromURLPosix (line 6023) | function getPathFromURLPosix(url) {
method constructor (line 6054) | constructor(value2) {
method path (line 6088) | get path() {
method path (line 6091) | set path(path2) {
method dirname (line 6100) | get dirname() {
method dirname (line 6103) | set dirname(dirname2) {
method basename (line 6107) | get basename() {
method basename (line 6110) | set basename(basename2) {
method extname (line 6115) | get extname() {
method extname (line 6118) | set extname(extname2) {
method stem (line 6131) | get stem() {
method stem (line 6136) | set stem(stem) {
method toString (line 6141) | toString(encoding) {
method message (line 6144) | message(reason, place, origin) {
method info (line 6154) | info(reason, place, origin) {
method fail (line 6159) | fail(reason, place, origin) {
function assertPart (line 6165) | function assertPart(part, name) {
function assertNonEmpty (line 6172) | function assertNonEmpty(part, name) {
function assertPath2 (line 6177) | function assertPath2(path2, name) {
function base (line 6186) | function base() {
function newable (line 6429) | function newable(value2, name) {
function keys (line 6436) | function keys(value2) {
function assertParser (line 6445) | function assertParser(name, value2) {
function assertCompiler (line 6450) | function assertCompiler(name, value2) {
function assertUnfrozen (line 6455) | function assertUnfrozen(name, frozen) {
function assertNode (line 6464) | function assertNode(node) {
function assertDone (line 6469) | function assertDone(name, asyncName, complete) {
function vfile (line 6476) | function vfile(value2) {
function looksLikeAVFile (line 6479) | function looksLikeAVFile(value2) {
function looksLikeAVFileValue (line 6487) | function looksLikeAVFileValue(value2) {
method children (line 6493) | children(node) {
method annotatetextnode (line 6496) | annotatetextnode(node, text3) {
method interpretmarkup (line 6499) | interpretmarkup(text3 = "") {
function build2 (line 6504) | function build2(text3, options = defaults2) {
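Among the symbols indexed for the bundle are `replace` and `replaceAll` from src/text-manipulation. Judging from the signature `replace(text, change, offset, length)` and the snippet in the condensed preview (slicing the text into before/mistake/after), the helper splices a suggested fix into the original text by offset. The version below is a plausible reconstruction for illustration, not the repository's exact code; the `shift` bookkeeping in `replaceAll` is an assumption about how fixes are applied in sequence.

```javascript
// Splice `change` into `text`, overwriting `length` characters at `offset`
// (the span where the grammar mistake was reported).
const replace = (text, change, offset, length) => {
  const before = text.slice(0, offset)
  const after = text.slice(offset + length)
  return before + change + after
}

// Apply a list of { change, offset, length } fixes whose offsets refer to
// the original text, tracking how earlier replacements shift later offsets.
// Assumed shape, for illustration; fixes must be sorted by offset.
const replaceAll = (text, fixes) => {
  let result = text
  let shift = 0
  for (const { change, offset, length } of fixes) {
    result = replace(result, change, offset + shift, length)
    shift += change.length - length
  }
  return result
}
```

This mirrors the typical workflow for LanguageTool matches, where each suggestion carries an offset and length into the checked text.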
Condensed preview — 93 files, each showing path, character count, and a content snippet (full structured content: 554K chars).
[
{
"path": ".circleci/config.yml",
"chars": 473,
"preview": "version: 2\njobs:\n build:\n docker:\n - image: cimg/node:14.18.0\n\n working_directory: ~/repo\n\n steps:\n "
},
{
"path": ".eslintignore",
"chars": 88,
"preview": "lib/prepareMarkdown.mjs\nsrc/utils/prepareMarkdown.js\nsrc/utils/findUpSync.js\n*/**/*.d.ts"
},
{
"path": ".eslintrc.json",
"chars": 272,
"preview": "{\n \"extends\": [\"airbnb\", \"prettier\"],\n \"rules\": {\n \"no-console\": 0,\n \"arrow-body-style\": 0,\n \"no-restricted-s"
},
{
"path": ".github/ISSUE_TEMPLATE/bug_report.md",
"chars": 606,
"preview": "---\nname: Bug report\nabout: Create a report to help us improve\ntitle: ''\nlabels: bug\nassignees: caderek\n\n---\n\n**Describe"
},
{
"path": ".github/ISSUE_TEMPLATE/documentation.md",
"chars": 230,
"preview": "---\nname: Documentation\nabout: All docs related issues\ntitle: ''\nlabels: documentation\nassignees: caderek\n\n---\n\n**Descri"
},
{
"path": ".github/ISSUE_TEMPLATE/feature_request.md",
"chars": 609,
"preview": "---\nname: Feature request\nabout: Suggest an idea for this project\ntitle: ''\nlabels: enhancement\nassignees: caderek\n\n---\n"
},
{
"path": ".github/ISSUE_TEMPLATE/question.md",
"chars": 163,
"preview": "---\nname: Question\nabout: All questions that do not require changes to the codebase\ntitle: ''\nlabels: help wanted\nassign"
},
{
"path": ".gitignore",
"chars": 110,
"preview": "node_modules\n.history\n.vscode\nexample-responses\nexample.txt\ncoverage\nbin\nassets/css/*.css\nassets/css/*.css.map"
},
{
"path": ".gramma.json",
"chars": 1705,
"preview": "{\n \"api_url\": \"https://api.languagetool.org/v2/check\",\n \"api_key\": \"\",\n \"dictionary\": [\n \"Asturian\",\n \"Bugfix\","
},
{
"path": ".husky/commit-msg",
"chars": 47,
"preview": "#!/bin/sh\n\nexec < /dev/tty\n\nnpx gramma hook $1\n"
},
{
"path": ".husky/post-commit",
"chars": 35,
"preview": "#!/bin/sh\n\nnpx gramma hook cleanup\n"
},
{
"path": ".husky/pre-commit",
"chars": 51,
"preview": "#!/bin/sh\n. \"$(dirname \"$0\")/_/husky.sh\"\n\nnpm test\n"
},
{
"path": ".npmignore",
"chars": 226,
"preview": "_layouts\nnode_modules\n.history\n.circleci\nexample-responses\nexample.txt\ncoverage\nbin\n.husky\n.github\n.vscode\nlib\nassets/cs"
},
{
"path": ".prettierrc",
"chars": 134,
"preview": "{\n \"trailingComma\": \"all\",\n \"tabWidth\": 2,\n \"semi\": false,\n \"singleQuote\": false,\n \"arrowParens\": \"always\",\n \"prin"
},
{
"path": "CHANGELOG.md",
"chars": 1267,
"preview": "# CHANGELOG\n\n## 1.0.0\n\nFirst stable release.\n\n## 1.1.0\n\n- Added Git hook integration\n- Updated dependencies and document"
},
{
"path": "LICENSE.md",
"chars": 728,
"preview": "Copyright 2021 Maciej Cąderek\n\nPermission to use, copy, modify, and/or distribute this software\nfor any purpose with or "
},
{
"path": "README.md",
"chars": 26716,
"preview": "<div align=\"center\">\n <img src=\"assets/gramma-logo.png\" alt=\"gramma-logo\" />\n</div>\n<div align=\"center\">\n <img src"
},
{
"path": "_config.yml",
"chars": 26,
"preview": "theme: jekyll-theme-cayman"
},
{
"path": "_layouts/default.html",
"chars": 5437,
"preview": "<!DOCTYPE html>\n<html lang=\"{{ site.lang | default: \"en-US\" }}\">\n <head>\n\n {% if site.google_analytics %}\n <scr"
},
{
"path": "assets/css/style.scss",
"chars": 1673,
"preview": "---\n---\n\n@import \"{{ site.theme }}\";\n\nbody {\n margin: 0;\n}\n\n.page-header {\n color: #fff;\n text-align: center;\n backg"
},
{
"path": "bundle/gramma.esm.js",
"chars": 229015,
"preview": "var __commonJS = (cb, mod) => function __require() {\n return mod || (0, cb[Object.keys(cb)[0]])((mod = { exports: {} })"
},
{
"path": "data/languages.json",
"chars": 6288,
"preview": "[\n {\n \"name\": \"Arabic\",\n \"code\": \"ar\",\n \"longCode\": \"ar\",\n \"grammarbotIo\": false,\n \"languagetoolOrg\": tr"
},
{
"path": "data/rules.json",
"chars": 1513,
"preview": "[\n {\n \"id\": \"CASING\",\n \"description\": \"Detecting uppercase words where lowercase is required and vice versa.\"\n }"
},
{
"path": "examples/api-markdown.js",
"chars": 245,
"preview": "const { check } = require(\"../src\")\n\nconst main = async () => {\n const { language, matches } = await check(`<a href=\"#x"
},
{
"path": "examples/api-plain.js",
"chars": 189,
"preview": "const { check } = require(\"../src\")\n\nconst main = async () => {\n const response = await check(`Helo worlt!`, {\n mark"
},
{
"path": "examples/api-simple.js",
"chars": 214,
"preview": "const { check } = require(\"../src\")\n\nconst main = async () => {\n const { language, matches } = await check(\"Some wrongg"
},
{
"path": "hello.md",
"chars": 14,
"preview": "Hello world!\n\n"
},
{
"path": "lib/findUpSync.mjs",
"chars": 162,
"preview": "// esbuild findUpSync.mjs --bundle --outfile=src/utils/findUpSync.js --format=cjs --platform=node\nimport { findUpSync } "
},
{
"path": "lib/package.json",
"chars": 304,
"preview": "{\n \"name\": \"lib\",\n \"version\": \"1.0.0\",\n \"description\": \"\",\n \"main\": \"index.js\",\n \"scripts\": {\n \"test\": \"echo \\\"E"
},
{
"path": "lib/prepareMarkdown.mjs",
"chars": 151,
"preview": "import * as builder from \"annotatedtext-remark\"\n\nconst prepareMarkdown = (text) => JSON.stringify(builder.build(text))\n\n"
},
{
"path": "package.json",
"chars": 2881,
"preview": "{\n \"name\": \"gramma\",\n \"version\": \"1.6.0\",\n \"license\": \"ISC\",\n \"repository\": \"https://github.com/caderek/gramma\",\n \""
},
{
"path": "scripts/checkLanguagesSupport.js",
"chars": 2025,
"preview": "const fs = require(\"fs\")\nconst querystring = require(\"querystring\")\nconst fetch = require(\"node-fetch\")\n\nconst LOCAL_API"
},
{
"path": "scripts/zipBinaries.js",
"chars": 986,
"preview": "const fs = require(\"fs\")\nconst { execSync } = require(\"child_process\")\nconst { version } = require(\"../package.json\")\n\nc"
},
{
"path": "src/actions/checkInteractively.js",
"chars": 3073,
"preview": "const kleur = require(\"kleur\")\nconst checkWithFallback = require(\"../requests/checkWithFallback\")\nconst Mistake = requir"
},
{
"path": "src/actions/checkNonInteractively.js",
"chars": 1046,
"preview": "const kleur = require(\"kleur\")\nconst checkWithFallback = require(\"../requests/checkWithFallback\")\nconst Mistake = requir"
},
{
"path": "src/actions/configure.js",
"chars": 2329,
"preview": "const fs = require(\"fs\")\nconst kleur = require(\"kleur\")\nconst { isRule, ruleOptions } = require(\"../validators/rules\")\nc"
},
{
"path": "src/actions/save.js",
"chars": 1045,
"preview": "const path = require(\"path\")\nconst fs = require(\"fs\")\nconst kleur = require(\"kleur\")\nconst { homedir } = require(\"os\")\nc"
},
{
"path": "src/actions/saveNow.js",
"chars": 151,
"preview": "const fs = require(\"fs\")\n\nconst saveNow = async (text, filePath) => {\n fs.writeFileSync(filePath, text)\n console.clear"
},
{
"path": "src/boot/load.js",
"chars": 267,
"preview": "const prepareConfig = require(\"./prepareConfig\")\n\nconst load = (action) => (argv) => {\n if (argv.file && argv.file.ends"
},
{
"path": "src/boot/prepareConfig.js",
"chars": 3409,
"preview": "const fs = require(\"fs\")\nconst path = require(\"path\")\nconst { platform, homedir } = require(\"os\")\nconst findUpSync = req"
},
{
"path": "src/cli.js",
"chars": 3567,
"preview": "#!/usr/bin/env node\nrequire(\"dotenv\").config()\nrequire(\"isomorphic-fetch\")\n\nconst yargs = require(\"yargs\")\nconst { versi"
},
{
"path": "src/cli.test.js",
"chars": 3096,
"preview": "const shell = require(\"shelljs\")\nconst fs = require(\"fs\")\n\nconst prepareData = (text) => {\n if (!fs.existsSync(\"test-te"
},
{
"path": "src/commands/check.js",
"chars": 1111,
"preview": "const intercept = require(\"intercept-stdout\")\nconst kleur = require(\"kleur\")\nconst fs = require(\"fs\")\nconst checkNonInte"
},
{
"path": "src/commands/commit.js",
"chars": 647,
"preview": "const fs = require(\"fs\")\nconst { execSync } = require(\"child_process\")\nconst path = require(\"path\")\nconst checkInteracti"
},
{
"path": "src/commands/config.js",
"chars": 552,
"preview": "const kleur = require(\"kleur\")\nconst configure = require(\"../actions/configure\")\nconst confirmConfig = require(\"../promp"
},
{
"path": "src/commands/debug.js",
"chars": 258,
"preview": "const debug = async (argv, cfg) => {\n console.log(\"config:\")\n console.log(cfg)\n console.log(\"------------------------"
},
{
"path": "src/commands/hook.js",
"chars": 4515,
"preview": "const kleur = require(\"kleur\")\nconst fs = require(\"fs\")\nconst path = require(\"path\")\nconst os = require(\"os\")\nconst { ex"
},
{
"path": "src/commands/init.js",
"chars": 915,
"preview": "const kleur = require(\"kleur\")\nconst fs = require(\"fs\")\nconst path = require(\"path\")\nconst initialConfig = require(\"../i"
},
{
"path": "src/commands/listen.js",
"chars": 722,
"preview": "const intercept = require(\"intercept-stdout\")\nconst checkNonInteractively = require(\"../actions/checkNonInteractively\")\n"
},
{
"path": "src/commands/paths.js",
"chars": 296,
"preview": "const appLocation = require(\"../utils/appLocation\")\n\nconst paths = (argv, cfg) => {\n console.log(`Global config: ${cfg."
},
{
"path": "src/commands/server.js",
"chars": 1280,
"preview": "const kleur = require(\"kleur\")\nconst installServer = require(\"../server/installServer\")\nconst startServer = require(\"../"
},
{
"path": "src/components/FixMenu.js",
"chars": 1093,
"preview": "const kleur = require(\"kleur\")\n\nconst FixOptions = (fixes) => {\n if (fixes.length === 0) {\n return \"\"\n }\n if (fixe"
},
{
"path": "src/components/FixMenu.test.js",
"chars": 1657,
"preview": "const stripStyles = require(\"../utils/stripStyles\")\nconst FixMenu = require(\"./FixMenu\")\n\ndescribe(\"FixMenu component\", "
},
{
"path": "src/components/Mistake.js",
"chars": 1314,
"preview": "const kleur = require(\"kleur\")\nconst replace = require(\"../text-manipulation/replace\")\n\nconst getMistakeColor = (type) ="
},
{
"path": "src/components/Mistake.test.js",
"chars": 2094,
"preview": "const stripStyles = require(\"../utils/stripStyles\")\nconst Mistake = require(\"./Mistake\")\n\ndescribe(\"Mistake component\", "
},
{
"path": "src/context.js",
"chars": 60,
"preview": "const context = {\n argv: null,\n}\n\nmodule.exports = context\n"
},
{
"path": "src/index.d.ts",
"chars": 140,
"preview": "import check = require(\"./requests/checkViaAPI\")\nimport replaceAll = require(\"./text-manipulation/replaceAll\")\nexport { "
},
{
"path": "src/index.js",
"chars": 183,
"preview": "require(\"isomorphic-fetch\")\n\nconst check = require(\"./requests/checkViaAPI\")\nconst replaceAll = require(\"./text-manipula"
},
{
"path": "src/initialConfig.js",
"chars": 303,
"preview": "const { ruleOptions } = require(\"./validators/rules\")\n\nconst rules = {}\n\nruleOptions.forEach((rule) => {\n rules[rule] ="
},
{
"path": "src/prompts/confirmConfig.js",
"chars": 338,
"preview": "const prompts = require(\"prompts\")\n\nconst confirmConfig = () => {\n return prompts([\n {\n type: \"toggle\",\n n"
},
{
"path": "src/prompts/confirmInit.js",
"chars": 640,
"preview": "const prompts = require(\"prompts\")\nconst initialConfig = require(\"../initialConfig\")\n\nconst confirmInit = (hasGit) => {\n"
},
{
"path": "src/prompts/confirmPort.js",
"chars": 319,
"preview": "const prompts = require(\"prompts\")\n\nconst confirmPort = () => {\n return prompts([\n {\n type: \"toggle\",\n nam"
},
{
"path": "src/prompts/confirmServerReinstall.js",
"chars": 294,
"preview": "const prompts = require(\"prompts\")\n\nconst confirmServerReinstall = () => {\n return prompts([\n {\n type: \"confirm"
},
{
"path": "src/prompts/handleMistake.js",
"chars": 885,
"preview": "const prompts = require(\"prompts\")\nconst FixMenu = require(\"../components/FixMenu\")\n\nconst handleMistake = (fixes, issue"
},
{
"path": "src/prompts/handleSave.js",
"chars": 1010,
"preview": "const prompts = require(\"prompts\")\nconst { platform } = require(\"os\")\n\nconst initialFileName = (originalFile) => {\n con"
},
{
"path": "src/prompts/mainMenu.js",
"chars": 500,
"preview": "const prompts = require(\"prompts\")\n\nconst mainMenu = () => {\n const choices = [\n { title: \"check file\", value: \"chec"
},
{
"path": "src/requests/checkViaAPI.d.ts",
"chars": 1018,
"preview": "export = checkViaAPI\n/**\n * Calls the provided LanguageTool API\n * and returns grammar checker suggestions.\n *\n * @param"
},
{
"path": "src/requests/checkViaAPI.js",
"chars": 2712,
"preview": "const queryString = require(\"query-string\")\nconst initialConfig = require(\"../initialConfig\")\n// @ts-ignore\nconst prepar"
},
{
"path": "src/requests/checkViaCmd.js",
"chars": 2743,
"preview": "const fs = require(\"fs\")\nconst path = require(\"path\")\nconst kleur = require(\"kleur\")\nconst { execSync } = require(\"child"
},
{
"path": "src/requests/checkWithFallback.js",
"chars": 1862,
"preview": "const kleur = require(\"kleur\")\nconst startServer = require(\"../server/startServer\")\nconst checkViaAPI = require(\"./check"
},
{
"path": "src/requests/updates.js",
"chars": 1739,
"preview": "const fs = require(\"fs\")\nconst path = require(\"path\")\nconst kleur = require(\"kleur\")\nconst { version } = require(\"../../"
},
{
"path": "src/server/getServerInfo.js",
"chars": 425,
"preview": "const kleur = require(\"kleur\")\n\nconst getServerInfo = (cfg) => {\n if (cfg.global.server_pid) {\n console.log(kleur.gr"
},
{
"path": "src/server/getServerPID.js",
"chars": 276,
"preview": "const kleur = require(\"kleur\")\n\nconst getServerPID = (cfg) => {\n if (cfg.global.server_pid) {\n console.log(kleur.gre"
},
{
"path": "src/server/installServer.js",
"chars": 1308,
"preview": "const path = require(\"path\")\nconst fs = require(\"fs\")\nconst kleur = require(\"kleur\")\nconst rimraf = require(\"rimraf\")\nco"
},
{
"path": "src/server/showServerGUI.js",
"chars": 735,
"preview": "const { spawn } = require(\"child_process\")\nconst path = require(\"path\")\nconst kleur = require(\"kleur\")\n\nconst showServer"
},
{
"path": "src/server/startServer.js",
"chars": 2203,
"preview": "const { spawn } = require(\"child_process\")\nconst path = require(\"path\")\nconst kleur = require(\"kleur\")\nconst portfinder "
},
{
"path": "src/server/stopServer.js",
"chars": 929,
"preview": "const { exec } = require(\"child_process\")\nconst kleur = require(\"kleur\")\nconst { platform } = require(\"os\")\nconst config"
},
{
"path": "src/text-manipulation/replace.js",
"chars": 336,
"preview": "const replace = (text, change, offset, length) => {\n const before = text.slice(0, offset)\n const mistake = text.slice("
},
{
"path": "src/text-manipulation/replace.test.js",
"chars": 748,
"preview": "const replace = require(\"./replace\")\n\ndescribe(\"Replace\", () => {\n it(\"changes specified part of the text with provides"
},
{
"path": "src/text-manipulation/replaceAll.d.ts",
"chars": 304,
"preview": "export = replaceAll\n/**\n * Modifies provided text with specified transformations.\n *\n * @param text base text\n * @param "
},
{
"path": "src/text-manipulation/replaceAll.js",
"chars": 471,
"preview": "const replace = require(\"./replace\")\n\n/**\n * Modifies provided text with specified transformations.\n *\n * @param text ba"
},
{
"path": "src/text-manipulation/replaceAll.test.js",
"chars": 539,
"preview": "const replaceAll = require(\"./replaceAll\")\n\ndescribe(\"Replace all\", () => {\n it.only(\"changes all places according to p"
},
{
"path": "src/utils/appLocation.js",
"chars": 378,
"preview": "const path = require(\"path\")\nconst fs = require(\"fs\")\n\nconst binDir = path.dirname(process.execPath)\nconst scriptDir = _"
},
{
"path": "src/utils/downloadFile.js",
"chars": 1004,
"preview": "const fs = require(\"fs\")\nconst progressStream = require(\"progress-stream\")\nconst cliProgress = require(\"cli-progress\")\n\n"
},
{
"path": "src/utils/equal.js",
"chars": 174,
"preview": "const { deepEqual } = require(\"assert\")\n\nconst equal = (a, b) => {\n try {\n deepEqual(a, b)\n return true\n } catch"
},
{
"path": "src/utils/findUpSync.js",
"chars": 4809,
"preview": "var __create = Object.create\nvar __defProp = Object.defineProperty\nvar __getOwnPropDesc = Object.getOwnPropertyDescripto"
},
{
"path": "src/utils/prepareMarkdown.js",
"chars": 171978,
"preview": "// @ts-nocheck\n/**\n * Generated via:\n * esbuild lib/prepareMarkdown.mjs --bundle --outfile=src/utils/prepareMarkdown.js "
},
{
"path": "src/utils/stripStyles.js",
"chars": 348,
"preview": "/**\n * Strips terminal styles from string\n * (colors, weight etc.)\n *\n * @param {string} text any string\n */\nconst strip"
},
{
"path": "src/utils/stripStyles.test.js",
"chars": 751,
"preview": "const kleur = require(\"kleur\")\nconst stripStyles = require(\"./stripStyles\")\n\ndescribe(\"Strips styles from console string"
},
{
"path": "src/utils/unzipFile.js",
"chars": 261,
"preview": "const decompress = require(\"decompress\")\nconst decompressUnzip = require(\"decompress-unzip\")\n\nconst unzipFile = (pathToF"
},
{
"path": "src/validators/languages.js",
"chars": 290,
"preview": "const languages = require(\"../../data/languages.json\")\n\nconst languageOptions = [\n \"config\",\n \"auto\",\n ...languages.m"
},
{
"path": "src/validators/rules.js",
"chars": 226,
"preview": "const rules = require(\"../../data/rules.json\")\n\nconst ruleOptions = rules.map((rule) => rule.id.toLowerCase())\n\nconst is"
},
{
"path": "tsconfig.json",
"chars": 167,
"preview": "{\n \"include\": [\"src/index.js\"],\n \"compilerOptions\": {\n \"allowJs\": true,\n \"declaration\": true,\n \"emitDeclarati"
}
]
About this extraction
This page contains the full source code of the caderek/gramma GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 93 files (510.2 KB), approximately 131.1k tokens, and a symbol index with 199 extracted functions, classes, methods, constants, and types.
Extracted by GitExtract. Built by Nikandr Surkov.
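The file manifest above is a flat JSON array of `{ path, chars, preview }` objects, so it is easy to post-process. As a minimal sketch in JavaScript (the repository's own language), here is a hypothetical `summarize` helper that aggregates the listing; the function name and the inline sample data are illustrative, not part of GitExtract's output format beyond the three fields shown:

```javascript
// Hypothetical helper: summarize a GitExtract-style manifest,
// i.e. an array of { path, chars, preview } entries as listed above.
const summarize = (manifest) => {
  // Total character count across all listed files.
  const totalChars = manifest.reduce((sum, f) => sum + f.chars, 0)

  // Character count grouped by top-level directory ("." for root files).
  const byDir = {}
  for (const { path, chars } of manifest) {
    const dir = path.includes("/") ? path.slice(0, path.indexOf("/")) : "."
    byDir[dir] = (byDir[dir] || 0) + chars
  }

  return { files: manifest.length, totalChars, byDir }
}

// Usage with a small sample taken from the listing:
const sample = [
  { path: "src/utils/equal.js", chars: 174, preview: "..." },
  { path: "src/validators/rules.js", chars: 226, preview: "..." },
  { path: "tsconfig.json", chars: 167, preview: "..." },
]
console.log(summarize(sample))
// → { files: 3, totalChars: 567, byDir: { src: 400, ".": 167 } }
```

In practice the array would come from `JSON.parse` over the extracted text rather than an inline literal; the grouping step is just one example of slicing the manifest before feeding selected files to a tool.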