Repository: tamago324/nlsp-settings.nvim
Branch: main
Commit: bf56ac18a62c
Files: 102
Total size: 977.0 KB
Directory structure:
gitextract_7wms9ce1/
├── .github/
│ └── workflows/
│ ├── gen_schemas.yml
│ └── stylua_check.yml
├── .gitignore
├── .luacheckrc
├── LICENSE
├── Makefile
├── README.md
├── doc/
│ └── nlspsettings.txt
├── examples/
│ ├── rust_analyzer.json
│ └── sumneko_lua.json
├── lua/
│ ├── nlspsettings/
│ │ ├── command/
│ │ │ ├── completion.lua
│ │ │ ├── init.lua
│ │ │ └── parser.lua
│ │ ├── config.lua
│ │ ├── deprecated.lua
│ │ ├── loaders/
│ │ │ ├── json.lua
│ │ │ └── yaml/
│ │ │ ├── init.lua
│ │ │ └── tinyyaml.lua
│ │ ├── log.lua
│ │ ├── schemas.lua
│ │ └── utils.lua
│ └── nlspsettings.lua
├── plugin/
│ └── nlspsetting.vim
├── schemas/
│ ├── README.md
│ └── _generated/
│ ├── .gitkeep
│ ├── als.json
│ ├── asm_lsp.json
│ ├── ast_grep.json
│ ├── astro.json
│ ├── awkls.json
│ ├── bashls.json
│ ├── beancount.json
│ ├── bicep.json
│ ├── bright_script.json
│ ├── clangd.json
│ ├── codeqlls.json
│ ├── cssls.json
│ ├── dartls.json
│ ├── denols.json
│ ├── elixirls.json
│ ├── elmls.json
│ ├── eslint.json
│ ├── flow.json
│ ├── fortls.json
│ ├── fsautocomplete.json
│ ├── gopls.json
│ ├── grammarly.json
│ ├── haxe_language_server.json
│ ├── hhvm.json
│ ├── hie.json
│ ├── html.json
│ ├── intelephense.json
│ ├── java_language_server.json
│ ├── jdtls.json
│ ├── jsonls.json
│ ├── julials.json
│ ├── kotlin_language_server.json
│ ├── leanls.json
│ ├── ltex.json
│ ├── lua_ls.json
│ ├── luau_lsp.json
│ ├── nickel_ls.json
│ ├── nimls.json
│ ├── omnisharp.json
│ ├── perlls.json
│ ├── perlnavigator.json
│ ├── perlpls.json
│ ├── powershell_es.json
│ ├── psalm.json
│ ├── puppet.json
│ ├── purescriptls.json
│ ├── pyls.json
│ ├── pylsp.json
│ ├── pyright.json
│ ├── r_language_server.json
│ ├── rescriptls.json
│ ├── rls.json
│ ├── rome.json
│ ├── rust_analyzer.json
│ ├── solargraph.json
│ ├── solidity_ls.json
│ ├── sorbet.json
│ ├── sourcekit.json
│ ├── spectral.json
│ ├── stylelint_lsp.json
│ ├── sumneko_lua.json
│ ├── svelte.json
│ ├── svlangserver.json
│ ├── tailwindcss.json
│ ├── terraformls.json
│ ├── tsserver.json
│ ├── volar.json
│ ├── vtsls.json
│ ├── vuels.json
│ ├── wgls_analyzer.json
│ ├── yamlls.json
│ ├── zeta_note.json
│ └── zls.json
├── scripts/
│ ├── gen_schemas.lua
│ ├── gen_schemas.sh
│ └── gen_schemas_readme.lua
└── stylua.toml
================================================
FILE CONTENTS
================================================
================================================
FILE: .github/workflows/gen_schemas.yml
================================================
name: gen_schemas
on:
# Allow manual triggering
workflow_dispatch:
schedule:
# Run once a day
- cron: '0 12 * * *'
jobs:
gen_schemas:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup neovim nightly
uses: rhysd/action-setup-vim@v1
with:
neovim: true
version: nightly
- name: Run docgen
run: |
git clone --depth 1 https://github.com/neovim/nvim-lspconfig
scripts/gen_schemas.sh
- name: Commit changes
env:
COMMIT_MSG: |
[gen_schemas] Update schemas
run: |
git config user.email "actions@github"
git config user.name "Github Actions"
git add schemas
# Only commit and push if we have changes
git diff --quiet && git diff --staged --quiet || (git commit -m "${COMMIT_MSG}"; git push origin HEAD:${GITHUB_REF})
================================================
FILE: .github/workflows/stylua_check.yml
================================================
name: lint
on:
# Allow manual triggering
workflow_dispatch:
pull_request:
branches:
- "**"
push:
branches:
- "**"
jobs:
stylua:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: JohnnyMorganz/stylua-action@v1
with:
token: ${{ secrets.GITHUB_TOKEN }}
# CLI arguments
args: --check --config-path stylua.toml --glob 'lua/**/*.lua' --glob '!lua/**/tinyyaml.lua' -- lua
# Specify `version` to pin a specific version; otherwise the action always uses the latest version
================================================
FILE: .gitignore
================================================
tags
nvim-lspconfig/
================================================
FILE: .luacheckrc
================================================
std = luajit
codes = true
globals = {
"vim"
}
================================================
FILE: LICENSE
================================================
The MIT License (MIT)
Copyright (c) 2021 tamago324
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: Makefile
================================================
.PHONY: fmt
fmt:
stylua --config-path stylua.toml --glob 'lua/**/*.lua' --glob '!lua/**/tinyyaml.lua' -- lua
================================================
FILE: README.md
================================================
# nlsp-settings.nvim
[](https://github.com/tamago324/nlsp-settings.nvim/actions/workflows/gen_schemas.yml)
A plugin to configure Neovim LSP using json/yaml files like `coc-settings.json`.
<img src="https://github.com/tamago324/images/blob/master/nlsp-settings.nvim/sumneko_lua_completion.gif" alt="sumneko_lua_completion.gif" width="600" style=""/>
<sub>Using `nlsp-settings.nvim` with [lspconfig](https://github.com/neovim/nvim-lspconfig/), [jsonls](https://github.com/vscode-langservers/vscode-json-languageserver/), [nvim-compe](https://github.com/hrsh7th/nvim-compe/), and [vim-vsnip](https://github.com/hrsh7th/vim-vsnip/)</sub>
Using `nlsp-settings.nvim`, you can write some of the `settings` to be passed to `lspconfig.xxx.setup()` in a json file.
You can also use it with [jsonls](https://github.com/vscode-langservers/vscode-json-languageserver) to complete the configuration values.
## Requirements
* Neovim
* [neovim/nvim-lspconfig](https://github.com/neovim/nvim-lspconfig/)
## Installation
```vim
Plug 'neovim/nvim-lspconfig'
Plug 'tamago324/nlsp-settings.nvim'
" Recommended
Plug 'williamboman/mason.nvim'
Plug 'williamboman/mason-lspconfig.nvim'
" Optional
Plug 'rcarriga/nvim-notify'
```
## Getting Started
### Step1. Install jsonls with mason.nvim
```
:MasonInstall json-lsp
```
### Step2. Setup LSP servers
Example: Completion using omnifunc
```lua
local mason = require("mason")
local mason_lspconfig = require("mason-lspconfig")
local lspconfig = require("lspconfig")
local nlspsettings = require("nlspsettings")
nlspsettings.setup({
config_home = vim.fn.stdpath('config') .. '/nlsp-settings',
local_settings_dir = ".nlsp-settings",
local_settings_root_markers_fallback = { '.git' },
append_default_schemas = true,
loader = 'json'
})
function on_attach(client, bufnr)
local function buf_set_option(...) vim.api.nvim_buf_set_option(bufnr, ...) end
buf_set_option('omnifunc', 'v:lua.vim.lsp.omnifunc')
end
local global_capabilities = vim.lsp.protocol.make_client_capabilities()
global_capabilities.textDocument.completion.completionItem.snippetSupport = true
lspconfig.util.default_config = vim.tbl_extend("force", lspconfig.util.default_config, {
capabilities = global_capabilities,
})
mason.setup()
mason_lspconfig.setup()
mason_lspconfig.setup_handlers({
function (server_name)
lspconfig[server_name].setup({
on_attach = on_attach
})
end
})
```
TODO: For other settings, see the doc.
### Step3. Write settings
Execute `:LspSettings sumneko_lua`.
`sumneko_lua.json` will be created under the directory set in `config_home`. Type `<C-x><C-o>` in the opened file to confirm that jsonls completion is working.
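For reference, settings files use flattened dotted keys, as shown in `examples/sumneko_lua.json` in this repository. A minimal illustration (keys taken from that example):

```json
{
  "Lua.runtime.version": "LuaJIT",
  "Lua.diagnostics.globals": ["vim"]
}
```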
## Usage
### LspSettings command
* `:LspSettings [server_name]`: Open the global settings file for the specified `{server_name}`.
* `:LspSettings buffer`: Open the global settings file that matches the current buffer.
* `:LspSettings local [server_name]`: Open the local settings file of the specified `{server_name}` corresponding to the cwd.
* `:LspSettings local buffer` or `:LspSettings buffer local`: Open the local settings file of the server corresponding to the current buffer.
* `:LspSettings update [server_name]`: Update the setting values for the specified `{server_name}`.
For a list of language servers that have JSON Schema, see [here](schemas/README.md).
### Settings files for each project
You can create a settings file for each project with the following command.
* `:LspSettings local [server_name]`
* `:LspSettings update [server_name]`
The settings file will be created in `{project_path}/.nlsp-settings/{server_name}.json`.
### Combine with Lua configuration
It is still possible to write `settings` in Lua.
However, if the same key exists in both, the value in the JSON file takes precedence.
Example: Write sumneko_lua settings in Lua
```lua
local mason = require("mason")
local mason_lspconfig = require("mason-lspconfig")
local lspconfig = require("lspconfig")
local server_opts = {}
-- lua
server_opts.sumneko_lua = {
settings = {
Lua = {
workspace = {
library = {
[vim.fn.expand("$VIMRUNTIME/lua")] = true,
[vim.fn.stdpath("config") .. '/lua'] = true,
}
}
}
}
}
local common_setup_opts = {
-- on_attach = on_attach,
-- capabilities = require('cmp_nvim_lsp').update_capabilities(
-- vim.lsp.protocol.make_client_capabilities()
-- )
}
mason.setup()
mason_lspconfig.setup()
mason_lspconfig.setup_handlers({
function (server_name)
local opts = vim.deepcopy(common_setup_opts)
if server_opts[server_name] then
opts = vim.tbl_deep_extend('force', opts, server_opts[server_name])
end
lspconfig[server_name].setup(opts)
end
})
```
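The precedence rule above can be pictured as a deep merge where JSON-side values win on key collisions. A conceptual sketch only (not the plugin's actual merge code; `vim.tbl_deep_extend` is the Neovim API commonly used for this kind of merge):

```lua
-- Conceptual illustration: settings from the JSON file override settings
-- passed to setup() when the same key exists in both tables.
local lua_side  = { Lua = { diagnostics = { enable = false }, workspace = { maxPreload = 1000 } } }
local json_side = { Lua = { diagnostics = { enable = true } } }

-- 'force' means values from the last table (the JSON side) win on conflict.
local merged = vim.tbl_deep_extend('force', lua_side, json_side)
-- merged.Lua.diagnostics.enable   == true  (JSON side wins)
-- merged.Lua.workspace.maxPreload == 1000  (Lua-only key is kept)
```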
## Contributing
* All contributions are welcome.
## License
MIT
================================================
FILE: doc/nlspsettings.txt
================================================
*nlsp-settings.nvim*
==============================================================================
INTRODUCTION *nlsp-settings-introduction*
A plugin to configure Neovim LSP using json/yaml files like `coc-settings.json`.
==============================================================================
REQUIREMENTS *nlsp-settings-requirements*
* Neovim
* neovim/nvim-lspconfig
==============================================================================
INTERFACE *nlsp-settings-interface*
------------------------------------------------------------------------------
Lua module: nlspsettings *nlspsettings*
setup({opts}) *nlspsettings.setup()*
Set the default `on_new_config` to read the settings from a
JSON or YAML file.
Parameters: ~
{opts} (optional, table)
Fields: ~
{config_home} (optional, string)
The directory containing the settings files.
Default: `'~/.config/nvim/nlsp-settings'`
{local_settings_dir} (optional, string)
The directory containing the local settings files.
Default: `'.nlsp-settings'`
{local_settings_root_markers_fallback} (optional, table)
A list of files and directories to use when looking for the
root directory when opening a file with `:LspSettings local` if
not found by `nvim-lspconfig`.
Default: `{ '.git' }`
{loader} (optional, `"json"` | `"yaml"`)
Specify the loader to load the configuration file.
You will also need to install a language server.
`"json"`:
Language server: `jsonls`
Settings file name: `{server_name}.json`
`"yaml"`:
Language server: `yamlls`
Settings file name: `{server_name}.yml`
Default: `"json"`
{ignored_servers} (optional, table)
List of server names that should not appear in the server
choices when running `:LspSettings buffer` and
`:LspSettings local buffer` if more than one server is
connected to the current buffer.
Default: `{}`
{append_default_schemas} (optional, boolean)
Add defaults to the language server schemas in the |loader|.
Default: `false`
{open_strictly} (optional, boolean)
Determines whether a server name given on the command line must
have a config in `nvim-lspconfig`.
Default: `false`
{nvim_notify} (optional, table)
Configuration for nvim-notify integration.
Config table:
• `enable` : Enable nvim-notify integration.
• `timeout` : Time to show notification in milliseconds.
Default: `{ enable = false, timeout = 5000 }`
------------------------------------------------------------------------------
Lua module: nlspsettings.json *nlspsettings.json*
get_default_schemas() *nlspsettings.json.get_default_schemas()*
Return a list of default schemas
Return: ~
table
------------------------------------------------------------------------------
Lua module: nlspsettings.yaml *nlspsettings.yaml*
get_default_schemas() *nlspsettings.yaml.get_default_schemas()*
Return a list of default schemas
Return: ~
table
------------------------------------------------------------------------------
COMMANDS *:LspSettings* *nlspsettings-commands*
:LspSettings {server_name}
Open the settings file for the specified {server_name}.
:LspSettings buffer
Open a settings file that matches the current buffer.
:LspSettings local {server_name}
Open the local settings file of the specified {server_name}
corresponding to the cwd.
NOTE: Local version of `:LspSettings`
:LspSettings buffer local
:LspSettings local buffer
Open the local settings file of the server corresponding to the
current buffer.
NOTE: Local version of `:LspSettings buffer`
:LspSettings update {server_name}
Update the setting values for the specified {server_name}.
==============================================================================
vim:tw=78:sw=4:sts=4:ts=4:ft=help:norl:et
================================================
FILE: examples/rust_analyzer.json
================================================
{
"rust-analyzer.cargo.allFeatures": true,
"rust-analyzer.checkOnSave.command": "clippy"
}
================================================
FILE: examples/sumneko_lua.json
================================================
{
"Lua.runtime.version": "LuaJIT",
"Lua.diagnostics.enable": true,
"Lua.diagnostics.globals": [
"vim", "describe", "it", "before_each", "after_each"
],
"Lua.diagnostics.disable": [
"unused-local", "unused-vararg", "lowercase-global", "undefined-field"
],
"Lua.completion.callSnippet": "Both",
"Lua.completion.keywordSnippet": "Both"
}
================================================
FILE: lua/nlspsettings/command/completion.lua
================================================
local schemas = require 'nlspsettings.schemas'
local parser = require 'nlspsettings.command.parser'
--- Get the flags to implement in the completion
---@param cmdline string
---@return string[], boolean
local get_flags = function(cmdline)
local result = {}
local server = true
local matched = false
local unique
for flag, data in pairs(parser.Flags) do
if data.unique then
unique = flag
elseif cmdline:match(flag) then
if not data.server then
server = false
end
matched = true
else
table.insert(result, flag)
end
end
if matched then
return result, server
elseif cmdline:match(unique) then
return {}, server
end
return vim.tbl_keys(parser.Flags), server
end
--- Get the supported servers
---@param cmdline string
---@return string[], boolean
local get_servers = function(cmdline)
local result = {}
for _, server in ipairs(schemas.get_langserver_names()) do
if cmdline:match(server) then
return {}, true
end
table.insert(result, server)
end
return result, false
end
local M = {}
--- Returns a context-aware completion list
---@param _ string
---@param cmdline string
---@return string
M.complete = function(_, cmdline)
local items = {}
local flags, requires_server = get_flags(cmdline)
local servers, server_found = get_servers(cmdline)
if server_found then
return ''
end
if requires_server then
vim.list_extend(items, servers)
end
vim.list_extend(items, flags)
return table.concat(items, '\n')
end
return M
================================================
FILE: lua/nlspsettings/command/init.lua
================================================
local config = require 'nlspsettings.config'
local parser = require 'nlspsettings.command.parser'
local nlspsettings = require 'nlspsettings'
local lspconfig = require 'lspconfig'
local log = require 'nlspsettings.log'
local path = lspconfig.util.path
local uv = vim.loop
local M = {}
--- Get the file path of the given buffer
---@param bufnr number?
---@return string
local get_buffer_path = function(bufnr)
local expr = (bufnr ~= nil and '#' .. bufnr) or '%'
return vim.fn.expand(expr .. ':p')
end
--- Calls the callback with the name of the server connected to the given buffer.
---@param bufnr number?
---@param callback function
local with_server_name = function(bufnr, callback)
vim.validate {
bufnr = { bufnr, 'n', true },
}
local server_names = {}
local clients = vim.lsp.buf_get_clients(bufnr)
for _, _client in pairs(clients) do
if
not vim.tbl_contains(server_names, _client.name)
and not vim.tbl_contains(config.get().ignored_servers, _client.name)
then
table.insert(server_names, _client.name)
end
end
if not next(server_names) then
return
end
if #server_names > 1 then
vim.ui.select(server_names, { prompt = 'Select server:' }, callback)
else
callback(server_names[1])
end
end
--- open config file
---@param dir string
---@param server_name string
local open = function(dir, server_name)
vim.validate {
server_name = { server_name, 's' },
dir = { dir, 's' },
}
if not path.is_dir(dir) then
local prompt = ('Config directory "%s" does not exist. Create it?'):format(path.sanitize(dir))
if vim.fn.confirm(prompt, '&Yes\n&No', 1) ~= 1 then
return
end
uv.fs_mkdir(dir, tonumber('700', 8))
end
local loader = require('nlspsettings.loaders.' .. config.get().loader)
local filepath = path.join(dir, server_name .. '.' .. loader.file_ext)
-- If the file does not exist, LSP will not be able to complete it, so create it
if not path.is_file(filepath) then
local fd = uv.fs_open(filepath, 'w', tonumber('644', 8))
if not fd then
log.error('Could not create file: ' .. filepath)
return
end
uv.fs_close(fd)
end
local cmd = (vim.api.nvim_buf_get_option(0, 'modified') and 'split') or 'edit'
vim.api.nvim_command(cmd .. ' ' .. filepath)
end
--- Open a settings file that matches the current buffer
M.open_buf_config = function()
with_server_name(nil, function(server_name)
if not server_name then
return
end
M.open_config(server_name)
end)
end
---Open the settings file for the specified server.
---@param server_name string
M.open_config = function(server_name)
open(config.get().config_home, server_name)
end
---Open the settings file for the specified server.
---@param server_name string
M.open_local_config = function(server_name)
local start_path = get_buffer_path()
if start_path == '' then
start_path = vim.fn.getcwd()
end
local conf = config.get()
local root_dir
if lspconfig[server_name] then
root_dir = lspconfig[server_name].get_root_dir(path.sanitize(start_path))
end
if not root_dir then
local markers = conf.local_settings_root_markers_fallback
root_dir = lspconfig.util.root_pattern(markers)(path.sanitize(start_path))
end
if root_dir then
open(path.join(root_dir:gsub('/$', ''), conf.local_settings_dir), server_name)
else
log.error(('[%s] Failed to get root_dir.'):format(server_name))
end
end
--- Open a settings file that matches the current buffer
M.open_local_buf_config = function()
with_server_name(nil, function(server_name)
if not server_name then
return
end
local clients = vim.tbl_filter(function(server)
if server.name == server_name then
return server
end
end, vim.lsp.buf_get_clients())
local client = unpack(clients)
if client then
open(path.join(client.config.root_dir, config.get().local_settings_dir), server_name)
else
log.error(('[%s] Failed to get root_dir.'):format(server_name))
end
end)
end
---Update the setting values.
---@param server_name string
M.update_settings = function(server_name)
vim.api.nvim_command 'redraw'
if nlspsettings.update_settings(server_name) then
log.error(('[%s] Failed to update the settings.'):format(server_name))
else
log.info(('[%s] Successfully updated the settings.'):format(server_name))
end
end
---What to do when BufWritePost fires
---@param file string
M._BufWritePost = function(file)
local server_name = path.sanitize(file):match '([^/]+)%.%w+$'
M.update_settings(server_name)
end
---Executes command from action
---@param result nlspsettings.command.parser.result
---@param actions table
M._execute = function(result, actions)
if actions[result.action] then
actions[result.action](result.server)
end
end
---Parses a command and executes it
---@vararg string
M._command = function(...)
local result = parser.parse { ... }
M._execute(result, {
[parser.Actions.OPEN] = M.open_config,
[parser.Actions.OPEN_BUFFER] = M.open_buf_config,
[parser.Actions.OPEN_LOCAL] = M.open_local_config,
[parser.Actions.OPEN_LOCAL_BUFFER] = M.open_local_buf_config,
[parser.Actions.UPDATE] = M.update_settings,
})
end
return M
================================================
FILE: lua/nlspsettings/command/parser.lua
================================================
local schemas = require 'nlspsettings.schemas'
local config = require 'nlspsettings.config'
---@class nlspsettings.command.parser.flag
---@field server boolean
---@field unique boolean
---@class nlspsettings.command.parser.FLAGS
---@field buffer nlspsettings.command.parser.flag
---@field local nlspsettings.command.parser.flag
---@field update nlspsettings.command.parser.flag
local FLAGS = {
['buffer'] = {
server = false,
unique = false,
},
['local'] = {
server = true,
unique = false,
},
['update'] = {
server = true,
unique = true,
},
}
---@class nlspsettings.command.parser.ACTIONS
---@field OPEN string
---@field OPEN_BUFFER string
---@field OPEN_LOCAL string
---@field OPEN_LOCAL_BUFFER string
---@field UPDATE string
local ACTIONS = {
OPEN = 'open',
OPEN_BUFFER = 'open_buffer',
OPEN_LOCAL = 'open_local',
OPEN_LOCAL_BUFFER = 'open_local_buffer',
UPDATE = 'update',
}
---@class nlspsettings.command.parser.ERRORS
---@field NOT_SERVER string
---@field NOT_FLAG string
---@field FLAG_REPEATED string
---@field NOT_ALLOWED string
---@field SERVER_REQUIRED string
---@field EMPTY string
local ERRORS = {
NOT_SERVER = '`%s` is not a valid server',
NOT_FLAG = '`%s` is not a valid flag',
FLAG_REPEATED = '`%s` is repeated',
NOT_ALLOWED = '`%s` does not allow `%s` flag',
SERVER_REQUIRED = '`%s` requires a server',
EMPTY = 'argument #%i is empty',
}
--- Creates a parser from list
---@param list table
---@param arg string
---@param err string
local from_list = function(list, arg, err)
for _, item in ipairs(list) do
if item == arg then
return arg
end
end
error(err:format(arg))
end
--- Parses a flag
---@param arg string
---@return string
local parse_flag = function(arg)
return from_list(vim.tbl_keys(FLAGS), arg, ERRORS.NOT_FLAG)
end
--- Parses a server
---@param arg string
---@return string
local parse_server = function(arg)
if config.get().open_strictly then
return from_list(schemas.get_langserver_names(), arg, ERRORS.NOT_SERVER)
end
return arg
end
local M = {
Flags = FLAGS,
Actions = ACTIONS,
}
---@class nlspsettings.command.parser.result
---@field action string
---@field server string?
--- Parses a table or a string
---@param args table|string
---@return nlspsettings.command.parser.result?
M.parse = function(args)
local flags = {}
local context = {
requires_server = true,
unique_flag = nil,
last_flag = nil,
}
if type(args) == 'string' then
args = vim.split(args, ' ')
end
local function process_flag(flag)
if flags[flag] then
error(ERRORS.FLAG_REPEATED:format(flag))
end
if context.unique_flag then
error(ERRORS.NOT_ALLOWED:format(context.unique_flag, flag))
elseif FLAGS[flag].unique and context.last_flag then
error(ERRORS.NOT_ALLOWED:format(context.last_flag, flag))
end
if not FLAGS[flag].server then
context.requires_server = nil
end
flags[flag] = true
context.unique_flag = FLAGS[flag].unique and flag
context.last_flag = flag
end
if vim.tbl_isempty(args) then
error(ERRORS.EMPTY:format(1))
end
for index, arg in ipairs(args) do
if arg == '' then
error(ERRORS.EMPTY:format(index))
end
if index == #args then
local success, flag = pcall(parse_flag, arg)
if success then
process_flag(flag)
elseif not context.requires_server then
error(flag)
end
if success and context.requires_server and index == 1 then
error(ERRORS.SERVER_REQUIRED:format(flag))
end
break
end
process_flag(parse_flag(arg))
end
local action = ACTIONS.OPEN
if flags['local'] and flags['buffer'] then
action = ACTIONS.OPEN_LOCAL_BUFFER
elseif flags['local'] then
action = ACTIONS.OPEN_LOCAL
elseif flags['buffer'] then
action = ACTIONS.OPEN_BUFFER
elseif flags['update'] then
action = ACTIONS.UPDATE
end
return {
action = action,
server = context.requires_server and parse_server(args[#args]),
}
end
return M
================================================
FILE: lua/nlspsettings/config.lua
================================================
local defaults_values = {
config_home = vim.fn.stdpath 'config' .. '/nlsp-settings',
local_settings_dir = '.nlsp-settings',
local_settings_root_markers_fallback = {
'.git',
'.nlsp-settings',
},
ignored_servers = {},
append_default_schemas = false,
open_strictly = false,
nvim_notify = {
enable = false,
timeout = 5000,
},
loader = 'json',
}
---@class nlspsettings.config
---@field values nlspsettings.config.values
local config = {}
---@class nlspsettings.config.values
---@field config_home string
---@field local_settings_dir string
---@field local_settings_root_markers_fallback string[]
---@field ignored_servers string[]
---@field append_default_schemas boolean
---@field open_strictly boolean
---@field nvim_notify nlspsettings.config.values.nvim_notify
---@field loader '"json"' | '"yaml"'
config.values = vim.deepcopy(defaults_values)
---@class nlspsettings.config.values.nvim_notify
---@field enable boolean
---@field timeout number
config.set_default_values = function(opts)
config.values = vim.tbl_deep_extend('force', defaults_values, opts or {})
-- For an empty table
if not next(config.values.local_settings_root_markers_fallback) then
config.values.local_settings_root_markers_fallback = defaults_values.local_settings_root_markers_fallback
end
end
---
---@return nlspsettings.config.values
config.get = function()
return config.values
end
return config
================================================
FILE: lua/nlspsettings/deprecated.lua
================================================
local log = require 'nlspsettings.log'
local M = {}
M.open = function()
log.warn ':NlspConfig has been removed in favor of :LspSettings {server_name}'
end
M.open_local = function()
log.warn ':NlspLocalConfig has been removed in favor of :LspSettings local {server_name}'
end
M.open_buffer = function()
log.warn ':NlspBufConfig has been removed in favor of :LspSettings buffer'
end
M.open_local_buffer = function()
log.warn ':NlspLocalBufConfig has been removed in favor of :LspSettings buffer local'
end
M.update = function()
log.warn ':NlspUpdateSettings has been removed in favor of :LspSettings update {server_name}'
end
return M
================================================
FILE: lua/nlspsettings/loaders/json.lua
================================================
local schemas = require 'nlspsettings.schemas'
---@class nlspsettings.loaders.json
---@field name string loader name
---@field server_name string LSP server name
---@field settings_key string settings key
---@field file_ext string file extensions
local json = {
name = 'json',
server_name = 'jsonls',
settings_key = 'json',
file_ext = 'json',
}
--- Decodes from JSON.
---
---@param data string Data to decode
---@return table|nil json_obj Decoded JSON object, or nil and an error message
local json_decode = function(data)
local ok, result = pcall(vim.fn.json_decode, data)
if ok then
return result
else
return nil, result
end
end
---
---@param path string
---@return table|nil
json.load = function(path)
return json_decode(table.concat(vim.fn.readfile(path), '\n'))
end
---@class nlspsettings.loaders.json.jsonschema
---@field fileMatch string|string[]
---@field url string
--- Return a list of default schemas
---@return nlspsettings.loaders.json.jsonschema[]
json.get_default_schemas = function()
local res = {}
for k, v in pairs(schemas.get_base_schemas_data()) do
table.insert(res, {
fileMatch = { k .. '.json' },
url = v,
})
end
return res
end
return json
================================================
FILE: lua/nlspsettings/loaders/yaml/init.lua
================================================
local schemas = require 'nlspsettings.schemas'
local tinyyaml = require 'nlspsettings.loaders.yaml.tinyyaml'
---@class nlspsettings.loaders.yaml
---@field name string loader name
---@field server_name string LSP server name
---@field settings_key string settings key
---@field file_ext string file extensions
local yaml = {
name = 'yaml',
server_name = 'yamlls',
settings_key = 'yaml',
file_ext = 'yml',
}
---
---@param path string
---@return table|nil
yaml.load = function(path)
local lines = vim.fn.readfile(path)
-- see https://github.com/api7/lua-tinyyaml/pull/9
if vim.tbl_isempty(lines) or (#lines == 1 and lines[1] == '') then
return {}
end
local ok, result = pcall(tinyyaml.parse, table.concat(lines, '\n'))
if ok then
return result
else
return nil, result
end
end
--- Return a list of default schemas
---@return table<string, string>
yaml.get_default_schemas = function()
local res = {}
for k, v in pairs(schemas.get_base_schemas_data()) do
-- url: globpattern
res[v] = k .. '.yml'
end
return res
end
return yaml
================================================
FILE: lua/nlspsettings/loaders/yaml/tinyyaml.lua
================================================
-- original source code: https://github.com/api7/lua-tinyyaml/blob/1cefdf0f3a4b47b2804b6e14671b6fd073d15e66/tinyyaml.lua
-- license file: https://github.com/api7/lua-tinyyaml/blob/1cefdf0f3a4b47b2804b6e14671b6fd073d15e66/LICENSE
-- MIT License
--
-- Copyright (c) 2017 peposso
--
-- Permission is hereby granted, free of charge, to any person obtaining a copy
-- of this software and associated documentation files (the "Software"), to deal
-- in the Software without restriction, including without limitation the rights
-- to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
-- copies of the Software, and to permit persons to whom the Software is
-- furnished to do so, subject to the following conditions:
--
-- The above copyright notice and this permission notice shall be included in all
-- copies or substantial portions of the Software.
--
-- THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
-- IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
-- FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
-- AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
-- LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
-- OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
-- SOFTWARE.
--
-------------------------------------------------------------------------------
-- tinyyaml - YAML subset parser
-------------------------------------------------------------------------------
local table = table
local string = string
local schar = string.char
local ssub, gsub = string.sub, string.gsub
local sfind, smatch = string.find, string.match
local tinsert, tconcat, tremove = table.insert, table.concat, table.remove
local setmetatable = setmetatable
local pairs = pairs
local rawget = rawget
local type = type
local tonumber = tonumber
local math = math
local getmetatable = getmetatable
local error = error
local UNESCAPES = {
['0'] = "\x00", z = "\x00", N = "\x85",
a = "\x07", b = "\x08", t = "\x09",
n = "\x0a", v = "\x0b", f = "\x0c",
r = "\x0d", e = "\x1b", ['\\'] = '\\',
};
-------------------------------------------------------------------------------
-- utils
local function select(list, pred)
local selected = {}
for i = 1, #list do
local v = list[i]
if v and pred(v, i) then
tinsert(selected, v)
end
end
return selected
end
local function startswith(haystack, needle)
return ssub(haystack, 1, #needle) == needle
end
local function ltrim(str)
return smatch(str, "^%s*(.-)$")
end
local function rtrim(str)
return smatch(str, "^(.-)%s*$")
end
local function trim(str)
return smatch(str, "^%s*(.-)%s*$")
end
-------------------------------------------------------------------------------
-- Implementation.
--
local class = {__meta={}}
function class.__meta.__call(cls, ...)
local self = setmetatable({}, cls)
if cls.__init then
cls.__init(self, ...)
end
return self
end
function class.def(base, typ, cls)
base = base or class
local mt = {__metatable=base, __index=base}
for k, v in pairs(base.__meta) do mt[k] = v end
cls = setmetatable(cls or {}, mt)
cls.__index = cls
cls.__metatable = cls
cls.__type = typ
cls.__meta = mt
return cls
end
local types = {
null = class:def('null'),
map = class:def('map'),
omap = class:def('omap'),
pairs = class:def('pairs'),
set = class:def('set'),
seq = class:def('seq'),
timestamp = class:def('timestamp'),
}
local Null = types.null
function Null.__tostring() return 'yaml.null' end
function Null.isnull(v)
if v == nil then return true end
if type(v) == 'table' and getmetatable(v) == Null then return true end
return false
end
local null = Null()
function types.timestamp:__init(y, m, d, h, i, s, f, z)
self.year = tonumber(y)
self.month = tonumber(m)
self.day = tonumber(d)
self.hour = tonumber(h or 0)
self.minute = tonumber(i or 0)
self.second = tonumber(s or 0)
if type(f) == 'string' and sfind(f, '^%d+$') then
self.fraction = tonumber(f) * math.pow(10, 3 - #f)
elseif f then
self.fraction = f
else
self.fraction = 0
end
self.timezone = z
end
function types.timestamp:__tostring()
return string.format(
'%04d-%02d-%02dT%02d:%02d:%02d.%03d%s',
self.year, self.month, self.day,
self.hour, self.minute, self.second, self.fraction,
self:gettz())
end
function types.timestamp:gettz()
if not self.timezone then
return ''
end
if self.timezone == 0 then
return 'Z'
end
local sign = self.timezone > 0
local z = sign and self.timezone or -self.timezone
local zh = math.floor(z)
local zi = (z - zh) * 60
return string.format(
'%s%02d:%02d', sign and '+' or '-', zh, zi)
end
local function countindent(line)
local _, j = sfind(line, '^%s+')
if not j then
return 0, line
end
return j, ssub(line, j+1)
end
local Parser = {
timestamps = true, -- parse timestamps as objects instead of strings
}
function Parser:parsestring(line, stopper)
stopper = stopper or ''
local q = ssub(line, 1, 1)
if q == ' ' or q == '\t' then
return self:parsestring(ssub(line, 2))
end
if q == "'" then
local i = sfind(line, "'", 2, true)
if not i then
return nil, line
end
return ssub(line, 2, i-1), ssub(line, i+1)
end
if q == '"' then
local i, buf = 2, ''
while i < #line do
local c = ssub(line, i, i)
if c == '\\' then
local n = ssub(line, i+1, i+1)
if UNESCAPES[n] ~= nil then
buf = buf..UNESCAPES[n]
elseif n == 'x' then
local h = ssub(line, i+2, i+3)
if sfind(h, '^[0-9a-fA-F][0-9a-fA-F]$') then
buf = buf..schar(tonumber(h, 16))
i = i + 2
else
buf = buf..'x'
end
else
buf = buf..n
end
i = i + 1
elseif c == q then
break
else
buf = buf..c
end
i = i + 1
end
return buf, ssub(line, i+1)
end
if q == '{' or q == '[' then -- flow style
return nil, line
end
if q == '|' or q == '>' then -- block
return nil, line
end
if q == '-' or q == ':' then
if ssub(line, 2, 2) == ' ' or ssub(line, 2, 2) == '\n' or #line == 1 then
return nil, line
end
end
if line == "*" then
error("did not find expected alphabetic or numeric character")
end
local buf = ''
while #line > 0 do
local c = ssub(line, 1, 1)
if sfind(stopper, c, 1, true) then
break
elseif c == ':' and (ssub(line, 2, 2) == ' ' or ssub(line, 2, 2) == '\n' or #line == 1) then
break
elseif c == '#' and (ssub(buf, #buf, #buf) == ' ') then
break
else
buf = buf..c
end
line = ssub(line, 2)
end
return rtrim(buf), line
end
local function isemptyline(line)
return line == '' or sfind(line, '^%s*$') or sfind(line, '^%s*#')
end
local function equalsline(line, needle)
return startswith(line, needle) and isemptyline(ssub(line, #needle+1))
end
local function compactifyemptylines(lines)
-- Appends empty lines as "\n" to the end of the nearest preceding non-empty line
local compactified = {}
local lastline = {}
for i = 1, #lines do
local line = lines[i]
if isemptyline(line) then
if #compactified > 0 and i < #lines then
tinsert(lastline, "\n")
end
else
if #lastline > 0 then
tinsert(compactified, tconcat(lastline, ""))
end
lastline = {line}
end
end
if #lastline > 0 then
tinsert(compactified, tconcat(lastline, ""))
end
return compactified
end
local function checkdupekey(map, key)
if rawget(map, key) ~= nil then
-- print("found a duplicate key '"..key.."' in line: "..line)
local suffix = 1
while rawget(map, key..'_'..suffix) do
suffix = suffix + 1
end
key = key ..'_'..suffix
end
return key
end
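-- Illustrative example (hypothetical input): a document that repeats a key,
--
--   foo: 1
--   foo: 2
--   foo: 3
--
-- parses to { foo = 1, foo_1 = 2, foo_2 = 3 } instead of raising an error.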
function Parser:parseflowstyle(line, lines)
local stack = {}
while true do
if #line == 0 then
if #lines == 0 then
break
else
line = tremove(lines, 1)
end
end
local c = ssub(line, 1, 1)
if c == '#' then
line = ''
elseif c == ' ' or c == '\t' or c == '\r' or c == '\n' then
line = ssub(line, 2)
elseif c == '{' or c == '[' then
tinsert(stack, {v={},t=c})
line = ssub(line, 2)
elseif c == ':' then
local s = tremove(stack)
tinsert(stack, {v=s.v, t=':'})
line = ssub(line, 2)
elseif c == ',' then
local value = tremove(stack)
if value.t == ':' or value.t == '{' or value.t == '[' then error() end
if stack[#stack].t == ':' then
-- map
local key = tremove(stack)
key.v = checkdupekey(stack[#stack].v, key.v)
stack[#stack].v[key.v] = value.v
elseif stack[#stack].t == '{' then
-- set
stack[#stack].v[value.v] = true
elseif stack[#stack].t == '[' then
-- seq
tinsert(stack[#stack].v, value.v)
end
line = ssub(line, 2)
elseif c == '}' then
if stack[#stack].t == '{' then
if #stack == 1 then break end
stack[#stack].t = '}'
line = ssub(line, 2)
else
line = ','..line
end
elseif c == ']' then
if stack[#stack].t == '[' then
if #stack == 1 then break end
stack[#stack].t = ']'
line = ssub(line, 2)
else
line = ','..line
end
else
local s, rest = self:parsestring(line, ',{}[]')
if not s then
error('invalid flowstyle line: '..line)
end
tinsert(stack, {v=s, t='s'})
line = rest
end
end
return stack[1].v, line
end
function Parser:parseblockstylestring(line, lines, indent)
if #lines == 0 then
error("failed to find multi-line scalar content")
end
local s = {}
local firstindent = -1
local endline = -1
for i = 1, #lines do
local ln = lines[i]
local idt = countindent(ln)
if idt <= indent then
break
end
if ln == '' then
tinsert(s, '')
else
if firstindent == -1 then
firstindent = idt
elseif idt < firstindent then
break
end
tinsert(s, ssub(ln, firstindent + 1))
end
endline = i
end
local striptrailing = true
local sep = '\n'
local newlineatend = true
if line == '|' then
striptrailing = true
sep = '\n'
newlineatend = true
elseif line == '|+' then
striptrailing = false
sep = '\n'
newlineatend = true
elseif line == '|-' then
striptrailing = true
sep = '\n'
newlineatend = false
elseif line == '>' then
striptrailing = true
sep = ' '
newlineatend = true
elseif line == '>+' then
striptrailing = false
sep = ' '
newlineatend = true
elseif line == '>-' then
striptrailing = true
sep = ' '
newlineatend = false
else
error('invalid blockstyle string:'..line)
end
if #s == 0 then
return ""
end
local _, eonl = s[#s]:gsub('\n', '\n')
s[#s] = rtrim(s[#s])
if striptrailing then
eonl = 0
end
if newlineatend then
eonl = eonl + 1
end
for i = endline, 1, -1 do
tremove(lines, i)
end
return tconcat(s, sep)..string.rep('\n', eonl)
end
function Parser:parsetimestamp(line)
local _, p1, y, m, d = sfind(line, '^(%d%d%d%d)%-(%d%d)%-(%d%d)')
if not p1 then
return nil, line
end
if p1 == #line then
return types.timestamp(y, m, d), ''
end
local _, p2, h, i, s = sfind(line, '^[Tt ](%d+):(%d+):(%d+)', p1+1)
if not p2 then
return types.timestamp(y, m, d), ssub(line, p1+1)
end
if p2 == #line then
return types.timestamp(y, m, d, h, i, s), ''
end
local _, p3, f = sfind(line, '^%.(%d+)', p2+1)
if not p3 then
p3 = p2
f = 0
end
local zc = ssub(line, p3+1, p3+1)
local _, p4, zs, z = sfind(line, '^ ?([%+%-])(%d+)', p3+1)
if p4 then
z = tonumber(z)
local _, p5, zi = sfind(line, '^:(%d+)', p4+1)
if p5 then
z = z + tonumber(zi) / 60
end
z = zs == '-' and -tonumber(z) or tonumber(z)
elseif zc == 'Z' then
p4 = p3 + 1
z = 0
else
p4 = p3
z = false
end
return types.timestamp(y, m, d, h, i, s, f, z), ssub(line, p4+1)
end
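-- Illustrative example: Parser:parsetimestamp('2001-12-14T21:59:43Z') returns a
-- types.timestamp with year=2001, month=12, day=14, hour=21, minute=59,
-- second=43 and timezone=0, which __tostring renders as '2001-12-14T21:59:43.000Z'.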
function Parser:parsescalar(line, lines, indent)
line = trim(line)
line = gsub(line, '^%s*#.*$', '') -- comment only -> ''
line = gsub(line, '^%s*', '') -- trim head spaces
if line == '' or line == '~' then
return null
end
if self.timestamps then
local ts, _ = self:parsetimestamp(line)
if ts then
return ts
end
end
local s, _ = self:parsestring(line)
-- startswith quote ... string
-- not startswith quote ... maybe string
if s and (startswith(line, '"') or startswith(line, "'")) then
return s
end
if startswith(line, '!') then -- unexpected tagchar
error('unsupported line: '..line)
end
if equalsline(line, '{}') then
return {}
end
if equalsline(line, '[]') then
return {}
end
if startswith(line, '{') or startswith(line, '[') then
return self:parseflowstyle(line, lines)
end
if startswith(line, '|') or startswith(line, '>') then
return self:parseblockstylestring(line, lines, indent)
end
-- Regular unquoted string
line = gsub(line, '%s*#.*$', '') -- trim tail comment
local v = line
if v == 'null' or v == 'Null' or v == 'NULL'then
return null
elseif v == 'true' or v == 'True' or v == 'TRUE' then
return true
elseif v == 'false' or v == 'False' or v == 'FALSE' then
return false
elseif v == '.inf' or v == '.Inf' or v == '.INF' then
return math.huge
elseif v == '+.inf' or v == '+.Inf' or v == '+.INF' then
return math.huge
elseif v == '-.inf' or v == '-.Inf' or v == '-.INF' then
return -math.huge
elseif v == '.nan' or v == '.NaN' or v == '.NAN' then
return 0 / 0
elseif sfind(v, '^[%+%-]?[0-9]+$') or sfind(v, '^[%+%-]?[0-9]+%.$')then
return tonumber(v) -- : int
elseif sfind(v, '^[%+%-]?[0-9]+%.[0-9]+$') then
return tonumber(v)
end
return s or v
end
function Parser:parseseq(line, lines, indent)
local seq = setmetatable({}, types.seq)
if line ~= '' then
error()
end
while #lines > 0 do
-- Check for a new document
line = lines[1]
if startswith(line, '---') then
while #lines > 0 and not startswith(lines[1], '---') do
tremove(lines, 1)
end
return seq
end
-- Check the indent level
local level = countindent(line)
if level < indent then
return seq
elseif level > indent then
error("found bad indenting in line: ".. line)
end
local i, j = sfind(line, '%-%s+')
if not i then
i, j = sfind(line, '%-$')
if not i then
return seq
end
end
local rest = ssub(line, j+1)
if sfind(rest, '^[^\'\"%s]*:%s*$') or sfind(rest, '^[^\'\"%s]*:%s+.') then
-- Inline nested hash
-- There are two patterns need to match as inline nested hash
-- first one should have no other characters except whitespace after `:`
-- and the second one should have characters besides whitespace after `:`
--
-- value:
-- - foo:
-- bar: 1
--
-- and
--
-- value:
-- - foo: bar
--
-- And there is one pattern should not be matched, where there is no space after `:`
-- in below, `foo:bar` should be parsed into a single string
--
-- value:
-- - foo:bar
local indent2 = j
lines[1] = string.rep(' ', indent2)..rest
tinsert(seq, self:parsemap('', lines, indent2))
elseif sfind(rest, '^%-%s+') then
-- Inline nested seq
local indent2 = j
lines[1] = string.rep(' ', indent2)..rest
tinsert(seq, self:parseseq('', lines, indent2))
elseif isemptyline(rest) then
tremove(lines, 1)
if #lines == 0 then
tinsert(seq, null)
return seq
end
if sfind(lines[1], '^%s*%-') then
local nextline = lines[1]
local indent2 = countindent(nextline)
if indent2 == indent then
-- Null seq entry
tinsert(seq, null)
else
tinsert(seq, self:parseseq('', lines, indent2))
end
else
-- - # comment
-- key: value
local nextline = lines[1]
local indent2 = countindent(nextline)
tinsert(seq, self:parsemap('', lines, indent2))
end
elseif line == "*" then
error("did not find expected alphabetic or numeric character")
elseif rest then
-- Array entry with a value
tremove(lines, 1)
tinsert(seq, self:parsescalar(rest, lines))
end
end
return seq
end
function Parser:parseset(line, lines, indent)
if not isemptyline(line) then
error('not seq line: '..line)
end
local set = setmetatable({}, types.set)
while #lines > 0 do
-- Check for a new document
line = lines[1]
if startswith(line, '---') then
while #lines > 0 and not startswith(lines[1], '---') do
tremove(lines, 1)
end
return set
end
-- Check the indent level
local level = countindent(line)
if level < indent then
return set
elseif level > indent then
error("found bad indenting in line: ".. line)
end
local i, j = sfind(line, '%?%s+')
if not i then
i, j = sfind(line, '%?$')
if not i then
return set
end
end
local rest = ssub(line, j+1)
if sfind(rest, '^[^\'\"%s]*:') then
-- Inline nested hash
local indent2 = j
lines[1] = string.rep(' ', indent2)..rest
set[self:parsemap('', lines, indent2)] = true
elseif sfind(rest, '^%s+$') then
tremove(lines, 1)
if #lines == 0 then
tinsert(set, null)
return set
end
if sfind(lines[1], '^%s*%?') then
local indent2 = countindent(lines[1])
if indent2 == indent then
-- Null array entry
set[null] = true
else
set[self:parseseq('', lines, indent2)] = true
end
end
elseif rest then
tremove(lines, 1)
set[self:parsescalar(rest, lines)] = true
else
error("failed to classify line: "..line)
end
end
return set
end
function Parser:parsemap(line, lines, indent)
if not isemptyline(line) then
error('not map line: '..line)
end
local map = setmetatable({}, types.map)
while #lines > 0 do
-- Check for a new document
line = lines[1]
if startswith(line, '---') then
while #lines > 0 and not startswith(lines[1], '---') do
tremove(lines, 1)
end
return map
end
-- Check the indent level
local level, _ = countindent(line)
if level < indent then
return map
elseif level > indent then
error("found bad indenting in line: ".. line)
end
-- Find the key
local key
local s, rest = self:parsestring(line)
-- Quoted keys
if s and startswith(rest, ':') then
local sc = self:parsescalar(s, {}, 0)
if sc and type(sc) ~= 'string' then
key = sc
else
key = s
end
line = ssub(rest, 2)
else
error("failed to classify line: "..line)
end
key = checkdupekey(map, key)
line = ltrim(line)
if ssub(line, 1, 1) == '!' then
-- ignore type
local rh = ltrim(ssub(line, 3))
local typename = smatch(rh, '^!?[^%s]+')
line = ltrim(ssub(rh, #typename+1))
end
if not isemptyline(line) then
tremove(lines, 1)
line = ltrim(line)
map[key] = self:parsescalar(line, lines, indent)
else
-- An indent
tremove(lines, 1)
if #lines == 0 then
map[key] = null
return map;
end
if sfind(lines[1], '^%s*%-') then
local indent2 = countindent(lines[1])
map[key] = self:parseseq('', lines, indent2)
elseif sfind(lines[1], '^%s*%?') then
local indent2 = countindent(lines[1])
map[key] = self:parseset('', lines, indent2)
else
local indent2 = countindent(lines[1])
if indent >= indent2 then
-- Null hash entry
map[key] = null
else
map[key] = self:parsemap('', lines, indent2)
end
end
end
end
return map
end
-- : (list<str>)->dict
function Parser:parsedocuments(lines)
lines = compactifyemptylines(lines)
if sfind(lines[1], '^%%YAML') then tremove(lines, 1) end
local root = {}
local in_document = false
while #lines > 0 do
local line = lines[1]
-- Do we have a document header?
local docright;
if sfind(line, '^%-%-%-') then
-- Handle scalar documents
docright = ssub(line, 4)
tremove(lines, 1)
in_document = true
end
if docright then
if (not sfind(docright, '^%s+$') and
not sfind(docright, '^%s+#')) then
tinsert(root, self:parsescalar(docright, lines))
end
elseif #lines == 0 or startswith(line, '---') then
-- A naked document
tinsert(root, null)
while #lines > 0 and not sfind(lines[1], '---') do
tremove(lines, 1)
end
in_document = false
-- XXX The final '-+$' is to look for -- which ends up being an
-- error later.
elseif not in_document and #root > 0 then
-- only the first document can be explicit
error('parse error: '..line)
elseif sfind(line, '^%s*%-') then
-- An array at the root
tinsert(root, self:parseseq('', lines, 0))
elseif sfind(line, '^%s*[^%s]') then
-- A hash at the root
local level = countindent(line)
tinsert(root, self:parsemap('', lines, level))
else
-- Shouldn't get here. @lines have whitespace-only lines
-- stripped, and previous match is a line with any
-- non-whitespace. So this clause should only be reachable via
-- a perlbug where \s is not symmetric with \S
-- uncoverable statement
error('parse error: '..line)
end
end
if #root > 1 and Null.isnull(root[1]) then
tremove(root, 1)
return root
end
return root
end
--- Parse yaml string into table.
function Parser:parse(source)
local lines = {}
for line in string.gmatch(source .. '\n', '(.-)\r?\n') do
tinsert(lines, line)
end
local docs = self:parsedocuments(lines)
if #docs == 1 then
return docs[1]
end
return docs
end
local function parse(source, options)
local options = options or {}
local parser = setmetatable (options, {__index=Parser})
return parser:parse(source)
end
return {
version = 0.1,
parse = parse,
}
================================================
FILE: lua/nlspsettings/log.lua
================================================
local has_notify, notify = pcall(require, 'notify')
local config = require 'nlspsettings.config'
local TITLE = 'Lsp Settings'
--- Checks whether the nvim-notify integration can be used
---@return boolean
local use_nvim_notify = function()
local notify_config = config.get().nvim_notify
return has_notify and notify_config and notify_config.enable
end
--- Logs a message with the given log level.
---@param message string
---@param level number
local log = function(message, level)
if use_nvim_notify() then
notify(message, level, {
title = TITLE,
timeout = config.get().nvim_notify.timeout,
})
else
vim.notify(('[%s] %s'):format(TITLE, message), level)
end
end
--- Logs a message once with the given log level.
---@param message string
---@param level number
local log_once = function(message, level)
-- users of nvim-notify may have `vim.notify = require("notify")`
-- vim.notify_once uses vim.notify
if use_nvim_notify() and vim.notify == notify then
vim.notify_once(message, level, {
title = TITLE,
timeout = config.get().nvim_notify.timeout,
})
else
vim.notify_once(('[%s] %s'):format(TITLE, message), level)
end
end
--- Creates a logger from the given level.
---@param level string
local create_level = function(level)
return function(message, once)
if once then
return log_once(message, level)
end
log(message, level)
end
end
local M = {}
M.info = create_level(vim.log.levels.INFO)
M.warn = create_level(vim.log.levels.WARN)
M.error = create_level(vim.log.levels.ERROR)
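-- Usage (illustrative): each level logs on every call, or only once when the
-- second argument is true, e.g.
--   M.warn('settings file not found')        -- logs every time
--   M.warn('settings file not found', true)  -- logs only the first time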
return M
================================================
FILE: lua/nlspsettings/schemas.lua
================================================
local config = require 'nlspsettings.config'
local uv = vim.loop
local on_windows = uv.os_uname().version:match 'Windows'
local path_sep = on_windows and '\\' or '/'
local function join_paths(...)
local result = table.concat({ ... }, path_sep)
return result
end
local function system_path(path)
if not on_windows then
return path
end
return path:gsub('/', '\\')
end
-- on windows, neovim returns paths like:
-- C:\\Users\\USER\\AppData\\Local\\nvim\\lua/folder/file.lua
local script_abspath = system_path(debug.getinfo(1, 'S').source:sub(2))
local pattern = join_paths('(.*)', 'lua', 'nlspsettings', 'schemas.lua$')
local nlsp_abstpath = script_abspath:match(pattern)
local _schemas_dir = join_paths(nlsp_abstpath, 'schemas')
local _generated_dir = join_paths(_schemas_dir, '_generated')
--- Builds a table from all *.json settings files in the given directory.
---@param path string
---@return table<string, string> { server_name: file_path }
local make_schemas_table = function(path)
local handle = uv.fs_scandir(path)
if handle == nil then
return {}
end
local res = {}
while true do
local name, _ = uv.fs_scandir_next(handle)
if name == nil then
break
end
local server_name = string.match(name, '([^/]+)%.json$')
if server_name ~= nil then
res[server_name] = string.format('/%s/%s', path, name)
end
end
return res
end
-- Resets the base schema data.
local base_schemas_data = {}
local reset_base_schemas_data = function()
local generated_schemas_table = make_schemas_table(_generated_dir)
local local_schemas_table = make_schemas_table(_schemas_dir)
base_schemas_data = vim.tbl_extend('force', generated_schemas_table, local_schemas_table)
end
reset_base_schemas_data()
--- Returns the base schema data.
---@return table<string, string> { server_name: file_path }
local get_base_schemas_data = function()
return base_schemas_data
end
--- Returns the names of the supported servers.
---@return table
local get_langserver_names = function()
if not config.get().open_strictly then
return vim.tbl_keys(base_schemas_data)
end
return vim.tbl_values(vim.tbl_map(function(server_name)
return base_schemas_data[server_name] and server_name
end, vim.tbl_keys(require 'lspconfig.configs')))
end
return {
get_base_schemas_data = get_base_schemas_data,
get_langserver_names = get_langserver_names,
}
================================================
FILE: lua/nlspsettings/utils.lua
================================================
local log = require 'nlspsettings.log'
---@param t table
---@return boolean
local is_table = function(t)
return type(t) == 'table' and (not vim.tbl_islist(t) or vim.tbl_isempty(t))
end
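-- Illustrative results: is_table({ x = 1 }) --> true, is_table({}) --> true,
-- is_table({ 'a', 'b' }) --> false (a list, not a map-like table).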
--- Merges two schema containers.
--- * For YAML, only a table is accepted:
--- { url: globpattern }
--- * For JSON, only a list is accepted:
--- { { fileMatch = string, url = string } }
--- If a list and a table are mixed (list + table or table + list), a warning is shown and t1 is returned.
---@param t1 table
---@param t2 table
---@return table
local extend = function(t1, t2)
-- Deep-copy so the originals are not mutated
t1 = vim.deepcopy(t1)
t2 = vim.deepcopy(t2)
if vim.tbl_islist(t1) and vim.tbl_islist(t2) then
-- list + list
vim.list_extend(t1, t2)
return t1
end
if is_table(t1) and is_table(t2) then
-- table + table
t1 = vim.tbl_deep_extend('keep', t1, t2)
return t1
end
if (vim.tbl_islist(t1) and is_table(t2)) or (is_table(t1) and vim.tbl_islist(t2)) then
-- list + table
-- table + list
log.warn('Cannot merge list and table', true)
end
return t1
end
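-- Illustrative examples (hypothetical schema values):
--   extend({ 'a.json' }, { 'b.json' })   --> { 'a.json', 'b.json' }  (list + list)
--   extend({ x = 1 }, { x = 2, y = 3 })  --> { x = 1, y = 3 }        (table + table, t1 wins)
--   extend({ 'a.json' }, { x = 1 })      --> warns and returns { 'a.json' }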
--- Get active clients.
local get_clients = function()
if vim.version().major == 0 and vim.version().minor <= 9 then
-- DEPRECATED IN 0.10
return vim.lsp.get_active_clients()
else
return vim.lsp.get_clients()
end
end
return {
extend = extend,
get_clients = get_clients
}
================================================
FILE: lua/nlspsettings.lua
================================================
local config = require 'nlspsettings.config'
local lspconfig = require 'lspconfig'
local utils = require 'nlspsettings.utils'
local lspconfig_util = require 'lspconfig.util'
local uv = vim.loop
local M = {}
---@class nlspsettings.server_settings
---@field global_settings table
---@field conf_settings table
---@type table<string, nlspsettings.server_settings>
local servers = {}
---@type nlspsettings.loaders.json|nlspsettings.loaders.yaml
local loader
local loader_is_set = false
local set_loader = function()
loader = require('nlspsettings.loaders.' .. config.get().loader)
loader_is_set = true
end
--- Converts dotted table keys into nested tables
---
---@param t table settings table with dotted keys
---@return table
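--- Illustrative example:
---   lsp_table_to_lua_table { ['Lua.diagnostics.enable'] = true }
---   --> { Lua = { diagnostics = { enable = true } } }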
local lsp_table_to_lua_table = function(t)
vim.validate {
t = { t, 't' },
}
local res = {}
for key, value in pairs(t) do
local key_list = {}
for k in string.gmatch(key, '([^.]+)') do
table.insert(key_list, k)
end
local tbl = res
for i, k in ipairs(key_list) do
if i == #key_list then
tbl[k] = value
end
if tbl[k] == nil then
tbl[k] = {}
end
tbl = tbl[k]
end
end
return res
end
--- Loads a settings file.
---@param path string
---@return table data
---@return boolean|nil error
local load = function(path)
vim.validate {
path = { path, 's' },
}
if vim.fn.filereadable(path) == 0 then
return {}
end
local data, err = loader.load(path)
if err ~= nil then
return {}, true
end
if data == nil then
return {}
end
return lsp_table_to_lua_table(data) or {}
end
--- Extracts the server name from a settings file path.
---@param path string
---@return string|nil
local get_server_name_from_path = function(path)
return path:match '([^/]+)%.%w+$'
end
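-- e.g. (hypothetical path) '~/.config/nvim/nlsp-settings/lua_ls.json' --> 'lua_ls'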
--- Loads global settings from a settings file.
---@param path string settings file path
---@return boolean|nil is_error true if loading failed
local load_global_setting = function(path)
vim.validate {
path = { path, 's' },
}
local name = get_server_name_from_path(path)
if name == nil then
return
end
if name and servers[name] == nil then
servers[name] = {}
servers[name].global_settings = {}
servers[name].conf_settings = {}
end
local data, err = load(path)
if err then
return err
end
servers[name].global_settings = data
end
--- Returns the current settings for the specified server
---@param root_dir string
---@param server_name string
---@return table merged_settings
---@return boolean error when loading local settings
local get_settings = function(root_dir, server_name)
local conf = config.get()
local local_settings, err = load(
string.format('%s/%s/%s.%s', root_dir, conf.local_settings_dir, server_name, loader.file_ext)
)
local global_settings = (servers[server_name] and servers[server_name].global_settings) or {}
local conf_settings = (servers[server_name] and servers[server_name].conf_settings) or {}
-- Priority:
-- 1. local settings
-- 2. global settings
-- 3. setup({settings = ...})
-- 4. default_config.settings
local settings = vim.empty_dict()
settings = vim.tbl_deep_extend('keep', settings, local_settings)
settings = vim.tbl_deep_extend('keep', settings, global_settings)
settings = vim.tbl_deep_extend('keep', settings, conf_settings)
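-- Illustrative merge (hypothetical values): with
--   local_settings  = { Lua = { format = { enable = true } } }
--   global_settings = { Lua = { hint = { enable = true } } }
-- the result is { Lua = { format = { enable = true }, hint = { enable = true } } };
-- since 'keep' is used, the earlier (higher-priority) value wins on conflicts.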
-- For the loader's schema server (e.g. jsonls), add the schemas
if server_name == loader.server_name then
local settings_key = loader.settings_key
if settings[settings_key] == nil then
-- If nothing is configured, set the key here; otherwise indexing the missing key below would fail
settings[settings_key] = {}
settings[settings_key].schemas = {}
end
local s_schemas = settings[settings_key].schemas
-- XXX: Not needed here, because the settings were already merged above
-- --- merge schemas
-- local function merge(base_schemas, ext)
-- if ext[settings_key] == nil or ext[settings_key]['schemas'] == nil then
-- return base_schemas
-- end
--
-- return utils.extend(base_schemas, ext[settings_key]['schemas'])
-- end
--
-- s_schemas = merge(merge(merge(s_schemas, local_settings), global_settings), conf_settings)
if conf.append_default_schemas then
s_schemas = utils.extend(s_schemas, loader.get_default_schemas())
end
settings[settings_key].schemas = s_schemas
end
return settings, err
end
M.get_settings = get_settings
--- Read the settings file and notify the server in workspace/didChangeConfiguration
---@param server_name string
M.update_settings = function(server_name)
vim.validate {
server_name = { server_name, 's' },
}
if #utils.get_clients() == 0 then
-- No need to read the settings file here; it is read when on_new_config() is called
return false
end
local conf = config.get()
-- The global settings may have changed, so reload them here
local err = load_global_setting(string.format('%s/%s.%s', conf.config_home, server_name, loader.file_ext))
if err then
return true
end
local errors = false
-- Update the settings of every client for server_name
for _, client in ipairs(utils.get_clients()) do
if client.name == server_name then
-- Merge the settings from the settings file with those from setup()
-- Load inside the loop because each client's settings depend on its own root_dir
local new_settings, err_ = get_settings(client.config.root_dir, server_name)
if err_ then
errors = true
end
-- XXX: This errors for some reason...
-- client.workspace_did_change_configuration(new_settings)
client.notify('workspace/didChangeConfiguration', {
settings = new_settings,
})
-- Always keep this in sync: Neovim's built-in workspace/configuration handler uses client.config.settings
client.config.settings = new_settings
end
end
return errors
end
--- Make an on_new_config function that sets the settings
---@param on_new_config function
---@return function
local make_on_new_config = function(on_new_config)
-- Hooked before so that a later on_new_config can still override settings
-- XXX: unsure whether before or after is the better choice
return lspconfig.util.add_hook_before(on_new_config, function(new_config, root_dir)
local server_name = new_config.name
if servers[server_name] == nil then
servers[server_name] = {}
end
-- Keep a copy only once
-- new_config.settings is `setup({settings = ...}) + default_config.settings`
servers[server_name].conf_settings = vim.deepcopy(new_config.settings)
new_config.settings = get_settings(root_dir, server_name)
end)
end
local setup_autocmds = function()
local conf = config.get()
local patterns = {
string.format('*/%s/*.%s', lspconfig_util.path.sanitize(conf.config_home):match '[^/]+$', loader.file_ext),
string.format('*/%s/*.%s', lspconfig_util.path.sanitize(conf.local_settings_dir), loader.file_ext),
}
local pattern = table.concat(patterns, ',')
vim.cmd [[augroup LspSettings]]
vim.cmd [[ autocmd!]]
vim.cmd(
([[ autocmd BufWritePost %s lua require'nlspsettings.command'._BufWritePost(vim.fn.expand('<afile>'))]]):format(
pattern
)
)
vim.cmd [[augroup END]]
end
--- Returns the list of settings files under path that match the loader
---@param path string config_home
---@return table settings_files List of settings file
local get_settings_files = function(path)
local handle = uv.fs_scandir(path)
if handle == nil then
return {}
end
local res = {}
while true do
local name, _ = uv.fs_scandir_next(handle)
if name == nil then
break
end
table.insert(res, path .. '/' .. name)
end
return res
end
--- load settings files under config_home
local load_settings = function()
local files = get_settings_files(config.get().config_home)
for _, v in ipairs(files) do
load_global_setting(v)
end
end
--- Use on_new_config to enable automatic loading of settings files on all language servers
local setup_default_config = function()
lspconfig.util.default_config = vim.tbl_extend('force', lspconfig.util.default_config, {
on_new_config = make_on_new_config(lspconfig.util.default_config.on_new_config),
})
end
--- Setup to read from a settings file.
---@param opts nlspsettings.config.values
M.setup = function(opts)
vim.validate {
opts = { opts, 't', true },
}
opts = opts or {}
config.set_default_values(opts)
set_loader()
-- XXX: Is it really necessary to read here??
-- Wouldn't reading in get_settings() be enough?
load_settings()
setup_autocmds()
setup_default_config()
end
local mt = {}
-- M.settings = servers
M._get_servers = function()
return servers
end
---
---@return nlspsettings.loaders.json.jsonschema[]|table<string, string>
M.get_default_schemas = function()
if not loader_is_set then
set_loader()
end
return loader.get_default_schemas()
end
return setmetatable(M, mt)
================================================
FILE: plugin/nlspsetting.vim
================================================
if exists('g:loaded_nlspsettings')
finish
endif
let g:loaded_nlspsettings = 1
command! -nargs=* -complete=custom,v:lua.require'nlspsettings.command.completion'.complete
\ LspSettings lua require("nlspsettings.command")._command(<f-args>)
" deprecated
function! s:complete(arg, line, pos) abort
return join(luaeval('require("nlspsettings.schemas").get_langserver_names()'), "\n")
endfunction
command! -nargs=1 -complete=custom,s:complete NlspConfig lua require('nlspsettings.deprecated').open()
command! -nargs=1 -complete=custom,s:complete NlspLocalConfig lua require('nlspsettings.deprecated').open_local()
command! -nargs=0 NlspBufConfig lua require('nlspsettings.deprecated').open_buffer()
command! -nargs=0 NlspLocalBufConfig lua require('nlspsettings.deprecated').open_local_buffer()
command! -nargs=1 -complete=custom,s:complete NlspUpdateSettings lua require('nlspsettings.deprecated').update()
================================================
FILE: schemas/README.md
================================================
# Schemas
================================================
FILE: schemas/_generated/.gitkeep
================================================
================================================
FILE: schemas/_generated/als.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of als",
"properties": {
"ada.defaultCharset": {
"default": null,
"markdownDescription": "The character set that the Ada Language Server should use when reading files from disk.\n\nIf not set in VS Code, this setting takes its value from the [`.als.json`](https://github.com/AdaCore/ada_language_server/blob/master/doc/settings.md) file at the root of the workspace, if that file exists.",
"scope": "window",
"type": "string"
},
"ada.gprConfigurationFile": {
"default": null,
"markdownDescription": "GPR configuration file (*.cgpr) for this workspace.\n\nIt is recommended to set this to a relative path starting at the root of the workspace.\n\nIf not set in VS Code, this setting takes its value from the [`.als.json`](https://github.com/AdaCore/ada_language_server/blob/master/doc/settings.md) file at the root of the workspace, if that file exists.",
"order": 0,
"scope": "window",
"type": "string"
},
"ada.projectFile": {
"default": null,
"markdownDescription": "GPR project file (*.gpr) for this workspace.\n\nIt is recommended to set this to a relative path starting at the root of the workspace.\n\nIf not set in VS Code, this setting takes its value from the [`.als.json`](https://github.com/AdaCore/ada_language_server/blob/master/doc/settings.md) file at the root of the workspace, if that file exists.",
"order": 0,
"scope": "window",
"type": "string"
},
"ada.relocateBuildTree": {
"default": null,
"markdownDescription": "The path to a directory used for out-of-tree builds. This feature is related to the [--relocate-build-tree GPRbuild command line switch](https://docs.adacore.com/gprbuild-docs/html/gprbuild_ug/building_with_gprbuild.html#switches).\n\nIf not set in VS Code, this setting takes its value from the [`.als.json`](https://github.com/AdaCore/ada_language_server/blob/master/doc/settings.md) file at the root of the workspace, if that file exists.",
"scope": "window",
"type": "string"
},
"ada.rootDir": {
"default": null,
"markdownDescription": "This setting must be used in conjunction with the `relocateBuildTree` setting.\n\nIt specifies the root directory for artifact relocation. It corresponds to the [--root-dir GPRbuild command line switch](https://docs.adacore.com/gprbuild-docs/html/gprbuild_ug/building_with_gprbuild.html#switches).\n\nIf not set in VS Code, this setting takes its value from the [`.als.json`](https://github.com/AdaCore/ada_language_server/blob/master/doc/settings.md) file at the root of the workspace, if that file exists.",
"scope": "window",
"type": "string"
},
"ada.scenarioVariables": {
"default": null,
"markdownDescription": "Scenario variables to apply to the GPR project file.\n\nThis value should be provided as an object where the property names are GPR scenario variables and the values are strings.\n\nIf not set in VS Code, this setting takes its value from the [`.als.json`](https://github.com/AdaCore/ada_language_server/blob/master/doc/settings.md) file at the root of the workspace, if that file exists.",
"order": 1,
"patternProperties": {
".*": {
"type": "string"
}
},
"scope": "window",
"type": "object"
}
}
}
================================================
FILE: schemas/_generated/asm_lsp.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of asm_lsp",
"properties": {
"default_config": {
"anyOf": [
{
"$ref": "#/definitions/Config"
},
{
"type": "null"
}
]
},
"project": {
"items": {
"$ref": "#/definitions/ProjectConfig"
},
"type": [
"array",
"null"
]
}
}
}
================================================
FILE: schemas/_generated/ast_grep.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of ast_grep",
"properties": {
"customLanguages": {
"additionalProperties": {
"$ref": "#/definitions/CustomLanguage"
},
"description": "A dictionary of custom languages in the project.",
"type": "object"
},
"languageGlobs": {
"additionalProperties": {
"items": {
"type": "string"
},
"type": "array"
},
"description": "A mapping to associate a language to files that have non-standard extensions or syntaxes.",
"type": "object"
},
"languageInjections": {
"description": "A list of language injections to support embedded languages in the project like JS/CSS in HTML. This is an experimental feature.",
"items": {
"$ref": "#/definitions/LanguageInjection"
},
"type": "array"
},
"ruleDirs": {
"description": "A list of string instructing where to discover ast-grep's YAML rules.",
"items": {
"type": "string"
},
"title": "Rule directories",
"type": "array"
},
"testConfigs": {
"description": "A list of object to configure ast-grep's test cases. Each object can have two fields.",
"items": {
"$ref": "#/definitions/TestConfig"
},
"title": "Test configurations",
"type": "array"
},
"utilDirs": {
"description": "A list of string instructing where to discover ast-grep's global utility rules.",
"items": {
"type": "string"
},
"title": "Utility directories",
"type": "array"
}
}
}
================================================
FILE: schemas/_generated/astro.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of astro",
"properties": {
"astro.auto-import-cache.enabled": {
"default": true,
"markdownDescription": "Enable the auto import cache. Yields a faster intellisense when automatically importing a file, but can cause issues with new files not being detected. Change is applied on restart. See [#1035](https://github.com/withastro/language-tools/issues/1035).",
"scope": "resource",
"type": "boolean"
},
"astro.content-intellisense": {
"default": false,
"description": "Enable experimental support for content collection intellisense inside Markdown, MDX and Markdoc. Note that this require also enabling the feature in your Astro config (experimental.contentCollectionIntellisense) (Astro 4.14+)",
"scope": "resource",
"type": "boolean"
},
"astro.language-server.ls-path": {
"description": "Path to the language server executable. You won't need this in most cases, set this only when needing a specific version of the language server",
"title": "Language Server: Path",
"type": "string"
},
"astro.language-server.runtime": {
"description": "Path to the node executable used to execute the language server. You won't need this in most cases",
"scope": "application",
"title": "Language Server: Runtime",
"type": "string"
},
"astro.trace.server": {
"default": "off",
"description": "Traces the communication between VS Code and the language server.",
"enum": [
"off",
"messages",
"verbose"
],
"scope": "window",
"type": "string"
},
"astro.updateImportsOnFileMove.enabled": {
"default": false,
"description": "Controls whether the extension updates imports when a file is moved to a new location. In most cases, you'll want to keep this disabled as TypeScript and the Astro TypeScript plugin already handles this for you. Having multiple tools updating imports at the same time can lead to corrupted files.",
"scope": "resource",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/awkls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of awkls",
"properties": {
"awk-ide-vscode.indexing": {
"default": true,
"description": "Turns on/off source files indexing. Requires restart.",
"scope": "window",
"type": "boolean"
},
"awk-ide-vscode.trace.server": {
"default": "off",
"description": "Traces the communication between VS Code and the language server.",
"enum": [
"off",
"messages",
"verbose"
],
"scope": "window",
"type": "string"
}
}
}
================================================
FILE: schemas/_generated/bashls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of bashls",
"properties": {
"bashIde.backgroundAnalysisMaxFiles": {
"default": 500,
"description": "Maximum number of files to analyze in the background. Set to 0 to disable background analysis.",
"minimum": 0,
"type": "number"
},
"bashIde.enableSourceErrorDiagnostics": {
"default": false,
"description": "Enable diagnostics for source errors. Ignored if includeAllWorkspaceSymbols is true.",
"type": "boolean"
},
"bashIde.explainshellEndpoint": {
"default": "",
"description": "Configure explainshell server endpoint in order to get hover documentation on flags and options.",
"type": "string"
},
"bashIde.globPattern": {
"default": "**/*@(.sh|.inc|.bash|.command)",
"description": "Glob pattern for finding and parsing shell script files in the workspace. Used by the background analysis features across files.",
"type": "string"
},
"bashIde.includeAllWorkspaceSymbols": {
"default": false,
"description": "Controls how symbols (e.g. variables and functions) are included and used for completion, documentation, and renaming. If false (default and recommended), then we only include symbols from sourced files (i.e. using non dynamic statements like 'source file.sh' or '. file.sh' or following ShellCheck directives). If true, then all symbols from the workspace are included.",
"type": "boolean"
},
"bashIde.logLevel": {
"default": "info",
"description": "Controls the log level of the language server.",
"enum": [
"debug",
"info",
"warning",
"error"
],
"type": "string"
},
"bashIde.shellcheckArguments": {
"default": "",
"description": "Additional ShellCheck arguments. Note that we already add the following arguments: --shell, --format, and --external-sources (if shellcheckExternalSources is true).",
"type": "string"
},
"bashIde.shellcheckExternalSources": {
"default": true,
"description": "Controls whether ShellCheck is invoked with --external-sources. When enabled (default), ShellCheck follows source directives to lint referenced files. On projects with many cross-sourcing scripts this can cause unbounded memory growth. Set to false to disable.",
"type": "boolean"
},
"bashIde.shellcheckPath": {
"default": "shellcheck",
"description": "Controls the executable used for ShellCheck linting information. An empty string will disable linting.",
"type": "string"
},
"bashIde.shfmt.binaryNextLine": {
"default": false,
"description": "Allow boolean operators (like && and ||) to start a line.",
"type": "boolean"
},
"bashIde.shfmt.caseIndent": {
"default": false,
"description": "Indent patterns in case statements.",
"type": "boolean"
},
"bashIde.shfmt.funcNextLine": {
"default": false,
"description": "Place function opening braces on a separate line.",
"type": "boolean"
},
"bashIde.shfmt.ignoreEditorconfig": {
"default": false,
"description": "Ignore shfmt config options in .editorconfig (always use language server config)",
"type": "boolean"
},
"bashIde.shfmt.keepPadding": {
"default": false,
"description": "(Deprecated) Keep column alignment padding.",
"markdownDescription": "**([Deprecated](https://github.com/mvdan/sh/issues/658))** Keep column alignment padding.",
"type": "boolean"
},
"bashIde.shfmt.languageDialect": {
"default": "auto",
"description": "Language dialect to use when parsing (bash/posix/mksh/bats).",
"enum": [
"auto",
"bash",
"posix",
"mksh",
"bats"
],
"type": "string"
},
"bashIde.shfmt.path": {
"default": "shfmt",
"description": "Controls the executable used for Shfmt formatting. An empty string will disable formatting.",
"type": "string"
},
"bashIde.shfmt.simplifyCode": {
"default": false,
"description": "Simplify code before formatting.",
"type": "boolean"
},
"bashIde.shfmt.spaceRedirects": {
"default": false,
"description": "Follow redirection operators with a space.",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/beancount.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of beancount",
"properties": {
"beancountLangServer.beanCheck": {
"additionalProperties": true,
"default": {},
"description": "Bean-check execution options",
"properties": {
"bean_check_cmd": {
"description": "Path to bean-check executable when using the system method.",
"type": "string"
},
"method": {
"default": "python-system",
"description": "Execution method for bean-check (system, python-system, python-embedded).",
"enum": [
"system",
"python-system",
"python-embedded"
],
"type": "string"
},
"python_cmd": {
"description": "Python executable to use for the python methods.",
"type": "string"
}
},
"scope": "resource",
"type": "object"
},
"beancountLangServer.formatting": {
"additionalProperties": true,
"default": {},
"description": "formatting config",
"properties": {
"account_amount_spacing": {
"type": "number"
},
"currency_column": {
"type": "number"
},
"indent_width": {
"type": "number"
},
"num_width": {
"type": "number"
},
"number_currency_spacing": {
"type": "number"
},
"prefix_width": {
"type": "number"
}
},
"scope": "resource",
"type": "object"
},
"beancountLangServer.journalFile": {
"default": "",
"description": "Path to the beancount journal",
"scope": "resource",
"type": "string"
},
"beancountLangServer.logLevel": {
"default": null,
"description": "Log level for the language server.",
"enum": [
"trace",
"debug",
"info",
"warn",
"error",
"off",
null
],
"scope": "window",
"type": [
"string",
"null"
]
},
"beancountLangServer.serverPath": {
"default": "",
"description": "Path to the beancount-language-server executable",
"scope": "resource",
"type": "string"
}
}
}
================================================
FILE: schemas/_generated/bicep.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of bicep",
"properties": {
"bicep.completions.getAllAccessibleAzureContainerRegistries": {
"default": false,
"description": "When completing 'br:' module references, query Azure for all container registries accessible to the user (may be slow). If this option is off, only registries configured under moduleAliases in bicepconfig.json will be listed.",
"type": "boolean"
},
"bicep.decompileOnPaste": {
"default": true,
"description": "Automatically convert pasted JSON values, JSON ARM templates or resources from a JSON ARM template into Bicep (use Undo to revert)",
"type": "boolean"
},
"bicep.enableOutputTimestamps": {
"$comment": "This is interpreted by vscode-azuretools package and the name has to be in the following format: <extensionConfigurationPrefix>.enableOutputTimestamps",
"default": true,
"description": "Prepend each line displayed in the Bicep Operations output channel with a timestamp.",
"type": "boolean"
},
"bicep.enableSurveys": {
"default": true,
"description": "Enable occasional surveys to collect feedback that helps us improve the Bicep extension.",
"type": "boolean"
},
"bicep.suppressedWarnings": {
"default": [],
"description": "Warnings that are being suppressed because a 'Don't show again' button was pressed. Remove items to reset.",
"items": {
"type": "string"
},
"type": "array"
},
"bicep.trace.server": {
"default": "Off",
"description": "Configure tracing of messages sent to the Bicep language server.",
"enum": [
"Off",
"Messages",
"Verbose"
],
"scope": "window",
"type": "string"
}
}
}
================================================
FILE: schemas/_generated/bright_script.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of bright_script",
"properties": {
"allowBrighterScriptInBrightScript": {
"default": false,
"description": "Allow brighterscript features (classes, interfaces, etc...) to be included in BrightScript (`.brs`) files, and force those files to be transpiled.",
"type": "boolean"
},
"autoImportComponentScript": {
"default": false,
"description": "When enabled, every xml component will search for a .bs or .brs file with the same name in the same folder, and add it as a script import if found. Disabled by default",
"type": "boolean"
},
"copyToStaging": {
"description": "If true, the files are copied to staging. This setting is ignored when deploy is enabled or if createPackage is enabled",
"type": "boolean"
},
"createPackage": {
"description": "Creates a zip package. Defaults to true. This setting is ignored when deploy is enabled.",
"type": "boolean"
},
"cwd": {
"description": "A path that will be used to override the current working directory",
"type": "string"
},
"deploy": {
"description": "If true, after a successful buld, the project will be deployed to the roku specified in host",
"type": "boolean"
},
"diagnosticFilters": {
"description": "A collection of filters used to hide diagnostics for certain files",
"items": {
"anyOf": [
{
"description": "The code of a diagnostic that should be filtered from all files",
"type": "number"
},
{
"description": "A file path relative to rootDir, an absolute path, or a file glob pointing to file(s) you wish to filter diagnostics from.",
"type": "string"
},
{
"description": "A file path relative to rootDir, an absolute path, or a file glob pointing to file(s) you wish to filter diagnostics from.",
"properties": {
"codes": {
"description": "A list of codes of diagnostics that should be filtered out from the files matched in 'src'`. If omitted, all error codes are used",
"items": {
"anyOf": [
{
"description": "A code of diagnostics that should be filtered out from the files matched in 'src'",
"type": [
"number",
"string"
]
}
]
},
"type": "array"
},
"src": {
"description": "A file path relative to rootDir, an absolute path, or a file glob pointing to file(s) you wish to filter diagnostics from.",
"type": "string"
}
},
"type": "object"
}
]
},
"type": "array"
},
"diagnosticLevel": {
"default": "log",
"description": "Specify what diagnostic levels are printed to the console. This has no effect on what diagnostics are reported in the LanguageServer. Defaults to 'warn'",
"enum": [
"hint",
"info",
"warn",
"error"
],
"type": "string"
},
"diagnosticSeverityOverrides": {
"description": "A map of error codes with their severity level override (error|warn|info)",
"patternProperties": {
".{1,}": {
"enum": [
"error",
"warn",
"info",
"hint"
],
"type": [
"number",
"string"
]
}
},
"type": "object"
},
"emitDefinitions": {
"default": false,
"description": "Emit type definition files (`d.bs`) during transpile",
"type": "boolean"
},
"emitFullPaths": {
"default": false,
"description": "Emit full paths to files when printing diagnostics to the console.",
"type": "boolean"
},
"extends": {
"description": "Relative or absolute path to another bsconfig.json file that this file should use as a base and then override. Prefix with a question mark (?) to prevent throwing an exception if the file does not exist.",
"type": "string"
},
"files": {
"default": [
"manifest",
"source/**/*.*",
"components/**/*.*",
"images/**/*.*"
],
"description": "The list of files that should be used in this project. Supports globs. Optionally, you can specify an object with `src` and `dest` properties to move files from one location into a different destination location",
"items": {
"anyOf": [
{
"description": "A file path or file glob",
"type": "string"
},
{
"properties": {
"dest": {
"description": "The destination for the file(s) found in 'src'",
"type": "string"
},
"src": {
"anyOf": [
{
"description": "A file path or glob pattern of source file(s)",
"type": "string"
},
{
"description": "An array of file path or globs",
"items": {
"description": "A file path or glob pattern of source file(s)",
"type": "string"
},
"type": "array"
}
]
}
},
"required": [
"src",
"dest"
],
"type": "object"
}
]
},
"type": "array"
},
"host": {
"description": "The host of the Roku that the package will be deploy to",
"type": "string"
},
"ignoreErrorCodes": {
"deprecated": true,
"deprecationMessage": "Deprecated. Use `diagnosticFilters` instead.",
"description": "A list of error codes the compiler should NOT emit, even if encountered.",
"items": {
"anyOf": [
{
"description": "A BrighterScript error code that will be ignored during compile",
"type": "number"
}
]
},
"type": "array"
},
"logLevel": {
"default": "log",
"description": "The amount of detail that should be printed to the console",
"enum": [
"error",
"warn",
"log",
"info",
"debug",
"trace"
],
"type": "string"
},
"manifest": {
"description": "A entry to modify manifest values",
"properties": {
"bs_const": {
"description": "A dictionary of bs_consts to change or add to the manifest. Each entry ",
"patternProperties": {
"^.+$": {
"type": [
"boolean",
"null"
]
}
},
"type": "object"
}
},
"type": "object"
},
"outFile": {
"description": "The path where the output zip or package should be placed. This includes the filename. Defaults to \"./out/package\"",
"type": "string"
},
"password": {
"description": " The password to use when deploying to a Roku device",
"type": "string"
},
"plugins": {
"description": "A list of node scripts or npm modules to add extra diagnostics or transform the AST.",
"items": {
"anyOf": [
{
"description": "a path to a node script or an npm module to load dynamically",
"type": "string"
}
]
},
"type": "array"
},
"removeParameterTypes": {
"default": false,
"description": "Removes the explicit type to function's parameters and return (i.e. the `as type` syntax) ",
"type": "boolean"
},
"require": {
"description": "A list of scripts or modules to pass to node's `require()` on startup. This is useful for doing things like ts-node registration",
"items": {
"anyOf": [
{
"description": "a path to a node script or an npm module to load dynamically at startup.",
"type": "string"
}
]
},
"type": "array"
},
"resolveSourceRoot": {
"default": false,
"description": "Should the `sourceRoot` property be resolve to an absolute path (relative to the bsconfig it's defined in)",
"type": "boolean"
},
"retainStagingDir": {
"default": false,
"description": "Prevent the staging folder from being deleted after the deployment package is created. This is helpful for troubleshooting why your package isn't being created the way you expected.",
"type": "boolean"
},
"retainStagingFolder": {
"default": false,
"deprecated": true,
"description": "Prevent the staging folder from being deleted after the deployment package is created. This is helpful for troubleshooting why your package isn't being created the way you expected.",
"type": "boolean"
},
"rootDir": {
"description": "The root directory of your Roku project. Defaults to the current working directory (or cwd property)",
"type": "string"
},
"sourceMap": {
"default": false,
"description": "Enables generating sourcemap files, which allow debugging tools to show the original source code while running the emitted files.",
"type": "boolean"
},
"sourceRoot": {
"description": "Override the root directory path where debugger should locate the source files. The location will be embedded in the source map to help debuggers locate the original source files. This only applies to files found within rootDir. This is useful when you want to preprocess files before passing them to BrighterScript, and want a debugger to open the original files. This option also affects the `SOURCE_FILE_PATH` and `SOURCE_LOCATION` source literals.",
"type": "string"
},
"stagingDir": {
"description": "The path to the staging folder (where all files are copied before creating the zip package)",
"type": "string"
},
"stagingFolderPath": {
"deprecated": true,
"deprecationMessage": "Deprecated. Use `stagingDir` instead.",
"description": "The path to the staging folder (where all files are copied before creating the zip package)",
"type": "string"
},
"username": {
"description": "The username to use when deploying to a Roku device",
"type": "string"
},
"watch": {
"default": false,
"description": "If true, the server will keep running and will watch and recompile on every file change",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/clangd.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of clangd",
"properties": {
"clangd.arguments": {
"default": [],
"description": "Arguments for clangd server.",
"items": {
"type": "string"
},
"type": "array"
},
"clangd.checkUpdates": {
"default": false,
"description": "Check for language server updates on startup.",
"type": "boolean"
},
"clangd.detectExtensionConflicts": {
"default": true,
"description": "Warn about conflicting extensions and suggest disabling them.",
"type": "boolean"
},
"clangd.enable": {
"default": true,
"description": "Enable clangd language server features",
"type": "boolean"
},
"clangd.enableCodeCompletion": {
"default": true,
"description": "Enable code completion provided by the language server",
"type": "boolean"
},
"clangd.enableHover": {
"default": true,
"description": "Enable hovers provided by the language server",
"type": "boolean"
},
"clangd.fallbackFlags": {
"default": [],
"description": "Extra clang flags used to parse files when no compilation database is found.",
"items": {
"type": "string"
},
"type": "array"
},
"clangd.inactiveRegions.opacity": {
"default": 0.55,
"description": "Opacity of inactive regions (used only if clangd.inactiveRegions.useBackgroundHighlight=false)",
"type": "number"
},
"clangd.inactiveRegions.useBackgroundHighlight": {
"default": false,
"description": "Use a background highlight rather than opacity to identify inactive preprocessor regions.",
"type": "boolean"
},
"clangd.onConfigChanged": {
"default": "prompt",
"description": "What to do when clangd configuration files are changed. Ignored for clangd 12+, which can reload such files itself; however, this can be overridden with clangd.onConfigChangedForceEnable.",
"enum": [
"prompt",
"restart",
"ignore"
],
"enumDescriptions": [
"Prompt the user for restarting the server",
"Automatically restart the server",
"Do nothing"
],
"type": "string"
},
"clangd.onConfigChangedForceEnable": {
"default": false,
"description": "Force enable of \"On Config Changed\" option regardless of clangd version.",
"type": "boolean"
},
"clangd.path": {
"default": "clangd",
"description": "The path to clangd executable, e.g.: /usr/bin/clangd.",
"scope": "machine-overridable",
"type": "string"
},
"clangd.restartAfterCrash": {
"default": true,
"description": "Auto restart clangd (up to 4 times) if it crashes.",
"type": "boolean"
},
"clangd.semanticHighlighting": {
"default": true,
"deprecationMessage": "Legacy semanticHighlights is no longer supported. Please use `editor.semanticHighlighting.enabled` instead.",
"description": "Enable semantic highlighting in clangd.",
"type": "boolean"
},
"clangd.serverCompletionRanking": {
"default": true,
"description": "Always rank completion items on the server as you type. This produces more accurate results at the cost of higher latency than client-side filtering.",
"type": "boolean"
},
"clangd.trace": {
"description": "Names a file that clangd should log a performance trace to, in chrome trace-viewer JSON format.",
"type": "string"
},
"clangd.useScriptAsExecutable": {
"default": false,
"description": "Allows the path to be a script e.g.: clangd.sh.",
"scope": "machine-overridable",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/codeqlls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of codeqlls",
"properties": {
"codeQL.cli.executablePath": {
"default": "",
"description": "Path to the CodeQL executable that should be used by the CodeQL extension. The executable is named `codeql` on Linux/Mac and `codeql.exe` on Windows. If empty, the extension will look for a CodeQL executable on your shell PATH, or if CodeQL is not on your PATH, download and manage its own CodeQL executable.",
"scope": "machine",
"type": "string"
},
"codeQL.queryHistory.format": {
"default": "%q on %d - %s, %r result count [%t]",
"description": "Default string for how to label query history items. %t is the time of the query, %q is the query name, %d is the database name, %r is the number of results, and %s is a status string.",
"type": "string"
},
"codeQL.resultsDisplay.pageSize": {
"default": 200,
"description": "Max number of query results to display per page in the results view.",
"type": "integer"
},
"codeQL.runningQueries.autoSave": {
"default": false,
"description": "Enable automatically saving a modified query file when running a query.",
"type": "boolean"
},
"codeQL.runningQueries.cacheSize": {
"default": null,
"description": "Maximum size of the disk cache (in MB). Leave blank to allow the evaluator to automatically adjust the size of the disk cache based on the size of the codebase and the complexity of the queries being executed.",
"minimum": 1024,
"type": [
"integer",
"null"
]
},
"codeQL.runningQueries.customLogDirectory": {
"default": null,
"description": "Path to a directory where the CodeQL extension should store query server logs. If empty, the extension stores logs in a temporary workspace folder and deletes the contents after each run.",
"type": [
"string",
"null"
]
},
"codeQL.runningQueries.debug": {
"default": false,
"description": "Enable debug logging and tuple counting when running CodeQL queries. This information is useful for debugging query performance.",
"type": "boolean"
},
"codeQL.runningQueries.maxQueries": {
"default": 20,
"description": "Max number of simultaneous queries to run using the 'CodeQL: Run Queries' command.",
"type": "integer"
},
"codeQL.runningQueries.memory": {
"default": null,
"description": "Memory (in MB) to use for running queries. Leave blank for CodeQL to choose a suitable value based on your system's available memory.",
"minimum": 1024,
"type": [
"integer",
"null"
]
},
"codeQL.runningQueries.numberOfThreads": {
"default": 1,
"description": "Number of threads for running queries.",
"maximum": 1024,
"minimum": 0,
"type": "integer"
},
"codeQL.runningQueries.saveCache": {
"default": false,
"description": "Aggressively save intermediate results to the disk cache. This may speed up subsequent queries if they are similar. Be aware that using this option will greatly increase disk usage and initial evaluation time.",
"scope": "window",
"type": "boolean"
},
"codeQL.runningQueries.timeout": {
"default": null,
"description": "Timeout (in seconds) for running queries. Leave blank or set to zero for no timeout.",
"maximum": 2147483647,
"minimum": 0,
"type": [
"integer",
"null"
]
},
"codeQL.runningTests.additionalTestArguments": {
"default": [],
"markdownDescription": "Additional command line arguments to pass to the CLI when [running tests](https://codeql.github.com/docs/codeql-cli/manual/test-run/). This setting should be an array of strings, each containing an argument to be passed.",
"scope": "machine",
"type": "array"
},
"codeQL.runningTests.numberOfThreads": {
"default": 1,
"description": "Number of threads for running CodeQL tests.",
"maximum": 1024,
"minimum": 0,
"scope": "window",
"type": "integer"
},
"codeQL.telemetry.enableTelemetry": {
"default": false,
"markdownDescription": "Specifies whether to send CodeQL usage telemetry. This setting AND the global `#telemetry.enableTelemetry#` setting must be checked for telemetry to be sent to GitHub. For more information, see the [telemetry documentation](https://codeql.github.com/docs/codeql-for-visual-studio-code/about-telemetry-in-codeql-for-visual-studio-code)",
"scope": "application",
"type": "boolean"
},
"codeQL.telemetry.logTelemetry": {
"default": false,
"description": "Specifies whether or not to write telemetry events to the extension log.",
"scope": "application",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/cssls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of cssls",
"properties": {
"css.completion.completePropertyWithSemicolon": {
"default": true,
"description": "%css.completion.completePropertyWithSemicolon.desc%",
"scope": "resource",
"type": "boolean"
},
"css.completion.triggerPropertyValueCompletion": {
"default": true,
"description": "%css.completion.triggerPropertyValueCompletion.desc%",
"scope": "resource",
"type": "boolean"
},
"css.customData": {
"default": [],
"items": {
"type": "string"
},
"markdownDescription": "%css.customData.desc%",
"scope": "resource",
"type": "array"
},
"css.format.braceStyle": {
"default": "collapse",
"enum": [
"collapse",
"expand"
],
"markdownDescription": "%css.format.braceStyle.desc%",
"scope": "resource",
"type": "string"
},
"css.format.enable": {
"default": true,
"description": "%css.format.enable.desc%",
"scope": "window",
"type": "boolean"
},
"css.format.maxPreserveNewLines": {
"default": null,
"markdownDescription": "%css.format.maxPreserveNewLines.desc%",
"scope": "resource",
"type": [
"number",
"null"
]
},
"css.format.newlineBetweenRules": {
"default": true,
"markdownDescription": "%css.format.newlineBetweenRules.desc%",
"scope": "resource",
"type": "boolean"
},
"css.format.newlineBetweenSelectors": {
"default": true,
"markdownDescription": "%css.format.newlineBetweenSelectors.desc%",
"scope": "resource",
"type": "boolean"
},
"css.format.preserveNewLines": {
"default": true,
"markdownDescription": "%css.format.preserveNewLines.desc%",
"scope": "resource",
"type": "boolean"
},
"css.format.spaceAroundSelectorSeparator": {
"default": false,
"markdownDescription": "%css.format.spaceAroundSelectorSeparator.desc%",
"scope": "resource",
"type": "boolean"
},
"css.hover.documentation": {
"default": true,
"description": "%css.hover.documentation%",
"scope": "resource",
"type": "boolean"
},
"css.hover.references": {
"default": true,
"description": "%css.hover.references%",
"scope": "resource",
"type": "boolean"
},
"css.lint.argumentsInColorFunction": {
"default": "error",
"description": "%css.lint.argumentsInColorFunction.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.boxModel": {
"default": "ignore",
"enum": [
"ignore",
"warning",
"error"
],
"markdownDescription": "%css.lint.boxModel.desc%",
"scope": "resource",
"type": "string"
},
"css.lint.compatibleVendorPrefixes": {
"default": "ignore",
"description": "%css.lint.compatibleVendorPrefixes.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.duplicateProperties": {
"default": "ignore",
"description": "%css.lint.duplicateProperties.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.emptyRules": {
"default": "warning",
"description": "%css.lint.emptyRules.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.float": {
"default": "ignore",
"enum": [
"ignore",
"warning",
"error"
],
"markdownDescription": "%css.lint.float.desc%",
"scope": "resource",
"type": "string"
},
"css.lint.fontFaceProperties": {
"default": "warning",
"enum": [
"ignore",
"warning",
"error"
],
"markdownDescription": "%css.lint.fontFaceProperties.desc%",
"scope": "resource",
"type": "string"
},
"css.lint.hexColorLength": {
"default": "error",
"description": "%css.lint.hexColorLength.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.idSelector": {
"default": "ignore",
"description": "%css.lint.idSelector.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.ieHack": {
"default": "ignore",
"description": "%css.lint.ieHack.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.importStatement": {
"default": "ignore",
"description": "%css.lint.importStatement.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.important": {
"default": "ignore",
"enum": [
"ignore",
"warning",
"error"
],
"markdownDescription": "%css.lint.important.desc%",
"scope": "resource",
"type": "string"
},
"css.lint.propertyIgnoredDueToDisplay": {
"default": "warning",
"enum": [
"ignore",
"warning",
"error"
],
"markdownDescription": "%css.lint.propertyIgnoredDueToDisplay.desc%",
"scope": "resource",
"type": "string"
},
"css.lint.universalSelector": {
"default": "ignore",
"enum": [
"ignore",
"warning",
"error"
],
"markdownDescription": "%css.lint.universalSelector.desc%",
"scope": "resource",
"type": "string"
},
"css.lint.unknownAtRules": {
"default": "warning",
"description": "%css.lint.unknownAtRules.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.unknownProperties": {
"default": "warning",
"description": "%css.lint.unknownProperties.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.unknownVendorSpecificProperties": {
"default": "ignore",
"description": "%css.lint.unknownVendorSpecificProperties.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.validProperties": {
"default": [],
"items": {
"type": "string"
},
"markdownDescription": "%css.lint.validProperties.desc%",
"scope": "resource",
"type": "array",
"uniqueItems": true
},
"css.lint.vendorPrefix": {
"default": "warning",
"description": "%css.lint.vendorPrefix.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.lint.zeroUnits": {
"default": "ignore",
"description": "%css.lint.zeroUnits.desc%",
"enum": [
"ignore",
"warning",
"error"
],
"scope": "resource",
"type": "string"
},
"css.trace.server": {
"default": "off",
"description": "%css.trace.server.desc%",
"enum": [
"off",
"messages",
"verbose"
],
"scope": "window",
"type": "string"
},
"css.validate": {
"default": true,
"description": "%css.validate.desc%",
"scope": "resource",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/dartls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of dartls",
"properties": {
"dart.analysisExcludedFolders": {
"default": [],
"description": "An array of paths to be excluded from Dart analysis. This option should usually be set at the Workspace level. Excluded folders will also be ignored when detecting project types.",
"items": {
"type": "string"
},
"scope": "resource",
"type": "array"
},
"dart.analyzerAdditionalArgs": {
"default": [],
"description": "Additional arguments to pass to the Dart Analysis Server. This setting is can be useful for troubleshooting issues with the Dart Analysis Server.",
"items": {
"type": "string"
},
"scope": "window",
"type": "array"
},
"dart.analyzerDiagnosticsPort": {
"default": null,
"description": "The port number to be used for the Dart analyzer diagnostic server. This setting is can be useful for troubleshooting issues with the Dart Analysis Server.",
"scope": "window",
"type": [
"null",
"number"
]
},
"dart.analyzerPath": {
"default": null,
"description": "The path to a custom Dart Analysis Server. This setting is intended for use by Dart Analysis Server developers. Use `~` to insert the user's home directory (the path should then use `/` separators even on Windows).",
"scope": "machine-overridable",
"type": [
"null",
"string"
]
},
"dart.analyzerSshHost": {
"default": null,
"description": "An SSH host to run the Analysis Server.\nThis can be useful when modifying code on a remote machine using SSHFS.",
"scope": "window",
"type": [
"null",
"string"
]
},
"dart.analyzerVmAdditionalArgs": {
"default": [],
"description": "Additional arguments to pass to the VM running the Dart Analysis Server. This setting is can be useful for troubleshooting issues with the Dart Analysis Server.",
"items": {
"type": "string"
},
"scope": "window",
"type": "array"
},
"dart.analyzerVmServicePort": {
"default": null,
"description": "The port number to be used for the Dart Analysis Server VM service. This setting is intended for use by Dart Analysis Server developers.",
"scope": "window",
"type": [
"null",
"number"
]
},
"dart.includeDependenciesInWorkspaceSymbols": {
"default": true,
"markdownDescription": "Whether to include symbols from the SDK and package dependencies in the \"Go to Symbol in Workspace\" (`cmd/ctrl`+`T`) list. This can only be disabled when using Dart 3.0 / Flutter 3.10 or later.",
"scope": "window",
"type": "boolean"
},
"dart.notifyAnalyzerErrors": {
"default": true,
"description": "Whether to show a notification the first few times an Analysis Server exception occurs.",
"scope": "window",
"type": "boolean"
},
"dart.showExtensionRecommendations": {
"default": true,
"description": "Whether to show recommendations for other VS Code extensions based on the packages you're using.",
"scope": "window",
"type": "boolean"
},
"dart.showTodos": {
"default": true,
"description": "Whether to show TODOs in the Problems list. Can be a boolean to enable all TODO comments (TODO, FIXME, HACK, UNDONE) or an array of which types to enable. Older Dart SDKs may not support some TODO kinds.",
"items": {
"type": "string"
},
"scope": "window",
"type": [
"boolean",
"array"
]
}
}
}
================================================
FILE: schemas/_generated/denols.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of denols",
"properties": {
"deno.cache": {
"default": null,
"markdownDescription": "A path to the cache directory for Deno. By default, the operating system's cache path plus `deno` is used, or the `DENO_DIR` environment variable, but if set, this path will be used instead.",
"scope": "window",
"type": "string"
},
"deno.cacheOnSave": {
"default": true,
"examples": [
true,
false
],
"markdownDescription": "Controls if the extension should cache the active document's dependencies on save.",
"scope": "resource",
"type": "boolean"
},
"deno.certificateStores": {
"default": null,
"items": {
"type": "string"
},
"markdownDescription": "A list of root certificate stores used to validate TLS certificates when fetching and caching remote resources. This overrides the `DENO_TLS_CA_STORE` environment variable if set.",
"scope": "window",
"type": "array"
},
"deno.codeLens.implementations": {
"default": false,
"examples": [
true,
false
],
"markdownDescription": "Enables or disables the display of code lens information for implementations of items in the code.",
"scope": "resource",
"type": "boolean"
},
"deno.codeLens.references": {
"default": false,
"examples": [
true,
false
],
"markdownDescription": "Enables or disables the display of code lens information for references of items in the code.",
"scope": "resource",
"type": "boolean"
},
"deno.codeLens.referencesAllFunctions": {
"default": false,
"examples": [
true,
false
],
"markdownDescription": "Enables or disables the display of code lens information for all functions in the code.",
"scope": "resource",
"type": "boolean"
},
"deno.codeLens.test": {
"default": false,
"markdownDescription": "Enables or disables the display of code lenses that allow running of individual tests in the code.",
"scope": "resource",
"type": "boolean"
},
"deno.codeLens.testArgs": {
"default": [
"--allow-all",
"--no-check"
],
"items": {
"type": "string"
},
"markdownDescription": "Additional arguments to use with the run test code lens. Defaults to `[ \"--allow-all\", \"--no-check\" ]`.",
"scope": "resource",
"type": "array"
},
"deno.config": {
"default": null,
"examples": [
"./deno.jsonc",
"/path/to/deno.jsonc",
"C:\\path\\to\\deno.jsonc"
],
"markdownDescription": "The file path to a configuration file. This is the equivalent to using `--config` on the command line. The path can be either be relative to the workspace, or an absolute path.\n\nIt is recommend you name it `deno.json` or `deno.jsonc`.\n\n**Not recommended to be set globally.**",
"scope": "resource",
"type": "string"
},
"deno.defaultTaskCommand": {
"default": "open",
"enum": [
"open",
"run"
],
"markdownDescription": "Controls the default action when clicking on a task in the _Deno Tasks sidebar_.",
"scope": "resource",
"type": "string"
},
"deno.disablePaths": {
"default": [],
"examples": [
[
"./worker"
]
],
"items": {
"type": "string"
},
"markdownDescription": "Disables the Deno Language Server for specific paths. This will leave the built in TypeScript/JavaScript language server enabled for those paths. Takes priority over `deno.enablePaths`.\n\n**Not recommended to be enabled in user settings.**",
"scope": "resource",
"type": "array"
},
"deno.documentPreloadLimit": {
"default": 1000,
"examples": [
0,
100,
1000
],
"markdownDescription": "Maximum number of file system entries to traverse when finding scripts to preload into TypeScript on startup. Set this to 0 to disable document preloading.",
"scope": "resource",
"type": "number"
},
"deno.enable": {
"default": null,
"examples": [
true,
false
],
"markdownDescription": "Controls if the Deno Language Server is enabled. When enabled, the extension will disable the built-in VSCode JavaScript and TypeScript language services, and will use the Deno Language Server instead.\n\nIf omitted, your preference will be inferred as true if there is a `deno.json[c]` at your workspace root and false if not.\n\nIf you want to enable only part of your workspace folder, consider using `deno.enablePaths` setting instead.\n\n**Not recommended to be enabled globally.**",
"scope": "resource",
"type": [
"boolean",
"null"
]
},
"deno.enablePaths": {
"default": null,
"examples": [
[
"./worker"
]
],
"items": {
"type": "string"
},
"markdownDescription": "Enables the Deno Language Server for specific paths, instead of for the whole workspace folder. This will disable the built in TypeScript/JavaScript language server for those paths.\n\nWhen a value is set, the value of `\"deno.enable\"` is ignored.\n\nThe workspace folder is used as the base for the supplied paths. If for example you have all your Deno code in `worker` path in your workspace, you can add an item with the value of `./worker`, and the Deno will only provide diagnostics for the files within `worker` or any of its sub paths.\n\n**Not recommended to be enabled in user settings.**",
"scope": "resource",
"type": "array"
},
"deno.env": {
"default": {},
"examples": [
{
"HTTP_PROXY": "http://localhost:8080"
}
],
"markdownDescription": "Additional environment variables to pass to Deno processes. Overrides the user's env and `deno.envFile`. These will be overridden by more specific settings such as `deno.future` for `DENO_FUTURE`, and invariables like `NO_COLOR=1`.",
"patternProperties": {
".+": {
"type": "string"
}
},
"scope": "window",
"type": "object"
},
"deno.envFile": {
"default": null,
"examples": [
".env"
],
"markdownDescription": "Env file containing additional environment variables to pass to Deno processes. Overrides the user's env. These will be overridden by `deno.env`, more specific settings such as `deno.future` for `DENO_FUTURE`, and invariables like `NO_COLOR=1`.",
"scope": "window",
"type": "string"
},
"deno.forcePushBasedDiagnostics": {
"default": false,
"examples": [
true,
false
],
"markdownDescription": "Disables the server-capability for pull diagnostics to force push-based diagnostics.",
"scope": "window",
"type": "boolean"
},
"deno.future": {
"default": false,
"deprecationMessage": "Deno 2.0 has been released. This setting still affects 1.x.x installations, however.",
"examples": [
true,
false
],
"markdownDescription": "Enable breaking features likely to be shipped in Deno 2.0.",
"scope": "window",
"type": "boolean"
},
"deno.importMap": {
"default": null,
"examples": [
"./import_map.json",
"/path/to/import_map.json",
"C:\\path\\to\\import_map.json"
],
"markdownDescription": "The file path to an import map. This is the equivalent to using `--import-map` on the command line.\n\n[Import maps](https://deno.land/manual@v1.6.0/linking_to_external_code/import_maps) provide a way to \"relocate\" modules based on their specifiers. The path can either be relative to the workspace, or an absolute path.\n\n**Not recommended to be set globally.**",
"scope": "resource",
"type": "string"
},
"deno.internalDebug": {
"default": false,
"examples": [
true,
false
],
"markdownDescription": "Determines if the internal debugging information for the Deno language server will be logged to the _Deno Language Server_ console.",
"scope": "window",
"type": "boolean"
},
"deno.internalInspect": {
"default": false,
"examples": [
true,
false,
"127.0.0.1:9222"
],
"markdownDescription": "Enables the inspector server for the JS runtime used by the Deno Language Server to host its TS server. Optionally provide an address for the inspector listener e.g. \"127.0.0.1:9222\" (default).",
"scope": "window",
"type": [
"boolean",
"string"
]
},
"deno.lint": {
"default": true,
"examples": [
true,
false
],
"markdownDescription": "Controls if linting information will be provided by the Deno Language Server.\n\n**Not recommended to be enabled globally.**",
"scope": "resource",
"type": "boolean"
},
"deno.logFile": {
"default": false,
"examples": [
true,
false
],
"markdownDescription": "Write logs to a file in a project-local directory.",
"scope": "window",
"type": "boolean"
},
"deno.maxTsServerMemory": {
"default": 3072,
"markdownDescription": "Maximum amount of memory the TypeScript isolate can use. Defaults to 3072 (3GB).",
"scope": "resource",
"type": "number"
},
"deno.organizeImports.enabled": {
"default": true,
"examples": [
true,
false
],
"markdownDescription": "Controls if the Deno language server contributes organize imports code actions. Disable to rely on VS Code's built-in TypeScript/JavaScript organize imports instead.",
"scope": "resource",
"type": "boolean"
},
"deno.path": {
"default": null,
"examples": [
"/usr/bin/deno",
"C:\\Program Files\\deno\\deno.exe"
],
"markdownDescription": "A path to the `deno` CLI executable. By default, the extension looks for `deno` in the `PATH`, but if set, will use the path specified instead.",
"scope": "window",
"type": "string"
},
"deno.suggest.imports.autoDiscover": {
"default": true,
"markdownDescription": "If enabled, when new hosts/origins are encountered that support import suggestions, you will be prompted to enable or disable it. Defaults to `true`.",
"scope": "resource",
"type": "boolean"
},
"deno.suggest.imports.hosts": {
"default": {
"https://deno.land": true
},
"examples": {
"https://deno.land": true
},
"markdownDescription": "Controls which hosts are enabled for import suggestions.",
"scope": "resource",
"type": "object"
},
"deno.symbols.document.enabled": {
"default": true,
"examples": [
true,
false
],
"markdownDescription": "Controls if the Deno language server provides document symbols. Disable to rely on VS Code's built-in providers instead.",
"scope": "resource",
"type": "boolean"
},
"deno.symbols.workspace.enabled": {
"default": true,
"examples": [
true,
false
],
"markdownDescription": "Controls if the Deno language server provides workspace symbols. Disable to rely on VS Code's built-in providers instead.",
"scope": "resource",
"type": "boolean"
},
"deno.testing.args": {
"default": [
"--allow-all",
"--no-check"
],
"items": {
"type": "string"
},
"markdownDescription": "Arguments to use when running tests via the Test Explorer. Defaults to `[ \"--allow-all\" ]`.",
"scope": "resource",
"type": "array"
},
"deno.tlsCertificate": {
"default": null,
"markdownDescription": "A path to a PEM certificate to use as the certificate authority when validating TLS certificates when fetching and caching remote resources. This is like using `--cert` on the Deno CLI and overrides the `DENO_CERT` environment variable if set.",
"scope": "window",
"type": "string"
},
"deno.trace.server": {
"default": "off",
"enum": [
"messages",
"off",
"verbose"
],
"markdownDescription": "Traces the communication between VS Code and the Deno Language Server.",
"scope": "window"
},
"deno.unsafelyIgnoreCertificateErrors": {
"default": null,
"items": {
"type": "string"
},
"markdownDescription": "**DANGER** disables verification of TLS certificates for the hosts provided. There is likely a better way to deal with any errors than use this option. This is like using `--unsafely-ignore-certificate-errors` in the Deno CLI.",
"scope": "window",
"type": "array"
},
"deno.unstable": {
"default": [],
"items": {
"type": "string"
},
"markdownDescription": "Controls which `--unstable-*` features tests will be run with when running them via the explorer.\n\n**Not recommended to be enabled globally.**",
"scope": "resource",
"type": "array"
}
}
}
================================================
FILE: schemas/_generated/elixirls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of elixirls",
"properties": {
"elixirLS.additionalWatchedExtensions": {
"default": [],
"description": "Additional file types capable of triggering a build on change",
"items": {
"type": "string"
},
"scope": "resource",
"type": "array",
"uniqueItems": true
},
"elixirLS.autoBuild": {
"default": true,
"description": "Trigger ElixirLS build when code is saved",
"scope": "resource",
"type": "boolean"
},
"elixirLS.autoInsertRequiredAlias": {
"default": true,
"description": "Enable auto-insert required alias. This is true (enabled) by default.",
"scope": "window",
"type": "boolean"
},
"elixirLS.dialyzerEnabled": {
"default": true,
"description": "Run ElixirLS's rapid Dialyzer when code is saved",
"scope": "resource",
"type": "boolean"
},
"elixirLS.dialyzerFormat": {
"default": "dialyxir_long",
"description": "Formatter to use for Dialyzer warnings",
"enum": [
"dialyzer",
"dialyxir_short",
"dialyxir_long"
],
"markdownEnumDescriptions": [
"Original Dialyzer format",
"Same as `mix dialyzer --format short`",
"Same as `mix dialyzer --format long`"
],
"scope": "resource",
"type": "string"
},
"elixirLS.dialyzerWarnOpts": {
"default": [],
"description": "Dialyzer options to enable or disable warnings - See Dialyzer's documentation for options. Note that the \"race_conditions\" option is unsupported",
"items": {
"enum": [
"no_return",
"no_unused",
"no_unknown",
"no_improper_lists",
"no_fun_app",
"no_match",
"no_opaque",
"no_fail_call",
"no_contracts",
"no_behaviours",
"no_undefined_callbacks",
"unmatched_returns",
"error_handling",
"no_missing_calls",
"specdiffs",
"overspecs",
"underspecs",
"no_underspecs",
"extra_return",
"no_extra_return",
"missing_return",
"no_missing_return",
"unknown",
"overlapping_contract",
"opaque_union",
"no_opaque_union"
],
"type": "string"
},
"scope": "resource",
"type": "array",
"uniqueItems": true
},
"elixirLS.dotFormatter": {
"description": "Path to a custom .formatter.exs file used when formatting documents",
"minLength": 0,
"scope": "resource",
"type": "string"
},
"elixirLS.enableTestLenses": {
"default": false,
"description": "Show code lenses to run tests in terminal.",
"scope": "resource",
"type": "boolean"
},
"elixirLS.envVariables": {
"description": "Environment variables to use for compilation",
"minLength": 0,
"scope": "resource",
"type": "object"
},
"elixirLS.fetchDeps": {
"default": false,
"description": "Automatically fetch project dependencies when compiling.",
"scope": "resource",
"type": "boolean"
},
"elixirLS.incrementalDialyzer": {
"default": true,
"description": "Use OTP incremental dialyzer (available on OTP 26+)",
"scope": "resource",
"type": "boolean"
},
"elixirLS.languageServerOverridePath": {
"description": "Absolute path to alternative ElixirLS release that will override the packaged release",
"minLength": 0,
"scope": "resource",
"type": "string"
},
"elixirLS.mcpEnabled": {
"default": false,
"description": "Enable or disable the MCP server",
"scope": "resource",
"type": "boolean"
},
"elixirLS.mcpPort": {
"default": 0,
"description": "Set a specific port for the MCP server. If not set, uses `3789 + hash(workspace_path)` for predictable port assignment per workspace",
"scope": "resource",
"type": "integer"
},
"elixirLS.mixEnv": {
"default": "test",
"description": "Mix environment to use for compilation",
"minLength": 1,
"scope": "resource",
"type": "string"
},
"elixirLS.mixTarget": {
"description": "Mix target to use for compilation",
"minLength": 0,
"scope": "resource",
"type": "string"
},
"elixirLS.projectDir": {
"default": "",
"description": "Subdirectory containing Mix project if not in the project root",
"minLength": 0,
"scope": "resource",
"type": "string"
},
"elixirLS.signatureAfterComplete": {
"default": true,
"description": "Show signature help after confirming autocomplete.",
"scope": "resource",
"type": "boolean"
},
"elixirLS.stdlibSrcDir": {
"default": "",
"description": "Subdirectory where the Elixir stdlib resides to allow for source code lookup. E.g. /home/youruser/.asdf/installs/elixir/1.18.2",
"minLength": 0,
"scope": "resource",
"type": "string"
},
"elixirLS.suggestSpecs": {
"default": true,
"description": "Suggest @spec annotations inline using Dialyzer's inferred success typings (Requires Dialyzer).",
"scope": "resource",
"type": "boolean"
},
"elixirLS.trace.server": {
"default": "off",
"description": "Traces the communication between VS Code and the Elixir language server.",
"enum": [
"off",
"messages",
"verbose"
],
"scope": "window",
"type": "string"
},
"elixirLS.useCurrentRootFolderAsProjectDir": {
"default": false,
"description": "Don't try to look for mix.exs in parent directories",
"scope": "resource",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/elmls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of elmls",
"properties": {
"elmLS.disableElmLSDiagnostics": {
"default": false,
"description": "Disable linting diagnostics from the language server.",
"scope": "window",
"type": "boolean"
},
"elmLS.elmFormatPath": {
"default": "",
"description": "The path to your elm-format executable. Should be empty by default, in that case it will assume the name and try to first get it from a local npm installation or a global one. If you set it manually it will not try to load from the npm folder.",
"scope": "window",
"type": "string"
},
"elmLS.elmPath": {
"default": "",
"description": "The path to your elm executable. Should be empty by default, in that case it will assume the name and try to first get it from a local npm installation or a global one. If you set it manually it will not try to load from the npm folder.",
"scope": "window",
"type": "string"
},
"elmLS.elmReviewDiagnostics": {
"default": "off",
"description": "Set severity or disable linting diagnostics for elm-review.",
"enum": [
"off",
"warning",
"error"
],
"scope": "window",
"type": "string"
},
"elmLS.elmReviewPath": {
"default": "",
"description": "The path to your elm-review executable. Should be empty by default, in that case it will assume the name and try to first get it from a local npm installation or a global one. If you set it manually it will not try to load from the npm folder.",
"scope": "window",
"type": "string"
},
"elmLS.elmTestPath": {
"default": "",
"description": "The path to your elm-test executable. Should be empty by default, in that case it will assume the name and try to first get it from a local npm installation or a global one. If you set it manually it will not try to load from the npm folder.",
"scope": "window",
"type": "string"
},
"elmLS.elmTestRunner.showElmTestOutput": {
"description": "Show output of elm-test as terminal task",
"scope": "resource",
"type": "boolean"
},
"elmLS.onlyUpdateDiagnosticsOnSave": {
"default": false,
"description": "Only update compiler diagnostics on save, not on document change.",
"scope": "window",
"type": "boolean"
},
"elmLS.skipInstallPackageConfirmation": {
"default": false,
"description": "Skips confirmation for the Install Package code action.",
"scope": "window",
"type": "boolean"
},
"elmLS.trace.server": {
"default": "off",
"description": "Traces the communication between VS Code and the language server.",
"enum": [
"off",
"messages",
"verbose"
],
"scope": "window",
"type": "string"
}
}
}
================================================
FILE: schemas/_generated/eslint.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of eslint",
"properties": {
"eslint.autoFixOnSave": {
"default": false,
"deprecationMessage": "The setting is deprecated. Use editor.codeActionsOnSave instead with a source.fixAll.eslint member.",
"description": "Turns auto fix on save on or off.",
"scope": "resource",
"type": "boolean"
},
"eslint.codeAction.disableRuleComment": {
"additionalProperties": false,
"default": {
"commentStyle": "line",
"enable": true,
"location": "separateLine"
},
"markdownDescription": "Show disable lint rule in the quick fix menu.",
"properties": {
"commentStyle": {
"default": "line",
"definition": "The comment style to use when disabling a rule on a specific line.",
"enum": [
"line",
"block"
],
"type": "string"
},
"enable": {
"default": true,
"description": "Show the disable code actions.",
"type": "boolean"
},
"location": {
"default": "separateLine",
"description": "Configure the disable rule code action to insert the comment on the same line or a new line.",
"enum": [
"separateLine",
"sameLine"
],
"type": "string"
}
},
"scope": "resource",
"type": "object"
},
"eslint.codeAction.showDocumentation": {
"additionalProperties": false,
"default": {
"enable": true
},
"markdownDescription": "Show open lint rule documentation web page in the quick fix menu.",
"properties": {
"enable": {
"default": true,
"description": "Show the documentation code actions.",
"type": "boolean"
}
},
"scope": "resource",
"type": "object"
},
"eslint.codeActionsOnSave.mode": {
"default": "all",
"enum": [
"all",
"problems"
],
"enumDescriptions": [
"Fixes all possible problems in the file. This option might take some time.",
"Fixes only reported problems that have non-overlapping textual edits. This option runs a lot faster."
],
"markdownDescription": "Specifies the code action mode. Possible values are 'all' and 'problems'.",
"scope": "resource",
"type": "string"
},
"eslint.codeActionsOnSave.options": {
"default": {},
"markdownDescription": "The ESLint options object to use on save (see https://eslint.org/docs/developer-guide/nodejs-api#eslint-class). `eslint.codeActionsOnSave.rules`, if specified, will take priority over any rule options here.",
"scope": "resource",
"type": "object"
},
"eslint.codeActionsOnSave.rules": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "null"
}
],
"default": null,
"markdownDescription": "The rules that should be executed when computing the code actions on save or formatting a file. Defaults to the rules configured via the ESLint configuration",
"scope": "resource"
},
"eslint.debug": {
"default": false,
"markdownDescription": "Enables ESLint debug mode (same as `--debug` on the command line)",
"scope": "window",
"type": "boolean"
},
"eslint.enable": {
"default": true,
"description": "Controls whether eslint is enabled or not.",
"scope": "resource",
"type": "boolean"
},
"eslint.execArgv": {
"anyOf": [
{
"items": {
"type": "string"
},
"type": "array"
},
{
"type": "null"
}
],
"default": null,
"markdownDescription": "Additional exec argv argument passed to the runtime. This can for example be used to control the maximum heap space using --max_old_space_size",
"scope": "machine-overridable"
},
"eslint.experimental.useFlatConfig": {
"default": false,
"deprecationMessage": "Use ESLint version 8.57 or later and `eslint.useFlatConfig` instead.",
"description": "Enables support of experimental Flat Config (aka eslint.config.js). Requires ESLint version >= 8.21 < 8.57.0).",
"scope": "resource",
"type": "boolean"
},
"eslint.format.enable": {
"default": false,
"description": "Enables ESLint as a formatter.",
"scope": "resource",
"type": "boolean"
},
"eslint.ignoreUntitled": {
"default": false,
"description": "If true, untitled files won't be validated by ESLint.",
"scope": "resource",
"type": "boolean"
},
"eslint.lintTask.command": {
"default": "eslint",
"markdownDescription": "The command to run the task for linting the whole workspace. Defaults to the found eslint binary for the workspace, or 'eslint' if no binary could be found.",
"scope": "resource",
"type": "string"
},
"eslint.lintTask.enable": {
"default": false,
"description": "Controls whether a task for linting the whole workspace will be available.",
"scope": "resource",
"type": "boolean"
},
"eslint.lintTask.options": {
"default": ".",
"markdownDescription": "Command line options applied when running the task for linting the whole workspace (see https://eslint.org/docs/user-guide/command-line-interface).",
"scope": "resource",
"type": "string"
},
"eslint.migration.2_x": {
"default": "on",
"description": "Whether ESlint should migrate auto fix on save settings.",
"enum": [
"off",
"on"
],
"scope": "application",
"type": "string"
},
"eslint.nodeEnv": {
"default": null,
"markdownDescription": "The value of `NODE_ENV` to use when running eslint tasks.",
"scope": "resource",
"type": [
"string",
"null"
]
},
"eslint.nodePath": {
"default": null,
"markdownDescription": "A path added to `NODE_PATH` when resolving the eslint module.",
"scope": "machine-overridable",
"type": [
"string",
"null"
]
},
"eslint.notebooks.rules.customizations": {
"description": "A special rules customization section for text cells in notebook documents.",
"items": {
"properties": {
"rule": {
"type": "string"
},
"severity": {
"enum": [
"downgrade",
"error",
"info",
"default",
"upgrade",
"warn",
"off"
],
"type": "string"
}
},
"type": "object"
},
"scope": "resource",
"type": "array"
},
"eslint.onIgnoredFiles": {
"default": "off",
"description": "Whether ESLint should issue a warning on ignored files.",
"enum": [
"warn",
"off"
],
"scope": "resource",
"type": "string"
},
"eslint.options": {
"default": {},
"markdownDescription": "The eslint options object to provide args normally passed to eslint when executed from a command line (see https://eslint.org/docs/developer-guide/nodejs-api#eslint-class).",
"scope": "resource",
"type": "object"
},
"eslint.packageManager": {
"default": "npm",
"deprecationMessage": "The setting is deprecated. The Package Manager is automatically detected now.",
"description": "The package manager you use to install node modules.",
"enum": [
"npm",
"yarn",
"pnpm"
],
"scope": "resource",
"type": "string"
},
"eslint.probe": {
"default": [
"astro",
"civet",
"javascript",
"javascriptreact",
"typescript",
"typescriptreact",
"html",
"mdx",
"vue",
"markdown",
"json",
"jsonc",
"css",
"glimmer-js",
"glimmer-ts",
"svelte"
],
"description": "An array of language ids for which the extension should probe if support is installed.",
"items": {
"type": "string"
},
"scope": "resource",
"type": "array"
},
"eslint.problems.shortenToSingleLine": {
"default": false,
"description": "Shortens the text spans of underlined problems to their first related line.",
"scope": "resource",
"type": "boolean"
},
"eslint.provideLintTask": {
"default": false,
"deprecationMessage": "This option is deprecated. Use eslint.lintTask.enable instead.",
"description": "Controls whether a task for linting the whole workspace will be available.",
"scope": "resource",
"type": "boolean"
},
"eslint.quiet": {
"default": false,
"description": "Turns on quiet mode, which ignores warnings and info diagnostics.",
"scope": "resource",
"type": "boolean"
},
"eslint.rules.customizations": {
"description": "Override the severity of one or more rules reported by this extension, regardless of the project's ESLint config. Use globs to apply default severities for multiple rules.",
"items": {
"properties": {
"rule": {
"type": "string"
},
"severity": {
"enum": [
"downgrade",
"error",
"info",
"default",
"upgrade",
"warn",
"off"
],
"type": "string"
}
},
"type": "object"
},
"scope": "resource",
"type": "array"
},
"eslint.run": {
"default": "onType",
"description": "Run the linter on save (onSave) or on type (onType)",
"enum": [
"onSave",
"onType"
],
"scope": "resource",
"type": "string"
},
"eslint.runtime": {
"default": null,
"markdownDescription": "The location of the node binary to run ESLint under.",
"scope": "machine-overridable",
"type": [
"string",
"null"
]
},
"eslint.timeBudget.onFixes": {
"default": {
"error": 6000,
"warn": 3000
},
"markdownDescription": "The time budget in milliseconds to spend on computing fixes before showing a warning or error.",
"properties": {
"error": {
"default": 6000,
"markdownDescription": "The time budget in milliseconds to spend on computing fixes before showing an error.",
"minimum": 0,
"type": "number"
},
"warn": {
"default": 3000,
"markdownDescription": "The time budget in milliseconds to spend on computing fixes before showing a warning.",
"minimum": 0,
"type": "number"
}
},
"scope": "resource",
"type": "object"
},
"eslint.timeBudget.onValidation": {
"default": {
"error": 8000,
"warn": 4000
},
"markdownDescription": "The time budget in milliseconds to spend on validation before showing a warning or error.",
"properties": {
"error": {
"default": 8000,
"markdownDescription": "The time budget in milliseconds to spend on validation before showing an error.",
"minimum": 0,
"type": "number"
},
"warn": {
"default": 4000,
"markdownDescription": "The time budget in milliseconds to spend on validation before showing a warning.",
"minimum": 0,
"type": "number"
}
},
"scope": "resource",
"type": "object"
},
"eslint.trace.server": {
"anyOf": [
{
"default": "off",
"enum": [
"off",
"messages",
"verbose"
],
"type": "string"
},
{
"properties": {
"format": {
"default": "text",
"enum": [
"text",
"json"
],
"type": "string"
},
"verbosity": {
"default": "off",
"enum": [
"off",
"messages",
"verbose"
],
"type": "string"
}
},
"type": "object"
}
],
"default": "off",
"description": "Traces the communication between VSCode and the eslint linter service.",
"scope": "window"
},
"eslint.useESLintClass": {
"default": false,
"description": "Since version 7 ESLint offers a new API call ESLint. Use it even if the old CLIEngine is available. From version 8 on forward on ESLint class is available.",
"scope": "resource",
"type": "boolean"
},
"eslint.useFlatConfig": {
"default": null,
"markdownDescription": "Controls whether flat config should be used or not. This setting requires ESLint version 8.57 or later and is interpreted according to the [ESLint Flat Config rollout plan](https://eslint.org/blog/2023/10/flat-config-rollout-plans/). This means:\n\n - *8.57.0 <= ESLint version < 9.x*: setting is honored and defaults to false\n- *9.0.0 <= ESLint version < 10.x*: settings is honored and defaults to true\n- *10.0.0 <= ESLint version*: setting is ignored. Flat configs are the default and can't be turned off.",
"scope": "resource",
"type": [
"boolean",
"null"
]
},
"eslint.useRealpaths": {
"default": false,
"description": "Whether ESLint should use real paths when resolving files. This is useful when working with symlinks or when the casing of file paths is inconsistent.",
"scope": "resource",
"type": "boolean"
},
"eslint.validate": {
"default": null,
"description": "An array of language ids which should be validated by ESLint. If not installed ESLint will show an error.",
"items": {
"anyOf": [
{
"type": "string"
},
{
"deprecationMessage": "Auto Fix is enabled by default. Use the single string form.",
"properties": {
"autoFix": {
"description": "Whether auto fixes are provided for the language.",
"type": "boolean"
},
"language": {
"description": "The language id to be validated by ESLint.",
"type": "string"
}
},
"type": "object"
}
]
},
"scope": "resource",
"type": [
"array",
"null"
]
},
"eslint.workingDirectories": {
"items": {
"anyOf": [
{
"type": "string"
},
{
"properties": {
"mode": {
"default": "location",
"enum": [
"auto",
"location"
],
"type": "string"
}
},
"required": [
"mode"
],
"type": "object"
},
{
"deprecationMessage": "Use the new !cwd form.",
"properties": {
"changeProcessCWD": {
"description": "Whether the process's cwd should be changed as well.",
"type": "boolean"
},
"directory": {
"description": "The working directory to use if a file's path starts with this directory.",
"type": "string"
}
},
"required": [
"directory"
],
"type": "object"
},
{
"properties": {
"!cwd": {
"description": "Set to true if ESLint shouldn't change the working directory.",
"type": "boolean"
},
"directory": {
"description": "The working directory to use if a file's path starts with this directory.",
"type": "string"
}
},
"required": [
"directory"
],
"type": "object"
},
{
"properties": {
"!cwd": {
"description": "Set to true if ESLint shouldn't change the working directory.",
"type": "boolean"
},
"pattern": {
"description": "A glob pattern to match a working directory.",
"type": "string"
}
},
"required": [
"pattern"
],
"type": "object"
}
]
},
"markdownDescription": "Specifies how the working directories ESLint is using are computed. ESLint resolves configuration files (e.g. `eslintrc`, `.eslintignore`) relative to a working directory so it is important to configure this correctly.",
"scope": "resource",
"type": "array"
}
}
}
================================================
FILE: schemas/_generated/flow.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of flow",
"properties": {
"flow.coverageSeverity": {
"default": "info",
"description": "Type coverage diagnostic severity",
"enum": [
"error",
"warn",
"info"
],
"scope": "resource",
"type": "string"
},
"flow.enabled": {
"default": true,
"description": "Is flow enabled",
"scope": "resource",
"type": "boolean"
},
"flow.lazyMode": {
"default": null,
"description": "Set value to enable flow lazy mode",
"scope": "resource",
"type": "string"
},
"flow.logLevel": {
"default": "info",
"description": "Log level for output panel logs",
"enum": [
"error",
"warn",
"info",
"trace"
],
"scope": "resource",
"type": "string"
},
"flow.pathToFlow": {
"default": "flow",
"description": "Absolute path to flow binary. Special var ${workspaceFolder} or ${flowconfigDir} can be used in path (NOTE: in windows you can use '/' and can omit '.cmd' in path)",
"scope": "resource",
"type": "string"
},
"flow.showUncovered": {
"default": false,
"description": "If true will show uncovered code by default",
"scope": "resource",
"type": "boolean"
},
"flow.stopFlowOnExit": {
"default": true,
"description": "Stop Flow on Exit",
"scope": "resource",
"type": "boolean"
},
"flow.trace.server": {
"anyOf": [
{
"default": "off",
"enum": [
"off",
"messages",
"verbose"
],
"type": "string"
}
],
"default": "off",
"description": "Traces the communication between VSCode and the flow lsp service.",
"scope": "window"
},
"flow.useBundledFlow": {
"default": true,
"description": "If true will use flow bundled with this plugin if nothing works",
"scope": "resource",
"type": "boolean"
},
"flow.useCodeSnippetOnFunctionSuggest": {
"default": true,
"description": "Complete functions with their parameter signature.",
"scope": "resource",
"type": "boolean"
},
"flow.useNPMPackagedFlow": {
"default": true,
"description": "Support using flow through your node_modules folder, WARNING: Checking this box is a security risk. When you open a project we will immediately run code contained within it.",
"scope": "resource",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/fortls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of fortls",
"properties": {
"fortran-ls.autocompletePrefix": {
"default": false,
"description": "Filter autocomplete suggestions with variable prefix",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.disableDiagnostics": {
"default": false,
"description": "Disable diagnostics (requires v1.12.0+).",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.displayVerWarning": {
"default": true,
"description": "Provides notifications when the underlying language server is out of date.",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.enableCodeActions": {
"default": false,
"description": "Enable experimental code actions (requires v1.7.0+).",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.executablePath": {
"default": "fortls",
"description": "Path to the Fortran language server (fortls).",
"scope": "resource",
"type": "string"
},
"fortran-ls.hoverSignature": {
"default": false,
"description": "Show signature information in hover for argument (also enables 'variableHover').",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.includeSymbolMem": {
"default": true,
"description": "Include type members in document outline (also used for 'Go to Symbol in File')",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.incrementalSync": {
"default": true,
"description": "Use incremental synchronization for file changes.",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.lowercaseIntrinsics": {
"default": false,
"description": "Use lowercase for intrinsics and keywords in autocomplete requests.",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.maxCommentLineLength": {
"default": -1,
"description": "Maximum comment line length (requires v1.8.0+).",
"scope": "resource",
"type": "number"
},
"fortran-ls.maxLineLength": {
"default": -1,
"description": "Maximum line length (requires v1.8.0+).",
"scope": "resource",
"type": "number"
},
"fortran-ls.notifyInit": {
"default": false,
"description": "Notify when workspace initialization is complete (requires v1.7.0+).",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.useSignatureHelp": {
"default": true,
"description": "Use signature help instead of snippets when available.",
"scope": "resource",
"type": "boolean"
},
"fortran-ls.variableHover": {
"default": false,
"description": "Show hover information for variables.",
"scope": "resource",
"type": "boolean"
}
}
}
================================================
FILE: schemas/_generated/fsautocomplete.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of fsautocomplete",
"properties": {
"FSharp.FSIExtraInteractiveParameters": {
"markdownDescription": "An array of additional command line parameters to pass to FSI when it is launched. See [the Microsoft documentation](https://docs.microsoft.com/en-us/dotnet/fsharp/language-reference/fsharp-interactive-options) for an exhaustive list. If both this and `#FSharp.fsiExtraParameters#` are used, both sets of arguments will be passed to the launched FSI.",
"type": "array"
},
"FSharp.FSIExtraSharedParameters": {
"markdownDescription": "An array of additional command line parameters to pass to the compiler to use when checking FSI scripts. See [the Microsoft documentation](https://docs.microsoft.com/en-us/dotnet/fsharp/language-reference/fsharp-interactive-options) for an exhaustive list. If both this and `#FSharp.fsiExtraParameters#` are used, only `#FSharp.fsiExtraParameters#` will be used.",
"type": "array"
},
"FSharp.TestExplorer.AutoDiscoverTestsOnLoad": {
"default": true,
"description": "Decides if the test explorer will automatically try discover tests when the workspace loads. You can still manually refresh the explorer to discover tests at any time",
"type": "boolean"
},
"FSharp.TestExplorer.UseLegacyDotnetCliIntegration": {
"default": false,
"description": "Use the dotnet cli to discover and run tests instead of the language server. Will lose features like streamed test results and Microsoft Testing Platform support.",
"type": "boolean"
},
"FSharp.abstractClassStubGeneration": {
"default": true,
"description": "Enables a codefix that generates missing members for an abstract class when in an type inheriting from that abstract class.",
"type": "boolean"
},
"FSharp.abstractClassStubGenerationMethodBody": {
"default": "failwith \"Not Implemented\"",
"description": "The expression to fill in the right-hand side of inherited members when generating missing members for an abstract base class",
"type": "string"
},
"FSharp.abstractClassStubGenerationObjectIdentifier": {
"default": "this",
"description": "The name of the 'self' identifier in an inherited member. For example, `this` in the expression `this.Member(x: int) = ()`",
"type": "string"
},
"FSharp.addFsiWatcher": {
"default": false,
"description": "Enables a panel for FSI that shows the value of all existing bindings in the FSI session",
"type": "boolean"
},
"FSharp.addPrivateAccessModifier": {
"default": false,
"description": "Enables a codefix that adds a private access modifier",
"type": "boolean"
},
"FSharp.analyzersPath": {
"default": [
"packages/Analyzers",
"analyzers"
],
"description": "Directories in the array are used as a source of custom analyzers. Requires restart.",
"scope": "machine-overridable",
"type": "array"
},
"FSharp.autoRevealInExplorer": {
"default": "sameAsFileExplorer",
"description": "Controls whether the solution explorer should automatically reveal and select files when opening them. If `sameAsFileExplorer` is set, then the value of the `explorer.autoReveal` setting will be used instead.",
"enum": [
"sameAsFileExplorer",
"enabled",
"disabled"
],
"scope": "window",
"type": "string"
},
"FSharp.codeLenses.references.enabled": {
"default": true,
"description": "If enabled, code lenses for reference counts for methods and functions will be shown.",
"type": "boolean"
},
"FSharp.codeLenses.signature.enabled": {
"default": true,
"description": "If enabled, code lenses for type signatures on methods and functions will be shown.",
"type": "boolean"
},
"FSharp.disableFailedProjectNotifications": {
"default": false,
"description": "Disables popup notifications for failed project loading",
"type": "boolean"
},
"FSharp.dotnetRoot": {
"description": "Sets the root path for finding locating the dotnet CLI binary. Defaults to the `dotnet` binary found on your system PATH.",
"type": "string"
},
"FSharp.enableAdaptiveLspServer": {
"default": true,
"description": "Enables Enable LSP Server based on FSharp.Data.Adaptive. This can improve stability. Requires restart.",
"markdownDeprecationMessage": "This setting has been deprecated because it is now the only behavior of the LSP Server.",
"type": "boolean"
},
"FSharp.enableAnalyzers": {
"default": false,
"description": "EXPERIMENTAL. Enables F# analyzers for custom code diagnostics. Requires restart.",
"type": "boolean"
},
"FSharp.enableMSBuildProjectGraph": {
"default": false,
"description": "EXPERIMENTAL. Enables support for loading workspaces with MsBuild's ProjectGraph. This can improve load times. Requires restart.",
"type": "boolean"
},
"FSharp.enableReferenceCodeLens": {
"default": true,
"deprecationMessage": "This setting is deprecated. Use FSharp.codeLenses.references.enabled instead.",
"description": "Enables additional code lenses showing number of references of a function or value. Requires background services to be enabled.",
"markdownDeprecationMessage": "This setting is **deprecated**. Use `#FSharp.codeLenses.references.enabled#` instead.",
"type": "boolean"
},
"FSharp.enableTouchBar": {
"default": true,
"description": "Enables TouchBar integration of build/run/debug buttons",
"type": "boolean"
},
"FSharp.enableTreeView": {
"default": true,
"description": "Enables the solution explorer view of the current workspace, which shows the workspace as MSBuild sees it",
"type": "boolean"
},
"FSharp.excludeAnalyzers": {
"default": [],
"description": "The names of custom analyzers that should not be executed.",
"scope": "machine-overridable",
"type": "array"
},
"FSharp.excludeProjectDirectories": {
"default": [
".git",
"paket-files",
".fable",
"packages",
"node_modules"
],
"description": "Directories in the array are excluded from project file search. Requires restart.",
"type": "array"
},
"FSharp.externalAutocomplete": {
"default": false,
"description": "Includes external (from unopened modules and namespaces) symbols in autocomplete",
"type": "boolean"
},
"FSharp.fcs.transparentCompiler.enabled": {
"default": false,
"markdownDescription": "EXPERIMENTAL: Enables the FSharp Compiler Service's [transparent compiler](https://github.com/dotnet/fsharp/pull/15179) feature. Requires restart.",
"type": "boolean"
},
"FSharp.fsac.attachDebugger": {
"default": false,
"markdownDescription": "Appends the `--attachdebugger` argument to fsac, this will allow you to attach a debugger.",
"type": "boolean"
},
"FSharp.fsac.cachedTypeCheckCount": {
"default": 200,
"description": "The MemoryCacheOptions.SizeLimit for caching typechecks.",
"type": "integer"
},
"FSharp.fsac.conserveMemory": {
"default": false,
"deprecationMessage": "This setting is deprecated. Use FSharp.fsac.gc.conserveMemory instead.",
"description": "Configures FsAutoComplete with settings intended to reduce memory consumption. Requires restart.",
"type": "boolean"
},
"FSharp.fsac.dotnetArgs": {
"default": [],
"description": "additional CLI arguments to be provided to the dotnet runner for FSAC",
"items": {
"type": "string"
},
"type": "array"
},
"FSharp.fsac.fsacArgs": {
"default": [],
"description": "additional CLI arguments to be provided to FSAC itself. Useful for flags that aren't exposed in the settings or CLI arguments that only exist in custom built versions of FSAC. Requires restart.",
"items": {
"type": "string"
},
"type": "array"
},
"FSharp.fsac.gc.conserveMemory": {
"markdownDescription": "Configures the garbage collector to [conserve memory](https://learn.microsoft.com/en-us/dotnet/core/runtime-config/garbage-collector#conserve-memory) at the expense of more frequent garbage collections and possibly longer pause times. Acceptable values are 0-9. Any non-zero value will allow the [Large Object Heap](https://learn.microsoft.com/en-us/dotnet/standard/garbage-collection/large-object-heap) to be compacted automatically if it has too much fragmentation. Requires restart.",
"maximum": 9,
"minimum": 0,
"type": "integer"
},
"FSharp.fsac.gc.heapCount": {
"markdownDescription": "Limits the number of [heaps](https://learn.microsoft.com/en-us/dotnet/standard/garbage-collection/fundamentals#the-managed-heap) created by the garbage collector. Applies to server garbage collection only. See [Middle Ground between Server and Workstation GC](https://devblogs.microsoft.com/dotnet/middle-ground-between-server-and-workstation-gc/) for more details. This can allow FSAC to still benefit from Server garbage collection while still limiting the number of heaps. [Only available on .NET 7 or higher](https://github.com/ionide/ionide-vscode-fsharp/issues/1899#issuecomment-1649009462). Requires restart. If FSAC is run on .NET 8 runtimes, this will be set to 2 by default to prevent inflated memory use. On .NET 9 with DATAS enabled, this will not be set. ",
"required": [
"FSharp.fsac.gc.server"
],
"type": "integer"
},
"FSharp.fsac.gc.server": {
"default": true,
"markdownDescription": "Configures whether the application uses workstation garbage collection or server garbage collection. See [Workstation vs Server Garbage Collection](https://devblogs.microsoft.com/premier-developer/understanding-different-gc-modes-with-concurrency-visualizer/#workstation-gc-vs-server-gc) for more details. Workstation will use less memory but Server will have more throughput. Requires restart.",
"type": "boolean"
},
"FSharp.fsac.gc.useDatas": {
"markdownDescription": "Configures whether the application uses the DATAS(dynamic adaptation to application sizes) server garbage collection mode. See [DATAS](https://learn.microsoft.com/dotnet/core/runtime-config/garbage-collector#dynamic-adaptation-to-application-sizes-datas) for more details. Requires restart. When FSAC is run on .NET 8 runtimes, this will be set to false by default. On .NET 9 runtimes, this will be set to true by default.",
"required": [
"FSharp.fsac.gc.server"
],
"type": "boolean"
},
"FSharp.fsac.netCoreDllPath": {
"default": "",
"description": "The path to the 'fsautocomplete.dll', a directory containing TFM-specific versions of fsautocomplete.dll, or a directory containing fsautocomplete.dll. Useful for debugging a self-built FSAC. If a DLL is specified, uses it directly. If a directory is specified and it contains TFM-specific folders (net6.0, net7.0, etc) then that directory will be probed for the best TFM to use for the current runtime. This is useful when working with a local copy of FSAC, you can point directly to the bin/Debug or bin/Release folder and it'll Just Work. Finally, if a directory is specified and there are no TFM paths, then fsautocomplete.dll from that directory is used. Requires restart.",
"scope": "machine-overridable",
"type": "string"
},
"FSharp.fsac.parallelReferenceResolution": {
"default": false,
"description": "EXPERIMENTAL: Speed up analyzing of projects in parallel. Requires restart.",
"type": "boolean"
},
"FSharp.fsac.silencedLogs": {
"default": [],
"description": "An array of log categories for FSAC to filter out. These can be found by viewing your log output and noting the text in between the brackets in the log line. For example, in the log line `[16:07:14.626 INF] [Compiler] done compiling foo.fsx`, the category is 'Compiler'. ",
"items": {
"type": "string"
},
"type": "array"
},
"FSharp.fsac.sourceTextImplementation": {
"default": "RoslynSourceText",
"description": "Enables the use of a new source text implementation. This may have better memory characteristics. Requires restart.",
"enum": [
"NamedText",
"RoslynSourceText"
],
"markdownDeprecationMessage": "This setting is deprecated because the RoslynSourceText SourceText implementation has been adopted as the only implementation in the LSP Server."
},
"FSharp.fsiExtraParameters": {
"markdownDeprecationMessage": "This setting can lead to errors when providing both FSI-CLI-only and script-typechecking-related parameters. Please use `#FSharp.FSIExtraInteractiveParameters#` for FSI-CLI-specific parameters, and `#FSharp.FSIExtraSharedParameters#` for typechecking-related parameters.",
"markdownDescription": "An array of additional command line parameters to pass to FSI when it is started. See [the Microsoft documentation](https://docs.microsoft.com/en-us/dotnet/fsharp/language-reference/fsharp-interactive-options) for an exhaustive list.",
"type": "array"
},
"FSharp.fsiSdkFilePath": {
"default": "",
"description": "The path to the F# Interactive tool used by Ionide-FSharp (When using .NET SDK scripts)",
"scope": "machine-overridable",
"type": "string"
},
"FSharp.fullNameExternalAutocomplete": {
"default": false,
"description": "When selecting an external symbols in autocomplete, insert the full name to the editor rather than open its module/namespace. Also allow filtering suggestions by typing its full name. \n\n Requires `FSharp.externalAutocomplete` enabled.",
"type": "boolean"
},
"FSharp.generateBinlog": {
"default": false,
"markdownDescription": "Enables generation of `msbuild.binlog` files for project loading. It works only for fresh, non-cached project loading. Run `F#: Clear Project Cache` and `Developer: Reload Window` to force fresh loading of all projects. These files can be loaded and inspected using the [MSBuild Structured Logger](https://github.com/KirillOsenkov/MSBuildStructuredLog)",
"type": "boolean"
},
"FSharp.includeAnalyzers": {
"default": [],
"description": "The names of custom analyzers that should exclusively be executed, others should be ignored.",
"scope": "machine-overridable",
"type": "array"
},
"FSharp.indentationSize": {
"default": 4,
"description": "The number of spaces used for indentation when generating code, e.g. for interface stubs",
"minimum": 1,
"type": "number"
},
"FSharp.infoPanelReplaceHover": {
"default": false,
"description": "Controls whether the info panel replaces tooltips",
"type": "boolean"
},
"FSharp.infoPanelShowOnStartup": {
"default": false,
"description": "Controls whether the info panel should be displayed at startup",
"type": "boolean"
},
"FSharp.infoPanelStartLocked": {
"default": false,
"description": "Controls whether the info panel should be locked at startup",
"type": "boolean"
},
"FSharp.infoPanelUpdate": {
"default": "onCursorMove",
"description": "Controls when the info panel is updated",
"enum": [
"onCursorMove",
"onHover",
"both",
"none"
],
"type": "string"
},
"FSharp.inlayHints.disableLongTooltip": {
"default": false,
"description": "Hides the explanatory tooltip that appears on InlayHints to describe the different configuration toggles.",
"type": "boolean"
},
"FSharp.inlayHints.enabled": {
"default": true,
"description": "Controls if the inlay hints feature is enabled",
"markdownDeprecationMessage": "This can be controlled by `editor.inlayHints.enabled` instead.",
"type": "boolean"
},
"FSharp.inlayHints.parameterNames": {
"default": true,
"description": "Controls if parameter-name inlay hints will be displayed for functions and methods",
"type": "boolean"
},
"FSharp.inlayHints.typeAnnotations": {
"default": true,
"description": "Controls if type-annotation inlay hints will be displayed for bindings.",
"type": "boolean"
},
"FSharp.inlineValues.enabled": {
"default": false,
"description": "Enables rendering all kinds of hints inline with your code. Currently supports pipelineHints, which are like LineLenses that appear along each step of a chain of piped expressions",
"type": "boolean"
},
"FSharp.inlineValues.prefix": {
"default": " // ",
"description": "The prefix used when rendering inline values.",
"type": "string"
},
"FSharp.interfaceStubGeneration": {
"default": true,
"description": "Enables a codefix that generates missing interface members when inside of an interface implementation expression",
"type": "boolean"
},
"FSharp.interfaceStubGenerationMethodBody": {
"default": "failwith \"Not Implemented\"",
"description": "The expression to fill in the right-hand side of interface members when generating missing members for an interface implementation expression",
"type": "string"
},
"FSharp.interfaceStubGenerationObjectIdentifier": {
"default": "this",
"description": "The name of the 'self' identifier in an interface member. For example, `this` in the expression `this.Member(x: int) = ()`",
"type": "string"
},
"FSharp.keywordsAutocomplete": {
"default": true,
"description": "Includes keywords in autocomplete",
"type": "boolean"
},
"FSharp.lineLens.enabled": {
"default": "replaceCodeLens",
"description": "Usage mode for LineLens. If `never`, LineLens will never be shown. If `replaceCodeLens`, LineLens will be placed in a decoration on top of the current line.",
"enum": [
"never",
"replaceCodeLens",
"always"
],
"type": "string"
},
"FSharp.lineLens.prefix": {
"default": " // ",
"description": "The prefix displayed before the signature in a LineLens",
"type": "string"
},
"FSharp.linter": {
"default": true,
"markdownDescription": "Enables integration with [FSharpLint](https://fsprojects.github.io/FSharpLint/) for additional (user-defined) warnings",
"type": "boolean"
},
"FSharp.msbuildAutoshow": {
"default": false,
"description": "Automatically shows the MSBuild output panel when MSBuild functionality is invoked",
"type": "boolean"
},
"FSharp.notifications.trace": {
"default": false,
"description": "Enables more verbose notifications using System.Diagnostics.Activity to view traces from FSharp.Compiler.Service.",
"type": "boolean"
},
"FSharp.notifications.traceNamespaces": {
"default": [
"BoundModel.TypeCheck",
"BackgroundCompiler."
],
"description": "The set of System.Diagnostics.Activity names to watch.",
"items": {
"type": "string"
},
"required": [
"FSharp.notifications.trace"
],
"type": "array"
},
"FSharp.openTelemetry.enabled": {
"default": false,
"markdownDescription": "Enables OpenTelemetry exporter. See [OpenTelemetry Protocol Exporter](https://opentelemetry.io/docs/reference/specification/protocol/exporter/) for environment variables to configure for the exporter. Requires Restart.",
"type": "boolean"
},
"FSharp.pipelineHints.enabled": {
"default": true,
"description": "Enables PipeLine hints, which are like LineLenses that appear along each step of a chain of piped expressions",
"type": "boolean"
},
"FSharp.pipelineHints.prefix": {
"default": " // ",
"description": "The prefix displayed before the signature",
"type": "string"
},
"FSharp.recordStubGeneration": {
"default": true,
"description": "Enables a codefix that will generate missing record fields when inside a record construction expression",
"type": "boolean"
},
"FSharp.recordStubGenerationBody": {
"default": "failwith \"Not Implemented\"",
"description": "The expression to fill in the right-hand side of record fields when generating missing fields for a record construction expression",
"type": "string"
},
"FSharp.resolveNamespaces": {
"default": true,
"description": "Enables a codefix that will suggest namespaces or modules to open when a name is not recognized",
"type": "boolean"
},
"FSharp.saveOnSendLastSelection": {
"default": false,
"description": "If enabled, the current file will be saved before sending the last selection to FSI for evaluation",
"type": "boolean"
},
"FSharp.showExplorerOnStartup": {
"default": false,
"description": "Automatically shows solution explorer on plugin startup",
"type": "boolean"
},
"FSharp.showProjectExplorerIn": {
"default": "fsharp",
"description": "Set the activity (left bar) where the project explorer view will be displayed. If `explorer`, then the project explorer will be a collapsible tab in the main explorer view, a sibling to the file system explorer. If `fsharp`, a new activity with the F# logo will be added and the project explorer will be rendered in this activity. Requires restart.",
"enum": [
"explorer",
"fsharp"
],
"scope": "application",
"type": "string"
},
"FSharp.simplifyNameAnalyzer": {
"default": true,
"description": "Enables detection of cases when names of functions and values can be simplified",
"type": "boolean"
},
"FSharp.simplifyNameAnalyzerExclusions": {
"default": [
".*\\.g\\.fs",
".*\\.cg\\.fs"
],
"description": "A set of regex patterns to exclude from the simplify name analyzer",
"items": {
"type": "string"
},
"required": [
"FSharp.simplifyNameAnalyzer"
],
"type": "array"
},
"FSharp.smartIndent": {
"default": false,
"description": "Enables smart indent feature",
"type": "boolean"
},
"FSharp.suggestGitignore": {
"default": true,
"description": "Allow Ionide to prompt whenever internal data files aren't included in your project's .gitignore",
"type": "boolean"
},
"FSharp.suggestSdkScripts": {
"default": true,
"description": "Allow Ionide to prompt to use SdkScripts",
"type": "boolean"
},
"FSharp.trace.server": {
"default": "off",
"description": "Trace server messages at the LSP protocol level for diagnostics.",
"enum": [
"off",
"messages",
"verbose"
],
"scope": "window",
"type": "string"
},
"FSharp.unionCaseStubGeneration": {
"default": true,
"description": "Enables a codefix that generates missing union cases when in a match expression",
"type": "boolean"
},
"FSharp.unionCaseStubGenerationBody": {
"default": "failwith \"Not Implemented\"",
"description": "The expression to fill in the right-hand side of match cases when generating missing cases for a match on a discriminated union",
"type": "string"
},
"FSharp.unnecessaryParenthesesAnalyzer": {
"default": true,
"description": "Enables detection of unnecessary parentheses",
"type": "boolean"
},
"FSharp.unnecessaryParenthesesAnalyzerExclusions": {
"default": [
".*\\.g\\.fs",
".*\\.cg\\.fs"
],
"description": "A set of regex patterns to exclude from the unnecessary parentheses analyzer",
"items": {
"type": "string"
},
"required": [
"FSharp.unnecessaryParenthesesAnalyzer"
],
"type": "array"
},
"FSharp.unusedDeclarationsAnalyzer": {
"default": true,
"description": "Enables detection of unused declarations",
"type": "boolean"
},
"FSharp.unusedDeclarationsAnalyzerExclusions": {
"default": [
".*\\.g\\.fs",
".*\\.cg\\.fs"
],
"description": "A set of regex patterns to exclude from the unused declarations analyzer",
"items": {
"type": "string"
},
"required": [
"FSharp.unusedDeclarationsAnalyzer"
],
"type": "array"
},
"FSharp.unusedOpensAnalyzer": {
"default": true,
"description": "Enables detection of unused opens",
"type": "boolean"
},
"FSharp.unusedOpensAnalyzerExclusions": {
"default": [
".*\\.g\\.fs",
".*\\.cg\\.fs"
],
"description": "A set of regex patterns to exclude from the unused opens analyzer",
"items": {
"type": "string"
},
"required": [
"FSharp.unusedOpensAnalyzer"
],
"type": "array"
},
"FSharp.verboseLogging": {
"default": false,
"description": "Logs additional information to F# output channel. This is equivalent to passing the `--verbose` flag to FSAC. Requires restart.",
"type": "boolean"
},
"FSharp.workspaceModePeekDeepLevel": {
"default": 4,
"description": "The deep level of directory hierarchy when searching for sln/projects",
"type": "integer"
},
"FSharp.workspacePath": {
"description": "Path to the directory or solution file that should be loaded as a workspace. If set, no workspace probing or discovery is done by Ionide at all.",
"scope": "window",
"type": "string"
},
"Fsharp.fsac.gc.noAffinitize": {
"markdownDescription": "Specifies whether to [affinitize](https://learn.microsoft.com/en-us/dotnet/core/runtime-config/garbage-collector#affinitize) garbage collection threads with processors. To affinitize a GC thread means that it can only run on its specific CPU. Applies to server garbage collection only. See [GCNoAffinitize](https://learn.microsoft.com/en-us/dotnet/framework/configure-apps/file-schema/runtime/gcnoaffinitize-element#remarks) for more details. [Only available on .NET 7 or higher](https://github.com/ionide/ionide-vscode-fsharp/issues/1899#issuecomment-1649009462). Requires restart. If FSAC is run on .NET 8 runtimes, this will be set by default. On .NET 9 with DATAS enabled, this will not be set.",
"required": [
"FSharp.fsac.gc.server"
],
"type": "boolean"
}
}
}
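The `FSharp.*` keys above are flat setting names consumed by fsautocomplete. A per-server settings file using a few of the keys defined in this schema might look like the following sketch (values chosen for illustration only):

```json
{
  "FSharp.lineLens.enabled": "replaceCodeLens",
  "FSharp.unusedOpensAnalyzer": true,
  "FSharp.unionCaseStubGenerationBody": "failwith \"TODO\""
}
```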
================================================
FILE: schemas/_generated/gopls.json
================================================
{
"$schema": "http://json-schema.org/draft-04/schema#",
"description": "Setting of gopls",
"properties": {
"go.addTags": {
"additionalProperties": false,
"default": {
"options": "json=omitempty",
"promptForTags": false,
"tags": "json",
"template": "",
"transform": "snakecase"
},
"description": "Tags and options configured here will be used by the Add Tags command to add tags to struct fields. If promptForTags is true, then user will be prompted for tags and options. By default, json tags are added.",
"properties": {
"options": {
"default": "json=omitempty",
"description": "Comma separated tag=options pairs to be used by Go: Add Tags command",
"type": "string"
},
"promptForTags": {
"default": false,
"description": "If true, Go: Add Tags command will prompt the user to provide tags, options, transform values instead of using the configured values",
"type": "boolean"
},
"tags": {
"default": "json",
"description": "Comma separated tags to be used by Go: Add Tags command",
"type": "string"
},
"template": {
"default": "",
"description": "Custom format used by Go: Add Tags command for the tag value to be applied",
"type": "string"
},
"transform": {
"default": "snakecase",
"description": "Transformation rule used by Go: Add Tags command to add tags",
"enum": [
"snakecase",
"camelcase",
"lispcase",
"pascalcase",
"keep"
],
"type": "string"
}
},
"scope": "resource",
"type": "object"
},
"go.alternateTools": {
"additionalProperties": true,
"default": {},
"description": "Alternate tools or alternate paths for the same tools used by the Go extension. Provide either absolute path or the name of the binary in GOPATH/bin, GOROOT/bin or PATH. Useful when you want to use wrapper script for the Go tools.",
"properties": {
"dlv": {
"default": "dlv",
"description": "Alternate tool to use instead of the dlv binary or alternate path to use for the dlv binary.",
"type": "string"
},
"go": {
"default": "go",
"description": "Alternate tool to use instead of the go binary or alternate path to use for the go binary.",
"type": "string"
},
"go-outline": {
"default": "go-outline",
"description": "Alternate tool to use instead of the go-outline binary or alternate path to use for the go-outline binary.",
"type": "string"
},
"gopls": {
"default": "gopls",
"description": "Alternate tool to use instead of the gopls binary or alternate path to use for the gopls binary.",
"type": "string"
}
},
"scope": "resource",
"type": "object"
},
"go.autocompleteUnimportedPackages": {
"default": false,
"description": "Include unimported packages in auto-complete suggestions. Not applicable when using the language server.",
"scope": "resource",
"type": "boolean"
},
"go.buildFlags": {
"default": [],
"description": "Flags to `go build`/`go test` used during build-on-save or running tests. (e.g. [\"-ldflags='-s'\"]) This is propagated to the language server if `gopls.build.buildFlags` is not specified.",
"items": {
"type": "string"
},
"scope": "resource",
"type": "array"
},
"go.buildOnSave": {
"default": "package",
"description": "Compiles code on file save using 'go build' or 'go test -c'. Options are 'workspace', 'package', or 'off'. Not applicable when using the language server's diagnostics. See 'go.languageServerExperimentalFeatures.diagnostics' setting.",
"enum": [
"package",
"workspace",
"off"
],
"scope": "resource",
"type": "string"
},
"go.buildTags": {
"default": "",
"description": "The Go build tags to use for all commands, that support a `-tags '...'` argument. When running tests, go.testTags will be used instead if it was set. This is propagated to the language server if `gopls.build.buildFlags` is not specified.",
"scope": "resource",
"type": "string"
},
"go.coverMode": {
"default": "default",
"description": "When generating code coverage, the value for -covermode. 'default' is the default value chosen by the 'go test' command.",
"enum": [
"default",
"set",
"count",
"atomic"
],
"scope": "resource",
"type": "string"
},
"go.coverOnSave": {
"default": false,
"description": "If true, runs 'go test -coverprofile' on save and shows test coverage.",
"scope": "resource",
"type": "boolean"
},
"go.coverOnSingleTest": {
"default": false,
"description": "If true, shows test coverage when Go: Test Function at cursor command is run.",
"type": "boolean"
},
"go.coverOnSingleTestFile": {
"default": false,
"description": "If true, shows test coverage when Go: Test Single File command is run.",
"type": "boolean"
},
"go.coverOnTestPackage": {
"default": true,
"description": "If true, shows test coverage when Go: Test Package command is run.",
"type": "boolean"
},
"go.coverShowCounts": {
"default": false,
"description": "When generating code coverage, should counts be shown as --374--",
"scope": "resource",
"type": "boolean"
},
"go.coverageDecorator": {
"additionalProperties": false,
"default": {
"coveredBorderColor": "rgba(64,128,128,0.5)",
"coveredGutterStyle": "blockblue",
"coveredHighlightColor": "rgba(64,128,128,0.5)",
"type": "highlight",
"uncoveredBorderColor": "rgba(128,64,64,0.25)",
"uncoveredGutterStyle": "slashyellow",
"uncoveredHighlightColor": "rgba(128,64,64,0.25)"
},
"description": "This option lets you choose the way to display code coverage. Choose either to highlight the complete line or to show a decorator in the gutter. You can customize the colors and borders for the former and the style for the latter.",
"properties": {
"coveredBorderColor": {
"description": "Color to use for the border of covered code.",
"type": "string"
},
"coveredGutterStyle": {
"description": "Gutter style to indicate covered code.",
"enum": [
"blockblue",
"blockred",
"blockgreen",
"blockyellow",
"slashred",
"slashgreen",
"slashblue",
"slashyellow",
"verticalred",
"verticalgreen",
"verticalblue",
"verticalyellow"
],
"type": "string"
},
"coveredHighlightColor": {
"description": "Color in the rgba format to use to highlight covered code.",
"type": "string"
},
"type": {
"enum": [
"highlight",
"gutter"
],
"type": "string"
},
"uncoveredBorderColor": {
"description": "Color to use for the border of uncovered code.",
"type": "string"
},
"uncoveredGutterStyle": {
"description": "Gutter style to indicate uncovered code.",
"enum": [
"blockblue",
"blockred",
"blockgreen",
"blockyellow",
"slashred",
"slashgreen",
"slashblue",
"slashyellow",
"verticalred",
"verticalgreen",
"verticalblue",
"verticalyellow"
],
"type": "string"
},
"uncoveredHighlightColor": {
"description": "Color in the rgba format to use to highlight uncovered code.",
"type": "string"
}
},
"scope": "resource",
"type": "object"
},
"go.coverageOptions": {
"default": "showBothCoveredAndUncoveredCode",
"descri
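As with fsautocomplete, the `go.*` keys above map directly onto a flat settings file. A sketch drawing on the keys shown in this schema (values are illustrative, not defaults):

```json
{
  "go.buildFlags": ["-tags=integration"],
  "go.coverMode": "count",
  "go.coverOnSave": true
}
```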
Condensed preview — 102 files, each showing path, character count, and a content snippet.
[
{
"path": ".github/workflows/gen_schemas.yml",
"chars": 955,
"preview": "name: gen_schemas\n\non:\n # allow running manually\n workflow_dispatch:\n schedule:\n # run once a day\n - cron: '0 12 * * *'\n\njobs:"
},
{
"path": ".github/workflows/stylua_check.yml",
"chars": 585,
"preview": "name: lint\non:\n # allow running manually\n workflow_dispatch:\n pull_request:\n branches: \n - \"**\"\n push:\n branches:\n "
},
{
"path": ".gitignore",
"chars": 22,
"preview": "tags\n\nnvim-lspconfig/\n"
},
{
"path": ".luacheckrc",
"chars": 49,
"preview": "std = luajit\ncodes = true\n\nglobals = {\n \"vim\"\n}\n"
},
{
"path": "LICENSE",
"chars": 1076,
"preview": "The MIT License (MIT)\n\nCopyright (c) 2021 tamago324\n\nPermission is hereby granted, free of charge, to any person obtaini"
},
{
"path": "Makefile",
"chars": 110,
"preview": ".PHONY: fmt\nfmt:\n\tstylua --config-path stylua.toml --glob 'lua/**/*.lua' --glob '!lua/**/tinyyaml.lua' -- lua\n"
},
{
"path": "README.md",
"chars": 4907,
"preview": "# nlsp-settings.nvim\n\n[\n log.warn ':NlspConfig has been removed in fa"
},
{
"path": "lua/nlspsettings/loaders/json.lua",
"chars": 1188,
"preview": "local schemas = require 'nlspsettings.schemas'\n\n---@class nlspsettings.loaders.json\n---@field name string loader name\n--"
},
{
"path": "lua/nlspsettings/loaders/yaml/init.lua",
"chars": 1076,
"preview": "local schemas = require 'nlspsettings.schemas'\nlocal tinyyaml = require 'nlspsettings.loaders.yaml.tinyyaml'\n\n---@class "
},
{
"path": "lua/nlspsettings/loaders/yaml/tinyyaml.lua",
"chars": 22601,
"preview": "-- original source code: https://github.com/api7/lua-tinyyaml/blob/1cefdf0f3a4b47b2804b6e14671b6fd073d15e66/tinyyaml.lua"
},
{
"path": "lua/nlspsettings/log.lua",
"chars": 1561,
"preview": "local has_notify, notify = pcall(require, 'notify')\nlocal config = require 'nlspsettings.config'\n\nlocal TITLE = 'Lsp Set"
},
{
"path": "lua/nlspsettings/schemas.lua",
"chars": 2311,
"preview": "local config = require 'nlspsettings.config'\n\nlocal uv = vim.loop\n\nlocal on_windows = uv.os_uname().version:match 'Windo"
},
{
"path": "lua/nlspsettings/utils.lua",
"chars": 1268,
"preview": "local log = require 'nlspsettings.log'\n\n---@param t table\n---@return boolean\nlocal is_table = function(t)\n return type("
},
{
"path": "lua/nlspsettings.lua",
"chars": 8507,
"preview": "local config = require 'nlspsettings.config'\nlocal lspconfig = require 'lspconfig'\nlocal utils = require 'nlspsettings.u"
},
{
"path": "plugin/nlspsetting.vim",
"chars": 921,
"preview": "if exists('g:loaded_nlspsettings')\n finish\nendif\nlet g:loaded_nlspsettings = 1\n\ncommand! -nargs=* -complete=custom,v:lu"
},
{
"path": "schemas/README.md",
"chars": 10,
"preview": "# Schemas\n"
},
{
"path": "schemas/_generated/.gitkeep",
"chars": 0,
"preview": ""
},
{
"path": "schemas/_generated/als.json",
"chars": 3427,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of als\",\n \"properties\": {\n \"ada."
},
{
"path": "schemas/_generated/asm_lsp.json",
"chars": 435,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of asm_lsp\",\n \"properties\": {\n \""
},
{
"path": "schemas/_generated/ast_grep.json",
"chars": 1646,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of ast_grep\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/astro.json",
"chars": 2160,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of astro\",\n \"properties\": {\n \"as"
},
{
"path": "schemas/_generated/awkls.json",
"chars": 592,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of awkls\",\n \"properties\": {\n \"aw"
},
{
"path": "schemas/_generated/bashls.json",
"chars": 4418,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of bashls\",\n \"properties\": {\n \"b"
},
{
"path": "schemas/_generated/beancount.json",
"chars": 2303,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of beancount\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/bicep.json",
"chars": 1841,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of bicep\",\n \"properties\": {\n \"bi"
},
{
"path": "schemas/_generated/bright_script.json",
"chars": 10923,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of bright_script\",\n \"properties\": {"
},
{
"path": "schemas/_generated/clangd.json",
"chars": 3790,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of clangd\",\n \"properties\": {\n \"c"
},
{
"path": "schemas/_generated/codeqlls.json",
"chars": 4933,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of codeqlls\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/cssls.json",
"chars": 7953,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of cssls\",\n \"properties\": {\n \"cs"
},
{
"path": "schemas/_generated/dartls.json",
"chars": 3725,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of dartls\",\n \"properties\": {\n \"d"
},
{
"path": "schemas/_generated/denols.json",
"chars": 13438,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of denols\",\n \"properties\": {\n \"d"
},
{
"path": "schemas/_generated/elixirls.json",
"chars": 5921,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of elixirls\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/elmls.json",
"chars": 2928,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of elmls\",\n \"properties\": {\n \"el"
},
{
"path": "schemas/_generated/eslint.json",
"chars": 17387,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of eslint\",\n \"properties\": {\n \"e"
},
{
"path": "schemas/_generated/flow.json",
"chars": 2625,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of flow\",\n \"properties\": {\n \"flo"
},
{
"path": "schemas/_generated/fortls.json",
"chars": 2904,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of fortls\",\n \"properties\": {\n \"f"
},
{
"path": "schemas/_generated/fsautocomplete.json",
"chars": 26961,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of fsautocomplete\",\n \"properties\": "
},
{
"path": "schemas/_generated/gopls.json",
"chars": 74805,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of gopls\",\n \"properties\": {\n \"go"
},
{
"path": "schemas/_generated/grammarly.json",
"chars": 25611,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of grammarly\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/haxe_language_server.json",
"chars": 29687,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of haxe_language_server\",\n \"propert"
},
{
"path": "schemas/_generated/hhvm.json",
"chars": 3509,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of hhvm\",\n \"properties\": {\n \"hac"
},
{
"path": "schemas/_generated/hie.json",
"chars": 40484,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of hie\",\n \"properties\": {\n \"hask"
},
{
"path": "schemas/_generated/html.json",
"chars": 5816,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of html\",\n \"properties\": {\n \"htm"
},
{
"path": "schemas/_generated/intelephense.json",
"chars": 30097,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of intelephense\",\n \"properties\": {\n"
},
{
"path": "schemas/_generated/java_language_server.json",
"chars": 2542,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of java_language_server\",\n \"propert"
},
{
"path": "schemas/_generated/jdtls.json",
"chars": 7367,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of jdtls\",\n \"properties\": {\n \"ja"
},
{
"path": "schemas/_generated/jsonls.json",
"chars": 3106,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of jsonls\",\n \"properties\": {\n \"j"
},
{
"path": "schemas/_generated/julials.json",
"chars": 16397,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of julials\",\n \"properties\": {\n \""
},
{
"path": "schemas/_generated/kotlin_language_server.json",
"chars": 7588,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of kotlin_language_server\",\n \"prope"
},
{
"path": "schemas/_generated/leanls.json",
"chars": 6299,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of leanls\",\n \"properties\": {\n \"l"
},
{
"path": "schemas/_generated/ltex.json",
"chars": 89828,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of ltex\",\n \"properties\": {\n \"lte"
},
{
"path": "schemas/_generated/lua_ls.json",
"chars": 83126,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of lua_ls\",\n \"properties\": {\n \"L"
},
{
"path": "schemas/_generated/luau_lsp.json",
"chars": 24143,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of luau_lsp\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/nickel_ls.json",
"chars": 629,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of nickel_ls\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/nimls.json",
"chars": 2466,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of nimls\",\n \"properties\": {\n \"ni"
},
{
"path": "schemas/_generated/omnisharp.json",
"chars": 277,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of omnisharp\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/perlls.json",
"chars": 4873,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of perlls\",\n \"properties\": {\n \"p"
},
{
"path": "schemas/_generated/perlnavigator.json",
"chars": 6118,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of perlnavigator\",\n \"properties\": {"
},
{
"path": "schemas/_generated/perlpls.json",
"chars": 4264,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of perlpls\",\n \"properties\": {\n \""
},
{
"path": "schemas/_generated/powershell_es.json",
"chars": 3574,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of powershell_es\",\n \"properties\": {"
},
{
"path": "schemas/_generated/psalm.json",
"chars": 5373,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of psalm\",\n \"properties\": {\n \"ps"
},
{
"path": "schemas/_generated/puppet.json",
"chars": 6997,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of puppet\",\n \"properties\": {\n \"p"
},
{
"path": "schemas/_generated/purescriptls.json",
"chars": 9236,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of purescriptls\",\n \"properties\": {\n"
},
{
"path": "schemas/_generated/pyls.json",
"chars": 8776,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of pyls\",\n \"properties\": {\n \"pyl"
},
{
"path": "schemas/_generated/pylsp.json",
"chars": 15937,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of pylsp\",\n \"properties\": {\n \"py"
},
{
"path": "schemas/_generated/pyright.json",
"chars": 42241,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of pyright\",\n \"properties\": {\n \""
},
{
"path": "schemas/_generated/r_language_server.json",
"chars": 1747,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of r_language_server\",\n \"properties"
},
{
"path": "schemas/_generated/rescriptls.json",
"chars": 3507,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of rescriptls\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/rls.json",
"chars": 10370,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of rls\",\n \"properties\": {\n \"rust"
},
{
"path": "schemas/_generated/rome.json",
"chars": 1180,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of rome\",\n \"properties\": {\n \"rom"
},
{
"path": "schemas/_generated/rust_analyzer.json",
"chars": 104,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of rust_analyzer\"\n}\n"
},
{
"path": "schemas/_generated/solargraph.json",
"chars": 3990,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of solargraph\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/solidity_ls.json",
"chars": 6795,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of solidity_ls\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/sorbet.json",
"chars": 6572,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of sorbet\",\n \"properties\": {\n \"s"
},
{
"path": "schemas/_generated/sourcekit.json",
"chars": 11748,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of sourcekit\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/spectral.json",
"chars": 1995,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of spectral\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/stylelint_lsp.json",
"chars": 2619,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of stylelint_lsp\",\n \"properties\": {"
},
{
"path": "schemas/_generated/sumneko_lua.json",
"chars": 83131,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of sumneko_lua\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/svelte.json",
"chars": 14785,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of svelte\",\n \"properties\": {\n \"s"
},
{
"path": "schemas/_generated/svlangserver.json",
"chars": 4734,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of svlangserver\",\n \"properties\": {\n"
},
{
"path": "schemas/_generated/tailwindcss.json",
"chars": 7718,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of tailwindcss\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/terraformls.json",
"chars": 551,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of terraformls\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/tsserver.json",
"chars": 3631,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of tsserver\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/volar.json",
"chars": 5455,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of volar\",\n \"properties\": {\n \"vu"
},
{
"path": "schemas/_generated/vtsls.json",
"chars": 55471,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of vtsls\",\n \"properties\": {\n \"ja"
},
{
"path": "schemas/_generated/vuels.json",
"chars": 12292,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of vuels\",\n \"properties\": {\n \"ve"
},
{
"path": "schemas/_generated/wgls_analyzer.json",
"chars": 1372,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of wgls_analyzer\",\n \"properties\": {"
},
{
"path": "schemas/_generated/yamlls.json",
"chars": 3086,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of yamlls\",\n \"properties\": {\n \"r"
},
{
"path": "schemas/_generated/zeta_note.json",
"chars": 855,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of zeta_note\",\n \"properties\": {\n "
},
{
"path": "schemas/_generated/zls.json",
"chars": 5756,
"preview": "{\n \"$schema\": \"http://json-schema.org/draft-04/schema#\",\n \"description\": \"Setting of zls\",\n \"properties\": {\n \"zls."
},
{
"path": "scripts/gen_schemas.lua",
"chars": 2354,
"preview": "local schemas_dir = os.getenv('PWD') .. '/schemas/_generated'\n\nlocal write_tmpfile = function(data)\n local tmpname = os"
},
{
"path": "scripts/gen_schemas.sh",
"chars": 177,
"preview": "#!/bin/sh\n\nexec nvim -u NONE -E -R --headless +'set rtp+=$PWD' +'set rtp+=$PWD/nvim-lspconfig' +'luafile scripts/gen_sch"
},
{
"path": "scripts/gen_schemas_readme.lua",
"chars": 1800,
"preview": "require'lspconfig'\nlocal configs = require 'lspconfig/configs'\nlocal uv = vim.loop\n\nlocal _schemas_dir = os.getenv('PWD'"
},
{
"path": "stylua.toml",
"chars": 100,
"preview": "indent_type = \"Spaces\"\nindent_width = 2\nquote_style = \"AutoPreferSingle\"\nno_call_parentheses = true\n"
}
]