Repository: aduros/ai.vim
Branch: main
Commit: 489d2c1b1e53
Files: 12
Total size: 20.5 KB
Directory structure:
gitextract_8npqn9xc/
├── .github/
│   └── workflows/
│       └── ci.yml
├── .gitignore
├── .luarc.json
├── LICENSE.txt
├── README.md
├── doc/
│   └── ai.txt
├── lua/
│   └── _ai/
│       ├── commands.lua
│       ├── config.lua
│       ├── indicator.lua
│       └── openai.lua
├── plugin/
│   └── ai.lua
└── scripts/
    └── test
================================================
FILE CONTENTS
================================================
================================================
FILE: .github/workflows/ci.yml
================================================
name: Test

on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v3

    - name: Install lua-language-server
      run: |
        lls_dir=`mktemp -d`
        gh release download -R sumneko/lua-language-server -p '*-linux-x64.tar.gz' -D "$lls_dir"
        tar xzf "$lls_dir"/* -C "$lls_dir"
        echo "$lls_dir/bin" >> $GITHUB_PATH
      env:
        GH_TOKEN: ${{ github.token }}

    - name: Run tests
      run: ./scripts/test
================================================
FILE: .gitignore
================================================
/doc/tags
================================================
FILE: .luarc.json
================================================
{
    "$schema": "https://raw.githubusercontent.com/sumneko/vscode-lua/master/setting/schema.json",
    "Lua.runtime.version": "LuaJIT",
    "Lua.diagnostics.globals": [ "vim" ]
}
================================================
FILE: LICENSE.txt
================================================
Copyright (c) Bruno Garcia
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
PERFORMANCE OF THIS SOFTWARE.
================================================
FILE: README.md
================================================
# 🤖 ai.vim
A minimalist Neovim plugin for generating and editing text using OpenAI and GPT.
## Features
- Complete text in insert mode.
- Generate new text using a prompt.
- Select and edit existing text in-place.
- Streaming support for completions.
- Easy-to-use interface. Just hit `<Ctrl-A>` or run `:AI <prompt>`.
- Works with both source code and regular text.
## Installing
For vim-plug, add this to your init.vim:
```vim
Plug 'aduros/ai.vim'
```
Make sure you have an environment variable called `$OPENAI_API_KEY`, which you can [generate
here](https://beta.openai.com/account/api-keys). You'll also need `curl` installed.
To see the full help and customization options, run `:help ai.vim`.
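If you configure Neovim in Lua, the options described in `:help ai.vim` can also be set through
`vim.g` in your `init.lua`. A minimal sketch (the option values below are purely illustrative):

```lua
-- init.lua: example ai.vim configuration (values are illustrative)
vim.g.ai_no_mappings = 1       -- disable the default <Ctrl-A> mappings
vim.g.ai_temperature = 0.7     -- take more creative risks
vim.g.ai_indicator_text = "AI" -- fallback for terminals that can't show emoji
vim.g.ai_timeout = 20          -- give up on slow requests sooner
```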
## Tutorial
The most basic use case is completion: press `<Ctrl-A>` in insert mode.
For example:
```typescript
function capitalize (str: string): string {
    (Press <Ctrl-A> here)
}
```
Will result in:
```typescript
function capitalize (str: string): string {
    return str.charAt(0).toUpperCase() + str.slice(1);
}
```
ai.vim isn't just for programming! You can also complete regular human text:
```
Hey Joe, here are some ideas for slogans for the new petshop. Which do you like best?
1. <Ctrl-A>
```
Results in:
```
Hey Joe, here are some ideas for slogans for the new petshop. Which do you like best?
1. "Where Pets Come First!"
2. "Your Pet's Home Away From Home!"
3. "The Best Place for Your Pet!"
4. "The Pet Store That Cares!"
5. "The Pet Store That Loves Your Pet!"
```
You can also generate some text by pressing `<Ctrl-A>` in normal mode and providing a prompt. For
example:
```
:AI write a thank you email to Bigco engineering interviewer
```
Results in something like:
```
Dear [Name],
I wanted to take a moment to thank you for taking the time to interview me for the engineering
position at Bigco. I was very impressed with the company and the team, and I am excited about the
possibility of joining the team.
I appreciate the time you took to explain the role and the company's mission. I am confident that I
have the skills and experience to be a valuable asset to the team.
Once again, thank you for your time and consideration. I look forward to hearing from you soon.
Sincerely,
[Your Name]
```
Besides generating new text, you can also edit existing text using a given instruction.
```css
body {
    color: orange;
    background: green;
}
```
Visually selecting the above CSS and running `:AI convert colors to hex` results in:
```css
body {
    color: #ffa500;
    background: #008000;
}
```
Another example of text editing:
```
List of capitals:
1. Toronto
2. London
3. Honolulu
4. Miami
5. Boston
```
Visually selecting this text and running `:AI sort by population` results in:
```
List of capitals:
1. London
2. Toronto
3. Boston
4. Miami
5. Honolulu
```
You can build your own shortcuts for long and complex prompts. For example:
```vim
vnoremap <silent> <leader>f :AI fix grammar and spelling and replace slang and contractions with a formal academic writing style<CR>
```
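If your config is in Lua, an equivalent mapping can be written with `vim.keymap.set` — a sketch
using the same prompt text as the Vimscript mapping above:

```lua
-- Lua equivalent of the <leader>f mapping above
vim.keymap.set("v", "<leader>f",
    ":AI fix grammar and spelling and replace slang and contractions with a formal academic writing style<CR>",
    { silent = true })
```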
With this custom mapping you can select text that looks like this:
```
Me fail English? That's unpossible!
```
And by pressing `<leader>f` transform it into this:
```
I failed English? That is impossible!
```
If you come up with any exciting ways to use ai.vim, please share what you find!
## Important Disclaimers
**Accuracy**: GPT is good at producing text and code that looks correct at first glance, but may be
completely wrong. Make sure you carefully proofread and test everything this plugin outputs!
**Privacy**: This plugin sends text to OpenAI when generating completions and edits. Don't use it in
files containing sensitive information.
================================================
FILE: doc/ai.txt
================================================
*ai.txt* Plugin for generating and editing text using OpenAI and GPT.
Author: Bruno Garcia <https://github.com/aduros/ai.vim>
==============================================================================
INTRODUCTION
*ai.vim* exposes OpenAI's powerful language processing model to Neovim in a
flexible but easy-to-use plugin.

You'll need an OpenAI account and an API key, which you can generate here:
https://beta.openai.com/account/api-keys

Then set the `$OPENAI_API_KEY` environment variable, for example by adding
it to your `~/.profile`:

    `export OPENAI_API_KEY="sk-abcdefghijklmnopqrstuvwxyz1234567890"`
==============================================================================
USAGE
The *:AI* command is your point of entry to ai.vim. With it you can generate
text using a prompt, complete text at the current position, or edit existing
text in-place.
There is a recommended mapping of *<CTRL-A>* in normal, visual, and insert
modes. This mapping can be disabled by setting *g:ai_no_mappings* to 1.
There are four different behaviors for :AI, depending on whether arguments
are supplied and whether text is visually selected.

:AI {generator prompt}
    Generate some text using the supplied prompt and insert it at the
    cursor position.

    Example:
        :AI write an email to IT asking for a replacement laptop

:AI
    When no prompt is supplied, contextually complete some text to insert
    at the cursor position.

    Example:
        function capitalize (str: string): string {
            `(Press <Ctrl-A> here)`
        }

(with visual selection) :AI {edit instruction}
    With some text visually selected, edit it in-place using the given
    edit instruction.

    Example:
        List of capitals:
        1. Toronto
        2. London
        3. Honolulu
        4. Miami
        5. Boston

        `(Visual select)` :AI sort by population

(with visual selection) :AI
    When no edit instruction is supplied, use the selected text as a
    generator prompt. The generated text will replace the selected text.

    Example:
        Write an academic essay exploring the pros and cons of yodeling
        as a career choice.

        `(Visual select)` :AI
==============================================================================
CUSTOMIZING
*g:ai_completions_model* (default: "gpt-3.5-turbo-instruct")
    The model to use for completions.
    For more info, see https://beta.openai.com/docs/models/overview

    Example: `let g:ai_completions_model="text-ada-001"`

*g:ai_edits_model* (default: "text-davinci-edit-001")
    The model to use for edits.
    For more info, see https://beta.openai.com/docs/models/overview

    Example: `let g:ai_edits_model="code-davinci-edit-001"`

*g:ai_temperature* (default: 0)
    Controls randomness of output, between 0 and 1. Lower values will be
    more deterministic and higher values will take more creative risks.

    Example: `let g:ai_temperature=0.7`

*g:ai_context_before* (default: 20)
    When using |:AI| for contextual completion, how many additional lines
    of text before the cursor to include in the request.

    Example: `let g:ai_context_before=50`

*g:ai_context_after* (default: 20)
    When using |:AI| for contextual completion, how many additional lines
    of text after the cursor to include in the request.

    Example: `let g:ai_context_after=50`

*g:ai_indicator_style* (default: "sign")
    Controls where the progress indicator is displayed. Allowed values:
    - `sign`: Display the indicator in the sign column.
    - `none`: Don't display any indicator.

    Example: `let g:ai_indicator_style="none"`

*g:ai_indicator_text* (default: "🤖")
    The text used to indicate a request in progress. If your terminal
    can't display emojis, you'll probably want to set this.

    Example: `let g:ai_indicator_text="A"`

*g:ai_timeout* (default: 60)
    The maximum time in seconds to wait for OpenAI requests.

    Example: `let g:ai_timeout=20`

*AIIndicator* is the |:highlight| group used to color the indicator text.
    Example: `:highlight AIIndicator ctermbg=red`

*AIHighlight* is the |:highlight| group used to color the text being
processed.
    Example: `:highlight AIHighlight ctermbg=green`
================================================
FILE: lua/_ai/commands.lua
================================================
local M = {}

local openai = require("_ai/openai")
local config = require("_ai/config")
local indicator = require("_ai/indicator")

---@param args { args: string, range: integer }
function M.ai (args)
    local prompt = args.args
    local visual_mode = args.range > 0

    local buffer = vim.api.nvim_get_current_buf()

    local start_row, start_col
    local end_row, end_col

    if visual_mode then
        -- Use the visual selection
        local start_pos = vim.api.nvim_buf_get_mark(buffer, "<")
        start_row = start_pos[1] - 1
        start_col = start_pos[2]
        local end_pos = vim.api.nvim_buf_get_mark(buffer, ">")
        end_row = end_pos[1] - 1
        local line = vim.fn.getline(end_pos[1])
        if line == "" then
            end_col = 0
        else
            end_col = vim.fn.byteidx(line, vim.fn.charcol("'>"))
        end
    else
        -- Use the cursor position
        local start_pos = vim.api.nvim_win_get_cursor(0)
        start_row = start_pos[1] - 1
        local line = vim.fn.getline(start_pos[1])
        if line == "" then
            start_col = 0
        else
            start_col = vim.fn.byteidx(line, vim.fn.charcol("."))
        end
        end_row = start_row
        end_col = start_col
    end

    -- Clamp columns to the actual line lengths
    local start_line_length = vim.api.nvim_buf_get_lines(buffer, start_row, start_row+1, true)[1]:len()
    start_col = math.min(start_col, start_line_length)
    local end_line_length = vim.api.nvim_buf_get_lines(buffer, end_row, end_row+1, true)[1]:len()
    end_col = math.min(end_col, end_line_length)

    local indicator_obj = indicator.create(buffer, start_row, start_col, end_row, end_col)

    local accumulated_text = ""
    local function on_data (data)
        accumulated_text = accumulated_text .. data.choices[1].text
        indicator.set_preview_text(indicator_obj, accumulated_text)
    end

    local function on_complete (err)
        if err then
            vim.api.nvim_err_writeln("ai.vim: " .. err)
        elseif #accumulated_text > 0 then
            indicator.set_buffer_text(indicator_obj, accumulated_text)
        end
        indicator.finish(indicator_obj)
    end

    if visual_mode then
        local selected_text = table.concat(vim.api.nvim_buf_get_text(buffer, start_row, start_col, end_row, end_col, {}), "\n")
        if prompt == "" then
            -- Replace the selected text, also using it as a prompt
            openai.completions({
                prompt = selected_text,
            }, on_data, on_complete)
        else
            -- Edit selected text
            openai.edits({
                input = selected_text,
                instruction = prompt,
            }, on_data, on_complete)
        end
    else
        if prompt == "" then
            -- Insert some text generated using surrounding context
            local prefix = table.concat(vim.api.nvim_buf_get_text(buffer,
                math.max(0, start_row-config.context_before), 0, start_row, start_col, {}), "\n")
            local line_count = vim.api.nvim_buf_line_count(buffer)
            local suffix = table.concat(vim.api.nvim_buf_get_text(buffer,
                end_row, end_col, math.min(end_row+config.context_after, line_count-1), 99999999, {}), "\n")
            openai.completions({
                prompt = prefix,
                suffix = suffix,
            }, on_data, on_complete)
        else
            -- Insert some text generated using the given prompt
            openai.completions({
                prompt = prompt,
            }, on_data, on_complete)
        end
    end
end

return M
return M
================================================
FILE: lua/_ai/config.lua
================================================
local M = {}

---@param name string
---@param default_value unknown
---@return unknown
local function get_var (name, default_value)
    local value = vim.g[name]
    if value == nil then
        return default_value
    end
    return value
end

M.indicator_style = get_var("ai_indicator_style", "sign")
M.indicator_text = get_var("ai_indicator_text", "🤖")
M.completions_model = get_var("ai_completions_model", "gpt-3.5-turbo-instruct")
M.edits_model = get_var("ai_edits_model", "text-davinci-edit-001")
M.temperature = get_var("ai_temperature", 0)
M.context_before = get_var("ai_context_before", 20)
M.context_after = get_var("ai_context_after", 20)
M.timeout = get_var("ai_timeout", 60)

return M
================================================
FILE: lua/_ai/indicator.lua
================================================
local M = {}

local config = require("_ai/config")

---@class Indicator
---@field buffer number
---@field extmark_id number

local ns_id = vim.api.nvim_create_namespace("")

local function get_default_extmark_opts ()
    local extmark_opts = {
        hl_group = "AIHighlight",
        -- right_gravity = false,
        -- end_right_gravity = true,
    }
    if config.indicator_style ~= "none" then
        extmark_opts.sign_text = config.indicator_text
        extmark_opts.sign_hl_group = "AIIndicator"
    end
    return extmark_opts
end

-- Creates a new indicator.
---@param buffer number
---@param start_row number
---@param start_col number
---@param end_row number
---@param end_col number
---@return Indicator
function M.create (buffer, start_row, start_col, end_row, end_col)
    local extmark_opts = get_default_extmark_opts()
    if end_row ~= start_row or end_col ~= start_col then
        extmark_opts.end_row = end_row
        extmark_opts.end_col = end_col
    end
    local extmark_id = vim.api.nvim_buf_set_extmark(buffer, ns_id, start_row, start_col, extmark_opts)
    return {
        buffer = buffer,
        extmark_id = extmark_id,
    }
end

-- Set the preview virtual text to show at this indicator.
---@param indicator Indicator
---@param text string
function M.set_preview_text (indicator, text)
    local extmark = vim.api.nvim_buf_get_extmark_by_id(indicator.buffer, ns_id, indicator.extmark_id, { details = true })
    local start_row = extmark[1]
    local start_col = extmark[2]
    if extmark[3].end_row or extmark[3].end_col then
        return -- We don't support preview text on indicators over a range
    end

    local extmark_opts = get_default_extmark_opts()
    extmark_opts.id = indicator.extmark_id
    extmark_opts.virt_text_pos = "overlay"
    local lines = vim.split(text, "\n")
    extmark_opts.virt_text = {{lines[1], "Comment"}}
    if #lines > 1 then
        extmark_opts.virt_lines = vim.tbl_map(function (line) return {{line, "Comment"}} end, vim.list_slice(lines, 2))
    end
    vim.api.nvim_buf_set_extmark(indicator.buffer, ns_id, start_row, start_col, extmark_opts)
end

-- Sets the in-buffer text at this indicator.
---@param indicator Indicator
---@param text string
function M.set_buffer_text (indicator, text)
    local extmark = vim.api.nvim_buf_get_extmark_by_id(indicator.buffer, ns_id, indicator.extmark_id, { details = true })
    local start_row = extmark[1]
    local start_col = extmark[2]
    -- Point indicators have no end position; fall back to the start
    local end_row = extmark[3].end_row or start_row
    local end_col = extmark[3].end_col or start_col
    local lines = vim.split(text, "\n")
    vim.api.nvim_buf_set_text(indicator.buffer, start_row, start_col, end_row, end_col, lines)
end

---@param indicator Indicator
function M.finish (indicator)
    vim.api.nvim_buf_del_extmark(indicator.buffer, ns_id, indicator.extmark_id)
end

return M
================================================
FILE: lua/_ai/openai.lua
================================================
local M = {}

local config = require("_ai/config")

---@param cmd string
---@param args string[]
---@param on_stdout_chunk fun(chunk: string): nil
---@param on_complete fun(err: string?, output: string?): nil
local function exec (cmd, args, on_stdout_chunk, on_complete)
    local stdout = vim.loop.new_pipe()
    local function on_stdout_read (_, chunk)
        if chunk then
            vim.schedule(function ()
                on_stdout_chunk(chunk)
            end)
        end
    end

    local stderr = vim.loop.new_pipe()
    local stderr_chunks = {}
    local function on_stderr_read (_, chunk)
        if chunk then
            table.insert(stderr_chunks, chunk)
        end
    end

    -- Declared up front so the exit callback can close over `handle`,
    -- and declared local so the error doesn't leak into a global.
    local handle, spawn_error
    handle, spawn_error = vim.loop.spawn(cmd, {
        args = args,
        stdio = {nil, stdout, stderr},
    }, function (code)
        stdout:close()
        stderr:close()
        handle:close()
        vim.schedule(function ()
            if code ~= 0 then
                on_complete(vim.trim(table.concat(stderr_chunks, "")))
            else
                on_complete()
            end
        end)
    end)
    if not handle then
        on_complete(cmd .. " could not be started: " .. spawn_error)
    else
        stdout:read_start(on_stdout_read)
        stderr:read_start(on_stderr_read)
    end
end

local function request (endpoint, body, on_data, on_complete)
    local api_key = os.getenv("OPENAI_API_KEY")
    if not api_key then
        on_complete("$OPENAI_API_KEY environment variable must be set")
        return
    end

    local curl_args = {
        "--silent", "--show-error", "--no-buffer",
        "--max-time", config.timeout,
        "-L", "https://api.openai.com/v1/" .. endpoint,
        "-H", "Authorization: Bearer " .. api_key,
        "-X", "POST", "-H", "Content-Type: application/json",
        "-d", vim.json.encode(body),
    }

    local buffered_chunks = ""
    local function on_stdout_chunk (chunk)
        buffered_chunks = buffered_chunks .. chunk
        -- Extract complete JSON objects from the buffered chunks
        local json_start, json_end = buffered_chunks:find("}\n")
        while json_start do
            local json_str = buffered_chunks:sub(1, json_end)
            buffered_chunks = buffered_chunks:sub(json_end + 1)
            -- Remove the "data: " prefix used by server-sent events
            json_str = json_str:gsub("data: ", "")
            local json = vim.json.decode(json_str)
            if json.error then
                on_complete(json.error.message)
            else
                on_data(json)
            end
            json_start, json_end = buffered_chunks:find("}\n")
        end
    end

    exec("curl", curl_args, on_stdout_chunk, on_complete)
end

---@param body table
---@param on_data fun(data: unknown): nil
---@param on_complete fun(err: string?): nil
function M.completions (body, on_data, on_complete)
    body = vim.tbl_extend("keep", body, {
        model = config.completions_model,
        max_tokens = 2048,
        temperature = config.temperature,
        stream = true,
    })
    request("completions", body, on_data, on_complete)
end

---@param body table
---@param on_data fun(data: unknown): nil
---@param on_complete fun(err: string?): nil
function M.edits (body, on_data, on_complete)
    body = vim.tbl_extend("keep", body, {
        model = config.edits_model,
        temperature = config.temperature,
    })
    request("edits", body, on_data, on_complete)
end

return M
================================================
FILE: plugin/ai.lua
================================================
vim.api.nvim_create_user_command("AI", function (args)
    require("_ai/commands").ai(args)
end, {
    range = true,
    nargs = "*",
})

if not vim.g.ai_no_mappings then
    vim.api.nvim_set_keymap("n", "<C-a>", ":AI ", { noremap = true })
    vim.api.nvim_set_keymap("v", "<C-a>", ":AI ", { noremap = true })
    vim.api.nvim_set_keymap("i", "<C-a>", "<Esc>:AI<CR>a", { noremap = true })
end
================================================
FILE: scripts/test
================================================
#!/bin/sh -e
#
# Runs lua-language-server to lint the entire project.
tmp=`mktemp -d`
trap 'rm -rf -- "$tmp"' EXIT
lua-language-server --check="$PWD" --checklevel=Information --logpath="$tmp"
if [ -f "$tmp/check.json" ]; then
    cat >&2 -- "$tmp/check.json"
    exit 1
fi