[
  {
    "path": ".github/workflows/ci.yml",
    "content": "name: Test\n\non: [push]\n\njobs:\n    test:\n        runs-on: ubuntu-latest\n        steps:\n            - uses: actions/checkout@v3\n\n            - name: Install lua-language-server\n              run: |\n                  lls_dir=`mktemp -d`\n                  gh release download -R sumneko/lua-language-server -p '*-linux-x64.tar.gz' -D \"$lls_dir\"\n                  tar xzf \"$lls_dir\"/* -C \"$lls_dir\"\n                  echo \"$lls_dir/bin\" >> $GITHUB_PATH\n              env:\n                  GH_TOKEN: ${{ github.token }}\n\n            - name: Run tests\n              run: ./scripts/test\n"
  },
  {
    "path": ".gitignore",
    "content": "/doc/tags\n"
  },
  {
    "path": ".luarc.json",
    "content": "{\n    \"$schema\": \"https://raw.githubusercontent.com/sumneko/vscode-lua/master/setting/schema.json\",\n    \"Lua.runtime.version\": \"LuaJIT\",\n    \"Lua.diagnostics.globals\": [ \"vim\" ]\n}\n"
  },
  {
    "path": "LICENSE.txt",
    "content": "Copyright (c) Bruno Garcia\n\nPermission to use, copy, modify, and/or distribute this software for any\npurpose with or without fee is hereby granted, provided that the above\ncopyright notice and this permission notice appear in all copies.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH\nREGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND\nFITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,\nINDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM\nLOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR\nOTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR\nPERFORMANCE OF THIS SOFTWARE.\n"
  },
  {
    "path": "README.md",
    "content": "# 🤖 ai.vim\n\nA minimalist Neovim plugin for generating and editing text using OpenAI and GPT.\n\n## Features\n\n- Complete text in insert mode.\n- Generate new text using a prompt.\n- Select and edit existing text in-place.\n- Streaming support for completions.\n- Easy to use interface. Just hit `<Ctrl-A>` or run `:AI <prompt>`.\n- Works with both source code and regular text.\n\n## Installing\n\nFor vim-plug, add this to your init.vim:\n\n```vim\nPlug 'aduros/ai.vim'\n```\n\nMake sure you have an environment variable called `$OPENAI_API_KEY` which you can [generate\nhere](https://beta.openai.com/account/api-keys). You'll also need `curl` installed.\n\nTo see the full help and customization options, run `:help ai.vim`.\n\n## Tutorial\n\nThe most basic use-case is completion, by pressing `<Ctrl-A>` in insert mode.\n\nFor example:\n\n```typescript\nfunction capitalize (str: string): string {\n    (Press <Ctrl-A> here)\n}\n```\n\nWill result in:\n\n```typescript\nfunction capitalize (str: string): string {\n    return str.charAt(0).toUpperCase() + str.slice(1);\n}\n```\n\nai.vim isn't just for programming! You can also complete regular human text:\n\n```\nHey Joe, here are some ideas for slogans for the new petshop. Which do you like best?\n1. <Ctrl-A>\n```\n\nResults in:\n\n```\nHey Joe, here are some ideas for slogans for the new petshop. Which do you like best?\n1. \"Where Pets Come First!\"\n2. \"Your Pet's Home Away From Home!\"\n3. \"The Best Place for Your Pet!\"\n4. \"The Pet Store That Cares!\"\n5. \"The Pet Store That Loves Your Pet!\"\n```\n\nYou can also generate some text by pressing `<Ctrl-A>` in normal mode and providing a prompt. For\nexample:\n\n```\n:AI write a thank you email to Bigco engineering interviewer\n```\n\nResults in something like:\n\n```\nDear [Name],\n\nI wanted to take a moment to thank you for taking the time to interview me for the engineering\nposition at Bigco. 
I was very impressed with the company and the team, and I am excited about the\npossibility of joining the team.\n\nI appreciate the time you took to explain the role and the company's mission. I am confident that I\nhave the skills and experience to be a valuable asset to the team.\n\nOnce again, thank you for your time and consideration. I look forward to hearing from you soon.\n\nSincerely,\n[Your Name]\n```\n\nBesides generating new text, you can also edit existing text using a given instruction.\n\n```css\nbody {\n    color: orange;\n    background: green;\n}\n```\n\nVisually selecting the above CSS and running `:AI convert colors to hex` results in:\n\n```css\nbody {\n    color: #ffa500;\n    background: #008000;\n}\n```\n\nAnother example of text editing:\n\n```\nList of capitals:\n1. Toronto\n2. London\n3. Honolulu\n4. Miami\n5. Boston\n```\n\nVisually selecting this text and running `:AI sort by population` results in:\n\n```\nList of capitals:\n1. London\n2. Toronto\n3. Boston\n4. Miami\n5. Honolulu\n```\n\nYou can build your own shortcuts for long and complex prompts. For example:\n\n```vim\nvnoremap <silent> <leader>f :AI fix grammar and spelling and replace slang and contractions with a formal academic writing style<CR>\n```\n\nWith this custom mapping you can select text that looks like this:\n\n```\nMe fail English? That's unpossible!\n```\n\nAnd by pressing `<leader>f` transform it into this:\n\n```\nI failed English? That is impossible!\n```\n\nIf you come up with any exciting ways to use ai.vim, please share what you find!\n\n## Important Disclaimers\n\n**Accuracy**: GPT is good at producing text and code that looks correct at first glance, but may be\ncompletely wrong. Make sure you carefully proofread and test everything output by this plugin!\n\n**Privacy**: This plugin sends text to OpenAI when generating completions and edits. Don't use it in\nfiles containing sensitive information.\n"
  },
  {
    "path": "doc/ai.txt",
    "content": "*ai.txt* Plugin for generating and editing text using OpenAI and GPT.\n\nAuthor: Bruno Garcia <https://github.com/aduros/ai.vim>\n\n==============================================================================\nINTRODUCTION\n\n*ai.vim* exposes OpenAI's powerful language processing model to Neovim in a\nflexible but easy to use plugin.\n\nYou'll need an OpenAI account and to generate an API key here:\n\n        https://beta.openai.com/account/api-keys\n\nThen set the `$OPENAI_API_KEY` environment variable, for example, by adding\nit to your `~/.profile`:\n\n        `export OPENAI_API_KEY=\"sk-abcdefghijklmnopqrstuvwxyz1234567890\"`\n\n==============================================================================\nUSAGE\n\nThe *:AI* command is your point of entry to ai.vim. With it you can generate\ntext using a prompt, complete text at the current position, or edit existing\ntext in-place.\n\nThere is a recommended mapping of *<CTRL-A>* in normal, visual, and insert\nmodes. This mapping can be disabled by setting *g:ai_no_mappings* to 1.\n\nThere are 4 different behaviors for :AI based on whether arguments are\nsupplied or text is visually selected.\n\n:AI {generator prompt}\n\n        Generate some text using the supplied prompt and insert it at the\n        cursor position.\n\n        Example:\n\n            :AI write an email to IT asking for a replacement laptop\n\n:AI\n\n        When no prompt is supplied, contextually complete some text to insert\n        at the cursor position.\n\n        Example:\n\n            function capitalize (str: string): string {\n                `(Press <Ctrl-A> here)`\n            }\n\n(with visual selection) :AI {edit instruction}\n\n        With some text visually selected, edit it in-place using the given\n        edit instruction.\n\n        Example:\n\n            List of capitals:\n            1. Toronto\n            2. London\n            3. Honolulu\n            4. Miami\n            5. 
Boston\n\n            `(Visual select)` :AI sort by population\n\n(with visual selection) :AI\n\n        When no edit instruction is supplied, use the selected text as a\n        generator prompt. The generated text will replace the selected text.\n\n        Example:\n\n            Write an academic essay exploring the pros and cons of yodeling as\n            a career choice.\n\n            `(Visual select)` :AI\n\n==============================================================================\nCUSTOMIZING\n\n*g:ai_completions_model* (default: \"gpt-3.5-turbo-instruct\")\n\n        The model to use for completions.\n\n        For more info, see https://beta.openai.com/docs/models/overview\n\n        Example: `let g:ai_completions_model=\"text-ada-001\"`\n\n*g:ai_edits_model* (default: \"text-davinci-edit-001\")\n\n        The model to use for edits.\n\n        For more info, see https://beta.openai.com/docs/models/overview\n\n        Example: `let g:ai_edits_model=\"code-davinci-edit-001\"`\n\n*g:ai_temperature* (default: 0)\n\n        Controls randomness of output, between 0 and 1. Lower values will be\n        more deterministic and higher values will take more creative risks.\n\n        Example: `let g:ai_temperature=0.7`\n\n*g:ai_context_before* (default: 20)\n\n        When using |:AI| for contextual completion, how many additional lines\n        of text before the cursor to include in the request.\n\n        Example: `let g:ai_context_before=50`\n\n*g:ai_context_after* (default: 20)\n\n        When using |:AI| for contextual completion, how many additional lines\n        of text after the cursor to include in the request.\n\n        Example: `let g:ai_context_after=50`\n\n*g:ai_indicator_style* (default: \"sign\")\n\n        Controls where the progress indicator is displayed. 
Allowed values:\n\n            - `sign`: Display indicator in the sign column.\n            - `none`: Don't display any indicator.\n\n        Example: `let g:ai_indicator_style=\"none\"`\n\n*g:ai_indicator_text* (default: \"🤖\")\n\n        The text used to indicate a request in-progress. If your terminal can't\n        display emojis, you'll probably want to set this.\n\n        Example: `let g:ai_indicator_text=\"A\"`\n\n*g:ai_timeout* (default: 60)\n\n        Set the maximum time in seconds to wait for OpenAI requests.\n\n        Example: `let g:ai_timeout=20`\n\n*AIIndicator* is the |:highlight| group to color the indicator text.\n\n        Example: `:highlight AIIndicator ctermbg=red`\n\n*AIHighlight* is the |:highlight| group to color the text being processed.\n\n        Example: `:highlight AIHighlight ctermbg=green`\n"
  },
  {
    "path": "lua/_ai/commands.lua",
    "content": "local M = {}\n\nlocal openai = require(\"_ai/openai\")\nlocal config = require(\"_ai/config\")\nlocal indicator = require(\"_ai/indicator\")\n\n---@param args { args: string, range: integer }\nfunction M.ai (args)\n    local prompt = args.args\n    local visual_mode = args.range > 0\n\n    local buffer = vim.api.nvim_get_current_buf()\n\n    local start_row, start_col\n    local end_row, end_col\n\n    if visual_mode then\n        -- Use the visual selection\n        local start_pos = vim.api.nvim_buf_get_mark(buffer, \"<\")\n        start_row = start_pos[1] - 1\n        start_col = start_pos[2]\n\n        local end_pos = vim.api.nvim_buf_get_mark(buffer, \">\")\n        end_row = end_pos[1] - 1\n        local line = vim.fn.getline(end_pos[1])\n        if line == \"\" then\n            end_col = 0\n        else\n            end_col = vim.fn.byteidx(line, vim.fn.charcol(\"'>\"))\n        end\n\n    else\n        -- Use the cursor position\n        local start_pos = vim.api.nvim_win_get_cursor(0)\n        start_row = start_pos[1] - 1\n        local line = vim.fn.getline(start_pos[1])\n        if line == \"\" then\n            start_col = 0\n        else\n            start_col = vim.fn.byteidx(line, vim.fn.charcol(\".\"))\n        end\n        end_row = start_row\n        end_col = start_col\n    end\n\n    local start_line_length = vim.api.nvim_buf_get_lines(buffer, start_row, start_row+1, true)[1]:len()\n    start_col = math.min(start_col, start_line_length)\n\n    local end_line_length = vim.api.nvim_buf_get_lines(buffer, end_row, end_row+1, true)[1]:len()\n    end_col = math.min(end_col, end_line_length)\n\n    local indicator_obj = indicator.create(buffer, start_row, start_col, end_row, end_col)\n    local accumulated_text = \"\"\n\n    local function on_data (data)\n        accumulated_text = accumulated_text .. 
data.choices[1].text\n        indicator.set_preview_text(indicator_obj, accumulated_text)\n    end\n\n    local function on_complete (err)\n        if err then\n            vim.api.nvim_err_writeln(\"ai.vim: \" .. err)\n        elseif #accumulated_text > 0 then\n            indicator.set_buffer_text(indicator_obj, accumulated_text)\n        end\n        indicator.finish(indicator_obj)\n    end\n\n    if visual_mode then\n        local selected_text = table.concat(vim.api.nvim_buf_get_text(buffer, start_row, start_col, end_row, end_col, {}), \"\\n\")\n        if prompt == \"\" then\n            -- Replace the selected text, also using it as a prompt\n            openai.completions({\n                prompt = selected_text,\n            }, on_data, on_complete)\n        else\n            -- Edit selected text\n            openai.edits({\n                input = selected_text,\n                instruction = prompt,\n            }, on_data, on_complete)\n        end\n    else\n        if prompt == \"\" then\n            -- Insert some text generated using surrounding context\n            local prefix = table.concat(vim.api.nvim_buf_get_text(buffer,\n                math.max(0, start_row-config.context_before), 0, start_row, start_col, {}), \"\\n\")\n\n            local line_count = vim.api.nvim_buf_line_count(buffer)\n            local suffix = table.concat(vim.api.nvim_buf_get_text(buffer,\n                end_row, end_col, math.min(end_row+config.context_after, line_count-1), 99999999, {}), \"\\n\")\n\n            openai.completions({\n                prompt = prefix,\n                suffix = suffix,\n            }, on_data, on_complete)\n        else\n            -- Insert some text generated using the given prompt\n            openai.completions({\n                prompt = prompt,\n            }, on_data, on_complete)\n        end\n    end\nend\n\nreturn M\n"
  },
  {
    "path": "lua/_ai/config.lua",
    "content": "local M = {}\n\n---@param name string\n---@param default_value unknown\n---@return unknown\nlocal function get_var (name, default_value)\n    local value = vim.g[name]\n    if value == nil then\n        return default_value\n    end\n    return value\nend\n\n\nM.indicator_style = get_var(\"ai_indicator_style\", \"sign\")\nM.indicator_text = get_var(\"ai_indicator_text\", \"🤖\")\nM.completions_model = get_var(\"ai_completions_model\", \"gpt-3.5-turbo-instruct\")\nM.edits_model = get_var(\"ai_edits_model\", \"text-davinci-edit-001\")\nM.temperature = get_var(\"ai_temperature\", 0)\nM.context_before = get_var(\"ai_context_before\", 20)\nM.context_after = get_var(\"ai_context_after\", 20)\nM.timeout = get_var(\"ai_timeout\", 60)\n\nreturn M\n"
  },
  {
    "path": "lua/_ai/indicator.lua",
    "content": "local M = {}\n\nlocal config = require(\"_ai/config\")\n\n---@class Indicator\n---@field buffer number\n---@field extmark_id number\n\nlocal ns_id = vim.api.nvim_create_namespace(\"\")\n\nlocal function get_default_extmark_opts ()\n    local extmark_opts = {\n        hl_group = \"AIHighlight\",\n        -- right_gravity = false,\n        -- end_right_gravity = true,\n    }\n\n    if config.indicator_style ~= \"none\" then\n        extmark_opts.sign_text = config.indicator_text\n        extmark_opts.sign_hl_group = \"AIIndicator\"\n    end\n\n    return extmark_opts\nend\n\n-- Creates a new indicator.\n---@param buffer number\n---@param start_row number\n---@param start_col number\n---@param end_row number\n---@param end_col number\n---@return Indicator\nfunction M.create (buffer, start_row, start_col, end_row, end_col)\n    local extmark_opts = get_default_extmark_opts()\n\n    if end_row ~= start_row or end_col ~= start_col then\n        extmark_opts.end_row = end_row\n        extmark_opts.end_col = end_col\n    end\n\n    local extmark_id = vim.api.nvim_buf_set_extmark(buffer, ns_id, start_row, start_col, extmark_opts)\n\n    return {\n        buffer = buffer,\n        extmark_id = extmark_id,\n    }\nend\n\n-- Set the preview virtual text to show at this indicator.\n---@param indicator Indicator\n---@param text string\nfunction M.set_preview_text (indicator, text)\n    local extmark = vim.api.nvim_buf_get_extmark_by_id(indicator.buffer, ns_id, indicator.extmark_id, { details = true })\n    local start_row = extmark[1]\n    local start_col = extmark[2]\n\n    if extmark[3].end_row or extmark[3].end_col then\n        return -- We don't support preview text on indicators over a range\n    end\n\n    local extmark_opts = get_default_extmark_opts()\n    extmark_opts.id = indicator.extmark_id\n    extmark_opts.virt_text_pos = \"overlay\"\n\n    local lines = vim.split(text, \"\\n\")\n    extmark_opts.virt_text = {{lines[1], \"Comment\"}}\n\n    if 
#lines > 1 then\n        extmark_opts.virt_lines = vim.tbl_map(function (line) return {{line, \"Comment\"}} end, vim.list_slice(lines, 2))\n    end\n\n    vim.api.nvim_buf_set_extmark(indicator.buffer, ns_id, start_row, start_col, extmark_opts)\nend\n\n-- Sets the in-buffer text at this indicator.\n---@param indicator Indicator\n---@param text string\nfunction M.set_buffer_text (indicator, text)\n    local extmark = vim.api.nvim_buf_get_extmark_by_id(indicator.buffer, ns_id, indicator.extmark_id, { details = true })\n    local start_row = extmark[1]\n    local start_col = extmark[2]\n\n    local end_row = extmark[3].end_row\n    if not end_row then\n        end_row = start_row\n    end\n\n    local end_col = extmark[3].end_col\n    if not end_col then\n        end_col = start_col\n    end\n\n    local lines = vim.split(text, \"\\n\")\n    vim.api.nvim_buf_set_text(indicator.buffer, start_row, start_col, end_row, end_col, lines)\nend\n\n---@param indicator Indicator\nfunction M.finish (indicator)\n    vim.api.nvim_buf_del_extmark(indicator.buffer, ns_id, indicator.extmark_id)\nend\n\nreturn M\n"
  },
  {
    "path": "lua/_ai/openai.lua",
    "content": "local M = {}\n\nlocal config = require(\"_ai/config\")\n\n---@param cmd string\n---@param args string[]\n---@param on_stdout_chunk fun(chunk: string): nil\n---@param on_complete fun(err: string?, output: string?): nil\nlocal function exec (cmd, args, on_stdout_chunk, on_complete)\n    local stdout = vim.loop.new_pipe()\n    local function on_stdout_read (_, chunk)\n        if chunk then\n            vim.schedule(function ()\n                on_stdout_chunk(chunk)\n            end)\n        end\n    end\n\n    local stderr = vim.loop.new_pipe()\n    local stderr_chunks = {}\n    local function on_stderr_read (_, chunk)\n        if chunk then\n            table.insert(stderr_chunks, chunk)\n        end\n    end\n\n    local handle\n\n    handle, error = vim.loop.spawn(cmd, {\n        args = args,\n        stdio = {nil, stdout, stderr},\n    }, function (code)\n        stdout:close()\n        stderr:close()\n        handle:close()\n\n        vim.schedule(function ()\n            if code ~= 0 then\n                on_complete(vim.trim(table.concat(stderr_chunks, \"\")))\n            else\n                on_complete()\n            end\n        end)\n    end)\n\n    if not handle then\n        on_complete(cmd .. \" could not be started: \" .. error)\n    else\n        stdout:read_start(on_stdout_read)\n        stderr:read_start(on_stderr_read)\n    end\nend\n\nlocal function request (endpoint, body, on_data, on_complete)\n    local api_key = os.getenv(\"OPENAI_API_KEY\")\n    if not api_key then\n        on_complete(\"$OPENAI_API_KEY environment variable must be set\")\n        return\n    end\n\n    local curl_args = {\n        \"--silent\", \"--show-error\", \"--no-buffer\",\n        \"--max-time\", config.timeout,\n        \"-L\", \"https://api.openai.com/v1/\" .. endpoint,\n        \"-H\", \"Authorization: Bearer \" .. 
api_key,\n        \"-X\", \"POST\", \"-H\", \"Content-Type: application/json\",\n        \"-d\", vim.json.encode(body),\n    }\n\n    local buffered_chunks = \"\"\n    local function on_stdout_chunk (chunk)\n        buffered_chunks = buffered_chunks .. chunk\n\n        -- Extract complete JSON objects from the buffered_chunks\n        local json_start, json_end = buffered_chunks:find(\"}\\n\")\n        while json_start do\n            local json_str = buffered_chunks:sub(1, json_end)\n            buffered_chunks = buffered_chunks:sub(json_end + 1)\n\n            -- Strip the leading \"data: \" SSE prefix (anchored, so payload\n            -- text containing \"data: \" is left untouched)\n            json_str = json_str:gsub(\"^%s*data: \", \"\")\n\n            local json = vim.json.decode(json_str)\n            if json.error then\n                on_complete(json.error.message)\n            else\n                on_data(json)\n            end\n\n            json_start, json_end = buffered_chunks:find(\"}\\n\")\n        end\n    end\n\n    exec(\"curl\", curl_args, on_stdout_chunk, on_complete)\nend\n\n---@param body table\n---@param on_data fun(data: unknown): nil\n---@param on_complete fun(err: string?): nil\nfunction M.completions (body, on_data, on_complete)\n    body = vim.tbl_extend(\"keep\", body, {\n        model = config.completions_model,\n        max_tokens = 2048,\n        temperature = config.temperature,\n        stream = true,\n    })\n    request(\"completions\", body, on_data, on_complete)\nend\n\n---@param body table\n---@param on_data fun(data: unknown): nil\n---@param on_complete fun(err: string?): nil\nfunction M.edits (body, on_data, on_complete)\n    body = vim.tbl_extend(\"keep\", body, {\n        model = config.edits_model,\n        temperature = config.temperature,\n    })\n    request(\"edits\", body, on_data, on_complete)\nend\n\nreturn M\n"
  },
  {
    "path": "plugin/ai.lua",
    "content": "vim.api.nvim_create_user_command(\"AI\", function (args)\n    require(\"_ai/commands\").ai(args)\nend, {\n    range = true,\n    nargs = \"*\",\n})\n\nif not vim.g.ai_no_mappings then\n    vim.api.nvim_set_keymap(\"n\", \"<C-a>\", \":AI \", { noremap = true })\n    vim.api.nvim_set_keymap(\"v\", \"<C-a>\", \":AI \", { noremap = true })\n    vim.api.nvim_set_keymap(\"i\", \"<C-a>\", \"<Esc>:AI<CR>a\", { noremap = true })\nend\n"
  },
  {
    "path": "scripts/test",
    "content": "#!/bin/sh -e\n#\n# Runs lua-language-server to lint the entire project.\n\ntmp=`mktemp -d`\ntrap 'rm -rf -- \"$tmp\"' EXIT\n\nlua-language-server --check=\"$PWD\" --checklevel=Information --logpath=\"$tmp\"\n\nif [ -f \"$tmp/check.json\" ]; then\n    cat >&2 -- \"$tmp/check.json\"\n    exit 1\nfi\n"
  }
]