CopilotChat.nvim brings GitHub Copilot Chat capabilities directly into Neovim with a focus on transparency and user control.
Warning
For Neovim < 0.11.0, add `noinsert` or `noselect` to your `completeopt`, otherwise chat autocompletion will not work. For the best autocompletion experience, also add `popup` to your `completeopt` (even on Neovim 0.11.0+).
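For example, a minimal snippet you might add to your config (illustrative only; merge with your existing `completeopt` values, and note `popup` requires a recent Neovim):

```lua
-- Illustrative example: completion options that work with chat autocompletion
vim.opt.completeopt = { 'menu', 'menuone', 'noinsert', 'popup' }
```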
Optional: for accurate token counting, install `tiktoken_core` in one of the following ways:

- install `luajit-tiktoken-bin` or `lua51-tiktoken-bin` from the AUR
- run `sudo luarocks install --lua-version 5.1 tiktoken_core`
- place a prebuilt `tiktoken_core.so` in your Lua path

For various plugin pickers to work correctly, you need to replace `vim.ui.select` with your desired picker (as the default `vim.ui.select` is very basic). Here are some examples:

- fzf-lua: `require('fzf-lua').register_ui_select()`
- telescope: the telescope-ui-select.nvim plugin
- snacks.nvim: its `ui_select` config option
- mini.pick: `vim.ui.select = require('mini.pick').ui_select`
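For instance, a minimal sketch of routing `vim.ui.select` through telescope (assumes telescope.nvim and the telescope-ui-select.nvim extension are installed):

```lua
-- Illustrative sketch: use telescope as the vim.ui.select picker
require('telescope').setup({
  extensions = {
    ['ui-select'] = require('telescope.themes').get_dropdown({}),
  },
})
require('telescope').load_extension('ui-select')
```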
return { { "CopilotC-Nvim/CopilotChat.nvim", dependencies = { { "nvim-lua/plenary.nvim", branch = "master" }, }, build = "make tiktoken", opts = { -- See Configuration section for options }, }, }
With vim-plug:

```vim
call plug#begin()
Plug 'nvim-lua/plenary.nvim'
Plug 'CopilotC-Nvim/CopilotChat.nvim'
call plug#end()

lua << EOF
require("CopilotChat").setup()
EOF
```
- `#<name>` - Add specific content (files, git diffs, URLs) to your prompt
- `@<name>` - Give the LLM access to functions it can call with your approval
- `> <text>` - Sticky prompt that persists context across a single chat session
- `$<model>` - Specify which AI model to use for the chat
- `/PromptName` - Use predefined prompt templates for common tasks

```
# Add specific file to context
#file:src/main.lua

# Give LLM access to workspace tools
@copilot What files are in this project?

# Sticky prompt that persists
> #buffer:current
> You are a helpful coding assistant
```
When you use `@copilot`, the LLM can call functions like `glob`, `file`, `gitdiff`, etc. You'll see the proposed function call and can approve/reject it before execution.
| Command | Description |
| --- | --- |
| `:CopilotChat <input>?` | Open chat with optional input |
| `:CopilotChatOpen` | Open chat window |
| `:CopilotChatClose` | Close chat window |
| `:CopilotChatToggle` | Toggle chat window |
| `:CopilotChatStop` | Stop current output |
| `:CopilotChatReset` | Reset chat window |
| `:CopilotChatSave <name>?` | Save chat history |
| `:CopilotChatLoad <name>?` | Load chat history |
| `:CopilotChatPrompts` | View/select prompt templates |
| `:CopilotChatModels` | View/select available models |
| `:CopilotChat<PromptName>` | Use specific prompt template |

| Insert | Normal | Action |
| --- | --- | --- |
| `<Tab>` | - | Trigger/accept completion menu for tokens |
| `<C-c>` | `q` | Close the chat window |
| `<C-l>` | `<C-l>` | Reset and clear the chat window |
| `<C-s>` | `<CR>` | Submit the current prompt |
| - | `grr` | Toggle sticky prompt for line under cursor |
| - | `grx` | Clear all sticky prompts in prompt |
| `<C-y>` | `<C-y>` | Accept nearest diff |
| - | `gj` | Jump to section of nearest diff |
| - | `gqa` | Add all answers from chat to quickfix list |
| - | `gqd` | Add all diffs from chat to quickfix list |
| - | `gy` | Yank nearest diff to register |
| - | `gd` | Show diff between source and nearest diff |
| - | `gc` | Show info about current chat |
| - | `gh` | Show help message |
Warning
Some plugins (e.g. copilot.vim) may also map common keys like `<Tab>` in insert mode. To avoid conflicts, disable Copilot's default `<Tab>` mapping with:

```lua
vim.g.copilot_no_tab_map = true
vim.keymap.set('i', '<S-Tab>', 'copilot#Accept("\\<S-Tab>")', { expr = true, replace_keycodes = false })
```
You can also customize CopilotChat keymaps in your config.
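For example, a hedged sketch of overriding a couple of chat keymaps in setup(); the mapping names below (`submit_prompt`, `reset`) are assumptions, so check lua/CopilotChat/config.lua for the actual keys:

```lua
-- Sketch only: mapping names are assumed; verify against lua/CopilotChat/config.lua
require("CopilotChat").setup({
  mappings = {
    submit_prompt = { normal = '<CR>', insert = '<C-CR>' },
    reset = { normal = '<C-r>', insert = '<C-r>' },
  },
})
```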
All predefined functions belong to the `copilot` group.

| Function | Description | Example |
| --- | --- | --- |
| `buffer` | Retrieves content from a specific buffer | `#buffer` |
| `buffers` | Fetches content from multiple buffers | `#buffers:visible` |
| `diagnostics` | Collects code diagnostics (errors, warnings) | `#diagnostics:current` |
| `file` | Reads content from a specified file path | `#file:path/to/file` |
| `gitdiff` | Retrieves git diff information | `#gitdiff:staged` |
| `gitstatus` | Retrieves git status information | `#gitstatus` |
| `glob` | Lists filenames matching a pattern in workspace | `#glob:**/*.lua` |
| `grep` | Searches for a pattern across files in workspace | `#grep:TODO` |
| `quickfix` | Includes content of files in quickfix list | `#quickfix` |
| `register` | Provides access to specified Vim register | `#register:+` |
| `url` | Fetches content from a specified URL | `#url:https://...` |
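Resources from this table can be combined freely in a single prompt; an illustrative example:

```
#buffers:visible
#diagnostics:current Explain these diagnostics and suggest fixes
```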
| Prompt | Description |
| --- | --- |
| `Explain` | Write detailed explanation of selected code as paragraphs |
| `Review` | Comprehensive code review with line-specific issue reporting |
| `Fix` | Identify problems and rewrite code with fixes and explanation |
| `Optimize` | Improve performance and readability with optimization strategy |
| `Docs` | Add documentation comments to selected code |
| `Tests` | Generate tests for selected code |
| `Commit` | Generate commit message with commitizen convention from staged changes |
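For example, you can visually select some code and run the matching command (such as `:CopilotChatExplain`), or invoke the template directly in the chat window (illustrative):

```
#file:src/main.lua
/Explain
```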
For all available configuration options, see lua/CopilotChat/config.lua.
Most users only need to configure a few options:
```lua
{
  model = 'gpt-4.1',        -- AI model to use
  temperature = 0.1,        -- Lower = focused, higher = creative
  window = {
    layout = 'vertical',    -- 'vertical', 'horizontal', 'float'
    width = 0.5,            -- 50% of screen width
  },
  auto_insert_mode = true,  -- Enter insert mode when opening
}
```
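These options can be passed via your plugin manager's opts table or directly to setup(), for example:

```lua
-- Minimal example: pass options directly to setup()
require("CopilotChat").setup({
  model = 'gpt-4.1',
  window = { layout = 'float' },
})
```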
```lua
{
  window = {
    layout = 'float',
    width = 80,            -- Fixed width in columns
    height = 20,           -- Fixed height in rows
    border = 'rounded',    -- 'single', 'double', 'rounded', 'solid'
    title = '🤖 AI Assistant',
    zindex = 100,          -- Ensure window stays on top
  },
  headers = {
    user = '👤 You: ',
    assistant = '🤖 Copilot: ',
    tool = '🔧 Tool: ',
  },
  separator = '━━',
  show_folds = false,      -- Disable folding for cleaner look
}
```
```lua
-- Auto-command to customize chat buffer behavior
vim.api.nvim_create_autocmd('BufEnter', {
  pattern = 'copilot-*',
  callback = function()
    vim.opt_local.relativenumber = false
    vim.opt_local.number = false
    vim.opt_local.conceallevel = 0
  end,
})
```
You can customize colors by setting highlight groups in your config:
```lua
-- In your colorscheme or init.lua
vim.api.nvim_set_hl(0, 'CopilotChatHeader', { fg = '#7C3AED', bold = true })
vim.api.nvim_set_hl(0, 'CopilotChatSeparator', { fg = '#374151' })
```
Types of copilot highlights:
- `CopilotChatHeader` - Header highlight in chat buffer
- `CopilotChatSeparator` - Separator highlight in chat buffer
- `CopilotChatStatus` - Status and spinner in chat buffer
- `CopilotChatHelp` - Help text in chat buffer
- `CopilotChatResource` - Resource highlight in chat buffer (e.g. `#file`, `#gitdiff`)
- `CopilotChatTool` - Tool call highlight in chat buffer (e.g. `@copilot`)
- `CopilotChatPrompt` - Prompt highlight in chat buffer (e.g. `/Explain`, `/Review`)
- `CopilotChatModel` - Model highlight in chat buffer (e.g. `$gpt-4.1`)
- `CopilotChatUri` - URI highlight in chat buffer (e.g. `##https://...`)
- `CopilotChatSelection` - Selection highlight in source buffer
- `CopilotChatAnnotation` - Annotation highlight in chat buffer (file headers, tool call headers, tool call body)

Define your own prompts in the configuration:
```lua
{
  prompts = {
    MyCustomPrompt = {
      prompt = 'Explain how it works.',
      system_prompt = 'You are very good at explaining stuff',
      mapping = '<leader>ccmc',
      description = 'My custom prompt description',
    },
    Yarrr = {
      system_prompt = 'You are fascinated by pirates, so please respond in pirate speak.',
    },
    NiceInstructions = {
      system_prompt = 'You are a nice coding tutor, so please respond in a friendly and helpful manner.',
    },
  },
}
```
Define your own functions in the configuration with input handling and schema:
```lua
{
  functions = {
    birthday = {
      description = "Retrieves birthday information for a person",
      uri = "birthday://{name}",
      schema = {
        type = 'object',
        required = { 'name' },
        properties = {
          name = {
            type = 'string',
            enum = { 'Alice', 'Bob', 'Charlie' },
            description = "Person's name",
          },
        },
      },
      resolve = function(input)
        return {
          {
            uri = 'birthday://' .. input.name,
            mimetype = 'text/plain',
            data = input.name .. ' birthday info',
          },
        }
      end,
    },
  },
}
```
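Once defined, the function can be referenced in chat like the built-in resources. A hypothetical invocation following the same `#name:input` pattern shown in the functions table might look like:

```
#birthday:Alice
When is her birthday?
```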
Control what content is automatically included:
```lua
{
  -- Use visual selection, fallback to current line
  selection = function(source)
    return require('CopilotChat.select').visual(source)
        or require('CopilotChat.select').line(source)
  end,
}
```
Available selections:
- `require('CopilotChat.select').visual` - Current visual selection
- `require('CopilotChat.select').buffer` - Entire buffer content
- `require('CopilotChat.select').line` - Current line content
- `require('CopilotChat.select').unnamed` - Unnamed register (last deleted/changed/yanked)

Add custom AI providers:
```lua
{
  providers = {
    my_provider = {
      get_url = function(opts)
        return "https://api.example.com/chat"
      end,
      get_headers = function()
        return { ["Authorization"] = "Bearer " .. api_key }
      end,
      get_models = function()
        return { { id = "gpt-4.1", name = "GPT-4.1 model" } }
      end,
      prepare_input = require('CopilotChat.config.providers').copilot.prepare_input,
      prepare_output = require('CopilotChat.config.providers').copilot.prepare_output,
    },
  },
}
```
Provider Interface:
```lua
{
  -- Optional: Disable provider
  disabled?: boolean,

  -- Optional: Extra info about the provider displayed in info panel
  get_info?(): string[],

  -- Optional: Get extra request headers with optional expiration time
  get_headers?(): table<string,string>, number?,

  -- Optional: Get API endpoint URL
  get_url?(opts: CopilotChat.Provider.options): string,

  -- Optional: Prepare request input
  prepare_input?(inputs: table<CopilotChat.Provider.input>, opts: CopilotChat.Provider.options): table,

  -- Optional: Prepare response output
  prepare_output?(output: table, opts: CopilotChat.Provider.options): CopilotChat.Provider.output,

  -- Optional: Get available models
  get_models?(headers: table): table<CopilotChat.Provider.model>,
}
```
Built-in providers:
- `copilot` - GitHub Copilot (default)
- `github_models` - GitHub Marketplace models (disabled by default)

Core chat API:

```lua
local chat = require("CopilotChat")

-- Basic Chat Functions
chat.ask(prompt, config)     -- Ask a question with optional config
chat.response()              -- Get the last response text
chat.resolve_prompt()        -- Resolve prompt references
chat.resolve_functions()     -- Resolve functions that are available for automatic use by LLM (WARN: async, requires plenary.async.run)
chat.resolve_model()         -- Resolve model from prompt (WARN: async, requires plenary.async.run)

-- Window Management
chat.open(config)            -- Open chat window with optional config
chat.close()                 -- Close chat window
chat.toggle(config)          -- Toggle chat window visibility with optional config
chat.reset()                 -- Reset the chat
chat.stop()                  -- Stop current output

-- Source Management
chat.get_source()            -- Get the current source buffer and window
chat.set_source(winnr)       -- Set the source window

-- Selection Management
chat.get_selection()         -- Get the current selection
chat.set_selection(bufnr, start_line, end_line, clear)  -- Set or clear selection

-- Prompt & Model Management
chat.select_prompt(config)   -- Open prompt selector with optional config
chat.select_model()          -- Open model selector

-- History Management
chat.load(name, history_path)  -- Load chat history
chat.save(name, history_path)  -- Save chat history

-- Configuration
chat.setup(config)           -- Update configuration
chat.log_level(level)        -- Set log level (debug, info, etc.)
```
You can also access the chat window UI methods through the `chat.chat` object:
local window = require("CopilotChat").chat -- Chat UI State window:visible() -- Check if chat window is visible window:focused() -- Check if chat window is focused -- Message Management window:get_message(role, cursor) -- Get chat message by role, either last or closest to cursor window:add_message({ role, content }, replace) -- Add or replace a message in chat window:remove_message(role, cursor) -- Remove chat message by role, either last or closest to cursor window:get_block(role, cursor) -- Get code block by role, either last or closest to cursor window:add_sticky(sticky) -- Add sticky prompt to chat message -- Content Management window:append(text) -- Append text to chat window window:clear() -- Clear chat window content window:start() -- Start writing to chat window window:finish() -- Finish writing to chat window -- Navigation window:follow() -- Move cursor to end of chat content window:focus() -- Focus the chat window -- Advanced Features window:overlay(opts) -- Show overlay with specified options
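As an illustration, a small sketch that scripts a message into the chat window using the methods above (the message fields follow the signatures listed; treat this as a sketch, not canonical usage):

```lua
-- Sketch: open the chat and append a scripted assistant message
local copilot_chat = require("CopilotChat")
copilot_chat.open()
copilot_chat.chat:add_message({ role = 'assistant', content = 'Loaded my debugging notes.' })
copilot_chat.chat:follow()  -- move cursor to the end of the chat content
```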
```lua
-- Open chat, ask a question and handle response
require("CopilotChat").open()
require("CopilotChat").ask("#buffer Explain this code", {
  callback = function(response)
    vim.notify("Got response: " .. response:sub(1, 50) .. "...")
    return response
  end,
})

-- Save and load chat history
require("CopilotChat").save("my_debugging_session")
require("CopilotChat").load("my_debugging_session")

-- Use custom sticky and model
require("CopilotChat").ask("How can I optimize this?", {
  model = "gpt-4.1",
  sticky = { "#buffer", "#gitdiff:staged" },
})
```
For more examples, see the examples wiki page.
To set up the environment:
```sh
git clone https://github.com/CopilotC-Nvim/CopilotChat.nvim
cd CopilotChat.nvim
```
For how to run tests and for detailed contribution guidelines, see CONTRIBUTING.md.
Thanks goes to these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind are welcome!