In this tutorial we’ll take a look at how to add LLM (Mistral AI) capabilities to neovim (since I don’t want to use Cursor). After we install a couple of prerequisite plugins, fonts and a terminal emulator, we’ll check out two plugins - CodeCompanion and Minuet. The full init.lua config can be found in my GitHub gists.
AI Provider - Mistral
So, which of the AI providers to choose? I found this comprehensive comparison, and then realized that I’d like to support a European company, so the pick was simple. I’ve chosen the one that made it into the benchmarks: Mistral.
We can start with a free tier and it’s one of the cheaper providers out there. It still hits good values on the benchmarks, especially the multi-language one, which might come in handy one day.
As I don’t want to create a fine-tuned model for myself, I’d like to pick one of the vanilla options provided. The models I am interested in are:
Mistral Large: Their flagship beefy universal model. If I don’t know what to pick, I am picking this one.
Codestral: Model dedicated to coding.
Mistral Small: Small model priced at 0.1€/0.3€ per 1M input/output tokens.
And the name - La Plateforme. So French, love it. I can say I’ve been working with French models these past couple of days; I hope I can explain that to my wife too.
So we need a proper API key. There are actually two options here - you can either use a La Plateforme key targeting api.mistral.ai, or a dedicated Codestral API key targeting codestral.mistral.ai. We’ll be using both in this example.
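If you export the keys as environment variables, plugins can pick them up without hard-coding secrets in init.lua. The variable names here are my assumption - check what your plugin config actually reads:

```shell
# Hypothetical variable names - use whatever your plugin config expects.
export MISTRAL_API_KEY="your-la-plateforme-key"   # key for api.mistral.ai
export CODESTRAL_API_KEY="your-codestral-key"     # key for codestral.mistral.ai
```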
Neovim Plugins and Prerequisites
I am a vim user, so I’ll ignore all of the advances in the other editors and wrestle with this and the Lua mess it implies. I think it’s high time I switched to neovim.
I’ll share the full init.lua file (yes, I’ve put it all into one file). Don’t worry, it probably won’t work for you. But hey, debugging is part of the fun and a rite of passage for (neo)vim users.
Lazy.nvim: Package Manager
I opted for the lazy.nvim route (as opposed to the luarocks route). Lazy.nvim is a “package manager” for nvim plugins, so let’s install the requirements, follow the single-file setup tutorial, and we’re set.
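For reference, the single-file setup boils down to the standard bootstrap snippet from the lazy.nvim docs:

```lua
-- Bootstrap lazy.nvim: clone it on first start, then add it to the runtime path.
local lazypath = vim.fn.stdpath("data") .. "/lazy/lazy.nvim"
if not (vim.uv or vim.loop).fs_stat(lazypath) then
  vim.fn.system({
    "git", "clone", "--filter=blob:none",
    "https://github.com/folke/lazy.nvim.git", "--branch=stable", lazypath,
  })
end
vim.opt.rtp:prepend(lazypath)

require("lazy").setup({
  -- plugin specs go here
})
```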
Lualine: Extensible Status Line
Original vim is old and constrained. If you want more information about the editor state, you’ll reach for a status line extension. So in neovim, I’ve chosen Lualine. I have little to say here, it just worked.
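A minimal spec really is enough - roughly this, assuming you also want the dev-icons dependency for the Nerd Font glyphs:

```lua
{
  'nvim-lualine/lualine.nvim',
  dependencies = { 'nvim-tree/nvim-web-devicons' }, -- icons require a Nerd Font
  config = function()
    require('lualine').setup({ options = { theme = 'auto' } })
  end,
}
```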
Blink.cmp: Completion Frontend
Since Debian stable ships neovim 0.10.4, I need a completion frontend extension. After a single coin toss I opted for blink.cmp. You might hit this error:
```
Detected an out of date or missing fuzzy matching library. Can't download from github due to not being on a git tag and no `fuzzy.prebuilt_binaries.force_version` is set.
```
It requires some fiddling with the config; in the end I’ve set these:
```lua
{
  'Saghen/blink.cmp',
  version = '*',
  build = 'cargo build --release',
}
```
Building from source this way requires Rust (at least rustc and cargo). I’ve installed them just in case.
Fira Code: Nerd Font (with Icons)
There is also a whole rabbit hole dedicated to nerd-fonts. They contain icons and ligatures used by many of the more hip Neovim plugins. In the end, I’ve picked the Fira Code font.
It’s not without issues, check if your terminal supports it. Mine didn’t, so I’ve tried Ghostty. Honestly, I might switch completely to it, I find it surprisingly intuitive.
Markview: Markdown previews
Since markdown is the de facto standard for LLM outputs, I was excited to get better markdown viewing capabilities from markview.nvim. To use it fully, you’ll need to include the treesitter plugin and then install all of the required parsers with :TSInstall markdown markdown_inline html latex typst yaml.
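The lazy.nvim spec for this is short; a sketch (markview’s docs recommend not lazy-loading it):

```lua
{
  'OXY2DEV/markview.nvim',
  lazy = false, -- markview recommends loading eagerly
  dependencies = { 'nvim-treesitter/nvim-treesitter' },
}
```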
Tokyonight: Color Scheme
And since I am doing all this work, the result should contain some pretty colors as well. In the end I’ve settled on the TokyoNight theme. Just make sure that it’s loaded BEFORE markview.
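With lazy.nvim, the usual way to guarantee a colorscheme loads before everything else is a high priority value - a sketch:

```lua
{
  'folke/tokyonight.nvim',
  lazy = false,
  priority = 1000, -- load before other start plugins, e.g. markview
  config = function()
    vim.cmd.colorscheme('tokyonight')
  end,
}
```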
CodeCompanion Plugin
The field is evolving at a rapid pace, so this article might be outdated in a week. But currently, I really like CodeCompanion - sources, docs.
It supports plenty of models and providers, including Mistral AI.
You can create a chat window using `:CodeCompanionChat`. The chat can read your buffer (just pass `#buffer`). Then you can also use tools like `@editor`, `@files` and `@cmd_line` to work with your files in a more agentic way. Continuing this way, there’s the `@full_stack_dev` tool for maximum vibing (check the docs).
Sadly, one thing really holds it back: the lack of fill-in-the-middle support for inline suggestions. Also, beware that the default model is mistral-small, which is not the brightest of the bunch (easily fixable in the config).
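Swapping the model is a matter of extending the adapter. A sketch, assuming the adapter is named mistral and the key lives in the MISTRAL_API_KEY environment variable (check the CodeCompanion docs for the exact names):

```lua
require('codecompanion').setup({
  adapters = {
    mistral = function()
      -- Extend the built-in adapter: point it at the env var with the key
      -- and bump the default model from mistral-small to mistral-large.
      return require('codecompanion.adapters').extend('mistral', {
        env = { api_key = 'MISTRAL_API_KEY' },
        schema = { model = { default = 'mistral-large-latest' } },
      })
    end,
  },
  strategies = {
    chat = { adapter = 'mistral' },   -- use Mistral for the chat buffer
    inline = { adapter = 'mistral' }, -- and for inline edits
  },
})
```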
Despite my best efforts, I was unable to get markview to work in the codecompanion buffer. I’ll wait for new versions and try again.
Minuet Plugin
The Minuet plugin is a simple interface to many different AI providers. It supports fill-in-the-middle and it also supports our provider.
After installing the dependencies above, we can move on to the Minuet install - I’ve included the full config, default values and all, so that it’s easier for me to try out different options. You can get away with a much shorter config. Finally, since I have lualine, I’ve decided to include Minuet’s request notifications in the status line.
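For completeness, the core of my Minuet setup looks roughly like this - the codestral provider options and the lualine component follow Minuet’s README, but treat the exact names as assumptions to verify:

```lua
require('minuet').setup({
  provider = 'codestral',
  provider_options = {
    codestral = {
      model = 'codestral-latest',
      end_point = 'https://codestral.mistral.ai/v1/fim/completions',
      api_key = 'CODESTRAL_API_KEY', -- name of the env var, not the key itself
    },
  },
})

-- Lualine: show Minuet's request status as a status-line component.
require('lualine').setup({
  sections = {
    lualine_x = { require('minuet.lualine'), 'encoding', 'filetype' },
  },
})
```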
Demo
It works! I mean… the setup, not the AI-generated code. But to be fair, it got only one test wrong.
Try it out yourself!
Of course, feel free to choose another provider, editor or plugin. This is my setup; there are many like it, but this one is mine. You can find the full init.lua config in my GitHub gists - there might be some updates. I’ll write about my opinions on this whole LLM vibing in another post.