In this part, we’ll be looking at the Model Context Protocol (MCP). We’ll cover its features (tools, resources, prompts, roots, sampling), how it works, and how to communicate with it (both locally and remotely).
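To make that feature list a little more concrete before diving in, here's a minimal sketch of an MCP server exposing one tool and one resource. It assumes the official MCP Python SDK (the `mcp` package); the server name, tool, and resource are placeholders.

```python
# Minimal MCP server sketch using the official Python SDK
# (pip install "mcp"). The name "demo" and the tool/resource
# below are arbitrary examples.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

@mcp.resource("greeting://{name}")
def greeting(name: str) -> str:
    """A templated resource the client can read."""
    return f"Hello, {name}!"

if __name__ == "__main__":
    # Default stdio transport: a local client launches this
    # script as a subprocess and talks JSON-RPC over stdin/stdout.
    mcp.run()
```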
In this tutorial we’ll take a look at how to add LLM (Mistral AI) capabilities to Neovim (since I don’t want to use Cursor).
Let's run some local models on a spare old GPU. Using Ollama and OpenWebUI, we can enjoy plenty of pleasant features without breaking a sweat.
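As a taste of how little sweat is involved, here's a sketch of querying a locally served model over Ollama's HTTP API. It assumes Ollama is running on its default port (11434) and that a model has already been pulled; the model name is just an example.

```python
# Query a local Ollama model over its HTTP API.
# Assumes `ollama serve` is running and the model was pulled,
# e.g. with `ollama pull llama3` (example model name).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Why is the sky blue?",
        "stream": False,  # one JSON object instead of a token stream
    },
    timeout=120,
)
print(resp.json()["response"])
```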