If you're using AI for coding or prose in Neovim, don't forget to try out <> and <>! Both work seamlessly with local models via an OpenAI-compatible API, e.g., `llama-server` from llama.cpp <3 [#llm](https://mastodon.social/tags/llm) [#ai](https://mastodon.social/tags/ai) [#neovim](https://mastodon.social/tags/neovim) [#vim](https://mastodon.social/tags/vim) [#copilot](https://mastodon.social/tags/copilot)
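
For anyone curious what "OpenAI-compatible" buys you here, a minimal sketch of talking to a local `llama-server` from Python (assuming it's running on its default port 8080; the model name and prompt are placeholders, and the plugins above do the equivalent by just pointing their base URL at the same endpoint):

```python
# Minimal sketch: query a local llama-server through its OpenAI-compatible API.
# Assumes llama-server is already running, e.g. on the default port 8080.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # llama-server's OpenAI-compatible endpoint
    api_key="sk-no-key-required",         # key is ignored unless --api-key was set
)

response = client.chat.completions.create(
    model="local-model",  # llama-server serves whichever model it was started with
    messages=[{"role": "user", "content": "Write a haiku about Neovim."}],
)
print(response.choices[0].message.content)
```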