LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
[Question] Is it possible to use Ollama as a backend? #9
Hi, first of all - thank you for an awesome project :)
I have a question, though - the docs say that when the llama.cpp backend is used, the language server links to it directly, but I'm not sure: is it possible to use an Ollama instance serving on the local network instead?
Thanks!
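For context, Ollama exposes an OpenAI-compatible HTTP API (by default on port 11434, at `/v1/chat/completions`), so one plausible route is pointing an OpenAI-style backend at that endpoint. The sketch below illustrates the idea; the exact key names (`memory`, `models`, `type`, `chat_endpoint`, etc.) are assumptions about LSP-AI's configuration schema, not confirmed settings - check the project's docs for the actual format:

```json
{
  "memory": { "file_store": {} },
  "models": {
    "model1": {
      "type": "open_ai",
      "chat_endpoint": "http://192.168.1.50:11434/v1/chat/completions",
      "model": "llama3",
      "auth_token": "ollama"
    }
  }
}
```

Here `192.168.1.50` stands in for the LAN address of the machine running Ollama, `llama3` for whatever model it has pulled, and the auth token is a placeholder since a local Ollama instance typically doesn't require one.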