Unless this has been discarded already for some reason, would you consider supporting the llama.cpp server (https://github.com/ggerganov/llama.cpp/tree/master/examples/server)?
There is already an Emacs client:
https://github.com/kurnevsky/llama-cpp.el
I know the server lives under llama.cpp's "examples" directory, so that might be a reason to reject this idea, but I wanted to at least ask!
Certainly possible, I'll look into it!
I've added a provider for llama.cpp in the latest commit - let me know if it works for you. I'll make a release with it soon (I'd prefer to wait until you've tested it, but I'll probably release it either way over the weekend).
Oh, fantastic - I'll give it a shot!
Closed: draxil closed this 11 months ago.
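For reference, talking to the llama.cpp server from any client boils down to a single HTTP call. The sketch below is a minimal, hedged example in Python (not the Emacs Lisp a real provider would use): it targets the server's `/completion` endpoint and reads the `content` field of the JSON reply, following the server example's README at the time of writing. The default base URL `http://localhost:8080` and the `n_predict` parameter are assumptions; adjust them for your setup or llama.cpp version.

```python
import json
from urllib import request

def build_completion_request(prompt, n_predict=64, base_url="http://localhost:8080"):
    """Build a POST request for the llama.cpp server's /completion endpoint.

    The endpoint path and JSON fields ("prompt", "n_predict") follow the
    server example's README; they may differ across llama.cpp versions.
    """
    body = json.dumps({"prompt": prompt, "n_predict": n_predict}).encode("utf-8")
    return request.Request(
        url=base_url + "/completion",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def complete(prompt, **kwargs):
    # Send the request to a locally running `./server` instance and return
    # the generated text, which the server reports in the "content" field.
    with request.urlopen(build_completion_request(prompt, **kwargs)) as resp:
        return json.loads(resp.read())["content"]
```

A provider for an Emacs package would do the same thing with `url-retrieve` or `plz`, which is essentially what llama-cpp.el already implements.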