Closed anonimitoraf closed 3 months ago
My first thought was that perhaps the Ollama API changed, but there's no mention of that in the release notes. Could you try the following?

```elisp
(setq gptel-log-level 'debug)
```

Then try running just the Curl command (from the log) and let me know what the output is. Nothing has changed on gptel's side as far as Ollama is concerned, so I think it's most likely a connection issue.
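For reference, a direct request to Ollama's HTTP API (the kind of Curl sanity check suggested above) might look like the following sketch. The port is Ollama's default; the model name `mistral` is a placeholder assumption — substitute whatever model you have pulled:

```shell
# Query the local Ollama API directly, bypassing gptel entirely.
# Assumptions: default port 11434, a pulled model named "mistral".
curl -s --max-time 5 http://127.0.0.1:11434/api/generate \
  -d '{"model": "mistral", "prompt": "hello", "stream": false}' \
  || echo "Ollama not reachable on 127.0.0.1:11434"
```

A JSON response here confirms the server is reachable and parsing requests, which would point the finger at the connection between gptel and Ollama rather than at either program.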
Also thanks for noticing the issue naming convention and prefixing the title with "(ollama)"!
As a data point, I am running Ollama 0.1.30 on macOS 14.3.1 (installed via Homebrew) with various models, and it generally works with gptel (i.e. I do not get the showstopper error shown above).
Oh man, I realized that my ollama instance was running on the localhost (127.0.0.1) interface but I was trying to access it via a different one. :facepalm:
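For anyone hitting the same thing: Ollama binds to 127.0.0.1:11434 by default, so requests arriving on any other interface are refused. A sketch of one fix, assuming Ollama is started manually (the exact mechanism differs if it runs as a Homebrew service or systemd unit):

```shell
# Make Ollama listen on all interfaces instead of only loopback.
# OLLAMA_HOST is the environment variable controlling the bind address.
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

Alternatively, point the gptel backend's `:host` at the address Ollama is actually bound to.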
Hi @karthink, I was using this package with `ollama` just fine until I had to upgrade my `ollama` binary to the newest version (v0.1.30, to fix some error). Now I'm getting:

![image](https://github.com/karthink/gptel/assets/15933322/cc5ca05c-6aa2-4580-8a4c-d76365d0969c)

```
Response Error: nil, Ollama error (nil): Malformed JSON in response.
```

Running `ollama` in my terminal works fine. Here's my config:

My `gptel--known-backends` is:

Here's all I see in the `*gpt-log*` file:

Let me know if I can provide more helpful info. Thanks!