karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

Wrong type argument: stringp, response-buffer #190

Open safijari opened 5 months ago

safijari commented 5 months ago

I get this error:

Send your query with C-c RET!
Querying Ollama...
error in process filter: apply: Wrong type argument: stringp, response-buffer
error in process filter: Wrong type argument: stringp, response-buffer

Sometimes when I uninstall and reinstall gptel the error goes away, but it comes back after restarting Emacs. I have no idea what's causing it; any help would be appreciated. Here's the backtrace:

  #f(compiled-function (backend response info) #<bytecode 0x1815cff28a0a718c>)(#s(gptel-ollama :name "Ollama" :host "192.168.50.55:11434" :header nil :protocol "http" :stream t :endpoint "/api/generate" :key nil :models ("dolphin-mistral:latest") :url "http://192.168.50.55:11434/api/generate") (:model "dolphin-mistral:latest" :created_at "2024-01-20T03:08:28.820842377Z" :response "Hello! How can I assist you today?" :done t :context [32000 6574 13 1976 460 264 2475 3842 2229 3687 297 2929 23416 304 264 10865 13892 28723 1992 19571 3078 11973 28723 2 13 32000 1838 13 5365 2 13 32000 489 11143 13 16230 28808 1602 541 315 6031 368 3154 28804] :total_duration 420573444 :load_duration 1001895 :prompt_eval_duration 99912000 :eval_count 9 :eval_duration 314931000) (:buffer response-buffer))
  apply(#f(compiled-function (backend response info) #<bytecode 0x1815cff28a0a718c>) #s(gptel-ollama :name "Ollama" :host "192.168.50.55:11434" :header nil :protocol "http" :stream t :endpoint "/api/generate" :key nil :models ("dolphin-mistral:latest") :url "http://192.168.50.55:11434/api/generate") ((:model "dolphin-mistral:latest" :created_at "2024-01-20T03:08:28.820842377Z" :response "Hello! How can I assist you today?" :done t :context [32000 6574 13 1976 460 264 2475 3842 2229 3687 297 2929 23416 304 264 10865 13892 28723 1992 19571 3078 11973 28723 2 13 32000 1838 13 5365 2 13 32000 489 11143 13 16230 28808 1602 541 315 6031 368 3154 28804] :total_duration 420573444 :load_duration 1001895 :prompt_eval_duration 99912000 :eval_count 9 :eval_duration 314931000) (:buffer response-buffer)))
  gptel--parse-response(#s(gptel-ollama :name "Ollama" :host "192.168.50.55:11434" :header nil :protocol "http" :stream t :endpoint "/api/generate" :key nil :models ("dolphin-mistral:latest") :url "http://192.168.50.55:11434/api/generate") (:model "dolphin-mistral:latest" :created_at "2024-01-20T03:08:28.820842377Z" :response "Hello! How can I assist you today?" :done t :context [32000 6574 13 1976 460 264 2475 3842 2229 3687 297 2929 23416 304 264 10865 13892 28723 1992 19571 3078 11973 28723 2 13 32000 1838 13 5365 2 13 32000 489 11143 13 16230 28808 1602 541 315 6031 368 3154 28804] :total_duration 420573444 :load_duration 1001895 :prompt_eval_duration 99912000 :eval_count 9 :eval_duration 314931000) (:buffer response-buffer))
  gptel--url-parse-response(#s(gptel-ollama :name "Ollama" :host "192.168.50.55:11434" :header nil :protocol "http" :stream t :endpoint "/api/generate" :key nil :models ("dolphin-mistral:latest") :url "http://192.168.50.55:11434/api/generate") #<buffer  *http 192.168.50.55:11434*-913249>)
  #f(compiled-function (_) #<bytecode 0x1f2db347e9eef560>)(nil)
  url-http-activate-callback()
  url-http-content-length-after-change-function(119 605 486)
  url-http-wait-for-headers-change-function(1 610 609)
  url-http-generic-filter(#<process 192.168.50.55> "HTTP/1.1 200 OK\15\nContent-Type: application/json; c...")
karthink commented 5 months ago
  1. Do you have Curl available on your system?
  2. Has Ollama ever worked for you correctly using gptel?
  3. If it has, did it produce a streaming response or just insert the whole response at once after a delay?

Please also provide your Emacs version, gptel commit, and OS (Windows/Mac/Linux). (If you installed it using package-install and don't know the gptel commit, it's enough to tell me when you installed/updated it.)

g-simmons commented 1 week ago

TL;DR: I encountered this error trying to use Ollama with an old version of gptel; upgrading to 0.9.0 and reinstalling made the error go away.

Background: I had never had Ollama working in gptel before; I just set up Ollama today. I have been using gptel with OpenAI and Anthropic models for months (thanks a ton btw, it's been great).

My model config (the IP address is an address on my local network):

  (gptel-make-ollama "ollama-llama3"    ;Any name of your choosing
    :host "100.xxx.xxx.xxx:11434"       ;Where it's running
    :stream t                           ;Stream responses
    :models '("llama3"))                ;List of models

Emacs version: I am running brew-installed emacs-mac:

==> railwaycat/emacsmacport/emacs-mac: stable emacs-29.1-mac-10.0, HEAD

Tests with Ollama/gptel: Same error as above on gptel 0.8.5.

I upgraded to the most recent commit on master (a834adbcba) and got a Wrong type argument: stringp, nil error.

Then I downgraded to commit 4c0583b, since it looked like the official 0.9.0 release. After resetting gptel with package-install-file, the error went away and I can generate responses successfully.

Other tests: Testing with cURL works fine:

curl 100.xxx.xxx.xxx:11434/api/generate -d '{"model":"llama3","prompt":"Why is the sky blue?","stream":false}'
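
Note that the curl test above uses "stream":false, while my gptel config uses :stream t. If it helps, the streaming path can be exercised the same way (assuming Ollama's documented /api/generate behavior, which returns newline-delimited JSON chunks when streaming):

  # Streaming variant of the same request (returns newline-delimited JSON chunks)
  curl 100.xxx.xxx.xxx:11434/api/generate -d '{"model":"llama3","prompt":"Why is the sky blue?","stream":true}'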

I also tested gptel using chatgpt-3.5-turbo; that works fine too.

karthink commented 1 week ago

I upgraded to the most recent commit on master (a834adbcba) and got a Wrong type argument: stringp, nil error.

Then I downgraded to commit 4c0583b, since it looked like the official 0.9.0 release. After resetting gptel with package-install-file, the error went away and I can generate responses successfully.

That's odd. Is the takeaway that the 0.9.0 release (4c0583b) works for you but the latest master commit (a834adbcba) doesn't?

No Ollama code has been touched since the release, so this is surprising.

g-simmons commented 1 week ago

@karthink not sure! I agree that does seem surprising.

I have had intermittent issues where something in gptel gets borked and reinstalling the package fixes it, so I'm not 100% confident. I'll let you know if I have time to do a more thorough repro.