karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

No API key response error for local instance of ollama #163

Closed by roblem 6 months ago

roblem commented 6 months ago

This occurs with local instances of both llama.cpp and ollama, so perhaps I am missing something obvious. My ollama setup:

(use-package gptel)

(gptel-make-ollama
 "Ollama"                               ;Any name of your choosing
 :host "localhost:11434"                ;Where it's running
 :models '("mistral:latest")            ;Installed models
 :stream t)                             ;Stream responses

(setq-default gptel-backend (gptel-make-openai "Ollama")
              gptel-model   "mistral:latest")

And this is the result when I gptel-send:

ChatGPT response error: (((HTTP/2 401) invalid_request_error) You didn't provide an API key. You need to provide your API key in an Authorization header using Bearer auth (i.e. Authorization: Bearer YOUR_KEY), or as the password field (with blank username) if you're accessing the API from your browser and are prompted for a username and password. You can obtain an API key from https://platform.openai.com/account/api-keys.) nil
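The error message hints at the cause: the `setq-default` form in the config above discards the Ollama backend and installs a fresh backend built by `gptel-make-openai`, which targets the OpenAI API by default and therefore expects an API key. A minimal sketch of the distinction, reusing the host and model from the config above:

```
;; Builds an OpenAI-API backend merely *named* "Ollama"; it still talks
;; to the OpenAI endpoint and expects an Authorization: Bearer <key>
;; header, which produces the 401 above.
(gptel-make-openai "Ollama")

;; Builds an Ollama backend that talks to the local server; no key needed.
(gptel-make-ollama "Ollama"
 :host "localhost:11434"
 :models '("mistral:latest")
 :stream t)
```

Note that `gptel-backend` holds the backend object itself, so the value assigned to it must be the return value of the matching `gptel-make-*` constructor; sharing a display name with an earlier definition is not enough.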

From a bash prompt,

curl http://localhost:11434/api/generate -d '{
  "model": "mistral:latest",
  "prompt":"Why is the sky blue?"
}'

gives expected results.

karthink commented 6 months ago

Your `setq-default` creates a new OpenAI backend with `gptel-make-openai` instead of using the Ollama backend you defined. Set `gptel-backend` to the return value of `gptel-make-ollama` directly:
(use-package gptel)

(setq-default gptel-backend
              (gptel-make-ollama
               "Ollama"                    ;Any name of your choosing
               :host "localhost:11434"     ;Where it's running
               :models '("mistral:latest") ;Installed models
               :stream t)                  ;Stream responses
              gptel-model   "mistral:latest")
roblem commented 6 months ago

That works. Thanks.