karthink / gptel

A simple LLM client for Emacs
GNU General Public License v3.0

Ollama not working #179

Open awptechnologies opened 5 months ago

awptechnologies commented 5 months ago

I use Elpaca, so I installed gptel with

(use-package gptel)

It installs, and I can open it inside Emacs. But as soon as I add the config for hosts and models, my Emacs falls apart.

How do I enable my local Ollama so I can use my local models?

karthink commented 5 months ago

Could you paste the exact config you are using for Ollama?

awptechnologies commented 5 months ago

#+BEGIN_SRC emacs-lisp
(use-package gptel)

(gptel-make-ollama "Ollama"         ;Any name of your choosing
  :host "my-ip:11434"               ;Where it's running
  :models '("mistral:latest")       ;Installed models
  :stream t)                        ;Stream responses
#+END_SRC

I've also tried adding the optional config from the README. If I just run the use-package form, it installs perfectly, but I want to use my local Ollama running on my servers, not the ChatGPT API.

karthink commented 5 months ago
(use-package gptel
  :config
  (setq-default gptel-backend
                (gptel-make-ollama
                 "Ollama"
                 :host "my-ip:11434"
                 :models '("mistral:latest")
                 :stream t)
                gptel-model "mistral:latest"))

^ Let me know if this doesn't work as expected. (Calling gptel-make-ollama only defines the backend; to use it by default you also have to set gptel-backend, as above.)

awptechnologies commented 5 months ago

It works, but I get an error:

error in process filter: Search failed: "^{"

awptechnologies commented 5 months ago

Also, how can I use this to interact more efficiently? When I send it a prompt and it responds, the cursor doesn't automatically move down, so I have to move it manually to the line after the response.

awptechnologies commented 5 months ago

Also, I have a reverse proxy running my Ollama backend at https://ollamaapi.my.domain.co. How do I use this as the host instead? The port is mapped by the reverse proxy. Do I have to set it to https mode or something?

karthink commented 5 months ago

Have you looked at the README? Most of these questions are addressed there.

It works, but I get an error: error in process filter: Search failed: "^{"

Is this when trying to access your ollama instance that's behind the proxy, or one on localhost?

When I send it a prompt and it responds, the cursor doesn't automatically move down, so I have to move it manually to the line after the response

FAQ 1, FAQ 2
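
For reference, roughly what those FAQ entries suggest (a minimal sketch; the hook and function names are as found in recent gptel and may differ across versions, so check the README for yours):

;; Scroll the window automatically as the response streams in,
;; and move the cursor past the response once it is complete.
(add-hook 'gptel-post-stream-hook #'gptel-auto-scroll)
(add-hook 'gptel-post-response-functions #'gptel-end-of-response)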

Also, I have a reverse proxy running my Ollama backend at https://ollamaapi.my.domain.co/. How do I use this as the host instead? ... Do I have to set it to https mode or something?

Yes, you'll have to specify https and the correct port. Check the documentation for gptel-make-ollama.

awptechnologies commented 5 months ago

I get the error when accessing it from the IP with the port number.

It doesn't work at all with the reverse proxy: I get a 301 error when trying to use the reverse proxy domain. I tried it with and without the port, as well as with http and https. The reason is that I have a rule in HAProxy that automatically redirects http to https. Maybe I should set the port to 443, since that's where the reverse proxy listens, and the backend forwards it to the server on the Ollama port.

Where is that documentation? Can you link it? I tried to add the protocol to the config you sent me and it is invalid.

How would I add the protocol to this?

(use-package gptel
  :config
  (setq-default gptel-backend
                (gptel-make-ollama
                 "Ollama"
                 :host "my-ip:11434"
                 :models '("mistral:latest")
                 :stream t)
                gptel-model "mistral:latest"))

karthink commented 5 months ago

How would I add the protocol to this?

Please look at the documentation of gptel-make-ollama. You would add the argument :protocol "https" to the call to gptel-make-ollama.

Where is that documentation? Can you link it?

C-h f gptel-make-ollama in Emacs.

It doesn't work at all with the reverse proxy: I get a 301 error when trying to use the reverse proxy domain.

Unless you've explicitly set the outward-facing port for the reverse proxy, it should be available at 443 when using https. (If you're using https, Curl defaults to port 443, so you don't need to specify the port at all, but it won't hurt to.)
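
Putting the above together, a sketch for the reverse-proxy setup (the domain is taken from your earlier message; the port is omitted since https defaults to 443):

(use-package gptel
  :config
  (setq-default gptel-backend
                (gptel-make-ollama
                 "Ollama"
                 :host "ollamaapi.my.domain.co" ;no port needed over https
                 :protocol "https"              ;talk to the proxy over TLS
                 :models '("mistral:latest")
                 :stream t)
                gptel-model "mistral:latest"))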

awptechnologies commented 5 months ago

Okay, great, that has worked for the reverse proxy. I'm still getting the error

error in process filter: Search failed: "^{"

even with the reverse proxy working. Also, is there a way to make the cursor move automatically as the response arrives?

awptechnologies commented 5 months ago

Okay, so I added the config for auto-scroll and such. Now the only issue is the error: I'm still getting it, but everything seems to be working correctly.

karthink commented 5 months ago

error in process filter: Search failed: "^{"

  1. Run (setq gptel--debug t)
  2. Try to use Ollama
  3. A buffer will pop up containing the response. Please paste the contents of that buffer here

You can place the output here inside triple backticks ``` to format it correctly.

rookiejet commented 2 months ago

I get a similar message, error in process filter: Search failed: "^{", while otherwise receiving output correctly from a local Ollama server. I should also note this is on the Windows port of Emacs v29.3, and it's using the Windows built-in curl.exe, if that helps.

{
  "gptel": "request headers",
  "timestamp": "2024-04-25 08:54:17"
}
{
  "Content-Type": "application/json"
}
{
  "gptel": "request body",
  "timestamp": "2024-04-25 08:54:17"
}
{
  "model": "llama3:latest",
  "messages": [
    {
      "role": "system",
      "content": "You are a large language model living in Emacs and a helpful assistant. Respond concisely."
    },
    {
      "role": "user",
      "content": "What do you know about the game GrimGrimoire?"
    }
  ],
  "stream": true,
  "options": {
    "temperature": 1.0
  }
}
{"gptel": "request Curl command", "timestamp": "2024-04-25 08:54:17"}
"curl" \
"--disable" \
"--location" \
"--silent" \
"-XPOST" \
"-y300" \
"-Y1" \
"-D-" \
^"-w^(1a6a5349d333812fe725e44a1cf4e7b4 . ^%{size_header}^)^" \
^"-d{\^"model\^":\^"llama3:latest\^",\^"messages\^":[{\^"role\^":\^"system\^",\^"content\^":\^"You are a large language model living in Emacs and a helpful assistant. Respond concisely.\^"},{\^"role\^":\^"user\^",\^"content\^":\^"What do you know about the game GrimGrimoire?\^"}],\^"stream\^":true,\^"options\^":{\^"temperature\^":1.0}}^" \
"-HContent-Type: application/json" \
"http://localhost:11434/api/chat"
{
  "gptel": "response headers",
  "timestamp": "2024-04-25 08:54:18"
}
"HTTP/1.1 200 OK\nContent-Type: application/x-ndjson\nDate: Thu, 25 Apr 2024 12:54:17 GMT\nTransfer-Encoding: chunked\n"
{
  "gptel": "response body",
  "timestamp": "2024-04-25 08:54:18"
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.2642082Z",
  "message": {
    "role": "assistant",
    "content": "I"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.2797198Z",
  "message": {
    "role": "assistant",
    "content": "'m"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.2955392Z",
  "message": {
    "role": "assistant",
    "content": " familiar"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.3115782Z",
  "message": {
    "role": "assistant",
    "content": " with"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.3273111Z",
  "message": {
    "role": "assistant",
    "content": " Grim"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.3427346Z",
  "message": {
    "role": "assistant",
    "content": "G"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.3596977Z",
  "message": {
    "role": "assistant",
    "content": "rim"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.3745926Z",
  "message": {
    "role": "assistant",
    "content": "oire"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.3910053Z",
  "message": {
    "role": "assistant",
    "content": "!"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.4063676Z",
  "message": {
    "role": "assistant",
    "content": " It"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.4219322Z",
  "message": {
    "role": "assistant",
    "content": "'s"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.4369143Z",
  "message": {
    "role": "assistant",
    "content": " a"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.453134Z",
  "message": {
    "role": "assistant",
    "content": " deck"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.4683618Z",
  "message": {
    "role": "assistant",
    "content": "-building"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.484691Z",
  "message": {
    "role": "assistant",
    "content": " card"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.4992819Z",
  "message": {
    "role": "assistant",
    "content": " game"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.5147736Z",
  "message": {
    "role": "assistant",
    "content": " set"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.5313593Z",
  "message": {
    "role": "assistant",
    "content": " in"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.5465848Z",
  "message": {
    "role": "assistant",
    "content": " a"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.5623072Z",
  "message": {
    "role": "assistant",
    "content": " go"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.5773721Z",
  "message": {
    "role": "assistant",
    "content": "thic"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.5937751Z",
  "message": {
    "role": "assistant",
    "content": " world"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.608785Z",
  "message": {
    "role": "assistant",
    "content": ","
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.6243331Z",
  "message": {
    "role": "assistant",
    "content": " where"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.640146Z",
  "message": {
    "role": "assistant",
    "content": " players"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.6550869Z",
  "message": {
    "role": "assistant",
    "content": " take"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.6707252Z",
  "message": {
    "role": "assistant",
    "content": " on"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.6864983Z",
  "message": {
    "role": "assistant",
    "content": " the"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.7014484Z",
  "message": {
    "role": "assistant",
    "content": " role"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.7168979Z",
  "message": {
    "role": "assistant",
    "content": " of"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.732516Z",
  "message": {
    "role": "assistant",
    "content": " sor"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.7478125Z",
  "message": {
    "role": "assistant",
    "content": "cer"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.7633118Z",
  "message": {
    "role": "assistant",
    "content": "ers"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.7788672Z",
  "message": {
    "role": "assistant",
    "content": " competing"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.7953185Z",
  "message": {
    "role": "assistant",
    "content": " to"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.8100304Z",
  "message": {
    "role": "assistant",
    "content": " create"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.8259918Z",
  "message": {
    "role": "assistant",
    "content": " the"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.8418812Z",
  "message": {
    "role": "assistant",
    "content": " most"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.8578414Z",
  "message": {
    "role": "assistant",
    "content": " impressive"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.8729997Z",
  "message": {
    "role": "assistant",
    "content": " gr"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.8884498Z",
  "message": {
    "role": "assistant",
    "content": "imo"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.9039661Z",
  "message": {
    "role": "assistant",
    "content": "ire"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.9185896Z",
  "message": {
    "role": "assistant",
    "content": "."
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.9345937Z",
  "message": {
    "role": "assistant",
    "content": " The"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.9499352Z",
  "message": {
    "role": "assistant",
    "content": " game"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.9652082Z",
  "message": {
    "role": "assistant",
    "content": " features"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.9800119Z",
  "message": {
    "role": "assistant",
    "content": " a"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:17.9953133Z",
  "message": {
    "role": "assistant",
    "content": " unique"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.0113686Z",
  "message": {
    "role": "assistant",
    "content": " mechanism"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.0274942Z",
  "message": {
    "role": "assistant",
    "content": " for"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.0442264Z",
  "message": {
    "role": "assistant",
    "content": " building"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.060342Z",
  "message": {
    "role": "assistant",
    "content": " and"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.0759463Z",
  "message": {
    "role": "assistant",
    "content": " custom"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.0913467Z",
  "message": {
    "role": "assistant",
    "content": "izing"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.1075485Z",
  "message": {
    "role": "assistant",
    "content": " decks"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.1229087Z",
  "message": {
    "role": "assistant",
    "content": ","
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.1389264Z",
  "message": {
    "role": "assistant",
    "content": " allowing"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.154955Z",
  "message": {
    "role": "assistant",
    "content": " for"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.1700058Z",
  "message": {
    "role": "assistant",
    "content": " a"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.1850163Z",
  "message": {
    "role": "assistant",
    "content": " high"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.2011366Z",
  "message": {
    "role": "assistant",
    "content": " degree"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.2161471Z",
  "message": {
    "role": "assistant",
    "content": " of"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.2321928Z",
  "message": {
    "role": "assistant",
    "content": " replay"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.2483195Z",
  "message": {
    "role": "assistant",
    "content": "ability"
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.2629351Z",
  "message": {
    "role": "assistant",
    "content": "."
  },
  "done": false
}
{
  "model": "llama3:latest",
  "created_at": "2024-04-25T12:54:18.2792192Z",
  "message": {
    "role": "assistant",
    "content": ""
  },
  "done": true,
  "total_duration": 1035976600,
  "load_duration": 1770200,
  "prompt_eval_duration": 18050000,
  "eval_count": 66,
  "eval_duration": 1014484000
}
yhzhoucs commented 2 months ago

I somehow figured out the code causing the problem. In the source file gptel-ollama.el, line 48:

(re-search-forward "^{")

Since the error has no effect on the result, I just added the optional arguments to suppress it:

(re-search-forward "^{" nil t)

After changing the aforementioned line in gptel-ollama.el and running M-x byte-compile-file on it, the error message disappeared.
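
For anyone curious why that works (a minimal illustration, not gptel's actual parsing code): re-search-forward signals an error when the pattern isn't found, unless its third argument, NOERROR, is non-nil, in which case it simply returns nil.

;; Default: a failed search signals search-failed, which surfaces as
;; "error in process filter" when it happens inside a process filter.
(with-temp-buffer
  (insert "no JSON object here")
  (goto-char (point-min))
  (re-search-forward "^{"))       ;signals (search-failed "^{")

;; With BOUND nil and NOERROR t, failure returns nil quietly
;; and point stays where it was.
(with-temp-buffer
  (insert "no JSON object here")
  (goto-char (point-min))
  (re-search-forward "^{" nil t)) ;=> nil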

karthink commented 2 months ago

@yhzhoucs Thanks for that! Should be fixed for everyone now.