omagdy7 / ollama-logseq

Logseq plugin to integrate with Ollama
MIT License

All Ollama commands return "undefined" #14

Status: Closed (mrcn closed this issue 5 months ago)

mrcn commented 5 months ago

Steps to Reproduce / Actual Behavior

See attached video.

  1. Open a terminal, type: ollama run mistral-openorca:latest
  2. Open LogSeq
  3. Type /ollama and select one of the command options
  4. See "undefined" returned in the document body, and the toast returns "CoudIn't fulfull request make sure that ollama service is running and make sure there is no typo in host or model name"

Expected Behavior

  1. I expected Ollama to return some generated text.

Video of issue https://github.com/omagdy7/ollama-logseq/assets/5453273/be2c1cfe-716c-45f6-8945-5c3f30c59a93

Specs

  * OS: macOS Sonoma 14.2.1 (23C71)
  * Logseq: 0.10.3 (77)
  * Ollama: 0.1.19
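
For anyone debugging the same toast, a quick way to rule Logseq out is to call the Ollama HTTP API directly. A minimal TypeScript sketch (run with Node 18+; it assumes the default host localhost:11434 and the model name from step 1 above):

```typescript
// Minimal direct call to Ollama's /api/generate, bypassing Logseq entirely.
// If this prints a 200 and some text, the service itself is healthy and the
// problem is likely in the plugin's configuration.
const res = await fetch("http://localhost:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "mistral-openorca:latest", // the model pulled in step 1
    prompt: "Say hello",
    stream: false, // return a single JSON object instead of a stream
  }),
});
console.log(res.status, await res.text());
```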

omagdy7 commented 5 months ago

I couldn't reproduce the issue. Did you make sure that Ollama runs normally without Logseq?

mrcn commented 5 months ago

> I couldn't reproduce the issue. Did you make sure that Ollama runs normally without Logseq?

Yes, indeed. The model runs fine on its own.

Thought

I have been running a port blocker; let me see if that is interfering. UPDATE: I tried again with the port blocker deactivated and have the same issue.

More Info re Error

Maybe this helps a little...

Here is what is shown in the terminal when I "serve" the model and try to access it from Logseq using the plugin:

```
[GIN] 2024/01/12 - 11:48:32 | 404 | 269.126µs | 127.0.0.1 | POST "/api/generate"
```

And here is what the console returns:

```
index-623a6c59.js:94  POST http://localhost:11434/api/generate 404 (Not Found)
Rr @ index-623a6c59.js:94
Vb @ index-623a6c59.js:94
await in Vb (async)
s @ index-623a6c59.js:97
w @ index-623a6c59.js:91
onKeyDown @ index-623a6c59.js:91
h0 @ index-623a6c59.js:48
g0 @ index-623a6c59.js:48
y0 @ index-623a6c59.js:48
Uf @ index-623a6c59.js:48
rh @ index-623a6c59.js:48
(anonymous) @ index-623a6c59.js:48
wd @ index-623a6c59.js:51
$m @ index-623a6c59.js:48
eu @ index-623a6c59.js:48
Gc @ index-623a6c59.js:48
L0 @ index-623a6c59.js:48
```

```
index-623a6c59.js:94 ERROR:
Error: Error in Ollama request: Not Found
    at Rr (index-623a6c59.js:94:11313)
    at async Vb (index-623a6c59.js:94:12097)
Rr @ index-623a6c59.js:94
await in Rr (async)
Vb @ index-623a6c59.js:94
await in Vb (async)
s @ index-623a6c59.js:97
w @ index-623a6c59.js:91
onKeyDown @ index-623a6c59.js:91
h0 @ index-623a6c59.js:48
g0 @ index-623a6c59.js:48
y0 @ index-623a6c59.js:48
Uf @ index-623a6c59.js:48
rh @ index-623a6c59.js:48
(anonymous) @ index-623a6c59.js:48
wd @ index-623a6c59.js:51
$m @ index-623a6c59.js:48
eu @ index-623a6c59.js:48
Gc @ index-623a6c59.js:48
L0 @ index-623a6c59.js:48
```
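
The 404 on POST /api/generate is the key clue: Ollama responds with 404 Not Found when the model named in the request body is not installed on the server, so the plugin is reaching Ollama but asking for a model the server does not have. A short sketch to see what the server actually has, using Ollama's /api/tags endpoint (which lists installed models):

```typescript
// List the models the local Ollama server has installed, to compare
// against the model name the plugin is configured to send.
const res = await fetch("http://localhost:11434/api/tags");
const data = (await res.json()) as { models: { name: string }[] };
console.log(data.models.map((m) => m.name)); // e.g. ["mistral-openorca:latest"]
```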
omagdy7 commented 5 months ago

Okay, I think I see what the problem is: you need to change your model name in the settings. By default the plugin uses mistral:instruct, and you are using mistral-openorca:latest. Make sure to change the plugin settings to reflect the model you want to use.
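
For what it's worth, a hypothetical pre-flight check along these lines (not the plugin's actual code) would turn the bare 404 into a readable error whenever the configured model name does not match anything installed:

```typescript
// Hypothetical sketch (not the plugin's actual code): verify the configured
// model exists on the Ollama host before calling /api/generate, so a typo or
// a missing pull surfaces as a clear message instead of "undefined".
async function assertModelExists(host: string, model: string): Promise<void> {
  const res = await fetch(`http://${host}/api/tags`);
  const { models } = (await res.json()) as { models: { name: string }[] };
  if (!models.some((m) => m.name === model)) {
    const installed = models.map((m) => m.name).join(", ");
    throw new Error(`Model "${model}" not found on ${host}. Installed: ${installed}`);
  }
}

// Usage: await assertModelExists("localhost:11434", "mistral-openorca:latest");
```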

mrcn commented 5 months ago

> Okay, I think I see what the problem is: you need to change your model name in the settings. By default the plugin uses mistral:instruct, and you are using mistral-openorca:latest. Make sure to change the plugin settings to reflect the model you want to use.

This worked, thank you very much!

I thought this might be the issue when I first ran into the error, so I kept reverting to the default model used in the Ollama setup README, which is Llama2. That didn't work.

You might want to add something about this in your README. Since I'm new to Logseq, I had never even seen the plugin settings! Thanks again.