raycast / extensions

Everything you need to extend Raycast.
https://developers.raycast.com

[Ollama AI] Chat with Ollama, error parsing chunked response with remote Ollama Server #12495

Open · marinsokol opened this issue 1 month ago

marinsokol commented 1 month ago

Extension

https://www.raycast.com/massimiliano_pasquini/raycast-ollama

Raycast Version

1.74.1

macOS Version

14.4.1

Description

I have Ollama running on my home server through Docker and a Cloudflare tunnel. I have OpenUI running on the same server with access to Ollama, and everything works fine there. I installed the extension and pointed it at the same Ollama instance, but the extension gets stuck in a loading state.

I forked the extension to get access to the logs and figured out the issue. My Ollama API returns the chat in chunks, as shown below in OllamaApiChat, so JSON.parse is not able to parse it.

'{"model":"phi3:latest","created_at":"2024-05-21T14:53:33.113658111Z","message":{"role":"assistant","content":"Pro"},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:53:33.215823834Z","message":{"role":"assistant","content":"x"},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:53:33.315599404Z","message":{"role":"assistant","content":"m"},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:53:33.41595048Z","message":{"role":"assistant","content":"ox"},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:53:33.517010353Z","message":{"role":"assistant","content":" V"},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:53:33.616885537Z","message":{"role":"assistant","content":"E"},"done":fals'
':false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:54:22.902896385Z","message":{"role":"assistant","content":" ("},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:54:23.014813451Z","message":{"role":"assistant","content":"Virtual"},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:54:23.123241456Z","message":{"role":"assistant","content":" Environment"},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:54:23.224345231Z","message":{"role":"assistant","content":")"},"done":false}\n' +
    '{"model":"phi3:latest","created_at":"2024-05-21T14:54:23.325555309Z","message":{"role":"assistant","content":" is"},"done":false}\n'

I am a developer with experience building Raycast extensions, so I can work on fixing this. I just wanted to check first: have you come across this issue before?
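
For illustration, a fix could buffer partial data between chunks and only parse complete newline-terminated objects. Here is a minimal sketch in TypeScript; the helper name and types are hypothetical, not the extension's actual code:

```typescript
// Minimal sketch of a chunk-tolerant NDJSON parser (hypothetical helper,
// not the extension's actual code). It buffers partial data between chunks
// and only calls JSON.parse on complete, newline-terminated objects.

interface OllamaChatChunk {
  model: string;
  created_at: string;
  message: { role: string; content: string };
  done: boolean;
}

function createNdjsonParser(onChunk: (chunk: OllamaChatChunk) => void) {
  let buffer = "";
  return (data: string): void => {
    buffer += data;
    const lines = buffer.split("\n");
    // The last element is either "" (the chunk ended on a newline) or an
    // incomplete object; keep it in the buffer until the next chunk arrives.
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (line.trim().length === 0) continue;
      onChunk(JSON.parse(line) as OllamaChatChunk);
    }
  };
}

// Usage: feed each raw chunk from the streaming HTTP response. Note that the
// second chunk starts mid-object, exactly like the log above.
const feed = createNdjsonParser((c) => process.stdout.write(c.message.content));
feed('{"model":"phi3:latest","created_at":"t0","message":{"role":"assistant","content":"Pro"},"done":fal');
feed('se}\n{"model":"phi3:latest","created_at":"t1","message":{"role":"assistant","content":"x"},"done":false}\n');
```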

Steps To Reproduce

  1. Add the URL of an Ollama instance hosted on a separate server (see the snippet after these steps to query the server directly).
  2. Try the "Chat with Ollama" command.
  3. It gets stuck in a loading state.
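
To verify the remote server's streaming behavior independently of the extension, a quick check along these lines can help; the hostname and model below are placeholders (Ollama's /api/chat endpoint streams NDJSON by default):

```typescript
// Quick check that a remote Ollama server streams NDJSON from /api/chat.
// "https://ollama.example.com" and "phi3:latest" are placeholders.
const res = await fetch("https://ollama.example.com/api/chat", {
  method: "POST",
  body: JSON.stringify({
    model: "phi3:latest",
    messages: [{ role: "user", content: "Hello" }],
  }),
});
const reader = res.body!.getReader();
const decoder = new TextDecoder();
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  // Each decoded piece may contain several NDJSON lines, or a partial one.
  process.stdout.write(decoder.decode(value, { stream: true }));
}
```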

Current Behaviour

No response

Expected Behaviour

No response

raycastbot commented 1 month ago

Thank you for opening this issue!

🔔 @MassimilianoPasquini97 you might want to have a look.

💡 Author and Contributors commands

The author and contributors of `massimiliano_pasquini/raycast-ollama` can trigger bot actions by commenting:

- `@raycastbot close this issue`: Closes the issue.
- `@raycastbot rename this issue to "Awesome new title"`: Renames the issue.
- `@raycastbot reopen this issue`: Reopens the issue.
- `@raycastbot assign me`: Assigns yourself to the issue.
- `@raycastbot good first issue`: Adds the "Good first issue" label to the issue.
MassimilianoPasquini97 commented 1 month ago

Hi @marinsokol, I'm currently working on a big refactoring of the entire codebase with some new features, and I'm almost done. One of those features permits configuring more than one Ollama server, and the last time I tried a remote one it worked. I haven't tried it with a Cloudflare tunnel, but it was exposed locally through an nginx reverse proxy.

Keep an eye on MassimilianoPasquini97/raycast_ollama in the coming days for this new release.

What version of Ollama are you using? The latest one?

MassimilianoPasquini97 commented 1 month ago

@raycastbot assign me

MassimilianoPasquini97 commented 1 month ago

@raycastbot rename this issue to "[Ollama AI] Chat with Ollama, error parsing chunked response with remote Ollama Server"

marinsokol commented 1 month ago

@MassimilianoPasquini97 I am using the latest Ollama (0.1.38). I don't think the Cloudflare tunnel is the issue here; connecting to the same server directly (using its IP) resulted in the same issue.

MassimilianoPasquini97 commented 1 month ago

@marinsokol I have finished the refactoring of the code, and today I tried it with a remote Ollama server and it works. Can you try the latest version from MassimilianoPasquini97/raycast_ollama?

marinsokol commented 3 weeks ago

@MassimilianoPasquini97 it is still not working for me. I have the same issue as before.

MassimilianoPasquini97 commented 3 weeks ago

@marinsokol I don't know how to reproduce the error. I tried with an Ollama server installed on Windows with an RTX 4080 and it works.

MassimilianoPasquini97 commented 3 weeks ago

@raycastbot assign me

marinsokol commented 3 weeks ago

Did you try testing using Ollama running in Docker?
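
For reference, the standard CPU-only invocation from Ollama's Docker documentation is shown below; adjust the volume and port mapping as needed:

```shell
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```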