Open marinsokol opened 1 month ago
Thank you for opening this issue!
🔔 @MassimilianoPasquini97 you might want to have a look.
Hi @marinsokol, I'm currently working on a big refactoring of the entire codebase with some new features, and I'm almost done. One of them allows configuring more than one Ollama server, and the last time I tried a remote one it worked. I haven't tried with a Cloudflare tunnel, but I have tested a server exposed locally behind an nginx reverse proxy.
Keep an eye on MassimilianoPasquini97/raycast_ollama over the next few days for this new release.
What version of Ollama are you using? The latest one?
@raycastbot assign me
@raycastbot rename this issue to "[Ollama AI] Chat with Ollama, error parsing chunked response with remote Ollama Server"
@MassimilianoPasquini97 I am using the latest Ollama (0.1.38). I don't think the Cloudflare tunnel is the issue here: connecting to the same server directly (using its IP) resulted in the same issue.
@marinsokol I have finished the refactoring of the code, and today I tried it with a remote Ollama Server and it worked. Can you try the latest version from MassimilianoPasquini97/raycast_ollama?
@MassimilianoPasquini97 it is still not working for me. I have the same issue as before.
@marinsokol I don't know how to reproduce the error. I tried with an Ollama Server installed on Windows with an RTX 4080 and it worked.
@raycastbot assign me
Did you try testing with Ollama running in Docker?
Extension
https://www.raycast.com/massimiliano_pasquini/raycast-ollama
Raycast Version
1.74.1
macOS Version
14.4.1
Description
I have Ollama running on my home server through Docker and a Cloudflare tunnel. I have OpenUI running on the same server with access to Ollama, and everything is working fine there. I installed the extension and connected it to the same Ollama instance, but the extension is stuck in a loading state.

I forked the extension to get access to the logs and figured out the issue: my Ollama API is returning the chat response in chunks, as shown below in `OllamaApiChat`, so `JSON.parse` is not able to parse it. I am a developer with experience building Raycast extensions, so I can work on fixing this; I just wanted to check first. Did you come across this issue before?
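For context, Ollama's chat endpoint streams newline-delimited JSON (NDJSON), so a single network chunk can carry several JSON objects, or end mid-object, which makes a bare `JSON.parse` of the raw chunk fail. A minimal sketch of one common fix (the function name and shape here are my own, not the extension's actual API) is to buffer the data and only parse complete lines:

```typescript
// Hypothetical helper: accumulate streamed NDJSON text and parse only
// complete lines, carrying any trailing partial line over to the next chunk.
function parseNdjsonChunk(
  carry: string,
  chunk: string
): { messages: unknown[]; carry: string } {
  const data = carry + chunk;
  const lines = data.split("\n");
  // The last element may be an incomplete JSON object; keep it for later.
  const rest = lines.pop() ?? "";
  const messages = lines
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line));
  return { messages, carry: rest };
}
```

With a buffer like this, each complete line parses cleanly even when the server (or a proxy such as a Cloudflare tunnel) splits the stream at arbitrary byte boundaries.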
Steps To Reproduce
Chat with Ollama command
Current Behaviour
No response
Expected Behaviour
No response