andyccliao closed this 6 months ago
Thanks for your work on this!
Strangely, Ollama works with Amica (on a MacBook Pro M1) if I run it using Flask and visit the app in Firefox, but the .dmg does not work. I'm getting `TypeError: Load failed` after every message I try to send. The error appears instantly (as opposed to the usual behavior, where Ollama slows my computer down for a bit). I checked the Ollama server logs: the requests come back with a 403 error.
It might be an ongoing CORS problem with Ollama. This is the most relevant link I have found so far: https://github.com/jmorganca/ollama/issues/300 Beyond that, I have no idea what the trouble is. I might try to figure out how to use and debug Tauri.
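If it is a CORS problem, Ollama reads its allowed origins from the `OLLAMA_ORIGINS` environment variable, so one possible workaround (untested with Amica here) is to loosen it before the server starts:

```shell
# Allow cross-origin requests from any origin.
# "*" is a blunt workaround; in practice, restrict it to the app's origin.

# For the macOS menu-bar app, set the variable via launchctl and restart Ollama:
launchctl setenv OLLAMA_ORIGINS "*"

# Or, when running the server directly from a terminal:
OLLAMA_ORIGINS="*" ollama serve
```

If the 403 disappears after this, that would confirm the CORS theory for the .dmg build.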
Should I open a new issue for this?
I opened an issue for this: https://github.com/semperai/amica/issues/81
Ollama is not working correctly with Amica.
Recommend using the Chat API. Ollama has a Chat API as of v0.1.14, and it also supports images via the Chat API as of v0.1.15.