gitwittidbit opened 2 months ago
Not sure; by local providers, do you mean you are running Ollama on your own computer? I am not familiar with LibreChat. Can you provide more info about the error you are getting?
Yes, I have Ollama running on my local server (but in a different docker stack). Another frontend that works with Ollama is Open-Webui. So when I chat with Amica (and have Ollama configured as backend), I am getting this:
Hey, can you grab a screenshot from developer console in browser? Wondering if CORS issue.
Okay, so it says that the loading of mixed content was blocked:
Which is strange to me, because the link is HTTP.
when you say "link is http", do you mean that Amica is running on HTTP?
Both Amica and Ollama must be on either HTTP or HTTPS. You cannot run Amica over HTTPS and try to connect to Ollama over plain HTTP. You would either need to serve Ollama securely as well, or access Amica over plain HTTP.
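For context, this is the browser's mixed-content rule: a page served over HTTPS is not allowed to make fetch/XHR requests to a plain-HTTP endpoint. A minimal sketch of that rule (the hostnames and ports below are placeholders, not from this setup):

```python
from urllib.parse import urlparse

def mixed_content_blocked(page_url: str, request_url: str) -> bool:
    """Rough model of the browser rule: an HTTPS page may not
    load active content (fetch/XHR) over plain HTTP."""
    page_scheme = urlparse(page_url).scheme
    target_scheme = urlparse(request_url).scheme
    return page_scheme == "https" and target_scheme == "http"

# Amica opened over HTTPS, Ollama reachable only over HTTP -> blocked
print(mixed_content_blocked("https://amica.example.com", "http://192.168.1.10:11434"))  # True
# Both on plain HTTP -> the mixed-content rule no longer applies
print(mixed_content_blocked("http://192.168.1.10:3000", "http://192.168.1.10:11434"))  # False
```

This is why switching Amica to plain HTTP changes the error: the mixed-content block goes away, and any remaining failure (like CORS) becomes visible.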
I meant that Ollama was running on HTTP.
But I was actually accessing Amica via HTTPS. This is the same setup I use with open-webui, LibreChat and a few others.
I now tried accessing Amica also via HTTP but again got that network error. But this time, the developer console actually mentions CORS (as @slowsynapse had suggested earlier):
It says that the request was blocked because of the same-origin policy (reason: CORS header 'Access-Control-Allow-Origin' missing), status code 403.
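That error means the server responded, but without the header the browser needs to let a cross-origin page read the response. A rough model of the browser's check (the origin below is a placeholder, not an actual address from this thread):

```python
def cors_allows(request_origin: str, response_headers: dict) -> bool:
    """Rough model of the browser's CORS check: a cross-origin
    response is only readable if Access-Control-Allow-Origin
    matches the requesting origin (or is the wildcard '*')."""
    allowed = response_headers.get("Access-Control-Allow-Origin")
    return allowed is not None and allowed in ("*", request_origin)

# Header missing entirely -> the browser blocks the response (the error seen here)
print(cors_allows("http://amica.local", {}))  # False
# Server configured to allow all origins -> allowed
print(cors_allows("http://amica.local", {"Access-Control-Allow-Origin": "*"}))  # True
```

Note that the block happens in the browser, not on the server: the server may have processed the request fine, but without the header the page never sees the result.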
Oh, did you allow all origins (*) in Ollama? This is required when accessing from another container stack.
For Amica, both sides still need to be on the same scheme (HTTP or HTTPS), but you also need Ollama to allow the remote origin.
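Ollama reads its allowed origins from the `OLLAMA_ORIGINS` environment variable. In a docker compose stack that can be set roughly like this (service name, image tag, and port mapping are placeholders for your setup, not a definitive config):

```yaml
services:
  ollama:
    image: ollama/ollama
    environment:
      # Allow cross-origin requests from any origin; you can list
      # specific origins instead, e.g. "https://amica.example.com"
      - OLLAMA_ORIGINS=*
    ports:
      - "11434:11434"
```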
Not sure where in Ollama I can set that.
But I can say that open-webui, LibreChat, and a few other frontends, all running on the same host (but in different docker stacks), can access Ollama. So it would seem that Ollama isn't overly picky about the origin of requests the way it is configured at the moment.
(I didn't want to overcomplicate things and wanted to solve one issue at a time, so I didn't mention it before, but Amica also can't connect to the TTS/STT backend via the openai-wrapper, which is running directly on the same host, not in docker.)
I'm happy to make any changes to my Ollama setup if that helps. I just don't know where...
If your Amica container cannot access TTS/STT either, I believe the problem comes from your Amica installation, not Ollama. I also run openwebui and I know you wouldn't be able to use it if your Ollama install wasn't already allowing all origins.
Are you accessing the app through the local network where your backends are? I personally didn't manage to use Ollama when accessing Amica through a public domain; I had to access the container directly via its IP on my network.
I can access Amica via HTTPS which goes through a domain on the internet and I can access it via HTTP under its local IP address. The IP address is the same for the other frontends and the Ollama (and TTS/STT) backend. (But I myself am on a different local network.)
Amica is able to connect to OpenAI, so its connectivity can't be totally broken.
Tried it again today.
All external services seem to be reachable. But none of the local services can be accessed. How can that be?
Not sure. I think I would need your exact setup to see if the issue replicates in another environment.
Hi, I installed Amica via docker compose and got it working with OpenAI. But I want to use local providers, so I am trying to connect Amica to my Ollama server. Unlike other frontends I am running (e.g. LibreChat), Amica can't connect to Ollama with the same settings; I keep getting a network error. What am I missing? Thanks!