semperai / amica

Amica is an open source interface for interactive communication with 3D characters with voice synthesis and speech recognition.
https://heyamica.com
MIT License

Amica can't reach basic-openai-api-wrapper #96

Open gitwittidbit opened 2 months ago

gitwittidbit commented 2 months ago

Hi, I installed Amica via docker compose and got it working with OpenAI. But I want to use local providers, so I am trying to connect Amica to my Ollama server. Unlike other frontends I am running (e.g. LibreChat), Amica can't connect to Ollama with the same info - I keep getting a network error. What am I missing? Thanks!

slowsynapse commented 2 months ago

Not sure, by local providers do you mean you are running Ollama on your own computer? I am not familiar with LibreChat, can you provide more info about the error you are getting?

gitwittidbit commented 2 months ago

Yes, I have Ollama running on my local server (but in a different docker stack). Another frontend that works with Ollama is Open-Webui. So when I chat with Amica (and have Ollama configured as backend), I am getting this:

Screenshot 2024-04-25 at 11 00 02

slowsynapse commented 2 months ago

Hey, can you grab a screenshot of the developer console in your browser? Wondering if it's a CORS issue.

gitwittidbit commented 2 months ago

Okay, so it says that the loading of mixed content was blocked:

Screenshot 2024-04-25 at 13 18 50

Which is strange to me, because the link is http.

napiquet commented 2 months ago

when you say "link is http", do you mean that Amica is running on HTTP?

Both Amica and Ollama must be on the same scheme, either HTTP or HTTPS. You cannot have Amica running on HTTPS and try to connect to Ollama over plain HTTP. You would either need to change Ollama to run securely as well, or serve Amica over plain HTTP.
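The browser rule described above (an HTTPS page may not fetch plain-HTTP resources) can be sketched as a small predicate. This is illustrative only; the hostnames and ports are placeholders, not values from this thread:

```python
from urllib.parse import urlparse

def is_blocked_mixed_content(page_url: str, backend_url: str) -> bool:
    """True if a browser would block the request as mixed content:
    an HTTPS page fetching a plain-HTTP resource."""
    return (urlparse(page_url).scheme == "https"
            and urlparse(backend_url).scheme == "http")

# HTTPS Amica page calling an HTTP Ollama endpoint: blocked
print(is_blocked_mixed_content("https://amica.example.com",
                               "http://192.168.1.10:11434"))  # True

# Plain HTTP on both sides: not mixed content
print(is_blocked_mixed_content("http://192.168.1.20:3000",
                               "http://192.168.1.10:11434"))  # False
```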

gitwittidbit commented 2 months ago

I meant that Ollama was running on HTTP.

But I was actually accessing Amica via HTTPS. This is the same setup I use with open-webui, LibreChat and a few others.

I now tried accessing Amica also via HTTP but again got that network error. But this time, the developer console actually mentions CORS (as @slowsynapse had suggested earlier):

Screenshot 2024-04-25 at 14 57 49

It says that the request was blocked by the same-origin policy (reason: CORS header 'Access-Control-Allow-Origin' missing), status code 403.
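For context on why the browser complains here: two URLs count as the same origin only if scheme, host, and port all match, so Amica's page and the Ollama API on a different port are different origins even on the same machine. A minimal sketch (hosts and ports are hypothetical, with ports written explicitly to keep it simple):

```python
from urllib.parse import urlparse

def same_origin(url_a: str, url_b: str) -> bool:
    """Origins match only when scheme, host, and port are all equal."""
    a, b = urlparse(url_a), urlparse(url_b)
    return (a.scheme, a.hostname, a.port) == (b.scheme, b.hostname, b.port)

# Same host, but the port differs, so the browser treats these as
# different origins and requires a CORS header on the response:
print(same_origin("http://192.168.1.10:3000/",
                  "http://192.168.1.10:11434/"))  # False
```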

napiquet commented 2 months ago

Oh, did you allow all origins (*) in Ollama? This is required when accessing from another container stack.

For Amica, both sides still need to be on HTTP (or HTTPS), but you also need Ollama to allow the remote origin.
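For reference, Ollama reads its allowed origins from the `OLLAMA_ORIGINS` environment variable. In a Compose-based stack like the one described in this thread, that could look roughly like this (the image tag and port mapping are assumptions, not taken from the reporter's setup):

```yaml
# Hypothetical docker-compose service; adjust image/ports to your stack.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    environment:
      # "*" allows requests from any origin; you can instead list
      # specific origins, e.g. "http://192.168.1.20:3000".
      - OLLAMA_ORIGINS=*
```

When running Ollama outside Docker, the same variable works on the command line, e.g. `OLLAMA_ORIGINS=* ollama serve`.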

gitwittidbit commented 2 months ago

Not sure where in Ollama I can set that.

But I can say that open-webui and LibreChat (and a few other frontends), all running on the same host (but in different docker stacks), can all access Ollama. So it would seem that Ollama isn't overly picky about the origin of requests, the way it is configured at the moment.

(And I didn't want to overcomplicate things and wanted to solve one issue at a time, so I didn't mention it before, but Amica also can't connect to the TTS/STT backend, the openai-wrapper, which is running directly on the same host (not in docker).)

I'm happy to make any changes to my Ollama setup if that helps. I just don't know where...

napiquet commented 2 months ago

If your Amica container cannot access TTS/STT either, I believe the problem comes from your Amica installation, not Ollama. I also run openwebui and I know you wouldn't be able to use it if your Ollama install wasn't already allowing all origins.

Are you accessing the app through the local network, where your backends are? I personally didn't manage to use Ollama when accessing Amica through a public domain; I had to access the container directly via its IP on my network.

gitwittidbit commented 2 months ago

I can access Amica via HTTPS which goes through a domain on the internet and I can access it via HTTP under its local IP address. The IP address is the same for the other frontends and the Ollama (and TTS/STT) backend. (But I myself am on a different local network.)

Amica is able to connect to OpenAI - so its connectivity can't be totally messed up.

gitwittidbit commented 1 month ago

Tried it again today.

All external services seem to be reachable. But none of the local services can be accessed. How can that be?

slowsynapse commented 1 month ago

Not sure. I think I would need your exact setup to see if the issue replicates in another environment.