mckaywrigley / chatbot-ui

MIT License

Using Ollama/local models from another source #1249

Closed kmsrockz closed 8 months ago

kmsrockz commented 8 months ago

Hello everyone, I have installed Chatbot UI locally with Ollama. When I use 127.0.0.1, local models and file uploads work without problems.

But when I access the chatbot from my iPhone or another computer using the local IP, for example 192.168.1.100, local models and file uploads no longer work. Normal chat with Azure OpenAI works fine.

Ollama runs in a Docker container, and I can also reach it at http://192.168.1.100:11434 (the Ollama service is running). But no local models show up in the browser, only Azure OpenAI and the other APIs.

It must still be possible to access the local models. Maybe someone has an idea or has already made it work?

Very grateful for any advice 🙏

[Screenshots: Bildschirmfoto 2024-01-24 um 11 33 05, Bildschirmfoto 2024-01-24 um 11 32 29, Screen 2024-01-24 um 11 36 21]
wrapss commented 8 months ago

Go to the network tab of your browser's dev tools and look for the 'tags' request.

kmsrockz commented 8 months ago

Thanks for the reply! 🙏

I get these two errors:

Access to fetch at 'http://192.168.1.100:11434/api/tags' from origin 'http://192.168.1.100:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled. global-state.tsx:273

GET http://192.168.1.100:11434/api/tags net::ERR_FAILED 403 (Forbidden)
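A 403 with a missing `Access-Control-Allow-Origin` header usually means Ollama is rejecting the browser's origin. As a rough sketch (the image and container name are assumptions; the IP and ports come from this thread), the `OLLAMA_ORIGINS` environment variable can be passed when starting the Docker container so the chatbot's origin is allowed:

```shell
# Sketch: start Ollama in Docker with the chatbot's LAN origin allowed.
# "ollama/ollama" image and the container name are assumptions;
# 192.168.1.100:3000 is the chatbot origin from the CORS error above.
docker run -d \
  --name ollama \
  -p 11434:11434 \
  -e OLLAMA_ORIGINS="http://192.168.1.100:3000" \
  ollama/ollama
```

Because the variable is read at startup, an already-running container has to be recreated with the new `-e` flag (environment variables cannot be changed on an existing container); a plain `docker restart` is not enough.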

wrapss commented 8 months ago

Look at this.

kmsrockz commented 8 months ago

Thanks, I added the local IP address via `OLLAMA_ORIGINS`, but it still doesn't work 🤔
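One way to check whether the origin is actually being allowed (the addresses below are the ones from this thread, so this is a sketch, not a generic command) is to request the same endpoint with curl and inspect the response headers:

```shell
# Send the request with an Origin header, as the browser would,
# and look for an Access-Control-Allow-Origin line in the response.
curl -si -H "Origin: http://192.168.1.100:3000" \
  http://192.168.1.100:11434/api/tags | head -n 20
```

If the header is missing, the variable is likely not reaching the Ollama process inside the container, e.g. because it was exported on the host shell instead of being passed to the container with `-e`.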

Bortus-AI commented 8 months ago

@kmsrockz See if the suggestions in #1255 fix it.

kmsrockz commented 8 months ago

Yeah, it works now! Thanks everybody 🙏