Closed kmsrockz closed 8 months ago
Go to the Network tab of your browser's dev tools and search for the 'tags' request.
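For reference, the same endpoint can also be checked from a terminal (assuming Ollama's default port 11434 and the IP used later in this thread):

```shell
# List the models Ollama currently serves; a JSON body with a
# "models" array means the endpoint itself is reachable.
curl http://192.168.1.100:11434/api/tags
```

Note that curl does not enforce CORS, so this can succeed even while the browser request is blocked.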
Thanks for the reply! 🙏
I get these 2 errors:
Access to fetch at 'http://192.168.1.100:11434/api/tags' from origin 'http://192.168.1.100:3000' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled. global-state.tsx:273
GET http://192.168.1.100:11434/api/tags net::ERR_FAILED 403 (Forbidden)
Thanks, I added the local IP address via OLLAMA_ORIGIN but it still doesn't work 🤔
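For anyone landing here later: the variable Ollama reads is OLLAMA_ORIGINS (plural), and when Ollama runs in Docker it must be set on the container, not in the host shell. A minimal sketch, assuming the official ollama/ollama image and the addresses from this thread:

```shell
# Recreate the Ollama container with the chatbot's origin allowed.
# OLLAMA_ORIGINS (plural) controls which origins may call the API;
# "*" would allow any origin, which is handy for testing.
docker run -d --name ollama \
  -p 11434:11434 \
  -e OLLAMA_ORIGINS="http://192.168.1.100:3000" \
  ollama/ollama
```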
@kmsrockz #1255 — see if the suggestions there fix it.
Yeah, it works now! Thanks everybody 🙏
Hello everyone, I have installed Chatbot UI locally with Ollama. When I use 127.0.0.1, local models and file uploads work without problems.
But when I access the chatbot from my iPhone or another computer using the local IP, for example 192.168.1.100, local models and file uploads no longer work. Normal chat with Azure OpenAI works fine.
Ollama runs in a Docker container, and I can also reach it via the URL 192.168.1.100:11434 (the Ollama service is running). But no local models show up in the browser, only Azure OpenAI and the other APIs.
It must still be possible to access the local models. Maybe someone has an idea or has already made this work?
Very grateful for any advice 🙏