Open soragui opened 6 months ago
Try running Ollama with the environment variables `OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=*`, and use the IPv4 address of your Ollama host instead of the local hostname. In the browser's network tab, you can see whether the 'tags' request fails with a CORS error.
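The suggestion above can be sketched as follows (assuming a Linux/macOS shell with Ollama installed; the `curl` check is an illustration added here, not part of the original comment):

```shell
# Make Ollama listen on all interfaces and allow any browser origin.
# OLLAMA_ORIGINS=* relaxes CORS so requests from the web UI are not blocked.
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS="*" ollama serve

# In another terminal, check that the 'tags' endpoint answers and that a
# CORS header comes back (the Origin value here is just an example):
curl -i -H "Origin: http://localhost:3000" http://127.0.0.1:11434/api/tags
# Look for an "Access-Control-Allow-Origin" header in the response.
```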
Is there an update on this issue? We ran into the same problem and can't really figure out how to fix it. @wrapss @soragui
Same issue here
NEXT_PUBLIC_OLLAMA_URL=http://127.0.0.1:11434
> Try running Ollama with the environment variables `OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=*`, and use the IPv4 address of your Ollama host instead of the local hostname.
Besides setting the environment variable `OLLAMA_ORIGINS=*`, I also had to change `http://localhost:11434` to `http://127.0.0.1:11434`.
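The two changes described above amount to something like this (a sketch; the `.env.local` file name is the Next.js convention and an assumption here, and the note about IPv6 is my explanation, not confirmed in the thread):

```shell
# Server side: allow cross-origin requests to Ollama before starting it.
export OLLAMA_ORIGINS="*"

# Client side, in the UI's .env.local: point the app at the IPv4 loopback
# address. "localhost" can resolve to IPv6 (::1), where Ollama may not be
# listening, so 127.0.0.1 avoids that mismatch.
# NEXT_PUBLIC_OLLAMA_URL=http://127.0.0.1:11434
```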
I am hosting the chatbot on my local computer and have deployed Ollama locally. When I use localhost:3000, I can select the local Ollama model and chat, but when I expose the server to the internet with a Cloudflare tunnel, I cannot select the Ollama model.
So how can I access the local Ollama model when the local server is exposed to the internet?