mckaywrigley / chatbot-ui


Cannot use local Ollama model when exposed to the internet #1384

Open soragui opened 6 months ago

soragui commented 6 months ago

I am hosting Chatbot UI on my local computer and running Ollama locally. When I use localhost:3000, I can select the local Ollama models to chat, but when I expose the server to the internet through a Cloudflare Tunnel, I cannot select the Ollama models:

[Screenshot from 2024-02-06 10-06-40: the Ollama models are missing from the model selector]

How can I access the local Ollama models when the local server is exposed to the internet?

wrapss commented 6 months ago

Try running Ollama with the environment variables OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=*, and use the IPv4 address of your Ollama host instead of the local address.
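For anyone trying this, a minimal sketch of what that looks like when starting the server by hand (assuming Ollama runs as a standalone binary rather than a managed service):

```sh
# Bind Ollama to all network interfaces and allow cross-origin requests
# from any origin, then start the server.
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS="*" ollama serve
```

If Ollama was installed as a systemd service, the same variables would need to be set in the service environment instead of on the command line.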

wrapss commented 6 months ago

In the browser's network tab, you can check whether the 'tags' request fails with a CORS error.
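As a sketch, the same check can be done outside the browser with curl; the Origin value below is just a placeholder for wherever the UI is actually served from:

```sh
# Request the model list while sending an Origin header, as the browser would;
# if CORS is configured, the response should include Access-Control-Allow-Origin.
curl -i -H "Origin: https://your-tunnel-domain.example" http://127.0.0.1:11434/api/tags
```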

EduardDoronin commented 5 months ago

Is there an update on this issue? We ran into the same problem and can't really figure out how to fix it. @wrapss @soragui

gtlYashParmar commented 3 months ago

Same issue here

jtsato commented 2 months ago

NEXT_PUBLIC_OLLAMA_URL=http://127.0.0.1:11434

> Try running Ollama with the environment variables OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=*, and use the IPv4 address of your Ollama host instead of the local address.

Besides setting the environment variable OLLAMA_ORIGINS=*, I had to change http://localhost:11434 to http://127.0.0.1:11434.
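Putting the two workarounds from this thread together, a possible setup could look like this (NEXT_PUBLIC_OLLAMA_URL goes into Chatbot UI's .env.local; the OLLAMA_* variables are set wherever the Ollama server is started):

```sh
# Chatbot UI .env.local: point the app at the loopback address, not "localhost"
NEXT_PUBLIC_OLLAMA_URL=http://127.0.0.1:11434

# Environment for the Ollama server: listen on all interfaces, allow any origin
OLLAMA_HOST=0.0.0.0
OLLAMA_ORIGINS=*
```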