mckaywrigley / chatbot-ui

Local Models not showing up #1255

Closed Alex-Walston closed 7 months ago

Alex-Walston commented 7 months ago

I think I followed everything correctly, but the local models are still not showing up in the UI.


Bortus-AI commented 7 months ago

Are you accessing the chat on the same computer as the Ollama server? If not, you need to stop Ollama and then start it via `OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve` so that Ollama can be reached from another computer.
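A minimal sketch of that restart, assuming Ollama was launched directly rather than as a systemd service (if it runs under systemd, stop it with `sudo systemctl stop ollama` instead):

```bash
# Stop the running Ollama instance (assumes it was started directly, not via systemd).
pkill ollama

# Restart it listening on all interfaces and accepting requests from other origins,
# so a browser on another machine can reach it.
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
```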

Alex-Walston commented 7 months ago

I am accessing it from a different computer than the server.

I tried what you recommended, and I am still getting the same issue.


Bortus-AI commented 7 months ago

Okay, try this: change the Ollama URL to the LAN or WAN IP, not localhost, e.g. `NEXT_PUBLIC_OLLAMA_URL=http://IP_ADDRESS:11434`.
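For example, in Chatbot UI's `.env.local` on the machine serving the UI (the `192.168.1.50` address here is just a placeholder for the Ollama server's actual LAN IP):

```bash
# .env.local for Chatbot UI
# Use the Ollama server's LAN IP, not localhost: "localhost" in the visitor's
# browser points at the visitor's own machine, where Ollama isn't running.
NEXT_PUBLIC_OLLAMA_URL=http://192.168.1.50:11434
```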

Bortus-AI commented 7 months ago

Since you're accessing it from another computer, the Chatbot UI running in your browser tries to reach localhost, but Ollama isn't on that computer. It's on the remote machine, so you need to point the URL at its actual IP.

Alex-Walston commented 7 months ago

Still nothing, although after making that change I did get a hit on the Ollama server when I launched Chatbot UI.


Bortus-AI commented 7 months ago

I'm assuming that if you run `ollama list` you see the downloaded models on the server? I only ask because it happened to me once: I had models downloaded, but `ollama list` came back blank until I restarted it. Also make sure you're running the latest Ollama.
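A quick check on the server, sketched under the same assumption that Ollama was launched directly:

```bash
# On the Ollama server: confirm the downloaded models are actually registered.
ollama list

# Confirm the installed version is current.
ollama --version

# If the list is empty even though models are downloaded, restart the server
# and list again (the stop command depends on how Ollama was installed).
pkill ollama
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
```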

Alex-Walston commented 7 months ago

That was it

Thank you!

Bortus-AI commented 7 months ago

Welcome!