Closed Alex-Walston closed 7 months ago
Are you accessing the chat on the same computer as the ollama server? If not, you need to stop ollama and then start it via OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve so that ollama can be reached from another computer.
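Spelled out, the restart sequence on the server machine might look something like this (a sketch assuming ollama was started manually rather than as a system service; adjust accordingly if it runs under systemd or as a desktop app):

```shell
# Stop any ollama instance that is already running, then restart it
# bound to all network interfaces.
# OLLAMA_HOST=0.0.0.0  -> listen on every interface, not just localhost
# OLLAMA_ORIGINS=*     -> allow cross-origin requests from a browser-based UI
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
```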
I am accessing it from a different computer than the server.
I tried what you recommended and I am still getting the same issue.
Okay, try this. Change the ollama URL to the LAN or WAN IP, not localhost: NEXT_PUBLIC_OLLAMA_URL=http://IPADDRESS:11434
Since you're accessing it from another computer, the Chatbot UI in your browser is trying to reach localhost, but ollama isn't on that computer. It's on the remote machine, so you need to change the URL to point to the actual IP.
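For reference, the relevant line in the Chatbot UI env file would then look something like this (the IP below is a placeholder; substitute your ollama server's actual LAN address):

```shell
# .env.local for Chatbot UI
# NEXT_PUBLIC_ variables are baked into the browser bundle, so this URL
# must be reachable from the machine running the browser, not just the server.
NEXT_PUBLIC_OLLAMA_URL=http://192.168.1.50:11434
```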
Still nothing, although after making that change I did see a request hit the ollama server when I launched the Chatbot UI.
I'm assuming that if you run ollama list you see the downloaded models on the server? I only ask because it happened to me once: I had models downloaded, but ollama list was blank until I restarted it. Also make sure you're running the latest version of ollama.
That was it
Thank you!
Welcome!
I think I followed everything correctly but the local models are still not showing up on the UI