HelgeSverre / ollama-gui

A Web Interface for chatting with your local LLMs via the ollama API
https://ollama-gui.vercel.app/

Accessing Ollama GUI via Local IP Doesn’t Load LLaMA 3 (localhost:8080 Works) #35

Closed: SecTech12 closed this 1 month ago

SecTech12 commented 1 month ago

I have the Ollama GUI running on my machine, and it works perfectly when accessed via localhost:8080. However, when I try to access it using the local IP address of my machine (e.g., 192.168.1.x:8080), the GUI loads, but LLaMA 3 does not load.

Details: Operating System: Windows 11

HelgeSverre commented 1 month ago

To use your IP on the local network, you need to set the OLLAMA_HOST environment variable when starting Ollama so that it "binds" to (listens for requests on) that IP instead of localhost; see the Ollama docs for details.
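
For example, something like this in PowerShell on Windows 11 (a minimal sketch, untested on my end; 0.0.0.0 binds all interfaces, and 11434 is Ollama's default port):

```powershell
# Stop any running Ollama instance first, then set the bind address
# for the current session. 0.0.0.0 listens on every interface; a
# specific LAN IP such as 192.168.1.x would also work.
$env:OLLAMA_HOST = "0.0.0.0:11434"

# Start the server so it accepts requests from other devices
ollama serve
```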

You might also have to specify OLLAMA_ORIGINS, although I have not tested this myself.
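
If the browser then blocks the requests with a CORS error, a sketch of that variable too (again untested; "*" allows any origin, and a stricter value could be the GUI's own address):

```powershell
# Permit cross-origin requests to the Ollama API. "*" allows any
# origin; something like "http://192.168.1.x:8080" (the GUI's
# hypothetical LAN address) would be a tighter setting.
$env:OLLAMA_ORIGINS = "*"
ollama serve
```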

SecTech12 commented 1 month ago

Thanks!