ollama / ollama

Get up and running with Llama 3.1, Mistral, Gemma 2, and other large language models.
https://ollama.com
MIT License

ollama should detect native windows proxy configuration #5354

Open smallg0at opened 2 months ago

smallg0at commented 2 months ago

What is the issue?

Ollama seems to fail to update itself in recent versions, and the app logs are as follows:

time=2024-06-28T11:23:56.487+08:00 level=INFO source=logging.go:50 msg="ollama app started"
time=2024-06-28T11:23:56.540+08:00 level=INFO source=server.go:176 msg="unable to connect to server"
time=2024-06-28T11:23:56.540+08:00 level=INFO source=server.go:135 msg="starting server..."
time=2024-06-28T11:23:56.547+08:00 level=INFO source=server.go:121 msg="started ollama server with pid 31184"
time=2024-06-28T11:23:56.547+08:00 level=INFO source=server.go:123 msg="ollama server logs C:\\Users\\<username>\\AppData\\Local\\Ollama\\server.log"
time=2024-06-28T11:24:00.238+08:00 level=INFO source=updater.go:102 msg="New update available at https://github.com/ollama/ollama/releases/download/v0.1.47/OllamaSetup.exe"
time=2024-06-28T11:24:00.257+08:00 level=ERROR source=updater.go:212 msg="failed to download new release: error checking update: Head \"https://github.com/ollama/ollama/releases/download/v0.1.47/OllamaSetup.exe\": dial tcp 127.0.0.1:443: connectex: No connection could be made because the target machine actively refused it."

I'm behind a proxy, and it's obviously not on port 443; the ollama server isn't listening on that port either. Running nslookup also returns the correct IP rather than localhost, so I suspect something in the proxy detection has gone wrong...
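For context, ollama is written in Go, and Go's default http.Transport resolves proxies with http.ProxyFromEnvironment, which only reads the HTTPS_PROXY/HTTP_PROXY/NO_PROXY environment variables and never consults the per-user proxy configured in Windows Internet Settings. Assuming the updater uses the default transport, here is a minimal sketch showing which proxy (if any) Go would pick for the updater's HEAD request:

```go
package main

import (
	"fmt"
	"net/http"
)

func main() {
	// The same HEAD request the updater issues (URL taken from the log above).
	req, err := http.NewRequest(http.MethodHead,
		"https://github.com/ollama/ollama/releases/download/v0.1.47/OllamaSetup.exe", nil)
	if err != nil {
		panic(err)
	}

	// ProxyFromEnvironment reads HTTPS_PROXY/HTTP_PROXY/NO_PROXY once per
	// process; the Windows "Internet Settings" proxy is never consulted.
	proxyURL, err := http.ProxyFromEnvironment(req)
	if err != nil {
		panic(err)
	}
	if proxyURL == nil {
		fmt.Println("no proxy in environment; the request would be dialed directly")
	} else {
		fmt.Println("proxy from environment:", proxyURL)
	}
}
```

With no environment variable set, the request is dialed directly, which would be consistent with the refused connection to 127.0.0.1:443 in the log if github.com resolves to localhost for the app.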

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.1.46

dhiltgen commented 1 month ago

Can you confirm you've set the proxy settings as described here and it still doesn't work?
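Detecting the native configuration, as the issue title requests, would mean reading the per-user WinINET values that the Windows proxy dialog writes. A minimal sketch under that assumption, using golang.org/x/sys/windows/registry; readSystemProxy is a hypothetical helper for illustration, not ollama code:

```go
//go:build windows

package main

import (
	"fmt"

	"golang.org/x/sys/windows/registry"
)

// readSystemProxy reads the per-user proxy that the Windows
// "Internet Settings" dialog stores in the registry.
func readSystemProxy() (server string, enabled bool, err error) {
	k, err := registry.OpenKey(registry.CURRENT_USER,
		`Software\Microsoft\Windows\CurrentVersion\Internet Settings`,
		registry.QUERY_VALUE)
	if err != nil {
		return "", false, err
	}
	defer k.Close()

	on, _, err := k.GetIntegerValue("ProxyEnable")
	if err != nil {
		return "", false, err
	}
	server, _, err = k.GetStringValue("ProxyServer")
	if err != nil {
		return "", false, err
	}
	return server, on == 1, nil
}

func main() {
	server, enabled, err := readSystemProxy()
	if err != nil {
		fmt.Println("could not read system proxy:", err)
		return
	}
	fmt.Printf("system proxy %q, enabled=%v\n", server, enabled)
}
```

A fuller implementation would also have to handle PAC files (the AutoConfigURL value) and WPAD, which is usually done through the WinHTTP API rather than the registry.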

smallg0at commented 1 month ago

> Can you confirm you've set the proxy settings as described here and it still doesn't work?

Works now, but: