Hansyvea opened 3 months ago
Are you using the standard ollama host/port, e.g. 127.0.0.1:11434? If not, you will need to set OLLAMA_HOST or use ollama.Client(host='...').
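For reference, a minimal sketch of both options (the host/port shown is just the default; substitute your own, and note the model name in the commented call is only a placeholder):

```python
import os

# Option 1: point the client at your server via the environment.
# Set this before importing ollama so the client picks it up.
os.environ["OLLAMA_HOST"] = "http://127.0.0.1:11434"

# Option 2: pass the host explicitly (requires `pip install ollama`):
# import ollama
# client = ollama.Client(host="http://127.0.0.1:11434")
# client.embeddings(model="llama3", prompt="hello")
```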
Try turning off your VPN app.
Thanks, buddy. You made my day!
Wondering why, though: a REST call to http://127.0.0.1:11434/api/embeddings worked fine on the same machine.
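To reproduce that REST call from Python without the ollama package (useful for isolating whether the problem is the server or the client library), the stdlib is enough. A sketch, assuming the default host/port; the model name is a placeholder:

```python
import json
import urllib.request

def build_embeddings_request(host="http://127.0.0.1:11434",
                             model="llama3", prompt="hello"):
    # Build the same POST the working curl/REST call would send.
    payload = json.dumps({"model": model, "prompt": prompt}).encode()
    return urllib.request.Request(
        f"{host}/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def post_embeddings(**kwargs):
    # Actually send it; this needs a running Ollama server.
    with urllib.request.urlopen(build_embeddings_request(**kwargs)) as resp:
        return json.loads(resp.read())
```

If this raises a connection error while curl succeeds, the difference is almost certainly a proxy or VPN intercepting Python's requests.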
I have the ollama service running in the background, and it works well for running any model in the terminal. However, when it comes to Python, things go wrong.
```
     site-packages/ollama/_client.py:71     response.raise_for_status()
     site-packages/ollama/_client.py:72 except httpx.HTTPStatusError as e:
---> site-packages/ollama/_client.py:73     raise ResponseError(e.response.text, e.response.status_code) from None
     site-packages/ollama/_client.py:75     return response
```