Open jpmcb opened 7 months ago
Ollama supports a "health" endpoint at the root of the running server:
```
❯ curl localhost:11434 -vvv
*   Trying 127.0.0.1:11434...
* Connected to localhost (127.0.0.1) port 11434 (#0)
> GET / HTTP/1.1
> Host: localhost:11434
> User-Agent: curl/7.87.0
> Accept: */*
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Content-Type: text/plain; charset=utf-8
< Date: Thu, 18 Apr 2024 19:20:47 GMT
< Content-Length: 17
<
* Connection #0 to host localhost left intact
Ollama is running
```
The key things to note here are the 200 status code and the "Ollama is running" body.
Also related:
Can ollama-python support a "ping" member on the client? I'd imagine it would hit the endpoint above and simply continue if a 200 is returned, and otherwise raise an error.
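To make the idea concrete, here's a rough standalone sketch (not the actual client API — `ping` and `OllamaUnavailableError` are hypothetical names, and I'm assuming plain `httpx` against the default host):

```python
import httpx

OLLAMA_HOST = "http://localhost:11434"  # default Ollama address


class OllamaUnavailableError(Exception):
    """Raised when the Ollama server can't be reached or isn't healthy."""


def ping(host: str = OLLAMA_HOST, timeout: float = 5.0) -> None:
    """Hit the root health endpoint; return normally on 200, raise otherwise."""
    try:
        response = httpx.get(host, timeout=timeout)
    except httpx.TransportError as err:
        # Covers connection refused, DNS failures, timeouts, etc.
        raise OllamaUnavailableError(f"could not reach {host}") from err

    if response.status_code != 200:
        raise OllamaUnavailableError(
            f"unexpected status {response.status_code} from {host}"
        )
```

Returning a bool instead of raising would also work; raising just keeps the "continue if 200, otherwise error" flow described above.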
Happy to work on this if 👍🏼
I need this too