Please check that you can interact with the Ollama server directly through the HTTP API:
$ curl http://localhost:11434/api/chat -d '{
  "model": "llama2:latest",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ]
}'
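For reference, llm-ollama reaches this same endpoint through httpx, the client named in the traceback further down. A minimal Python sketch of the equivalent request (the stream flag and response handling are illustrative assumptions; the model name and address are taken from the curl command above):

import httpx

payload = {
    "model": "llama2:latest",
    "messages": [{"role": "user", "content": "why is the sky blue?"}],
    "stream": False,  # ask for a single JSON reply instead of streamed chunks
}

try:
    # Same request as the curl command above; localhost:11434 is Ollama's default address.
    response = httpx.post("http://localhost:11434/api/chat", json=payload, timeout=60.0)
    response.raise_for_status()
    print(response.json()["message"]["content"])
except httpx.ConnectError as exc:
    # This is the failure reported below: nothing accepted the connection on that port.
    print(f"Could not reach the Ollama server: {exc}")

If this script raises ConnectError while the curl command works, the problem is on the client side; if both fail, the server is not reachable.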
My apologies... Thanks!
After I installed llm-ollama, a simple command raises:
httpx.ConnectError: [Errno 111] Connection refused
Thank you in advance!
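An [Errno 111] Connection refused from httpx means nothing accepted the TCP connection, which usually points at the Ollama server not running (or bound to a non-default host or port) rather than at the plugin itself. A quick check, again assuming the default 127.0.0.1:11434 address:

import socket

# Probe Ollama's default port; connect_ex returns 0 when something is listening
# and an errno (111 on Linux, "connection refused") otherwise.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(2.0)
    result = sock.connect_ex(("127.0.0.1", 11434))

if result == 0:
    print("Something is listening on 11434; try the curl command above next.")
else:
    print(f"connect_ex returned {result}; start the server with `ollama serve` and retry.")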