Unfortunately, I cannot reproduce this, although I have the same package versions as you. The plugin uses the `/chat` endpoint even when you run single prompts (I did not figure out a way to differentiate between `llm` and `llm chat` invocations). Thus, to rule out problems with endpoint access, please check that the following works:
```bash
$ curl http://localhost:11434/api/chat -d '{
  "model": "llama2:latest",
  "messages": [
    {
      "role": "user",
      "content": "why is the sky blue?"
    }
  ]
}'
```
If this works, then additional troubleshooting ideas are:

- check that `ollama` is up to date
- check the output in the `ollama serve` terminal (or `journalctl -f -u ollama` if you run it as a `systemd` service)

Thanks for the help troubleshooting. The `/api/chat` endpoint gives me a 404 (and works on another machine), so that's my problem right there. No idea what's wrong, since the `/api/generate` endpoint seems fine, but I'm closing this since the problem is clearly on my end. Thanks for the plugin!
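For the record, the mismatch I saw is easy to confirm by probing both endpoints with a short script; this is only a sketch and assumes the default `localhost:11434` address.

```python
# Probe /api/generate and /api/chat and compare their HTTP status codes.
# Sketch only: assumes the default localhost:11434 address and llama2:latest.
import json
import urllib.error
import urllib.request

BASE = "http://localhost:11434"

def status_of(path, payload):
    """POST the payload to an Ollama endpoint and return the HTTP status code."""
    req = urllib.request.Request(
        BASE + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

print("/api/generate:", status_of(
    "/api/generate",
    {"model": "llama2:latest", "prompt": "hi", "stream": False}))
print("/api/chat:", status_of(
    "/api/chat",
    {"model": "llama2:latest",
     "messages": [{"role": "user", "content": "hi"}],
     "stream": False}))
```

Both requests hit the same server, so a 404 on `/api/chat` alone points at the server rather than the plugin, which fits the earlier suggestion to make sure `ollama` is up to date.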
Hi there - I installed `llm` and this project using `pipx`, and there seems to be an error accessing the `ollama` server. Here are some details - please let me know what additional information would be helpful.