Closed: DenisBY closed this issue 4 months ago
Hey, have you tried to update Ollama?
I'm using ollama version 0.1.29.
So, I fixed it. For some reason I had version 0.1.22 and had experimented with changing paths, etc. After your suggestion I updated to 0.1.29, reverted my changes (`/api/chat` and `/api/generate` -> `/v1/chat/completions`), and now it's working. Thank you!
**Describe the bug**
I installed the VS Code plugin v3.7.19 and a local ollama in Docker, using the official container from Docker Hub. Via `curl` it seems to somehow work:

However, via the extension it doesn't:
**To Reproduce**
Steps to reproduce the behavior:
**Additional context**
If I change the path to `/v1/chat/completions`, ollama returns 404. But both `/api/chat` and `/api/generate` give the same result, via `curl` and in the extension.
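For reference, here is a minimal sketch of the two request shapes being compared in this issue: Ollama's native chat endpoint and its OpenAI-compatible one. The base URL, port (11434 is Ollama's default), and the model name `llama2` are assumptions for illustration, not taken from the issue. The OpenAI-compatible endpoint was only added in a release after 0.1.22, which would explain the 404 seen on the older version.

```python
# Hedged sketch: build request URL + payload for Ollama's two API styles.
# The port (11434, Ollama's default) and model name are assumptions;
# adjust them for your own setup.

OLLAMA_BASE = "http://localhost:11434"


def native_chat_request(model: str, prompt: str):
    """Request shape for Ollama's native /api/chat endpoint."""
    return (
        f"{OLLAMA_BASE}/api/chat",
        {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,
        },
    )


def openai_compat_request(model: str, prompt: str):
    """Request shape for the OpenAI-compatible /v1/chat/completions
    endpoint, which older Ollama releases (e.g. 0.1.22) did not serve,
    hence the 404 before upgrading."""
    return (
        f"{OLLAMA_BASE}/v1/chat/completions",
        {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    )


url, payload = openai_compat_request("llama2", "Hello")
print(url)  # http://localhost:11434/v1/chat/completions
```

Sending either payload with `curl -d` (or any HTTP client) against a running Ollama instance shows which endpoints the installed version actually serves.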