The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
The chat endpoint defaults to /v1/chat/completions, whereas the correct one for Ollama is /api/chat, but even after updating the setting no output is generated in the chat.
Describe the bug
Ollama is called by Twinny, but no output is generated. A 404 is returned for /v1/chat/completions.
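For reference, a 404 here usually means the extension is posting to a path the local server does not serve. A minimal sketch (hypothetical helper, not Twinny's actual code) of remapping the OpenAI-style path to Ollama's native one; note that recent Ollama releases also expose an OpenAI-compatible /v1/chat/completions, so a 404 there may indicate an older Ollama version:

```python
# Hypothetical mapping for illustration only; not part of Twinny.
# Ollama's native chat endpoint is /api/chat, while many clients
# default to the OpenAI-style path /v1/chat/completions.
OPENAI_TO_OLLAMA = {
    "/v1/chat/completions": "/api/chat",
    "/v1/completions": "/api/generate",
}

def ollama_path(path: str) -> str:
    """Map an OpenAI-style API path to Ollama's native equivalent."""
    return OPENAI_TO_OLLAMA.get(path, path)

print(ollama_path("/v1/chat/completions"))  # -> /api/chat
```

Checking which of the two paths answers (e.g. with curl against localhost:11434) is a quick way to confirm the mismatch before digging into the extension settings.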
To Reproduce
Steps to reproduce the behavior:
Expected behavior
Output is generated in the chat.
Versions
VS Code: 1.87.1 (Universal)
macOS: 12.6
Ollama Logs
Additional context
The chat endpoint defaults to /v1/chat/completions, whereas the correct one for Ollama is /api/chat, but even after updating the setting no output is generated in the chat. How can I debug this further?
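One thing worth checking: even with the path corrected, the two endpoints frame their streaming responses differently. Ollama's /api/chat streams newline-delimited JSON objects, while /v1/chat/completions streams SSE "data:" chunks, so a client parsing the wrong framing can silently show no output. A sketch of parsing Ollama-style NDJSON (the sample payload below is illustrative, in the shape Ollama's API documents):

```python
import json

# Illustrative NDJSON stream: one JSON object per line, text under
# message.content, and a final object where "done" is true.
sample_stream = """\
{"model":"llama2","message":{"role":"assistant","content":"Hel"},"done":false}
{"model":"llama2","message":{"role":"assistant","content":"lo"},"done":false}
{"model":"llama2","message":{"role":"assistant","content":""},"done":true}
"""

def collect_chat_output(stream: str) -> str:
    """Concatenate message content from an Ollama-style NDJSON chat stream."""
    parts = []
    for line in stream.splitlines():
        if not line.strip():
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

print(collect_chat_output(sample_stream))  # -> Hello
```

If the extension expects SSE lines beginning with "data:", every NDJSON line fails to parse and the chat stays empty, which would match the symptom described above.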