twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Chat API endpoint incorrect for Ollama in default settings #172

Closed jonaslund closed 4 months ago

jonaslund commented 4 months ago

Describe the bug
Ollama is called by Twinny, but no output is generated. The request to /v1/chat/completions returns a 404.

To Reproduce
Steps to reproduce the behavior:

  1. Invoke Twinny
  2. Wait for nothing to happen

Expected behavior
Output is generated in the chat.

Versions
VS Code 1.87.1 (Universal), macOS 12.6

Ollama Logs

```
[GIN] 2024/03/08 - 17:26:20 | 200 |  9.789041042s |       127.0.0.1 | POST     "/api/chat"
time=2024-03-08T17:26:32.833Z level=INFO source=dyn_ext_server.go:171 msg="loaded 1 images"
[GIN] 2024/03/08 - 17:26:41 | 200 |  8.766689709s |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/03/08 - 17:26:41 | 404 |  880.414083ms |       127.0.0.1 | POST     "/v1/chat/completions"
```

The chat API endpoint defaults to /v1/chat/completions, whereas the correct one for Ollama is /api/chat. Even after updating it, though, no output is generated in the chat.
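
The logs above show exactly this split: both POST /api/chat requests return 200, while the POST to /v1/chat/completions returns 404. A minimal sketch for checking the native endpoint directly, assuming Ollama's default port 11434 and a Node 18+ runtime with global fetch; "codellama" is only a placeholder model name, so substitute whatever `ollama list` reports:

```typescript
// Connectivity check against Ollama's native chat endpoint.
// Assumptions: default port 11434, a locally pulled model
// ("codellama" is a placeholder -- use a model from `ollama list`).
async function checkOllamaChat(): Promise<void> {
  const response = await fetch("http://127.0.0.1:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama",
      messages: [{ role: "user", content: "Say hello" }],
      stream: false, // return one JSON object instead of NDJSON chunks
    }),
  });
  // A 404 here would mean the path is wrong for this Ollama build;
  // a 200 with a JSON body means Ollama itself is responding.
  console.log(response.status, await response.json());
}

checkOllamaChat().catch(console.error);
```

A 200 with a JSON body here confirms Ollama is fine and narrows the problem to the plugin-side configuration.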

How can I debug this further?

jonaslund commented 4 months ago

Changing the host to 127.0.0.1 and updating the API endpoints solved the issue.
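
A plausible explanation for the host part of the fix, offered as an assumption rather than a confirmed diagnosis: newer Node-based runtimes can resolve localhost to ::1 (IPv6) first, while Ollama by default listens only on 127.0.0.1 (IPv4), so requests to localhost never connect. A small probe to check whether the two hosts behave differently (a debugging sketch, not Twinny code; it assumes the default port):

```typescript
// Probe whether "localhost" and "127.0.0.1" reach Ollama differently.
// Assumption: the runtime may resolve "localhost" to ::1 (IPv6) first,
// while Ollama binds only to 127.0.0.1 (IPv4) by default.
async function probeHosts(): Promise<void> {
  for (const host of ["http://localhost:11434", "http://127.0.0.1:11434"]) {
    try {
      // GET /api/tags lists installed models; any 200 proves connectivity.
      const res = await fetch(`${host}/api/tags`);
      console.log(host, "->", res.status);
    } catch (err) {
      console.log(host, "-> unreachable:", (err as Error).message);
    }
  }
}

probeHosts().catch(console.error);
```

If localhost is unreachable while 127.0.0.1 returns 200, pointing the extension's hostname setting at 127.0.0.1, as done above, is the right fix.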