Describe the bug
When the Api Provider is set to ollama, the default Chat Api Path is set to /v1/chat/completions instead of the correct /api/chat (source)
Note: This seems to be some sort of copy/paste or selection error, because /v1/chat/completions is also the default for lmstudio
To Reproduce
Steps to reproduce the behavior:
1. Go to the twinny setting twinny.apiProvider
2. Click on it and change it to ollama (if it is already ollama, change to another provider and change back)
3. Problem: the setting twinny.chatApiPath becomes /v1/chat/completions. It should have been /api/chat.
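For reference, a minimal sketch of what the expected defaults would look like in VS Code's settings.json (JSONC) after selecting ollama, based only on the two settings named in this report:

```jsonc
{
  // Provider selected in the reproduction steps above
  "twinny.apiProvider": "ollama",
  // Expected default for this provider: Ollama serves its
  // native chat endpoint at /api/chat, not /v1/chat/completions
  "twinny.chatApiPath": "/api/chat"
}
```

Ollama's native chat endpoint is /api/chat (e.g. POST http://localhost:11434/api/chat), which is why the /v1/chat/completions default is wrong for this provider.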
Screenshots
![image](https://github.com/rjmacarthy/twinny/assets/639467/de178cf0-6a53-4385-b65c-9a5531c70421)