twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License
2.36k stars 130 forks

Default ollama `Chat Api Path` points to the wrong URL path #191

Closed brunoais closed 3 months ago

brunoais commented 4 months ago

Describe the bug
When setting the Api Provider to ollama, the default `Chat Api Path` is set to `/v1/chat/completions` instead of the correct `/api/chat` (source)

Note: This seems to be a copy/paste or selection error, because `/v1/chat/completions` is also the default for lmstudio

To Reproduce
Steps to reproduce the behavior:

  1. Go to the twinny setting `twinny.apiProvider`
  2. Click on it and change it to ollama (if it is already ollama, change to another provider and back)
  3. Problem: the setting `twinny.chatApiPath` becomes `/v1/chat/completions`. It should be `/api/chat`.
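To illustrate the mismatch, the two paths differ in both URL and expected payload shape. A minimal sketch (host, port, and model name are assumptions for illustration; the endpoints are from Ollama's API docs and the OpenAI spec):

```python
OLLAMA_HOST = "http://localhost:11434"  # Ollama's default host and port

# Native Ollama chat endpoint, what twinny should default to:
native_url = OLLAMA_HOST + "/api/chat"

# OpenAI-style path, what twinny was actually setting:
openai_url = OLLAMA_HOST + "/v1/chat/completions"

# Both endpoints accept a chat-style body like this (model name assumed):
body = {
    "model": "codellama",
    "messages": [{"role": "user", "content": "Hello"}],
}

print(native_url)  # http://localhost:11434/api/chat
print(openai_url)  # http://localhost:11434/v1/chat/completions
```

On an Ollama version without OpenAI compatibility, requests to the second URL would 404, which is why the default looked wrong.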

Screenshots (two screenshots attached)

rjmacarthy commented 3 months ago

Hey, this was updated because Ollama now supports the OpenAI specification.

https://ollama.com/blog/openai-compatibility

Please update Ollama and try again.
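With a recent Ollama release (see the blog post above), the OpenAI-style path works as a drop-in endpoint. A sketch of building such a request with the standard library (the request is constructed but not sent, since it needs a running Ollama server; the model name is an assumption):

```python
import json
import urllib.request

# Build an OpenAI-style chat request against Ollama's compatibility endpoint.
payload = {
    "model": "llama2",  # assumed model name for illustration
    "messages": [{"role": "user", "content": "Say hello"}],
}
req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(req.full_url)      # http://localhost:11434/v1/chat/completions
print(req.get_method())  # POST
# Sending would be: urllib.request.urlopen(req) with Ollama running locally.
```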

brunoais commented 3 months ago

Updating ollama fixed it; my ollama was outdated.