twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Cannot chat successfully with ollama #192

Closed brunoais closed 6 months ago

brunoais commented 6 months ago

Describe the bug

No matter what I try when chatting with ollama, I get "Sorry, I don't understand. Please try again."

When I use ollama run with the same model, it works as expected. I'm using codellama:7b-instruct in both cases.

For clarification: FIM is working as expected.

To Reproduce

Steps to reproduce the behavior:

  1. Try to chat with bot
  2. Get error response

Expected behavior

Chatting with the bot succeeds

Screenshots

(Four screenshots attached showing the error response in the chat panel.)

Desktop (please complete the following information):

N/A

Additional context

I'm using the default system message provided by twinny. I also tried blanking the template, but it made no difference. Twinny 2.x was working, but I have no way to downgrade the version; at least, not through the UI.

See also #191, since the cause may be related, just invisible to the user.

rjmacarthy commented 6 months ago

Hey, this was updated because Ollama now supports the OpenAI specification.

https://ollama.com/blog/openai-compatibility

Please update Ollama and try again.
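
After updating, a quick way to confirm the OpenAI-compatible endpoint is reachable is to post a chat request to it directly. This is a minimal sketch: the default port 11434, the `/v1/chat/completions` path from the blog post above, and the model name are assumptions you may need to adjust for your setup.

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint (default local install;
# adjust host/port if yours differs).
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

# Request body in the OpenAI chat-completions shape, using the same
# model the reporter used.
payload = {
    "model": "codellama:7b-instruct",
    "messages": [
        {"role": "user", "content": "Write a hello world in Python."}
    ],
    "stream": False,
}

def chat(url: str = OLLAMA_URL) -> dict:
    """POST the payload; fails with URLError if Ollama is not running or too old."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    reply = chat()
    # OpenAI-style responses carry the text under choices[0].message.content.
    print(reply["choices"][0]["message"]["content"])
```

If this returns a normal completion but twinny still shows the error, the problem is on the extension side rather than Ollama.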

brunoais commented 6 months ago

Updating ollama fixed it; my ollama was outdated.