Describe the bug
No matter what I try, chatting with ollama only returns "Sorry, I don't understand. Please try again."
When I use ollama run with the same model, it works as expected.
I'm using codellama:7b-instruct in both cases.
For clarification: FIM is working as expected.
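To rule out the server itself, the same chat request can be sent straight to ollama's REST API, bypassing twinny. A minimal diagnostic sketch (the endpoint and payload follow ollama's documented /api/chat format; the host and port are the defaults and may differ on your setup):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # assumption: default ollama host/port

def build_chat_payload(model, user_message):
    """Build a non-streaming chat request in ollama's /api/chat format."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }

def send_chat(payload):
    """POST the payload to ollama and return the assistant's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["message"]["content"]

# Usage (requires a running ollama server):
#   payload = build_chat_payload("codellama:7b-instruct", "Write hello world in Python.")
#   print(send_chat(payload))
```

If this returns a normal answer while twinny's chat still fails, the problem is on the extension side (request shape or prompt template), not the model or server.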
To Reproduce
Steps to reproduce the behavior:
1. Try to chat with the bot
2. Get the error response
Expected behavior
Chatting with the bot succeeds
Screenshots
N/A
Desktop (please complete the following information):
N/A
Additional context
I'm using the default system message provided by twinny. I also tried blanking the template and retrying, but it made no difference.
Twinny 2.X was working, but I have no way to downgrade the version; at least, not from the UI.
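Since FIM works but chat does not, a mismatched chat prompt template is a plausible culprit: codellama's instruct variants expect the Llama-2 style [INST] wrapping. A sketch of what the rendered template should roughly look like (the exact tag layout here is my assumption based on the Llama-2 convention, not twinny's actual template):

```python
def format_codellama_chat(system, user):
    """Wrap a system + user message in the Llama-2 style [INST] template
    that codellama's instruct variants expect. Assumption: twinny's chat
    template should render to something equivalent for chat to work."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"

prompt = format_codellama_chat(
    "You are a helpful coding assistant.",
    "Explain what fill-in-the-middle (FIM) means.",
)
# A prompt the model cannot parse would explain a canned or empty reply
# from chat while FIM (which uses a different prompt format) still works.
```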
See also #191; the cause may be related, just invisible to the user.