twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License
2.99k stars 158 forks

Problems with FIM (automatic completion suggestions) #276

Closed Maesing closed 1 month ago

Maesing commented 2 months ago

Hello,

I am using Visual Studio Code with the Twinny extension. On one of my PCs, I host Ollama running the deepseek-coder 1.3b model (I also have codellama and codeqwen available). On my other PC, I have connected Twinny to the LLM hosted on the first PC.

While the chat functionality in the extension works fine, the autocompletion is problematic. Sometimes it suggests code in the wrong programming language, or simply inserts irrelevant prose (e.g., "It seems like you've posted a piece of JavaScript code that includes several functions..."). The code completions it provides are not very helpful.

My question is: What could be causing these issues? Is it the model, the settings, or something else I am doing wrong? I would greatly appreciate any assistance in resolving this problem.

[Screenshot: FIM suggestion]

Here is a screenshot of one suggestion. The string at the top right is just plain text and gets inserted when pressing Tab. It's a bit weird and annoying, because you then have to delete the random text it inserted. I hope this is a good example.

(I don't know what context would help diagnose the error, so if you need anything, just ask :) )

Thank you!

rjmacarthy commented 2 months ago

Hello, for deepseek please try using a base model, as stated in the documentation.

E.g. https://ollama.com/library/deepseek-coder:base

https://twinnydotdev.github.io/twinny-docs/general/supported-models/
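The distinction matters because base models are trained to continue fill-in-the-middle prompts built from special sentinel tokens, while instruct/chat variants expect a conversation and therefore answer in prose, which matches the irrelevant "It seems like you've posted..." suggestions above. As a rough sketch (assuming deepseek-coder's documented FIM sentinel tokens; the function name here is illustrative, not twinny's actual code), a FIM prompt looks like:

```python
def build_deepseek_fim_prompt(prefix: str, suffix: str) -> str:
    """Assemble a fill-in-the-middle prompt from the code before and
    after the cursor, using deepseek-coder's FIM sentinel tokens."""
    return f"<｜fim▁begin｜>{prefix}<｜fim▁hole｜>{suffix}<｜fim▁end｜>"

# The completion plugin sends something of this shape to the model:
# a base model continues with code for the "hole", while an instruct
# model tends to treat it as a chat message and replies in English.
prompt = build_deepseek_fim_prompt("def add(a, b):\n    return ", "\n")
print(prompt)
```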

robertpiosik commented 2 months ago

Config for API-provided deepseek-coder: [screenshot of provider settings]

vrgimael commented 2 weeks ago

I'm also having similar issues and can't get it to work with Ollama.
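One way to narrow this down (a troubleshooting sketch, not official twinny guidance) is to bypass the extension and send a raw FIM prompt straight to Ollama's generate endpoint. It assumes Ollama is on its default port 11434 and that the base model has been pulled; a base model should answer with code, while an instruct model typically answers in prose:

```shell
# Pull the base (non-instruct) variant first
ollama pull deepseek-coder:base

# Send a raw fill-in-the-middle prompt; "raw": true skips Ollama's
# chat templating so the FIM sentinel tokens reach the model verbatim
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-coder:base",
  "prompt": "<｜fim▁begin｜>def add(a, b):\n    return <｜fim▁hole｜>\n<｜fim▁end｜>",
  "raw": true,
  "stream": false
}'
```

If this returns code rather than an English explanation, the model side is fine and the problem is more likely in the extension's provider/template settings.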