Closed · Maesing closed this 1 month ago
Hello, for deepseek please try using a base
model, as stated in the documentation,
e.g. https://ollama.com/library/deepseek-coder:base
https://twinnydotdev.github.io/twinny-docs/general/supported-models/
Config for the API-provided deepseek-coder:
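For reference, a minimal sketch of what an Ollama FIM (autocomplete) provider entry for Twinny might look like. The field names, the hostname placeholder, and the template value are assumptions for illustration, not the exact settings schema; check the values against the Twinny provider dialog for your setup:

```jsonc
{
  "label": "ollama-fim",               // provider name (example)
  "provider": "ollama",
  "type": "fim",                        // FIM provider type is used for autocomplete
  "modelName": "deepseek-coder:base",   // base model, not the instruct/chat variant
  "fimTemplate": "deepseek",            // FIM prompt template matching the model family (assumed)
  "apiHostname": "192.168.x.x",         // placeholder: IP of the PC hosting Ollama
  "apiPort": 11434,                     // Ollama's default port
  "apiPath": "/api/generate"            // Ollama completion endpoint
}
```

The key point is the pairing of a base model with the FIM provider type: instruct/chat variants tend to answer in prose instead of filling in code, which matches the symptoms described below.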
I'm also having similar issues and can't get it to work with Ollama.
Hello,
I am using Visual Studio Code with the Twinny extension. On one of my PCs, I host Ollama running the deepseek-coder:1.3b model (I also have codellama and codeqwen available). On my other PC, I have connected Twinny to the LLM hosted on the first PC.
While the chat functionality in the extension works fine, autocompletion is problematic. Sometimes it suggests the wrong programming language, or inserts irrelevant text (e.g., "It seems like you've posted a piece of JavaScript code that includes several functions..."). The code completions it provides are not very helpful.
My question is: What could be causing these issues? Is it the model, the settings, or something else I am doing wrong? I would greatly appreciate any assistance in resolving this problem.
Here is a screenshot of a suggestion. The text in the top right is just a plain string that gets inserted when pressing Tab. It's a bit weird and annoying, since you then have to delete the random text afterwards. I hope this is a good example.
(I don't know what context would help diagnose the error, so if anything would help, just ask for it :) )
Thank you!