Closed loczek closed 11 months ago
The extension doesn't work when I use autocompletion with Ollama/codellama, here is the extension log:
2023-11-06 11:35:10.419 [error] TypeError: dG.log is not a function
at Ah (/home/user/.vscode/extensions/danielsanmedium.dscodegpt-2.2.1/dist/extension.js:284:82)
at Timeout._onTimeout (/home/user/.vscode/extensions/danielsanmedium.dscodegpt-2.2.1/dist/extension.js:419:397)
But if I use Postman to POST JSON data to http://localhost:11434/api/generate (the Ollama API), it works fine.
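For anyone who wants to run the same sanity check without Postman, the request can be built in a few lines of Python. This is a minimal sketch assuming the default Ollama port and endpoint; the model name and prompt are placeholders:

```python
import json
import urllib.request

# Default Ollama generate endpoint (same one used above with Postman).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a POST request for Ollama's /api/generate."""
    body = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("codellama", "def fib(n):")
# Sending it (requires a running Ollama server):
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```

If this call returns a completion but the extension still hangs, the problem is on the extension side, not Ollama.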
Try the latest v2.2.3; that fixed it for me on macOS.
It's working now!
hmm, I have this problem too. I just loaded it up and it's running 2.2.3, but autocomplete still produces the same error as [loczek]. Normal chat works, but once autocomplete is active it hangs and chat breaks too. ollama-runner GPU usage is around 80%, with Code Helper (GPU) using the other 15% or so, yet nothing autocompletes. Any thoughts?
Same for me. Chat works in 2.2.3, but autocomplete just emits the completion text to the Output tab, not into the source code editor, like here: https://github.com/davila7/code-gpt-docs/issues/210
Autocompletion does not work with the latest version, v3.1.1. It just shows [11:08:12] completion provider = Ollama
in the Output tab.
Also, there are fewer settings than in previous versions: when selecting Ollama under Api Key, there is no model to choose from. Only Autocomplete: Provider offers Ollama - codellama as a choice.
Does it just not work? Should I just forget about this functionality?
Code autocompletion doesn't work, and the spinner in the bottom right corner keeps spinning. In the extension runtime stats there is a
dG.log is not a function
error. I'm running the extension with codellama through Ollama, and normal chat works fine.