JudiniLabs / code-gpt-docs

Docusaurus page
https://code-gpt-docs.vercel.app
MIT License

Auto-complete errors with ollama, Chat works fine #279

Closed: geoffgs closed this issue 1 month ago

geoffgs commented 1 month ago

Using Ollama with any model produces the same result for auto-complete: silent failures. Meanwhile, the Llama-Coder extension has no trouble, and CodeGPT Chat works fine. This seems to be a CodeGPT-specific problem I haven't figured out how to resolve.

Steps to Reproduce:

Expected Behavior:

Actual Behaviour:

Environment: macOS Sonoma 14.4.1, CPU: M2 Pro, RAM: 32GB

VSCode: 1.89.1 (Universal)
Commit: dc96b837cf6bb4af9cd736aa3af08cf8279f7685
Date: 2024-05-07T05:14:24.611Z
Electron: 28.2.8
ElectronBuildId: 27744544
Chromium: 120.0.6099.291
Node.js: 18.18.2 (I noticed a recommendation for Node 20, but this is what VSCode ships with for Mac)
V8: 12.0.267.19-electron.0
OS: Darwin arm64 23.4.0

CodeGPT: 3.3.25
ollama: 0.1.32
Models: codellama, deepseek-coder

Other Logs: from the Extension Host, after a Cmd-Shift-I:

[Extension Host] TypeError: Cannot read properties of undefined (reading 'globalState')
    at cz (/Users/some.user/.vscode/extensions/danielsanmedium.dscodegpt-3.3.25/dist/extension.js:515:412)
    at M8 (/Users/some.user//.vscode/extensions/danielsanmedium.dscodegpt-3.3.25/dist/extension.js:517:15)

From VSCode Output > CodeGPT Autocomplete, the same message repeats on every keystroke: [{timestamp}] completion provider = Ollama

From the VS Code DevTools Console, when switching context to a Python tab:

[Extension Host] CodeGPT Language Change 
{langID: 'python', lang: 'super-secret-code.py', activeTextEditor: {…}}
activeTextEditor: {document: {…}, selection: {…}, selections: Array(1), visibleRanges: Array(1), options: {…}, …}
lang: "super-secret-code.py"
langID: "python"
geoffgs commented 1 month ago

Found the bug on my end; it was wholly unrelated. Apologies.