JudiniLabs / code-gpt-docs

Docusaurus page
https://code-gpt-docs.vercel.app
MIT License

No completion in editor but there is output in logs when using ollama + code llama #210

Closed: PRESIDENT810 closed this issue 4 months ago.

PRESIDENT810 commented 9 months ago

I'm running code llama locally with Ollama. I installed the CodeGPT plugin in VSCode and enabled it. However, no completion appears in the code editor, even though I can see output from code llama in the "Output" tab:

[screenshot: code llama output appearing in the Output tab]

whereas I'm expecting something like this:

[screenshot: expected inline completion in the editor]

Did I configure it wrong? Or does code llama not support displaying completions in the editor?

Skybladev2 commented 8 months ago

Same for me. It doesn't even emit anything to the Output tab since v.3.X.

PilarHidalgo commented 4 months ago

Hi! You can follow the instructions here: https://docs.codegpt.co/docs/tutorial-ai-providers/ollama
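Before checking the extension settings, it can help to confirm that Ollama itself is serving the model. A minimal sanity check from the terminal, assuming a default Ollama install on port 11434 and the `codellama` model (model name and port are the Ollama defaults, not something stated in this thread):

```shell
# Pull the model if it isn't already available locally
ollama pull codellama

# Verify the Ollama HTTP API responds with a completion.
# A JSON response here means the server side works and the
# problem is likely in the extension's provider configuration.
curl http://localhost:11434/api/generate \
  -d '{"model": "codellama", "prompt": "def fib(n):", "stream": false}'
```

If the curl call returns output but the editor still shows nothing, the issue is on the CodeGPT side (provider/model selection in the extension settings) rather than with Ollama or the model.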