Describe the bug
I tried to follow the instructions, and when I get to the "open sidebar" step, I get an error saying that this command doesn't exist.

To Reproduce
Steps to reproduce the behavior:
At step 4, instead of using the shortcut (which did nothing for me on a Mac), search for the command in the Cmd+Shift+P panel and trigger "Enable twinny sidebar".
You'll see an error: command 'twinny.showSidebar' not found
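As far as I understand, VS Code only reports "command '…' not found" when no extension has registered that command id, which usually means the extension failed to activate at all. Here is a throwaway diagnostic I sketched to check this (the `diagnose.listTwinnyCommands` id is my own invention, not something from twinny):

```typescript
// Minimal diagnostic extension, not twinny's actual source: list every
// registered command id containing "twinny". An empty list means the
// extension never activated (or never reached registerCommand), which
// is exactly when VS Code reports "command 'twinny.showSidebar' not found".
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.commands.registerCommand('diagnose.listTwinnyCommands', async () => {
      // Pass true to filter out VS Code's internal commands.
      const all = await vscode.commands.getCommands(true);
      const twinny = all.filter((id) => id.includes('twinny'));
      vscode.window.showInformationMessage(
        twinny.length > 0
          ? `Registered twinny commands: ${twinny.join(', ')}`
          : 'No twinny commands are registered; the extension probably failed to activate.'
      );
    })
  );
}
```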
Expected behavior
I actually don't know what this sidebar should look like, but I guess this is why the autocompletion doesn't work for me (I do have the little robot emoji that spins when I type, but nothing happens).
Screenshots
Desktop (please complete the following information):
OS: macOS Sonoma 14.4.1
Additional context
I should point out that I'm not sure step 2 has been completed, because it only says "Set up Ollama as the backend by default: Install Ollama". I followed the link and installed Ollama; it's running properly and I've been able to test the models from the Terminal. But I haven't actually "set up Ollama as the backend by default", because there are no instructions on how to do so. Is there something I must configure?
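For what it's worth, the Ollama side does look healthy on my machine. This quick sanity check (my own test script, not part of twinny; it assumes Ollama's documented default address, http://localhost:11434) lists the locally installed models over Ollama's HTTP API:

```typescript
// Sanity check for a local Ollama server (run with Node 18+, which has a
// global fetch). GET /api/tags is Ollama's endpoint for listing locally
// installed models; if this call fails, no editor plugin can use Ollama
// as a backend either.
async function checkOllama(baseUrl = 'http://localhost:11434'): Promise<void> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status}`);
  }
  const data = (await res.json()) as { models: Array<{ name: string }> };
  console.log('Ollama is up; installed models:', data.models.map((m) => m.name).join(', '));
}

checkOllama().catch((err) => console.error('Ollama unreachable:', err));
```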
Also, I've found that this thread talks about selecting a model, so I tried running the "Manage twinny providers" command from the Cmd+Shift+P panel, but nothing happens.