twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License
2.3k stars 126 forks

Unable to interact with ollama running VSCode with WSL2 #213

Closed mdbooth closed 2 months ago

mdbooth commented 2 months ago

Describe the bug I run a Windows shell but code in Linux. When launching VSCode I select 'Connect to WSL'. From then on I am running in Linux. I am running ollama in a terminal window locally on WSL. It is listening on localhost:11434, which I have verified with curl. ollama run produces code output.
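For anyone reproducing the check described above, it should be run from inside the WSL shell rather than the Windows host. A minimal sketch using ollama's standard HTTP endpoints (the model prompt itself is not needed for this check):

```shell
# Run inside the WSL2 distribution, not from a Windows shell.
# The root endpoint answers "Ollama is running" when the server is up.
curl http://localhost:11434

# List locally pulled models to confirm the completion model is present.
curl http://localhost:11434/api/tags
```

If both commands respond inside WSL but the extension still cannot connect, the problem is likely on the extension side rather than the server side.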

I can't make twinny do anything at all. Other than the twinny robot head in the status bar I can see no evidence that it is installed.

The twinny extension is listed under WSL: Fedora - INSTALLED in VSCode, which is where most of my extensions are installed.

To Reproduce Steps to reproduce the behavior:

  1. In WSL2, download the ollama v0.1.31 binary.
  2. Run ollama serve
  3. From another terminal session, run ollama run codellama:7b-code
  4. Wait for it to download and start running, then quit with /bye
  5. Open VSCode
  6. At the bottom left select the connection icon, then choose 'Connect to WSL'
  7. Install twinny v3.11.4
  8. Exit VSCode
  9. Restart VSCode
  10. Open any code repository
  11. Attempt to interact with twinny in any way
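The terminal portion of the steps above (2 through 4) can be sketched as follows; this is an approximation in which the server is backgrounded and a piped prompt replaces the interactive session, so no /bye is needed:

```shell
# Step 2: start the ollama server (listens on localhost:11434 by default).
ollama serve &

# Steps 3-4: the first run of the model triggers the download; piping a
# prompt keeps the session non-interactive, unlike the original repro.
echo 'def fib(n):' | ollama run codellama:7b-code
```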

Expected behavior Placing the cursor in a code context, entering some text, and pressing ALT + \ should result in a connection to ollama.

I saw https://github.com/rjmacarthy/twinny/issues/207 reporting a similar issue with Remote SSH. I opened a fresh issue because they were able to interact via chat, which I am not, although I wonder if that might be due to some unrelated interaction with another extension.

mdbooth commented 2 months ago

Looks like 3.11.5 was released while I was typing the above! I updated to it, but the issue remains.

mdbooth commented 2 months ago

Update: I noticed that there is a 'Twinny' icon at the bottom of my extensions list which opens a chat interface. Typing things into the chat interface does produce log output in ollama (although I can't make it display a response). If you think this is a generic 'remote' issue, please close in favour of #207.

mdbooth commented 2 months ago

Wait, I restarted VSCode again after the 3.11.5 update, and now I'm getting autocompletions. I'm going to close this. Sorry for the noise!