ex3ndr / llama-coder

Replace Copilot with local AI
https://marketplace.visualstudio.com/items?itemName=ex3ndr.llama-coder
MIT License

More flexibility with remote hosts #27

Open · corinfinite opened 5 months ago

corinfinite commented 5 months ago

Hi,

I use VS Code's Remote SSH / Remote Containers extensions for most of my development.

When I first tested out llama-coder, I ran ollama on another machine on my local network, separate from my development laptop. This worked great!
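
(For reference, that setup is just a settings change pointing the extension at the other machine. A minimal sketch, assuming the endpoint key is `inference.endpoint` and the inference box is reachable at `192.168.1.50`; check the extension's settings page for the exact key. 11434 is Ollama's default port.)

```jsonc
// VS Code settings.json (JSONC, so comments are allowed).
// The key name below is an assumption; verify it against the
// extension's configuration before copying.
{
  "inference.endpoint": "http://192.168.1.50:11434"
}
```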

However, when I use VS Code to connect to a remote development server, that server doesn't have access to the local machine I use for inference, and VS Code only allows llama-coder to run on the remote host. Unfortunately, VS Code can't use reverse SSH tunnels, which was the first solution I tried.

I suspect that this could be resolved by adding `"extensionKind": ["workspace", "ui"]` to the extension manifest, as described here. This would keep the default behavior of running on the remote host, but allow the extension to be switched to running locally if desired.
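
For context, `extensionKind` sits at the top level of the extension's `package.json`. A minimal sketch of the proposed change, with every other manifest field omitted:

```json
{
  "name": "llama-coder",
  "extensionKind": ["workspace", "ui"]
}
```

VS Code treats the order as a preference, so listing `"workspace"` first keeps the current default (run on the remote host) while `"ui"` makes the extension eligible to run locally when the user overrides it.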

Let me know what you think. The above comes from reading over the linked pages, but I don't have any experience writing VS Code extensions, so I could be wrong about something here.

corinfinite commented 5 months ago

I added the `extensionKind` change, and that allows the extension to run locally, but I see the following error: `Unsupported document: vscode-remote://attached-container%HASH/path/to/file.cpp`. It should be easy to fix once I get some time to get back to this.
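
My guess at the cause: the extension probably filters documents by URI scheme and only accepts local ones like `file`, so a `vscode-remote://` document gets rejected once the extension runs on the UI side. A hypothetical sketch of that kind of check (the names and scheme list are illustrative, not the actual llama-coder code):

```typescript
import * as vscode from 'vscode';

// Hypothetical allowlist. If it only contained 'file' and 'untitled',
// any vscode-remote:// document would be rejected with an
// "Unsupported document" error when the extension runs locally.
const SUPPORTED_SCHEMES = ['file', 'untitled', 'vscode-remote'];

function isSupportedDocument(doc: vscode.TextDocument): boolean {
  return SUPPORTED_SCHEMES.includes(doc.uri.scheme);
}
```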

ex3ndr commented 4 months ago

I have fixed this in 0.0.13