twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Not working with localforward to remote ollama server #93

Closed: eriktar closed this issue 7 months ago

eriktar commented 7 months ago

In my setup I have an underpowered laptop where I do my coding and a beefy server that runs Ollama. The server is not on the internet and has to be reached via an SSH jump host, so I use LocalForward 11434 10.10.0.233:11434 to emulate having it locally.
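For reference, a minimal sketch of the SSH client config behind this kind of tunnel; the host alias, jump host name, and user below are placeholders, only the LocalForward line and the 10.10.0.233 address come from my setup:

# ~/.ssh/config (sketch; alias, jump host, and user are placeholders)
Host ollama-tunnel
  # hypothetical jump host that can reach the internal network
  HostName jump.example.com
  User erik
  # forward local port 11434 to the Ollama server on the internal network
  LocalForward 11434 10.10.0.233:11434

With ssh -N ollama-tunnel running, http://localhost:11434 on the laptop behaves as if Ollama were installed locally.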

For curl and other VS Code plugins like Continue, the Ollama server appears to be available on localhost:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "codellama:13b",
  "prompt": "Write me a function that outputs the fibonacci sequence"
}'

{"model":"codellama:13b","created_at":"2024-02-06T07:24:29.612339609Z","response":"\t","done":false}
{"model":"codellama:13b","created_at":"2024-02-06T07:24:29.635149631Z","response":"if","done":false}
{"model":"codellama:13b","created_at":"2024-02-06T07:24:29.657876752Z","response":" err","done":false}
{"model":"codellama:13b","created_at":"2024-02-06T07:24:29.680659605Z","response":" :=","done":false}
{"model":"codellama:13b","created_at":"2024-02-06T07:24:29.703439665Z","response":" validate","done":false}
...

Twinny, however, does not appear to be compatible with this setup and keeps insisting that I install Ollama locally.

[screenshot: Twinny prompting to install Ollama]

Expected behavior

Since the configuration is all about ports and URLs, I expect Twinny to use the host:port combination I set up rather than making any other system calls to Ollama.
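For illustration, the kind of settings I would expect to be sufficient on their own; the twinny.* key names below are assumptions about the extension's configuration, not confirmed against its documentation:

// .vscode/settings.json (sketch; key names are assumed, check the extension's settings UI)
{
  // point Twinny at the forwarded tunnel instead of a local Ollama install
  "twinny.apiHostname": "localhost",
  "twinny.apiPort": 11434
}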

Desktop

[screenshots of environment details]
rjmacarthy commented 7 months ago

Hey, thanks for the info. This seems to be a duplicate of https://github.com/rjmacarthy/twinny/issues/70?

eriktar commented 7 months ago

Yup, that's a duplicate. I didn't spot it since the summary/title mentioned Windows.