twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License
2.36k stars · 130 forks

Display an indicator when twinny loses connection to the LLM provider #168

Closed · onel closed this 4 months ago

onel commented 4 months ago

Is your feature request related to a problem? Please describe. When twinny loses connection, there is no indicator that this has happened; we just see the loading indicator.

Describe the solution you'd like A warning/error message above the chat box would be helpful to alert the user that something went wrong.

I'm not sure how it would look if the side panel is closed and the user only uses FIM.
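
For the FIM-only case, maybe a status bar item would work. A rough sketch of what I mean (the function names are hypothetical, not twinny's actual code):

```typescript
import * as vscode from 'vscode'

// One status bar item, reused for both states.
const statusBar = vscode.window.createStatusBarItem(
  vscode.StatusBarAlignment.Right
)

export function showConnectionOk(): void {
  statusBar.text = '$(check) twinny'
  statusBar.tooltip = 'Connected to the LLM provider'
  statusBar.show()
}

export function showConnectionLost(provider: string): void {
  statusBar.text = '$(warning) twinny'
  statusBar.tooltip = `Lost connection to ${provider}`
  statusBar.show()
  // A one-off toast so the user notices even with the side panel closed.
  vscode.window.showWarningMessage(
    `twinny: request to ${provider} failed. Is it still running?`
  )
}
```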

rjmacarthy commented 4 months ago

How would you propose that we detect that the connection is lost? I am not a fan of polling the endpoints.

Many thanks,

AntonKrug commented 4 months ago

I think there is no persistent connection to the provider to begin with. Each API call starts and ends; there is no long-running WebSocket or other connection, so there is no way to tell whether anything is there to respond until a call is made. The plugin would have to keep pinging the API just to know it is present. Kind of like a road trip with kids: are we there yet?
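
Just to make it concrete, knowing the provider is up ahead of time would mean something like the sketch below, i.e. the polling nobody wants. The endpoint and interval are assumptions; Ollama's default address is used for illustration:

```typescript
const PROVIDER_URL = 'http://localhost:11434' // Ollama's default address
const POLL_INTERVAL_MS = 10_000 // arbitrary

// The only way to know the provider is "there" before a completion is
// requested is to keep asking it, which wastes a request every interval
// whether or not the user is typing.
async function providerIsUp(): Promise<boolean> {
  try {
    const res = await fetch(PROVIDER_URL)
    return res.ok
  } catch {
    return false // connection refused, DNS failure, etc.
  }
}

setInterval(async () => {
  const alive = await providerIsUp()
  // ...update some indicator with `alive`
}, POLL_INTERVAL_MS)
```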

onel commented 4 months ago

No, polling would definitely not be a nice option, and I'm not sure we need a service to keep track of that. I was wondering if it makes sense to handle the case where actual requests fail.

For example, going through the code I see that in streamTemplateCompletion twinny makes a POST call, alongside other calls like this._view?.webview.postMessage({}) and buildStreamRequest. I wonder if it makes sense to handle those calls when they fail, for example when ollama is offline?
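
Something along these lines is what I have in mind. A minimal sketch, not twinny's actual code: the endpoint shown is Ollama's /api/generate, and the message type is made up:

```typescript
import * as vscode from 'vscode'

let view: vscode.WebviewView | undefined

// Sketch: catch the failure at request time instead of polling.
async function streamCompletion(body: unknown): Promise<void> {
  try {
    const res = await fetch('http://localhost:11434/api/generate', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(body)
    })
    if (!res.ok) {
      throw new Error(`Provider returned HTTP ${res.status}`)
    }
    // ...consume the stream as usual
  } catch (err) {
    // Connection refused, timeout, etc.: tell the webview so it can
    // render a warning above the chat box instead of spinning forever.
    view?.webview.postMessage({
      type: 'twinny-connection-error', // hypothetical message type
      value: String(err)
    })
  }
}
```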

rjmacarthy commented 4 months ago

Not relevant, so closing.