julian-shalaby / ChatWindow-feedback


Offline mode code completion #8

Closed LeZderich closed 3 months ago

LeZderich commented 4 months ago

Hello, I'm using Ollama with Mistral, and while I'm offline only the chat works; the code completion (Alt+X) does not.

Is there a way to configure the default LLM for code completion?

Thanks a lot for your plugin, it has been really helpful.
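For context, a completion-style request against a locally running Ollama server uses its `/api/generate` endpoint. Below is a minimal sketch of the request body such a plugin would send; the port, model name, and options shown are assumptions based on Ollama's defaults, not this plugin's actual configuration.

```python
import json

# Assumed Ollama defaults: local server on port 11434, "mistral" pulled locally.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_completion_request(prompt: str, model: str = "mistral") -> str:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,        # any locally pulled model works fully offline
        "prompt": prompt,
        "stream": False,       # return the whole completion in one response
        "options": {"temperature": 0.2},  # lower temperature suits code completion
    }
    return json.dumps(payload)

body = build_completion_request("def fib(n):")
print(body)
# A plugin would POST this body to OLLAMA_URL, e.g. via urllib.request.
```

Whether the plugin exposes a setting to route Alt+X completions through such a request is exactly the question here.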

julian-shalaby commented 3 months ago

Hey! Over the past few months I have been redoing the entire ChatWindow project from the ground up. Unfortunately, the V2 architecture will not allow for Ollama usage. I understand this is a downside in many ways, but hopefully the additional upsides will mitigate the tradeoff.

To be honest, I would love to support Ollama, but after closely evaluating how an integration could work seamlessly in the new architecture, the development overhead is just too much to take on smoothly as a single developer. The expanded scope leaves too many edge cases, and the only way I could address them while accounting for security considerations and a smooth feature experience would be a bunch of hacky layers of indirection. I'll be open to reimplementing Ollama support one day in the future.