carlrobertoh / CodeGPT

JetBrains extension providing access to state-of-the-art LLMs, such as GPT-4, Claude 3, Code Llama, and others, all for free
https://codegpt.ee
Apache License 2.0
912 stars 186 forks

Add enable/disable option for providers #553

Open · AnonTester opened this issue 2 months ago

AnonTester commented 2 months ago

Describe the need of your request

I'm using the plugin with a local OpenAI-compatible provider and do not want any of my code, or parts of it, to be sent to an external provider. This is now the second time that a plugin update has reset my selected provider, and I only noticed because an error popped up saying that code completion isn't allowed without an auth token. The latest update changed the selected provider from my previous Custom OpenAI setting to Ollama (Local).

Proposed solution

Add toggles in the settings to enable or disable each AI provider. The provider selection in the sidebar chat window would then show only the enabled ones.

Checkboxes on this settings screen, one per provider, would do the job just fine (screenshot attached).
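A minimal sketch of what such toggles could look like internally (all names are hypothetical and illustrative, not CodeGPT's actual code): each provider carries an enabled flag, everything starts disabled, and the sidebar dropdown is built only from the enabled entries.

```java
import java.util.Arrays;
import java.util.EnumMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: per-provider enable flags that filter the chat
// window's provider dropdown. Provider names are illustrative only.
public class ProviderToggles {
    enum Provider { OPENAI, ANTHROPIC, OLLAMA_LOCAL, CUSTOM_OPENAI }

    private final Map<Provider, Boolean> enabled = new EnumMap<>(Provider.class);

    ProviderToggles() {
        // Every provider starts disabled until the user opts in.
        for (Provider p : Provider.values()) enabled.put(p, false);
    }

    void setEnabled(Provider p, boolean value) {
        enabled.put(p, value);
    }

    // Only enabled providers appear in the sidebar selection.
    List<Provider> selectableProviders() {
        return Arrays.stream(Provider.values())
                .filter(p -> enabled.get(p))
                .toList();
    }
}
```

With only Custom OpenAI enabled, the dropdown would contain exactly one entry, so an update could never silently switch to a different provider.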

Additional context

Related to this issue is the reset of the settings or the selected provider during an update, which should not happen. If a change prevents the previous selection from being retained, I suggest disabling ALL providers by default rather than risking sensitive code or questions being sent to an external provider, which could cause data-protection issues.
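The safe-default behaviour suggested above could be sketched like this (again, names and structure are hypothetical, not CodeGPT's implementation): when the persisted provider id cannot be restored after an update, select no provider at all instead of falling back to an online one.

```java
import java.util.Optional;
import java.util.Set;

// Hypothetical sketch of a safe fallback during settings migration:
// an unknown or missing stored provider id yields *no* selection,
// never a silent switch to an external provider.
public class ProviderMigration {
    static final Set<String> KNOWN_PROVIDERS =
            Set.of("openai", "anthropic", "ollama-local", "custom-openai");

    // Returns the restored provider id, or empty if the stored value
    // is unknown; the caller must then prompt the user before sending
    // any code or prompt anywhere.
    static Optional<String> restoreSelection(String storedId) {
        if (storedId != null && KNOWN_PROVIDERS.contains(storedId)) {
            return Optional.of(storedId);
        }
        return Optional.empty(); // never default to an external provider
    }
}
```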

Thank you very much for this plugin!

Rikj000 commented 3 weeks ago

> Additional context
>
> Related to this issue is the reset of the settings or the selected provider during an update, which should not happen. If a change prevents the previous selection from being retained, I suggest disabling ALL providers by default rather than risking sensitive code or questions being sent to an external provider, which could cause data-protection issues.

I second this; it's very annoying that after each CodeGPT update my selected provider (Custom OpenAI, for a locally hosted model) is reset to an online option.

I do not want any online connection to an LLM, since I do not trust them with my data, nor do I want them training on it.

Please resolve this.