Closed: bruceunx closed this 6 months ago
Is it OK to send a PR?
I think that's okay if it's the same API. Let's see if @gptlang and @deathbeam agree on this.
From what I understand, support for other models would depend on an API-level translation layer. In that case, I don't think we should hard-code the models or have separate URL fields. Just allow user configuration of the completion URL and model. For authentication, maybe also allow the user to send a custom header.
Having code in the plugin itself that requires the user to self-host or use a third-party service is not worth it.
Yeah, configuration for URL, headers, a request transformer, and a response transformer should be plenty IMO. Then let people do whatever they want. As for ready-made implementations inside the plugin itself, there are already plugins for all of that, tons of them even.
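For illustration, here is a rough sketch in Python of what such a request transformer and response transformer pair could look like for Gemini. The function names and exact payload shapes are assumptions for the sake of the example, not the plugin's actual API, and the Gemini v1beta format may change.

```python
# Conceptual sketch only: transform_request/transform_response are illustrative
# names, not part of CopilotChat.nvim. Payload shapes follow the Gemini v1beta
# generateContent REST API as understood at the time of writing.

def transform_request(openai_body: dict) -> dict:
    """Map an OpenAI-style chat completion request to a Gemini generateContent body."""
    contents = []
    for msg in openai_body.get("messages", []):
        # Gemini uses "user"/"model" roles; system prompts are folded into a user turn here.
        role = "model" if msg["role"] == "assistant" else "user"
        contents.append({"role": role, "parts": [{"text": msg["content"]}]})
    return {"contents": contents}


def transform_response(gemini_body: dict) -> dict:
    """Map a Gemini generateContent response back into an OpenAI-style response."""
    text = gemini_body["candidates"][0]["content"]["parts"][0]["text"]
    return {
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": "stop",
            }
        ]
    }
```

With hooks like these, the plugin itself only needs to know the completion URL, headers, and the two transformer functions; everything provider-specific lives in user configuration.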
Yes, I implemented a gpt_server to support the same API without much change to this plugin. It lets you switch GPT backends easily without any other plugins, which is convenient for personal use; you can set the server up locally or in the cloud, and add other GPT providers easily if you want. Anyway, I am very happy with this approach and just want to share the idea with anyone who may need it.
For a self-hosted gpt-server, you can add an auth token with the same logic.
gpt-server: https://github.com/bruceunx/gpt-server
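To make the idea concrete, here is a minimal FastAPI sketch of such an OpenAI-compatible proxy with a simple auth-token check. This is not how gpt-server is actually implemented; the endpoint path, environment variables, and the Groq upstream URL are placeholder assumptions.

```python
# Minimal sketch, not bruceunx's actual gpt-server code.
# Assumes fastapi, uvicorn, and httpx are installed; UPSTREAM_URL,
# UPSTREAM_API_KEY, and GPT_SERVER_TOKEN are placeholders you configure yourself.
import os

import httpx
from fastapi import FastAPI, Header, HTTPException

app = FastAPI()

UPSTREAM_URL = os.environ.get(
    "UPSTREAM_URL", "https://api.groq.com/openai/v1/chat/completions"
)
UPSTREAM_KEY = os.environ.get("UPSTREAM_API_KEY", "")
EXPECTED_TOKEN = os.environ.get("GPT_SERVER_TOKEN", "change-me")


@app.post("/v1/chat/completions")
async def chat_completions(body: dict, authorization: str = Header(default="")):
    # Same auth logic as the client side: a bearer token in the Authorization header.
    if authorization != f"Bearer {EXPECTED_TOKEN}":
        raise HTTPException(status_code=401, detail="invalid token")

    # Forward the OpenAI-style request to whichever upstream provider is configured.
    async with httpx.AsyncClient(timeout=60) as client:
        resp = await client.post(
            UPSTREAM_URL,
            headers={"Authorization": f"Bearer {UPSTREAM_KEY}"},
            json=body,
        )
    return resp.json()
```

You would run something like this with `uvicorn server:app` and point the plugin's completion URL at `http://localhost:8000/v1/chat/completions`.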
Add Support for Gemini and Other AI Providers
Just add other AI suppliers for Copilot Chat, like Gemini or Groq. Does anybody need this?
I also implemented a server with FastAPI to support the same API interface: https://github.com/bruceunx/CopilotChat.nvim
Is it OK to send a PR, or should I just leave it here?