Open nathancartlidge opened 1 month ago
It would be nice to have local models too, for example via Ollama: https://ollama.com/ It supports Llama 3, Phi-3, and a lot of other models: https://ollama.com/library There is also a C# client: https://github.com/awaescher/OllamaSharp
@minzdrav This would be enabled by my proposed change - Ollama provides partial support for the OpenAI API schema, so you'd be able to point the plugin at your local model
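To illustrate why OpenAI-compatible endpoints make this easy: the same request shape can target either OpenAI or a local Ollama server, with only the base URL and model name changing. The sketch below is illustrative Python (not the plugin's actual C# code); `build_chat_request` is a hypothetical helper, and the `/chat/completions` path and body shape follow the OpenAI API, which Ollama partially implements at `http://localhost:11434/v1`.

```python
import json

# Hypothetical helper: build an OpenAI-style chat completion request.
# Ollama exposes a partially OpenAI-compatible API, so the identical
# body works against both providers; only base URL and model differ.
def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

openai_url, _ = build_chat_request("https://api.openai.com/v1", "gpt-3.5-turbo", "hi")
ollama_url, _ = build_chat_request("http://localhost:11434/v1", "llama3", "hi")
print(openai_url)  # https://api.openai.com/v1/chat/completions
print(ollama_url)  # http://localhost:11434/v1/chat/completions
```

In other words, making the endpoint configurable is mostly a matter of not hardcoding the base URL and model string.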
In particular, supporting an Azure OpenAI endpoint would be a great first implementation. It would be even better if the Azure implementation supported Managed Identities, so we don't end up with the unmanageable mess of API key distribution and rotation.
Supporting Groq would be nice too.
IMPORTANT Regarding the planned custom AI model option: we should make sure that companies can (still) force an opt-out using Group Policy. It would also be great if companies could enforce a list of supported endpoints by Group Policy.
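The enforcement side of such a policy could be as simple as validating the configured base URL against an admin-supplied allowlist before any request is made. A sketch under that assumption (`endpoint_allowed` and the "empty list means unrestricted" convention are illustrative, not an existing PowerToys policy):

```python
from urllib.parse import urlparse

# Hypothetical policy check: an admin-distributed allowlist of endpoint
# hosts (e.g. pushed via Group Policy) gates which base URLs the plugin
# will accept. An empty allowlist here means "no restriction".
def endpoint_allowed(base_url: str, allowed_hosts: list[str]) -> bool:
    if not allowed_hosts:
        return True
    host = urlparse(base_url).hostname or ""
    return host.lower() in {h.lower() for h in allowed_hosts}

print(endpoint_allowed("https://internal-ai.contoso.com/v1", ["internal-ai.contoso.com"]))  # True
print(endpoint_allowed("https://api.openai.com/v1", ["internal-ai.contoso.com"]))  # False
```

A separate boolean policy value would cover the full opt-out case.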
bump...
Description of the new feature / enhancement
It should be possible to configure the model used (currently fixed as `gpt-3.5-turbo`) and the endpoint (currently fixed as OpenAI's) to arbitrary values.

Scenario when this would be used?
Sending requests to an alternative AI endpoint (e.g. a local model, internally hosted company models, or alternative AI providers), or ensuring higher-quality conversions (e.g. by pointing requests at gpt-4o).
Supporting information
Microsoft's documentation appears to suggest that the underlying library used for AI completions supports other providers; it just needs to be provided with an endpoint.
The currently used model is a hardcoded string in this repository
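Replacing that hardcoded string could look like reading the model and endpoint from user settings and falling back to today's defaults. A minimal sketch, assuming a JSON settings blob with illustrative key names (the real plugin settings schema may differ):

```python
import json

# Hypothetical settings shape: instead of a hardcoded model string, the
# plugin reads model and endpoint from user configuration, falling back
# to the current defaults when a key is absent.
DEFAULTS = {"endpoint": "https://api.openai.com/v1", "model": "gpt-3.5-turbo"}

def load_ai_settings(raw_json: str) -> dict[str, str]:
    # Dict merge: user-provided keys override the defaults.
    return DEFAULTS | json.loads(raw_json)

print(load_ai_settings('{"model": "gpt-4o"}'))
# {'endpoint': 'https://api.openai.com/v1', 'model': 'gpt-4o'}
```

With defaults in place, existing users see no behaviour change unless they opt in to a different model or endpoint.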