Open az-pz opened 1 year ago
Unclear what you are asking for here.
I have the same question. Azure OpenAI is the same as OpenAI but with different endpoints. Since the application only lets you set an API key for GPT models, not the endpoint, the Azure OpenAI service can't be used. Some libraries already implement Azure OpenAI support, e.g. https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/switching-endpoints. The REST API documentation is here: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
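For illustration, here is a minimal sketch of the two endpoint shapes described in the linked docs. The Azure shape follows the REST reference above; `example-resource`, `my-deployment`, and the api-version value used below are placeholders, not real resources:

```cpp
#include <string>

// Public OpenAI endpoint, authenticated with "Authorization: Bearer <key>".
std::string openaiChatUrl() {
    return "https://api.openai.com/v1/chat/completions";
}

// Azure OpenAI endpoint, authenticated with an "api-key: <key>" header.
// The resource name, deployment name, and api-version are all
// user-specific, which is why a hard-coded URL cannot reach Azure.
std::string azureChatUrl(const std::string &resource,
                         const std::string &deployment,
                         const std::string &apiVersion) {
    return "https://" + resource + ".openai.azure.com/openai/deployments/"
         + deployment + "/chat/completions?api-version=" + apiVersion;
}
```

The request and response bodies are otherwise the same, so only the URL and auth header differ between the two services.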
Openai API url is defined here: https://github.com/nomic-ai/gpt4all/blob/4d855afe973a08ad81966d22adaadc5916b9126d/gpt4all-chat/chatgpt.cpp#L157
If there were an option to configure the OpenAI API endpoint, e.g. `QUrl openaiUrl("https://example-endpoint.openai.azure.com/v1/chat/completions");`, it would allow users to interact with their Azure OpenAI instance, since both APIs are the same.
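A hypothetical sketch of what such an option could look like, assuming the hard-coded URL is replaced by a user-configurable base (the function name and the defaulting behaviour here are assumptions, not the project's actual API):

```cpp
#include <string>

// Sketch: derive the chat-completions URL from a configurable base
// instead of a hard-coded QUrl. An empty setting falls back to the
// public OpenAI endpoint, preserving the current behaviour.
std::string chatCompletionsUrl(const std::string &configuredBase) {
    const std::string base =
        configuredBase.empty() ? "https://api.openai.com/v1" : configuredBase;
    return base + "/chat/completions";
}
```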
any update on this?
Is this something we can expect to see?
> Openai API url is defined here:
>
> If there was an option to configure the openai api endpoint to `QUrl openaiUrl("https://example-endpoint.openai.azure.com/v1/chat/completions");`, it'd allow users to interact with their azure openai instance since both API's are the same.
That line has since been changed to:
It might be possible to customize the URL, but they would need to allow modifying the "baseUrl" in the .rmodel file.
`my-custom-chat-api.rmodel`:

```json
{
  "apiKey": "the_api_key",
  "baseUrl": "https://web-tech-tw.eu.org/openai/v1",
  "modelName": "gpt-3.5-turbo"
}
```
But it still seems to be a constant for now.
If this works, the Ollama integration would work as well: https://github.com/nomic-ai/gpt4all/issues/2544
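Reading `baseUrl` out of such a file could look like the sketch below. This is a naive string scan for a flat key, not a real JSON parser, and the helper name is hypothetical; an actual implementation would use a proper JSON library:

```cpp
#include <string>

// Naive sketch: extract a top-level string value (e.g. "baseUrl")
// from a flat .rmodel JSON document. Assumes the value contains no
// escaped quotes; sufficient only to illustrate the idea.
std::string jsonStringValue(const std::string &json, const std::string &key) {
    const std::string needle = "\"" + key + "\"";
    std::size_t pos = json.find(needle);
    if (pos == std::string::npos) return "";
    // find the opening quote of the value, skipping past the ':'
    pos = json.find('"', pos + needle.size() + 1);
    if (pos == std::string::npos) return "";
    const std::size_t end = json.find('"', pos + 1);
    if (end == std::string::npos) return "";
    return json.substr(pos + 1, end - pos - 1);
}
```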
Feature request
It should be possible to set a custom deployment endpoint for any OpenAI-compatible instance (in this case Azure). Since the API is the same, it should not be too difficult to implement.
Motivation
- To use your own custom OpenAI instance.
- To reduce usage cost.
- To control your data.
- To control the system prompt.
Your contribution
I could help in testing.