nomic-ai / gpt4all

GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
https://nomic.ai/gpt4all
MIT License

[Feature] Connect to OpenAI compatible server other than ChatGPT #1243

Open az-pz opened 1 year ago

az-pz commented 1 year ago

Feature request

It should be possible to add a custom deployment endpoint for any OpenAI-compatible instance (in this case Azure). Since the API is the same, it should not be too difficult to implement.

Motivation

- To use your custom OpenAI instance.
- To reduce usage cost.
- To control your data.
- To control the system prompt.

Your contribution

I could help in testing.

AndriyMulyar commented 1 year ago

Unclear what you are asking for here.

borisovalex commented 1 year ago

I have the same question. Azure OpenAI is the same as OpenAI but with different endpoints. Since the application only lets you set an API key for GPT models, not the endpoint, the Azure OpenAI service can't be used. Some libraries already implement Azure OpenAI support, e.g. https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/switching-endpoints. The REST API documentation is here: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference

az-pz commented 1 year ago

The OpenAI API URL is defined here: https://github.com/nomic-ai/gpt4all/blob/4d855afe973a08ad81966d22adaadc5916b9126d/gpt4all-chat/chatgpt.cpp#L157

If there were an option to configure the OpenAI API endpoint, e.g. `QUrl openaiUrl("https://example-endpoint.openai.azure.com/v1/chat/completions");`, users could interact with their Azure OpenAI instance, since both APIs are the same.

nicolgit commented 9 months ago

Any update on this?

BenTheCloudGuy commented 8 months ago

Is this something we can expect to see?

supersonictw commented 4 months ago

> The OpenAI API URL is defined here:
>
> https://github.com/nomic-ai/gpt4all/blob/4d855afe973a08ad81966d22adaadc5916b9126d/gpt4all-chat/chatgpt.cpp#L157
>
> If there were an option to configure the OpenAI API endpoint, e.g. `QUrl openaiUrl("https://example-endpoint.openai.azure.com/v1/chat/completions");`, users could interact with their Azure OpenAI instance, since both APIs are the same.

That line has since moved to:

https://github.com/nomic-ai/gpt4all/blob/c73f0e5c8c25ede56e3eeb28ff9dd37f09212994/gpt4all-chat/chatapi.cpp#L199

It might be possible to customize the URL, but that requires allowing the `"baseUrl"` field of the .rmodel file to be modified.

`my-custom-chat-api.rmodel`:

```json
{
  "apiKey": "the_api_key",
  "baseUrl": "https://web-tech-tw.eu.org/openai/v1",
  "modelName": "gpt-3.5-turbo"
}
```

But it still seems to be a constant for now:

https://github.com/nomic-ai/gpt4all/blob/c73f0e5c8c25ede56e3eeb28ff9dd37f09212994/gpt4all-chat/modellist.cpp#L1529

supersonictw commented 4 months ago

If this works, the Ollama integration would work as well. https://github.com/nomic-ai/gpt4all/issues/2544