eosphoros-ai / DB-GPT

AI Native Data App Development framework with AWEL(Agentic Workflow Expression Language) and Agents
http://docs.dbgpt.cn
MIT License

[Feature]: Azure OpenAI Support #439

Open Mshz2 opened 1 year ago

Mshz2 commented 1 year ago

Is your feature request related to a problem? Please describe. The server URL for the OpenAI API does not work with an Azure OpenAI endpoint URL.

Describe the solution you'd like
OPENAI_TYPE=Azure
PROXY_API_KEY={Azure key}
PROXY_SERVER_URL=https://{resourcename}.openai.azure.com

Describe alternatives you've considered I tried:
PROXY_API_KEY=****
PROXY_SERVER_URL=https://****.openai.azure.com/openai/deployments/chatgpt35/chat/completions?api-version=2023-03-15-preview
But got the error below in the frontend: (screenshot)

csunny commented 1 year ago

Azure is not supported currently. Can you open a PR for that?

Mshz2 commented 1 year ago

> Azure is not supported currently. Can you open a PR for that?

I would love to, but unfortunately I'm not sure I have sufficient skill. Anyway, I believe many people would like to deploy your nice product in the cloud; providers like Azure are popular because of their OpenAI GPT model integration.

cason0126 commented 10 months ago

I can handle it @csunny

cason0126 commented 10 months ago

LLM_MODEL=proxyllm
PROXY_API_KEY={your key}
PROXY_API_BASE=https://{your domain}.openai.azure.com/
PROXY_API_TYPE=azure
PROXY_SERVER_URL=xxxx
PROXY_API_VERSION=2023-05-15
PROXYLLM_BACKEND=gpt-35-turbo

You can use a config like this. It works @Mshz2
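For reference, Azure OpenAI addresses models through deployment-scoped URLs, which is why PROXY_API_BASE and PROXYLLM_BACKEND (the deployment name) are separate settings. A minimal sketch of how such a URL is formed (the helper is illustrative, not DB-GPT code):

```python
def azure_chat_url(api_base: str, deployment: str, api_version: str) -> str:
    """Build the deployment-scoped Azure OpenAI chat-completions URL."""
    # Azure routes requests to a *deployment* of a model, not the model name itself.
    return (
        f"{api_base.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

# Values mirror PROXY_API_BASE, PROXYLLM_BACKEND and PROXY_API_VERSION above.
print(azure_chat_url("https://myresource.openai.azure.com/", "gpt-35-turbo", "2023-05-15"))
# https://myresource.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-05-15
```

The key from PROXY_API_KEY is sent in the `api-key` request header, not in the URL.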

yihong0618 commented 10 months ago

@cason0126 nice

riverind commented 9 months ago

> You can use a config like this. It works @Mshz2

What is PROXY_SERVER_URL used for? Thanks @cason0126

cason0126 commented 9 months ago

> What is PROXY_SERVER_URL used for? Thanks @cason0126

I guess it is a redundant configuration item. You can leave it blank, or just fill in xxx as the value. As for why, and for iterative improvement, I am also still learning.

riverind commented 9 months ago

> I guess it is a redundant configuration item. You can leave it blank, or just fill in xxx as the value.

Thanks. Did it run successfully for you? If so, can I see your Azure parameters?

If I set proxy_server_url to an empty value, it fails with the following error: dbgpt_server.py: error: the following arguments are required: --proxy_server_url
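That message is the standard argparse error for a missing required argument, which would explain why a dummy value like xxxx satisfies the check while an empty value does not. A minimal reproduction, independent of DB-GPT (the parser here is a mock; the real DB-GPT parser may differ):

```python
import argparse

# Mimic a server script that declares the option as required
# (argument name taken from the error message above).
parser = argparse.ArgumentParser(prog="dbgpt_server.py")
parser.add_argument("--proxy_server_url", required=True)

try:
    parser.parse_args([])  # nothing supplied, as with a blank .env entry
except SystemExit:
    # argparse prints "error: the following arguments are required:
    # --proxy_server_url" to stderr and exits.
    print("missing required argument")

# Any placeholder value passes the check:
args = parser.parse_args(["--proxy_server_url", "xxxx"])
print(args.proxy_server_url)
# xxxx
```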

cason0126 commented 9 months ago

> If I set proxy_server_url to an empty value, it fails with the following error: dbgpt_server.py: error: the following arguments are required: --proxy_server_url

Yes, it runs successfully for me. My parameters are:
LLM_MODEL=proxyllm
PROXY_API_KEY={your key}
PROXY_API_BASE=https://{your_domain}.openai.azure.com/
PROXY_API_TYPE=azure
PROXY_SERVER_URL=xxxx
PROXY_API_VERSION=2023-05-15
PROXYLLM_BACKEND=gpt-35-turbo

Mshz2 commented 9 months ago

@csunny @cason0126 thanks for the feedback. It is working now ;)

By any chance, how can I similarly configure the Azure embedding model? I tried the following, but it did not work:

EMBEDDING_MODEL=proxy_openai
proxy_openai_proxy_server_url=https://myresourcename.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings
proxy_openai_proxy_api_key=password
proxy_openai_proxy_backend=text-embedding-ada-002
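One thing worth checking: Azure's embeddings endpoint follows the same deployment-scoped pattern as chat completions, and every request must carry an api-version query parameter, which the URL above does not include. A sketch of the expected shape (the helper name is mine):

```python
def azure_embeddings_url(api_base: str, deployment: str, api_version: str) -> str:
    """Deployment-scoped Azure OpenAI embeddings URL; api-version is mandatory."""
    return (
        f"{api_base.rstrip('/')}/openai/deployments/{deployment}"
        f"/embeddings?api-version={api_version}"
    )

print(azure_embeddings_url(
    "https://myresourcename.openai.azure.com",
    "text-embedding-ada-002",
    "2023-05-15",
))
# https://myresourcename.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2023-05-15
```

I haven't verified whether DB-GPT's proxy_openai settings append api-version themselves; if they don't, adding it to the URL may be the missing piece.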
AbhijitManepatil commented 7 months ago

> You can use a config like this. It works @Mshz2

The only change to the above parameters: I used PROXYLLM_BACKEND={my deployment name}, and it works for me.

Cualacin0 commented 4 months ago

> By any chance, how can I similarly configure the Azure embedding model? I tried the following, but it did not work.

@Mshz2 Hi, have you figured out how to use the Azure embedding model?

Mshz2 commented 4 months ago

> @Mshz2 Hi, have you figured out how to use the Azure embedding model?

I'm not using the tool anymore. :)