Open jexp opened 1 month ago
Adding this option to the knowledge graph builder would definitely help users solve data confidentiality issues.
I replaced the OpenAI model with AzureOpenAI locally and it worked. I think an integration that specifically supports Azure OpenAI (at least for those running the app locally) would be great :)
Yes, it's on the plan: mostly additional config for the base URL, API version, and deployed model.
Hi! Could you share more about how you did this, please? Thank you!
Sure! Disclaimer though: I didn't extend the existing code to support AzureOpenAI, I only swapped out the model used for GPT.
It's quite simple: in `/backend/src/shared/common_fn.py`, around line 135, I replaced the OpenAI model with:

```python
elif "gpt" in model_version:
    llm = AzureChatOpenAI(
        openai_api_version=os.environ["AZURE_OPENAI_API_VERSION"],
        azure_deployment=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT_NAME"],
    )
```
You would also need the following `.env` variables; if you already use Azure OpenAI through LangChain, you can reuse that config:
```
AZURE_OPENAI_ENDPOINT=AZURE_OPENAI_DEPLOYMENT_BASE_URL
AZURE_OPENAI_API_KEY=AZURE_OPENAI_API_KEY
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME=GPT_4O        # the name of the deployment as it appears in Azure
MODEL_NAME=gpt-4o                               # the name of the deployed model
AZURE_OPENAI_API_VERSION=2024-02-15-preview     # tricky to get right; find it in your deployment details
AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME=ada-002 # optional, can be used for embeddings
OPENAI_API_TYPE=azure                           # I think this one is the same for everyone
```
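Since a missing or misspelled variable only fails at request time, a quick sanity check before constructing `AzureChatOpenAI` can save some debugging. This helper is hypothetical (not part of the repo); it just reports which of the variables above are unset:

```python
import os

# Variables the patched branch reads, plus the ones the LangChain Azure
# client picks up from the environment on its own.
REQUIRED = [
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_API_VERSION",
    "AZURE_OPENAI_CHAT_DEPLOYMENT_NAME",
]

def missing_azure_config(env=os.environ):
    """Return the names of required Azure OpenAI variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]
```

Calling it with an empty environment returns all four names; with everything set it returns an empty list, and you can proceed to build the LLM.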
Thank you! I made it work earlier doing the same.
They should be compatible with the OpenAI models, so nothing to change in the code beyond the utility function where we create the LLM. The LangChain Azure OpenAI integrations ( https://python.langchain.com/v0.1/docs/integrations/chat/azure_chat_openai/ ) support these environment variables.
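On the embeddings side, the same env-driven switch could be sketched as below. The helper name and the selection rule are assumptions; `AzureOpenAIEmbeddings` and `OpenAIEmbeddings` are the actual class names in `langchain_openai`, but here they are only returned as strings so the sketch stays self-contained:

```python
import os

def embeddings_backend(env=os.environ):
    """Pick which LangChain embeddings class to instantiate (sketch).

    Hypothetical rule: when OPENAI_API_TYPE=azure and an Azure embeddings
    deployment is configured, use AzureOpenAIEmbeddings; otherwise fall
    back to plain OpenAIEmbeddings. Returns the name of the class one
    would import from langchain_openai.
    """
    if env.get("OPENAI_API_TYPE") == "azure" and env.get("AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME"):
        return "AzureOpenAIEmbeddings"
    return "OpenAIEmbeddings"
```

With the `.env` values from the earlier comment, this resolves to the Azure class; without them, it falls back to the stock OpenAI one, which matches the "nothing to change in the code" observation.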