BryceAmackerLE opened this issue 10 months ago
There is an issue with the `openai_agent.astream_chat` method when the LLM is an `AzureOpenAI` instance instead of the parent `OpenAI` class. It can be reproduced as described here: https://github.com/run-llama/llama_index/issues/9219
Confirmed Bryce's merge fixes the issue.
Thank you, Bryce!
When replacing `OpenAI()` with `AzureOpenAI()` in `backend/app/chat/engine.py`, chat responses are empty. The frontend displays "Sorry, I either wasn't able to understand your question or I don't have an answer for it.". I have confirmed that the AzureOpenAI parameters are correct, as I have gotten valid embeddings and chat completion responses back while debugging with the same objects passed to the `chat_engine`.

This is how I am constructing the chat_llm:
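(The original code block was not included above. A minimal sketch of the kind of construction involved, assuming the llama-index `AzureOpenAI` wrapper and using hypothetical environment-variable and deployment names, might look like this:)

```python
import os

# Collect the Azure-specific settings that an AzureOpenAI wrapper typically
# expects. Parameter and variable names here are illustrative assumptions,
# not the actual code from backend/app/chat/engine.py.
def azure_llm_kwargs() -> dict:
    return {
        "model": os.environ.get("AZURE_OPENAI_MODEL", "gpt-35-turbo"),
        # Azure routes requests to a named deployment rather than a raw model.
        "engine": os.environ.get("AZURE_OPENAI_DEPLOYMENT", "my-gpt-35-deployment"),
        "api_key": os.environ.get("AZURE_OPENAI_API_KEY", ""),
        "azure_endpoint": os.environ.get("AZURE_OPENAI_ENDPOINT", ""),
        "api_version": os.environ.get("OPENAI_API_VERSION", "2023-07-01-preview"),
    }

kwargs = azure_llm_kwargs()
print(sorted(kwargs))
```

With llama-index installed, the LLM would then be created along the lines of `AzureOpenAI(**azure_llm_kwargs())` and passed to the chat engine in place of `OpenAI()`.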
I have added those Azure-specific values to `config.py` and have confirmed the parameters are correct by getting valid completions and embeddings via the debugger. They are also set via `set -a; source .env` before running. I've also explicitly set `AZURE_OPENAI_ENDPOINT`.

Has anyone else had success getting this project to work correctly with Azure OpenAI?
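For reference, the environment setup described above might look something like this (only `AZURE_OPENAI_ENDPOINT` is named in the post itself; the other variable names and values are illustrative assumptions):

```shell
# Hypothetical .env contents:
AZURE_OPENAI_ENDPOINT=https://my-resource.openai.azure.com/
AZURE_OPENAI_API_KEY=<your-key>
OPENAI_API_VERSION=2023-07-01-preview

# Export every variable in .env into the current shell before starting the app:
set -a; source .env; set +a
```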