run-llama / sec-insights

A real world full-stack application using LlamaIndex
https://www.secinsights.ai/

Empty responses when using AzureOpenAI #78

Open BryceAmackerLE opened 10 months ago

BryceAmackerLE commented 10 months ago

When replacing OpenAI() with AzureOpenAI() in backend/app/chat/engine.py, chat responses are empty. The frontend will display "Sorry, I either wasn't able to understand your question or I don't have an answer for it.".

I have confirmed that the AzureOpenAI parameters are correct: while debugging, I got valid embeddings and chat completion responses back from the same objects that are passed to the chat_engine.

This is how I am constructing the chat_llm:

    chat_llm = AzureOpenAI(
        temperature=0,
        streaming=True,
        model=settings.AZURE_OPENAI_CHAT_MODEL_NAME,
        deployment_name=settings.AZURE_OPENAI_CHAT_DEPLOYMENT_NAME,
        api_key=settings.AZURE_OPENAI_API_KEY,
        azure_endpoint=settings.AZURE_OPENAI_API_BASE,
        api_version=settings.AZURE_OPENAI_API_VERSION,
        additional_kwargs={"api_key": settings.AZURE_OPENAI_API_KEY},
    )

I have added those Azure-specific values to config.py, and they are exported via set -a; source .env before running. I've also explicitly set AZURE_OPENAI_ENDPOINT. As noted above, the debugger confirms that valid completions and embeddings come back with these parameters.
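
For reference, a direct call succeeds with the same values. This is a minimal sketch of that sanity check (the import path is an assumption and varies by llama_index version; the settings fields are the same ones used above):

    # Standalone check: the same credentials return a normal completion.
    # Import path assumed for a pre-0.10 llama_index; adjust for your version.
    from llama_index.llms import AzureOpenAI

    chat_llm = AzureOpenAI(
        model=settings.AZURE_OPENAI_CHAT_MODEL_NAME,
        deployment_name=settings.AZURE_OPENAI_CHAT_DEPLOYMENT_NAME,
        api_key=settings.AZURE_OPENAI_API_KEY,
        azure_endpoint=settings.AZURE_OPENAI_API_BASE,
        api_version=settings.AZURE_OPENAI_API_VERSION,
    )
    print(chat_llm.complete("Say hello."))  # prints a completion, yet the chat engine returns nothing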

Has anyone else had success getting this project to work correctly with Azure OpenAI?

fchenGT commented 9 months ago

There is an issue with the openai_agent.astream_chat method when the LLM is an AzureOpenAI instance rather than the parent OpenAI class. You can reproduce it as described in https://github.com/run-llama/llama_index/issues/9219, or with the sketch below.
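
A minimal repro sketch along the lines of that issue (import paths and the OpenAIAgent API are assumed for the llama_index version of the time; the model, deployment, and endpoint values are placeholders):

    import asyncio

    from llama_index.agent import OpenAIAgent
    from llama_index.llms import AzureOpenAI

    async def main():
        # Placeholder Azure credentials; fill in real values to reproduce.
        llm = AzureOpenAI(
            model="gpt-35-turbo",
            deployment_name="my-gpt35-deployment",
            api_key="...",
            azure_endpoint="https://my-resource.openai.azure.com/",
            api_version="2023-07-01-preview",
        )
        agent = OpenAIAgent.from_tools([], llm=llm)
        response = await agent.astream_chat("What is 2 + 2?")
        # With AzureOpenAI the stream yields no tokens, so nothing is printed;
        # swapping in the parent OpenAI class streams normally.
        async for token in response.async_response_gen():
            print(token, end="")

    asyncio.run(main())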

fchenGT commented 7 months ago

Confirmed that Bryce's merged fix resolves the issue.

Thank you Bryce!