run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Question]: The differences between the structured outputs of the native AzureOpenAI client and the AzureOpenAI provided by LlamaIndex #15998

Open jakeywu opened 2 weeks ago

jakeywu commented 2 weeks ago

Question Validation

Question

I am an enterprise employee planning to evaluate the accuracy of LlamaIndex's structured output.

We are currently using the following approach:

from openai import AzureOpenAI

openai_client = AzureOpenAI(
    api_key=...,
    api_version=...,
    azure_endpoint=...,
)
completion = openai_client.beta.chat.completions.parse(
    model=model,
    messages=messages,
    response_format=PydanticModel,
)

This is supported by OpenAI directly. To achieve dual-path disaster recovery, we began evaluating LlamaIndex's support for the same feature. Our current approach is:

import json

from llama_index.core.llms import ChatMessage
from llama_index.llms.azure_openai import AzureOpenAI

def concierge_agent(user_query):
    result = azure_openai_client.as_structured_llm(
        output_cls=ConciergeModel,
    ).chat(
        messages=[
            ChatMessage(
                role="system",
                content=PydanticModel,
            ),
            ChatMessage(
                role="user",
                content=user_query,
            ),
        ],
    )
    result = json.loads(result.message.content)
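The final `json.loads` step in the snippet above can be exercised offline. A minimal stdlib-only sketch, assuming the structured LLM returned schema-conforming JSON (the field names here are illustrative, not taken from the actual ConciergeModel):

```python
import json

def parse_structured_content(content: str, required_fields: set) -> dict:
    """Parse the JSON string found in `result.message.content` and verify
    that every field the output schema requires is actually present."""
    data = json.loads(content)
    missing = required_fields - data.keys()
    if missing:
        raise ValueError(f"structured output missing fields: {missing}")
    return data

# Simulated structured-output payload (illustrative fields):
payload = '{"intent": "book_table", "reply": "Table booked for two."}'
parsed = parse_structured_content(payload, {"intent", "reply"})
```

A check like this makes it easy to compare, field by field, what the two code paths return for the same query.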

What is the difference in effect between the two? Has LlamaIndex kept in sync with the beta version of OpenAI's API? We look forward to your reply and thank you in advance.
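For reference: with OpenAI's structured-outputs API, passing `response_format=PydanticModel` makes the SDK derive a strict JSON-schema payload from the Pydantic model. A hand-built sketch of that payload shape, using illustrative field names rather than the actual models in this thread:

```python
# Approximate shape of the `response_format` payload the OpenAI SDK
# derives from a Pydantic model for structured outputs. The schema
# fields below are illustrative assumptions, not the real ConciergeModel.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "ConciergeModel",
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "intent": {"type": "string"},
                "reply": {"type": "string"},
            },
            "required": ["intent", "reply"],
            "additionalProperties": False,
        },
    },
}
```

Checking whether both code paths end up constraining the model with an equivalent schema is one concrete way to probe the synchronization asked about here.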

dosubot[bot] commented 2 weeks ago

@jakeywu Hello there! I'm here to help you with any bugs, questions, or contributions. Let's tackle this together!

I wasn't able to find specific differences in the output structure between AzureOpenAI and LlamaIndex's AzureOpenAI integration within the repository. Additionally, there is no documentation or release notes indicating that LlamaIndex has maintained synchronization with the BETA version of OpenAI [1][2].

However, LlamaIndex's AzureOpenAI integration supports the same models and API versions as the native AzureOpenAI client, ensuring comprehensive compatibility [1][3].

To continue talking to Dosu, mention @dosu.

jakeywu commented 4 days ago

We would like to know whether the structured output of LlamaIndex is kept synchronized with OpenAI's. If anyone can tell us, we thank you in advance.

dosubot[bot] commented 4 days ago

The LlamaIndex repository does mention synchronization with OpenAI's BETA versions in its release notes. For example, the changelog includes support for new GPT Turbo Models and new OpenAI v3 embedding models, as well as updates to the OpenAIAssistantAgent to use attachments [1][2][3].

However, I wasn't able to find specific information on whether the OutputStructure of LlamaIndex is synchronized with OpenAI. The repository does provide some comparison or compatibility notes between its AzureOpenAI integration and the native OpenAI client, highlighting differences in setup and configuration [4].

This is a tough nut to crack, could really use your help on this, @logan-markewich.

To continue talking to Dosu, mention @dosu.