betalgo / openai

OpenAI .NET SDK - Azure OpenAI, ChatGPT, Whisper, and DALL-E
https://betalgo.github.io/openai/
MIT License

Fix ChatCompletionCreateResponse handling multiple parallel tool call #494

Closed David-Buyer closed 3 months ago

David-Buyer commented 4 months ago

As reported in #493, this could fix the issue.

kayhantolga commented 3 months ago

Hi @David-Buyer ,

Do you think the code will handle it if the stream returns chunks with shuffled indexes, rather than in order? I mean, like this:

1. chunk index 0 - data
2. chunk index 0 - data
3. chunk index 1 - data
4. chunk index 0 - data
5. chunk index 1 - data
6. chunk index 1 - data

David-Buyer commented 3 months ago

> Hi @David-Buyer ,
>
> Do you think the code will handle it if the stream returns chunks with shuffled indexes, rather than in order? I mean, like this:
>
> 1. chunk index 0 - data
> 2. chunk index 0 - data
> 3. chunk index 1 - data
> 4. chunk index 0 - data
> 5. chunk index 1 - data
> 6. chunk index 1 - data

Hello @kayhantolga, I don't think it handles that case. It seems to expect to move forward in sequential order when grouping arguments, assuming they belong to the last function tool previously added to the list. Do you think a case where the data arrives shuffled is to be expected?
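The sequential assumption described above can be illustrated with a small sketch. This is a language-neutral Python illustration with made-up chunk shapes, not the SDK's actual C# types: each argument fragment is appended to the most recently started tool call, which works for in-order streams but misattributes fragments once indexes interleave.

```python
# Hypothetical sketch of the "append to the last tool call" strategy.
# Chunk dicts are illustrative stand-ins for streamed delta chunks.

def group_sequentially(chunks):
    """Group argument fragments by appending to the most recent call."""
    calls = []
    for chunk in chunks:
        if chunk.get("name"):  # a name signals the start of a new tool call
            calls.append({"name": chunk["name"], "arguments": ""})
        if calls:
            calls[-1]["arguments"] += chunk.get("arguments", "")
    return calls

# In-order chunks reassemble correctly:
ordered = [
    {"index": 0, "name": "fn_a", "arguments": '{"x"'},
    {"index": 0, "arguments": ': 1}'},
    {"index": 1, "name": "fn_b", "arguments": '{"y"'},
    {"index": 1, "arguments": ': 2}'},
]

# Interleaved (shuffled) chunks attach fn_a's fragment to fn_b:
shuffled = [ordered[0], ordered[2], ordered[1], ordered[3]]
```

With `ordered`, `fn_a` ends up with `{"x": 1}`; with `shuffled`, the `': 1}'` fragment lands on `fn_b` because it was the last call started, leaving `fn_a` with only `{"x"`.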

kayhantolga commented 3 months ago

I can't say it's impossible due to the lack of OpenAI documentation, and it's also challenging to experiment. However, I think if they decide to improve the model to allow both tools to return asynchronously, we may see shuffled data. If it's not too much of a headache for you, we could take a defensive approach and handle such cases. If you don't want to improve it, I would merge the branch as it is, and we can try to fix this later.

David-Buyer commented 3 months ago

> I can't say it's impossible due to the lack of OpenAI documentation, and it's also challenging to experiment. However, I think if they decide to improve the model to allow both tools to return asynchronously, we may see shuffled data. If it's not too much of a headache for you, we could take a defensive approach and handle such cases. If you don't want to improve it, I would merge the branch as it is, and we can try to fix this later.

It's a reasonable way to proceed, getting ahead of this optimization. For now the fix could be useful as it is, since it resolves the issue some users reported, so you can merge it if you want to provide a partial improvement. As for the challenge you mentioned, I already have an idea and will try it ASAP, keeping you updated too :)

David-Buyer commented 3 months ago

I added an improvement which should handle shuffled data in the asynchronous response.
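A defensive handling like the one discussed could key the accumulation on each chunk's tool-call index instead of on arrival order. This is a minimal Python sketch under that assumption, with hypothetical chunk shapes rather than the SDK's actual C# types:

```python
# Hypothetical sketch: accumulate streamed tool-call fragments by the
# index carried in each chunk, so arrival order no longer matters.

def accumulate_by_index(chunks):
    """Group name/argument fragments per tool-call index."""
    calls = {}  # index -> {"name": ..., "arguments": ...}
    for chunk in chunks:
        entry = calls.setdefault(chunk["index"], {"name": None, "arguments": ""})
        if chunk.get("name"):  # the first chunk for a call carries the name
            entry["name"] = chunk["name"]
        entry["arguments"] += chunk.get("arguments", "")
    # Return calls sorted by index for deterministic downstream handling
    return [calls[i] for i in sorted(calls)]

# The shuffled arrival order from the earlier example:
shuffled = [
    {"index": 0, "name": "fn_a", "arguments": '{"x"'},
    {"index": 1, "name": "fn_b", "arguments": '{"y"'},
    {"index": 0, "arguments": ': 1}'},
    {"index": 1, "arguments": ': 2}'},
]
```

Because fragments are routed by index, both calls reassemble correctly even though their chunks interleave.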