onurmicoogullari opened 7 months ago
Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @jpalvarezl @trrwilson.
Library name and version
Azure.AI.OpenAI 1.0.0-beta.13
Describe the bug
My team and I are building a custom application on top of the Azure OpenAI Service with a `gpt-4-32k` model deployment. We initially used the `Azure.AI.OpenAI` SDK, version `1.0.0-beta.5`, to communicate with Azure OpenAI. Then our stakeholder made a feature request to enable ChatGPT to answer questions based on our own data. We discovered the BYOD feature, where we could upload our data to a Storage Account, index it using Azure AI Search with the semantic ranker, and then query the data through ChatGPT in Azure OpenAI. The problem is, we couldn't get it to work with `Azure.AI.OpenAI` version `1.0.0-beta.5`.

So we decided to reverse engineer the whole thing: we created our own classes matching the requests and responses we could see being made over the network while using the ChatGPT Playground, then used those classes in HTTP requests to our Azure OpenAI service with the correct headers and payloads. It's not beautiful, but it works.

Now we need to implement the streaming API with the `Azure.AI.OpenAI` SDK, but when we tried the regular request/response API we found serious problems in the SDK's serialization logic. The following console application takes the example from the SDK's README file on nuget.org, serializes the resulting `ChatCompletionsOptions`, and writes the output to the console:
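A sketch of that console app (assuming the `Azure.AI.OpenAI` `1.0.0-beta.13` API surface shown in the README; serialization done with `System.Text.Json`, as in our application):

```csharp
// Repro sketch: build a ChatCompletionsOptions as in the README example,
// then serialize it with System.Text.Json and print the result.
using System;
using System.Text.Json;
using Azure.AI.OpenAI;

var chatCompletionsOptions = new ChatCompletionsOptions()
{
    DeploymentName = "gpt-4-32k",
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("Does Azure OpenAI support customer managed keys?"),
    },
};

// Serialize the options object the same way our application does.
string json = JsonSerializer.Serialize(
    chatCompletionsOptions,
    new JsonSerializerOptions { WriteIndented = true });

Console.WriteLine(json);
```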
It gives the following output in the console:
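Roughly like this (illustrative and trimmed; the salient part is that the entries under `Messages` come out without their roles or contents):

```json
{
  "Messages": [
    { "Role": {} },
    { "Role": {} }
  ],
  "DeploymentName": "gpt-4-32k"
}
```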
See how the `Messages` property is missing both the roles and the message contents?

I also tried creating a very simple API that takes a `ChatCompletionsOptions` object as a parameter and simply writes it back in the caller's response. I send this request:
And I get this response back:
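Along these lines (illustrative; the salient part is the empty `messages` array):

```json
{
  "messages": []
}
```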
The `messages` property is now empty. The same request works just fine when sent directly to the Azure OpenAI Service API using Postman, so I assume the ChatGPT Playground in Azure OpenAI doesn't use the `Azure.AI.OpenAI` .NET SDK.

With all that said, we need to move forward with our implementation of the streaming API, and we're not keen on reverse engineering that one as well. Is there any chance you can fix this issue soon? Or is there a specific version of the SDK that you know works as expected (including the Cognitive Search extension) that we can use in the meantime?
Expected behavior
The roles and messages in the conversation should be serialized properly so that they are included in the `ChatCompletionsOptions` request.
Actual behavior
Roles and messages are omitted in the serialization performed by the SDK.
Reproduction Steps
Reproduction in Console App
Run the following code with the .NET SDK installed on your machine:
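A sketch of the console repro (assuming the `1.0.0-beta.13` API surface from the README; `System.Text.Json` for serialization):

```csharp
// Build the README's ChatCompletionsOptions, serialize it, print the result.
using System;
using System.Text.Json;
using Azure.AI.OpenAI;

var chatCompletionsOptions = new ChatCompletionsOptions()
{
    DeploymentName = "gpt-4-32k",
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("Does Azure OpenAI support customer managed keys?"),
    },
};

string json = JsonSerializer.Serialize(
    chatCompletionsOptions,
    new JsonSerializerOptions { WriteIndented = true });

// The roles and message contents are missing from the printed JSON.
Console.WriteLine(json);
```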
Reproduction in Web API
Run the following code with the .NET SDK installed on your machine:
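A sketch of the echo API (a hypothetical minimal-API form of what we built; it binds the request body to `ChatCompletionsOptions` and writes it straight back):

```csharp
// Hypothetical minimal-API version of our echo endpoint. ASP.NET Core binds
// the request body to ChatCompletionsOptions via System.Text.Json, which is
// where the roles and messages get lost.
using Azure.AI.OpenAI;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Echo the deserialized options object back to the caller as JSON.
app.MapPost("/chat", (ChatCompletionsOptions options) => Results.Json(options));

app.Run();
```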
Use an HTTP client like Postman to send the following request:
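A body along these lines (illustrative):

```json
{
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Does Azure OpenAI support customer managed keys?" }
  ]
}
```

The response echoes the object back with `messages` empty.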
Environment
.NET SDK version: 8.0.101
OS: Ubuntu 22.04
IDE and version: Visual Studio Code 1.86.1