openai / openai-dotnet

The official .NET library for the OpenAI API
https://www.nuget.org/packages/OpenAI
MIT License

System.ClientModel.ClientResultException: 'HTTP 400 (invalid_request_error: ) #30

Closed cjkarande closed 3 months ago

cjkarande commented 3 months ago

I am using the following code in a .NET Standard 2.0 project within a Xamarin.Forms solution:

```csharp
string _model = "gpt-3.5-turbo";
_openAI_Client = new ChatClient(model: _model, _apiKey);

ChatCompletion chatCompletion = await _openAI_Client.CompleteChatAsync(
    new ChatMessage[] { new UserChatMessage("Deep Reinforcement Learning for Finance") });
```

However, I receive the following exception when `CompleteChatAsync()` executes:

```
System.ClientModel.ClientResultException: 'HTTP 400 (invalid_request_error: )
Parameter: messages.[0].content
Invalid value for 'content': expected a string, got null.'
```

cjkarande commented 3 months ago

@joseharriaga
@hed-openai @mjr-openai @ram-openai @kimo-openai @srw-openai

Could anyone from the OpenAI team please take a look at this issue? It's a showstopper.

joseharriaga commented 3 months ago

@trrwilson: Is this related to the issue that Chris and you were seeing with Mono?

trrwilson commented 3 months ago

@joseharriaga

> Is this related to the issue that Chris and you were seeing with Mono?

Almost certainly the same issue, yes. There's an oddity with how serialization covariance is handled on Mono with System.ClientModel's use of ModelReaderWriter on IJsonModel<T> when there's an abstract base type; in this case, the message is being written as IJsonModel<ChatMessage> (which has no "content" to speak of) instead of the intended derived instance type of UserChatMessage.

The problem is well understood, but we should consider expediting the workaround (of using virtual methods for these abstract serialization paths) in the interim.
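The failure mode and the proposed workaround can be sketched with simplified stand-in types (this is an illustration only; `ChatMessage` and `UserChatMessage` here are hypothetical stand-ins, not the library's actual implementation):

```csharp
using System.Text.Json;

// Simplified stand-in for the abstract base. The workaround pattern routes
// serialization through a virtual/abstract method so that the derived type's
// writer always runs, even when the caller only holds the abstract base type.
public abstract class ChatMessage
{
    protected internal abstract void WriteCore(Utf8JsonWriter writer);

    public void Write(Utf8JsonWriter writer) => WriteCore(writer);
}

public class UserChatMessage : ChatMessage
{
    public UserChatMessage(string content) => Content = content;
    public string Content { get; }

    protected internal override void WriteCore(Utf8JsonWriter writer)
    {
        writer.WriteStartObject();
        writer.WriteString("role", "user");
        writer.WriteString("content", Content); // the field Mono was dropping
        writer.WriteEndObject();
    }
}
```

Because dispatch happens through the `virtual` override rather than through an explicit interface implementation on the abstract base, the derived type's payload (including `content`) is written regardless of the runtime's interface-dispatch quirks.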

rudetrue commented 3 months ago

Experiencing this as well. Let me know if I can try/test anything to work towards a fix.

MoienTajik commented 3 months ago

Same issue here on Blazor WASM. As a temporary workaround, you can use:

```csharp
// Build the request body by hand, bypassing the broken ChatMessage serialization.
var json = BinaryData.FromObjectAsJson(new
{
    model = "gpt-3.5-turbo",
    messages = new[]
    {
        new
        {
            role = "user",
            content = "YOUR_PROMPT"
        }
    }
});

// Call the protocol method with the raw body; "W" selects the wire format
// in ModelReaderWriterOptions when deserializing the response.
var clientResult = await chatClient.CompleteChatAsync(BinaryContent.Create(json));
var chatCompletion = ModelReaderWriter.Read<ChatCompletion>(
    clientResult.GetRawResponse().Content, new("W"));
```
trrwilson commented 3 months ago

To update: thanks to @chschrae's investigation, we've filed an issue with the .NET runtime team to track the full fix for the underlying problem:

https://github.com/dotnet/runtime/issues/103365

We're evaluating a mitigation option to unblock mono/wasm/etc. in the interim.

cjkarande commented 3 months ago

@trrwilson, thanks a ton for the fix. It was indeed an issue with the generic interface across different runtimes. The issue is now resolved in the latest NuGet update (2.0.0-beta.5).

I appreciate the team's prompt intervention.