markjamesm / MistralSharp

Unofficial .NET SDK for the Mistral AI platform.
https://www.nuget.org/packages/MistralSharp
MIT License

Add support for chat streaming #6

Open markjamesm opened 9 months ago

markjamesm commented 9 months ago

Need to add support for streaming. Streaming can be specified via the Stream parameter in the ChatRequest class, which is defined as:

```csharp
// Default: false
// Whether to stream back partial progress. If set, tokens will be sent as data-only server-sent events
// as they become available, with the stream terminated by a data: [DONE] message. Otherwise, the server will
// hold the request open until the timeout or until completion, with the response containing the full
// result as JSON.
Stream = false,
```

The best way to implement this would be to add a check in the ChatAsync() method: if ChatRequest.Stream is set to false, default to the standard HttpClient PostAsync call; if true, set up a stream.
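A minimal sketch of what the streaming branch could look like, assuming a server-sent-events response like the one described above. The method name (ChatStreamAsync), the endpoint URL, and the client wiring are assumptions for illustration, not the actual MistralSharp API:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Net.Http.Json;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;

public partial class MistralClient
{
    private readonly HttpClient _httpClient = new();

    // Hypothetical streaming counterpart to ChatAsync(). The caller is
    // expected to have set Stream = true on the request object.
    public async IAsyncEnumerable<string> ChatStreamAsync(
        object chatRequest, // stands in for the ChatRequest type
        [EnumeratorCancellation] CancellationToken ct = default)
    {
        using var request = new HttpRequestMessage(HttpMethod.Post,
            "https://api.mistral.ai/v1/chat/completions")
        {
            Content = JsonContent.Create(chatRequest)
        };

        // ResponseHeadersRead lets us start consuming the body as it
        // arrives instead of buffering the full response in memory.
        using var response = await _httpClient.SendAsync(
            request, HttpCompletionOption.ResponseHeadersRead, ct);
        response.EnsureSuccessStatusCode();

        await using var stream = await response.Content.ReadAsStreamAsync(ct);
        using var reader = new StreamReader(stream);

        // Server-sent events arrive as "data: {json}" lines, with the
        // stream terminated by a "data: [DONE]" sentinel.
        while (!reader.EndOfStream)
        {
            var line = await reader.ReadLineAsync();
            if (string.IsNullOrWhiteSpace(line) || !line.StartsWith("data: "))
                continue;

            var payload = line["data: ".Length..];
            if (payload == "[DONE]")
                yield break;

            yield return payload; // caller deserializes each partial chunk
        }
    }
}
```

Exposing the streaming path as a separate IAsyncEnumerable-returning method (rather than branching inside ChatAsync()) keeps the return types clean: ChatAsync() can keep returning the full response object, while consumers who want partial tokens use `await foreach` on the streaming method.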