BloodRush3D opened this issue 1 year ago
I also have the same problem. Please let me know if you have a solution. Thank you
Same here. Fiddler shows the request succeeds and responses are being returned; it's just that chatResult isn't populated correctly: Choices.Count is 1, but Choices[0].Message is null.
Looks like in the case of streaming, the response fragments are contained as "Content" inside "Delta". The following seems to work for me (inside the lambda):

```csharp
if (chatResult?.Choices[0]?.Delta?.Content != null)
{
    response += chatResult.Choices[0].Delta.Content;
}
```

Not sure whether this is a workaround or intended behavior, though :-)
I've asked ChatGPT to analyze the returned JSON lines (my prompt was "17 + 4 = "), here's the result FYI:
Yes, the structure of the "delta" inside "choices" is varying in the provided JSONL fragments. Here's a breakdown of the variations:
- In the first fragment, the "delta" contains the key "role" with the value "assistant": `{"delta": {"role": "assistant"}}`
- In the second through eighth fragments, the "delta" contains the key "content" with different string values: `{"delta": {"content": "17"}}`, `{"delta": {"content": " +"}}`, `{"delta": {"content": " "}}`, `{"delta": {"content": "4"}}`, `{"delta": {"content": " ="}}`, `{"delta": {"content": " "}}`, `{"delta": {"content": "21"}}`
- In the ninth fragment, the "delta" object is empty: `{"delta": {}}`
These variations in the "delta" structure reflect different aspects of the chat completion, such as content, role, or an empty delta signifying the end of a message.
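To make that handling concrete, here is a small self-contained sketch using plain System.Text.Json, with no OpenAI client library involved. The `DeltaDemo` class and `Accumulate` helper are my own names for illustration; it walks the nine fragments above and appends only the content pieces, skipping the role-only first fragment and the empty terminator:

```csharp
using System;
using System.Text;
using System.Text.Json;

static class DeltaDemo
{
    // Accumulates only the "content" pieces from a stream of
    // chat-completion delta fragments (one JSON object per line).
    public static string Accumulate(string[] jsonLines)
    {
        var sb = new StringBuilder();
        foreach (var line in jsonLines)
        {
            using var doc = JsonDocument.Parse(line);
            var delta = doc.RootElement.GetProperty("delta");
            // The first fragment carries only "role" and the last is an
            // empty object; only the middle fragments carry "content".
            if (delta.TryGetProperty("content", out var content))
                sb.Append(content.GetString());
        }
        return sb.ToString();
    }

    public static void Main()
    {
        // The JSONL fragments from the "17 + 4 = " example above.
        string[] fragments =
        {
            "{\"delta\": {\"role\": \"assistant\"}}",
            "{\"delta\": {\"content\": \"17\"}}",
            "{\"delta\": {\"content\": \" +\"}}",
            "{\"delta\": {\"content\": \" \"}}",
            "{\"delta\": {\"content\": \"4\"}}",
            "{\"delta\": {\"content\": \" =\"}}",
            "{\"delta\": {\"content\": \" \"}}",
            "{\"delta\": {\"content\": \"21\"}}",
            "{\"delta\": {}}"
        };
        Console.WriteLine(Accumulate(fragments)); // prints "17 + 4 = 21"
    }
}
```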
Same issue here. chatResult.Choices[0].Delta.Content is also null.
wellhat, do you have a minimal sample that exhibits your problem that I could try? That way we could find out whether it's down to our environment or a difference in code.
@gekah
```csharp
public OpenAIClient GetOpenAIClient()
{
    return new OpenAIClient(new OpenAIAuthentication("sk-", "org-"));
}

public async Task<MyPrompt> SendMessage(List<MyPrompt> visiblePrompts)
{
    var chatPrompts = new List<ChatPrompt>
    {
        new ChatPrompt("user", "You are a helpful assistant."), // TODO - use system messages in later GPT versions
    };

    foreach (var val in visiblePrompts)
        chatPrompts.Add(val.Prompt);

    var chatRequest = new ChatRequest(chatPrompts, Model.GPT3_5_Turbo);
    MyPrompt toReturn = null;

    try
    {
        var api = GetOpenAIClient();
        await api.ChatEndpoint.StreamCompletionAsync(chatRequest, result =>
        {
            var firstChoice = result.FirstChoice;
            var content = firstChoice.Message?.Content ?? firstChoice.Delta?.Content;
            var role = firstChoice.Message?.Role ?? firstChoice.Delta?.Role;

            if (content == null)
                throw new ArgumentNullException(nameof(content), "Did not get message back from OpenAI content back from StreamCompletionAsync().");

            toReturn = new MyPrompt(new ChatPrompt(role, content));
        });
    }
    catch (Exception ex)
    {
        toReturn = new MyPrompt(new ChatPrompt("system", "Error reaching OpenAI"));
        Console.WriteLine(ex);
    }

    return toReturn;
}
```
Stacktrace:

```
[DOTNET] System.ArgumentNullException: Did not get message back from OpenAI content back from StreamCompletionAsync(). (Parameter 'content')
[DOTNET] at MySolution.Services.ChatbotService.<>c__DisplayClass6_0.<SendMessage>b__0(ChatResponse result) in /Users/jh/Projects/MySolution/MySolution/Services/ChatbotService.cs:line 53
[DOTNET] at OpenAI.Chat.ChatEndpoint.StreamCompletionAsync(ChatRequest chatRequest, Action`1 resultHandler, CancellationToken cancellationToken)
[DOTNET] at OpenAI.Chat.ChatEndpoint.StreamCompletionAsync(ChatRequest chatRequest, Action`1 resultHandler, CancellationToken cancellationToken)
[DOTNET] at MySolution.Services.ChatbotService.SendMessage(List`1 visiblePrompts) in /Users/jh/Projects/MySolution/MySolution/Services/ChatbotService.cs:line 47
```
I can see that I am being billed on https://platform.openai.com/account/usage
wellhat, AFAICT Content is null even in the Delta of the first and last fragments/JSONL lines; for now, try changing your code to ignore those instead of throwing on the first one. I guess you'll also need to accumulate the fragments inside the lambda before constructing your MyPrompt outside it, e.g. using a StringBuilder that you append every non-null Content to. Not knowing your MyPrompt class, I may be completely wrong here, though.
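For reference, here is a minimal accumulation sketch along those lines, assuming the same StreamCompletionAsync callback shape as in the code above (the exact property names may differ in your library version, so treat this as an illustration rather than a drop-in fix):

```csharp
// Sketch only: replaces the body of the try block in SendMessage above.
var sb = new StringBuilder();
string role = null;

await api.ChatEndpoint.StreamCompletionAsync(chatRequest, result =>
{
    var delta = result.FirstChoice?.Delta;
    // The first fragment carries only the role, the last is empty;
    // append only when a content piece is actually present.
    if (delta?.Role != null)
        role = delta.Role;
    if (delta?.Content != null)
        sb.Append(delta.Content);
});

// Construct MyPrompt once, after streaming has finished.
toReturn = new MyPrompt(new ChatPrompt(role ?? "assistant", sb.ToString()));
```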
That's right, @gekah, thanks for explaining that the lambda runs multiple times, once for each text fragment.
Sorry, the streaming results for ChatGPT conversations were added somewhat last minute and probably need better testing, error handling, convenience methods, etc. I know the deeply nested delta structures are not ideal. Feel free to submit a PR if you've got any suggestions, or I'll try to take a look and improve things a bit in the next week or two. I would recommend the non-streaming methods for now.
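For anyone following along, the non-streaming path avoids the delta handling entirely. A hypothetical sketch, assuming your library version exposes a GetCompletionAsync-style method on the chat endpoint (check the API of the version you're on, as method names vary):

```csharp
// Non-streaming: the full message arrives in one response,
// so FirstChoice.Message is populated and no accumulation is needed.
var result = await api.ChatEndpoint.GetCompletionAsync(chatRequest);
var content = result.FirstChoice.Message.Content;
```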
-Roger
Do you have any examples of using api.Chat.StreamChatEnumerableAsync or .StreamChatAsync? When I try to use either of them, I keep getting 'Object reference not set to an instance of an object.' (OpenAI_API.Chat.ChatChoice.Message.get returned null).