Closed: leduckhc closed this issue 2 months ago
/cc @agoncal (azure), @jdubois (azure)
Hi @leduckhc, which LLM provider are you using?
Sorry for the confusion - it's AzureOpenAiStreamingChatModel that is causing the problem so far - I have updated the issue description.
Currently the problem seems to arise with the AzureOpenAi provider. I am hypothesising that the same error might occur with the LocalAi provider, since the code that handles the response message seems to be similar to the AzureOpenAi code.
I see. I guess https://github.com/langchain4j/langchain4j/pull/1720 fixes it?
Hi,
this seems to affect AzureOpenAiStreamingChatModel. I am hypothesising that LocalAiStreamingChatModel might also be affected, since the code that handles the response seems to be similar to AzureOpenAiStreamingChatModel.
Describe the bug
Using AiService with streaming chat models (AzureOpenAiStreamingChatModel).
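The original reproduction code was not captured here. As an illustrative sketch only (not the reporter's code), a minimal AiServices setup on a streaming Azure model might look like the following; the endpoint, API key, deployment name and prompt are placeholder assumptions:

```java
import dev.langchain4j.model.azure.AzureOpenAiStreamingChatModel;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.service.AiServices;
import dev.langchain4j.service.TokenStream;

public class StreamingRepro {

    // AiService interface returning a TokenStream so the reply is streamed
    interface Assistant {
        TokenStream chat(String userMessage);
    }

    public static void main(String[] args) {
        // Placeholder Azure OpenAI configuration; endpoint, key and deployment
        // name are assumptions, not values from the original report
        StreamingChatLanguageModel model = AzureOpenAiStreamingChatModel.builder()
                .endpoint(System.getenv("AZURE_OPENAI_ENDPOINT"))
                .apiKey(System.getenv("AZURE_OPENAI_KEY"))
                .deploymentName("gpt-4o")
                .build();

        Assistant assistant = AiServices.builder(Assistant.class)
                .streamingChatLanguageModel(model)
                .build();

        // Stream the answer token by token; the error discussed in this issue
        // reportedly surfaced while the streamed response was being handled
        assistant.chat("Hello, how are you?")
                .onNext(System.out::print)
                .onComplete(response -> System.out.println("\n[done]"))
                .onError(Throwable::printStackTrace)
                .start();
    }
}
```

In a setup like this, any error raised while the provider-specific code assembles the streamed response would typically surface through the onError callback.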
Log and Stack trace
To Reproduce
Expected behavior
Additional context