Closed johnnyoshika closed 2 months ago
Hey @johnnyoshika! I'm here to help you with your LangChain.js issue. Let's figure this out together!
To resolve the issue where `AzureChatOpenAI` hangs when trying to stream chat results, ensure that the `streaming` property is set to `true` and that you have the appropriate callbacks to handle the streamed tokens. Here is an example of how to initialize `AzureChatOpenAI` with streaming enabled and callbacks set up:
```typescript
import { AzureChatOpenAI } from "@langchain/openai";

const chatModel = new AzureChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_OPENAI_KEY,
  // Note: this must be the bare instance name (e.g. "my-resource"),
  // not the full endpoint URL
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_ENDPOINT,
  azureOpenAIApiDeploymentName: AZURE_OPENAI_DEPLOYMENT,
  azureOpenAIApiVersion: AZURE_OPENAI_VERSION,
  temperature: 0.4,
  streaming: true, // Ensure streaming is enabled
  callbacks: [
    {
      async handleLLMNewToken(token) {
        console.log("New token received:", token);
      },
      async handleLLMEnd(output) {
        console.log("Streaming ended:", output);
      },
      async handleLLMError(error) {
        console.error("Error during streaming:", error);
      },
    },
  ],
});
```
```typescript
const stream = await chatModel
  .withConfig({
    runName: "Chat Demo",
    metadata: {
      conversation_id: "id-123",
    },
  })
  .stream([["user", "Hello"]]);

for await (const chunk of stream) {
  console.log("chunk", chunk);
}
```
Ensure that the `streaming` property is explicitly set to `true` and that you have the necessary callbacks to handle the streamed tokens properly [1][2].
Quick update. The request is properly being sent to LangSmith, but a response never appears as can be seen in this screenshot:
Hey, giving this a look now
Could you share that LangSmith trace or any of the non-private deployment details you sent? I just tried successfully with the latest version `2024-05-01-preview`.
Based on your screenshot, it looks like it's failing to connect and possibly mistakenly retrying with exponential backoff.
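If retrying is the culprit, one way to surface the underlying error instead of an endless spinner is to disable retries and set a request timeout. This is a sketch, not a confirmed fix: `maxRetries` and `timeout` are standard LangChain.js chat-model constructor options, and the env var names here mirror the snippet above.

```typescript
import { AzureChatOpenAI } from "@langchain/openai";

// Sketch: make the call fail fast so the real error surfaces,
// rather than retrying with exponential backoff and appearing to hang.
const chatModel = new AzureChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_OPENAI_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_DEPLOYMENT,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_VERSION,
  maxRetries: 0,   // don't retry on failure
  timeout: 10_000, // abort the request after 10s instead of spinning
});
```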
Here's the trace for the first run in the screenshot (the one that timed out): https://smith.langchain.com/public/daca436b-9929-451a-b313-bde9ac5d7556/r
Here's the trace for the second run in the screenshot (the one that's still spinning): https://smith.langchain.com/public/7b4b7b2c-4aa5-40a1-a7d7-b337630a7451/r
That's interesting that it worked for you. Where can I find version `2024-05-01-preview`? Thx
Note: most of the failed traces with Azure don't show a timeout but rather a continuous spinner that never ends (i.e. like the second trace example):
I noticed in the LangSmith trace that your `azure_openai_api_instance_name` is set to `https://examind-web-openai-local.openai.azure.com/`, whereas it should be `examind-web-openai-local`. You can refer to the source code I sent for more details (it's used to generate the final request URL).
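To illustrate the difference: LangChain.js interpolates the instance name into the request URL as `https://<instanceName>.openai.azure.com/...`, so passing the full endpoint URL produces a host that never resolves. A small helper (hypothetical, not part of LangChain) shows how the bare name relates to the endpoint URL:

```typescript
// Hypothetical helper: derive the bare Azure OpenAI instance name
// from a full endpoint URL. Not part of LangChain; for illustration only.
function instanceNameFromEndpoint(endpoint: string): string {
  // e.g. "https://examind-web-openai-local.openai.azure.com/"
  //   hostname -> "examind-web-openai-local.openai.azure.com"
  const host = new URL(endpoint).hostname;
  // Strip the Azure suffix to get the bare instance name.
  return host.replace(/\.openai\.azure\.com$/, "");
}

console.log(
  instanceNameFromEndpoint("https://examind-web-openai-local.openai.azure.com/")
);
// examind-web-openai-local
```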
@johnnyoshika
@jeasonnow oh wow, thank you. I'll give that a try tomorrow and report back 🙏
@jeasonnow that was exactly it. Thank you so much!
Closing issue.
Thank you @jeasonnow!!!!
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
None, it just hangs when calling `AzureChatOpenAI`'s `stream`. I've also tried `AzureOpenAI` and the result is the same.

Description
Trying to stream chat results from Azure. It works fine when using OpenAI's `ChatOpenAI`, but switching to Azure just hangs when calling `AzureChatOpenAI`'s `stream` (see code sample above). I've also tried `AzureOpenAI` and the result is the same.

A similar setup using `@azure/openai` worked fine for streaming results from Azure using the same API key and deployment, but I'm trying to switch to LangChain to send traces to LangSmith.

System Info
langchain@0.2.4
Windows 11
Node 18.19.0
NPM 10.8.0