langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Swapping ChatOpenAI with AzureChatOpenAI fails when streaming #5663

Closed johnnyoshika closed 2 months ago

johnnyoshika commented 2 months ago


Example Code

  import { AzureChatOpenAI, ChatOpenAI } from '@langchain/openai';

  const useAzure = true; // Set this to false and it will work

  const chatModel = useAzure
    ? new AzureChatOpenAI({
        azureOpenAIApiKey: process.env.AZURE_OPENAI_KEY,
        azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_ENDPOINT,
        azureOpenAIApiDeploymentName: AZURE_OPENAI_DEPLOYMENT,
        azureOpenAIApiVersion: AZURE_OPENAI_VERSION,
        temperature: 0.4,
        streaming: true,
      })
    : new ChatOpenAI({
        model: 'gpt-4o',
        temperature: 0.4,
        streaming: true,
      });

  // It hangs here when using azure.
  // .stream() never returns.
  const stream = await chatModel
    .withConfig({
      runName: 'Chat Demo',
      metadata: {
        conversation_id: 'id-123',
      },
    })
    .stream([['user', 'Hello']]);

  for await (const chunk of stream) {
    console.log('chunk', chunk);
  }

Error Message and Stack Trace (if applicable)

None; it just hangs when calling AzureChatOpenAI's .stream(). I've also tried AzureOpenAI and the result is the same.

Description

Trying to stream chat results from Azure. It works fine with OpenAI's ChatOpenAI, but switching to Azure just hangs when calling AzureChatOpenAI's .stream() (see code sample above). I've also tried AzureOpenAI and the result is the same.

A similar setup using @azure/openai worked fine for streaming results from Azure with the same API key and deployment, but I'm trying to switch to LangChain so I can send traces to LangSmith.
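Roughly, the @azure/openai code that worked looked like this (a minimal sketch assuming the 1.x OpenAIClient API; 'my-deployment' is a placeholder for the real deployment name):

import { OpenAIClient, AzureKeyCredential } from '@azure/openai';

const client = new OpenAIClient(
  process.env.AZURE_OPENAI_ENDPOINT!, // full https://<resource>.openai.azure.com/ URL
  new AzureKeyCredential(process.env.AZURE_OPENAI_KEY!),
);

// streamChatCompletions yields partial completions as they arrive.
const events = await client.streamChatCompletions('my-deployment', [
  { role: 'user', content: 'Hello' },
]);

for await (const event of events) {
  for (const choice of event.choices) {
    process.stdout.write(choice.delta?.content ?? '');
  }
}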

System Info

langchain@0.2.4
Windows 11
Node 18.19.0
NPM 10.8.0

dosubot[bot] commented 2 months ago

Hey @johnnyoshika! I'm here to help you with your LangChain.js issue. Let's figure this out together!

To resolve the issue where AzureChatOpenAI hangs when trying to stream chat results, ensure that the streaming property is set to true and that you have the appropriate callbacks to handle the streamed tokens. Here is an example of how to initialize AzureChatOpenAI with streaming enabled and callbacks set up:

const chatModel = new AzureChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_OPENAI_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_ENDPOINT,
  azureOpenAIApiDeploymentName: AZURE_OPENAI_DEPLOYMENT,
  azureOpenAIApiVersion: AZURE_OPENAI_VERSION,
  temperature: 0.4,
  streaming: true, // Ensure streaming is enabled
  callbacks: [
    {
      async handleLLMNewToken(token) {
        console.log("New token received:", token);
      },
      async handleLLMEnd(output) {
        console.log("Streaming ended:", output);
      },
      async handleLLMError(error) {
        console.error("Error during streaming:", error);
      },
    },
  ],
});

const stream = await chatModel
  .withConfig({
    runName: 'Chat Demo',
    metadata: {
      conversation_id: 'id-123',
    },
  })
  .stream([['user', 'Hello']]);

for await (const chunk of stream) {
  console.log('chunk', chunk);
}

Ensure that the streaming property is explicitly set to true and that you have the necessary callbacks to handle the streamed tokens properly [1][2].

To continue talking to Dosu, mention @dosu.

johnnyoshika commented 2 months ago

Quick update. The request is properly being sent to LangSmith, but a response never appears as can be seen in this screenshot:

[screenshot: LangSmith trace showing the request with no response]

jacoblee93 commented 2 months ago

Hey, giving this a look now

jacoblee93 commented 2 months ago

Could you share that LangSmith trace or any of the non-private deployment details you sent? I just tried successfully with the latest API version, 2024-05-01-preview.

Based on your screenshot, it looks like it's failing to connect and possibly mistakenly retrying with exponential backoff.
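If it is retrying, one way to make the failure visible is to cap retries and set a request timeout, e.g. (a sketch, not a fix: maxRetries and timeout are standard constructor options on these chat models, and 2024-05-01-preview is the API version mentioned above):

const chatModel = new AzureChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_OPENAI_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_ENDPOINT,
  azureOpenAIApiDeploymentName: AZURE_OPENAI_DEPLOYMENT,
  azureOpenAIApiVersion: '2024-05-01-preview',
  temperature: 0.4,
  streaming: true,
  maxRetries: 0,   // fail immediately instead of retrying with exponential backoff
  timeout: 10_000, // milliseconds; abort a hung request so an error actually surfaces
});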

johnnyoshika commented 2 months ago

Here's the trace for the first run in the screenshot (the one that timed out): https://smith.langchain.com/public/daca436b-9929-451a-b313-bde9ac5d7556/r

Here's the trace for the second run in the screenshot (the one that's still spinning): https://smith.langchain.com/public/7b4b7b2c-4aa5-40a1-a7d7-b337630a7451/r

That's interesting that it worked for you. Where can I find version 2024-05-01-preview? Thx

Note: most of the failed traces with Azure don't show a timeout but rather a continuous spinner that never ends (i.e. like the second trace example): [screenshot]

jeasonnow commented 2 months ago

https://github.com/langchain-ai/langchainjs/blob/0c82d55e309727690208141f1538587d72eb2d86/libs/langchain-openai/src/utils/azure.ts#L32-L65

I noticed in LangSmith that your azure_openai_api_instance_name is set to https://examind-web-openai-local.openai.azure.com/, whereas it should be examind-web-openai-local. You can refer to the source code linked above for more details (it's the helper that generates the final request URL).
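Roughly (paraphrasing the linked helper rather than quoting it), the instance name is interpolated into the request URL, so passing the full endpoint URL produces a malformed host that never connects:

// Sketch of how the base URL is assembled from the config values
// ('my-deployment' is just a placeholder):
const buildBaseUrl = (instanceName: string, deploymentName: string) =>
  `https://${instanceName}.openai.azure.com/openai/deployments/${deploymentName}`;

buildBaseUrl('examind-web-openai-local', 'my-deployment');
// => https://examind-web-openai-local.openai.azure.com/openai/deployments/my-deployment

buildBaseUrl('https://examind-web-openai-local.openai.azure.com/', 'my-deployment');
// => https://https://examind-web-openai-local.openai.azure.com/.openai.azure.com/openai/deployments/my-deployment
//    (not a valid host, so the request never completes)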

@johnnyoshika

johnnyoshika commented 2 months ago

@jeasonnow oh wow, thank you. I'll give that a try tomorrow and report back 🙏

johnnyoshika commented 2 months ago

@jeasonnow that was exactly it. Thank you so much!
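For anyone hitting the same thing, the only change needed was the instance name value. A sketch of the working configuration (deployment and version constants as in the original example):

const chatModel = new AzureChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_OPENAI_KEY,
  azureOpenAIApiInstanceName: 'examind-web-openai-local', // resource name only, not the https://... endpoint URL
  azureOpenAIApiDeploymentName: AZURE_OPENAI_DEPLOYMENT,
  azureOpenAIApiVersion: AZURE_OPENAI_VERSION,
  temperature: 0.4,
  streaming: true,
});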

Closing issue.

jacoblee93 commented 2 months ago

Thank you @jeasonnow!!!!