langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License
12.15k stars 2.04k forks

Unable to obtain streaming output: the response is returned only after the entire text is received #5508

Closed liuhuapiaoyuan closed 3 months ago

liuhuapiaoyuan commented 3 months ago

Checked other resources

Example Code


import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";

const model = new ChatZhipuAI({
  model: "glm-3-turbo", // Available models:
});
for await (const chunk of await model.stream("hello, can you help me")) {
  console.log("|", chunk.content);
}

Error Message and Stack Trace (if applicable)

The message is received all at once; I cannot obtain streaming results.

Description

The message is received all at once; I cannot obtain streaming results.

System Info

langchain@0.2.0 | MIT | deps: 16 | versions: 271
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.2.0.tgz
.shasum: 555d84538962720cd7223f6c3ca4bd060978ebf3
.integrity: sha512-8c7Dg9OIPk4lFIQGyfOytXbUGLLSsxs9MV53cLODspkOGzaUpwy5FGBie30SrOxIEFJo+FDaJgpDAFO3Xi4NMw==
.unpackedSize: 3.4 MB

dependencies:
@langchain/core: ~0.2.0
@langchain/openai: ~0.0.28
@langchain/textsplitters: ~0.0.0
binary-extensions: ^2.2.0
js-tiktoken: ^1.0.12
js-yaml: ^4.1.0
jsonpointer: ^5.0.1
langchainhub: ~0.0.8
langsmith: ~0.1.7
ml-distance: ^4.0.0
openapi-types: ^12.1.3
p-retry: 4
uuid: ^9.0.0
yaml: ^2.2.1
zod: ^3.22.4
zod-to-json-schema: ^3.22.3

maintainers:

dist-tags: latest: 0.2.0 next: 0.2.0

dosubot[bot] commented 3 months ago

Hey @liuhuapiaoyuan! I'm Dosu, a helpful bot here to assist you with bugs, answer questions, and guide you to become a contributor. I'm here to help while you wait for a human maintainer. Thanks for sharing your issue—I'm taking a look at it and will get back to you shortly!

liuhuapiaoyuan commented 3 months ago

When I use ChatOpenAI, I can get streaming output, but with import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai" I am unable to obtain it.

liuhuapiaoyuan commented 3 months ago

I think the problem is that chain.stream is unable to pass the streaming parameter through to the underlying ChatZhipuAI.
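For context, the single-chunk behaviour described here matches what a default stream implementation does when a subclass provides no token-level override. The following is an illustrative sketch of that fallback pattern only; the class and method names are hypothetical, not the actual LangChain internals:

```typescript
// Hypothetical base class: if a subclass does not override stream(),
// the fallback just invokes the model once and yields the whole
// response as a single chunk.
abstract class FakeBaseChatModel {
  abstract invoke(input: string): Promise<string>;

  // Fallback: no native streaming, so emit one chunk with the full answer.
  async *stream(input: string): AsyncGenerator<string> {
    yield await this.invoke(input);
  }
}

// Stand-in for a model without a streaming override.
class NonStreamingModel extends FakeBaseChatModel {
  async invoke(input: string): Promise<string> {
    return `echo: ${input}`;
  }
}

// Stand-in for a model that overrides stream() with real per-token chunks.
class StreamingModel extends FakeBaseChatModel {
  async invoke(input: string): Promise<string> {
    return `echo: ${input}`;
  }

  async *stream(input: string): AsyncGenerator<string> {
    for (const token of `echo: ${input}`.split(" ")) {
      yield token + " ";
    }
  }
}

// Count how many chunks a model's stream() produces.
async function countChunks(model: FakeBaseChatModel): Promise<number> {
  let n = 0;
  for await (const _chunk of model.stream("hello, can you help me")) {
    n += 1;
  }
  return n;
}
```

Under this sketch, NonStreamingModel always produces exactly one chunk no matter how long the answer is, while StreamingModel produces one chunk per token, which is consistent with the single-message behaviour reported above.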

jeasonnow commented 3 months ago

ChatZhipuAI has not yet implemented a custom stream method, so chunked output is not available.

But you can try enabling the streaming flag instead:

const model = new ChatZhipuAI({
    ...otherConfig,
    streaming: true,
});
const result = await model.invoke('hello, can you help me');
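With this workaround, tokens arrive through callbacks rather than through an async iterator. As a rough sketch of the general pattern for bridging a callback-based token source (like handleLLMNewToken) back into a for await loop, assuming a hypothetical callbackModel stand-in rather than the real ChatZhipuAI:

```typescript
type TokenCallback = (token: string) => void;

// Pretend model: delivers tokens via a callback and resolves with the
// full text, mimicking a callback-only streaming API.
async function callbackModel(
  prompt: string,
  onToken: TokenCallback
): Promise<string> {
  const tokens = ["Hello", ", ", "world", "!"];
  for (const t of tokens) {
    await Promise.resolve(); // simulate asynchronous delivery
    onToken(t);
  }
  return tokens.join("");
}

// Bridge: buffer callback tokens and re-emit them as an async generator,
// so callers can consume them with `for await`.
async function* streamTokens(prompt: string): AsyncGenerator<string> {
  const buffer: string[] = [];
  let done = false;
  const finished = callbackModel(prompt, (t) => buffer.push(t)).then(() => {
    done = true;
  });
  while (!done || buffer.length > 0) {
    if (buffer.length > 0) {
      yield buffer.shift()!;
    } else {
      await Promise.resolve(); // give pending callbacks a chance to run
    }
  }
  await finished;
}
```

This is essentially what a native stream implementation would do internally; until one exists for ChatZhipuAI, the callback route above is the practical option.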

dosubot[bot] commented 3 months ago

To obtain streaming transmission using ChatZhipuAI from the @langchain/community/chat_models/zhipuai package, you need to ensure that the streaming parameter is set to true and handle the streamed tokens using callbacks. Here is an example:

import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";
import { HumanMessage } from "@langchain/core/messages";

// Create an instance of ChatZhipuAI with streaming enabled
const chatModel = new ChatZhipuAI({
  apiKey: "your_api_key_here",
  modelName: "glm-3-turbo", // or any other model name you prefer
  streaming: true, // Enable streaming
  callbacks: [
    {
      async handleLLMNewToken(token) {
        console.log("New token received:", token);
      },
    },
  ],
});

// Define your messages
const messages = [new HumanMessage("Hello, can you help me?")];

// Call the chat model through the public invoke API
chatModel.invoke(messages).then((result) => {
  console.log(result.content);
}).catch((error) => {
  console.error("Error:", error);
});

In this example:

- streaming: true enables streaming on the underlying request.
- The handleLLMNewToken callback receives and logs each token as it arrives.

If you still face issues, ensure your environment and dependencies are correctly set up and that you are using the latest version of the langchainjs library. If the problem persists, it might be worth checking if there are any known issues or limitations with the streaming capabilities of ChatZhipuAI in the repository.

#### Details

**Sources**
- [Langchain Agent Streaming LLM & Function Call & Tools](https://github.com/langchain-ai/langchainjs/discussions/4771)
- [Issue streaming from ConversationalRetrievalQAChain](https://github.com/langchain-ai/langchainjs/discussions/2333)
- [Streaming not working when using Ollama + RetrievalQAChain](https://github.com/langchain-ai/langchainjs/issues/2661)
- [libs/langchain-community/src/chat_models/zhipuai.ts](https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-community/src/chat_models/zhipuai.ts)
- [libs/langchain-community/src/chat_models/tests/chatzhipuai.int.test.ts](https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-community/src/chat_models/tests/chatzhipuai.int.test.ts)
- [examples/src/models/chat/integration_zhipuai.ts](https://github.com/langchain-ai/langchainjs/blob/main/examples/src/models/chat/integration_zhipuai.ts)
**About Dosu**
This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

To continue the conversation, mention @dosu.

liuhuapiaoyuan commented 3 months ago

Okay, I think this should become a feature request.

jacoblee93 commented 3 months ago

Thank you @jeasonnow!