Closed: liuhuapiaoyuan closed this issue 3 months ago
Hey @liuhuapiaoyuan! I'm Dosu, a helpful bot here to assist you with bugs, answer questions, and guide you to become a contributor. I'm here to help while you wait for a human maintainer. Thanks for sharing your issue—I'm taking a look at it and will get back to you shortly!
When I use ChatOpenAI, I can get streaming output, but when I use `import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";` I am unable to obtain it.
I think the problem is that `chain.stream` is unable to pass the streaming parameter through to the underlying ChatZhipuAI, since ChatZhipuAI has not yet implemented a custom `stream`. But you can try:

```typescript
const model = new ChatZhipuAI({
  ...otherConfig,
  streaming: true,
});
const result = await model.invoke("hello, can you help me");
```
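To illustrate the flow described above (tokens are pushed through callbacks, while `invoke` resolves with the full text only after the stream ends), here is a minimal, dependency-free sketch. `FakeStreamingModel` is a hypothetical stand-in for ChatZhipuAI used only for illustration, not a real langchain class:

```typescript
// A callback object shaped like langchain's handleLLMNewToken handler.
type TokenCallback = { handleLLMNewToken: (token: string) => Promise<void> | void };

// Hypothetical stand-in for a streaming chat model: it pushes each token
// through the callbacks, then resolves with the joined full text, which
// mirrors how `invoke` behaves when `streaming: true` is set.
class FakeStreamingModel {
  constructor(private callbacks: TokenCallback[]) {}

  async invoke(tokens: string[]): Promise<string> {
    for (const token of tokens) {
      for (const cb of this.callbacks) {
        await cb.handleLLMNewToken(token);
      }
    }
    return tokens.join("");
  }
}

async function main() {
  const received: string[] = [];
  const model = new FakeStreamingModel([
    { handleLLMNewToken: (t) => { received.push(t); } },
  ]);
  const full = await model.invoke(["Hel", "lo", "!"]);
  console.log(received); // tokens arrived incrementally through the callback
  console.log(full);     // "Hello!"
}

main();
```

The point of the sketch: the streamed tokens are only observable through the callback; the promise returned by `invoke` does not expose them incrementally.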
To obtain streaming output using ChatZhipuAI from the `@langchain/community/chat_models/zhipuai` package, you need to ensure that the `streaming` parameter is set to `true` and handle the streamed tokens using callbacks. Here is an example:
```typescript
import { ChatZhipuAI } from "@langchain/community/chat_models/zhipuai";
import { HumanMessage } from "@langchain/core/messages";

// Create an instance of ChatZhipuAI with streaming enabled
const chatModel = new ChatZhipuAI({
  apiKey: "your_api_key_here",
  modelName: "glm-3-turbo", // or any other model name you prefer
  streaming: true, // Enable streaming
  callbacks: [
    {
      async handleLLMNewToken(token: string) {
        console.log("New token received:", token);
      },
    },
  ],
});

// Define your messages
const messages = [new HumanMessage("Hello, can you help me?")];

// Call the chat model; invoke resolves with the full message once the stream ends
chatModel
  .invoke(messages)
  .then((result) => {
    console.log(result.content);
  })
  .catch((error) => {
    console.error("Error:", error);
  });
```
In this example:

- The `streaming` parameter is set to `true` to enable streaming.
- The `callbacks` array includes a function to handle each new token received during the streaming process.
- The `invoke` method is used to run the model; the streamed tokens arrive through the callback, and the promise resolves with the complete output.

If you still face issues, ensure your environment and dependencies are correctly set up and that you are using the latest version of the langchainjs library. If the problem persists, it might be worth checking whether there are any known issues or limitations with the streaming capabilities of ChatZhipuAI in the repository.
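As a stopgap while `.stream()` is unimplemented for ChatZhipuAI, the tokens pushed from `handleLLMNewToken` can be adapted into an async iterable, so the call site still reads like `for await`. Below is a minimal, framework-free sketch of that adapter; `tokenQueue` and `demo` are hypothetical names, not langchain APIs:

```typescript
// A tiny single-consumer queue that bridges callback-style pushes
// into the async iteration protocol.
function tokenQueue() {
  const buffer: string[] = [];
  let resolveNext: ((v: IteratorResult<string>) => void) | null = null;
  let done = false;

  return {
    // Called from the token callback (e.g. inside handleLLMNewToken).
    push(token: string) {
      if (resolveNext) {
        resolveNext({ value: token, done: false });
        resolveNext = null;
      } else {
        buffer.push(token);
      }
    },
    // Called once the model's promise resolves and no more tokens will come.
    close() {
      done = true;
      if (resolveNext) {
        resolveNext({ value: undefined as any, done: true });
        resolveNext = null;
      }
    },
    [Symbol.asyncIterator](): AsyncIterator<string> {
      return {
        next(): Promise<IteratorResult<string>> {
          if (buffer.length > 0) {
            return Promise.resolve({ value: buffer.shift()!, done: false });
          }
          if (done) {
            return Promise.resolve({ value: undefined as any, done: true });
          }
          // No token buffered yet: park the consumer until the next push.
          return new Promise((res) => { resolveNext = res; });
        },
      };
    },
  };
}

async function demo() {
  const queue = tokenQueue();
  // In real use, queue.push would be called from handleLLMNewToken,
  // and queue.close after the model's invoke promise resolves.
  setTimeout(() => { queue.push("Hello"); queue.push(" world"); queue.close(); }, 0);

  const parts: string[] = [];
  for await (const token of queue) parts.push(token);
  console.log(parts.join("")); // "Hello world"
}

demo();
```

This keeps the consuming code in the shape `.stream()` would eventually provide, so migrating later should only require deleting the adapter.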
To continue the conversation, mention @dosu.
Okay, I think this should become a feature request.
Thank you @jeasonnow!
Checked other resources
Example Code
code :
Error Message and Stack Trace (if applicable)
The whole message is received at once, and I cannot obtain streaming results.
Description
The whole message is received at once, and I cannot obtain streaming results.
System Info
```
langchain@0.2.0 | MIT | deps: 16 | versions: 271
Typescript bindings for langchain
https://github.com/langchain-ai/langchainjs/tree/main/langchain/

keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores

dist
.tarball: https://registry.npmjs.org/langchain/-/langchain-0.2.0.tgz
.shasum: 555d84538962720cd7223f6c3ca4bd060978ebf3
.integrity: sha512-8c7Dg9OIPk4lFIQGyfOytXbUGLLSsxs9MV53cLODspkOGzaUpwy5FGBie30SrOxIEFJo+FDaJgpDAFO3Xi4NMw==
.unpackedSize: 3.4 MB

dependencies:
@langchain/core: ~0.2.0
binary-extensions: ^2.2.0
jsonpointer: ^5.0.1
ml-distance: ^4.0.0
uuid: ^9.0.0
zod: ^3.22.4
@langchain/openai: ~0.0.28
js-tiktoken: ^1.0.12
langchainhub: ~0.0.8
openapi-types: ^12.1.3
yaml: ^2.2.1
@langchain/textsplitters: ~0.0.0
js-yaml: ^4.1.0
langsmith: ~0.1.7
p-retry: 4
zod-to-json-schema: ^3.22.3

maintainers:

dist-tags:
latest: 0.2.0
next: 0.2.0
```