Open airhorns opened 2 weeks ago
To address the issue where the ChatGroq model defaults to streaming in a LangChain streaming context, you need to ensure that the `streaming` property is explicitly set to `false` in the model configuration. This should override the default behavior even in a streaming context. Here's how you can configure it:
```typescript
import { ChatGroq } from "@langchain/groq";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatGroq({
  model: "mixtral-8x7b-32768",
  streaming: false, // Explicitly disable streaming
});

// Bind the response format to JSON and invoke the model
await model
  .bind({ response_format: { type: "json_object" } })
  .invoke([new HumanMessage("generate some example JSON")]);
```
The issue arises because the LangChain context might force streaming due to its internal logic, which checks for streaming conditions like signals or timeouts [1]. By setting `streaming: false`, you should be able to prevent this behavior and use the JSON response format without encountering errors.
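One reason JSON mode and streaming combine poorly is that a streamed prefix of a JSON reply is not itself valid JSON, so the content can only be parsed once it has arrived whole. A minimal local illustration (no API call; the string below is a stand-in for what the model might return, not actual Groq output):

```typescript
// Stand-in for the `content` of a reply produced with
// response_format: { type: "json_object" } -- one complete JSON string.
const full = '{"name": "example", "values": [1, 2, 3]}';

// A streamed prefix of that same reply is not parseable on its own.
const partialChunk = full.slice(0, 15); // '{"name": "examp'
let partialParses = true;
try {
  JSON.parse(partialChunk);
} catch {
  partialParses = false;
}

// With streaming disabled, the content arrives complete and parses cleanly.
const parsed = JSON.parse(full);
```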
To continue talking to Dosu, mention @dosu.
having the same issue
If no one else is working on this right now, I'd like to take a look at this issue. Thanks
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
Description
`response_format: "json"`, documented here: https://console.groq.com/docs/text-chat

I think it's not safe to just assume that the model can stream if the ambient context wants it to stream. If specific params are passed to the invocation that tell it not to stream, it shouldn't. Without this ability to override, I can't use Groq's JSON support at all from within LangGraph or a bigger LangChain pipeline that is streaming tokens from other models!
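The override behavior requested above can be sketched as a small decision function. This is a hypothetical illustration, not LangChain's actual internal code; the name `shouldStream` and both parameters are assumptions made for the sketch:

```typescript
// Hypothetical sketch of the requested precedence: an explicit per-model
// `streaming` setting wins over the ambient context's wish to stream.
function shouldStream(
  explicitStreaming: boolean | undefined,
  ambientWantsStreaming: boolean
): boolean {
  if (explicitStreaming !== undefined) {
    // The caller chose explicitly -- honor that choice unconditionally.
    return explicitStreaming;
  }
  // No explicit choice: fall back to the surrounding streaming context.
  return ambientWantsStreaming;
}
```

Under this rule, a model constructed with `streaming: false` would never be forced to stream by an outer LangGraph or LangChain run, while models that leave `streaming` unset would keep today's context-driven behavior.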
System Info