Closed migalv closed 14 hours ago
Hey @migalv,

`LLMChain` is deprecated and doesn't support streaming; you should use LCEL instead. The equivalent of your sample code using LCEL is:
```dart
final chatModel = ChatOpenAI(
  apiKey: openAiApiKey,
  defaultOptions: const ChatOpenAIOptions(model: 'gpt-4o-mini'),
);
final prompt = await PromptTemplate.fromFile('my_sample_template.txt');
final chain = prompt.pipe(chatModel).pipe(StringOutputParser());
final llmResponseStream = chain.stream({'input': userLastMessage});

final stringBuffer = StringBuffer();
yield* llmResponseStream.map<String>((response) {
  stringBuffer.write(response);
  return stringBuffer.toString();
});
```
Cheers.
Oh okay, got it, thanks for the quick response @davidmigloz
System Info
Flutter 3.22.2, Dart 3.4.3
Related Components
Reproduction
I have this very simple `LLMChain`, but when I call the `stream()` method, the stream only emits the last event.

Expected behavior

When I call the `ChatModel` directly, however, the streaming works as expected.

I investigated the `stream()` method that the `LLMChain` class is using, and it's the default implementation from the `Runnable` class. Since the `LLMChain` class is not overriding this method, we only get the last event. Am I missing something? Do I need to create my own `StreamableChain` class that overrides this method?
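For context on why the chain only emits one event: a default `stream()` on a runnable base class typically just wraps `invoke()` and yields the final result once, so nothing upstream streams incrementally. The sketch below is a hypothetical illustration of that pattern, not the actual langchain_dart source:

```dart
import 'dart:async';

// Hypothetical base class illustrating the behaviour described above:
// the default stream() awaits the complete result of invoke() and
// emits it as a single event, with no token-by-token output.
abstract class Runnable<I, O> {
  Future<O> invoke(I input);

  // Default streaming: one final value, not an incremental stream.
  Stream<O> stream(I input) async* {
    yield await invoke(input);
  }
}

// Toy runnable to demonstrate the single-event stream.
class UpperCaser extends Runnable<String, String> {
  @override
  Future<String> invoke(String input) async => input.toUpperCase();
}

Future<void> main() async {
  final events = await UpperCaser().stream('hello').toList();
  print(events); // a single event: [HELLO]
}
```

A subclass that genuinely streams (like a chat model) overrides `stream()` to yield chunks as they arrive, which is why calling the `ChatModel` directly behaves differently from a chain that inherits the default.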