davidmigloz / langchain_dart

Build LLM-powered Dart/Flutter applications.
https://langchaindart.dev
MIT License

Streaming does not work with LLMChain #580

Closed · migalv closed this issue 14 hours ago

migalv commented 17 hours ago

System Info

Flutter 3.22.2
Dart 3.4.3

langchain: ^0.7.4
langchain_openai: ^0.7.1


Reproduction

I have this very simple LLMChain, but when I call the stream() method, the stream emits only one final event.

final chatModel = ChatOpenAI(
  apiKey: env['openAiModelsApiKey'],
  defaultOptions: const ChatOpenAIOptions(model: 'gpt-4o-mini'),
);

final prompt = await PromptTemplate.fromFile('my_sample_template.txt');
final llmChain = LLMChain(llm: chatModel, prompt: prompt);

final llmResponseStream = llmChain.stream({'input': userLastMessage});

final stringBuffer = StringBuffer();
yield* llmResponseStream.map<String>((response) {
  // Only a single event ever arrives, containing the full 'output' value.
  stringBuffer.write(response['output']);
  return stringBuffer.toString();
});

Expected behavior

When I call the ChatModel directly, however, streaming works as expected.

final chatModel = ChatOpenAI(
  apiKey: env['openAiModelsApiKey'],
  defaultOptions: const ChatOpenAIOptions(model: 'gpt-4o-mini'),
);

final prompt = await PromptTemplate.fromFile('my_sample_template.txt');

// The chat model takes a PromptValue rather than a raw input map,
// so the template is formatted first.
final promptValue = await prompt.invoke({'input': userLastMessage});
final chatModelResponseStream = chatModel.stream(promptValue);

final stringBuffer = StringBuffer();
yield* chatModelResponseStream.map<String>((response) {
  stringBuffer.write(response.outputAsString);
  return stringBuffer.toString();
});

I investigated the stream() method that the LLMChain class uses, and it's the default implementation from the Runnable class:

/// Streams the output of invoking the [Runnable] on the given [input].
///
/// - [input] - the input to invoke the [Runnable] on.
/// - [options] - the options to use when invoking the [Runnable].
Stream<RunOutput> stream(
  final RunInput input, {
  final CallOptions? options,
}) async* {
  // By default, it just emits the result of calling `.invoke`
  // Subclasses should override this method if they support streaming output
  yield await invoke(input, options: options);
}

Since the LLMChain class does not override this method, we only get a single event containing the final result.
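
To illustrate, here is a standalone plain-Dart sketch (illustrative only, not the library's actual code) of why a stream built this way can only ever emit one event:

Future<String> invoke(String input) async => 'full response for: $input';

// Mirrors the default behavior above: the complete result is awaited
// first and then emitted as a single event.
Stream<String> stream(String input) async* {
  yield await invoke(input);
}

Future<void> main() async {
  await for (final event in stream('hello')) {
    print(event); // Printed exactly once, with the whole response.
  }
}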

Am I missing something? Do I need to create my own StreamableChain class that overrides this method?

davidmigloz commented 14 hours ago

Hey @migalv,

LLMChain is deprecated and doesn't support streaming; you should use LCEL (LangChain Expression Language) instead.

The equivalent of your sample code using LCEL is:

final chatModel = ChatOpenAI(
  apiKey: openAiApiKey,
  defaultOptions: const ChatOpenAIOptions(model: 'gpt-4o-mini'),
);

final prompt = await PromptTemplate.fromFile('my_sample_template.txt');
final chain = prompt.pipe(chatModel).pipe(StringOutputParser());

final llmResponseStream = chain.stream({'input': userLastMessage});

final stringBuffer = StringBuffer();
yield* llmResponseStream.map<String>((response) {
  stringBuffer.write(response);
  return stringBuffer.toString();
});
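
With LCEL, streaming is propagated through every step of the pipeline, so the chain emits a chunk for each piece of output the model produces. For reference, here is a self-contained sketch of the same chain wrapped in a complete async* generator (the function name, parameter names, and API-key plumbing are illustrative assumptions, not part of your original code):

import 'package:langchain/langchain.dart';
import 'package:langchain_openai/langchain_openai.dart';

// Illustrative wrapper: emits the accumulated response text after
// every streamed chunk.
Stream<String> answerStream(String userLastMessage, String apiKey) async* {
  final chatModel = ChatOpenAI(
    apiKey: apiKey,
    defaultOptions: const ChatOpenAIOptions(model: 'gpt-4o-mini'),
  );

  final prompt = await PromptTemplate.fromFile('my_sample_template.txt');
  final chain = prompt.pipe(chatModel).pipe(StringOutputParser());

  final stringBuffer = StringBuffer();
  yield* chain.stream({'input': userLastMessage}).map((chunk) {
    stringBuffer.write(chunk); // Each chunk is an incremental text fragment.
    return stringBuffer.toString();
  });
}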

Cheers.

migalv commented 13 hours ago

Oh okay, got it šŸ™ŒšŸ» Thanks for the quick response, @davidmigloz!