langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

chain.ainvoke() will result in streaming output #20980

Open · Huarong opened this issue 2 weeks ago

Huarong commented 2 weeks ago

Checked other resources

Example Code

None at the moment.

Error Message and Stack Trace (if applicable)

No response

Description

In LangGraph we use chain.ainvoke() inside an inner node and app.astream_events() for the whole graph app, but we found that the chain.ainvoke() call produces streaming output tokens.

When I dug into the code, I found it may be caused by the following code:

https://github.com/langchain-ai/langchain/blob/804390ba4bcc306b90cb6d75b7f01a4231ab6463/libs/core/langchain_core/language_models/chat_models.py#L684-L701

type(self)._astream is ChatOpenAI._astream, and kwargs is an empty {} that does not contain the stream key from params. The LangGraph app has a LogStreamCallbackHandler in run_manager.handlers, so the if statement is True and the code produces streaming output, which is not expected.

Maybe you should add the 'stream' key to kwargs from params. I tried that and it solved my problem.
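
To make the report concrete, here is a rough paraphrase of the decision I am describing (simplified and loosely named, not the actual chat_models.py source; the handler is matched by class name only to keep the sketch self-contained):

from typing import Any, Optional

from langchain_core.callbacks import AsyncCallbackManagerForLLMRun
from langchain_core.language_models.chat_models import BaseChatModel


def should_stream_sketch(
    model: BaseChatModel,
    run_manager: Optional[AsyncCallbackManagerForLLMRun],
    **kwargs: Any,
) -> bool:
    # The model subclass implements _astream (ChatOpenAI does).
    implements_astream = type(model)._astream is not BaseChatModel._astream
    # The point of this report: "stream" never makes it into kwargs here,
    # so stream=False configured on the LLM cannot veto streaming.
    explicit = kwargs.get("stream")
    if explicit is not None:
        return implements_astream and bool(explicit)
    # With astream_events()/astream_log() a LogStreamCallbackHandler is
    # attached, so this fallback turns streaming on even for plain ainvoke().
    handler_wants_stream = run_manager is not None and any(
        type(h).__name__ == "LogStreamCallbackHandler" for h in run_manager.handlers
    )
    return implements_astream and handler_wants_stream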

System Info

System Information

> OS: Darwin
> OS Version: Darwin Kernel Version 23.2.0: Wed Nov 15 21:53:34 PST 2023; root:xnu-10002.61.3~2/RELEASE_ARM64_T8103
> Python Version: 3.11.2 (main, Feb 21 2024, 12:24:36) [Clang 15.0.0 (clang-1500.1.0.2.5)]

Package Information

> langchain_core: 0.1.46
> langchain: 0.1.16
> langchain_community: 0.0.34
> langsmith: 0.1.22
> langchain_openai: 0.0.6
> langchain_text_splitters: 0.0.1
> langgraph: 0.0.39

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

> langserve

eyurtsev commented 2 weeks ago

AFAIK it's only turned on when stream=True -- which forces the model to stream even when using ainvoke.

This is not a bug; it is just the way the model interface currently works.
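
For illustration (a minimal sketch, assuming an OpenAI API key is configured in the environment): a model constructed with streaming forced on still returns one complete message from ainvoke; the token-by-token streaming only shows up through callbacks and astream_events, not in the return value.

import asyncio

from langchain_openai import ChatOpenAI


async def main() -> None:
    model = ChatOpenAI(streaming=True)  # force the streaming code path
    message = await model.ainvoke("hello")
    # Still a single AIMessage; the chunks only surface via callbacks/events.
    print(type(message).__name__, message.content)


asyncio.run(main())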

Huarong commented 2 weeks ago

> AFAIK it's only turned on when stream=True -- which forces the model to stream even when using ainvoke.

@eyurtsev But I didn't set stream=True. stream is actually False on the LLM, but it isn't passed into kwargs. I just called astream_events() on the LangGraph app, and that seems to make every chain call stream. So how can I keep some intermediate chains that use ainvoke() non-streaming?
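
The only call-site idea I have so far is to bind an explicit stream=False onto the model so the key actually reaches kwargs. This is a guess based on the kwargs observation above, not a confirmed or intended way to disable this behaviour:

from langchain_openai import ChatOpenAI

# Hypothetical workaround (unverified): .bind() injects stream=False into the
# kwargs forwarded to the chat model, so the streaming check would see an
# explicit False instead of falling back to the attached callback handlers.
non_streaming_model = ChatOpenAI().bind(stream=False)

# Inside the intermediate node, call it the same way as before, e.g.:
# result = await non_streaming_model.ainvoke([("human", "hello")])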

eyurtsev commented 2 weeks ago
from langchain_core.runnables import chain
from langchain_openai import ChatOpenAI

model = ChatOpenAI()

@chain
async def thingy(inputs):
    return await model.ainvoke(inputs["prompt"])

# run inside an event loop (e.g. a notebook)
async for event in thingy.astream_events({"prompt": "hello"}, version="v1"):
    print(event)

The model should be invoked with ainvoke

Huarong commented 1 week ago
> The model should be invoked with ainvoke

@eyurtsev The model in that code actually goes through astream rather than ainvoke at runtime, as the on_chat_model_stream events in the output below show.

The code:

import asyncio
from langchain_core.runnables import chain
from langchain_openai import ChatOpenAI

model = ChatOpenAI()

@chain
async def thingy(inputs):
    return await model.ainvoke(inputs["prompt"])

async def main():
    async for event in thingy.astream_events({"prompt": "hello"}, version="v1"):
        print(event)

if __name__ == "__main__":
    asyncio.run(main())

The output:

{'event': 'on_chain_start', 'run_id': '06beb82d-5097-4fec-b731-4fc2c98ace91', 'name': 'thingy', 'tags': [], 'metadata': {}, 'data': {'input': {'prompt': 'hello'}}}
{'event': 'on_chat_model_start', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'input': {'messages': [[HumanMessage(content='hello')]]}}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content='', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content='Hello', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content='!', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content=' How', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content=' can', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content=' I', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content=' assist', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content=' you', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content=' today', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content='?', id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_stream', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'chunk': AIMessageChunk(content='', response_metadata={'finish_reason': 'stop'}, id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chat_model_end', 'name': 'ChatOpenAI', 'run_id': '55e39a9b-b791-4d5b-b381-ca7d93371515', 'tags': [], 'metadata': {}, 'data': {'input': {'messages': [[HumanMessage(content='hello')]]}, 'output': {'generations': [[{'text': 'Hello! How can I assist you today?', 'generation_info': {'finish_reason': 'stop'}, 'type': 'ChatGeneration', 'message': AIMessage(content='Hello! How can I assist you today?', response_metadata={'finish_reason': 'stop'}, id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}]], 'llm_output': None, 'run': None}}}
{'event': 'on_chain_stream', 'run_id': '06beb82d-5097-4fec-b731-4fc2c98ace91', 'tags': [], 'metadata': {}, 'name': 'thingy', 'data': {'chunk': AIMessage(content='Hello! How can I assist you today?', response_metadata={'finish_reason': 'stop'}, id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}
{'event': 'on_chain_end', 'name': 'thingy', 'run_id': '06beb82d-5097-4fec-b731-4fc2c98ace91', 'tags': [], 'metadata': {}, 'data': {'output': AIMessage(content='Hello! How can I assist you today?', response_metadata={'finish_reason': 'stop'}, id='run-55e39a9b-b791-4d5b-b381-ca7d93371515')}}

The versions:

(.venv) ➜  latest_langchain python -m langchain_core.sys_info

System Information
------------------
> OS:  Darwin
> OS Version:  Darwin Kernel Version 23.2.0: Wed Nov 15 21:53:34 PST 2023; root:xnu-10002.61.3~2/RELEASE_ARM64_T8103
> Python Version:  3.11.2 (main, Feb 21 2024, 12:24:36) [Clang 15.0.0 (clang-1500.1.0.2.5)]

Package Information
-------------------
> langchain_core: 0.1.52
> langchain: 0.1.17
> langchain_community: 0.0.37
> langsmith: 0.1.54
> langchain_openai: 0.1.6
> langchain_text_splitters: 0.0.1

Packages not installed (Not Necessarily a Problem)
--------------------------------------------------
The following packages were not found:

> langgraph
> langserve