langchain-ai / langgraph

Build resilient language agents as graphs.
https://langchain-ai.github.io/langgraph/
MIT License
5.48k stars 867 forks

In a LangGraph app using Azure OpenAI LLMs in multiple nodes, the app still streams all LLMs' tokens despite setting streaming=False on some of them. #278

Open haolxx opened 5 months ago

haolxx commented 5 months ago

Checked other resources

Example Code

This happened with both streaming approaches:


```python
from langchain_core.messages import HumanMessage

inputs = [HumanMessage(content="what is the weather in sf")]

async for event in app.astream_events(inputs, version="v1"):
    kind = event["event"]
    if kind == "on_chat_model_stream":
        content = event["data"]["chunk"].content
        if content:
            # Empty content in the context of OpenAI means
            # that the model is asking for a tool to be invoked.
            # So we only print non-empty content
            print(content, end="|")
    elif kind == "on_tool_start":
        print("--")
        print(
            f"Starting tool: {event['name']} with inputs: {event['data'].get('input')}"
        )
    elif kind == "on_tool_end":
        print(f"Done tool: {event['name']}")
        print(f"Tool output was: {event['data'].get('output')}")
        print("--")
```

```python
inputs = {"messages": [HumanMessage(content="what is the weather in sf?")]}

async for output in app.astream_log(inputs, include_types=["llm"]):
    # astream_log() yields the requested logs (here LLMs) in JSONPatch format
    for op in output.ops:
        if op["path"] == "/streamed_output/-":
            # this is the output from .stream()
            ...
        elif op["path"].startswith("/logs/") and op["path"].endswith(
            "/streamed_output/-"
        ):
            # because we chose to only include LLMs, these are LLM tokens
            print(op["value"])
```
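To illustrate the JSONPatch path matching used above, here is the same filtering logic run standalone over mock ops (the `/logs/AzureChatOpenAI/...` key and the token values are made up for illustration; a real run yields ops of this shape from `output.ops`):

```python
def llm_tokens(ops):
    """Collect streamed LLM tokens from a list of JSONPatch ops."""
    tokens = []
    for op in ops:
        path = op["path"]
        if path == "/streamed_output/-":
            # whole-graph stream output; ignore it here
            continue
        if path.startswith("/logs/") and path.endswith("/streamed_output/-"):
            # per-LLM streamed token (only LLM logs were requested)
            tokens.append(op["value"])
    return tokens


# Mock ops mimicking the shape astream_log yields (values are illustrative).
mock_ops = [
    {"op": "add", "path": "/streamed_output/-", "value": {"messages": []}},
    {"op": "add", "path": "/logs/AzureChatOpenAI/streamed_output/-", "value": "Hel"},
    {"op": "add", "path": "/logs/AzureChatOpenAI/streamed_output/-", "value": "lo"},
]
print(llm_tokens(mock_ops))  # -> ['Hel', 'lo']
```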

Error Message and Stack Trace (if applicable)

No response

Description

In a LangGraph app using Azure OpenAI LLMs in multiple nodes, the app still streams all LLMs' tokens despite setting streaming=False on some of them.

System Info

python 3.11

langgraph = "^0.0.30"
langchain = "^0.1.4"
langchain-openai = "^0.1.1"

nfcampos commented 4 months ago

Hi, I'm not sure what the question is here. If you call your graph with astream_events, we try to stream output from everything called inside it. Were you expecting to call astream_events but not get streaming output?
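Since astream_events emits events for every runnable in the graph, a common workaround is to tag the LLM whose tokens you actually want (e.g. via `llm.with_config(tags=["final"])`) and filter events client-side. A minimal sketch of that filtering logic over mock events (the `"final"` tag is hypothetical, and the mock chunks are plain strings where a real run yields message chunks whose `.content` you would read):

```python
def final_tokens(events):
    """Keep only token chunks from chat-model streams tagged 'final'."""
    tokens = []
    for event in events:
        if event["event"] != "on_chat_model_stream":
            continue
        if "final" not in event.get("tags", []):
            # skip tokens from intermediate / router LLM calls
            continue
        content = event["data"]["chunk"]
        if content:  # empty chunks signal tool-call requests
            tokens.append(content)
    return tokens


# Mock events mimicking the shape astream_events yields (illustrative only).
mock = [
    {"event": "on_chat_model_stream", "tags": ["router"], "data": {"chunk": "hidden"}},
    {"event": "on_chat_model_stream", "tags": ["final"], "data": {"chunk": "Hello"}},
    {"event": "on_tool_start", "tags": [], "data": {}},
    {"event": "on_chat_model_stream", "tags": ["final"], "data": {"chunk": " world"}},
]
print(final_tokens(mock))  # -> ['Hello', ' world']
```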