In a LangGraph app using Azure OpenAI LLMs in multiple nodes, the app still streams all LLMs' tokens despite streaming=False being set on some of them. #278
[X] I added a very descriptive title to this issue.
[X] I searched the LangChain documentation with the integrated search.
[X] I used the GitHub search to find a similar question and didn't find it.
[X] I am sure that this is a bug in LangChain rather than my code.
Example Code
This happened with both streaming options:
```python
from langchain_core.messages import HumanMessage

inputs = [HumanMessage(content="what is the weather in sf")]
async for event in app.astream_events(inputs, version="v1"):
    kind = event["event"]
    if kind == "on_chat_model_stream":
        content = event["data"]["chunk"].content
        if content:
            # Empty content in the context of OpenAI means
            # that the model is asking for a tool to be invoked.
            # So we only print non-empty content
            print(content, end="|")
    elif kind == "on_tool_start":
        print("--")
        print(
            f"Starting tool: {event['name']} with inputs: {event['data'].get('input')}"
        )
    elif kind == "on_tool_end":
        print(f"Done tool: {event['name']}")
        print(f"Tool output was: {event['data'].get('output')}")
        print("--")
```
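Since astream_events surfaces chunks from every chat model run inside the graph, one way to keep only tokens from particular nodes is to filter on the event metadata rather than trying to suppress streaming at the model. The sketch below shows that filtering logic over plain event dicts that mimic the v1 event shape; the `langgraph_node` metadata key and the node names are assumptions for illustration, and real chunks are message objects with a `.content` attribute rather than dicts:

```python
# Sketch: keep on_chat_model_stream tokens only from selected graph nodes.
# Event dicts below mimic astream_events(version="v1") output; the
# "langgraph_node" metadata key and node names are illustrative assumptions.

def tokens_from_nodes(events, allowed_nodes):
    """Yield token strings only from chat-model runs in allowed nodes."""
    for event in events:
        if event["event"] != "on_chat_model_stream":
            continue
        node = event.get("metadata", {}).get("langgraph_node")
        if node in allowed_nodes:
            content = event["data"]["chunk"]["content"]
            if content:  # skip empty chunks (tool-call signals)
                yield content

# Mock events from two nodes; only "agent" tokens should pass through.
events = [
    {"event": "on_chat_model_stream",
     "metadata": {"langgraph_node": "agent"},
     "data": {"chunk": {"content": "Hello"}}},
    {"event": "on_chat_model_stream",
     "metadata": {"langgraph_node": "summarizer"},
     "data": {"chunk": {"content": "ignored"}}},
    {"event": "on_tool_start", "metadata": {}, "data": {}},
]

print(list(tokens_from_nodes(events, {"agent"})))  # -> ['Hello']
```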
```python
inputs = {"messages": [HumanMessage(content="what is the weather in sf?")]}
async for output in app.astream_log(inputs, include_types=["llm"]):
    # astream_log() yields the requested logs (here LLMs) in JSONPatch format
    for op in output.ops:
        if op["path"] == "/streamed_output/-":
            # this is the output from .stream()
            ...
        elif op["path"].startswith("/logs/") and op["path"].endswith(
            "/streamed_output/-"
        ):
            # because we chose to only include LLMs, these are LLM tokens
            print(op["value"])
```
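The astream_log branch above keys off JSONPatch paths. The helper below sketches that same path classification on plain op dicts, independent of LangChain; the op values and log names are made up for illustration:

```python
# Sketch: classify JSONPatch ops the way the astream_log loop above does.
def classify_op(op):
    """Say where a JSONPatch op came from: graph output, an LLM log, or other."""
    path = op["path"]
    if path == "/streamed_output/-":
        return "graph_output"  # same content as .stream() would yield
    if path.startswith("/logs/") and path.endswith("/streamed_output/-"):
        return "llm_token"     # per-run streamed chunks (LLMs here)
    return "other"

# Illustrative ops; values and the "ChatOpenAI" log name are placeholders.
ops = [
    {"op": "add", "path": "/streamed_output/-", "value": "final chunk"},
    {"op": "add", "path": "/logs/ChatOpenAI/streamed_output/-", "value": "tok"},
    {"op": "replace", "path": "/final_output", "value": "done"},
]
print([classify_op(op) for op in ops])
# -> ['graph_output', 'llm_token', 'other']
```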
Error Message and Stack Trace (if applicable)
No response
Description
In a LangGraph app that uses Azure OpenAI LLMs in multiple nodes, the app still streams every LLM's tokens even though streaming=False is set on some of them.
hi, not sure what the question is here: if you call your graph with astream_events, we try to stream output from all things called inside it. were you expecting to call astream_events but not get streaming output?
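As the reply suggests, astream_events streams from everything the graph runs, so the usual pattern is to filter the events you consume rather than disable streaming per model. One approach is to tag a runnable (e.g. with `.with_config(tags=["quiet"])`, a standard LangChain config option) and drop stream events carrying that tag; the tag name here is an assumption. A self-contained sketch of the consumer side:

```python
# Sketch: decide whether to print a stream event, muting runs by tag.
# The "quiet" tag is a made-up convention for this example.

def should_print(event, muted_tags):
    """Drop chat-model stream events whose run carries a muted tag."""
    if event["event"] != "on_chat_model_stream":
        return True  # non-stream events (tool start/end, etc.) pass through
    return not (set(event.get("tags", [])) & muted_tags)

# Mock events: one muted model stream, one unmuted, one tool event.
events = [
    {"event": "on_chat_model_stream", "tags": ["quiet"]},
    {"event": "on_chat_model_stream", "tags": []},
    {"event": "on_tool_start", "tags": ["quiet"]},
]
print([should_print(e, {"quiet"}) for e in events])
# -> [False, True, True]
```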
System Info
python 3.11
langgraph = "^0.0.30"
langchain = "^0.1.4"
langchain-openai = "^0.1.1"