Open · Lorenzo-N opened this issue 1 month ago
To resolve the `NotImplementedError: Stream not supported for function calling agent` error, you need to override the `FunctionCallingAgentWorker.stream_step` method. Here is an example of how you can do this:
```python
from typing import Any

from llama_index.core.agent import FunctionCallingAgentWorker
from llama_index.core.agent.types import Task, TaskStep, TaskStepOutput
from llama_index.core.chat_engine.types import AgentChatResponse


class CustomFunctionCallingAgentWorker(FunctionCallingAgentWorker):
    def stream_step(self, step: TaskStep, task: Task, **kwargs: Any) -> TaskStepOutput:
        """Run step (stream)."""
        # Implement your streaming logic here.
        # Example: just a placeholder implementation.
        response_str = "Streaming response"
        agent_response = AgentChatResponse(
            response=response_str, metadata=task.extra_state
        )
        return TaskStepOutput(
            output=agent_response,
            task_step=step,
            is_last=True,  # or determine based on your logic
            next_steps=[],
        )


# Usage (sketch): construct the worker via from_tools rather than the bare
# constructor, which requires tools, an LLM, and other arguments, e.g.:
#   worker = CustomFunctionCallingAgentWorker.from_tools(tools=tools, llm=llm)
# Then drive it through an AgentRunner and call stream_step with a Task/TaskStep.
```
This code defines a custom worker class that overrides the `stream_step` method with a basic implementation; you can replace the placeholder logic with your actual streaming logic. The `AgentChatResponse` and `TaskStepOutput` classes are used to structure the response and output of the streaming step. Note that the base worker's `stream_step` is synchronous; if you need asynchronous streaming, override `astream_step` in the same way using an `AsyncGenerator`, as indicated in the provided context [1][2][3].
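For genuinely incremental output, the usual pattern behind such a streaming step is to consume an async generator of chunks and accumulate them into the final response. Here is a minimal stdlib-only sketch of that pattern; it is independent of llama_index, and the function names (`fake_llm_stream`, `run_stream_step`) are purely illustrative:

```python
import asyncio
from typing import AsyncGenerator


async def fake_llm_stream() -> AsyncGenerator[str, None]:
    """Stand-in for an LLM token stream (illustrative only)."""
    for chunk in ["Stream", "ing ", "resp", "onse"]:
        await asyncio.sleep(0)  # yield control, as a real network stream would
        yield chunk


async def run_stream_step(stream: AsyncGenerator[str, None]) -> str:
    """Consume the stream chunk by chunk, then return the full response."""
    parts = []
    async for chunk in stream:
        parts.append(chunk)  # here you could forward each chunk to the caller
    return "".join(parts)


result = asyncio.run(run_stream_step(fake_llm_stream()))
print(result)  # -> Streaming response
```

In a real override you would replace `fake_llm_stream` with the chunk stream returned by your LLM and forward each chunk to the consumer before building the final `AgentChatResponse`.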
Question
I have an LLM (specifically from Anthropic) and I want to use the native tools from the API. The documentation suggests using a `FunctionCallingAgentWorker`, but when I run this code, I get the following error:
`NotImplementedError: Stream not supported for function calling agent`
How can I use the agent's stream along with the tools? I noticed that this issue implemented the `stream_chat_with_tools` method on the LLM, so I think streaming is already supported there. How can I resolve this (even by overriding the `FunctionCallingAgentWorker.stream_step` method if necessary)?