run-llama / llama-agents

Not able to run `examples/pipeline_agent_service_tool_local_single.py` #138

Open · lz-chen opened 1 month ago

lz-chen commented 1 month ago

Hi! I was following this example. I use Azure OpenAI, and when I specify the llm in OpenAIAgent to be an AzureOpenAI instance, I keep getting a connection error:

service_name='secret_fact_agent' description='Useful for getting the secret fact.' prompt=[] host=None port=None
service_name='dumb_fact_agent' description='Useful for telling funny jokes.' prompt=[] host=None port=None
INFO:llama_agents.message_queues.simple - Consumer AgentService-bcd2bb4a-d352-4efc-964a-9cae774c0de5: secret_fact_agent has been registered.
INFO:llama_agents.message_queues.simple - Consumer AgentService-269f41dc-f534-4ac9-99f7-66f82d80da84: dumb_fact_agent has been registered.
INFO:llama_agents.message_queues.simple - Consumer 61ebadcf-5874-48ba-a339-08ba79a31495: human has been registered.
INFO:llama_agents.message_queues.simple - Consumer ControlPlaneServer-1b023cb1-94be-41c6-9072-95d494daa28a: control_plane has been registered.
INFO:llama_agents.services.agent - secret_fact_agent launch_local
INFO:llama_agents.services.agent - dumb_fact_agent launch_local
INFO:llama_agents.message_queues.base - Publishing message to 'control_plane' with action 'new_task'
INFO:llama_agents.message_queues.simple - Launching message queue locally
INFO:llama_agents.message_queues.base - Publishing message to 'dumb_fact_agent' with action 'new_task'
INFO:llama_agents.message_queues.simple - Successfully published message 'control_plane' to consumer.
INFO:llama_agents.services.agent - Created new task: 81d11134-05e6-4e0d-a19f-5d416ff89a3c
INFO:llama_agents.message_queues.simple - Successfully published message 'dumb_fact_agent' to consumer.
INFO:llama_agents.message_queues.simple - Consumer ServiceAsTool-620bba80-4724-4b86-befe-418cfb23d5d1: ServiceAsTool-620bba80-4724-4b86-befe-418cfb23d5d1 has been registered.
INFO:llama_agents.message_queues.base - Publishing message to 'secret_fact_agent' with action 'new_tool_call'
INFO:llama_agents.services.agent - Created new tool call as task: 7b5acd27-b5f7-49b5-932a-993a486ac47c
INFO:llama_agents.message_queues.simple - Successfully published message 'secret_fact_agent' to consumer.
Retrying llama_index.llms.openai.base.OpenAI._achat in 0.18510759794022058 seconds as it raised APIConnectionError: Connection error..
Retrying llama_index.llms.openai.base.OpenAI._achat in 0.2218917555453621 seconds as it raised APIConnectionError: Connection error..
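
For reference, that first variant constructed agent2 roughly like this (a sketch; the real deployment name, endpoint, key, and API version come from my config, and agent1_server_tool is the ServiceAsTool shown in my full script further down):

from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.azure_openai import AzureOpenAI

# AzureOpenAI LLM instance (placeholders for my actual config values)
llm_gpt4o = AzureOpenAI(
    azure_deployment="<my-gpt-4o-deployment>",
    azure_endpoint="<my-endpoint>",
    api_key="<my-key>",
    api_version="<my-api-version>",
)

# agent1_server_tool is the ServiceAsTool wrapping the secret_fact_agent service
agent2 = OpenAIAgent.from_tools(
    [agent1_server_tool],
    llm=llm_gpt4o,
    system_prompt="Perform the task, return the result as well as a funny joke.",
)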

And when I changed OpenAIAgent to ReActAgent with the same AzureOpenAI instance, it was not able to find the registered agent as a tool:

service_name='secret_fact_agent' description='Useful for getting the secret fact.' prompt=[] host=None port=None
service_name='dumb_fact_agent' description='Useful for telling funny jokes.' prompt=[] host=None port=None
INFO:llama_agents.message_queues.simple - Consumer AgentService-c6353956-4c98-4544-8c5b-321951f0aebc: secret_fact_agent has been registered.
INFO:llama_agents.message_queues.simple - Consumer AgentService-9803d706-d2ba-407c-b30f-1d100d678d39: dumb_fact_agent has been registered.
INFO:llama_agents.message_queues.simple - Consumer 6296f991-9159-4748-91a5-850fb6a79ec8: human has been registered.
INFO:llama_agents.message_queues.simple - Consumer ControlPlaneServer-ea3b45e6-1896-4162-921d-818922683847: control_plane has been registered.
INFO:llama_agents.services.agent - secret_fact_agent launch_local
INFO:llama_agents.services.agent - dumb_fact_agent launch_local
INFO:llama_agents.message_queues.base - Publishing message to 'control_plane' with action 'new_task'
INFO:llama_agents.message_queues.simple - Launching message queue locally
INFO:llama_agents.message_queues.base - Publishing message to 'dumb_fact_agent' with action 'new_task'
INFO:llama_agents.message_queues.simple - Successfully published message 'control_plane' to consumer.
INFO:llama_agents.services.agent - Created new task: 2551f951-b714-4c76-ac73-86b1d0dbca70
INFO:llama_agents.message_queues.simple - Successfully published message 'dumb_fact_agent' to consumer.
> Running step 172b413e-ff25-4ed2-a510-ab899f5fb899. Step input: What is the secret fact?
Thought: The current language of the user is: English. I need to use a tool to help me answer the question.
Action: secret_fact_agent
Action Input: {'input': 'What is the secret fact?'}
Observation: Error: No such tool named `secret_fact_agent`.
> Running step ed91a870-80ed-4fa3-8ad4-a72dacc3d6b1. Step input: None
Thought: I need to use the correct tool to find the secret fact.
Action: secret_fact_agent
Action Input: {'input': 'What is the secret fact?'}
Observation: Error: No such tool named `secret_fact_agent`.
> Running step f6a8b604-9a8a-4e96-ba71-505e80f392e6. Step input: None
Thought: I need to use the correct tool name to find the secret fact.
Action: secret_fact_agent
Action Input: {'input': 'What is the secret fact?'}
Observation: Error: No such tool named `secret_fact_agent`.
> Running step c216580d-2f1c-4262-bf84-28f7b9085e8a. Step input: None
Thought: I need to use the correct tool name to find the secret fact.
Action: secret_fact_agent
Action Input: {'input': 'What is the secret fact?'}
Observation: Error: No such tool named `secret_fact_agent`.
> Running step 6cc9aa39-7743-46b1-b8e0-b442bc1132b7. Step input: None
Thought: I need to use the correct tool name to find the secret fact.
Action: secret_fact_agent
Action Input: {'input': 'What is the secret fact?'}
Observation: Error: No such tool named `secret_fact_agent`.

Shutting down.

Any suggestions on how I can fix this?

nerdai commented 1 month ago

Interesting, thanks for sharing @lz-chen.

@logan-markewich do you think perhaps AzureOpenAI as an orchestrator is not able to register the agent services as tools?

nerdai commented 1 month ago

@lz-chen: I noticed in the logs that a ToolService wasn't registered. Are you following this example instead of the one you originally linked?
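
For reference, the example I had in mind spins up a separate ToolService, roughly like this (a sketch from memory of that example; exact parameters may differ in 0.0.9):

# sketch: the tool lives in its own ToolService rather than directly inside an agent
tool_service = ToolService(
    message_queue=message_queue,
    tools=[tool],
    running=True,
    step_interval=0.5,
)

With that setup you'd also see a ToolService consumer show up in the registration logs, which I don't see in yours.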

lz-chen commented 1 month ago

Hi @nerdai, indeed, I posted the wrong link; thank you for pointing that out. What I'm experiencing is with this example. Here is my script:

from llama_agents import (
    AgentService,
    ControlPlaneServer,
    SimpleMessageQueue,
    PipelineOrchestrator,
    ServiceComponent,
    LocalLauncher,
)
from llama_agents.tools import ServiceAsTool

from llama_index.core.agent import FunctionCallingAgentWorker
from llama_index.core.tools import FunctionTool
from llama_index.core.query_pipeline import QueryPipeline
from llama_index.llms.azure_openai import AzureOpenAI
from llama_index.llms.openai import OpenAI
from llama_index.agent.openai import OpenAIAgent
from llama_index.core.settings import Settings

from web_scraping_agent.llm.base import llm_gpt4o
from web_scraping_agent.utils.config import settings
from llama_index.core.agent import ReActAgent

Settings.llm = llm_gpt4o

llm_gpt4o = AzureOpenAI(
    azure_deployment=settings.AZURE_OPENAI_GPT4O_MODEL,
    temperature=0.,
    azure_endpoint=settings.AZURE_OPENAI_ENDPOINT,
    api_key=settings.AZURE_OPENAI_API_KEY,
    api_version=settings.AZURE_OPENAI_API_VERSION,
)

# create an agent
def get_the_secret_fact() -> str:
    """Returns the secret fact."""
    return "The secret fact is: A baby llama is called a 'Cria'."

tool = FunctionTool.from_defaults(fn=get_the_secret_fact)

worker1 = FunctionCallingAgentWorker.from_tools([tool], llm=OpenAI())
# worker2 = FunctionCallingAgentWorker.from_tools([], llm=OpenAI())
agent1 = worker1.as_agent()

# create our multi-agent framework components
message_queue = SimpleMessageQueue()

agent1_server = AgentService(
    agent=agent1,
    message_queue=message_queue,
    description="Useful for getting the secret fact.",
    service_name="secret_fact_agent",
)

agent1_server_tool = ServiceAsTool.from_service_definition(
    message_queue=message_queue, service_definition=agent1_server.service_definition
)

# agent2 = OpenAIAgent.from_tools(
#     [agent1_server_tool],
#     system_prompt="Perform the task, return the result as well as a funny joke.",
# )  # worker2.as_agent()

agent2 = ReActAgent.from_tools([agent1_server_tool],
                               llm=llm_gpt4o, verbose=True)

agent2_server = AgentService(
    agent=agent2,
    message_queue=message_queue,
    description="Useful for telling funny jokes.",
    service_name="dumb_fact_agent",
)

print(agent1_server.service_definition)
print(agent2_server.service_definition)
agent2_component = ServiceComponent.from_service_definition(agent2_server.service_definition)

pipeline = QueryPipeline(chain=[agent2_component])

pipeline_orchestrator = PipelineOrchestrator(pipeline)

control_plane = ControlPlaneServer(message_queue, pipeline_orchestrator)

# launch it
launcher = LocalLauncher([agent1_server, agent2_server], control_plane, message_queue)
result = launcher.launch_single("What is the secret fact?")

print(f"Result: {result}")

nerdai commented 1 month ago

Thanks @lz-chen. Do you mind letting me know which version of llama-agents you're running?
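(pip show llama-agents should print it, if that's handy.)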

lz-chen commented 1 month ago

I'm using version 0.0.9 @nerdai

nerdai commented 1 month ago

That's bizarre. The tool name should be secret_fact_agent-as-tool, but for some reason ReActAgent is looking for secret_fact_agent instead 🤔
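
A quick way to confirm what agent2 actually sees is to print the tool's metadata before building the agent (assuming ServiceAsTool exposes the standard llama_index ToolMetadata):

# sanity check on the name the ReActAgent is given
print(agent1_server_tool.metadata.name)  # expected: 'secret_fact_agent-as-tool'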

lz-chen commented 1 month ago

Hmm, shouldn't the ReActAgent be looking for secret_fact_agent, since that's the "tool" that agent1 is exposed as? Also, the log says: INFO:llama_agents.message_queues.simple - Consumer AgentService-c6353956-4c98-4544-8c5b-321951f0aebc: secret_fact_agent has been registered.

nerdai commented 1 month ago

Yeah, so in this example, what we're doing is actually wrapping agent1 (more specifically, its service) as a tool so that agent2 can call it directly.

# here we make agent1's service a tool (the tool is now named f"{service.name}-as-tool", thus 'secret_fact_agent-as-tool')
agent1_server_tool = ServiceAsTool.from_service_definition(
    message_queue=message_queue, service_definition=agent1_server.service_definition
)

# here we give the tool to agent2
agent2 = ReActAgent.from_tools([agent1_server_tool],
                               llm=llm_gpt4o, verbose=True)

# here we establish a pipeline that consists of only the agent2 component,
# meaning the only way agent1's service gets called is via a tool call from agent2
pipeline = QueryPipeline(chain=[agent2_component])

nerdai commented 1 month ago

I'll try to replicate the bug you're experiencing on my end tomorrow (likely in the evening). Will let you know how it goes.

lz-chen commented 1 month ago

Thank you! @nerdai