run-llama / llama-agents

MIT License

[BUG] Error in executing `examples/pipeline_human_local_single.py` #101

Closed nerdai closed 2 days ago

nerdai commented 1 week ago

When trying to run the script examples/pipeline_human_local_single.py, I get this error:

  File "/Users/nerdai/Projects/llama-agents/llama_agents/orchestrators/pipeline.py", line 183, in get_next_messages
    queue_message = get_service_component_message(
  File "/Users/nerdai/Projects/llama-agents/llama_agents/orchestrators/pipeline.py", line 34, in get_service_component_message
    if module.module_type == ModuleType.AGENT:
AttributeError: 'RouterComponent' object has no attribute 'module_type'
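The failure mode is a plain duck-typing mismatch: the orchestrator reads `module_type` off whatever component it receives, and only service components define that attribute. A minimal, self-contained sketch of the problem and a defensive access pattern (the class bodies here are illustrative stand-ins, not the actual llama_agents definitions):

```python
# Hypothetical stand-ins for the two component kinds (not the real classes).
class ServiceComponent:
    module_type = "AGENT"  # service components carry a module_type attribute


class RouterComponent:
    pass  # routers do not define module_type


def get_module_type(component):
    # getattr with a default returns None instead of raising
    # AttributeError for components that lack the attribute.
    return getattr(component, "module_type", None)


print(get_module_type(ServiceComponent()))  # AGENT
print(get_module_type(RouterComponent()))   # None
```

Accessing `component.module_type` directly on the router, as `get_service_component_message` does, raises the `AttributeError` shown in the traceback.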
peteryxu commented 6 days ago

ServiceComponent has the attribute `module_type`, while RouterComponent does not, since it is not a ServiceComponent.

In the current examples/pipeline_human_local_single.py file, a RouterComponent is passed in when creating the QueryPipeline, instead of ServiceComponents.

I did some testing: I commented out the RouterComponent and passed the two service components into the chain directly, and it works in my local environment and setup.

    ####################################
    pipeline = QueryPipeline(
        chain=[
            agent_component,
            human_component,
            # RouterComponent(
            #     selector=PydanticSingleSelector.from_defaults(llm=OpenAI()),
            #     choices=[agent_service.description, human_service.description],
            #     components=[agent_component, human_component],
            # )
        ]
    )

peteryxu commented 4 days ago

Tested the code above, and it worked as seen below. Created PR #112 to merge the fix.

[screenshot of the successful run]