run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: LLMCompiler fails with `Tool {tool_name} not found` value error #13484

Open shaon-chowdhury opened 1 month ago

shaon-chowdhury commented 1 month ago

Bug Description

Tool names cannot be found when agent.chat is called even though the plan is created.

Version

0.10.34

Steps to Reproduce

  1. Use any tool (using DuckDuckGo search as an example):

    tools = DuckDuckGoSearchToolSpec().to_tool_list()

  2. Create LLM model and agent

    llm = Ollama(model="phi3", request_timeout=30.0, temperature=0.9)
    callback_manager = llm.callback_manager
    agent_worker = LLMCompilerAgentWorker.from_tools(
        tools, llm=llm, verbose=True, memory=chat_memory, callback_manager=callback_manager
    )
    agent = AgentRunner(agent_worker, callback_manager=callback_manager)

Relevant Logs/Tracebacks

File "/Users/user/Library/Python/3.9/lib/python/site-packages/llama_index/agent/llm_compiler/utils.py", line 85, in instantiate_new_step
    tool = _find_tool(tool_name, tools)
  File "/Users/user/Library/Python/3.9/lib/python/site-packages/llama_index/agent/llm_compiler/utils.py", line 55, in _find_tool
    raise ValueError(f"Tool {tool_name} not found.")
ValueError: Tool search not found.
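
For context, the failure mode in the traceback is a strict exact-match lookup: the planner emitted the name `search`, which matches no registered tool. The sketch below illustrates that behavior; `FakeTool` and `find_tool` are illustrative stand-ins reconstructed from the traceback, not the library's actual code.

```python
from typing import Sequence


class FakeTool:
    """Minimal stand-in for a LlamaIndex tool exposing metadata.name."""

    def __init__(self, name: str):
        self.metadata = type("Meta", (), {"name": name})()


def find_tool(tool_name: str, tools: Sequence[FakeTool]) -> FakeTool:
    # Exact-match lookup: any name the model invents (e.g. "search")
    # falls straight through to the ValueError seen in the traceback.
    for tool in tools:
        if tool.metadata.name == tool_name:
            return tool
    raise ValueError(f"Tool {tool_name} not found.")


tools = [FakeTool("duckduckgo_instant_search"), FakeTool("duckduckgo_full_search")]
print(find_tool("duckduckgo_full_search", tools).metadata.name)
```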
dosubot[bot] commented 1 month ago

It looks like the issue you're encountering, ValueError: Tool search not found., is because the LLMCompiler cannot locate a tool named search in the provided tools sequence. To resolve this, ensure the following:

  1. Correctly import the DuckDuckGoSearchToolSpec from the llama_index.tools.duckduckgo package:

    from llama_index.tools.duckduckgo import DuckDuckGoSearchToolSpec

  2. Verify that the DuckDuckGoSearchToolSpec is correctly instantiated and included in your tools sequence.

  3. Check that the tool's metadata.name matches the tool_name the planner emits (here it predicted "search").

  4. Confirm the tools are correctly passed to the instantiate_new_step function, or wherever the LLMCompiler expects to find them.

Additionally, ensure the duckduckgo_search package is installed in your environment:

pip install duckduckgo_search

For further details, you might want to look into the LlamaIndex repository, specifically the files related to the LLMCompiler utilities and the DuckDuckGoSearchToolSpec.

If you've followed these steps and the issue persists, there might be a missing step or configuration in your setup based on the LlamaIndex documentation.
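
Before calling agent.chat, it can also help to check which names are actually registered, since the planner can only resolve those. A dependency-free sketch (the namedtuple stand-ins and missing_tool_names are illustrative helpers, not LlamaIndex APIs):

```python
from collections import namedtuple

# Minimal stand-ins for a tool object exposing metadata.name.
Meta = namedtuple("Meta", "name")
Tool = namedtuple("Tool", "metadata")


def missing_tool_names(planned, tools):
    """Return the names a plan references that no registered tool provides."""
    registered = {t.metadata.name for t in tools}
    return [n for n in planned if n not in registered]


tools = [Tool(Meta("duckduckgo_instant_search")), Tool(Meta("duckduckgo_full_search"))]
print(missing_tool_names(["search", "duckduckgo_full_search"], tools))  # ['search']
```

Any name this check reports would make the agent raise the ValueError above at execution time.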

#### Details

**Sources**

- [[Bug]: ValueError: Tool with name python not found](https://github.com/run-llama/llama_index/issues/12479)
- [llama-index-integrations/agent/llama-index-agent-llm-compiler/llama_index/agent/llm_compiler/utils.py](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/agent/llama-index-agent-llm-compiler/llama_index/agent/llm_compiler/utils.py)
- [llama-index-integrations/tools/llama-index-tools-duckduckgo/llama_index/tools/duckduckgo/__init__.py](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-duckduckgo/llama_index/tools/duckduckgo/__init__.py)
- [llama-index-integrations/tools/llama-index-tools-duckduckgo/llama_index/tools/duckduckgo/base.py](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-duckduckgo/llama_index/tools/duckduckgo/base.py)
- [llama-index-integrations/tools/llama-index-tools-duckduckgo/README.md](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-duckduckgo/README.md)
- [llama-index-integrations/tools/llama-index-tools-duckduckgo/examples/duckduckgo_search.ipynb](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-duckduckgo/examples/duckduckgo_search.ipynb)

logan-markewich commented 1 month ago

My guess is that the LLM is predicting a tool name that doesn't exist. I'm not sure what your tool names/descriptions look like, but try to make them unique and descriptive.
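
Expanding on that naming advice: FunctionTool.from_defaults falls back to the function's `__name__` and docstring when no explicit name/description is given, so a vaguely named function yields a vaguely named tool. A dependency-free sketch of that fallback behavior (SimpleTool, ToolMetadata, and make_tool are illustrative stand-ins, not the actual LlamaIndex classes):

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class ToolMetadata:
    name: str
    description: str


@dataclass
class SimpleTool:
    fn: Callable
    metadata: ToolMetadata


def make_tool(fn, name=None, description=None):
    """Mimics the FunctionTool.from_defaults fallback: use the function's
    own name and docstring when no explicit metadata is supplied."""
    return SimpleTool(
        fn=fn,
        metadata=ToolMetadata(
            name=name or fn.__name__,
            description=description or (fn.__doc__ or ""),
        ),
    )


def web_search(query: str) -> str:
    """Search the web and return the top result snippet."""
    return f"results for {query}"


# A vague name invites the planner to invent variants like "search";
# an explicit, descriptive name gives it one unambiguous target.
tool = make_tool(
    web_search,
    name="duckduckgo_full_search",
    description="Full DuckDuckGo web search; input: a query string.",
)
print(tool.metadata.name)  # duckduckgo_full_search
```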

shaon-chowdhury commented 1 month ago

For the example I provided, the tool names are ['duckduckgo_instant_search', 'duckduckgo_full_search'] (found via [tool.metadata.name for tool in tools]). Strangely, the error is still ValueError: Tool search not found.. I came up with a small hotfix in the _find_tool function that checks name equality and also whether the tool_name appears within the names of the registered tools. OK if I send up a PR?
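
That hotfix could look roughly like this (an illustrative sketch of the idea, not the code from the PR): try an exact match first, then fall back to an unambiguous substring match.

```python
from collections import namedtuple

Meta = namedtuple("Meta", "name")
Tool = namedtuple("Tool", "metadata")


def find_tool_relaxed(tool_name, tools):
    """Exact match first; then accept a unique substring match, so a
    predicted name like 'search' can resolve to 'duckduckgo_full_search'."""
    for tool in tools:
        if tool.metadata.name == tool_name:
            return tool
    candidates = [t for t in tools if tool_name in t.metadata.name]
    if len(candidates) == 1:
        return candidates[0]
    raise ValueError(f"Tool {tool_name} not found.")


tools = [Tool(Meta("duckduckgo_full_search")), Tool(Meta("multiply"))]
print(find_tool_relaxed("search", tools).metadata.name)  # duckduckgo_full_search
```

One caveat: with both duckduckgo tools registered, "search" is a substring of two names, so this sketch refuses the ambiguous case rather than guessing.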

HejiaZ2023 commented 2 weeks ago

I'm using the version with the PR above merged (0.10.48.post1), and the same error still exists. I set up the tools following the official cookbook:

from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer"""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)

def add(a: int, b: int) -> int:
    """Add two integers and return the result integer"""
    return a + b

add_tool = FunctionTool.from_defaults(fn=add)

tools = [multiply_tool, add_tool]

But my agent still tried to "search" something:

> Plan: 1. search(3)
> 2. multiply(121, $1)
> 3. add($2, 42)
> 4. join()<END_OF_PLAN>

assistant: Plan accepted. Processing...

After executing the plan, the result is: 363

Your plan has been executed and the answer is 363.

And it failed, unsurprisingly, with the same error: ValueError: Tool search not found.

logan-markewich commented 2 weeks ago

@HejiaZ2023 you probably need to update the agent integration, assuming you are using an OpenAI agent:

pip install -U llama-index-agent-openai

logan-markewich commented 2 weeks ago

There are even unit tests for this.

HejiaZ2023 commented 2 weeks ago

Hi @logan-markewich I think both OP and I are using local LLM inference, not openai (OP mentioning Ollama, I'm using vllm) (And I did check my llama-index-agent-openai version is latest, 0.2.7) My best guess would be the PR fixed an agent-shared get_function_by_name, but llm_compiler agent is using its own _find_tool in llama_index/agent/llm_compiler/utils.py? (which is where it raised an Exception for both OP and me)