shaon-chowdhury opened 1 month ago
It looks like the issue you're encountering, `ValueError: Tool search not found.`, is due to the LLMCompiler not being able to locate the `DuckDuckGoSearchToolSpec` in the provided tools sequence. To resolve this, ensure the following:

1. Import `DuckDuckGoSearchToolSpec` from the `llama_index.tools.duckduckgo` package: `from llama_index.tools.duckduckgo import DuckDuckGoSearchToolSpec`
2. Verify that the `DuckDuckGoSearchToolSpec` is correctly instantiated and included in your `tools` sequence.
3. Check that the tool's `metadata.name` matches the `tool_name` you are using, which should likely be `"search"`.
4. Confirm the tool is correctly passed to the `instantiate_new_step` function or wherever the LLMCompiler expects to find it.

Additionally, ensure the `duckduckgo_search` package is installed in your environment: `pip install duckduckgo_search`
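To make the name check in step 3 concrete, here is a self-contained sketch. The `ToolMetadata`/`Tool` dataclasses below are stand-ins for the real LlamaIndex tool classes (which live in `llama_index.core.tools`), used only to show how a planner-emitted name like `"search"` fails to match the registered names:

```python
from dataclasses import dataclass

# Stand-ins for LlamaIndex tool objects; illustrative only.
@dataclass
class ToolMetadata:
    name: str

@dataclass
class Tool:
    metadata: ToolMetadata

tools = [
    Tool(ToolMetadata("duckduckgo_instant_search")),
    Tool(ToolMetadata("duckduckgo_full_search")),
]

# List the registered names and compare against the name the planner emits.
registered = [t.metadata.name for t in tools]
print(registered)
print("search" in registered)  # False: the planner's "search" matches nothing
```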
For further details, you might want to look into the LlamaIndex repository, specifically the files related to the LLMCompiler utilities and the DuckDuckGoSearchToolSpec.
If you've followed these steps and the issue persists, there might be a missing step or configuration in your setup based on the LlamaIndex documentation.
To continue the conversation, mention @dosu.
My guess is the LLM is predicting a tool name that doesn't exist. I'm not sure what your tool names/descriptions look like, but try to make them unique and descriptive.
For the example I provided, the tool names are `['duckduckgo_instant_search', 'duckduckgo_full_search']` (can be found using `[tool.metadata.name for tool in tools]`). Strangely, though, the error is `ValueError: Tool search not found.`. I came up with a small hotfix where I check equality of the name and also whether the `tool_name` is in the names of the tools in the `_find_tools` function. Ok if I send up a PR?
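The hotfix described above looks roughly like this. This is a sketch, not the actual `_find_tool` implementation from `llama_index/agent/llm_compiler/utils.py`; the `find_tool` helper and the `SimpleNamespace` tool stand-ins are simplified for illustration:

```python
from types import SimpleNamespace

def find_tool(tool_name, tools):
    """Sketch of the patched lookup: exact match first, then substring match."""
    for tool in tools:
        name = tool.metadata.name
        # Accept an exact match, or a predicted tool_name that appears
        # inside a registered name (e.g. "search" in "duckduckgo_full_search").
        if name == tool_name or tool_name in name:
            return tool
    raise ValueError(f"Tool {tool_name} not found.")

# Stand-in tools exposing only the metadata.name attribute the lookup uses.
tools = [
    SimpleNamespace(metadata=SimpleNamespace(name="duckduckgo_instant_search")),
    SimpleNamespace(metadata=SimpleNamespace(name="duckduckgo_full_search")),
]

print(find_tool("search", tools).metadata.name)  # duckduckgo_instant_search
```

The substring check is a loose heuristic; a cleaner long-term fix is making the tool names themselves match what the planner emits.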
I'm using the version with the PR above merged (0.10.48.post1), and the same error still exists. I set up the tools following the official cookbook:
```python
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result integer"""
    return a * b

multiply_tool = FunctionTool.from_defaults(fn=multiply)

def add(a: int, b: int) -> int:
    """Add two integers and return the result integer"""
    return a + b

add_tool = FunctionTool.from_defaults(fn=add)

tools = [multiply_tool, add_tool]
```
But my agent still tried to "search" something:

> Plan: 1. search(3)
> 2. multiply(121, $1)
> 3. add($2, 42)
> 4. join()<END_OF_PLAN>
> assistant: Plan accepted. Processing...
> After executing the plan, the result is: 363
> Your plan has been executed and the answer is 363.

And it failed, unsurprisingly, with the same error: `ValueError: Tool search not found.`
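To illustrate where the error comes from, here is a toy executor for LLMCompiler-style plans (purely illustrative, not the real LlamaIndex implementation): each `$N` placeholder is replaced by the result of step N, and a tool name missing from the registry raises the same `ValueError` before anything runs.

```python
def run_plan(steps, tools):
    """Toy executor: steps are (tool_name, args) pairs; "$N" means the result of step N."""
    results = {}
    for i, (name, args) in enumerate(steps, start=1):
        if name not in tools:
            # The failure mode in this thread: the LLM planned a "search"
            # step, but no tool by that name was registered.
            raise ValueError(f"Tool {name} not found.")
        resolved = [
            results[int(a[1:])] if isinstance(a, str) and a.startswith("$") else a
            for a in args
        ]
        results[i] = tools[name](*resolved)
    return results[len(steps)]

tools = {"multiply": lambda a, b: a * b, "add": lambda a, b: a + b}

# A plan using only registered tools runs fine:
print(run_plan([("multiply", (121, 3)), ("add", ("$1", 42))], tools))  # 405

# A plan that begins with search(3) fails immediately:
# run_plan([("search", (3,))], tools)  ->  ValueError: Tool search not found.
```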
@HejiaZ2023 you probably need to update the agent integration? Assuming you are using an OpenAI agent: `pip install -U llama-index-agent-openai`. There are even unit tests for this.
Hi @logan-markewich, I think both OP and I are using local LLM inference, not OpenAI (OP mentioned Ollama; I'm using vLLM). (And I did check that my llama-index-agent-openai version is the latest, 0.2.7.) My best guess would be that the PR fixed the agent-shared `get_function_by_name`, but the llm_compiler agent is using its own `_find_tool` in `llama_index/agent/llm_compiler/utils.py`? (which is where it raised an exception for both OP and me)
Bug Description
Tool names cannot be found when `agent.chat` is called, even though the plan is created.

Version
0.10.34

Steps to Reproduce
Use any tool (using DuckDuckGo search as an example): `tools = DuckDuckGoSearchToolSpec().to_tool_list()`
Create LLM model and agent

Relevant Logs/Tracebacks