run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai

[Bug]: WolframAlphaToolSpec broken #15578

Open · mohammad-yousuf opened this issue 3 months ago

mohammad-yousuf commented 3 months ago

Bug Description

Last night, WolframAlphaToolSpec stopped working: wrapping its tools with crewai's LlamaIndexTool.from_tool now raises a pydantic ValidationError (traceback below).

Version

llama-index-core: 0.11.0, llama-index-readers-file: 0.2.0, llama-index-tools-wolfram-alpha: 0.2.0, crewai: 0.51.1, crewai-tools: 0.8.3

Steps to Reproduce

from crewai_tools import LlamaIndexTool
from llama_index.tools.wolfram_alpha import WolframAlphaToolSpec

wolfram_spec = WolframAlphaToolSpec(app_id="E7LXP9-V748VQUY94")
wolfram_tools = wolfram_spec.to_tool_list()

crewai_wolfram_tools = [LlamaIndexTool.from_tool(t) for t in wolfram_tools]

Relevant Logs/Tracebacks

/usr/local/lib/python3.10/dist-packages/pydantic/_internal/_config.py:341: UserWarning: Valid config keys have changed in V2:
* 'allow_population_by_field_name' has been renamed to 'populate_by_name'
* 'smart_union' has been removed
  warnings.warn(message, UserWarning)
---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
<ipython-input-5-c7c2fcdd6516> in <cell line: 26>()
     24 wolfram_tools = wolfram_spec.to_tool_list()
     25 
---> 26 crewai_wolfram_tools = [LlamaIndexTool.from_tool(t) for t in wolfram_tools]

2 frames
/usr/local/lib/python3.10/dist-packages/pydantic/main.py in __init__(self, **data)
    191         # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    192         __tracebackhide__ = True
--> 193         self.__pydantic_validator__.validate_python(data, self_instance=self)
    194 
    195     # The following line sets a flag that we use to determine when `__init__` gets overridden by the user

ValidationError: 1 validation error for LlamaIndexTool
args_schema
  Input should be a subclass of BaseModel [type=is_subclass_of, input_value=<class 'llama_index.core....ls.wolfram_alpha_query'>, input_type=ModelMetaclass]
    For further information visit https://errors.pydantic.dev/2.8/v/is_subclass_of
logan-markewich commented 3 months ago

LlamaIndex updated to pydantic v2

Looks like crewai is using pydantic v1

Pydantic v2 has been out for over a year now; hopefully crewai has the migration on its roadmap. :)

It seems you can't use v0.11.0 with crewai until they move off of pydantic v1.
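
A quick way to see the mismatch behind the traceback above is to inspect the schema attached to the generated tools. This is a minimal diagnostic sketch, not part of either library's documented API: it assumes the tool's args schema is exposed as metadata.fn_schema (as llama-index FunctionTool does at the time of writing), that pydantic v2's bundled pydantic.v1 compatibility namespace is available, and that the app id is a placeholder.

import pydantic      # pydantic v2, which llama-index-core >= 0.11.0 targets
import pydantic.v1   # the v1 compatibility namespace shipped inside pydantic v2

from llama_index.tools.wolfram_alpha import WolframAlphaToolSpec

wolfram_spec = WolframAlphaToolSpec(app_id="YOUR-APP-ID")  # placeholder, not a real key
schema = wolfram_spec.to_tool_list()[0].metadata.fn_schema

# The is_subclass_of error above suggests crewai's LlamaIndexTool validates
# args_schema against pydantic v1's BaseModel, while this schema is a v2 model.
print(issubclass(schema, pydantic.BaseModel))      # expected: True  (pydantic v2 model)
print(issubclass(schema, pydantic.v1.BaseModel))   # expected: False (not a v1 model)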

logan-markewich commented 3 months ago

You'll have to downgrade

pip install -U "llama-index-core<0.11.0" "llama-index-readers-file<0.2.0" "llama-index-tools-wolfram-alpha<0.2.0"
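
After the downgrade, a quick sanity check can confirm the pins took effect before retrying the crewai wrapper. This is just an illustrative sketch using the standard library's importlib.metadata against the three distributions pinned in the command above:

from importlib.metadata import version

# Expect llama-index-core < 0.11.0 and the other two < 0.2.0 after the downgrade.
for dist in ("llama-index-core", "llama-index-readers-file", "llama-index-tools-wolfram-alpha"):
    print(dist, version(dist))
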
mohammad-yousuf commented 3 months ago

> You'll have to downgrade
>
> pip install -U "llama-index-core<0.11.0" "llama-index-readers-file<0.2.0" "llama-index-tools-wolfram-alpha<0.2.0"

Thanks so much!

dosubot[bot] commented 6 hours ago

Hi, @mohammad-yousuf. I'm Dosu, and I'm helping the LlamaIndex team manage their backlog. I'm marking this issue as stale.

Issue Summary:

WolframAlphaToolSpec tools could no longer be wrapped with crewai's LlamaIndexTool after llama-index-core v0.11.0 moved to pydantic v2 while crewai still relies on pydantic v1; wrapping fails with a ValidationError on args_schema. The suggested workaround was to downgrade to llama-index-core<0.11.0 (with matching llama-index-readers-file and llama-index-tools-wolfram-alpha versions), which resolved the problem for the reporter.

Next Steps:

Please confirm whether this issue is still relevant on the latest version of LlamaIndex. If it is, you can keep the discussion open by commenting here; otherwise, it will be closed automatically.

Thank you for your understanding and contribution!