run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: Using react agent with Vertex is throwing error #11979

Closed by fahadsuhaib 2 weeks ago

fahadsuhaib commented 5 months ago

Bug Description

I followed the tutorial to create a ReAct agent:

https://docs.llamaindex.ai/en/stable/examples/agent/react_agent_with_query_engine.html

After the index initializes successfully, it fails when running the agent code:

response = agent.chat("What was Lyft's revenue growth in 2021?")
print(str(response))

The error is:

AttributeError: 'Vertex' object has no attribute '_chat_client'

Version

llama-index-vertex (0.1.5)

Steps to Reproduce

Follow the tutorial to create a ReAct agent:

https://docs.llamaindex.ai/en/stable/examples/agent/react_agent_with_query_engine.html

After the index initializes successfully, it fails when running the agent code:

response = agent.chat("What was Lyft's revenue growth in 2021?")
print(str(response))

Relevant Logs/Tracebacks

The error is:

AttributeError: 'Vertex' object has no attribute '_chat_client'
dosubot[bot] commented 5 months ago

Hey @fahadsuhaib, long time no chat! Hope you're doing well amidst all the code and chaos. 😄

Thank you for providing the detailed information. The error message you're seeing, AttributeError: 'Vertex' object has no attribute '_chat_client', suggests that the Vertex object is trying to access an attribute _chat_client that doesn't exist.

The Vertex class in the LlamaIndex Vertex version 0.1.5 does have a _chat_client attribute, which is initialized based on the model type provided to the Vertex class. If the model is identified as a chat model, a ChatModel or CodeChatModel instance from vertexai.language_models is assigned to _chat_client. If the model is a Gemini model, which supports both chat and code generation, the _chat_client is assigned a client created by create_gemini_client function.

The _chat_client is used in methods like chat, stream_chat, and achat to generate chat responses. These methods involve preparing the input from chat messages, calling the appropriate model through the _chat_client, and processing the model's output to create a ChatResponse.

Given this, the error you're encountering could be due to one of the following reasons:

  1. The model provided to the Vertex class is not a chat model or a Gemini model. In this case, the _chat_client attribute would not be initialized, leading to the AttributeError when the chat method is called. Please ensure that the model you're using supports chat interactions.

  2. There might be an issue with the initialization of the Vertex object in your code. Please ensure that the Vertex object is being correctly initialized with the necessary parameters, including the model type.

  3. There might be a version mismatch issue. The LlamaIndex Vertex version you're using is 0.1.5, but the latest version of LlamaIndex is v0.10.20. If possible, try updating to the latest version of LlamaIndex and see if the issue persists.
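To illustrate the first point, here is a rough, simplified sketch (not the actual LlamaIndex source) of how a Vertex-style wrapper typically decides whether to create a chat client; the model names below are examples of Vertex AI models, and the returned value is a placeholder:

```python
# Rough illustration (NOT the actual library code) of how a Vertex-style
# wrapper decides whether to set up a chat client for a given model.
CHAT_MODELS = {"chat-bison", "codechat-bison"}  # chat models in vertexai.language_models
GEMINI_MODELS = {"gemini-pro"}                  # Gemini models also support chat

def resolve_chat_client(model: str):
    """Return a placeholder chat client, or None for text-only models."""
    if model in CHAT_MODELS or model in GEMINI_MODELS:
        return f"<chat client for {model}>"
    # Text-only models (e.g. "text-bison") get no chat client, so a later
    # call to .chat() would fail with an AttributeError like the one above.
    return None

print(resolve_chat_client("chat-bison"))  # a chat client is created
print(resolve_chat_client("text-bison"))  # None: chat is unsupported
```

If the model you passed falls into the "no chat client" branch, calling `agent.chat(...)` is exactly where the `AttributeError` would surface.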

If none of the above solutions work, could you please provide more details about the model you're using and the exact code where the Vertex object is initialized? This would help in identifying the exact cause of the issue.

logan-markewich commented 5 months ago

The Vertex model you are using does not provide a chat interface.

This could probably be handled more gracefully in the code. But for now, I would just pick a model that has a chat interface: https://github.com/run-llama/llama_index/blob/434de440b6302aa0bee108d0caf71b6522e01eb7/llama-index-integrations/llms/llama-index-llms-vertex/llama_index/llms/vertex/base.py#L81
Probably this could be better handled in the code. But also, I would just pick a model that has a chat interface https://github.com/run-llama/llama_index/blob/434de440b6302aa0bee108d0caf71b6522e01eb7/llama-index-integrations/llms/llama-index-llms-vertex/llama_index/llms/vertex/base.py#L81