Open kumar19a opened 3 months ago
Question

I am getting the following error `AttributeError: 'HuggingFaceLLM' object has no attribute 'model'. Did you mean: '_model'?` while passing a HF LLM to `FunctionCallingAgentWorker`.
Hey @kumar19a! 👋 I'm here to assist you with bugs, questions, and becoming a contributor. While you're waiting for a human maintainer, I'll help you out.
The error `AttributeError: 'HuggingFaceLLM' object has no attribute 'model'. Did you mean: '_model'?` occurs because the `HuggingFaceLLM` class uses a private attribute `_model` to store the model instance, not `model`.

To fix the error, you should access the `_model` attribute instead of `model` when passing the `HuggingFaceLLM` instance to `FunctionCallingAgentWorker` or any other function that requires the model attribute [1][2][3].
To continue talking to Dosu, mention @dosu.
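For context on why `model` is missing while `_model` exists: llama-index LLMs are pydantic objects, and `HuggingFaceLLM` declares `_model` as a pydantic private attribute. A minimal sketch of that pattern (the class name and defaults here are illustrative stand-ins, not the library's actual definition):

```python
from typing import Any
from pydantic import BaseModel, PrivateAttr

class HFStyleLLM(BaseModel):
    """Illustrative stand-in for HuggingFaceLLM's attribute layout."""
    model_name: str = "meta-llama/Meta-Llama-3.1-8B-Instruct"  # public field
    _model: Any = PrivateAttr(default=None)  # private: set internally, not a field

llm = HFStyleLLM()
print(llm._model)  # works: the private attribute exists on the instance
print(llm.model)   # raises AttributeError: no public attribute 'model'
```

Private attributes are reachable as `obj._model` but there is no `obj.model`, which is exactly the shape of the reported error.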
@dosu But `FunctionCallingAgentWorker` internally uses `.model` rather than `._model` for type checking in llama-index.
`FunctionCallingAgentWorker` uses `.model` instead of `._model` for type checking because it accesses the `model` attribute of the `FunctionCallingLLM` instance, which is a public attribute. This is evident from the following code snippet:
```python
if not llm.metadata.is_function_calling_model:
    raise ValueError(
        f"Model name {llm.model} does not support function calling API. "
    )
```
Here, `llm.model` is used to retrieve the model name for the error message, indicating that `model` is a public attribute of the `FunctionCallingLLM` class. Additionally, the `MockFunctionCallingLLM` class in the provided test file defines `model` as a public attribute using pydantic's `Field`:
```python
class MockFunctionCallingLLM(FunctionCallingLLM):
    model: str = Field(default=MOCK_MODEL_NAME)
    ...
```
This further confirms that `model` is intended to be accessed publicly, justifying the use of `.model` for type checking [1][2].
To continue talking to Dosu, mention @dosu.
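Note that the quoted check assumes every `FunctionCallingLLM` exposes a public `model` field, while `HuggingFaceLLM` exposes `model_name` instead, so building the error message itself fails. A hypothetical, more defensive variant of that check (this is a sketch, not llama-index's actual code):

```python
# Fall back to model_name when no public .model attribute exists.
model_id = getattr(llm, "model", None) or getattr(llm, "model_name", "unknown")
if not llm.metadata.is_function_calling_model:
    raise ValueError(
        f"Model name {model_id} does not support function calling API."
    )
```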
@dosu Then why is it giving the following error: `AttributeError: 'HuggingFaceLLM' object has no attribute 'model'. Did you mean: '_model'?`
The error occurs because the `FunctionCallingAgentWorker` class is trying to access the `model` attribute of the `HuggingFaceLLM` object, but the correct attribute name is `_model`. The `HuggingFaceLLM` class defines `_model` as a private attribute, which is why accessing `model` results in an `AttributeError`.
To fix this, you need to modify the `FunctionCallingAgentWorker` class to access the `_model` attribute instead of `model`. Here is the corrected code snippet:
```python
if not llm.metadata.is_function_calling_model:
    raise ValueError(
        f"Model name {llm._model} does not support function calling API. "
    )
```
This change should resolve the `AttributeError` [1][2][3][4][5].
To continue talking to Dosu, mention @dosu.
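Rather than patching library internals, one can check up front whether the LLM advertises function-calling support via the public `metadata` API that the worker consults. A minimal sketch:

```python
from llama_index.llms.huggingface import HuggingFaceLLM

llm_path = "meta-llama/Meta-Llama-3.1-8B-Instruct"
# Note: instantiating HuggingFaceLLM loads the model weights.
llm_hf = HuggingFaceLLM(model_name=llm_path, tokenizer_name=llm_path)

# LLMMetadata.is_function_calling_model is the flag the worker checks.
if not llm_hf.metadata.is_function_calling_model:
    print("This LLM does not advertise function-calling support; "
          "FunctionCallingAgentWorker will reject it.")
```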
Can you give some code to reproduce the error? Or are you multithreading something?
```python
from llama_index.llms.huggingface import HuggingFaceLLM
from llama_index.core.agent import FunctionCallingAgentWorker

llm_path = "meta-llama/Meta-Llama-3.1-8B-Instruct"

llm_hf = HuggingFaceLLM(
    model_name=llm_path,
    tokenizer_name=llm_path,
    device_map="auto",
    generate_kwargs={"temperature": 0.2, "do_sample": True},
)

# tools and system_prompt are defined elsewhere in the workflow
agent_worker = FunctionCallingAgentWorker.from_tools(
    tools=tools,
    llm=llm_hf,
    allow_parallel_tool_calls=False,
    system_prompt=system_prompt,
)
ctx.data["orchestrator"] = agent_worker.as_agent()

# ctx and ev come from the surrounding workflow step
orchestrator = ctx.data["orchestrator"]
response = str(orchestrator.chat(ev.request))
```
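The snippet references `tools`, `system_prompt`, `ctx`, and `ev`, which come from the surrounding workflow. A hypothetical minimal stand-in for the first two, so the `from_tools(...)` call can be exercised in isolation (the `add` tool and prompt text are invented for illustration):

```python
from llama_index.core.tools import FunctionTool

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

tools = [FunctionTool.from_defaults(fn=add)]  # any plain callable works as a tool
system_prompt = "You are an orchestrator that routes requests to tools."
```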