The error you're encountering, `ValueError: "OpenLLM" object has no field "_llm"`, is likely due to a typo or an inconsistency in attribute naming within your `OpenLLM` class, or in how you access the `_llm` attribute on an `OpenLLM` instance.
In the provided code snippet, the `OpenLLM` class correctly initializes an attribute named `_llm` in its `__init__` method:
```python
self._llm = openllm.LLM[Any, Any](
    model_id,
    model_version=model_version,
    model_tag=model_tag,
    prompt_template=prompt_template,
    system_message=system_prompt,
    backend=backend,
    quantize=quantize,
    serialisation=serialization,
    trust_remote_code=trust_remote_code,
    embedded=True,
    **attrs,
)
```
This means the `OpenLLM` object does indeed have a field named `_llm`, and it is initialized as an instance of `openllm.LLM`.
If you're encountering this error, it could be due to one of the following reasons:
1. **Access Before Initialization**: You might be accessing the `_llm` attribute before it is initialized. Make sure `OpenLLM`'s constructor has finished running before `_llm` is read (see the sketch after this list for how pydantic enforces this).
2. **Typo in Access**: There might be a typo or an incorrect reference when accessing the `_llm` attribute. Check that the attribute name is spelled correctly wherever it is accessed.
3. **External Modification**: If the codebase is large or involves dynamic attribute handling, `_llm` may be deleted or modified outside the `OpenLLM` class definition. Check for any external modifications to `OpenLLM` instances.
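The reported traceback points at pydantic's `BaseModel.__setattr__`, which is the usual source of this exact message: in pydantic v1 (the compiled `pydantic/main.cpython-310...so` frame in the traceback suggests v1), assigning an attribute that is neither a declared field nor a declared private attribute raises `ValueError: "..." object has no field "..."`. Below is a minimal sketch of that mechanism; the `Broken` and `Fixed` class names are hypothetical stand-ins for any pydantic model such as `OpenLLM`:

```python
from typing import Any

from pydantic import BaseModel, PrivateAttr  # pydantic v1 semantics assumed


class Broken(BaseModel):
    """`_llm` is never declared, so pydantic rejects the assignment."""

    def __init__(self, **data: Any) -> None:
        super().__init__(**data)
        self._llm = object()  # raises: "Broken" object has no field "_llm"


class Fixed(BaseModel):
    """Declaring the underscore attribute with PrivateAttr makes it legal."""

    _llm: Any = PrivateAttr()

    def __init__(self, **data: Any) -> None:
        super().__init__(**data)
        self._llm = object()  # accepted: _llm is a declared private attribute


try:
    Broken()
except ValueError as exc:
    print(exc)  # "Broken" object has no field "_llm"

Fixed()  # initializes cleanly
```

If you subclass or wrap `OpenLLM` and add your own underscore-prefixed attributes, they need the same `PrivateAttr` declaration.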
Given the provided context, the class definition and the initialization of `_llm` are correct. If the error persists, review how and where the `OpenLLM` instances are being used and ensure that `_llm` is only accessed after the object has been properly initialized.
To avoid a `ValueError` when using the `OpenLLM` class, it's essential to ensure that the class is properly initialized before calling any of its methods. The `__init__` method must run first, since it sets up the necessary configuration and initializes the underlying OpenLLM model. After initialization, you can safely call the `complete`, `chat`, `stream_complete`, or `stream_chat` methods, depending on your requirements.
Here's a brief overview of the sequence:

1. Initialize `OpenLLM` with the required parameters:

   ```python
   llm = OpenLLM(model_id="HuggingFaceH4/zephyr-7b-alpha")
   ```

2. Call `complete`, `chat`, `stream_complete`, or `stream_chat` as needed:

   ```python
   response = llm.complete("To infinity, and beyond")
   ```
This sequence ensures that the `OpenLLM` instance is ready and configured correctly before any operation is attempted, avoiding `ValueError`s caused by uninitialized or misconfigured attributes.
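Putting the two steps together, here is a minimal end-to-end sketch. It assumes the `llama-index-llms-openllm` integration package from the linked documentation is installed; the import path may differ in older `llama_index` releases, and the streaming loop assumes the standard LlamaIndex `CompletionResponse.delta` interface:

```python
# Assumes: pip install llama-index-llms-openllm (import path may vary by version)
from llama_index.llms.openllm import OpenLLM

# Step 1: construct the instance first, so __init__ can set up the private _llm field.
llm = OpenLLM(model_id="HuggingFaceH4/zephyr-7b-alpha")

# Step 2: only call completion methods on the fully constructed instance.
response = llm.complete("To infinity, and beyond")
print(response)

# Streaming variant: stream_complete yields incremental response chunks.
for chunk in llm.stream_complete("To infinity, and beyond"):
    print(chunk.delta, end="")
```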
If you're still encountering issues, could you provide more information about your environment? Specifically, the version of LlamaIndex you're using and any other relevant details about your setup will help in diagnosing the issue further.
Documentation Issue Description
OpenLLM cannot be utilized.
Code:

```python
# Assumed import, not shown in the original report; the path may differ by llama_index version.
from llama_index.llms import OpenLLM
from vllm.model_executor.parallel_utils.parallel_state import destroy_model_parallel

destroy_model_parallel()
llm = OpenLLM("qwen/Qwen-7B-Chat", trust_remote_code=True)
answer = llm.complete("hello")
```
The following error occurs:
```
/usr/local/lib/python3.10/dist-packages/pydantic/main.cpython-310-x86_64-linux-gnu.so in pydantic.main.BaseModel.__setattr__()

ValueError: "OpenLLM" object has no field "_llm"
```
Documentation Link
https://docs.llamaindex.ai/en/stable/api_reference/llms/openllm/