run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License
37.02k stars 5.31k forks

[Bug]: HuggingFaceEmbedding and OpenLLM depend on different versions of llama-index-core; using both classes together always makes one of them fail #16023

Open NoobPythoner opened 2 months ago

NoobPythoner commented 2 months ago

Bug Description

HuggingFaceEmbedding depends on llama-index-core 0.10.68.post1, while OpenLLM depends on llama-index-core 0.11.9. If OpenLLM runs successfully, then HuggingFaceEmbedding fails:

    File ~/anaconda3/envs/llama_factory/lib/python3.11/site-packages/llama_index/embeddings/huggingface/base.py:70, in HuggingFaceEmbedding.__init__(self, model_name, tokenizer_name, pooling, max_length, query_instruction, text_instruction, normalize, model, tokenizer, embed_batch_size, cache_folder, trust_remote_code, device, callback_manager, model_kwargs)
         52 def __init__(
         53     self,
         54     model_name: str = DEFAULT_HUGGINGFACE_EMBEDDING_MODEL,
        (...)
         68     model_kwargs,
         69 ):
    ---> 70     self._device = device or infer_torch_device()
         72     cache_folder = cache_folder or get_cache_dir()
         74     for variable, value in [
         75         ("model", model),
         76         ("tokenizer", tokenizer),
         77         ("pooling", pooling),
         78         ("tokenizer_name", tokenizer_name),
         79     ]:

    File ~/anaconda3/envs/llama_factory/lib/python3.11/site-packages/pydantic/main.py:862, in BaseModel.__setattr__(self, name, value)
        857         raise AttributeError(
        858             f'{name!r} is a ClassVar of {self.__class__.__name__} and cannot be set on an instance. '
        859             f'If you want to set a value on the class, use {self.__class__.__name__}.{name} = value.'
        860         )
        861 elif not _fields.is_valid_field_name(name):
    --> 862     if self.__pydantic_private__ is None or name not in self.__private_attributes__:
        863         _object_setattr(self, name, value)
        864     else:

    File ~/anaconda3/envs/llama_factory/lib/python3.11/site-packages/pydantic/main.py:850, in BaseModel.__getattr__(self, item)
        848 else:
        849     if hasattr(self.__class__, item):
    --> 850         return super().__getattribute__(item)  # Raises AttributeError if appropriate
        851     else:
        852         # this is the current error
        853         raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')

AttributeError: 'HuggingFaceEmbedding' object has no attribute '__pydantic_private__'

If HuggingFaceEmbedding runs successfully, then OpenLLM fails:

    File ~/anaconda3/envs/llama_factory/lib/python3.11/site-packages/llama_index/llms/openllm/__init__.py:1
    ----> 1 from llama_index.llms.openllm.base import OpenLLM
          3 __all__ = ["OpenLLM"]

    File ~/anaconda3/envs/llama_factory/lib/python3.11/site-packages/llama_index/llms/openllm/base.py:1
    ----> 1 from llama_index.llms.openai_like.base import OpenAILike
          4 class OpenLLM(OpenAILike):
          5     r"""
          6     OpenLLM LLM.
            (...)
         23     ```
         24     """

    File ~/anaconda3/envs/llama_factory/lib/python3.11/site-packages/llama_index/llms/openai_like/__init__.py:1
    ----> 1 from llama_index.llms.openai_like.base import OpenAILike
          3 __all__ = ["OpenAILike"]

    File ~/anaconda3/envs/llama_factory/lib/python3.11/site-packages/llama_index/llms/openai_like/base.py:24
         20 from llama_index.llms.openai.base import OpenAI, Tokenizer
         21 from transformers import AutoTokenizer
    ---> 24 class OpenAILike(OpenAI):
         25     """OpenAILike LLM.
         27     OpenAILike is a thin wrapper around the OpenAI model that makes it compatible with
            (...)
         48     ```
         49     """
         51     context_window: int = Field(
         52         default=DEFAULT_CONTEXT_WINDOW,
         53         description=LLMMetadata.model_fields["context_window"].description,
         54     )

    File ~/anaconda3/envs/llama_factory/lib/python3.11/site-packages/llama_index/llms/openai_like/base.py:53, in OpenAILike()
         24 class OpenAILike(OpenAI):
         25     """OpenAILike LLM.
         27     OpenAILike is a thin wrapper around the OpenAI model that makes it compatible with
            (...)
         48     ```
         49     """
         51 context_window: int = Field(
         52     default=DEFAULT_CONTEXT_WINDOW,
    ---> 53     description=LLMMetadata.model_fields["context_window"].description,
         54 )
         55 is_chat_model: bool = Field(
         56     default=False,
         57     description=LLMMetadata.model_fields["is_chat_model"].description,
         58 )
         59 is_function_calling_model: bool = Field(
         60     default=False,
         61     description=LLMMetadata.model_fields["is_function_calling_model"].description,
         62 )

AttributeError: type object 'LLMMetadata' has no attribute 'model_fields'
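Both tracebacks are symptoms of the same root cause: the two integration packages pin non-overlapping ranges of llama-index-core, so no single installed core version can satisfy both. A toy illustration of the conflict (the exact ranges are assumptions inferred from the versions reported above, not taken from the packages' metadata):

```python
def satisfies(version: tuple, lower: tuple, upper: tuple) -> bool:
    """True if lower <= version < upper, with versions as (major, minor, patch) tuples."""
    return lower <= version < upper

# Assumed pins: the old embeddings package wants core >=0.10,<0.11,
# while the OpenLLM package wants core >=0.11,<0.12.
hf_pin = ((0, 10, 0), (0, 11, 0))
openllm_pin = ((0, 11, 0), (0, 12, 0))

core = (0, 11, 9)  # the installed llama-index-core

print(satisfies(core, *hf_pin))       # is the embeddings pin met?
print(satisfies(core, *openllm_pin))  # is the OpenLLM pin met?
```

Whichever core version is installed, exactly one of the two checks fails, which matches the either-or behavior described in the report.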

Version

llama-index 0.11.9

Steps to Reproduce

Code:

    from llama_index.llms.openllm import OpenLLM
    from llama_index.embeddings.huggingface import HuggingFaceEmbedding

    llm_model = "qwen1.5-14b-chat"  # alternative: "qwen-plus"

    llm = OpenLLM(
        model=llm_model,
        api_base="http://0.0.0.0:8081/v1",
        api_key="na",
    )

    model_path = "./embedding_model/bge-large-zh-v1.5"
    embed_model = HuggingFaceEmbedding(
        model_name=model_path,
        # BGE retrieval instruction: "Generate a representation for this
        # sentence to retrieve related articles:"
        query_instruction="为这个句子生成表示以用于检索相关文章:",
        device="cuda",
    )
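Before debugging further, it helps to confirm which versions of the packages involved are actually installed. A small standard-library-only sketch (the package names are those reported above):

```python
from importlib.metadata import PackageNotFoundError, version

def installed_version(pkg: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

for pkg in (
    "llama-index-core",
    "llama-index-embeddings-huggingface",
    "llama-index-llms-openllm",
):
    print(f"{pkg}: {installed_version(pkg)}")
```

If llama-index-core reports 0.10.x while llama-index-llms-openllm is installed (or 0.11.x while an old embeddings package is installed), the environment is in the conflicted state described here.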

Relevant Logs/Tracebacks

No response

dosubot[bot] commented 2 months ago

I found some similar issues that might help you resolve the dependency conflicts between HuggingFaceEmbedding and OpenLLM:

  1. [Bug]: Conflicting versions (Open, last updated on August 20, 2024) [1]:

    • Solution: Adjust the dependencies in your pyproject.toml file to ensure compatibility between the specified versions of llama-index-core, torch, Pillow, and other related packages. Here is an example configuration:

      [tool.poetry.dependencies]
      python = ">=3.8.1,<4.0"
      llama-index-core = "^0.10.1"
      torch = ">=2.1.2,<2.3.1"
      pillow = ">=10.2.0,<10.4.0"
      torchvision = "^0.17.0"
      ftfy = "^6.1.3"
      llama-index-multi-modal-llms-ollama = "~0.1.3"
      llama-index-llms-ollama = "~0.2.2"
      
      [tool.poetry.group.dev.dependencies]
      ipython = "8.10.0"
      jupyter = "^1.0.0"
      mypy = "0.991"
      pre-commit = "3.2.0"
      pylint = "2.15.10"
      pytest = "7.2.1"
      pytest-mock = "3.11.1"
      ruff = "0.0.292"
      tree-sitter-languages = "^1.8.0"
      types-Deprecated = ">=0.1.0"
      types-PyYAML = "^6.0.12.12"
      types-protobuf = "^4.24.0.4"
      types-redis = "4.5.5.0"
      types-requests = "2.28.11.8"
      types-setuptools = "67.1.0.0"
      
      [tool.poetry.group.dev.dependencies.black]
      extras = ["jupyter"]
      version = "<=23.9.1,>=23.7.0"
      
      [tool.poetry.group.dev.dependencies.codespell]
      extras = ["toml"]
      version = ">=v2.2.6"
  2. [Bug]: LLAMA INDEX is becoming a Dependency Hell itself (Closed, last updated on August 22, 2024) [2]:

    • Solution: Use Poetry to manage dependencies. Create a new directory, initialize Poetry, and add the required packages using poetry add <package>. This approach helps manage dependencies more effectively and avoids conflicts.
  3. [Bug]: llama-index-embeddings-google and llama-index-vector-stores-google need package bump (Closed, last updated on July 21, 2024) [3]:

    • Solution: Update pip to the latest version and consider using Anaconda for managing packages and dependencies. Manually adjust version constraints in your environment to resolve conflicts.

For your specific issue, you can try the following steps:

  1. Update Dependencies: Ensure that both HuggingFaceEmbedding and OpenLLM use compatible versions of llama-index-core. You might need to adjust the versions in your pyproject.toml or requirements.txt file.

  2. Use Poetry: Initialize a new Poetry project and add the required dependencies. This can help manage and resolve version conflicts more effectively.

  3. Check for Compatibility: Verify that the versions of llama-index-core used by HuggingFaceEmbedding and OpenLLM are compatible. You might need to downgrade or upgrade one of the packages to ensure compatibility.

By following these steps, you should be able to resolve the dependency conflicts and use both HuggingFaceEmbedding and OpenLLM without issues.
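One way to act on step 1 is to pin mutually compatible versions explicitly. A minimal requirements.txt sketch, assuming the reported 0.11.x line of llama-index-core (the integration-package versions shown are assumptions and should be confirmed against each package's metadata on PyPI before use):

```
llama-index-core==0.11.9
llama-index-embeddings-huggingface>=0.3.0,<0.4.0
llama-index-llms-openllm>=0.3.0,<0.4.0
```

Installing from a single pinned file lets pip surface any remaining conflict up front instead of failing at import time.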

To continue talking to Dosu, mention @dosu.

logan-markewich commented 2 months ago

Try `pip install -U llama-index-embeddings-huggingface`

912100012 commented 2 months ago

Mind adding me on WeChat? I'm currently learning LlamaIndex and would like to set up a discussion group. kai9121000