run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: Conflicting versions #15317

Open Nitro963 opened 3 months ago

Nitro963 commented 3 months ago

Bug Description

  1. Multimodal LLMs:
    • llama-index-multi-modal-llms-ollama ~= 0.1.3 conflicts with llama-index-llms-ollama ~= 0.2.2
    • llama-index-multi-modal-llms-ollama uses an old ollama client version (^0.1.6)
  2. OpenAI CLIP embeddings:
    • llama-index-embeddings-clip conflicts with torch ~= 2.3.1 and Pillow ~= 10.4.0
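The first conflict can be viewed as two version ranges with no common member: under Poetry's caret rule, the multimodal package's `^0.1.6` pin on the ollama client expands to >=0.1.6,<0.2.0, while the newer llms package needs a 0.2.x client. A minimal sketch of that reasoning (the second range is an assumption for illustration, not taken from either package's metadata):

```python
# Poetry's caret rule expands "^0.1.6" to >=0.1.6,<0.2.0 for the ollama
# client pinned by llama-index-multi-modal-llms-ollama. Assume, for
# illustration, that llama-index-llms-ollama ~=0.2.2 needs a client in
# >=0.2.0,<0.3.0. Two half-open ranges share a version only if each
# lower bound lies strictly below the other range's upper bound.

def overlaps(a, b):
    """Check whether two half-open version ranges [lo, hi) intersect."""
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    return a_lo < b_hi and b_lo < a_hi

old_client = ((0, 1, 6), (0, 2, 0))  # ^0.1.6
new_client = ((0, 2, 0), (0, 3, 0))  # assumed >=0.2.0,<0.3.0

print(overlaps(old_client, new_client))  # False: no installable version exists
```

Since no ollama client version satisfies both constraints at once, the resolver has to reject the combination outright.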

Version

0.10.65

Steps to Reproduce

Install llama-index-core, then add the integrations one by one.

Relevant Logs/Tracebacks

No response

dosubot[bot] commented 3 months ago

Hey @Nitro963, I'm here to help you with any bugs, questions, or contributions you have regarding the repository. Let's squash those bugs together!

To resolve the version conflicts between llama-index-multi-modal-llms-ollama ~0.1.3 and llama-index-llms-ollama ~0.2.2, as well as the conflict between llama-index-embeddings-clip and torch ~2.3.1 / Pillow ~10.4.0 on llama-index 0.10.65, you can adjust the dependency constraints in your pyproject.toml file.

Here is an updated pyproject.toml snippet that should help resolve these conflicts:

[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
llama-index-core = "^0.10.1"
torch = ">=2.1.2,<2.3.1"  # Adjusted to avoid conflict with torch 2.3.1
pillow = ">=10.2.0,<10.4.0"  # Adjusted to avoid conflict with Pillow 10.4.0
torchvision = "^0.17.0"
ftfy = "^6.1.3"
llama-index-multi-modal-llms-ollama = "~0.1.3"
llama-index-llms-ollama = "~0.2.2"

[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
jupyter = "^1.0.0"
mypy = "0.991"
pre-commit = "3.2.0"
pylint = "2.15.10"
pytest = "7.2.1"
pytest-mock = "3.11.1"
ruff = "0.0.292"
tree-sitter-languages = "^1.8.0"
types-Deprecated = ">=0.1.0"
types-PyYAML = "^6.0.12.12"
types-protobuf = "^4.24.0.4"
types-redis = "4.5.5.0"
types-requests = "2.28.11.8"
types-setuptools = "67.1.0.0"

[tool.poetry.group.dev.dependencies.black]
extras = ["jupyter"]
version = "<=23.9.1,>=23.7.0"

[tool.poetry.group.dev.dependencies.codespell]
extras = ["toml"]
version = ">=2.2.6"

This configuration ensures that the versions of torch and Pillow are compatible with the llama-index-embeddings-clip package, and it includes the specified versions of llama-index-multi-modal-llms-ollama and llama-index-llms-ollama [1].
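After re-locking and installing with the adjusted constraints, one way to sanity-check what the resolver actually picked is to query the installed distributions. A small sketch using the standard library (the package list mirrors the snippet above; any of these may be absent in a given environment, which the code tolerates):

```python
from importlib.metadata import version, PackageNotFoundError

def report(packages):
    """Return {distribution name: installed version, or None if absent}."""
    out = {}
    for pkg in packages:
        try:
            out[pkg] = version(pkg)
        except PackageNotFoundError:
            out[pkg] = None
    return out

# Distribution names taken from the pyproject snippet above.
for name, ver in report(["torch", "pillow", "llama-index-llms-ollama"]).items():
    print(name, ver or "not installed")
```

Comparing the printed versions against the pyproject ranges confirms whether the lock actually landed inside the intended bounds.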

To continue talking to Dosu, mention @dosu.