Nitro963 opened this issue 3 months ago
Hey @Nitro963, I'm here to help you with any bugs, questions, or contributions you have regarding the repository. Let's squash those bugs together!
To resolve the version conflicts between `llama-index-multi-modal-llms-ollama` (`~0.1.3`) and `llama-index-llms-ollama` (`~0.2.2`), as well as the conflicts between `llama-index-embeddings-clip` and `torch` (`~2.3.1`) and `Pillow` (`~10.4.0`) in your project using version 0.10.65, you need to adjust the dependencies in your `pyproject.toml` file.

Here is an updated `pyproject.toml` snippet that should help resolve these conflicts:
```toml
[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
llama-index-core = "^0.10.1"
torch = ">=2.1.2,<2.3.1"      # Adjusted to avoid conflict with torch 2.3.1
pillow = ">=10.2.0,<10.4.0"   # Adjusted to avoid conflict with Pillow 10.4.0
torchvision = "^0.17.0"
ftfy = "^6.1.3"
llama-index-multi-modal-llms-ollama = "~0.1.3"
llama-index-llms-ollama = "~0.2.2"

[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
jupyter = "^1.0.0"
mypy = "0.991"
pre-commit = "3.2.0"
pylint = "2.15.10"
pytest = "7.2.1"
pytest-mock = "3.11.1"
ruff = "0.0.292"
tree-sitter-languages = "^1.8.0"
types-Deprecated = ">=0.1.0"
types-PyYAML = "^6.0.12.12"
types-protobuf = "^4.24.0.4"
types-redis = "4.5.5.0"
types-requests = "2.28.11.8"
types-setuptools = "67.1.0.0"

[tool.poetry.group.dev.dependencies.black]
extras = ["jupyter"]
version = "<=23.9.1,>=23.7.0"

[tool.poetry.group.dev.dependencies.codespell]
extras = ["toml"]
version = ">=v2.2.6"
```
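After editing `pyproject.toml`, the new constraints only take effect once the lockfile is regenerated; roughly (assuming a standard Poetry setup):

```shell
# Re-resolve dependencies against the edited pyproject.toml
# and rewrite poetry.lock with the new, compatible versions.
poetry lock

# Install the resolved package set into the project environment.
poetry install
```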
This configuration ensures that the versions of `torch` and `Pillow` are compatible with the `llama-index-embeddings-clip` package, and it includes the specified versions of `llama-index-multi-modal-llms-ollama` and `llama-index-llms-ollama` [1].
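To see why the tightened ranges remove the conflicts, here is a minimal sketch (plain Python, not Poetry's actual resolver) of how a `">=lower,<upper"` constraint excludes a conflicting release; the helper names are illustrative only:

```python
def parse(version: str) -> tuple:
    """Split a dotted version string like '2.3.1' into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version: str, lower: str, upper: str) -> bool:
    """True if version is in [lower, upper), mirroring a '>=lower,<upper' constraint."""
    return parse(lower) <= parse(version) < parse(upper)

# torch = ">=2.1.2,<2.3.1" keeps the conflicting 2.3.1 release out:
print(satisfies("2.3.1", "2.1.2", "2.3.1"))  # False
print(satisfies("2.2.2", "2.1.2", "2.3.1"))  # True

# A tilde constraint such as "~0.1.3" is shorthand for ">=0.1.3,<0.2.0":
print(satisfies("0.1.9", "0.1.3", "0.2.0"))  # True
print(satisfies("0.2.2", "0.1.3", "0.2.0"))  # False
```

Poetry's real resolver also handles pre-releases, epochs, and extras, but the tuple comparison above captures the core idea behind why pinning `torch` below 2.3.1 and `pillow` below 10.4.0 avoids the clash.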
To continue talking to Dosu, mention @dosu.
### Bug Description
Version conflicts between the Ollama integrations and `llama-index-embeddings-clip` when installing them into the same project.

### Version
0.10.65

### Steps to Reproduce
Install `llama-index-core`, then add the integrations one by one.

### Relevant Logs/Tracebacks
No response