Closed — tiran closed this issue 2 months ago
We still need the >= 4.38.0, < 4.39.0 dependency on Transformers for Optimum Habana because the latest stable release of Optimum Habana (v1.11.1) doesn't work with more recent versions of Transformers.
https://github.com/huggingface/optimum-habana/pull/1027 was merged a few days ago to add compatibility with Transformers v4.40, and Optimum Habana v1.12 will be released soon. You can also install the library from source if you want to benefit from this change now:
pip install git+https://github.com/huggingface/optimum-habana.git
Thank you for the update. We can wait a bit longer for the release.
I prefer to avoid installing from git. Although a requirements.txt can contain a git URL, PyPI does not permit packages whose dependencies use direct URL references.
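To illustrate the distinction (a sketch, not taken from this thread): a direct git URL is fine in a project's own requirements file, but the same reference in the dependency metadata of a package uploaded to PyPI is rejected.

```text
# requirements.txt — works for a local or CI install:
optimum-habana @ git+https://github.com/huggingface/optimum-habana.git

# pyproject.toml of a package published to PyPI — rejected on upload:
# dependencies = [
#     "optimum-habana @ git+https://github.com/huggingface/optimum-habana.git",
# ]
```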
Hi,
do you have an ETA for the new releases of optimum and optimum-habana? We are facing the issue that Habana's vLLM fork wants transformers >= 4.40, which conflicts with optimum-habana 1.11.1's >= 4.38.0, < 4.39.0.
optimum 1.20.0 and optimum-habana 1.12.0 were released over the weekend. The new version requires transformers >= 4.40.0, < 4.41.0.
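The conflict, and why the new release resolves it, can be checked mechanically with the `packaging` library; the specifier strings below are copied from this thread (a minimal sketch, not part of any project's actual tooling):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Constraints on transformers mentioned in this thread:
old_oh = SpecifierSet(">=4.38.0,<4.39.0")  # optimum-habana 1.11.1
new_oh = SpecifierSet(">=4.40.0,<4.41.0")  # optimum-habana 1.12.0
vllm = SpecifierSet(">=4.40")              # Habana's vLLM fork

v = Version("4.40.0")
print(v in old_oh)                # False: the old pin conflicts with vLLM
print(v in new_oh and v in vllm)  # True: 1.12.0 resolves the conflict
```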
Feature request
optimum currently limits transformers to >= 4.38.0, < 4.39.0. @regisss bumped the upper version limit in PR #1851 a month ago. Is there any technical reason to limit the upper version to < 4.39? Other dependencies allow for more recent versions; for example, neuronx allows < 4.42.0, see #1881.
Motivation
We would like to use newer versions of transformers and tokenizers in InstructLab. The upper version limit for optimum makes this harder for us. We need optimum-habana for Intel Gaudi support.
Your contribution
I can create a PR. It's a trivial one-line change.
Testing is less trivial. I have access to an 8-way Gaudi 2 system, but it is currently busy. I can do some testing in about two weeks, after I have updated the system from 1.15.1 to 1.16.0.