[Open] adrihercer opened this issue 4 months ago
I have the same problem when running SageMaker with transformers 4.44 specified:
2024-08-08T20:54:55,820 [INFO ] W-9000-model-stdout com.amazonaws.ml.mms.wlm.WorkerLifeCycle - File "/opt/conda/lib/python3.10/site-packages/sagemaker_huggingface_inference_toolkit/transformers_utils.py", line 24, in <module>
2024-08-09T14:03:27,296 [INFO ] W-9000-model-stdout com.amazonaws.ml.mms.wlm.WorkerLifeCycle - from transformers.pipelines import Conversation, Pipeline
2024-08-09T14:03:27,297 [INFO ] W-9000-model-stdout com.amazonaws.ml.mms.wlm.WorkerLifeCycle - ImportError: cannot import name 'Conversation' from 'transformers.pipelines' (/opt/conda/lib/python3.10/site-packages/transformers/pipelines/__init__.py)
I tried specifying transformers 4.41.2, but that triggered this error:
2024-08-09T15:14:27,329 [INFO ] W-9000-model-stdout com.amazonaws.ml.mms.wlm.WorkerLifeCycle - ImportError: peft<=0.6.2 is required for a normal functioning of this module, but found peft==0.12.0.
Specifying peft<=0.6.2 triggered this error:

"rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}"

(That rope_scaling layout is the newer Llama 3.1 format, which older transformers releases don't parse, so the model seems to need a newer transformers than the toolkit can tolerate.)
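A quick way to see which side of the breaking change a given container landed on, as a sketch:

```python
# Check the installed transformers version and whether the class the toolkit
# imports still exists (it was removed from transformers.pipelines in 4.42).
import transformers

print(transformers.__version__)
try:
    from transformers.pipelines import Conversation  # noqa: F401
    print("Conversation importable: the toolkit will start")
except ImportError:
    print("Conversation removed: the toolkit's transformers_utils.py will fail")
```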
I have also encountered this issue when trying to update to the most recent version of transformers while continuing to use the sagemaker-huggingface-inference-toolkit.
Is the fix for now simply to remove the Conversation import from the place highlighted in the original comment?
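If anyone wants to try that locally before a fix is released, here is a minimal sketch of such a guard (assuming the toolkit only needs Conversation when the conversational pipeline is actually used):

```python
# Sketch of a guarded import for transformers_utils.py: keeps working on
# transformers < 4.42 and degrades gracefully where Conversation is gone.
try:
    from transformers.pipelines import Conversation, Pipeline
except ImportError:  # transformers >= 4.42 removed Conversation
    Conversation = None
    from transformers.pipelines import Pipeline
```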
I'm more concerned that this highlights a lack of maintenance and/or wider usage of this package across the community. What is the recommended way to host HF models for inference in SageMaker that does not make use of the sagemaker-huggingface-inference-toolkit?
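For what it's worth, one commonly used alternative is the Hugging Face TGI container, which brings its own serving stack and does not import the inference toolkit. A sketch, not an official recommendation; the container version, model ID, and instance type below are assumptions:

```python
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()

# TGI ("LLM") container instead of the classic HF inference DLC.
image_uri = get_huggingface_llm_image_uri("huggingface", version="2.2.0")

model = HuggingFaceModel(
    image_uri=image_uri,
    role=role,
    env={
        "HF_MODEL_ID": "HuggingFaceH4/zephyr-7b-beta",  # example model
        "SM_NUM_GPUS": "1",
    },
)

predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.2xlarge")
print(predictor.predict({"inputs": "Hello"}))
```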
@ed-berry I'm still having this issue if I use the latest version of transformers (4.44.2), and I get other errors if I use transformers 4.41.2.
I see your PR, but it still hasn't been merged. @philschmid has approved it, but it currently says "This workflow requires approval from a maintainer." It has been a week since the last action. Do you know how we can bring this to the attention of the maintainers?
Hey @joann-alvarez. That PR will need to be merged, and a new release of the inference toolkit cut, to resolve the issue if you aren't able to use an older version of transformers.
Yeah, one of the maintainers needs to approve a run of the GitHub workflows that check the PR before it can be merged. I'm not sure how we flag that beyond creating a PR and this issue though.
Your best bet might be trying to get an older version of transformers working for now.
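Another way to get an older stack is to select an older prebuilt Hugging Face DLC through the SageMaker Python SDK instead of (or in addition to) pinning in requirements.txt. A sketch only; the artifact path, role, and version triple below are assumptions, so check the published DLC combinations for your region:

```python
from sagemaker.huggingface import HuggingFaceModel

# Hypothetical model artifact and role; the DLC version triple is an
# assumption: pick any published combination that predates transformers 4.42.
model = HuggingFaceModel(
    model_data="s3://my-bucket/model.tar.gz",
    role="arn:aws:iam::123456789012:role/MySageMakerRole",
    transformers_version="4.37",
    pytorch_version="2.1",
    py_version="py310",
)

predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.2xlarge")
```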
Hi, how do I use the latest version? I'm trying to deploy FLUX to SageMaker, but I'm facing this issue when installing transformers==4.43.4.
Is there any update or temporary solution to this? I can't pin it to a specific lower version, as SageMaker would complain about it.
Can you try temporarily installing huggingface-hub==0.25.2 while we look into this?
@philschmid I just tried adding that. I still get the same error I've been getting since August:
2024-10-21T15:08:07,968 [INFO ] W-9000-model-stdout com.amazonaws.ml.mms.wlm.WorkerLifeCycle - ImportError: cannot import name 'Conversation' from 'transformers.pipelines' (/opt/conda/lib/python3.10/site-packages/transformers/pipelines/__init__.py)
It also seems that huggingface-hub==0.25.2 was already installed.
We are hosting a model in SageMaker and today we observed the following error in our logs when the model was being relaunched in the instance:

ImportError: cannot import name 'Conversation' from 'transformers.pipelines' (/opt/conda/lib/python3.10/site-packages/transformers/pipelines/__init__.py)
I found someone reporting a similar issue in https://discuss.huggingface.co/t/cannot-import-conversation-from-transformers-utils-py/91556. When digging into the changes in the transformers dependency, I found this change regarding the Conversation object: https://github.com/huggingface/transformers/pull/31165

Based on our logs, the last time the model was successfully relaunched in our SageMaker infrastructure (2 days ago), the version of the package that was downloaded was transformers-4.41.2-py3-none-any.whl, but when the error started to be observed, the downloaded version was transformers-4.42.1-py3-none-any.whl. According to the pull request mentioned above, the change is effective as of version 4.42, which was released 18 hours ago at the moment of writing this issue.

I think a change is needed in src/sagemaker_huggingface_inference_toolkit/transformers_utils.py to reflect the new structure of the code in the transformers dependency, or the dependency should be pinned to version 4.41.2 to prevent the issue in the future. In our case, we are updating our requirements.txt file to pin version 4.41.2, but other users might not be aware of what is happening, and therefore not aware of this fix.
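For anyone applying the same workaround, this is the kind of pin meant for the model archive's requirements.txt (the peft pin is only needed if you hit the peft error quoted earlier in this thread):

```text
transformers==4.41.2
peft==0.6.2  # only if you see "peft<=0.6.2 is required" at startup
```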