Closed: vrunm closed this issue 1 year ago.
It looks like you need to upgrade your version of Transformers; 4-bit support is only available in 4.30.0 and later.
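Before retrying the fine-tuning run, it can help to check the installed version programmatically. A minimal sketch (the helper name `supports_4bit` is illustrative, not part of any library; it assumes plain `X.Y.Z` version strings):

```python
def supports_4bit(transformers_version: str) -> bool:
    """Return True if this transformers version has 4-bit loading support.

    4-bit quantized loading landed in transformers 4.30.0.
    Handles plain "X.Y.Z" strings only (no rc/dev suffixes).
    """
    parts = tuple(int(p) for p in transformers_version.split(".")[:3])
    return parts >= (4, 30, 0)


print(supports_4bit("4.29.2"))  # False
print(supports_4bit("4.30.0"))  # True
```

In practice you would pass `transformers.__version__` to the check, and upgrade with `pip install -U transformers` if it returns False.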
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
System Info
Information
Tasks
- `no_trainer` script in the `examples` folder of the `transformers` repo (such as `run_no_trainer_glue.py`)

Reproduction
While fine-tuning GPT-NeoX-20B with QLoRA using accelerate and bitsandbytes, I ran into the following issue when trying to load the model:
Expected behavior
The model weights should have loaded correctly from the Hugging Face Hub.
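For reference, a minimal sketch of the 4-bit model load that QLoRA fine-tuning typically starts from, assuming transformers >= 4.30.0, bitsandbytes installed, and a CUDA device available; the quantization settings shown (NF4, double quantization, bfloat16 compute) are common illustrative choices, not values taken from this issue:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Illustrative 4-bit quantization config; requires transformers >= 4.30.0.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Load the 20B model in 4-bit; device_map="auto" shards it across
# available GPUs (downloading ~40 GB of weights on first run).
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neox-20b",
    quantization_config=bnb_config,
    device_map="auto",
)
```

If the load fails on an older transformers version, `from_pretrained` does not recognize `quantization_config` with 4-bit options, which matches the upgrade advice above.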