Closed: mkgs210 closed this issue 6 months ago
Thanks for reporting this bug; it should be fixed once #699 is merged.
I hit this error: `AttributeError: 'LlamaForCausalLM' object has no attribute 'adapter_to'`
Are you using the latest library version from our main branch? These changes haven't been pushed to PyPI yet.
I ran the QLoRA example with `load_best_model_at_end=True` added and got:

ValueError: `.to` is not supported for 4-bit or 8-bit bitsandbytes models. Please use the model as it is, since the model has already been set to the correct devices and casted to the correct `dtype`.

Environment info

- `adapters` version: 0.2.0

Information
Model I am using (Bert, XLNet ...): Llama 2
Language I am using the model on (English, Chinese ...): multi
Adapter setup I am using (if any): LoRA
The problem arises when using: the official QLoRA example script (with `load_best_model_at_end=True` added)
To reproduce
Steps to reproduce the behavior:

1. Run the QLoRA example with `load_best_model_at_end=True` added to the training arguments.
2. At the end of training, reloading the best checkpoint fails with the error below.
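For anyone trying to reproduce this, here is a minimal sketch of the setup that hits the error. The model id, dummy dataset, adapter name, and step intervals are placeholders rather than the exact notebook code; the only relevant change is `load_best_model_at_end=True`:

```python
import torch
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    DataCollatorForLanguageModeling,
    TrainingArguments,
)
import adapters
from adapters import AdapterTrainer, LoRAConfig

model_id = "meta-llama/Llama-2-7b-hf"  # placeholder; any causal LM loadable in 4-bit

# Load the base model in 4-bit, as in the QLoRA example
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(
    model_id, quantization_config=bnb_config, device_map="auto"
)

# Attach and activate a LoRA adapter via the adapters library
adapters.init(model)
model.add_adapter("qlora", config=LoRAConfig())
model.train_adapter("qlora")

# Tiny dummy dataset so the sketch is self-contained
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=32)

ds = Dataset.from_dict({"text": ["hello world"] * 16}).map(
    tokenize, batched=True, remove_columns=["text"]
)

args = TrainingArguments(
    output_dir="qlora-out",
    per_device_train_batch_size=2,
    evaluation_strategy="steps",      # a "best" checkpoint requires eval + save
    save_strategy="steps",
    eval_steps=2,
    save_steps=2,
    max_steps=4,
    load_best_model_at_end=True,      # <- the flag that triggers the ValueError
)

trainer = AdapterTrainer(
    model=model,
    args=args,
    train_dataset=ds,
    eval_dataset=ds,
    tokenizer=tokenizer,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # fails while reloading the best checkpoint into the 4-bit model
```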
Full error:
```
ValueError                                Traceback (most recent call last)
```

Expected behavior
The adapter trainer loads the best model at the end.