Closed fabianlim closed 1 month ago
@achew010 note that we put in a comment in the main README
> Huggingface BNB QLoRA numbers taken with legacy approaches, but we are aware of https://github.com/foundation-model-stack/fms-acceleration/issues/10 and will update our benches. The above includes numbers using fusedOps-and-kernels and actual impl coming soon, see below.
Now that this issue is closed, that comment should be removed.
This issue concerns the warnings that, for QLoRA PEFT, we should pass `peft_config` directly to `SFTTrainer`. If `no_peft_model: True`, then `model_loader` will simply load the BNB model, and some logic is needed to set `requires_augmentation: False`, since the PEFT model will instead be created by `trf.SFTTrainer`.
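A minimal sketch of this control flow, assuming a simplified loader interface; the `LoadResult` class and the stand-in model object below are illustrative only, not the actual fms-acceleration plugin API:

```python
# Hypothetical sketch of the no_peft_model control flow described above.
# Names are illustrative; the real plugin wraps transformers/peft objects.
from dataclasses import dataclass
from typing import Any, Dict, Optional

@dataclass
class LoadResult:
    model: Any
    requires_augmentation: bool
    # peft_config is returned untouched so the caller can hand it to SFTTrainer
    peft_config: Optional[Dict[str, Any]] = None

def model_loader(model_name: str, peft_config: Dict[str, Any],
                 no_peft_model: bool = False) -> LoadResult:
    """With no_peft_model=True, load only the quantized base model and defer
    PEFT wrapping to SFTTrainer by passing peft_config through."""
    base_model = {"name": model_name, "quantized": "bnb-4bit"}  # stand-in for the BNB model
    if no_peft_model:
        # the plugin does no PEFT wrapping itself, so no augmentation is required
        return LoadResult(model=base_model, requires_augmentation=False,
                          peft_config=peft_config)
    # legacy path: the plugin itself augments the model with PEFT adapters
    return LoadResult(model=base_model, requires_augmentation=True)
```

Under this sketch, the caller passes `result.peft_config` on to `trf.SFTTrainer` only in the `no_peft_model` case.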
Tasks:

- update `tests/test_peft_plugin`
- in generate sample configurations, update the `CONFIGURATIONS` and `COMBINATIONS`
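A sketch of what that generator update could look like, assuming `CONFIGURATIONS` maps keys to template files with optional overrides and `COMBINATIONS` lists named stacks of keys; the key names and structures below are assumptions, not the script's actual contents:

```python
# Hypothetical shapes for the sample-configuration generator's tables.
KEY_BNB_NF4 = "bnb-nf4"

# each entry maps a key to a template config (plus optional key-path overrides)
CONFIGURATIONS = {
    KEY_BNB_NF4: "configs/bnb.yaml",
}

# each combination pairs a shortname with the configuration keys it stacks
COMBINATIONS = [
    ("accelerated-peft-bnb", (KEY_BNB_NF4,)),
]

# a variant that enables the new flag could then be registered as:
KEY_BNB_NF4_BASELINE = "baseline-bnb-nf4"
CONFIGURATIONS[KEY_BNB_NF4_BASELINE] = (
    "configs/bnb.yaml",
    [("peft.quantization.bitsandbytes.no_peft_model", True)],  # override
)
COMBINATIONS.append(("baseline-peft-bnb", (KEY_BNB_NF4_BASELINE,)))
```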
- in `configs/bnb.yaml`, add a new flag `no_peft_model`
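The config change might look like the fragment below; only `no_peft_model` comes from this issue, and the surrounding keys are assumed for illustration:

```yaml
# sketch of configs/bnb.yaml; surrounding keys are assumed
peft:
  quantization:
    bitsandbytes:
      quant_type: nf4
      # new flag: when true, model_loader loads only the BNB base model
      # and peft_config is passed directly to SFTTrainer
      no_peft_model: true
```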