Closed deep-diver closed 4 months ago
@lewtun
Besides the discussion about keeping or removing fsdp+qlora.yaml, I made an additional commit adding an example to https://github.com/huggingface/alignment-handbook/tree/main/scripts#fine-tuning. Please take a look!
@lewtun
Reminder.
@lewtun
Reminder: I addressed your comments :)
This PR adds FSDP+QLoRA support with the following changes:

- Add `recipes/accelerate_configs/fsdp+qlora.yaml`
- Add the `peft>=0.9.0` and `bitsandbytes>=0.43.0` dependencies
- Add a `bnb_4bit_quant_storage` field to `ModelArguments`
- Pass `bnb_4bit_quant_storage` through to `BitsAndBytesConfig`
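For context, an `accelerate` config for FSDP typically looks like the sketch below. This is an illustrative example of what `recipes/accelerate_configs/fsdp+qlora.yaml` might contain (the key names follow `accelerate`'s FSDP plugin; the exact values in this PR may differ):

```yaml
compute_environment: LOCAL_MACHINE
distributed_type: FSDP
fsdp_config:
  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
  fsdp_backward_prefetch: BACKWARD_PRE
  # Needed so quantized weights are loaded once and sharded across ranks
  fsdp_cpu_ram_efficient_loading: true
  fsdp_offload_params: false
  fsdp_sharding_strategy: FULL_SHARD
  fsdp_state_dict_type: SHARDED_STATE_DICT
  fsdp_use_orig_params: false
machine_rank: 0
num_machines: 1
num_processes: 2  # e.g. 2 x A6000
```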
With these changes, I have confirmed FSDP+QLoRA works within my local setup (2 x A6000).
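The wiring of the new field can be sketched as follows. This is a dependency-free illustration, not the handbook's actual code: a plain dict stands in for `transformers.BitsAndBytesConfig`, and the `ModelArguments` shown here is a hypothetical subset with only the relevant fields. The key idea is that `bnb_4bit_quant_storage` sets the dtype used to *store* the packed 4-bit weights, which must be a regular dtype such as `bfloat16` for FSDP to shard them.

```python
from dataclasses import dataclass


@dataclass
class ModelArguments:
    # Hypothetical subset of the handbook's ModelArguments,
    # showing only the fields relevant to 4-bit quantization.
    load_in_4bit: bool = False
    bnb_4bit_quant_storage: str = "uint8"


def quantization_kwargs(args: ModelArguments) -> dict:
    """Collect the kwargs that would be forwarded to BitsAndBytesConfig.

    A plain dict stands in for transformers.BitsAndBytesConfig so the
    sketch stays dependency-free; the real code would pass these kwargs
    to the config object directly.
    """
    if not args.load_in_4bit:
        return {}
    return {
        "load_in_4bit": True,
        "bnb_4bit_quant_storage": args.bnb_4bit_quant_storage,
    }


# For FSDP+QLoRA, the storage dtype is typically set to bfloat16:
kwargs = quantization_kwargs(
    ModelArguments(load_in_4bit=True, bnb_4bit_quant_storage="bfloat16")
)
```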