foundation-model-stack / fms-acceleration

🚀 Collection of libraries used with fms-hf-tuning to accelerate fine-tuning and training of large models.
Apache License 2.0

Allow Fused Ops to Support Dropout #32

Closed fabianlim closed 1 week ago

fabianlim commented 3 weeks ago

Right now the fused ops do not support dropout, but it could perhaps be supported quite trivially, since this is the implementation of dropout in QuantLinear in both peft.tuners.lora.bnb and peft.tuners.lora.gptq:

output = lora_B(lora_A(dropout(x)))
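As a minimal sketch of the computation above (module and parameter names here are illustrative, not peft's actual API), the LoRA path applies dropout to the input before the low-rank projections, which is the ordering any fused kernel would need to preserve:

```python
import torch
import torch.nn as nn

class LoRAWithDropout(nn.Module):
    """Sketch of the LoRA path output = lora_B(lora_A(dropout(x))).

    Names mirror the expression in the issue; this is not peft's
    QuantLinear implementation, just the same dataflow.
    """

    def __init__(self, in_features: int, out_features: int, r: int, dropout_p: float = 0.1):
        super().__init__()
        self.dropout = nn.Dropout(p=dropout_p)
        self.lora_A = nn.Linear(in_features, r, bias=False)
        self.lora_B = nn.Linear(r, out_features, bias=False)
        # Conventional LoRA init: B starts at zero so the adapter is a no-op initially.
        nn.init.zeros_(self.lora_B.weight)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Dropout is applied to the input *before* the low-rank projection,
        # so a fused op must apply (or replay) the same mask and scaling.
        return self.lora_B(self.lora_A(self.dropout(x)))

layer = LoRAWithDropout(16, 16, r=4, dropout_p=0.5)
layer.eval()  # dropout acts as identity in eval mode
y = layer(torch.randn(2, 16))
```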