foundation-model-stack / fms-acceleration

🚀 Collection of libraries used with fms-hf-tuning to accelerate fine-tuning and training of large models.
Apache License 2.0

Fused Ops Support for Lora Dropout #37

Closed fabianlim closed 1 week ago

fabianlim commented 2 weeks ago

This PR addresses #32 to allow Fused Ops to also support lora dropout.

The key strategy is to pass the dropout module into the LoRA `apply` function, invoke it inside `matmul_lora`, and return the dropped-out X to the top level so it can be saved for the backward pass.
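A minimal sketch of that strategy, under assumptions: `matmul_lora`, `LoRAFused`, and the tensor shapes below are hypothetical stand-ins for the PR's actual fused-ops code, and the backward here omits re-applying the dropout mask to the X gradient (a full implementation would route it through the same mask).

```python
import torch

def matmul_lora(X, W, A, B, s, dropout=None):
    """Fused base + LoRA matmul. Applies dropout to X for the LoRA
    branch only, and returns the dropped-out X so the caller can
    save it for the backward pass (sketch, not the PR's real code)."""
    out = X @ W                 # frozen base projection
    if dropout is not None:
        X = dropout(X)          # mask X once; reuse in backward
    return out + s * (X @ A) @ B, X

class LoRAFused(torch.autograd.Function):
    @staticmethod
    def forward(ctx, X, W, A, B, s, dropout=None):
        out, dropped_X = matmul_lora(X, W, A, B, s, dropout)
        # Save the *dropped* X: the adapter gradients must see the
        # same masked activations the forward pass used.
        ctx.save_for_backward(dropped_X, W, A, B)
        ctx.s = s
        return out

    @staticmethod
    def backward(ctx, dY):
        dropped_X, W, A, B = ctx.saved_tensors
        s = ctx.s
        # Adapter grads use the dropped-out X saved in forward.
        dA = s * dropped_X.t() @ (dY @ B.t())
        dB = s * (dropped_X @ A).t() @ dY
        # Input grad: base path plus LoRA path (dropout mask on the
        # LoRA term omitted in this sketch).
        dX = dY @ W.t() + s * (dY @ B.t()) @ A.t()
        return dX, None, dA, dB, None, None
```

With `dropout=None` (or p = 0) the fused op matches the unfused computation exactly, which makes the no-dropout case a convenient correctness check.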

fabianlim commented 1 week ago

Merged to the wrong branch, so I cherry-picked to dev in commit 6186ddd4e2dd4c8f53b803bcf032b9fb481ec51e.