huggingface / peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
https://huggingface.co/docs/peft
Apache License 2.0

Optimize DoRA computation when there is no dropout #2107

Closed BenjaminBossan closed 1 month ago

BenjaminBossan commented 1 month ago

Feature request

DoRA could be made faster and use less memory if the base layer's result were reused in the DoRA computation. However, this is only equivalent when there is no dropout (otherwise the base result would have dropout applied to it). Therefore, the optimization can be applied when dropout=0 (i.e. when nn.Identity is used) or when the model is in eval mode.
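To illustrate the idea, here is a minimal sketch (not PEFT's actual implementation; the names lora_A, lora_B, and magnitude are illustrative) of a DoRA-style linear forward that reuses the already-computed base result when dropout is a no-op, and falls back to recomputing the base path on the dropped input otherwise:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoraLinearSketch(nn.Module):
    """Illustrative DoRA-style layer; not PEFT's real code."""

    def __init__(self, base: nn.Linear, r: int, dropout_p: float = 0.0):
        super().__init__()
        assert base.bias is None  # keep the sketch simple
        self.base = base
        self.lora_A = nn.Linear(base.in_features, r, bias=False)
        self.lora_B = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_B.weight)  # LoRA delta starts at zero
        self.dropout = nn.Dropout(dropout_p) if dropout_p > 0 else nn.Identity()
        # DoRA magnitude vector: one learnable scale per output feature,
        # initialized to the row-wise norm of the base weight
        self.magnitude = nn.Parameter(base.weight.norm(p=2, dim=1).clone())

    def forward(self, x):
        base_result = self.base(x)
        if isinstance(self.dropout, nn.Identity) or not self.training:
            # Optimization from the issue: with no (active) dropout,
            # base_result == F.linear(dropout(x), W), so it can be reused
            # instead of computing the base path a second time.
            x_eff = x
            base_dropped = base_result
        else:
            # With active dropout the base path must be recomputed on the
            # dropped input; reusing base_result here would be incorrect.
            x_eff = self.dropout(x)
            base_dropped = F.linear(x_eff, self.base.weight)
        lora_out = self.lora_B(self.lora_A(x_eff))
        # Per-output-feature norm of the combined weight, detached as in DoRA
        w = self.base.weight + self.lora_B.weight @ self.lora_A.weight
        norm = w.norm(p=2, dim=1).detach()
        scale = self.magnitude / norm
        return scale * (base_dropped + lora_out)
```

In eval mode nn.Dropout is also a no-op, which is why the sketch treats `not self.training` the same as nn.Identity, matching the "or during eval mode" condition above.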

Motivation

Faster and more memory-efficient DoRA when there is no dropout. Experimentally, dropout is not crucial for training DoRA; see this comment.

Your contribution

I can work on this when I have a bit of time, but contributions are very welcome.

ariG23498 commented 1 month ago

Hey @BenjaminBossan I would love to work on this.

Should I create a PR and then have the rest of the conversation there?

BenjaminBossan commented 1 month ago

Thanks @ariG23498. Do as you like: if you have code, feel free to create a (draft) PR; otherwise, discussing here is also fine.