kohya-ss / sd-scripts


Padding LoRA ranks to enable merging #117

Open · AI-Casanova opened this issue 1 year ago

AI-Casanova commented 1 year ago

I'm quite shaky on the math, but would it be possible to pad a lower-rank LoRA (with zeros, or something else) so that it can be merged with a higher-rank one?

I've been playing with SVD distillation at rank 256 and might want to merge in a trained LoRA, but I don't have the time or resources to fully train a high-rank LoRA.
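
For concreteness, here is a rough sketch of the zero-padding idea for a single linear module (placeholder names, not sd-scripts code; conv modules, dtype/device handling, and alpha scaling are ignored):

```python
import torch

def pad_lora_to_rank(lora_down: torch.Tensor, lora_up: torch.Tensor, target_rank: int):
    """Zero-pad a linear LoRA pair to a larger rank.

    lora_down: [rank, in_dim], lora_up: [out_dim, rank].
    The added rows/columns are zeros, so lora_up @ lora_down (the delta that
    would be added to the model weight) is unchanged; only the stored rank grows.
    """
    rank, in_dim = lora_down.shape
    out_dim = lora_up.shape[0]
    assert target_rank >= rank, "can only pad up to a larger rank"

    padded_down = torch.zeros(target_rank, in_dim, dtype=lora_down.dtype)
    padded_down[:rank] = lora_down

    padded_up = torch.zeros(out_dim, target_rank, dtype=lora_up.dtype)
    padded_up[:, :rank] = lora_up
    return padded_down, padded_up

# The padded pair produces the same delta:
# torch.allclose(padded_up @ padded_down, lora_up @ lora_down)  -> True
```

One caveat: if the effective scale of a module is alpha / rank, then changing the stored rank also changes the scale, so alpha would presumably need to be multiplied by target_rank / rank (or the weights rescaled) to keep the effective delta the same.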

kohya-ss commented 1 year ago

I am not familiar with the mathematical theory, but I think the same formula used for merging LoRA weights into the model weights could be applied here. I will investigate.
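
For reference, the merge formula in question is, roughly, the usual LoRA update for a linear layer (a simplified sketch with placeholder names, ignoring conv layers and dtype/device handling):

```python
import torch

def merge_lora_into_weight(weight: torch.Tensor,
                           lora_down: torch.Tensor,
                           lora_up: torch.Tensor,
                           alpha: float,
                           multiplier: float = 1.0) -> torch.Tensor:
    """Standard LoRA merge for a linear layer:
    W' = W + multiplier * (alpha / rank) * (up @ down)
    """
    rank = lora_down.shape[0]
    scale = multiplier * alpha / rank
    return weight + scale * (lora_up @ lora_down)
```

Since zero-padding leaves up @ down unchanged, the same update should apply to a padded lower-rank LoRA as long as the alpha / rank scale is kept consistent.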