ghost closed this issue 1 year ago
There are two ways to implement LoRA. One is to do the auxiliary low-rank multiplications at runtime alongside the main one (i.e. LoRA layers). The other is to merge the weights first. We opt to merge the weights so the network structure stays the same, which lets us reuse things such as CoreML too.
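A minimal sketch of the equivalence between the two approaches (shapes, the `alpha` scale, and NumPy usage here are illustrative assumptions, not this repo's actual code): the runtime formulation computes `W @ x + scale * B @ A @ x`, while merging folds `scale * B @ A` into `W` once so the forward pass is a single matmul.

```python
import numpy as np

# Hypothetical dimensions for illustration; rank r << d_in, d_out.
d_in, d_out, r = 64, 32, 4
alpha = 8.0                 # common LoRA convention: scale = alpha / r
scale = alpha / r

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))  # frozen base weight
A = rng.standard_normal((r, d_in))      # LoRA down-projection
B = rng.standard_normal((d_out, r))     # LoRA up-projection
x = rng.standard_normal(d_in)

# Option 1: LoRA layers -- extra multiplications on every forward pass.
y_runtime = W @ x + scale * (B @ (A @ x))

# Option 2: merge once up front; the layer shape is unchanged afterwards,
# so exports that expect a plain weight matrix (e.g. CoreML) still work.
W_merged = W + scale * (B @ A)
y_merged = W_merged @ x

# Both paths produce the same output up to floating-point error.
assert np.allclose(y_runtime, y_merged)
```

The trade-off: merging costs a one-time weight update and loses the ability to hot-swap adapters, but keeps inference identical to the base model.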
Okay thanks
There is an example for LoRA, but the layers are not present in the Unet class.