Well, it should support LoRA training, but I've only implemented the basic fine-tuning functionality so far.
Specifically, you can refer to the implementation of the LoRA layers in the library, add them to the UNet, freeze all the other parameters, and train only the LoRA layers.
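The idea above can be sketched roughly as follows. This is a minimal numpy illustration of a LoRA-style linear layer, not the library's actual implementation: the class name `LoRALinear`, the rank/alpha values, and the zero-initialization of `B` are all assumptions for the sketch. The base weight stays frozen while only the low-rank factors `A` and `B` would be trained.

```python
import numpy as np

class LoRALinear:
    """Illustrative LoRA wrapper (not the library's real class).

    Computes y = x @ W.T + scale * (x @ A.T) @ B.T, where W is the
    frozen base weight and only the low-rank factors A, B train.
    """

    def __init__(self, weight, rank=4, alpha=4.0, seed=0):
        rng = np.random.default_rng(seed)
        out_features, in_features = weight.shape
        self.weight = weight                               # frozen base weight
        self.A = rng.normal(0.0, 0.01, (rank, in_features))  # trainable
        self.B = np.zeros((out_features, rank))              # trainable, zero-init
        self.scale = alpha / rank

    def forward(self, x):
        # Base projection plus the low-rank LoRA update.
        return x @ self.weight.T + self.scale * (x @ self.A.T) @ self.B.T
```

Because `B` starts at zero, the layer initially reproduces the frozen base projection exactly, so wrapping the UNet's linear layers this way leaves its behavior unchanged until the LoRA factors are updated.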
Does it support a LoRA training mode?