guoyww / AnimateDiff

Official implementation of AnimateDiff.
https://animatediff.github.io/
Apache License 2.0

Asking about image-layer finetuning in UNet training #342

Open dreamyou070 opened 2 months ago

dreamyou070 commented 2 months ago

Hi, I want to follow the AnimateDiff training procedure. According to the tutorial (https://github.com/guoyww/AnimateDiff/blob/main/__assets__/docs/animatediff.md), the first step is finetuning with the configs/training/v1/image_finetune.yaml config.

However, the paper describes no UNet finetuning step, only LoRA training for domain adaptation. Can you clarify what the UNet image-layer finetuning means?
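As I understand it, "image layers" here means the spatial (per-frame) parameters of the UNet, as opposed to the temporal motion modules trained later. A minimal sketch of how the two groups could be separated by parameter name (the `motion_modules` substring is an assumption based on this repo's naming convention, so check your checkpoint's actual keys):

```python
def split_unet_param_names(param_names):
    """Partition UNet parameter names into image (spatial) layers and
    temporal motion-module layers.

    Assumes temporal parameters contain the substring 'motion_modules'
    in their names (a convention in this repo); adjust if your keys
    differ.
    """
    image_layers, motion_layers = [], []
    for name in param_names:
        if "motion_modules" in name:
            motion_layers.append(name)
        else:
            image_layers.append(name)
    return image_layers, motion_layers


# Hypothetical example keys, for illustration only:
names = [
    "down_blocks.0.attentions.0.to_q.weight",
    "down_blocks.0.motion_modules.0.proj.weight",
]
image_layers, motion_layers = split_unet_param_names(names)
```

During the image_finetune stage, presumably only the first group would receive gradients, while motion-module training later freezes it and updates the second.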

zoucheng1991 commented 1 month ago

I have the same question. The paper says only a LoRA is needed for domain adaptation, but in this project the whole UNet is updated during the image_finetune stage.
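For contrast, the LoRA-style domain adaptation described in the paper would leave the pretrained weight frozen and learn only a low-rank update. A minimal NumPy sketch of the idea (not AnimateDiff's actual implementation; class and variable names are illustrative):

```python
import numpy as np

class LoRALinear:
    """Sketch of a LoRA-augmented linear layer: the frozen base weight W
    is supplemented by a trainable low-rank update B @ A, scaled by
    alpha / rank. Only A and B would be trained for domain adaptation.
    """

    def __init__(self, weight, rank=4, alpha=4.0):
        out_features, in_features = weight.shape
        self.weight = weight                     # frozen pretrained weight
        self.A = np.zeros((rank, in_features))   # down-projection (usually Gaussian init)
        self.B = np.zeros((out_features, rank))  # up-projection, zero init
        self.scale = alpha / rank

    def __call__(self, x):
        base = x @ self.weight.T
        update = (x @ self.A.T) @ self.B.T       # low-rank correction
        return base + self.scale * update
```

Because B starts at zero, the layer initially reproduces the frozen base output exactly, which is why such an adapter can be attached without disturbing the pretrained model, unlike the full-UNet update in image_finetune.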