Open kmpartner opened 3 months ago
Can you provide more details into how you're approaching the training? Maybe provide some code snippets, etc.?
Thank you for the response. I just followed the code in the controlnet folder, using distilled models (models with the mid block removed, such as "nota-ai/bk-sdm-tiny"):
```shell
accelerate launch train_controlnet.py \
  --pretrained_model_name_or_path="nota-ai/bk-sdm-tiny" \
  --dataset_name=fusing/fill50k \
  --resolution=512 \
  --learning_rate=1e-5 \
  --validation_image "./conditioning_image_1.png" "./conditioning_image_2.png" \
  --validation_prompt "red circle with blue background" "cyan circle with brown floral background" \
  --train_batch_size=4
```
Before the training iterations started, the error below was displayed, probably because the mid block is removed from the UNet.
```
Traceback (most recent call last):
  File "/content/diffusers/examples/controlnet/train_controlnet.py", line 1187, in
```
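The failure is consistent with the ControlNet being initialized from the UNet's config, which in BK-SDM has no mid block, while the ControlNet builder only knows how to construct the standard mid block type. Below is a minimal illustrative sketch of that compatibility check using plain dicts as stand-in configs (the helper function and the stand-in configs are hypothetical, not a diffusers API; loading the real checkpoint configs would require downloading the models):

```python
# Hypothetical compatibility check illustrating why a mid-block-removed
# UNet (e.g. BK-SDM) breaks ControlNet initialization: the ControlNet
# config is copied from the UNet, and only the standard cross-attention
# mid block type is supported there.

SUPPORTED_MID_BLOCKS = {"UNetMidBlock2DCrossAttn"}

def can_init_controlnet_from(unet_config: dict) -> bool:
    """Return True if a ControlNet could likely be built from this UNet config."""
    mid = unet_config.get("mid_block_type", "UNetMidBlock2DCrossAttn")
    return mid in SUPPORTED_MID_BLOCKS

# Stand-in configs (illustrative, not the actual checkpoint configs):
sd15_like = {"mid_block_type": "UNetMidBlock2DCrossAttn"}
bk_sdm_like = {"mid_block_type": None}  # BK-SDM removes the mid block

print(can_init_controlnet_from(sd15_like))   # standard SD 1.x UNet: OK
print(can_init_controlnet_from(bk_sdm_like)) # distilled UNet: not supported
```

If the check fails, one would either need to patch the ControlNet construction to tolerate a missing mid block, or train against a non-distilled base model.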
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Is it possible to train a ControlNet for a distilled model (Block-removed Knowledge-distilled Stable Diffusion, https://github.com/Nota-NetsPresso/BK-SDM)?
When I used the scripts in the examples folder with a distilled model, it returned an error similar to "unknown mid_block_type ...", probably because the UNet structure has changed. Are there any ways to train a ControlNet for a distilled model?