Closed by LanTianBy, 3 months ago
Thank you for your interest!
I haven't fully debugged the code regarding your issue yet, but the intended behavior matches the paper figure (the first in your image). It could be my mistake, but I believe the output difference between the first and second figures would be negligible, because features at adjacent time steps are similar.
I haven't tested our method under the Improved DDPM settings, but I think additional attention analysis would be needed to adapt it to their architecture (e.g., selecting which layer performs "style injection"). Improved DDPM seems similar to the DiffuseIT style transfer setting, so it might be more important to analyze the U-Net bottleneck or the skip connections in their setting, as noted in DiffuseIT [A] (they considered both for style transfer). It could be worthwhile to try "style injection" in other attention layers (e.g., attention layers near the bottleneck).
[A] Jeong, et al. "Training-free Content Injection using h-space in Diffusion Models." WACV. 2024.
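For readers unfamiliar with what "style injection" in an attention layer means here, a minimal NumPy sketch is given below. It is not the repository's actual implementation; the function names and shapes are hypothetical. The idea is that during the content denoising pass, a chosen self-attention layer keeps its own queries but substitutes the keys and values collected from the style image's pass, so attention retrieves style features at content-determined locations.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Standard scaled dot-product attention: (n_q, d) x (n_k, d) -> (n_q, d).
    d = q.shape[-1]
    weights = softmax(q @ k.T / np.sqrt(d))
    return weights @ v

def style_injected_attention(q_content, k_style, v_style):
    # "Style injection" (sketch): queries come from the content pass,
    # while keys/values are swapped in from the style image's pass
    # at the same layer and time step.
    return attention(q_content, k_style, v_style)

# Toy features standing in for flattened U-Net attention tokens.
rng = np.random.default_rng(0)
q_c = rng.normal(size=(4, 8))   # content queries: 4 tokens, dim 8
k_s = rng.normal(size=(6, 8))   # style keys: 6 tokens, dim 8
v_s = rng.normal(size=(6, 8))   # style values
out = style_injected_attention(q_c, k_s, v_s)
```

The open question in the reply is *where* to apply this swap in an Improved DDPM U-Net; the sketch itself is layer-agnostic.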
Thank you very much for your prompt and patient reply, and thank you again for your wonderful work! Wishing you all the best!
Thank you for your amazing work! I have some questions about code details, and I would greatly appreciate your answers.
Thank you again for your wonderful work, and I wish you all the best in your future work!