sudban3089 / ID-Preserving-Facial-Aging

Identity-Preserving Aging of Face Images via Latent Diffusion Models [IJCB 2023]

About the implementation using biomloss #4

Closed: shiiiijp closed this issue 3 months ago

shiiiijp commented 3 months ago

Hi! I'm very interested in your work. I have a few questions regarding your implementation.

From your paper, I understood that only the VAE is trained with biomloss, but your implementation (v1-finetune_biomloss.yaml) seems to suggest that both the VAE and the UNet are meant to be trained. Which one is correct?

Additionally, it looks like the pytorch-lightning trainer only treats the UNet as the trainable model with this script and these configs (judging from the instantiate_from_config function in ldm/util.py), which means the biomloss is ignored. Or am I missing something?

sudban3089 commented 3 months ago

Thank you for your interest in our work. Since the biometric loss gave limited performance, we provided only a generic script for it, for ease of use. We experimented with the biometric loss in several ways.

First, we fine-tuned the VAE with facenet-pytorch, using FRmodel = InceptionResnetV1(pretrained='vggface2').eval() and computing the loss between the target and reconstructed embeddings produced by the FR model. This loss had to be computed in ldm/modules/losses/contperceptual.py and required the config file from ldm/configs/autoencoder. Please refer to the DreamBooth repo for examples of autoencoder config files.

Next, we tried fine-tuning the U-Net with AngularPenaltySoftMaxLoss, which covers ArcFace, CosFace, and SphereFace. However, we observed that the biometric loss, whether applied at the VAE or at the U-Net, did not provide the desired performance. So we used the contrastive loss with the U-Net (v1-finetune_contrastiveloss.yaml), which produced the best results, and we recommend using that.
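For reference, a minimal sketch of this kind of embedding-based biometric loss with facenet-pytorch could look like the following. The helper name biometric_loss, the 160x160 resize, and the cosine-distance formulation are illustrative assumptions, not the exact code in contperceptual.py:

```python
# Sketch of an embedding-based biometric loss using facenet-pytorch.
# The resize, normalization, and cosine-distance form are illustrative and
# may differ from the actual implementation in contperceptual.py.
import torch
import torch.nn.functional as F
from facenet_pytorch import InceptionResnetV1

# Frozen face-recognition model, used only to extract identity embeddings.
FRmodel = InceptionResnetV1(pretrained='vggface2').eval()
for p in FRmodel.parameters():
    p.requires_grad = False

def biometric_loss(reconstructions, targets):
    """Penalize identity drift between reconstructed and target face images."""
    # InceptionResnetV1 expects 160x160 inputs, standardized to roughly [-1, 1].
    recon = F.interpolate(reconstructions, size=(160, 160), mode='bilinear', align_corners=False)
    targ = F.interpolate(targets, size=(160, 160), mode='bilinear', align_corners=False)
    emb_recon = FRmodel(recon)
    with torch.no_grad():
        emb_targ = FRmodel(targ)
    # 1 - cosine similarity between the two identity embeddings.
    return (1.0 - F.cosine_similarity(emb_recon, emb_targ, dim=-1)).mean()
```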

shiiiijp commented 3 months ago

Thank you for the clarification! Now I understand the intention behind the provided implementation. As you suggested, I will try using the contrastive loss.

Again, much appreciation for your work; it's been inspiring.