snap-research / weights2weights

Official Implementation of weights2weights

unstable optimization in inversion #1

Closed yaseryacoob closed 3 months ago

yaseryacoob commented 3 months ago

Thanks for sharing your work. Looking at the loss in inversion.py, I noticed that the optimization is not stable, even when I allowed 1000 iterations. You can see model_pred (just 3 out of 4 channels) displayed every 100 iterations, with the loss recorded in the series of images below (using your example). The final loss at iteration 999 was 0.040231965482234955.

iteration 0: loss 0.002037961967289448
iteration 100: loss 0.07902728021144867
iteration 200: loss 0.4763336181640625
iteration 300: loss 0.042885422706604004
iteration 400: loss 0.08340265601873398
iteration 500: loss 0.08626551926136017
iteration 600: loss 0.07441602647304535
iteration 700: loss 0.36750757694244385
iteration 800: loss 0.36185115575790405
iteration 900: loss 0.008720280602574348
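For reference, a minimal sketch of the kind of monitoring described above, assuming a standard diffusion optimization loop where `model_pred` is the UNet's 4-channel latent prediction; the function name, arguments, and file naming here are illustrative, not the actual code in inversion.py:

```python
from torchvision.utils import save_image

def log_inversion_step(step, loss, model_pred, every=100, out_dir="."):
    """Record the per-step loss and save a preview of the model prediction.

    model_pred is assumed to be a (B, 4, H, W) latent-space tensor; only the
    first 3 channels are kept so the preview can be written as an RGB image.
    """
    if step % every != 0:
        return
    preview = model_pred[:, :3].detach().float().cpu()
    # min-max normalize so the latent channels are visible as an image
    preview = (preview - preview.min()) / (preview.max() - preview.min() + 1e-8)
    save_image(preview, f"{out_dir}/noised_{step:04d}_loss_{float(loss):.6f}.png")
    print(f"step {step}: loss {float(loss):.6f}")
```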

avdravid commented 3 months ago

Hi. Thank you for posting your issue. The loss landscape of diffusion models is very noisy, so you cannot really infer much from the loss at any single iteration. You would see similarly noisy optimization with standard DreamBooth or fine-tuning. The best check is to look at what the generated samples look like. Let me know if you need any further guidance.
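As a rough way of acting on this advice, here is a minimal sketch (not part of the repository) that smooths the per-step loss with an exponential moving average and notes where one might decode an intermediate sample for inspection; `vae`, `latents`, and `num_steps` are placeholder names for whatever the inversion loop actually uses, and the decode call assumes a diffusers-style VAE:

```python
class EMALoss:
    """Exponential moving average of the noisy per-step diffusion loss."""

    def __init__(self, beta=0.99):
        self.beta = beta
        self.value = None

    def update(self, loss):
        loss = float(loss)
        if self.value is None:
            self.value = loss
        else:
            self.value = self.beta * self.value + (1 - self.beta) * loss
        return self.value


# Sketch of how it might sit inside the inversion loop (placeholder names):
# ema = EMALoss()
# for step in range(num_steps):
#     loss = ...                      # per-step diffusion MSE
#     smoothed = ema.update(loss)     # the trend is more informative than the raw value
#     if step % 100 == 0:
#         # decode the current latents to pixel space and inspect the sample
#         with torch.no_grad():
#             image = vae.decode(latents / vae.config.scaling_factor).sample
```

The raw loss jumping around, as in the values posted above, is expected because each step samples a different timestep and noise; the smoothed curve and the decoded samples are a better indication of whether the inversion is actually converging.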