eeyrw opened this issue 1 year ago
You should try an fp32 VAE and optimizer.
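A minimal sketch of that suggestion, assuming a diffusers-style setup (the checkpoint name and learning rate here are only illustrative, not from this thread):

```python
import torch
from diffusers import AutoencoderKL

# Load the VAE in fp32 and train with full-precision Adam instead of adam8bit,
# so small gradients from the perceptual loss don't underflow in fp16.
vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse",
                                    torch_dtype=torch.float32)
optimizer = torch.optim.Adam(vae.parameters(), lr=1e-5)
```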
It's really amazing that you know I use adam8bit and fp16.
@eeyrw do you see any improvement after finetuning the VAE?
No. I don't have enough GPU RAM, so I couldn't experiment further.
@eeyrw I got NaN too, but not there. It was in https://github.com/CompVis/taming-transformers/blob/master/taming/modules/losses/lpips.py#L117. I fixed the NaN by replacing that line with `torch.sqrt(torch.sum(x**2, dim=1, keepdim=True) + eps)`.
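For context, a sketch of why this patch helps (the function at that line is `normalize_tensor`): `sqrt` has an infinite derivative at 0, so when an fp16 sum of squares underflows to zero, the chain rule multiplies an infinite term into the backward pass and the gradients come out NaN. Moving `eps` inside the `sqrt` keeps both the forward value and the gradient finite:

```python
import torch

def normalize_tensor(x, eps=1e-10):
    # Original line 117: fine in forward, but for an all-zero channel the
    # derivative of sqrt at 0 is infinite, producing NaN gradients in backward.
    # norm_factor = torch.sqrt(torch.sum(x**2, dim=1, keepdim=True))

    # Patched line from the comment above: eps inside the sqrt keeps the
    # gradient finite even when the sum underflows to zero.
    norm_factor = torch.sqrt(torch.sum(x**2, dim=1, keepdim=True) + eps)
    return x / (norm_factor + eps)
```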
@keyu-tian Nice eps trick; it improves numerical stability a lot 😀
I found what causes the NaN here: ldm/modules/losses/contperceptual.py
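For anyone trying to track this down themselves, plain PyTorch anomaly detection (not specific to this repo) will raise at the exact op that produced the NaN. The toy example below reproduces the `normalize_tensor` failure mode on an all-zero input:

```python
import torch

# Anomaly detection makes backward() raise at the op that emitted the NaN.
torch.autograd.set_detect_anomaly(True)

x = torch.zeros(1, 4, requires_grad=True)                 # all-zero input
norm = torch.sqrt(torch.sum(x**2, dim=1, keepdim=True))   # sqrt(0)
(x / (norm + 1e-10)).sum().backward()  # RuntimeError names 'SqrtBackward0'
```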