sh3rlock14 opened this issue 1 year ago
Agreed, the negative sign was already applied earlier, so it should not appear again.

Agreed. Line #143 already has a negative sign.
This also happens in `mssim_vae.py` at line #155:

```python
kld_loss = torch.mean(-0.5 * torch.sum(1 + log_var - mu ** 2 - log_var.exp(), dim = 1), dim = 0)
loss = recons_loss + kld_weight * kld_loss
return {'loss': loss, 'Reconstruction_Loss':recons_loss, 'KLD':-kld_loss}
```
I think that in `vanilla_vae.py`'s `loss_function` there is a mistake in the returned KLD value:

```python
return {'loss': loss, 'Reconstruction_Loss': recons_loss.detach(), 'KLD': -kld_loss.detach()}
```

The negative sign (-) should not be there!
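To see why the extra minus sign looks wrong, here is a minimal scalar sketch of the closed-form KL term used in the repository (the function name and the scalar simplification are mine, not the repository's; the actual code operates on batched tensors):

```python
import math

def kld_loss(mu, log_var):
    # Scalar version of the line under discussion:
    #   kld_loss = torch.mean(-0.5 * torch.sum(1 + log_var - mu**2 - log_var.exp(), dim=1), dim=0)
    # This is the closed-form KL(N(mu, sigma^2) || N(0, 1)), with sigma^2 = exp(log_var).
    return -0.5 * (1 + log_var - mu ** 2 - math.exp(log_var))

# KL divergence is always non-negative, and kld_loss already carries the
# leading -0.5, so it IS the (non-negative) KL value:
print(kld_loss(mu=1.0, log_var=0.0))   # 0.5  (sigma = 1, shifted mean)
print(kld_loss(mu=0.0, log_var=0.0))   # 0.0  (identical distributions)

# Returning {'KLD': -kld_loss} would therefore log a negative number for a
# non-negative divergence -- the extra minus flips the sign a second time.
print(-kld_loss(mu=1.0, log_var=0.0))  # -0.5
```

So the loss itself (`loss = recons_loss + kld_weight * kld_loss`) is computed correctly; only the logged `'KLD'` entry is negated.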