Is it normal to get a negative loss? There is no constraint ensuring that the Probabilistic Chamfer Loss is positive:
```python
sigma_src_dst = (sigma_src + selected_sigma_dst) / 2
forward_loss = (torch.log(sigma_src_dst) + src_dst_min_dist / sigma_src_dst).mean()
```
The sigma is constrained to be positive but can be smaller than 1, so log(sigma_src_dst) can be negative. And indeed, during my training the loss does go negative. Is this normal?
You are right, the loss can be negative; that's totally normal.
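For intuition, for a fixed distance d the per-point term log(sigma) + d / sigma is minimized at sigma = d, where it equals log(d) + 1, which is negative whenever d < 1/e. A minimal sketch (the tensor values below are made up for illustration, not taken from the repo):

```python
import torch

# Hypothetical small point-to-point distances and their predicted sigmas;
# sigma is positive but smaller than 1, so log(sigma) < 0 dominates.
src_dst_min_dist = torch.tensor([0.01, 0.05, 0.10])
sigma_src_dst = torch.tensor([0.05, 0.10, 0.20])

forward_loss = (torch.log(sigma_src_dst) + src_dst_min_dist / sigma_src_dst).mean()
print(forward_loss)  # ~ -1.90: the loss is negative, as expected
```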
Thanks for your reply!