The Laplacian Aleatoric Uncertainty Loss accounts for uncertainty in the predictions. The idea is to weight the loss by the predicted uncertainty (represented by log_variance).
However, the loss goes negative during training. Is this normal, or have the authors observed this behavior of the depth loss?
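For context, a common form of this loss (the exact implementation may differ in this repo) is the negative log-likelihood of a Laplace distribution with the constant term dropped, which makes it unbounded below: when the residual is small and the predicted log_variance is negative (i.e. the network is confident), the log_variance term dominates and the loss is negative. A minimal sketch under that assumption:

```python
import math

def laplacian_aleatoric_uncertainty_loss(pred, target, log_variance):
    # Laplace negative log-likelihood with the constant term dropped:
    #   L = sqrt(2) * exp(-log_variance) * |pred - target| + log_variance
    # The log_variance term can make L negative; this is expected.
    return math.sqrt(2.0) * math.exp(-log_variance) * abs(pred - target) + log_variance

# Small residual + confident (negative) log_variance -> negative loss
print(laplacian_aleatoric_uncertainty_loss(10.0, 10.01, -2.0))
```

If this matches the implementation, a negative loss value is not a bug: only relative changes in the loss matter for optimization, not its sign.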