biggzlar / plausible-uncertainties

Methods used in the paper "Plausible Uncertainties for Human Pose Regression".
https://openaccess.thecvf.com/content/ICCV2023/papers/Bramlage_Plausible_Uncertainties_for_Human_Pose_Regression_ICCV_2023_paper.pdf
MIT License

About NIG_REG loss #1

Open liujiyaoFDU opened 10 months ago

liujiyaoFDU commented 10 months ago

Hi, thanks for your excellent work! I have a question about the NIG regularization term. In the paper, Eq. (5) defines it as $\lambda\,|y_{kd} - \hat{\mu}_{kd}|\,(2\hat{\alpha}_{kd} + \hat{\upsilon}_{kd})$.

But in your code, it was implemented as

import torch

def NIG_REG(y_true, gamma, alpha, beta, reduce=False):
    # absolute error scaled by the expected aleatoric variance beta / (alpha - 1)
    error = torch.abs(y_true - gamma) / (beta * torch.reciprocal(alpha - 1.0))
    # evidence term: 2 * alpha, without the upsilon term from Eq. (5)
    evi = 2 * alpha
    reg = error * evi

    return torch.mean(reg) if reduce else reg

This conflicts with the formula in the paper. Could you please explain why this is done, or which of the two implementations is more effective? Looking forward to your reply.
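For reference, the two regularizers can be compared side by side on a single toy value. This is only an illustration with made-up numbers; the variable names follow the NIG parameterization above (gamma for the predicted mean, v for upsilon):

```python
import torch

# Hypothetical toy values for one prediction (not from the paper or repo).
y_true = torch.tensor([1.0])
gamma  = torch.tensor([0.6])   # predicted mean mu_hat
v      = torch.tensor([1.5])   # upsilon_hat (virtual observations)
alpha  = torch.tensor([2.0])
beta   = torch.tensor([0.8])

# Paper, Eq. (5), with lambda = 1: |y - mu_hat| * (2*alpha_hat + upsilon_hat)
reg_paper = torch.abs(y_true - gamma) * (2 * alpha + v)

# Released code: the error is divided by the expected aleatoric variance
# E[sigma^2] = beta / (alpha - 1), and the evidence term drops upsilon.
reg_code = torch.abs(y_true - gamma) / (beta / (alpha - 1.0)) * (2 * alpha)

print(reg_paper.item(), reg_code.item())  # the two terms generally differ
```

The difference is twofold: the evidence factor changes from 2*alpha + upsilon to 2*alpha, and the residual is rescaled by the expected aleatoric variance rather than used raw.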

biggzlar commented 9 months ago

Apologies for the late reply. This is an incomplete experiment that snuck its way into the released version. In the accompanying toy problem, the correct aleatoric uncertainty is only recovered if we use evi = 2 * alpha. Dividing the error term by the expected aleatoric variance also appears to improve calibration.
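For context, a minimal sketch of how such a regularizer typically combines with the NIG negative log-likelihood in evidential regression (the NLL form follows Amini et al., "Deep Evidential Regression"; the weight lam is a hypothetical hyperparameter, and this is not necessarily the exact training loss used in the repo):

```python
import torch

def nig_nll(y, gamma, v, alpha, beta):
    # Standard NIG negative log-likelihood (Amini et al., Deep Evidential Regression).
    omega = 2.0 * beta * (1.0 + v)
    return (0.5 * torch.log(torch.pi / v)
            - alpha * torch.log(omega)
            + (alpha + 0.5) * torch.log(v * (y - gamma) ** 2 + omega)
            + torch.lgamma(alpha) - torch.lgamma(alpha + 0.5))

def nig_reg(y, gamma, alpha, beta):
    # Released variant: error scaled by the expected aleatoric variance
    # E[sigma^2] = beta / (alpha - 1), with evidence term 2 * alpha.
    return torch.abs(y - gamma) / (beta / (alpha - 1.0)) * (2.0 * alpha)

def total_loss(y, gamma, v, alpha, beta, lam=0.01):
    # lam is a hypothetical regularization weight, chosen per task.
    return (nig_nll(y, gamma, v, alpha, beta)
            + lam * nig_reg(y, gamma, alpha, beta)).mean()
```

The regularizer penalizes confident (high-evidence) predictions with large residuals, while the NLL alone would let the model shrink evidence to explain away errors.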

For further discussion on the regularization term, I recommend this paper: The Unreasonable Effectiveness of Deep Evidential Regression. Take a look at the sections on total evidence.

Hope this helps!