amdegroot / ssd.pytorch

A PyTorch Implementation of Single Shot MultiBox Detector

why use x_max while computing softmax in function log_sum_exp()? #203

Open halfAnEngineer opened 6 years ago

halfAnEngineer commented 6 years ago

```python
def log_sum_exp(x):
    x_max = x.data.max()
    return torch.log(torch.sum(torch.exp(x - x_max), 1, keepdim=True)) + x_max
```

In this function, if we remove `x_max`, the output is exactly the same, so why do we need `x_max` at all?

lemairecarl commented 6 years ago

For numerical stability. This is almost always done when computing the softmax function: https://stackoverflow.com/questions/42599498/numercially-stable-softmax
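To make the point concrete, here is a minimal sketch (not part of the repo) showing what goes wrong without the shift. It relies on the identity `log(sum(exp(x))) = x_max + log(sum(exp(x - x_max)))`, which is why the two versions agree whenever the naive one doesn't overflow:

```python
import torch

# Large logits: exp(1000.0) overflows float32 and becomes inf.
x = torch.tensor([[1000.0, 1000.0]])

# Naive version: overflows, result is inf instead of a finite number.
naive = torch.log(torch.sum(torch.exp(x), 1, keepdim=True))
print(naive)  # tensor([[inf]])

# Shifted version: exp(x - x_max) stays in [0, 1], so no overflow,
# and the result is mathematically identical to the naive formula.
x_max = x.max()
stable = torch.log(torch.sum(torch.exp(x - x_max), 1, keepdim=True)) + x_max
print(stable)  # tensor([[1000.6931]])  i.e. 1000 + log(2)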

halfAnEngineer commented 6 years ago

Thank you, got it.