HIPS / autograd

Efficiently computes derivatives of NumPy code.
MIT License

autograd returns nan with the norm function #617

Open giangbang opened 8 months ago

giangbang commented 8 months ago

Hi, I'm using autograd to compute the gradient of a squared l2 norm; the code is as simple as:

import autograd.numpy as np
from autograd import grad

def f(x):
    return np.linalg.norm(x, axis=-1)**2

f_dx = grad(f)
f_dx(np.array([[0., 0.]]))

However, when I evaluate the gradient at the zero vector, it outputs nan:

>> \autograd\numpy\linalg.py:100: RuntimeWarning: invalid value encountered in scalar divide                                                          
  return expand(g / ans) * x 
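Mathematically the gradient of norm(x)**2 at the origin is 0, but autograd applies the chain rule through np.linalg.norm, whose VJP (the `expand(g / ans) * x` line in the warning above) divides by ans = norm(x) = 0. A minimal plain-NumPy sketch of why that yields nan rather than 0 (the variable names here are illustrative, not autograd's):

```python
import numpy as np

x = np.array([0., 0.])
ans = np.linalg.norm(x)      # 0.0 at the origin
g = 2 * ans                  # incoming gradient from norm(x)**2, also 0.0
with np.errstate(invalid="ignore"):
    grad_norm = g / ans * x  # 0/0 -> nan, and nan * 0 stays nan
print(grad_norm)             # [nan nan]
```

So even though the outer gradient g is 0, the 0/0 inside norm's VJP poisons the result.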

When I change the code to an equivalent version that does not go through linalg, it produces 0 as expected:

def f(x):
    return np.sum(np.square(x))
f_dx = grad(f)
f_dx(np.array([[0, 0]], dtype=float))
array([[0., 0.]])

https://github.com/HIPS/autograd/blob/9a90bd6172d1882235c326c56c17a9540357d86b/autograd/numpy/linalg.py#L100