HIPS / autograd

Efficiently computes derivatives of NumPy code.

[BUG] Differentiating `autograd.numpy.linalg.norm` gives incorrect results #612

Open mudit2812 opened 1 year ago

mudit2812 commented 1 year ago

This came to light via an issue on the PennyLane GitHub. Consider the following function:

import autograd
import autograd.numpy as np

def get_sv(theta, gamma=0):
    coeff = np.array([((1 + np.exp(-1.j * theta)) / 2) * (np.sqrt(1 - gamma)) , (1 - np.exp(-1.j * theta)) / 2]) / (1 - gamma * (np.cos(theta)**2))
    x = coeff / np.sqrt(np.sum(np.real(coeff) ** 2 + np.imag(coeff) ** 2))
    res = np.real(x) ** 2 + np.imag(x) ** 2
    return res

x = np.array(10.)
autograd.jacobian(get_sv, 0)(10.0)
array([ 0.27201056, -0.27201056])

The norm of coeff is 1.0 here (with gamma = 0, |1 + exp(-1j*theta)|**2 / 4 + |1 - exp(-1j*theta)|**2 / 4 = 1), so I get the same jacobian if I remove the normalization step entirely. However, if I use np.linalg.norm for the normalization instead, I get a different jacobian:

def get_sv(theta, gamma=0):
    coeff = np.array([((1 + np.exp(-1.j * theta)) / 2) * (np.sqrt(1 - gamma)) , (1 - np.exp(-1.j * theta)) / 2]) / (1 - gamma * (np.cos(theta)**2))
    x = coeff / np.linalg.norm(coeff)
    res = np.real(x) ** 2 + np.imag(x) ** 2
    return res

x = np.array(10.)
autograd.jacobian(get_sv, 0)(10.0)
array([0.34547   , 0.56747525])
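
To check which of the two jacobians the numerics agree with, a central finite difference can be compared against both versions of get_sv (a rough sketch; the fd_jacobian helper and its step size are mine, not part of the original report):

import autograd
import autograd.numpy as np

# Central finite-difference jacobian of a function of a real scalar,
# used only as a numerical reference for the autograd results above.
def fd_jacobian(f, theta, eps=1e-6):
    return (f(theta + eps) - f(theta - eps)) / (2 * eps)

print(fd_jacobian(get_sv, 10.0))           # numerical reference
print(autograd.jacobian(get_sv, 0)(10.0))  # autograd result for this get_sv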

System info:

Platform info:           macOS-13.6.1-arm64-arm-64bit
Python version:          3.10.12
Autograd version:        1.6.2
Numpy version:           1.26.1
Scipy version:           1.11.1
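
One way to narrow this down further might be to differentiate the two norm implementations in isolation on a complex-valued vector. The functions below are a sketch of mine, not from the original report, and the example vector is arbitrary:

import autograd
import autograd.numpy as np

# Same complex vector, two norm implementations: np.linalg.norm versus the
# explicit sqrt-of-sum-of-squares used in the working snippet above.
def norm_linalg(theta):
    v = np.array([np.exp(-1.j * theta), 1.0 - np.exp(-1.j * theta)])
    return np.linalg.norm(v)

def norm_manual(theta):
    v = np.array([np.exp(-1.j * theta), 1.0 - np.exp(-1.j * theta)])
    return np.sqrt(np.sum(np.real(v) ** 2 + np.imag(v) ** 2))

print(autograd.grad(norm_linalg)(10.0))
print(autograd.grad(norm_manual)(10.0))

If the two gradients disagree while the forward values agree, that would point at the gradient rule registered for np.linalg.norm when the argument is complex.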