Open mudit2812 opened 1 year ago
This came to be known due to an issue on the PennyLane GitHub. Consider the following function:
```python
import autograd
import autograd.numpy as np

def get_sv(theta, gamma=0):
    coeff = np.array(
        [((1 + np.exp(-1.j * theta)) / 2) * np.sqrt(1 - gamma),
         (1 - np.exp(-1.j * theta)) / 2]
    ) / (1 - gamma * np.cos(theta) ** 2)
    # Normalize by the norm computed "by hand"
    x = coeff / np.sqrt(np.sum(np.real(coeff) ** 2 + np.imag(coeff) ** 2))
    res = np.real(x) ** 2 + np.imag(x) ** 2
    return res

autograd.jacobian(get_sv, 0)(10.0)
```
```
array([ 0.27201056, -0.27201056])
```
The norm of `coeff` is 1.0, so I get the same Jacobian if I remove the normalization step entirely. However, if I use `np.linalg.norm` to normalize instead, I get a different Jacobian:
```python
def get_sv(theta, gamma=0):
    coeff = np.array(
        [((1 + np.exp(-1.j * theta)) / 2) * np.sqrt(1 - gamma),
         (1 - np.exp(-1.j * theta)) / 2]
    ) / (1 - gamma * np.cos(theta) ** 2)
    # Normalize with np.linalg.norm instead
    x = coeff / np.linalg.norm(coeff)
    res = np.real(x) ** 2 + np.imag(x) ** 2
    return res

autograd.jacobian(get_sv, 0)(10.0)
```
```
array([0.34547  , 0.56747525])
```
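To see which of the two results is the true derivative, the function can be checked against a central finite-difference approximation using plain NumPy (no autograd involved). This is a minimal sketch, assuming `gamma=0` as in the calls above; the step size `h` is an arbitrary choice:

```python
import numpy as np

def get_sv(theta, gamma=0):
    # Same function as in the report, written against plain NumPy.
    coeff = np.array(
        [((1 + np.exp(-1.j * theta)) / 2) * np.sqrt(1 - gamma),
         (1 - np.exp(-1.j * theta)) / 2]
    ) / (1 - gamma * np.cos(theta) ** 2)
    x = coeff / np.linalg.norm(coeff)
    return np.real(x) ** 2 + np.imag(x) ** 2

theta, h = 10.0, 1e-6
# Central finite differences approximate the true Jacobian
# without relying on autograd's np.linalg.norm gradient.
fd_jac = (get_sv(theta + h) - get_sv(theta - h)) / (2 * h)
print(fd_jac)  # ≈ [ 0.27201056 -0.27201056]
```

The finite-difference estimate agrees with the result from the hand-written normalization, which suggests the `np.linalg.norm` gradient is the one that is wrong for this complex input.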
System info:
```
Platform info:     macOS-13.6.1-arm64-arm-64bit
Python version:    3.10.12
Autograd version:  1.6.2
Numpy version:     1.26.1
Scipy version:     1.11.1
```