PennyLaneAI / qml

Introductions to key concepts in quantum programming, as well as tutorials and implementations from cutting-edge quantum computing research.
https://pennylane.ai/qml
Apache License 2.0

[BUG] Backprop tutorial has an incorrect gradient result compared to other built-in methods #1050

Closed · isaacdevlugt closed this issue 7 months ago

isaacdevlugt commented 7 months ago

Reported here: https://discuss.pennylane.ai/t/possible-discrepancy-in-the-quantum-gradients-with-backpropagation-tutorial/4236
Demo: https://pennylane.ai/qml/demos/tutorial_backprop/

The gradient calculated manually via the function below doesn't match what jax.grad or qml.gradients.param_shift gives 🤔 (see the comments marking where the shifts go wrong).

def parameter_shift_term(qnode, params, i):
    shifted = params.copy()
    shifted = shifted.at[i].add(np.pi/2)
    forward = qnode(shifted)  # forward evaluation at params[i] + pi/2

    shifted = shifted.at[i].add(-np.pi/2)  # this only undoes the +pi/2 shift,
    backward = qnode(shifted)  # so this evaluates at the *original* params,
                               # not at params[i] - pi/2

    return 0.5 * (forward - backward)
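
For reference, the two-term parameter-shift rule needs the backward evaluation at params[i] - pi/2, i.e. a full pi below the forward point, so the second shift should subtract np.pi rather than np.pi/2. Below is a minimal self-contained sketch of the corrected function checked against jax.grad; the two-qubit circuit and device here are illustrative stand-ins, not the tutorial's exact circuit:

import jax
import jax.numpy as jnp
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, interface="jax")
def circuit(params):
    # illustrative circuit, not the one from the tutorial
    qml.RX(params[0], wires=0)
    qml.RY(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

def parameter_shift_term(qnode, params, i):
    shifted = params.at[i].add(jnp.pi / 2)
    forward = qnode(shifted)              # evaluation at params[i] + pi/2

    shifted = shifted.at[i].add(-jnp.pi)  # subtract a full pi, landing at params[i] - pi/2
    backward = qnode(shifted)             # evaluation at params[i] - pi/2

    return 0.5 * (forward - backward)

params = jnp.array([0.1, 0.2])
print(parameter_shift_term(circuit, params, 0))  # manual parameter shift
print(jax.grad(circuit)(params)[0])              # should now agree with the above

With the -jnp.pi shift, the manual value should also agree with qml.gradients.param_shift, since all three compute the same derivative.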