Thanks for opening this issue, @majafranz.
Adding some extra details after an initial investigation.
A more localized example of the issue:
```python
import pennylane as qml

@qml.qnode(qml.device('default.qubit'), diff_method="parameter-shift")
def circuit(x, data):
    qml.RX(x[0], 0)
    qml.RX(x[1], 0)
    qml.RY(data, 0)
    return qml.probs(wires=0)

x = qml.numpy.array([0.5, 0.8], requires_grad=True)
data = qml.numpy.array([1.2, 2.3, 3.4], requires_grad=False)

circuit(x, data)                # forward pass succeeds
qml.jacobian(circuit)(x, data)  # raises the error
```
The combination of things that gives rise to this issue:

1. Batching in the non-trainable data
2. A measurement with a shape (i.e. `probs`)
3. More than one trainable parameter in the circuit

Removing any one of these characteristics allows it to work (see the sketch below).
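For illustration, here are quick sketches of variants that each drop one characteristic; these are not verified against the exact versions in this report:

```python
# Each variant removes one of the three characteristics listed above and,
# per the investigation, avoids the failure.

# 1) No batching in the non-trainable data:
data_scalar = qml.numpy.array(1.2, requires_grad=False)
qml.jacobian(circuit)(x, data_scalar)  # works

# 2) Scalar-shaped measurement: replace `return qml.probs(wires=0)`
#    with `return qml.expval(qml.PauliZ(0))` in the circuit body.

# 3) Single trainable parameter: drop the second `qml.RX(x[1], 0)` and
#    use x = qml.numpy.array([0.5], requires_grad=True).
```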
Expected behavior
When having batched inputs for a QNode (in the example script below: input of shape `(5, 1)`), I want to calculate the probabilities of the outcomes of the circuit and then use the probability of outcome "0" for each input. That is, I obtain an output of shape `(5, 2)` from the circuit, where the first dimension (5) represents the batch dimension and the second dimension (2) represents the respective probabilities for either "0" or "1". The probabilities for outcome "0" can be obtained by indexing, i.e. when the outcomes of the QNode are `prediction` (shape `(5, 2)`), the probabilities for "0" would be `prediction[:, 0]` (shape `(5,)`).

The "forward" pass works as expected. One would also expect to obtain gradients w.r.t. the parameters, e.g. when calling `qml.AdamOptimizer().step_and_cost`.
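A compact sketch of this expected behaviour; the circuit body here is an assumption standing in for the full script (collapsed under "Source code" below), only the shapes matter:

```python
import pennylane as qml
from pennylane import numpy as np

BATCH_SIZE = 5

@qml.qnode(qml.device("default.qubit", wires=1))
def qnode(weights, inputs):
    # Assumed circuit body: embed the batched inputs, then trainable rotations.
    qml.AngleEmbedding(inputs, wires=[0])  # batched inputs of shape (5, 1)
    qml.RX(weights[0], wires=0)
    qml.RX(weights[1], wires=0)
    return qml.probs(wires=0)

weights = np.array([0.5, 0.8], requires_grad=True)
inputs = np.array(np.random.uniform(0, np.pi, (BATCH_SIZE, 1)), requires_grad=False)

prediction = qnode(weights, inputs)  # shape (5, 2): batch x outcome probabilities
prob_zero = prediction[:, 0]         # shape (5,): probability of "0" per input
```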
Actual behavior
Computing the gradients fails for batch sizes > 1 with the traceback below.
Additional information
In the non-working example script, either setting `BATCH_SIZE = 1` or using the torch interface with a torch optimiser, e.g. as in the following code sample, does provide the expected behaviour. Therefore, I assume the bug might be related to either the gradient calculation with autograd or the `qml.AdamOptimizer` used in the example.
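The referenced sample is not reproduced here; the following is only a sketch of what such a torch-based version might look like, with the circuit body and optimiser settings assumed:

```python
import torch
import pennylane as qml

BATCH_SIZE = 5

@qml.qnode(qml.device("default.qubit", wires=1), interface="torch",
           diff_method="parameter-shift")
def qnode(weights, inputs):
    # Assumed circuit body, mirroring the sketch above.
    qml.RY(inputs, wires=0)
    qml.RX(weights[0], wires=0)
    qml.RX(weights[1], wires=0)
    return qml.probs(wires=0)

weights = torch.tensor([0.5, 0.8], requires_grad=True)
inputs = torch.rand(BATCH_SIZE) * torch.pi  # batched, non-trainable inputs

opt = torch.optim.Adam([weights], lr=0.1)

opt.zero_grad()
prediction = qnode(weights, inputs)   # shape (5, 2)
loss = (1 - prediction[:, 0]).mean()  # drive the probability of "0" up
loss.backward()                       # gradients flow as expected here
opt.step()
```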
Output
Source code
Tracebacks
System information
Existing GitHub issues