unitaryfund / mitiq

Mitiq is an open source toolkit for implementing error mitigation techniques on most current intermediate-scale quantum computers.
https://mitiq.readthedocs.io
GNU General Public License v3.0

not working with Pennylane qml.qnn.TorchLayer #1752

Closed: brrbaral closed this issue 1 year ago

brrbaral commented 1 year ago

Hi Mitiq Team, I was trying to implement ZNE with a PennyLane hybrid quantum-classical PyTorch model, following the documentation here: https://docs.pennylane.ai/en/stable/code/api/pennylane.qnn.TorchLayer.html.

I got the following error:

....9586, grad_fn=<SelectBackward0>)) q[0];
rx(tensor(0.1815, grad_fn=<SelectBackward0>)) q[1];
rz(tensor(1.1420, grad_fn=<SelectBackward0>)) q[0];
ry(tensor(2.3859, grad_fn=<SelectBackward0>)) q[0];
rz(tensor(1.4169, grad_fn=<SelectBackward0>)) q[0];
rz(tensor(4.2310, grad_fn=<SelectBackward0>)) q[1];
ry(tensor(1.9263, grad_fn=<SelectBackward0>)) q[1];
rz(tensor(6.1923, grad_fn=<SelectBackward0>)) q[1];
cx q[0],q[1];
cx q[1],q[0];
rz(tensor(3.8091, grad_fn=<SelectBackward0>)) q[0];
ry(tensor(2.3116, grad_fn=<SelectBackward0>)) q[0];
rz(tensor(4.4235, grad_fn=<SelectBackward0>)) q[0];
rz(tensor(2.5640, grad_fn=<SelectBackward0>)) q[1];
ry(tensor(2.3452, grad_fn=<SelectBackward0>)) q[1];
rz(tensor(2.9057, grad_fn=<SelectBackward0>)) q[1];
cx q[0],q[1];
cx q[1],q[0];
rz(tensor(3.9693, grad_fn=<SelectBackward0>)) q[0];
ry(tensor(5.1624, grad_fn=<SelectBackward0>)) q[0];
rz(tensor(5.7602, grad_fn=<SelectBackward0>)) q[0];
rz(tensor(5.9925, grad_fn=<SelectBackward0>)) q[1];
ry(tensor(5.5003, grad_fn=<SelectBackward0>)) q[1];
rz(tensor(2.5142, grad_fn=<SelectBackward0>)) q[1];
cx q[0],q[1];
cx q[1],q[0]
        ^
at line 5, column 18

During handling of the above exception, another exception occurred:

CircuitConversionError                    Traceback (most recent call last)
/usr/local/lib/python3.9/dist-packages/mitiq/interface/conversions.py in convert_to_mitiq(circuit)
     89         mitiq_circuit = conversion_function(circuit)
     90     except Exception:
---> 91         raise CircuitConversionError(
     92             "Circuit could not be converted to an internal Mitiq circuit. "
     93             "This may be because the circuit contains custom gates or Pragmas "

CircuitConversionError: Circuit could not be converted to an internal Mitiq circuit. This may be because the circuit contains custom gates or Pragmas (pyQuil). If you think this is a bug or that this circuit should be supported, you can open an issue at https://github.com/unitaryfund/mitiq. 

Provided circuit has type <class 'pennylane.tape.qscript.QuantumScript'> and is:

<QuantumScript: wires=[0, 1], params=2>

Circuit types supported by Mitiq are 
{'cirq': 'Circuit', 'pyquil': 'Program', 'qiskit': 'QuantumCircuit', 'braket': 'Circuit', 'pennylane': 'QuantumTape'}.

Here is my code:

import numpy as np
import pennylane as qml
import torch
import sklearn.datasets

from mitiq.zne.scaling import fold_gates_from_left, fold_global
from mitiq.zne.inference import RichardsonFactory

n_qubits=2
noise_strength = 0.05
dev_noise_free = qml.device("default.mixed", wires=n_qubits)
dev = qml.transforms.insert(qml.AmplitudeDamping, noise_strength)(dev_noise_free)
print(dev)

@qml.transforms.mitigate_with_zne([1, 2, 3], fold_global, RichardsonFactory.extrapolate)
@qml.qnode(dev)
def qnode(inputs, weights):
    qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.templates.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0)), qml.expval(qml.PauliZ(1))

weight_shapes = {"weights": (3, n_qubits, 3)}

qlayer = qml.qnn.TorchLayer(qnode, weight_shapes)
clayer1 = torch.nn.Linear(2, 2)
clayer2 = torch.nn.Linear(2, 2)
softmax = torch.nn.Softmax(dim=1)
model = torch.nn.Sequential(clayer1, qlayer, clayer2, softmax)
print(model)

### data
samples = 100
x, y = sklearn.datasets.make_moons(samples)
y_hot = np.zeros((samples, 2))
y_hot[np.arange(samples), y] = 1

X = torch.tensor(x).float()
Y = torch.tensor(y_hot).float()

opt = torch.optim.SGD(model.parameters(), lr=0.5)
loss = torch.nn.L1Loss()
epochs = 8
batch_size = 5
batches = samples // batch_size
data_loader = torch.utils.data.DataLoader(list(zip(X, Y)), batch_size=batch_size,
                                          shuffle=True, drop_last=True)
### TRAIN
for epoch in range(epochs):

    running_loss = 0

    for x, y in data_loader:
        opt.zero_grad()

        loss_evaluated = loss(model(x), y)
        loss_evaluated.backward()

        opt.step()

        running_loss += loss_evaluated

    avg_loss = running_loss / batches
    print("Average loss over epoch {}: {:.4f}".format(epoch + 1, avg_loss))

Here is the colab notebook that I am working on: https://colab.research.google.com/drive/1r6jWHEuhtLcJyAtzU4Bp7uiG3KIRlgJN?usp=sharing

Thank You.

github-actions[bot] commented 1 year ago

Hello @brrbaral, thank you for your interest in Mitiq! If this is a bug report, please provide screenshots and/or minimum viable code to reproduce your issue, so we can do our best to help get it fixed. If you have any questions in the meantime, you can also ask us on the Unitary Fund Discord.

andreamari commented 1 year ago

Hi @brrbaral,

Conversion problems may be due to gate parameters that are torch tensors instead of floats.
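For example, the failure can be reproduced in isolation with a tape whose gate parameter is a torch tensor (an illustrative sketch, assuming the tensor parameter is indeed the culprit; the RX gate and its value are arbitrary):

import torch
import pennylane as qml
from mitiq.interface import convert_to_mitiq

# Build a tape whose gate parameter is a torch tensor,
# similar to what qml.qnn.TorchLayer produces during a forward pass.
with qml.tape.QuantumTape() as tape:
    qml.RX(torch.tensor(0.18, requires_grad=True), wires=0)
    qml.expval(qml.PauliZ(0))

# Expected to raise CircuitConversionError: the tensor repr leaks into the
# intermediate representation Mitiq tries to parse (as in the traceback above).
convert_to_mitiq(tape)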

As a first attempt, I would suggest importing fold_global from pennylane.transforms instead of from Mitiq.

If that doesn't work, this comment may offer a useful workaround: https://github.com/unitaryfund/mitiq/issues/1694#issuecomment-1444424001

See comments on #1694 for a previous discussion on a similar problem.

brrbaral commented 1 year ago

Dear @andreamari, thank you for your response. It worked after importing fold_global, poly_extrapolate, richardson_extrapolate, and mitigate_with_zne from pennylane.transforms.
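For reference, a minimal sketch of the change that worked for me (richardson_extrapolate shown as the extrapolation; everything else in the script above, including the TorchLayer and the training loop, is unchanged):

import pennylane as qml
from pennylane.transforms import fold_global, richardson_extrapolate, mitigate_with_zne

n_qubits = 2
noise_strength = 0.05
dev_noise_free = qml.device("default.mixed", wires=n_qubits)
dev = qml.transforms.insert(qml.AmplitudeDamping, noise_strength)(dev_noise_free)

# ZNE with PennyLane's own folding and extrapolation, so the tape
# (whose parameters are torch tensors) never needs a Mitiq conversion.
@mitigate_with_zne(scale_factors=[1, 2, 3], folding=fold_global, extrapolate=richardson_extrapolate)
@qml.qnode(dev)
def qnode(inputs, weights):
    qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.templates.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0)), qml.expval(qml.PauliZ(1))

qlayer = qml.qnn.TorchLayer(qnode, {"weights": (3, n_qubits, 3)})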

Thank you again.