PennyLaneAI / pennylane

PennyLane is a cross-platform Python library for quantum computing, quantum machine learning, and quantum chemistry. Train a quantum computer the same way as a neural network.
https://pennylane.ai
Apache License 2.0

Implementation of exponential extrapolator for the differentiable mitigate_with_zne functionality #3123

Open cnktysz opened 1 year ago

cnktysz commented 1 year ago

Feature details

The newly added differentiable zero-noise extrapolation function (mitigate_with_zne) only supports polynomial extrapolation. The mitiq equivalent of this function allows many other fit functions (e.g. exponential, linear, poly-exponential; see the mitiq Factory documentation). I believe these options would be a great addition.

Implementation

No response

How important would you say this feature is?

2: Somewhat important. Needed this quarter.

Additional information

No response

cnktysz commented 1 year ago

The implementation doesn't seem very complicated to me. I would like to start by implementing an exponential fitter, but it would be great to hear the PennyLane core team's thoughts first.

antalszava commented 1 year ago

Hi @cnktysz, thanks for the feature request! :slightly_smiling_face:

I don't think there would be any blockers there, submissions would be welcome to add further extrapolators that support differentiation. :tada:

Let us know if you have any questions with regards to contributing. In October, PennyLane is participating in Hacktoberfest - if this is interesting, we can also tag this issue for the hackathon! :+1:

cnktysz commented 1 year ago

Hi @antalszava, thanks for the information. Hacktoberfest sounds interesting, I think we can tag this issue accordingly.

Qottmann commented 1 year ago

Hi @cnktysz, good idea! As long as you can reduce it to a polynomial fit it should be relatively straightforward. That is, if by an exponential fit you mean $y=c_0\exp(c_1 x)$, then you can write a dummy function

def exp_extrapolate(x, y):
    # Linearize: log(y) = log(c0) + c1 * x, then do a degree-1 polynomial fit.
    logy = qml.math.log(y)
    coef = _polyfit(x, logy, 1)
    # coef[1] is the intercept log(c0); exponentiating recovers y(0) = c0.
    return qml.math.exp(coef[1])

since $\log(y) = \log(c_0) + c_1 x$ and the extrapolated result is $y(0)=c_0$.
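
As a quick sanity check of this log-linear trick, here is the same idea with numpy.polyfit standing in for the private _polyfit (assuming the same highest-degree-first coefficient ordering that numpy uses):

```python
import numpy as np

def exp_extrapolate(x, y):
    # Fit log(y) = log(c0) + c1 * x with a degree-1 polynomial.
    logy = np.log(y)
    coef = np.polyfit(x, logy, 1)  # [slope c1, intercept log(c0)]
    return np.exp(coef[1])         # exp(intercept) recovers c0 = y(0)

x = np.array([1.0, 2.0, 3.0])
y = 0.8 * np.exp(-0.5 * x)    # synthetic noise-free data with c0 = 0.8
print(exp_extrapolate(x, y))  # → 0.8 (up to floating point)
```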

For extrapolations that cannot be reduced to polynomial fitting it gets more complicated. To preserve differentiability you have to stick to qml.math functions. Currently, the backbone of the ZNE transform is the custom _polyfit function (see pennylane/transforms/mitigate.py#L247), and you would have to write a more elaborate non-linear fitting function. This is entirely possible, e.g. by running a least-squares fit with backprop, just a little more involved.
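
To illustrate the backprop-based least-squares idea, here is a minimal sketch in JAX for the same model $y = c_0 e^{c_1 x}$ (exp_fit_zero is a hypothetical helper for this thread, not PennyLane API; a real implementation would use qml.math so it works across interfaces):

```python
import jax
import jax.numpy as jnp

def exp_fit_zero(x, y, steps=5000, lr=0.1):
    """Least-squares fit of y = c0 * exp(c1 * x) via gradient descent;
    returns the zero-noise estimate y(0) = c0. Differentiable end to end."""
    def loss(params):
        c0, c1 = params
        return jnp.sum((c0 * jnp.exp(c1 * x) - y) ** 2)

    grad = jax.jit(jax.grad(loss))
    params = jnp.array([y[0], 0.0])  # crude initial guess
    for _ in range(steps):
        params = params - lr * grad(params)
    return params[0]

x = jnp.array([1.0, 2.0, 3.0])
y = 0.8 * jnp.exp(-0.5 * x)  # synthetic noise-free data with c0 = 0.8
print(exp_fit_zero(x, y))    # should approach 0.8
```

A plain gradient-descent loop is used here for clarity; in practice one might prefer a Gauss-Newton step or an optimizer like those in optax.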

Alternatively, you can use any numpy/scipy fitting function if you can live without automatic differentiability!

Qottmann commented 1 year ago

Note that if you want to use mitiq extrapolators (without differentiability), this is already supported! You can pass whichever mitiq extrapolation function you like to the ZNE transform :) Some resources: the Details section of https://docs.pennylane.ai/en/stable/code/api/pennylane.transforms.mitigate_with_zne.html and this demo: https://pennylane.ai/qml/demos/tutorial_error_mitigation.html

cnktysz commented 1 year ago

Hi @Qottmann, thanks for the heads up. I want a differentiable extrapolator, which is not possible with mitiq's extrapolators as far as I have tested. I think your solution fits many scenarios. I was thinking of implementing all the options that mitiq offers in differentiable form.

Using the polyfit like you did might be good enough for most cases. I will try to go with this approach first as it is much easier to implement, as you said.

Qottmann commented 1 year ago

Great! Let us know how it goes, happy to help!

Related but off-topic: if your main concern is improving ZNE, another interesting direction is trying out different folding schemes. The fold_global approach is essentially restricted to scale factors 1, 2, 3 and, if you have a very good device, 4. It would be cool to see a folding scheme that allows intermediate scale factors like 1.5, though it's not obvious how to achieve this, i.e. how to choose which gates to repeat when, and how to relate that to an artificial scale factor.

cnktysz commented 1 year ago

Yes, improving the performance of ZNE is my goal. I couldn't get good results with scale factors up to 3 and was thinking of using other extrapolators. A collaborator of mine said they got better performance with an exponential fitter, which is why I was inclined to test it out, and I realized it was not implemented.

I think such intermediate scaling factors would work better, but as you said it is not straightforward to decide how to do this. I will definitely have a look at this direction as well. Thank you.

CatalinaAlbornoz commented 1 year ago

Hi @cnktysz, this issue has now been labelled as participating in Hacktoberfest! This is a very large open-source event that takes place during the month of October. If you register for the event and contribute at least 4 PRs to any of the participating projects, you get the chance to win a Hacktoberfest t-shirt or to plant a tree.

Let us know if you have any questions about this! Remember that for your contribution to count it needs to be accepted by October 31st.

cnktysz commented 1 year ago

Hi @Qottmann. I implemented the extrapolator based on your suggestion, and it works well on its own. However, when I tested it with the JAX interface it couldn't trace the gradients. I realized the issue starts with the qml.math.log function; it seems batched inputs do not go well with it. I tried to find the source code for this function but can't locate it in the PennyLane repository. Can you point me to it if you know where it is defined? Thanks.

josh146 commented 1 year ago

Hey @cnktysz! We are actually performing dynamic dispatch within the qml.math module :) The way it works: qml.math functions are thin wrappers that route each call through the autoray package, which inspects the type of the input to decide which framework's implementation to use.

So in this case, qml.math.log would simply call autoray.numpy.log under the hood, and then autoray will determine which ML framework to call.
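
As a toy illustration of that dispatch idea (this is NOT PennyLane's actual code, which delegates to autoray), the backend can be chosen from the type of the incoming array:

```python
import numpy as np

# Toy sketch of type-based dispatch, not the real qml.math/autoray machinery.
_BACKENDS = {"numpy": np}  # the real dispatcher also knows jax, torch, tf, ...

def _interface(x):
    # A jax array's type lives in a 'jax...' module, a torch tensor in 'torch', etc.
    return type(x).__module__.split(".")[0]

def log(x):
    backend = _BACKENDS.get(_interface(x), np)
    return backend.log(x)

print(log(np.array([1.0, np.e])))  # dispatched to numpy.log → [0. 1.]
```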