sQUlearn / squlearn

scikit-learn interface for quantum algorithms
https://squlearn.github.io
Apache License 2.0
58 stars 18 forks

QNN Loss for ODEs #278

Closed rupof closed 1 month ago

rupof commented 4 months ago

Hi!

This is the implementation of the loss function for an ODE, ODELoss (as done in https://arxiv.org/pdf/2011.10395). It should accept linear and non-linear first- and second-order** (see footnote) differential equations that depend on one variable.

I also created an example (ode_example) showing how to use the interface, and included a parameterized feature map (HEE_rzrxrz) that was used in the original paper to manipulate the function space.

The implementation interface accepts two types of input:

  1. A symbolic sympy expression for the homogeneous differential equation:
    • If a sympy expression is given, then symbols_involved_in_ODE must also be provided.
  2. A callable function:
    • If a callable implementing the homogeneous differential equation is given, then a callable for the gradient of the homogeneous differential equation (ODE_functional_gradient) must also be provided.
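To make the second input mode concrete, here is a hypothetical sketch of such a callable pair for a simple ODE. The function names and signatures are illustrative only, not squlearn's actual API; only the name `ODE_functional_gradient` comes from the PR description.

```python
import numpy as np

# Hypothetical callable input mode (a sketch, not squlearn's actual API):
# for the ODE f'(x) - 2 f(x) = 0, the user supplies the residual F and its
# functional gradient with respect to the trial function and its derivative.
def F(dfdx, f, x):
    """Pointwise residual of the homogeneous ODE."""
    return dfdx - 2.0 * f

def F_gradient(dfdx, f, x):
    """Functional gradient (dF/d(f'), dF/df), needed for the chain rule."""
    return np.ones_like(f), -2.0 * np.ones_like(f)

# Sanity check on the exact solution f(x) = exp(2x), where the residual vanishes:
xs = np.linspace(0.0, 1.0, 5)
fs = np.exp(2.0 * xs)
residual = F(2.0 * fs, fs, xs)
print(np.max(np.abs(residual)))  # → 0.0
```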

About the Implementation:

This pull request adds a new class derived from LossBase that implements:

$$\mathcal{L}(\vec{\theta})[\ddot f, \dot f, f, x] = \sum_{j}^{N} \left(F_j[\ddot f_{\vec{\theta}}, \dot f_{\vec{\theta}}, f_{\vec{\theta}}, x]\right)^2 + \eta\left(f_{\vec{\theta}}(x_0) - u_0\right)^2 + \eta\left(\dot f_{\vec{\theta}}(x_0) - \dot u_0\right)^2$$

With the corresponding gradient: (image)

where $F$ is the homogeneous differential equation provided by the user. The loss also includes two ways to incorporate the initial-value-problem information, following the paper: floating and pinned.
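As a hedged numeric illustration of the loss formula and of the two boundary-handling modes (my reading of the paper's terminology; the exact squlearn implementation may differ), here is a classical toy version with a stand-in ansatz instead of a QNN:

```python
import numpy as np

# Toy illustration of the loss above for the first-order ODE F = f' - 2 f with
# initial condition f(0) = 1 (exact solution: exp(2x)). The ansatz and names
# are illustrative, not squlearn's API.
x = np.linspace(0.0, 1.0, 20)
x0, u0, eta = 0.0, 1.0, 1.0

def trial(x, theta):
    return np.exp(theta * x)          # stand-in for the QNN trial function

def trial_dx(x, theta):
    return theta * np.exp(theta * x)  # its derivative with respect to x

def pinned_loss(theta):
    # "pinned": the initial condition enters as a penalty term in the loss
    residual = trial_dx(x, theta) - 2.0 * trial(x, theta)   # the F_j terms
    return np.sum(residual**2) + eta * (trial(x0, theta) - u0)**2

def floating_trial(x, theta):
    # "floating": shift the trial so f(x0) = u0 holds exactly by construction,
    # removing the need for a penalty term (assumed reading of the paper)
    return u0 + trial(x, theta) - trial(x0, theta)

print(pinned_loss(2.0))          # → 0.0: theta = 2 solves the ODE exactly
print(floating_trial(0.0, 1.3))  # → 1.0 for any theta
```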

About the correctness:

I have benchmarked the gradient of the loss by comparing the finite-difference gradient with the one implemented using squlearn's parameter-shift rule (i.e., the previous equation):
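The idea behind this check can be sketched classically (this is not squlearn's code): for a gate generated by a Pauli operator, the parameter-shift rule with shifts of ±π/2 gives the exact derivative of the expectation value, which should agree with a finite-difference estimate up to truncation error.

```python
import numpy as np

# Sketch of the benchmark: compare the parameter-shift derivative against a
# central finite difference for a toy expectation value f(theta) = cos(theta)
# (e.g. <Z> after RY(theta) on |0>). The shift rule is exact here: -sin(theta).
def f(theta):
    return np.cos(theta)

def parameter_shift(f, theta):
    return 0.5 * (f(theta + np.pi / 2) - f(theta - np.pi / 2))

def finite_difference(f, theta, eps=1e-6):
    return (f(theta + eps) - f(theta - eps)) / (2 * eps)

theta = 0.7
ps = parameter_shift(f, theta)
fd = finite_difference(f, theta)
print(abs(ps - fd) < 1e-8)  # → True, both approximate -sin(0.7)
```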

Numerical gradient: (image)

squlearn gradient: (image)

Also, I have solved some ODEs successfully:

(image)


Also, to use sympy I created three functions that live outside the ODELoss class: numpyfy_sympy_loss, numerical_gradient_of_symbolic_equation, and numpyfy_sympy_loss. I do not know whether their current location is the best one or whether they should be methods; I am happy to hear your thoughts.
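For context, the mechanism underneath such helpers might look like the following sketch (the PR's helper names are the author's; this only illustrates how a symbolic residual can be turned into numpy callables via sympy's `lambdify` and `diff`):

```python
import numpy as np
import sympy as sp

# Sketch: convert a symbolic ODE residual into a numpy-callable function, and
# differentiate it symbolically to obtain the functional gradient.
x, f, dfdx = sp.symbols("x f dfdx")
ode = dfdx + x * f                 # residual of f'(x) + x f(x) = 0

F_numpy = sp.lambdify((dfdx, f, x), ode, "numpy")
dF_df = sp.lambdify((dfdx, f, x), sp.diff(ode, f), "numpy")  # dF/df = x

xs = np.linspace(0.0, 1.0, 5)
fs = np.exp(-xs**2 / 2)            # exact solution of the ODE
print(F_numpy(-xs * fs, fs, xs))   # → zeros: the residual vanishes
print(dF_df(0.0, 0.0, 3.0))        # → 3.0
```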

Feel free to give me feedback 😃 and thank you very much for your help! 🚀


** Second-order differential equations are implemented by differentiating the QNN trial function f(x) twice, which implies that third-order derivatives with respect to the parameters have to be calculated (i.e., dfdxdxdp). As you may imagine, this requires a lot of circuit evaluations (I believe for N points, around N^3 circuits must be evaluated). I do not think this is the proper way to solve a second-order ODE; one should instead translate the second-order equation into two coupled first-order equations.
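The suggested reformulation can be sketched classically: rewrite u'' = G(u', u, x) as the coupled system u' = v, v' = G(v, u, x), so only first derivatives of the trial functions are ever needed. A minimal illustration for u'' = -u (this is a plain classical integrator, not the proposed quantum solver):

```python
import numpy as np

# Rewrite u'' = -u with u(0) = 0, u'(0) = 1 (exact solution: sin x) as the
# coupled first-order system u' = v, v' = -u, and integrate with Euler steps.
def solve_coupled(x_end, n_steps):
    h = x_end / n_steps
    u, v = 0.0, 1.0
    for _ in range(n_steps):
        u, v = u + h * v, v + h * (-u)  # simultaneous explicit Euler update
    return u

u_end = solve_coupled(np.pi / 2, 100_000)
print(abs(u_end - 1.0) < 1e-3)  # → True, matches sin(pi/2) = 1
```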

I have not implemented coupled first-order ODE solvers; this would be a possible next step, which I may do next. I could create an issue or something similar.

David-Kreplin commented 4 months ago

@rupof, is this branch ready for review, or are you still working on it? You can click "Ready for review" at the bottom of the page if you are ready.

rupof commented 4 months ago

@David-Kreplin it is ready for review, I just changed it now. Thanks!

rupof commented 1 month ago

Thank you very much David!

I did the following things:

If you think that the documentation for the sympy symbols needs to be improved further let me know. Thanks!

Edit: because the example takes so long to run, one check is throwing an error; I am looking for an alternative.