Open briandesilva opened 3 years ago
hey! I'm a beginner on GitHub. Can anyone tell me how to explore this?
Hey, if you're interested in helping with this I'd recommend looking at the STLSQ implementation and thinking about ways we could adapt the call to ridge_regression to handle complex numbers (e.g. maybe using something like Scipy's lstsq).
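To make the idea concrete, here is a rough sketch (not pysindy code; `complex_ridge` is a hypothetical helper name) of how the ridge problem could be rewritten as an ordinary least-squares problem that `scipy.linalg.lstsq` can solve for complex data:

```python
import numpy as np
from scipy.linalg import lstsq

def complex_ridge(X, y, alpha):
    """Solve min ||X w - y||^2 + alpha ||w||^2 for complex X, y.

    Sketch only: the ridge penalty is turned into extra least-squares
    rows by stacking sqrt(alpha) * I below X, with matching zeros in
    the target. scipy.linalg.lstsq handles complex matrices directly.
    """
    n_features = X.shape[1]
    X_aug = np.vstack([X, np.sqrt(alpha) * np.eye(n_features)])
    y_aug = np.concatenate([y, np.zeros(n_features)])
    w, *_ = lstsq(X_aug, y_aug)
    return w
```

This avoids forming the normal equations at all, so it inherits whatever solver `lstsq` picks.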
For the record, it was probably a mistake on my part to mark this as a good first issue, as solving it requires some background in linear algebra.
Hi! Since ridge regression is just a matrix computation, I think it is only necessary to replace the real scalar product with the Hermitian scalar product. I have tried this and it seems to work fine. Moreover, for some dynamical systems it would be interesting to add the variables' complex conjugates to the pySINDy feature library!
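For reference, a minimal sketch of that normal-equations approach (the function name is made up for illustration):

```python
import numpy as np

def ridge_normal_eq(X, y, alpha):
    """Ridge via the normal equations, (X^H X + alpha I) w = X^H y.

    The only change from the real case is using the Hermitian
    (conjugate) transpose instead of the plain transpose.
    """
    n_features = X.shape[1]
    XH = X.conj().T  # Hermitian transpose
    return np.linalg.solve(XH @ X + alpha * np.eye(n_features), XH @ y)
```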
That's right: if you want to solve the least-squares problem via the normal equations, you can just use the Hermitian scalar product. But for poorly conditioned matrices, solving the normal equations gives worse results than other methods of solving least-squares problems, like QR factorization or an SVD. For this reason I think it would be a good idea to find a least-squares implementation that supports both ridge regression and complex numbers and is "smart" about which solver it uses.
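A quick numerical illustration of why: forming the normal equations squares the condition number of the problem, which is exactly what SVD/QR-based solvers avoid.

```python
import numpy as np

# The singular values of X^H X are the squares of those of X, so
# cond(X^H X) = cond(X)**2. SVD/QR least-squares solvers work with X
# directly and never pay this squaring penalty.
rng = np.random.default_rng(2)
X = rng.normal(size=(20, 5)) + 1j * rng.normal(size=(20, 5))
cond_X = np.linalg.cond(X)
cond_normal = np.linalg.cond(X.conj().T @ X)
```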
That's right, but that is what ridge regression does (and thus what STLSQ uses), right? And the ridge parameter improves the conditioning of the matrix (though maybe not enough?).
By default Scikit-learn's ridge regression uses different solvers depending on the type of input data (see the solver parameter here). You're right, though, that adding the ridge parameter improves the conditioning of the matrix. This might be enough.
Ok, I didn't know that! Maybe we should ask the scikit-learn developers whether they can add a complex version of ridge regression?
Another solution would be to decompose the problem into real and imaginary parts, use ridge regression for real numbers, and impose relations between the coefficients (with a constraint, as SR3 does). Indeed, with this method the nonlinear terms become related to some other terms. But this is far more complicated than the methods we talked about above.
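For what it's worth, for the linear solve itself the real/imaginary coupling can be encoded directly with the standard block-matrix embedding, without an explicit constraint. A sketch, with a hypothetical helper name:

```python
import numpy as np

def complex_to_real_system(X, y):
    """Embed the complex system X w = y into an equivalent real one.

    The block structure [[Re X, -Im X], [Im X, Re X]] enforces exactly
    the coupling between the real and imaginary parts of w, so any
    real-valued least-squares or ridge solver can be applied to (A, b).
    The first half of the solution is Re(w), the second half Im(w).
    """
    A = np.block([[X.real, -X.imag], [X.imag, X.real]])
    b = np.concatenate([y.real, y.imag])
    return A, b
```

Note also that the real ridge penalty on the stacked coefficient vector equals alpha * ||w||^2 for the complex coefficients, so the ridge problem carries over unchanged.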
FWIW, there seems to be some demand for this... I met someone at Imperial College who was looking to discover complex coefficients for Kolmogorov flow using pysindy.
The current STLSQ implementation relies on Scikit-learn's ridge regression implementation, which does not support complex data. However, there is nothing that prevents the solution of least squares problems involving complex numbers (e.g. Scipy's lstsq supports them). Can we update the STLSQ implementation to handle complex numbers?
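For example, a quick check that Scipy's lstsq indeed accepts complex input:

```python
import numpy as np
from scipy.linalg import lstsq

# scipy.linalg.lstsq solves complex least-squares problems out of the
# box, unlike sklearn's ridge_regression.
X = np.array([[1 + 1j, 2], [3, 4 - 2j], [5j, 6]])
w_true = np.array([1 - 1j, 2 + 0.5j])
y = X @ w_true
w, residues, rank, sv = lstsq(X, y)
```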