dynamicslab / pysindy

A package for the sparse identification of nonlinear dynamical systems from data
https://pysindy.readthedocs.io/en/latest/

Expand numerical differentiation options #58

Closed briandesilva closed 4 years ago

briandesilva commented 4 years ago

@andgoldschmidt has implemented a number of numerical differentiation methods, some of which could easily be ported over to PySINDy. I'm creating this issue to create a discussion space for this project.

briandesilva commented 4 years ago

In particular, we discussed adding implementations of spline-based differentiation and a TV derivative method.
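As a rough sketch of what spline-based differentiation can look like with existing SciPy tools (the test signal and smoothing factor below are illustrative choices, not pysindy API):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Noisy samples of sin(t) on a uniform grid (illustrative data).
t = np.linspace(0, 2 * np.pi, 100)
rng = np.random.default_rng(0)
noise_std = 0.01
x = np.sin(t) + noise_std * rng.standard_normal(t.size)

# Fit a cubic smoothing spline; s ~ n * noise_std**2 is a common heuristic.
spline = UnivariateSpline(t, x, k=3, s=t.size * noise_std**2)
dx = spline.derivative()(t)  # approximates cos(t)
```

The appeal is that the spline is differentiated analytically, so the noise is smoothed before the derivative is taken.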

Ohjeah commented 4 years ago

I think we should offer various differentiation methods in pysindy, but not add them to the main package via source code. Instead, it makes perfect sense to have this as a distinct package and import it into pysindy as a requirement.

This will add a bit of overhead, but it will make the numerical differentiation part more open to other projects and also make pysindy itself more stable against changes in the numerical differentiation code.

I have a crude draft of such a project here: https://github.com/Ohjeah/derivative.py

Maybe we can also have a look at autograd, e.g. https://github.com/google/jax

andgoldschmidt commented 4 years ago

I think separate is right. There might be other system discovery methods besides SINDy; separate packages make reuse easy.

I also had autograd methods on my to-do list, but derivatives for noisy experimental data seem like the priority. Overall, both projects look similar in content and references; the essential methods are:

  1. Finite difference [Local]
  2. Savitzky-Golay / Holoborodko [Local]
  3. Spline (Default: Cubic) [Global]
  4. FFT [Global]
  5. Total Variation Regularization / Polynomial-trend-filtered derivative [Global]

I think the biggest differences are in 2 and 5. How should this develop from here?
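Of those, 1 and 4 already have essentially one-line NumPy counterparts; a minimal sketch on an illustrative periodic signal:

```python
import numpy as np

# Periodic test signal on a uniform grid (illustrative data).
n = 128
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
x = np.sin(t)

# 1. Finite difference: second-order central differences at interior points.
dx_fd = np.gradient(x, t)

# 4. FFT (spectral) derivative: multiply by i*k in frequency space.
k = np.fft.fftfreq(n, d=t[1] - t[0]) * 2 * np.pi  # angular wavenumbers
dx_fft = np.fft.ifft(1j * k * np.fft.fft(x)).real
```

For smooth periodic data the spectral derivative is exact to machine precision, while finite differences carry an O(h^2) error; the interesting work is in methods 2, 3, and 5, which handle noise.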

briandesilva commented 4 years ago

I agree that numerical differentiation is a large enough project in its own right that we should probably not try to tackle it ourselves within PySINDy. For us to be able to list an external repo as a requirement, I think it will need to be registered with PyPI. @andgoldschmidt's implementation looks a little more mature at the moment, but it's not yet quite ready to be used in PySINDy. @andgoldschmidt, would you be willing to set up your repo on PyPI?

With regard to autograd methods, my understanding is that they require the form of the function being differentiated to be known (e.g. def f(x): return 1 / (1 + np.exp(x))). In applications where only direct measurements are available, it doesn't seem like autograd would be very useful.
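To make that concrete: the closed-form derivative below is only available because f's formula is known (it is the kind of result an autograd library would produce); from samples alone, one can only fall back on a numerical estimate.

```python
import numpy as np

def f(x):
    return 1 / (1 + np.exp(x))

def df_exact(x):
    # Hand-written closed-form derivative; autodiff could generate this
    # because the symbolic form of f is available.
    return -np.exp(x) / (1 + np.exp(x)) ** 2

# With measurement-only data we never see f itself, just its samples.
xs = np.linspace(-3, 3, 61)
samples = f(xs)                    # stand-in for measured data
df_est = np.gradient(samples, xs)  # numerical fallback
```

The two agree closely here only because the samples are noise-free; with real measurements the numerical route needs the smoothing methods discussed above.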

billtubbs commented 4 years ago

Could this be used as an option: scipy.signal.savgol_filter

It provides smoothing but also the derivative.
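For reference, savgol_filter exposes the derivative directly through its deriv and delta arguments; a quick sketch on synthetic noisy data (the window length and polynomial order are illustrative choices):

```python
import numpy as np
from scipy.signal import savgol_filter

# Noisy measurements of sin(t) (illustrative data).
t = np.linspace(0, 2 * np.pi, 200)
rng = np.random.default_rng(1)
x = np.sin(t) + 0.01 * rng.standard_normal(t.size)

# deriv=1 returns the first derivative of the local polynomial fit;
# delta is the sample spacing used to scale it.
dx = savgol_filter(x, window_length=31, polyorder=3, deriv=1, delta=t[1] - t[0])
```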

billtubbs commented 4 years ago

Also, I found this code based on Chartrand's original MATLAB code:

https://github.com/stur86/tvregdiff

But I think there may be issues with it.

briandesilva commented 4 years ago

> Could this be used as an option: scipy.signal.savgol_filter
>
> It provides smoothing but also the derivative.

scipy.signal.savgol_filter is currently the default smoother option for the SmoothedFiniteDifference class, but we aren't using it to compute derivatives directly. I think it could make sense to have a separate function that uses savgol_filter to compute the derivative.
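A sketch of what such a separate function might look like (the name and signature here are hypothetical, not pysindy API):

```python
import numpy as np
from scipy.signal import savgol_filter

def savgol_derivative(x, t, window_length=11, polyorder=3):
    """Hypothetical helper: smoothed first derivative of x sampled at t."""
    dt = t[1] - t[0]  # assumes a uniformly spaced time grid
    return savgol_filter(x, window_length, polyorder, deriv=1, delta=dt)

t = np.linspace(0, 2 * np.pi, 100)
dx = savgol_derivative(np.sin(t), t)  # approximately cos(t)
```

Passing deriv=1 differentiates the local polynomial fit, so smoothing and differentiation happen in one step instead of smoothing first and finite-differencing afterwards.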

briandesilva commented 4 years ago

Implemented in #85