dynamicslab / pysindy

A package for the sparse identification of nonlinear dynamical systems from data
https://pysindy.readthedocs.io/en/latest/

Linear Parameter Varying System Identification (CONTINUED #346) #352

Closed Aditya16828 closed 1 year ago

Aditya16828 commented 1 year ago

https://github.com/dynamicslab/pysindy/issues/346#issuecomment-1604766871

For more details you may refer to issue #346. Could you please explain how to do this same thing in state-space format?

Jacob-Stevens-Haas commented 1 year ago

It's unclear what you mean by state-space format. Your data matrix is (n_timesteps, n_coordinates). Is that not state-space format?
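
For illustration, a minimal sketch of that layout (array name and sizes are hypothetical):

import numpy as np

# one row per time sample, one column per state coordinate
x_data = np.random.random((1000, 3))  # shape (n_timesteps, n_coordinates)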

Aditya16828 commented 1 year ago

No no, I am talking about the code that was provided as a sample. I am having difficulty understanding the num_features and num_parameters parameters. The code that @znicolaou provided is for x (xss here) having the shape (1, 1000) and u (uss here) of the same shape. But suppose I have an x of shape (2, 1000) and a u of shape (2, 1000) in addition to the v parameter (that is, 2 control inputs other than the v parameter); how can I modify the code, and in particular what would my num_features and num_parameters be? Also, is it possible to do something similar to the code mentioned here?

Aditya16828 commented 1 year ago
N = 1000
N_drop = 500
r0 = [3.5, 3.0]
vs=2+np.random.random(size=(2,N+N_drop))
xss = []
uss = []
for i in range(len(vs)):
    r = r0[i] + 1/vs[i]
    xs = []
    x = 0.5
    for n in range(N + N_drop):
        if n >= N_drop:
            xs.append(x)
        x = r[n] * x * (1 - x)
    xss.append(np.array(xs))
    uss.append(vs[i][N_drop:])

feature_lib = ps.PolynomialLibrary(degree=3, include_bias=True)
parameter_lib = ps.CustomLibrary(library_functions=[lambda x:1/x, lambda x:1/x], function_names=[lambda x:x+'^-1', lambda x:x+'^-1'],include_bias=True)

opt = ps.STLSQ(threshold=1e-1, normalize_columns=False)
model = ps.SINDy(
    feature_library=lib, optimizer=opt, feature_names=["x1", "v1", "x2", "v2"], discrete_time=True
)
# model.fit(xss_new, u=uss_new, multiple_trajectories=True)
model.fit(xss, u=uss, multiple_trajectories=True)
model.print()

I am trying out this code and getting an error (screenshot of the traceback attached). Could anyone please point out my mistake?

Jacob-Stevens-Haas commented 1 year ago

To paraphrase:

feature_lib = ps.PolynomialLibrary(...)
parameter_lib = ps.CustomLibrary(...)

opt = ps.STLSQ(...)
model = ps.SINDy(feature_library=lib, ...)

The error is in the part of the code that you're not pasting. Where is lib defined?

Aditya16828 commented 1 year ago

Very sorry for my mistake, this is the full code...

import numpy as np
import pysindy as ps

N = 1000
N_drop = 500
r0 = [3.5, 3.0]
vs=2+np.random.random(size=(2,N+N_drop))
xss = []
uss = []
for i in range(len(vs)):
    r = r0[i] + 1/vs[i]
    xs = []
    x = 0.5
    for n in range(N + N_drop):
        if n >= N_drop:
            xs.append(x)
        x = r[n] * x * (1 - x)
    xss.append(np.array(xs))
    uss.append(vs[i][N_drop:])

feature_lib = ps.PolynomialLibrary(degree=3, include_bias=True)
parameter_lib = ps.CustomLibrary(library_functions=[lambda x:1/x, lambda x:1/x], function_names=[lambda x:x+'^-1', lambda x:x+'^-1'],include_bias=True)
lib = ps.ParameterizedLibrary(
    feature_library=feature_lib,
    parameter_library=parameter_lib,
    num_features=2,
    num_parameters=2,
)
opt = ps.STLSQ(threshold=1e-1, normalize_columns=False)
model = ps.SINDy(
    feature_library=lib, optimizer=opt, feature_names=["x1", "v1", "x2", "v2"], discrete_time=True
)
# model.fit(xss_new, u=uss_new, multiple_trajectories=True)
model.fit(xss, u=uss, multiple_trajectories=True)
model.print()
Jacob-Stevens-Haas commented 1 year ago

xss and uss should be lists of separate trajectories, where each trajectory has all variables/coordinates present. Here, you passed a 1-D x and a 1-D v for each trajectory. If that's not what you meant, pass xss and uss as arrays of shape (n_time, n_coordinates) directly to model.fit. Also note the way features are grouped together: the feature_library (PolynomialLibrary) takes the first two features, and the parameter_library (CustomLibrary) takes the last two.
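
For illustration, a rough sketch of how the columns are split under that grouping (shapes only, matching the fit call below):

import numpy as np

x = np.stack(xss, -1)  # shape (1000, 2): columns x1, x2 feed the feature_library
u = np.stack(uss, -1)  # shape (1000, 2): columns v1, v2 feed the parameter_library
# With num_features=2 and num_parameters=2 the combined column order is
# [x1, x2, v1, v2], so feature_names should list the states first, then the parameters.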

Editing last four lines:

model = ps.SINDy(
    feature_library=lib, optimizer=opt, feature_names=["x1", "x2", "v1", "v2"], discrete_time=True
)
# note the swap of "x2" and "v1"
model.fit(np.stack(xss, -1), u=np.stack(uss, -1))
model.print()

gives

(x1)[k+1] = 3.500 1 x1[k] + -3.500 1 x1[k]^2 + 0.500 v2[k]^-1 x1[k] + -0.500 v2[k]^-1 x1[k]^2 + 0.500 v2[k]^-1 x1[k] + -0.500 v2[k]^-1 x1[k]^2
(x2)[k+1] = 3.119 1 x2[k] + -3.401 1 x2[k]^2 + 0.309 1 x2[k]^3 + 0.001 v2[k]^-1 x2[k] + -0.001 v2[k]^-1 x2[k]^3 + 0.351 v2[k]^-1 x2[k] + -0.384 v2[k]^-1 x2[k]^3 + 0.001 v2[k]^-1 x2[k] + -0.001 v2[k]^-1 x2[k]^3 + 0.351 v2[k]^-1 x2[k] + -0.384 v2[k]^-1 x2[k]^3
Aditya16828 commented 1 year ago
import numpy as np
import pysindy as ps

num = 1
N = 1000
N_drop = 500
r0 = 3.5
vs=2+np.random.random(size=(1,N+N_drop))
xss = []
uss=[]
for v in vs:
    r= r0 + 1/v
    xs = []
    x = 0.5
    for n in range(N + N_drop):
        if n >= N_drop:
            xs = xs + [x]
        x = r[n] * x * (1 - x)
    xss = xss + [np.array(xs)]
    uss=uss+[v[N_drop:]]

feature_lib = ps.PolynomialLibrary(degree=3, include_bias=True)
parameter_lib = ps.CustomLibrary(library_functions=[lambda x:1/x], function_names=[lambda x:x+'^-1'],include_bias=True)
lib = ps.ParameterizedLibrary(
    feature_library=feature_lib,
    parameter_library=parameter_lib,
    num_features=1,
    num_parameters=1,
)
opt = ps.STLSQ(threshold=1e-1, normalize_columns=False)
model = ps.SINDy(
    feature_library=lib, optimizer=opt, feature_names=["x", "v"], discrete_time=True
)
model.fit(xss, u=uss, multiple_trajectories=True)
model.print()

In this code I have to give multiple_trajectories=True, but not in the one that I previously provided. Could you please explain what multiple_trajectories signifies here?

Jacob-Stevens-Haas commented 1 year ago

multiple_trajectories is for when you're fitting a model with data from multiple experiments/simulations. The derivative and feature-library samples generated from each are essentially concatenated along the sample axis.
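
For illustration, a minimal sketch of the two call patterns (array names are hypothetical):

# single experiment: one array of shape (n_time, n_coordinates)
model.fit(x_run, u=u_run)

# several experiments: lists of such arrays; the samples from each run are
# concatenated internally before the regression
model.fit([x_run1, x_run2], u=[u_run1, u_run2], multiple_trajectories=True)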

In the previous example, perhaps the maintainer thought of $x$ and $v$ as scalars - hence the num_features=1, num_parameters=1.

Hope that makes it clear!

Aditya16828 commented 1 year ago

Thank you so much, sir, for the clarification.