nilsnevertree / kalman-reconstruction-partially-observed-systems

Data-driven Reconstruction of Partially Observed Dynamical Systems using Kalman Algorithms in an iterative way
GNU General Public License v3.0

Multidimensional weights for local linear regression #15

Closed nilsnevertree closed 1 year ago

nilsnevertree commented 1 year ago

The weights calculated by

kalman_reconstruction.statistics.gaussian_weights()

are 2D, because the individual components of the state vector can be at different positions.

Therefore the current call

LinearRegression(fit_intercept=False).fit(x_out[:-1], x_out[1:], sample_weight=sample_weight)

cannot be used for this problem, since `sample_weight` expects a single scalar weight per sample.
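A minimal sketch of one possible workaround (array shapes are made up for illustration): fit one regression per state component, passing that component's 1D weight column as `sample_weight`, and stack the resulting rows into the coefficient matrix.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_times, n_comp = 100, 3
x_out = rng.normal(size=(n_times, n_comp))         # state trajectory, shape (time, components)
weights = rng.uniform(size=(n_times - 1, n_comp))  # 2D weights: one column per state component

# One regression per component j: predict x_j(t) from the full state x(t-1),
# weighted by that component's own 1D weight column.
M = np.empty((n_comp, n_comp))
for j in range(n_comp):
    reg = LinearRegression(fit_intercept=False)
    reg.fit(x_out[:-1], x_out[1:, j], sample_weight=weights[:, j])
    M[j] = reg.coef_
```

Each row of `M` then comes from its own weighted fit, which is the per-component behaviour a 2D weight array implies.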

nilsnevertree commented 1 year ago

[figure attached]

nilsnevertree commented 1 year ago

It is important to note that it is not the time value that matters for the local linear regression, but the state of the model in state space at that time. That is why the weights should be 2D arrays.

The coefficients from the linear regression form the matrix M, since x(t) = M x(t-1). So it is a hyperplane linear regression of dimension components(x) - 1, leading to a coefficient matrix of shape (components(x), components(x)).
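As a sanity check on the shape claim (a hypothetical toy system, not data from the repository): for noiseless linear dynamics x(t) = M x(t-1), a multi-output regression without intercept recovers M, with `coef_` of shape (components, components).

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Known toy dynamics: x(t) = M_true @ x(t-1)
M_true = np.array([[0.9, 0.1],
                   [-0.2, 0.8]])
x = np.empty((50, 2))
x[0] = [1.0, 0.5]
for t in range(1, 50):
    x[t] = M_true @ x[t - 1]

# Multi-output fit: predict x(t) from x(t-1) for all components at once.
reg = LinearRegression(fit_intercept=False).fit(x[:-1], x[1:])
M_est = reg.coef_  # shape (2, 2): one row per output component
```

Since sklearn predicts y = X @ coef_.T, `coef_` is exactly the transition matrix M acting on column vectors.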

nilsnevertree commented 1 year ago

The problem can be seen in this figure.

[Figure: 2D-linear-regression-problem]

Top: observations with noise. Mid: weight kernel. Bottom: weighted observations.

nilsnevertree commented 1 year ago

> It is important to note that it is not the time value that matters for the local linear regression, but the state of the model in state space at that time. That is why the weights should be 2D arrays.

This might not be true: actually the neighbouring timesteps should be used. We stay in time space and do not work in state space.
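Under that reading, the weights live in time: one scalar per timestep, concentrated around the timestep of interest. A sketch (the function name and signature are hypothetical, not the repository's `gaussian_weights`):

```python
import numpy as np

def gaussian_weights_1d(times, t_center, sigma):
    """Hypothetical helper: 1D Gaussian weights over timesteps,
    centred on t_center -- a single scalar weight per timestep."""
    times = np.asarray(times, dtype=float)
    w = np.exp(-0.5 * ((times - t_center) / sigma) ** 2)
    return w / w.sum()  # normalise so the weights sum to 1

times = np.arange(100)
w = gaussian_weights_1d(times, t_center=50, sigma=5.0)
# w has shape (100,) and can be passed directly as sample_weight.
```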

nilsnevertree commented 1 year ago

[Figure: 2D-linear-regression-problem]

Therefore, this is how the weighting should look.

ptandeo commented 1 year ago

The weights should be 1D, even if the dimension of the problem is nD.
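With 1D weights, no per-component loop is needed: scikit-learn's `LinearRegression` supports multi-output targets with a single per-sample weight, so the original call works unchanged for an nD state (shapes here are made up for illustration).

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
x_out = rng.normal(size=(100, 3))      # nD state trajectory
sample_weight = rng.uniform(size=99)   # 1D: one scalar weight per timestep

# Multi-output fit with a single 1D sample_weight applied to every component.
reg = LinearRegression(fit_intercept=False)
reg.fit(x_out[:-1], x_out[1:], sample_weight=sample_weight)
M = reg.coef_  # shape (3, 3): the full transition matrix in one fit
```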