It is important to note that it is not the time value that matters for the local linear regression, but the state of the model in state space at that time. Thus the weights should be 2D arrays.
The coefficients from the linear regression form the matrix M, since x(t) = M x(t-1). So it is a hyperplane linear regression of dimension components(x) - 1, leading to a coefficient matrix of shape (components(x), components(x)).
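For illustration, here is a minimal sketch (assuming NumPy; the data and variable names are hypothetical, not the repository's actual code) of how a coefficient matrix M of shape (components(x), components(x)) can be obtained from a weighted least-squares fit of x(t) on x(t-1):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example data: a trajectory of the state vector x with n_comp components.
n_time, n_comp = 200, 3
x = rng.normal(size=(n_time, n_comp))

# Pairs (x(t-1), x(t)) used as predictors and targets of the regression.
X_prev = x[:-1, :]          # shape (n_time - 1, n_comp)
X_next = x[1:, :]           # shape (n_time - 1, n_comp)

# Local weights: one scalar weight per sample (per timestep pair).
weights = rng.uniform(size=n_time - 1)

# Weighted least squares: scale each sample (row) by sqrt(weight),
# then solve X_prev @ C = X_next for C = M.T, so that x(t) = M x(t-1).
sw = np.sqrt(weights)[:, None]
C, *_ = np.linalg.lstsq(sw * X_prev, sw * X_next, rcond=None)
M = C.T

print(M.shape)              # (n_comp, n_comp) -> (components(x), components(x))
```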
The problem can be seen in this figure (top: observations with noise; middle: weight kernel; bottom: weighted observations).
The earlier claim, that the weights should be 2D because the state in state space matters rather than the time value, might not be true: actually the neighbouring timesteps should be used. We stay in time space and do not do this in state space.
Therefore, this is how the weighting should look: the weights should be 1D, even if the dimension of the problem is nD.
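As a sketch of what this means (hypothetical example data, assuming NumPy): one scalar weight per timestep, for example from a Gaussian kernel over the time axis, which is then applied to every component of the state vector:

```python
import numpy as np

# Hypothetical observations: n_time timesteps of an nD state vector.
n_time, n_comp = 100, 3
time = np.arange(n_time)
observations = np.random.default_rng(1).normal(size=(n_time, n_comp))

# 1D weight kernel in *time*: one weight per timestep, centred on t0,
# regardless of how many components the state vector has.
t0, scale = 50, 10.0
weights = np.exp(-0.5 * ((time - t0) / scale) ** 2)   # shape (n_time,)

# The same 1D weights are applied to every component of the state vector.
weighted_obs = observations * weights[:, None]        # shape (n_time, n_comp)

print(weights.shape, weighted_obs.shape)              # (100,) (100, 3)
```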
The weights calculated by the current implementation are 2D, because the individual components of the state vector can be at different positions.
Therefore, the current function cannot be used for this problem.
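To illustrate the mismatch (the repository's actual function is not shown here, so this sketch uses scikit-learn's weighted least squares as a stand-in): standard interfaces expect one scalar weight per sample, so 2D weights of shape (n_time, n_components) are rejected, while 1D weights of shape (n_time,) are accepted:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_time, n_comp = 100, 3
x = rng.normal(size=(n_time, n_comp))
X_prev, X_next = x[:-1], x[1:]

# 1D weights (one per timestep pair) fit the standard WLS interface:
weights_1d = rng.uniform(size=n_time - 1)
LinearRegression().fit(X_prev, X_next, sample_weight=weights_1d)   # works

# 2D weights (one per timestep *and* per component) do not:
weights_2d = rng.uniform(size=(n_time - 1, n_comp))
try:
    LinearRegression().fit(X_prev, X_next, sample_weight=weights_2d)
except ValueError as err:
    print("2D sample_weight rejected:", err)
```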