HKUST-Aerial-Robotics / VINS-Fusion

An optimization-based multi-sensor state estimator
GNU General Public License v3.0
3.54k stars 1.39k forks

How to extract the covariance matrix of the estimated states? #68

Open skyertian opened 5 years ago

skyertian commented 5 years ago

I want to get the covariance matrix (or information matrix) of the states. Is there some way to extract them from the "problem" object?

shaozu commented 5 years ago

To obtain the covariance or information matrix, you need to collect all factors and formulate the problem as Ax = b, where A stacks the (whitened) factor Jacobians. The information matrix is then A'*A, and the covariance is its inverse. You can refer to marginalization_factor.cpp for how the problem is assembled.

xdtl commented 4 years ago

@shaozu Could you please elaborate a bit about how to formulate the problem as Ax=b? Thanks very much!

Wangxuefeng92 commented 3 years ago

@shaozu I tried to get the covariance by calling inverse() on J'J, but J'J is not full rank. Is there any way to handle this?

shaozu commented 3 years ago

Hi @Wangxuefeng92, you may try the following code:

```cpp
Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes(A);

Eigen::MatrixXd A_inv = saes.eigenvectors() * Eigen::VectorXd((saes.eigenvalues().array() > 1e-8).select(saes.eigenvalues().array().inverse(), 0)).asDiagonal() * saes.eigenvectors().transpose();
```

Wangxuefeng92 commented 3 years ago

@shaozu Thanks for your reply! I actually tried this method and it produces a covariance matrix, but it does not seem correct. It looks like this:

```
74325.2255912606 -68.318715348381 31.931881402562 0.000415757048
-105.967952683968 -5956.82217449409 4.402051776849 3.0684659E-05
19.156882731848 2.498253394926 -3276.79307465339 4.885786E-06
-4.0447002E-05 1.530197E-06 3.419551E-06 0.000322151854
-0.000233726671 3.126246E-06 1.186476E-05 0.000363354165
-138399.117842808 1189.75734353551 -46.251377295152 0.033636612983
74326.7092803511 -68.331469982257 31.93237722616 0.000415127884
-112.78069736659 -5956.76360827619 4.399775036249 3.2525628E-05
19.156882828659 2.498253393968 -3276.79307465866 4.614159E-06
-4.0449179E-05 1.530025E-06 3.419855E-06 0.000322164327
-0.000233729187 3.126185E-06 1.1864615E-05 0.000363349931
```

sy8008 commented 3 years ago

I have also tried the following code:

```cpp
Eigen::SelfAdjointEigenSolver<Eigen::MatrixXd> saes(A);

Eigen::MatrixXd A_inv = saes.eigenvectors() * Eigen::VectorXd((saes.eigenvalues().array() > 1e-8).select(saes.eigenvalues().array().inverse(), 0)).asDiagonal() * saes.eigenvectors().transpose();
```

but the resulting covariance matrix (A_inv) is very large. For example, extracting the covariance of Para_pose[WINDOWSIZE - 1] on a test sequence, I got: 25764.5042544688, 207256.2387890820, 27729.1071558241.

I wonder why is that, thank you!

MeisonP commented 2 years ago

> I have also tried the following code … but the resulting covariance matrix (A_inv) is very large … I wonder why is that, thank you!

Did you solve this problem? The covariance is also large in my case; it was computed using the Ceres interface.