jimazeyu / visual_inertial_SLAM

The goal of this project is to implement visual-inertial simultaneous localization and mapping (SLAM) using an extended Kalman filter (EKF).

Update covariance matrix #1

Open Serissa opened 5 months ago

Serissa commented 5 months ago

Hello! I don't understand the following two lines of code. Can you briefly explain what they do?

```python
self.covariance[:-6, -6:] = self.covariance[:-6, -6:] @ F.T
self.covariance[-6:, :-6] = F @ self.covariance[-6:, :-6]
```

jimazeyu commented 5 months ago

When the robot moves, the correlation between the robot pose and the landmark positions also changes, so the pose-landmark cross-covariance blocks have to be propagated by the motion Jacobian F as well.
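A minimal sketch of that prediction step, assuming (as the indexing `self.covariance[:-6, -6:]` suggests) a joint state of M 3D landmarks followed by a 6-DoF pose block at the end. The values of `M`, `F`, and `W` here are placeholders, not the repo's actual code:

```python
import numpy as np

M = 4                     # hypothetical number of landmarks
n = 3 * M + 6             # state: 3M landmark coordinates + 6-DoF pose
covariance = np.eye(n)    # joint covariance (placeholder initialization)
F = np.eye(6)             # pose motion Jacobian (placeholder)
W = 1e-3 * np.eye(6)      # pose process noise (assumed)

# Pose block: Sigma_pp <- F Sigma_pp F^T + W
covariance[-6:, -6:] = F @ covariance[-6:, -6:] @ F.T + W
# Cross blocks: the pose changed, so the landmark-pose correlations
# must be propagated by F too, or the joint covariance becomes inconsistent.
covariance[:-6, -6:] = covariance[:-6, -6:] @ F.T
covariance[-6:, :-6] = F @ covariance[-6:, :-6]
```

Updating only the pose block while leaving the cross blocks untouched would make the matrix stop describing a valid joint distribution over pose and landmarks.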

Serissa commented 5 months ago

The matrix H contains a lot of zeros. Can we just multiply the corresponding nonzero blocks instead of the full matrices?

```python
K = self.covariance @ H.T @ np.linalg.inv(H @ self.covariance @ H.T + R)
```

jimazeyu commented 5 months ago

You may decompose the matrices into blocks and compute the products and the inverse on the relevant blocks only. It's a little more complex, but you can try it.
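One way the block trick could look, as a hedged sketch rather than the repo's code: since H is zero outside the columns of the observed landmarks (and the pose), both `H @ Sigma @ H.T` and `Sigma @ H.T` can be formed from sub-blocks of the covariance. The `observed` indices, the `Hs` block, and the 4-measurements-per-landmark convention are illustrative assumptions:

```python
import numpy as np

M = 5                              # hypothetical number of landmarks
n = 3 * M + 6                      # state: 3M landmark coords + 6-DoF pose
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)    # some valid (SPD) covariance

observed = [1, 3]                  # indices of currently observed landmarks
# Columns H actually touches: the observed landmarks plus the pose block.
cols = np.concatenate([np.arange(3 * j, 3 * j + 3) for j in observed]
                      + [np.arange(n - 6, n)])

m = 4 * len(observed)              # e.g. 4 stereo-pixel rows per landmark
Hs = rng.standard_normal((m, len(cols)))   # the nonzero block of H
H = np.zeros((m, n))
H[:, cols] = Hs
R = 0.1 * np.eye(m)

# Full computation, as in the question:
K_full = Sigma @ H.T @ np.linalg.inv(H @ Sigma @ H.T + R)

# Block computation: only touches the rows/columns of Sigma that H uses.
S = Hs @ Sigma[np.ix_(cols, cols)] @ Hs.T + R
K_block = Sigma[:, cols] @ Hs.T @ np.linalg.inv(S)
```

The two gains agree; the block version avoids multiplying through the large zero regions of H, which matters when the map has many landmarks but only a few are seen per frame.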

Serissa commented 5 months ago

Why is the dimension of the covariance matrix three times the number of features?

jimazeyu commented 5 months ago

Everything you want to know is in these slides: https://natanaso.github.io/ece276a/ref/ECE276A_12_VI_SLAM.pdf
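The short version, as an assumed sketch consistent with the indexing earlier in this thread: each of the M landmarks is a 3D point (x, y, z), so the landmark part of the state has 3M entries, and with a 6-DoF pose perturbation appended the joint covariance is (3M + 6) x (3M + 6):

```python
import numpy as np

M = 10                              # hypothetical number of landmarks
landmark_means = np.zeros(3 * M)    # 3 coordinates per landmark -> 3M entries
covariance = np.eye(3 * M + 6)      # 3M landmark dims + 6 pose dims
print(covariance.shape)             # (36, 36)
```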

Serissa commented 5 months ago

Thank you very much.