mspenko / RoboticsLab-Integrity-Evaluation

Localization Integrity Evaluation for Mobile Robots

IMU factor #2

Open ZHENGXi-git opened 6 months ago

ZHENGXi-git commented 6 months ago

Hi, I'm interested in applying integrity evaluation with other sensors, such as LiDAR, cameras, and IMUs. However, I think the IMU is different from the other two, since it does not require feature matching. Could you please give me some advice on how to process the IMU data when considering integrity? Thank you very much!

mspenko commented 6 months ago

Check out our papers. You can find them on scholar.google.com.

ZHENGXi-git commented 6 months ago

I have actually read the paper 'Localization safety validation for autonomous robots'. In that paper, the IMU model is expressed as x_{k+1} = g + w_k + f_u. For LiDAR and camera measurements, I understand that f_u represents faulted correspondences or outliers. However, how do outliers arise in IMU measurements? The IMU factor is based on IMU preintegration, which is different from camera feature observations.
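
For concreteness, here is a minimal sketch (my own illustration, not code from this repository or the paper) of one way an IMU fault could be interpreted: instead of a mismatched feature, the fault enters as an offset on the inertial input (e.g., an undetected accelerometer bias jump), which the propagation then maps into an additive term on the state, playing the same role as f_u above. All names here (`propagate`, `a_meas`, `f_u`, `sigma_w`) are hypothetical and chosen only for illustration.

```python
import numpy as np

def propagate(x, a_meas, dt, f_u=0.0, sigma_w=0.05, rng=None):
    """One propagation step x_{k+1} = g(x_k, u_k) + w_k, with an optional
    IMU-level fault f_u added to the measured acceleration (illustrative only)."""
    rng = rng or np.random.default_rng(0)
    p, v = x
    a = a_meas + f_u                        # fault modeled as a bias/offset on the IMU input
    p_next = p + v * dt + 0.5 * a * dt**2   # kinematic (preintegration-style) update
    v_next = v + a * dt
    w_k = rng.normal(0.0, sigma_w, size=2)  # nominal process noise
    return np.array([p_next, v_next]) + w_k

# Nominal vs. faulted step: a 0.5 m/s^2 accelerometer bias acting over dt = 0.1 s.
x = np.array([0.0, 1.0])                    # state: [position, velocity]
print("nominal:", propagate(x, a_meas=0.2, dt=0.1))
print("faulted:", propagate(x, a_meas=0.2, dt=0.1, f_u=0.5))
```

Under this reading, the difference between the nominal and faulted propagated states is the effective additive fault on x, analogous to the f_u produced by a faulted LiDAR or camera association.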