Global navigation satellite system (GNSS), inertial measurement unit (IMU), and light detection and ranging (LiDAR) sensors, as common components of navigation systems, are widely fused in simultaneous localization and mapping (SLAM) to improve performance. However, strong robustness and high efficiency have received little attention, posing challenges for the ubiquitous application of robotic systems. This article presents a robust and fast multi-modal fusion odometry centered on an inertial navigation system (INS). Specifically, our framework consists of two subsystems: the GNSS-IMU subsystem estimates coarse states and provides global constraints, while the LiDAR-IMU subsystem refines the states and constructs the 3D map. State estimation is performed by an iterated error-state Kalman filter that runs without interruption even if some sensors fail. Moreover, a fast and dynamic frame transformation method and a distributed backward propagation scheme are designed to link world-frame states with global measurements and to cope with asynchronous data. Extensive qualitative and quantitative experiments are conducted on public datasets and in real-world scenarios. The results show that our system achieves competitive accuracy and efficiency, with strong robustness against sensor degradation and failure.