Global navigation satellite system (GNSS), inertial measurement unit (IMU), and light detection and ranging (LiDAR) sensors are common in navigation systems and are widely used in fusion-based simultaneous localization and mapping (SLAM) to improve performance. However, strong robustness and high efficiency have not received much attention, which poses challenges for the ubiquitous application of robotic systems. This article presents a robust and fast fusion odometry built on INS-centric multiple modalities. Specifically, our framework consists of two subsystems: the GNSS-IMU subsystem estimates coarse states and provides global constraints, while the LiDAR-IMU subsystem refines the states and constructs the 3D map. State estimation is performed by an iterated error-state Kalman filter and continues without interruption even if some sensors fail. Moreover, a fast and dynamic frame transformation method and distributed backward propagation are designed to link world-frame states with global measurements and to cope with asynchronous data. Extensive qualitative and quantitative experiments are conducted on public datasets and in real-world scenarios. The results show that our system achieves competitive accuracy and efficiency, with strong robustness against sensor degradation and failure.
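The abstract names an iterated error-state Kalman filter as the state estimator shared by both subsystems. The paper's actual implementation is not shown here, but the core of any iterated Kalman update is the Gauss-Newton-style relinearization of the measurement model about each iterate. The sketch below illustrates that generic update step; the function name `ieskf_update` and all arguments are illustrative, not taken from the authors' code.

```python
import numpy as np

def ieskf_update(x0, P, z, h, H_jac, R, iters=5, tol=1e-9):
    """Generic iterated Kalman measurement update (Gauss-Newton form).

    This is a textbook sketch, not the paper's implementation:
    x0, P  -- prior state estimate (n,) and covariance (n, n)
    z      -- measurement (m,)
    h      -- measurement model, h(x) -> (m,)
    H_jac  -- Jacobian of h, H_jac(x) -> (m, n)
    R      -- measurement noise covariance (m, m)
    """
    x = x0.copy()
    for _ in range(iters):
        H = H_jac(x)                      # relinearize about current iterate
        S = H @ P @ H.T + R               # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
        # IEKF iterate: correct the prior using the residual at x
        x_new = x0 + K @ (z - h(x) - H @ (x0 - x))
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    H = H_jac(x)                          # final linearization point
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    P_new = (np.eye(len(x0)) - K @ H) @ P # posterior covariance
    return x, P_new

# Usage: a linear position measurement of a 2-state system, for which the
# iteration converges immediately and matches the standard Kalman update.
H_lin = np.array([[1.0, 0.0]])
x_post, P_post = ieskf_update(
    x0=np.zeros(2), P=np.eye(2), z=np.array([1.0]),
    h=lambda s: H_lin @ s, H_jac=lambda s: H_lin, R=np.eye(1))
# x_post ≈ [0.5, 0.0]
```

In the paper's error-state formulation the iterate would be an error perturbation composed onto a nominal state (with manifold retraction for rotations) rather than the full-state vector shown here; this sketch keeps the update in plain vector form for clarity.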