TY - JOUR
T1 - VIRAL-Fusion: A Visual-Inertial-Ranging-Lidar Sensor Fusion Approach
T2 - IEEE Transactions on Robotics
AU - Nguyen, Thien Minh
AU - Cao, Muqing
AU - Yuan, Shenghai
AU - Lyu, Yang
AU - Nguyen, Thien Hoang
AU - Xie, Lihua
N1 - Publisher Copyright:
© 2004-2012 IEEE.
PY - 2022/4/1
Y1 - 2022/4/1
N2 - In recent years, onboard self-localization (OSL) methods based on cameras or lidar have made significant progress. However, issues such as estimation drift and robustness in low-texture environments remain inherent challenges for OSL methods. On the other hand, infrastructure-based methods can generally overcome these issues, but at the expense of some installation cost. This poses an interesting problem of how to effectively combine these methods, so as to achieve localization with long-term consistency as well as flexibility compared to any single method. To this end, we propose a comprehensive optimization-based estimator for the 15-D state of an unmanned aerial vehicle (UAV), fusing data from an extensive set of sensors: an inertial measurement unit (IMU), ultrawideband (UWB) ranging sensors, and multiple onboard visual-inertial and lidar odometry subsystems. In essence, a sliding window is used to formulate a sequence of robot poses, where relative rotational and translational constraints between these poses are observed in the IMU preintegration and OSL observations, while orientation and position are coupled in the body-offset UWB range observations. An optimization-based approach is developed to estimate the trajectory of the robot in this sliding window. We evaluate the performance of the proposed scheme in multiple scenarios, including experiments on public datasets, high-fidelity graphical-physical simulation, and field-collected data from UAV flight tests. The results demonstrate that our integrated localization method can effectively resolve the drift issue while incurring minimal installation requirements.
AB - In recent years, onboard self-localization (OSL) methods based on cameras or lidar have made significant progress. However, issues such as estimation drift and robustness in low-texture environments remain inherent challenges for OSL methods. On the other hand, infrastructure-based methods can generally overcome these issues, but at the expense of some installation cost. This poses an interesting problem of how to effectively combine these methods, so as to achieve localization with long-term consistency as well as flexibility compared to any single method. To this end, we propose a comprehensive optimization-based estimator for the 15-D state of an unmanned aerial vehicle (UAV), fusing data from an extensive set of sensors: an inertial measurement unit (IMU), ultrawideband (UWB) ranging sensors, and multiple onboard visual-inertial and lidar odometry subsystems. In essence, a sliding window is used to formulate a sequence of robot poses, where relative rotational and translational constraints between these poses are observed in the IMU preintegration and OSL observations, while orientation and position are coupled in the body-offset UWB range observations. An optimization-based approach is developed to estimate the trajectory of the robot in this sliding window. We evaluate the performance of the proposed scheme in multiple scenarios, including experiments on public datasets, high-fidelity graphical-physical simulation, and field-collected data from UAV flight tests. The results demonstrate that our integrated localization method can effectively resolve the drift issue while incurring minimal installation requirements.
KW - Aerial robots
KW - localization
KW - optimization
UR - http://www.scopus.com/inward/record.url?scp=85106449710&partnerID=8YFLogxK
U2 - 10.1109/TRO.2021.3094157
DO - 10.1109/TRO.2021.3094157
M3 - Article
AN - SCOPUS:85106449710
SN - 1552-3098
VL - 38
SP - 958
EP - 977
JO - IEEE Transactions on Robotics
JF - IEEE Transactions on Robotics
IS - 2
ER -