TY - JOUR
T1 - Fast multi-feature tracking method based on tightly coupled sensors
AU - Zhu, Yanze
AU - Cao, Ziyu
AU - Yang, Jianhua
AU - Hou, Hong
AU - Quan, Yihong
N1 - Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2023/11/15
Y1 - 2023/11/15
N2 - In contrast to conventional methods that optimize feature-matching results at the data-processing level, we use a tightly coupled mode to fuse the information collected by the camera and the inertial sensor at the data-gathering level, which contributes to better accuracy and robustness in the feature-matching process. Specifically, we propose a visual-inertial feature tracking method that combines inertial measurement unit (IMU) calibration, feature matching, and prediction algorithms. The approach includes a vision-aided multi-level IMU system-level calibration method and an inertial-aided image feature prediction algorithm, which effectively processes and utilizes information from multiple sensors. Our method addresses not only the image distortion and blur caused by illumination changes and fast camera motion but also the measurement errors that can arise from long-term operation of the inertial sensors. Extensive experiments demonstrate that its efficiency is superior to that of state-of-the-art methods: with the accuracy kept at the same level, the speed of feature matching is improved by 41.8%. Additionally, when applied to simultaneous localization and mapping systems, its localization performance is 8.1% better than that of the VINS-mono method.
AB - In contrast to conventional methods that optimize feature-matching results at the data-processing level, we use a tightly coupled mode to fuse the information collected by the camera and the inertial sensor at the data-gathering level, which contributes to better accuracy and robustness in the feature-matching process. Specifically, we propose a visual-inertial feature tracking method that combines inertial measurement unit (IMU) calibration, feature matching, and prediction algorithms. The approach includes a vision-aided multi-level IMU system-level calibration method and an inertial-aided image feature prediction algorithm, which effectively processes and utilizes information from multiple sensors. Our method addresses not only the image distortion and blur caused by illumination changes and fast camera motion but also the measurement errors that can arise from long-term operation of the inertial sensors. Extensive experiments demonstrate that its efficiency is superior to that of state-of-the-art methods: with the accuracy kept at the same level, the speed of feature matching is improved by 41.8%. Additionally, when applied to simultaneous localization and mapping systems, its localization performance is 8.1% better than that of the VINS-mono method.
KW - Coupled mode
KW - Feature matching
KW - System-level calibration
KW - Visual-inertial fusion
UR - http://www.scopus.com/inward/record.url?scp=85171475053&partnerID=8YFLogxK
U2 - 10.1016/j.measurement.2023.113528
DO - 10.1016/j.measurement.2023.113528
M3 - Article
AN - SCOPUS:85171475053
SN - 0263-2241
VL - 221
JO - Measurement: Journal of the International Measurement Confederation
JF - Measurement: Journal of the International Measurement Confederation
M1 - 113528
ER -