Fast multi-feature tracking method based on tightly coupled sensors

Yanze Zhu, Ziyu Cao, Jianhua Yang, Hong Hou, Yihong Quan

Research output: Contribution to journal › Article › peer-review


Abstract

In contrast to conventional methods that optimize feature-matching results at the data-processing level, we use a tightly coupled mode to fuse the information collected by the camera and the inertial sensor at the data-gathering level, which contributes to better accuracy and robustness in the feature-matching process. Specifically, we propose a visual-inertial feature tracking method that combines inertial measurement unit (IMU) calibration, feature matching, and prediction algorithms. The approach includes a vision-aided multi-level IMU systemic calibration method and an inertial-aided image feature prediction algorithm, which effectively process and utilize information from multiple sensors. Our method addresses not only the image distortion and blur caused by illumination changes and fast camera motion but also the measurement errors that can arise from long-term operation of the inertial sensors. Extensive experiments demonstrate that its efficiency is superior to that of state-of-the-art methods: with accuracy remaining at the same level, feature-matching speed is improved by 41.8%. Additionally, when applied to simultaneous localization and mapping systems, its localization performance is 8.1% better than that of the VINS-mono method.
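The inertial-aided feature prediction idea mentioned in the abstract can be sketched roughly as follows: integrate gyroscope readings between consecutive frames into a rotation, then warp each tracked pixel through the rotation-only homography H = K R K⁻¹ so that the matcher only needs to search a small window around the predicted location. This is an illustrative sketch under a pure-rotation, pinhole-camera assumption, not the paper's actual implementation; the function names and the intrinsics matrix `K_cam` are hypothetical.

```python
import numpy as np

def integrate_gyro(gyro_samples, dt):
    """Integrate gyroscope rates (rad/s) over the inter-frame interval
    into a single rotation matrix, composing one small rotation per sample."""
    R = np.eye(3)
    for w in gyro_samples:
        theta = np.asarray(w, dtype=float) * dt
        angle = np.linalg.norm(theta)
        if angle < 1e-12:
            continue  # negligible rotation for this sample
        axis = theta / angle
        K = np.array([[0.0, -axis[2], axis[1]],
                      [axis[2], 0.0, -axis[0]],
                      [-axis[1], axis[0], 0.0]])
        # Rodrigues' formula for the incremental rotation
        dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
        R = R @ dR
    return R

def predict_features(pts, K_cam, R):
    """Predict pixel locations in the next frame by warping through the
    rotation-only homography H = K_cam @ R @ inv(K_cam)."""
    H = K_cam @ R @ np.linalg.inv(K_cam)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous pixels
    warped = (H @ pts_h.T).T
    return warped[:, :2] / warped[:, 2:3]  # back to inhomogeneous pixels
```

With predictions in hand, a descriptor matcher can be restricted to a small neighborhood of each predicted point, which is one plausible source of the matching speed-up the abstract reports.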

Original language: English
Article number: 113528
Journal: Measurement: Journal of the International Measurement Confederation
Volume: 221
DOIs
State: Published - 15 Nov 2023

Keywords

  • Coupled mode
  • Feature matching
  • System-level calibration
  • Visual-inertial fusion
