SLAM in Low-Light Environments Based on Infrared-Visible Light Fusion

Haiwei Wang, Chenqi Gao, Tianyu Gao, Jinwen Hu, Zhao Xu, Junwei Han, Yan Zhu, Yong Wu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Traditional visual Simultaneous Localization and Mapping (SLAM) techniques struggle to obtain effective information in non-ideal environments, such as those with changing illumination or heavy smoke, which degrades the performance of SLAM algorithms. To overcome these challenges, this paper proposes a visual SLAM front-end system based on infrared-visible light fusion. The system achieves precise optimization of camera poses and map point locations in non-ideal environments by jointly optimizing the reprojection errors of visible-light image point features and infrared image edge features. In addition, this paper further improves the robustness of the algorithm in non-ideal environments through tightly coupled back-end optimization of the infrared-visible light measurements and an Inertial Measurement Unit (IMU).
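The joint front-end cost described in the abstract can be sketched as a stacked residual: point reprojection errors from the visible-light image plus edge-alignment residuals from the infrared image. The sketch below is an illustrative assumption, not the paper's implementation; the distance-transform formulation of the edge residual, the function names, and the weights `w_pt` and `w_edge` are all hypothetical choices for demonstration.

```python
import numpy as np

def project(K, T_cw, p_w):
    """Project a 3D world point into the image using intrinsics K (3x3)
    and a world-to-camera pose T_cw = [R | t] (3x4)."""
    p_c = T_cw[:, :3] @ p_w + T_cw[:, 3]   # transform into camera frame
    uv = K @ p_c                            # pinhole projection
    return uv[:2] / uv[2]

def joint_residual(K, T_cw, points_w, obs_uv, edge_pts_w, edge_dist,
                   w_pt=1.0, w_edge=1.0):
    """Stacked residual vector for one frame:
    - visible-light term: reprojection error of each 3D point feature
      against its observed pixel location;
    - infrared term: for each 3D edge point, the value of a precomputed
      distance transform of the infrared edge map at the projected pixel
      (zero when the projection lands exactly on an edge).
    A nonlinear least-squares solver would minimize ||residual||^2 over
    the pose and the map points."""
    r = []
    for p_w, uv in zip(points_w, obs_uv):
        r.extend(w_pt * (project(K, T_cw, p_w) - uv))
    for p_w in edge_pts_w:
        u, v = project(K, T_cw, p_w)
        r.append(w_edge * edge_dist[int(round(v)), int(round(u))])
    return np.array(r)
```

In practice both terms would be fed to an iterative solver (e.g. a Gauss-Newton or Levenberg-Marquardt loop) that updates the camera pose and map-point positions jointly, which is the "precise optimization" the abstract refers to.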

Original language: English
Title of host publication: 2024 IEEE 18th International Conference on Control and Automation, ICCA 2024
Publisher: IEEE Computer Society
Pages: 868-873
Number of pages: 6
ISBN (Electronic): 9798350354409
DOIs
State: Published - 2024
Event: 18th IEEE International Conference on Control and Automation, ICCA 2024 - Reykjavik, Iceland
Duration: 18 Jun 2024 → 21 Jun 2024

Publication series

Name: IEEE International Conference on Control and Automation, ICCA
ISSN (Print): 1948-3449
ISSN (Electronic): 1948-3457

Conference

Conference: 18th IEEE International Conference on Control and Automation, ICCA 2024
Country/Territory: Iceland
City: Reykjavik
Period: 18/06/24 → 21/06/24
