TY - GEN
T1 - Target-Tracking-Based Multi-UAV Visual SLAM in GNSS-Denied Environments
AU - Fu, Xinyu
AU - Zhu, Zhanxia
AU - Xu, Zhi
AU - Li, Qianlong
N1 - Publisher Copyright:
© 2025 by the International Astronautical Federation (IAF). All rights reserved.
PY - 2025
Y1 - 2025
N2 - The interior of a space station is a complex environment in which Global Navigation Satellite System (GNSS) signals are denied. With a vast amount of equipment and a complicated structure inside the cabin, the manual workload of daily operation and maintenance is substantial, giving rise to a demand for auxiliary inspection robots. Multi-rotor UAV swarms are uniquely suited to this task owing to their small size, high flexibility, and parallel processing capabilities. This paper explores the feasibility of establishing an in-cabin auxiliary inspection robot system using a group of micro-UAVs. By harnessing visual Simultaneous Localization and Mapping (SLAM) technology, the UAV group can build a map of the space station's interior, providing an environmental basis for inspection operations. Nevertheless, the mobility of UAVs creates potential collision risks among multiple vehicles, which remains a challenging research issue. To address it, we establish a system that relies on visual information for positioning. By introducing a deep-learning-based target tracking algorithm, we can effectively obtain the relative positions of the UAVs and thereby avoid collisions. Specifically, the tracking algorithm builds on accurate target detection: it performs tracking by associating high-confidence and low-confidence detection boxes, and shows stronger robustness against occluded and temporarily disappearing targets. Even when the initial positions of the UAVs are unknown, each UAV's on-board computer can identify and track the other UAVs captured in its images from visual information alone, enabling stable acquisition of relative distances and azimuths.
This approach significantly enhances the reliability of the system in complex environments, ensuring that the multi-UAV system can still achieve precise tracking and positioning in complex GNSS-denied scenarios. The approach can be further extended to other types of missions and robot groups. By enhancing the adaptability of such robot groups to extreme environments, it lays a reliable foundation for positioning and environmental perception in emerging missions such as lunar lava tube exploration. Simulation experiments using measurement models obtained from real data verify the effectiveness of the proposed method.
AB - The interior of a space station is a complex environment in which Global Navigation Satellite System (GNSS) signals are denied. With a vast amount of equipment and a complicated structure inside the cabin, the manual workload of daily operation and maintenance is substantial, giving rise to a demand for auxiliary inspection robots. Multi-rotor UAV swarms are uniquely suited to this task owing to their small size, high flexibility, and parallel processing capabilities. This paper explores the feasibility of establishing an in-cabin auxiliary inspection robot system using a group of micro-UAVs. By harnessing visual Simultaneous Localization and Mapping (SLAM) technology, the UAV group can build a map of the space station's interior, providing an environmental basis for inspection operations. Nevertheless, the mobility of UAVs creates potential collision risks among multiple vehicles, which remains a challenging research issue. To address it, we establish a system that relies on visual information for positioning. By introducing a deep-learning-based target tracking algorithm, we can effectively obtain the relative positions of the UAVs and thereby avoid collisions. Specifically, the tracking algorithm builds on accurate target detection: it performs tracking by associating high-confidence and low-confidence detection boxes, and shows stronger robustness against occluded and temporarily disappearing targets. Even when the initial positions of the UAVs are unknown, each UAV's on-board computer can identify and track the other UAVs captured in its images from visual information alone, enabling stable acquisition of relative distances and azimuths.
This approach significantly enhances the reliability of the system in complex environments, ensuring that the multi-UAV system can still achieve precise tracking and positioning in complex GNSS-denied scenarios. The approach can be further extended to other types of missions and robot groups. By enhancing the adaptability of such robot groups to extreme environments, it lays a reliable foundation for positioning and environmental perception in emerging missions such as lunar lava tube exploration. Simulation experiments using measurement models obtained from real data verify the effectiveness of the proposed method.
KW - GNSS-Denied Environment
KW - Multi-UAVs Cooperation
KW - Target Tracking
KW - Visual SLAM
UR - https://www.scopus.com/pages/publications/105032648034
U2 - 10.52202/083083-0073
DO - 10.52202/083083-0073
M3 - Conference contribution
AN - SCOPUS:105032648034
T3 - Proceedings of the International Astronautical Congress, IAC
SP - 641
EP - 649
BT - IAF Human Spaceflight Symposium - Held at the 76th International Astronautical Congress, IAC 2025
PB - International Astronautical Federation, IAF
T2 - 2025 IAF Human Spaceflight Symposium at the 76th International Astronautical Congress, IAC 2025
Y2 - 29 September 2025 through 3 October 2025
ER -