TY - JOUR
T1 - Multi-Object Tracking with Distributed Drones’ RGB Cameras Considering Object Localization Uncertainty
AU - Liao, Xin
AU - Fang, Bohui
AU - Shao, Weiyu
AU - Fu, Wenxing
AU - Yang, Tao
N1 - Publisher Copyright:
© 2025 by the authors.
PY - 2025/12
Y1 - 2025/12
N2 - Highlights: What are the main findings? A comprehensive passive sensing framework for multi-object tracking with distributed drones is presented. The robust localization and tracking of aerial objects can be achieved by exploiting spatio-temporal information to associate targets detected by different views. Object localization uncertainty is modeled in the Kalman filter through carefully designed process and observation noise covariance matrices. The resulting data association, based on the Mahalanobis distance, enhances the performance of multi-object tracking. What are the implications of the main finding? Since aerial objects are always small and share similar appearance features, data association between different viewpoints of passive sensors needs to consider more about the geometric constraints and temporal information. The motion of the observer drone necessitates refined modeling of process and observation noise to achieve robust object tracking. Reliable 3D multi-object tracking (MOT) using distributed drones remains challenging due to the lack of active sensing and the ambiguity in associating detections from different views. This paper presents a passive sensing framework that integrates multi-view data association and 3D MOT for aerial objects. First, object localization is achieved via triangulation using two onboard RGB cameras. To mitigate false positive objects caused by crossing bearings, spatial–temporal cues derived from 2D image detections and tracking results are exploited to establish a likelihood-based association matrix, enabling robust multi-view data association. Subsequently, optimized process and observation noise covariance matrices are formulated to quantitatively model localization uncertainty, and a Mahalanobis distance-based data association is introduced to improve the consistency of 3D tracking. Both simulation and real-world experiments demonstrate that the proposed approach achieves accurate and stable tracking performance under passive sensing conditions.
AB - Highlights: What are the main findings? A comprehensive passive sensing framework for multi-object tracking with distributed drones is presented. The robust localization and tracking of aerial objects can be achieved by exploiting spatio-temporal information to associate targets detected by different views. Object localization uncertainty is modeled in the Kalman filter through carefully designed process and observation noise covariance matrices. The resulting data association, based on the Mahalanobis distance, enhances the performance of multi-object tracking. What are the implications of the main finding? Since aerial objects are always small and share similar appearance features, data association between different viewpoints of passive sensors needs to consider more about the geometric constraints and temporal information. The motion of the observer drone necessitates refined modeling of process and observation noise to achieve robust object tracking. Reliable 3D multi-object tracking (MOT) using distributed drones remains challenging due to the lack of active sensing and the ambiguity in associating detections from different views. This paper presents a passive sensing framework that integrates multi-view data association and 3D MOT for aerial objects. First, object localization is achieved via triangulation using two onboard RGB cameras. To mitigate false positive objects caused by crossing bearings, spatial–temporal cues derived from 2D image detections and tracking results are exploited to establish a likelihood-based association matrix, enabling robust multi-view data association. Subsequently, optimized process and observation noise covariance matrices are formulated to quantitatively model localization uncertainty, and a Mahalanobis distance-based data association is introduced to improve the consistency of 3D tracking. Both simulation and real-world experiments demonstrate that the proposed approach achieves accurate and stable tracking performance under passive sensing conditions.
KW - 3D MOT
KW - distributed drones
KW - multi-view data association
KW - observation noise modeling
KW - passive localization
UR - https://www.scopus.com/pages/publications/105025767811
U2 - 10.3390/drones9120867
DO - 10.3390/drones9120867
M3 - Article
AN - SCOPUS:105025767811
SN - 2504-446X
VL - 9
JO - Drones
JF - Drones
IS - 12
M1 - 867
ER -