TY - JOUR
T1 - High-Precision Domain Adaptive Detection Method for Noncooperative Spacecraft Based on Optical Sensor Data
AU - Zhang, Gaopeng
AU - Zhang, Zhe
AU - Lai, Jiahang
AU - Zhang, Guangdong
AU - Ye, Hao
AU - Yang, Hongtao
AU - Cao, Jianzhong
AU - Du, Hubing
AU - Zhao, Zixin
AU - Chen, Weining
AU - Lu, Rong
AU - Wang, Changqing
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024/4/15
Y1 - 2024/4/15
AB - The accurate detection of noncooperative spacecraft based on optical sensor data is essential for critical space tasks, such as on-orbit servicing, rendezvous and docking, and debris removal. Traditional object detection methods struggle in the challenging space environment, which includes extreme variations in lighting, occlusions, and differences in image scale. To address this problem, this article proposes a high-precision, deep-learning-based, domain-adaptive detection method specifically tailored for noncooperative spacecraft. The proposed algorithm focuses on two key elements: dataset creation and network structure design. First, we develop a spacecraft image generation algorithm using a cycle generative adversarial network (CycleGAN), facilitating seamless conversion between synthetic and real spacecraft images to bridge domain differences. Second, we combine a domain-adversarial neural network with YOLOv5 to create a robust detection model based on multiscale domain adaptation. This approach enhances the YOLOv5 network's ability to learn domain-invariant features from both synthetic and real spacecraft images. The effectiveness of our high-precision domain-adaptive detection method is verified through extensive experimentation. This method enables several novel and significant space applications, such as space rendezvous and docking, and on-orbit servicing.
KW - Deep learning
KW - domain adaptation
KW - noncooperative spacecraft
KW - object detection
KW - optical sensor data processing
UR - http://www.scopus.com/inward/record.url?scp=85187376882&partnerID=8YFLogxK
U2 - 10.1109/JSEN.2024.3370309
DO - 10.1109/JSEN.2024.3370309
M3 - Article
AN - SCOPUS:85187376882
SN - 1530-437X
VL - 24
SP - 13604
EP - 13619
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
IS - 8
ER -
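
Note (illustrative, not from the paper): the record above describes pairing a domain-adversarial neural network with YOLOv5's multiscale features so the detector learns domain-invariant representations from synthetic and real images. The following minimal PyTorch sketch shows the standard DANN mechanism such a design typically relies on, a gradient-reversal layer feeding one domain classifier per feature scale; the class names, channel sizes, and loss weighting are assumptions for illustration, not the authors' implementation.

# Hypothetical sketch of multiscale domain-adversarial training (DANN-style),
# not the authors' released code: gradient reversal + per-scale domain heads.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; reverses and scales gradients on backward."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

def grad_reverse(x, lambd=1.0):
    return GradReverse.apply(x, lambd)

class DomainClassifier(nn.Module):
    """Predicts synthetic vs. real from one feature-map scale."""
    def __init__(self, in_channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 256, kernel_size=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(256, 1),  # logit: 0 = synthetic domain, 1 = real domain
        )

    def forward(self, feat, lambd=1.0):
        return self.net(grad_reverse(feat, lambd))

if __name__ == "__main__":
    # Assumed stand-ins for three detector scales (e.g., P3/P4/P5-like maps
    # from a YOLOv5-style backbone+neck) for a batch of two images.
    feats = [torch.randn(2, c, s, s) for c, s in [(128, 80), (256, 40), (512, 20)]]
    heads = [DomainClassifier(c) for c in (128, 256, 512)]
    domain_labels = torch.tensor([[0.0], [1.0]])  # one synthetic, one real image
    bce = nn.BCEWithLogitsLoss()
    adv_loss = sum(bce(h(f), domain_labels) for h, f in zip(heads, feats))
    print(float(adv_loss))

In a full training loop, these adversarial losses would be added to the detection loss, and the reversal strength lambd is commonly ramped up over training so the detector is not destabilized early on.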