TY - JOUR
T1 - Scattering and Optical Cross-Modal Attention Distillation Framework for SAR Target Recognition
AU - Wang, Longfei
AU - Liu, Zhunga
AU - Zhang, Zuowei
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Synthetic Aperture Radar (SAR) and optical sensors detect distinct target features across different spectral bands in the field of remote sensing. Consequently, the comprehensive utilization of these two kinds of data is significant for enhancing the capability of SAR target recognition. However, the substantial differences in their imaging principles make it difficult to align their discriminative features. To mitigate this problem, we propose a scattering and optical cross-modal attention distillation framework for SAR target recognition. The core contribution is to utilize SAR scattering features to bridge the shared semantic features between same-class SAR and optical images, facilitating the fusion and utilization of features between the two modalities. Firstly, a Dual Attention Enhancement (DAE) module is established to supplement the significant scattering structures of physical attention on the basis of multi-scale visual attention. This provides comprehensive semantic features for subsequent optical-to-SAR distillation. Subsequently, a Multi-Scale Attention (MSA) enhancement module is developed to adaptively enhance category-relevant discriminative features, which contributes to extracting shared semantic knowledge between scattering and optical features. Finally, the scattering image enhances the significant representations in SAR images, and the scattering features serve as a bridge to facilitate cross-modal feature distillation, thereby improving the target recognition performance on SAR images with the assistance of optical images. Extensive experiments with different fusion strategies and various novel recognition methods on the FUSAR-Ship, FGSC-23, and FGSCR-42 datasets verify the effectiveness of the proposed framework.
AB - Synthetic Aperture Radar (SAR) and optical sensors detect distinct target features across different spectral bands in the field of remote sensing. Consequently, the comprehensive utilization of these two kinds of data is significant for enhancing the capability of SAR target recognition. However, the substantial differences in their imaging principles make it difficult to align their discriminative features. To mitigate this problem, we propose a scattering and optical cross-modal attention distillation framework for SAR target recognition. The core contribution is to utilize SAR scattering features to bridge the shared semantic features between same-class SAR and optical images, facilitating the fusion and utilization of features between the two modalities. Firstly, a Dual Attention Enhancement (DAE) module is established to supplement the significant scattering structures of physical attention on the basis of multi-scale visual attention. This provides comprehensive semantic features for subsequent optical-to-SAR distillation. Subsequently, a Multi-Scale Attention (MSA) enhancement module is developed to adaptively enhance category-relevant discriminative features, which contributes to extracting shared semantic knowledge between scattering and optical features. Finally, the scattering image enhances the significant representations in SAR images, and the scattering features serve as a bridge to facilitate cross-modal feature distillation, thereby improving the target recognition performance on SAR images with the assistance of optical images. Extensive experiments with different fusion strategies and various novel recognition methods on the FUSAR-Ship, FGSC-23, and FGSCR-42 datasets verify the effectiveness of the proposed framework.
KW - knowledge distillation
KW - optical image
KW - Synthetic aperture radar
KW - target recognition
UR - http://www.scopus.com/inward/record.url?scp=85211999683&partnerID=8YFLogxK
U2 - 10.1109/JSEN.2024.3508822
DO - 10.1109/JSEN.2024.3508822
M3 - Article
AN - SCOPUS:85211999683
SN - 1530-437X
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
ER -