Abstract
With recent advances in remote sensing, abundant multimodal data are available for applications. However, given the redundancy and the large domain differences among multimodal data, effectively integrating these data has become important and challenging. In this paper, we propose a triplet attention feature fusion network (TAFFN) for SAR and optical image fusion classification. Specifically, a spatial attention module and a spectral attention module based on the self-attention mechanism are developed to extract spatial and spectral long-range information from the SAR image and the optical image, respectively; meanwhile, a cross-attention mechanism is proposed to capture the long-range interactive representation between the two modalities. The triplet attention features are concatenated to further integrate the complementary information of the SAR and optical images. Experiments on a SAR and optical multimodal dataset demonstrate that the proposed method achieves state-of-the-art performance.
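The abstract only outlines the architecture at a high level; the sketch below illustrates the triplet-attention fusion idea in PyTorch. All module names (`SpatialSelfAttention`, `SpectralSelfAttention`, `CrossAttention`, `TripletAttentionFusion`), feature dimensions, and the 1×1-convolution classifier head are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch of the triplet-attention fusion described in the abstract.
# Module names, channel sizes, and the classifier head are assumptions for
# illustration only; the paper's actual TAFFN layers are not specified here.
import torch
import torch.nn as nn


class SpatialSelfAttention(nn.Module):
    """Self-attention over spatial positions (HW x HW affinity), applied to SAR features."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)       # (B, HW, C/8)
        k = self.key(x).flatten(2)                          # (B, C/8, HW)
        v = self.value(x).flatten(2)                        # (B, C, HW)
        attn = torch.softmax(q @ k, dim=-1)                 # (B, HW, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)   # long-range spatial context
        return self.gamma * out + x


class SpectralSelfAttention(nn.Module):
    """Self-attention over channels (C x C affinity), applied to optical features."""
    def __init__(self):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.shape
        flat = x.flatten(2)                                          # (B, C, HW)
        attn = torch.softmax(flat @ flat.transpose(1, 2), dim=-1)    # (B, C, C)
        out = (attn @ flat).view(b, c, h, w)                         # long-range spectral context
        return self.gamma * out + x


class CrossAttention(nn.Module):
    """Cross-attention: queries from one modality attend to keys/values of the other."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x_q, x_kv):
        b, c, h, w = x_q.shape
        q = self.query(x_q).flatten(2).transpose(1, 2)      # (B, HW, C/8)
        k = self.key(x_kv).flatten(2)                        # (B, C/8, HW)
        v = self.value(x_kv).flatten(2)                       # (B, C, HW)
        attn = torch.softmax(q @ k, dim=-1)                   # cross-modal affinity
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x_q


class TripletAttentionFusion(nn.Module):
    """Concatenates spatial (SAR), spectral (optical), and cross-modal attention features."""
    def __init__(self, channels, num_classes):
        super().__init__()
        self.spatial = SpatialSelfAttention(channels)
        self.spectral = SpectralSelfAttention()
        self.cross = CrossAttention(channels)
        self.classifier = nn.Conv2d(3 * channels, num_classes, 1)  # assumed fusion head

    def forward(self, sar_feat, opt_feat):
        fused = torch.cat([self.spatial(sar_feat),
                           self.spectral(opt_feat),
                           self.cross(sar_feat, opt_feat)], dim=1)
        return self.classifier(fused)


if __name__ == "__main__":
    sar = torch.randn(2, 64, 32, 32)   # dummy SAR feature map
    opt = torch.randn(2, 64, 32, 32)   # dummy optical feature map
    logits = TripletAttentionFusion(64, num_classes=10)(sar, opt)
    print(logits.shape)                # torch.Size([2, 10, 32, 32])
```

The concatenation of the three attention outputs mirrors the abstract's description of combining the triplet attentions before classification; in practice the SAR and optical feature maps would come from modality-specific backbone encoders.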
Original language | English |
---|---|
Pages | 4256-4259 |
Number of pages | 4 |
DOIs | |
State | Published - 2021 |
Event | 2021 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2021 - Brussels, Belgium Duration: 12 Jul 2021 → 16 Jul 2021 |
Conference
Conference | 2021 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2021 |
---|---|
Country/Territory | Belgium |
City | Brussels |
Period | 12/07/21 → 16/07/21 |
Keywords
- Attention mechanism
- Feature fusion
- Land cover classification
- SAR image