Few-Shot Object Detection Based on Contrastive Class-Attention Feature Reweighting for Remote Sensing Images

Wang Miao, Zihao Zhao, Jie Geng, Wen Jiang

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

Object detection in remote sensing images with deep neural networks has been highly successful, but it relies heavily on large numbers of labeled samples for optimal performance; when labeled samples are limited, detection performance deteriorates. To overcome these limitations, we propose a few-shot object detection (FSOD) method based on contrastive class-attention feature reweighting for remote sensing images. A Siamese representation embedding model based on contrastive learning with a distinguishing operator is proposed to address interclass feature ambiguity between base classes and novel classes under complex backgrounds. In addition, an attention-based class-weighting method applied to region of interest (ROI) features is used to boost the discriminability of novel-class features. We conducted comprehensive experiments on two widely used remote sensing object detection datasets, RSOD and DIOR. The proposed FSOD model outperformed MM-RCNN by approximately 4.7% mAP on the DIOR dataset and, in comparison to the self-adaptive attention network (SAAN), improved mAP by approximately 3.4% on the RSOD dataset.
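
The abstract outlines two components: class-attention reweighting of ROI features and a Siamese contrastive embedding. The listing below is a minimal, illustrative PyTorch sketch of those two ideas only; it is not the paper's implementation, and all module names, tensor shapes, and the margin-based pairwise contrastive loss are assumptions made for illustration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ClassAttentionReweighting(nn.Module):
        """Illustrative sketch (not the paper's architecture): reweight pooled
        ROI features with one learned attention vector per class."""

        def __init__(self, feat_dim: int, num_classes: int):
            super().__init__()
            # In a few-shot setting these vectors would be derived from support
            # examples; here they are simply learnable parameters.
            self.class_weights = nn.Parameter(torch.randn(num_classes, feat_dim))

        def forward(self, roi_feats: torch.Tensor) -> torch.Tensor:
            # roi_feats: (num_rois, feat_dim) pooled ROI features.
            attn = torch.sigmoid(self.class_weights)          # (num_classes, feat_dim)
            # Channel-wise gating, broadcast to (num_rois, num_classes, feat_dim):
            # one class-specific reweighted feature per ROI and class.
            return roi_feats.unsqueeze(1) * attn.unsqueeze(0)

    def siamese_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor,
                                 same_class: torch.Tensor, margin: float = 1.0) -> torch.Tensor:
        """Standard pairwise contrastive loss between two embedding branches:
        pulls same-class pairs together, pushes different-class pairs beyond `margin`."""
        d = F.pairwise_distance(z1, z2)
        pos = same_class.float() * d.pow(2)
        neg = (1 - same_class.float()) * F.relu(margin - d).pow(2)
        return (pos + neg).mean()

    if __name__ == "__main__":
        reweight = ClassAttentionReweighting(feat_dim=256, num_classes=5)
        rois = torch.randn(8, 256)                  # 8 ROI feature vectors
        print(reweight(rois).shape)                 # torch.Size([8, 5, 256])

        z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
        labels = torch.randint(0, 2, (8,))          # 1 = same class, 0 = different
        print(siamese_contrastive_loss(z1, z2, labels))

The sketch only conveys the shape of the computation (per-class gating of ROI features and a two-branch contrastive objective); the distinguishing operator and the exact reweighting scheme are described in the full paper.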

Original language: English
Pages (from-to): 2800-2814
Number of pages: 15
Journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Volume: 17
DOIs
State: Published - 2024

Keywords

  • Class-attention reweighting
  • contrastive learning
  • few-shot object detection (FSOD)
  • remote sensing image
