Task-adaptive embedding learning with dynamic kernel fusion for few-shot remote sensing scene classification

Pei Zhang, Guoliang Fan, Chanyue Wu, Dong Wang, Ying Li

Research output: Contribution to journal › Article › peer-review

19 Citations (Scopus)

Abstract

The central goal of few-shot scene classification is to learn a model that can generalize well to a novel scene category (UNSEEN) from only one or a few labeled examples. Recent works in the Remote Sensing (RS) community tackle this challenge by developing algorithms in a meta-learning manner. However, most prior approaches have focused either on rapidly optimizing a meta-learner or on finding good similarity metrics, while overlooking the embedding power. Here we propose a novel Task-Adaptive Embedding Learning (TAEL) framework that complements the existing methods by giving full play to feature embedding’s dual roles in few-shot scene classification—representing images and constructing classifiers in the embedding space. First, we design a Dynamic Kernel Fusion Network (DKF-Net) that enriches the diversity and expressive capacity of embeddings by dynamically fusing information from multiple kernels. Second, we present a task-adaptive strategy that helps to generate more discriminative representations by transforming the universal embeddings into task-adaptive embeddings via a self-attention mechanism. We evaluate our model in the standard few-shot learning setting on two challenging datasets: NWPU-RESISC45 and RSD46-WHU. Experimental results demonstrate that, on all tasks, our method achieves state-of-the-art performance by a significant margin.
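The two mechanisms named in the abstract can be illustrated in miniature. The sketch below, written in NumPy, is a simplified reading of the ideas rather than the authors' implementation: `dynamic_kernel_fusion` weights the feature maps of several convolution branches (one per kernel size) with input-dependent softmax gates, and `task_adaptive` re-weights a set of support embeddings with plain dot-product self-attention. All shapes, the gating matrix `w_gate`, and the function names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_kernel_fusion(branches, w_gate):
    """Fuse per-kernel feature maps with input-dependent weights.

    branches: list of K arrays, each (C, H, W) -- hypothetical outputs of
              conv branches with different kernel sizes.
    w_gate:   (K, C) gating matrix mapping a pooled descriptor to one
              logit per branch (a stand-in for a small learned MLP).
    """
    stacked = np.stack(branches)                 # (K, C, H, W)
    pooled = stacked.sum(0).mean(axis=(1, 2))    # (C,) global descriptor
    alpha = softmax(w_gate @ pooled)             # (K,) branch weights, sum to 1
    return np.tensordot(alpha, stacked, axes=1)  # (C, H, W) fused feature map

def task_adaptive(embeddings):
    """Transform universal embeddings into task-adaptive ones.

    embeddings: (N, D) array of the task's support embeddings.
    Each output embedding is an attention-weighted mixture of all
    embeddings in the task, so it reflects the task as a whole.
    """
    n, d = embeddings.shape
    attn = softmax(embeddings @ embeddings.T / np.sqrt(d), axis=-1)  # (N, N)
    return attn @ embeddings                                         # (N, D)
```

With uniform gate logits the fusion reduces to a simple average of the branches; the learned gates are what make the fusion "dynamic", i.e. dependent on the input image rather than fixed.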

Original language: English
Article number: 4200
Journal: Remote Sensing
Volume: 13
Issue number: 21
DOI
Publication status: Published - 1 Nov 2021
