Gated and Axis-Concentrated Localization Network for Remote Sensing Object Detection

Xiaoqiang Lu, Yuanlin Zhang, Yuan Yuan, Yachuang Feng

Research output: Contribution to journal › Article › peer-review

59 Citations (Scopus)

Abstract

In the multicategory object detection task of high-resolution remote sensing images, small objects are particularly difficult to detect. This is because location deviation affects small object detection more than large object detection: for the same decrease in intersection between a predicted box and a ground-truth box, the Intersection over Union (IoU) of a small object drops more than that of a large object. To address this challenge, we propose a new localization model that improves the location accuracy of small objects. This model is composed of two parts. First, a global feature gating process is proposed to implement a channel attention mechanism on local feature learning. This process takes full advantage of the abundant semantics of global features and the spatial details of local features, so that more effective information is selected for small object detection. Second, an axis-concentrated prediction (ACP) process is adopted to project convolutional feature maps into different spatial directions, so as to avoid interference between coordinate axes and improve location accuracy. Coordinate prediction is then implemented with a regression layer on the learned object representation. In our experiments, we explore the relationship between detection accuracy and object scale, and the results show that our method yields distinct performance improvements on small objects. Compared with classical deep learning detection models, the proposed gated axis-concentrated localization network (GACL Net) is characterized by its focus on small objects.
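The two components described in the abstract can be illustrated with a minimal sketch, assuming a PyTorch-style implementation; the module names, shapes, and the choice of axis-wise outputs below are illustrative assumptions, not the authors' released code. The gate squeezes a global feature map into per-channel weights that are applied to a local feature map, and the axis-concentrated head pools the gated features along each spatial axis before regressing coordinates per axis.

```python
# Minimal sketch (assumptions: PyTorch; module names, channel counts, and the
# per-axis output layout are hypothetical, not taken from the paper).
import torch
import torch.nn as nn


class GlobalFeatureGate(nn.Module):
    """Channel attention: global semantics gate the channels of local features."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # summarize global features per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # per-channel gate in [0, 1]
        )

    def forward(self, local_feat, global_feat):
        # Assumes local and global features share the same channel count.
        b, c, _, _ = global_feat.shape
        gate = self.fc(self.pool(global_feat).view(b, c)).view(b, c, 1, 1)
        return local_feat * gate  # keep informative channels for small objects


class AxisConcentratedHead(nn.Module):
    """Project features onto the x- and y-axes and regress each axis separately."""

    def __init__(self, channels):
        super().__init__()
        self.x_reg = nn.Linear(channels, 2)  # e.g. (x_center, width) -- illustrative
        self.y_reg = nn.Linear(channels, 2)  # e.g. (y_center, height) -- illustrative

    def forward(self, feat):
        x_profile = feat.mean(dim=2)  # collapse height -> profile along x-axis
        y_profile = feat.mean(dim=3)  # collapse width  -> profile along y-axis
        x_vec = x_profile.mean(dim=2)  # pool to one vector per sample
        y_vec = y_profile.mean(dim=2)
        return self.x_reg(x_vec), self.y_reg(y_vec)


# Usage on dummy feature maps (batch of 4, 256 channels).
local_feat = torch.randn(4, 256, 14, 14)   # spatial details
global_feat = torch.randn(4, 256, 7, 7)    # abundant semantics
gated = GlobalFeatureGate(256)(local_feat, global_feat)
x_coords, y_coords = AxisConcentratedHead(256)(gated)
print(x_coords.shape, y_coords.shape)      # torch.Size([4, 2]) torch.Size([4, 2])
```

Separating the x- and y-axis regressors mirrors the abstract's motivation: projecting the feature map onto each axis keeps the two coordinate directions from interfering with one another during localization.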

Original language: English
Article number: 8827601
Pages (from-to): 179-192
Number of pages: 14
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 58
Issue number: 1
DOI
Publication status: Published - Jan. 2020
