Laser & Optoelectronics Progress, 2021, 58(2): 0228003. Published online: 2021-01-11
Remote Sensing Image Target Detection Model Based on Attention and Feature Fusion
Keywords: image processing; remote sensing images; attention branch; feature fusion; target detection
Abstract
Accurate target detection is difficult in remote sensing images with complex backgrounds and small targets. To address this problem, a single-stage target detection model based on attention and feature fusion is proposed on top of the single-shot detector (SSD); it consists mainly of a detection branch and an attention branch. First, an attention branch is added to the SSD detection branch: its fully convolutional network (FCN) obtains the location features of the targets to be detected through pixel-wise regression. Second, the features of the detection branch and the attention branch are fused by element-wise addition, producing high-quality feature maps that are richer in both detail and semantic information. Finally, soft non-maximum suppression (Soft-NMS) is applied as a post-processing step to further improve detection accuracy. Experimental results show that the mean average precision (mAP) of the model on the UCAS-AOD and NWPU VHR-10 datasets is 92.52% and 82.49%, respectively; compared with other models, its detection efficiency is higher.
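The Soft-NMS post-processing described above can be illustrated with a minimal Gaussian-decay sketch. This is an assumption-laden illustration, not the authors' implementation: the function names, the Gaussian penalty form, and the `sigma` and `score_thresh` values are chosen here for the example. Instead of discarding boxes that overlap a higher-scoring detection (as hard NMS does), Soft-NMS decays their scores in proportion to the overlap, which helps retain closely spaced small targets.

```python
import numpy as np

def iou(box, boxes):
    """IoU between one box [x1, y1, x2, y2] and an array of boxes."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = (box[2] - box[0]) * (box[3] - box[1])
    areas = (boxes[:, 2] - boxes[:, 0]) * (boxes[:, 3] - boxes[:, 1])
    return inter / (area + areas - inter)

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: decay overlapping scores instead of dropping boxes.

    Returns the kept indices (in selection order) and the rescored array.
    """
    scores = scores.astype(float).copy()
    keep = []
    idxs = np.arange(len(scores))
    while len(idxs) > 0:
        # Select the current highest-scoring box.
        top = idxs[np.argmax(scores[idxs])]
        keep.append(int(top))
        idxs = idxs[idxs != top]
        if len(idxs) == 0:
            break
        # Decay the scores of the remaining boxes by their overlap with it.
        ov = iou(boxes[top], boxes[idxs])
        scores[idxs] *= np.exp(-(ov ** 2) / sigma)
        # Drop boxes whose score has decayed below the threshold.
        idxs = idxs[scores[idxs] > score_thresh]
    return keep, scores
```

The element-wise fusion step mentioned in the abstract is, by contrast, a plain tensor addition of the two branches' feature maps of matching shape (e.g. `fused = det_feat + att_feat`).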
Yani Wang, Xili Wang. Remote Sensing Image Target Detection Model Based on Attention and Feature Fusion[J]. Laser & Optoelectronics Progress, 2021, 58(2): 0228003.