Laser & Optoelectronics Progress, 2019, 56(10): 102802; published online 2019-07-04
Cloud Detection of ZY-3 Remote Sensing Images Based on Fully Convolutional Neural Network and Conditional Random Field
Keywords: remote sensing; cloud detection; fully convolutional neural network; ZY-3 remote sensing image; conditional random field; Gaussian kernel; mean field algorithm
Abstract
A cloud detection method for ZY-3 satellite remote sensing images is proposed that combines a fully convolutional neural network (FCN) with a conditional random field (CRF). The FCN model is optimized: the network with three upsampling stages (FCN-8s) is further upsampled, and an adaptive algorithm with momentum adjusts the parameter learning rates to accelerate convergence. The FCN is then combined with a CRF, with the FCN output image serving as the front-end first-order (unary) potential and a Gaussian kernel function as the back-end second-order (pairwise) potential. A mean-shift region constraint is added to preserve the local feature information of the images, and the posterior probability of the CRF model is inferred with the mean field algorithm. Experimental results show that the proposed method raises the cloud-region identification accuracy to 97.38%, which is 13.42% higher than that of FCN-8s.
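To make the FCN-plus-CRF pipeline concrete, the sketch below shows simplified mean field inference for a fully connected CRF over two classes (cloud / non-cloud): the FCN output supplies the unary potentials, a Gaussian kernel over pixel positions supplies the pairwise potentials, and iterative message passing refines the marginals. This is a minimal illustration under assumed conventions, not the authors' implementation; the mean-shift region constraint, the color-dependent bilateral kernel, and the efficient high-dimensional filtering used in practice are all omitted, and the function and parameter names (`mean_field_crf`, `theta`, `w`) are hypothetical.

```python
import numpy as np

def softmax(x):
    """Row-wise softmax with max-shift for numerical stability."""
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def gaussian_kernel(positions, theta=3.0):
    """Pairwise affinity k(i, j) = exp(-||p_i - p_j||^2 / (2 theta^2))."""
    diff = positions[:, None, :] - positions[None, :, :]
    k = np.exp(-np.sum(diff ** 2, axis=-1) / (2.0 * theta ** 2))
    np.fill_diagonal(k, 0.0)  # a pixel sends no message to itself
    return k

def mean_field_crf(unary, positions, n_iters=5, w=1.0, theta=3.0):
    """Simplified mean field inference for a fully connected CRF.

    unary:     (N, L) unary potentials (negative log probabilities, e.g.
               from the FCN output) for N pixels and L labels.
    positions: (N, 2) pixel coordinates feeding the Gaussian kernel.
    Returns    (N, L) approximate posterior marginals Q.
    """
    k = gaussian_kernel(positions, theta)
    q = softmax(-unary)  # initialize Q with the unary (FCN) prediction
    for _ in range(n_iters):
        msg = k @ q  # message passing: Gaussian filtering of Q
        # Potts compatibility: a label is penalized by the mass its
        # neighbors assign to *other* labels.
        pairwise = w * (msg.sum(axis=1, keepdims=True) - msg)
        q = softmax(-unary - pairwise)  # local update and renormalization
    return q

# Usage: a 4x4 patch whose unary potentials favor the "cloud" class (index 0).
positions = np.stack(
    np.meshgrid(np.arange(4), np.arange(4)), axis=-1
).reshape(-1, 2).astype(float)
unary = np.zeros((16, 2))
unary[:, 1] = 1.0  # higher potential (cost) for the non-cloud class
q = mean_field_crf(unary, positions)
```

Because the Gaussian pairwise term rewards agreement between nearby pixels, isolated mislabeled pixels in the unary (FCN) map are pulled toward their neighborhood's dominant class, which is the smoothing effect the CRF back end provides in the paper.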
Liang Pei, Yang Liu, Lin Gao. Cloud Detection of ZY-3 Remote Sensing Images Based on Fully Convolutional Neural Network and Conditional Random Field[J]. Laser & Optoelectronics Progress, 2019, 56(10): 102802.