Acta Optica Sinica, 2019, 39(11): 1110003; published online 2019-11-06
Infrared and Visible Image Fusion Combining Pulse-Coupled Neural Network and Guided Filtering
Keywords: image processing; pulse-coupled neural network; firing time matrix; multi-regional division; guided filtering; image weight fusion
Abstract
This study proposes a fusion method that combines a pulse-coupled neural network (PCNN) with guided filtering to address the lack of detail and the artifacts that often appear in infrared and visible image fusion. First, the traditional PCNN model is improved by adding an inhibition term to its pulse-generating unit, which prevents the repeated firing of a pixel from introducing noise into the firing time matrix T. Second, with the original images serving as guidance images, guided filtering is applied to T to obtain a multi-region weight map that preserves both salient information and edge details. Finally, the infrared and visible images are fused by weighting them with this map. In addition, based on an analysis of the firing behavior of the PCNN model, a constrained parameter-setting method is proposed that reduces the complexity of configuring the PCNN parameters. Experimental results show that the proposed method is efficient and yields fusion images that are rich in detail and free of obvious artifacts; the cross-entropy and spatial-frequency indexes of its results are superior to those of current mainstream fusion methods.
Xiaoling Zhou, Zetao Jiang. Infrared and Visible Image Fusion Combining Pulse-Coupled Neural Network and Guided Filtering[J]. Acta Optica Sinica, 2019, 39(11): 1110003.
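The pipeline described in the abstract (a per-pixel saliency map refined by guided filtering, then used as a weight map for fusion) can be sketched as follows. This is a minimal illustration, not the paper's implementation: a local-contrast measure stands in for the PCNN firing time matrix T, the guided filter follows the standard He et al. formulation with a box-filter mean, and the radius `r` and regularizer `eps` are arbitrary assumed defaults.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def box_mean(img, r):
    """Mean over a (2r+1) x (2r+1) window, edge-padded to keep the shape."""
    pad = np.pad(img, r, mode="edge")
    win = sliding_window_view(pad, (2 * r + 1, 2 * r + 1))
    return win.mean(axis=(-2, -1))

def guided_filter(guide, src, r=4, eps=1e-2):
    """Standard guided filter: smooths src while following the edges of guide."""
    mean_i = box_mean(guide, r)
    mean_p = box_mean(src, r)
    cov_ip = box_mean(guide * src, r) - mean_i * mean_p
    var_i = box_mean(guide * guide, r) - mean_i ** 2
    a = cov_ip / (var_i + eps)          # local linear coefficients
    b = mean_p - a * mean_i
    return box_mean(a, r) * guide + box_mean(b, r)

def fuse_images(ir, vis, r=4, eps=1e-2):
    """Weighted fusion with a guided-filter-refined weight map.

    Local contrast (deviation from the local mean) is a stand-in for the
    PCNN firing time matrix T used in the paper; inputs are floats in [0, 1].
    """
    sal_ir = np.abs(ir - box_mean(ir, r))
    sal_vis = np.abs(vis - box_mean(vis, r))
    w = sal_ir / (sal_ir + sal_vis + 1e-12)          # raw multi-region weights
    w = np.clip(guided_filter(ir, w, r, eps), 0, 1)  # edge-aware refinement
    return w * ir + (1 - w) * vis
```

Refining the weight map with the source image as guide is what transfers edge detail into the fusion weights; without that step, the raw saliency ratio produces the blocky region boundaries (and resulting artifacts) that the paper sets out to avoid.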