Journal of Applied Optics, 2011, 32(4): 671. Online publication: 2011-08-29

Objective quality assessment of image fusion
Author affiliation
Key Laboratory of Photoelectronic Imaging Technology and System, Ministry of Education, School of Optoelectronics, Beijing Institute of Technology, Beijing 100081, China
Abstract
With the rapid development of image fusion technology, scientific objective assessment of fusion quality plays an important guiding role in selecting suitable fusion algorithms and in designing new ones, and it has become a hot topic in image quality assessment research. An ideal objective assessment method yields quantitative scores consistent with human subjective perception. This paper reviews the objective quality assessment algorithms proposed for image fusion to date. Basic objective evaluation metrics are introduced briefly, and the algorithms are then classified into four categories: those based on edge information preservation, on structural similarity (SSIM), on information theory, and on contrast. Each category is introduced with emphasis on its strategy and characteristics, and the categories are analyzed and compared. Finally, research trends are summarized: objective assessment that accounts for features of the human visual system or for specific visual tasks is receiving growing attention, and quality assessment of no-reference images and of color fusion images is an important direction for future work.
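As a concrete illustration of the information-theory category, the sketch below estimates a widely used mutual-information fusion index that sums the mutual information between each source image and the fused result, computed from joint gray-level histograms (the measure proposed by Qu et al., Electronics Letters, 2002). This is a minimal sketch, not code from the paper; the function names and the choice of 256 histogram bins for 8-bit images are assumptions made here for illustration.

```python
import numpy as np

def _mutual_information(x, y, bins=256):
    """Estimate mutual information between two 8-bit images from their joint histogram."""
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins,
                                 range=[[0, 256], [0, 256]])
    pxy = joint / joint.sum()                 # joint probability p(x, y)
    px = pxy.sum(axis=1, keepdims=True)       # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)       # marginal p(y)
    nz = pxy > 0                              # skip zero cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def fusion_mutual_information(src_a, src_b, fused, bins=256):
    """Information-theory fusion metric MI(A, F) + MI(B, F); a larger value
    indicates that more information from the sources is carried into F."""
    return (_mutual_information(src_a, fused, bins) +
            _mutual_information(src_b, fused, bins))

if __name__ == "__main__":
    # Toy example: random 8-bit images stand in for visible/IR sources.
    rng = np.random.default_rng(0)
    a = rng.integers(0, 256, (128, 128), dtype=np.uint8)
    b = rng.integers(0, 256, (128, 128), dtype=np.uint8)
    f = ((a.astype(np.uint16) + b) // 2).astype(np.uint8)   # naive average fusion
    print(fusion_mutual_information(a, b, f))
```

The structural-similarity-based metrics in the second category follow a similar pattern, replacing mutual information with a locally weighted UIQI or SSIM score between each source image and the fused image.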

GAO Shao-shu, JIN Wei-qi, WANG Ling-xue, WANG Ji-hui, WANG Xia. Objective quality assessment of image fusion[J]. Journal of Applied Optics, 2011, 32(4): 671.
