Laser & Optoelectronics Progress, 2020, 57(14): 141006; published online: 2020-07-28
No-Reference Quality Evaluation for Gamut Mapping Images Based on Natural Scene Statistics
Keywords: gamut mapping; image quality evaluation; natural scene statistics; color distortion; color space
Abstract
Gamut mapping is a key technology for transmitting and reproducing color images across different devices, and it is a core component of modern color management systems. However, few studies have addressed the quality evaluation of gamut mapping images, so this paper proposes a no-reference quality evaluation algorithm for gamut mapping images based on natural scene statistics. First, the gamut mapping images are converted to the Spatial-CIELAB color space, and the three color attributes, i.e., luminance, chroma, and hue, are extracted. The luminance component is decomposed with a Log-Gabor filter, and statistical features are extracted in the frequency domain to characterize structure distortion and contrast distortion; for the chroma and hue components, statistical features are extracted in the spatial domain to characterize color distortion. Then, the extracted features are combined with subjective scores to train an image quality prediction model using a backpropagation neural network. Finally, the model is used to assess image quality. Experimental results show that the proposed method outperforms existing no-reference quality evaluation algorithms.
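The pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the input is assumed to already be a CIELAB array (the Spatial-CIELAB preprocessing is omitted), the Log-Gabor parameters `f0` and `sigma_ratio` are hypothetical placeholders, and only mean/standard-deviation statistics are extracted before the features would be fed to a regression model such as a backpropagation neural network.

```python
import numpy as np

def log_gabor_response(lum, f0=0.1, sigma_ratio=0.55):
    """Filter the luminance channel with a radial Log-Gabor filter in the
    frequency domain. f0 (center frequency) and sigma_ratio (bandwidth
    parameter) are illustrative values; the paper's filter-bank settings
    are not given in the abstract."""
    rows, cols = lum.shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    radius = np.sqrt(fx**2 + fy**2)
    radius[0, 0] = 1.0  # avoid log(0) at the DC term
    lg = np.exp(-(np.log(radius / f0))**2 / (2 * np.log(sigma_ratio)**2))
    lg[0, 0] = 0.0      # a Log-Gabor filter has no DC component
    return np.real(np.fft.ifft2(np.fft.fft2(lum) * lg))

def nss_features(lab):
    """Extract simple statistics from a CIELAB image of shape (H, W, 3):
    luminance statistics in the frequency domain (via the Log-Gabor
    response), chroma and hue statistics in the spatial domain."""
    L, a, b = lab[..., 0], lab[..., 1], lab[..., 2]
    chroma = np.sqrt(a**2 + b**2)   # C*_ab = sqrt(a*^2 + b*^2)
    hue = np.arctan2(b, a)          # h_ab = atan2(b*, a*)
    resp = log_gabor_response(L)
    feats = []
    for comp in (resp, chroma, hue):
        feats += [comp.mean(), comp.std()]
    return np.array(feats)
```

In the full method, a feature vector like this (with richer statistics and multiple filter scales/orientations) would be paired with subjective scores to train the quality prediction model.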
Wei Yu, Jingjing Xu, Yuying Liu, Junsheng Zhang, Tengteng Li. No-Reference Quality Evaluation for Gamut Mapping Images Based on Natural Scene Statistics[J]. Laser & Optoelectronics Progress, 2020, 57(14): 141006.