Optics and Precision Engineering, 2018, 26(7): 1766; published online: 2018-10-02
Compressive imaging based on Tetrolet-domain uHMT structured sparse prior and Turbo equalization
Keywords: Tetrolet transform; universal hidden Markov trees; Turbo equalization; belief propagation (message passing); compressive imaging
Abstract
Based on the persistence and exponential decay of Tetrolet coefficients across scales, a Tetrolet-domain universal hidden Markov tree (uHMT) structured sparse prior model was established for compressive imaging. In this model, the statistical distribution of the Tetrolet coefficients was represented as a two-state Gaussian-mixture prior, and the posterior state probabilities were then estimated with the factor-graph method. To address the problem that messages passed on a loopy factor graph may fail to converge stably, the Turbo equalization method was exploited to decouple the factor graph into two sub-graphs, one for compressive sampling and one for the structured sparse prior; the two sub-graphs perform their state estimation separately and exchange messages with each other until convergence. Finally, the image was reconstructed under the minimum mean-squared error (MMSE) criterion. The normalized mean-squared error of reconstructing a 128×128 test image reached -20.97 dB with a run-time of 45.24 s. Experimental results demonstrate that the proposed algorithm outperforms the algorithms based on the wavelet-domain hidden Markov tree model in both reconstruction quality and speed.
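The two-state Gaussian-mixture prior at the heart of the uHMT model can be illustrated on a single coefficient: given a noisy observation of one Tetrolet coefficient, the posterior probability of the "large"/"small" hidden state and the resulting MMSE estimate follow in closed form. A minimal sketch of that per-coefficient computation (all numeric parameters here are illustrative assumptions, not values from the paper):

```python
import math

def gauss(x, var):
    # Zero-mean Gaussian density with variance `var`.
    return math.exp(-x * x / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def mmse_two_state(y, noise_var, p_large=0.1, var_small=0.01, var_large=1.0):
    """MMSE estimate of a coefficient x from y = x + n, where n is
    zero-mean Gaussian noise and x follows a two-state Gaussian mixture:
    with probability p_large the hidden state is 'large' (variance
    var_large), otherwise 'small' (variance var_small)."""
    # Evidence of y under each hidden state (x + n is again Gaussian).
    ev_small = (1.0 - p_large) * gauss(y, var_small + noise_var)
    ev_large = p_large * gauss(y, var_large + noise_var)
    # Posterior probability of the 'large' state given y.
    post_large = ev_large / (ev_small + ev_large)
    # Per-state linear MMSE (Wiener) estimates, mixed by the posterior.
    x_small = var_small / (var_small + noise_var) * y
    x_large = var_large / (var_large + noise_var) * y
    return post_large * x_large + (1.0 - post_large) * x_small

# A large observation is shrunk only mildly; a small one is shrunk hard,
# which is the sparsity-promoting behaviour of the mixture prior.
print(mmse_two_state(3.0, 0.05))   # close to 3
print(mmse_two_state(0.05, 0.05))  # close to 0
```

In the full algorithm these state probabilities are not computed in isolation: the uHMT tree couples the hidden states across scales, and the Turbo scheme exchanges extrinsic versions of such probabilities between the compressive-sampling sub-graph and the structured-prior sub-graph.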
YANG Xing. Compressive imaging based on Tetrolet-domain uHMT structured sparse prior and Turbo equalization[J]. Optics and Precision Engineering, 2018, 26(7): 1766.