Laser & Optoelectronics Progress, 2019, 56(11): 111503, published online: 2019-06-13
Binocular Ranging Method Using Stereo Matching Based on Improved Census Transform
Keywords: machine vision; binocular ranging; camera calibration; Census transform; stereo rectification; stereo matching
Abstract
This study proposes a simple, high-precision binocular ranging method. A stereo rectification algorithm first corrects the non-forward-parallel structure of the left and right cameras. An improved Census transform algorithm is then applied in stereo matching to obtain accurate disparity values, and the true distance is finally computed from the epipolar geometry of the binocular setup. The original Census transform, which compares each surrounding pixel only with the single window-center pixel, is improved into a scheme in which multiple center points mutually supervise one another and their results are fused, greatly improving stereo-matching accuracy. Two identical complementary metal-oxide-semiconductor (CMOS) cameras were used to build a binocular ranging platform in a laboratory environment, and the hardware, algorithms, and calibration steps of the ranging pipeline are described in detail. The experimental results show that the proposed method outperforms the original Census transform: for a 2 m measurement, the error is 6.4 cm and the accuracy is improved by 19.1%, meeting the requirements of high-precision binocular ranging.
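The abstract describes two computational steps: Census-based stereo matching and disparity-to-depth conversion via rectified epipolar geometry. The exact multi-center mutual-supervision scheme is not specified here, so the sketch below shows only the classic Census transform, one common robust variant (comparing against the window mean, a hypothetical stand-in for the paper's multi-center fusion), the Hamming-distance matching cost, and the standard depth formula Z = fB/d. All function names and parameters are illustrative, not from the paper.

```python
import numpy as np

def census_transform(img, win=3):
    """Classic Census transform: encode, for each pixel, whether every
    neighbor in a win x win window is darker than the window-center pixel.
    Borders are left as zero codes in this minimal sketch."""
    r = win // 2
    h, w = img.shape
    codes = np.zeros((h, w), dtype=np.uint64)
    for y in range(r, h - r):
        for x in range(r, w - r):
            center = img[y, x]
            code = 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    if dy == 0 and dx == 0:
                        continue
                    code = (code << 1) | int(img[y + dy, x + dx] < center)
            codes[y, x] = code
    return codes

def census_transform_mean(img, win=3):
    """Robust variant: compare neighbors against the window mean instead
    of the single center pixel, so one noisy center pixel cannot corrupt
    the whole code. This is only a stand-in for the paper's multi-center
    mutual-supervision scheme, whose details the abstract does not give."""
    r = win // 2
    h, w = img.shape
    codes = np.zeros((h, w), dtype=np.uint64)
    for y in range(r, h - r):
        for x in range(r, w - r):
            ref = img[y - r:y + r + 1, x - r:x + r + 1].mean()
            code = 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    if dy == 0 and dx == 0:
                        continue
                    code = (code << 1) | int(img[y + dy, x + dx] < ref)
            codes[y, x] = code
    return codes

def hamming(a, b):
    """Matching cost between two census codes: number of differing bits."""
    return bin(int(a) ^ int(b)).count("1")

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Rectified binocular geometry: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px
```

For example, with a focal length of 800 px, a baseline of 0.12 m, and a measured disparity of 48 px, `disparity_to_depth` returns a depth of 2.0 m, matching the 2 m measurement scenario the abstract evaluates.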
Dahua Li, Hongyu Shen, Xiao Yu, Qiang Gao, Hongwei Wang. Binocular Ranging Method Using Stereo Matching Based on Improved Census Transform[J]. Laser & Optoelectronics Progress, 2019, 56(11): 111503.