Optics & Optoelectronic Technology, 2018, 16(5): 66. Online publication: 2018-12-16
A Binocular Vision Motion Estimation Method Based on Weighted Sparse Optimization
Keywords: stereo vision; motion estimation; uncertainty; weighted optimization; least squares method
Abstract
Ego-motion estimation is a key component of a robot's visual perception system, involving two common technical problems: environmental feature recognition and vehicle motion estimation. Because feature recognition is affected by factors such as image resolution, scene depth, and the matching operator, both the feature localization accuracy and the matching correctness carry a degree of uncertainty. This input uncertainty can seriously degrade the accuracy and stability of the estimated motion. To address this problem, the uncertainty in the 3D reconstruction of each feature point is described probabilistically by a covariance matrix and introduced into the weights of the motion-parameter optimization; the weighted optimization attenuates the influence of differences in feature localization accuracy, so that the result approaches an unbiased estimate. An initial motion estimate is obtained by uncertainty-based weighted least squares, with RANSAC incorporated to filter out mismatched feature points. A weighted optimization algorithm based on the sparse Levenberg-Marquardt method is then constructed for accurate motion estimation: by partitioning the parameters into translational and rotational sets, the initial estimate is rapidly refined through nonlinear optimization, yielding higher motion estimation accuracy. The experimental results validate the robustness and efficiency of the method.
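The initial-motion step described in the abstract, a covariance-weighted least-squares fit inside a RANSAC loop, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes matched 3D feature points triangulated in two successive stereo frames, collapses each point's covariance matrix to a scalar weight 1/trace(Σᵢ), and solves the rigid transform with a weighted Kabsch (SVD) solver.

```python
import numpy as np

def rigid_transform(P, Q, w=None):
    """Weighted Kabsch: find R, t minimizing sum_i w_i ||R P_i + t - Q_i||^2.

    P, Q: (n, 3) matched 3D points; w: optional (n,) nonnegative weights.
    """
    if w is None:
        w = np.ones(len(P))
    w = w / w.sum()
    muP = (w[:, None] * P).sum(axis=0)          # weighted centroids
    muQ = (w[:, None] * Q).sum(axis=0)
    X, Y = P - muP, Q - muQ
    H = (w[:, None] * X).T @ Y                  # weighted cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = muQ - R @ muP
    return R, t

def ransac_weighted_motion(P, Q, cov, iters=200, thresh=0.05, rng=None):
    """RANSAC over minimal 3-point samples, then a covariance-weighted
    refit on the inlier set (weights ~ 1/trace(cov_i), so uncertain
    reconstructions contribute less to the estimate)."""
    rng = np.random.default_rng(rng)
    n = len(P)
    w = 1.0 / np.array([np.trace(C) for C in cov])
    best_inl = np.zeros(n, dtype=bool)
    for _ in range(iters):
        idx = rng.choice(n, 3, replace=False)   # minimal sample
        R, t = rigid_transform(P[idx], Q[idx])
        err = np.linalg.norm((P @ R.T + t) - Q, axis=1)
        inl = err < thresh
        if inl.sum() > best_inl.sum():
            best_inl = inl
    R, t = rigid_transform(P[best_inl], Q[best_inl], w[best_inl])
    return R, t, best_inl
```

The returned (R, t) would then serve as the initial value for the sparse Levenberg-Marquardt refinement stage, where the paper partitions the parameters into rotational and translational sets for fast nonlinear optimization.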
杜英魁, 陈艳, 何丽霞, 韩晓微, 原忠虎. 一种基于加权稀疏优化的双目视觉运动估计方法[J]. 光学与光电技术, 2018, 16(5): 66.
DU Ying-kui, CHEN Yan, HE Li-Xia, HAN Xiao-wei, YUAN Zhong-hu. A Binocular Vision Motion Estimation Method Based on Weighted Sparse Optimization[J]. OPTICS & OPTOELECTRONIC TECHNOLOGY, 2018, 16(5): 66.