Acta Optica Sinica, 2014, 34(5): 0533001. Published online: 2014-04-24
Pedestrian Contour Extraction and Its Recognition Using Stereovision and Snake Models
Keywords: image processing; object segmentation; contour extraction; pedestrian recognition; Snake models; improved distance models
Abstract

Accurate extraction of obstacle contour curves from complex, dynamic traffic images is an important research topic for intelligent vehicles and plays a key role in pedestrian protection. Snake models are curve models widely employed for automatic extraction of object boundaries. A novel approach to pedestrian recognition is presented that combines stereovision with Snake models. Object segmentation based on a dense disparity map is first used to locate and segment regions that contain potential pedestrians. An edge-indexed stereo matching algorithm is then employed to obtain the initial boundaries of the targets within these regions of interest (ROIs), facilitating contour extraction in the later stage. On this basis, Snake models are adopted to extract the complete contour curves of the targets. Contour factors derived from the contour curves, together with target elevation, are used to verify the ROIs, i.e., to recognize pedestrians. To overcome the Snake model's sensitivity to noise and its difficulty in converging to boundary concavities, an improved distance potential model is proposed. The approach is tested on typical traffic scenes and yields satisfactory results.
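The abstract does not give the details of the authors' improved distance potential model. As a rough illustration of the underlying idea only, the sketch below implements a minimal greedy snake (in the style of the classical Williams-Shah scheme) whose external energy is a plain distance-transform potential, the property that gives distance-potential snakes their large capture range. The toy scene (a circular target boundary), all parameter values, and the function names are invented for illustration; none of this is the paper's actual algorithm.

```python
import numpy as np

def distance_potential(shape, edge_points):
    """External energy: Euclidean distance from every pixel to the nearest
    edge point. Low values near edges pull the snake in from far away."""
    ys, xs = np.indices(shape)
    ey, ex = edge_points[:, 0], edge_points[:, 1]
    d = np.sqrt((ys[..., None] - ey) ** 2 + (xs[..., None] - ex) ** 2)
    return d.min(axis=-1)

def greedy_snake(ext, pts, alpha=0.5, beta=0.1, gamma=2.0, iters=120):
    """Greedy snake: each point moves to the 8-neighbour (or stays put)
    that minimizes continuity + curvature + external (distance) energy."""
    h, w = ext.shape
    offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    pts = pts.astype(float).copy()
    n = len(pts)
    for _ in range(iters):
        # Mean point spacing, recomputed each sweep so that uniform
        # shrinking of the whole contour is not penalized.
        d_bar = np.mean(np.linalg.norm(np.roll(pts, -1, axis=0) - pts, axis=1))
        for i in range(n):
            prev_p, next_p = pts[i - 1], pts[(i + 1) % n]
            best, best_e = pts[i], np.inf
            for dy, dx in offsets:
                cand = pts[i] + (dy, dx)
                y, x = int(cand[0]), int(cand[1])
                if not (0 <= y < h and 0 <= x < w):
                    continue
                e_cont = (np.linalg.norm(cand - prev_p) - d_bar) ** 2
                e_curv = np.sum((prev_p - 2 * cand + next_p) ** 2)
                e = alpha * e_cont + beta * e_curv + gamma * ext[y, x]
                if e < best_e:
                    best_e, best = e, cand
            pts[i] = best
    return pts

# Toy scene: target boundary is a circle of radius 20 centred at (50, 50).
theta = np.linspace(0, 2 * np.pi, 80, endpoint=False)
edges = np.round(np.c_[50 + 20 * np.sin(theta),
                       50 + 20 * np.cos(theta)]).astype(int)
ext = distance_potential((100, 100), edges)

# Initialize the snake well outside the target (radius 35); the distance
# potential still attracts it, unlike a narrow gradient-magnitude force.
init = np.round(np.c_[50 + 35 * np.sin(theta[::2]),
                      50 + 35 * np.cos(theta[::2])])
snake = greedy_snake(ext, init)  # points settle near the radius-20 boundary
```

The design point this sketch illustrates is the one the abstract raises: a gradient-based external force is nearly zero away from edges, so a snake initialized far from the target (here, 15 pixels out) never moves, whereas a distance potential is informative everywhere in the image.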
Liu Shumin, Huang Yingping, Zhang Renjie. Pedestrian Contour Extraction and Its Recognition Using Stereovision and Snake Models[J]. Acta Optica Sinica, 2014, 34(5): 0533001.