Laser & Optoelectronics Progress, 2021, 58(8): 0810011, online publication: 2021-04-12
Crowd Counting Based on Single-Column Deep Spatiotemporal Convolutional Neural Network
Keywords: image processing; neural networks; crowd counting; deep spatiotemporal network; dilated convolution; spatial transformation
Abstract
Sudden crowd gatherings pose a risk to personal safety, so effective crowd counting in high-risk areas is of great importance. To address the problems of multi-column neural networks, namely their bloated structure, redundant information, and long running time, we propose a crowd counting model based on a single-column deep spatiotemporal convolutional neural network and adapt it to the needs of video-based counting. First, dilated convolution and skip-connection feature fusion are added to a fully convolutional network (FCN) to improve its ability to extract features. Then, to reduce the effect of the angular distortion introduced by surveillance-camera viewpoints on the counting results, a spatial transformer module is added to the long short-term memory (LSTM) network structure; to improve counting accuracy, the improved FCN and the temporally aware LSTM network are connected by residual connections. Finally, tests on the UCSD, Mall, and a self-built crowd dataset show that the proposed model achieves better counting accuracy and robustness than other models.
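A minimal sketch (not the authors' code) of why the abstract adds dilated convolution to the single-column FCN: stacking stride-1 3×3 layers with growing dilation widens the receptive field much faster than ordinary stacking, at the same parameter cost. The layer configurations and channel count below are illustrative assumptions, not values from the paper.

```python
def receptive_field(layers):
    """Receptive field of a stack of stride-1 conv layers.

    Each layer is (kernel_size, dilation). With stride 1, a layer adds
    (kernel_size - 1) * dilation pixels to the receptive field.
    """
    rf = 1
    for k, d in layers:
        rf += (k - 1) * d
    return rf

def param_count(layers, channels=64):
    """Weights per stack (biases ignored); dilation adds no parameters."""
    return sum(k * k * channels * channels for k, _ in layers)

plain   = [(3, 1), (3, 1), (3, 1)]   # three ordinary 3x3 convolutions
dilated = [(3, 1), (3, 2), (3, 4)]   # same kernels, dilation rates 1, 2, 4

print(receptive_field(plain))                       # 7
print(receptive_field(dilated))                     # 15
print(param_count(plain) == param_count(dilated))   # True
```

The dilated stack more than doubles the receptive field with identical parameter and memory cost, which is the usual motivation for dilated convolution in density-map crowd counting.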
Chunyan Yu, Yan Xu, Lisha Gou, Zhefeng Nan. Crowd Counting Based on Single-Column Deep Spatiotemporal Convolutional Neural Network[J]. Laser & Optoelectronics Progress, 2021, 58(8): 0810011.