
Human daily short-time activity recognition method driven by single sensor data



Abstract

In human activity recognition (HAR) based on inertial sensors, feature extraction is one of the key steps, and the stability of statistical features computed from discrete data depends on the window size used for extraction. Generally, the window length needs to be longer than one motion cycle, which makes short-time activity recognition harder than conventional activity recognition. A novel template matching method is therefore proposed for recognizing test samples whose durations are much shorter than one motion cycle. First, by appropriately segmenting the long-sequence training samples, an over-complete short-time activity template library is constructed, and the short-time test samples are normalized into a consistent form with the library templates and then matched against them. Second, in the matching algorithm, the sum of the Frobenius-norm distance between samples and the 2-norm distance between their global gradient vectors is used as the matching metric, yielding a similarity histogram. Finally, the classification result is obtained from the similarity histogram according to a voting strategy. Experiments show that, when a single sensor is used to recognize short-time activities, the new algorithm outperforms traditional algorithms in accuracy and stability, and adapts to short-time activity classification under different window sizes.
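The matching metric and voting step described above could be sketched roughly as follows. This is a minimal illustration under assumptions, not the authors' implementation: the template-library layout, interpreting the "global gradient vector" as the per-axis sum of first differences, and voting over the k nearest templates are all assumptions made for the sketch.

```python
import numpy as np

def match_distance(test, template):
    """Distance between a short test sample and an equal-shape template:
    Frobenius norm of the sample difference plus the 2-norm of the
    difference between the global gradient vectors (assumed here to be
    the per-axis sum of first differences along time)."""
    d_f = np.linalg.norm(test - template)            # Frobenius-norm term
    g_test = np.diff(test, axis=0).sum(axis=0)       # global gradient vector
    g_tmpl = np.diff(template, axis=0).sum(axis=0)
    return d_f + np.linalg.norm(g_test - g_tmpl)     # 2-norm term added

def classify(test, template_library, k=5):
    """template_library: dict mapping activity label -> list of template
    arrays, each the same shape as `test` (window_length x n_axes).
    Scores every template, then takes a majority vote over the k most
    similar ones (the similarity-histogram + voting step)."""
    scored = sorted(
        (match_distance(test, tmpl), label)
        for label, templates in template_library.items()
        for tmpl in templates
    )
    votes = {}
    for _, label in scored[:min(k, len(scored))]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```

For example, with a library of near-zero "still" templates and near-one "walk" templates, a near-zero test window is assigned to "still" because its nearest templates concentrate in that class.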

Supplementary information

CLC number: TP181

DOI:10.3788/irla201948.0226003

Section: Information acquisition and identification

Funding: National Natural Science Foundation of China (61603003, 11471093); Ministry of Education "Cloud-Data Fusion for Science and Education Innovation" Fund (2017A09116); Anhui Provincial Program for Cultivating Outstanding Top-notch Talents in Universities (gxbjZD26)

Received: 2018-09-10

Revised: 2018-10-20

Online publication date: --

Author affiliations

Su Benyue: School of Computer and Information, Anqing Normal University, Anqing 246011, China; Anhui Provincial Key Laboratory of Intelligent Perception and Computing, Anqing 246011, China
Zheng Dandan: School of Computer and Information, Anqing Normal University, Anqing 246011, China; Anhui Provincial Key Laboratory of Intelligent Perception and Computing, Anqing 246011, China
Tang Qingfeng: School of Medicine, Hangzhou Normal University, Hangzhou 311121, China
Sheng Min: Anhui Provincial Key Laboratory of Intelligent Perception and Computing, Anqing 246011, China

Corresponding author (Email: subenyue@sohu.com)

Biography: Su Benyue (b. 1971), male, professor, Ph.D.; his research interests include machine learning, computer graphics, and image processing.


Cite this paper

Su Benyue, Zheng Dandan, Tang Qingfeng, Sheng Min. Human daily short-time activity recognition method driven by single sensor data[J]. Infrared and Laser Engineering, 2019, 48(2): 0226003

