Laser & Optoelectronics Progress, 2020, 57(12): 121022. Published online: 2020-06-03
Optimized Detection Method for Snub-Nosed Monkeys Based on Faster R-CNN
Abstract

In this study, a deep learning approach is adopted: data are collected by web crawling and field photography, and an optimized detection model for snub-nosed monkeys is constructed based on Faster R-CNN (region-based convolutional neural network). The optimal model configuration is determined by comparing different models, and the detection precision of models built on different training datasets is compared to explore the optimal scheme for supplementing the modeling data. The results show that, compared with the Vgg16 and Res50 networks, the optimal model is obtained with the Res101 network at 70000 iterations. When field data are limited, web images can serve as an alternative data source for snub-nosed monkey face detection and as an auxiliary data source to improve body detection. Compared with classical convolutional neural networks, the proposed method achieves better detection performance with a shorter running time, and the model can effectively locate and identify snub-nosed monkeys in images with complex ecological backgrounds. The proposed method has strong practical significance for discovering and tracking snub-nosed monkeys in the wild.
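Faster R-CNN localizes targets such as monkey faces and bodies by scoring candidate regions generated from a dense grid of anchor boxes in its region proposal network. The sketch below illustrates that anchor-generation step only; the 3 aspect ratios × 3 scales are the defaults from the original Faster R-CNN design and are assumptions here, not necessarily the settings used in this paper.

```python
def generate_anchors(base_size=16, ratios=(0.5, 1.0, 2.0), scales=(8, 16, 32)):
    """Generate the k = len(ratios) * len(scales) anchor boxes that the
    region proposal network places at each feature-map location, all
    centered on a base_size x base_size cell, as (x1, y1, x2, y2)."""
    cx = cy = (base_size - 1) / 2.0
    anchors = []
    for ratio in ratios:
        for scale in scales:
            # Hold the anchor area at (base_size * scale)^2 while
            # varying the shape so that h / w = ratio.
            w = base_size * scale * (1.0 / ratio) ** 0.5
            h = base_size * scale * ratio ** 0.5
            anchors.append((cx - (w - 1) / 2.0, cy - (h - 1) / 2.0,
                            cx + (w - 1) / 2.0, cy + (h - 1) / 2.0))
    return anchors

# 9 anchors per feature-map location with the default 3 ratios x 3 scales
anchors = generate_anchors()
print(len(anchors))  # → 9
```

Each anchor is then classified as object/background and regressed toward a ground-truth box; swapping the backbone (Vgg16, Res50, Res101) changes only the feature maps these anchors are scored on.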
Sun Rui, Zhang Xu, Guo Ying, Yu Xinwen, Chen Yan, Hou Yanan. Optimized Detection Method for Snub-Nosed Monkeys Based on Faster R-CNN[J]. Laser & Optoelectronics Progress, 2020, 57(12): 121022.