Optics and Precision Engineering, 2020, 28(7): 1454; published online: 2020-11-02
Enhanced fiber optic bending sensor based on convolutional neural network
Keywords: fiber bending sensor; convolutional neural network; speckle pattern; plastic optical fiber; sensitized fiber
Abstract
To improve the sensitivity and cost-efficiency of a fiber bending sensor and to extend its linear range, a method based on a deep neural network is proposed to classify the bending angles and directions of a plastic optical fiber. A plastic fiber sensitized by side polishing was used, and speckle images corresponding to different bending angles were collected at the output end of the fiber. Data set one contained five classes of bending angle and data set two contained seven. After the image data were preprocessed, a multilayer convolutional neural network applied convolution and pooling to the speckle images to extract feature maps, and a softmax classifier produced the classification results. Finally, the classification performance of two different convolutional neural network models was compared. The results show that the classification accuracy reaches 96% when the bending-angle interval in data set one is 5°. Theoretical and experimental analyses indicate that the scheme achieves a high recognition rate, and the method is expected to enable a simple, efficient fiber bending sensor.
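The abstract does not specify the network architecture, so the following is only a minimal, stdlib-only sketch of the pipeline it describes: a convolution over a speckle image, ReLU activation, max pooling to obtain a feature map, and a softmax layer mapping the flattened features to angle classes. The image, kernel, and fully-connected weights are random placeholders, not the authors' trained parameters.

```python
import math
import random

def conv2d(img, kernel):
    """Valid 2-D convolution (cross-correlation, as in most DL frameworks)."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img), len(img[0])
    return [[sum(img[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(w - kw + 1)]
            for i in range(h - kh + 1)]

def relu(img):
    """Element-wise rectified linear activation."""
    return [[max(0.0, v) for v in row] for row in img]

def max_pool(img, size=2):
    """Non-overlapping max pooling with a size x size window."""
    h, w = len(img), len(img[0])
    return [[max(img[i + di][j + dj] for di in range(size) for dj in range(size))
             for j in range(0, w - size + 1, size)]
            for i in range(0, h - size + 1, size)]

def softmax(logits):
    """Numerically stable softmax over class scores."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Synthetic 8x8 "speckle" image and a toy 2x2 filter (placeholders).
random.seed(0)
img = [[random.random() for _ in range(8)] for _ in range(8)]
kernel = [[1.0, -1.0], [-1.0, 1.0]]

# conv (8x8 -> 7x7), ReLU, pool (7x7 -> 3x3), then flatten.
feat = max_pool(relu(conv2d(img, kernel)))
flat = [v for row in feat for v in row]

# Toy fully-connected layer mapping 9 features to 5 angle classes
# (data set one in the paper uses five bending-angle classes).
weights = [[random.uniform(-1, 1) for _ in range(len(flat))] for _ in range(5)]
logits = [sum(w * x for w, x in zip(wrow, flat)) for wrow in weights]
probs = softmax(logits)  # class probabilities; argmax gives the predicted angle class
```

In practice the paper's CNN would have several such conv/pool stages with learned kernels and would be trained on the labeled speckle data sets; this sketch only illustrates the forward-pass structure.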
TAN Zhong-wei, YANG Jing-ya, LIU Yan, LU Shun, ZHANG Li-wei, NIU Hui. Enhanced fiber optic bending sensor based on convolutional neural network[J]. Optics and Precision Engineering, 2020, 28(7): 1454.