Computer Systems & Applications, 2019, Vol. 28(7): 199-205

Railway Engineering Staff Behavior Recognition Method Based on Machine Learning
DU Cheng-Fei
School of Electrical Engineering, Southwest Jiaotong University, Chengdu 611756, China
Abstract: Railway engineering staff currently cannot be monitored in real time during operations, and safety incidents occur from time to time. Taking railway inspectors as an example, seven main behaviors are analyzed. An embedded device integrating an accelerometer is worn by each worker to collect behavior data; features are extracted, and four kinds of classifiers (C4.5 decision tree, random forest, KNN, and SVM) are compared experimentally. The results show that the SVM classifier performs best, reaching a behavior recognition accuracy of 99.2%. This research has engineering application value for eliminating the safety hazards faced by railway field engineering staff.
Key words: machine learning; accelerometer; railway engineering staff; behavior recognition; engineering application

 Figure 1. Flowchart of the research approach

1 Behavior Recognition of Railway Engineering Staff
1.1 Analysis of Behavior Characteristics of Railway Engineering Staff

1.2 Data Acquisition

 Figure 2. The BWT901CL sensor and how it is worn

1.3 Data Preprocessing
1.3.1 Tilt Correction

 Figure 3. Device coordinate system and observation coordinate system

Here $\alpha_1, \beta_1, \gamma_1$; $\alpha_2, \beta_2, \gamma_2$; and $\alpha_3, \beta_3, \gamma_3$ are the angles between the axes $OX'$, $OY'$, $OZ'$ of the observation coordinate system and the $OX$, $OY$, $OZ$ axes of the device coordinate system. Using the coordinate transformation of Eq. (1), a measured value $(x, y, z)$ can be converted to its value $(x', y', z')$ in the unified observation coordinate system.

 $\left[ {\begin{array}{*{20}{c}} {x'} \\ {y'} \\ {z'} \end{array}} \right] = \left[ {\begin{array}{*{20}{c}} {{i}} \\ {{j}} \\ {{k}} \end{array}} \right] \cdot \left[ {\begin{array}{*{20}{c}} x \\ y \\ z \end{array}} \right] = \left[ {\begin{array}{*{20}{c}} {\cos {\alpha _1}}&{\cos {\beta _1}}&{\cos {\gamma _1}} \\ {\cos {\alpha _2}}&{\cos {\beta _2}}&{\cos {\gamma _2}} \\ {\cos {\alpha _3}}&{\cos {\beta _3}}&{\cos {\gamma _3}} \end{array}} \right] \cdot \left[ {\begin{array}{*{20}{c}} x \\ y \\ z \end{array}} \right]$ (1)
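In code, Eq. (1) amounts to multiplying each accelerometer sample by a direction-cosine matrix whose rows are the unit vectors i, j, k of the observation frame expressed in the device frame. A minimal sketch (the 30° tilt about the device X axis is an illustrative value, not from the paper):

```python
import numpy as np

# Rows of the direction-cosine matrix: unit vectors i, j, k of the
# observation frame expressed in the device coordinate frame.
# Illustrative example: sensor tilted 30 degrees about the device X axis.
theta = np.radians(30)
R = np.array([
    [1.0, 0.0,            0.0           ],   # i
    [0.0, np.cos(theta), -np.sin(theta)],    # j
    [0.0, np.sin(theta),  np.cos(theta)],    # k
])

def tilt_correct(samples, R):
    """Map raw (x, y, z) accelerometer samples into the observation frame.

    samples: array of shape (N, 3); returns an array of the same shape,
    where each output row is R . [x, y, z]^T as in Eq. (1).
    """
    return samples @ R.T

raw = np.array([[0.0, 0.5, 0.87]])   # one raw sample (x, y, z)
corrected = tilt_correct(raw, R)
```

Because R is orthogonal, the correction only re-expresses each sample in the new frame; the magnitude of the acceleration vector is unchanged.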
1.3.2 Denoising

 $yy(n) = \left\{ {\begin{array}{*{20}{l}} {y(n)},&{n = 1,N} \\ {\dfrac{1}{{2n - 1}}\displaystyle\sum\limits_{i = 0}^{2n - 2} {y(1 + i)} },&{1 < n < \dfrac{{M + 1}}{2}} \\ {\dfrac{1}{M}\displaystyle\sum\limits_{i = (1 - M)/2}^{(M - 1)/2} {y(n + i)} },&{\dfrac{{M + 1}}{2} \leqslant n \leqslant \dfrac{{2N - M + 1}}{2}} \\ {\dfrac{1}{{2N - 2n + 1}}\displaystyle\sum\limits_{i = n - N}^{N - n} {y(n + i)} },&{\dfrac{{2N - M + 1}}{2} < n < N} \end{array}} \right.$ (2)
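Eq. (2) is an edge-preserving centered moving average: interior points average a window of odd width M, points near either end use the largest symmetric window that still fits, and the two endpoints pass through unchanged. A sketch of the same idea (the window width here is an illustrative choice, not the paper's setting):

```python
import numpy as np

def moving_average(y, M=5):
    """Edge-preserving centered moving average of Eq. (2); M must be odd.

    Interior points average M neighbors; points near either end use the
    largest symmetric window that fits; the endpoints stay unchanged.
    """
    y = np.asarray(y, dtype=float)
    N = len(y)
    h = M // 2
    yy = np.empty(N)
    for n in range(N):
        r = min(n, N - 1 - n, h)        # symmetric half-width at position n
        yy[n] = y[n - r:n + r + 1].mean()
    return yy

noisy = np.array([1.0, 2.0, 9.0, 2.0, 1.0, 2.0, 3.0])
smooth = moving_average(noisy, M=3)
```

The spike at the third sample is averaged down while the endpoints are preserved, matching the first branch of Eq. (2).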
1.3.3 Segmentation and Labeling
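The section text does not survive here, but a common segmentation scheme in accelerometer-based activity recognition is a fixed-length sliding window with overlap, each window labeled by majority vote over its samples. A sketch under that assumption (window length and overlap are illustrative, not the paper's values):

```python
import numpy as np

def sliding_windows(data, labels=None, win=128, overlap=0.5):
    """Split an (N, 3) acceleration stream into overlapping windows.

    Returns an array of shape (num_windows, win, 3); if per-sample labels
    are given, each window gets the majority label within it.
    """
    step = int(win * (1 - overlap))
    windows, win_labels = [], []
    for start in range(0, len(data) - win + 1, step):
        windows.append(data[start:start + win])
        if labels is not None:
            seg = labels[start:start + win]
            win_labels.append(np.bincount(seg).argmax())  # majority vote
    return np.array(windows), np.array(win_labels)

stream = np.random.randn(1000, 3)          # stand-in acceleration stream
labs = np.zeros(1000, dtype=int)           # stand-in per-sample labels
X, y = sliding_windows(stream, labs, win=128, overlap=0.5)
```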

1.4 Feature Extraction

(1) Mean

 $mean = \frac{1}{N}\sum\limits_{i = 1}^N {{A_i}}$ (3)

(2) Standard deviation

 $std = \sqrt {\frac{1}{{N - 1}}\sum\limits_{i = 1}^N {{{({A_i} - mean)}^2}} }$ (4)

(3) Maximum

max is the maximum of the acceleration data within the window; it represents the upper bound of the interval over which the acceleration values vary, as given in Eq. (5).

 $\forall i \in \left[ {1,N} \right],{\rm{ }}max \geqslant {A_i}$ (5)

(4) Interquartile range

 $IQR = {Q_3} - {Q_1}$ (6)

(5) Correlation coefficient between any two axes

 $corr(A,A') = \frac{{\operatorname{cov} (A,A')}}{{st{d_A}st{d_{A'}}}},A \ne A'$ (7)

(6) Skewness

 $skewness = \frac{N}{{(N - 1)(N - 2)st{d^3}}}\sum\limits_{i = 1}^N {{{({A_i} - mean)}^3}}$ (8)

(7) FFT coefficients

 $FFT(k) = \sum\limits_{i = 1}^N {{A_i}{e^{ - j\frac{{2\pi }}{N}(i - 1)k}}} ,\;k = 0,1, \cdots ,N - 1$ (9)
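The window features of Eqs. (3)-(9) can be computed per axis (and per axis pair for the correlation) along the following lines; keeping only the magnitudes of the first few FFT coefficients is an assumption about how they are fed to the classifiers, not something the paper specifies:

```python
import numpy as np

def window_features(win, n_fft=8):
    """Extract the per-window features of Eqs. (3)-(9).

    win: array of shape (N, 3), one window of 3-axis acceleration.
    Returns a flat feature vector.
    """
    mean = win.mean(axis=0)                              # Eq. (3)
    std = win.std(axis=0, ddof=1)                        # Eq. (4), N-1 divisor
    amax = win.max(axis=0)                               # Eq. (5)
    q1, q3 = np.percentile(win, [25, 75], axis=0)
    iqr = q3 - q1                                        # Eq. (6)
    corr = [np.corrcoef(win[:, a], win[:, b])[0, 1]      # Eq. (7)
            for a, b in [(0, 1), (0, 2), (1, 2)]]
    N = len(win)
    skew = (N / ((N - 1) * (N - 2) * std ** 3)
            * ((win - mean) ** 3).sum(axis=0))           # Eq. (8)
    fft_mag = np.abs(np.fft.rfft(win, axis=0))[:n_fft]   # Eq. (9), magnitudes
    return np.concatenate([mean, std, amax, iqr, corr,
                           skew, fft_mag.ravel()])

rng = np.random.default_rng(0)
feats = window_features(rng.normal(size=(128, 3)))       # 42 features
```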
1.5 Classifiers
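The four classifiers compared in the abstract map naturally onto scikit-learn estimators. Note that scikit-learn's `DecisionTreeClassifier` implements CART rather than C4.5, so with `criterion="entropy"` it is only a stand-in; the feature matrix and all hyperparameters below are illustrative, not the paper's settings:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the extracted feature matrix: 7 behavior classes,
# 100 windows each, 42 features per window.
rng = np.random.default_rng(0)
y = np.repeat(np.arange(7), 100)
X = rng.normal(size=(700, 42)) + y[:, None]

classifiers = {
    "C4.5-style tree": DecisionTreeClassifier(criterion="entropy"),
    "Random forest": RandomForestClassifier(n_estimators=100),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
}

# 5-fold cross-validated accuracy for each classifier
scores = {name: cross_val_score(clf, X, y, cv=5).mean()
          for name, clf in classifiers.items()}
```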

2 Experiments and Result Analysis
2.1 Experimental Scheme

 $\begin{split} {{j}} &= {{k}} \times {{i}} = \left( {{x_3},{y_3},{z_3}} \right) \times \left( {{x_1},{y_1},{z_1}} \right) \\ & = ({y_3}{z_1} - {y_1}{z_3},{z_3}{x_1} - {z_1}{x_3},{x_3}{y_1} - {x_1}{y_3}) \end{split}$ (10)
 Figure 4. Determination of vectors k and i
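Once k and i have been measured (Fig. 4), Eq. (10) completes the right-handed observation frame by taking the cross product j = k × i. In code (the measured vectors here are illustrative unit vectors, not field data):

```python
import numpy as np

# Measured basis vectors of the observation frame, expressed in the device
# frame (assumed already normalized): i from a known horizontal direction,
# k from the gravity direction. Values are illustrative.
i = np.array([1.0, 0.0, 0.0])
k = np.array([0.0, 0.0, 1.0])

j = np.cross(k, i)  # Eq. (10): j = k x i completes the right-handed frame
```

By construction, j is orthogonal to both k and i, so stacking i, j, k as rows yields the direction-cosine matrix of Eq. (1).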

 Figure 5. Data collection for the seven behaviors

2.2 Classification Results and Analysis

3 Conclusion
