Computer Systems & Applications (计算机系统应用), 2020, Vol. 29, Issue (6): 97-103


Fault Recognition of Power Equipment in Infrared Thermal Images Based on Deep Learning with Embedded Devices
WANG Yan-Bo1, CHEN Pei-Feng1, XU Liang2, ZHANG He-Bao3, FANG Kai4
1. Overhaul Company, State Grid Shandong Electric Power Company, Jinan 250100, China;
2. School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100081, China;
3. Jicheng Electronics Co. Ltd., Jinan 250100, China;
4. School of Computer Science and Technology, China University of Petroleum (East China), Qingdao 266580, China
Abstract: With the emergence of large-scale image datasets and the rapid development of computer hardware, especially GPUs, deploying Convolutional Neural Network (CNN) models on embedded devices with limited computing resources has become a challenging problem. Overheating of power equipment can be identified from infrared thermal images, but because infrared radiation attenuates in the air, the measured temperature is lower than the actual value. In this study, an efficient CNN for embedded devices is proposed for thermal fault detection of power equipment. The backbone network of the SSD algorithm is replaced by MobileNet, and each Batch Normalization layer is merged into the preceding convolution layer to reduce model parameters and improve inference speed, so that the model can run on a lightweight computing platform. To compensate for the loss of infrared radiation in the air, an infrared temperature correction unit based on a BP neural network is proposed. Based on these innovations, a thermal fault detection system for power equipment is designed. Experiments and field applications show that the proposed method achieves high accuracy and inference speed.
Key words: deep learning; infrared thermal imaging; lightweight; fault detection; electric power equipment

1 Introduction

2 Related Work

3 Power Equipment Detection

3.1 Power Equipment Detection on an Embedded Platform

SSD is a typical deep-learning-based object detection algorithm. Compared with the R-CNN family of detectors, SSD eliminates the intermediate region-proposal stage and the subsequent pixel or feature resampling step, which preserves speed while maintaining detection accuracy. SSD outputs a set of discretized default boxes, generated on feature maps at different layers and with different aspect ratios. Through the feed-forward pass of the convolutional network, SSD produces a fixed-size collection of default boxes and uses small convolutional filters to predict, for each default box, the location offsets and the category scores, i.e., the probability that the box contains an object of each class. Finally, non-maximum suppression yields the prediction results.
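As a minimal illustration of the default-box mechanism described above, the following sketch generates default boxes for every cell of one feature map. The feature-map size, scale, and aspect ratios are illustrative example values, not the paper's settings:

```python
import itertools
import math

# Illustrative SSD-style default-box generation: for each cell of a feature
# map, boxes of one scale but several aspect ratios are centered on the cell.
# Coordinates are normalized to [0, 1]; values below are example settings.
def default_boxes(fmap_size, scale, aspect_ratios):
    boxes = []
    for i, j in itertools.product(range(fmap_size), repeat=2):
        cx = (j + 0.5) / fmap_size   # box center x, normalized
        cy = (i + 0.5) / fmap_size   # box center y, normalized
        for ar in aspect_ratios:
            w = scale * math.sqrt(ar)
            h = scale / math.sqrt(ar)
            boxes.append((cx, cy, w, h))
    return boxes

boxes = default_boxes(fmap_size=4, scale=0.3, aspect_ratios=(1.0, 2.0, 0.5))
print(len(boxes))  # 4 * 4 cells * 3 aspect ratios = 48
```

In the full detector, this is repeated on feature maps of several resolutions, so that small boxes come from shallow, high-resolution maps and large boxes from deep, low-resolution maps.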

 Figure 1. Comparison of the MobileNet-SSD and SSD network architectures

3.2 Merging the Batch Normalization Layer

The convolution layer preceding Batch Normalization computes:

 $W \times X + b$ (1)

The first operation of the Batch Normalization layer is normalization:

 $\dfrac{{X - \mu }}{{\sqrt {{\sigma ^2} + \varepsilon } }}$ (2)

The second operation of the Batch Normalization layer is scaling:

 $\gamma X + \beta$ (3)

Substituting the convolution output for $X$ in these two operations gives:

 $\gamma \times \dfrac{{({W_{\rm old}} \times X + {b_{\rm old}}) - \mu }}{{\sqrt {{\sigma ^2} + \varepsilon } }} + \beta$ (4)

which is itself an affine function of $X$. The convolution and Batch Normalization layers can therefore be merged into a single convolution with:

 ${W_{\rm new}} = \dfrac{\gamma }{{\sqrt {{\sigma ^2} + \varepsilon } }} \times {W_{\rm old}}$ (5)

 ${b_{\rm new}} = \dfrac{\gamma }{{\sqrt {{\sigma ^2} + \varepsilon } }}({b_{\rm old}} - \mu ) + \beta$ (6)
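The folding of Eqs. (5) and (6) can be checked numerically. The sketch below models a 1×1 convolution as a matrix multiply for simplicity; all names and shapes are illustrative, not the paper's implementation:

```python
import numpy as np

# Fold a Batch Normalization layer into the preceding convolution,
# following Eqs. (5) and (6). A 1x1 convolution over C_in channels is
# modeled as a matrix multiply; all values are random illustrative data.
rng = np.random.default_rng(0)
C_in, C_out, eps = 4, 3, 1e-5

W_old = rng.normal(size=(C_out, C_in))   # convolution weights
b_old = rng.normal(size=C_out)           # convolution bias
gamma = rng.normal(size=C_out)           # BN scale
beta = rng.normal(size=C_out)            # BN shift
mu = rng.normal(size=C_out)              # BN running mean
sigma2 = rng.random(C_out) + 0.1         # BN running variance (positive)

# Eq. (5): W_new = gamma / sqrt(sigma^2 + eps) * W_old
scale = gamma / np.sqrt(sigma2 + eps)
W_new = scale[:, None] * W_old
# Eq. (6): b_new = gamma / sqrt(sigma^2 + eps) * (b_old - mu) + beta
b_new = scale * (b_old - mu) + beta

# Check: the folded convolution matches conv followed by BN (inference mode)
x = rng.normal(size=C_in)
y_conv_bn = gamma * ((W_old @ x + b_old) - mu) / np.sqrt(sigma2 + eps) + beta
y_folded = W_new @ x + b_new
print(np.allclose(y_conv_bn, y_folded))  # True
```

Because the merged layer replaces two layers with one, the per-inference cost of the normalization and scaling operations disappears entirely, which is what makes the fold attractive on a lightweight computing platform.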
4 Correction of Infrared Temperature Measurement Results

 Figure 2. Structure of the neural network

The BP network used for temperature correction maps each neuron's input to its output through an activation function:

 ${y_i} = f({x_i})$ (7)

The input layer passes the $\gamma$ input values through unchanged:

 ${E_i} = {D_i},\;i = 1,2, \cdots ,\gamma$ (8)

Each hidden-layer neuron $j$ computes a weighted sum of the input-layer outputs:

 ${F_j} = {L_{j1}} \times {E_1} + {L_{j2}} \times {E_2} + \cdots + {L_{j\gamma }} \times {E_\gamma } + {M_j}$ (9)

and applies the activation function to produce the hidden-layer outputs:

 ${H_j} = f({F_j}),\;j = 1,2, \cdots ,\alpha$ (10)

The output neuron computes a weighted sum of the $\alpha$ hidden-layer outputs, where the superscript 2 denotes the second-layer (output-layer) weights and bias:

 $k = \sum\nolimits_{j = 1}^\alpha {L_{1j}^2} \times {H_j} + {M^2}$ (11)

With a linear activation at the output layer, the corrected temperature is:

 $Y = f(k) = k$ (12)
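The forward pass of Eqs. (8)-(12) can be sketched as follows. The input dimension, hidden width, sigmoid activation, and random weights are all assumptions for illustration; the trained weights of the paper's correction unit are not shown:

```python
import numpy as np

# Minimal sketch of the correction network's forward pass, Eqs. (8)-(12):
# gamma inputs -> alpha hidden neurons (sigmoid f) -> one linear output.
# Dimensions, activation choice, and weights are illustrative assumptions.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

gamma_in, alpha_hidden = 3, 5           # e.g. measured temp, distance, ambient
rng = np.random.default_rng(1)
L1 = rng.normal(size=(alpha_hidden, gamma_in))  # hidden weights L_{j i}
M1 = rng.normal(size=alpha_hidden)              # hidden biases M_j
L2 = rng.normal(size=alpha_hidden)              # output weights L^2_{1 j}
M2 = rng.normal()                               # output bias M^2

def corrected_temperature(D):
    E = np.asarray(D, dtype=float)  # Eq. (8): input layer passes values through
    F = L1 @ E + M1                 # Eq. (9): weighted sums at the hidden layer
    H = sigmoid(F)                  # Eq. (10): hidden-layer activations
    k = L2 @ H + M2                 # Eq. (11): weighted sum at the output
    return k                        # Eq. (12): linear output activation

print(corrected_temperature([45.2, 12.0, 25.0]))
```

Training the weights against pairs of measured and ground-truth temperatures (e.g. by standard backpropagation) is what lets the unit compensate for distance-dependent attenuation.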
5 Thermal Fault Detection System for Power Equipment

 Figure 3. Framework of the power equipment thermal fault detection system

 Figure 4. Class diagram of the power equipment thermal fault detection system

 Figure 5. Detection workflow

(1) Read the infrared thermal video stream from the thermal imager and decode it into frames;

(2) The power equipment detection algorithm checks whether each frame contains power equipment and localizes it;

(3) Using the localization obtained in the previous step, acquire the infrared temperature and laser ranging data from the thermal imager;

(4) Feed the infrared temperature and laser ranging data into the temperature correction module to obtain the corrected temperature;

(5) Finally, using the prior knowledge base, perform thermal fault diagnosis on the corrected temperature to obtain the thermal fault detection result.
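Steps (3)-(5) of the workflow above can be sketched as follows. The class names, the linear attenuation model, and the fault threshold are all hypothetical placeholders, since the paper's implementation and knowledge base are not shown here:

```python
from dataclasses import dataclass

# Hypothetical sketch of steps (3)-(5) for one detected device.
# Reading, correct_temperature, and diagnose are illustrative stand-ins.
@dataclass
class Reading:
    measured_temp: float   # infrared temperature at the detected region (C)
    distance_m: float      # laser ranging distance to the device (m)

def correct_temperature(r: Reading) -> float:
    # Placeholder for the BP-network correction unit of Section 4; here a
    # simple linear model is assumed: attenuation grows with distance.
    return r.measured_temp * (1.0 + 0.005 * r.distance_m)

def diagnose(temp_c: float, threshold_c: float = 80.0) -> str:
    # Step (5): compare the corrected temperature against a threshold drawn
    # from the prior knowledge base (the 80 C value is illustrative).
    return "thermal fault" if temp_c > threshold_c else "normal"

reading = Reading(measured_temp=79.0, distance_m=10.0)
print(diagnose(correct_temperature(reading)))  # corrected 82.95 -> "thermal fault"
```

Note that the raw 79.0 C reading would pass the 80 C threshold unflagged; it is only after the distance-based correction that the device is diagnosed as faulty, which is why the correction step precedes diagnosis in the workflow.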

6 Experiments

6.1 Hardware Environment

6.2 Demonstration of Results

6.3 Performance and Accuracy Tests

7 Conclusion

 Figure 6. Detection results
