Accident Prediction of Power Distribution Network Based on Graph Neural Network
计算机系统应用 (Computer Systems & Applications), 2020, Vol. 29, Issue (9): 131-135


YANG Hua1,2, LI Xi-Wang2, SI Zhi-Jian2,3, ZHANG Xiao1,2
1. School of Computer and Control Engineering, University of Chinese Academy of Sciences, Beijing 100049, China;
2. Shenyang Institute of Computing Technology, Chinese Academy of Sciences, Shenyang 110168, China;
3. State Grid Liaoning Electric Power Co. Ltd., Shenyang 110168, China
Foundation item: National Science and Technology Major Program of China (2017ZX01030-201)
Abstract: In real application scenarios, power distribution network accidents account for more than 80% of all grid accidents, and predicting them has long been a difficult problem. Motivated by the "Ubiquitous IoT" initiative proposed by the State Grid, this study analyzes existing research on the problem and proposes an accident prediction method for power distribution networks based on the idea of graph neural networks. Following a commonly used graph neural network design framework, the node information aggregation function, prediction function, and loss function are designed in detail, and a reasonable depth parameter is selected through tests of the algorithm flow. The algorithm fully accounts for the mutual influence between connected nodes. On real grid operation data, it is compared with two other algorithms commonly used in this field; experiments show that the proposed algorithm improves accuracy by 3.0% and is more robust.
Key words: graph neural network; power distribution network; Ubiquitous IoT; deep learning; back propagation

1 Structure Analysis of the Power Distribution Network

Fig. 1 Local topology of a power distribution network

2 Algorithm Model

2.1 Overview of the Algorithm Framework

Fig. 2 Simplified topology of the power distribution network

 $\left\{\begin{array}{l} h_v^k = {f_v}({l_v},h_v^{k - 1},h_{ne[v]}^{k - 1}) \\ {o_v} = {g_v}(h_v^k,{l_v}) \\ \end{array} \right.$ (1)
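Equation (1) gives each node $v$ a hidden state $h_v^k$ produced by a state-update function $f_v$ from the node label, the previous state, and the neighbor states, plus an output $o_v$ produced by $g_v$. The following NumPy sketch illustrates this two-function structure; the feature sizes, the tanh/sigmoid choices, and the random weights are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical dimensions: node label l_v and hidden state h_v are both length-4.
rng = np.random.default_rng(0)
W_f = rng.normal(size=(4, 12))   # f_v: mixes [l_v, h_v^{k-1}, mean of neighbor states]
W_g = rng.normal(size=(1, 8))    # g_v: maps [h_v^k, l_v] to a scalar output

def f_v(l_v, h_prev, h_neighbors):
    """State update: h_v^k = f_v(l_v, h_v^{k-1}, h_{ne[v]}^{k-1})."""
    agg = np.mean(h_neighbors, axis=0)          # summarize neighbor states
    return np.tanh(W_f @ np.concatenate([l_v, h_prev, agg]))

def g_v(h_v, l_v):
    """Output: o_v = g_v(h_v^k, l_v)."""
    return sigmoid(W_g @ np.concatenate([h_v, l_v]))[0]

l_v = np.ones(4)
h_prev = np.zeros(4)
h_neighbors = [np.ones(4), -np.ones(4)]         # states of ne[v]
h_v = f_v(l_v, h_prev, h_neighbors)
o_v = g_v(h_v, l_v)
print(h_v.shape, 0.0 < o_v < 1.0)               # (4,) True
```

Any differentiable pair $(f_v, g_v)$ fits this template; the specific choices used in this paper are given in Section 2.2.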

Fig. 3 Computation structure of the graph model

2.2 Key Algorithm Flow

 $h_v^k \leftarrow \sigma \left(W_{agg}^k \cdot \mathrm{MEAN}\left(\{ h_v^{k - 1}\} \cup \{ h_n^{k - 1},\forall n \in Ne[v]\} \right)\right)$ (2)

1　　 $h_v^0 \leftarrow {x_v},\forall v \in V$

2　　for $k = 1$ to $K$ do

3　　　for $v \in V$ do

4　　　　 $h_v^k \leftarrow f_w^k(\{ h_u^{k - 1},\forall u \in Ne[v] \cup \{ v\} \})$

5　　　end

6　　　 $h_v^k \leftarrow h_v^k/{\left\| {h_v^k} \right\|_2},\forall v \in V$

7　　end

8　　 $\scriptstyle {t_v} \leftarrow h_v^K,\forall v \in V$
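The aggregation procedure above, instantiated with the mean aggregator of Eq. (2) followed by L2 normalization, can be sketched as follows; the toy 3-node graph, feature size, and random weights are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def aggregate(x, adj, W_agg, K):
    """K rounds of mean aggregation with L2 normalization.

    x:     (n, d) initial node features, h_v^0 = x_v
    adj:   list of neighbor index lists, Ne[v]
    W_agg: list of K weight matrices, one per layer
    """
    h = x.copy()
    for k in range(K):
        h_new = np.empty_like(h)
        for v in range(len(adj)):
            # MEAN({h_v^{k-1}} ∪ {h_n^{k-1}, n ∈ Ne[v]})
            mean = np.mean(np.vstack([h[v]] + [h[n] for n in adj[v]]), axis=0)
            h_new[v] = sigmoid(W_agg[k] @ mean)
        # line 6: normalize every state to unit length
        h = h_new / np.linalg.norm(h_new, axis=1, keepdims=True)
    return h  # t_v = h_v^K

rng = np.random.default_rng(1)
x = rng.normal(size=(3, 4))                       # 3 nodes, 4 features
adj = [[1], [0, 2], [1]]                          # a simple path graph
W = [rng.normal(size=(4, 4)) for _ in range(2)]   # K = 2 layers
t = aggregate(x, adj, W, K=2)
print(np.allclose(np.linalg.norm(t, axis=1), 1.0))  # True
```

After K rounds, each $t_v$ summarizes the node's K-hop neighborhood, which is what lets the prediction step account for the mutual influence of connected nodes.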

 ${o_v} = \sigma ({W_o} \times h_v^k)$ (3)

 $L = {y_t}\log ({o_t}) + (1 - {y_t})\log (1 - {o_t})$ (4)
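The prediction of Eq. (3) and the per-node log-likelihood of Eq. (4) compute directly; note that (4) as written is a log-likelihood to be maximized (its negative is the usual binary cross-entropy). A small sketch with assumed weight values and state dimension:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(W_o, h_v):
    """Eq. (3): o_v = sigma(W_o · h_v^k)."""
    return sigmoid(W_o @ h_v)

def log_likelihood(y, o):
    """Eq. (4): L = y log(o) + (1 - y) log(1 - o); -L is binary cross-entropy."""
    return y * np.log(o) + (1 - y) * np.log(1 - o)

W_o = np.array([0.5, -0.25, 0.1, 0.0])   # hypothetical prediction weights
h_v = np.array([1.0, 1.0, 1.0, 1.0])     # hypothetical aggregated state
o = predict(W_o, h_v)                    # sigma(0.35)
print(round(float(o), 3))                # 0.587
print(log_likelihood(1.0, o) < 0)        # True: log of a probability
```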

2.3 Optimization Method

Fig. 4 Unrolled network diagram

 $\left\{ \begin{aligned} \dfrac{{\partial L}}{{\partial {W_o}}} & = \left(\dfrac{{{y_t}}}{{{o_t}}} - \dfrac{{1 - {y_t}}}{{1 - {o_t}}}\right) \cdot \dfrac{{\partial {o_t}}}{{\partial {W_o}}}\\ & = \left(\dfrac{{{y_t}}}{{{o_t}}} - \dfrac{{1 - {y_t}}}{{1 - {o_t}}}\right) \cdot \sigma ({W_o}h_v^k + {b_o})(1 - \sigma ({W_o}h_v^k + {b_o})) \cdot h_v^k \\ & = ({y_t} - {o_t}) \cdot h_v^k\\ \dfrac{{\partial L}}{{\partial {b_o}}} & = {y_t} - {o_t}\\ \dfrac{{\partial h_v^k}}{{\partial {W_{agg}}}} & = \sigma ({W_{agg}} \cdot h_v^{k - 1} + {b_{agg}}) \cdot (1 - \sigma ({W_{agg}} \cdot h_v^{k - 1} + {b_{agg}})) \cdot h_v^{k - 1}\\ \dfrac{{\partial h_v^k}}{{\partial {b_{agg}}}} & = \sigma ({W_{agg}} \cdot h_v^{k - 1} + {b_{agg}}) \cdot (1 - \sigma ({W_{agg}} \cdot h_v^{k - 1} + {b_{agg}})) \end{aligned}\right.$ (5)
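The closed form $\partial L/\partial W_o = (y_t - o_t) \cdot h_v^k$ in (5) can be verified numerically with central finite differences; this is a sanity-check sketch with assumed values, not part of the paper's method:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def L(W_o, h, y):
    """Log-likelihood of Eq. (4) with o = sigma(W_o · h)."""
    o = sigmoid(W_o @ h)
    return y * np.log(o) + (1 - y) * np.log(1 - o)

W_o = np.array([0.3, -0.2, 0.1])   # hypothetical weights
h = np.array([1.0, 2.0, -1.0])     # hypothetical node state h_v^k
y = 1.0                            # label y_t

# analytic gradient from Eq. (5): (y_t - o_t) · h_v^k
o = sigmoid(W_o @ h)
grad_analytic = (y - o) * h

# central finite differences, one coordinate at a time
eps = 1e-6
grad_numeric = np.zeros_like(W_o)
for i in range(3):
    d = np.zeros(3); d[i] = eps
    grad_numeric[i] = (L(W_o + d, h, y) - L(W_o - d, h, y)) / (2 * eps)

print(np.allclose(grad_analytic, grad_numeric, atol=1e-5))  # True
```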

Input: aggregator functions $f_v^k$ ; number of iterations $Epoch$

$Main$ :

for $epoch = 1$ to $Epoch$ do

　　 $L = Forward$

　　 $Backward$

end

$Forward$ :

$ {t_v} = aggregation(G(V,\varepsilon ),{x_v},k,W_{agg}^k,f_v^k)$

$ {o_t} = \sigma ({W_o} \times {t_v})$

$ L = \sum\limits_{t \in T} {{y_t}\log ({o_t}) + (1 - {y_t})\log (1 - {o_t})}$

return $L$

$Backward$ :

for 2 to $k$ do

　　 $ {W_o} = {W_o} - \lambda ({y_t} - {o_t}) \cdot h_v^k$

　　 $ {b_o} = {b_o} - \alpha \cdot ({y_t} - {o_t})$

　　 $ {W_{agg}} = {W_{agg}} - \beta ({y_t} - {o_t}) \cdot \sigma ({W_{agg}} \cdot h_v^{k - 1} + {b_{agg}}) \times (1 - \sigma ({W_{agg}} \cdot h_v^{k - 1} + {b_{agg}})) \cdot h_v^{k - 1}$

　　 $ {b_{agg}} = {b_{agg}} - \delta \cdot ({y_t} - {o_t})\sigma ({W_o}h_v^k + {b_o})(1 - \sigma ({W_o}h_v^k + {b_o}))$

end
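Putting the forward pass and the weight updates together, a compact end-to-end training loop might look like the sketch below. To stay short it simplifies the procedure above in ways that are assumptions, not the paper's setup: a single aggregation layer, only $W_o$ updated (with the aggregation weights frozen), one learning rate instead of the four rates $\lambda, \alpha, \beta, \delta$, and a toy 4-node graph. The update ascends the log-likelihood of Eq. (4), i.e. $W_o \leftarrow W_o + \lambda (y_t - o_t) h_t$:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
n, d = 4, 3
x = rng.normal(size=(n, d))                 # node features x_v
adj = [[1], [0, 2], [1, 3], [2]]            # path graph, Ne[v]
y = np.array([0.0, 1.0, 1.0, 0.0])          # per-node labels y_t
W_agg = rng.normal(size=(d, d)) * 0.1       # frozen aggregation weights
W_o = rng.normal(size=d) * 0.1
lr = 0.5

def forward():
    # one round of mean aggregation + L2 normalization, then prediction
    h = np.empty_like(x)
    for v in range(n):
        mean = np.mean(np.vstack([x[v]] + [x[u] for u in adj[v]]), axis=0)
        h[v] = sigmoid(W_agg @ mean)
    h = h / np.linalg.norm(h, axis=1, keepdims=True)
    o = sigmoid(h @ W_o)
    L = np.sum(y * np.log(o) + (1 - y) * np.log(1 - o))
    return h, o, L

losses = []
for epoch in range(200):
    h, o, L = forward()
    losses.append(L)
    # ascend Eq. (4): dL/dW_o = sum_t (y_t - o_t) h_t
    W_o = W_o + lr * (y - o) @ h
print(losses[-1] > losses[0])   # True: the log-likelihood increases
```

Because the normalized states have unit norm, the gradient of (4) with respect to $W_o$ is 1-Lipschitz here, so a step size of 0.5 is small enough for the ascent to increase the objective at every epoch.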

3 Simulation Experiments

3.1 Selection and Analysis of Network Depth

Fig. 5 Loss comparison for different values of k

3.2 Model Comparison

4 Conclusion and Outlook
