Abstract: Distributed average consensus and decentralized machine learning are widely used decentralized computing methods, and the convergence rates of both are largely determined by the spectral gap of the communication topology. The network environment is heterogeneous across nodes: node bandwidths differ, and not every pair of nodes can establish a connection. This heterogeneity poses a challenge to the efficiency of decentralized computation. This work studies the design of topologies that maximize the spectral gap under a heterogeneous network environment. The gradient of the spectral gap with respect to each edge of the topology is derived, and an edge addition-and-deletion algorithm based on this gradient is designed to construct the target topology. The generated topology attains a larger spectral gap while keeping the per-node communication time nearly uniform, and the algorithm's performance remains stable across different levels of network heterogeneity. In distributed average consensus, the generated topology converges at a faster rate and in less wall-clock time. Building on this algorithm, the paper further verifies the recently discovered weak relationship between the spectral gap and the convergence rate of decentralized machine learning.
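As a concrete illustration of the gradient-based edge selection summarized above, the following is a minimal sketch, not the paper's algorithm: it assumes the spectral gap is measured by the algebraic connectivity (the second-smallest Laplacian eigenvalue, here assumed simple) and uses the standard first-order perturbation result that, for a unit-norm Fiedler vector v, the derivative of that eigenvalue with respect to the weight of edge (i, j) is (v_i - v_j)^2. The function names (`laplacian`, `fiedler_vector`, `best_edge_to_add`) are hypothetical, and the sketch ignores the bandwidth and connection-availability constraints the paper's algorithm handles.

```python
# Sketch: rank candidate edges by the gradient of the Laplacian spectral gap.
# For a simple eigenvalue lambda_2 with unit Fiedler vector v, first-order
# perturbation gives d(lambda_2)/d(w_ij) = (v_i - v_j)^2, so the most promising
# edge to add joins the two nodes that the Fiedler vector separates the most.
import numpy as np

def laplacian(A):
    """Graph Laplacian L = D - A of a symmetric adjacency matrix A."""
    return np.diag(A.sum(axis=1)) - A

def fiedler_vector(A):
    """Unit eigenvector of the second-smallest Laplacian eigenvalue."""
    _, vecs = np.linalg.eigh(laplacian(A))  # eigenvalues in ascending order
    return vecs[:, 1]

def best_edge_to_add(A):
    """Return the absent edge (i, j) with the largest spectral-gap gradient."""
    v = fiedler_vector(A)
    n = A.shape[0]
    best, best_grad = None, -np.inf
    for i in range(n):
        for j in range(i + 1, n):
            if A[i, j] == 0:  # consider only edges not yet in the topology
                grad = (v[i] - v[j]) ** 2  # d(lambda_2)/d(w_ij)
                if grad > best_grad:
                    best, best_grad = (i, j), grad
    return best, best_grad

# Example: on a 6-node path, the Fiedler vector is monotone along the path,
# so the gradient picks the edge joining the two endpoints (closing a ring).
A = np.zeros((6, 6))
for i in range(5):
    A[i, i + 1] = A[i + 1, i] = 1.0
print(best_edge_to_add(A))  # ((0, 5), ...)
```

Edge deletion can be sketched symmetrically by removing the existing edge with the smallest gradient, since it contributes least to the spectral gap; the paper's actual procedure additionally balances per-node communication time, which this sketch omits.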