Received: December 29, 2021    Revised: January 29, 2022
Abstract: Federated learning protects user privacy by aggregating models trained on the clients, so that the raw data never leaves the client. Because a large number of devices participate in training, the data are non-independent and identically distributed (non-IID) and the communication bandwidth is limited; reducing the communication cost is therefore an important research direction in federated learning. Gradient compression is an effective way to improve the communication efficiency of federated learning, but most commonly used gradient compression methods are designed for IID data and do not take the characteristics of federated learning into account. For federated scenarios with non-IID data, this study proposes a projection-based sparse ternary compression algorithm: gradients are compressed on both the client and the server to reduce the communication cost, and the server aggregates updates with a gradient-projection strategy to mitigate the adverse effects of non-IID client data. Experimental results show that the proposed algorithm not only improves communication efficiency but also outperforms existing gradient compression algorithms in convergence speed and accuracy.
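The abstract gives no implementation details, so purely as an illustration of the two ingredients it names, the sketch below shows a generic top-k sparse ternary compressor (keep only the largest-magnitude gradient entries and replace them with a signed mean magnitude, in the spirit of sparse ternary compression) and a PCGrad-style projection step that resolves conflicting client updates before averaging. The function names, the sparsity ratio, and the pairwise projection rule are assumptions made for illustration and are not taken from the paper.

```python
import torch

def sparse_ternary_compress(grad: torch.Tensor, sparsity: float = 0.01):
    """Keep the top-k entries by magnitude and ternarize them to {-mu, 0, +mu}.

    Returns (values, indices, shape) so only the sparse representation is sent.
    The sparsity ratio and return format are illustrative assumptions.
    """
    flat = grad.flatten()
    k = max(1, int(sparsity * flat.numel()))
    # Indices of the k largest-magnitude entries.
    idx = torch.topk(flat.abs(), k).indices
    # Common magnitude: mean absolute value of the retained entries.
    mu = flat[idx].abs().mean()
    vals = mu * flat[idx].sign()              # ternary values in {-mu, 0, +mu}
    return vals, idx, grad.shape

def decompress(vals, idx, shape):
    """Rebuild a dense tensor from the sparse ternary representation."""
    flat = torch.zeros(torch.Size(shape).numel(), device=vals.device)
    flat[idx] = vals
    return flat.reshape(shape)

def project_conflicts(updates):
    """PCGrad-style aggregation (an assumption for how 'gradient projection'
    might be realized): if two client updates point in conflicting directions
    (negative inner product), project one onto the normal plane of the other
    before averaging."""
    projected = [u.clone() for u in updates]
    for i, u in enumerate(projected):
        for j, v in enumerate(updates):
            if i == j:
                continue
            dot = torch.dot(u.flatten(), v.flatten())
            if dot < 0:                        # conflicting directions
                u -= dot / v.flatten().norm() ** 2 * v
    return torch.stack(projected).mean(dim=0)
```

Under these assumptions, a client would call sparse_ternary_compress on its local update before upload; the server would decompress each client's update, aggregate them with project_conflicts, and compress the aggregated update again before broadcasting it back to the clients.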
Keywords: federated learning; communication efficiency; non-independent and identically distributed (non-IID) data; gradient compression
Author Name | Affiliation | Email
TIAN Jin-Xiao | School of Computer and Artificial Intelligence, Southwest Jiaotong University, Chengdu 611756, China | tianjinx@foxmail.com
Citation:
TIAN Jin-Xiao. Gradient Compression Algorithm for Improving Communication Efficiency of Federated Learning. Computer Systems & Applications, 2022, 31(10): 199-205.