Received: May 03, 2023; Revised: June 06, 2023
Abstract: Federated learning allows multiple users to collaboratively train a model without sharing their original data. To ensure that users' local datasets are not leaked, existing works have proposed secure aggregation protocols. However, most existing schemes neglect the privacy of the global model and incur high computation and communication costs. To address these problems, this study proposes an efficient, strongly secure privacy-preserving aggregation scheme for federated learning. The scheme uses symmetric homomorphic encryption to protect the privacy of both the user models and the global model, and adopts secret sharing to handle user dropout. Meanwhile, Pedersen commitments are applied to verify the correctness of the aggregation results returned by the cloud server, and BLS signatures are used to protect data integrity during interactions between the users and the cloud server. In addition, security analysis shows that the proposed scheme is provably secure, and performance analysis shows that it is efficient and practical for federated learning systems with large-scale users.
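The abstract names Pedersen commitments as the mechanism for verifying the server's aggregation result. As a rough illustration of why such a check can work, the sketch below (in Python, with hypothetical toy parameters; the paper's actual group, encoding, and protocol messages are not reproduced here) relies on the additive homomorphism of Pedersen commitments: the product of the users' individual commitments is itself a commitment to the sum of their values.

```python
import secrets

# Toy group parameters (illustrative only; a real deployment would use a
# large cryptographic group, e.g. a 256-bit elliptic-curve group, with
# independently generated bases g and h).
p = 2**127 - 1        # prime modulus of the multiplicative group (toy choice)
q = p - 1             # exponents/randomness are taken modulo the group order
g = 5                 # first base (assumed independent of h)
h = 7                 # second base with unknown discrete log relative to g

def commit(m: int, r: int) -> int:
    """Pedersen commitment C = g^m * h^r mod p."""
    return (pow(g, m, p) * pow(h, r, p)) % p

# Each user commits to its (already masked/encrypted) model update m_i.
updates = [13, 42, 7]                     # stand-ins for quantized model weights
rands = [secrets.randbelow(q) for _ in updates]
commitments = [commit(m, r) for m, r in zip(updates, rands)]

# The server claims an aggregate. Commitments aggregate homomorphically:
# commit(m1, r1) * commit(m2, r2) = commit(m1 + m2, r1 + r2)  (mod p).
agg_result = sum(updates)
agg_rand = sum(rands) % q

prod = 1
for c in commitments:
    prod = (prod * c) % p

# Users verify the claimed sum against the product of the commitments.
assert prod == commit(agg_result, agg_rand), "aggregation check failed"
print("aggregation verified:", agg_result)
```

In the full scheme this check is combined with symmetric homomorphic encryption of the updates, secret sharing for dropout recovery, and BLS signatures on the exchanged messages; the sketch above isolates only the commitment-based verification step.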
Citation:
WANG Shan, JING Tao, XIAO Gan-Wen, ZHANG Xin-Lin. Efficient Privacy-preserving Secure Aggregation Scheme for Federated Learning. COMPUTER SYSTEMS APPLICATIONS, 2023, 32(11): 175-181.