Federated learning paper study (FedBoost: Communication-Efficient Algorithms for Federated Learning)


Summary:

  Unlike gradient compression and model compression, FedBoost is an ensemble-learning algorithm that reduces both server-to-client and client-to-server communication cost, improving communication efficiency.
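The communication saving comes from training only the ensemble weights over fixed, pre-trained base predictors, and shipping just a sampled subset of those predictors each round. The following is a minimal sketch of that idea, not the authors' reference implementation: the sampling ratio, learning rate, squared loss, and toy linear base predictors are all illustrative assumptions.

```python
# Sketch of a FedBoost-style round (illustrative, not the paper's exact
# algorithm): base predictors are pre-trained and frozen; only the K
# ensemble weights are trained, and each round the server transmits a
# random subset of predictors to reduce server-to-client traffic.
import numpy as np

rng = np.random.default_rng(0)

K = 5          # number of pre-trained base predictors
N_CLIENTS = 3
SUBSET = 2     # predictors sampled per round (the communication saving)

# Toy "pre-trained" base predictors: fixed 1-D linear models.
bases = [lambda x, w=w: w * x for w in rng.normal(size=K)]

# Each client's private data, drawn from a common toy distribution.
client_data = []
for _ in range(N_CLIENTS):
    x = rng.normal(size=20)
    client_data.append((x, 1.5 * x + 0.1 * rng.normal(size=20)))

alpha = np.full(K, 1.0 / K)  # ensemble weights: the only trained parameters

def client_update(x, y, alpha, idx, lr=0.1):
    """One local gradient step on the sampled ensemble weights only."""
    preds = np.stack([bases[k](x) for k in idx])   # (|idx|, n) predictions
    ens = alpha[idx] @ preds                       # partial ensemble output
    grad = 2.0 * (preds @ (ens - y)) / len(x)      # d(squared loss)/d alpha
    return alpha[idx] - lr * grad

for _ in range(50):
    idx = rng.choice(K, size=SUBSET, replace=False)  # server samples subset
    updates = [client_update(x, y, alpha, idx) for x, y in client_data]
    alpha[idx] = np.mean(updates, axis=0)            # FedAvg on the weights
    alpha = np.clip(alpha, 0.0, None)
    alpha /= alpha.sum()                             # keep weights on the simplex
```

Only the `SUBSET` sampled predictors (downlink) and the small weight updates (uplink) cross the network each round, which is the source of the efficiency gain described above.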

Background on ensemble learning: 「集成學習(ensemble learning)原理詳解」 (春華秋實, CSDN blog).

Main advantages:

  1. Pre-trained base predictors: base predictors can be pre-trained on publicly available data, thus reducing the need for user data in training.

  2. Convergence guarantees: ensemble methods often require training relatively few parameters, which typically results in far fewer rounds of optimization and faster convergence than training the entire model from scratch.

  3. Adaptation to drift over time: user data may change over time, but in the ensemble approach we can keep the base predictors fixed and retrain only the ensemble weights whenever the data changes.

  4. Differential privacy (DP): federated learning can be combined with global DP to provide an additional layer of privacy. Training only the ensemble weights via federated learning is well suited to DP, since the utility-privacy trade-off depends on the number of parameters being trained. Furthermore, this learning problem is typically a convex optimization problem, for which DP convex optimization can give better privacy guarantees.
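Point 4 can be made concrete with a DP-SGD-style aggregation step: because only the K ensemble weights are trained, the noise is added to a K-dimensional vector rather than to millions of model parameters. The sketch below is a hedged illustration; the clipping norm and noise multiplier are arbitrary placeholders, not values calibrated to any formal (epsilon, delta) guarantee.

```python
# Illustrative DP-style aggregation of ensemble-weight updates: clip each
# client's update to a fixed norm, average, and add Gaussian noise.
# CLIP and SIGMA are assumed hyperparameters, not calibrated privacy values.
import numpy as np

rng = np.random.default_rng(1)
K = 5           # number of ensemble weights (the only trained parameters)
CLIP = 1.0      # per-client clipping norm
SIGMA = 0.1     # illustrative noise multiplier

def dp_aggregate(client_updates):
    """Clip each client's K-dim update, average, and add Gaussian noise."""
    clipped = []
    for u in client_updates:
        # Scale down so the update norm is at most CLIP (never scale up).
        clipped.append(u / max(1.0, np.linalg.norm(u) / CLIP))
    mean = np.mean(clipped, axis=0)
    noise = rng.normal(scale=SIGMA * CLIP / len(client_updates), size=K)
    return mean + noise

# Toy client updates standing in for the gradients on the ensemble weights.
updates = [rng.normal(size=K) for _ in range(10)]
agg = dp_aggregate(updates)
```

Since the noise magnitude needed for a given privacy level grows with the dimension of the released vector, perturbing K weights costs far less utility than perturbing a full model, which is the trade-off the paragraph above refers to.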

 

