
FedADC: Accelerated Federated Learning with Drift Control / Ozfatura, E.; Ozfatura, K.; Gunduz, D. - 2021-:(2021), pp. 467-472. (Paper presented at the conference 2021 IEEE International Symposium on Information Theory, ISIT 2021, held in aus in 2021) [10.1109/ISIT45174.2021.9517850].

FedADC: Accelerated Federated Learning with Drift Control

Gunduz D.
2021

Abstract

Federated learning (FL) has become the de facto framework for collaborative learning among edge devices with privacy concerns. The core of the FL strategy is the use of stochastic gradient descent (SGD) in a distributed manner. Large-scale implementation of FL brings new challenges, such as the incorporation of acceleration techniques designed for SGD into the distributed setting, and the mitigation of the drift problem caused by the non-homogeneous distribution of local datasets. These two problems have been studied separately in the literature; in this paper, however, we show that both can be addressed with a single strategy, without any major alteration to the FL framework and without introducing additional computation or communication load. To achieve this goal, we propose FedADC, an accelerated FL algorithm with drift control. We empirically illustrate the advantages of FedADC.
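The abstract describes FL as distributed SGD with server-side aggregation, into which an acceleration technique is folded without extra communication. Purely as an illustrative sketch of that general idea (this is NOT the FedADC algorithm, whose details are given in the paper), the toy Python example below runs FedAvg-style local SGD on two clients with non-identical data and applies momentum at the server, in the spirit of server-side acceleration; the model, client data, and all hyperparameters are hypothetical.

```python
# Toy sketch: FedAvg-style local SGD plus server-side momentum.
# NOT the FedADC algorithm from the paper -- model, data, and
# hyperparameters below are hypothetical, for illustration only.
import random

random.seed(0)

def local_sgd(w, data, lr=0.1, steps=5):
    """A few SGD steps on one client for the 1-D model y = w * x
    with squared loss; returns the pseudo-gradient w - w_local."""
    w_local = w
    for _ in range(steps):
        x, y = random.choice(data)
        grad = 2 * (w_local * x - y) * x
        w_local -= lr * grad
    return w - w_local  # update the client sends to the server

# Two clients with non-homogeneous data (slopes 2 and 3):
clients = [
    [(x, 2.0 * x) for x in (0.5, 1.0, 1.5)],
    [(x, 3.0 * x) for x in (0.5, 1.0, 1.5)],
]

w, momentum, beta, server_lr = 0.0, 0.0, 0.9, 1.0
for _ in range(100):
    updates = [local_sgd(w, data) for data in clients]
    avg_update = sum(updates) / len(updates)
    momentum = beta * momentum + avg_update  # acceleration at the server
    w -= server_lr * momentum

# w ends up near 2.5, between the two clients' optimal slopes.
```

Because the momentum buffer lives only at the server, this kind of acceleration adds no per-round communication beyond the usual model updates, which is the property the abstract highlights.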
Year: 2021
Conference: 2021 IEEE International Symposium on Information Theory, ISIT 2021
Location: aus
Volume: 2021-
Pages: 467-472
Authors: Ozfatura, E.; Ozfatura, K.; Gunduz, D.
Files in this product:
FedADC_Accelerated_Federated_Learning_with_Drift_Control.pdf
Access: Restricted
Type: Version published by the publisher
Size: 481.13 kB
Format: Adobe PDF

Creative Commons License
Metadata in IRIS UNIMORE are released under the Creative Commons CC0 1.0 Universal license, while publication files are released under the Attribution 4.0 International (CC BY 4.0) license, unless otherwise indicated.
In case of copyright infringement, contact Iris Support.

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1280018
Citations
  • PubMed Central: ND
  • Scopus: 22
  • Web of Science: 15