Ozfatura, E.; Ozfatura, K.; Gunduz, D. Time-Correlated Sparsification for Communication-Efficient Federated Learning. In: 2021 IEEE International Symposium on Information Theory (ISIT 2021), Australia, 2021, pp. 461-466. DOI: 10.1109/ISIT45174.2021.9518221.

Time-Correlated Sparsification for Communication-Efficient Federated Learning

Gunduz D.
2021

Abstract

Federated learning (FL) enables multiple clients to collaboratively train a shared model, with the help of a parameter server (PS), without disclosing their local datasets. However, due to the increasing size of the trained models, the communication load of the iterative exchanges between the clients and the PS often becomes a performance bottleneck. Sparse communication is often employed to reduce this load: only a small subset of the model updates is communicated from the clients to the PS. In this paper, we introduce a novel time-correlated sparsification (TCS) scheme, which builds upon the notion that the sparse communication framework can be regarded as identifying the most significant elements of the underlying model. TCS therefore exploits the correlation between the sparse representations at consecutive iterations of FL, so that the overhead of encoding the sparse representation can be significantly reduced without compromising the test accuracy. Through extensive simulations on the CIFAR-10 dataset, we show that TCS can achieve centralized training accuracy with 100-fold sparsification, and up to a 2000-fold reduction in the communication load when combined with quantization.
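The abstract only sketches the mechanism, so a short illustration may help. The following is a minimal NumPy sketch of the time-correlation idea, not the authors' exact algorithm: the client reuses the previous round's sparsity pattern, whose indices the server already knows, and pays index-encoding bits only for a few fresh positions. The function name, the parameters k and k_new, and the exact masking rule are all assumptions for illustration.

```python
import numpy as np

def tcs_sparsify(update, prev_mask_idx, k, k_new):
    """Hypothetical sketch of time-correlated sparsification (TCS).

    Reuses the previous round's sparsity pattern (indices already
    known at the server) and explicitly encodes only k_new fresh
    top-magnitude positions outside that pattern, so the index
    overhead is paid only for the new entries.
    """
    # Magnitudes outside the previous mask: candidates for new indices.
    outside = np.abs(update).copy()
    outside[prev_mask_idx] = 0.0
    new_idx = np.argpartition(outside, -k_new)[-k_new:]

    # Keep the k largest-magnitude entries among reused + new positions.
    candidate_idx = np.union1d(prev_mask_idx, new_idx)
    order = np.argsort(np.abs(update[candidate_idx]))[::-1]
    mask_idx = candidate_idx[order[:k]]

    sparse = np.zeros_like(update)
    sparse[mask_idx] = update[mask_idx]
    # Only new_idx needs explicit index bits on the uplink.
    return sparse, mask_idx, new_idx

# Toy usage: a 1000-dimensional model update, 1% sparsity budget.
rng = np.random.default_rng(0)
g = rng.normal(size=1000)
prev_idx = np.argpartition(np.abs(g), -10)[-10:]  # stand-in previous mask
sparse_g, mask_idx, fresh_idx = tcs_sparsify(g, prev_idx, k=10, k_new=2)
```

Under these assumptions, the uplink cost per round is k values plus index bits for only k_new positions rather than for all k, which is where the encoding-overhead saving described in the abstract comes from.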
Year: 2021
Conference: 2021 IEEE International Symposium on Information Theory, ISIT 2021
Location: Australia
Conference year: 2021
Volume: 2021-
Pages: 461-466
Authors: Ozfatura, E.; Ozfatura, K.; Gunduz, D.
Files in this record:
Time-Correlated_Sparsification_for_Communication-Efficient_Federated_Learning.pdf
Type: Publisher's published version
Access: Restricted (copy available on request)
Format: Adobe PDF
Size: 352.44 kB

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/1280118
Citations
  • PubMed Central: not available
  • Scopus: 19
  • Web of Science: 14