Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation

Chen Y, Sun X, Jin Y (2020)
IEEE Transactions on Neural Networks and Learning Systems 31(10): 4229-4238.

Journal Article | Published | English
 
Download
No files have been uploaded. Publication record only!
Author(s)
Chen, Yang; Sun, Xiaoyan; Jin, Yaochu (UniBi)
Abstract / Remarks
Federated learning obtains a central model on the server by aggregating models trained locally on clients. As a result, federated learning does not require clients to upload their data to the server, thereby preserving the data privacy of the clients. One challenge in federated learning is to reduce the client-server communication, since the end devices typically have very limited communication bandwidth. This article presents an enhanced federated learning technique by proposing an asynchronous learning strategy on the clients and a temporally weighted aggregation of the local models on the server. In the asynchronous learning strategy, different layers of the deep neural networks (DNNs) are categorized into shallow and deep layers, and the parameters of the deep layers are updated less frequently than those of the shallow layers. Furthermore, a temporally weighted aggregation strategy is introduced on the server to make use of the previously trained local models, thereby enhancing the accuracy and convergence of the central model. The proposed algorithm is empirically evaluated on two data sets with different DNNs. Our results demonstrate that the proposed asynchronous federated deep learning outperforms the baseline algorithm both in terms of communication cost and model accuracy.
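To make the server-side aggregation step concrete, the following minimal Python/NumPy sketch illustrates one plausible reading of a temporally weighted aggregation as summarized in the abstract. The function name, the exponential staleness discount with base e/2, and all parameter names are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def temporally_weighted_aggregate(local_models, data_sizes, timestamps,
                                  current_round, decay=np.e / 2):
    # local_models: list of dicts, layer name -> np.ndarray of parameters
    # data_sizes:   number of training samples on each client
    # timestamps:   round in which each client's model was last uploaded
    # Weight each client by its data size and by the staleness of its last
    # upload: older local models receive exponentially smaller weights
    # (base e/2 is an assumption for this sketch).
    weights = np.array(
        [n_k * decay ** -(current_round - t_k)
         for n_k, t_k in zip(data_sizes, timestamps)],
        dtype=float,
    )
    weights /= weights.sum()

    # Weighted average of every layer's parameters across clients.
    aggregated = {}
    for layer in local_models[0]:
        aggregated[layer] = sum(w * m[layer]
                                for w, m in zip(weights, local_models))
    return aggregated

In this reading, a client whose model was uploaded in the current round contributes with its full data-size weight, while clients whose uploads are several rounds old are discounted exponentially before the layer-wise average is formed.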
Year of Publication
2020
Journal Title
IEEE Transactions on Neural Networks and Learning Systems
Volume
31
Issue
10
Page(s)
4229-4238
ISSN
2162-237X
eISSN
2162-2388
Page URI
https://pub.uni-bielefeld.de/record/2978394

Cite

Chen Y, Sun X, Jin Y. Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation. IEEE Transactions on Neural Networks and Learning Systems. 2020;31(10):4229-4238.
Chen, Y., Sun, X., & Jin, Y. (2020). Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation. IEEE Transactions on Neural Networks and Learning Systems, 31(10), 4229-4238. https://doi.org/10.1109/TNNLS.2019.2953131
Chen, Yang, Sun, Xiaoyan, and Jin, Yaochu. 2020. “Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation”. IEEE Transactions on Neural Networks and Learning Systems 31 (10): 4229-4238.
Chen, Y., Sun, X., and Jin, Y. (2020). Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation. IEEE Transactions on Neural Networks and Learning Systems 31, 4229-4238.
Chen, Y., Sun, X., & Jin, Y., 2020. Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation. IEEE Transactions on Neural Networks and Learning Systems, 31(10), pp. 4229-4238.
Y. Chen, X. Sun, and Y. Jin, “Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation”, IEEE Transactions on Neural Networks and Learning Systems, vol. 31, 2020, pp. 4229-4238.
Chen, Y., Sun, X., Jin, Y.: Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation. IEEE Transactions on Neural Networks and Learning Systems. 31, 4229-4238 (2020).
Chen, Yang, Sun, Xiaoyan, and Jin, Yaochu. “Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation”. IEEE Transactions on Neural Networks and Learning Systems 31.10 (2020): 4229-4238.

Link(s) to Full Text(s)
Access Level
Restricted Closed Access
