Conference paper · Open Access

Hybrid Federated and Centralized Learning

   Elbir, Ahmet M.; Coleri, Sinem; Mishra, Kumar Vijay

Many machine learning tasks rely on centralized learning (CL), which requires the transmission of local datasets from the clients to a parameter server (PS) and thus incurs substantial communication overhead. Federated learning (FL) overcomes this issue by allowing the clients to send only their model updates to the PS instead of the whole dataset. In this way, FL brings learning to the edge, but it demands powerful computational resources on the client side. This requirement may not always be satisfied because of the diverse computational capabilities of edge devices. We address this through a novel hybrid federated and centralized learning (HFCL) framework that effectively trains a learning model by exploiting the computational capabilities of the clients. In HFCL, only the clients with sufficient resources employ FL; the remaining clients resort to CL by transmitting their local datasets to the PS. This allows all clients to collaborate on the learning process regardless of their computational resources. We also propose a sequential data transmission approach with HFCL (HFCL-SDT) to reduce the training duration. The proposed HFCL frameworks outperform non-hybrid FL-based schemes in learning accuracy and non-hybrid CL-based schemes in communication overhead, since all clients collaborate on the learning process with their datasets regardless of their computational resources.
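The hybrid scheme described in the abstract can be illustrated with a minimal sketch: resource-rich clients compute a local model update (FL), while resource-limited clients ship their raw data to the PS, which trains on the pooled data (CL) and aggregates everything into one global model. All names (`grad_step`, `hfcl_round`, the toy scalar model) and the one-step-per-round training loop are illustrative assumptions, not the paper's actual algorithm.

```python
# Illustrative HFCL round (assumption: toy scalar linear model y ~ w*x,
# one gradient step per client per round; not the paper's exact method).

def grad_step(w, data, lr=0.1):
    """One gradient-descent step on mean squared error for y ~ w*x."""
    g = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * g

def hfcl_round(w_global, fl_datasets, cl_datasets, lr=0.1):
    """One HFCL round.

    fl_datasets: data held by FL-capable clients; each computes its own
                 update locally and sends only the model back.
    cl_datasets: data from resource-limited clients, transmitted to the
                 PS, which performs the centralized update itself.
    """
    models = [grad_step(w_global, d, lr) for d in fl_datasets]  # edge-side FL updates
    pooled = [s for d in cl_datasets for s in d]                # PS-side pooled data
    if pooled:
        models.append(grad_step(w_global, pooled, lr))          # centralized update
    return sum(models) / len(models)                            # aggregate at the PS

# Toy data drawn from y = 2x: two FL-capable clients, one CL client.
fl = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
cl = [[(4.0, 8.0)]]
w = 0.0
for _ in range(50):
    w = hfcl_round(w, fl, cl)
```

Under this setup every client's data influences the global model, matching the abstract's claim that all clients collaborate regardless of their computational resources; here `w` approaches the true slope of 2.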
