FedConv: A Learning-on-Model Paradigm for Heterogeneous Federated Clients

Leming Shen, Qiang Yang, Kaiyan Cui, Yuanqing Zheng, Xiao Yong Wei, Jianwei Liu, Jinsong Han

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

6 Citations (Scopus)

Abstract

Federated Learning (FL) facilitates collaborative training of a shared global model without exposing clients' private data. In practical FL systems, clients (e.g., edge servers, smartphones, and wearables) typically have disparate system resources. Conventional FL, however, adopts a one-size-fits-all solution, where a homogeneous large global model is transmitted to and trained on each client, overwhelming less capable clients while leaving more capable ones under-utilized. To address this issue, we propose FedConv, a client-friendly FL framework that minimizes the computation and memory burden on resource-constrained clients by providing heterogeneous customized sub-models. FedConv features a novel learning-on-model paradigm that learns the parameters of the heterogeneous sub-models via convolutional compression. Unlike traditional compression methods, the compressed models in FedConv can be trained directly on clients without decompression. To aggregate the heterogeneous sub-models, we propose transposed convolutional dilation, which converts them back to large models of a unified size while retaining personalized information from clients. The compression and dilation processes, transparent to clients, are optimized on the server leveraging a small public dataset. Extensive experiments on six datasets demonstrate that FedConv outperforms state-of-the-art FL systems in model accuracy (by more than 35% on average) and reduces computation and communication overhead by 33% and 25%, respectively.
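The learning-on-model idea can be illustrated with a toy sketch: a large model's flat parameter vector is compressed into a smaller sub-model by a strided 1-D convolution, and the sub-model is later dilated back to the original size by the corresponding transposed convolution. This is a hypothetical NumPy illustration only, not the paper's implementation; in FedConv the convolution kernels are themselves learned on the server using a small public dataset, and the function names and shapes below are assumptions for exposition.

```python
import numpy as np

def conv_compress(params, kernel, stride):
    """Compress a flat parameter vector with a strided 1-D convolution.

    Each output entry is the dot product of the kernel with one
    stride-spaced window of the large model's parameters.
    """
    k = len(kernel)
    out_len = (len(params) - k) // stride + 1
    return np.array([np.dot(params[i * stride : i * stride + k], kernel)
                     for i in range(out_len)])

def transposed_conv_dilate(sub_params, kernel, stride, out_len):
    """Dilate a small sub-model back to the full parameter size.

    Transposed convolution: each sub-model entry scatters a scaled copy
    of the kernel into the full-size vector (overlaps are summed).
    """
    k = len(kernel)
    full = np.zeros(out_len)
    for i, v in enumerate(sub_params):
        full[i * stride : i * stride + k] += v * kernel
    return full

# Toy example: a 16-parameter "layer" compressed to a 4-parameter sub-model,
# then dilated back to a unified 16-parameter shape for aggregation.
rng = np.random.default_rng(0)
big_model = rng.standard_normal(16)
kernel = rng.standard_normal(4)

sub_model = conv_compress(big_model, kernel, stride=4)            # shape (4,)
restored = transposed_conv_dilate(sub_model, kernel, stride=4,
                                  out_len=16)                     # shape (16,)
```

In this sketch the sub-model is trained directly in its compressed form (no client-side decompression), and dilation only runs on the server when heterogeneous sub-models must be mapped back to a common size for aggregation, mirroring the roles the abstract assigns to compression and dilation.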

Original language: English
Title of host publication: MOBISYS 2024 - Proceedings of the 2024 22nd Annual International Conference on Mobile Systems, Applications and Services
Publisher: Association for Computing Machinery, Inc
Pages: 398-411
Number of pages: 14
ISBN (Electronic): 9798400705816
DOIs
Publication status: Published - 3 Jun 2024
Event: 22nd Annual International Conference on Mobile Systems, Applications and Services, MOBISYS 2024 - Minato-ku, Japan
Duration: 3 Jun 2024 - 7 Jun 2024

Publication series

Name: MOBISYS 2024 - Proceedings of the 2024 22nd Annual International Conference on Mobile Systems, Applications and Services

Conference

Conference: 22nd Annual International Conference on Mobile Systems, Applications and Services, MOBISYS 2024
Country/Territory: Japan
City: Minato-ku
Period: 3/06/24 - 7/06/24

Keywords

  • federated learning
  • model compression
  • model heterogeneity

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Information Systems
  • Software
  • Safety, Risk, Reliability and Quality
  • Health Informatics
  • Instrumentation
  • Radiation
