Towards Fairer and More Efficient Federated Learning via Multidimensional Personalized Edge Models

Yingchun Wang, Jingcai Guo, Jie Zhang, Song Guo, W. J. Zhang, Qinghua Zheng

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

Federated learning (FL) is an emerging technique that trains models on massive, geographically distributed edge data while preserving privacy. However, FL faces inherent challenges in fairness and computational efficiency due to the growing heterogeneity of edge devices, which often leads to sub-optimal performance in recent state-of-the-art (SOTA) solutions. In this paper, we propose a Customized Federated Learning (CFL) system to mitigate FL heterogeneity along multiple dimensions. Specifically, CFL tailors a personalized model for each client from a specially designed global model, jointly guided by an online-trained model-search helper and a novel aggregation algorithm. Extensive experiments demonstrate that CFL offers full-stack advantages for both FL training and edge inference, and significantly improves on SOTA performance in terms of model accuracy (by up to 7.2% in the non-heterogeneous setting and up to 21.8% in the heterogeneous setting), efficiency, and FL fairness.
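The abstract does not detail CFL's aggregation algorithm; for orientation only, below is a minimal sketch of the standard FedAvg-style weighted aggregation that personalized FL methods typically build on. All names here are hypothetical illustrations, not the paper's actual implementation:

```python
# Generic FedAvg-style weighted aggregation (illustration only; this is
# NOT the paper's CFL aggregation algorithm, which is in the full text).

def aggregate(client_models, client_sizes):
    """Average client parameter dicts, weighted by local dataset size."""
    total = sum(client_sizes)
    agg = {}
    for name in client_models[0]:
        # Weighted sum of each parameter across clients.
        agg[name] = sum(
            m[name] * (n / total) for m, n in zip(client_models, client_sizes)
        )
    return agg

# Two toy "models", each with a single scalar parameter.
clients = [{"w": 1.0}, {"w": 3.0}]
sizes = [10, 30]
print(aggregate(clients, sizes))  # {'w': 2.5}
```

Personalization schemes such as CFL depart from this uniform scheme by giving each client its own tailored model rather than one shared global model.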

Original language: English
Title of host publication: IJCNN 2023 - International Joint Conference on Neural Networks, Proceedings
Pages: 1-8
Number of pages: 8
ISBN (Electronic): 9781665488679
DOIs
Publication status: Published - Aug 2023

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2023-June

Keywords

  • Deep Learning
  • Edge Computing
  • Federated Learning
  • Model Compression
  • Neural Architecture Search

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
