Towards Data-Independent Knowledge Transfer in Model-Heterogeneous Federated Learning

Jie Zhang, Song Guo, Jingcai Guo, Deze Zeng, Jingren Zhou, Albert Zomaya

Research output: Journal article (peer-reviewed)

3 Citations (Scopus)

Abstract

Federated Distillation (FD) extends classic Federated Learning (FL) to a more general training framework that enables model-heterogeneous collaborative learning via Knowledge Distillation (KD) across multiple clients and the server. However, existing KD-based algorithms usually require a set of shared input samples on which each client produces soft predictions for distillation. Worse still, such manual sample selection requires careful deliberation or prior knowledge of clients' private data distributions, which conflicts with the privacy-preserving nature of classic FL. In this paper, we propose a novel training framework that achieves data-independent knowledge transfer by designing a distributed generative adversarial network (GAN) between the server and clients, which synthesizes shared feature representations to facilitate FD training. Specifically, we deploy a generator on the server and reuse each local model as a federated discriminator, forming a lightweight and efficient distributed GAN that automatically synthesizes simulated global feature representations for distillation. Moreover, since the synthesized feature representations are more faithful to and consistent with the global data distribution, faster and better training convergence can be obtained. Extensive experiments on different tasks and heterogeneous models demonstrate the effectiveness of the proposed framework in terms of model accuracy and communication overhead.
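The abstract's core mechanism can be illustrated with a minimal sketch: a server-side generator synthesizes shared feature representations, heterogeneous client heads return soft predictions on them, and each client distills toward the aggregated consensus. This is an illustrative NumPy toy, not the paper's implementation; the generator here is a fixed random map (standing in for a trained one), the adversarial training of the federated discriminators is omitted, and all class/function names (`Generator`, `ClientHead`, `distill_step`) are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
FEAT_DIM, NUM_CLASSES, NUM_CLIENTS, BATCH = 16, 5, 3, 32

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class Generator:
    """Server-side generator: noise -> synthetic feature representations.
    Illustrative stand-in: a fixed random linear map, not a trained GAN generator."""
    def __init__(self, noise_dim=8):
        self.noise_dim = noise_dim
        self.W = rng.normal(size=(noise_dim, FEAT_DIM))
    def sample(self, n):
        z = rng.normal(size=(n, self.noise_dim))
        return np.tanh(z @ self.W)  # bounded synthetic feature batch

class ClientHead:
    """Each client owns its own (heterogeneous) classifier head, but all heads
    consume the same shared feature space synthesized by the server."""
    def __init__(self):
        self.W = rng.normal(scale=0.1, size=(FEAT_DIM, NUM_CLASSES))
    def predict(self, feats):
        return softmax(feats @ self.W)
    def distill_step(self, feats, teacher_probs, lr=0.05):
        # One gradient step on cross-entropy against the ensemble soft labels.
        probs = self.predict(feats)
        grad = feats.T @ (probs - teacher_probs) / len(feats)
        self.W -= lr * grad

def kl(p, q, eps=1e-12):
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=1))

gen = Generator()
clients = [ClientHead() for _ in range(NUM_CLIENTS)]

feats = gen.sample(BATCH)                    # server broadcasts synthetic features
preds = [c.predict(feats) for c in clients]  # clients return soft predictions
teacher = np.mean(preds, axis=0)             # server aggregates the ensemble

before = [kl(teacher, p) for p in preds]
for c in clients:
    c.distill_step(feats, teacher)           # each client distills toward consensus
after = [kl(teacher, c.predict(feats)) for c in clients]

# After one distillation step, every client head is closer to the ensemble.
print(all(a < b for a, b in zip(after, before)))
```

Because no real data leaves the clients, only soft predictions on server-synthesized features are exchanged, which is what makes the knowledge transfer data-independent in the sense the abstract describes.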

Original language: English
Pages (from-to): 1-13
Number of pages: 13
Journal: IEEE Transactions on Computers
Publication status: Published - May 2023

Keywords

  • Adaptation models
  • Computational modeling
  • Data models
  • Federated learning
  • GAN
  • Generators
  • knowledge transfer
  • model heterogeneity
  • Servers
  • Training

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Hardware and Architecture
  • Computational Theory and Mathematics

