Gradient Scheduling with Global Momentum for Asynchronous Federated Learning in Edge Environment

Haozhao Wang, Ruixuan Li, Chengjie Li, Pan Zhou, Yuhua Li, Wenchao Xu, Song Guo

Research output: Journal article (Academic research, peer-reviewed)

Abstract

Federated Learning has attracted widespread attention in recent years because it allows massive numbers of edge nodes to collaboratively train machine learning models without sharing their private datasets. However, these edge nodes are usually heterogeneous in computational capability and statistically different in data distribution (i.e., non-IID), leading to significant performance degradation. Although existing asynchronous training methods can address the heterogeneity issue, they cannot prevent the non-IID problem from reducing the convergence rate. In this paper, we propose a novel paradigm that schedules gradients with partial averaging and applies global momentum (GSGM) for asynchronous training over non-IID datasets in edge environments. Our key idea is to apply global momentum and a partial average to the biased gradients computed on edge nodes after scheduling, so as to stabilize the training process. Empirical results demonstrate that GSGM adapts well to different degrees of non-IID data and brings 20% gains in training stability for popular optimization algorithms, with enhanced accuracy on the Fashion-MNIST and CIFAR-10 datasets.
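
The abstract does not give implementation details, but the server-side update it describes can be illustrated. Below is a minimal sketch, assuming a server that keeps the most recent gradient from each edge node, averages over the nodes it has heard from so far (the "partial average"), and drives a global momentum buffer with that average. All names (PartialAverageServer, beta, lr) and the exact averaging rule are illustrative assumptions, not the paper's GSGM algorithm.

```python
import numpy as np

class PartialAverageServer:
    """Hypothetical asynchronous server combining a partial average of
    per-node gradients with a global momentum buffer, as sketched in
    the abstract. Hyperparameter names and values are assumptions."""

    def __init__(self, dim, lr=0.01, beta=0.9):
        self.w = np.zeros(dim)    # global model parameters
        self.m = np.zeros(dim)    # global momentum buffer
        self.lr = lr              # server learning rate (assumed)
        self.beta = beta          # momentum coefficient (assumed)
        self.latest = {}          # most recent gradient from each edge node

    def receive(self, node_id, grad):
        """Called asynchronously whenever an edge node uploads a gradient."""
        self.latest[node_id] = grad
        # Partial average over the nodes heard from so far, which damps
        # the bias of any single node's non-IID gradient.
        g_avg = np.mean(list(self.latest.values()), axis=0)
        # Global momentum applied to the partially averaged gradient.
        self.m = self.beta * self.m + g_avg
        self.w -= self.lr * self.m
        return self.w

# Usage: gradients arrive from nodes in arbitrary order.
server = PartialAverageServer(dim=10)
server.receive(0, np.random.randn(10))
server.receive(3, np.random.randn(10))
```

The intent of averaging before the momentum update, rather than applying each raw gradient directly, is to keep any one node's skewed data distribution from dominating a single asynchronous step.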
Original language: English
Pages (from-to): 1-13
Journal: IEEE Internet of Things Journal
Publication status: Published - 25 Mar 2022
