Gain without Pain: Offsetting DP-injected Noises Stealthily in Cross-device Federated Learning

Wenzhuo Yang, Yipeng Zhou, Miao Hu, Di Wu, Xi Zheng, Jessie Hui Wang, Song Guo, Chao Li

Research output: Journal article › Academic research › peer-review

8 Citations (Scopus)


Federated Learning (FL) is an emerging paradigm through which decentralized devices can collaboratively train a common model. However, a serious concern is the leakage of privacy through gradient information exchanged between clients and the parameter server (PS) in FL. To protect gradient information, clients can adopt differential privacy (DP) to add noise and distort original gradients before they are uploaded to the PS. Nevertheless, model accuracy is significantly impaired by DP noises, making DP impracticable in real systems. In this work, we propose a novel Noise Information Secretly Sharing (NISS) algorithm to alleviate the disturbance of DP noises by sharing negated noises among clients. We theoretically prove that: 1) if clients are trustworthy, DP noises can be perfectly offset on the PS; 2) clients can easily distort negated DP noises to protect themselves in case other clients are not fully trustworthy, at the cost of lower model accuracy. NISS is particularly applicable for FL across multiple IoT (Internet of Things) systems, in which all IoT devices need to collaboratively train a model. To verify the effectiveness and superiority of the NISS algorithm, we conduct experiments with the MNIST and CIFAR-10 datasets. The experimental results verify our analysis and demonstrate that NISS can improve model accuracy by 19% on average and obtain better privacy protection if clients are trustworthy.
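The core offsetting idea described in the abstract can be illustrated with a minimal simulation. The sketch below is not the authors' implementation; all names, the Gaussian noise model, and the equal-share splitting scheme are illustrative assumptions. Each client adds DP-style noise to its own gradient and secretly distributes negated shares of that noise to the other clients, so that the noises cancel exactly in the server-side aggregate while each individual upload still looks noisy.

```python
import numpy as np

rng = np.random.default_rng(0)
num_clients, dim, sigma = 4, 5, 1.0

# Hypothetical local gradients (stand-ins for real training updates).
grads = [rng.normal(size=dim) for _ in range(num_clients)]

# Each client draws DP-style Gaussian noise for its own gradient.
noises = [rng.normal(scale=sigma, size=dim) for _ in range(num_clients)]

# Each client uploads its noised gradient ...
uploads = [g + n for g, n in zip(grads, noises)]

# ... and secretly sends equal negated shares of its noise to the
# other clients, who fold the shares into their own uploads.
for i, n in enumerate(noises):
    share = -n / (num_clients - 1)
    for j in range(num_clients):
        if j != i:
            uploads[j] = uploads[j] + share

# On the server, the injected noises cancel in the aggregate.
aggregate = sum(uploads)
true_sum = sum(grads)
print(np.allclose(aggregate, true_sum))  # noises offset perfectly
```

If a client does not fully trust its peers, it can distort the negated shares before sending them (as the abstract notes), in which case the cancellation is only partial and some accuracy is traded for protection.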

Original language: English
Pages (from-to): 22147-22157
Journal: IEEE Internet of Things Journal
Publication status: Published - Nov 2022


Keywords

  • Computational modeling
  • Differential privacy
  • Distortion
  • Federated Learning
  • Internet of Things
  • Machine learning
  • Privacy
  • Secretly Offsetting
  • Training

ASJC Scopus subject areas

  • Signal Processing
  • Information Systems
  • Hardware and Architecture
  • Computer Science Applications
  • Computer Networks and Communications

