TY - GEN
T1 - FedDM: Data and Model Heterogeneity-Aware Federated Learning via Dynamic Weight Sharing
AU - Shen, Leming
AU - Zheng, Yuanqing
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023/10
Y1 - 2023/10
N2 - Federated Learning (FL) plays an indispensable role in edge computing systems. Prevalent FL methods mainly address challenges involved in heterogeneous data distribution across devices. Model heterogeneity, however, has seldom been put under scrutiny. In practice, different devices (e.g., PCs and smartphones) generally have disparate computation and communication resources, necessitating neural network models with varying parameter sizes. Therefore, we propose FedDM, a novel data and model heterogeneity-aware FL system, which improves the FL system's accuracy while reducing edge devices' computation and communication costs for heterogeneous model training. FedDM features: 1) a dynamic weight sharing scheme that handles model heterogeneity by dynamically selecting parts of the large model to share with smaller ones; 2) a tree-structured layer-wise client cooperation scheme that handles data heterogeneity by allowing clients with similar data distributions to share some network layers. We implement FedDM and evaluate it using five public datasets with different tasks.
AB - See N2 above.
KW - Data Heterogeneity
KW - Federated Learning
KW - Model Heterogeneity
KW - Parameter Sharing
UR - http://www.scopus.com/inward/record.url?scp=85175067206&partnerID=8YFLogxK
U2 - 10.1109/ICDCS57875.2023.00093
DO - 10.1109/ICDCS57875.2023.00093
M3 - Conference article published in proceeding or book
AN - SCOPUS:85175067206
T3 - Proceedings - International Conference on Distributed Computing Systems
SP - 975
EP - 976
BT - Proceedings - 2023 IEEE 43rd International Conference on Distributed Computing Systems, ICDCS 2023
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 43rd IEEE International Conference on Distributed Computing Systems, ICDCS 2023
Y2 - 18 July 2023 through 21 July 2023
ER -