Research on Edge oriented Federated Learning Training Optimization

Authors

  • Wenzhe Zhang
  • Yong Liu
  • Xiaoli Song

DOI:

https://doi.org/10.54097/976ks692

Keywords:

Federated learning, Continual learning, Heterogeneous data

Abstract

Federated learning, as an emerging distributed computing paradigm, reduces communication bandwidth consumption while preserving data privacy and security, and allows terminal devices to train collaboratively on one another's data. However, in real edge computing scenarios, the data collected by edge devices are usually heterogeneous, which leads to weight divergence, catastrophic forgetting of knowledge, and related problems during federated training. Many existing federated learning methods improve only one side, either the local client update or the global model update, and inevitably overlook the impact of the other. This article therefore proposes a group-based federated continual learning method (FCL). First, clients with similar data distributions are grouped together to reduce the weight divergence between clients within the same group. Then, a Synaptic Intelligence regularization term is introduced into the local training process, and the learning of each group is modeled as a continual learning task, so that knowledge from different local models is consolidated and the model retains its ability to recognize and analyze earlier learning tasks. Experiments on the MNIST and CIFAR-10 benchmark datasets show that the test accuracy of FCL improves by approximately 0.31% to 2.17% over the FedProx, Scaffold, and FedCurv algorithms, effectively enhancing the trained model's resistance to forgetting and further improving its convergence speed and accuracy.
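The full algorithm is not given on this page, but the abstract names two ingredients: grouping clients whose data distributions are similar, and anchoring local training with a Synaptic-Intelligence-style importance-weighted penalty. The following is a minimal sketch of those two ideas only; the cosine-similarity grouping rule, the `threshold` value, and the function names are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def group_clients(label_hists, threshold=0.9):
    """Greedily group clients whose label histograms are similar.

    A client joins the first existing group whose representative
    histogram has cosine similarity >= threshold; otherwise it
    starts a new group. (Hypothetical grouping rule.)
    """
    groups = []
    for i, h in enumerate(label_hists):
        placed = False
        for g in groups:
            ref = label_hists[g[0]]
            sim = h @ ref / (np.linalg.norm(h) * np.linalg.norm(ref))
            if sim >= threshold:
                g.append(i)
                placed = True
                break
        if not placed:
            groups.append([i])
    return groups

def si_penalty(w, w_prev, omega, lam=0.1):
    """Synaptic-Intelligence-style quadratic penalty.

    Anchors the current weights w to the weights w_prev learned on
    the previous group (treated as the previous continual-learning
    task), scaled by per-parameter importance omega and strength lam.
    This term would be added to the local client loss.
    """
    return lam * np.sum(omega * (w - w_prev) ** 2)
```

In this reading, each group's round of local training is one "task": the penalty discourages parameters that were important for earlier groups from drifting, which is how a Synaptic-Intelligence term mitigates catastrophic forgetting.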

Downloads

Download data is not yet available.

References

[1] Wang M, Deng W. Deep face recognition: A survey [J]. Neurocomputing, 2022.

[2] Lim W Y B, Luong N C, Hoang D T, et al. Federated Learning in Mobile Edge Networks: A Comprehensive Survey [J]. IEEE Communications Surveys & Tutorials, 2020.

[3] Ramaswamy S, Mathews R, Rao K, et al. Federated Learning for Emoji Prediction in a Mobile Keyboard [J]. arXiv, 2023.

[4] Wang H, Kaplan Z, Niu D, et al. Optimizing Federated Learning on Non-IID Data with Reinforcement Learning [C] // IEEE INFOCOM 2021 - IEEE Conference on Computer Communications. IEEE, 2021.

[5] Zhao Y, Li M, Lai L, et al. Federated Learning with Non-IID Data [J]. 2018.

[6] Fraboni Y, Vidal R, Kameni L, et al. Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning [J]. 2023.

[7] Li T, Sahu A K, Zaheer M, et al. Federated Optimization in Heterogeneous Networks [J]. Proceedings of Machine Learning and Systems, 2021, 2: 429-450.

[8] Shoham N, Avidor T, Keren A, et al. Overcoming Forgetting in Federated Learning on Non-IID Data [J]. 2021.

[9] Karimireddy S P, Kale S, Mohri M, et al. Scaffold: Stochastic controlled averaging for on-device federated learning [J]. 2020.

[10] Jiang Y, Konečný J, Rush K, et al. Improving Federated Learning Personalization via Model Agnostic Meta Learning [J]. arXiv, 2021.

[11] Hu W, Qin Q, Wang M, et al. Continual learning by using information of each class holistically [C] // Proceedings of the AAAI Conference on Artificial Intelligence. 2023.

[12] Corinzia L, Beuret A, Buhmann J M. Variational federated multi-task learning [J]. arXiv preprint arXiv:1906.06268, 2020.

[13] Shoham N, Avidor T, Keren A, et al. Overcoming forgetting in federated learning on non-IID data [EB/OL]. [2021-10-17].

[14] Lee G, Shin Y, Jeong M, et al. Preservation of the global knowledge by not-true self knowledge distillation in federated learning [EB/OL]. [2023-11-29]. https://arxiv.org/abs/2106.03097v2.

[15] TorchVision. Pre-trained models from torchvision [EB/OL]. 2023.

[16] Kopparapu K, Lin E. FedFMC: Sequential Efficient Federated Learning on Non-IID Data [EB/OL]. [2023-6-19].

[17] Hsu T M H, Qi H, Brown M. Measuring the effects of non-identical data distribution for federated visual classification [J]. arXiv preprint arXiv:1909.06335, 2021.

Published

29-05-2025

Issue

Section

Articles

How to Cite

Zhang, W., Liu, Y., & Song, X. (2025). Research on Edge oriented Federated Learning Training Optimization. Journal of Computing and Electronic Information Management, 17(1), 16-20. https://doi.org/10.54097/976ks692