
Federated Learning Survey: A Multi-Level Taxonomy of Aggregation Techniques, Experimental Insights, and Future Frontiers

Published: 20 November 2024

Abstract

The growing integration of the Internet of Things (IoT) and AI has unlocked numerous opportunities for innovation across diverse industries. However, mounting privacy concerns and data isolation issues have held back this promising advancement, and traditional centralized Machine Learning (ML) methods have shown their limitations in addressing these hurdles. In response, Federated Learning (FL) has emerged as an ML paradigm that enables collaborative training across decentralized devices: participants jointly construct AI models without sharing their raw local data, preserving privacy, supporting network scalability, and minimizing data transfer. An essential aspect of FL is effective knowledge aggregation in heterogeneous environments, yet the inherent characteristics of FL make its practical implementation considerably more complex than centralized ML. This survey examines three prominent clusters of FL research contributions: personalization, optimization, and robustness. The objective is to provide a well-structured, fine-grained classification scheme for these research areas, built on a distinctive methodology for selecting related work: unlike other surveys, we employ a hybrid approach that combines bibliometric analysis with systematic screening to identify the most influential work in the literature. On this basis, we examine challenges and contemporary techniques related to heterogeneity, efficiency, security, and privacy. Another contribution of this study is its comprehensive coverage of FL aggregation strategies, encompassing architectural features, synchronization methods, and the various motivations for federation. To further enrich the investigation, we provide practical guidance on evaluating novel FL proposals and conduct experiments that assess and compare aggregation methods under IID and non-IID data distributions. Finally, we present a set of promising research avenues that call for further exploration.
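To make the central notion of aggregation concrete, the following is a minimal, self-contained sketch of FedAvg-style aggregation, in which client model updates are averaged with weights proportional to local data size; this is the baseline picture behind the IID/non-IID comparisons mentioned above. The toy linear-regression task, the synthetic client splits, and the names client_update and aggregate are illustrative assumptions, not the survey's experimental setup.

```python
# Illustrative sketch (not the survey's code): FedAvg-style weighted aggregation.
import numpy as np

def client_update(global_w, local_data, lr=0.01, epochs=1):
    """Toy local training: gradient steps of linear regression (MSE loss)."""
    w = global_w.copy()
    X, y = local_data
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w, len(y)                      # local weights and sample count

def aggregate(client_results):
    """FedAvg: average client weights, weighted by local sample counts."""
    total = sum(n for _, n in client_results)
    return sum((n / total) * w for w, n in client_results)

# Simulate a few rounds with clients holding very different amounts of data.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
clients = []
for n in (200, 50, 10):                   # unbalanced local datasets
    X = rng.normal(size=(n, 3))
    clients.append((X, X @ true_w + 0.1 * rng.normal(size=n)))

global_w = np.zeros(3)
for _ in range(20):
    results = [client_update(global_w, data) for data in clients]
    global_w = aggregate(results)
print("estimated global weights:", global_w)
```

Under unbalanced, non-IID splits such as the one simulated here, this plain weighted average tends to be dominated by data-rich clients, which is exactly the kind of behaviour that the personalization and robust-aggregation techniques surveyed in the paper aim to mitigate.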

Published In

ACM Transactions on Intelligent Systems and Technology, Volume 15, Issue 6
December 2024
727 pages
EISSN: 2157-6912
DOI: 10.1145/3613712
  • Editor: Huan Liu

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 20 November 2024
Online AM: 17 July 2024
Accepted: 28 June 2024
Revised: 21 May 2024
Received: 15 November 2023
Published in TIST Volume 15, Issue 6

Author Tags

  1. Federated Learning
  2. Aggregation Methods
  3. Privacy-Preserving
  4. Security
  5. Heterogeneity
  6. Efficiency
  7. Optimization
  8. Personalization
  9. Multilevel Classification

Qualifiers

  • Research-article
