Can MergeSFL Enhance Split Federated Learning with Feature Merging and Batch Size Control?

Original title: MergeSFL: Split Federated Learning with Feature Merging and Batch Size Regulation

Authors: Yunming Liao, Yang Xu, Hongli Xu, Lun Wang, Zhiwei Yao, Chunming Qiao

The article examines federated learning's role in extracting knowledge from data in edge computing systems. Split federated learning (SFL) was introduced to reduce the computing load on workers and to protect model privacy in resource-constrained environments, but it still suffers from statistical heterogeneity (non-IID local data) and system heterogeneity (workers with very different compute and network capabilities). MergeSFL is a novel framework that integrates two strategies into SFL: feature merging, which mixes the features uploaded by different workers into a single batch that approximates an IID sample and thereby improves model accuracy; and batch size regulation, which assigns different batch sizes to heterogeneous workers to improve training efficiency. MergeSFL jointly optimizes the two strategies, since they reinforce each other. In experiments on a physical testbed of 80 NVIDIA Jetson edge devices, MergeSFL improved final model accuracy by 5.82% to 26.22% and accelerated training by 1.74x to 4.14x compared with baseline methods, a notable step forward in optimizing federated learning for edge computing systems.
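To make the two ideas concrete, here is a minimal toy sketch (not the paper's implementation; worker names, speeds, and dimensions are invented for illustration). It shows batch size regulation, where faster workers receive proportionally larger batches so a training round finishes at roughly the same time everywhere, and feature merging, where the server concatenates the features uploaded by all workers into one mixed batch before feeding its part of the model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical worker speeds in samples/sec (illustrative values only).
speeds = {"worker_a": 4.0, "worker_b": 2.0, "worker_c": 1.0}

# Batch size regulation: split a global batch proportionally to speed,
# so heterogeneous workers finish each round in similar wall-clock time.
total_batch = 64
total_speed = sum(speeds.values())
batch_sizes = {w: max(1, round(total_batch * s / total_speed))
               for w, s in speeds.items()}

# Each worker runs its bottom model on its local (possibly non-IID) batch
# and uploads the resulting intermediate features to the server.
feature_dim = 8
worker_features = {w: rng.normal(size=(b, feature_dim))
                   for w, b in batch_sizes.items()}

# Feature merging: the server concatenates features from all workers into
# one mixed batch, which approximates an IID sample of the global data,
# then continues the forward pass through the top model on this batch.
merged = np.concatenate(list(worker_features.values()), axis=0)

print(batch_sizes)   # proportional split of the 64-sample global batch
print(merged.shape)  # one merged batch for the server-side top model
```

In the backward pass, the server would slice the merged batch's feature gradients back out by worker and return each slice, so every worker updates its bottom model on its own contribution.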

Original article: https://arxiv.org/abs/2311.13348