Recurrent early exits for federated learning with heterogeneous clients

R. Lee*, J. Fernandez-Marques, S. X. Hu, D. Li, S. Laskaridis, Ł. Dudziak, Timothy Hospedales, F. Huszar, N. D. Lane

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Federated learning (FL) has enabled distributed learning of a model across multiple clients in a privacy-preserving manner. One of the main challenges of FL is accommodating clients with varying hardware capacities, as clients differ in their compute and memory constraints. To tackle this challenge, recent state-of-the-art approaches leverage the use of early exits. Nonetheless, these approaches fall short of mitigating the challenges of jointly learning multiple exit classifiers, often relying on hand-picked heuristic solutions for knowledge distillation among classifiers and/or utilizing additional layers for weaker classifiers. In this work, instead of utilizing multiple classifiers, we propose a recurrent early exit approach named ReeFL that fuses features from different sub-models into a single shared classifier. Specifically, we use a transformer-based early exit module shared among sub-models to i) better exploit multi-layer feature representations for task-specific prediction and ii) modulate the feature representation of the backbone model for subsequent predictions. We additionally present a per-client self-distillation approach where the best sub-model is automatically selected as the teacher of the other sub-models at each client. Our experiments on standard image and speech classification benchmarks across various emerging federated fine-tuning baselines demonstrate ReeFL's effectiveness over previous works.
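The core idea described above — one shared early-exit module that attends over features from all depths seen so far, predicts with a single classifier, and feeds a modulation back into the backbone — can be sketched as follows. This is a minimal, illustrative NumPy sketch, not the authors' implementation: all parameter names, shapes, the `tanh` stand-in for a backbone block, and the `0.1` modulation scale are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, NUM_CLASSES, DEPTH = 16, 10, 4

# Parameters of the single shared exit module, reused at every depth
# (hypothetical shapes; the paper uses a transformer-based module).
W_q = rng.normal(size=(DIM, DIM)) / np.sqrt(DIM)
W_cls = rng.normal(size=(DIM, NUM_CLASSES)) / np.sqrt(DIM)

def backbone_block(h):
    """Stand-in for one block of the backbone model."""
    return np.tanh(h)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def reefl_forward(x):
    """Run the backbone, emitting one prediction per exit depth from
    a single shared module (a sketch of the recurrent early-exit idea)."""
    memory, preds = [], []
    h = x
    for _ in range(DEPTH):
        h = backbone_block(h)
        memory.append(h)
        M = np.stack(memory)                  # multi-layer features seen so far
        attn = softmax(M @ (W_q @ h))         # attend over them (i)
        fused = attn @ M                      # fused representation
        preds.append(softmax(fused @ W_cls))  # one shared classifier
        h = h + 0.1 * fused                   # modulate backbone feature (ii)
    return preds

preds = reefl_forward(rng.normal(size=DIM))
```

Each client could then run only the first `k` blocks of this loop, matching its hardware budget, while every exit depth still shares the same classifier parameters.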
Original language: English
Title of host publication: Proceedings of the 41st International Conference on Machine Learning
Publication status: Accepted/In press - 15 May 2024
Event: The 41st International Conference on Machine Learning - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024


Conference: The 41st International Conference on Machine Learning
Abbreviated title: ICML 2024


