Abstract
Federated Distillation (FD) is a novel distributed learning paradigm that shares the privacy-preserving nature of Federated Learning (FL) and offers solutions to challenges introduced by the FL framework, such as training local models with non-identical architectures. In this paper, a communication-channel-aware FD framework is presented for a multi-user massive multiple-input multiple-output (mMIMO) communication system, where zero-forcing (ZF) and minimum mean-squared-error (MMSE) schemes are utilized to null the intra-cell interference. Unlike most existing studies, in which both model parameters and model outputs (logits) are transmitted, we exclusively adopt logits as the information exchanged over the wireless links to reduce the overall communication overhead in each round. Based on the analysis, a dynamic-training-step FD algorithm (FedTSKD) is proposed to save communication resources and accelerate the training process. Further, a group-based FD algorithm (FedTSKD-G) is proposed for systems experiencing disparate channel conditions, such as deep fading. Simulation results on image classification tasks with the ImageNette/STL-10, CIFAR-10/STL-10, and MNIST/FMNIST dataset combinations demonstrate the proposed algorithms' effectiveness and efficiency. Comparison with the FL algorithm shows that the proposed FD algorithm incurs only 1% of FL's communication overhead to achieve the same testing performance.
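The abstract's key design choice, exchanging only logits rather than model weights, can be illustrated with a minimal sketch of one server-side FD aggregation step. The specific FedTSKD update rules are not given in the abstract, so everything below (client count, class count, the plain averaging of logits over a shared public batch) is an illustrative assumption, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_CLIENTS = 3   # assumed number of participating users
NUM_CLASSES = 10  # e.g. CIFAR-10 / MNIST label count
BATCH_SIZE = 32   # size of a hypothetical shared public batch

# Hypothetical per-client soft predictions (logits) on the shared batch.
# In federated distillation, only these (BATCH_SIZE x NUM_CLASSES) arrays
# traverse the wireless uplink -- not the full model weight vectors --
# which is the source of the communication savings claimed above.
client_logits = [rng.normal(size=(BATCH_SIZE, NUM_CLASSES))
                 for _ in range(NUM_CLIENTS)]

def aggregate_logits(logits_list):
    """Server-side step: element-wise average of the clients' logits."""
    return np.mean(np.stack(logits_list, axis=0), axis=0)

global_logits = aggregate_logits(client_logits)
assert global_logits.shape == (BATCH_SIZE, NUM_CLASSES)

# Each client would then distill locally from `global_logits`
# (e.g. a KL term against the softmax of the averaged logits),
# so heterogeneous local architectures never need to share weights.
```

Because the payload per round scales with the public batch and class count instead of the model size, this is consistent with the order-of-magnitude communication savings reported in the abstract.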
| Original language | English |
|---|---|
| Pages (from-to) | 1535-1550 |
| Journal | IEEE Transactions on Cognitive Communications and Networking |
| Volume | 10 |
| Issue number | 4 |
| Early online date | 18 Mar 2024 |
| DOIs | |
| Publication status | E-pub ahead of print - 18 Mar 2024 |
Keywords
- Analytical models
- Convergence
- Deep learning
- Fading channels
- Interference
- Quantization (signal)
- Servers
- Training
- federated distillation
- federated learning
- massive MIMO (mMIMO)
Fingerprint
Dive into the research topics of 'Federated Distillation in Massive MIMO Networks: Dynamic Training, Convergence Analysis and Communication Channel-Aware Learning'. Together they form a unique fingerprint.
Projects
COG-MHEAR: Towards cognitively-inspired 5G-IoT enabled, multi-modal Hearing Aids
Ratnarajah, T. (Principal Investigator) & Arslan, T. (Co-investigator)
Engineering and Physical Sciences Research Council
1/03/21 → 28/02/26
Project: Research