Abstract / Description of output
In this paper, we propose an encoder-decoder neural architecture (called Channelformer) to achieve improved channel estimation for orthogonal frequency-division multiplexing (OFDM) waveforms in downlink scenarios. A self-attention mechanism is employed to pre-process the input features before they are passed to the decoder. In particular, we implement multi-head attention in the encoder and a residual convolutional neural architecture as the decoder. We also employ customized weight-level pruning, followed by fine-tuning, to slim the trained neural network, which reduces the computational complexity significantly and yields a low-complexity, low-latency solution. This enables a reduction of up to 70% in the number of parameters while maintaining almost identical performance to the complete Channelformer. We also propose an effective online training method based on the fifth generation (5G) new radio (NR) configuration for modern communication systems, which requires only the information available at the receiver. Using industry-standard channel models, simulations show that the attention-based solutions achieve superior estimation performance compared with other candidate neural network methods for channel estimation.
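The abstract describes a multi-head self-attention encoder followed by a residual convolutional decoder. The sketch below illustrates that data flow in plain NumPy; the dimensions (24 pilot positions, 8 features, 2 heads), the random weight initialisation, and the fixed smoothing kernel are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, num_heads, rng):
    """Multi-head self-attention over pilot features.
    X: (seq_len, d_model). Weights are randomly initialised here
    purely to illustrate the data flow, not trained values."""
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        Wq = rng.standard_normal((d_model, d_head)) * 0.1
        Wk = rng.standard_normal((d_model, d_head)) * 0.1
        Wv = rng.standard_normal((d_model, d_head)) * 0.1
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        A = softmax(Q @ K.T / np.sqrt(d_head))  # attention matrix
        heads.append(A @ V)
    return np.concatenate(heads, axis=-1)  # (seq_len, d_model)

def residual_conv_decoder(X, kernel):
    """1-D convolution along the subcarrier axis, per feature
    channel, with a skip (residual) connection."""
    pad = len(kernel) // 2
    Xp = np.pad(X, ((pad, pad), (0, 0)), mode="edge")
    conv = np.stack([np.convolve(Xp[:, c], kernel, mode="valid")
                     for c in range(X.shape[1])], axis=1)
    return X + conv  # residual connection

rng = np.random.default_rng(0)
pilots = rng.standard_normal((24, 8))  # 24 pilot positions, 8 features
enc = multi_head_attention(pilots, num_heads=2, rng=rng)
est = residual_conv_decoder(enc, kernel=np.array([0.25, 0.5, 0.25]))
print(est.shape)  # (24, 8)
```

The attention stage lets every pilot position weight information from all other positions before the convolutional decoder refines the estimate locally along the subcarrier axis.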
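The weight-level pruning described in the abstract can be sketched as magnitude-based pruning: zero out the smallest-magnitude weights, then fine-tune with those positions held at zero. This is a simplified stand-in for the paper's customised scheme; the layer shape and `sparsity=0.7` (mirroring the reported ~70% parameter reduction) are illustrative assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.7):
    """Zero out the fraction `sparsity` of weights with the
    smallest magnitudes; return the pruned weights and the
    binary mask of surviving positions."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    threshold = np.partition(flat, k)[k]  # k-th smallest magnitude
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(1)
W = rng.standard_normal((32, 32))          # a hypothetical weight matrix
W_pruned, mask = magnitude_prune(W, sparsity=0.7)
print(f"fraction of weights kept: {mask.mean():.2f}")  # ≈ 0.30
# During fine-tuning, gradients would be masked (grad *= mask)
# so that pruned weights remain zero.
```

Keeping only ~30% of the weights cuts multiply-accumulate operations proportionally, which is what delivers the low-complexity, low-latency inference the abstract refers to.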
Original language | English
---|---
Pages (from-to) | 6562-6577
Number of pages | 16
Journal | IEEE Transactions on Wireless Communications
Volume | 22
Issue number | 10
Early online date | 17 Feb 2023
DOIs | 
Publication status | Published - 1 Oct 2023
Keywords / Materials (for Non-textual outputs)
- Channel estimation
- attention mechanism
- deep learning
- online learning
- orthogonal frequency division multiplexing (OFDM)
- self-attention mechanism
Fingerprint
Research topics of 'Channelformer: Attention based Neural Solution for Wireless Channel Estimation and Effective Online Training'.
Datasets
- Channelformer Neural Network Software
  Thompson, J. (Creator) & Luan, D. (Creator), Edinburgh DataShare, 31 Jan 2023
  DOI: 10.7488/ds/3801, https://hdl.handle.net/20.500.11820/244a98cb-c237-497c-bbf2-2d8f3ad0068b