Train your classifier first: Cascade Neural Networks Training from upper layers to lower layers

Shucong Zhang, Cong-Thanh Do, Rama Doddipatla, Erfan Loweimi, Peter Bell, Steve Renals

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Although the lower layers of a deep neural network learn features which are transferable across datasets, these layers are not transferable within the same dataset. That is, in general, freezing the trained feature extractor (the lower layers) and retraining the classifier (the upper layers) on the same dataset leads to worse performance. In this paper, for the first time, we show that the frozen classifier is transferable within the same dataset. We develop a novel top-down training method which can be viewed as an algorithm for searching for high-quality classifiers. We tested this method on automatic speech recognition (ASR) tasks and language modelling tasks. The proposed method consistently improves recurrent neural network ASR models on Wall Street Journal, self-attention ASR models on Switchboard, and AWD-LSTM language models on WikiText-2.
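The abstract's core claim can be illustrated with a toy experiment: train a two-layer model, then freeze the trained upper layer (the "classifier"), re-initialise the lower layer (the "feature extractor"), and retrain only the lower layer on the same data. The sketch below is a minimal NumPy illustration of that freezing scheme, not the authors' implementation; the two-layer linear model, the data, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data from a fixed linear map (illustrative only).
X = rng.normal(size=(64, 8))
y = X @ rng.normal(size=(8, 1))

def mse(W1, W2):
    """Loss of the two-layer linear model pred = X @ W1 @ W2."""
    return float(np.mean((X @ W1 @ W2 - y) ** 2))

def train(W1, W2, steps, lr=0.02, freeze_upper=False):
    """Plain gradient descent; optionally keep the upper layer W2 frozen."""
    for _ in range(steps):
        h = X @ W1                          # lower-layer features
        err = 2.0 * (h @ W2 - y) / len(X)   # gradient of MSE w.r.t. predictions
        W1 = W1 - lr * X.T @ (err @ W2.T)
        if not freeze_upper:
            W2 = W2 - lr * h.T @ err
    return W1, W2

# Phase 1: train both layers jointly.
W1 = rng.normal(size=(8, 4)) * 0.1
W2 = rng.normal(size=(4, 1)) * 0.1
start1 = mse(W1, W2)
W1, W2 = train(W1, W2, steps=300)
end1 = mse(W1, W2)

# Phase 2: re-initialise the lower layer, freeze the trained upper
# layer, and retrain the lower layer only against the frozen head.
W1 = rng.normal(size=(8, 4)) * 0.1
start2 = mse(W1, W2)
W1, _ = train(W1, W2, steps=300, freeze_upper=True)
end2 = mse(W1, W2)

print(end1 < start1, end2 < start2)
```

In this toy setting the loss decreases in both phases; the paper's contribution is showing that, in real ASR and language-modelling networks, a classifier trained this way transfers within the same dataset, whereas a frozen feature extractor does not.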
Original language: English
Title of host publication: 2021 IEEE International Conference on Acoustics, Speech and Signal Processing
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Number of pages: 5
Publication status: Accepted/In press - 30 Jan 2021
Event: 46th IEEE International Conference on Acoustics, Speech and Signal Processing - Toronto, Canada
Duration: 6 Jun 2021 to 11 Jun 2021
https://2021.ieeeicassp.org/

Conference

Conference: 46th IEEE International Conference on Acoustics, Speech and Signal Processing
Abbreviated title: ICASSP 2021
Country: Canada
City: Toronto
Period: 6/06/21 to 11/06/21
Internet address: https://2021.ieeeicassp.org/

Keywords

  • top-down training
  • layer-wise training
  • general classifier
  • speech recognition
  • language model

