Emotion Recognition in Low-Resource Settings: An Evaluation of Automatic Feature Selection Methods

Fasih Haider, Senja Pollak, Pierre Albert, Saturnino Luz

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

Research in automatic affect recognition has seldom addressed the issue of computational resource utilization. With the advent of ambient intelligence technology, which employs a variety of low-power, resource-constrained devices, this issue is increasingly gaining interest. This is especially the case in the context of health and elderly care technologies, where interventions may rely on monitoring of emotional status to provide support or alert carers as appropriate. This paper focuses on emotion recognition from speech data, in settings where it is desirable to minimize memory and computational requirements. Reducing the number of features for inductive inference is a route towards this goal. In this study, we evaluate three different state-of-the-art feature selection methods: Infinite Latent Feature Selection (ILFS), ReliefF and Fisher (generalized Fisher score), and compare them to our recently proposed feature selection method named 'Active Feature Selection' (AFS). The evaluation is performed on three emotion recognition data sets (EmoDB, SAVEE and EMOVO) using two standard acoustic paralinguistic feature sets (i.e. eGeMAPS and emobase). The results show that similar or better accuracy can be achieved using subsets of features substantially smaller than the entire feature set. A machine learning model trained on a smaller feature set will reduce the memory and computational resources of an emotion recognition system, which can result in lowering the barriers for use of health monitoring technology.
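To illustrate the filter-style selection the abstract describes, the sketch below implements the generalized Fisher score (one of the three baselines evaluated): each feature is scored by its between-class variance relative to its within-class variance, and the top-k features are kept. This is a minimal NumPy sketch for illustration, not the authors' implementation; the function names (`fisher_score`, `select_top_k`) and the choice of k are assumptions, and in the paper the scores would be computed over eGeMAPS or emobase acoustic features rather than synthetic data.

```python
import numpy as np

def fisher_score(X, y):
    """Generalized Fisher score per feature: between-class scatter
    divided by within-class scatter (higher = more discriminative)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]                      # samples of class c
        n_c = Xc.shape[0]
        between += n_c * (Xc.mean(axis=0) - overall_mean) ** 2
        within += n_c * Xc.var(axis=0)
    return between / (within + 1e-12)       # guard against zero variance

def select_top_k(X, y, k):
    """Return the indices of the k highest-scoring features."""
    scores = fisher_score(X, y)
    return np.argsort(scores)[::-1][:k]
```

A model trained only on the columns returned by `select_top_k` needs proportionally less memory and computation at inference time, which is the trade-off the paper quantifies against classification accuracy.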
Original language: English
Article number: 101119
Journal: Computer Speech and Language
Early online date: 12 Jun 2020
Publication status: Published - 1 Jan 2021


