TY - JOUR
T1 - On sparse ensemble methods
T2 - An application to short-term predictions of the evolution of COVID-19
AU - Benítez-Peña, Sandra
AU - Carrizosa, Emilio
AU - Guerrero, Vanesa
AU - Jiménez-Gamero, M. Dolores
AU - Martín-Barragán, Belén
AU - Molero-Río, Cristina
AU - Ramírez-Cobo, Pepa
AU - Romero Morales, Dolores
AU - Sillero-Denamiel, M. Remedios
N1 - Funding Information:
We thank the reviewers for their thorough comments and suggestions, which have been very valuable in strengthening the quality of the paper. This research has been financed in part by research projects EC H2020 MSCA RISE NeEDS (Grant agreement ID: 822214); FQM-329 and P18-FR-2369 (Junta de Andalucía, Spain); MTM2017-89422-P (Ministerio de Economía, Industria y Competitividad, Spain); PID2019-110886RB-I00 (Ministerio de Ciencia, Innovación y Universidades, Spain); PR2019-029 (Universidad de Cádiz, Spain); PITUFLOW-CM-UC3M (Comunidad de Madrid and Universidad Carlos III de Madrid, Spain); and EP/R00370X/1 (EPSRC, United Kingdom). This support is gratefully acknowledged.
Publisher Copyright:
© 2021 The Authors
PY - 2021/12/1
Y1 - 2021/12/1
N2 - Since the seminal paper by Bates and Granger in 1969, a vast number of ensemble methods that combine different base regressors into a single one have been proposed in the literature. The resulting regressor may be more accurate than its components, but at the same time it may overfit, it may be distorted by base regressors with low accuracy, and it may be too complex to understand and explain. This paper proposes and studies a novel Mathematical Optimization model to build a sparse ensemble, which trades off the accuracy of the ensemble against the number of base regressors used. The latter is controlled by means of a regularization term that penalizes regressors with poor individual performance. Our approach is flexible enough to incorporate desirable properties one may require of the ensemble, such as controlling its performance on critical groups of records, or the costs associated with the base regressors involved in the ensemble. We illustrate our approach with real data sets arising in the COVID-19 context.
AB - Since the seminal paper by Bates and Granger in 1969, a vast number of ensemble methods that combine different base regressors into a single one have been proposed in the literature. The resulting regressor may be more accurate than its components, but at the same time it may overfit, it may be distorted by base regressors with low accuracy, and it may be too complex to understand and explain. This paper proposes and studies a novel Mathematical Optimization model to build a sparse ensemble, which trades off the accuracy of the ensemble against the number of base regressors used. The latter is controlled by means of a regularization term that penalizes regressors with poor individual performance. Our approach is flexible enough to incorporate desirable properties one may require of the ensemble, such as controlling its performance on critical groups of records, or the costs associated with the base regressors involved in the ensemble. We illustrate our approach with real data sets arising in the COVID-19 context.
KW - COVID-19
KW - ensemble method
KW - machine learning
KW - mathematical optimization
KW - selective sparsity
UR - http://www.scopus.com/inward/record.url?scp=85106306808&partnerID=8YFLogxK
U2 - 10.1016/j.ejor.2021.04.016
DO - 10.1016/j.ejor.2021.04.016
M3 - Article
AN - SCOPUS:85106306808
SN - 0377-2217
VL - 295
SP - 648
EP - 663
JO - European Journal of Operational Research
JF - European Journal of Operational Research
IS - 2
ER -