MiDataSets: Creating the Conditions for a More Realistic Evaluation of Iterative Optimization

Grigori Fursin, John Cavazos, Michael O'Boyle, Olivier Temam

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Iterative optimization has become a popular technique to obtain improvements over the default settings in a compiler for performance-critical applications, such as embedded applications. An implicit assumption, however, is that the best configuration found for any arbitrary data set will work well with other data sets that a program uses.

In this article, we evaluate that assumption based on 20 data sets per benchmark of the MiBench suite. We find that, though a majority of programs exhibit stable performance across data sets, the variability can significantly increase with many optimizations. However, for the best optimization configurations, we find that this variability is in fact small. Furthermore, we show that it is possible to find a compromise optimization configuration across data sets which is often within 5% of the best possible configuration for most data sets, and that the iterative process can converge in less than 20 iterations (for a population of 200 optimization configurations). All these conclusions have significant and positive implications for the practical utilization of iterative optimization.
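The iterative process described above — sampling optimization configurations and looking for a compromise configuration that performs well across all data sets — can be sketched as follows. This is an illustrative reconstruction, not the authors' actual experimental code: the population size (200), data-set count (20), random-sampling strategy, and synthetic speedup values are assumptions chosen to mirror the numbers quoted in the abstract.

```python
import random

def find_compromise(speedups, iterations, rng):
    """Iteratively sample configurations and keep the one with the best
    mean speedup across all data sets (the 'compromise' configuration)."""
    best_cfg, best_mean = None, float("-inf")
    for _ in range(iterations):
        cfg = rng.randrange(len(speedups))
        mean = sum(speedups[cfg]) / len(speedups[cfg])
        if mean > best_mean:
            best_cfg, best_mean = cfg, mean
    return best_cfg, best_mean

rng = random.Random(0)
# Synthetic stand-in for measured data: 200 configurations x 20 data sets,
# each entry a speedup over the compiler's default setting.
# Real usage would measure these by compiling and running each benchmark.
speedups = [[1.0 + rng.random() * 0.3 for _ in range(20)]
            for _ in range(200)]

# Fewer than 20 iterations over a population of 200 configurations,
# matching the convergence figure quoted in the abstract.
cfg, mean_speedup = find_compromise(speedups, iterations=20, rng=rng)
```

Selecting by mean speedup across data sets is one plausible way to operationalize a "compromise" configuration; the paper's own criterion (e.g. being within 5% of the best configuration for most data sets) could be substituted in the same loop.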
Original language: English
Title of host publication: High Performance Embedded Architectures and Compilers
Subtitle of host publication: Second International Conference, HiPEAC 2007, Ghent, Belgium, January 28-30, 2007. Proceedings
Editors: Koen De Bosschere, David Kaeli, Per Stenström, David Whalley, Theo Ungerer
Publisher: Springer Berlin Heidelberg
Number of pages: 16
ISBN (Electronic): 978-3-540-69338-3
ISBN (Print): 978-3-540-69337-6
Publication status: Published - Jan 2007

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin Heidelberg
ISSN (Print): 0302-9743


