Title :
Precise computer comparisons via statistical resampling methods
Author :
Bin Li ; Shaoming Chen ; Lu Peng
Author_Institution :
Louisiana State Univ., Baton Rouge, LA, USA
Abstract :
Performance variability, stemming from non-deterministic hardware and software behaviors or from deterministic behaviors such as measurement bias, is a well-known phenomenon in computer systems that increases the difficulty of comparing computer performance metrics. Conventional methods use various measures (such as the geometric mean) to quantify the performance of different benchmarks and compare computers without considering variability, which can lead to wrong conclusions. In this paper, we propose three resampling methods for performance evaluation and comparison: a randomization test for a general performance comparison between two computers, bootstrapping confidence estimation, and an empirical distribution with a five-number summary for performance evaluation. The results show that 1) the randomization test substantially improves our chance of identifying a difference between performance comparisons when the difference is not large; 2) bootstrapping confidence estimation provides an accurate confidence interval for the performance comparison measure (e.g., the ratio of geometric means); and 3) when the difference is very small, a single test is often not enough to reveal the nature of the computer performance, so we use an empirical distribution and a five-number summary to summarize it. We illustrate the results and conclusions through detailed Monte Carlo simulation studies and real examples. Results show that our methods are precise and robust even when two computers have very similar performance metrics.
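The first two methods in the abstract can be illustrated with a minimal sketch. The data below are hypothetical per-benchmark execution times (not from the paper); the comparison measure is the ratio of geometric means mentioned in the abstract. The randomization test swaps the machine labels within each benchmark pair under the null hypothesis of identical performance, and the bootstrap resamples benchmark pairs with replacement to form a percentile confidence interval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-benchmark execution times for two machines (lower is better).
# Illustrative numbers only, not taken from the paper.
times_a = np.array([10.2, 8.7, 15.1, 9.9, 12.4, 11.0])
times_b = np.array([11.0, 9.1, 14.2, 10.8, 13.0, 11.9])

def geomean_ratio(a, b):
    """Ratio of geometric means of two paired samples."""
    return np.exp(np.mean(np.log(a)) - np.mean(np.log(b)))

observed = geomean_ratio(times_a, times_b)

# Randomization (permutation) test: under the null hypothesis that both
# machines perform identically, the two times for each benchmark are
# exchangeable, so we randomly swap labels per benchmark.
n_perm = 10_000
extreme = 0
for _ in range(n_perm):
    swap = rng.random(times_a.size) < 0.5
    a = np.where(swap, times_b, times_a)
    b = np.where(swap, times_a, times_b)
    if abs(np.log(geomean_ratio(a, b))) >= abs(np.log(observed)):
        extreme += 1
p_value = (extreme + 1) / (n_perm + 1)

# Bootstrap percentile confidence interval for the ratio of geometric
# means: resample whole benchmark pairs with replacement.
boot = []
for _ in range(10_000):
    idx = rng.integers(0, times_a.size, times_a.size)
    boot.append(geomean_ratio(times_a[idx], times_b[idx]))
ci_low, ci_high = np.quantile(boot, [0.025, 0.975])

print(f"ratio of geometric means: {observed:.3f}")
print(f"permutation p-value:      {p_value:.3f}")
print(f"95% bootstrap CI:         [{ci_low:.3f}, {ci_high:.3f}]")
```

If the bootstrap interval excludes 1, the two machines' performance plausibly differs; when the interval straddles 1 and the p-value is large, the abstract's third method (empirical distribution and five-number summary) is the appropriate fallback.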
Keywords :
Monte Carlo methods; performance evaluation; sampling methods; Monte Carlo simulation; bootstrapping confidence estimation; computer performance metrics; deterministic behaviors; empirical distribution; measurement bias; nondeterministic hardware; performance evaluation; performance variability; precise computer performance comparisons; randomization test; software behaviors; statistical resampling methods; Measurement; Performance attributes; Performance of Systems; evaluation; experimental design; modeling; simulation of multiple processor systems;
Conference_Titel :
2015 IEEE International Symposium on Performance Analysis of Systems and Software (ISPASS)
Conference_Location :
Philadelphia, PA, USA
DOI :
10.1109/ISPASS.2015.7095787