Title :
Measuring High-Performance Computing with Real Applications
Author :
Sayeed, Mohamed ; Bae, Hansang ; Zheng, Yili ; Armstrong, Brian ; Eigenmann, Rudolf ; Saied, Faisal
Author_Institution :
Purdue Univ., West Lafayette, IN
Abstract :
A good benchmarking methodology can save substantial resources in human effort, machine cycles, and cost. Such a methodology must consider the relevance and openness of the chosen codes and include well-defined rules for executing and reporting the benchmarks, a review process to enforce those rules, and a public repository for the results. For the methodology to be feasible, it must also be supported by adequate tools that let users execute the benchmarks consistently and gather the requisite metrics. At the very least, reliable benchmarking results can help people make decisions about HPC acquisitions and assist scientists and engineers in advancing HPC systems. By saving resources and enabling balanced designs and configurations, realistic benchmarking ultimately leads to increased competitiveness in both industry and academia.
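The abstract's call for tools that "execute the benchmarks consistently and gather the requisite metrics" can be illustrated with a minimal benchmark-runner sketch. The example below is an assumption for illustration only, not the authors' tooling: the application command, input file, and log format are hypothetical placeholders. It runs one real-application benchmark under recorded conditions, times it, and appends the result to a simple log that a public repository could ingest.

import json
import platform
import subprocess
import time

def run_benchmark(name, command, log_file="benchmark_results.json"):
    """Run one benchmark code, time it, and append the result to a log."""
    start = time.perf_counter()
    proc = subprocess.run(command, capture_output=True, text=True)
    elapsed = time.perf_counter() - start

    record = {
        "benchmark": name,
        "command": command,
        "wall_clock_seconds": elapsed,
        "return_code": proc.returncode,
        "hostname": platform.node(),
        "platform": platform.platform(),
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }

    # Append one JSON record per run so results stay comparable across systems.
    with open(log_file, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

if __name__ == "__main__":
    # Hypothetical real-application run (e.g., a weather code on 64 MPI ranks).
    result = run_benchmark(
        "weather_forecast",
        ["mpirun", "-np", "64", "./weather.exe", "input.nml"],
    )
    print(f"{result['benchmark']}: {result['wall_clock_seconds']:.1f} s")

Recording the host, platform, and exact command alongside the timing is what makes runs reproducible and reviewable, which is the point of the well-defined rules and review process the abstract describes.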
Keywords :
benchmark testing; parallel processing; resource allocation; software metrics; software performance evaluation; benchmarking methodology; high-performance computing application measurement; resource saving; Application software; Code standards; Computer applications; High performance computing; Kernel; Performance evaluation; Supercomputers; Testing; Weather forecasting; World Wide Web; high-performance computing; kernel benchmarks; performance analysis; performance evaluation; performance modeling; real application benchmarks
Journal_Title :
Computing in Science & Engineering
DOI :
10.1109/MCSE.2008.98