DocumentCode :
2356994
Title :
DiPerF: an automated distributed performance testing framework
Author :
Dumitrescu, Catalin ; Raicu, Ioan ; Ripeanu, Matei ; Foster, Ian
Author_Institution :
Dept. of Comput. Sci., Chicago Univ., IL, USA
fYear :
2004
fDate :
8 Nov. 2004
Firstpage :
289
Lastpage :
296
Abstract :
We present DiPerF, a distributed performance-testing framework aimed at simplifying and automating service performance evaluation. DiPerF coordinates a pool of machines that test a target service, collects and aggregates performance metrics, and generates performance statistics. The aggregate data collected provide information on service throughput, on service fairness when serving multiple clients concurrently, and on the impact of network latency on service performance. Furthermore, using these data, it is possible to build predictive models that estimate service performance for a given service load. We have tested DiPerF on 100+ machines across two testbeds, Grid3 and PlanetLab, and explored the performance of the job submission services (pre-WS GRAM and WS GRAM) included with Globus Toolkit® 3.2.
Keywords :
benchmark testing; distributed processing; performance evaluation; DiPerF; Globus Toolkit® 3.2; Grid3 testbed; PlanetLab testbed; WS GRAM; automated distributed performance testing; automating service performance evaluation; job submission service; performance statistics; pre-WS GRAM; predictive model; Aggregates; Automatic testing; Delay; Measurement; Performance evaluation; Quality of service; Resource management; Scalability; Statistical distributions; Throughput;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Fifth IEEE/ACM International Workshop on Grid Computing, 2004. Proceedings.
ISSN :
1550-5510
Print_ISBN :
0-7695-2256-4
Type :
conf
DOI :
10.1109/GRID.2004.21
Filename :
1382843