Title :
Benchmarking and Evaluation Support for Self-Adaptive Distributed Systems
Author :
Vilenica, Ante ; Lamersdorf, Winfried
Author_Institution :
Dept. of Inf., Univ. of Hamburg, Hamburg, Germany
Abstract :
Increasingly, distributed systems have to deal with highly dynamic and hardly predictable environments. This trend, in conjunction with rising demands for sophisticated non-functional system requirements, challenges both the development and the operation (i.e. management) of traditional distributed systems. One promising approach to coping with these challenges is self-adaptive distributed systems, which are characterized by the capability to configure and maintain themselves. However, the inherent dynamics of self-adaptive systems require intensive evaluation and benchmarking efforts in order to ensure the intended system behaviour. To support this, this paper presents a framework that enables both the nominal-actual comparison of self-adaptive distributed systems and the comparison of different self-adaptive solutions with respect to a specific software implementation task. The underlying approach consists of (i) a declarative definition language and (ii) a software component that is capable of conducting evaluations and benchmarks on different software implementations.
Keywords :
benchmark testing; distributed processing; benchmarking efforts; declarative definition language; evaluation support; nonfunctional system requirement; self-adaptive distributed system; self-adaptive solution; self-adaptive system; software implementation task; system behaviour; Adaptive systems; Benchmark testing; Computer architecture; Measurement; Runtime; Software systems; Autonomous Components; Benchmarking; Evaluation; Self-Adaptive Systems;
Conference_Titel :
Complex, Intelligent and Software Intensive Systems (CISIS), 2012 Sixth International Conference on
Conference_Location :
Palermo
Print_ISBN :
978-1-4673-1233-2
DOI :
10.1109/CISIS.2012.115