Title :
Deriving metric thresholds from benchmark data
Author :
Alves, Tiago L. ; Ypma, Christiaan ; Visser, Joost
Author_Institution :
Software Improvement Group, Amsterdam, Netherlands
Abstract :
A wide variety of software metrics have been proposed and a broad range of tools is available to measure them. However, the effective use of software metrics is hindered by the lack of meaningful thresholds. Thresholds have been proposed for a few metrics only, mostly based on expert opinion and a small number of observations. Previously proposed methodologies for systematically deriving metric thresholds have made unjustified assumptions about the statistical properties of source code metrics. As a result, the general applicability of the derived thresholds is jeopardized. We designed a method that determines metric thresholds empirically from measurement data. The measurement data for different software systems are pooled and aggregated, after which thresholds are selected that (i) bring out the metric's variability between systems and (ii) help focus on a reasonable percentage of the source code volume. Our method respects the distributions and scales of source code metrics, and it is resilient against outliers in metric values or system size. We applied our method to a benchmark of 100 object-oriented software systems, both proprietary and open-source, to derive thresholds for metrics included in the SIG maintainability model.
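Note: The abstract describes the derivation procedure only at a high level (pool per-system measurements, aggregate them, then select thresholds that cover a chosen percentage of the code volume). The Python sketch below illustrates one plausible reading of that procedure; the weighting of entities by their lines of code and the 70%/80%/90% quantile choices are illustrative assumptions, not details taken from this record.

```python
from bisect import bisect_left

def derive_thresholds(systems, quantiles=(0.70, 0.80, 0.90)):
    """Sketch of benchmark-based threshold derivation.

    `systems` is a list of benchmark systems; each system is a list of
    (metric_value, loc) pairs, one per measured entity (e.g. a method).
    Entities are weighted by their share of their system's code volume,
    so each system contributes equally to the pooled distribution.
    """
    pooled = []  # (metric_value, weight) pairs across all systems
    for entities in systems:
        system_loc = sum(loc for _, loc in entities)
        for value, loc in entities:
            pooled.append((value, loc / system_loc))

    # Sort by metric value and accumulate weights, normalised to 1.0.
    pooled.sort(key=lambda pair: pair[0])
    total = sum(weight for _, weight in pooled)
    cumulative, cum_weights = 0.0, []
    for _, weight in pooled:
        cumulative += weight / total
        cum_weights.append(cumulative)

    # The threshold for quantile q is the smallest metric value such that
    # a fraction q of the (weighted) code volume falls at or below it.
    return [pooled[bisect_left(cum_weights, q)][0] for q in quantiles]


# Hypothetical usage: two toy systems, entities given as (complexity, LOC).
system_a = [(1, 10), (2, 30), (8, 5)]
system_b = [(1, 50), (4, 20), (15, 10)]
print(derive_thresholds([system_a, system_b]))  # e.g. [2, 4, 8]
```

The per-system normalisation in the sketch is one way to keep a single very large system from dominating the pooled distribution, which matches the abstract's claim of resilience against outliers in system size; the actual weighting scheme used in the paper may differ.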
Keywords :
object-oriented methods; public domain software; software metrics; benchmark data; metric threshold; object-oriented software system; open-source software; source code metrics; Benchmark testing; Complexity theory; Histograms; Java; Measurement; Software systems;
Conference_Title :
2010 IEEE International Conference on Software Maintenance (ICSM)
Conference_Location :
Timisoara, Romania
Print_ISBN :
978-1-4244-8630-4
ISSN :
1063-6773
DOI :
10.1109/ICSM.2010.5609747