Title :
A Framework to Compare Alert Ranking Algorithms
Author :
Allier, Simon ; Anquetil, Nicolas ; Hora, Andre ; Ducasse, Stephane
Author_Institution :
RMoD Team, INRIA, Lille, France
Abstract :
To improve software quality, rule checkers statically verify whether a software system contains violations of good programming practices. On a real-sized system, the alerts (rule violations detected by the tool) may number in the thousands. Unfortunately, these tools generate a high proportion of "false alerts", which, in the context of a specific software system, should not be fixed. Huge numbers of false alerts can make finding and correcting "true alerts" impractical and dissuade developers from using these tools. To overcome this problem, the literature provides various ranking methods that aim to compute the probability that an alert is a "true" one. In this paper, we propose a framework for comparing these ranking algorithms and identifying the best approach to rank alerts. We selected six algorithms described in the literature. For the comparison, we use a benchmark covering two programming languages (Java and Smalltalk) and three rule checkers (FindBugs, PMD, SmallLint). Results show that the best ranking methods are based on the history of past alerts and their locations. We could not identify any significant advantage in using statistical tools such as linear regression or Bayesian networks over ad hoc methods.
Keywords :
Java; Smalltalk; program debugging; software quality; software tools; FindBugs rule checker; Java programming language; PMD rule checker; SmallLint rule checker; Smalltalk programming language; alert ranking algorithm; false alert generation; probability; rule violation detection; true alerts; benchmark testing; history; measurement; open source software; training
Conference_Titel :
2012 19th Working Conference on Reverse Engineering (WCRE)
Conference_Location :
Kingston, ON, Canada
Print_ISBN :
978-1-4673-4536-1
DOI :
10.1109/WCRE.2012.37