  • DocumentCode
    3128683
  • Title
    Evaluating Defect Prediction Models for a Large Evolving Software System
  • Author
    Mende, Thilo; Koschke, Rainer; Leszak, Marek
  • Author_Institution
    University of Bremen, Bremen, Germany
  • fYear
    2009
  • fDate
    24-27 March 2009
  • Firstpage
    247
  • Lastpage
    250
  • Abstract
    A plethora of defect prediction models has been proposed and empirically evaluated, often using standard classification performance measures. In this paper, we explore defect prediction models for a large, multi-release software system from the telecommunications domain. A history of roughly three years is analyzed to extract process and static code metrics, which are used to build several defect prediction models with random forests. The performance of the resulting models is comparable to previously published work. Furthermore, we develop a new evaluation measure based on a comparison with an optimal model.
  • Keywords
    decision trees; learning (artificial intelligence); software metrics; software performance evaluation; defect prediction model; multi-release software system; random forests; static code metrics; telecommunications domain; Costs; History; Measurement standards; Predictive models; Size measurement; Software maintenance; Software measurement; Software standards; Software systems; Testing; defect prediction
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    2009 13th European Conference on Software Maintenance and Reengineering (CSMR '09)
  • Conference_Location
    Kaiserslautern, Germany
  • ISSN
    1534-5351
  • Print_ISBN
    978-0-7695-3589-0
  • Type
    conf
  • DOI
    10.1109/CSMR.2009.55
  • Filename
    4812760
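
As a rough illustration of the approach summarized in the abstract (random forests trained on process and static code metrics to predict defect-prone modules), a minimal sketch in Python with scikit-learn follows. The input file, metric names, label column, and the random train/test split are assumptions made for this example only; the paper itself evaluates across releases of an industrial system, which this sketch does not reproduce.

    # Illustrative sketch only, not the authors' pipeline: the input file,
    # metric names, and label column below are assumptions for this example.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Hypothetical per-module table: one row per module, a few process and
    # static code metrics, and a binary label marking post-release defects.
    data = pd.read_csv("module_metrics.csv")
    features = ["loc", "cyclomatic_complexity",    # static code metrics (assumed)
                "num_changes", "num_authors"]      # process metrics (assumed)
    X, y = data[features], data["defective"]

    # Simple stratified hold-out split; the paper evaluates across releases,
    # which a random split like this does not reproduce.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))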