Title :
Development of Robust Traceability Benchmarks
Author :
Chen, Xiaofan ; Hosking, John ; Grundy, John ; Amor, Robert
Author_Institution :
Dept. of Comput. Sci., Univ. of Auckland, Auckland, New Zealand
Abstract :
Traceability benchmarks are essential for evaluating traceability recovery techniques, both for validating an individual traceability technique and for objectively comparing it with other techniques. However, obtaining or building meaningful, robust benchmarks is widely acknowledged to be a significant challenge for researchers because suitable benchmarks are difficult to obtain or create. In this paper, we describe an approach that enables researchers to establish affordable and robust benchmarks. We have designed rigorous manual identification and verification strategies to determine whether or not a link is correct, and we have developed a formula to calculate the probability of errors in benchmarks. Analysis of the error probability results shows that our approach can produce high-quality benchmarks and that our strategies significantly reduce the probability of errors in them.
Keywords :
benchmark testing; error statistics; probability; program diagnostics; program verification; software performance evaluation; error probability reduction; manual identification; traceability benchmark; traceability recovery; verification strategy; Benchmark testing; Buildings; Documentation; Error probability; Measurement; Robustness; Software; benchmark development; traceability benchmark
Conference_Title :
2013 22nd Australian Software Engineering Conference (ASWEC)
Conference_Location :
Melbourne, VIC
DOI :
10.1109/ASWEC.2013.26