DocumentCode :
2747476
Title :
How good is your blind spot sampling policy?
Author :
Menzies, Tim ; di Stefano, Justin S.
Author_Institution :
Lane Dept. of Comput. Sci. & Electr. Eng., West Virginia Univ., USA
fYear :
2004
fDate :
25-26 March 2004
Firstpage :
129
Lastpage :
138
Abstract :
Assessing software costs money, and better assessment costs exponentially more. Given finite budgets, assessment resources are typically skewed towards areas believed to be mission critical. This leaves blind spots: portions of the system whose defects may be missed. Therefore, in addition to rigorously assessing mission-critical areas, a parallel activity should sample the blind spots. This paper assesses defect detectors based on static code measures as a blind spot sampling method. In contrast to previous results, we find that such defect detectors yield results that are stable across many applications. Further, these detectors are inexpensive to use and can be tuned to the specifics of the current business situation.
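Illustration (not from the paper): the abstract describes threshold-style defect detectors built on static code measures that can be tuned to the business situation. The following minimal Python sketch shows one plausible form of such a detector; the module names, measures (LOC, cyclomatic complexity), and threshold values are assumptions chosen for illustration only.

from dataclasses import dataclass
from typing import List

@dataclass
class ModuleMetrics:
    # Illustrative static code measures for one module (not the paper's exact feature set).
    name: str
    loc: int          # lines of code
    cyclomatic: int   # McCabe cyclomatic complexity

def flag_blind_spots(modules: List[ModuleMetrics],
                     loc_threshold: int = 300,
                     cc_threshold: int = 10) -> List[str]:
    # Flag modules whose static measures exceed tunable thresholds.
    # The thresholds are the "tuning knobs": raising them flags fewer
    # modules (cheaper, riskier); lowering them flags more (costlier, safer).
    return [m.name for m in modules
            if m.loc > loc_threshold or m.cyclomatic > cc_threshold]

if __name__ == "__main__":
    sample = [
        ModuleMetrics("telemetry_parser", loc=120, cyclomatic=6),
        ModuleMetrics("attitude_control", loc=540, cyclomatic=18),
    ]
    print(flag_blind_spots(sample))   # -> ['attitude_control']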
Keywords :
formal specification; program verification; sampling methods; software metrics; automatic formal methods; black box probing; blind spot sampling; defect detectors; formal specification; public domain defect data; software assessment; Aerospace engineering; Computer science; Costs; Detectors; Mission critical systems; NASA; Project management; Proposals; Sampling methods; Systems engineering and theory;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Proceedings of the Eighth IEEE International Symposium on High Assurance Systems Engineering (HASE 2004)
ISSN :
1530-2059
Print_ISBN :
0-7695-2094-4
Type :
conf
DOI :
10.1109/HASE.2004.1281737
Filename :
1281737