Title :
Selecting Best Practices for Effort Estimation
Author :
Menzies, Tim ; Chen, Zhihao ; Hihn, Jairus ; Lum, Karen
Author_Institution :
Lane Dept. of Comput. Sci., West Virginia Univ., Morgantown, WV
Abstract :
Effort estimation often requires generalizing from a small number of historical projects. Generalization from such limited experience is an inherently underconstrained problem. Hence, the learned effort models can exhibit large deviations that prevent standard statistical methods (e.g., t-tests) from distinguishing the performance of alternative effort-estimation methods. The COSEEKMO effort-modeling workbench applies a set of heuristic rejection rules to comparatively assess results from alternative models. Using these rules, and despite the presence of large deviations, COSEEKMO can rank alternative methods for generating effort models. Based on our experiments with COSEEKMO, we advise a new view on supposed "best practices" in model-based effort estimation: 1) each such practice should be viewed as a candidate technique which may or may not be useful in a particular domain, and 2) tools like COSEEKMO should be used to help analysts explore and select the best method for a particular domain.
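The comparison idea described in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the actual COSEEKMO rejection rules: it scores two candidate estimation methods by magnitude of relative error (MRE, a standard effort-estimation measure) and applies one invented rejection heuristic that breaks near-ties on mean error by preferring the smaller deviation.

```python
import statistics

def mre(actual, predicted):
    # Magnitude of relative error for one project
    return abs(actual - predicted) / actual

def summarize(actuals, predictions):
    # Mean and standard deviation of MRE over a set of projects
    errors = [mre(a, p) for a, p in zip(actuals, predictions)]
    return statistics.mean(errors), statistics.stdev(errors)

def prefer(summary_a, summary_b):
    # Hypothetical rejection rule: prefer the method with the lower
    # mean MRE; if the means are within 10% of each other, break the
    # tie by the smaller standard deviation.
    (mean_a, sd_a), (mean_b, sd_b) = summary_a, summary_b
    if abs(mean_a - mean_b) > 0.1 * max(mean_a, mean_b):
        return "A" if mean_a < mean_b else "B"
    return "A" if sd_a <= sd_b else "B"

# Toy data: actual project efforts and two methods' estimates
actuals  = [100, 250, 400,  80, 150]
method_a = [110, 230, 420,  90, 140]   # close estimates, low spread
method_b = [ 60, 400, 200, 150, 300]   # erratic estimates, high spread

sa = summarize(actuals, method_a)
sb = summarize(actuals, method_b)
print(prefer(sa, sb))  # prints "A": method A wins clearly on mean MRE
```

The point of such rules is the one the abstract makes: when deviations are large, a ranking heuristic over summary statistics can still separate methods that a t-test would call indistinguishable.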
Keywords :
data mining; project management; software cost estimation; statistical analysis; COSEEKMO toolkit; effort estimation method; heuristic rejection rule; standard statistical method; best practices; estimation error; guidelines; humans; linear regression; predictive models; software safety; testing; COCOMO; model-based effort estimation; deviation
Journal_Title :
IEEE Transactions on Software Engineering
DOI :
10.1109/TSE.2006.114