Title :
Model Selection and Assessment Using Cross-indexing
Abstract :
Cross-indexing is a method for selecting the optimal model complexity and for estimating the corresponding performance. It aims to reduce the optimistic selection bias that can emerge when many models are compared against each other using cross-validation as the performance estimator. In this paper, a generalization is introduced that covers the previously presented variations (cross-indexing A and B) as special cases. Originally, cross-indexing was proposed for decreasing the selection bias in a feature selection setting. To apply it to a generic model selection problem with a large number of candidate model structures and a potentially infinite number of hyperparameter values, the method needs to be modified. This paper also describes one way of making such modifications, and reports promising results obtained with the resulting method in three open competitions.
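The abstract does not spell out the algorithm, so the following is only a minimal sketch of the underlying idea it describes: decoupling the cross-validation folds used to select the model complexity from the fold used to score it, in contrast to the naive estimate that reuses the same folds for both and is therefore optimistically biased. The k-NN classifier, the candidate neighbour counts, and the synthetic data are illustrative assumptions, not the paper's actual cross-indexing A/B procedures.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold
from sklearn.neighbors import KNeighborsClassifier

# Illustrative setup (assumed): candidate "complexities" are k values of k-NN.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)
candidates = [1, 3, 5, 9, 15, 25]
kf = KFold(n_splits=10, shuffle=True, random_state=0)

# Score matrix: rows = folds, columns = candidate complexities.
scores = np.zeros((kf.get_n_splits(), len(candidates)))
for i, (train_idx, test_idx) in enumerate(kf.split(X)):
    for j, k in enumerate(candidates):
        model = KNeighborsClassifier(n_neighbors=k).fit(X[train_idx], y[train_idx])
        scores[i, j] = model.score(X[test_idx], y[test_idx])

# Naive estimate: best mean CV score; optimistically biased because the
# same folds are used both to select the winner and to report its score.
naive_estimate = scores.mean(axis=0).max()

# Cross-indexing-style estimate (sketch): for each fold, select the
# complexity using the *other* folds only, then read off that complexity's
# score on the held-out fold, so selection and assessment never share folds.
held_out = []
for i in range(scores.shape[0]):
    others = np.delete(scores, i, axis=0)
    best_j = others.mean(axis=0).argmax()
    held_out.append(scores[i, best_j])
cross_indexing_estimate = np.mean(held_out)

print(f"naive CV estimate:       {naive_estimate:.3f}")
print(f"cross-indexing estimate: {cross_indexing_estimate:.3f}")
```

Reusing the single score matrix for both estimates keeps the comparison cheap; the cross-indexing-style estimate is typically slightly lower than the naive one, reflecting the removed selection bias.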
Keywords :
game theory; modelling; cross-indexing method; feature selection setting; hyperparameter values; model selection; model structures; open competitions; optimal model complexity; optimistic selection bias; selection bias; Bridges; Context modeling; Design methodology; Design optimization; Neural networks; Optimization methods; Support vector machine classification; Support vector machines; Training data;
Conference_Title :
2007 International Joint Conference on Neural Networks (IJCNN 2007)
Conference_Location :
Orlando, FL
Print_ISBN :
978-1-4244-1379-9
ISSN :
1098-7576
DOI :
10.1109/IJCNN.2007.4371365