DocumentCode :
1558987
Title :
Subspace information criterion for nonquadratic regularizers - Model selection for sparse regressors
Author :
Tsuda, Koji ; Sugiyama, Masashi ; Müller, K.-R.
Author_Institution :
Fraunhofer FIRST, Berlin, Germany
Volume :
13
Issue :
1
fYear :
2002
fDate :
1/1/2002 12:00:00 AM
Firstpage :
70
Lastpage :
80
Abstract :
Nonquadratic regularizers, in particular the l1-norm regularizer, can yield sparse solutions that generalize well. In this work we propose the generalized subspace information criterion (GSIC), which allows the generalization error to be predicted for this useful family of regularizers. We show that, under some technical assumptions, GSIC is an asymptotically unbiased estimator of the generalization error. In experiments with the l1-norm regularizer, GSIC performs well compared with the network information criterion (NIC) and cross-validation when the sample size is relatively large. In the small-sample case, however, GSIC tends to miss the optimal model because of its large variance. We therefore also introduce a biased version of GSIC, which achieves reliable model selection in the relevant and challenging scenario of high-dimensional data and few samples.
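The abstract does not give the GSIC formula, so the sketch below does not implement GSIC. It is only a minimal, assumed illustration of the model-selection setting the paper addresses: an l1-regularized (sparse) linear regressor whose regularization strength is chosen by cross-validation, the baseline GSIC is compared against. All data shapes, parameter grids, and variable names are hypothetical.

```python
# Illustrative sketch only: l1-regularized regression with model selection
# by cross-validation (one of the baselines mentioned in the abstract).
# This does NOT implement GSIC; it only shows the sparse-regressor setting.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, d = 50, 100                          # few samples, many dimensions (the hard case in the abstract)
w_true = np.zeros(d)
w_true[:5] = 1.0                        # sparse ground-truth coefficients
X = rng.standard_normal((n, d))
y = X @ w_true + 0.1 * rng.standard_normal(n)

candidates = np.logspace(-3, 0, 20)     # candidate l1 regularization strengths (assumed grid)
cv = KFold(n_splits=5, shuffle=True, random_state=0)

def cv_error(alpha):
    """Mean squared validation error of Lasso(alpha) over the CV folds."""
    errs = []
    for train, test in cv.split(X):
        model = Lasso(alpha=alpha, max_iter=10000).fit(X[train], y[train])
        errs.append(np.mean((model.predict(X[test]) - y[test]) ** 2))
    return float(np.mean(errs))

best_alpha = min(candidates, key=cv_error)
print("selected l1 strength:", best_alpha)
```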
Keywords :
learning (artificial intelligence); optimal control; statistical analysis; GSIC; NIC; cross-validation; generalization error prediction; generalized subspace information criterion; l1 norm regularizer; model selection; network information criterion; nonquadratic regularizers; sparse regressors; subspace information criterion; Bayesian methods; Computational biology; Frequency estimation; Machine learning; Neural networks; Prediction methods; Text categorization; Training data;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.977272
Filename :
977272