Title :
On a training scheme based on orthogonalization and thresholding for a nonparametric regression problem
Author :
Hagiwara, Katsuyuki
Author_Institution :
Fac. of Educ., Mie Univ., Tsu, Japan
Abstract :
For a nonparametric regression problem, we have proposed a training scheme based on orthogonalization and thresholding, in which a machine is assumed to be a weighted sum of fixed basis functions. In this scheme, the vectors of basis function outputs are orthogonalized, and the coefficients of the orthogonalized vectors are estimated instead of the weights. A coefficient is set to zero if it is below a predetermined threshold level, where threshold levels are assigned componentwise to the individual coefficients. A resulting weight vector is then obtained by transforming the thresholded coefficients back. In this article, we present theoretical details supporting the threshold levels applied in the training scheme. For a simple setting, we also give an upper bound on the generalization error of the training scheme. As an implication of the bound, we find that the increase in generalization error is of O(log n/n) when the target function has a sparse representation in the orthogonal domain. In implementing the training scheme, either an eigendecomposition or a Gram-Schmidt procedure is employed for orthogonalization; the corresponding training methods are referred to as HTED and HTGS. Modified versions of HTED and HTGS, referred to as HTED2 and HTGS2 respectively, have also been proposed to reduce estimation bias. We examine the performance of these training methods on real benchmark datasets, where HTED2 and HTGS2 exhibit relatively good generalization performance. In addition to its generalization performance, HTGS2 is found to obtain a sparse representation of the target function in terms of the basis functions.
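The orthogonalize-then-threshold idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a QR decomposition (a numerically stable equivalent of the Gram-Schmidt procedure mentioned for HTGS) to orthogonalize the columns of the design matrix, hard-thresholds the coefficients in the orthogonal domain, and back-transforms to obtain the weight vector. The function name `ht_fit` and the uniform threshold argument are illustrative assumptions; the paper assigns threshold levels componentwise.

```python
import numpy as np

def ht_fit(Phi, y, thresholds):
    """Hard-thresholding regression sketch in an orthogonalized basis.

    Phi        : (n, m) design matrix of basis function outputs.
    y          : (n,) target vector.
    thresholds : scalar or (m,) componentwise threshold levels.
    Returns the (m,) weight vector for the original basis functions.
    """
    # Orthogonalize the basis output vectors (QR ~ Gram-Schmidt).
    Q, R = np.linalg.qr(Phi)
    # Coefficients of the orthogonalized vectors, estimated instead of weights.
    c = Q.T @ y
    # Set coefficients below their threshold levels to zero.
    c_thr = np.where(np.abs(c) >= thresholds, c, 0.0)
    # Transform the thresholded coefficients back to a weight vector.
    return np.linalg.solve(R, c_thr)
```

With noiseless data and a negligible threshold, the scheme reduces to ordinary least squares and recovers the true weights exactly; larger thresholds zero out small orthogonal-domain coefficients and yield sparser fits.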
Keywords :
computational complexity; eigenvalues and eigenfunctions; learning (artificial intelligence); nonparametric statistics; regression analysis; Gram-Schmidt procedure; HTED; HTGS; eigendecomposition; fixed basis functions; generalization error; nonparametric regression problem; orthogonalization; thresholding; training scheme; Elevators;
Conference_Title :
The 2010 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Barcelona
Print_ISBN :
978-1-4244-6916-1
DOI :
10.1109/IJCNN.2010.5596649