Title :
An easily calculated bound on condition for orthogonal algorithms
Author :
Adeney, K.M. ; Korenberg, M.J.
Author_Institution :
Queen's Univ., Kingston, Ont., Canada
Abstract :
Orthogonal search techniques are often used in training generalized single-layer networks (GSLNs) such as the radial basis function (RBF) network. Care must be taken with these techniques to avoid ill-conditioning of the required data matrix. The usual approach is to impose an arbitrary lower limit, say dmin, on the norms of the orthogonal expansion terms, or equivalently on the diagonal values of the Cholesky decomposition matrices, which are calculated by the algorithms in question. In this paper, a bound on the condition number of the data matrix in terms of these quantities is given and is used to derive a model-dependent guideline for dmin.
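The following is a minimal Python/NumPy sketch of the kind of orthogonal forward selection the abstract refers to, assuming a classical Gram-Schmidt orthogonal search with a dmin threshold on the norms of the orthogonalized terms. The function name orthogonal_search, the default threshold, and the toy regressors are illustrative assumptions; the paper's actual condition-number bound is not reproduced here.

```python
import numpy as np

def orthogonal_search(P, d, dmin=1e-4):
    """Greedy forward selection of columns of P to fit d.

    Each candidate column is orthogonalized (classical Gram-Schmidt)
    against the already-selected terms.  A candidate whose orthogonalized
    norm falls below dmin is rejected, which guards against
    ill-conditioning of the selected data matrix.
    """
    n, m = P.shape
    selected, W = [], []                 # chosen column indices, orthogonal terms
    residual = d.astype(float).copy()
    for _ in range(m):
        best, best_err, best_w = None, 0.0, None
        for j in range(m):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for wk in W:                 # orthogonalize against chosen terms
                w -= (wk @ P[:, j]) / (wk @ wk) * wk
            nrm2 = w @ w
            if nrm2 < dmin ** 2:         # orthogonal-term norm below dmin: reject
                continue
            g = (w @ residual) / nrm2
            err = g * g * nrm2           # residual energy removed by this term
            if err > best_err:
                best, best_err, best_w = j, err, w
        if best is None:
            break
        selected.append(best)
        W.append(best_w)
        residual -= (best_w @ residual) / (best_w @ best_w) * best_w
    return selected

# Illustrative use: two nearly collinear candidate regressors
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
P = np.column_stack([x, x + 1e-8 * rng.standard_normal(200), np.cos(x)])
d = 2.0 * x + 0.5 * np.cos(x)
cols = orthogonal_search(P, d, dmin=1e-4)
print("selected columns:", cols)
print("condition number of selected data matrix:", np.linalg.cond(P[:, cols]))
```

In this toy run the near-duplicate second column is rejected because its orthogonalized norm falls below dmin, so the condition number of the selected data matrix stays small.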
Keywords :
learning (artificial intelligence); matrix algebra; radial basis function networks; search problems; Cholesky decomposition matrices; generalized single-layer networks; ill-conditioning; model-dependent guideline; orthogonal algorithms; orthogonal search techniques; Accuracy; Arithmetic; Autocorrelation; Computational complexity; Electronic mail; Guidelines; Least squares methods; Linear regression; Matrix decomposition; Roundoff errors;
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como, Italy
Print_ISBN :
0-7695-0619-4
DOI :
10.1109/IJCNN.2000.861390