DocumentCode :
1473817
Title :
Finite word length computational effects of the principal component analysis networks
Author :
Szabó, Tamás ; Horváth, Gábor
Author_Institution :
Dept. of Meas. & Inf. Syst., Tech. Univ. Budapest, Hungary
Volume :
47
Issue :
5
fYear :
1998
fDate :
October 1, 1998
Firstpage :
1218
Lastpage :
1222
Abstract :
This paper deals with some of the effects of finite precision data representation and arithmetic in principal component analysis (PCA) neural networks. PCA networks are single-layer linear neural networks that use some version of Oja's learning rule. The paper concentrates on the effects of premature convergence or early termination of the learning process. It determines an approximate analytical expression for the lower limit of the learning rate parameter. If the learning rate is selected below this limit, which depends on the statistical properties of the input data and on the quantum size used in the finite precision arithmetic, convergence slows down significantly or the learning process stops before the weight vector reaches its proper value.
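A minimal sketch of the stalling mechanism the abstract describes. The quantizer (round to the nearest multiple of a quantum q), the data statistics, and both learning rates are illustrative assumptions, not the paper's actual analysis; only Oja's rule itself comes from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(v, q):
    """Round each component to the nearest multiple of the quantum q
    (an idealized model of fixed-point weight storage; assumption)."""
    return np.round(v / q) * q

def oja_step(w, x, eta, q):
    """One finite precision Oja update: w <- Q(w + eta * y * (x - y * w))."""
    y = w @ x                      # output of the single linear neuron
    return quantize(w + eta * y * (x - y * w), q)

# Synthetic data whose principal component lies close to the first axis.
X = rng.normal(size=(2000, 4)) * np.array([3.0, 1.0, 0.5, 0.2])

q = 2.0 ** -8                      # quantum size of the representation (assumed)
w0 = quantize(rng.normal(size=4), q)

results = {}
for eta in (1e-2, 1e-6):           # a workable rate vs. one far below the limit
    w, stalled = w0.copy(), 0
    for x in X:
        w_next = oja_step(w, x, eta, q)
        stalled += int(np.array_equal(w_next, w))
        w = w_next
    results[eta] = (stalled, w)
    print(f"eta={eta:g}: {stalled}/{len(X)} updates quantized away, "
          f"|w| = {np.linalg.norm(w):.3f}")
```

When eta is so small that every component of eta·y·(x − y·w) stays below q/2, the rounded new weight equals the old one, so every update is quantized away and learning terminates before the weight vector aligns with the principal component; with the larger rate the updates survive quantization and the weight norm settles near 1.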
Keywords :
learning (artificial intelligence); neural nets; principal component analysis; roundoff errors; Oja learning rule; convergence; data representation; finite precision arithmetic; finite word length computation; linear neural network; principal component analysis; quantum size; statistical properties; Autocorrelation; Computer networks; Convergence; Error analysis; Fixed-point arithmetic; Karhunen-Loeve transforms; Multidimensional systems; Neural networks; Principal component analysis; Stability;
fLanguage :
English
Journal_Title :
IEEE Transactions on Instrumentation and Measurement
Publisher :
IEEE
ISSN :
0018-9456
Type :
jour
DOI :
10.1109/19.746586
Filename :
746586