Title :
Roundoff error analysis of the PCA networks
Author :
Szabó, Tamás; Horváth, Gábor
Author_Institution :
Dept. of Meas. & Instrum. Eng., Tech. Univ. Budapest, Hungary
Abstract :
This paper deals with some of the effects of finite precision data representation and arithmetic in principal component analysis (PCA) neural networks. PCA networks are single-layer linear neural networks that use some version of Oja's learning rule. The paper concentrates on the effects of premature convergence or early termination of the learning process and derives an approximate analytical expression for the lower limit of the learning rate parameter. If the learning rate is selected below this limit, which depends on the statistical properties of the input data and on the quantum size used in the finite precision arithmetic, convergence slows down significantly or the learning process stops before the proper weight vector is reached.
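The premature-termination effect summarized in the abstract can be illustrated with a minimal simulation. The sketch below is not taken from the paper; the function names, the example covariance matrix, and the quantum size 2^-10 are illustrative assumptions. It runs a single-neuron Oja rule whose weights are stored on a fixed quantization grid: when the learning rate is chosen too small relative to the quantum size, the rounded weight update vanishes and the weight vector never aligns with the first principal component.

```python
import numpy as np

def quantize(v, delta):
    """Round each component to the nearest multiple of the quantum size delta."""
    return np.round(v / delta) * delta

def oja_quantized(X, eta, delta, n_epochs=50, seed=0):
    """Single-neuron PCA network trained with Oja's rule, with the weights
    stored in finite precision (quantized to a grid of step delta).

    X     : (n_samples, n_features) zero-mean input data
    eta   : learning rate
    delta : quantum size of the weight representation, e.g. 2**-10
    """
    rng = np.random.default_rng(seed)
    w = quantize(rng.standard_normal(X.shape[1]) * 0.1, delta)
    for _ in range(n_epochs):
        for x in X:
            y = w @ x                      # neuron output
            dw = eta * y * (x - y * w)     # ideal (infinite precision) Oja update
            w = quantize(w + dw, delta)    # quantized update: if |dw| < delta/2
                                           # it rounds to zero and learning stalls
    return w

# Toy experiment: a sufficiently large vs. a too small learning rate.
rng = np.random.default_rng(1)
C = np.array([[3.0, 1.0], [1.0, 1.0]])            # assumed input covariance
X = rng.multivariate_normal([0.0, 0.0], C, size=2000)
true_pc = np.linalg.eigh(C)[1][:, -1]             # dominant eigenvector of C

for eta in (1e-2, 1e-6):
    w = oja_quantized(X, eta=eta, delta=2**-10)
    align = abs(w @ true_pc) / (np.linalg.norm(w) + 1e-12)
    print(f"eta={eta:g}: alignment with first principal component = {align:.3f}")
```

With eta = 1e-2 the weight vector converges toward the dominant eigenvector, while with eta = 1e-6 almost every update rounds to zero on the 2^-10 grid, so the weights stay near their initial values, consistent with the lower limit on the learning rate discussed in the paper.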
Keywords :
convergence of numerical methods; digital arithmetic; digital simulation; eigenvalues and eigenfunctions; learning (artificial intelligence); neural nets; Oja's learning rule; PCA networks; approximate analytical expression; arithmetic; convergence; finite precision arithmetic; finite precision data representation; learning process; learning rate parameter; neural networks; premature convergence; principal component analysis; quantum size; roundoff error analysis; single-layer linear neural networks; statistical properties; weight vector; Arithmetic; Autocorrelation; Computer networks; Convergence; Error analysis; Karhunen-Loeve transforms; Multidimensional systems; Neural networks; Principal component analysis; Roundoff errors
Conference_Title :
Instrumentation and Measurement Technology Conference, 1997. IMTC/97. Proceedings. Sensing, Processing, Networking. IEEE
Conference_Location :
Ottawa, Ont.
Print_ISBN :
0-7803-3747-6
DOI :
10.1109/IMTC.1997.603954