DocumentCode :
2871169
Title :
Linear feedforward neural network classifiers and reduced-rank approximation
Author :
Huang, De-Shuang
Author_Institution :
Beijing Inst. of Syst. Eng., China
Volume :
2
fYear :
1998
fDate :
1998
Firstpage :
1331
Abstract :
This paper discusses the relationship between linear feedforward neural network classifiers (FNNC) and reduced-rank approximation. From the viewpoint of linear algebra, it is shown that if the rank of the trained connection weight matrix of a two-layered linear FNNC is greater than or equal to the rank of the between-class dispersion matrix of the input training samples, the two-layered linear FNNC can be merged into a one-layered linear FNNC. In addition, the condition for a null error cost function under a reduced-rank approximation is also derived.
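The merging result stated in the abstract rests on a basic linear-algebra fact: with no nonlinearity between layers, a two-layered linear network is exactly a one-layered one with the product weight matrix, and the rank of that product is bounded by the hidden width. A minimal NumPy sketch (hypothetical dimensions, not from the paper) illustrating this:

```python
import numpy as np

# Hypothetical illustration (not the paper's experiment): a two-layered
# *linear* FNNC computes y = W2 @ (W1 @ x) with no activation function,
# so it always equals the one-layered classifier y = W @ x, W = W2 @ W1.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 5, 4, 3   # assumed layer sizes

W1 = rng.standard_normal((n_hidden, n_in))   # first-layer weights
W2 = rng.standard_normal((n_out, n_hidden))  # second-layer weights
W = W2 @ W1                                  # merged one-layer weights

x = rng.standard_normal(n_in)
assert np.allclose(W2 @ (W1 @ x), W @ x)     # both nets agree on any input

# The merged matrix is rank-limited by the hidden width:
# rank(W) <= min(n_in, n_hidden, n_out), which is why comparing ranks
# against the between-class dispersion matrix is the relevant condition.
print(np.linalg.matrix_rank(W))
```

The rank bound is what makes the paper's condition meaningful: only when the trained weight matrix already has rank at least that of the between-class dispersion matrix can the hidden layer be collapsed without losing discriminative capacity.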
Keywords :
feedforward neural nets; learning (artificial intelligence); matrix algebra; pattern classification; between-class dispersion matrix; input training samples; linear algebra; linear feedforward neural network classifiers; null error cost function; one layered linear FNNC; reduced-rank approximation; trained connection weight matrix; two layered linear FNNC; Computer networks; Cost function; Ear; Feedforward neural networks; Linear algebra; Linear approximation; Merging; Neural networks; Neurons; Systems engineering and theory;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Fourth International Conference on Signal Processing Proceedings (ICSP '98), 1998
Conference_Location :
Beijing
Print_ISBN :
0-7803-4325-5
Type :
conf
DOI :
10.1109/ICOSP.1998.770865
Filename :
770865