DocumentCode :
3251207
Title :
Reducing linear redundancy in neural networks
Author :
Sperduti, Alessandro ; Starita, Antonina
Author_Institution :
Dipartimento di Inf., Pisa Univ., Italy
Volume :
4
fYear :
1992
fDate :
7-11 Jun 1992
Firstpage :
612
Abstract :
The authors show how to reduce linear redundancy in both the input vectors and the weight vectors of one layer of neurons by using two simple one-layer feedforward networks with linear neurons. The main contributions are to show how linear dependences in the weight matrix can be exploited to reduce the number of connections, and to provide neural tools for optimizing another network. A new class of neurons is proposed to reduce the disparity among neurons introduced by this technique. Two simple neural networks are proposed to perform the network optimization, showing that linear redundancy can be eliminated from a network entirely within the neural network paradigm.
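(Illustrative sketch, not from the record: the paper performs this reduction with linear neural networks, whereas the snippet below uses an SVD-based rank factorization purely to illustrate the underlying idea that a weight matrix with linearly dependent rows can be split into two smaller linear layers, reducing the number of connections. All names and parameters here are hypothetical.)

```python
import numpy as np

def compress_linear_layer(W, tol=1e-10):
    """Factor W (m x n) into A (m x r) and B (r x n), where r is the numerical rank.
    The layer y = W x then becomes y = A (B x), with r*(m+n) connections instead of m*n."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    r = int(np.sum(s > tol * s[0]))      # numerical rank of W
    A = U[:, :r] * s[:r]                 # m x r: scale retained left singular vectors
    B = Vt[:r, :]                        # r x n
    return A, B

# Example: a 6 x 8 weight matrix whose rows span only a 3-dimensional space.
rng = np.random.default_rng(0)
W = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 8))
A, B = compress_linear_layer(W)
x = rng.standard_normal(8)
print(np.allclose(W @ x, A @ (B @ x)))   # True: same layer output
print(W.size, A.size + B.size)           # 48 connections vs. 42
```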
Keywords :
feedforward neural nets; optimisation; redundancy; linear dependences; linear neurons; linear redundancy; network optimization; one-layer feedforward networks; weight matrix; Computer architecture; Computer networks; Electronic mail; Feedforward systems; Intelligent networks; Neural networks; Neurons; Vectors; Weight measurement;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.227251
Filename :
227251