DocumentCode :
2574963
Title :
Random weights search in compressed neural networks using overdetermined pseudoinverse
Author :
Manic, Milos ; Wilamowski, Bogdan
Author_Institution :
Coll. of Eng., Univ. of Idaho, Moscow, ID, USA
Volume :
2
fYear :
2003
fDate :
9-11 June 2003
Firstpage :
678
Abstract :
The proposed algorithm exhibits two significant advantages: easier hardware implementation and robust convergence. The algorithm considers a single-hidden-layer neural network architecture and consists of the following major phases. The first phase is a reduction of the weight set; the second phase is a gradient calculation on the resulting compressed network. The search for weights is performed only in the input layer, while the output layer is always trained by pseudo-inversion. The algorithm is further improved with adaptive network parameters, and its final behavior exhibits robust and fast convergence. Experimental results are illustrated by figures and tables.
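A minimal sketch of the general idea described in the abstract, combining a random search over input-layer weights with overdetermined pseudoinverse (least-squares) training of the output layer, is given below. This is an illustrative reconstruction, not the authors' exact procedure: the function names, the tanh hidden activation, the uniform weight range, and the number of random trials are assumptions.

import numpy as np

def train_output_layer(H, Y):
    # Solve the overdetermined system H @ W_out = Y in the least-squares
    # sense via the pseudoinverse (pseudo-inversion training of the output layer).
    return np.linalg.pinv(H) @ Y

def random_weight_search(X, Y, n_hidden=10, n_trials=100, seed=0):
    # Random search over input-layer weights only; the output layer is
    # always retrained by pseudoinverse for each candidate.
    rng = np.random.default_rng(seed)
    n_inputs = X.shape[1]
    best_err, best = np.inf, None
    for _ in range(n_trials):
        W_in = rng.uniform(-1.0, 1.0, size=(n_inputs, n_hidden))  # candidate input weights
        H = np.tanh(X @ W_in)               # hidden-layer outputs (assumed activation)
        W_out = train_output_layer(H, Y)    # pseudoinverse training of output layer
        err = np.mean((H @ W_out - Y) ** 2)
        if err < best_err:
            best_err, best = err, (W_in, W_out)
    return best, best_err

# Example usage on a toy regression problem.
if __name__ == "__main__":
    X = np.random.rand(200, 3)
    Y = np.sin(X.sum(axis=1, keepdims=True))
    (W_in, W_out), mse = random_weight_search(X, Y)
    print(f"best MSE over random trials: {mse:.4f}")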
Keywords :
backpropagation; gradient methods; neural net architecture; search problems; adaptive network parameters; gradient calculation; layer neural network architecture; pseudo-inversion training; random weights search; Convergence; Educational institutions; Intelligent networks; Iterative algorithms; Iterative methods; Neural networks; Neurons; Robustness; Search methods; USA Councils;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2003 IEEE International Symposium on Industrial Electronics (ISIE '03)
Print_ISBN :
0-7803-7912-8
Type :
conf
DOI :
10.1109/ISIE.2003.1267901
Filename :
1267901