Title :
On minimal net size, convergence speed and generalization ability of backpropagation trained heterogeneous neural net
Author :
Kamruzzaman, Joarder ; Kumagai, Yukio ; Hikita, Hiromitsu
Author_Institution :
Muroran Inst. of Technol., Hokkaido, Japan
Abstract :
A three-layer heterogeneous network consisting of linear output-layer units and sigmoid hidden-layer units is considered. Compared with a conventional homogeneous network of all-sigmoid units, this network can be implemented much more economically and is more suitable for analog mapping. The minimal hidden-layer size needed for convergence of a three-layer heterogeneous network on a given task is derived, and the two types of network are compared in terms of convergence behavior and generalization ability. The comparison shows that the heterogeneous network has the unique feature of very fast early-stage learning and converges much faster over a suitable range of slopes of the activation function. Both networks have comparable generalization ability.
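The architecture the abstract describes can be sketched in a few lines. Below is a minimal, illustrative implementation (not the authors' exact setup): a three-layer network with sigmoid hidden units and a single linear output unit, trained by plain backpropagation on an analog mapping. The hidden size, learning rate, number of epochs, and the target function y = x² are assumptions chosen only for demonstration.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(hidden=5, lr=0.1, epochs=2000, seed=0):
    """Backprop-train a sigmoid-hidden / linear-output net on y = x^2."""
    rng = random.Random(seed)
    # Hidden-layer weights/biases and linear output-layer weights/bias.
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = rng.uniform(-1, 1)
    # Analog training data: 21 points of y = x^2 on [-1, 1].
    data = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]

    def forward(x):
        h = [sigmoid(w1[j] * x + b1[j]) for j in range(hidden)]
        y = sum(w2[j] * h[j] for j in range(hidden)) + b2  # linear output unit
        return h, y

    def mse():
        return sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

    losses = [mse()]
    for _ in range(epochs):
        for x, t in data:
            h, y = forward(x)
            err = y - t  # dE/dy for squared error; output is linear
            for j in range(hidden):
                # Backpropagate through the sigmoid: h' = h * (1 - h).
                dh = err * w2[j] * h[j] * (1.0 - h[j])
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * dh * x
                b1[j] -= lr * dh
            b2 -= lr * err
        losses.append(mse())
    return losses

losses = train()
```

Because the output unit is linear, its weight gradient is simply `err * h[j]` with no activation derivative, which is one reason such a net is cheap to realize in analog hardware and, per the abstract, learns quickly in the early stage.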
Keywords :
backpropagation; convergence; feedforward neural nets; analog mapping; backpropagation trained heterogeneous neural net; convergence behavior; convergence speed; early stage learning; generalization ability; linear output layer units; minimal hidden size; minimal net size; sigmoid hidden layer units; three-layer heterogeneous network; Acceleration; Backpropagation algorithms; Convergence; Knowledge representation; Multi-layer neural network; Neural networks; Operational amplifiers; Sonar applications; Speech recognition; Vocabulary;
Conference_Title :
Proceedings of the 35th Midwest Symposium on Circuits and Systems, 1992
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-0510-8
DOI :
10.1109/MWSCAS.1992.271184