Title :
On the optimal number of hidden nodes in a neural network
Author :
Wanas, Nayer ; Auda, Gasser ; Kamel, Mohamed S. ; Karray, Fakhreddine
Author_Institution :
Dept. of Syst. Design Eng., Waterloo Univ., Ont., Canada
Abstract :
In this study we show empirically that the best performance of a neural network occurs when the number of hidden nodes is equal to log(T), where T is the number of training samples. This value yields both the optimal performance of the neural network and the lowest associated computational cost. We also show that the entropy measured in the hidden layer not only gives good foresight into the performance of the neural network, but can also serve as a criterion for optimizing it. This can be achieved by minimizing the network entropy (i.e. maximizing the entropy in the hidden layer) as a means of modifying the weights of the neural network.
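The following Python sketch illustrates the two ideas summarized in the abstract: sizing the hidden layer by log(T) and estimating the entropy of hidden-layer activations. It is an illustrative assumption, not the authors' implementation; the logarithm base, the histogram binning of activations, and the function names are choices made here for clarity.

```python
# Illustrative sketch (not from the paper): the log(T) sizing rule and a
# histogram-based Shannon-entropy estimate for hidden-layer activations.
# The paper's exact entropy definition and log base are assumptions here.
import numpy as np

def suggested_hidden_nodes(num_training_samples: int) -> int:
    """Heuristic from the abstract: number of hidden nodes ~ log(T)."""
    return max(1, round(np.log(num_training_samples)))

def hidden_layer_entropy(activations: np.ndarray, bins: int = 10) -> float:
    """Shannon entropy (in nats) of hidden activations, pooled over all
    samples and nodes and estimated with a simple histogram."""
    hist, _ = np.histogram(activations, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return float(-(p * np.log(p)).sum())

# Example: T = 1000 training samples suggests about 7 hidden nodes.
T = 1000
n_hidden = suggested_hidden_nodes(T)
print(n_hidden)
print(hidden_layer_entropy(np.random.rand(T, n_hidden)))
```

A training loop built around this criterion would adjust the weights so that the hidden-layer entropy increases (equivalently, the network entropy decreases), alongside the usual error minimization.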
Keywords :
entropy; learning (artificial intelligence); neural nets; optimisation; entropy measure; hidden layer; hidden nodes; information measures; network entropy; neural network architecture; neural network weights; optimal computational cost; optimal performance; pattern classification; training samples; Computational efficiency; Computer networks; Design engineering; Entropy; Feedforward systems; Intelligent networks; Neck; Neural networks; Pattern classification; Systems engineering and theory;
Conference_Titel :
1998 IEEE Canadian Conference on Electrical and Computer Engineering
Conference_Location :
Waterloo, Ont.
Print_ISBN :
0-7803-4314-X
DOI :
10.1109/CCECE.1998.685648