DocumentCode :
504754
Title :
Effect of number of hidden neurons on learning in large-scale layered neural networks
Author :
Shibata, Katsunari ; Ikeda, Yusuke
Author_Institution :
Oita Univ., Oita, Japan
fYear :
2009
fDate :
18-21 Aug. 2009
Firstpage :
5008
Lastpage :
5013
Abstract :
To provide a guideline on the number of hidden neurons N_h and the learning rate η for large-scale neural networks from the viewpoint of stable learning, the authors roughly formulate the boundary of stable learning and fit it to actual learning results on random-number mapping problems. Simulations confirm that the hidden-output connection weights become small as the number of hidden neurons grows, and that a trade-off in learning stability exists between the input-hidden and hidden-output connections. Finally, two equations, N_h = √(N_i · N_o) and η = 32 / √(N_i · N_o), are roughly derived, where N_i and N_o are the numbers of input and output neurons respectively, although further adjustment is necessary for other problems or conditions.
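The two rules of thumb stated in the abstract can be sketched as a small helper; the function name and the rounding of N_h to an integer neuron count are illustrative assumptions, not part of the paper:

```python
import math

def suggest_hyperparameters(n_in, n_out):
    """Rule-of-thumb sizing from the abstract's two equations.

    n_in  -- number of input neurons (N_i)
    n_out -- number of output neurons (N_o)
    Returns (N_h, eta): suggested hidden-layer size and learning rate.
    """
    root = math.sqrt(n_in * n_out)
    n_hidden = round(root)      # N_h = sqrt(N_i * N_o), rounded to a whole neuron count
    eta = 32.0 / root           # eta = 32 / sqrt(N_i * N_o)
    return n_hidden, eta

# Example: a network with 1000 inputs and 10 outputs
n_h, lr = suggest_hyperparameters(1000, 10)
print(n_h, lr)  # 100 hidden neurons, learning rate 0.32
```

As the abstract notes, these values are a starting point fitted to random-number mapping problems; other tasks may need further adjustment.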
Keywords :
learning (artificial intelligence); neural nets; hidden neurons; large-scale layered neural networks; learning; Biological neural networks; Guidelines; Humans; Image recognition; Intelligent sensors; Large-scale systems; Neural networks; Neurons; Stability; Supervised learning; error back propagation; large-scale layered neural network; learning stability; supervised learning;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
ICCAS-SICE, 2009
Conference_Location :
Fukuoka
Print_ISBN :
978-4-907764-34-0
Electronic_ISBN :
978-4-907764-33-3
Type :
conf
Filename :
5334631