DocumentCode :
2538745
Title :
A New Technique for Searching the Global Minimum of Supervised Neural Network
Author :
Huang, Chih-Chien ; Cheng, Jay ; Chen, Yu-Ju ; Chuang, Shang-Jen ; Wang, Shuming T. ; Hwang, Rey-Chue
Author_Institution :
Electr. Eng. Dept., I-Shou Univ., Kaohsiung, Taiwan
fYear :
2010
fDate :
13-15 Dec. 2010
Firstpage :
114
Lastpage :
117
Abstract :
This paper presents a technique for searching for the global minimum in supervised neural network training. The technique is developed based on the idea of a nearly equivalent model. To demonstrate the proposed technique, two signal processing studies, signal recognition and signal modeling, were simulated. For comparison, the same simulations were also performed using a neural network trained with the standard steepest-descent error back-propagation (BP) algorithm. The simulation results show that the proposed technique can not only indicate whether the neural network is trapped in local training, but also reveal the “best” performance the neural network should achieve.
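For context, the baseline the abstract compares against is standard steepest-descent error back-propagation. Below is a minimal illustrative sketch of that baseline (not the authors' code, and not the proposed nearly-equivalent-model technique): a one-hidden-layer network fit to a toy signal-modeling task by full-batch gradient descent on the mean squared error. The network size, learning rate, and target signal are arbitrary choices for illustration.

```python
# Minimal sketch of standard steepest-descent BP training
# (illustrative only; not the paper's proposed technique).
import numpy as np

rng = np.random.default_rng(0)

# Toy signal-modeling task: fit y = sin(x) on [-pi, pi].
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
Y = np.sin(X)

# One hidden layer of tanh units, linear output.
W1 = rng.normal(0.0, 0.5, (1, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1))
b2 = np.zeros(1)

lr = 0.05
for _ in range(5000):
    # Forward pass.
    H = np.tanh(X @ W1 + b1)
    P = H @ W2 + b2
    # Backward pass: gradients of mean squared error.
    dP = 2.0 * (P - Y) / len(X)
    dW2 = H.T @ dP
    db2 = dP.sum(axis=0)
    dH = (dP @ W2.T) * (1.0 - H**2)   # tanh derivative
    dW1 = X.T @ dH
    db1 = dH.sum(axis=0)
    # Steepest-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))
```

Whether such a run stalls at a poor local minimum depends on the random initialization, which is exactly the uncertainty the paper's technique is meant to diagnose.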
Keywords :
backpropagation; neural nets; search problems; signal processing; BP algorithm; nearly equivalent model; signal modeling; signal processing study; signal recognition; standard steepest descent error back-propagation algorithm; supervised neural network training; Approximation methods; Artificial neural networks; Neurons; Optimization; Polynomials; Signal processing algorithms; Training; local minimum; nearly equivalent model; supervised neural network;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Genetic and Evolutionary Computing (ICGEC), 2010 Fourth International Conference on
Conference_Location :
Shenzhen
Print_ISBN :
978-1-4244-8891-9
Electronic_ISBN :
978-0-7695-4281-2
Type :
conf
DOI :
10.1109/ICGEC.2010.36
Filename :
5715384