Title :
An Efficient Global Optimization of Neural Networks by Using Hybrid Method
Author :
Cho, Yong-Hyun ; Hong, Seong-Jun
Author_Institution :
Sch. of Comput. & Inf. Comm. Eng., Catholic Univ. of Daegu, Daegu
Abstract :
This paper proposes a global optimization method for neural networks based on a hybrid approach that combines stochastic approximation with gradient descent. First, stochastic approximation estimates an approximation point biased toward the global minimum, allowing escape from a local minimum; the update rule of the Hopfield model is then applied as a gradient descent method for high-speed convergence. The proposed method was applied to the 7-city and 10-city traveling salesman problems. The experimental results show that it achieves better convergence performance (rate and speed) than the conventional method, i.e., the Hopfield model with randomly initialized neuron outputs. In particular, the proposed method is less sensitive to the initial output setting and thus yields increasingly better results than the Hopfield model as the problem size grows.
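Editor's sketch: the abstract's two-phase scheme (a stochastic step to hop out of a local minimum, followed by gradient descent for fast local convergence) can be illustrated on a toy 1-D energy landscape. The energy function, parameter names, and sampling strategy below are illustrative assumptions, not the paper's actual Hopfield energy or update rule.

```python
import random
import math

def energy(x):
    # Illustrative 1-D energy landscape with several local minima
    # (a stand-in for a Hopfield energy function over neuron outputs).
    return x * x + 10.0 * math.sin(x)

def grad(x, h=1e-5):
    # Numerical gradient of the energy (central difference).
    return (energy(x + h) - energy(x - h)) / (2.0 * h)

def hybrid_minimize(x, rounds=20, samples=50, sigma=3.0,
                    lr=0.01, steps=500, seed=0):
    """Alternate a stochastic escape step with gradient descent.

    Illustrative stand-in for the paper's hybrid scheme: a stochastic
    approximation proposes a point likely to escape the current local
    minimum, then gradient descent converges quickly inside that basin.
    All hyperparameter names and values here are assumptions.
    """
    rng = random.Random(seed)
    best = x
    for _ in range(rounds):
        # Stochastic step: sample around the current best point and
        # keep the lowest-energy candidate (may hop to another basin).
        candidates = [best + rng.gauss(0.0, sigma) for _ in range(samples)]
        point = min(candidates + [best], key=energy)
        # Gradient-descent step for high-speed local convergence.
        for _ in range(steps):
            point -= lr * grad(point)
        if energy(point) < energy(best):
            best = point
    return best

x_star = hybrid_minimize(8.0)
```

Plain gradient descent started at x = 8.0 gets trapped in the nearest local basin; the stochastic sampling phase lets the search jump to the basin of the global minimum, which gradient descent then reaches quickly. This mirrors the abstract's claim of reduced sensitivity to the initial setting.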
Keywords :
Hopfield neural nets; gradient methods; stochastic processes; travelling salesman problems; 10-city TSP; 7-city TSP; Hopfield model; global optimization; gradient descent method; hybrid method; neural network; stochastic approximation; traveling salesman problem; Analog computers; Associative memory; Biological neural networks; Computer networks; Convergence; Neural networks; Neurons; Optimization methods; Stochastic processes; Traveling salesman problems;
Conference_Title :
Frontiers in the Convergence of Bioscience and Information Technologies, 2007. FBIT 2007
Conference_Location :
Jeju City
Print_ISBN :
978-0-7695-2999-8
DOI :
10.1109/FBIT.2007.83