Title :
How to find global minima in finite times of search for multilayer perceptrons training
Author :
Chao, Jinhui ; Ratanasuwan, Wijak ; Tsujii, Shigeo
Author_Institution :
Dept. of Electr. & Electron. Eng., Tokyo Inst. of Technol., Japan
Abstract :
The authors present a novel global optimization method and a learning scheme. They first show a 'magic hair-brushing' method that eliminates any chosen singular point while preserving the global differential structure of the remaining singular points in the gradient field. A global search is then defined as tracing a sequence of hair-brushing flows on the n-D torus T^n, which is derived by smoothly 'pasting' together the admissible region in R^n and the gradient field at its boundary. The proposed method converges to the global minima from any initial point in a finite number of searches, and each search costs almost the same as a single gradient descent.
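The abstract does not specify the hair-brushing flows themselves, so the following is only an illustrative sketch of the general idea it describes: running a finite sequence of gradient-descent searches on a torus, with coordinates wrapped modulo the period. The test function, step size, and grid of starting points are all assumptions, not the authors' construction.

```python
import numpy as np

# Illustrative sketch only -- NOT the authors' algorithm. It runs a finite
# sequence of gradient descents on the torus T^2 (coordinates taken modulo
# 2*pi) and keeps the best endpoint. The periodic test function f, the step
# size, and the start grid are assumptions made for this example.

def f(p):
    x, y = p
    return np.cos(3 * x) + np.cos(2 * y)  # periodic; global minimum is -2

def grad_f(p):
    x, y = p
    return np.array([-3 * np.sin(3 * x), -2 * np.sin(2 * y)])

def descend(p0, lr=0.05, steps=500, period=2 * np.pi):
    """Plain gradient descent, wrapping coordinates back onto the torus."""
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        p = (p - lr * grad_f(p)) % period  # stay on T^2
    return p

# A finite sequence of searches from a grid of starting points; keep the best.
starts = [(x, y) for x in np.linspace(0.3, 5.9, 4)
                 for y in np.linspace(0.3, 5.9, 4)]
best = min((descend(s) for s in starts), key=f)
best_val = f(best)  # approaches the global minimum -2
```

For this particular f every local minimum happens to attain the global value, so any converged search finds it; the point of the sketch is only the mechanics of wrapping descent flows onto a torus and iterating over a finite set of searches, not a proof of global convergence.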
Keywords :
learning systems; neural nets; optimisation; search problems; global minima; global optimization method; gradient descent; gradient field; magic hair-brushing; multilayer perceptrons training; pasting; search; Artificial neural networks; Brushes; Chaos; Convergence; Cost function; Information processing; Multi-layer neural network; Multilayer perceptrons; Nonhomogeneous media; Optimization methods;
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170541