Title :
Two-level learning algorithm for multilayer neural networks
Author :
Liu, Chin-Sung ; Tseng, Ching-Huan
Author_Institution :
Dept. of Mech. Eng., Nat. Chiao Tung Univ., Hsinchu, Taiwan
Abstract :
A two-level learning algorithm that decomposes multilayer neural networks into a set of sub-networks is presented. Many popular optimization methods, such as conjugate-gradient and quasi-Newton methods, can be utilized to train these sub-networks. In addition, if the activation functions are hard-limiting functions, the multilayer neural networks can be trained by the perceptron learning rule within this two-level learning algorithm. Two experimental problems are given as examples for this algorithm.
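The paper's decomposition scheme is not detailed in this record, but the perceptron learning rule it invokes for hard-limiting activations can be sketched for a single unit. The sketch below is an illustrative assumption, not the authors' implementation: it trains one hard-limiting neuron (a step-activation unit, standing in for one sub-network) on a linearly separable target.

```python
import numpy as np

def step(z):
    # Hard-limiting activation: 1 if z >= 0, else 0
    return (np.asarray(z) >= 0).astype(int)

def train_perceptron(X, y, lr=0.1, epochs=50):
    # Perceptron learning rule: w <- w + lr * (target - output) * x
    # (applied here to a single unit; the paper applies such rules
    # to sub-networks produced by its two-level decomposition)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, ti in zip(X, y):
            out = step(xi @ w + b)
            w += lr * (ti - out) * xi
            b += lr * (ti - out)
    return w, b

# Linearly separable example: logical AND
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print(step(X @ w + b))  # converges to the AND targets [0 0 0 1]
```

Because the step function is non-differentiable, gradient-based methods such as conjugate-gradient or quasi-Newton optimizers cannot be applied directly to hard-limiting units, which is why a rule like the above is needed in that case.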
Keywords :
Newton method; conjugate gradient methods; learning (artificial intelligence); multilayer perceptrons; optimisation; activation functions; conjugate gradient method; experimental problems; hard-limiting functions; multilayer neural networks; optimization; perceptron learning rule; quasi-Newton methods; sub-networks; two-level learning algorithm; Computer networks; Electronic mail; Feedforward neural networks; Laboratories; Mechanical engineering; Multi-layer neural network; Multilayer perceptrons; Neural networks; Neurons; Optimization methods
Conference_Titel :
Proceedings of the Tenth IEEE International Conference on Tools with Artificial Intelligence, 1998
Conference_Location :
Taipei, Taiwan
Print_ISBN :
0-7803-5214-9
DOI :
10.1109/TAI.1998.744795