Title :
Supervised learning for feed-forward neural networks: a new minimax approach for fast convergence
Author :
Chella, A. ; Gentile, A. ; Sorbello, F. ; Tarantino, A.
Author_Institution :
Dept. of Electr. Eng., Palermo Univ., Italy
Abstract :
An approach to the learning problem for feedforward neural networks, formulated from an optimization point of view, is proposed. The developed algorithm is a minimax method based on a combination of the quasi-Newton and steepest-descent methods. The optimum point is reached by minimizing the maximum of the error functions of the network, without requiring any tuning of internal parameters. The algorithm is tested on several widespread benchmarks and shows superior convergence properties when compared with other algorithms available in the literature. Significant experimental results are included.
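To make the minimax objective described above concrete, the following is a minimal sketch, not the authors' algorithm: it trains a small feedforward network by repeatedly taking a steepest-descent step on the single training pattern with the largest error (i.e., minimizing the maximum of the per-pattern error functions). The network architecture, learning rate, stopping threshold, and the omission of the paper's quasi-Newton component are all assumptions made for illustration only; the XOR task stands in for the benchmarks mentioned in the abstract.

```python
# Illustrative sketch of minimax training: minimize max_p E_p(w) by applying
# a gradient (steepest-descent) step to the current worst-case pattern.
# The paper's actual quasi-Newton/steepest-descent scheme is not reproduced.
import numpy as np

rng = np.random.default_rng(0)

# XOR task used as a stand-in benchmark (hypothetical choice).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

# 2-2-1 sigmoid feedforward network (architecture assumed for illustration).
W1 = rng.normal(scale=1.0, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=1.0, size=(2, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(x @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    return h, y

def pattern_errors():
    # Squared error E_p(w) of every training pattern.
    _, Y = forward(X)
    return 0.5 * np.sum((Y - T) ** 2, axis=1)

lr = 0.5
for step in range(20000):
    errors = pattern_errors()
    worst = int(np.argmax(errors))      # pattern attaining the maximum error
    if errors[worst] < 1e-4:            # minimax objective small enough: stop
        break
    x, t = X[worst], T[worst]
    h, y = forward(x[None, :])
    h, y = h[0], y[0]
    # Backpropagate only the worst pattern's error: a steepest-descent step
    # on max_p E_p(w).
    delta2 = (y - t) * y * (1 - y)
    delta1 = (delta2 @ W2.T) * h * (1 - h)
    W2 -= lr * np.outer(h, delta2); b2 -= lr * delta2
    W1 -= lr * np.outer(x, delta1); b1 -= lr * delta1

print("max pattern error:", pattern_errors().max())
```

In practice a quasi-Newton update (as the abstract indicates the authors use) would replace the plain gradient step to accelerate convergence once near the optimum; the steepest-descent-on-the-worst-pattern loop above only conveys the minimax formulation itself.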
Keywords :
convergence of numerical methods; feedforward neural nets; learning (artificial intelligence); minimax techniques; convergence; error functions; fast convergence; feedforward neural networks; learning process; minimax method; quasi-Newton method; steepest-descent methods; supervised learning; benchmark testing; computer networks; electronic mail; feedforward systems; linear matrix inequalities; neural networks;
Conference_Title :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
DOI :
10.1109/ICNN.1993.298626