Title :
Dynamic learning using exponential energy functions
Author :
Ahmad, Maqbool ; Salam, Fathi M A
Author_Institution :
Dept. of Electr. Eng., Michigan State Univ., East Lansing, MI, USA
Abstract :
The authors use a continuous-time gradient-descent weight-update law for supervised learning of feedforward artificial neural networks because of specific advantages over its discrete-time counterpart. An exponential energy function is used in the update law. It is shown that this energy function speeds up the learning dynamics and ensures faster convergence to a useful minimum: the dynamics "skip" minima at higher energy levels and converge to one at a lower energy level. A software implementation of the learning dynamics based on the exponential energy function is described. Supporting simulations on the XOR and character-recognition problems are also included.
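The abstract does not give the exact form of the exponential energy function, but the idea can be sketched under one common assumption: replace the usual error energy E(w) with E_exp(w) = exp(mu * E(w)), so the gradient acquires a factor mu * exp(mu * E) that enlarges descent steps while the energy is still high. The quadratic energy, the parameter `mu`, and the Euler step size below are illustrative choices, not the paper's.

```python
import math

def descend(grad_scale, w0=2.0, dt=0.01, steps=100):
    """Forward-Euler integration of the continuous-time update law
    dw/dt = -grad_scale(w) * dE/dw for the toy energy E(w) = w^2 / 2."""
    w = w0
    for _ in range(steps):
        w -= dt * grad_scale(w) * w  # dE/dw = w for the quadratic energy
    return w

mu = 1.0
# Ordinary energy: the gradient scale factor is constant (1).
plain = descend(lambda w: 1.0)
# Exponential energy E_exp = exp(mu*E): gradient is mu*exp(mu*E)*dE/dw.
exp_scaled = descend(lambda w: mu * math.exp(mu * 0.5 * w * w))

# While E is large, exp(mu*E) > 1 enlarges every step, so the exponentially
# scaled dynamics approach the minimum at w = 0 faster.
print(abs(exp_scaled) < abs(plain))
```

With the same step size and step count, the exponentially scaled trajectory ends closer to the minimum, which is the speed-up effect the abstract claims; the "skipping" of higher-energy minima would require a multi-well energy landscape not reproduced in this toy.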
Keywords :
feedforward neural nets; learning (artificial intelligence); XOR; character recognition; continuous-time; dynamic learning; exponential energy function; feedforward artificial neural networks; gradient descent weight update; learning dynamics; supervised learning; Artificial neural networks; Circuits; Convergence; Difference equations; Electronic mail; Energy states; Laboratories; Polynomials; Supervised learning; Very large scale integration;
Conference_Titel :
1992 International Joint Conference on Neural Networks (IJCNN)
Print_ISBN :
0-7803-0559-0
DOI :
10.1109/IJCNN.1992.226974