Title :
Stochastic approximation techniques and associated tools for neural network optimization
Author :
Dedieu, H. ; Flanagan, A. ; Eriksson, J. ; Robert, A.
Author_Institution :
Dept. of Electr. Eng., Ecole Polytech. Federale de Lausanne, Switzerland
Abstract :
This paper is devoted to the optimization of feedforward and feedback artificial neural networks (ANN) operating in supervised learning mode. We describe in a general way how first- and second-order stochastic approximation methods that provide learning capabilities can be derived. We show that certain variables, the sensitivities of the ANN outputs, play a key role in the ANN optimization process. We then describe how some useful and elementary tools known from circuit theory can be used to compute these sensitivities at a low computational cost. Finally, we show with an example how to apply these two complementary sets of tools, i.e. stochastic approximation and sensitivity theory.
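A minimal illustrative sketch (not the authors' implementation) of the kind of first-order stochastic approximation update the abstract describes: each parameter is corrected by the instantaneous error weighted by the sensitivity of the network output with respect to that parameter. The single linear neuron, step size `mu`, and target mapping below are assumptions chosen only for illustration.

```python
# Sketch of a first-order stochastic approximation rule:
#   theta <- theta - mu * e * (d y / d theta)
# applied to one linear neuron y = w*x + b adapting toward d = 2x - 1.
# mu, the step count, and the target line are illustrative assumptions.
import random

random.seed(0)
w, b = 0.0, 0.0      # parameters to adapt
mu = 0.05            # step size of the stochastic approximation
for _ in range(2000):
    x = random.uniform(-1.0, 1.0)   # random input sample
    d = 2.0 * x - 1.0               # desired output (unknown to the learner)
    y = w * x + b                   # network output
    e = y - d                       # instantaneous output error
    # Sensitivities of the output w.r.t. the parameters: dy/dw = x, dy/db = 1
    w -= mu * e * x
    b -= mu * e * 1.0

print(w, b)  # both parameters approach the target values 2 and -1
```

The same structure carries over to multilayer or feedback networks; only the computation of the sensitivities becomes nontrivial, which is where the circuit-theoretic tools of the paper come in.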
Keywords :
approximation theory; backpropagation; feedforward neural nets; optimization; parameter estimation; sensitivity analysis; adaptive systems; feedback neural networks; multilayer perceptrons; sensitivity theory; sequential parameter estimation; stochastic approximation; supervised learning; Approximation methods; Artificial neural networks; Circuits; Least squares approximation; Neural networks; Neurofeedback; Probability distribution; Stochastic processes;
Conference_Titel :
International Symposium on Neuro-Fuzzy Systems (AT'96), 1996
Conference_Location :
Lausanne
Print_ISBN :
0-7803-3367-5
DOI :
10.1109/ISNFS.1996.603816