DocumentCode :
1482219
Title :
A method to determine the required number of neural-network training repetitions
Author :
Iyer, Mahesh S. ; Rhinehart, R. Russell
Author_Institution :
Dept. of Chem. Eng., Texas Tech. Univ., Lubbock, TX, USA
Volume :
10
Issue :
2
fYear :
1999
fDate :
3/1/1999 12:00:00 AM
Firstpage :
427
Lastpage :
432
Abstract :
Conventional neural-network training algorithms often get stuck in local minima. To find the global optimum, training is conventionally repeated with ten or so random starting values for the weights. Here we develop an analytical procedure to determine how many times a neural network needs to be trained, with random starting weights, to ensure that the best of those trainings falls within a desirable lower percentile of all possible trainings, with a specified level of confidence. The theoretical developments are validated by experimental results. While applied here to neural-network training, the method is generally applicable to nonlinear optimization.
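The abstract's idea can be sketched with the standard multi-start order-statistics bound (a minimal illustration, not necessarily the paper's exact procedure): if each random-start training independently lands in the best `p` fraction of outcomes with probability `p`, then the probability that the best of `n` runs falls in that fraction is `1 - (1 - p)^n`, and solving for `n` at a target confidence `c` gives the required number of repetitions.

```python
import math

def required_repetitions(percentile: float, confidence: float) -> int:
    """Smallest n such that the best of n independent random-start
    trainings lies in the lowest `percentile` fraction of all possible
    training outcomes with probability at least `confidence`:

        1 - (1 - percentile)**n >= confidence
        =>  n >= log(1 - confidence) / log(1 - percentile)
    """
    if not (0.0 < percentile < 1.0 and 0.0 < confidence < 1.0):
        raise ValueError("percentile and confidence must lie in (0, 1)")
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - percentile))

# Example: best result within the lowest 10% of trainings, 99% confidence
n = required_repetitions(0.10, 0.99)  # -> 44
```

This shows why the conventional "ten or so" restarts correspond to fairly loose percentile/confidence targets, while tighter targets require substantially more repetitions.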
Keywords :
learning (artificial intelligence); neural nets; optimisation; confidence level; global optimum; local minima; nonlinear optimization; random starting weights; training repetitions; Bioreactors; Chemical engineering; Fault diagnosis; Feedforward neural networks; Neural networks; Optimization methods; Pattern recognition; Process control; Steady-state; System identification;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.750573
Filename :
750573