Title :
A self-optimizing stochastic dynamic node learning algorithm for layered neural networks
Abstract :
Summary form only given, as follows. The random optimization method typically uses a Gaussian probability density function (PDF) to generate a random search vector. In the present work, the random search technique was applied to the neural network training problem and was modified to dynamically seek out the optimal probability density function (OPDF) from which to select the search vector. The dynamic OPDF search process, an autoadaptive stratified sampling technique, and a dynamic node architecture (DNA) learning scheme complete the modifications of the basic method. The DNA technique determines the appropriate number of hidden nodes needed for a given training problem, so network architectures do not have to be fixed before training is initiated. The approach was applied to networks of generalized, fully interconnected continuous perceptrons, and computer simulation results were obtained.
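The baseline method the abstract modifies, random optimization with a Gaussian search PDF, can be sketched as follows. This is a minimal illustration on a toy perceptron task, not the paper's algorithm: the dynamic OPDF adaptation, stratified sampling, and DNA scheme are all omitted, and the task, step count, and step size `sigma` are assumptions for demonstration.

```python
import math
import random

random.seed(0)

# Toy task: fit a single continuous (logistic) perceptron to the OR gate.
DATA = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
        ((1.0, 0.0), 1.0), ((1.0, 1.0), 1.0)]

def forward(w, x):
    # Logistic perceptron: sigmoid(bias + w1*x1 + w2*x2).
    s = w[0] + w[1] * x[0] + w[2] * x[1]
    return 1.0 / (1.0 + math.exp(-s))

def loss(w):
    # Sum-of-squares training error over the data set.
    return sum((forward(w, x) - y) ** 2 for x, y in DATA)

def random_optimize(steps=3000, sigma=0.5):
    """Basic random optimization: perturb the weights with a Gaussian
    search vector and keep the move only if the loss improves."""
    w = [0.0, 0.0, 0.0]
    best = loss(w)
    for _ in range(steps):
        cand = [wi + random.gauss(0.0, sigma) for wi in w]
        c = loss(cand)
        if c < best:
            w, best = cand, c
    return w, best

w, final_loss = random_optimize()
```

The paper's contribution replaces the fixed Gaussian here with a PDF whose parameters are themselves adapted during the search, and lets the hidden-layer size grow or shrink rather than staying fixed.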
Keywords :
learning systems; neural nets; probability; DNA; Gaussian probability density function; autoadaptive stratified sampling technique; dynamic node architecture; fully interconnected continuous perceptrons; layered neural networks; optimal probability density function; random optimization method; random search technique; self-optimizing stochastic dynamic node learning algorithm; training; Artificial intelligence; Contracts; Heuristic algorithms; Neural networks; Optimization methods; Power generation; Probability density function; Stochastic processes; US Department of Energy
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155550