Title :
Advances in feedforward neural networks: demystifying knowledge acquiring black boxes
Author_Institution :
Dept. of Comput. Sci., Univ. of Nevada, Reno, NV, USA
Date :
4/1/1996
Abstract :
We survey recent research on the supervised training of feedforward neural networks. The goal is to expose how the networks work, how to engineer them so they learn data with less extraneous noise, how to train them efficiently, and how to assure that the training is valid. The scope covers gradient descent and polynomial line search, from backpropagation through conjugate gradients and quasi-Newton methods. There is a consensus among researchers that adaptive step gains (learning rates) can stabilize and accelerate convergence, and that a good starting weight set improves both the training speed and the learning quality. The training problem includes both the design of a network function and the fitting of that function to a set of input and output data points by computing a set of coefficient weights. The form of the function can be adjusted by adjoining new neurons, pruning existing ones, and setting other parameters such as biases and exponential rates. Our exposition reveals several useful results that are readily implementable.
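To make the abstract's central claim concrete, here is a minimal sketch (our illustration, not code from the survey): a one-hidden-layer feedforward network is fit to the XOR mapping by backpropagation with an adaptive step gain, where the learning rate grows while the squared error falls and is halved when it rises. The architecture, data, and constants are assumptions chosen for brevity.

    import numpy as np

    # Minimal sketch (illustrative, not from the survey): a one-hidden-layer
    # feedforward network trained by batch gradient descent with an adaptive
    # step gain. The gain grows while the squared error falls and is halved
    # when it rises; a full "bold driver" scheme would also undo the failed step.

    rng = np.random.default_rng(0)

    # Toy supervised-training data: the XOR mapping.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    T = np.array([[0.], [1.], [1.], [0.]])

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Small random starting weights: a good initial weight set improves both
    # training speed and learning quality, as the abstract notes.
    W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

    eta = 0.5            # step gain (learning rate)
    prev_err = np.inf

    for epoch in range(5000):
        # Forward pass through both sigmoid layers.
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)
        err = 0.5 * np.sum((Y - T) ** 2)

        # Backpropagate the squared-error gradient through the sigmoids.
        dY = (Y - T) * Y * (1.0 - Y)
        dH = (dY @ W2.T) * H * (1.0 - H)
        gW2, gb2 = H.T @ dY, dY.sum(axis=0)
        gW1, gb1 = X.T @ dH, dH.sum(axis=0)

        # Adaptive step gain: accelerate on success, back off on failure.
        eta = min(eta * 1.05, 10.0) if err < prev_err else eta * 0.5
        prev_err = err

        W1 -= eta * gW1; b1 -= eta * gb1
        W2 -= eta * gW2; b2 -= eta * gb2

    print(np.round(Y.ravel(), 3))  # should approach [0, 1, 1, 0]

The same loop structure accommodates the other accelerators the survey covers: substituting a conjugate-gradient or quasi-Newton search direction for the raw gradient changes only the weight-update step.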
Keywords :
conjugate gradient methods; feedforward neural nets; knowledge acquisition; learning (artificial intelligence); search problems; adaptive step gains; backpropagation; biases; black boxes; coefficient weights; conjugate gradients; exponential rates; feedforward neural networks; gradient descent; learning quality; learning rates; network function; output data points; polynomial line search; quasi-Newton methods; starting weight set; supervised training; Acceleration; Backpropagation; Computer networks; Convergence; Data engineering; Feedforward neural networks; Neural networks; Newton method; Polynomials; Transfer functions
Journal_Title :
IEEE Transactions on Knowledge and Data Engineering