Title :
New nonleast-squares neural network learning algorithms for hypothesis testing
Author :
Pados, Dimitris A. ; Papantoni-Kazakos, P.
Author_Institution :
Dept. of Electr. Eng., Virginia Univ., Charlottesville, VA, USA
Date :
5/1/1995
Abstract :
Hypothesis testing is a collective name for problems such as classification, detection, and pattern recognition. In this paper we propose two new classes of supervised learning algorithms for feedforward, binary-output neural network structures whose objective is hypothesis testing. All the algorithms are applications of stochastic approximation and are guaranteed to converge to an optimum with probability one. The first class of algorithms follows the Neyman-Pearson approach and maximizes the probability of detection subject to a given false alarm constraint. These algorithms produce layer-by-layer optimal Neyman-Pearson designs. The second class of algorithms minimizes the probability of error and leads to layer-by-layer Bayes optimal designs. Deviating from the layer-by-layer optimization assumption, we propose more powerful learning techniques that unify, in some sense, the existing algorithms. The proposed algorithms were implemented and tested on a simulated hypothesis testing problem. Backpropagation and perceptron learning were also included in the comparisons.
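To make the stochastic-approximation idea concrete, the following is a minimal sketch, not the paper's algorithms: a Robbins-Monro-style update of a single decision threshold for a simulated binary hypothesis testing problem (Gaussian shift in the mean, equal priors), aiming at the error-minimizing (Bayes) threshold. The decreasing step sizes a_k = c/(k+1) satisfy the usual conditions (sum a_k diverges, sum a_k^2 converges) under which such schemes converge with probability one. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated binary hypothesis test: x ~ N(-1, 1) under H0, x ~ N(+1, 1) under H1,
# with equal priors. The Bayes-optimal decision is "choose H1 if x > 0".
def sample(n):
    labels = rng.integers(0, 2, size=n)       # 0 -> H0, 1 -> H1
    x = rng.normal(2.0 * labels - 1.0, 1.0)   # mean -1 under H0, +1 under H1
    return x, labels

# Robbins-Monro-style stochastic approximation of the error-minimizing
# threshold t: on each decision error, nudge t by a decreasing step a_k.
# A false alarm raises t (fewer H1 decisions); a miss lowers it.
def train_threshold(steps=20000, c=1.5):
    t = 0.5  # arbitrary starting point
    x, labels = sample(steps)
    for k in range(steps):
        decide_h1 = x[k] > t
        if decide_h1 != bool(labels[k]):      # decision error
            a_k = c / (k + 1)
            t += a_k if decide_h1 else -a_k
    return t

t = train_threshold()
# t drifts toward the Bayes threshold, which is 0 for this symmetric problem.
```

The paper's algorithms operate on multilayer, binary-output networks and also handle the Neyman-Pearson case (a false alarm constraint); this sketch only illustrates the single-parameter stochastic-approximation mechanism.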
Keywords :
Bayes methods; approximation theory; feedforward neural nets; learning (artificial intelligence); pattern classification; probability; Bayes method; Neyman-Pearson approach; backpropagation; feedforward binary-output neural network; hypothesis testing; layer-by-layer optimization; nonleast-squares neural network; pattern recognition; perceptron learning; stochastic approximation; supervised learning algorithms; Algorithm design and analysis; Backpropagation algorithms; Bayesian methods; Error correction; Least squares approximation; Least squares methods; Neural networks; Pattern recognition; Stochastic processes; Testing
Journal_Title :
IEEE Transactions on Neural Networks