DocumentCode :
3099924
Title :
Neural architectures for parametric estimation of a posteriori probabilities by constrained conditional density functions
Author :
Arribas, Juan Ignacio ; Cid-Sueiro, Jesus ; Adali, Tülay ; Figueiras-Vidal, Anibal R.
Author_Institution :
Dept. of Teoria de la Senal, Comunicaciones e Ing. Telematica, Valladolid Univ., Spain
fYear :
1999
fDate :
Aug. 1999
Firstpage :
263
Lastpage :
272
Abstract :
A new approach to the estimation of a posteriori class probabilities using neural networks, Joint Network and Data Density Estimation (JNDDE), is presented. It is based on the estimation of the conditional data density functions, with some restrictions imposed by the classifier structure; Bayes' rule is then used to obtain the a posteriori probabilities from these densities. The proposed method is applied to three network structures: the logistic perceptron (for the binary case), the softmax perceptron (for multi-class problems), and a generalized softmax perceptron (which can map arbitrarily complex probability functions). Gaussian mixture models are used for the conditional densities. The method has the advantage of establishing a distinction between the network architecture constraints and the model of the data, separating the network parameters from the model parameters, so that the complexity of each can be fixed as desired. Maximum-likelihood gradient-based rules for estimating the parameters can be obtained. It is shown that JNDDE exhibits more robust convergence characteristics than other methods of a posteriori probability estimation, such as those based on the minimization of a strict-sense Bayesian cost function.
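The core mechanism described in the abstract, obtaining a posteriori class probabilities by applying Bayes' rule to class-conditional Gaussian mixture densities, can be sketched as follows. This is an illustrative toy example, not the authors' JNDDE implementation: the mixture parameters and the one-dimensional setting are invented for demonstration.

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density N(x; mean, var)."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def mixture_density(x, weights, means, variances):
    """Class-conditional density p(x|c) as a weighted sum of Gaussians."""
    return sum(w * gaussian_pdf(x, m, v)
               for w, m, v in zip(weights, means, variances))

def posteriors(x, class_mixtures, priors):
    """Bayes' rule: P(c|x) = p(x|c) P(c) / sum_k p(x|k) P(k)."""
    joint = np.array([mixture_density(x, *mix) * p
                      for mix, p in zip(class_mixtures, priors)])
    return joint / joint.sum()

# Two classes, each modeled by a 2-component mixture (toy parameters).
class_mixtures = [
    ([0.6, 0.4], [-1.0, 0.0], [0.5, 1.0]),   # p(x | c=0)
    ([0.5, 0.5], [1.0, 2.0], [0.5, 0.8]),    # p(x | c=1)
]
priors = [0.5, 0.5]

p = posteriors(0.0, class_mixtures, priors)
print(p)  # the two posteriors sum to 1
```

In JNDDE the mixture parameters are additionally constrained by the network architecture (logistic or softmax perceptron) and fitted by maximum-likelihood gradient rules; the sketch above only shows the final Bayes'-rule step from densities to posteriors.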
Keywords :
Bayes methods; convergence; learning (artificial intelligence); maximum likelihood estimation; neural nets; parameter estimation; pattern classification; Bayes rule; Gaussian mixture models; JNDDE; Joint Network and Data Density Estimation; Strict Sense Bayesian cost function; a posteriori probabilities; classifier structure; conditional data density functions; constrained conditional density functions; logistic perceptron; maximum likelihood gradient-based rules; neural networks; parametric estimation; robust convergence; softmax perceptron; Bayesian methods; Convergence; Cost function; Density functional theory; Logistics; Maximum likelihood estimation; Minimization methods; Neural networks; Parameter estimation; Robustness;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks for Signal Processing IX, 1999. Proceedings of the 1999 IEEE Signal Processing Society Workshop.
Conference_Location :
Madison, WI
Print_ISBN :
0-7803-5673-X
Type :
conf
DOI :
10.1109/NNSP.1999.788145
Filename :
788145