DocumentCode :
295755
Title :
A maximum likelihood neural network based on a log-linearized Gaussian mixture model
Author :
Tsuji, Toshio ; Ichinobe, Hiroyuki ; Fukuda, Osamu ; Kaneko, Makoto
Author_Institution :
Fac. of Eng., Hiroshima Univ., Japan
Volume :
3
fYear :
1995
fDate :
Nov/Dec 1995
Firstpage :
1293
Abstract :
This paper proposes a new probabilistic neural network based on a log-linearized Gaussian mixture model, which can estimate a posteriori probabilities for pattern classification problems. Although the structure of the proposed network represents a statistical model, a forward calculation and a backward learning rule based on maximum likelihood estimation can be defined in the same manner as in the error back-propagation neural network model. Experiments show that considerably high classification performance can be achieved even with a small training sample size, and that the structure of the network is easily determined from the incorporated statistical model.
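The forward/backward scheme described in the abstract can be illustrated with a minimal, self-contained sketch. This is not the paper's implementation: it assumes a softmax layer over per-class Gaussian-mixture component units whose log-scores are linear in the quadratically expanded input (1, x, x^2), trained by gradient ascent on the log-likelihood. The class name, diagonal-covariance form, learning rate, and toy data are all assumptions.

# Minimal sketch (assumed, not the authors' code): class posteriors from a
# log-linearized Gaussian mixture, trained by maximum likelihood.
import numpy as np

class LogLinearGMMNet:
    def __init__(self, n_classes, n_components, n_features, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.K, self.M, self.D = n_classes, n_components, n_features
        # One weight vector per (class, component) unit acting on the expanded
        # input [1, x, x*x]; its exponential plays the role of a weighted
        # axis-aligned Gaussian component written in log-linear form.
        self.W = 0.01 * rng.standard_normal((n_classes * n_components, 1 + 2 * n_features))
        self.lr = lr

    def _expand(self, X):
        # The log of an axis-aligned Gaussian is linear in (1, x, x^2).
        return np.hstack([np.ones((X.shape[0], 1)), X, X * X])

    def posteriors(self, X):
        # Forward pass: softmax over all class-component units, then sum the
        # component probabilities within each class to get class posteriors.
        Z = self._expand(X) @ self.W.T                      # (N, K*M) log scores
        Z -= Z.max(axis=1, keepdims=True)                   # numerical stability
        P = np.exp(Z)
        P /= P.sum(axis=1, keepdims=True)
        return P.reshape(-1, self.K, self.M).sum(axis=2)    # (N, K)

    def fit(self, X, y, epochs=200):
        Phi = self._expand(X)
        Y = np.eye(self.K)[y]                               # one-hot targets
        for _ in range(epochs):
            Z = Phi @ self.W.T
            Z -= Z.max(axis=1, keepdims=True)
            P = np.exp(Z); P /= P.sum(axis=1, keepdims=True)            # (N, K*M)
            Pc = P.reshape(-1, self.K, self.M)
            post = Pc.sum(axis=2)                                       # class posteriors
            # Gradient of the log-likelihood w.r.t. each unit's log score:
            # component responsibility of the true class minus softmax output.
            resp = Pc * (Y / np.clip(post, 1e-12, None))[:, :, None]
            grad = (resp.reshape(-1, self.K * self.M) - P).T @ Phi
            self.W += self.lr * grad / len(X)                # gradient ascent
        return self

# Toy usage: class 0 is one Gaussian, class 1 a two-component mixture.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X0 = rng.normal([0, 0], 0.5, (100, 2))
    X1 = np.vstack([rng.normal([2, 2], 0.5, (50, 2)),
                    rng.normal([-2, 2], 0.5, (50, 2))])
    X = np.vstack([X0, X1]); y = np.array([0] * 100 + [1] * 100)
    net = LogLinearGMMNet(n_classes=2, n_components=2, n_features=2).fit(X, y)
    print("training accuracy:", (net.posteriors(X).argmax(1) == y).mean())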
Keywords :
Gaussian distribution; generalisation (artificial intelligence); learning (artificial intelligence); maximum likelihood estimation; neural nets; pattern classification; a posteriori probability; backward learning rule; error back propagation neural network model; forward calculation; log-linearized Gaussian mixture model; maximum likelihood neural network; pattern classification problem; probabilistic neural network; statistical model; Error analysis; Feedforward neural networks; Flexible structures; Iterative methods; Maximum likelihood estimation; Neural networks; Pattern classification; Probability density function; Stochastic processes; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 1995 IEEE International Conference on Neural Networks (ICNN '95)
Conference_Location :
Perth, WA
Print_ISBN :
0-7803-2768-3
Type :
conf
DOI :
10.1109/ICNN.1995.487343
Filename :
487343