Title :
The local backward-forward algorithm
Author_Institution :
Div. of Appl. Math., Brown Univ., Providence, RI, USA
Abstract :
The author introduces stochastic recurrent networks, which are collections of interconnected finite-state units. Each unit transitions to a new state at every discrete time step according to a probability law conditioned on the states of neighboring units at the previous time step. A network of this type can be trained to learn a stochastic process, where 'training' means maximizing the likelihood function of the model. A novel training (i.e. likelihood-maximization) algorithm is introduced: the local backward-forward algorithm. It is based on the fast backward-forward algorithm of hidden Markov model training and substantially improves learning speed compared with backpropagation. Essentially, the local backward-forward algorithm is a version of Baum's algorithm that estimates local transition probabilities rather than the global transition probability matrix.
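Code_Sketch_Network_Dynamics :
A minimal sketch of the network dynamics described in the abstract: each finite-state unit samples its next state from a distribution conditioned on its neighbors' states at the previous time step. The interface (the step function, the cond_prob callback, and the two-unit toy example) is an assumption for illustration, not the paper's specification.

import numpy as np

def step(states, neighbors, cond_prob, rng):
    # Advance every unit one discrete time step: unit i samples its next
    # state from a law conditioned on its neighbors' previous states.
    new_states = np.empty_like(states)
    for i in range(len(states)):
        p = cond_prob(i, states[neighbors[i]])   # conditional distribution for unit i
        new_states[i] = rng.choice(len(p), p=p)  # sample the next state
    return new_states

# Toy usage (assumed, not from the paper): two binary units, each
# conditioned on the other's previous state.
rng = np.random.default_rng(0)
S = 2
neighbors = [np.array([1]), np.array([0])]
tables = rng.dirichlet(np.ones(S), size=(2, S))  # per-unit conditional tables

def cond_prob(i, nbr_states):
    return tables[i, nbr_states[0]]

x = np.array([0, 1])
for _ in range(5):
    x = step(x, neighbors, cond_prob, rng)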
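Code_Sketch_Backward_Forward :
The record does not specify the local variant itself; as background only, here is a minimal, unscaled sketch of the standard forward-backward (a.k.a. backward-forward) recursions of Baum's algorithm for a hidden Markov model, which the abstract says the method adapts. All names (forward_backward, A, B, pi) are assumptions; a practical implementation would rescale alpha and beta to avoid underflow on long sequences.

import numpy as np

def forward_backward(A, B, pi, obs):
    # A: (N, N) transition matrix; B: (N, M) emission matrix;
    # pi: (N,) initial distribution; obs: integer observation sequence.
    # Returns the expected transition counts (the quantity re-estimated in
    # Baum-Welch-style likelihood maximization) and the sequence likelihood.
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                      # forward initialization
    for t in range(1, T):                             # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0                                    # backward initialization
    for t in range(T - 2, -1, -1):                    # backward pass
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    likelihood = alpha[-1].sum()
    xi_sum = np.zeros((N, N))                         # sum_t P(q_t=i, q_{t+1}=j | obs)
    for t in range(T - 1):
        xi_sum += np.outer(alpha[t], B[:, obs[t + 1]] * beta[t + 1]) * A
    return xi_sum / likelihood, likelihood

# Toy usage (assumed): re-estimate the transition matrix from one sequence.
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
xi, ll = forward_backward(A, B, pi, [0, 1, 1, 0])
A_new = xi / xi.sum(axis=1, keepdims=True)  # Baum-Welch re-estimate of A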
Keywords :
learning systems; neural nets; optimisation; probability; stochastic processes; Baum's algorithm; hidden Markov model training; interconnected finite state units; local backward-forward algorithm; local transition probabilities; machine learning; likelihood function; stochastic process; stochastic recurrent networks; Hidden Markov models; Speech recognition;
Conference_Title :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155355