Title :
Recurrent neural network using mixture of experts for time series processing
Author :
Tabuse, Mirai ; Kinouchi, Makoto ; Hagiwara, Masafumi
Author_Institution :
Dept. of Electr. Eng., Keio Univ., Yokohama, Japan
Abstract :
In this paper, we propose a mixture of experts (MOE) with recurrent connections for improved time series processing. The proposed network has recurrent connections from the output layer to the context layer, as in the Jordan network. The context layer is expanded into a number of sublayers so that the information necessary for time series processing can be held for a longer time. Most learning algorithms for conventional recurrent networks are based on the backpropagation algorithm, so the number of epochs required for convergence tends to be large. The MOE used in the proposed network employs a modular approach: trained with the expectation-maximization (EM) algorithm, the MOE converges very quickly, especially in the initial steps. The proposed network can also employ the EM algorithm, so faster convergence is expected. We have examined the ability of the proposed network through computer simulations. The results show that the proposed network requires fewer epochs for convergence than conventional networks.
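The abstract's key architectural idea, a Jordan-style context layer expanded into several sublayers that each hold the output from a different past time step, can be illustrated with a minimal NumPy sketch. This is not the authors' code: the layer sizes, the single feedforward expert standing in for the full MOE, and the gradient-free forward pass are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's implementation) of a
# Jordan-style recurrent step whose context layer is expanded into
# n_sublayers sublayers, each holding the output from one past step.
rng = np.random.default_rng(0)

n_in, n_hidden, n_out, n_sublayers = 3, 5, 2, 4  # assumed sizes

# A single stand-in "expert": a feedforward net whose input is the
# current sample concatenated with all context sublayers.
W_h = rng.standard_normal((n_hidden, n_in + n_sublayers * n_out)) * 0.1
W_o = rng.standard_normal((n_out, n_hidden)) * 0.1

def step(x, context):
    """One time step: compute the output, then shift it into the context."""
    z = np.concatenate([x] + list(context))  # input + expanded context
    h = np.tanh(W_h @ z)
    y = W_o @ h
    # The newest output enters sublayer 0 and older outputs shift down,
    # so past information persists for n_sublayers steps rather than one.
    context = [y] + context[:-1]
    return y, context

context = [np.zeros(n_out) for _ in range(n_sublayers)]
for t in range(6):
    x = rng.standard_normal(n_in)
    y, context = step(x, context)
```

With a single context sublayer this reduces to the ordinary Jordan network; the expanded queue is what lets the network retain output history over a longer horizon, as the abstract describes.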
Keywords :
convergence of numerical methods; learning (artificial intelligence); mathematics computing; recurrent neural nets; statistical analysis; time series; Jordan network; context layer; convergence; expectation-maximization algorithm; learning algorithms; mixture of experts; output layer; recurrent neural network; time series processing; Biological neural networks; Computer networks; Convergence; Delay effects; Electronic mail; Jacobian matrices; Motion pictures; Neurofeedback; Recurrent neural networks; Speech processing;
Conference_Titel :
1997 IEEE International Conference on Systems, Man, and Cybernetics: Computational Cybernetics and Simulation
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-4053-1
DOI :
10.1109/ICSMC.1997.625807