Title :
The past is important: a method for determining memory structure in NARX neural networks
Author :
Giles, C. Lee ; Lin, Tsungnan ; Horne, Bill G. ; Kung, S.Y.
Author_Institution :
NEC Res. Inst., Princeton, NJ, USA
Abstract :
Recurrent networks have become popular models for system identification and time series prediction. NARX networks (nonlinear autoregressive models with exogenous inputs) are a popular subclass of recurrent networks and have been used in many applications. Although embedded memory is present in all recurrent network models, it is particularly prominent in NARX models. We show that intelligent memory order selection through pruning, combined with good initialization heuristics, significantly improves the generalization and predictive performance of these nonlinear systems on problems as diverse as grammatical inference and time series prediction.
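For readers unfamiliar with the model class, the sketch below illustrates the NARX structure the abstract refers to: the next output is predicted from tapped delay lines of past outputs and past exogenous inputs, and the memory orders d_y and d_u are exactly the kind of quantities that order selection and pruning operate on. This is a minimal NumPy illustration under assumed conventions, not the authors' implementation; the names (NARXNet, narx_features), the toy plant, and all parameters are hypothetical, and training and delay-tap pruning are omitted.

    # Illustrative NARX sketch (hypothetical, not the paper's code).
    # Model: y(t) = f(y(t-1),...,y(t-d_y), u(t-1),...,u(t-d_u)),
    # with f approximated by a one-hidden-layer feedforward network.
    import numpy as np

    rng = np.random.default_rng(0)

    def narx_features(y, u, d_y, d_u, t):
        """Tapped-delay-line regressor: recent outputs, then recent inputs."""
        return np.concatenate([y[t - d_y:t][::-1], u[t - d_u:t][::-1]])

    class NARXNet:
        """Small MLP applied to the NARX regressor (untrained here)."""
        def __init__(self, d_y, d_u, hidden=8):
            n_in = d_y + d_u
            self.W1 = rng.normal(0, 1 / np.sqrt(n_in), (hidden, n_in))
            self.b1 = np.zeros(hidden)
            self.W2 = rng.normal(0, 1 / np.sqrt(hidden), hidden)
            self.b2 = 0.0

        def predict_step(self, x):
            h = np.tanh(self.W1 @ x + self.b1)   # hidden layer
            return self.W2 @ h + self.b2         # scalar one-step prediction

    # Example: one-step-ahead prediction over a toy linear plant.
    T, d_y, d_u = 200, 4, 2
    u = rng.normal(size=T)
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = 0.8 * y[t - 1] + 0.3 * u[t - 1]

    net = NARXNet(d_y, d_u)
    t0 = max(d_y, d_u)
    preds = [net.predict_step(narx_features(y, u, d_y, d_u, t))
             for t in range(t0, T)]

Pruning a delay tap amounts to removing one column of the regressor (and the corresponding column of W1), which is why memory order selection can be framed as a network-pruning problem.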
Keywords :
autoregressive processes; content-addressable storage; forecasting theory; generalisation (artificial intelligence); inference mechanisms; learning (artificial intelligence); recurrent neural nets; time series; NARX neural networks; delay damage algorithm; generalization; grammatical inference; learning algorithm; memory structure; nonlinear autoregressive models; nonlinear systems; optimisation; order selection; pruning; recurrent neural networks; time series prediction; Computer networks; Delay effects; Intelligent networks; Memory architecture; Neural networks; Predictive models; Signal processing; Signal processing algorithms
Conference_Title :
1998 IEEE International Joint Conference on Neural Networks (IJCNN) Proceedings, IEEE World Congress on Computational Intelligence
Conference_Location :
Anchorage, AK, USA
Print_ISBN :
0-7803-4859-1
DOI :
10.1109/IJCNN.1998.687136