Title :
Computing iterative roots with second order training methods
Author :
Kindermann, Lars ; Protzel, Peter
Author_Institution :
Dept. of Electr. Eng. & Inf. Technol., Chemnitz Univ. of Technol., Germany
Abstract :
Iterative roots are a valuable tool for modeling and analyzing dynamical systems. They provide a natural way to construct a continuous-time model from discrete-time data. However, in most cases they are extremely difficult to compute analytically. Previously we demonstrated how to use neural networks to calculate iterative roots and fractional iterations of functions, using a special MLP topology together with weight sharing. This paper shows how adding a regularization term to the error function directs any backpropagation-based training method to the same result, but in a fraction of the epochs when advanced second-order learning rules are used.
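A minimal sketch of the idea described in the abstract (not the authors' exact implementation): an iterative root g of f satisfies g(g(x)) = f(x), so one can compose two MLP "halves" and fit the composition to f. Instead of hard weight sharing, the two halves are tied softly by an illustrative penalty on their weight differences, which lets a second-order optimizer (here torch.optim.LBFGS as a stand-in for the paper's second-order training rules) be applied directly. The target function f, network sizes, and the regularization weight lam are assumptions for illustration only.

```python
# Sketch: compute an iterative root g with g(g(x)) ~= f(x), using a soft
# weight-sharing regularizer instead of hard weight sharing, so a
# second-order optimizer (LBFGS) can be used. Illustrative only.
import torch
import torch.nn as nn

torch.manual_seed(0)

def f(x):                      # example target map (assumption, not from the paper)
    return torch.sin(2.0 * x)

def make_half():               # one "half" of the composed network
    return nn.Sequential(nn.Linear(1, 20), nn.Tanh(), nn.Linear(20, 1))

g1, g2 = make_half(), make_half()
params = list(g1.parameters()) + list(g2.parameters())
opt = torch.optim.LBFGS(params, lr=0.5, max_iter=200)

x = torch.linspace(-1.0, 1.0, 200).unsqueeze(1)   # training samples
y = f(x)
lam = 1.0                                          # regularization weight (illustrative)

def closure():
    opt.zero_grad()
    loss = nn.functional.mse_loss(g2(g1(x)), y)    # error of the composed network
    # soft weight sharing: penalize differences between the two halves' weights
    for p1, p2 in zip(g1.parameters(), g2.parameters()):
        loss = loss + lam * (p1 - p2).pow(2).sum()
    loss.backward()
    return loss

for _ in range(20):
    opt.step(closure)

# After training, either half approximates the iterative root: g1(g1(x)) ~= f(x).
print(nn.functional.mse_loss(g1(g1(x)), y).item())
```

With lam = 0 the two halves can drift apart and the composition merely reproduces f; increasing lam pushes both halves toward the same function, which is the role the regularization term plays in place of strict weight sharing.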
Keywords :
backpropagation; continuous time systems; discrete time systems; iterative methods; learning (artificial intelligence); multilayer perceptrons; MLPs; backpropagation based training method; continuous time model; discrete time data; dynamical systems; error function; iterative roots; multilayer perceptrons; neural networks; regularization term; second order training methods; weight sharing; Chaos; Chemical technology; Continuous time systems; Equations; Information technology; Iterative methods; Network topology; Neural networks; Nonlinear dynamical systems; World Wide Web;
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.939095