DocumentCode :
2162834
Title :
Approximation ability of a class of locally recurrent globally feed-forward neural networks
Author :
Patan, Krzysztof
Author_Institution :
Inst. of Control & Comput. Eng., Univ. of Zielona Gora, Zielona Gora, Poland
fYear :
2007
fDate :
2-5 July 2007
Firstpage :
3850
Lastpage :
3857
Abstract :
The paper investigates the approximation abilities of a special class of discrete-time dynamic neural networks. These networks are called locally recurrent globally feed-forward because they are built from dynamic neuron models containing inner feedbacks, while the interconnections between neurons are strictly feed-forward, as in the well-known multi-layer perceptron. The paper presents analytical results showing that a locally recurrent network with two hidden layers can approximate a state-space trajectory produced by any Lipschitz continuous function with arbitrary accuracy. Moreover, based on these results, the network can be simplified and transformed into a more practical structure useful in real-world applications.
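The architecture described in the abstract can be illustrated with a minimal sketch of a single dynamic neuron: weighted inputs pass through an internal IIR filter (the "inner feedback"), followed by a static activation. The coefficient names (`w` for synaptic weights, `b` for feed-forward filter coefficients, `a` for feedback coefficients) and the `tanh` activation are illustrative assumptions, not the paper's notation.

```python
import numpy as np

class DynamicNeuron:
    """Sketch of a locally recurrent neuron: the weighted input sum is
    filtered by an internal IIR filter, then squashed by tanh.  All
    parameter names are hypothetical, chosen for illustration only."""

    def __init__(self, w, b, a):
        self.w = np.asarray(w, dtype=float)  # synaptic input weights
        self.b = np.asarray(b, dtype=float)  # feed-forward (moving-average) coeffs
        self.a = np.asarray(a, dtype=float)  # feedback (autoregressive) coeffs
        self.u = np.zeros(len(self.b))       # recent filter inputs: s(k), s(k-1), ...
        self.x = np.zeros(len(self.a))       # recent filter states: x(k-1), x(k-2), ...

    def step(self, inputs):
        # Weighted sum of the external inputs (purely feed-forward part).
        s = float(self.w @ np.asarray(inputs, dtype=float))
        # Shift the input history and insert the new sum.
        self.u = np.roll(self.u, 1)
        self.u[0] = s
        # IIR filter update: feed-forward term plus internal feedback term.
        x = self.b @ self.u + self.a @ self.x
        self.x = np.roll(self.x, 1)
        self.x[0] = x
        # Static activation applied to the filter state.
        return float(np.tanh(x))

# Feed a short input sequence through one neuron; the second input
# channel acts as a constant bias input.
n = DynamicNeuron(w=[0.5, -0.3], b=[1.0, 0.2], a=[0.4])
outputs = [n.step([u, 1.0]) for u in (1.0, 0.0, 0.0)]
```

Because the feedback is local to the neuron, a whole layer of such units can be stacked in a strictly feed-forward topology, matching the network class the abstract analyzes.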
Keywords :
feedforward neural nets; multilayer perceptrons; Lipschitz continuous function; approximation abilities; discrete-time dynamic neural networks; dynamic neuron models; locally recurrent globally feed-forward neural networks; multilayer perceptron; real world applications; state-space trajectory; Actuators; Approximation methods; Biological neural networks; Neurons; Training; Valves; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Control Conference (ECC), 2007 European
Conference_Location :
Kos
Print_ISBN :
978-3-9524173-8-6
Type :
conf
Filename :
7068622