DocumentCode :
2644341
Title :
Neural training and generalisation of sequences using continuous temporal structure
Author :
Weir, Michael K. ; Chen, Li H.
Author_Institution :
Dept. of Comput. Sci., St. Andrews Univ., UK
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
2027
Abstract :
An approach to sequential neural behavior called continuous backpropagation is developed from standard backpropagation by replacing a single state with a state transition sequence as the goal weight condition. The approach may be used to train mappings of analog input/output (I/O) signals or discrete I/O sequences with underlying continuity. An arbitrarily increasing number of values in the I/O sequences may be trained without having to increase the number of hidden units. The training and generalization techniques are illustrated by a sequential four-spiral version of Wieland's two-spirals problem. The results show a substantial improvement over standard state-based backpropagation.
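The abstract's central claim, that a fixed number of hidden units can fit an arbitrarily densely sampled I/O sequence when the sequence has underlying continuity, can be illustrated with a minimal sketch. This is not the authors' continuous-backpropagation algorithm; it is ordinary backpropagation on a small MLP whose training goal is a whole sampled trajectory rather than a single target state, with the trajectory (`sin`), the hidden-layer size `H`, and all hyperparameters chosen purely for illustration:

```python
# Hypothetical sketch (not the paper's algorithm): train a small MLP so that
# the goal is a sequence of I/O states sampled from a continuous trajectory,
# rather than a single target state.
import numpy as np

rng = np.random.default_rng(0)

# Densely sampled analog I/O sequence with underlying continuity.
N = 200                       # number of sequence points; can grow freely
x = np.linspace(0.0, 1.0, N).reshape(-1, 1)
y = np.sin(2 * np.pi * x)     # smooth target trajectory

H = 8                         # hidden units stay FIXED as N grows
W1 = rng.normal(0.0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.5, (H, 1)); b2 = np.zeros(1)

def forward(x):
    """One hidden tanh layer, linear output."""
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

_, pred0 = forward(x)
loss0 = mse(pred0, y)         # error before training

lr = 0.05
for _ in range(3000):
    h, pred = forward(x)
    err = (pred - y) / N      # gradient of MSE w.r.t. output (factor 2 in lr)
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)   # backprop through tanh
    gW1 = x.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred1 = forward(x)
loss1 = mse(pred1, y)         # error over the whole sequence after training
print(loss1 < loss0)
```

Because the targets lie on a smooth curve, raising `N` only adds samples of the same underlying function, so the fixed `H`-unit network's capacity requirement does not grow with sequence length; this is the intuition the sketch is meant to convey, under the stated assumptions.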
Keywords :
learning systems; neural nets; Wieland's two-spirals problem; continuous backpropagation; continuous temporal structure; four-spiral version; generalisation; neural training; sequential neural behavior; Multilayer perceptrons; Signal mapping; Switches; Time factors; Topology;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170690
Filename :
170690