Author_Institution :
Dept. of Electr. Eng., Texas Univ., El Paso, TX, USA
Abstract :
A new class of temporal associative neural network, called the finite state network (FSN), is presented. Unlike other temporal networks, the proposed FSN has the desirable feature that it can associate any input temporal pattern with any output temporal pattern. A temporal pattern is represented by a symbol string, and each symbol is a bipolar vector. The FSN is trained on input-output exemplar string pairs, and it can always learn new exemplar pairs while retaining all previously trained pairs unchanged. Suppose the FSN has been trained on exemplar pairs (α1, θ1), (α2, θ2), ..., (αp, θp). Each time the FSN receives an input string α, it compares α with each of α1, α2, ..., αp and finds the closest one, denoted αk; the FSN then responds with the corresponding θk as its output. Training the FSN on an exemplar string pair (α, θ) is a one-pass process that adds (α, θ) to the FSN's memory. This paper describes the structure of the FSN, which consists of three subnets: one for input, one for state representation, and one for output. A procedure is given to train the network by adjusting all adaptive weights associated with the three subnets. Implementation and training of the FSN are validated through simulation, and test results are given.
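The input-output behavior summarized in the abstract (one-pass storage of exemplar string pairs followed by nearest-exemplar recall) can be sketched functionally. The Python snippet below is an illustrative model of that behavior only, not the FSN's three-subnet implementation; the class name, the Hamming-style distance between bipolar symbol strings, and the length-mismatch penalty are assumptions introduced for illustration.

```python
import numpy as np

class TemporalAssociativeMemory:
    """Hypothetical sketch of nearest-exemplar recall over temporal patterns.
    A temporal pattern is a string (list) of symbols, each a bipolar vector."""

    def __init__(self):
        self.pairs = []  # stored exemplar (input string, output string) pairs

    def train(self, alpha, theta):
        """One-pass training: simply add the exemplar pair (alpha, theta) to memory."""
        self.pairs.append((alpha, theta))

    @staticmethod
    def _distance(a, b):
        """Illustrative distance between two symbol strings: summed Hamming
        distance between corresponding bipolar symbols, with any extra
        symbols of the longer string counted as fully mismatched."""
        d = sum(int(np.sum(x != y)) for x, y in zip(a, b))
        extra = a[len(b):] if len(a) > len(b) else b[len(a):]
        d += sum(x.size for x in extra)
        return d

    def recall(self, alpha):
        """Return the output string paired with the stored input string closest to alpha."""
        _, theta_k = min(self.pairs, key=lambda p: self._distance(alpha, p[0]))
        return theta_k

# Usage: two exemplar pairs of length-2 strings over 4-dimensional bipolar symbols.
A = np.array([+1, -1, +1, -1]); B = np.array([-1, +1, -1, +1])
mem = TemporalAssociativeMemory()
mem.train([A, B], [B, A])
mem.train([B, B], [A, A])
noisy = [np.array([+1, -1, +1, +1]), B]  # closest to the first exemplar input
print(mem.recall(noisy))                  # recalls the paired output string [B, A]
```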
Keywords :
adaptive systems; content-addressable storage; learning by example; neural nets; adaptive weights; bipolar vector; exemplar pairs; finite internal states; finite state network; input string; input temporal pattern; input-output exemplar string pairs; learning; output temporal pattern; simulation; state representation; symbol string; temporal associative memory; temporal associative neural network; temporal networks; test results; training; Associative memory; Backpropagation; Neural networks; Testing;
Conference_Title :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)