Title :
Induced specialization of context units for temporal pattern recognition and reproduction
Author :
Kothari, Ravi; Agyepong, Kwabena
Author_Institution :
Dept. of Electr. & Comput. Eng., Cincinnati Univ., OH, USA
Abstract :
Additional inputs to a feedforward network, derived from the outputs of the hidden-layer neurons, allow the network to handle temporal pattern recognition and reproduction tasks. These "network-derived" or "context" inputs augment the "true" inputs to the network and allow it to retain the past information necessary for temporal sequence processing. Choosing which hidden neurons should provide the context inputs is difficult: using all of the hidden neurons increases the size of the overall network, resulting in poorer generalization performance. The problem is further complicated by the difficulty of choosing the number of hidden-layer neurons in the first place. In this paper, we propose the use of regularization terms in the sum-of-squared-error cost function. Assuming the hidden-layer neurons are indexed 1, 2, ..., m, the regularization terms force the differentiation of hidden neurons 1 through m1 and m2 through m (where 1 < m1 < m2 < m). Both m1 and m2 are controllable, allowing the fringe neurons to provide the context inputs when the number of context units to use is known. When the number of context neurons cannot be determined in advance, the regularization terms minimize m1 and maximize m2, while hidden neurons m1 through m2 are penalized for differentiation. An amplitude-detection simulation is used to evaluate the efficacy of the proposed paradigm.
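The context-input mechanism described above can be sketched as a minimal Elman-style forward pass: hidden-layer outputs from the previous time step are fed back as extra inputs alongside the true inputs. Everything below is an illustrative assumption, not the paper's exact formulation; in particular, the dimensions (n, m, m1, m2) are arbitrary, and the penalty on the temporal variation of the middle neurons is only one plausible way to discourage their "differentiation" so that the fringe neurons 1..m1 and m2..m specialize as memory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions (illustrative): n true inputs, m hidden neurons, one output.
# Hidden outputs are fed back as "context" inputs at the next time step.
n, m, n_out = 3, 8, 1
m1, m2 = 2, 6  # fringe bands 1..m1 and m2..m would supply the context units

W_in = rng.normal(scale=0.1, size=(m, n + m))   # weights on [true inputs, context]
W_out = rng.normal(scale=0.1, size=(n_out, m))

def step(x, context):
    """One time step: hidden activations from true inputs plus context inputs."""
    h = np.tanh(W_in @ np.concatenate([x, context]))
    y = W_out @ h
    return h, y

# Run over a short random sequence, carrying hidden outputs forward as context.
context = np.zeros(m)
hs = []
for t in range(10):
    x = rng.normal(size=n)
    context, y = step(x, context)
    hs.append(context)
hs = np.stack(hs)  # hidden activations over time, shape (10, m)

# Illustrative regularization term: penalize the temporal variation
# ("differentiation") of the middle neurons m1..m2, so that only the
# fringe neurons are encouraged to carry sequence-dependent information.
middle_penalty = np.mean(np.var(hs[:, m1:m2], axis=0))
```

In a full training loop, a term like `middle_penalty` would be added to the sum-of-squared-error cost so that gradient descent suppresses specialization of the middle band while leaving the fringe neurons free to act as context units.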
Keywords :
feedforward neural nets; generalisation (artificial intelligence); pattern recognition; signal detection; amplitude detection; context units; feedforward network; generalization performance; hidden layer neurons; induced specialization; past information; regularization terms; sum-of-squared error cost function; temporal pattern recognition; temporal pattern reproduction; temporal sequence processing; Computer science; Context modeling; Cost function; Delay lines; Feedforward systems; Laboratories; Neural networks; Neurons; Pattern recognition; Training data;
Conference_Titel :
Neural Networks for Signal Processing VII: Proceedings of the 1997 IEEE Workshop
Conference_Location :
Amelia Island, FL
Print_ISBN :
0-7803-4256-9
DOI :
10.1109/NNSP.1997.622391