DocumentCode :
1482065
Title :
Fixed-weight on-line learning
Author :
Younger, A. Steven ; Conwell, Peter R. ; Cotter, Neil E.
Author_Institution :
Dept. of Phys., Utah Univ., Salt Lake City, UT, USA
Volume :
10
Issue :
2
fYear :
1999
fDate :
3/1/1999
Firstpage :
272
Lastpage :
283
Abstract :
Conventional neural nets perform functional mappings from their input space to their output space in a manner analogous to long-term biological memory. This paper presents a method of designing neural nets in which recurrent signal loops store knowledge in a manner analogous to short-term memory. The synaptic weights encode a learning algorithm, giving these nets the ability to dynamically learn any functional mapping from a (possibly very large) set without changing any synaptic weight. These nets are adaptive dynamic systems. Learning is online, taking place continually as part of the net's overall behavior rather than as a separate, externally driven process. We present four high-order fixed-weight learning nets. Two of these have standard backpropagation embedded in their synaptic weights; the other two utilize a more efficient gradient-descent-based rule, discovered by examining variations in fixed-weight topology. We present empirical tests showing that all of these nets could successfully learn functions from both discrete (Boolean) and continuous function sets. The networks were largely robust to perturbations of the synaptic weights; the exception was the recurrent connections used to store information, which required a tight tolerance of 0.5%. The cost of these nets scaled approximately in proportion to the total number of synapses. We consider evolving fixed-weight networks tailored to a specific problem class by analyzing the meta-learning cost surface of the networks presented.
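A minimal sketch of the central idea (illustrative only, not the authors' high-order architecture): the recurrent state of a fixed-weight system plays the role of the virtual weights of an inner linear learner, and the fixed dynamics implement a gradient-descent (delta-rule) update per example. The learning rate eta, the helper fixed_weight_step, and the linear inner model are assumptions made for this sketch.

import numpy as np

# Illustrative sketch, NOT the paper's networks: the state vector w is the
# "recurrent signal loop" that stores what has been learned; eta and the
# delta-rule update below act as the fixed "synaptic weights" that never change.
rng = np.random.default_rng(0)
eta = 0.1

def fixed_weight_step(w, x, y_target):
    # Predict with the inner model held in the recurrent state (a second-order
    # interaction of state and input), then fold the error back into the state.
    y = w @ x
    w = w + eta * (y_target - y) * x  # gradient-descent step realized as dynamics
    return w, y

# Online learning of a random linear target: no parameter of the wrapper
# changes, yet the system's input-output mapping converges to the target.
w_true = rng.normal(size=4)
w = np.zeros(4)
for _ in range(200):
    x = rng.normal(size=4)
    w, _ = fixed_weight_step(w, x, w_true @ x)
print("state matches target:", np.allclose(w, w_true, atol=1e-2))

In the paper's nets, an update rule of this kind is itself realized by fixed high-order synapses rather than written out explicitly as code.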
Keywords :
gradient methods; learning (artificial intelligence); recurrent neural nets; Boolean function sets; adaptive dynamic systems; backpropagation; continuous function sets; discrete function sets; dynamic learning; fixed-weight online learning; fixed-weight topology; functional mapping; gradient-descent-based rule; high-order fixed-weight learning nets; meta-learning cost surface; online continual learning; recurrent neural nets; recurrent signal loops; synaptic weight perturbations; synaptic weights; Adaptive systems; Backpropagation; Biological information theory; Costs; Design methodology; Network topology; Neural networks; Recurrent neural networks; Signal design; Testing;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.750553
Filename :
750553