DocumentCode :
2697967
Title :
A gradient descent method for a neural fractal memory
Author :
Melnik, Ofer ; Pollack, Jordan
Author_Institution :
Volen Center, Brandeis Univ., Waltham, MA, USA
Volume :
2
fYear :
1998
fDate :
4-9 May 1998
Firstpage :
1069
Abstract :
It has been demonstrated that higher-order recurrent neural networks exhibit an underlying fractal attractor as an artifact of their dynamics. These fractal attractors offer a very efficient mechanism for encoding visual memories in a neural substrate, since even a simple twelve-weight network can encode a very large set of different images. The main problem in this memory model, which has so far remained unaddressed, is how to train the networks to learn these different attractors. Following other neural training methods, this paper proposes a gradient descent method to learn the attractors. The method is based on an error function which examines the effect of the current network transform on the desired fractal attractor. It is tested across a bank of different target fractal attractors and at different noise levels. The results show positive performance across three error measures.
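The sketch below is only an illustration of the general idea described in the abstract, not the authors' implementation: a small iterated-function-system-style "fractal memory" with twelve weights is trained by gradient descent on an error that compares the target attractor with its image under the current transform. The two affine maps, the symmetric nearest-point error, and the numerical gradients are all assumptions made for the example.

# Illustrative sketch (assumptions: two 2-D affine maps stand in for the
# network, the target attractor is a point cloud, gradients are numerical).
import numpy as np

rng = np.random.default_rng(0)


def apply_maps(weights, points):
    """Apply every affine map to every point.

    weights has shape (2, 6); each row (a, b, c, d, e, f) encodes
    x' = a*x + b*y + e,  y' = c*x + d*y + f.
    """
    images = []
    for a, b, c, d, e, f in weights:
        x, y = points[:, 0], points[:, 1]
        images.append(np.stack([a * x + b * y + e, c * x + d * y + f], axis=1))
    return np.concatenate(images, axis=0)


def attractor_error(weights, target):
    """Symmetric, Hausdorff-like distance between the target attractor and
    its image under the current maps (one illustrative choice of error,
    not the paper's measure)."""
    image = apply_maps(weights, target)
    dists = np.linalg.norm(image[:, None, :] - target[None, :, :], axis=2)
    # image -> target term plus target -> image (coverage) term.
    return dists.min(axis=1).mean() + dists.min(axis=0).mean()


def numerical_grad(weights, target, eps=1e-4):
    """Central-difference gradient of the error with respect to the weights."""
    grad = np.zeros_like(weights)
    for idx in np.ndindex(weights.shape):
        w_hi, w_lo = weights.copy(), weights.copy()
        w_hi[idx] += eps
        w_lo[idx] -= eps
        grad[idx] = (attractor_error(w_hi, target) -
                     attractor_error(w_lo, target)) / (2 * eps)
    return grad


# Build a target attractor with the chaos game from a known two-map IFS.
true_maps = np.array([[0.5, 0.0, 0.0, 0.5, 0.0, 0.0],
                      [0.5, 0.0, 0.0, 0.5, 0.5, 0.5]])
p, samples = np.zeros(2), []
for _ in range(600):
    a, b, c, d, e, f = true_maps[rng.integers(2)]
    p = np.array([a * p[0] + b * p[1] + e, c * p[0] + d * p[1] + f])
    samples.append(p)
target = np.array(samples[200:])  # discard the transient

# Plain gradient descent on the twelve weights.
weights = rng.normal(scale=0.1, size=(2, 6))
for step in range(101):
    weights -= 0.2 * numerical_grad(weights, target)
    if step % 25 == 0:
        print(f"step {step:3d}  error {attractor_error(weights, target):.4f}")

In practice one would replace the numerical gradients with analytic ones and use an error derived, as in the paper, from how the network transform acts on the desired attractor; the symmetric nearest-point distance above is simply one easy-to-compute proxy.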
Keywords :
conjugate gradient methods; digital storage; fractals; recurrent neural nets; error function; fractal attractor; gradient descent method; high-order recurrent neural networks; neural fractal memory; neural substrate; neural training methods; noise; Fractals; Image coding; Image generation; Neural networks; Neurons; Noise level; Pressing; Recurrent neural networks; Testing; Tree data structures;
fLanguage :
English
Publisher :
ieee
Conference_Title :
1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence
Conference_Location :
Anchorage, AK
ISSN :
1098-7576
Print_ISBN :
0-7803-4859-1
Type :
conf
DOI :
10.1109/IJCNN.1998.685920
Filename :
685920