DocumentCode :
1796525
Title :
On simplification of chaotic neural network on incremental learning
Author :
Deguchi, Tadayoshi ; Takahashi, Tatsuro ; Ishii, Naohiro
Author_Institution :
Dept. of Electr. & Comput. Eng., Gifu Nat. Coll. of Technol., Motosu, Japan
fYear :
2014
fDate :
June 30, 2014 - July 2, 2014
Firstpage :
1
Lastpage :
4
Abstract :
Incremental learning is a method for composing an associative memory with a chaotic neural network; it provides larger capacity than correlative learning at the cost of a large amount of computation. A chaotic neuron performs a spatio-temporal sum, and the temporal sum makes the learning robust to input noise. When there is no noise in the input, the neuron may not need the temporal sum. In this paper, to reduce the computation, a simplified network without the temporal sum is introduced and investigated through computer simulations, comparing it with the conventional network. It turns out that the simplified network has the same capacity as, and can learn faster than, the usual network.
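The abstract contrasts a full chaotic neuron, whose internal states accumulate a decayed history of past inputs (the temporal sum), with a simplified neuron that keeps only the spatial sum of the current step. The sketch below illustrates that difference in Python, assuming the standard chaotic neuron formulation with decay parameters kf and kr; the function names, parameter values, and update details are illustrative assumptions, not the paper's exact equations.

import numpy as np

def chaotic_step(eta, zeta, x, W, kf=0.2, kr=0.9, alpha=2.0, a=0.0, eps=0.015):
    """Full chaotic neuron update: the temporal sum is carried by the decayed
    internal states eta (feedback) and zeta (refractoriness)."""
    eta_new = kf * eta + W @ x              # decayed past inputs plus current spatial sum
    zeta_new = kr * zeta - alpha * x + a    # decayed refractory term
    y = 1.0 / (1.0 + np.exp(-(eta_new + zeta_new) / eps))  # sigmoid output
    return eta_new, zeta_new, y

def simplified_step(x, W, alpha=2.0, a=0.0, eps=0.015):
    """Simplified update without the temporal sum: only the current spatial sum
    and the instantaneous refractory term are used."""
    y = W @ x - alpha * x + a
    return 1.0 / (1.0 + np.exp(-y / eps))

# Toy usage: 4 neurons, random weights with zero self-connections, one synchronous step.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)); np.fill_diagonal(W, 0.0)
x = rng.integers(0, 2, size=4).astype(float)
eta = np.zeros(4); zeta = np.zeros(4)
eta, zeta, x_full = chaotic_step(eta, zeta, x, W)
x_simple = simplified_step(x, W)

Dropping the two internal-state arrays removes the per-neuron decay bookkeeping each step, which is the source of the computational saving the paper reports.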
Keywords :
chaos; content-addressable storage; learning (artificial intelligence); neural nets; associative memory; chaotic neural network; chaotic neuron; correlative learning; incremental learning; simplified network; spatio-temporal sum; Biological neural networks; Computational complexity; Electronic mail; Neurons; Noise; Noise measurement;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing (SNPD), 2014 15th IEEE/ACIS International Conference on
Conference_Location :
Las Vegas, NV
Type :
conf
DOI :
10.1109/SNPD.2014.6888706
Filename :
6888706