Title :
On Memory Capacity in Incremental Learning with Appropriate Refractoriness and Weight Increment
Author :
Deguchi, Toshinori ; Ishii, Naohiro
Author_Institution :
Dept. of Electr. & Comput. Eng., Gifu Nat. Coll. of Technol., Gifu, Japan
Abstract :
Neural networks can learn more patterns with incremental learning than with correlative learning. Incremental learning is a method that composes an associative memory using a chaotic neural network. The capacity of the network has been found to increase with its size, i.e., the number of neurons in the network, and to exceed that of correlative learning. In previous work, the capacity exceeded direct proportion to the network size for suitable pairs of the refractory parameter and the learning parameter. In this paper, the maximum capacity of the networks is investigated by varying the refractory parameter and the learning parameter. Computer simulations show that the capacity is proportional to the network size, with a proportionality constant of about 1.38.
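The abstract's setup can be illustrated with a minimal sketch. This is not the paper's exact formulation: it assumes an Aihara-style chaotic neuron with a refractory (self-inhibition) term, and an incremental rule that increments connection weights by a fixed amount only when a neuron's output disagrees with the pattern being presented. All parameter values (`alpha`, `dw`, the decay constants) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20                      # network size (number of neurons)
alpha = 0.5                 # refractory parameter (assumed value)
dw = 0.05                   # learning parameter: weight increment (assumed)
k_f, k_r = 0.2, 0.9         # decay constants for feedback / refractoriness

W = np.zeros((N, N))        # connection weights, learned incrementally
eta = np.zeros(N)           # feedback internal state
zeta = np.zeros(N)          # refractory internal state

def step(x, pattern):
    """One synchronous update of all neurons with `pattern` as external input."""
    global eta, zeta
    eta = k_f * eta + W @ x            # decaying mutual feedback
    zeta = k_r * zeta - alpha * x      # refractoriness: firing inhibits itself
    u = eta + zeta + pattern
    return np.where(u > 0, 1.0, -1.0)  # binary output for simplicity

def incremental_learn(x, pattern):
    """Increment weights toward the pattern wherever the output disagrees."""
    global W
    wrong = x != pattern               # learn only at mismatched neurons
    W[wrong] += dw * pattern           # weight increment of size dw
    np.fill_diagonal(W, 0.0)           # no self-connections

pattern = rng.choice([-1.0, 1.0], size=N)
x = np.zeros(N)
for _ in range(50):                    # keep the pattern applied while learning
    x = step(x, pattern)
    incremental_learn(x, pattern)

recalled = step(x, pattern)
```

In the paper's experiments, this kind of simulation is repeated while sweeping the refractory and learning parameters to measure how many patterns the network can store, yielding the reported capacity of about 1.38 times the number of neurons.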
Keywords :
chaos; content-addressable storage; learning (artificial intelligence); neural nets; associative memory; chaotic neural network; computer simulation; correlative learning; incremental learning; learning parameter; memory capacity; network capacity; network size; refractory parameter; weight increment; Artificial neural networks; Associative memory; Chaos; Computers; Electronic mail; Error correction; Neurons; capacity; chaotic neural network; incremental learning; refractory;
Conference_Titel :
Computers, Networks, Systems and Industrial Engineering (CNSI), 2011 First ACIS/JNU International Conference on
Conference_Location :
Jeju Island
Print_ISBN :
978-1-4577-0180-1
DOI :
10.1109/CNSI.2011.61