Title :
Winner-take-all neural network for visual handwritten character recognition
Author :
Tayel, M. ; Shalaby, H. ; Saleh, H.
Author_Institution :
Dept. of Electr. Eng., Alexandria Univ., Egypt
Abstract :
A neural network model of visual pattern recognition, called the Neocognitron, was previously proposed by Kunihiko Fukushima (1988). After training, it can recognize an input pattern without being affected by a change in size or a shift in position. The Fukushima model possesses several appealing properties, so its application is not restricted to pattern recognition; it can be applied to many other fields if its details are modified appropriately. With the model's abilities of selective attention, gain control, and perfect recall of deformed patterns, it can, as a model of associative memory, also address the data compression problem. This article proposes a learning scheme that embeds the Karhunen-Loeve (K-L) transform basis technique into the structure of a Fukushima-based neural network to compress Arabic alphabetic pattern data and to reduce the input dimensionality for network training. Nevertheless, the input pattern can still be recognized and reconstructed from a few local features. The proposed scheme converges the connection weight vectors to the principal eigenvectors, which retain the maximum information contained in the Arabic alphabetic pattern set in a few significant local features and reduce the redundancies present among the inputs to the perceptual layer. The learning process not only leads to efficient data compression and reconstruction but also enhances the network's ability for feature extraction and Arabic alphabetic character recognition.
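The K-L transform step the abstract describes, projecting patterns onto the principal eigenvectors of their covariance to obtain a few compressed local features, can be sketched as follows. This is a minimal illustration with hypothetical random data standing in for the Arabic alphabetic pattern set; it is not the authors' implementation.

```python
import numpy as np

# Toy "pattern set": 100 flattened 8x8 patterns (hypothetical data
# standing in for the Arabic alphabetic patterns used in the paper).
rng = np.random.default_rng(0)
patterns = rng.random((100, 64))

# Karhunen-Loeve (K-L) transform: eigen-decompose the covariance matrix
# of the zero-mean patterns; the leading eigenvectors are the principal
# components that retain maximum variance (information).
mean = patterns.mean(axis=0)
centered = patterns - mean
cov = centered.T @ centered / (len(patterns) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]        # sort descending by eigenvalue
basis = eigvecs[:, order[:8]]            # keep the 8 principal eigenvectors

# Compression: project each 64-dim pattern onto 8 local features.
features = centered @ basis              # shape (100, 8)

# Reconstruction of the input patterns from the few features.
reconstructed = features @ basis.T + mean

# Mean-squared reconstruction error; it shrinks as more eigenvectors
# are retained, trading compression ratio against fidelity.
error = np.mean((patterns - reconstructed) ** 2)
print(features.shape, error)
```

In the paper's scheme the learning rule drives the connection weight vectors toward these same principal eigenvectors, so the network itself performs the projection rather than an explicit eigen-decomposition.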
Keywords :
character recognition; content-addressable storage; data compression; feature extraction; handwriting recognition; learning (artificial intelligence); multilayer perceptrons; self-organising feature maps; transforms; Arabic alphabetic patterns; Fukushima based neural network; Fukushima model; Karhunen-Loeve transform; Neocognitron; associative memory; connection weight vectors; data compression; data reconstruction; deformed patterns recall; feature extraction; gain control; input pattern recognition; local features; network training; neural network model; perceptual layer; principal eigenvectors; selective attention; self-organisation multilayer structure; visual handwritten character recognition; visual pattern recognition; winner-take-all neural network; Associative memory; Character recognition; Data compression; Data mining; Deformable models; Feature extraction; Gain control; Management training; Neural networks; Pattern recognition;
Conference_Titel :
Thirteenth National Radio Science Conference, 1996 (NRSC '96)
Conference_Location :
Cairo
Print_ISBN :
0-7803-3656-9
DOI :
10.1109/NRSC.1996.551115