Title :
An implementation of backpropagation algorithm on a massively parallel processor
Author :
Omidvar, O.M. ; Wilson, C.L.
Author_Institution :
Univ. of the District of Columbia, Washington, DC, USA
Abstract :
The backpropagation learning algorithm has been modified to operate in a massively parallel environment. The network has 1024 neurons in the input layer and two hidden layers, either of which can be activated on demand. The first hidden layer has 256 neurons and the second has 64. The output layer represents ten output classes. The network operates in a concurrent and parallel manner: all incoming signals are fed to the input layer at the same time, then processed and passed to subsequent stages in tandem. Connections are created at random with normal distributions. The error is computed not one class at a time but simultaneously for all classes, and the resulting per-class errors are combined into the total network error. The network is applied to the task of character recognition using Gabor image coefficients.
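For orientation, the minimal sketch below mirrors the layer sizes stated in the abstract (1024 inputs, hidden layers of 256 and 64 neurons, ten output classes), with normally distributed random connections and the per-class errors summed into a single total error. Everything else (sigmoid activations, the weight scaling, the learning rate, and plain serial NumPy in place of the massively parallel processor) is an assumption for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes from the abstract: 1024 input neurons, hidden layers of 256 and 64 neurons,
# and ten output classes.
SIZES = [1024, 256, 64, 10]

# Connections created at random with a normal distribution (the 1/sqrt(n) scale is an assumption).
weights = [rng.normal(0.0, 1.0 / np.sqrt(n_in), size=(n_in, n_out))
           for n_in, n_out in zip(SIZES[:-1], SIZES[1:])]
biases = [np.zeros(n_out) for n_out in SIZES[1:]]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """Feed the whole input vector through each layer in turn, keeping activations for backprop."""
    activations = [x]
    for W, b in zip(weights, biases):
        x = sigmoid(x @ W + b)
        activations.append(x)
    return activations

def backprop_step(x, target, lr=0.1):
    """One gradient step; the output error is formed simultaneously for all ten classes."""
    activations = forward(x)
    out = activations[-1]
    # Per-class errors computed at once, then summed into the total network error.
    total_error = 0.5 * np.sum((out - target) ** 2)
    delta = (out - target) * out * (1.0 - out)
    for layer in reversed(range(len(weights))):
        a_prev = activations[layer]
        # Propagate the error to the previous layer before this layer's weights are updated.
        new_delta = ((delta @ weights[layer].T) * a_prev * (1.0 - a_prev)
                     if layer > 0 else None)
        weights[layer] -= lr * np.outer(a_prev, delta)
        biases[layer] -= lr * delta
        delta = new_delta
    return total_error

# Example usage: one update on a random stand-in for 1024 Gabor image coefficients, class 3.
x = rng.normal(size=1024)
target = np.eye(10)[3]
print(backprop_step(x, target))
```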
Keywords :
character recognition; learning systems; neural nets; parallel algorithms; Gabor image coefficients; backpropagation algorithm; character recognition; hidden layers; learning algorithm; massively parallel processor; multilayer neural networks; Backpropagation algorithms; Character recognition; Feedforward neural networks; Gaussian distribution; Image recognition; Image segmentation; NIST; Neural networks; Neurons; Signal processing;
Conference_Title :
Proceedings of the Twenty-Third Southeastern Symposium on System Theory, 1991
Conference_Location :
Columbia, SC
Print_ISBN :
0-8186-2190-7
DOI :
10.1109/SSST.1991.138577