DocumentCode
2009795
Title
Two-stage parallel partial retraining scheme for defective multi-layer neural networks
Author
Yamamori, Kunihito ; Abe, Tom ; Horiguchi, Susumu
Author_Institution
Japan Adv. Inst. of Sci. & Technol., Ishikawa, Japan
Volume
2
fYear
2000
fDate
14-17 May 2000
Firstpage
642
Abstract
We address a high-speed defect compensation method for multi-layer neural networks implemented in hardware devices. To compensate for stuck defects in neurons and weights, we have proposed a partial retraining scheme that adjusts the weights of the neurons affected by a stuck defect between two layers using the backpropagation (BP) algorithm. Since the defect compensation functions can be realized with the learning circuits, chip area can be saved. Because the number of weights to adjust is reduced, the scheme also achieves high-speed defect compensation. We propose a two-stage partial retraining scheme to compensate for input-unit stuck defects. Our simulation results show that the two-stage partial retraining scheme can be about 100 times faster than retraining the whole network with the BP algorithm.
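The sketch below is only an illustration of the partial retraining idea described in the abstract, not the authors' implementation. It assumes a small one-hidden-layer sigmoid network trained on XOR, models a stuck-at-0 defect on one hidden unit, and then retrains only the weights of the layer fed by the defective unit while the rest of the network stays frozen; all names (W1, W2, stuck, lr) and the network size are illustrative choices.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy task: 2-bit XOR with a 2-4-1 sigmoid network (illustrative sizes).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output

def forward(X, stuck=None, stuck_value=0.0):
    h = sigmoid(X @ W1 + b1)
    if stuck is not None:           # model a stuck-at defect on one hidden unit
        h = h.copy()
        h[:, stuck] = stuck_value
    y = sigmoid(h @ W2 + b2)
    return h, y

lr = 1.0

# 1) Ordinary whole-network BP training on the fault-free network.
for _ in range(5000):
    h, y = forward(X)
    d_out = (y - T) * y * (1 - y)            # output-layer error term
    d_hid = (d_out @ W2.T) * h * (1 - h)     # hidden-layer error term
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid; b1 -= lr * d_hid.sum(axis=0)

# 2) Inject a stuck-at-0 defect on hidden unit 0 and measure the damage.
stuck = 0
_, y_fault = forward(X, stuck=stuck)
print("MSE after defect:", float(np.mean((y_fault - T) ** 2)))

# 3) Partial retraining: only the weights of the layer fed by the defective
#    unit (W2, b2) are adjusted; W1, b1 stay frozen, so far fewer weights move.
for _ in range(2000):
    h, y = forward(X, stuck=stuck)
    d_out = (y - T) * y * (1 - y)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)

_, y_fixed = forward(X, stuck=stuck)
print("MSE after partial retraining:", float(np.mean((y_fixed - T) ** 2)))

Because only the downstream weights are updated, each compensation step touches a small fraction of the network's parameters, which is the source of the speedup over whole-network retraining; the two-stage variant in the paper extends this idea to stuck defects on input units.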
Keywords
backpropagation; fault tolerant computing; multilayer perceptrons; parallel processing; backpropagation; defective multilayer neural networks; high-speed defect compensation; learning circuits; neuron weights; simulation results; stuck defects; two-stage parallel partial retraining;
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the Fourth International Conference/Exhibition on High Performance Computing in the Asia-Pacific Region, 2000
Conference_Location
Beijing, China
Print_ISBN
0-7695-0589-2
Type
conf
DOI
10.1109/HPC.2000.843515
Filename
843515
Link To Document