DocumentCode
295952
Title
An error perturbation for learning and detection of local minima in binary 3-layered neural networks
Author
Yatsuzuka, Yohtaro
Author_Institution
Res. & Dev. Lab., Kokusai Denshin Denwa Co. Ltd., Kamifukuoka, Japan
Volume
1
fYear
1995
fDate
Nov/Dec 1995
Firstpage
63
Abstract
In binary multilayer neural networks trained with a backpropagation algorithm, achieving quick and stable convergence in binary space is a major issue for a wide range of applications. We propose a learning technique in which tenacious local minima are evaded by perturbing the unit output errors in the output layer in both polarity and magnitude. Simulation results showed that a binary 3-layered neural network can converge very rapidly in binary space, with little sensitivity to the initial weights, while providing high generalization ability. It is also pointed out that tenacious local minima can be detected by monitoring the minimum magnitude of the unit output errors for the erroneous binary outputs, and that overtraining with respect to generalization performance on test inputs can be roughly estimated by monitoring the minimum and maximum magnitudes of the unit output errors for the correct binary outputs.
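To make the mechanism described in the abstract concrete, the following is a minimal sketch in Python/NumPy of the general idea only: a 3-layered binary network trained by backpropagation, whose output-layer errors are perturbed in polarity and magnitude once a stall is detected by monitoring the minimum error magnitude over the erroneous binary outputs. The perturbation rule, the factor PERTURB_SCALE, the window STALL_WINDOW, and the XOR task are illustrative assumptions, not the paper's exact procedure.

```python
# Illustrative sketch only: a binary 3-layered (2-2-1) network trained by
# backpropagation on XOR, with a perturbation applied to the output-layer
# unit errors in polarity and magnitude when training appears stuck.
# The stall detector monitors the minimum |error| over the erroneous
# binary outputs, as the abstract suggests; the exact rule, PERTURB_SCALE
# and STALL_WINDOW are assumptions made for this demo.
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
T = np.array([[0], [1], [1], [0]], dtype=float)              # binary targets

W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

ETA = 0.5            # learning rate
PERTURB_SCALE = 2.0  # magnitude factor of the perturbation (assumed)
STALL_WINDOW = 200   # epochs of an unchanged monitor value before perturbing (assumed)
prev_min, stall = None, 0

for epoch in range(20000):
    H = sigmoid(X @ W1 + b1)          # hidden layer
    Y = sigmoid(H @ W2 + b2)          # output layer
    E = T - Y                         # unit output errors
    wrong = (Y > 0.5) != (T > 0.5)    # erroneous binary outputs
    if not wrong.any():
        break                         # converged in binary space

    # Detection: minimum |error| magnitude over the erroneous binary outputs.
    min_err = np.abs(E)[wrong].min()
    stall = stall + 1 if (prev_min is not None and abs(min_err - prev_min) < 1e-4) else 0
    prev_min = min_err

    if stall >= STALL_WINDOW:
        # Perturbation: flip polarity and enlarge magnitude of the output
        # errors of the erroneous units (an illustrative rule only).
        E = np.where(wrong, -PERTURB_SCALE * E, E)
        prev_min, stall = None, 0

    # Standard backpropagation using the (possibly perturbed) output errors.
    dY = E * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 += ETA * H.T @ dY; b2 += ETA * dY.sum(axis=0)
    W1 += ETA * X.T @ dH; b1 += ETA * dH.sum(axis=0)

H = sigmoid(X @ W1 + b1); Y = sigmoid(H @ W2 + b2)
print("epochs:", epoch + 1, "erroneous binary outputs:", int(((Y > 0.5) != (T > 0.5)).sum()))
```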
Keywords
backpropagation; convergence; feedforward neural nets; generalisation (artificial intelligence); minimax techniques; perturbation techniques; binary multilayer neural networks; binary space; error perturbation; generalization; learning technique; local minima detection; output errors; overtraining; Artificial neural networks; Backpropagation algorithms; Convergence; Error correction; Fault diagnosis; Intelligent networks; Knowledge acquisition; Monitoring; Multi-layer neural network; Neural networks; Testing
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 1995 IEEE International Conference on Neural Networks (ICNN '95)
Conference_Location
Perth, WA
Print_ISBN
0-7803-2768-3
Type
conf
DOI
10.1109/ICNN.1995.487878
Filename
487878
Link To Document