DocumentCode :
1264327
Title :
Parallel, self-organizing, hierarchical neural networks
Author :
Ersoy, Okan K. ; Hong, Daesik
Author_Institution :
Sch. of Electr. Eng., Purdue Univ., West Lafayette, IN, USA
Volume :
1
Issue :
2
fYear :
1990
fDate :
6/1/1990 12:00:00 AM
Firstpage :
167
Lastpage :
178
Abstract :
A new neural-network architecture called the parallel, self-organizing, hierarchical neural network (PSHNN) is presented. The architecture involves a number of stages, each of which can be a particular neural network, referred to as a stage neural network (SNN). At the end of each stage, error detection is carried out and a number of input vectors are rejected. Between two stages there is a nonlinear transformation of the input vectors rejected by the previous stage. The architecture has many desirable properties, such as optimized system complexity (in the sense of a self-organizing, minimized number of stages), high classification accuracy, minimized learning and recall times, and a truly parallel organization in which all stages operate simultaneously, without waiting for data from other stages, during testing. The experiments performed indicated the superiority of the new architecture over multilayered networks with back-propagation training.
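The stage-by-stage reject-and-transform scheme described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the stage classifier, the confidence-margin rejection rule, the threshold value, and the `tanh` inter-stage transform are all hypothetical stand-ins (the paper leaves these design choices to each SNN), and the loop runs the stages serially even though the paper stresses that all stages operate in parallel during testing.

```python
import numpy as np

rng = np.random.default_rng(0)

def nonlinear_transform(X):
    # Hypothetical nonlinear transformation applied to vectors rejected
    # by a stage before they are passed to the next stage.
    return np.tanh(X)

class Stage:
    """One stage neural network (SNN), sketched here as a linear classifier."""
    def __init__(self, n_in, n_out):
        self.W = rng.normal(scale=0.1, size=(n_in, n_out))

    def predict(self, X):
        scores = X @ self.W
        labels = scores.argmax(axis=1)
        # Error detection (assumed form): accept an input only if the gap
        # between its top two class scores exceeds a threshold.
        ranked = np.sort(scores, axis=1)
        accepted = (ranked[:, -1] - ranked[:, -2]) > 0.05  # hypothetical threshold
        return labels, accepted

def pshnn_predict(stages, X):
    """Classify X with a cascade of stages; rejected inputs are
    nonlinearly transformed and handed to the next stage."""
    out = np.full(X.shape[0], -1)
    active = np.arange(X.shape[0])          # indices still unresolved
    for stage in stages:
        labels, accepted = stage.predict(X)
        out[active] = labels                # tentative labels for active inputs
        keep = ~accepted                    # rejected vectors continue onward
        active, X = active[keep], nonlinear_transform(X[keep])
        if active.size == 0:
            break
    return out

stages = [Stage(4, 3) for _ in range(3)]
X = rng.normal(size=(8, 4))
y = pshnn_predict(stages, X)
```

Note that each accepted input is finalized at the earliest stage that classifies it confidently, which is what keeps recall time low: later stages only ever see the (transformed) residue of earlier rejections.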
Keywords :
neural nets; parallel architectures; self-adjusting systems; error detection; hierarchical neural networks; input vectors; parallel architectures; self organising neural nets; Artificial neural networks; Automatic testing; Fault tolerance; Multi-layer neural network; Neural networks; Parallel architectures; Robustness; Signal representations; System testing; Temperature;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.80229
Filename :
80229