DocumentCode :
2633551
Title :
Load sharing in the training set partition algorithm for parallel neural learning
Author :
Girau, B. ; Paugam-Moisy, H.
Author_Institution :
Lab. d'Inf. du Parallélisme, CNRS, Lyon, France
fYear :
1995
fDate :
25-28 Apr 1995
Firstpage :
586
Lastpage :
591
Abstract :
A parallel back-propagation algorithm that partitions the training set over a ring of processors has previously been introduced. In this paper, we study the performance of this algorithm on MIMD machines and develop a new version based on heterogeneous load sharing. Algebraic models allow precise comparisons between the different methods and show significant improvements in the case of parallel learning.
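The heterogeneous load sharing mentioned in the abstract amounts to giving each processor on the ring a training-set partition sized in proportion to its speed, rather than an equal share. A minimal sketch of such a proportional split, assuming a simple largest-remainder rounding scheme (the function name and rounding rule are illustrative, not taken from the paper):

```python
# Hypothetical sketch of heterogeneous load sharing: split a training set
# across ring processors in proportion to their relative speeds.
# The rounding scheme (largest remainder) is an assumption for illustration.

def partition_sizes(n_examples, speeds):
    """Return one chunk size per processor, proportional to its speed."""
    total = sum(speeds)
    # Provisional real-valued shares of the training set.
    shares = [n_examples * s / total for s in speeds]
    sizes = [int(x) for x in shares]
    # Hand the leftover examples to the largest fractional remainders.
    leftover = n_examples - sum(sizes)
    order = sorted(range(len(speeds)),
                   key=lambda i: shares[i] - sizes[i],
                   reverse=True)
    for i in order[:leftover]:
        sizes[i] += 1
    return sizes

# A processor twice as fast receives twice as many training examples.
print(partition_sizes(1000, [1.0, 2.0, 1.0]))  # [250, 500, 250]
```

With equal speeds this reduces to the original uniform partition, so the homogeneous algorithm is the special case `speeds = [1.0] * p`.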
Keywords :
backpropagation; parallel algorithms; resource allocation; algebraic models; heterogeneous load sharing; parallel back-propagation algorithm; parallel learning; parallel neural learning; training set partition algorithm; Asynchronous communication; Communication standards; Computer networks; Concurrent computing; Distributed computing; Load management; Multi-layer neural network; Neural networks; Partitioning algorithms; Stochastic processes;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 9th International Parallel Processing Symposium, 1995
Conference_Location :
Santa Barbara, CA
Print_ISBN :
0-8186-7074-6
Type :
conf
DOI :
10.1109/IPPS.1995.395888
Filename :
395888