DocumentCode :
1383901
Title :
The local minima-free condition of feedforward neural networks for outer-supervised learning
Author :
Huang, De-Shuang
Author_Institution :
Beijing Inst. of Syst. Eng., China
Volume :
28
Issue :
3
fYear :
1998
fDate :
6/1/1998
Firstpage :
477
Lastpage :
480
Abstract :
In this paper, the local minima-free conditions of outer-supervised feedforward neural networks (FNNs) based on batch-style learning are studied by means of the embedded subspace method. It is proven that if the condition that the number of hidden neurons is not less than the number of training samples is satisfied (a condition which is sufficient but not necessary), the network will necessarily converge to a global minimum with null cost, and that the condition that the range space of the outer-supervised signal matrix is included in the range space of the hidden output matrix is a necessary and sufficient condition for the error surface to be free of local minima. In addition, under the condition that the number of hidden neurons is less than the number of training samples but greater than the number of output neurons, it is demonstrated that only global minima with null cost will exist in the error surface if the first-layer weights are adequately selected.
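The sufficient condition summarized above can be illustrated numerically: when the number of hidden neurons is at least the number of training samples, the hidden output matrix almost surely has full row rank, so the targets lie in its range space and least squares recovers output weights with null training cost. The following is a minimal sketch under assumed dimensions and random first-layer weights (all names here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 5, 8, 2                      # samples, hidden neurons (M >= N), outputs
X = rng.normal(size=(N, 3))            # input patterns
W1 = rng.normal(size=(3, M))           # first-layer weights, chosen randomly
H = np.tanh(X @ W1)                    # hidden output matrix, shape N x M
T = rng.normal(size=(N, K))            # outer-supervised target matrix

# With M >= N, H almost surely has rank N, so range(T) is contained in
# range(H) and the least-squares output weights attain zero cost.
W2, *_ = np.linalg.lstsq(H, T, rcond=None)
residual = np.linalg.norm(H @ W2 - T)
print(residual)
```

The residual is at machine-precision level, consistent with the claim that the global minimum with null cost is attainable under this condition.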
Keywords :
feedforward neural nets; learning (artificial intelligence); batch-style learning; embedded subspace method; error surface; feedforward neural networks; global minima; hidden neurons; local minima-free condition; outer-supervised learning; range space; Cost function; Feedforward neural networks; Least squares approximation; Neural networks; Neurons; Resonance light scattering; Sufficient conditions; Systems engineering and theory;
fLanguage :
English
Journal_Title :
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Publisher :
IEEE
ISSN :
1083-4419
Type :
jour
DOI :
10.1109/3477.678658
Filename :
678658