Title of article :
A new convex objective function for the supervised learning of single-layer neural networks
Author/Authors :
Oscar Fontenla-Romero, Bertha Guijarro-Berdiñas, Beatriz Pérez-Sánchez, Amparo Alonso-Betanzos
Issue Information :
Journal issue, serial year 2010
Pages :
9
From page :
1984
To page :
1992
Abstract :
This paper proposes a novel supervised learning method for single-layer feedforward neural networks. The approach uses an alternative objective function to the one based on the MSE, which measures the errors before the neuron's nonlinear activation functions instead of after them. In this case, the solution can be easily obtained by solving systems of linear equations, i.e., requiring much less computational power than that associated with the regular methods. A theoretical study is included to prove the approximate equivalence between the global optimum of the objective function based on the regular MSE criterion and that of the proposed alternative MSE function. Furthermore, it is shown that the presented method allows incremental and distributed learning. An exhaustive experimental study is also presented to verify the soundness and efficiency of the method. This study contains 10 classification and 16 regression problems. In addition, a comparison with other high-performance learning algorithms shows that the proposed method exhibits, on average, the highest performance with low computational requirements.
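As a rough illustration of the idea summarized above (not the authors' exact objective, which the abstract does not give in full), the following sketch fits a single-layer network with a logistic activation by mapping the targets through the inverse activation and solving an ordinary least-squares problem; the function names, the unweighted formulation, and the synthetic data are assumptions introduced here for illustration only.

import numpy as np

# Sketch: learning a single-layer network y = f(W x + b) by measuring the
# error BEFORE the nonlinearity. Targets d are mapped through the inverse
# activation, z = f^{-1}(d), and the weights come from a linear
# least-squares problem (a system of linear equations) rather than
# iterative descent on the regular post-activation MSE.
# Simplified, unweighted sketch; not the paper's exact formulation.

def inverse_logistic(d, eps=1e-6):
    """Inverse of the logistic activation (logit), clipped for numerical safety."""
    d = np.clip(d, eps, 1.0 - eps)
    return np.log(d / (1.0 - d))

def fit_single_layer(X, D):
    """X: (n_samples, n_inputs), D: (n_samples, n_outputs), targets in (0, 1)."""
    Z = inverse_logistic(D)                        # desired pre-activation values
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    W, *_ = np.linalg.lstsq(Xb, Z, rcond=None)     # solve the linear system
    return W                                       # shape (n_inputs + 1, n_outputs)

def predict(W, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return 1.0 / (1.0 + np.exp(-(Xb @ W)))         # logistic activation

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    D = (X[:, :1] + X[:, 1:2] > 0).astype(float)   # toy binary targets
    W = fit_single_layer(X, D)
    acc = np.mean((predict(W, X) > 0.5) == (D > 0.5))
    print(f"training accuracy: {acc:.3f}")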
Keywords :
Single-layer neural networks, Supervised learning method, Convex optimization, Least squares, Incremental learning, Global optimum
Journal title :
PATTERN RECOGNITION
Serial Year :
2010
Record number :
1733508