Title :
Support vector regression based LTS-CPBUM neural networks
Author :
Jeng, Jin-Tsong ; Chuang, Chen-Chia ; Chuang, Chi-Ta
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Formosa Univ., Yunlin, Taiwan
Abstract :
In this paper, support vector regression (SVR) based least trimmed squares CPBUM (LTS-CPBUM) neural networks are proposed to alleviate the outlier and noise problems of conventional neural networks. In real applications, the obtained training data may contain outliers and noise. Although CPBUM neural networks have fast convergence, the model has difficulty with training data sets that contain outliers and noise; that is, the robustness of CPBUM neural networks must be enhanced. Hence, an LTS computational architecture is proposed for CPBUM neural networks in this paper. However, the initial structure for the LTS is difficult to determine, so SVR is applied to determine it. Because an SVR approach with the ε-insensitive loss function provides an estimated function within the ε zone, the initial structure of the LTS can be obtained by the SVR approach, which provides a better initialization that is robust against training data with outliers and noise. After the SVR-based LTS step, a gradient-descent learning algorithm is used to adjust the weights of the CPBUM neural networks. It turns out that the proposed SVR-based LTS-CPBUM neural networks converge faster and are more robust against outliers and noise than conventional neural networks with robust mechanisms. Simulation results are provided to show the validity and applicability of the proposed neural networks.
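The core idea of least trimmed squares, as used in the abstract above, can be sketched as follows. This is a minimal illustration, not the paper's method: it uses ordinary least squares on a linear model as a stand-in for the CPBUM network and its SVR-based initialization (a real SVR fit with the ε-insensitive loss would supply the initial inlier set), and the iterative "concentration" trimming shown here is a standard LTS heuristic assumed for illustration.

```python
import numpy as np

def lts_fit(X, y, trim_frac=0.8, n_iter=5):
    # Least trimmed squares via concentration steps: repeatedly refit
    # on the h samples with the smallest squared residuals, so gross
    # outliers are excluded from the loss.
    n = len(y)
    h = int(trim_frac * n)
    # Initial fit on all data (stand-in for the SVR-based
    # initialization described in the paper).
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    for _ in range(n_iter):
        r2 = (y - X @ w) ** 2
        keep = np.argsort(r2)[:h]   # indices of the h smallest residuals
        w, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    return w

# Toy data: y = 2x plus small noise, with a few gross outliers injected.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 50)
y = 2.0 * x + 0.01 * rng.normal(size=50)
y[:5] += 10.0                       # contaminate 10% of the samples
X = x[:, None]
w = lts_fit(X, y)
print(w)  # slope stays close to 2 despite the outliers
```

A plain least-squares fit on the same data would be pulled toward the contaminated points; trimming the largest residuals at each step is what gives LTS its robustness, which is the property the paper adds to CPBUM networks.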
Keywords :
learning (artificial intelligence); least squares approximations; multilayer perceptrons; radial basis function networks; recurrent neural nets; regression analysis; support vector machines; ε-insensitive loss function; LTS computational architecture; LTS-CPBUM neural networks; SVR; general regression neural network; learning algorithm; least trimmed squares CPBUM neural networks; piecewise smooth networks; radial basis function network; recurrent neural networks; support vector regression; time delay input multilayered perceptron; training data; wavelet networks; Biological neural networks; Chebyshev approximation; Noise; Recurrent neural networks; Robustness; Training data; CPBUM neural networks; Outliers; least trimmed squares; robust mechanism; support vector regression;
Conference_Titel :
Proceedings of SICE Annual Conference (SICE), 2011
Conference_Location :
Tokyo
Print_ISBN :
978-1-4577-0714-8