DocumentCode :
9342
Title :
A New Discrete-Continuous Algorithm for Radial Basis Function Networks Construction
Author :
Long Zhang ; Kang Li ; Haibo He ; George W. Irwin
Author_Institution :
Sch. of Electron., Electr. Eng. & Comput. Sci., Queen's Univ. Belfast, Belfast, UK
Volume :
24
Issue :
11
fYear :
2013
fDate :
Nov. 2013
Firstpage :
1785
Lastpage :
1798
Abstract :
The construction of a radial basis function (RBF) network involves determining the model size, the hidden nodes, and the output weights. Least squares-based subset selection methods can determine an RBF model size and its parameters simultaneously. Although these methods are robust, they may not achieve optimal results. Alternatively, gradient methods are widely used to optimize all the parameters, but most such algorithms may converge slowly because they treat the hidden nodes and output weights separately and ignore their correlation. In this paper, a new discrete-continuous algorithm is proposed for the construction of an RBF model. First, orthogonal least squares (OLS)-based forward stepwise selection constructs an initial model by selecting model terms one by one from a candidate term pool. Then, a new Levenberg-Marquardt (LM)-based parameter optimization is proposed to further optimize the hidden nodes and output weights in the continuous space. To speed up convergence, the proposed parameter optimization exploits the correlation between the hidden nodes and output weights, achieved by converting the output weights into dependent parameters using the OLS method. This correlation is also exploited by the previously proposed continuous forward algorithm (CFA); however, unlike the CFA, the new method optimizes all the parameters simultaneously. In addition, an equivalent recursive sum-of-squared-errors formulation is derived to reduce the computational demand of the first derivatives used in the LM method. A computational complexity analysis confirms that the new method is much more computationally efficient than the CFA. Several numerical examples illustrate the effectiveness of the proposed method. Furthermore, Friedman statistical tests on 13 classification problems show that RBF networks built by the new method are highly competitive with several popular classifiers.
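As a rough illustration of the two-stage idea summarized in the abstract (a discrete OLS-style forward selection stage followed by a continuous LM refinement stage in which the output weights are treated as dependent parameters), the Python sketch below may help. It is not the authors' implementation: the greedy selection criterion, the single shared kernel width, the use of scipy.optimize.least_squares with finite-difference Jacobians instead of the paper's recursive analytic derivatives, and all function names (gaussian_design, forward_select, refine_lm) are illustrative assumptions.

```python
# Minimal two-stage RBF construction sketch (illustrative only, not the paper's algorithm).
import numpy as np
from scipy.optimize import least_squares


def gaussian_design(X, centres, width):
    """Design matrix of Gaussian RBF responses, one column per hidden node."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))


def forward_select(X, y, n_nodes, width):
    """Discrete stage: greedy forward selection that adds, at each step, the
    candidate centre giving the largest reduction in the residual sum of squares."""
    candidates = X.copy()          # candidate term pool: one RBF per training point
    selected = []
    for _ in range(n_nodes):
        scores = []
        for c in candidates:
            phi = gaussian_design(X, np.vstack(selected + [c]), width)
            w, *_ = np.linalg.lstsq(phi, y, rcond=None)
            scores.append(((y - phi @ w) ** 2).sum())
        best = int(np.argmin(scores))
        selected.append(candidates[best])
        candidates = np.delete(candidates, best, axis=0)
    return np.vstack(selected)


def refine_lm(X, y, centres0, width0):
    """Continuous stage: LM refinement of centres and width, with the output
    weights re-solved by least squares inside the residual (dependent parameters)."""
    n, d = centres0.shape

    def residuals(theta):
        centres = theta[:n * d].reshape(n, d)
        width = abs(theta[-1]) + 1e-8
        phi = gaussian_design(X, centres, width)
        w, *_ = np.linalg.lstsq(phi, y, rcond=None)  # weights depend on centres/width
        return y - phi @ w

    theta0 = np.concatenate([centres0.ravel(), [width0]])
    sol = least_squares(residuals, theta0, method="lm")  # Levenberg-Marquardt
    centres = sol.x[:n * d].reshape(n, d)
    width = abs(sol.x[-1]) + 1e-8
    phi = gaussian_design(X, centres, width)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return centres, width, w


if __name__ == "__main__":
    # Toy 1-D regression example.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(60, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
    centres = forward_select(X, y, n_nodes=5, width=1.0)
    centres, width, w = refine_lm(X, y, centres, width0=1.0)
    rmse = np.sqrt(np.mean((y - gaussian_design(X, centres, width) @ w) ** 2))
    print(f"training RMSE after refinement: {rmse:.4f}")
```

Re-solving the output weights by least squares inside the residual function is what makes them dependent parameters here: the LM search varies only the hidden-node parameters, mirroring in a simplified way the coupling between hidden nodes and output weights that the abstract describes.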
Keywords :
computational complexity; gradient methods; least squares approximations; parameter estimation; radial basis function networks; LM-based parameter optimization; Levenberg-Marquardt based parameter optimization; OLS-based forward stepwise selection; RBF model size; RBF network construction; continuous forward algorithm; discrete-continuous algorithm; gradient methods; least squares-based subset selection methods; orthogonal least squares; radial basis function network; Forward stepwise selection; Levenberg–Marquardt (LM); model generalization; orthogonal least squares (OLS); radial basis function (RBF) networks;
fLanguage :
English
Journal_Title :
Neural Networks and Learning Systems, IEEE Transactions on
Publisher :
ieee
ISSN :
2162-237X
Type :
jour
DOI :
10.1109/TNNLS.2013.2264292
Filename :
6547235