DocumentCode
3267448
Title
A Recursive Growing and Pruning RBF (GAP-RBF) Algorithm for Function Approximations
Author
Huang, Guang-Bin ; Saratchandran, P. ; Sundararajan, Narasimhan
Author_Institution
School of Electrical and Electronic Engineering, Nanyang Technological University, Nanyang Avenue, Singapore 639798. E-Mail: egbhuang@ntu.edu.sg
fYear
2003
fDate
12-12 June 2003
Firstpage
491
Lastpage
495
Abstract
A new sequential growing and pruning algorithm for RBF networks, referred to as the GAP-RBF algorithm, was proposed in our earlier paper [1]. The GAP-RBF algorithm modifies the growth criterion of Platt's RAN and combines it with a new pruning strategy. Both the growing and the pruning strategies are based on the link between the learning accuracy and the "significance" of the "nearest" or intentionally added new neuron. It has been shown that the GAP-RBF network outperforms RAN, RAN-EKF, and MRAN in the area of function approximation: GAP-RBF significantly increases the learning speed and learning accuracy, and grows and prunes neurons much more smoothly. However, although GAP-RBF can reach higher learning accuracy than RAN, RAN-EKF, and MRAN, its learning error, like theirs, can still be very high for certain applications. This paper proposes a recursive-learning-based sequential learning algorithm for Growing and Pruning RBF (GAP-RBF) networks. The new algorithm is called Recursive-GAP-RBF, and accordingly the original GAP-RBF is referred to as Basic-GAP-RBF. When no new observation arrives, Recursive-GAP-RBF recursively self-adjusts its parameters based on its past observations. Whenever a new observation is presented to the network, Recursive-GAP-RBF works like the basic GAP-RBF.
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 4th International Conference on Control and Automation (ICCA '03)
Conference_Location
Montreal, Que., Canada
Print_ISBN
0-7803-7777-X
Type
conf
DOI
10.1109/ICCA.2003.1595070
Filename
1595070