Title :
Asymptotic optimality of competitive associative nets for their learning in function approximation
Author_Institution :
Dept. of Control Eng., Kyushu Inst. of Technol., Kitakyushu, Japan
Abstract :
The competitive associative nets called CAN2 combine competitive and associative schemes to learn piecewise linear approximations of nonlinear functions. Although these learning schemes have proven effective in many applications, such as function approximation, control, and rainfall estimation, competitive learning inherently suffers from local minima. To overcome this problem, we introduce an asymptotic situation in which the number of units is very large, and derive a condition of asymptotic optimality for minimizing the mean square error (MSE) of the approximation. We then embed this condition into the incremental learning algorithm, where it is used to decide whether the learning process is stuck at a local minimum and, if so, to reinitialize a unit to move toward the global optimum. Through numerical experiments with a number of benchmark functions, we verify that the CAN2 with the present algorithm achieves a smaller MSE than widely used learning and optimization schemes, namely the BPN (backpropagation net), the RBFN (radial basis function net), and the SVR (support vector regression).
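Illustrative_Sketch :
The record contains no code, so the following Python sketch only illustrates the kind of scheme the abstract describes: units holding centroid vectors (trained competitively) and local linear models (trained associatively), plus a reinitialization step for escaping local minima. It assumes scalar outputs and an LMS-style associative update, and the dominance test in reinit_if_stuck is a hypothetical stand-in for the paper's asymptotic-optimality condition; all names and parameters are invented for illustration, not taken from the paper.

    import numpy as np

    class CAN2Sketch:
        """Minimal piecewise-linear approximator in the spirit of CAN2.

        Each unit j holds a centroid w_j (competitive scheme) and a local
        linear model y = m_j . [x; 1] (associative scheme). Illustrative
        sketch only, not the authors' algorithm.
        """

        def __init__(self, n_units, dim, lr_w=0.05, lr_m=0.1, seed=0):
            rng = np.random.default_rng(seed)
            self.W = rng.uniform(-1.0, 1.0, size=(n_units, dim))  # centroids
            self.M = np.zeros((n_units, dim + 1))  # local linear models
            self.mse = np.zeros(n_units)           # running per-unit MSE
            self.lr_w, self.lr_m = lr_w, lr_m

        def _winner(self, x):
            # Competitive scheme: the unit whose centroid is nearest to x wins.
            return int(np.argmin(np.sum((self.W - x) ** 2, axis=1)))

        def predict(self, x):
            return self.M[self._winner(x)] @ np.append(x, 1.0)

        def train_step(self, x, y):
            c = self._winner(x)
            # Competitive learning: move the winner's centroid toward x.
            self.W[c] += self.lr_w * (x - self.W[c])
            # Associative learning: LMS update of the winner's linear model.
            xe = np.append(x, 1.0)
            err = y - self.M[c] @ xe
            self.M[c] += self.lr_m * err * xe
            # Exponentially weighted estimate of this unit's MSE.
            self.mse[c] += 0.1 * (err ** 2 - self.mse[c])

        def reinit_if_stuck(self, tol=2.0):
            # Hypothetical stand-in for the asymptotic-optimality test:
            # if one unit's error dominates, reinitialize the lowest-error
            # unit next to it so the region is split more finely.
            worst, best = int(np.argmax(self.mse)), int(np.argmin(self.mse))
            if self.mse[worst] > tol * self.mse.mean():
                self.W[best] = self.W[worst] + 1e-2
                self.M[best] = self.M[worst].copy()
                self.mse[best] = self.mse[worst] = self.mse.mean()

A typical usage loop alternates incremental training with the reinitialization check, e.g. when approximating a 1-D nonlinear function:

    rng = np.random.default_rng(1)
    net = CAN2Sketch(n_units=20, dim=1)
    for epoch in range(200):
        for _ in range(100):
            x = rng.uniform(-1.0, 1.0, size=1)
            net.train_step(x, np.sin(3.0 * x[0]))
        net.reinit_if_stuck()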
Keywords :
function approximation; mean square error methods; neural networks; optimization; unsupervised learning; CAN2; asymptotic optimality; competitive associative networks; competitive learning; incremental learning algorithm; mean square error; approximation algorithms; least squares approximation; piecewise linear approximation; predictive models; control engineering; data engineering; vectors
Conference_Title :
Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002
Print_ISBN :
981-04-7524-1
DOI :
10.1109/ICONIP.2002.1202222