Title :
Approximation, dimension reduction, and nonconvex optimization using linear superpositions of Gaussians
Author :
Saha, Avijit ; Wu, Chuan-lin ; Tang, Dun-Sung
Author_Institution :
Adv. Workstation Div., IBM Corp., Austin, TX, USA
Date :
10/1/1993
Abstract :
This paper concerns neural network approaches to function approximation and optimization using linear superpositions of Gaussians (popularly known as radial basis function (RBF) networks). The problem of function approximation is one of estimating an underlying function f, given samples of the form {(y_i, x_i); i = 1, 2, ..., n; with y_i = f(x_i)}. When the dimension of the input is high and the number of samples is small, estimation of the function becomes difficult due to the sparsity of samples in local regions. The authors find that this problem of high dimensionality can be overcome to some extent by using linear transformations of the input in the Gaussian kernels. Such transformations induce intrinsic dimension reduction and can be exploited for identifying key factors of the input and for the phase-space reconstruction of dynamical systems, without explicitly computing the dimension and delay. The authors present a generalization that uses multiple linear projections onto scalars and successive RBF networks (MLPRBF) that estimate the function based on these scalar values. They derive some key properties of RBF networks that provide suitable grounds for implementing efficient search strategies for nonconvex optimization within the same framework.
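As a rough illustration of the technique the abstract describes (not the authors' implementation), the sketch below fits a linear superposition of Gaussians in which every kernel sees a linear projection P of the input, the device credited with inducing intrinsic dimension reduction. The choice of centers, the random projection P, the width sigma, and the least-squares fit of only the output weights are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def rbf_design(X, centers, P, sigma):
    """Gaussian activations computed on the projected inputs P @ x."""
    Z = X @ P.T            # project inputs into a lower-dimensional space
    C = centers @ P.T      # project the kernel centers the same way
    d2 = ((Z[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Target: a function of a 10-D input that really depends on one projection.
d_in, n = 10, 200
w_true = rng.normal(size=d_in)
X = rng.normal(size=(n, d_in))
y = np.sin(X @ w_true)

# Hypothetical setup: centers drawn from the sample, a random 1-D projection.
centers = X[rng.choice(n, size=25, replace=False)]
P = rng.normal(size=(1, d_in))
sigma = 1.0

# Fit only the superposition weights by linear least squares.
Phi = rbf_design(X, centers, P, sigma)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print("training RMSE:", np.sqrt(np.mean((y - Phi @ coef) ** 2)))

Because the kernels compare inputs only through P, the effective dimension of the approximation is the number of rows of P rather than the ambient input dimension, which is what makes the scheme usable when samples are sparse in high dimensions.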
Keywords :
function approximation; neural nets; optimisation; polynomials; dimension reduction; linear superpositions of Gaussians; neural network approaches; nonconvex optimization; radial basis function; Control systems; Delay; Function approximation; Gaussian approximation; Gaussian processes; Kernel; Microelectronics; Neural networks; Radial basis function networks; Workstations;
Journal_Title :
IEEE Transactions on Computers