Title of article :
Nonlinear approximation using Gaussian kernels
Author/Authors :
Thomas Hangelbroek
Issue Information :
Journal issue, year 2010
Abstract :
It is well known that nonlinear approximation has an advantage over linear schemes in the sense that it
achieves approximation rates comparable to those of linear schemes, but for a larger class of approximands.
This was established for spline approximations and for wavelet approximations, and more recently
by DeVore and Ron (in press) [2] for homogeneous radial basis function (surface spline) approximations.
However, no such results are known for the Gaussian function, the preferred kernel in machine learning
and several engineering problems. We introduce and analyze in this paper a new algorithm for approximating
functions using translates of Gaussian functions with varying tension parameters. At heart, it employs
the strategy for nonlinear approximation of DeVore–Ron, but it selects kernels by a method that is not
straightforward. The crux of the difficulty lies in the necessity to vary the tension parameter in the Gaussian
function spatially, according to local information about the approximand: error analyses of Gaussian
approximation schemes with varying tension are, by and large, an elusive target for approximation theory. We
show that our algorithm is suitably optimal in the sense that it provides approximation rates similar to other
established nonlinear methodologies like spline and wavelet approximations. As expected and desired, the
approximation rates can be as high as needed and are essentially saturated only by the smoothness of the
approximand.
© 2010 Published by Elsevier Inc.
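To make the objects of study concrete (a sketch in generic notation, not the paper's exact construction), an N-term Gaussian approximant with varying tension has the form

s_N(x) = \sum_{j=1}^{N} A_j \exp\!\left( -\, |x - \xi_j|^2 / \sigma_j^2 \right),

where the centers \xi_j, coefficients A_j, and tension parameters \sigma_j > 0 are all selected nonlinearly, with \sigma_j adapted to the local smoothness of the approximand. As with N-term spline and wavelet approximation, the benchmark rates are of the form \|f - s_N\| \lesssim N^{-s/d} for a target f of smoothness s in d variables, smoothness being measured, for example, on a Besov or Triebel–Lizorkin scale.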
Keywords :
Triebel–Lizorkin space, Machine learning, Gaussians, Kernels, Radial basis functions, Nonlinear approximation, Besov space
Journal title :
Journal of Functional Analysis