DocumentCode
110628
Title
Function Approximation Using Combined Unsupervised and Supervised Learning
Author
Andras, Peter
Author_Institution
Sch. of Comput. Sci., Newcastle Univ., Newcastle upon Tyne, UK
Volume
25
Issue
3
fYear
2014
fDate
March 2014
Firstpage
495
Lastpage
505
Abstract
Function approximation is one of the core tasks solved with neural networks in many engineering problems. However, good approximation results require good sampling of the data space, which usually demands an exponentially increasing volume of data as the dimensionality of the data grows. At the same time, high-dimensional data are often arranged around a much lower dimensional manifold. Here we propose breaking the function approximation task for high-dimensional data into two steps: 1) mapping the high-dimensional data onto a lower dimensional space corresponding to the manifold on which the data reside, and 2) approximating the function using the mapped lower dimensional data. We use over-complete self-organizing maps (SOMs) for the mapping through unsupervised learning, and single-hidden-layer neural networks for the function approximation through supervised learning. We also extend the two-step procedure by considering support vector machines and Bayesian SOMs to determine the best parameters for the nonlinear neurons in the hidden layer of the neural networks used for the function approximation. We compare the approximation performance of the proposed neural networks on a set of functions and show that the networks using combined unsupervised and supervised learning indeed outperform, in most cases, the networks that learn the function approximation directly from the original high-dimensional data.
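The two-step procedure described in the abstract can be sketched in a few lines of NumPy: an unsupervised 1-D SOM maps high-dimensional points onto a low-dimensional grid coordinate, and a single-hidden-layer network is then fit to the target on that coordinate. This is a minimal illustrative sketch, not the paper's implementation: the synthetic manifold, the SOM training schedule, and the random-feature hidden layer solved by least squares are all assumptions made for the example (the paper uses over-complete SOMs and trains the hidden layer by supervised learning).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example (assumption): 400 points on a 1-D manifold embedded in 10-D.
t = rng.uniform(0.0, 1.0, 400)
X = np.stack([np.sin(2 * np.pi * (k + 1) * t / 3.0) for k in range(10)], axis=1)
y = np.sin(4 * np.pi * t)            # target function defined on the manifold

# Step 1: unsupervised mapping with a 1-D SOM (50 prototype units).
n_units = 50
W = X[rng.choice(len(X), n_units)]   # prototypes initialized from the data
grid = np.arange(n_units)
n_epochs = 30
for epoch in range(n_epochs):
    lr = 0.5 * (1.0 - epoch / n_epochs)                          # decaying learning rate
    sigma = max(n_units / 4.0 * (1.0 - epoch / n_epochs), 1.0)   # shrinking neighborhood
    for x in X[rng.permutation(len(X))]:
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))              # best-matching unit
        h = np.exp(-((grid - bmu) ** 2) / (2.0 * sigma ** 2))    # neighborhood weights
        W += lr * h[:, None] * (x - W)                           # pull prototypes toward x

# The low-dimensional representation: each point's BMU coordinate, scaled to [0, 1].
z = np.array([np.argmin(((W - x) ** 2).sum(axis=1)) for x in X]) / (n_units - 1)

# Step 2: supervised single-hidden-layer approximation on z.
# Simplification: random hidden-layer parameters, output weights by least squares.
a = rng.normal(scale=8.0, size=40)                # hidden-unit slopes (assumed scale)
b = rng.uniform(-4.0, 4.0, 40)                    # hidden-unit biases
H = np.c_[np.tanh(np.outer(z, a) + b), np.ones(len(z))]
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
mse = np.mean((H @ beta - y) ** 2)                # should beat a constant predictor
```

In the full method, the hidden-layer parameters would themselves be learned (or chosen via the SVM/Bayesian-SOM extensions the abstract mentions) rather than drawn at random; the sketch only shows how the SOM coordinate replaces the original 10-D input as the regression variable.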
Keywords
Bayes methods; data analysis; function approximation; learning (artificial intelligence); self-organising feature maps; support vector machines; Bayesian SOM; data space sampling; exponentially-increasing data volume; high-dimensional data; high-dimensional data mapping; lower dimensional data manifold; neural networks; nonlinear neurons; over-complete self-organizing maps; single-hidden layer neural networks; supervised learning; two-step procedure; unsupervised learning; Biological neural networks; Manifolds; Neurons; Prototypes; Vectors; Function approximation; learning; neural network; self-organizing map (SOM)
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks and Learning Systems
Publisher
IEEE
ISSN
2162-237X
Type
jour
DOI
10.1109/TNNLS.2013.2276044
Filename
6589012
Link To Document