DocumentCode :
1381767
Title :
A Very Fast Neural Learning for Classification Using Only New Incoming Datum
Author :
Jaiyen, Saichon ; Lursinsap, Chidchanok ; Phimoltares, Suphakant
Author_Institution :
Dept. of Math., Chulalongkorn Univ., Bangkok, Thailand
Volume :
21
Issue :
3
fYear :
2010
fDate :
March 2010
Firstpage :
381
Lastpage :
392
Abstract :
This paper proposes a very fast one-pass-throw-away learning algorithm based on a hyperellipsoidal function that can be translated and rotated to cover the data set during the learning process. The translation and rotation of the hyperellipsoidal function depend on the distribution of the data set. In addition, we present a versatile elliptic basis function (VEBF) neural network with one hidden layer. The hidden layer is adaptively divided into subhidden layers according to the number of classes in the training data set. Each subhidden layer can be expanded by adding a new node to learn new samples during the training process. The learning time is O(n), where n is the number of training samples. The network can learn any new incoming datum independently, without involving the previously learned data, so there is no need to store all earlier data and mix them with the new incoming data during learning.
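To make the one-pass, per-datum idea concrete, the sketch below shows a generic incremental per-class ellipsoid built from running mean and covariance estimates (Welford-style updates). This is an illustrative assumption, not the authors' VEBF update rule: each incoming sample updates only its class's statistics in O(1), so a full pass over n samples costs O(n) and no earlier data need to be stored; the eigendecomposition of the covariance then gives the ellipsoid's rotation and axis lengths.

```python
import numpy as np

class IncrementalClassEllipsoid:
    """Running mean/covariance for one class, updated one sample at a time.
    Illustrative sketch only; the actual VEBF node update is defined in the paper."""

    def __init__(self, dim):
        self.n = 0
        self.mean = np.zeros(dim)
        self.m2 = np.zeros((dim, dim))  # accumulated outer products of deviations

    def update(self, x):
        # Welford-style single-sample update: O(d^2) per datum, no data stored.
        x = np.asarray(x, dtype=float)
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += np.outer(delta, x - self.mean)

    def covariance(self):
        return self.m2 / max(self.n - 1, 1)

    def axes(self):
        # Eigendecomposition of the covariance: eigenvectors give the ellipsoid's
        # rotation, eigenvalues its squared semi-axis lengths.
        eigvals, eigvecs = np.linalg.eigh(self.covariance())
        return eigvals, eigvecs


# Hypothetical usage: one ellipsoid per class, classification by Mahalanobis distance.
def classify(x, class_models):
    def mahalanobis(model):
        diff = np.asarray(x, dtype=float) - model.mean
        cov = model.covariance() + 1e-6 * np.eye(len(diff))  # regularize for stability
        return diff @ np.linalg.solve(cov, diff)
    return min(class_models, key=lambda label: mahalanobis(class_models[label]))
```

The key property this sketch shares with the described approach is that learning a new datum touches only summary statistics, never the previously seen samples.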
Keywords :
computational complexity; neural nets; fast 1-pass-throw-away learning algorithm; hyperellipsoidal function; versatile elliptic basis function neural network; Classification; clustering; ellipsoid; elliptic basis function (EBF); fast learning; hyperellipsoid; neural network; principal component analysis (PCA); radial basis function (RBF); recognition; Algorithms; Artificial Intelligence; Heart; Humans; Iris; Models, Neurological; Neural Networks (Computer); Neurons; Nonlinear Dynamics; Principal Component Analysis;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
ieee
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2009.2037148
Filename :
5382496