DocumentCode :
3661492
Title :
Incremental learning on a budget and a quick calculation method using a tree-search algorithm
Author :
Akihisa Kato;Hirohito Kawahara;Koichiro Yamauchi
Author_Institution :
Department of Computer Science, Chubu University, 1200 Matsumoto-cho, Kasugai-shi, Aichi, Japan
fYear :
2015
fDate :
7/1/2015 12:00:00 AM
Firstpage :
1
Lastpage :
7
Abstract :
In this study, a lightweight kernel regression algorithm for embedded systems is proposed. In our previous study, we proposed an online learning method with a limited number of kernels, based on a kernel regression model known as the limited general regression neural network (LGRNN). The LGRNN behaves like k-nearest neighbors, except that it continually interpolates between learned samples. The output of kernel regression for a given input is dominated by the output of the closest kernel. This contrasts with kernel perceptrons, whose output is determined by a combination of several nested kernels. Consequently, the output of a kernel regression model can be made lightweight by omitting the calculations for the other kernels, provided that the kernel closest to the current input vector, together with its neighbors, can be found quickly. To realize this, we introduce a tree-search-based calculation method for the LGRNN. In the LGRNN learning method, the kernels are clustered into k groups and organized as tree-structured data for the tree-search algorithm.
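The scheme described in the abstract — kernels clustered into k groups and searched group-first so that distant kernels are never evaluated — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the class and method names (`KernelTree`, `nearest_kernel`, `predict`), the use of plain k-means for grouping, and the Gaussian kernel are all choices made for the sketch.

```python
import numpy as np

# Illustrative sketch of a two-level tree for fast nearest-kernel lookup.
# All names and clustering details here are assumptions, not from the paper.

class KernelTree:
    def __init__(self, centers, weights, k=3, sigma=1.0, iters=10, seed=0):
        self.centers = np.asarray(centers, dtype=float)   # kernel centers
        self.weights = np.asarray(weights, dtype=float)   # kernel output weights
        self.sigma = sigma
        rng = np.random.default_rng(seed)
        # Cluster kernel centers into k groups (simple k-means).
        seeds = rng.choice(len(self.centers), size=k, replace=False)
        self.group_centers = self.centers[seeds].copy()
        for _ in range(iters):
            assign = self._nearest_group(self.centers)
            for g in range(k):
                members = self.centers[assign == g]
                if len(members):
                    self.group_centers[g] = members.mean(axis=0)
        # Final assignment, consistent with the final group centers.
        self.assign = self._nearest_group(self.centers)

    def _nearest_group(self, X):
        d = np.linalg.norm(X[:, None, :] - self.group_centers[None, :, :], axis=2)
        return d.argmin(axis=1)

    def _group_members(self, x):
        g = np.linalg.norm(self.group_centers - x, axis=1).argmin()
        members = np.where(self.assign == g)[0]
        if len(members) == 0:                 # empty group: fall back to all kernels
            members = np.arange(len(self.centers))
        return members

    def nearest_kernel(self, x):
        # Descend the tree: nearest group first, then nearest kernel inside it.
        x = np.asarray(x, dtype=float)
        members = self._group_members(x)
        d = np.linalg.norm(self.centers[members] - x, axis=1)
        return int(members[d.argmin()])

    def predict(self, x):
        # GRNN-style output restricted to the nearest group, omitting the
        # calculations for all other kernels.
        x = np.asarray(x, dtype=float)
        members = self._group_members(x)
        d2 = ((self.centers[members] - x) ** 2).sum(axis=1)
        phi = np.exp(-d2 / (2.0 * self.sigma ** 2))
        return float((phi * self.weights[members]).sum() / phi.sum())
```

With a two-level grouping, a query touches only one group of kernels instead of all of them, which is the cost reduction the abstract targets; a deeper tree would apply the same nearest-centroid descent recursively.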
Keywords :
"Concrete","Servomotors","Kernel","Nickel","Robustness"
Publisher :
ieee
Conference_Titel :
Neural Networks (IJCNN), 2015 International Joint Conference on
Electronic_ISBN :
2161-4407
Type :
conf
DOI :
10.1109/IJCNN.2015.7280805
Filename :
7280805