DocumentCode :
3647817
Title :
Make it cheap: Learning with O(nd) complexity
Author :
Wlodzislaw Duch;Norbert Jankowski;Tomasz Maszczyk
fYear :
2012
fDate :
6/1/2012 12:00:00 AM
Firstpage :
1
Lastpage :
4
Abstract :
Classification methods with linear computational complexity O(nd) in the number of samples n and their dimensionality d often give results that are better, or at least not statistically significantly worse, than those of slower algorithms. This is demonstrated here for many benchmark datasets downloaded from the UCI Machine Learning Repository. The results reported in this paper should serve as a reference point when estimating the usefulness of new learning algorithms: higher-complexity methods should provide significantly better results to justify their use.
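Abstract note :
As a point of reference, a nearest-centroid classifier is one example of a method with O(nd) training cost of the kind the abstract refers to; the sketch below is only an illustration under that assumption and is not the set of algorithms benchmarked in the paper.

    # Minimal sketch (not from the paper): nearest-centroid classification.
    # Training touches each of the n samples' d features once, so cost is O(nd);
    # prediction is O(cd) per sample for c classes.
    import numpy as np

    def fit_centroids(X, y):
        """Compute one mean vector per class label."""
        classes = np.unique(y)
        centroids = np.array([X[y == c].mean(axis=0) for c in classes])
        return classes, centroids

    def predict(X, classes, centroids):
        """Assign each sample to the class with the nearest centroid."""
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        return classes[dists.argmin(axis=1)]

    # Example on random data (the UCI datasets mentioned in the abstract are not bundled here).
    X = np.random.rand(100, 4)
    y = np.random.randint(0, 3, size=100)
    classes, centroids = fit_centroids(X, y)
    print(predict(X[:5], classes, centroids))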
Keywords :
"Complexity theory","Vectors","Prototypes","Support vector machines","Benchmark testing","Machine learning algorithms"
Publisher :
ieee
Conference_Title :
The 2012 International Joint Conference on Neural Networks (IJCNN)
ISSN :
2161-4393
Print_ISBN :
978-1-4673-1488-6
Type :
conf
DOI :
10.1109/IJCNN.2012.6252380
Filename :
6252380