DocumentCode :
228996
Title :
Gradient descent and normal equations on cost function minimization for online predictive using linear regression with multiple variables
Author :
Lubis, Fetty Fitriyanti ; Rosmansyah, Yusep ; Supangkat, Suhono Harso
Author_Institution :
Sch. of Electr. Eng. & Inf., Inst. Teknol. Bandung, Bandung, Indonesia
fYear :
2014
fDate :
24-25 Sept. 2014
Firstpage :
202
Lastpage :
205
Abstract :
Cost function minimization is essential for finding a good linear regression model. This paper prototypes and examines two well-known algorithms for minimizing the cost function in online prediction, namely gradient descent and the normal equation. The data used in this paper come from Open Data and are split into three parts: training, test, and cross-validation datasets. Empirical results on a number of datasets show that the normal equation performs better than gradient descent (with cross correlation 0.0117 higher and relative absolute error 0.5154 lower).
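The abstract contrasts two standard ways of minimizing the squared-error cost for multivariate linear regression. The sketch below is an illustrative Python/NumPy rendering of both methods, not the authors' prototype; the synthetic data, learning rate, and iteration count are assumptions for demonstration only.

```python
# Minimal sketch of the two cost-minimization approaches named in the abstract:
# batch gradient descent and the normal equation. Data and hyperparameters are
# illustrative assumptions, not taken from the paper.
import numpy as np

def gradient_descent(X, y, alpha=0.01, iters=1000):
    """Minimize the squared-error cost J(theta) by batch gradient descent."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        grad = (X.T @ (X @ theta - y)) / m   # gradient of J(theta)
        theta -= alpha * grad
    return theta

def normal_equation(X, y):
    """Closed-form minimizer theta = (X^T X)^{-1} X^T y (pinv for stability)."""
    return np.linalg.pinv(X.T @ X) @ X.T @ y

# Synthetic data with multiple variables (two features plus a bias term).
rng = np.random.default_rng(0)
X_raw = rng.uniform(0, 10, size=(100, 2))
y = 3.0 + 1.5 * X_raw[:, 0] - 0.7 * X_raw[:, 1] + rng.normal(0, 0.1, 100)

# Feature scaling helps gradient descent converge; the normal equation needs none.
X_scaled = (X_raw - X_raw.mean(axis=0)) / X_raw.std(axis=0)
X_gd = np.column_stack([np.ones(len(y)), X_scaled])
X_ne = np.column_stack([np.ones(len(y)), X_raw])

print("gradient descent:", gradient_descent(X_gd, y))
print("normal equation: ", normal_equation(X_ne, y))
```

The trade-off the paper evaluates empirically follows from the code: gradient descent is iterative and sensitive to the learning rate and feature scaling, while the normal equation solves for the parameters in closed form but requires inverting an n-by-n matrix.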
Keywords :
cost reduction; data handling; gradient methods; regression analysis; Open Data; cost function minimization; cross validation datasets; gradient descent; linear regression; multiple variables; normal equations; online predictive; test datasets; training datasets; Cost function; Educational institutions; Equations; Linear regression; Mathematical model; Prediction algorithms; Predictive models; cost function; gradient descent; normal equations; online predictive;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
ICT For Smart Society (ICISS), 2014 International Conference on
Conference_Location :
Bandung
Type :
conf
DOI :
10.1109/ICTSS.2014.7013173
Filename :
7013173