DocumentCode :
2422774
Title :
Parallel Implementation of Certain Robust Regression Methods Using Lazy Evaluation in Python
Author :
Unpingco, José H.
Author_Institution :
Ohio Supercomputer Center, Columbus, OH
fYear :
2008
fDate :
14-17 July 2008
Firstpage :
495
Lastpage :
497
Abstract :
Least-mean sum of squares (LS) regression methods enjoy many conceptual and structural advantages and generate models with powerful mathematical properties. Unfortunately, the corresponding estimators are sensitive to outliers in the data, which can skew the estimates severely, inflate their variances, and mask the very outliers that caused the distortion. Robust regression estimation techniques have been available since the mid-1980s and provide methods both to compensate for and to pinpoint outliers. Despite their superior performance in many situations, these least-median sum of squares (LMS) methods have remained unpopular because they are far more computationally intensive than least-mean squares estimates. In this paper, we discuss how the lazy evaluation mechanism of the popular Python language can significantly mitigate these computational costs by distributing the overall computation across multiple processors, reducing the overall wall time by a factor equal to the number of processes employed.
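The following is an illustrative sketch only, not the paper's implementation: it shows why LMS is computationally heavy (many candidate fits, each scored by the median of squared residuals over the whole data set) and how those independent candidate evaluations can be farmed out across processes with Python's standard `multiprocessing` module. All function names here (`lms_fit`, `lms_score`) are invented for this example.

```python
# Sketch of least-median-of-squares (LMS) line fitting via random subsampling,
# with the per-candidate scoring distributed over worker processes.
# This is NOT the author's code; it only illustrates the technique the
# abstract describes (embarrassingly parallel candidate evaluation).
import random
import multiprocessing as mp


def median(vals):
    """Median of a list of numbers (no external dependencies)."""
    s = sorted(vals)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])


def lms_score(args):
    """Fit a line through one random pair of points and return the
    median of squared residuals over the full data set."""
    (x0, y0), (x1, y1), data = args
    if x1 == x0:  # degenerate pair: vertical or duplicated point
        return float("inf"), (0.0, 0.0)
    slope = (y1 - y0) / (x1 - x0)
    intercept = y0 - slope * x0
    med = median([(y - (slope * x + intercept)) ** 2 for x, y in data])
    return med, (slope, intercept)


def lms_fit(data, n_trials=200, seed=0, processes=2):
    """Draw n_trials random point pairs, score each candidate line in
    parallel, and keep the fit with the smallest median squared residual."""
    rng = random.Random(seed)
    trials = [(rng.choice(data), rng.choice(data), data) for _ in range(n_trials)]
    with mp.Pool(processes) as pool:
        results = pool.map(lms_score, trials)  # parallel candidate scoring
    return min(results)[1]  # (slope, intercept) of the best candidate


if __name__ == "__main__":
    # y = 2x + 1 with one gross outlier; LMS should ignore the outlier,
    # whereas an ordinary least-squares fit would be pulled toward it.
    data = [(x, 2 * x + 1) for x in range(20)] + [(5, 500.0)]
    slope, intercept = lms_fit(data)
    print(slope, intercept)
```

Because each trial is independent, the `pool.map` step scales with the number of worker processes, which is the source of the wall-time reduction the abstract reports.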
Keywords :
parallel languages; regression analysis; Python language; lazy evaluation mechanism; least-mean squares estimates; least-median sum of squares; parallel implementation; regression estimation; robust regression methods; Computational efficiency; Distributed computing; Explosions; Functional programming; Least squares approximation; Mathematical model; Power generation; Robustness; Statistical analysis; Supercomputers;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
DoD HPCMP Users Group Conference, 2008. DOD HPCMP UGC
Conference_Location :
Seattle, WA
Print_ISBN :
978-1-4244-3323-0
Type :
conf
DOI :
10.1109/DoD.HPCMP.UGC.2008.47
Filename :
4755914