DocumentCode
72952
Title
RISE: An Incremental Trust-Region Method for Robust Online Sparse Least-Squares Estimation
Author
Rosen, David M. ; Kaess, Michael ; Leonard, John J.
Author_Institution
Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA, USA
Volume
30
Issue
5
fYear
2014
fDate
Oct. 2014
Firstpage
1091
Lastpage
1108
Abstract
Many point estimation problems in robotics, computer vision, and machine learning can be formulated as instances of the general problem of minimizing a sparse nonlinear sum-of-squares objective function. For inference problems of this type, each input datum gives rise to a summand in the objective function, and therefore performing online inference corresponds to solving a sequence of sparse nonlinear least-squares minimization problems in which additional summands are added to the objective function over time. In this paper, we present Robust Incremental least-Squares Estimation (RISE), an incrementalized version of Powell's Dog-Leg numerical optimization method suitable for use in online sequential sparse least-squares minimization. As a trust-region method, RISE is naturally robust to objective function nonlinearity and numerical ill-conditioning and is provably globally convergent for a broad class of inferential cost functions (twice-continuously differentiable functions with bounded sublevel sets). Consequently, RISE maintains the speed of current state-of-the-art online sparse least-squares methods while providing superior reliability.
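To make the abstract concrete, the sketch below shows a single Powell dog-leg trust-region step for a batch nonlinear least-squares problem min ||r(x)||^2, linearized as min ||J dx + r||^2. It is illustrative only: the function name dogleg_step and the use of dense NumPy linear algebra are assumptions for readability, and this is not the incremental RISE update described in the paper.

```python
import numpy as np

def dogleg_step(J, r, delta):
    """One Powell dog-leg step for the linearized model min ||J dx + r||^2.

    J: Jacobian of the residual r at the current iterate (m x n).
    r: residual vector at the current iterate (m,).
    delta: current trust-region radius.
    Illustrative sketch only, not the paper's RISE update.
    """
    g = J.T @ r                                   # gradient of 0.5 * ||r||^2
    # Cauchy (steepest-descent) point: model minimizer along -g.
    alpha = (g @ g) / (np.linalg.norm(J @ g) ** 2)
    p_sd = -alpha * g
    # Gauss-Newton step (assumes J has full column rank).
    p_gn = np.linalg.lstsq(J, -r, rcond=None)[0]

    if np.linalg.norm(p_gn) <= delta:
        return p_gn                               # full Gauss-Newton step fits
    if np.linalg.norm(p_sd) >= delta:
        return delta * p_sd / np.linalg.norm(p_sd)  # truncated gradient step
    # Otherwise move from the Cauchy point toward the Gauss-Newton point
    # until the trust-region boundary: ||p_sd + t * (p_gn - p_sd)|| = delta.
    d = p_gn - p_sd
    a, b, c = d @ d, 2 * (p_sd @ d), p_sd @ p_sd - delta ** 2
    t = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return p_sd + t * d
```

In an online setting of the kind the paper targets, a step like this would be recomputed as new summands (new measurements) are appended to the objective; RISE's contribution is to do so incrementally rather than re-solving the batch problem from scratch.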
Keywords
convergence of numerical methods; least squares approximations; minimisation; Dog-Leg numerical optimization method; RISE; bounded sublevel sets; global convergence; incremental trust-region method; inference problems; inferential cost functions; input datum; many point estimation problems; numerical ill-conditioning; online inference; online sequential sparse least-squares minimization; robust incremental least-squares estimation; robust online sparse least-squares estimation; sparse nonlinear least-squares minimization problems; sparse nonlinear sum-of-squares objective function; summands; twice-continuously differentiable functions; Approximation methods; Convergence; Jacobian matrices; Linear programming; Minimization; Robots; Robustness; Computer vision; machine learning; online estimation; simultaneous localization and mapping (SLAM); sparse least-squares minimization;
fLanguage
English
Journal_Title
IEEE Transactions on Robotics
Publisher
IEEE
ISSN
1552-3098
Type
jour
DOI
10.1109/TRO.2014.2321852
Filename
6845338