Title :
Accelerated second-order stochastic optimization using only function measurements
Author_Institution :
Applied Physics Laboratory, Johns Hopkins University, Laurel, MD, USA
Abstract :
Consider the problem of loss-function minimization when only (possibly noisy) measurements of the loss function are available; in particular, no measurements of the gradient of the loss function are assumed available. The simultaneous perturbation stochastic approximation (SPSA) algorithm has successfully addressed one of the major shortcomings of classical finite-difference SA algorithms by significantly reducing the number of measurements required in many multivariate problems of practical interest. This paper presents a second-order SPSA algorithm that is based on estimating both the loss function gradient and the inverse Hessian matrix at each iteration. The aim of this approach is to emulate the acceleration properties of deterministic algorithms of Newton-Raphson form, particularly in the terminal phase, where the first-order SPSA algorithm slows in its convergence. This second-order SPSA algorithm requires only five loss function measurements at each iteration, independent of the problem dimension. This paper represents a significantly enhanced version of a second-order algorithm previously introduced by the author (1996).
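Illustration :
The abstract only outlines the approach, so below is a minimal Python sketch of a second-order SPSA-style iteration. The noisy loss function `loss`, the gain constants, and the particular four-measurement Hessian construction used here are illustrative assumptions (they follow the commonly published simultaneous-perturbation Hessian estimate, not necessarily the paper's exact five-measurement scheme).

import numpy as np


def sqrt_psd(H, reg=1e-4):
    """Map a symmetric matrix to a nearby positive-definite matrix.

    Eigen-decompose and replace eigenvalues by their absolute values plus a
    small regularizer, so the result is safe to invert.
    """
    w, V = np.linalg.eigh(H)
    return (V * (np.abs(w) + reg)) @ V.T


def second_order_spsa(loss, theta0, num_iter=500, a=0.5, A=50.0, alpha=0.602,
                      c=0.1, gamma=0.101, seed=0):
    """Sketch of a second-order (Newton-Raphson-like) SPSA iteration.

    `loss` is a noisy scalar-valued function of the parameter vector.  All
    gain constants here are illustrative defaults, not the paper's values.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    p = theta.size
    H_bar = np.eye(p)  # running average of per-iteration Hessian estimates

    for k in range(num_iter):
        a_k = a / (k + 1 + A) ** alpha      # step-size gain
        c_k = c / (k + 1) ** gamma          # perturbation gain
        c_t = c / (k + 1) ** gamma          # gain for the second perturbation

        # Simultaneous perturbation directions (symmetric Bernoulli +/-1).
        delta = rng.choice([-1.0, 1.0], size=p)
        delta_t = rng.choice([-1.0, 1.0], size=p)

        # Two measurements give the simultaneous-perturbation gradient estimate.
        y_plus = loss(theta + c_k * delta)
        y_minus = loss(theta - c_k * delta)
        g_hat = (y_plus - y_minus) / (2.0 * c_k) / delta

        # Two further measurements give one-sided gradient estimates at the
        # perturbed points, from which a per-iteration Hessian estimate follows.
        y_plus_t = loss(theta + c_k * delta + c_t * delta_t)
        y_minus_t = loss(theta - c_k * delta + c_t * delta_t)
        g1_plus = (y_plus_t - y_plus) / c_t / delta_t
        g1_minus = (y_minus_t - y_minus) / c_t / delta_t
        dg = (g1_plus - g1_minus).reshape(-1, 1)
        H_hat = dg @ (1.0 / delta).reshape(1, -1) / (2.0 * c_k)
        H_hat = 0.5 * (H_hat + H_hat.T)     # symmetrize

        # Average the noisy Hessian estimates and force positive definiteness
        # before taking the Newton-Raphson-style step.
        H_bar = (k * H_bar + H_hat) / (k + 1.0)
        step = np.linalg.solve(sqrt_psd(H_bar), g_hat)
        theta = theta - a_k * step

    return theta


# Example use on a noisy quadratic (illustrative only):
# rng = np.random.default_rng(1)
# noisy_loss = lambda th: float(th @ th + 0.01 * rng.normal())
# theta_min = second_order_spsa(noisy_loss, np.ones(5))

Averaging the per-iteration Hessian estimates and mapping the average to a positive-definite matrix before inversion is what keeps the Newton-Raphson-type step well defined despite measurement noise.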
Keywords :
Hessian matrices; convergence of numerical methods; function approximation; optimisation; parameter estimation; perturbation techniques; Newton-Raphson form; convergence; function measurements; inverse Hessian matrix; loss function; second-order stochastic optimization; simultaneous perturbation; stochastic approximation; Acceleration; Approximation algorithms; Finite difference methods; Loss measurement; Measurement standards; Particle measurements; Physics; Stochastic processes
Conference_Title :
Proceedings of the 36th IEEE Conference on Decision and Control, 1997
Conference_Location :
San Diego, CA, USA
Print_ISBN :
0-7803-4187-2
DOI :
10.1109/CDC.1997.657661