Title :
A conjecture on global optimization using gradient-free stochastic approximation
Author :
Maryak, John L. ; Chin, Daniel C.
Author_Institution :
Appl. Phys. Lab., Johns Hopkins Univ., Laurel, MD, USA
Abstract :
A concern with iterative optimization techniques is ensuring that the algorithm reaches the global optimum rather than becoming stranded at a local optimum. One method used to try to ensure global convergence is the injection of extra noise terms into the recursion, which may allow the algorithm to escape local optimum points. The amplitude of the injected noise is decreased over time (a process called "annealing") so that the algorithm can finally converge once it reaches the global optimum point. In this context, we examine the performance of a certain "gradient-free" stochastic approximation algorithm. We argue that, in some cases, the naturally occurring error in the gradient approximation effectively acts as injected noise that promotes convergence of the algorithm to a global optimum. The discussion is supported by a numerical study.
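To make the recursion concrete, the following minimal Python sketch implements the standard SPSA update (the simultaneous-perturbation, gradient-free algorithm referenced in the keywords) with an optional annealed injected-noise term of the kind the abstract describes. The gain sequences, constants, and the Rastrigin test function are illustrative assumptions, not taken from the paper; setting inject_b = 0 corresponds to the case the authors examine, where only the natural error in the gradient approximation perturbs the iterates.

import numpy as np

def rastrigin(x):
    # Standard multimodal test function with many local minima
    # (an illustrative choice; not the paper's numerical study).
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def spsa(loss, theta0, n_iter=2000, a=0.1, c=0.1, A=100,
         alpha=0.602, gamma=0.101, inject_b=0.0, seed=0):
    # Basic SPSA recursion; inject_b > 0 adds annealed Gaussian noise.
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(n_iter):
        ak = a / (k + 1 + A) ** alpha      # decaying step-size (gain) sequence
        ck = c / (k + 1) ** gamma          # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Bernoulli +/-1 directions
        # Two-measurement "gradient-free" approximation of the gradient;
        # its error is the naturally occurring noise discussed in the abstract.
        g_hat = (loss(theta + ck * delta) - loss(theta - ck * delta)) / (2 * ck * delta)
        theta = theta - ak * g_hat
        if inject_b > 0:  # optional explicitly injected, annealed noise term
            theta += (inject_b / (k + 1) ** 0.5) * rng.standard_normal(theta.shape)
    return theta

theta_hat = spsa(rastrigin, theta0=[3.0, -3.0])
print(theta_hat, rastrigin(theta_hat))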
Keywords :
approximation theory; convergence of numerical methods; optimisation; stochastic processes; global convergence; global optimization; optimization; recursive annealing; simultaneous perturbation; stochastic approximation; Annealing; Approximation algorithms; Convergence; History; Iterative algorithms; Laboratories; Noise level; Physics; Stochastic processes; Stochastic resonance;
Conference_Title :
Proceedings of the 1998 IEEE International Symposium on Intelligent Control (ISIC), held jointly with the IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA) and Intelligent Systems and Semiotics (ISAS)
Conference_Location :
Gaithersburg, MD, USA
Print_ISBN :
0-7803-4423-5
DOI :
10.1109/ISIC.1998.713702