DocumentCode
3373155
Title
Combining gradient-based optimization with stochastic search
Author
Enlu Zhou; Jiaqiao Hu
Author_Institution
Department of Industrial & Enterprise Systems Engineering, University of Illinois at Urbana-Champaign, Urbana, IL, USA
fYear
2012
fDate
9-12 Dec. 2012
Firstpage
1
Lastpage
12
Abstract
We propose a stochastic search algorithm for solving non-differentiable optimization problems. At each iteration, the algorithm searches the solution space by generating a population of candidate solutions from a parameterized sampling distribution. The basic idea is to convert the original optimization problem into a differentiable problem in terms of the parameters of the sampling distribution, and then to use a quasi-Newton-like method on the reformulated problem to find improved sampling distributions. The algorithm thus combines the robustness of stochastic search, which explores the solution space through a population of candidate solutions, with the fast convergence of gradient methods, which exploit local differentiable structure. We provide numerical examples to illustrate its performance.
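The following is a minimal illustrative sketch of the idea described above, not the authors' reference implementation. It assumes an isotropic Gaussian sampling distribution, an exponential shape function as the weighting, and a smoothed natural-gradient-style parameter update standing in for the quasi-Newton-like step; the function names and the test objective are illustrative choices, not taken from the paper.

import numpy as np

def sphere_with_kink(x):
    # A simple non-differentiable test objective (to be maximized): a shifted
    # sphere plus an absolute-value kink at the optimum x* = (1, ..., 1).
    return -np.sum((x - 1.0) ** 2) - np.sum(np.abs(x - 1.0))

def stochastic_search(f, dim, n_iter=200, pop_size=100, alpha=0.2,
                      temp=1.0, sigma0=2.0, seed=0):
    """Search by updating the parameters (mu, sigma) of a Gaussian sampling
    distribution so that E_{x ~ N(mu, sigma^2 I)}[exp(f(x)/temp)] increases."""
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)
    sigma = sigma0
    for _ in range(n_iter):
        # 1) Generate a population of candidate solutions from the current distribution.
        pop = mu + sigma * rng.standard_normal((pop_size, dim))
        vals = np.array([f(x) for x in pop])

        # 2) Shape-function weights; subtracting the max keeps exp() numerically stable.
        w = np.exp((vals - vals.max()) / temp)
        w /= w.sum()

        # 3) The gradient of the reformulated (differentiable-in-the-parameters)
        #    objective with respect to mu is E[S(f(X)) * (X - mu) / sigma^2];
        #    preconditioning it turns the update into a smoothed move toward the
        #    weighted sample mean, mimicking a quasi-Newton-like step.
        weighted_mean = w @ pop
        mu = (1 - alpha) * mu + alpha * weighted_mean

        # 4) Update the spread of the sampling distribution analogously.
        weighted_var = w @ np.sum((pop - mu) ** 2, axis=1) / dim
        sigma = np.sqrt((1 - alpha) * sigma**2 + alpha * weighted_var)
    return mu

if __name__ == "__main__":
    x_best = stochastic_search(sphere_with_kink, dim=5)
    print("estimated optimum:", np.round(x_best, 3))  # should be close to all ones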
Keywords
Newton method; convergence of numerical methods; gradient methods; optimisation; sampling methods; search problems; statistical distributions; convergence behavior; differentiable problem; gradient methods; gradient-based optimization; local differentiable structures; nondifferentiable optimization problems; parameterized sampling distribution; population generation; quasi-Newton-like method; reformulated problem; stochastic search algorithm; Adaptation models; Linear programming; Optimization; Search problems; Space exploration; Stochastic processes
fLanguage
English
Publisher
ieee
Conference_Title
Proceedings of the 2012 Winter Simulation Conference (WSC)
Conference_Location
Berlin, Germany
ISSN
0891-7736
Print_ISBN
978-1-4673-4779-2
Electronic_ISBN
0891-7736
Type
conf
DOI
10.1109/WSC.2012.6465032
Filename
6465032
Link To Document