DocumentCode :
1803020
Title :
Convergence properties of normalized random incremental gradient algorithms for least-squares source localization
Author :
Rabbat, Michael ; Nedic, Angelia
Author_Institution :
Electr. & Comput. Eng., McGill Univ., Montreal, QC, Canada
fYear :
2012
fDate :
4-7 Nov. 2012
Firstpage :
1417
Lastpage :
1421
Abstract :
We consider the problem of localizing a single source using received signal strength measurements gathered at a number of sensors. We assume that the measurements follow the standard path loss model and are corrupted by additive white Gaussian noise. Under this model, the maximum likelihood solution to the source localization problem involves solving a non-linear least squares optimization problem. We study the convergence properties of a normalized random incremental gradient method for solving this problem. Remarkably, despite the fact that the problem is non-convex, the normalized incremental gradient method generates a sequence of iterates that is attracted to the global optimum under some mild conditions.
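The abstract describes a normalized random incremental gradient iteration; the Python sketch below illustrates one way such an update could look. It assumes a range-based per-sensor cost f_i(x) = (||x - a_i|| - d_i)^2 with distance estimates d_i inverted from the RSS measurements, uniformly random sensor selection, and a diminishing step size. These modeling and step-size choices are illustrative assumptions and may differ from the paper's exact formulation.

import numpy as np

def localize(anchors, dists, x0, n_iters=5000, step0=1.0, rng=None):
    """Normalized random incremental gradient sketch for source localization.

    anchors : (m, 2) array of known sensor positions a_i
    dists   : (m,)   array of distance estimates d_i inverted from RSS measurements
    x0      : (2,)   initial guess for the source location

    Assumed per-sensor cost (illustrative): f_i(x) = (||x - a_i|| - d_i)^2.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    m = len(dists)
    for k in range(n_iters):
        i = rng.integers(m)                      # pick one sensor uniformly at random
        diff = x - anchors[i]
        r = np.linalg.norm(diff)
        if r < 1e-12:                            # avoid division by zero at a sensor location
            continue
        grad = 2.0 * (r - dists[i]) * diff / r   # gradient of (||x - a_i|| - d_i)^2
        gnorm = np.linalg.norm(grad)
        if gnorm < 1e-12:                        # already at a stationary point of this term
            continue
        step = step0 / np.sqrt(k + 1)            # diminishing step size (assumed schedule)
        x -= step * grad / gnorm                 # normalized incremental gradient update
    return x

For example, calling localize(anchors, dists, x0=np.zeros(2)) with noisy distance estimates returns an estimate of the source position. The normalization makes the step length independent of the gradient magnitude, which is the feature whose convergence behavior the paper analyzes.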
Keywords :
AWGN; concave programming; convergence; gradient methods; least squares approximations; maximum likelihood estimation; nonlinear programming; source separation; additive white Gaussian noise; convergence properties; least-squares source localization problem; maximum likelihood solution; nonconvex optimization; nonlinear least squares optimization problem; normalized random incremental gradient algorithms; received signal strength measurements; standard path loss model;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2012 Conference Record of the Forty-Sixth Asilomar Conference on Signals, Systems and Computers (ASILOMAR)
Conference_Location :
Pacific Grove, CA
ISSN :
1058-6393
Print_ISBN :
978-1-4673-5050-1
Type :
conf
DOI :
10.1109/ACSSC.2012.6489259
Filename :
6489259