DocumentCode :
730515
Title :
Proximal diffusion for stochastic costs with non-differentiable regularizers
Author :
Vlaski, Stefan ; Sayed, Ali H.
Author_Institution :
Dept. of Electr. Eng., Univ. of California, Los Angeles, Los Angeles, CA, USA
fYear :
2015
fDate :
19-24 April 2015
Firstpage :
3352
Lastpage :
3356
Abstract :
We consider networks of agents cooperating to minimize a global objective, modeled as the aggregate sum of regularized costs that are not required to be differentiable. Since the subgradients of the individual costs cannot generally be assumed to be uniformly bounded, general distributed subgradient techniques are not applicable to these problems. We isolate the requirement of bounded subgradients into the regularizer and use splitting techniques to develop a stochastic proximal diffusion strategy for solving the optimization problem by continuously learning from streaming data. We represent the implementation as the cascade of three operators and invoke Banach's fixed-point theorem to establish that, despite gradient noise, the stochastic implementation is able to converge in the mean-square-error sense within O(μ) from the optimal solution, for a sufficiently small step-size parameter, μ.
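The cascade of operators described in the abstract can be illustrated with a minimal sketch: an adapt step (stochastic gradient on each agent's smooth cost), a combine step (neighborhood averaging via a combination matrix), and a proximal step handling the non-differentiable regularizer. This is an illustrative reading of the strategy, not the paper's exact recursion; the function names, the soft-thresholding choice of regularizer (ℓ1), and the update ordering are assumptions for the example.

```python
import numpy as np

def soft_threshold(x, tau):
    # Closed-form proximal operator of tau * ||x||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_diffusion(w0, A, grad_fns, prox, mu, num_iters):
    """Sketch of a proximal diffusion iteration over a network of N agents.

    w0       : (N, M) array of initial iterates, one row per agent.
    A        : (N, N) doubly stochastic combination matrix; A[l, k] weighs
               agent l's intermediate estimate in agent k's combine step.
    grad_fns : list of N callables returning a (possibly stochastic)
               gradient of the smooth part of each agent's cost.
    prox     : proximal operator of the non-differentiable regularizer,
               called as prox(w, mu).
    mu       : small step-size parameter.
    """
    N = len(grad_fns)
    W = w0.copy()
    for _ in range(num_iters):
        # Adapt: each agent takes a gradient step on its own smooth cost.
        psi = np.array([W[k] - mu * grad_fns[k](W[k]) for k in range(N)])
        # Combine: each agent averages its neighbors' intermediate estimates.
        phi = A.T @ psi
        # Proximal step: apply the regularizer's proximal operator.
        W = np.array([prox(phi[k], mu) for k in range(N)])
    return W
```

With identical quadratic agent costs and an ℓ1 regularizer, the agents reach consensus on a soft-thresholded version of the costs' common minimizer, mirroring the O(μ)-neighborhood convergence claimed in the abstract (exact in this noise-free toy setting).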
Keywords :
acoustic signal processing; optimisation; stochastic processes; Banach's fixed-point theorem; bounded subgradients; general distributed subgradient techniques; global objective; nondifferentiable regularizers; optimization problem; stochastic costs; stochastic proximal diffusion strategy; Aggregates; Context; Cost function; Noise; Signal processing algorithms; Stochastic processes; Distributed optimization; diffusion strategy; fixed point; gradient noise; proximal operator; regularization;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
South Brisbane, QLD, Australia
Type :
conf
DOI :
10.1109/ICASSP.2015.7178592
Filename :
7178592