Title :
Distributed sparse regression by consensus-based primal-dual perturbation optimization
Author :
Tsung-Hui Chang ; Angelia Nedic ; Anna Scaglione
Author_Institution :
Dept. of Electron. & Comput. Eng., Nat. Taiwan Univ. of Sci. & Tech., Taipei, Taiwan
Abstract :
This paper studies the decentralized solution of a multi-agent sparse regression problem formulated as a globally coupled objective function with a non-smooth, sparsity-promoting constraint. In particular, we propose a distributed primal-dual perturbation (PDP) method that combines the average consensus technique with the primal-dual perturbed subgradient method. Compared to the conventional primal-dual (PD) subgradient method without perturbation, the PDP subgradient method exhibits faster convergence. To handle the non-smooth constraint, we propose a novel proximal-gradient-type perturbation point. The proposed distributed optimization algorithm can be implemented as a fully decentralized protocol, in which each agent uses only its local information and exchanges messages only with its neighbors. We show that the proposed method converges to the global optimum of the considered problem under standard convexity and network assumptions.
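For concreteness, the following is a minimal NumPy sketch of a consensus-based primal-dual scheme of this flavor on a toy sparse regression instance: min over x of sum_i 0.5*||A_i x - b_i||^2 subject to ||x||_1 <= r. The least-squares losses, the ring-graph mixing matrix W, the diminishing step size, and the soft-thresholding perturbation point are illustrative assumptions for this sketch only, not the paper's exact algorithm, problem data, or parameters.

# Sketch only: a simplified consensus-based primal-dual scheme with a
# proximal-gradient-type perturbation point for the dual update.  In this
# simplified variant every agent handles the shared l1-budget constraint
# g(x) = ||x||_1 - r <= 0 redundantly with its own dual copy.
import numpy as np

rng = np.random.default_rng(0)
N, d, m, r = 5, 20, 10, 1.0            # agents, dimension, samples per agent, l1 budget
x_true = np.zeros(d)
x_true[:3] = [1.0, -0.5, 0.3]          # sparse ground truth (illustrative)
A = [rng.standard_normal((m, d)) for _ in range(N)]
b = [A[i] @ x_true + 0.01 * rng.standard_normal(m) for i in range(N)]

# doubly stochastic mixing matrix for a ring graph (illustrative network)
W = np.zeros((N, N))
for i in range(N):
    W[i, i] = 0.5
    W[i, (i - 1) % N] = 0.25
    W[i, (i + 1) % N] = 0.25

def grad_f(i, x):                      # gradient of agent i's local least-squares loss
    return A[i].T @ (A[i] @ x - b[i])

def g(x):                              # shared sparsity-promoting constraint value
    return np.sum(np.abs(x)) - r

x = np.zeros((N, d))                   # local primal copies
lam = np.zeros(N)                      # local dual copies

for k in range(1, 5001):
    step = 1.0 / (100.0 + k)           # diminishing step size (assumed schedule)
    x_mix, lam_mix = W @ x, W @ lam    # average-consensus mixing of copies

    for i in range(N):
        # perturbation point: one proximal-gradient-type step on the local
        # Lagrangian, with soft-thresholding handling the l1 term
        z = x_mix[i] - step * grad_f(i, x_mix[i])
        x_tilde = np.sign(z) * np.maximum(np.abs(z) - step * lam_mix[i], 0.0)

        # primal update: subgradient of the local Lagrangian at the mixed dual
        sub = grad_f(i, x_mix[i]) + lam_mix[i] * np.sign(x_mix[i])
        x[i] = x_mix[i] - step * sub

        # dual update: projected ascent using the constraint at the perturbation point
        lam[i] = max(0.0, lam_mix[i] + step * g(x_tilde))

x_avg = x.mean(axis=0)
print("max consensus gap :", np.max(np.abs(x - x_avg)))
print("||x_avg||_1       :", np.abs(x_avg).sum(), "(budget r =", r, ")")
print("first 5 entries   :", np.round(x_avg[:5], 3))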
Keywords :
gradient methods; multi-agent systems; optimisation; regression analysis; PDP subgradient method; average consensus technique; convergence behavior; decentralized solution; distributed primal-dual perturbation; distributed sparse regression; globally coupled objective function; multiagent sparse regression problem; nonsmooth sparsity promoting constraint; primal-dual perturbation optimization; proximal gradient type perturbation point; Convergence; Cost function; Data models; Electronic mail; Linear programming; Optimization methods; Distributed optimization; average consensus; primal-dual subgradient method; sparse regression
Conference_Titel :
Global Conference on Signal and Information Processing (GlobalSIP), 2013 IEEE
Conference_Location :
Austin, TX
DOI :
10.1109/GlobalSIP.2013.6736872