Title :
Convergence rate analysis of distributed gradient methods for smooth optimization
Author :
Jakovetić, Dušan ; Xavier, João ; Moura, José M. F.
Author_Institution :
Institute for Systems and Robotics (ISR), Technical University of Lisbon, Lisbon, Portugal
Abstract :
We derive the convergence rate of a distributed gradient algorithm for smooth optimization in networked systems. We assume that each agent in the network has a convex cost function f_i(x), known only to agent i, and the goal is to minimize the sum Σ_{i=1}^N f_i(x) of all agents' costs; such a problem formulation arises in various networked applications, e.g., distributed inference or source localization in sensor networks. With the distributed gradient algorithm under study, each agent, at each iteration k, computes a weighted average of its own and its neighbors' solution estimates and takes a step in the direction of the negative gradient of its local function. We establish the novel result that the distributed gradient algorithm converges at rate O(1/k^{2/3}), in terms of the cost function optimality gap, under the assumption that the f_i's are convex with Lipschitz continuous and bounded gradients.
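For illustration only, below is a minimal Python sketch of the iteration described in the abstract: each agent averages its own and its neighbors' estimates with a weight matrix W and then takes a step along the negative gradient of its local cost. The ring network, Metropolis-style doubly stochastic weights, constant step size alpha, and quadratic local costs are assumptions chosen for the sketch, not specifics taken from the paper.

```python
import numpy as np

# Sketch of the distributed gradient iteration (illustrative assumptions throughout):
#   x_i^{k+1} = sum_j W[i, j] * x_j^k - alpha * grad f_i(x_i^k)

N = 4          # number of agents on an assumed ring network
d = 2          # dimension of the decision variable x
alpha = 0.05   # constant step size (assumed)

# Doubly stochastic weights for a ring graph (assumed, Metropolis-style).
W = np.zeros((N, N))
for i in range(N):
    W[i, i] = 0.5
    W[i, (i - 1) % N] = 0.25
    W[i, (i + 1) % N] = 0.25

# Illustrative local costs f_i(x) = 0.5 * ||x - b_i||^2, with gradient x - b_i.
rng = np.random.default_rng(0)
b = rng.standard_normal((N, d))

def local_grad(i, x_i):
    return x_i - b[i]

# One estimate per agent, stacked as rows of x.
x = np.zeros((N, d))
for k in range(2000):
    x_avg = W @ x                                            # weighted averaging (consensus) step
    grads = np.stack([local_grad(i, x[i]) for i in range(N)])
    x = x_avg - alpha * grads                                # local negative-gradient step

# All agents approach the minimizer of sum_i f_i, here the mean of the b_i.
print(x.mean(axis=0), b.mean(axis=0))
```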
Keywords :
distributed algorithms; gradient methods; optimisation; Lipschitz continuous derivative; consensus; convergence rate analysis; convex cost function; cost function optimality gap; distributed gradient algorithm; sensor networks; smooth optimization; source localization; Algorithm design and analysis; Convergence; Cost function; Educational institutions; Gradient methods; Upper bound; Consensus; Convergence rate analysis; Distributed optimization; Gradient methods;
Conference_Titel :
2012 20th Telecommunications Forum (TELFOR)
Conference_Location :
Belgrade
Print_ISBN :
978-1-4673-2983-5
DOI :
10.1109/TELFOR.2012.6419345