Title :
Consensus-based distributed online prediction and optimization
Author :
Tsianos, Konstantinos I. ; Rabbat, Michael G.
Author_Institution :
Dept. of Electr. & Comput. Eng., McGill Univ., Montréal, QC, Canada
Abstract :
This paper considers the problems of distributed online prediction and optimization. Each node in a network of processors handles a stream of data in an online manner: before each new data point arrives, the node must make a prediction, and after receiving the point, it accrues a loss, or regret. The goal of the processors is to minimize the total aggregate regret. We propose a consensus-based distributed optimization method for fitting the model used to make the predictions online. After observing each data point, nodes individually make gradient-descent-like adjustments to their model parameters, and consensus iterations are then performed to synchronize the models across nodes. We prove that the proposed method achieves the optimal regret bound when the loss function has Lipschitz continuous gradients, and that the amount of communication required depends on the network structure.
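The scheme described in the abstract — a local gradient step on each node's streaming data, followed by consensus averaging over the network — can be illustrated with a minimal sketch. The ring topology, Metropolis mixing weights, step-size schedule, and least-squares loss below are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, T = 4, 3, 200
w_true = rng.normal(size=dim)          # ground-truth model generating the stream

# Doubly stochastic mixing matrix for a 4-node ring (Metropolis-style weights);
# one multiplication by P performs one consensus iteration.
P = np.array([[0.5,  0.25, 0.0,  0.25],
              [0.25, 0.5,  0.25, 0.0 ],
              [0.0,  0.25, 0.5,  0.25],
              [0.25, 0.0,  0.25, 0.5 ]])

W = np.zeros((n_nodes, dim))           # row i holds node i's model parameters
regret = 0.0
for t in range(1, T + 1):
    X = rng.normal(size=(n_nodes, dim))            # one new data point per node
    y = X @ w_true + 0.1 * rng.normal(size=n_nodes)
    preds = np.sum(W * X, axis=1)                  # predict before seeing y
    regret += np.sum((preds - y) ** 2)             # accrue squared loss
    grads = 2.0 * (preds - y)[:, None] * X         # local gradient of the loss
    W = W - (0.1 / np.sqrt(t)) * grads             # gradient-descent-like step
    W = P @ W                                      # consensus: average with neighbors
```

After the loop, each row of `W` approximates `w_true`, and `regret` tracks the aggregate loss accrued across all nodes; the decaying step size is one common choice for obtaining sublinear regret in this setting.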
Keywords :
distributed processing; gradient methods; learning (artificial intelligence); network theory (graphs); Consensus-based distributed online optimization; Lipschitz continuous gradients; communication amount; consensus iterations; consensus-based distributed online prediction; data point; gradient descent-like adjustments; loss function; machine learning problems; model parameters; network structure; processors network; total aggregate regret minimization; Convergence; Distributed databases; Loss measurement; Optimization; Prediction algorithms; Program processors; Stochastic processes;
Conference_Titel :
2013 IEEE Global Conference on Signal and Information Processing (GlobalSIP)
Conference_Location :
Austin, TX, USA
DOI :
10.1109/GlobalSIP.2013.6737014