Title :
Distributed Learning of Distributions via Social Sampling
Author :
Sarwate, Anand D.; Javidi, Tara
Author_Institution :
Dept. of Electr. & Comput. Eng., Rutgers Univ., Piscataway, NJ, USA
Abstract :
A protocol for distributed estimation of discrete distributions is proposed. Each agent begins with a single sample from the distribution, and the goal is to learn the empirical distribution of the samples. The protocol is based on a simple message-passing model motivated by communication in social networks. Each agent samples a message at random from its current estimate of the distribution, so the messages exchanged are quantized. Tools from stochastic approximation are used to show that the algorithm converges almost surely. Examples illustrate three regimes with different consensus phenomena. Simulations demonstrate this convergence and give some insight into the effect of network topology.
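Illustrative sketch (not from the paper): the following Python simulation captures the flavor of the protocol described in the abstract. Each agent keeps a probability estimate over the finite alphabet, broadcasts a single value drawn from its current estimate (a quantized message), and nudges its estimate toward what it hears from neighbors with a decreasing step size. The initialization at indicator vectors, the O(1/t) step-size schedule, the uniform neighbor weighting, and the ring-graph example are all assumptions made for illustration; the paper's exact update rule may differ.

# Hedged sketch of a social-sampling-style protocol; details below are assumptions.
import numpy as np

def social_sampling(adjacency, initial_samples, num_values, num_steps=5000, seed=0):
    """Each agent holds a probability vector over num_values outcomes, sends a
    single sample drawn from it (quantized message), and moves its estimate
    toward the average of neighbors' messages with a decreasing step size."""
    rng = np.random.default_rng(seed)
    n = len(initial_samples)
    # Assumed initialization: each agent starts at the indicator of its own sample.
    estimates = np.zeros((n, num_values))
    estimates[np.arange(n), initial_samples] = 1.0
    neighbors = [np.flatnonzero(adjacency[i]) for i in range(n)]
    for t in range(num_steps):
        delta = 1.0 / (t + 2)  # assumed O(1/t) stochastic-approximation step size
        # Each agent broadcasts one value drawn from its current estimate.
        messages = np.array([rng.choice(num_values, p=estimates[i]) for i in range(n)])
        new_estimates = estimates.copy()
        for i in range(n):
            if len(neighbors[i]) == 0:
                continue
            # Empirical histogram of the messages heard from neighbors.
            heard = np.zeros(num_values)
            for j in neighbors[i]:
                heard[messages[j]] += 1.0 / len(neighbors[i])
            # Convex-combination update keeps the estimate a valid distribution.
            new_estimates[i] += delta * (heard - estimates[i])
        estimates = new_estimates
    return estimates

# Example: 20 agents on a ring, samples drawn i.i.d. from a 4-valued distribution.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, m = 20, 4
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        A[i, (i - 1) % n] = A[i, (i + 1) % n] = 1
    samples = rng.choice(m, size=n, p=[0.4, 0.3, 0.2, 0.1])
    est = social_sampling(A, samples, m)
    print("empirical distribution:", np.bincount(samples, minlength=m) / n)
    print("mean agent estimate:  ", est.mean(axis=0).round(3))

In this sketch the expected update drives each agent toward its neighbors' current estimates, which is the consensus-plus-noise structure that stochastic-approximation arguments typically exploit; it is meant only to make the abstract's description concrete, not to reproduce the paper's analysis.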
Keywords :
learning (artificial intelligence); multi-agent systems; sampling methods; discrete distribution; distributed learning; message passing model; network topology; social networks; social sampling; stochastic approximation; Approximation methods; Convergence; Histograms; Noise; Protocols; Stochastic processes; Vectors; Distributions; independent and identically distributed (i.i.d.)
Journal_Title :
IEEE Transactions on Automatic Control
DOI :
10.1109/TAC.2014.2329611