Title :
Distributed inference over regression and classification models
Author :
Towfic, Zaid J. ; Chen, Jianshu ; Sayed, Ali H.
Author_Institution :
Electr. Eng. Dept., Univ. of California, Los Angeles, Los Angeles, CA, USA
Abstract :
We study the distributed inference task over regression and classification models whose likelihood function is strongly log-concave. We show that diffusion strategies allow the KL divergence between two likelihood functions to converge to zero at the rate 1/(Ni), both on average and with high probability, where N is the number of nodes in the network and i is the iteration index. We derive asymptotic expressions for the expected regularized KL divergence and show that the diffusion strategy can outperform both non-cooperative and conventional centralized strategies, since diffusion implementations can weigh each node's contribution in proportion to its noise level.
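The key mechanism the abstract credits is that diffusion can weight each node's contribution by its noise level. Below is a minimal, self-contained sketch of an adapt-then-combine (ATC) diffusion iteration for a scalar mean-estimation problem; it is an illustration, not the paper's algorithm. All names, the fully connected topology, the step size, and the inverse-noise-variance combination weights are assumptions introduced for the example.

```python
import random

def diffusion_estimate(w_true=1.0, N=10, iters=2000, mu=0.01, seed=0):
    """Adapt-then-combine (ATC) diffusion over a fully connected network.

    Each node k observes d_k(i) = w_true + v_k(i), where v_k has standard
    deviation sigma_k. Combination weights are chosen proportional to
    1/sigma_k^2 (inverse noise variance), so noisier nodes contribute
    less to the fused estimate -- a simple instance of weighting a
    node's contribution in proportion to its noise level.
    """
    rng = random.Random(seed)
    sigmas = [0.1 + 0.2 * k for k in range(N)]   # heterogeneous noise levels (assumed)
    inv_var = [1.0 / s ** 2 for s in sigmas]
    total = sum(inv_var)
    a = [v / total for v in inv_var]             # combination weights, sum to 1
    w = [0.0] * N                                # local estimates at each node
    for _ in range(iters):
        # Adapt: each node takes a local stochastic-gradient (LMS) step
        psi = []
        for k in range(N):
            d = w_true + rng.gauss(0.0, sigmas[k])
            psi.append(w[k] + mu * (d - w[k]))
        # Combine: nodes fuse the intermediate estimates with weights a_k;
        # with a fully connected network every node reaches the same fusion
        fused = sum(a[k] * psi[k] for k in range(N))
        w = [fused] * N
    return w[0]

print(diffusion_estimate())  # converges close to w_true = 1.0
```

With the heterogeneous noise profile above, uniform averaging would let the noisiest nodes degrade the fused estimate; the inverse-variance weights suppress them, which is the qualitative effect the abstract attributes to diffusion over naive centralized averaging.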
Keywords :
inference mechanisms; maximum likelihood detection; regression analysis; asymptotic expressions; classification models; diffusion strategies; distributed inference; expected regularized KL divergence; likelihood function; node contribution; noise level; regression models; Approximation methods; Convergence; Logistics; Noise; Optimization; Stochastic processes; Vectors; Kullback-Leibler divergence; diffusion adaptation; distributed classification; distributed regression; relative entropy;
Conference_Titel :
2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Vancouver, BC
DOI :
10.1109/ICASSP.2013.6638696