Title :
Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization
Author :
Nguyen, XuanLong ; Wainwright, Martin J. ; Jordan, Michael I.
Author_Institution :
Dept. of Stat., Univ. of Michigan, Ann Arbor, MI, USA
Abstract :
We develop and analyze M-estimation methods for divergence functionals and the likelihood ratios of two probability distributions. Our method is based on a nonasymptotic variational characterization of f-divergences, which allows the problem of estimating divergences to be tackled via convex empirical risk optimization. The resulting estimators are simple to implement, requiring only the solution of standard convex programs. We present an analysis of consistency and convergence for these estimators. Given conditions only on the ratios of densities, we show that our estimators can achieve optimal minimax rates for the likelihood ratio and the divergence functionals in certain regimes. We derive an efficient optimization algorithm for computing our estimates, and illustrate their convergence behavior and practical viability by simulations.
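The abstract's core idea, estimating a divergence and the likelihood ratio by maximizing a variational lower bound over a function class, can be sketched as follows. This is a minimal illustration, not the paper's estimator: it uses the variational bound D_KL(P||Q) = sup_{g>0} E_P[log g] - E_Q[g] + 1 with a hypothetical log-linear family g(x) = exp(a*x + b) in place of the RKHS function classes analyzed in the paper, and plain gradient ascent in place of the paper's optimization algorithm. The empirical objective is concave in (a, b), so this is a convex risk minimization problem, and the maximizing g serves as an estimate of the density ratio dP/dQ.

```python
import numpy as np

def estimate_kl(x_p, x_q, lr=0.05, n_iter=3000):
    """Estimate KL(P||Q) from samples x_p ~ P and x_q ~ Q via the
    variational bound  D_KL = sup_{g>0} E_P[log g] - E_Q[g] + 1.

    g is restricted to the (assumed, illustrative) log-linear family
    g(x) = exp(a*x + b); the empirical objective is concave in (a, b),
    so gradient ascent reaches the global optimum.
    """
    a, b = 0.0, 0.0
    for _ in range(n_iter):
        # log g(x) = a*x + b, so E_P[log g] has gradient (mean(x_p), 1).
        # E_Q[g] = E_Q[exp(a*x + b)] contributes the concave penalty term.
        gq = np.exp(a * x_q + b)
        grad_a = x_p.mean() - (gq * x_q).mean()
        grad_b = 1.0 - gq.mean()
        a += lr * grad_a
        b += lr * grad_b
    # Plug the fitted g into the empirical variational objective.
    kl = (a * x_p + b).mean() - np.exp(a * x_q + b).mean() + 1.0
    ratio = lambda x: np.exp(a * x + b)  # estimated likelihood ratio dP/dQ
    return kl, ratio
```

For P = N(1, 1) and Q = N(0, 1), the true ratio dP/dQ = exp(x - 1/2) lies inside the log-linear family, and the true divergence is KL = 1/2; the estimate converges to these values as the sample size grows.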
Keywords :
convex programming; functional analysis; maximum likelihood estimation; minimax techniques; risk analysis; statistical distributions; variational techniques; M-estimation method; convex empirical risk optimization; convex risk minimization; divergence functional estimation; f-divergence; likelihood ratio; nonasymptotic variational characterization; optimal minimax rate; probability distribution; Convergence; Convex functions; Entropy; Estimation; Kernel; Measurement; Probability distribution; f-divergence; Convex optimization; Kullback-Leibler (KL) divergence; M-estimation; density ratio estimation; divergence estimation; reproducing kernel Hilbert space (RKHS); surrogate loss functions
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2010.2068870