  • DocumentCode
    639928
  • Title
    Logarithmic Sobolev inequalities and strong data processing theorems for discrete channels
  • Author
    Raginsky, Maxim
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Univ. of Illinois, Urbana, IL, USA
  • fYear
    2013
  • fDate
    7-12 July 2013
  • Firstpage
    419
  • Lastpage
    423
  • Abstract
    The noisiness of a channel can be measured by comparing suitable functionals of the input and output distributions. For instance, if we fix a reference input distribution, then the worst-case ratio of output relative entropy to input relative entropy for any other input distribution is bounded by one, by the data processing theorem. However, for a fixed reference input distribution, this quantity may be strictly smaller than one, giving so-called strong data processing inequalities (SDPIs). This paper shows that the problem of determining both the best constant in an SDPI and any input distributions that achieve it can be addressed using so-called logarithmic Sobolev inequalities, which relate input relative entropy to certain measures of input-output correlation. Another contribution is a proof of equivalence between SDPIs and a limiting case of certain strong data processing inequalities for the Rényi divergence.
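  • Illustrative_Example
    A minimal numerical sketch, added for illustration and not part of the original record: it probes the ratio described in the abstract for a hypothetical binary symmetric channel with crossover probability delta and a uniform reference input P. The scan over other inputs Q checks that D(QW||PW)/D(Q||P) stays below one, and compares its empirical maximum with (1 - 2*delta)^2, the commonly cited contraction coefficient for this channel; the channel choice and that closed-form value are assumptions made here, not statements taken from the abstract.

      import numpy as np

      def kl(p, q):
          # Relative entropy D(p || q) in nats for distributions on the same finite alphabet.
          mask = p > 0
          return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

      # Hypothetical channel: binary symmetric channel with crossover probability delta,
      # written as a row-stochastic matrix W[x, y] = Pr(Y = y | X = x).
      delta = 0.1
      W = np.array([[1 - delta, delta],
                    [delta, 1 - delta]])

      # Fixed reference input distribution P and its induced output distribution PW.
      P = np.array([0.5, 0.5])
      PW = P @ W

      # Scan other input distributions Q and record the ratio of output relative entropy
      # to input relative entropy; the data processing theorem bounds it by one.
      ratios = []
      for q0 in np.linspace(0.01, 0.99, 99):
          Q = np.array([q0, 1.0 - q0])
          d_in = kl(Q, P)
          if d_in > 1e-12:  # skip Q = P, where the ratio is 0/0
              ratios.append(kl(Q @ W, PW) / d_in)

      print("empirical worst-case ratio:", max(ratios))
      # Assumed closed-form SDPI constant for the BSC with uniform reference input,
      # printed for comparison (an assumption of this sketch, not taken from the record).
      print("(1 - 2*delta)**2          :", (1 - 2 * delta) ** 2)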
  • Keywords
    entropy; Rényi divergence; SDPI; data processing theorem; discrete channels; fixed reference input distribution; logarithmic Sobolev inequalities; relative entropy; strong data processing inequalities; Correlation; Data processing; Entropy; Equations; Information theory; Limiting; Markov processes
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2013 IEEE International Symposium on Information Theory Proceedings (ISIT)
  • Conference_Location
    Istanbul
  • ISSN
    2157-8095
  • Type
    conf
  • DOI
    10.1109/ISIT.2013.6620260
  • Filename
    6620260