• DocumentCode
    739766
  • Title

Information-Theoretic Applications of the Logarithmic Probability Comparison Bound

  • Author

    Atar, Rami ; Merhav, Neri

  • Author_Institution
Department of Electrical Engineering, Technion–Israel Institute of Technology, Haifa, Israel
  • Volume
    61
  • Issue
    10
  • fYear
    2015
  • Firstpage
    5366
  • Lastpage
    5386
  • Abstract
    A well-known technique for estimating the probabilities of rare events in general, and in information theory in particular (used, for example, in the sphere-packing bound), is to find a reference probability measure under which the event of interest has probability of order one, and to estimate the probability in question by means of the Kullback–Leibler divergence. A method has recently been proposed in [2] that can be viewed as an extension of this idea, in which the probability under the reference measure may itself decay exponentially and the Rényi divergence is used instead. The purpose of this paper is to demonstrate the usefulness of this approach in various information-theoretic settings. For the problem of channel coding, we provide a general methodology for obtaining matched, mismatched, and robust error exponent bounds, as well as new results for a variety of particular channel models. Other applications we address include rate-distortion coding and the problem of guessing.
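    The contrast drawn in the abstract can be illustrated by two change-of-measure inequalities. The sketch below gives one standard form of each, obtained from the data-processing inequality for the respective divergence; it is an illustrative assumption about the flavor of bound involved, not a quotation of the precise logarithmic probability comparison bound of [2] or of the paper's results.

    % Classical change of measure: choose a reference measure Q under which
    % the rare event A has probability of order one; data processing for the
    % Kullback-Leibler divergence D(Q||P) then gives (h_b = binary entropy)
    \[
      \log P(A) \;\ge\; -\,\frac{D(Q\,\|\,P) + h_b\!\left(Q(A)\right)}{Q(A)},
    \]
    % so that \log P(A) is essentially bounded below by -D(Q||P) when Q(A) is
    % close to one.
    %
    % Renyi-divergence extension (order \alpha > 1, assumed form): Q(A) may
    % itself decay exponentially, and the same data-processing argument yields
    \[
      \log P(A) \;\ge\; \frac{\alpha}{\alpha-1}\,\log Q(A) \;-\; D_{\alpha}(Q\,\|\,P).
    \]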
  • Keywords
    Channel coding; Channel models; Context; Q measurement; Upper bound; Change-of-measure; Rényi divergence; error exponent; mismatch
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type

    jour

  • DOI
    10.1109/TIT.2015.2464378
  • Filename
    7181681