• DocumentCode
    916285
  • Title
    Information rates of stationary ergodic finite-alphabet sources
  • Author
    Gray, Robert M.
  • Volume
    17
  • Issue
    5
  • fYear
    1971
  • fDate
    9/1/1971
  • Firstpage
    516
  • Lastpage
    523
  • Abstract
    The generalized Shannon lower bound to the rate-distortion function R(D) for stationary sources with memory is extended to a wide class of distortion measures involving no symmetry conditions. The lower bound R_L(D) is a reasonably simple function of the entropy and marginal probabilities of the source and the per-letter distortion measure. Sufficient conditions only slightly less general than necessary conditions are given for the existence of a strictly positive cutoff distortion D_c such that R(D) = R_L(D) for D \leq D_c. The sufficient conditions are the most general to date and include all previously known examples. This provides a nearly complete resolution of the question of when the Shannon-type lower bound to the rate-distortion function of a source with memory is tight. The results are applied to generalize earlier results for balanced distortion measures and Markov sources to nonbalanced distortion measures and wide-sense Markov sources. As a special case, it is shown that D_c > 0 for all finite-alphabet autoregressive sources. As an example, R_L(D) is evaluated for the first-order ternary autoregressive source for a balanced (Hamming) and a nonbalanced (modular-distance) distortion measure. A simple lower bound to D_c is derived for this example.
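    For the balanced (Hamming) distortion measure mentioned in the abstract, the Shannon-type lower bound has a well-known closed form: R_L(D) = H - h(D) - D log(K - 1) for a K-letter source with entropy rate H, where h is the binary entropy function. A minimal sketch of that closed form (the function name and the nats convention are illustrative choices, not from the paper):

    ```python
    import math

    def shannon_lower_bound(entropy_rate, D, K):
        """Shannon-type lower bound R_L(D), in nats, for a K-letter source
        under the balanced (Hamming) distortion measure:
            R_L(D) = H - h(D) - D * log(K - 1),
        where H is the source entropy rate and h is the binary entropy."""
        if D <= 0.0:
            return entropy_rate  # zero distortion: bound equals entropy rate
        if D >= 1.0:
            return 0.0
        h = -D * math.log(D) - (1.0 - D) * math.log(1.0 - D)  # binary entropy (nats)
        return max(0.0, entropy_rate - h - D * math.log(K - 1))
    ```

    For the ternary case (K = 3), the bound decreases from the entropy rate as D grows; the paper's cutoff D_c is the largest distortion at which R(D) still coincides with this lower bound.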
  • Keywords
    Information rates; Rate-distortion theory; Codes; Distortion measurement; Entropy; Sufficient conditions
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.1971.1054694
  • Filename
    1054694