• DocumentCode
    1369902
  • Title
    A Neyman–Pearson Approach to Universal Erasure and List Decoding
  • Author
    Moulin, Pierre
  • Author_Institution
    Electr. & Comput. Eng. Dept., Univ. of Illinois at Urbana-Champaign, Urbana, IL, USA
  • Volume
    55
  • Issue
    10
  • fYear
    2009
  • Firstpage
    4462
  • Lastpage
    4478
  • Abstract
    When information is to be transmitted over an unknown, possibly unreliable channel, an erasure option at the decoder is desirable. Using constant-composition random codes, we propose a generalization of Csiszár and Körner's maximum mutual information (MMI) decoder with an erasure option for discrete memoryless channels. The new decoder is parameterized by a weighting function that is designed to optimize the fundamental tradeoff between undetected-error and erasure exponents for a compound class of channels. The class of weighting functions may be further enlarged to optimize a similar tradeoff for list decoders; in that case, undetected-error probability is replaced with the average number of incorrect messages in the list. Explicit solutions are identified. The optimal exponents admit simple expressions in terms of the sphere-packing exponent, at all rates below capacity. For small erasure exponents, these expressions coincide with those derived by Forney (1968) for symmetric channels, using maximum a posteriori decoding. Thus, for those channels at least, ignorance of the channel law is inconsequential. Conditions for optimality of the Csiszár–Körner rule and of the simpler empirical-mutual-information thresholding rule are identified. The error exponents are evaluated numerically for the binary symmetric channel.
  • Keywords
    error statistics; maximum likelihood decoding; Csiszár maximum mutual information decoder; Körner maximum mutual information decoder; Neyman–Pearson approach; binary symmetric channel; constant-composition random codes; discrete memoryless channels; empirical-mutual-information thresholding rule; erasure exponents; list decoding; maximum a posteriori decoding; sphere-packing exponent; undetected error; undetected-error probability; universal erasure; weighting function; Design optimization; Information theory; Maximum likelihood decoding; Memoryless systems; Minimax techniques; Monte Carlo methods; Mutual information; Random variables; Testing; Constant-composition codes; Neyman–Pearson hypothesis testing; erasures; error exponents; list decoding; maximum mutual information (MMI) decoder; method of types; random codes; sphere packing; universal decoding
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2009.2027569
  • Filename
    5238746
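
The abstract above mentions the simpler empirical-mutual-information thresholding rule as a special case of the proposed universal erasure decoder. The following Python sketch illustrates only that thresholded MMI idea under stated assumptions: the function names (`empirical_mutual_info`, `mmi_threshold_decode`), the natural-log (nats) convention, the i.i.d. random codebook (the paper uses constant-composition codes), and the single fixed threshold are all illustrative and not taken from the paper, whose actual decoder optimizes a weighting function over joint types and is not reproduced here.

```python
import numpy as np

def empirical_mutual_info(x, y, ax_size, ay_size):
    """Empirical mutual information (in nats) of the joint type of (x, y)."""
    joint = np.zeros((ax_size, ay_size))
    for a, b in zip(x, y):
        joint[a, b] += 1.0
    joint /= len(x)
    px = joint.sum(axis=1, keepdims=True)   # empirical input type
    py = joint.sum(axis=0, keepdims=True)   # empirical output type
    nz = joint > 0                          # avoid log(0); 0*log(0) terms vanish
    return float(np.sum(joint[nz] * np.log(joint[nz] / (px @ py)[nz])))

def mmi_threshold_decode(codebook, y, ax_size, ay_size, threshold):
    """Thresholded MMI rule (illustrative): decode the codeword maximizing the
    empirical mutual information with y; erase (return None) if the maximum
    score falls below `threshold`."""
    scores = [empirical_mutual_info(xm, y, ax_size, ay_size) for xm in codebook]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Toy usage: random binary codebook observed through a BSC with crossover 0.1.
rng = np.random.default_rng(0)
n, M = 200, 8
codebook = rng.integers(0, 2, size=(M, n))
m_true = 3
y = codebook[m_true] ^ (rng.random(n) < 0.1).astype(int)
print(mmi_threshold_decode(codebook, y, 2, 2, threshold=0.2))  # expect 3, or None (erasure)
```

Raising the threshold trades more frequent erasures for fewer undetected errors, which is the Neyman–Pearson-style tradeoff the paper analyzes; the paper's decoder refines this by weighting the joint types of all candidate codewords rather than comparing a single score to a fixed threshold.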