• DocumentCode
    818303
  • Title
    New Bounds on the Expected Length of Optimal One-to-One Codes
  • Author
    Cheng, Jay ; Huang, Tien-Ke ; Weidmann, Claudio
  • Author_Institution
    Dept. of Electr. Eng., Nat. Tsing Hua Univ., Hsinchu
  • Volume
    53
  • Issue
    5
  • fYear
    2007
  • fDate
    5/1/2007 12:00:00 AM
  • Firstpage
    1884
  • Lastpage
    1895
  • Abstract
    In this correspondence, we consider one-to-one encodings for a discrete memoryless source, which are "one-shot" encodings associating a distinct codeword with each source symbol. Such encodings can be employed when only a single source symbol, rather than a sequence of source symbols, needs to be transmitted; for example, such a situation arises when the last message must be acknowledged before the next message can be transmitted. We consider two slightly different types of one-to-one encodings (depending on whether or not the empty codeword is used) and obtain lower and upper bounds on the expected length of optimal one-to-one codes. We first give an extension of a known tight lower bound on the expected length of optimal one-to-one codes to the case in which the size of the source alphabet is finite and partial information about the source symbol probabilities is available. As expected, our lower bound is no less than the previously known lower bound obtained without side information about the source symbol probabilities. We then consider the case in which the source entropy is available and derive arbitrarily tight lower bounds on the expected length of optimal one-to-one codes. We also derive arbitrarily tight lower bounds for the case in which both the source entropy and the probability of the most likely source symbol are available. Finally, given that the probability of the most likely source symbol is available, we obtain an upper bound on the expected length of optimal one-to-one codes. Our upper bound is tighter than the best upper bound known in the literature.
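    For orientation, the quantity bounded in the abstract is easy to compute once the source symbol probabilities are fully known: an optimal one-to-one code simply assigns the shortest distinct binary codewords to the most likely symbols, so the i-th most likely symbol receives a codeword of length floor(log2(i)) when the empty codeword is allowed, or floor(log2(i+1)) when it is not. The following minimal Python sketch (not taken from the paper; the function name and example probabilities are illustrative) computes the resulting expected length for both variants.

    import math

    def optimal_one_to_one_expected_length(probs, allow_empty_codeword=True):
        """Expected codeword length of an optimal binary one-to-one code.

        Symbols are sorted by decreasing probability; the i-th most likely
        symbol (i = 1, 2, ...) gets a codeword of length floor(log2(i)) if
        the empty codeword is allowed, and floor(log2(i + 1)) otherwise.
        """
        p = sorted(probs, reverse=True)
        offset = 0 if allow_empty_codeword else 1
        lengths = (math.floor(math.log2(i + offset)) for i in range(1, len(p) + 1))
        return sum(pi * li for pi, li in zip(p, lengths))

    # Illustrative 4-symbol source with entropy 1.75 bits
    probs = [0.5, 0.25, 0.125, 0.125]
    print(optimal_one_to_one_expected_length(probs, allow_empty_codeword=True))   # 0.625
    print(optimal_one_to_one_expected_length(probs, allow_empty_codeword=False))  # 1.25

    Note that both values fall below the source entropy of 1.75 bits; because one-to-one codes need not be prefix-free or uniquely decodable over symbol sequences, their expected length can undercut the entropy, which is why the bounds studied in this correspondence differ from the usual source coding bounds.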
  • Keywords
    discrete systems; entropy; memoryless systems; probability; source coding; discrete memoryless source; encoding; entropy; source symbol probability; Channel coding; Concatenated codes; Decoding; Educational institutions; Encoding; Entropy; Memoryless systems; Random variables; Source coding; Upper bound; Nonprefix codes; one-to-one codes; source coding;
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2007.894628
  • Filename
    4167726