• DocumentCode
    1401751
  • Title
    On fixed-database universal data compression with limited memory
  • Author
    Hershkovits, Yehuda; Ziv, Jacob
  • Author_Institution
    Dept. of Electr. Eng., Technion-Israel Inst. of Technol., Haifa, Israel
  • Volume
    43
  • Issue
    6
  • fYear
    1997
  • fDate
    11/1/1997 12:00:00 AM
  • Firstpage
    1966
  • Lastpage
    1976
  • Abstract
    The amount of fixed side information required for lossless data compression is discussed. Nonasymptotic coding and converse theorems are derived for data-compression algorithms with fixed statistical side information (“training sequence”) that is not large enough to yield the ultimate compression, namely, the entropy of the source.
  • Keywords
    sequences; source coding; converse theorems; entropy; fixed side information; fixed-database universal data compression; limited memory; lossless data compression; nonasymptotic coding; statistical side information; training sequence; Convergence; Data compression; Databases; Encoding; Entropy; Information theory; Jacobian matrices; Random variables; Source coding; Statistics;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/18.641559
  • Filename
    641559