Title : 
On fixed-database universal data compression with limited memory
            Author : 
Hershkovits, Yehuda ; Ziv, Jacob
            Author_Institution : 
Dept. of Electr. Eng., Technion-Israel Inst. of Technol., Haifa, Israel
            fDate : 
1 November 1997
            Abstract : 
The amount of fixed side information required for lossless data compression is discussed. Nonasymptotic coding and converse theorems are derived for data-compression algorithms with fixed statistical side information (a "training sequence") that is too small to yield the ultimate compression, namely, the entropy of the source.
            Keywords : 
sequences; source coding; converse theorems; entropy; fixed side information; fixed-database universal data compression; limited memory; lossless data compression; nonasymptotic coding; statistical side information; training sequence; Convergence; Data compression; Databases; Encoding; Entropy; Information theory; Jacobian matrices; Random variables; Source coding; Statistics;
            Journal_Title : 
IEEE Transactions on Information Theory