Title :
Exact characterization of the minimax loss in error exponents of universal decoders
Author :
Akirav, Yaniv ; Merhav, Neri
Author_Institution :
EE Dept., Technion - Israel Inst. of Technol., Haifa
Abstract :
Universally achievable error exponents pertaining to certain families of channels (most notably, discrete memoryless channels (DMCs)) and various ensembles of random codes are studied by combining the competitive minimax approach, proposed by Feder and Merhav, with the Chernoff bound and Gallager's techniques for the analysis of error exponents. In particular, we derive a single-letter expression for the largest universally achievable fraction ζ of the optimum error exponent pertaining to optimum ML decoding. Moreover, a simpler single-letter expression for a lower bound on ζ is presented. To demonstrate the tightness of this lower bound, we use it to show that ζ = 1 for the binary symmetric channel (BSC) when the random coding distribution is uniform over: (i) all codes (of a given rate), and (ii) all linear codes, in agreement with well-known results. We also show that ζ = 1 for the uniform ensemble of systematic linear codes, and for that of time-varying convolutional codes in the bit-error-rate sense. For the latter case, we also show how the corresponding universal decoder can be efficiently implemented using a slightly modified version of the Viterbi algorithm that employs two trellises.
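As a rough sketch of the quantity being characterized (using assumed notation, not taken verbatim from the paper): if E_θ(R) denotes the error exponent of the optimum ML decoder for channel θ at coding rate R, and E_U(R, θ) the exponent achieved by a universal decoder U that does not know θ, the competitive minimax fraction may be written as

$$ \zeta \;=\; \sup_{U}\;\inf_{\theta}\;\frac{E_U(R,\theta)}{E_\theta(R)}, $$

where the supremum is over decoders independent of θ and the infimum is over the channel family. In this notation, ζ = 1 means the universal decoder incurs no loss in error exponent relative to ML decoding, uniformly over the family.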
Keywords :
Viterbi decoding; convolutional codes; error statistics; linear codes; maximum likelihood decoding; memoryless systems; minimax techniques; random codes; time-varying systems; Chernoff bound; Gallager's techniques; Viterbi algorithm; binary symmetric channel; bit-error-rate sense; competitive minimax approach; discrete memoryless channels; error exponents; linear codes; minimax loss; optimum ML decoding; random codes; random coding distribution; single-letter expression; time-varying convolutional codes; universal decoders; Additive noise; Convolutional codes; Error analysis; Error probability; Linear code; Maximum likelihood decoding; Memoryless systems; Minimax techniques; Mutual information; Viterbi algorithm;
Conference_Titel :
Information Theory, 2008. ISIT 2008. IEEE International Symposium on
Conference_Location :
Toronto, ON
Print_ISBN :
978-1-4244-2256-2
Electronic_ISBN :
978-1-4244-2257-9
DOI :
10.1109/ISIT.2008.4595293