Title :
A Closed-Form Expression for the Exact Bit Error Probability for Viterbi Decoding of Convolutional Codes
Author :
Bocharova, Irina E.; Hug, Florian; Johannesson, Rolf; Kudryashov, Boris D.
Author_Institution :
Dept. of Inf. Syst., St. Petersburg State Univ. of Inf. Technol., Mech. & Opt., St. Petersburg, Russia
fDate :
1 July 2012
Abstract :
In 1995, Best et al. published a formula for the exact bit error probability for Viterbi decoding of the rate R = 1/2, memory m = 1 (two-state) convolutional encoder with generator matrix G(D) = (1  1 + D) when used to communicate over the binary symmetric channel. Their formula was later extended by Lentmaier et al. to the rate R = 1/2, memory m = 2 (four-state) convolutional encoder with generator matrix G(D) = (1 + D^2  1 + D + D^2). In this paper, a different approach to deriving the exact bit error probability is described. A general recurrent matrix equation, connecting the average information weight at the current and previous states of a trellis section of the Viterbi decoder, is derived and solved. The general solution of this matrix equation yields a closed-form expression for the exact bit error probability. The expressions obtained by Best et al. for the two-state encoder and by Lentmaier et al. for the four-state encoder follow as special cases. The closed-form expression derived in this paper is evaluated for various encoder realizations, including rate R = 1/2 and R = 2/3 encoders with as many as 16 states. Moreover, it is shown that the approach extends straightforwardly to communication over the quantized additive white Gaussian noise channel.
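The following is a minimal Monte Carlo sketch, not the closed-form derivation presented in the paper: it simulates Viterbi decoding of the two-state encoder G(D) = (1  1 + D) over a BSC with crossover probability p and estimates the bit error probability empirically. All function names and parameter choices here are illustrative assumptions; such a simulation is only a way to numerically cross-check an exact expression of the kind the paper derives.

```python
# Hedged sketch: estimate P_b for Viterbi decoding of the rate-1/2, memory-1
# encoder G(D) = (1  1+D) over a BSC. Names (encode, viterbi_decode,
# estimate_ber) are illustrative, not from the paper.
import random

def encode(u):
    """Encode with G(D) = (1  1+D); the state is the previous input bit."""
    s, out = 0, []
    for b in u:
        out.extend([b, b ^ s])        # v1 = u_t, v2 = u_t + u_{t-1}
        s = b
    return out

def viterbi_decode(r):
    """Hard-decision Viterbi decoding (Hamming metric) on the two-state trellis."""
    INF = float("inf")
    metric = [0, INF]                 # start in the all-zero state
    decisions = []                    # decisions[t][next_state] = surviving previous state
    for t in range(0, len(r), 2):
        r1, r2 = r[t], r[t + 1]
        new_metric, step = [INF, INF], [0, 0]
        for s in (0, 1):              # previous state = previous input bit
            if metric[s] == INF:
                continue
            for b in (0, 1):          # hypothesised input bit; next state equals b
                v1, v2 = b, b ^ s     # branch labels for G(D) = (1  1+D)
                m = metric[s] + (v1 != r1) + (v2 != r2)
                if m < new_metric[b]:
                    new_metric[b] = m
                    step[b] = s
        decisions.append(step)
        metric = new_metric
    # Trace back the survivor; each decoded bit equals the state it drives into.
    state = 0 if metric[0] <= metric[1] else 1
    u_hat = []
    for step in reversed(decisions):
        u_hat.append(state)
        state = step[state]
    u_hat.reverse()
    return u_hat

def estimate_ber(p, n_bits=100_000, seed=1):
    """Monte Carlo estimate of the bit error probability at BSC crossover p."""
    random.seed(seed)
    u = [random.randint(0, 1) for _ in range(n_bits)]
    r = [bit ^ (random.random() < p) for bit in encode(u)]   # BSC: flip w.p. p
    u_hat = viterbi_decode(r)
    return sum(a != b for a, b in zip(u, u_hat)) / n_bits

if __name__ == "__main__":
    for p in (0.01, 0.05, 0.1):
        print(f"p = {p}: estimated P_b ~ {estimate_ber(p):.5f}")
```

Such estimates carry statistical error on the order of 1/sqrt(n_bits) and treat the unterminated trellis tail approximately, which is exactly why an exact closed-form expression is valuable as a reference.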
Keywords :
AWGN channels; Viterbi decoding; channel coding; convolutional codes; error statistics; matrix algebra; Viterbi decoding; binary symmetric channel; closed-form expression; convolutional codes; convolutional encoder; exact bit error probability; four-state encoder; generator matrix; quantized additive white Gaussian noise channel; recurrent matrix equation; trellis section; two-state encoder; Convolutional codes; Decoding; Equations; Error probability; Measurement; Vectors; Viterbi algorithm; Additive white Gaussian noise (AWGN) channel; Viterbi decoding; binary symmetric channel (BSC); bit error probability; convolutional code; convolutional encoder; exact bit error probability;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2012.2193375