DocumentCode
46075
Title
On Decoding Irregular Tanner Codes With Local-Optimality Guarantees
Author
Halabi, Nissim; Even, Guy
Author_Institution
Sch. of Electr. Eng., Tel-Aviv Univ., Tel-Aviv, Israel
Volume
60
Issue
1
fYear
2014
fDate
Jan. 2014
Firstpage
191
Lastpage
211
Abstract
We consider decoding of binary linear Tanner codes using message-passing iterative decoding and linear-programming (LP) decoding in memoryless binary-input output-symmetric (MBIOS) channels. We present new certificates that are based on a combinatorial characterization of the local optimality of a codeword in irregular Tanner codes with respect to any MBIOS channel. This characterization generalizes (Arora et al., Proc. ACM Symp. Theory of Computing, 2009) and (Vontobel, Proc. Inf. Theory and Appl. Workshop, 2010), and is based on a conical combination of normalized weighted subtrees in the computation trees of the Tanner graph. These subtrees may have any finite height h (even equal to or greater than half the girth of the Tanner graph). In addition, the degrees of local-code nodes in these subtrees are not restricted to two (i.e., these subtrees are not restricted to skinny trees). We prove that local optimality in this new characterization implies maximum-likelihood (ML) optimality and LP optimality, and show that a certificate can be computed efficiently. We also present a new message-passing iterative decoding algorithm, called normalized weighted min-sum (NWMS). NWMS decoding is a belief-propagation (BP) type algorithm that applies to any irregular binary Tanner code with single parity-check local codes (e.g., low-density and high-density parity-check codes). We prove that if a locally optimal codeword with respect to height parameter h exists (where, notably, h is not limited by the girth of the Tanner graph), then NWMS decoding finds this codeword in h iterations. The decoding guarantee of the NWMS decoding algorithm applies whenever a locally optimal codeword exists. Because local optimality of a codeword implies that it is the unique ML codeword, the decoding guarantee also provides an ML certificate for this codeword. Finally, we apply the new local-optimality characterization to regular Tanner codes and prove lower bounds on the noise thresholds of LP decoding in MBIOS channels. When the noise is below these lower bounds, the probability that LP decoding fails to decode the transmitted codeword decays doubly exponentially in the girth of the Tanner graph.
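To make the min-sum family of decoders that NWMS belongs to concrete, the following is a minimal illustrative sketch in Python of normalized min-sum message passing on a parity-check matrix. It is a sketch under stated assumptions, not the paper's NWMS algorithm: the names (H, llr, num_iters, weight) are hypothetical, and the single scalar normalization weight stands in for the paper's level-dependent weighting of the computation tree.

    # Illustrative sketch only: plain normalized min-sum decoding for a binary
    # code given by parity-check matrix H (LLR convention: llr[j] > 0 favors bit 0).
    import numpy as np

    def min_sum_decode(H, llr, num_iters=10, weight=1.0):
        """H: (m, n) binary parity-check matrix; llr: length-n channel LLR vector."""
        m, n = H.shape
        x_hat = (llr < 0).astype(int)                      # hard decision from the channel
        v2c = np.where(H == 1, llr[np.newaxis, :], 0.0)    # variable-to-check messages
        for _ in range(num_iters):
            # Check-to-variable update: product of signs and minimum magnitude
            # over the other incident edges, scaled by the normalization weight.
            c2v = np.zeros_like(v2c)
            for i in range(m):
                idx = np.flatnonzero(H[i])
                msgs = v2c[i, idx]
                for k, j in enumerate(idx):
                    others = np.delete(msgs, k)
                    sign = np.prod(np.sign(others)) if others.size else 1.0
                    mag = np.min(np.abs(others)) if others.size else 0.0
                    c2v[i, j] = weight * sign * mag
            # Variable-to-check update and tentative hard decision.
            total = llr + c2v.sum(axis=0)
            v2c = np.where(H == 1, total[np.newaxis, :] - c2v, 0.0)
            x_hat = (total < 0).astype(int)
            if not np.any(H @ x_hat % 2):                  # all parity checks satisfied
                break
        return x_hat

The loop count plays the role of the height parameter h in the abstract: each iteration extends the computation tree by one level, and the decoder stops early once the tentative decision satisfies every parity check.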
Keywords
binary codes; channel coding; graph theory; iterative decoding; linear codes; linear programming; maximum likelihood decoding; message passing; BP type algorithm; LP decoding; MBIOS channel; ML optimality; NWMS decoding algorithm; Tanner graph; belief-propagation type algorithm; binary linear Tanner code; irregular Tanner code decoding; linear-programming decoding; local-code node; maximum-likelihood optimality; memoryless binary-input output-symmetric channel; message-passing iterative decoding algorithm; normalized weighted min-sum; parity-check local code; Heuristic algorithms; Iterative decoding; Maximum likelihood decoding; Vectors; Error bounds; Tanner codes; factor graphs; graph cover; LDPC codes; linear programming (LP) decoding; local optimality; max-product algorithm; maximum-likelihood (ML) certificate; memoryless binary-input output-symmetric (MBIOS) channel; message-passing algorithms; min-sum algorithm; thresholds
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.2013.2284912
Filename
6626647
Link To Document