Author_Institution :
Dept. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA
Abstract :
We study certain combinatorial aspects of list decoding, motivated by the exponential gap between the known upper bound (of O(1/γ)) and lower bound (of Ω_p(log(1/γ))) for the list size needed to list decode up to error fraction p with rate γ away from capacity, i.e., rate 1 - h(p) - γ [here p ∈ (0, 1/2) and γ > 0]. Our main result is that in any binary code C ⊆ {0, 1}^n of rate 1 - h(p) - γ, there must exist a set L ⊂ C of Ω_p(1/√γ) codewords such that the average distance of the points in L from their centroid is at most pn. In other words, there must exist Ω_p(1/√γ) codewords with low average radius. The standard notion of list decoding corresponds to working with the maximum distance of a collection of codewords from a center instead of the average distance. The average-radius form is in itself quite natural; for instance, the classical Johnson bound in fact implies average-radius list-decodability. The remaining results concern the standard notion of list decoding and help clarify the current state of affairs regarding combinatorial bounds for list decoding, as follows. First, we give a short, simple proof, over all fixed alphabets, of the above-mentioned Ω_p(log(1/γ)) lower bound. Earlier, this bound followed from a complicated, more general result of Blinovsky. Second, we show that one cannot improve the Ω_p(log(1/γ)) lower bound via techniques based on identifying the zero-rate regime for list decoding of constant-weight codes [this is a typical approach for negative results in coding theory, including the Ω_p(log(1/γ)) list-size lower bound]. On a positive note, our Ω_p(1/√γ) lower bound for average-radius list decoding circumvents this barrier. Third, we exhibit a reverse connection between the existence of constant-weight and general codes for list decoding, showing that the best possible list size, as a function of the gap γ of the rate to the capacity limit, is the same up to constant factors for both constant-weight codes (with weight bounded away from p) and general codes. Fourth, we give simple second-moment-based proofs that w.h.p. a list size of Ω_p(1/γ) is needed for list decoding random codes from errors as well as erasures. For random linear codes, the corresponding list-size bounds are Ω_p(1/γ) for errors and exp(Ω_p(log(1/γ))) for erasures.
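For reference, the two notions of list decoding contrasted in the abstract can be stated precisely; the following is a standard formulation written out here for convenience (the symbols y, L, and dist are illustrative and not quoted from the paper):

% Binary entropy function appearing in the capacity expression 1 - h(p) - \gamma:
h(p) = -p \log_2 p - (1-p) \log_2 (1-p)

% Standard (p, L) list-decodability: every Hamming ball of radius pn contains at most L codewords.
\forall y \in \{0,1\}^n : \quad \bigl|\{\, c \in C : \mathrm{dist}(c, y) \le pn \,\}\bigr| \le L

% Average-radius (p, L) list-decodability: no center y has L+1 codewords at average distance at most pn.
% This is a stronger requirement, since the maximum distance always dominates the average.
\forall y \in \{0,1\}^n,\ \forall \{c_1, \dots, c_{L+1}\} \subseteq C : \quad \frac{1}{L+1} \sum_{i=1}^{L+1} \mathrm{dist}(c_i, y) > pn

In these terms, the main result above says that a binary code of rate 1 - h(p) - γ cannot be (p, L) average-radius list-decodable unless L = Ω_p(1/√γ): taking y to be the centroid (coordinatewise majority) of the exhibited set violates the average-radius condition.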
Keywords :
binary codes; combinatorial mathematics; decoding; linear codes; random codes; average radius list decoding; binary code; classical Johnson bound; coding theory; combinatorial bounds; constant-weight codes; random linear codes; Binary codes; Decoding; Entropy; Hamming distance; Standards; Upper bound; Combinatorial coding theory; linear codes; list error-correction; probabilistic method; random coding;