Title :
Hardness of Reconstructing Multivariate Polynomials over Finite Fields
Author :
Gopalan, Parikshit ; Khot, Subhash ; Saket, Rishi
Author_Institution :
Univ. of Washington, Seattle
Abstract :
We study the polynomial reconstruction problem for low-degree multivariate polynomials over F_2. In this problem, we are given a set of points x ∈ {0,1}^n and target values f(x) ∈ {0,1} for each of these points, with the promise that there is a polynomial over F_2 of degree at most d that agrees with f on a 1 - ε fraction of the points. Our goal is to find a degree d polynomial that has good agreement with f. We show that it is NP-hard to find a polynomial that agrees with f on more than a 1 - 2^{-d} + δ fraction of the points, for any ε, δ > 0. This holds even under the stronger promise that the polynomial fitting the data is in fact linear, whereas the algorithm is allowed to output a polynomial of degree d. Previously, the only known hardness of approximation (or even NP-completeness) was for the case d = 1, which follows from a celebrated result of Håstad. In the setting of computational learning, our result shows the hardness of (non-proper) agnostic learning of parities, where the learner is allowed a low-degree polynomial over F_2 as a hypothesis. This is the first non-proper hardness result for this central problem in computational learning. Our results extend to multivariate polynomial reconstruction over any finite field.
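For concreteness, here is a minimal brute-force sketch (not from the paper, which proves hardness rather than giving an algorithm) of the reconstruction problem the abstract defines: it enumerates every multilinear polynomial of degree at most d over F_2 and reports the best achievable agreement with the data. The function name best_agreement and its interface are illustrative assumptions; the exponential enumeration is exactly what the NP-hardness result says cannot be avoided efficiently beyond agreement 1 - 2^{-d}.

```python
from itertools import combinations, product

def best_agreement(points, values, d):
    """Brute-force polynomial reconstruction over F_2 (illustrative sketch).

    points : list of 0/1 tuples of length n (the points x in {0,1}^n)
    values : list of 0/1 target values f(x), one per point
    d      : degree bound

    Returns the maximum fraction of points on which some multilinear
    polynomial of degree <= d over F_2 agrees with f. Runtime is
    exponential in the number of monomials; for illustration only.
    """
    n = len(points[0])
    # All multilinear monomials of degree <= d; a monomial is a subset
    # of variable indices, with the empty subset as the constant term.
    monomials = [s for k in range(d + 1) for s in combinations(range(n), k)]

    def eval_poly(coeffs, x):
        # Evaluate the polynomial: sum (mod 2) of the selected monomials.
        return sum(c * all(x[i] for i in mono)
                   for c, mono in zip(coeffs, monomials)) % 2

    best = 0.0
    # Enumerate all 2^(#monomials) coefficient vectors over F_2.
    for coeffs in product((0, 1), repeat=len(monomials)):
        agree = sum(eval_poly(coeffs, x) == y
                    for x, y in zip(points, values))
        best = max(best, agree / len(points))
    return best
```

For example, with f the OR function on two variables, best_agreement([(0,0), (0,1), (1,0), (1,1)], [0, 1, 1, 1], 1) returns 0.75: no degree-1 polynomial over F_2 matches OR on all four points, but x1 + x2 (or the constant 1) matches on three of them.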
Keywords :
computational complexity; learning (artificial intelligence); NP-hard problem; approximation hardness; computational learning; multivariate polynomial reconstruction problem; set theory; Application software; Computational complexity; Computer science; Decoding; Engineering profession; Error correction; Error correction codes; Galois fields; Polynomials; Reed-Solomon codes;
Conference_Title :
Foundations of Computer Science, 2007. FOCS '07. 48th Annual IEEE Symposium on
Conference_Location :
Providence, RI
Print_ISBN :
978-0-7695-3010-9
DOI :
10.1109/FOCS.2007.37