DocumentCode :
655189
Title :
Optimal Bounds on Approximation of Submodular and XOS Functions by Juntas
Author :
Feldman, Vitaly ; Vondrak, Jan
Author_Institution :
IBM Res. - Almaden, San Jose, CA, USA
fYear :
2013
fDate :
26-29 Oct. 2013
Firstpage :
227
Lastpage :
236
Abstract :
We investigate the approximability of several classes of real-valued functions by functions of a small number of variables (juntas). Our main results are tight bounds on the number of variables required to approximate a function f : {0,1}^n → [0,1] within ℓ2-error ε over the uniform distribution: If f is submodular, then it is ε-close to a function of O((1/ε²) log(1/ε)) variables. This is an exponential improvement over previously known results [FeldmanKV 2013]. We note that Ω(1/ε²) variables are necessary even for linear functions. If f is fractionally subadditive (XOS), it is ε-close to a function of 2^{O(1/ε²)} variables. This result holds for all functions with low total ℓ1-influence and is a real-valued analogue of Friedgut's theorem for Boolean functions. We show that 2^{Ω(1/ε)} variables are necessary even for XOS functions. As applications of these results, we provide learning algorithms over the uniform distribution. For XOS functions, we give a PAC learning algorithm that runs in time 2^{1/poly(ε)} · poly(n). For submodular functions we give an algorithm in the more demanding PMAC learning model [BalcanHarvey 2012], which requires a multiplicative (1 + γ)-factor approximation with probability at least 1 − ε over the target distribution. Our uniform-distribution algorithm runs in time 2^{1/poly(γε)} · poly(n). This is the first algorithm in the PMAC model that can achieve a constant approximation factor arbitrarily close to 1 for all submodular functions (even over the uniform distribution); it relies crucially on our approximation-by-junta result. As follows from the lower bounds in [FeldmanKV 2013], both of these algorithms are close to optimal. We also give applications to proper learning, testing, and agnostic learning with value queries for these classes.
Keywords :
Boolean functions; computational complexity; function approximation; learning (artificial intelligence); statistical distributions; Boolean functions; Friedgut theorem; juntas; PAC learning algorithm; PMAC learning model; XOS function approximation; agnostic learning; constant approximation factor; exponential improvement; fractionally subadditive; ℓ2-error; multiplicative (1 + γ) factor approximation; optimal bounds; real-valued functions; submodular function approximation; uniform distribution algorithm; Approximation algorithms; Approximation methods; Boolean functions; Context; Cost accounting; Game theory; Testing; PAC learning; approximation; fractionally-subadditive; junta; submodular; testing
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 IEEE 54th Annual Symposium on Foundations of Computer Science (FOCS)
Conference_Location :
Berkeley, CA
ISSN :
0272-5428
Type :
conf
DOI :
10.1109/FOCS.2013.32
Filename :
6686158