Title :
Extracting Computational Entropy and Learning Noisy Linear Functions
Author :
Lee, Chia-Jung ; Lu, Chi-Jen ; Tsai, Shi-Chun
Author_Institution :
Dept. of Comput. Sci., Nat. Chiao Tung Univ., Hsinchu, Taiwan
Abstract :
We study the task of deterministically extracting randomness from sources containing computational entropy. The sources we consider have the form of a conditional distribution (f(X)|X), for some function f and some distribution X, and we say that such a source has computational min-entropy k if any circuit of size 2^k can predict f(x) correctly with probability at most 2^{-k} given an input x sampled from X. We first show that no seedless extractor can extract from a single source of this kind. We then show that extraction becomes possible if we are allowed a seed which is weakly random (instead of perfectly random) but contains some statistical min-entropy, or even a seed which is not random at all but contains some computational min-entropy. This can be seen as a step toward extending the study of multisource extractors from the traditional, statistical setting to a computational setting. We reduce the task of constructing such extractors to a problem in computational learning theory: learning linear functions under arbitrary distributions with adversarial noise, and we provide a learning algorithm for this problem. This problem is well recognized in computational learning theory, and variants of it have been studied intensively before. Thus, in addition to its application to extractors, our learning algorithm is of independent interest and can be considered the main technical contribution of this paper.
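A minimal illustrative sketch (not the authors' algorithm) of the learning problem the abstract reduces to: recovering a linear (parity) function over GF(2)^n from labeled samples in which an adversary may flip a fraction of the labels. The dimension, noise rate, sample count, and the brute-force minimum-disagreement learner below are illustrative assumptions chosen so the example runs quickly; the paper's actual algorithm is designed to handle arbitrary distributions and adversarial noise far more efficiently.

```python
# Sketch of learning a noisy linear (parity) function over GF(2).
# All parameters below (n, eta, num_samples) are illustrative assumptions,
# not values from the paper.
import itertools
import random

n = 8              # dimension, kept tiny so exhaustive search is feasible
eta = 0.1          # fraction of labels the adversary may corrupt
num_samples = 200

secret = [random.randint(0, 1) for _ in range(n)]   # unknown linear function a

def parity(a, x):
    """Inner product <a, x> over GF(2)."""
    return sum(ai & xi for ai, xi in zip(a, x)) % 2

# Draw samples x from some distribution, label them with <a, x>,
# then corrupt a fraction eta of the labels (simulated by random flips here).
samples = []
for _ in range(num_samples):
    x = [random.randint(0, 1) for _ in range(n)]
    samples.append((x, parity(secret, x)))
for i in random.sample(range(num_samples), int(eta * num_samples)):
    x, y = samples[i]
    samples[i] = (x, 1 - y)

def disagreement(a):
    """Number of samples on which candidate parity a disagrees with the label."""
    return sum(parity(a, x) != y for x, y in samples)

# Brute-force learner: output the parity with the fewest disagreements.
best = min(itertools.product([0, 1], repeat=n), key=disagreement)
print("recovered secret:", list(best) == secret,
      "| disagreements:", disagreement(best))
```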
Keywords :
computational complexity; learning (artificial intelligence); minimum entropy methods; statistical distributions; computational entropy; computational learning theory; computational min-entropy; computational setting; conditional distribution; learning linear functions; multisource extractors; noisy linear functions; probability; randomness extractors; statistical min-entropy; statistical setting; Data mining; Entropy; Noise; Noise measurement; Prediction algorithms; Probabilistic logic; Training
Journal_Title :
Information Theory, IEEE Transactions on
DOI :
10.1109/TIT.2011.2158897