• DocumentCode
    23792
  • Title
    Which Boolean Functions Maximize Mutual Information on Noisy Inputs?
  • Author
    Courtade, Thomas A.; Kumar, G. Ramesh
  • Author_Institution
    Univ. of California at Berkeley, Berkeley, CA, USA
  • Volume
    60
  • Issue
    8
  • fYear
    2014
  • fDate
    Aug. 2014
  • Firstpage
    4515
  • Lastpage
    4525
  • Abstract
    We pose a simply stated conjecture regarding the maximum mutual information a Boolean function can reveal about noisy inputs. Specifically, let X^n be independent identically distributed Bernoulli(1/2), and let Y^n be the result of passing X^n through a memoryless binary symmetric channel with crossover probability α. For any Boolean function b : {0,1}^n → {0,1}, we conjecture that I(b(X^n); Y^n) ≤ 1 − H(α). While the conjecture remains open, we provide substantial evidence supporting its validity. Connections are also made to discrete isoperimetric inequalities. (An illustrative numerical sketch of the conjectured bound follows this record.)
  • Keywords
    Boolean functions; information theory; probability; crossover probability; discrete isoperimetric inequalities; independent identically distributed Bernoulli; maximum mutual information; memoryless binary symmetric channel; noisy inputs; simply stated conjecture; Context; Entropy; Mutual information; Noise measurement; Random variables; extremal inequality; isoperimetric inequality
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2014.2326877
  • Filename
    6822629
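
The conjectured bound in the abstract, I(b(X^n); Y^n) ≤ 1 − H(α), can be checked numerically for small n. The sketch below is illustrative and not taken from the paper: it computes the mutual information exactly by enumeration and compares two example functions against the bound. The helper names (binary_entropy, mutual_information), the parameters n = 3 and α = 0.1, and the choice of dictator and majority functions are all assumptions made for this illustration. The dictator function b(x) = x_1 meets the bound with equality because the channel is memoryless, so I(X_1; Y^n) = I(X_1; Y_1) = 1 − H(α).

```python
# Illustrative sketch (not from the paper): brute-force check of the conjectured
# bound I(b(X^n); Y^n) <= 1 - H(alpha) for small n.
import itertools
import math


def binary_entropy(p):
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def mutual_information(b, n, alpha):
    """Compute I(b(X^n); Y^n) exactly by enumerating all (x, y) pairs.

    X^n is i.i.d. Bernoulli(1/2); Y^n is X^n passed through a BSC(alpha).
    """
    # Joint distribution over (B, Y^n) where B = b(X^n).
    p_by = {}
    for x in itertools.product((0, 1), repeat=n):
        px = 0.5 ** n
        bx = b(x)
        for y in itertools.product((0, 1), repeat=n):
            flips = sum(xi != yi for xi, yi in zip(x, y))
            py_given_x = (alpha ** flips) * ((1 - alpha) ** (n - flips))
            p_by[(bx, y)] = p_by.get((bx, y), 0.0) + px * py_given_x
    # Marginals of B and Y^n.
    p_b, p_y = {}, {}
    for (bv, y), p in p_by.items():
        p_b[bv] = p_b.get(bv, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # I(B; Y^n) = sum_{b,y} p(b, y) log2( p(b, y) / (p(b) p(y)) ).
    return sum(p * math.log2(p / (p_b[bv] * p_y[y]))
               for (bv, y), p in p_by.items() if p > 0.0)


if __name__ == "__main__":
    n, alpha = 3, 0.1                   # small example parameters (assumed)
    dictator = lambda x: x[0]           # b(x) = x_1
    majority = lambda x: int(sum(x) > n / 2)
    bound = 1 - binary_entropy(alpha)   # conjectured maximum: 1 - H(alpha)
    print(f"bound 1 - H(alpha)     = {bound:.6f}")
    print(f"I(dictator(X^n); Y^n)  = {mutual_information(dictator, n, alpha):.6f}")
    print(f"I(majority(X^n); Y^n)  = {mutual_information(majority, n, alpha):.6f}")
```

For α = 0.1 the bound is 1 − H(0.1) ≈ 0.531 bits; the dictator value should match it exactly, while the majority value should fall below it.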