  • DocumentCode
    38461
  • Title
    Measures of Entropy From Data Using Infinitely Divisible Kernels
  • Author
    Sanchez Giraldo, Luis Gonzalo; Rao, Murali; Principe, Jose C.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Univ. of Florida, Gainesville, FL, USA
  • Volume
    61
  • Issue
    1
  • fYear
    2015
  • fDate
    Jan. 2015
  • Firstpage
    535
  • Lastpage
    548
  • Abstract
    Information theory provides principled ways to analyze inference and learning problems such as hypothesis testing, clustering, dimensionality reduction, and classification. However, using information-theoretic quantities as test statistics, that is, as quantities computed from empirical data, poses a challenging estimation problem that often leads to strong simplifications, such as Gaussian models, or to plug-in density estimators that are restricted to certain representations of the data. In this paper, a framework is presented to obtain measures of entropy nonparametrically, directly from data, using operators in reproducing kernel Hilbert spaces defined by infinitely divisible kernels. The entropy functionals, which bear a resemblance to quantum entropies, are defined on positive definite matrices and satisfy axioms similar to those of Rényi's definition of entropy. Convergence of the proposed estimators follows from concentration results on the difference between the ordered spectra of the Gram matrices and of the integral operators associated with the population quantities. In this way, capitalizing on both the axiomatic definition of entropy and the representational power of positive definite kernels, the proposed measure of entropy avoids estimating the probability distribution underlying the data. Moreover, estimators of kernel-based conditional entropy and mutual information are also defined. Numerical experiments on independence tests compare favorably with the state of the art.
  • Keywords
    Hilbert spaces; entropy; inference mechanisms; learning (artificial intelligence); matrix algebra; Gram matrices; empirical data; entropy functionals; entropy measures; inference problems; infinitely divisible kernels; information theory; integral operators; kernel-based conditional entropy estimator; learning problems; mutual information; population quantities; positive definite kernels; positive definite matrices; quantum entropy; reproducing kernel Hilbert spaces; Eigenvalues and eigenfunctions; Entropy; Estimation; Hilbert space; Joints; Kernel; Random variables; independence test; learning; positive definite functions; Rényi's entropy
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2014.2370058
  • Filename
    6954500
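As a rough illustration of the abstract's idea (not the authors' code), the sketch below computes a matrix-based entropy of order α from the eigenvalue spectrum of a normalized Gram matrix, plus the Hadamard-product construction for a kernel-based mutual information. The Gaussian kernel, the bandwidth `sigma`, and `alpha = 2` are illustrative choices; the function names are hypothetical.

```python
import numpy as np

def gram_matrix(x, sigma=1.0):
    """Gram matrix of a Gaussian (hence infinitely divisible) kernel; x has shape (n, d)."""
    diff = x[:, None, :] - x[None, :, :]
    return np.exp(-np.sum(diff**2, axis=2) / (2.0 * sigma**2))

def matrix_entropy(K, alpha=2.0):
    """Entropy of order alpha from the spectrum of the trace-normalized Gram matrix.

    No density estimate is involved: the Gram matrix is normalized so its
    trace is one, and the entropy is a function of its eigenvalues alone.
    """
    n = K.shape[0]
    d = np.sqrt(np.diag(K))
    A = K / np.outer(d, d) / n           # trace(A) == 1
    lam = np.linalg.eigvalsh(A)
    lam = lam[lam > 1e-12]               # drop numerical zeros
    return np.log2(np.sum(lam**alpha)) / (1.0 - alpha)

def mutual_information(x, y, alpha=2.0, sigma=1.0):
    """Kernel-based mutual information: marginal entropies minus the joint entropy,
    where the joint Gram matrix is the Hadamard product of the marginals."""
    Kx, Ky = gram_matrix(x, sigma), gram_matrix(y, sigma)
    hx = matrix_entropy(Kx, alpha)
    hy = matrix_entropy(Ky, alpha)
    hxy = matrix_entropy(Kx * Ky, alpha)  # elementwise (Hadamard) product
    return hx + hy - hxy
```

Since the normalized matrix has unit trace and nonnegative eigenvalues, the order-2 entropy lies between 0 and log2(n), and the mutual information of a variable with itself exceeds that with an independent sample, which is the basis of the independence tests the abstract mentions.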