• DocumentCode
    588278
  • Title
    Derivative of the relative entropy over the Poisson and binomial channel
  • Author
    Taborda, Camilo G.; Perez-Cruz, Fernando

  • Author_Institution
    Dept. of Signal Theor. & Commun., Univ. Carlos III of Madrid, Leganes, Spain
  • fYear
    2012
  • fDate
    3-7 Sept. 2012
  • Firstpage
    386
  • Lastpage
    390
  • Abstract
    In this paper it is found that, regardless of the input statistics, the derivative of the relative entropy over the binomial channel can be seen as the expectation of a function whose argument is the mean of the conditional distribution that models the channel. Based on this relationship, we formulate a similar expression for the mutual information. In addition, using the connection between the binomial and Poisson distributions, we develop similar results for the Poisson channel. The novelty of the results presented here lies in the fact that the expressions obtained can be applied to a wide range of scenarios.
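    Note: as an illustration of the type of identity described in the abstract (recalled here only for context, not as the paper's own expression), the well-known relation of this form for the Poisson channel, the Poisson counterpart of the I-MMSE formula, writes the derivative of the mutual information in terms of the conditional-mean estimator. With $Y_\gamma \sim \mathrm{Poisson}(\gamma X)$ given $X$ and $\hat{X} = \mathbb{E}[X \mid Y_\gamma]$,

    \[
      \frac{\mathrm{d}}{\mathrm{d}\gamma}\, I(X; Y_\gamma)
        = \mathbb{E}\!\left[ X \log X \right] - \mathbb{E}\!\left[ \hat{X} \log \hat{X} \right]
        = \mathbb{E}\!\left[ \ell\big(X, \hat{X}\big) \right],
      \qquad
      \ell(x, \hat{x}) = x \log \frac{x}{\hat{x}} - x + \hat{x},
    \]

    where the second equality follows from the tower property and $\mathbb{E}[\hat{X}] = \mathbb{E}[X]$. The abstract's result for the binomial channel is of the same flavour: a derivative of an information measure expressed as the expectation of a function evaluated at the conditional mean.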
  • Keywords
    Poisson distribution; binomial distribution; entropy; Poisson channel; binomial channel; conditional distribution; function expectation; mutual information concept; relative entropy derivative; similar expression; Channel estimation; Conferences; Entropy; Estimation; Mutual information; Random variables
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Information Theory Workshop (ITW), 2012 IEEE
  • Conference_Location
    Lausanne
  • Print_ISBN
    978-1-4673-0224-1
  • Electronic_ISBN
    978-1-4673-0222-7
  • Type
    conf
  • DOI
    10.1109/ITW.2012.6404699
  • Filename
    6404699