• DocumentCode
    659197
  • Title
    On the mutual information between random variables in networks
  • Author
    Xiaoli Xu; Satyajit Thakor; Yong Liang Guan

  • Author_Institution
    Sch. of Electr. & Electron. Eng., Nanyang Technol. Univ., Singapore, Singapore
  • fYear
    2013
  • fDate
    9-13 Sept. 2013
  • Firstpage
    1
  • Lastpage
    5
  • Abstract
    This paper presents a lower bound on the mutual information between any two sets of source/edge random variables in a general multi-source multi-sink network. The bound can be used to derive a new class of information-theoretic upper bounds on the network coding capacity that improve on existing edge-cut-based bounds. In particular, a refined functional dependence bound is obtained from the functional dependence bound by applying this lower bound. It is demonstrated that the refined versions of the existing edge-cut-based outer bounds, obtained using the mutual information lower bound, are stronger.
  • Keywords
    information theory; network coding; edge-cut based outer bound; general multisource multisink network; information theoretic upper bounds; lower bound; mutual information; network coding capacity; source-edge random variable; Educational institutions; Entropy; Mutual information; Network coding; Random variables; Receivers; Network coding capacity; functional dependence bound; refined edge-cut bound; weighted bound;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Information Theory Workshop (ITW), 2013 IEEE
  • Conference_Location
    Sevilla
  • Print_ISBN
    978-1-4799-1321-3
  • Type
    conf
  • DOI
    10.1109/ITW.2013.6691320
  • Filename
    6691320