• DocumentCode
    51364
  • Title
    A Rate-Splitting Approach to Fading Channels With Imperfect Channel-State Information
  • Author
    Pastore, Adriano ; Koch, Tobias ; Rodríguez Fonollosa, Javier
  • Author_Institution
    Dept. of Signal Theor. & Commun., Univ. Politec. de Catalunya, Barcelona, Spain
  • Volume
    60
  • Issue
    7
  • fYear
    2014
  • fDate
    July 2014
  • Firstpage
    4266
  • Lastpage
    4285
  • Abstract
    As shown by Médard, the capacity of fading channels with imperfect channel-state information can be lower-bounded by assuming a Gaussian channel input X with power P and by upper-bounding the conditional entropy h(X|Y, Ĥ) by the entropy of a Gaussian random variable with variance equal to the linear minimum mean-square error in estimating X from (Y, Ĥ). We demonstrate that this lower bound can be sharpened using a rate-splitting approach: by expressing the Gaussian input X as the sum of two independent Gaussian variables X1 and X2, by applying Médard's lower bound first to the mutual information between X1 and Y while treating X2 as noise, and by applying it a second time to the mutual information between X2 and Y while assuming X1 to be known, we obtain a capacity lower bound that is strictly larger than Médard's lower bound. We then generalize this approach to an arbitrary number L of layers, where X is expressed as the sum of L independent Gaussian random variables of respective variances Pℓ, ℓ = 1, ..., L, summing up to P. Among all such rate-splitting bounds, we determine the supremum over power allocations P1, ..., PL and total number of layers L. This supremum is achieved as L → ∞ and gives rise to an analytically expressible capacity lower bound. For Gaussian fading, this novel bound is shown to converge to the Gaussian-input mutual information as the signal-to-noise ratio (SNR) grows, provided that the variance of the channel estimation error H − Ĥ tends to zero as the SNR tends to infinity.
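
    A minimal numerical sketch of the rate-splitting idea summarized in the abstract, assuming a scalar Gaussian-fading channel Y = H·X + Z with channel estimate Ĥ, a uniform power split across layers, and Médard-type per-layer rates; the per-layer noise terms, variable names, and parameter choices below are illustrative assumptions, not the paper's exact expressions.

    # Illustrative sketch (assumptions, not the paper's formulas): compare
    # Medard's single-layer lower bound with an L-layer rate-splitting bound
    # for Y = H*X + Z, H = Hhat + Htilde, with E[|Htilde|^2] = err_var.
    import numpy as np

    rng = np.random.default_rng(0)

    def bounds(snr_db, L=8, err_var=0.01, n_mc=200_000):
        """Monte Carlo estimates (in nats) of Medard's bound and an L-layer
        rate-splitting bound with a uniform power split P_l = P / L."""
        P = 10.0 ** (snr_db / 10.0)           # total input power (unit noise power)
        # Channel estimate Hhat ~ CN(0, 1 - err_var), so that H has unit variance.
        hhat = np.sqrt((1.0 - err_var) / 2) * (rng.standard_normal(n_mc)
                                               + 1j * rng.standard_normal(n_mc))
        g = np.abs(hhat) ** 2                 # |Hhat|^2

        # Medard's bound: E[log(1 + |Hhat|^2 P / (1 + err_var * P))].
        medard = np.mean(np.log(1.0 + g * P / (1.0 + err_var * P)))

        # Rate splitting: decode layers 1..L successively.  Layer l treats the
        # undecoded layers as noise; the already-decoded layers are known, so
        # their estimation-error noise enters through the realisation |S|^2 of
        # their sum S rather than through its average power (a Jensen gain).
        Pl = P / L
        rs = 0.0
        for l in range(1, L + 1):
            known_var = (l - 1) * Pl          # variance of S = X_1 + ... + X_{l-1}
            if known_var > 0:
                # |S|^2 for S ~ CN(0, known_var): sum of two squared real Gaussians.
                s2 = (known_var / 2) * (rng.standard_normal(n_mc) ** 2
                                        + rng.standard_normal(n_mc) ** 2)
            else:
                s2 = np.zeros(n_mc)
            undecoded = (L - l) * Pl          # power of layers l+1, ..., L
            noise = 1.0 + err_var * (s2 + Pl + undecoded) + g * undecoded
            rs += np.mean(np.log(1.0 + g * Pl / noise))
        return medard, rs

    for snr in (0, 10, 20, 30):
        m, r = bounds(snr)
        print(f"SNR {snr:2d} dB: Medard {m:.3f} nats, rate splitting (L=8) {r:.3f} nats")

    For L = 1 the second estimate collapses to Médard's bound; for L ≥ 2 the known layers' realisation |S|^2 is averaged inside a convex function of the noise variance, which is where the strict improvement appears in this sketch.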
  • Keywords
    Gaussian channels; channel capacity; channel estimation; entropy; fading channels; mean square error methods; random processes; Gaussian fading channel capacity; Gaussian-input mutual information; Médard lower bound; SNR; channel estimation error; entropy; imperfect channel-state information; independent Gaussian random variable; linear minimum mean-square error; power allocation; rate-splitting approach; signal-to-noise ratio; Entropy; Fading; Mutual information; Random variables; Receivers; Signal to noise ratio; Upper bound; Channel capacity; fading channels; flat fading; imperfect channel-state information;
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2014.2321567
  • Filename
    6832779