DocumentCode :
947381
Title :
Information capacity of time-continuous channels
Author :
Huang, R.Y. ; Johnson, R.A.
Volume :
8
Issue :
5
fYear :
1962
fDate :
9/1/1962
Firstpage :
191
Lastpage :
198
Abstract :
The maximum average mutual information in the observation of the output, y(t), of a channel over the time interval [T_3, T_4] about the signal (input), s(t), in the interval [T_1, T_2] is taken as the definition of channel capacity for the time-continuous case. In the case where the channel introduces additive independent Gaussian noise of known correlation function, the capacity is evaluated subject to the constraint that the signal process have a given correlation function. For this evaluation a new joint expansion of the processes y(t) and s(t) is introduced which has the property that all coefficients in the expansion are uncorrelated. Thus, the expansion is a generalization of the Karhunen-Loève expansion, to which it reduces when the noise is white and the time intervals coincide. The channel capacity is shown to be directly related to results in the theory of optimum filtering over a finite time interval. Closed-form results for the capacity of several channels are given, as well as some limiting expressions and bounds. For the case of white noise of spectral density N_0, the capacity is always bounded by \bar{E}/N_0, where \bar{E} is the average signal energy.
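The white-noise bound in the abstract can be checked numerically. The sketch below (not the paper's derivation; the kernel R(t, s) = exp(-|t - s|), interval length T, and noise level N0 are assumed for illustration) discretizes a signal correlation operator, takes its eigenvalues as an approximate Karhunen-Loève spectrum, sums the per-eigenchannel capacities (1/2) ln(1 + 2λ_i/N_0) in nats, and verifies that the total never exceeds \bar{E}/N_0:

```python
import numpy as np

# Illustrative check of the bound C <= E/N0 from the abstract (assumed
# kernel and parameters, not taken from the paper).
T = 1.0    # observation interval length (assumed)
n = 400    # discretization points
N0 = 0.5   # white-noise spectral density (assumed value)

t = np.linspace(0.0, T, n)
dt = t[1] - t[0]

# Discretized correlation operator: eigenvalues of K*dt approximate the
# Karhunen-Loeve eigenvalues lam_i of the integral operator with kernel
# R(t, s) = exp(-|t - s|).
K = np.exp(-np.abs(t[:, None] - t[None, :]))
lam = np.linalg.eigvalsh(K * dt)
lam = lam[lam > 1e-12]

E = lam.sum()                               # average signal energy ~ integral of R(t, t) dt
C = 0.5 * np.sum(np.log1p(2.0 * lam / N0))  # capacity in nats

print(f"E = {E:.4f}, C = {C:.4f}, bound E/N0 = {E / N0:.4f}")
```

The inequality holds eigenchannel by eigenchannel, since (1/2) ln(1 + 2x/N_0) <= x/N_0 for x >= 0, which is the source of the overall bound.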
Keywords :
Filtering; Karhunen-Loeve transforms; Mutual information; Additive noise; Channel capacity; Filtering theory; Gaussian noise; Information theory; Noise reduction; Random processes; Signal processing; White noise;
fLanguage :
English
Journal_Title :
IRE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0096-1000
Type :
jour
DOI :
10.1109/TIT.1962.1057753
Filename :
1057753