Recent psychophysical experiments with sinusoidally flickering waveforms provide suitable data for calculating the maximum rate at which information can enter the human visual system, according to the single-channel model that accounts for these data; i.e., if the signal-to-noise ratio in the retinal pathways governs the minimum detectable modulation amplitude, then that amplitude is an appropriate measure of the maximum number of distinguishable signals within a given narrow frequency band. The Hartley-Shannon Law is then applied: the measured (gain-vs-frequency) response curves are integrated to obtain the (retinal-average) channel capacity. This procedure yields a monotonic function of the adapting luminance, increasing at high photopic levels to almost 800 bits per sec per channel, or about

bits per sec for the entire retina. Most of this large input capacity is obviously not directly available for the transmission of (random) signals by the human observer; the results are discussed from this viewpoint and compared with other estimates of sensory information rates.
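A minimal sketch of the integration described above, assuming that the reciprocal of the minimum detectable modulation amplitude m(f) serves as the effective signal-to-noise amplitude ratio within the narrow band around each flicker frequency f (the symbols m(f) and f_c are introduced here for illustration only and do not appear in the original text):

    C = \int_{0}^{f_c} \log_{2}\!\left[ 1 + \frac{1}{m^{2}(f)} \right] \, df \quad \text{bits per sec}

where f_c denotes the highest frequency at which modulation remains detectable. Whether the measured sensitivity enters as an amplitude or a power ratio, and the exact limits of integration, are assumptions of this sketch rather than details taken from the abstract.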