Information measures and performance bounds are derived for frequency-domain linear array processors deployed in homogeneous Gaussian random fields.

J-divergence, a measure of the (net) information rate of an array, is shown to be a useful indicator of how effectively detection and estimation functions can be performed in optimum and conventional array processing structures. In a detection context, J-divergence becomes a detection index that can be interpreted in terms of array gain and output signal-to-noise ratio (SNR). Comparisons between the divergence of optimum and conventional processors indicate, for example, that optimum processing can provide on the order of a 13 dB gain over conventional processing when detecting a 20 dB signal in the presence of a 20 dB interference located within the Rayleigh limit of the array. In an estimation context, J-divergence can be used to derive "critical divergence" and Cramér-Rao bounds on resolution variance. These bounds indicate that an output signal-to-noise ratio of approximately 25 dB is required to obtain a 10:1 improvement over the classical Rayleigh resolution limit. The Rayleigh limit is argued to have significance only at output SNRs of approximately 10 dB. The argument is based on a new resolution limit termed the critical divergence limit. This limit is shown to give resolution limits approximately three times the Cramér-Rao bound, suggesting that the latter is perhaps an optimistic resolution limit.
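
For concreteness, the following is a minimal numerical sketch (not from the paper) of how a per-frequency-bin J-divergence detection index might be computed for a line array in a homogeneous Gaussian field, using the standard symmetric Kullback divergence between zero-mean circular complex Gaussian snapshot vectors. The array size, half-wavelength spacing, the 20 dB signal and interference levels, and the interference placement inside the Rayleigh limit are illustrative assumptions; the paper's exact normalization of the divergence may differ.

```python
import numpy as np

def steering_vector(n_sensors, spacing_wavelengths, sin_theta):
    """Plane-wave steering vector for a uniform line array (assumed geometry)."""
    n = np.arange(n_sensors)
    return np.exp(2j * np.pi * spacing_wavelengths * n * sin_theta)

def j_divergence(R0, R1):
    """Symmetric (J) divergence between two zero-mean circular complex
    Gaussian vectors with covariance matrices R0 and R1:
    J = tr(R0^{-1} R1) + tr(R1^{-1} R0) - 2N."""
    N = R0.shape[0]
    A = np.linalg.solve(R0, R1)   # R0^{-1} R1
    B = np.linalg.solve(R1, R0)   # R1^{-1} R0
    return np.real(np.trace(A) + np.trace(B)) - 2 * N

N = 10                                  # sensors, half-wavelength spacing (assumed)
d_s = steering_vector(N, 0.5, 0.0)      # signal arriving from broadside
d_i = steering_vector(N, 0.5, 0.05)     # interference within the Rayleigh limit (~2/N)

sigma_n2 = 1.0                          # white noise power per sensor
sigma_s2 = 100.0 * sigma_n2             # 20 dB signal
sigma_i2 = 100.0 * sigma_n2             # 20 dB interference

# H0: noise plus interference; H1: signal present as well.
R0 = sigma_n2 * np.eye(N) + sigma_i2 * np.outer(d_i, d_i.conj())
R1 = R0 + sigma_s2 * np.outer(d_s, d_s.conj())

J = j_divergence(R0, R1)
print(f"J-divergence (detection index) per frequency bin: {J:.1f} ({10*np.log10(J):.1f} dB)")
```

Summing such per-bin divergences over the processed band would give the net information rate referred to above; comparing the value obtained with the optimum (interference-nulling) covariance model against that of a conventional beamformer is one way to reproduce gain comparisons of the kind quoted in the abstract.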