The relation between the statistics of the antenna beam pointing direction and the phase and amplitude errors at the source has been obtained, to first order in the mean-square errors and under certain restrictions, for long line sources. It is shown that when the desired phase at the source is constant, the results are, to first order, independent of the amplitude errors. When the desired amplitude is also constant, a simple formula gives the allowable rms phase error at the source for a pointing direction required to lie in a given angular range with a given probability. When the amplitude distribution corresponds to the Taylor-modified pattern, the allowed rms phase error is obtained from the constant-amplitude case through a multiplicative factor that depends only on the single parameter characterizing the Taylor distribution. This factor is plotted over the range corresponding to sidelobe ratios of 13.2 to 40 db. At 40 db the allowed rms phase errors are about three-fourths of those at 13.2 db (constant amplitude) for the same uncertainty in the pointing direction. The results are applied to a hypothetical example and to an actual "Mills Cross" for illustrative purposes.
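The abstract does not reproduce the formula itself, so the following Python sketch only illustrates the kind of computation described, under stated assumptions: the pointing direction is taken to be Gaussian about the desired direction with a standard deviation proportional to the rms phase error (the proportionality constant k is a placeholder, not the paper's result), a required angular range and probability are converted into an allowable rms phase error for the constant-amplitude case, and that allowance is then derated by a Taylor factor interpolated linearly between the two endpoints quoted above (1.0 at 13.2 db, about 0.75 at 40 db); the linear interpolation is likewise an assumption, not the plotted curve.

    # Illustrative sketch only; k and the Taylor-factor interpolation are assumptions.
    from statistics import NormalDist

    def allowable_pointing_std(theta_max_rad: float, prob: float) -> float:
        """Largest std. dev. of a zero-mean Gaussian pointing error such that the
        pointing direction lies within +/- theta_max_rad with probability prob."""
        z = NormalDist().inv_cdf(0.5 * (1.0 + prob))  # two-sided Gaussian quantile
        return theta_max_rad / z

    def allowable_rms_phase_error(theta_max_rad: float, prob: float, k: float) -> float:
        """Allowable rms phase error (radians) for a constant-amplitude line source,
        assuming pointing std. dev. = k * (rms phase error); k is hypothetical here."""
        return allowable_pointing_std(theta_max_rad, prob) / k

    def taylor_factor(sidelobe_ratio_db: float) -> float:
        """Multiplicative derating of the constant-amplitude allowance for a
        Taylor-modified amplitude taper.  Only the endpoints quoted in the text
        (1.0 at 13.2 db, about 0.75 at 40 db) are used; the linear interpolation
        between them is an assumption, not the curve plotted in the paper."""
        lo_db, hi_db = 13.2, 40.0
        lo_f, hi_f = 1.0, 0.75
        s = min(max(sidelobe_ratio_db, lo_db), hi_db)
        return lo_f + (hi_f - lo_f) * (s - lo_db) / (hi_db - lo_db)

    if __name__ == "__main__":
        # Example: pointing must stay within 0.5 milliradian with 95% probability.
        sigma_phi = allowable_rms_phase_error(5e-4, 0.95, k=1e-2)  # k is illustrative
        print(f"constant amplitude: {sigma_phi:.4f} rad rms")
        print(f"40-db Taylor taper: {taylor_factor(40.0) * sigma_phi:.4f} rad rms")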