In the theory of linear prediction and/or filtering, it is well known that the optimum linear device obtained using the minimum mean-square error criterion is also optimum for a much wider class of symmetric error criteria if the input process is Gaussian. This result is extended here to include nonsymmetric error criteria as well as the case of nonstationary Gaussian inputs. A simple, direct proof is given which exploits the fact that the probability density function of the error is known explicitly. The method consists of showing that the expected value of the generalized error weighting function,

$$E\{f(\varepsilon)\},$$

where $f(\cdot)$ denotes the error weighting function and $\varepsilon$ the error, is a monotonic (nondecreasing) function of the mean-squared error.
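
To make the key step concrete, the following is a brief sketch under assumed notation; the symbols $f$ and $\sigma^{2}$, and the zero-mean case, are illustrative choices not fixed by the abstract. If the error $\varepsilon$ is zero-mean Gaussian with variance $\sigma^{2}$ equal to the mean-squared error, its density is known explicitly, and a change of variables $x = \sigma u$ shows that the expected weighting depends on the linear device only through $\sigma^{2}$:

$$
E\{f(\varepsilon)\}
= \int_{-\infty}^{\infty} f(x)\,\frac{1}{\sqrt{2\pi\sigma^{2}}}\,e^{-x^{2}/(2\sigma^{2})}\,dx
= \int_{-\infty}^{\infty} f(\sigma u)\,\frac{1}{\sqrt{2\pi}}\,e^{-u^{2}/2}\,du
\;\equiv\; g(\sigma^{2}).
$$

If $g$ is nondecreasing in $\sigma^{2}$, then the device minimizing the mean-squared error $\sigma^{2}$ also minimizes $E\{f(\varepsilon)\}$, which is the property asserted above; the nonsymmetric and nonstationary cases treated in the paper require handling the error mean as well, which this zero-mean sketch omits.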