We consider the discrete-time detection of a constant signal in corrupting noise. For the case of independent and identically distributed noise, the criterion of asymptotic relative efficiency is used to investigate how the detector's performance is affected when the form of the detector nonlinearity is altered from that of the locally optimal nonlinearity. The results show that the resulting degradation in performance can be bounded in terms of the distance between the locally optimal nonlinearity and the nonlinearity of interest. We then extend these results to the case of weakly dependent -mixing noise and show that, in particular, asymptotic relative efficiency can be viewed as a mapping between metric spaces that is continuous at the point of interest.
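For context, a minimal sketch of the standard i.i.d.-case quantities presumably behind these claims; the symbols $f$ (noise density), $g$ (detector nonlinearity), and $g_{\mathrm{lo}}$ (locally optimal nonlinearity), and the particular normalizations, are our assumptions, since the abstract itself displays no formulas.

```latex
% Minimal sketch (our notation): locally optimal nonlinearity and
% efficacy for i.i.d. noise with density f; ARE as an efficacy ratio.
\[
  g_{\mathrm{lo}}(x) = -\frac{f'(x)}{f(x)}, \qquad
  \mathrm{eff}(g) = \frac{\left( \int g(x)\, f'(x)\, dx \right)^{2}}
                         {\int g^{2}(x)\, f(x)\, dx},
\]
\[
  \mathrm{ARE}(g, g_{\mathrm{lo}})
    = \frac{\mathrm{eff}(g)}{\mathrm{eff}(g_{\mathrm{lo}})}
    = \frac{\left( \int g\, f' \right)^{2}}
           {\left( \int g^{2} f \right) I(f)},
  \qquad
  I(f) = \int \frac{\bigl(f'\bigr)^{2}}{f},
\]
% using eff(g_lo) = I(f), the Fisher information of f.
```

With $\langle g, h \rangle = \int g\,h\,f$ and $\|g\|^{2} = \langle g, g \rangle$, this ARE equals $\langle g, g_{\mathrm{lo}}\rangle^{2}/(\|g\|^{2}\,\|g_{\mathrm{lo}}\|^{2})$, a squared cosine in $L^{2}(f)$; in this form, continuity of the ARE with respect to an $L^{2}(f)$-type distance between $g$ and $g_{\mathrm{lo}}$ is at least plausible, consistent with the continuity claim above.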