Abstract:
This paper presents, for the first time, the exact theoretical solution to the problem of maximum-likelihood (ML) estimation of the time-varying delay d(t) between a random signal s(t) received at one point in the presence of uncorrelated noise and the time-delayed, scaled version αs(t - d(t)) of that signal received at another point, also in the presence of uncorrelated noise. The signal is modeled as a sample function of a nonstationary Gaussian random process, and the observation interval is arbitrary. The analysis of this paper generalizes that of Knapp and Carter [1], who derived the ML estimator for the case in which the delay is constant, d(t) = d₀, the signal process is stationary, and the received processes are observed over the infinite interval (-∞, +∞). We show that the ML estimator of d(t) can be implemented in any of four canonical forms, which are, in general, time-varying systems. We also show that our results reduce to a generalized cross correlator for the special case treated in [1].
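For context, the generalized cross correlator of [1] estimates a constant delay as the lag that maximizes a weighted cross-correlation of the two received waveforms. The following is a minimal numerical sketch of that idea only, not the estimator derived in this paper: it assumes sampled data, uses NumPy, substitutes a simple phase-transform style weighting for the exact ML weighting of [1], and uses hypothetical names (gcc_delay, x1, x2) chosen purely for illustration.

import numpy as np

def gcc_delay(x1, x2, fs, weighting="phat"):
    # Generalized cross correlator (constant-delay sketch): weight the
    # cross-power spectrum, inverse-transform, and take the lag of the
    # correlation peak as the delay of x2 relative to x1.
    n = len(x1) + len(x2)                 # zero-pad so circular correlation acts linearly
    X1 = np.fft.rfft(x1, n=n)
    X2 = np.fft.rfft(x2, n=n)
    G = np.conj(X1) * X2                  # cross-power spectrum estimate
    if weighting == "phat":               # illustrative phase-transform weighting,
        G = G / (np.abs(G) + 1e-12)       # not the exact ML weighting of [1]
    cc = np.fft.irfft(G, n=n)
    k = int(np.argmax(np.abs(cc)))        # peak location in circular lag indexing
    lag = k if k < n // 2 else k - n      # map to a signed lag in samples
    return lag / fs                       # delay estimate in seconds

# Synthetic test: x2 is a scaled, delayed, noisy copy of the signal in x1.
rng = np.random.default_rng(0)
fs, delay_samples, alpha = 8000.0, 37, 0.8
s = rng.standard_normal(4096)
x1 = s + 0.1 * rng.standard_normal(s.size)
x2 = alpha * np.roll(s, delay_samples) + 0.1 * rng.standard_normal(s.size)
print(gcc_delay(x1, x2, fs) * fs)         # expected: about 37 samples

The weighting applied to the cross-power spectrum before the inverse transform is what distinguishes a generalized cross correlator from a plain cross correlator; different choices of weighting trade off resolution against robustness to noise and spectral coloration.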