Author_Institution :
Abel Innovations, Palo Alto, CA, USA
Abstract :
A lower bound on mean-square estimate error is derived as an instance of the covariance inequality by concatenating the generating matrices for the Bhattacharyya and Barankin bounds; it represents a generalization of the Bhattacharyya, Barankin, Cramer-Rao, Hammersley-Chapman-Robbins, Kiefer, and McAulay-Hofstetter bounds in that all of these bounds may be derived as special cases. The bound is applicable to biased estimates of functions of a multidimensional parameter. Termed the hybrid Bhattacharyya-Barankin bound, it may be written as the sum of the mth-order Bhattacharyya bound and a nonnegative term similar in form to the rth-order Hammersley-Chapman-Robbins bound. It is intended for use when small-error bounds, such as the Cramer-Rao bound, may not be tight; unlike many large-error bounds, it provides a smooth transition between the small-error and large-error regions. As an example application, bounds are placed on the variance of unbiased position estimates derived from passive array measurements. Here, the hybrid Bhattacharyya-Barankin bound enters the large-error region at a larger SNR and, in the large-error region, is noticeably greater than either the Hammersley-Chapman-Robbins or the Bhattacharyya bound.
Keywords :
array signal processing; error analysis; information theory; maximum likelihood estimation; parameter estimation; Barankin bound; Bhattacharyya bound; Cramer-Rao bound; Hammersley-Chapman-Robbins bound; Kiefer bound; McAulay-Hofstetter bound; covariance inequality; generating matrices; hybrid Bhattacharyya-Barankin bound; information inequality; large-error region; lower bound; mean-square-estimate error; multidimensional parameter estimation; passive array measurements; small-error region; unbiased position estimates;
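The relationship among the bounds named in the abstract can be illustrated numerically. The sketch below evaluates the Hammersley-Chapman-Robbins (HCR) bound for a hypothetical scalar toy problem (estimating the mean of a Gaussian from one sample), chosen only because the ratio moment has a closed form; it is not the paper's passive-array example. For this model the HCR bound approaches the Cramer-Rao bound as the test offset shrinks, consistent with the Cramer-Rao bound arising as a special case of the more general bounds discussed above.

```python
import numpy as np

# Hammersley-Chapman-Robbins (HCR) bound for the mean of a Gaussian
# N(theta, sigma^2) observed via a single sample -- a hypothetical
# illustration, not the paper's passive-array application.
#
# For an unbiased estimator, var >= sup_h  h^2 / E[(L_h - 1)^2],
# where L_h = p(x; theta + h) / p(x; theta).  For this Gaussian model
# the moment has the closed form E[(L_h - 1)^2] = exp(h^2/sigma^2) - 1.

def hcr_bound(sigma, h_grid):
    """Evaluate the HCR lower bound over a grid of nonzero test offsets h."""
    h = np.asarray(h_grid, dtype=float)
    vals = h**2 / np.expm1(h**2 / sigma**2)  # expm1 is accurate for small h
    return vals.max()

sigma = 2.0
crb = sigma**2                       # Cramer-Rao bound for this model
h = np.linspace(1e-3, 5.0, 2000)     # exclude h = 0; the h -> 0 limit is the CRB
hcr = hcr_bound(sigma, h)

# For the Gaussian mean, the HCR bound never exceeds the CRB and
# coincides with it in the small-offset limit.
print(crb, hcr)
```

In this linear-Gaussian toy case the small-error bound is already tight, which is exactly the regime where, per the abstract, bounds such as the hybrid Bhattacharyya-Barankin bound are not needed; their advantage appears in nonlinear problems at low SNR, where the large-error term becomes active.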