DocumentCode
943980
Title
Fixed memory least squares filters using recursion methods
Author
Blum, M.
Volume
3
Issue
3
fYear
1957
fDate
9/1/1957 12:00:00 AM
Firstpage
178
Lastpage
182
Abstract
Given a set of equally spaced measurements, it is possible to curve fit a "least squares" polynomial to the n observed data points and obtain estimates of the past, present, or future values of the data or its derivatives by appropriate manipulations of the curve fit. This curve fitting can be accomplished by a linear weighting of the observed data over an interval nT. If the data is measured in real time such that a new data point is observed every T seconds, then the desired output (for example, the smoothed or predicted value of the data) can be obtained by sliding this fixed set of weights so that the same weight always multiplies the data point at a fixed lag with respect to the most recent data. Since these weights are zero for lags greater than nT, they may be described as a fixed finite-memory linear digital filter. In calculating the desired output for each new sample, one requires a machine which can store n coefficients and n data points and perform n multiplications and n additions within T seconds. The coefficients do not change, but the multiplications and additions must be performed every T seconds as a new data point is measured. For large values of n and small T, this may place a severe requirement on the real-time capability of the computer. This paper presents an alternate technique, using recursion formulas, for obtaining the same results as the n-point weighting equation. The method has the advantage of requiring considerably less storage and fewer multiplications and additions when n is large and the degree of the curve-fitting polynomial is small.
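As an illustration of the direct n-point weighting described above (not the paper's recursion formulas, which the abstract only summarizes), the sketch below fits a degree-m least-squares polynomial to the most recent n equally spaced samples by means of a fixed weight vector that slides along the data. The function name lsq_weights and the parameter values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the direct n-point weighting form of a fixed-memory
# least-squares filter.  A degree-m polynomial is fit to the last n equally
# spaced samples; the smoothed (or predicted) output is a fixed linear
# weighting of those n samples, so the same weights slide along the data.
import numpy as np

def lsq_weights(n, m, shift=0.0):
    """Fixed weights that evaluate the degree-m least-squares fit to the
    last n samples at `shift` steps beyond the newest sample
    (shift = 0 -> smoothing, shift > 0 -> prediction)."""
    # Sample times measured backwards from the newest point: 0, -1, ..., -(n-1).
    t = -np.arange(n, dtype=float)
    A = np.vander(t, m + 1, increasing=True)       # n x (m+1) design matrix
    # Row vector that evaluates the fitted polynomial at t = shift.
    eval_row = np.array([shift ** k for k in range(m + 1)])
    return eval_row @ np.linalg.pinv(A)            # length-n weight vector

# Usage: slide the same n weights over the data, newest sample first.
rng = np.random.default_rng(0)
data = np.linspace(0.0, 5.0, 200) ** 2 + rng.normal(0.0, 0.5, 200)  # noisy quadratic
n, m = 21, 2
w = lsq_weights(n, m)
smoothed = [w @ data[k - n + 1:k + 1][::-1] for k in range(n - 1, len(data))]
```

Each output sample here costs n multiplications and n additions and requires storing the last n data points, which is exactly the per-sample burden the paper's recursion method is designed to reduce when n is large and m is small.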
Keywords
Least-squares estimation; Curve fitting; Digital filters; Eigenvalues and eigenfunctions; Equations; Extraterrestrial measurements; Least squares approximation; Least squares methods; Performance evaluation; Polynomials; Recursive estimation; White noise;
fLanguage
English
Journal_Title
IRE Transactions on Information Theory
Publisher
IEEE
ISSN
0096-1000
Type
jour
DOI
10.1109/TIT.1957.1057412
Filename
1057412
Link To Document