A fixed-point roundoff error analysis of the exponentially windowed RLS algorithm is presented. It is shown that a tradeoff exists in the choice of the forgetting factor λ: to reduce the sensitivity of the algorithm to additive noise, λ must be chosen close to one, whereas the roundoff error grows as λ approaches one. It is further shown that the algorithm is stabilized by choosing λ < 1, and that it may diverge for λ = 1. To derive the theoretical results, the input signal is assumed to be a white Gaussian random process. Finally, simulations are presented which confirm the theoretical findings of the paper.
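The tradeoff described above can be illustrated numerically. The following is a minimal sketch, not the paper's implementation: it runs the standard exponentially windowed RLS recursion on a simple system-identification problem with white Gaussian input, and optionally rounds every stored quantity after each update to mimic B-bit fixed-point arithmetic. The function names, the rounding model, and the parameter values are illustrative assumptions.

```python
import numpy as np

def quantize(v, num_bits):
    """Illustrative B-bit rounding used to mimic fixed-point arithmetic."""
    scale = 2.0 ** num_bits
    return np.round(v * scale) / scale

def rls_exponential_window(x, d, order, lam=0.99, num_bits=None):
    """Exponentially windowed RLS; if num_bits is given, weights and the
    inverse correlation matrix are rounded after every update."""
    q = (lambda v: quantize(v, num_bits)) if num_bits else (lambda v: v)
    w = np.zeros(order)            # weight estimate
    P = 100.0 * np.eye(order)      # inverse correlation matrix, large initial value
    for n in range(order - 1, len(x)):
        u = x[n - order + 1 : n + 1][::-1]   # regressor [x[n], x[n-1], ...]
        pu = P @ u
        k = pu / (lam + u @ pu)              # gain vector
        e = d[n] - w @ u                     # a priori error
        w = q(w + k * e)                     # weight update (optionally quantized)
        P = q((P - np.outer(k, pu)) / lam)   # Riccati update (optionally quantized)
    return w

# Usage: identify a short FIR system from noisy observations.
rng = np.random.default_rng(0)
true_w = np.array([0.5, -0.3, 0.1])
x = rng.standard_normal(5000)                # white Gaussian input, as assumed in the analysis
d = np.convolve(x, true_w)[:len(x)] + 0.01 * rng.standard_normal(len(x))
for lam in (0.90, 0.99, 0.999):
    w_hat = rls_exponential_window(x, d, order=3, lam=lam, num_bits=12)
    print(f"lambda={lam}: final weight error = {np.linalg.norm(w_hat - true_w):.2e}")
```

Sweeping λ with and without the rounding step gives a rough picture of how noise averaging and roundoff accumulation pull the choice of forgetting factor in opposite directions.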