Two fast gradient algorithms for block FIR (finite impulse response) adaptive digital filtering are presented in this paper. The proposed algorithms employ a time-varying convergence factor that is optimized in a least-squares (LS) sense. In the first algorithm, the optimum block adaptive (OBA) algorithm, the processed signal blocks are disjoint. In the second algorithm, the optimum block adaptive shifting (OBAS) algorithm, the signal blocks overlap. Computer simulations and an analysis of the computational complexity of the algorithms are given. It is shown that, although OBA and OBAS require a relatively modest increase in computation per block iteration compared with the existing block least-mean-square (BLMS) algorithm, they may prove more computationally efficient in some applications owing to the considerable reduction in the number of iterations required for a given adaptation accuracy. A comparison between the OBAS algorithm and the recently proposed fast a posteriori error sequential technique (FAEST) is also conducted for adaptation to time-varying (unknown) systems; the results show that OBAS is superior to FAEST in both speed and accuracy of adaptation, at least for the variety of filters used in the simulations.
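To illustrate the underlying idea only, the sketch below implements a generic block-gradient FIR update in which the convergence factor is recomputed for every block by a one-dimensional least-squares minimization of the block error. The function and variable names (oba_style_update, X, d, w) and the noiseless system-identification demo are assumptions introduced for illustration; the exact OBA and OBAS recursions are those derived in the body of the paper.

    import numpy as np

    def oba_style_update(w, X, d, eps=1e-12):
        # One block update of an FIR weight vector w with a time-varying
        # convergence factor chosen by a least-squares line search over the block.
        # X: (L, N) matrix whose rows are the filter input (regressor) vectors
        #    for the current block; d: (L,) desired-response samples.
        e = d - X @ w                      # a priori block error
        g = X.T @ e                        # block gradient estimate (unnormalized)
        Xg = X @ g
        mu = (g @ g) / (Xg @ Xg + eps)     # minimizes ||d - X(w + mu*g)||^2 over mu
        return w + mu * g, e

    # Illustrative use: identify a hypothetical unknown 8-tap FIR system
    # from noiseless data, processing disjoint blocks (OBA-style).
    rng = np.random.default_rng(0)
    N, L = 8, 32                           # filter length, block length
    h = rng.standard_normal(N)             # "unknown" system, made up for the demo
    w = np.zeros(N)
    x = rng.standard_normal(4096)
    for start in range(0, len(x) - L - N, L):
        seg = x[start:start + L + N - 1]
        X = np.lib.stride_tricks.sliding_window_view(seg, N)[:, ::-1]
        d = X @ h                          # desired signal generated by the demo system
        w, _ = oba_style_update(w, X, d)

Processing overlapping (shifted) blocks instead of disjoint ones would correspond to the OBAS variant; a fixed scalar mu in place of the per-block line search reduces the sketch to an ordinary BLMS update.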