A Self-Orthogonalising Block Adaptive Filter
Traditionally, the recursive least squares (RLS) and least mean squares (LMS) algorithms have been considered the two major alternatives in adaptive finite impulse response (FIR) filtering. They represent the two extremes in a trade-off of convergence performance against computational complexity. The conventional RLS algorithm of section 2.4.1 requires a number of computations per new data point that grows as the square of the number of coefficients, N, in the FIR filter, i.e. order N², or O(N²). This contrasts sharply with the LMS algorithm, which requires O(N) computations. However, it is evident from the simulation results of chapter 3 that the RLS algorithm offers consistently rapid mean square error (MSE) convergence, whereas the convergence properties of the LMS algorithm are generally poorer and dependent upon the conditioning of the input signal. Fortunately, the computational complexity of the RLS algorithm can be reduced by exploiting the shifting property of the input vector to yield the fast algorithms [35,65,68,60,61], such as those developed in appendix A and appendix B. Although the fast algorithms offer RLS convergence properties at O(N) computations, they still represent a computational load significantly higher than that of the LMS algorithm.
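To make the complexity contrast concrete, the following is a minimal NumPy sketch of the standard LMS coefficient update (function and variable names here are illustrative, not the thesis's own notation). Each iteration costs one inner product and one scaled vector addition, i.e. O(N) per sample; the conventional RLS recursion, by contrast, must also update an N×N inverse-correlation matrix, which is where its O(N²) cost arises.

```python
import numpy as np

def lms_filter(x, d, n_taps, mu):
    """Standard LMS adaptive FIR filter (illustrative sketch).

    x      : input signal
    d      : desired (reference) signal
    n_taps : number of FIR coefficients, N
    mu     : step size controlling convergence speed vs. misadjustment

    Each update is O(N): one dot product for the filter output and
    one scaled vector addition for the coefficient correction.
    """
    w = np.zeros(n_taps)      # FIR coefficient vector
    y = np.zeros(len(x))      # filter output
    e = np.zeros(len(x))      # error signal
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]  # input vector, most recent first
        y[n] = w @ u                       # filter output: O(N)
        e[n] = d[n] - y[n]                 # instantaneous error
        w += mu * e[n] * u                 # LMS coefficient update: O(N)
    return y, e, w
```

As a usage sketch, identifying an unknown FIR system from white-noise input (a well-conditioned case) shows the coefficients converging to the true impulse response; with coloured input, as the chapter notes, LMS convergence degrades with the conditioning of the input correlation matrix.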
Keywords: Filters, Autocorrelation, Convolution, Fermat