# Modified Particle Swarm Optimization for Blind Deconvolution and Identification of Multichannel FIR Filters


## Abstract

Blind identification of MIMO FIR systems has received wide attention in various fields of wireless data communication. Here, we use Particle Swarm Optimization (PSO) as the update mechanism of the well-known inverse filtering approach and show its good performance compared to the original method. In particular, the proposed method is shown to be more robust in lower-SNR scenarios and in cases with smaller available data records. A modified version of PSO is also presented which further improves the robustness and precision of the PSO algorithm. The most important advantage of the modified version, however, is its drastically faster convergence compared to the standard implementation of PSO.

### Keywords

Particle Swarm Optimization, Particle Swarm Optimization Algorithm, Normalized Mean Square Error, Standard Particle Swarm Optimization, Hybrid Particle Swarm Optimization

## 1. Introduction

Here Open image in new window is additive white Gaussian noise and Open image in new window are the tap-weight coefficients of the channel impulse response from the Open image in new window th source to the Open image in new window th sensor. Each subchannel of the MIMO system can be written in the Open image in new window domain as Open image in new window .

The inverse filtering approach [1] is an iterative solution which extracts the sources from the mixture one at a time: after each source is extracted, the channel it experienced is estimated and the signal is reconstructed as it was originally observed on each sensor. After subtracting the reconstructed sources from the sensor measurements, the same procedure is applied to extract the remaining sources. The source extraction step is done by steepest descent maximization of a class of cumulant-based cost functions with respect to the coefficients of the equalizer filters. However, this optimization strategy is prone to being trapped in local maxima, especially in lower-SNR scenarios or when the available data record is too small [1].
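The deflation procedure described above can be sketched as follows. This is a simplified single-sensor illustration with invented channels and signals, not the paper's MIMO setup: once one white source has been extracted, the channel it experienced is estimated by cross-correlation, its contribution is reconstructed, and the residual is left for the next extraction pass.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20000
s1 = rng.choice([-1.0, 1.0], N)      # extracted white source
s2 = rng.choice([-1.0, 1.0], N)      # remaining (independent) source
h = np.array([1.0, 0.5, -0.3])       # channel seen by s1 (illustrative)
g = np.array([0.8, -0.4])            # channel seen by s2 (illustrative)

# sensor measurement: superposition of both filtered sources
x = np.convolve(s1, h)[:N] + np.convolve(s2, g)[:N]

# channel estimation: for a white unit-power source, E[x(t) s1(t-k)] = h_k
h_est = np.array([np.dot(x[k:], s1[:N - k]) / (N - k) for k in range(len(h))])

# reconstruct s1's contribution as observed on the sensor and subtract it;
# the residual is (approximately) the remaining source's contribution
residual = x - np.convolve(s1, h_est)[:N]
```

With a long enough record the cross-correlation estimate of the channel taps is accurate, and the residual energy matches that of the remaining filtered source.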

An alternative to this gradient-based optimization is a structured stochastic search of the objective function space. These global searches are structure independent because no gradient is calculated and the adaptive filter structure does not directly influence the parameter updates. Due to this property, such algorithms are potentially capable of globally optimizing any class of objective functions [2]. Particle Swarm Optimization (PSO) is one such stochastic structured search algorithm that has recently gained popularity for optimization problems.

This paper investigates the application of the PSO technique in the source extraction step of the above-mentioned procedure for mutually independent, zero-mean i.i.d. binary sequences. It should be noted that although [3] addresses the same subject, its update equation is more of a heuristic method employing a randomly weighted addition of gradients based on second-order statistics and prediction methods. PSO, in contrast, is defined as a cooperative random search of particles toward their current global and local best points in the search space according to some fitness function [4], as discussed in Section 3.1; simple multiplication of the update equation by a random number (as in [3]) does not capture the essence of the global random search suggested by PSO.

In this paper, after studying the suitability of standard PSO for the blind identification problem, we propose a modified version which finds its initial direction according to the original gradient-based method in [1]. In effect, the random search promised by PSO is limited to smaller local areas with the highest probability of containing the maximum of the cost function. We also augment the original fitness function used in [1, 3] with two supportive performance indexes in order to reduce the probability of algorithm failures caused by the original objective function's complete ignorance of additive noise.

## 2. Iterative Source Extraction and Channel Estimation

This separation criterion, however, has the weakness of completely ignoring the presence of noise. The authors of [1] developed their theoretical proof of the suitability of the above-mentioned objective function under the assumption that the noise Open image in new window in (1) is negligible. This is the reason for the poor performance of their proposed iterative, batch, steepest descent optimization of Open image in new window at lower SNRs.

If we consider the presence of the noise Open image in new window in (1), the addition of the term Open image in new window to Open image in new window of (2) destroys the validity of the theoretical proof in [1]. In fact, when noise dominates the sensor measurements, its interaction with the equalizer coefficients may introduce spurious maxima into Open image in new window regardless of the overall impulse response Open image in new window , and at such a maximum the separation point of (6) is no longer achievable.

However, (6) clearly still provides a local maximum of Open image in new window , and separation would be met if we could somehow prevent the algorithm from being trapped in these spurious maxima, as explained in Section 3.2.

Hereafter, the same procedure can be used for extraction of remaining sources and then estimating the SIMO channel experienced by any one of them.

## 3. Particle Swarm Optimization

### 3.1. PSO Principles

Particle swarm optimization [2] is a stochastic, population-based evolutionary algorithm for problem solving in complex multidimensional parameter spaces. It is a kind of swarm intelligence that is based on social-psychological principles.

Given a multidimensional optimization problem and an objective function that evaluates the fitness of each candidate point in the parameter space, the swarm is typically modeled by particles in this multidimensional space that have a position and a velocity. After a random population of individuals (particles) is defined as candidate solutions, they fly through the parameter hyperspace with the aid of two essential reasoning capabilities: a memory of their own best position and knowledge of the global or their neighborhood's best [5].

Here, Open image in new window is the velocity vector of particle Open image in new window , Open image in new window is a random value, Open image in new window are acceleration coefficients toward Open image in new window and Open image in new window and Open image in new window is the inertia weight.

In fact, the trajectory of each particle is determined by a random superposition of its previous velocity with the locations of the local and global best particles found so far. As new Open image in new window are encountered during the update process, all other particles begin to swarm toward this new Open image in new window , continuing their random search along the way. The optimization is terminated when all of the particles have converged to Open image in new window or a sufficient condition on the fitness function is met.
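The standard update rule just described can be sketched in Python. This is a minimal illustration on a toy maximization problem, not the paper's equalizer setup; the inertia weight, acceleration coefficients, and search range are placeholder values.

```python
import numpy as np

def pso_maximize(fitness, dim, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard PSO: particles swarm toward their local and global bests."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))             # velocities
    pbest = x.copy()                             # each particle's best position
    pbest_val = np.array([fitness(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()   # global best position

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # velocity: inertia + random pull toward local best + toward global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([fitness(p) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()

# Toy objective: a single smooth peak at the origin.
best_x, best_val = pso_maximize(lambda p: -np.sum(p**2), dim=2)
```

On this simple unimodal objective the swarm collapses onto the peak; the interesting regime for the paper is multimodal cost surfaces, where the random search helps escape local maxima.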

### 3.2. Implementation of PSO for Source Extraction

We propose to use PSO as optimization method for maximization of Open image in new window with respect to coefficients of parallel equalizers ( Open image in new window , Open image in new window , Open image in new window ) as Open image in new window dimensional particles.

Clearly Open image in new window of (4) seems a reasonable choice of fitness function, since in the absence of noise its maximization provides pure separation of the source Open image in new window at the multichannel equalizer output. But as mentioned in Section 2, this objective function fails in low-SNR scenarios and exhibits several spurious maxima with respect to the equalizer coefficients. The steepest gradient method of [1] has also shown poor performance when the number of available data samples is limited.

Here Open image in new window , Open image in new window are small coefficients whose selection affects the convergence speed and robustness of the algorithms. In this new fitness function, Open image in new window and Open image in new window have comparable values, as their weights should be adjusted by Open image in new window and Open image in new window . Open image in new window has very large modulus values, which is desirable because it should strictly reject any particle with an undesired histogram shape in the tails. Note that we cannot apply this objective function to the original gradient-based method, since the disapproval of a new equalizer coefficient set is equivalent to terminating the algorithm run, and it may cause the algorithm to become stuck near the initialization point, even in high-SNR scenarios.
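The idea of augmenting the cumulant-based objective with supportive performance indexes can be illustrated as below. The specific indexes here (an output-power check and a histogram-tail check keyed to binary ±1 sources) are our own illustrative stand-ins, not the paper's exact expressions, and the weights are placeholders.

```python
import numpy as np

def cumulant_cost(y):
    """Normalized fourth-order cumulant magnitude of a zero-mean output."""
    m2 = np.mean(y**2)
    return abs(np.mean(y**4) - 3 * m2**2) / m2**2

def combined_fitness(y, alpha=1.0, beta=1.0):
    """Cumulant cost plus two illustrative supportive indexes that
    penalize noise-dominated equalizer outputs."""
    m2 = np.mean(y**2)
    power_index = -abs(m2 - 1.0)            # output power should stay near 1
    tail_index = -np.mean(np.abs(y) > 1.5)  # a binary +/-1 output has no heavy tails
    return cumulant_cost(y) + alpha * power_index + beta * tail_index

rng = np.random.default_rng(0)
y_binary = rng.choice([-1.0, 1.0], 10000)   # a well-separated binary source
y_noise = rng.standard_normal(10000)        # a noise-dominated output
```

A Gaussian (noise-dominated) output scores low on all three terms, so a particle producing it is rejected as a new local or global best, even though it might accidentally score well on a cumulant criterion evaluated from a short record.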

As will be shown in Section 4, although this combination of three different cost functions improves the performance of the PSO algorithm in noisy scenarios, it also restricts the acceptance of new global and local best particles to some extent and may slow down the search for the optimum parameter vector even in high-SNR cases. There is therefore a tradeoff between convergence speed and the probability of algorithm failures in noisy environments.

Here, Open image in new window is a small constant of about 0.25 and Open image in new window , Open image in new window are two constants that may change dynamically during the algorithm. The added Open image in new window term is the gradient of Open image in new window with respect to the multichannel equalizer coefficients. The derivation of this gradient is the same as in the steepest descent method of [1]. In our implementation we used a larger starting Open image in new window and after a few iterations set Open image in new window . With this choice, we are able to use the gradient as an initial fast direction of the swarm toward the desired point in the search space. Later, when the swarm has found its way toward the desired solution, we set Open image in new window in order to let the random cooperative search of the swarm perform its fine tuning of the equalizer coefficients. Clearly the relation Open image in new window must hold. The exact values of these two do not seriously affect the results, since PSO adjusts the initial direction of the gradient anyhow. For example, a good initial choice would be Open image in new window ; these values are changed into Open image in new window after 10 iterations.
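The modification can be sketched as follows. This is a minimal toy example, assuming a differentiable fitness with a known gradient; the gradient weight `mu` and the two-stage schedule of the acceleration coefficients are illustrative values standing in for the paper's constants, which are tied to its equalizer cost function.

```python
import numpy as np

def hybrid_pso_maximize(fitness, grad, dim, n_particles=20, n_iter=60,
                        w=0.7, mu=0.25, seed=0):
    """Hybrid PSO sketch: a steepest-ascent gradient term steers the swarm
    early on; later the acceleration coefficients take over for cooperative
    fine tuning around the point the gradient led to."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_val = x.copy(), np.array([fitness(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()

    for it in range(n_iter):
        # illustrative schedule: small c while the gradient dominates, larger after
        c1 = c2 = 0.2 if it < 10 else 0.8
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        g = np.array([grad(p) for p in x])   # gradient-ascent direction
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x) + mu * g
        x = x + v
        vals = np.array([fitness(p) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()

best_x, best_val = hybrid_pso_maximize(lambda p: -np.sum(p**2),
                                       lambda p: -2.0 * p, dim=2)
```

Because every particle is pushed uphill from the start, far fewer iterations and a smaller population suffice than in the purely random standard PSO search.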

From now on, we will refer to this modified version of PSO as hybrid PSO.

## 4. Simulation Results

The inputs Open image in new window of the MIMO system are mutually independent, zero-mean i.i.d. binary sequences taking values Open image in new window with probability 0.5 each. The normalized fourth-order cumulant is Open image in new window in this case.
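The stated cumulant value can be checked numerically. Assuming, as is standard for such sources, binary values ±1: for a zero-mean signal the normalized fourth-order cumulant is (E[s⁴] − 3E[s²]²)/E[s²]², which equals 1 − 3 = −2 here.

```python
import numpy as np

rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], 100000)   # i.i.d. binary source, P(+1) = P(-1) = 0.5

s = s - s.mean()                      # remove the (tiny) empirical mean
m2 = np.mean(s**2)
m4 = np.mean(s**4)
gamma4 = (m4 - 3 * m2**2) / m2**2     # normalized fourth-order cumulant

print(round(gamma4, 3))               # close to the theoretical value -2
```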

In the first simulation, the equalizer length was chosen to be 15 Open image in new window . For the purpose of impulse response estimation and extracted-signal cancellation, Open image in new window was estimated for Open image in new window . The possible permutation and scaling ambiguities in this estimation were resolved by imposing proper normalization and some shifting and alignment, as in [1].

The parameters of the PSO and hybrid PSO algorithms were chosen according to the tradeoff between convergence speed and algorithm runtime. A population of 100 particles is used, with a maximum of 400 allowable iterations for PSO and 60 iterations for hybrid PSO. Open image in new window and Open image in new window of (13) are set to 0.3, and Open image in new window , Open image in new window and Open image in new window of (10) are all set to 1. Open image in new window of (14) was chosen to be 0.2 at the start of the simulation ( Open image in new window ) in order to force the swarm in the desired direction. After about 30 iterations, Open image in new window was set to 0.8 ( Open image in new window ) so that the cooperative random local search (around the global maximum) of the directed swarm begins.

Statistics of the NMSE of the channel impulse response estimates for the three methods ( Open image in new window , Open image in new window ).

| Algorithm | Steepest gradient | Standard PSO | Hybrid PSO |
| --- | --- | --- | --- |
| Mean (dB) | −16.8 | −17.9 | −18.7 |
| Std (dB) | −3.43 | −2.1 | −0.56 |
| Std/Mean | 20.2% | 11.9% | 2.8% |
| Best (dB) | −19.41 | −19.2 | −19.33 |
| Worst (dB) | −8.96 | −12.1 | −14.23 |
| Number of complete failures | 7 | 5 | 0 |

For a final comparison of PSO and the proposed hybrid PSO, the convergence speed and computational complexity of the two must be studied. Simulation results show that for hybrid PSO, a small population of fewer than 30 particles is usually enough for very fine convergence of the swarm to the global optimum, whereas standard PSO takes hundreds of iterations even with large populations of more than 100 particles. Very fast convergence and precise optimization are thus the two most important advantages of hybrid PSO.

In order to study the effect of Open image in new window and Open image in new window of (13) on the identification quality, another experiment was performed, in which the quotient of Open image in new window and Open image in new window is the parameter under study. The simulation results for 50 Monte Carlo simulations with Open image in new window are presented in Figure 2. It can be seen that the best results are achieved at Open image in new window .

Finally, it should be mentioned that our implementation of both PSO and hybrid PSO was the simplest possible, in order to keep the computational complexity and algorithm runtime as close as possible to those of the steepest gradient approach. However, it is always possible to further improve PSO algorithms by employing larger swarm populations and a larger number of allowable iterations at the cost of more computational complexity. The performance of any PSO algorithm can also be further improved by strategically selecting the starting positions of the particles [8].

## 5. Conclusion

Two different realizations of Particle Swarm Optimization for the source extraction step of the well-known inverse filtering MIMO identification approach were studied. Both match the satisfactory results of the original steepest descent method in noise-free scenarios, and they achieve moderate improvement at lower SNRs and with smaller data lengths.

The hybrid PSO algorithm also exhibited a significant improvement in convergence speed compared to standard PSO, even though its initial population of particles was kept smaller. This was the main advantage of hybrid PSO over standard PSO, besides the fact that hybrid PSO was the most precise method, with the least probability of complete failure.

## Notes

### Acknowledgment

We gratefully acknowledge partial financial support of this research by the French *Institut National de Recherche en Informatique et en Automatique* (INRIA Bordeaux Sud-Ouest).

### References

- [1] Tugnait JK: "Identification and deconvolution of multichannel linear non-Gaussian processes using higher order statistics and inverse filter criteria," *IEEE Transactions on Signal Processing*, 1997, 45(3):658-672. doi:10.1109/78.558482
- [2] Kennedy J, Eberhart RC: "Particle swarm optimization," *Proceedings of IEEE International Conference on Neural Networks*, November 1995, Perth, Australia, 4:1942-1948.
- [3] Song T, Xu J, Zhang K: "Blind MIMO channel identification using particle swarm algorithm," *Proceedings of the International Conference on Wireless Communications, Networking and Mobile Computing (WiCOM '07)*, September 2007, Shanghai, China, 444-447.
- [4] Krusienski DJ, Jenkins WK: "A particle swarm optimization-least mean squares algorithm for adaptive filtering," *Proceedings of the 38th Asilomar Conference on Signals, Systems and Computers*, November 2004, Pacific Grove, CA, USA, 1:241-245.
- [5] Krusienski DJ, Jenkins WK: "The application of particle swarm optimization to adaptive IIR phase equalization," *Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '04)*, May 2004, Montreal, Canada, 2:693-696.
- [6] Das S, Abraham A: "Synergy of particle swarm optimization with evolutionary algorithms for intelligent search and optimization," *Proceedings of IEEE International Congress on Evolutionary Computation*, 2006, 1:84-88.
- [7] Tugnait JK: "On blind separation of convolutive mixtures of independent linear signals in unknown additive noise," *IEEE Transactions on Signal Processing*, 1998, 46(11):3109-3111. doi:10.1109/78.726826
- [8] Richards M, Ventura D: "Choosing a starting configuration for particle swarm optimization," *Proceedings of IEEE International Conference on Neural Networks*, July 2004, Budapest, Hungary, 3:2309-2312.

## Copyright information

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.