
Research On Sparse Adaptive Filtering Algorithms

Posted on: 2016-01-26 | Degree: Master | Type: Thesis
Country: China | Candidate: Y Feng | Full Text: PDF
GTID: 2308330503476779 | Subject: Biomedical engineering
Abstract/Summary:
Adaptive filtering, a specialized branch of signal-processing theory whose devices adjust their parameters automatically, has grown from a trickle to a torrent over the last four decades, riding the wave of rapid development in digital signal processing. Its "self-design" adaptation feature, which constitutes the essential characteristic of an intelligent information system, has attracted such a wealth of interest in academia and industry that impressive results have appeared in a multitude of subjects, such as biology, medicine, applied mathematics, and engineering, and in numerous applications, e.g., acoustic echo cancellation, wireless channel estimation, linear prediction, and system identification.

Among the many linear adaptive filtering algorithms, the least mean square (LMS) approach, with its numerous variants, is the most popular, owing to its low computational complexity, ease of implementation, stability, and robustness. As a well-known variant of LMS, the leaky LMS (LLMS) was proposed to mitigate the deterioration in the performance of the LMS algorithm with highly correlated input signals. Interestingly, the signals and systems in many practical scenarios, including the applications mentioned above, are sparse, so sparse models can be widely exploited across a broad range of real-world signal-processing problems.
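As a minimal sketch of the two baseline updates discussed above (illustrative only; the step-size name mu and leakage-factor name gamma are generic conventions, not parameters taken from the thesis):

```python
import numpy as np

def lms_step(w, x, d, mu):
    """Standard LMS: w <- w + mu * e * x, with a-priori error e = d - w^T x."""
    e = d - w @ x
    return w + mu * e * x

def leaky_lms_step(w, x, d, mu, gamma):
    """Leaky LMS: the leakage factor (1 - mu*gamma) shrinks the weights on
    every step, which counteracts weight drift when the input is highly
    correlated (at the cost of a small bias in the converged solution)."""
    e = d - w @ x
    return (1.0 - mu * gamma) * w + mu * e * x
```

In a system-identification setting, `d` is the output of the unknown system for input vector `x`, and the weight vector `w` is iterated toward the unknown impulse response.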
Motivated mainly by the recent research hotspot of Compressed Sensing, the field of sparse adaptive filtering has opened the floodgates to so-called sparse-penalty adaptive filters, which perform much better than traditional methods when dealing with sparse systems.

In this thesis, we propose several new sparse-penalty LMS-based algorithms: the Lp-norm constraint LLMS (Lp-LLMS), the Lp-norm-like constraint LLMS (Lp-like-LLMS), the gradient-compared Lp-norm constraint LMS (GC-Lp-LMS), and its improved version, the new GC-Lp-LMS (NGC-Lp-LMS).

The Lp-LLMS incorporates an Lp-norm (0 < p < 1) penalty into the cost function of the LLMS to obtain a shrinkage term in the weight-update equation, which enhances the performance of the filter in sparse system identification settings. Similarly, the Lp-like-LLMS algorithm exploits a p-norm-like penalty to achieve the same goal. Simulation results verify that these two algorithms improve filter performance in sparse system identification in the presence of noisy input signals.

The GC-Lp-LMS is proposed as a supplement to the recently established Lp-LMS algorithm for sparse adaptive filtering. It employs a gradient comparator to apply the zero attractor of the p-norm constraint only to those taps whose polarity matches that of the gradient of the squared instantaneous error, and it thus achieves a lower mean square error than the standard Lp-LMS algorithm, both theoretically and experimentally. The NGC-Lp-LMS algorithm is derived using a new gradient comparator, a sign-smoothed version of the previous one, to further improve the behavior of the filter; its performance is therefore superior to that of the GC-Lp-LMS in both theory and simulation. Moreover, both comparators can easily be transplanted into other norm-constrained LMS algorithms to derive new approaches to sparse adaptive filtering.
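The zero-attractor and gradient-comparator ideas described above can be sketched as follows. This is a rough illustration under assumed parameter names (kappa for the attractor strength, eps for a small regularizer avoiding division by zero); the exact update forms and the sign-smoothing used in the thesis may differ:

```python
import numpy as np

def gc_lp_lms_step(w, x, d, mu, kappa, p=0.5, eps=0.05, use_comparator=True):
    """One LMS step with an Lp-norm (0 < p < 1) zero attractor.

    The attractor pulls small taps toward zero, which suits sparse systems.
    If use_comparator is True, the attractor acts only on taps whose sign
    agrees with the gradient of the squared instantaneous error, -e * x
    (a sketch of the gradient-comparator selection rule; hypothetical form).
    """
    e = d - w @ x
    # Gradient of the Lp penalty, regularized by eps near w = 0.
    attractor = kappa * p * np.sign(w) / (eps + np.abs(w)) ** (1.0 - p)
    if use_comparator:
        mask = np.sign(w) == np.sign(-e * x)   # matching polarity only
        attractor = attractor * mask
    return w + mu * e * x - attractor
```

The selective mask is the key design choice: taps that the error gradient is already pushing toward zero receive the extra attraction, while taps being pushed away from zero are left to the plain LMS term.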
The numerical simulation results show that these two proposed algorithms achieve better performance than the standard LMS and Lp-LMS algorithms in terms of convergence rate and steady-state behavior in sparse system identification settings.
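A minimal experiment in the spirit of the simulations described can be run as follows. It is illustrative only: the filter length, sparsity pattern, step size, attractor strength, and noise level below are assumptions, not the settings used in the thesis. It identifies a sparse FIR system with and without an Lp zero attractor and inspects the inactive taps:

```python
import numpy as np

def identify(use_attractor, rng, n_iter=4000, L=32, mu=0.01,
             kappa=1e-4, p=0.5, eps=0.05):
    """Identify a sparse FIR system with LMS, optionally adding an
    Lp-norm zero attractor; returns the estimate and the true system."""
    h = np.zeros(L)
    h[::8] = 1.0                          # 4 active taps out of 32
    w = np.zeros(L)
    for _ in range(n_iter):
        x = rng.standard_normal(L)
        d = h @ x + 0.05 * rng.standard_normal()   # noisy observation
        e = d - w @ x
        w = w + mu * e * x
        if use_attractor:
            # Lp-penalty gradient step pulling small taps toward zero.
            w = w - kappa * p * np.sign(w) / (eps + np.abs(w)) ** (1.0 - p)
    return w, h
```

The expected qualitative outcome is the one the thesis reports for sparse-penalty filters: the attractor suppresses the noise-driven fluctuation of the taps that are truly zero, lowering the steady-state misadjustment on sparse systems.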
Keywords/Search Tags: Adaptive filtering, Compressed Sensing, Sparse penalty, Adaptive algorithm, Sparse LMS, Lp-LLMS, Lp-norm-like LLMS, Gradient-comparator Lp-LMS