
Low Complexity Turbo Equalizations and Lower Bounds on Information Rate for Intersymbol Interference Channels

Posted on: 2012-03-02    Degree: Ph.D    Type: Thesis
University: University of Minnesota    Candidate: Jeong, Seongwook    Full Text: PDF
GTID: 2458390008996987    Subject: Engineering
Abstract/Summary:
In this research, low-complexity turbo equalization algorithms are examined as alternatives to the optimal but much more complex Bahl-Cocke-Jelinek-Raviv (BCJR) algorithm. First, a soft-in soft-out (SISO) decision feedback equalizer (DFE) algorithm is presented with extrinsic information mapping methods that directly account for the error propagation effects of the DFE. We also utilize a pair of DFEs operating in opposite directions in a turbo equalization setting to remove the effect of intersymbol interference (ISI) at the receiver, with a new extrinsic information combining strategy that exploits the error correlation between the two sets of DFE outputs. When this method is combined with the proposed DFE extrinsic information formulation, the resulting "bidirectional" turbo-DFE achieves an excellent performance-complexity tradeoff compared to turbo equalization based on the BCJR algorithm. Furthermore, a self-iterating soft equalizer (SISE) consisting of a few relatively weak equalizers is shown to provide robust performance on severe ISI channels. The constituent suboptimal equalizers are allowed to exchange soft information based on methods designed to suppress significant correlation among their soft outputs. The resulting SISE works well as a stand-alone equalizer or as the equalizer component of a turbo equalization system. The performance advantages of the proposed algorithms are validated with bit-error-rate (BER) simulations and extrinsic information transfer (EXIT) chart analysis.

The thesis also presents provable lower bounds on the information rate of any finite-ISI channel. Consider I(X; X + S + N), where X is a symbol drawn independently and uniformly from a fixed, finite-size alphabet, S is a discrete-valued random variable (RV), and N is a Gaussian RV. In particular, when S represents the precursor ISI remaining after the infinite-length unbiased minimum mean-squared error (MMSE) DFE is applied at the channel output, the mutual information I(X; X + S + N) serves as a tight lower bound on the symmetric information rate (SIR), as well as on the capacity, of the ISI channel corrupted by Gaussian noise. The new lower bounds are obtained by first introducing a "mismatched" mutual information function that can be proved to lower-bound I(X; X + S + N), and then further lower-bounding this function with expressions that can be computed via a few single-dimensional integrations at a small computational cost. The new bounds are nearly as tight as the well-known conjectured lower bound of Shamai and Laroia for a wide variety of ISI channels of practical interest.
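To make the quantity I(X; X + S + N) concrete, the following is a minimal numerical sketch (not taken from the thesis) of how this mutual information can be evaluated by a single one-dimensional integration when X is equiprobable BPSK, S is a small discrete RV standing in for residual precursor ISI, and N is zero-mean Gaussian noise. The S alphabet, its probabilities, the noise standard deviation, and the function name are illustrative assumptions, not values from the work.

```python
# Hypothetical sketch: I(X; X + S + N) for BPSK X, discrete S, Gaussian N,
# computed by numerical 1-D integration over the channel output y.
import numpy as np

def mutual_info_bpsk(s_vals, s_probs, sigma):
    """I(X; X+S+N) in bits for X uniform on {-1,+1}, S discrete, N ~ N(0, sigma^2)."""
    x_vals = np.array([-1.0, 1.0])
    span = 1.0 + np.max(np.abs(s_vals)) + 8.0 * sigma
    y = np.linspace(-span, span, 20001)

    def gauss(u):
        return np.exp(-u**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

    # Conditional densities p(y | x) = sum_s P(s) * N(y - x - s; 0, sigma^2)
    p_y_given_x = np.array([
        sum(ps * gauss(y - x - s) for s, ps in zip(s_vals, s_probs))
        for x in x_vals
    ])
    p_y = 0.5 * p_y_given_x.sum(axis=0)  # marginal density of Y

    # I(X;Y) = sum_x P(x) * integral of p(y|x) * log2( p(y|x) / p(y) ) dy
    integrand = p_y_given_x * np.log2(np.maximum(p_y_given_x, 1e-300) /
                                      np.maximum(p_y, 1e-300))
    return 0.5 * np.trapz(integrand, y, axis=1).sum()

# Example: residual ISI taking values {-0.2, 0, 0.2}, noise sigma = 0.5
print(mutual_info_bpsk(np.array([-0.2, 0.0, 0.2]), [0.25, 0.5, 0.25], 0.5))
```

The thesis' bounds avoid enumerating the (possibly large or infinite) support of S directly; the point of the sketch is only to show that, once S is discrete and N is Gaussian, the target quantity reduces to a handful of one-dimensional integrals of this form.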
Keywords/Search Tags: Turbo equalization, Lower bound, Information, ISI channels, DFE