Page 1: Iterative Equalization

JASS 05 St. Petersburg - Michael Meyer - Iterative Equalization

- 1 -

Iterative Equalization

Speaker: Michael Meyer, [email protected]

[Title-slide block diagram of the turbo receiver: y_k → Equalizer/Detector → Demapper → s(c_k) → Deinterleaver → s(b_k) → Decoder → â_k, with soft feedback s'(b_k) → Interleaver → s'(c_k) → Mapper back to the equalizer]

Page 2: Iterative Equalization


System Configuration and Receiver Structures

[System configuration: a_k → Encoder → b_k → Interleaver → c_k → Mapper → x_k → Channel → y_k]

Receiver A: optimal detector (y_k → Optimal Detector → â_k)

Receiver B: one-time equalization and detection (y_k → Equalizer/Detector → s(x_k) → Demapper → s(c_k) → Deinterleaver → s(b_k) → Decoder → â_k)

Receiver C: turbo equalization (as Receiver B, but with soft feedback s'(b_k) → Interleaver → s'(c_k) → Mapper from the decoder back to the equalizer)

Page 3: Iterative Equalization


Interleaver

Example of an interleaver: a 3-random interleaver for 18 code bits.

[Diagram: the interleaver permutes the 18 code bits b_k into c_k within the chain a_k → Encoder → b_k → Interleaver → c_k → Mapper → x_k → Channel → y_k]

Page 4: Iterative Equalization


Equalization

• Methods to compensate for the effects of the channel

[Diagram: transmitter and receiver connected by multiple propagation paths]

For example, multi-path propagation may lead to intersymbol interference (ISI).

Page 5: Iterative Equalization


Used Channel Model

In the following, we will use an AWGN channel with known channel impulse response (CIR). The received signal is given by

y_k = Σ_{l=0}^{L} h_l · x_{k−l} + n_k,  k = 1, 2, …, N

(channel coefficients h_l, sent signal x_k, noise n_k). In matrix form: y = Hx + n, with y = (y_1, …, y_N)^T, x = (x_1, …, x_N)^T, n = (n_1, …, n_N)^T, and the banded convolution matrix H with entries {H}_{k,l} = h_{k−l} for 0 ≤ k − l ≤ L and 0 otherwise:

H = [ h_0  0    0    0   …
      h_1  h_0  0    0   …
      h_2  h_1  h_0  0   …
      0    h_2  h_1  h_0 …
      …                    ]

As an example, we have a length-three channel with h_0 = 0.407, h_1 = 0.815, h_2 = 0.407.

The noise is Gaussian:

p(n) = 1/√(2πσ²) · e^{−n²/(2σ²)}
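The channel model above can be simulated directly. The following is a minimal sketch (not from the talk), assuming the example CIR h = (0.407, 0.815, 0.407) and BPSK input symbols:

```python
import numpy as np

# Sketch (not from the talk): simulate y = Hx + n for the example
# length-three CIR h = (0.407, 0.815, 0.407) with AWGN of std dev sigma.
h = np.array([0.407, 0.815, 0.407])

def channel(x, sigma, rng=None):
    """y_k = sum_{l=0}^{L} h_l x_{k-l} + n_k (symbols before k=1 taken as 0)."""
    rng = np.random.default_rng(0) if rng is None else rng
    v = np.convolve(x, h)[:len(x)]           # noise-free part of the output
    return v + sigma * rng.standard_normal(len(x))

x = np.array([1, 1, -1, 1, -1, -1, 1, 1], dtype=float)  # BPSK symbols
y = channel(x, sigma=0.1)
```

With sigma = 0 the first sample is just h_0·x_1 = 0.407, which is a quick sanity check for the convolution indexing.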

Page 6: Iterative Equalization


The Forward / Backward Algorithm

For Receiver B, the Forward / Backward Algorithm is often used for equalization and decoding.

As this algorithm is a basic building block of our turbo equalization setup, we will discuss it in detail
• for equalization
• for decoding

We will continue our example to make things clear.

The example uses binary phase shift keying (BPSK)

Page 7: Iterative Equalization


The Decision Rule

Receiver B:

[Block diagram: y_k → Equalizer/Detector → s(x_k) → Demapper → s(c_k) → Deinterleaver → s(b_k) → Decoder → â_k]

The decision rule for the equalizer is

ĉ_k = 0 if L(c_k|y) ≥ 0,  ĉ_k = 1 if L(c_k|y) < 0

with the log-likelihood ratio

L(c|y) = ln [ P(c = 0|y) / P(c = 1|y) ]

So, we have to calculate L(c|y).

Page 8: Iterative Equalization


The Trellis Diagram (1)

[Trellis diagram from time k to k+2: a transition from state r_i to state r_j is labeled with input x_k = x_{i,j} and output v_k = v_{i,j}; the four states carry the register contents (x_{k−1}, x_{k−2}) ∈ {(1,1), (−1,1), (1,−1), (−1,−1)}]

A branch of the trellis is a four-tuple (i, j, x_{i,j}, v_{i,j}).

Page 9: Iterative Equalization


The Trellis Diagram (2)

If the tapped delay line contains L elements and we use a binary alphabet {+1, −1}, the channel can be in one of 2^L states r_i. The set of possible states is

S = {r_0, r_1, …, r_{2^L − 1}}

At each time instance k = 1, 2, …, N the state of the channel is a random variable s_k ∈ S. In our example, 2^L = 4.

Page 10: Iterative Equalization


The Trellis Diagram (3)

Using a binary alphabet, a given state s_k = r_i can only develop into two different states s_{k+1} = r_j, depending on the input symbol x_k = x_{i,j} ∈ {+1, −1}. The output symbol v_k = v_{i,j} in the noise-free case is easily calculated by

v_k = Σ_{l=0}^{L} h_l · x_{k−l},  or in matrix form  v = Hx

For example, for the branch from r_2 to r_0 (input +1, register contents +1, −1):

v_{2,0} = h_0·1 + h_1·1 + h_2·(−1) = 0.407·1 + 0.815·1 + 0.407·(−1) = 0.815
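The branch outputs can be enumerated programmatically. The sketch below is illustrative; the state indexing i = b_{k−1} + 2·b_{k−2} (bit 0 ↔ symbol +1) is an assumption chosen so that the resulting branch set matches the set B given on the next slide:

```python
# Illustrative sketch: enumerate the trellis branches (i, j, x_ij, v_ij)
# for the example channel. The state indexing i = b_{k-1} + 2*b_{k-2}
# (bit 0 <-> symbol +1) is an assumption chosen so that the branch set
# matches B = {(00),(01),(12),(13),(20),(21),(33),(32)}.
h = [0.407, 0.815, 0.407]
sym = lambda bit: 1 - 2 * bit     # bit 0 -> +1, bit 1 -> -1 (BPSK)

def branches():
    out = []
    for i in range(4):
        b1, b2 = i & 1, i >> 1            # bits of (x_{k-1}, x_{k-2})
        for a in (0, 1):                  # bit of the input symbol x_k
            j = a + 2 * b1                # successor state index
            v = h[0]*sym(a) + h[1]*sym(b1) + h[2]*sym(b2)
            out.append((i, j, sym(a), round(v, 3)))
    return out
```

Branch (0,0), for instance, has output v = h_0 + h_1 + h_2 = 1.629, the value 1.63 used in the γ example later on.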

Page 11: Iterative Equalization


The Trellis Diagram (4)

x_{i,j} and v_{i,j} are uniquely identified by the index pair (i j). The set of all index pairs (i j) corresponding to valid branches is denoted B.

e.g. B = {(00), (01), (12), (13), (20), (21), (33), (32)}

Page 12: Iterative Equalization


The Joint Distribution p(sk, sk+1, y)

As we are separating the equalization from the decoding task, we assume that the random variables x_k are statistically independent (IID), hence

P(x) = Π_{k=1}^{N} P(x_k)

We then have to calculate P(s_k = r_i, s_{k+1} = r_j | y).

This is the probability that the transmitted sequence's path in the trellis contains the branch (i, j, x_{i,j}, v_{i,j}) at time instance k.

This APP (a posteriori probability) can be computed efficiently with the forward/backward algorithm, based on a suitable decomposition of the joint distribution

p(s_k, s_{k+1}, y) = p(y) · P(s_k, s_{k+1} | y)

Page 13: Iterative Equalization


The Decomposition

We can write the joint distribution as p(s_k, s_{k+1}, y) = p(s_k, s_{k+1}, (y_1,…,y_{k−1}), y_k, (y_{k+1},…,y_N)) and decompose it into

p(s_k, s_{k+1}, y) = p(s_k, y_1,…,y_{k−1}) · p(s_{k+1}, y_k | s_k) · p(y_{k+1},…,y_N | s_{k+1})
                   = α_k(s_k) · γ_k(s_k, s_{k+1}) · β_{k+1}(s_{k+1})

• α_k(s_k): probability that contains all paths through the trellis leading to state s_k
• γ_k(s_k, s_{k+1}): probability of the transition from s_k to s_{k+1} with symbol y_k
• β_{k+1}(s_{k+1}): probability that contains all possible paths from state s_{k+1} to s_N

Page 14: Iterative Equalization


The Transition Probability γ

We can further decompose the transition probability into

γ_k(s_k, s_{k+1}) = P(s_{k+1}|s_k) · p(y_k|s_k, s_{k+1})

Using the index pair (i j) and the set B, we get

γ_k(r_i, r_j) = P(x_k = x_{i,j}) · p(y_k | v_k = v_{i,j})  if (i j) ∈ B,  and 0 if (i j) ∉ B

e.g. γ_k(r_0, r_3) = 0 as (03) ∉ B, and γ_k(r_0, r_0) = P(x_k = +1) · p(y_k | v_k = 1.63).

From the channel law y_k = v_k + n_k and the Gaussian distribution we know that

p(y_k | v_k) = 1/√(2πσ²) · e^{−(y_k − v_k)²/(2σ²)}

Page 15: Iterative Equalization


The Probability α (Forward)

The term α_k(s) can be computed via the recursion

α_k(s) = Σ_{s′∈S} α_{k−1}(s′) · γ_{k−1}(s′, s)

with the initial value α_0(s) = P(s_0 = s).

[Diagram: α_1(s_1) at state s_1, transition γ_1(s_1, s_2), α_2(s_2) at state s_2] For example,

α_2(r_j) = Σ_{i=0}^{3} α_1(r_i) · γ_1(r_i, r_j)

Note: α_k contains all possible paths leading to s_k.

Page 16: Iterative Equalization


The Probability β (Backward)

Analogously, the term β_k(s) can be computed via the recursion

β_k(s) = Σ_{s′∈S} β_{k+1}(s′) · γ_k(s, s′)

with the initial value β_N(s) = 1 for all s ∈ S.

[Diagram: β_2(s_2) at state s_2, transition γ_2(s_2, s_3), β_3(s_3) at state s_3] For example,

β_2(r_i) = Σ_{j=0}^{3} β_3(r_j) · γ_2(r_i, r_j)

Page 17: Iterative Equalization


The Formula For The LLR

Now we know the APP P(x_k = x|y). All we need to do is sum the branch APPs P(s_k, s_{k+1}|y) over all branches that correspond to an input symbol x_k = x:

P(x_k = x|y) = Σ_{(ij)∈B: x_{i,j}=x} P(s_k = r_i, s_{k+1} = r_j | y)
             = (1/p(y)) · Σ_{(ij)∈B: x_{i,j}=x} α_k(r_i) · γ_k(r_i, r_j) · β_{k+1}(r_j)

To compute the APP P(x_k = +1|y), the branch APPs of the index pairs (00), (12), (20) and (32) have to be summed.

L(c_k|y) = ln [ P(c_k = 0|y) / P(c_k = 1|y) ] = ln [ P(x_k = +1|y) / P(x_k = −1|y) ]
         = ln [ Σ_{(ij)∈B: x_{i,j}=+1} α_k(r_i) γ_k(r_i, r_j) β_{k+1}(r_j) ]
              / [ Σ_{(ij)∈B: x_{i,j}=−1} α_k(r_i) γ_k(r_i, r_j) β_{k+1}(r_j) ]

Page 18: Iterative Equalization


The FBA in Matrix Form

For convenience, the forward/backward algorithm may also be expressed in matrix form. We need to create two matrices:

P_k ∈ R^{|S|×|S|} with {P_k}_{i,j} = γ_k(r_i, r_j)

A(x) ∈ R^{|S|×|S|} with {A(x)}_{i,j} = 1 if (i j) is a branch with x_{i,j} = x, and 0 otherwise

A third matrix is created by elementwise multiplication: B_k(x) = A(x) ⊙ P_k ∈ R^{|S|×|S|}.

For the example trellis:

A(+1) = [ 1 0 0 0        A(−1) = [ 0 1 0 0
          0 0 1 0                  0 0 0 1
          1 0 0 0                  0 1 0 0
          0 0 1 0 ]                0 0 0 1 ]

Page 19: Iterative Equalization


The Algorithm

Input: matrices P_k and B_k(x). We calculate vectors f_k ∈ R^{|S|×1} and b_k ∈ R^{|S|×1}.

Initialize with f_0 = 1 and b_N = 1.

For k = 1 to N step 1 (forward): f_k = P_k^T · f_{k−1}

For k = N to 1 step −1 (backward): b_{k−1} = P_k · b_k

Output the LLRs:

L(c_k|y) = ln [ f_{k−1}^T B_k(+1) b_k ] / [ f_{k−1}^T B_k(−1) b_k ]
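The matrix-form recursions can be sketched end to end for the example channel. This is a minimal illustration rather than the talk's code: the uniform priors P(x_k = +1) = P(x_k = −1) = 1/2 and the state indexing are assumptions made for this sketch.

```python
import numpy as np

# Minimal sketch of the matrix-form forward/backward algorithm for the
# example channel (4 states, BPSK). Uniform priors and the state
# indexing i = b_{k-1} + 2*b_{k-2} are assumptions for this sketch.
h = [0.407, 0.815, 0.407]
sym = lambda bit: 1 - 2 * bit  # bit 0 -> +1, bit 1 -> -1

def trellis():
    """All branches (i, j, x_ij, v_ij)."""
    T = []
    for i in range(4):
        b1, b2 = i & 1, i >> 1
        for a in (0, 1):
            v = h[0]*sym(a) + h[1]*sym(b1) + h[2]*sym(b2)
            T.append((i, a + 2 * b1, sym(a), v))
    return T

def fba_llr(y, sigma):
    """LLRs L(c_k|y) = ln P(x_k=+1|y) / P(x_k=-1|y) for each time step."""
    N, T = len(y), trellis()
    A = {+1: np.zeros((4, 4)), -1: np.zeros((4, 4))}
    for i, j, x, v in T:
        A[x][i, j] = 1.0
    # P_k with entries gamma_k(r_i, r_j) = P(x_k) * p(y_k | v_k = v_ij)
    P = np.zeros((N, 4, 4))
    for k in range(N):
        for i, j, x, v in T:
            P[k, i, j] = 0.5 * np.exp(-(y[k] - v) ** 2 / (2 * sigma ** 2))
    f = [np.ones(4)]                      # forward:  f_k = P_k^T f_{k-1}
    for k in range(N):
        f.append(P[k].T @ f[-1])
    b = [np.ones(4)]                      # backward: b_{k-1} = P_k b_k
    for k in reversed(range(N)):
        b.append(P[k] @ b[-1])
    b = b[::-1]                           # b[k] now matches time index k
    return np.array([np.log(f[k] @ (A[+1] * P[k]) @ b[k + 1]
                          / (f[k] @ (A[-1] * P[k]) @ b[k + 1]))
                     for k in range(N)])
```

Feeding in the noise-free output of an all-(+1) transmission (y_k = 1.629 for all k) should yield strongly positive LLRs at every position, i.e. decisions ĉ_k = 0.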

Page 20: Iterative Equalization


Soft Processing

[Receiver B diagram as before: y_k → Equalizer/Detector → s(x_k) → Demapper → s(c_k) → Deinterleaver → s(b_k) → Decoder → â_k]

A natural choice for the soft information s(x_k) are the APPs, or similarly the LLRs L(c_k|y), which are a "side product" of the maximum a posteriori probability (MAP) symbol detector.

Also, the Viterbi equalizer may produce approximations of L(c_k|y).

For filter-based equalizers, extracting s(x_k) is more difficult. A common approach is to assume that the estimation error

e_k = x̂_k − x_k

is Gaussian distributed with PDF p(e_k).

Page 21: Iterative Equalization


Decoding - Basics

• Convert the LLRs L(c_k|y) back to probabilities:

P(c_k = c|y) = e^{−c·L(c_k|y)} / (1 + e^{−L(c_k|y)}),  c ∈ {0, 1}

• Deinterleave P(c_k|y) to P(b_k|y)

• p = [P(b_1|y), P(b_2|y), …, P(b_N|y)] is the input set of probabilities to the decoder

• With the forward/backward algorithm we may again calculate the LLR L(a_k|p)

For the example: encoder of a convolutional code, where each incoming data bit a_k yields two code bits (b_{2k−1}, b_{2k}) via

b_{2k−1} = a_k ⊕ a_{k−2}
b_{2k} = a_k ⊕ a_{k−1} ⊕ a_{k−2}
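The example encoder takes only a few lines; the sketch below is illustrative, with the shift registers assumed to start at zero:

```python
# Sketch of the example rate-1/2 convolutional encoder (registers
# assumed to start at 0):
#   b_{2k-1} = a_k XOR a_{k-2},   b_{2k} = a_k XOR a_{k-1} XOR a_{k-2}
def encode(a):
    a1 = a2 = 0                      # a_{k-1}, a_{k-2}
    out = []
    for ak in a:
        out.append(ak ^ a2)          # b_{2k-1}
        out.append(ak ^ a1 ^ a2)     # b_{2k}
        a1, a2 = ak, a1              # shift the delay line
    return out
```

Each data bit produces two code bits, so K data bits yield N = 2K code bits, matching the K = 512, N = 1024 figures used later.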

Page 22: Iterative Equalization


Decoding - Trellis

The convolutional code yields a new trellis with branches denoted by the tuple (i, j, a_{i,j}, b_{1,i,j}, b_{2,i,j}). The set B remains {(00), (01), (12), (13), (20), (21), (33), (32)}.

[Trellis diagram: a transition from state r_i to state r_j is labeled with input a_k = a_{i,j} and output (b_{2k−1}, b_{2k}) = (b_{1,i,j}, b_{2,i,j}); the branch outputs take the values (0,0), (1,0), (0,1), (1,1)]

Page 23: Iterative Equalization


Decoding – Formulas (1)

To apply the forward/backward algorithm, we have to adjust the way P_k and A(x) are formed. For P_k we have to redefine the transition probability

γ_k(r_i, r_j) = P(a_k = a_{i,j}) · P(b_{2k−1} = b_{1,i,j}|y) · P(b_{2k} = b_{2,i,j}|y)  if (i j) ∈ B,  and 0 if (i j) ∉ B

e.g. γ_k(r_0, r_0) = P(a_k = 0) · P(b_{2k−1} = 0|y) · P(b_{2k} = 0|y), where P(a_k = 0) = 1/2 because of IID, and the bit probabilities P(b|y) come from the equalizer.

As before, {P_k}_{i,j} = γ_k(r_i, r_j), and

{A_a(x)}_{i,j} = 1 if (i j) is a branch with a_{i,j} = x, and 0 otherwise

B_a(x) = A_a(x) ⊙ P_k

Page 24: Iterative Equalization


Decoding – Formulas (2)

So, we calculate L(a_k|p) using the forward/backward algorithm. By changing A(x) we can also calculate L(b_{2k−1}|p) and L(b_{2k}|p), which will later serve as a priori information for the equalizer.

For L(b_{2k−1}|p): {A_{b1}(x)}_{i,j} = 1 if (i j) is a branch with b_{1,i,j} = x, and 0 otherwise

For L(b_{2k}|p): {A_{b2}(x)}_{i,j} = 1 if (i j) is a branch with b_{2,i,j} = x, and 0 otherwise

with the set of probabilities p = [P(b_1|y), P(b_2|y), …, P(b_N|y)].

Page 25: Iterative Equalization


Decoding - Example

The selection matrices for the example trellis:

L(a_k|p):

A_a(1) = [ 0 1 0 0        A_a(0) = [ 1 0 0 0
           0 0 0 1                   0 0 1 0
           0 1 0 0                   1 0 0 0
           0 0 0 1 ]                 0 0 1 0 ]

L(b_{2k−1}|p):

A_{b1}(1) = [ 0 1 0 0     A_{b1}(0) = [ 1 0 0 0
              0 0 0 1                   0 0 1 0
              1 0 0 0                   0 1 0 0
              0 0 1 0 ]                 0 0 0 1 ]

L(b_{2k}|p):

A_{b2}(1) = [ 0 1 0 0     A_{b2}(0) = [ 1 0 0 0
              0 0 1 0                   0 0 0 1
              1 0 0 0                   0 1 0 0
              0 0 0 1 ]                 0 0 1 0 ]
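These selection matrices can also be generated from the branch tuples themselves. A sketch, where the state indexing i = a_{k−1} + 2·a_{k−2} is an assumption chosen to match B, and the branch labels follow the encoder equations above:

```python
# Sketch: generate the selection matrices from the decoder trellis. The
# state indexing i = a_{k-1} + 2*a_{k-2} is an assumption chosen so the
# branch set matches B; branch labels follow the encoder equations.
def dec_branches():
    out = []
    for i in range(4):
        a1, a2 = i & 1, i >> 1               # a_{k-1}, a_{k-2}
        for a in (0, 1):
            out.append((i, a + 2 * a1, a, a ^ a2, a ^ a1 ^ a2))
    return out

def selector(pos):
    """Return A(x) with A(x)[i][j] = 1 iff the branch label at `pos` is x."""
    def A(x):
        M = [[0] * 4 for _ in range(4)]
        for br in dec_branches():
            if br[pos] == x:
                M[br[0]][br[1]] = 1
        return M
    return A

A_a, A_b1, A_b2 = selector(2), selector(3), selector(4)
```

Each A(x) selects exactly four of the eight branches, and A(0) and A(1) partition B, which is a quick consistency check.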

Page 26: Iterative Equalization


Decoding - Algorithm

For decoding, we may use the same forward/backward algorithm with a different initialization, as the encoder has to terminate in the zero state at time steps k = 0 and k = K. Change B_k(x) to output L(b_{2k−1}|p) or L(b_{2k}|p).

Input: matrices P_k and B_k(x)

Initialize with f_0 = [1 0 … 0]^T ∈ R^{|S|×1} and b_N = [1 0 … 0]^T ∈ R^{|S|×1}

For k = 1 to N step 1 (forward): f_k = P_k^T · f_{k−1}

For k = N to 1 step −1 (backward): b_{k−1} = P_k · b_k

Output the LLRs:

L(a_k|p) = ln [ f_{k−1}^T B_k(0) b_k ] / [ f_{k−1}^T B_k(1) b_k ]

Page 27: Iterative Equalization


Bit Error Rate (BER)

Performance of separate equalization and decoding with hard estimates (dashed lines) or soft information (solid lines). The system transmits K = 512 data bits and uses a 16-random interleaver to scramble N = 1024 code bits.

With soft information, we may gain 2 dB, but it is still a long way to −1.6 dB.

Page 28: Iterative Equalization


Block Diagram - Separated Concept

[Block diagram of the separated receiver: observations y and prior probabilities enter the equalizer's forward/backward algorithm, which outputs a posteriori probabilities L(c_k|y); these are deinterleaved to L(b_k|y) and serve as observations for the decoder's forward/backward algorithm, which outputs L(b_k|p); a decision rule on L(a_k|p) yields â_k]

Let's look again at the transition probability:

γ_k(r_i, r_j) = p(y_k | v_k = v_{i,j}) · P(x_k = x_{i,j})

The first factor is local evidence about which branch in the trellis was traversed; the second factor is prior information.

So far:
• The equalizer does not have any prior knowledge available, so the formation of entries in P_k relies solely on the observation y.
• The decoder forms the corresponding entries in P_k without any local observations, but entirely based on the bitwise probabilities P(b_k|y) provided by the equalizer.

[Generic block diagram of the f/b algorithm: observations and prior probabilities in, a posteriori probabilities out]

Page 29: Iterative Equalization


Block Diagram - Turbo Equalization

[Block diagram of the turbo equalizer: the equalizer's forward/backward algorithm outputs L(c_k|y); subtracting the prior L_ext(c_k|p) yields the extrinsic information L_ext(c_k|y), which is deinterleaved to L_ext(b_k|y) and fed to the decoder as observations; the decoder outputs L(b_k|p); subtracting L_ext(b_k|y) yields L_ext(b_k|p), which is interleaved back to L_ext(c_k|p) and used as prior probabilities by the equalizer in the next iteration; a decision rule on L(a_k|p) yields â_k]

Turbo Equalization

Page 30: Iterative Equalization


Block Diagram - Comparison

[Side-by-side block diagrams: Receiver B, separated equalization and detection without feedback, and Receiver C, turbo equalization with the extrinsic-information feedback loop through the interleaver]

Receiver B: separated equalization and detection

Receiver C: Turbo Equalization

Page 31: Iterative Equalization


Turbo Equalization - Calculation

[Turbo equalizer block diagram as before]

Caution: we have to split L(c_k|y) = L_ext(c_k|y) + L(c_k), as only extrinsic information is fed back. L_ext(c_k|y) does not depend on L(c_k); feeding back L(c_k) would create direct positive feedback, usually converging far from the globally optimal solution.

The interleavers are included in the iterative update loop to further disperse the direct feedback effect. The forward/backward algorithm creates locally highly correlated output; these correlations between neighboring symbols are largely suppressed by the interleaver.

Page 32: Iterative Equalization


Turbo Equalization - Algorithm

Input: observation sequence y; channel coefficients h_l for l = 0, 1, …, L

Initialize: predetermine the number of iterations T; initialize the sequence of LLRs L_ext(c|p) to 0

Compute recursively for T iterations:
  L(c|y) = ForwardBackward(L_ext(c|p))   (equalizer)
  L_ext(c|y) = L(c|y) − L_ext(c|p)
  L(b|p) = ForwardBackward(L_ext(b|y))   (decoder, after deinterleaving)
  L_ext(b|p) = L(b|p) − L_ext(b|y)       (interleaved back for the equalizer)

Output: compute data bit estimates â_k from L(a_k|p)
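The iterative update can be sketched as a skeleton loop. This is illustrative only: `equalizer_fba` and `decoder_fba` stand in for the two forward/backward passes, and `perm` is the interleaver permutation (c_k = b_{perm[k]}); all three names are assumptions for this sketch.

```python
import numpy as np

# Skeleton of the turbo update loop (illustrative): equalizer_fba and
# decoder_fba stand in for the two forward/backward passes; perm is the
# interleaver permutation with c_k = b_{perm[k]}.
def turbo(y, equalizer_fba, decoder_fba, perm, T):
    Lext_cp = np.zeros(len(perm))          # prior LLRs for the equalizer
    inv = np.argsort(perm)                 # inverse (deinterleaver) permutation
    for _ in range(T):
        Lcy = equalizer_fba(y, Lext_cp)    # equalizer APPs L(c|y)
        Lext_cy = Lcy - Lext_cp            # keep only the extrinsic part
        Lext_by = Lext_cy[inv]             # deinterleave c -> b
        Lbp = decoder_fba(Lext_by)         # decoder APPs L(b|p)
        Lext_bp = Lbp - Lext_by            # extrinsic part again
        Lext_cp = Lext_bp[perm]            # interleave b -> c, feed back
    return Lext_cp
```

The two subtractions are exactly the split described on the next slide: only extrinsic information circulates in the loop.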

Page 33: Iterative Equalization


Turbo Equalization - BER

[Three BER figures: A, B (Turbo MMSE), C (Turbo MAP)]

The system transmits K = 512 data bits and uses a 16-random interleaver to scramble N = 1024 code bits. Figure A uses separate equalization and detection. Figure B uses turbo MMSE equalization with 0, 1, 2, and 10 iterations. Figure C uses turbo MAP equalization after the same iterations. The line marked with "x" is the performance with K = 25000 and 40-random interleaving after 20 iterations.

Page 34: Iterative Equalization


Turbo Equalization – EXIT Charts [2]

[Receiver EXIT charts at E_S/N_0 = 4 dB and at E_S/N_0 = 0.8 dB]

Page 35: Iterative Equalization


Linear Equalization

The computational effort is so far determined by the number of trellis states. An 8-ary alphabet gives 8^L states in the trellis. Linear filter-based approaches perform only simple operations on the received symbols, which are usually applied sequentially to a subset of M observed symbols y_k, e.g. y_k = (y_{k−5} y_{k−4} … y_{k+5})^T with M = 11.

A channel of length L can be expressed as y_k = H̃ x_k + n_k with H̃ ∈ R^{M×(M+L)}. Any type of linear processing of y_k to compute x̂_k can be expressed as x̂_k = f^T y_k + b.

The channel law immediately suggests choosing f^T to invert H̃, the zero-forcing approach. With noise present, an estimate x̂_k = x_k + f^T n_k is obtained. This approach suffers from "noise enhancement", which can be severe if H̃ is ill conditioned.

This effect can be avoided using linear minimum mean square error (MMSE) estimation, minimizing E[|x̂_k − x_k|²]. The equation used is

x̂_k = f^T y_k  with  f = (σ² I + H̃ H̃^H)^{−1} H̃ u

where the unit vector u selects the position of the desired symbol x_k.

It is also possible to nonlinearly process previous estimates, besides the linear processing of y_k, to find the estimate x̂_k (decision-feedback equalization, DFE).
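The MMSE filter computation might be sketched as follows. This is illustrative: the row-wise construction of the windowed channel matrix and the placement of the selector u are assumptions made for this sketch (real-valued signals, unit symbol power).

```python
import numpy as np

# Illustrative sketch of the MMSE filter above: build the windowed
# channel matrix Ht (y_k = Ht x_k + n_k) for a length-(L+1) CIR, then
#   f = (sigma^2 I + Ht Ht^T)^{-1} Ht u.
# The row construction and the selector u are assumptions.
def mmse_filter(h, M, sigma2, delay):
    h = np.asarray(h, dtype=float)
    L = len(h) - 1
    Ht = np.zeros((M, M + L))
    for r in range(M):
        Ht[r, r:r + L + 1] = h[::-1]       # one shifted copy of the CIR
    u = np.zeros(M + L)
    u[delay] = 1.0                         # selects the desired symbol x_k
    f = np.linalg.solve(sigma2 * np.eye(M) + Ht @ Ht.T, Ht @ u)
    return f, Ht
```

The estimate is then x̂_k = f^T y_k for each window of M received samples; for the symmetric example CIR and a centered selector, the resulting filter is itself symmetric.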

Page 36: Iterative Equalization


Complexity [2]

Approach              | Real Multiplications                  | Real Additions
MAP equalizer         | 3·2^(mM) + 2^m·2^(m(M−1))             | 3·2^(mM) + 2^(m−1)·2^(m(M−1))
exact MMSE LE         | 16N² + 4M² + 10M − 4N − 4             | 8N² + 2M² − 10N + 2M + 4
approx. MMSE LE (I)   | 4N + 8M                               | 4N + 4M − 4
approx. MMSE LE (II)  | 10M                                   | 10M − 2
MMSE DFE              | 16N² + 4M² + 10M − 4N − 4             | 8N² + 2M² − 10N + 2M + 4

M: channel impulse response length
N: equalizer filter length
2^m: alphabet size of the signal constellation
DFE: decision-feedback equalization

Page 37: Iterative Equalization


Comparison
• The MMSE approaches have reduced complexity.
• The MMSE approaches perform as well as the BER-optimal MAP approach, only requiring a few more iterations.
• However, the MAP equalizer may handle SNR ranges where all other approaches fail.

Ideas
• Treat scenarios with unknown channel characteristics, e.g. combined channel estimation and equalization using a priori information
• Switch between MAP and MMSE algorithms depending on the fed-back soft information

Page 38: Iterative Equalization


Thank you for your attention!

Questions & Comments ?

Page 39: Iterative Equalization


References

[1] Koetter, R.; Singer, A.; Tüchler, M.: Turbo Equalization. IEEE Signal Processing Magazine, vol. 21, no. 1, pp. 67-80, Jan. 2004

[2] Tüchler, M.; Koetter, R.; Singer, A.: Turbo Equalization: Principles and New Results. IEEE Trans. Commun., vol. 50, pp. 754-767, May 2002

[3] Tüchler, M.; Singer, A.; Koetter, R.: Minimum Mean Squared Error Equalization Using A-priori Information. IEEE Trans. Signal Processing, vol. 50, pp. 673-683, March 2002

