HYBRID ARQ IN WIRELESS NETWORKS


Emina Soljanin

Mathematical Sciences Research Center, Bell Labs

March 19, 2003

R. Liu and P. Spasojevic

WINLAB

1. ACKNOWLEDGEMENTS

Alexei Ashikhmin

Jaehyiong Kim

Sudhir Ramakrishna

Adriaan van Wijngaarden


2. AUTOMATIC REPEAT REQUEST

• The receiving end detects frame errors and requests retransmissions.

• If P_e is the frame error rate, the average number of transmissions is

  1·(1 − P_e) + 2·P_e(1 − P_e) + ··· + n·P_e^{n−1}(1 − P_e) + ··· = 1/(1 − P_e).

• Hybrid ARQ uses a code that can correct some frame errors.

• In HARQ schemes

– the average number of transmissions is reduced, but

– each transmission carries redundant information.
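As a quick check of the 1/(1 − P_e) formula, here is a minimal Python sketch (not from the talk; the frame error rate and trial count are illustrative assumptions) that simulates plain ARQ and compares the empirical average number of transmissions with the closed form.

```python
import random

def avg_transmissions(p_e, frames=100_000, seed=1):
    """Plain ARQ: each frame is retransmitted until it arrives error-free."""
    rng = random.Random(seed)
    total = 0
    for _ in range(frames):
        attempts = 1
        while rng.random() < p_e:   # frame error -> request a retransmission
            attempts += 1
        total += attempts
    return total / frames

p_e = 0.3                         # assumed frame error rate
print(avg_transmissions(p_e))     # empirical average, about 1.43
print(1 / (1 - p_e))              # closed form 1/(1 - P_e) = 1.4286...
```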

3. THROUGHPUT IN HYBRID ARQ: BPSK, AWGN, BCH Coded

[Figure: throughput vs. Es/N0 [dB] for uncoded transmission and the BCH codes [255,247,3], [255,215,11], and [255,177,23], compared with the BPSK cutoff rate and capacity.]


4. TYPE II HYBRID ARQ: Incremental Redundancy

• Information bits are encoded by a (low rate) mother code.

• Information and a selected number of parity bits are transmitted.

• If a transmission is not successful:

– transmitter sends additional selected parity bits

– receiver puts together the new bits and those previously received.

• Each retransmission produces a codeword of a stronger code.

• A family of codes is obtained by puncturing the mother code.
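A minimal sketch of the incremental-redundancy loop described above, assuming a noiseless link and a `decode` callback supplied by the caller (the function name, block structure, and four-transmission limit are illustrative, not from the talk):

```python
def ir_harq(mother_codeword, blocks, decode, max_tx=4):
    """Type II HARQ with incremental redundancy.

    mother_codeword : full codeword of the low-rate mother code (list of bits)
    blocks          : list of position sets, one per transmission (a partition)
    decode          : callback taking the accumulated {position: bit} map and
                      returning the decoded information word, or None on failure
    """
    received = {}
    for tx, block in enumerate(blocks[:max_tx], start=1):
        # Transmit the selected bits; the link is noiseless here for illustration.
        received.update({i: mother_codeword[i] for i in block})
        word = decode(received)   # each attempt decodes a stronger (less punctured) code
        if word is not None:
            return word, tx       # success after tx transmissions
    return None, max_tx           # all transmissions used up
```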

5. INCREMENTAL REDUNDANCY: A Rate 1/5 Mother Code

[Diagram: the rate-1/5 mother codeword at the transmitter; transmissions #1 through #4 carry successive punctured blocks, which are accumulated at the receiver.]

6. THROUGHPUT IN HYBRID ARQ

[Figure: "HARQ Scheme based on Turbo codes in AWGN Channel"; throughput vs. Es/N0 (dB) for the new puncturing scheme and the standard scheme, compared with BPSK capacity and the cutoff rate.]


7. RANDOMLY PUNCTURED CODES

• The mother code is an (n, k) rate R turbo code.

• Each bit is punctured independently with probability λ.

• The expected rate of the punctured code is R/(1− λ).

• For large n, roughly (1 − λ)n bits survive puncturing, so the rate concentrates around R/(1 − λ):

[Diagram: k bits → turbo code → n bits → puncturing device → (1 − λ)n bits]
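A small sketch of random puncturing and its effect on the rate (the mother-code parameters and λ below are assumptions chosen only to illustrate R/(1 − λ)):

```python
import random

def random_puncture(codeword, lam, seed=0):
    """Keep each code bit independently with probability 1 - lam."""
    rng = random.Random(seed)
    return [b for b in codeword if rng.random() >= lam]

k, R, lam = 1000, 1 / 5, 0.6                # assumed: rate-1/5 mother code, k = 1000
n = int(k / R)                              # n = 5000 mother-code bits
survivors = random_puncture([0] * n, lam)   # bit values do not matter for the rate
print(len(survivors))                       # about (1 - lam) * n = 2000
print(k / len(survivors))                   # empirical rate, close to R / (1 - lam) = 0.5
```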


8. A FAMILY OF RANDOMLY PUNCTURED CODES: Rate Compatible Puncturing

• The mother code is an (n, k) rate R turbo code.

• λ_j for j = 1, 2, ..., m are puncturing rates, with λ_j > λ_k for j < k.

• If the i-th bit is punctured in the k-th code and j < k, then it was punctured in the j-th code.

• θ_i for i = 1, 2, ..., n are uniformly distributed over [0, 1].

• If θ_i < λ_l, then the i-th bit is punctured in the l-th code.
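The θ-construction above is easy to mimic in code. A minimal sketch (the puncturing rates below are assumed values) showing that the resulting patterns are nested, i.e. rate compatible:

```python
import random

def rate_compatible_patterns(n, lambdas, seed=0):
    """Draw one uniform theta_i per bit and puncture bit i in the l-th code
    iff theta_i < lambdas[l].  Decreasing lambdas give nested punctured sets."""
    rng = random.Random(seed)
    theta = [rng.random() for _ in range(n)]
    return [[t < lam for t in theta] for lam in lambdas]

# Assumed puncturing rates lambda_1 > lambda_2 > lambda_3:
for j, pattern in enumerate(rate_compatible_patterns(20, [0.8, 0.5, 0.2]), start=1):
    print(j, ''.join('x' if p else '.' for p in pattern))   # 'x' = punctured bit
# Every 'x' in a later (smaller-lambda) row also appears in the earlier rows.
```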


9. MEMORYLESS CHANNEL MODEL

• Binary input alphabet {0, 1} and output alphabet Y.

• Constant in time, with transition probabilities W(b|0) and W(b|1), b ∈ Y.

• Time varying, with transition probabilities at time i given by W_i(b|0) and W_i(b|1), b ∈ Y.

• W_i(·|0) and W_i(·|1) are known at the receiver.


10. PERFORMANCE MEASURE: Time Invariant Channel

• A sequence x ∈ C ⊆ {0, 1}^n is transmitted, and x′ is decoded.

• The sequences x and x′ are at Hamming distance d.

• The probability of error P_e(x, x′) can be bounded as

  P_e(x, x′) ≤ γ^d = exp{−dα},

  where γ is the Bhattacharyya noise parameter,

  γ = ∑_{b ∈ Y} √( W(b|x=0) · W(b|x=1) ),

  and α = −log γ is the Bhattacharyya distance.
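For concreteness, a short sketch computing γ and the pairwise bound γ^d for two common channels. The BSC expression 2√(p(1 − p)) follows directly from the definition; the BPSK/AWGN closed form exp(−Es/N0) is a standard result quoted here as an assumption, since the slides do not state it.

```python
import math

def gamma_bsc(p):
    """Bhattacharyya parameter of a BSC(p): sum_b sqrt(W(b|0)*W(b|1)) = 2*sqrt(p*(1-p))."""
    return 2 * math.sqrt(p * (1 - p))

def gamma_bpsk_awgn(es_n0_db):
    """Bhattacharyya parameter for BPSK over AWGN: exp(-Es/N0) (assumed closed form)."""
    return math.exp(-10 ** (es_n0_db / 10))

def pairwise_bound(gamma, d):
    """P_e(x, x') <= gamma^d = exp(-d * alpha), with alpha = -log(gamma)."""
    return gamma ** d

print(pairwise_bound(gamma_bsc(0.05), d=10))        # bound for a BSC with p = 0.05
print(pairwise_bound(gamma_bpsk_awgn(2.0), d=10))   # bound for BPSK at Es/N0 = 2 dB
```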


11. PERFORMANCE MEASURE

• An (n, k) binary linear code C with A_d codewords of weight d.

• The union-Bhattacharyya bound on the word error probability:

  P_W^C ≤ ∑_{d=1}^{n} A_d e^{−αd}.

• Weight distribution A_d for a turbo code?

• Consider a set of codes [C] corresponding to all interleavers.

• Use the ensemble average A_d^{[C](n)} instead of A_d for large n.
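Given any weight distribution A_d and a channel with Bhattacharyya parameter γ = e^{−α}, the bound is a one-liner; the toy distribution below is purely illustrative, not a turbo-code ensemble average.

```python
def union_bhattacharyya(weight_dist, gamma):
    """Union-Bhattacharyya bound:  P_W <= sum_d A_d * gamma^d."""
    return sum(a_d * gamma ** d for d, a_d in weight_dist.items())

A = {6: 3, 8: 12, 10: 40}            # toy A_d (illustrative only)
print(union_bhattacharyya(A, gamma=0.3))
```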


12. TURBO CODE ENSEMBLES: A Coding Theorem by Jin and McEliece

• There is an ensemble distance parameter c_0^{[C]} such that, for large n,

  A_d^{[C](n)} ≤ exp( d · c_0^{[C]} )  for large enough d.

• For a channel whose Bhattacharyya distance α > c_0^{[C]}, we have

  P_W^{[C](n)} = O(n^{−β}).

• c_0^{[C]} is the ensemble noise threshold.


13. PUNCTURED TURBO CODE ENSEMBLES (ITW, April 2003)

• c_0^{[C_P]} is the punctured ensemble noise threshold:

  A_d^{[C_P](n)} ≤ exp( d · c_0^{[C_P]} )  for large enough n and d.

• If log λ < −c_0^{[C]}, then

  c_0^{[C_P]} ≤ log[ (1 − λ) / ( exp(−c_0^{[C]}) − λ ) ].
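A minimal numeric sketch of the threshold bound above (the values of c_0^{[C]} and λ are assumptions); note that at λ = 0 the bound reduces to c_0^{[C]} itself.

```python
import math

def punctured_threshold_bound(c0, lam):
    """Bound on the punctured-ensemble noise threshold:
    c_0^{[C_P]} <= log[(1 - lam) / (exp(-c0) - lam)], valid when log(lam) < -c0."""
    if lam > 0 and math.log(lam) >= -c0:
        raise ValueError("bound requires log(lambda) < -c0")
    return math.log((1 - lam) / (math.exp(-c0) - lam))

for lam in (0.0, 0.3, 0.6, 0.9):                        # assumed puncturing probabilities
    print(lam, punctured_threshold_bound(0.05, lam))    # assumed c_0^{[C]} = 0.05
```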

14. PUNCTURED TURBO CODE ENSEMBLES

[Figure: FER vs. Es/N0 (dB) for punctured turbo codes with rates R = 0.7, 0.8, 0.9 and information block lengths k = 384 and k = 3840.]

15. PUNCTURED TURBO CODE ENSEMBLES

[Figure: throughput vs. Es/N0 (dB) for punctured turbo codes with k = 384 and k = 3840 (operating point R = 0.82 marked), compared with BPSK capacity and the cutoff rate.]


16. HARQ MODEL

• There are at most m transmissions.

• I = {1, . . . , n} is the set indexing the bit positions in a codeword.

• I is partitioned into m subsets I(j), for 1 ≤ j ≤ m.

• Bits at positions in I(j) are transmitted during the j-th transmission.

• The channel remains constant during a single transmission:

  γ_i = γ^{(j)} for all i ∈ I(j).
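A small sketch of this model (the equal-sized consecutive blocks and the γ^{(j)} values below are illustrative assumptions; the talk leaves the partition arbitrary):

```python
def harq_partition(n, m):
    """Split codeword positions {0, ..., n-1} into m consecutive blocks
    I(1), ..., I(m); bits in I(j) are sent in the j-th transmission."""
    bounds = [round(j * n / m) for j in range(m + 1)]
    return [list(range(bounds[j], bounds[j + 1])) for j in range(m)]

gammas = [0.5, 0.3, 0.2]              # assumed per-transmission parameters gamma^{(j)}
blocks = harq_partition(n=12, m=3)
gamma_i = {i: gammas[j] for j, block in enumerate(blocks) for i in block}
print(blocks)       # the partition I(1), I(2), I(3)
print(gamma_i)      # gamma_i = gamma^{(j)} for every i in I(j)
```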


17. PERFORMANCE MEASURE: Time Varying Channel

• Let W^n(y|x) = ∏_{i=1}^{n} W_i(y_i|x_i).

• A sequence x ∈ C ⊆ {0, 1}^n is transmitted, and x′ is decoded.

• The probability of error P_e(x, x′) can be bounded as

  P_e(x, x′) ≤ ∑_{y ∈ Y^n} √( W^n(y|x) · W^n(y|x′) )

            = ∏_{i=1}^{n} ( ∑_{b ∈ Y} √( W_i(b|x_i) · W_i(b|x′_i) ) )

            ≤ ∏_{i : x_i ≠ x′_i} γ_i.


18. HARQ PERFORMANCE

• d_j is the Hamming distance between x and x′ over I(j).

• The probability of error P_e(x, x′) can be bounded as

  P_e(x, x′) ≤ ∏_{j=1}^{m} ( γ^{(j)} )^{d_j}.

• A_{d_1...d_m} is the number of codewords with weight d_j over I(j).

• The union bound on the ML-decoder word error probability:

  P ≤ ∑_{d_1=1}^{|I(1)|} ··· ∑_{d_m=1}^{|I(m)|} A_{d_1...d_m} ∏_{j=1}^{m} ( γ^{(j)} )^{d_j}.

19. HARQ PERFORMANCE: Random Transmission Assignment

• A bit is assigned to transmission j with probability αj.

• d is the weight of the original codeword.

• d_j is the weight of the j-th transmission sub-word.

• The probability that the sub-word weights are d_1, d_2, ..., d_m is

  ( d choose d_1 ) ( d − d_1 choose d_2 ) ··· ( d − d_1 − ··· − d_{m−1} choose d_m ) α_1^{d_1} α_2^{d_2} ··· α_m^{d_m}.
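The stacked binomials above are just the multinomial coefficient, so the probability is easy to compute and to check by simulation (d, the α_j, and the target split below are assumed values, not from the talk):

```python
import math, random
from collections import Counter

def subword_weight_prob(d, ds, alphas):
    """Multinomial probability that the d nonzero positions split into sub-word
    weights d_1, ..., d_m, each position going to transmission j with prob. alpha_j."""
    coeff = math.factorial(d)
    for dj in ds:
        coeff //= math.factorial(dj)
    return coeff * math.prod(a ** dj for a, dj in zip(alphas, ds))

d, alphas, target = 6, (0.5, 0.3, 0.2), (3, 2, 1)
print(subword_weight_prob(d, target, alphas))        # 0.135
rng, trials = random.Random(0), 100_000
hits = sum(
    tuple(Counter(rng.choices(range(3), weights=alphas, k=d))[j] for j in range(3)) == target
    for _ in range(trials)
)
print(hits / trials)                                  # Monte Carlo estimate, close to 0.135
```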


20. HARQ PERFORMANCE: Random Transmission Assignment

• The union bound on the ML-decoder word error probability:

  P ≤ ∑_{d_1=1}^{|I(1)|} ··· ∑_{d_m=1}^{|I(m)|} A_{d_1...d_m} ∏_{j=1}^{m} ( γ^{(j)} )^{d_j}.

• The expected value of the union bound is

  ∑_d A_d ( ∑_{j=1}^{m} γ^{(j)} α_j )^d.

• The average Bhattacharyya noise parameter:

  γ̄ = ∑_{j=1}^{m} γ^{(j)} α_j.
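With random assignment, the expected union bound needs only the average parameter γ̄. A minimal sketch (the toy A_d, γ^{(j)}, and α_j values are assumptions):

```python
def expected_union_bound(weight_dist, gammas, alphas):
    """Expected union bound  sum_d A_d * gamma_bar^d,
    with gamma_bar = sum_j gamma^{(j)} * alpha_j."""
    gamma_bar = sum(g * a for g, a in zip(gammas, alphas))
    return gamma_bar, sum(a_d * gamma_bar ** d for d, a_d in weight_dist.items())

A = {6: 3, 8: 12, 10: 40}                            # toy weight distribution
print(expected_union_bound(A, gammas=(0.5, 0.3, 0.2), alphas=(0.5, 0.3, 0.2)))
```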


21. A RANDOMLY PUNCTURED TURBO CODE: An Example of Random Transmission Assignment

• The puncturing probability is λ.

• Transmission is over a channel with noise parameter γ.

• Equivalent to having two transmissions:

  – the first with assignment probability (1 − λ) and noise parameter γ;

  – the second with assignment probability λ and noise parameter 1.

• The average noise parameter is γ̄ = (1 − λ)γ + λ.

• The requirement −log γ̄ > c_0^{[C]} translates into

  −log γ > log[ (1 − λ) / ( exp(−c_0^{[C]}) − λ ) ].
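The two forms of the threshold condition are algebraically equivalent; a quick numeric sketch confirming this over a grid of assumed values of γ, λ, and c_0^{[C]}:

```python
import math
from itertools import product

def ok_via_average(gamma, lam, c0):
    """-log(gamma_bar) > c0 with gamma_bar = (1 - lam) * gamma + lam."""
    return -math.log((1 - lam) * gamma + lam) > c0

def ok_via_bound(gamma, lam, c0):
    """-log(gamma) > log[(1 - lam) / (exp(-c0) - lam)]."""
    return -math.log(gamma) > math.log((1 - lam) / (math.exp(-c0) - lam))

c0 = 0.1                                              # assumed ensemble threshold
for gamma, lam in product((0.05, 0.2, 0.5, 0.8), (0.0, 0.3, 0.6)):
    assert ok_via_average(gamma, lam, c0) == ok_via_bound(gamma, lam, c0)
print("the two conditions agree on the whole grid")
```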

22. INCREMENTAL REDUNDANCY: Concluding Remarks

[Diagram: the incremental-redundancy picture revisited; the mother codeword at the transmitter, with transmissions #1 through #4 accumulated at the receiver.]