SigSag: Iterative Detection Through Soft Message Passing
Arash Saber Tehrani, Alex Dimakis, Mike Neely, University of Southern California
Transcript
Page 1: Title

SigSag: Iterative Detection Through Soft Message Passing

Arash Saber Tehrani, Alex Dimakis, Mike Neely

University of Southern California

Page 2: Model

• We consider a multiple access problem.

• N users, each with a packet for an access point (base station).
  – Each user retransmits until acknowledged.

• Flat fading, where the c-th transmission of user i is affected by fading coefficient h_i^(c).

• Worst case: assume the transmissions always collide.

• All users transmit their packets for N rounds. (A small simulation sketch of this model follows.)
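To make this setup concrete, here is a minimal simulation sketch (my own illustration, not from the slides): N users each repeat a B-bit BPSK packet for N rounds, every retransmission sees its own flat-fading coefficient and a small random jitter, and the base station observes the noisy superposition. The parameter names N, B, W follow the slides; everything else is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

N, B, W = 2, 8, 2            # users, packet length (bits), max jitter (symbols)
snr_db = 20
noise_std = 10 ** (-snr_db / 20)

bits = rng.integers(0, 2, size=(N, B))       # each user's packet
x = 1 - 2 * bits                             # BPSK: bit b -> symbol +1 / -1

received = []
for c in range(N):                           # N collision rounds
    delays = rng.integers(0, W + 1, size=N)  # per-user jitter in this round
    h = rng.standard_normal(N)               # flat fading h_i^(c) (real, for simplicity)
    span = B + delays.max()
    y = noise_std * rng.standard_normal(span)
    for i in range(N):                       # superimpose the delayed, faded packets
        y[delays[i]:delays[i] + B] += h[i] * x[i]
    received.append((y, h, delays))

print("round 0 received samples:", np.round(received[0][0], 2))
```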

Page 3: Model

• Example: 2 users transmit their packets twice.

[Figure: two users U1 and U2 (packets x and y) each transmit twice; their symbols collide with different offsets in the two collisions.]

Page 4: Model

• Example: 2 users transmit their packets twice.

[Figure: the same collision example, with the symbols of each packet numbered.]

Page 5: Model

• Example: 2 users transmit their packets twice.

[Figure: the same collision example, continued.]

Page 6: multiuser detection

• The BS gets a collection of noisy linear equations in the unknown bits x_i, y_i.

• If there were no noise, optimal detection would amount to solving linear equations.

• With noise, we would ideally like to compute the likelihood of each x ∈ {±1}^B, y ∈ {±1}^B.

• This is an integer least-squares problem, which is NP-hard in general; a brute-force sketch is given below.

[Boutros & Caire],[Verdu],[Reynolds,Wang,Poor] and references therein
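For intuition about why this is hard, here is a brute-force maximum-likelihood sketch (my own illustration, continuing the simulation sketch above and reusing its `received`, `N`, `B`): it enumerates all 2^(N·B) candidate bit patterns and keeps the one with the smallest total squared residual, which is exactly what makes exhaustive integer least squares infeasible for realistic packet lengths.

```python
import itertools
import numpy as np

def ml_detect(received, N, B):
    """Exhaustive ML: minimize total squared error over all ±1 bit patterns."""
    best, best_cost = None, np.inf
    for flat in itertools.product((-1, 1), repeat=N * B):    # 2^(N*B) candidates
        cand = np.array(flat).reshape(N, B)
        cost = 0.0
        for y, h, delays in received:                        # residual over all rounds
            est = np.zeros_like(y)
            for i in range(N):
                est[delays[i]:delays[i] + B] += h[i] * cand[i]
            cost += np.sum((y - est) ** 2)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best

# x_hat = ml_detect(received, N, B)   # only feasible for tiny N * B
```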

Page 7: multiuser detection for WiFi

• An 802.11 BS gets noisy linear equations formed by repetition only.

• At high SNR we can make hard decisions for the bits (ignore the noise) and solve the linear equations.

• Gaussian elimination = bring into triangular form + back-substitution.

• If the equations are already in triangular form, we only need back-substitution. A sketch of this step follows.
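Here is a minimal sketch of that high-SNR strategy (my own example; the matrix and values are made up): once the system is triangular, each equation introduces one new ±1 symbol, so it can be decided with a sign (hard) decision and substituted back into the remaining equations.

```python
import numpy as np

def hard_backsub(A, y):
    """Back-substitution with hard decisions: A lower-triangular, symbols in {±1}."""
    s = np.zeros(len(y))
    for k in range(len(y)):
        residual = y[k] - A[k, :k] @ s[:k]   # cancel already-decided symbols
        s[k] = np.sign(residual / A[k, k])   # hard decision: ignore the noise
    return s

# toy triangular collision system with mild noise
A = np.array([[1.3, 0.0, 0.0],
              [0.7, 0.9, 0.0],
              [0.2, 0.5, 1.1]])
s_true = np.array([1.0, -1.0, 1.0])
y = A @ s_true + 0.05 * np.random.default_rng(1).standard_normal(3)
print(hard_backsub(A, y))                    # -> [ 1. -1.  1.]
```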

Page 8: ZigZag decoding

[Figure: the binary collision pattern of the two-user example, which forward ZigZag peels one symbol at a time.]

Forward ZigZag is back-substitution (a peeling sketch is given below).

[Gollakota, Katabi], [Zhang, Shin], [Erran, Liu, Tan, Viswanathan, Yang]
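A minimal sketch of this peeling view (my own toy example with unit fading and noiseless values; not the authors' implementation): each received sample is a noisy linear equation over a few ±1 symbols, and the decoder repeatedly hard-decides any equation with a single undecided symbol and cancels it from the rest, which is exactly back-substitution.

```python
import numpy as np

def zigzag_peel(samples):
    """samples: list of (value, {symbol: coeff}) noisy linear equations.
    Repeatedly hard-decide any equation with one undecided ±1 symbol and
    cancel it from the others (back-substitution / peeling)."""
    decided, progress = {}, True
    while progress:
        progress = False
        for value, terms in samples:
            unknown = [v for v in terms if v not in decided]
            if len(unknown) != 1:
                continue
            v = unknown[0]
            residual = value - sum(c * decided[u] for u, c in terms.items() if u != v)
            decided[v] = float(np.sign(residual / terms[v]))   # hard decision
            progress = True
    return decided

# two users, 2-symbol packets x and y, one-symbol offset in each collision
samples = [
    ( 1.0, {"x0": 1.0}),               # collision 1: x starts first
    (-2.0, {"x1": 1.0, "y0": 1.0}),
    ( 1.0, {"y1": 1.0}),
    (-1.0, {"y0": 1.0}),               # collision 2: y starts first
    ( 2.0, {"y1": 1.0, "x0": 1.0}),
    (-1.0, {"x1": 1.0}),
]
print(zigzag_peel(samples))            # {'x0': 1.0, 'y1': 1.0, 'y0': -1.0, 'x1': -1.0}
```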

Page 9: Shortcomings of ZigZag

• ZigZag can fail to decode when back-substitution is not possible.

• Noise is accumulated as the ZigZag decoder advances through the packet.

• Previous hard decisions can push other bits to be incorrect.

• How can these problems be solved? One heuristic solution is the forward-backward ZigZag.

• Here we view this as a graphical model (factor graph).

Page 10: Optimal decoding as inference

Given observations of the u's, obtain the most likely x_i, y_i.

Page 11: fun facts about inference in graphical models

• Optimal graphical model inference is NP-hard.

• Well-known approximation algorithm: belief propagation (aka message passing, sum-product / max-product).

• BP is optimal if the factor graph has no cycles.

• Loopy BP is typically a very good approximation algorithm.

[Pearl in AI, Gallager for LDPC codes in information theory]

Page 12: fun facts about inference in graphical models

• Observation: ZigZag decoding is BP (a special case that deals with noise by hard thresholding).

• (also equivalent to Luby erasure decoding and the algorithm you use to solve Sudoku)

• A heuristic approach to deal with error accumulation: run ZigZag twice (forward and backward) and average the results.

• Here we propose a systematic way to keep track of soft information about the bits: SigSag decoding

• Natural generalization of ZigZag that maintains soft information (probability distributions).

Page 13: SigSag = BP that maintains probabilities

• SigSag is belief propagation on the collision graph (a minimal sum-product sketch follows the figure).

[Figure: the collision factor graph for the two-user example, with variable nodes for the symbols of U1 and U2 and factor nodes f1, f2 for the two collisions.]
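To illustrate what "maintaining probabilities" means, here is a minimal sum-product sketch on such a collision factor graph (my own illustration under the earlier assumptions of real BPSK symbols, known fading, and Gaussian noise; it handles at most two symbols per received sample, as in the two-user example, and is not the authors' implementation).

```python
import numpy as np

def sigsag_bp(samples, noise_var=0.1, iters=10):
    """Sum-product sketch on the collision factor graph.
    samples: list of (value, {symbol: fading coeff}); symbols take values ±1.
    msg[(f, v)] is factor f's current belief that symbol v equals +1."""
    symbols = sorted({v for _, terms in samples for v in terms})
    msg = {(f, v): 0.5 for f, (_, terms) in enumerate(samples) for v in terms}

    def gauss(r):                                   # Gaussian likelihood of a residual
        return np.exp(-r * r / (2 * noise_var))

    def extrinsic(u, f):                            # belief about u from factors other than f
        up = np.prod([msg[(g, u)] for g, (_, t) in enumerate(samples) if u in t and g != f])
        dn = np.prod([1 - msg[(g, u)] for g, (_, t) in enumerate(samples) if u in t and g != f])
        return up / (up + dn)

    for _ in range(iters):
        for f, (value, terms) in enumerate(samples):
            for v in terms:
                others = [u for u in terms if u != v]
                like = {}
                for sv in (+1, -1):
                    r = value - terms[v] * sv
                    if not others:                  # interference-free sample
                        like[sv] = gauss(r)
                    else:                           # marginalize the one other symbol
                        u = others[0]
                        p = extrinsic(u, f)
                        like[sv] = p * gauss(r - terms[u]) + (1 - p) * gauss(r + terms[u])
                msg[(f, v)] = like[+1] / (like[+1] + like[-1])

    # final belief: combine all factor-to-symbol messages
    return {v: extrinsic(v, f=None) for v in symbols}

# toy collision: x0 is interference-free in one sample and shared in another
toy = [(1.0, {"x0": 1.0}),
       (0.0, {"x0": 1.0, "y0": 1.0})]
print(sigsag_bp(toy))     # P(x0=+1) ~ 1, P(y0=+1) ~ 0  (i.e. x0 = +1, y0 = -1)
```

Unlike ZigZag's hard decisions, every message here is a probability that a symbol equals +1, so partial evidence from one collision can soften, rather than overwrite, evidence from the other.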

Page 14: results: two users

• The factor graph is cycle-free for two users:

• Theorem 1: for two users transmitting without permuting their bits, the resulting factor graph is cycle-free with probability at least 1 - 1/w.

• SigSag performs maximum-likelihood decoding w.h.p. for two users with jitter.

(Max-product minimizes the block error probability; sum-product minimizes the bit error probability.)

Page 15: results: multiple users

• Theorem 2: the left-regular bipartite factor graph G resulting from N packets of length B sent by N users with symbol permutation is locally tree-like with high probability.

• Specifically,

  Pr{ the depth-2ℓ neighborhood of e is a tree } ≥ 1 - s/B,

where e = (v, c) is a randomly chosen edge of the graph, 2ℓ is a fixed depth, and s is a suitable constant that depends on ℓ and N, but not on B.

• SigSag is near-ML for large packet length B.

Page 16: Proof ideas

• Start from a variable and grow the random factor graph.

• Write a recursion for the probability of forming a loop at step ℓ.

(similar to LDPC high girth arguments, Richardson, Urbanke, Luby, Mitzenmacher, et al.)

Page 17: importance of Jitter

• Theorem 3: if the users transmit packets consecutively without any jitter delay (W = 0), the matrix A is rank deficient (even with bit permutations).

• Without jitter, the ML detector fails even if there is no noise.

• A random delay of one symbol suffices to make the matrix A full rank w.h.p. (equivalent to a classic result on random matrices by Komlós, 1967). A small rank check is sketched below.
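A quick numerical check of this effect (my own construction, not from the slides): I abstract A as the 0/1 collision-incidence matrix, one row per received sample and one column per unknown symbol, which is my reading of the Komlós reference. With perfectly aligned retransmissions every row touches one symbol of each user at the same position, so the vector (1, ..., 1, -1, ..., -1) lies in the null space and A is rank deficient; a one-symbol delay creates interference-free edge samples and restores full rank.

```python
import numpy as np

def collision_matrix(B, delays_per_round):
    """0/1 incidence matrix A: one row per received sample, columns are the
    2B unknown symbols [x_1..x_B, y_1..y_B]. delays_per_round lists the
    (delay of user 1, delay of user 2) for each retransmission round."""
    rows = []
    for dx, dy in delays_per_round:
        for t in range(B + max(dx, dy)):
            row = np.zeros(2 * B)
            if 0 <= t - dx < B:
                row[t - dx] = 1          # user 1's symbol hit by this sample
            if 0 <= t - dy < B:
                row[B + t - dy] = 1      # user 2's symbol hit by this sample
            rows.append(row)
    return np.array(rows)

B = 4
A_aligned = collision_matrix(B, [(0, 0), (0, 0)])   # W = 0: no jitter
A_jitter  = collision_matrix(B, [(0, 0), (0, 1)])   # one-symbol delay in round 2

v = np.concatenate([np.ones(B), -np.ones(B)])       # in the null space when aligned
print(A_aligned @ v)                                    # all zeros
print(np.linalg.matrix_rank(A_aligned), "of", 2 * B)    # 4 of 8: rank deficient
print(np.linalg.matrix_rank(A_jitter),  "of", 2 * B)    # 8 of 8: full rank
```

The null-space check also shows why permutations alone cannot help under this abstraction: permuting bits only relabels which x and which y appear in a row, so each aligned row still pairs one +1 against one -1.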

Page 18: Experimental Results

N = 2 users, packet length B = 100.

Page 19: Experimental Results

N = 3 users, packet length B = 100.

Page 20: conclusions

• Showed that ZigZag decoding is an instance of belief propagation.

• Used this connection to develop a new decoding algorithm that maintains probabilistic information about symbols.

• Showed that SigSag is optimal for two users w.h.p. and near-optimal for multiple users.

• Empirical performance shows significant gains.

• Performance gains increase with more users.

Page 21: fin

Page 22: SigSag to the rescue!!!

