From Stopping Sets to Trapping Sets:
The Exhaustive Search Algorithm & the Suppressing Effect

Chih-Chun Wang
School of Electrical & Computer Engineering
Purdue University
Content

- A good exhaustive trapping set search algorithm for arbitrary codes.
  - New results on the hardness of the problem.
  - Existing work on exhaustive search for stopping sets.
  - The exhaustive search for trapping sets based on exhaustive search for stopping sets.
  - Lessons from the results of exhaustive search algorithms.
- The suppressing effect for cyclically lifted code ensembles.
  - Definition: Prob(the bad structure remains after lifting).
  - Quantifying the suppressing effect.
  - A design criterion for base code optimization.
Stopping Sets

Definition: a set of variable nodes such that the induced graph contains no check node of degree 1.

[Figure: Tanner graph with variable nodes i = 1, ..., 7 and check nodes j = 1, 2, 3; the filled variable nodes 4, 6, and 7 form a stopping set.]

Why exhaustive search algorithms (for small stopping sets)?
- Error floor optimization. BECs vs. non-erasure channels.
- Good but inexhaustive search algorithms exist: error floors of LDPC codes [Richardson 03], projection algebra [Yedidia et al. 01], the approximate minimum distance of LDPC codes [Hu et al. 04], [Hirotomo et al. 05], [Richter 06].
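The definition translates directly into a membership test. Below is a minimal sketch in Python/NumPy for checking whether a candidate set of variable nodes is a stopping set of a parity-check matrix; the 3 x 7 matrix H is a hypothetical example, not the code in the figure.

    import numpy as np

    def is_stopping_set(H, var_set):
        # A set is a stopping set iff no check node has degree exactly 1
        # in the subgraph induced by the chosen variable (column) nodes.
        H = np.asarray(H)
        induced_degrees = H[:, sorted(var_set)].sum(axis=1)
        return not np.any(induced_degrees == 1)

    # Hypothetical 3 x 7 parity-check matrix (not the code in the figure):
    H = [[1, 1, 0, 1, 0, 1, 0],
         [0, 1, 1, 0, 1, 1, 1],
         [1, 0, 1, 1, 0, 0, 1]]
    print(is_stopping_set(H, {3, 5, 6}))   # variables 4, 6, 7 in 1-indexed terms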
An NP-Hard Problem

=== The SD(H, t) problem ===
INPUT: A code represented by its parity-check matrix H and an integer t.
OUTPUT: Output 1 if the minimal stopping distance of H is ≤ t. Otherwise, output 0.

The hardness results:
- [Krishnan et al. 06]: for arbitrary H, SD(H, t) is NP-complete.
  Proof: by reducing a VERTEX-COVER problem to SD(H, t).
- A byproduct of [Krishnan et al. 06]: with the sparsity restriction that the number of 1's in H is limited to O(n) rather than O(n^2), SD(H, t) is still NP-complete.
Trapping Sets: Definitions

Operational definition: "the set of bits that are not eventually correct" [Richardson 03].

Empirical observations: for non-erasure channels, trapping sets are (a, b) near-codewords [MacKay et al. 03].
- (a, b) near-codeword: a set of a variable nodes such that the induced graph has b odd-degree check nodes.
- An (a, 0) near-codeword ⇒ a stopping set.

We propose a new graph-theoretic definition:
Definition 1 (k-out Trapping Sets): A subset of {v_1, ..., v_n} such that in the induced subgraph, there are exactly k check nodes of degree one.
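Definition 1 also yields a direct test. As a companion to the stopping-set check above, a minimal sketch (same hypothetical NumPy conventions) that computes k by counting degree-1 check nodes in the induced subgraph:

    import numpy as np

    def num_degree_one_checks(H, var_set):
        # k for Definition 1: the number of check nodes with degree exactly 1
        # in the subgraph induced by var_set. k = 0 recovers the stopping-set
        # condition.
        degrees = np.asarray(H)[:, sorted(var_set)].sum(axis=1)
        return int(np.sum(degrees == 1))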
k-Out Trapping Sets vs. Near-Codewords

Definition 1 (k-out Trapping Sets): A subset of variables such that in the induced subgraph, there are exactly k check nodes of degree one.

    k-out trapping sets ←→ stopping sets   |   (a, b) near-codewords ←→ valid codewords
    0-out trapping sets ⇐⇒ stopping sets   |   (a, 0) near-codewords ⇐⇒ valid codewords

Why this definition?
- Better analogy to stopping sets.
- An (a, b) near-codeword ⇒ a "k ≤ b"-out trapping set.
  [Figure: the (a, b) near-codewords form a subset of the "k ≤ b"-out TSs.]
- Our goal: with fixed b, search for all minimal "k ≤ b"-out TSs.
- Empirically, all error bits consist of only degree-1 and degree-2 check nodes (the elementary trapping set [Ländner et al. 05]).
The Hardness of k-OTD(H, t)

=== The k-OTD(H, t) problem ===
INPUT: A code represented by its parity-check matrix H and an integer t.
OUTPUT: Output 1 if the minimal k-out trapping distance of H is ≤ t. Otherwise, output 0.

When k = 0, 0-OTD(H, t) = SD(H, t), which is NP-complete.
Is the hardness the same for any fixed k > 0?
Our First Result

Theorem 1: Consider a fixed k > 0. For arbitrary H, k-OTD(H, t) is NP-complete.

Theorem 2: Consider a fixed k > 0. With the sparsity restriction that the number of 1's in H is limited to O(n) rather than O(n^2), k-OTD(H, t) is still NP-complete.

Proof: reduction from SD(H, t).
SD(H, t) by k-OTD(H′, t′)

(Illustrated for k = 2.)
Step 1: duplicate the Tanner graph G a total of (k + 2) times.
[Figure: (k + 2) parallel copies of G.]
Step 2: run k-OTD(H′, t(k + 2)).
Claim: any k-out TS of H′ must be parallel, and it must contain the target bit.
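Step 1 is straightforward to express in code. The sketch below builds the block-diagonal H′ whose Tanner graph is (k + 2) disjoint copies of G; the additional coupling around the target bit used by the full reduction is not spelled out on the slide and is omitted here.

    import numpy as np

    def duplicate_parity_check(H, copies):
        # Block-diagonal duplication: the Tanner graph of the result is
        # `copies` disjoint copies of the Tanner graph of H (Step 1 only;
        # the full reduction also couples the copies, not shown here).
        H = np.asarray(H)
        return np.kron(np.eye(copies, dtype=H.dtype), H)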
NP-Hard Problem = Impossible?

Most approaches use heuristics instead for error-floor optimization:
- the girth, the Approximate Cycle Extrinsic (ACE) message degree, partial stopping set elimination, and ensemble-inspired upper bounds.

Is there anything else we can do?
- NP-completeness concerns the asymptotic complexity.
- NP-completeness has relatively little predictive power for finite n.
- For practical codes, we only need n ≈ 500–5000.
- An encouraging example: the travelling salesman problem. An optimal tour through 24,978 cities in Sweden was found in 2004.
Leverage Upon SD(H, t)

=== The SD(H, t) problem, list version ===
OUTPUT: Output an exhaustive list of minimum stopping sets if the minimal stopping distance is ≤ t. Otherwise, output ∅.

- In our previous work [ISIT 06], a good exhaustive search SD(H, t) is provided.
  Capable of exhausting t = 11–13 for codes of n ≈ 500.
- This Friday at 4:45 pm [Rosnes & Ytrehus, ISIT 07], a more efficient exhaustive search SD(H, t) will be introduced.
  Capable of exhausting t = 18–26 for codes of n = 150–5000.

Good SD(H, t) ?⇒ good k-OTD(H, t)
k-OTD(H, t′) by SD(H, t)

(Illustrated for k = 2.)
1. Select k edges.
2. Based on the k check nodes, identify the neighbor variables.
3. Remove the check nodes and the neighbor variables.
4. Run SD(H, t) to find the minimal stopping sets containing the variables of interest.
5. Select another k edges and repeat the procedure.
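A minimal sketch of this outer loop, assuming the "variables of interest" in step 4 are the variable endpoints of the k selected edges (so each selected check ends up with induced degree exactly one), and assuming a hypothetical exhaustive stopping-set routine sd_search(H_sub, t, anchors) in the spirit of SD(H, t); index remapping between H and the residual matrix, and the pruning of the actual algorithm, are omitted for brevity.

    import itertools
    import numpy as np

    def k_otd_by_sd(H, k, t, sd_search):
        # Enumerate k-subsets of edges; for each, strip the k check nodes and
        # their *other* neighbor variables, then search the residual graph for
        # minimal stopping sets containing the k anchor variables.
        H = np.asarray(H)
        edges = list(zip(*np.nonzero(H)))            # (check, variable) pairs
        found = []
        for chosen in itertools.combinations(edges, k):
            checks = {c for c, v in chosen}
            anchors = {v for c, v in chosen}
            if len(checks) < k:                      # need k distinct degree-1 checks
                continue
            other_nbrs = {v for c in checks for v in np.nonzero(H[c])[0]} - anchors
            keep_c = [c for c in range(H.shape[0]) if c not in checks]
            keep_v = [v for v in range(H.shape[1]) if v not in other_nbrs]
            H_sub = H[np.ix_(keep_c, keep_v)]        # residual Tanner graph
            found.extend(sd_search(H_sub, t, anchors))
        return found

The O(n^k) complexity growth noted on the next slide matches the number of edge k-subsets enumerated by this loop for a sparse H.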
Empirical Study of k-OTD(H, t)

- Complexity grows as O(n^k): a harder problem than SD(H, t).
- For codes of interest, 50% of the FER comes from k ≤ 2 TSs [Richardson 03].
- For n ≈ 500, rate-1/2 codes: t = 10–12 for 1-OTD(H, t) and t = 9–11 for 2-OTD(H, t), based on our SD(H, t).
- Tanner (155, 64, 20) code: minimal 1-out TD ≥ 12, and minimal 2-out TD = 8 with multiplicity 465. All are generated from the following by automorphisms [Tanner et al. 04]:
    7, 17, 19, 33, 66, 76, 128, 140
    7, 31, 33, 37, 44, 65, 100, 120
    1, 19, 63, 66, 105, 118, 121, 140
    44, 61, 65, 73, 87, 98, 137, 146
    31, 32, 37, 94, 100, 142, 147, 148
Empirical Study of k-OTD(H, t) (cont'd)

Ramanujan–Margulis (2184, 1092) code with q = 13, p = 5 [Rosenthal et al. 00]:
- Inexhaustive results (upper bounds): analytical search [MacKay et al. 03], error-impulse search [Hu et al. 04]: minimum Hamming distance ≤ 14.
- Exhaustive results by SD(H, t) (lower bounds): minimum Hamming distance ≥ minimum stopping distance ≥ 14, with multiplicity 1092.
- Minimal 1-out TD ≥ 13 and minimal 2-out TD ≥ 10.
Impact on Error Floors

λ(x) = 0.31961x + 0.27603x^2 + 0.01453x^5 + 0.38983x^6, ρ(x) = 0.50847x^5 + 0.49153x^6.

[Figure: frame/bit error rate (FER/BER) vs. signal-to-noise ratio Es/N0 = 20 log(1/σ), for "Rand: n=512", "SS Opt: n=512", and "TS+SS Opt: n=512".]

AWGN, (λ(x), ρ(x)), n = 512; 0-out/1-out trapping sets:
"Rand": (2, 1), (2, 8); "SS Opt": (13, 40), (5, 4); "SS+TS Opt": (11, 12), (10, 24).
Sum-product decoder, 80 iterations, 100 frame errors.
Insufficiency of TSs

The relationship to error floors:
- n = 504 girth-optimized irregular PEG code [Hu et al. 05], 1-out TSs of size 7:
    52, 53, 122, 136, 178, 229, 348
    5, 42, 100, 131, 187, 199, 374
- n = 504 TS-optimized irregular code with the same degree distribution, 0/1-out TSs: (10, 7)/(8, 40).

[Figure: frame/bit error rate (FER/BER) vs. signal-to-noise ratio Es/N0 = 20 log(1/σ), for "CA Opt: n=504" and "PEG Opt: n=504".]
The Cyclically Lifted Ensemble

[Gross 74], [Richardson & Urbanke], and many more.

Base code optimization ⇒ lower ensemble error floor.

[Figure: (a) the base code; (b) the lifted code with an all-zero lifting sequence; (c) the lifted code with a cyclic lifting sequence.]

[Figure: the base code of size n (n = 16), and the lifted code of lifting factor K (K = 4) drawn as K rows of n variable nodes.]
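A minimal sketch of the construction, assuming the usual circulant form: each 1 of the base parity-check matrix is replaced by a K x K cyclic permutation matrix whose shift is one entry of the lifting sequence (drawn at random here), and each 0 by a K x K zero block.

    import numpy as np

    rng = np.random.default_rng(0)

    def cyclic_lift(H_base, K):
        # Each 1 of H_base becomes a K x K cyclic permutation matrix with a
        # (random) shift; each 0 becomes a K x K zero block.
        H_base = np.asarray(H_base)
        m, n = H_base.shape
        H = np.zeros((m * K, n * K), dtype=int)
        I = np.eye(K, dtype=int)
        for r in range(m):
            for c in range(n):
                if H_base[r, c]:
                    shift = int(rng.integers(K))   # one entry of the lifting sequence
                    H[r*K:(r+1)*K, c*K:(c+1)*K] = np.roll(I, shift, axis=1)
        return H

With an all-zero lifting sequence every block is the identity, so the lifted graph is K disjoint copies of the base graph, matching panel (b); random cyclic shifts give the ensemble of panel (c).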
Survival of Trapping Sets

Theorem 3: If a set of variable nodes forms a k_L-out trapping set for one lifted code, then its projection onto the base code forms a k_B-out trapping set, where k_L ≥ k_B.

[Figure: a marked trapping set in the base code of size n (n = 16), and a marked set in the lifted code of lifting factor K (K = 4) that projects down onto it.]
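A quick numerical illustration of Theorem 3, reusing the hypothetical cyclic_lift and num_degree_one_checks sketches from above (in that block layout, lifted variable j*K + r is copy r of base variable j); the base matrix and the lifted set are invented for illustration.

    import numpy as np

    def project_to_base(lifted_vars, K):
        # Base-code variable indices hit by a set of lifted variable indices.
        return {v // K for v in lifted_vars}

    # Hypothetical illustration of Theorem 3:
    H_base = np.array([[1, 1, 0, 1],
                       [0, 1, 1, 1],
                       [1, 0, 1, 0]])
    H_lift = cyclic_lift(H_base, K=4)
    lifted_set = {0, 5, 13}
    base_set = project_to_base(lifted_set, K=4)
    k_L = num_degree_one_checks(H_lift, lifted_set)
    k_B = num_degree_one_checks(H_base, base_set)
    assert k_B <= k_L   # Theorem 3: the base-code structure has k_B <= k_L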
Different Orders of Survivals

Definition 2 (First-order survivals): [Figure: each marked base-code variable survives in exactly one of the K lifted copies.]

Definition 3 (High-order survivals): [Figure: some marked base-code variables survive in more than one of the K lifted copies.]

Empirically, almost all small trapping sets are of first order [Wang 06, Ländner et al. 05].
First-Order Survivals

Theorem 4 (k_L = k_B = 0, a preliminary result): For a fixed base code with a minimum stopping set s_B,

    E{|first-order survivals|} ∝ K^{−(0.5·#E − #V + 0.5·#C_{odd,≥3})},
    FER_{BEC,ensemble} = const · K^{−(0.5·#E − #V + 0.5·#C_{odd,≥3})},

where const = f(the minimum stopping distance and its multiplicity). Here #E, #V, #C_{odd,≥3}, and #C_{even,≥4} count the edges, variable nodes, odd-degree (≥3) check nodes, and even-degree (≥4) check nodes of the subgraph induced by the structure (s_B or t_B).

Theorem 5 (k_L = k_B > 0): For a base-code k-out trapping set t_B,

    E{|first-order survivals|} ∝ K^{0.5·k_B} · K^{−(0.5·#E − #V + 0.5·#C_{odd,≥3})}.

Theorem 6 (k_L = k_B + 1): For a base-code k-out trapping set t_B,

    E{|first-order survivals|} ∝ K^{0.5·k_B} · K^{−(0.5·#E − #V + 0.5·#C_{odd,≥3})} · (K·#C_{odd,≥3} + #C_{even,≥4}).

Base code optimization criterion: the exponent 0.5·#E − #V + 0.5·#C_{odd,≥3}.

[Figure: FER/BER vs. Es/N0 = 20 log(1/σ) for "Rand: n=512", "SS Opt: n=512", and "TS+SS Opt: n=512"; n_B = 128, K = 4; 0/1-out TSs: (11, 12)/(10, 24).]
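To make the exponent concrete, a hypothetical worked example (the counts below are invented for illustration, not taken from any of the codes above): suppose a minimum stopping set of the base code induces a subgraph with #V = 4 variable nodes, #E = 12 edges, and #C_{odd,≥3} = 2 odd-degree check nodes. Theorem 4 then gives

    0.5·#E − #V + 0.5·#C_{odd,≥3} = 0.5·12 − 4 + 0.5·2 = 3,  so  FER_{BEC,ensemble} ∝ K^{−3},

i.e., doubling the lifting factor K suppresses this contribution to the ensemble error floor by a factor of 2^3 = 8.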
Conclusion

- Defined the k-out trapping set graph-theoretically.
- Deciding the minimal k-out trapping distance is NP-hard, but still doable for practical code lengths n ≈ 500.
- Implemented k-OTD(H, t) via SD(H, t).
- Demonstrated the insufficiency of the trapping set (near-codeword) characterization.
- Quantified the suppressing effect of cyclic lifting for trapping sets.
- Base code optimization criterion: 0.5·#E − #V + 0.5·#C_{odd,≥3}.