Hybrid Random-Structured Coding
Zixiang Xiong
Texas A&M University
December 28, 2013
Z. Xiong (joint work with Y. Yang) Xidian December 28, 2013 1 / 42
Outline
1 Random vs. structured codes: Point-to-point systems
2 Random vs. structured coding: Multiterminal systems
  Slepian-Wolf coding
  Korner & Marton's binary two-help-one problem
3 The Gaussian two-help-one problem
  Berger-Tung (BT) random coding
  Krithivasan-Pradhan (KP) structured coding
  Hybrid random-structured coding
  Partial sum-rate tightness
  Gap to optimal sum-rate
4 Conclusions
Random codes for point-to-point systems
What are they?
Codes with randomness
Invented by Shannon in 1948
Key theoretical tool
In proving Shannon's classic source and channel coding theorems
Recent invention/rediscovery of codes based on random graphs
Turbo/LDPC codes
Fulfilled Shannon’s prophecy (after 50 years)
What do we learn from Shannon?
Random codes are optimal for point-to-point systems
[Figure: random graph]
Structured codes for point-to-point systems
What are they?
Codes with structure
Linear/lattice/trellis codes
Widely used in communication systems
For their low complexity
Potentially limit-approaching (as promised by info theory)
Achievable capacity: ½ log(SNR) → ½ log(1 + SNR) for AWGN channels
Optimal for at least some point-to-point systems as well
Structure "comes for free"
Nested lattice codes
Can structured codes exceed the info-theoretic limits?
No, at least for memoryless point-to-point systems
Random codes for multiterminal systems
Slepian-Wolf coding '73
X, Y: two correlated sources with finite alphabets
Separate encoding and joint decoding
Near-lossless decoding, i.e., lim_{n→∞} P((X̂ⁿ, Ŷⁿ) ≠ (Xⁿ, Yⁿ)) = 0
[Figure: Xⁿ and Yⁿ are encoded separately at rates R1 and R2 into WX and WY; the joint decoder outputs X̂ⁿ and Ŷⁿ]
Random codes for multiterminal systems
Slepian-Wolf theorem
Slepian-Wolf rate region (proved by using random coding/binning):
R1 ≥ H(X∣Y)
R2 ≥ H(Y∣X)
R1 + R2 ≥ H(X,Y)
[Figure: in the (R1, R2) plane, the achievable rate region for SW coding is bounded by R1 ≥ H(X|Y), R2 ≥ H(Y|X), R1 + R2 ≥ H(X,Y); it strictly contains the achievable rate region for separate entropy coding, R1 ≥ H(X), R2 ≥ H(Y)]
No rate loss compared to joint encoding of X and Y
Random codes for multiterminal systems
Slepian-Wolf coding example
Assumptions:
X, Y ∈ {0,1}³ → binary triplets
H(X) = H(Y) = 3 bits → equiprobable triplets
Correlation between X and Y: x and y differ in at most one position, i.e., Hamming distance dH(x,y) ≤ 1
H(X∣Y) = 2 bits
Question: If y is perfectly known at the decoder but not at the encoder, is it possible to send 2 bits instead of 3 for x and reconstruct x without loss?
Random codes for multiterminal systems
Slepian-Wolf coding example
Solution:
Form 4 cosets and send the 2-bit index of the coset x belongs to:
coset Z00: {000,111} ← codewords of the rate-1/3 repetition code
coset Z01: {001,110}
coset Z10: {010,101}
coset Z11: {011,100}
In each coset: 2 members at dH = 3
Joint decoder: in the coset indexed by Z, using y, pick x s.t. dH(x,y) ≤ 1
This guarantees correct/lossless decoding
Example: y = [000]; index 00 from encoder → x̂ = [000]; index 01 → x̂ = [001]; index 10 → x̂ = [010]; index 11 → x̂ = [100]
Separate encoding as efficient as joint encoding!
Random codes for multiterminal systems
Equivalent way of viewing the last example via the syndrome concept
Form the parity-check matrix of the rate-1/3 repetition code:
H = [ 1 1 0 ]
    [ 1 0 1 ]
Syndrome = coset/bin index
Coset/bin 0: {000,111} has syndrome Z = 00
Coset/bin 1: {001,110} has syndrome Z = 01
Coset/bin 2: {010,101} has syndrome Z = 10
Coset/bin 3: {011,100} has syndrome Z = 11
All 4 cosets preserve the distance properties of the repetition code
Encoding corresponds to the matrix multiplication Hx
Compression 3:2
Separate encoding as efficient as joint encoding!
1. Random coding optimal for this symmetric Slepian-Wolf coding example
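The syndrome view above can be exercised exhaustively. A minimal sketch (our illustration, not the authors' code): the encoder sends the 2-bit syndrome Hx, and the decoder searches the indexed coset for the word within Hamming distance 1 of the side information y.

```python
import itertools
import numpy as np

# Parity-check matrix of the rate-1/3 repetition code (from the slide).
H = np.array([[1, 1, 0],
              [1, 0, 1]])

def encode(x):
    """Compress the 3-bit x to its 2-bit syndrome Hx (mod 2)."""
    return tuple(H @ np.array(x) % 2)

def decode(s, y):
    """Return the member of the coset with syndrome s within dH <= 1 of y."""
    for x in itertools.product([0, 1], repeat=3):
        if encode(x) == s and sum(a != b for a, b in zip(x, y)) <= 1:
            return x
    raise ValueError("no coset member within Hamming distance 1 of y")

# Verify lossless recovery for every (x, y) pair with dH(x, y) <= 1:
ok = all(decode(encode(x), y) == x
         for x in itertools.product([0, 1], repeat=3)
         for y in itertools.product([0, 1], repeat=3)
         if sum(a != b for a, b in zip(x, y)) <= 1)
print(ok)  # True: 2 bits suffice instead of 3
```

Uniqueness holds because the two members of each coset are at dH = 3, so at most one of them can be within distance 1 of y.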
Structured codes for multiterminal systems
The binary two-help-one problem
How to encode the mod-2 sum of binary sources? Introduced by Korner & Marton in 1979
[Figure: encoder 1 observes Y1ⁿ and sends W1, encoder 2 observes Y2ⁿ and sends W2; encoder 3 observes Zⁿ = Y1ⁿ ⊕ Y2ⁿ but its rate is 0; the joint decoder outputs Ẑⁿ]
Y1 and Y2 are doubly symmetric binary sources
Only the first two encoders are allowed to transmit; the decoder reconstructs Z = Y1 ⊕ Y2 losslessly
This is combined compression and inference (for big data)
Goes beyond Slepian-Wolf coding
Structured codes for multiterminal systems
The rate region of Korner-Marton coding
Slepian-Wolf coding of (Y1, Y2) followed by forming Z = Y1 ⊕ Y2 at the decoder certainly works
Structured coding strictly improves on random (Slepian-Wolf) coding!
[Figure: structured coding achieves the rate region R1 ≥ H(Z), R2 ≥ H(Z), while random coding requires the Slepian-Wolf region with corner points at H(Y1) and H(Y2)]
Using the same linear code, encoder i transmits Hyi
So both encoders use the same rate H(Z)
The decoder forms Hy1 ⊕ Hy2 = H(y1 ⊕ y2) = Hz, then recovers z by picking the element of Hamming weight ≤ 1 in the coset indexed by Hz
Structured codes for multiterminal systems
Korner-Marton coding example
Recall the 4 cosets at each encoder (from the Slepian-Wolf coding example):
coset Z00: {000,111} ; coset Z01: {001,110}
coset Z10: {010,101} ; coset Z11: {011,100}
Suppose y1 = [101]; encoder 1 transmits index 10
If y2 = [101]: encoder 2 transmits 10; decoder forms 00 and reconstructs z as [000]
If y2 = [001]: encoder 2 transmits 01; decoder forms 11 and reconstructs z as [100]
If y2 = [111]: encoder 2 transmits 00; decoder forms 10 and reconstructs z as [010]
If y2 = [100]: encoder 2 transmits 11; decoder forms 01 and reconstructs z as [001]
First instance of linear coding beating the best-known random coding
2. Linear coding optimal for this symmetric Korner-Marton coding example
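The Korner-Marton protocol above can also be checked exhaustively. A minimal sketch (our illustration): both encoders apply the same H, the decoder XORs the two syndromes to obtain Hz, and then picks the unique weight-≤1 word in that coset.

```python
import itertools
import numpy as np

# Same parity-check matrix of the rate-1/3 repetition code at both encoders.
H = np.array([[1, 1, 0],
              [1, 0, 1]])

def syndrome(v):
    return tuple(H @ np.array(v) % 2)

def decode_z(s):
    """Pick the unique triplet of Hamming weight <= 1 with syndrome s."""
    for z in itertools.product([0, 1], repeat=3):
        if sum(z) <= 1 and syndrome(z) == s:
            return z
    raise ValueError("unreachable for this code")

# dH(y1, y2) <= 1 means z = y1 XOR y2 has weight <= 1; check all such pairs:
ok = True
for y1 in itertools.product([0, 1], repeat=3):
    for y2 in itertools.product([0, 1], repeat=3):
        z = tuple(a ^ b for a, b in zip(y1, y2))
        if sum(z) > 1:
            continue
        # Decoder XORs the received syndromes: Hy1 ^ Hy2 = H(y1 ^ y2) = Hz.
        s = tuple(a ^ b for a, b in zip(syndrome(y1), syndrome(y2)))
        ok = ok and decode_z(s) == z
print(ok)  # True: each encoder sends only 2 bits, the rate H(Z) of the slide
```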
Multiterminal systems
Structured vs. random codesIt was long held that random codes are optimal for many comm. systems
Korner and Marton's work showed the advantage of structured codes for some multiterminal systems
Partially responsible for their recent back-to-back Shannon awards
Structured coding for multiterminal systems currently a very active area of research
RepriseLinear coding optimal for symmetric Korner-Marton coding
Symmetry means H(Y1∣Y2) = H(Y2∣Y1) or P(Y1 = 0∣Y2 = 1) = P(Y1 = 1∣Y2 = 0)
What about the general asymmetric case?
Y1 and Y2 are binary sources with joint PMF (p01 ≠ p10 in general):
Y1\Y2   0     1
  0     p00   p01
  1     p10   p11
Q: Which coding scheme is better? A: Neither
How to encode the mod-2 sum of binary sources?
Ahlswede-Han coding
Introduced by Ahlswede & Han in 1983
Random coding: first quantize Y1ⁿ and Y2ⁿ as U1ⁿ and U2ⁿ
Structured coding: then apply Korner-Marton coding on Y1ⁿ and Y2ⁿ with U1ⁿ and U2ⁿ as decoder side info
Achievable rate pairs:
R1 ≥ I(Y1; U1∣U2) + H(Z∣U1,U2)
R2 ≥ I(Y2; U2∣U1) + H(Z∣U1,U2)
R1 + R2 ≥ I(Y1,Y2; U1,U2) + 2H(Z∣U1,U2)
Performs no worse than Slepian-Wolf and Korner-Marton coding for general correlation
How to encode the mod-2 sum of binary sources?
Ahlswede-Han coding example
Expanded rate region when
Y1\Y2   0         1
  0     0.00392   0.97608
  1     0.01992   0.00008
Due to asymmetry, time sharing between Slepian-Wolf and Korner-Marton always exists
Ahlswede-Han coding achieves points (e.g., P) outside the time-sharing region
Optimal coding scheme not known
The Gaussian two-help-one problem
Definition
Separate compression and joint decompression of a linear combination Z = Y1 − cY2 of jointly Gaussian sources Y1 and Y2, subject to an MSE distortion constraint on Z
Problem characterized by the linear coefficient c, the source correlation coefficient ρ, and the MSE distortion constraint D on Z
[Figure: encoder 1 observes Y1ⁿ and sends W1, encoder 2 observes Y2ⁿ and sends W2; encoder 3 observes Zⁿ = Y1ⁿ − cY2ⁿ but its rate is 0; the joint decoder outputs Ẑⁿ]
Motivation
Arises in many practical video surveillance applications, e.g., reconstructing the motion difference between two video sequences
The Gaussian two-help-one problem
Berger-Tung's generic random coding scheme 1977
Independent quantization of
Y1 to U1 s.t. U1 = Y1 + Q1 with Q1 ∼ N(0, q1)
Y2 to U2 s.t. U2 = Y2 + Q2 with Q2 ∼ N(0, q2)
Followed by Slepian-Wolf compression (or binning) of U1 and U2, leading to the rate region
RBT(q1, q2) = {(R1,R2) : R1 ≥ H(U1∣U2), R2 ≥ H(U2∣U1), R1 + R2 ≥ H(U1,U2)}
Berger-Tung (BT) achievable rate region
RBT(D) = conv( ⋃_{(U1,U2) ∈ U(Y1,Y2)} {(R1,R2) : R1 ≥ I(Y1;U1∣U2), R2 ≥ I(Y2;U2∣U1), R1 + R2 ≥ I(Y1,Y2;U1,U2)} )
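For Gaussian test channels the BT sum-rate bound has a closed form. A minimal sketch (our illustration, with assumed example values of rho, c, q1, q2) evaluating I(Y1,Y2; U1,U2) and the resulting MSE for Z = Y1 − cY2:

```python
import numpy as np

# Example parameters (assumed, not from the slides):
rho, c, q1, q2 = 0.8, 1.0, 0.1, 0.1
Sigma_Y = np.array([[1.0, rho],
                    [rho, 1.0]])
Sigma_U = Sigma_Y + np.diag([q1, q2])   # cov(U), since Q is indep. of Y

# I(Y1,Y2; U1,U2) = h(U1,U2) - h(Q1,Q2) = (1/2) log2 det(Sigma_U)/(q1*q2)
sum_rate = 0.5 * np.log2(np.linalg.det(Sigma_U) / (q1 * q2))

# MMSE of Z = [1, -c] Y given (U1, U2); note cov(Y, U) = Sigma_Y:
a = np.array([1.0, -c])
Sigma_Y_given_U = Sigma_Y - Sigma_Y @ np.linalg.inv(Sigma_U) @ Sigma_Y
distortion = a @ Sigma_Y_given_U @ a
print(sum_rate, distortion)
```

Sweeping (q1, q2) traces out the BT sum-rate vs. distortion trade-off for this problem.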
The Gaussian two-help-one problem
When c ⋅ ρ ≤ 0, e.g., ρ > 0, c = −1
Referred to as the µ-sum problem (e.g., with Z = Y1 + Y2)
Considered by Wagner et al. in 2005
Berger-Tung (or random QB) coding is optimal
[Figure: for the µ-sum problem, the sum-rate of the Berger-Tung scheme RBT(D) coincides with the sum-rate-distortion function R(D)]
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
Referred to as the µ-difference problem (e.g., with Z = Y1 − Y2)
Considered by Krithivasan-Pradhan in 2009 using structured/lattice codes
Two lattices (Λ1, Λ2) for SC; one lattice ΛC with ΛC ⊆ Λi for CC
Independent lattice quantizers (Λ1, Λ2) applied to Y1 and cY2
Encoders send the quantized versions modulo the same lattice ΛC
Transmission rates
Ri = log2 (σ²(ΛC)/σ²(Λi))
Krithivasan-Pradhan (KP) achievable rate region
RKP(D) = {(R1,R2) : 2^(−2R1) + 2^(−2R2) ≤ D/(1 + c² − 2cρ)}
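The minimum sum-rate of the KP region follows directly from the constraint: the optimum sits at R1 = R2, giving log2(2σ_Z²/D) with σ_Z² = 1 + c² − 2cρ. A short numerical check (our illustration, with assumed example parameters):

```python
import numpy as np

# Example parameters (assumed):
rho, c, D = 0.8, 1.0, 0.05
var_Z = 1 + c**2 - 2*c*rho        # variance of Z = Y1 - c*Y2
d = D / var_Z                     # normalized distortion in the KP constraint

# Closed form: minimizing R1 + R2 subject to 2^(-2R1) + 2^(-2R2) <= d
# puts R1 = R2, so the minimum sum-rate is log2(2*var_Z/D).
closed_form = np.log2(2 / d)

# Numerical check by grid search along the boundary 2^(-2R1) + 2^(-2R2) = d,
# parameterized by u = 2^(-2R1):
u = np.linspace(1e-6, d - 1e-6, 100_000)
sum_rates = -0.5 * np.log2(u) - 0.5 * np.log2(d - u)
print(closed_form, sum_rates.min())   # the grid minimum matches the closed form
```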
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
Referred to as the µ-difference problem (e.g., with Z = Y1 − Y2)
Considered by Krithivasan-Pradhan in 2009 using structured/lattice codes
Smaller sum-rate than the random Berger-Tung scheme for certain (ρ, c) pairs
Structured codes beat random codes!
[Figure: for small D, the sum-rate of Krithivasan & Pradhan's scheme RKP(D) falls below the sum-rate of the Berger-Tung scheme RBT(D)]
The KP scheme was later improved by Wagner '08 and by Maddah-Ali & Tse '10
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
For the µ-difference problem, can we do better?
New hybrid random-structured coding scheme
Inspired by Ahlswede & Han's scheme for encoding the mod-2 sum of binary sources
Hybrid of random BT coding (layer I) and structured KP coding (layer II)
[Figure: block diagram of the hybrid scheme. In layer I, random quantizers map Y1ⁿ and Y2ⁿ to U1ⁿ and U2ⁿ, which are Slepian-Wolf encoded at rates R1(1) and R2(1). In layer II, lattice quantizers QΛ1 and QΛ2 followed by mod-ΛC reduction produce V1ⁿ and V2ⁿ, Slepian-Wolf encoded at rates R1(2) and R2(2). The decoder recovers U1ⁿ, U2ⁿ, and Vⁿ = V1ⁿ − V2ⁿ (mod ΛC), and forms Ẑⁿ as a weighted combination with coefficients λ1, λ2, λ3]
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
Theorem 1: Achievable rate region of hybrid random-structured coding
Rnew(D) = conv( ⋃_{(U1,U2,V1,V2) ∈ U(c,D)} {(R1,R2) :
R1 ≥ I(Y1; U1∣U2) + I(V1 − V2; Y1,V2∣U1,U2),
R2 ≥ I(Y2; U2∣U1) + I(V1 − V2; V1,Y2∣U1,U2),
R1 + R2 ≥ I(Y1,Y2; U1,U2) + I(V1 − V2; Y1,V2∣U1,U2) + I(V1 − V2; V1,Y2∣U1,U2)} )
where
U(c,D) ≜ {(U1,U2,V1,V2) : Ui = Yi + Pi, V1 = Y1 + Q1, V2 = cY2 + Q2,
Pi ∼ N(0, pi), Qi ∼ N(0, qi), i = 1,2, indep. of each other and of (Y1,Y2),
such that E[(Z − E(Z∣U1,U2,V1 − V2))²] ≤ D}
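Since all variables in Theorem 1 are jointly Gaussian, each mutual information reduces to log-determinants of covariance matrices, so the sum-rate bound can be evaluated numerically. A sketch (our illustration, with assumed test-channel variances p1, p2, q1, q2):

```python
import numpy as np

# Example parameters (assumed):
rho, c = 0.8, 1.0
p1 = p2 = 0.2
q1 = q2 = 0.1

# Base vector B = (Y1, Y2, P1, P2, Q1, Q2); only Y1 and Y2 are correlated.
Sigma_B = np.diag([1.0, 1.0, p1, p2, q1, q2])
Sigma_B[0, 1] = Sigma_B[1, 0] = rho

Y1 = [1, 0, 0, 0, 0, 0]
Y2 = [0, 1, 0, 0, 0, 0]
U1 = [1, 0, 1, 0, 0, 0]    # U1 = Y1 + P1
U2 = [0, 1, 0, 1, 0, 0]    # U2 = Y2 + P2
V1 = [1, 0, 0, 0, 1, 0]    # V1 = Y1 + Q1
V2 = [0, c, 0, 0, 0, 1]    # V2 = c*Y2 + Q2
W = [1, -c, 0, 0, 1, -1]   # W = V1 - V2
Z = [1, -c, 0, 0, 0, 0]    # Z = Y1 - c*Y2

def cov(rows):
    A = np.array(rows, dtype=float)
    return A @ Sigma_B @ A.T

def h(rows):
    # Differential entropy without the (2*pi*e) terms, which cancel below.
    return 0.5 * np.log2(np.linalg.det(cov(rows))) if rows else 0.0

def MI(A, B, C=()):
    """I(A; B | C) in bits for jointly Gaussian linear functions of B."""
    A, B, C = list(A), list(B), list(C)
    return h(A + C) + h(B + C) - h(A + B + C) - h(C)

# Sum-rate bound of Theorem 1 for this test channel:
sum_rate = (MI([Y1, Y2], [U1, U2])
            + MI([W], [Y1, V2], [U1, U2])
            + MI([W], [V1, Y2], [U1, U2]))

# Distortion E[(Z - E(Z | U1, U2, V1 - V2))^2], to check membership in U(c, D):
S = np.array([Z, U1, U2, W], dtype=float)
Sig = S @ Sigma_B @ S.T
dist = Sig[0, 0] - Sig[0, 1:] @ np.linalg.inv(Sig[1:, 1:]) @ Sig[1:, 0]
print(sum_rate, dist)
```

Optimizing over (p1, p2, q1, q2) subject to the distortion constraint would trace out the achievable sum-rate of the hybrid scheme.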
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
Comparison between Rnew(D), RKP(D) and RBT(D)
Hybrid coding always subsumes structured KP coding:
Rnew(D) ⊇ RKP(D)
It becomes QB random coding with additional rates of ½ log2(1/α) and ½ log2(1/β) (with α + β = 1) at the two encoders
Lemma 1:
Rnew(D) ⊇ RKP(D) ∪ [RBT(D) ⊎ {(½ log2(1/α), ½ log2(1/β)) : α + β = 1}]
The new rate region Rnew(D) strictly improves the time-sharing region between RKP(D) and RBT(D)
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
Comparison between Rnew(D), RKP(D) and RBT(D)
[Figure: in the (R1, R2) plane, the rate region Rnew(D) of the new hybrid scheme contains the time-sharing region between RQB(D) and RKP(D); time sharing between RQB(D) and Rnew(D) enlarges the region further]
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
We look deeper at the minimum achievable sum-rate
Rnew(D) ≜ min{R1 + R2 : (R1,R2) ∈ Rnew(D)}
Theorem 2: Achievable sum-rate of hybrid random-structured coding
Rnew(D) =
½ log2 [16c²(1−ρ²)(1−cρ)²/D²],   if c ≤ 1/(ρ+√(1−ρ²)) and D ≤ 2c²(1−ρ²)
min( log2(2σ_Z²/D), ½ log2[16c((1−ρ²)c−ρD)/D²] ),   if c > 1/(ρ+√(1−ρ²)) and D ≤ 2c²(1−ρ²)/(1+cρ)
min( log2⁺(2σ_Z²/D), ½ log2⁺[4(1−cρ)²/(D−c²(1−ρ²))] ),   otherwise
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
Comparison between Rnew(D), RKP(D) and RBT(D)
Corollary to Lemma 1:
Rnew(D) ≤ min (RKP(D),RBT(D) + 1)
When can Rnew(D) strictly improve both RKP(D) and RBT(D)?
Lemma 2:
Rnew(D) < min(RKP(D), RBT(D))
if either 1/(2ρ) < c < min( √3/(2ρ), 1/(ρ+√(1−ρ²)) ) and D < c(1−ρ²)(3−2cρ)(2cρ−1)/ρ,
or √3/(2ρ) < c < 1/(ρ+√(1−ρ²)) and D < 4(2−√3)c²(1−ρ²)
The Gaussian two-help-one problem
When c ⋅ ρ > 0, e.g., ρ > 0, c = 1
Comparison between Rnew(D), RKP(D) and RBT(D)
[Figure: the (ρ, c) region where Rnew(D) < min{RQB(D), RKP(D)} for sufficiently small D, with boundaries c = θ(ρ), c = 1/(ρ+√(1−ρ²)), and c = 1/(2ρ); also shown are the boundaries where RKP(D) < RQB(D), where Rnew(D) < RQB(D), and where Rnew(D) strictly improves conv(RQB(D), RKP(D)) for small D]
The Gaussian two-help-one problem
Partial sum-rate tightness
A sum-rate lower bound is needed, to be compared with the achievable sum-rate upper bound from hybrid coding
Consider the new and more general problem of Gaussian two-terminal source coding with source covariance matrix
ΣY = [ 1   ρ ]
     [ ρ   1 ]
and covariance-matrix distortion constraint
D = [ k1²     θk1k2 ]
    [ θk1k2   k2²   ]
Fact: If the minimum sum-rate for the above problem is R(D), then
R(D) = min_{D ∈ Υ(ρ,c,D)} R(D),
where Υ(ρ,c,D) contains all real 2 × 2 symmetric matrices D such that 0 ⪯ D ⪯ ΣY and [1 −c] D [1 −c]ᵀ ≤ D
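The membership test for Υ(ρ, c, D) is a pair of PSD checks plus a scalar MSE constraint. A minimal sketch (our illustration, with assumed example parameters):

```python
import numpy as np

# Example parameters (assumed):
rho, c, D = 0.8, 1.0, 0.1
Sigma_Y = np.array([[1.0, rho],
                    [rho, 1.0]])

def in_Upsilon(D_mat, tol=1e-12):
    """Check 0 <= D_mat <= Sigma_Y (PSD order) and [1,-c] D_mat [1,-c]^T <= D."""
    a = np.array([1.0, -c])
    psd = lambda M: np.all(np.linalg.eigvalsh(M) >= -tol)
    return bool(psd(D_mat) and psd(Sigma_Y - D_mat) and a @ D_mat @ a <= D + tol)

# A small proportional distortion matrix is feasible; the full source
# covariance is not (it violates the MSE constraint on Z):
print(in_Upsilon(0.05 * Sigma_Y), in_Upsilon(Sigma_Y))
```

Minimizing R(D) over this set (e.g., by gridding θ, k1, k2) yields the sum-rate lower bound of Theorem 3.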
The Gaussian two-help-one problem
Partial sum-rate tightness
Lemma 3: A new sum-rate lower bound R(D) ≥ max(R†(D), R‡(D)), where
R†(D) ≜
½ log2 [ (1−ρ²+2ρk1k2(1+θ)) / ((1+θ)²k1²k2²) ],   θ ≤ θ⋆
½ log2 [ (1−ρ²)² / ((1−θ)²k1²k2²(1−ρ²+2ρk1k2(1+θ))) ],   θ > θ⋆
R‡(D) ≜
½ log2 [ (1−ρ²+2ρk1k2(1+θ)) / ((1+θ)²k1²k2²) ],   θ ≤ θ⋆
½ log2 [ (1−ρ²)² / ((1−θ)²ki²(4ki²ρ²−4ρθk1k2+kj²)) ],   θ⋆ < θ ≤ θ‡
½ log2 [ (1−ρ²)²(1−ρ²−kj²(1−θ²)) / ((1−θ²)²k1²k2²((1−ρ²)²−kj(kj−2kiρθ)(1−ρ²)−k1²k2²ρ²(1−θ²))) ],   θ‡ < θ
with ki = min(k1, k2), kj = max(k1, k2), and
θ⋆ ≜ (1/(2ρk1k2)) ( √((1−ρ²)²+4ρ²k1²k2²) − (1−ρ²) )
θ‡ ≜ (1/(2ρk1k2)) ( √((1−ρ²)²+4ρ²k1²k2²−8ki²ρ²(1−ρ²)) + (1−ρ²) )
The 1st lower bound R†(D) was proved in Xiong '13 using the estimation-theoretic approach of Wang '10
The 2nd lower bound R‡(D) is newly obtained by combining the approach of Wang '10 with the technique in Wagner '11, which exploits stochastic degradedness of the channel Y1 → Y2 with respect to Y1 → Z
The Gaussian two-help-one problem
Partial sum-rate tightness
Theorem 3: A new sum-rate lower bound
R(D) ≥ R(D) ≜ min_{D ∈ Υ(ρ,c,D)} max(R†(D), R‡(D))
Comparison among sum-rate lower and upper bounds
[Figure: the time-sharing sum-rate between the QB and hybrid coding schemes, l.c.e.(Rnew(D), RQB(D)), plotted against Wagner's sum-rate lower bound RW(D), Maddah-Ali and Tse's sum-rate lower bound RMAT(D), and the new sum-rate lower bound R(D)]
The Gaussian two-help-one problem
Partial sum-rate tightness
Theorem 4: First partial sum-rate tightness result
RQB(D) = R(D) = R(D), i.e., the QB sum-rate matches the lower bound R(D) and hence the optimal sum-rate,
if ρ ∈ (0, 1) and either 0 ≤ c ≤ 1/(1+2ρ) with D ≥ 2c²(1−ρ²)(1−2cρ)/(1−3cρ),
or 0 ≤ c ≤ 2ρ/(1+2ρ²) with D ≥ c(1−ρ²)(1−2c²ρ²)/(ρ(2−3cρ))
[Figure: surface over (ρ, c) of the D/σZ² thresholds, showing the boundary for QB sum-rate tightness and the boundary for QB sum-rate suboptimality]
[Figure: (ρ, c) regions with boundaries c = θ(ρ) for ρ < √2/2 and c = 1/(2ρ) for ρ ≥ √2/2: R(D) = RBT(D) for large D when c < 1/(1+2ρ), while Rnew(D) < RBT(D) for small D]
The Gaussian two-help-one problem
Gap to optimal sum-rate
Theorem 5: For any (ρ, c, D) triple, it holds that
Rnew(D) − R(D) ≤ Rnew(D) − R(D) ≤ 2 bits,
i.e., the gap of the hybrid scheme to the optimal sum-rate is bounded by its gap to the lower bound R(D) of Theorem 3, which is at most 2 bits
[Figure: sum-rates of Krithivasan & Pradhan's scheme RKP(D), the BT sum-rate bound RBT(D), the new scheme Rnew(D), and the lower sum-rate bound Rlb(D) vs. D/σZ²; the BT sum-rate is suboptimal for small D and tight for large D]
The Gaussian two-help-one problem
Gap to optimal sum-rate
Lemma 4: If c = 1 or c = ρ, it holds that
Rnew(D) − R(D) ≤ Rnew(D) − R(D) ≤ 1 bit
[Figure: sum-rates of the QB coding scheme RQB(D), the Krithivasan-Pradhan scheme RKP(D), the new hybrid scheme Rnew(D), and the new sum-rate lower bound R(D) vs. D; the QB sum-rate is suboptimal for small D and tight for large D]
Conclusions
Intellectual merits:
Hybrid scheme conceptually brings together two different worlds
Philosophically the right approach (with better performance)
Problem addressed:
Very timely and interesting
Thanks to Korner & Marton, Ahlswede & Han, and other IT gurus
Combined SC and inference for big data
The more general many-help-one problem
Results significant and intriguing:
First partial sum-rate tightness results
Many open issues (e.g., optimality of hybrid coding)
Broader impact:
Hybrid approach applicable to many other network comm. scenarios
Cooperative networks: the two-way relay channel
The interference channels
References
1 Slepian and Wolf, "Noiseless coding of correlated information sources," IEEE Trans. Inform. Theory, Jul. 1973.
2 Korner and Marton, "How to encode the modulo-two sum of binary sources," IEEE Trans. Inform. Theory, Mar. 1979.
3 Ahlswede and Han, "On source coding with side information via a multiple-access channel and related problems in multi-user information theory," IEEE Trans. Inform. Theory, May 1983.
4 Krithivasan and Pradhan, "Lattices for distributed source coding: jointly Gaussian sources and reconstruction of a linear function," IEEE Trans. Inform. Theory, Dec. 2009.
5 Wagner, "On distributed compression of linear functions," IEEE Trans. Inform. Theory, Jan. 2011.
6 Maddah-Ali and Tse, "Interference neutralization in distributed lossy source coding," Proc. ISIT'10, Jun. 2010.
7 Yang, Zhang, and Xiong, "A new sufficient condition for sum-rate tightness in quadratic Gaussian multiterminal source coding," IEEE Trans. Inform. Theory, Jan. 2013.
8 Yang and Xiong, "Distributed compression of linear functions: Partial sum-rate tightness and gap to optimal sum-rate," IEEE Trans. Inform. Theory, to appear.