Title: Slepian-Wolf Coding over Broadcast Channels
1. Slepian-Wolf Coding over Broadcast Channels
UC San Diego, December 1, 2006
2. Outline
- Problem definition and motivation
- Separate source-channel coding
  - Optimal combination of source and channel codes
- Joint source-channel coding
  - Complete region of achievable rates
  - Operational separation
- Comparison between separate and joint schemes
  - No informational separation in general
3. Problem setting and motivation
4. Problem setting and motivation
[Figure: a transmitter with local observation X^t broadcasts to receivers with local observations Y_1^t, Y_2^t, Y_3^t, Y_4^t]
5. Formal description
Discrete memoryless source and channel
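Spelled out, the setup on this slide is the standard one for this problem; the encoder/decoder notation below is introduced here for concreteness and is not from the slide:

```latex
\begin{align*}
&\text{Source: } (X_t, Y_{1t}, \ldots, Y_{Kt}) \sim p(x, y_1, \ldots, y_K), \quad \text{i.i.d.\ over } t \\
&\text{Channel: } p(v_1, \ldots, v_K \mid u), \quad \text{memoryless} \\
&\text{Encoder: } f : \mathcal{X}^m \to \mathcal{U}^n \\
&\text{Decoder } k: \; g_k : \mathcal{V}_k^n \times \mathcal{Y}_k^m \to \mathcal{X}^m \\
&\kappa = n/m \ \text{channel uses per source symbol}, \qquad
 \max_k \Pr\{ g_k(V_k^n, Y_k^m) \neq X^m \} \to 0
\end{align*}
```

The encoder sees only X^m; receiver k must reproduce X^m from its channel output and its own side information Y_k^m.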
6. Questions
- What combination of coding schemes is optimal?
- How does the optimal joint scheme work?
- Is joint coding superior to a combination of separate codes?
7. PART I: SEPARATE SOURCE-CHANNEL CODING
8. The most general separate scheme
For K = 3
9. A suboptimal separate coding method
- Worst-case Slepian-Wolf coding over compound channels: transmit only W_(1,2,...,K).
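Concretely (a sketch, assuming the standard worst-case/compound-channel argument): since the single bin index W_(1,2,...,K) must be decoded by every receiver, this scheme works only if the largest source-coding requirement fits through the weakest channel:

```latex
\max_{1 \le k \le K} H(X \mid Y_k) \;\le\; \kappa \,\max_{p(u)} \min_{1 \le k \le K} I(U; V_k)
```

This couples all receivers to the single worst case, which is what makes the method suboptimal in general.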
10. Trivial outer bound for source codes
- Is this outer bound exactly the achievable region?
- If so, what is the mechanism to achieve it?
11. Slepian-Wolf coding revisited
- It is well known that binning achieves this trivial lower bound.
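To make the binning mechanism concrete, here is a toy Slepian-Wolf encoder/decoder in Python. All parameters are illustrative, and the hash-based bin assignment is a stand-in for the random binning of the proof:

```python
import itertools
import random

random.seed(0)
m = 12          # source block length
p = 0.05        # crossover probability between X and side information Y
R_bits = 7      # bin-index length; must exceed m*H(X|Y) = m*h(0.05) ~ 3.4 bits

def bin_index(seq):
    # deterministic hash of the sequence plays the role of a random bin assignment
    return hash(seq) % (1 << R_bits)

# draw a source sequence and correlated side information
x = tuple(random.randint(0, 1) for _ in range(m))
y = tuple(b ^ (random.random() < p) for b in x)

# encoder: transmit only the bin index, never seeing y
w = bin_index(x)

# decoder: among all sequences in bin w, pick the one closest to y
candidates = [c for c in itertools.product((0, 1), repeat=m) if bin_index(c) == w]
x_hat = min(candidates, key=lambda c: sum(a != b for a, b in zip(c, y)))
print("decoded correctly:", x_hat == x)
```

With about 2^7 bins over 2^12 sequences, each bin holds roughly 32 candidates, and the side information singles out the true one with high probability.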
12. Multiple binning
13. Error analysis
14. Optimal source coding rates
15. Why this is good news
- In other words: to characterize the minimum achievable κ in separate coding, it suffices to find the set of achievable total rates that the broadcast channel can deliver to each receiver.
16. Why this is good news
17. Why this is good news
- Therefore, assuming WLOG that
- this implies that κ is achievable in separate coding with K = 2 if and only if
18. Comparison
19. Infinite gains possible
20. Now the bad news
- Would the generalization of the degraded-message-sets scenario cover all the achievable total rates?
21. Now the bad news
- The framework of degraded message sets is not sufficient for the analysis of the minimum achievable κ.
22. PART II: JOINT SOURCE-CHANNEL CODING
23. Joint source-channel coding
- It is easy to prove that this is a convex region.
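For context, the single-letter characterization referred to here can be stated as follows (reconstructed from the surrounding slides; treat as a sketch): κ is achievable jointly for all K receivers iff some channel input distribution p(u) simultaneously satisfies

```latex
H(X \mid Y_k) \;\le\; \kappa \, I(U; V_k), \qquad k = 1, 2, \ldots, K.
```

Each receiver only needs its own mutual information to cover its own conditional entropy; no single worst case couples the receivers, unlike the separate scheme of Part I.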
24. Proof sketch for the converse
Fano's inequality
25. Proof sketch for the direct result
Decoding: find i such that
26. Probability of decoding error
27. Is this separation?
- Not in the classical sense: there are no stand-alone source or channel codes here.
- However, there is no interplay between (X, Y_1, Y_2, ..., Y_K) and (U, V_1, V_2, ..., V_K), unlike in other studied joint source-channel coding scenarios.
- Operational separation!
28. Virtual binning
[Figure: channel codebook drawn over the set of typical X^n sequences]
29. Virtual binning
[Figure: channel codebook drawn over the set of typical X^n sequences]
- For better channels, worse side information is OK.
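The mechanism behind these two slides can be sketched as follows (a paraphrase, not the slide's exact notation): every typical source sequence gets its own channel codeword u^n(x^m), and receiver k looks for the unique source sequence consistent with both its side information and its channel output:

```latex
\hat{x}_k^m = \text{the unique } x^m \ \text{s.t.}\
(x^m, Y_k^m) \in T_\epsilon^{(m)}
\ \text{and}\ \big(u^n(x^m), V_k^n\big) \in T_\epsilon^{(n)}.
```

The side information shrinks the candidate set to about 2^{mH(X|Y_k)} sequences, while the channel can resolve about 2^{nI(U;V_k)} codewords. The set of sequences typical with Y_k^m thus acts as a virtual bin: a better channel can resolve a larger bin, i.e. tolerate worse side information.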
30. Suboptimality of separate coding
- Consider a binary symmetric broadcast channel.
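A quick feasibility check of the joint condition H(X|Y_k) ≤ κ·I(U;V_k) can be scripted for a binary symmetric example. All channel and correlation parameters below are made up for illustration; the uniform input achieving 1 − h(p) is the standard BSC capacity:

```python
import math

def h(p):
    # binary entropy in bits
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative parameters (assumed, not from the talk):
# receiver k sees X through a BSC(q_k) as side information Y_k,
# and the broadcast channel to receiver k is a BSC(p_k).
kappa = 1.0                  # channel uses per source symbol
side_info = [0.25, 0.05]     # q_1, q_2
channel = [0.02, 0.15]       # p_1, p_2

# joint scheme with uniform input works iff h(q_k) <= kappa * (1 - h(p_k)) for all k
for q, p in zip(side_info, channel):
    need = h(q)                  # H(X | Y_k)
    have = kappa * (1 - h(p))    # kappa * I(U; V_k) at capacity-achieving input
    print(f"H(X|Y)={need:.3f}  kappa*I(U;V)={have:.3f}  ok={need <= have}")
```

For these numbers both receivers are satisfied simultaneously, even though their side-information qualities and channel qualities are mismatched in opposite directions.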
31. Suboptimality of separate coding
[Figure: achievable rate regions in the (R_1, R_2) plane, with the effective region R]
32. Joint coding gains
[Figure: achievable rate regions in the (R_1, R_2) plane]
- We observed gains of up to a factor of about 2.
33. Summary and conclusions
- Separate source-channel coding
  - We have a single-letter characterization for K = 2.
  - For K > 2, it suffices to characterize the total channel coding rates deliverable to each receiver.
- Joint source-channel coding
  - Single-letter characterization for all K.
  - Effective capacity region R.
  - Operational separation via virtual binning.
  - No informational separation in general.
34. Open questions
- Is there a single-letter characterization for the total channel coding rates?
- Any hints on how practical SW coding over broadcast channels should work?
- Any other multi-terminal problems for which our simple joint coding technique works?
- Any implications for lossy coding (with or without side information)?