Title: Distributed Compression for Binary Symmetric Channels
1. Distributed Compression for Binary Symmetric Channels
2. Introduction
- Description of the Problem
- Slepian-Wolf Theorem
- Prior Work
- Basic Encoder-Decoder Scheme
- Methodology
- Results
3. Problem Description
- Given two correlated data sequences, the original, Y, at the encoder and a noisy version, X, at the decoder, how can Y be transmitted with the best coding efficiency?
- The encoder sees only Y; X is not available at the encoder.

[Figure: the encoder takes Y; the decoder receives the encoded bits and the side information X.]
4. Slepian-Wolf Theorem
- Slepian-Wolf: given the following scheme,

[Figure: a correlated pair (X,Y); X is encoded separately at rate R1 and Y at rate R2; a joint decoder reconstructs (X,Y).]
5. Slepian-Wolf Theorem
- X and Y can be transmitted losslessly if
  - R1 > H(X|Y), R2 > H(Y|X), and
  - R1 + R2 > H(X,Y).

[Figure: the achievable rate region in the (R1, R2) plane, bounded by R1 = H(X|Y), R2 = H(Y|X), and the line R1 + R2 = H(X,Y); the corner points are (H(X), H(Y|X)) and (H(X|Y), H(Y)).]
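The bounds above can be evaluated numerically. The sketch below is an illustration, not part of the original slides: it assumes the correlation is a binary symmetric channel with crossover probability p (as in the Prior Work slides), so that H(X|Y) equals the binary entropy of p.

```python
from math import log2

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def slepian_wolf_bounds(py1, p):
    """Rate bounds when X is Y passed through a BSC(p)
    and Y is i.i.d. Bernoulli(py1)."""
    hy = h2(py1)
    # P(X = 1) after the BSC flips each bit of Y with probability p
    px1 = py1 * (1 - p) + (1 - py1) * p
    hx = h2(px1)
    hxy_joint = hy + h2(p)          # H(X,Y) = H(Y) + H(X|Y)
    return {
        "H(Y)": hy,
        "H(X)": hx,
        "H(X|Y)": h2(p),            # entropy of the channel noise
        "H(Y|X)": hxy_joint - hx,
        "H(X,Y)": hxy_joint,
    }

# Equal input probabilities and the error rate used later (p = 0.06)
b = slepian_wolf_bounds(0.5, 0.06)
```

For equal input probabilities the region is symmetric, so H(Y|X) = H(X|Y) = h2(0.06), roughly 0.33 bits.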
6. Slepian-Wolf Theorem
- Our problem is a special case of this: the decoder already has X, so only Y needs to be encoded, at any rate R2 > H(Y|X).

[Figure: the same rate region, with the operating point at the corner (H(X), H(Y|X)).]
7. Prior Work

[Figure: binning example. The length-6 binary input sequences are grouped into Bins 1 through 4 of well-separated sequences (the slide shows 000111 and 010101 in one bin and 001110 in another); the encoder transmits only the bin index of Y through the channel, and the decoder recovers Y from the bin contents using X.]
8. Prior Work
- How do we maximally separate very long input sequences within each bin?
- Use error-correcting codes.
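One standard way to form such bins, sketched below for illustration, is syndrome binning: sequences sharing a syndrome of a linear code's parity-check matrix form one bin, and any two members of a bin differ by a codeword, so they are at least the code's minimum distance apart. The Hamming(7,4) code here is a small stand-in, not the code from the slides.

```python
import numpy as np
from collections import Counter

# Parity-check matrix of the Hamming(7,4) code (columns are 1..7
# in binary); a simplified stand-in for longer codes.
H = np.array([
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
], dtype=int)

def bin_index(seq):
    """Syndrome of a length-7 binary sequence; sequences with the
    same syndrome land in the same bin, at distance >= 3 apart."""
    return tuple(H.dot(seq) % 2)

# All 2^7 sequences fall into 2^3 = 8 bins of 16 sequences each.
counts = Counter(
    bin_index(np.array([(i >> k) & 1 for k in range(7)]))
    for i in range(128)
)
```

Only the 3-bit bin index is transmitted instead of the 7-bit sequence.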
9. Prior Work
- The prior solution, by Ramchandran and Pradhan, models the correlation as a binary symmetric channel with crossover probability p, with EQUAL input probabilities of 0 and 1.

[Figure: BSC transition diagram; 0 -> 0 and 1 -> 1 with probability 1-p, 0 -> 1 and 1 -> 0 with probability p.]
10. Prior Work
- What if the input probabilities are NOT EQUAL?
11. Methodology

[Block diagram: Huffman-code the input sequence X; split the codewords into Bit Planes 1 through N; form the bins for each bit plane using error-correcting codes; the decoder recovers X from the bin indices and the side-information Y sequence.]
12. Encoder
- Inputs: 0 (with probability .7) and 1 (with probability .3)
- Huffman-code length-2 sequences:
  - 00 -> 0 (with probability .49)
  - 01 -> 10 (with probability .21)
  - 10 -> 110 (with probability .21)
  - 11 -> 111 (with probability .09)
- Bit-Plane 1: 0, 1, 1, 1
- Bit-Plane 2: -, 0, 1, 1
- Bit-Plane 3: -, -, 0, 1
13. Encoder
- Example: input 001001 -> length-2 blocks 00, 10, 01 -> Huffman codewords 0, 110, 10
- Bit-Plane 1: 011
- Bit-Plane 2: -10
- Bit-Plane 3: -0-
- Error control coding is applied to each bit plane to form the bins.
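The example above can be reproduced mechanically; this sketch slices the variable-length codewords into bit planes, with "-" marking a codeword too short to reach a given plane, as on the slide.

```python
# Huffman codebook for length-2 blocks, from the previous slide.
codebook = {"00": "0", "01": "10", "10": "110", "11": "111"}

def encode_bit_planes(bits):
    """Huffman-code 2-bit blocks, then slice the codewords
    into bit planes; '-' pads codewords shorter than the plane."""
    blocks = [bits[i:i + 2] for i in range(0, len(bits), 2)]
    codes = [codebook[b] for b in blocks]
    depth = max(len(c) for c in codes)
    planes = ["".join(c[k] if k < len(c) else "-" for c in codes)
              for k in range(depth)]
    return codes, planes

codes, planes = encode_bit_planes("001001")
# codes  -> ['0', '110', '10']
# planes -> ['011', '-10', '-0-']
```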
14. Decoder
- The decoder receives a BIN NUMBER, which corresponds to MULTIPLE CODEWORDS.
- How to select the correct codeword out of these multiple codewords?
- Use MAXIMUM LIKELIHOOD detection.
15. Decoder
- Example: the decoder receives the index of Bin 4, which contains the codewords 011 and 110.
- The candidates are Huffman codes for length-2 sequences z1, z2, z3.
- Assume Y = 01, 11, 10. Compute the probability of z1, z2, z3 given 01, 11, 10, using the channel error probability, and select the most likely candidate.
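The selection step can be sketched as follows. This is a simplified, bitwise version: it scores each candidate by its BSC likelihood against a side-information bit string, whereas the slide computes the likelihood over the length-2 source blocks; function names and the example strings are illustrative.

```python
def likelihood(candidate, side_info, p):
    """P(side_info | candidate) under a BSC with crossover p:
    each disagreeing bit contributes a factor p, each
    agreeing bit a factor 1 - p."""
    assert len(candidate) == len(side_info)
    d = sum(a != b for a, b in zip(candidate, side_info))
    return (p ** d) * ((1 - p) ** (len(candidate) - d))

def ml_select(bin_codewords, side_info, p=0.06):
    """Maximum-likelihood detection: pick the codeword in the bin
    most likely to have produced the side information."""
    return max(bin_codewords, key=lambda c: likelihood(c, side_info, p))
```

For p < 0.5 this reduces to choosing the codeword closest in Hamming distance to the side information.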
16. Parameters
- Huffman-code input sequences of length 4.
- Form the bins for each bit plane using BCH(15,k) codes.

[Block diagram: as in the Methodology slide, with Bit Planes 1 through N, bins formed by error-correcting codes, and the Y sequence at the decoder.]
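Under syndrome binning, the choice of k in BCH(15,k) fixes both the number of bins and the transmitted rate per plane. The sketch below tabulates this for the standard length-15 binary BCH codes; the mapping of rate to syndrome length is an assumption based on syndrome binning, not stated on the slides.

```python
# Standard binary BCH codes of length 15: (n, k, t),
# where t is the number of correctable errors.
bch15 = [(15, 11, 1), (15, 7, 2), (15, 5, 3)]

params = []
for n, k, t in bch15:
    syndrome_bits = n - k            # bits sent per bin index
    num_bins = 2 ** syndrome_bits    # one bin per coset of the code
    rate = syndrome_bits / n         # rate per bit-plane symbol
    params.append((k, num_bins, rate))
```

Smaller k tolerates more correlation noise (larger t) but costs a higher transmitted rate.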
17. Bit Rate vs. Probability of Occurrence of 0s (at the fixed error rate p = 0.06)

[Figure: plot of bit rate against the probability of 0s, for p = 0.06.]
18. Difference between the Actual Bit Rate and the Slepian-Wolf Bound vs. Error Probability (p)

[Figure: plot of the rate gap against the channel error probability p.]
19. Conclusions
- The Huffman code is not a very good choice.
- Better error-correcting codes can be selected.
- The scheme gives good results for low error probabilities (p), and for cases in which the Huffman code gives a nearly equal distribution of 0s and 1s.