Distributed Compression For Binary Symmetric Channels - PowerPoint PPT Presentation

Learn more at: http://web.stanford.edu

Transcript and Presenter's Notes

Title: Distributed Compression For Binary Symmetric Channels


1
Distributed Compression For Binary Symmetric Channels
  • Kivanc Ozonat

2
Introduction
  • Description of the Problem
  • Slepian-Wolf Theorem
  • Prior Work
  • Basic Encoder-Decoder Scheme
  • Methodology
  • Results

3
Problem Description
  • Given two correlated data sequences, the original Y at the encoder and
    a noisy version X at the decoder, how can Y be transmitted with the
    best coding efficiency?
  • No communication between X and Y: the encoder does not observe X.

[Diagram: the encoder observes Y; the decoder observes the side information X and reconstructs Y.]
4
Slepian-Wolf Theorem
  • Slepian-Wolf: given the following scheme,

[Diagram: a source emits correlated pairs (X,Y); X is encoded at rate R1 and Y at rate R2 for a joint decoder.]
5
Slepian-Wolf Theorem
  • Can transmit X and Y if
  • - R1 > H(X|Y), R2 > H(Y|X), and
  • - R1 + R2 > H(X,Y).

[Figure: the Slepian-Wolf rate region in the (R1, R2) plane, bounded by R1 = H(X|Y), R2 = H(Y|X), and R1 + R2 = H(X,Y), with corner points (H(X), H(Y|X)) and (H(X|Y), H(Y)).]
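The entropies in these bounds can be checked numerically. The following is a minimal sketch (my own illustration, not from the talk) that computes H(X|Y), H(Y|X), and H(X,Y) for a uniform binary Y observed through a BSC with crossover probability p, the correlation model used later in the slides:

```python
import math

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative joint distribution: Y ~ Bernoulli(1/2) at the encoder,
# X = Y corrupted by a BSC with crossover probability p at the decoder.
p = 0.06
joint = {(0, 0): 0.5 * (1 - p), (0, 1): 0.5 * p,
         (1, 0): 0.5 * p, (1, 1): 0.5 * (1 - p)}   # keys are (y, x)

H_YX_joint = -sum(q * math.log2(q) for q in joint.values())
H_Y = h(sum(q for (y, x), q in joint.items() if y == 0))
H_X = h(sum(q for (y, x), q in joint.items() if x == 0))
H_Y_given_X = H_YX_joint - H_X          # chain rule: H(X,Y) = H(X) + H(Y|X)
H_X_given_Y = H_YX_joint - H_Y

print(f"H(Y|X) = {H_Y_given_X:.4f}  H(X|Y) = {H_X_given_Y:.4f}  H(X,Y) = {H_YX_joint:.4f}")
```

For this symmetric case both conditional entropies equal the binary entropy h(p).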
6
Slepian-Wolf Theorem
  • Our problem is a special case of this

[Figure: the same rate region; our problem operates at the corner point R1 = H(X), R2 = H(Y|X), since X is already available at the decoder and Y is compressed down to its conditional entropy H(Y|X).]
7
Prior Work
[Diagram: an example binning scheme. Length-6 sequences are partitioned into Bins 1 through 4; the encoder transmits only the bin index of its sequence, and the decoder selects the correct sequence from that bin using the channel output.]
8
Prior Work
  • How to maximally separate very long input sequences?
  • Use error-correcting codes.
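A standard way to realize this separation, consistent with the idea above, is to use the cosets of a linear code as bins: sequences with the same syndrome share a bin, and any two sequences in a bin differ by a codeword, so they are far apart in Hamming distance. A minimal sketch using the (7,4) Hamming parity-check matrix as an illustrative stand-in (not the code actually used in the talk):

```python
# Parity-check matrix of the (7,4) Hamming code, used here as a simple
# illustrative stand-in for the BCH codes mentioned later in the talk.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def bin_index(word):
    """Syndrome of a length-7 binary word; all words in the same coset of
    the code share a syndrome, so the syndrome serves as the bin number."""
    return tuple(sum(hij * wj for hij, wj in zip(row, word)) % 2 for row in H)

# A codeword lands in bin (0, 0, 0); a corrupted word lands elsewhere.
print(bin_index([0, 0, 0, 0, 0, 0, 0]))   # (0, 0, 0)
print(bin_index([1, 0, 1, 1, 0, 0, 1]))   # (1, 0, 0)
```

Only the 3-bit syndrome is transmitted instead of the 7-bit word, which is where the compression comes from.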

9
Prior Work

[Diagram: binary symmetric channel. 0 → 0 and 1 → 1 with probability 1-p; 0 → 1 and 1 → 0 with probability p.]
  • Prior work by Ramchandran and Pradhan assumes EQUAL input
    probabilities of 0 and 1.
10
Prior Work
  • What if the input probabilities are NOT EQUAL?
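Unequal input probabilities change the target rate: when P(Y=0) ≠ 1/2, the Slepian-Wolf rate H(Y|X) drops below the binary entropy h(p) of the uniform case. A small sketch (my own illustration, not from the slides) computing H(Y|X) as a function of the input probability:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_Y_given_X(q0, p):
    """Slepian-Wolf rate H(Y|X) when Y ~ Bernoulli with P(Y=0) = q0 and the
    decoder's side information X is Y passed through a BSC(p)."""
    px0 = q0 * (1 - p) + (1 - q0) * p          # P(X = 0) by total probability
    # H(Y|X) = H(Y) + H(X|Y) - H(X), and H(X|Y) = h2(p) for a BSC
    return h2(q0) + h2(p) - h2(px0)

p = 0.06
for q0 in (0.5, 0.7, 0.9):
    print(f"P(Y=0) = {q0}: H(Y|X) = {rate_Y_given_X(q0, p):.4f} bits/symbol")
```

At q0 = 0.5 this reduces to h(p); the rate shrinks as the source becomes more skewed, which is the gap the Huffman-based scheme tries to exploit.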

11
Methodology

[Diagram: the input sequence X is Huffman coded and split into bit planes 1 through N; bins are formed for each plane using error-correcting codes; the decoder combines the bin indices with the side-information sequence Y.]
12
Encoder
  • Inputs: 0 (with probability 0.7) and 1 (with probability 0.3)
  • Huffman code for length-2 sequences:
  • 00 → 0 (with probability 0.49)
  • 01 → 10 (with probability 0.21)
  • 10 → 110 (with probability 0.21)
  • 11 → 111 (with probability 0.09)
  • Bit-Plane 1: 0, 1, 1, 1
  • Bit-Plane 2: -, 0, 1, 1
  • Bit-Plane 3: -, -, 0, 1
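The bit-plane decomposition above can be sketched in a few lines. The codebook matches the slide's Huffman code for pairs; the function name is my own, not the talk's:

```python
# Huffman codebook for pairs of symbols with P(0) = 0.7, P(1) = 0.3;
# this matches the mapping on the slide (00 -> 0, 01 -> 10, 10 -> 110, 11 -> 111).
codebook = {'00': '0', '01': '10', '10': '110', '11': '111'}

def bit_planes(pairs):
    """Huffman-encode each input pair, then split the variable-length
    codewords into bit planes; '-' marks positions where a short codeword
    contributes no bit."""
    codewords = [codebook[p] for p in pairs]
    depth = max(len(c) for c in codewords)
    return [''.join(c[i] if i < len(c) else '-' for c in codewords)
            for i in range(depth)]

# The slide's example: input pairs 00, 01, 10, 11
print(bit_planes(['00', '01', '10', '11']))   # ['0111', '-011', '--01']
```

Each plane is then binned separately with an error-correcting code, as in the pipeline of slide 11.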

13
Encoder

[Diagram: the input 001001 is parsed into pairs 00, 10, 01 and Huffman coded to 0, 110, 10. Bit plane 1 is 011, bit plane 2 is -10, and bit plane 3 is -0-; error-control coding then forms the bins for each plane.]
14
Decoder
  • The decoder receives a BIN NUMBER, which corresponds to MULTIPLE
    CODEWORDS.
  • How to select the correct codeword out of these multiple codewords?
  • Use MAXIMUM LIKELIHOOD detection.
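A minimal sketch of the maximum-likelihood selection (illustrative names, not the talk's implementation): among the codewords in the received bin, pick the one that maximizes the BSC likelihood of the decoder's side information, which for p < 1/2 is simply the codeword at minimum Hamming distance:

```python
import math

def ml_codeword(bin_codewords, side_info, p):
    """Among the codewords in a bin, return the one maximizing the likelihood
    of the side information under a BSC with crossover probability p."""
    def log_lik(cw):
        flips = sum(a != b for a, b in zip(cw, side_info))
        return flips * math.log(p) + (len(cw) - flips) * math.log(1 - p)
    return max(bin_codewords, key=log_lik)

# Hypothetical bin holding two candidate codewords, as in the slide's Bin 4.
bin4 = ['011', '110']
print(ml_codeword(bin4, '011', p=0.06))   # side info matches 011 exactly
print(ml_codeword(bin4, '100', p=0.06))   # one flip from 110, three from 011
```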

15
Decoder

[Diagram: the decoder receives the bit-plane sequence 011, which indexes Bin 4; the bin contains the candidate codewords 011 and 110.]
Huffman codes for length-2 sequences: z1 z2 z3.
Assume Y = 01, 11, 10. Compute the probability of z1 z2 z3 given 01, 11, 10, using the channel error probability.
16
Parameters

[Diagram: the same pipeline as slide 11 (Huffman code the input sequence X, split into bit planes 1 through N, form the bins with error-correcting codes, decode with the Y sequence). Parameters: length 4; bins formed with BCH(15,k) codes.]
17
[Plot: Bit Rate vs. Probability of Occurrence of 0s (at the fixed error rate p = 0.06).]
18
[Plot: Difference between the Actual Bit Rate and the Slepian-Wolf Bound vs. Error Probability (p).]
19
Conclusions
  • The Huffman code is not a very good choice.
  • Better error-correcting codes can be selected.
  • The scheme gives good results for low error probabilities p, and for
    cases in which the Huffman code gives a nearly equal distribution of
    0s and 1s.