Title: Compression with Side Information using Turbo Codes

Slide 1: Compression with Side Information using Turbo Codes
Anne Aaron and Bernd Girod
Information Systems Laboratory, Stanford University
Data Compression Conference, April 3, 2002
Slide 2: Overview
- Introduction
- Turbo Coder and Decoder
- Compression of Binary Sequences
- Extension to Continuous-valued Sequences
- Joint Source-Channel Coding
- Conclusion
Slide 3: Distributed Source Coding
Slide 4: Research Problem
- Motivation
  - Slepian-Wolf theorem: statistically dependent signals can be compressed in a distributed manner to the same total rate as a system in which the signals are compressed jointly.
- Objective
  - Design practical codes that achieve compression close to the Slepian-Wolf bound.
Slide 5: Asymmetric Scenario: Compression with Side Information
- Compression techniques that send Y at a rate close to H(Y) are well known.
- Some form of switching between the roles of X and Y can achieve more symmetric rates.
Slide 6: Our Approach: Turbo Codes
- Turbo codes
  - Developed for channel coding
  - Perform close to the Shannon channel capacity limit (Berrou et al., 1993)
- Similar work
  - Garcia-Frias and Zhao, 2001 (Univ. of Delaware)
  - Bajcsy and Mitran, 2001 (McGill Univ.)
Slide 7: System Set-up
- X and Y are i.i.d. binary sequences X1, X2, ..., XL and Y1, Y2, ..., YL with equally probable ones and zeros. Let Xi be independent of Yj for i ≠ j, but dependent on Yi. The dependency between X and Y is described by the pmf P(x|y).
- Y is sent at a rate RY ≥ H(Y) and is available as side information at the decoder.
Slide 8: Turbo Coder
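The encoder idea behind this slide can be sketched in a few lines: run the source bits through two constituent recursive systematic convolutional (RSC) encoders (the second on an interleaved copy), discard the systematic bits, and transmit only the parity streams. The sketch below is a minimal illustration, assuming a toy memory-2 RSC code with feedback 1+D+D^2 and feedforward 1+D^2; the talk uses 16-state, rate-4/5 constituent codes, and the function names here are illustrative, not from the talk.

```python
def rsc_parity(bits):
    """Parity stream of a toy memory-2 recursive systematic
    convolutional encoder (feedback 1+D+D^2, feedforward 1+D^2)."""
    s1 = s2 = 0
    out = []
    for b in bits:
        fb = b ^ s1 ^ s2      # feedback term: input XOR both state bits
        out.append(fb ^ s2)   # feedforward tap on the older state bit
        s1, s2 = fb, s1       # shift the register
    return out

def turbo_compress(x, interleaver):
    """Compress X by sending only the two parity streams; the
    systematic bits are discarded, since the decoder can recover
    them with the help of the side information Y."""
    p1 = rsc_parity(x)
    p2 = rsc_parity([x[i] for i in interleaver])
    return p1, p2
```

Without puncturing, this toy encoder emits two parity bits per input bit; the rate-4/5 constituent codes in the talk each emit one parity bit per four input bits, which is how the two encoders together reach RX = 0.5 bit per input bit.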
Slide 9: Turbo Decoder
[Block diagram: two soft-input soft-output (SISO) decoders in the usual turbo configuration. Each SISO decoder combines P_channel and P_a_priori to produce P_a_posteriori and P_extrinsic; the extrinsic probabilities of one decoder serve as the a priori probabilities of the other, and the L decoded bits are output after the final iteration.]
Slide 10: Simulation: Binary Sequences
- X-Y relationship
  - P(Xi = Yi) = 1 - p and P(Xi ≠ Yi) = p
- System
  - 16-state, rate 4/5 constituent convolutional codes
  - RX = 0.5 bit per input bit with no puncturing
- Theoretically, it must be possible to send X without error when H(X|Y) ≤ 0.5.
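In this correlation model, H(X|Y) is just the binary entropy of the crossover probability p, so the theoretical limit H(X|Y) = 0.5 corresponds to a specific p. A small sketch, with illustrative function names, that finds this threshold by bisection:

```python
import math

def h2(p):
    """Binary entropy in bits; H(X|Y) = h2(p) for this BSC correlation model."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def crossover_for_entropy(target, lo=0.0, hi=0.5):
    """Bisection for the p in [0, 0.5] with h2(p) == target
    (h2 is monotonically increasing on this interval)."""
    for _ in range(60):
        mid = (lo + hi) / 2
        if h2(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Crossover probability at which H(X|Y) reaches RX = 0.5:
p_star = crossover_for_entropy(0.5)   # about 0.11
```

Beyond p_star, no code of rate RX = 0.5 can reconstruct X without error, which is why the simulated error curves are plotted against H(X|Y) relative to the 0.5-bit bound.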
Slide 11: Results: Compression of Binary Sequences
[Plot: residual error rate versus H(X|Y) at RX = 0.5 bit per input bit; the system operates within 0.15 bit of the Slepian-Wolf bound.]
Slide 12: Results for Different Rates
- The parity bits were punctured to achieve lower rates.
Slide 13: Extension to Continuous-Valued Sequences
- X and Y are sequences of i.i.d. continuous-valued random variables X1, X2, ..., XL and Y1, Y2, ..., YL. Let Xi be independent of Yj for i ≠ j, but dependent on Yi. The dependency between X and Y is described by the pdf f(x|y).
- Y is known as side information at the decoder.
Slide 14: Simulation: Gaussian Sequences
- X-Y relationship
  - X is a sequence of i.i.d. Gaussian random variables.
  - Yi = Xi + Zi, where Z is also a sequence of i.i.d. Gaussian random variables, independent of X. f(x|y) is a Gaussian probability density function.
- System
  - 4-level Lloyd-Max scalar quantizer
  - 16-state, rate 4/5 constituent convolutional codes
  - No puncturing, so the rate is 1 bit per source sample.
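The Lloyd-Max quantizer mentioned above minimizes mean-squared error by alternating nearest-neighbor partitioning with centroid updates. A minimal sketch, assuming a unit-variance Gaussian source and a fine discretization of its pdf (the function name and initialization are illustrative, not from the talk):

```python
import math

def lloyd_max_gaussian(n_levels=4, n_iter=60):
    """Lloyd iteration for a Lloyd-Max scalar quantizer of a
    unit-variance Gaussian, run on a discretized pdf over [-6, 6]."""
    step = 0.001
    xs = [i * step for i in range(-6000, 6001)]
    w = [math.exp(-x * x / 2) for x in xs]           # unnormalized N(0,1) pdf
    reps = [-1.5, -0.5, 0.5, 1.5][:n_levels]         # initial codebook guess
    for _ in range(n_iter):
        # Nearest-neighbor partition: thresholds are midpoints between levels.
        thr = [(a + b) / 2 for a, b in zip(reps, reps[1:])]
        sums = [0.0] * n_levels
        mass = [0.0] * n_levels
        for x, wx in zip(xs, w):
            k = sum(x > t for t in thr)              # index of the cell x falls in
            sums[k] += wx * x
            mass[k] += wx
        # Centroid update: each level moves to its cell's conditional mean.
        reps = [s / m for s, m in zip(sums, mass)]
    return reps
```

For four levels this converges to the well-known values of roughly ±0.4528 and ±1.510, the reconstruction levels the talk's quantizer would use for a unit-variance source.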
Slide 15: Results: Compression of Gaussian Sequences
[Plot: results at RX = 1 bit/sample, with an annotated gap of 2.8 dB. CSNR = ratio of the variance of X to the variance of Z.]
Slide 16: Joint Source-Channel Coding
- Assume that the parity bits pass through a memoryless channel with capacity C.
- We can include the channel statistics in the decoder calculations for P_channel.
- From the Slepian-Wolf theorem and the definition of channel capacity, the rate must satisfy RX ≥ H(X|Y)/C.
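For the binary symmetric channel with crossover probability q = 0.03 used in the results that follow, the bound is easy to evaluate numerically. A small sketch (variable names are illustrative):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

q = 0.03
C = 1.0 - h2(q)              # BSC capacity, about 0.806 bits per channel use
max_cond_entropy = 0.5 * C   # with RX = 0.5, decoding requires H(X|Y) <= RX * C
```

So at RX = 0.5 over this channel, error-free decoding is theoretically possible only up to H(X|Y) of roughly 0.40 bit, compared with 0.5 bit over a noiseless channel.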
Slide 17: Results: Joint Source-Channel Coding
[Plot: RX = 0.5, parity bits transmitted over a BSC with crossover probability q = 0.03; annotated gaps of 0.15 bit and 0.12 bit.]
Slide 18: Conclusion
- Turbo codes can be used for compression of binary sequences, performing close to the Slepian-Wolf bound for lossless distributed source coding.
- The system can be applied to compression of distributed continuous-valued sequences, where it performs better than previous techniques.
- The scheme extends easily to joint source-channel coding.