Title: Low Density Generator Matrix Codes
Slide 1: Low Density Generator Matrix Codes for Source, Channel, and Joint Source-Channel Coding
Wei Zhong and Javier Garcia-Frias
Department of Electrical and Computer Engineering, University of Delaware
Slide 2: Outline
- Review
- 1. Shannon's Theorem
- 2. Linear block codes
- 3. Low Density Parity-Check codes (LDPC codes)
- Low Density Generator Matrix codes (LDGM codes)
- 1. Channel Coding
- 2. Source Coding
- 3. Joint source-channel coding
Slide 3: Review - Shannon's Theorem
- Source coding: an information source can be compressed down to its entropy H.
- Channel coding: over a noisy channel, near error-free communication can be achieved at any rate up to the channel capacity C (info bits / channel use).
Slide 4: Review - Linear Block Codes
- Two classical channel codes
- Linear block codes
- Convolutional codes
- An (n,k) linear block code is determined by
- Length of the information message, k
- Length of the codeword, n
- Generator matrix G
- Parity-check matrix H
Slide 5: Review - Linear Block Codes
- Let u = (u1, u2, ..., uk) be the information message
- Encoding
- Codeword c = uG = (u1, u2, ..., uk)G, where G is the generator matrix
- Channel
- c' = c + n, where n is the channel noise
- Decoding
- s = c'H^T, where
- s is the syndrome
- H is the parity-check matrix
- If s is the all-zero vector, claim no error
- Otherwise, claim an error and try to correct it (a small sketch of these steps follows)
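A minimal sketch of these steps in Python/NumPy; the particular systematic (7,4) code is an illustrative assumption, not a code from the talk:

    import numpy as np

    # Toy systematic (7,4) code with G = [I | P]; this P is an
    # illustrative choice only.
    P = np.array([[1, 1, 0],
                  [1, 0, 1],
                  [0, 1, 1],
                  [1, 1, 1]])
    G = np.hstack([np.eye(4, dtype=int), P])    # generator matrix G = [I P]
    H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity-check matrix H = [P^T I]

    u = np.array([1, 0, 1, 1])           # information message
    c = u @ G % 2                        # encoding: c = uG (mod 2)
    n = np.array([0, 0, 1, 0, 0, 0, 0])  # channel noise flips bit 2
    c_rx = (c + n) % 2                   # corrupted codeword c'
    s = c_rx @ H.T % 2                   # syndrome: s = c'H^T
    print("error detected:", bool(s.any()))  # True, since s is nonzero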
Slide 6: Review - Low Density Parity-Check (LDPC) Codes
- Application of iterative decoding in channel coding
- Gallager's thesis on LDPC codes (1963)
- Turbo codes (1993)
- Iterative decoding is actually an instance of Pearl's belief propagation algorithm.
Slide 7: Review - Low Density Parity-Check (LDPC) Codes
- LDPC codes are linear block codes with long block lengths and a special structure in their parity-check matrix H, which is
- low density
- short-cycle free
- With these features, iterative decoding can be applied to obtain good performance.
Slide 8: Review - Low Density Parity-Check (LDPC) Codes
- Bipartite graph with connections defined by the matrix H
- c: variable nodes
- the corrupted codeword
- s: check nodes
- the syndrome, which must be all zero for the decoder to claim no error
- Given the syndrome and the statistics of c, the LDPC decoder solves the equation
- cH^T = s
- in an iterative manner (a simple bit-flipping sketch follows)
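As a concrete, much-simplified instance of iterative decoding on this graph, here is a sketch of Gallager-style bit flipping (not the belief propagation decoder itself); the function and its parameters are hypothetical:

    import numpy as np

    def bit_flip_decode(H, c, s_target, max_iters=50):
        # Repeatedly flip the bit involved in the most unsatisfied checks
        # until cH^T equals the target syndrome s.
        c = c.copy()
        for _ in range(max_iters):
            unsatisfied = (c @ H.T + s_target) % 2  # 1 where a check fails
            if not unsatisfied.any():
                return c                            # all checks satisfied
            votes = unsatisfied @ H                 # failing checks touching each bit
            c[np.argmax(votes)] ^= 1                # flip the worst offender
        return c                                    # give up after max_iters

With the toy H and corrupted codeword c_rx from the previous sketch, bit_flip_decode(H, c_rx, np.zeros(3, dtype=int)) returns the transmitted codeword.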
Slide 9: Review - Low Density Parity-Check (LDPC) Codes
- The performance of LDPC codes is VERY good
- For the AWGN channel and block lengths of 10^6, an LDPC code approaching capacity within 0.06 dB has been obtained (Richardson, Urbanke, Chung)
- Decoding complexity is linear in the block length, O(n)
- Encoding complexity is substantial
- Preprocessing to obtain the non-low-density G from the low-density H (Gaussian elimination) requires O(n^2)
- Encoding requires O(n^2)
Slide 10: Low Density Generator Matrix (LDGM) Codes
- Goal: design a coding scheme with O(n) complexity for both encoding and decoding
- Question: can we use a low-density generator matrix to achieve near-Shannon performance?
Slide 11: Low Density Generator Matrix (LDGM) Codes
- Systematic linear block codes with a low-density generator matrix G = [I P] (see the encoding sketch below)
- u = (u1, ..., uk): systematic bits
- c = uP: parity bits
- LDGM codes are LDPC codes, since H = [P^T I] is also sparse
- Decoding can be performed in the same way as for LDPC codes, or using the matrix G (intuitive for source and joint source-channel coding)
- Given the syndromes and the statistics of u, the LDGM decoder solves the equation
- uP = c
- in an iterative manner.
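A minimal sketch of the linear-time systematic encoding, assuming a random sparse P built with SciPy; the sizes and row weight are hypothetical:

    import numpy as np
    from scipy import sparse

    rng = np.random.default_rng(0)

    def random_sparse_P(k, m, row_weight=3):
        # Illustrative low-density P (k x m): each row has about row_weight
        # ones, so computing all parity bits costs O(k * row_weight) = O(n).
        rows = np.repeat(np.arange(k), row_weight)
        cols = rng.integers(0, m, size=k * row_weight)
        data = np.ones(k * row_weight, dtype=np.int64)
        return sparse.csr_matrix((data, (rows, cols)), shape=(k, m))

    def ldgm_encode(u, P):
        # Systematic encoding with G = [I | P]: codeword = (u, uP mod 2).
        parity = (P.T @ u) % 2
        return np.concatenate([u, parity])

    k, m = 1000, 500                # hypothetical sizes, rate k/(k+m) = 2/3
    P = random_sparse_P(k, m)
    u = rng.integers(0, 2, size=k)
    codeword = ldgm_encode(u, P)    # length n = k + m = 1500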
Slide 12: Low Density Generator Matrix (LDGM) Codes
- LDGM codes have linear time complexity, O(n), in both encoding and decoding
- How is the performance?
Slide 13: Low Density Generator Matrix (LDGM) Codes
- As noticed by MacKay, LDGM codes are asymptotically bad (the error floor does not decrease with the block length)
- Solution: a concatenated scheme
Slide 14: Concatenated LDGM Codes for Channel Coding
For BER = 10^-5, the resulting performance is 0.8 dB from the theoretical limit, comparable to state-of-the-art coding schemes such as turbo codes or irregular LDPC codes.
Slide 15: Performance of Concatenated LDGM Codes in Channel Coding
- Performance very close to the theoretical limits
- Within 0.8 dB for AWGN
- Within 0.6 dB for BSC
- Within 1.3 dB for uncorrelated Rayleigh fading
with perfect channel side information at the
receiver
Slide 16: Correlated Sources - Problem of Interest
Application of turbo-like codes to achieve performance close to the theoretical limits, for single and correlated sources, in
- Source coding (compression)
- Joint source-channel coding (a compressible sequence transmitted through a noisy channel)
Slide 17: Correlated Sources - Practical Applications
Sensor networks: several sensors in a given environment receive correlated information. The sensors have very low complexity, do not communicate with each other, and send their information to a processing unit.
- Use of turbo-like codes (LDGM codes) to exploit the correlation, so that the transmitted energy necessary to achieve a given performance is reduced
- Data compression
- Joint source-channel coding
Slide 18: Joint Source-Channel Coding of Correlated Sources - General Problem
- Two correlated sources (U1, U2) ~ p(U1, U2)
- Ri: information rate for system i
[Block diagram: source 1 → encoder 1 → channel 1 at rate R1; source 2 → encoder 2 → channel 2 at rate R2; both channel outputs feed a single joint decoder]
- General framework, including the single source as a particular case
- Noiseless channel → source coding (compression)
- Noisy channel → joint source-channel coding
- Universal-like coding
- Sources S1 and S2 do not communicate with each other
- Correlation parameters unknown at the encoders → simple encoders
- In many occasions the correlation model can be estimated in the decoding process → complexity moves to the decoder
Slide 19: Joint Source-Channel Coding of Correlated Sources - Theoretical Limits
- Source coding: Slepian-Wolf achievable region
- Joint source-channel coding: the separation principle applies
- Why joint source-channel coding? The encoder is much simpler, with similar complexity at the decoder site
- A separated scheme can present error propagation between the source and channel decoders

Slide 20: Joint Source-Channel Coding of Correlated Sources - Rationale of Turbo-Like Codes
- Turbo-like codes are random-like codes; the theoretical limit (in both source and channel coding) is achieved by random coding
- Cover and Thomas
- Turbo-like codes are perfectly suited to exploit any type of side information: compression of correlated sources as a problem of channel coding with side information
- Wyner
- Shamai and Verdu
Slide 21: Source Coding of Correlated Sources - Equivalent Model as Channel Coding with Side Information
- Xs: source 1 → systematic bits
- Cx: compressed version → coded bits (noiseless)
- Y = Xs ⊕ e: source 2 → corrupted systematic bits
Slide 22: LDGM Codes for Source Coding of Correlated Sources - Correlation Model
- U2 = U1 ⊕ e, where e is the correlation vector
- Assumption: source U2 is perfectly known at the decoder → the same problem as channel coding, where e is the error vector
- The correlation/error vector e can be
- Binary Symmetric Channel, BSC (no memory; see the sketch below)
- Hidden Markov Model, HMM (with memory)
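A minimal sketch of the memoryless (BSC) correlation model; the sequence length and crossover probability are hypothetical:

    import numpy as np

    rng = np.random.default_rng(1)

    def correlated_pair_bsc(n, p):
        # U1: i.i.d. uniform bits; e: i.i.d. Bernoulli(p) correlation vector
        # (the memoryless / BSC case); U2 = U1 XOR e.
        u1 = rng.integers(0, 2, size=n)
        e = (rng.random(n) < p).astype(int)
        return u1, u1 ^ e

    u1, u2 = correlated_pair_bsc(10_000, p=0.1)
    print("empirical P(U1 != U2):", (u1 != u2).mean())  # close to 0.1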
Slide 23: LDGM Codes for Correlated Sources - Encoder
- Each source is independently encoded using a different LDGM code
- The information (compression) rate is set by choosing the number of parity bits
Source coding (data compression)
- Concatenation not necessary
Joint source-channel coding
- Concatenation required to reduce the error floor
Slide 24: LDGM Codes for Source Coding of Correlated Sources - Decoder
- Belief propagation over the graph representing the whole system
- INTUITIVE IDEA: in each iteration, modify the a priori probabilities of the bit nodes according to the information coming from the other source through the correlation model (a small numeric sketch follows)
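A small numeric sketch of this idea for the memoryless (BSC) correlation model; the function name and values are hypothetical:

    def updated_prior(p_other_is_1, p_corr):
        # A priori probability that this source's bit is 1, given the current
        # belief about the corresponding bit of the other source, under a BSC
        # correlation model with flip probability p_corr:
        #   P(u1 = 1) = P(u2 = 1)(1 - p_corr) + P(u2 = 0) p_corr
        return p_other_is_1 * (1 - p_corr) + (1 - p_other_is_1) * p_corr

    # If the other decoder currently believes its bit is 1 with probability
    # 0.9 and the sources disagree with probability 0.1:
    print(updated_prior(0.9, 0.1))  # 0.82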
Slide 25: LDGM Codes for Joint Source-Channel Coding of Correlated Sources - Decoder
- Concatenation is necessary to decrease the error floor
- Different scheduling possibilities lead to similar performance
Slide 26: LDGM Codes for Joint Source-Channel Coding of Correlated Sources - Decoder
- Schedule I (flooding): u1; c1,in; c1,out; u2; c2,in; c2,out
- Schedule II: u1; c1,in; c1,out; c1,in; c1,out; u1; u2; c2,in; c2,out; c2,in; c2,out; u2
- Schedule III: u1; c1,in; c1,out; u1; u2; c2,in; c2,out; u2
- Schedule IV: u1; c1,in; u1; c1,out; u1; u2; c2,in; u2; c2,out; u2
- Schedule V: c1,out; c2,out; u1; c1,in; u2; c2,in; u1; c1,out; u1; c1,out; u2; c2,out; u2; c2,out
Slide 27: Simulation Results for Source Coding - Correlation Defined by HMMs
- Source 1: i.i.d. binary sequence, P(0) = P(1) = 1/2
- U2 = U1 ⊕ e, where e is the correlation vector
- Correlation vector e: Hidden Markov Model (with memory)
- A = (aij), where aij is the probability of a transition from state i to state j
- B = (biv), where biv is the probability of emitting output v in state i
- HMMs can model complicated (unknown) correlations
- In order to achieve good performance, the statistical properties of e have to be exploited (a sampler sketch follows)
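A minimal sketch of how such a correlation vector e could be drawn from a binary-output HMM; the two-state model below is a hypothetical example, not one from the talk:

    import numpy as np

    rng = np.random.default_rng(2)

    def sample_hmm_correlation(n, A, B, s0=0):
        # Draw a length-n correlation vector e from a binary-output HMM:
        # A[i, j] = P(next state j | state i), B[i, v] = P(output v | state i).
        e = np.empty(n, dtype=int)
        s = s0
        for t in range(n):
            e[t] = rng.choice(2, p=B[s])    # emit a bit from the current state
            s = rng.choice(len(A), p=A[s])  # move to the next state
        return e

    # Hypothetical 2-state model: a quiet state emitting mostly 0s and a
    # bursty state emitting mostly 1s, so the errors e come in bursts (memory).
    A = np.array([[0.95, 0.05],
                  [0.20, 0.80]])
    B = np.array([[0.99, 0.01],
                  [0.30, 0.70]])
    e = sample_hmm_correlation(100_000, A, B)
    print("marginal P(e = 1):", e.mean())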
Slide 28: LDGM Codes for Source Coding of Correlated Sources - Simulation Results
- For a desired compression rate, different ratios a of known/unknown bits can be specified
- a ↑: more a priori knowledge vs. less compression potential
- a ↓: more compression potential vs. less a priori knowledge
- LDPC/LDGM codes with different pairs (Rc, a) are simulated, and the optimum distribution over (Rc, a) can be determined for the minimum compression rate
Slide 29: LDGM Codes for Source Coding of Correlated Sources - Simulation Results
Slide 30: LDGM Codes for Joint Source-Channel Coding of Correlated Sources - Simulation Results
- Performance of different activation schedules for correlation parameter p = 0.1
- Both AWGN and Rayleigh fading channels are considered
Slide 31: LDGM Codes for Joint Source-Channel Coding of Correlated Sources - Simulation Results
- Correlation model
- Source 1: i.i.d. binary sequence, P(0) = P(1) = 1/2
- Source 2: bits of source 1 are flipped with probability p
- Message length: 9,500
- Rate of each LDGM encoder = overall rate, Rc = 0.475
Slide 32: Novel Contributions
- Design of LDGM codes with good performance
- Theoretical analysis of error floor region
- Concatenated schemes
- Application of LDGM codes to source coding of correlated sources, with correlation having memory
- Application of LDGM codes to joint source-channel coding of correlated sources
- In all cases, performance very close to the theoretical limits, even without much code optimization
Slide 33: Future Work
- We will consider more complicated channel and source models
- We will develop optimization techniques for the design of irregular LDGM codes