1
Low Density Generator Matrix Codes
  • for Source and Channel Coding

Wei Zhong Department of Electrical and Computer
Engineering University of Delaware
PhD Dissertation Proposal 12/05/2005
2
The Communication Problem
Source 1
Destination
Communication Channel
Source 2
Source coding
Channel coding
Joint-Source-Channel Coding
  • The communication channel can be described by
    P(y|x)
  • At the source, source coding and channel coding
  • At the destination, source decoding and channel
    decoding
  • Does divide-and-conquer lead to the optimum?

3
Outline and Contributions
  • LDGM codes for channel coding
  • Comparable to the state-of-the-art
  • Serial and parallel structure
  • Code optimization
  • LDGM codes for distributed source coding (of
    multiple sources) when a noisy or noiseless
    channel is present
  • Hidden Markov Model correlated sources
  • Independent channels
  • Multi-Access channel
  • Future work

4
Publications
  • Journal publications
  • Channel coding: J. Garcia-Frias and W. Zhong,
    "Approaching Near Shannon Performance by
    Iterative Decoding of Linear Codes with
    Low-Density Generator Matrix," IEEE
    Communications Letters, vol. 7, no. 6,
    pp. 266-268, June 2003.
  • Source coding: J. Garcia-Frias and W. Zhong,
    "LDPC Codes for Compression of Multi-Terminal
    Sources with Hidden Markov Correlation," IEEE
    Communications Letters, vol. 7, pp. 115-117,
    March 2003.
  • Joint source-channel coding over independent
    channels: W. Zhong and J. Garcia-Frias, "LDGM
    Codes for Channel Coding and Joint Source-Channel
    Coding of Correlated Sources," EURASIP Journal on
    Applied Signal Processing, vol. 6, pp. 942-953,
    2005.
  • Joint source-channel coding over the multiple-access
    channel: Y. Zhao, W. Zhong, and J. Garcia-Frias,
    "Transmission of Correlated Senders over a
    Rayleigh Fading Multiple Access Channel," to
    appear in EURASIP Journal on Applied Signal
    Processing, special issue on distributed coding,
    2006.

5
Channel Coding
6
Review: Linear Block Codes
  • An (n, k) linear block code is determined by
  • Length of the information message, k
  • Length of the codeword, n
  • Generator matrix G
  • Parity-check matrix H

7
Review: Linear Block Codes
  • Let u = (u1, u2, ..., uk) be the information message
  • Encoding
  • Codeword c = uG, where G is the
    generator matrix
  • Channel
  • c' = c + n, where n is the channel noise
  • Decoding
  • s = c'H^T, where
  • s is the syndrome
  • H is the parity-check matrix
  • If s is the all-zero vector, claim no error
  • Otherwise, claim error and try to correct
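As a concrete (illustrative) instance of the encode/transmit/check cycle above, here is the (7,4) Hamming code in Python; the code choice, message, and error position are assumptions for the example, not from the slides:

```python
import numpy as np

# (7,4) Hamming code in systematic form G = [I P], H = [P^T I]
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])    # generator matrix, 4 x 7
H = np.hstack([P.T, np.eye(3, dtype=int)])  # parity-check matrix; G H^T = 0 mod 2

u = np.array([1, 0, 1, 1])                  # information message
c = u @ G % 2                               # codeword c = uG (arithmetic mod 2)

n = np.zeros(7, dtype=int); n[2] = 1        # channel flips one bit
r = (c + n) % 2                             # received word c' = c + n

s = r @ H.T % 2                             # syndrome s = c' H^T
print(bool(s.any()))                        # non-zero syndrome -> claim error
```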

8
Review: Low-Density Parity-Check Codes (LDPC
Codes)
  • LDPC codes are linear block codes with long block
    lengths and a special structure in their parity-check
    matrix H, which is
  • low density
  • short-cycle free
  • With the above features, iterative decoding can be
    applied to obtain good performance.

9
Review: Low-Density Parity-Check Codes (LDPC
Codes)
  • Applications of iterative decoding in channel
    coding
  • Gallager's thesis on LDPC codes (1963)
  • Turbo codes (1993)
  • Iterative decoding is actually an instance of
    Pearl's Belief Propagation algorithm.

10
Review: Low-Density Parity-Check Codes (LDPC
Codes)
  • Bipartite graph with connections defined by
    matrix H
  • c: variable nodes
  • corrupted codeword
  • s: check nodes
  • syndrome, must be all zero for the decoder to
    claim no error
  • Given the syndrome and the statistics of the
    noise, the LDPC decoder solves the equation
  • c'H^T = s
  • in an iterative manner.
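Full belief propagation is too long for a slide, but the iterative loop it performs on the graph of H — compute the syndrome, adjust the most suspect bits, repeat — can be illustrated with Gallager's simpler hard-decision bit-flipping decoder. This is a toy sketch, not the decoder used in the dissertation:

```python
import numpy as np

def bit_flip_decode(r, H, max_iters=20):
    """Hard-decision iterative decoding on the bipartite graph defined by H
    (a simplified stand-in for belief propagation)."""
    c = r.copy()
    for _ in range(max_iters):
        s = c @ H.T % 2                          # which parity checks fail
        if not s.any():
            return c                             # all-zero syndrome: claim no error
        fails = s @ H                            # per bit: failing checks it touches
        c = (c + (fails == fails.max())) % 2     # flip the most suspicious bit(s)
    return c

# (7,4) Hamming parity-check matrix H = [P^T I], used purely as an example
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
r = np.array([0, 0, 0, 1, 0, 0, 0])              # all-zero codeword, bit 3 flipped
print(bit_flip_decode(r, H))                     # recovers the all-zero codeword
```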

11
Review: Low-Density Parity-Check Codes (LDPC
Codes)
  • Performance of LDPC codes is VERY good
  • For AWGN and block lengths of 10^6, an LDPC code
    approaching capacity within 0.06 dB has been
    obtained (Richardson, Urbanke, Chung)
  • Decoding complexity is linear in time, O(n)
  • Encoding complexity is substantial
  • Encoding requires O(n^2) computation and storage

12
Low Density Generator Matrix Codes (LDGM Codes)
  • Goal: to design a coding scheme with O(n)
    complexity for both encoding and decoding
  • Question: can we use a low-density generator
    matrix to achieve near-Shannon performance?

13
Low Density Generator Matrix Codes (LDGM Codes)
  • Systematic linear block codes with a low-density
    generator matrix G = [I P]
  • u = u1...uk: systematic bits
  • c = uP: parity bits
  • LDGM codes are LDPC codes, since H = [P^T I] is
    also sparse
  • Decoding can be performed in the same way as for
    LDPC codes, or using matrix G (intuitive for source
    and joint source-channel coding)
  • Given the syndrome and the statistics of u, the
    LDGM decoder solves the equation
  • uP = c
  • in an iterative manner.
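Because each parity bit of an LDGM code depends on only a few information bits, encoding really is linear-time. A minimal sketch, with the sparse P stored as an adjacency list (sizes and the per-parity degree are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
k, m, d = 1000, 500, 3          # info bits, parity bits, degree (illustrative values)

# Low-density P stored as an adjacency list: parity bit j depends on d info bits.
P_cols = [rng.choice(k, size=d, replace=False) for _ in range(m)]

u = rng.integers(0, 2, size=k)                        # systematic bits
c = np.array([u[cols].sum() % 2 for cols in P_cols])  # parity bits c = uP, O(n) work
codeword = np.concatenate([u, c])                     # G = [I P] -> transmit [u | c]
```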

14
Previous work
  • LDPC
  • Gallager (1963): original work
  • MacKay (1995): re-discovery
  • LDGM
  • M. Luby (2001): erasure correction
  • R. McEliece (1996), L. Ping (1998), J. Moon
    (2001): high-rate applications

15
Low Density Generator Matrix Codes (LDGM Codes)
  • Low Density Generator Matrix codes have linear
    time complexity, O(n), in both encoding and decoding
  • What about the performance?

16
Low Density Generator Matrix codes
  • As noticed by MacKay, LDGM codes are
    asymptotically bad (error floor does not decrease
    with the block length)
  • Solution: concatenated schemes (serial and
    parallel)

17
Density Evolution Algorithm
  • Originated for channel coding using LDPC codes
  • Assume infinite block size
  • Tracks the asymptotic behavior of the iterative
    decoder
  • Objective: find the threshold of the code
    systematically instead of running extensive
    simulations
  • For LDGM codes, both the threshold and the error
    floor can be predicted by DE.

18
Density Evolution Algorithm for LDGM
  • Originally designed for LDPC codes
  • Modified DE for LDGM
  • Variable node
  • Parity node
  • The difference from LDPC is the existence of a
    channel message for the parity nodes

19
Density Evolution Algorithm, Single Code, BSC
  • DE predictions match the simulation results well
    (threshold and error floor)

20
Concatenated LDGM Codes for Channel Coding
For BER = 10^-5, the resulting performance is 0.8 dB
from the theoretical limit, comparable to
state-of-the-art coding schemes such as turbo codes
or irregular LDPC codes
21
Density Evolution Algorithm, Serial Concatenated
Code
  • Trade-off between convergence threshold and error
    floor

22
Performance of Concatenated LDGM Codes in Channel
Coding
  • Performance very close to the theoretical limits
  • Within 0.8 dB for AWGN
  • Within 0.6 dB for BSC
  • Within 1.3 dB for ergodic Rayleigh fading with
    channel side information at the receiver

23
Parallel Concatenated LDGM codes
  • Encoding diagram

k info bits
Encoder1: rate k/(k+m)
Encoder2: rate k/(k+n)
Overall rate k/(k+m+n)
  • Encoder1 uses the generator matrix G1 = [I P1]
  • Encoder2 uses the generator matrix G2 = [I P2]
  • The whole code can be considered as a single
    irregular code with Gwhole = [I P1 P2]
  • Intuitive design using parallel framework
  • 2nd encoder reduces error floor left by the 1st
    one
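The equivalence between the two-encoder parallel scheme and a single irregular code with Gwhole = [I P1 P2] can be checked numerically. The tiny dense matrices below are purely illustrative; a real LDGM P would be large and sparse:

```python
import numpy as np

rng = np.random.default_rng(1)
k, m, n = 8, 4, 4                      # tiny illustrative sizes

P1 = rng.integers(0, 2, size=(k, m))   # dense here only for readability
P2 = rng.integers(0, 2, size=(k, n))

u = rng.integers(0, 2, size=k)
c1 = u @ P1 % 2                        # Encoder1 parity, G1 = [I P1]
c2 = u @ P2 % 2                        # Encoder2 parity, G2 = [I P2]

# The parallel scheme is a single irregular code with Gwhole = [I P1 P2]:
G_whole = np.hstack([np.eye(k, dtype=int), P1, P2])
assert np.array_equal(u @ G_whole % 2, np.concatenate([u, c1, c2]))

print("overall rate:", k / (k + m + n))
```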

24
Parallel Concatenated LDGM Codes
  • Decoding exactly the same as single code

c1: the coded bits generated by the first
constituent encoder
c2: the coded bits generated by the second
constituent encoder
ul: the nodes corresponding to the systematic bits
25
DE Results for Parallel Concatenated LDGM, AWGN
  • Goal: to find a scheme that achieves the desired
    trade-off between threshold and error floor
  • The error floor depends mostly on the high-rate
    code, and always improves with the degree of that
    code
  • For low degrees in the low-rate code, the threshold
    degrades with the degree of the high-rate code
  • For high degrees in the low-rate code, the threshold
    is almost independent of the high-rate code

26
Density Evolution Algorithm, Parallel
Concatenated Code
27
Conclusion
  • LDGM codes for channel coding
  • Very good performance with relatively low
    complexity
  • Serial and parallel design
  • Performance analysis using density evolution
  • Code optimization

28
Source Coding
Joint-Source-Channel Coding over Independent
Channels
29
Correlated Sources: Problem of Interest
Application of turbo-like codes to achieve
performance close to the theoretical limits for
  • Source coding (compression)
  • Joint source-channel coding (compressible
    sequence transmitted through a noisy channel)
  • of single and correlated sources

30
Correlated Sources: Practical Applications
Sensor networks: several sensors in a given
environment receiving correlated information.
Sensors have very low complexity, do not
communicate with each other, and send information
to a processing unit
  • Use of turbo-like codes (LDGM codes) to exploit
    the correlation, so that the transmitted energy
    necessary to achieve a given performance is
    reduced
  • Data compression
  • Joint source-channel coding

31
Previous work
  • Distributed source coding
  • Garcia-Frias (2001): turbo codes
  • Xiong (2002): LDPC codes
  • Joint source-channel coding
  • Garcia-Frias (2001): turbo codes

32
Joint Source-Channel Coding of Correlated
Sources: General Problem
  • Two correlated sources (U1, U2) ~ p(U1, U2)
  • Ri: information rate for system i

R1
source 1
channel 1
encoder 1
decoder
R2
source 2
channel 2
encoder 2
  • General framework, including the single source as a
    particular case
  • Noiseless channel → source coding (compression)
  • Noisy channel → joint source-channel coding
  • Features
  • Sources S1 and S2 do not communicate with each
    other
  • Correlation parameters unknown at the encoders ⇒
    simple encoders
  • In many occasions the correlation model can be
    estimated in the decoding process ⇒ complexity in
    the decoding process

33
Joint Source-Channel Coding of Correlated
Sources: Theoretical Limits
Source coding: Slepian-Wolf achievable region
Joint source-channel coding: the separation
principle applies, Ri < C → Eb/N0 limit (Barros 2002)
  • Why joint source-channel coding?
  • Encoder much simpler; similar complexity at the
    decoder side
  • A separated scheme can present error propagation
    between the source and channel decoders
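For the binary correlation model used throughout (U2 = U1 xor e, Pr(e = 1) = p, uniform U1), the Slepian-Wolf region has a closed form, and a rate pair can be checked numerically. The function below is an illustrative sketch:

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def in_slepian_wolf(R1, R2, p):
    """Check (R1, R2) against the Slepian-Wolf region for two binary sources
    related by U2 = U1 xor e, Pr(e = 1) = p, with U1 uniform."""
    return (R1 >= h(p)                # R1 >= H(U1 | U2)
            and R2 >= h(p)            # R2 >= H(U2 | U1)
            and R1 + R2 >= 1 + h(p))  # R1 + R2 >= H(U1, U2)

print(in_slepian_wolf(0.8, 0.8, 0.1))   # h(0.1) ~ 0.469, so (0.8, 0.8) is inside
```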

34
Joint Source-Channel Coding of Correlated
Sources: Rationale for Turbo-Like Codes
  • Turbo-like codes ≈ random-like codes: the
    theoretical limit (in both source and channel
    coding) is achieved by random coding
  • Cover and Thomas
  • Turbo-like codes are perfectly suited to exploit
    any type of side information: compression of
    correlated sources as a problem of channel coding
    with side information
  • Wyner
  • Shamai and Verdu

35
Source Coding of Correlated Sources: Equivalent
Model as Channel Coding with Side Information
  • Xs: Source 1 → systematic bits
  • Cx: compressed version → coded bits (noiseless)
  • Y = Xs ⊕ e: Source 2 → corrupted systematic bits

36
LDGM Codes for Source Coding of Correlated
Sources: Correlation Model
  • U2 = U1 ⊕ e, e: correlation vector
  • Assumption: source U2 is perfectly known at the
    decoder → same problem as channel coding, where e
    is the error vector
  • The correlation/error vector e can follow
  • a Binary Symmetric Channel, BSC (no memory)
  • a Hidden Markov Model, HMM (with memory)

37
LDGM Codes for Correlated Sources: Encoder
  • Each source is independently encoded using a
    different LDGM code
  • The information (compression) rate is achieved by
    choosing the number of parity bits

Source coding (data compression)
  • Concatenation not necessary

Joint source-channel coding
  • Concatenation required to reduce the error floor

38
LDGM Codes for Source Coding of Correlated
Sources: Decoder
  • Belief propagation over the graph representing
    the whole system
  • INTUITIVE IDEA: in each iteration, modify the a
    priori probability of the bit nodes depending on
    the information coming from the other source

Correlation model
39
LDGM Codes for Joint Source-Channel Coding of
Correlated Sources: Decoder
  • Concatenation is necessary to decrease the error
    floor
  • Different scheduling possibilities lead to
    similar performance

40
LDGM Codes for Joint Source-Channel Coding of
Correlated Sources: Decoder
  • Schedule I (flooding): u1, c1,in, c1,out, u2,
    c2,in, c2,out
  • Schedule II: u1, c1,in, c1,out, c1,in, c1,out, u1,
    u2, c2,in, c2,out, c2,in, c2,out, u2
  • Schedule III: u1, c1,in, c1,out, u1, u2, c2,in,
    c2,out, u2
  • Schedule IV: u1, c1,in, u1, c1,out, u1, u2, c2,in,
    u2, c2,out, u2
  • Schedule V: c1,out, c2,out, u1, c1,in, u2, c2,in,
    u1, c1,out, u1, c1,out, u2, c2,out, u2, c2,out

41
Simulation Results for Source Coding: Correlation
Defined by HMMs
  • Source 1: i.i.d. binary sequence, P(0) = P(1) = 1/2
  • U2 = U1 ⊕ e, e: correlation vector
  • Correlation vector e: Hidden Markov Model (with
    memory)
  • A = (aij), aij: probability of a transition from
    state i to state j
  • B = (biv), biv: probability of emitting output v
    in state i
  • HMMs can model complicated (unknown) correlations
  • To achieve good performance, the statistical
    properties of e have to be exploited
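A correlation vector with HMM memory can be sampled directly from the (A, B) parameters. The 2-state values below are invented for illustration, not the dissertation's; state 1 is made "bursty" so that flips cluster:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 2-state HMM (assumed parameters, not from the slides)
A = np.array([[0.98, 0.02],     # a_ij: state transition probabilities
              [0.05, 0.95]])
B = np.array([[0.99, 0.01],     # b_iv: P(output v | state i); state 0 rarely flips,
              [0.70, 0.30]])    # state 1 flips often -> bursty correlation

L = 10000
state, e = 0, np.empty(L, dtype=int)
for t in range(L):
    e[t] = rng.choice(2, p=B[state])   # emit one bit of the correlation vector
    state = rng.choice(2, p=A[state])  # move to the next hidden state

u1 = rng.integers(0, 2, size=L)        # source 1: i.i.d., P(0) = P(1) = 1/2
u2 = u1 ^ e                            # source 2: U2 = U1 xor e
```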

42
Simulation Results for Source Coding: Correlation
Defined by HMM
  • Source 2 assumed perfectly known at the decoder
  • Different LDGM (X, Y) codes with K = 16,000

43
LDGM Codes for Joint Source-Channel Coding of
Correlated Sources: Simulation Results
  • Performance of different activation schedules for
    correlation parameter p = 0.1
  • Both AWGN and Rayleigh fading channels are
    considered.

44
LDGM Codes for Joint Source-Channel Coding of
Correlated Sources: Simulation Results
  • Correlation model
  • Source 1: i.i.d. binary sequence, P(0) = P(1) = 1/2
  • Source 2: bits of source 1 are flipped with
    probability p
  • Message length: 9,500
  • Rate of each LDGM encoder = overall rate,
    Rc = 0.475

45
Conclusion
  • LDGM codes for distributed source coding
  • Correlation described using a Hidden Markov Model
  • Very close to the source entropy
  • LDGM codes for joint source-channel coding over
    independent channels
  • Different decoding schedules
  • Very close to the theoretical limit

46
Joint-Source-Channel Coding over Multi-Access
Channel
47
Main Picture
S1 → coding
S2 → coding
...
Sn → coding
Joint decoder receives r = X1 + X2 + ... + Xn + N
What is the theoretical limit? How can it be
achieved? Still an open problem
The separation limit does not hold (codewords should
be correlated)
48
System Model
S1: ...1010110 → encoder
S2: ...0110111 → encoder
Joint decoder receives the sum plus noise N
  • S1, S2 are binary sequences
  • Correlated with an i.i.d. correlation vector e
    characterized by Pr(e = 1) = p, i.e. S1 differs
    from S2 with probability p
  • AWGN
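The system model above (two correlated binary senders, received only as a noisy sum) can be simulated directly. All numeric values here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
L, p, sigma = 10000, 0.05, 0.5       # length, flip probability, noise std (assumed)

s1 = rng.integers(0, 2, size=L)
e = (rng.random(L) < p).astype(int)  # i.i.d. correlation: Pr(e = 1) = p
s2 = s1 ^ e                          # S2 differs from S1 with probability p

x1 = 1 - 2 * s1                      # BPSK mapping 0 -> +1, 1 -> -1
x2 = 1 - 2 * s2
r = x1 + x2 + sigma * rng.standard_normal(L)  # receiver sees only the sum + noise
# When s1 == s2 the sum is +-2 (strong); when they differ it is 0 (erasure-like).
```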

49
Previous Work
  • Early work
  • MAC with arbitrarily correlated sources (T. M.
    Cover et al., 1980)
  • Separate source-channel coding is not optimum
  • Bounds, no closed form
  • Binary correlated sources
  • Turbo-like codes for correlated sources over the
    MAC (J. Garcia-Frias et al., 2003)
  • Turbo codes
  • Low Density Generator Matrix (LDGM) codes
  • Interleaver design, exploiting correlation in the
    systematic bits
  • Correlated sources and wireless channels (H. El
    Gamal et al., 2004)
  • LDGM codes
  • Not a pure MAC; independent links needed for a
    small fraction of the parity bits
  • Serial concatenated LDGM codes (J. Garcia-Frias
    et al., 2004)
  • Pure MAC
  • Good performance for highly correlated sources

50
Main Contributions
  • LDGM schemes achieving good performance over the
    whole correlation range, outperforming the
    separation limit
  • Serial concatenation
  • Interleaving: trade-off between threshold and
    error floor
  • Parallel concatenation
  • Interleaving: trade-off between threshold and
    error floor
  • Identification code for each sender
  • Combination of the serial and parallel schemes

51
Theoretical Limit Assuming Separation between
Source and Channel Coding
  • The theoretical limit is unknown
  • The separation limit is achieved by
  • Slepian-Wolf source coding + optimum channel
    coding
  • Ei: energy constraint for sender i (we assume
    E1 = E2)
  • Ri: information rate for sender i (we assume
    R1 = R2 = R/2)

52
Reducing Error Floors (Channel Coding)
Serial Concatenated Scheme
  • Gwhole = Gouter Ginner (outer code followed by
    inner code)
  • Using Gwhole directly → worse performance (cycle
    structure)
Parallel Concatenated Scheme
  • Gwhole = [I P1 P2]
  • An irregular LDGM code can be designed in an
    intuitive manner
53
LDGM Encoder for Correlated Senders over MAC:
Single LDGM Encoder per Sender
Sender 1: u1(1) ... uL(1) → LDGM Encoder
Sender 2: u1(2) ... uL(2) → LDGM Encoder
  • To exploit the correlation at the encoder side,
    each sender is encoded using the same LDGM code
  • (+1, +1) → twice the energy
  • (+1, -1) → erasure-like

54
LDGM Encoder for Correlated Senders over MAC:
Single LDGM Encoder per Sender
Information and parity bits of Sender 1 and
Sender 2 (figure)
  • Information bits are correlated by
    p = Pr(u1k ≠ u2k)
  • Parity bits are correlated by
    Pr(c1k ≠ c2k)

Parity bits are generated as

55
LDGM Encoder for Correlated Senders over MAC:
Drawback of the Single LDGM Encoder Scheme
  • Each sender is encoded with the same LDGM codebook
  • The decoder graph is completely symmetric
  • At the receiver, even if the decoder can recover
    the sum perfectly, there is no way to tell which
    bit corresponds to sender 1 and which to sender 2
  • Solution: introduce asymmetry in the decoding graph
  • Serial and parallel concatenated schemes with
    additional interleaved parity bits (asymmetry)
  • One bit acting as a sender ID

56
LDGM Encoder for Correlated Senders over MAC:
Serial Concatenated Scheme
Sender 1: Encoder 1 (Eouter → Einner)
Sender 2: Encoder 2 (Eouter → Einner → channel
interleaver)
  • Each sender is encoded by a serial concatenated
    LDGM code
  • Sender 2's sequence is scrambled by a special
    channel interleaver
  • Information bits are not interleaved (most of the
    correlation is preserved).
  • Inner coded bits are partially interleaved
    (trade-off between exploiting correlation and
    introducing asymmetry).
  • Outer coded bits are fully interleaved (little
    correlation; introduces asymmetry).

57
LDGM Decoder for Correlated Senders over MAC:
Serial Concatenated Scheme
  • Detailed message passing expressions can be
    obtained by applying Belief Propagation over the
    graph

58
LDGM Encoder for Correlated Senders over MAC:
Parallel Concatenated Scheme
Sender 1: parallel LDGM encoder
Sender 2: parallel LDGM encoder → channel
interleaver
  • Each sender is encoded by a parallel concatenated
    LDGM code
  • Channel interleaver
  • Interleaves a small portion of the parity bits
  • Preference given to parity bits with higher
    degrees (less correlation loss)
  • A different identification bit is assigned to
    each sender
  • Ambiguity at the bit level sometimes extends to
    the whole sequence (the decoder obtains both
    codewords but fails to assign them to the right
    senders)
  • Decoding is performed on the corresponding graph

59
Simulation Results: High Correlation, Parallel
Scheme
  • Information block size 50k
  • 10,560,120 parallel concatenated LDGM code
    with different interleaving ratios
  • ID for each sender
  • Error floor close to the theoretical lower bound
    (calculated in the paper)
  • Outperforms Shannon's separation limit by 1.3
    dB

60
Simulation Results: High Correlation, Serial
Scheme
  • Trade-off between error floor and threshold,
    driven by the fraction of interleaved inner parity
    bits
  • Information block size 10k, inner (8,4), outer
    (4,76), rate 0.32

61
Simulation Results: Low Correlation, Combo Scheme
  • Information block size 50k
  • Combination of parallel and serial concatenated
    codes
  • Parallel inner 2x,x10,10
  • Outer (4,76)
  • Outperforms Shannon's separation limit by 0.1
    dB

62
Ergodic Rayleigh Fading MAC
  • Consider Y = a1 x1 + a2 x2 + N, with a1 and a2
    the Rayleigh fading amplitudes
  • Similar coding scheme as for the AWGN-MAC
  • Knowledge of a1 and a2, i.e. Channel State
    Information (CSI), at the receiver
  • Without CSI at the receiver, the mean values of
    a1 and a2 are used
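The Rayleigh-fading MAC model Y = a1 x1 + a2 x2 + N can be sketched as follows. Block length, noise level, and unit-power fading are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
L, sigma = 10000, 0.5                       # illustrative block length and noise std

x1 = 1 - 2 * rng.integers(0, 2, size=L)     # BPSK symbols for each sender
x2 = 1 - 2 * rng.integers(0, 2, size=L)

# Rayleigh fading amplitudes normalized to E[a^2] = 1 (scale 1/sqrt(2))
a1 = rng.rayleigh(scale=1 / np.sqrt(2), size=L)
a2 = rng.rayleigh(scale=1 / np.sqrt(2), size=L)

y = a1 * x1 + a2 * x2 + sigma * rng.standard_normal(L)

# With CSI, the decoder is given (a1, a2); without CSI it substitutes the
# mean amplitude E[a] = scale * sqrt(pi/2) = sqrt(pi)/2 for this scale.
a_mean = np.sqrt(np.pi) / 2
```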

63
Simulation Results: High Correlation, Serial
Scheme
  • Using the same codes as in the AWGN-MAC case

64
Simulation Results: Low Correlation, Combo Scheme
  • Using the same codes as in the AWGN-MAC case

65
Conclusion
  • LDGM schemes outperforming divide-and-conquer!
  • AWGN-MAC: outperforms separation in both the low
    and high correlation scenarios
  • Rayleigh-MAC: outperforms separation in high
    correlation, very close in low correlation

66
Future Work
  • Extension of the previous work to many users (>2)
    requires good low-rate code design
  • Potential candidates
  • Repeat-LDGM codes
  • Repeat-Hadamard codes
  • with a high-rate LDGM code as the outer code

67
Repeat-Hadamard Codes
  • Use a high-rate LDGM code as the outer code
  • Coding rate 0.015 - 0.05