Transcript and Presenter's Notes

Title: Predictive Coding of Correlated Sources


1
Predictive Coding of Correlated Sources
  • Ertem Tuncel
  • University of California, Riverside
  • September 29, 2004

2
Coding of Correlated Sources
  • Sensors are preferably cheap, and therefore
    low-power.
  • It is more efficient if they directly
    communicate with the center, rather than with
    each other.
  • On the other hand, their measurements are not
    independent.
  • Can we make use of that correlation?

3
Sample Waveforms
[Figure: sample waveforms x_n and y_n plotted against the time index n]
4
Underlying Model
[Block diagram: white noise drives the filter A(z) to produce an AR process; x_n and y_n are observations of this process through two channels, each adding independent observation noise.]
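In equations, and assuming A(z) denotes the AR polynomial so that the synthesis filter is 1/A(z) (the symbols s_n, u_n, v_n are introduced here for readability):

  A(z) S(z) = W(z), i.e., s_n is an AR process driven by the white noise w_n,
  x_n = s_n + u_n,
  y_n = s_n + v_n,

where u_n and v_n are independent white Gaussian observation noises with relatively small variances.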
5
Example Used in This Work
  • First-order Gauss-Markov process
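As a quick sanity check of how correlated the two observations are, the model can be simulated in a few lines of Python; the AR coefficient 0.95 and noise level 0.1 below are placeholder values, not the ones used in the talk.

  import numpy as np

  # Simulate a first-order Gauss-Markov source observed by two sensors.
  # All parameter values here are illustrative assumptions.
  rng = np.random.default_rng(0)
  n, rho, sigma_obs = 10000, 0.95, 0.1

  s = np.zeros(n)
  w = rng.standard_normal(n)                   # white Gaussian innovation
  for i in range(1, n):
      s[i] = rho * s[i - 1] + w[i]             # s_n = rho * s_(n-1) + w_n

  x = s + sigma_obs * rng.standard_normal(n)   # sensor 1 observation x_n
  y = s + sigma_obs * rng.standard_normal(n)   # sensor 2 observation y_n
  print(np.corrcoef(x, y)[0, 1])               # close to 1: strong inter-correlation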

6
The Separate Coding Regime
[Block diagram: x_n and y_n are compressed by two separate encoders; a single decoder reconstructs both from the two bit streams.]
7
Distributed VQ
  • Collect N samples and encode jointly

8
Disadvantages
  • Though optimal, distributed VQ is problematic in the following aspects:
  • Prevalence of local minima, exacerbated by the required binning structure.
  • High design and codebook complexities: on the order of N·2^(N(R_X + R_Y)) scalars to be designed and stored.
  • Encoding complexity: optimal encoders are not based on nearest-neighbor mapping.

9
Outline of the Rest of the Talk
  • A family of scalar quantization schemes based on
    a simple binning technique.
  • Predictive coding
  • How to optimally exploit inter- and
    intra-correlation.
  • Experimental results
  • Suboptimality of the traditional prediction
    filter.
  • Conclusion and future work

10
Scalar Quantization
  • Independent quantization

11
A Simple Binning Technique
[Figure: example of the binning technique; the shown (x, y) pair is encoded as (0, 2).]
12
A Family of Codes
  • Choose three parameters W, N_X, and N_Y.
  • Uniformly quantize X and Y using W·N_X·N_Y intervals each. Enumerate the intervals from 0 to W·N_X·N_Y − 1.
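Below is a deliberately simplified, one-sided sketch of the binning idea: only X is binned (its cell index is sent modulo M) and the decoder resolves the ambiguity with an already available reconstruction of Y. The talk's construction bins both sources through W, N_X, and N_Y; that symmetric mapping is not reproduced here, and all numbers below are assumptions.

  import numpy as np

  # One-sided binning sketch: X is quantized into L uniform cells but only its
  # cell index modulo M is sent (log2(M) bits instead of log2(L)).  The decoder
  # picks the candidate cell closest to its reconstruction of Y.
  LO, HI, L, M = -4.0, 4.0, 32, 8
  delta = (HI - LO) / L

  def cell(v):
      # index of the uniform cell containing v, clipped to the support
      return int(np.clip((v - LO) // delta, 0, L - 1))

  def center(i):
      return LO + (i + 0.5) * delta

  def encode_x(x):
      return cell(x) % M                        # only the bin index is transmitted

  def decode_x(bin_index, y_hat):
      # candidate cells sharing this bin index are spaced M cells apart
      best = min(range(bin_index, L, M), key=lambda i: abs(center(i) - y_hat))
      return center(best)

  y_hat = 1.05                                  # decoder's reconstruction of Y
  print(decode_x(encode_x(0.98), y_hat))        # recovers the cell around 0.98

When the sources are strongly correlated, the candidate nearest to y_hat is the correct one, so 3 bits suffice where log2(32) = 5 would otherwise be needed.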

13
Meaning of the Parameters
14
Choosing the Parameters
Fix N_X and N_Y. W controls the rate-distortion point.
15
Choosing the Parameters
16
High-resolution R-D Performance
FACT: To minimize D_X + D_Y for a fixed total bit budget, one must spend additional bits to split each cell further so that D_X = D_Y.
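A standard high-resolution bit-allocation argument consistent with this fact; the constants c_X and c_Y (introduced here) absorb the source variances and quantizer figures of merit:

  D_X ≈ c_X 2^(-2 R_X),  D_Y ≈ c_Y 2^(-2 R_Y),  with R_X + R_Y fixed.
  Minimizing D_X + D_Y subject to the fixed bit budget (e.g., via a Lagrange multiplier on R_X + R_Y)
  forces c_X 2^(-2 R_X) = c_Y 2^(-2 R_Y), i.e., D_X = D_Y,
  so any extra bit should always go to whichever source currently has the larger cell distortion.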
17
An Optimality Statement
  • Consider the hypothetical scenario where x_n and y_n are both available at either encoder.
  • The encoder can then apply the KLT to de-correlate the two sources, followed by optimal bit allocation under the uniform quantization regime.
  • This gives the same asymptotic performance!
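A textbook two-dimensional instance of this, assuming the two sources have equal variance σ² and correlation coefficient ρ (symbols introduced here, not taken from the slides):

  Covariance matrix: K = σ² [1 ρ; ρ 1], with eigenvalues σ²(1 + ρ) and σ²(1 − ρ).
  KLT outputs: t_1 = (x + y)/√2 and t_2 = (x − y)/√2, which are uncorrelated.
  Bits are then allocated across t_1 and t_2 by the same distortion-equalizing rule as on the previous slide.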

18
Predictive Coding
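The diagram on this slide is not reproduced; as a stand-in, the sketch below shows a generic closed-loop predictive (DPCM-style) quantizer for a single sequence. The predictor order, coefficients, and quantizer are placeholders, not the filters designed later in the talk.

  import numpy as np

  # Generic closed-loop predictive (DPCM) coding of one sequence.
  def predictive_code(x, coeffs, quantize):
      xhat = np.zeros_like(x)
      for n in range(len(x)):
          # predict from previously reconstructed samples (closed loop)
          pred = sum(c * xhat[n - 1 - k] for k, c in enumerate(coeffs) if n - 1 - k >= 0)
          err = x[n] - pred                   # prediction error
          xhat[n] = pred + quantize(err)      # reconstruction = prediction + quantized error
      return xhat

  uniform_q = lambda e: 0.1 * np.round(e / 0.1)     # toy uniform quantizer, step 0.1
  rng = np.random.default_rng(1)
  x = np.zeros(200)
  for n in range(1, 200):                            # toy AR(1) test input
      x[n] = 0.95 * x[n - 1] + rng.standard_normal()
  print(np.mean((x - predictive_code(x, [0.95], uniform_q)) ** 2))   # small reconstruction MSE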
19
The Prediction Filter
  • Two extremes:
  • No prediction: it can be shown that for our example process, this maximizes the correlation, and hence the efficiency of binning.
  • Traditional prediction: minimize the prediction error variances of both x_n and y_n (see the worked one-tap example below).
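For the example process, the traditional one-tap predictor has a simple closed form. Writing σ_s² for the variance of the underlying Gauss-Markov source, σ_u² for the observation-noise variance, and ρ for the AR coefficient (symbols introduced here), the coefficient minimizing the one-step prediction error variance of x_n from x_(n-1) is

  a* = E[x_n x_(n-1)] / E[x_(n-1)²] = ρ σ_s² / (σ_s² + σ_u²).

As σ_u² → 0 this approaches ρ, while noisier observations shrink the predictor toward zero, i.e., toward the no-prediction extreme.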

20
Prediction Filter Design
  • Instead of fixing the prediction filters to minimize the prediction error variances, which relies on the accuracy of
  • we more accurately (and safely) choose, for each W and r, the maximum N_X·N_Y so that
  • and minimize over the filter coefficients.

21
Experimental Results
  • [Figure: rate-distortion results for a fixed observation-noise setting]
22
Experimental Results
23
Experimental Results
24
Discussion
  • When the SNR of the observed signals is 30 dB, the observed sequences are themselves almost first-order Gauss-Markov.
  • Yet, the optimal prediction is not first-order. More specifically, we observe up to 1.15 dB improvement over the traditional prediction filter when we apply second-order prediction.

25
Effect of an Out-of-synch Codec
26
Conclusions and Future Work
  • Preliminary experiments show the advantage of
    non-traditional prediction.
  • Non-uniform quantizer design and a corresponding high-resolution analysis are the future directions.
  • How do we optimally compand scalars separately?
  • Is uniform quantization optimal under entropy
    coding in this regime?