Field-Gathering Sensor Networks, Distributed Encoding and Oversampling

1
Field-Gathering Sensor Networks, Distributed
Encoding and Oversampling
  • David L. Neuhoff
  • Electrical Engineering and Computer Science
  • University of Michigan, Ann Arbor 48109
  • neuhoff@umich.edu
  • Canadian Workshop on Information Theory
  • May 2003
  • Presented By Junning Liu

2
Information Theory Basics
  • Entropy: a measure of uncertainty
  • "You should call it entropy, and for two reasons:
    first, the function is already in use in
    thermodynamics under that name; second, and more
    importantly, most people don't know what entropy
    really is, and if you use the word entropy in
    an argument, you will win every time!"
  • -- von Neumann to Claude Shannon, when prompted
    for a suitable term

3
Entropy
  • Discrete random variable X with a distribution p
    on N possible values
  • H(X) = Σx p(x) log2(1/p(x)) = -Σx p(x) log2 p(x),
    in bits
  • H(X) ≥ 0 for any distribution; H is a concave
    function of p
  • Information is measured by the entropy reduction
    from before we receive a symbol to after we
    receive the symbol
  • For a uniform distribution, H(X) = log2 N bits
  • For fixed N, H(X) achieves its maximum when p is the
    uniform distribution (checked numerically in the
    sketch below)
  • H(X) = Ep[log2(1/p(X))]: entropy can also be viewed
    as a self-referential expectation
  • Shannon showed that entropy is the ultimate
    compression limit

C. E. Shannon, "A Mathematical Theory of Communication,"
The Bell System Technical Journal, vol. 27,
pp. 379-423 and 623-656, 1948.
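
A small numerical check of these properties (not from the original slides; the distribution values below are arbitrary illustrations):

```python
import numpy as np

def entropy(p):
    """H(X) = sum_x p(x) log2(1/p(x)); zero-probability symbols contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * np.log2(1.0 / p)))

# Uniform distribution on N = 4 symbols attains the maximum H = log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
# Any non-uniform distribution on the same alphabet has strictly lower entropy.
print(entropy([0.7, 0.1, 0.1, 0.1]))       # about 1.357
```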
4
Entropy basics
  • Joint entropy
  • H(X,Y) = -Σx Σy p(x,y) log p(x,y)
  • Conditional entropy
  • H(Y|X) = Σx p(x) H(Y|X=x)
  •        = -Σx p(x) Σy p(y|x) log p(y|x)
  •        = -Σx Σy p(x,y) log p(y|x)
  • Chain rule
  • H(X,Y) = H(X) + H(Y|X)
           = H(Y) + H(X|Y)
  • H(X|Y) ≤ H(X)
  • H(X1, X2, ..., Xn) = Σi H(Xi |
    X1, X2, ..., Xi-1)
  • ≤ Σi H(Xi)
  • Mutual information
  • I(X;Y) = H(X) - H(X|Y)
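
These identities are easy to verify numerically; a minimal sketch with an arbitrary illustrative joint pmf (not from the slides):

```python
import numpy as np

def H(p):
    """Entropy in bits of a pmf given as an array of probabilities."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

pxy = np.array([[0.3, 0.1],     # joint pmf p(x, y); rows index x, columns index y
                [0.1, 0.5]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

H_XY = H(pxy)
H_Y_given_X = H_XY - H(px)      # chain rule: H(Y|X) = H(X,Y) - H(X)
H_X_given_Y = H_XY - H(py)
I_XY = H(px) - H_X_given_Y      # mutual information I(X;Y) = H(X) - H(X|Y)

print(abs(H_XY - (H(px) + H_Y_given_X)) < 1e-12)   # chain rule holds
print(H_X_given_Y <= H(px), I_XY >= 0)             # conditioning reduces entropy
```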

5
Field-Gathering With a Wireless Sensor Network
  • Example: measuring, conveying and reproducing a
    temperature field.

(figure: the temperature field over region G)
  • Other examples: pressure, moisture, vibration,
    light, sound, gas concentration, position, ...

6
Outline
  • Field-gathering sensor networks
  • Components of a field-gathering network
  • Capacity: network transport system
  • Compressibility: distributed source encoding
  • The efficiency of field-gathering networks and
    the scaling question.
  • The oversampling-quantization question
  • Back to the scaling question
  • Open problems, future work

7
Field-Gathering
  • Periodically take a snapshot of a 2-dim'l field
    X(u,v) in region G.
  • Convey it to a collector, who produces a
    reconstruction of the field for
    display, parameter estimation, object detection,
    recognition, tracking.
  • The quality of the reconstruction is measured by MSE.

Note: in the paper, the expectation is taken
inside the integral.
  • Minimize the resources, such as power, needed to
    collect snapshots at a given frequency to within
    a target MSE.
  • Alternatively, given the available resources and target
    MSE, maximize the frequency with which snapshots are
    conveyed to the collector.

8
Field-Gathering with a Wireless Sensor Network
  • N sensors with radios are uniformly deployed over
    the region G.
  • Sensor n ∈ {1, ..., N}
  • Measures field X(un,vn) at its location
    (un,vn)
  • Quantizes X(un,vn)
  • Encodes into bits
  • Sends its encoded bits to the collector

9
Observations
  • Field gathering is like image coding
  • Sensor measurements are pixel values
  • However
  • We don't simply count bits produced by encoders
  • Instead the cost of communication includes
    relaying.
  • Encoding must be distributed. No VQ or
    transform coding. No filtering before sampling

10
Question
  • To minimize resources, how densely should the
    sensors be deployed?
  • Sparsely? So as to reduce the number of sensors
    whose encoded data must be transmitted? (and
    reduce delay)
  • Densely? So as to increase the correlation
    between neighboring sensor values? (reduce
    communication energy)

11
The goal
  • Asymptotic behavior of the throughput as N goes
    to infinity with a fixed MSE requirement
  • Two parts
  • The Compressibility of the field
  • The many-to-one transport capacity
  • both as N → ∞

12
Outline
  • Field-gathering sensor networks
  • Components of a field-gathering network
  • Network transport system
  • Distributed source encoding
  • The efficiency of field-gathering networks and
    the scaling question.
  • The oversampling-quantization question
  • Back to the scaling question
  • Open problems, future work

13
Components of a Field-Gathering System
  • Questions
  • How many bits must each source encoder produce
    and send to the collector in order that the collector
    can create a reproduction with MSE ≤ D?  (bN)
  • How many bits can the network transport system
    convey from each source encoder to the collector?
    (cN)
  • Source encoder for each sensor
  • Quantizer
  • Lossless coder
  • Modem/codec for each sensor
  • Because we focus on the scaling question, we won't
    need to specify the form of modulation/coding
    (channel coding)
  • Network transport system
  • Routing protocol
  • Scheduling protocol (MAC)

14
Network Transport System
  • To study the scaling question, we adopt a framework
    like the protocol model in Gupta-Kumar (IT, 2000).
  • Time is slotted.
  • A sensor cannot receive and transmit
    simultaneously, nor can it receive simultaneously
    from more than one transmitter.
  • Modem/codec: W bits in each slot.
  • Depending on power P, there are ranges r1 < r2
    such that W bits are successfully transmitted
    from sensor m to sensor n iff m is within r1 of n
    and at least r2 from other transmitters
    (a toy check of this condition is sketched below).
  • Routing tree specifies how each sensor's data
    travels to the collector.
  • Media access control: the transmission schedule
    avoids conflicts.
  • Data is pipelined: bits describing the next snapshot
    begin to be sent before bits describing the present
    snapshot reach the collector.
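
A minimal sketch of the success condition in this protocol model, reading the interference condition as applying to the receiver n, as in the Gupta-Kumar protocol model (if it instead applies to the transmitter m, swap the roles in the second check). The node coordinates and ranges below are illustrative:

```python
import numpy as np

def transmission_succeeds(tx, rx, other_txs, r1, r2):
    """W bits go from tx to rx in a slot iff tx is within r1 of rx and every
    other simultaneous transmitter is at least r2 away from rx."""
    tx, rx = np.asarray(tx, float), np.asarray(rx, float)
    if np.linalg.norm(tx - rx) > r1:
        return False
    return all(np.linalg.norm(np.asarray(o, float) - rx) >= r2 for o in other_txs)

print(transmission_succeeds((0, 0), (1, 0), [(5, 0)], r1=1.5, r2=3.0))   # True
print(transmission_succeeds((0, 0), (1, 0), [(2, 0)], r1=1.5, r2=3.0))   # False: interference
```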

15
Example of Routing, Scheduling and Pipelining
16
New Result Many-To-One Transport Capacity
  • Consider wireless network with N nodes randomly
    deployed on a disk with collector at the center.
  • Protocol model as before.
  • Slotted time. Max transmission rate W
    bits/slot. Power P is subject to choice,
    inducing ranges r1 and r2.
  • Definition
  • cN = capacity of network with N nodes
  • = largest number c s.t. there exists a
    route from each node to the collector and a schedule
    s.t. each node conveys c bits/slot to the
    collector with high probability.
  • Theorem
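
The theorem statement was an image in the original slides; slide 17 quotes the result as cN = Θ(W/N). The sketch below is only the easy half of the intuition, not the paper's proof: the collector can take in at most W bits per slot, so the N sources can average at most W/N bits per slot each.

```python
import math

def per_node_upper_bound(N, W):
    """Collector bottleneck: at most W bits/slot enter the collector in total,
    so the average per-node rate cannot exceed W/N bits/slot."""
    return W / N

def slots_per_snapshot_lower_bound(N, bits_per_node, W):
    """Every originating bit must cross the final hop into the collector,
    which carries at most W bits per slot."""
    return math.ceil(N * bits_per_node / W)

for N in (10, 100, 1000):
    print(N, per_node_upper_bound(N, W=1000), slots_per_snapshot_lower_bound(N, 64, 1000))
```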

17
Components of a Field-Gathering System
  • Questions
  • How many bits must each source encoder produce
    and send to the collector in order that the collector
    can create a reproduction with MSE ≤ D?  (bN = ??)
  • How many bits can the network transport system
    convey from each source encoder to the collector?
    (cN = Θ(W/N))
  • Source encoder for each sensor
  • Quantizer
  • Lossless coder
  • Modem/codec for each sensor
  • Because we focus on the scaling question, we won't
    need to specify the form of modulation/coding.
  • Network transport system
  • Routing protocol
  • Media access control (MAC): scheduling

18
Source Encoder for nth Sensor
  • Scalar quantizer: Xn = X(un, vn) → In = q(Xn)
  • q is a uniform scalar quantizer with step size Δ
    (same for all n)
  • q(x) = integer index of the quantization cell in
    which x lies (a toy sketch follows below)
  • Lossless encoder: In → bn bits
  • Encoding can be conditional -- bn can depend
    on past indices from the same sensor or on
    indices received from other sensors.
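
A toy implementation of this encoder front end (the step size below is an arbitrary illustrative value; the lossless-coding stage is omitted):

```python
import math

STEP = 0.5   # quantizer step size (illustrative)

def q(x, step=STEP):
    """Uniform scalar quantizer: integer index of the cell containing x."""
    return math.floor(x / step)

def midpoint(index, step=STEP):
    """Midpoint reconstruction of the indexed cell (used later at the collector)."""
    return (index + 0.5) * step

x = 1.37                      # a sensor's field measurement X(un, vn)
i = q(x)                      # the index In that gets losslessly encoded
print(i, midpoint(i), abs(x - midpoint(i)) <= STEP / 2)   # 2 1.25 True
```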

19
Quantization example
  • Courtesy of Sorour Falahati

(figure: a sample waveform x(t), sampled every Ts seconds and mapped to 3-bit
PCM codewords; the quantizer levels are listed below)

  PCM codeword   amplitude
  111             3.1867
  110             2.2762
  101             1.3657
  100             0.4552
  011            -0.4552
  010            -1.3657
  001            -2.2762
  000            -3.1867

  Resulting PCM sequence: 110 110 111 110 100 010 011 100 100 011
20
Quantization error
  • Courtesy of Sorour Falahati
  • Quantization error
  • Granular (or linear) errors happen for inputs
    within the dynamic range of the quantizer
  • Saturation errors happen for inputs outside the
    dynamic range of the quantizer
  • Saturation errors are larger than linear errors
  • Saturation errors can be avoided by proper tuning
    of the AGC (automatic gain control)
  • Quantization noise variance ≈ Δ²/12 for a uniform
    quantizer with step size Δ (granular region)

R. M. Gray and D. L. Neuhoff, "Quantization,"
IEEE Trans. Inform. Theory, vol. 44, no. 6,
pp. 2325-2383, Oct. 1998.
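
A quick Monte Carlo check of the Δ²/12 granular-noise approximation quoted above (uniformly distributed inputs well inside the dynamic range, so no saturation occurs):

```python
import numpy as np

rng = np.random.default_rng(0)
step = 0.1
x = rng.uniform(-1.0, 1.0, size=200_000)          # inputs inside the dynamic range
xq = (np.floor(x / step) + 0.5) * step            # quantize, then midpoint-reconstruct
print(np.mean((x - xq) ** 2), step ** 2 / 12)     # both close to 8.33e-4
```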
21
Reconstruction and MSE
When sensors are dense (N is large)
  • where Q(x) denotes the centroid of the cell in
    which x lies.
  • That is, when sensors are dense, interpolation
    error is negligible, and MSE approaches a
    constant determined by the quantizer.
  • The quantizer's resolution is bounded below, so
    before lossless coding the total number of bits for
    I1, I2, ..., IN grows without bound as N → ∞.

22
Three Types of Lossless Coding
  • Independent encoding and decoding
  • Conditional encoding and decoding (also called
    explicit entropy encoding)
  • Slepian-Wolf -- independent/distributed
    encoding, conditional decoding

23
Independent Lossless Encoding
  • Each sensor encodes its quantization index
    independently of others.
  • Number of originating bits to transport to
    collector per snapshot
  • BN = H(I1) + ... + H(IN)
  • which is the same for all routing trees.
  • We focus on originating bits bn rather than
    total bits Bn because the capacity of the
    network transport system counts the number of
    originating bits that can be conveyed to the
    collector.
  • Also, the extra bits due to relaying are
    designed to reduce power, and we are not yet
    ready to take power into account.

24
Conditional Encoding and Decoding
  • Each sensor encodes its quantization index
    conditioned on quantization indices it has
    already received from descendants in the routing
    tree.
  • Decoder conditionally decodes.
  • Number of originating bits to transport per
    snapshot
  • BN is minimized by a linear routing tree, in
    which case
  • BN = H(I1) + H(I2|I1) + ... + H(IN|I1,...,IN-1)
       = H(I1,...,IN)

25
Slepian-Wolf -- Independent/Distributed
Encoding, Conditional Decoding
  • Each sensor encodes its indices without knowing
    other sensor indices, but with the assumption
    that the decoder will know other sensor indices
    at the time of decoding.
  • Choose an arbitrary ordering of the sensors.
  • Sensor n encodes In assuming the decoder knows
    I1,...,In-1.
  • Number of originating bits to transport per
    snapshot
  • BN = H(I1) + H(I2|I1) + ... + H(IN|I1,...,IN-1)
       = H(I1,...,IN)
  • Block coding is required. Apply it to a block of
    indices from one sensor. Assume successive
    snapshots are independent.
  • BN is independent of the ordering of the sensors
    and the choice of routes.
  • BN for S-W ≤ BN for the two previous methods
    (see the numerical illustration below).
  • S-W coding can be structured so that the same number
    of bits is produced by each encoder: bN = BN/N
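
A numerical illustration of why the joint entropy matters (the correlation model below is an invented toy, not the field model of the talk): for two highly correlated sensor readings, the Slepian-Wolf rate H(I1, I2) is far below the independent-coding rate H(I1) + H(I2), even though each sensor encodes without seeing the other's index.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
step, n = 0.25, 500_000

common = rng.normal(size=n)                 # a shared field value seen by both sensors
x1 = common + 0.05 * rng.normal(size=n)     # sensor 1's noisy reading
x2 = common + 0.05 * rng.normal(size=n)     # sensor 2's noisy reading
i1 = np.floor(x1 / step).astype(int)        # quantization indices I1, I2
i2 = np.floor(x2 / step).astype(int)

def emp_entropy(samples):
    """Plug-in (empirical) entropy estimate in bits."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

H1, H2 = emp_entropy(i1.tolist()), emp_entropy(i2.tolist())
H12 = emp_entropy(list(zip(i1.tolist(), i2.tolist())))
print(H1 + H2, H12)   # independent coding needs ~H1+H2; Slepian-Wolf needs only ~H12
```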

26
Summary of Lossless Coding --- Minimum Number of
Originating Bits
  • BN = H(I1,...,IN)
  • Attained by Slepian-Wolf coding,
  • and sometimes also by conditional coding.

27
Outline
  • Field-gathering sensor networks
  • Components of a field-gathering network
  • Network transport system
  • Distributed source encoding
  • The efficiency of field-gathering networks and
    the scaling question.
  • The oversampling-quantization question
  • Back to the scaling question
  • Open problems, future work

28
Efficiency of a Field-Gathering Network
  • As a measure of the resources required by a
    field-gathering network in conveying snapshots,
    define
  • usage rate U = network slots per snapshot
  • (Alternatively, throughput = 1/U
    snapshots per slot.)
  • Given D, W and N, we wish to find the
    transmission power P, routing tree, and
    schedule that yield the minimum usage rate,
    denoted UN, at which MSE ≤ D is attained.
  • With the Slepian-Wolf coding and the network transport
    discussed earlier (see the reconstruction below)
  • Notice the Shannon-style separation between
    distributed source coding and the network
    transport system.
  • No claim of optimality, but separation seems to
    be useful.
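
The usage-rate formula on this slide is an image in the original transcript. A plausible reconstruction from the quantities already defined (bN = BN/N originating bits per sensor per snapshot, cN = Θ(W/N) bits per sensor per slot) is

  UN = bN / cN = (BN/N) / Θ(W/N) = Θ(BN / W) slots per snapshot,

so UN grows without bound exactly when BN = H(I1,...,IN) does, which is what links the oversampling-quantization question back to the scaling question.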

29
The Scaling Question
  • What happens to UN as N → ∞?

The Oversampling-Quantization Question: What
happens to BN as N → ∞?
30
Outline
  • Field-gathering sensor networks
  • Components of a field-gathering network
  • Network transport system
  • Distributed source encoding
  • The efficiency of field-gathering networks and
    the scaling question.
  • The oversampling-quantization question
  • Back to the scaling question
  • Open problems, future work

31
The Oversampling and Quantization Question in
One Dimension
(figure: sampling → scalar quantization → entropy coding, producing rate R(f))
  • Consider sampling, quantizing and entropy coding
    a one-dim'l continuous-time, stationary random
    process X(t).
  • Oversampling-Quantization Question: with the
    scalar quantizer fixed, what happens to the
    encoding rate R(f) (bits/sec) as the sampling rate
    f → ∞?

32
Most Relevant Case
  • X(t) is defined only on the unit time interval
    t ∈ [0,1]
  • Take N samples X1,...,XN (sampling rate
    f = N samples/sec)
  • Quantize X1,...,XN to indices I1,...,IN
  • Lossless entropy coding produces
  • R(f) = H(I1,...,IN) = f · H(I1,...,IN)/N
    bits/sec
  • Oversampling-Quantization Question
  • What happens to H(I1,...,IN) and H(I1,...,IN)/N
    as N → ∞?
  • Good news -- Theorem 1: H(I1,...,IN)/N → 0 as
    N → ∞ (an empirical illustration follows below)
  • Bad news -- Theorem 2: H(I1,...,IN) → ∞ as N → ∞
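
An empirical illustration of the good-news direction only, under an assumed first-order Gauss-Markov model for X(t) (not the general setting of the theorems): as the sampling rate N grows, adjacent samples become nearly identical, so the estimated conditional entropy H(I2|I1), which by stationarity bounds the per-sample rate via H(I1,...,IN)/N ≤ H(I1)/N + H(I2|I1), shrinks toward zero. The bad-news divergence of the joint entropy is what the threshold-crossing argument on the following slides establishes.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)
step, n = 0.5, 400_000   # fixed quantizer step; Monte Carlo pairs per sampling rate

def emp_entropy(samples):
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def H_I2_given_I1(N):
    """Estimate H(I2|I1) for adjacent samples of a unit-variance Gauss-Markov
    process sampled at rate N, i.e. adjacent correlation rho = exp(-1/N)."""
    rho = np.exp(-1.0 / N)
    x1 = rng.normal(size=n)
    x2 = rho * x1 + np.sqrt(1.0 - rho**2) * rng.normal(size=n)
    i1 = np.floor(x1 / step).astype(int)
    i2 = np.floor(x2 / step).astype(int)
    return emp_entropy(list(zip(i1.tolist(), i2.tolist()))) - emp_entropy(i1.tolist())

for N in (4, 16, 64, 256):
    print(N, round(H_I2_given_I1(N), 3))   # shrinks toward 0 as N grows
```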

33
Theorem 1
Assume X(t) is stationary. Then
  • Proof

34
Theorem 2
  • Assume
  • X(t) is a stationary random process with
  • Pr( constant sample functions ) < 1,
  • The quantizer has a threshold t such that
  • Pr( X(t) crosses threshold t in the time interval
    [0,1] ) > 0
  • Then
  • H(I1,...,IN) → ∞ as N → ∞.

35
Key Observation¹
  • T = time of the first threshold crossing in [0,1];
    T = 1 if no crossing.
  • H(T) = ∞, since T is a mixed random
    variable.
  • From the quantizer indices I1,...,IN, we find an
    estimate TN of T s.t. E(T - TN)² → 0 (the condition
    used in the lemma on the next slide).
  • It follows that H(TN) → ∞.
  • Theorem 2 follows:
  • H(I1,...,IN) ≥ H(TN) → ∞.

¹ Courtesy of Bruce Hajek
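
A toy version of this estimator (the exact bound relating TN to T is omitted in the transcript; the deterministic ramp below is only a sanity check, not the stochastic setting of the theorem):

```python
import numpy as np

def crossing_time_estimate(indices, k):
    """TN built only from the quantizer indices: the first sample time at which
    the index moves to the other side of quantizer boundary k, or 1.0 if
    no crossing is visible."""
    indices = np.asarray(indices)
    side = indices >= k
    changed = np.nonzero(side != side[0])[0]
    N = len(indices)
    return (changed[0] + 1) / N if changed.size else 1.0

# Sanity check: X(t) = 0.5 + t crosses the boundary at 1.0 (cells of width 0.5) at t = 0.5.
step, N = 0.5, 20
t = np.arange(1, N + 1) / N
I = np.floor((0.5 + t) / step).astype(int)
print(crossing_time_estimate(I, k=2))   # 0.5, within 1/N of the true crossing time
```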
36
Proof that H(TN) → ∞
  • Lemma: If H(T) = ∞ and E(T - TN)² → 0 as
    N → ∞, then H(TN) → ∞
  • Proof: Consider the rate-distortion function of
    T w.r.t. MSE

Note that RT(D) → ∞ as D → 0. Let
qN(u|t) be defined by T, TN. Then H(TN) ≥
I(T;TN) ≥ RT( E(T - TN)² ) → ∞
as N → ∞, because E(T - TN)² → 0.
T. Berger and J. D. Gibson, "Lossy Source
Coding," IEEE Trans. Inform. Theory, vol. 44, no. 6,
pp. 2693-2723, Oct. 1998.
37
Compare SQEC to Ideal R-D Coding
  • Replace scalar quantizer and entropy coder with
    ideal rate-distortion coder with same distortion
    D as scalar quantizer.
  • This produces f · R(D) bits/sec
  • where R(D) is the rate-distortion function (w.r.t.
    MSE), in bits/sample, of the discrete-time source
    X1, X2, ...

38
At What Rate Does H(I1,...,IN) → ∞?
  • Theorem
  • For stationary Gaussian X(t) with autocorrelation
    function RX(t):

Examples
39
Theorem 2 -- Entropy Rate
  • Suppose now that R(f) = f · H∞(I) bits/sec
  • This is smaller than before: with f = N,
    f · H∞(I) ≤ f · H(I1,...,IN)/N
  • Theorem 2: Under the same assumptions as before,
  • R(f) = f · H∞(I) → ∞

40
Comments
  • VQ vs. scalar independent quantization
  • Why is the latter unable to bound BN?
  • Independent scalar quantization is too strong a
    constraint: it prevents the sensors from jointly
    quantizing the field in a lossy way. In
    particular, it prevents the sensors from reducing
    the spatial-resolution redundancy in the data.

41
Outline
  • Field-gathering sensor networks
  • Components of a field-gathering network
  • Network transport system
  • Distributed source encoding
  • The efficiency of field-gathering networks and
    the scaling question.
  • The oversampling-quantization question
  • Back to the scaling question
  • Open problems, future work

42
The Scaling Question
  • What happens to UN as N → ∞?

43
The Optimal Sensor Density
(figure: usage rate UN versus the number of sensors N)
  • There is an optimal number of sensors.
  • Having too many sensors is, unfortunately,
    disadvantageous.
  • However, you can put some sensors to sleep, while
    continuing to use their radios.
  • In this case only the optimal number of sensors is
    active; the rest sleep.

44
Summary - Field Gathering Sensor Networks
  • The network transport system can deliver Θ(W/N)
    bits per slot from each node to the collector.
  • Scalar quantization plus distributed lossless
    coding can represent the source to MSE ≤ D with
    BN = H(I1,...,IN) bits/unit area.
  • BN = H(I1,...,IN) → ∞ as N → ∞ (with
    D fixed)
  • Sensor network usage UN therefore also grows
    without bound.
  • Conclusion: excessive density (oversampling) is
    not good.
  • Use the proper density, or put some sensors to
    sleep, i.e., subsample.

45
Open Questions, Future Work
  • How to take power into account?
  • Replace the protocol model with multiuser detection,
    multiple-access coding/modulation.
  • The usual model Preceive = c · Ptransmit / d^α
    is inaccurate when sensors are dense.
  • Might there be interpolation methods that make D
    go to zero as the sampling rate increases with a
    fixed scalar quantizer?
  • Cvetkovic-Daubechies (DCC 2000) have demonstrated
    a method for bandlimited deterministic signals.
    But it uses dithering, and the period of the
    dither grows with the sampling rate. Can it be done
    for nonbandlimited signals? Nondeterministic
    signals? With a dither period that does not
    increase?
  • Turning fundamental limits into practice?

46
Discussions
  • Finding the optimal density
  • Slepian-Wolf encoding, explicit entropy encoding,
    and vector quantization all have practical complexity
    issues. Are there simple, distributed, adaptive
    schemes that are also asymptotically optimal?
  • Other performance factors
  • Many constrained optimization problems