1. http://www.basics.eecs.berkeley.edu
2. Towards a System Theory for Robust Large-Scale Sensor Networks
NSF Sensors (Ramchandran, Sastry, Tse, Vetterli, Poolla)
3. Sensor networks: a systems view
Systems tasks
- Data acquisition
- Distributed compression and communication
- Networking and routing
- Distributed inference and decision (classification / estimation)
- Closing the loop (control)
Guiding principles
- Statistical models for sensor-fields
- Scaling laws for dense networks
- Information and coding theory
- Learning theory and adaptive signal processing
4. Distributed SP (DSP): low-hanging fruit
- Revisit many classical SP problems (estimation, inference, detection, fusion) under the constraints of:
  - bandwidth (compression)
  - a noisy transmission medium (coding, MAC)
  - total system energy (communication + processing)
  - highly unreliable system components (robust design)
- Voila! You get a distributed signal processing recipe!
- Constraints force robust distributed solutions: sampling, processing, routing, compressing, coding, controlling.
- Architectures should reflect and exploit computational diversity in wireless devices (TVs, cell phones, laptops, cheap sensors)
- Asymmetric complexities
- In-built robustness: fault-tolerant designs
- Diversity in representation and communication
- Rehaul deterministic frameworks (e.g. prediction-based) with probabilistic ones
5. Sampling sensor fields
- Many physical signals, e.g. pressure and temperature, are approximately bandlimited (BL)
- Physical propagation laws often provide a natural smoothing effect
- A/D converters (sensors)
- Sensor network constraints:
  - Low-precision A/D
  - Limited power and bandwidth
Sampling a 1-D spatio-temporal field
[Figure: samples of the field taken at spatial spacings X and 2X and at time instants T, 2T, 3T]
6. Motivation: acquisition and reconstruction of sensor fields
- Is there an information scaling law?
- Gupta-Kumar '00: in ad-hoc networks with independent data sources, throughput per sensor goes to 0 as 1/sqrt(N) (illustrated in the sketch after this list).
- In sensor nets, data correlation increases with density.
- Can the information rate per sensor and the reconstruction distortion both go to zero with density?
- What are the tradeoffs between sensor precision and the number of sensors?
- Can we overcome low-precision sensors by throwing scale at the problem?
- Is there an underlying conservation-of-bits principle?
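The Gupta-Kumar scaling quoted above can be made concrete with a few numbers. The short sketch below only traces the 1/sqrt(N) trend; the constant c = 1 (and the extra logarithmic factor that appears in some versions of the result) is an illustrative assumption, not something taken from the slides.

```python
# Illustrative only: per-sensor throughput decaying as 1/sqrt(N) for
# independent data sources (Gupta-Kumar '00). The constant c = 1 is an
# arbitrary assumption; only the trend matters here.
import math

c = 1.0
for n in (10, 100, 1_000, 10_000, 100_000):
    per_sensor = c / math.sqrt(n)
    print(f"N = {n:>6}: throughput/sensor ~ {per_sensor:.4f}, "
          f"aggregate ~ {n * per_sensor:.1f}")
```

Aggregate network throughput still grows with N, but each individual sensor's share vanishes, which is what motivates exploiting the correlation that density brings.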
7. Sensor-field reconstruction: distributed sampling theory
- Conservation-of-bits principle: we can trade off A/D precision for oversampling rate (quality ∝ bits per Nyquist interval).
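As a rough numerical companion to this principle, the sketch below quantizes an oversampled bandlimited signal with a dithered uniform A/D and reconstructs it by projecting back onto the signal band. The signal model, the dither, and the FFT-projection reconstruction are assumptions made purely for illustration; they are not the distributed sampling theory itself, but they show coarse A/D precision being traded against oversampling rate.

```python
# A toy experiment in the spirit of the conservation-of-bits principle:
# trade A/D precision (bits) against oversampling rate and compare the
# reconstruction MSE. Signal model and reconstruction are illustrative
# assumptions, not the actual distributed sampling theory.
import numpy as np

rng = np.random.default_rng(0)

def bandlimited_signal(n, k=8):
    """Random real signal whose spectrum occupies only the first k bins."""
    spec = np.zeros(n // 2 + 1, dtype=complex)
    spec[1:k + 1] = rng.normal(size=k) + 1j * rng.normal(size=k)
    x = np.fft.irfft(spec, n)
    return x / np.max(np.abs(x))

def quantize(x, bits):
    """Uniform quantizer on [-1, 1] with the given precision."""
    step = 2.0 / (2 ** bits)
    return np.clip(np.round(x / step) * step, -1.0, 1.0)

def reconstruction_mse(bits, oversample, n0=256, k=8):
    n = n0 * oversample
    x = bandlimited_signal(n, k)
    dither = rng.uniform(-0.5, 0.5, n) * (2.0 / 2 ** bits)
    q = quantize(x + dither, bits)
    spec = np.fft.rfft(q)
    spec[k + 1:] = 0                     # project back onto the signal band
    return np.mean((x - np.fft.irfft(spec, n)) ** 2)

for bits, m in [(8, 1), (4, 1), (4, 16), (2, 64)]:
    print(f"{bits}-bit A/D, {m:>2}x oversampling: MSE = "
          f"{reconstruction_mse(bits, m):.2e}")
```

The lines to compare are the coarse, heavily oversampled configurations against the finer converter at the base rate: in this toy setting, density can stand in for precision, which is the trade the principle captures.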
8. Overcoming unreliable radios
- Narrowband radios
  - Simple, used by all sensor nodes today: Motes, PicoRadio, Ember, SmartDust
- How to get f_carrier?
  - Crystal oscillator (precise but expensive)
  - MEMS resonator (less precise, less expensive)
  - On-chip LC resonator (cheap, low-power, imprecise)
9. Distributed compression
Dense, low-power sensor networks
- The encoder needs to compress the source X.
- The decoder has access to correlated side information Y.
- Can we compress X down to H(X|Y), even though the encoder never sees Y?
- We can design a practical distributed source coding framework that approaches this (a minimal coset-binning sketch follows).
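As a minimal, self-contained illustration of how the decoder's side information can be exploited without the encoder ever seeing it, here is the classic 3-bit coset-binning toy. The correlation model, namely that X and Y differ in at most one bit, is an assumption chosen to keep the example tiny.

```python
# Toy coset binning (Slepian-Wolf / DISCUS-style): X is a 3-bit reading,
# the decoder's side information Y differs from X in at most one bit.
# The encoder sends only the 2-bit coset index of X with respect to the
# repetition code {000, 111}; the decoder returns the coset member
# closest to Y. This is an illustrative sketch, not the group's codec.
from itertools import product

def coset_index(x):
    # The four cosets of {000, 111} are labelled by the parity checks
    # x0 ^ x2 and x1 ^ x2.
    return (x[0] ^ x[2], x[1] ^ x[2])

def encode(x):
    return coset_index(x)                      # 2 bits instead of 3

def decode(syndrome, y):
    coset = [x for x in product((0, 1), repeat=3) if coset_index(x) == syndrome]
    # Each coset holds two words that differ in all three positions, so the
    # one within Hamming distance 1 of Y is unique.
    return min(coset, key=lambda x: sum(a != b for a, b in zip(x, y)))

# Exhaustive check over every X and every Y within Hamming distance 1 of X.
for x in product((0, 1), repeat=3):
    for flip in (None, 0, 1, 2):
        y = list(x)
        if flip is not None:
            y[flip] ^= 1
        assert decode(encode(x), tuple(y)) == x
print("X recovered from 2 transmitted bits plus side information Y.")
```

When the single-bit error pattern is uniform, the 2 transmitted bits match H(X|Y) exactly; in practice the toy repetition code would be replaced by a stronger channel code and a statistical, rather than worst-case, correlation model.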
10. Integrating learning and correlation tracking
- Many sensors report to a controller
- Correlation tracking:
  - Controller keeps track of the correlation
  - Specifies how much compression is needed
- Sensors blindly encode their readings
  - Minimal processing at the sensor nodes
  - Complexity at the controller
  - Cheap sensors
- Probabilistic reference to the side information allows for robustness to packet loss (a controller sketch follows this list)
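The division of labor on this slide can be sketched in a few lines: cheap sensors only quantize, while the controller learns how well each sensor's readings are predicted by its side information and feeds a rate back. Everything concrete below (the innovation-variance estimate, the 3-sigma coverage rule, the smoothing factor) is an assumed placeholder for whatever correlation model the controller actually maintains.

```python
# A minimal sketch of the correlation-tracking loop: the controller watches
# how much each sensor's decoded reading deviates from its own prediction
# (the side information), tracks that innovation, and tells the sensor how
# coarsely it may encode next time. The smoothing factor and the 3-sigma
# rate rule are illustrative assumptions, not part of the slide.
import math

class CorrelationTrackingController:
    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing
        self.innovation_var = {}   # per-sensor estimate of (reading - prediction)^2

    def update(self, sensor_id, decoded_reading, prediction):
        err = decoded_reading - prediction
        prev = self.innovation_var.get(sensor_id, 1.0)   # conservative prior
        self.innovation_var[sensor_id] = (
            self.smoothing * prev + (1 - self.smoothing) * err * err
        )

    def requested_bits(self, sensor_id, step=0.01):
        """Bits the sensor should spend so that roughly +/- 3 sigma of the
        innovation is covered at resolution `step`."""
        sigma = math.sqrt(self.innovation_var.get(sensor_id, 1.0))
        return max(1, math.ceil(math.log2(6 * sigma / step)))

# Usage: as readings stream in, the controller tightens or relaxes the rate.
ctrl = CorrelationTrackingController(smoothing=0.5)
for reading, prediction in [(20.1, 20.0), (20.2, 20.1), (20.15, 20.2)]:
    ctrl.update("sensor-7", reading, prediction)
    print("next rate for sensor-7:", ctrl.requested_bits("sensor-7"), "bits")
```

In this split the sensors stay cheap and stateless while all learning and adaptation sits at the controller, matching the bullets above.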
11. Collaborative processing: compressing raw data versus local estimates
- Several scenarios:
  - Sensor clusters (groups of sensors that can collaborate)
  - Multiple antennas per sensor
  - Multimodal sensors
12. Result
- If collaborative processing is (MSE) optimal when the rate R is infinite,
13. Result (continued)
- then it is also optimal for any finite R.
- This suggests that distributed estimation and compression tasks can be decoupled, i.e., one can design and adapt the network topology while ignoring bandwidth requirements in a number of scenarios.
14. Opportunities: architecture rehauls
- Architectures should reflect and exploit computational diversity in wireless devices (TVs, cell phones, laptops, cheap sensors)
- Asymmetric complexities
- In-built robustness: fault-tolerant designs
- Diversity in representation and communication
- Rehaul deterministic frameworks (e.g. prediction-based frameworks: LP, DPCM, etc.) with probabilistic ones
15. Rethinking video-over-wireless
- Today's video architectures are shaped by the downlink broadcast model:
  - Complex encoder
  - Light decoder
- The motion estimation task dominates encoder complexity (up to 90%)
16. New class of video codecs: requirements
- Light codec complexity, in order to
  - maximize battery life
  - satisfy complexity constraints at the encoding device
- High compression efficiency, to match
  - available bandwidth/storage constraints
  - low transmission power constraints
- Robustness to packet/frame drops, to
  - combat the harsh wireless transmission medium
17. Rethinking the division of labor
- Under reasonable signal models, it is possible to transfer (motion-search) complexity to the decoder without loss of compression efficiency (Ishwar, Prabhakaran, Ramchandran, 2003). A simplified sketch of this decoder-side motion search follows.
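To make the shift of motion search to the decoder concrete, here is a heavily simplified sketch in this spirit: the encoder sends only the low bits of each pixel (a coset index) plus a short checksum, and the decoder performs the motion search, snapping each candidate predictor onto the received coset and accepting the candidate that passes the checksum. The block size, the number of low bits, the CRC-32 check, and the ±3-pixel search range are all assumptions for illustration; the actual codec uses proper syndrome codes and rate allocation.

```python
# Heavily simplified sketch of decoder-side motion search.
# Encoder: per 8x8 block, send the low K bits of each pixel (a coset index)
# plus a CRC-32 of the exact block -- no motion estimation at all.
# Decoder: search candidate predictors in the previous frame, snap each one
# onto the received coset, and accept the candidate that passes the CRC.
# K, the block size, the CRC, and the search range are illustrative choices.
import zlib
import numpy as np

K = 4        # low bits transmitted per pixel
BLOCK = 8    # block size in pixels
SEARCH = 3   # decoder-side motion search range (+/- pixels)

def encode_block(block):
    return block % (1 << K), zlib.crc32(block.tobytes())

def snap_to_coset(predictor, coset):
    """Closest value to each predictor pixel whose low K bits match `coset`."""
    shift = np.round((predictor.astype(int) - coset) / (1 << K)) * (1 << K)
    return np.clip(shift + coset, 0, 255).astype(np.uint8)

def decode_block(coset, crc, prev_frame, top, left):
    for dy in range(-SEARCH, SEARCH + 1):
        for dx in range(-SEARCH, SEARCH + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + BLOCK > prev_frame.shape[0] \
                    or x + BLOCK > prev_frame.shape[1]:
                continue
            rec = snap_to_coset(prev_frame[y:y + BLOCK, x:x + BLOCK], coset)
            if zlib.crc32(rec.tobytes()) == crc:   # motion search succeeded
                return rec
    return None  # a real system would fall back to an intra refresh

# Demo: the current block is a slightly noisy copy of a shifted block in the
# previous frame, so a good predictor exists -- but only the decoder finds it.
rng = np.random.default_rng(1)
prev = rng.integers(0, 256, (32, 32), dtype=np.uint8)
true_block = prev[9:9 + BLOCK, 10:10 + BLOCK].astype(int) \
    + rng.integers(-3, 4, (BLOCK, BLOCK))
true_block = np.clip(true_block, 0, 255).astype(np.uint8)

coset, crc = encode_block(true_block)
rec = decode_block(coset, crc, prev, top=8, left=8)
print("recovered exactly:", rec is not None and np.array_equal(rec, true_block))
```

The point of the sketch is the asymmetry: the encoder's work per block is a modulo operation and a checksum, while all of the search happens at the decoder, which is exactly the division of labor argued for here.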
18. PRISM video simulation results
- Sequence used: Football (14 frames, 352x240)
- Comparison: H.263 (free version from UBC, Vancouver)
- Frame rate 30 fps, encoding rate 10 kB per frame
- Compression: performance is visually competitive with full-motion-complexity inter-frame codecs such as MPEG-4/H.263. (For pure compression, H.263 outperforms PRISM by about 1.3 dB in our tests on the Football sequence.)
- Robustness: much more robust than current solutions; PRISM can recover from frame losses.
- Test for robustness: the second frame was removed from frame memory after decoding; the third frame was then decoded off the first frame in both cases.
19. Qualcomm's simulator for CDMA-2000 1X
- At a packet error rate of 6%
- At a packet error rate of 11%
- H.263 at a packet error rate of 3% and PRISM at 16%
PRISM is 4-8 dB better than H.263 for the loss rates investigated.