1
COMPRESSED SENSING
  • Luis Mancera
  • Visual Information Processing Group
  • Dep. Computer Science and AI
  • Universidad de Granada

2
CONTENTS
  • WHAT?
  • Introduction to Compressed Sensing (CS)
  • HOW?
  • Theory behind CS
  • FOR WHAT PURPOSE?
  • CS applications
  • AND THEN?
  • Active research and future lines

4
Transmission scheme
  • Traditional pipeline: sample N values, compress down to K (N >> K), transmit K, receive, decompress
  • Compression is a brick wall to performance: why acquire so many samples just to discard most of them?
  • Natural signals are sparse/compressible ⇒ no significant perceptual loss
5
Shannon/Nyquist theorem
  • The Shannon/Nyquist theorem tells us to sample every 1/(2W) seconds, where W is the highest frequency of the signal
  • This is a worst-case bound for ANY band-limited signal
  • Sparse/compressible signals are a favorable case
  • The CS solution merges sampling and compression

6
Compressed Sensing (CS)
What do we need for CS to succeed?
  • Recover sparse signals by directly acquiring
    compressed data
  • Replace samples by measurements

7
We know how to Sense Compressively
"I'm glad this battle is over. My military period is finally over. I will now go back to Motril and get married, and then I will raise pigs, as I have always wanted to do."
"Do you mean you're glad this battle is over because now you've finished here, and you will go back to Motril, get married, and raise pigs as you always wanted to?"
"Aye."
"Cool!"
8
What does CS need?
"I know this guy so well that I know what he means."
  • Nice sensing dictionary
  • Appropriate sensing
  • A priori knowledge
  • Recovery process

"Wie lange wird das nehmen?" ("How long will this take?")
"Saint Roque's dog has no tail."
"Cool!"
"What?"
(Diagram labels: Words, Idea)
9
CS needs
  • Nice sensing dictionary → INCOHERENCE
  • Appropriate sensing → RANDOMNESS
  • A priori knowledge → SPARSENESS
  • Recovery process → OPTIMIZATION
10
Sparseness: less is more
  • "He was advancing by the only road that was ever traveled by the stranger as he approached the Hut; or, he came up the valley" (Wyandotté, J.F. Cooper)
  • "Hummm, you could say the same using fewer words: a stranger approaching a hut by the only known road, the valley" (comments to Wyandotté, E.A. Poe)
(Diagram: the same idea expressed by combining dictionary elements in two ways; the second combination is SPARSER)
11
Sparseness: less is more
  • Sparseness: the property of being "small in numbers or amount, often scattered over a large area" (Cambridge Advanced Learner's Dictionary)
(Figure: a certain distribution vs. a sparser distribution)
12
Sparseness: less is more
  • Pixels: not sparse
  • A new domain can increase sparseness
(Figure: the original Einstein image reconstructed from 10 Fourier coeffs., from 10 wavelet coeffs., and from 10 pixels)
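The idea on this slide can be sketched in 1-D with numpy. This is a minimal sketch: the Einstein image is replaced by an illustrative smooth signal, and only the Fourier domain is compared against raw samples:

```python
import numpy as np

N = 256
t = np.arange(N)
# A smooth signal: not sparse in "pixels" (samples), but sparse in frequency.
x = np.sin(2 * np.pi * 3 * t / N) + 0.5 * np.cos(2 * np.pi * 7 * t / N)

K = 10  # coefficients kept in each domain

# Keep the K largest-magnitude Fourier coefficients.
X = np.fft.fft(x)
X_k = np.zeros_like(X)
idx = np.argsort(np.abs(X))[-K:]
X_k[idx] = X[idx]
x_fourier = np.real(np.fft.ifft(X_k))

# Keep the K largest-magnitude samples ("pixels").
x_pix = np.zeros_like(x)
top = np.argsort(np.abs(x))[-K:]
x_pix[top] = x[top]

err_fourier = np.linalg.norm(x - x_fourier) / np.linalg.norm(x)
err_pixels = np.linalg.norm(x - x_pix) / np.linalg.norm(x)
print(err_fourier, err_pixels)   # near zero vs. close to 1
```

Ten Fourier coefficients capture essentially all the energy of this signal, while ten samples capture almost none of it.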
13
Sparseness: less is more
  • X-lets: elementary functions (atoms) used as a dictionary to express the idea
  • Analysis-sense sparseness: the response of X-let filters is sparse [Mallat 89, Olshausen and Field 96]
  • Synthesis-sense sparseness: we can increase sparseness by non-linear analysis
  • X-let-based representations are compressible, meaning that most of the energy is concentrated in few coefficients
(Diagram: linear subband vs. non-linear subband; the non-linear analysis is sparser)
14
Sparseness: less is more
  • Taking around 3.5% of the total coefficients (non-linear subband)
  • By taking fewer coefficients we achieve strict sparseness, at the price of only approximating the image (PSNR 35.67 dB)
15
Incoherence
  • Sparse signals in a given dictionary must be
    dense in another incoherent one
  • Sampling dictionary should be incoherent w.r.t.
    that where the signal is sparse/compressible

(Figure: a time-sparse signal and its frequency-dense representation)
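The time/frequency example above can be checked numerically; a minimal numpy sketch (spike positions and amplitudes are arbitrary illustrative choices):

```python
import numpy as np

N = 128
x = np.zeros(N)
x[[5, 40, 90]] = [1.0, -2.0, 1.5]   # 3-sparse in time

X = np.fft.fft(x)                   # representation in the incoherent Fourier basis
time_support = np.count_nonzero(x)
freq_support = np.count_nonzero(np.abs(X) > 1e-12)
print(time_support, freq_support)   # 3 nonzeros in time, dense in frequency
```

A handful of spikes spread their energy over essentially every frequency bin, which is exactly the density the slide describes.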
16
Measurement and recovery processes
  • Measurement process
  • Sparseness + incoherence ⇒ random sampling will do
  • Recovery process
  • Numerical non-linear optimization is able to
    exactly recover the signal given the measurements

17
CS relies on
  • A priori knowledge: many natural signals are sparse or compressible in a proper basis
  • Nice sensing dictionary: signals should be dense when expressed in the sampling waveforms
  • Appropriate sensing: random sampling has been demonstrated to work well
  • Recovery process: bounds for exact recovery depend on the optimization method

18
Summary
  • CS is a simple and efficient signal acquisition protocol which samples at a reduced rate and later uses computational power to reconstruct the signal from what appears to be an incomplete set of measurements
  • CS is universal, democratic and asymmetrical

19
CONTENTS
  • WHAT?
  • Introduction to Compressed Sensing (CS)
  • HOW?
  • Theory behind CS
  • FOR WHAT PURPOSE?
  • CS applications
  • AND THEN?
  • Active research and future lines

20
The sensing problem
  • x(t): original discrete signal (vector)
  • Φ: sampling dictionary (matrix)
  • y(k): sampled signal (vector)

21
The sensing problem
  • Traditional sampling: y = Φx
(Figure: sampled signal = sampling dictionary × original signal)
22
The sensing problem
  • When the signal is sparse/compressible, we can directly acquire a condensed representation with no/little information loss
  • Random projection will work if M = O(K log(N/K)) [Candès et al., Donoho, 2004]
(Figure: y = Φx, where x (N×1) has K nonzero entries, Φ is M×N and y is M×1, with K < M << N)
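The measurement scheme above can be sketched with numpy. The constant 4 in the choice of M and the random seed are illustrative, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 512, 10
M = int(4 * K * np.log(N / K))   # M = O(K log(N/K)); the constant 4 is arbitrary

# K-sparse signal in the canonical basis.
x = np.zeros(N)
support = rng.choice(N, K, replace=False)
x[support] = rng.standard_normal(K)

# Random Gaussian measurement matrix Phi (M x N): M measurements instead of N samples.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x

print(M, N)   # far fewer measurements than samples
```

The encoder never needs to know where the K nonzero entries are; the random projections capture enough information for a decoder to find them.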
23
Universality
  • Random measurements can be used if the signal is sparse/compressible in any basis
(Figure: y = ΦΨa, where a (N×1) has K nonzero entries, Ψ is N×N, Φ is M×N and y is M×1, with K < M << N)
24
Good sensing waveforms?
  • Φ and Ψ should be incoherent
  • Coherence measures the largest correlation between any two elements
  • Large correlation ⇒ low incoherence
  • Examples:
  • Spike and Fourier bases (maximal incoherence)
  • Random and any fixed basis
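The spike/Fourier claim can be verified numerically. Below, the coherence μ(Φ,Ψ) = √N · max|⟨φ_k, ψ_j⟩| (the scaled definition common in the CS literature) is computed for the canonical and Fourier bases; the dimension is illustrative:

```python
import numpy as np

N = 64
Phi = np.eye(N)                             # spike (canonical) basis
Psi = np.fft.fft(np.eye(N)) / np.sqrt(N)    # orthonormal Fourier basis

# mu(Phi, Psi) = sqrt(N) * max |<phi_k, psi_j>|, ranging from 1 (maximally
# incoherent) to sqrt(N) (maximally coherent).
mu = np.sqrt(N) * np.max(np.abs(Phi.conj().T @ Psi))
print(mu)   # 1.0: spikes and sinusoids are maximally incoherent
```

Every spike spreads evenly over all sinusoids, so the largest correlation is as small as it can possibly be.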

25
Solution: sensing randomly
  • Take M = O(K log(N/K)) random measurements, transmit the M values, receive, and reconstruct
  • We have set up the encoder
  • Let's now study the decoder

26
CS recovery
  • Assume a is K-sparse, and y = ΦΨa
  • We can recover a by solving min ||a||_0 subject to y = ΦΨa, where ||a||_0 counts the number of active coefficients
  • This is an NP-hard (combinatorial) problem
  • Use some tractable approximation
27
Robust CS recovery
  • What if a is only compressible and y = Φ(Ψa + n), with n an unknown error term?
  • Isometry constant of Φ: the smallest δ_K such that, for all K-sparse vectors x, (1 - δ_K)||x||_2^2 ≤ ||Φx||_2^2 ≤ (1 + δ_K)||x||_2^2
  • Φ obeys a Restricted Isometry Property (RIP) if δ_K is not too close to 1
  • Φ obeys a RIP ⇒ any subset of K columns is nearly orthogonal
  • To recover K-sparse signals we need δ_2K < 1 (unique solution)
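The near-orthogonality reading of the RIP can be probed empirically. This sketch estimates δ_K for random K-column submatrices of a Gaussian Φ; the dimensions and trial count are illustrative, and sampling a few subsets only gives a lower bound, not a certificate, of the true δ_K:

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, K = 128, 512, 8

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # columns have unit norm in expectation

# For random K-column submatrices, the squared singular values should stay
# close to 1, matching (1 - d)||x||^2 <= ||Phi_T x||^2 <= (1 + d)||x||^2.
deltas = []
for _ in range(200):
    T = rng.choice(N, K, replace=False)
    s = np.linalg.svd(Phi[:, T], compute_uv=False)
    deltas.append(max(s[0] ** 2 - 1.0, 1.0 - s[-1] ** 2))
delta_hat = max(deltas)
print(delta_hat)   # a rough lower bound on delta_K; well below 1 here
```

With K much smaller than M, every sampled subset of columns behaves almost like an orthonormal set, which is exactly what the RIP asks for.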

28
Recovery techniques
  • Minimization of L1-norm
  • Greedy techniques
  • Iterative thresholding
  • Total-variation minimization

29
Recovery by minimizing L1-norm
  • Replace the L0 count with the L1 norm (the sum of absolute values): min ||a||_1 subject to y = ΦΨa
  • Convexity ⇒ tractable problem
  • Solvable by linear or second-order programming
  • For some constant C > 0, the minimizer â_1 equals a (exact recovery) if M ≥ C · μ²(Φ,Ψ) · K · log N
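Basis pursuit, the equality-constrained L1 problem, can be recast as a linear program by splitting a = u - v with u, v ≥ 0. A minimal sketch using scipy's linprog; the dimensions, seed and sparsity level are illustrative:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
N, M, K = 64, 28, 3

# K-sparse ground truth and random Gaussian measurements y = A a.
a = np.zeros(N)
a[rng.choice(N, K, replace=False)] = [1.5, -2.0, 1.0]
A = rng.standard_normal((M, N)) / np.sqrt(M)
y = A @ a

# min ||a||_1  s.t.  A a = y, as an LP over (u, v) with a = u - v, u, v >= 0.
c = np.ones(2 * N)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
a_hat = res.x[:N] - res.x[N:]

err = np.linalg.norm(a_hat - a)
print(err)   # essentially zero: exact recovery in this regime
```

The LP has 2N variables and M equality constraints; for larger problems one would use a dedicated solver, but the reduction itself is this simple.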

30
Recovery by minimizing L1-norm
  • Noisy data: solve the LASSO problem min ||a||_1 subject to ||y - ΦΨa||_2 ≤ ε
  • Convex problem, solvable via second-order cone programming (SOCP)
  • If δ_2K < √2 - 1, then ||â - a||_2 ≤ C_0 ||a - a_K||_1 / √K + C_1 ε, where a_K is the best K-sparse approximation of a

31
Example of L1 recovery
(Figure: the signal x and its measurements y = Ax)
  • A (120×512): random matrix with orthonormal rows
  • Perfect recovery of x by L1-minimization

32
Recovery by Greedy Pursuit
  • Algorithm:
  • Add the new active component whose column φ_i is most correlated with y
  • Find the best approximation y' to y using the active components
  • Subtract y' from y to form the residual e
  • Set y = e and repeat
  • Very fast for small-scale problems
  • Not as accurate/robust for large signals in the presence of noise
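The greedy steps above, with a least-squares refit on the active set (i.e. Orthogonal Matching Pursuit), can be sketched as follows; the dimensions and coefficient values are illustrative:

```python
import numpy as np

def omp(A, y, K):
    """Greedy pursuit with a least-squares refit (Orthogonal Matching Pursuit)."""
    residual = y.copy()
    active = []
    coef = np.zeros(0)
    for _ in range(K):
        # New active component: the column most correlated with the residual.
        i = int(np.argmax(np.abs(A.T @ residual)))
        active.append(i)
        # Best approximation y' to y using the active components.
        coef, *_ = np.linalg.lstsq(A[:, active], y, rcond=None)
        residual = y - A[:, active] @ coef   # subtract y' to form the residual
    x = np.zeros(A.shape[1])
    x[active] = coef
    return x

rng = np.random.default_rng(3)
M, N, K = 100, 256, 3
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = [2.0, -2.0, 2.0]
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_hat = omp(A, A @ x_true, K)
print(np.linalg.norm(x_hat - x_true))
```

Because the residual is kept orthogonal to the active columns, the same column is never selected twice; each iteration costs one matrix-vector product plus a small least-squares solve, which is why the method is fast at small scale.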

33
Recovery by Iterative Thresholding
  • Algorithm:
  • Iterate between a shrinkage/thresholding operation and a projection onto perfect reconstruction
  • If soft-thresholding is used, the theory is analogous to that of L1-minimization
  • If hard-thresholding is used, the error is within a constant factor of the best attainable estimation error [Blumensath 08]
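A minimal sketch of the hard-thresholding variant, assuming a measurement matrix with orthonormal rows so that the plain gradient step is stable; the dimensions and values are illustrative, and the recovery guarantee of [Blumensath 08] needs RIP conditions that are not checked here:

```python
import numpy as np

def iht(A, y, K, n_iter=300):
    """Iterative hard thresholding: gradient step, then keep the K largest entries."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + A.T @ (y - A @ x)            # stable because ||A|| = 1 here
        keep = np.argsort(np.abs(x))[-K:]    # hard threshold: K largest magnitudes
        pruned = np.zeros_like(x)
        pruned[keep] = x[keep]
        x = pruned
    return x

rng = np.random.default_rng(4)
M, N, K = 64, 128, 4

# Measurement matrix with orthonormal rows (spectral norm 1).
Q, _ = np.linalg.qr(rng.standard_normal((N, M)))
A = Q.T

x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = [2.0, -1.5, 1.0, 2.5]
y = A @ x_true
x_hat = iht(A, y, K)
print(np.count_nonzero(x_hat), np.linalg.norm(y - A @ x_hat))
```

Each iterate is K-sparse by construction, and with the spectral norm bounded by 1 the data-fit term never increases; in favorable regimes the support and values are recovered exactly.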

34
Recovery by TV minimization
  • Sparseness: signals have few jumps
  • Convexity ⇒ tractable problem
  • Accurate and robust, but can be slow for large-scale problems

35
Example of TV recovery
(Figure: the signal x, its least-squares reconstruction x_LS = Φ^T Φ x, and the Fourier sampling pattern)
  • Φ: (partial) Fourier transform
  • Perfect recovery of x by TV-minimization

36
Summary
  • Sensing
  • Use random sampling in dictionaries with low coherence to the one where the signal is sparse
  • Choose M wisely
  • Recovery
  • A wide range of techniques is available
  • L1-minimization seems to work well, but choose the technique best fitting your needs

37
CONTENTS
  • WHAT?
  • Introduction to Compressed Sensing (CS)
  • HOW?
  • Theory behind CS
  • FOR WHAT PURPOSE?
  • CS applications
  • AND THEN?
  • Active research and future lines

38
Some CS applications
  • Data compression
  • Compressive imaging
  • Detection, classification, estimation, learning
  • Medical imaging
  • Analog-to-information conversion
  • Biosensing
  • Geophysical data analysis
  • Hyperspectral imaging
  • Compressive radar
  • Astronomy
  • Communications
  • Surface metrology
  • Spectrum analysis

39
Data compression
  • The sparse basis Ψ may be unknown or impractical to implement at the encoder
  • A randomly designed Φ can be considered a universal encoding strategy
  • This may be helpful for distributed source coding in multi-signal settings
  • [Baron et al. 05, Haupt and Nowak 06, …]

40
Magnetic resonance imaging
41
Rice Single-Pixel CS Camera
42
Rice Analog-to-Information conversion
  • Converts an analog input signal into discrete digital measurements
  • Extension of the A2D converter that samples at the signal's information rate rather than its Nyquist rate

43
CS in Astronomy [Bobin et al. 08]
  • Desperate need for data compression
  • Resolution, sensitivity and photometry are important
  • Herschel satellite (ESA, 2009): conventional compression cannot be used
  • CS can help with:
  • New compressive sensors
  • A flexible compression/decompression scheme
  • Computational cost: computing Φx is O(t) vs. JPEG 2000's O(t log t)
  • Decoupling of compression and decompression
  • CS outperforms conventional compression

44
CONTENTS
  • WHAT?
  • Introduction to Compressed Sensing (CS)
  • HOW?
  • Theory behind CS
  • FOR WHAT PURPOSE?
  • CS applications
  • AND THEN?
  • Active research and future lines

45
CS is a very active area
46
CS is a very active area
  • More than seventy 2008 papers in the CS repository
  • Most active areas:
  • New applications (de-noising, learning, video, …)
  • New recovery methods (non-convex, variational, CoSaMP, …)
  • ICIP 08:
  • COMPRESSED SENSING FOR MULTI-VIEW TRACKING AND 3-D VOXEL RECONSTRUCTION
  • COMPRESSIVE IMAGE FUSION
  • IMAGE REPRESENTATION BY COMPRESSED SENSING
  • KALMAN FILTERED COMPRESSED SENSING
  • NONCONVEX COMPRESSIVE SENSING AND RECONSTRUCTION OF GRADIENT-SPARSE IMAGES: RANDOM VS. TOMOGRAPHIC FOURIER SAMPLING

47
Conclusions
  • CS is a new technique for acquiring and
    compressing images simultaneously
  • Sparseness + incoherence + random sampling allows perfect reconstruction under some conditions
  • A wide range of applications are possible
  • Big research effort now on recovery techniques

48
Our future lines?
  • Convex CS
  • TV-regularization
  • Non-convex CS
  • L0-GM for CS
  • Intermediate norms (0 < p < 1) for CS
  • CS Applications
  • Super-resolved sampling?
  • Detection, estimation, classification, …

49
Thank you
  • See references and software here:
  • http://www.dsp.ece.rice.edu/cs/