Error Estimation in TV Imaging - PowerPoint PPT Presentation

1
Error Estimation in TV Imaging
  • Martin Burger
  • Institute for Computational and Applied
    Mathematics
  • European Institute for Molecular Imaging (EIMI)
  • Center for Nonlinear Science (CeNoS)
  • Westfälische Wilhelms-Universität Münster

2
Joint Work with
  • Stan Osher (UCLA)
  • mb-Osher, Inverse Problems 04
  • Elena Resmerita, Lin He (Linz)
  • mb-Resmerita-He, Computing 07

3
TV Imaging
  • Total variation methods are among the most
    popular techniques in modern imaging
  • The basic idea is to model images, or rather
    their main structure (the cartoon), as functions
    of bounded variation
  • Reconstructions seek images with as small a
    total variation as possible

4
TV Imaging
  • Total variation is a convex, but neither
    differentiable nor strictly convex, functional
  • It is defined on the Banach space BV, consisting
    of all L1 functions of bounded variation
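
The definitions behind these statements, written out in standard form (the slide's formula images did not survive extraction):

```latex
% Total variation of u on a domain Ω (dual formulation)
|u|_{TV} = \sup\Big\{ \int_\Omega u \,\operatorname{div} g \,dx \;:\;
           g \in C_c^\infty(\Omega;\mathbb{R}^d),\ \|g\|_\infty \le 1 \Big\}

% The Banach space of functions of bounded variation
BV(\Omega) = \big\{ u \in L^1(\Omega) \;:\; |u|_{TV} < \infty \big\}
```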

5
Denoising Models
  • ROF model
  • Rudin-Osher-Fatemi 89/92, Chambolle-Lions 96,
    Scherzer-Dobson 96, Meyer 01, ...
  • TV flow
  • Caselles et al 99-06, Feng-Prohl 03, ...
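
The two models in standard form (a reconstruction of the slide's formulas; λ weights the data fidelity):

```latex
% ROF denoising
\min_{u \in BV(\Omega)} \; |u|_{TV} + \frac{\lambda}{2} \int_\Omega (u - f)^2 \,dx

% TV flow: gradient flow of the total variation, started at the image f
\partial_t u \in -\partial |u|_{TV}, \qquad u(0) = f
```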

6
ROF Model
  • The optimality condition for ROF denoising
    involves a dual variable p, which enters both ROF
    and the TV flow
  • For total variation, p is related to the mean
    curvature of edges
  • p is an element of the subdifferential of the
    convex functional
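
The optimality condition, written out (standard for the ROF model above; the curvature formula is the usual formal interpretation for smooth u with ∇u ≠ 0):

```latex
% Euler–Lagrange equation of the ROF functional
p + \lambda (u - f) = 0, \qquad p \in \partial |u|_{TV}

% Formally, the subgradient is the mean-curvature operator
% of the level lines:
p = -\operatorname{div}\!\left( \frac{\nabla u}{|\nabla u|} \right)
```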

7
ROF Model
Reconstruction (code by Jinjun Xu):
clean / noisy / ROF-denoised images
8
ROF Model
  • The ROF model denoises cartoon images, or rather
    computes the cartoon of an arbitrary image; a
    natural spatial multi-scale decomposition is
    obtained by varying λ
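
A minimal, self-contained sketch of ROF denoising in this spirit, using Chambolle's dual projection method rather than the code shown on the slide; all parameter values are illustrative:

```python
import numpy as np

def rof_denoise(f, lam=0.2, tau=0.125, iters=200):
    """Approximately solve  min_u |u|_TV + 1/(2*lam) ||u - f||^2
    by projected gradient descent on the dual variable p, |p| <= 1
    (Chambolle-style; the minimizer satisfies u = f - lam * div p)."""
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    for _ in range(iters):
        # primal image implied by the current dual variable
        div_p = (np.diff(px, axis=1, prepend=0.0)
                 + np.diff(py, axis=0, prepend=0.0))
        u = f - lam * div_p
        # forward differences (zero at the far boundary)
        gx = np.zeros_like(u)
        gy = np.zeros_like(u)
        gx[:, :-1] = np.diff(u, axis=1)
        gy[:-1, :] = np.diff(u, axis=0)
        # gradient step on p, then projection onto the unit ball
        px -= (tau / lam) * gx
        py -= (tau / lam) * gy
        norm = np.maximum(1.0, np.sqrt(px**2 + py**2))
        px /= norm
        py /= norm
    div_p = (np.diff(px, axis=1, prepend=0.0)
             + np.diff(py, axis=0, prepend=0.0))
    return f - lam * div_p
```

Increasing lam strengthens the smoothing and removes finer scales, which is exactly the multi-scale behaviour described above.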

9
Error Estimation?
  • First question for error estimation: estimate the
    difference between u (the minimizer of ROF) and f
    in terms of λ
  • An estimate in the L2 norm is standard, but does
    not yield information about edges
  • An estimate in the BV norm is too ambitious: even
    an arbitrarily small difference in edge location
    can yield a BV-norm error of order one!

10
Error Measure
  • We need a better error measure, stronger than
    L2 but weaker than BV
  • Possible choice: the Bregman distance (Bregman 67)
  • It is a true distance for a strictly convex,
    differentiable functional, but it is not symmetric
  • A symmetric version can be defined
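
The definitions, reconstructed for a convex, differentiable functional J:

```latex
% Bregman distance induced by J
D_J(u, v) = J(u) - J(v) - \langle \nabla J(v),\, u - v \rangle \;\ge 0

% Symmetric version
D_J^{\mathrm{sym}}(u, v) = D_J(u, v) + D_J(v, u)
                         = \langle \nabla J(u) - \nabla J(v),\, u - v \rangle
```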

11
Bregman Distance
  • Bregman distances reduce to known measures for
    standard energies
  • Example 1: J(u) = ½ ‖u‖²
  • Subgradient = gradient = u
  • The Bregman distance becomes D(u,v) = ½ ‖u − v‖²

12
Bregman Distance
  • Example 2: J(u) = ∫ (u log u − u) dx
  • Subgradient = gradient = log u
  • The Bregman distance becomes the Kullback-Leibler
    divergence (relative entropy)
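
Carrying out the computation for this example (with the normalization J(u) = ∫(u log u − u) dx, so that ∇J(u) = log u):

```latex
D_J(u, v) = \int \big( u \log u - u \big)
          - \int \big( v \log v - v \big)
          - \int \log v \,\big( u - v \big)
          = \int \Big( u \log \frac{u}{v} - u + v \Big)
```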

13
Bregman Distance
  • Total variation is neither strictly convex nor
    differentiable, so the subdifferential is
    multivalued
  • Define a generalized Bregman distance for each
    subgradient
  • A symmetric version can be defined as well
  • Kiwiel 97, Chen-Teboulle 97
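
The generalized distances, written out for subgradients p ∈ ∂J(u), q ∈ ∂J(v):

```latex
% One generalized Bregman distance per subgradient q ∈ ∂J(v)
D_J^{q}(u, v) = J(u) - J(v) - \langle q,\, u - v \rangle, \qquad q \in \partial J(v)

% Symmetric version
D^{\mathrm{sym}}(u, v) = D_J^{q}(u, v) + D_J^{p}(v, u)
                       = \langle p - q,\, u - v \rangle, \qquad p \in \partial J(u)
```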

14
Bregman Distance
  • For energies homogeneous of degree one we have
    J(v) = ⟨q, v⟩ for every subgradient q ∈ ∂J(v)
  • The Bregman distance becomes
    D(u,v) = J(u) − ⟨q, u⟩

15
Bregman Distance
  • The Bregman distance for total variation is not a
    strict distance; it can be zero for u ≠ v
  • In particular, dTV is zero for a contrast change
  • Resmerita-Scherzer 06
  • The Bregman distance is still nonnegative
    (convexity)
  • The Bregman distance can provide information
    about edges

16
Error Estimation
  • For an estimate in terms of λ we need a
    smoothness condition on the data
  • Start from the optimality condition for ROF

17
Error Estimation
  • Apply it to u − v
  • Estimate for the Bregman distance: mb-Osher 04
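
A standard form of this estimate, reconstructed (notation as in the ROF model above, with λ weighting the fidelity term; the smoothness condition is q ∈ ∂|f|_TV ∩ L²):

```latex
% Minimality of u, compared against the competitor f:
|u|_{TV} + \tfrac{\lambda}{2}\|u - f\|^2 \le |f|_{TV}

% Hence, for any q ∈ ∂|f|_TV ∩ L²:
D^{q}(u, f) = |u|_{TV} - |f|_{TV} - \langle q,\, u - f \rangle
          \le -\tfrac{\lambda}{2}\|u - f\|^2 + \|q\|\,\|u - f\|
          \le \frac{\|q\|^2}{2\lambda}
```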

18
Error Estimation
  • In practice we have to deal with noisy data f
    (a perturbation of some exact data g)
  • An analogous estimate holds for the Bregman
    distance
  • Optimal choice of the parameter:
    of the order of the noise variance
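
One standard version of the noisy-data estimate, reconstructed consistently with the exact-data case (‖f − g‖ ≤ δ, q ∈ ∂|g|_TV ∩ L²):

```latex
D^{q}(u, g) \le \frac{\lambda \delta^2}{2} + \delta \|q\| + \frac{\|q\|^2}{2\lambda}

% Balancing the first and last terms gives the parameter choice
\lambda \sim \frac{\|q\|}{\delta}, \qquad
D^{q}(u, g) \le 2\,\delta\,\|q\| = O(\delta)
```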

19
Error Estimation
  • An analogous estimate holds for the TV flow:
    mb-Resmerita-He 07
  • The regularization parameter is the stopping time
    T of the flow, T ~ λ⁻¹
  • Note: all estimates are multivalued! They hold
    for any subgradient satisfying the smoothness
    condition

20
Interpretation
  • Let g be piecewise constant with white background
    and colour values cᵢ on the regions Ωᵢ
  • Then we obtain subgradients built from the signed
    distance functions dᵢ of the region boundaries
    and a suitable function ψ

21
Interpretation
  • Here ε is chosen smaller than the distance
    between two region boundaries
  • Note that on a region boundary (dᵢ = 0) the
    subgradient equals the mean curvature of the edge

22
Interpretation
  • The Bregman distances are given explicitly for
    these subgradients
  • Taking the sup only over such subgradients and
    letting ε tend to zero yields a limit estimate

23
Interpretation
  • The multivalued error estimates imply a
    quantitative estimate for the total variation of
    u away from the discontinuity set of g
  • Other geometric estimates are possible through
    different choices of subgradients and different
    limits

24
Extensions
  • Direct extension to deconvolution / linear
    inverse problems (A a linear operator) under the
    standard source condition
  • mb-Osher 04
  • Nonlinear problems: Resmerita-Scherzer 06,
    Hofmann-Kaltenbacher-Pöschl-Scherzer 07

25
Extensions
  • Stronger estimates hold under stronger
    conditions: Resmerita 05
  • Numerical analysis for appropriate
    discretizations (a correct discretization of the
    subgradient is crucial): mb 07

26
Future Tasks
  • Extension to other fitting functionals (relative
    entropy, log-likelihood functionals for different
    noise models)
  • Extension to anisotropic TV (interpretation of
    subgradients)
  • Extension to geometric problems (segmentation by
    Chan-Vese, Mumford-Shah): use the exact relaxation
    in BV with bound constraints,
    Chan-Esedoglu-Nikolova 04

27
Download and Contact
  • Papers and talks at
  • www.math.uni-muenster.de/u/burger
  • Email
  • martin.burger@uni-muenster.de