Title: Edge Preserving Image Restoration Using the L1 Norm
1. Edge Preserving Image Restoration Using the L1 Norm
- Vivek Agarwal
- The University of Tennessee, Knoxville
2. Outline
- Introduction
- Regularization based image restoration
- L2 norm regularization
- L1 norm regularization
- Tikhonov regularization
- Total Variation regularization
- Least Absolute Shrinkage and Selection Operator (LASSO)
- Results
- Conclusion and future work
3. Introduction: Physics of Image Formation
[Diagram: in the forward process, the imaging system applies the kernel K(x, y; x', y') to the true scene f(x', y'), and the registration system adds noise, producing the observed image g(x, y). Restoration is the reverse process, from g(x, y) back to f(x, y).]
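A minimal sketch of this forward process (assuming, for illustration, a spatially invariant Gaussian blur for the kernel K and additive Gaussian noise):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

f = np.zeros((64, 64))
f[16:48, 16:48] = 1.0                               # true scene f(x, y): a bright square
g = gaussian_filter(f, sigma=2.0)                   # imaging system: blur by kernel K
g_noisy = g + 0.01 * rng.standard_normal(g.shape)   # registration system adds noise
```

Restoration, the reverse process, must recover f from g_noisy, knowing K only approximately and the noise realization not at all.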
4. Image Restoration
- Image restoration is a subset of image processing.
- It is a highly ill-posed problem.
- Most image restoration algorithms use least squares.
- L2 norm based algorithms produce smooth restorations, which are inaccurate if the image contains edges.
- L1 norm algorithms preserve the edge information in the restored images, but the algorithms are slow.
5. Well-Posed Problems
- In 1923, the French mathematician Hadamard introduced the notion of well-posed problems.
- According to Hadamard, a problem is called well-posed if:
  - A solution for the problem exists (existence).
  - This solution is unique (uniqueness).
  - This unique solution is stable under small perturbations in the data; in other words, small perturbations in the data should cause only small perturbations in the solution (stability).
- If at least one of these conditions fails, the problem is called ill-posed (incorrectly posed) and demands special consideration.
6. Existence
- To deal with non-existence we have to enlarge the domain where the solution is sought.
- Example: a quadratic equation $ax^2 + bx + c = 0$ in general form has two solutions, $x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$. When $b^2 - 4ac < 0$ there is no solution in the real domain, but in the complex domain the two solutions always exist.
- Non-existence is harmful.
7. Uniqueness
- Non-uniqueness is usually caused by the lack or absence of information about the underlying model.
- Example: neural networks. The error surface has multiple local minima, and many of these minima fit the training data very well. However, the generalization capabilities of these different solutions (predictive models) can be very different, ranging from poor to excellent. How do we pick a model that is going to generalize well?
[Diagram: three candidate solutions, each labeled "Bad or good?"]
8. Uniqueness
- Non-uniqueness is not always harmful; it depends on what we are looking for. If we are looking for a desired effect, that is, we know what a good solution looks like, then we can be happy with multiple solutions, simply picking a good one from the variety.
- Non-uniqueness is harmful if we are looking for an observed effect, that is, we do not know what a good solution looks like.
- The best way to combat non-uniqueness is to specify a model using prior knowledge of the domain, or at least to restrict the space where the desired model is searched.
9. Instability
- Instability is caused by an attempt to reverse cause-effect relationships.
- Nature always solves just the forward problem, because of the arrow of time: cause always goes before effect.
- In practice we very often have to reverse the relationship, that is, to go from effect to cause.
- Example: convolution-deconvolution, Fredholm integral equations of the first kind.
[Diagram: the forward operation maps cause to effect; the inverse problem goes from effect back to cause.]
10. L1 and L2 Norms
- The general expression for the Lp norm is
  $\|x\|_p = \left( \sum_{i=1}^{n} |x_i|^p \right)^{1/p}$
- The L2 norm (p = 2) is the Euclidean distance or vector distance.
- The L1 norm (p = 1) is also known as the Manhattan norm because it corresponds to the sum of the distances along the coordinate axes.
11. Why Regularization?
- Most restoration is based on least squares, but if the problem is ill-posed, the least squares method fails: small perturbations in the data produce large perturbations in the solution.
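A small sketch of this failure mode (the matrix, sizes, and noise level are illustrative): a Gaussian blur matrix is severely ill-conditioned, so unregularized least squares amplifies even tiny noise.

```python
import numpy as np

n = 50
# Each row of A blurs the signal with a (normalized) Gaussian kernel.
A = np.exp(-0.5 * ((np.arange(n)[:, None] - np.arange(n)[None, :]) / 2.0) ** 2)
A /= A.sum(axis=1, keepdims=True)

x_true = np.zeros(n)
x_true[20:30] = 1.0                                # blocky signal with sharp edges
b = A @ x_true
b_noisy = b + 1e-6 * np.random.default_rng(1).standard_normal(n)

x_hat = np.linalg.lstsq(A, b_noisy, rcond=None)[0]
print(np.linalg.cond(A))                           # enormous condition number
print(np.linalg.norm(x_hat - x_true))              # tiny noise, huge solution error
```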
12. Regularization
- The general formulation for regularization techniques is
  $\min_f \; \|Kf - g\|_2^2 + \lambda \, J(f)$
  where
  - $\|Kf - g\|_2^2$ is the error term,
  - $\lambda$ is the regularization parameter,
  - $J(f)$ is the penalty term.
13. Tikhonov Regularization
- Tikhonov is an L2 norm, or classical, regularization technique.
- The Tikhonov regularization technique produces a smoothing effect on the restored image.
- In zero-order Tikhonov regularization, the regularization operator L is the identity matrix.
- The expression that can be used to compute the Tikhonov-regularized solution is
  $f_\lambda = \arg\min_f \; \|Kf - g\|_2^2 + \lambda \|Lf\|_2^2$
- In higher-order Tikhonov, L is either a first-order or second-order differentiation matrix.
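A minimal dense-matrix sketch of the zero-order case (L = I); in practice K is a large structured blur matrix and iterative solvers are used, so this is only illustrative:

```python
import numpy as np

def tikhonov_zero_order(K, g, lam):
    """Zero-order Tikhonov: minimize ||K f - g||^2 + lam * ||f||^2.

    The minimizer has the closed form f = (K^T K + lam I)^{-1} K^T g.
    """
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)
```

Applied to the ill-conditioned example above, even a small lam stabilizes the solution, at the price of smoothing sharp edges.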
14. Tikhonov Regularization
[Figure: original image and blurred image.]
15. Tikhonov Regularization: Restoration
16. Total Variation
- Total Variation is a deterministic approach.
- This regularization method preserves the edge information in the restored images.
- The TV regularization penalty function obeys the L1 norm.
- The mathematical expression for TV regularization is
  $f_\lambda = \arg\min_f \; \|Kf - g\|_2^2 + \lambda \, TV(f), \qquad TV(f) = \int_\Omega |\nabla f| \, dx \, dy$
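A discrete version of the TV functional, plus a small check of why an L1 penalty on the gradient favors clean edges (the border handling here is an illustrative choice):

```python
import numpy as np

def total_variation(f):
    """Discrete isotropic TV: sum over pixels of |grad f|."""
    fx = np.diff(f, axis=1, append=f[:, -1:])   # horizontal differences
    fy = np.diff(f, axis=0, append=f[-1:, :])   # vertical differences
    return np.sum(np.sqrt(fx ** 2 + fy ** 2))

step = np.zeros((8, 8))
step[:, 4:] = 1.0
print(total_variation(step))                                # one clean edge: low TV
print(total_variation(step + 0.1 * np.random.randn(8, 8)))  # noise inflates TV
```

A sharp edge and a smooth ramp with the same total rise have the same TV, which is why TV restorations can keep edges; noise, by contrast, adds a large TV cost and is penalized.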
17. Difference between Tikhonov Regularization and Total Variation

S.No | Tikhonov Regularization | Total Variation Regularization
1. | Penalty is the L2 norm of the solution, $\|Lf\|_2^2$ | Penalty is the L1 norm of the gradient, $\int_\Omega |\nabla f| \, dx \, dy$
2. | Assumes smooth and continuous information | Smoothness is not assumed
3. | Computationally less complex | Computationally more complex
4. | Restored image is smooth | Restored image is blocky and preserves the edges
18. Computational Challenges
- Minimizing the Total Variation functional leads, via its gradient, to a non-linear PDE (the Euler-Lagrange equation):
  $K^*(Kf - g) - \lambda \, \nabla \cdot \left( \frac{\nabla f}{|\nabla f|} \right) = 0$
19. Computational Challenges (contd.)
- An iterative method is necessary to solve it.
- The TV function is non-differentiable at zero.
- The operator $\nabla \cdot (\nabla f / |\nabla f|)$ is non-linear.
- The ill-conditioning of the operator causes numerical difficulties.
- Good preconditioning is required.
- The standard remedy for the non-differentiability (see the sketch below) is to smooth the absolute value with a small constant $\beta$.
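A minimal illustration of the smoothing trick, replacing $|t|$ with $\sqrt{t^2 + \beta^2}$ (the value of $\beta$ here is only an example):

```python
import numpy as np

beta = 1e-3
t = np.linspace(-0.01, 0.01, 5)
print(np.abs(t))                # |t| has a kink at zero: no derivative there
print(np.sqrt(t**2 + beta**2))  # smooth approximation, differentiable everywhere
```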
20. Computation of the Regularization Operator
- Total Variation restoration is computed using the fixed-point formulation
  $f = (K^T K + \lambda L(f))^{-1} K^T g$
- The restoration is obtained after minimization, where $L(f)$ is the Total Variation penalty function; dropping the penalty term recovers the least squares solution $(K^T K)^{-1} K^T g$.
21. Computation of the Regularization Operator
- Discretization of the Total Variation function:
  $TV(f) \approx \sum_{i,j} \sqrt{ (f_{i+1,j} - f_{i,j})^2 + (f_{i,j+1} - f_{i,j})^2 + \beta^2 }$
- The gradient of the Total Variation is given by
  $\nabla TV(f) = L(f)\, f, \qquad L(f)\, v \approx -\nabla \cdot \left( \frac{\nabla v}{\sqrt{|\nabla f|^2 + \beta^2}} \right)$
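A NumPy sketch of this discretization and its gradient (forward differences with a replicated border; $\beta$ and the border handling are illustrative choices):

```python
import numpy as np

def smoothed_tv(f, beta=1e-3):
    """Discretized TV: sum of sqrt(fx^2 + fy^2 + beta^2) over pixels."""
    fx = np.diff(f, axis=1, append=f[:, -1:])
    fy = np.diff(f, axis=0, append=f[-1:, :])
    return np.sum(np.sqrt(fx ** 2 + fy ** 2 + beta ** 2))

def smoothed_tv_gradient(f, beta=1e-3):
    """Gradient of smoothed TV: Dx^T (fx/mag) + Dy^T (fy/mag)."""
    fx = np.diff(f, axis=1, append=f[:, -1:])
    fy = np.diff(f, axis=0, append=f[-1:, :])
    mag = np.sqrt(fx ** 2 + fy ** 2 + beta ** 2)
    px, py = fx / mag, fy / mag
    g = np.zeros_like(f)
    g[:, 0] -= px[:, 0]                  # adjoint of the x-difference operator
    g[:, 1:] += px[:, :-1] - px[:, 1:]
    g[0, :] -= py[0, :]                  # adjoint of the y-difference operator
    g[1:, :] += py[:-1, :] - py[1:, :]
    return g
```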
22. Regularization Operator
- The regularization operator is computed using the expression
  $L(f) = D_x^T \, \mathrm{diag}\big(\psi'(f)\big) \, D_x + D_y^T \, \mathrm{diag}\big(\psi'(f)\big) \, D_y$
- where $D_x$ and $D_y$ are first-order difference matrices and $\psi'(f) = 1 / \sqrt{|\nabla f|^2 + \beta^2}$.
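A sparse-matrix sketch of this operator (SciPy; the difference-matrix construction and $\beta$ are illustrative assumptions):

```python
import numpy as np
import scipy.sparse as sp

def diff_matrix(k):
    """k x k forward-difference matrix with a zeroed last row."""
    d = sp.diags([-1.0, 1.0], [0, 1], shape=(k, k), format="lil")
    d[-1, -1] = 0.0
    return d.tocsr()

def tv_regularization_operator(f, beta=1e-3):
    """L(f) = Dx^T diag(psi') Dx + Dy^T diag(psi') Dy for an m x n image f."""
    m, n = f.shape
    Dx = sp.kron(sp.eye(m), diff_matrix(n), format="csr")  # differences along rows
    Dy = sp.kron(diff_matrix(m), sp.eye(n), format="csr")  # differences along columns
    fv = f.ravel()
    psi = 1.0 / np.sqrt((Dx @ fv) ** 2 + (Dy @ fv) ** 2 + beta ** 2)
    W = sp.diags(psi)
    return Dx.T @ W @ Dx + Dy.T @ W @ Dy
```

One standard way to use this operator is the lagged-diffusivity fixed-point iteration: solve $(K^T K + \lambda L(f_k))\, f_{k+1} = K^T g$ repeatedly until f converges.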
23. Lasso Regression
- Lasso, for Least Absolute Shrinkage and Selection Operator, is a shrinkage and selection method for linear regression introduced by Tibshirani (1996).
- It minimizes the usual sum of squared errors, with a bound on the sum of the absolute values of the coefficients:
  $\min_\beta \sum_i \left( y_i - x_i^T \beta \right)^2 \quad \text{subject to} \quad \sum_j |\beta_j| \le t$
- The computation of the Lasso solution is a quadratic programming problem that is best solved by the least angle regression (LARS) algorithm.
- Lasso also uses the L1 penalty norm.
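The deck names LARS as the solver; as a simpler, self-contained illustration of the same L1-penalized objective (in its Lagrangian form), here is an iterative soft-thresholding (ISTA) sketch, which is not the solver used in the deck:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of the L1 norm: shrink each entry toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize 0.5 * ||X b - y||^2 + lam * ||b||_1 by iterative soft thresholding."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2    # 1 / Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)              # gradient of the smooth part
        b = soft_threshold(b - step * grad, step * lam)
    return b
```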
24. Ridge Regression and Lasso Equivalence
- The cost function of ridge regression is given as
  $\min_\beta \sum_i \left( y_i - x_i^T \beta \right)^2 + \lambda \sum_j \beta_j^2$
- Ridge regression is identical to zero-order Tikhonov regularization.
- The analytical solutions of ridge and Tikhonov are similar:
  $\hat\beta = (X^T X + \lambda I)^{-1} X^T y$
- The bias introduced favors solutions with small weights, and the effect is to smooth the output function.
25. Ridge Regression and Lasso Equivalence
- Instead of a single value of λ, different values of λ can be used for different pixels.
- This should provide the same solution as lasso regression (regularization); see the sketch after the diagram below.
- Thus, having established the relation between lasso and zero-order Tikhonov, there is a relation between Total Variation and lasso.
[Diagram: Tikhonov-Lasso equivalence (proved); the Lasso-Total Variation link is our aim to prove, since both are L1 norm penalties.]
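A minimal sketch of the pixel-wise λ idea (the function name and setup are hypothetical, for illustration only):

```python
import numpy as np

def ridge_per_coefficient(X, y, lam):
    """Ridge / zero-order Tikhonov with one penalty weight per coefficient:
    minimize ||X b - y||^2 + sum_j lam[j] * b[j]^2,
    whose minimizer is b = (X^T X + diag(lam))^{-1} X^T y.
    """
    return np.linalg.solve(X.T @ X + np.diag(lam), X.T @ y)
```

Re-solving with lam[j] chosen inversely proportional to |b[j]| from the previous solve makes the weighted L2 penalty behave like an L1 penalty; this iteratively reweighted view is one standard way to see the lasso-Tikhonov link the slide asserts.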
26. L1 Norm Regularization: Restoration
[Figure: synthetic images: input image, and blurred and noisy image.]
27. L1 Norm Regularization: Restoration
[Figure: Total Variation restoration and LASSO restoration.]
28. L1 Norm Regularization: Restoration
[Figure: blurred and noisy images at three degrees of blur (I, II, III), with the corresponding Total Variation and LASSO restorations.]
29. L1 Norm Regularization: Restoration
[Figure: blurred and noisy images at three levels of noise (I, II, III), with the corresponding Total Variation and LASSO restorations.]
30. Cross Section of Restoration
[Figure: cross sections for different degrees of blurring, Total Variation vs. LASSO regularization.]
31. Cross Section of Restoration
[Figure: cross sections for different levels of noise, Total Variation vs. LASSO regularization.]
32. Comparison of Algorithms
[Figure: original image with LASSO, Tikhonov, and Total Variation restorations.]
33. Effect of Different Levels of Noise and Blurring
[Figure: blurred and noisy image with LASSO, Total Variation, and Tikhonov restorations.]
34. Numerical Analysis of Results: Airplane

First Level of Noise

Plane | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error (%) | Restoration Time (min)
Total Variation | 2 | 10 | 2.05e-02 | 81.4 | 1.74 | 2.50
LASSO Regression | 1 | 6 | 1.00e-04 | 81.4 | 1.81 | 0.80
Tikhonov Regularization | -- | -- | 1.288e-10 | 81.4 | 9.85 | 0.20

Second Level of Noise

Plane | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error (%) | Restoration Time (min)
Total Variation | 1 | 15 | 1e-03 | 83.5 | 3.54 | 1.4
LASSO Regression | 1 | 2 | 1e-03 | 83.5 | 4.228 | 0.8
Tikhonov Regularization | -- | -- | 1.12e-10 | 83.5 | 11.2 | 0.30
35. Numerical Analysis of Results: Shelves and Airplane

Shelves | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error (%) | Restoration Time (min)
Total Variation | 2 | 11 | 1.00e-04 | 84.1 | 2.01 | 2.00
LASSO Regression | 1 | 8 | 1.00e-06 | 84.1 | 1.23 | 0.90

Plane | PD Iterations | CG Iterations | Lambda | Blurring Error (%) | Residual Error (%) | Restoration Time (min)
Total Variation | 2 | 10 | 1.00e-03 | 81.2 | 3.61 | 2.10
LASSO Regression | 1 | 14 | 1.00e-03 | 81.2 | 3.59 | 1.00
36. Graphical Representation: 5 Real Images
[Figure: restoration time and residual error for different degrees of blur.]
37. Graphical Representation: 5 Real Images
[Figure: restoration time and residual error for different levels of noise.]
38. Effect of Blurring and Noise
39. Conclusion
- The Total Variation method preserves the edge information in the restored image.
- The restoration time of Total Variation regularization is high.
- LASSO provides an impressive alternative to TV regularization.
- The restoration time of LASSO regularization is less than half the restoration time of TV regularization.
- The restoration quality of LASSO is better than or equal to the restoration quality of TV regularization.
40. Conclusion
- Both LASSO and TV regularization fail to suppress the noise in the restored images.
- Analysis shows that an increase in the degree of blur increases the restoration error.
- An increase in the noise level does not have a significant influence on the restoration time, but it affects the residual error.