Image Enhancement and Filtering Techniques

1
Image Enhancement and Filtering Techniques
  • EE4H, M.Sc 0407191
  • Computer Vision
  • Dr. Mike Spann
  • m.spann@bham.ac.uk
  • http://www.eee.bham.ac.uk/spannm

2
Introduction
  • Images may suffer from the following
    degradations:
  • Poor contrast due to poor illumination or the
    finite sensitivity of the imaging device
  • Electronic sensor noise or atmospheric
    disturbances leading to broad band noise
  • Aliasing effects due to inadequate sampling
  • Finite aperture effects or motion leading to
    spatial blurring

3
Introduction
  • We will consider simple algorithms for image
    enhancement based on lookup tables
  • Contrast enhancement
  • We will also consider simple linear filtering
    algorithms
  • Noise removal

4
Histogram equalisation
  • In an image of low contrast, the grey levels are
    concentrated in a narrow band
  • Define the grey level histogram of an image as
    h(i), where
  • h(i) = number of pixels with grey level i
    (see the sketch below)
  • For a low contrast image, the histogram will be
    concentrated in a narrow band
  • The full grey level dynamic range is not used
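A minimal sketch (not from the original slides) of computing h(i) for an 8-bit image with NumPy; the function name is illustrative:

    import numpy as np

    def grey_level_histogram(image):
        """Return h, where h[i] = number of pixels with grey level i (8-bit image)."""
        h = np.zeros(256, dtype=np.int64)
        for value in image.ravel():
            h[value] += 1
        return h

    # Equivalent vectorised form: np.bincount(image.ravel(), minlength=256)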

5
Histogram equalisation
6
Histogram equalisation
  • Can use a sigmoid lookup table to map input to
    output grey levels
  • A sigmoid function g(i) controls the mapping from
    input to output pixel grey level
  • Can easily be implemented in hardware for maximum
    efficiency (see the sketch below)
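A minimal sketch of such a sigmoid lookup table, assuming the two parameters are a centre alpha and a slope beta (the slide's own symbols did not survive transcription; the default values are illustrative):

    import numpy as np

    def sigmoid_lut(alpha=128.0, beta=0.05):
        """Lookup table g(i) = 255 / (1 + exp(-beta * (i - alpha))) for i = 0..255."""
        i = np.arange(256, dtype=np.float64)
        g = 255.0 / (1.0 + np.exp(-beta * (i - alpha)))
        return np.clip(np.round(g), 0, 255).astype(np.uint8)

    # Pointwise application to an 8-bit image: enhanced = sigmoid_lut()[image]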

7
Histogram equalisation
8
Histogram equalisation
  • α controls the position of maximum slope
  • β controls the slope (steepness)
  • Problem: we need to determine the optimum
    sigmoid parameters α and β for each image
  • A better method would be to determine the best
    mapping function from the image data

9
Histogram equalisation
  • A general histogram stretching algorithm is
    defined in terms of a transformation g(i)
  • We require a transformation g(i) such that, from
    any input histogram h(i), the output grey levels
    span the full dynamic range

10
Histogram equalisation
  • Constraints (for an N x N, 8-bit image): the
    output grey levels must stay in the range 0..255
    and the transformed histogram must still contain
    all N x N pixels
  • No crossover in grey levels after the
    transformation, i.e. g(i) must be monotonic:
    i1 ≥ i2 implies g(i1) ≥ g(i2)
11
Histogram equalisation
  • An adaptive histogram equalisation algorithm can
    be defined in terms of the cumulative histogram
    H(i) = h(0) + h(1) + ... + h(i)

12
Histogram equalisation
  • Since the required h(i) is flat, the required
    H(i) is a ramp

[Plots of the desired flat histogram h(i) and the corresponding ramp cumulative histogram H(i)]
13
Histogram equalisation
  • Let the actual histogram and cumulative histogram
    be h(i) and H(i)
  • Let the desired histogram and desired cumulative
    histogram be h̃(i) and H̃(i)
  • Let the transformation be g(i)

14
Histogram equalisation
  • Since g(i) is an ordered (monotonic)
    transformation, the cumulative histograms must
    match: H̃(g(i)) = H(i)
  • For the flat target, H̃(i) is a ramp, so for an
    N x N 8-bit image the mapping reduces to
    g(i) = round(255 H(i) / N²)
  • A minimal sketch of this mapping follows
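A minimal sketch of histogram equalisation through the cumulative histogram, assuming an 8-bit single-channel NumPy image (the function name is illustrative):

    import numpy as np

    def equalise(image, levels=256):
        """Histogram equalisation: g(i) = round((levels-1) * H(i) / number of pixels)."""
        h = np.bincount(image.ravel(), minlength=levels)      # h(i)
        H = np.cumsum(h)                                       # cumulative histogram H(i)
        g = np.round((levels - 1) * H / image.size).astype(np.uint8)
        return g[image]                                        # pointwise lookup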

15
Histogram equalisation
  • Worked example: a 32 x 32 pixel image with grey
    levels quantised to 3 bits

16
Histogram equalisation
17
Histogram equalisation
18
Histogram equalisation
19
Histogram equalisation
20
Histogram equalisation
  • ImageJ demonstration
  • http://rsb.info.nih.gov/ij/signed-applet

21
Image Filtering
  • Simple image operators can be classified as
    'pointwise' or 'neighbourhood' (filtering)
    operators
  • Histogram equalisation is a pointwise operation
  • More general filtering operations use
    neighbourhoods of pixels

22
Image Filtering
23
Image Filtering
  • The output g(x,y) can be a linear or non-linear
    function of the set of input pixel grey levels
    f(x-M, y-M) ... f(x+M, y+M)

24
Image Filtering
  • Examples of filters

25
Linear filtering and convolution
  • Example: the 3x3 arithmetic mean of an input
    image (ignoring the rounding from floating point
    back to byte values); see the sketch below
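A minimal sketch of the 3x3 arithmetic mean filter; leaving a one-pixel border unfiltered is an assumption, as the slides do not say how borders are handled:

    import numpy as np

    def mean_filter_3x3(image):
        """Each output pixel is the average of its 3x3 neighbourhood."""
        f = image.astype(np.float64)
        g = f.copy()                              # border pixels are left unfiltered
        for y in range(1, f.shape[0] - 1):
            for x in range(1, f.shape[1] - 1):
                g[y, x] = f[y-1:y+2, x-1:x+2].mean()
        return np.round(g).astype(image.dtype)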

26
Linear filtering and convolution
  • Convolution involves overlap, multiply and add
    with the convolution mask

27
Linear filtering and convolution
28
Linear filtering and convolution
  • We can define the convolution operator
    mathematically
  • The 2D convolution of an image f(x,y) with a
    filter h(x,y) is defined as
  • g(x,y) = ΣΣ h(x',y') f(x-x', y-y'), where the sum
    runs over the filter mask coordinates (x',y')
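A minimal sketch of direct spatial-domain convolution following this definition; the zero-padded, same-size output is an assumption rather than something specified on the slide:

    import numpy as np

    def convolve2d(f, h):
        """Overlap, multiply and add with the flipped M x M mask h (M odd)."""
        M = h.shape[0]
        pad = M // 2
        h_flipped = h[::-1, ::-1]                 # convolution flips the mask
        fp = np.pad(f.astype(np.float64), pad)    # zero-pad the borders
        g = np.zeros(f.shape, dtype=np.float64)
        for y in range(f.shape[0]):
            for x in range(f.shape[1]):
                g[y, x] = np.sum(h_flipped * fp[y:y+M, x:x+M])
        return g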

29
Linear filtering and convolution
  • Example: convolution with a Gaussian filter
    kernel
  • h(x,y) = (1 / 2πσ²) exp( -(x² + y²) / 2σ² )
  • σ determines the width of the filter and hence
    the amount of smoothing (sketch below)
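A minimal sketch of building a sampled, normalised Gaussian mask; the 3σ truncation radius is a common choice, not taken from the slides:

    import numpy as np

    def gaussian_kernel(sigma, radius=None):
        """Sampled 2D Gaussian h(x,y), normalised so the mask sums to 1."""
        if radius is None:
            radius = int(np.ceil(3 * sigma))      # truncate at about 3 sigma
        ax = np.arange(-radius, radius + 1)
        xx, yy = np.meshgrid(ax, ax)
        h = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
        return h / h.sum()

    # e.g. smoothed = convolve2d(image, gaussian_kernel(1.5)), using the sketch above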

30
Linear filtering and convolution
31
Linear filtering and convolution
[Images: original, noisy, filtered with σ = 1.5, and filtered with σ = 3.0]
32
Linear filtering and convolution
  • ImageJ demonstration
  • http://rsb.info.nih.gov/ij/signed-applet

33
Linear filtering and convolution
  • We can also define convolution as a frequency
    domain operation
  • It is based on the discrete Fourier transform
    (DFT) F(u,v) of the image f(x,y):
  • F(u,v) = Σx Σy f(x,y) exp( -j2π(ux + vy)/N )

34
Linear filtering and convolution
  • The inverse DFT is defined by
  • f(x,y) = (1/N²) Σu Σv F(u,v) exp( j2π(ux + vy)/N )
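A minimal sketch of the forward and inverse 2D DFT using NumPy's FFT routines:

    import numpy as np

    def dft_pair(f):
        """Return F(u,v), the image recovered by the inverse DFT, and the centred magnitude spectrum."""
        F = np.fft.fft2(f)                        # forward DFT
        f_recovered = np.fft.ifft2(F).real        # inverse DFT
        magnitude = np.abs(np.fft.fftshift(F))    # zero frequency moved to the centre
        return F, f_recovered, magnitude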

35
Linear filtering and convolution
36
Linear filtering and convolution
37
Linear filtering and convolution
  • F(u,v) is the frequency content of the image at
    spatial frequency position (u,v)
  • Smooth regions of the image contribute low
    frequency components to F(u,v)
  • Abrupt transitions in grey level (lines and
    edges) contribute high frequency components to
    F(u,v)

38
Linear filtering and convolution
  • We can compute the DFT directly using the formula
  • An N x N point DFT would require N² floating
    point multiplications per output point
  • Since there are N² output points, the
    computational complexity of the DFT is N⁴
  • N⁴ ≈ 4 x 10⁹ for N = 256
  • Bad news! Many hours on a workstation

39
Linear filtering and convolution
  • The FFT algorithm was developed in the 60s for
    seismic exploration
  • Reduced the DFT complexity to 2N²log₂N
  • 2N²log₂N ≈ 10⁶ for N = 256
  • A few seconds on a workstation

40
Linear filtering and convolution
  • The filtering interpretation of convolution can
    be understood in terms of the convolution theorem
  • The convolution of an image f(x,y) with a filter
    h(x,y) is defined, as before, by
    g(x,y) = ΣΣ h(x',y') f(x-x', y-y')

41
Linear filtering and convolution
42
Linear filtering and convolution
  • Note that the filter mask is shifted and inverted
    prior to the overlap multiply and add stage of
    the convolution
  • Define the DFTs of f(x,y), h(x,y) and g(x,y) as
    F(u,v), H(u,v) and G(u,v)
  • The convolution theorem states simply that
    G(u,v) = F(u,v) H(u,v)

43
Linear filtering and convolution
  • As an example, suppose h(x,y) corresponds to a
    linear filter whose frequency response H(u,v)
    suppresses low spatial frequencies (a high-pass
    filter)
  • It removes the low frequency components of the
    image (a sketch follows)
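A minimal sketch of such a filter applied in the frequency domain: an ideal high-pass response that zeroes everything below an assumed cutoff radius (the slide's actual H(u,v) is not reproduced here):

    import numpy as np

    def ideal_highpass(f, cutoff=0.1):
        """Zero spatial frequencies below cutoff (fraction of the maximum radius), then invert."""
        N = f.shape[0]                            # assume an N x N image
        F = np.fft.fftshift(np.fft.fft2(f))       # centre the zero frequency
        u = np.arange(N) - N // 2
        uu, vv = np.meshgrid(u, u)
        H = (np.sqrt(uu**2 + vv**2) >= cutoff * (N // 2)).astype(float)
        G = H * F                                 # convolution theorem: G(u,v) = H(u,v) F(u,v)
        return np.fft.ifft2(np.fft.ifftshift(G)).real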

44
Linear filtering and convolution
[Diagram: image → DFT → multiply by H(u,v) → IDFT → filtered image]
45
Linear filtering and convolution
  • Frequency domain implementation of convolution
  • Image f(x,y): N x N pixels
  • Filter h(x,y): M x M filter mask points
  • Usually M << N
  • In this case the filter mask is 'zero-padded'
    out to N x N
  • The full (linear) convolution output g(x,y) is of
    size (N+M-1) x (N+M-1) pixels; the DFT
    implementation wraps around, truncating g(x,y) to
    an N x N image
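A minimal sketch of frequency-domain convolution with zero-padding out to the full (N+M-1) x (N+M-1) size, so that the wrap-around does not corrupt the result:

    import numpy as np

    def fft_convolve(f, h):
        """Linear convolution via the convolution theorem with zero-padded DFTs."""
        size = (f.shape[0] + h.shape[0] - 1, f.shape[1] + h.shape[1] - 1)
        F = np.fft.fft2(f, s=size)                # zero-pads f to the output size
        H = np.fft.fft2(h, s=size)                # zero-pads the mask likewise
        return np.fft.ifft2(F * H).real           # G(u,v) = F(u,v) H(u,v), then IDFT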

46
Linear filtering and convolution
47
Linear filtering and convolution
48
Linear filtering and convolution
  • We can evaluate the computational complexity of
    implementing convolution in the spatial and
    spatial frequency domains
  • An N x N image is to be convolved with an M x M
    filter
  • Spatial domain convolution requires M² floating
    point multiplications per output point, or N²M²
    in total
  • Frequency domain implementation requires
    3 x (2N²log₂N) + N² floating point
    multiplications (2 DFTs + 1 IDFT + N²
    multiplications of the DFTs)

49
Linear filtering and convolution
  • Example 1: N = 512, M = 7
  • Spatial domain implementation requires 1.3 x 10⁷
    floating point multiplications
  • Frequency domain implementation requires 1.4 x
    10⁷ floating point multiplications
  • Example 2: N = 512, M = 32
  • Spatial domain implementation requires 2.7 x 10⁸
    floating point multiplications
  • Frequency domain implementation requires 1.4 x
    10⁷ floating point multiplications
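A small check of these figures using the operation counts above (a sketch; multiplications only):

    import math

    def multiplication_counts(N, M):
        """N²M² for spatial convolution; 3(2N²log₂N) + N² for the frequency domain route."""
        spatial = N**2 * M**2
        frequency = 3 * (2 * N**2 * math.log2(N)) + N**2
        return spatial, frequency

    print(multiplication_counts(512, 7))     # ~1.3e7 (spatial) vs ~1.4e7 (frequency)
    print(multiplication_counts(512, 32))    # ~2.7e8 (spatial) vs ~1.4e7 (frequency)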

50
Linear filtering and convolution
  • For smaller mask sizes, spatial and frequency
    domain implementations have about the same
    computational complexity
  • However, we can speed up the frequency domain
    implementation by tessellating the image into
    sub-blocks and filtering these independently
  • Not quite that simple: we need to overlap the
    filtered sub-blocks to remove blocking artefacts
  • This is the overlap-and-add algorithm

51
Linear filtering and convolution
  • We can look at some examples of linear filters
    commonly used in image processing and their
    frequency responses
  • In particular we will look at a smoothing filter
    and a filter to perform edge detection

52
Linear filtering and convolution
  • Smoothing (low pass) filter
  • Simple arithmetic averaging
  • Useful for smoothing images corrupted by additive
    broad band noise

53
Linear filtering and convolution
[Plots: the averaging mask in the spatial domain and its low-pass response in the spatial frequency domain]
54
Linear filtering and convolution
  • Edge detection filter
  • Simple differencing filter used for enhancing
    edges (see the sketch below)
  • Has a bandpass frequency response
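A minimal sketch of applying the differencing filter h(x) = (1, 0, -1) along image rows; the sign convention does not matter once the magnitude is taken:

    import numpy as np

    def horizontal_edges(image):
        """Difference each pixel's left and right neighbours and keep the magnitude."""
        f = image.astype(np.float64)
        g = np.zeros_like(f)
        g[:, 1:-1] = f[:, :-2] - f[:, 2:]         # f(x-1) - f(x+1) at each column x
        return np.abs(g)                          # large values mark abrupt transitions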

55
Linear filtering and convolution
  • ImageJ demonstration
  • http://rsb.info.nih.gov/ij/signed-applet

56
Linear filtering and convolution
57
Linear filtering and convolution
  • We can evaluate the (1D) frequency response of
    the filter h(x) = (1, 0, -1) from the DFT
    definition
  • H(u) = 1 - exp(-j4πu/N)
         = 2j exp(-j2πu/N) sin(2πu/N)

58
Linear filtering and convolution
  • The magnitude of the response is therefore
    |H(u)| = 2 |sin(2πu/N)|
  • This has a bandpass characteristic: it is zero at
    u = 0 and u = N/2 and peaks at u = N/4
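A quick numerical check of this result (a sketch; N = 64 is arbitrary): the DFT of (1, 0, -1) zero-padded to N points has magnitude 2|sin(2πu/N)|:

    import numpy as np

    N = 64
    h = np.zeros(N)
    h[0], h[2] = 1.0, -1.0                        # the mask h(x) = (1, 0, -1)
    H = np.fft.fft(h)
    u = np.arange(N)
    assert np.allclose(np.abs(H), 2 * np.abs(np.sin(2 * np.pi * u / N)))
    print(np.abs(H)[: N // 2])                    # rises from 0, peaks near u = N/4, returns to 0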

59
Linear filtering and convolution
60
Conclusion
  • We have looked at basic (low level) image
    processing operations
  • Enhancement
  • Filtering
  • These are usually important pre-processing steps
    carried out in computer vision systems (often in
    hardware)