BEST Lecture: Image Processing
Transcript of a 140-slide presentation by Benoit Macq (macq_at_tele.ucl.ac.be)
1
BEST Lecture Image Processing
  • Image Processing: the image signal,
  • linear processing and segmentation,
  • Mathematical morphology
  • Benoit Macq (macq_at_tele.ucl.ac.be)
  • www.similar.cc

2
The image signal
  • Resolution and spatial frequency
  • Depth
  • Palette (including colours spaces)

3
The structure of an image
  • The original image is a multidimensional physical
    parameter (colour intensity, electromagnetic
    radiation)
  • 2D (photography), 2D+t (video)
  • 2 x 2D (stereo), 2 x 2D+t (stereo video)
  • 3D or 3D+t (MRI)
  • Historically, digital imaging started with space
    imagery; then came the medical imaging world;
    digital TV was implemented during the 90s; today
    the Internet and mobile video; tomorrow mixed
    realities and immersion in 3-D worlds

4
The 2-D image
  • The digital image (2D) is a matrix defined by
  • its resolution (number of pixels)
  • its depth (number of possible values for each
    pixel)
  • its palette (colour look-up table (CLUT))

5
Digital Image
6
Resolution 64X64
7
Resolution 32X32
8
Resolution: Vision (physical image) vs. Digital
Image
  • Resolution
  • Human vision can resolve (in stereo) objects
    ranging from very small to very large
  • The digital image has a fixed resolution
    determined by its number of pixels.

9
Resolution of an image
  • Nl = number of lines per picture height
  • Maximum frequency: Nl/2 cycles per picture
    height
  • If D is given (in TV, D = 6H), there is a direct
    relationship between cycles per degree and cycles
    per picture height

[Diagram: viewer at distance D from a screen of
height H]
10
The spectrum of a digital image
11
[Spectrum figure: discrete frequency axes m, n]
12
Sampling of an image
Then sampling the spatial frequencies
13
Sampling
  • Must be performed at a frequency greater than
    twice the highest frequency in the signal
    (the Nyquist criterion)
  • Otherwise the original image has to be filtered
    to remove frequencies above half the sampling
    frequency
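The folding effect behind this criterion can be checked numerically. A minimal NumPy sketch (illustrative, not part of the original lecture): a 9 Hz sine sampled at 12 Hz, below its 18 Hz Nyquist rate, yields exactly the samples of a phase-inverted 3 Hz sine.

```python
import numpy as np

fs = 12.0                                 # sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)
high = np.sin(2 * np.pi * 9 * t)          # 9 Hz: above fs/2 = 6 Hz
low = np.sin(2 * np.pi * 3 * t)           # 3 Hz: the alias, 12 - 9 = 3
# The undersampled 9 Hz sine produces exactly the samples of a
# phase-inverted 3 Hz sine: the two are indistinguishable
print(np.allclose(high, -low, atol=1e-9))
```

Once sampled, nothing distinguishes the two signals, which is why the filtering must happen before sampling.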

14
Aliasing?
15
Aliasing!
19
Sampled Frequencies
20
Fourier more
  • Benoit Macq
  • (ATHENS Lecture in Paris)

21
Fourier Transform (I)
  • Brief Description
  • An important image processing tool which is used
    to decompose an image into its sine and cosine
    components.
  • Output of the transform represents the image in
    the Fourier or frequency domain.
  • In the Fourier domain image, each point
    represents a particular frequency contained in
    the spatial domain image.
  • Applications: image analysis, image filtering,
    image reconstruction and image compression

22
Fourier Transform (II)
  • How It Works
  • 2-dimensional DFT
  • Number of frequencies corresponds to the number
    of pixels in the spatial domain image.
  • f(i,j) is the image in the spatial domain.
  • Exponential term is the basis function
    corresponding to each point, F(k,l), in the
    Fourier space.

23
Fourier Transform (III)
  • DC-value and AC-values
  • DC-value: F(0,0), the image mean
  • AC-values: all the others
  • F(0,0) corresponds to the average brightness and
    F(N-1, N-1) represents the highest frequency
  • Inverse Fourier transform
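A quick NumPy check of the DC-value claim (illustrative, not from the slides): with NumPy's unnormalised FFT convention, F(0,0) is the plain pixel sum, so the average brightness is F(0,0) divided by the pixel count.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((8, 8))
F = np.fft.fft2(img)
# In NumPy's unnormalised convention, F(0,0) is the pixel sum,
# so the average brightness is F(0,0) divided by the pixel count
print(np.isclose(F[0, 0].real, img.sum()))
print(np.isclose(F[0, 0].real / img.size, img.mean()))
```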

24
Fourier Transform (IV)
  • Separation of DFT Formula
  • Expressing the two-dimensional Fourier transform
    as a series of 2N one-dimensional transforms
    decreases the number of required computations.
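The separation property is easy to verify with NumPy (an illustrative sketch): applying 1-D FFTs along the rows and then along the columns reproduces the 2-D transform.

```python
import numpy as np

rng = np.random.default_rng(1)
img = rng.random((16, 16))
# 1-D FFT of every row, then 1-D FFT of every column...
rows_done = np.fft.fft(img, axis=1)
separable = np.fft.fft(rows_done, axis=0)
# ...equals the 2-D FFT computed in one call
print(np.allclose(separable, np.fft.fft2(img)))
```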

25
Fourier Transform (V)
  • Fast Fourier Transform(FFT)
  • The ordinary one-dimensional DFT has O(N^2)
    complexity.
  • The Fast Fourier Transform reduces this to
    O(N log2 N).
  • It restricts the input size to N = 2^n.
  • Real and imaginary part of output
  • Output of Fourier transform is a complex number.
  • Displayed with two images real and imaginary OR
    magnitude and phase.
  • Often only use magnitude part.
  • To re-transform we need to use both magnitude and
    phase.
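A small NumPy sketch (not from the lecture) of the magnitude/phase split: the image is recovered exactly only when both parts are combined again.

```python
import numpy as np

rng = np.random.default_rng(2)
img = rng.random((8, 8))
F = np.fft.fft2(img)
mag, phase = np.abs(F), np.angle(F)      # the two displayed images
# Magnitude alone is not enough; magnitude and phase together
# reconstruct the original exactly
back = np.fft.ifft2(mag * np.exp(1j * phase)).real
print(np.allclose(back, img))
```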

26
Fourier Transform (VI)
  • Guidelines for Use
  • The Fourier transform is used if we want to
    access the geometric characteristics of a spatial
    domain image.
  • Easy to process certain frequencies of the image,
    thus influencing the geometric structure in the
    spatial domain.
  • Display Fourier image
  • DC-value is displayed in center.
  • The further away from the center an image point
    is, the higher is its corresponding frequency.
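This display convention can be sketched with NumPy's `fftshift` (illustrative): the DC term moves to the centre of the array, and a log mapping compresses the dynamic range for viewing.

```python
import numpy as np

rng = np.random.default_rng(3)
img = rng.random((8, 8))
F = np.fft.fftshift(np.fft.fft2(img))    # move F(0,0) to the centre
display = np.log1p(np.abs(F))            # log scaling for viewing
centre = (img.shape[0] // 2, img.shape[1] // 2)
# After the shift, the DC term (the pixel sum, in NumPy's convention)
# sits at the centre of the displayed spectrum
print(np.isclose(np.abs(F)[centre], img.sum()))
```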

27
Fourier Transform (VII)
  • Example Fourier Transform
  • The image contains components of all frequencies,
    but their magnitude gets smaller for higher
    frequencies.
  • Low frequencies contain more image information
    than the higher ones.
  • Two dominating directions in the Fourier image,
    vertical and horizontal. These originate from the
    regular patterns in the background.

Original image
Fourier transform
Logarithmic operator applied
28
Fourier Transform (VIII)
  • Example Phase image
  • The value of each point determines the phase of
    the corresponding frequency.
  • The phase information is crucial to reconstruct
    the correct image in the spatial domain.

29
Fourier Transform (IX)
  • Contains 3 main values: the DC-value and two AC
    points corresponding to the frequency of the
    stripes in the original image (the Fourier
    image is symmetrical about its centre).
  • Distance of the point to the centre:
  • One-pixel-wide stripes: maximum frequency, the
    furthest point
  • Two-pixel-wide stripes: maximum frequency / 2,
    1/2 of the furthest point

30
Fourier Transform (X)
  • Example Simple image 2
  • (c) Minor frequencies appear because the diagonal
    is approximated by the square pixels of the image.
  • (d) To find the most important frequencies,
    threshold: keep all frequencies whose magnitude is
    at least 5% of the main peak.
  • The represented frequencies are all multiples of
    the basic frequency of the stripes in the spatial
    domain image.

(a) Diagonal stripes
(b) Fourier transform
(c) Logarithm scaling
(d) Thresholding
31
Fourier Transform (XI)
  • Example Low-pass filtering in Fourier image
  • Result image is a low-pass filtered version of
    the original spatial domain image.

Fourier image (r = 32 pixels)
Original image
Inverse transform
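The low-pass example can be sketched in NumPy (illustrative; the radius r = 32 follows the caption above): zero every coefficient outside a disc around the centred DC term, then invert.

```python
import numpy as np

rng = np.random.default_rng(4)
img = rng.random((64, 64))
F = np.fft.fftshift(np.fft.fft2(img))

r = 32                                           # cut-off radius
y, x = np.ogrid[:64, :64]
mask = (y - 32) ** 2 + (x - 32) ** 2 <= r ** 2   # disc around the DC term
low = np.fft.ifft2(np.fft.ifftshift(F * mask)).real

# An ideal low-pass keeps the DC term, so the image mean is preserved
print(np.isclose(low.mean(), img.mean()))
```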
32
Fourier Transform (XII)
  • Example Add noise in Fourier image
  • This image is the same as the direct sum of the
    two original spatial domain images.

33
Fourier Transform (XIII)
  • Example Finding geometric structure (text
    orientation)
  • We can see that the main values lie on a vertical
    line, indicating that the text lines in the input
    image are horizontal.

Original image (text document)
Fourier transform
Thresholded image
34
Fourier Transform (XIV)
  • Example Finding geometric structure (continued)
  • We can see that the line of the main peaks in the
    Fourier domain is rotated according to rotation
    of the input image.
  • The second line originates from the black corners
    in the rotated image.
  • Also, there exists a reasonable amount of noise
    from the irregular pattern of the letters. It can
    be decreased by forming solid blocks out of the
    text lines.

35
Fourier Transform (XV)
  • Discrete Cosine Transform(DCT)
  • α(k,n) = 1/N for k,n = 0
  •        = 2/N for k,n = 1, 2, ..., N-1
  • It generates a real-valued output image, and is
    thus fast.
  • A major use is in image compression
  • By throwing away the coefficients in high
    frequency components that the human eye is not
    very sensitive to.

36
Notes on Texture by FFT
  • The power spectrum computed from the Fourier
    Transform reveals which waves represent the image
    energy.

37
Stripes of the zebra create high-energy waves
generally along the u-axis; the grass pattern is
fairly random, causing scattered low-frequency
energy
[Image axes (x, y); spectrum axes (u, v)]
38
More stripes
Power spectrum x 64
39
Spectrum shows broad energy along the u-axis and
less along the v-axis; the roads give more structure
vertically, and so does the regularity of the
houses
40
Spartan stadium: the pattern of the seats is
evident in the power spectrum; lots of energy in
(u,v) along the direction of the seats.
42
Getting features from the spectrum
  • The FT can be applied to square image regions to
    extract texture features
  • Set conditions on the u-v transform image to
    compute features: f1 = sum of all pixels where
    R1 < |(u,v)| < R2 (bandpass)
  • f2 = sum of pixels (u,v) where u1 < u < u2
  • f3 = sum of pixels
    where |(u,v) - (u0,v0)| < R
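A hedged NumPy sketch of such features (here f1 is a band-pass ring, f2 a low-pass disc and f3 the high-pass remainder; the radii r1 and r2 are arbitrary illustrative choices, not values from the lecture):

```python
import numpy as np

def spectrum_features(img, r1, r2):
    """Energy features from the centred power spectrum of a square region."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    n = img.shape[0]
    y, x = np.ogrid[:n, :n]
    d = np.sqrt((y - n // 2) ** 2 + (x - n // 2) ** 2)
    f1 = power[(d > r1) & (d < r2)].sum()   # band-pass ring
    f2 = power[d <= r1].sum()               # low-pass disc
    f3 = power[d >= r2].sum()               # high-pass remainder
    return f1, f2, f3

rng = np.random.default_rng(5)
region = rng.random((32, 32))
f1, f2, f3 = spectrum_features(region, 4.0, 12.0)
# The three regions partition the u-v plane, so the features
# sum to the total spectral energy
total = (np.abs(np.fft.fft2(region)) ** 2).sum()
print(np.isclose(f1 + f2 + f3, total))
```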

43
Filtering or feature extraction using special
regions of u-v
F4 is all energy in directional wedge
F1 is all energy in small circle
F2 is all energy near origin (low pass)
F3 is all energy outside circle (high pass)
44
Depth of an image
  • Binary representation: b bits, 2^b levels

[Grey-level scale from black to white: 1-bit codes
0 and 1; 2-bit codes 00, 01, 10, 11]
45
Depth: Vision vs. Digital Image
  • Vision sensitivity is limited to 6 to 8 bits
    per colour component (i.e. max. 24 bits)
  • The digital image may have a much larger depth
    (e.g. in medical imaging and in remote sensing)

46
Palette: Vision vs. Digital Image
  • Palette
  • Vision works in the violet (350 nm) to red (700 nm)
    range
  • Digital imaging allows visualisation over a much
    wider range, from NMR imaging (0.001 nm) to
    microwave images (100000 nm)

47
Image Histogram
  • In raw imagery, the useful data often populates
    only a small portion of the available range of
    digital values (commonly 8 bits or 256 levels).
  • Contrast enhancement involves changing the
    original values so that more of the available
    range is used, thereby increasing the contrast
    between targets and their backgrounds.
  • The key to understanding contrast enhancements is
    to understand the concept of an image histogram

48
Histogram Stretch
  • By manipulating the range of digital values in an
    image, graphically represented by its histogram,
    we can apply various enhancements to the data.
  • There are many different techniques and methods
    of enhancing contrast and detail in an image; we
    will cover only a few common ones here.
  • Linear Stretch
  • Histogram Equalised Stretch

49
A linear stretch involves identifying lower and
upper bounds from the histogram (usually the
minimum and maximum brightness values in the
image) and applying a transformation to stretch
this range to fill the full range.
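A minimal NumPy sketch of such a linear stretch (illustrative; the 8-bit output range is an assumption):

```python
import numpy as np

def linear_stretch(img, out_max=255):
    """Map the image's [min, max] linearly onto [0, out_max]."""
    lo, hi = img.min(), img.max()
    return (img.astype(float) - lo) * out_max / (hi - lo)

img = np.array([[60, 70],
                [80, 158]], dtype=np.uint8)   # data using only part of 0..255
stretched = linear_stretch(img)
print(stretched.min(), stretched.max())       # 0.0 255.0
```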
51
Colour spaces
  • The RGB space (3 types of phosphors for colour
    excitation on a screen)

[RGB colour cube: axes R, G and B, with black and
white at opposite corners]
52
Colours (cont.)
  • Printing industry: CMY(K), Cyan Yellow Magenta
    (blacK); can be translated into RGB
  • Hue Saturation Intensity (HSI): artists, vision

[HSI colour solid: intensity axis I from black to
white; hue H as an angle through red, green and
blue; saturation S as the radius]
53
Colours (cont.)
  • YUV: luminance and chrominances, used in TV
  • black-and-white TV backward compatibility
  • Decorrelation of the components
  • Y contains most of the information
  • Y = 0.299R + 0.587G + 0.114B
  • U = -0.147R - 0.289G + 0.437B
  • V = 0.615R - 0.515G - 0.100B
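The matrix above can be applied directly; a small NumPy check (illustrative) shows that grey pixels (R = G = B) map to chrominances near zero, which is the point of the decorrelation.

```python
import numpy as np

# Rows of the matrix are the Y, U, V equations above
M = np.array([[ 0.299,  0.587,  0.114],
              [-0.147, -0.289,  0.437],
              [ 0.615, -0.515, -0.100]])

def rgb_to_yuv(rgb):
    return M @ rgb

y, u, v = rgb_to_yuv(np.array([0.5, 0.5, 0.5]))   # a mid grey
# Grey pixels carry luminance only; U and V are (almost) zero
# with these rounded coefficients
print(round(y, 3), abs(u) < 1e-2, abs(v) < 1e-9)
```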

54
Video formats
  • CCIR 601: 576 lines of 720 pixels, 25 images per
    second, 2 interlaced fields per frame, 4:2:2
    format (YUYV YUYV ...), 165 Mbit/s
  • HDTV: 16/9 format, 2 x 576 lines, 720 x 2 x (4/3)
    pixels (non-interlaced, and even higher for
    digital cinema)
  • TV 16/9
  • CIF: CCIR 601 /2 /2 /2
  • QCIF: CIF /2 /2

55
Colour Composites
  • A colour composite is a colour image produced
    through optical combination of multiband images
    by projection through filters.
  • True Colour Composite A colour imaging process
    whereby the colour of the image is the same as
    the colour of the object imaged.
  • False Colour Composite A colour imaging process
    which produces an image of a colour that does not
    correspond to the true colour of the scene (as
    seen by our eyes).

56
[Thumbnails of bands TM1 to TM7 of a Landsat TM 5
sub-scene showing the region around the
Alpinforschungszentrum Rudolfshütte]
57
Landsat TM, August 1991, Alpinforschungszentrum
Rudolfshütte.
58
[False colour composites from band combinations
TM 7,4,1; TM 5,4,3; TM 5,7,2; TM 4,3,2]
59
Filtering
  • Denoising
  • Features extraction
  • Multiresolution pyramids

60
Spatial Filtering
  • Spatial filtering encompasses another set of
    digital processing functions which are used to
    enhance the appearance of an image.
  • Spatial filters are designed to highlight or
    suppress specific features in an image based on
    their spatial frequency.
  • Spatial frequency is related to the concept of
    image texture, and refers to the frequency of the
    variations in tone that appear in an image

61
A common filtering operation involves moving a 'window'
of a few pixels in dimension (e.g. 3x3, 5x5, etc.)
over each pixel in the image, applying a
mathematical calculation using the pixel values
under that window, and replacing the central
pixel with the new value.
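A sketch of this sliding-window scheme in NumPy (illustrative; here the calculation is a 3x3 mean, and borders are mirrored, one of the options discussed later):

```python
import numpy as np

def mean_filter3(img):
    """Slide a 3x3 window over every pixel and replace the centre by
    the window mean; borders are handled by mirroring."""
    padded = np.pad(img, 1, mode="reflect")
    out = np.zeros(img.shape)
    for dy in range(3):
        for dx in range(3):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / 9.0

img = np.ones((5, 5))
print(np.allclose(mean_filter3(img), 1.0))  # a flat image is unchanged
```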
62
Spatial Filtering
63
Processing the borders
  • Theoretically: periodisation
  • Zeros outside the image (ringing effects)
  • Mirroring (optimal for symmetric filters)

64
Examples in 2-D filtering
65
Example of denoising in Matlab
  • Mean filters for noise reduction
  • Simple non-linear filters: median filters
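A hedged NumPy sketch of a 3x3 median filter (illustrative, in Python rather than the lecture's Matlab): a single outlier pixel is removed without smearing it over the neighbourhood.

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter: a non-linear filter that removes
    salt-and-pepper noise while preserving edges better than averaging."""
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    neighbourhoods = [padded[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)]
    return np.median(np.stack(neighbourhoods), axis=0)

img = np.full((5, 5), 10.0)
img[2, 2] = 255.0                 # one "salt" pixel
clean = median_filter3(img)
print(clean[2, 2])                # outlier replaced by the local median: 10.0
```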

66
Image of CHURN Farm Daedalus 1268 ATM
67
A low-pass filter is designed to emphasise
larger, homogeneous areas of similar tone and
reduce the smaller detail in an image. Thus,
low-pass filters generally serve to smooth the
appearance of an image.
68
A high-pass filter does the opposite, and serves
to sharpen the appearance of fine detail in an
image.
69
Vertical edges
Directional or edge detecting filters highlight
linear features, such as roads or field
boundaries. These filters can also be designed to
enhance features which are oriented in specific
directions and are useful in applications such as
geology, for the detection of linear geologic
structures.
70
Horizontal edges
71
The Marr-Hildreth approach
  • Filter the image with a Gaussian filter
  • Contour points: zero crossings of the second
    derivative of the filtered image
  • Equivalently, filter with the second derivative
    of the Gaussian; contours are the zero crossings

[1D profiles: an edge L(x) from black to white, its
first derivative L'(x) and second derivative L''(x)]
72
Examples in Matlab
73
More about edges
  • Benoit Macq
  • (ATHENS lecture)

74
Feature Detection
  • Image feature: a collection of pixels with some
    higher-level interpretation
  • Points
  • Edges
  • Texture
  • Features help to describe an object in an image
    (image analysis/segmentation)

76
Edge Detection in Images
  • Finding the contour of objects in a scene

77
Edge Detection in Images
  • What is an object?
  • It is one of the goals of computer vision to
    identify objects in scenes.

78
Edge Point Detection
  • Several methods
  • Image domain
  • Transform domain
  • Others (e.g. neural nets)
  • Image domain: use filters, usually seeking local
    extrema of the intensity function I(x,y)
  • Derivatives etc. can be expressed as filtering
    ops.
  • Can use the image gradient (∇I)(x,y) = (Ix, Iy) and
    identify maxima in M²(x,y) = (Ix)² + (Iy)² (can
    find orientation too)

79
  • Can also use the Laplacian (zero crossings)
  • (∇²I)(x,y) = Ixx + Iyy (isotropic)
  • Can use finite diffs, or filters which have
    similar behaviours
  • Some cheaper derivative-like filters, e.g. Sobel
  • Use 2D filters S1 and S2 (vertical and horiz
    edges)
  • Edges at maxima of |I * S1| + |I * S2|
  • But noise may give spurious results!

80
Differentiation and convolution
  • Recall
  • Now this is linear and shift invariant, so must
    be the result of a convolution.
  • We could approximate this as
  • (which is obviously a convolution with a kernel;
    it's not a very good way to do
    things, as we shall see)

81
Finite Difference in 2D
Definition
Convolution Kernels
Discrete Approximation
82
Finite differences
83
Classical Operators
Sobel's Operator
Differentiate
Smooth
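A sketch of Sobel's operator in NumPy (illustrative): each kernel combines a [1, 2, 1] smoothing in one direction with a [-1, 0, 1] difference in the other, and the two responses are combined into a gradient magnitude.

```python
import numpy as np

# Sobel kernels: [1, 2, 1] smoothing in one direction combined
# with a [-1, 0, 1] difference in the other
SX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=float)
SY = SX.T

def sobel_magnitude(img):
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            patch = padded[dy:dy + h, dx:dx + w]
            gx += SX[dy, dx] * patch
            gy += SY[dy, dx] * patch
    return np.sqrt(gx ** 2 + gy ** 2)

img = np.zeros((5, 6))
img[:, 3:] = 1.0                        # a vertical step edge
m = sobel_magnitude(img)
print(m[2, 2] > 0.0, m[2, 0] == 0.0)    # response near the edge, none far away
```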
84
Classical Operators
Prewitt's Operator
Differentiate
Smooth
85
Gaussian Filter
86
Marr-Hildreth Edge Detection
  • This means we must pre-filter the image before
    detection (e.g. with a Gaussian filter)
  • The Marr-Hildreth edge detector includes smoothing
    and multi-scale edge detection
  • Simple model of the HVS (information is present at
    multiple scales; we see edges and uniform
    regions)
  • Uses a Gaussian Gσ(x) to smooth the image at
    different scales
  • Compute the Laplacian L(x,y) at each scale
  • Edges have zero crossings in L(x,y) at multiple
    scale values

87
  • Create 1D filters
  • Sample Gσ(x) to get filter G
  • Sample (Gσ)x to get filter Gx
  • Sample (Gσ)y to get filter Gy
  • Compute Ix = Irows * G, Iy = Icols * G
  • Compute Ix = Ix * Gx, Iy = Iy * Gy
  • Compute the magnitude image
  • M(x,y) = √((Ix)² + (Iy)²)
  • Scan M(x,y) for maxima (edge points)

88
Canny Edge Detection
  • Canny posed edge detection as an optimisation
    problem and solved it
  • The optimal filter for edge detection is
  • (∇Gσ)(x,y) = ((Gσ)x, (Gσ)y)
  • The scale σ must be decided in advance
  • The 2D Gaussian is separable (a product of 1D
    Gaussians)
  • Its derivative also separates
  • The method is as follows

89
  • Maxima may be clustered across the edge and edges
    may be broken
  • Canny also introduced threshold hysteresis
  • Have a low and a high threshold
  • Non-maximal suppression
  • checks the neighbourhood for corroborating evidence

90
Classification for segmentation
91
Some segmentation techniques
  • Various choices of features (luminance, colour,
    variance, texture features, frame-difference
    intensity)
  • Region growing
  • split and merge
  • watersheds
  • Contour closing
  • Combined regions and contours

92
Example of Region Of Interest segmentation
[Change detector: regions changed with respect to a
reference frame in the original sequence yield
ROI 1, ROI 2 and ROI 3]
93
The multiple feature approach
  • A vector of features for each pixel
  • Input: exploit coherence and redundancies among
    features at the pixel level
  • Output: low-level descriptors and a confidence
    measure

[Per-pixel feature vector: texture, motion (vx, vy),
colour (Y, U, V), position (x, y)]
94
The multiple feature approach
  • What do regions look like in the feature space?

[Feature space (colour vs. motion axes) with pixel
clusters R1, R2, R3]
95
Fuzzy C-Means
The algorithm:
begin
  Initialize membership matrix U
  repeat
    Update centroids: minimize the objective function
      J(U,m) with U held constant
    Update memberships: minimize the objective function
      J(U,m) with m held constant
  until stable
end
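A minimal NumPy sketch of this alternating scheme for 1-D data (illustrative; a fixed iteration count replaces the stability test, and the fuzzifier exponent 2 is an assumed default):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=50, seed=0):
    """Alternate the two update steps of the flowchart on 1-D data X."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                       # memberships sum to 1 per point
    for _ in range(iters):                   # fixed count, not a stability test
        W = U ** m
        centroids = (W @ X) / W.sum(axis=1)  # update centroids (U fixed)
        d = np.abs(X[None, :] - centroids[:, None]) + 1e-12
        U = d ** (-2.0 / (m - 1.0))          # update memberships (means fixed)
        U /= U.sum(axis=0, keepdims=True)
    return centroids, U

X = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])   # two obvious 1-D clusters
centroids, U = fuzzy_c_means(X)
print(centroids.min() < 1.0 and centroids.max() > 4.0)
```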
96
Clusters of data in feature-space corresponding
to different surfaces
97
Parallelepiped Classifier
  • In this classifier, the range of spectral
    measurements is taken into account. The range is
    defined by the highest and lowest digital numbers
    assigned to each band from the training data
  • An unknown pixel is therefore classified
    according to its location within the class range.
    However, difficulties occur when class ranges
    overlap. This can occur when classes exhibit a
    high degree of correlation or covariance.
  • This can be partially overcome by introducing
    stepped borders to the class ranges.

98
Simple parallelepiped classification
99
Parallelepiped classification with more precise
boundaries
100
Minimum distance to means classifier
  • 1. Calculate the mean spectral value in each
    band and for each category.
  • 2. Represent each mean value by a vector.
  • 3. A pixel of unknown identity is classified by
    computing the distance between the value of the
    unknown pixel and each of the category means.
  • 4. After computing the distances, the unknown
    pixel is then assigned to the closest class.
  • Limitations of this process include the fact that
    it is insensitive to different degrees of
    variance within spectral measurements.
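A minimal NumPy sketch of the rule (the class means are hypothetical illustrative values, not data from the lecture):

```python
import numpy as np

def min_distance_classify(pixel, class_means):
    """Assign the pixel to the category whose mean vector is closest."""
    d = np.linalg.norm(class_means - pixel, axis=1)
    return int(np.argmin(d))

# Hypothetical mean spectral values in two bands for two categories
means = np.array([[20.0, 30.0],
                  [80.0, 90.0]])
print(min_distance_classify(np.array([25.0, 35.0]), means))  # 0
print(min_distance_classify(np.array([70.0, 95.0]), means))  # 1
```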

102
Minimum distance to means classification method
103
Maximum Likelihood Classifier
  • This classifier quantitatively evaluates both the
    variance and covariance of the trained spectral
    response patterns when deciding the fate of an
    unknown pixel.
  • To do this the classifier assumes that the
    distribution of points for each cover type is
    normally distributed
  • Under this assumption, the distribution of a
    category response can be completely described by
    the mean vector and the covariance matrix.
  • Given these values, the classifier computes the
    probability that unknown pixels will belong to a
    particular category.
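A hedged NumPy sketch (equal priors assumed, data values hypothetical): the class score is the Gaussian log-likelihood, so a class with large variance can win over a nearer-mean class with small variance, which is exactly what the minimum-distance rule ignores.

```python
import numpy as np

def ml_classify(pixel, means, covs):
    """Pick the class maximising the Gaussian log-likelihood
    -0.5*log|C| - 0.5*(x-m)^T C^-1 (x-m), assuming equal priors."""
    scores = []
    for mu, C in zip(means, covs):
        diff = pixel - mu
        scores.append(-0.5 * np.log(np.linalg.det(C))
                      - 0.5 * diff @ np.linalg.solve(C, diff))
    return int(np.argmax(scores))

# Two hypothetical classes with the same mean but different covariance
means = [np.array([0.0, 0.0]), np.array([0.0, 0.0])]
covs = [np.eye(2), 25.0 * np.eye(2)]
# The point (3, 3) is implausible under the tight class but ordinary
# under the spread-out one, so variance decides the outcome
print(ml_classify(np.array([3.0, 3.0]), means, covs))  # 1
```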

104
Maximum likelihood classification method
107
Mathematical Morphology - Set-theoretic
representation for binary shapes
108
What is mathematical morphology?
  • An approach to processing digital images based on
    their shape
  • A mathematical tool for investigating geometric
    structure in images
  • The language of morphology is set theory

109
Goal of morphological operations
  • Simplify image data, preserve essential shape
    characteristics and eliminate noise
  • Permit the underlying shapes to be identified
    and optimally reconstructed from their distorted,
    noisy forms

110
Shape Processing and Analysis
  • Identification of objects, object features and
    assembly defects correlate directly with shape
  • Shape is a prime carrier of information in
    machine vision

111
Shape Operators
  • Shapes are usually combined by means of

112
Morphological Operations
  • The primary morphological operations are dilation
    and erosion
  • More complicated morphological operators can be
    designed by means of combining erosions and
    dilations

113
Dilation
  • Dilation is the operation that combines two sets
    using vector addition of set elements.
  • Let A and B be subsets of 2-D space: A = the image
    undergoing analysis, B = the structuring element;
    A ⊕ B denotes the dilation

114
Dilation
B
A
115
Dilation
  • Let A be a subset of the 2-D space and x a point
    of it. The translation of A by x is defined as
    (A)x = {a + x : a ∈ A}
  • The dilation of A by B can be computed as the
    union of translations of A by the elements of B:
    A ⊕ B = ∪b∈B (A)b
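This translation-based definition can be written directly in NumPy (an illustrative sketch for binary images; `np.roll` wraps around the border, which is harmless for shapes away from the edges):

```python
import numpy as np

def dilate(A, B):
    """Dilation of a binary image A as the union of its translations
    by the offsets (dy, dx) of the structuring element B."""
    out = np.zeros_like(A)
    for dy, dx in B:
        out |= np.roll(np.roll(A, dy, axis=0), dx, axis=1)
    return out

A = np.zeros((5, 5), dtype=bool)
A[2, 2] = True                                # a single foreground pixel
cross = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]
D = dilate(A, cross)
print(int(D.sum()))  # the point grows into the 5-pixel cross: 5
```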

116
Dilation
B
117
Dilation
118
Example of Dilation
Pablo Picasso, Pass with the Cape, 1960
119
Properties of Dilation
  • Commutative
  • Associative
  • Extensivity
  • Dilation is increasing

120
Extensivity
A
B
121
Properties of Dilation
  • Translation Invariance
  • Linearity
  • Containment
  • Decomposition of structuring element

122
Erosion
  • Erosion is the morphological dual of dilation. It
    combines two sets using the vector subtraction of
    set elements.
  • Let A ⊖ B denote the erosion of A by B

123
Erosion
A
B
124
Erosion
  • Erosion can also be defined in terms of
    translations: A ⊖ B = {x : (B)x ⊆ A}
  • In terms of intersection: A ⊖ B = ∩b∈B (A)-b
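A matching NumPy sketch of erosion as an intersection of translations (same binary-image conventions as the dilation sketch; illustrative only):

```python
import numpy as np

def erode(A, B):
    """Erosion of a binary image A: the positions where every offset of
    the structuring element B lands inside A (an intersection of
    translations)."""
    out = np.ones_like(A)
    for dy, dx in B:
        out &= np.roll(np.roll(A, -dy, axis=0), -dx, axis=1)
    return out

A = np.zeros((5, 5), dtype=bool)
A[1:4, 1:4] = True                            # a solid 3x3 square
cross = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]
E = erode(A, cross)
print(int(E.sum()), bool(E[2, 2]))  # only the centre survives: 1 True
```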

125
Erosion
126
Erosion
127
Example of Erosion
Structuring Element
Pablo Picasso, Pass with the Cape, 1960
128
Properties of Erosion
  • Erosion is not commutative!
  • Antiextensivity
  • Erosion is increasing
  • Chain rule

129
Properties of Erosion
  • Translation Invariance
  • Linearity
  • Containment
  • Decomposition of structuring element

130
Duality Relationship
  • Dilation and erosion bear a marked similarity, in
    that what one does to the image foreground, the
    other does to the image background.
  • B̂, the reflection of B, is defined as
    B̂ = {-b : b ∈ B}
  • Erosion and Dilation Duality Theorem:
    (A ⊖ B)ᶜ = Aᶜ ⊕ B̂

131
Opening and Closing
  • Opening and closing are compositions of the
    dilation and erosion operations
  • Opening: A ∘ B = (A ⊖ B) ⊕ B
  • Closing: A • B = (A ⊕ B) ⊖ B
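A NumPy sketch (illustrative) composing the two binary operations and checking the idempotence property:

```python
import numpy as np

def dilate(A, B):
    out = np.zeros_like(A)
    for dy, dx in B:
        out |= np.roll(np.roll(A, dy, axis=0), dx, axis=1)
    return out

def erode(A, B):
    out = np.ones_like(A)
    for dy, dx in B:
        out &= np.roll(np.roll(A, -dy, axis=0), -dx, axis=1)
    return out

def opening(A, B):
    return dilate(erode(A, B), B)   # erosion followed by dilation

def closing(A, B):
    return erode(dilate(A, B), B)   # dilation followed by erosion

rng = np.random.default_rng(6)
A = rng.random((16, 16)) > 0.5
cross = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]
opened, closed = opening(A, cross), closing(A, cross)
# Idempotence: reapplying an opening (or a closing) changes nothing
print(np.array_equal(opening(opened, cross), opened))
print(np.array_equal(closing(closed, cross), closed))
```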

132
Opening and Closing
133
Opening and Closing
  • They are idempotent: reapplication has no
    further effect on the previously transformed
    result

134
Opening and Closing
  • Translation invariance
  • Antiextensivity of opening
  • Extensivity of closing
  • Duality

135
Example of Opening
Pablo Picasso, Pass with the Cape, 1960
136
Example of Closing
137
Morphological Filtering
  • Main idea
  • Examine the geometrical structure of an image by
    matching it with small patterns called
    structuring elements at various locations
  • By varying the size and shape of the matching
    patterns, we can extract useful information about
    the shape of the different parts of the image and
    their interrelations.

138
Morphological filtering
  • Noisy images will break down OCR systems

Clean image
Noisy image
139
Morphological filtering
Restored image
140
Summary
  • Mathematical morphology is an approach to
    processing digital images based on their shape
  • The language of morphology is set theory
  • The basic morphological operations are erosion
    and dilation
  • Morphological filtering can be developed to
    extract useful shape information