Title: Chapter 3 cont'd
1. Chapter 3 cont'd
- Adjacency, Histograms, Thresholding
2. RAGs (Region Adjacency Graphs)
3. RAGs (Region Adjacency Graphs)
- Steps
- Label the image.
- Scan and enter adjacencies in the graph.
- (includes containment)
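The scan step above can be sketched as one pass over a labeled image that records an edge whenever two 4-adjacent pixels carry different labels. A minimal sketch: the image representation and the name `buildRAG` are illustrative, and the containment case is not handled here.

```cpp
#include <algorithm>
#include <set>
#include <utility>
#include <vector>

// Build a region adjacency graph from a labeled image L by comparing each
// pixel with its right and lower neighbors (4-adjacency). Each edge is stored
// once as an ordered pair of labels.
std::set<std::pair<int,int>> buildRAG(const std::vector<std::vector<int>>& L) {
    std::set<std::pair<int,int>> edges;
    int rows = (int)L.size();
    int cols = rows ? (int)L[0].size() : 0;
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c) {
            if (c + 1 < cols && L[r][c] != L[r][c+1])   // right neighbor
                edges.insert({std::min(L[r][c], L[r][c+1]),
                              std::max(L[r][c], L[r][c+1])});
            if (r + 1 < rows && L[r][c] != L[r+1][c])   // lower neighbor
                edges.insert({std::min(L[r][c], L[r+1][c]),
                              std::max(L[r][c], L[r+1][c])});
        }
    return edges;
}
```

The degree of a node then falls out of the edge set: it is the number of edges touching that region's label.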
4. (No transcript)
5. Define the degree of a node. What is special about nodes with degree 1?
6. But how do we obtain binary images?
7. Histograms and Thresholding
8. Gray to binary
- Thresholding
- G → B
- const int t = 200;
- if (G[r][c] > t) B[r][c] = 1;
- else B[r][c] = 0;
- How do we choose t?
- Interactively
- Automatically
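The slide's thresholding rule, written out as a runnable sketch. The image type and the default t = 200 follow the slide; the function name is illustrative.

```cpp
#include <vector>

// Threshold a gray image G into a binary image B:
// B[r][c] = 1 if G[r][c] > t, else 0.
std::vector<std::vector<int>>
threshold(const std::vector<std::vector<int>>& G, int t = 200) {
    std::vector<std::vector<int>> B(G.size());
    for (std::size_t r = 0; r < G.size(); ++r)
        for (int g : G[r])
            B[r].push_back(g > t ? 1 : 0);
    return B;
}
```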
9. Gray to binary
- Interactively. How?
- Automatically.
- Many, many, many, many methods.
- Experimentally (using a priori information).
- Supervised / training methods.
- Unsupervised.
- Otsu's method (among many, many, many, many other methods).
10. Histogram
- Probability of a given gray value in an image.
- h(g) = count of pixels with gray value equal to g.
- p(g) = h(g) / (w × h)
- w × h = number of pixels in the entire image.
- What is the range of possible values for p(g)?
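The two definitions above can be sketched directly; the function names and the image representation are illustrative.

```cpp
#include <vector>

// h(g): count of pixels in G with gray value g, for g in 0..maxGray.
std::vector<int> grayHistogram(const std::vector<std::vector<int>>& G,
                               int maxGray) {
    std::vector<int> h(maxGray + 1, 0);
    for (const auto& row : G)
        for (int g : row) ++h[g];
    return h;
}

// p(g) = h(g) / (w*h). Each p(g) lies in [0, 1] and the values sum to 1.
std::vector<double> grayProbabilities(const std::vector<int>& h,
                                      int numPixels) {
    std::vector<double> p(h.size());
    for (std::size_t g = 0; g < h.size(); ++g)
        p[g] = double(h[g]) / numPixels;
    return p;
}
```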
11. Histogram
- Note: sometimes we need to group gray values together in our histogram into bins or buckets.
- E.g., we have 10 bins in our histogram and 100 possible different gray values. So we put 0..9 into bin 0, 10..19 into bin 1, and so on.
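The bin mapping in the example reduces to integer division; this one-liner (name illustrative) assumes the gray values 0..numValues-1 split evenly across the bins.

```cpp
// Map gray value g to its bin: with 100 values and 10 bins,
// 0..9 -> bin 0, 10..19 -> bin 1, ..., 90..99 -> bin 9.
int binOf(int g, int numValues, int numBins) {
    return g * numBins / numValues;   // integer division groups consecutive values
}
```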
12. Histogram
13. Something is missing here!
14. Example of histogram
15. Example of histogram
We can even analyze the histogram just as we analyze images. One common measure is entropy.
16. Example of histogram
We can even analyze the histogram just as we analyze images. One common measure is entropy.
17. Calculating entropy
- Notes
- p(k) is in [0, 1].
- If p(k) = 0 then don't calculate log(p(k)). Why?
- My calculator only has log base 10. How do I calculate log base 2?
- Why the minus sign to the left of the summation?
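A sketch that answers both questions in the notes: terms with p(k) = 0 are skipped (x log x tends to 0 as x tends to 0), and log base 2 comes from base-10 logs via log2(x) = log10(x) / log10(2). The function name is illustrative.

```cpp
#include <cmath>
#include <vector>

// Entropy of a normalized histogram p: H = -sum_k p(k) * log2(p(k)).
double entropy(const std::vector<double>& p) {
    double H = 0.0;
    for (double pk : p)
        if (pk > 0.0)                                   // skip p(k) == 0 terms
            H -= pk * (std::log10(pk) / std::log10(2.0)); // log2 via log10
    return H;
}
```

The minus sign makes H nonnegative, since log2(p(k)) ≤ 0 whenever p(k) is in (0, 1].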
18. Example histograms
Same subject but different images and histograms (because of the difference in contrast).
19. Example of different thresholds
20. So how can we determine the threshold automatically?
21. Otsu's method
- Automatic thresholding method
- Automatically picks t given an image histogram.
- Assumes 2 groups are present in the image:
- Those that are < t.
- Those that are > t.
22. Otsu's method
Best choices for t.
23. Otsu's method
- For every possible t:
- Pick a t.
- Calculate the within-group variances:
- probability of being in group 1
- probability of being in group 2
- determine the mean of group 1
- determine the mean of group 2
- calculate the variance for group 1
- calculate the variance for group 2
- Calculate the weighted sum of the group variances and remember which t gave rise to the minimum.
24. Otsu's method: probability of being in each group
25. Otsu's method: mean of individual groups
26. Otsu's method: variance of individual groups
27. Otsu's method: weighted sum of group variances
- Calculate for all values of t and minimize.
- Demo: Otsu.
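The t-loop from slide 23 can be sketched directly over a histogram. This is a deliberately plain O(L²) version for clarity (real implementations keep running sums); the function name is illustrative, and gray values are assumed to index the histogram directly.

```cpp
#include <vector>

// Otsu's method over histogram h[0..L-1]: for each candidate t, split into
// group 1 (values <= t) and group 2 (values > t), compute group probabilities
// q1, q2, means, and variances, and minimize the weighted within-group
// variance q1*v1 + q2*v2.
int otsuThreshold(const std::vector<int>& h) {
    int total = 0;
    for (int c : h) total += c;
    int best = 0;
    double bestScore = -1.0;
    for (std::size_t t = 0; t + 1 < h.size(); ++t) {
        double n1 = 0, n2 = 0, s1 = 0, s2 = 0;
        for (std::size_t g = 0; g < h.size(); ++g)
            if (g <= t) { n1 += h[g]; s1 += double(g) * h[g]; }
            else        { n2 += h[g]; s2 += double(g) * h[g]; }
        if (n1 == 0 || n2 == 0) continue;        // an empty group: skip this t
        double m1 = s1 / n1, m2 = s2 / n2;       // group means
        double v1 = 0, v2 = 0;                   // group variances
        for (std::size_t g = 0; g < h.size(); ++g)
            if (g <= t) v1 += h[g] * (g - m1) * (g - m1);
            else        v2 += h[g] * (g - m2) * (g - m2);
        v1 /= n1; v2 /= n2;
        double within = (n1 / total) * v1 + (n2 / total) * v2;
        if (bestScore < 0 || within < bestScore) {
            bestScore = within;
            best = int(t);                       // remember the minimizing t
        }
    }
    return best;
}
```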
28. (No transcript)
29. Generalized thresholding
- Single range of gray values
- const int t1 = 200;
- const int t2 = 500;
- if (G[r][c] > t1 && G[r][c] < t2) B[r][c] = 1;
- else B[r][c] = 0;
30. Even more general thresholding
- Union of ranges of gray values.
- const int t1 = 200, t2 = 500;
- const int t3 = 1200, t4 = 1500;
- if (G[r][c] > t1 && G[r][c] < t2) B[r][c] = 1;
- else if (G[r][c] > t3 && G[r][c] < t4) B[r][c] = 1;
- else B[r][c] = 0;
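The union-of-ranges rule generalizes to any number of ranges by looping over (lo, hi) pairs. A minimal sketch; the name `bandThreshold` is illustrative, and the strict inequalities match the slide's snippet.

```cpp
#include <utility>
#include <vector>

// Map gray value g to 1 if it falls strictly inside any (lo, hi) range.
int bandThreshold(int g, const std::vector<std::pair<int,int>>& ranges) {
    for (const auto& [lo, hi] : ranges)
        if (g > lo && g < hi) return 1;
    return 0;
}
```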
31. Something is missing here!
32. K-Means Clustering
- Clustering: the process of partitioning a set of pattern vectors into subsets called clusters.
- K = number of clusters (known in advance).
- Not an exhaustive search, so it may not find the globally optimal solution.
- (See Section 10.1.1.)
33. Iterative K-Means Clustering Algorithm
- Form K-means clusters from a set of nD feature vectors.
1. Set ic = 1 (iteration count).
2. Choose randomly a set of K means m1(1), m2(1), ..., mK(1).
3. For each vector xi, compute D(xi, mj(ic)) for each j = 1, ..., K.
4. Assign xi to the cluster Cj with the nearest mean.
5. ic = ic + 1; update the means to get a new set m1(ic), m2(ic), ..., mK(ic).
- Repeat steps 3..5 until Cj(ic + 1) = Cj(ic) for all j.
34. K-Means for Optimal Thresholding
35. K-Means for Optimal Thresholding
- What are the features?
- Individual pixel gray values
36. K-Means for Optimal Thresholding
- What value of K should be used?
37. K-Means for Optimal Thresholding
- What value of K should be used?
- K = 2, to be like Otsu's method.
38. Iterative K-Means Clustering Algorithm
- Form 2 clusters from a set of pixel gray values.
1. Set ic = 1 (iteration count).
2. Choose 2 random gray values as our initial K means, m1(1) and m2(1).
3. For each pixel gray value xi, compute fabs(xi - mj(ic)) for each j = 1, 2.
4. Assign xi to the cluster Cj with the nearest mean.
5. ic = ic + 1; update the means to get a new set m1(ic), m2(ic).
- Repeat steps 3..5 until Cj(ic + 1) = Cj(ic) for all j.
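The K = 2 special case above can be sketched in a few lines. One assumption for repeatability: the initial means are passed in as parameters rather than picked at random; the name `kmeans2` is illustrative.

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Cluster gray values x around two means: assign each value to the nearest
// mean (fabs distance), recompute the means, and stop when no assignment
// changes. Returns the two converged means.
std::pair<double,double> kmeans2(const std::vector<double>& x,
                                 double m1, double m2) {
    std::vector<int> assign(x.size(), -1);
    bool changed = true;
    while (changed) {
        changed = false;
        // step 3-4: assign each value to the nearest mean
        for (std::size_t i = 0; i < x.size(); ++i) {
            int a = std::fabs(x[i] - m1) <= std::fabs(x[i] - m2) ? 0 : 1;
            if (a != assign[i]) { assign[i] = a; changed = true; }
        }
        // step 5: update the means from the new clusters
        double s1 = 0, s2 = 0;
        int n1 = 0, n2 = 0;
        for (std::size_t i = 0; i < x.size(); ++i)
            if (assign[i] == 0) { s1 += x[i]; ++n1; }
            else                { s2 += x[i]; ++n2; }
        if (n1) m1 = s1 / n1;
        if (n2) m2 = s2 / n2;
    }
    return {m1, m2};
}
```

The midpoint of the two converged means then serves as the threshold t.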
39. Iterative K-Means Clustering Algorithm
- Example:
- m1(1) = 260.83, m2(1) = 539.00
- m1(2) = 39.37, m2(2) = 1045.65
- m1(3) = 52.29, m2(3) = 1098.63
- m1(4) = 54.71, m2(4) = 1106.28
- m1(5) = 55.04, m2(5) = 1107.24
- m1(6) = 55.10, m2(6) = 1107.44
- m1(7) = 55.10, m2(7) = 1107.44
- ...
- Demo: K-means.
40. Otsu vs. K-Means
- Otsu's method as presented determines the single best threshold.
- How many objects can it discriminate?
- Suggest a modification to discriminate more.
- How is Otsu's method similar to K-Means?