CURVE PARTITIONING
Provided by: kimlb

1
CURVE PARTITIONING
Issues in Curve Partitioning
Intent, or purpose, of the partitioning
- To build structural descriptions
  - for recognition
  - to characterize some known thing
- To isolate particular types of features
- To eliminate unwanted clutter, or to simplify the curve
- As an explanation of curve construction -- perhaps to recognize portions of the curve
The vocabulary of the partitioning
- Primitive type(s)
- Relations
- Definition of noise
- Believability
Representation (related to the vocabulary)
Evaluation (related to its believability)
Computational effort
2
Paradigms for Curve Partitioning
The local detection of distinguished points
- Points of high curvature
- Inflection points
- Extremal distance from axis or centroid
- Discontinuities in the above
- Zeroes or inflections in the above, above
Repeated extraction of parameterized segments
- Straight
- Smooth
- Constant curvature
- Conic sections
Best global description
- Piecewise polynomial (or splines)
Confirming evidence (find consensus among multiple methods)
Recursive simplification (hierarchical, based on above, e.g. across scales)
3
Evaluation Criteria
Stability: The output segmentation should vary smoothly with changes in the input data; minor perturbations in the curve should produce only minor differences in the output.
Completeness: The output segmentation should capture all salient features of the curve; the description formalism should be capable of explaining all important characteristics of the curve. Rich enough.
Conciseness: The curve description should be compact; the curve's segmented representation should be able to express the curve's behavior -- to the precision needed -- in the fewest possible terms. But just rich enough.
Limited Complexity: The number of primitive element types, and their own complexities, should be limited, as should the algorithm(s) to compute them. Again, just rich enough.
4
The Role of Intent
(a simple example)
Suppose we want to recognize scissors among the following set of objects, using their silhouettes:
Finding two nearly closed, nearly circular segments in the silhouette might be enough. But if we add a box-end wrench, or if we want to do more than find scissors, we will need to extract straight lines, at least, and possibly other arcs and their spatial relations.
5
Recursive Splitting
Ramer's Algorithm -- it's been around a loooong time.
[Figure: Approximation 1 and Approximation 2]
6
[Figure: Approximation 3] And so on....
1. Draw a straight line between the endpoints. (If a closed curve, pick two points, usually the farthest apart, and split the curve into two open halves.)
2. For each point on the real curve, compute its perpendicular distance to the approximating line (error).
3. If the maximum error is within a preset tolerance, STOP.
4. Else, insert a new breakpoint (vertex) at the point of greatest error.
5. Recurse on the two subproblems, each identical to the initial problem, only smaller.
6. Output a polygonal approximation to the input curve.
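The steps above can be sketched in Python. This is a minimal illustrative version for an open polyline (the function and parameter names are my own, not from the slides):

```python
import math

def ramer(points, tol):
    """Recursively approximate an open polyline (Ramer's algorithm).

    points: list of (x, y) tuples; tol: maximum allowed perpendicular error.
    Returns the retained breakpoints, including both endpoints.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy) or 1.0
    # Step 2: perpendicular distance of each interior point to the chord.
    worst_i, worst_d = 0, -1.0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x0) - dx * (py - y0)) / norm
        if d > worst_d:
            worst_i, worst_d = i, d
    # Step 3: all errors within tolerance -- the chord suffices.
    if worst_d <= tol:
        return [points[0], points[-1]]
    # Steps 4-5: break at the point of greatest error and recurse.
    left = ramer(points[:worst_i + 1], tol)
    right = ramer(points[worst_i:], tol)
    return left[:-1] + right  # drop the duplicated breakpoint
```

For a closed curve, per step 1, one would first split the contour into two open halves and run `ramer` on each.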
7
Curvature Breakpoints
It is often useful to construct a polygonal approximation by breaking the contour at points of high curvature. There is ample psychovisual evidence that points of high curvature convey a great deal of shape information in a few points. (Attneave's cat.) I believe (but not quite so strongly) that arguments can be made to support this compact-encoding idea that are analogous to the arguments for edge detection. To do this, apply a local curvature estimator to each point -- e.g., the turning angle at p_i between the inbound vector p_i - p_(i-1) and the outbound vector p_(i+1) - p_i -- compute it, and threshold on that. Or, just take the dot product of the inbound and outbound vectors from the central point and threshold on that.
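The dot-product variant can be sketched as follows; the span and cosine threshold are illustrative assumptions (a small or negative normalized dot product means a large turning angle, i.e. high curvature):

```python
import math

def high_curvature_points(points, span=3, cos_thresh=0.5):
    """Flag contour points whose inbound/outbound vectors turn sharply.

    points: list of (x, y) contour samples.
    span: offset between compared points (points i-span, i, i+span
          are generally not adjacent samples).
    cos_thresh: flag point i when the normalized dot product of the
          inbound and outbound vectors falls below this value.
    """
    flagged = []
    for i in range(span, len(points) - span):
        ax = points[i][0] - points[i - span][0]   # inbound vector
        ay = points[i][1] - points[i - span][1]
        bx = points[i + span][0] - points[i][0]   # outbound vector
        by = points[i + span][1] - points[i][1]
        na, nb = math.hypot(ax, ay), math.hypot(bx, by)
        if na == 0 or nb == 0:
            continue  # degenerate (repeated) points
        cos_angle = (ax * bx + ay * by) / (na * nb)
        if cos_angle < cos_thresh:
            flagged.append(i)
    return flagged
```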
8
NOTE: In general, points i, i+1, and i-1 are not adjacent. Otherwise, we could just use chain codes.
In fact, selecting the span of this type of simple curvature operator can be tricky. It will impact the scale (in a monotonic, but otherwise rather unpredictable way) of the curvatures you can detect.
Long span -- Will filter spatial curvature. We will approximate sizeable portions of the contour with straight lines. May miss features of importance.
Short span -- Less filtering of curvature. More sensitive to noise and other perturbations. May retain useless (or worse, obscuring) details, and if this happens, we will not achieve a compact encoding of the shape's salient features.
Selecting the thresholds for declaring a point to be of high curvature is not an exact science, either. Long, smooth curves of large radius are problematic with any choice of span and threshold. They can easily be represented (erroneously) by a single straight line (short span), or maybe two (long span). (Think of the Earth's curvature.)
9
Another problem
We can dream up many variations on this theme.
10
Scale-Based Description of Curves
Mokhtarian & Mackworth describe a scale-space curvature calculation for planar curves to build a representation based on inflection point locations as a function of scale.
Parametric representation of curve: $C(s) = (x(s), y(s))$, with arc length $s$ in pixels.
For a curve expressed functionally as $y = y(x)$, define $y' = dy/dx$ and $y'' = d^2y/dx^2$. Then the curvature is
$$\kappa = \frac{y''}{\left(1 + y'^2\right)^{3/2}}$$
This is inconvenient, because our curve will not (usually) be functional.
11
Denote $\dot{x} = dx/ds$, $\ddot{x} = d^2x/ds^2$, and similarly for y. Then the curvature becomes
$$\kappa(s) = \frac{\dot{x}\,\ddot{y} - \ddot{x}\,\dot{y}}{\left(\dot{x}^2 + \dot{y}^2\right)^{3/2}}$$
Notice that we can now compute the curvature by considering only the projections of the curve onto the x and y coordinate axes. As you might suspect, applying this equation for curvature directly can be noise-enhancing (lots of derivatives), and you are only getting an analysis at a single scale. Mokhtarian and Mackworth recognized the value of multiscale curvature analysis.
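As a numerical sketch, the projection-based curvature can be approximated with finite differences; NumPy's gradient stands in for the derivatives (this is illustrative, not the authors' code):

```python
import numpy as np

def curvature(x, y):
    """Curvature of a planar curve from its x(s) and y(s) projections.

    x, y: 1-D arrays sampled at (approximately) uniform arc length.
    Returns kappa(s) = (x' y'' - x'' y') / (x'^2 + y'^2)^(3/2).
    The uniform sample spacing cancels out of the ratio.
    """
    xd, yd = np.gradient(x), np.gradient(y)      # first derivatives
    xdd, ydd = np.gradient(xd), np.gradient(yd)  # second derivatives
    return (xd * ydd - xdd * yd) / (xd**2 + yd**2) ** 1.5
```

On a circle of radius R, the interior values come out close to 1/R, as they should; the endpoints are less accurate because of one-sided differences.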
12
So, we can control noise and limit the scale by smoothing the projection functions with a Gaussian kernel:
$$X(s, \sigma) = x(s) * g(s, \sigma), \qquad g(s, \sigma) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-s^2 / 2\sigma^2}$$
Then, using the convolution theorem for differentiation,
$$X_s(s, \sigma) = x(s) * g_s(s, \sigma), \qquad X_{ss}(s, \sigma) = x(s) * g_{ss}(s, \sigma)$$
... and similarly for Y.
13
Significant Points of the W & B Algorithm
- Decomposition based on a constant curvature criterion.
- Non-linear blip filter.
- Overlapping voting scheme.
- Stability with respect to modest viewpoint-induced distortions -- especially as in stereo image pairs.
- Stability with respect to local perturbations.
- Ability to decompose curves without inflection points, e.g. the rotor in a Wankel (Mazda) engine.
- Decomposition results in agreement with expert (subjective human) results.
- The option exists to apply over multiple scales, as do Mokhtarian & Mackworth.
14
Overlapping Voting Scheme
The basic idea is to build consensus for the strongest (most consistent and longest) constant curvature segments first. Each point on the contour votes for its own curvature value, and for a set of values within a tolerance. Thus, the votes overlap. Then...
- The resulting pseudohistogram is polled for the bin receiving the greatest number of votes.
- The longest contiguous section of points voting in this bin is extracted as a trial section.
- The longest sections from each other bin are also extracted and compared.
- The longest of these longest contiguous sections is retained as the first segment extracted.
- All the votes corresponding to points in the extracted segment are deleted from the pseudohistogram.
- Repeat until exhausted.
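The polling loop can be sketched as follows. This is a simplified illustration: the bin width, tolerance, and minimum segment length are hypothetical parameters, ties are broken arbitrarily, and the blip filter and other refinements of the actual algorithm are omitted.

```python
import numpy as np

def longest_run(mask):
    """Return (start, length) of the longest True run in a boolean array."""
    best, start = (0, 0), None
    for i, v in enumerate(mask):
        if v and start is None:
            start = i
        if (not v or i == len(mask) - 1) and start is not None:
            end = i + 1 if v else i
            if end - start > best[1]:
                best = (start, end - start)
            start = None
    return best

def voting_segments(kappa, bin_width=0.05, tol=1, min_len=3):
    """Greedy overlapping-voting decomposition into constant-curvature runs.

    kappa: per-point curvature estimates along the contour.
    Returns half-open index ranges (start, end), longest consensus first.
    """
    bins = np.round(np.asarray(kappa) / bin_width).astype(int)
    active = np.ones(len(bins), dtype=bool)
    segments = []
    while active.any():
        # Each active point supports every bin within tol of its own
        # (the overlapping votes); keep the bin whose longest contiguous
        # run of supporters is longest overall.
        best = None
        for b in set(bins[active]):
            run = longest_run(active & (np.abs(bins - b) <= tol))
            if best is None or run[1] > best[1]:
                best = run
        start, length = best
        if length < min_len:
            break
        segments.append((start, start + length))
        active[start:start + length] = False  # delete these points' votes
    return segments
```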