Sorting - PowerPoint PPT Presentation

1
Sorting
  • Dr. Bernard Chen Ph.D.
  • University of Central Arkansas

2
Insertion Sort I
  • The list is assumed to be broken into a sorted
    portion and an unsorted portion
  • Keys will be inserted from the unsorted portion
    into the sorted portion.

(Diagram: the array divided into a sorted portion and an unsorted
portion)
3
Insertion Sort II
  • For each new key, search backward through sorted
    keys
  • Move keys until proper position is found
  • Place key in proper position

4
Insertion Sort Code
template <class Comparable>
void insertionSort( vector<Comparable> & a )
{
    // Fixed n-1 iterations: p runs from 1 to n-1
    for( int p = 1; p < a.size( ); p++ )
    {
        Comparable tmp = a[ p ];   // the new key to insert
        int j;
        // Search backward for the proper position of the new key
        // (worst case p comparisons), moving larger keys one slot to the right
        for( j = p; j > 0 && tmp < a[ j - 1 ]; j-- )
            a[ j ] = a[ j - 1 ];
        a[ j ] = tmp;              // insert the new key in its proper position
    }
}
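A minimal usage sketch for the routine above (the driver and the sample
values are illustrative, not from the slides):

#include <iostream>
#include <vector>
using namespace std;

// insertionSort from the code above is assumed to be defined in this file

int main( )
{
    vector<int> a = { 34, 8, 64, 51, 32, 21 };
    insertionSort( a );
    for( int x : a )
        cout << x << " ";    // prints 8 21 32 34 51 64
    cout << endl;
    return 0;
}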
5
Insertion Sort Analysis
  • Worst case: keys are in reverse order
  • Do i-1 comparisons for each new key, where i runs
    from 2 to n
  • Total comparisons: 1 + 2 + 3 + ... + (n-1) = n(n-1)/2,
    which is O(n^2)

6
Optimality Analysis I
  • To discover an optimal algorithm we need to find
    an upper and lower asymptotic bound for a
    problem.
  • An algorithm gives us an upper bound. The worst
    case for sorting cannot exceed O(n^2) because we
    have Insertion Sort, which runs that fast.
  • Lower bounds require mathematical arguments.

7
Other Assumptions
  • The only operation used for sorting the list is
    swapping two keys.
  • Only adjacent keys can be swapped.
  • This is true for Insertion Sort and Bubble Sort.

8
Shell Sort
  • With insertion sort, each time we insert an
    element, other elements get nudged one step
    closer to where they ought to be
  • What if we could move elements a much longer
    distance each time?
  • We could move each element
  • A long distance
  • A somewhat shorter distance
  • A shorter distance still
  • This approach is what makes shellsort so much
    faster than insertion sort

9
Sorting nonconsecutive subarrays
Here is an array to be sorted (the specific numbers
aren't important):
  • Consider just the red locations
  • Suppose we do an insertion sort on just these
    numbers, as if they were the only ones in the
    array?
  • Now consider just the yellow locations
  • We do an insertion sort on just these numbers
  • Now do the same for each additional group of
    numbers
  • The resultant array is sorted within groups, but
    not overall

10
Doing the 1-sort
  • In the previous slide, we compared numbers that
    were spaced every 5 locations
  • This is a 5-sort
  • Ordinary insertion sort is just like this, only
    the numbers are spaced 1 apart
  • We can think of this as a 1-sort
  • Suppose, after doing the 5-sort, we do a 1-sort?
  • In general, we would expect that each insertion
    would involve moving fewer numbers out of the way
  • The array would end up completely sorted

11
Example of shell sort
original:     81 94 11 96 12 35 17 95 28 58 41 75 15
after 5-sort: 35 17 11 28 12 41 75 15 96 58 81 94 95
after 3-sort: 28 12 11 35 15 41 58 17 94 75 81 96 95
after 1-sort: 11 12 15 17 28 35 41 58 75 81 94 95 96
12
Diminishing gaps
  • For a large array, we don't want to do a 5-sort;
    we want to do an N-sort, where N depends on the
    size of the array
  • N is called the gap size, or interval size

13
Diminishing gaps
  • We may want to do several stages, reducing the
    gap size each time
  • For example, on a 1000-element array, we may want
    to do a 364-sort, then a 121-sort, then a
    40-sort, then a 13-sort, then a 4-sort, then a
    1-sort
  • Why these numbers?

14
Increment sequence
  • No one knows the optimal sequence of diminishing
    gaps
  • This sequence is attributed to Donald E. Knuth
  • Start with h = 1
  • Repeatedly compute h = 3h + 1
  • This gives 1, 4, 13, 40, 121, 364, 1093, ...
  • This sequence seems to work very well
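A small sketch of how this gap sequence might be generated in code (the
function name knuthGaps and the 1000-element example are illustrative,
not from the slides):

#include <iostream>
#include <vector>
using namespace std;

// Build the gap sequence h = 1, 4, 13, 40, ... keeping only gaps below n
vector<int> knuthGaps( int n )
{
    vector<int> gaps;
    for( int h = 1; h < n; h = 3 * h + 1 )
        gaps.push_back( h );
    return gaps;               // applied largest-to-smallest when sorting
}

int main( )
{
    for( int h : knuthGaps( 1000 ) )
        cout << h << " ";      // prints 1 4 13 40 121 364
    cout << endl;
}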

15
Increment sequence
  • Another increment sequence mentioned in the
    textbook is based on the following formula
  • Start with h equal to half the container's size
  • h_i = floor( h_(i-1) / 2.2 )
  • It turns out that just cutting the array size in
    half each time does not work out as well
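A minimal shellsort sketch in the style of the insertion sort code
earlier (the gap update, which divides by 2.2 and forces a final 1-sort,
is an assumption consistent with the description above, not the
presenter's exact code):

#include <vector>
using std::vector;

template <class Comparable>
void shellsort( vector<Comparable> & a )
{
    // Start with a gap of half the container's size; shrink it by 2.2 each pass
    for( int gap = a.size( ) / 2; gap > 0;
         gap = ( gap == 2 ? 1 : (int)( gap / 2.2 ) ) )
    {
        // Gap-insertion sort: each key is compared with the key
        // 'gap' positions earlier, so keys can move long distances
        for( int i = gap; i < a.size( ); i++ )
        {
            Comparable tmp = a[ i ];
            int j = i;
            for( ; j >= gap && tmp < a[ j - gap ]; j -= gap )
                a[ j ] = a[ j - gap ];
            a[ j ] = tmp;
        }
    }
}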

16
Analysis
  • What is the real running time of shellsort?
  • Nobody knows!
  • Experiments suggest something like O(n^(3/2)) or
    O(n^(7/6))
  • Analysis isn't always easy!

17
Merge Sort
  • If the list has only one element, do nothing
  • Otherwise, split the list in half
  • Recursively sort both halves
  • Merge the sorted halves
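A minimal recursive sketch of this outline (the names mergeSort and
merge, and the tmp buffer, are illustrative; the merge step itself is
sketched after the next slide):

#include <vector>
using std::vector;

// merge( ) combines two adjacent sorted ranges of a into tmp;
// a sketch of it appears after the next slide
template <class Comparable>
void merge( vector<Comparable> & a, vector<Comparable> & tmp,
            int leftPos, int rightPos, int rightEnd );

template <class Comparable>
void mergeSort( vector<Comparable> & a, vector<Comparable> & tmp,
                int left, int right )
{
    if( left >= right )                        // 0 or 1 elements: do nothing
        return;
    int center = ( left + right ) / 2;
    mergeSort( a, tmp, left, center );         // recursively sort the left half
    mergeSort( a, tmp, center + 1, right );    // recursively sort the right half
    merge( a, tmp, left, center + 1, right );  // merge the two sorted halves
}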

18
Merging
  • Merge:
  • Keep track of the smallest remaining element in each
    sorted half
  • Insert the smaller of the two elements into the
    auxiliary array
  • Repeat until done

(Animation, repeated over slides 18-28: the smallest remaining element
from the two sorted halves is copied into the auxiliary array, building
A G H I L M O R S T; near the end the first half is exhausted, then the
second)
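A sketch of the merge step described above, matching the mergeSort
driver sketched earlier (textbook-style parameter names; an
illustration rather than the presenter's exact code):

// Merge the sorted ranges a[leftPos..rightPos-1] and a[rightPos..rightEnd]
// into tmp, then copy the merged result back into a
template <class Comparable>
void merge( vector<Comparable> & a, vector<Comparable> & tmp,
            int leftPos, int rightPos, int rightEnd )
{
    int leftEnd = rightPos - 1;
    int tmpPos  = leftPos;
    int numElements = rightEnd - leftPos + 1;

    while( leftPos <= leftEnd && rightPos <= rightEnd )   // both halves still have elements
        if( a[ leftPos ] <= a[ rightPos ] )
            tmp[ tmpPos++ ] = a[ leftPos++ ];             // take the smaller, from the left half
        else
            tmp[ tmpPos++ ] = a[ rightPos++ ];            // take the smaller, from the right half

    while( leftPos <= leftEnd )                           // second half exhausted: copy the rest
        tmp[ tmpPos++ ] = a[ leftPos++ ];
    while( rightPos <= rightEnd )                         // first half exhausted: copy the rest
        tmp[ tmpPos++ ] = a[ rightPos++ ];

    for( int i = 0; i < numElements; i++, rightEnd-- )    // copy tmp back into a
        a[ rightEnd ] = tmp[ rightEnd ];
}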
29
Merging numerical example
30
Binary Merge Sort
  • Given a single file
  • Split into two files

31
Binary Merge Sort
  • Merge first one-element "subfile" of F1 with
    first one-element subfile of F2
  • Gives a sorted two-element subfile of F
  • Continue with rest of one-element subfiles

32
Binary Merge Sort
  • Split again
  • Merge again as before
  • Each time, the size of the sorted subgroups
    doubles

33
Binary Merge Sort
  • Last splitting gives two files each in order
  • Last merging yields a single file, entirely in
    order
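An in-memory analogue of this process (the slides describe it on files
F, F1, and F2; here a vector stands in for the file, and the point being
illustrated is the pass structure in which the sorted run length doubles
each time):

#include <algorithm>
#include <vector>
using std::vector;

// Bottom-up (binary) merge sort: repeatedly merge adjacent sorted runs
// of width 1, 2, 4, ... until the whole sequence is one sorted run
template <class Comparable>
void binaryMergeSort( vector<Comparable> & a )
{
    int n = a.size( );
    vector<Comparable> tmp( n );
    for( int width = 1; width < n; width *= 2 )        // run length doubles each pass
    {
        for( int lo = 0; lo < n; lo += 2 * width )
        {
            int mid = std::min( lo + width, n );
            int hi  = std::min( lo + 2 * width, n );
            std::merge( a.begin( ) + lo, a.begin( ) + mid,
                        a.begin( ) + mid, a.begin( ) + hi,
                        tmp.begin( ) + lo );            // merge two adjacent sorted runs
        }
        a = tmp;    // the merged runs become the next pass's input
    }
}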

34
Quicksort
  • Quicksort uses a divide-and-conquer strategy
  • A recursive approach
  • The original problem is partitioned into simpler
    sub-problems
  • Each sub-problem is considered independently
  • Subdivision continues until the sub-problems are
    simple enough to be solved directly

35
Quicksort
  • Choose some element called a pivot
  • Perform a sequence of exchanges so that
  • All elements that are less than this pivot are to
    its left and
  • All elements that are greater than the pivot are
    to its right.

36
Quicksort
  • If the list has 0 or 1 elements,
  • return. // the list is sorted
  • Else do
  • Pick an element in the list to use as the pivot.
  •   Split the remaining elements into two disjoint
    groups
  • SmallerThanPivot: all elements < pivot
  • LargerThanPivot: all elements > pivot
  •  Return the list rearranged as
  • Quicksort(SmallerThanPivot),
  • pivot,
  • Quicksort(LargerThanPivot).

37
Quicksort
  • Given to sort: 75, 70, 65, 84, 98, 78, 100, 93,
    55, 61, 81, 68
  • Select, arbitrarily, the first element, 75, as
    pivot.
  • Search from the right for elements < 75; stop at the
    first element < 75
  • Then search from the left for elements > 75,
    starting from the pivot itself; stop at the first
    element > 75
  • Swap these two elements, and then repeat this
    process until Right and Left point at the same
    location

38
Quicksort Example
  • 75, 70, 65, 68, 61, 55, 100, 93, 78, 98, 81, 84
  • When done, swap with pivot
  • This SPLIT operation placed pivot 75 so that all
    elements to the left were < 75 and all elements
    to the right were > 75.
  • View code for split() template
  • 75 is now placed appropriately
  • Need to sort sublists on either side of 75

39
Quicksort Example
  • Need to sort (independently)
  • 55, 70, 65, 68, 61 and 100, 93, 78, 98, 81,
    84
  • Let pivot be 55, look from each end for values
    larger/smaller than 55, swap
  • Same for 2nd list, pivot is 100
  • Sort the resulting sublists in the same manner
    until sublist is trivial (size 0 or 1)
  • View quicksort() recursive function (a sketch
    follows the split() code on the next slide)

40
Quicksort Split
int split( ElementType x[ ], int first, int last )
{
    ElementType pivot = x[ first ];              // use the first element as the pivot
    int left = first, right = last;
    while( left < right )
    {
        while( pivot < x[ right ] )              // search from the right for an element <= pivot
            right--;
        while( left < right && x[ left ] <= pivot )   // search from the left for an element > pivot
            left++;
        if( left < right )                       // searches haven't met: swap the two elements
            swap( x[ left ], x[ right ] );
    }
    int pos = right;                             // final position of the pivot
    x[ first ] = x[ pos ];
    x[ pos ] = pivot;
    return pos;
}
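The quicksort() recursive function referenced on slide 39 is not
included in the transcript; a minimal sketch consistent with the
split() above might look like this (the name and the array-plus-index
interface are assumptions):

void quicksort( ElementType x[ ], int first, int last )
{
    if( first < last )                       // sublists of size 0 or 1 are already sorted
    {
        int pos = split( x, first, last );   // place the pivot; pos is its final position
        quicksort( x, first, pos - 1 );      // sort the elements left of the pivot
        quicksort( x, pos + 1, last );       // sort the elements right of the pivot
    }
}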

41
Quicksort
  • Note the visual example of a quicksort on an array
    (step-by-step figure omitted from the transcript)
42
Quicksort Performance
  • O(n log2 n) is the average-case computing time,
  • if the pivot results in sublists of approximately
    the same size
  • O(n^2) is the worst case:
  • list already ordered or in reverse order
  • when split() repeatedly results, for example,
    in one empty sublist

43
Improvements to Quicksort
  • A better method for selecting the pivot is the
    median-of-three rule:
  • Select the median of the first, middle, and last
    elements in each sublist as the pivot.
  • Often the list to be sorted is already partially
    ordered
  • Median-of-three rule will select a pivot closer
    to the middle of the sublist than will the
    first-element rule.
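A sketch of the median-of-three rule in code (the helper name and the
convention of moving the chosen pivot to the front of the sublist, where
split() expects it, are assumptions for illustration):

#include <algorithm>   // std::swap
using std::swap;

// Sort the first, middle, and last elements of the sublist, then move
// the median to x[first] so split( ) uses it as the pivot
void medianOfThree( ElementType x[ ], int first, int last )
{
    int middle = ( first + last ) / 2;
    if( x[ middle ] < x[ first ] )  swap( x[ middle ], x[ first ] );
    if( x[ last ]   < x[ first ] )  swap( x[ last ],   x[ first ] );
    if( x[ last ]   < x[ middle ] ) swap( x[ last ],   x[ middle ] );
    // now x[first] <= x[middle] <= x[last]
    swap( x[ first ], x[ middle ] );   // the median is now at x[first]
}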

44
Counting Sort
  • Counting sort assumes that each of the n input
    elements is an integer in the range 0 to k
  • When k = O(n), the sort runs in O(n) time

45
Counting Sort
  • Approach:
  • Sort keys with values over the range 0..k
  • Count the number of occurrences of each key
  • Calculate the number of keys < each key
  • Place each key in its sorted location using the counts
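A minimal counting sort sketch following these steps (the function name
and the use of int keys are assumptions for illustration):

#include <vector>
using std::vector;

// Counting sort for integer keys in the range 0..k
vector<int> countingSort( const vector<int> & a, int k )
{
    vector<int> count( k + 1, 0 );
    for( int key : a )                              // count occurrences of each key
        count[ key ]++;
    for( int v = 1; v <= k; v++ )                   // count[v] = number of keys <= v
        count[ v ] += count[ v - 1 ];
    vector<int> out( a.size( ) );
    for( int i = (int) a.size( ) - 1; i >= 0; i-- ) // place keys in sorted position, stably
        out[ --count[ a[ i ] ] ] = a[ i ];
    return out;
}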

46
(No Transcript)
47
Radix Sort
  • Approach:
  • 1. Decompose each key C into components C1, C2, ..., Cd
  • Component d is the least significant; each component
    has values over the range 0..k
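A least-significant-digit radix sort sketch built on the counting-sort
idea above, using decimal digits as the components (the function name
and the choice of base 10 are illustrative):

#include <vector>
using std::vector;

// LSD radix sort for non-negative integers: a stable counting sort
// is applied to each decimal digit, least significant first
void radixSort( vector<int> & a )
{
    int maxKey = 0;
    for( int key : a )                      // the largest key decides how many digits to process
        if( key > maxKey ) maxKey = key;

    vector<int> out( a.size( ) );
    for( long exp = 1; maxKey / exp > 0; exp *= 10 )
    {
        int count[ 10 ] = { 0 };
        for( int key : a )                  // count occurrences of this digit
            count[ ( key / exp ) % 10 ]++;
        for( int d = 1; d < 10; d++ )       // prefix sums: where each digit value ends
            count[ d ] += count[ d - 1 ];
        for( int i = (int) a.size( ) - 1; i >= 0; i-- )   // stable placement by this digit
            out[ --count[ ( a[ i ] / exp ) % 10 ] ] = a[ i ];
        a = out;
    }
}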

48
(No Transcript)