Analysis of Algorithms - PowerPoint PPT Presentation

Provided by: robert1789
1
Analysis of Algorithms
An algorithm is a step-by-step procedure
for solving a problem in a finite amount of time.
(figure: Input → Algorithm → Output)
2
Running Time
  • Most algorithms transform input objects into
    output objects.
  • The running time of an algorithm typically grows
    with the input size.
  • Average case time is often difficult to
    determine.
  • We focus on the worst case running time.
  • Easier to analyze
  • Crucial to applications such as games, finance
    and robotics

3
Experimental Studies
  • Write a program implementing the algorithm
  • Run the program with inputs of varying size and
    composition
  • Use a method like System.currentTimeMillis() to
    get an accurate measure of the actual running
    time
  • Plot the results

4
Limitations of Experiments
  • It is necessary to implement the algorithm, which
    may be difficult
  • Results may not be indicative of the running time
    on other inputs not included in the experiment.
  • In order to compare two algorithms, the same
    hardware and software environments must be used

5
Theoretical Analysis
  • Uses a high-level description of the algorithm
    instead of an implementation
  • Characterizes running time as a function of the
    input size, n.
  • Takes into account all possible inputs
  • Allows us to evaluate the speed of an algorithm
    independent of the hardware/software environment

6
Pseudocode
  • High-level description of an algorithm
  • More structured than English prose
  • Less detailed than a program
  • Preferred notation for describing algorithms
  • Hides program design issues


7
Pseudocode Details
  • Control flow
  • if then else
  • while do
  • repeat until
  • for do
  • Indentation replaces braces
  • Method declaration
  • Algorithm method (arg , arg)
  • Input
  • Output
  • Method call
  • var.method (arg , arg)
  • Return value
  • return expression
  • Expressions
  • Assignment (written ←, like = in Java)
  • Equality testing (written =, like == in Java)
  • Superscripts and other mathematical formatting
    allowed (e.g., n²)

8
Primitive Operations
  • Examples
  • Evaluating an expression
  • Assigning a value to a variable
  • Indexing into an array
  • Calling a method
  • Returning from a method
  • Basic computations performed by an algorithm
  • Identifiable in pseudocode
  • Largely independent from the programming language
  • Exact definition not important (we will see why
    later)
  • Assumed to take a constant amount of time in the
    RAM model

9
Counting Primitive Operations
  • By inspecting the pseudocode, we can determine
    the maximum number of primitive operations
    executed by an algorithm, as a function of the
    input size
  • Algorithm arrayMax(A, n)              operations
  • currentMax ← A[0]                     2
  • for i ← 1 to n − 1 do                 2n
  •   if A[i] > currentMax then           2(n − 1)
  •     currentMax ← A[i]                 2(n − 1)
  •   increment counter i                 2(n − 1)
  • return currentMax                     1
  • Total                                 8n − 2
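The tally above can be reproduced by instrumenting the pseudocode. A minimal sketch (the per-line charges follow the slide; the exact constant depends on how the loop bookkeeping is charged, but the growth is linear either way, which is all the analysis needs):

```python
def array_max_with_count(A):
    """Return (max of A, primitive-operation count), charging ops as the slide does."""
    n = len(A)
    ops = 2                      # currentMax <- A[0]
    current_max = A[0]
    ops += 2 * n                 # for-loop tests/bookkeeping, charged 2n in total
    for i in range(1, n):
        ops += 2                 # if A[i] > currentMax
        if A[i] > current_max:
            current_max = A[i]
            ops += 2             # currentMax <- A[i] (taken every step in the worst case)
        ops += 2                 # increment counter i
    ops += 1                     # return currentMax
    return current_max, ops
```

On an ascending array every comparison triggers the assignment (the worst case), so doubling the input size adds exactly 8n operations.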

10
Estimating Running Time
  • Algorithm arrayMax executes 8n − 2 primitive
    operations in the worst case. Define
  • a = time taken by the fastest primitive operation
  • b = time taken by the slowest primitive operation
  • Let T(n) be the worst-case time of arrayMax. Then
    a(8n − 2) ≤ T(n) ≤ b(8n − 2)
  • Hence, the running time T(n) is bounded by two
    linear functions

11
Growth Rate of Running Time
  • Changing the hardware/ software environment
  • Affects T(n) by a constant factor, but
  • Does not alter the growth rate of T(n)
  • The linear growth rate of the running time T(n)
    is an intrinsic property of algorithm arrayMax

12
Seven Important Functions
  • Seven functions that often appear in algorithm
    analysis
  • Constant ≈ 1
  • Logarithmic ≈ log n
  • Linear ≈ n
  • N-Log-N ≈ n log n
  • Quadratic ≈ n²
  • Cubic ≈ n³
  • Exponential ≈ 2ⁿ
  • In a log-log chart, the slope of the line
    corresponds to the growth rate of the function

13
Constant Factors
  • The growth rate is not affected by
  • constant factors or
  • lower-order terms
  • Examples
  • 10²n + 10⁵ is a linear function
  • 10⁵n² + 10⁸n is a quadratic function

14
Big-Oh Notation
  • Given functions f(n) and g(n), we say that f(n)
    is O(g(n)) if there are positive constants c and
    n₀ such that
  • f(n) ≤ c·g(n) for n ≥ n₀
  • Example: 2n + 10 is O(n)
  • 2n + 10 ≤ cn
  • (c − 2)n ≥ 10
  • n ≥ 10/(c − 2)
  • Pick c = 3 and n₀ = 10
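The chosen constants can be spot-checked numerically. A finite check over a range is not a proof, but it catches wrong constants quickly (the helper name is ours):

```python
def witnesses_big_oh(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for every n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 2n + 10 <= 3n for all n >= 10, using the slide's c = 3, n0 = 10
ok = witnesses_big_oh(lambda n: 2 * n + 10, lambda n: n, c=3, n0=10)
```

With c = 2 the check fails for every n₀, since 2n + 10 ≤ 2n never holds.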

15
Asymptotic Analysis
  • Asymptotic analysis is a method of classifying
    limiting behavior by concentrating on some
    trend. It is sometimes expressed in the language
    of equivalence relations. For example, given
    complex-valued functions f and g of a natural
    number variable n, one can write f ∼ g (n → ∞)
  • to express the concept that f(n)/g(n) → 1 as
    n → ∞
  • This defines an equivalence relation, and the
    equivalence class of f consists of all functions
    g with behavior similar to that of f in the limit.
  • Asymptotic notation has been developed to provide
    a convenient language for the handling of
    statements about order of growth.

16
Big-Oh Example
  • Example: the function n² is not O(n)
  • n² ≤ cn
  • n ≤ c
  • The above inequality cannot be satisfied, since c
    must be a constant

17
More Big-Oh Examples
  • 7n − 2
  • 7n − 2 is O(n)
  • need c > 0 and n₀ ≥ 1 such that 7n − 2 ≤ cn for
    n ≥ n₀
  • this is true for c = 7 and n₀ = 1
  • 3n³ + 20n² + 5

3n³ + 20n² + 5 is O(n³): need c > 0 and n₀ ≥ 1
such that 3n³ + 20n² + 5 ≤ cn³ for n ≥ n₀; this
is true for c = 4 and n₀ = 21
  • 3 log n + 5

3 log n + 5 is O(log n): need c > 0 and n₀ ≥ 1
such that 3 log n + 5 ≤ c·log n for n ≥ n₀; this
is true for c = 8 and n₀ = 2
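All three sets of constants can be verified over a finite range (a sanity check rather than a proof; we take log to base 2, which makes the third example's constants work out):

```python
import math

def bounded(f, g, c, n0, n_max=5_000):
    """Finite spot-check that f(n) <= c*g(n) on [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

ex1 = bounded(lambda n: 7 * n - 2, lambda n: n, c=7, n0=1)
ex2 = bounded(lambda n: 3 * n**3 + 20 * n**2 + 5, lambda n: n**3, c=4, n0=21)
ex3 = bounded(lambda n: 3 * math.log2(n) + 5, lambda n: math.log2(n), c=8, n0=2)
```

Note that n₀ = 21 is tight for the second example: at n = 20, 3n³ + 20n² + 5 = 32005 exceeds 4n³ = 32000.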
18
Big-Oh Rules
  • If f(n) is a polynomial of degree d, then f(n) is
    O(nᵈ), i.e.,
  • Drop lower-order terms
  • Drop constant factors
  • Use the smallest possible class of functions
  • Say "2n is O(n)" instead of "2n is O(n²)"
  • Use the simplest expression of the class
  • Say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"

19
Asymptotic Algorithm Analysis
  • The asymptotic analysis of an algorithm
    determines the running time in big-Oh notation
  • To perform the asymptotic analysis
  • We find the worst-case number of primitive
    operations executed as a function of the input
    size
  • We express this function with big-Oh notation
  • Example
  • We determine that algorithm arrayMax executes at
    most 8n − 2 primitive operations
  • We say that algorithm arrayMax runs in O(n)
    time
  • Since constant factors and lower-order terms are
    eventually dropped anyhow, we can disregard them
    when counting primitive operations

20
Relatives of Big-Oh
  • big-Omega
  • f(n) is Ω(g(n)) if there is a constant c > 0
  • and an integer constant n₀ ≥ 1 such that
  • f(n) ≥ c·g(n) for n ≥ n₀
  • big-Theta
  • f(n) is Θ(g(n)) if there are constants c′ > 0 and
    c″ > 0 and an integer constant n₀ ≥ 1 such that
    c′·g(n) ≤ f(n) ≤ c″·g(n) for n ≥ n₀
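The same spot-check style works for the lower-bound definitions; here we reuse the earlier example 3n³ + 20n² + 5, which is Ω(n³) with c = 3 and Θ(n³) by combining both bounds (helper names are ours):

```python
def omega_holds(f, g, c, n0, n_max=5_000):
    """Finite check of the Omega condition: f(n) >= c*g(n) on [n0, n_max]."""
    return all(f(n) >= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n**3 + 20 * n**2 + 5
g = lambda n: n**3
lower = omega_holds(f, g, c=3, n0=1)                     # Omega(n^3): f >= 3n^3
upper = all(f(n) <= 4 * g(n) for n in range(21, 5_001))  # O(n^3): f <= 4n^3, n >= 21
is_theta = lower and upper                               # hence Theta(n^3)
```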

25
Big-Oh and Growth Rate
  • The big-Oh notation gives an upper bound on the
    growth rate of a function
  • The statement f(n) is O(g(n)) means that the
    growth rate of f(n) is no more than the growth
    rate of g(n)
  • We can use the big-Oh notation to rank functions
    according to their growth rate
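The ranking can be illustrated by evaluating the seven common functions at a single large n; by then the asymptotic order already shows (a rough numeric proxy, not a proof):

```python
import math

# Values of the seven functions at n = 2**20; sorting by value
# reproduces the big-Oh ranking.
n = 2**20
values = {
    "1": 1,
    "log n": math.log2(n),
    "n": n,
    "n log n": n * math.log2(n),
    "n^2": n**2,
    "n^3": n**3,
    "2^n": 2**n,
}
ranked = sorted(values, key=values.get)
```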

28
Kinds of analyses
  • Worst-case (usually)
  • T(n) = maximum time of the algorithm on any input
    of size n
  • Average-case (sometimes)
  • T(n) = expected time of the algorithm over all
    inputs of size n
  • Needs an assumption about the statistical
    distribution of inputs
  • Best-case (never)
  • You can cheat with a slow algorithm that happens
    to work fast on some input
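Linear search (our illustration, not from the slides) makes the worst/average distinction concrete: the worst case probes all n cells, while the average over a uniformly random target position is (n + 1)/2 probes.

```python
def probes(lst, target):
    """Number of cells a linear search inspects before finding target."""
    for i, x in enumerate(lst):
        if x == target:
            return i + 1
    return len(lst)

n = 100
data = list(range(n))
worst = max(probes(data, t) for t in data)         # n = 100 probes
average = sum(probes(data, t) for t in data) / n   # (n + 1) / 2 = 50.5 probes
```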

29
Example Merge sort
Key subroutine MERGE
30
Merging two sorted arrays
(animation: the two sorted arrays 20 13 7 2 and
12 11 9 1, smallest elements at the right, are
merged by repeatedly removing the smaller of the
two front elements and appending it to the output,
which grows 1, 2, 7, 9, 11, ...)
42
Merging two sorted arrays
Time: Θ(n) to merge a total of n elements
(linear time).
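The merge step animated above can be written directly (shown here for ascending lists; the slides draw each array with its smallest element at the front of the pile):

```python
def merge(left, right):
    """Merge two sorted lists by repeatedly taking the smaller front
    element: Theta(n) for n total elements."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])    # at most one of the two lists
    out.extend(right[j:])   # still has elements left over
    return out
```

For the slides' data, merge([2, 7, 13, 20], [1, 9, 11, 12]) yields [1, 2, 7, 9, 11, 12, 13, 20].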
43
Analyzing merge sort
MERGE-SORT A[1 . . n]
T(n) = Θ(1) + 2T(n/2) + Θ(n)
  • If n = 1, done.
  • Recursively sort A[1 . . ⌈n/2⌉] and
    A[⌈n/2⌉ + 1 . . n].
  • Merge the 2 sorted lists.

Sloppiness: should be T(⌈n/2⌉) + T(⌊n/2⌋),
but it turns out not to matter asymptotically.
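A runnable sketch of the procedure, self-contained: the split point is ⌈n/2⌉ as on the slide, and the merge of the two sorted halves is the linear-time subroutine:

```python
def merge_sort(a):
    """Sort a list by recursive halving plus a linear-time merge."""
    if len(a) <= 1:                          # base case: n = 1, done
        return list(a)
    mid = (len(a) + 1) // 2                  # ceil(n/2), as in the slide
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0                     # MERGE the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```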
44
Recurrence for merge sort
  • We shall usually omit stating the base case when
    T(n) = Θ(1) for sufficiently small n, but only
    when it has no effect on the asymptotic solution
    to the recurrence.

45
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is
constant.
55
Remember
  • logₐx is the power to which a must be raised to
    get x.
  • y = logₐx is equivalent to aʸ = x
  • f(f⁻¹(x)) = a^(logₐx) = x, for x > 0
  • f⁻¹(f(x)) = logₐ(aˣ) = x, for all x
  • There are two common forms of the log function:
  • a = 10: log₁₀x, commonly written simply log x
  • a = e ≈ 2.71828: logₑx = ln x, the natural log
  • logₐx does not exist for x ≤ 0
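These identities are easy to sanity-check with the standard library (base 2 and base 10 shown; `math` raises an error for non-positive arguments, matching the last point):

```python
import math

# a**(log_a x) == x and log_a(a**x) == x, checked for a = 2 and a = 10
assert math.isclose(2 ** math.log2(7), 7.0)
assert math.isclose(math.log2(2 ** 13.5), 13.5)
assert math.isclose(10 ** math.log10(42), 42.0)
assert math.isclose(math.log(math.e), 1.0)      # natural log: ln e = 1
# the log of a non-positive number is undefined
try:
    math.log2(0)
    undefined_at_zero = False
except ValueError:
    undefined_at_zero = True
```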

56
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is
constant.
Level 0:  cn                          → cn
Level 1:  cn/2 + cn/2                 → cn
Level 2:  cn/4, four times            → cn
...
Height h = lg n, so lg n levels of cost cn each;
the n leaves contribute n · Θ(1) = Θ(n).
Total: Θ(n lg n)
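The tree's bookkeeping can be confirmed by evaluating the recurrence directly for powers of two: with T(1) = c, the total is exactly cn·lg n + cn, matching "lg n levels of cn plus Θ(n) at the leaves" (c = 3 below is an arbitrary choice):

```python
import math
from functools import lru_cache

c = 3  # any positive constant works

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 2*T(n/2) + c*n with T(1) = c, for n a power of two."""
    return c if n == 1 else 2 * T(n // 2) + c * n

n = 2**10
closed_form = c * n * math.log2(n) + c * n   # cn*lg n from the levels + cn from the leaves
```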
57
Example Conclusions
  • Θ(n lg n) grows more slowly than Θ(n²).
  • Therefore, merge sort asymptotically beats
    insertion sort in the worst case.
  • In practice, merge sort beats insertion sort for
    n > 30 or so.

58
B-Trees
59
Motivation for B-Trees
  • So far we have assumed that we can store an
    entire data structure in main memory
  • What if we have so much data that it won't fit?
  • We will have to use disk storage, but when this
    happens our time-complexity analysis breaks down
  • The problem is that big-Oh analysis assumes that
    all operations take roughly equal time
  • This is not the case when disk access is involved

60
Motivation (cont.)
  • Assume that a disk spins at 3600 RPM
  • In 1 minute it makes 3600 revolutions, so one
    revolution takes 1/60 of a second, or 16.7 ms
  • On average the data we want is halfway around the
    disk, so the wait is about 8 ms
  • This sounds good until you realize that it gives
    only about 120 disk accesses a second, the same
    time in which a CPU executes about 25 million
    instructions
  • In other words, one disk access takes about the
    same time as 200,000 instructions
  • It is worth executing lots of instructions to
    avoid a disk access
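The slide's arithmetic, written out (the 3600 RPM disk and the 25-million-instructions-per-second CPU are the slide's assumed figures):

```python
rpm = 3600
rev_per_sec = rpm / 60                     # 60 revolutions per second
ms_per_rev = 1000 / rev_per_sec            # ~16.7 ms per revolution
avg_access_ms = ms_per_rev / 2             # ~8.3 ms: half a revolution on average
accesses_per_sec = 1000 / avg_access_ms    # ~120 disk accesses per second
instr_per_sec = 25_000_000
instr_per_access = instr_per_sec / accesses_per_sec  # ~200,000 instructions
```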

61
Motivation (cont.)
  • Assume that we use an AVL (height-balanced
    binary search) tree to store 20 million records
  • We still end up with a very deep tree and lots of
    separate disk accesses: log₂ 20,000,000 is about
    24, so a search takes about 0.2 seconds (if there
    is only one user of the program)
  • We know we can't improve on the log n for a
    binary tree
  • But, the solution is to use more branches and
    thus less height!
  • As branching increases, depth decreases

62
Definition of a B-tree
  • A B-tree of order m is an m-way tree (i.e., a
    tree where each node may have up to m children)
    in which
  • 1. the number of keys in each non-leaf node is
    one less than the number of its children and
    these keys partition the keys in the children in
    the fashion of a search tree
  • 2. all leaves are on the same level
  • 3. all non-leaf nodes except the root have at
    least ⌈m / 2⌉ children
  • 4. the root is either a leaf node, or it has from
    two to m children
  • 5. a leaf node contains no more than m − 1 keys
  • The number m should always be odd

63
An example B-Tree
A B-tree of order 5 containing 26 items
(figure: a three-level tree over the keys 1, 2, 4,
6, 7, 8, 12, 13, 15, 18, 25, 26, 27, 29, 42, 45,
46, 48, 51, 53, 55, 60, 62, 64, 70, 90)
Note that all the leaves are at the same level
64
Constructing a B-tree
  • Suppose we start with an empty B-tree and keys
    arrive in the following order: 1, 12, 8, 2, 25,
    5, 14, 28, 17, 7, 52, 16, 48, 68, 3, 26, 29, 53,
    55, 45
  • We want to construct a B-tree of order 5
  • The first four items go into the root
  • To put the fifth item in the root would violate
    condition 5
  • Therefore, when 25 arrives, pick the middle key
    to make a new root

(figure: root node containing 1 2 8 12)
65
Constructing a B-tree (contd.)
(figure: root [8] with leaf children [1 2] and
[12 25])
66
Constructing a B-tree (contd.)
Adding 17 to the right leaf node would over-fill
it, so we take the middle key, promote it (to the
root) and split the leaf
(figure: root [8 17]; leaves [1 2 6], [12 14],
[25 28])
67
Constructing a B-tree (contd.)
Adding 68 causes us to split the right most leaf,
promoting 48 to the root, and adding 3 causes us
to split the left most leaf, promoting 3 to the
root 26, 29, 53, 55 then go into the leaves
3
8
17
48
1
2
6
7
12
14
16
52
53
55
68
25
26
28
29
68
Constructing a B-tree (contd.)
(figure: after inserting 45 the leaf splits,
promoting 28, and the full root then splits too;
new root [17], internal nodes [3 8] and [28 48],
leaves [1 2], [6 7], [12 14 16], [25 26],
[29 45], [52 53 55 68])
69
Inserting into a B-Tree
  • Attempt to insert the new key into a leaf
  • If this would result in that leaf becoming too
    big, split the leaf into two, promoting the
    middle key to the leaf's parent
  • If this would result in the parent becoming too
    big, split the parent into two, promoting the
    middle key
  • This strategy might have to be repeated all the
    way to the top
  • If necessary, the root is split in two and the
    middle key is promoted to a new root, making the
    tree one level higher
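A compact sketch of this strategy (our illustration in Python, not the slides' code): a key always goes into a leaf, an overflowing node splits around its middle key, the middle key is promoted, and a root split grows the tree by one level. Order m = 5 as in the examples.

```python
M = 5  # order: each node has at most M children, so at most M - 1 keys

class Node:
    def __init__(self, keys=None, kids=None):
        self.keys = keys or []
        self.kids = kids or []   # an empty kids list means the node is a leaf

def _insert(node, key):
    """Insert key below node; return (mid, right) if node split, else None."""
    if not node.kids:                      # leaf: the key always lands here
        node.keys.append(key)
        node.keys.sort()
    else:                                  # internal node: descend into a child
        i = sum(k < key for k in node.keys)
        split = _insert(node.kids[i], key)
        if split:                          # child split: absorb the promoted key
            mid, right = split
            node.keys.insert(i, mid)
            node.kids.insert(i + 1, right)
    if len(node.keys) > M - 1:             # overflow: split around the middle key
        m = len(node.keys) // 2
        mid = node.keys[m]
        right = Node(node.keys[m + 1:], node.kids[m + 1:])
        node.keys, node.kids = node.keys[:m], node.kids[:m + 1]
        return mid, right
    return None

def insert(root, key):
    split = _insert(root, key)
    if split:                              # root split: tree grows one level taller
        mid, right = split
        return Node([mid], [root, right])
    return root
```

Feeding in the slides' key sequence ends with 17 alone in the root, matching the worked example.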

70
Exercise in Inserting a B-Tree
  • Insert the following keys into a 5-way B-tree:
  • 3, 7, 9, 23, 45, 1, 5, 14, 25, 24, 13, 11, 8, 19,
    4, 31, 35, 56
  • Check your approach with a neighbour and discuss
    any differences.

71
Removal from a B-tree
  • During insertion, the key always goes into a
    leaf. For deletion we wish to remove from a
    leaf. There are three possible ways we can do
    this:
  • 1 - If the key is already in a leaf node, and
    removing it doesn't cause that leaf node to have
    too few keys, then simply remove the key to be
    deleted.
  • 2 - If the key is not in a leaf, then it is
    guaranteed (by the nature of a B-tree) that its
    predecessor or successor will be in a leaf; in
    this case we can delete the key and promote the
    predecessor or successor key to the non-leaf
    deleted key's position.

72
Removal from a B-tree (2)
  • If (1) or (2) leads to a leaf node containing
    fewer than the minimum number of keys, then we
    have to look at the siblings immediately adjacent
    to the leaf in question:
  • 3 - if one of them has more than the minimum
    number of keys, then we can promote one of its
    keys to the parent and take the parent key into
    our lacking leaf
  • 4 - if neither of them has more than the minimum
    number of keys, then the lacking leaf and one of
    its neighbours can be combined with their shared
    parent (the opposite of promoting a key), and the
    new leaf will have the correct number of keys; if
    this step leaves the parent with too few keys,
    then we repeat the process up to the root itself,
    if required

73
Type 1: Simple leaf deletion
Assuming a 5-way B-tree, as before...
Delete 2: since there are enough keys in the
node, just delete it
74
Type 2: Simple non-leaf deletion
Delete 52
Borrow the predecessor or (in this case) the
successor, 56
75
Type 4: Too few keys in node and its siblings
Delete 72: too few keys!
76
Type 4: Too few keys in node and its siblings
(continued)
77
Type 3: Enough siblings
Delete 22
78
Type 3: Enough siblings
(figure: after the borrow the nodes hold 7, 9,
12, 15, 29, 31)
79
Exercise in Removal from a B-Tree
  • Given the 5-way B-tree created by these data
    (last exercise):
  • 3, 7, 9, 23, 45, 1, 5, 14, 25, 24, 13, 11, 8, 19,
    4, 31, 35, 56
  • Add these further keys: 2, 6, 12
  • Delete these keys: 4, 5, 7, 3, 14
  • Again, check your approach with a neighbour and
    discuss any differences.

80
Analysis of B-Trees
  • The maximum number of items in a B-tree of order
    m and height h:
  • root:     m − 1
  • level 1:  m(m − 1)
  • level 2:  m²(m − 1)
  • . . .
  • level h:  mʰ(m − 1)
  • So, the total number of items is
    (1 + m + m² + m³ + … + mʰ)(m − 1)
    = ((mʰ⁺¹ − 1)/(m − 1))(m − 1) = mʰ⁺¹ − 1
  • When m = 5 and h = 2 this gives 5³ − 1 = 124
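The level-by-level sum telescopes to mʰ⁺¹ − 1, which a quick check confirms (helper name ours):

```python
def max_items(m, h):
    """Sum (m - 1) keys per node over all m**level nodes on levels 0..h."""
    return sum(m**level * (m - 1) for level in range(h + 1))
```

max_items(5, 2) gives 124, and for any m and h the sum equals m**(h + 1) - 1.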

81
Reasons for using B-Trees
  • When searching tables held on disc, the cost of
    each disc transfer is high but doesn't depend
    much on the amount of data transferred,
    especially if consecutive items are transferred
  • If we use a B-tree of order 101, say, we can
    transfer each node in one disc read operation
  • A B-tree of order 101 and height 3 can hold
    101⁴ − 1 items (approximately 100 million), and
    any item can be accessed with 3 disc reads
    (assuming we hold the root in memory)
  • If we take m = 3, we get a 2-3 tree, in which
    non-leaf nodes have two or three children (i.e.,
    one or two keys)
  • B-Trees are always balanced (since the leaves are
    all at the same level), so 2-3 trees make a good
    type of balanced tree

82
Comparing Trees
  • Binary trees
  • Can become unbalanced and lose their good time
    complexity (big O)
  • AVL trees are strict binary trees that overcome
    the balance problem
  • Heaps remain balanced but only prioritise (not
    order) the keys
  • Multi-way trees
  • B-Trees can be m-way; they can have any (odd)
    number of children
  • One B-Tree, the 2-3 (or 3-way) B-Tree,
    approximates a permanently balanced binary tree,
    exchanging the AVL tree's balancing operations
    for insertion and (more complex) deletion
    operations