Transcript and Presenter's Notes

Title: Algorithm Analysis


1
Algorithm Analysis
  • Bina Ramamurthy
  • CSE116A,B

2
Introduction
  • The basic mathematical techniques for analyzing
    algorithms are central to more advanced topics in
    computer science and give you a way to formalize
    the notion that one algorithm is significantly
    more efficient than another.
  • Instead of the traditional model, our text defines
    a model that is appropriate for today's
    computation, software, and hardware.

3
Basic Axioms
  • Axiom 2.1 The time required to fetch an operand
    from memory is constant, Tfetch, and the time
    required to store a result in memory is a
    constant, Tstore.
  • Axiom 2.2 The times required to perform
    elementary arithmetic operations, such as addition,
    subtraction, multiplication, division, and
    comparison, are all constants. These times are
    denoted by T+, T-, T×, T÷, and T&lt;.

4
Basic Axioms (contd.)
  • Axiom 2.3 The time required to call a method is
    constant, Tcall, and the time required to return
    from a method is a constant, Treturn.
  • Axiom 2.4 The time required to pass an
    argument to a method is the same as the time
    required to store a value in memory, Tstore.

5
Examples
  • y = x
  • y = 1
  • y = y + 1
  • ++y
  • y = f(x)
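  Applying Axioms 2.1-2.4, each statement's running time can be tallied term
  by term; the tallies below are one plausible accounting, not necessarily
  the exact counts used in the text:
  • y = x : Tfetch + Tstore
  • y = 1 : Tfetch + Tstore (fetching a constant also costs Tfetch)
  • y = y + 1 : 2Tfetch + T+ + Tstore
  • ++y : 2Tfetch + T+ + Tstore
  • y = f(x) : Tfetch + Tstore + Tcall + T(f) + Tstore
    (fetch x, pass it as an argument, call f, run the body of f, store the result)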

6
A Simple Example
    public class Example {
        public static int sum(int n) {
            int result = 0;
            for (int i = 1; i <= n; ++i)
                result += i;
            return result;
        }
    }
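  A rough tally under the axioms (the precise constant terms in the text may
  differ) shows that the running time of sum grows linearly with n:
  • int result = 0 and int i = 1 : 2(Tfetch + Tstore)
  • each test i <= n : 2Tfetch + T&lt;, performed n + 1 times
  • each ++i and each result += i : 2Tfetch + T+ + Tstore, performed n times each
  • return result : Tfetch + Treturn
  • Total: a·n + b for some constants a and b, i.e., linear in n.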

7
Array Access
  • Axiom 2.5 The time required for the address
    calculation implied by an array subscripting
    operation, for example a[i], is a constant, T[·]. This time
    does not include the time to compute the
    subscript expression, nor does it include the
    time to access (fetch) the array element.
  • Example: y = a[k]
  • Running time: 3Tfetch + T[·] + Tstore
    (fetch a, k, and a[k]; compute the address; store into y)

8
Example: Horner's Rule
  • Horner's rule gives a method to evaluate
    polynomials p(x) = an·x^n + ... + a1·x + a0,
    i.e., the sum of ai·x^i for i = 0 to n.

    public static int horner(int[] a, int n, int x) {
        int result = a[n];
        for (int j = n - 1; j >= 0; --j)
            result = result * x + a[j];
        return result;
    }
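  As a quick check (the values here are chosen for illustration): with
  coefficients a = {1, 3, 2}, so that p(x) = 2x² + 3x + 1, n = 2, and x = 2,
  the loop computes (2·2 + 3)·2 + 1 = 15 = p(2). Each of the n iterations does
  a constant amount of work, so the running time grows linearly with n.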

9
Example: findMaximum

    public static int findMaximum(int[] a) {
        int result = a[0];
        for (int j = 1; j < a.length; ++j)
            if (a[j] > result)
                result = a[j];
        return result;
    }
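  For example (illustrative values), with a = {4, 9, 2, 7} the loop makes
  a.length - 1 = 3 comparisons and returns 9. The number of comparisons
  depends only on a.length, but how often result is reassigned depends on
  the actual values, which leads to the question on the next slide.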

10
Average Running Times
  • In the last two examples we computed running
    times as a function of both the number of input
    values and the actual input values.
  • What will the average running time be? If we run
    the program on many different sequences of n
    numbers, what will the running time be on average?

11
Order-of-Magnitude Analysis
  • If Algorithm A requires time proportional to f(N),
    Algorithm A is said to be of order f(N), which is
    denoted by O(f(N)).
  • f(N) is called the algorithm's growth-rate
    function.
  • The notation uses the upper-case O to denote
    order; it is called Big O notation.
  • If a problem of size N requires time that is
    directly proportional to N, the problem is O(N);
    if the time is proportional to N², it is O(N²),
    and so on.

12
Key concepts
  • Formal definition of the order of an algorithm:
    Algorithm A is order f(N) -- denoted O(f(N)) -- if
    constants c and N0 exist such that A requires no
    more than c·f(N) time units to solve a problem
    of size N ≥ N0.
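  For example, an algorithm that requires 3N + 10 time units is O(N):
  choosing c = 4 and N0 = 10 gives 3N + 10 ≤ 4N for every N ≥ 10, so the
  definition is satisfied.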

13
Interpretation of growth-rate functions
  • 1 -- A growth-rate function of 1 implies a problem
    whose time requirement is constant and, therefore,
    independent of the problem size.
  • log2N -- The time requirement for a logarithmic
    algorithm increases slowly as the problem size
    increases. For example, if you square the problem
    size, you only double the time requirement.
  • N -- Linear algorithm. The time requirement increases
    directly with the size of the problem.
  • N-squared -- Quadratic algorithm. Algorithms that use
    two nested loops are examples.

14
Interpretation of growth-rate functions (contd.)
  • N-cubed -- The time requirement for a cubic algorithm
    increases more rapidly than that of a quadratic one.
    Algorithms that use three nested loops are examples.
  • 2^N -- Exponential algorithm. The time requirement
    increases too rapidly for such algorithms to be of
    any practical use on large problems.
  • N·log2N -- Typical of algorithms that divide the
    problem into subproblems, solve them, and combine
    their solutions.
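  To make these differences concrete, the small sketch below (not from the
  slides; the chosen values of N are illustrative) prints a few of the
  growth-rate functions side by side:

    public class GrowthRates {
        public static void main(String[] args) {
            System.out.printf("%8s %10s %12s %12s %16s%n",
                    "N", "log2 N", "N log2 N", "N^2", "2^N");
            for (int n : new int[] {10, 100, 1000}) {
                double log2 = Math.log(n) / Math.log(2);
                System.out.printf("%8d %10.1f %12.0f %12d %16s%n",
                        n, log2, n * log2, (long) n * n,
                        // 2^N overflows a long once N exceeds 62, so label it
                        n <= 62 ? Long.toString(1L << n) : "astronomically large");
            }
        }
    }

  Running it shows N² and especially 2^N outpacing the other functions almost
  immediately as N grows.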

15
Properties of growth rate functions
  • You can ignore low-order terms in an algorithm's
    growth-rate function.
  • Example: O(N³ + 4N² + 3N) is O(N³).
  • You can ignore a multiplicative constant in the
    high-order term of a growth-rate function.
  • Example: O(5·N³) is O(N³).
  • You can combine growth-rate functions.
  • Example: O(f(N)) + O(g(N)) = O(f(N) + g(N))

16
Worst-case and average-case analyses
  • A particular algorithm may require different
    times to solve different problems of the same
    size.
  • For example, searching for an element in a sorted
    list.
  • Worst-case analysis gives a pessimistic (upper-bound)
    time estimate.
  • Average-case analysis attempts to determine the
    average amount of time that algorithm A requires to
    solve problems of size N.

17
How to use order-of-magnitude analysis?
  • For example, an array-based listRetrieve is O(1),
    meaning that whether it is the Nth element or the
    1st element, it takes the same time to access it.
  • A linked-list-based listRetrieve is O(N),
    meaning that the retrieval time depends on the
    position of the element in the list (see the
    sketch after this list).
  • When using an ADT's implementation, consider how
    frequently particular ADT operations occur in a
    given application.
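  A minimal sketch of the two retrievals (the Node class and method names are
  illustrative assumptions, not the course's ADT code):

    class Node {
        int item;
        Node next;
        Node(int item, Node next) { this.item = item; this.next = next; }
    }

    class RetrieveDemo {
        // Array-based retrieve: one address calculation, the same cost
        // for every index -- O(1).
        static int arrayRetrieve(int[] a, int i) {
            return a[i];
        }

        // Linked-list retrieve: must follow i links from the head, so the
        // cost grows with the position of the element -- O(N).
        static int listRetrieve(Node head, int i) {
            Node cur = head;
            for (int k = 0; k < i; ++k)
                cur = cur.next;
            return cur.item;
        }
    }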

18
How to use order-of-magnitude analysis? (contd.)
  • If the problem size is small, you can ignore an
    algorithm's efficiency.
  • Compare algorithms for both style and efficiency.
  • Order-of-magnitude analysis focuses on large
    problems.
  • Sometimes you may have to weigh the trade-offs
    between an algorithm's time requirements and its
    memory requirements.

19
Efficiency of search algorithms
  • Linear search (sequential search) -- a sketch of the
    algorithm follows this list.
  • Best case: the first element is the required
    element -- O(1)
  • Worst case: the last element is the required element,
    or the element is not present -- O(N)
  • Average case: about N/2 comparisons; after dropping
    the multiplicative constant (1/2), O(N)
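  A sketch of the sequential search described above (the method name and the
  convention of returning -1 when the key is absent are my assumptions):

    public class LinearSearch {
        public static int linearSearch(int[] a, int key) {
            for (int i = 0; i < a.length; ++i)
                if (a[i] == key)
                    return i;   // best case: found at index 0 -- O(1)
            return -1;          // worst case: all N elements examined -- O(N)
        }
    }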

20
Binary search algorithm
  • Search requires the following steps:
  • 1. Inspect the middle item of an array of size N.
  • 2. Inspect the middle item of an array of size N/2.
  • 3. Inspect the middle item of an array of size
    N/2², and so on, until N/2^k = 1.
  • This implies k = log2N.
  • k is the number of partitions. (A sketch of the
    algorithm follows this list.)
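  One common iterative formulation of the halving process described above; it
  assumes the array is sorted in ascending order, and returning -1 for a
  missing key is my convention, not from the slides:

    public class BinarySearch {
        public static int binarySearch(int[] a, int key) {
            int low = 0, high = a.length - 1;
            while (low <= high) {
                int mid = (low + high) / 2;   // inspect the middle item
                if (a[mid] == key)
                    return mid;
                else if (a[mid] < key)
                    low = mid + 1;            // continue in the upper half
                else
                    high = mid - 1;           // continue in the lower half
            }
            return -1;                        // key not present
        }
    }

  Each iteration halves the remaining range, so at most about log2N
  iterations are needed, matching the worst case on the next slide.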

21
Binary search algorithm
  • Best case: O(1)
  • Worst case: O(log2N)
  • Average case: (log2N)/2, which is O(log2N)

22
Efficiency of sort algorithms
  • We will consider internal sorts (not external
    sorts).
  • Selection sort, Bubble sort (exchange sort),
    Insertion sort, Merge sort, Quick sort, Radix
    sort (a sketch of Selection sort follows this list)
  • For each sort, study the following:
  • 1. Algorithm
  • 2. Analysis and order-of-magnitude expression
  • 3. Application
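  As an illustration of one of the internal sorts listed above, a minimal
  selection sort (my sketch, not the course's code); its two nested loops
  over the N elements give the O(N²) growth rate discussed earlier:

    public class SelectionSort {
        public static void selectionSort(int[] a) {
            for (int i = 0; i < a.length - 1; ++i) {
                int min = i;
                for (int j = i + 1; j < a.length; ++j)
                    if (a[j] < a[min])
                        min = j;              // index of smallest remaining item
                int tmp = a[i];               // swap it into position i
                a[i] = a[min];
                a[min] = tmp;
            }
        }
    }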