Algorithm Efficiency and Sorting - PowerPoint PPT Presentation

1
Algorithm Efficiency and Sorting
  • Bina Ramamurthy
  • CSE116A,B

2
Introduction
  • The basic mathematical techniques for analyzing
    algorithms are central to more advanced topics in
    computer science and give you a way to formalize
    the notion that one algorithm is significantly
    more efficient than another.
  • We will also study sorting algorithms. Sorting
    algorithms provide varied and relatively easy
    examples of the analysis of efficiency.

3
Topics to be Covered
  • Comparing programs
  • Time as a function of problem size
  • Big-O notation
  • Growth-rate functions
  • Worst-case and average-case analyses
  • Summary

4
Comparing Programs
  • Comparing the execution times of programs instead
    of the algorithms themselves has the following
    difficulties:
  • 1. Efficiency may depend on the implementation of
    an algorithm rather than on the algorithm itself.
  • 2. Execution time depends on the computer or
    hardware on which the program is run.
  • 3. Execution time depends on the particular
    instance of data that is used.
  • Algorithm analysis should be independent of
    specific implementations, computers, and data.

5
Example 1
  • PtrType cur = head;             // 1 assignment
  • while (cur != null) {           // N+1 comparisons
  •   System.out.println(cur.data); // N writes
  •   cur = cur.next;               // N assignments
  • }
  • Time = (N+1)a + (N+1)c + Nw
  • where a is the assignment time, c the comparison
    time, and w the write time.
  • Simply, time is proportional to N.
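The fragment above can be turned into runnable Java. This is a minimal sketch: the `Node` class (standing in for `PtrType`) and the write counter are illustrative assumptions, not part of the original slides.

```java
public class TraversalDemo {
    // Minimal node type standing in for PtrType (an assumption for this sketch).
    static class Node {
        int data; Node next;
        Node(int data, Node next) { this.data = data; this.next = next; }
    }

    // Prints every element: 1 initial assignment, N+1 comparisons,
    // N writes, N pointer assignments -> time proportional to N.
    static int printAll(Node head) {
        int writes = 0;
        Node cur = head;                  // 1 assignment
        while (cur != null) {             // N+1 comparisons
            System.out.println(cur.data); // N writes
            cur = cur.next;               // N assignments
            writes++;
        }
        return writes;
    }

    public static void main(String[] args) {
        Node head = new Node(1, new Node(2, new Node(3, null)));
        System.out.println(printAll(head));
    }
}
```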

6
Time as a function of problem size
  • Absolute time expressions have the same
    difficulties as comparing execution times:
  • Alg. A requires N²/5 time units.
  • Alg. B requires 5N time units.
  • Moreover, the attribute of interest is how
    quickly the algorithm's time requirement grows as
    a function of the problem size.
  • Instead of the above expressions:
  • Alg. A requires time proportional to N².
  • Alg. B requires time proportional to N.
  • These characterize the inherent efficiency of
    algorithms, independent of such factors as
    particular computers and implementations.
  • The analyses are done for large values of N.

7
Order-of-Magnitude Analysis
  • If Alg. A requires time proportional to f(N),
    Alg. A is said to be order f(N), which is denoted
    O(f(N)).
  • f(N) is called the algorithm's growth-rate
    function.
  • Because the notation uses the upper-case O to
    denote order, it is called Big-O notation.
  • If a problem of size N requires time that is
    directly proportional to N, the problem is O(N);
    if it is directly proportional to N², then it is
    O(N²), and so on.

8
Key concepts
  • Formal definition of the order of an algorithm:
    Algorithm A is order f(N) -- denoted O(f(N)) -- if
    constants c and N0 exist such that A requires no
    more than c·f(N) time units to solve a problem
    of size N > N0.
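As an illustration of the definition (not from the slides): the loop of Example 1 performs roughly 3N + 2 primitive operations, and 3N + 2 ≤ 4N for all N ≥ 2, so it is O(N) with the witnesses c = 4 and N0 = 2. A small sketch that checks this bound numerically:

```java
public class BigODemo {
    // Operation count of the Example 1 loop (hypothetical unit costs):
    // (N+1) assignments + (N+1) comparisons + N writes = 3N + 2.
    static long ops(long n) { return 3 * n + 2; }

    public static void main(String[] args) {
        long c = 4, n0 = 2;
        // Verify T(N) <= c * f(N) with f(N) = N for all tested N > N0.
        for (long n = n0; n <= 1000; n++) {
            if (ops(n) > c * n) throw new AssertionError("bound fails at N = " + n);
        }
        System.out.println("3N + 2 is O(N) with c = 4, N0 = 2");
    }
}
```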

9
Interpretation of growth-rate functions
  • 1 -- A growth-rate function of 1 implies a problem
    whose time requirement is constant and therefore
    independent of the problem size.
  • log2N -- The time requirement for a logarithmic
    algorithm increases slowly as the problem size
    increases. For example, if you square the problem
    size, you only double the time requirement.
  • N -- Linear. The time requirement increases
    directly with the size of the problem.
  • N² -- Quadratic. Algorithms that use two
    nested loops are examples.

10
Interpretation of growth-rate functions
  • N³ -- Cubic. The time requirement for a cubic
    algorithm increases more rapidly; three nested
    loops are an example.
  • 2^N -- Exponential. The time requirement grows too
    rapidly to be of any practical use.
  • N·log2N -- Algorithms that divide the problem
    into subproblems, solve them, and combine the
    results.
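The constant, linear, and quadratic growth rates above can be illustrated with small Java routines; this is an illustrative sketch (the method names are not from the slides) whose comments note how often each statement executes:

```java
public class GrowthRates {
    // O(1): constant time, independent of the problem size.
    static int first(int[] a) { return a[0]; }

    // O(N): a single pass over the input; the loop body runs N times.
    static long sum(int[] a) {
        long s = 0;
        for (int x : a) s += x;
        return s;
    }

    // O(N²): two nested loops, as in the quadratic case above;
    // the inner statement executes N(N-1)/2 times.
    static int pairCount(int[] a) {
        int count = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                count++;
        return count;
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4};
        System.out.println(first(a));      // constant work
        System.out.println(sum(a));        // linear work
        System.out.println(pairCount(a));  // quadratic work: 4*3/2 = 6
    }
}
```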

11
Properties of growth-rate functions
  • You can ignore low-order terms in an algorithm's
    growth-rate function.
  • Example: O(N² + N) = O(N²).
  • You can ignore a multiplicative constant in the
    high-order term of a growth-rate function.
  • Example: O(5N²) = O(N²).
  • You can combine growth-rate functions.
  • Example: O(f(N)) + O(g(N)) = O(f(N) + g(N)).

12
Worst-case and average-case analyses
  • A particular algorithm may require different
    times to solve different problems of the same
    size.
  • For example, searching for an element in a sorted
    list.
  • Worst-case analysis gives a pessimistic time
    estimate.
  • Average-case analysis attempts to determine the
    average amount of time that an algorithm requires
    to solve problems of size N.

13
How to use order-of-magnitude analysis?
  • For example, an array-based listRetrieve is O(1),
    meaning that whether it is the Nth element or the
    1st, it takes the same time to access.
  • A linked-list-based listRetrieve is O(N),
    meaning that the retrieval time depends on the
    position of the element in the list.
  • When using an ADT's implementation, consider how
    frequently particular ADT operations occur in a
    given application.
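A minimal sketch (not the course's actual ADT code) contrasting the two retrieval costs described above:

```java
public class RetrieveDemo {
    // Minimal linked node for the sketch (an assumption, not the course's class).
    static class Node {
        int data; Node next;
        Node(int data, Node next) { this.data = data; this.next = next; }
    }

    // Array-based retrieval: one indexed access regardless of position -> O(1).
    static int arrayRetrieve(int[] a, int index) {
        return a[index];
    }

    // Linked-list retrieval: must follow `index` links before reaching
    // the element, so the cost depends on its position -> O(N).
    static int listRetrieve(Node head, int index) {
        Node cur = head;
        for (int i = 0; i < index; i++)
            cur = cur.next;
        return cur.data;
    }

    public static void main(String[] args) {
        Node head = new Node(10, new Node(20, new Node(30, null)));
        System.out.println(arrayRetrieve(new int[]{10, 20, 30}, 2));
        System.out.println(listRetrieve(head, 2));
    }
}
```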

14
How to ?
  • If the problem size is small, you can ignore an
    algorithm's efficiency.
  • Compare algorithms for both style and efficiency.
  • Order-of-magnitude analysis focuses on large
    problems.
  • Sometimes you may have to weigh the trade-offs
    between an algorithm's time requirements and its
    memory requirements.

15
Efficiency of search algorithms
  • Linear search (sequential search):
  • Best case: the first element is the required
    element -- O(1).
  • Worst case: the last element, or the element is
    not present -- O(N).
  • Average case: N/2 comparisons; after dropping the
    multiplicative constant (1/2) -- O(N).
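The three cases can be seen directly in a straightforward Java implementation (a sketch, not the course's code):

```java
public class LinearSearch {
    // Returns the index of target in a, or -1 if absent.
    static int search(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) return i; // best case: found at i == 0, O(1)
        }
        return -1;                        // worst case: N comparisons, O(N)
    }

    public static void main(String[] args) {
        int[] a = {4, 8, 15, 16, 23, 42};
        System.out.println(search(a, 4));  // best case: first element
        System.out.println(search(a, 42)); // worst case: last element
        System.out.println(search(a, 7));  // worst case: not present
    }
}
```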

16
Binary search algorithm
  • Search requires the following steps:
  • 1. Inspect the middle item of an array of size N.
  • 2. Inspect the middle item of an array of size N/2.
  • 3. Inspect the middle item of an array of size
    N/2², and so on, until N/2^k = 1.
  • This implies k = log2N.
  • k is the number of partitions.
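The halving steps above correspond to the following iterative Java implementation (a sketch; the array is assumed to be sorted in ascending order):

```java
public class BinarySearch {
    // Iterative binary search on a sorted array. Each iteration halves the
    // remaining range, so at most about log2(N) + 1 middle items are inspected.
    static int search(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;       // middle item of the current range
            if (a[mid] == target) return mid;
            else if (a[mid] < target) lo = mid + 1; // discard the lower half
            else hi = mid - 1;                      // discard the upper half
        }
        return -1; // not present
    }

    public static void main(String[] args) {
        int[] a = {2, 3, 5, 7, 11, 13, 17};
        System.out.println(search(a, 7));  // found after inspecting the middle
        System.out.println(search(a, 4));  // absent: range shrinks to empty
    }
}
```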

17
Binary search algorithm
  • Best case: O(1)
  • Worst case: O(log2N)
  • Average case: (log2N)/2, which is O(log2N)

18
Efficiency of sort algorithms
  • We will consider internal sorts (not external
    sorts):
  • Selection sort, Bubble sort (exchange sort),
    Insertion sort, Merge sort, Quick sort, Radix
    sort.
  • For each sort, study the:
  • 1. Algorithm
  • 2. Analysis and order-of-magnitude expression
  • 3. Application
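As a preview, here is a sketch of the first sort listed, selection sort (illustrative code, not the course's), with its quadratic comparison count noted in the comments:

```java
import java.util.Arrays;

public class SelectionSort {
    // Selection sort: N-1 passes; pass i finds the smallest remaining
    // element and swaps it into position i.
    // Total comparisons: N(N-1)/2 -> O(N²).
    static void sort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (a[j] < a[min]) min = j; // track smallest remaining element
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp; // swap into place
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 1, 4, 2};
        sort(a);
        System.out.println(Arrays.toString(a));
    }
}
```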