Title: Algorithm Analysis (Big O)
Algorithm Analysis (Big O)

Complexity
- In examining algorithm efficiency we must understand the idea of complexity
  - Space complexity
  - Time complexity
Space Complexity
- When memory was expensive and machines didn't have much, we focused on making programs as space-efficient as possible, and we developed schemes to make memory appear larger than it really was (virtual memory and memory paging schemes)
- Although not as important today, space complexity is still important in the field of embedded computing (handheld computer-based equipment like cell phones, palm devices, etc.)
Time Complexity
- Is the algorithm fast enough for my needs?
- How much longer will the algorithm take if I increase the amount of data it must process?
- Given a set of algorithms that accomplish the same thing, which is the right one to choose?
Cases to examine
- Best case
  - if the algorithm is executed, the fewest number of instructions are executed
- Average case
  - executing the algorithm produces path lengths that will on average be the same
- Worst case
  - executing the algorithm produces path lengths that are always a maximum
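These three cases can be made concrete with a linear search. The sketch below is illustrative (not from the original slides); the helper name `find` is hypothetical:

    #include <vector>

    // Linear search: returns the index of target, or -1 if absent.
    // Best case:    target is the first element      -> 1 comparison
    // Average case: target sits mid-collection       -> about n/2 comparisons
    // Worst case:   target is last or absent         -> n comparisons
    int find(const std::vector<int>& data, int target) {
        for (int i = 0; i < static_cast<int>(data.size()); i++)
            if (data[i] == target)
                return i;
        return -1;
    }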
Worst case analysis
- Of the three cases, the only really useful case (from the standpoint of program design) is the worst case.
- Worst case analysis helps answer the software lifecycle question: if it's good enough today, will it be good enough tomorrow?
Frequency Count
- examine a piece of code and predict the number of instructions to be executed
- Ex: for each instruction, predict how many times each will be encountered as the code runs

    Inst  Code                          F.C.
    1     for (int i = 0; i < n; i++)   n+1
    2         cout << i;                n
    3         p = p + i;                n
                                        ----
                                        3n+1

- totaling the counts produces the F.C. (frequency count)
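One way to check the table is to count executions at run time. Below is a minimal sketch (not from the slides) that increments a counter once per instruction encounter; the comma operator in the loop condition counts the n+1 header tests:

    #include <iostream>

    int main() {
        int n = 10;          // any sample size
        long fc = 0;         // frequency count accumulator
        int p = 0;

        for (int i = 0; fc++, i < n; i++) {  // instruction 1: n+1 tests
            fc++;                            // instruction 2 runs n times
            std::cout << i << " ";
            fc++;                            // instruction 3 runs n times
            p = p + i;
        }
        std::cout << "\np = " << p << "  F.C. = " << fc
                  << " (expected 3n+1 = " << 3 * n + 1 << ")\n";
    }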
Order of magnitude
- In the previous example best_case = avg_case = worst_case, because the example was based just on fixed iteration
- taken by itself, F.C. is relatively meaningless, but expressed as an order of magnitude we can use it as an estimator of algorithm performance as we increase the amount of data
- to convert F.C. to order of magnitude:
  - discard constant terms
  - disregard coefficients
  - pick the most significant term
- the order of magnitude of 3n+1 becomes n
- if F.C. is always calculated taking a worst-case path through the algorithm, the order of magnitude is called Big O (i.e. O(n))
Another example

    Inst  Code                               F.C.          expanded
    1     for (int i = 0; i < n; i++)        n+1           n+1
    2         for (int j = 0; j < n; j++)    (n+1)(n+1)    n²+2n+1
    3             cout << i;                 n·n           n²
    4             p = p + i;                 n·n           n²
                                                           --------
                                                           3n²+3n+2

- discarding constant terms produces 3n²+3n
- disregarding coefficients: n²+n
- picking the most significant term: n²
- Big O = O(n²)
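The same counter trick verifies the quadratic growth (a sketch, not from the slides). Note that an actual run charges the inner loop header n(n+1) tests rather than the slide's (n+1)(n+1), giving a total of 3n²+2n+1; the difference disappears once we reduce to an order of magnitude, and doubling n still roughly quadruples the count:

    #include <iostream>

    int main() {
        for (int n : {10, 20, 40}) {
            long fc = 0;
            int p = 0;
            for (int i = 0; fc++, i < n; i++)        // inst 1: n+1 tests
                for (int j = 0; fc++, j < n; j++) {  // inst 2: n(n+1) tests
                    fc++;                            // inst 3 (cout << i): n·n
                    fc++; p = p + i;                 // inst 4: n·n
                }
            std::cout << "n = " << n << "  F.C. = " << fc
                      << "  (3n^2+2n+1 = " << 3L * n * n + 2 * n + 1 << ")\n";
        }
    }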
What is Big O
- Big O is the rate at which the performance of an algorithm degrades as a function of the amount of data it is asked to handle
- For example, O(n) indicates that performance degrades at a linear rate; O(n²) indicates that the rate of degradation follows a quadratic path
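To see the degradation directly, here is a rough timing sketch (not from the slides; exact times depend on the machine and compiler). Doubling n should roughly double the O(n) time and quadruple the O(n²) time:

    #include <chrono>
    #include <iostream>

    volatile long sink = 0;   // keeps the compiler from deleting the loops

    void linear(long n)    { for (long i = 0; i < n; i++) sink += i; }
    void quadratic(long n) { for (long i = 0; i < n; i++)
                                 for (long j = 0; j < n; j++) sink += j; }

    template <typename F>
    double ms(F f, long n) {
        auto t0 = std::chrono::steady_clock::now();
        f(n);
        auto t1 = std::chrono::steady_clock::now();
        return std::chrono::duration<double, std::milli>(t1 - t0).count();
    }

    int main() {
        for (long n : {2000L, 4000L, 8000L})
            std::cout << "n = " << n
                      << "  linear: "    << ms(linear, n)    << " ms"
                      << "  quadratic: " << ms(quadratic, n) << " ms\n";
    }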
Common growth rates
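- From slowest- to fastest-growing, the rates usually charted here are:
  - O(1) constant
  - O(log₂n) logarithmic
  - O(n) linear
  - O(n log₂n)
  - O(n²) quadratic
  - O(n³) cubic
  - O(2ⁿ) exponential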
Small collections of data

[Figure: growth curves of n² and log₂n over small n]

- For small collections of data, an algorithm with a worse overall big O may actually perform better than an algorithm with a better big O
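To see this numerically, the sketch below compares a hypothetical O(n²) algorithm with a small constant cost against an O(log₂n) algorithm with a large per-step constant; the constant 100 is an assumption for illustration, not from the slides:

    #include <cmath>
    #include <iostream>

    int main() {
        // cost_a: O(n^2) with a small constant; cost_b: O(log2 n) with a big one.
        // The factor 100 is purely illustrative.
        for (int n = 5; n <= 30; n += 5) {
            double cost_a = static_cast<double>(n) * n;   // n^2
            double cost_b = 100.0 * std::log2(n);         // 100 * log2(n)
            std::cout << "n = " << n
                      << "  n^2 = " << cost_a
                      << "  100*log2(n) = " << cost_b
                      << (cost_a < cost_b ? "   <- n^2 is cheaper" : "") << "\n";
        }
    }

With these constants the O(n²) algorithm is cheaper up to roughly n = 20, after which the O(log₂n) algorithm takes over.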
Combining Algorithms
- Sometimes we can increase the overall performance of an algorithm (but not its big O) by combining two algorithms.
- Suppose we have a large number of data elements to be sorted, and the algorithm we pick runs in O(n log₂n), but we notice that it spends a lot of time doing housekeeping for small sub-collections of data.
- We can take advantage of the better performance of an O(n²) algorithm on small collections of data by switching algorithms when we notice that the collection size is getting small (see the sketch after this list)
- This will improve the overall performance (the time this specific case takes to run) but will not change the big O
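A common realization of this idea is quicksort with an insertion-sort cutoff for small partitions. The sketch below is one way to do it; the cutoff of 16 and the helper names are illustrative assumptions, not from the slides:

    #include <utility>
    #include <vector>

    const int CUTOFF = 16;   // assumed threshold; tuned empirically in practice

    // O(n^2) sort that is fast on tiny ranges (little housekeeping)
    void insertion_sort(std::vector<int>& a, int lo, int hi) {
        for (int i = lo + 1; i <= hi; i++) {
            int key = a[i], j = i - 1;
            while (j >= lo && a[j] > key) { a[j + 1] = a[j]; j--; }
            a[j + 1] = key;
        }
    }

    // O(n log2 n) quicksort that hands small partitions to insertion sort
    void hybrid_sort(std::vector<int>& a, int lo, int hi) {
        if (hi - lo + 1 <= CUTOFF) {        // small collection: switch algorithms
            insertion_sort(a, lo, hi);
            return;
        }
        int pivot = a[lo + (hi - lo) / 2];
        int i = lo, j = hi;
        while (i <= j) {                    // Hoare-style partition
            while (a[i] < pivot) i++;
            while (a[j] > pivot) j--;
            if (i <= j) { std::swap(a[i], a[j]); i++; j--; }
        }
        if (lo < j) hybrid_sort(a, lo, j);
        if (i < hi) hybrid_sort(a, i, hi);
    }

Calling hybrid_sort(v, 0, v.size() - 1) sorts v. The big O is still O(n log₂n), but the constant factor drops because tiny partitions skip the partitioning and recursion overhead.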
Summary
- Order-of-magnitude analysis and Big O measure an algorithm's time requirement as a function of the problem size by using a growth-rate function. This allows you to analyze efficiency without regard to factors like computer speed and the skill of the person coding the solution.
- When comparing the efficiency of algorithms, you examine their growth-rate functions when the problems are large. Only significant differences in growth-rate functions are meaningful.
- Worst-case analysis considers the maximum amount of work an algorithm will require, while average-case analysis considers the expected amount of work.
- Use order-of-magnitude analysis to help you select a particular implementation for an ADT. If your application frequently uses particular ADT operations, your implementation should be efficient for at least those operations.