Title: Complexity Analysis Part I
1 Complexity Analysis (Part I)
- Motivation
- Average, Best, and Worst Case Complexity Analysis
- Asymptotic Analysis
2 Motivations for Complexity Analysis
- There are often many different algorithms which can be used to solve the same problem.
- For example, assume that we want to search for a key in a sorted array.
- Thus, it makes sense to develop techniques that allow us to
  - compare different algorithms with respect to their efficiency
  - choose the most efficient algorithm for the problem
- The efficiency of any algorithmic solution to a problem can be measured according to its
  - Time efficiency: the time it takes to execute.
  - Space efficiency: the space (primary or secondary memory) it uses.
- We will focus on an algorithm's efficiency with respect to time.
3 How do we Measure Efficiency?
- Running time in micro/milliseconds
  - Advantages
  - Disadvantages
- Instead,
  - we count the number of basic operations the algorithm performs.
  - we calculate how this number depends on the size of the input.
- A basic operation is an operation which takes a constant amount of time to execute.
  - E.g., the operations listed on the next slide.
- Hence, the efficiency of an algorithm is the number of basic operations it performs. This number is a function of the input size n.
4 Example of Basic Operations
- Arithmetic operations: +, -, *, /, %
- Assignment statements of simple data types.
- Reading of primitive types.
- Writing of primitive types.
- Simple conditional tests: if (x < 12) ...
- A method call (Note: the execution time of the method itself may depend on the value of a parameter and it may not be constant.)
- A method's return statement.
- Memory access.
- We consider an operation such as +=, -=, and *= as consisting of two basic operations.
- Note: To simplify complexity analysis, we will not consider memory access (fetch or store) operations.
5 Best, Average, and Worst Case Complexities
- What is the best case complexity analysis?
- What is the worst case complexity analysis?
- What is the average case complexity analysis?
- We are usually interested in the worst case complexity because it is
  - Easier to compute
  - Usually close to the actual running time
  - Crucial to real-time systems (e.g. air-traffic control)
6 Best, Average, and Worst Case Complexities
- Example: For the linear search algorithm, determine the case and the number of comparisons for the
  - Best Case
  - Worst Case
  - Average Case
- Best, worst, and average complexities of common sorting algorithms
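To make the three cases concrete, here is a small sketch (class and counter names are my own, not from the slides) of linear search instrumented to count key comparisons: the best case (key in the first slot) costs 1 comparison, the worst case (key absent) costs n.

```java
// Linear search with a comparison counter, to make the
// best/worst/average cases of the slide concrete.
public class LinearSearchCount {
    static int comparisons; // key comparisons made by the last search

    // Returns the index of key in arr, or -1 if absent.
    static int search(int[] arr, int key) {
        comparisons = 0;
        for (int i = 0; i < arr.length; i++) {
            comparisons++;
            if (arr[i] == key) return i;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {2, 4, 6, 8, 10};
        search(a, 2);   // best case: key in first position -> 1 comparison
        System.out.println("best:  " + comparisons);
        search(a, 11);  // worst case: key absent -> n comparisons
        System.out.println("worst: " + comparisons);
    }
}
```

On average, a successful search examines about n/2 elements, which is why the average case is usually grouped with the worst case asymptotically.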
7 Simple Complexity Analysis
public class TestAbstractClass {
    public static void main(String[] args) {
        Employee[] list = new Employee[3];
        list[0] = new Executive("Jarallah Al-Ghamdi", 50000);
        list[1] = new HourlyEmployee("Azmat Ansari", 120);
        list[2] = new MonthlyEmployee("Sahalu Junaidu", 9000);
        ((Executive) list[0]).awardBonus(11000);
        for (int i = 0; i < list.length; i++)
            if (list[i] instanceof HourlyEmployee)
                ((HourlyEmployee) list[i]).addHours(60);
        for (int i = 0; i < list.length; i++) {
            list[i].print();
            System.out.println("Paid: " + list[i].pay());
            System.out.println();
        }
    }
}
8 Simple Complexity Analysis
- Counting the number of basic operations is cumbersome.
- We will learn that it is not important to count the exact number of basic operations.
- Instead, we count/find the maximum number of times a statement gets executed; this directly reflects the time complexity behavior of the algorithm.
9 Simple Complexity Analysis: Simple Loops
- Find the cost of the body of the loop (if independent of the iterator)
- Find the number of iterations of the for loop
- Multiply the two numbers to get the total number of operations
- E.g.
double x, y;
x = 2.5; y = 3.0;
for (int i = 0; i < n; i++) {
    a[i] = x + y;
    x = 2.5 * x;
    y = y + a[i];
}
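As a sanity check on the body-cost-times-iterations rule above, this sketch (class and method names are my own) runs the same loop with a counter and confirms the body executes exactly n times, so the total cost is (cost of body) * n plus a constant for the setup:

```java
// Counts how many times the body of the slide's simple loop executes:
// it is exactly n, independent of what the body computes.
public class SimpleLoopCost {
    static int bodyExecutions(int n) {
        double[] a = new double[n];
        double x = 2.5, y = 3.0;
        int count = 0;
        for (int i = 0; i < n; i++) {
            a[i] = x + y;
            x = 2.5 * x;
            y = y + a[i];
            count++;            // body runs once per iteration
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(bodyExecutions(10)); // 10
    }
}
```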
10 Simple Complexity Analysis: Complex Loops
- Method: Represent the cost of the for loop in summation form.
- The main idea is to make sure that we find an iterator that iterates over successive values.
- Examples:
for (int i = k; i < n; i = i * m) {
    statement1;
    statement2;
}
for (int i = k; i < n; i = i + m) {
    statement1;
    statement2;
}
11 Simple Complexity Analysis: Complex Loops
- Suppose n is a power of 2. Determine the number of basic operations performed by the method myMethod().
- Solution:
static int myMethod(int n) {
    int sum = 0;
    for (int i = 1; i < n; i = i * 2)
        sum = sum + i + helper(i);
    return sum;
}

static int helper(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum = sum + i;
    return sum;
}
12 Simple Complexity Analysis: Examples
- Suppose n is a power of 2. Determine the number of basic operations performed by the method myMethod().
- Solution: The number of iterations of the loop
  for (int i = 1; i < n; i = i * 2)
      sum = sum + i + helper(i);
  is log2 n (a proof will be given later).
- Hence the number of basic operations is
  1 + 1 + (1 + log2 n) + (log2 n)(2 + 4 + 1 + 1 + (n + 1) + 2n + 2n + 1) + 1
  = 3 + log2 n + (log2 n)(10 + 5n) + 1
  = 5n log2 n + 11 log2 n + 4
static int myMethod(int n) {
    int sum = 0;
    for (int i = 1; i < n; i = i * 2)
        sum = sum + i + helper(i);
    return sum;
}

static int helper(int n) {
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum = sum + i;
    return sum;
}
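The claim that the outer loop runs log2 n times when n is a power of 2 can be checked empirically. This sketch (helper name is my own) counts the iterations of the same doubling loop:

```java
// Counts iterations of the doubling loop from the slide:
// for(int i = 1; i < n; i = i * 2). For n a power of 2
// this is exactly log2(n).
public class LoopCount {
    static int doublingIterations(int n) {
        int count = 0;
        for (int i = 1; i < n; i = i * 2)
            count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(doublingIterations(16));   // log2(16) = 4
        System.out.println(doublingIterations(1024)); // log2(1024) = 10
    }
}
```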
13 Simple Complexity Analysis: Loops With Logarithmic Iterations
- In the following for-loop (with <):
  for (int i = k; i < n; i = i * m) {
      statement1;
      statement2;
  }
- The number of iterations is ⌈ log_m(n / k) ⌉
- In the following for-loop (with <=):
  for (int i = k; i <= n; i = i * m) {
      statement1;
      statement2;
  }
- The number of iterations is ⌊ log_m(n / k) ⌋ + 1
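Both formulas can be verified by brute force. This sketch (class and method names are my own) counts the iterations of each loop form and compares them against the ceiling and floor expressions:

```java
// Empirical check of the iteration-count formulas for
// multiplicative loops: i = k; i < n (or i <= n); i = i * m.
public class LogLoopCheck {
    static int iterStrict(int k, int n, int m) {     // condition i < n
        int c = 0;
        for (int i = k; i < n; i = i * m) c++;
        return c;
    }

    static int iterInclusive(int k, int n, int m) {  // condition i <= n
        int c = 0;
        for (int i = k; i <= n; i = i * m) c++;
        return c;
    }

    public static void main(String[] args) {
        int k = 3, n = 100, m = 2;
        double log = Math.log((double) n / k) / Math.log(m);
        // i takes the values 3, 6, 12, 24, 48, 96 -> 6 iterations
        System.out.println(iterStrict(k, n, m) + " vs ceil: "
                + (int) Math.ceil(log));
        System.out.println(iterInclusive(k, n, m) + " vs floor+1: "
                + ((int) Math.floor(log) + 1));
    }
}
```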
14 Asymptotic Growth
- Since counting the exact number of operations is cumbersome, sometimes impossible, we can always focus our attention on asymptotic analysis, where constants and lower-order terms are ignored.
- E.g. n^3, 1000n^3, and 10n^3 + 10000n^2 + 5n - 1 are all treated as the same.
- The reason we can do this is that we are always interested in comparing different algorithms for arbitrarily large input sizes.
15 Asymptotic Growth (1)
16 Asymptotic Growth (2)
17 Running Times for Different Sizes of Inputs of Different Functions
18 Asymptotic Complexity
- Finding the exact complexity, f(n) = number of basic operations, of an algorithm is difficult.
- We approximate f(n) by a function g(n) in a way that does not substantially change the magnitude of f(n) -- the function g(n) is sufficiently close to f(n) for large values of the input size n.
- This "approximate" measure of efficiency is called asymptotic complexity.
- Thus the asymptotic complexity measure does not give the exact number of operations of an algorithm, but it shows how that number grows with the size of the input.
- This gives us a measure that will work for different operating systems, compilers and CPUs.
19 Big-O (asymptotic) Notation
- The most commonly used notation for specifying asymptotic complexity is the big-O notation.
- The Big-O notation, O(g(n)), is used to give an upper bound on a positive runtime function f(n), where n is the input size.
- Definition of Big-O:
  - Consider a function f(n) that is non-negative for all n >= 0. We say that f(n) is Big-O of g(n), i.e., f(n) = O(g(n)), if there exist n0 >= 0 and a constant c > 0 such that f(n) <= c*g(n) for all n >= n0.
20 Big-O (asymptotic) Notation
- Implication of the definition:
  - For all sufficiently large n, c*g(n) is an upper bound of f(n).
- Note: By the definition of Big-O,
  - f(n) = 3n + 4 is O(n)
  - it is also O(n^2),
  - it is also O(n^3),
  - . . .
  - it is also O(n^n)
- However, when Big-O notation is used, the function g in the relationship f(n) is O(g(n)) is CHOSEN TO BE AS SMALL AS POSSIBLE.
- We call such a function g a tight asymptotic bound of f(n).
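The definition asks us to exhibit witnesses c and n0. For f(n) = 3n + 4, one valid choice (my own, not the only one) is c = 4 and n0 = 4, since 3n + 4 <= 4n exactly when n >= 4. This sketch checks the bound numerically:

```java
// Numeric check of the Big-O witnesses for f(n) = 3n + 4 = O(n):
// with c = 4 and g(n) = n, f(n) <= c*g(n) holds for all n >= n0 = 4.
public class BigOWitness {
    static boolean holds(int n) {
        return 3 * n + 4 <= 4 * n;   // f(n) <= c*g(n) with c = 4
    }

    public static void main(String[] args) {
        boolean ok = true;
        for (int n = 4; n <= 1_000_000; n++)
            ok &= holds(n);
        System.out.println(ok);       // bound holds for all n >= 4
        System.out.println(holds(3)); // fails below n0: 13 <= 12 is false
    }
}
```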
21 Big-O (asymptotic) Notation
- Some Big-O complexity classes in order of magnitude, from smallest to largest:
  O(1), O(log n), O(n), O(n log n), O(n^2), O(n^3), O(2^n), O(n!)
22 Examples of Algorithms and their big-O complexity
23 Warnings about O-Notation
- Big-O notation cannot compare algorithms in the same complexity class.
- Big-O notation only gives sensible comparisons of algorithms in different complexity classes when n is large.
- Consider two algorithms for the same task:
  - Linear: f(n) = 1000n
  - Quadratic: f'(n) = n^2/1000
  - The quadratic one is faster for n < 1,000,000.
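The crossover point follows from solving n^2/1000 < 1000n, which gives n < 1,000,000. This sketch (names my own) evaluates both cost functions on either side of that point:

```java
// The "slower" quadratic algorithm beats the linear one
// for every input size below the crossover n = 1,000,000.
public class Crossover {
    static double linear(double n)    { return 1000.0 * n; }
    static double quadratic(double n) { return n * n / 1000.0; }

    public static void main(String[] args) {
        // well below the crossover: quadratic wins
        System.out.println(quadratic(1_000) < linear(1_000));
        // well above the crossover: linear wins
        System.out.println(quadratic(2_000_000) < linear(2_000_000));
    }
}
```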
24 Rules for using big-O
- For large values of the input n, the constants and terms with lower degree of n are ignored.
- 1. Multiplicative Constants Rule: Ignoring constant factors.
  - O(c*f(n)) = O(f(n)), where c is a constant
  - Example:
    - O(20*n^3) = O(n^3)
- 2. Addition Rule: Ignoring smaller terms.
  - If O(f(n)) < O(h(n)) then O(f(n) + h(n)) = O(h(n)).
  - Examples:
    - O(n^2 log n + n^3) = O(n^3)
    - O(2000*n^3 + 2n! + n^800 + 10n + 27n log n + 5) = O(n!)
- 3. Multiplication Rule: O(f(n) * h(n)) = O(f(n)) * O(h(n))
  - Example:
    - O((n^3 + 2n^2 + 3n log n + 7)(8n^2 + 5n + 2)) = O(n^5)
25 How to determine complexity of code structures
- Loops: for, while, and do-while
- Complexity is determined by the number of iterations in the loop times the complexity of the body of the loop.
- Examples:
for (int i = 0; i < n; i++)
    sum = sum - i;
O(n)
for (int i = 0; i < n * n; i++)
    sum = sum + i;
O(n^2)
i = 1;
while (i < n) {
    sum = sum + i;
    i = i * 2;
}
O(log n)
26 How to determine complexity of code structures
sum = 0;
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        sum += i * j;
O(n^2)
i = 1;
while (i < n) {
    j = 1;
    while (j < n) {
        // statements of constant complexity
        j = j * 2;
    }
    i = i + 1;
}
O(n log n)
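The O(n log n) claim for the second nest can be checked by counting how often the constant-complexity inner statements run: the outer loop runs about n times, and the inner doubling loop about log2 n times per pass. A sketch (names my own):

```java
// Counts executions of the inner-loop body in the slide's
// nested while loops: roughly n * log2(n) in total.
public class NestedCount {
    static int count(int n) {
        int ops = 0;
        int i = 1;
        while (i < n) {
            int j = 1;
            while (j < n) {
                ops++;          // one constant-complexity body execution
                j = j * 2;
            }
            i = i + 1;
        }
        return ops;
    }

    public static void main(String[] args) {
        // n = 16: 15 outer passes * 4 inner passes = 60
        System.out.println(count(16));
    }
}
```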
27 How to determine complexity of code structures
- Sequence of statements: Use the Addition rule
- O(s1 + s2 + s3 + ... + sk) = O(s1) + O(s2) + O(s3) + ... + O(sk)
  = O(max(s1, s2, s3, . . . , sk))
- Example:
- Complexity is O(n^2) + O(n) + O(1) = O(n^2)
for (int j = 0; j < n * n; j++)
    sum = sum + j;
for (int k = 0; k < n; k++)
    sum = sum - 1;
System.out.print("sum is now " + sum);
28 How to determine complexity of code structures
- Switch: Take the complexity of the most expensive case
char key;
int[] X = new int[n];
int[][] Y = new int[n][n];
........
switch (key) {
    case 'a':
        for (int i = 0; i < X.length; i++)        // O(n)
            sum += X[i];
        break;
    case 'b':
        for (int i = 0; i < Y.length; i++)        // O(n^2)
            for (int j = 0; j < Y[0].length; j++)
                sum += Y[i][j];
        break;
} // End of switch block
Overall Complexity: O(n^2)
29 How to determine complexity of code structures
- If Statement: Take the complexity of the most expensive case
char key;
int[][] A = new int[n][n];
int[][] B = new int[n][n];
int[][] C = new int[n][n];
........
if (key == '+') {
    for (int i = 0; i < n; i++)                   // O(n^2)
        for (int j = 0; j < n; j++)
            C[i][j] = A[i][j] + B[i][j];
} // End of if block
else if (key == 'x')
    C = matrixMult(A, B);                         // O(n^3)
else
    System.out.println("Error! Enter '+' or 'x'!"); // O(1)
Overall complexity: O(n^3)
30 How to determine complexity of code structures
- Sometimes if-else statements must carefully be checked:
- O(if-else) = O(Condition) + Max(O(if), O(else))
int[] integers = new int[n];
........
if (hasPrimes(integers) == true)    // condition: O(n)
    integers[0] = 20;               // O(1)
else
    integers[0] = -20;              // O(1)

public boolean hasPrimes(int[] arr) {
    for (int i = 0; i < arr.length; i++) {
        ..........
        ..........
    }
} // End of hasPrimes()
O(if-else) = O(Condition) = O(n)