Title: Program Complexity
1. Program Complexity and Performance
- ECE573 Data Structures and Algorithms
- Electrical and Computer Engineering Dept.
- Rutgers University
2. Administrative
- PA1 / HW1 will be assigned today
- Lab tutorials
3. What is a Good Program?
- Runs correctly
- Runs in accordance with the specifications
- Runs efficiently
- In minimum time? Under other constraints, such as memory use?
- How? Use appropriate data structures and algorithms
- Algorithm: a precisely specified procedure for solving a problem
- Easy to read and understand
- Easy to debug
- Easy to modify (easy to maintain)
- Portability?
Focus: software engineering is about building large SW systems that are reliable and easy to maintain.
4. Complexity and Performance
- Complexity
- → impacts performance
- Memory required to run a program (space complexity)
- Time required to run a program (time complexity)
- Performance analysis
- Analytical methods
- Performance measurement
- Conduct experiments
5. Space Complexity
- Instruction space
- Data space
- Constants
- Variables
- Global, local, static, dynamic
- Environment stack space
6. Space Complexity
7. Space Complexity
- Fixed part
- Independent of the instance characteristics
- Instruction space, simple variables, constants, compiler-generated temporary variables, etc.
- Variable part
- Size depends on the particular problem instance being solved
- Dynamically allocated space, recursion stack space
Note: we can write programs in many different ways to solve the same problem; we are not discussing that situation here. Example: compute the sum of 100 numbers stored in a file. We can read all 100 numbers and then add them, or we can read one number at a time and add it to the running sum.
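The fixed/variable split above can be illustrated with the sum-of-numbers example. A minimal C sketch (the function names are ours): an iterative sum needs only a fixed amount of extra space, while a recursive sum adds one stack frame per element, so its recursion stack space grows with n.

```c
#include <stddef.h>

/* Iterative sum: fixed space only (a few scalar variables), regardless of n. */
long sum_iterative(const int *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Recursive sum: variable space; the recursion stack grows to n frames. */
long sum_recursive(const int *a, size_t n) {
    if (n == 0)
        return 0;
    return a[n - 1] + sum_recursive(a, n - 1);
}
```

Both compute the same value; they differ only in the variable part of their space complexity.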
8. Time Complexity
- The contributors are the same as the ones space complexity depends on.
- Time to run a program = compile time + run time
- Compile time does not depend on the instance characteristics
- A compiled program can run several times without recompilation
- Focus: run time
- Run time also depends on the instance characteristics!
9. Time Complexity Measures
- Operation counts
- Based on the compiler used: number of adds, multiplies, compares, etc.
- Success depends on our ability to identify the operations that contribute most to the time complexity
- Step counts
- Account for all time spent in all parts of the program/function
- A function of the instance characteristics
- Definition: a program step is a syntactically or semantically meaningful segment of a program for which the execution time is independent of the instance characteristics
- Ex: 10 additions can be one step; 100 multiplications can also be one step
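One way to make the step-count idea concrete is to instrument a function with a counter, charging one step per O(1) program segment. This is a sketch under our own step assignment (other assignments are equally valid; they change only the constant factor):

```c
long step_count = 0;   /* global step counter (our instrumentation) */

/* Sum n numbers, charging one step per O(1) program segment.
   Total: 1 (init) + n (loop tests) + n (additions) + 1 (final test) = 2n + 2. */
long sum_steps(const int *a, int n) {
    long s = 0;
    step_count++;                     /* s = 0              : 1 step  */
    for (int i = 0; i < n; i++) {
        step_count++;                 /* loop test          : n steps */
        s += a[i];
        step_count++;                 /* addition           : n steps */
    }
    step_count++;                     /* final failing test : 1 step  */
    return s;
}
```

The count 2n + 2 is a function of the instance characteristic n, as the slide requires.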
10. Time Complexity and Instance Characteristics
- Instance characteristics
- Number of inputs, outputs, etc.
- More complex example
- Number of swaps performed by a bubble sort
- Depends on the instance characteristic array size (n), but also on the values: the number of swaps varies from 0 to n(n-1)/2
- The operation count is not uniquely determined by the chosen instance characteristics
- → Need best-case, worst-case, and average counts
11. Asymptotic Notation
- Different languages, compilers, machines, operating systems, etc. will produce different constant factors
- We are concerned with the general behavior of the running time as n increases to very large values
- Big Oh notation (O)
- A general, portable performance metric
- Time to solve a problem of size n
- f(n) = O(g(n)) iff f(n) <= c*g(n)
- for some constant c > 0 and all n > N
- Alternatively: g is an upper bound for f
12. Big Oh
- O(g): the set of functions that grow no faster than g
- g(n) describes the worst-case behavior of an algorithm that is O(g)
- It does not say how good or tight the bound is
- To be informative, the function g(n) should be as small a function of n as possible for which f(n) = O(g(n))
- In f(n) = O(g(n)), the "=" symbol means "is", not "equals"
- Example: if f(n) = a_m*n^m + ... + a_1*n + a_0 and a_m > 0, then f(n) = O(n^m)
13. Omega and Theta Notation
- Ω(g(n)): the lower-bound analog of Big Oh notation
- The set of functions f such that f(n) >= c*g(n)
- for some constant c > 0 and all n > N
- g is a lower bound for f
- Θ(g) = O(g) ∩ Ω(g): the function f is bounded both from above and below by the same function g
- f(n) = Θ(g) iff c1*g(n) <= f(n) <= c2*g(n) for all n > N
14. Common Asymptotic Functions
15. Properties of Oh Notation
- Constant factors may be ignored
- ∀ k > 0, k*f is O(f)
- Higher powers grow faster
- n^r is O(n^s) if 0 <= r <= s
- The fastest-growing term dominates a sum
- If f is O(g), then f + g is O(g)
- e.g. a*n^4 + b*n^3 is O(n^4)
- A polynomial's growth rate is determined by its leading term
- If f is a polynomial of degree d, then f is O(n^d)
16. Properties of Oh Notation
- "f is O(g)" is transitive
- If f is O(g) and g is O(h), then f is O(h)
- The product of upper bounds is an upper bound for the product
- If f is O(g) and h is O(r), then f*h is O(g*r)
- Exponential functions grow faster than powers
- n^k is O(b^n) ∀ b > 1 and k >= 0; e.g. n^20 is O(1.05^n)
- Logarithms grow more slowly than powers
- log_b(n) is O(n^k) ∀ b > 1 and k > 0; e.g. log2(n) is O(n^0.5)
17. Properties of Oh Notation
- All logarithms grow at the same rate
- log_b(n) is O(log_d(n)) ∀ b, d > 1
- The sum of the first n r-th powers grows as the (r+1)-th power
- Σ_{k=1}^{n} k^r is Θ(n^{r+1})
- e.g. Σ_{k=1}^{n} k = n(n+1)/2 is Θ(n^2)
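The closed form Σ_{k=1}^{n} k = n(n+1)/2 behind the Θ(n²) example can be verified directly; a small C sketch (function name ours):

```c
/* Sum of the first n integers, computed by brute force;
   should equal the closed form n(n+1)/2. */
long sum_first_n(int n) {
    long s = 0;
    for (int k = 1; k <= n; k++)
        s += k;
    return s;
}
```

Since n(n+1)/2 = n²/2 + n/2 and the leading term dominates, the sum is Θ(n²).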
18. Polynomial and Intractable Algorithms
- Polynomial time complexity
- An algorithm is said to be polynomial if it is O(n^d) for some integer d
- Polynomial algorithms are said to be efficient
- They solve problems in reasonable times!
- Intractable problems
- Problems for which there is no known polynomial-time algorithm
- We will come back to this important class later in the course
19. Analyzing an Algorithm
- Simple statement sequence
- s1; s2; ...; sk
- O(1) as long as k is constant
- Simple loops
- for (i = 0; i < n; i++) { s; }
- where s is O(1)
- Time complexity is n * O(1), i.e. O(n)
- Nested loops
- for (i = 0; i < n; i++) for (j = 0; j < n; j++) { s; }
- Complexity is n * O(n), i.e. O(n^2)
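The loop patterns above can be checked by replacing the O(1) statement s with a counter increment (our sketch); the final count is exactly the number of times s would execute.

```c
/* Simple loop: s executes n times -> O(n). */
long simple_loop_count(int n) {
    long count = 0;
    for (int i = 0; i < n; i++)
        count++;                      /* the statement s */
    return count;
}

/* Nested loops: s executes n*n times -> O(n^2). */
long nested_loop_count(int n) {
    long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            count++;                  /* the statement s */
    return count;
}
```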
20. Analyzing an Algorithm
- Loop index depends on the outer loop index
- for (j = 0; j < n; j++)
-   for (k = 0; k < j; k++) { s; }
- The inner loop is executed 0, 1, 2, ..., n-1 times
- Total: 0 + 1 + ... + (n-1) = n(n-1)/2
- Complexity: O(n^2)
- In the big picture, the constant factor 1/2 did not matter!
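Counting executions of s for this dependent-index pattern confirms the triangular sum; a sketch in the same style as the other loop examples:

```c
/* Inner bound depends on the outer index: s runs 0 + 1 + ... + (n-1) times. */
long triangular_loop_count(int n) {
    long count = 0;
    for (int j = 0; j < n; j++)
        for (int k = 0; k < j; k++)
            count++;                  /* the statement s      */
    return count;                     /* n(n-1)/2  ->  O(n^2) */
}
```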
21. Analyzing an Algorithm
- The loop index doesn't vary linearly
- h = 1; while (h <= n) { s; h = 2 * h; }
- h takes the values 1, 2, 4, ... until it exceeds n
- There are 1 + floor(log2(n)) iterations
- Complexity: O(log n)
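The doubling loop can be instrumented the same way; the counter comes out to 1 + floor(log2(n)), matching the slide.

```c
/* h doubles each iteration (1, 2, 4, ...), so s runs 1 + floor(log2(n)) times. */
long doubling_loop_count(int n) {
    long count = 0;
    for (int h = 1; h <= n; h *= 2)
        count++;                      /* the statement s */
    return count;                     /* O(log n)        */
}
```

For n = 16 the loop runs for h = 1, 2, 4, 8, 16, i.e. 5 = 1 + log2(16) times.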
22. Analyzing an Algorithm
- The loop index doesn't vary linearly
- h = n; while (h >= 1) { for (i = 0; i < n; i++) { s; } h = h / 2; }
- There are about log2(n) iterations of the outer loop, and
- the inner loop is O(n)
- Complexity: O(n log n)
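Counting executions of s for this combined pattern (our sketch, with h starting at n and halving each pass) shows the n-times-log product directly:

```c
/* Outer loop halves h (floor(log2(n)) + 1 iterations); inner loop is O(n). */
long halving_loop_count(int n) {
    long count = 0;
    for (int h = n; h >= 1; h /= 2)
        for (int i = 0; i < n; i++)
            count++;                  /* the statement s */
    return count;                     /* n * (floor(log2(n)) + 1)  ->  O(n log n) */
}
```

For n = 16 the outer loop runs for h = 16, 8, 4, 2, 1 (5 passes), each doing 16 inner steps: 80 total.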
23. Complexity Analysis Example
- t(n) = Θ(max{g_i(n)}) = Θ(n)
24. Practical Complexities
- Asymptotic evaluation → true for sufficiently large n
- What is the break-even point?
- Example: 2n vs. n^2
- Programs whose complexity is a high-degree polynomial are also of limited utility
- Example: on a 1-billion-steps/second computer, a program needing n^10 steps takes
- n = 10 → 10 sec
- n = 100 → 3171 years
- n = 1000 → 3.17 × 10^13 years
25. (No transcript)
26. Performance Measurement
- How do we measure the actual space and time requirements of a program?
- Choosing the instance size
- Asymptotic analysis tells us the behavior for sufficiently large values of n
- For small values of n, the run time may not follow the asymptotic curve
- To determine the break-even point, we need to examine the times for several values of n
- Even in the region where asymptotic behavior is exhibited, the times may not lie exactly on the predicted curve
- This is due to the effects of the low-order terms that we discarded
- Choosing the test data
- Experiment to find what gives the worst case and the best case
- Perform the measurements
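A minimal way to perform such measurements in C is the standard clock() function; the workload and helper below are our own sketch. Repeating the measured code keeps the elapsed time large relative to the clock's resolution.

```c
#include <time.h>

/* Example workload whose time we want to measure (an O(n^2) loop here;
   `volatile` discourages the compiler from optimizing the work away). */
void workload(int n) {
    volatile long s = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            s += i + j;
}

/* Time `reps` runs of f(n) using clock(). */
double measure_seconds(void (*f)(int), int n, int reps) {
    clock_t start = clock();
    for (int r = 0; r < reps; r++)
        f(n);
    clock_t stop = clock();
    return (double)(stop - start) / CLOCKS_PER_SEC;
}
```

Calling measure_seconds(workload, n, reps) for several values of n gives the data points needed to locate the break-even region.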
27. Example: Binary Search vs. Sequential Search
- Find method
- Sequential search: worst-case time c1 * n
- Binary search: worst-case time c2 * log2(n)
- Small problems: we're not interested!
- Large problems: we're interested in this gap!
- Binary search is more complex and has a higher constant factor
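The c1*n vs. c2*log2(n) gap can be observed by counting comparisons in both searches; a sketch with our own comparison counters (one count per loop iteration):

```c
/* Sequential search: worst case makes n comparisons. */
int seq_search(const int *a, int n, int key, long *comps) {
    for (int i = 0; i < n; i++) {
        (*comps)++;
        if (a[i] == key)
            return i;
    }
    return -1;
}

/* Binary search on a sorted array: worst case makes about log2(n) probes. */
int bin_search(const int *a, int n, int key, long *comps) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* overflow-safe midpoint */
        (*comps)++;
        if (a[mid] == key)
            return mid;
        if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;
}
```

For an unsuccessful search in a sorted array of n = 1024 elements, sequential search makes 1024 comparisons while binary search makes 11 probes, illustrating the large-problem gap.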
28. Next Time
- Linked lists, arrays, and matrices
- Student talk: who?