Title: Greedy Algorithms
1. Greedy Algorithms
2. Greedy Technique
- Constructs a solution to an optimization problem piece by piece through a sequence of choices that are
  - feasible: satisfying the problem's constraints
  - locally optimal: the best local choice
  - irrevocable: cannot be changed on subsequent steps once made
- For some problems, this yields an optimal solution for every instance.
- For most, it does not, but it can be useful for fast approximations.
3. DP vs. Greedy Algorithms
- Both solve problems exhibiting optimal substructure
- DP
  - The solution to the problem is assembled from solutions to subproblems by considering various choices
- Greedy algorithms
  - Greedy-choice property: the locally optimal choice is made without considering results from subproblems
- However, DP can sometimes be overkill
4. Applications of the Greedy Strategy
- Optimal solutions
  - change making for normal coin denominations
  - minimum spanning tree (MST)
  - single-source shortest paths (Dijkstra's algorithm)
  - simple scheduling problems
  - Huffman codes
- Approximations
  - traveling salesman problem (TSP)
  - knapsack problem
  - other combinatorial optimization problems
5. An Activity-Selection Problem
- Suppose a set of activities S = {a1, a2, ..., an}
- They use a resource, such as a lecture hall, one lecture at a time
- Each ai has a start time si and a finish time fi, with 0 ≤ si < fi < ∞
- ai and aj are compatible if [si, fi) and [sj, fj) do not overlap
- Goal: select a maximum-size subset of mutually compatible activities
- Start from dynamic programming, then the greedy algorithm, and see the relation between the two
6. Activity-Selection Problem
- Problem: get your money's worth out of a carnival
  - Buy a wristband that lets you onto any ride
  - Lots of rides, each starting and ending at different times
  - Your goal: ride as many rides as possible
  - Another, alternative goal that we don't solve here: maximize time spent on rides
- Welcome to the activity-selection problem
7. Activity-Selection
- Formally
  - Given a set S of n activities, S = {a1, a2, ..., an}
    - si = start time of activity ai
    - fi = finish time of activity ai
  - Find a max-size subset A of compatible activities
  - Assume (wlog) that f1 ≤ f2 ≤ ... ≤ fn
8. DP Solution
9. DP solution - step 1
- Optimal substructure of the activity-selection problem
- Assume that f1 ≤ ... ≤ fn (otherwise, sort them by fi)
- Define Sij = {ak : fi ≤ sk < fk ≤ sj}, i.e., all activities starting after ai finishes and finishing before aj begins
- Define two fictitious activities: a0 with f0 = 0 and a(n+1) with s(n+1) = ∞
  - So f0 ≤ f1 ≤ ... ≤ f(n+1)
- Then an optimal solution to Sij that includes ak contains within it optimal solutions to Sik and Skj
10. DP solution - step 2
- A recursive solution
- Let c[i,j] be the number of activities in a maximum-size subset of mutually compatible activities in Sij. The overall answer is then c[0,n+1], the value for S(0,n+1).
- Recurrence (a small implementation is sketched below):
  - c[i,j] = 0                              if Sij = ∅
  - c[i,j] = max{ c[i,k] + c[k,j] + 1 }     if Sij ≠ ∅, where the max is over i < k < j with ak ∈ Sij
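A direct memoized sketch of this recurrence in Python (the function name, 0-based argument lists, and the padding with the fictitious activities are assumptions, not part of the slides):

from functools import lru_cache

def max_compatible(s, f):
    """Memoized evaluation of c[i,j] for the activity-selection recurrence.
    s and f list the start/finish times of the n real activities; they are padded
    with the fictitious a0 (finish time 0) and a_{n+1} (start time infinity)."""
    n = len(s)
    S = [0.0] + list(s) + [float("inf")]
    F = [0.0] + list(f) + [float("inf")]

    @lru_cache(maxsize=None)
    def c(i, j):
        best = 0
        for k in range(i + 1, j):
            # ak is in S_ij iff it starts after ai finishes and finishes before aj starts
            if F[i] <= S[k] and F[k] <= S[j]:
                best = max(best, c(i, k) + c(k, j) + 1)
        return best

    return c(0, n + 1)

# e.g. max_compatible([1, 3, 0, 5], [4, 5, 6, 7]) == 2   (a1 and a4 are compatible)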
11. Greedy Algorithms
- Greedy choice
  - Intuition: we should choose an activity that leaves the resource available for as many other activities as possible
  - So, consider the locally optimal choice: select the activity ak with the earliest finish time in Sij
  - Unlike the DP solution, after the local greedy choice only one subproblem remains!
- One big question
  - Is our intuition correct?
  - We have to prove it is safe to make the greedy choice
12. Justify the Greedy Choice
- Theorem 16.1: Consider any nonempty subproblem Sij, and let am be the activity in Sij with the earliest finish time, fm = min{fk : ak ∈ Sij}. Then
  - activity am is used in some maximum-size subset of mutually compatible activities of Sij, and
  - the subproblem Sim is empty, so that choosing am leaves Smj as the only subproblem that may be nonempty.
- Proof of the theorem (p. 418): an exchange argument - take any maximum-size compatible subset of Sij; if it does not already contain am, swap its earliest-finishing activity for am, which keeps the set compatible and of the same size.
13. Top-Down Rather Than Bottom-Up
- To solve Sij, choose am in Sij with the earliest finish time, then solve Smj (Sim is empty)
- It is certain that the optimal solution to Smj is part of the optimal solution to Sij
- No need to solve Smj ahead of Sij
- Subproblem pattern: S(i,n+1)
14. Recursive Solution
recursive_select(s, f, k, n)
    m = k + 1
    while m ≤ n and s[m] < f[k]
        m = m + 1
    if m ≤ n
        return {a_m} ∪ recursive_select(s, f, m, n)
    else return Ø
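The initial call is recursive_select(s, f, 0, n), using the fictitious activity a0 with f0 = 0 so that the whole set S(0,n+1) is considered.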
15. Optimal Solution Properties
- In DP, the optimal solution depends on
  - how many subproblems the problem is divided into (2 subproblems)
  - how many choices determine which subproblem to use (j - i - 1 choices)
- However, the theorem above (16.1) reduces both significantly
  - one subproblem (the other is sure to be empty)
  - one choice, namely the activity with the earliest finish time in Sij
- Moreover, we can solve top-down rather than bottom-up as in DP
  - pattern to the subproblems that we solve: S(m,n+1) from Sij
  - pattern to the activities that we choose: the activity with the earliest finish time
- With this, the locally optimal choice is in fact globally optimal
16. Elements of the Greedy Strategy
- Determine the optimal substructure
- Develop a recursive solution
- Prove that one of the optimal choices is the greedy choice, and that it is safe
- Show that all but one of the subproblems are empty after the greedy choice
- Develop a recursive algorithm that implements the greedy strategy
- Convert the recursive algorithm to an iterative one (see the sketch below)
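As a minimal sketch of that last conversion, here is a hypothetical iterative version in Python (the function name and the assumption that activities are already sorted by finish time are mine, not from the slides):

def greedy_activity_selector(s, f):
    """Iterative greedy activity selection.
    Assumes f[0] <= f[1] <= ... <= f[n-1] (activities sorted by finish time).
    Returns the indices of a maximum-size set of mutually compatible activities."""
    if not s:
        return []
    selected = [0]               # the activity that finishes first is always a safe greedy choice
    last_finish = f[0]
    for m in range(1, len(s)):
        if s[m] >= last_finish:  # compatible with everything chosen so far
            selected.append(m)
            last_finish = f[m]
    return selected

# e.g. greedy_activity_selector([1, 3, 0, 5], [4, 5, 6, 7]) -> [0, 3]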
17. Change-Making Problem
- Given unlimited amounts of coins of denominations d1 > d2 > ... > dm,
- give change for amount n with the least number of coins
- Example: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c
- Greedy solution: 25 + 10 + 10 + 1 + 1 + 1 (six coins; see the sketch below)
- The greedy solution is
  - optimal for any amount and a normal set of denominations
  - not necessarily optimal for arbitrary coin denominations
    - e.g., denominations (4, 3, 1) for amount 6: greedy gives 4 + 1 + 1, but 3 + 3 uses fewer coins
18. Minimum Spanning Tree (MST), pp. 623-628
- Spanning tree of a connected graph G: a connected acyclic subgraph of G that includes all of G's vertices
- Minimum spanning tree of a weighted, connected graph G: a spanning tree of G of minimum total weight
weight - Example
6
c
a
1
4
2
d
b
3
19. Prim's MST algorithm (pp. 634-636)
- Start with a tree T1 consisting of one (any) vertex and grow the tree one vertex at a time, producing the MST through a series of expanding subtrees T1, T2, ..., Tn
- On each iteration, construct T(i+1) from Ti by adding the vertex not in Ti that is closest to those already in Ti (this is a greedy step!)
- Stop when all vertices are included
21. Prim's algorithm
- Step 0: original graph
- Step 1: D is chosen as an arbitrary starting node
- Step 2: A is added into the MST
- Step 3: F is added into the MST
22. Prim's algorithm
- Step 4: B is added into the MST
- Step 5: E is added into the MST
- Step 6: C is added into the MST
- Step 7: G is added into the MST
23. Notes about Prim's algorithm
- Proof by induction that this construction actually yields an MST
- Needs a priority queue for locating the closest fringe vertex
- Efficiency
  - O(n^2) for the weight-matrix representation of the graph and an array implementation of the priority queue
  - O(m log n) for the adjacency-list representation of a graph with n vertices and m edges and a min-heap implementation of the priority queue - how do we get this?
24. O(m log n) Prim's Algorithm
- Hints (see the sketch below)
  - a min-heap of size n, with each vertex keyed by min_dist = ∞ except the initial vertex
  - a parent array of size n
  - n iterations of the heap-removal operation
  - for each removal, update the min_dist and parent of the remaining vertices in the heap
  - m/n edges per vertex on average
25. Shortest paths - Dijkstra's algorithm
- Single-Source Shortest Paths Problem: given a weighted connected graph G, find shortest paths from a source vertex s to each of the other vertices
- Dijkstra's algorithm: similar to Prim's MST algorithm, but with a different way of computing the numerical labels. Among vertices not already in the tree, it finds the vertex u with the smallest sum
      d(v) + w(v,u)
  where
  - v is a vertex for which the shortest path has already been found on preceding iterations (such vertices form a tree)
  - d(v) is the length of the shortest path from the source to v
  - w(v,u) is the length (weight) of the edge from v to u
26. Example (graph figure not reproduced; vertices a-e with weighted edges)
    Tree vertices    Remaining vertices
    a(-,0)           b(a,3)    c(-,∞)    d(a,7)    e(-,∞)
    b(a,3)           c(b,3+4)  d(b,3+2)  e(-,∞)
    d(b,5)           c(b,7)    e(d,5+4)
    c(b,7)           e(d,9)
    e(d,9)
27. Notes on Dijkstra's algorithm
- Doesn't work for graphs with negative weights
- Applicable to both undirected and directed graphs
- Efficiency
  - O(|V|^2) for graphs represented by a weight matrix and an array implementation of the priority queue
  - O(|E| log |V|) for graphs represented by adjacency lists and a min-heap implementation of the priority queue
- Don't mix up Dijkstra's algorithm with Prim's algorithm!
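For concreteness, a hypothetical Python sketch of Dijkstra's algorithm using the same lazy-deletion heap pattern as the Prim's sketch above (names and the adjacency-list format are assumptions):

import heapq

def dijkstra(adj, source):
    """Single-source shortest paths for non-negative edge weights.
    adj[u] = list of (v, weight) pairs; returns dist[v] = shortest-path length from source to v."""
    dist = [float("inf")] * len(adj)
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                    # stale entry for u
        for v, w in adj[u]:
            if d + w < dist[v]:         # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist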
28. Review: The Knapsack Problem
- The famous knapsack problem
  - A thief breaks into a museum. Fabulous paintings, sculptures, and jewels are everywhere. The thief has a good eye for the value of these objects, and knows that each will fetch hundreds or thousands of dollars on the clandestine art collectors' market. But the thief has only brought a single knapsack to the scene of the robbery, and can take away only what he can carry. What items should the thief take to maximize the haul?
29. Review: The Knapsack Problem
- More formally, the 0-1 knapsack problem
  - The thief must choose among n items, where the i-th item is worth vi dollars and weighs wi pounds
  - Carrying at most W pounds, maximize the total value
  - Note: assume vi, wi, and W are all integers
  - "0-1" because each item must be taken or left in its entirety
- A variation, the fractional knapsack problem
  - The thief can take fractions of items
  - Think of the items in the 0-1 problem as gold ingots, and in the fractional problem as buckets of gold dust
30. Review: The Knapsack Problem and Optimal Substructure
- Both variations exhibit optimal substructure
- To show this for the 0-1 problem, consider the most valuable load weighing at most W pounds
- If we remove item j from the load, what do we know about the remaining load?
  - The remainder must be the most valuable load weighing at most W - wj that the thief could take from the museum, excluding item j
31. Solving The Knapsack Problem
- The optimal solution to the fractional knapsack problem can be found with a greedy algorithm
  - How? (see the sketch below)
- The optimal solution to the 0-1 problem cannot be found with the same greedy strategy
  - Greedy strategy: take items in order of value per pound (dollars/pound)
  - Example: 3 items weighing 10, 20, and 30 pounds; the knapsack can hold 50 pounds
  - Suppose item 2 is worth $100. Assign values to the other items so that the greedy strategy will fail
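A minimal sketch of the fractional-knapsack greedy, assuming items are given as (value, weight) pairs (the function name and representation are mine, not from the slides):

def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: take items in decreasing value-per-pound order,
    taking a fraction of the last item if it does not fit entirely.
    items: list of (value, weight) pairs; returns the maximum total value."""
    total = 0.0
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)        # whole item, or whatever still fits
        total += value * (take / weight)
        capacity -= take
    return total

# e.g. fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50) -> 240.0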
33. The Knapsack Problem: Greedy vs. DP
- The fractional problem can be solved greedily
- The 0-1 problem cannot be solved with a greedy approach
- As you have seen, however, it can be solved with dynamic programming (a sketch follows)
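As a reminder of that DP, a hypothetical bottom-up sketch (the one-dimensional table layout and names are assumptions, not taken from the slides):

def knapsack_01(values, weights, W):
    """Bottom-up 0-1 knapsack DP.
    best[w] = maximum value achievable with capacity w using the items seen so far."""
    best = [0] * (W + 1)
    for v, wt in zip(values, weights):
        # iterate capacities downward so each item is used at most once
        for w in range(W, wt - 1, -1):
            best[w] = max(best[w], best[w - wt] + v)
    return best[W]

# e.g. knapsack_01([60, 100, 120], [10, 20, 30], 50) -> 220   (take items 2 and 3)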
34. Coding Problem
- Coding: assignment of bit strings to alphabet characters
- Codewords: bit strings assigned to the characters of the alphabet
- Two types of codes
  - fixed-length encoding (e.g., ASCII)
  - variable-length encoding (e.g., Morse code)
- Prefix-free codes: no codeword is a prefix of another codeword
- Problem: if the frequencies of the character occurrences are known, what is the best binary prefix-free code?
35. Huffman codes
- Any binary tree with edges labeled with 0s and 1s yields a prefix-free code for the characters assigned to its leaves
- An optimal binary tree, minimizing the expected (weighted average) length of a codeword, can be constructed as follows
- Huffman's algorithm (see the sketch below)
  - Initialize n one-node trees with the alphabet characters and set the tree weights to their frequencies.
  - Repeat the following step n-1 times: join the two binary trees with the smallest weights into one (as left and right subtrees) and make its weight equal to the sum of the weights of the two trees.
  - Mark edges leading to left and right subtrees with 0s and 1s, respectively.
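A compact Python sketch of Huffman's algorithm, representing each partial tree simply as a map from characters to codeword suffixes (the representation and tie-breaking are assumptions, so the exact codewords may differ from the example on the next slide even though the average length matches):

import heapq

def huffman_code(freq):
    """Build a prefix-free code from {character: frequency} via Huffman's algorithm.
    Returns {character: codeword}."""
    # one single-leaf tree per character; the counter breaks ties so dicts are never compared
    heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)      # the two trees with the smallest weights
        f2, _, right = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in left.items()}          # edges into the left subtree get 0
        merged.update({ch: "1" + code for ch, code in right.items()})   # edges into the right subtree get 1
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

# e.g. huffman_code({"A": 0.35, "B": 0.1, "C": 0.2, "D": 0.2, "_": 0.15})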
36. Example
    character:   A     B     C     D     _
    frequency:   0.35  0.1   0.2   0.2   0.15
    codeword:    11    100   00    01    101
- average bits per character: 2.25
- for fixed-length encoding: 3
- compression ratio: (3 - 2.25)/3 x 100% = 25%
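The 2.25 figure is just the frequency-weighted codeword length: 0.35(2) + 0.1(3) + 0.2(2) + 0.2(2) + 0.15(3) = 0.70 + 0.30 + 0.40 + 0.40 + 0.45 = 2.25 bits per character.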