1
Algorithmic Paradigms
Jeff Edmonds, York University, COSC 2011, Lecture 9
Brute Force: Optimization Problem
Greedy Algorithm: Minimal Spanning Tree
Dual Hill Climbing: Max Flow / Min Cut
Linear Programming: Hotdogs
Recursive Back Tracking: Bellman-Ford
Dynamic Programming: Bellman-Ford
NP-Complete Problems
2
• Ingredients:
  • Instances: The possible inputs to the problem.
  • Solutions for Instance: Each instance has an exponentially large set of solutions.
  • Cost of Solution: Each solution has an easy-to-compute cost or value.
• Specification:
  • <preCond>: The input is one instance.
  • <postCond>: A valid solution with optimal cost (minimum or maximum).
Optimization Problems
3
The Brute Force Algorithm
Exponential time, because there are exponentially many solutions to try.
Try every solution!
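In Python, "try every solution" for the pick-the-best-m-prizes problem (seen later in these slides) looks like this. The prize values and the lion/elephant conflict are invented for illustration:

```python
from itertools import combinations

def brute_force(objects, value, conflicts, m):
    """Try every size-m subset of objects; keep the best non-conflicting one."""
    best = None
    for subset in combinations(objects, m):
        # Reject subsets containing a conflicting pair.
        if any((a, b) in conflicts or (b, a) in conflicts
               for a, b in combinations(subset, 2)):
            continue
        if best is None or value(subset) > value(best):
            best = subset
    return best

# Hypothetical instance: pick m=2 prizes; the lion conflicts with the elephant.
prizes = {"lion": 10, "elephant": 8, "tiger": 6, "parrot": 2}
conflicts = {("lion", "elephant")}
best = brute_force(list(prizes), lambda s: sum(prizes[x] for x in s), conflicts, 2)
```

With n objects there are exponentially many subsets, which is exactly why this runs in exponential time.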
4
Greedy Algorithms
Surprisingly, many important and practical computational problems can be solved this way.
Every two year old knows the greedy algorithm.
In order to get what you want, just start grabbing what looks best.
5
Instances: A set of objects and a relationship between them.
Solutions for Instance: A subset of the objects, or some other choice about each object.
Some subsets are not allowed because some objects conflict.
Greedy Algorithms
6
Instances: A set of objects and a relationship between them.
Solutions for Instance: A subset of the objects, or some other choice about each object.
Cost of Solution: The number of objects in solution or the sum of the costs of objects
Greedy Algorithms
7
Instances: A set of objects and a relationship between them.
Goal: Find an optimal non-conflicting solution.
Greedy Algorithms
8
Commit to the object that looks the “best”
Must prove that this locally greedy choice does not have negative global consequences.
Greedy Algorithms
9
Problem: Choose the best m prizes.
Greedy Algorithms
10
Problem: Choose the best m prizes.
Greedy: Start by grabbing the best.
Consequences: If you take the lion, you can't take the elephant.
But greedy algorithms do not try to predict the future and do not back track.
Greedy Algorithms
11
Iterative Greedy Algorithm:
Loop: grab the best, then the second best, …
  if it conflicts with committed objects or fulfills no new requirements,
    reject this next-best object
  else commit to it.
Problem: Choose the best m prizes.
Greedy Algorithms
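The iterative greedy loop above can be sketched in Python; the same invented prize instance and conflict set are used as before:

```python
def greedy(objects, value, conflicts, m):
    """Sort once from best to worst (fixed priority), then commit greedily."""
    committed = []
    for obj in sorted(objects, key=value, reverse=True):
        if len(committed) == m:
            break
        # Reject the next-best object if it conflicts with a committed one.
        if any((obj, c) in conflicts or (c, obj) in conflicts for c in committed):
            continue
        committed.append(obj)
    return committed

# Hypothetical instance: the lion conflicts with the elephant.
prizes = {"lion": 10, "elephant": 8, "tiger": 6, "parrot": 2}
conflicts = {("lion", "elephant")}
result = greedy(list(prizes), prizes.get, conflicts, 2)
```

Note the algorithm never backtracks: once the lion is taken, the elephant is gone for good.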
12
Makes a greedy first choice and then recurses (See Recursive Backtracking Algorithms)
Recursive Greedy Algorithm:
Problem: Choose the best m prizes.
Greedy Algorithms
13
We have not gone wrong: there is at least one optimal solution St that extends the choices At made so far.
Loop Invariant
Take the lion because it looks best.
Consequences: If you take the lion, you can't take the elephant.
Maybe some optimal solutions do not contain the lion.But at least one does.
Greedy Algorithms
14
Minimal Spanning Tree
15
Minimal Spanning Tree
Instance: An undirected graph with weights on the edges.
[Figure: a weighted undirected graph on nodes s, a, b, c, d, f, g, h, i, j, k.]
16
Minimal Spanning Tree
[Figure: the weighted graph again.]
Instance: An undirected graph with weights on the edges.
Solution: A subset of edges that forms
• a tree (no cycles, not rooted)
• a spanning tree: connected nodes stay connected.
Cost: Sum of edge weights.
Goal: Find a minimal spanning tree.
17
Minimal Spanning Tree
[Figure: the weighted graph with the greedy algorithm's committed edges highlighted.]
Instance: An undirected graph with weights on the edges.
Solution: A subset of edges that forms
• a tree (no cycles)
• a spanning tree: connected nodes stay connected.
Cost: Sum of edge weights.
Goal: Find a minimal spanning tree.
Greedy Alg: Commit to the edge that looks the "best."
Can't add because of cycle.
Must prove that the result is
• acyclic
• spanning
• optimal.
Done
18
Time = O(E)
Time = O(E log(E))
19
Minimal Spanning Tree
How does the algorithm detect a cycle?
[Figure: the weighted graph; adding one candidate edge creates a cycle, adding another does not.]
20
Minimal Spanning Tree
[Figure: the weighted graph partitioned into connected components.]
Cycle detection algorithm:
• Keep track of the sets of nodes in connected components.
• If an edge lies within one component, then it creates a cycle.
• If an edge bridges two components, then there is no new cycle; merge the components.
21
Minimal Spanning Tree
[Figure: the same graph; a candidate edge inside one component is rejected.]
Cycle detection algorithm:
• Keep track of the sets of nodes in connected components.
• If an edge lies within one component, then it creates a cycle.
• If an edge bridges two components, then there is no new cycle; merge the components.
Can't add because of cycle.
22
Minimal Spanning Tree
23
Minimal Spanning Tree: the algorithm for keeping track of components is the Union-Find data structure.
Average Time per operation = Ackermann⁻¹(E), effectively a constant.
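This sorted-edge greedy MST is Kruskal's algorithm; here is a minimal sketch with a small union-find (with path compression, but without the union-by-rank needed for the inverse-Ackermann bound). The 4-cycle example graph is made up:

```python
def kruskal(n, edges):
    """edges: list of (weight, u, v) on nodes 0..n-1.
    Returns (total weight, list of chosen edges)."""
    parent = list(range(n))

    def find(x):                        # representative of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    tree, total = [], 0
    for w, u, v in sorted(edges):       # fixed priority: cheapest edge first
        ru, rv = find(u), find(v)
        if ru == rv:                    # same component: edge closes a cycle
            continue
        parent[ru] = rv                 # merge the two components
        tree.append((u, v))
        total += w
    return total, tree

# Hypothetical graph: a 4-cycle with weights 1, 2, 3, 4.
total, tree = kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 3, 0)])
```

The weight-4 edge is rejected because its endpoints are already in one component, exactly the cycle test above.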
24
Adaptive Greedy
[Figure: the weighted graph explored outward from s.]
Another application (the web): suppose we don't know of the edges until we find them by searching out from s.
Another greedy alg: expand out from s, committing to the best edge connected to the component.
Can't add because of cycle. These edges are never found.
Done
Is this a greedy algorithm? (The priorities on the edges keep changing.)
25
Fixed Priority: Sort the objects from best to worst and loop through them.
Adaptive Priority:
– The greedy criterion depends on which objects have been committed to so far.
– At each step, the next "best" object is chosen according to the current greedy criterion.
– Searching or re-sorting takes too much time.
– Use a priority queue.
Adaptive Greedy
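The expand-from-s algorithm above is Prim's algorithm; a sketch using Python's heapq as the priority queue (the example graph is invented):

```python
import heapq

def prim(adj, s):
    """adj: {node: [(weight, neighbour), ...]}. Grow the tree out from s,
    always committing to the cheapest edge leaving the component."""
    in_tree, total = {s}, 0
    frontier = list(adj[s])
    heapq.heapify(frontier)             # adaptive priority: cheapest frontier edge
    while frontier:
        w, v = heapq.heappop(frontier)
        if v in in_tree:                # this edge would now close a cycle
            continue
        in_tree.add(v)
        total += w
        for edge in adj[v]:             # newly discovered edges join the queue
            heapq.heappush(frontier, edge)
    return total

# Hypothetical graph: a 4-cycle with weights 1, 2, 3, 4.
graph = {0: [(1, 1), (4, 3)], 1: [(1, 0), (2, 2)],
         2: [(2, 1), (3, 3)], 3: [(3, 2), (4, 0)]}
```

The priorities are adaptive because the set of frontier edges, and hence the next "best" edge, changes every time a node joins the component.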
26
Adaptive Greedy
27
Adaptive Greedy
28
Dijkstra's shortest weighted path algorithm can be considered a greedy algorithm with an adaptive priority criterion.
Adaptive Greedy
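A matching sketch of Dijkstra's algorithm with a heap: each relaxation updates the adaptive priorities (nonnegative weights assumed; the example graph is invented):

```python
import heapq

def dijkstra(adj, s):
    """adj: {node: [(weight, neighbour), ...]}. Returns {node: distance from s}.
    The next node committed is the one with the smallest tentative distance,
    and tentative distances change as edges are relaxed."""
    dist = {s: 0}
    queue = [(0, s)]
    while queue:
        d, u = heapq.heappop(queue)
        if d > dist.get(u, float("inf")):
            continue                    # stale queue entry, already improved
        for w, v in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(queue, (d + w, v))
    return dist

# Hypothetical graph: the direct 0->2 edge loses to the path through 1.
graph = {0: [(1, 1), (4, 2)], 1: [(2, 2)], 2: []}
```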
29
• Instance:
  • A network is a directed graph G.
  • Edges represent pipes that carry flow.
  • Each edge <u,v> has a maximum capacity c<u,v>.
  • A source node s out of which flow leaves.
  • A sink node t into which flow arrives.
Goal: Max Flow
Network Flow
30
• Instance:
  • A network is a directed graph G.
  • Edges represent pipes that carry flow.
  • Each edge <u,v> has a maximum capacity c<u,v>.
  • A source node s out of which flow leaves.
  • A sink node t into which flow arrives.
Network Flow
31
• Solution:
  • The amount of flow F<u,v> through each edge.
  • Flow F<u,v> can't exceed capacity c<u,v>.
  • Unidirectional flow.
  • No leaks, no extra flow.
Network Flow
32
• Solution:
  • The amount of flow F<u,v> through each edge.
  • Flow F<u,v> can't exceed capacity c<u,v>.
  • Unidirectional flow.
  • No leaks, no extra flow.
For each node v (except s and t): flow in = flow out, i.e. Σu F<u,v> = Σw F<v,w>.
Network Flow
33
• Value of Solution:
  • Flow from s into the network: rate(F) = Σu F<s,u>.
Goal: Max Flow
Network Flow
34
An Application: Matching
Sam Mary
Bob Beth
John Sue
Fred Ann
Who loves whom. Who should be matched with whom, so that as many as possible are matched and nobody is matched twice?
3 matches. Can we do better?
4 matches.
35
An Application: Matching
[Figure: a flow network from s through the boys and girls to t, all capacities 1.]
c<s,u> = 1: total flow out of u = flow into u ≤ 1, so boy u is matched to at most one girl.
c<v,t> = 1: total flow into v = flow out of v ≤ 1, so girl v is matched to at most one boy.
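This reduction can also be run without building the network explicitly: the classic augmenting-path matching below is exactly what max flow computes on this unit-capacity network. The "who loves whom" preferences are not given on the slide, so the adjacency here is hypothetical:

```python
def max_matching(boys, likes):
    """likes: {boy: [girls he may be matched to]}.
    Augmenting-path bipartite matching; returns the number of matches."""
    match = {}                          # girl -> boy currently matched to her

    def try_match(boy, seen):
        for girl in likes[boy]:
            if girl in seen:
                continue
            seen.add(girl)
            # Girl is free, or her current partner can be re-matched elsewhere.
            if girl not in match or try_match(match[girl], seen):
                match[girl] = boy
                return True
        return False

    return sum(try_match(b, set()) for b in boys)

# Hypothetical preferences for the four couples on the slide.
likes = {"Sam": ["Mary", "Beth"], "Bob": ["Mary"],
         "John": ["Sue", "Ann"], "Fred": ["Ann"]}
matches = max_matching(["Sam", "Bob", "John", "Fred"], likes)
```

When Bob wants Mary, the algorithm re-routes Sam to Beth, the same "undo" step that residual edges provide in the flow network.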
36
• Instance:
  • A network is a directed graph G.
  • Special nodes s and t.
  • Edges represent pipes that carry flow.
  • Each edge <u,v> has a maximum capacity c<u,v>.
Min Cut
s
t
37
• Instance:
  • A network is a directed graph G.
  • Special nodes s and t.
  • Edges represent pipes that carry flow.
  • Each edge <u,v> has a maximum capacity c<u,v>.
Min Cut
38
• Solution: C = a partition of the nodes <U,V> with s∈U, t∈V.
Min Cut
[Figure: the network cut into U ("Canada", containing s and York) and V ("USA", containing t and UC Berkeley).]
39
Min Cut
[Figure: another cut of the same network.]
• Solution: C = a partition of the nodes <U,V> with s∈U, t∈V.
40
Min Cut
[Figure: yet another cut <U,V> of the network.]
• Solution: C = a partition of the nodes <U,V> with s∈U, t∈V.
41
• Value of Solution C=<U,V>: cap(C) = how much can flow from U to V = Σu∈U,v∈V c<u,v>.
Min Cut
[Figure: an edge <u,v> crossing the cut from U to V.]
Goal: Min Cut.
42
We have a valid solution (not necessarily optimal).
Measure of progress: the value of our solution. Initially we have the "zero flow."
Take a step that goes up: make small local changes to your solution to construct a slightly better solution.
Exit: can't take a step that goes up.
Problems:
• Running time? If you take small steps, it could take exponential time.
• Local max vs global max: can our network flow algorithm get stuck in a local maximum?
Primal-Dual Hill Climbing
43
Primal-Dual Hill Climbing
Prove: for every location L to stand, either
• the alg takes a step up, or
• the alg gives a reason that explains why not, by giving a ceiling of equal height,
i.e. ∀L [ ∃L′ height(L′) > height(L) or ∃R height(R) = height(L) ].
But ∀R ∀L, height(R) ≥ height(L).
No Gap
44
Primal-Dual Hill Climbing
No Gap
L_alg witnesses that height(L_max) is no smaller. R_alg witnesses that height(L_max) is no bigger.
Prove: for every location to stand, either the alg takes a step up, or the alg gives a reason that explains why not, by giving a ceiling of equal height,
i.e. ∀L [ ∃L′ height(L′) > height(L) or ∃R height(R) = height(L) ].
45
Primal-Dual Hill Climbing
No Gap
Flow_alg witnesses that the network has this flow. Cut_alg witnesses that the network has no bigger flow.
Prove: for every location to stand, either the alg takes a step up, or the alg gives a reason that explains why not, by giving a ceiling of equal height,
i.e. ∀L [ ∃L′ height(L′) > height(L) or ∃R height(R) = height(L) ].
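A runnable sketch of this primal-dual idea for flows: augment along BFS paths (Edmonds-Karp style) until none exists; at that point the edges out of the BFS-reachable set are saturated and form the certifying min cut. The tiny capacity network is invented:

```python
from collections import deque

def max_flow(cap, s, t):
    """cap: dict-of-dicts of capacities (mutated into residual capacities).
    Repeatedly push flow along a shortest augmenting path; when no s-t path
    remains, the reachable set U certifies a cut of capacity equal to the flow."""
    flow = 0
    while True:
        # BFS for an s-t path with remaining capacity.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow                 # no augmenting path: flow is maximum
        # Find the bottleneck capacity, then push flow along the path.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= bottleneck
            cap[v].setdefault(u, 0)
            cap[v][u] += bottleneck     # residual edge: lets later paths undo this
        flow += bottleneck

# Hypothetical network: two disjoint s-t paths with capacities 2 and 2.
cap = {"s": {"a": 3, "b": 2}, "a": {"t": 2}, "b": {"t": 3}, "t": {}}
```

The residual (undo) edges are what keep this hill climb from getting stuck in a local maximum.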
46
A combination of pork, grain, and sawdust, …
Linear Programming
47
Constraints: • Amount of moisture• Amount of protein,• …
Linear Programming
48
Given today’s prices,what is a fast algorithm to find the cheapest hotdog?
Linear Programming
49
Linear Programming (Abstract Out Essentials)
Amount to add: x1, x2, x3, x4 (pork, grain, water, sawdust)
Cost: 29, 8, 1, 2
Cost of Hotdog: 29x1 + 8x2 + 1x3 + 2x4
Constraints (moisture, protein, …):
3x1 + 4x2 − 7x3 + 8x4 ≤ 12
2x1 − 8x2 + 4x3 − 3x4 ≤ 24
−8x1 + 2x2 − 3x3 − 9x4 ≤ 8
x1 + 2x2 + 9x3 − 3x4 ≤ 31
50
Minimize: 29x1 + 8x2 + 1x3 + 2x4
Subject to:
3x1 + 4x2 − 7x3 + 8x4 ≤ 12
2x1 − 8x2 + 4x3 − 3x4 ≤ 24
−8x1 + 2x2 − 3x3 − 9x4 ≤ 8
x1 + 2x2 + 9x3 − 3x4 ≤ 31
Linear Programming (Abstract Out Essentials)
51
For decades people thought that there was no fast algorithm.
Then one was found!
Theoretical Computer Sciencefinds new algorithms every day.
Minimize: 29x1 + 8x2 + 1x3 + 2x4
Subject to:
3x1 + 4x2 − 7x3 + 8x4 ≥ 12
2x1 − 8x2 + 4x3 − 3x4 ≥ 24
−8x1 + 2x2 − 3x3 − 9x4 ≥ 8
x1 + 2x2 + 9x3 − 3x4 ≥ 31
Linear Programming
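To see an LP solved concretely, here is a toy two-variable instance of the same minimize-subject-to-≥ form (the numbers are invented, not the hotdog data). It exploits the textbook fact that a bounded, feasible LP attains its optimum at a vertex of the feasible region, so in 2D we can simply enumerate intersections of constraint lines:

```python
from itertools import combinations

def lp_min_2d(c, A, b):
    """Minimize c.x subject to A x >= b and x >= 0, for two variables.
    Enumerates every intersection of two constraint lines (vertices) and
    keeps the cheapest feasible one."""
    lines = [(row[0], row[1], rhs) for row, rhs in zip(A, b)]
    lines += [(1, 0, 0), (0, 1, 0)]     # the axes x=0 and y=0
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue                    # parallel lines: no vertex here
        x = (c1 * b2 - c2 * b1) / det   # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        # Keep the vertex only if every constraint is satisfied.
        if x >= -1e-9 and y >= -1e-9 and all(
                a * x + bb * y >= rhs - 1e-9 for a, bb, rhs in lines):
            cost = c[0] * x + c[1] * y
            if best is None or cost < best:
                best = cost
    return best

# Hypothetical diet-style LP: minimize 2x+3y s.t. x+y >= 4 and x+2y >= 6.
opt = lp_min_2d([2, 3], [[1, 1], [1, 2]], [4, 6])
```

Real LP solvers (simplex, interior point) handle thousands of variables; this vertex enumeration is only meant to make the geometry visible.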
52
• Given an instance of Network Flow <G, c<u,v>>, express it as a Linear Program:
• The variables: flows F<u,v> for each edge.
• Maximize: rate(F) = Σu F<s,u> − Σv F<v,s>
• Subject to:
  ∀<u,v>: F<u,v> ≤ c<u,v> (flow can't exceed capacity)
  ∀v: Σu F<u,v> = Σw F<v,w> (flow in = flow out)
Linear Programming
53
54
Primal
Dual
55
• Consider your instance I.
• Ask a little question (to the little bird) about its optimal solution.
• Try all answers k.
• Knowing k about the solution restricts your instance to a subinstance subI.
• Ask your recursive friend for an optimal solution subsol for it.
• Construct a solution optS<I,k> = subsol + k for your instance that is the best of those consistent with the kth bird's answer.
• Return the best of these best solutions.
Recursive Back Tracking: Bellman-Ford
56
Specification: All-Nodes Shortest-Weighted Paths
• <preCond>: The input is a graph G (directed or undirected) with edge weights (possibly negative).
• <postCond>: For each u,v, find a shortest path from u to v, stored in a matrix Dist[u,v].
[Figure: a weighted graph on nodes u, v, b, c, d, g, h, i, k, with example paths of l=3 and l=4 edges.]
For a recursive algorithm, we must give our friend a smaller subinstance. How can this instance be made smaller? Remove a node? An edge?
Instead, add an integer l to the instance: find a shortest path from u to v with at most l edges.
Recursive Back Tracking: Bellman-Ford
57
[Figure: the weighted graph; a u-to-v path with l=4 edges passing through a middle node k.]
Recursive Back Tracking: Bellman-Ford
• Consider your instance I = ⟨u,v,l⟩.
• Ask a little question (to the little bird) about its optimal solution: "What node is in the middle of the path?" She answers node k.
• I ask one friend for subI = ⟨u,k,l/2⟩ and another for subI = ⟨k,v,l/2⟩.
• optS<I,k> = subsol⟨u,k,l/2⟩ + k + subsol⟨k,v,l/2⟩ is the best solution for I consistent with the kth bird's answer.
• Try all k and return the best of these best solutions.
58
Dynamic Programming Algorithm
• Given an instance I,
• imagine running the recursive alg on it.
• Determine the complete set of subI ever given to you, your friends, their friends, …
• Build a table indexed by these subI.
• Fill in the table in order so that nobody waits.
Recursive Back Tracking
Given graph G, find Dist[u,v,l] for l = 1,2,4,8,…,n.
[Figure: the weighted graph, l=4.]
59
[Figure: the weighted graph, l=4.]
Dynamic Programming Algorithm
Loop Invariant: For each u,v, Dist[u,v,l] = the length of a shortest path from u to v with ≤ l edges.
Exit

for l = 2,4,8,16,…,2n   % Find Dist[u,v,l] from Dist[u,v,l/2]
    for all u,v ∈ Vertices
        Dist[u,v,l] = Dist[u,v,l/2]
        for all k ∈ Vertices
            Dist[u,v,l] = min( Dist[u,v,l], Dist[u,k,l/2]+Dist[k,v,l/2] )
60
[Figure: the weighted graph, l=4.]
Dynamic Programming Algorithm
Loop Invariant: For each u,v, Dist[u,v,l] = the length of a shortest path from u to v with ≤ l edges.
When l = 1, Dist[b,c,1] = 10, and Dist[u,v,1] = ∞ when there is no edge <u,v>.

% Smallest Instances
for all u,v ∈ Vertices
    if <u,v> ∈ Edges
        Dist[u,v,1] = weight[u,v]
    else
        Dist[u,v,1] = ∞
Dist[u,u,1] = 0 (sometimes useful)
61
[Figure: the weighted graph, l=4.]
Dynamic Programming Algorithm
Loop Invariant: For each u,v, Dist[u,v,l] = the length of a shortest path from u to v with ≤ l edges.
Exit
When to exit? A simple path never uses a node more than once, and so has no more than l = n−1 edges.

for all u,v ∈ Vertices
    Dist[u,v] = Dist[u,v,n]
62
Dynamic Programming Algorithm
• Dealing with negative cycles.
[Figure: a u-to-v path of weight 25+3+2 = 30 that can detour around a cycle of weight 3+1−5 = −1.]
Dist[u,v,2] = ∞
Dist[u,v,4] = 25+3+2 = 30
Dist[u,v,8] = 25+(3+1−5)+3+2 = 29
Dist[u,v,9] = 25+(3+1−5)·2+3+2 = 28
Dist[u,v,12] = 25+(3+1−5)·3+3+2 = 27
Dist[u,v,303] = 25+(3+1−5)·300+3+2 = 30−300 = −270
Dist[u,v,∞] = 25+(3+1−5)·∞+3+2 = 30−∞ = −∞
• There is a negative cycle on some u-to-v path iff Dist[u,v,2n] < Dist[u,v,n].

% Check for negative cycles
for all u,v ∈ Vertices
    if( Dist[u,v,2n] < Dist[u,v,n] )
        Dist[u,v] = −∞
63
Dynamic Programming Algorithm
Algorithm BellmanFord(G)
    % Smallest Instances
    for all u,v ∈ Vertices
        if <u,v> ∈ Edges
            Dist[u,v,1] = weight[u,v]
        else
            Dist[u,v,1] = ∞
    for l = 2,4,8,16,…,2n   % Find Dist[u,v,l] from Dist[u,v,l/2]
        for all u,v ∈ Vertices
            Dist[u,v,l] = Dist[u,v,l/2]
            for all k ∈ Vertices
                Dist[u,v,l] = min( Dist[u,v,l], Dist[u,k,l/2]+Dist[k,v,l/2] )
    % Check for negative cycles
    for all u,v ∈ Vertices
        if( Dist[u,v,2n] == Dist[u,v,n] )
            Dist[u,v] = Dist[u,v,n]
        else
            Dist[u,v] = −∞
• Don't actually need to keep old and new values.
Time = O(n³ log n)
64
Dynamic Programming Algorithm
Algorithm BellmanFord(G)
    % Smallest Instances
    for all u,v ∈ Vertices
        if <u,v> ∈ Edges
            Dist[u,v] = weight[u,v]
        else
            Dist[u,v] = ∞
    for l = 2,4,8,16,…,2n   % Find Dist[u,v,l] from Dist[u,v,l/2]
        for all u,v ∈ Vertices
            for all k ∈ Vertices
                Dist[u,v] = min( Dist[u,v], Dist[u,k]+Dist[k,v] )
    % Check for negative cycles
    for all u,v ∈ Vertices
        if( Dist[u,v] changed in the last iteration )
            Dist[u,v] = −∞
• Don't actually need to keep old and new values.
NP-Complete Problems
Computable
Exp
Poly
Known
GCD, Matching
Halting
Jack Edmonds, Steve Cook
NP
• exponential time to search• poly time to verify given witness
Non-Deterministic Polynomial Time
Circuit-Sat Problem: Does a circuit have a satisfying assignment?
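The "exponential time to search, poly time to verify" split can be shown concretely. CNF formulas are used here instead of circuits for brevity, and the example formula is invented:

```python
from itertools import product

def verify(clauses, assignment):
    """Checking a given witness is fast: time linear in the formula size.
    A clause like [1, -3] means (x1 or not x3); each clause must contain
    at least one true literal."""
    return all(any((lit > 0) == assignment[abs(lit) - 1] for lit in clause)
               for clause in clauses)

def satisfiable(clauses, n):
    """Brute-force search: 2^n candidate witnesses (exponential time)."""
    return any(verify(clauses, bits)
               for bits in product([False, True], repeat=n))

# Hypothetical formula: (x1 or x2) and (not x1 or x2), satisfied by x2 = True.
clauses = [[1, 2], [-1, 2]]
```

This asymmetry, cheap verification but (as far as anyone knows) expensive search, is exactly what defines the class NP.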
SAT
Industry would love a free lunch
• Given a description of a good plane, automatically find one.
• Given a circuit, find a satisfying assignment.
• Given a graph, find a bichromatic coloring.
• Given a course description, find a schedule.
NP-Complete Problems
NP-Complete Problems
Find the biggest clique, i.e. a subset of nodes that are all connected to each other.
NP-Complete Problems
Find the LONGEST simple s-t path.
NP-Complete Problems
Find a partition of the nodes into two sets with the most edges between them.
Colour each node
Use the fewest # of colours.
Nodes with lines between them must have different colours.
NP-Complete Problems
Try all possible colourings
Too many to try: a 50-node graph has more colourings than the number of atoms.
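Brute-force colouring really is just a loop over all colours^n assignments; a sketch (the triangle instance is mine):

```python
from itertools import product

def colourable(n, edges, colours):
    """Try every assignment of colours to the n nodes; nodes joined by an
    edge must get different colours. colours^n assignments in the worst case."""
    for colouring in product(range(colours), repeat=n):
        if all(colouring[u] != colouring[v] for u, v in edges):
            return True
    return False

# A triangle needs 3 colours: with only 2, every assignment fails.
triangle = [(0, 1), (1, 2), (0, 2)]
```

Note that checking one proposed colouring is fast; it is the number of candidates that explodes, which is the NP pattern again.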
NP-Complete Problems
Is there a fast algorithm?
Most people think not.
We have not been able to prove that there is not.
It is one of the biggest open problems in the field.
NP-Complete Problems
73
End