Post on 25-Feb-2016
Cliff Shaffer, Computer Science
Computational Complexity
Computer Performance
• Do we need to care about performance when computers keep getting faster?
• Our history is to do bigger problems, not the same ones faster.
• More complex problems are less tied to our everyday “common sense” experience.
Algorithm Analysis
• We could compare two programs by running them side by side.
• But that means we have to implement them!
• We want a way to easily evaluate programs before they are written: look at the algorithm, not the program.
• Algorithm analysis estimates a program’s cost by its growth rate as a function of problem size.
Simple Search
• Find the record with key value 1005.
• Sequential search: look through each record in turn.
• If there are n records, we do work proportional to n (unless we are lucky).
• The growth rate of this problem (in the average and worst cases) is linear in n.
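Sequential search can be sketched in a few lines of Python (the key values below are invented for illustration):

```python
def sequential_search(records, key):
    """Look through each record in turn; the worst case touches all n."""
    for i, record in enumerate(records):
        if record == key:
            return i        # found after i + 1 comparisons
    return -1               # not found: n comparisons


keys = [1002, 1007, 1005, 1001]
print(sequential_search(keys, 1005))  # → 2
```

Getting “lucky” means the key sits near the front; on average we scan about half the list, which is still proportional to n.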
Sorting: Insertion Sort
• For each record, insert it into the sorted list made from the records already seen.
• We might have to look at each record already in the (sorted) list: n work.
• Since we do this for n records, this is n*n in the worst (and average) cases.
• So the cost is proportional to n² (unless we are really lucky).
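The steps above can be sketched as a standard insertion sort in Python:

```python
def insertion_sort(records):
    """For each record, insert it into the sorted prefix already built.

    The inner loop may scan the entire sorted prefix, so the worst case
    (reverse-sorted input) does about n*n work.
    """
    a = list(records)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger records right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                # drop the record into its slot
    return a
```

Being “really lucky” means the input is already nearly sorted, in which case the inner loop exits immediately and the cost drops to about n.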
Sorting: Merge Sort
For a list of n records:
• Split the list in half.
• Sort each half (using merge sort).
• Merge the two halves together (needs n work).
Total cost: proportional to n log n.
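A sketch of the recursive scheme in Python — split, sort each half, merge:

```python
def merge_sort(records):
    """Split in half, sort each half recursively, then merge.

    Each level of recursion does n total merge work, and there are
    about log n levels, giving cost proportional to n log n.
    """
    if len(records) <= 1:
        return list(records)
    mid = len(records) // 2
    left = merge_sort(records[:mid])
    right = merge_sort(records[mid:])
    # Merge the two sorted halves (n work at this level).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```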
Sorting Demo
URL: http://www.cs.ubc.ca/spider/harrison/Java/
Compare Insertion, Shell, Heap, and Quick sorts.
Does It Matter?
• 1,000 records:
  • Insertion sort: 1,000,000
  • Mergesort: 10,000
  • Factor of 100 difference
• 1,000,000 records:
  • Insertion sort: 1,000,000,000,000
  • Mergesort: 20,000,000
  • Factor of 50,000 difference
  • Hours vs. seconds on a real computer
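The slide’s figures round log₂ n to 10 and 20; recomputing the ratio n² / (n log₂ n) confirms the factors of roughly 100 and 50,000:

```python
import math

# Compare the two cost functions at the slide's two problem sizes.
for n in (1_000, 1_000_000):
    insertion = n * n               # ~n^2 steps
    merge = n * math.log2(n)        # ~n log n steps
    print(f"n={n:>9,}: insertion ~{insertion:,.0f}, "
          f"mergesort ~{merge:,.0f}, factor ~{insertion / merge:,.0f}")
```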
Tractable vs. Intractable
• Cost n is better than cost n log n, which is better than cost n².
• These are all polynomial: a faster computer lets you solve a problem some factor bigger in an hour.
• Exponential growth: 2ⁿ.
• Making the input one unit bigger doubles the cost.
• Running twice as fast only gives you one more problem unit.
• Exponential-time algorithms are “intractable”.
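The “one more problem unit” claim follows from 2ⁿ⁺¹ = 2·2ⁿ. A tiny numeric sketch (the hour-long step budget is an invented figure):

```python
# With cost 2**n, doubling machine speed buys only one more problem unit,
# since 2**(n + 1) == 2 * 2**n.
budget = 2 ** 20                        # steps the old machine does in an hour
n_old = budget.bit_length() - 1         # largest n with 2**n <= budget
n_new = (2 * budget).bit_length() - 1   # machine twice as fast: budget doubles
print(n_old, n_new)  # → 20 21
```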
Problems
• Problems have many algorithms (sorting, for example).
• What does “cost of a problem” mean?
• We say the problem’s cost is that of the best algorithm.
• But we can’t know all the algorithms!
• It is possible (though difficult) to figure out the lowest cost for any algorithm that solves the problem.
• Sorting: n log n lower bound.
Traveling Salesman Problem
• Given n cities, find a tour of all the cities that is of shortest length.
• Nobody knows a polynomial-time algorithm, only exponential algorithms.
• We don’t KNOW that this problem needs exponential time.
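A brute-force solver makes the exponential cost concrete: fixing the starting city, it still tries all (n−1)! tours. The four cities below (corners of a unit square) are an invented example:

```python
from itertools import permutations
from math import dist

def brute_force_tsp(cities):
    """Try every tour starting at city 0: (n-1)! permutations, so the
    work grows faster than any polynomial in n."""
    n = len(cities)
    best_tour, best_len = None, float("inf")
    for perm in permutations(range(1, n)):
        tour = (0, *perm, 0)                       # closed tour
        length = sum(dist(cities[tour[i]], cities[tour[i + 1]])
                     for i in range(n))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len


cities = [(0, 0), (0, 1), (1, 1), (1, 0)]          # a unit square
print(brute_force_tsp(cities))                     # optimal length is 4.0
```

Even at one million tours per second, adding a handful of cities multiplies the running time by factorial factors.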
Traveling Salesman Example
URL: http://itp.nat.uni-magdeburg.de/~mertens/TSP/TSP.html
Nearest Neighbor Heuristic
NP-Completeness
• Many, many problems are like traveling salesman: we know no polynomial algorithm, and have no proof that they need exponential time.
• It is possible to “cheaply” convert any problem in this collection into any other.
• So if we had a polynomial-time algorithm for any of them, we’d have one for all.
• These are called NP-complete problems.
• NP problems are those for which we can quickly verify that a proposed solution is correct.
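The “quick verification” property is easiest to see in the decision version of traveling salesman (“is there a tour of length at most some bound?”): checking a proposed tour takes only linear time, even though finding the best tour seems to need exponential work. A sketch, with a made-up four-city instance:

```python
from math import dist

def verify_tour(cities, tour, bound):
    """Check a proposed solution in linear time: the tour must visit
    every city once, return to its start, and fit within the bound."""
    n = len(cities)
    visits_all = sorted(tour[:-1]) == list(range(n))
    closed = tour[0] == tour[-1]
    length = sum(dist(cities[tour[i]], cities[tour[i + 1]])
                 for i in range(n))
    return visits_all and closed and length <= bound


cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
print(verify_tour(cities, (0, 1, 2, 3, 0), 4.0))  # → True
```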
Examples of NP-complete Problems
• Find the cheapest way to wire up telephones in a city.
• Find the largest clique in a graph.
• Find a way to assign values to a Boolean expression to make it true.
• Find the largest matching between workers and compatible jobs.
• Find the least number of boxes needed to pack some goods.
What do you do?
…when you must solve an NP-complete problem?
• Approximation
• Optimization
• Many engineering problems are optimization problems.
• Examples: aircraft design, “best” decision.
Why is optimization hard?
• Imagine a 2-parameter problem: find the highest hill.
• Imagine a 10-parameter problem.
• Just checking the “high” and “low” values for each parameter would give 2¹⁰ = 1024 combinations.
• Imagine a 10-d “cube”… 1024 corners.
• The goal is to find the best point in the cube, for a complex function.
• Many problems have higher dimension.
• Whole branches of CS/Math/Engineering are devoted to optimization.
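A sketch of the corner-checking idea, using an invented 10-parameter objective; note that even exhaustively checking all 1024 corners can miss the true optimum:

```python
from itertools import product

# Invented objective for illustration: a single "hill" peaking at 0.3 in
# every coordinate, so the true optimum is not at any cube corner.
def f(x):
    return -sum((xi - 0.3) ** 2 for xi in x)

corners = list(product([0.0, 1.0], repeat=10))  # all "high/low" settings
print(len(corners))                             # → 1024
best_corner = max(corners, key=f)               # best corner is all zeros,
                                                # but the real peak is at
                                                # (0.3, ..., 0.3)
```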
Uncomputable Problems
• Not all problems that we can think of can be solved.
• Abstractly, not all functions can be computed.
• The number of computable programs is countably infinite.
• The number of integer functions is uncountably infinite.
The Halting Problem
• Problem: Given a particular program P on particular input I, does P halt when run on I?
• It can be proved that this is impossible to determine in all cases.
• Lots of problems like this try to determine program behavior.
• Does this program contain a virus?
Does this terminate for all n? (n is an integer)

    While n > 1 do
        if Odd(n) then n = 3n + 1
        else n = n/2
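This is the famous Collatz (3n+1) loop. Running it for any particular n is easy; proving that it reaches 1 for every positive integer is an open problem. A sketch, with an arbitrary step cutoff so the experiment itself always halts:

```python
def collatz_halts(n, max_steps=10_000):
    """Run the 3n+1 loop; True if it reached 1 within max_steps.

    Whether it halts for EVERY positive integer is the open Collatz
    conjecture -- no proof is known either way.
    """
    steps = 0
    while n > 1 and steps < max_steps:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return n == 1


print(all(collatz_halts(n) for n in range(1, 10_000)))  # → True (so far!)
```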