CSC 201 Analysis and Design of Algorithms
Lecture 04: Time complexity analysis in form of Big-Oh
Dr. Surasak Mungsing
E-mail: [email protected]
Apr 19, 2023
Big-Oh Notation

Big-Oh notation was introduced to compare functions' growth rates in terms of their asymptotic behavior.
Big-Oh notation characterizes functions according to their growth rates: different functions with the same growth rate may be represented using the same O notation.
A description of a function in terms of big-Oh notation usually only provides an upper bound on the growth rate of the function.
Upper Bound

In general, a function f(n) is O(g(n)) if there exist positive constants c and n0 such that
f(n) ≤ c·g(n) for all n ≥ n0.
e.g. if f(n) = 1000n and g(n) = n^2, take c = 1 and n0 = 1000; then f(n) ≤ 1·g(n) for all n ≥ n0, and we say that f(n) = O(g(n))
The O notation indicates 'bounded above by a constant multiple of.'
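The definition can be spot-checked numerically. The sketch below only illustrates what the constants c and n0 mean (a finite check can never prove an asymptotic claim); the helper `bounded_above` and the sample functions are hypothetical names, not part of the lecture.

```c
#include <assert.h>

typedef long long (*func)(long long n);

static long long f(long long n) { return 1000 * n; }  /* f(n) = 1000n */
static long long g(long long n) { return n * n; }     /* g(n) = n^2   */

/* Hypothetical helper: returns 1 if f(n) <= c*g(n) for every n in [n0, limit]. */
static int bounded_above(func f, func g, long long c, long long n0,
                         long long limit) {
    for (long long n = n0; n <= limit; n++)
        if (f(n) > c * g(n))
            return 0;
    return 1;
}
```

With c = 1 and n0 = 1000 the bound holds from n0 onward, while for smaller n (e.g. n = 1, where 1000·1 > 1^2) it fails, which is exactly why the definition only demands the inequality beyond some n0.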
Big-Oh, the Asymptotic Upper Bound
Because big O notation discards multiplicative constants on the running time, and ignores efficiency for low input sizes, it does not always reveal the fastest algorithm in practice or for practically-sized data sets, but the approach is still very effective for comparing the scalability of various algorithms as input sizes become large.
If an algorithm's time complexity is O(n^2), then its computation time grows no faster than a quadratic function once the input is large enough.
Some upper bounds may be too broad: for example, 2n^2 = O(n^3) holds by the definition (take c = 1 and n0 = 2, since 2n^2 ≤ n^3 for n ≥ 2), but it is better to say that 2n^2 = O(n^2)
Example 1

For all n > 6, g(n) > 1·f(n); then f(n) is in O(g(n)).
Example 2

If there exists n0 such that for all n > n0, f(n) < 1·g(n), then f(n) is in O(g(n)).
Example 3

With n0 = 5 and c = 3.5: for all n > n0, f(n) < c·h(n), so f(n) is in O(h(n)).
Exercise on O-notation

Show that f(n) = 3n^2 + 2n + 5 is in O(n^2).
For n ≥ 1: f(n) = 3n^2 + 2n + 5 ≤ 3n^2 + 2n^2 + 5n^2 = 10n^2,
or f(n) ≤ 10n^2.
Consider c = 10, n0 = 1:
f(n) ≤ c·g(n) for n ≥ n0, with g(n) = n^2;
then f(n) is in O(n^2)
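As a sanity check of the derivation, the hypothetical helper below verifies the inequality 3n^2 + 2n + 5 ≤ 10n^2 over a range of n (a finite spot-check, not a proof):

```c
#include <assert.h>

/* Returns 1 if 3n^2 + 2n + 5 <= 10n^2 for every n in [1, limit]. */
static int exercise_bound_holds(long long limit) {
    for (long long n = 1; n <= limit; n++)
        if (3*n*n + 2*n + 5 > 10*n*n)
            return 0;
    return 1;
}
```

Note the bound is tight at n = 1, where both sides equal 10.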
Usage of Big-Oh

We should always write Big-Oh in its simplest form,
e.g. 3n^2 + 2n + 5 = O(n^2)
It is not wrong to write these functions in terms of Big-Oh as below, but the most appropriate form is the simplest one:
• 3n^2 + 2n + 5 = O(3n^2 + 2n + 5)
• 3n^2 + 2n + 5 = O(n^2 + n)
• 3n^2 + 2n + 5 = O(3n^2)
Exercise on O-notation

f1(n) = 10n + 25n^2         → O(n^2)
f2(n) = 20n log n + 5n      → O(n log n)
f3(n) = 12n log n + 0.05n^2 → O(n^2)
f4(n) = n^(1/2) + 3n log n  → O(n log n)
Classification of Function : Big-Oh
A function f(n) is said to be of at most logarithmic growth if f(n) = O(log n)
A function f(n) is said to be of at most quadratic growth if f(n) = O(n2)
A function f(n) is said to be of at most polynomial growth if f(n) = O(nk), for some natural number k > 1
A function f(n) is said to be of at most exponential growth if there is a constant c, such that f(n) = O(cn), and c > 1
A function f(n) is said to be of at most factorial growth if f(n) = O(n!).
Classification of Function : Big-Oh (cont.)
A function f(n) is said to have constant running time if the size of the input n has no effect on the running time of the algorithm (e.g., assignment of a value to a variable). The equation for this algorithm is f(n) = c
Other logarithmic classifications: f(n) = O(n log n); f(n) = O(log log n)
Big O Fact

A polynomial of degree k is O(n^k)
Proof:
If f(n) = b_k·n^k + b_(k-1)·n^(k-1) + … + b_1·n + b_0,
let a_i = |b_i|.
Then f(n) ≤ a_k·n^k + a_(k-1)·n^(k-1) + … + a_1·n + a_0 ≤ (a_k + a_(k-1) + … + a_0)·n^k for n ≥ 1, so f(n) = O(n^k).
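The proof's bound can be demonstrated numerically. The helpers below (hypothetical names, not from the lecture) evaluate a polynomial and the constant (Σ|b_i|)·n^k used in the proof:

```c
#include <assert.h>
#include <stdlib.h>   /* llabs */

/* Evaluates p(n) = b[0] + b[1]*n + ... + b[k]*n^k. */
static long long poly_eval(const long long *b, int k, long long n) {
    long long v = 0, p = 1;
    for (int i = 0; i <= k; i++) { v += b[i] * p; p *= n; }
    return v;
}

/* The proof's upper bound: (|b[0]| + ... + |b[k]|) * n^k. */
static long long poly_bound(const long long *b, int k, long long n) {
    long long c = 0, p = 1;
    for (int i = 0; i <= k; i++) c += llabs(b[i]);
    for (int i = 0; i < k; i++) p *= n;
    return c * p;
}
```

For f(n) = 3n^2 + 2n + 5 the bound is the familiar 10n^2 from the earlier exercise.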
Some Rules

Transitivity: f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n))
Addition: f(n) + g(n) = O(max{f(n), g(n)})
Polynomials: a_0 + a_1·n + … + a_d·n^d = O(n^d)
Hierarchy of functions: n + log n = O(n); 2^n + n^3 = O(2^n)
Some Rules (cont.)

Base of logs ignored: log_a n = O(log_b n)
Powers inside logs ignored: log(n^2) = O(log n)
Bases and powers in exponents are not ignored: 3^n is not O(2^n); (a^n)^2 is not O(a^n)
Big-Oh Complexity
O(1) The cost of applying the algorithm can be bounded independently of the value of n. This is called constant complexity.
O(log n) The cost of applying the algorithm to problems of sufficiently large size n can be bounded by a function of the form k log n, where k is a fixed constant. This is called logarithmic complexity.
O(n): linear complexity
O(n log n): n log n complexity
O(n^2): quadratic complexity
Big-Oh Complexity (cont.)
O(n^3): cubic complexity
O(n^4): quartic complexity
O(n^32): polynomial complexity
O(c^n), where the constant c > 1: exponential complexity
O(2^n): exponential complexity
O(e^n): exponential complexity
O(n!): factorial complexity
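The gap between the polynomial and exponential classes shows up even at small n: 2^n is still below n^3 at n = 9 but overtakes it from n = 10 onward and never looks back. A small illustration with hypothetical helpers:

```c
#include <assert.h>

/* 2^n by repeated doubling (valid for 0 <= n < 63). */
static long long pow2(int n) {
    long long v = 1;
    while (n-- > 0) v *= 2;
    return v;
}

static long long cube(long long n) { return n * n * n; }
```

This crossover is why O(2^n) dominates O(n^3), and any fixed polynomial, for large enough n.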
Practical Complexity t < 500

[Chart: f(n) = log n, n, n log n, n^2, n^3, and 2^n plotted for n = 1..20, with the time axis clipped at t < 500.]
Practical Complexity t < 5000

[Chart: f(n) = log n, n, n log n, n^2, n^3, and 2^n plotted for n = 1..20, with the time axis clipped at t < 5000.]
Practical Complexity

[Chart: the same growth functions compared on log-log axes, with n up to 65536 and t up to 10^7.]
Things to Remember in Analysis

Constants and low-order terms are ignored: if f(n) = 2n^2, then f(n) = O(n^2).
Running time and memory are the important resources for an algorithm, and very large inputs affect performance the most.
The parameter n normally means the size of the input: n may refer to the degree of a polynomial, the size of an input file for data sorting, or the number of nodes in a graph.
Things to Remember in Analysis (cont.)

Worst-case analysis concerns the worst-case execution time: it is important to know how much time might be needed in the worst case, to guarantee that the algorithm will always finish on time.
Average-case (and also worst-case) performance is the most commonly used in algorithm analysis, applying probabilistic techniques, especially expected value, to determine expected running times (i.e. the case of typical input data).
General Rules for Analysis (1)

1. Consecutive statements
Count only the most time required by the consecutive block of statements, and likewise count only the most time required by consecutive loops.
If Block #1 takes time t1 and Block #2 takes time t2, then t1 + t2 = O(max(t1, t2)).
General Rules for Analysis (2)

2. If/Else
if cond then
    S1
else
    S2
If S1 takes time t1 and S2 takes time t2, the statement takes at most max(t1, t2), plus the cost of evaluating the condition.
General Rules for Analysis (3)

3. For loops
The running time of a for-loop is at most the running time of the statements inside the for-loop, times the number of iterations.

for (i = sum = 0; i < n; i++)
    sum += a[i];

The loop iterates n times and executes two assignments (the accumulation and the increment) on each iteration, so the asymptotic complexity is O(n).
General Rules for Analysis (4)

4. Nested for-loops
Analyze inside-out: the total time is the product of the time required for each loop.

for (i = 0; i < n; i++) {
    for (j = 0, sum = a[0]; j <= i; j++)
        sum += a[j];
    printf("sum for subarray 0 through %d is %d\n", i, sum);
}
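The inner loop body runs i + 1 times for each i, so the total count over the nested loops is 1 + 2 + … + n = n(n+1)/2, which is O(n^2). A hypothetical counter confirming the formula:

```c
#include <assert.h>

/* Counts how many times the innermost statement executes
   in the nested loops above: sum over i of (i + 1). */
static long long inner_ops(long long n) {
    long long count = 0;
    for (long long i = 0; i < n; i++)
        for (long long j = 0; j <= i; j++)
            count++;
    return count;
}
```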
General Rules for Analysis (5)

Analysis strategy: analyze from the inside out; analyze function calls first.