
Lecture 3: Analysing Complexity of Algorithms - Big Oh, Big Omega, and Big Theta Notation

Georgy Gimel’farb

COMPSCI 220 Algorithms and Data Structures

Outline

1 Complexity

2 Basic tools

3 Big-Oh

4 Big Omega

5 Big Theta

6 Examples


Typical Complexity Curves

Running time T(n) is proportional to:    Complexity:

T(n) ∝ log n          logarithmic
T(n) ∝ n              linear
T(n) ∝ n log n        linearithmic
T(n) ∝ n^2            quadratic
T(n) ∝ n^3            cubic
T(n) ∝ n^k            polynomial
T(n) ∝ 2^n            exponential
T(n) ∝ k^n, k > 1     exponential


Separating an Algorithm Itself from Its Implementation

Two concepts to separate an algorithm from its implementation:

• The input data size n, or the number of individual data items in a single data instance to be processed.

• The number of elementary operations f(n) taken by an algorithm, or its running time.

The running time of a program implementation: cf(n)

• The constant factor c can rarely be determined and depends on a computer, operating system, language, compiler, etc.

When the input size increases from n = n1 to n = n2, all other factors being equal, the relative running time of the program increases by a factor of

T(n2) / T(n1) = c f(n2) / (c f(n1)) = f(n2) / f(n1).
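
An illustrative sketch in Python (the quadratic f and the sample values are assumptions made only for this demonstration): the ratio T(n2)/T(n1) comes out the same for every constant factor c, so it depends only on f.

def f(n):
    # assumed elementary-operation count of some algorithm (quadratic here)
    return n * n

def T(n, c):
    # running time of one particular implementation with constant factor c
    return c * f(n)

n1, n2 = 100, 1000
for c in (1e-9, 1e-6, 1.0):            # "different machines / compilers"
    print(c, T(n2, c) / T(n1, c))      # always 100.0 = f(n2) / f(n1)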


Relative Growth g(n) = f(n)/f(5) of Running Time

The approximate running time for large input sizes gives enough information to distinguish between a good and a bad algorithm.

Relative growth g(n) = f(n)/f(5) for input sizes n = 5, 5^2 = 25, 5^3 = 125, 5^4 = 625:

Function f(n)               n = 5   n = 25         n = 125          n = 625
Constant        1           1       1              1                1
Logarithmic     log_5 n     1       2              3                4
Linear          n           1       5              25               125
Linearithmic    n log_5 n   1       10             75               500
Quadratic       n^2         1       5^2 = 25       5^4 = 625        5^6 = 15,625
Cubic           n^3         1       5^3 = 125      5^6 = 15,625     5^9 = 1,953,125
Exponential     2^n         1       2^20 ≈ 10^6    2^120 ≈ 10^36    2^620 ≈ 10^187
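
A small Python sketch that recomputes the relative growth g(n) = f(n)/f(5) from the table above (purely illustrative; the function list simply mirrors the rows of the table):

import math

functions = {
    "constant":     lambda n: 1,
    "logarithmic":  lambda n: math.log(n, 5),
    "linear":       lambda n: n,
    "linearithmic": lambda n: n * math.log(n, 5),
    "quadratic":    lambda n: n ** 2,
    "cubic":        lambda n: n ** 3,
    "exponential":  lambda n: 2 ** n,
}

for name, f in functions.items():
    # relative growth g(n) = f(n) / f(5) at n = 5, 25, 125, 625
    row = [f(n) / f(5) for n in (5, 25, 125, 625)]
    print(name, ["%.3g" % x for x in row])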


Big-Oh, Big-Theta, and Big-Omega Tools

Let f(n) and g(n) be nonnegative-valued functions defined on nonnegative integers n.

Math notation for “of the order of . . . ” or “roughly proportional to. . . ”:

• Big-Oh (actually: Big-Omicron) O(. . .) ⇒ g(n) = O(f(n))

• Big-Theta Θ(. . .) ⇒ g(n) = Θ(f(n))

• Big-Omega Ω(. . .) ⇒ g(n) = Ω(f(n))

Big Oh O(. . .) – Formal Definition

The function g(n) is O(f(n)) (read: g(n) is Big Oh of f(n)) iff there exists a positive real constant c and a positive integer n0 such that g(n) ≤ cf(n) for all n > n0.

The notation iff abbreviates “if and only if”.
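
The definition can be sanity-checked numerically once candidate constants are chosen (a finite-range check only, not a proof; the helper name and the sample g, f, c, n0 below are just for illustration):

def is_big_oh_upto(g, f, c, n0, n_max=100_000):
    # Check the defining inequality g(n) <= c*f(n) for all tested n with n0 < n <= n_max.
    return all(g(n) <= c * f(n) for n in range(n0 + 1, n_max + 1))

# Example: g(n) = 3n + 5 is O(n); c = 4 and n0 = 5 work, since 3n + 5 <= 4n for n >= 5.
print(is_big_oh_upto(lambda n: 3 * n + 5, lambda n: n, c=4, n0=5))   # True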


Example 1.8, p.13: g(n) = 100 log_10 n is O(n)

g(n) < n if n > 238 or g(n) < 0.3n if n > 1000.

By definition, g(n) is O(f(n)), or g(n) = O(f(n)), if a constant c > 0 exists such that cf(n) ≥ g(n) for all n > n0.

• To prove that some g(n) is O(f(n)) means to show that for g and f such constants c and n0 exist.

• The constants c and n0 are interdependent.

• g(n) is O(f(n)) iff the graph of g(n) is always below or at the graph of cf(n) after n0.
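
The two inequalities quoted above can be checked numerically over a finite range (illustrative only; the range end is arbitrary):

import math

g = lambda n: 100 * math.log10(n)

print(all(g(n) < n for n in range(239, 100_001)))          # g(n) < n for n > 238
print(all(g(n) < 0.3 * n for n in range(1001, 100_001)))   # g(n) < 0.3n for n > 1000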


Big-Oh O(. . .): Informal Meaning

O(f(n)) generalises an asymptotic upper bound.

• If g(n) is O(f(n)), the running time g(n) of an algorithm grows asymptotically, i.e. for large n, at most as fast, to within a constant factor, as the running time f(n) of another algorithm.

• In other words, g(n) for large n may approach cf(n) closer and closer while the relationship g(n) ≤ cf(n) holds for n > n0.

• The scaling factor c and the threshold n0 are interdependent and differ for different particular functions g(n) in O(f(n)).

• The notations g(n) = O(f(n)) and "g(n) is O(f(n))" actually mean g(n) ∈ O(f(n)).

• The notation g(n) ∈ O(f(n)) indicates that g(n) is a member of the set O(f(n)) of functions.

• All the functions in the set O(f(n)) increase at the same or a lesser rate than f(n) as n → ∞.


Big Omega Ω(. . .)

The function g(n) is Ω(f(n)) iff there exists a positive real constant c and a positive integer n0 such that g(n) ≥ cf(n) for all n > n0.

• Ω(. . .) is complementary to O(. . .).

• It generalises the concept of "lower bound" (≥) in the same way as O(. . .) generalises the concept of "upper bound" (≤): if g(n) is Ω(f(n)) then f(n) is O(g(n)).

• Example 1: 5n^2 is Ω(n) because 5n^2 ≥ 5n for n ≥ 1.

• Example 2: 0.01n is Ω(log n) because 0.01n ≥ 0.5 log_10 n for n ≥ 100.
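
Both examples can be checked numerically with the stated constants (a finite-range sanity check, for illustration only):

import math

# Example 1: 5n^2 >= 5n for n >= 1, i.e. c = 5, n0 = 0
print(all(5 * n * n >= 5 * n for n in range(1, 10_001)))

# Example 2: 0.01n >= 0.5*log10(n) for n >= 100, i.e. c = 0.5, n0 = 99
print(all(0.01 * n >= 0.5 * math.log10(n) for n in range(100, 10_001)))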


Big Theta Θ(. . .)

The function g(n) is Θ(f(n)) iff there exist two positive real constants c1 and c2 and a positive integer n0 such that c1 f(n) ≤ g(n) ≤ c2 f(n) for all n > n0.

• If g(n) is Θ(f(n)) then g(n) is O(f(n)) and f(n) is O(g(n)) or, what is the same, g(n) is O(f(n)) and g(n) is Ω(f(n)):

• g(n) is O(f(n)) → g(n) ≤ c′f(n) for n > n′.
• g(n) is Ω(f(n)) → f(n) ≤ c″g(n) for n > n″.
• g(n) is Θ(f(n)) ← c″ = 1/c1, c′ = c2, and n0 = max{n′, n″}.

• Informally, if g(n) is Θ(f(n)) then both functions have the same rate of increase.

• Example: the same rate of increase for g(n) = n + 5n^0.5 and f(n) = n because n ≤ n + 5n^0.5 ≤ 6n for n > 1.
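
The sandwich inequality in this example can be verified numerically (illustration only):

# g(n) = n + 5*sqrt(n) is Theta(n): n <= g(n) <= 6n, i.e. c1 = 1, c2 = 6
print(all(n <= n + 5 * n ** 0.5 <= 6 * n for n in range(2, 10_001)))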


Comparisons: Two Crucial Ideas

• The exact running time function g(n) is not very important, since it can be multiplied by an arbitrary positive constant c.

• The relative behaviour of two functions is compared only asymptotically, for large n, but not near the origin where it may make no sense.

• If the constants c involved are very large, the asymptotic behaviour loses practical interest!

• In most cases, however, the constants remain fairly small.

• To prove that g(n) is not O(f(n)), Ω(f(n)), or Θ(f(n)), one has to show that the desired constants do not exist, i.e. lead to a contradiction (see the short argument below).

• g(n) and f(n) in the Big-Oh, -Omega, and -Theta definitions mostly relate, respectively, to the "exact" and the rough approximate (like log n, n, n^2, etc.) running time on inputs of size n.
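
A standard instance of such a contradiction argument (given here only for concreteness): to show that g(n) = n^2 is not O(n), suppose constants c > 0 and n0 existed with n^2 ≤ cn for all n > n0. Dividing by n gives n ≤ c for all n > n0, which fails for every n > max{n0, c}. No such constants exist, so n^2 is not O(n).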


Example 1.11, p.14

Prove that the linear function g(n) = an + b, a > 0, is O(n).

The proof: by the chain of inequalities g(n) ≤ an + |b| ≤ (a + |b|)n for all n ≥ 1.

Do not write O(2n) or O(an + b), as this still means O(n)!

O(n)-time:

T(n) = 3n + 1
T(n) = 10^8 + n
T(n) = 50 + 10^-8 n
T(n) = 10^6 n + 1
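
Each of these functions can be checked against the definition with one explicit choice of c and n0 (a finite-range sanity check in Python; the constants below are just one valid choice):

tests = [
    (lambda n: 3 * n + 1,        4,            1),        # 3n + 1 <= 4n for n >= 1
    (lambda n: 10 ** 8 + n,      2,            10 ** 8),  # 10^8 + n <= 2n for n >= 10^8
    (lambda n: 50 + 1e-8 * n,    51,           1),        # 50 + 10^-8 n <= 51n for n >= 1
    (lambda n: 10 ** 6 * n + 1,  10 ** 6 + 1,  1),        # 10^6 n + 1 <= (10^6 + 1)n for n >= 1
]
for T, c, n0 in tests:
    # test T(n) <= c*n on a sample of n starting at n0
    print(all(T(n) <= c * n for n in range(n0, n0 + 1000)))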

• Remember that "Big-Oh", as well as "Big-Omega" and "Big-Theta", describes an asymptotic behaviour for large problem sizes.

• Only the dominant terms as n → ∞ need to be shown as the argument of "Big-Oh", "Big-Omega", and "Big-Theta".


Example 1.12, p.15

The polynomial P_k(n) = a_k n^k + a_{k-1} n^{k-1} + . . . + a_2 n^2 + a_1 n + a_0, with a_k > 0, is O(n^k).

The proof: P_k(n) ≤ (a_k + |a_{k-1}| + . . . + |a_0|) n^k for n ≥ 1.

• Do not write O(P_k(n)), as this still means O(n^k)!

• O(n^k)-time:

T(n) = 3n^2 + 5n + 1 is O(n^2). Is it also O(n^3)?
T(n) = 10^-8 n^3 + 10^8 n^2 + 30 is O(n^3). Is it also Ω(n^2)?
T(n) = 10^-8 n^8 + 1000n + 1 is O(n^8). Is it also Θ(n^8)?

T(n) = P_k(n) is O(n^m) for m ≥ k, Θ(n^k), and Ω(n^m) for m ≤ k.
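
The bound used in the proof can be illustrated numerically for a sample polynomial (the coefficients are arbitrary, chosen only for this check):

# Sample cubic P(n) = 2n^3 - 7n^2 + 4n + 9, so k = 3 and a_k + |a_2| + |a_1| + |a_0| = 22
def P(n):
    return 2 * n ** 3 - 7 * n ** 2 + 4 * n + 9

print(all(P(n) <= 22 * n ** 3 for n in range(1, 10_001)))   # P(n) <= 22 n^3 for n >= 1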


Example 1.13, p.15

• The exponential function g(n) = 2^(n+k), where k is a constant, is O(2^n) because 2^(n+k) = 2^k · 2^n for all n.

• Generally, g(n) = m^(n+k) is O(l^n), l ≥ m > 1, because m^(n+k) ≤ l^(n+k) = l^k · l^n for any constant k.

A "brute-force" search for the best combination of n binary decisions by exhausting all the 2^n possible combinations has exponential time complexity!

• 2^30 ≈ 10^9 = 1,000,000,000 and 2^40 ≈ 10^12 = 1,000,000,000,000.

• Therefore, try to find a more efficient way of solving the decision problem if n ≥ 30 . . . 40.
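
A minimal sketch of such a brute-force search (a hypothetical 0/1 selection problem, used only to illustrate the 2^n blow-up): it enumerates every combination of n binary decisions, so each extra decision roughly doubles the running time.

from itertools import product

def best_combination(values, limit):
    # Enumerate all 2^n take/skip assignments and keep the best total not exceeding the limit.
    best, best_bits = 0, None
    for bits in product((0, 1), repeat=len(values)):     # 2^n iterations
        total = sum(v for v, b in zip(values, bits) if b)
        if total <= limit and total > best:
            best, best_bits = total, bits
    return best, best_bits

print(best_combination([3, 7, 2, 9, 5], limit=14))   # fine for small n; hopeless around n = 30 ... 40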


Example 1.14, p.15

For each m > 1, the logarithmic function g(n) = log_m(n) has the same rate of increase as lg(n), i.e. log_2 n, because log_m(n) = log_m(2) · lg(n) for all n > 0.

Omit the logarithm base when using "Big-Oh", "Big-Omega", and "Big-Theta" notation: log_m n is O(log n), Ω(log n), and Θ(log n).

You will find later that the most efficient search for data in an ordered array has logarithmic time complexity.
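
As a preview, a standard binary search sketch (not taken from these slides): each probe halves the part of the ordered array still under consideration, so at most about lg n + 1 probes are needed, giving logarithmic time.

def binary_search(a, target):
    # a must be sorted in non-decreasing order; returns an index of target, or -1 if absent
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # each step halves the search interval
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # prints 3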
