Asymptotic Growth Rate

Date post: 19-Dec-2015

Transcript
Page 1: Asymptotic Growth Rate. Asymptotic Running Time The running time of an algorithm as input size approaches infinity is called the asymptotic running time.

Asymptotic Growth Rate

Page 2:

Asymptotic Running Time

• The running time of an algorithm as input size approaches infinity is called the asymptotic running time.

• We study different notations for asymptotic efficiency.

• In particular, we study tight bounds, upper bounds, and lower bounds.

Page 3:

Outline

• Why do we need the different sets?
• Definition of the sets O (big oh), Ω (omega), Θ (theta), o (small oh), and ω (small omega)
• Classifying examples:
  – Using the original definition
  – Using limits

Page 4:

The functions

• Let f(n) and g(n) be asymptotically nonnegative functions whose domains are the set of natural numbers N = {0, 1, 2, …}.

• A function g(n) is asymptotically nonnegative if g(n) ≥ 0 for all n ≥ n0, where n0 ∈ N.

Page 5:

Big Oh

• Big "oh" - an asymptotic upper bound on the growth of an algorithm.

• When do we use Big Oh?
1. To provide information on the maximum number of operations that an algorithm performs
   – Insertion sort is O(n²) in the worst case
   • This means that in the worst case it performs at most cn² operations
2. In the theory of NP-completeness
   1. An algorithm is polynomial if it is O(n^k) for some constant k
   2. P = NP if there is a polynomial-time algorithm for any NP-complete problem

Note: Don't worry about NP now; it is discussed much later in the semester.

Page 6:

Definition of Big Oh

• O(f(n)) is the set of functions g(n) such that there exist positive constants c and N for which

  0 ≤ g(n) ≤ cf(n) for all n ≥ N.

• f(n) is called an asymptotic upper bound for g(n); that is, g(n) ∈ O(f(n)).

Page 7:

[Figure: g(n) ∈ O(f(n)). For all n ≥ N, the curve g(n) lies below cf(n); the g(n) curve says "I grow at most as fast as f."]

Page 8:

n² + 10n ∈ O(n²). Why?

[Chart: n² + 10n and 2n² plotted for n from 0 to 30; the curves cross near n = 10.]

Take c = 2 and N = 10: n² + 10n ≤ 2n² for all n ≥ 10.
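The slide's witnesses can be checked numerically. This is only a sanity check over a finite range, not a proof; the function names are illustrative:

```python
# Check the Big-Oh witnesses c = 2, N = 10 for g(n) = n^2 + 10n, f(n) = n^2.
def g(n):
    return n * n + 10 * n

def cf(n, c=2):
    return c * n * n

# The inequality 0 <= g(n) <= c*f(n) should hold for every n >= N = 10 ...
print(all(0 <= g(n) <= cf(n) for n in range(10, 100_000)))  # True
# ... and it fails just below N:
print(g(9) <= cf(9))  # False (171 > 162)
```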

Page 9:

Does 5n + 2 ∈ O(n)?

Proof: From the definition of Big Oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N.

Dividing both sides of the inequality by n > 0 we get: 0 ≤ 5 + 2/n ≤ c.
– 2/n (> 0) becomes smaller as n increases
– For instance, let N = 2 and c = 6

There are many choices here for c and N.

Page 10:

Does 5n + 2 ∈ O(n)?

If we choose N = 1, then 5 + 2/n ≤ 5 + 2/1 = 7. So any c ≥ 7 works. Let's choose c = 7.

If we choose c = 6, then 0 ≤ 5 + 2/n ≤ 6 holds for n ≥ 2. So any N ≥ 2 works. Choose N = 2.

In either case (we only need one!) there exist c > 0 and N > 0 such that 0 ≤ 5n + 2 ≤ cn for all n ≥ N. So the definition is satisfied and

5n + 2 ∈ O(n).
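Both witness pairs from the slide can be checked the same way (again a finite-range sanity check, not a proof):

```python
# Two valid witness pairs for 5n + 2 in O(n): (c, N) = (7, 1) and (6, 2).
def g(n):
    return 5 * n + 2

print(all(g(n) <= 7 * n for n in range(1, 100_000)))  # True
print(all(g(n) <= 6 * n for n in range(2, 100_000)))  # True
print(g(1) <= 6 * 1)  # False: c = 6 needs N >= 2
```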

Page 11:

Does n² ∈ O(n)? No. We will prove by contradiction that the definition cannot be satisfied.

Assume that n² ∈ O(n). From the definition of Big Oh, there must exist c > 0 and an integer N > 0 such that 0 ≤ n² ≤ cn for all n ≥ N.

Divide the inequality by n > 0 to get 0 ≤ n ≤ c for all n ≥ N.

But n ≤ c cannot be true for any n > max{c, N}. This contradicts the assumption. Thus, n² ∉ O(n).

Page 12:

Are they true?
• 1,000,000 n² ∈ O(n²)? Why/why not? • True
• (n - 1)n/2 ∈ O(n²)? Why/why not? • True
• n/2 ∈ O(n²)? Why/why not? • True
• lg(n²) ∈ O(lg n)? Why/why not? • True
• n² ∈ O(n)? Why/why not? • False

Page 13:

Omega

Asymptotic lower bound on the growth of an algorithm or a problem.

When do we use Omega?
1. To provide information on the minimum number of operations that an algorithm performs
   – Insertion sort is Ω(n) in the best case
   • This means that in the best case its instruction count is at least cn
   – It is Ω(n²) in the worst case
   • This means that in the worst case its instruction count is at least cn²

Page 14:

Omega (cont.)

2. To provide information on a class of algorithms that solve a problem
   – Sorting algorithms based on comparison of keys are Ω(n lg n) in the worst case
   • This means that all sorting algorithms based only on comparison of keys have to do at least cn lg n operations
   – Any algorithm based only on comparison of keys to find the maximum of n elements is Ω(n) in every case
   • This means that all algorithms based only on comparison of keys to find the maximum have to do at least cn operations

Page 15:

Supplementary topic: Why Ω(n lg n) for sorting?

• With n numbers to sort and no special assumptions (unlike pigeonhole sorting, which we discussed in a previous class), there are n! permutations.

• One comparison has only two outcomes, so lg(n!) comparisons are required in the worst case.

• n! is approximately equal to (n/e)^n
  – Stirling's approximation

• Hence lg(n!) ≈ n lg(n/e) = n lg n - n lg e, which is Θ(n lg n).

Page 16:

Definition of the set Omega

• Ω(f(n)) is the set of functions g(n) such that there exist positive constants c and N for which

  0 ≤ cf(n) ≤ g(n) for all n ≥ N.

• f(n) is called an asymptotic lower bound for g(n); that is, g(n) = Ω(f(n)).

Page 17:

[Figure: g(n) ∈ Ω(f(n)). For all n ≥ N, the curve g(n) lies above cf(n); the g(n) curve says "I grow at least as fast as f."]

Page 18:

Is 5n - 20 ∈ Ω(n)?

Proof: From the definition of Omega, there must exist c > 0 and an integer N > 0 such that 0 ≤ cn ≤ 5n - 20 for all n ≥ N.

Dividing the inequality by n > 0 we get: 0 ≤ c ≤ 5 - 20/n for all n ≥ N.

20/n ≤ 20, and 20/n becomes smaller as n grows.

There are many choices here for c and N. Since c > 0, we need 5 - 20/n > 0, i.e. N > 4.
If we choose c = 1, then 5 - 20/n ≥ 1 requires n ≥ 5. Choose N = 5.
If we choose c = 4, then 5 - 20/n ≥ 4 requires n ≥ 20. Choose N = 20.

In either case (we only need one!) we have c > 0 and N > 0 such that 0 ≤ cn ≤ 5n - 20 for all n ≥ N. So 5n - 20 ∈ Ω(n).
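As with the Big-Oh examples, the Omega witnesses can be spot-checked over a finite range (a sanity check, not a proof):

```python
# Witness pairs for 5n - 20 in Omega(n): (c, N) = (1, 5) and (4, 20).
def g(n):
    return 5 * n - 20

print(all(0 <= 1 * n <= g(n) for n in range(5, 100_000)))   # True
print(all(0 <= 4 * n <= g(n) for n in range(20, 100_000)))  # True
print(4 * 19 <= g(19))  # False: c = 4 needs N >= 20
```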

Page 19:

Are they true?
• 1,000,000 n² ∈ Ω(n²)? Why/why not? • True
• (n - 1)n/2 ∈ Ω(n²)? Why/why not? • True
• n/2 ∈ Ω(n²)? Why/why not? • False
• lg(n²) ∈ Ω(lg n)? Why/why not? • True
• n² ∈ Ω(n)? Why/why not? • True

Page 20:

Theta

• Asymptotic tight bound on the growth rate of an algorithm
  – Insertion sort is Θ(n²) in the worst and average cases
  • This means that in the worst and average cases insertion sort performs cn² operations
  – Binary search is Θ(lg n) in the worst and average cases
  • This means that in the worst and average cases binary search performs c lg n operations

Page 21:

Definition of the set Theta

• Θ(f(n)) is the set of functions g(n) such that there exist positive constants c, d, and N for which

  0 ≤ cf(n) ≤ g(n) ≤ df(n) for all n ≥ N.

• f(n) is called an asymptotically tight bound for g(n).

Page 22:

[Figure: g(n) ∈ Θ(f(n)). For all n ≥ N, g(n) lies between cf(n) and df(n); the curves say "We grow at the same rate."]

Page 23:

Another Definition of Theta

Θ(f(n)) = O(f(n)) ∩ Ω(f(n))

Example, with f(n) = n²:
• In Θ(n²), i.e. in both O(n²) and Ω(n²): n²/2, 5n² + 3n
• In O(n²) but not Ω(n²), i.e. small o(n²): log n, 5n + 3, n^1.5, n log n
• In Ω(n²) but not O(n²), i.e. small ω(n²): n⁵, 2^n

Page 24:

Does n²/2 - 3n ∈ Θ(n²)?

• We use the last definition and show:

1. n²/2 - 3n ∈ O(n²)

2. n²/2 - 3n ∈ Ω(n²)

Page 25:

Does n²/2 - 3n ∈ O(n²)?

From the definition there must exist c > 0 and N > 0 such that

0 ≤ n²/2 - 3n ≤ cn² for all n ≥ N.

Dividing the inequality by n² > 0 we get:

0 ≤ 1/2 - 3/n ≤ c for all n ≥ N.

Clearly any c ≥ 1/2 can be chosen; choose c = 1/2.
1/2 - 3/n ≥ 0 for all n ≥ 6, so choose N = 6.

Page 26:

Does n²/2 - 3n ∈ Ω(n²)?

There must exist c > 0 and N > 0 such that

0 ≤ cn² ≤ n²/2 - 3n for all n ≥ N.

Dividing by n² > 0 we get:

0 ≤ c ≤ 1/2 - 3/n for all n ≥ N.

Since 1/2 - 3/n > 0 for n > 6 and 1/2 - 3/n → 1/2, choose c = 1/4.
Then 1/4 ≤ 1/2 - 3/n requires 3/n ≤ 1/4, i.e. n ≥ 12. Choose N = 12.

So with c = 1/4 and N = 12 the definition is satisfied, and n²/2 - 3n ∈ Ω(n²).
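The two halves of the Theta proof give the sandwich c = 1/4, d = 1/2, N = 12, which can be spot-checked numerically (a sanity check, not a proof):

```python
# Theta witnesses for g(n) = n^2/2 - 3n: c = 1/4, d = 1/2, N = 12.
def g(n):
    return n * n / 2 - 3 * n

ok = all(0 <= n * n / 4 <= g(n) <= n * n / 2 for n in range(12, 100_000))
print(ok)  # True
```

At n = 12 the lower bound is tight: n²/4 = 36 = g(12) exactly.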

Page 27:

More
• 1,000,000 n² ∈ Θ(n²)? Why/why not? • True
• (n - 1)n/2 ∈ Θ(n²)? Why/why not? • True
• n/2 ∈ Θ(n²)? Why/why not? • False
• lg(n²) ∈ Θ(lg n)? Why/why not? • True
• n² ∈ Θ(n)? Why/why not? • False

Page 28:

small o

o(f(n)) is the set of functions g(n) which satisfy the following condition: for every positive real constant c, there exists a positive integer N for which

0 ≤ g(n) < cf(n) for all n ≥ N.

Page 29:

small o

• Little "oh" - used to denote an upper bound that is not asymptotically tight.
  – n is in o(n³).
  – n is not in o(n).

Page 30:

small omega

ω(f(n)) is the set of functions g(n) which satisfy the following condition: for every positive real constant c, there exists a positive integer N for which

g(n) > cf(n) ≥ 0 for all n ≥ N.

Page 31:

small omega and small o

g(n) ∈ ω(f(n)) if and only if f(n) ∈ o(g(n))

Page 32:

Limits can be used to determine Order

Let lim n→∞ f(n)/g(n) = L. Then:

• if L = c for some constant c > 0, then f(n) = Θ(g(n))
• if L = 0, then f(n) = o(g(n))
• if L = ∞, then f(n) = ω(g(n))

• The limit must exist.
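The limit test can be mimicked numerically by sampling the ratio f(n)/g(n) at increasing n. The helper below is hypothetical (not part of the lecture) and is only a heuristic: it cannot prove a limit, only suggest one:

```python
# Hypothetical helper: estimate lim f(n)/g(n) by sampling at growing n
# and classify f against g. A numeric heuristic, not a proof.
def limit_classify(f, g, ns=(10**2, 10**4, 10**8)):
    r = [f(n) / g(n) for n in ns]
    if r[-1] < 1e-3 and r[-1] < r[0]:
        return "o"       # ratio heading to 0
    if r[-1] > 1e3 and r[-1] > r[0]:
        return "omega"   # ratio heading to infinity
    return "theta"       # ratio settling near a positive constant

print(limit_classify(lambda n: 5 * n + 3, lambda n: n * n))          # o
print(limit_classify(lambda n: 5 * n * n + 3 * n, lambda n: n * n))  # theta
print(limit_classify(lambda n: n ** 1.5, lambda n: n))               # omega
```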

Page 33:

Example using limits

5n³ + 3n² ∈ Θ(n³), since

lim n→∞ (5n³ + 3n²)/n³ = lim n→∞ 5 + lim n→∞ 3/n = 5.

Page 34:

L'Hopital's Rule

If f(x) and g(x) are both differentiable with derivatives f'(x) and g'(x), respectively, and if

lim x→∞ f(x) = lim x→∞ g(x) = ∞,

then

lim x→∞ f(x)/g(x) = lim x→∞ f'(x)/g'(x)

whenever the limit on the right exists.

Page 35:

Examples using limits

• 10n³ + 3 ∈ Θ(n³), since
  lim n→∞ (10n³ + 3)/n³ = lim n→∞ 10 + lim n→∞ 3/n³ = 10.

• log n ∈ o(n³)? Use L'Hopital's Rule:
  lim n→∞ (log n)/n³ = lim n→∞ (log n)'/(n³)' = lim n→∞ (1/(n ln a))/(3n²) = 0,
  where a is the base of the logarithm. So log n ∈ o(n³).

• (Recall (x^a)' = a·x^(a-1); the kth-order derivative of y = x^k is the constant k!, which is what repeated applications of L'Hopital's Rule exploit.)

Page 36:

Example using limits

lg n ∈ o(n), since

lg n = ln n / ln 2 and (lg n)' = 1/(n ln 2),

so lim n→∞ (lg n)/n = lim n→∞ (lg n)'/(n)' = lim n→∞ 1/(n ln 2) = 0.
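The vanishing ratio is easy to observe numerically (an illustration of the limit, not a proof):

```python
import math

# The ratio (lg n)/n shrinks toward 0, consistent with lg n in o(n).
ratios = [math.log2(n) / n for n in (2**4, 2**10, 2**20)]
print(ratios[0] > ratios[1] > ratios[2])  # True: strictly decreasing
print(ratios[-1] < 1e-4)                  # True: already tiny at n = 2^20
```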

Page 37:

Example using limits

n^k ∈ o(2^n), where k is a positive integer.

2^n = e^(n ln 2), so (2^n)' = 2^n ln 2. (Note: (e^x)' = e^x.)

Applying L'Hopital's Rule k times:

lim n→∞ n^k/2^n = lim n→∞ k·n^(k-1)/(2^n ln 2) = ... = lim n→∞ k!/(2^n (ln 2)^k) = 0.
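The collapse of n^k/2^n can be seen directly for a fixed k (an illustration of the limit, not a proof):

```python
# n^k / 2^n for fixed k: past the early peak the ratio collapses toward 0.
k = 5
ratios = [n**k / 2**n for n in (10, 50, 100, 200)]
print(all(a > b for a, b in zip(ratios, ratios[1:])))  # True: decreasing
print(ratios[-1] < 1e-40)                              # True
```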

Page 38:

Further study of the growth functions

• Properties of growth functions:

  O : ≤    Ω : ≥    Θ : =    o : <    ω : >

Page 39:

Asymptotic Growth Rate
Part II

Page 40:

Outline

• More examples
• General properties
• Little Oh
• Additional properties

Page 41:

Order of Algorithm

• Property - Complexity Categories:

θ(lg n)  θ(n)  θ(n lg n)  θ(n²)  θ(n^j)  θ(n^k)  θ(a^n)  θ(b^n)  θ(n!)

where k > j > 2 and b > a > 1. If a complexity function g(n) is in a category that is to the left of the category containing f(n), then g(n) ∈ o(f(n)).

Page 42:

Comparing ln n with n^a (a > 0)

• Using limits we get:

  lim n→∞ (ln n)/n^a = lim n→∞ (1/n)/(a·n^(a-1)) = lim n→∞ 1/(a·n^a) = 0

• So ln n = o(n^a) for any a > 0.

• When the exponent a is very small, we need to look at very large values of n to see that n^a > ln n.

Page 43:

Values for log₁₀ n and n^0.01

n          log n    n^.01
1          0        1
1.00E+10   10       1.258925
1.00E+100  100      10
1.00E+200  200      100
1.00E+230  230      199.5262
1.00E+240  240      251.1886

Page 44:

Values for log₁₀ n and n^0.001

n          log n    n^.001
1          0        1
1.00E+10   10       1.023293
1.00E+100  100      1.258925
1E+1000    1000     10
1E+2000    2000     100
1E+3000    3000     1000
1E+4000    4000     10000
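The table's entries can be recomputed. Since n = 10^e overflows a float for e ≥ 309, the helper below (an illustrative name, not from the lecture) uses n^a = 10^(a·e) to stay in range:

```python
# Recompute the table's values without overflow: for n = 10**e,
# n**a equals 10**(a * e), while log10(n) is just e.
def tiny_power(e, a=0.001):
    return 10.0 ** (a * e)   # n**a with n = 10**e

print(round(tiny_power(10), 6))    # 1.023293
print(round(tiny_power(100), 6))   # 1.258925
print(round(tiny_power(1000), 6))  # 10.0
print(round(tiny_power(4000), 6))  # 10000.0
```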

Page 45:

Lower-order terms and constants

• Lower-order terms of a function do not matter, since they are dominated by the highest-order term.

• Constants (multiplied by the highest-order term) do not matter, since they do not affect the asymptotic growth rate.

• All logarithms with base b > 1 belong to Θ(lg n), since

  log_b n = lg n / lg b = c·lg n, where c = 1/lg b is a constant.

Page 46:

General Rules

• We say a function f(n) is polynomially bounded if f(n) = O(n^k) for some positive constant k.

• We say a function f(n) is polylogarithmically bounded if f(n) = O(lg^k n) for some positive constant k.

• Exponential functions
  – grow faster than positive polynomial functions.

• Polynomial functions
  – grow faster than polylogarithmic functions.

Page 47:

More properties

• The following slides show:
  – Two examples of pairs of functions that are not comparable in terms of asymptotic notation
  – How the asymptotic notation can be used in equations
  – That Theta, Big Oh, and Omega define a transitive and reflexive order; Theta also satisfies symmetry, while Big Oh and Omega satisfy transpose symmetry

Page 48:

Are n and n^(sin n) comparable with respect to growth rate? Yes.

sin n                       n^(sin n)
Increases from 0 to 1       Increases from 1 to n
Decreases from 1 to 0       Decreases from n to 1
Decreases from 0 to -1      Decreases from 1 to 1/n
Increases from -1 to 0      Increases from 1/n to 1

Clearly n^(sin n) = O(n), but n^(sin n) ∉ Ω(n).
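The oscillation is easy to confirm numerically (an illustration over a finite range, not a proof):

```python
import math

# n**sin(n) stays between 1/n and n (so it is O(n)), yet it keeps
# dipping far below any fixed fraction of n, so it is not Omega(n).
vals = [(n, n ** math.sin(n)) for n in range(2, 2000)]
print(all(1 / n <= v <= n for n, v in vals))   # True
print(min(v / n for n, v in vals) < 1e-2)      # True: v/n gets tiny
```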

Page 49:

[Chart: sin(n), n^sin(n), n, and 5n plotted for n from 0 to about 1100; n^sin(n) oscillates between 1/n and n, always staying below the line n.]

Page 50:

Are n and n^(sin n + 1) comparable with respect to growth rate? No.

sin n                       n^(sin n + 1)
Increases from 0 to 1       Increases from n to n²
Decreases from 1 to 0       Decreases from n² to n
Decreases from 0 to -1      Decreases from n to 1
Increases from -1 to 0      Increases from 1 to n

Clearly n^(sin n + 1) ∉ O(n) and n^(sin n + 1) ∉ Ω(n).

Page 51:

[Chart (log scale): n and n^(sin(n)+1) plotted for n from 0 to about 340; n^(sin(n)+1) oscillates between 1 and n². Are n and n^(sin n + 1) comparable? No.]

Page 52:

Another example

The following functions are not asymptotically comparable:

f(n) = n for n even, 1 for n odd
g(n) = 1 for n even, n for n odd

f(n) ≠ O(g(n)), and f(n) ≠ Ω(g(n)).
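The slide's pair can be written out directly; the ratio f/g swings between roughly n and 1/n forever, so neither bound can hold (an illustration, not a proof):

```python
# f and g from the slide: each is n on half the integers and 1 on the other.
def f(n):
    return n if n % 2 == 0 else 1

def g(n):
    return 1 if n % 2 == 0 else n

print(f(1000) / g(1000))  # 1000.0: f far above g on even n
print(f(1001) / g(1001))  # ~0.001: f far below g on odd n
```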

Page 53:

[Chart: f(n) and g(n) plotted for n from 0 to 18; the two sawtooth curves alternate, each taking the value n on half the integers and 1 on the other half.]

Page 54:

Transitivity:

If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), then f(n) = Θ(h(n)).

If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).

If f(n) = Ω(g(n)) and g(n) = Ω(h(n)), then f(n) = Ω(h(n)).

If f(n) = o(g(n)) and g(n) = o(h(n)), then f(n) = o(h(n)).

If f(n) = ω(g(n)) and g(n) = ω(h(n)), then f(n) = ω(h(n)).

Page 55:

Reflexivity:

f(n) = Θ(f(n)).

f(n) = O(f(n)).

f(n) = Ω(f(n)).

"o" is not reflexive.

"ω" is not reflexive.

Page 56:

Symmetry and Transpose symmetry

• Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).

• Transpose symmetry:
  f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
  f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Page 57:

Analogy between asymptotic comparison of functions and comparison of real numbers:

f(n) = O(g(n))  ~  a ≤ b
f(n) = Ω(g(n))  ~  a ≥ b
f(n) = Θ(g(n))  ~  a = b
f(n) = o(g(n))  ~  a < b
f(n) = ω(g(n))  ~  a > b

f(n) is asymptotically smaller than g(n) if f(n) = o(g(n)).
f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)).

Page 58:

Is O(g(n)) = Θ(g(n)) ∪ o(g(n))?

We show a counterexample. The functions are:

g(n) = n
and
f(n) = n if n is odd, 1 if n is even.

f(n) ∈ O(n), but f(n) ∉ Θ(n) and f(n) ∉ o(n).

Conclusion: O(g(n)) ≠ Θ(g(n)) ∪ o(g(n)).
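The counterexample can be checked concretely: f is bounded above by n everywhere, but along even n the ratio f(n)/n collapses to 0 (ruling out Theta) while along odd n it stays at 1 (ruling out little o). A finite-range illustration, not a proof:

```python
# Counterexample: f(n) = n for odd n, 1 for even n; g(n) = n.
def f(n):
    return n if n % 2 == 1 else 1

# f is in O(n): f(n) <= 1*n for all n >= 1.
print(all(f(n) <= n for n in range(1, 10_000)))  # True
# Not Theta(n): along even n, f(n)/n = 1/n -> 0, so no lower bound cn works.
# Not o(n): along odd n, f(n)/n = 1, so the ratio does not tend to 0.
print(f(9999) / 9999, f(10000) / 10000)  # 1.0 0.0001
```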