Asymptotics (Big Oh)

    Data Structures & Algorithms, CS@VT, 2000-2009 McQuain

    Big-O Analysis

    Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is O(g(n)) provided that there are constants C > 0 and N > 0 such that for all n > N, f(n) ≤ C·g(n).

    Big-O expresses an upper bound on the growth rate of a function, for sufficiently large values of n.

    By the definition above, demonstrating that a function f is big-O of a function g requires that we find specific constants C and N for which the inequality holds (and show that the inequality does, in fact, hold).
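
    To make the "find C and N" recipe concrete, here is a minimal Python sketch; the functions f and g and the witnesses C = 4, N = 10 are assumptions chosen for illustration (not taken from the slides), and a finite check of course cannot replace a proof.

```python
# Sketch: numerically spot-check claimed Big-O witnesses C and N for an
# illustrative pair of functions (assumed here, not from the slides).

def f(n):          # example function: f(n) = 3n + 10
    return 3 * n + 10

def g(n):          # candidate bounding function: g(n) = n
    return n

C, N = 4, 10       # claimed witnesses: f(n) <= C*g(n) for all n > N

# A finite check cannot prove the claim, but it can expose a bad guess.
assert all(f(n) <= C * g(n) for n in range(N + 1, 10_000)), "witnesses fail"
print("f(n) <= 4*g(n) held for all tested n in (10, 10000)")
```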


    Big-O Example

    Consider the following function:  T(n) = (5/2)n² + (5/2)n + 2

    We might guess that:  T(n) ≤ 5n² for all n ≥ 2

    We could easily verify the guess by induction:

    If n = 2, then T(2) = 17, which is less than 5·2² = 20, so the guess is valid if n = 2.

    Assume that for some n ≥ 2, T(n) ≤ 5n². Then:

        T(n+1) = (5/2)(n+1)² + (5/2)(n+1) + 2
               = (5/2)n² + 5n + 5/2 + (5/2)n + 5/2 + 2
               = T(n) + 5n + 5
               ≤ 5n² + 5n + 5        (by the inductive assumption)
               ≤ 5n² + 10n + 5
               = 5(n+1)²

    Thus, by induction, T(n) ≤ 5n² for all n ≥ 2. So, by definition, T(n) is O(n²).
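
    A quick numeric spot-check of the guess is easy to run; this is a sketch only, with an arbitrarily chosen test range, and it complements rather than replaces the induction above.

```python
# Spot-check the claim T(n) <= 5*n^2 for n >= 2 over a finite range.
# Fractions avoid floating-point noise.
from fractions import Fraction

def T(n):
    return Fraction(5, 2) * n * n + Fraction(5, 2) * n + 2

for n in range(2, 1000):
    assert T(n) <= 5 * n * n, f"bound fails at n={n}"

print("T(n) <= 5*n^2 verified for 2 <= n < 1000; T(2) =", T(2))
```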


    Making the Guess

    The obvious question is "how do we come up with the guess in the first place"?

    Here's one possible analysis (which falls a bit short of being a proof):

        T(n) = (5/2)n² + (5/2)n + 2
             ≤ (5/2)n² + (5/2)n²        (replace n with n², subtract the 2)
             = 5n²

    The middle step seems sound since, if n ≥ 2, then n ≤ n²; substituting n² for n thus adds at least 5 to the expression, so subtracting the 2 should still result in a larger value.


    Big-O Theorems

    For all the following theorems, assume that f(n) is a non-negative function of n and that K is an arbitrary constant.

    Theorem 1: K is O(1).

    Theorem 2: A polynomial is O(the term containing the highest power of n).

        f(n) = 7n⁴ + 3n² + 5n + 1000 is O(7n⁴)

    Theorem 3: K*f(n) is O(f(n))   [i.e., constant coefficients can be dropped]

        g(n) = 7n⁴ is O(n⁴)

    Theorem 4: If f(n) is O(g(n)) and g(n) is O(h(n)) then f(n) is O(h(n)).   [transitivity]

        f(n) = 7n⁴ + 3n² + 5n + 1000 is O(n⁴)
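
    As a numeric companion to Theorems 2-4, the following sketch checks that the example polynomial is eventually bounded by a constant multiple of n⁴; the witnesses C = 8 and N = 10 are hand-picked assumptions, not values from the slides.

```python
# Illustration of Theorems 2-4: f(n) = 7n^4 + 3n^2 + 5n + 1000 is eventually
# dominated by C * n^4.  C = 8 and N = 10 are hand-picked witnesses.

def f(n):
    return 7 * n**4 + 3 * n**2 + 5 * n + 1000

C, N = 8, 10
assert all(f(n) <= C * n**4 for n in range(N + 1, 5000)), "bound violated"
print("7n^4 + 3n^2 + 5n + 1000 <= 8*n^4 for all tested n > 10")
```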


    Big-O Theorems

    Theorem 5: Each of the following functions is strictly big-O of its successors (listed from smaller to larger growth rate):

        K                  [constant]
        logb(n)            [always log base 2 if no base is shown]
        n
        n·logb(n)
        n²
        n to higher powers
        2ⁿ
        3ⁿ
        larger constants to the n-th power
        n!                 [n factorial]
        nⁿ

    For example:  f(n) = 3n·log(n) is O(n·log(n)) and O(n²) and O(2ⁿ)
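
    The short sketch below tabulates the Theorem 5 hierarchy at a few arbitrarily chosen input sizes, purely to give a feel for how quickly the later entries outgrow the earlier ones.

```python
# Rough numeric illustration of Theorem 5's ordering.  Values at a few sample
# sizes only hint at the asymptotics; the sample sizes are arbitrary.
import math

funcs = [
    ("log2(n)",   lambda n: math.log2(n)),
    ("n",         lambda n: n),
    ("n*log2(n)", lambda n: n * math.log2(n)),
    ("n^2",       lambda n: n**2),
    ("2^n",       lambda n: 2**n),
    ("n!",        lambda n: math.factorial(n)),
    ("n^n",       lambda n: n**n),
]

for n in (4, 8, 16):
    row = ", ".join(f"{name}={fn(n):.6g}" for name, fn in funcs)
    print(f"n={n}: {row}")
```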


    Big-O Theorems

    Theorem 6: In general, f(n) is big-O of the dominant term of f(n), where dominant may usually be determined from Theorem 5.

    Theorem 7: For any base b, logb(n) is O(log(n)).

        f(n) = 7n² + 3n·log(n) + 5n + 1000 is O(n²)

        g(n) = 7n⁴ + 3ⁿ + 1000000 is O(3ⁿ)

        h(n) = 7n(n + log(n)) is O(n²)
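
    A small sketch of Theorem 7's content: for any base b, log_b(n)/log₂(n) is the constant 1/log₂(b), so the base never affects the big-O class. The bases and sample points below are arbitrary choices.

```python
# Theorem 7 illustration: log_b(n) differs from log2(n) only by the constant
# factor 1/log2(b), so the base cannot change the big-O classification.
import math

for b in (3, 10, 16):
    ratios = [round(math.log(n, b) / math.log2(n), 6)
              for n in (10, 10_000, 10_000_000)]
    print(f"base {b}: ratios {ratios} vs 1/log2({b}) = {1 / math.log2(b):.6f}")
```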


    Big-Omega

    In addition to big-O, we may seek a lower bound on the growth of a function:

    Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is Ω(g(n)) provided that there are constants C > 0 and N > 0 such that for all n > N, f(n) ≥ C·g(n).

    Big-Ω expresses a lower bound on the growth rate of a function, for sufficiently large values of n.

    Analogous theorems can be proved for big-Ω.
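
    For symmetry with the earlier big-O check, here is a minimal sketch of a big-Ω witness; the pair of functions and the constants C = 1, N = 2 are illustrative assumptions.

```python
# Sketch of an Omega witness: n*log2(n) >= 1*n for all n > 2, so n*log2(n)
# is Omega(n).  Functions and constants are assumptions for illustration.
import math

C, N = 1, 2
assert all(n * math.log2(n) >= C * n for n in range(N + 1, 10_000))
print("n*log2(n) >= n for all tested n > 2, witnessing Omega(n)")
```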


    Big-Theta

    Finally, we may have two functions that grow at essentially the same rate:

    Definition: Suppose that f(n) and g(n) are nonnegative functions of n. Then we say that f(n) is Θ(g(n)) provided that f(n) is O(g(n)) and also that f(n) is Ω(g(n)).

    If f is Θ(g) then, from some point on, f is bounded below by one multiple of g and bounded above by another multiple of g (and vice versa).

    So, in a very basic sense, f and g grow at the same rate.


    Order and Limits

    The task of determining the order of a function is simplified considerably by the following result:

    Theorem 8: f(n) is Θ(g(n)) if

        lim (n→∞) f(n)/g(n) = c,  where 0 < c < ∞

    Recall Theorem 7; we may easily prove it (and a bit more) by applying Theorem 8, using l'Hôpital's rule for the middle step:

        lim (n→∞) logb(n)/log(n)  =  lim (n→∞) [1/(n·ln(b))] / [1/(n·ln(2))]  =  lim (n→∞) ln(2)/ln(b)  =  ln(2)/ln(b)

    The last term is finite and positive, so logb(n) is Θ(log(n)) by Theorem 8.

    Corollary: if the limit above is 0 then f(n) is strictly O(g(n)), and if the limit is ∞ then f(n) is strictly Ω(g(n)).
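
    A numeric sketch of how Theorem 8 gets used in practice: estimate the ratio f(n)/g(n) at growing n and see whether it settles near a positive, finite constant. The pair below is an assumed example, not one from the slides.

```python
# Numerically estimating the Theorem 8 limit for an illustrative pair:
# a ratio settling near a positive, finite constant suggests f is Theta(g).

def f(n):
    return 2 * n * n + 3 * n

def g(n):
    return n * n

for n in (10, 1_000, 100_000, 10_000_000):
    print(f"n = {n:>10}: f(n)/g(n) = {f(n) / g(n):.6f}")
# The printed ratios approach 2, a constant c with 0 < c < infinity.
```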


    Order and Limits

    The converse of Theorem 8 is false. However, it is possible to prove:

    Theorem 9: If f(n) is Θ(g(n)) then

        lim (n→∞) f(n)/g(n) = c,  where 0 < c < ∞,

    provided that the limit exists.

    A similar extension of the preceding corollary also follows.


    More Theorems

    Many of the big-O theorems may be strengthened to statements about big-Θ:

    Theorem 10: If K > 0 is a constant, then K is Θ(1).

    Theorem 11: A polynomial is Θ(the highest power of n).

    proof: Suppose a polynomial of degree k. Then we have:

        lim (n→∞) (a_k·n^k + a_(k-1)·n^(k-1) + … + a_1·n + a_0) / n^k
            = lim (n→∞) (a_k + a_(k-1)/n + … + a_1/n^(k-1) + a_0/n^k)
            = a_k

    Now a_k > 0, since we assume the function is nonnegative. So, by Theorem 8, the polynomial is Θ(n^k).

    QED
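
    The proof's limit can also be watched numerically; in the sketch below the cubic polynomial and its leading coefficient 4 are assumptions chosen for illustration.

```python
# Illustrating the proof of Theorem 11: p(n)/n^k tends to the leading
# coefficient a_k.  The example polynomial is an assumption for this sketch.

def p(n):
    return 4 * n**3 + 7 * n**2 + 5 * n + 1000

k, a_k = 3, 4
for n in (10, 1_000, 100_000):
    print(f"n = {n:>7}: p(n)/n^{k} = {p(n) / n**k:.6f}  (leading coeff = {a_k})")
```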


    More Theorems

    Theorems 3, 6 and 7 can be similarly extended:

    Theorem 12: K*f(n) is Θ(f(n))   [i.e., constant coefficients can be dropped]

    Theorem 13: In general, f(n) is big-Θ of the dominant term of f(n), where dominant may usually be determined from Theorem 5.

    Theorem 14: For any base b, logb(n) is Θ(log(n)).


    Strict Comparisons

    For convenience, we will say that:

    - f is strictly O(g) if and only if f is O(g) but f is not Θ(g)
    - f is strictly Ω(g) if and only if f is Ω(g) but f is not Θ(g)

    For example, n·log(n) is strictly O(n²) by Theorem 8 and its corollary, because:

        lim (n→∞) (n·log(n))/n²  =  lim (n→∞) log(n)/n  =  lim (n→∞) [1/(n·ln(2))] / 1  =  0
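
    A numeric look at the same limit: the ratio (n·log₂ n)/n² drifts toward 0, which is the "strictly O" signature. The sample points below are arbitrary.

```python
# The ratio (n*log2(n)) / n^2 heading toward 0 illustrates "strictly O":
# n*log2(n) is O(n^2) but not Theta(n^2).
import math

for n in (10, 1_000, 100_000, 10_000_000):
    print(f"n = {n:>10}: (n*log2 n)/n^2 = {n * math.log2(n) / n**2:.8f}")
```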


    Big-Θ Is an Equivalence Relation

    Theorem 15: f(n) is Θ(f(n)).   [reflexivity]

    Theorem 16: If f(n) is Θ(g(n)) then g(n) is Θ(f(n)).   [symmetry]

    Theorem 17: If f(n) is Θ(g(n)) and g(n) is Θ(h(n)) then f(n) is Θ(h(n)).   [transitivity]

    By Theorems 15-17, Θ is an equivalence relation on the set of positive-valued functions. The equivalence classes represent fundamentally different growth rates.

    Algorithms whose complexity functions belong to the same class are essentially equivalent in performance (up to constant multiples, which are not unimportant in practice).


    Applications to Algorithm Analysis

    Ex 1: An algorithm with complexity function

        T(n) = 3n² + (5/2)n + 3/2

    is Θ(n²) by Theorem 11.

    Ex 2: An algorithm with complexity function

        T(n) = 3n·log(n) + 4n·log(2)

    is O(n·log(n)) by Theorem 5. Furthermore, the algorithm is also Θ(n·log(n)) by Theorem 8, since:

        lim (n→∞) T(n)/(n·log(n))  =  lim (n→∞) (3 + 4·log(2)/log(n))  =  3

    For most common complexity functions, it's this easy to determine the big-O and/or big-Θ complexity using the given theorems.
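
    The Ex 2 ratio can also be sampled numerically; convergence toward 3 is slow, as expected since the correction term shrinks only like 1/log(n). The sample sizes below are arbitrary.

```python
# Checking Ex 2 numerically: T(n)/(n*log2(n)) should creep toward 3.
# Log base 2 follows the slides' convention; log2(2) = 1 is kept to mirror
# the formula as written.
import math

def T(n):
    return 3 * n * math.log2(n) + 4 * n * math.log2(2)

for n in (10, 10_000, 10_000_000):
    print(f"n = {n:>10}: T(n)/(n*log2 n) = {T(n) / (n * math.log2(n)):.6f}")
```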


    Complexity of Linear Storage

    For a contiguous list of N elements, assuming each is equally likely to be the target of asearch:

    - average search cost is (N) if list is randomly ordered

    - average search cost is (log N) is list is sorted

    - average random insertion cost is (N)

    - insertion at tail is (1)

    For a linked list of N elements, assuming each is equally likely to be the target of asearch:

    - average search cost is (N), regardless of list ordering

    - average random insertion cost is (1), excluding search time
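
    As a rough illustration of the Θ(N) versus Θ(log N) search costs listed above, here is a minimal sketch (not from the course materials) counting probes made by linear and binary search over a Python list standing in for a sorted contiguous list; the list size and target are arbitrary.

```python
# Minimal sketch: probe counts for linear vs. binary search on a sorted
# contiguous list, illustrating Theta(N) vs. Theta(log N) search cost.

def linear_search(data, target):
    probes = 0
    for i, value in enumerate(data):
        probes += 1
        if value == target:
            return i, probes
    return -1, probes

def binary_search(data, target):
    lo, hi, probes = 0, len(data) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if data[mid] == target:
            return mid, probes
        if data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes

N = 1_000_000
data = list(range(N))        # stands in for a sorted contiguous list
target = 777_777             # arbitrary element to look up
print("linear search probes:", linear_search(data, target)[1])
print("binary search probes:", binary_search(data, target)[1])
```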


    Most Common Complexity Classes

    Theorem 5 lists a collection of representatives of distinct big-Θ equivalence classes:

        K                  [constant]
        logb(n)            [always log base 2 if no base is shown]
        n
        n·logb(n)
        n²
        n to higher powers
        2ⁿ
        3ⁿ
        larger constants to the n-th power
        n!                 [n factorial]
        nⁿ

    Most common algorithms fall into one of these classes. Knowing this list gives some sense of how to compare algorithms and choose the right one.

    The following charts provide some visual indication of how significant the differences are.


    Graphical Comparison

    [Chart: Common Growth Curves. Curves for log n, n, n log n, n², n³, 2ⁿ, and 10ⁿ plotted against input size n = 1 to 11; the vertical axis runs from 0 to about 1200.]


    Lower-order Classes

    [Chart: Low-order Curves. Curves for log n, n, n log n, and n² plotted against small input sizes n; the vertical axis runs from 0 to about 120.]

    For significantly large values of n, only these classes are truly practical, and whether n² is practical is debated.


    Proof of Theorem 8

    Theorem 8: f(n) is Θ(g(n)) if

        lim (n→∞) f(n)/g(n) = c,  where 0 < c < ∞

    Suppose that f and g are non-negative functions of n, and that

        lim (n→∞) f(n)/g(n) = c,  where 0 < c < ∞.

    Then, from the definition of the limit, for every ε > 0 there exists an N > 0 such that whenever n > N:

        | f(n)/g(n) - c | < ε,   from which we have   (c - ε)·g(n) < f(n) < (c + ε)·g(n)

    Let ε = c/2; then we have that:

        f(n) < (3c/2)·g(n),   whence f is O(g)

        (c/2)·g(n) < f(n),   whence f is Ω(g)

    Therefore, by definition, f is Θ(g).
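
    As a concrete (assumed) instance of the ε = c/2 sandwich, take f(n) = 2n² + 3n and g(n) = n², so c = 2; the sketch below checks the two inequalities over a finite range with a hand-picked threshold N = 3.

```python
# Sandwich from the proof of Theorem 8, for an assumed pair with c = 2:
# f(n) = 2n^2 + 3n, g(n) = n^2.  With eps = c/2 the bounds
# (c/2)*g(n) < f(n) < (3c/2)*g(n) should hold for all sufficiently large n.

def f(n): return 2 * n * n + 3 * n
def g(n): return n * n

c = 2.0
eps = c / 2
N = 3                                   # hand-picked threshold for this pair
for n in range(N + 1, 100_000):
    assert (c - eps) * g(n) < f(n) < (c + eps) * g(n)
print("(c/2)*g(n) < f(n) < (3c/2)*g(n) held for all tested n > 3")
```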

