Lect4 DAA and Lect 5

This document is a lecture on asymptotic notation, including O-, Ω-, and Θ-notation, and methods for solving recurrences such as substitution, recursion trees, and the master method. It provides definitions and examples for each notation type and discusses how to analyze algorithms using these concepts. The lecture emphasizes the importance of understanding these notations for analyzing the efficiency of algorithms.

Introduction to Algorithms

6.046J/18.401J
LECTURE 2
Asymptotic Notation
• O-, Ω-, and Θ-notation
Recurrences
• Substitution method
• Iterating the recurrence
• Recursion tree
• Master method
Prof. Erik Demaine
September 12, 2005 Copyright © 2001-5 Erik D. Demaine and Charles E. Leiserson L2.1
Asymptotic notation
O-notation (upper bounds):
We write f(n) = O(g(n)) if there
exist constants c > 0, n₀ > 0 such
that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.

EXAMPLE: 2n² = O(n³) (c = 1, n₀ = 2)

Note: the "=" here is a funny, "one-way" equality,
and O-notation relates functions, not values.
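The definition can be spot-checked numerically. The sketch below (a hypothetical helper, not part of the lecture) verifies that the witnesses c = 1, n₀ = 2 from the example satisfy 0 ≤ 2n² ≤ 1·n³ over a range of n:

```python
# Spot-check the O-notation witnesses for 2n^2 = O(n^3):
# 0 <= f(n) <= c*g(n) must hold for all n >= n0.
def holds_up_to(f, g, c, n0, limit):
    """Check the bound 0 <= f(n) <= c*g(n) for n0 <= n < limit."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, limit))

print(holds_up_to(lambda n: 2 * n * n, lambda n: n ** 3, c=1, n0=2, limit=10_000))  # True
```

Note that n₀ = 2 is needed here: at n = 1 we have 2n² = 2 > 1 = n³, so the bound fails below the threshold.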
Set definition of O-notation
O(g(n)) = { f(n) : there exist constants
c > 0, n₀ > 0 such
that 0 ≤ f(n) ≤ c·g(n)
for all n ≥ n₀ }

EXAMPLE: 2n² ∈ O(n³)

(Logicians: λn.2n² ∈ O(λn.n³), but it's
convenient to be sloppy, as long as we
understand what's really going on.)
Ω-notation (lower bounds)
O-notation is an upper-bound notation. It
makes no sense to say f(n) is at least O(n²).

Ω(g(n)) = { f(n) : there exist constants
c > 0, n₀ > 0 such
that 0 ≤ c·g(n) ≤ f(n)
for all n ≥ n₀ }

EXAMPLE: √n = Ω(lg n) (c = 1, n₀ = 16)
Θ-notation (tight bounds)

Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

EXAMPLE: (1/2)n² - 2n = Θ(n²)
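A Θ-bound means explicit sandwich constants exist. The check below uses one possible choice of constants for the example (c₁ = 1/4, c₂ = 1/2, n₀ = 8; these are my illustration, not constants given in the lecture):

```python
# Spot-check that f(n) = n^2/2 - 2n is Theta(n^2):
# c1*n^2 <= f(n) <= c2*n^2 for all n >= n0, with c1 = 1/4, c2 = 1/2, n0 = 8.
def theta_holds(n):
    f = n * n / 2 - 2 * n
    return 0.25 * n * n <= f <= 0.5 * n * n

print(all(theta_holds(n) for n in range(8, 100_000)))  # True
```

The lower bound fails below n₀: at n = 7, f(7) = 10.5 while (1/4)·49 = 12.25, which is why a threshold is part of the definition.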
Solving recurrences
• The analysis of merge sort from Lecture 1
required us to solve a recurrence.
• Solving recurrences is like solving integrals,
differential equations, etc.: learn a few tricks.
• Lecture 3: Applications of recurrences to
divide-and-conquer algorithms.

Recursion-tree method
• A recursion tree models the costs (time) of a
recursive execution of an algorithm.
• The recursion-tree method can be unreliable,
just like any method that uses ellipses (…).
• The recursion-tree method promotes intuition,
however.
• The recursion-tree method is good for
generating guesses for the substitution method.

Example of recursion tree
Solve T(n) = T(n/4) + T(n/2) + n²:

Expanding the tree level by level:
Level 0:  n²                                     cost n²
Level 1:  (n/4)² + (n/2)²                        cost (5/16)n²
Level 2:  (n/16)² + (n/8)² + (n/8)² + (n/4)²     cost (25/256)n² = (5/16)²n²
…
Leaves:   Θ(1)

Total = n²(1 + 5/16 + (5/16)² + (5/16)³ + ⋯)
      = Θ(n²)   (geometric series)
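The Θ(n²) conclusion can be checked by evaluating the recurrence directly. The sketch below uses an assumed base case T(n) = 1 for n < 2 (the slides leave the base case implicit) and integer division, and confirms that T(n)/n² stays bounded by a constant, as the geometric-series total predicts:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Evaluate T(n) = T(n/4) + T(n/2) + n^2 with integer division
    and an assumed base case T(n) = 1 for n < 2."""
    if n < 2:
        return 1
    return T(n // 4) + T(n // 2) + n * n

# The ratio T(n)/n^2 should approach a constant (roughly the
# series limit 1/(1 - 5/16) = 16/11) rather than grow with n.
for k in (10, 15, 20):
    n = 2 ** k
    print(n, T(n) / (n * n))
```

Seeing the ratio settle near a constant is exactly the numerical signature of T(n) = Θ(n²).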
The master method

The master method applies to recurrences of
the form
T(n) = a T(n/b) + f(n),
where a ≥ 1, b > 1, and f is asymptotically
positive.
Three common cases
Compare f(n) with n^(log_b a):
1. f(n) = O(n^(log_b a - ε)) for some constant ε > 0.
• f(n) grows polynomially slower than n^(log_b a)
(by an n^ε factor).
Solution: T(n) = Θ(n^(log_b a)).
2. f(n) = Θ(n^(log_b a) lg^k n) for some constant k ≥ 0.
• f(n) and n^(log_b a) grow at similar rates.
Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).
Three common cases (cont.)
Compare f(n) with n^(log_b a):
3. f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0.
• f(n) grows polynomially faster than n^(log_b a)
(by an n^ε factor),
and f(n) satisfies the regularity condition that
a f(n/b) ≤ c f(n) for some constant c < 1.
Solution: T(n) = Θ(f(n)).
Examples

EX. T(n) = 4T(n/2) + n
a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n.
CASE 1: f(n) = O(n^(2 - ε)) for ε = 1.
∴ T(n) = Θ(n²).

EX. T(n) = 4T(n/2) + n²
a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n².
CASE 2: f(n) = Θ(n² lg^0 n), that is, k = 0.
∴ T(n) = Θ(n² lg n).
Examples
EX. T(n) = 4T(n/2) + n³
a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n³.
CASE 3: f(n) = Ω(n^(2 + ε)) for ε = 1
and 4(n/2)³ ≤ cn³ (reg. cond.) for c = 1/2.
∴ T(n) = Θ(n³).

EX. T(n) = 4T(n/2) + n²/lg n
a = 4, b = 2 ⇒ n^(log_b a) = n²; f(n) = n²/lg n.
Master method does not apply. In particular,
for every constant ε > 0, we have n^ε = ω(lg n).
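For the special case where the driving function is a plain polynomial f(n) = n^k, the case analysis used in the examples above can be sketched as a small function. This is an illustrative helper, not something from the lecture: it omits the lg^k n refinement of case 2, and it skips the regularity condition of case 3, which a polynomial f satisfies automatically when k > log_b a.

```python
import math

def master_poly(a, b, k):
    """Solve T(n) = a*T(n/b) + n^k by the master method,
    for polynomial driving functions f(n) = n^k only."""
    e = math.log(a, b)          # critical exponent log_b a
    if math.isclose(k, e):
        return f"Theta(n^{k} lg n)"   # case 2: f matches n^(log_b a)
    if k < e:
        return f"Theta(n^{e:g})"      # case 1: f polynomially slower
    return f"Theta(n^{k})"            # case 3: f polynomially faster

print(master_poly(4, 2, 1))  # Theta(n^2)
print(master_poly(4, 2, 2))  # Theta(n^2 lg n)
print(master_poly(4, 2, 3))  # Theta(n^3)
```

The three calls reproduce the first three worked examples; the fourth example, f(n) = n²/lg n, is deliberately outside this helper's scope, just as it is outside the master method's.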