
Lecture 03

The document outlines a lecture on asymptotic notations for algorithm analysis, focusing on the concepts of order of growth, including O, Ω, and Θ notations. It provides examples and exercises to illustrate the differences in growth rates of various functions and algorithms. The lecture aims to help students understand how to analyze the efficiency of algorithms based on their running time and input size.


CS483-03 Asymptotic Notations for Algorithm Analysis

Instructor: Fei Li

Room 443 ST II

Office hours: Tue. & Thur. 4:30pm - 5:30pm or by appointment

lifei@cs.gmu.edu with subject: CS483

http://www.cs.gmu.edu/~lifei/teaching/cs483 fall07/

Parts of these lecture notes are based on Dr. J.M. Lien’s lecture notes.

CS483 Design and Analysis of Algorithms 1 Lecture 03, September 4, 2007


Outline

➣ Review
➣ Order of growth
➣ O, Ω and Θ
➣ Examples and exercises



Review

In last class

➣ Analysis framework
  • Input size
  • Basic operation
  • Running time

      T(n) ≈ c_op · C(n),

    where
    – n is the input size,
    – C(n) is the total number of basic operations for input of size n,
    – c_op is the time needed to execute one single basic operation.

  • Worst-case, best-case, average-case
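As an illustration of this framework (my example, not from the slides), one can count a chosen basic operation explicitly and then scale by an assumed c_op. Here the basic operation is the comparison in a sequential search, and c_op = 1 ns is a made-up constant:

```python
# Count basic operations (here: element comparisons) for sequential search,
# then estimate running time as T(n) ≈ c_op * C(n).

def sequential_search_count(a, key):
    """Return the number of comparisons made while searching for key."""
    comparisons = 0
    for x in a:
        comparisons += 1          # one basic operation per element examined
        if x == key:
            break
    return comparisons

n = 1000
a = list(range(n))
C_worst = sequential_search_count(a, -1)   # key absent: worst case, C(n) = n
c_op = 1e-9                                # assumed cost: 1 ns per comparison
T = c_op * C_worst                         # estimated running time in seconds
print(C_worst, T)
```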



Outline

➣ Review
➣ Order of growth
➣ O, Ω and Θ
➣ Examples and exercises



Orders of Growth

➣ Running time T(n)

    T(n) ≈ c_op · C(n),

  where n is the input size, C(n) is the total number of basic operations for
  input of size n, and c_op is the time needed to execute one single basic
  operation.

➣ Example 1
  Given that C(n) = n(n − 1)/2, how much will the performance be affected
  if the input size n is doubled?

  growth = T(2n)/T(n) ≈ (c_op · C(2n)) / (c_op · C(n)) = (4n − 2)/(n − 1) ≈ 4
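The factor of 4 is easy to confirm numerically; a quick sketch (not part of the original slides):

```python
# For C(n) = n(n-1)/2, the growth factor T(2n)/T(n) = C(2n)/C(n)
# (the constant c_op cancels) approaches 4 as n grows.

def C(n):
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    ratio = C(2 * n) / C(n)    # equals (4n - 2)/(n - 1)
    print(n, ratio)
```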



Orders of Growth

➣ Running time T(n)

    T(n) ≈ c_op · C(n),

  where n is the input size, C(n) is the total number of basic operations for
  input of size n, and c_op is the time needed to execute one single basic
  operation.

➣ Example 2
  Given an algorithm A with C_A(n) = 10n and another algorithm B with
  C_B(n) = n(n − 1)/2, which algorithm is better?
  Answer: B, if n ≤ 21; A, if n > 21.
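The crossover point can be found by brute force; a quick sketch (the search bound of 100 is an arbitrary assumption):

```python
# Find the crossover point between C_A(n) = 10n and C_B(n) = n(n-1)/2:
# B's operation count stays at or below A's while n <= 21.

def C_A(n):
    return 10 * n

def C_B(n):
    return n * (n - 1) // 2

crossover = max(n for n in range(1, 100) if C_B(n) <= C_A(n))
print(crossover)   # → 21: the last n at which B is at least as good as A
```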



Compare A and B

[Plot: running time vs. input size n (0 to 40) for C_A(n) = 10n and
C_B(n) = 0.5n(n − 1); the two curves cross at n = 21.]



Orders of Growth

➣ Some commonly seen functions representing the number of basic
  operations C(n):

  1. n (linear)
  2. n² (quadratic)
  3. n³ (cubic)
  4. log₁₀(n) (logarithmic)
  5. n log₁₀(n) (n-log-n)
  6. (log₁₀ n)² (squared logarithm)
  7. √n (square root of n)
  8. 2ⁿ (exponential)
  9. n! (factorial of n; the notation dates to 1808, by Christian Kramp)

➣ Can you order them by their growth rate?

Orders of Growth

n        n²       n³        2ⁿ             n!
10       10²      10³       1024           3.6 × 10⁶
100      10⁴      10⁶       1.3 × 10³⁰     9.3 × 10¹⁵⁷
1000     10⁶      10⁹       1.1 × 10³⁰¹
10000    10⁸      10¹²



n        log₁₀(n)    n log₁₀(n)    (log₁₀ n)²    √n
10       1           10            1             3.16
100      2           200           4             10
1000     3           3000          9             31.6
10000    4           40000         16            100

Order the functions by their growth rate:

log₁₀(n) < (log₁₀ n)² < √n < n < n log₁₀(n) < n² < n³ < 2ⁿ < n!
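This ordering can be spot-checked numerically at n = 100 (a sanity check, not a proof; n = 100 happens to be large enough for every adjacent pair to separate):

```python
import math

n = 100
funcs = [
    ("log10(n)",    math.log10(n)),
    ("log10(n)^2",  math.log10(n) ** 2),
    ("sqrt(n)",     math.sqrt(n)),
    ("n",           n),
    ("n*log10(n)",  n * math.log10(n)),
    ("n^2",         n ** 2),
    ("n^3",         n ** 3),
    ("2^n",         2 ** n),
    ("n!",          math.factorial(n)),
]
values = [v for _, v in funcs]
# the listed order coincides with increasing value at n = 100
assert values == sorted(values)
print([name for name, _ in funcs])
```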



Power of Growing Exponentially

from http://www.ideum.com/portfolio

➣ The king owes Shashi 10,000,000,000,000,000,000 grains of rice.



Outline

➣ Review
➣ Order of growth
➣ O, Ω and Θ
➣ Examples and exercises



Asymptotic Notations for Orders of Growth

➣ Ignore constant factors


➣ Ignore small input sizes
➣ Focus on order of growth
  • O(g(n)): the set of functions f(n) that grow no faster than g(n).
  • Ω(g(n)): the set of functions f(n) that grow at least as fast as g(n).
  • Θ(g(n)): the set of functions f(n) that grow at the same rate as g(n).



Asymptotic Notations

• O(g(n)) := {f(n) | there exist positive constants c and n₀ such that
  0 ≤ f(n) ≤ c · g(n) for all n ≥ n₀}.

  f(n) ∈ O(g(n)): f(n) grows no faster than g(n).

• Ω(g(n)) := {f(n) | there exist positive constants c and n₀ such that
  0 ≤ c · g(n) ≤ f(n) for all n ≥ n₀}.

  f(n) ∈ Ω(g(n)): f(n) grows at least as fast as g(n).

• Θ(g(n)) := {f(n) | there exist positive constants c₁, c₂, and n₀ such that
  c₁ · g(n) ≤ f(n) ≤ c₂ · g(n) for all n ≥ n₀}.

  f(n) ∈ Θ(g(n)): f(n) grows at the same rate as g(n).
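These definitions can be exercised directly: to show f(n) ∈ O(g(n)), exhibit witnesses c and n₀ and check the inequality. A small sketch (the witnesses c = 3 and n₀ = 3 for f(n) = 2n² − 5n + 1 and g(n) = n² are my choice, not from the slides, and the finite loop is a spot-check rather than a proof):

```python
def holds_O(f, g, c, n0, n_max=10_000):
    """Check 0 <= f(n) <= c*g(n) for all n0 <= n <= n_max
    (a finite spot-check of the O-definition, not a proof)."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 2 * n * n - 5 * n + 1
g = lambda n: n * n

# witnesses c = 3, n0 = 3 certify f(n) ∈ O(n^2) on the tested range
assert holds_O(f, g, c=3, n0=3)
# starting too early fails: f(1) = -2 violates 0 <= f(n)
assert not holds_O(f, g, c=3, n0=1)
```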



Asymptotic Notations

[Figure: three plots illustrating the definitions. Top: f(n) lies between
c₁g(n) and c₂g(n) for all n ≥ n₀, so f(n) ∈ Θ(g(n)). Bottom left: f(n)
stays below cg(n) for all n ≥ n₀, so f(n) ∈ O(g(n)). Bottom right: f(n)
stays above cg(n) for all n ≥ n₀, so f(n) ∈ Ω(g(n)).]


Examples

➣ O(g(n)) := {f(n) | there exist positive constants c and n₀ such that
  0 ≤ f(n) ≤ c · g(n) for all n ≥ n₀}.

  • 2n² − 5n + 1 ∈ O(n²)
  • 2ⁿ + n¹⁰⁰ − 2 ∈ O(n!)
  • 2n + 6 ∉ O(log n)

➣ Ω(g(n)) := {f(n) | there exist positive constants c and n₀ such that
  0 ≤ c · g(n) ≤ f(n) for all n ≥ n₀}.

  • 2n² − 5n + 1 ∈ Ω(n²)
  • 2ⁿ + n¹⁰⁰ − 2 ∉ Ω(n!)
  • 2n + 6 ∈ Ω(log n)

➣ For any two functions f(n) and g(n), f(n) = Θ(g(n)) if and only if
  f(n) = O(g(n)) and f(n) = Ω(g(n)).

  • 2n² − 5n + 1 ∈ Θ(n²)
  • 2ⁿ + n¹⁰⁰ − 2 ∉ Θ(n!)
  • 2n + 6 ∉ Θ(log n)


Establish Order of Growth

lim_{n→∞} f(n)/g(n) =
  • 0:      f(n) has a smaller order of growth than g(n)
  • c > 0:  f(n) has the same order of growth as g(n)
  • ∞:      f(n) has a larger order of growth than g(n)

2n² − 5n + 1 vs. n²
2ⁿ + n¹⁰⁰ − 2 vs. n!
2n + 6 vs. log n
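Evaluating the ratio f(n)/g(n) at growing n suggests each limit before proving it; a numerical sketch of the three comparisons (the sample values of n are arbitrary, and the middle ratio only becomes small once n! overtakes n¹⁰⁰):

```python
import math

pairs = [
    ("2n^2-5n+1 vs n^2",  lambda n: (2*n*n - 5*n + 1) / (n*n)),
    ("2^n+n^100-2 vs n!", lambda n: (2**n + n**100 - 2) / math.factorial(n)),
    ("2n+6 vs log n",     lambda n: (2*n + 6) / math.log(n)),
]
for name, ratio in pairs:
    # ratios head toward 2 (same order), 0 (smaller), and ∞ (larger)
    print(name, [ratio(n) for n in (10, 100, 500)])
```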



Establish Order of Growth

• L’Hôpital’s rule
  If lim_{n→∞} f(n) = lim_{n→∞} g(n) = ∞ and the derivatives f′ and g′
  exist, then

    lim_{n→∞} f(n)/g(n) = lim_{n→∞} f′(n)/g′(n)

• Stirling’s formula

    n! ≈ √(2πn) · (n/e)ⁿ

  where e ≈ 2.718 is the base of the natural logarithm and π ≈ 3.1415.
  More precisely,

    √(2πn) · (n/e)ⁿ ≤ n! ≤ √(2πn) · (n/e)ⁿ · e^(1/(12n))
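Both the approximation and the two-sided bound can be verified numerically; a small sketch (not part of the slides):

```python
import math

def stirling(n):
    """√(2πn) · (n/e)^n, the leading term of Stirling's formula."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (10, 50, 100):
    lo = stirling(n)
    hi = stirling(n) * math.exp(1 / (12 * n))   # upper correction factor
    assert lo <= math.factorial(n) <= hi
    print(n, math.factorial(n) / stirling(n))   # ratio approaches 1
```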



Some Facts on Growth of Order

• A weak version of Stirling’s formula:

    (n/3)ⁿ < n! < (n/2)ⁿ, if n ≥ 6.

• Another view:

    ln n! ≈ n ln n − n + (ln n)/2 + (ln 2π)/2
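Both facts are easy to spot-check with the standard library, using math.lgamma(n + 1) = ln n!:

```python
import math

# Weak Stirling bounds: (n/3)^n < n! < (n/2)^n for n >= 6.
for n in range(6, 30):
    assert (n / 3) ** n < math.factorial(n) < (n / 2) ** n

# ln n! ≈ n ln n − n + (ln n)/2 + (ln 2π)/2, with math.lgamma(n+1) = ln n!.
n = 1000
approx = n * math.log(n) - n + math.log(n) / 2 + math.log(2 * math.pi) / 2
print(approx, math.lgamma(n + 1))   # the two agree closely
```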
• 12! and 20! are the largest factorials that fit in 32-bit and 64-bit
  integers, respectively. (20! = 2432902008176640000)

• A googol is 10¹⁰⁰, and 70! ≈ 1.198 googol: there are 70! ways to arrange
  70 people in a row.

• A googol is greater than the number of atoms in the observable universe,
  which has been variously estimated from 10⁷⁹ up to 10⁸¹.



Exercises

2n² − 5n + 1 vs. n²
2ⁿ + n¹⁰⁰ − 2 vs. n!
2n + 6 vs. log n



Exercises

➣ All logarithmic functions logₐ n belong to the same class Θ(log n), no
  matter what the base a > 1 is.

➣ All polynomials of the same degree k belong to the same class:

    aₖnᵏ + aₖ₋₁nᵏ⁻¹ + · · · + a₁n + a₀ ∈ Θ(nᵏ), where aₖ > 0.

➣ Exponential functions aⁿ have different orders of growth for different
  a’s, i.e., 2ⁿ ∉ Θ(3ⁿ).

➣ Θ(log n) < Θ(nᵃ) < Θ(aⁿ) < Θ(n!) < Θ(nⁿ), where a > 1.
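The first fact follows from logₐ n = ln n / ln a: any two log bases differ only by the constant factor ln b / ln a, which Θ-notation absorbs. A quick sketch:

```python
import math

# log2(n) / log10(n) = ln 10 / ln 2, a constant independent of n,
# so all logarithmic functions fall in the same Θ class.
for n in (10, 1000, 10**6):
    ratio = math.log2(n) / math.log10(n)
    assert abs(ratio - math.log(10) / math.log(2)) < 1e-9
print(math.log(10) / math.log(2))   # the constant, ≈ 3.3219
```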

