
Algorithms and Data Structures

Lecture slides: Asymptotic notations and growth rate of functions, Brassard Chap. 3

Lecturer: Michel Toulouse
Hanoi University of Science & Technology
michel.toulouse@soict.hust.edu.vn
Topics outline

Introduction

Big O-Notation

Omega-Notation

Theta-Notation

Some exercises
Running time

In the previous lecture, the running time of an algorithm was obtained
by figuring out the worst-case instance and then counting the number
of times the most frequent basic operation is executed.

We agreed this was not an exact measurement of the execution time
of an algorithm because

1. we count as "basic" pseudo-code operations that may take quite
   different numbers of machine operations compared to other
   pseudo-code operations;
2. we do not count all the basic operations of an algorithm, but only
   the one that is executed most often.
Further abstraction of running time

When counting the number of times a basic operation is executed within
an algorithm, we end up with an expression like an^2 + bn + c, which
again tells us how many times a basic instruction has been executed.

Here we move even further in abstracting measurements away from real
execution time by dropping all the constants and lower-order terms from
expressions like the one above.

So we end up with an expression like n^2 in place of an^2 + bn + c.

Clearly n^2 does not describe the number of times an instruction is
executed, so what is it?
From running time to growth rate ("orders")

The dominant term (n^2) in an expression describing the number of
times an instruction is executed (such as this one: an^2 + bn + c) is
called the rate of growth, or order of growth, of the running time.

The growth rate is an indication of how fast the running time
increases when the input size n of an algorithm increases.

The growth rate is also a bound on the running time of an algorithm:
if we put a sufficiently large constant d in front of n^2, then
dn^2 > an^2 + bn + c for all sufficiently large n, because n^2 grows
faster than n and, of course, faster than any constant c.
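For instance, taking d = a + b + c already works for every n ≥ 1 (assuming the coefficients a, b, c are non-negative), since

    an^2 + bn + c ≤ an^2 + bn^2 + cn^2 = (a + b + c) n^2 = d n^2.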

So from now on our main task will be to find the growth rate of an
algorithm, and to compare the running times of different algorithms
using their respective growth rates.
Frequent "orders" or "growth rates"

Some orders occur so frequently that we give them a name.

▶ Logarithmic algorithm: an algorithm that never executes more than
  c log n basic operations; its run time is in the order of log n, or
  logarithmic time.
▶ Linear algorithm: an algorithm that never executes more than cn
  basic operations; its run time is in the order of n, or linear time.
▶ Quadratic algorithm: an algorithm that never executes more than cn^2
  basic operations; its run time is in the order of n^2, or quadratic time.
▶ Cubic, polynomial or exponential algorithms: algorithms whose run
  times are in the order of n^3, n^k or c^n, respectively.
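To make these named orders concrete, here is a minimal C sketch (illustrative only; the counter stands in for "one basic operation") of loops whose operation counts are logarithmic, linear and quadratic in n:

    #include <stdio.h>

    int main(void) {
        long n = 1000, count;

        /* Logarithmic: the loop variable doubles, so about log2(n) iterations */
        count = 0;
        for (long i = 1; i < n; i *= 2) count++;
        printf("logarithmic loop: %ld operations\n", count);

        /* Linear: one pass over n elements, so n iterations */
        count = 0;
        for (long i = 0; i < n; i++) count++;
        printf("linear loop:      %ld operations\n", count);

        /* Quadratic: two nested passes, so n*n iterations */
        count = 0;
        for (long i = 0; i < n; i++)
            for (long j = 0; j < n; j++) count++;
        printf("quadratic loops:  %ld operations\n", count);
        return 0;
    }

A cubic loop would simply add a third level of nesting.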
Basic exercises

1. Give the order of the running time for selection sort.
2. What is the order of the running time for insertion sort?
Order notations

When the running time of an algorithm is bounded above by a function
like log n, n log n, n^2, etc., we denote the order of the corresponding
algorithm as O(log n), O(n log n) or O(n^2), which we pronounce as "big
O" of some function.

Similarly, when the running time of an algorithm is bounded below by
some function like log n, n log n, n^2, i.e. the number of times the
basic instruction is executed is at least like log n, n log n, n^2, etc.,
we denote the running time of the algorithm as Omega of some function,
such as Ω(log n), Ω(n log n) or Ω(n^2).

Finally, if the upper bound and the lower bound of the running time of
an algorithm are the same function, then the running time of the algorithm
is denoted Theta of that function, such as Θ(log n), Θ(n log n) or Θ(n^2).
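In summary:

    Notation    Bound type                Intuition
    O(g(n))     asymptotic upper bound    grows no faster than g(n)
    Ω(g(n))     asymptotic lower bound    grows no slower than g(n)
    Θ(g(n))     asymptotic tight bound    grows at the same rate as g(n)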
Order notations

The notations O, Ω and Θ are called asymptotic notations because
they refer to the size of the input in the limit, i.e. as the size
increases to infinity.

The asymptotic notations are defined precisely; we now introduce these
definitions.
O-Notation: An Asymptotic Upper Bound I

Definition (Big O notation)

Let g(n) be a function from N to R. Denote

    O(g(n)) = { f(n) : there exist positive constants c and n0 such that
                0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

the set of functions defined on the natural numbers which are bounded
above by a positive real multiple of g(n) for sufficiently large n.
O-Notation: An Asymptotic Upper Bound II

Example: let f(n) = 27n^2 + (355/113)n + 12 be the number of basic
operations performed by an algorithm in the worst case, given an
input of size n. We would like to find a simple function g(n) such that
f(n) ∈ O(g(n)).

We can guess g(n) = n^2. Thus, for n ≥ 1,

    f(n) = 27n^2 + (355/113)n + 12
         ≤ 27n^2 + (355/113)n^2 + 12n^2
         = (42 + 16/113) n^2 = (42 + 16/113) g(n)

So instead of saying that an algorithm takes 27n^2 + (355/113)n + 12
elementary operations to solve an instance of size n, we can say that
the running time of the algorithm is in the order of n^2, or write that
the algorithm is in O(n^2).
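As a quick numeric illustration (a sketch, not a proof; the helper functions f and g simply encode the example above), the two sides of the inequality can be compared in C with c = 42 + 16/113:

    #include <stdio.h>

    static double f(double n) { return 27*n*n + (355.0/113.0)*n + 12; }
    static double g(double n) { return n*n; }

    int main(void) {
        const double c = 42 + 16.0/113.0;   /* = 27 + 355/113 + 12 */
        for (double n = 10; n <= 1e6; n *= 10)
            printf("n=%.0f  f(n)=%.1f  c*g(n)=%.1f  ok=%d\n",
                   n, f(n), c*g(n), f(n) <= c*g(n));
        return 0;
    }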
O-Notation: An Asymptotic Upper Bound III

Terminology: let f and g be non-negative valued functions N → R≥0:

1. We say that f(n) is in the order of g(n) if f(n) ∈ O(g(n)).
2. We say that "f(n) is big-O of g(n)". For convenience, we also
   write f(n) = O(g(n)).
3. As n increases, f(n) grows no faster than g(n). In other words,
   g(n) is an asymptotic upper bound of f(n).
Graphic Example of O-notation

▶ f(n) ∈ O(g(n)) if there are constants c and n0 such that
  0 ≤ f(n) ≤ c·g(n) for all n ≥ n0

[Figure: the curve c·g(n) stays above f(n) for all n ≥ n0]
How to find g(n)

Given a function f that gives the exact number of elementary operations
performed by an algorithm in the worst case, we need to:

1. find the simplest and slowest-growing function g such that
   f(n) ∈ O(g(n));
2. prove that the relation f(n) ∈ O(g(n)) is true, i.e. show that there
   exist c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
Strategy for finding g(n)

Assume f(n) = 3n^2 + 2n:

▶ Throw away the multiplicative constants: 3n^2 + 2n is replaced by
  n^2 + n.
▶ Similarly, 2^(n+1) = 2 × 2^n can be replaced by 2^n.
▶ If you have logs, throw away the bases, since the properties of
  logarithms say that for any two bases a and b, log_b n = c × log_a n
  for some multiplicative constant c.
▶ Once f(n) has been simplified, the fastest-growing term in f(n) is
  your g(n).
▶ f(n) = 3n^2 + 2n = O(n^2).

OK, this is a way to find g(n), but it is not a proof that
f(n) ∈ O(g(n)).
Prove that f(n) ∈ O(g(n))

Often the easiest way to prove that f(n) ∈ O(g(n)) is to take c to be
the sum of the positive coefficients of f(n).

Example: prove 5n^2 + 3n + 20 ∈ O(n^2)

▶ We pick c = 5 + 3 + 20 = 28. Then if n ≥ n0 = 1,

      5n^2 + 3n + 20 ≤ 5n^2 + 3n^2 + 20n^2 = 28n^2,

  thus 5n^2 + 3n + 20 ∈ O(n^2).

▶ We can also guess other values for c and then find an n0 that works.
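In general, this trick works for any polynomial whose coefficients are non-negative: if f(n) = a_d·n^d + ... + a_1·n + a_0 with all a_i ≥ 0, then n^i ≤ n^d for every n ≥ 1, so

    f(n) ≤ (a_d + ... + a_1 + a_0)·n^d for all n ≥ 1,

and c = a_d + ... + a_0 together with n0 = 1 witnesses f(n) ∈ O(n^d).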
Prove that f(n) ∈ O(g(n))

Another way is to assume c = 1 and find for which n0 we get f(n) ≤ g(n).

Example: show that (1/2)n^2 + 3n ∈ O(n^2)

Proof: the dominant term is (1/2)n^2, so g(n) = n^2. Therefore we need
to find c and n0 such that

    0 ≤ (1/2)n^2 + 3n ≤ c·n^2 for all n ≥ n0.

Since we decided to fix c = 1, we have

    (1/2)n^2 + 3n ≤ n^2  ⇔  3n ≤ (1/2)n^2  ⇔  6 ≤ n.

Thus, we pick n0 = 6.

We have just shown that if c = 1 and n0 = 6, then
0 ≤ (1/2)n^2 + 3n ≤ c·n^2 for all n ≥ n0, i.e. (1/2)n^2 + 3n ∈ O(n^2).
Some properties of Big O notation

1. Reflexivity: f(n) ∈ O(f(n)).

2. Scalar rule: let f be a non-negative valued function defined on
   N and c be a positive constant. Then

       O(c·f(n)) = O(f(n)).

   Example: 6n^2 ∈ O(n^2)

3. Maximum rule: let f, g be non-negative functions. Then

       O(f(n) + g(n)) = O(max{f(n), g(n)}).
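For instance, O(n^2 + n log n) = O(max{n^2, n log n}) = O(n^2); the maximum rule is the formal justification for dropping lower-order terms when simplifying f(n).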


Exercises on Big O I

1. Given the following algorithm written in pseudo-code (a runnable C
   transcription follows after the questions):

       t := 0;
       for i := 1 to n do
           for j := 1 to n do
               t := t + i + j;
       return t.

   1.1 Which instruction can be used as the elementary operation?
   1.2 Express the running time of this algorithm in terms of the number
       of times your selected elementary operation is executed.
   1.3 Give (without proof) a big O estimate for the running time of the
       algorithm.
   1.4 What is computed and returned by this algorithm?
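For readers who want to run the loop, here is a direct C transcription of the pseudo-code (a sketch; the function name algorithm is mine, not from the slides):

    #include <stdio.h>

    /* Direct transcription of the pseudo-code above; loops run 1..n inclusive. */
    long algorithm(long n) {
        long t = 0;
        for (long i = 1; i <= n; i++)
            for (long j = 1; j <= n; j++)
                t = t + i + j;      /* candidate elementary operation */
        return t;
    }

    int main(void) {
        for (long n = 1; n <= 5; n++)
            printf("n=%ld  returns %ld\n", n, algorithm(n));
        return 0;
    }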
Exercises on Big O I

2. Find g(n) for each of the following functions fi(n) such that
   fi(n) = O(g(n)).

   ▶ f1 = 3n log_2 n + 9n − 3 log_2 n − 3
   ▶ f2 = 2n^2 + n log_3 n − 15
   ▶ f3 = 100n + (n + 1)(n + 5)/2 + n^(3/2)
   ▶ f4 = 1,000n^2 + 2n^3 + 36n log n + 2^(n+1)
Exercises on Big O II

3. Which of the following statements are true?

   ▶ n^2 ∈ O(n^3)             ▶ n^(3/2) ∈ O(n log n)
   ▶ 2^n ∈ O(3^n)             ▶ 2^(n+1) ∈ O(2^n)
   ▶ 3^n ∈ O(2^n)             ▶ O(2^(n+1)) = O(2^n)
   ▶ n log n ∈ O(n^(3/2))     ▶ O(2^n) = O(3^n)

Exercises on Big O III

4. Give an upper bound on the worst-case asymptotic time
   complexity of the following function, used to find the kth smallest
   integer in an unordered array of integers. Justify your result. You
   do not need to find the closed form of summations.

   int selectkth( int A[], int k, int n )
   {
       int i, j, mini, tmp;

       for ( i = 0; i < k; i++ ) {          /* k selection passes */
           mini = i;
           for ( j = i + 1; j < n; j++ )    /* find the minimum of A[i..n-1] */
               if ( A[j] < A[mini] )
                   mini = j;
           tmp = A[i];                      /* swap the minimum into slot i */
           A[i] = A[mini];
           A[mini] = tmp;
       }
       return A[k-1];
   }
Exercises on Big O IV

5. How does n log n compare with n^(1+ε), for 0 < ε < 1?

   Answer: note that n log n = n × (log n) and n^(1+ε) = n × n^ε.

   The growth rate of log n is slower than that of n^ε for any value
   of ε > 0.

   Eventually n^ε catches up with log n for some value of n > n0,
   depending on how small ε is.

   Therefore, n log n ∈ O(n^(1+ε)) for 0 < ε.


Exercises on Big O V

6. Find the appropriate "Big-Oh" relationship between the functions
   n log n and 5n, and find the constants c and n0.

   Answer: 5n ∈ O(n log n). We look for c and n0 such that
   0 ≤ 5n ≤ c·n log n:

       5n ≤ c·n log n
        5 ≤ c·log n      (dividing both sides by n)
        5 ≤ log n        (taking c = 1)

   Since 2^5 = 32, the last inequality holds for n ≥ 32. So with c = 1
   and n0 = 32 we have 5n ≤ c·n log n for all n ≥ n0.


Exercises on Big O VI

7. Give the polynomial expression describing the running time of the
   code below. Provide the asymptotic time complexity of this code
   using the "Big-Oh" notation.

   for (i = 0; i < n; i++) {
       for (j = 0; j < 2*n; j++)
           sum = sum + A[i] * A[j];
       for (j = 0; j < n*n; j++)
           sum = sum + A[i] + A[j];
   }
Big Omega (Ω): An Asymptotic Lower Bound

Given a non-negative valued function g(n), denote

    Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
                f(n) ≥ c·g(n) for all n ≥ n0 }

Definition
Let f and g be non-negative valued functions N → R≥0:

1. We say that f(n) is in Omega of g(n) if f(n) ∈ Ω(g(n)).
2. As n increases, f(n) grows no slower than g(n). In other words,
   g(n) is an asymptotic lower bound of f(n).
Graphic Example of Ω-notation

▶ f(n) = Ω(g(n)) if there are constants c and n0 such that
  0 ≤ c·g(n) ≤ f(n) for all n ≥ n0.

[Figure: the curve f(n) stays above c·g(n) for all n ≥ n0]
Big Omega: Examples

1. f(n) = 3n^2 + n + 12 is Ω(n^2) and also Ω(n), but not Ω(n^3).

2. n^3 − 4n^2 ∈ Ω(n^2).

   Proof: let c = 1. Then we must have

       c·n^2 ≤ n^3 − 4n^2
       1 ≤ n − 4          (dividing both sides by n^2)

   which is true when n ≥ 5, therefore n0 = 5, so

       0 ≤ n^2 ≤ n^2(n − 4) = n^3 − 4n^2 for all n ≥ 5.
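As with big O earlier, both sides of the Ω inequality can be compared numerically; here is a C sketch (illustrative only, not a proof) using c = 1 and n0 = 5:

    #include <stdio.h>

    int main(void) {
        /* f(n) = n^3 - 4n^2, g(n) = n^2, c = 1, n0 = 5 */
        for (long long n = 5; n <= 5000; n *= 10) {
            long long f  = n*n*n - 4*n*n;
            long long cg = n*n;            /* c = 1 */
            printf("n=%4lld  c*g(n)=%9lld  f(n)=%14lld  ok=%d\n",
                   n, cg, f, cg <= f);
        }
        return 0;
    }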
Ω Proofs: How to choose c and n0

To prove that f(n) ∈ Ω(g(n)), we must find positive values of c and
n0 that make c·g(n) ≤ f(n) for all n ≥ n0.

▶ You can assume that c < 1, pick an n0 such that f(n) is larger
  than c·g(n), and then find the exact constant c for that n0, OR
▶ choose c to be some positive constant less than the multiplicative
  constant of the fastest-growing term of f(n), then find an n0 that
  works with the chosen c.
Example 1

For this example we assume that c < 1 and find an appropriate n0.

Show that (n log n − 2n + 13) ∈ Ω(n log n)

Proof: we need to show that there exist positive constants c and n0
such that

    0 ≤ c·n log n ≤ n log n − 2n + 13 for all n ≥ n0.

Since n log n − 2n ≤ n log n − 2n + 13,
we will instead show that

    c·n log n ≤ n log n − 2n.
Example 1 (continued)

    c·n log n ≤ n log n − 2n
    c ≤ (n log n)/(n log n) − 2n/(n log n)
    c ≤ 1 − 2/log n

so c ≤ 1 − 2/log n when n > 1.

If n ≥ 8, then 2/(log n) ≤ 2/3, and picking c = 1/3 suffices.

Thus if c = 1/3 and n0 = 8, then for all n ≥ n0 we have

    0 ≤ c·n log n ≤ n log n − 2n ≤ n log n − 2n + 13.

Thus (n log n − 2n + 13) ∈ Ω(n log n).


Example 2

For this example we select c to be smaller than the constant of the
fastest-growing term in the expression describing the running time.

Prove that f(n) = 3n^2 − 2n − 7 ∈ Ω(n^2).

Proof: the fastest-growing term of f(n) is 3n^2. Try c = 1, since
1 < 3.

Then

    1·n^2 ≤ 3n^2 − 2n − 7 for all n ≥ n0

is true only if (subtracting n^2 from both sides)

    0 ≤ 2n^2 − 2n − 7 for all n ≥ n0

is also true.

Choose n0 = 3; then the inequality above holds for any n ≥ 3.


An Asymptotic Tight Bound: Θ-notation

Let g(n) be a non-negative valued function. Denote

    Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 s.t.
                c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

Definition
Let f and g be non-negative valued functions N → R≥0:

1. We say that f(n) is in Theta of g(n) if f(n) ∈ Θ(g(n)).
2. As n increases, f(n) grows at the same rate as g(n). In other
   words, g(n) is an asymptotic tight bound of f(n).
Graphic Example of Θ-notation

▶ f(n) = Θ(g(n)) if there are constants c1, c2 and n0 such that
  0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0

[Figure: f(n) is sandwiched between c1·g(n) and c2·g(n) for all n ≥ n0]
Θ-notation: Example

Prove that n^2 − 5n + 7 ∈ Θ(n^2).

Proof: let c1 = 1/2, c2 = 1, and n0 = 10. Then for n ≥ 10 we have
(1/2)n^2 ≥ 5n and −5n + 7 ≤ 0. Thus,

    0 ≤ (1/2)n^2 ≤ n^2 − (1/2)n^2 ≤ n^2 − 5n ≤ n^2 − 5n + 7 ≤ n^2.

If f(n) is Θ(g(n)), then

▶ f(n) is "sandwiched" between c1·g(n) and c2·g(n) for sufficiently
  large n;
▶ g(n) is an asymptotically tight bound for f(n).
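As a quick numeric illustration (a sketch, not a proof), the sandwich can be checked in C with c1 = 1/2, c2 = 1 and n0 = 10:

    #include <stdio.h>

    int main(void) {
        /* f(n) = n^2 - 5n + 7 between c1*g(n) and c2*g(n),
           with g(n) = n^2, c1 = 1/2, c2 = 1, n0 = 10 */
        for (long n = 10; n <= 100000; n *= 10) {
            double g  = (double)n * n;
            double f  = g - 5.0*n + 7;
            double lo = 0.5 * g;           /* c1 * g(n) */
            double hi = 1.0 * g;           /* c2 * g(n) */
            printf("n=%6ld  lo=%.0f  f=%.0f  hi=%.0f  ok=%d\n",
                   n, lo, f, hi, lo <= f && f <= hi);
        }
        return 0;
    }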
Big Theta Proofs

The following theorem shows us that proving f(n) ∈ Θ(g(n)) is
nothing new:

▶ Theorem: f(n) ∈ Θ(g(n)) if and only if f(n) ∈ O(g(n)) and
  f(n) ∈ Ω(g(n)).
▶ Thus, we just apply the previous two strategies.
Example

Show that (1/2)n^2 − 3n ∈ Θ(n^2)

Proof:

▶ Find positive constants c1, c2, and n0 such that

      0 ≤ c1·n^2 ≤ (1/2)n^2 − 3n ≤ c2·n^2 for all n ≥ n0

▶ Dividing by n^2, we get 0 ≤ c1 ≤ 1/2 − 3/n ≤ c2
▶ c1 ≤ 1/2 − 3/n holds for n ≥ 10 and c1 = 1/5
▶ 1/2 − 3/n ≤ c2 holds for n ≥ 10 and c2 = 1.
▶ Thus, if c1 = 1/5, c2 = 1, and n0 = 10, then for all n ≥ n0,

      0 ≤ c1·n^2 ≤ (1/2)n^2 − 3n ≤ c2·n^2.

Thus we have shown that (1/2)n^2 − 3n ∈ Θ(n^2).
Cookbook for asymptotic notations

Theorem (Limit rule)
Given non-negative valued functions f and g: N → R≥0, the following
statements are true:

1. If 0 < lim_{n→∞} f(n)/g(n) = L < ∞, then f(n) ∈ Θ(g(n)) and
   consequently f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n)).
2. If lim_{n→∞} f(n)/g(n) = 0, then f(n) ∈ O(g(n)).
3. If lim_{n→∞} f(n)/g(n) = +∞, then f(n) ∉ O(g(n)), but g(n) ∈ O(f(n))
   and f(n) ∈ Ω(g(n)).
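To get a feel for the limit rule, here is a small C sketch (illustrative; f and g are borrowed from Example 2 above) that prints the ratio f(n)/g(n) for growing n. The ratio tends to 3, so case 1 applies and f(n) ∈ Θ(n^2):

    #include <stdio.h>

    int main(void) {
        /* f(n) = 3n^2 - 2n - 7, g(n) = n^2: the ratio approaches 3 */
        for (double n = 10; n <= 1e9; n *= 100) {
            double f = 3*n*n - 2*n - 7;
            double g = n*n;
            printf("n=%.0e  f(n)/g(n)=%.6f\n", n, f/g);
        }
        return 0;
    }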
Exercises

1. Prove that f(n) = n^3 + 20n + 1 ∈ O(n^3).
2. Prove that f(n) = n^3 + 20n + 1 ∉ O(n^2).
3. Prove that f(n) = n^3 + 20n + 1 ∈ O(n^4).
4. Prove f(n) = n^3 + 20n ∈ Ω(n^2).
5. Prove f(n) = (1/2)n^2 − 3n ∈ Ω(n^2).
6. Prove that f(n) = 5n^2 − 7n ∈ Θ(n^2).
7. Prove that f(n) = 23n^3 − 10n^2 log n + 7n + 6 ∈ Θ(n^3).
8. Find the appropriate Ω relationship between the functions n^3 and
   3n^3 − 2n^2 + 2, and find the constants c and n0.
Exercises (continued)

9. Consider the following iterative procedure:

       for (i = 0; i < n; i++) {
           for (j = 0; j < 2*n; j++)
               sum = sum + A[i] * A[j];
           for (j = 0; j < n*n; j++)
               sum = sum + A[i] + A[j];
       }

   9.1 Give a function f describing the computing time of this procedure
       in terms of the input size n.
   9.2 Bound the running time of this code from above using the "Big-Oh"
       notation. Prove your result.
   9.3 Give a lower bound on the running time of this code using the "Ω"
       notation. Prove your result. Then argue, based on your two
       previous results, about the exact time complexity of f.
Exercises (continued)

10. To illustrate how the asymptotic notation can be used to rank the
    efficiency of algorithms, use the relations ⊂ and = to put the
    orders of the following functions into a sequence:

        n^2, 1, n^(3/2), 2^n, log n, n^n, 3^n, n, n^3, n log n, √n, log log n, n!
11. Similar to the previous question, order the following growth-rate
    functions:

        n!, (n + 1)!, 2^n, 2^(n+1), 2^(2n), n^n, √n, n, n^(log n)
