Module-5
LIMITATIONS OF ALGORITHMIC POWER and COPING WITH LIMITATIONS OF ALGORITHMIC POWER
Decision Trees
▪ Many important algorithms, especially those for sorting and searching, work by
comparing items of their inputs.
▪ We can study the performance of such algorithms with a device called a decision
tree.
▪ Figure 11.1 presents a decision tree of an algorithm for finding a minimum of three
numbers.
▪ Each internal node of a binary decision tree represents a key comparison indicated in the node, e.g., k < k’.
▪ The node’s left subtree contains the information about subsequent comparisons made if k < k’, and its right subtree does the same for the case of k > k’.
▪ Each leaf represents a possible outcome of the algorithm’s run on some
input of size n.
▪ Note that the number of leaves can be greater than the number of
outcomes because, for some algorithms, the same outcome can be arrived
at through a different chain of comparisons.
▪ An important point is that the number of leaves must be at least as large as
the number of possible outcomes.
▪ The algorithm’s work on a particular input of size n can be traced by a path
from the root to a leaf in its decision tree, and the number of comparisons
made by the algorithm on such a run is equal to the length of this path.
▪ Hence, the number of comparisons in the worst case is equal to the height
of the algorithm’s decision tree.
▪ The central idea behind this model lies in the observation that a tree with a
given number of leaves, which is dictated by the number of possible
outcomes, has to be tall enough to have that many leaves.
▪ Specifically, it is not difficult to prove that for any binary tree with l leaves and height h,
  h ≥ ⌈log2 l⌉.        (11.1)
▪ Indeed, a binary tree of height h with the largest number of leaves has all its leaves on the last level (why?).
▪ Hence, the largest number of leaves in such a tree is 2^h.
▪ In other words, 2^h ≥ l, which immediately implies inequality (11.1).
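▪ Written out as a worked chain (with l = 6 used purely as a hypothetical illustration), the argument is:

    % Any binary tree of height h has at most 2^h leaves, so a tree that
    % must accommodate l leaves cannot be shorter than ceil(log2 l).
    \[
      l \le 2^{h}
      \;\Longrightarrow\;
      \log_2 l \le h
      \;\Longrightarrow\;
      h \ge \lceil \log_2 l \rceil \quad (\text{since } h \text{ is an integer}).
    \]
    % For instance, a decision tree that must have l = 6 leaves needs height
    % at least ceil(log2 6) = 3, i.e., at least three comparisons in the worst case.
    \[
      h \ge \lceil \log_2 6 \rceil = 3 .
    \]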
Decision Trees for Sorting
▪ Most sorting algorithms are comparison based, i.e., they work by
comparing elements in a list to be sorted.
▪ By studying properties of decision trees for such algorithms, we can
derive important lower bounds on their time efficiencies.
▪ We can interpret an outcome of a sorting algorithm as finding a
permutation of the element indices of an input list that puts the list’s
elements in ascending order.
▪ Consider, as an example, a three-element list a, b, c of orderable items
such as real numbers or strings.
▪ For the outcome a < c < b obtained by sorting this list (see Figure 11.2), the permutation in question is 1, 3, 2.
▪ In general, the number of possible outcomes for sorting an arbitrary
n-element list is equal to n!.
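▪ This correspondence can be illustrated with a short sketch (Python is used here only for illustration; the concrete values of a, b, c are hypothetical, chosen so that a < c < b):

    from itertools import permutations

    # A three-element list of orderable items (hypothetical values with a < c < b).
    a, b, c = 1.0, 3.0, 2.0
    lst = [a, b, c]

    # The outcome of sorting corresponds to the permutation of indices 1..n
    # that lists the elements in ascending order.
    order = sorted(range(len(lst)), key=lambda i: lst[i])
    print([i + 1 for i in order])                     # [1, 3, 2], i.e., a < c < b

    # For an n-element list there are n! such permutations, one per outcome.
    print(len(list(permutations(range(len(lst))))))   # 3! = 6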
▪ Inequality (11.1) implies that the height of a binary decision tree for any comparison-based sorting algorithm, and hence the worst-case number of comparisons made by such an algorithm, cannot be less than ⌈log2 n!⌉:
  Cworst(n) ≥ ⌈log2 n!⌉.
▪ Using Stirling’s formula for n!, we get
  ⌈log2 n!⌉ ≈ n log2 n − n log2 e + (log2 n)/2 + (log2 2π)/2 ≈ n log2 n.
▪ In other words, about n log2 n comparisons are necessary in the worst case to sort an arbitrary n-element list by any comparison-based sorting algorithm.
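▪ A quick numerical check of this bound (an illustrative sketch, not part of the text) compares ⌈log2 n!⌉ with its approximation n log2 n:

    import math

    # ceil(log2(n!)) is the information-theoretic lower bound on worst-case comparisons;
    # n*log2(n) is its leading-term approximation obtained from Stirling's formula.
    for n in (4, 10, 100, 1000):
        exact = math.ceil(math.log2(math.factorial(n)))
        approx = n * math.log2(n)
        print(n, exact, round(approx, 1))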
▪ We can also use decision trees for analyzing the average-case efficiencies
of comparison-based sorting algorithms.
▪ We can compute the average number of comparisons for a particular
algorithm as the average depth of its decision tree’s leaves, i.e., as the
average path length from the root to the leaves.
▪ For example, for the three-element insertion sort whose decision tree is given in Figure 11.3, this number is (2 + 3 + 3 + 2 + 3 + 3)/6 = 2⅔.
▪ Under the standard assumption that all n! outcomes of sorting are equally likely, the following lower bound on the average number of comparisons Cavg made by any comparison-based algorithm in sorting an n-element list has been proved:
  Cavg(n) ≥ log2 n!.
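▪ The average quoted for the three-element case can also be checked empirically with a small sketch (an illustration, assuming Figure 11.3 is the decision tree of straight insertion sort on three distinct keys):

    from itertools import permutations

    def insertion_sort_comparisons(a):
        """Sort a copy of list a by straight insertion sort and count key comparisons."""
        a = list(a)
        count = 0
        for i in range(1, len(a)):
            v, j = a[i], i - 1
            while j >= 0:
                count += 1              # one comparison of v with a[j]
                if a[j] > v:
                    a[j + 1] = a[j]
                    j -= 1
                else:
                    break
            a[j + 1] = v
        return count

    counts = [insertion_sort_comparisons(p) for p in permutations([1, 2, 3])]
    print(counts, sum(counts) / len(counts))   # average over all 3! inputs is 16/6 ≈ 2.67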
Decision Trees for Searching a Sorted Array
▪ In this section, we shall see how decision trees can be used for
establishing lower bounds on the number of key comparisons in searching
a sorted array of n keys: A[0] < A[1] < ... < A[n−1].
▪ The principal algorithm for this problem is binary search.
▪ The number of comparisons made by binary search in the worst case, Cbsworst(n), is given by the formula
  Cbsworst(n) = ⌊log2 n⌋ + 1 = ⌈log2(n + 1)⌉.
▪ We will use decision trees to determine whether this is the smallest
possible number of comparisons.
▪ Since we are dealing here with three-way comparisons in which search
key K is compared with some element A[i] to see whether K < A[i], K = A[i],
or K > A[i], it is natural to try using ternary decision trees.
▪ Figure 11.4 presents such a tree for the case of n=4.
▪ The internal nodes of that tree indicate the array’s elements being
compared with the search key.
▪ The leaves indicate either a matching element in the case of a successful
search or a found interval that the search key belongs to in the case of an
unsuccessful search.
▪ We can represent any algorithm for searching a sorted array by three-way
comparisons with a ternary decision tree similar to that in Figure 11.4.
▪ For an array of n elements, all such decision trees will have 2n + 1 leaves
(n for successful searches and n+1 for unsuccessful ones).
▪ Since the minimum height h of a ternary tree with l leaves is ⌈log3 l⌉, we get the following lower bound on the number of worst-case comparisons:
  Cworst(n) ≥ ⌈log3(2n + 1)⌉.
▪ This lower bound is smaller than ⌈log2(n + 1)⌉, the number of worst-case comparisons for binary search, at least for large values of n (and smaller than or equal to ⌈log2(n + 1)⌉ for every positive integer n).
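▪ The relationship between the two quantities can be checked with a small sketch (an illustration; exact integer arithmetic is used to avoid floating-point rounding):

    def ceil_log(base, x):
        """Smallest integer h with base**h >= x, computed with exact integer arithmetic."""
        h, power = 0, 1
        while power < x:
            power *= base
            h += 1
        return h

    # Ternary-tree lower bound ceil(log3(2n + 1)) vs. binary search's ceil(log2(n + 1)).
    for n in (1, 2, 4, 10, 100, 1_000_000):
        print(n, ceil_log(3, 2 * n + 1), ceil_log(2, n + 1))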
▪ To obtain a better lower bound, we should consider binary rather than
ternary decision trees, such as the one in Figure 11.5.
▪ Internal nodes in such a tree correspond to the same three- way
comparisons as before, but they also serve as terminal nodes for
successful searches.
▪ Leaves therefore represent only unsuccessful searches, and there are
n+1 of them for searching an n-element array.
▪ As comparison of the decision trees in Figures 11.4 and 11.5 illustrates,
the binary decision tree is simply the ternary decision tree with all the
middle subtrees eliminated.
▪ Applying inequality (11.1) to such binary decision trees immediately yields
  Cworst(n) ≥ ⌈log2(n + 1)⌉.
▪ This inequality closes the gap between the lower bound and the number of worst-case comparisons made by binary search, which is also ⌈log2(n + 1)⌉.
▪ A much more sophisticated analysis shows that under the standard
assumptions about searches, binary search makes the smallest number of
comparisons on the average, as well.
▪ The average number of comparisons made by this algorithm turns out to be about log2 n − 1 and log2(n + 1) for successful and unsuccessful searches, respectively.
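▪ Both the worst-case count and these average-case figures can be checked by brute force (an illustrative sketch, assuming all n successful and all n + 1 unsuccessful search positions are equally likely):

    import math

    def binary_search_comparisons(a, key):
        """Number of three-way comparisons binary search makes for key in sorted list a."""
        lo, hi, comparisons = 0, len(a) - 1, 0
        while lo <= hi:
            mid = (lo + hi) // 2
            comparisons += 1                  # one three-way comparison with a[mid]
            if key == a[mid]:
                return comparisons
            elif key < a[mid]:
                hi = mid - 1
            else:
                lo = mid + 1
        return comparisons                    # unsuccessful search

    n = 1000
    a = [2 * i for i in range(n)]             # sorted even keys; odd keys never occur in a

    worst = max(binary_search_comparisons(a, k) for k in range(-1, 2 * n + 1))
    successful = sum(binary_search_comparisons(a, k) for k in a) / n
    unsuccessful = sum(binary_search_comparisons(a, k) for k in range(-1, 2 * n, 2)) / (n + 1)

    print(worst, math.ceil(math.log2(n + 1)))                   # worst case = ceil(log2(n + 1))
    print(round(successful, 2), round(math.log2(n) - 1, 2))     # about log2 n - 1
    print(round(unsuccessful, 2), round(math.log2(n + 1), 2))   # about log2(n + 1)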
Backtracking
▪ Some problems can be solved by exhaustive search.
▪ The exhaustive-search technique suggests generating all candidate solutions and
then identifying the one (or the ones) with a desired property.
▪ Backtracking is a more intelligent variation of this approach.
▪ The principal idea is to construct solutions one component at a time and evaluate
such partially constructed candidates as follows:
▪ If a partially constructed solution can be developed further without violating the
problem’s constraints, it is done by taking the first remaining legitimate option for
the next component.
▪ If there is no legitimate option for the next component, no alternatives for any
remaining component need to be considered.
▪ In this case, the algorithm backtracks to replace the last component of the partially
constructed solution with its next option.
▪ It is convenient to implement this kind of processing by constructing a tree of
choices being made, called the state-space tree.
▪ Its root represents an initial state before the search for a solution begins.
▪ The nodes of the first level in the tree represent the choices made for the first
component of a solution; the nodes of the second level represent the choices for the
second component, and so on.
▪ A node in a state-space tree is said to be promising if it corresponds to a partially
constructed solution that may still lead to a complete solution; otherwise, it is called
non-promising.
▪ Leaves represent either non-promising dead ends or complete solutions found by the
algorithm.
▪ In the majority of cases, a state-space tree for a backtracking algorithm is
constructed in the manner of depth-first search.
▪ If the current node is promising, its child is generated by adding the first remaining
legitimate option for the next component of a solution, and the processing moves to
this child.
▪ If the current node turns out to be non-promising, the algorithm backtracks to
the node’s parent to consider the next possible option for its last component; if there
is no such option, it backtracks one more level up the tree, and so on.
▪ Finally, if the algorithm reaches a complete solution to the problem, it either stops (if
just one solution is required) or continues searching for other possible solutions.
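▪ The processing just described can be summarized in a generic recursive sketch (a Python template for illustration; options, is_promising, is_complete, and report are hypothetical problem-specific hooks, not names from the text):

    def backtrack(partial, options, is_promising, is_complete, report):
        """Depth-first exploration of a state-space tree by backtracking.

        partial      -- partially constructed solution (list of components chosen so far)
        options      -- options(partial): legitimate candidates for the next component
        is_promising -- is_promising(partial): can this partial solution still be extended?
        is_complete  -- is_complete(partial): is this a complete solution?
        report       -- report(partial): called for every complete solution found
        """
        if is_complete(partial):
            report(partial)
            return
        for choice in options(partial):      # first remaining legitimate options, in order
            partial.append(choice)           # extend the partial solution by one component
            if is_promising(partial):
                backtrack(partial, options, is_promising, is_complete, report)
            partial.pop()                    # dead end or subtree exhausted: backtrack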
N-Queens Problem
▪ The problem is to place n queens on an n × n chessboard so that no two queens attack each other by being in the same row, in the same column, or on the same diagonal.
▪ So let us consider the four-queens problem and solve it by the backtracking technique.
▪ Since each of the four queens has to be placed in its own row, all we need to do is to assign a column for each queen on the board presented in the figure below.
▪ We start with the empty board and then place queen 1 in the first possible position of its row, which is in column 1 of row 1.
▪ Then we place queen 2, after trying unsuccessfully columns 1 and 2, in the first
acceptable position for it, which is square (2, 3), the square in row 2 and column 3.
▪ This proves to be a dead end because there is no acceptable position for queen 3.
▪ So, the algorithm backtracks and puts queen 2 in the next possible position at (2, 4).
▪ Then queen 3 is placed at (3, 2), which proves to be another dead end.
▪ The algorithm then backtracks all the way to queen 1 and moves it to (1, 2).
▪ Queen 2 then goes to (2, 4), queen 3 to (3, 1), and queen 4 to (4, 3), which is a solution to the problem.
▪ The state-space tree of this search is shown in the figure below.
Figure: State-space tree of solving the four-queens problem by backtracking.
× denotes an unsuccessful attempt to place a queen in the indicated column.
The numbers above the nodes indicate the order in which the nodes are generated.
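▪ The walkthrough above corresponds to the following sketch (an illustrative Python implementation of row-by-row backtracking, not taken from the text):

    def solve_n_queens(n):
        """Return the first solution found: entry i is the (1-based) column of the queen in row i + 1."""
        board = []                                    # board[r] = column index (0-based) of the queen in row r

        def safe(col):
            row = len(board)
            for r, c in enumerate(board):
                if c == col or abs(c - col) == row - r:
                    return False                      # same column or same diagonal as an earlier queen
            return True

        def place(row):
            if row == n:
                return True                           # all n queens placed: a complete solution
            for col in range(n):                      # try columns left to right
                if safe(col):
                    board.append(col)
                    if place(row + 1):
                        return True
                    board.pop()                       # dead end: backtrack and try the next column
            return False

        return [c + 1 for c in board] if place(0) else None

    print(solve_n_queens(4))   # [2, 4, 1, 3]: queens at (1, 2), (2, 4), (3, 1), (4, 3), as in the trace above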
▪ If other solutions need to be found, the algorithm can simply resume its operations at
the leaf at which it stopped. Alternatively, we can use the board’s symmetry for this
purpose.
▪ Finally, it should be pointed out that a single solution to the n-queens problem for any
n ≥ 4 can be found in linear time.
Sum of Subsets Problem
▪ Problem definition: Find a subset of a given set A = {a1, . . . , an} of n positive integers whose sum is equal to a given positive integer d.
▪ For example, for A = {1, 2, 5, 6, 8} and d = 9, there are two solutions: {1, 2, 6} and {1, 8}.
▪ Of course, some instances of this problem may have no solutions.
▪ It is convenient to sort the set’s elements in increasing order. So, we will assume that a1 < a2 < . . . < an.
▪ The state-space tree can be constructed as a binary tree like that in the figure shown below for the instance A = {3, 5, 6, 7} and d = 15.
▪ The number inside a node is the sum of the elements already included in the subsets
represented by the node. The inequality below a leaf indicates the reason for its
termination.
Figure: State-space tree of the backtracking algorithm for the instance A = {3, 5, 6, 7} and d = 15.
▪ The root of the tree represents the starting point, with no decisions about the given
elements made as yet.
▪ Its left and right children represent, respectively, inclusion and exclusion of a1 in a set
being sought.
▪ Similarly, going to the left from a node of the first level corresponds to inclusion of a2
while going to the right corresponds to its exclusion, and so on.
▪ Thus, a path from the root to a node on the ith level of the tree indicates which of the
first i numbers have been included in the subsets represented by that node.
▪ We record the value of s, the sum of these numbers, in the node.
▪ If s is equal to d, we have a solution to the problem. We can either report this result
and stop or, if all the solutions need to be found, continue by backtracking to the
node’s parent.
▪ If s is not equal to d, we can terminate the node as non-promising if either of the following two inequalities holds:
  s + ai+1 > d   (adding even the smallest of the remaining elements overshoots d);
  s + ai+1 + ai+2 + . . . + an < d   (adding all the remaining elements still cannot reach d).
▪ Example: Apply backtracking to solve the following instance of the subset sum
problem:
A = {1, 3, 4, 5} and d = 11
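▪ A sketch of the algorithm (an illustrative Python implementation using the two pruning conditions above) can be run on both instances:

    def subset_sum(a, d):
        """Return all subsets of the positive integers in a whose elements sum to d, found by backtracking."""
        a = sorted(a)                        # assume a1 < a2 < ... < an
        solutions, chosen = [], []

        def explore(i, s, remaining):
            """i elements decided, s = sum of the included ones, remaining = sum of a[i:]."""
            if s == d:
                solutions.append(list(chosen))
                return
            if i == len(a):
                return
            # Prune a non-promising node: even the smallest unconsidered element
            # overshoots d, or all remaining elements together cannot reach d.
            if s + a[i] > d or s + remaining < d:
                return
            chosen.append(a[i])                              # include a[i]
            explore(i + 1, s + a[i], remaining - a[i])
            chosen.pop()                                     # exclude a[i]
            explore(i + 1, s, remaining - a[i])

        explore(0, 0, sum(a))
        return solutions

    print(subset_sum([3, 5, 6, 7], 15))   # [[3, 5, 7]] -- the only subset with sum 15
    print(subset_sum([1, 3, 4, 5], 11))   # []          -- this instance has no solution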