DAA Unit-3 Part-1 YNotes

The document discusses the Greedy Method, a straightforward algorithmic approach used to solve optimization problems by making locally optimal choices at each stage. It covers various applications including Job Sequencing with Deadlines, the Knapsack Problem, Single Source Shortest Path Problem, and Optimal Merge Patterns. Each application is explained with procedures and algorithms to illustrate how the Greedy Method can be effectively utilized.

Uploaded by

Yagneswar Naidu

UNIT-3(PART-1)

GREEDY METHOD
• The greedy method is one of the most straightforward algorithm-design approaches.
• A greedy problem has ‘n’ inputs and requires us to obtain a subset that satisfies some constraint.
• Any subset that satisfies the constraint is called a “Feasible Solution”.
• We need to find a feasible solution that either maximizes or minimizes a given “Objective Function”.
• The greedy method suggests an algorithm that works in stages, considering one input at a time.
• At each stage, a ‘decision’ is made, using a “Selection Procedure”, as to whether a particular input belongs to the optimal solution.
Greedy Method Control Abstraction:
Algorithm Greedy(a, n)
// a[1:n] contains the array of ‘n’ inputs
{
    solution := ∅;   // initialize to the empty set
    for i := 1 to n do
    {
        x := Select(a);
        if Feasible(solution, x) then
            solution := Union(solution, x);
    }
    return solution;
}
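The control abstraction above can be sketched in runnable Python. The `select` and `feasible` functions are problem-specific, so the ones used in the example call below are illustrative placeholders, not part of the notes.

```python
def greedy(a, select, feasible):
    """Generic greedy skeleton: consider inputs one at a time,
    keeping each one only if the partial solution stays feasible."""
    solution = []
    candidates = list(a)
    while candidates:
        x = select(candidates)       # pick the "best" remaining input
        candidates.remove(x)
        if feasible(solution, x):    # does adding x keep the solution valid?
            solution.append(x)
    return solution

# Illustrative use: pick numbers greedily (largest first)
# without letting their sum exceed a budget of 10.
result = greedy([7, 5, 4, 2],
                select=max,
                feasible=lambda sol, x: sum(sol) + x <= 10)
# result is [7, 2]
```

Each concrete greedy algorithm in this unit instantiates `select` and `feasible` differently: by profit for job sequencing, by profit/weight ratio for the knapsack, and so on.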

APPLICATIONS OF GREEDY METHOD

• Job Sequencing with deadlines


• Knapsack Problem
• Single Source Shortest Path Problem
• Optimal Merge Patterns

JOB SEQUENCING WITH DEADLINES


In the job sequencing problem, the objective is to find a sequence of jobs that are completed within their
deadlines and give maximum profit.
Procedure:
1. There is a set of ‘n’ jobs. For any job ‘i’, the deadline 𝑑𝑖 ≥ 0 and profit 𝑃𝑖 ≥ 0.
2. The profit 𝑃𝑖 is earned iff the job is completed by its deadline.
3. An ‘Optimal Solution’ is a feasible solution with maximum value.
4. The main objective is to find a sequence of jobs that are completed within their deadlines and give maximum profit.
5. Only one machine is available for processing jobs.
6. Only one job is processed at a time on the machine.

Algorithm:

Click Herefor
Better Explanation
Explanation:
1) Sort all jobs in decreasing order of profit.
2) Iterate over the jobs in decreasing order of profit. For each job, do the following:
a) Find a time slot i such that slot i is empty, i < deadline, and i is greatest. Put the job in
this slot and mark the slot filled.
b) If no such i exists, ignore the job.
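The steps above can be implemented directly. The job data below (names, deadlines, profits) is illustrative, since the notes’ example table appears only as a figure.

```python
def job_sequencing(jobs):
    """jobs: list of (name, deadline, profit) with unit execution times.
    Returns (schedule ordered by slot, total profit)."""
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)  # 1) sort by profit, descending
    max_deadline = max(d for _, d, _ in jobs)
    slots = [None] * (max_deadline + 1)   # slots[1..max_deadline]; slot i ends at time i
    total = 0
    for name, deadline, profit in jobs:   # 2) consider jobs in profit order
        for i in range(min(deadline, max_deadline), 0, -1):
            if slots[i] is None:          # a) latest empty slot not after the deadline
                slots[i] = name
                total += profit
                break
        # b) if no empty slot exists, the job is simply skipped
    return [s for s in slots[1:] if s is not None], total

# Illustrative data (not from the notes' figure):
schedule, profit = job_sequencing([("J1", 2, 100), ("J2", 1, 19),
                                   ("J3", 2, 27), ("J4", 1, 25), ("J5", 3, 15)])
# schedule is ["J3", "J1", "J5"], profit is 142
```

Filling each job into the latest free slot before its deadline keeps earlier slots open for jobs with tighter deadlines.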
Example:
𝒏 = 𝟓
(The table of the five jobs with their profits and deadlines, and the time slots 0 to 3, appear as a figure in the original notes.)

• Five jobs are given and each job takes one unit of time to execute.
• Assume we have a single CPU to execute these jobs for maximum profit, with three unit time slots available.
• The example is arranged in decreasing order of profit values.
• So, job ‘J1’ has the highest profit and is selected first for completion in its respective unit slot.
Solution:

KNAPSACK PROBLEM
The greedy algorithm can be understood very well through the well-known Knapsack problem. Although the
same problem can be solved by other algorithmic approaches, the greedy approach solves the Fractional
Knapsack problem in reasonably good time.

Definition:
• We are given ‘n’ objects and a knapsack (Bag).
• Object ‘i’ has a weight ‘𝑊𝑖 ’, Profit ‘𝑃𝑖 ’ and the knapsack has a capacity ‘m’.
• If a fraction ‘𝑋𝑖’, 0 ≤ 𝑋𝑖 ≤ 1, of object ‘i’ is placed into the knapsack, then a profit of 𝑃𝑖 · 𝑋𝑖 is
earned.
• The main objective is to obtain a filling of knapsack that maximizes the total profit earned.
• Since the knapsack capacity is ‘m’, we require the total weight of all chosen objects to be at most ‘m’.
The representation of the problem:

Maximize ∑1≤𝑖≤𝑛 𝑃𝑖 · 𝑋𝑖 … ①

Subject to ∑1≤𝑖≤𝑛 𝑊𝑖 · 𝑋𝑖 ≤ 𝑚 … ②

And 0 ≤ 𝑋𝑖 ≤ 1, 1 ≤ 𝑖 ≤ 𝑛 … ③

• The ‘profits’ and ‘weights’ are positive integers.
• A ‘feasible solution’ is any set (𝑋1, 𝑋2, … , 𝑋𝑛) satisfying equations ② and ③.
• An ‘optimal solution’ is a feasible solution for which equation ① is maximized.
Example:
𝒏 = 𝟑, 𝒎 = 𝟐𝟎, (𝑷𝟏, 𝑷𝟐, 𝑷𝟑) = (𝟐𝟓, 𝟐𝟒, 𝟏𝟓) and (𝑾𝟏, 𝑾𝟐, 𝑾𝟑) = (𝟏𝟖, 𝟏𝟓, 𝟏𝟎)

Strategy-1:

Strategy-2:

Click Herefor
Better Explanation

• Solution Set: ( 0, 1, ½)
• Profit value: 31.5

Algorithm
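The algorithm figure is not reproduced in this text; a minimal Python sketch of the greedy fractional-knapsack procedure, checked against the example above (𝑛 = 3, 𝑚 = 20), might look like:

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy fractional knapsack: take objects in decreasing
    profit/weight ratio; split only the last object taken."""
    n = len(profits)
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n              # fraction X_i of each object taken
    total = 0.0
    remaining = capacity
    for i in order:
        if weights[i] <= remaining:       # object fits entirely
            x[i] = 1.0
            remaining -= weights[i]
        else:                             # take the fraction that exactly fills the bag
            x[i] = remaining / weights[i]
            remaining = 0
        total += profits[i] * x[i]
        if remaining == 0:
            break
    return x, total

x, total = fractional_knapsack([25, 24, 15], [18, 15, 10], 20)
# matches the notes: solution set (0, 1, 1/2), profit value 31.5
```

Sorting by profit/weight ratio (here 24/15 = 1.6 first, then 15/10 = 1.5) is what makes the greedy choice optimal for the fractional variant.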

SINGLE SOURCE SHORTEST PATH PROBLEM


Dijkstra’s algorithm solves the single-source shortest-paths problem on a directed weighted graph
𝐺 = (𝑉, 𝐸), where all the edges are non-negative (i.e., 𝑤(𝑢, 𝑣) ≥ 0 for each edge (𝑢, 𝑣) Є 𝐸).

Procedure:
• We are given a directed graph 𝐺 = (𝑉, 𝐸), consisting of ‘Vertices’ and ‘Edges’, in which one of the vertices
is designated as the “Source Vertex”.
• Every edge of the graph is directed and carries a non-negative weight.
• From the source vertex ‘S’, we have to find the shortest path from ‘S’ to every other vertex.
• This is called the “Single Source Shortest Path” (Dijkstra’s) problem.

Click Herefor
Better Explanation
Relaxing an edge:

• Suppose the known distance from S to U is 10, the known distance from S to V is 20, and there is an edge
from U to V of weight 5.
• To reach V, we can instead go 𝑆 → 𝑈 and then 𝑈 → 𝑉, giving the distance 10 + 5 = 15.
• Since 15 < 20, we “relax” the edge (U, V) and update the distance of V to 15.
• So, the graph becomes:

Algorithm:
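The algorithm figure is not reproduced here; a common Python sketch of Dijkstra’s algorithm using a min-heap follows, with the small graph below encoding the relaxation example above.

```python
import heapq

def dijkstra(graph, source):
    """graph: {u: [(v, w), ...]} adjacency list with non-negative weights.
    Returns shortest distances from source to every reachable vertex."""
    dist = {source: 0}
    pq = [(0, source)]                    # (distance, vertex) min-heap
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry; skip it
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):   # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

# The relaxation example: dist(S,U) = 10, a direct edge S->V of 20,
# and an edge U->V of 5, so V is reached in 15 via S -> U -> V.
g = {"S": [("U", 10), ("V", 20)], "U": [("V", 5)]}
dist = dijkstra(g, "S")
# dist == {"S": 0, "U": 10, "V": 15}
```

The non-negative-weight requirement from the definition above is what guarantees that a vertex’s distance is final when it is popped from the heap.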
OPTIMAL MERGE PATTERNS
Merge a set of sorted files of different lengths into a single sorted file. We need to find an optimal solution,
in which the resultant file is generated in minimum time.
Given a number of sorted files, there are many ways to merge them into a single sorted file.
The merge can be performed pairwise; hence this type of merging is called a 2-way merge pattern.

Procedure:
• Merging means combining two sorted lists into one.
• Example:

• Two sorted files containing ‘n’ and ‘m’ records respectively can be merged together
in 𝑶(𝒏 + 𝒎) time.
• When more than two sorted files are to be merged, the merge can be
accomplished by repeatedly merging sorted files in pairs; this is called “TWO-WAY
MERGING”.
• So, in the above 3 cases, case-3 gives the minimum cost to merge the four files, because the
smallest files are merged first.
• The greedy method therefore says: always merge the smallest pair of files first to obtain the best
solution.

Example: Optimal Merge Pattern

𝑭𝒊𝒍𝒆𝒔 → 𝑥1 𝑥2 𝑥3 𝑥4 𝑥5
𝑺𝒊𝒛𝒆𝒔→ 20 30 10 5 30
Now, apply the greedy method, considering the files in increasing order of size.
𝑭𝒊𝒍𝒆𝒔 → 𝑥4 𝑥3 𝑥1 𝑥2 𝑥5
𝑺𝒊𝒛𝒆𝒔→ 5 10 20 30 30

In the binary tree, the leaf nodes are drawn as ‘squares’ and are known as “External Nodes”.

The remaining nodes are drawn as ‘circles’ and are known as “Internal Nodes”.

Each ‘Internal Node’ has exactly two children.

𝑻𝒐𝒕𝒂𝒍 𝒄𝒐𝒔𝒕 = 15 + 35 + 60 + 95

= 𝟐𝟎𝟓

(or)

The external node ‘x4’ is at a distance 3 from the root node ‘Z4’. Therefore:

Ʃ 𝒅𝒊 ∗ 𝒙𝒊 = 5 ∗ 3 + 10 ∗ 3 + 20 ∗ 2 + 30 ∗ 2 + 30 ∗ 2

= 15 + 30 + 40 + 60 + 60 = 𝟐𝟎𝟓
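The merging procedure can be sketched in Python with a min-heap; the file sizes are those of the example above (5, 10, 20, 30, 30).

```python
import heapq

def optimal_merge_cost(sizes):
    """Repeatedly merge the two smallest files (2-way merging);
    returns the total record-movement cost."""
    heap = list(sizes)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)   # smallest file
        b = heapq.heappop(heap)   # second-smallest file
        total += a + b            # cost of this merge = combined record count
        heapq.heappush(heap, a + b)
    return total

cost = optimal_merge_cost([20, 30, 10, 5, 30])
# cost is 205, matching the tree computation above
```

With a heap the loop runs in O(n log n); the O(n²) bound stated below corresponds to the list-based version of the algorithm.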
Algorithm:

Time Complexity: 𝑶(𝒏𝟐 )

Application: Huffman Coding


Example:
Message → BCCABBDDAECCBBAEDDCC
• The length of the message = 20 characters.
• Stored with ASCII coding, each character needs 8 bits of memory, so 20 ∗ 8 = 𝟏𝟔𝟎 𝒃𝒊𝒕𝒔.
• With “Huffman Coding”, instead of fixed ASCII codes we use variable-length Huffman codes
constructed from a binary tree.
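As a sketch, the total Huffman-coded length of the message can be computed with the same two-way-merge idea: repeatedly merge the two least-frequent symbols, just as the optimal merge pattern merges the two smallest files.

```python
import heapq
from collections import Counter

def huffman_bits(message):
    """Total bits needed to encode `message` with an optimal Huffman code."""
    freqs = list(Counter(message).values())
    if len(freqs) == 1:
        return len(message)          # degenerate case: a single symbol
    heapq.heapify(freqs)
    total = 0
    while len(freqs) > 1:
        a = heapq.heappop(freqs)
        b = heapq.heappop(freqs)     # merge the two least-frequent subtrees
        total += a + b               # each merge adds one bit to every symbol below it
        heapq.heappush(freqs, a + b)
    return total

bits = huffman_bits("BCCABBDDAECCBBAEDDCC")
# bits is 45, versus 20 * 8 = 160 bits with fixed 8-bit ASCII codes
```

For this message the frequencies are A:3, B:5, C:6, D:4, E:2, and the Huffman code needs only 45 bits in total.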
