
Course Packet 9

Greedy Algorithm

Maria Lolita G. Masangcap


Introduction
This course packet introduces the greedy algorithm and explains its
importance in algorithm design.
It explains how this technique is applied in solving different types of
computing problems.
At the end of this activity, students are expected to apply greedy
algorithm concepts to the given problems.
Students are also expected to design an algorithm that implements the
greedy approach in solving a computing problem.
Introduction

At the end of this course packet, you will be able to illustrate and
implement the concept of the greedy algorithm technique in designing an
algorithm.

Design and Analysis of Algorithm
Greedy Algorithm

• Greedy algorithms are simple and straightforward.
• Most greedy algorithms are easy to develop, easy to implement, and time efficient.
• A greedy algorithm is an approach for solving a problem by selecting the best option
  available at the moment, without worrying whether the current best choice will bring
  the overall optimal result.
Greedy Method

Problem: Travel from school to market
Constraint: Reach the destination within 2 hours only!
Solutions: Must be in minimum cost

Optimization Problem: requires either a minimum or a maximum resource
Among all feasible solutions (those that satisfy the constraint), the one with the
minimum cost is the optimal solution.
Greedy Method

Problem: Travel from school to market
Constraint: Reach the destination within 2 hours only!
Optimization Problem: requires either a minimum or a maximum resource
Remember: Must be in minimum cost

On each step, the choice made must be:
• Feasible - it has to satisfy the problem's constraint
• Locally optimal - it has to be the best local choice among all
  feasible choices available on a specific step
• Irrevocable - once made, it cannot be changed on subsequent
  steps of the algorithm
Greedy Method

Greedy algorithms try to solve a problem by always making the choice that looks
best at the moment. Once the choice is made, it is not taken back even if a
better choice is found later.

The greedy method is a general algorithm design paradigm, built on the following
elements:
• Configurations (or states): the different choices, collections, or values to find
• Objective function: a score assigned to configurations, which we want to either
  maximize or minimize
Making Change

Problem: A peso amount to be paid back using a collection of coin denominations
Configuration: A peso amount yet to return to a customer, plus the coins already returned
Objective function: Minimize the number of coins returned
Greedy Solution: Always return the highest-value coin you can

At each step, we use the highest possible coin from the denominations.
In the end, we are able to reach the value of 93 by using just 5 coins.
Making Change
Issue with the Greedy Algorithm Approach

Problem: A peso amount to be paid back using a collection of coin denominations
Configuration: A peso amount yet to return to a customer, plus the coins already returned
Objective function: Minimize the number of coins returned
Greedy Solution: Always return the highest-value coin you can

Accordingly, using the greedy algorithm, we end up with the denominations
9, 1, 1 (3 coins) to reach the value of 11. However, there is a more optimal
solution: using the denominations 5 and 6, we can reach 11 with only 2 coins.
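A minimal sketch of the greedy change-making rule described above, using the denomination set {9, 6, 5, 1} from the counterexample; the function name is illustrative only.

```python
def greedy_change(amount, denominations):
    """Repeatedly take the highest coin that still fits (the greedy choice)."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        while amount >= coin:
            coins.append(coin)
            amount -= coin
    return coins

# Denominations taken from the counterexample above.
print(greedy_change(11, [9, 6, 5, 1]))  # [9, 1, 1] -> 3 coins
# The optimal answer for 11 here is [6, 5] -> 2 coins, so the locally best
# (greedy) choice does not always lead to the globally optimal result.
```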
Fractional Knapsack Problem

Ø Given: A set S of n items, with each item i having
  ü bi – positive benefit of all of item i available
  ü wi – positive weight of all of item i available
Ø Goal: Choose items with maximum total benefit but with weight at most W.
Ø If we are allowed to take fractional amounts, then this is the Fractional Knapsack
  Problem.
  ü Let xi ≤ wi denote the amount we take of item i
  ü Objective: maximize Σ i∈S  bi (xi / wi)
  ü Constraint: Σ i∈S  xi ≤ W
Fractional Knapsack Problem

Analysis: O(n log n) with sorting
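The O(n log n) bound above comes from sorting the items once. A minimal sketch of the usual greedy rule for this problem, taking items in decreasing benefit-to-weight ratio bi/wi and splitting only the last item; the item data in the example call are hypothetical.

```python
def fractional_knapsack(items, capacity):
    """items: list of (benefit, weight) pairs; capacity: the weight limit W.
    Greedy rule: take items in decreasing benefit/weight ratio, splitting only
    the last item taken. Sorting dominates the running time: O(n log n)."""
    items = sorted(items, key=lambda bw: bw[0] / bw[1], reverse=True)
    total_benefit = 0.0
    remaining = capacity
    for benefit, weight in items:
        if remaining <= 0:
            break
        take = min(weight, remaining)              # the amount x_i <= w_i we take
        total_benefit += benefit * (take / weight)
        remaining -= take
    return total_benefit

# Hypothetical items (benefit, weight) and capacity.
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```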


Text Compression
Files can often be compressed
• Represented using fewer bytes than the standard representation

Fixed-Length Encoding
• Somewhat wasteful, because some characters are more common than others
• If a character appears frequently, it should have a shorter representation.

Huffman Encoding
• By David Huffman, 1952
• A binary tree which is based on the probability distribution of a symbol set
• Greedy – shorter codes for more frequent symbols
• No code is a prefix of another code – easy decoding
Text Compression
Text = "beekeepers & bees"

Character   Fixed-Length Code   Huffman Code   Freq Count
b           000                 110            2
e           001                 0              7
k           010                 11110          1
p           011                 11111          1
r           100                 1011           1
s           101                 100            2
_           110                 1110           2
&           111                 1010           1

Fixed-Length Encoding:
000 001 001 010 001 001 011 001 100 101 110 111 110 000 001 001 101 (17 x 3 = 51 bits)

Huffman Encoding:
110 0 0 11110 0 0 11111 0 1011 100 1110 1010 1110 110 0 0 100 (45 bits)
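A quick check of the two totals above; the frequency counts and codes are taken from the table, only this small script is new.

```python
# Frequencies and Huffman codes from the table for "beekeepers & bees".
freq = {'b': 2, 'e': 7, 'k': 1, 'p': 1, 'r': 1, 's': 2, '_': 2, '&': 1}
huffman = {'b': '110', 'e': '0', 'k': '11110', 'p': '11111',
           'r': '1011', 's': '100', '_': '1110', '&': '1010'}

total_chars = sum(freq.values())                              # 17 characters
fixed_bits = total_chars * 3                                  # 17 x 3 = 51 bits
huffman_bits = sum(freq[c] * len(huffman[c]) for c in freq)   # 45 bits
print(fixed_bits, huffman_bits)                               # 51 45
```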
Huffman Encoding
Text = "ab ab cabe"

Character   Freq Count   Huffman Code
a           3            11
b           3            10
_           2            00
c           1            010
e           1            011

Huffman Tree
Huffman Encoding

Text = "ab ab cabe"

In a priority queue, items are ordered by key value so that the item with the
lowest key value is at the front and the item with the highest key value is at
the rear, or vice versa.
Huffman Encoding
Huffman Code Algorithm
1. Count the occurrences of each character in the file.

Text = “ab ab cabe”

2. Place characters and counts into a priority queue


§ Store a single character and its count as a Huffman node object
§ The priority queue will organize them into ascending order
Huffman Encoding
3. Create Huffman tree from the node counts.
Algorithm
• Put all node counts into a priority queue (PQ)
• While PQ size > 1
• Remove two rarest characters (with lowest freq)
• Combine into a single node with these two as its children
Huffman Encoding
3. Create the Huffman tree from the node counts.
Algorithm
• Put all node counts into a priority queue (PQ)
• While PQ size > 1
  • Remove the two rarest characters (with the lowest frequencies)
  • Combine them into a single node with these two as its children
  • Compute the sum of their counts, then insert the subtree into the
    priority queue
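A minimal sketch of steps 1–3 using Python's heapq module as the priority queue. Tie-breaking between equal counts can differ between implementations, so the resulting codes may differ from the slide's table while still being optimal.

```python
import heapq

def build_huffman_tree(text):
    """Build a Huffman tree by repeatedly merging the two rarest nodes (greedy choice)."""
    # Step 1: count the occurrences of each character.
    freq = {}
    for ch in text:
        freq[ch] = freq.get(ch, 0) + 1

    # Step 2: place (count, tiebreaker, node) entries into a priority queue.
    # A node is either a leaf character or a (left, right) pair of child nodes.
    pq = [(count, i, ch) for i, (ch, count) in enumerate(freq.items())]
    heapq.heapify(pq)

    # Step 3: while more than one node remains, remove the two rarest and combine them.
    next_id = len(pq)
    while len(pq) > 1:
        count1, _, left = heapq.heappop(pq)
        count2, _, right = heapq.heappop(pq)
        heapq.heappush(pq, (count1 + count2, next_id, (left, right)))
        next_id += 1
    return pq[0][2]      # the root of the Huffman tree

root = build_huffman_tree("ab ab cabe")
```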
Huffman Encoding
4. Traverse the tree to find the (char → binary) map.
The code for each character is determined by the path from the root to the
corresponding leaf.
• Right is 1; Left is 0
• Example
  • 'e' is left-right-right, so its code is 011
  • 'b' is right-left, so its code is 10
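A sketch of the traversal in step 4, assuming the tuple-based tree representation from the previous sketch; the example tree here is hand-built so that it reproduces the codes in the slide's table.

```python
def assign_codes(node, prefix="", codes=None):
    """Walk the tree: a left edge appends '0', a right edge appends '1';
    a leaf (single character) receives the accumulated prefix as its code."""
    if codes is None:
        codes = {}
    if isinstance(node, tuple):            # internal node: (left, right)
        left, right = node
        assign_codes(left, prefix + "0", codes)
        assign_codes(right, prefix + "1", codes)
    else:                                  # leaf character
        codes[node] = prefix
    return codes

# A hand-built tree matching the slide's codes for "ab ab cabe":
tree = ((' ', ('c', 'e')), ('b', 'a'))
print(assign_codes(tree))   # {' ': '00', 'c': '010', 'e': '011', 'b': '10', 'a': '11'}
```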
Huffman Encoding
5. For each character, convert it to its compressed binary version.
Based on the preceding tree, we have the following encodings:

Using this map, we can encode the text/file into a shorter binary
representation. The text “ab ab cabe” would be encoded as:

Overall, the text in its equivalent Huffman code is 1110001110000101110011.
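Using the code table from the slides for "ab ab cabe" (a = 11, b = 10, space = 00, c = 010, e = 011), the encoding step is just a lookup and concatenation:

```python
# The code table from the slides for "ab ab cabe".
codes = {'a': '11', 'b': '10', ' ': '00', 'c': '010', 'e': '011'}

encoded = "".join(codes[ch] for ch in "ab ab cabe")
print(encoded)       # 1110001110000101110011
print(len(encoded))  # 22 bits, versus 10 x 3 = 30 bits with a 3-bit fixed-length code
```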


Minimum Spanning Trees
Spanning Tree

• Subgraph tree that connects all nodes

Weight of a tree

• Sum of edge weights

Minimum Spanning Tree (MST)

• Given a weighted connected graph, an MST is a spanning tree that connects all nodes with the least total weight

Examples

• Circuitry – shortest length of pin connections to make them equivalent, weights are lengths
• Airports – least cost of inter-city flights to reach all cities; weight of each edge is ticket price
Minimum Spanning Trees
Two MST Greedy Algorithms

• Kruskal’s Algorithm
• Prim’s Algorithm
Kruskal's Algorithm

• Grows the MST by adding a least weighted edge that will not cause a cycle.
• This is a greedy algorithm, taking the cheapest available edge at any time
  that will not cause a cycle.
• Cycles can be detected by keeping track of sets of connected components.

Greedy using the Cycle Property
• If weight(f) > weight(e), we can get a spanning tree of smaller weight by
  replacing f with e.
Kruskal’s Algorithm

MST Total Weight


2 + 2 + 3 + 3 + 4 + 5 + 7 = 26
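A minimal sketch of Kruskal's algorithm with a union-find structure for cycle detection; the edge-list format (weight, u, v) and the tiny sample graph are illustrative, not the graph from the slide.

```python
def kruskal(num_vertices, edges):
    """edges: list of (weight, u, v). Greedy: take the cheapest edge that joins
    two different components, i.e. that does not create a cycle."""
    parent = list(range(num_vertices))

    def find(x):                            # representative of x's component
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    mst, total = [], 0
    for weight, u, v in sorted(edges):      # consider edges in increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                        # different components: no cycle
            parent[ru] = rv                 # merge the two components
            mst.append((u, v, weight))
            total += weight
    return mst, total

# Hypothetical 4-vertex graph.
edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]
print(kruskal(4, edges))  # ([(0, 1, 1), (1, 2, 2), (2, 3, 4)], 7)
```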
Kruskal’s Algorithm
Another Example to better understand Kruskal’s algorithm.

Step 1: Remove all loops and parallel edges.


In case of parallel edges, keep the one which has the least cost
associated and remove all others.
Kruskal’s Algorithm
Another Example to better understand Kruskal’s algorithm.

Step 2: Arrange all edges in their increasing


order of weight.
Create a set of edges and weights and arrange them in ascending order of cost.
Kruskal's Algorithm
Another Example to better understand Kruskal's algorithm.

Step 3: Add the edge which has the least cost.

The least cost is 2, and the edges involved are B,D and D,T. The next least cost
is 3, and the associated edges are A,C and C,D.
Kruskal's Algorithm
Another Example to better understand Kruskal's algorithm.

Step 3: Add the edge which has the least cost.

The next cost in the table is 4. However, adding it to the tree will create a
cycle, so just ignore it. We can also observe that the edges with costs 5 and 6
will create cycles, so just ignore them and move on.
Kruskal's Algorithm
Another Example to better understand Kruskal's algorithm.

Step 3: Add the edge which has the least cost.

To complete the spanning tree, only one edge must be added to connect all the
vertices. Between the two edges available (costs 7 and 8), add the one with the
minimum cost, which is the edge with cost 7.

MST Total Weight
2 + 2 + 3 + 3 + 7 = 17
Prim's Algorithm

• Grows the MST from a vertex by adding a least weighted edge joining a vertex
  in the MST to another vertex not in the MST.
• This is a greedy algorithm, taking the cheapest available edge to grow the tree.
• Start by picking any vertex to be the root of the tree.
• While the tree does not contain all vertices in the graph, find the shortest
  edge leaving the tree and add it to the tree.
Prim’s Algorithm

MST Total Weight


2 + 1 + 3 + 3 + 2 = 11
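A minimal sketch of Prim's algorithm using a heap of candidate edges; the adjacency-list format and the small sample graph are illustrative, not the graph from the slide.

```python
import heapq

def prim(adj, root=0):
    """adj: dict mapping vertex -> list of (weight, neighbor).
    Greedy: always add the cheapest edge joining the tree to a non-tree vertex."""
    in_tree = {root}
    heap = list(adj[root])            # candidate edges leaving the tree
    heapq.heapify(heap)
    total = 0
    while heap and len(in_tree) < len(adj):
        weight, v = heapq.heappop(heap)
        if v in in_tree:              # both endpoints already in the tree: skip
            continue
        in_tree.add(v)
        total += weight
        for edge in adj[v]:           # new candidate edges from the added vertex
            heapq.heappush(heap, edge)
    return total

# Hypothetical graph: edges 0-1 (1), 0-2 (3), 1-2 (2), 2-3 (4).
adj = {0: [(1, 1), (3, 2)], 1: [(1, 0), (2, 2)],
       2: [(3, 0), (2, 1), (4, 3)], 3: [(4, 2)]}
print(prim(adj))  # 7  (edges 0-1, 1-2, 2-3)
```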
Prim’s Algorithm
Another Example to better understand Prim’s algorithm.

Step 1: Remove all loops and parallel edges.


In case of parallel edges, keep the one which has the least cost
associated and remove all others.
Prim’s Algorithm
Another Example to better understand Prim’s algorithm.

Step 2: Choose any arbitrary node as root node


Let us choose vertex S as the root node of Prim’s spanning tree.
Prim's Algorithm
Another Example to better understand Prim's algorithm.

Step 3: Check outgoing edges and select the one with the least cost.

After choosing the root node S, we see that S,A and S,C are two edges with
weights 7 and 8, respectively. We choose the edge S,A as its weight is lower
than the other.

Check for all edges going out from the chosen vertices (S and A) and select the
one edge which has the lowest cost and include it in the tree.
Prim's Algorithm
Another Example to better understand Prim's algorithm.

Step 3: Check outgoing edges and select the one with the least cost.

Check for all edges going out from the chosen vertices (S, A, and C) and select
the one edge which has the lowest cost and include it in the tree.

After adding the node D to the spanning tree, there are two edges going out of
it having the same cost 2. Add these two edges, and the result is the final
spanning tree.

MST Total Weight
2 + 2 + 3 + 3 + 7 = 17
Shortest Path Problems
Given a weighted directed graph G(V, E), find:

• Single Pair – the shortest path from vertex u to vertex v.
• Single Destination – the shortest paths from each vertex in V to a vertex u.
• Single Source – Multiple Destinations – the shortest paths from vertex u to
  all other vertices in V.
• All Pairs – the shortest paths from each vertex to every other vertex.
Shortest Path Problems
Applications

Find the shortest path between two points.

Find the shortest time from start to finish of a project


given a list of dependent tasks

Minimize cost to transmit data to multiple locations


Dijkstra's Algorithm
Given: a weighted graph
Problem: Find the shortest paths from a to every vertex in the graph

Dijkstra:
ü This greedy algorithm solves the single-source multiple-destinations shortest
  path problem, assuming edge weights are all positive.
ü Grows the shortest-path tree from a vertex by adding the nearest node not yet
  in the tree.
ü Checks for improvement of paths to non-tree vertices from the newly added
  nodes in the tree.
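A minimal sketch of Dijkstra's algorithm using a binary heap; the adjacency-list format and the small sample graph are illustrative, not the graph shown in the slides.

```python
import heapq

def dijkstra(adj, source):
    """adj: dict mapping vertex -> list of (neighbor, weight), all weights positive.
    Greedy: repeatedly settle the non-tree vertex with the smallest known distance."""
    dist = {v: float("inf") for v in adj}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:               # stale entry: u was already settled
            continue
        for v, w in adj[u]:           # try to improve paths through u
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# Hypothetical graph.
adj = {'a': [('b', 2), ('c', 5)], 'b': [('c', 1), ('d', 4)],
       'c': [('d', 1)], 'd': []}
print(dijkstra(adj, 'a'))  # {'a': 0, 'b': 2, 'c': 3, 'd': 4}
```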
Dijkstra’s Algorithm
Given:

Problem:
Find the shortest paths from f to every
vertex in the graph
Dijkstra's Algorithm
Given: a weighted graph
Problem: Find the shortest paths from f to every vertex in the graph
Q&A

Any questions?
CP9 Greedy Algorithm
