[CS203] Design and Analysis of Algorithms
Course Instructor: Dr. Dibyendu Roy Autumn 2023-24
Scribed by: Om N. Patel (202252329) Lecture (Week 12)
Kruskal’s Algorithm
Definition
A minimum spanning tree (MST) of a connected, undirected graph is a subgraph
that is a tree and connects all the vertices together with the minimum possible
sum of edge weights.
Algorithm
function kruskal(G):
    sort all edges in G by their weights in non-decreasing order
    initialize an empty edge set MST
    for each vertex v in G:
        make-set(v)                        // one disjoint set per vertex
    for each edge (u, v) in the sorted order:
        if find(u) is not equal to find(v):    // u and v lie in different trees
            union(u, v)                        // merge the sets containing u and v
            add edge (u, v) to MST
    return MST
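The pseudocode above translates directly into a short runnable program. The Python sketch below is added for illustration and is not part of the original notes; it assumes the graph is given as a vertex count n and a list of (weight, u, v) tuples, and it implements the disjoint-set structure with union by rank and path compression.

# Illustrative Python sketch of Kruskal's algorithm (not from the notes).
# Assumes the graph is a vertex count n and a list of (weight, u, v) edges.
def kruskal(n, edges):
    parent = list(range(n))
    rank = [0] * n

    def find(x):
        while parent[x] != x:              # path compression by halving
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra == rb:
            return False                   # already in the same tree
        if rank[ra] < rank[rb]:
            ra, rb = rb, ra
        parent[rb] = ra                    # attach the shorter tree under the taller one
        if rank[ra] == rank[rb]:
            rank[ra] += 1
        return True

    mst = []
    for w, u, v in sorted(edges):          # non-decreasing order of weight
        if union(u, v):                    # edge joins two different components
            mst.append((u, v, w))
    return mst

# Example: a 4-vertex graph; the MST has total weight 1 + 2 + 3 = 6.
edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 0, 3), (5, 0, 2)]
print(kruskal(4, edges))                   # [(0, 1, 1), (1, 2, 2), (2, 3, 3)]

Sorting dominates the running time; the union-find operations are nearly constant amortized, which matches the bound stated next.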
Time Complexity
Sorting the edges takes O(E log E) = O(E log V) time, and the subsequent union-find operations take nearly constant amortized time, so the total is O(E log V), where E is the number of edges and V is the number of vertices.
Space Complexity
O(V + E) for the disjoint-set data structure and the MST, where V is the
number of vertices and E is the number of edges.
Mathematical Derivation
The algorithm considers edges in non-decreasing order of weight and accepts an edge only if it joins two different components, so no cycle is ever created. Each accepted edge e1, e2, . . . , ek is the minimum-weight edge crossing the cut between the two components it joins; by the cut property, every such edge belongs to some minimum spanning tree. Since the algorithm accepts exactly |V| - 1 such edges, the tree it returns is a minimum spanning tree.
Prim’s Algorithm
Definition
Like Kruskal's algorithm, Prim's algorithm finds a minimum spanning tree of a connected, weighted, undirected graph. It grows a single tree from an arbitrary start vertex, repeatedly adding the cheapest edge that connects the tree to a vertex outside it.
Algorithm
function prim(G, start):
    for each vertex v in G:
        key[v] = INFINITY; parent[v] = NIL
    key[start] = 0
    create a min-priority queue Q containing all vertices, keyed by key[]
    while Q is not empty:
        u = extractMin(Q)                  // u joins the growing tree
        for each neighbor v of u:
            if v is in Q and weight(u, v) < key[v]:
                key[v] = weight(u, v)      // cheapest known edge from the tree to v
                parent[v] = u
                decreaseKey(Q, v, key[v])
    return the edges (parent[v], v) for every vertex v other than start
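For a runnable version, the Python sketch below is illustrative only and not part of the original notes; it assumes the graph is an adjacency-list dictionary mapping each vertex to (neighbor, weight) pairs, and it replaces decreaseKey with lazy deletion on Python's heapq, which leaves the asymptotic bound unchanged.

import heapq

# Illustrative Python sketch of Prim's algorithm (not from the notes).
# 'graph' is assumed to be {vertex: [(neighbor, weight), ...]}.
def prim(graph, start):
    visited = set()
    mst = []
    heap = [(0, start, start)]             # (edge weight, tree vertex, candidate vertex)
    while heap and len(visited) < len(graph):
        w, parent, u = heapq.heappop(heap)
        if u in visited:
            continue                       # stale entry (lazy deletion)
        visited.add(u)
        if u != start:
            mst.append((parent, u, w))
        for v, weight in graph[u]:
            if v not in visited:
                heapq.heappush(heap, (weight, u, v))
    return mst

# Example: same 4-vertex graph as before; the MST weight is again 6.
graph = {0: [(1, 1), (3, 4), (2, 5)], 1: [(0, 1), (2, 2)],
         2: [(1, 2), (3, 3), (0, 5)], 3: [(2, 3), (0, 4)]}
print(prim(graph, 0))                      # [(0, 1, 1), (1, 2, 2), (2, 3, 3)]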
Time Complexity
O(E log V) with a binary-heap priority queue, since each extract-min and decrease-key operation costs O(log V); a Fibonacci heap improves this to O(E + V log V).
Space Complexity
O(V + E) for the adjacency list, plus O(V) for the key, parent, and priority-queue structures.
Mathematical Derivation
Let S be the set of vertices already added to the tree, and let e1, e2, . . . , ek be the edges selected by Prim's algorithm. At each step the algorithm chooses the minimum-weight edge crossing the cut (S, V \ S). By the cut property, every such edge belongs to some minimum spanning tree, so the selected edges form a minimum spanning tree.
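For reference, the cut property invoked in both derivations can be stated precisely; the formulation below is a standard one added for illustration and is not part of the original notes.

\textbf{Cut property.} Let $G = (V, E)$ be a connected weighted graph and let
$(S,\, V \setminus S)$ be any cut. If $e$ is a minimum-weight edge crossing the
cut, then some minimum spanning tree of $G$ contains $e$.
% Proof sketch: if an MST $T$ omits $e$, then $T \cup \{e\}$ contains a cycle
% that crosses the cut at another edge $e'$ with $w(e') \ge w(e)$, and
% $T' = T \cup \{e\} \setminus \{e'\}$ is a spanning tree of no greater weight.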
Bellman-Ford Algorithm
Definition
The Bellman-Ford algorithm computes single-source shortest paths in a weighted directed graph. Unlike Dijkstra's algorithm, it allows negative edge weights and detects negative-weight cycles reachable from the source.
Algorithm
function bellmanFord(G, source):
    initialize distance[v] = INFINITY and predecessor[v] = NIL for each vertex v
    distance[source] = 0
    // Relax every edge |V| - 1 times
    for i from 1 to |V| - 1:
        for each edge (u, v) in G:
            if distance[u] + weight(u, v) < distance[v]:
                distance[v] = distance[u] + weight(u, v)
                predecessor[v] = u
    // A further improvement means a negative-weight cycle is reachable
    for each edge (u, v) in G:
        if distance[u] + weight(u, v) < distance[v]:
            return "Negative cycle detected"
    return distance, predecessor
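A direct Python translation is sketched below for illustration (not from the notes); it assumes the graph is given as a vertex count n and a list of (u, v, weight) edges, and it returns (None, None) when a reachable negative-weight cycle is detected.

import math

# Illustrative Python sketch of Bellman-Ford (not from the notes).
# 'edges' is assumed to be a list of (u, v, weight) tuples over vertices 0..n-1.
def bellman_ford(n, edges, source):
    distance = [math.inf] * n
    predecessor = [None] * n
    distance[source] = 0
    for _ in range(n - 1):                 # relax every edge n - 1 times
        updated = False
        for u, v, w in edges:
            if distance[u] + w < distance[v]:
                distance[v] = distance[u] + w
                predecessor[v] = u
                updated = True
        if not updated:                    # early exit: distances have converged
            break
    for u, v, w in edges:                  # one more pass detects negative cycles
        if distance[u] + w < distance[v]:
            return None, None
    return distance, predecessor

# Example with a negative edge but no negative cycle.
edges = [(0, 1, 4), (0, 2, 5), (1, 2, -3), (2, 3, 2)]
print(bellman_ford(4, edges, 0))           # distances: [0, 4, 1, 3]

The early exit when no edge improves is a common optimization; in the worst case the algorithm still performs |V| - 1 full passes.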
Time Complexity
O(VE) where V is the number of vertices and E is the number of edges.
Space Complexity
O(V) for the distances and predecessors arrays.
Mathematical Derivation
Let dist[v] be the shortest distance from the source vertex s to vertex v. After the i-th round of relaxing every edge, dist[v] is at most the weight of the shortest path from s to v that uses at most i edges. Since a shortest path in a graph without negative cycles contains at most |V| - 1 edges, the distances are correct after |V| - 1 rounds. If some edge can still be relaxed in a |V|-th pass, a negative-weight cycle is reachable from the source.
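The inductive invariant behind this argument can be written out; the statement below is added for reference and is not part of the original notes.

% Invariant after round $i$ of relaxations, for every vertex $v$:
\mathrm{dist}_i[v] \;\le\; \min\{\, w(P) \;:\; P \text{ is an } s\text{-}v \text{ path with at most } i \text{ edges} \,\}
% A shortest path uses at most $|V| - 1$ edges, so $|V| - 1$ rounds suffice.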
Dijkstra’s Algorithm
Definition
Dijkstra's algorithm computes single-source shortest paths in a graph with non-negative edge weights; it is widely used in routing and as a subroutine in other graph algorithms.
Algorithm
function dijkstra(G, source):
    initialize distance[v] = INFINITY and predecessor[v] = NIL for each vertex v
    distance[source] = 0
    create a min-priority queue Q and insert all vertices keyed by distance[]
    while Q is not empty:
        u = extractMin(Q)                  // distance[u] is now final
        for each neighbor v of u:
            alt = distance[u] + weight(u, v)
            if alt < distance[v]:
                distance[v] = alt
                predecessor[v] = u
                decreaseKey(Q, v, alt)
    return distance, predecessor
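Python's heapq has no decreaseKey operation, so the runnable sketch below (illustrative, not from the notes) pushes a new heap entry on every improvement and skips stale entries on extraction; the graph is assumed to be an adjacency-list dictionary of (neighbor, weight) pairs with non-negative weights.

import heapq, math

# Illustrative Python sketch of Dijkstra's algorithm (not from the notes).
# 'graph' is assumed to be {vertex: [(neighbor, weight), ...]} with weights >= 0.
def dijkstra(graph, source):
    distance = {v: math.inf for v in graph}
    predecessor = {v: None for v in graph}
    distance[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > distance[u]:
            continue                       # stale entry, already improved
        for v, w in graph[u]:
            alt = d + w
            if alt < distance[v]:
                distance[v] = alt
                predecessor[v] = u
                heapq.heappush(heap, (alt, v))
    return distance, predecessor

# Example: shortest distances from vertex 0.
graph = {0: [(1, 4), (2, 1)], 1: [(3, 1)], 2: [(1, 2), (3, 5)], 3: []}
print(dijkstra(graph, 0))                  # distances: {0: 0, 1: 3, 2: 1, 3: 4}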
Time Complexity
O((V + E) log V) with a binary-heap priority queue; a Fibonacci heap improves this to O(E + V log V), where V is the number of vertices and E is the number of edges.
Space Complexity
O(V) for the distances and predecessors arrays, and O(V + E) for the priority
queue and adjacency list.
Mathematical Derivation
Dijkstra's algorithm maintains a set of vertices whose shortest distances are already known. At each step it extracts the vertex with the minimum tentative distance and relaxes the edges leaving it. Because edge weights are non-negative, no later path can improve the distance of an extracted vertex, so the algorithm computes correct shortest paths whenever all edge weights are non-negative.
Traveling Salesman Problem (TSP)
Explanation
The Traveling Salesman Problem is a classic optimization problem: given pairwise distances between a set of cities, find the shortest possible route that visits every city exactly once and returns to the starting city.
Algorithm
function tsp(graph, start):
    let n be the number of vertices in graph
    let mask_all_visited = (1 << n) - 1
    initialize memoization table memo[2^n][n] with -1

    function tsp_util(mask, pos):
        if mask == mask_all_visited:
            return graph[pos][start]       // all cities visited: return to the start
        if memo[mask][pos] != -1:
            return memo[mask][pos]
        min_cost = INFINITY
        for next_city in range(n):
            if (mask & (1 << next_city)) == 0:     // next_city not yet visited
                new_mask = mask | (1 << next_city)
                cost = graph[pos][next_city] + tsp_util(new_mask, next_city)
                min_cost = min(min_cost, cost)
        memo[mask][pos] = min_cost
        return min_cost

    return tsp_util(1 << start, start)
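This bitmask dynamic program (the Held-Karp approach) runs in O(n^2 * 2^n) time and uses O(n * 2^n) memory. The Python sketch below is added for illustration and assumes graph is an n x n matrix of pairwise distances; memoization over (mask, pos) is handled by functools.lru_cache.

from functools import lru_cache
import math

# Illustrative Python sketch of the bitmask DP for TSP (not from the notes).
# 'graph' is assumed to be an n x n matrix of pairwise distances.
def tsp(graph, start=0):
    n = len(graph)
    all_visited = (1 << n) - 1

    @lru_cache(maxsize=None)               # memoization over (mask, pos)
    def tsp_util(mask, pos):
        if mask == all_visited:
            return graph[pos][start]       # close the tour
        best = math.inf
        for nxt in range(n):
            if not mask & (1 << nxt):      # nxt has not been visited yet
                cost = graph[pos][nxt] + tsp_util(mask | (1 << nxt), nxt)
                best = min(best, cost)
        return best

    return tsp_util(1 << start, start)

# Example: 4 cities; the optimal tour 0 -> 1 -> 3 -> 2 -> 0 costs 80.
graph = [[0, 10, 15, 20],
         [10, 0, 35, 25],
         [15, 35, 0, 30],
         [20, 25, 30, 0]]
print(tsp(graph))                          # 80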
Floyd Warshall Algorithm
function floydWarshall(graph):
    let n be the number of vertices in graph
    initialize dist[i][j] = weight(i, j) for every edge (i, j),
               dist[i][i] = 0, and dist[i][j] = INFINITY otherwise
    for each vertex k from 1 to n:
        for each vertex i from 1 to n:
            for each vertex j from 1 to n:
                if dist[i][j] > dist[i][k] + dist[k][j]:    // going through k is shorter
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
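Floyd-Warshall computes all-pairs shortest paths in O(n^3) time and O(n^2) space. The Python sketch below is added for illustration; it assumes the input is an n x n matrix with 0 on the diagonal and math.inf where there is no direct edge.

import math

# Illustrative Python sketch of Floyd-Warshall (not from the notes).
# 'graph' is assumed to be an n x n matrix with 0 on the diagonal and
# math.inf where there is no direct edge.
def floyd_warshall(graph):
    n = len(graph)
    dist = [row[:] for row in graph]       # copy so the input matrix is untouched
    for k in range(n):                     # allow vertices 0..k as intermediates
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Example: 4 vertices; the shortest 0 -> 3 path goes 0 -> 1 -> 2 -> 3 with cost 6.
INF = math.inf
graph = [[0, 3, INF, 7],
         [8, 0, 2, INF],
         [5, INF, 0, 1],
         [2, INF, INF, 0]]
print(floyd_warshall(graph)[0][3])         # 6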