
All Algorithms

Graph Algorithms:
• BFS
• DFS
• Dijkstra’s Algorithm
• Warshall’s Algorithm
• Kruskal’s Algorithm
• Prim’s Algorithm
Searching Algorithms:
• Linear Search
• Binary Search & BST
• BFS*
• DFS*
Sorting Algorithms:
• Bubble Sort
• Selection Sort
• Insertion Sort
• Quick Sort
• Merge Sort
• Heap Sort
• Radix Sort
More:
• Greedy Algorithm
• Knapsack Problem
• Dynamic Programming
• Recursive Procedure
Graph Algorithms (Bard Style)
Breadth-First Search (BFS)
1. Start at a vertex and add it to a queue.
2. Mark the vertex as visited.
3. While the queue is not empty:
o Remove the first vertex from the queue.
o For each of its unvisited neighbors:
 Add the neighbor to the queue.
 Mark the neighbor as visited.
Depth-First Search (DFS)
1. Start at a vertex and push it onto a stack.
2. Mark the vertex as visited.
3. While the stack is not empty:
o Pop a vertex from the stack.
o For each of its unvisited neighbors:
 Push the neighbor onto the stack.
 Mark the neighbor as visited.
Dijkstra's Algorithm
1. Initialize a distance array to infinity for all vertices.
2. Set the distance to the source vertex to 0.
3. Create a priority queue and add the source vertex to it.
4. While the priority queue is not empty:
o Remove the vertex with the minimum distance from the
priority queue.
o For each of its unvisited neighbors:
 If the distance to the neighbor is greater than the distance
to the current vertex plus the weight of the edge between
them:
 Update the distance to the neighbor.
 Add the neighbor to the priority queue.
Warshall's Algorithm
1. Initialize a reachability matrix: set the entry for (v, w) to 1 if there
is an edge from v to w (and from every vertex to itself), otherwise 0.
2. For each vertex k, treated in turn as an intermediate vertex:
o For each pair of vertices (v, w): if v can reach k and k can reach w,
mark (v, w) as reachable.
3. The final matrix is the transitive closure of the graph.
Kruskal's Algorithm
1. Place each vertex in its own disjoint set.
2. Create a priority queue of all the edges in the graph, ordered by weight.
3. While the spanning tree does not yet connect all the vertices:
o Remove the edge with the minimum weight from the priority
queue.
o If the two endpoints of the edge are not in the same set:
 Add the edge to the minimum spanning tree.
 Merge the two sets containing the endpoints of the edge.
Prim's Algorithm
1. Initialize an empty set of vertices that have already been added to the
minimum spanning tree.
2. Add any starting vertex to the set.
3. While the set does not yet contain every vertex:
o Among all edges (v, w) with v in the set and w not in the set, choose
the one with the minimum weight.
o Add that edge to the minimum spanning tree.
o Add w to the set.

Searching Algorithms
Linear Search
1. Start at the beginning of the list.
2. Compare the value to the current element in the list.
3. If the value is equal to the current element, return the index of the
element.
4. If the value is not equal to the current element, move on to the next
element in the list.
5. Repeat steps 2-4 until the end of the list is reached.
6. If the value is not found, return -1.
Binary Search
1. Find the middle element of the list.
2. Compare the value to the middle element.
3. If the value is equal to the middle element, return the index of the
element.
4. If the value is less than the middle element, recursively search the
left half of the list.
5. If the value is greater than the middle element, recursively search
the right half of the list.
6. Repeat steps 1-5 on the chosen half until the value is found or the
search range becomes empty.
Binary Search Trees (BST)
1. Start at the root node of the BST.
2. Compare the key to the value of the root node.
o If the key is equal to the value of the root node, return the
node.
o If the key is less than the value of the root node, recursively
search the left subtree.
o If the key is greater than the value of the root
node, recursively search the right subtree.
3. Repeat step 2 down the tree until the key is found or there is no
subtree left to search.
4. If the key is not found, return None.
Sorting Algorithms
Bubble Sort
1. Compare the first two elements in the list.
2. If the first element is greater than the second element, swap their
positions.
3. Repeat steps 1-2 for the next two elements in the list.
4. Continue repeating steps 1-3 until the end of the list is reached.
5. Repeat steps 1-4 until no more swaps are made.
Selection Sort
1. Find the smallest element in the list.
2. Swap the smallest element with the first element in the list.
3. Find the smallest element in the list, starting from the second
element.
4. Swap the smallest element with the second element.
5. Repeat steps 3-4 until the end of the list is reached.
Insertion Sort
1. Start at the second element in the list.
2. Compare the current element to the element before it.
3. While the current element is less than the element before it, swap them,
moving the current element toward the front of the list.
4. Move on to the next element in the list.
5. Repeat steps 2-4 until the end of the list is reached.
Quick Sort
1. Choose a pivot element from the list.
2. Partition the list around the pivot element.
3. Recursively sort the left and right sublists.
Merge Sort
1. Divide the list in half.
2. Recursively sort the left and right sublists.
3. Merge the sorted sublists together.
Heap Sort
1. Build a max heap from the list.
2. Swap the root (the maximum) with the last element of the heap and shrink
the heap by one, placing the maximum at the end of the list.
3. Sift down the new root to restore the heap property.
4. Repeat steps 2-3 until the heap is empty and the list is sorted.
Radix Sort
1. Find the maximum value in the list to determine how many digits must be
processed.
2. For each digit position, from least significant to most significant:
o Create a bucket for each possible digit (0-9 in base 10).
o Distribute the elements into the buckets according to the current digit.
o Collect the elements back from the buckets in order, preserving their
relative order (a stable pass).
3. After the most significant digit has been processed, the list is sorted.
More
Greedy Algorithm
1. Sort the items by their profit-to-weight ratio.
2. Starting with the item with the highest profit-to-weight ratio, add
the item to the knapsack if it fits.
3. Repeat step 2 until the knapsack is full or there are no more items
to add.

Knapsack Problem
1. Initialize a table to store the maximum profit that can be achieved
with a knapsack of a given capacity.
2. For each item and each capacity:
o If the item fits in the knapsack, take the larger of: the profit of the
item plus the best profit for the remaining capacity, or the best
profit without the item.
o If the item does not fit in the knapsack, keep the best profit without
the item.
3. Return the maximum profit.

Dynamic Programming
1. Break the problem down into smaller subproblems.
2. Solve the subproblems recursively.
3. Store the solutions to the subproblems in a table.
4. Use the solutions to the subproblems to solve the original problem.

Recursive Procedure
1. Define the base case.
2. Define the recursive step.
3. Write the recursive procedure.

Here are some examples of how these algorithms can be used:


• The greedy algorithm can be used to solve the knapsack problem.
• Dynamic programming can be used to solve the knapsack problem,
the longest common subsequence problem, and the shortest path
problem.
• Recursive procedures can be used to compute the factorial function
and the Fibonacci sequence, and to solve the Tower of Hanoi problem.
Graph Algorithms (ChatGPT Style)
1. Breadth-First Search (BFS):
• Start with a source vertex.
• Explore all its neighboring vertices at the same level before
moving to the next level.
• Maintain a queue to keep track of the vertices to visit next.
• Repeat the process until all vertices are visited or the
destination vertex is reached.
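A minimal Python sketch of this traversal, assuming the graph is given as
an adjacency-list dictionary (the names graph and bfs are illustrative):

from collections import deque

def bfs(graph, source):
    # graph: dict mapping each vertex to an iterable of its neighbors
    visited = {source}               # mark the source as visited
    queue = deque([source])          # queue of vertices to visit next
    order = []                       # vertices in the order they are visited
    while queue:
        vertex = queue.popleft()     # take the next vertex at the current level
        order.append(vertex)
        for neighbor in graph[vertex]:
            if neighbor not in visited:
                visited.add(neighbor)    # mark the neighbor as visited
                queue.append(neighbor)   # enqueue it for the next level
    return order

For example, bfs({'A': ['B', 'C'], 'B': ['D'], 'C': [], 'D': []}, 'A')
returns ['A', 'B', 'C', 'D'].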
2. Depth-First Search (DFS):
• Start with a source vertex.
• Explore as far as possible along each branch before
backtracking.
• Maintain a stack to keep track of the vertices to visit next.
• Repeat the process until all vertices are visited or the
destination vertex is reached.
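The same idea with an explicit stack instead of a queue (a sketch, not the
notes' own code):

def dfs(graph, source):
    visited = {source}
    stack = [source]                 # stack of vertices to visit next
    order = []
    while stack:
        vertex = stack.pop()         # explore as far as possible before backtracking
        order.append(vertex)
        for neighbor in graph[vertex]:
            if neighbor not in visited:
                visited.add(neighbor)
                stack.append(neighbor)
    return order

A recursive version that calls itself on each unvisited neighbor is
equivalent.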
3. Dijkstra's Algorithm:
• Initialize the distances of all vertices as infinite except for the
source vertex, which is set to 0.
• Select the vertex with the smallest distance from the set of
unvisited vertices and mark it as visited.
• Update the distances of adjacent vertices based on the current
vertex, taking into account the edge weights.
• Repeat the process until all vertices are visited or the
destination vertex is reached.
• The final distances will represent the shortest paths from the
source vertex to all other vertices.
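A sketch using Python's heapq module as the priority queue; the graph
format (vertex mapped to a list of (neighbor, weight) pairs) is an
assumption:

import heapq

def dijkstra(graph, source):
    dist = {v: float('inf') for v in graph}   # all distances start as infinite
    dist[source] = 0                          # except the source, which is 0
    pq = [(0, source)]                        # priority queue of (distance, vertex)
    while pq:
        d, vertex = heapq.heappop(pq)         # unvisited vertex with smallest distance
        if d > dist[vertex]:
            continue                          # stale entry: a shorter path was found
        for neighbor, weight in graph[vertex]:
            if dist[vertex] + weight < dist[neighbor]:
                dist[neighbor] = dist[vertex] + weight         # relax the edge
                heapq.heappush(pq, (dist[neighbor], neighbor))
    return dist

The stale-entry check plays the role of marking vertices as visited.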
4. Warshall's Algorithm:
• Create a matrix to represent the reachability between pairs of
vertices.
• Initialize the matrix with the adjacency matrix of the graph.
• Perform a series of iterations to update the matrix by
considering all possible intermediate vertices.
• At each iteration, check whether one vertex can reach another
through the intermediate vertex.
• Repeat the process until all pairs of vertices have been
considered.
• The final matrix will represent the transitive closure of the
graph.
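A sketch of the transitive-closure computation on a 0/1 adjacency matrix
(function and parameter names are illustrative):

def warshall(adj):
    # adj: square 0/1 matrix, adj[v][w] == 1 if there is an edge from v to w
    n = len(adj)
    reach = [row[:] for row in adj]    # start from the adjacency matrix
    for k in range(n):                 # k is the intermediate vertex
        for v in range(n):
            for w in range(n):
                if reach[v][k] and reach[k][w]:
                    reach[v][w] = 1    # v can reach w through k
    return reach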
5. Kruskal's Algorithm:
• Sort all the edges in ascending order of their weights.
• Start with an empty spanning tree.
• Consider the edges in the sorted order and add them to the
spanning tree if they do not form a cycle.
• Repeat the process until all vertices are included in the
spanning tree or all edges have been considered.
• The final spanning tree will be a minimum spanning tree.
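A sketch with a small disjoint-set (union-find) structure used for the
cycle check; the edge format (weight, u, v) is an assumption:

def kruskal(num_vertices, edges):
    parent = list(range(num_vertices))     # each vertex starts in its own set

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    mst = []
    for weight, u, v in sorted(edges):     # edges in ascending order of weight
        root_u, root_v = find(u), find(v)
        if root_u != root_v:               # adding the edge creates no cycle
            parent[root_u] = root_v        # merge the two sets
            mst.append((u, v, weight))     # add the edge to the spanning tree
    return mst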
6. Prim's Algorithm:
• Start with an arbitrary vertex as the initial tree.
• Select the minimum-weight edge connected to the tree at each
step.
• Add the selected edge and its associated vertex to the tree.
• Repeat the process until all vertices are included in the tree.
• The final tree will be a minimum spanning tree.
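A heap-based sketch, assuming the same adjacency-list-with-weights format
used in the Dijkstra sketch above:

import heapq

def prim(graph):
    start = next(iter(graph))                  # arbitrary starting vertex
    in_tree = {start}
    candidates = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(candidates)                  # edges leaving the tree, keyed by weight
    mst = []
    while candidates and len(in_tree) < len(graph):
        weight, u, v = heapq.heappop(candidates)   # minimum-weight edge leaving the tree
        if v in in_tree:
            continue
        in_tree.add(v)                         # add the new vertex to the tree
        mst.append((u, v, weight))             # add the selected edge
        for neighbor, w in graph[v]:
            if neighbor not in in_tree:
                heapq.heappush(candidates, (w, v, neighbor))
    return mst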
Searching Algorithms
1. Linear Search:
• Start from the beginning of the list/array.
• Compare each element with the target element.
• If a match is found, return the index of the element.
• If the target element is not found, continue searching until the
end of the list/array.
• Return a "not found" indicator if the target element is not
present in the list/array.
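As a one-function sketch (names are illustrative):

def linear_search(items, target):
    for index, element in enumerate(items):   # start from the beginning
        if element == target:                 # compare with the target element
            return index                      # match found: return the index
    return -1                                 # "not found" indicator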
2. Binary Search:
• Binary search is applicable only to a sorted list/array.
• Compare the target element with the middle element of the
list/array.
• If the target element matches the middle element, return the
index.
• If the target element is smaller, repeat the search process on
the left half of the list/array.
• If the target element is larger, repeat the search process on the
right half of the list/array.
• Continue dividing the list/array in half and repeating the
search until the target element is found or the search range
becomes empty.
• Return a "not found" indicator if the target element is not
present in the sorted list/array.
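An iterative sketch; it assumes the input list is already sorted:

def binary_search(sorted_items, target):
    low, high = 0, len(sorted_items) - 1
    while low <= high:                        # search range is not yet empty
        mid = (low + high) // 2               # middle of the current range
        if sorted_items[mid] == target:
            return mid
        elif target < sorted_items[mid]:
            high = mid - 1                    # repeat on the left half
        else:
            low = mid + 1                     # repeat on the right half
    return -1                                 # "not found" indicator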
3. Binary Search Tree (BST):
• A BST is a binary tree where each node has a key (value) that is
greater than all keys in its left subtree and smaller than all keys in
its right subtree.
• To search for an element in a BST:
• Start at the root node.
• If the target element is equal to the current node's key, return
the node.
• If the target element is smaller, move to the left child of the
current node.
• If the target element is larger, move to the right child of the
current node.
• Repeat this process until the target element is found or a leaf
node is reached (indicating the target element is not in the
BST).
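A sketch with a minimal node class (the class and field names are
assumptions, not a fixed API):

class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def bst_search(node, key):
    while node is not None:
        if key == node.key:
            return node                       # key found
        # smaller keys go left, larger keys go right
        node = node.left if key < node.key else node.right
    return None                               # key is not in the BST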
Sorting Algorithms
1. Bubble Sort:
• Compare adjacent elements in the list/array.
• Swap them if they are in the wrong order.
• Repeat this process for each pair of adjacent elements, moving
from the beginning to the end of the list/array.
• After each iteration, the largest element "bubbles" to the end
of the list/array.
• Repeat the process until the list/array is sorted.
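A sketch that sorts the list in place and stops once a full pass makes no
swaps:

def bubble_sort(items):
    n = len(items)
    swapped = True
    while swapped:                            # repeat until no swaps are made
        swapped = False
        for i in range(n - 1):
            if items[i] > items[i + 1]:       # adjacent pair in the wrong order
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
    return items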
2. Selection Sort:
• Divide the list/array into two parts: the sorted portion and the
unsorted portion.
• Find the smallest (or largest) element in the unsorted portion.
• Swap it with the first element of the unsorted portion, placing
it in the correct position in the sorted portion.
• Move the boundary between the sorted and unsorted portions
one element to the right.
• Repeat the process until the entire list/array is sorted.
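A sketch of the same steps:

def selection_sort(items):
    n = len(items)
    for i in range(n - 1):                    # boundary of the sorted portion
        smallest = i
        for j in range(i + 1, n):             # smallest element in the unsorted portion
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]   # swap it into place
    return items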
3. Insertion Sort:
• Divide the list/array into two parts: the sorted portion and the
unsorted portion.
• Take the first element from the unsorted portion and insert it
into the correct position within the sorted portion, shifting
elements as needed.
• Move the boundary between the sorted and unsorted portions
one element to the right.
• Repeat the process until the entire list/array is sorted.
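A sketch; shifting larger elements to the right is the usual way of making
room for the inserted element:

def insertion_sort(items):
    for i in range(1, len(items)):            # items[:i] is the sorted portion
        current = items[i]
        j = i - 1
        while j >= 0 and items[j] > current:  # shift larger elements right
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = current                # insert into its correct position
    return items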
4. Quick Sort:
• Choose a pivot element from the list/array.
• Partition the list/array into two subarrays: elements smaller
than the pivot and elements larger than the pivot.
• Recursively apply the quicksort algorithm to the two
subarrays.
• Concatenate the sorted subarrays along with the pivot element
to obtain the final sorted list/array.
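A short, non-in-place sketch that partitions with list comprehensions:

def quick_sort(items):
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]            # choose a pivot element
    left = [x for x in items if x < pivot]    # elements smaller than the pivot
    equal = [x for x in items if x == pivot]
    right = [x for x in items if x > pivot]   # elements larger than the pivot
    return quick_sort(left) + equal + quick_sort(right)

In-place variants partition within the original array rather than building
new lists.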
5. Merge Sort:
• Divide the list/array into two halves.
• Recursively apply the merge sort algorithm to each half,
dividing them further if needed.
• Merge the two sorted halves back together to form a single
sorted list/array.
• Repeat the process until the entire list/array is sorted.
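A sketch of the divide, sort, and merge steps:

def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])            # recursively sort each half
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]      # append whatever remains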
6. Heap Sort:
• Build a max heap from the list/array.
• Extract the maximum element (root) from the heap and place
it at the end of the list/array.
• Reduce the heap size and restore the heap property.
• Repeat the extraction and restoration process until the heap is
empty; because each maximum is placed at the end, the list/array
ends up sorted in ascending order.
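An in-place sketch with an explicit sift-down helper:

def heap_sort(items):
    def sift_down(heap_size, root):
        while True:
            largest = root
            left, right = 2 * root + 1, 2 * root + 2
            if left < heap_size and items[left] > items[largest]:
                largest = left
            if right < heap_size and items[right] > items[largest]:
                largest = right
            if largest == root:
                return                        # heap property restored
            items[root], items[largest] = items[largest], items[root]
            root = largest

    n = len(items)
    for i in range(n // 2 - 1, -1, -1):       # build a max heap
        sift_down(n, i)
    for end in range(n - 1, 0, -1):
        items[0], items[end] = items[end], items[0]   # move the maximum to the end
        sift_down(end, 0)                     # restore the heap on the reduced range
    return items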
7. Radix Sort:
• Consider each digit of the elements in the list/array, starting
from the least significant digit to the most significant digit.
• Group the elements based on each digit, placing them in
separate buckets.
• Concatenate the elements from the buckets in the order of
their buckets.
• Repeat the process for each digit until all digits have been
considered.
• The final result will be a sorted list/array.
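A least-significant-digit sketch for non-negative integers:

def radix_sort(items, base=10):
    if not items:
        return items
    max_value = max(items)                    # determines how many digits to process
    exp = 1                                   # place value of the current digit
    while max_value // exp > 0:
        buckets = [[] for _ in range(base)]   # one bucket per possible digit
        for value in items:
            buckets[(value // exp) % base].append(value)   # group by current digit
        items = [value for bucket in buckets for value in bucket]   # stable collect
        exp *= base
    return items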
More:
1. Greedy Algorithm:
• A greedy algorithm makes locally optimal choices at each
step in the hope of reaching an overall optimal solution.
• It selects the best available option at the current stage without
considering the future consequences.
• The greedy approach is easy to implement and can often
provide efficient solutions for certain problems.
• However, it may not always guarantee the globally optimal
solution for all cases.
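As a small illustration, a sketch of the greedy strategy for the fractional
knapsack (one of the variants mentioned below), where choosing items by
profit-to-weight ratio is actually optimal; for the 0/1 variant the same
strategy is only a heuristic:

def greedy_fractional_knapsack(items, capacity):
    # items: list of (profit, weight) pairs; fractions of an item may be taken
    by_ratio = sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True)
    total = 0.0
    for profit, weight in by_ratio:
        if capacity == 0:
            break
        take = min(weight, capacity)          # take as much of the item as fits
        total += profit * (take / weight)
        capacity -= take
    return total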
2. Knapsack Problem:
• The knapsack problem is a combinatorial optimization
problem where given a set of items with values and weights,
the goal is to select items to maximize the total value while
not exceeding a given weight limit (the capacity of a
knapsack).
• The problem can be solved using dynamic programming,
greedy algorithms, or other techniques, depending on the
specific variation of the knapsack problem (0/1 knapsack,
fractional knapsack, etc.).
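A dynamic-programming sketch for the 0/1 variant, using a one-dimensional
table indexed by capacity (a common space-saving form of the usual table):

def knapsack_01(items, capacity):
    # items: list of (profit, weight) pairs; each item is used at most once
    best = [0] * (capacity + 1)               # best[c] = max profit with capacity c
    for profit, weight in items:
        for c in range(capacity, weight - 1, -1):   # go downward so an item is counted once
            best[c] = max(best[c], best[c - weight] + profit)
    return best[capacity]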
3. Dynamic Programming:
• Dynamic programming is a technique used to solve complex
problems by breaking them down into overlapping
subproblems and solving each subproblem only once, storing
the results in a table (memoization).
• It utilizes the principle of optimality, which states that an
optimal solution to a problem contains optimal solutions to its
subproblems.
• Dynamic programming is particularly useful for solving
problems with overlapping subproblems and has applications
in various areas, such as optimization, graph algorithms, and
sequence alignment.
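A tiny illustration of the store-and-reuse idea, using memoized Fibonacci
numbers as the classic overlapping-subproblems example:

from functools import lru_cache

@lru_cache(maxsize=None)                      # table of already-solved subproblems
def fib(n):
    if n < 2:                                 # smallest subproblems
        return n
    return fib(n - 1) + fib(n - 2)            # reuse stored subproblem results

Without the cache the same subproblems are recomputed exponentially many
times; with it, each fib(k) is solved once.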
4. Recursive Procedure:
• A recursive procedure is a programming technique where a
function or method calls itself to solve a problem by reducing
it into smaller, simpler instances of the same problem.
• It involves defining a base case that terminates the recursion,
and a recursive case that breaks down the problem into
smaller subproblems until the base case is reached.
• Recursive procedures are useful for solving problems that can
be naturally divided into subproblems and exhibit self-similar
patterns.
• However, recursive solutions may suffer from performance
issues if not implemented efficiently, as they can lead to
redundant computations.
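Following the base-case/recursive-step pattern for the factorial function
gives, as a sketch:

def factorial(n):
    if n == 0:                                # base case
        return 1
    return n * factorial(n - 1)               # recursive step: a smaller instance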
