Answer Format (CA2-DAA)
Brute force:
Idea: Exhaustively try all possible solutions and select the one with the best outcome.
Example: Simple searching algorithms like linear search and brute-force string matching.
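A minimal sketch of the linear-search example, assuming an unsorted Python list; the function name and signature are illustrative.

def linear_search(arr, target):
    # Brute force: examine every element in turn until the target is found.
    for i, value in enumerate(arr):
        if value == target:
            return i          # index of the first match
    return -1                 # target is not present

# Usage: linear_search([4, 2, 7, 1], 7) returns 2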
Dynamic Programming:
Idea: Break the problem into overlapping subproblems and solve each subproblem only
once, storing the solutions in a table to avoid redundant computations.
Example: The classic example is the computation of Fibonacci numbers using dynamic
programming.
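A minimal sketch of the Fibonacci example, assuming a top-down (memoized) formulation; the dictionary memo plays the role of the table that stores each subproblem's solution.

def fib(n, memo=None):
    # Top-down dynamic programming: each fib(k) is computed once and cached.
    if memo is None:
        memo = {}
    if n <= 1:
        return n
    if n not in memo:
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]

# Usage: fib(10) returns 55 in O(n) time instead of the exponential time
# taken by the naive recursion.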
Greedy Algorithms:
Idea: The greedy method is used to solve optimization problems. An optimization
problem is one in which we are given a set of input values and an objective that must be
either maximized or minimized, possibly subject to constraints or conditions.
A greedy algorithm always makes the choice (the greedy criterion) that looks best at the
moment, in order to optimize the given objective.
A greedy algorithm does not always guarantee the optimal solution, but it generally
produces a solution that is close in value to the optimal.
Example: Dijkstra's algorithm for finding the shortest path in a graph and Huffman
coding for data compression.
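Dijkstra's algorithm and Huffman coding are too long to reproduce here; as a shorter substitute that still shows the greedy criterion, the sketch below solves the classic activity-selection problem by always taking the compatible activity that finishes earliest. The function and input format are illustrative.

def select_activities(activities):
    # Greedy criterion: among the remaining activities, always take the one
    # that finishes earliest and is compatible with those already chosen.
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# Usage: select_activities([(1, 4), (3, 5), (0, 6), (5, 7)]) returns [(1, 4), (5, 7)]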
Backtracking:
Idea: Systematically explore all possible solutions by trying out different
choices and backtracking when a solution is not viable.
Examples: N-Queens problem and Sudoku solving.
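A minimal backtracking sketch for the N-Queens example: queens are placed one row at a time, every column is tried, and the algorithm backtracks as soon as a placement conflicts with an earlier queen. The representation (one column index per row) is an assumption made for brevity.

def solve_n_queens(n, cols=None):
    # cols[i] holds the column of the queen in row i; the list grows row by row.
    if cols is None:
        cols = []
    row = len(cols)
    if row == n:
        return list(cols)                  # all queens placed: a solution
    for col in range(n):
        # The new queen must not share a column or a diagonal with any earlier queen.
        if all(col != c and abs(col - c) != row - r for r, c in enumerate(cols)):
            cols.append(col)               # choose
            solution = solve_n_queens(n, cols)
            if solution:
                return solution
            cols.pop()                     # backtrack: undo the choice
    return None                            # no viable placement in this branch

# Usage: solve_n_queens(4) returns a valid placement such as [1, 3, 0, 2]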
Randomized Algorithms:
Idea: Introduce randomness into the algorithm to achieve a probabilistic guarantee on the
solution's correctness and efficiency.
Examples: Randomized Quicksort and algorithms for approximating solutions to NP-
hard problems.
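A minimal sketch of randomized Quicksort, written out-of-place for brevity; the only randomized ingredient is the uniformly random choice of pivot, which makes the expected running time O(n log n) for every input order.

import random

def randomized_quicksort(arr):
    # Choosing the pivot uniformly at random avoids any fixed worst-case input.
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

# Usage: randomized_quicksort([9, 3, 7, 1]) returns [1, 3, 7, 9]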
Parallel Algorithms:
Idea: Design algorithms to be executed concurrently on multiple processors or threads to
improve efficiency.
Example: Parallel matrix multiplication and parallel sorting algorithms.
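A minimal sketch of the idea using Python's standard concurrent.futures module: the input is split into chunks, the partial sums are computed by separate worker processes, and the results are combined. The chunking scheme and worker count are assumptions made for illustration.

from concurrent.futures import ProcessPoolExecutor

def parallel_sum(data, workers=4):
    # Split the data into roughly equal chunks, sum each chunk in its own
    # process, then combine the partial results.
    chunk = (len(data) + workers - 1) // workers
    parts = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum, parts))

if __name__ == "__main__":
    print(parallel_sum(list(range(1_000_000))))   # prints 499999500000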
Approximation Algorithms:
Idea: Provide fast solutions that may not be optimal but are close to the optimal solution.
Examples: The Traveling Salesman Problem has approximation algorithms that find
solutions close to the optimal tour.
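TSP approximation algorithms are too involved to show here; as a compact substitute, the sketch below is the standard 2-approximation for the Vertex Cover problem: repeatedly pick an uncovered edge and add both of its endpoints, which yields a cover at most twice the size of an optimal one.

def approx_vertex_cover(edges):
    # Take both endpoints of every edge that is not yet covered; the result
    # is guaranteed to be at most twice the size of an optimal vertex cover.
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

# Usage: approx_vertex_cover([(1, 2), (2, 3), (3, 4)]) returns {1, 2, 3, 4},
# while an optimal cover such as {2, 3} has size 2.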
Reduction:
Idea: Transform one problem into another problem for which we already have
an algorithm.
Examples: NP-hardness results are established through reductions from other known
hard problems, and in practice many problems are solved by reducing them to problems
with known efficient algorithms.
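NP-hardness reductions are proofs rather than programs, so as a small positive illustration of the same idea, the sketch below reduces the element-uniqueness problem to sorting, for which efficient algorithms already exist: after sorting, any duplicates must sit next to each other.

def has_duplicates(values):
    # Reduction: transform element uniqueness into a sorting problem and
    # reuse an existing sorting algorithm.
    ordered = sorted(values)
    return any(a == b for a, b in zip(ordered, ordered[1:]))

# Usage: has_duplicates([3, 1, 4, 1]) returns True; has_duplicates([3, 1, 4]) returns False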
Amortized Analysis:
Idea: Analyze the average performance of a sequence of operations, even if a single
operation may be expensive.
Examples: Dynamic arrays use amortized analysis to show that a sequence of n append
operations takes O(n) time.
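A minimal sketch of the dynamic-array example, assuming the usual capacity-doubling policy; the copies counter makes the amortized argument visible: a sequence of n appends triggers only O(n) element copies in total, i.e. O(1) amortized per append.

class DynamicArray:
    def __init__(self):
        self.capacity = 1
        self.items = [None] * self.capacity
        self.size = 0
        self.copies = 0                    # total elements moved by resizes

    def append(self, value):
        if self.size == self.capacity:     # full: grow by doubling
            self.capacity *= 2
            new_items = [None] * self.capacity
            for i in range(self.size):
                new_items[i] = self.items[i]
                self.copies += 1
            self.items = new_items
        self.items[self.size] = value
        self.size += 1

# Usage: after 1024 appends, copies is 1023, which is below 2 * 1024.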
Problem Statement: Write an algorithm for quick sort. Explain with an example and
show the analysis of the algorithm.
Theory:
Quicksort is a widely used sorting algorithm that makes O(n log n) comparisons in the
average case when sorting an array of n elements. It is a fast and highly efficient sorting algorithm.
This algorithm follows the divide and conquer approach. Divide and conquer is a
technique of breaking down the algorithms into subproblems, then solving the
subproblems, and combining the results back together to solve the original problem.
Divide: First pick a pivot element. Then partition (rearrange) the array into two
sub-arrays such that each element in the left sub-array is less than or equal to the
pivot element and each element in the right sub-array is larger than the pivot element.
Conquer: Recursively sort the two sub-arrays with the same procedure.
Combine: No extra work is needed, because once the sub-arrays are sorted in place the
whole array is sorted. A sketch of the algorithm is given below.
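A sketch of the algorithm in Python, assuming the last element is taken as the pivot (Lomuto partition); variable names are illustrative.

def partition(arr, low, high):
    # Move every element <= pivot to the left of the pivot's final position.
    pivot = arr[high]                      # last element chosen as the pivot
    i = low - 1
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1                           # final index of the pivot

def quicksort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)      # divide around the pivot
        quicksort(arr, low, p - 1)         # conquer the left sub-array
        quicksort(arr, p + 1, high)        # conquer the right sub-array
    return arr

# Example: quicksort([10, 80, 30, 90, 40]) first partitions around 40, giving
# [10, 30, 40, 90, 80] with the pivot in its final place, then sorts each side
# to obtain [10, 30, 40, 80, 90].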
Best Case:
The best case occurs when the pivot always splits the array into two halves of (nearly)
equal size, i.e. when the pivot is the median. The recurrence is then
T(N) = 2 * T(N / 2) + c * N, for some constant c.
Expanding the recurrence k times gives T(N) = 2^k * T(N / 2^k) + k * c * N. The
expansion stops when 2^k = N, i.e. k = log2 N, so
T(N) = N * T(1) + c * N * log2 N. Therefore, the best-case time complexity is
O(N * log N). The expansion is sketched below.
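The expansion behind this result, written as a short LaTeX sketch (c denotes the constant from the recurrence):

\begin{align*}
T(N) &= 2\,T(N/2) + cN \\
     &= 4\,T(N/4) + 2cN \\
     &= \dots \\
     &= 2^{k}\,T(N/2^{k}) + k\,cN && \text{after } k \text{ levels} \\
     &= N\,T(1) + cN\log_2 N && \text{when } 2^{k} = N,\ \text{i.e. } k = \log_2 N
\end{align*}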
Worst Case:
The worst case occurs when the partition is as unbalanced as possible: one part contains
N - 1 elements and the other contains none, and the same happens at every level (for
example, when the array is already sorted and the last element is chosen as the pivot).
The recurrence is then
T(N) = T(N - 1) + c * N.
Replacing N by N - 1, N - 2, ... and substituting back, the terms add up to
T(N) = T(1) + c * (2 + 3 + ... + N) = O(N^2).
Therefore, the worst-case time complexity is O(N^2). The expansion is sketched below.
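The corresponding expansion, again as a short LaTeX sketch with the same constant c:

\begin{align*}
T(N) &= T(N-1) + cN \\
     &= T(N-2) + c(N-1) + cN \\
     &= \dots \\
     &= T(1) + c\,(2 + 3 + \dots + N) \\
     &= T(1) + c\left(\frac{N(N+1)}{2} - 1\right) = O(N^{2})
\end{align*}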
Conclusion: Using Quick Sort, the numbers in the list are sorted successfully.