Sorting and Searching


Insertion Sort Algorithm

Insertion sort works similarly to the way playing cards are sorted in a
hand. We assume that the first card is already sorted, and then we pick an
unsorted card. If the picked card is greater than the first card, it is
placed to the right; otherwise, it is placed to the left. In the same way,
every unsorted card is taken and put in its correct place.

The same approach is applied in insertion sort. The idea behind insertion
sort is to take one element at a time and insert it into its correct
position within the already-sorted part of the array. Although it is simple
to implement, it is not appropriate for large data sets, as its average-case
and worst-case time complexity is O(n²), where n is the number of items.
This makes insertion sort less efficient than other sorting algorithms such
as heap sort, quick sort, and merge sort.

Insertion sort has several advantages, such as -


o Simple implementation
o Efficient for small data sets
o Adaptive, i.e., it is appropriate for data sets that are already
substantially sorted.

Now, let's see the algorithm of insertion sort.

Algorithm
The simple steps of achieving the insertion sort are listed as follows -

Step 1 - If the element is the first element, assume that it is already
sorted. Return 1.

Step 2 - Pick the next element, and store it separately in a key.

Step 3 - Now, compare the key with all elements in the sorted array.

Step 4 - If the element in the sorted array is smaller than the current
element, then move to the next element. Else, shift greater elements in
the array towards the right.

Step 5 - Insert the value.

Step 6 - Repeat until the array is sorted.
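
To make the steps concrete, here is a minimal Java sketch of insertion
sort; the class and method names are illustrative choices, not a fixed API.

public class InsertionSortExample {
    static void insertionSort(int[] arr) {
        // Element 0 is treated as an already-sorted prefix of length 1 (Step 1).
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];          // Step 2: store the element in a key
            int j = i - 1;
            // Steps 3-4: shift greater elements one position to the right
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;          // Step 5: insert the value
        }
    }

    public static void main(String[] args) {
        int[] data = {12, 11, 13, 5, 6};
        insertionSort(data);
        System.out.println(java.util.Arrays.toString(data)); // [5, 6, 11, 12, 13]
    }
}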


Insertion sort complexity
Now, let's see the time complexity of insertion sort in best case, average
case, and in worst case. We will also see the space complexity of insertion
sort.

1. Time Complexity
Case            Time Complexity
Best Case       O(n)
Average Case    O(n²)
Worst Case      O(n²)
o Best Case Complexity - It occurs when no sorting is required, i.e., the
  array is already sorted. The best-case time complexity of insertion sort
  is O(n).
o Average Case Complexity - It occurs when the array elements are in a
  jumbled order, neither properly ascending nor properly descending. The
  average-case time complexity of insertion sort is O(n²).
o Worst Case Complexity - It occurs when the array elements have to be
  sorted in reverse order, i.e., you have to sort the elements in ascending
  order, but they are arranged in descending order. The worst-case time
  complexity of insertion sort is O(n²).

2. Space Complexity
Space Complexity    O(1)
Stable              YES
o The space complexity of insertion sort is O(1), because only a single
  extra variable is required to hold the key while shifting elements.
Merge Sort Algorithm
Merge sort is similar to the quick sort algorithm in that it uses the
divide and conquer approach to sort the elements. It is one of the most
popular and efficient sorting algorithms. It divides the given list into
two equal halves, calls itself recursively for the two halves, and then
merges the two sorted halves. We have to define a merge() function to
perform the merging.

The sub-lists are divided again and again into halves until the list cannot
be divided further. Then we combine the pairs of one-element lists into
two-element lists, sorting them in the process. The sorted two-element
lists are then merged into four-element lists, and so on, until we get the
fully sorted list.

Now, let's see the algorithm of merge sort.

Algorithm
In the following algorithm, arr is the given array, beg is the starting
element, and end is the last element of the array.

MERGE_SORT(arr, beg, end)

    if beg < end
        set mid = (beg + end)/2
        MERGE_SORT(arr, beg, mid)
        MERGE_SORT(arr, mid + 1, end)
        MERGE(arr, beg, mid, end)
    end of if

END MERGE_SORT
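
Below is a minimal Java sketch of MERGE_SORT and the merge() routine it
relies on; the names and int[]-based interface are illustrative choices.

public class MergeSortExample {
    // Mirrors MERGE_SORT above: sorts arr[beg..end] (inclusive).
    static void mergeSort(int[] arr, int beg, int end) {
        if (beg < end) {
            int mid = (beg + end) / 2;
            mergeSort(arr, beg, mid);
            mergeSort(arr, mid + 1, end);
            merge(arr, beg, mid, end);
        }
    }

    // Merges the two sorted halves arr[beg..mid] and arr[mid+1..end].
    static void merge(int[] arr, int beg, int mid, int end) {
        int[] tmp = new int[end - beg + 1];   // auxiliary array: the O(n) space
        int i = beg, j = mid + 1, k = 0;
        while (i <= mid && j <= end)
            tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
        while (i <= mid) tmp[k++] = arr[i++];
        while (j <= end) tmp[k++] = arr[j++];
        System.arraycopy(tmp, 0, arr, beg, tmp.length);
    }

    public static void main(String[] args) {
        int[] data = {38, 27, 43, 3, 9, 82, 10};
        mergeSort(data, 0, data.length - 1);
        System.out.println(java.util.Arrays.toString(data)); // [3, 9, 10, 27, 38, 43, 82]
    }
}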

Merge sort complexity


Now, let's see the time complexity of merge sort in best case, average
case, and in worst case. We will also see the space complexity of the
merge sort.

1. Time Complexity

Case            Time Complexity
Best Case       O(n log n)
Average Case    O(n log n)
Worst Case      O(n log n)

o Best Case Complexity - It occurs when no sorting is required, i.e., the
  array is already sorted. The best-case time complexity of merge sort
  is O(n log n).
o Average Case Complexity - It occurs when the array elements are in a
  jumbled order, neither properly ascending nor properly descending. The
  average-case time complexity of merge sort is O(n log n).
o Worst Case Complexity - It occurs when the array elements have to be
  sorted in reverse order, i.e., you have to sort the elements in ascending
  order, but they are arranged in descending order. The worst-case time
  complexity of merge sort is O(n log n).

2. Space Complexity

Space Complexity    O(n)

Stable              YES

o The space complexity of merge sort is O(n), because merge sort requires
  an auxiliary array of the same size as the input to merge the sorted
  halves.
Quick Sort Algorithm
Sorting is a way of arranging items in a systematic manner. Quicksort is a
widely used sorting algorithm that makes n log n comparisons on average
when sorting an array of n elements. It is a fast and highly efficient
sorting algorithm that follows the divide and conquer approach. Divide and
conquer is a technique of breaking an algorithm down into subproblems,
solving the subproblems, and combining the results to solve the original
problem.

Divide: First, pick a pivot element. After that, partition or rearrange the
array into two sub-arrays such that each element in the left sub-array is
less than or equal to the pivot element and each element in the right
sub-array is larger than the pivot element.


Conquer: Recursively sort the two sub-arrays with quicksort.

Combine: Combine the already sorted sub-arrays. Because the partitioning is
done in place, no extra work is needed in this step.

Quicksort picks an element as pivot, and then it partitions the given array
around the picked pivot element. In quick sort, a large array is divided into
two arrays in which one holds values that are smaller than the specified
value (Pivot), and another array holds the values that are greater than the
pivot.

After that, the left and right sub-arrays are also partitioned using the
same approach. This continues until each sub-array contains only a single
element.
Choosing the pivot
Picking a good pivot is necessary for a fast implementation of quicksort.
However, it can be difficult to determine a good pivot. Some of the ways of
choosing a pivot are as follows -

o Pivot can be random, i.e., select a random element from the given array
  as the pivot.
o Pivot can be either the rightmost element or the leftmost element of the
  given array.
o Select the median as the pivot element.

Algorithm

QUICKSORT (array A, start, end)
{
    if (start < end)
    {
        p = partition(A, start, end)
        QUICKSORT (A, start, p - 1)
        QUICKSORT (A, p + 1, end)
    }
}

Partition Algorithm:

The partition algorithm rearranges the sub-array in place.

PARTITION (array A, start, end)
{
    pivot ← A[end]
    i ← start - 1
    for j ← start to end - 1 {
        if (A[j] < pivot) {
            i ← i + 1
            swap A[i] with A[j]
        }
    }
    swap A[i + 1] with A[end]
    return i + 1
}
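
The two routines above translate directly into Java. The following is a
minimal sketch using the same Lomuto-style partition with A[end] as the
pivot; the names are illustrative.

public class QuickSortExample {
    // Mirrors QUICKSORT above: sorts a[start..end] (inclusive).
    static void quickSort(int[] a, int start, int end) {
        if (start < end) {
            int p = partition(a, start, end);
            quickSort(a, start, p - 1);
            quickSort(a, p + 1, end);
        }
    }

    // Lomuto partition, as in the PARTITION pseudocode: a[end] is the pivot.
    static int partition(int[] a, int start, int end) {
        int pivot = a[end];
        int i = start - 1;                 // boundary of the "less than pivot" region
        for (int j = start; j < end; j++) {
            if (a[j] < pivot) {
                i++;
                int t = a[i]; a[i] = a[j]; a[j] = t;
            }
        }
        int t = a[i + 1]; a[i + 1] = a[end]; a[end] = t;  // place the pivot
        return i + 1;
    }

    public static void main(String[] args) {
        int[] data = {24, 9, 29, 14, 19, 27};
        quickSort(data, 0, data.length - 1);
        System.out.println(java.util.Arrays.toString(data)); // [9, 14, 19, 24, 27, 29]
    }
}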

Quicksort complexity
Now, let's see the time complexity of quicksort in best case, average case,
and in worst case. We will also see the space complexity of quicksort.

1. Time Complexity

Case            Time Complexity
Best Case       O(n log n)
Average Case    O(n log n)
Worst Case      O(n²)

o Best Case Complexity - In quicksort, the best case occurs when the pivot
  element is the middle element or near the middle element. The best-case
  time complexity of quicksort is O(n log n).
o Average Case Complexity - It occurs when the array elements are in a
  jumbled order, neither properly ascending nor properly descending. The
  average-case time complexity of quicksort is O(n log n).
o Worst Case Complexity - In quicksort, the worst case occurs when the
  pivot element is either the greatest or the smallest element. For
  example, if the pivot element is always the last element of the array,
  the worst case occurs when the given array is already sorted in ascending
  or descending order. The worst-case time complexity of quicksort
  is O(n²).

Though the worst-case complexity of quicksort is higher than that of other
sorting algorithms such as merge sort and heap sort, it is faster in
practice. The worst case rarely occurs, because the choice of pivot can be
varied; choosing a good pivot element largely avoids the worst case.

2. Space Complexity

Space Complexity    O(log n)

Stable              NO

o The space complexity of quicksort is O(log n) on average, for the
  recursion stack; in the worst case the stack can grow to O(n).

Heap Sort Algorithm


What is a heap?
A heap is a complete binary tree. A binary tree is a tree in which each
node can have at most two children. A complete binary tree is a binary tree
in which all levels except possibly the last are completely filled, and all
nodes in the last level are as far left as possible (left-justified).

What is heap sort?


Heapsort is a popular and efficient sorting algorithm. The idea of heap
sort is to remove the elements one by one from the heap part of the list
and insert them into the sorted part of the list.

Heapsort is an in-place sorting algorithm.

Now, let's see the algorithm of heap sort.

Algorithm
HeapSort(arr)
    BuildMaxHeap(arr)
    for i = length(arr) downto 2
        swap arr[1] with arr[i]
        heap_size[arr] = heap_size[arr] - 1
        MaxHeapify(arr, 1)
End

BuildMaxHeap(arr)

BuildMaxHeap(arr)
    heap_size(arr) = length(arr)
    for i = length(arr)/2 downto 1
        MaxHeapify(arr, i)
End

MaxHeapify(arr,i)

MaxHeapify(arr, i)
    L = left(i)
    R = right(i)
    if L ≤ heap_size[arr] and arr[L] > arr[i]
        largest = L
    else
        largest = i
    if R ≤ heap_size[arr] and arr[R] > arr[largest]
        largest = R
    if largest != i
        swap arr[i] with arr[largest]
        MaxHeapify(arr, largest)
End
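
For reference, here is a minimal Java sketch of the three routines above.
It uses 0-based indexing, so left(i) = 2i + 1 and right(i) = 2i + 2,
instead of the 1-based indexing implied by the pseudocode.

public class HeapSortExample {
    static void heapSort(int[] arr) {
        int n = arr.length;
        for (int i = n / 2 - 1; i >= 0; i--)   // BuildMaxHeap
            maxHeapify(arr, n, i);
        for (int i = n - 1; i >= 1; i--) {     // move the current max to the end
            int t = arr[0]; arr[0] = arr[i]; arr[i] = t;
            maxHeapify(arr, i, 0);             // heap size shrinks by one
        }
    }

    static void maxHeapify(int[] arr, int heapSize, int i) {
        int l = 2 * i + 1, r = 2 * i + 2, largest = i;
        if (l < heapSize && arr[l] > arr[largest]) largest = l;
        if (r < heapSize && arr[r] > arr[largest]) largest = r;
        if (largest != i) {
            int t = arr[i]; arr[i] = arr[largest]; arr[largest] = t;
            maxHeapify(arr, heapSize, largest);
        }
    }

    public static void main(String[] args) {
        int[] data = {4, 10, 3, 5, 1};
        heapSort(data);
        System.out.println(java.util.Arrays.toString(data)); // [1, 3, 4, 5, 10]
    }
}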

Heap sort complexity


Now, let's see the time complexity of Heap sort in the best case, average
case, and worst case. We will also see the space complexity of Heapsort.

1. Time Complexity

Case            Time Complexity
Best Case       O(n log n)
Average Case    O(n log n)
Worst Case      O(n log n)

o Best Case Complexity - It occurs when no sorting is required, i.e., the
  array is already sorted. The best-case time complexity of heap sort
  is O(n log n).
o Average Case Complexity - It occurs when the array elements are in a
  jumbled order, neither properly ascending nor properly descending. The
  average-case time complexity of heap sort is O(n log n).
o Worst Case Complexity - It occurs when the array elements have to be
  sorted in reverse order, i.e., you have to sort the elements in ascending
  order, but they are arranged in descending order. The worst-case time
  complexity of heap sort is O(n log n).

The time complexity of heap sort is O(n log n) in all three cases (best,
average, and worst). This is because the height of a complete binary tree
with n elements is log n, and each of the n heapify operations costs at
most the height of the heap.

2. Space Complexity

Space Complexity    O(1)

Stable              NO

o The space complexity of heap sort is O(1), because it sorts the array in
  place using only a constant number of extra variables.

Radix Sort Algorithm


In this article, we will discuss the Radix sort algorithm. Radix sort is a
linear sorting algorithm used for integers. In radix sort, digit-by-digit
sorting is performed, starting from the least significant digit and moving
to the most significant digit.

The process of radix sort is similar to sorting students' names in
alphabetical order. In this case, there are 26 radixes, one for each letter
of the English alphabet. In the first pass, the names are grouped according
to the ascending order of the first letter of each name. In the second
pass, they are grouped according to the ascending order of the second
letter, and the process continues until the list is sorted.

Now, let's see the algorithm of Radix sort.


Algorithm
radixSort(arr)
    max = largest element in the given array
    d = number of digits in the largest element (or, max)
    create 10 buckets, one for each digit 0 - 9
    for i = 1 to d
        sort the array elements using counting sort (or any stable sort)
        according to the digits at the ith place
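
A minimal Java sketch of this procedure follows, using a stable
counting-sort pass per digit. It assumes non-negative integers, and the
names are illustrative.

public class RadixSortExample {
    // LSD radix sort for non-negative integers, following the steps above.
    static void radixSort(int[] arr) {
        int max = 0;
        for (int v : arr) max = Math.max(max, v);   // largest element
        for (int exp = 1; max / exp > 0; exp *= 10) // one pass per digit
            countingSortByDigit(arr, exp);
    }

    // Stable counting sort on the digit selected by exp (1, 10, 100, ...).
    static void countingSortByDigit(int[] arr, int exp) {
        int[] output = new int[arr.length];
        int[] count = new int[10];                  // buckets for digits 0-9
        for (int v : arr) count[(v / exp) % 10]++;
        for (int d = 1; d < 10; d++) count[d] += count[d - 1]; // prefix sums
        for (int i = arr.length - 1; i >= 0; i--) { // backwards keeps it stable
            int d = (arr[i] / exp) % 10;
            output[--count[d]] = arr[i];
        }
        System.arraycopy(output, 0, arr, 0, arr.length);
    }

    public static void main(String[] args) {
        int[] data = {170, 45, 75, 90, 802, 24, 2, 66};
        radixSort(data);
        System.out.println(java.util.Arrays.toString(data));
        // [2, 24, 45, 66, 75, 90, 170, 802]
    }
}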

Radix sort complexity


Now, let's see the time complexity of Radix sort in best case, average
case, and worst case. We will also see the space complexity of Radix sort.

1. Time Complexity

Case            Time Complexity
Best Case       Ω(n+k)
Average Case    Θ(nk)
Worst Case      O(nk)

o Best Case Complexity - The best-case time complexity of radix sort
  is Ω(n+k), the cost of a single stable counting-sort pass.
o Average Case Complexity - The average-case time complexity of radix sort
  is Θ(nk), where k is the number of digits in the largest element.
o Worst Case Complexity - The worst-case time complexity of radix sort
  is O(nk). Note that, unlike comparison sorts, the running time of radix
  sort depends on the number of digits rather than on the initial order of
  the elements.
Radix sort is a non-comparative sorting algorithm, which can make it faster
than comparison-based algorithms: its running time is linear in n (for a
fixed number of digits), whereas comparison-based algorithms are bounded
below by O(n log n).

2. Space Complexity

Space Complexity    O(n + k)

Stable              YES

o The space complexity of radix sort is O(n + k), for the output array and
  the counting buckets.

Linear Search Algorithm


In this article, we will discuss the Linear Search algorithm. Searching is
the process of finding a particular element in a list. If the element is
present in the list, the search is called successful and its location is
returned; otherwise, the search is called unsuccessful.

Two popular search methods are Linear Search and Binary Search. So,
here we will discuss the popular searching technique, i.e., Linear Search
Algorithm.

Linear search is also called the sequential search algorithm. It is the
simplest searching algorithm. In linear search, we simply traverse the list
completely and match each element of the list with the item whose location
is to be found. If a match is found, the location of the item is returned;
otherwise, the algorithm returns NULL (or -1, as in the pseudocode below).

It is widely used to search for an element in an unordered list, i.e., a
list in which the items are not sorted. The worst-case time complexity of
linear search is O(n).

The steps used in the implementation of Linear Search are listed as
follows -

o First, we have to traverse the array elements using a for loop.
o In each iteration of the for loop, compare the search element with the
  current array element, and -
    o If the element matches, then return the index of the corresponding
      array element.
    o If the element does not match, then move to the next element.
o If there is no match or the search element is not present in the given
  array, return -1.

Now, let's see the algorithm of linear search.

Algorithm

Linear_Search(a, n, val)   // 'a' is the given array, 'n' is the size of the given array, 'val' is the value to search
Step 1: set pos = -1
Step 2: set i = 1
Step 3: repeat step 4 while i <= n
Step 4: if a[i] == val
            set pos = i
            print pos
            go to step 6
        [end of if]
        set i = i + 1
        [end of loop]
Step 5: if pos = -1
            print "value is not present in the array"
        [end of if]
Step 6: exit
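
A minimal Java sketch of this procedure follows; it uses 0-based indexing
and returns -1 when the value is absent, rather than printing.

public class LinearSearchExample {
    // Returns the index of val in a, or -1 if it is not present
    // (0-based, unlike the 1-based pseudocode above).
    static int linearSearch(int[] a, int val) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == val) return i;   // match found
        }
        return -1;                       // value not present
    }

    public static void main(String[] args) {
        int[] data = {70, 40, 30, 11, 57};
        System.out.println(linearSearch(data, 30));  // 2
        System.out.println(linearSearch(data, 99));  // -1
    }
}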

Linear Search complexity


Now, let's see the time complexity of linear search in the best case,
average case, and worst case. We will also see the space complexity of
linear search.

1. Time Complexity

Case            Time Complexity
Best Case       O(1)
Average Case    O(n)
Worst Case      O(n)

o Best Case Complexity - In linear search, the best case occurs when the
  element we are searching for is at the first position of the array. The
  best-case time complexity of linear search is O(1).
o Average Case Complexity - The average-case time complexity of linear
  search is O(n).
o Worst Case Complexity - In linear search, the worst case occurs when the
  element we are looking for is at the end of the array, or when the target
  element is not present at all and we have to traverse the entire array.
  The worst-case time complexity of linear search is O(n).

The time complexity of linear search is O(n) because every element in the
array is compared at most once.

2. Space Complexity

Space Complexity O(1)

o The space complexity of linear search is O(1).


Binary Search Algorithm
Binary search is a search technique that works efficiently on sorted lists.
Hence, to search for an element in a list using the binary search
technique, we must first ensure that the list is sorted.

Binary search follows the divide and conquer approach: the list is divided
into two halves, and the item is compared with the middle element of the
list. If a match is found, the location of the middle element is returned.
Otherwise, we search in one of the two halves, depending on the result of
the comparison.

NOTE: Binary search can be applied only to sorted array elements. If the list
elements are not arranged in sorted order, we first have to sort them.
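
For instance, using the JDK's built-in binary search, which likewise
assumes a sorted array:

import java.util.Arrays;

// A quick illustration of the note above: the input must be sorted
// before Arrays.binarySearch gives a meaningful result.
public class SortedPrerequisiteDemo {
    public static void main(String[] args) {
        int[] a = {42, 7, 19, 3, 25};
        Arrays.sort(a);                       // must sort first: [3, 7, 19, 25, 42]
        int idx = Arrays.binarySearch(a, 19); // defined only for sorted input
        System.out.println(idx);              // 2
    }
}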

Now, let's see the algorithm of Binary Search.

Algorithm
Binary_Search(a, lower_bound, upper_bound, val)   // 'a' is the given array, 'lower_bound' is the index of the first array element, 'upper_bound' is the index of the last array element, 'val' is the value to search
Step 1: set beg = lower_bound, end = upper_bound, pos = -1
Step 2: repeat steps 3 and 4 while beg <= end
Step 3: set mid = (beg + end)/2
Step 4: if a[mid] = val
            set pos = mid
            print pos
            go to step 6
        else if a[mid] > val
            set end = mid - 1
        else
            set beg = mid + 1
        [end of if]
        [end of loop]
Step 5: if pos = -1
            print "value is not present in the array"
        [end of if]
Step 6: exit
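
An iterative Java sketch of the algorithm above follows; it uses 0-based
indexing and returns -1 instead of printing.

public class BinarySearchExample {
    // Iterative binary search on a sorted array; returns the index of val
    // or -1 if absent (0-based, unlike the 1-based pseudocode above).
    static int binarySearch(int[] a, int val) {
        int beg = 0, end = a.length - 1;
        while (beg <= end) {
            int mid = beg + (end - beg) / 2;  // avoids overflow of (beg + end)
            if (a[mid] == val) return mid;
            else if (a[mid] > val) end = mid - 1;
            else beg = mid + 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] data = {10, 12, 24, 29, 39, 40, 51, 56, 69};
        System.out.println(binarySearch(data, 39)); // 4
        System.out.println(binarySearch(data, 7));  // -1
    }
}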

Binary Search complexity


Now, let's see the time complexity of Binary search in the best case,
average case, and worst case. We will also see the space complexity of
Binary search.

1. Time Complexity

Case            Time Complexity
Best Case       O(1)
Average Case    O(log n)
Worst Case      O(log n)


o Best Case Complexity - In binary search, the best case occurs when the
  element to search for is found in the first comparison, i.e., when the
  middle element itself is the element being searched for. The best-case
  time complexity of binary search is O(1).
o Average Case Complexity - The average-case time complexity of binary
  search is O(log n).
o Worst Case Complexity - In binary search, the worst case occurs when we
  have to keep reducing the search space until it has only one element. The
  worst-case time complexity of binary search is O(log n).

2. Space Complexity

Space Complexity O(1)

o The space complexity of binary search is O(1).
