Sorting and Searching
The same approach is applied in insertion sort. The idea behind insertion sort is to take one element at a time and insert it into its correct position within the already-sorted part of the array. Although it is simple to implement, it is not appropriate for large data sets, as the time complexity of insertion sort in the average and worst cases is O(n²), where n is the number of items. Insertion sort is therefore less efficient than other sorting algorithms such as heap sort, quick sort, and merge sort.
Algorithm
The simple steps for performing an insertion sort are listed as follows -
Step 1 - If the element is the first element, it is already sorted.
Step 2 - Pick the next element and store it separately as the key.
Step 3 - Now, compare the key with the elements in the sorted part of the array.
Step 4 - If the element in the sorted part is smaller than the key, then move to the next element. Else, shift the greater elements in the array towards the right.
Step 5 - Insert the key at the vacated position.
Step 6 - Repeat until the whole array is sorted.
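The steps above can be illustrated with a short, self-contained Java sketch of insertion sort. The class and method names (InsertionSortDemo, insertionSort) and the sample data are illustrative assumptions, not part of the original listing.

import java.util.Arrays;

public class InsertionSortDemo {

    // Sorts the array in ascending order using insertion sort.
    static void insertionSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];          // pick the next element (Step 2)
            int j = i - 1;
            // Shift elements of the sorted part that are greater than the key
            // one position to the right (Steps 3 and 4).
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;          // insert the key (Step 5)
        }
    }

    public static void main(String[] args) {
        int[] data = {12, 31, 25, 8, 32, 17};
        insertionSort(data);
        System.out.println(Arrays.toString(data));  // [8, 12, 17, 25, 31, 32]
    }
}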
1. Time Complexity
Case             Time Complexity
Best Case        O(n)
Average Case     O(n²)
Worst Case       O(n²)
o Best Case Complexity - It occurs when no sorting is required, i.e. the array is already sorted. The best-case time complexity of insertion sort is O(n).
o Average Case Complexity - It occurs when the array elements are in a jumbled order, i.e. neither properly ascending nor properly descending. The average-case time complexity of insertion sort is O(n²).
o Worst Case Complexity - It occurs when the array elements have to be sorted in reverse order, i.e. you have to sort the elements in ascending order, but they are given in descending order. The worst-case time complexity of insertion sort is O(n²).
2. Space Complexity
Space Complexity   O(1)
Stable             YES
o The space complexity of insertion sort is O(1), because insertion sort needs only a single extra variable (the key) for shifting and swapping elements.
Merge Sort Algorithm
Merge sort is similar to the quick sort algorithm in that it uses the divide and conquer approach to sort the elements. It is one of the most popular and efficient sorting algorithms. It divides the given list into two equal halves, calls itself for the two halves, and then merges the two sorted halves. We have to define the merge() function to perform the merging.
The sub-lists are divided again and again into halves until each list can no longer be divided. Then the pairs of one-element lists are combined into two-element lists, sorting them in the process. The sorted two-element lists are merged into four-element lists, and so on, until we get the fully sorted list.
Algorithm
In the following code, arr is the given array, beg is the index of the first element, and end is the index of the last element of the array (or sub-array).
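As a concrete illustration, here is a minimal Java sketch of mergeSort(arr, beg, end) together with the merge() helper mentioned above. The parameter names follow the description (arr, beg, end); everything else (class name, sample data) is an assumption for illustration only, not a definitive implementation.

import java.util.Arrays;

public class MergeSortDemo {

    // Recursively splits arr[beg..end] into halves and merges the sorted halves.
    static void mergeSort(int[] arr, int beg, int end) {
        if (beg < end) {
            int mid = (beg + end) / 2;
            mergeSort(arr, beg, mid);
            mergeSort(arr, mid + 1, end);
            merge(arr, beg, mid, end);
        }
    }

    // Merges the two sorted sub-arrays arr[beg..mid] and arr[mid+1..end].
    static void merge(int[] arr, int beg, int mid, int end) {
        int[] temp = new int[end - beg + 1];
        int i = beg, j = mid + 1, k = 0;
        while (i <= mid && j <= end) {
            temp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
        }
        while (i <= mid) temp[k++] = arr[i++];
        while (j <= end) temp[k++] = arr[j++];
        System.arraycopy(temp, 0, arr, beg, temp.length);
    }

    public static void main(String[] args) {
        int[] data = {38, 27, 43, 3, 9, 82, 10};
        mergeSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data));  // [3, 9, 10, 27, 38, 43, 82]
    }
}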
1. Time Complexity
Case             Time Complexity
Best Case        O(n log n)
Average Case     O(n log n)
Worst Case       O(n log n)
2. Space Complexity
Space Complexity   O(n)
Stable             YES
Quick Sort Algorithm
Quicksort picks an element as the pivot and then partitions the given array around the picked pivot element. In quick sort, a large array is divided into two arrays: one holds values that are smaller than the specified value (the pivot), and the other holds values that are greater than the pivot.
After that, the left and right sub-arrays are also partitioned using the same approach. This continues until only a single element remains in each sub-array.
Choosing the pivot
Picking a good pivot is necessary for a fast implementation of quicksort. However, it is difficult to determine a good pivot in advance. Some of the ways of choosing a pivot are as follows -
o The pivot can be random, i.e. select a random pivot from the given array.
o The pivot can be either the rightmost element or the leftmost element of the given array.
o Select the median as the pivot element.
Algorithm
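As a concrete illustration, here is a minimal Java sketch of quicksort together with a Lomuto-style partition that uses the rightmost element as the pivot (one of the pivot choices listed above). The class and method names (QuickSortDemo, quickSort, partition) and the sample data are assumptions for illustration only.

import java.util.Arrays;

public class QuickSortDemo {

    // Sorts arr[low..high] in place; the rightmost element is used as the pivot.
    static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            int p = partition(arr, low, high);   // pivot ends up at index p
            quickSort(arr, low, p - 1);          // sort the smaller-than-pivot part
            quickSort(arr, p + 1, high);         // sort the greater-than-pivot part
        }
    }

    // Lomuto partition: moves elements <= pivot to the left, > pivot to the right,
    // and returns the pivot's final index.
    static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] <= pivot) {
                i++;
                int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp;
            }
        }
        int tmp = arr[i + 1]; arr[i + 1] = arr[high]; arr[high] = tmp;
        return i + 1;
    }

    public static void main(String[] args) {
        int[] data = {24, 9, 29, 14, 19, 27};
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data));  // [9, 14, 19, 24, 27, 29]
    }
}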
Quicksort complexity
Now, let's see the time complexity of quicksort in the best, average, and worst cases. We will also see the space complexity of quicksort.
1. Time Complexity
Case             Time Complexity
Best Case        O(n log n)
Average Case     O(n log n)
Worst Case       O(n²)
2. Space Complexity
Space Complexity   O(log n) on average (recursion stack), O(n) in the worst case
Stable             NO
Heap Sort Algorithm
Algorithm
1. HeapSort(arr)
2. BuildMaxHeap(arr)
3. for i = length(arr) downto 2
4. swap arr[1] with arr[i]
5. heap_size[arr] = heap_size[arr] - 1
6. MaxHeapify(arr,1)
7. End
BuildMaxHeap(arr)
1. BuildMaxHeap(arr)
2. heap_size(arr) = length(arr)
3. for i = length(arr)/2 downto 1
4. MaxHeapify(arr,i)
5. End
MaxHeapify(arr,i)
1. MaxHeapify(arr,i)
2. L = left(i)
3. R = right(i)
4. if L <= heap_size[arr] and arr[L] > arr[i]
5. largest = L
6. else
7. largest = i
8. if R <= heap_size[arr] and arr[R] > arr[largest]
9. largest = R
10. if largest != i
11. swap arr[i] with arr[largest]
12. MaxHeapify(arr,largest)
13. End
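For reference, here is a minimal Java sketch corresponding to the pseudocode above. Note that the pseudocode uses 1-based indexing while the Java version uses 0-based array indexing, so left(i) and right(i) become 2*i + 1 and 2*i + 2; the class and helper names are illustrative assumptions.

import java.util.Arrays;

public class HeapSortDemo {

    // HeapSort(arr): build a max-heap, then repeatedly swap the maximum (root)
    // with the last element of the heap and shrink the heap by one.
    static void heapSort(int[] arr) {
        int heapSize = arr.length;
        buildMaxHeap(arr, heapSize);
        for (int i = arr.length - 1; i >= 1; i--) {
            swap(arr, 0, i);          // move the current maximum to its final position
            heapSize--;               // heap_size = heap_size - 1
            maxHeapify(arr, 0, heapSize);
        }
    }

    // BuildMaxHeap(arr): heapify every non-leaf node, bottom-up.
    static void buildMaxHeap(int[] arr, int heapSize) {
        for (int i = heapSize / 2 - 1; i >= 0; i--) {
            maxHeapify(arr, i, heapSize);
        }
    }

    // MaxHeapify(arr, i): sift arr[i] down until the max-heap property holds.
    static void maxHeapify(int[] arr, int i, int heapSize) {
        int l = 2 * i + 1;            // left child
        int r = 2 * i + 2;            // right child
        int largest = i;
        if (l < heapSize && arr[l] > arr[largest]) largest = l;
        if (r < heapSize && arr[r] > arr[largest]) largest = r;
        if (largest != i) {
            swap(arr, i, largest);
            maxHeapify(arr, largest, heapSize);
        }
    }

    static void swap(int[] arr, int i, int j) {
        int tmp = arr[i];
        arr[i] = arr[j];
        arr[j] = tmp;
    }

    public static void main(String[] args) {
        int[] data = {4, 10, 3, 5, 1};
        heapSort(data);
        System.out.println(Arrays.toString(data));  // [1, 3, 4, 5, 10]
    }
}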
1. Time Complexity
The time complexity of heap sort is O(n log n) in all three cases (best case, average case, and worst case), because the height of a complete binary tree with n elements is log n.
2. Space Complexity
Space Complexity   O(1)
Stable             NO
Radix Sort Algorithm
The process of radix sort is similar to sorting students' names in alphabetical order. In that case, there are 26 buckets (radix groups), one for each letter of the English alphabet. In the first pass, the names are grouped according to the ascending order of the first letter of each name. Then, in the second pass, the names are grouped according to the ascending order of the second letter, and the process continues until the list is sorted.
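As a concrete illustration, here is a minimal Java sketch of the least-significant-digit variant of radix sort for non-negative integers, using a stable counting sort for each decimal digit (10 buckets instead of the 26 letter buckets in the name-sorting analogy). The class and method names and the sample data are assumptions for illustration only.

import java.util.Arrays;

public class RadixSortDemo {

    // LSD radix sort: one stable counting-sort pass per decimal digit,
    // starting with the least significant digit.
    static void radixSort(int[] arr) {
        if (arr.length == 0) return;
        int max = Arrays.stream(arr).max().getAsInt();
        for (int exp = 1; max / exp > 0; exp *= 10) {
            countingSortByDigit(arr, exp);
        }
    }

    // Stable counting sort on the digit selected by exp (1, 10, 100, ...).
    static void countingSortByDigit(int[] arr, int exp) {
        int[] output = new int[arr.length];
        int[] count = new int[10];                 // 10 buckets for digits 0-9
        for (int v : arr) count[(v / exp) % 10]++;
        for (int d = 1; d < 10; d++) count[d] += count[d - 1];
        // Traverse from the right so equal digits keep their order (stability).
        for (int i = arr.length - 1; i >= 0; i--) {
            int d = (arr[i] / exp) % 10;
            output[--count[d]] = arr[i];
        }
        System.arraycopy(output, 0, arr, 0, arr.length);
    }

    public static void main(String[] args) {
        int[] data = {170, 45, 75, 90, 802, 24, 2, 66};
        radixSort(data);
        System.out.println(Arrays.toString(data));  // [2, 24, 45, 66, 75, 90, 170, 802]
    }
}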
1. Time Complexity
The time complexity of radix sort is O(d(n + k)) in all cases, where d is the number of digits (passes), n is the number of elements, and k is the range of a single digit (for decimal digits, k = 10).
2. Space Complexity
Space Complexity   O(n + k)
Stable             YES
Two popular search methods are Linear Search and Binary Search. Here, we will first discuss the Linear Search Algorithm.
It is widely used to search for an element in an unordered list, i.e., a list in which the items are not sorted. The worst-case time complexity of linear search is O(n).
Algorithm
1. Linear_Search(a, n, val) // 'a' is the given array, 'n' is the size of the given array, 'val' is the value to search for
2. Step 1: set pos = -1
3. Step 2: set i = 1
4. Step 3: repeat step 4 while i <= n
5. Step 4:     if a[i] == val
6.                 set pos = i
7.                 print pos
8.                 go to step 6
9.             [end of if]
10.            set i = i + 1
11.        [end of loop]
12. Step 5: if pos = -1
13.             print "value is not present in the array"
14.         [end of if]
15. Step 6: exit
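The pseudocode above uses 1-based indexing; a minimal Java sketch using 0-based array indexing could look as follows. The class and method names and the sample data are illustrative assumptions.

public class LinearSearchDemo {

    // Returns the index of val in a, or -1 if it is not present,
    // by comparing the elements one by one.
    static int linearSearch(int[] a, int val) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == val) {
                return i;            // pos = i
            }
        }
        return -1;                   // value is not present in the array
    }

    public static void main(String[] args) {
        int[] a = {70, 40, 30, 11, 57, 41, 25, 14, 52};
        System.out.println(linearSearch(a, 41));  // 5
        System.out.println(linearSearch(a, 98));  // -1
    }
}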
1. Time Complexity
The time complexity of linear search is O(n) because, in the worst case, every element in the array is compared exactly once.
2. Space Complexity
The space complexity of linear search is O(1), since only a constant number of extra variables (pos and i) are used.
Binary Search Algorithm
Binary search follows the divide and conquer approach, in which the list is divided into two halves and the item is compared with the middle element of the list. If a match is found, the location of the middle element is returned. Otherwise, we search in one of the two halves, depending on the result of the comparison.
NOTE: Binary search can be applied only to sorted array elements. If the list elements are not arranged in sorted order, we first have to sort them.
Algorithm
1. Binary_Search(a, lower_bound, upper_bound, val) // 'a' is the given array, 'lower_bound' is the index of the first array element, 'upper_bound' is the index of the last array element, 'val' is the value to search for
2. Step 1: set beg = lower_bound, end = upper_bound, pos = -1
3. Step 2: repeat steps 3 and 4 while beg <= end
4. Step 3: set mid = (beg + end)/2
5. Step 4: if a[mid] = val
6.             set pos = mid
7.             print pos
8.             go to step 6
9.         else if a[mid] > val
10.            set end = mid - 1
11.        else
12.            set beg = mid + 1
13.        [end of if]
14.    [end of loop]
15. Step 5: if pos = -1
16.            print "value is not present in the array"
17.        [end of if]
18. Step 6: exit
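The pseudocode above can be translated into a short iterative Java sketch (0-based indexing). The class and method names and the sample data are illustrative assumptions; java.util.Arrays.sort is used only to satisfy the sorted-input requirement noted earlier.

import java.util.Arrays;

public class BinarySearchDemo {

    // Iterative binary search on a sorted array; returns the index of val,
    // or -1 if it is not present.
    static int binarySearch(int[] a, int val) {
        int beg = 0, end = a.length - 1;
        while (beg <= end) {
            int mid = beg + (end - beg) / 2;   // same as (beg + end)/2, but avoids int overflow
            if (a[mid] == val) {
                return mid;                    // pos = mid
            } else if (a[mid] > val) {
                end = mid - 1;                 // search the left half
            } else {
                beg = mid + 1;                 // search the right half
            }
        }
        return -1;                             // value is not present in the array
    }

    public static void main(String[] args) {
        int[] a = {56, 14, 88, 23, 70, 5};
        Arrays.sort(a);                        // binary search requires sorted input
        System.out.println(Arrays.toString(a));     // [5, 14, 23, 56, 70, 88]
        System.out.println(binarySearch(a, 70));    // 4
        System.out.println(binarySearch(a, 99));    // -1
    }
}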
1. Time Complexity
Case             Time Complexity
Best Case        O(1)
Average Case     O(log n)
Worst Case       O(log n)
2. Space Complexity
The space complexity of the iterative binary search is O(1), since only a few index variables (beg, end, mid, and pos) are required.