What is Sorting? Types of Sorting? Merge Sort?

Archit Mittal
5 min read · Mar 21, 2021

Recently I started to learn Data Structures and Algorithms and got confused by a few topics, like: What is sorting? What are its types? and many more like this. After talking to my friend and clearing my doubts, here is what I understood from my learning….

Sorting a sequence means rearranging the elements of that sequence according to some specific criterion.
For Example, the array arr[] = {5, 4, 2, 1, 3} after sorting in increasing order will be: arr[] = {1, 2, 3, 4, 5}. The same array after sorting in descending order will be: arr[] = {5, 4, 3, 2, 1}.
In-Place Sorting: An in-place sorting algorithm uses only a constant amount of extra space while producing the output (it modifies the given array only). It sorts the list purely by rearranging the list’s elements.

In this tutorial, we’ll see three such in-place sorting algorithms, namely:
1. Insertion Sort
2. Selection Sort
3. Bubble Sort

Insertion Sort
Insertion Sort is an in-place sorting algorithm. It works in a similar way to sorting a deck of playing cards in your hands.
The idea is to iterate from the second element of the array to the last element and, for each element, insert it at its correct position within the already-sorted subarray before it.
For example, the array [4, 3, 2, 10, 12, 1, 5, 6] can be sorted in increasing order step by step following the insertion sort algorithm.
Insertion sort runs in O(n) time in its best case and in O(n²) in its worst and average cases.
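Here is a minimal Java sketch of the idea (the class name InsertionSortDemo and the sample array are just illustrative choices):

public class InsertionSortDemo {

    static void insertionSort(int[] arr) {
        // Everything before index i is already sorted.
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            // Shift the sorted elements that are greater than key one
            // position to the right to make room for key.
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] arr = {4, 3, 2, 10, 12, 1, 5, 6};
        insertionSort(arr);
        System.out.println(java.util.Arrays.toString(arr)); // [1, 2, 3, 4, 5, 6, 10, 12]
    }
}

Because the inner loop uses a strict comparison, equal elements never move past each other, which is what keeps this version stable.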

Analysis of the Best Case Scenario:

It executes two operations: it scans the list, comparing each pair of adjacent elements, and swaps elements that are out of order. Each operation adds to the algorithm’s overall execution time. If the input array is already sorted, insertion sort makes O(n) comparisons and no swaps. As a result, insertion sort takes O(n) time in the best case.

Worst and Average Case Analysis:
The worst case for insertion sort occurs when the input list is in decreasing order. We’ll need n-1 comparisons and n-1 swaps to insert the last element, n-2 comparisons and n-2 swaps to insert the second-to-last element, and so on. As a result, the number of operations required to perform insertion sort is:
(n-1) + (n-2) + … + 2 + 1 = n(n-1)/2, which is O(n²).

For insertion sort, the average case has the same order of growth as the worst case, so on average insertion sort also takes O(n²) time. Insertion sort is a good algorithm to use if the input list is already mostly sorted, since it then runs close to its quick best-case time. For bigger or more unordered lists, an algorithm with a faster worst- and average-case running time, such as merge sort, is a better choice.

Insertion sort is a stable sort with an O(1) space complexity.

Bubble Sort
Bubble Sort is also an in-place sorting algorithm. It is the simplest of these algorithms, and it works on the principle that:
In one pass, if we compare every pair of adjacent elements of the array and swap them whenever the first element is greater than the second, then at the end of the pass the largest element of the unsorted part will have moved to the end of the array.
The bubble sort algorithm simply repeats this pass N-1 times, where N is the array’s size.
Time Complexity: O(N²)
Consider the array arr[] = {5, 1, 4, 2, 8} as an example.
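Walking through the first pass over {5, 1, 4, 2, 8}: 5 and 1 are swapped, then 5 and 4, then 5 and 2, and finally 5 and 8 are left alone, giving {1, 4, 2, 5, 8} with the largest element, 8, already at the end. A minimal Java sketch of bubble sort (class and method names are my own illustrative choices):

public class BubbleSortDemo {

    static void bubbleSort(int[] arr) {
        int n = arr.length;
        // Repeat the pass N-1 times; after the i-th pass the i largest
        // elements are already in their final positions at the end.
        for (int i = 0; i < n - 1; i++) {
            for (int j = 0; j < n - 1 - i; j++) {
                // Swap adjacent elements that are out of order.
                if (arr[j] > arr[j + 1]) {
                    int tmp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] arr = {5, 1, 4, 2, 8};
        bubbleSort(arr);
        System.out.println(java.util.Arrays.toString(arr)); // [1, 2, 4, 5, 8]
    }
}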
Selection Sort
The selection sort algorithm sorts an array by repeatedly finding the smallest element from the unsorted part and placing it at the beginning (for ascending order). For a given array, the algorithm maintains two subarrays:
1. The subarray that is already sorted.
2. The remaining subarray that is unsorted.
In each iteration of selection sort, the minimum element of the unsorted subarray (for ascending order) is picked and moved to the end of the sorted subarray.
The following example illustrates these steps:
arr[] = 64 25 12 22 11.
// Find the minimum element in arr[0…4]
// and place it at beginning
11 25 12 22 64
// Find the minimum element in arr[1…4]
// and place it at beginning of arr[1…4]
11 12 25 22 64
// Find the minimum element in arr[2…4]
// and place it at beginning of arr[2…4]
11 12 22 25 64
// Find the minimum element in arr[3…4]
// and place it at beginning of arr[3…4]
11 12 22 25 64
Time Complexity: O(N²)
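A minimal Java sketch of the same process (again, class and method names are my own illustrative choices), using the array from the trace above:

public class SelectionSortDemo {

    static void selectionSort(int[] arr) {
        int n = arr.length;
        // arr[0..i-1] is the sorted subarray, arr[i..n-1] is still unsorted.
        for (int i = 0; i < n - 1; i++) {
            // Find the index of the minimum element in the unsorted subarray.
            int minIndex = i;
            for (int j = i + 1; j < n; j++) {
                if (arr[j] < arr[minIndex]) {
                    minIndex = j;
                }
            }
            // Move the minimum element to the end of the sorted subarray.
            int tmp = arr[minIndex];
            arr[minIndex] = arr[i];
            arr[i] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] arr = {64, 25, 12, 22, 11};
        selectionSort(arr);
        System.out.println(java.util.Arrays.toString(arr)); // [11, 12, 22, 25, 64]
    }
}

One nice property of selection sort is that it performs at most N-1 swaps in total, even though it still makes O(N²) comparisons.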

Merge Sort
Merge Sort is a Divide and Conquer algorithm. It splits the input array in half, calls itself for each half, and then merges the two sorted halves. The merge() function is used for merging the two halves. merge(arr, l, m, r) is the key step: it assumes that arr[l..m] and arr[m+1..r] are sorted and merges the two sorted subarrays into a single sorted one. See the following implementation for details:
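A minimal Java sketch of this divide-and-conquer scheme (the class name MergeSortDemo and the sample array are illustrative choices):

public class MergeSortDemo {

    // Sorts arr[l..r] by recursively sorting both halves and merging them.
    static void mergeSort(int[] arr, int l, int r) {
        if (l >= r) return;          // a single element is already sorted
        int m = l + (r - l) / 2;
        mergeSort(arr, l, m);
        mergeSort(arr, m + 1, r);
        merge(arr, l, m, r);
    }

    // Merges the sorted subarrays arr[l..m] and arr[m+1..r] into one sorted range.
    static void merge(int[] arr, int l, int m, int r) {
        int[] left = java.util.Arrays.copyOfRange(arr, l, m + 1);
        int[] right = java.util.Arrays.copyOfRange(arr, m + 1, r + 1);
        int i = 0, j = 0, k = l;
        // Repeatedly take the smaller front element of the two halves.
        while (i < left.length && j < right.length) {
            arr[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        // Copy whatever remains in either half.
        while (i < left.length) arr[k++] = left[i++];
        while (j < right.length) arr[k++] = right[j++];
    }

    public static void main(String[] args) {
        int[] arr = {12, 11, 13, 5, 6, 7};
        mergeSort(arr, 0, arr.length - 1);
        System.out.println(java.util.Arrays.toString(arr)); // [5, 6, 7, 11, 12, 13]
    }
}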

Advantages of Merge Sort
· It is very fast for larger lists because, unlike insertion sort and bubble sort, it does not have to pass over the whole list once for every element.
· It has a consistent running time: it takes roughly the same time on an input of a given size regardless of how the elements are initially ordered.
Disadvantages of Merge Sort
· It is slower than simpler sorting algorithms for small inputs.
· The temporary subarrays created while splitting and merging take up additional memory (O(n) extra space).

[Pseudocode for the merge sort algorithm]

Time Complexity: Merge Sort is a recursive algorithm, and its time complexity can be expressed as the following recurrence relation.
T(n) = 2T(n/2) + Θ(n)
The above recurrence can be solved using the Recurrence Tree method or the Master method. It falls under Case II of the Master method, and the solution of the recurrence is Θ(n log n).
The time complexity of Merge Sort is Θ(n log n) in all 3 cases (worst, average and best), as merge sort always divides the array into two halves and takes linear time to merge the two halves.
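As a rough sketch of why the recurrence gives this bound (my own expansion, writing cn for the Θ(n) merge cost):

T(n) = 2T(n/2) + cn
     = 4T(n/4) + 2cn
     = 8T(n/8) + 3cn
     ...
     = 2^k · T(n/2^k) + k·cn

After k = log2(n) levels the subproblems have size 1, so T(n) = n·T(1) + cn·log2(n) = Θ(n log n).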
I guess that was all about sorting. I hope I have understood it well and that this article has helped you in some way.
