Article Series: Sorting Algorithms

Part 1: Introduction
Part 2: Sorting in Java
Part 3: Insertion Sort
Part 4: Selection Sort
Part 5: Bubble Sort
Part 6: Quicksort
Part 7: Merge Sort
Part 8: Heapsort
Part 9: Counting Sort
Part 10: Radix Sort

This article is part of the series "Sorting Algorithms: Ultimate Guide". You can find the source code for the entire article series in my GitHub repository.

Contents

1. Example: Sorting Playing Cards
2. Insertion Sort Algorithm
   2.1 Step 1
   2.2 Step 2
   2.3 Step 3
   2.4 Step 4
   2.5 Step 5
3. Insertion Sort Java Source Code
4. Insertion Sort Time Complexity
   4.1 Average Time Complexity
   4.2 Worst-Case Time Complexity
   4.3 Best-Case Time Complexity
   4.4 Insertion Sort With Binary Search?
   4.5 Insertion Sort With a Linked List?
   4.6 Runtime of the Java Insertion Sort Example
5. Other Characteristics of Insertion Sort
6. Insertion Sort vs. Selection Sort
7. Summary

Example: Sorting Playing Cards

Let us start with a playing card example. Imagine being handed one card at a time. You take the first card in your hand. Then you place the second card to its left or right. The third card is placed to the left, in between, or to the right, depending on its value. All subsequent cards are placed in their correct position in the same way.

Have you ever sorted cards this way? If so, you have intuitively used "Insertion Sort".

Insertion Sort Algorithm

Let's move from the card example to the computer algorithm. Assume we have an array with the elements [6, 2, 4, 9, 3, 7] that should be sorted with Insertion Sort in ascending order.

Step 1

First, we divide the array into a left, sorted part and a right, unsorted part. The sorted part already contains the first element at the beginning, because an array with a single element can always be considered sorted. Then we look at the first element of the unsorted area and check where in the sorted area it needs to be inserted by comparing it with its left neighbor.
In the example, the 2 is smaller than the 6, so it belongs to its left. To make room, we move the 6 one position to the right and then place the 2 on the empty field. Then we move the border between the sorted and unsorted areas one step to the right:

Step 2

We look again at the first element of the unsorted area, the 4. It is smaller than the 6, but not smaller than the 2, and therefore belongs between the 2 and the 6. So we move the 6 one position to the right again and place the 4 on the vacant field:

Step 3

The next element to be sorted is the 9, which is larger than its left neighbor, the 6, and thus larger than all elements in the sorted area. It is therefore already in the correct position, so we do not need to shift any element in this step:

Step 4

The next element is the 3, which is smaller than the 9, the 6, and the 4, but greater than the 2. So we move the 9, 6, and 4 one position to the right and then put the 3 where the 4 was before:

Step 5

That leaves the 7. It is smaller than the 9 but larger than the 6, so we move the 9 one field to the right and place the 7 on the vacant position:

The array is now completely sorted.

The following Java source code shows how easy it is to implement Insertion Sort. The outer loop iterates over the elements to be sorted, starting with the second element, since the first element is already sorted. The loop variable i therefore always points to the first element of the right, unsorted part. The inner while loop combines the search for the insertion position with the shifting of the elements.

The code shown differs from the code in the GitHub repository in two ways: First, the InsertionSort class in the repository implements the SortAlgorithm interface so that it is easily interchangeable within my test framework. Second, it allows the specification of a start and end index, so that sub-arrays can also be sorted.
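That sub-array variant might look like the following sketch. This is my own reconstruction for illustration, not the exact repository code (which additionally implements the SortAlgorithm interface):

```java
// Sketch of an Insertion Sort variant that sorts only the sub-array
// elements[fromIndex] .. elements[toIndex - 1]; elements outside this
// range are left untouched.
public class InsertionSortSubArray {

  public static void sort(int[] elements, int fromIndex, int toIndex) {
    for (int i = fromIndex + 1; i < toIndex; i++) {
      int elementToSort = elements[i];

      // Shift larger elements right, but never past fromIndex
      int j = i;
      while (j > fromIndex && elementToSort < elements[j - 1]) {
        elements[j] = elements[j - 1];
        j--;
      }
      elements[j] = elementToSort;
    }
  }
}
```

For example, calling sort(elements, 1, 5) on {9, 5, 3, 4, 1, 0} sorts only the four middle elements and leaves the 9 and the 0 in place.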
This will later allow us to optimize Quicksort by sorting sub-arrays that are smaller than a certain size with Insertion Sort instead of dividing them further.

Insertion Sort Time Complexity

We denote by n the number of elements to be sorted; in the example above, n = 6.

The two nested loops are an indication that we are dealing with quadratic effort, that is, a time complexity of O(n²)*. This is the case if both the outer and the inner loop count up to a value that increases linearly with the number of elements. For the outer loop, this is obvious: it counts up to n. And the inner loop? We'll analyze that in the next three sections.

* In this article, I explain the terms "time complexity" and "Big O notation" using examples and diagrams.

Average Time Complexity

Let's look again at the example from above, where we sorted the array [6, 2, 4, 9, 3, 7].

In the first step of the example, we defined the first element as already sorted; in the source code, it is simply skipped.

In the second step, we shifted one element. If the element to be sorted had already been in the right place, we would not have had to shift anything. On average, that makes 0.5 shift operations in the second step.

In the third step, we also shifted one element. But it could also have been zero or two shifts. On average, it is one shift in this step.

In step four, we did not need to shift any elements. However, it could have been necessary to shift one, two, or three elements; the average here is 1.5.

In step five, we have on average two shift operations, and in step six, 2.5.

So in total, we have on average 0.5 + 1 + 1.5 + 2 + 2.5 = 7.5 shift operations.
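This average can also be checked empirically. The following sketch (my own addition, not code from the repository) runs Insertion Sort on all 720 permutations of six distinct values and averages the counted shift operations:

```java
// Counts the shift operations of Insertion Sort over all 6! = 720
// permutations of six distinct elements and averages them.
public class AverageShiftCount {

  static long totalShifts = 0;
  static long permutations = 0;

  // Runs Insertion Sort on a copy of the array and counts the shifts
  static void countShifts(int[] elements) {
    int[] a = elements.clone();
    for (int i = 1; i < a.length; i++) {
      int elementToSort = a[i];
      int j = i;
      while (j > 0 && elementToSort < a[j - 1]) {
        a[j] = a[j - 1];   // one shift operation
        j--;
        totalShifts++;
      }
      a[j] = elementToSort;
    }
    permutations++;
  }

  // Heap's algorithm: visits every permutation of the array exactly once
  static void permute(int[] a, int k) {
    if (k == 1) {
      countShifts(a);
      return;
    }
    for (int i = 0; i < k - 1; i++) {
      permute(a, k - 1);
      int swap = (k % 2 == 0) ? i : 0;
      int tmp = a[swap];
      a[swap] = a[k - 1];
      a[k - 1] = tmp;
    }
    permute(a, k - 1);
  }

  public static void main(String[] args) {
    permute(new int[]{1, 2, 3, 4, 5, 6}, 6);
    System.out.println(totalShifts / (double) permutations);  // prints 7.5
  }
}
```

Over all 720 permutations there are 5,400 shift operations in total, i.e., exactly 7.5 on average.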
We can also calculate this as follows: six elements times five shift operations; divided by two, because on average over all steps, half of the elements are already sorted; and again divided by two, because on average, the element to be sorted has to be moved to the middle of the already sorted elements:

6 × 5 × ½ × ½ = 30 × ¼ = 7.5

The following illustration shows all steps once again.

If we replace 6 with n, we get:

n × (n - 1) × ¼

Multiplied out, that is:

¼ n² - ¼ n

The highest power of n in this term is n²; the time complexity for shifting is, therefore, O(n²). This is also called "quadratic time".

So far, we have only looked at how the sorted elements are shifted. But what about comparing the elements and placing the element to be sorted on the field that became free?

There is one more comparison operation than shift operations (or the same number, if the element is moved to the far left). The time complexity for the comparison operations is, therefore, also O(n²).

The element to be sorted must be placed in the correct position as often as there are elements, minus those that are already in the right position, so at most n-1 times. Since there is no n² here, but only an n, we speak of "linear time", noted as O(n).

When considering the overall complexity, only the highest order counts (see "Big O Notation and Time Complexity – Easily Explained"). Therefore:

The average time complexity of Insertion Sort is: O(n²)

Worst-Case Time Complexity

Where there is an average case, there is also a worst and a best case. In the worst case, the elements are initially sorted in completely descending order. In each step, all elements of the sorted sub-array must then be shifted to the right so that the element to be sorted, which in each step is smaller than all already sorted elements, can be placed at the very beginning.
In the following diagram, this is demonstrated by the fact that the arrows always point to the far left.

The term from the average case changes in that the second division by two is omitted:

6 × 5 × ½

Or:

n × (n - 1) × ½

Multiplied out:

½ n² - ½ n

Even though we now have twice as many operations as in the average case, nothing changes in terms of time complexity: the term still contains n², and therefore:

The worst-case time complexity of Insertion Sort is: O(n²)

Best-Case Time Complexity

The best case is where it gets interesting! If the elements already appear in sorted order, there is precisely one comparison in the inner loop and no shift operation at all. With n elements, that is, n-1 steps (since we start with the second element), we thus arrive at n-1 comparison operations. Therefore:

The best-case time complexity of Insertion Sort is: O(n)

Insertion Sort With Binary Search?

Couldn't we speed up the algorithm by searching for the insertion point with binary search? Binary search is much faster than sequential search: it has a time complexity of O(log n).

Yes, we could. However, we would not gain anything, because we would still have to shift each element from the insertion position onwards one position to the right, which is only possible step by step in an array. The inner loop would therefore remain at linear complexity despite the binary search, and the whole algorithm would remain at quadratic complexity, that is, O(n²).

Insertion Sort With a Linked List?

If the elements are in a linked list, couldn't we insert an element in constant time, O(1)?

Yes, we could. However, a linked list does not allow binary search. This means that we would still have to iterate through the sorted elements in the inner loop to find the insertion position. This, in turn, results in linear complexity for the inner loop and quadratic complexity for the entire algorithm.

After all this theory, it's time to check it against the Java implementation presented above.
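As a small aside, the binary search variant discussed above can be sketched as follows (my own illustration, not code from the repository). The binary search finds the insertion position in O(log n), but the System.arraycopy call makes it explicit that the linear-time shifting, and thus the quadratic overall complexity, remains:

```java
// Insertion Sort with binary search for the insertion position.
// Still O(n²) overall, because shifting dominates.
public class BinaryInsertionSort {

  public static void sort(int[] elements) {
    for (int i = 1; i < elements.length; i++) {
      int elementToSort = elements[i];

      // Binary search for the first position whose element is greater
      // than elementToSort ("upper bound", which keeps the sort stable)
      int low = 0;
      int high = i;
      while (low < high) {
        int mid = (low + high) >>> 1;
        if (elements[mid] <= elementToSort) {
          low = mid + 1;
        } else {
          high = mid;
        }
      }

      // Shifting is still O(n) per element; this dominates the runtime
      System.arraycopy(elements, low, elements, low + 1, i - low);
      elements[low] = elementToSort;
    }
  }
}
```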
The UltimateTest class from the GitHub repository executes Insertion Sort (and all other sorting algorithms presented in this series of articles) as follows: after each iteration, the test program prints out the median of the previous measurement results.

Here is the result for Insertion Sort after 50 iterations (this is only an excerpt for the sake of clarity; the complete result can be found here):

It is easy to see that the runtime roughly quadruples whenever the number of elements doubles for unsorted and descending-sorted elements, while it only doubles for elements that are already sorted in ascending order. This corresponds to the expected time complexities of O(n²) and O(n).

Here are the measured values as a diagram. With presorted elements, Insertion Sort is so fast that the line is hardly visible; therefore, here is the best case separately.

Other Characteristics of Insertion Sort

The space complexity of Insertion Sort is constant, since we do not need any additional memory except for the loop variables i and j and the auxiliary variable elementToSort. This means that, no matter whether we sort ten elements or a million, we always need only these three additional variables. Constant space complexity is noted as O(1).

The sorting method is stable, because we only shift elements that are greater than the element to be sorted (not "greater than or equal to"), which means that the relative order of two identical elements never changes.

Insertion Sort is not directly parallelizable.* However, there is a parallel variant of Insertion Sort: Shellsort (see its description on Wikipedia).

* You could determine the insertion position with binary search and then parallelize the shifting of the sorted elements. But this would only make sense with large arrays, which would have to be split exactly along cache lines so as not to lose the performance gained by parallelization (or even reverse it) due to synchronization effects. This effort can be spared, since there are more efficient sorting algorithms for larger arrays anyway.

Insertion Sort vs. Selection Sort

You can find a comparison of Insertion Sort and Selection Sort in the article about Selection Sort.
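The stability property described above can be illustrated with a hypothetical Card class (my own example, not from the repository): because the inner loop only shifts strictly greater elements, two cards with the same value keep their relative order.

```java
import java.util.Arrays;

// Demonstrates the stability of Insertion Sort: cards with equal
// values keep their original relative order.
public class StabilityDemo {

  record Card(int value, String suit) {}

  static void sortByValue(Card[] cards) {
    for (int i = 1; i < cards.length; i++) {
      Card cardToSort = cards[i];
      int j = i;
      // Strictly "<": equal elements are never moved past each other
      while (j > 0 && cardToSort.value() < cards[j - 1].value()) {
        cards[j] = cards[j - 1];
        j--;
      }
      cards[j] = cardToSort;
    }
  }

  public static void main(String[] args) {
    Card[] cards = {
        new Card(9, "hearts"), new Card(4, "spades"),
        new Card(9, "clubs"), new Card(4, "diamonds")
    };
    sortByValue(cards);
    // The 4 of spades stays before the 4 of diamonds,
    // and the 9 of hearts before the 9 of clubs
    System.out.println(Arrays.toString(cards));
  }
}
```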
Summary

Insertion Sort is an easy-to-implement, stable sorting algorithm with a time complexity of O(n²) in the average and worst case, and O(n) in the best case.

For very small n, Insertion Sort is faster than more efficient algorithms such as Quicksort or Merge Sort. These algorithms therefore solve smaller sub-problems with Insertion Sort (the Dual-Pivot Quicksort implementation in Arrays.sort() of the JDK, for example, does so for less than 44 elements).

You will find an overview of all sorting algorithms and their characteristics in the first part of the article series.

If you liked the article, feel free to share it using one of the share buttons at the end. Would you like to be informed by e-mail when I publish a new article? Then click here to subscribe to the HappyCoders newsletter.
Insertion Sort Java Source Code
public class InsertionSort {

    public static void sort(int[] elements) {
        for (int i = 1; i < elements.length; i++) {
            int elementToSort = elements[i];

            // Move element to the left until it's at the right position
            int j = i;
            while (j > 0 && elementToSort < elements[j - 1]) {
                elements[j] = elements[j - 1];
                j--;
            }
            elements[j] = elementToSort;
        }
    }
}
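To see the algorithm at work, here is a small self-contained trace program (my own addition) that prints the array after each step of the example from the beginning of the article:

```java
import java.util.Arrays;

// Traces Insertion Sort on the example array [6, 2, 4, 9, 3, 7],
// printing the array after each insertion step.
public class InsertionSortTrace {

  public static void sortWithTrace(int[] elements) {
    for (int i = 1; i < elements.length; i++) {
      int elementToSort = elements[i];
      int j = i;
      while (j > 0 && elementToSort < elements[j - 1]) {
        elements[j] = elements[j - 1];   // shift larger element right
        j--;
      }
      elements[j] = elementToSort;       // place element on the vacant field
      System.out.println("Step " + i + ": " + Arrays.toString(elements));
    }
  }

  public static void main(String[] args) {
    sortWithTrace(new int[]{6, 2, 4, 9, 3, 7});
    // Step 1: [2, 6, 4, 9, 3, 7]
    // Step 2: [2, 4, 6, 9, 3, 7]
    // Step 3: [2, 4, 6, 9, 3, 7]
    // Step 4: [2, 3, 4, 6, 9, 7]
    // Step 5: [2, 3, 4, 6, 7, 9]
  }
}
```

The printed states match the five steps of the playing card example above.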
Runtime of the Java Insertion Sort Example
          n        unsorted      descending     ascending
        ...             ...             ...           ...
     32,768        87.86 ms       175.80 ms      0.042 ms
     65,536       350.43 ms       697.59 ms      0.084 ms
    131,072     1,398.92 ms     2,840.00 ms      0.168 ms
    262,144     5,706.82 ms    11,517.36 ms      0.351 ms
    524,288    23,009.68 ms    46,309.27 ms      0.710 ms
  1,048,576               –               –      1.419 ms
        ...             ...             ...           ...
536,870,912               –               –    693.310 ms
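The quadratic growth visible in the table can also be reproduced deterministically, without wall-clock measurements, by counting shift operations (my own sketch, not part of the test framework). For descending-sorted arrays, the worst case, doubling n roughly quadruples the work:

```java
// Counts Insertion Sort shift operations for descending-sorted arrays
// of doubling size; the count grows with n * (n - 1) / 2.
public class ShiftCountGrowth {

  static long countShifts(int[] elements) {
    long shifts = 0;
    for (int i = 1; i < elements.length; i++) {
      int elementToSort = elements[i];
      int j = i;
      while (j > 0 && elementToSort < elements[j - 1]) {
        elements[j] = elements[j - 1];
        j--;
        shifts++;
      }
      elements[j] = elementToSort;
    }
    return shifts;
  }

  // Builds the worst-case input: a descending-sorted array of size n
  static int[] descending(int n) {
    int[] a = new int[n];
    for (int i = 0; i < n; i++) {
      a[i] = n - i;
    }
    return a;
  }

  public static void main(String[] args) {
    for (int n = 1_000; n <= 8_000; n *= 2) {
      long shifts = countShifts(descending(n));  // n * (n - 1) / 2 shifts
      System.out.println("n = " + n + " -> " + shifts + " shifts");
    }
  }
}
```

Each doubling of n multiplies the shift count by roughly four, matching the measured runtimes for descending-sorted input above.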