A detailed breakdown of six sorting algorithms commonly used in Java

  • 2020-04-01 03:25:27
  • OfStack

Sorting algorithms come up in many places. I recently revisited them and implemented each one myself; this post records that work as material for future review.

Without further ado, here's a look at the classic sorting algorithms:

1. Selection sort

The basic idea of selection sort is: while traversing the array, let i denote the position currently being sorted; find the minimum value among the remaining elements [i..n-1], then swap that minimum with the element at position i. Because each step contains a sub-process of selecting the minimum of the remaining elements, it is aptly called selection sort. Here's an example:

Initial: [38, 17, 16, 16, 7, 31, 39, 32, 2, 11]
 
i = 0:   [2, 17, 16, 16, 7, 31, 39, 32, 38, 11] (0th [38] <-> 8th [2])
 
i = 1:   [2, 7, 16, 16, 17, 31, 39, 32, 38, 11] (1st [17] <-> 4th [7])
 
i = 2:   [2, 7, 11, 16, 17, 31, 39, 32, 38, 16] (2nd [16] <-> 9th [11])
 
i = 3:   [2, 7, 11, 16, 17, 31, 39, 32, 38, 16] (no exchange required)
 
i = 4:   [2, 7, 11, 16, 16, 31, 39, 32, 38, 17] (4th [17] <-> 9th [16])
 
i = 5:   [2, 7, 11, 16, 16, 17, 39, 32, 38, 31] (5th [31] <-> 9th [17])
 
i = 6:   [2, 7, 11, 16, 16, 17, 31, 32, 38, 39] (6th [39] <-> 9th [31])
 
i = 7:   [2, 7, 11, 16, 16, 17, 31, 32, 38, 39] (no exchange required)
 
i = 8:   [2, 7, 11, 16, 16, 17, 31, 32, 38, 39] (no exchange required)
 
i = 9:   [2, 7, 11, 16, 16, 17, 31, 32, 38, 39] (no exchange required)

As the example shows, as i increases, each pass makes fewer and fewer comparisons. However, whether or not the initial array is ordered, selection sort always scans from i to the end of the array, so for an array of a given length n the number of comparisons is fixed: (n - 1) + (n - 2) + ... + 1 = n * (n - 1) / 2. The number of exchanges, on the other hand, depends on the initial order of the array: with a random initial order, at most n - 1 exchanges are performed in the worst case and 0 in the best case (the array is already sorted).

It follows that the time and space complexity of selection sort are O(n²) and O(1) respectively (selection sort needs only one extra slot, for the temporary variable during an exchange).
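The fixed comparison count is easy to verify empirically. Below is a small sketch (my own illustration, not part of the original article; the class and method names are mine) that counts comparisons while selection-sorting the example array from above:

```java
public class SelectionComparisonCount {

    // Selection-sorts a in place and returns the number of comparisons performed.
    static int countComparisons(int[] a) {
        int comparisons = 0;
        for (int i = 0; i < a.length; i++) {
            int selected = i;
            for (int j = i + 1; j < a.length; j++) {
                comparisons++;                     // one compare per inner step
                if (a[j] < a[selected]) {
                    selected = j;
                }
            }
            int tmp = a[i]; a[i] = a[selected]; a[selected] = tmp;
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int[] a = {38, 17, 16, 16, 7, 31, 39, 32, 2, 11};
        int n = a.length;
        System.out.println(countComparisons(a));   // 45
        System.out.println(n * (n - 1) / 2);       // 45, the same for any input of length 10
    }
}
```

For n = 10 both printed values are 45, no matter how the input is ordered, which is exactly the point made above.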

Implementation code:


/** 
* Selection Sorting 
*/
SELECTION(new Sortable() { 
    public <T extends Comparable<T>> void sort(T[] array, boolean ascend) { 
        int len = array.length; 
        for (int i = 0; i < len; i++) { 
            int selected = i; 
            for (int j = i + 1; j < len; j++) { 
                int compare = array[j].compareTo(array[selected]); 
                if (compare != 0 && compare < 0 == ascend) { 
                    selected = j; 
                } 
            } 
 
            exchange(array, i, selected); 
        } 
    } 
})
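The listings in this article are enum constants that implement a Sortable interface and call an exchange(...) helper, neither of which appears in this excerpt. Below is a minimal sketch of what they might look like (my reconstruction: the names Sortable and exchange come from the listings, but their exact definitions are assumptions):

```java
import java.util.Arrays;

public class SortDemo {

    // Strategy interface that each listing implements.
    interface Sortable {
        <T extends Comparable<T>> void sort(T[] array, boolean ascend);
    }

    // Swap helper used throughout the listings.
    static <T> void exchange(T[] array, int i, int j) {
        T tmp = array[i];
        array[i] = array[j];
        array[j] = tmp;
    }

    public static void main(String[] args) {
        // Drive the selection-sort logic from the listing through the interface.
        Sortable selection = new Sortable() {
            public <T extends Comparable<T>> void sort(T[] array, boolean ascend) {
                for (int i = 0; i < array.length; i++) {
                    int selected = i;
                    for (int j = i + 1; j < array.length; j++) {
                        int compare = array[j].compareTo(array[selected]);
                        if (compare != 0 && compare < 0 == ascend) {
                            selected = j;
                        }
                    }
                    exchange(array, i, selected);
                }
            }
        };

        Integer[] data = {38, 17, 16, 16, 7, 31, 39, 32, 2, 11};
        selection.sort(data, true);
        System.out.println(Arrays.toString(data));
        // prints [2, 7, 11, 16, 16, 17, 31, 32, 38, 39]
    }
}
```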

2. Insertion sort

The basic idea of insertion sort is: while traversing the array, assume that the elements before index i, i.e. [0..i-1], are already sorted. Each pass must find the correct position k for x, the element at index i; while searching for k, every compared element is moved back one position to "make room" for x. Finally the element at position k is set to x. Insertion sort is named after this insertion step.

Below is an example. The marked number is the one being inserted, and the crossed-out numbers are elements not yet participating in the sort. The number being inserted is compared with the already-sorted elements before it, moving them back one by one. For example, in the second pass the elements participating in the sort are [11, 31, 12], and 12 needs to be inserted. Since 12 is not currently in its correct position, we compare it against the preceding elements 31 and 11 in order, moving each larger element back as we compare, until we reach 11, the first element not greater than 12, and stop. At that point index 1, previously occupied by 31, is the position where 12 must be inserted.

Original:    [11, 31, 12, 5, 34, 30, 26, 38, 36, 18]
 
First pass:  [11, 31, 12, 5, 34, 30, 26, 38, 36, 18] (no element moves)
 
Second pass: [11, 12, 31, 5, 34, 30, 26, 38, 36, 18] (31 moves back)
 
Third pass:  [5, 11, 12, 31, 34, 30, 26, 38, 36, 18] (11, 12, 31 all move back)
 
Fourth pass: [5, 11, 12, 31, 34, 30, 26, 38, 36, 18] (no element moves)
 
Fifth pass:  [5, 11, 12, 30, 31, 34, 26, 38, 36, 18] (31, 34 move back)
 
Sixth pass:  [5, 11, 12, 26, 30, 31, 34, 38, 36, 18] (30, 31, 34 move back)
 
Seventh pass: [5, 11, 12, 26, 30, 31, 34, 38, 36, 18] (no element moves)
 
Eighth pass: [5, 11, 12, 26, 30, 31, 34, 36, 38, 18] (38 moves back)
 
Ninth pass:  [5, 11, 12, 18, 26, 30, 31, 34, 36, 38] (26, 30, 31, 34, 36, 38 move back)

Insertion sort improves on selection sort in that it exploits the fact that the front portion of the array is already sorted, effectively cutting down the number of comparisons. This advantage, of course, depends on the initial order of the array: in the worst case (an array in exactly reverse order), insertion sort needs 1 + 2 + ... + (n - 1) = n * (n - 1) / 2 comparisons and moves, and in that extreme case it is even less efficient than selection sort. Insertion sort's running time is therefore not stable across inputs: it is closely tied to the initial order of the array. In general, the time and space complexity of insertion sort are O(n²) and O(1) respectively.

Implementation code:


/** 
* Insertion Sorting 
*/
INSERTION(new Sortable() { 
    public <T extends Comparable<T>> void sort(T[] array, boolean ascend) { 
        int len = array.length; 
        for (int i = 1; i < len; i++) { 
            T toInsert = array[i]; 
            int j = i; 
            for (; j > 0; j--) { 
                int compare = array[j - 1].compareTo(toInsert); 
                if (compare == 0 || compare < 0 == ascend) { 
                    break; 
                } 
                array[j] = array[j - 1]; 
            } 
 
            array[j] = toInsert; 
        } 
    } 
})

3. Bubble sort

Bubble sort is one of the most classic sorting algorithms; it was the first one I came across in school, because it is so simple to implement with two nested for loops. The inner loop checks whether two adjacent elements are out of order and swaps them if so; each pass of the outer loop "floats" the smallest of the remaining elements to the front of the array, hence the name bubble sort.

As usual, a simple example:

Initial state:     [24, 19, 26, 39, 36, 7, 31, 29, 38, 23]
 
Inner first pass:  [24, 19, 26, 39, 36, 7, 31, 29, 23, 38] (9th [23] <-> 8th [38])
 
Inner second pass: [24, 19, 26, 39, 36, 7, 31, 23, 29, 38] (8th [23] <-> 7th [29])
 
Inner third pass:  [24, 19, 26, 39, 36, 7, 23, 31, 29, 38] (7th [23] <-> 6th [31])
 
Inner fourth pass: [24, 19, 26, 39, 36, 7, 23, 31, 29, 38] (7 and 23 are in the correct order, no swap needed)
 
Inner fifth pass:  [24, 19, 26, 39, 7, 36, 23, 31, 29, 38] (5th [7] <-> 4th [36])
 
Inner sixth pass:  [24, 19, 26, 7, 39, 36, 23, 31, 29, 38] (4th [7] <-> 3rd [39])
 
Inner seventh pass: [24, 19, 7, 26, 39, 36, 23, 31, 29, 38] (3rd [7] <-> 2nd [26])
 
Inner eighth pass: [24, 7, 19, 26, 39, 36, 23, 31, 29, 38] (2nd [7] <-> 1st [19])
 
Inner ninth pass:  [7, 24, 19, 26, 39, 36, 23, 31, 29, 38] (1st [7] <-> 0th [24])
 
...

Bubble sort and selection sort are alike in that both make up to n * (n - 1) / 2 comparisons, but bubble sort performs extra exchanges while locating the minimum: it swaps adjacent elements whenever it finds them out of order, whereas selection sort decides whether to swap only once, after the inner loop finishes. So, in my view, selection sort can be seen as an improved version of bubble sort.

Implementation code:


/** 
* Bubble Sorting, it's very similar with Selection Sorting 
*/
BUBBLE(new Sortable() { 
    public <T extends Comparable<T>> void sort(T[] array, boolean ascend) { 
        int length = array.length; 
        int lastExchangedIdx = 0; 
        for (int i = 0; i < length; i++) { 
            // flag marking whether any exchange happened during this pass 
            boolean isExchanged = false; 
            // positions before the last exchange of the previous pass are already in order 
            int currOrderedIdx = lastExchangedIdx > i ? lastExchangedIdx : i; 
            for (int j = length - 1; j > currOrderedIdx; j--) { 
                int compare = array[j - 1].compareTo(array[j]); 
                if (compare != 0 && compare > 0 == ascend) { 
                    exchange(array, j - 1, j); 
                    isExchanged = true; 
                    lastExchangedIdx = j; 
                } 
            } 
            // if no exchange happen means array is already in order 
            if (isExchanged == false) { 
                break; 
            } 
        } 
    } 
})

4. Shell sort

Shell sort was born out of the problem that insertion sort has to move too many elements when dealing with large arrays. The idea of Shell sort is to "divide and conquer" a large array into several small subarrays separated by a gap. For example, the array [1, 2, 3, 4, 5, 6, 7, 8] divided with gap = 2 gives the two subarrays [1, 3, 5, 7] and [2, 4, 6, 8], while gap = 3 gives the three subarrays [1, 4, 7], [2, 5, 8], [3, 6]. Each subarray is then insertion-sorted separately; after every subarray is sorted, the gap is reduced and the previous steps are repeated, until gap = 1, that is, an insertion sort of the whole array. By that point the array is almost sorted, so very few elements need to move, which solves insertion sort's problem of excessive moves on large arrays.
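The gap-based split described above can be made concrete with a short sketch (my own illustration; the class and the helper name split are hypothetical) that shows which elements fall into each gap-subarray:

```java
import java.util.ArrayList;
import java.util.List;

public class GapSplit {

    // Illustrative only: groups the elements of a into its gap-subarrays,
    // i.e. the elements at indices start, start + gap, start + 2 * gap, ...
    public static List<List<Integer>> split(int[] a, int gap) {
        List<List<Integer>> groups = new ArrayList<>();
        for (int start = 0; start < gap; start++) {
            List<Integer> group = new ArrayList<>();
            for (int i = start; i < a.length; i += gap) {
                group.add(a[i]);
            }
            groups.add(group);
        }
        return groups;
    }

    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4, 5, 6, 7, 8};
        System.out.println(split(a, 2)); // [[1, 3, 5, 7], [2, 4, 6, 8]]
        System.out.println(split(a, 3)); // [[1, 4, 7], [2, 5, 8], [3, 6]]
    }
}
```

Shell sort insertion-sorts each of these groups in place, then repeats with a smaller gap.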

For an example, see insertion sort.

Shell sort is an improved version of insertion sort, which can greatly improve efficiency when the data volume is large. When the data volume is small, it is advisable to use insertion sort directly.

Implementation code:


/** 
* Shell Sorting 
*/
SHELL(new Sortable() { 
    public <T extends Comparable<T>> void sort(T[] array, boolean ascend) { 
        int length = array.length; 
        int gap = 1; 
 
        // grow the gap through the sequence 1, 4, 13, 40, ... until it reaches length / 3 
        while (gap < length / 3) { 
            gap = gap * 3 + 1; 
        } 
 
        while (gap >= 1) { 
            for (int i = gap; i < length; i++) { 
                T next = array[i]; 
                int j = i; 
                while (j >= gap) { 
                    int compare = array[j - gap].compareTo(next); 
                    // already find its position 
                    if (compare == 0 || compare < 0 == ascend) { 
                        break; 
                    } 
 
                    array[j] = array[j - gap]; 
                    j -= gap; 
                } 
                if (j != i) { 
                    array[j] = next; 
                } 
            } 
            gap /= 3; 
        } 
 
    } 
})

5. Merge sort

Merge sort is recursive and belongs to the "divide and conquer" family: split the target array in two from the middle, sort the two halves separately, and then "merge" the two sorted halves together. The most important part of merge sort is this "merge" step, which needs extra space equal to the combined length of the two arrays being merged. Suppose, for example, that the two arrays to merge are [3, 6, 8, 11] and [1, 3, 12, 15] (although logically two arrays, the elements actually live in the one original array [3, 6, 8, 11, 1, 3, 12, 15] and are divided by indices; we set three pointers lo, mid, and hi to 0, 3, and 7 respectively to express the logical split into subarrays). The extra array required then has length 4 + 4 = 8. The merge process can be briefly summarized as follows:

1) Copy the elements of the two subarrays into a new array copiedArray. Following the example above, copiedArray = [3, 6, 8, 11, 1, 3, 12, 15];

2) Set two pointers to the first element of each subarray. Call them leftIdx and rightIdx: leftIdx = 0 (pointing at the first element of copiedArray, [3]) and rightIdx = 4 (pointing at the fifth element of copiedArray, [1]);

3) Compare the elements that leftIdx and rightIdx point to, pick the smaller of the two, and assign its value to position i of the original array; after the assignment, increment both i and whichever index supplied the value. Repeat this step; once leftIdx or rightIdx reaches the end of its subarray, simply copy the remaining elements of the other subarray, in order, into the remaining positions.

Here is a concrete example of merging:

First pass:
 
Auxiliary array [21, 28, 39 | 35, 38]
 
[21, , , , ] (first comparison: 21 vs. 35, the left subarray wins, leftIdx = 0, i = 0)
 
Second pass:
 
Auxiliary array [21, 28, 39 | 35, 38] 
 
[21, 28, , , ] (second comparison: 28 vs. 35, the left subarray wins, leftIdx = 1, i = 1)
 
Third pass: [21, 28, 39 | 35, 38] 
 
[21, 28, 35, , ] (third comparison: 39 vs. 35, the right subarray wins, rightIdx = 0, i = 2)
 
Fourth pass: [21, 28, 39 | 35, 38] 
 
[21, 28, 35, 38, ] (fourth comparison: 39 vs. 38, the right subarray wins, rightIdx = 1, i = 3)
 
Fifth pass: [21, 28, 39 | 35, 38] 
 
[21, 28, 35, 38, 39] (fifth step: the right subarray is exhausted, so 39 is copied without a comparison, leftIdx = 2, i = 4)
That is one merge. The entire array to be sorted is split a finite number of times (halved each time) until the subarrays have length 1 and need no further sorting. After that, the subarrays are merged in reverse order (because of the recursion) until the final merge of the two subarrays of length n / 2, at which point the array is sorted.

Merge sort requires the most extra space of the sorts discussed here: each merge needs an auxiliary array as long as the two subarrays being merged combined, which at the top level is n elements, so the space complexity is O(n). The time complexity is O(n log n): the array is halved O(log n) times, and the merging done at each level costs O(n) in total.

Implementation code:


/** 
* Merge sorting 
*/
MERGE(new Sortable() { 
    public <T extends Comparable<T>> void sort(T[] array, boolean ascend) { 
        this.sort(array, 0, array.length - 1, ascend); 
    } 
 
    private <T extends Comparable<T>> void sort(T[] array, int lo, int hi, boolean ascend) { 
        // OPTIMIZE ONE 
        // if the substring's length is less than 20, 
        // use insertion sort to reduce recursive invocation 
        if (hi - lo < 20) { 
            for (int i = lo + 1; i <= hi; i++) { 
                T toInsert = array[i]; 
                int j = i; 
                for (; j > lo; j--) { 
                    int compare = array[j - 1].compareTo(toInsert); 
                    if (compare == 0 || compare < 0 == ascend) { 
                        break; 
                    } 
                    array[j] = array[j - 1]; 
                } 
 
                array[j] = toInsert; 
            } 
 
            return; 
        } 
 
        int mid = lo + (hi - lo) / 2; 
        sort(array, lo, mid, ascend); 
        sort(array, mid + 1, hi, ascend); 
        merge(array, lo, mid, hi, ascend); 
    } 
 
    private <T extends Comparable<T>> void merge(T[] array, int lo, int mid, int hi, boolean ascend) { 
        // OPTIMIZE TWO 
        // if it is already in right order, skip this merge 
        // since there's no need to do so 
        int leftEndCompareToRightStart = array[mid].compareTo(array[mid + 1]); 
        if (leftEndCompareToRightStart == 0 || leftEndCompareToRightStart < 0 == ascend) { 
            return; 
        } 
 
        @SuppressWarnings("unchecked") 
        T[] arrayCopy = (T[]) new Comparable[hi - lo + 1]; 
        System.arraycopy(array, lo, arrayCopy, 0, arrayCopy.length); 
 
        int lowIdx = 0; 
        int highIdx = mid - lo + 1; 
 
        for (int i = lo; i <= hi; i++) { 
            if (lowIdx > mid - lo) { 
                // left sub array exhausted 
                array[i] = arrayCopy[highIdx++]; 
            } else if (highIdx > hi - lo) { 
                // right sub array exhausted 
                array[i] = arrayCopy[lowIdx++]; 
            } else if (arrayCopy[lowIdx].compareTo(arrayCopy[highIdx]) < 0 == ascend) { 
                array[i] = arrayCopy[lowIdx++]; 
            } else { 
                array[i] = arrayCopy[highIdx++]; 
            } 
        } 
    } 
})

6. Quicksort

Quicksort is another recursive "divide and conquer" sorting algorithm, this time based on partitioning rather than merging. The charm of quicksort is that each partition puts one array element into its final, correct position; that is the core of the algorithm.
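The article breaks off before showing an implementation. Below is a minimal standalone sketch of the partition idea (my own illustration in the spirit of the earlier listings, not the author's original code; the class and method names are mine):

```java
import java.util.Arrays;

public class QuickSortSketch {

    public static <T extends Comparable<T>> void sort(T[] array) {
        sort(array, 0, array.length - 1);
    }

    private static <T extends Comparable<T>> void sort(T[] array, int lo, int hi) {
        if (lo >= hi) {
            return;
        }
        // partition() places the pivot into its final position p:
        // everything left of p is <= pivot, everything right of p is >= pivot
        int p = partition(array, lo, hi);
        sort(array, lo, p - 1);
        sort(array, p + 1, hi);
    }

    private static <T extends Comparable<T>> int partition(T[] array, int lo, int hi) {
        T pivot = array[lo];          // take the first element as the pivot
        int i = lo;
        int j = hi + 1;
        while (true) {
            // scan right for an element >= pivot
            while (array[++i].compareTo(pivot) < 0) {
                if (i == hi) break;
            }
            // scan left for an element <= pivot
            while (array[--j].compareTo(pivot) > 0) {
                if (j == lo) break;
            }
            if (i >= j) break;
            swap(array, i, j);
        }
        swap(array, lo, j);           // the pivot lands in its final position j
        return j;
    }

    private static <T> void swap(T[] array, int i, int j) {
        T tmp = array[i];
        array[i] = array[j];
        array[j] = tmp;
    }

    public static void main(String[] args) {
        Integer[] data = {24, 19, 26, 39, 36, 7, 31, 29, 38, 23};
        sort(data);
        System.out.println(Arrays.toString(data));
        // prints [7, 19, 23, 24, 26, 29, 31, 36, 38, 39]
    }
}
```

Note that this sketch always picks the first element as the pivot; on already-sorted input that degrades to O(n²), which is why practical implementations shuffle the input or choose the pivot more carefully.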

