Implementing 12 sorting algorithms in C

  • 2020-06-23 01:28:03
  • OfStack
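
The snippets below share a few pieces of support code that the article itself does not list: the bool flag in bubble_sort needs <stdbool.h>, several functions take size_t lengths, merge sort and counting sort call malloc/free, and bubble sort, selection sort, cocktail sort and heap sort all call a swap helper. A minimal version of that support code (the swap name and signature are assumed, not taken from the article):


#include <stdbool.h> // bool, used by bubble_sort's early-exit flag
#include <stddef.h>  // size_t
#include <stdlib.h>  // malloc/free, used by merge sort and counting sort

// Assumed helper: exchange two int values through their pointers.
static void swap(int *a, int *b) {
    int tmp = *a;
    *a = *b;
    *b = tmp;
}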

1. Bubble sort

Idea: Compare each pair of adjacent numbers; if the first is larger than the second, swap them. Repeat passes over the array until it is in order.

Time complexity: O(n^2). Stability: stable algorithm.

Code implementation:


void bubble_sort(int arr[], size_t len) {
    size_t i, j;
    for (i = 0; i < len; i++) {
        bool hasSwap = false;              // Optimization: if a pass makes no swap, the array is already ordered and we can stop early
        for (j = 1; j < len - i; j++) {    // j < len - i: the last i elements are already in place and need no more comparisons
            if (arr[j-1] > arr[j]) {       // If the previous element is larger than the next one
                swap(&arr[j-1], &arr[j]);  // Swap the two elements
                hasSwap = true;
            }
        }
        if (!hasSwap) {
            break;
        }
    }
}
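
As a quick usage example (the sample values are arbitrary, not from the article), bubble_sort can be exercised with a small driver like this:


#include <stdio.h>

int main(void) {
    int data[] = {5, 2, 9, 1, 7, 3};          // arbitrary sample input
    size_t n = sizeof(data) / sizeof(data[0]);
    bubble_sort(data, n);
    for (size_t i = 0; i < n; i++) {
        printf("%d ", data[i]);               // prints: 1 2 3 5 7 9
    }
    printf("\n");
    return 0;
}

The other length-based functions below (insert_sort, shell_sort, select_sort, cocktail_sort, heap_sort, count_sort) can be called the same way.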

2. Insertion sort

Idea: Insert a number into an ordered sequence so that it stays ordered. For an array to be sorted, keep the first i numbers in order, then insert the (i+1)-th number into its proper position so the prefix stays ordered; repeat until all the numbers are in order.

Time complexity: O(n^2) Stability: Stable algorithm

Code implementation:


void insert_sort(int arr[], int len) {
    int i, j;
    for (i = 1; i < len; i++) {
        int key = arr[i];                              // Record the value to be inserted on this pass
        for (j = i - 1; j >= 0 && arr[j] > key; j--) { // Search for the insertion position
            arr[j+1] = arr[j];                         // Shift elements larger than key one position to the right
        }
        arr[j+1] = key;                                // Insert the element
    }
}

3. Binary insertion sort

Idea: It is essentially insertion sort, but the insertion position is found with binary search, which makes it slightly faster than plain insertion sort.

Time complexity: O(n^2), stability: Stable algorithm.

Code implementation:


void half_insert_sort(int arr[], int len) {
    int i, j;
    for (i = 1; i < len; i++) {
        int key = arr[i];
        int left = 0;
        int right = i - 1;
        while (left <= right) {           // Binary search for the insertion position
            int mid = (left + right) / 2;
            if (key < arr[mid]) {
                right = mid - 1;
            } else {
                left = mid + 1;
            }
        }
        for (j = i - 1; j >= left; j--) { // Shift the elements after the insertion position back by one
            arr[j+1] = arr[j];
        }
        arr[j+1] = key;                   // Insert the element; here j == left - 1
    }
}

4. Shell sort

Idea: First take a positive integer d1 < n, group the elements whose indices are d1 apart, and do a direct insertion sort within each group; then take d2 < d1 and repeat the grouping and sorting; continue until di = 1, at which point all records form a single group and one final insertion sort finishes the job.

Time complexity: roughly O(n^1.3), a large improvement over plain insertion sort. Stability: unstable algorithm.

Code implementation:


void shell_sort(int arr[], int len) { // Essentially insertion sort on gapped groups: after each pass the data is roughly in place, which avoids moving large amounts of data in the final pass
    int i, j;
    int step = 0;
    for (step = len / 2; step >= 1; step = step / 2) { // Split the array into 'step' groups and insertion-sort each group
        for (i = step; i < len; i++) {
            int key = arr[i];
            for (j = i - step; j >= 0 && arr[j] > key; j = j - step) {
                arr[j+step] = arr[j];
            }
            arr[j+step] = key;
        }
    }
}

5. Selection sort

Idea: Loop through the unsorted part to find the position of the maximum, then swap the maximum with the last unsorted element; repeat until all the data is in order. Time complexity: O(n^2) Stability: unstable algorithm

Code implementation:


void select_sort(int arr[], size_t len) {
    size_t i, j;
    for (i = 0; i < len - 1; i++) {
        size_t max = 0;                   // Index of the maximum found so far
        for (j = 1; j < len - i; j++) {
            if (arr[max] < arr[j]) {      // Find the index of the maximum in the unsorted part
                max = j;
            }
        }
        if (max != j - 1) {
            swap(&arr[max], &arr[j-1]);   // Swap the maximum with the last unsorted element
        }
    }
}

6. Cocktail sort

Idea: An improvement on selection sort: in a single pass, find the positions of both the maximum and the minimum, swap the maximum with the last unsorted element and the minimum with the first unsorted element, so the outer loop only needs to run len/2 times.

Time complexity: O(n^2) Stability: Unstable algorithm

Code implementation:


void cocktail_sort(int arr[], size_t len) {
    size_t i, j;
    for (i = 0; i < len / 2; i++) {
        size_t max = i;                   // Index of the maximum
        size_t min = i;                   // Index of the minimum
        for (j = i + 1; j < len - i; j++) {
            if (arr[max] < arr[j]) {      // Track the index of the maximum
                max = j;
            }
            if (arr[min] > arr[j]) {      // Track the index of the minimum
                min = j;
            }
        }
        if (max != j - 1) {
            swap(&arr[max], &arr[j-1]);   // Swap the maximum with the last unsorted element
        }
        if (min == j - 1) {               // If the minimum was at the last unsorted position, the swap above moved it to where the maximum was
            min = max;                    // Update the index of the minimum accordingly
        }
        if (min != i) {
            swap(&arr[i], &arr[min]);     // Swap the minimum with the first unsorted element
        }
    }
}

7. Heap sort

Idea: Build a max heap from the data, then repeatedly swap the heap top (the maximum) with the last element of the heap and restore the heap property on the shrunken heap; when the loop finishes, the array is ordered. Process:

Max Heapify: sift a node down so that its children are always smaller than it. Build Max Heap: apply Max Heapify to every internal node so the whole array becomes a max heap.

HeapSort: swap the root (the current maximum) with the last element of the heap, shrink the heap by one and re-heapify the root, repeating until the heap is empty. Time complexity: O(nlogn) Stability: unstable algorithm

Implementation code:


void re_heap(int arr[], size_t index, size_t len) {
    size_t child = 2 * index + 1;         // Index of the left child
    int key = arr[index];                 // Value of the node being sifted down
    while (child < len) {
        if (child + 1 < len && arr[child] < arr[child+1]) { // If the right child exists and is larger than the left child, use the right child
            child++;
        }
        if (arr[child] > key) {           // If the larger child is greater than the sifted value
            arr[index] = arr[child];      // Move the child up
        } else {
            break;
        }
        index = child;
        child = 2 * index + 1;
    }
    arr[index] = key;                     // Put the sifted value into its final position
}
void heap_sort(int arr[], size_t len) {
    int i;
    for (i = len / 2; i >= 0; i--) {
        re_heap(arr, i, len);             // Sift down every internal node to build the max heap
    }
    for (i = len - 1; i > 0; i--) {
        swap(&arr[0], &arr[i]);           // Swap the heap top (maximum) with the last element of the heap
        re_heap(arr, 0, i);               // Restore the heap property on the reduced heap of size i
    }
}

8. Quicksort

Idea: In one pass of partitioning, split the data into two parts such that every element of one part is smaller than every element of the other; then sort the two parts recursively in the same way, so that the whole sequence becomes ordered. Process:

(1) First pick a pivot (boundary value), which splits the array into a left part and a right part.

(2) Move every element greater than or equal to the pivot to the right side of the array and every element less than the pivot to the left side. After this step, every element on the left is less than or equal to the pivot and every element on the right is greater than or equal to it.

(3) The left and right parts can then be sorted independently: pick another pivot for the left part and again split it into a smaller left half and a larger right half; treat the right part the same way.

(4) Repeat the process; this is a recursive definition. Once the left part has been sorted recursively and then the right part, the whole array is sorted. Time complexity: O(nlog2n) Stability: unstable algorithm

Code implementation:


void quick_sort(int arr[], size_t left, size_t right) {
    if (left >= right) {                  // Zero or one element: already ordered, nothing to do
        return;
    }
    int i = left;
    int j = right;
    int key = arr[left];                  // Take the leftmost element as the pivot
    while (i < j) {                       // Partition: everything right of the pivot's final position is >= it, everything left is <= it
        while (i < j && arr[j] >= key) {  // Scan from the right for a value smaller than the pivot
            --j;
        }
        arr[i] = arr[j];                  // Move that value into the hole on the left
        while (i < j && arr[i] <= key) {  // Scan from the left for a value larger than the pivot
            ++i;
        }
        arr[j] = arr[i];                  // Move that value into the hole on the right
    }
    arr[i] = key;                         // Put the pivot into its final position
    if (i - left > 1)                     // Recurse only when a side has at least two elements
        quick_sort(arr, left, i - 1);     // Sort the elements to the left of the pivot
    if (right - i > 1)
        quick_sort(arr, i + 1, right);    // Sort the elements to the right of the pivot
}
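
Note that quick_sort (and merge_sort below) take inclusive left and right indices rather than a length. A minimal call, assuming a non-empty array (with an empty array, n - 1 would wrap around because the parameters are size_t), looks like this:


#include <stdio.h>

int main(void) {
    int data[] = {4, 8, 1, 6};                // arbitrary sample input
    size_t n = sizeof(data) / sizeof(data[0]);
    quick_sort(data, 0, n - 1);               // inclusive bounds: sort indices 0 .. n-1
    for (size_t i = 0; i < n; i++) {
        printf("%d ", data[i]);               // prints: 1 4 6 8
    }
    printf("\n");
    return 0;
}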

9. Merge sort

Idea: Two ordered subsequences can be merged into a single, completely ordered sequence. Merge sort therefore works recursively, much like quicksort: split the range in half, sort each half, then merge them. Time complexity: O(nlog2n) Stability: stable algorithm. Code:


void merge(int arr[], int left, int right) {
    int i, j, k;
    int mid = (left + right) / 2;
    int len = mid - left + 1;                 // Number of elements in the left half [left, mid]
    int *temp = malloc(sizeof(arr[0]) * len);
    for (i = 0; i < len; i++) {
        temp[i] = arr[i + left];              // Copy the left half into a temporary array
    }
    i = 0, j = mid + 1, k = left;
    while (i < len && j <= right) {           // Compare the copied left half with the right half [mid+1, right] element by element and write the smaller value back into arr
        if (temp[i] <= arr[j]) {              // <= keeps the sort stable: on ties the element from the left half goes first
            arr[k++] = temp[i++];
        } else {
            arr[k++] = arr[j++];
        }
    }
    while (i < len) {                         // If elements of temp remain, copy them into arr;
                                              // no such step is needed for the right half, because its leftover elements already sit in their final positions at the end of arr
        arr[k++] = temp[i++];
    }
    free(temp);
}
void merge_sort(int arr[], int left, int right) {
    if (left >= right) {                      // Zero or one element: the range is already ordered
        return;
    }
    int mid = (left + right) / 2;
    merge_sort(arr, left, mid);               // Sort the elements in [left, mid]
    merge_sort(arr, mid + 1, right);          // Sort the elements in [mid+1, right]
    merge(arr, left, right);                  // Merge the two ordered halves [left, mid] and [mid+1, right]
}

10. Counting sort

Idea: This is a non-comparison algorithm. We use a large auxiliary array and record each value at the index equal to that value. For example, to sort 57, 60 and 42, take a large array and set arr[57] = 1, arr[60] = 1, arr[42] = 1, then traverse the large array in index order to read the values back out.

Time complexity: O(n+k), where k is the range of the data, so counting sort is best suited to arrays whose values are concentrated in a small range. Stability: stable algorithm

Code implementation:


void count_sort(int arr[], size_t len) {
    int max = arr[0];                     // Maximum value
    int min = arr[0];                     // Minimum value
    size_t i;
    for (i = 0; i < len; i++) {
        if (max < arr[i]) {               // Find the maximum
            max = arr[i];
        }
        if (min > arr[i]) {               // Find the minimum
            min = arr[i];
        }
    }
    int cnt = max - min + 1;              // Size of the value range
    int *prr = malloc(cnt * sizeof(int)); // Allocate the temporary counting array
    for (i = 0; i < cnt; i++) {           // Zero the counting array
        prr[i] = 0;
    }
    for (i = 0; i < len; i++) {           // Walk the sequence to be sorted
        prr[arr[i] - min]++;              // Increment the count at index (arr[i] - min)
    }
    size_t j = 0;
    for (i = 0; i < cnt; i++) {           // Walk the counting array in index order
        while (prr[i]) {                  // While the count at index i is not zero
            arr[j++] = i + min;           // Write the value i + min back into the array
            --prr[i];
        }
    }
    free(prr);                            // Release the dynamically allocated memory
}

11. Bucket sort

Idea: Divide the array into a finite number of buckets by value range and sort each bucket individually (either with another sorting algorithm or by applying bucket sort recursively). Bucket sort is a generalization of pigeonhole sort. It trades extra space for speed: the time complexity is O(N+C) with C = N*(logN-logM), where M is the number of buckets, so the more buckets, the higher the sorting efficiency. Stability: stable algorithm. Code implementation: first define the bucket type, then the sort itself.
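
A minimal sketch follows, assuming the buckets cover equal-width value ranges and reusing the insert_sort above to sort each bucket; the Bucket layout, BUCKET_COUNT and bucket_sort are assumed names, not the article's original listing:


#include <stdlib.h>
#include <string.h>

#define BUCKET_COUNT 10                       // assumed number of buckets

typedef struct {
    int *data;                                // elements that fall into this bucket
    int count;                                // how many elements are currently stored
} Bucket;

// Sketch of a bucket sort: distribute the values into BUCKET_COUNT equal-width
// ranges between min and max, insertion-sort each bucket (reusing the
// insert_sort above), then concatenate the buckets back into arr.
void bucket_sort(int arr[], int len) {
    if (len < 2) return;
    int min = arr[0], max = arr[0];
    for (int i = 1; i < len; i++) {
        if (arr[i] < min) min = arr[i];
        if (arr[i] > max) max = arr[i];
    }
    Bucket buckets[BUCKET_COUNT];
    for (int b = 0; b < BUCKET_COUNT; b++) {
        buckets[b].data = malloc(len * sizeof(int)); // worst case: everything lands in one bucket
        buckets[b].count = 0;
    }
    double width = (double)(max - min + 1) / BUCKET_COUNT;
    for (int i = 0; i < len; i++) {
        int b = (int)((arr[i] - min) / width);       // pick the bucket by value range
        if (b >= BUCKET_COUNT) b = BUCKET_COUNT - 1;
        buckets[b].data[buckets[b].count++] = arr[i];
    }
    int k = 0;
    for (int b = 0; b < BUCKET_COUNT; b++) {
        insert_sort(buckets[b].data, buckets[b].count); // sort each bucket individually
        memcpy(arr + k, buckets[b].data, buckets[b].count * sizeof(int));
        k += buckets[b].count;
        free(buckets[b].data);
    }
}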



12. Radix sort

Radix sort belongs to the family of distribution sorts, also known as bucket sort or bin sort: as the name implies, it distributes the elements into "buckets" based on part of the key information, and the sort emerges from repeated distribution and collection. Radix sort is a stable sort; its time complexity is O(n log_r m), where r is the radix used and m is the number of piles. In some cases radix sort is more efficient than other stable sorts.

Solution:

1. First distribute the values into buckets numbered 0 through 9 according to the units digit;

2. Then collect the values from the buckets in order and distribute them again, this time by the tens digit;

3. Collect again and repeat until the highest digit has been processed.

Time complexity: suppose there are n records whose keys consist of d digits, each digit taking values in a range of size radix; then the time complexity of chained radix sort is O(d(n+radix)). One distribution pass costs O(n) and one collection pass costs O(radix), and there are d rounds of distribution and collection in total. Stability: stable algorithm.

Code implementation:

Again, define the type of bucket:
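
A minimal sketch of an LSD radix sort for non-negative ints, reusing the same bucket shape as above with one bucket per decimal digit: distribute by the current digit, collect the buckets in order, and repeat for each higher digit. The names RadixBucket and radix_sort are illustrative, not the article's original code:


#include <stdlib.h>

typedef struct {
    int *data;                                // values whose current digit selects this bucket
    int count;
} RadixBucket;

// Sketch of an LSD radix sort for non-negative ints.
void radix_sort(int arr[], int len) {
    if (len < 2) return;
    int max = arr[0];
    for (int i = 1; i < len; i++) {
        if (arr[i] > max) max = arr[i];       // the largest value decides how many digit passes are needed
    }
    RadixBucket bucket[10];
    for (int d = 0; d < 10; d++) {
        bucket[d].data = malloc(len * sizeof(int)); // worst case: every value has the same digit
    }
    for (long long exp = 1; max / exp > 0; exp *= 10) {
        for (int d = 0; d < 10; d++) bucket[d].count = 0;
        for (int i = 0; i < len; i++) {       // distribution pass: drop each value into the bucket of its current digit
            int d = (int)((arr[i] / exp) % 10);
            bucket[d].data[bucket[d].count++] = arr[i];
        }
        int k = 0;                            // collection pass: read the buckets back in order 0..9
        for (int d = 0; d < 10; d++) {
            for (int c = 0; c < bucket[d].count; c++) {
                arr[k++] = bucket[d].data[c];
            }
        }
    }
    for (int d = 0; d < 10; d++) free(bucket[d].data);
}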



Conclusion

The above walks through 12 common sorting algorithms and their C implementations, along with the time complexity and stability of each.

