Source code sort algorithm

Bubble Sort

Bubble sort is a simple sorting algorithm. It works by repeatedly stepping
through the list to be sorted, comparing each pair of adjacent items and
swapping them if they are in the wrong order. The pass through the list is
repeated until no swaps are needed, which indicates that the list is sorted.
Because it only uses comparisons to operate on elements, it is a comparison
sort.

Step-by-Step Example

Assume we have an array "5 1 4 2 8" and we want to sort the array from the
lowest number to the greatest number using bubble sort.

Pseudocode Implementation
procedure bubbleSort( A : list of sortable items ) defined as:
  do
    swapped := false
    for each i in 0 to length( A ) - 2 do:
      if A[ i ] > A[ i + 1 ] then
        swap( A[ i ], A[ i + 1 ] )
        swapped := true
      end if
    end for
  while swapped
end procedure
An Improved Alternative Implementation

Here, instead of doing n(n-1) comparisons, we reduce it to
(n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons: after each pass the largest
remaining element is in its final position, so the next pass can stop one
element earlier.

Performance
Bubble sort is not a practical sorting algorithm when n is large.
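As a concrete sketch, the improved variant can be written in Java roughly as
follows (the class and method names are my own choices, not part of the
original text):

```java
import java.util.Arrays;

public class BubbleSortExample {
    // Sorts in place. Combines the two standard refinements: stop early
    // when a pass makes no swaps, and shrink the scanned range by one
    // after each pass, since the largest remaining element has already
    // bubbled to the end. This gives the n(n-1)/2 comparison bound.
    public static void bubbleSort(int[] a) {
        int n = a.length;
        boolean swapped = true;
        while (swapped) {
            swapped = false;
            for (int i = 0; i < n - 1; i++) {
                if (a[i] > a[i + 1]) {       // adjacent pair out of order
                    int tmp = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    swapped = true;
                }
            }
            n--;  // position n-1 is now final
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};  // the array from the step-by-step example
        bubbleSort(data);
        System.out.println(Arrays.toString(data));  // [1, 2, 4, 5, 8]
    }
}
```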

Selection Sort

Selection sort is an in-place comparison sort. It has O(n²)
complexity, making it inefficient on large lists, and generally performs worse
than the similar insertion sort. Selection sort is noted for its simplicity, and
also has performance advantages over more complicated algorithms in certain
situations.

Algorithm

Effectively, we divide the list into two parts: the sublist of items already
sorted and the sublist of items remaining to be sorted. Each pass finds the
minimum of the unsorted sublist, swaps it with the leftmost unsorted element,
and moves the boundary between the sublists one element to the right.

Example: Consider an example of sorting "64 25 12 22 11".

Java Implementation of the Algorithm
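A possible implementation is sketched below (the class name is mine; the main
method sorts the example array from the text):

```java
import java.util.Arrays;

public class SelectionSortExample {
    // Sorts in place. After iteration i, the first i+1 entries are the
    // i+1 smallest values in sorted order (the "sorted sublist").
    public static void selectionSort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;  // index of the smallest remaining item
            for (int j = i + 1; j < a.length; j++) {
                if (a[j] < a[min]) {
                    min = j;
                }
            }
            // Swap the minimum into place, extending the sorted sublist.
            int tmp = a[i];
            a[i] = a[min];
            a[min] = tmp;
        }
    }

    public static void main(String[] args) {
        int[] data = {64, 25, 12, 22, 11};  // the example array from the text
        selectionSort(data);
        System.out.println(Arrays.toString(data));  // [11, 12, 22, 25, 64]
    }
}
```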
Performance

Selection sort always performs O(n²) comparisons, regardless of the input
order, but makes at most n - 1 swaps, which can be an advantage when writes
are expensive.

Insertion Sort

Insertion sort is a comparison sort in which the sorted array (or list) is built
one entry at a time. It is much less efficient on large lists than more advanced
algorithms such as quicksort, heapsort, or merge sort. However, insertion sort
provides several advantages:

- Simple to implement
- Efficient on small or nearly sorted data sets
- Stable (does not change the relative order of equal elements)
- In-place (requires only a constant amount of additional memory)
- Online (can sort a list as it receives it)
Algorithm

Every iteration of insertion sort removes an element from the input data,
inserting it into the correct position in the already-sorted list, until no
input elements remain. The choice of which element to remove from the input is
arbitrary, and can be made using almost any choice algorithm. Sorting is
typically done in-place. The resulting array after k iterations has the
property that the first k+1 entries are sorted. In each iteration the first
remaining entry of the input is removed and inserted into the result at the
correct position, thus extending the result.

Example: Consider an example of sorting "64 25 12 22 11".

Pseudocode Implementation:

procedure insertionSort( A : list of sortable items ) defined as:
  for i := 1 to length( A ) - 1 do:
    value := A[ i ]
    j := i - 1
    while j >= 0 and A[ j ] > value do:
      A[ j + 1 ] := A[ j ]
      j := j - 1
    end while
    A[ j + 1 ] := value
  end for
end procedure
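The same idea in Java, as a minimal sketch (the class name is my own):

```java
import java.util.Arrays;

public class InsertionSortExample {
    // Sorts in place. After iteration i, the first i+1 entries are sorted.
    public static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int value = a[i];        // next input element to insert
            int j = i - 1;
            while (j >= 0 && a[j] > value) {
                a[j + 1] = a[j];     // shift larger sorted entries right
                j--;
            }
            a[j + 1] = value;        // insert at the vacated position
        }
    }

    public static void main(String[] args) {
        int[] data = {64, 25, 12, 22, 11};  // the example array from the text
        insertionSort(data);
        System.out.println(Arrays.toString(data));  // [11, 12, 22, 25, 64]
    }
}
```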

Merge Sort

Merge sort is an O(n log n) comparison-based sorting
algorithm. It is an example of the divide and conquer algorithmic paradigm.

Algorithm

Conceptually, a merge sort works as follows:

1. If the list is of length 0 or 1, it is already sorted.
2. Otherwise, divide the unsorted list into two sublists of about half the
   size.
3. Sort each sublist recursively by re-applying merge sort.
4. Merge the two sorted sublists back into one sorted list.

Merge sort incorporates two main ideas to improve its runtime: a small list
takes fewer steps to sort than a large list, and constructing a sorted list
from two sorted lists takes fewer steps than constructing it from two unsorted
lists.
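A Java sketch of this divide-and-conquer scheme (the class name is mine; for
clarity it returns a new sorted array rather than sorting in place):

```java
import java.util.Arrays;

public class MergeSortExample {
    public static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;               // base case: already sorted
        int mid = a.length / 2;
        int[] left = mergeSort(Arrays.copyOfRange(a, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);
    }

    // Merge two sorted arrays into one sorted array.
    private static int[] merge(int[] left, int[] right) {
        int[] result = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            // <= keeps the merge stable: ties take the left element first.
            result[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length) result[k++] = left[i++];
        while (j < right.length) result[k++] = right[j++];
        return result;
    }

    public static void main(String[] args) {
        int[] data = {35, 62, 33, 20, 5, 72, 48, 50};  // example from the text
        System.out.println(Arrays.toString(mergeSort(data)));
        // [5, 20, 33, 35, 48, 50, 62, 72]
    }
}
```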
Algorithm: MergeSort(A, 0, n-1)

Example: sort the following numbers 35, 62, 33, 20, 5, 72, 48, 50.

Performance
|

Quicksort

Quicksort is a well-known sorting algorithm that, on average, makes O(n
log n) comparisons to sort n items. However, in the worst case, it
makes O(n²) comparisons. Typically, quicksort is
significantly faster in practice than other O(n log n) algorithms,
because its inner loop can be efficiently implemented on most architectures,
and on most real-world data it is possible to make design choices which
minimize the probability of requiring quadratic time. Quicksort is a
comparison sort and, in efficient implementations, is not a stable sort.

Algorithm

Quicksort sorts by employing a divide and conquer strategy to divide a list
into two sub-lists. The steps are:

1. Pick an element, called a pivot, from the list.
2. Reorder the list so that all elements less than the pivot come before it
   and all elements greater than the pivot come after it (equal values can go
   either way). After this partitioning, the pivot is in its final position.
3. Recursively sort the sub-list of smaller elements and the sub-list of
   greater elements.
function quicksort(array)
    if length(array) <= 1
        return array
    select and remove a pivot value pivot from array
    create empty lists less and greater
    for each x in array
        if x <= pivot then append x to less
        else append x to greater
    return concatenate(quicksort(less), pivot, quicksort(greater))

Quicksort is similar to merge sort in many ways. It divides the elements to be
sorted into two groups, sorts the two groups by recursive calls, and combines
the two sorted groups into a single array of sorted values. However, the
method for dividing the array in half is much more sophisticated than the
simple method we used for merge sort. On the other hand, the method for
combining these two groups of sorted elements is trivial compared to the
method used in merge sort. The correctness of the partition algorithm is based
on the following two arguments: at each iteration, all the elements processed
so far are on the correct side of the eventual pivot position (the loop
invariant), and each iteration leaves one fewer element to be processed, so
the loop terminates.
The disadvantage of the simple version above is that it requires O(n) extra
storage space, which is as bad as merge sort. The additional memory
allocations required can also drastically impact speed and cache performance
in practical implementations. There is a more complex version which uses an
in-place partition algorithm and uses much less space. The partition
pseudocode:

function partition(array, left, right, pivotIndex)
    pivotValue := array[pivotIndex]
    swap array[pivotIndex] and array[right]  // move pivot to the end
    storeIndex := left
    for i from left to right - 1
        if array[i] < pivotValue
            swap array[i] and array[storeIndex]
            storeIndex := storeIndex + 1
    swap array[storeIndex] and array[right]  // move pivot to its final place
    return storeIndex
Choosing a Good Pivot Element

The choice of a good pivot element is critical to the efficiency of the
quicksort algorithm. If we can ensure that the pivot element is near the
median of the array values, then quicksort is very efficient. One technique
that is often used to increase the likelihood of choosing a good pivot element
is to randomly choose three values from the array and then use the middle of
these three values as the pivot element.

Let's try the quicksort algorithm with the following array: 40, 20, 10, 80,
60, 50, 7, 30, 100, 90, and 70.
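Putting the pieces together, here is a minimal Java sketch of in-place
quicksort, run on that array. The class name is mine, and for simplicity it
always uses the last element as the pivot; a production version would choose
the pivot more carefully, e.g. with the median-of-three technique.

```java
import java.util.Arrays;

public class QuicksortExample {
    public static void quicksort(int[] a, int lo, int hi) {
        if (lo < hi) {
            int p = partition(a, lo, hi);  // pivot lands in its final position
            quicksort(a, lo, p - 1);       // sort the smaller elements
            quicksort(a, p + 1, hi);       // sort the greater elements
        }
    }

    // In-place partition using the last element as the pivot: moves every
    // element smaller than the pivot to the left of the returned index.
    private static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];
        int store = lo;
        for (int i = lo; i < hi; i++) {
            if (a[i] < pivot) {
                int tmp = a[i]; a[i] = a[store]; a[store] = tmp;
                store++;
            }
        }
        int tmp = a[store]; a[store] = a[hi]; a[hi] = tmp;
        return store;
    }

    public static void main(String[] args) {
        int[] data = {40, 20, 10, 80, 60, 50, 7, 30, 100, 90, 70};
        quicksort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data));
        // [7, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
    }
}
```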