# Counting Sort

Counting sort, like radix sort, is a linear-time sorting algorithm. Its performance is proportional to the number of elements plus the range of the integer keys being sorted.

### Time Complexity

The worst-case complexity of comparison-based sorting, which linear algorithms like counting sort circumvent by never comparing elements, can be visualized using a decision tree. The decision tree represents every possible sequence of comparisons performed during a sort.

For example, for an array array[n] where n = 3, the decision tree would look like the following:

The height of the tree is the worst-case running time (the longest sequence of comparisons).

We know that the height of any decision tree is the worst-case running time, and that a sorting decision tree must have at least n! leaves (representing all permutations)…

Note that in a binary decision tree (like this one) each leaf represents a single permutation. Other decision trees can have more than n! leaves, since the same permutation may be reachable at more than one leaf.

## Pseudocode - unstable sort
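The unstable variant can be sketched as follows. This is a sketch under the assumption that keys are integers in the range 0..k; the name kArray follows the count array used later in the text, while the function name and the k parameter are assumptions:

```python
def counting_sort_unstable(array, k):
    """Sort integers in 0..k by counting occurrences (assumed sketch)."""
    # kArray[v] counts how many times value v appears in the input.
    kArray = [0] * (k + 1)
    for value in array:
        kArray[value] += 1

    # Rebuild the output by emitting each value as many times as it was
    # counted. The original elements are discarded, so any satellite
    # data (and the relative order of equal keys) is lost.
    sortedArray = []
    for value in range(k + 1):
        sortedArray.extend([value] * kArray[value])
    return sortedArray
```

Because the output is regenerated from the counts rather than moved from the input, this version cannot carry satellite data along with the keys, which is one way to see why it is the unstable variant.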

### Searching

From the decision tree example, we can derive the Ω(log n) lower bound for searching as follows…

1. We know that a binary tree of height h has at most 2^h leaves, and a decision tree for searching among n elements needs at least n leaves (one per possible answer). So searching has…

height >= log n, i.e. searching is Ω(log n) (lower bound)
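Written out, the chain of inequalities is:

```latex
2^{h} \ge n
\quad\Longrightarrow\quad
h \ge \log_2 n
\quad\Longrightarrow\quad
\text{searching} = \Omega(\log n)
```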

### Sorting

When sorting an array of n elements, there are n! possible permutations, so the decision tree must have at least n! leaves and its height must be at least log(n!)…

Expanding log(n!) = log n + log(n−1) + … + log 1, the largest n/2 terms are each at least log(n/2), and on a logarithmic scale these make up most of the value. So we can say that the height is approximately (n/2) log(n/2)…

Removing all constant factors, we get a lower bound of Ω(n log n).

Note you can also use Stirling's approximation to show that log(n!) is Θ(n log n).
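Both routes to the bound can be written out. The n/2-term argument gives:

```latex
\log(n!) = \sum_{i=1}^{n} \log i
\;\ge\; \sum_{i=n/2}^{n} \log i
\;\ge\; \frac{n}{2}\log\frac{n}{2}
\;=\; \Omega(n \log n)
```

and Stirling's approximation gives the matching upper-order term:

```latex
n! \approx \sqrt{2\pi n}\left(\frac{n}{e}\right)^{n}
\quad\Longrightarrow\quad
\log(n!) = n\log n - n\log e + O(\log n) = \Theta(n \log n)
```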

## Pseudocode - stable sort

The above algorithm is not a stable sort, since relative positions are not maintained. This shows up when the array to be sorted contains two elements of the same value: the algorithm does not guarantee their original relative order. For example, given [1, 2, 3, 2'], the elements 2 and 2' might end up sorted as [1, 2', 2, 3] instead of the [1, 2, 2', 3] you would expect from a stable sort.
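To make the example concrete, here is a small sketch using Python's built-in `sorted`, which is guaranteed stable. The string labels are assumptions added only to tell the duplicate keys apart:

```python
# The keys 1, 2, 3, 2' from the example above, with 2' tagged by a
# label so the two equal keys can be distinguished after sorting.
items = [(1, ""), (2, ""), (3, ""), (2, "'")]

# A stable sort keeps equal keys in their original input order:
# 2 (unlabeled) stays ahead of 2'.
stable = sorted(items, key=lambda pair: pair[0])
print(stable)  # [(1, ''), (2, ''), (2, "'"), (3, '')]
```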

A summary of the above operations:

To modify the above counting sort algorithm into a stable sort algorithm, we do the following…

1. Iterate through kArray to produce cArray (an array that keeps the positions of values encountered in the unsorted array, built as a running sum of the counts).

2. Iterate through the unsorted array backwards; for each element, look up its position in cArray.

3. Place the element at the position indicated by cArray[position], then reduce that cArray value by 1 so that the next occurrence of the same value is placed adjacent to it in the sorted array.
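The three steps above can be sketched as follows. This is a sketch assuming integer keys in 0..k; the names kArray, cArray, and sortedArray follow the text, while the key parameter is an assumption added so that records carrying satellite data can be sorted too:

```python
def counting_sort_stable(array, k, key=lambda x: x):
    """Stable counting sort for items whose keys are integers in 0..k."""
    # Count occurrences of each key value.
    kArray = [0] * (k + 1)
    for item in array:
        kArray[key(item)] += 1

    # Step 1: running sums of the counts give positions; cArray[v] is
    # the index just past the last slot reserved for value v.
    cArray = [0] * (k + 1)
    running = 0
    for v in range(k + 1):
        running += kArray[v]
        cArray[v] = running

    # Steps 2-3: walk the input backwards; place each item at
    # cArray[v] - 1 and decrement, so the next (earlier) occurrence of
    # the same value lands in the adjacent slot and order is preserved.
    sortedArray = [None] * len(array)
    for item in reversed(array):
        v = key(item)
        cArray[v] -= 1
        sortedArray[cArray[v]] = item
    return sortedArray
```

Running it on the labeled example from the previous section, 2 and 2' keep their original relative order, which is exactly what the backwards iteration plus decrement buys over the unstable variant.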

### Searching - stable counting sort

Stable counting sort has the same searching bound as unstable counting sort, since the algorithm doesn't change how elements are searched.

### Sorting - stable counting sort

We know from the algorithm that we are doing three iterations through arrays (kArray, cArray and sortedArray; building kArray and filling sortedArray each involve iterating over the n input elements, while cArray has length k). So the time complexity is at least:

O(n + k + n)

This is the same as:

O(2n + k) or, removing constant factors … O(n + k).