Counting sort sorts the array in O(n) time, so why don’t we use it everywhere…? Is the only reason that it becomes a costly process when the range gets big, or is there some other reason also…
Counting sort algo
Now, as we know, the worst-case time complexity of Counting Sort is O(n+k), where n is our input size and k is the range in which all our numbers lie. We define the range as the difference between the maximum and minimum element, plus one (since the count array has to cover both endpoints). For example, for the array [1, 5, 11, 0, -5, 36, 88, 100, 3, 24], n is 10 but k is 100 − (−5) + 1 = 106.
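To make that concrete, here is a tiny Python sketch (the array and variable names are just for illustration) that computes n and k for the example above:

```python
arr = [1, 5, 11, 0, -5, 36, 88, 100, 3, 24]

n = len(arr)                 # input size: 10
k = max(arr) - min(arr) + 1  # range of values: 100 - (-5) + 1 = 106

print(n, k)  # 10 106
```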
Now, O(n+k) makes Counting Sort look like a linear-time sorting algorithm, which makes it very tempting to use because it appears to perform much better than comparison-based sorts like Merge Sort, Quick Sort, etc. But here’s the catch: what if k is of the order of n^2? For example, in the array given above, n was 10 but k was 106, so k was slightly greater than the square of n. This makes the time complexity O(n + n^2), which essentially reduces to O(n^2). Hence, in this case, the time complexity actually got worse. And it can get even worse. What if the difference between the maximum and minimum element were of the order of 1000, i.e. the array were something like [0, 1, 55, 106, 1000, 92, 74, 642, 123, 49]? Here also n is 10, but k is 1001, roughly the cube of n. Thus the time complexity in this case becomes O(n^3).
Thus, as you can see, the time complexity of Counting Sort strictly depends upon the relationship between n and k. If k is of the order of n, Counting Sort works in linear time; otherwise it may take up any form based on how k changes with n. Note also that the count array needs O(k) extra memory, so a large range costs space as well as time. Hence, Counting Sort is a good choice if the numbers lie within a smaller range, comparable in size to the input; if the range of numbers increases significantly, Counting Sort gets slower.
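For reference, here is a minimal Counting Sort sketch in Python (handling negative numbers by offsetting with the minimum, as in the example array above). The comments mark exactly where n and k show up in the running time and in the extra memory:

```python
def counting_sort(arr):
    """Sort a list of integers in O(n + k) time with O(k) extra space,
    where n = len(arr) and k = max(arr) - min(arr) + 1."""
    if not arr:
        return []

    lo, hi = min(arr), max(arr)   # O(n) to find the value range
    k = hi - lo + 1               # size of the range

    counts = [0] * k              # O(k) extra memory -- this is what hurts for large ranges
    for x in arr:                 # O(n): count each value, offset by the minimum
        counts[x - lo] += 1

    result = []
    for value, cnt in enumerate(counts):  # O(k): walk the whole range, even the empty slots
        result.extend([value + lo] * cnt)
    return result


print(counting_sort([1, 5, 11, 0, -5, 36, 88, 100, 3, 24]))
# [-5, 0, 1, 3, 5, 11, 24, 36, 88, 100]
```

The second loop visits every slot of the count array, so even if n is tiny, the work (and the memory for `counts`) is proportional to k, which is exactly why a huge range makes Counting Sort a poor choice.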
In case of any doubt, feel free to ask.
If you got your answer, please mark your doubt as resolved.