Wednesday, August 8, 2007

Problem #1 Redux

So, Problem #1 was maybe a bit too easy, as I "solved" it (cough, cough) while trying to fall asleep last night.

Heuristically: Choose about sqrt(n) elements from L, spaced more or less uniformly. Sort these elements in their places (that is, sort the chosen values among themselves and write them back, in increasing order, into the chosen positions); this can be done in O(sqrt(n)ln(n)) time. Then (heuristically!) the smallest of these elements will on average have rank ~sqrt(n)/2, the next-smallest will have rank around ~3sqrt(n)/2, and so on. Counting the largest possible number of out-of-order pairs, one term per chosen element, we have that there are at most:

sqrt(n) + 3sqrt(n) + 5sqrt(n) + ... + (2sqrt(n)-1)sqrt(n) ~ n^(3/2) pairs which are out of order.

But n^(3/2) is o(n^2), so for any fixed \epsilon > 0 and large enough n, this will, on average, produce a (1-\epsilon)-sorted list.
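
For concreteness, here's a minimal Python sketch of the construction (the function name and the particular choice of positions 0, sqrt(n), 2*sqrt(n), ... are mine; it only performs the sampling-and-sorting step and makes no claims about the resulting sortedness):

import math

def sparse_partial_sort(lst):
    """Sort ~sqrt(n) evenly spaced positions of lst among themselves, in place."""
    n = len(lst)
    if n < 2:
        return lst
    step = math.isqrt(n)                        # gaps of ~sqrt(n), so about sqrt(n) positions are chosen
    positions = range(0, n, step)               # the chosen positions, spaced uniformly
    chosen = sorted(lst[i] for i in positions)  # sorting ~sqrt(n) values costs O(sqrt(n) * log n)
    for i, value in zip(positions, chosen):     # put the sorted values back in their places
        lst[i] = value
    return lst

Only the ~sqrt(n) chosen positions are ever touched, so the whole thing runs in O(sqrt(n)ln(n)) time, matching the bound above.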


We can easily extend this construction to produce an algorithm running in O(n^\epsilon) time, for any positive \epsilon, that gives (on average) arbitrarily good partial sorting (but the scaling gets worse and worse...).
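
To spell out one way of reading that extension (my arithmetic, following the same heuristic count as above): sample m ~ n^\epsilon elements instead of sqrt(n), so the gaps have length ~n^(1-\epsilon). Sorting the sampled elements in their places takes O(n^\epsilon ln(n)) time, and the sum above becomes

n^(1-\epsilon) + 3n^(1-\epsilon) + ... + (2n^\epsilon - 1)n^(1-\epsilon) ~ n^(1+\epsilon),

which is still o(n^2) whenever \epsilon < 1.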

So that part seems to be done (so long as the above argument can be made rigorous, which doesn't seem prohibitively difficult). But the question remains: What about algorithms guaranteed to produce a c-sorting? My hunch: There do not exist algorithms with worst-case time complexity o(n) that are guaranteed to c-sort a list of size n. (About average-case complexity, I'm not so sure.) Anyone who can prove this gets my everlasting respect and admiration.
