Researching Searching Algorithms?
NiN3x asks: "I have recently written a sorting algorithm that can be close to four times as fast as quicksort and never much slower. I was wondering if there are lots of sorting algorithms like this out there. I don't think I could be the only one that has thought of something like this. I am only in my second year of computer science, so I don't know a lot about these things. I have tried searching the net and can't find anything. My algorithm is NOT an inplace sorting algorithm. Can anyone point me to some sources for this type of thing?"
Copy and paste it here (Score:2, Insightful)
Wrong title? (Score:2, Insightful)
Proper Analysis is Key (Score:5, Insightful)
There are lots of algorithms out there that seem to be several times faster than quicksort in daily use. The problem is that when you formally analyze them, you find out they're not guaranteed to be faster, and could in fact be horribly slower. If you run it enough times against various weird data sets, you'll probably hit one of these slow runs yourself. I can only guess, because you haven't mentioned anything about the algorithm or shown any sample code. Read up on how to analyze your code in O() notation and figure out mathematically what the real worst case is.
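To see why the worst case matters, here's a minimal sketch (my own illustration, not the poster's algorithm) of how a naive quicksort that always picks the first element as its pivot looks fast on random data but degrades to O(n^2) comparisons on input that's already sorted:

```python
import random
import sys

sys.setrecursionlimit(2000)  # the pathological case recurses ~n deep

def quicksort(a, counter):
    """Naive quicksort using the first element as pivot.

    counter[0] accumulates the number of key comparisons made.
    """
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    counter[0] += len(rest)  # every remaining element is compared to the pivot
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

n = 500
c_rand, c_sorted = [0], [0]

quicksort(random.sample(range(n), n), c_rand)   # typical case: ~n log n comparisons
quicksort(list(range(n)), c_sorted)             # worst case: pivot is always the minimum

print(c_rand[0])    # on the order of a few thousand
print(c_sorted[0])  # exactly n*(n-1)/2 = 124750
```

Benchmarking on a handful of friendly inputs would never reveal that second number; only the formal analysis (or deliberately adversarial test data) does.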
Re:Funny that this topic came up... (Score:2, Insightful)
The original poster mentions that his algorithm isn't a simple in-place sort - which, based on my best guess, means that he's probably sorting pointers quickly and then rearranging the elements in a single pass. Maybe, maybe not - but a comparison-based sort is still no faster than O(n log n).
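For what it's worth, here's a minimal sketch of that guessed-at scheme (sort lightweight "pointers", then rearrange the records in one pass) - purely illustrative, since the poster hasn't shown the actual algorithm:

```python
# Hypothetical records: heavy objects we'd rather not shuffle repeatedly.
records = [
    {"key": 42, "data": "x"},
    {"key": 7,  "data": "y"},
    {"key": 19, "data": "z"},
]

# Sort cheap indices (stand-ins for pointers) by key, not the records themselves.
order = sorted(range(len(records)), key=lambda i: records[i]["key"])

# Single pass to place each record into its sorted position.
# Note the O(n) extra memory - this is exactly why it isn't an in-place sort.
result = [records[i] for i in order]

print([r["key"] for r in result])  # [7, 19, 42]
```

The index sort is still O(n log n) comparisons; the win is only in moving each heavy record exactly once, at the cost of extra memory for the index array and the output.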
But at this point, I'd start worrying more about scalability, memory usage, etc. - one of the nice things about quicksort is that it is so general it can quickly be adapted to low-memory or multi-processor implementations. Sure, the algorithm is faster - but faster does not necessarily mean better. If my implementation guess above is correct, the algorithm would use significant additional memory, making it unsuitable for sorting simple numbers (where the pointers would dwarf the elements themselves) or for use in low-memory (i.e. embedded) applications.
The true test of whether the algorithm is innovative is: 1) have a professor look over it, and then 2) get a (respectable) journal to publish it. If it passes the peer review required for publication, then it's an innovative algorithm. If not, it's one of thousands of alternative implementations of the same idea. Clever, maybe even useful, but not spectacular.
Fast vs. Optimal (Score:2, Insightful)
I know the question raised was about the correct analysis of the algorithm in question, but several of these postings have diverged into languages and types of data. This is good, and it is the whole point of a well-grounded CS education - knowing the strengths and weaknesses of the available algorithms so you can make the correct choices based on the size and type of the data and the language (and libraries) of choice.