Yes and no. Yes, there are searches that are faster, on average, than bisection search. But I believe they are still O(lg N), just with a lower constant factor.

You want to minimize the time taken to find your element. Generally it is desirable to use fewer steps, and one way to approach this is to maximize the expected number of elements that will be eliminated at each step. With bisection, always exactly half the elements are eliminated. You can do better than this, IF you know something about the distribution of the elements. But, the algorithm for choosing the partition element is generally more complicated than choosing the midpoint, and this extra complexity may overwhelm any time savings you expected to get from using fewer steps.
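One well-known way to pick the partition element from the distribution is interpolation search, which estimates the target's position by linear interpolation between the endpoints. This is a sketch, not the answer's specific method; it assumes a sorted array of roughly uniformly distributed values, and the function name `interp_search` is mine:

```c
#include <stddef.h>

/* Interpolation search: pick the probe point by interpolating between
 * the endpoint values instead of taking the midpoint. Expected
 * O(lg lg N) on uniform data, but degrades to O(N) on skewed data,
 * and each step costs more arithmetic than a midpoint computation. */
static long interp_search(const int *a, size_t n, int target)
{
    if (n == 0) return -1;
    size_t lo = 0, hi = n - 1;
    while (lo <= hi && target >= a[lo] && target <= a[hi]) {
        if (a[hi] == a[lo])                  /* avoid division by zero */
            return (a[lo] == target) ? (long)lo : -1;
        /* Estimate the position by linear interpolation. */
        size_t pos = lo + (size_t)((double)(target - a[lo]) * (hi - lo)
                                   / (a[hi] - a[lo]));
        if (a[pos] == target) return (long)pos;
        if (a[pos] < target) {
            lo = pos + 1;
        } else {
            if (pos == 0) break;             /* guard unsigned underflow */
            hi = pos - 1;
        }
    }
    return -1;
}
```

Note the trade-off mentioned above: each probe does a multiply and divide instead of a shift, so the per-step cost is higher even when the step count is lower.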

Really, in a problem like this it's better to attack second-order effects, like cache locality, than the search algorithm itself. For example, when doing repeated binary searches over the same list, the same few elements (the first, second, and third quartiles) are used VERY frequently, so putting them in a single cache line could be far superior to random access into the list.
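One concrete way to get the hot top-of-tree elements packed together is the Eytzinger (BFS) layout: store the array in the order a breadth-first traversal of the implicit search tree would visit it, so the root and the quartile-level elements sit contiguously at the front. This is a sketch of the cache-locality idea, not the answer's stated method; the function names are mine:

```c
#include <stddef.h>

/* Rearrange a sorted array into Eytzinger (BFS) order: out[0] is the
 * root (median), out[1..2] the quartiles, and so on. The first few
 * levels, touched by EVERY search, end up in the same cache lines. */
static void eytz_build(const int *sorted, int *out, size_t n,
                       size_t *in_pos, size_t k)
{
    if (k >= n) return;
    eytz_build(sorted, out, n, in_pos, 2 * k + 1);  /* left subtree  */
    out[k] = sorted[(*in_pos)++];                   /* this node     */
    eytz_build(sorted, out, n, in_pos, 2 * k + 2);  /* right subtree */
}

/* Binary search over the Eytzinger layout: same O(lg N) step count as
 * bisection, but with a much friendlier memory-access pattern. */
static long eytz_search(const int *a, size_t n, int target)
{
    size_t k = 0;
    while (k < n) {
        if (a[k] == target) return (long)k;  /* index in BFS order */
        k = (a[k] < target) ? 2 * k + 2 : 2 * k + 1;
    }
    return -1;
}
```

The returned index is a position in the rearranged array, so if you need the original rank you have to store it alongside the key or map it back.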

Dividing each level into, say, 4 or 8 equal sections (instead of 2) and doing a linear search through the section boundaries could also be quicker than bisection, because a linear search doesn't require calculating the partition point and also has fewer data dependencies that can cause cache stalls.
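One way to read that suggestion is a 4-ary search: at each level, linearly scan the three internal boundary elements to pick a quarter, then recurse into it; once the range is small, finish with a plain linear scan. A sketch under those assumptions (the section count of 4 and the function name are mine):

```c
#include <stddef.h>

/* 4-ary search over a sorted array: still O(lg N) levels (base 4),
 * but each level is a short, branch-predictable linear scan of the
 * boundary elements rather than a single data-dependent midpoint probe. */
static long four_way_search(const int *a, size_t n, int target)
{
    size_t lo = 0, hi = n;               /* half-open range [lo, hi) */
    while (hi - lo > 4) {
        size_t step = (hi - lo) / 4;
        size_t new_lo = lo, new_hi = hi;
        /* Linear scan of the three internal boundaries. */
        for (int k = 1; k < 4; k++) {
            size_t b = lo + (size_t)k * step;
            if (a[b] <= target) {
                new_lo = b;              /* target is at or after b  */
            } else {
                new_hi = b;              /* target is before b       */
                break;
            }
        }
        lo = new_lo;
        hi = new_hi;
    }
    for (size_t i = lo; i < hi; i++)     /* final short linear scan   */
        if (a[i] == target) return (long)i;
    return -1;
}
```

Whether this actually beats bisection depends on the hardware; it is the kind of thing you would have to benchmark rather than reason about asymptotically.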

But all of these are still O(lg N).

A state machine worked an order of magnitude faster for me on large data, but the complexity/memory cost of building the states is much larger than for sorting. – technosaurus – 2014-05-13T06:41:21.770


If you have a quantum computer you can try http://en.wikipedia.org/wiki/Grover%27s_algorithm :)

– David Titarenco – 2010-10-30T04:48:19.753

@David: The list is sorted though, so Grover's algorithm is worse than bisection search: O(sqrt N) > O(lg N). – Ben Voigt – 2010-10-30T05:00:16.760