how to calculate binary search complexity



I heard somebody say that since binary search halves the input required to search, it is a log(n) algorithm. Since I am not from a mathematics background, I am not able to relate to it. Can somebody explain it in a little more detail? Does it have something to do with the logarithmic series?

Bunny Rabbit

Posted 2011-11-18T15:50:12.363

Reputation: 3 635


this might help you:

– 2cupsOfTech – 2016-05-20T04:00:32.043



Here is a more mathematical way of seeing it, though not really complicated. IMO much clearer than the informal ones:

The question is: how many times can you divide N by 2 until you reach 1? This is essentially saying: do a binary search step (halve the elements) until you find it. As a formula:

1 = N / 2^x

multiply by 2^x:

2^x = N

now take log2:

log2(2^x)   = log2(N)
x * log2(2) = log2(N)
x * 1       = log2(N)
x           = log2(N)

this means you can divide N by 2 log2(N) times until everything is divided down to 1. Which means you have to do at most log2(N) binary search steps ("halve and look") until you find your element.
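A quick way to convince yourself of the derivation above (an illustrative Python sketch, not part of the original answer): count the halvings directly and compare against log2.

```python
import math

def halvings(n):
    """Count how many times n can be halved (integer division) before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

# the number of halvings matches floor(log2(n))
for n in [1, 2, 7, 128, 1_000_000]:
    print(n, halvings(n), math.floor(math.log2(n)))
```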


Posted 2011-11-18T15:50:12.363

Reputation: 7 489

i just calculated it to t(n) = (2^2)*K. how to make it to log form? – Shan Khan – 2016-01-30T15:25:44.793


the same concept explained graphically:

– 2cupsOfTech – 2016-02-22T16:51:04.800

Clear and simple... Great answer!!! – Manish – 2016-05-12T12:52:54.093

The part I'm missing is, if you have a BST with 7 entries, what is its formula? log2(7)? I did a brute force calculation with every possible outcome, and came to an answer that did not equal log2(7), so what am I doing wrong? – Perry Monschau – 2016-11-27T17:08:42.420

With this formula you can't calculate all there is about BST. If you have a BST counting only the leaves, it would be correct. Then you'd have 4 leaves, giving you log2(4) = 2, which is the correct tree depth. Apart from that, you have to define what you want to calculate. In your case I guess you want to calculate some expectancy, which is more complex. – duedl0r – 2016-12-06T23:37:18.460

So much easier than the binary tree explanation. – NoName – 2017-10-01T22:05:30.823

Very nice answer – VHS – 2018-03-13T01:31:50.370


For Binary Search, T(N) = T(N/2) + O(1) // the recurrence relation

Apply the Master Theorem for computing the run-time complexity of recurrence relations: T(N) = aT(N/b) + f(N)

Here, a = 1, b = 2 => log (a base b) = log (1 base 2) = 0

Also, here f(N) = N^c * log^k(N) // with c = 0 and k = 0, so c = log (a base b)

So, T(N) = O(N^c * log^(k+1)(N)) = O(N^0 * log^1(N)) = O(log(N))

Source :


Posted 2011-11-18T15:50:12.363

Reputation: 465

why log(a base b) is 1 when a=1 and b=2, shouldn't it be 0? – GAURANG VYAS – 2017-02-09T19:33:54.600



T(n) = T(n/2) + 1

T(n/2) = T(n/4) + 1

Substitute the value of T(n/2) into the line above:

T(n) = T(n/4) + 1 + 1
. . .
T(n) = T(n/2^k) + 1 + 1 + ... + 1 (k times)
     = T(2^k/2^k) + k    (as we have taken 2^k = n)
     = T(1) + k

Since 2^k = n,

k = log2(n)

So the time complexity is O(log n)

Dhiren Biren

Posted 2011-11-18T15:50:12.363

Reputation: 121


It doesn't halve the search time; that wouldn't make it log(n). It decreases it logarithmically. Think about this for a moment. If you had 128 entries in a table and had to search linearly for your value, it would take around 64 comparisons on average to find your value. That's n/2, or linear time. With a binary search, you eliminate half the possible entries each iteration, such that at most it would only take 7 compares to find your value (log base 2 of 128 is 7, i.e. 2 to the 7th power is 128). This is the power of binary search.
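The 128-entry example can be made concrete with a small experiment (a Python sketch; the counting is simplified to one three-way compare per probe, which is an assumption, not part of the original answer):

```python
def linear_comparisons(arr, target):
    """Comparisons a linear scan needs to find target."""
    for i, v in enumerate(arr):
        if v == target:
            return i + 1
    return len(arr)

def binary_comparisons(arr, target):
    """Probes an iterative binary search needs to find target in a sorted list."""
    lo, hi, probes = 0, len(arr) - 1, 0
    while lo <= hi:
        probes += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return probes
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return probes

table = list(range(128))
print(sum(linear_comparisons(table, t) for t in table) / 128)  # 64.5 on average
print(max(binary_comparisons(table, t) for t in table))        # 8 in the worst case
```

(The worst case comes out as 8 rather than exactly 7 because 128 entries don't quite fit in a 7-level tree; ceil(log2(129)) = 8. The ballpark comparison, n/2 versus roughly log2 n, is the point.)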

Michael Dorgan

Posted 2011-11-18T15:50:12.363

Reputation: 10 747

Sorry for the necropost but 128 is not an evenly filled out tree. I used a basic example to get my head around this, and I found that 7 entries evenly fills a tree with 3 layers. I calculated that the complexity should be 17/7 (the mean of the sum total of compares) which is 2.43. But log2(7) is 2.81. So what am I missing here? – Perry Monschau – 2016-11-23T21:06:05.853

Two answers - First one here: Even if there is no error in the math, we can see that 2.43 average is still better than 3.5 average for linear, and this is at a low value. Once you get into the 100s of entries, log2() is much much better than linear. I think you see this though, so onto next. – Michael Dorgan – 2016-11-28T19:14:03.467

Second answer: I'm not sure what kind of tree you have where 7 has everything filled out. When I think about a perfect tree of 8 entries, I see a 3 level deep tree with 8 total leaves. In this tree, no matter which number you search, it takes 3 total compares to get from root to leaf. For 7 entries, one of paths would take one less compare so 20/7 (6 nodes of 3 compares, 1 node of 2 compares) which is ~2.85. Log2(7) is ~2.81. I don't have the math background to explain away the .04 difference, but I guess it has to do with not having fractional bits available or some other magic :) – Michael Dorgan – 2016-11-28T19:26:44.967

the number is the number of leaves!? Not the number of nodes? Well that was a big piece of information that I was missing.. Seems strange to me that the function is based on leaves, when each branching node is also a potential stop point. Well anyway, thanks for straightening that out for me! – Perry Monschau – 2016-11-29T13:49:35.043


The time complexity of the binary search algorithm belongs to the O(log n) class. This is called big-O notation. The way you should interpret this is that the asymptotic growth of the time the function takes to execute, given an input set of size n, grows no faster than log n (up to a constant factor).

This is just formal mathematical lingo used to be able to prove statements, etc. It has a very straightforward explanation: when n grows very large, the time it takes to execute the function is bounded above by a constant times log n. The size of the "input set", n, is just the length of the list.

Simply put, the reason binary search is in O(log n) is that it halves the input set in each iteration. It's easier to think about it in the reverse situation: in x iterations, how long a list can the binary search algorithm at most examine? The answer is 2^x. From this we can see the reverse: the binary search algorithm needs at most log2(n) iterations for a list of length n.

As for why it is O(log n) and not O(log2 n): simply put again, in big-O notation constants don't count, and logarithms to different bases differ only by a constant factor.
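The halving argument can be checked empirically (a Python sketch, my own illustration: `worst_iterations` models the worst case by discarding the probed middle element plus one half each round):

```python
import math

def worst_iterations(n):
    """Worst-case number of probes of binary search on n sorted items."""
    probes = 0
    while n > 0:
        n //= 2        # drop the middle element and the half we ruled out
        probes += 1
    return probes

# doubling the input adds only one probe; the count tracks ceil(log2(n + 1))
for n in [8, 16, 32, 1024, 10**6]:
    print(n, worst_iterations(n), math.ceil(math.log2(n + 1)))
```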


Posted 2011-11-18T15:50:12.363

Reputation: 8 377


Here is the Wikipedia entry.

If you look at the simple iterative approach, you are just eliminating half of the elements to be searched until you find the element you need.

Here is the explanation of how we come up with the formula.

Say initially you have N elements. On each attempt you probe the middle index, ⌊(L+H)/2⌋, where L is the lower bound (the index of the first element, initially 0) and H is the upper bound (the index of the last element). If you are lucky, the element you are trying to find is in the middle [e.g. you are searching for 18 in the list {16, 17, 18, 19, 20}: you calculate ⌊(0+4)/2⌋ = 2, where L = 0 and H = 4. Index 2 holds the element 18 that you are searching for. Bingo! You found it.]

If instead the array were {15, 16, 17, 18, 19} and you were still searching for 18, you would not be lucky: you would first probe ⌊(0+4)/2⌋ = 2 and realize that element 17 at index 2 is not the number you are looking for. But now you know you don't have to look at at least half of the array on your next attempt. Your search effort is halved: every time an attempt fails, you discard half of the elements you would otherwise still have to search.

So in the worst case the remaining search range shrinks as

N/2, (N/2)/2, ((N/2)/2)/2, ...
i.e. N/2^1, N/2^2, N/2^3, ..., N/2^x

until you have finished searching, when the element you are trying to find is at one of the ends of the list.

That shows the worst case is when you reach N/2^x, where x is such that 2^x = N.

In other cases N/2^x, where x is such that 2^x < N. The minimum value of x is 1, which is the best case.

Now, since mathematically the worst case is when

2^x = N
=> log2(2^x) = log2(N)
=> x * log2(2) = log2(N)
=> x * 1 = log2(N)

the worst case takes x = log2(N) steps; more formally, ⌊log2(N)⌋ + 1 comparisons.
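The walkthrough above, with both the lucky and the unlucky array, can be played out in code (an illustrative Python sketch of the iterative search, using the same L and H names):

```python
def binary_search(arr, target):
    """Iterative binary search on a sorted list; returns the index of target, or -1."""
    L, H = 0, len(arr) - 1
    while L <= H:
        mid = (L + H) // 2          # floor((L + H) / 2)
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            L = mid + 1             # discard the lower half
        else:
            H = mid - 1             # discard the upper half
    return -1

print(binary_search([16, 17, 18, 19, 20], 18))  # 2: found on the very first probe
print(binary_search([15, 16, 17, 18, 19], 18))  # 3: first probe hits 17, second finds 18
```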


Posted 2011-11-18T15:50:12.363

Reputation: 86

How exactly do you obtain the more formal version? – Kalle – 2018-02-05T18:22:37.867


Log2(n) is roughly the maximum number of probes required to find something with a binary search. The average case involves about log2(n) - 1 probes. Here's more info:

Jonathan M

Posted 2011-11-18T15:50:12.363

Reputation: 13 536


Since we cut the list in half every time, we just need to know in how many steps we get down to 1 as we keep dividing the list by two. In the calculation below, x denotes the number of times we divide the list until one element remains (in the worst case).

1 = N/2^x

2^x = N

Taking log2:

log2(2^x) = log2(N)

x * log2(2) = log2(N)

x = log2(N)

Abdul Malik

Posted 2011-11-18T15:50:12.363

Reputation: 129


A binary search works by dividing the problem in half repeatedly, something like this (details omitted):

Example looking for 3 in [4,1,3,8,5]

  1. Order your list of items [1,3,4,5,8]
  2. Look at the middle item (4),
    • If it is what you are looking for, stop
    • If it is greater, look at the first half
    • If it is less, look at the second half
  3. Repeat step 2 with the new list [1, 3], find 3 and stop

It is a bi-nary search when you divide the problem in 2.

The search only requires log2(n) steps to find the correct value.
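The steps above can be sketched with Python's standard-library bisect module, which performs the same repeated halving internally:

```python
import bisect

items = [4, 1, 3, 8, 5]
items.sort()                          # step 1: order the list -> [1, 3, 4, 5, 8]
i = bisect.bisect_left(items, 3)      # steps 2-3: repeated halving, done by the library
print(i, items[i])                    # 1 3: target 3 found at index 1
```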

I would recommend Introduction to Algorithms if you want to learn about algorithmic complexity.

Silas Parker

Posted 2011-11-18T15:50:12.363

Reputation: 6 874


ok see this:

1. Suppose the loop terminates at i = k, i.e. the loop executes k times.

2. At each iteration n is divided in half:

   at i=1: n = n/2
   at i=2: n = (n/2)/2 = n/2^2
   at i=3: n = ((n/2)/2)/2 = n/2^3
   at i=4: n = (((n/2)/2)/2)/2 = n/2^4

So at i = k we have n/2^k = 1, i.e. n has been divided by 2 k times:

k = log(n)  // base 2

Piyush Jain

Posted 2011-11-18T15:50:12.363

Reputation: 1


T(N) = T(N/2) + 1

T(N) = T(N/2) + 1 = (T(N/4) + 1) + 1

...

T(N) = T(N/N) + (1 + 1 + ... + 1) = T(1) + log(N) = 1 + log(N)   (base-2 log)

So the time complexity of binary search is O(log N).
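Unrolling that recurrence in code confirms the closed form (a Python sketch; T(1) is taken to be 1, as in the derivation above):

```python
import math

def T(n):
    """The recurrence T(N) = T(N/2) + 1, with T(1) = 1."""
    if n <= 1:
        return 1
    return T(n // 2) + 1

# for powers of two, T(N) == 1 + log2(N) exactly
for k in range(1, 11):
    n = 2 ** k
    print(n, T(n), 1 + int(math.log2(n)))
```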


Posted 2011-11-18T15:50:12.363

Reputation: 1