Due to better access locality, B-trees are faster than binary search trees *in practice* -- but are they also better *in theory*? To answer this question, let's look at the number of comparisons required for a search operation. Assuming we store *n* elements in a binary search tree, the lower bound for the number of comparisons is *log₂ n* in the worst case. However, this is only achievable for a perfectly balanced tree. Maintaining such a tree's perfect balance during insert/delete operations requires *O(n)* time in the worst case.

Balanced binary search trees, therefore, leave some slack in terms of how balanced they are and have slightly worse bounds. For example, it is well known that an AVL tree guarantees at most *1.44 log₂ n* comparisons, and a Red-Black tree guarantees *2 log₂ n* comparisons. In other words, AVL trees require at most 1.44 times the minimum number of comparisons, and Red-Black trees require up to twice the minimum.
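As a quick numerical illustration (a sketch with made-up helper names, assuming only the bounds quoted above), here is how these guarantees compare for one million elements:

```python
import math

def bst_lower_bound(n):
    # Minimum worst-case comparisons for any binary search tree: log2(n)
    return math.log2(n)

def avl_worst_case(n):
    # AVL trees guarantee at most ~1.44 * log2(n) comparisons
    return 1.44 * math.log2(n)

def red_black_worst_case(n):
    # Red-Black trees guarantee at most 2 * log2(n) comparisons
    return 2 * math.log2(n)

n = 1_000_000
print(f"lower bound: {bst_lower_bound(n):.1f}")   # ~19.9
print(f"AVL:         {avl_worst_case(n):.1f}")    # ~28.7
print(f"Red-Black:   {red_black_worst_case(n):.1f}")  # ~39.9
```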

How many comparisons does a B-tree need? In B-trees with degree *k*, each node (except the root) has between *k* and *2k* children. For *k=2*, a B-tree is essentially the same data structure as a Red-Black tree and therefore provides the same guarantee of *2 log₂ n* comparisons. So how about larger, more realistic values of *k*?

To analyze the general case, we start with a B-tree that has the highest possible height for *n* elements. The height is maximal when each node has only *k* children (for simplicity, this analysis ignores the special case of underfull root nodes). This implies that the worst-case height of a B-tree is *logₖ n*. During a lookup, one has to perform a binary search that takes *log₂ k* comparisons in each of the *logₖ n* nodes. So in total, we have *log₂ k · logₖ n = log₂ n* comparisons.

This actually matches the best case, and to construct the worst case, we have to modify the tree somewhat. On one (and only one) arbitrary path from the root to a single leaf node, we increase the number of children from *k* to *2k*. In this situation, the tree height is still less than or equal to *logₖ n*, but we now have one worst-case path where we need *log₂ 2k* (instead of *log₂ k*) comparisons per node. On this worst-case path, we have *log₂ 2k · logₖ n = (log₂ 2k) / (log₂ k) · log₂ n* comparisons. Using this formula, we get the following bounds:

k=2: 2 log₂ n

k=4: 1.5 log₂ n

k=8: 1.33 log₂ n

k=16: 1.25 log₂ n

...

k=512: 1.11 log₂ n
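These factors can be reproduced directly from the formula above; a minimal Python sketch (the helper name is illustrative):

```python
import math

def worst_case_factor(k):
    # (log2 2k) / (log2 k): the multiplier in front of log2 n
    # on the worst-case path of a B-tree with degree k
    return math.log2(2 * k) / math.log2(k)

for k in (2, 4, 8, 16, 512):
    print(f"k={k}: {worst_case_factor(k):.2f} log2 n")
```

Extending the loop with k=100 gives a factor of about 1.15.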

We see that as k grows, B-trees get closer to the lower bound. For *k>=8*, B-trees are guaranteed to perform fewer comparisons than AVL trees in the worst case. As *k* increases, B-trees become more balanced. One intuition for this result is that for larger *k* values, B-trees become increasingly similar to sorted arrays, which achieve the *log₂ n* lower bound. Practical B-trees often use fairly large values of *k* (e.g., 100) and therefore offer tight bounds -- in addition to being more cache-friendly than binary search trees.
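The sorted-array intuition is easy to check empirically. The sketch below (an illustration, not part of the original analysis; the helper name is made up) counts the three-way comparisons a binary search performs and measures the worst case against the *log₂ n* lower bound:

```python
import math

def binary_search_comparisons(sorted_arr, target):
    # Standard binary search, counting one three-way comparison
    # per array position visited.
    lo, hi, comparisons = 0, len(sorted_arr) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if sorted_arr[mid] == target:
            break
        if sorted_arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return comparisons

n = 1 << 12  # 4096 elements
arr = list(range(n))
worst = max(binary_search_comparisons(arr, x) for x in arr)
print(worst, math.ceil(math.log2(n + 1)))  # measured worst case vs. lower bound
```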

(Caveat: For simplicity, the analysis assumes that *log₂ n* and *log₂ 2k* are integers, and that the root has either *k* or *2k* entries. Nevertheless, the observation that larger *k* values lead to tighter bounds should hold in general.)