After studying binary search trees, I took a detour to priority queues and heaps. I had planned to cover this topic later, but it kept coming up in lectures.

I really like heaps. They are space- and time-efficient, easy to understand and code, and make sorting easy. What's not to like?

Here's a fine video from MIT that explains max-heaps and heap sort:

## Building a Heap in O(n) Time

I was surprised by this. At first glance, you'd think building a heap from a given array of n elements would take O(n log n) time:

- n operations to place the elements in an array (or 1 if you heapify the given array in place)
- n * log n operations to take each of the n elements and heapify it, at O(log n) each
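The naive approach above can be sketched in Python (my own sketch, not from the lectures) using the standard library's `heapq`, which maintains a min-heap: each push sifts the new element up in O(log n), so n pushes cost O(n log n) overall.

```python
import heapq

def build_heap_naive(items):
    """Build a min-heap by inserting elements one at a time.

    Each heappush is O(log n), so n pushes give O(n log n) total.
    """
    heap = []
    for x in items:
        heapq.heappush(heap, x)  # sift-up: O(log n) per push
    return heap
```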

But not so fast!

Since the last half of a given array (if you view the array as a complete binary tree) consists of leaves, you can ignore it: each leaf is already a one-node subtree that satisfies the heap property (either max-heap or min-heap), so no work needs to be done on them.

A complete binary tree of n nodes has at most ceil(n/2^(h+1)) nodes at height h.

Going bottom-up (skipping the n/2 leaves) and restoring the heap property one level at a time, each node at height h needs at most h operations, one more than the nodes a level below. But as you go up the tree, each level has half as many nodes, so you do more work per node on exponentially fewer nodes.
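The bottom-up pass can be sketched in Python (my own sketch, not from the lectures): start at the last non-leaf node and sift each node down into its subtree.

```python
def sift_down(a, i, n):
    """Sink a[i] until the subtree rooted at i is a max-heap of size n."""
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        largest = i
        if left < n and a[left] > a[largest]:
            largest = left
        if right < n and a[right] > a[largest]:
            largest = right
        if largest == i:
            return  # heap property restored
        a[i], a[largest] = a[largest], a[i]
        i = largest  # continue sinking from the child we swapped into

def build_max_heap(a):
    """Build a max-heap in place in O(n) time.

    Indices n//2 .. n-1 are leaves (already valid heaps), so we skip
    them and sift down each internal node, bottom-up.
    """
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)
```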

This resembles a series:

n/4 nodes at height 1: 1 operation each

n/8 nodes at height 2: 2 operations each

n/16 nodes at height 3: 3 operations each

...

going all the way to the root at height log n: at most log n operations

Summing the work done across all levels:

n * (1/4 + 2/8 + 3/16 + 4/32 + ...) = n * 1 = **n**

The math:
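Spelled out (my own derivation, using the ceil(n/2^(h+1)) bound from above and the standard geometric-series identity):

```latex
\sum_{h=0}^{\lfloor \lg n \rfloor} \left\lceil \frac{n}{2^{h+1}} \right\rceil \cdot O(h)
= O\!\left( n \sum_{h=0}^{\infty} \frac{h}{2^{h+1}} \right)
= O(n \cdot 1) = O(n)
```

where the infinite sum evaluates to 1 via the identity \(\sum_{h=0}^{\infty} h x^{h} = \frac{x}{(1-x)^2}\) with \(x = \tfrac{1}{2}\), which gives 2, halved by the extra factor of \(\tfrac{1}{2^{1}}\) in each term.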

I like the video of this gent explaining it. He's a hoot!