# Fibonacci Heaps: An Amortized-Efficient Priority Queue
## Key Insights
- Fibonacci heaps achieve O(1) amortized time for insert and decrease-key operations, making them theoretically optimal for graph algorithms like Dijkstra’s and Prim’s that perform many decrease-key operations.
- The “lazy” design philosophy defers consolidation work until extract-min, spreading costs across operations through amortized analysis rather than paying upfront.
- Despite theoretical advantages, Fibonacci heaps rarely outperform simpler structures in practice due to high constant factors and poor cache locality—consider pairing heaps for real-world applications.
## Introduction: Why Fibonacci Heaps Matter
Binary heaps are the workhorse of priority queue implementations. They’re simple, cache-friendly, and offer O(log n) for insert, extract-min, and decrease-key. But that decrease-key complexity becomes a bottleneck in graph algorithms.
Consider Dijkstra’s shortest path algorithm. With a binary heap, you perform O(E) decrease-key operations, each costing O(log V). The total complexity becomes O(E log V). For dense graphs, where E approaches V², that means O(V² log V).
Fibonacci heaps flip the tradeoff. They achieve O(1) amortized time for decrease-key while maintaining O(log n) amortized extract-min. Dijkstra’s algorithm with a Fibonacci heap runs in O(E + V log V)—a significant improvement for dense graphs.
The catch? This performance comes from amortized analysis. Individual operations can be expensive, but averaged over a sequence of operations, the bounds hold. Understanding when this tradeoff benefits you is crucial.
## Core Structure and Properties
A Fibonacci heap is a collection of heap-ordered trees. Unlike binary heaps with their rigid structure, Fibonacci heaps impose minimal constraints. Each tree satisfies the min-heap property (no parent’s key exceeds its children’s), but tree shapes vary wildly.
The key insight is laziness. Rather than maintaining structure after every operation, Fibonacci heaps defer cleanup work. This lazy approach enables O(1) insert and decrease-key by simply postponing the expensive consolidation phase.
```python
class FibonacciNode:
    def __init__(self, key, value=None):
        self.key = key
        self.value = value
        self.degree = 0    # number of children
        self.mark = False  # has this node lost a child?
        self.parent = None
        self.child = None  # pointer to one child
        self.left = self   # circular doubly-linked list
        self.right = self

class FibonacciHeap:
    def __init__(self):
        self.min_node = None
        self.num_nodes = 0

    def _add_to_root_list(self, node):
        """Add node to the root list (circular doubly-linked)."""
        if self.min_node is None:
            self.min_node = node
            node.left = node
            node.right = node
        else:
            # Insert between min_node and min_node.right
            node.right = self.min_node.right
            node.left = self.min_node
            self.min_node.right.left = node
            self.min_node.right = node

    def _remove_from_list(self, node):
        """Remove node from its current list."""
        node.left.right = node.right
        node.right.left = node.left
```
The circular doubly-linked list structure enables O(1) insertion and removal. Each node maintains pointers to its parent, one arbitrary child, and its siblings in the circular list.
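The splice logic is easy to verify in isolation. Here is a minimal, self-contained sketch (a toy `Node` class and hypothetical `splice_after`/`unsplice` helpers, not part of the heap above) showing that insertion and removal in a circular doubly-linked list each cost only a constant number of pointer writes:

```python
class Node:
    """Toy node for a circular doubly-linked list (illustration only)."""
    def __init__(self, key):
        self.key = key
        self.left = self   # a lone node forms a ring with itself
        self.right = self

def splice_after(anchor, node):
    """Insert node to the right of anchor. O(1): four pointer writes."""
    node.right = anchor.right
    node.left = anchor
    anchor.right.left = node
    anchor.right = node

def unsplice(node):
    """Remove node from its ring. O(1): two pointer writes."""
    node.left.right = node.right
    node.right.left = node.left

# Build a ring a <-> b <-> c, then remove b
a, b, c = Node(1), Node(2), Node(3)
splice_after(a, b)   # ring: a <-> b
splice_after(b, c)   # ring: a <-> b <-> c
unsplice(b)          # ring: a <-> c
```

Neither operation ever scans the list, which is exactly what the heap's helpers rely on.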
## Key Operations: Insert and Find-Min
Insert is trivially O(1). Create a new node, add it to the root list, and update the min pointer if necessary. No restructuring, no rebalancing—just append.
```python
def insert(self, key, value=None):
    """Insert a new key into the heap. O(1) amortized."""
    node = FibonacciNode(key, value)
    node.left = node
    node.right = node
    self._add_to_root_list(node)
    if self.min_node is None or node.key < self.min_node.key:
        self.min_node = node
    self.num_nodes += 1
    return node  # Return for decrease_key reference

def find_min(self):
    """Return the minimum key. O(1)."""
    if self.min_node is None:
        raise IndexError("Heap is empty")
    return self.min_node.key, self.min_node.value

def is_empty(self):
    return self.min_node is None
```
The returned node reference from insert() is critical. Decrease-key requires direct access to the node—you can’t search for it efficiently. Callers must maintain their own mapping from values to node references.
## Extract-Min and Consolidation
Extract-min is where the deferred work gets paid. We remove the minimum node, promote its children to the root list, then consolidate trees to restore structure.
Consolidation merges trees of equal degree until no two roots share the same degree. This mirrors binomial heap behavior and bounds the number of trees to O(log n).
```python
def extract_min(self):
    """Remove and return the minimum element. O(log n) amortized."""
    min_node = self.min_node
    if min_node is None:
        raise IndexError("Heap is empty")
    # Add all children to root list
    if min_node.child is not None:
        child = min_node.child
        while True:
            next_child = child.right
            child.parent = None
            self._add_to_root_list(child)
            child = next_child
            if child == min_node.child:
                break
    # Remove min_node from root list
    self._remove_from_list(min_node)
    if min_node == min_node.right:
        # Was the only node
        self.min_node = None
    else:
        self.min_node = min_node.right
        self._consolidate()
    self.num_nodes -= 1
    return min_node.key, min_node.value

def _consolidate(self):
    """Merge trees until no two roots have the same degree."""
    max_degree = int(self.num_nodes ** 0.5) + 1
    degree_table = [None] * (max_degree + 1)
    # Collect all roots
    roots = []
    if self.min_node is not None:
        current = self.min_node
        while True:
            roots.append(current)
            current = current.right
            if current == self.min_node:
                break
    for root in roots:
        degree = root.degree
        current = root
        while degree < len(degree_table) and degree_table[degree] is not None:
            other = degree_table[degree]
            # Ensure current has smaller key
            if current.key > other.key:
                current, other = other, current
            self._link(other, current)
            degree_table[degree] = None
            degree += 1
        if degree >= len(degree_table):
            degree_table.extend([None] * (degree - len(degree_table) + 1))
        degree_table[degree] = current
    # Rebuild root list and find new min
    self.min_node = None
    for node in degree_table:
        if node is not None:
            node.left = node
            node.right = node
            self._add_to_root_list(node)
            if self.min_node is None or node.key < self.min_node.key:
                self.min_node = node

def _link(self, child, parent):
    """Make child a child of parent."""
    self._remove_from_list(child)
    child.parent = parent
    if parent.child is None:
        parent.child = child
        child.left = child
        child.right = child
    else:
        child.right = parent.child.right
        child.left = parent.child
        parent.child.right.left = child
        parent.child.right = child
    parent.degree += 1
    child.mark = False
```
The degree table ensures we only merge trees of equal degree. After consolidation, each degree appears at most once among roots, bounding root list size logarithmically.
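The binary-counter intuition behind the degree table can be checked in isolation. In this sketch (a toy `Tree` class with hypothetical `link`/`consolidate` helpers, separate from `FibonacciNode`), repeatedly linking equal-degree trees starting from n single-node trees leaves exactly one tree per set bit of n:

```python
class Tree:
    """Toy tree for demonstrating merge-by-degree (illustration only)."""
    def __init__(self, key):
        self.key = key
        self.degree = 0
        self.children = []

def link(a, b):
    """Make the larger-keyed root a child of the smaller-keyed one."""
    if a.key > b.key:
        a, b = b, a
    a.children.append(b)
    a.degree += 1
    return a

def consolidate(roots):
    """Merge roots until no two share a degree, binomial-heap style."""
    table = {}
    for t in roots:
        while t.degree in table:
            t = link(t, table.pop(t.degree))
        table[t.degree] = t
    return list(table.values())

trees = consolidate([Tree(k) for k in range(10)])
# 10 = 0b1010, so two trees remain: one of degree 1, one of degree 3
```

This is the same mechanism as carrying in binary addition, which is why at most ⌊log₂ n⌋ + 1 roots survive consolidation.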
## Decrease-Key and Cascading Cuts
Decrease-key is the crown jewel of Fibonacci heaps. When we decrease a node’s key, we might violate the heap property with its parent. Rather than bubble up (expensive), we cut the node and move it to the root list.
The mark bit tracks whether a node has lost a child since becoming a non-root. If a marked node loses another child, we cascade: cut it too, and check its parent. This cascading cut mechanism is essential for the amortized bounds.
```python
def decrease_key(self, node, new_key):
    """Decrease a node's key. O(1) amortized."""
    if new_key > node.key:
        raise ValueError("New key must be smaller than current key")
    node.key = new_key
    parent = node.parent
    if parent is not None and node.key < parent.key:
        self._cut(node, parent)
        self._cascading_cut(parent)
    if node.key < self.min_node.key:
        self.min_node = node

def _cut(self, node, parent):
    """Cut node from parent and add to root list."""
    # Remove from parent's child list
    if node.right == node:
        parent.child = None
    else:
        if parent.child == node:
            parent.child = node.right
        node.left.right = node.right
        node.right.left = node.left
    parent.degree -= 1
    # Add to root list
    node.parent = None
    node.mark = False
    node.left = node
    node.right = node
    self._add_to_root_list(node)

def _cascading_cut(self, node):
    """Perform cascading cuts up the tree."""
    parent = node.parent
    if parent is not None:
        if not node.mark:
            node.mark = True
        else:
            self._cut(node, parent)
            self._cascading_cut(parent)
```
Why does this work? The mark bit ensures a node is cut as soon as it loses a second child while a non-root. This constrains tree shapes enough that node degrees stay logarithmic in the heap size.
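This bound is also where the name comes from. A standard result (see, e.g., CLRS) is that a node x of degree d in a Fibonacci heap roots a subtree of at least F(d+2) nodes, where F(k) is the k-th Fibonacci number:

size(x) ≥ F(d+2) ≥ φ^d,  with φ = (1 + √5)/2 ≈ 1.618

It follows that every node in an n-node heap has degree at most log_φ n ≈ 1.44 log₂ n, which is what keeps extract-min's consolidation logarithmic.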
## Amortized Analysis with the Potential Method
The potential method assigns a “potential energy” to the data structure. We define:
Φ = t + 2m
Where t is the number of trees in the root list and m is the number of marked nodes.
For each operation, amortized cost = actual cost + ΔΦ. The key insights:
- Insert: Actual O(1), increases t by 1. Amortized O(1).
- Extract-min: Actual O(t + degree), consolidation reduces t dramatically. Amortized O(log n).
- Decrease-key: Each cut increases t by 1 but may unmark nodes (reducing m). Cascading cuts are “paid for” by the marks being cleared. Amortized O(1).
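Worked through for decrease-key: suppose one call performs c cuts in total (the initial cut plus c − 1 cascading cuts). The actual cost is O(c). The cuts add c trees (Δt = c), each cascading cut clears a mark, and at most one new mark is set (Δm ≤ −(c − 1) + 1 = 2 − c), so:

ΔΦ = Δt + 2Δm ≤ c + 2(2 − c) = 4 − c

The amortized cost O(c) + 4 − c collapses to O(1) once the potential is scaled by the constant hidden in O(c). The factor of 2 on m exists precisely so each cleared mark pays for one cascading cut plus the tree it adds.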
| Operation | Binary Heap | Binomial Heap | Fibonacci Heap |
|---|---|---|---|
| Insert | O(log n) | O(log n) | O(1)* |
| Find-min | O(1) | O(1) | O(1) |
| Extract-min | O(log n) | O(log n) | O(log n)* |
| Decrease-key | O(log n) | O(log n) | O(1)* |
| Merge | O(n) | O(log n) | O(1) |
*Amortized bounds
## Practical Considerations
Despite theoretical elegance, Fibonacci heaps rarely win in practice. The constant factors are brutal: pointer chasing destroys cache locality, and the bookkeeping overhead dominates for reasonable input sizes.
When should you actually use them? Only when:
- Your graph is large (millions of vertices)
- Your graph is dense (E ≫ V)
- You perform many decrease-key operations
- You’ve profiled and confirmed the bottleneck
For most applications, a binary heap or pairing heap performs better. Pairing heaps offer similar amortized bounds with simpler implementation and better cache behavior.
```python
def dijkstra_fibonacci(graph, source):
    """Dijkstra's algorithm using Fibonacci heap."""
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    heap = FibonacciHeap()
    nodes = {}  # vertex -> heap node mapping
    for v in graph:
        nodes[v] = heap.insert(dist[v], v)
    while not heap.is_empty():
        d, u = heap.extract_min()
        if d == float('inf'):
            break
        for v, weight in graph[u]:
            alt = dist[u] + weight
            if alt < dist[v]:
                dist[v] = alt
                heap.decrease_key(nodes[v], alt)
    return dist
```
The node mapping is essential—without O(1) access to heap nodes, decrease-key becomes useless. This is the practical reality that textbooks often gloss over.
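The driver loop itself works against any priority queue exposing insert / extract_min / decrease_key / is_empty. As a self-contained sanity check of that loop shape (the `LazyHeap` below is a stand-in introduced for illustration, not part of the implementation above), `heapq` with lazy deletion can play the Fibonacci heap's role:

```python
import heapq

class LazyHeap:
    """Stand-in priority queue: decrease_key pushes a fresh entry and
    remembers the current best key per value; stale entries are skipped."""
    def __init__(self):
        self._pq = []      # (key, tiebreak, value)
        self._best = {}    # value -> current key, while value is live
        self._count = 0    # insertion counter, avoids comparing values
        self._live = 0
    def insert(self, key, value):
        self._best[value] = key
        heapq.heappush(self._pq, (key, self._count, value))
        self._count += 1
        self._live += 1
        return value       # the "node handle" is just the value here
    def decrease_key(self, value, new_key):
        self._best[value] = new_key
        heapq.heappush(self._pq, (new_key, self._count, value))
        self._count += 1
    def extract_min(self):
        while self._pq:
            key, _, value = heapq.heappop(self._pq)
            if self._best.get(value) == key:  # current entry, not stale
                del self._best[value]
                self._live -= 1
                return key, value
        raise IndexError("Heap is empty")
    def is_empty(self):
        return self._live == 0

def dijkstra(graph, source, heap):
    # Same loop shape as dijkstra_fibonacci above
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    nodes = {v: heap.insert(dist[v], v) for v in graph}
    while not heap.is_empty():
        d, u = heap.extract_min()
        if d == float('inf'):
            break
        for v, weight in graph[u]:
            alt = dist[u] + weight
            if alt < dist[v]:
                dist[v] = alt
                heap.decrease_key(nodes[v], alt)
    return dist

graph = {'a': [('b', 1), ('c', 4)],
         'b': [('c', 2), ('d', 5)],
         'c': [('d', 1)],
         'd': []}
dist = dijkstra(graph, 'a', LazyHeap())
# dist == {'a': 0, 'b': 1, 'c': 3, 'd': 4}
```

The lazy-deletion trick trades the Fibonacci heap's O(1) decrease-key for extra heap entries, but it makes a convenient reference implementation when testing the real structure.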
Consider Fibonacci heaps a theoretical benchmark rather than a practical tool. Understand them to appreciate amortized analysis, but reach for simpler structures in production code. When you do need their guarantees, pairing heaps usually deliver similar performance with less implementation pain.
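For comparison, pairing heaps really are short to write. Here is a minimal sketch (insert and extract-min only; decrease-key omitted, and all names are illustrative) of the classic two-pass variant:

```python
class PairNode:
    """Node of a toy pairing heap (illustration only)."""
    def __init__(self, key):
        self.key = key
        self.children = []

def meld(a, b):
    """Merge two pairing heaps: larger-keyed root becomes a child. O(1)."""
    if a is None:
        return b
    if b is None:
        return a
    if b.key < a.key:
        a, b = b, a
    a.children.append(b)
    return a

def insert(root, key):
    return meld(root, PairNode(key))

def extract_min(root):
    """Remove the min; two-pass pairing of its children. O(log n) amortized."""
    kids = root.children
    # First pass: meld children in adjacent pairs
    paired = [meld(kids[i], kids[i + 1] if i + 1 < len(kids) else None)
              for i in range(0, len(kids), 2)]
    # Second pass: fold the pairs together right-to-left
    new_root = None
    for t in reversed(paired):
        new_root = meld(t, new_root)
    return root.key, new_root

heap = None
for k in [5, 1, 4, 2, 3]:
    heap = insert(heap, k)
out = []
while heap is not None:
    k, heap = extract_min(heap)
    out.append(k)
# out == [1, 2, 3, 4, 5]
```

No marks, no degree table, no cascading cuts: that simplicity, plus friendlier memory behavior, is why pairing heaps usually win in practice.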