AA Tree: Simplified Red-Black Tree
Key Insights
- AA trees achieve the same O(log n) performance as red-black trees while cutting the implementation complexity roughly in half by enforcing a single rule: red nodes can only be right children.
- The entire balancing logic reduces to just two operations—skew and split—applied uniformly after insertions and deletions, eliminating the case analysis that makes red-black trees error-prone to implement.
- AA trees are the ideal choice when you need a self-balancing BST and code maintainability matters more than squeezing out the last few percent of performance.
Introduction to AA Trees
In 1993, Swedish computer scientist Arne Andersson published a paper that should have changed how we teach self-balancing binary search trees. His AA tree (named after his initials) achieves the same logarithmic guarantees as red-black trees while being dramatically simpler to implement correctly.
The insight is elegant: red-black trees are complicated because red nodes can appear as either left or right children, creating numerous cases during rebalancing. Andersson asked a simple question—what if we restricted red nodes to only appear as right children? This single constraint eliminates half the rotation cases and transforms a notoriously tricky data structure into something you can implement correctly on a whiteboard.
If you’ve ever struggled through a red-black tree implementation, debugging subtle rotation bugs at 2 AM, AA trees offer a compelling alternative. Same performance, half the headache.
Core Concepts: Levels Instead of Colors
AA trees replace the red/black coloring scheme with a simpler abstraction: levels. Every node has an integer level, and the tree maintains two invariants:
- Left child rule: A left child’s level must be exactly one less than its parent’s level.
- Right child/grandchild rule: A right child’s level must be equal to or one less than its parent’s level. A right grandchild’s level must be strictly less than its grandparent’s level.
When a right child has the same level as its parent, we call this a “horizontal link”—conceptually equivalent to a red node in red-black terminology. The invariants ensure that horizontal links only go right and never chain more than two nodes together.
Here’s the node structure:
```python
class AANode:
    def __init__(self, key, value=None):
        self.key = key
        self.value = value
        self.level = 1  # New nodes start at level 1
        self.left = None
        self.right = None
```
Leaf nodes (and newly inserted nodes) always have level 1. As the tree grows, interior nodes accumulate higher levels. The level of a node roughly corresponds to the black-height in red-black tree terms—it represents the “true” depth in the balanced structure.
This level-based approach makes reasoning about the tree’s balance straightforward. You’re not juggling colors and trying to remember which configurations are legal. Instead, you’re comparing integers and applying simple arithmetic rules.
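That simple arithmetic can be verified mechanically. Below is a minimal sketch of an invariant checker; it uses a bare-bones `Node` class (a stand-in, not from the article) so the snippet runs standalone, but in practice you would pass it an `AANode` root:

```python
class Node:
    """Bare-bones stand-in mirroring AANode's level/left/right fields."""
    def __init__(self, level=1):
        self.level = level
        self.left = None
        self.right = None

def check_invariants(node):
    """Return True if the AA-tree level invariants hold for this subtree."""
    if node is None:
        return True
    # Nil children count as level 0, so a leaf must sit at level 1.
    left_level = node.left.level if node.left else 0
    right_level = node.right.level if node.right else 0
    # Left child rule: exactly one level below the parent.
    if left_level != node.level - 1:
        return False
    # Right child rule: same level (horizontal link) or one below.
    if right_level not in (node.level, node.level - 1):
        return False
    # Right grandchild rule: strictly below the grandparent.
    if node.right and node.right.right and node.right.right.level >= node.level:
        return False
    return check_invariants(node.left) and check_invariants(node.right)

# A legal shape: a level-2 root over two level-1 leaves.
root = Node(level=2)
root.left, root.right = Node(), Node()
assert check_invariants(root)

# Raising the root's level breaks the left child rule.
root.level = 3
assert not check_invariants(root)
```

A checker like this is also handy as a test harness while implementing insertion and deletion: run it after every operation and any rebalancing bug surfaces immediately.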
The Two Operations: Skew and Split
Here’s where AA trees truly shine. The entire rebalancing logic consists of exactly two operations: skew and split. Compare this to red-black trees, which require handling uncle colors, double rotations, and multiple distinct cases.
Skew: Fixing Left Horizontal Links
A skew occurs when a left child has the same level as its parent—a violation of our first invariant. The fix is a right rotation:
```
      P               L
     / \             / \
    L   c    →      a   P
   / \                 / \
  a   b               b   c

(where L.level == P.level)
```
```python
def skew(node):
    """Remove left horizontal links by rotating right."""
    if node is None:
        return None
    if node.left is None:
        return node
    if node.left.level == node.level:
        # Rotate right
        left = node.left
        node.left = left.right
        left.right = node
        return left
    return node
```
Split: Fixing Consecutive Right Horizontal Links
A split occurs when we have two consecutive right horizontal links—a right child and right grandchild both at the parent’s level. The fix is a left rotation with a level increase:
```
    P                  R
   / \                / \
  a   R       →      P   X
     / \            / \
    b   X          a   b

(where P.level == R.level == X.level)
```
```python
def split(node):
    """Remove consecutive right horizontal links by rotating left and promoting."""
    if node is None:
        return None
    if node.right is None or node.right.right is None:
        return node
    if node.level == node.right.right.level:
        # Rotate left
        right = node.right
        node.right = right.left
        right.left = node
        right.level += 1  # Promote the new root
        return right
    return node
```
That’s it. Two simple functions, each under 15 lines. Every rebalancing operation in an AA tree is just a combination of skew and split applied at various points up the tree.
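To see both operations in action, the sketch below builds each violating shape by hand and repairs it. It restates `AANode`, `skew`, and `split` in condensed form so it runs standalone; the logic is unchanged:

```python
class AANode:
    def __init__(self, key, value=None):
        self.key, self.value, self.level = key, value, 1
        self.left = self.right = None

def skew(node):
    """Right-rotate away a left horizontal link (condensed)."""
    if node and node.left and node.left.level == node.level:
        left, node.left = node.left, node.left.right
        left.right = node
        return left
    return node

def split(node):
    """Left-rotate away a double right horizontal link and promote (condensed)."""
    if node and node.right and node.right.right and node.level == node.right.right.level:
        right, node.right = node.right, node.right.left
        right.left = node
        right.level += 1
        return right
    return node

# Left horizontal link: L and P both at level 1.
p, l = AANode('P'), AANode('L')
p.left = l
top = skew(p)
assert (top.key, top.right.key) == ('L', 'P')       # right rotation around P

# Two consecutive right horizontal links: P -> R -> X, all at level 1.
p, r, x = AANode('P'), AANode('R'), AANode('X')
p.right, r.right = r, x
top = split(p)
assert (top.key, top.level) == ('R', 2)             # R promoted to level 2
assert (top.left.key, top.right.key) == ('P', 'X')
```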
Insertion Implementation
Insertion follows the standard BST pattern, then applies skew and split on the way back up the recursion. The beauty is in the uniformity—you don’t need to check cases or track parent pointers.
```python
class AATree:
    def __init__(self):
        self.root = None

    def insert(self, key, value=None):
        self.root = self._insert(self.root, key, value)

    def _insert(self, node, key, value):
        # Standard BST insertion
        if node is None:
            return AANode(key, value)
        if key < node.key:
            node.left = self._insert(node.left, key, value)
        elif key > node.key:
            node.right = self._insert(node.right, key, value)
        else:
            node.value = value  # Update existing key
            return node

        # Rebalance on the way up
        node = skew(node)
        node = split(node)
        return node
```
Compare this to red-black insertion, where you need to handle:
- Uncle is red: recolor and recurse upward
- Uncle is black, node is inner child: double rotation
- Uncle is black, node is outer child: single rotation with recolor
The AA tree approach is refreshingly mechanical. Insert, skew, split, done.
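One way to sanity-check this path is to feed it sorted keys, the worst case for a naive BST, and confirm the tree stays logarithmically shallow. The sketch below condenses the classes above into one self-contained snippet; the `height` and `inorder` helpers are illustrative additions, not part of the article's API:

```python
class AANode:
    def __init__(self, key, value=None):
        self.key, self.value, self.level = key, value, 1
        self.left = self.right = None

def skew(node):
    if node and node.left and node.left.level == node.level:
        left, node.left = node.left, node.left.right
        left.right = node
        return left
    return node

def split(node):
    if node and node.right and node.right.right and node.level == node.right.right.level:
        right, node.right = node.right, node.right.left
        right.left = node
        right.level += 1
        return right
    return node

class AATree:
    def __init__(self):
        self.root = None

    def insert(self, key, value=None):
        self.root = self._insert(self.root, key, value)

    def _insert(self, node, key, value):
        if node is None:
            return AANode(key, value)
        if key < node.key:
            node.left = self._insert(node.left, key, value)
        elif key > node.key:
            node.right = self._insert(node.right, key, value)
        else:
            node.value = value
            return node
        return split(skew(node))  # rebalance on the way back up

def height(node):
    return 0 if node is None else 1 + max(height(node.left), height(node.right))

def inorder(node):
    return [] if node is None else inorder(node.left) + [node.key] + inorder(node.right)

tree = AATree()
for k in range(1, 101):         # sorted insertion: a naive BST would reach height 100
    tree.insert(k)
assert inorder(tree.root) == list(range(1, 101))
assert height(tree.root) <= 14  # a level-L subtree holds >= 2^L - 1 nodes, so depth stays O(log n)
```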
Deletion Implementation
Deletion is more complex—this is true for all balanced BST variants. The strategy involves finding the in-order successor or predecessor for internal nodes, then rebalancing by potentially decreasing levels and applying skew/split.
```python
    def delete(self, key):
        self.root = self._delete(self.root, key)

    def _delete(self, node, key):
        if node is None:
            return None
        if key < node.key:
            node.left = self._delete(node.left, key)
        elif key > node.key:
            node.right = self._delete(node.right, key)
        else:
            # Found the node to delete
            if node.left is None:
                return node.right
            elif node.right is None:
                return node.left
            else:
                # Replace with in-order successor
                successor = self._min_node(node.right)
                node.key = successor.key
                node.value = successor.value
                node.right = self._delete(node.right, successor.key)

        # Rebalance: decrease level if necessary
        node = self._decrease_level(node)
        node = skew(node)
        if node.right:
            node.right = skew(node.right)
            if node.right.right:
                node.right.right = skew(node.right.right)
        node = split(node)
        if node.right:
            node.right = split(node.right)
        return node

    def _min_node(self, node):
        while node.left:
            node = node.left
        return node

    def _decrease_level(self, node):
        """Ensure level invariants after deletion."""
        should_be = min(
            node.left.level if node.left else 0,
            node.right.level if node.right else 0
        ) + 1
        if should_be < node.level:
            node.level = should_be
            if node.right and should_be < node.right.level:
                node.right.level = should_be
        return node
```
The deletion rebalancing applies skew and split multiple times to handle cascading violations. While more involved than insertion, it’s still more straightforward than red-black deletion, which is notoriously one of the most error-prone algorithms in computer science.
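As a smoke test for the whole lifecycle, the sketch below inserts fifty keys, deletes the even ones, and checks that the survivors come back in sorted order and the tree stays shallow. It condenses the full class into one self-contained snippet; the inline successor loop replaces `_min_node`, and `height`/`inorder` are illustrative helpers:

```python
class AANode:
    def __init__(self, key, value=None):
        self.key, self.value, self.level = key, value, 1
        self.left = self.right = None

def skew(node):
    if node and node.left and node.left.level == node.level:
        left, node.left = node.left, node.left.right
        left.right = node
        return left
    return node

def split(node):
    if node and node.right and node.right.right and node.level == node.right.right.level:
        right, node.right = node.right, node.right.left
        right.left = node
        right.level += 1
        return right
    return node

class AATree:
    def __init__(self):
        self.root = None

    def insert(self, key, value=None):
        self.root = self._insert(self.root, key, value)

    def _insert(self, node, key, value):
        if node is None:
            return AANode(key, value)
        if key < node.key:
            node.left = self._insert(node.left, key, value)
        elif key > node.key:
            node.right = self._insert(node.right, key, value)
        else:
            node.value = value
            return node
        return split(skew(node))

    def delete(self, key):
        self.root = self._delete(self.root, key)

    def _delete(self, node, key):
        if node is None:
            return None
        if key < node.key:
            node.left = self._delete(node.left, key)
        elif key > node.key:
            node.right = self._delete(node.right, key)
        else:
            if node.left is None:
                return node.right
            if node.right is None:
                return node.left
            # Two children: copy in the in-order successor, then delete it.
            succ = node.right
            while succ.left:
                succ = succ.left
            node.key, node.value = succ.key, succ.value
            node.right = self._delete(node.right, succ.key)
        # Rebalance: drop levels, then skew/split along the right spine.
        node = self._decrease_level(node)
        node = skew(node)
        if node.right:
            node.right = skew(node.right)
            if node.right.right:
                node.right.right = skew(node.right.right)
        node = split(node)
        if node.right:
            node.right = split(node.right)
        return node

    def _decrease_level(self, node):
        should_be = min(node.left.level if node.left else 0,
                        node.right.level if node.right else 0) + 1
        if should_be < node.level:
            node.level = should_be
            if node.right and should_be < node.right.level:
                node.right.level = should_be
        return node

def height(node):
    return 0 if node is None else 1 + max(height(node.left), height(node.right))

def inorder(node):
    return [] if node is None else inorder(node.left) + [node.key] + inorder(node.right)

tree = AATree()
for k in range(1, 51):
    tree.insert(k)
for k in range(2, 51, 2):
    tree.delete(k)
assert inorder(tree.root) == list(range(1, 51, 2))  # only the odd keys remain, in order
assert height(tree.root) <= 10                      # still logarithmic after 25 deletions
```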
Performance Comparison
Let’s be precise about the trade-offs:
| Metric | AA Tree | Red-Black Tree |
|---|---|---|
| Search | O(log n) | O(log n) |
| Insert | O(log n) | O(log n) |
| Delete | O(log n) | O(log n) |
| Space | O(n) | O(n) |
| Rotations per insert | ~1.5 avg | ~1 avg |
| Implementation complexity | Low | High |
AA trees perform slightly more rotations on average because skew and split are applied somewhat liberally. In practice, benchmarks show AA trees running 5-15% slower than optimized red-black implementations. For most applications, this difference is irrelevant—you’re not going to notice 15% on a logarithmic operation.
The real difference is in implementation and maintenance. A correct red-black tree implementation typically runs 200-400 lines of careful code. An AA tree achieves the same guarantees in under 100 lines.
When to Use AA Trees
Use AA trees when:
- You’re implementing a balanced BST from scratch and correctness matters more than the last bit of performance
- You’re in an educational context teaching self-balancing trees
- You’re working in a constrained environment (embedded systems, limited tooling) where debugging complex code is costly
- Your codebase values simplicity and you can’t just use the standard library
Consider alternatives when:
- You’re using a language with a battle-tested standard library (just use `std::map`, `TreeMap`, etc.)
- You need maximum read performance and can tolerate slower writes (AVL trees maintain stricter balance)
- You’re implementing a concurrent data structure where the specific rotation patterns matter for lock-free algorithms
The honest truth is that most developers should never implement their own balanced BST. But when you must—for learning, for specialized requirements, or because you’re building infrastructure—AA trees offer the best complexity-to-capability ratio available.
Andersson gave us a gift in 1993: proof that we don’t need complexity to achieve performance. Sometimes the simplest solution really is the best one.