
AVL Trees



Balanced Binary Tree

The disadvantage of a binary search tree is that its height can be as large as N - 1.
This means that the time needed to perform insertion, deletion, and many other operations can be O(N) in the worst case.
We want a tree with small height.
A binary tree with N nodes has height Ω(log N).
Thus, our goal is to keep the height of a binary search tree O(log N).
Such trees are called balanced binary search trees. Examples are the AVL tree and the red-black tree.



AVL Tree (Georgy Adelson-Velsky and Evgenii Landis’ tree, 1962)

Height of a node:
The height of a leaf is 1. The height of a null pointer is zero.
The height of an internal node is the maximum height of its children plus 1.

Note that this definition of height is different from the one we used previously (there, the height of a leaf was zero).
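
A minimal Python sketch of this height definition; the Node class here is a hypothetical bare BST node introduced only for these examples, not part of any particular library.

class Node:
    """Hypothetical bare BST node used only for these sketches."""
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def height(node):
    # Height of a null pointer is 0; height of a leaf is 1.
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))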
AVL Tree (Cont’d)
An AVL tree is a binary search tree in which, for every node in the tree, the heights of the left and right subtrees differ by at most 1.
Equivalently, an AVL tree maintains a "balance factor" BF(T) = Height(T.right) - Height(T.left) in each node, which is always 0, 1, or -1; i.e., no node's two child subtrees differ in height by more than 1.
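
A minimal sketch of the balance-factor check, reusing the height function from the previous sketch; is_avl is an illustrative helper introduced only for these notes.

def balance_factor(node):
    # BF(T) = Height(T.right) - Height(T.left); defined as 0 for an empty tree.
    if node is None:
        return 0
    return height(node.right) - height(node.left)

def is_avl(node):
    # True if every node's balance factor is -1, 0, or +1.
    if node is None:
        return True
    return (abs(balance_factor(node)) <= 1
            and is_avl(node.left)
            and is_avl(node.right))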



AVL Tree Examples

(a) an AVL tree


(b) not an AVL tree (unbalanced nodes are darkened)



Tracking subtree height of an AVL Tree
Many of the AVL tree operations depend on height.
Height can be computed recursively by walking the tree, but this is too slow.
Instead, each node can keep track of its subtree height as a field (see the sketch below).
In practice, we keep the balance factor instead.
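
A sketch of the cached-height approach, assuming a hypothetical AVLNode class; the helper names h, update_height, and bf are illustrative and are reused in the rotation sketches that follow. (Storing only the balance factor, as the last point suggests, is a further space optimization not shown here.)

class AVLNode:
    """Hypothetical AVL node that caches its subtree height as a field."""
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right
        self.height = 1          # a newly created node is a leaf

def h(node):
    # Cached height: O(1), no tree walk needed.
    return node.height if node is not None else 0

def update_height(node):
    node.height = 1 + max(h(node.left), h(node.right))

def bf(node):
    # Balance factor computed from the cached heights.
    return h(node.right) - h(node.left) if node is not None else 0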



AVL Tree Height
Let x be the root of an AVL tree of height h.
Let N_h denote the minimum number of nodes in an AVL tree of height h.
Clearly, N_i ≥ N_{i-1} by definition.
We have
N_h ≥ N_{h-1} + N_{h-2} + 1 ≥ 2N_{h-2} + 1 > 2N_{h-2}
By repeated substitution, we obtain the general form
N_h > 2^i N_{h-2i}
The boundary conditions are N_1 = 1 and N_2 = 2. This implies that h = O(log N_h).
More precisely, the height of an n-node AVL tree is approximately 1.44 log_2 n.
Thus, many operations (searching, insertion, deletion) on an AVL tree take O(log N) time.
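
The repeated-substitution step can be made explicit; a short derivation in LaTeX (the choice i ≈ h/2 is an assumption made to hit the boundary conditions):

\[
  N_h > 2N_{h-2} > 4N_{h-4} > \cdots > 2^{i} N_{h-2i}.
\]
Taking $i \approx h/2$ so that $h - 2i \in \{1, 2\}$ and using $N_1 = 1$, $N_2 = 2$:
\[
  N_h > 2^{h/2 - 1}
  \quad\Longrightarrow\quad
  h < 2\log_2 N_h + 2 = O(\log N_h).
\]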
Smallest AVL Tree
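
The figure from this slide is not reproduced here. The minimum node counts it illustrates can be tabulated from the recurrence on the previous slide (with equality, N_h = N_{h-1} + N_{h-2} + 1, for the smallest trees); a quick Python sketch:

def min_avl_nodes(h):
    # Minimum number of nodes in an AVL tree of height h:
    # N_h = N_{h-1} + N_{h-2} + 1, with N_1 = 1 and N_2 = 2.
    if h <= 0:
        return 0
    if h == 1:
        return 1
    if h == 2:
        return 2
    return min_avl_nodes(h - 1) + min_avl_nodes(h - 2) + 1

print([min_avl_nodes(h) for h in range(1, 8)])   # [1, 2, 4, 7, 12, 20, 33]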



Rotations

When the tree structure changes (e.g., by an insertion or deletion), we need to transform the tree to restore the AVL tree property.
This is done using single rotations or double rotations.
Since an insertion/deletion involves adding/deleting a single node, it can only increase/decrease the height of some subtree by 1.
Thus, if the AVL tree property is violated at a node x, it means that the heights of left(x) and right(x) differ by exactly 2.
Rotations will be applied to x to restore the AVL tree property.



Insertion
First, insert the new key as a new leaf, just as in an ordinary binary search tree.
Then trace the path from the new leaf towards the root. For each node x encountered, check whether the heights of left(x) and right(x) differ by at most 1.
If yes, proceed to parent(x). If not, restructure by doing either a single rotation or a double rotation [next slide].
For insertion, once we perform a rotation at a node x, we won't need to perform any rotation at any ancestor of x.
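
A minimal recursive sketch of this procedure, assuming the AVLNode and update_height helpers from the earlier sketch; rebalance (the single- or double-rotation step) is an assumed helper that is filled in after the rotation slides.

def insert(node, key):
    # Step 1: ordinary BST insertion of a new leaf.
    if node is None:
        return AVLNode(key)
    if key < node.key:
        node.left = insert(node.left, key)
    else:
        node.right = insert(node.right, key)
    # Step 2: on the way back up towards the root, update the cached
    # height and restore balance where needed.
    update_height(node)
    return rebalance(node)     # sketched after the rotation slides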



AVL Insertion Cases

Consider the lowest node k2 that has now become unbalanced.

The newly inserted (offending) node must lie in one of the following four grandchild subtrees relative to k2: the left child's left subtree (Case 1, LL), the left child's right subtree (Case 2, LR), the right child's left subtree (Case 3, RL), or the right child's right subtree (Case 4, RR):



Right rotation

Basic operation used in AVL trees: a left child could legally have its parent as its right child.



Right rotation
Right rotation (clockwise) fixes Case 1 (LL):
the left child k1 becomes the new parent
the original parent k2 is demoted to the right
k1's original right subtree B (if any) is attached to k2 as its left subtree (see the sketch below)



Right Rotation Example

What is the balance factor of k2 before and after rotating?



Right Rotation Steps

Detach the left child (11)'s right subtree (27) (don't lose it!)
Make the left child (11) the new parent.
Attach the old parent (43) as the right child of the new parent (11).
Attach the new parent (11)'s old right subtree (27) as the left subtree of the old parent (43).
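
Applying the right_rotate sketch to just these three nodes (the figure's other subtrees are omitted) reproduces the steps above; this reuses the hypothetical AVLNode and update_height helpers from the earlier sketches.

root = AVLNode(43, left=AVLNode(11, right=AVLNode(27)))
update_height(root.left)          # fix the cached heights of this hand-built tree
update_height(root)

root = right_rotate(root)
print(root.key)                   # 11: the old left child is the new parent
print(root.right.key)             # 43: the old parent, now the right child
print(root.right.left.key)        # 27: reattached as 43's left subtree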



Left Rotation
Left rotation (counter-clockwise) fixes Case 4 (RR):
the right child k2 becomes the new parent
the original parent k1 is demoted to the left
k2's original left subtree B (if any) is attached to k1 as its right subtree (see the sketch below)



Left Rotation Steps

Detach the right child (65)'s left subtree (51) (don't lose it!)
Make the right child (65) the new parent.
Attach the old parent (43) as the left child of the new parent (65).
Attach the new parent (65)'s old left subtree (51) as the right subtree of the old parent (43).



Problem Cases

A single right rotation does not fix Case 2 (LR).

(Similarly, a single left rotation does not fix Case 3 (RL).)



Left-right Double Rotation

Left-right double rotation fixes Case 2 (LR):

left-rotate k3's left child (i.e., k1), which reduces Case 2 to Case 1
right-rotate k3 to fix Case 1 (see the sketch below)
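
Composed from the two single rotations sketched earlier (a minimal sketch; left_right_rotate is an illustrative name):

def left_right_rotate(k3):
    # Double rotation for Case 2 (LR).
    k3.left = left_rotate(k3.left)   # step 1: reduce Case 2 to Case 1
    return right_rotate(k3)          # step 2: fix Case 1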



Left-right Rotation Example

What is the balance factor of k1, k2, k3 before and after rotating?



Right-left Double Rotation

Right-left double rotation fixes Case 3 (RL):

right-rotate k1's right child (i.e., k3), which reduces Case 3 to Case 4
left-rotate k1 to fix Case 4 (see the sketch below)
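
The mirror-image composition (right_left_rotate is an illustrative name):

def right_left_rotate(k1):
    # Double rotation for Case 3 (RL).
    k1.right = right_rotate(k1.right)  # step 1: reduce Case 3 to Case 4
    return left_rotate(k1)             # step 2: fix Case 4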



Properties of General Insert + Single Rotation

Restores balance at the lowest point in the tree where the imbalance occurs.

After the rotation, the height of the subtree (h + 1 in the example) is the same as it was before the insert that unbalanced it.

Thus, no further rotations are needed anywhere in the tree!
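
The rebalance helper assumed in the earlier insert sketch can now be filled in; a sketch that dispatches on the four cases using the rotations above (bf reads the cached heights from the earlier AVLNode sketch).

def rebalance(node):
    # Restore the AVL property at `node`, assuming heights below it are correct.
    balance = bf(node)                    # Height(right) - Height(left)
    if balance < -1:                      # left subtree is taller by 2
        if bf(node.left) <= 0:
            return right_rotate(node)     # Case 1 (LL); bf == 0 only after deletion
        return left_right_rotate(node)    # Case 2 (LR): double rotation
    if balance > 1:                       # right subtree is taller by 2
        if bf(node.right) >= 0:
            return left_rotate(node)      # Case 4 (RR); bf == 0 only after deletion
        return right_left_rotate(node)    # Case 3 (RL): double rotation
    return node                           # already balanced: nothing to do

With this in place, the insert sketch from the Insertion slide is complete; as this slide notes, during an insertion at most one rotation (single or double) actually fires, even though rebalance is called at every ancestor on the way back up.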



Deletion (Really Easy Case)



Deletion (Pretty Easy Case)



Deletion (Pretty Easy Case cont’d)



Deletion (Hard Case #1)



Single Rotation on Deletion

What is different about deletion compared to insertion?



Deletion (Hard Case)



Double Rotation on Deletion



Deletion with Propagation



Propagated Single Rotation



Propagated Double Rotation

Imbalance may propagate upward, so many rotations may be needed.
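
A sketch of deletion with bottom-up rebalancing, reusing the helpers from the earlier sketches; unlike insertion, rebalance may perform a rotation at many ancestors on the way back up, which is exactly the propagation illustrated on these slides.

def delete(node, key):
    # Step 1: ordinary BST deletion.
    if node is None:
        return None
    if key < node.key:
        node.left = delete(node.left, key)
    elif key > node.key:
        node.right = delete(node.right, key)
    else:
        if node.left is None:
            return node.right              # zero or one child: splice out
        if node.right is None:
            return node.left
        succ = node.right                  # two children: copy the in-order
        while succ.left is not None:       # successor's key, then delete it
            succ = succ.left
        node.key = succ.key
        node.right = delete(node.right, succ.key)
    # Step 2: heights shrink on the way back up, so the imbalance can
    # propagate; rebalancing may be needed at every level.
    update_height(node)
    return rebalance(node)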



Pros and Cons of AVL Trees

1 Search is O(log N) since AVL trees are always balanced.
2 Insertions and deletions are also O(log N).
3 The height balancing adds no more than a constant factor to the speed of insertion.
Arguments against using AVL trees:
1 Difficult to program and debug; extra space is needed for the balance factor.
2 Asymptotically faster, but rebalancing still costs time.
3 Most large searches are done in database systems on disk and use other structures (e.g., B-trees).
4 It may be OK to have O(N) for a single operation if the total run time for many consecutive operations is fast (e.g., Splay trees).
Can we guarantee O(log N) performance with less overhead?

