Unit 2 Notes
SEARCHING
Using the procedure TREE-SEARCH, we can search for a node with a given key.
TREE-SEARCH(x, k) returns a pointer to a node with key k if one exists in the subtree
rooted at x; otherwise it returns NIL.
TREE-SEARCH(x, k)
1 if x == NIL or k == x.key
2 return x
3 if k < x.key
4 return TREE-SEARCH(x.left, k)
5 else return TREE-SEARCH(x.right, k)
ITERATIVE-TREE-SEARCH(x, k)
1 while x ≠ NIL and k ≠ x.key
2 if k < x.key
3 x = x.left
4 else x = x.right
5 return x
The TREE-SEARCH procedure begins its search at the root. For each node x it compares
the key k with x.key. If the two keys are equal, the search terminates. If k is smaller than
x.key, the search continues in the left subtree of x; if k is larger than x.key, the search
continues in the right subtree of x. The nodes encountered during the recursion form a
simple path downward from the root of the tree, and thus the running time of TREE-
SEARCH is O(h), where h is the height of the tree. On most computers, the ITERATIVE-
TREE-SEARCH procedure is more efficient than TREE-SEARCH, since it avoids the
overhead of the recursive calls.
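The iterative search above can be sketched in Python as follows; the Node class and the sample tree are illustrative scaffolding, not from the text.

```python
# A minimal sketch of ITERATIVE-TREE-SEARCH on a hand-built BST.

class Node:
    def __init__(self, key, left=None, right=None):
        self.key = key
        self.left = left
        self.right = right

def tree_search(x, k):
    """Return the node with key k in the subtree rooted at x, or None."""
    while x is not None and k != x.key:
        x = x.left if k < x.key else x.right
    return x

# Usage: the BST   6
#                 / \
#                2   15
#                 \
#                  4
root = Node(6, Node(2, None, Node(4)), Node(15))
print(tree_search(root, 4).key)   # -> 4
print(tree_search(root, 9))       # -> None
```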
TREE-MINIMUM(x)
1 while x.left ≠ NIL
2 x = x.left
3 return x
TREE-MAXIMUM(x)
1 while x.right ≠ NIL
2 x = x.right
3 return x
In this binary search tree, the minimum value 2 is
found by following left children from the root. Since
node 2 has no left child, the procedure stops and
returns the minimum value. Similarly, the maximum
value 20 is found by following right children. Since
node 20 has no right child, the procedure stops and
returns the maximum value.
The predecessor of node x is the node with the largest key smaller than x.key.
The predecessor of a node is the previous node visited in an inorder tree walk. The
TREE- PREDECESSOR procedure returns the predecessor of a node x in a binary search
tree if it exists, or NIL if x is the first node in the inorder walk. The code for TREE-
PREDECESSOR has two cases:
➢ If the left subtree of node x is nonempty, then the predecessor of x is just the
rightmost node in x’s left subtree. The predecessor of node 6 is 4, which is
the rightmost node of the left subtree.
➢ If the left subtree of node x is empty, then the predecessor of x is found by
going up the tree from x until we reach either the root node or a node that is
the right child of its parent.
The TREE-PREDECESSOR procedure runs in O(h) time.
TREE-SUCCESSOR(x)
1 if x.right ≠ NIL
2 return TREE-MINIMUM(x.right) // leftmost node in right subtree
3 else // find the lowest ancestor of x whose left child is an ancestor of x
4 y = x.p
5 while y ≠ NIL and x == y.right
6 x = y
7 y = y.p
8 return y
TREE-PREDECESSOR(x)
1 if x.left ≠ NIL
2 return TREE-MAXIMUM(x.left) // rightmost node in left subtree
3 else // find the lowest ancestor of x whose right child is an ancestor of x
4 y = x.p
5 while y ≠ NIL and x == y.left
6 x = y
7 y = y.p
8 return y
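The successor pseudocode above can be sketched in Python; the Node class, the attach helper, and the sample tree are illustrative, not from the text.

```python
# A sketch of TREE-MINIMUM / TREE-SUCCESSOR using parent pointers (x.p).

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.p = None

def attach(parent, child, side):
    """Hang child under parent on the given side ('left' or 'right')."""
    setattr(parent, side, child)
    child.p = parent
    return child

def tree_minimum(x):
    while x.left is not None:
        x = x.left
    return x

def tree_successor(x):
    if x.right is not None:
        return tree_minimum(x.right)      # leftmost node in right subtree
    y = x.p                               # climb until x is a left child
    while y is not None and x is y.right:
        x = y
        y = y.p
    return y

# Usage: build   6
#               / \
#              2   15
#               \
#                4
root = Node(6)
n2 = attach(root, Node(2), "left")
n15 = attach(root, Node(15), "right")
n4 = attach(n2, Node(4), "right")
print(tree_successor(n4).key)    # 4's successor is 6
print(tree_successor(n15))       # 15 is the maximum -> None
```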
INSERTION
The TREE-INSERT procedure inserts a new node into a binary search tree. The
procedure takes a binary search tree T and a node z for which z.key has already been
filled in and z.left = NIL and z.right = NIL. It modifies T and some of the attributes of z
so as to insert z into an appropriate position in the tree.
TREE-INSERT (T,z)
1 x = T.root // node being compared with z
2 y = NIL // y will be parent of z
3 while x ≠ NIL // descend until reaching a leaf
4 y = x
5 if z.key < x.key
6 x = x.left
7 else x = x.right
8 z.p = y // found the location; insert z with parent y
9 if y == NIL
10 T.root = z // tree T was empty
11 elseif z.key < y.key
12 y.left = z
13 else y.right = z
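The TREE-INSERT pseudocode above translates almost line for line into Python; the Tree/Node classes are illustrative stand-ins for T and z.

```python
# A direct transcription of TREE-INSERT.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.p = None

class Tree:
    def __init__(self):
        self.root = None

def tree_insert(T, z):
    x = T.root              # node being compared with z
    y = None                # y will be the parent of z
    while x is not None:    # descend until reaching a leaf
        y = x
        x = x.left if z.key < x.key else x.right
    z.p = y                 # found the location: insert z with parent y
    if y is None:
        T.root = z          # tree T was empty
    elif z.key < y.key:
        y.left = z
    else:
        y.right = z

# Usage: inserting 6, 2, 15, 4 builds the small tree used earlier.
T = Tree()
for k in [6, 2, 15, 4]:
    tree_insert(T, Node(k))
print(T.root.key, T.root.left.key, T.root.right.key, T.root.left.right.key)
# -> 6 2 15 4
```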
DELETION
There are three cases for deleting a node z from a binary search tree.
1. If z has no children, remove it by setting the appropriate child pointer of z’s
parent to NIL.
2. If z has one child, replace z by that child: z’s parent is updated to point to
z’s child, and the child takes z’s position.
3. If z has two children, find y, the successor of z (y lies in z’s right subtree
and has no left child).
a. If y is z’s right child (y = r), replace z by y; y keeps its original right
subtree, and z’s original left subtree becomes y’s left subtree.
b. If y is not z’s right child (y ≠ r), first replace y by its own right child x
and make y the parent of r, z’s right subtree. Then put y in z’s position:
z’s parent q becomes y’s parent, and z’s left subtree l becomes y’s left
subtree.
The subroutine TRANSPLANT replaces one subtree as a child of its parent with
another subtree. When TRANSPLANT replaces the subtree rooted at node u with the
subtree rooted at node v, node u’s parent becomes node v’s parent, and u’s parent ends up
having v as its appropriate child. TRANSPLANT allows v to be NIL instead
of a pointer to a node.
TRANSPLANT (T, u, v)
1 if u.p == NIL
2 T.root = v
3 elseif u == u.p.left
4 u.p.left = v
5 else u.p.right = v
6 if v ≠ NIL
7 v.p = u.p
TREE-DELETE (T, z)
1 if z.left == NIL
2 TRANSPLANT (T, z, z.right) // replace z by its right child
3 elseif z.right == NIL
4 TRANSPLANT (T,z,z.left) // replace z by its left child
5 else y = TREE-MINIMUM(z.right) // y is z’s successor
6 if y ≠ z.right // is y farther down the tree?
7 TRANSPLANT (T, y, y.right) // replace y by its right child
8 y.right = z.right // z’s right child becomes
9 y.right .p = y // y’s right child
10 TRANSPLANT (T, z, y) // replace z by its successor y
11 y.left = z.left // and give z’s left child to y,
12 y.left.p = y // which had no left child
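The TRANSPLANT and TREE-DELETE pseudocode above can be sketched in Python; the Node/Tree classes, the minimal insert used to build the example, and the inorder check are illustrative scaffolding.

```python
# A sketch of TRANSPLANT and TREE-DELETE on a BST with parent pointers.

class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.p = None

class Tree:
    def __init__(self):
        self.root = None

def insert(T, k):                       # minimal TREE-INSERT to build the example
    z, y, x = Node(k), None, T.root
    while x is not None:
        y = x
        x = x.left if k < x.key else x.right
    z.p = y
    if y is None:
        T.root = z
    elif k < y.key:
        y.left = z
    else:
        y.right = z

def minimum(x):
    while x.left is not None:
        x = x.left
    return x

def transplant(T, u, v):
    """Replace the subtree rooted at u with the subtree rooted at v."""
    if u.p is None:
        T.root = v
    elif u is u.p.left:
        u.p.left = v
    else:
        u.p.right = v
    if v is not None:
        v.p = u.p

def tree_delete(T, z):
    if z.left is None:
        transplant(T, z, z.right)       # replace z by its right child
    elif z.right is None:
        transplant(T, z, z.left)        # replace z by its left child
    else:
        y = minimum(z.right)            # y is z's successor
        if y is not z.right:            # is y farther down the tree?
            transplant(T, y, y.right)   # replace y by its right child
            y.right = z.right
            y.right.p = y
        transplant(T, z, y)             # replace z by its successor y
        y.left = z.left
        y.left.p = y

def inorder(x, out):
    if x is not None:
        inorder(x.left, out)
        out.append(x.key)
        inorder(x.right, out)
    return out

# Usage: delete a node with two children.
T = Tree()
for k in [6, 2, 15, 4, 10, 20]:
    insert(T, k)
tree_delete(T, T.root)                  # delete 6; its successor 10 takes its place
print(inorder(T.root, []))              # -> [2, 4, 10, 15, 20]
```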
A red-black tree is a binary search tree that satisfies the following red-black properties:
1. Every node is either red or black.
2. The root is black.
3. Every leaf (NIL) is black.
4. If a node is red, then both its children are black.
5. For each node, all simple paths from the node to descendant leaves contain
the same number of black nodes.
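The payoff of these properties is a height bound; the standard argument, not spelled out in the notes, can be summarized as follows (bh(x) denotes the black-height of node x):

```latex
\documentclass{article}
\begin{document}
By induction on the tree, the subtree rooted at any node $x$ contains at
least $2^{\mathrm{bh}(x)} - 1$ internal nodes. By property~4, a red node
has black children, so at least half of the nodes on any simple path from
the root to a leaf are black; hence $\mathrm{bh}(\mathrm{root}) \ge h/2$,
where $h$ is the height of the tree. Therefore
\[
  n \;\ge\; 2^{h/2} - 1
  \quad\Longrightarrow\quad
  h \;\le\; 2\lg(n + 1),
\]
so searching, insertion, and deletion on a red-black tree all run in
$O(\lg n)$ time.
\end{document}
```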
Insertion Logic
Insert the node as in a binary search tree and color it red. If the node is the
root, change its color to black. Otherwise, check the color of the parent node.
If the parent is black, don’t change any colors; if the parent is red, check the color
of the node’s uncle. If the node’s uncle is red, change the color of the node’s parent
and uncle to black and the grandparent to red (unless the grandparent is the root,
in which case it stays black).
If the node’s uncle is black, then there are four possible rotation cases:
1. Left Left (LL) Rotation
2. Left Right (LR) Rotation
3. Right Right (RR) Rotation
4. Right Left (RL) Rotation
Since the first element, 3, is the root node, its color is changed to black.
Since a red-black tree is a binary search tree, the new key 21 is compared with 3 and
inserted to its right; as a new node, it is colored red.
32 is inserted to the right of 21 and colored red. Two consecutive red nodes are not
allowed, so an RR rotation and a recoloring are needed to balance the tree. After rotating,
21 becomes the root node, which is red. But a root node cannot be red, so it is recolored
black, and 3, which was the grandparent of 32, is recolored red.
15 is inserted to the right of 3 and colored red. Again, two consecutive red nodes are not
allowed; since the uncle node is red, the parent and the uncle are recolored black.
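The insertion logic above (red uncle: recolor; black uncle: rotate) can be sketched in Python. This is a CLRS-style implementation with a NIL sentinel; the class and method names are illustrative, not from the text.

```python
# A hedged sketch of red-black insertion with recolor/rotate fixup.

RED, BLACK = "red", "black"

class RBNode:
    def __init__(self, key, color, nil=None):
        self.key, self.color = key, color
        self.left = self.right = self.p = nil

class RBTree:
    def __init__(self):
        self.nil = RBNode(None, BLACK)      # sentinel leaf, always black
        self.root = self.nil

    def left_rotate(self, x):
        y = x.right                          # y moves up, x becomes y's left child
        x.right = y.left
        if y.left is not self.nil:
            y.left.p = x
        y.p = x.p
        if x.p is self.nil:
            self.root = y
        elif x is x.p.left:
            x.p.left = y
        else:
            x.p.right = y
        y.left = x
        x.p = y

    def right_rotate(self, x):               # mirror image of left_rotate
        y = x.left
        x.left = y.right
        if y.right is not self.nil:
            y.right.p = x
        y.p = x.p
        if x.p is self.nil:
            self.root = y
        elif x is x.p.right:
            x.p.right = y
        else:
            x.p.left = y
        y.right = x
        x.p = y

    def insert(self, key):
        z = RBNode(key, RED, self.nil)       # new nodes start out red
        y, x = self.nil, self.root
        while x is not self.nil:
            y = x
            x = x.left if key < x.key else x.right
        z.p = y
        if y is self.nil:
            self.root = z
        elif key < y.key:
            y.left = z
        else:
            y.right = z
        self._fixup(z)

    def _fixup(self, z):
        while z.p.color == RED:
            if z.p is z.p.p.left:
                u = z.p.p.right              # uncle
                if u.color == RED:           # red uncle: recolor and move up
                    z.p.color = u.color = BLACK
                    z.p.p.color = RED
                    z = z.p.p
                else:
                    if z is z.p.right:       # black uncle, zig-zag (LR case)
                        z = z.p
                        self.left_rotate(z)
                    z.p.color = BLACK        # black uncle, line (LL case)
                    z.p.p.color = RED
                    self.right_rotate(z.p.p)
            else:                            # mirror cases (RR and RL)
                u = z.p.p.left
                if u.color == RED:
                    z.p.color = u.color = BLACK
                    z.p.p.color = RED
                    z = z.p.p
                else:
                    if z is z.p.left:
                        z = z.p
                        self.right_rotate(z)
                    z.p.color = BLACK
                    z.p.p.color = RED
                    self.left_rotate(z.p.p)
        self.root.color = BLACK              # the root is always black

# Usage: the example from the text -- insert 3, 21, 32, 15.
t = RBTree()
for k in [3, 21, 32, 15]:
    t.insert(k)
print(t.root.key, t.root.color)              # -> 21 black
```

Inserting 32 triggers the RR rotation described above (21 becomes the root and is recolored black, 3 becomes red), and inserting 15 triggers the red-uncle recoloring.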
DELETION
Deletion of a node in a red-black tree takes O(lg n) time. Deleting a node from a
red-black tree is more complicated than inserting a node. The procedure for deleting a
node from a red-black tree is based on the TREE-DELETE procedure. RB-DELETE
and RB-DELETE-FIXUP are used to perform deletion in a red-black tree.
RB-DELETE (T, z)
1 y = z
2 y-original-color = y.color
3 if z.left == T.nil
4 x = z.right
5 RB-TRANSPLANT (T, z, z.right) // replace z by its right child
6 elseif z.right == T.nil
7 x = z.left
8 RB-TRANSPLANT (T, z, z.left) // replace z by its left child
9 else y = TREE-MINIMUM (z.right) // y is z’s successor
10 y-original-color = y.color
11 x = y.right
12 if y ≠ z.right // is y farther down the tree?
13 RB-TRANSPLANT (T, y, y.right) // replace y by its right child
14 y.right = z.right
15 y.right.p = y
16 else x.p = y // in case x is T.nil
17 RB-TRANSPLANT (T, z, y) // replace z by its successor y
18 y.left = z.left
19 y.left.p = y
20 y.color = z.color
21 if y-original-color == BLACK // if any red-black violations occurred,
22 RB-DELETE-FIXUP (T, x) // correct them
RB-DELETE-FIXUP (T, x)
1 while x ≠ T.root and x.color == BLACK
2 if x == x.p.left
3 w = x.p.right // w is x’s sibling
4 if w.color == RED
5 w.color = BLACK
6 x.p.color = RED
7 LEFT-ROTATE(T, x.p)
8 w = x.p.right
9 if w.left.color == BLACK and w.right.color == BLACK
10 w.color = RED
11 x = x.p
12 else
13 if w.right.color == BLACK
14 w.left.color = BLACK
15 w.color = RED
16 RIGHT-ROTATE(T, w)
17 w = x.p.right
18 w.color = x.p.color
19 x.p.color = BLACK
20 w.right.color = BLACK
21 LEFT-ROTATE(T, x.p)
22 x = T.root
23 else // same as lines 3-22, but with “right” and “left” exchanged
24 w = x.p.left
25 if w.color == RED
26 w.color = BLACK
27 x.p.color = RED
28 RIGHT-ROTATE(T, x.p)
29 w = x.p.left
30 if w.right.color == BLACK and w.left.color == BLACK
31 w.color = RED
32 x = x.p
33 else
34 if w.left.color == BLACK
35 w.right.color = BLACK
36 w.color = RED
37 LEFT-ROTATE(T, w)
38 w = x.p.left
39 w.color = x.p.color
40 x.p.color = BLACK
41 w.left.color = BLACK
42 RIGHT-ROTATE(T, x.p)
43 x = T.root
44 x.color = BLACK
Example:
Case 3: If the double-black node is not a root node, then check the sibling of the
double-black node.
If the double-black node’s sibling and the sibling’s children are black:
Delete 20
The double black’s children are black and its sibling 15 is also black, then
a. Remove the double black
b. Add black to the parent, i.e., 30, which is red, becomes black
c. Make the sibling red, i.e., 15 becomes red
Now 15 is double black and its children are black, so it again becomes a double black.
Reapply the cases, since a double black still exists; it again follows case 3:
● Move the double black to the parent 20, which is red and becomes black
● Make the sibling 25 red and remove the node to be deleted
Still a double black exists, in node 10. But it is the root, so just remove it.
Case 5: If the double black’s sibling is black and the sibling’s children are not both
black, i.e., one is red and the red child is near the double black:
Delete 1
The double black’s sibling 7 is black. So move the double black to the parent 5, make
the sibling 7 red, and remove the NIL node.
Still a double black exists. Now the double-black node is 5; the sibling of node 5 is
black, and its children are not both black: one of its children is red and it is near the
double black.
Action 1: Swap the colors of the sibling 30 and the sibling’s child 25.
Action 2: Rotate in the direction opposite to the double black.
This now follows Case 6, since the double-black node’s sibling is black and one of its
children is red and far away from the double-black node.
Action 1: Swap the colors of the parent and the sibling of the double black.
Since the parent of the double black, 10, is black and the sibling 25 is also black, this
doesn’t make any change.
BINARY TREES
CS8391 DATA STRUCTURES
TERMINOLOGY
Parent
If N is any node in T that has left successor S1 and right successor S2, then N is
called the parent of S1 and S2. Correspondingly, S1 and S2 are called the left child and
the right child of N. Every node other than the root node has a parent.
Level number
Every node in the binary tree is assigned a level number. The root node is defined
to be at level 0. The left and the right child of the root node have a level number 1.
Similarly, every node is at a level one higher than its parent, so all child nodes have a
level number equal to their parent’s level number + 1.
Degree of a node
The degree of a node is equal to the number of children it has. The degree of a leaf
node is zero. For example, in the tree above, the degree of node 4 is 2, the degree of
node 5 is zero, and the degree of node 7 is 1.
In-degree
The in-degree of a node is the number of edges arriving at that node. In a tree, every
node except the root has an in-degree of 1.
Out-degree
The out-degree of a node is the number of edges leaving that node, which equals the
number of its children.
Path
A path is a sequence of consecutive edges. For example, in the figure above, the path
from the root node to node 8 is given as: 1, 2, 4, and 8.
Sibling
All nodes that are at the same level and share the same parent are called siblings
(brothers). For example, nodes 2 and 3; nodes 4 and 5; nodes 6 and 7; nodes 8 and 9; and
nodes 10 and 11 are siblings.
Leaf node
A node that has no children is called a leaf node or a terminal node. The leaf
nodes in the tree are: 8, 9, 5, 10, 11, and 12.
Similar binary trees
Two binary trees T and T’ are said to be similar if both these trees have the same
structure. Figure shows two similar binary trees.
Edge
It is the line connecting a node N to any of its successors. A binary tree of n nodes
has exactly n – 1 edges because every node except the root node is connected to its
parent via an edge.
Depth
The depth of a node N is given as the length of the path from the root R to the
node N. The depth of the root node is zero.
Height of a tree
It is the total number of nodes on the path from the root node to the deepest node
in the tree. A tree with only a root node has a height of 1.
COMPLETE BINARY TREES
A complete binary tree is a binary tree that satisfies two properties. First, in
a complete binary tree, every level, except possibly the last, is completely
filled. Second, all nodes appear as far left as possible.
In a complete binary tree Tn, there are exactly n nodes, and level r of Tn can
have at most 2^r nodes.
Figure shows a complete binary tree.
The formula can be given as: if K is a parent node, then its left child can be
calculated as 2 × K and its right child as 2 × K + 1.
For example, the children of the node 4 are 8 (2 × 4) and 9 (2 × 4 + 1). Similarly,
the parent of the node K can be calculated as ⌊K/2⌋. Given the node 4, its parent
can be calculated as ⌊4/2⌋ = 2.
The height of a complete binary tree Tn having exactly n nodes is given as: Hn = ⌈log2 (n + 1)⌉
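The index and height formulas above can be checked with a few lines of Python (the helper names are illustrative; indexing is 1-based, as in the text):

```python
# Child/parent index arithmetic for a complete binary tree stored in an
# array with 1-based indexing, plus the height formula.

import math

def left(k):   return 2 * k            # left child of node K
def right(k):  return 2 * k + 1        # right child of node K
def parent(k): return k // 2           # floor(K/2)

def height(n):                         # height counted in nodes
    return math.ceil(math.log2(n + 1))

print(left(4), right(4))   # -> 8 9
print(parent(4))           # -> 2
print(height(11))          # -> 4  (matches the 11-node tree of height 4)
```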
NOTE: In Fig. 9.7, level 0 has 2^0 = 1 node, level 1 has 2^1 = 2 nodes, level 2 has
2^2 = 4 nodes, and level 3 has 6 nodes, which is less than the maximum of 2^3 = 8 nodes.
Extended Binary Trees
A binary tree T is said to be an extended binary tree (or a 2-tree) if each node in
the tree has either no child or exactly two children.
In an extended binary tree, nodes having two children are called internal nodes and
nodes having no children are called external nodes.
In Given Fig., the internal nodes are represented using circles and the external
nodes are represented using squares.
To convert a binary tree into an extended tree, every empty sub-tree is replaced by
a new node. The original nodes in the tree are the internal nodes, and the new
nodes added are called the external nodes. Below Figure shows how an ordinary
binary tree is converted into an extended binary tree.
Representation of Binary Trees in the Memory
In the linked representation, every node contains three parts: a pointer to the left
child, the data, and a pointer to the right child. In C, a node can be declared as:
struct node {
    struct node *left;
    int data;
    struct node *right;
};
Every binary tree has a pointer ROOT, which points to the root element
(topmost element) of the tree.
If ROOT = NULL, then the tree is empty.
Consider the binary tree given. The schematic diagram of the linked
representation of the binary tree is shown in Fig.
In Fig, the left position is used to point to the left child of the node or to store the
address of the left child of the node. The middle position is used to store the data.
Finally, the right position is used to point to the right child of the node or to store the
address of the right child of the node. Empty sub-trees are represented using X (meaning
NULL).
Look at the tree given. Note how this tree is represented in the main memory
using a linked list
The maximum size of the array TREE is given as (2^h – 1), where h is the height of
the tree.
An empty tree or sub-tree is specified using NULL. If TREE[1] = NULL, then the
tree is empty.
This shows a binary tree and its corresponding sequential representation. The tree
has 11 nodes and its height is 4
HEAPS
BINARY HEAPS
A binary heap is a complete binary tree in which every node satisfies the heap property.
In a max heap, the element at every node is greater than or equal to the elements at its left and
right children; thus, the root node has the highest key value in the heap.
Alternatively, in a min heap, the element at every node is less than or equal to the element at its
left and right child; thus, the root has the lowest key value.
The properties of binary heaps are given as follows:
Since a heap is defined as a complete binary tree, all its elements can be stored sequentially in an
array. It follows the same rules as a complete binary tree. That is, if an element is at position
i in the array, then its left child is stored at position 2i and its right child at position 2i + 1.
Being a complete binary tree, all the levels of the tree except the last level are completely filled.
The height of a binary heap is ⌊log2 n⌋, where n is the number of elements.
Heaps (also known as partially ordered trees) are a very popular data structure for implementing priority
queues.
OPERATIONS:
1. Insertion
2.Deletion
Consider a max heap H with n elements. Inserting a new value into the heap is done in the following
two steps:
1. Add the new value at the bottom of H in such a way that H is still a complete binary tree but not necessarily
a heap.
2. Let the new value rise to its appropriate place in H so that H now becomes a heap as well. To do this,
compare the new value with its parent to check if they are in the correct order. If they are, then the procedure
halts, else the new value and its parent’s value are
ROHINI COLLEGE OF ENGINEERING & TECHNOLOGY
The first step says to insert the element into the heap so that the heap remains a complete binary
tree. So, insert the new value as the right child of node 27 in the heap, as illustrated in the figure.
Now, as per the second step, let the new value rise to its appropriate place in H so that H becomes
a heap as well. Compare 99 with its parent node’s value: if it is less than its parent’s value, then
the new node is in its appropriate place and H is a heap; if the new value is greater than its parent’s
value, then swap the two values. Repeat the whole process until H becomes a heap. This is
illustrated in the figure.
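The two-step insertion above can be sketched in Python, using a 1-based array (index 0 unused) so the child/parent positions are 2i, 2i + 1, and ⌊i/2⌋; the function name is illustrative.

```python
# Max-heap insertion: append at the bottom, then let the value rise.

def heap_insert(heap, value):
    heap.append(value)                        # step 1: add at the next free slot
    i = len(heap) - 1
    while i > 1 and heap[i] > heap[i // 2]:   # step 2: sift up while larger
        heap[i], heap[i // 2] = heap[i // 2], heap[i]
        i //= 2

# Usage: build a max heap from the numbers of Example 2.
H = [None]                                    # index 0 is a placeholder
for v in [45, 36, 54, 27, 63, 72, 61, 18]:
    heap_insert(H, v)
print(H[1])                                   # the root holds the largest key -> 72
```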
Example 2 Build a max heap H from the given set of numbers: 45, 36, 54, 27, 63, 72, 61, and 18.
Solution
Consider a max heap H having n elements. An element is always deleted from the root of the heap.
So, deleting an element from the heap is done in the following three steps:
1. Replace the root node’s value with the last node’s value so that H is still a complete binary tree but not
necessarily a heap.
2. Delete the last node.
3. Sink down the new root node’s value so that H satisfies the heap property. In this step, interchange the root
node’s value with the value of the larger of its two children.
Here, the value of the root node = 54 and the value of the last node = 11. So, replace 54 with 11 and delete
the last node.
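The three deletion steps above can be sketched in Python on the same 1-based array layout; the function name and the sample heap are illustrative.

```python
# Max-heap deletion: replace the root with the last element, drop the
# last slot, then sink the new root value.

def heap_delete_max(heap):                    # heap is 1-based; index 0 unused
    largest = heap[1]
    heap[1] = heap[-1]                        # step 1: move last value to the root
    heap.pop()                                # step 2: delete the last node
    i, n = 1, len(heap) - 1
    while True:                               # step 3: sink the new root value
        l, r, big = 2 * i, 2 * i + 1, i
        if l <= n and heap[l] > heap[big]:
            big = l
        if r <= n and heap[r] > heap[big]:
            big = r
        if big == i:
            break
        heap[i], heap[big] = heap[big], heap[i]
        i = big
    return largest

# Usage: the example values (root 54, last node 11).
H = [None, 54, 45, 36, 27, 30, 11]
print(heap_delete_max(H))                     # -> 54
print(H[1])                                   # new root after sinking -> 45
```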
Example 1 Consider the max heap H shown in Fig. 12.8 and delete the root node’s value.
Solution
A priority queue is similar to a queue in which an item is dequeued (or removed) from the front. However,
unlike a regular queue, in a priority queue the logical order of elements is determined by their priority. While
the higher priority elements are added at the front of the queue, elements with lower priority are added at the
rear.
Though we can easily implement priority queues using a linear array, we should first consider the time
required to insert an element in the array and then sort it. We need O(n) time to insert an element and at
least O(n log n) time to sort the array. Therefore, a better way to implement a priority queue is to use a
binary heap, which allows both enqueue and dequeue of elements in O(log n) time.
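The point above can be demonstrated with Python's built-in binary heap, the `heapq` module (a min-heap, so the smallest priority value is served first); the task names are illustrative.

```python
# A priority queue on a binary heap: O(log n) enqueue and dequeue.

import heapq

pq = []
heapq.heappush(pq, (2, "write report"))
heapq.heappush(pq, (1, "fix outage"))      # priority 1 = most urgent
heapq.heappush(pq, (3, "reply to email"))

print(heapq.heappop(pq))   # -> (1, 'fix outage')
print(heapq.heappop(pq))   # -> (2, 'write report')
```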
BINOMIAL HEAPS
A binomial heap H is a set of binomial trees that satisfy the binomial heap properties. First, let us define a binomial tree.
A binomial tree is an ordered tree that can be recursively defined as follows:
• A binomial tree of order 0 consists of a single node.
• A binomial tree of order i has a root node whose children are the root nodes of binomial trees of orders i–1, i–2, ...,
2, 1, and 0.
Look at the figure, which shows a few binomial trees of different orders. We can construct a binomial tree
Bi from two binomial trees Bi–1 by linking them together in such a way that the root of one becomes the
leftmost child of the root of the other.
A binomial heap H is a collection of binomial trees that satisfy the following properties:
• Every binomial tree in H satisfies the minimum heap property (i.e., the key of a node is greater than or equal
to the key of its parent).
• There can be at most one binomial tree of each order, including order zero.
According to the first property, the root of a heap-ordered tree contains the smallest key in the tree. The
second property, on the other hand, implies that a binomial heap with n nodes consists of at most
⌊log2 n⌋ + 1 binomial trees.
FIBONACCI HEAPS
A Fibonacci heap is a collection of trees. It is loosely based on binomial heaps. If neither the decrease-key
nor the delete operation is ever performed, each tree in the heap is like a binomial tree. Fibonacci heaps differ
from binomial heaps in that they have a more relaxed structure, allowing for improved amortized running times.
Although a Fibonacci heap is a collection of heap-ordered trees, the trees in a Fibonacci heap are not constrained
to be binomial trees. That is, while the trees in a binomial heap are ordered, those within Fibonacci heaps are
rooted but unordered.
Applications of Heaps
Heap Implemented priority queues are used in Graph algorithms like Prim’s Algorithm and Dijkstra’s algorithm.
Order statistics: The Heap data structure can be used to efficiently find the kth smallest (or largest) element in an array.
⚫ The children of each node are connected with a circular, doubly linked list, and the roots of the trees
are again connected with a circular, doubly linked list using their left and right pointers
⚫ A Fibonacci heap H is defined by
⚫ A pointer min[H] which points to the root of a tree containing the minimum element (minimum
node of the heap)
⚫ A variable n[H] storing the number of elements in the heap
[Figure: a Fibonacci heap with root list 23, 7, 3, 17, 24 and min[H] pointing to 3;
node 3 has children 18, 52, and 38 (18’s child is 39 and 38’s child is 41), node 17
has child 30, and node 24 has children 26 and 46 (26’s child is 35).]
Additional Variables
⚫ Each node x also has two other fields
⚫ degree[x] – stores the number of children of x
⚫ mark[x] – indicates whether x has lost a child since the last time x was made the child of another
node
⚫ We will denote marked nodes by color black, and unmarked ones by color grey
⚫ A newly created node is unmarked
⚫ A marked node also becomes unmarked whenever it is made the child of another node
Amortized Analysis
⚫ We mentioned Fibonacci heap is an amortized data structure
⚫ We will use the potential method to analyse
⚫ Let t(H) = no. of trees in a Fibonacci heap H
⚫ Let m(H) = number of marked nodes in H
⚫ Potential function used: Φ(H) = t(H) + 2m(H)
Operations
⚫ Create an empty Fibonacci heap
⚫ Insert an element in a Fibonacci heap
⚫ Merge two Fibonacci heaps (Union)
⚫ Extract the minimum element from a Fibonacci heap
⚫ Decrease the value of an element in a Fibonacci heap
⚫ Delete an element from a Fibonacci heap
Creating a Fibonacci Heap
⚫ This creates an empty Fibonacci heap
⚫ Create an object to store min[H] and n[H]
⚫ Initialize min[H] = NIL and n[H] = 0
⚫ Potential of the newly created heap Φ(H) = 0
⚫ Amortized cost = actual cost = O(1)
Inserting an Element
⚫ Add the element to the left of the node pointed to by min[H]
⚫ Update min[H] if needed
Insert 21: [Figure: 21 joins the root list next to min[H]; the roots are now 17, 24, 23,
7, 21, 3, and since 21 > 3, min[H] still points to 3.]
Amortized Cost of Insert
⚫ Actual Cost O(1)
⚫ Change in potential +1
⚫ One new tree, no new marked node
⚫ Amortized cost O(1)
Merging Two Heaps
⚫ Concatenate the root lists of the two Fibonacci heaps
⚫ Root lists are circular, doubly linked lists, so they can be easily concatenated
⚫ Set min[H] to the smaller of min[H1] and min[H2]
[Figure: the root lists of H1 and H2 are spliced into a single circular list; the roots are
now 23, 24, 17, 7, 3, 21, and min[H] points to 3.]
Amortized Cost of Merge/Union
⚫ Actual cost = O(1)
⚫ Change in potential = 0
⚫ Amortized cost = O(1)
Extracting the Minimum Element
⚫ Step 1:
⚫ Delete the node pointed to by min[H]
⚫ Concatenate the deleted node’s children into the root list
[Figure: min[H] = 3 is deleted and its children 18, 52, and 41 join the root list, giving
roots 7, 24, 23, 17, 18, 52, 41 (7 keeps child 30, 24 keeps children 26 and 46 with 26’s
child 35, 18 keeps child 39, and 41 keeps child 44).]
Extracting the Minimum (contd.)
⚫ Step 2: Consolidate trees so that no two roots have same degree
⚫ Traverse the roots from min towards right
⚫ Find two roots x and y with the same degree, with key[x] ≤ key[y]
⚫ Remove y from root list and make y a child of x
⚫ Increment degree[x]
⚫ Unmark y if marked
⚫ We use an array A[0..D(n)] where D(n) is the maximum degree of any node in the heap with n nodes,
initially all NIL
⚫ If A[k] = y at any time, then degree[y] = k
Extracting the Minimum (contd.)
⚫ Step 2 on the example (the intermediate figures, showing the root list and the degree
array A[0..3] after each merge, are omitted):
⚫ Traverse the roots from min[H] towards the right, recording each root in A by its degree
⚫ 17 and 23 have the same degree, so merge the 17 and 23 trees: 23 becomes a child of 17
⚫ 7 and 17 now have the same degree, so merge the 7 and 17 trees: 17 becomes a child of 7
⚫ 7 and 24 now have the same degree, so merge the 7 and 24 trees: 24 becomes a child of 7
⚫ 41 and 18 have the same degree, so merge the 41 and 18 trees: 41 becomes a child of 18
⚫ All roots are now covered by the current pointer, so consolidation is done
⚫ Now find the minimum among the roots and make min[H] point to it (already pointing to
the minimum in this example)
⚫ Final heap: root list 7, 52, 18 with min[H] = 7; 7’s children are 30, 17 (whose child is 23),
and 24 (whose children are 26 and 46, with 26 keeping child 35); 18’s children are 39 and
41 (whose child is 44)
Amortized Cost of Extracting Min
⚫ Recall that
⚫ D(n) = max degree of any node in the heap with n nodes
⚫ t(H) = number of trees in heap H
⚫ m(H) = number of marked nodes in heap H
⚫ Potential function Φ(H) = t(H) + 2m(H)
⚫ Actual Cost
⚫ Time for Step 1:
⚫ O(D(n)) work adding min's children into root list
⚫ Time for Step 2 (consolidating trees)
⚫ Size of the root list just before Step 2 is at most D(n) + t(H) - 1
⚫ t(H) original roots before deletion minus the one deleted plus the number of children of the
deleted node
⚫ The maximum number of merges possible is the no. of nodes in the root list
⚫ Each merge takes O(1) time
⚫ So total O(D(n) + t(H)) time for consolidation
⚫ O(D(n)) time to find the new min and updating min[H] after consolidation, since at most D(n) +
1 nodes in root list
⚫ Total actual cost = time for Step 1 + time for Step 2
= O(D(n) + t(H))
⚫ Potential before extracting minimum = t(H) + 2m(H)
⚫ Potential after extracting minimum ≤ (D(n) + 1) + 2m(H)
⚫ At most D(n) + 1 roots are there after deletion
⚫ No new node is marked during deletion
⚫ Can be unmarked, but not marked
⚫ Amortized cost = actual cost + potential change
= O(D(n)+ t(H)) + ((D(n)+1) +2m(H)) – (t(H) + 2m(H))
= O(D(n))
⚫ But D(n) can be O(n), right? That seems too costly! So is O(D(n)) any good?
⚫ Can show that D(n) = O(lg n) (proof omitted)
⚫ So amortized cost = O(lg n)
Decrease Key
Decrease 46 to 45: [Figure: a Fibonacci heap with roots 7, 18, 38 and min[H] = 7; 46 is
a child of 24. Since 45 is still no smaller than its parent’s key, the heap property is
preserved and only the key changes.]
⚫ Case 1: parent of x is unmarked
⚫ decrease key of x to k
⚫ cut off the link between x and its parent, unmark x if marked
⚫ mark the parent
⚫ add the tree rooted at x to the root list, updating the heap min pointer
Decrease 45 to 15: [Figure: 15 is smaller than its parent 24, so the node is cut and added
to the root list, and its parent 24 is marked.]
[Figure: after the cut, the root list is 15, 7, 18, 38 with min[H] still pointing to 7.]
⚫ Case 2: parent of x is marked
⚫ decrease key of x to k
⚫ cut off the link between x and its parent p[x], add x to the root list, unmark x if marked
⚫ cut off the link between p[x] and p[p[x]], add p[x] to the root list, unmark p[x] if marked
⚫ If p[p[x]] is unmarked, then mark it and stop
⚫ If p[p[x]] is marked, cut off p[p[x]], unmark it, and repeat until an unmarked node is
found or the root is reached (cascading cut)
Decrease 35 to 5: [Figure: x = 35 with parent p[x] = 26 and grandparent p[p[x]] = 24,
both marked.]
[Figures: 35 becomes 5 and is cut to the root list; its marked parent 26 is then cut to the
root list and unmarked, and the cascading cut continues with 24, which is also cut and
unmarked. The final root list is 15, 5, 26, 24, 7, 18, 38 with min[H] = 5.]
FIB-HEAP-DECREASE-KEY(H, x, k)
1. if k > key[x]
2. error “new key is greater than current key”
3. key[x] = k
4. y = p[x]
5. if y ≠ NIL and key[x] < key[y]
6. { CUT(H, x, y)
7. CASCADING-CUT(H, y) }
8. if key[x] < key[min[H]]
9. min[H] = x
CUT(H, x, y)
1. remove x from the child list of y, decrementing degree[y]
2. add x to the root list of H
3. p[x] = NIL
4. mark[x] = FALSE
CASCADING-CUT(H, y)
1. z = p[y]
2. if z ≠ NIL
3. if mark[y] == FALSE
4. mark[y] = TRUE
5. else { CUT(H, y, z)
6. CASCADING-CUT(H, z) }
Amortized Cost of Decrease Key
⚫ Actual cost
⚫ O(1) time for decreasing key value, and the first cut of x
⚫ O(1) time for each of c cascading cuts, plus reinserting in root list
⚫ Total O(c)
⚫ Change in Potential
⚫ H = heap just before decreasing the key, H’ just after
⚫ t(H') = t(H) + c
⚫ t(H) original trees, plus (c – 1) trees from the cascading cuts, plus the tree rooted at x
⚫ m(H') ≤ m(H) – c + 2
⚫ Each cascading cut unmarks a node except the last one (–(c – 1))
⚫ The last cascading cut could potentially mark a node (+1)
⚫ Change in potential
= (t(H’) + 2m(H’)) – (t(H) + 2m(H))
≤ c + 2(– c + 2) = 4 – c
⚫ Amortized cost = O(c) + (4 – c) = O(1), by scaling up the units of potential to dominate the O(c) actual cost
Deleting an Element
⚫ Delete node x in two steps:
⚫ Decrease the key of x to –∞
⚫ Delete the min element in the heap
⚫ Amortized cost
⚫ O(1) for decrease-key.
⚫ O(D(n)) for delete-min.
⚫ Total O(D(n))
⚫ Again, can show that D(n) = O(lg n)
⚫ So amortized cost of delete = O(lg n)