
ADVANCED DATA STRUCTURES UNIT - III CSE

PRIORITY QUEUES
A priority queue is an important data type in computer science. The major operations supported by priority queues are Insert and Delete-min.

Insert does the obvious thing, and Delete-min finds, returns, and removes the minimum element in the priority queue.

Priority queues are used extensively in:

 Implementing schedulers in operating systems and distributed systems
 Representing event lists in discrete event simulation
 Implementing numerous graph algorithms efficiently
 Selecting the kth largest or kth smallest element in a list (the order statistics problem)
 Sorting applications

Simple Implementation:

There are several ways to implement a priority queue.

Linked list: sorted or unsorted

 With an unsorted list, we can perform insertions at the front in O(1) time, but finding and deleting the minimum requires traversing the list, which takes O(N) time.

 Alternatively, to make delete-min cheap, we could insist that the list always be kept sorted; this makes insertions expensive, O(N), and delete-mins cheap, O(1).

Another way of implementing priority queues would be to use a binary search tree.

 This gives an O(log N) average running time for both operations

 The basic data structure we will use does not require pointers and supports both
operations in O(log N) worst-case time. The implementation we will use is known as
a binary heap.

Binary Heaps:

Heaps (occasionally called partially ordered trees) are a very popular data structure for
implementing priority queues.

Binary heaps are often referred to simply as heaps. Like binary search trees, heaps have two
properties, namely a structure property and a heap-order property.


Structure property:

A heap is a binary tree that is completely filled, with the possible exception of the bottom
level, which is filled from left to right. Such a tree is known as a complete binary tree, as
shown below.

A binary heap is a complete binary tree with elements from a partially ordered set, such that
the element at every node is less than (or equal to) the element at its left child and the
element at its right child.

It is easy to show that a complete binary tree of height h has between 2^h and 2^(h+1) - 1 nodes.

This implies that the height of a complete binary tree with N nodes is ⌊log N⌋, which is clearly O(log N).

One important observation is that because a complete binary tree is so regular, it can be
represented in an array and no pointers are necessary.

Since a heap is a complete binary tree, the elements can be conveniently stored in an array.
If an element is at position i in the array, then the left child will be in position 2i, the right
child will be in position 2i + 1, and the parent will be in position ⌊i/2⌋.
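
For illustration, here is a minimal C sketch of this index arithmetic, assuming a 1-based array in which position 0 is unused (the helper names are illustrative, not part of the notes):

/* Index arithmetic for a heap stored in a 1-based array (position 0 unused). */
static int parent(int i)      { return i / 2; }      /* integer division gives floor(i/2) */
static int left_child(int i)  { return 2 * i; }
static int right_child(int i) { return 2 * i + 1; }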

The only problem with this implementation is that an estimate of the maximum heap size is
required in advance, but typically this is not a problem.

Because of the heap property, the minimum element will always be present at the root of
the heap. Thus the find min operation will have worst case O(1) running time.


Heap-order property:

It is the property that allows operations to be performed quickly. Since we want to be able
to find the minimum quickly, it makes sense that the smallest element should be at the root.

If we consider that any subtree should also be a heap, then any node should be smaller
than all of its descendants.

Applying this logic, we arrive at the heap-order property: in a heap, for every node X, the
key in the parent of X is smaller than (or equal to) the key in X, with the exception of the
root (which has no parent).

NOTE: Binary heaps were first introduced by Williams in 1964.


NOTE: A binary heap is either a min-heap or a max-heap. A min-heap supports the insert
and delete-min operations, while a max-heap supports the insert and delete-max operations.

Basic Heap Operations:

It is easy to perform the two required operations. All the work involves ensuring that the
heap order property is maintained.

Insert:

To insert an element, say x, into a heap with n elements, we first create a hole in position
(n + 1) and see if the heap property is violated by putting x into the hole. If the heap property
is not violated, then we have found the correct position for x. Otherwise, we push up or
percolate up x until the heap property is restored.
To do this, we slide the element that is in the hole's parent node into the hole, thus bubbling
the hole up toward the root. We continue this process until x can be placed in the hole.
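
This percolate-up strategy can be sketched in C as follows. It is a minimal sketch, assuming a min-heap of ints stored in a 1-based array heap[] holding *n elements; the names are illustrative rather than the notes' own.

/* Insert x into a min-heap stored in heap[1..*n] (1-based array).
   A hole is created at position *n + 1 and bubbled up until x fits. */
void heap_insert(int heap[], int *n, int x)
{
    int hole = ++(*n);                       /* create a hole at the next free position   */
    while (hole > 1 && x < heap[hole / 2]) { /* while x is smaller than the hole's parent */
        heap[hole] = heap[hole / 2];         /* slide the parent down into the hole       */
        hole /= 2;                           /* the hole moves up one level               */
    }
    heap[hole] = x;                          /* place x in its final position             */
}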

Consider the heap


We create a hole in the next available heap location. Inserting 14 in the hole would violate
the heap order property so 31 is slid down into the hole. This strategy is continued until the
correct location for 14 is found.

This general strategy is known as percolate up, i.e., the new element is percolated up the
heap until the correct location is found.

NOTE: The worst-case complexity of insert is O(h), where h is the height of the heap. Thus
insertions are O(log n), where n is the no. of elements in the heap.

Delete min:

When the minimum is deleted, a hole is created at the root. Since the heap now has
one less element and must remain a complete binary tree, the element in the last position is
to be relocated.

We first do this by placing the last element in the hole created at the root. This may leave the
heap property violated at the root.

We now push down or percolate down the hole at the root until the heap property is no
longer violated. While pushing down the hole, it is important to slide it down to the lesser
of its two children (pushing the latter up). This is done so as not to create another violation
of the heap property.
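
A minimal C sketch of delete-min with percolate-down, under the same 1-based array representation assumed for insert above (names are illustrative):

/* Remove and return the minimum of a min-heap stored in heap[1..*n].
   The last element is moved into the hole at the root and percolated down. */
int heap_delete_min(int heap[], int *n)
{
    int min  = heap[1];
    int last = heap[(*n)--];                 /* the element that must be relocated */
    int hole = 1, child;

    while (hole * 2 <= *n) {
        child = hole * 2;                                  /* left child              */
        if (child != *n && heap[child + 1] < heap[child])
            child++;                                       /* take the lesser child   */
        if (heap[child] < last)
            heap[hole] = heap[child];                      /* slide the child up      */
        else
            break;
        hole = child;
    }
    heap[hole] = last;
    return min;
}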

Consider the previous example:

The first delete-min removes 13.


This general strategy is known as percolate down. We use the same hole technique as in the
insert routine to avoid the use of swaps.

NOTE: The worst case running time of delete min is O(log n) where n is the no. of elements
in the heap.


Creating Heap:

The build-heap operation takes as input n elements. The problem here is to create a heap of
these elements, i.e., to place them into an initially empty heap.

 The obvious approach is to insert the n elements one at a time into an initially empty
heap. Since each insert takes O(1) average and O(log n) worst-case time, the total
running time of this algorithm is O(n) on average but O(n log n) in the worst case.

 Another approach, proposed by Floyd in 1964, is to use a procedure called push down
or percolate down repeatedly, starting with the array consisting of the given
n elements in input order.
If percolate-down(i) percolates down from node i, the following loop creates
a heap-ordered tree.

for ( i = n/2 ; i > 0 ; i-- )
    percolate-down( i )
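
A minimal C sketch of this build-heap procedure, under the same 1-based array representation as before (names are illustrative):

/* Percolate the element at position i down in heap[1..n]. */
static void percolate_down(int heap[], int n, int i)
{
    int tmp = heap[i], child;
    for ( ; i * 2 <= n; i = child) {
        child = i * 2;
        if (child != n && heap[child + 1] < heap[child])
            child++;                         /* lesser of the two children */
        if (heap[child] < tmp)
            heap[i] = heap[child];
        else
            break;
    }
    heap[i] = tmp;
}

/* Floyd's method: percolate down every non-leaf node, from n/2 down to 1. */
void build_heap(int heap[], int n)
{
    for (int i = n / 2; i > 0; i--)
        percolate_down(heap, n, i);
}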

Consider the following unordered tree:

Left: initial tree Right: after percolate – down (7)


Left: after percolate – down (6) Right: after percolate – down (5)

Left: after percolate – down (4) Right: after percolate – down (3)

Left: after percolate – down (2) Right: after percolate – down (1)

Each dashed line corresponds to two comparisons: one to find the smallest child and one to
compare the smaller child with the node.

Notice that there are only 10 dashed lines in the entire algorithm (there could have been an
11th; where?), corresponding to 20 comparisons.


To bound the running time of build-heap, we must bound the no. of dashed lines. This
can be done by computing the sum of the heights of all the nodes in the heap, which is the
maximum no. of dashed lines. What we would like to show is that this sum is O(n).

THEOREM:

For a perfect binary tree of height h containing 2^(h+1) - 1 nodes, the sum of the heights of
the nodes is 2^(h+1) - 1 - (h + 1).

Proof:

It is easy to see that this tree consists of 1 node at height h, 2 nodes at height h - 1, 2^2 nodes
at height h - 2, and in general 2^i nodes at height h - i.

The sum of the heights of all the nodes is then S = ∑ 2^i (h - i), summed from i = 0 to h:

S = h + 2(h - 1) + 4(h - 2) + 8(h - 3) + 16(h - 4) + ... + 2^(h-1)(1)

Multiplying by 2 gives the equation

2S = 2h + 4(h - 1) + 8(h - 2) + 16(h - 3) + ... + 2^h

Subtracting the first equation from the second gives

S = -h + 2 + 4 + 8 + 16 + ... + 2^(h-1) + 2^h

Therefore S = (2^(h+1) - 1) - (h + 1), which proves the theorem.
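
As a quick check of the formula (a worked example added here, not part of the original proof), take a perfect tree of height h = 3, which has 15 nodes:

S = \sum_{i=0}^{3} 2^{i}(3 - i) = 1\cdot 3 + 2\cdot 2 + 4\cdot 1 + 8\cdot 0 = 11 = 2^{3+1} - 1 - (3 + 1)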

It is easy to see that the above is an upper bound on the sum of the heights of the nodes of a
complete binary tree. Since a complete binary tree of height h has between 2^h and 2^(h+1) - 1
nodes, the above sum is therefore O(n), where n is the no. of nodes in the heap.

Since the worst-case complexity of the heap-building algorithm is of the order of the sum of
the heights of the nodes of the heap built, the worst-case complexity of heap building is O(n).


Binomial Queues:
We have seen structures that support merging, insertion, and delete-min in O(log n) time per
operation.

Binomial queues also support all three operations in O(log n) worst-case time per operation,
and in addition insertions take constant time on average.

Binomial Queue Structure:

A binomial queue differs from the other priority queue implementations in that it is not a
single heap-ordered tree but rather a collection of heap-ordered trees, known as a forest.

Each of the heap-ordered trees is of a constrained form known as a binomial tree. There is
at most one binomial tree of every height.

 A binomial tree of height 0 is a one-node tree.

 A binomial tree Bk of height k is formed by attaching a binomial tree Bk-1 to the root
of another binomial tree Bk-1.

B0 B1 B2 B3

The above diagram shows binomial trees B0, B1, B2, and B3. From the diagram we see that a
binomial tree Bk consists of a root with children B0, B1, B2, ..., Bk-1.

NOTE: A binomial tree of height k has exactly 2^k nodes, and the no. of nodes at depth d is the
binomial coefficient C(k, d).

NOTE: If we impose heap order on the binomial trees and allow at most one binomial tree of
any height, we can uniquely represent a priority queue of any size by a collection of binomial
trees (a forest).
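
One common way to represent such a forest in C is with left-child / next-sibling nodes and an array of tree roots indexed by height. The following declarations are a sketch under that assumption; the type names, field names, and MAX_TREES bound are illustrative, not the notes' own.

#define MAX_TREES 24               /* enough for well over a million elements */

/* A binomial tree node: first child and next sibling. */
typedef struct BinNode {
    int             element;
    struct BinNode *left_child;    /* first (largest) subtree         */
    struct BinNode *next_sibling;  /* next smaller subtree of parent  */
} BinNode;

/* A binomial queue: a forest with at most one tree of each height. */
typedef struct {
    int      current_size;         /* total number of elements        */
    BinNode *trees[MAX_TREES];     /* trees[k] is Bk, or NULL         */
} BinQueue;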


For instance, a priority queue of size 13 could be represented by the forest B3, B2, B0.

We might write this representation as 1101, which not only represents 13 in binary but also
indicates that B3, B2, and B0 are present in the representation and B1 is not.

As an example, a priority queue of six elements could be represented as in the figure below.

H1:

Figure: Binomial queue H1 with six elements

Binomial Queue operations:

Find-min:

This is implemented by scanning the roots of all the trees. Since there are at most log n
different trees, the minimum can be found in O(log n) time.

Alternatively, one can keep track of the current minimum and perform find-min in O(1)
time, if we remember to update the minimum whenever it changes during other operations.
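
A minimal sketch of the scanning version of find-min, using the BinQueue representation assumed earlier:

/* Return the index of the tree whose root is the minimum, or -1 if the queue
   is empty.  Scans at most O(log n) roots. */
int find_min_index(const BinQueue *h)
{
    int min_index = -1;
    for (int i = 0; i < MAX_TREES; i++)
        if (h->trees[i] != NULL &&
            (min_index == -1 || h->trees[i]->element < h->trees[min_index]->element))
            min_index = i;
    return min_index;
}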

Merge:

Merging two binomial queues is a conceptually easy operation, which we will describe by
example.
Consider the two binomial queues H1 and H2 with six and seven elements respectively as
shown below.

H1: with 6 elements


H2: with 7 elements

Merge of the two B1 trees (i.e., 2^1 = 2 nodes each) in H1 and H2:

Now we are left with 1 tree of height 0 and 3 trees of height 2.

Binomial queue H3: the result of merging H1 and H2

H3: with 13 elements


The merging is performed by essentially adding the two queues together.

Let H3 be the new binomial queue.

Since H1 has no binomial tree of height 0, and H2 does, we can just use the binomial tree of
height 0 in H2 as part of H3.

Next, we add binomial trees of height 1.

Since both H1 and H2 have a binomial tree of height 1, we merge them by making the larger
root a subtree of the smaller, creating a binomial tree of height 2.

Thus, H3 will not have a binomial tree of height 1 as shown in the above diagrams.

There are now three binomial trees of height 2, namely, the original trees in H1 and H2
plus the tree formed by merging the height-1 trees of H1 and H2.

We keep one binomial tree of height 2 in H3 and merge the other two, creating a binomial
tree of height 3.

Since H1 and H2 have no trees of height 3, this tree becomes part of H3 and we are finished.
The resulting binomial queue is as shown in above figure.

Since merging two binomial trees takes constant time with almost any reasonable
implementation, and there are O(log n) binomial trees, the merge takes O(log n) time in the
worst case.

To make this operation efficient, we need to keep the trees in the binomial queue sorted by
height, which is certainly a simple thing to do.
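
The merge can be sketched in C as follows, reusing the BinQueue representation assumed earlier. Encoding which of the three possible Bk trees (one from H1, one from H2, and a carry) are present as a small integer is one common implementation choice, not necessarily the notes' own.

/* Merge two binomial trees of the same height; the larger root becomes
   the first child of the smaller root. */
static BinNode *combine_trees(BinNode *t1, BinNode *t2)
{
    if (t2->element < t1->element)
        return combine_trees(t2, t1);
    t2->next_sibling = t1->left_child;
    t1->left_child   = t2;
    return t1;
}

/* Merge h2 into h1 (h2 is emptied).  Works like binary addition with carries. */
void bq_merge(BinQueue *h1, BinQueue *h2)
{
    BinNode *carry = NULL;
    h1->current_size += h2->current_size;

    for (int i = 0; i < MAX_TREES; i++) {
        BinNode *t1 = h1->trees[i], *t2 = h2->trees[i];
        int which = !!t1 + 2 * !!t2 + 4 * !!carry;        /* which trees are present */

        switch (which) {
        case 0: case 1:                                    /* nothing, or only h1     */
            break;
        case 2:                                            /* only h2                 */
            h1->trees[i] = t2; h2->trees[i] = NULL; break;
        case 4:                                            /* only the carry          */
            h1->trees[i] = carry; carry = NULL; break;
        case 3:                                            /* h1 and h2               */
            carry = combine_trees(t1, t2);
            h1->trees[i] = h2->trees[i] = NULL; break;
        case 5:                                            /* h1 and carry            */
            carry = combine_trees(t1, carry);
            h1->trees[i] = NULL; break;
        case 6:                                            /* h2 and carry            */
            carry = combine_trees(t2, carry);
            h2->trees[i] = NULL; break;
        case 7:                                            /* all three: keep one     */
            h1->trees[i] = carry;
            carry = combine_trees(t1, t2);
            h2->trees[i] = NULL; break;
        }
    }
}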

Insertion:

 Insertion is a special case of merging, since we merely create a one – node tree and
perform a merge.

 The worst-case time of this operation is likewise O(log n).

 More precisely, if the priority queue into which the element is being inserted has the
property that the smallest binomial tree not present is Bi, then the running time is
proportional to i + 1 (a sketch of insert follows below).
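
A minimal sketch of insert built on top of the merge described above; it reuses the BinNode/BinQueue types and the bq_merge routine assumed earlier and needs <stdlib.h> for malloc.

#include <stdlib.h>

/* Insert x by building a one-node binomial queue and merging it in. */
void bq_insert(BinQueue *h, int x)
{
    BinNode *node = malloc(sizeof(BinNode));
    node->element      = x;
    node->left_child   = NULL;
    node->next_sibling = NULL;

    BinQueue one = { 1, { node } };   /* a single B0 tree; remaining slots are NULL */
    bq_merge(h, &one);
}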


For example:
In the previous example, H3 is missing a binomial tree of height 1, so the insertion will
terminate in two steps.

Since each tree in a binomial queue is present with probability 1/2, if we define the random
variable X as the number of steps in an insert operation, then

X = 1 with probability 1/2 (B0 not present)
X = 2 with probability 1/4 (B0 present, B1 not present)
X = 3 with probability 1/8 (B0 and B1 present, B2 not present)
and so on.

Thus the average number of steps in an insert operation is ∑ i/2^i = 2.

Thus we expect an insertion to terminate in two steps on average. Furthermore, performing
n inserts on an initially empty binomial queue will take O(n) worst-case time.

Indeed, it is possible to do this operation using only (n - 1) comparisons.

As an example, consider the binomial queues that are formed by inserting 1 through 7 in order.

After 1 is inserted:

After 2 is inserted:

After 3 is inserted:

After 4 is inserted:


After 5 is inserted:

After 6 is inserted:

After 7 is inserted:

If we insert 8 then

Inserting 4 shows off a bad case: we merge 4 with B0, obtaining a new tree of height 1. We
then merge this tree with B1, obtaining a tree of height 2, which is the new priority queue.

The next insertion, after 7 has been inserted, is another bad case and would require three merges.


Delete min:

 A delete-min can be performed by first finding the binomial tree with the smallest
root.
 Let this tree be Bk, and let the original priority queue be H.
 Remove the binomial tree Bk from the forest of trees in H, forming the new binomial
queue H′.
 Now remove the root of Bk, creating binomial trees B0, B1, ..., Bk-1, which collectively
form the priority queue H″.
 Finish the operation by merging H′ and H″ (a sketch of the whole operation follows this list).
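
A sketch of delete-min under the same assumed representation, reusing find_min_index and bq_merge from the earlier sketches (the queue is assumed non-empty, and error handling is omitted):

/* Delete the minimum: remove the tree with the smallest root, split its
   children off into a second queue H'', and merge the two queues back. */
int bq_delete_min(BinQueue *h)
{
    int      min_index = find_min_index(h);            /* the tree Bk with the smallest root */
    BinNode *old_root  = h->trees[min_index];
    int      min_item  = old_root->element;

    BinQueue deleted = { (1 << min_index) - 1, { NULL } };   /* H'' will hold B0 .. Bk-1 */
    BinNode *child = old_root->left_child;
    for (int j = min_index - 1; j >= 0; j--) {          /* children are Bk-1, ..., B0   */
        deleted.trees[j] = child;
        child = child->next_sibling;
        deleted.trees[j]->next_sibling = NULL;
    }
    free(old_root);

    h->trees[min_index] = NULL;                         /* H' = H without Bk            */
    h->current_size    -= deleted.current_size + 1;

    bq_merge(h, &deleted);                              /* merge H' and H''             */
    return min_item;
}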

Consider the same example as in the merge operation, which produced H3.


H3:

The minimum root is 12, so we obtain the two priority queues H′ and H″:

The binomial queue that results from merging H′ & H″ is as shown below

NOTE: The entire delete min operation takes O(log n) worst case time


Binomial Amortized Analysis:

Amortized Analysis of Merge


To merge two binomial queues, an operation similar to addition of binary integers is
performed:
At any stage, we may have zero, one, two, or three Bk trees, depending on whether or not
the two priority queues contain a Bk tree and whether or not a Bk tree is carried over from
the previous step.
 If there are zero or one Bk trees, the tree, if present, is placed as a tree in the resulting binomial queue.
 If there are two, they are merged into a Bk+1 tree and carried over to the next step.
 If there are three, one is retained as a tree in the resulting queue and the other two are merged and carried over.

Result 1:

 A binomial queue of n elements can be built by n successive insertions in O(n) time.

 Brute-force analysis
Define the cost of each insertion to be
o 1 time unit + an extra unit for each linking step.
Thus the total will be n units plus the total number of linking steps.
o The 1st, 3rd, ... and, in general, every odd-numbered insertion requires no linking
steps, since there is no B0 present at the time of insertion.
o A quarter of the insertions (the 2nd, 6th, 10th, ...) require only one linking step.
o One eighth of the insertions require two linking steps, and so on.

We could do all this and bound the number of linking steps by n.

The above analysis will not help when we try to analyze a sequence of operations that
include more than just insertions.

 Amortized Analysis

Consider the result of an insertion.

o If there is no B0 tree, then the insertion costs one time unit. The result of the
insertion is that there is now a B0 tree, and thus the forest has one more tree.
o If there is a B0 tree but no B1 tree, then the insertion costs 2 time units. The new
forest will have a B1 tree but no B0 tree, so the number of trees in the forest is
unchanged.
o An insertion that costs 3 time units will create a B2 tree but destroy a B0 and a
B1, yielding one less tree in the forest.
o In general, an insertion that costs c units results in a net increase of 2 - c trees in the forest,
since
 a Bc-1 tree is created, and
 all Bi trees, 0 ≤ i < c - 1, are removed.


Thus expensive insertions remove trees and cheap insertions create trees.

Let
ti = the cost (1 plus the number of linking steps) of the i-th insertion
ci = the number of trees in the forest after the i-th insertion

We have

c0 = 0
ti + (ci - ci-1) = 2 for every insertion

Result 2:
 The amortized running times of Insert, Delete-min, and Merge are O(1), O(log n), and
O(log n), respectively.
To prove this result we choose the potential function
Potential = number of trees in the queue.
 Insertion

Let
ti = the actual time (number of steps) of the i-th insertion
ci = the number of trees in the forest after the i-th insertion
ai = ti + (ci - ci-1) = 2 (the amortized cost of the i-th insertion)

Summing over all n insertions gives
∑ ti = ∑ ai - (cn - c0) = 2n - (cn - c0)
 As long as (cn - c0) is non-negative, the total actual time is at most 2n, so insert is O(1) amortized.
 In any case, (cn - c0) is bounded by log n if we start with an empty queue.
 Merge:
Assume that the two queues to be merged have n1 and n2 nodes with T1 and T2 trees,
respectively. Let n = n1 + n2. The actual time to perform the merge is given by:
ti = O(log n1 + log n2)
   = O(max(log n1, log n2))
   = O(log n)
(ci - ci-1) is at most log n, since there can be at most log n trees after the merge, so the
amortized cost of merge is O(log n).
 Delete-min:
The analysis here follows the same argument as for merge, and the amortized cost is O(log n).


Lazy Binomial Queues:

Lazy binomial queues are binomial queues in which merging is done lazily.

Here, to merge two binomial queues, we simply concatenate the two lists of binomial trees.
In the resulting forest, there may be several trees of the same size.
Because of the lazy merge, merge and insert are both worst-case O(1) time.
 Delete-min:
o Convert the lazy binomial queue into a standard binomial queue (the conversion step is sketched below).
o Do delete-min as in a standard binomial queue.
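
The conversion step can be sketched as follows, assuming the lazy forest is kept as a list of roots linked through next_sibling and that each node additionally stores its rank; combine_trees is the linking routine from the merge sketch earlier. This is only an illustration of the coalescing idea under those assumptions.

/* Coalesce a lazy forest into a standard binomial queue: walk the list of
   roots and repeatedly link pairs of trees of equal rank, using a table
   indexed by rank (all entries NULL initially) so that at most one tree of
   each rank remains. */
void coalesce(BinNode *root_list, BinNode *rank_table[MAX_TREES])
{
    BinNode *t = root_list;
    while (t != NULL) {
        BinNode *next = t->next_sibling;     /* save it: the sibling link is reused */
        t->next_sibling = NULL;
        int r = t->rank;                     /* 'rank' is an assumed extra field    */
        while (rank_table[r] != NULL) {      /* two trees of the same rank: link    */
            t = combine_trees(t, rank_table[r]);
            rank_table[r] = NULL;
            r = r + 1;
            t->rank = r;                     /* the linked tree is one rank larger  */
        }
        rank_table[r] = t;
        t = next;
    }
}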
Fibonacci Heaps:
A Fibonacci heap supports all basic heap operations in O(1) amortized time, with the exception
of delete-min and delete, which take O(log n) amortized time.
Fibonacci heaps generalize binomial queues by adding two new concepts:
 A different implementation of decrease-key
 Lazy merging: two heaps are merged only when it is required.

It can be shown that in a Fibonacci heap any node of rank r ≥ 1 has at least Fr+1 descendants,
where Fk denotes the k-th Fibonacci number. Since Fibonacci numbers grow exponentially,
this bounds the rank of any node by O(log n), which is key to the O(log n) amortized bound
for delete-min.
