ECE 250 Algorithms and Data Structures
Binary heaps
Douglas Wilhelm Harder, M.Math. LEL
Department of Electrical and Computer Engineering
University of Waterloo, Waterloo, Ontario, Canada
ece.uwaterloo.ca
dwharder@alumni.uwaterloo.ca
© 2014 by Douglas Wilhelm Harder. Some rights reserved.

Outline
In this topic, we will:
– Define a binary min-heap
– Look at some examples
– Operations on heaps:
 • Top
 • Pop
 • Push
– An array representation of heaps
– Define a binary max-heap
– Using binary heaps as priority queues

7.2 Definition
A non-empty binary tree is a min-heap if:
– The key associated with the root is less than or equal to the keys associated with either of the subtrees (if any)
– Both of the subtrees (if any) are also binary min-heaps
From this definition:
– A single node is a min-heap
– All keys in either subtree are greater than or equal to the root key

Important: THERE IS NO OTHER RELATIONSHIP BETWEEN THE ELEMENTS IN THE TWO SUBTREES. Failing to understand this is the greatest mistake a student makes.

Example
This is a binary min-heap. Adding colour, we observe:
– The left subtree has both the smallest (7) and the largest (89) objects
– There is no relationship between items with similar priority

7.2.1 Operations
We will consider three operations:
– Top
– Pop
– Push

Top
We can find the top object in Θ(1) time: 3

Pop
To remove the minimum object:
– Promote the root of the subtree which has the lesser value
– Recurse down the subtree from which we promoted the lesser value
Using our example, we remove 3:
– We promote 7 (the minimum of 7 and 12) to the root
– In the left subtree, we promote 9
– Recursively, we promote 19
– Finally, 55 is a leaf node, so we promote it and delete the leaf
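The promotion steps just described can be sketched recursively. This is a sketch only, for a hypothetical pointer-based node type (`Node` and `promote` are illustrative names, not the course's class design):

```cpp
// A hypothetical pointer-based heap node.
struct Node {
    int key;
    Node *left;
    Node *right;
};

// Remove this node's key by repeatedly promoting the smaller child's
// key, recursing into the subtree we promoted from; once a leaf is
// reached, it is deleted.  Returns the (possibly null) new subtree.
Node *promote(Node *node) {
    if (node->left == nullptr && node->right == nullptr) {
        delete node;                      // reached a leaf: remove it
        return nullptr;
    }
    // Pick the child with the smaller key (a missing child loses).
    bool use_left = (node->right == nullptr) ||
                    (node->left != nullptr &&
                     node->left->key <= node->right->key);
    if (use_left) {
        node->key = node->left->key;      // promote the smaller key
        node->left = promote(node->left); // recurse down that subtree
    } else {
        node->key = node->right->key;
        node->right = promote(node->right);
    }
    return node;
}
```

Calling `promote` on the root pops the minimum, exactly as in the walkthrough above.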
Repeating this operation, we can remove 7. If we then remove 9, we must promote from the right subtree.

Push
Inserting into a heap may be done either:
– At a leaf (move it up if it is smaller than its parent)
– At the root (insert the larger object into one of the subtrees)
We will use the first approach with binary heaps
– Other heaps use the second

Inserting 17 into the last heap:
– Select an arbitrary node at which to insert a new leaf node
– The node 17 is less than the node 32, so we swap them
– The node 17 is less than the node 31, so we swap them
– The node 17 is less than the node 19, so we swap them
– The node 17 is greater than 12, so we are finished
Observation: both the left and right subtrees of 19 were greater than 19, thus we are guaranteed that we don't have to send the new node down. This process is called percolation: the lighter (smaller) objects move up from the bottom of the min-heap.

7.2.2 Implementations
With binary search trees, we introduced the concept of balance. From this, we looked at:
– AVL trees
– B-trees
– Red-black trees (not course material)
How can we determine where to insert so as to keep balance?

There are multiple means of keeping balance with binary heaps:
– Complete binary trees
– Leftist heaps
– Skew heaps
We will look at using complete binary trees
– This has optimal memory characteristics but sub-optimal run-time characteristics

Complete trees
By using complete binary trees, we will be able to maintain, with minimal effort, the complete-tree structure. We have already seen that it is easy to store a complete tree as an array. If we can store a heap of size n as an array of size Θ(n), this would be great.
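The percolation described above, applied to the array representation motivated here, can be sketched as follows (a sketch only; 1-based indexing as developed later in the slides, and `heap_push` is a hypothetical name):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Push onto a 1-based array-backed min-heap: append the value as the
// next leaf of the complete tree, then percolate up, swapping with
// the parent (at index posn / 2) while the new value is smaller.
void heap_push(std::vector<int> &heap, int value) {
    // heap[0] is unused so that the parent of entry k is at k / 2.
    heap.push_back(value);
    std::size_t posn = heap.size() - 1;
    while (posn > 1 && heap[posn] < heap[posn / 2]) {
        std::swap(heap[posn], heap[posn / 2]);
        posn /= 2;
    }
}
```

For example, starting from `{0, 12, 15, 17}` (entry 0 unused), pushing 8 swaps it past 15 and then past 12, leaving 8 at the root.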
For example, the previous heap may be represented as the following (non-unique) complete tree:

Complete trees: Push
If we insert into a complete tree, we need only place the new node as a leaf node in the appropriate location and percolate up. For example, push 25:
– We have to percolate 25 up into its appropriate location
– The resulting heap is still a complete tree

Complete trees: Pop
Suppose we want to pop the top entry: 12
– Percolating up creates a hole, leading to a non-complete tree
– Alternatively, copy the last entry in the heap to the root
– Now percolate 36 down, swapping it with the smallest of its children
– We halt when both children are larger
– The resulting tree is still a complete tree
Again, popping 15, copy up the last entry, 88:
– This time, it gets percolated down to the point where it has no children
In popping 17, 53 is moved to the top:
– And percolated down, again to the deepest level
Popping 19 copies up 39:
– Which is then percolated down to the second-deepest level

7.2.3 Complete trees as arrays
Therefore, we can maintain the complete-tree shape of a heap. We may store a complete tree using an array:
– A complete tree is filled in breadth-first traversal order
– The array is filled using a breadth-first traversal
For the heap, a breadth-first traversal yields:
Array implementation
Recall that if we associate an index, starting at 1, with each entry in the breadth-first traversal, then given the entry at index k:
– The parent is at index k/2:
 parent = k >> 1;
– The children are at indices 2k and 2k + 1:
 left_child = k << 1;
 right_child = left_child | 1;
Cost (trivial): start the array at position 1 instead of position 0.
For example:
– The children of 15 are 17 and 32
– The children of 17 are 25 and 19
– The children of 32 are 41 and 36
– The children of 25 are 33 and 55

If the heap-as-array has count entries, then the next empty node in the corresponding complete tree is at location posn = count + 1. We compare the item at location posn with the item at posn/2. If they are out of order:
– Swap them, set posn /= 2, and repeat

Consider the following heap, both as a tree and in its array representation.

Array implementation: Push
– Inserting 26 requires no swaps
– Inserting 8 requires a few percolations:
 • Swap 8 and 23
 • Swap 8 and 12
 • At this point, it is greater than its parent, so we are finished

Array implementation: Pop
As before, popping the top has us copy the last entry to the root:
– Copy the last object, 23, to the root
– Now percolate down
– Compare Node 1 with its children, Nodes 2 and 3: swap 23 and 6
– Compare Node 2 with its children, Nodes 4 and 5: swap 23 and 9
– Compare Node 4 with its children, Nodes 8 and 9: swap 23 and 10
– The children of Node 8 are beyond the end of the array: stop
The result is a binary min-heap.

Dequeuing the minimum again:
– Copy 26 to the root
– Compare Node 1 with its children, Nodes 2 and 3: swap 26 and 8
– Compare Node 3 with its children, Nodes 6 and 7: swap 26 and 12
– The children of Node 6, Nodes 12 and 13, are unoccupied (currently, count == 11)
The result is a min-heap.

Dequeuing the minimum a third time:
– Copy 15 to the root
– Compare Node 1 with its children, Nodes 2 and 3: swap 15 and 9
– Compare Node 2 with its children, Nodes 4 and 5: swap 15 and 10
– Compare Node 4 with its children, Nodes 8 and 9: 15 < 23 and 15 < 25, so stop
The result is a properly formed binary min-heap. After all our modifications, the final heap is shown above.

7.2.4 Run-time analysis
Accessing the top object is Θ(1).
Popping the top object is O(ln(n)):
– We copy something that is already at the lowest depth — it will likely be moved back down to the lowest depth
How about push?
If we are inserting an object less than the root (at the front), then the run time will be Θ(ln(n)). If we insert at the back (greater than any object), then the run time will be Θ(1). How about an arbitrary insertion?
– It will be O(ln(n))
Could the average be less?
With each percolation, an object moves past half of the remaining entries in the tree. Therefore, after one percolation it will probably be past half of the entries, and on average no further percolations are required. Averaging over all n positions of a perfect tree of height h, the expected number of percolations is

 (1/n) Σ_{k=0}^{h} (h − k)·2^k = (2^{h+1} − h − 2)/n = Θ(1), since n = 2^{h+1} − 1.

Therefore, we have an average run time of Θ(1).

An arbitrary removal requires that all entries in the heap be checked: O(n). A removal of the largest object in the heap still requires all leaf nodes to be checked; there are approximately n/2 leaf nodes: O(n).

Thus, our grid of run times is as given in the slides. Some observations:
– Continuously inserting at the front of the heap (i.e., each new object being pushed is less than everything in the heap) degrades the push run time to O(ln(n))
– If the objects are coming in order of priority, use a regular queue with swapping
– Merging two binary heaps of size n is a Θ(n) operation

Other heaps have better run-time characteristics:
– Leftist, skew, binomial, and Fibonacci heaps all use a node-based implementation, requiring Θ(n) additional memory
– For Fibonacci heaps, the run times of all operations (including merging two Fibonacci heaps) except pop are Θ(1)

7.2.5 Binary max-heaps
A binary max-heap is identical to a binary min-heap except that the parent is always larger than either of the children. For example, the same data as before stored as a max-heap is shown in the slides. Here we have a max-heap of presents under a red-green tree: http://xkcd.com/835/

Memory allocation and pointer arithmetic
Do we really have to allocate one additional memory location for a binary tree-as-heap?
 Type *heap_array = new Type[capacity() + 1];   // entries 0 through 9; entry 0 unused

Could we not just allocate one less memory location and point to the location in memory just before the allocation?

To do this, we must understand pointer arithmetic:

 int *ptr = new int[5]{1, 23, 45, 67, 89};
 std::cout << ptr << std::endl;
 std::cout << *ptr << std::endl;

 00d3a260 00d3a264 00d3a268 00d3a26C 00d3a270
 00000001 00000017 0000002D 00000043 00000059

What is the output of the following?

 std::cout << (ptr + 1) << std::endl;
 std::cout << *(ptr + 1) << std::endl;

Just adding one to the address would be, in almost all cases, useless:
– Assuming big-endian, the four bytes starting one byte in would have the value 256
– If this were little-endian, it would be even more bizarre…
Instead, C and C++ add as many bytes as the size of the object being pointed to:
– In the case of int, sizeof( int ) == 4 on most 32-bit machines
– The output is 23

Essentially, these two statements are identical:

 std::cout << ptr[i] << std::endl;
 std::cout << *(ptr + i) << std::endl;

Now you can do the following:

 Type *tmp = new Type[capacity()];
 Type *heap_array = tmp - 1;

Now heap_array[1] and tmp[0] both refer to the same memory location.

Issues:
– Never access or modify the contents of heap_array[0]
– When you deallocate the memory, you must pass the original address returned by new:
 delete[] (heap_array + 1);
– Pointer arithmetic is not for the faint of heart, but it is fun; for example:

 int array[N];
 int *ptr = array;

 // Print all the entries of the array
 for ( int i = 0; i < N; ++i ) {
     std::cout << *(ptr++) << std::endl;
 }

7.2.6 Priority queues
Now, does using a heap ensure that the object popped:
– has the highest priority, and
– of that highest priority, has been in the heap the longest?
Consider inserting seven objects, all of the same priority (colour indicates order): 2, 2, 2, 2, 2, 2, 2

Whatever algorithm we use for promoting must ensure that the first object remains in the root position:
– Thus, we must use an insertion technique where we only percolate up if the priority is strictly lower
Challenge: come up with an algorithm which removes all seven objects in the original order.

Lexicographical ordering
A better solution is to modify the priority:
– Track the number of insertions with a counter k (initially 0)
– For each insertion with priority n, create a hybrid priority (n, k), where
 (n1, k1) < (n2, k2)  if  n1 < n2, or (n1 = n2 and k1 < k2)
The objects are then removed in their original order. For example, popping the first 2: first, (2, 1) < (2, 2) and (2, 3) < (2, 4), so the earliest remaining entries are promoted; the remaining objects are removed the same way.

Summary
In this talk, we have:
– Discussed binary heaps
– Looked at an implementation using arrays
– Analyzed the run times:
 • Top: Θ(1)
 • Push: Θ(1) on average
 • Pop: O(ln(n))
– Discussed implementing priority queues using binary heaps, and the use of a lexicographical ordering
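The operations summarized above can be collected into a minimal array-based min-heap sketch. This uses 1-based indexing as in the slides; the class and member names are illustrative only, not the course's project interface:

```cpp
#include <cstddef>
#include <stdexcept>
#include <utility>
#include <vector>

// A minimal 1-based array-backed binary min-heap: entry 1 is the
// root, the parent of entry k is at k/2, the children at 2k, 2k + 1.
class MinHeap {
  public:
    MinHeap() : array(1) {}                  // entry 0 is unused

    std::size_t size() const { return array.size() - 1; }

    // Top: Theta(1).
    int top() const {
        if (size() == 0) throw std::underflow_error("empty heap");
        return array[1];
    }

    // Push: O(ln(n)) worst case, Theta(1) on average.
    void push(int value) {
        array.push_back(value);
        std::size_t posn = size();
        while (posn > 1 && array[posn] < array[posn / 2]) {
            std::swap(array[posn], array[posn / 2]);
            posn /= 2;
        }
    }

    // Pop: copy the last entry to the root, then percolate it down,
    // always swapping with the smaller child.  O(ln(n)).
    int pop() {
        int minimum = top();
        array[1] = array.back();
        array.pop_back();
        std::size_t posn = 1;
        while (2 * posn <= size()) {
            std::size_t child = 2 * posn;
            if (child + 1 <= size() && array[child + 1] < array[child]) {
                ++child;                     // the smaller of the two
            }
            if (array[posn] <= array[child]) break;
            std::swap(array[posn], array[child]);
            posn = child;
        }
        return minimum;
    }

  private:
    std::vector<int> array;
};
```

A `std::vector` is used here so the array grows as needed; the course's fixed-capacity array with the `tmp - 1` pointer trick above would work equally well.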
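The lexicographical (n, k) ordering above can also be sketched with the standard library's `std::priority_queue`; this container is not prescribed by the slides, and the class name is hypothetical:

```cpp
#include <cstddef>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

// A FIFO-stable priority queue: each pushed priority n is paired
// with an insertion counter k.  Pairs compare lexicographically, so
// (n1, k1) < (n2, k2) iff n1 < n2, or n1 == n2 and k1 < k2 — equal
// priorities therefore come out in insertion order.
class StablePriorityQueue {
  public:
    void push(int priority) {
        heap.push({priority, counter++});
    }

    // Pop the oldest entry among those with the smallest priority;
    // returns the (priority, insertion index) pair.
    std::pair<int, std::size_t> pop() {
        std::pair<int, std::size_t> entry = heap.top();
        heap.pop();
        return entry;
    }

    bool empty() const { return heap.empty(); }

  private:
    std::size_t counter = 0;
    // std::greater turns std::priority_queue into a min-heap.
    std::priority_queue<std::pair<int, std::size_t>,
                        std::vector<std::pair<int, std::size_t>>,
                        std::greater<std::pair<int, std::size_t>>> heap;
};
```

Pushing the priorities 2, 1, 2, 1 pops them back as (1,1), (1,3), (2,0), (2,2): ties broken by arrival order, as the challenge above requires.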
References
[1] Donald E. Knuth, The Art of Computer Programming, Volume 3: Sorting and Searching, 2nd Ed., Addison Wesley, 1998, §5.2.3, p. 144.
[2] Cormen, Leiserson, and Rivest, Introduction to Algorithms, McGraw Hill, 1990, §7.1–3, pp. 140–7.
[3] Weiss, Data Structures and Algorithm Analysis in C++, 3rd Ed., Addison Wesley, §6.3, pp. 215–25.

Usage Notes
• These slides are made publicly available on the web for anyone to use
• If you choose to use them, or a part thereof, for a course at another institution, I ask only three things:
– that you inform me that you are using the slides,
– that you acknowledge my work, and
– that you alert me of any mistakes which I made or changes which you make, and allow me the option of incorporating such changes (with an acknowledgment) in my set of slides
Sincerely,
Douglas Wilhelm Harder, MMath
dwharder@alumni.uwaterloo.ca