# CS711008Z Algorithm Design and Analysis


Lecture 7: Binary heap, binomial heap, and Fibonacci heap

Dongbo Bu, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China

The slides were made based on Chapters 19 and 20 of Introduction to Algorithms, on Data Structures by Ellis Horowitz et al., on a note by Danny Sleator, and on the slides by Kevin Wayne, with permission.

Outline

- Introduction to the priority queue.
- Various implementations of a priority queue:
  - Linked list: a list holding n items is too long to support efficient ExtractMin and Insert operations simultaneously.
  - Binary heap: use a tree rather than a linked list.
  - Binomial heap: allow multiple trees rather than a single tree, to support an efficient Union operation.
  - Fibonacci heap: implement DecreaseKey by simply cutting an edge rather than exchanging nodes, and keep the trees bushy by allowing each non-root node to lose at most one child.

Priority queue

Priority queue: motivation

Motivation: a common task is to extract the minimum from a set S of n numbers, dynamically. Here, the word "dynamically" means that we might perform Insert, Delete, and DecreaseKey operations on S. The question is how to organize the data so as to support these operations efficiently.

Priority queue

A priority queue is an abstract data type similar to a stack or a queue, except that each element has a priority associated with it. A min-oriented priority queue must support the following core operations:

1. H = MakeHeap(): create a new heap H;
2. Insert(H, x): insert into H an element x together with its priority;
3. ExtractMin(H): extract the element with the highest priority (i.e., the smallest key);
4. DecreaseKey(H, x, k): decrease the priority of element x to k;
5. Union(H_1, H_2): return a new heap containing all elements of heaps H_1 and H_2, and destroy the input heaps.

Priority queues are very useful

The priority queue has extensive applications, such as:
- Dijkstra's shortest path algorithm
- Prim's MST algorithm
- Huffman coding
- the A* search algorithm
- HeapSort

An example: Dijkstra's algorithm

Dijkstra's algorithm [1959]

Dijkstra(G, s)
1: key(s) = 0; // key(u) stores an upper bound of the shortest-path weight from s to u
2: PQInsert(s);
3: S = {}; // let S be the set of explored nodes
4: for all nodes v ≠ s do
5:   key(v) = +∞;
6:   PQInsert(v); // n times
7: end for
8: while S ≠ V do
9:   v = PQExtractMin(); // n times
10:  S = S ∪ {v};
11:  for each w ∉ S with (v, w) ∈ E do
12:    if key(v) + d(v, w) < key(w) then
13:      PQDecreaseKey(w, key(v) + d(v, w)); // m times in total
14:    end if
15:  end for
16: end while

Here PQ denotes a min-priority queue.
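The pseudocode above can be sketched in Python with the standard-library heapq module. Since heapq offers no DecreaseKey, this sketch uses the common lazy-insertion variant (push a fresh entry and skip stale ones on extraction), which produces the same distances; the small graph at the bottom is a made-up example, not the one from the slides.

```python
import heapq

def dijkstra(graph, s):
    """graph: dict mapping node -> list of (neighbor, weight) pairs."""
    key = {v: float("inf") for v in graph}
    key[s] = 0
    pq = [(0, s)]              # (key, node); duplicate entries stand in for DecreaseKey
    explored = set()           # the set S of the pseudocode
    while pq:
        k, v = heapq.heappop(pq)        # ExtractMin
        if v in explored:               # stale entry: v was already finalized
            continue
        explored.add(v)
        for w, d in graph[v]:
            if w not in explored and k + d < key[w]:
                key[w] = k + d
                heapq.heappush(pq, (key[w], w))   # lazy DecreaseKey
    return key

# A small made-up graph:
G = {"s": [("a", 9), ("c", 15)], "a": [("c", 4)], "c": []}
print(dijkstra(G, "s"))   # {'s': 0, 'a': 9, 'c': 13}
```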

Dijkstra's algorithm: an example

The slides step through the algorithm frame by frame on a small graph with nodes s, a, b, c, d, e, f, t (the graph figures are not reproduced here):

- Initially: S = {}, PQ = {s(0), a(∞), b(∞), c(∞), d(∞), e(∞), f(∞), t(∞)}.
- ExtractMin returns s; DecreaseKey(a, 9), DecreaseKey(b, 14), DecreaseKey(c, 15). Now S = {s}, PQ = {a(9), b(14), c(15), d(∞), e(∞), f(∞), t(∞)}.
- ExtractMin returns a; DecreaseKey(d, 33). Now S = {s, a}, PQ = {b(14), c(15), d(33), e(∞), f(∞), t(∞)}.
- ExtractMin returns b; DecreaseKey(d, 32), DecreaseKey(e, 44). Now S = {s, a, b}, PQ = {c(15), d(32), e(44), f(∞), t(∞)}.
- ExtractMin returns c; DecreaseKey(e, 35), DecreaseKey(t, 59). Now S = {s, a, b, c}, PQ = {d(32), e(35), t(59), f(∞)}.
- ExtractMin returns d; DecreaseKey(e, 34), DecreaseKey(t, 51). Now S = {s, a, b, c, d}, PQ = {e(34), t(51), f(∞)}.
- ExtractMin returns e; DecreaseKey(f, 45), DecreaseKey(t, 50). Now S = {s, a, b, c, d, e}, PQ = {f(45), t(50)}.
- ExtractMin returns f. Now S = {s, a, b, c, d, e, f}, PQ = {t(50)}.
- ExtractMin returns t. Now S = {s, a, b, c, d, e, f, t}, PQ = {}.

Time complexity of Dijkstra's algorithm

| Operation   | Linked list | Binary heap | Binomial heap | Fibonacci heap |
|-------------|-------------|-------------|---------------|----------------|
| MakeHeap    | 1           | 1           | 1             | 1              |
| Insert      | 1           | log n       | log n         | 1              |
| ExtractMin  | n           | log n       | log n         | log n          |
| DecreaseKey | 1           | log n       | log n         | 1              |
| Delete      | n           | log n       | log n         | log n          |
| Union       | 1           | n           | log n         | 1              |
| FindMin     | n           | 1           | log n         | 1              |
| Dijkstra    | O(n^2)      | O(m log n)  | O(m log n)    | O(m + n log n) |

For the Fibonacci heap, the bounds are amortized. Dijkstra's algorithm performs n Insert, n ExtractMin, and m DecreaseKey operations.

Implementing priority queue: array or linked list

Implementing a priority queue with an unsorted array or unsorted linked list:
- Insert: O(1);
- ExtractMin: O(n).

Note: a list containing n elements is too long to find the minimum efficiently.

Implementing a priority queue with a sorted array or sorted linked list:
- Insert: O(n);
- ExtractMin: O(1).

Note: a list containing n elements is too long to maintain the order among its elements.

Binary heap: from a linked list to a tree

Binary heap

Figure 1: R. W. Floyd [1964]

Binary heap: a complete binary tree

Basic idea: loosen the structure. Recall that the objective is to find the minimum; to achieve this, it is not necessary to sort all elements. But don't loosen it too much: we still need order between some elements.

Binary heap: the elements are stored in a complete binary tree, i.e., a tree that is perfectly balanced except possibly for the bottom level. Heap order is required, i.e., any parent has a key smaller than its children's keys.

Advantage: every root-to-leaf path has short length, O(log_2 n) rather than the n of a linked list, which makes the heap efficient to maintain.

Binary heap: an explicit implementation

Pointer representation: each node keeps pointers to its parent and its two children. The following information is maintained: the number of elements n, and a pointer to the root node.

Note: the last node can be found in O(log n) time.

Binary heap: an implicit implementation

Array representation: there is a one-to-one correspondence between a complete binary tree and an array.

- Binary tree to array: the indices start from 1 for the sake of simplicity, and record the order in which the binary tree is traversed level by level.
- Array to binary tree: the k-th item has two children, located at indices 2k and 2k + 1, and the parent of the k-th item is located at index ⌊k/2⌋.
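The index arithmetic can be sketched directly; this is a minimal illustration of the 1-based scheme above (index 0 is left unused, and the helper names are mine, not from the slides):

```python
# Implicit (array) representation of a complete binary tree, 1-based.
def parent(k):
    return k // 2              # the k-th item's parent sits at floor(k/2)

def children(k):
    return 2 * k, 2 * k + 1    # left child at 2k, right child at 2k + 1

# The heap-ordered array [_, 6, 10, 8, 12, 18, 11, 25] encodes the tree
# level by level: 6 is the root, 10 and 8 are its children, and so on.
heap = [None, 6, 10, 8, 12, 18, 11, 25]
assert heap[parent(5)] <= heap[5]     # 10 <= 18: heap order holds
l, r = children(1)
print(heap[l], heap[r])               # the root's children: 10 8
```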

Sorted array vs. binary heap

Sorted array: an array containing n elements in increasing order. Binary heap: heap order maintains order only among nodes on short paths (of length at most log n). Note that, read as an array, a heap may still contain inverted pairs.

Binary heap: primitive and other operations

Primitive: exchanging nodes to restore heap order

Primitive operation: when heap order is violated, i.e., a parent has a value larger than exactly one of its children, we simply exchange the two to resolve the conflict.

Figure 2: Heap order is violated: 15 > 13. Exchange them to resolve the conflict.

Primitive operation: when heap order is violated, i.e., a parent has a value larger than both of its children, we exchange the parent with its smaller child to resolve the conflict.

Figure 3: Heap order is violated: 20 > 12 and 20 > 18. Exchange 20 with its smaller child (12) to resolve the conflicts.

Binary heap: Insert operation

Insert operation: the element is added as a new node at the end of the heap. Since heap order might now be violated, the node is repeatedly exchanged with its parent until heap order is restored. For example, Insert(7).

Binary heap: ExtractMin operation

ExtractMin operation: exchange the element in the root with the last node and remove the last node; then repeatedly exchange the element in the root with its smaller child until heap order is restored. For example, ExtractMin().

Binary heap: DecreaseKey operation

DecreaseKey operation: given a handle ptr to a node, decrease its key, then repeatedly exchange the node with its parent until heap order is restored. For example, DecreaseKey(ptr, 7).
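The three operations above can be sketched together as a small implicit binary heap. This is a sketch with 0-based indexing (children of i at 2i+1 and 2i+2, rather than the slide's 1-based scheme); the names BinaryHeap, sift_up, and sift_down are mine, not from the slides.

```python
class BinaryHeap:
    """Min-heap stored in an array; children of index i sit at 2i+1 and 2i+2."""

    def __init__(self):
        self.a = []

    def insert(self, key):
        self.a.append(key)               # add as the last node
        self._sift_up(len(self.a) - 1)   # bubble up until heap order holds

    def extract_min(self):
        a = self.a
        a[0], a[-1] = a[-1], a[0]        # exchange root with the last node
        m = a.pop()
        if a:
            self._sift_down(0)           # push the new root down
        return m

    def decrease_key(self, i, key):
        assert key <= self.a[i]
        self.a[i] = key
        self._sift_up(i)                 # a smaller key can only move up

    def _sift_up(self, i):
        a = self.a
        while i > 0 and a[(i - 1) // 2] > a[i]:
            a[(i - 1) // 2], a[i] = a[i], a[(i - 1) // 2]
            i = (i - 1) // 2

    def _sift_down(self, i):
        a, n = self.a, len(self.a)
        while True:
            small = i
            for c in (2 * i + 1, 2 * i + 2):
                if c < n and a[c] < a[small]:
                    small = c            # pick the smaller child
            if small == i:
                return
            a[i], a[small] = a[small], a[i]
            i = small

h = BinaryHeap()
for x in [12, 7, 25, 3]:
    h.insert(x)
print(h.extract_min(), h.extract_min())  # 3 7
```

Each operation walks one root-to-leaf path, hence the O(log n) bounds claimed above.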

Binary heap: analysis

Theorem. In an implicit binary heap, any sequence of m Insert, DecreaseKey, and ExtractMin operations, n of which are Insert operations, takes O(m log n) time.

Note: each operation touches at most log n nodes, along a path from the root to a leaf.

Theorem. In an explicit binary heap with n nodes, each Insert, DecreaseKey, and ExtractMin operation takes O(log n) time in the worst case.

Note: the array representation needs a dynamic array that expands and contracts; however, the total cost of expanding/contracting over the sequence is O(n) (see TableInsert).

Binary heap: heapify a set of items

Question: given a set of n elements, how can we construct a binary heap containing them?

Solutions:
1. Simply Insert the elements one by one; this takes O(n log n) time.
2. Bottom-up heapifying, which takes O(n) time: for i = n down to 1, repeatedly exchange the element at node i with its smaller child until the subtree rooted at node i is heap-ordered.

Theorem. Given n elements, a binary heap over them can be constructed in O(n) time.

Proof. There are at most ⌈n / 2^(h+1)⌉ nodes of height h, and it takes O(h) time to sink a node of height h. The total time is therefore

Σ_{h=0..⌊log_2 n⌋} ⌈n / 2^(h+1)⌉ · O(h) = O(n · Σ_{h≥0} h / 2^h) = O(2n) = O(n).
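The bottom-up construction can be sketched by sinking each internal node in reverse order (a sketch with 0-based indices; the helper names are mine):

```python
def sift_down(a, i, n):
    """Sink a[i] within a[0:n] until its subtree is heap-ordered (0-based)."""
    while True:
        small = i
        for c in (2 * i + 1, 2 * i + 2):
            if c < n and a[c] < a[small]:
                small = c
        if small == i:
            return
        a[i], a[small] = a[small], a[i]
        i = small

def heapify(a):
    """Bottom-up heap construction in O(n): sink each internal node."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # leaves are already heap-ordered
        sift_down(a, i, n)
    return a

print(heapify([9, 4, 7, 1, 3, 8]))   # [1, 3, 7, 4, 9, 8], a valid min-heap
```

Most nodes sit near the leaves and sink only a short distance, which is exactly why the sum in the proof collapses to O(n).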

Implementing priority queue: binary heap

Binary heap: Union operation

Union operation: given two binary heaps H_1 and H_2, merge them into one binary heap. O(n) time is needed if heapify is used. Question: is there a quicker way to union two heaps?

Binomial heap: using multiple trees rather than a single tree to support an efficient Union operation

Binomial heap

Figure 4: Jean Vuillemin [1978]

Binomial heap: efficient Union

Basic idea: loosen the structure. If multiple trees are allowed to represent a heap, then Union can be implemented efficiently by simply putting the trees together. But don't loosen it too much: there should not be too many trees, for otherwise it would take a long time to find the minimum among all the roots.

ExtractMin: simply find the minimum among the root nodes. Note that each root holds the minimum of its tree, due to heap order.

Why can't we loosen the structure too much?

An extreme case of multiple trees: each node is itself a tree. Then it takes O(n) time to find the minimum. Solution: consolidating, i.e., two trees of the same size are merged into one; the root with the larger key is linked under the root with the smaller key, to preserve heap order. Note that after consolidating, at most O(log n) trees are left.

Binomial tree

Definition (Binomial tree). The binomial tree is defined recursively: a single node is itself a B_0 tree, and linking two B_k trees yields a B_{k+1} tree.

Binomial tree examples: B_0, B_1, B_2. (figures)

Binomial tree example: B_3. (figure)

Binomial tree example: B_4. (figure)

Binomial tree example: B_5. (figure)

Binomial tree: properties

Properties:
1. |B_k| = 2^k;
2. height(B_k) = k;
3. degree(B_k) = k;
4. the i-th child of a node has degree i − 1;
5. deleting the root of B_k yields the trees B_0, B_1, ..., B_{k−1}.

The binomial tree is named after the fact that the numbers of nodes at its levels are the binomial coefficients.

Binomial heap: a forest

Definition (Binomial heap). A binomial heap is a collection of binomial trees, where each tree is heap-ordered, and for any k there is either 0 or 1 B_k tree.

Example: (figure)

Binomial heap: properties

Properties:
1. A binomial heap with n nodes contains the binomial tree B_i iff b_i = 1, where b_k b_{k−1} ... b_1 b_0 is the binary representation of n.
2. It has at most ⌊log_2 n⌋ + 1 trees.
3. Its height is at most ⌊log_2 n⌋.

Thus it takes O(log n) time to find the minimum element by checking the roots.
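The correspondence between n and the trees present can be read directly off the binary representation of n (a sketch; the helper name is mine):

```python
def trees_present(n):
    """Ranks k such that a binomial heap with n nodes contains a B_k tree."""
    return [k for k in range(n.bit_length()) if (n >> k) & 1]

# n = 13 = 1101 in binary, so the heap consists of B_0, B_2, and B_3:
print(trees_present(13))            # [0, 2, 3]
# The tree sizes sum back to n: 2^0 + 2^2 + 2^3 = 13
assert sum(2 ** k for k in trees_present(13)) == 13
```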

Union is efficient: example 1

Figure 5: An easy case: no consolidation is needed.

Union is efficient: example 2

Figure 6: Consolidating two B_1 trees into a B_2 tree.
Figure 7: Consolidating two B_2 trees into a B_3 tree.
The two B_3 trees are then consolidated into a B_4 tree.

Time complexity: O(log n), since there are at most O(log n) trees.

Binomial heap: Insert operation

Insert(x)
1: create a B_0 tree for x;
2: while there are two B_k trees for some k do
3:   link them together into one B_{k+1} tree;
4: end while

Figure 8: An easy case: no consolidation is needed.
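Insertion behaves exactly like incrementing a binary counter: a "carry" of two B_k trees keeps merging into a B_{k+1}. A minimal sketch, keeping at most one tree per rank and representing a tree by just its root key and children; the names link, insert, and the dict layout are my own, not from the slides.

```python
def link(t1, t2):
    """Link two binomial trees of equal rank; the larger root goes under the smaller."""
    if t1["key"] > t2["key"]:
        t1, t2 = t2, t1
    t1["children"].append(t2)      # t2 becomes t1's highest-order child
    return t1

def insert(trees, key):
    """trees[k] holds the B_k tree or None. Insert = binary increment with carries."""
    carry, k = {"key": key, "children": []}, 0
    while k < len(trees) and trees[k] is not None:
        carry = link(carry, trees[k])   # two B_k trees -> one B_(k+1) tree
        trees[k] = None
        k += 1
    if k == len(trees):
        trees.append(None)
    trees[k] = carry
    return trees

trees = []
for x in [5, 9, 2]:
    trees = insert(trees, x)
# 3 nodes = binary 11: one B_0 and one B_1 are present
print([t is not None for t in trees])   # [True, True]
```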

Insert operation: example 2

Figure 9: Consolidating two B_0 trees.
Figure 10: Consolidating two B_1 trees.
Figure 11: Consolidating two B_2 trees.
Figure 12: Consolidating two B_3 trees.

Time complexity: O(log n) in the worst case, since there are at most O(log n) trees.

Binomial heap: ExtractMin operation

ExtractMin()
1: find the minimum among all roots;
2: remove the min node (its children become trees of their own);
3: while there are two B_k trees for some k do
4:   link them together into one B_{k+1} tree;
5: end while

ExtractMin operation: an example

Figure 13: The four children of the removed root become trees of their own.
Figures 14–16: Consolidating pairs of trees of equal rank until at most one tree of each rank remains.

Time complexity: O(log n).

Implementing priority queue: binomial heap

Binomial heap: a more accurate amortized analysis

Amortized analysis of Insert

Motivation: if one Insert takes a long time (say, log n), the subsequent Insert operations cannot all take that long. Thus it is more accurate to look at a whole sequence of operations rather than at an individual operation.

Amortized analysis of the Insert operation

Insert(x)
1: create a B_0 tree for x;
2: while there are two B_k trees for some k do
3:   link them together into one B_{k+1} tree;
4: end while

Analysis: let w be the number of WHILE iterations, so an Insert operation takes time 1 + w. Consider the quantity Φ = #trees (called a potential function):
- Φ increase: 1 (the new B_0 tree);
- Φ decrease: w (each link removes one tree).
Running time of Insert: 1 + w = 1 + decrease in Φ.

Amortized analysis of the ExtractMin operation

ExtractMin()
1: find the minimum among all roots;
2: remove the min node;
3: while there are two B_k trees for some k do
4:   link them together into one B_{k+1} tree;
5: end while

Analysis: let w be the number of WHILE iterations; ExtractMin takes d + w time, where d denotes the degree of the removed root. With the same potential function Φ = #trees:
- Φ increase: d (the children of the removed root become trees);
- Φ decrease: w.
Running time: d + w = d + decrease in #trees. Note that d ≤ log_2 n.

Amortized analysis

Consider any sequence of n Insert and m ExtractMin operations. The total running time is at most n + m log n + (total decrease in #trees). Note that the total decrease in #trees cannot exceed the total increase in #trees (the number of trees starts at 0 and is never negative), and the total increase is at most n + m log n. Thus the total time is at most 2n + 2m log n: we say that Insert takes O(1) amortized time and ExtractMin takes O(log n) amortized time.

Definition (Amortized time). For any sequence of n_1 operations of type 1 and n_2 operations of type 2, if the total time is O(n_1 T_1 + n_2 T_2), we say that operation 1 takes T_1 amortized time and operation 2 takes T_2 amortized time.

Intuition of the amortized analysis

The actual running time of an Insert operation is 1 + w; a large w means that this particular Insert takes a long time. But that w time is spent on decreasing #trees, so if the w time is amortized over the operations that created those trees, the amortized time of Insert is only O(1). The actual running time of an ExtractMin operation is at most log n + w; at most log n new trees are created during an ExtractMin, so its amortized time remains O(log n) even after costs are amortized onto it from other operations for the trees it creates.

Implementing priority queue: binomial heap

Binomial heap: DecreaseKey operation

Figure 17: DecreaseKey 17 to 1.

Time: O(log n), since in the worst case we must exchange nodes all the way up to the root. Question: is there a quicker way to decrease a key?

Fibonacci heap: an efficient implementation of DecreaseKey via simply cutting an edge rather than exchanging nodes

Fibonacci heap

Figure 18: Robert Tarjan [1986]

Fibonacci heap: an efficient DecreaseKey operation

Basic idea: loosen the structure. When heap order is violated, a simple solution is to cut off the offending node and insert it into the root list. But don't loosen it too much: cutting makes a tree no longer binomial, and the trees should not deviate from binomial trees too much. The technique used to achieve this is to allow any non-root node to lose at most one child.

Figure 19: Heap order is violated when decreasing 13 to 2. However, heap order is easily restored by cutting off the node and inserting it into the root list.

Fibonacci heap: DecreaseKey

DecreaseKey(v, x)
1: key(v) = x;
2: if heap order is violated then
3:   u = v's parent;
4:   cut the subtree rooted at node v, and insert it into the root list;
5:   while u is marked do
6:     cut the subtree rooted at node u, and insert it into the root list;
7:     unmark u;
8:     u = u's parent;
9:   end while
10:  mark u;
11: end if

DecreaseKey: an example

Figure 20: The original Fibonacci heap.
Figure 21: DecreaseKey 19 to 3.
Figure 22: DecreaseKey 15 to 2.
Figure 23: DecreaseKey 12 to 8.
Figure 24: DecreaseKey 6 to 1.
Figure 25: DecreaseKey 16 to 9.

Fibonacci heap: Insert

Insert(x)
1: create a single-node tree for x, and insert it into the root list;

Note: be lazy! Trees are consolidated only when extracting the minimum.

Figure 26: Insert: creating a new tree and inserting it into the root list.

Fibonacci heap: ExtractMin

ExtractMin()
1: find the minimum among all root nodes;
2: remove the min node;
3: while there are two roots u and v of the same degree do
4:   consolidate the two trees into one;
5: end while

ExtractMin: an example

Figure 27: ExtractMin: removing the min node; its 3 children become trees of their own.
Figure 28: Consolidating the two trees rooted at nodes 1 and 6.
Figure 29: Consolidating the two trees rooted at nodes 1 and 9.
Figure 30: Consolidating the two trees rooted at nodes 1 and 6.
Figure 31: Consolidating the two trees rooted at nodes 3 and 5.
Figure 32: Consolidating the two trees rooted at nodes 2 and 6.
Figure 33: Consolidating the two trees rooted at nodes 2 and 9.
Figure 34: Consolidating the two trees rooted at nodes 2 and 3.
Figure 35: Consolidating the two trees rooted at nodes 1 and 2.

Fibonacci heap: an amortized analysis

Fibonacci heap: DecreaseKey

DecreaseKey(v, x)
1: key(v) = x;
2: if heap order is violated then
3:   u = v's parent;
4:   cut the subtree rooted at node v, and insert it into the root list;
5:   while u is marked do
6:     cut the subtree rooted at node u, and insert it into the root list;
7:     unmark u;
8:     u = u's parent;
9:   end while
10:  mark u;
11: end if

DecreaseKey: analysis

Analysis: the actual running time is 1 + w, where w is the number of WHILE iterations. Consider the potential function Φ = #trees + 2·#marks:
- Φ increase: at most 1 + 2 = 3 (cutting v adds one tree, and marking u adds one mark);
- Φ decrease: each WHILE iteration adds one tree (+1) but removes one mark (−2), a net decrease of 1, so w in total.
Running time: 1 + w = 1 + decrease in Φ.

Intuition: a large w means that this DecreaseKey takes a long time; however, since the w cost can be amortized over other operations, a DecreaseKey operation takes only O(1) amortized time.

Fibonacci heap: ExtractMin

ExtractMin()
1: find the minimum among all root nodes;
2: remove the min node;
3: while there are two roots u and v of the same degree do
4:   consolidate the two trees into one;
5: end while

ExtractMin: analysis

Analysis: the actual running time is d + w, where d denotes the degree of the removed node and w is the number of WHILE iterations. With the potential function Φ = #trees + 2·#marks:
- Φ increase: d (the children of the removed node become trees);
- Φ decrease: w (each consolidation removes one tree).
Running time: d + w = d + decrease in Φ. Note that d ≤ d_max, where d_max denotes the maximum root degree.

Fibonacci heap: Insert

Insert(x)
1: create a tree for x, and insert it into the root list;

Analysis: running time: 1; Φ increase: 1; Φ decrease: 0.

Note: recall that a binomial heap consolidates trees in both Insert and ExtractMin. In contrast, the Fibonacci heap adopts a lazy strategy: tree consolidation is removed from the Insert operation for the sake of efficiency, and no trees are consolidated until an ExtractMin operation.

Fibonacci heap: amortized analysis

Consider any sequence of n Insert, m ExtractMin, and r DecreaseKey operations. The total running time is at most n + m·d_max + r + (total decrease in Φ). Note that the total decrease in Φ is at most the total increase in Φ, which is n + m·d_max + 3r. Thus the total running time is at most n + m·d_max + r + n + m·d_max + 3r = 2n + 2m·d_max + 4r. Hence Insert takes O(1) amortized time, DecreaseKey takes O(1) amortized time, and ExtractMin takes O(d_max) amortized time. In fact, ExtractMin takes O(log n) amortized time, since d_max can be upper-bounded by O(log n). Question: how do we bound d_max?

Fibonacci heap: bounding d_max

Recall that for a binomial tree with n nodes, the root degree d is exactly log_2 n. In contrast, a tree in a Fibonacci heap might have had several subtrees cut off, so its root degree can exceed log_2 of its size. However, the marking technique guarantees that any non-root node loses at most one child, and this suffices to show that d ≤ log_φ n ≈ 1.44 log_2 n, where φ = (1 + √5)/2 ≈ 1.618.

Fibonacci heap: a property of node degrees

Recall that for a binomial tree, the i-th child of each node has degree exactly i − 1. For a tree in a Fibonacci heap, we will show that the i-th child of each node has degree ≥ i − 2.

Lemma. For any node in a Fibonacci heap, the i-th child has degree ≥ i − 2.

Proof. Suppose u is currently the i-th child of w. (If w is not a root node, it has lost at most one child; if w is a root, it might have lost several children.) Consider the time when u was linked to w: at that time, degree(w) ≥ i − 1, and linking joins roots of equal degree, so degree(u) = degree(w) ≥ i − 1. Subsequently, degree(u) can decrease by at most 1, for otherwise u would have been cut off and would no longer be a child of w. Thus degree(u) ≥ i − 2.

The smallest tree with root degree k in a Fibonacci heap

Let F_k be the smallest tree with root degree k in which, for any node, the i-th child has degree ≥ i − 2.

Figure 36: |B_1| = 2^1 and |F_0| = 1 ≥ φ^0.
Figure 37: |B_2| = 2^2 and |F_1| = 2 ≥ φ^1.
Figure 38: |B_3| = 2^3 and |F_2| = 3 ≥ φ^2.
Figure 39: |B_4| = 2^4 and |F_3| = 5 ≥ φ^3.
Figure 40: |B_5| = 2^5 and |F_4| = 8 ≥ φ^4.

General case of trees in a Fibonacci heap

Recall that a binomial tree B_{k+1} is the combination of two B_k trees. In contrast, F_{k+1} is the combination of one F_k tree and one F_{k−1} tree. We will show that although F_k is smaller than B_k, the difference is not too large: in fact, |F_k| ≥ 1.618^k.

Fibonacci numbers and the Fibonacci heap

Definition (Fibonacci numbers). The Fibonacci sequence is 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ... It is defined by the recursion: f_k = 0 if k = 0; f_k = 1 if k = 1; f_k = f_{k−1} + f_{k−2} if k ≥ 2.

Recall that f_{k+2} ≥ φ^k, where φ = (1 + √5)/2 ≈ 1.618. Note that |F_k| = f_{k+2}. Consider a Fibonacci heap H with n nodes, and let T be a tree in H with root degree d. We have n ≥ |T| ≥ |F_d| = f_{d+2} ≥ φ^d. Thus d = O(log_φ n) = O(log n), so d_max = O(log n). Therefore, the ExtractMin operation takes O(log n) amortized time.
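The bound |F_k| = f_{k+2} ≥ φ^k can be checked numerically. Below, size_F computes |F_k| directly from the recurrence |F_{k+1}| = |F_k| + |F_{k−1}| with |F_0| = 1 and |F_1| = 2; this is a sketch, and the names are my own.

```python
def size_F(k):
    """|F_k|: sizes follow the Fibonacci recurrence, |F_0| = 1, |F_1| = 2."""
    a, b = 1, 2
    for _ in range(k):
        a, b = b, a + b
    return a

phi = (1 + 5 ** 0.5) / 2

# |F_k| equals the Fibonacci number f_(k+2) and is at least phi^k:
print([size_F(k) for k in range(6)])        # [1, 2, 3, 5, 8, 13]
assert all(size_F(k) >= phi ** k for k in range(30))
# Hence a root of degree d needs at least phi^d nodes, i.e., d = O(log n).
```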

Implementing priority queue: Fibonacci heap

Time complexity of Dijkstra's algorithm

| Operation   | Linked list | Binary heap | Binomial heap | Fibonacci heap |
|-------------|-------------|-------------|---------------|----------------|
| MakeHeap    | 1           | 1           | 1             | 1              |
| Insert      | 1           | log n       | log n         | 1              |
| ExtractMin  | n           | log n       | log n         | log n          |
| DecreaseKey | 1           | log n       | log n         | 1              |
| Delete      | n           | log n       | log n         | log n          |
| Union       | 1           | n           | log n         | 1              |
| FindMin     | n           | 1           | log n         | 1              |
| Dijkstra    | O(n^2)      | O(m log n)  | O(m log n)    | O(m + n log n) |

For the Fibonacci heap, the bounds are amortized. Dijkstra's algorithm performs n Insert, n ExtractMin, and m DecreaseKey operations.


### Binary Heap Algorithms

CS Data Structures and Algorithms Lecture Slides Wednesday, April 5, 2009 Glenn G. Chappell Department of Computer Science University of Alaska Fairbanks CHAPPELLG@member.ams.org 2005 2009 Glenn G. Chappell

### The Union-Find Problem Kruskal s algorithm for finding an MST presented us with a problem in data-structure design. As we looked at each edge,

The Union-Find Problem Kruskal s algorithm for finding an MST presented us with a problem in data-structure design. As we looked at each edge, cheapest first, we had to determine whether its two endpoints

### Analysis of Algorithms I: Binary Search Trees

Analysis of Algorithms I: Binary Search Trees Xi Chen Columbia University Hash table: A data structure that maintains a subset of keys from a universe set U = {0, 1,..., p 1} and supports all three dictionary

### CSE 326, Data Structures. Sample Final Exam. Problem Max Points Score 1 14 (2x7) 2 18 (3x6) 3 4 4 7 5 9 6 16 7 8 8 4 9 8 10 4 Total 92.

Name: Email ID: CSE 326, Data Structures Section: Sample Final Exam Instructions: The exam is closed book, closed notes. Unless otherwise stated, N denotes the number of elements in the data structure

### Balanced Binary Search Tree

AVL Trees / Slide 1 Balanced Binary Search Tree Worst case height of binary search tree: N-1 Insertion, deletion can be O(N) in the worst case We want a tree with small height Height of a binary tree with

### Binary Search Trees 3/20/14

Binary Search Trees 3/0/4 Presentation for use ith the textbook Data Structures and Algorithms in Java, th edition, by M. T. Goodrich, R. Tamassia, and M. H. Goldasser, Wiley, 04 Binary Search Trees 4

### 1/1 7/4 2/2 12/7 10/30 12/25

Binary Heaps A binary heap is dened to be a binary tree with a key in each node such that: 1. All leaves are on, at most, two adjacent levels. 2. All leaves on the lowest level occur to the left, and all

### Binary Search Trees. A Generic Tree. Binary Trees. Nodes in a binary search tree ( B-S-T) are of the form. P parent. Key. Satellite data L R

Binary Search Trees A Generic Tree Nodes in a binary search tree ( B-S-T) are of the form P parent Key A Satellite data L R B C D E F G H I J The B-S-T has a root node which is the only node whose parent

### Outline BST Operations Worst case Average case Balancing AVL Red-black B-trees. Binary Search Trees. Lecturer: Georgy Gimel farb

Binary Search Trees Lecturer: Georgy Gimel farb COMPSCI 220 Algorithms and Data Structures 1 / 27 1 Properties of Binary Search Trees 2 Basic BST operations The worst-case time complexity of BST operations

### root node level: internal node edge leaf node CS@VT Data Structures & Algorithms 2000-2009 McQuain

inary Trees 1 A binary tree is either empty, or it consists of a node called the root together with two binary trees called the left subtree and the right subtree of the root, which are disjoint from each

### Binary Search Trees > = Binary Search Trees 1. 2004 Goodrich, Tamassia

Binary Search Trees < 6 2 > = 1 4 8 9 Binary Search Trees 1 Binary Search Trees A binary search tree is a binary tree storing keyvalue entries at its internal nodes and satisfying the following property:

### Sorting revisited. Build the binary search tree: O(n^2) Traverse the binary tree: O(n) Total: O(n^2) + O(n) = O(n^2)

Sorting revisited How did we use a binary search tree to sort an array of elements? Tree Sort Algorithm Given: An array of elements to sort 1. Build a binary search tree out of the elements 2. Traverse

### Learning Outcomes. COMP202 Complexity of Algorithms. Binary Search Trees and Other Search Trees

Learning Outcomes COMP202 Complexity of Algorithms Binary Search Trees and Other Search Trees [See relevant sections in chapters 2 and 3 in Goodrich and Tamassia.] At the conclusion of this set of lecture

### Binary Search Trees. Data in each node. Larger than the data in its left child Smaller than the data in its right child

Binary Search Trees Data in each node Larger than the data in its left child Smaller than the data in its right child FIGURE 11-6 Arbitrary binary tree FIGURE 11-7 Binary search tree Data Structures Using

### 1 2-3 Trees: The Basics

CS10: Data Structures and Object-Oriented Design (Fall 2013) November 1, 2013: 2-3 Trees: Inserting and Deleting Scribes: CS 10 Teaching Team Lecture Summary In this class, we investigated 2-3 Trees in

### Questions 1 through 25 are worth 2 points each. Choose one best answer for each.

Questions 1 through 25 are worth 2 points each. Choose one best answer for each. 1. For the singly linked list implementation of the queue, where are the enqueues and dequeues performed? c a. Enqueue in

### Lecture 6: Binary Search Trees CSCI 700 - Algorithms I. Andrew Rosenberg

Lecture 6: Binary Search Trees CSCI 700 - Algorithms I Andrew Rosenberg Last Time Linear Time Sorting Counting Sort Radix Sort Bucket Sort Today Binary Search Trees Data Structures Data structure is a

### Tables so far. set() get() delete() BST Average O(lg n) O(lg n) O(lg n) Worst O(n) O(n) O(n) RB Tree Average O(lg n) O(lg n) O(lg n)

Hash Tables Tables so far set() get() delete() BST Average O(lg n) O(lg n) O(lg n) Worst O(n) O(n) O(n) RB Tree Average O(lg n) O(lg n) O(lg n) Worst O(lg n) O(lg n) O(lg n) Table naïve array implementation

### Lecture 4: Balanced Binary Search Trees

Lecture 4 alanced inar Search Trees 6.006 Fall 009 Lecture 4: alanced inar Search Trees Lecture Overview The importance of being balanced VL trees Definition alance Insert Other balanced trees Data structures

### Binary Search Trees CMPSC 122

Binary Search Trees CMPSC 122 Note: This notes packet has significant overlap with the first set of trees notes I do in CMPSC 360, but goes into much greater depth on turning BSTs into pseudocode than

### CSC148 Lecture 8. Algorithm Analysis Binary Search Sorting

CSC148 Lecture 8 Algorithm Analysis Binary Search Sorting Algorithm Analysis Recall definition of Big Oh: We say a function f(n) is O(g(n)) if there exists positive constants c,b such that f(n)

### Heaps & Priority Queues in the C++ STL 2-3 Trees

Heaps & Priority Queues in the C++ STL 2-3 Trees CS 3 Data Structures and Algorithms Lecture Slides Friday, April 7, 2009 Glenn G. Chappell Department of Computer Science University of Alaska Fairbanks

### Strategic Deployment in Graphs. 1 Introduction. v 1 = v s. v 2. v 4. e 1. e 5 25. e 3. e 2. e 4

Informatica 39 (25) 237 247 237 Strategic Deployment in Graphs Elmar Langetepe and Andreas Lenerz University of Bonn, Department of Computer Science I, Germany Bernd Brüggemann FKIE, Fraunhofer-Institute,

### Union-Find Problem. Using Arrays And Chains

Union-Find Problem Given a set {,,, n} of n elements. Initially each element is in a different set. ƒ {}, {},, {n} An intermixed sequence of union and find operations is performed. A union operation combines

### Algorithms Chapter 12 Binary Search Trees

Algorithms Chapter 1 Binary Search Trees Outline Assistant Professor: Ching Chi Lin 林 清 池 助 理 教 授 chingchi.lin@gmail.com Department of Computer Science and Engineering National Taiwan Ocean University

### Data Structure [Question Bank]

Unit I (Analysis of Algorithms) 1. What are algorithms and how they are useful? 2. Describe the factor on best algorithms depends on? 3. Differentiate: Correct & Incorrect Algorithms? 4. Write short note:

### Data Structures and Algorithm Analysis (CSC317) Intro/Review of Data Structures Focus on dynamic sets

Data Structures and Algorithm Analysis (CSC317) Intro/Review of Data Structures Focus on dynamic sets We ve been talking a lot about efficiency in computing and run time. But thus far mostly ignoring data

### The Goldberg Rao Algorithm for the Maximum Flow Problem

The Goldberg Rao Algorithm for the Maximum Flow Problem COS 528 class notes October 18, 2006 Scribe: Dávid Papp Main idea: use of the blocking flow paradigm to achieve essentially O(min{m 2/3, n 1/2 }

### Krishna Institute of Engineering & Technology, Ghaziabad Department of Computer Application MCA-213 : DATA STRUCTURES USING C

Tutorial#1 Q 1:- Explain the terms data, elementary item, entity, primary key, domain, attribute and information? Also give examples in support of your answer? Q 2:- What is a Data Type? Differentiate

Lecture 10 Union-Find The union-nd data structure is motivated by Kruskal's minimum spanning tree algorithm (Algorithm 2.6), in which we needed two operations on disjoint sets of vertices: determine whether

### Complexity of Union-Split-Find Problems. Katherine Jane Lai

Complexity of Union-Split-Find Problems by Katherine Jane Lai S.B., Electrical Engineering and Computer Science, MIT, 2007 S.B., Mathematics, MIT, 2007 Submitted to the Department of Electrical Engineering

### Data Structures. Jaehyun Park. CS 97SI Stanford University. June 29, 2015

Data Structures Jaehyun Park CS 97SI Stanford University June 29, 2015 Typical Quarter at Stanford void quarter() { while(true) { // no break :( task x = GetNextTask(tasks); process(x); // new tasks may

### Big Data and Scripting. Part 4: Memory Hierarchies

1, Big Data and Scripting Part 4: Memory Hierarchies 2, Model and Definitions memory size: M machine words total storage (on disk) of N elements (N is very large) disk size unlimited (for our considerations)

### A binary search tree is a binary tree with a special property called the BST-property, which is given as follows:

Chapter 12: Binary Search Trees A binary search tree is a binary tree with a special property called the BST-property, which is given as follows: For all nodes x and y, if y belongs to the left subtree

### MATHEMATICAL ENGINEERING TECHNICAL REPORTS. The Best-fit Heuristic for the Rectangular Strip Packing Problem: An Efficient Implementation

MATHEMATICAL ENGINEERING TECHNICAL REPORTS The Best-fit Heuristic for the Rectangular Strip Packing Problem: An Efficient Implementation Shinji IMAHORI, Mutsunori YAGIURA METR 2007 53 September 2007 DEPARTMENT

### Exam study sheet for CS2711. List of topics

Exam study sheet for CS2711 Here is the list of topics you need to know for the final exam. For each data structure listed below, make sure you can do the following: 1. Give an example of this data structure

### Binary Trees and Huffman Encoding Binary Search Trees

Binary Trees and Huffman Encoding Binary Search Trees Computer Science E119 Harvard Extension School Fall 2012 David G. Sullivan, Ph.D. Motivation: Maintaining a Sorted Collection of Data A data dictionary

### In our lecture about Binary Search Trees (BST) we have seen that the operations of searching for and inserting an element in a BST can be of the

In our lecture about Binary Search Trees (BST) we have seen that the operations of searching for and inserting an element in a BST can be of the order O(log n) (in the best case) and of the order O(n)

### Algorithms and Data Structures (INF1) Lecture 14/15 Hua Lu

Algorithms and Data Structures (INF1) Lecture 14/15 Hua Lu Department of Computer Science Aalborg University Fall 2007 This Lecture Shortest paths Problem preliminary Shortest paths in DAG Bellman-Moore

### AS-2261 M.Sc.(First Semester) Examination-2013 Paper -fourth Subject-Data structure with algorithm

AS-2261 M.Sc.(First Semester) Examination-2013 Paper -fourth Subject-Data structure with algorithm Time: Three Hours] [Maximum Marks: 60 Note Attempts all the questions. All carry equal marks Section A

### Data Structure with C

Subject: Data Structure with C Topic : Tree Tree A tree is a set of nodes that either:is empty or has a designated node, called the root, from which hierarchically descend zero or more subtrees, which

### Previous Lectures. B-Trees. External storage. Two types of memory. B-trees. Main principles

B-Trees Algorithms and data structures for external memory as opposed to the main memory B-Trees Previous Lectures Height balanced binary search trees: AVL trees, red-black trees. Multiway search trees:

### Persistent Data Structures and Planar Point Location

Persistent Data Structures and Planar Point Location Inge Li Gørtz Persistent Data Structures Ephemeral Partial persistence Full persistence Confluent persistence V1 V1 V1 V1 V2 q ue V2 V2 V5 V2 V4 V4

### Union-find with deletions

19 Union-find with deletions Haim Kaplan * Nira Shafrir Robert E. Tarjan ~ Abstract In the classical union-find problem we maintain a partition of a universe of n elements into disjoint sets subject to

### Social Media Mining. Graph Essentials

Graph Essentials Graph Basics Measures Graph and Essentials Metrics 2 2 Nodes and Edges A network is a graph nodes, actors, or vertices (plural of vertex) Connections, edges or ties Edge Node Measures

### Merge Sort. 2004 Goodrich, Tamassia. Merge Sort 1

Merge Sort 7 2 9 4 2 4 7 9 7 2 2 7 9 4 4 9 7 7 2 2 9 9 4 4 Merge Sort 1 Divide-and-Conquer Divide-and conquer is a general algorithm design paradigm: Divide: divide the input data S in two disjoint subsets

### An Evaluation of Self-adjusting Binary Search Tree Techniques

SOFTWARE PRACTICE AND EXPERIENCE, VOL. 23(4), 369 382 (APRIL 1993) An Evaluation of Self-adjusting Binary Search Tree Techniques jim bell and gopal gupta Department of Computer Science, James Cook University,

### Classification/Decision Trees (II)

Classification/Decision Trees (II) Department of Statistics The Pennsylvania State University Email: jiali@stat.psu.edu Right Sized Trees Let the expected misclassification rate of a tree T be R (T ).

### Full and Complete Binary Trees

Full and Complete Binary Trees Binary Tree Theorems 1 Here are two important types of binary trees. Note that the definitions, while similar, are logically independent. Definition: a binary tree T is full

### Home Page. Data Structures. Title Page. Page 1 of 24. Go Back. Full Screen. Close. Quit

Data Structures Page 1 of 24 A.1. Arrays (Vectors) n-element vector start address + ielementsize 0 +1 +2 +3 +4... +n-1 start address continuous memory block static, if size is known at compile time dynamic,

### Analysis of Algorithms, I

Analysis of Algorithms, I CSOR W4231.002 Eleni Drinea Computer Science Department Columbia University Thursday, February 26, 2015 Outline 1 Recap 2 Representing graphs 3 Breadth-first search (BFS) 4 Applications

### Tree Properties (size vs height) Balanced Binary Trees

Tree Properties (size vs height) Balanced Binary Trees Due now: WA 7 (This is the end of the grace period). Note that a grace period takes the place of a late day. Now is the last time you can turn this

### A binary search tree or BST is a binary tree that is either empty or in which the data element of each node has a key, and:

Binary Search Trees 1 The general binary tree shown in the previous chapter is not terribly useful in practice. The chief use of binary trees is for providing rapid access to data (indexing, if you will)

### Lecture 7: Approximation via Randomized Rounding

Lecture 7: Approximation via Randomized Rounding Often LPs return a fractional solution where the solution x, which is supposed to be in {0, } n, is in [0, ] n instead. There is a generic way of obtaining

### Approximation Algorithms

Approximation Algorithms or: How I Learned to Stop Worrying and Deal with NP-Completeness Ong Jit Sheng, Jonathan (A0073924B) March, 2012 Overview Key Results (I) General techniques: Greedy algorithms

### International Journal of Software and Web Sciences (IJSWS) www.iasir.net

International Association of Scientific Innovation and Research (IASIR) (An Association Unifying the Sciences, Engineering, and Applied Research) ISSN (Print): 2279-0063 ISSN (Online): 2279-0071 International

### CS104: Data Structures and Object-Oriented Design (Fall 2013) October 24, 2013: Priority Queues Scribes: CS 104 Teaching Team

CS104: Data Structures and Object-Oriented Design (Fall 2013) October 24, 2013: Priority Queues Scribes: CS 104 Teaching Team Lecture Summary In this lecture, we learned about the ADT Priority Queue. A

### Lecture 2 February 12, 2003

6.897: Advanced Data Structures Spring 003 Prof. Erik Demaine Lecture February, 003 Scribe: Jeff Lindy Overview In the last lecture we considered the successor problem for a bounded universe of size u.

### Operations: search;; min;; max;; predecessor;; successor. Time O(h) with h height of the tree (more on later).

Binary search tree Operations: search;; min;; max;; predecessor;; successor. Time O(h) with h height of the tree (more on later). Data strutcure fields usually include for a given node x, the following

### PES Institute of Technology-BSC QUESTION BANK

PES Institute of Technology-BSC Faculty: Mrs. R.Bharathi CS35: Data Structures Using C QUESTION BANK UNIT I -BASIC CONCEPTS 1. What is an ADT? Briefly explain the categories that classify the functions

### External Memory Geometric Data Structures

External Memory Geometric Data Structures Lars Arge Department of Computer Science University of Aarhus and Duke University Augues 24, 2005 1 Introduction Many modern applications store and process datasets

### Algorithms and Data Structures

Algorithms and Data Structures Part 2: Data Structures PD Dr. rer. nat. habil. Ralf-Peter Mundani Computation in Engineering (CiE) Summer Term 2016 Overview general linked lists stacks queues trees 2 2

### Persistent Data Structures

6.854 Advanced Algorithms Lecture 2: September 9, 2005 Scribes: Sommer Gentry, Eddie Kohler Lecturer: David Karger Persistent Data Structures 2.1 Introduction and motivation So far, we ve seen only ephemeral

### Lecture 1: Course overview, circuits, and formulas

Lecture 1: Course overview, circuits, and formulas Topics in Complexity Theory and Pseudorandomness (Spring 2013) Rutgers University Swastik Kopparty Scribes: John Kim, Ben Lund 1 Course Information Swastik

### 6.852: Distributed Algorithms Fall, 2009. Class 2

.8: Distributed Algorithms Fall, 009 Class Today s plan Leader election in a synchronous ring: Lower bound for comparison-based algorithms. Basic computation in general synchronous networks: Leader election

### Symbol Tables. Introduction

Symbol Tables Introduction A compiler needs to collect and use information about the names appearing in the source program. This information is entered into a data structure called a symbol table. The

### TREE BASIC TERMINOLOGIES

TREE Trees are very flexible, versatile and powerful non-liner data structure that can be used to represent data items possessing hierarchical relationship between the grand father and his children and

### Graphs without proper subgraphs of minimum degree 3 and short cycles

Graphs without proper subgraphs of minimum degree 3 and short cycles Lothar Narins, Alexey Pokrovskiy, Tibor Szabó Department of Mathematics, Freie Universität, Berlin, Germany. August 22, 2014 Abstract

### Data storage Tree indexes

Data storage Tree indexes Rasmus Pagh February 7 lecture 1 Access paths For many database queries and updates, only a small fraction of the data needs to be accessed. Extreme examples are looking or updating

### Any two nodes which are connected by an edge in a graph are called adjacent node.

. iscuss following. Graph graph G consist of a non empty set V called the set of nodes (points, vertices) of the graph, a set which is the set of edges and a mapping from the set of edges to a set of pairs

### ! Solve problem to optimality. ! Solve problem in poly-time. ! Solve arbitrary instances of the problem. #-approximation algorithm.

Approximation Algorithms 11 Approximation Algorithms Q Suppose I need to solve an NP-hard problem What should I do? A Theory says you're unlikely to find a poly-time algorithm Must sacrifice one of three

### Dynamic 3-sided Planar Range Queries with Expected Doubly Logarithmic Time

Dynamic 3-sided Planar Range Queries with Expected Doubly Logarithmic Time Gerth Stølting Brodal 1, Alexis C. Kaporis 2, Spyros Sioutas 3, Konstantinos Tsakalidis 1, Kostas Tsichlas 4 1 MADALGO, Department

### On the k-path cover problem for cacti

On the k-path cover problem for cacti Zemin Jin and Xueliang Li Center for Combinatorics and LPMC Nankai University Tianjin 300071, P.R. China zeminjin@eyou.com, x.li@eyou.com Abstract In this paper we

### ! Solve problem to optimality. ! Solve problem in poly-time. ! Solve arbitrary instances of the problem. !-approximation algorithm.

Approximation Algorithms Chapter Approximation Algorithms Q Suppose I need to solve an NP-hard problem What should I do? A Theory says you're unlikely to find a poly-time algorithm Must sacrifice one of

### Online Bipartite Perfect Matching With Augmentations

Online Bipartite Perfect Matching With Augmentations Kamalika Chaudhuri, Constantinos Daskalakis, Robert D. Kleinberg, and Henry Lin Information Theory and Applications Center, U.C. San Diego Email: kamalika@soe.ucsd.edu

### The following themes form the major topics of this chapter: The terms and concepts related to trees (Section 5.2).

CHAPTER 5 The Tree Data Model There are many situations in which information has a hierarchical or nested structure like that found in family trees or organization charts. The abstraction that models hierarchical

### , each of which contains a unique key value, say k i , R 2. such that k i equals K (or to determine that no such record exists in the collection).

The Search Problem 1 Suppose we have a collection of records, say R 1, R 2,, R N, each of which contains a unique key value, say k i. Given a particular key value, K, the search problem is to locate the

### GRAPH THEORY LECTURE 4: TREES

GRAPH THEORY LECTURE 4: TREES Abstract. 3.1 presents some standard characterizations and properties of trees. 3.2 presents several different types of trees. 3.7 develops a counting method based on a bijection

### Ordered Lists and Binary Trees

Data Structures and Algorithms Ordered Lists and Binary Trees Chris Brooks Department of Computer Science University of San Francisco Department of Computer Science University of San Francisco p.1/62 6-0:

### Binary Search Trees. Ric Glassey glassey@kth.se

Binary Search Trees Ric Glassey glassey@kth.se Outline Binary Search Trees Aim: Demonstrate how a BST can maintain order and fast performance relative to its height Properties Operations Min/Max Search

### Sidebar: Data Structures

Sidebar: Data Structures A data structure is a collection of algorithms for storing and retrieving information. The operations that store information are called updates, and the operations that retrieve

### Heap. Binary Search Tree. Heaps VS BSTs. < el el. Difference between a heap and a BST:

Heaps VS BSTs Difference between a heap and a BST: Heap el Binary Search Tree el el el < el el Perfectly balanced at all times Immediate access to maximal element Easy to code Does not provide efficient

### DATA STRUCTURES USING C

DATA STRUCTURES USING C QUESTION BANK UNIT I 1. Define data. 2. Define Entity. 3. Define information. 4. Define Array. 5. Define data structure. 6. Give any two applications of data structures. 7. Give

### Analysis of Algorithms I: Optimal Binary Search Trees

Analysis of Algorithms I: Optimal Binary Search Trees Xi Chen Columbia University Given a set of n keys K = {k 1,..., k n } in sorted order: k 1 < k 2 < < k n we wish to build an optimal binary search

### 5. A full binary tree with n leaves contains [A] n nodes. [B] log n 2 nodes. [C] 2n 1 nodes. [D] n 2 nodes.

1. The advantage of.. is that they solve the problem if sequential storage representation. But disadvantage in that is they are sequential lists. [A] Lists [B] Linked Lists [A] Trees [A] Queues 2. The

### 8.1 Min Degree Spanning Tree

CS880: Approximations Algorithms Scribe: Siddharth Barman Lecturer: Shuchi Chawla Topic: Min Degree Spanning Tree Date: 02/15/07 In this lecture we give a local search based algorithm for the Min Degree

### B-Trees. Algorithms and data structures for external memory as opposed to the main memory B-Trees. B -trees

B-Trees Algorithms and data structures for external memory as opposed to the main memory B-Trees Previous Lectures Height balanced binary search trees: AVL trees, red-black trees. Multiway search trees:

### CIS 700: algorithms for Big Data

CIS 700: algorithms for Big Data Lecture 6: Graph Sketching Slides at http://grigory.us/big-data-class.html Grigory Yaroslavtsev http://grigory.us Sketching Graphs? We know how to sketch vectors: v Mv

### Introduction Advantages and Disadvantages Algorithm TIME COMPLEXITY. Splay Tree. Cheruku Ravi Teja. November 14, 2011

November 14, 2011 1 Real Time Applications 2 3 Results of 4 Real Time Applications Splay trees are self branching binary search tree which has the property of reaccessing the elements quickly that which