CS711008Z Algorithm Design and Analysis


CS711008Z Algorithm Design and Analysis. Lecture 7: Binary heap, binomial heap, and Fibonacci heap. Dongbo Bu, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China. The slides were made based on Chapters 19 and 20 of Introduction to Algorithms, Data Structures by Ellis Horowitz et al., a note by Danny Sleator, and the slides by Kevin Wayne, with permission.

Outline
Introduction to priority queues. Various implementations of a priority queue:
- Linked list: a list holding n items is too long to support efficient ExtractMin and Insert operations simultaneously.
- Binary heap: use a tree rather than a linked list.
- Binomial heap: allow multiple trees rather than a single tree, to support an efficient Union operation.
- Fibonacci heap: implement DecreaseKey by simply cutting an edge rather than exchanging nodes, and keep the trees bushy by allowing each non-root node to lose at most one child.

Priority queue

Priority queue: motivation
Motivation: we often need to extract the minimum from a set S of n numbers dynamically. Here, "dynamically" means that we might also perform Insert, Delete, and DecreaseKey operations on S. The question is how to organize the data so that these operations are supported efficiently.

Priority queue
A priority queue is an abstract data type similar to a stack or a queue, except that each element has a priority associated with it. A min-oriented priority queue must support the following core operations:
1: H = MakeHeap(): create a new heap H;
2: Insert(H, x): insert into H an element x together with its priority;
3: ExtractMin(H): remove and return the element with the smallest priority;
4: DecreaseKey(H, x, k): decrease the priority of element x to k;
5: Union(H_1, H_2): return a new heap containing all elements of heaps H_1 and H_2, and destroy the input heaps.
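As a concrete reference point, the interface above can be sketched in Python on top of the standard-library heapq module (itself a binary heap). The class name PQ and the naive O(n) Union are illustrative choices, not part of the lecture; the efficient Union is exactly what binomial and Fibonacci heaps provide later.

```python
import heapq

class PQ:
    """Minimal min-oriented priority queue sketch over Python's heapq.
    Method names mirror the slide's operations; Union is the naive
    O(n) rebuild, not the efficient merge discussed later."""

    def __init__(self):               # MakeHeap()
        self.h = []

    def insert(self, priority, item):  # Insert(H, x)
        heapq.heappush(self.h, (priority, item))

    def extract_min(self):             # ExtractMin(H)
        return heapq.heappop(self.h)

    @staticmethod
    def union(h1, h2):                 # Union(H1, H2), naive O(n) version
        h = PQ()
        h.h = h1.h + h2.h
        heapq.heapify(h.h)
        return h

pq = PQ()
for k, name in [(5, "e"), (1, "a"), (3, "c")]:
    pq.insert(k, name)
print(pq.extract_min())   # (1, 'a')
```

Note that heapq offers no DecreaseKey handle; that gap is one motivation for the heap variants in this lecture.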

Priority queue is very useful
Priority queues have extensive applications, such as:
- Dijkstra's shortest path algorithm
- Prim's MST algorithm
- Huffman coding
- The A* search algorithm
- HeapSort

An example: Dijkstra's algorithm

Dijkstra's algorithm [1959]
Dijkstra(G, s)
1: key(s) = 0; // key(u) stores an upper bound of the shortest-path weight from s to u
2: PQInsert(s);
3: S = {s}; // let S be the set of explored nodes
4: for all nodes v != s do
5:   key(v) = +infinity;
6:   PQInsert(v); // n times
7: end for
8: while S != V do
9:   v = PQExtractMin(); // n times
10:  S = S + {v};
11:  for each w not in S with <v, w> in E do
12:    if key(v) + d(v, w) < key(w) then
13:      PQDecreaseKey(w, key(v) + d(v, w)); // m times
14:    end if
15:  end for
16: end while
Here PQ denotes a min-priority queue.
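The pseudocode above can be sketched in Python with the standard heapq module. Since heapq has no DecreaseKey, this sketch substitutes the common "lazy deletion" technique: push a fresh (key, node) pair and skip stale entries on extraction. The graph format (adjacency dict of (neighbor, weight) lists) is an assumption for illustration.

```python
import heapq

def dijkstra(graph, s):
    """Dijkstra sketch with heapq and lazy deletion instead of
    DecreaseKey; same O(m log n) bound as the binary-heap column.
    graph: {u: [(v, w), ...]} with non-negative weights (assumed)."""
    key = {u: float("inf") for u in graph}
    key[s] = 0
    pq = [(0, s)]
    S = set()                        # explored nodes
    while pq:
        d, v = heapq.heappop(pq)     # ExtractMin
        if v in S:
            continue                 # stale entry, skip
        S.add(v)
        for w, wt in graph[v]:
            if w not in S and d + wt < key[w]:
                key[w] = d + wt
                heapq.heappush(pq, (key[w], w))   # stands in for DecreaseKey
    return key
```

For example, on the toy graph {"s": [("a", 1), ("b", 4)], "a": [("b", 2)], "b": []}, dijkstra(graph, "s") yields distances s=0, a=1, b=3.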

Dijkstra's algorithm: an example

[Figure: a graph on nodes s, a, b, c, d, e, f, t with the edge weights used below]

Initially S = {} and PQ = {s(0), a(inf), b(inf), c(inf), d(inf), e(inf), f(inf), t(inf)}. The run proceeds as follows; each step lists the node returned by ExtractMin, the DecreaseKey calls it triggers, and the resulting state:

ExtractMin returns s; DecreaseKey(a, 9), DecreaseKey(b, 14), DecreaseKey(c, 15). S = {s}, PQ = {a(9), b(14), c(15), d(inf), e(inf), f(inf), t(inf)}
ExtractMin returns a; DecreaseKey(d, 33). S = {s, a}, PQ = {b(14), c(15), d(33), e(inf), f(inf), t(inf)}
ExtractMin returns b; DecreaseKey(d, 32), DecreaseKey(e, 44). S = {s, a, b}, PQ = {c(15), d(32), e(44), f(inf), t(inf)}
ExtractMin returns c; DecreaseKey(e, 35), DecreaseKey(t, 59). S = {s, a, b, c}, PQ = {d(32), e(35), t(59), f(inf)}
ExtractMin returns d; DecreaseKey(e, 34), DecreaseKey(t, 51). S = {s, a, b, c, d}, PQ = {e(34), t(51), f(inf)}
ExtractMin returns e; DecreaseKey(f, 45), DecreaseKey(t, 50). S = {s, a, b, c, d, e}, PQ = {f(45), t(50)}
ExtractMin returns f; S = {s, a, b, c, d, e, f}, PQ = {t(50)}
ExtractMin returns t; S = {s, a, b, c, d, e, f, t}, PQ = {}

Time complexity of Dijkstra's algorithm

Operation     Linked list   Binary heap   Binomial heap   Fibonacci heap
MakeHeap      1             1             1               1
Insert        1             log n         log n           1
ExtractMin    n             log n         log n           log n
DecreaseKey   1             log n         log n           1
Delete        n             log n         log n           log n
Union         1             n             log n           1
FindMin       n             1             log n           1
Dijkstra      O(n^2)        O(m log n)    O(m log n)      O(m + n log n)

Dijkstra's algorithm performs n Insert, n ExtractMin, and m DecreaseKey operations.

Implementing priority queue: array or linked list

Implementing priority queue: unsorted array
Unsorted array: 8 1 2 4. Unsorted linked list: head -> 8 -> 1 -> 2 -> 4.
Operations: Insert: O(1); ExtractMin: O(n).
Note: a list containing n elements is too long to find the minimum efficiently.

Implementing priority queue: sorted array
Sorted array: 1 2 4 8. Sorted linked list: head -> 1 -> 2 -> 4 -> 8.
Operations: Insert: O(n); ExtractMin: O(1).
Note: a list containing n elements is too long to maintain sorted order under insertions.


Binary heap: from a linked list to a tree

Binary heap
Figure 1: R. W. Floyd [1964]

Binary heap: a complete binary tree
Basic idea: loosen the structure. Recall that the objective is to find the minimum; to achieve this, it is not necessary to sort all elements. But don't loosen it too much: we still need order between some elements.
[Figure: a complete binary tree storing the keys 10, 8, 12, 18, 11, 25, 21, 17]
Binary heap: elements are stored in a complete binary tree, i.e., a tree that is perfectly balanced except possibly for the bottom level. Heap order is required, i.e., any parent has a key smaller than its children's keys.
Advantage: any root-to-leaf path has short length O(log_2 n), rather than the n of a linked list, which makes the heap efficient to maintain.

Binary heap: an explicit implementation
Pointer representation: each node has pointers to its parent and its two children. The following information is maintained: the number of elements n, and a pointer to the root node.
[Figure: a heap whose nodes carry lptr/rptr child pointers, with a separate pointer to the root]
Note: the last node can be found in O(log n) time.

Binary heap: an implicit implementation
Array representation: a one-to-one correspondence between a binary tree and an array.
Binary tree to array: the indices start from 1 for the sake of simplicity; the indices record the order in which the binary tree is traversed level by level.
Array to binary tree: the k-th item has two children, located at 2k and 2k + 1; the parent of the k-th item is located at floor(k/2).
[Figure: a complete binary tree and the corresponding array with indices 1 through 11]
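The index arithmetic above is all there is to the implicit representation. A minimal sketch, using hypothetical keys (the slide's exact figure values are not fully recoverable) laid out level by level in a 1-based array:

```python
# 1-based array view of a heap: index k has children 2k and 2k + 1,
# and parent floor(k / 2). Index 0 is unused for simplicity.
# The keys below are illustrative, read level by level.
heap = [None, 1, 5, 3, 8, 9, 4, 7]

def parent(k): return k // 2
def left(k):   return 2 * k
def right(k):  return 2 * k + 1

# Heap order holds: every non-root node's key is >= its parent's key.
n = len(heap) - 1
assert all(heap[parent(k)] <= heap[k] for k in range(2, n + 1))
```

No pointers are stored at all; navigation is pure arithmetic on indices.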

Sorted array vs. binary heap
Sorted array: an array containing n elements in increasing order.
Binary heap: heap order means that order is maintained only among nodes on short paths (of length at most log n). Note that some inverted pairs may exist in the array.
[Figure: the same 11 keys stored as a sorted array and as a heap-ordered array]

Binary heap: primitive and other operations

Primitive: exchanging nodes to restore heap order
Primitive operation: when heap order is violated, i.e., a parent has a value larger than exactly one of its children, we simply exchange the two nodes to resolve the conflict.
Figure 2: Heap order is violated: 15 > 13. Exchange them to resolve the conflict.

Primitive: exchanging nodes to restore heap order
Primitive operation: when heap order is violated, i.e., a parent has a value larger than both of its children, we exchange the parent with its smaller child to resolve the conflict.
Figure 3: Heap order is violated: 20 > 12 and 20 > 18. Exchange 20 with its smaller child (12) to resolve the conflicts.

Binary heap: Insert operation
Insert operation: the element is added as a new node at the end of the bottom level. Since heap order might now be violated, the new node is repeatedly exchanged with its parent until heap order is restored.
Example: Insert(7). [Figure: 7 is appended as the last node and sifted up to its proper position.]

Binary heap: ExtractMin operation
ExtractMin operation: exchange the root element with the last node and remove it; then repeatedly exchange the element now at the root with its smaller child until heap order is restored.
Example: ExtractMin(). [Figure: the root is swapped with the last node 21, removed, and 21 is sifted down.]
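The two operations above can be sketched as a small array-backed min-heap. This is an illustrative 0-based version (children of i at 2i+1 and 2i+2, parent at (i-1)//2), not the slide's 1-based layout:

```python
class BinaryHeap:
    """Array-backed min-heap sketch of Insert (sift up) and
    ExtractMin (swap with last node, then sift down)."""

    def __init__(self):
        self.a = []

    def insert(self, x):
        a = self.a
        a.append(x)                               # add as the last node
        i = len(a) - 1
        while i > 0 and a[(i - 1) // 2] > a[i]:   # sift up past larger parents
            a[i], a[(i - 1) // 2] = a[(i - 1) // 2], a[i]
            i = (i - 1) // 2

    def extract_min(self):
        a = self.a                 # assumes the heap is non-empty
        a[0], a[-1] = a[-1], a[0]  # exchange root with the last node
        m = a.pop()
        i, n = 0, len(a)
        while True:                # sift down, always toward the smaller child
            c = 2 * i + 1
            if c + 1 < n and a[c + 1] < a[c]:
                c += 1
            if c >= n or a[i] <= a[c]:
                break
            a[i], a[c] = a[c], a[i]
            i = c
        return m
```

Inserting 5, 3, 8, 1 and then extracting four times returns the keys in sorted order, since each operation touches only one root-to-leaf path of length O(log n).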

Binary heap: DecreaseKey operation
DecreaseKey operation: given a handle to a node, decrease its key and then repeatedly exchange the node with its parent until heap order is restored.
Example: DecreaseKey(ptr, 7). [Figure: the node referenced by ptr has its key decreased to 7 and is sifted up to the root.]

Binary heap: analysis
Theorem. In an implicit binary heap, any sequence of m Insert, DecreaseKey, and ExtractMin operations containing n Insert operations takes O(m log n) total time.
Note: each operation touches at most log n nodes, on a path from the root to a leaf.
Theorem. In an explicit binary heap with n nodes, the Insert, DecreaseKey, and ExtractMin operations each take O(log n) time in the worst case.
Note: if the array representation is used, a dynamic array that expands/contracts is needed; however, the total cost of array expansion/contraction over the whole sequence is O(n) (see TableInsert).

Binary heap: heapify a set of items
Question: given a set of n elements, how do we construct a binary heap containing them?
Solutions:
1. Simply Insert the elements one by one: takes O(n log n) time.
2. Bottom-up heapifying: takes O(n) time. For i = n down to 1, repeatedly exchange the element at node i with its smaller child until the subtree rooted at node i is heap-ordered.
[Figure: an unordered array viewed as a tree, heapified bottom-up]
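The bottom-up construction can be sketched as follows (0-based indices; nodes past n//2 - 1 are leaves and already heap-ordered, so the loop starts at the last parent):

```python
def sink(a, i, n):
    """Sift a[i] down within a[0:n] until its subtree is heap-ordered."""
    while True:
        c = 2 * i + 1
        if c + 1 < n and a[c + 1] < a[c]:
            c += 1                       # pick the smaller child
        if c >= n or a[i] <= a[c]:
            return
        a[i], a[c] = a[c], a[i]
        i = c

def heapify(a):
    """Bottom-up O(n) construction: sink each node from the last
    parent up to the root."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sink(a, i, n)
```

Most sinks act on low-height subtrees, which is exactly why the total cost telescopes to O(n) in the next slide's proof.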

Binary heap: heapify
Theorem. Given n elements, a binary heap can be constructed in O(n) time.
Proof. There are at most ceil(n / 2^{h+1}) nodes of height h, and it takes O(h) time to sink a node of height h. The total time is therefore
  sum_{h=0}^{floor(log_2 n)} ceil(n / 2^{h+1}) * h  <=  2n * sum_{h >= 0} h / 2^{h+1}  =  O(n),
since sum_{h >= 0} h / 2^h = 2.

Implementing priority queue: binary heap

Binary heap: Union operation
Union operation: given two binary heaps H_1 and H_2, merge them into one binary heap.
[Figure: two binary heaps H_1 and H_2 to be merged]
O(n) time is needed if we use heapify. Question: is there a quicker way to union two heaps?

Binomial heap: using multiple trees rather than a single tree to support an efficient Union operation

Binomial heap
Figure 4: Jean Vuillemin [1978]

Binomial heap: efficient Union
Basic idea: loosen the structure. If multiple trees are allowed to represent a heap, Union can be implemented efficiently by simply putting the trees together. But don't loosen it too much: there should not be too many trees; otherwise it will take a long time to find the minimum among all the root nodes.
[Figure: two heaps H_1 and H_2 whose trees are placed side by side]
ExtractMin: simply find the minimum among the root nodes. Note that each root holds the minimum of its tree, due to heap order.

Why can't we loosen the structure too much?
An extreme case of multiple trees: each node is itself a tree. Then it takes O(n) time to find the minimum.
Solution: consolidating, i.e., two trees of the same size are merged into one; the tree with the larger root is linked under the one with the smaller root, to keep heap order. Note that after consolidating, at most log n trees remain.
[Figure: repeated link operations merging equal-size single-node trees]

Binomial tree
Definition (Binomial tree). The binomial tree is defined recursively: a single node is itself a B_0 tree, and two B_k trees are linked into a B_{k+1} tree.
[Figure: two B_k trees linked to form B_{k+1}]

Binomial tree examples: B_0, B_1, B_2 [figure]

Binomial tree example: B_3 [figure]

Binomial tree example: B_4 [figure]

Binomial tree example: B_5, formed by linking two B_4 trees [figure]

Binomial tree: properties
Properties:
1. |B_k| = 2^k;
2. height(B_k) = k;
3. degree(B_k) = k (the degree of the root);
4. the i-th child of a node has degree i - 1;
5. deleting the root yields the trees B_0, B_1, ..., B_{k-1}.
The binomial tree is named after the fact that the numbers of nodes at the levels of B_k are the binomial coefficients.
[Figure: B_3, and B_{k+1} decomposed into B_0, B_1, ..., B_k hanging off the root]

Binomial heap: a forest
Definition (Binomial heap). A binomial heap is a collection of several binomial trees: each tree is heap-ordered, and there is either 0 or 1 tree B_k for any k.
Example: a binomial heap consisting of trees B_0, B_1, B_2, and B_3. [figure]

Binomial heap: properties
Properties:
1. A binomial heap with n nodes contains the binomial tree B_i iff b_i = 1, where b_k b_{k-1} ... b_1 b_0 is the binary representation of n;
2. it has at most floor(log_2 n) + 1 trees;
3. its height is at most floor(log_2 n).
Thus it takes O(log n) time to find the minimum element by checking the roots. [figure]
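Property 1 makes the shape of a binomial heap a pure function of n. A small sketch (the function name is illustrative) that reads off the tree degrees from the set bits of n:

```python
def binomial_degrees(n):
    """Degrees k of the trees B_k in a binomial heap holding n nodes:
    exactly the set bits of n, since |B_k| = 2^k and each degree
    appears at most once."""
    return [k for k in range(n.bit_length()) if (n >> k) & 1]

# n = 13 = 1101 in binary, so the heap has trees B_0, B_2, B_3
# (1 + 4 + 8 = 13 nodes).
print(binomial_degrees(13))   # [0, 2, 3]
```

The list has at most floor(log_2 n) + 1 entries, matching property 2, and summing 2^k over the returned degrees always recovers n.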

Union is efficient: example 1
Figure 5: An easy case: no consolidating is needed; the trees of the two heaps are simply placed side by side. [figure]

Union is efficient: example 2 (I)
Figure 6: Consolidating two B_1 trees into a B_2 tree. [figure]

Union is efficient: example 2 (II)
Figure 7: Consolidating two B_2 trees into a B_3 tree. [figure]

Union is efficient: example 2 (III)
[Figure: the resulting heap after the final consolidation into a B_4 tree]
Time complexity: O(log n), since there are at most O(log n) trees.

Binomial heap: Insert operation
Insert(x)
1: create a B_0 tree for x;
2: while there are two B_k trees for some k do
3:   link them together into one B_{k+1} tree;
4: end while
Figure 8: An easy case: no consolidating is needed. [figure]

Insert operation: example 2
Inserting into a heap whose trees include B_0, B_1, B_2, and B_3 triggers a cascade of consolidations: the new B_0 is linked with the existing B_0 into a B_1 (Figure 9), the two B_1 trees into a B_2 (Figure 10), the two B_2 trees into a B_3 (Figure 11), and the two B_3 trees into a B_4 (Figure 12).
Time complexity: O(log n) in the worst case, since there are at most log n trees.

Binomial heap: ExtractMin operation
ExtractMin()
1: find the minimum among all roots;
2: remove the min node, turning its children into trees in the root list;
3: while there are two B_k trees for some k do
4:   link them together into one B_{k+1} tree;
5: end while
[Figure: a heap with trees B_0 and B_4, where the minimum 2 is the root of B_4]

ExtractMin operation: an example
Removing the min root of the B_4 tree leaves its four children as trees B_0, B_1, B_2, B_3 in the root list (Figure 13). Consolidation then links equal-degree trees step by step: the two B_0 trees, then the two B_1 trees (Figure 14), the two B_2 trees (Figure 15), and the two B_3 trees (Figure 16), until a single B_4 remains.
Time complexity: O(log n).

Implementing priority queue: binomial heap

Binomial heap: a more accurate amortized analysis

Amortized analysis of Insert
Motivation: if one Insert takes a long time (say log n), the subsequent Insert operations cannot all take long. It is therefore more accurate to look at a sequence of operations rather than an individual operation.

Amortized analysis of the Insert operation
Insert(x)
1: create a B_0 tree for x;
2: while there are two B_k trees for some k do
3:   link them together into one B_{k+1} tree;
4: end while
Analysis: let w be the number of WHILE-loop iterations; an Insert operation thus takes time 1 + w. Consider the quantity Phi = #trees (called a potential function):
- Phi increases by 1 (the new B_0 tree);
- Phi decreases by w (each link removes one tree).
Running time of Insert: 1 + w = 1 + (decrease in Phi).

Amortized analysis of ExtractMin
ExtractMin()
1: find the minimum among all roots;
2: remove the min node;
3: while there are two B_k trees for some k do
4:   link them together into one B_{k+1} tree;
5: end while
Analysis: let w be the number of WHILE-loop iterations; ExtractMin thus takes d + w time, where d denotes the degree of the removed root. With the potential function Phi = #trees:
- Phi increases by d (the children become trees);
- Phi decreases by w.
Running time: d + w = d + (decrease in #trees). Note that d <= log n.

Amortized analysis
Consider any sequence of n Insert and m ExtractMin operations. The total running time is at most n + m log n + (total decrease in #trees). Note that the total decrease in #trees is at most the total increase in #trees (a tree must be created before it can be removed), which is at most n + m log n. Thus the total time is at most 2n + 2m log n. We say that Insert takes O(1) amortized time and ExtractMin takes O(log n) amortized time.
Definition (Amortized time). For any sequence of n_1 operation-1's and n_2 operation-2's, if the total time is O(n_1 T_1 + n_2 T_2), we say that operation 1 takes T_1 amortized time and operation 2 takes T_2 amortized time.
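The O(1) amortized bound for Insert can be checked empirically: tracking only the set of tree degrees (which behaves exactly like a binary counter), the total number of link operations over n Inserts stays below n. This simulation and its function name are illustrative:

```python
def count_links(n):
    """Simulate n Inserts into an empty binomial heap, tracking only
    tree degrees (a binary counter). Returns the total number of
    link (consolidation) operations performed."""
    trees = set()            # degrees k for which a B_k is present
    links = 0
    for _ in range(n):
        k = 0                # each Insert creates a new B_0
        while k in trees:    # two B_k trees: link them into one B_{k+1}
            trees.remove(k)
            links += 1
            k += 1
        trees.add(k)
    return links
```

For every n the total is n minus the number of set bits of n, so it is strictly less than n: each Insert costs O(1) amortized even though an individual Insert may perform log n links.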

Intuition of the amortized analysis
The actual running time of an Insert operation is 1 + w. A large w means that the Insert takes a long time; but that w time is spent on decreasing #trees. Thus, if the w time is charged to the operations that created those trees, the amortized time of an Insert operation is only O(1).
The actual running time of an ExtractMin operation is at most log n + w. At most log n new trees are created during an ExtractMin; thus its amortized time is still O(log n), even after some costs are charged to it from other operations for the trees it creates.

Implementing priority queue: Binomial heap

Binomial heap: DecreaseKey operation

Figure 17: DecreaseKey: 17 to 1 (in a B_4 tree)

Time: O(log n), since in the worst case we need to exchange nodes all the way up to the root.
Question: is there a quicker way to decrease a key?
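The "exchange up to the root" step can be sketched as a key sift-up. A minimal illustration (the Node class and key-swapping style are my assumptions, not the slides' implementation):

```python
class Node:
    def __init__(self, key, parent=None):
        self.key = key
        self.parent = parent

def decrease_key(node, x):
    """Binomial-heap DecreaseKey: sift the decreased key up toward the
    root, swapping keys with the parent while heap order is violated;
    O(log n) time since a binomial tree has height log2 n."""
    node.key = x
    while node.parent is not None and node.key < node.parent.key:
        node.key, node.parent.key = node.parent.key, node.key
        node = node.parent

# a root-to-leaf path 1 - 5 - 9; decreasing 9 to 0 bubbles it to the root
root = Node(1)
mid = Node(5, root)
leaf = Node(9, mid)
decrease_key(leaf, 0)
```

Swapping keys (rather than relinking nodes) keeps the tree shape intact, which is why the cost is bounded by the tree height.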

Fibonacci heap: an efficient implementation of DecreaseKey via simply cutting an edge rather than exchanging nodes

Fibonacci heap

Figure 18: Robert Tarjan [198]

Fibonacci heap: an efficient DecreaseKey operation

Basic idea: loosen the structure. When heap order is violated, a simple solution is to cut off a node and insert it into the root list.
But don't loosen it too much: the cutting-off operation makes a tree no longer binomial; however, it should not deviate from a binomial tree too much. A technique to achieve this objective is to allow any non-root node to lose at most one child.

Figure 19: Heap order is violated when DecreaseKey changes 13 to 2; however, heap order is easily restored by cutting off the node and inserting it into the root list

Fibonacci heap: DecreaseKey

DecreaseKey(v, x)
1: key(v) = x;
2: if heap order is violated then
3:   u = v's parent;
4:   cut the subtree rooted at node v, and insert it into the root list;
5:   while u is marked do
6:     cut the subtree rooted at node u, and insert it into the root list;
7:     unmark u;
8:     u = u's parent;
9:   end while
10:  mark u;
11: end if
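The cascading cut can be sketched directly from this pseudocode. A minimal Python model (the Node class and explicit root list are my assumptions for illustration; a real Fibonacci heap uses circular doubly linked lists):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []
        self.marked = False

def cut(v, root_list):
    """Detach v from its parent, unmark it, and move it to the root list."""
    p = v.parent
    p.children.remove(v)
    v.parent = None
    v.marked = False
    root_list.append(v)
    return p

def decrease_key(v, x, root_list):
    """DecreaseKey with cascading cuts, following the pseudocode above."""
    v.key = x
    if v.parent is not None and v.key < v.parent.key:  # heap order violated
        u = cut(v, root_list)          # cut the subtree rooted at v
        while u.marked:                # cascading cut of marked ancestors
            u = cut(u, root_list)
        if u.parent is not None:       # root nodes are never marked
            u.marked = True

# root(1) - a(5, already marked) - b(7); decreasing b triggers a cascade
root, a, b = Node(1), Node(5), Node(7)
a.parent, root.children = root, [a]
b.parent, a.children = a, [b]
a.marked = True
roots = [root]
decrease_key(b, 0, roots)
```

Because a was already marked (it had lost one child before), cutting b forces a to be cut as well, so both land in the root list.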

DecreaseKey: an example I

Figure 20: The original Fibonacci heap

DecreaseKey: an example II

Figure 21: DecreaseKey: 19 to 3

DecreaseKey: an example III

Figure 22: DecreaseKey: 15 to 2

DecreaseKey: an example IV

Figure 23: DecreaseKey: 12 to 8

DecreaseKey: an example V

Figure 24: DecreaseKey: to 1

DecreaseKey: an example VI

Figure 25: DecreaseKey: 1 to 9

Fibonacci heap: Insert

Insert(x)
1: create a tree for x, and insert it into the root list;

Note: being lazy! Trees are consolidated only when extracting the minimum.

Figure 26: Insert: creating a new tree, and inserting it into the root list
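The lazy Insert is one line of work. A sketch (my own (key, children) representation of a tree, not the slides' code):

```python
def insert(root_list, x):
    """Lazy Insert: O(1) - just add a new single-node tree, represented
    as a (key, children) pair, to the root list. No consolidation here;
    that work is deferred to ExtractMin."""
    root_list.append((x, []))

roots = []
for key in [3, 1, 2]:
    insert(roots, key)
# three one-node trees; the minimum root is found by scanning the list
```

Each Insert raises the potential Φ by exactly one tree, which is how the deferred consolidation work is paid for later.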

Fibonacci heap: ExtractMin

ExtractMin()
1: find the min of all root nodes;
2: remove the min node;
3: while there are two roots u and v of the same degree do
4:   consolidate the two trees together;
5: end while
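The consolidation loop is usually implemented with an array or map indexed by degree. A sketch under the same assumed [key, children] tree representation (not the slides' code):

```python
def link(t1, t2):
    """Link two trees of equal degree: the larger root becomes a child
    of the smaller, preserving heap order."""
    if t1[0] > t2[0]:
        t1, t2 = t2, t1
    t1[1].append(t2)
    return t1

def extract_min(roots):
    """ExtractMin: trees are [key, children] pairs, and a tree's degree
    is len(children). Children of the removed minimum join the root
    list, then equal-degree roots are consolidated pairwise."""
    m = min(roots, key=lambda t: t[0])
    roots.remove(m)
    roots = roots + m[1]                 # children of the min become roots
    by_degree = {}
    for t in roots:
        while len(t[1]) in by_degree:    # two roots of the same degree
            t = link(t, by_degree.pop(len(t[1])))
        by_degree[len(t[1])] = t
    return m[0], list(by_degree.values())

# four lazily inserted one-node trees; extracting 1 consolidates the rest
key, roots = extract_min([[3, []], [1, []], [5, []], [2, []]])
```

After the extraction, at most one root per degree remains, so the number of roots is bounded by the maximum degree.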

ExtractMin: an example I

Figure 27: ExtractMin: removing the min node, and adding its 3 subtrees to the root list

ExtractMin: an example II

Figure 28: ExtractMin: consolidating two trees rooted at nodes 1 and 8

Figure 29: ExtractMin: consolidating two trees rooted at nodes 1 and 9

ExtractMin: an example III

Figure 30: ExtractMin: consolidating two trees rooted at nodes 1 and 7

Figure 31: ExtractMin: consolidating two trees rooted at nodes 3 and 5

ExtractMin: an example IV

Figure 32: ExtractMin: consolidating two trees rooted at nodes 2 and 13

Figure 33: ExtractMin: consolidating two trees rooted at nodes 2 and 9

ExtractMin: an example V

Figure 34: ExtractMin: consolidating two trees rooted at nodes 2 and 3

ExtractMin: an example VI

Figure 35: ExtractMin: consolidating two trees rooted at nodes 1 and 2

Fibonacci heap: an amortized analysis

Fibonacci heap: DecreaseKey

DecreaseKey(v, x)
1: key(v) = x;
2: if heap order is violated then
3:   u = v's parent;
4:   cut the subtree rooted at node v, and insert it into the root list;
5:   while u is marked do
6:     cut the subtree rooted at node u, and insert it into the root list;
7:     unmark u;
8:     u = u's parent;
9:   end while
10:  mark u;
11: end if

DecreaseKey: analysis

Analysis:
The actual running time is 1 + w, where w = #WHILE iterations.
Consider a potential function Φ = #trees + 2 #marks.
Φ increase: at most 1 + 2 = 3 (cutting v adds one tree, and one node gets marked).
Φ decrease: each WHILE iteration adds one tree but removes one mark, so Φ changes by (1 − 2·1) per iteration, i.e., Φ decreases by w in total.
Running time: 1 + w = 1 + decrease in Φ.
Intuition: a large w means that DecreaseKey takes a long time; however, if we can amortize w over other operations, a DecreaseKey operation takes only O(1) amortized time

Fibonacci heap: ExtractMin

ExtractMin()
1: find the min of all root nodes;
2: remove the min node;
3: while there are two roots u and v of the same degree do
4:   consolidate the two trees into one;
5: end while

ExtractMin: analysis

Analysis:
The actual running time is d + w, where d denotes the degree of the removed node, and w = #WHILE iterations.
Consider a potential function Φ = #trees + 2 #marks.
Φ increase: d; Φ decrease: w.
Running time: d + w = d + decrease in Φ.
Note: d ≤ d_max, where d_max denotes the maximum root node degree

Fibonacci heap: Insert

Insert(x)
1: create a tree for x, and insert it into the root list;

Analysis: running time: 1; Φ increase: 1; Φ decrease: 0.
Note: recall that a binomial heap consolidates trees in both the Insert and ExtractMin operations. In contrast, the Fibonacci heap adopts the strategy of being lazy: tree consolidation is removed from the Insert operation for the sake of efficiency, and no trees are consolidated until an ExtractMin operation

Fibonacci heap: amortized analysis

Consider any sequence of n Insert, m ExtractMin, and r DecreaseKey operations.
The total running time is at most n + m d_max + r + total decrease in Φ.
Note: total decrease in Φ ≤ total increase in Φ ≤ n + m d_max + 3r.
Thus the total running time is at most n + m d_max + r + n + m d_max + 3r = 2n + 2m d_max + 4r.
Thus Insert takes O(1) amortized time, DecreaseKey takes O(1) amortized time, and ExtractMin takes O(d_max) amortized time.
In fact, ExtractMin takes O(log n) amortized time, since d_max can be upper-bounded by O(log n).
Question: how to bound d_max?

Fibonacci heap: bounding d_max

Fibonacci heap: bounding d_max

Recall that for a binomial tree having n nodes, the root degree d is exactly log_2 n, i.e., d = log_2 n.
In contrast, a tree in a Fibonacci heap might have had several subtrees cut off, so d = log_2 n no longer holds.
However, the marking technique guarantees that any non-root node can lose at most one child; thus we can show that d ≤ log_φ n, where φ = (1+√5)/2 ≈ 1.618

Fibonacci heap: a property of node degree

Recall that for a binomial tree, the i-th child of each node has a degree of exactly i − 1.
For a tree in a Fibonacci heap, we will show that the i-th child of each node has degree ≥ i − 2

Lemma
For any node in a Fibonacci heap, the i-th child has a degree ≥ i − 2.

Proof
Suppose u is the current i-th child of w. If w is not a root node, it has lost at most 1 child; otherwise, it might have lost multiple children.
Consider the time when u was linked to w. At that time, degree(w) ≥ i − 1, so degree(u) = degree(w) ≥ i − 1.
Subsequently, degree(u) decreased by at most 1 (otherwise, u would have been cut off and would no longer be a child of w). Thus, degree(u) ≥ i − 2

The smallest tree with root degree k in a Fibonacci heap

Let F_k be the smallest tree with root degree k such that, for any node of F_k, the i-th child has degree ≥ i − 2.

Figure 36: |B_1| = 2^1 and |F_0| = 1 ≥ φ^0

Example: B_2 versus F_1

Let F_k be the smallest tree with root degree k such that, for any node of F_k, the i-th child has degree ≥ i − 2.

Figure 37: |B_2| = 2^2 and |F_1| = 2 ≥ φ^1

Example: B_3 versus F_2

Let F_k be the smallest tree with root degree k such that, for any node of F_k, the i-th child has degree ≥ i − 2.

Figure 38: |B_3| = 2^3 and |F_2| = 3 ≥ φ^2

Example: B_4 versus F_3

Figure 39: |B_4| = 2^4 and |F_3| = 5 ≥ φ^3

Example: B_5 versus F_4

Figure 40: |B_5| = 2^5 and |F_4| = 8 ≥ φ^4

General case of trees in a Fibonacci heap

Recall that a binomial tree B_{k+1} is the combination of two B_k trees.
In contrast, F_{k+1} is the combination of one F_k tree and one F_{k−1} tree.
We will show that although F_k is smaller than B_k, the difference is not too large: in fact, |F_k| ≥ 1.618^k

Fibonacci numbers and Fibonacci heap

Definition (Fibonacci numbers)
The Fibonacci sequence is 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ... It can be defined by the recursion:
f_k = 0 if k = 0; f_k = 1 if k = 1; f_k = f_{k−1} + f_{k−2} if k ≥ 2

Recall that f_{k+2} ≥ φ^k, where φ = (1+√5)/2 ≈ 1.618. Note that |F_k| = f_{k+2}.
Consider a Fibonacci heap H having n nodes. Let T denote a tree in H with root degree d. We have n ≥ |T| ≥ |F_d| = f_{d+2} ≥ φ^d.
Thus d = O(log_φ n) = O(log n). So d_max = O(log n). Therefore, the ExtractMin operation takes O(log n) amortized time
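The recursion |F_{k+1}| = |F_k| + |F_{k−1}| and the bound |F_k| = f_{k+2} ≥ φ^k can be checked numerically. A small sketch (my own helper, matching the sizes in Figures 36-40):

```python
def smallest_tree_sizes(n):
    """|F_0|, |F_1|, ...: sizes of the smallest possible trees of each
    root degree in a Fibonacci heap, via |F_{k+1}| = |F_k| + |F_{k-1}|
    with |F_0| = 1 and |F_1| = 2 (so |F_k| = f_{k+2})."""
    sizes = [1, 2]
    while len(sizes) < n:
        sizes.append(sizes[-1] + sizes[-2])
    return sizes[:n]

phi = (1 + 5 ** 0.5) / 2      # the golden ratio, about 1.618
sizes = smallest_tree_sizes(30)
grows_fast = all(s >= phi ** k for k, s in enumerate(sizes))
```

Since a tree of root degree d must contain at least φ^d nodes, the degree of any root in an n-node heap is at most log_φ n.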

Implementing priority queue: Fibonacci heap

Time complexity of Dijkstra algorithm

Operation     Linked list   Binary heap   Binomial heap   Fibonacci heap
MakeHeap      1             1             1               1
Insert        1             log n         log n           1
ExtractMin    n             log n         log n           log n
DecreaseKey   1             log n         log n           1
Delete        n             log n         log n           log n
Union         1             n             log n           1
FindMin       n             1             log n           1
Dijkstra      O(n^2)        O(m log n)    O(m log n)      O(m + n log n)

Dijkstra algorithm: n Insert, n ExtractMin, and m DecreaseKey operations
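The binary-heap column of the table can be illustrated with Python's heapq. Since heapq offers no DecreaseKey, this sketch (my own example graph and code, not from the slides) substitutes the standard lazy-deletion trick, which preserves the O(m log n) bound:

```python
import heapq

def dijkstra(adj, s):
    """Dijkstra with a binary heap (heapq). Stale queue entries stand in
    for DecreaseKey: an outdated (distance, node) pair is skipped when
    popped, keeping the O(m log n) binary-heap bound."""
    dist = {s: 0}
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue                       # stale entry: already improved
        for v, w in adj.get(u, []):
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w            # the DecreaseKey moment
                heapq.heappush(pq, (d + w, v))
    return dist

g = {'a': [('b', 1), ('c', 4)], 'b': [('c', 2)]}
dist = dijkstra(g, 'a')
```

With a Fibonacci heap, the m DecreaseKey calls cost O(1) amortized each instead of O(log n), which is exactly the O(m + n log n) entry in the last column.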