Homework #4. CMSC351 - Spring 2013. PRINT Name: ____________ Due: Thu Apr 16th at the start of class

o Grades depend on neatness and clarity.
o Write your answers with enough detail about your approach and the concepts used, so that the grader will be able to understand them easily. You should ALWAYS prove the correctness of your algorithms, either directly or by referring to a proof in the book.
o Write your answers in the spaces provided. If needed, attach other pages.
o Grades will be out of 16. Four problems will be selected and everyone's grade will be based only on those problems. You will also get 4 bonus points for attempting all problems.

1. Given two sets S_1 and S_2, and a real number x, determine whether there exists an element of S_1 and an element of S_2 whose sum is exactly x. The algorithm should run in average time O(n), where n is the total number of elements in both sets. (Try to use the data structures you have learned in the lectures.)

Use a hash table to store all the members of S_1. For every member u of S_2, search for x - u in the table. In a hash table, insertion and search each take O(1) time on average, so the algorithm runs in O(n) time on average.

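To make the idea concrete, here is a minimal Python sketch of this approach; the function name two_set_sum and the example values are chosen just for this illustration:

    def two_set_sum(s1, s2, x):
        """Return True if some a in s1 and b in s2 satisfy a + b == x."""
        table = set(s1)                              # hash all members of S_1; expected O(1) per insert
        return any((x - u) in table for u in s2)     # look up x - u for each u in S_2

    # Example: 4 + 6 = 10, so this prints True.
    print(two_set_sum([1, 4, 7], [2, 6, 9], 10))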

2. [Prob 4.22, Pg 88] Determine the general structure of a binary search tree formed by inserting the numbers 1 to n in order (you may draw a general structure, but explain it properly). What is the height of this tree?

Each newly inserted key is larger than every key already in the tree, so it becomes the right child of the current deepest node. The tree is therefore a single path of length n - 1 going down to the right, and hence its height is also n - 1.

3. [Prob 4.16, Pg 88] Design a data structure that supports the following operations. Each operation should take O(lg n) time, where n is the number of elements in the data structure. Explain the required process for doing each operation and prove that it takes O(lg n) time.
a) Insert(x): Insert the key x into the data structure only if it is not already there.
b) Delete(x): Delete the key x, if it is there!
c) Find_Smallest(k): Find the kth smallest key in the data structure.

A balanced binary search tree (e.g. an AVL tree) already supports the first two operations in O(lg n) time. We augment the tree so that the third operation also takes O(lg n). In each node v of the tree, store a number l_v equal to the number of nodes in the left subtree of v. Insertion and deletion in an AVL tree can easily be adjusted to keep these values up to date. To handle the last operation, we implement a recursive function fs(v, k) which finds the kth smallest element in the subtree rooted at node v; the answer to operation (c) is then fs(root, k). To find the kth smallest element in the subtree of v, we check l_v:
- If l_v = k - 1, then node v itself is the answer to fs(v, k).
- If l_v < k - 1, the answer lies in the right subtree of v, so fs(v, k) = fs(right child of v, k - l_v - 1).
- If l_v > k - 1, the answer lies in the left subtree, so fs(v, k) = fs(left child of v, k).
Each recursive call moves one level down the tree, so fs runs in O(lg n) time on a balanced tree.

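A minimal Python sketch of the fs(v, k) recursion, assuming each node carries a field l with its left-subtree size; the class and field names here are only for illustration, and the AVL rebalancing that maintains l is omitted:

    class Node:
        def __init__(self, key, left=None, right=None):
            self.key, self.left, self.right = key, left, right
            self.l = subtree_size(left)   # number of nodes in the left subtree

    def subtree_size(v):
        return 0 if v is None else 1 + subtree_size(v.left) + subtree_size(v.right)

    def fs(v, k):
        """Return the kth smallest key (1-indexed) in the subtree rooted at v."""
        if v.l == k - 1:                  # exactly k - 1 keys are smaller than v.key
            return v.key
        if v.l < k - 1:                   # answer is in the right subtree
            return fs(v.right, k - v.l - 1)
        return fs(v.left, k)              # l_v >= k: answer is in the left subtree

    # Example: a small tree on keys 1..3; the 2nd smallest key is 2.
    root = Node(2, Node(1), Node(3))
    print(fs(root, 2))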

4. Scheduling jobs with penalties: For each 1 ≤ i ≤ n, job j_i is given by two numbers d_i and p_i, where d_i is the deadline and p_i is the penalty. The length of each job is equal to 1 minute. We want to schedule all jobs, but only one job can run at any given time. If job i does not complete on or before its deadline d_i, we must pay its penalty p_i. Design a greedy algorithm to find a schedule which minimizes the sum of penalties.

Observation: We can assume that all jobs finish within the first n minutes. Suppose not. Then there is an empty minute starting at some 0 ≤ k ≤ n - 1 (where no job is scheduled) and there is a job j_i, for some 1 ≤ i ≤ n, which is scheduled to start at or after minute n. Moving job j_i to start at minute k can only give a better solution, since everything else remains the same and j_i might now be able to meet its deadline.

Divide the first n minutes into time intervals M_1, M_2, ..., M_n, where M_i starts at minute i - 1 and ends at minute i. The greedy algorithm is as follows (a sketch in code appears after the proof):
- Arrange the jobs in decreasing order of penalty, p_1 ≥ p_2 ≥ ... ≥ p_n, and add them in this order.
- To add job j_i: if any time interval M_l with 1 ≤ l ≤ d_i is still available, schedule j_i in the last such available interval; otherwise, schedule j_i in the first available interval found by scanning backwards from M_n.

Let S be the greedy schedule and suppose it schedules job j_i at time t_i for each 1 ≤ i ≤ n. We show by induction that for every 1 ≤ r ≤ n there is an optimal schedule T_r that agrees with S on the schedule of the first r jobs.

Base case (r = 1): Let S' be an optimal schedule, and suppose S' schedules j_1 at time t ≠ t_1. By the observation above, S' must schedule some job j_k at time t_1. Let T_1 be the schedule obtained by swapping the times of j_1 and j_k in S'; note that T_1 makes the greedy choice for j_1. In each of the following three cases, the penalty of T_1 is at most that of S'.
- d_1 ≥ t_1 and j_1 incurs no penalty in S'. Then t ≤ d_1, and since the greedy algorithm chose t_1 as the last available slot not after d_1, we have t_1 > t. In T_1, j_1 still incurs no penalty, and j_k is now scheduled at t, which is earlier than t_1, so its penalty cannot increase.
- d_1 ≥ t_1 and j_1 incurs a penalty in S'. Then t > d_1 ≥ t_1. In T_1 we no longer pay the penalty for j_1, since d_1 ≥ t_1; in the worst case we now pay the penalty p_k for j_k. The net change in penalty is at most p_k - p_1 ≤ 0.
- d_1 < t_1. Then there is no way to schedule j_1 without incurring its penalty, so j_1 pays p_1 in both S' and T_1. Since the greedy algorithm chose t_1 as the last available slot, t_1 > t; in T_1 we schedule j_k at t instead of t_1, which can only reduce the penalty of j_k.

Inductive hypothesis: There is an optimal schedule T_r that agrees with S on the schedule of the first r jobs.

Inductive step: If r = n - 1, then by the observation above job j_n must be scheduled in the only remaining slot among M_1, ..., M_n, so T_{n-1} already agrees with S on all n jobs. If r < n - 1, consider the optimal schedule T_r and suppose it schedules job j_{r+1} at time t ≠ t_{r+1}. By the observation above, T_r schedules some job, say j_h, at time t_{r+1}; since T_r agrees with S on the first r jobs, we have h > r + 1. Let T_{r+1} be the schedule that swaps the times of j_h and j_{r+1}, i.e., schedules j_h at time t and j_{r+1} at time t_{r+1}. Then T_{r+1} agrees with S on the first r + 1 jobs, and an argument very similar to the base case shows that the penalty of T_{r+1} is at most that of T_r. The optimality of T_r therefore implies the optimality of T_{r+1}.

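As referenced above, here is one possible Python sketch of the greedy rule; the quadratic slot scan, the function name, and the (deadline, penalty) input format are choices made only for this illustration:

    def min_penalty_schedule(jobs):
        """jobs: list of (deadline, penalty) pairs for unit-length jobs.
        Returns (total_penalty, slot), where slot[l] is the job run in interval M_(l+1)."""
        n = len(jobs)
        order = sorted(range(n), key=lambda i: -jobs[i][1])   # decreasing penalty
        slot = [None] * n
        total = 0
        for i in order:
            d = jobs[i][0]
            free_by_deadline = [l for l in range(min(d, n)) if slot[l] is None]
            if free_by_deadline:
                slot[free_by_deadline[-1]] = i        # last free slot on time
            else:
                free = [l for l in range(n) if slot[l] is None]
                slot[free[-1]] = i                    # latest free slot; job is late
                total += jobs[i][1]
        # The scans above make this O(n^2); a disjoint-set structure over free
        # slots would speed it up, but is not needed for the greedy argument.
        return total, slot

    # Three unit jobs: the two largest penalties meet their deadlines; total penalty 5.
    print(min_penalty_schedule([(1, 10), (1, 5), (2, 7)]))
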
5. Making Change: Suppose we want to make change for n cents and the only denominations allowed are 1, 10 and 25 cents.
a. Find an example such that the greedy algorithm does not find the minimum number of coins required to make change for n cents (give a concrete counterexample).
b. Give an O(n) dynamic programming algorithm to find the minimum number of coins required to make change for n cents.

a. Take n = 30: the greedy algorithm uses a quarter and then five pennies, for 6 coins, while three dimes make change with only 3 coins.

b. Let C_n be the least number of coins needed to make change for n cents.
- For 1 ≤ n ≤ 9 we have C_n = n, since the only option is to use n pennies.
- For 10 ≤ n ≤ 24 we have C_n = C_{n-10} + 1, since an optimal solution must use a dime.
- For n ≥ 25 we have C_n = min(C_{n-10} + 1, C_{n-25} + 1), since an optimal solution must use either a dime or a quarter.
Each entry is computed in constant time from earlier ones, so filling the table up to n takes O(n) time. There is also a simple O(1) algorithm!

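A small Python sketch of the O(n) table from part b; the recurrence below also allows a penny at every step, which yields the same optimal values as the case analysis above (the function name min_coins is just for this sketch):

    def min_coins(n):
        """Minimum number of coins from {1, 10, 25} cents that sum to n cents."""
        C = [0] * (n + 1)
        for m in range(1, n + 1):
            best = C[m - 1] + 1                    # use a penny
            if m >= 10:
                best = min(best, C[m - 10] + 1)    # or a dime
            if m >= 25:
                best = min(best, C[m - 25] + 1)    # or a quarter
            C[m] = best
        return C[n]

    # The counterexample from part a: greedy uses 6 coins for 30 cents, the DP finds 3.
    print(min_coins(30))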

6. Weighted Interval Scheduling: Consider a set of n intervals where each interval is given by (s_i, t_i), where s_i is the start time and t_i is the finish time. In addition, each interval has a weight w_i. Give a dynamic programming algorithm to find the maximum weight of a non-conflicting set of intervals.

First sort the intervals in increasing order of finish time (and, for computing the p_j below, also record them in order of start time); this takes O(n log n) time. For each 1 ≤ j ≤ n, let p_j (with 0 ≤ p_j ≤ j - 1) denote the largest index i < j such that interval i does not conflict with interval j, i.e., t_i ≤ s_j; set p_j = 0 if no such interval exists. Using the two sorted orders, all the indices p_j can be found in O(n) additional time by a single merging pass. For each 1 ≤ j ≤ n, let OPT_j denote the maximum weight obtainable using only the intervals 1, 2, ..., j, and set OPT_0 = 0. Clearly OPT_1 = w_1. For each 2 ≤ j ≤ n we have

OPT_j = max( OPT_{j-1}, w_j + OPT_{p_j} ),

since either interval j is in the optimal schedule (in which case the maximum weight obtainable from the earlier intervals is OPT_{p_j}), or interval j is not in the optimal schedule (in which case the maximum weight obtainable from the earlier intervals is OPT_{j-1}, by definition). Each OPT_j is found in constant time (a single comparison), so the answer OPT_n can be computed in O(n) time once the p_j are known. The total time is therefore O(n log n) + O(n) + O(n) = O(n log n).

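One possible Python sketch of this recurrence; for brevity it looks up p_j by binary search (bisect) rather than by the merging pass described above, and the function name is chosen just for this sketch:

    from bisect import bisect_right

    def max_weight_schedule(intervals):
        """intervals: list of (start, finish, weight) triples.
        Returns the maximum total weight of a non-conflicting subset."""
        intervals = sorted(intervals, key=lambda iv: iv[1])   # sort by finish time
        finish = [iv[1] for iv in intervals]
        n = len(intervals)
        OPT = [0] * (n + 1)                                   # OPT[0] = 0
        for j in range(1, n + 1):
            s, t, w = intervals[j - 1]
            # p_j = number of earlier intervals whose finish time is <= s
            p = bisect_right(finish, s, 0, j - 1)
            OPT[j] = max(OPT[j - 1], w + OPT[p])
        return OPT[n]

    # Intervals (1,4) and (4,7), weight 5 each, are compatible: total 10 beats the single (3,5) of weight 6.
    print(max_weight_schedule([(1, 4, 5), (3, 5, 6), (4, 7, 5)]))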

7. Scheduling a Class: A professor needs to choose a sequence of locations to conduct a class, one for each day. The days are numbered 1, 2, ..., n. The two possible locations are AVW (A.V. Williams) and CSIC. For each 1 ≤ i ≤ n, the cost of conducting the class in AVW on day i is A_i and the cost of conducting the class in CSIC on day i is C_i. The cost of moving from AVW to CSIC (and vice versa) is some constant M. Give an O(n) algorithm which computes the cost of an optimal schedule of the class.

For each 0 ≤ i ≤ n and X ∈ {AVW, CSIC}, let T(i, X) denote the cost of an optimal schedule for the first i days, given that on day i the class is held in X. Clearly T(0, AVW) = 0 = T(0, CSIC). For each 1 ≤ i ≤ n we have

T(i, AVW) = A_i + min( T(i - 1, AVW), T(i - 1, CSIC) + M )
T(i, CSIC) = C_i + min( T(i - 1, CSIC), T(i - 1, AVW) + M ),

since on day i we pay the cost of the chosen location, plus the moving cost M if the location differs from that of day i - 1. Each of the 2n table entries is computed in constant time, so the running time is O(n). The final answer for our problem is min( T(n, AVW), T(n, CSIC) ). [This problem was taken from the Kleinberg-Tardos book.]

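A compact Python sketch of this recurrence, keeping only the two table entries for the previous day (the function name class_schedule_cost is just for this illustration):

    def class_schedule_cost(A, C, M):
        """A[i], C[i]: costs of holding day i's class in AVW / CSIC; M: moving cost.
        Returns the minimum total cost over all sequences of locations."""
        avw, csic = 0, 0                          # T(0, AVW) = T(0, CSIC) = 0
        for a, c in zip(A, C):
            avw, csic = (a + min(avw, csic + M),  # hold today's class in AVW
                         c + min(csic, avw + M))  # hold today's class in CSIC
        return min(avw, csic)

    # Moving is expensive (M = 10), so staying in AVW every day is optimal: cost 1 + 2 + 1 = 4.
    print(class_schedule_cost([1, 2, 1], [5, 1, 5], 10))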