Algorithm Design and Analysis
Asst. Prof. Ali Kadhum Idrees
Department of Computer Science, College of Science for Women
The Sum-of-Subsets Problem


The Sum-of-Subsets Problem

In this problem, there is a set of items the thief can steal, and each item has its own weight and profit. The thief's knapsack will break if the total weight of the items in it exceeds W. Therefore, the goal is to maximize the total value of the stolen items while keeping the total weight from exceeding W.

Suppose here that the items all have the same profit per unit weight. Then an optimal solution for the thief would simply be a set of items that maximizes the total weight, subject to the constraint that its total weight does not exceed W. The thief might first try to determine whether there is a set whose total weight equals W, because this would be best. The problem of determining such sets is called the Sum-of-Subsets problem.

Specifically, in the Sum-of-Subsets problem there are n positive integers (weights) wi and a positive integer W. The goal is to find all subsets of the integers that sum to W. As mentioned earlier, we usually state our problems so as to find all solutions. For the purposes of the thief's application, however, only one solution need be found.

Example 1: This instance can be solved by inspection. For larger values of n, a systematic approach is necessary. One approach is to create a state space tree. A possible way to structure the tree appears in Figure (1). For the sake of simplicity, the tree in this figure is for only three weights. We go to the left from the root to include w1, and we go to the right to exclude w1. Similarly, we go to the left from a node at level 1 to include w2, and we go to the right to exclude w2, and so on. Each subset is represented by a path from the root to a leaf. When we include wi, we write wi on the edge where we include it. When we do not include wi, we write 0.

Figure (1): A state space tree for instances of the Sum-of-Subsets problem in which n = 3.

Example 2: Figure 2 shows the state space tree for an instance with n = 3 and W = 6.

Figure 2: A state space tree for the Sum-of-Subsets problem for the instance in Example 2. Stored at each node is the total weight included up to that node.

At each node, we have written the sum of the weights that have been included up to that point. Therefore, each leaf contains the sum of the weights in the subset leading to that leaf. The second leaf from the left is the only one containing a 6. Because the path to this leaf represents the subset {w1, w2}, this subset is the only solution.

If we sort the weights in nondecreasing order before doing the search, there is an obvious sign telling us that a node is nonpromising. If the weights are sorted in this manner, then wi+1 is the lightest weight remaining when we are at the ith level. Let weight be the sum of the weights that have been included up to a node at level i. If wi+1 would bring the value of weight above W, then so would any other weight following it. Therefore, unless weight equals W (which means that there is a solution at the node), a node at the ith level is nonpromising if

    weight + wi+1 > W

There is another, less obvious sign telling us that a node is nonpromising. If, at a given node, adding all the weights of the remaining items to weight does not make weight at least equal to W, then weight could never become equal to W by expanding beyond the node. This means that if total is the total weight of the remaining weights, a node is nonpromising if

    weight + total < W

The following example illustrates these backtracking strategies.

Example 4: Figure 3 shows the pruned state space tree when backtracking is used with n = 4, W = 13, and

    w1 = 3, w2 = 4, w3 = 5, w4 = 6

Figure 3: The pruned state space tree produced using backtracking in Example 4. Stored at each node is the total weight included up to that node. The only solution is found at the shaded node. Each nonpromising node is marked with a cross.

The only solution is found at the node shaded in color. The solution is {w1, w2, w4}. The nonpromising nodes are marked with crosses. The nodes containing 12, 8, and 9 are nonpromising because adding the next weight (6) would make the value of weight exceed W. The nodes containing 7, 3, 4, and 0 are nonpromising because there is not enough total weight remaining to bring the value of weight up to W. Notice that a leaf in the state space tree that does not contain a solution is automatically nonpromising, because there are no weights remaining that could bring weight up to W. The leaf containing 7 illustrates this. There are only 15 nodes in the pruned state space tree, whereas the entire state space tree contains 31 nodes.

When the sum of the weights included up to a node equals W, there is a solution at that node. Therefore, we cannot get another solution by including more items. This means that if weight equals W, we should print the solution and backtrack. This backtracking is provided automatically by our general procedure checknode, because it never expands beyond a promising node where a solution is found. Recall that when we discussed checknode, we mentioned that some backtracking

algorithms sometimes find a solution before reaching a leaf in the state space tree. This is one such algorithm. Next we present the algorithm that employs these strategies. The algorithm uses an array include. It sets include[i] to "yes" if w[i] is to be included and to "no" if it is not.

Algorithm: The Backtracking Algorithm for the Sum-of-Subsets Problem

Problem: Given n positive integers (weights) and a positive integer W, determine all combinations of the integers that sum to W.

Inputs: positive integer n, sorted (nondecreasing order) array of positive integers w indexed from 1 to n, and a positive integer W.

Outputs: all combinations of the integers that sum to W.

    void sum_of_subsets (index i, int weight, int total)
    {
        if (promising(i))
            if (weight == W)
                cout << include[1] through include[i];
            else {
                include[i + 1] = "yes";    // Include w[i + 1].
                sum_of_subsets(i + 1, weight + w[i + 1], total - w[i + 1]);
                include[i + 1] = "no";     // Do not include w[i + 1].
                sum_of_subsets(i + 1, weight, total - w[i + 1]);
            }
    }

    bool promising (index i)
    {
        return (weight + total >= W) && (weight == W || weight + w[i + 1] <= W);
    }

Following our usual convention, n, w, W, and include are not inputs to our routines. If these variables were defined globally, the top-level call to sum_of_subsets would be as follows:

    sum_of_subsets(0, 0, total);

where initially

    total = w[1] + w[2] + ... + w[n]

Recall that a leaf in the state space tree that does not contain a solution is nonpromising because there are no weights left that could bring the value of weight up to W. This means that the algorithm

should not need to check for the terminal condition i = n. Let's verify that the algorithm implements this correctly. When i = n, the value of total is 0 (because there are no weights remaining). Therefore, at this point

    weight + total = weight

which means that weight + total >= W is true only if weight >= W. Because we always keep weight <= W, we must have weight = W. Therefore, when i = n, the function promising returns true only if weight = W. But in this case there is no recursive call, because we have found a solution. Therefore, we do not need to check for the terminal condition i = n. Notice that there is never a reference to the nonexistent array item w[n + 1] in function promising, because of our assumption that the second condition in an or expression is not evaluated when the first condition is true.

The number of nodes in the state space tree searched by the algorithm above is at most

    1 + 2 + 2^2 + ... + 2^n = 2^(n+1) - 1

Given only this result, the possibility exists that the worst case could be much better than this. That is, it could be that for every instance only a small portion of the state space tree is searched. This is not the case. For each n, it is possible to construct an instance for which the algorithm visits an exponentially large number of nodes. This is true even if we want only one solution. To this end, if we take

    wi = 1 for 1 <= i <= n - 1,  wn = n,  and  W = n

there is only one solution, {wn}, and it will not be found until an exponentially large number of nodes are visited. As stressed before, even though the worst case is exponential, the algorithm can be

efficient for many large instances. In the exercises you are asked to write programs using the Monte Carlo technique to estimate the efficiency of the algorithm above on various instances.