Greedy Technique - Definition


Greedy Technique

Greedy Technique - Definition
The greedy method is a general algorithm design paradigm, built on the following elements:
configurations: different choices, collections, or values to find
objective function: a score assigned to configurations, which we want to either maximize or minimize
It works best when applied to problems with the greedy-choice property and optimal sub-structure.

Greedy Choice Property
Make whatever choice seems best at the moment and then solve the sub-problems arising after the choice is made. The choice made by a greedy algorithm may depend on the choices made so far, but it cannot depend on any future choices; the algorithm progresses one greedy choice after another, iteratively reducing each given problem into a smaller one. A greedy algorithm makes its decisions early and never reconsiders old decisions. It may not produce an optimal solution for some problems.

Optimal Sub-structure
A problem exhibits optimal sub-structure if an optimal solution to the problem contains within it optimal solutions to its sub-problems. This property is used to determine whether dynamic programming and greedy algorithms are applicable to a problem.

Greedy Technique - Making Change
Problem: A dollar amount to reach and a collection of coin denominations to use to get there.
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the largest coin you can.
Example 1: Coins are valued $.32, $.08, $.01. This system has the greedy-choice property, since no amount over $.32 can be made with a minimum number of coins by omitting a $.32 coin (similarly for amounts over $.08, but under $.32).
Example 2: Coins are valued $.30, $.20, $.05, $.01. This system does not have the greedy-choice property, since $.40 is best made with two $.20's, but the greedy solution will pick three coins (which ones?)
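
A minimal runnable sketch of the greedy rule in Python (not from the slides); amounts are in cents to avoid floating-point issues, and the two coin systems are the examples above.

# Greedy change-making: repeatedly take the largest coin that still fits.
def greedy_change(amount, coins):
    """Return the list of coins chosen greedily, largest denomination first."""
    chosen = []
    for c in sorted(coins, reverse=True):
        while amount >= c:
            amount -= c
            chosen.append(c)
    return chosen

print(greedy_change(40, [32, 8, 1]))      # [32, 8]: two coins, optimal
print(greedy_change(40, [30, 20, 5, 1]))  # [30, 5, 5]: three coins, though two $.20's would do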

Task Scheduling or Activities Selection
Given: a set T of n tasks, each having a start time s_i and a finish time f_i (where s_i < f_i).
Goal: Perform all the tasks using a minimum number of machines.
[Figure: a schedule on a timeline from 1 to 9 in which the tasks are assigned to Machine 1, Machine 2, and Machine 3.]

Task Scheduling or Activity Selection
Brute force: Try all subsets of activities and choose the largest subset which is feasible. The running time for listing all subsets of activities would be Θ(2^n).
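
For contrast, a brute-force sketch in Python (illustrative only, with a hypothetical activity list): enumerate subsets from largest to smallest and return the first pairwise-compatible one.

from itertools import combinations

# Brute force: check every subset of activities for pairwise compatibility.
def brute_force_select(activities):
    """activities: list of (start, finish) pairs. Returns a largest feasible subset."""
    for size in range(len(activities), 0, -1):
        for subset in combinations(activities, size):
            ordered = sorted(subset, key=lambda a: a[1])
            # Feasible if each activity finishes no later than the next one starts.
            if all(ordered[i][1] <= ordered[i + 1][0] for i in range(len(ordered) - 1)):
                return list(ordered)
    return []

print(brute_force_select([(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9)]))
# -> [(1, 4), (5, 7)]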

Task Scheduling or Activity Selection
[Table: the activities i = 1, ..., n with their start times s_i and finish times f_i, shown against a timeline from 1 to 9.]

Task Scheduling or Activity Selection
[The same table of activities, sorted by finish times.]

Task Scheduling or Activity Selection
Algorithm Greedy_Activity(s[1..n], f[1..n]):   // activities are assumed sorted by finish time
    A ← {1}
    j ← 1
    for i ← 2 to n do
        if s[i] >= f[j]
            A ← A + {i}
            j ← i
    return A
Excluding the sorting time, this algorithm's running time is T(n) = Θ(n). If we include the sorting time, what will be the total running time? T(n) = ??
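
A runnable Python sketch of the same greedy rule (not the slides' code); the activity list is a hypothetical example, since the original table did not survive transcription. With the sort included, the total running time is O(n log n).

# Greedy activity selection: sort by finish time, then repeatedly take the first
# activity that starts no earlier than the last selected one finishes.
def select_activities(activities):
    """activities: list of (start, finish) pairs. Returns the selected pairs."""
    by_finish = sorted(activities, key=lambda a: a[1])
    chosen = [by_finish[0]]
    last_finish = by_finish[0][1]
    for start, finish in by_finish[1:]:
        if start >= last_finish:              # compatible with the last chosen activity
            chosen.append((start, finish))
            last_finish = finish
    return chosen

print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9), (6, 10), (8, 11)]))
# -> [(1, 4), (5, 7), (8, 11)]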

Proving Optimality - Proof of the Greedy Choice Property
Purpose: Show that activity 1 (the greedy choice) is in some optimal solution.
Let S = {1, 2, ..., n} be the set of activities, ordered by finish time, which implies that activity 1 has the earliest finish time. Suppose A ⊆ S is an optimal solution, and let the activities in A be ordered by finish time. Suppose the first activity in A is k. If k = 1, then A begins with the greedy choice and we are done (or, to be very precise, there is nothing to prove here). If k ≠ 1, consider the solution B that begins with the greedy choice, activity 1: let B = (A - {k}) + {1}. Because f_1 ≤ f_k, the activities in B are disjoint, and since B has the same number of activities as A, i.e., |A| = |B|, B is also optimal.

Proving Optimality - Proof of Optimal Substructure
Purpose: Show that an optimal solution to the problem contains within it optimal solutions to sub-problems.
Once the greedy choice is made, the problem reduces to finding an optimal solution for the remaining sub-problem. If A is an optimal solution to the original problem S, then A' = A - {1} is an optimal solution to the activity-selection problem S' = {i ∈ S : s_i ≥ f_1}. Why? Because if we could find a solution B' to S' with more activities than A', adding activity 1 to B' would yield a solution B to S with more activities than A, thereby contradicting the optimality of A.

The Fractional Knapsack Problem
Given: A set S of n items, with each item i having b_i, a positive benefit, and w_i, a positive weight.
Goal: Choose items with maximum total benefit but with weight at most W.
If we are allowed to take fractional amounts, then this is the fractional knapsack problem. In this case, we let x_i denote the amount we take of item i.
Objective: maximize Σ_{i∈S} b_i (x_i / w_i)
Constraint: Σ_{i∈S} x_i ≤ W

Example
Given: A set S of n items, with each item i having b_i, a positive benefit, and w_i, a positive weight.
Goal: Choose items with maximum total benefit but with weight at most W.
Items:         1      2      3      4      5
Weight:        4 ml   8 ml   2 ml   6 ml   1 ml
Benefit:       $12    $32    $40    $30    $50
Value ($/ml):  3      4      20     5      50
Knapsack: 10 ml
Solution: 1 ml of item 5, 2 ml of item 3, 6 ml of item 4, 1 ml of item 2

The Fractional Knapsack Algorithm
Algorithm fractionalKnapsack(S, W)
    Input: set S of items with benefit b_i and weight w_i; maximum weight W
    Output: amount x_i of each item i to maximize benefit with total weight at most W
    for each item i in S
        x_i ← 0
        v_i ← b_i / w_i          // value per unit of weight
    w ← 0                        // total weight
    while w < W
        remove item i with highest v_i
        x_i ← min{w_i, W - w}
        w ← w + min{w_i, W - w}
What is the running time?
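
A runnable Python sketch of the algorithm above (an illustrative assumption, not the slides' code), applied to the example items; sorting by value density makes the whole procedure O(n log n).

# Fractional knapsack: take items in decreasing order of benefit per unit weight,
# taking only a fraction of the last item if it does not fit entirely.
def fractional_knapsack(items, capacity):
    """items: list of (benefit, weight) pairs. Returns (total benefit, amounts taken)."""
    order = sorted(range(len(items)), key=lambda i: items[i][0] / items[i][1], reverse=True)
    amounts = [0.0] * len(items)
    total_benefit = 0.0
    remaining = capacity
    for i in order:
        if remaining <= 0:
            break
        benefit, weight = items[i]
        take = min(weight, remaining)         # whole item if possible, else a fraction
        amounts[i] = take
        total_benefit += benefit * (take / weight)
        remaining -= take
    return total_benefit, amounts

# Items (benefit, weight in ml) from the example, knapsack of 10 ml:
print(fractional_knapsack([(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)], 10))
# -> (124.0, [0.0, 1.0, 2.0, 6.0, 1.0])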

Proving Optimality
Assume the items are indexed so that v_1/w_1 ≥ v_2/w_2 ≥ ... ≥ v_n/w_n. Let x be the greedy solution vector and let y be any feasible solution vector, where for each item i:
x_i = 1 means the whole item is selected, x_i = 0 means the item is not selected at all, and 0 < x_i < 1 means a partial (fractional) amount of the item is selected.
We want to show that Σ_{i=1}^{n} (x_i v_i - y_i v_i) = Σ_{i=1}^{n} (x_i - y_i) v_i ≥ 0.

Proving Optimality (continued)
The greedy solution has the form x_1 = ... = x_{k-1} = 1, 0 ≤ x_k < 1 (the fractional item), and x_{k+1} = ... = x_n = 0 for some index k. Splitting the sum at k,
Σ_{i=1}^{n} (x_i - y_i) v_i = Σ_{i=1}^{k-1} (x_i - y_i) v_i + (x_k - y_k) v_k + Σ_{i=k+1}^{n} (x_i - y_i) v_i.

Proving Optimality (continued)
For i < k we have x_i - y_i ≥ 0 and v_i/w_i ≥ v_k/w_k; for i > k we have x_i - y_i ≤ 0 and v_i/w_i ≤ v_k/w_k. Writing each term as (x_i - y_i) v_i = (x_i - y_i) w_i (v_i/w_i),
Σ_{i=1}^{n} (x_i - y_i) v_i ≥ (v_k/w_k) Σ_{i=1}^{n} (x_i - y_i) w_i ≥ 0,
since the greedy solution uses the full capacity W while y cannot exceed it (if all items fit, the greedy solution takes everything and is trivially optimal). Hence the greedy solution x is optimal.

Huffman Code
Huffman code is a technique for compressing data. Huffman's greedy algorithm looks at the frequency of occurrence of each character and represents each character as a binary string in an optimal way. Suppose we have a data file consisting of 100,000 characters that we want to compress, and the characters in the data occur with the following frequencies:
Character:  a       b       c       d       e       f
Frequency:  45,000  13,000  12,000  16,000  9,000   5,000
Consider the problem of designing a "binary character code" in which each character is represented by a unique binary string.

Fixed Length Code
In a fixed-length code, we need only 3 bits to represent the six characters.
Character:          a       b       c       d       e       f
Frequency:          45,000  13,000  12,000  16,000  9,000   5,000
Fixed-length code:  000     001     010     011     100     101
The total number of characters is 45,000 + 13,000 + 12,000 + 16,000 + 9,000 + 5,000 = 100,000. As each character is assigned a 3-bit codeword, we need 3 × 100,000 = 300,000 bits. The fixed-length code requires 300,000 bits!!!

Variable Length Code
A variable-length code gives frequent characters shorter codewords (sequences of bits) and infrequent characters longer codewords, using prefix codes. In a prefix code, no codeword is a prefix of any other codeword. The reason prefix codes are desirable is that they simplify encoding (compression) and decoding.
Character:             a       b       c       d       e       f
Frequency:             45,000  13,000  12,000  16,000  9,000   5,000
Variable-length code:  0       101     100     111     1101    1100
A variable-length code requires only 224,000 bits. How come?

Huffman Code Binary Tree
[Figure: the binary tree of the variable-length code above, with the character frequencies a: 45,000, b: 13,000, c: 12,000, d: 16,000, e: 9,000, f: 5,000; a sits at depth 1, b, c, and d at depth 3, and e and f at depth 4.]

Constructing Huffman Code
[Figure: the tree built by repeatedly merging the two least-frequent nodes (frequencies in thousands): f = 5 and e = 9 merge into 14; c = 12 and b = 13 merge into 25; 14 and d = 16 merge into 30; 25 and 30 merge into 55; finally a = 45 and 55 merge into the root, 100.]

Huffman Code Binary Tree
Given a tree T corresponding to the prefix code, compute the number of bits required to encode a file. Let f(c) be the frequency of character c and let d_T(c) denote the depth of c's leaf; note that d_T(c) is also the length of c's codeword. The number of bits needed to encode the file is
B(T) = Σ_c f(c) d_T(c) = 45·1 + 13·3 + 12·3 + 16·3 + 9·4 + 5·4 = 224 (in thousands) = 224 × 1,000 = 224,000.

Huffman Code Binary Tree
Algorithm Huffman(C, n)
    Q ← BuildHeap(C)
    for i ← 1 to n - 1 do
        z ← Allocate-Node()
        z.left ← Extract_Min(Q)
        z.right ← Extract_Min(Q)
        z.freq ← z.left.freq + z.right.freq
        Insert(Q, z)
    return Extract_Min(Q)
In line 2, BuildHeap runs in O(n) time. The for loop is executed n - 1 times and each heap operation requires O(log n) time, so the for loop contributes (n - 1) · O(log n) = O(n log n).
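
A compact Python sketch of the same construction using the standard library's heapq module (an illustrative assumption, not the slides' code); on the example frequencies it reproduces the 224,000-bit total.

import heapq

# Build a Huffman tree by repeatedly merging the two least-frequent nodes,
# then read the codewords off the tree. A tree is either a symbol (leaf)
# or a (left, right) pair (internal node).
def huffman_codes(freq):
    """freq: dict mapping symbol -> frequency. Returns dict symbol -> codeword."""
    heap = [(f, i, sym) for i, (sym, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    count = len(heap)                          # unique tie-breaker for heap entries
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):            # internal node: recurse left and right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                                  # leaf: record the codeword
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

freq = {"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}     # in thousands
codes = huffman_codes(freq)
print(codes)
print(sum(freq[s] * len(codes[s]) for s in freq) * 1000)        # 224000 bits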

Optimal Substructure
[Figure: a tree T in which x and y are sibling leaves, and the tree T* obtained from T by replacing x and y with a single leaf c of frequency f(c) = f(x) + f(y).]
B(T) = B(T*) + f(x)d_T(x) + f(y)d_T(y) - f(c)d_T*(c)
Since d_T(x) = d_T(y) = d_T*(c) + 1,
f(x)d_T(x) + f(y)d_T(y) = (f(x) + f(y))(d_T*(c) + 1) = f(c)d_T*(c) + f(x) + f(y),
so B(T) = B(T*) + f(x) + f(y).

Greedy Choice Property
If x and y are the two nodes with the least frequencies, then there exists an optimal tree (representing a prefix code) that contains x and y as sibling leaves of maximum depth. Let T be an optimal tree, and let x and y be the two nodes with the least frequencies.

Greedy Choice Property
[Figure: trees T, T*, and T**; T* is obtained from T by swapping x with a deepest leaf b, and T** is obtained from T* by swapping y with the other deepest leaf c.]
B(T) - B(T*) = Σ_z f(z)d_T(z) - Σ_z f(z)d_T*(z)        // before swap vs. after swap
= f(x)d_T(x) + f(b)d_T(b) - f(x)d_T*(x) - f(b)d_T*(b)
= f(x)d_T(x) + f(b)d_T(b) - f(x)d_T(b) - f(b)d_T(x)
= (f(b) - f(x))(d_T(b) - d_T(x)) ≥ 0,
since f(b) ≥ f(x) and d_T(b) ≥ d_T(x); the swap does not increase the cost, and the same argument applies to swapping y with c.

Single Source Shortest Path (Dijkstra's Algorithm)
Given a vertex called the source in a weighted connected graph, find shortest paths to all its other vertices. (This is not a TSP.) The distance of a vertex v from a vertex s is the length of a shortest path between s and v. Dijkstra's algorithm computes the distances of all the vertices from a given start vertex s.
Assumptions: the graph is connected, the edges are undirected, and the edge weights are nonnegative.

Single Source Shortest Path (Dijkstra's Algorithm)
Grow a cloud of vertices, beginning with s and eventually covering all the vertices. Store with each vertex v a label d(v) representing the distance of v from s in the sub-graph consisting of the cloud and its adjacent vertices. At each step: add to the cloud the vertex u outside the cloud with the smallest distance label d(u), and update the labels of the vertices adjacent to u.

Edge Relaxation
Consider an edge e = (u,z) such that u is the vertex most recently added to the cloud and z is not in the cloud. The relaxation of edge e updates the distance d(z) as follows: d(z) ← min{d(z), d(u) + weight(e)}.
[Figure: with d(u) = 50 and weight(e) = 10, relaxing e lowers d(z) from 75 to 60.]
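
A one-line sketch of the relaxation step in Python (illustrative only), with a dict d of tentative distances:

# Relax edge (u, z): keep the better of the current estimate and the path through u.
def relax(d, u, z, weight):
    d[z] = min(d[z], d[u] + weight)

d = {"u": 50, "z": 75}
relax(d, "u", "z", 10)
print(d["z"])   # 60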

Example
[Figure: a sequence of snapshots of Dijkstra's algorithm running on a graph with vertices A through F, showing the cloud growing and the distance labels shrinking as edges are relaxed.]

Pseudo Code
Algorithm Dijkstra(G, w, s)
    for each vertex v in V[G]            // initializations
        d[v] ← infinity
        previous[v] ← undefined
    d[s] ← 0
    S ← empty set
    Q ← V[G]                             // build Q and store all vertices in Q
    while Q is not an empty set          // the algorithm itself
        u ← Extract_Min(Q)
        S ← S + {u}
        for each edge (u,v) outgoing from u
            if d[u] + w(u,v) < d[v]      // relax (u,v)
                d[v] ← d[u] + w(u,v)
                previous[v] ← u
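
A runnable Python sketch of the same algorithm (an illustrative assumption, not the slides' code), using heapq as the priority queue with lazy deletion in place of an explicit decrease-key; the small undirected graph below is hypothetical.

import heapq

# Dijkstra's algorithm on an undirected graph given as an adjacency list:
# graph[u] is a list of (v, weight) pairs with nonnegative weights.
def dijkstra(graph, source):
    """Return a dict of shortest-path distances from source to every reachable vertex."""
    dist = {source: 0}
    heap = [(0, source)]                      # (tentative distance, vertex)
    done = set()
    while heap:
        d_u, u = heapq.heappop(heap)
        if u in done:                         # stale entry: u was already finalized
            continue
        done.add(u)
        for v, w in graph.get(u, []):
            if v not in dist or d_u + w < dist[v]:     # relax edge (u, v)
                dist[v] = d_u + w
                heapq.heappush(heap, (dist[v], v))
    return dist

graph = {
    "A": [("B", 8), ("C", 2), ("D", 4)],
    "B": [("A", 8), ("C", 7)],
    "C": [("A", 2), ("B", 7), ("D", 1)],
    "D": [("A", 4), ("C", 1)],
}
print(dijkstra(graph, "A"))   # {'A': 0, 'B': 8, 'C': 2, 'D': 3}

With a binary heap this runs in O((E + V) log V) time, matching the analysis on the next slide.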

Analysis
The time efficiency of Dijkstra's algorithm depends on the data structures used for implementing the priority queue and for representing the input graph itself. If we store the graph as an ordinary linked list or array and use the same structure for Q, the operation Extract-Min(Q) is a linear search through all vertices in Q, and the running time is O(V^2). If we store the graph as adjacency lists and use a binary heap as the priority queue (to implement the Extract-Min() function), the algorithm requires O((E + V) log V) time; recall that Σ_v deg(v) = 2E. The running time can also be expressed as O(E log V).

Why Dijkstra's Algorithm Works
Dijkstra's algorithm is based on the greedy method: it adds vertices by increasing distance. Suppose it didn't find all shortest distances. Let F be the first wrong vertex the algorithm processed. When the previous node D on the true shortest path was considered, its distance was correct. But the edge (D,F) was relaxed at that time! Thus, so long as d(F) ≥ d(D), F's distance cannot be wrong. That is, there is no wrong vertex.
[Figure: the example graph with vertices A through F and its final distance labels.]

Why It Doesn't Work for Negative-Weight Edges
Dijkstra's algorithm is based on the greedy method: it adds vertices by increasing distance. If a node with a negative incident edge were to be added late to the cloud, it could mess up distances for vertices already in the cloud.
[Figure: a graph containing a negative-weight edge in which C's true distance is 1, but C is already in the cloud with d(C) = 5!]

Acknowledgement
http://www.personal.kent.edu/~rmuhamma/Algorithms/algorithm.html
http://ww3.algorithmdesign.net/
http://en.wikipedia.org/