Great Theoretical Ideas in Computer Science

15-251 Great Theoretical Ideas in Computer Science Lecture 20: Randomized Algorithms November 5th, 2015

So far: formalization of computation/algorithms; computability and uncomputability; computational complexity; graph theory and graph algorithms; NP-completeness; identifying intractable problems; making use of intractable problems (in social choice); dealing with intractable problems: approximation algorithms, online algorithms.

Next

Randomness and the universe Does the universe have true randomness? Newtonian physics suggests that the universe evolves deterministically. Quantum physics says otherwise.

Randomness and the universe Does the universe have true randomness? "God does not play dice with the world." - Albert Einstein "Einstein, don't tell God what to do." - Niels Bohr

Randomness and the universe Does the universe have true randomness? Even if it doesn't, we can still model our uncertainty about things using probability. Randomness is an essential tool in modeling and analyzing nature. It also plays a key role in computer science.

Randomness in computer science:
- Randomized algorithms: does randomness speed up computation?
- Statistics via sampling, e.g. election polls.
- Nash equilibria in game theory: a Nash equilibrium always exists if players can have probabilistic strategies.
- Cryptography: a secret is only as good as the entropy/uncertainty in it.

Randomness in computer science (continued):
- Randomized models for deterministic objects, e.g. the WWW graph.
- Quantum computing: randomness is inherent in quantum mechanics.
- Machine learning theory: data is generated by some probability distribution.
- Coding theory: encode data to be able to deal with random noise.

Randomness and algorithms How can randomness be used in computation? Where can it come into the picture? Given some algorithm that solves a problem - What if the input is chosen randomly? - What if the algorithm can make random choices?


Randomness and algorithms What is a randomized algorithm? A randomized algorithm is an algorithm that is allowed to flip a coin (it can make decisions based on the output of the coin flip). In 15-251: a randomized algorithm is an algorithm that is allowed to call RandInt(n) and Bernoulli(p) (we'll assume these take O(1) time).

Randomness and algorithms An example:

def f(x):
    y = Bernoulli(0.5)
    if y == 0:
        while x > 0:
            print("What up?")
            x = x - 1
    return x + y

For a fixed input (e.g. x = 3): the output can vary, and the running time can vary.

Randomness and algorithms For a randomized algorithm, how should we measure its correctness? Its running time? If we require it to be always correct, and to always run in time O(T(n)), then we have a deterministic algorithm with time complexity O(T(n)). (Why?)

Randomness and algorithms So for a randomized algorithm to be interesting, either it is not correct all the time, or it doesn't always run in time O(T(n)). (It either gambles with correctness or with running time.)

Types of randomized algorithms Given an array A[1..n] with n elements (n even). Half of the array contains 0s, the other half contains 1s. Goal: find an index that contains a 1.

Algorithm 1 (doesn't gamble with correctness, gambles with run-time):
repeat:
    k = RandInt(n)
    if A[k] = 1, return k

Algorithm 2 (gambles with correctness, doesn't gamble with run-time):
repeat 300 times:
    k = RandInt(n)
    if A[k] = 1, return k
return "Failed"

Types of randomized algorithms

repeat 300 times:
    k = RandInt(n)
    if A[k] = 1, return k
return "Failed"

Pr[failure] = 1/2^300. Worst-case running time: O(1). This is called a Monte Carlo algorithm (it gambles with correctness but not with time).

Types of randomized algorithms

repeat:
    k = RandInt(n)
    if A[k] = 1, return k

Pr[failure] = 0. Worst-case running time: can't bound it (we could get super unlucky). Expected running time: O(1) (2 iterations in expectation). This is called a Las Vegas algorithm (it gambles with time but not with correctness).

Given an array A[1..n] with n elements (n even). Half of the array contains 0s, the other half contains 1s. Goal: find an index that contains a 1.

              Deterministic    Monte Carlo    Las Vegas
Correctness   always           w.h.p.         always
Run-time      Θ(n)             O(1)           O(1) w.h.p.

(w.h.p. = with high probability)
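The two array-search algorithms above can be sketched in Python. This is an illustrative sketch, not the lecture's own code: the slides' RandInt is modeled with Python's random.randrange, and indices are 0-based.

```python
import random

def find_one_las_vegas(A):
    """Las Vegas: always correct; expected O(1) iterations
    (2 on average, since half the entries are 1s)."""
    n = len(A)
    while True:
        k = random.randrange(n)   # stands in for RandInt(n)
        if A[k] == 1:
            return k

def find_one_monte_carlo(A):
    """Monte Carlo: worst-case O(1) time; fails with probability 1/2^300."""
    n = len(A)
    for _ in range(300):
        k = random.randrange(n)
        if A[k] == 1:
            return k
    return "Failed"
```

Both gamble, but with different things: the first never returns a wrong index yet has no worst-case time bound; the second always stops after 300 draws yet can (with astronomically small probability) report "Failed".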

Formal definition: Monte Carlo algorithm Let f : Σ* → Σ* be a computational problem. Suppose A is a randomized algorithm such that:
- for all x ∈ Σ*, Pr[A(x) ≠ f(x)] ≤ ε;
- for all x ∈ Σ*, the number of steps A(x) takes is ≤ T(|x|), no matter what the random choices are.
Then we say A is a T(n)-time Monte Carlo algorithm for f with probability of error ε.

Formal definition: Las Vegas algorithm Let f : Σ* → Σ* be a computational problem. Suppose A is a randomized algorithm such that:
- for all x ∈ Σ*, A(x) = f(x);
- for all x ∈ Σ*, E[# steps A(x) takes] ≤ T(|x|).
Then we say A is a T(n)-time Las Vegas algorithm for f.

NEXT ON THE MENU Example of a Monte Carlo Algorithm: Min Cut Example of a Las Vegas Algorithm: Quicksort

Example of a Monte Carlo Algorithm: Min Cut Gambles with correctness. Doesn't gamble with running time.

Cut Problems Max Cut Problem (Ryan O'Donnell's favorite problem): Given a graph G = (V, E), color the vertices red and blue so that the number of edges e = {u, v} whose two endpoints get different colors is maximized.

Cut Problems Max Cut Problem, restated: Given a graph G = (V, E), find a non-empty proper subset S ⊊ V such that the number of edges from S to V \ S is maximized. Size of the cut = # edges from S to V \ S.

Cut Problems Min Cut Problem (my favorite problem): Given a graph G = (V, E), find a non-empty proper subset S ⊊ V such that the number of edges from S to V \ S is minimized. Size of the cut = # edges from S to V \ S.

Contraction algorithm for min cut Let's see a super simple randomized algorithm for Min-Cut.

Contraction algorithm for min cut Repeat: select an edge uniformly at random and contract it, deleting any self-loops this creates. When two vertices remain, the two groups of original vertices they represent form the output cut.

Example run 1, on a 5-vertex graph with vertices a, b, c, d, e whose min-cut size is 2: the algorithm contracts a green, a purple, and a blue edge in turn; when two vertices remain, the cut is {a, b, c, d} vs {e}, of size 2, which is a minimum cut.

Example run 2, on the same graph: a different random sequence (a green, a yellow, and a red edge) leaves the cut {a} vs {b, c, d, e}, of size 3, which is not a minimum cut. The output depends on the random choices.

G = G_0 → G_1 → G_2 → ... → G_{n-2}, where each arrow is one contraction: G_0 has n vertices, G_{n-2} has 2 vertices, so there are n-2 iterations.

Observation: for any i, a cut in G_i of size k corresponds exactly to a cut in G of size k.

(Figure: the original graph G on vertices a, b, c, d, e alongside a contracted graph G_i.)
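The contraction procedure can be written out in a few lines. This is a minimal illustrative sketch, not the lecture's own code: it assumes a connected multigraph given as an edge list over vertices 0..n-1, and tracks super-vertices with a simple group array (chosen for clarity, not speed).

```python
import random

def karger_min_cut(n, edges):
    """One run of the contraction algorithm.
    n: number of vertices (labeled 0..n-1); edges: list of (u, v) pairs.
    Returns the size of the cut found -- a random variable!"""
    group = list(range(n))           # group[v] = super-vertex containing v
    edges = list(edges)
    remaining = n
    while remaining > 2:
        u, v = random.choice(edges)  # uniformly random remaining edge
        gu, gv = group[u], group[v]  # never a self-loop: those are filtered below
        for w in range(n):           # contract: merge gv into gu
            if group[w] == gv:
                group[w] = gu
        # delete self-loops created by the contraction
        edges = [(a, b) for (a, b) in edges if group[a] != group[b]]
        remaining -= 1
    return len(edges)                # edges crossing the final two-way partition
```

A single run often misses the minimum cut, exactly as in example run 2 above; taking the minimum over many independent runs is the boosting idea discussed later.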

Poll Let k be the size of a minimum cut. Which of the following are true (you can select more than one)?
- For G = G_0: k ≤ min_v deg_G(v)
- For every G_i: k ≤ min_v deg_{G_i}(v)
- For every G_i: k ≥ min_v deg_{G_i}(v)
- For G = G_0: k ≥ min_v deg_G(v)

Poll For every G_i, k ≤ min_v deg_{G_i}(v); i.e., for every G_i and every vertex v in G_i, k ≤ deg_{G_i}(v). Why? A single vertex v forms a cut of size deg(v). In the pictured G_i, the cut around vertex a has size deg(a) = 3. The same cut exists in the original graph, so k ≤ 3.

Contraction algorithm for min cut Theorem: Let G = (V, E) be a graph with n vertices. The probability that the contraction algorithm outputs a min-cut is ≥ 1/n². Should we be impressed?
- The algorithm runs in polynomial time.
- There are exponentially many cuts (~2^n).
- There is a way to boost the probability of success to 1 - 1/e^n (and still remain in polynomial time).

Proof of theorem Fix some minimum cut, and let F be its set of crossing edges, with |F| = k, |V| = n, |E| = m. We will show Pr[algorithm outputs F] ≥ 1/n². (Note Pr[success] ≥ Pr[algorithm outputs F].)

Proof of theorem When does the algorithm output F? If it never picks an edge in F to contract, then it outputs F. If it ever picks an edge in F to contract, then it cannot output F.

Proof of theorem Pr[algorithm outputs F] = Pr[algorithm never contracts an edge in F] = Pr[E_1 ∩ E_2 ∩ ... ∩ E_{n-2}], where E_i is the event that no edge in F is contracted in iteration i.

Proof of theorem Let E_i = the event that no edge in F is contracted in iteration i. Goal: Pr[E_1 ∩ E_2 ∩ ... ∩ E_{n-2}] ≥ 1/n².

By the chain rule:
Pr[E_1 ∩ E_2 ∩ ... ∩ E_{n-2}] = Pr[E_1] · Pr[E_2 | E_1] · Pr[E_3 | E_1 ∩ E_2] ··· Pr[E_{n-2} | E_1 ∩ ... ∩ E_{n-3}]

Pr[E_1] = 1 - (# edges in F)/(total # edges) = 1 - k/m. We want to write this in terms of k and n.

Proof of theorem Let E_i = the event that no edge in F is contracted in iteration i. Goal: Pr[E_1 ∩ E_2 ∩ ... ∩ E_{n-2}] ≥ 1/n².

Observation: for all v ∈ V, k ≤ deg(v) (a single vertex forms a cut of size deg(v)). Recall Σ_{v∈V} deg(v) = 2m, so 2m ≥ kn, i.e. m ≥ kn/2. Therefore
Pr[E_1] = 1 - k/m ≥ 1 - k/(kn/2) = 1 - 2/n.

Proof of theorem So far: Pr[E_1 ∩ E_2 ∩ ... ∩ E_{n-2}] ≥ (1 - 2/n) · Pr[E_2 | E_1] · Pr[E_3 | E_1 ∩ E_2] ··· Pr[E_{n-2} | E_1 ∩ ... ∩ E_{n-3}]. Next, Pr[E_2 | E_1] = 1 - k/(# remaining edges); again we want to write this in terms of k and n.

Proof of theorem Let G' = (V', E') be the graph after iteration 1. Observation: for all v ∈ V', k ≤ deg_{G'}(v). Since Σ_{v∈V'} deg_{G'}(v) = 2|E'|, we get 2|E'| ≥ k(n-1), i.e. |E'| ≥ k(n-1)/2. Therefore
Pr[E_2 | E_1] = 1 - k/|E'| ≥ 1 - k/(k(n-1)/2) = 1 - 2/(n-1).

Proof of theorem So far: Pr[E_1 ∩ E_2 ∩ ... ∩ E_{n-2}] ≥ (1 - 2/n)(1 - 2/(n-1)) · Pr[E_3 | E_1 ∩ E_2] ··· Pr[E_{n-2} | E_1 ∩ ... ∩ E_{n-3}].

Proof of theorem Continuing the same argument for each iteration:
Pr[E_1 ∩ E_2 ∩ ... ∩ E_{n-2}] ≥ (1 - 2/n)(1 - 2/(n-1))(1 - 2/(n-2)) ··· (1 - 2/4)(1 - 2/3)
= (n-2)/n · (n-3)/(n-1) · (n-4)/(n-2) · (n-5)/(n-3) ··· 2/4 · 1/3
= 2/(n(n-1)) ≥ 1/n².
(The product telescopes: every numerator except the final 2 and 1 cancels against a later denominator.)
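As a sanity check on the telescoping product, it can be evaluated exactly with rational arithmetic. This is an illustrative script, not part of the lecture; the function name is my own.

```python
from fractions import Fraction

def survival_lower_bound(n):
    """Product of (1 - 2/(n-j)) for j = 0..n-3, i.e. the lower bound on
    Pr[no min-cut edge is ever contracted] over the n-2 iterations."""
    p = Fraction(1)
    for j in range(n - 2):            # factors 1-2/n, 1-2/(n-1), ..., 1-2/3
        p *= 1 - Fraction(2, n - j)
    return p

# the telescoping identity: the product equals 2/(n(n-1)) exactly
for n in range(3, 12):
    assert survival_lower_bound(n) == Fraction(2, n * (n - 1))
```

Using Fraction instead of floats makes the equality exact rather than approximate, so the telescoping claim is verified symbol-for-symbol for each small n.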


Boosting phase Run the algorithm t times using fresh random bits; run i outputs a cut F_i. Output the smallest cut among F_1, ..., F_t. Larger t ⟹ better success probability. What is the relation between t and the success probability?

Boosting phase What is the relation between t and the success probability? Let A_i = the event that in the i-th repetition, we don't find a min cut. Then
Pr[error] = Pr[don't find a min cut] = Pr[A_1 ∩ A_2 ∩ ... ∩ A_t]
= Pr[A_1] Pr[A_2] ··· Pr[A_t]   (independent events)
= Pr[A_1]^t ≤ (1 - 1/n²)^t.

Boosting phase Pr[error] ≤ (1 - 1/n²)^t. Extremely useful inequality: for all x ∈ ℝ, 1 + x ≤ e^x.

Boosting phase Applying 1 + x ≤ e^x with x = -1/n²:
Pr[error] ≤ (1 + x)^t ≤ (e^x)^t = e^{xt} = e^{-t/n²}.
Setting t = n³ gives Pr[error] ≤ e^{-n³/n²} = 1/e^n, hence Pr[success] ≥ 1 - 1/e^n.

Conclusion for min cut We have a polynomial-time algorithm that solves the min cut problem with probability ≥ 1 - 1/e^n. Theoretically, not equal to 1. Practically, equal to 1.

Important Note Boosting is not specific to the min-cut algorithm: we can boost the success probability of any Monte Carlo algorithm via repeated independent trials.
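The boosting calculation is generic: if one run errs with probability at most p, then t independent runs all err with probability at most p^t. A small numeric sketch (illustrative, with the min-cut bound plugged in; the helper name is my own):

```python
import math

def boost_error(single_error, t):
    """Upper bound on Pr[all t independent runs fail],
    given each run fails with probability at most single_error."""
    return single_error ** t

# One contraction run succeeds with probability >= 1/n^2,
# i.e. fails with probability <= 1 - 1/n^2. With t = n^3 runs,
# the overall failure bound drops below e^{-n}:
n = 20
bound = boost_error(1 - 1 / n**2, n**3)
assert bound <= math.exp(-n)
```

This mirrors the slide's derivation: (1 - 1/n²)^{n³} ≤ e^{-n³/n²} = e^{-n}, by the inequality 1 + x ≤ e^x.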

Example of a Las Vegas Algorithm: Quicksort Doesn't gamble with correctness. Gambles with running time.

Quicksort Algorithm Example array: 4 8 2 7 99 5 0.

On input S = (x_1, x_2, ..., x_n):
- If n ≤ 1, return S.
- Pick a pivot x_m uniformly at random.
- Compare x_m to all the other x_i's, and let S_1 = {x_i : x_i < x_m}, S_2 = {x_i : x_i > x_m}.
- Recursively sort S_1 and S_2.
- Return [S_1, x_m, S_2].

On the example with pivot x_m = 4: S_1 = {2, 0} and S_2 = {8, 7, 99, 5}; after the recursive calls, the output is 0 2 4 5 7 8 99.

Quicksort Algorithm This is a Las Vegas algorithm: it always gives the correct answer, and its running time can vary depending on our luck. It is not too difficult to show that the expected running time is ≤ 2n ln n = O(n log n). In practice, it is basically the fastest sorting algorithm!
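The slides' quicksort translates almost line for line into Python. A sketch assuming distinct elements, matching the strict partition into S_1 and S_2 on the slides:

```python
import random

def quicksort(S):
    """Randomized quicksort as on the slides: a Las Vegas algorithm,
    always correct, with expected O(n log n) comparisons.
    Assumes distinct elements (strict < and > partition)."""
    if len(S) <= 1:
        return list(S)
    pivot = random.choice(S)                 # uniformly random pivot x_m
    S1 = [x for x in S if x < pivot]         # elements smaller than the pivot
    S2 = [x for x in S if x > pivot]         # elements larger than the pivot
    return quicksort(S1) + [pivot] + quicksort(S2)

print(quicksort([4, 8, 2, 7, 99, 5, 0]))  # -> [0, 2, 4, 5, 7, 8, 99]
```

Whatever pivots the coin flips produce, the output is the same sorted list; only the number of comparisons varies from run to run, which is exactly the Las Vegas trade-off.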

Final remarks Randomness adds an interesting dimension to computation. Randomized algorithms can be faster and much more elegant than their deterministic counterparts. There are some interesting problems for which there is a poly-time randomized algorithm, but we can't find a poly-time deterministic algorithm. Another (morally) million-dollar question: does every efficient randomized algorithm have an efficient deterministic counterpart? Is P = BPP?